Jan 21 15:01:42 crc systemd[1]: Starting Kubernetes Kubelet... Jan 21 15:01:42 crc restorecon[4569]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 21 15:01:42 
crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 
15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc 
restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 
crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 
crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 
15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:01:42 crc 
restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 
15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 
15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc 
restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 15:01:42 crc restorecon[4569]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 21 15:01:43 crc kubenswrapper[4707]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 15:01:43 crc kubenswrapper[4707]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 21 15:01:43 crc kubenswrapper[4707]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 15:01:43 crc kubenswrapper[4707]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
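
The long run of restorecon messages above is expected rather than a fault: container_file_t is a customizable SELinux type, so restorecon leaves the existing labels (including the per-pod MCS category pairs such as s0:c7,c13) in place instead of resetting them to the policy default, and only paths whose label genuinely differs (config.json, the kubenswrapper binary) get relabeled. Below is a minimal sketch of how one of the listed paths could be checked by hand, assuming the standard SELinux tooling (ls -Z, matchpathcon, restorecon from policycoreutils/libselinux-utils) is available on the node; the kubelet deprecation warnings that begin above continue in the entries that follow.

    # Current on-disk label (container_file_t, possibly with pod-specific MCS categories)
    ls -Z /var/lib/kubelet/plugins/csi-hostpath/csi.sock
    # Default label the policy would assign to this path
    matchpathcon /var/lib/kubelet/plugins/csi-hostpath/csi.sock
    # Dry run: -n changes nothing, -v reports, -F also considers customizable types such as container_file_t
    restorecon -n -v -F /var/lib/kubelet/plugins/csi-hostpath/csi.sock

Forcing the reset with -F would rewrite customizable types as well, stripping the pod-specific categories the container runtime assigned, so the "not reset as customized by admin" skip behaviour seen above is the safe default.
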
Jan 21 15:01:43 crc kubenswrapper[4707]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 21 15:01:43 crc kubenswrapper[4707]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.026503 4707 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030857 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030876 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030881 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030885 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030889 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030894 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030898 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030902 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030906 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030911 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030916 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030922 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030925 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030929 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030933 4707 feature_gate.go:330] unrecognized feature gate: Example Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030937 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030942 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030946 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030950 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030955 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030958 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030961 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030965 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030968 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030972 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030975 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030978 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030981 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030984 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030988 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030991 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030994 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.030998 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031002 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031005 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031008 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031012 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031015 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031019 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031022 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031025 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031028 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031032 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031036 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031039 4707 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031042 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031045 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031049 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031052 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031055 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031058 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031061 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031065 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031068 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031071 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031074 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031077 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031081 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031085 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031090 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031093 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031097 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031100 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031103 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031106 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031111 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031115 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031118 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031122 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031125 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.031130 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031884 4707 flags.go:64] FLAG: --address="0.0.0.0" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031926 4707 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031938 4707 flags.go:64] FLAG: --anonymous-auth="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031945 4707 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031952 4707 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031957 4707 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031964 4707 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031970 4707 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031975 4707 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031979 4707 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031983 4707 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031993 4707 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.031997 4707 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032001 4707 flags.go:64] FLAG: --cgroup-root="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032005 4707 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032009 4707 flags.go:64] FLAG: --client-ca-file="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032013 4707 flags.go:64] FLAG: --cloud-config="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032016 4707 flags.go:64] FLAG: --cloud-provider="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032020 4707 flags.go:64] FLAG: --cluster-dns="[]" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032027 4707 flags.go:64] FLAG: --cluster-domain="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032030 4707 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032034 4707 flags.go:64] FLAG: --config-dir="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032038 4707 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032042 4707 flags.go:64] FLAG: --container-log-max-files="5" Jan 21 15:01:43 crc 
kubenswrapper[4707]: I0121 15:01:43.032049 4707 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032053 4707 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032057 4707 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032063 4707 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032067 4707 flags.go:64] FLAG: --contention-profiling="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032072 4707 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032077 4707 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032081 4707 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032086 4707 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032093 4707 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032097 4707 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032100 4707 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032104 4707 flags.go:64] FLAG: --enable-load-reader="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032108 4707 flags.go:64] FLAG: --enable-server="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032112 4707 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032117 4707 flags.go:64] FLAG: --event-burst="100" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032121 4707 flags.go:64] FLAG: --event-qps="50" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032125 4707 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032129 4707 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032136 4707 flags.go:64] FLAG: --eviction-hard="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032142 4707 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032145 4707 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032149 4707 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032153 4707 flags.go:64] FLAG: --eviction-soft="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032157 4707 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032161 4707 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032165 4707 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032168 4707 flags.go:64] FLAG: --experimental-mounter-path="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032172 4707 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032176 4707 flags.go:64] FLAG: 
--fail-swap-on="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032180 4707 flags.go:64] FLAG: --feature-gates="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032185 4707 flags.go:64] FLAG: --file-check-frequency="20s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032188 4707 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032192 4707 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032196 4707 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032200 4707 flags.go:64] FLAG: --healthz-port="10248" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032204 4707 flags.go:64] FLAG: --help="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032208 4707 flags.go:64] FLAG: --hostname-override="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032212 4707 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032216 4707 flags.go:64] FLAG: --http-check-frequency="20s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032219 4707 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032223 4707 flags.go:64] FLAG: --image-credential-provider-config="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032227 4707 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032231 4707 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032235 4707 flags.go:64] FLAG: --image-service-endpoint="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032239 4707 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032243 4707 flags.go:64] FLAG: --kube-api-burst="100" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032246 4707 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032250 4707 flags.go:64] FLAG: --kube-api-qps="50" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032254 4707 flags.go:64] FLAG: --kube-reserved="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032258 4707 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032262 4707 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032267 4707 flags.go:64] FLAG: --kubelet-cgroups="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032270 4707 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032274 4707 flags.go:64] FLAG: --lock-file="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032285 4707 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032289 4707 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032293 4707 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032301 4707 flags.go:64] FLAG: --log-json-split-stream="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032305 4707 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 21 15:01:43 crc 
kubenswrapper[4707]: I0121 15:01:43.032309 4707 flags.go:64] FLAG: --log-text-split-stream="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032313 4707 flags.go:64] FLAG: --logging-format="text" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032316 4707 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032321 4707 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032324 4707 flags.go:64] FLAG: --manifest-url="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032328 4707 flags.go:64] FLAG: --manifest-url-header="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032334 4707 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032339 4707 flags.go:64] FLAG: --max-open-files="1000000" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032344 4707 flags.go:64] FLAG: --max-pods="110" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032347 4707 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032351 4707 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032355 4707 flags.go:64] FLAG: --memory-manager-policy="None" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032359 4707 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032362 4707 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032366 4707 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032371 4707 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032383 4707 flags.go:64] FLAG: --node-status-max-images="50" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032388 4707 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032392 4707 flags.go:64] FLAG: --oom-score-adj="-999" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032396 4707 flags.go:64] FLAG: --pod-cidr="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032400 4707 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032409 4707 flags.go:64] FLAG: --pod-manifest-path="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032413 4707 flags.go:64] FLAG: --pod-max-pids="-1" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032417 4707 flags.go:64] FLAG: --pods-per-core="0" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032421 4707 flags.go:64] FLAG: --port="10250" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032424 4707 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032428 4707 flags.go:64] FLAG: --provider-id="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032432 4707 flags.go:64] FLAG: --qos-reserved="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032436 4707 flags.go:64] FLAG: --read-only-port="10255" Jan 21 15:01:43 crc 
kubenswrapper[4707]: I0121 15:01:43.032440 4707 flags.go:64] FLAG: --register-node="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032443 4707 flags.go:64] FLAG: --register-schedulable="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032447 4707 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032455 4707 flags.go:64] FLAG: --registry-burst="10" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032459 4707 flags.go:64] FLAG: --registry-qps="5" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032462 4707 flags.go:64] FLAG: --reserved-cpus="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032466 4707 flags.go:64] FLAG: --reserved-memory="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032471 4707 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032475 4707 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032479 4707 flags.go:64] FLAG: --rotate-certificates="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032483 4707 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032487 4707 flags.go:64] FLAG: --runonce="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032490 4707 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032494 4707 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032498 4707 flags.go:64] FLAG: --seccomp-default="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032502 4707 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032506 4707 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032510 4707 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032514 4707 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032519 4707 flags.go:64] FLAG: --storage-driver-password="root" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032522 4707 flags.go:64] FLAG: --storage-driver-secure="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032526 4707 flags.go:64] FLAG: --storage-driver-table="stats" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032529 4707 flags.go:64] FLAG: --storage-driver-user="root" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032534 4707 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032538 4707 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032543 4707 flags.go:64] FLAG: --system-cgroups="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032547 4707 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032554 4707 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032559 4707 flags.go:64] FLAG: --tls-cert-file="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032563 4707 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 21 15:01:43 
crc kubenswrapper[4707]: I0121 15:01:43.032569 4707 flags.go:64] FLAG: --tls-min-version="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032573 4707 flags.go:64] FLAG: --tls-private-key-file="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032577 4707 flags.go:64] FLAG: --topology-manager-policy="none" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032581 4707 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032584 4707 flags.go:64] FLAG: --topology-manager-scope="container" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032588 4707 flags.go:64] FLAG: --v="2" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032594 4707 flags.go:64] FLAG: --version="false" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032600 4707 flags.go:64] FLAG: --vmodule="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032605 4707 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.032610 4707 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032739 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032743 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032747 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032750 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032753 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032757 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032760 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032763 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032768 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
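For reference, most of the flags the kubelet warns about above have config-file counterparts in KubeletConfiguration (--pod-infra-container-image is the exception; per the warning, the sandbox image is now taken from the CRI runtime). A minimal, hypothetical fragment using the values printed in the FLAG dump could look like the sketch below; it is illustrative only, not the /etc/kubernetes/kubelet.conf actually deployed on this node, and the eviction threshold is an arbitrary example value:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the FLAG dump, unix:// scheme added)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --register-with-taints
registerWithTaints:
  - key: "node-role.kubernetes.io/master"
    effect: "NoSchedule"
# replaces --system-reserved
systemReserved:
  cpu: "200m"
  memory: "350Mi"
  ephemeral-storage: "350Mi"
# --minimum-container-ttl-duration is superseded by eviction settings, e.g.:
evictionHard:
  memory.available: "100Mi"   # example threshold, not taken from this node

With these fields in the config file, the corresponding command-line flags can be dropped from the kubelet unit, which is what the deprecation messages above recommend.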
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032773 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032777 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032780 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032784 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032787 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032790 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032796 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032800 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032819 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032823 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032826 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032829 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032832 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032836 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032840 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032844 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032847 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032850 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032854 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032857 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032860 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032865 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032869 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032873 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032876 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032880 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032884 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032888 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032892 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032895 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032898 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032901 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032905 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032908 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032911 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032915 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032918 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032921 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032925 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032929 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032933 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032936 4707 feature_gate.go:330] unrecognized feature gate: Example Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032939 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032943 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032947 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032951 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032954 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032957 4707 feature_gate.go:330] unrecognized 
feature gate: OpenShiftPodSecurityAdmission Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032960 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032964 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032967 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032972 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032977 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032981 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032985 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032989 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032992 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.032996 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.033000 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.033003 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.033006 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.033010 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.033599 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.043604 4707 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.043641 4707 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043716 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043724 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043728 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043733 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043737 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043741 4707 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043745 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043748 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043752 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043755 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043759 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043763 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043766 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043770 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043774 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043779 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043783 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043788 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043791 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043797 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043802 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043818 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043822 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043830 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043833 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043837 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043840 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043843 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043847 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043850 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043854 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043859 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043863 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043867 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043871 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043875 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043879 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043883 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043887 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043891 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043895 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043898 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043902 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043906 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043910 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043913 4707 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043917 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043920 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043924 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043928 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043931 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043935 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043939 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043943 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043946 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043950 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043953 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043957 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043961 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043966 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043970 4707 feature_gate.go:330] unrecognized feature gate: Example Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043973 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043977 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043980 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.043983 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044001 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044005 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044009 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044012 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044016 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044020 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.044027 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044129 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044136 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044140 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044143 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044148 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044152 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044156 4707 feature_gate.go:330] unrecognized feature gate: Example Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044160 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044163 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044167 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044170 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044173 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044178 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044181 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044185 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044188 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044191 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 15:01:43 crc 
kubenswrapper[4707]: W0121 15:01:43.044196 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044201 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044205 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044209 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044213 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044217 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044221 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044224 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044228 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044232 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044235 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044239 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044242 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044245 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044248 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044251 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044255 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044259 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044262 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044266 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044285 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044290 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044293 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044297 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044300 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044304 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044307 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044311 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044314 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044317 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044320 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044323 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044328 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044331 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044334 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044338 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044342 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044346 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044350 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044353 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044356 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044361 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044365 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044369 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044373 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044376 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044380 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044383 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044387 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044390 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044394 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044398 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044401 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.044405 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.044412 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.045035 4707 server.go:940] "Client rotation is on, will bootstrap in background" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.047823 4707 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.047904 4707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
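The effective feature gate map logged above (feature_gate.go:386) contains only gates the kubelet itself recognizes; the long runs of "unrecognized feature gate" warnings are OpenShift cluster-level gates that the kubelet simply ignores. If the recognized gates were pinned directly in the kubelet config file rather than passed through by the OpenShift machinery, the equivalent stanza would be the featureGates map of KubeletConfiguration; the snippet below is an illustrative sketch, not this node's actual configuration:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  # gates shown as explicitly enabled in the effective map above
  KMSv1: true
  CloudDualStackNodeIPs: true
  DisableKubeletCloudCredentialProviders: true
  ValidatingAdmissionPolicy: true

Explicitly setting GA or deprecated gates in this way is also what produces the feature_gate.go:351/353 "will be removed in a future release" notices interleaved with the warnings above.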
Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.048602 4707 server.go:997] "Starting client certificate rotation" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.048625 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.048790 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-07 04:40:42.580764366 +0000 UTC Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.048881 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.060964 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.062671 4707 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.063257 4707 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.072543 4707 log.go:25] "Validated CRI v1 runtime API" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.091898 4707 log.go:25] "Validated CRI v1 image API" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.095029 4707 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.098615 4707 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-21-14-57-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.098653 4707 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.115789 4707 manager.go:217] Machine: {Timestamp:2026-01-21 15:01:43.113879403 +0000 UTC m=+0.295395645 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654120448 
SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:64d1d4ed-0a74-4014-b8fb-96fafa8c47a8 BootID:14f7c65e-7534-4950-b6fd-0d06696bf53c Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:50:63:70 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:50:63:70 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:7f:42:af Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:59:11:47 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:87:17:d0 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:a7:b2:d6 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:62:d8:d5:97:6a:5b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d2:7f:64:c6:08:7c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116000 4707 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116209 4707 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116497 4707 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116678 4707 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116707 4707 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116915 4707 topology_manager.go:138] "Creating topology manager with none policy" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.116924 4707 container_manager_linux.go:303] "Creating device plugin manager" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.117198 4707 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.117226 4707 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.117990 4707 state_mem.go:36] "Initialized new in-memory state store" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.118141 4707 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.119944 4707 kubelet.go:418] "Attempting to sync node with API server" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.119970 4707 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.120028 4707 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.120043 4707 kubelet.go:324] "Adding apiserver pod source" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.120057 4707 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.122976 4707 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.123675 4707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
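Annotation: the nodeConfig dump above carries the inputs for node allocatable accounting: SystemReserved (cpu 200m, memory 350Mi, ephemeral-storage 350Mi), a null KubeReserved, and the hard eviction thresholds (memory.available 100Mi, nodefs.available 10%, imagefs.available 15%, inodes 5%). Combined with the MemoryCapacity reported in the Machine entry earlier, the standard allocatable formula is capacity minus kube-reserved minus system-reserved minus the hard eviction threshold. The sketch below works that arithmetic through for memory under the assumption that the usual formula applies; it is not kubelet code.

// Sketch of the node-allocatable arithmetic implied by the config above:
// allocatable = capacity - kubeReserved - systemReserved - hardEvictionThreshold.
// Values come from the MemoryCapacity and nodeConfig entries in this log;
// KubeReserved is null there, so it contributes nothing.
package main

import "fmt"

const Mi = 1024 * 1024

func main() {
	capacity := int64(33654120448)    // MemoryCapacity from the Machine entry
	systemReserved := int64(350 * Mi) // SystemReserved memory: "350Mi"
	evictionHard := int64(100 * Mi)   // memory.available hard threshold: "100Mi"

	allocatable := capacity - systemReserved - evictionHard
	fmt.Printf("allocatable memory: %d bytes (%.2f GiB)\n",
		allocatable, float64(allocatable)/(1024*1024*1024))
}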
Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.123672 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.123740 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.123841 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.123960 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.125158 4707 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126195 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126218 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126241 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126249 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126260 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126267 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126274 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126287 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126295 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126303 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126313 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126319 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.126899 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 
15:01:43.127630 4707 server.go:1280] "Started kubelet" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.128320 4707 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.128305 4707 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.128586 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.128999 4707 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 21 15:01:43 crc systemd[1]: Started Kubernetes Kubelet. Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.129626 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.129653 4707 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.129784 4707 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.129800 4707 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.129850 4707 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.130365 4707 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.130595 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 19:11:36.349549419 +0000 UTC Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.131000 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.131063 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.131413 4707 factory.go:55] Registering systemd factory Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.131430 4707 factory.go:221] Registration of the systemd container factory successfully Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.135346 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="200ms" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.136321 4707 factory.go:153] Registering CRI-O factory Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.136354 4707 factory.go:221] Registration of the crio container 
factory successfully Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.136505 4707 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.136537 4707 factory.go:103] Registering Raw factory Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.136551 4707 manager.go:1196] Started watching for new ooms in manager Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.136751 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.165:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188cc72225d533d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 15:01:43.127307217 +0000 UTC m=+0.308823439,LastTimestamp:2026-01-21 15:01:43.127307217 +0000 UTC m=+0.308823439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.140751 4707 server.go:460] "Adding debug handlers to kubelet server" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.140899 4707 manager.go:319] Starting recovery of all containers Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144339 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144379 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144392 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144402 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144412 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144422 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144432 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144442 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144455 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144465 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144476 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144485 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144496 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144509 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144533 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144542 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144552 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144561 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144571 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144581 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144590 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144600 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144610 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144620 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144628 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144638 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144651 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144662 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144671 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144680 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144693 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144702 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144712 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144721 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144731 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144740 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144750 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144759 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144767 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144776 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144784 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144793 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144802 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144827 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144836 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144846 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144856 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144865 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144877 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144885 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144894 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144903 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144915 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144924 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144936 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144946 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144954 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144963 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144971 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144980 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144989 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.144999 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145009 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145017 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145028 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145037 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145046 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145055 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145064 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145073 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145081 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.145089 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146143 4707 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146434 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146449 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146466 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146475 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146485 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146495 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146504 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146513 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146524 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146532 4707 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146542 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146552 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146560 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146570 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146580 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146589 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146599 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146607 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146617 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146626 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146636 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146644 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146654 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146663 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146671 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146680 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146688 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146697 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146706 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146714 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146722 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146733 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146751 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146760 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146771 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146783 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146793 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146825 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146836 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146847 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146857 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146867 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146876 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146886 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146895 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146904 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146913 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146921 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146930 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146939 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146947 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146956 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146964 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146975 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146984 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.146991 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147000 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147012 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147021 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147031 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147040 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147048 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147058 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147067 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147075 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147085 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147093 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147101 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147109 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147119 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147132 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147174 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147184 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147193 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147203 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147236 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147246 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147257 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147267 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147283 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147292 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147300 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147310 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147318 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147326 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147337 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147347 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147355 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147365 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147374 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147383 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147392 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147401 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147410 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147420 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147430 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147439 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147449 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147458 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147468 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147478 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147487 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147496 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147505 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147514 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147522 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147530 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147539 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147548 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147555 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147564 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147572 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147580 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147587 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147596 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147606 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147614 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147623 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147631 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147641 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147650 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147666 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147674 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147685 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147694 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147702 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147711 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147720 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147728 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147737 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147745 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147753 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147761 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147774 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147782 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147791 4707 reconstruct.go:97] "Volume reconstruction finished" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.147798 4707 reconciler.go:26] "Reconciler: start to sync state" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.154353 4707 manager.go:324] Recovery completed Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.162710 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.166344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.166389 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.166400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.167574 4707 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.167600 4707 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.167625 4707 state_mem.go:36] "Initialized new in-memory state store" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.172083 4707 policy_none.go:49] "None policy: Start" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.172788 4707 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.172837 4707 state_mem.go:35] "Initializing new in-memory state store" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.179590 4707 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.181327 4707 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.181368 4707 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.181395 4707 kubelet.go:2335] "Starting kubelet main sync loop" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.181435 4707 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.182402 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.182484 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.219605 4707 manager.go:334] "Starting Device Plugin manager" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.219649 4707 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.219660 4707 server.go:79] "Starting device plugin registration server" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.220019 4707 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.220037 4707 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.220193 4707 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.220278 4707 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.220293 4707 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.227749 4707 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.282203 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.282308 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.283240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.283273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.283284 
4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.283449 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.283689 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.283724 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284227 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284372 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284425 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284474 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284724 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.284866 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285168 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285196 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285573 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285666 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.285731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286351 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286367 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.286946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.320483 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.321669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.321718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.321729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.321757 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.322572 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.165:6443: connect: connection refused" node="crc" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.336236 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="400ms" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.351832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.351883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.351906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.351927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.351945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.352252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.453985 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: 
I0121 15:01:43.454254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.454512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.522892 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.524032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.524067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.524090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.524114 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.524571 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.165:6443: connect: connection refused" node="crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.623351 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.640671 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.644609 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a85b8f453e6e4cac4c62035e95d47fdf01cb33a6a47b8b7e7e467b8a679d412b WatchSource:0}: Error finding container a85b8f453e6e4cac4c62035e95d47fdf01cb33a6a47b8b7e7e467b8a679d412b: Status 404 returned error can't find the container with id a85b8f453e6e4cac4c62035e95d47fdf01cb33a6a47b8b7e7e467b8a679d412b Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.654320 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-60a0dcbcd1034e6b9fdc7b0624ec1ccf416d53697d9458faac4d171817ba051d WatchSource:0}: Error finding container 60a0dcbcd1034e6b9fdc7b0624ec1ccf416d53697d9458faac4d171817ba051d: Status 404 returned error can't find the container with id 60a0dcbcd1034e6b9fdc7b0624ec1ccf416d53697d9458faac4d171817ba051d Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.665946 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.674997 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-5410b7e3af9e681c652ab8b19389c36e5017d192df2afbf3d3c8e8ad5d04b753 WatchSource:0}: Error finding container 5410b7e3af9e681c652ab8b19389c36e5017d192df2afbf3d3c8e8ad5d04b753: Status 404 returned error can't find the container with id 5410b7e3af9e681c652ab8b19389c36e5017d192df2afbf3d3c8e8ad5d04b753 Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.690974 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.697200 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.700997 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9edcc2c0afb7ea5b3c33f11608b3147ee5f393d4f5c372f496feab524f5d8d06 WatchSource:0}: Error finding container 9edcc2c0afb7ea5b3c33f11608b3147ee5f393d4f5c372f496feab524f5d8d06: Status 404 returned error can't find the container with id 9edcc2c0afb7ea5b3c33f11608b3147ee5f393d4f5c372f496feab524f5d8d06 Jan 21 15:01:43 crc kubenswrapper[4707]: W0121 15:01:43.707161 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f07039e1cf11a782b8c51545e2d9a86c61094cdec03993b033b2a53be2ec4af3 WatchSource:0}: Error finding container f07039e1cf11a782b8c51545e2d9a86c61094cdec03993b033b2a53be2ec4af3: Status 404 returned error can't find the container with id f07039e1cf11a782b8c51545e2d9a86c61094cdec03993b033b2a53be2ec4af3 Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.737132 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="800ms" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.928843 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.931108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.931153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.931162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:43 crc kubenswrapper[4707]: I0121 15:01:43.931186 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:01:43 crc kubenswrapper[4707]: E0121 15:01:43.931886 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.165:6443: connect: connection refused" node="crc" Jan 21 15:01:44 crc kubenswrapper[4707]: W0121 15:01:44.070943 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:44 crc kubenswrapper[4707]: E0121 15:01:44.071009 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.129284 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.131305 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 02:12:49.584904281 +0000 UTC Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.187718 4707 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742" exitCode=0 Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.187827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.187981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f07039e1cf11a782b8c51545e2d9a86c61094cdec03993b033b2a53be2ec4af3"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.188094 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.188981 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.189019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.189030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.189858 4707 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a" exitCode=0 Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.189935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.189994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9edcc2c0afb7ea5b3c33f11608b3147ee5f393d4f5c372f496feab524f5d8d06"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.190098 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.190946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.190976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.190988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.191048 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.191088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5410b7e3af9e681c652ab8b19389c36e5017d192df2afbf3d3c8e8ad5d04b753"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.192527 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f" exitCode=0 Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.192584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.192602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60a0dcbcd1034e6b9fdc7b0624ec1ccf416d53697d9458faac4d171817ba051d"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.192668 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.193444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.193471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.193481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.194548 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="698a0bb8dba9951c181a7f819389706ff91c21125a8705fe76e5173a8e0d71a8" exitCode=0 Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.194573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"698a0bb8dba9951c181a7f819389706ff91c21125a8705fe76e5173a8e0d71a8"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.194588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a85b8f453e6e4cac4c62035e95d47fdf01cb33a6a47b8b7e7e467b8a679d412b"} Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.194660 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.194698 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.195290 4707 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.195327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.195338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.195458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.195483 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.195493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:44 crc kubenswrapper[4707]: W0121 15:01:44.322073 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:44 crc kubenswrapper[4707]: E0121 15:01:44.322147 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:44 crc kubenswrapper[4707]: W0121 15:01:44.422592 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:44 crc kubenswrapper[4707]: E0121 15:01:44.422678 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:44 crc kubenswrapper[4707]: E0121 15:01:44.538113 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="1.6s" Jan 21 15:01:44 crc kubenswrapper[4707]: W0121 15:01:44.710457 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.165:6443: connect: connection refused Jan 21 15:01:44 crc kubenswrapper[4707]: E0121 15:01:44.710546 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.165:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.732026 4707 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.733388 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.733438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.733451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:44 crc kubenswrapper[4707]: I0121 15:01:44.733493 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:01:44 crc kubenswrapper[4707]: E0121 15:01:44.734026 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.165:6443: connect: connection refused" node="crc" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.127504 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.132195 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 06:15:06.410663871 +0000 UTC Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.199071 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4c4e6e7530fbf2681c46e3cc91b37cefa9a0f30a496bc049cfbff12a9a4e3d31" exitCode=0 Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.199139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c4e6e7530fbf2681c46e3cc91b37cefa9a0f30a496bc049cfbff12a9a4e3d31"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.199246 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.199873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.199908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.199919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.200845 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.200835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bddde6ec5b36fdff21a26422f9d552fffa7b8fea406b82552daafa46943b5b0d"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.201526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.201554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.201563 4707 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.203450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.203479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.203491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.203600 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.204310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.204334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.204343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.205348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.205375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.205388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.205378 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.205978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.206010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.206020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.207764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.207828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.207841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.207850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.207860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6"} Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.207929 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.208543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.208582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.208593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:45 crc kubenswrapper[4707]: I0121 15:01:45.475060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.132966 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:24:16.996487575 +0000 UTC Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211613 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ce934ba3545183939ee0825701b4f127825ebe59788e0b9b9aa08e2e18e28692" exitCode=0 Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211749 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211784 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211789 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ce934ba3545183939ee0825701b4f127825ebe59788e0b9b9aa08e2e18e28692"} Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211938 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.211797 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.212986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213021 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.212986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.213521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.334358 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.335230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.335257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.335266 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:46 crc kubenswrapper[4707]: I0121 15:01:46.335289 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.039211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 
15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.133659 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:39:48.497404079 +0000 UTC Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d34c4d7c3947e5fcbd9a96151b30de17ff1124612eb7525177c2344c8625ae5e"} Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b2f3829436e297e032723e4c63a5b0a19c28e1e1b2a46fa18abc6858bb427d8c"} Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"68da03de7dff23aaddb13c005cd9cc46f744829679f7e362499937dd96cc2efc"} Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"35cd877686c4f6de7f9e976bb1195d7dd34e9da39f2e584c5575da8ddb3856af"} Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217392 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217441 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0925e2e555e1f73758141e7a8c14360bfa384808371aa58655b60dac8cad7170"} Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.217508 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.218374 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.218404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.218415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.218372 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.218459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.218471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:47 crc kubenswrapper[4707]: I0121 15:01:47.600913 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.134193 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:05:29.607075593 +0000 UTC Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.219895 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.220740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.220781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.220791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.814804 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.814961 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.815775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.815843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.815853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.819529 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:48 crc kubenswrapper[4707]: I0121 15:01:48.939646 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.134518 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 23:27:02.516075944 +0000 UTC Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.181832 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.182001 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.182851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.182907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.182920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.221727 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.221741 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.222885 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.222931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.222941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.223060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.223096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.223108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.255493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.255582 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.256354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.256399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:49 crc kubenswrapper[4707]: I0121 15:01:49.256410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.135571 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 03:04:42.67063376 +0000 UTC Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.223619 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.224498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.224545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.224557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.390604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.390716 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.390753 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.391694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.391720 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.391732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.525732 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:50 crc kubenswrapper[4707]: I0121 15:01:50.847313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:01:51 crc kubenswrapper[4707]: I0121 15:01:51.136221 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 13:25:46.783607447 +0000 UTC Jan 21 15:01:51 crc kubenswrapper[4707]: I0121 15:01:51.224915 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:51 crc kubenswrapper[4707]: I0121 15:01:51.225568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:51 crc kubenswrapper[4707]: I0121 15:01:51.225600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:51 crc kubenswrapper[4707]: I0121 15:01:51.225611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:52 crc kubenswrapper[4707]: I0121 15:01:52.136639 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 03:58:35.171487987 +0000 UTC Jan 21 15:01:52 crc kubenswrapper[4707]: I0121 15:01:52.226893 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:52 crc kubenswrapper[4707]: I0121 15:01:52.227618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:52 crc kubenswrapper[4707]: I0121 15:01:52.227647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:52 crc kubenswrapper[4707]: I0121 15:01:52.227656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:53 crc kubenswrapper[4707]: I0121 15:01:53.136942 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 01:08:39.67675101 +0000 UTC Jan 21 15:01:53 crc kubenswrapper[4707]: E0121 15:01:53.228678 4707 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 21 15:01:53 crc kubenswrapper[4707]: I0121 15:01:53.525783 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 15:01:53 crc kubenswrapper[4707]: I0121 15:01:53.525894 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:01:54 crc kubenswrapper[4707]: I0121 15:01:54.137344 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 13:10:02.408157478 +0000 UTC Jan 21 15:01:55 crc kubenswrapper[4707]: E0121 15:01:55.129345 4707 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 21 15:01:55 crc kubenswrapper[4707]: I0121 15:01:55.129405 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 21 15:01:55 crc kubenswrapper[4707]: I0121 15:01:55.137685 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:01:03.120241427 +0000 UTC Jan 21 15:01:55 crc kubenswrapper[4707]: I0121 15:01:55.475661 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 15:01:55 crc kubenswrapper[4707]: I0121 15:01:55.475749 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:01:55 crc kubenswrapper[4707]: I0121 15:01:55.612459 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 21 15:01:55 crc kubenswrapper[4707]: I0121 15:01:55.612511 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 21 15:01:56 crc kubenswrapper[4707]: I0121 15:01:56.138600 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:12:35.630144068 +0000 UTC Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.139355 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:15:51.975584262 +0000 UTC Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.620422 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.620531 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.621446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.621481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.621490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:57 crc kubenswrapper[4707]: I0121 15:01:57.629429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 21 15:01:58 crc kubenswrapper[4707]: I0121 15:01:58.139778 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:31:01.855186254 +0000 UTC Jan 21 15:01:58 crc kubenswrapper[4707]: I0121 15:01:58.239002 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:01:58 crc kubenswrapper[4707]: I0121 15:01:58.241424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:01:58 crc kubenswrapper[4707]: I0121 15:01:58.241458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:01:58 crc kubenswrapper[4707]: I0121 15:01:58.241471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:01:59 crc kubenswrapper[4707]: I0121 15:01:59.140720 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:56:02.036373709 +0000 UTC Jan 21 15:01:59 crc kubenswrapper[4707]: I0121 15:01:59.246533 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 15:01:59 crc kubenswrapper[4707]: I0121 15:01:59.255781 4707 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.141510 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 22:39:33.156470833 +0000 UTC Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.479949 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.480098 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.480880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.480906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.480915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.483157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.615625 4707 trace.go:236] Trace[818094437]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:01:46.450) (total time: 14164ms): Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[818094437]: ---"Objects listed" error: 14164ms (15:02:00.615) Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[818094437]: [14.164711439s] [14.164711439s] END Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.615657 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 21 15:02:00 crc kubenswrapper[4707]: E0121 15:02:00.615670 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.615772 4707 trace.go:236] Trace[1823579809]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:01:47.510) (total time: 13105ms): Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[1823579809]: ---"Objects listed" error: 13105ms (15:02:00.615) Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[1823579809]: [13.105679892s] [13.105679892s] END Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.615789 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.617434 4707 trace.go:236] Trace[244709800]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:01:46.677) (total time: 13939ms): Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[244709800]: ---"Objects listed" error: 13939ms (15:02:00.617) Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[244709800]: [13.939676375s] [13.939676375s] END Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.617458 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.617460 4707 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.617576 4707 trace.go:236] Trace[1350526685]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:01:46.859) (total time: 13757ms): Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[1350526685]: ---"Objects listed" error: 13757ms (15:02:00.617) Jan 21 15:02:00 crc kubenswrapper[4707]: Trace[1350526685]: [13.757686168s] [13.757686168s] END Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.617591 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 15:02:00 crc kubenswrapper[4707]: E0121 15:02:00.617635 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.641377 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43962->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.641395 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43966->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.641432 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43962->192.168.126.11:17697: read: connection reset by peer" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.641455 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43966->192.168.126.11:17697: read: connection reset by peer" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.641671 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.641688 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.657218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.660554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:02:00 crc kubenswrapper[4707]: I0121 15:02:00.660638 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.129215 4707 apiserver.go:52] "Watching apiserver" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.132858 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133132 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133451 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133547 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.133604 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133647 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133656 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.133646 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.133715 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.133979 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.135660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.135894 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.136324 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.136347 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.136368 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.136490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.140855 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.140947 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.141169 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.141638 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 11:26:51.840269858 +0000 UTC Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.181829 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.192294 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.204509 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.212235 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.222799 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.231197 4707 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.232520 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.239882 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.245421 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.247318 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc" exitCode=255 Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.247380 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc"} Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.248723 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.251328 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.254678 4707 scope.go:117] "RemoveContainer" containerID="f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.255755 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.255784 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.262928 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.271515 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.279483 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.290044 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.298448 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.307421 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.315228 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc 
kubenswrapper[4707]: I0121 15:02:01.321582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.321974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322541 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.322638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323675 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324028 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324245 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 15:02:01 crc kubenswrapper[4707]: 
I0121 15:02:01.324395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325277 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325407 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325602 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326005 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326179 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 21 
15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326509 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") 
" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327141 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327152 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327161 4707 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327171 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327181 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327191 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327201 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327212 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327222 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327231 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327239 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327251 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327260 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327268 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327277 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.330127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.323918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.324939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.330844 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331071 4707 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.326991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.327849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.328369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.328543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.328869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.328867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.328929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.328598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.329052 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:02:01.828998261 +0000 UTC m=+19.010514483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.331688 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:01.831668476 +0000 UTC m=+19.013184698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332392 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.332635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.333127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.333197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.333497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.333694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.333826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.329907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.330209 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.334155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.334180 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:01.834165973 +0000 UTC m=+19.015682195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.331627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.334442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.334652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.334702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.335285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.335575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.335740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.335791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.335970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.336084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.336283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.336216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.336609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.336964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.337973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.338326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.338449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.338658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.338848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.339135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.339273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.339459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.339742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.339789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.340052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.325738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.340311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.340751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.340774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.341030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.341142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.341479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.341545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.341594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341623 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341636 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341644 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341648 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341656 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341664 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.341681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341689 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:01.841676218 +0000 UTC m=+19.023192440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.341714 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:01.841704232 +0000 UTC m=+19.023220454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.342059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.343655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.346340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.349525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.349540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.349759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.349834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.349779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.349789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350665 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.350797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351806 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.351780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.352974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.353975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354371 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.354887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355624 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.355939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.356024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.356495 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.356502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.357100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.358545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.359944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.370919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.372694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.373883 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.427873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.427909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.427983 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.427994 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428003 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428012 4707 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428019 4707 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428028 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428036 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428044 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428052 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428060 4707 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428090 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428100 4707 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428108 4707 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428117 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428125 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428134 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428143 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428151 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428158 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428166 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428173 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428181 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428190 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc 
kubenswrapper[4707]: I0121 15:02:01.428197 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428206 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428214 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428223 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428231 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428239 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428246 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428254 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428262 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428269 4707 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428277 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428285 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428293 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428301 
4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428308 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428316 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428324 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428331 4707 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428339 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428348 4707 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428356 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428365 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428373 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428381 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428389 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428422 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428431 4707 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428440 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428448 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428457 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428464 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428472 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428480 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428488 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428496 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428504 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428519 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428565 4707 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428576 4707 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428586 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428596 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428605 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428614 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428623 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428631 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428640 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428649 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428656 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428664 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428701 4707 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428852 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428873 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428885 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428895 4707 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428906 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428915 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428924 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428935 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.428944 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429257 4707 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429274 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 
15:02:01.429288 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429298 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429308 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429316 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429326 4707 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429337 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429348 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429358 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429367 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429375 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429385 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429405 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 
15:02:01.429414 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429423 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429432 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429442 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429450 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429461 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429471 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429480 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429488 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429498 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429506 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429515 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429524 4707 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc 
kubenswrapper[4707]: I0121 15:02:01.429533 4707 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429541 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429551 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429560 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429567 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429576 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429584 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429593 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429604 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429613 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429627 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429637 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429667 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: 
I0121 15:02:01.429677 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429685 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429694 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429743 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429754 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429765 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429773 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429781 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429790 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429802 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429825 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429835 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429844 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc 
kubenswrapper[4707]: I0121 15:02:01.429854 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429863 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429873 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429883 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429891 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429900 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429908 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429917 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429926 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429954 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429963 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429972 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.429998 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430007 4707 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430014 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430024 4707 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430036 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430062 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430071 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430097 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430106 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430115 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430123 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430131 4707 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430138 4707 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430147 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430156 4707 reconciler_common.go:293] 
"Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430165 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430174 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430182 4707 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430192 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430201 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430214 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430224 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430234 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430252 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430259 4707 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430273 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430280 4707 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430290 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.430298 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.444156 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.450134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.455124 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:02:01 crc kubenswrapper[4707]: W0121 15:02:01.457147 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-46bf219d8af8d588d9bf2ff1dcbd014e48ca9d4bcd8925c526c8d41f5931198e WatchSource:0}: Error finding container 46bf219d8af8d588d9bf2ff1dcbd014e48ca9d4bcd8925c526c8d41f5931198e: Status 404 returned error can't find the container with id 46bf219d8af8d588d9bf2ff1dcbd014e48ca9d4bcd8925c526c8d41f5931198e Jan 21 15:02:01 crc kubenswrapper[4707]: W0121 15:02:01.463334 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-355eb1c098c34770cc6713b48c66753e32913b7a936b0a3dd7c27b363aec1525 WatchSource:0}: Error finding container 355eb1c098c34770cc6713b48c66753e32913b7a936b0a3dd7c27b363aec1525: Status 404 returned error can't find the container with id 355eb1c098c34770cc6713b48c66753e32913b7a936b0a3dd7c27b363aec1525 Jan 21 15:02:01 crc kubenswrapper[4707]: W0121 15:02:01.465671 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-58fe1b37f1989b095b583e02170b851409f05bec970792a31eca65dcb811d870 WatchSource:0}: Error finding container 58fe1b37f1989b095b583e02170b851409f05bec970792a31eca65dcb811d870: Status 404 returned error can't find the container with id 58fe1b37f1989b095b583e02170b851409f05bec970792a31eca65dcb811d870 Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.833073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.833176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.833254 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.833256 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:02:02.833232631 +0000 UTC m=+20.014748853 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.833303 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:02.833291444 +0000 UTC m=+20.014807665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.934351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.934408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:01 crc kubenswrapper[4707]: I0121 15:02:01.934432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934532 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934572 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934606 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934618 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934572 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934688 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934700 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934587 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:02.93457443 +0000 UTC m=+20.116090652 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934748 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:02.934733503 +0000 UTC m=+20.116249714 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:01 crc kubenswrapper[4707]: E0121 15:02:01.934760 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:02.934754411 +0000 UTC m=+20.116270634 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.142366 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 01:46:03.145751431 +0000 UTC Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.252627 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.255022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e"} Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.255379 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.257015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f"} Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.257047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf"} Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.257058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"58fe1b37f1989b095b583e02170b851409f05bec970792a31eca65dcb811d870"} Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.262350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c"} Jan 21 15:02:02 crc 
kubenswrapper[4707]: I0121 15:02:02.262387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"355eb1c098c34770cc6713b48c66753e32913b7a936b0a3dd7c27b363aec1525"} Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.269193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"46bf219d8af8d588d9bf2ff1dcbd014e48ca9d4bcd8925c526c8d41f5931198e"} Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.271248 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.286156 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.299291 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.311255 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.322823 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.332959 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.347328 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.373153 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.395205 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.415780 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.426851 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.437039 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.447858 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.460792 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.471439 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.481443 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:02Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.840910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.840999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.841055 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:02:04.841034155 +0000 UTC m=+22.022550377 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.841063 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.841133 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:04.841125528 +0000 UTC m=+22.022641751 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.942100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.942150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:02 crc kubenswrapper[4707]: I0121 15:02:02.942193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942290 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942345 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:04.942331839 +0000 UTC m=+22.123848061 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942343 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942386 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942400 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942357 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942464 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:04.942445455 +0000 UTC m=+22.123961677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942490 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942505 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:02 crc kubenswrapper[4707]: E0121 15:02:02.942539 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:04.942528143 +0000 UTC m=+22.124044364 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.142497 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 00:00:17.535577264 +0000 UTC Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.181716 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.181793 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.181823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.181886 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.181941 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.182017 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.184991 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.185650 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.186384 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.186960 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.187498 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.187972 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.188552 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.189049 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.189609 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.190091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.190540 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.191160 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.191592 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.192082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.192553 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.193025 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.193540 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.193908 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.195327 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.196995 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.197603 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.198087 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.198598 4707 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.199012 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.199609 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.199994 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.200550 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.201137 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.201539 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.202042 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.202465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.202888 4707 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.202980 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.204172 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.204594 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.204966 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.206790 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef76
9ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.208490 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.209396 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.209864 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.210752 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.211336 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.211734 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.212589 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.213453 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.214072 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.214796 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.215316 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.216079 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.216703 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.217459 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.217878 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.218314 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.218471 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.219232 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.219711 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.220510 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.234284 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.247142 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.259288 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.268944 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.281015 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.817839 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.819376 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.819551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.819624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.819825 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.825023 4707 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.825206 4707 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.825947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.825979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.825989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.826006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.826015 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:03Z","lastTransitionTime":"2026-01-21T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.840722 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 
2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.843094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.843149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.843161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.843175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.843184 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:03Z","lastTransitionTime":"2026-01-21T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.854046 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 
2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.857337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.857373 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.857383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.857400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.857410 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:03Z","lastTransitionTime":"2026-01-21T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.867520 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 
2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.870617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.870647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.870656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.870668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.870679 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:03Z","lastTransitionTime":"2026-01-21T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.885314 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 
2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.889699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.889745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.889756 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.889771 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.889779 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:03Z","lastTransitionTime":"2026-01-21T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.900242 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:03Z is after 
2025-08-24T17:21:41Z" Jan 21 15:02:03 crc kubenswrapper[4707]: E0121 15:02:03.900399 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.902105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.902155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.902166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.902184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:03 crc kubenswrapper[4707]: I0121 15:02:03.902194 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:03Z","lastTransitionTime":"2026-01-21T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.005143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.005472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.005484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.005502 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.005515 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.107188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.107235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.107245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.107260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.107269 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.142616 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 18:41:00.854478703 +0000 UTC Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.209950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.209990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.210001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.210016 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.210025 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.274917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.290245 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.301558 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.312378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.312424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.312436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.312457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.312467 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.313110 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.323182 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.332553 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.343413 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.355858 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.366095 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:04Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.414905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.414953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.414963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.414980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.414990 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.516908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.516950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.516959 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.516980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.516990 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.619277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.619320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.619330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.619346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.619355 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.721488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.721520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.721528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.721557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.721568 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.823586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.823623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.823634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.823649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.823658 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.854394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.854483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.854560 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.854595 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:02:08.854566745 +0000 UTC m=+26.036082977 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.854629 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:08.854619746 +0000 UTC m=+26.036135978 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.925007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.925074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.925083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.925098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.925106 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:04Z","lastTransitionTime":"2026-01-21T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.955378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.955419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:04 crc kubenswrapper[4707]: I0121 15:02:04.955438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955521 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955534 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955550 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 
15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955561 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955571 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:08.955559099 +0000 UTC m=+26.137075322 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955585 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955608 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955619 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955589 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:08.955580691 +0000 UTC m=+26.137096913 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:04 crc kubenswrapper[4707]: E0121 15:02:04.955678 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:08.955662586 +0000 UTC m=+26.137178808 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.027453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.027515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.027525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.027539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.027548 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.130101 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.130146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.130171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.130186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.130195 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.143327 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 13:56:17.728274703 +0000 UTC Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.181891 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.181880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:05 crc kubenswrapper[4707]: E0121 15:02:05.182000 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.181898 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:05 crc kubenswrapper[4707]: E0121 15:02:05.182145 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:05 crc kubenswrapper[4707]: E0121 15:02:05.182219 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.232083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.232121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.232130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.232144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.232171 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.334311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.334359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.334369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.334383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.334391 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.436370 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.436408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.436418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.436432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.436441 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.538548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.538593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.538603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.538618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.538628 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.641603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.641662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.641674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.641690 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.641704 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.743361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.743405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.743414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.743428 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.743437 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.845735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.845782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.845791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.845822 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.845833 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.948011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.948051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.948060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.948076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:05 crc kubenswrapper[4707]: I0121 15:02:05.948086 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:05Z","lastTransitionTime":"2026-01-21T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.050019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.050063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.050071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.050087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.050101 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.143945 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 22:10:10.192109353 +0000 UTC Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.152719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.152759 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.152768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.152784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.152794 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.183609 4707 csr.go:261] certificate signing request csr-g7pzm is approved, waiting to be issued Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.189302 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jspjl"] Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.189312 4707 csr.go:257] certificate signing request csr-g7pzm is issued Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.189561 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xjrsl"] Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.189707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.189740 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.192129 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.192265 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.192332 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.193615 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.193626 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.193692 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.193844 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.211594 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.226486 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.238096 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.247425 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.255516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.255563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.255574 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.255592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.255602 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.259154 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1
74f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.266227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fe6f2caf-2457-4027-a1df-0dfa2c670a65-hosts-file\") pod \"node-resolver-jspjl\" (UID: \"fe6f2caf-2457-4027-a1df-0dfa2c670a65\") " pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.266261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07107e90-ae92-467e-8436-2c24c21b5554-host\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.266303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07107e90-ae92-467e-8436-2c24c21b5554-serviceca\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.266325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbh2q\" (UniqueName: \"kubernetes.io/projected/07107e90-ae92-467e-8436-2c24c21b5554-kube-api-access-xbh2q\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.266387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rdg\" (UniqueName: \"kubernetes.io/projected/fe6f2caf-2457-4027-a1df-0dfa2c670a65-kube-api-access-s4rdg\") pod \"node-resolver-jspjl\" (UID: \"fe6f2caf-2457-4027-a1df-0dfa2c670a65\") " pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.267067 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.276009 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.285677 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lxkz2"] Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.286018 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.287559 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.288067 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.288581 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.288973 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.289140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.307491 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.322508 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.350462 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.358297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.358352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.358367 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.358442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.358454 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.364081 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.366798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-cni-multus\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.366871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-hostroot\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 
15:02:06.366903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07107e90-ae92-467e-8436-2c24c21b5554-serviceca\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.366924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-kubelet\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.366945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-system-cni-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-os-release\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-cni-bin\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbh2q\" (UniqueName: \"kubernetes.io/projected/07107e90-ae92-467e-8436-2c24c21b5554-kube-api-access-xbh2q\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-cnibin\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-netns\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-daemon-config\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s4rdg\" (UniqueName: \"kubernetes.io/projected/fe6f2caf-2457-4027-a1df-0dfa2c670a65-kube-api-access-s4rdg\") pod \"node-resolver-jspjl\" (UID: \"fe6f2caf-2457-4027-a1df-0dfa2c670a65\") " pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-socket-dir-parent\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-etc-kubernetes\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fe6f2caf-2457-4027-a1df-0dfa2c670a65-hosts-file\") pod \"node-resolver-jspjl\" (UID: \"fe6f2caf-2457-4027-a1df-0dfa2c670a65\") " pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07107e90-ae92-467e-8436-2c24c21b5554-host\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-conf-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-multus-certs\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07107e90-ae92-467e-8436-2c24c21b5554-host\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-cni-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-cni-binary-copy\") pod 
\"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fe6f2caf-2457-4027-a1df-0dfa2c670a65-hosts-file\") pod \"node-resolver-jspjl\" (UID: \"fe6f2caf-2457-4027-a1df-0dfa2c670a65\") " pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-k8s-cni-cncf-io\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.367565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79znq\" (UniqueName: \"kubernetes.io/projected/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-kube-api-access-79znq\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.368335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07107e90-ae92-467e-8436-2c24c21b5554-serviceca\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.376433 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.382497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rdg\" (UniqueName: \"kubernetes.io/projected/fe6f2caf-2457-4027-a1df-0dfa2c670a65-kube-api-access-s4rdg\") pod \"node-resolver-jspjl\" (UID: \"fe6f2caf-2457-4027-a1df-0dfa2c670a65\") " pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.383013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbh2q\" (UniqueName: \"kubernetes.io/projected/07107e90-ae92-467e-8436-2c24c21b5554-kube-api-access-xbh2q\") pod \"node-ca-xjrsl\" (UID: \"07107e90-ae92-467e-8436-2c24c21b5554\") " pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 
15:02:06.387638 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.401853 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.412691 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.424465 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.438157 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.450779 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.460729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.460771 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.460781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.460798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.460825 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.462703 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.468890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-kubelet\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.468934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-system-cni-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.468957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-os-release\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.468976 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-cni-bin\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.468992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-netns\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-daemon-config\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-cnibin\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-kubelet\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-socket-dir-parent\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-etc-kubernetes\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-system-cni-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-conf-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-cnibin\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " 
pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-os-release\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-netns\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-etc-kubernetes\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-conf-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-cni-bin\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-multus-certs\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79znq\" (UniqueName: \"kubernetes.io/projected/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-kube-api-access-79znq\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-socket-dir-parent\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-cni-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-multus-certs\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-cni-binary-copy\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-k8s-cni-cncf-io\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-cni-multus\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-hostroot\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-hostroot\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-run-k8s-cni-cncf-io\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-cni-dir\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.469595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-host-var-lib-cni-multus\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.470211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-multus-daemon-config\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 
15:02:06.470237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-cni-binary-copy\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.472765 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.482877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79znq\" (UniqueName: \"kubernetes.io/projected/e2bdbb11-a196-4dc3-b197-64ef1bec8e8a-kube-api-access-79znq\") pod \"multus-lxkz2\" (UID: \"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\") " pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.488735 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.501905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jspjl" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.506829 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xjrsl" Jan 21 15:02:06 crc kubenswrapper[4707]: W0121 15:02:06.513281 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6f2caf_2457_4027_a1df_0dfa2c670a65.slice/crio-a2a8ef683b2635c733ec9c60c9498ea308e28244325856572a3a9b3eaaf6cc36 WatchSource:0}: Error finding container a2a8ef683b2635c733ec9c60c9498ea308e28244325856572a3a9b3eaaf6cc36: Status 404 returned error can't find the container with id a2a8ef683b2635c733ec9c60c9498ea308e28244325856572a3a9b3eaaf6cc36 Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.563092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.563142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.563153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.563171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.563196 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.596225 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lxkz2" Jan 21 15:02:06 crc kubenswrapper[4707]: W0121 15:02:06.619036 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bdbb11_a196_4dc3_b197_64ef1bec8e8a.slice/crio-2d89c8bb3590b051abde03550f9dcb5a10c32c1e039ea10f2e2dfbfff4f5287f WatchSource:0}: Error finding container 2d89c8bb3590b051abde03550f9dcb5a10c32c1e039ea10f2e2dfbfff4f5287f: Status 404 returned error can't find the container with id 2d89c8bb3590b051abde03550f9dcb5a10c32c1e039ea10f2e2dfbfff4f5287f Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.664750 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fmg2k"] Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.665103 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.665234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.665272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.665282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.665300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.665309 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.666294 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mk2p2"] Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.666833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.668066 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.668108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.668242 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.668481 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.668586 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.671150 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.671329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.680113 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.691625 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.702207 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.710552 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.747742 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.767856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.767890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.767899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: 
I0121 15:02:06.767944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.767955 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.768751 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-rootfs\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-os-release\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772116 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-cnibin\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-mcd-auth-proxy-config\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6eefe63-30b0-46e6-af7b-3909a3849128-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8tq\" (UniqueName: \"kubernetes.io/projected/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-kube-api-access-fd8tq\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6eefe63-30b0-46e6-af7b-3909a3849128-cni-binary-copy\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-proxy-tls\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sv4\" (UniqueName: \"kubernetes.io/projected/f6eefe63-30b0-46e6-af7b-3909a3849128-kube-api-access-57sv4\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.772476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-system-cni-dir\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.791202 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.811717 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.822760 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.833724 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.843779 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.851828 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.861052 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.869727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.869939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.870032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.870117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.870212 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.872356 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.872943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8tq\" (UniqueName: \"kubernetes.io/projected/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-kube-api-access-fd8tq\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.872978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6eefe63-30b0-46e6-af7b-3909a3849128-cni-binary-copy\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-proxy-tls\") pod 
\"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sv4\" (UniqueName: \"kubernetes.io/projected/f6eefe63-30b0-46e6-af7b-3909a3849128-kube-api-access-57sv4\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-system-cni-dir\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-rootfs\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-os-release\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-cnibin\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-mcd-auth-proxy-config\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6eefe63-30b0-46e6-af7b-3909a3849128-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873307 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-system-cni-dir\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-rootfs\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-os-release\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6eefe63-30b0-46e6-af7b-3909a3849128-cnibin\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6eefe63-30b0-46e6-af7b-3909a3849128-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.873951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6eefe63-30b0-46e6-af7b-3909a3849128-cni-binary-copy\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.874095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-mcd-auth-proxy-config\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.876548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-proxy-tls\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.887783 4707 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.888288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8tq\" (UniqueName: \"kubernetes.io/projected/f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f-kube-api-access-fd8tq\") pod \"machine-config-daemon-fmg2k\" (UID: \"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\") " pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.891414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sv4\" (UniqueName: \"kubernetes.io/projected/f6eefe63-30b0-46e6-af7b-3909a3849128-kube-api-access-57sv4\") pod \"multus-additional-cni-plugins-mk2p2\" (UID: \"f6eefe63-30b0-46e6-af7b-3909a3849128\") " pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.914553 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.926345 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.937591 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.954217 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.963640 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.972500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.972533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.972542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.972557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.972566 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:06Z","lastTransitionTime":"2026-01-21T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.974359 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.979379 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.987502 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8cc
f4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:06Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:06 crc kubenswrapper[4707]: W0121 15:02:06.988487 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ec1d80_7cd8_4686_a4b1_9f7b49955e2f.slice/crio-d5dcdc59b5a61cfe884ed58992adb25ae99f63b2c2dd3cf7d660e3832e98d406 WatchSource:0}: Error finding container d5dcdc59b5a61cfe884ed58992adb25ae99f63b2c2dd3cf7d660e3832e98d406: Status 404 returned error can't find the container with id d5dcdc59b5a61cfe884ed58992adb25ae99f63b2c2dd3cf7d660e3832e98d406 Jan 21 15:02:06 crc kubenswrapper[4707]: I0121 15:02:06.989230 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.012069 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.031386 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.050244 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.072469 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-brcdw"] Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.073247 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.076671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.076710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.076720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.076737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.076748 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.078330 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.078560 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.079444 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.079624 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.079748 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.079879 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.080413 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.090244 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.113161 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.128064 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.146573 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 13:52:46.506621623 +0000 UTC Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 
15:02:07.154176 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.171059 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-netns\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-ovn\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-config\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovn-node-metrics-cert\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-ovn-kubernetes\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.174961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-script-lib\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-var-lib-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-systemd-units\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-slash\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-systemd\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: 
I0121 15:02:07.175300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-etc-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-kubelet\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-log-socket\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-bin\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-node-log\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-netd\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-env-overrides\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.175456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw78q\" (UniqueName: \"kubernetes.io/projected/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-kube-api-access-kw78q\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.179594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.179643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: 
I0121 15:02:07.179654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.179676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.179687 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.181899 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.181954 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:07 crc kubenswrapper[4707]: E0121 15:02:07.182035 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.181913 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:07 crc kubenswrapper[4707]: E0121 15:02:07.182138 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:07 crc kubenswrapper[4707]: E0121 15:02:07.182229 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.191780 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-21 14:57:06 +0000 UTC, rotation deadline is 2026-11-13 14:34:45.524164993 +0000 UTC Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.191861 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7103h32m38.33230658s for next certificate rotation Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.191836 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.207572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.220239 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.230986 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.240708 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.251245 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.261282 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.270823 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-slash\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-systemd\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-etc-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-kubelet\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-log-socket\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-bin\") 
pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-slash\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-node-log\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-kubelet\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-netd\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-systemd\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-log-socket\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-etc-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-netd\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-bin\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276223 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-env-overrides\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-node-log\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw78q\" (UniqueName: \"kubernetes.io/projected/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-kube-api-access-kw78q\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-netns\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-ovn\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-config\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovn-node-metrics-cert\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-ovn\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-ovn-kubernetes\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-ovn-kubernetes\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-script-lib\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-var-lib-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-env-overrides\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-systemd-units\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-systemd-units\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-var-lib-openvswitch\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.276917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-netns\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.277289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-config\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.277326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-script-lib\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.279965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovn-node-metrics-cert\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.281303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.281329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.281338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.281351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.281360 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.283032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xjrsl" event={"ID":"07107e90-ae92-467e-8436-2c24c21b5554","Type":"ContainerStarted","Data":"03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.283064 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xjrsl" event={"ID":"07107e90-ae92-467e-8436-2c24c21b5554","Type":"ContainerStarted","Data":"dc564f7fea161e2388ef3a80241804dc0e0054f8bf79593cd97a4bb68df2a632"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.284602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jspjl" event={"ID":"fe6f2caf-2457-4027-a1df-0dfa2c670a65","Type":"ContainerStarted","Data":"92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.284628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jspjl" event={"ID":"fe6f2caf-2457-4027-a1df-0dfa2c670a65","Type":"ContainerStarted","Data":"a2a8ef683b2635c733ec9c60c9498ea308e28244325856572a3a9b3eaaf6cc36"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.286189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.286230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.286240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"d5dcdc59b5a61cfe884ed58992adb25ae99f63b2c2dd3cf7d660e3832e98d406"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.287627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerStarted","Data":"d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.287651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerStarted","Data":"ced37d3566a6d40b2bc9b13b073090193256bc4470f46c66810d84c38c1ae664"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.288491 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.289409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerStarted","Data":"cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.289450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerStarted","Data":"2d89c8bb3590b051abde03550f9dcb5a10c32c1e039ea10f2e2dfbfff4f5287f"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.290168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw78q\" (UniqueName: \"kubernetes.io/projected/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-kube-api-access-kw78q\") pod \"ovnkube-node-brcdw\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.300117 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.319045 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.359061 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.384150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.384193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.384215 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.384230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.384240 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.386447 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:07 crc kubenswrapper[4707]: W0121 15:02:07.397925 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3aa8ac_04fb_4f52_8e61_ddb7676bca30.slice/crio-30f1e7d7b795e0dd15c982ee504cd337d62f29782263b1e84dfdadfdca67f83e WatchSource:0}: Error finding container 30f1e7d7b795e0dd15c982ee504cd337d62f29782263b1e84dfdadfdca67f83e: Status 404 returned error can't find the container with id 30f1e7d7b795e0dd15c982ee504cd337d62f29782263b1e84dfdadfdca67f83e Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.403560 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.440958 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.477067 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.486688 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.486727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.486737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.486754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.486764 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.519006 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.558583 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.588685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.588723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.588733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.588749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.588759 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.598986 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.640354 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.681159 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.691379 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.691411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.691420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.691433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.691442 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.718760 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.759301 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.793105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.793139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.793149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.793162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.793171 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.803390 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.838940 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.880120 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.895605 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.895638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.895647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.895660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.895670 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.919525 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.960309 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.996436 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.997904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.997939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.997947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.997961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:07 crc kubenswrapper[4707]: I0121 15:02:07.997970 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:07Z","lastTransitionTime":"2026-01-21T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.041932 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.078689 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.100343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.100389 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.100398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.100415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.100426 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.119540 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.147412 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 13:47:47.667526621 +0000 UTC Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.159024 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.199412 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.203215 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.203273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.203284 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.203304 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.203330 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.237507 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.278666 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.293790 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" exitCode=0 Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.293873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.293914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"30f1e7d7b795e0dd15c982ee504cd337d62f29782263b1e84dfdadfdca67f83e"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.295429 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6eefe63-30b0-46e6-af7b-3909a3849128" containerID="d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef" exitCode=0 Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.295479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerDied","Data":"d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.305551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.305997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.306092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.306177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.306250 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.318309 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.360713 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.398271 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.409122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.409170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.409185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.409204 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.409216 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.438137 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.481682 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.512267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.512305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.512315 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.512333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.512343 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.517195 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.561982 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.597630 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.613850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.613898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.613908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.613926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.613937 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.640090 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.679109 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.715992 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.716039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.716050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.716068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.716083 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.719773 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.759885 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.799279 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.817980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.818012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.818022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.818039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc 
kubenswrapper[4707]: I0121 15:02:08.818049 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.838289 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.878570 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.892122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.892254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.892292 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:02:16.892264978 +0000 UTC m=+34.073781200 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.892334 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.892386 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:16.892370429 +0000 UTC m=+34.073886650 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.920579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.920612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.920622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.920634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.920644 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:08Z","lastTransitionTime":"2026-01-21T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.921412 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:08Z 
is after 2025-08-24T17:21:41Z" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.993643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.994007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:08 crc kubenswrapper[4707]: I0121 15:02:08.994159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.993888 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994117 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994461 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994520 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994291 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994559 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994582 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994531 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:16.994513591 +0000 UTC m=+34.176029813 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994643 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:16.994630222 +0000 UTC m=+34.176146444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:08 crc kubenswrapper[4707]: E0121 15:02:08.994656 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:16.99464988 +0000 UTC m=+34.176166102 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.022259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.022307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.022321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.022341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.022352 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.124830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.125094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.125104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.125122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.125136 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.147960 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 00:23:19.51534809 +0000 UTC Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.182253 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.182348 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.182394 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:09 crc kubenswrapper[4707]: E0121 15:02:09.182392 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:09 crc kubenswrapper[4707]: E0121 15:02:09.182477 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:09 crc kubenswrapper[4707]: E0121 15:02:09.182569 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.227099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.227137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.227147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.227161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.227172 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.304964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.305019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.305031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.305042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.305051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.305059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.306676 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6eefe63-30b0-46e6-af7b-3909a3849128" containerID="b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d" exitCode=0 Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.306731 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerDied","Data":"b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.317489 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.329008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.329055 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.329070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.329088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.329100 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.332997 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.349597 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.369536 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.379659 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.397573 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.409029 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.420181 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.430232 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.431657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.431689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.431699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.431717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.431728 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.440606 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.451059 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.461102 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.470277 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.486528 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6f
ff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:09Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.534650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.534692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.534702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.534717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.534728 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.637291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.637338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.637348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.637365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.637376 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.739588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.739627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.739636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.739652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.739662 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.841663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.841703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.841712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.841737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.841747 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.943820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.943862 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.943878 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.943894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:09 crc kubenswrapper[4707]: I0121 15:02:09.943904 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:09Z","lastTransitionTime":"2026-01-21T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.046183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.046222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.046232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.046247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.046271 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.148139 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 00:22:37.299009184 +0000 UTC Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.149613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.149652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.149661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.149676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.149685 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.252354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.252392 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.252400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.252414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.252426 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.312858 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6eefe63-30b0-46e6-af7b-3909a3849128" containerID="be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b" exitCode=0 Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.312910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerDied","Data":"be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.328215 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd
8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.341769 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.351789 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.354621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.354655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.354666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.354680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.354688 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.362852 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.371142 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.383382 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.395330 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.405185 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.417770 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.428507 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.438657 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.449988 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.456541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.456567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.456576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.456593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.456603 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.459527 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.473477 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:10Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.559909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.559958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.559969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.559987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.559997 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.662353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.662394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.662404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.662420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.662434 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.764762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.764825 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.764837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.764858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.764871 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.867202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.867235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.867244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.867259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.867267 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.969923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.969963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.969972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.969988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:10 crc kubenswrapper[4707]: I0121 15:02:10.969997 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:10Z","lastTransitionTime":"2026-01-21T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.072481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.072519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.072530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.072546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.072555 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.149261 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 08:45:03.647553121 +0000 UTC Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.175263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.175322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.175331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.175348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.175359 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.182501 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.182540 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:11 crc kubenswrapper[4707]: E0121 15:02:11.182593 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.182503 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:11 crc kubenswrapper[4707]: E0121 15:02:11.182694 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:11 crc kubenswrapper[4707]: E0121 15:02:11.182855 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.277485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.277524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.277532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.277545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.277553 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.318193 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6eefe63-30b0-46e6-af7b-3909a3849128" containerID="024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e" exitCode=0 Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.318260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerDied","Data":"024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.321531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.329553 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.342171 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.352516 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.367398 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6f
ff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.376552 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.379644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.379678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.379688 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.379701 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.379710 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.388734 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.397497 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.408475 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.417937 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.434480 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.445676 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.456471 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.467731 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.481801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.482068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.482078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.482093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.482103 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.484578 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:
02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:11Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.584659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.584715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.584728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.584746 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.584757 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.686498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.686531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.686542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.686557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.686567 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.788789 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.788892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.788905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.788933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.788944 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.891475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.891505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.891515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.891528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.891539 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.993993 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.994026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.994037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.994053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:11 crc kubenswrapper[4707]: I0121 15:02:11.994063 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:11Z","lastTransitionTime":"2026-01-21T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.096723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.096760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.096770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.096788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.096797 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.149453 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 06:14:42.78690642 +0000 UTC Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.199583 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.199625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.199635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.199650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.199660 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.301635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.301671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.301680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.301696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.301707 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.327461 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6eefe63-30b0-46e6-af7b-3909a3849128" containerID="87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b" exitCode=0 Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.327514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerDied","Data":"87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.338951 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.348718 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.361972 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.372876 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.382486 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.391002 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.401599 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.404286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.404341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.404351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.404367 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.404377 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.412482 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.422214 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.433457 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.446870 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.457927 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.471923 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.486409 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6f
ff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:12Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.506954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.507002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.507014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.507032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.507045 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.609608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.609646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.609654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.609670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.609680 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.711633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.711670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.711679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.711694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.711704 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.813759 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.813796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.813822 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.813835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.813844 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.915617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.915917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.915928 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.915941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:12 crc kubenswrapper[4707]: I0121 15:02:12.915951 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:12Z","lastTransitionTime":"2026-01-21T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.018250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.018280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.018288 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.018302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.018311 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.050724 4707 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.120977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.121028 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.121038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.121054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.121064 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.149891 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 22:03:57.386256547 +0000 UTC Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.182267 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.182333 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.182377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:13 crc kubenswrapper[4707]: E0121 15:02:13.182510 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:13 crc kubenswrapper[4707]: E0121 15:02:13.182794 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:13 crc kubenswrapper[4707]: E0121 15:02:13.182865 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.193566 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.201158 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.210354 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.219390 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.223009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.223063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.223077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.223100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.223116 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.233317 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.244921 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.254731 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.268589 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a
bc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.281933 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.293555 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.310155 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z 
is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.322576 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.325143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.325172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.325182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.325198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.325211 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.336551 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.339318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.339696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.339721 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.339731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.343180 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6eefe63-30b0-46e6-af7b-3909a3849128" containerID="233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e" exitCode=0 Jan 21 
15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.343221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerDied","Data":"233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.350420 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.363963 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.364292 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.365729 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.376969 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.386382 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.396739 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.408894 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.418952 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.427694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.427733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.427743 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.427764 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.427774 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.431004 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.442158 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.455162 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.467762 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.479105 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.488315 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.503403 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\
\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.513269 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.522671 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.530089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.530129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.530140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.530156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.530166 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.532217 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.541038 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.554640 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.563798 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.574618 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.584643 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.593467 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.603039 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.611995 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.621637 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.630032 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.632488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.632515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.632525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.632540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.632550 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.639848 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.650432 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\"
:\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:13Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.734178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.734214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.734223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.734239 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.734251 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.836880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.837204 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.837214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.837230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.837241 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.939174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.939214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.939224 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.939241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:13 crc kubenswrapper[4707]: I0121 15:02:13.939252 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:13Z","lastTransitionTime":"2026-01-21T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.019109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.019404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.019510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.019578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.019632 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: E0121 15:02:14.031419 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.035533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.035582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.035591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.035610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.035624 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: E0121 15:02:14.047281 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.050834 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.050965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.051032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.051109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.051166 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: E0121 15:02:14.062186 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.065144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.065252 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.065320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.065402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.065472 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: E0121 15:02:14.080155 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.083521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.083562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.083572 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.083586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.083596 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: E0121 15:02:14.093298 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: E0121 15:02:14.093463 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.094907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.094941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.094949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.094965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.094974 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.150941 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 01:16:02.063428713 +0000 UTC Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.196752 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.196800 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.196857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.196873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.196881 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.299208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.299251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.299259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.299276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.299287 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.348667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" event={"ID":"f6eefe63-30b0-46e6-af7b-3909a3849128","Type":"ContainerStarted","Data":"122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.358773 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.368845 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.378978 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.388561 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.395680 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.401612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.401642 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.401653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.401667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.401676 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.405310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.418195 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.440591 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.453654 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.471180 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.498327 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.504362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.504407 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.504417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.504433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.504445 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.530651 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.549466 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.564512 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ov
nkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.607034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.607095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 
15:02:14.607107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.607122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.607132 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.712218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.712254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.712262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.712278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.712287 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.814432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.814469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.814479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.814494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.814507 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.916496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.916528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.916539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.916552 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:14 crc kubenswrapper[4707]: I0121 15:02:14.916561 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:14Z","lastTransitionTime":"2026-01-21T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.019082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.019124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.019133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.019149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.019160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.122112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.122159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.122173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.122192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.122202 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.151758 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:48:39.943790101 +0000 UTC Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.182664 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.182718 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.182729 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:15 crc kubenswrapper[4707]: E0121 15:02:15.182831 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:15 crc kubenswrapper[4707]: E0121 15:02:15.183208 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:15 crc kubenswrapper[4707]: E0121 15:02:15.183271 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.224393 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.224430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.224440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.224470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.224482 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.326992 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.327236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.327329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.327416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.327480 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.353103 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/0.log" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.355103 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6" exitCode=1 Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.355139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.355918 4707 scope.go:117] "RemoveContainer" containerID="37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.366641 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.376205 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.385921 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.402793 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:02:14.956669 6068 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0121 15:02:14.956715 6068 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.956864 6068 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957021 6068 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957073 6068 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957279 6068 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957569 6068 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:02:14.957582 6068 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:02:14.957617 6068 factory.go:656] Stopping watch factory\\\\nI0121 15:02:14.957630 6068 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:02:14.957637 6068 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.412754 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.424034 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.430064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.430190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.430275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.430338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.430421 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.435534 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.445656 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.456167 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.465078 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.481471 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.492688 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.503225 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.515151 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.532570 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.532610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.532620 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.532635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.532647 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.634479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.634514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.634523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.634534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.634544 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.736357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.736405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.736415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.736428 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.736437 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.838597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.838648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.838657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.838671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.838681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.941052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.941088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.941097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.941109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:15 crc kubenswrapper[4707]: I0121 15:02:15.941118 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:15Z","lastTransitionTime":"2026-01-21T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.043231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.043272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.043281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.043297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.043305 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.145301 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.145350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.145360 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.145378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.145401 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.152500 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:43:07.438785909 +0000 UTC Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.248231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.248266 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.248275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.248291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.248300 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.352036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.352093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.352104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.352121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.352134 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.360417 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/1.log" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.361138 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/0.log" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.363654 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed" exitCode=1 Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.363699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.363750 4707 scope.go:117] "RemoveContainer" containerID="37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.364266 4707 scope.go:117] "RemoveContainer" containerID="05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed" Jan 21 15:02:16 crc kubenswrapper[4707]: E0121 15:02:16.364426 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.376900 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.389204 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.397461 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.414255 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b47cc1cbe8dabe132dae4bb1ebc42a00cd6c149c28d155cfdc9c9bb74df9e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"message\\\":\\\"Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:02:14.956669 6068 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0121 15:02:14.956715 6068 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.956864 6068 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957021 6068 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957073 6068 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957279 6068 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:02:14.957569 6068 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:02:14.957582 6068 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:02:14.957617 6068 factory.go:656] Stopping watch factory\\\\nI0121 15:02:14.957630 6068 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:02:14.957637 6068 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.424638 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.434328 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.444081 4707 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.454052 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.454972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.455130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.455227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.455300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.455361 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.463909 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.472342 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.482937 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.492373 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.502778 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.514264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.557534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.557578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.557589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.557606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.557617 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.660025 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.660068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.660078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.660093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.660104 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.762607 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.762653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.762664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.762683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.762693 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.865014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.865292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.865303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.865321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.865332 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.966339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.966555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:16 crc kubenswrapper[4707]: E0121 15:02:16.966635 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:16 crc kubenswrapper[4707]: E0121 15:02:16.966689 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:32.966675721 +0000 UTC m=+50.148191943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:16 crc kubenswrapper[4707]: E0121 15:02:16.966752 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:02:32.966745373 +0000 UTC m=+50.148261586 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.967248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.967272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.967281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.967294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:16 crc kubenswrapper[4707]: I0121 15:02:16.967303 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:16Z","lastTransitionTime":"2026-01-21T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.067356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.067412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.067442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067573 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067577 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067617 4707 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067622 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067631 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067707 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:33.067686751 +0000 UTC m=+50.249202983 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067591 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067767 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067780 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:33.067757175 +0000 UTC m=+50.249273397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.067836 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:33.067801038 +0000 UTC m=+50.249317260 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.069183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.069213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.069222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.069237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.069246 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.152905 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 00:18:12.784068937 +0000 UTC Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.172058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.172100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.172108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.172124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.172142 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.182340 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.182345 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.182429 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.182571 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.182648 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.182730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.274612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.274653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.274663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.274680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.274692 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.367956 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/1.log" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.371038 4707 scope.go:117] "RemoveContainer" containerID="05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed" Jan 21 15:02:17 crc kubenswrapper[4707]: E0121 15:02:17.371176 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.377012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.377050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.377060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.377078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.377090 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.383094 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.393176 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.401391 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.410749 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.420607 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.429857 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.440334 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.451259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.461852 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.475684 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.479514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.479558 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.479569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.479588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.479600 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.486679 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.501854 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.513854 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.525342 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:17Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.582386 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.582442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.582451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.582468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.582479 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.684619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.684662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.684675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.684692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.684701 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.787035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.787073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.787082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.787098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.787109 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.889280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.889326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.889335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.889350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.889361 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.991982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.992029 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.992038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.992055 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:17 crc kubenswrapper[4707]: I0121 15:02:17.992064 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:17Z","lastTransitionTime":"2026-01-21T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.094477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.094521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.094531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.094547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.094560 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.153092 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 06:07:14.72222518 +0000 UTC Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.196190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.196225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.196236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.196247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.196257 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.298316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.298356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.298366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.298381 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.298391 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.400403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.400446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.400455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.400467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.400477 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.502541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.502591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.502603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.502622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.502634 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.605064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.605104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.605116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.605131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.605141 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.707450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.707488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.707497 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.707512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.707524 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.809803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.809868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.809880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.809901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.809913 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.908330 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt"] Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.908741 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.910171 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.911603 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.911855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.911873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.911882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.911893 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.911902 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:18Z","lastTransitionTime":"2026-01-21T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.923261 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab
57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.933991 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.946467 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.955001 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.964932 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.973420 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.983978 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.986177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9259afc-a0fa-4676-b6a1-0607479688f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.986219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9259afc-a0fa-4676-b6a1-0607479688f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.986346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9259afc-a0fa-4676-b6a1-0607479688f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:18 crc kubenswrapper[4707]: I0121 15:02:18.986461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctvs9\" (UniqueName: \"kubernetes.io/projected/e9259afc-a0fa-4676-b6a1-0607479688f0-kube-api-access-ctvs9\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.002414 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:18Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.014374 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.014432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.014458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.014476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.014492 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.014775 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.025921 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.040192 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.052859 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.062047 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.074239 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.084534 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.087939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9259afc-a0fa-4676-b6a1-0607479688f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.087992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctvs9\" (UniqueName: \"kubernetes.io/projected/e9259afc-a0fa-4676-b6a1-0607479688f0-kube-api-access-ctvs9\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.088014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9259afc-a0fa-4676-b6a1-0607479688f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.088101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9259afc-a0fa-4676-b6a1-0607479688f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.088736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9259afc-a0fa-4676-b6a1-0607479688f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 
15:02:19.088977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9259afc-a0fa-4676-b6a1-0607479688f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.092894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9259afc-a0fa-4676-b6a1-0607479688f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.102117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctvs9\" (UniqueName: \"kubernetes.io/projected/e9259afc-a0fa-4676-b6a1-0607479688f0-kube-api-access-ctvs9\") pod \"ovnkube-control-plane-749d76644c-gxcpt\" (UID: \"e9259afc-a0fa-4676-b6a1-0607479688f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.117373 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.117416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.117427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.117461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.117471 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.153701 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 06:59:40.891013776 +0000 UTC Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.182508 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.182556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:19 crc kubenswrapper[4707]: E0121 15:02:19.182674 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.182721 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:19 crc kubenswrapper[4707]: E0121 15:02:19.182858 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:19 crc kubenswrapper[4707]: E0121 15:02:19.182945 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.186110 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.202039 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab
57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.213700 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.219021 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.220278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.220327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.220337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.220354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.220366 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.223515 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: W0121 15:02:19.230711 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9259afc_a0fa_4676_b6a1_0607479688f0.slice/crio-237a73e89b478df2d959e61b3bb5b04f1cb152bdbccadea57b51237f0480e050 WatchSource:0}: Error finding container 237a73e89b478df2d959e61b3bb5b04f1cb152bdbccadea57b51237f0480e050: Status 404 returned error can't find the container with id 237a73e89b478df2d959e61b3bb5b04f1cb152bdbccadea57b51237f0480e050 Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.235055 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.248629 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.257436 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.268846 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.279980 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.288954 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.298257 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.308502 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.319440 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.322612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.322640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc 
kubenswrapper[4707]: I0121 15:02:19.322649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.322663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.322671 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.327786 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.338043 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.348660 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:19Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.377412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" event={"ID":"e9259afc-a0fa-4676-b6a1-0607479688f0","Type":"ContainerStarted","Data":"10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.377494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" event={"ID":"e9259afc-a0fa-4676-b6a1-0607479688f0","Type":"ContainerStarted","Data":"237a73e89b478df2d959e61b3bb5b04f1cb152bdbccadea57b51237f0480e050"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.425109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.425151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.425161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.425178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.425189 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.527659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.527705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.527718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.527736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.527749 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.630340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.630373 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.630382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.630396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.630406 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.732661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.732703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.732712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.732730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.732740 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.835497 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.835533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.835542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.835560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.835571 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.940847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.940895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.940907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.940926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.940937 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:19Z","lastTransitionTime":"2026-01-21T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.999273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-62mww"] Jan 21 15:02:19 crc kubenswrapper[4707]: I0121 15:02:19.999708 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:19 crc kubenswrapper[4707]: E0121 15:02:19.999768 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.007992 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.021237 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.031008 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.040420 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.043091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.043132 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.043142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.043162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.043172 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.049144 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.059007 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.069753 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.077921 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.087310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.097017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6kw\" (UniqueName: \"kubernetes.io/projected/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-kube-api-access-ft6kw\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.097016 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.097079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.108042 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.117829 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.130599 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.139150 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.146256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.146297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.146307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.146327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.146337 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.151985 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.153882 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 04:21:35.860686686 +0000 UTC Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.161695 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.198516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.198602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6kw\" (UniqueName: \"kubernetes.io/projected/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-kube-api-access-ft6kw\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:20 crc kubenswrapper[4707]: E0121 15:02:20.198724 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:20 crc kubenswrapper[4707]: E0121 15:02:20.198848 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:20.698824629 +0000 UTC m=+37.880340851 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.216387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6kw\" (UniqueName: \"kubernetes.io/projected/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-kube-api-access-ft6kw\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.252930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.252974 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.252986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.253006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.253019 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.355859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.356271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.356284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.356307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.356321 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.383272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" event={"ID":"e9259afc-a0fa-4676-b6a1-0607479688f0","Type":"ContainerStarted","Data":"be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.394231 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.410123 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.421069 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.453226 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.458597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.458629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.458638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.458655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.458665 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.473405 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.483955 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.494940 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.504180 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.515278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.524895 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.535827 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.546062 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.557851 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.560645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.560682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.560691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.560708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.560717 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.568094 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.579071 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.589156 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.663148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.663183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.663192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.663207 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.663217 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.704996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:20 crc kubenswrapper[4707]: E0121 15:02:20.705208 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:20 crc kubenswrapper[4707]: E0121 15:02:20.705308 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:21.705288667 +0000 UTC m=+38.886804889 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.765179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.765220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.765229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.765245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.765257 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.867871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.868108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.868182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.868254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.868336 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.971114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.971168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.971177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.971193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:20 crc kubenswrapper[4707]: I0121 15:02:20.971203 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:20Z","lastTransitionTime":"2026-01-21T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.073612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.073665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.073675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.073691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.073702 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.154422 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:29:50.285489763 +0000 UTC Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.176183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.176223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.176233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.176250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.176260 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.181754 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.181828 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:21 crc kubenswrapper[4707]: E0121 15:02:21.181951 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.181959 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:21 crc kubenswrapper[4707]: E0121 15:02:21.182092 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:21 crc kubenswrapper[4707]: E0121 15:02:21.182138 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.278463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.278513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.278524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.278541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.278551 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.380229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.380269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.380279 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.380296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.380305 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.482787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.482848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.482859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.482873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.482883 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.585473 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.585527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.585539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.585555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.585567 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.687875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.687910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.687919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.687933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.687942 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.716626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:21 crc kubenswrapper[4707]: E0121 15:02:21.716750 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:21 crc kubenswrapper[4707]: E0121 15:02:21.716799 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:23.716787308 +0000 UTC m=+40.898303530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.790655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.790694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.790702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.790718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.790728 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.892893 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.892931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.892941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.892963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.892972 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.995452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.995494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.995515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.995531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:21 crc kubenswrapper[4707]: I0121 15:02:21.995542 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:21Z","lastTransitionTime":"2026-01-21T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.098019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.098054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.098064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.098078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.098088 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.154855 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 09:23:59.27455296 +0000 UTC Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.182551 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:22 crc kubenswrapper[4707]: E0121 15:02:22.183082 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.199749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.199779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.199790 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.199803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.199838 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.301596 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.301633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.301647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.301663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.301672 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.403647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.403685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.403694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.403710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.403719 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.506247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.506292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.506301 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.506320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.506330 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.608052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.608096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.608107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.608124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.608135 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.711118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.711166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.711176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.711193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.711203 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.813564 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.813606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.813614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.813629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.813638 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.916153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.916206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.916218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.916237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:22 crc kubenswrapper[4707]: I0121 15:02:22.916248 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:22Z","lastTransitionTime":"2026-01-21T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.018824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.018869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.018879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.018897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.018908 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.121150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.121192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.121203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.121219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.121240 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.155581 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 04:18:32.201155325 +0000 UTC Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.182047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.182045 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:23 crc kubenswrapper[4707]: E0121 15:02:23.182353 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.182074 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:23 crc kubenswrapper[4707]: E0121 15:02:23.182425 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:23 crc kubenswrapper[4707]: E0121 15:02:23.182508 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.194909 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-
cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.204881 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.215624 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.222869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.223030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.223104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.223180 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.223243 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.228701 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.239449 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.249744 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.260459 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.268591 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.283622 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/o
vn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default 
network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"
/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.292401 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.302113 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.310571 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.319491 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.324914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.324955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.324966 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.324986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.324995 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.327244 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.336455 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.344312 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.428144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.428192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.428202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.428219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.428229 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.530383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.530417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.530427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.530442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.530451 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.632655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.632704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.632714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.632736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.632746 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.735275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.735312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.735321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.735335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.735344 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.737885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:23 crc kubenswrapper[4707]: E0121 15:02:23.738053 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:23 crc kubenswrapper[4707]: E0121 15:02:23.738122 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:27.738105989 +0000 UTC m=+44.919622211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.837870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.837915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.837925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.837945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.837955 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.939915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.939950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.939958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.939975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:23 crc kubenswrapper[4707]: I0121 15:02:23.939984 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:23Z","lastTransitionTime":"2026-01-21T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.042146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.042206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.042216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.042235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.042245 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.144732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.144764 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.144774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.144789 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.144798 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.156336 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 02:09:37.882815587 +0000 UTC Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.181703 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.181839 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.246593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.246668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.246680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.246697 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.246706 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.348375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.348429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.348439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.348455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.348464 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.450211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.450249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.450263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.450280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.450289 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.479527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.479582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.479594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.479612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.479625 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.490744 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.494125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.494260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.494406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.494472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.494534 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.504659 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.508592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.508628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.508641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.508656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.508666 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.518468 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.521507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.521548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.521571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.521589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.521599 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.531632 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.534728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.534761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.534770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.534784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.534795 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.544135 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:24 crc kubenswrapper[4707]: E0121 15:02:24.544248 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.552110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.552139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.552149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.552161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.552171 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.654440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.654484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.654493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.654511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.654521 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.756921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.756972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.756984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.756999 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.757008 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.858622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.858649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.858657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.858673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.858682 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.960573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.960608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.960617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.960631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:24 crc kubenswrapper[4707]: I0121 15:02:24.960639 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:24Z","lastTransitionTime":"2026-01-21T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.062058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.062097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.062106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.062120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.062129 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.156438 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:24:40.290138431 +0000 UTC Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.163966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.164011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.164021 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.164037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.164046 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.182268 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.182300 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.182268 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:25 crc kubenswrapper[4707]: E0121 15:02:25.182383 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:25 crc kubenswrapper[4707]: E0121 15:02:25.182462 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:25 crc kubenswrapper[4707]: E0121 15:02:25.182518 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.265972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.266015 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.266025 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.266040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.266050 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.368256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.368294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.368302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.368321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.368331 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.470323 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.470358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.470369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.470385 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.470393 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.571750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.571786 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.571795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.571824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.571835 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.674206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.674236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.674244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.674259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.674266 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.776022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.776057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.776066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.776079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.776087 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.877985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.878050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.878060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.878079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.878091 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.980124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.980161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.980170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.980184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:25 crc kubenswrapper[4707]: I0121 15:02:25.980193 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:25Z","lastTransitionTime":"2026-01-21T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.082499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.082546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.082557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.082570 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.082579 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.156962 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 21:02:12.384699791 +0000 UTC Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.182518 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:26 crc kubenswrapper[4707]: E0121 15:02:26.182640 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.183959 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.183982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.183991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.184002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.184011 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.285892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.285924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.285932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.285946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.285955 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.388553 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.388603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.388616 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.388633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.388645 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.490295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.490333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.490341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.490356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.490365 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.592850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.592888 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.592896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.592912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.592920 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.695134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.695169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.695177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.695192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.695202 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.797593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.797636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.797644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.797659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.797670 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.899391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.899426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.899435 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.899450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:26 crc kubenswrapper[4707]: I0121 15:02:26.899459 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:26Z","lastTransitionTime":"2026-01-21T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.000983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.001040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.001049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.001065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.001075 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.103552 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.103594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.103603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.103630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.103640 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.158086 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 06:46:09.996918857 +0000 UTC Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.182249 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.182308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.182264 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:27 crc kubenswrapper[4707]: E0121 15:02:27.182404 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:27 crc kubenswrapper[4707]: E0121 15:02:27.182470 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:27 crc kubenswrapper[4707]: E0121 15:02:27.182549 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.205983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.206030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.206039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.206054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.206065 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.310954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.311002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.311012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.311026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.311036 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.412565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.412608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.412631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.412646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.412655 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.514462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.514503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.514511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.514526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.514536 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.616888 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.616933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.616942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.616959 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.616968 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.719424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.719478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.719490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.719507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.719516 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.778265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:27 crc kubenswrapper[4707]: E0121 15:02:27.778416 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:27 crc kubenswrapper[4707]: E0121 15:02:27.778511 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:35.77849152 +0000 UTC m=+52.960007742 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.821860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.821903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.821912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.821927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.821939 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.923448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.923508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.923518 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.923532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:27 crc kubenswrapper[4707]: I0121 15:02:27.923542 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:27Z","lastTransitionTime":"2026-01-21T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.025974 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.026018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.026027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.026044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.026054 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.128546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.128581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.128859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.128881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.128890 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.158357 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 14:57:57.159032606 +0000 UTC Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.181684 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:28 crc kubenswrapper[4707]: E0121 15:02:28.181825 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.231141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.231197 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.231208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.231244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.231257 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.333394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.333448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.333457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.333471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.333482 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.435680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.435714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.435750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.435765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.435773 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.537833 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.537867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.537876 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.537891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.537901 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.639801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.639854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.639864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.639878 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.639888 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.741796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.741861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.741872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.741887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.741897 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.843919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.843963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.843973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.843991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.844005 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.947098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.947138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.947147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.947160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:28 crc kubenswrapper[4707]: I0121 15:02:28.947171 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:28Z","lastTransitionTime":"2026-01-21T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.050000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.050088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.050099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.050122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.050132 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.152368 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.152405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.152433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.152449 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.152458 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.158574 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 08:16:25.515591312 +0000 UTC Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.181918 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.181940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:29 crc kubenswrapper[4707]: E0121 15:02:29.182033 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.182055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:29 crc kubenswrapper[4707]: E0121 15:02:29.182157 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:29 crc kubenswrapper[4707]: E0121 15:02:29.182232 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.254707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.254753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.254763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.254782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.254792 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.259691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.267123 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.272029 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.282383 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.291405 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.306495 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.315005 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.322628 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.331445 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.339305 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.356762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.356797 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.356820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.356835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.356844 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.360325 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.370274 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.380615 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.391485 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.399571 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 
15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.410742 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.420910 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.430614 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.459441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.459499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.459508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.459523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.459533 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.561615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.561676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.561685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.561700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.561710 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.664182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.664473 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.664545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.664612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.664681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.767544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.767846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.767945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.768026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.768090 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.870749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.870788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.870798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.870833 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.870846 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.973514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.973554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.973565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.973583 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:29 crc kubenswrapper[4707]: I0121 15:02:29.973593 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:29Z","lastTransitionTime":"2026-01-21T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.075757 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.075799 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.075822 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.075841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.075851 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.159559 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 13:24:16.273373644 +0000 UTC Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.178168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.178295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.178380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.178442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.178507 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.182467 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:30 crc kubenswrapper[4707]: E0121 15:02:30.182594 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.280923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.280978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.280987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.281005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.281016 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.382964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.383002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.383011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.383026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.383035 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.485282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.485340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.485350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.485367 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.485376 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.587893 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.587929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.587938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.587953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.587962 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.690328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.690369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.690378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.690394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.690404 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.792549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.792603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.792612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.792627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.792637 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.895427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.895503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.895523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.895543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.895601 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.998313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.998370 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.998380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.998396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:30 crc kubenswrapper[4707]: I0121 15:02:30.998408 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:30Z","lastTransitionTime":"2026-01-21T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.101047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.101099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.101110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.101128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.101140 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.159873 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:34:51.967964114 +0000 UTC Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.182188 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.182188 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.182208 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:31 crc kubenswrapper[4707]: E0121 15:02:31.182326 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:31 crc kubenswrapper[4707]: E0121 15:02:31.182569 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:31 crc kubenswrapper[4707]: E0121 15:02:31.182666 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.203654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.203707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.203718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.203733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.203742 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.306220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.306271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.306282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.306300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.306312 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.408963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.408998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.409027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.409047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.409058 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.511661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.511716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.511726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.511742 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.511752 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.616734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.616782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.616794 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.616829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.616842 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.718924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.718991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.719004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.719020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.719049 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.821228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.821255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.821263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.821276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.821285 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.927063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.927115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.927125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.927142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:31 crc kubenswrapper[4707]: I0121 15:02:31.927151 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:31Z","lastTransitionTime":"2026-01-21T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.029507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.029729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.029792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.029914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.029978 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.132598 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.132694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.132706 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.132767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.132779 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.160064 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:25:32.492318367 +0000 UTC Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.182336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:32 crc kubenswrapper[4707]: E0121 15:02:32.182469 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.183110 4707 scope.go:117] "RemoveContainer" containerID="05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.235462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.235499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.235509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.235526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.235535 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.337603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.337936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.337946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.337965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.337977 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.413994 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/1.log" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.416281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.416704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.428278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.439921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.439971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.439984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.440004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.440016 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.443096 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.462803 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.473662 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.494443 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.504785 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.516079 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.529510 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.540380 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.542158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.542196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.542208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.542221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.542231 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.551438 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.565106 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.574316 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 
15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.583785 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.593447 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.601768 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.617488 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/o
vn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default 
network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.626356 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.645087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.645136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.645146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.645168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.645178 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.747409 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.747452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.747462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.747481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.747493 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.849682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.849739 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.849754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.849772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.849783 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.952090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.952140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.952150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.952167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:32 crc kubenswrapper[4707]: I0121 15:02:32.952179 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:32Z","lastTransitionTime":"2026-01-21T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.024878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.024997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.025105 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.025126 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:03:05.025094591 +0000 UTC m=+82.206610814 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.025168 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:03:05.025159504 +0000 UTC m=+82.206675726 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.054801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.054868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.054878 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.054896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.054907 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.126060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.126117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.126161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126262 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126286 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126309 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 
15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126320 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126360 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:03:05.126342401 +0000 UTC m=+82.307858624 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126380 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:03:05.126372217 +0000 UTC m=+82.307888440 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126398 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126450 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126470 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.126549 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:03:05.126529125 +0000 UTC m=+82.308045357 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.156936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.156980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.156993 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.157009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.157020 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.160500 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 17:43:34.340843949 +0000 UTC Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.181645 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.181720 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.181665 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.181804 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.181882 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.182037 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.193443 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.203937 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.213837 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.238002 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.249588 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.259262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.259309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.259322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.259338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.259347 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.261644 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.271652 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.279638 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.294133 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/o
vn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default 
network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.303620 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.313053 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.321786 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.329561 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.340093 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.348532 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.358198 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.360896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.360955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.360965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.360980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.360991 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.366598 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.420335 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/2.log" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.421016 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/1.log" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.422922 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23" exitCode=1 Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.422965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.423040 4707 scope.go:117] "RemoveContainer" containerID="05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.423597 4707 scope.go:117] "RemoveContainer" containerID="5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23" Jan 21 15:02:33 crc kubenswrapper[4707]: E0121 15:02:33.423801 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.434394 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.446830 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.458941 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.462735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.462832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.462843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.462860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.462870 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.470035 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.481185 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.490774 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.499393 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.509653 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.525530 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f9db2320bdf021faa25866aea153c4c885b2ab57fb47a51d404acf6e42e5ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:15Z\\\",\\\"message\\\":\\\"service openshift-apiserver/api cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:02:15.963469 6195 services_controller.go:452] Built service openshift-apiserver/api per-node LB for network=default: []services.LB{}\\\\nI0121 15:02:15.963479 6195 services_controller.go:453] Built service openshift-apiserver/api template LB for network=default: []services.LB{}\\\\nF0121 15:02:15.963477 6195 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to 
st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.536359 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.547685 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.557409 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.565326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.565372 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.565382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.565401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.565411 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.567607 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.578711 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.589294 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.600134 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.611319 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.667836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.667896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.667905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.667924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.667934 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.770141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.770188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.770197 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.770213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.770223 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.872356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.872397 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.872407 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.872426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.872434 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.974194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.974245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.974258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.974274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:33 crc kubenswrapper[4707]: I0121 15:02:33.974286 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:33Z","lastTransitionTime":"2026-01-21T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.076448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.076500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.076512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.076529 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.076540 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.161201 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:16:34.090083292 +0000 UTC Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.178560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.178593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.178602 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.178616 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.178625 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.181911 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.182048 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.281144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.281199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.281210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.281229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.281240 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.383431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.383476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.383484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.383498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.383508 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.427290 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/2.log" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.430621 4707 scope.go:117] "RemoveContainer" containerID="5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23" Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.430844 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.440675 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.452310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.460529 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.475316 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.484348 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.485894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.485936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.485946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.485962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.485973 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.494588 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.503844 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.512933 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.523321 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.543031 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.554447 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.564013 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.574693 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.584859 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.587918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.587951 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.587962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.587976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.587985 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.594963 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.606357 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.615278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 
15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.651954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.652012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.652041 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.652058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.652068 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.663329 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.666080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.666109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.666117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.666130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.666139 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.674978 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.678200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.678229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.678237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.678250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.678258 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.688041 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.690589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.690617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.690626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.690637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.690646 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.699840 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.702407 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.702438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.702447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.702475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.702484 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.711301 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:34 crc kubenswrapper[4707]: E0121 15:02:34.711426 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.712628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.712661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.712672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.712687 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.712696 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.814952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.814999 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.815007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.815023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.815032 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.917225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.917265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.917275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.917291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:34 crc kubenswrapper[4707]: I0121 15:02:34.917300 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:34Z","lastTransitionTime":"2026-01-21T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.019451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.019496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.019505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.019520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.019529 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.121832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.121881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.121891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.121908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.121918 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.162128 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 13:02:15.789193347 +0000 UTC Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.182511 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.182570 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.182527 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:35 crc kubenswrapper[4707]: E0121 15:02:35.182639 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:35 crc kubenswrapper[4707]: E0121 15:02:35.182707 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:35 crc kubenswrapper[4707]: E0121 15:02:35.182796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.224151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.224194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.224202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.224220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.224230 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.326388 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.326436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.326446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.326462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.326470 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.428369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.428410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.428421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.428437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.428447 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.530482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.530521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.530530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.530545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.530554 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.632048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.632083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.632092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.632105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.632113 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.733767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.733847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.733857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.733872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.733881 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.836011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.836053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.836062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.836076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.836089 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.854611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:35 crc kubenswrapper[4707]: E0121 15:02:35.854773 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:35 crc kubenswrapper[4707]: E0121 15:02:35.854890 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:02:51.854870986 +0000 UTC m=+69.036387198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.938601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.938679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.938689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.938705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:35 crc kubenswrapper[4707]: I0121 15:02:35.938716 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:35Z","lastTransitionTime":"2026-01-21T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.040573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.040639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.040651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.040675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.040686 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.143361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.143420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.143429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.143442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.143455 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.162745 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 07:01:46.904674158 +0000 UTC Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.181965 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:36 crc kubenswrapper[4707]: E0121 15:02:36.182078 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.246289 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.246325 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.246336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.246350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.246361 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.348232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.348267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.348277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.348291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.348304 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.449997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.450037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.450047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.450061 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.450071 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.552760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.552795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.552842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.552858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.552867 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.654905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.654935 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.654945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.654958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.654968 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.756865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.756897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.756906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.756920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.756929 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.859224 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.859272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.859283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.859306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.859316 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.961649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.961695 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.961707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.961724 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:36 crc kubenswrapper[4707]: I0121 15:02:36.961736 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:36Z","lastTransitionTime":"2026-01-21T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.064289 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.064335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.064346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.064360 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.064370 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.163175 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 22:11:19.924736411 +0000 UTC Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.167091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.167138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.167148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.167166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.167176 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.182344 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.182370 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:37 crc kubenswrapper[4707]: E0121 15:02:37.182453 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.182532 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:37 crc kubenswrapper[4707]: E0121 15:02:37.182537 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:37 crc kubenswrapper[4707]: E0121 15:02:37.182672 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.269557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.269589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.269599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.269612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.269621 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.372146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.372191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.372200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.372216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.372227 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.473905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.473947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.473956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.473972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.473982 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.575907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.575950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.575961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.575979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.575988 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.678326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.678368 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.678378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.678396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.678406 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.780429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.780477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.780488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.780506 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.780517 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.882590 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.882631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.882642 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.882656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.882666 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.984876 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.984923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.984932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.984949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:37 crc kubenswrapper[4707]: I0121 15:02:37.984959 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:37Z","lastTransitionTime":"2026-01-21T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.086545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.086592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.086603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.086622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.086633 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.163906 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 15:41:52.85446988 +0000 UTC Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.182465 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:38 crc kubenswrapper[4707]: E0121 15:02:38.182601 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.188894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.188927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.188937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.188950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.188959 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.291178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.291208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.291218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.291234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.291244 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.393866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.393906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.393917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.393933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.393946 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.495853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.495899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.495909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.495925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.495935 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.598090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.598131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.598141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.598157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.598166 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.700791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.700864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.700875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.700891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.700902 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.802910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.802946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.802954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.802970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.802979 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.905478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.905519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.905530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.905546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:38 crc kubenswrapper[4707]: I0121 15:02:38.905556 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:38Z","lastTransitionTime":"2026-01-21T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.007934 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.007970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.007980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.007994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.008004 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.112577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.112618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.112628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.112645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.112656 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.164059 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 18:49:21.116291594 +0000 UTC Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.182404 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.182485 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:39 crc kubenswrapper[4707]: E0121 15:02:39.182521 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:39 crc kubenswrapper[4707]: E0121 15:02:39.182624 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.182494 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:39 crc kubenswrapper[4707]: E0121 15:02:39.182724 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.215668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.215714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.215722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.215735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.215747 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.318276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.318309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.318318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.318331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.318341 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.420958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.421000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.421011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.421026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.421036 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.522990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.523035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.523047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.523063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.523073 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.625201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.625243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.625255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.625269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.625280 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.727581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.727627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.727636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.727654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.727665 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.829498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.829539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.829548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.829564 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.829575 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.931512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.931544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.931552 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.931566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:39 crc kubenswrapper[4707]: I0121 15:02:39.931576 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:39Z","lastTransitionTime":"2026-01-21T15:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.034239 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.034491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.034565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.034637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.034709 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.137284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.137344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.137354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.137369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.137379 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.164693 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 16:05:00.478934379 +0000 UTC Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.182068 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:40 crc kubenswrapper[4707]: E0121 15:02:40.182212 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.239132 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.239177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.239186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.239201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.239210 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.341623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.341931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.342024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.342093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.342158 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.444656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.444693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.444704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.444721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.444730 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.547415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.547676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.547757 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.547866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.547963 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.650317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.650355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.650366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.650382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.650392 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.752292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.752327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.752337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.752351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.752360 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.854298 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.854335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.854345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.854360 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.854370 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.956321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.956359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.956368 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.956386 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:40 crc kubenswrapper[4707]: I0121 15:02:40.956396 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:40Z","lastTransitionTime":"2026-01-21T15:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.058640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.058684 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.058694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.058709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.058723 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.161682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.161725 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.161735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.161751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.161761 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.164832 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 07:20:47.646575581 +0000 UTC Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.182200 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.182227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.182279 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:41 crc kubenswrapper[4707]: E0121 15:02:41.182336 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:41 crc kubenswrapper[4707]: E0121 15:02:41.182419 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:41 crc kubenswrapper[4707]: E0121 15:02:41.182471 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.263941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.263986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.263994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.264009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.264017 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.366491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.366553 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.366564 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.366580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.366590 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.468960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.469002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.469011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.469026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.469036 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.571707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.571755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.571764 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.571780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.571790 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.677877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.678604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.678716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.678791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.678873 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.781495 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.781551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.781563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.781579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.781590 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.883667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.883707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.883718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.883736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.883746 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.985742 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.985783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.985792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.986018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:41 crc kubenswrapper[4707]: I0121 15:02:41.986046 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:41Z","lastTransitionTime":"2026-01-21T15:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.088192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.088234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.088243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.088258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.088267 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.165628 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:55:06.838097269 +0000 UTC Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.182017 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:42 crc kubenswrapper[4707]: E0121 15:02:42.182157 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.190247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.190285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.190295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.190309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.190321 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.292154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.292199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.292208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.292322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.292338 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.394920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.394979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.394991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.395008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.395018 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.497171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.497646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.497656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.497673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.497684 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.600139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.600191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.600200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.600214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.600225 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.702541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.702591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.702603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.702621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.702630 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.804902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.804961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.804971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.804987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.804997 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.907112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.907145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.907154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.907168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:42 crc kubenswrapper[4707]: I0121 15:02:42.907180 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:42Z","lastTransitionTime":"2026-01-21T15:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.009692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.009744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.009754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.009774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.009785 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.112111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.112167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.112177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.112196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.112206 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.166062 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 17:22:53.30572311 +0000 UTC Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.181596 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.181601 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:43 crc kubenswrapper[4707]: E0121 15:02:43.181715 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.181619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:43 crc kubenswrapper[4707]: E0121 15:02:43.181853 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:43 crc kubenswrapper[4707]: E0121 15:02:43.181984 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.193597 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.204649 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.215109 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.215143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.215154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.215168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.215178 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.215565 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.226267 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.240340 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.250021 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.258293 
4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.269071 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.280677 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.291412 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.301121 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.312415 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.316899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.316937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.316957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.316974 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.316983 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.322289 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.332829 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.342450 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.350318 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.363393 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/o
vn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:43Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.418936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.418985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.418995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.419010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.419020 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.521603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.521645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.521653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.521669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.521681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.624424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.624474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.624485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.624503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.624514 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.726491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.726538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.726546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.726562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.726575 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.829245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.829291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.829300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.829318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.829328 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.931705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.931754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.931768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.931786 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:43 crc kubenswrapper[4707]: I0121 15:02:43.931796 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:43Z","lastTransitionTime":"2026-01-21T15:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.034355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.034400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.034410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.034425 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.034434 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.137076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.137123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.137133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.137146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.137155 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.166563 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:59:17.4000573 +0000 UTC Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.182016 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.182164 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.240248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.240288 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.240297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.240314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.240324 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.342166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.342198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.342206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.342221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.342230 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.444455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.444501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.444511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.444528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.444538 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.546950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.547010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.547019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.547035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.547045 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.649082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.649117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.649125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.649139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.649149 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.735793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.735860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.735875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.735906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.735914 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.746355 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:44Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.750034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.750070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.750079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.750096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.750108 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.760064 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:44Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.763414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.763433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.763442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.763454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.763465 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.788308 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:44Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.802224 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.802272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.802287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.802305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.802317 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.815448 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:44Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.819069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.819188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.819274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.819346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.819406 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.829782 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:44Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:44 crc kubenswrapper[4707]: E0121 15:02:44.830091 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.831444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.831477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.831487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.831504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.831515 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.934044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.934086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.934095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.934111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:44 crc kubenswrapper[4707]: I0121 15:02:44.934121 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:44Z","lastTransitionTime":"2026-01-21T15:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.036337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.036569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.036638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.036716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.036800 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.139249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.139284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.139295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.139308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.139318 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.167586 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:41:44.780798539 +0000 UTC Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.182069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.182091 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.182078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:45 crc kubenswrapper[4707]: E0121 15:02:45.182182 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:45 crc kubenswrapper[4707]: E0121 15:02:45.182241 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:45 crc kubenswrapper[4707]: E0121 15:02:45.182624 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.183027 4707 scope.go:117] "RemoveContainer" containerID="5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23" Jan 21 15:02:45 crc kubenswrapper[4707]: E0121 15:02:45.183210 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.241198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.241232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.241259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.241274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.241283 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.343350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.343399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.343409 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.343423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.343432 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.445708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.445750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.445760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.445773 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.445782 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.548037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.548076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.548087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.548123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.548137 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.650588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.650622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.650632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.650645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.650655 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.752542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.752576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.752585 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.752598 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.752609 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.854874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.854939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.854969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.854985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.855008 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.956858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.956896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.956905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.956918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:45 crc kubenswrapper[4707]: I0121 15:02:45.956928 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:45Z","lastTransitionTime":"2026-01-21T15:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.059410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.059446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.059456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.059469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.059492 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.161551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.161615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.161624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.161641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.161652 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.168751 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:18:22.700646815 +0000 UTC Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.182084 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:46 crc kubenswrapper[4707]: E0121 15:02:46.182474 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.264199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.264242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.264251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.264275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.264286 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.366312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.366359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.366371 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.366388 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.366400 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.468764 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.468801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.468830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.468846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.468856 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.571225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.571272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.571281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.571297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.571307 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.674166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.674207 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.674216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.674229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.674238 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.776772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.776832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.776843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.776857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.776867 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.879241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.879273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.879282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.879295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.879305 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.982066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.982103 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.982113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.982128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:46 crc kubenswrapper[4707]: I0121 15:02:46.982139 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:46Z","lastTransitionTime":"2026-01-21T15:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.085344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.085378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.085387 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.085402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.085411 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.168878 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:28:26.805175336 +0000 UTC Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.182307 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.182391 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.182385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:47 crc kubenswrapper[4707]: E0121 15:02:47.182548 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:47 crc kubenswrapper[4707]: E0121 15:02:47.182637 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:47 crc kubenswrapper[4707]: E0121 15:02:47.182690 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.187108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.187133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.187142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.187156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.187167 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.288972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.289009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.289018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.289047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.289056 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.392589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.392615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.392624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.392639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.392647 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.494421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.494461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.494470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.494488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.494497 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.596536 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.596561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.596569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.596582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.596593 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.698200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.698234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.698242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.698254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.698265 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.800316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.800371 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.800384 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.800403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.800415 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.902412 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.902445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.902454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.902471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:47 crc kubenswrapper[4707]: I0121 15:02:47.902480 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:47Z","lastTransitionTime":"2026-01-21T15:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.004906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.004943 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.004953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.004969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.004979 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.107431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.107474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.107484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.107505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.107514 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.169891 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 03:57:15.234403438 +0000 UTC Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.182017 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:48 crc kubenswrapper[4707]: E0121 15:02:48.182170 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.209828 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.209870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.209880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.209896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.209905 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.312078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.312119 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.312128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.312143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.312154 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.413971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.414016 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.414026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.414042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.414065 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.516201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.516249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.516260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.516276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.516285 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.618665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.618703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.618712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.618726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.618737 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.720717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.720760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.720769 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.720785 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.720796 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.822424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.822465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.822475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.822490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.822500 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.924567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.924601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.924610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.924625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:48 crc kubenswrapper[4707]: I0121 15:02:48.924635 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:48Z","lastTransitionTime":"2026-01-21T15:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.027247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.027282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.027291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.027307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.027315 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.129557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.129762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.129982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.130140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.130274 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.171055 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 04:28:08.501318259 +0000 UTC Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.182756 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.182975 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.182942 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:49 crc kubenswrapper[4707]: E0121 15:02:49.183204 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:49 crc kubenswrapper[4707]: E0121 15:02:49.183311 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:49 crc kubenswrapper[4707]: E0121 15:02:49.183391 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.232253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.232500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.232583 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.232664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.232735 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.334522 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.334567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.334578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.334593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.334603 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.436632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.436678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.436689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.436708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.436717 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.539269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.539309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.539318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.539333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.539343 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.642159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.642197 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.642206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.642222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.642232 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.744666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.744714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.744724 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.744741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.744751 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.847104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.847141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.847150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.847165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.847177 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.948986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.949040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.949050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.949068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:49 crc kubenswrapper[4707]: I0121 15:02:49.949090 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:49Z","lastTransitionTime":"2026-01-21T15:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.051419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.051466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.051476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.051492 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.051503 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.154299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.154342 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.154352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.154365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.154374 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.171701 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 01:44:50.547871549 +0000 UTC Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.182057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:50 crc kubenswrapper[4707]: E0121 15:02:50.182206 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.257935 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.257978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.257989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.258004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.258014 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.360018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.360057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.360066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.360081 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.360100 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.462601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.462645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.462654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.462674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.462683 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.565716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.565785 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.565796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.565835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.565848 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.668885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.668939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.668950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.668972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.668982 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.770507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.770542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.770550 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.770563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.770571 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.872738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.872784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.872792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.872822 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.872833 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.975206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.975910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.975949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.975971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:50 crc kubenswrapper[4707]: I0121 15:02:50.975983 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:50Z","lastTransitionTime":"2026-01-21T15:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.078644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.078683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.078694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.078708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.078716 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.172545 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 03:37:12.598982874 +0000 UTC Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.181158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.181188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.181198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.181211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.181221 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.182034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.182183 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:51 crc kubenswrapper[4707]: E0121 15:02:51.182233 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.182205 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:51 crc kubenswrapper[4707]: E0121 15:02:51.182411 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:51 crc kubenswrapper[4707]: E0121 15:02:51.182527 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.283632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.283672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.283682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.283698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.283708 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.385844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.385882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.385892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.385905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.385914 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.487827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.487874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.487884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.487911 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.487923 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.590275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.590315 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.590328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.590345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.590355 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.692405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.692444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.692454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.692467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.692477 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.794444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.794483 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.794491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.794506 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.794515 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.896474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.896544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.896554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.896568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.896579 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.913245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:51 crc kubenswrapper[4707]: E0121 15:02:51.913482 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:51 crc kubenswrapper[4707]: E0121 15:02:51.913572 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:03:23.913555693 +0000 UTC m=+101.095071915 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.998899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.998939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.998949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.998963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:51 crc kubenswrapper[4707]: I0121 15:02:51.998972 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:51Z","lastTransitionTime":"2026-01-21T15:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.101623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.101673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.101681 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.101696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.101705 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.172865 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:57:49.207537238 +0000 UTC Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.182154 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:52 crc kubenswrapper[4707]: E0121 15:02:52.182285 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.203908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.203950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.203960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.203976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.203986 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.307004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.307075 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.307086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.307117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.307127 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.409216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.409264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.409276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.409293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.409303 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.478249 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/0.log" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.478371 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2bdbb11-a196-4dc3-b197-64ef1bec8e8a" containerID="cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49" exitCode=1 Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.478480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerDied","Data":"cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.479176 4707 scope.go:117] "RemoveContainer" containerID="cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.493018 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.502999 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.511406 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.511667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.511718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.511732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.511749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.511758 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.524756 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.532894 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.544176 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.551514 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.559233 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d8
8c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.569885 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.578948 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.588865 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.596429 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.604307 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 
15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.614186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.614218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.614228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.614245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.614255 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.616333 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.626677 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.636238 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.647534 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:52Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.716074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.716129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.716138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.716161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.716171 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.822459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.822499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.822508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.822524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.822534 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.924839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.924880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.924890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.924904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:52 crc kubenswrapper[4707]: I0121 15:02:52.924913 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:52Z","lastTransitionTime":"2026-01-21T15:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.027029 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.027062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.027071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.027098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.027107 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.128489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.128520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.128530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.128542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.128570 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.173843 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:07:05.772518883 +0000 UTC Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.182115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.182116 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.182148 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:53 crc kubenswrapper[4707]: E0121 15:02:53.182286 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:53 crc kubenswrapper[4707]: E0121 15:02:53.182339 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:53 crc kubenswrapper[4707]: E0121 15:02:53.182516 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.192060 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.201186 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.208371 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.221482 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/o
vn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.228545 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.230042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.230096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.230107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.230125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.230136 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.238175 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.247305 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.255422 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.263894 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.270648 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.279404 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.287260 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.296667 4707 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a1
2d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.305250 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.313178 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.324405 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.331853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.331875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.331883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.331897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.331905 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.332097 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.432891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.433006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.433082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.433146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.433199 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.482019 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/0.log" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.482078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerStarted","Data":"99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.490840 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.506715 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.517350 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.526460 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.534400 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.535574 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.535632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.535643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.535661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.535671 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.542449 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.550648 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.557431 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.565975 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.573508 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.581219 4707 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.589404 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.597800 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.607264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.615556 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.624909 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-reg
eneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.633510 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:53Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.637364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.637391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.637399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.637413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.637422 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.739429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.739469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.739482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.739498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.739508 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.841136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.841188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.841200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.841217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.841234 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.943011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.943072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.943082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.943098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:53 crc kubenswrapper[4707]: I0121 15:02:53.943107 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:53Z","lastTransitionTime":"2026-01-21T15:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.045402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.045470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.045489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.045510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.045522 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.148257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.148312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.148322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.148337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.148362 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.174231 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 06:16:59.092858288 +0000 UTC Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.181898 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.182038 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.250096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.250129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.250138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.250152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.250160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.352228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.352299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.352310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.352326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.352336 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.454349 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.454391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.454402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.454421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.454436 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.556395 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.556443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.556454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.556474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.556484 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.658336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.658378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.658390 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.658408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.658419 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.760489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.760525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.760536 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.760551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.760562 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.862700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.862754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.862765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.862786 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.862799 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.878645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.878678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.878687 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.878701 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.878709 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.889576 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:54Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.892429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.892463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.892475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.892487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.892496 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.901791 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:54Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.904288 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.904317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.904328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.904340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.904348 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.913229 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:54Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.919408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.919444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.919455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.919469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.919478 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.929518 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:54Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.932109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.932140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.932153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.932165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.932174 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.940382 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:54Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:54 crc kubenswrapper[4707]: E0121 15:02:54.940519 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.965267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.965299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.965308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.965323 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:54 crc kubenswrapper[4707]: I0121 15:02:54.965333 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:54Z","lastTransitionTime":"2026-01-21T15:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.067597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.067636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.067645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.067659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.067668 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.169870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.169918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.169929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.169946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.169956 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.175184 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 07:43:13.503132128 +0000 UTC Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.182517 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.182565 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:55 crc kubenswrapper[4707]: E0121 15:02:55.182629 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.182662 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:55 crc kubenswrapper[4707]: E0121 15:02:55.182776 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:55 crc kubenswrapper[4707]: E0121 15:02:55.182841 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.272882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.272929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.272938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.272955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.272966 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.375537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.375577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.375585 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.375599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.375609 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.479095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.479243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.479254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.479270 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.479280 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.582485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.582519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.582544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.582560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.582571 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.684697 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.684743 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.684752 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.684768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.684777 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.786541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.786591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.786600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.786614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.786624 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.888864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.888902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.888913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.888927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.888937 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.990964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.991001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.991009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.991034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:55 crc kubenswrapper[4707]: I0121 15:02:55.991044 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:55Z","lastTransitionTime":"2026-01-21T15:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.093357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.093403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.093413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.093430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.093442 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.175438 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:14:27.658433398 +0000 UTC Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.181762 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:56 crc kubenswrapper[4707]: E0121 15:02:56.181992 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.191470 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.196000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.196062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.196073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.196090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.196099 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.298112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.298157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.298166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.298183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.298195 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.400299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.400344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.400356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.400374 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.400383 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.502277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.502313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.502321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.502337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.502346 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.604836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.604876 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.604885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.604908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.604918 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.706929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.706966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.706975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.706990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.707016 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.809542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.809575 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.809584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.809598 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.809609 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.912553 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.912595 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.912604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.912654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:56 crc kubenswrapper[4707]: I0121 15:02:56.912665 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:56Z","lastTransitionTime":"2026-01-21T15:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.014964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.015013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.015023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.015038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.015047 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.117311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.117344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.117355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.117369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.117379 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.175951 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 11:09:02.514708022 +0000 UTC Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.194561 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:57 crc kubenswrapper[4707]: E0121 15:02:57.194708 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.195631 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.195673 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:57 crc kubenswrapper[4707]: E0121 15:02:57.195737 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:57 crc kubenswrapper[4707]: E0121 15:02:57.195907 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.196271 4707 scope.go:117] "RemoveContainer" containerID="5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.218826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.218871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.218881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.218898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.218909 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.321144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.321183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.321192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.321211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.321222 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.424007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.424042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.424051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.424066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.424076 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.495509 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/2.log" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.498477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.498824 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.510312 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.519075 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.526150 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.526184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.526193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.526208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.526218 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.533684 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.544862 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.555120 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.565942 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.575459 
4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.585082 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 
15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.596241 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.607070 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.621530 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcfccd0-9ae0-44ba-a142-b621f14854f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bddde6ec5b36fdff21a26422f9d552fffa7b8fea406b82552daafa46943b5b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.628666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.628704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.628713 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.628728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.628739 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.642521 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.655412 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.677241 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.692792 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.702126 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.717784 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to 
st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.725852 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.731610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.731640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.731650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.731665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.731675 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.834659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.834698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.834709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.834724 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.834734 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.937136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.937184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.937194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.937211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:57 crc kubenswrapper[4707]: I0121 15:02:57.937223 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:57Z","lastTransitionTime":"2026-01-21T15:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.039462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.039516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.039528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.039548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.039559 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.142345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.142387 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.142397 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.142414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.142424 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.176763 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:51:48.600337808 +0000 UTC Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.182015 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:02:58 crc kubenswrapper[4707]: E0121 15:02:58.182121 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.244572 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.244602 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.244610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.244624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.244633 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.346835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.346886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.346895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.346925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.346940 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.450200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.450246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.450258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.450281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.450293 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.502258 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/3.log" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.502891 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/2.log" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.504722 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" exitCode=1 Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.504749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.504789 4707 scope.go:117] "RemoveContainer" containerID="5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.505256 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:02:58 crc kubenswrapper[4707]: E0121 15:02:58.505391 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.517154 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.525597 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.534255 4707 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.542058 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.550329 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.552887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.552927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.552938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.552953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.552975 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.560114 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.572423 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.584696 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" 
Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.597246 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.608267 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.616297 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcfccd0-9ae0-44ba-a142-b621f14854f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bddde6ec5b36fdff21a26422f9d552fffa7b8fea406b82552daafa46943b5b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.626198 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.637472 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.647603 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.654902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.654933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.654942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.654957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.654979 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.657052 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.665123 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.680208 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5eaaf52c7331f8da015727c1d9f573f81443b8093b8f4c99d0dbe9fef682bf23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:32Z\\\",\\\"message\\\":\\\"crc\\\\nI0121 15:02:32.822922 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-fmg2k in node crc\\\\nI0121 15:02:32.822927 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822931 6433 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fmg2k after 0 failed attempt(s)\\\\nI0121 15:02:32.822933 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0121 15:02:32.822892 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-mk2p2\\\\nI0121 15:02:32.822931 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-62mww] creating logical port openshift-multus_network-metrics-daemon-62mww for pod on switch crc\\\\nF0121 15:02:32.822969 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to 
st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:57Z\\\",\\\"message\\\":\\\"transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
U\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.690014 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:58Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.757286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.757344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.757354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.757379 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.757393 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.861076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.861124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.861136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.861154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.861171 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.963584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.963629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.963639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.963692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:58 crc kubenswrapper[4707]: I0121 15:02:58.963710 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:58Z","lastTransitionTime":"2026-01-21T15:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.066489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.066541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.066554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.066571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.066583 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.169465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.169509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.169518 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.169533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.169543 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.177744 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 22:32:18.223793045 +0000 UTC Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.182088 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.182160 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.182090 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:02:59 crc kubenswrapper[4707]: E0121 15:02:59.182223 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:02:59 crc kubenswrapper[4707]: E0121 15:02:59.182286 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:02:59 crc kubenswrapper[4707]: E0121 15:02:59.182391 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.272174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.272231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.272242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.272295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.272312 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.374445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.374491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.374504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.374519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.374529 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.476547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.476585 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.476594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.476611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.476620 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.508441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/3.log" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.511610 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:02:59 crc kubenswrapper[4707]: E0121 15:02:59.511776 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.521733 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcfccd0-9ae0-44ba-a142-b621f14854f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bddde6ec5b36fdff21a26422f9d552fffa7b8fea406b82552daafa46943b5b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.531663 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.543066 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.552609 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 
15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.563754 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.574194 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.578565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.578599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.578607 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.578621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.578630 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.583232 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.599782 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:57Z\\\",\\\"message\\\":\\\"transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: U\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.611421 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.621286 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.629188 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.638084 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.646473 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.653899 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.663264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.672309 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.681118 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.681224 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.681308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.681391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.681460 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.682301 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.691891 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:02:59Z is after 2025-08-24T17:21:41Z" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.784031 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.784083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.784094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.784111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.784123 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.887068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.887117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.887127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.887147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.887159 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.989950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.989994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.990003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.990018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:02:59 crc kubenswrapper[4707]: I0121 15:02:59.990028 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:02:59Z","lastTransitionTime":"2026-01-21T15:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.091801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.091866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.091874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.091890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.091899 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.178620 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 02:53:50.291062345 +0000 UTC Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.181992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:00 crc kubenswrapper[4707]: E0121 15:03:00.182336 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.194020 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.194904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.194953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.194964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.194979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.194988 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.296983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.297026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.297035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.297051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.297063 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.399511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.399551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.399560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.399575 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.399585 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.501950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.501995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.502004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.502020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.502030 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.604150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.604204 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.604213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.604231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.604240 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.706362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.706412 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.706422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.706437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.706455 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.808017 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.808053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.808065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.808079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.808087 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.910009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.910041 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.910049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.910062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:00 crc kubenswrapper[4707]: I0121 15:03:00.910071 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:00Z","lastTransitionTime":"2026-01-21T15:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.012568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.012609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.012618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.012634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.012644 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.114590 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.114641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.114652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.114668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.114681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.179285 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:32:13.51185641 +0000 UTC Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.182631 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.182676 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:01 crc kubenswrapper[4707]: E0121 15:03:01.182757 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.182852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:01 crc kubenswrapper[4707]: E0121 15:03:01.182944 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:01 crc kubenswrapper[4707]: E0121 15:03:01.182969 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.215835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.215870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.215880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.215895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.215905 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.317738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.317770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.317779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.317791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.317799 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.419483 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.419514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.419523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.419539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.419548 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.521653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.521693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.521702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.521718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.521728 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.623765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.623855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.623877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.623897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.624025 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.726170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.726206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.726215 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.726231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.726243 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.829412 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.829475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.829487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.829510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.829520 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.932375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.932407 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.932416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.932429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:01 crc kubenswrapper[4707]: I0121 15:03:01.932439 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:01Z","lastTransitionTime":"2026-01-21T15:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.034406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.034456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.034468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.034485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.034498 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.136679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.136725 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.136734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.136751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.136763 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.180419 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 09:37:02.329007474 +0000 UTC Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.181709 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:02 crc kubenswrapper[4707]: E0121 15:03:02.181935 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.238641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.238679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.238688 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.238702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.238714 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.340861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.340915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.340926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.340942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.340951 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.443501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.443747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.443847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.443938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.444004 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.546684 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.546725 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.546734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.546751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.546763 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.648988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.649023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.649033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.649047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.649058 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.750920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.751128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.751229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.751309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.751366 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.853901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.853938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.853947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.853962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.853971 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.956130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.956181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.956190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.956206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:02 crc kubenswrapper[4707]: I0121 15:03:02.956216 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:02Z","lastTransitionTime":"2026-01-21T15:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.058069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.058124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.058133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.058150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.058161 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.160307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.160350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.160360 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.160375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.160386 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.181550 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 10:41:33.008309759 +0000 UTC Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.181755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.181782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:03 crc kubenswrapper[4707]: E0121 15:03:03.182279 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.181782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:03 crc kubenswrapper[4707]: E0121 15:03:03.182409 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:03 crc kubenswrapper[4707]: E0121 15:03:03.182684 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.196773 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd0228bd-1ad3-4e64-8cb4-c1ae5010568f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd877686c4f6de7f9e976bb1195d7dd34e9da39f2e584c5575da8ddb3856af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68da03de7dff23aaddb13c005cd9cc46f744829679f7e362499937dd96cc2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2f3829436e297e032723e4c63a5b0a19c28e1e1b2a46fa18abc6858bb427d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34c4d7c3947e5fcbd9a96151b30de17ff1124612eb7525177c2344c8625ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0925e2e555e1f73758141e7a8c14360bfa384808371aa58655b60dac8cad7170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a0bb8dba9951c181a7f819389706ff91c21125a8705fe76e5173a8e0d71a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a0bb8dba9951c181a7f819389706ff91c21125a8705fe76e5173a8e0d71a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4e6e7530fbf2681c46e3cc91b37cefa9a0f30a496bc049cfbff12a9a4e3d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4e6e7530fbf2681c46e3cc91b37cefa9a0f30a496bc049cfbff12a9a4e3d31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ce934ba3545183939ee0825701b4f127825ebe59788e0b9b9aa08e2e18e28692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce934ba3545183939ee0825701b4f127825ebe59788e0b9b9aa08e2e18e28692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.205150 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-62mww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ft6kw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-62mww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.213301 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://417a37c85beb794e0a6fb0107963451412396faadc2d0abc8f184519f9b6e8d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.221956 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de4a8acc0793df0fb12e7d1b7b86001c2c5e7d4324286eab7cc254c82713868f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eda118fab0fcc311ae7449994f57dc26ed7d45456cfc77b26bf91418ff1e5fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.230445 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07107e90-ae92-467e-8436-2c24c21b5554\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e768dceb52ab6d846b8778009b6e4e11870366303ace7aa942a773657ed35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbh2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.242064 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lxkz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:52Z\\\",\\\"message\\\":\\\"2026-01-21T15:02:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937\\\\n2026-01-21T15:02:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b17bb885-8132-4464-932e-f41505e05937 to /host/opt/cni/bin/\\\\n2026-01-21T15:02:07Z [verbose] multus-daemon started\\\\n2026-01-21T15:02:07Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:02:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79znq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lxkz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.251566 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13f1c2e9b0e8a8c309f185ba64f5ab961631b619937402ab3876d3aea259488c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fd8tq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fmg2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.260286 4707 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24e913f8-27bb-477b-999a-1b05430c82ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c7141567f2ff27d34b0061acf51407069931697bcfc1186f5ab27c111cda9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66ef54ecc8e190c5f522c66359d8f08947ff16e84fe7653dc841e0297e34861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5001af979b32a7c6d5eae31881615ee5a630bf65f22c70077372b9bd3afee3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d430ad5b300f5db2ccb2124c23ba2b8b38da2c2118412673704649c794652f0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.262501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.262530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.262539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.262556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.262565 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.270207 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.277718 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcfccd0-9ae0-44ba-a142-b621f14854f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bddde6ec5b36fdff21a26422f9d552fffa7b8fea406b82552daafa46943b5b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f963f16188a1c7afb0298399eda3e3dfe35e27bebf6183f392c3dfa00b7742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.286645 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.296538 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6eefe63-30b0-46e6-af7b-3909a3849128\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122eded10ddffd1602070b260d3ebf3b4591234f7b701d6b21add236f39d467a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22d37fcaecb7813f85e728b33469a00e76fa24ba42566ab94280add813cf1ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a86506942bbd74633fb9d64e53aaa69f8dbfa8ac331c0a8497aa7d923bda9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be5e31388896523773eed2f1b4118907fcffa2858d7a9ee41d8ce61ea8109d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024957ae57720f5eff8768f9e735f74de9dbea2cc7c96d8ede38117e773e805e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87abc7ffb6a008c8fb583856ad09c61fc2bc794206160962460f81d8cc8b149b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://233a2eedae6b6169e15cff1fe22bff5c379a789cb641f086b6bbb0435c94599e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57sv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mk2p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.305055 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9259afc-a0fa-4676-b6a1-0607479688f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcc0ae3cfe833161e2c84b2067745708fe0546024052896a8e1a2554a144dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be2883623b85be0aed6462f0966d83d3773039bd009d8965fb3a07cccba880fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctvs9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gxcpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 
15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.314954 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:01:55.173221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:01:55.174693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1628546024/tls.crt::/tmp/serving-cert-1628546024/tls.key\\\\\\\"\\\\nI0121 15:02:00.626159 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:02:00.629665 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:02:00.629687 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:02:00.629714 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:02:00.629719 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:02:00.636406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 15:02:00.636426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636430 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:02:00.636434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:02:00.636437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:02:00.636440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:02:00.636443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 15:02:00.636512 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 15:02:00.637696 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:01:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.324194 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fef8e221-b481-4175-9231-9d20e583719f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce862bf25db884864ffabb663a02ee35731f2d4d9fb5bc144f0727a87e46ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33865ecf28037e333b8659744469cc9588e48eed168b8ef55c8757fb90dedc38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6683fcbfb1434a6229fbeda852069ef769ff715f8a0f3af48a4d9242496a09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:01:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.331916 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jspjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe6f2caf-2457-4027-a1df-0dfa2c670a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e93ea8e3a52a82742af4526fa7b3091a324eb6b541704ec58196dfc743f313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s4rdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jspjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.345501 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ov
nkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:02:57Z\\\",\\\"message\\\":\\\"transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT 
Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: U\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"
/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kw78q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:02:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-brcdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.355211 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1b11b7a5effa83403fcc1b56e6c01034da6e824754df7c4f531dc511be0f43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.364159 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:03Z is after 2025-08-24T17:21:41Z" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.364785 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.364889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.364902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.364922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.364933 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.466340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.466387 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.466397 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.466412 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.466423 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.569766 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.569845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.569856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.569898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.569913 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.673295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.673418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.673429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.673447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.673458 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.777569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.777623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.777633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.777651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.777663 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.879629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.879666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.879677 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.879694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.879704 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.982316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.982355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.982364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.982378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:03 crc kubenswrapper[4707]: I0121 15:03:03.982389 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:03Z","lastTransitionTime":"2026-01-21T15:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.084932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.084972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.084987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.085004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.085015 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.182060 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:04 crc kubenswrapper[4707]: E0121 15:03:04.182194 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.182446 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 09:50:38.468768187 +0000 UTC Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.187780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.187841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.187850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.187881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.187891 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.290275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.290321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.290331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.290350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.290360 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.392179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.392466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.392537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.392607 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.392681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.495620 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.496064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.496141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.496231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.496296 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.598761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.598800 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.598831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.598859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.598871 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.700571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.700604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.700613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.700628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.700636 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.802475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.802511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.802520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.802534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.802544 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.904672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.904712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.904722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.904735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:04 crc kubenswrapper[4707]: I0121 15:03:04.904744 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:04Z","lastTransitionTime":"2026-01-21T15:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.006554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.006597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.006607 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.006625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.006636 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.034305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.034480 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.034458854 +0000 UTC m=+146.215975077 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.034667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.034801 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.034884 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.03487631 +0000 UTC m=+146.216392532 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.108555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.108848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.108934 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.109015 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.109087 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.135272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.135307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.135336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135455 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135471 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135480 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135517 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.135503648 +0000 UTC m=+146.317019870 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135638 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135649 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135657 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135682 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.135673744 +0000 UTC m=+146.317189967 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135787 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.135852 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.135803846 +0000 UTC m=+146.317320068 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.182702 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 20:28:49.483038719 +0000 UTC Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.182747 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.182712 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.182892 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.182714 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.182995 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.183108 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.211626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.211655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.211665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.211680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.211689 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.238103 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.238146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.238156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.238173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.238184 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.248321 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:05Z is after 
2025-08-24T17:21:41Z" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.252037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.252079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.252089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.252112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.252125 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.262402 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:05Z is after 
2025-08-24T17:21:41Z" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.266343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.266378 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.266387 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.266399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.266409 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.276455 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:05Z is after 
2025-08-24T17:21:41Z" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.280293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.280340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.280352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.280369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.280381 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.289768 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:05Z is after 
2025-08-24T17:21:41Z" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.292662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.292704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.292714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.292730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.292741 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.301773 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14f7c65e-7534-4950-b6fd-0d06696bf53c\\\",\\\"systemUUID\\\":\\\"64d1d4ed-0a74-4014-b8fb-96fafa8c47a8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:03:05Z is after 
2025-08-24T17:21:41Z" Jan 21 15:03:05 crc kubenswrapper[4707]: E0121 15:03:05.301939 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.314079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.314113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.314123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.314137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.314146 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.416014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.416067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.416077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.416098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.416109 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.518740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.518803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.518846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.518862 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.518873 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.621374 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.621427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.621437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.621452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.621460 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.723540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.723578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.723588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.723619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.723629 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.825674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.825709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.825718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.825732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.825742 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.928060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.928102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.928111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.928132 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:05 crc kubenswrapper[4707]: I0121 15:03:05.928142 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:05Z","lastTransitionTime":"2026-01-21T15:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.030047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.030087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.030095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.030110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.030118 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.132715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.132752 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.132760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.132775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.132784 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.182579 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:06 crc kubenswrapper[4707]: E0121 15:03:06.182705 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.182880 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 14:47:35.044367512 +0000 UTC Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.234780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.234850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.234860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.234873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.234883 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.337330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.337379 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.337388 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.337405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.337418 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.439499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.439554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.439566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.439584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.439595 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.541205 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.541246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.541255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.541272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.541282 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.643939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.643976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.643985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.644001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.644013 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.746603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.746639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.746649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.746662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.746671 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.848550 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.848702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.848777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.848883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.848969 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.951352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.951396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.951406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.951422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:06 crc kubenswrapper[4707]: I0121 15:03:06.951433 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:06Z","lastTransitionTime":"2026-01-21T15:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.053243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.053280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.053291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.053306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.053315 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.157848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.157899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.157910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.157925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.157933 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.182301 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.182328 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:07 crc kubenswrapper[4707]: E0121 15:03:07.182415 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.182467 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:07 crc kubenswrapper[4707]: E0121 15:03:07.182560 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:07 crc kubenswrapper[4707]: E0121 15:03:07.182593 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.183285 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 08:29:04.901521504 +0000 UTC Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.259736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.259774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.259783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.259824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.259838 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.361845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.361881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.361891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.361905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.361914 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.464049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.464086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.464095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.464109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.464117 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.565882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.565918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.565927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.565940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.565949 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.668049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.668098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.668109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.668126 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.668134 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.770037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.770076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.770085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.770100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.770110 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.872115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.872149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.872159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.872172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.872181 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.974120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.974169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.974178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.974194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:07 crc kubenswrapper[4707]: I0121 15:03:07.974203 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:07Z","lastTransitionTime":"2026-01-21T15:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.075615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.075656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.075671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.075686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.075695 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.178008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.178080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.178090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.178108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.178122 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.182404 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:08 crc kubenswrapper[4707]: E0121 15:03:08.182617 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.184368 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:59:17.991729303 +0000 UTC Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.280614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.280662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.280671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.280693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.280704 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.382748 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.382831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.382841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.382856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.382867 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.485029 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.485067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.485077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.485092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.485106 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.587487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.587533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.587545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.587560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.587569 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.690174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.690220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.690230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.690248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.690262 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.792066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.792107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.792117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.792148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.792157 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.894608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.894662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.894671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.894688 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.894697 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.997600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.997652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.997661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.997676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:08 crc kubenswrapper[4707]: I0121 15:03:08.997688 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:08Z","lastTransitionTime":"2026-01-21T15:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.100545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.100584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.100592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.100606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.100617 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.182264 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.182324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.182351 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:09 crc kubenswrapper[4707]: E0121 15:03:09.182402 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:09 crc kubenswrapper[4707]: E0121 15:03:09.182568 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:09 crc kubenswrapper[4707]: E0121 15:03:09.182641 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.184553 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 19:51:31.293995991 +0000 UTC Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.203118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.203160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.203169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.203185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.203195 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.305415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.305467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.305477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.305491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.305499 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.407439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.407476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.407485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.407500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.407509 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.509842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.509880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.509891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.509909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.509919 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.611726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.611791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.611800 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.611827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.611856 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.714729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.714781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.714793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.714827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.714837 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.816664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.816698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.816707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.816721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.816730 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.919020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.919194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.919601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.919622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:09 crc kubenswrapper[4707]: I0121 15:03:09.919632 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:09Z","lastTransitionTime":"2026-01-21T15:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.022244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.022278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.022288 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.022302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.022313 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.124088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.124131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.124141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.124157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.124167 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.182069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:10 crc kubenswrapper[4707]: E0121 15:03:10.182193 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.183021 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:03:10 crc kubenswrapper[4707]: E0121 15:03:10.183187 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.184860 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 16:50:53.72334389 +0000 UTC Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.226644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.226686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.226696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.226711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.226721 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.329159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.329200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.329210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.329225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.329235 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.431168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.431206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.431228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.431244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.431253 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.532641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.532674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.532683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.532697 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.532706 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.634489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.634534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.634542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.634556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.634566 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.735879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.735907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.735916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.735926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.735935 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.838255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.838301 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.838310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.838326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.838337 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.941404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.941446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.941455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.941471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:10 crc kubenswrapper[4707]: I0121 15:03:10.941482 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:10Z","lastTransitionTime":"2026-01-21T15:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.043861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.043912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.043923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.043944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.043955 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.145696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.145756 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.145767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.145785 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.145797 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.182464 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.182509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.182709 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:11 crc kubenswrapper[4707]: E0121 15:03:11.182804 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:11 crc kubenswrapper[4707]: E0121 15:03:11.182862 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:11 crc kubenswrapper[4707]: E0121 15:03:11.182961 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.185010 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 11:46:27.038269796 +0000 UTC Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.248290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.248337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.248350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.248365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.248377 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.350433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.350472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.350480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.350500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.350509 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.454340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.454382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.454391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.454420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.454430 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.556246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.556287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.556297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.556312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.556320 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.658775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.658819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.658829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.658846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.658856 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.760546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.760580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.760589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.760604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.760614 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.862272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.862302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.862310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.862322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.862332 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.963882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.963939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.963951 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.963966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:11 crc kubenswrapper[4707]: I0121 15:03:11.963976 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:11Z","lastTransitionTime":"2026-01-21T15:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.066137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.066183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.066191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.066206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.066217 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.168864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.168906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.168915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.168930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.168941 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.182271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:12 crc kubenswrapper[4707]: E0121 15:03:12.182392 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.185364 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 15:59:33.679380345 +0000 UTC Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.271589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.271621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.271630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.271644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.271653 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.373582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.373635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.373646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.373661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.373673 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.475777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.475837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.475848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.475867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.475878 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.577636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.577704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.577714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.577727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.577736 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.679762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.679805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.679838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.679854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.679863 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.781912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.781958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.781969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.781986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.781995 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.884182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.884215 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.884226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.884240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.884250 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.986481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.986522 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.986531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.986547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:12 crc kubenswrapper[4707]: I0121 15:03:12.986556 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:12Z","lastTransitionTime":"2026-01-21T15:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.089151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.089191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.089202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.089217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.089227 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.181888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.181935 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.182139 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:13 crc kubenswrapper[4707]: E0121 15:03:13.182245 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:13 crc kubenswrapper[4707]: E0121 15:03:13.182341 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:13 crc kubenswrapper[4707]: E0121 15:03:13.182422 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.185997 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 16:39:39.183250004 +0000 UTC Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.190859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.190929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.190940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.190956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.190966 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.207874 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=13.207803148 podStartE2EDuration="13.207803148s" podCreationTimestamp="2026-01-21 15:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.206548093 +0000 UTC m=+90.388064316" watchObservedRunningTime="2026-01-21 15:03:13.207803148 +0000 UTC m=+90.389319380" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.245928 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.24590591 podStartE2EDuration="44.24590591s" podCreationTimestamp="2026-01-21 15:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.231452684 +0000 UTC m=+90.412968916" watchObservedRunningTime="2026-01-21 15:03:13.24590591 +0000 UTC m=+90.427422133" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.286245 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xjrsl" podStartSLOduration=67.286225027 podStartE2EDuration="1m7.286225027s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.273799652 +0000 UTC m=+90.455315874" watchObservedRunningTime="2026-01-21 15:03:13.286225027 +0000 UTC m=+90.467741250" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.286337 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lxkz2" podStartSLOduration=67.286333959 podStartE2EDuration="1m7.286333959s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.286205581 +0000 UTC m=+90.467721803" watchObservedRunningTime="2026-01-21 15:03:13.286333959 +0000 UTC m=+90.467850182" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.293345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.293405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.293418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.293437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.293450 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.296187 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podStartSLOduration=67.296167012 podStartE2EDuration="1m7.296167012s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.296088696 +0000 UTC m=+90.477604918" watchObservedRunningTime="2026-01-21 15:03:13.296167012 +0000 UTC m=+90.477683234" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.323711 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.323693239 podStartE2EDuration="1m12.323693239s" podCreationTimestamp="2026-01-21 15:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.310490568 +0000 UTC m=+90.492006791" watchObservedRunningTime="2026-01-21 15:03:13.323693239 +0000 UTC m=+90.505209461" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.332771 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.332753845 podStartE2EDuration="1m13.332753845s" podCreationTimestamp="2026-01-21 15:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.323946741 +0000 UTC m=+90.505462962" watchObservedRunningTime="2026-01-21 15:03:13.332753845 +0000 UTC m=+90.514270067" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.341587 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.34157245 podStartE2EDuration="17.34157245s" podCreationTimestamp="2026-01-21 15:02:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.332572909 +0000 UTC m=+90.514089130" watchObservedRunningTime="2026-01-21 15:03:13.34157245 +0000 UTC m=+90.523088672" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.354097 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mk2p2" podStartSLOduration=67.354085829 podStartE2EDuration="1m7.354085829s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.353730107 +0000 UTC m=+90.535246340" watchObservedRunningTime="2026-01-21 15:03:13.354085829 +0000 UTC m=+90.535602061" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.365074 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gxcpt" podStartSLOduration=67.365054844 podStartE2EDuration="1m7.365054844s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.364457303 +0000 UTC m=+90.545973525" watchObservedRunningTime="2026-01-21 15:03:13.365054844 +0000 UTC m=+90.546571066" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.395798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.395850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.395860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.395874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.395884 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.420943 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jspjl" podStartSLOduration=67.420927294 podStartE2EDuration="1m7.420927294s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:13.42083385 +0000 UTC m=+90.602350072" watchObservedRunningTime="2026-01-21 15:03:13.420927294 +0000 UTC m=+90.602443517" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.497479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.497519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.497527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.497542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.497550 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.599514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.599542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.599550 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.599563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.599571 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.701604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.701638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.701648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.701675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.701686 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.803778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.803848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.803859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.803874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.803882 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.905694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.905735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.905744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.905757 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:13 crc kubenswrapper[4707]: I0121 15:03:13.905768 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:13Z","lastTransitionTime":"2026-01-21T15:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.007458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.007494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.007505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.007519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.007528 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.109648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.109701 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.109716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.109728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.109738 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.181897 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:14 crc kubenswrapper[4707]: E0121 15:03:14.182221 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.187055 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 06:41:30.494203847 +0000 UTC Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.211770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.211835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.211845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.211859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.211867 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.313852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.313884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.313894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.313905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.313915 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.415286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.415319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.415328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.415341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.415349 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.517160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.517389 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.517456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.517527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.517591 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.620364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.620402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.620413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.620426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.620439 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.722450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.722702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.722783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.722898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.722961 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.824675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.824705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.824715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.824727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.824735 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.927056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.927092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.927100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.927113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:14 crc kubenswrapper[4707]: I0121 15:03:14.927123 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:14Z","lastTransitionTime":"2026-01-21T15:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.028853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.028885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.028894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.028908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.028917 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.130260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.130294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.130304 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.130317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.130326 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.182655 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.182741 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:15 crc kubenswrapper[4707]: E0121 15:03:15.182785 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.182843 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:15 crc kubenswrapper[4707]: E0121 15:03:15.182907 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:15 crc kubenswrapper[4707]: E0121 15:03:15.182952 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.187829 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:29:01.448898903 +0000 UTC Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.232320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.232351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.232361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.232372 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.232381 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.334901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.334938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.334948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.334962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.334972 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.436872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.436920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.436929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.436940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.436950 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.539062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.539093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.539123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.539138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.539147 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.640969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.641194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.641306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.641381 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.641672 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.675555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.675745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.675858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.675941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.676008 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:03:15Z","lastTransitionTime":"2026-01-21T15:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.703533 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr"] Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.703852 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.705419 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.706339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.706514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.707050 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.735717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.735763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.735846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.735863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.735887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.836617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc 
kubenswrapper[4707]: I0121 15:03:15.836684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.836726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.836770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.836790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.836997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.837027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.837742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.841513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:15 crc kubenswrapper[4707]: I0121 15:03:15.850358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ghgvr\" (UID: \"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.013732 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" Jan 21 15:03:16 crc kubenswrapper[4707]: W0121 15:03:16.024710 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a4336b_15d5_48f2_b6f8_d9b9ed13a5eb.slice/crio-1375634fec37f682cafbe09499ce2c6c45d7d277f4206a17a630adb8afe253a6 WatchSource:0}: Error finding container 1375634fec37f682cafbe09499ce2c6c45d7d277f4206a17a630adb8afe253a6: Status 404 returned error can't find the container with id 1375634fec37f682cafbe09499ce2c6c45d7d277f4206a17a630adb8afe253a6 Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.182576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:16 crc kubenswrapper[4707]: E0121 15:03:16.182992 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.188835 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 22:26:43.205183882 +0000 UTC Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.188878 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.195149 4707 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.548562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" event={"ID":"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb","Type":"ContainerStarted","Data":"9eb5a8e227b665e4208f5bb08a0b5b571e01a698d131615d69cd49d693c5ec0f"} Jan 21 15:03:16 crc kubenswrapper[4707]: I0121 15:03:16.548623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" event={"ID":"90a4336b-15d5-48f2-b6f8-d9b9ed13a5eb","Type":"ContainerStarted","Data":"1375634fec37f682cafbe09499ce2c6c45d7d277f4206a17a630adb8afe253a6"} Jan 21 15:03:17 crc kubenswrapper[4707]: I0121 15:03:17.182721 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:17 crc kubenswrapper[4707]: I0121 15:03:17.182746 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:17 crc kubenswrapper[4707]: I0121 15:03:17.182793 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:17 crc kubenswrapper[4707]: E0121 15:03:17.182892 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:17 crc kubenswrapper[4707]: E0121 15:03:17.183106 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:17 crc kubenswrapper[4707]: E0121 15:03:17.183146 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:18 crc kubenswrapper[4707]: I0121 15:03:18.181878 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:18 crc kubenswrapper[4707]: E0121 15:03:18.182246 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:19 crc kubenswrapper[4707]: I0121 15:03:19.181831 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:19 crc kubenswrapper[4707]: I0121 15:03:19.181884 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:19 crc kubenswrapper[4707]: E0121 15:03:19.182024 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:19 crc kubenswrapper[4707]: I0121 15:03:19.181904 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:19 crc kubenswrapper[4707]: E0121 15:03:19.182209 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:19 crc kubenswrapper[4707]: E0121 15:03:19.182137 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:20 crc kubenswrapper[4707]: I0121 15:03:20.181770 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:20 crc kubenswrapper[4707]: E0121 15:03:20.181923 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:21 crc kubenswrapper[4707]: I0121 15:03:21.182567 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:21 crc kubenswrapper[4707]: I0121 15:03:21.182606 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:21 crc kubenswrapper[4707]: I0121 15:03:21.182647 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:21 crc kubenswrapper[4707]: E0121 15:03:21.183046 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:21 crc kubenswrapper[4707]: E0121 15:03:21.183159 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:21 crc kubenswrapper[4707]: E0121 15:03:21.183208 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:21 crc kubenswrapper[4707]: I0121 15:03:21.183471 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:03:21 crc kubenswrapper[4707]: E0121 15:03:21.183619 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:03:22 crc kubenswrapper[4707]: I0121 15:03:22.182192 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:22 crc kubenswrapper[4707]: E0121 15:03:22.182282 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:23 crc kubenswrapper[4707]: I0121 15:03:23.181757 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:23 crc kubenswrapper[4707]: I0121 15:03:23.181798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:23 crc kubenswrapper[4707]: I0121 15:03:23.181833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:23 crc kubenswrapper[4707]: E0121 15:03:23.182543 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:23 crc kubenswrapper[4707]: E0121 15:03:23.182611 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:23 crc kubenswrapper[4707]: E0121 15:03:23.182687 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:24 crc kubenswrapper[4707]: I0121 15:03:24.010051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:24 crc kubenswrapper[4707]: E0121 15:03:24.010181 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:03:24 crc kubenswrapper[4707]: E0121 15:03:24.010233 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs podName:d5fb5fe4-8f42-4057-b731-b2c8da0661e3 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:28.010218789 +0000 UTC m=+165.191735011 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs") pod "network-metrics-daemon-62mww" (UID: "d5fb5fe4-8f42-4057-b731-b2c8da0661e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:03:24 crc kubenswrapper[4707]: I0121 15:03:24.181575 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:24 crc kubenswrapper[4707]: E0121 15:03:24.181677 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:25 crc kubenswrapper[4707]: I0121 15:03:25.182397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:25 crc kubenswrapper[4707]: E0121 15:03:25.182602 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:25 crc kubenswrapper[4707]: I0121 15:03:25.182634 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:25 crc kubenswrapper[4707]: E0121 15:03:25.182689 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:25 crc kubenswrapper[4707]: I0121 15:03:25.182738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:25 crc kubenswrapper[4707]: E0121 15:03:25.182783 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:26 crc kubenswrapper[4707]: I0121 15:03:26.181764 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:26 crc kubenswrapper[4707]: E0121 15:03:26.181904 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:27 crc kubenswrapper[4707]: I0121 15:03:27.182151 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:27 crc kubenswrapper[4707]: I0121 15:03:27.182224 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:27 crc kubenswrapper[4707]: E0121 15:03:27.182253 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:27 crc kubenswrapper[4707]: I0121 15:03:27.182169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:27 crc kubenswrapper[4707]: E0121 15:03:27.182334 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:27 crc kubenswrapper[4707]: E0121 15:03:27.182467 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:28 crc kubenswrapper[4707]: I0121 15:03:28.181797 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:28 crc kubenswrapper[4707]: E0121 15:03:28.181974 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:29 crc kubenswrapper[4707]: I0121 15:03:29.182074 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:29 crc kubenswrapper[4707]: E0121 15:03:29.182184 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:29 crc kubenswrapper[4707]: I0121 15:03:29.182227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:29 crc kubenswrapper[4707]: E0121 15:03:29.182344 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:29 crc kubenswrapper[4707]: I0121 15:03:29.182660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:29 crc kubenswrapper[4707]: E0121 15:03:29.182740 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:30 crc kubenswrapper[4707]: I0121 15:03:30.181513 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:30 crc kubenswrapper[4707]: E0121 15:03:30.181609 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:31 crc kubenswrapper[4707]: I0121 15:03:31.182359 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:31 crc kubenswrapper[4707]: I0121 15:03:31.182395 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:31 crc kubenswrapper[4707]: I0121 15:03:31.182359 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:31 crc kubenswrapper[4707]: E0121 15:03:31.182478 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:31 crc kubenswrapper[4707]: E0121 15:03:31.182540 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:31 crc kubenswrapper[4707]: E0121 15:03:31.182615 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:32 crc kubenswrapper[4707]: I0121 15:03:32.182089 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:32 crc kubenswrapper[4707]: E0121 15:03:32.182187 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:33 crc kubenswrapper[4707]: I0121 15:03:33.182155 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:33 crc kubenswrapper[4707]: I0121 15:03:33.182195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:33 crc kubenswrapper[4707]: I0121 15:03:33.182170 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:33 crc kubenswrapper[4707]: E0121 15:03:33.183053 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:33 crc kubenswrapper[4707]: E0121 15:03:33.183180 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:33 crc kubenswrapper[4707]: E0121 15:03:33.183344 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:34 crc kubenswrapper[4707]: I0121 15:03:34.181629 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:34 crc kubenswrapper[4707]: E0121 15:03:34.181902 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:35 crc kubenswrapper[4707]: I0121 15:03:35.182545 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:35 crc kubenswrapper[4707]: I0121 15:03:35.182619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:35 crc kubenswrapper[4707]: E0121 15:03:35.182665 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:35 crc kubenswrapper[4707]: I0121 15:03:35.182545 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:35 crc kubenswrapper[4707]: E0121 15:03:35.182732 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:35 crc kubenswrapper[4707]: E0121 15:03:35.182889 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:36 crc kubenswrapper[4707]: I0121 15:03:36.181892 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:36 crc kubenswrapper[4707]: E0121 15:03:36.182030 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:36 crc kubenswrapper[4707]: I0121 15:03:36.182979 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:03:36 crc kubenswrapper[4707]: E0121 15:03:36.183147 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-brcdw_openshift-ovn-kubernetes(fa3aa8ac-04fb-4f52-8e61-ddb7676bca30)\"" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" Jan 21 15:03:37 crc kubenswrapper[4707]: I0121 15:03:37.181604 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:37 crc kubenswrapper[4707]: I0121 15:03:37.181702 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:37 crc kubenswrapper[4707]: E0121 15:03:37.181736 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:37 crc kubenswrapper[4707]: I0121 15:03:37.181852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:37 crc kubenswrapper[4707]: E0121 15:03:37.181961 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:37 crc kubenswrapper[4707]: E0121 15:03:37.182279 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.182489 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:38 crc kubenswrapper[4707]: E0121 15:03:38.183242 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.599217 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/1.log" Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.600102 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/0.log" Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.600147 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2bdbb11-a196-4dc3-b197-64ef1bec8e8a" containerID="99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177" exitCode=1 Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.600182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerDied","Data":"99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177"} Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.600216 4707 scope.go:117] "RemoveContainer" containerID="cb706d315e0df049bb79dd89bb6761803d2b7abba5fa856fae8ff8c61c18fb49" Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.600606 4707 scope.go:117] "RemoveContainer" containerID="99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177" Jan 21 15:03:38 crc kubenswrapper[4707]: E0121 15:03:38.600777 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lxkz2_openshift-multus(e2bdbb11-a196-4dc3-b197-64ef1bec8e8a)\"" pod="openshift-multus/multus-lxkz2" podUID="e2bdbb11-a196-4dc3-b197-64ef1bec8e8a" Jan 21 15:03:38 crc kubenswrapper[4707]: I0121 15:03:38.614651 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ghgvr" podStartSLOduration=92.61463486 podStartE2EDuration="1m32.61463486s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:16.562291781 +0000 UTC m=+93.743808004" watchObservedRunningTime="2026-01-21 15:03:38.61463486 +0000 UTC m=+115.796151082" Jan 21 15:03:39 crc kubenswrapper[4707]: I0121 15:03:39.182133 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:39 crc kubenswrapper[4707]: I0121 15:03:39.182236 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:39 crc kubenswrapper[4707]: I0121 15:03:39.182131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:39 crc kubenswrapper[4707]: E0121 15:03:39.182600 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:39 crc kubenswrapper[4707]: E0121 15:03:39.182687 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:39 crc kubenswrapper[4707]: E0121 15:03:39.182532 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:39 crc kubenswrapper[4707]: I0121 15:03:39.604933 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/1.log" Jan 21 15:03:40 crc kubenswrapper[4707]: I0121 15:03:40.182580 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:40 crc kubenswrapper[4707]: E0121 15:03:40.182749 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:41 crc kubenswrapper[4707]: I0121 15:03:41.181650 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:41 crc kubenswrapper[4707]: E0121 15:03:41.181750 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:41 crc kubenswrapper[4707]: I0121 15:03:41.181879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:41 crc kubenswrapper[4707]: E0121 15:03:41.182059 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:41 crc kubenswrapper[4707]: I0121 15:03:41.182302 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:41 crc kubenswrapper[4707]: E0121 15:03:41.182527 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:42 crc kubenswrapper[4707]: I0121 15:03:42.182157 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:42 crc kubenswrapper[4707]: E0121 15:03:42.182448 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:43 crc kubenswrapper[4707]: I0121 15:03:43.182069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:43 crc kubenswrapper[4707]: I0121 15:03:43.182088 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:43 crc kubenswrapper[4707]: I0121 15:03:43.182121 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:43 crc kubenswrapper[4707]: E0121 15:03:43.183015 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:43 crc kubenswrapper[4707]: E0121 15:03:43.183100 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:43 crc kubenswrapper[4707]: E0121 15:03:43.183171 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:43 crc kubenswrapper[4707]: E0121 15:03:43.229957 4707 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 21 15:03:43 crc kubenswrapper[4707]: E0121 15:03:43.245756 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:03:44 crc kubenswrapper[4707]: I0121 15:03:44.182398 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:44 crc kubenswrapper[4707]: E0121 15:03:44.182517 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:45 crc kubenswrapper[4707]: I0121 15:03:45.182195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:45 crc kubenswrapper[4707]: I0121 15:03:45.182197 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:45 crc kubenswrapper[4707]: I0121 15:03:45.182411 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:45 crc kubenswrapper[4707]: E0121 15:03:45.182354 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:45 crc kubenswrapper[4707]: E0121 15:03:45.182453 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:45 crc kubenswrapper[4707]: E0121 15:03:45.182512 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:46 crc kubenswrapper[4707]: I0121 15:03:46.182124 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:46 crc kubenswrapper[4707]: E0121 15:03:46.182257 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:47 crc kubenswrapper[4707]: I0121 15:03:47.181890 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:47 crc kubenswrapper[4707]: I0121 15:03:47.181905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:47 crc kubenswrapper[4707]: E0121 15:03:47.182095 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:47 crc kubenswrapper[4707]: E0121 15:03:47.182000 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:47 crc kubenswrapper[4707]: I0121 15:03:47.181907 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:47 crc kubenswrapper[4707]: E0121 15:03:47.182170 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:48 crc kubenswrapper[4707]: I0121 15:03:48.182445 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:48 crc kubenswrapper[4707]: E0121 15:03:48.182543 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:48 crc kubenswrapper[4707]: E0121 15:03:48.246736 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Jan 21 15:03:49 crc kubenswrapper[4707]: I0121 15:03:49.182478 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:49 crc kubenswrapper[4707]: E0121 15:03:49.183027 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:49 crc kubenswrapper[4707]: I0121 15:03:49.182609 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:49 crc kubenswrapper[4707]: E0121 15:03:49.183236 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:49 crc kubenswrapper[4707]: I0121 15:03:49.182529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:49 crc kubenswrapper[4707]: E0121 15:03:49.183427 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.181739 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:50 crc kubenswrapper[4707]: E0121 15:03:50.181867 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.182312 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.631359 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/3.log" Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.633635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerStarted","Data":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.633995 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.652714 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podStartSLOduration=104.652698224 podStartE2EDuration="1m44.652698224s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:03:50.65224549 +0000 UTC m=+127.833761711" watchObservedRunningTime="2026-01-21 15:03:50.652698224 +0000 UTC m=+127.834214445" Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.789224 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-62mww"] Jan 21 15:03:50 crc kubenswrapper[4707]: I0121 15:03:50.789316 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:50 crc kubenswrapper[4707]: E0121 15:03:50.789392 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:51 crc kubenswrapper[4707]: I0121 15:03:51.182099 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:51 crc kubenswrapper[4707]: I0121 15:03:51.182129 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:51 crc kubenswrapper[4707]: I0121 15:03:51.182112 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:51 crc kubenswrapper[4707]: E0121 15:03:51.182209 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:51 crc kubenswrapper[4707]: E0121 15:03:51.182334 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:51 crc kubenswrapper[4707]: E0121 15:03:51.182373 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:52 crc kubenswrapper[4707]: I0121 15:03:52.182555 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:52 crc kubenswrapper[4707]: E0121 15:03:52.182680 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:53 crc kubenswrapper[4707]: I0121 15:03:53.181844 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:53 crc kubenswrapper[4707]: I0121 15:03:53.181869 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:53 crc kubenswrapper[4707]: I0121 15:03:53.181898 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:53 crc kubenswrapper[4707]: E0121 15:03:53.182717 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:53 crc kubenswrapper[4707]: E0121 15:03:53.182798 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:53 crc kubenswrapper[4707]: E0121 15:03:53.182983 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:53 crc kubenswrapper[4707]: I0121 15:03:53.183122 4707 scope.go:117] "RemoveContainer" containerID="99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177" Jan 21 15:03:53 crc kubenswrapper[4707]: E0121 15:03:53.247082 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:03:53 crc kubenswrapper[4707]: I0121 15:03:53.641953 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/1.log" Jan 21 15:03:53 crc kubenswrapper[4707]: I0121 15:03:53.642006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerStarted","Data":"d6408cce6d76083d530e42528e1345d3faf4b05c560998bf0cd690fbd03e6e5a"} Jan 21 15:03:54 crc kubenswrapper[4707]: I0121 15:03:54.182390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:54 crc kubenswrapper[4707]: E0121 15:03:54.182514 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:55 crc kubenswrapper[4707]: I0121 15:03:55.182619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:55 crc kubenswrapper[4707]: I0121 15:03:55.182724 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:55 crc kubenswrapper[4707]: E0121 15:03:55.182743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:55 crc kubenswrapper[4707]: I0121 15:03:55.182630 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:55 crc kubenswrapper[4707]: E0121 15:03:55.182883 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:55 crc kubenswrapper[4707]: E0121 15:03:55.182970 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:56 crc kubenswrapper[4707]: I0121 15:03:56.181969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:56 crc kubenswrapper[4707]: E0121 15:03:56.182086 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:57 crc kubenswrapper[4707]: I0121 15:03:57.181952 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:57 crc kubenswrapper[4707]: I0121 15:03:57.181991 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:57 crc kubenswrapper[4707]: E0121 15:03:57.182045 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:03:57 crc kubenswrapper[4707]: I0121 15:03:57.181967 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:57 crc kubenswrapper[4707]: E0121 15:03:57.182133 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:03:57 crc kubenswrapper[4707]: E0121 15:03:57.182186 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:03:58 crc kubenswrapper[4707]: I0121 15:03:58.181534 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:03:58 crc kubenswrapper[4707]: E0121 15:03:58.181652 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62mww" podUID="d5fb5fe4-8f42-4057-b731-b2c8da0661e3" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.182307 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.182372 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.182447 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.183892 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.184231 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.184622 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 15:03:59 crc kubenswrapper[4707]: I0121 15:03:59.184701 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 15:04:00 crc kubenswrapper[4707]: I0121 15:04:00.182566 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:04:00 crc kubenswrapper[4707]: I0121 15:04:00.184619 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 15:04:00 crc kubenswrapper[4707]: I0121 15:04:00.185623 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.499302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.523217 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9qt5"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.523610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.523853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v44w7"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.524297 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.525590 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.538671 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.538766 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.538844 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539088 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539235 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539569 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539939 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.539974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.540104 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.540104 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.540214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.540393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.540561 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.540686 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.541498 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.545481 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.547521 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.547690 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.547798 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.547881 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmpc8"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.548461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.548578 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.548884 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.549160 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.549641 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-62wx6"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.550067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.550524 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.550771 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.551845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-c6ckn"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.552186 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dmhxk"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.552427 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.552551 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.552637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.553536 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.553592 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.553971 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ggf7f"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.554300 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.554541 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.559507 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.559742 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.559987 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.560082 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.559771 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.560093 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.560395 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.560486 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.560736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.560862 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.562381 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.562543 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.562583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.562649 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.562717 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.562660 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.564886 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.565308 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-7954f5f757-6kmlc"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.565685 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566031 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566073 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566087 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566172 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566177 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566298 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566326 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566399 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566405 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566564 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566600 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566693 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566870 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9qt5"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.566973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.567014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.567054 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.567227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.567358 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-njgpj"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.567702 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.567830 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.568357 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.568655 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.569078 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.569408 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.569484 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.569846 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.569905 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.578820 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.579139 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.579558 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.581329 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.581479 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.581623 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.581915 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.581953 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.582111 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.582225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.582793 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.582825 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.583761 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.584161 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.584167 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.584880 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.585144 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.585207 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.586831 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587037 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587108 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587156 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587390 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587459 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587553 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587596 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587723 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587790 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.587968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.588222 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.588355 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.588377 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" 
Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.588491 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.588833 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.588930 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.589010 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.589030 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.589136 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.589162 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.589228 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.591273 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.591486 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.591533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.591735 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.592004 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.592701 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.593552 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.594777 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.600268 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.600373 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.602237 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.602733 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.603270 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q6l9f"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.603779 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604047 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51519fd7-68a2-4048-8d76-affda2188f00-trusted-ca\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-config\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-service-ca\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-ca\") pod \"etcd-operator-b45778765-njgpj\" (UID: 
\"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-client-ca\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf8n8\" (UniqueName: \"kubernetes.io/projected/6a1d8d5b-9f51-4661-9dda-d2e43a87eb85-kube-api-access-sf8n8\") pod \"cluster-samples-operator-665b6dd947-5jtpg\" (UID: \"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-oauth-config\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-config\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-audit-policies\") pod 
\"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5999c94d-18a3-46e1-8a10-9f31b1b6577a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-service-ca-bundle\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nx8s\" (UniqueName: \"kubernetes.io/projected/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-kube-api-access-5nx8s\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl55g\" (UniqueName: \"kubernetes.io/projected/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-kube-api-access-gl55g\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn6pq\" (UniqueName: \"kubernetes.io/projected/60f4a311-31b0-4a1e-b5a0-9209f96340c7-kube-api-access-xn6pq\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2clmf\" (UniqueName: \"kubernetes.io/projected/60cbb465-ee79-49db-8488-943c0b44334b-kube-api-access-2clmf\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-config\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfqrp\" (UniqueName: \"kubernetes.io/projected/51519fd7-68a2-4048-8d76-affda2188f00-kube-api-access-dfqrp\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5999c94d-18a3-46e1-8a10-9f31b1b6577a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bsr7\" (UniqueName: \"kubernetes.io/projected/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-kube-api-access-6bsr7\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 
15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f1d02b-fe59-40fe-a47d-59589afef318-serving-cert\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-auth-proxy-config\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f10ad167-f30a-4c37-8a38-fb877eeb567a-serving-cert\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbch\" (UniqueName: \"kubernetes.io/projected/39df9137-94c9-4052-bb18-458e1d29e081-kube-api-access-kxbch\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604908 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tclm\" (UniqueName: \"kubernetes.io/projected/3742167f-2077-4b3c-b385-2a108c04538c-kube-api-access-5tclm\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whknl\" (UniqueName: \"kubernetes.io/projected/22f1d02b-fe59-40fe-a47d-59589afef318-kube-api-access-whknl\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-oauth-serving-cert\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " 
pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-serving-cert\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6896438b-9d7e-4637-ac39-63c12dbf0e66-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.604990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-etcd-client\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpxzf\" (UniqueName: \"kubernetes.io/projected/825ae202-8fd1-4554-b267-3720efe4954e-kube-api-access-bpxzf\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39df9137-94c9-4052-bb18-458e1d29e081-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-client\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3742167f-2077-4b3c-b385-2a108c04538c-audit-dir\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhcrf\" (UniqueName: \"kubernetes.io/projected/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-kube-api-access-mhcrf\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/651b3241-1dc1-4356-8fe1-65eec3fcc47c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-machine-approver-tls\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxcm\" (UniqueName: \"kubernetes.io/projected/690ffd9a-e08c-4a06-9d78-990fd46651d8-kube-api-access-ddxcm\") pod \"downloads-7954f5f757-6kmlc\" (UID: \"690ffd9a-e08c-4a06-9d78-990fd46651d8\") " pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-config\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-etcd-client\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-config\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-client-ca\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4zt\" (UniqueName: \"kubernetes.io/projected/4a760eb2-4041-468b-a269-13e523796dd0-kube-api-access-sf4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-bjb5q\" (UID: \"4a760eb2-4041-468b-a269-13e523796dd0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-etcd-serving-ca\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-audit-dir\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651b3241-1dc1-4356-8fe1-65eec3fcc47c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-encryption-config\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-config\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a760eb2-4041-468b-a269-13e523796dd0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bjb5q\" (UID: \"4a760eb2-4041-468b-a269-13e523796dd0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-encryption-config\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51519fd7-68a2-4048-8d76-affda2188f00-config\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6896438b-9d7e-4637-ac39-63c12dbf0e66-images\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f4a311-31b0-4a1e-b5a0-9209f96340c7-serving-cert\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a1d8d5b-9f51-4661-9dda-d2e43a87eb85-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5jtpg\" (UID: \"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-image-import-ca\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-trusted-ca-bundle\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-config\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896438b-9d7e-4637-ac39-63c12dbf0e66-config\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-config\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 
15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39df9137-94c9-4052-bb18-458e1d29e081-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xggd6\" (UniqueName: \"kubernetes.io/projected/651b3241-1dc1-4356-8fe1-65eec3fcc47c-kube-api-access-xggd6\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-audit-policies\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d87c8f0a-f206-4813-968e-8f060d659834-audit-dir\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-serving-cert\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39df9137-94c9-4052-bb18-458e1d29e081-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vpp\" (UniqueName: \"kubernetes.io/projected/d87c8f0a-f206-4813-968e-8f060d659834-kube-api-access-c4vpp\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 
15:04:06.605685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f10ad167-f30a-4c37-8a38-fb877eeb567a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cbb465-ee79-49db-8488-943c0b44334b-serving-cert\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-serving-cert\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605727 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-service-ca\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-node-pullsecrets\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/825ae202-8fd1-4554-b267-3720efe4954e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/825ae202-8fd1-4554-b267-3720efe4954e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-audit\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 
15:04:06.605829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5999c94d-18a3-46e1-8a10-9f31b1b6577a-config\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tft84\" (UniqueName: \"kubernetes.io/projected/f10ad167-f30a-4c37-8a38-fb877eeb567a-kube-api-access-tft84\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51519fd7-68a2-4048-8d76-affda2188f00-serving-cert\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-serving-cert\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.605978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkwcs\" (UniqueName: \"kubernetes.io/projected/0ebc35b7-89c9-490c-9756-6d532a2411ed-kube-api-access-dkwcs\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.606005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mlc\" (UniqueName: \"kubernetes.io/projected/6896438b-9d7e-4637-ac39-63c12dbf0e66-kube-api-access-75mlc\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc 
kubenswrapper[4707]: I0121 15:04:06.606032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.607053 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.607865 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.608554 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.609254 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-k84wt"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.609501 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.609660 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.609770 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kjvk4"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.609822 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.609932 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.612513 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-knchs"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.612637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.613013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v44w7"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.613061 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.617100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-c6ckn"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.617857 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.620157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-62wx6"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.620964 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmpc8"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.624863 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.626729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.626951 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.627472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.627610 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.628051 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.629093 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.629581 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.629585 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.630263 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.630343 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.631324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6kmlc"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.632461 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ms2tj"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.632887 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.633451 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.634003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dnwth"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.634389 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.634538 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.635971 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h69c5"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.636635 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.636926 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.637281 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.637516 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgg5r"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.637861 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.638508 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z4txk"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.642402 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.645829 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ggf7f"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.646190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.647504 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.650831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.654182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q6l9f"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.656448 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.657518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.658325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.659106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.659936 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kjvk4"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.660968 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.661556 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.662377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.663171 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-knchs"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.664161 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.665280 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.665984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.666461 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-njgpj"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.667731 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.668665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dmhxk"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.669677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.670663 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ms2tj"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.672174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h69c5"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.674339 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.674893 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dnwth"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.676168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.676926 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z4txk"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.678025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.678469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgg5r"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.679369 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pk8cn"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.680017 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.680285 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zdb5r"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.680831 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.681266 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pk8cn"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.685040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-image-import-ca\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f4a311-31b0-4a1e-b5a0-9209f96340c7-serving-cert\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a1d8d5b-9f51-4661-9dda-d2e43a87eb85-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5jtpg\" (UID: \"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/029666b4-21ed-4caf-a28b-f78d1316de59-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-trusted-ca-bundle\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-config\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896438b-9d7e-4637-ac39-63c12dbf0e66-config\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xggd6\" (UniqueName: \"kubernetes.io/projected/651b3241-1dc1-4356-8fe1-65eec3fcc47c-kube-api-access-xggd6\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-audit-policies\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39df9137-94c9-4052-bb18-458e1d29e081-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cbb465-ee79-49db-8488-943c0b44334b-serving-cert\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-serving-cert\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vpp\" (UniqueName: \"kubernetes.io/projected/d87c8f0a-f206-4813-968e-8f060d659834-kube-api-access-c4vpp\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.706989 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-serving-cert\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39df9137-94c9-4052-bb18-458e1d29e081-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-service-ca\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/080b0540-7292-4bb0-b198-50101893d856-srv-cert\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-default-certificate\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-node-pullsecrets\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/825ae202-8fd1-4554-b267-3720efe4954e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5999c94d-18a3-46e1-8a10-9f31b1b6577a-config\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tft84\" (UniqueName: \"kubernetes.io/projected/f10ad167-f30a-4c37-8a38-fb877eeb567a-kube-api-access-tft84\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51519fd7-68a2-4048-8d76-affda2188f00-serving-cert\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkwcs\" (UniqueName: \"kubernetes.io/projected/0ebc35b7-89c9-490c-9756-6d532a2411ed-kube-api-access-dkwcs\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51519fd7-68a2-4048-8d76-affda2188f00-trusted-ca\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-config\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-service-ca\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-ca\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc 
kubenswrapper[4707]: I0121 15:04:06.707355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/029666b4-21ed-4caf-a28b-f78d1316de59-proxy-tls\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf8n8\" (UniqueName: \"kubernetes.io/projected/6a1d8d5b-9f51-4661-9dda-d2e43a87eb85-kube-api-access-sf8n8\") pod \"cluster-samples-operator-665b6dd947-5jtpg\" (UID: \"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-config\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-service-ca-bundle\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nx8s\" (UniqueName: \"kubernetes.io/projected/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-kube-api-access-5nx8s\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl55g\" (UniqueName: \"kubernetes.io/projected/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-kube-api-access-gl55g\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc 
kubenswrapper[4707]: I0121 15:04:06.707515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e82fcc54-393c-4981-a666-769ed5c9c92d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kjvk4\" (UID: \"e82fcc54-393c-4981-a666-769ed5c9c92d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn6pq\" (UniqueName: \"kubernetes.io/projected/60f4a311-31b0-4a1e-b5a0-9209f96340c7-kube-api-access-xn6pq\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2clmf\" (UniqueName: \"kubernetes.io/projected/60cbb465-ee79-49db-8488-943c0b44334b-kube-api-access-2clmf\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-config\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfqrp\" (UniqueName: \"kubernetes.io/projected/51519fd7-68a2-4048-8d76-affda2188f00-kube-api-access-dfqrp\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5999c94d-18a3-46e1-8a10-9f31b1b6577a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 
15:04:06.707670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bsr7\" (UniqueName: \"kubernetes.io/projected/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-kube-api-access-6bsr7\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-image-import-ca\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlhsk\" (UniqueName: \"kubernetes.io/projected/029666b4-21ed-4caf-a28b-f78d1316de59-kube-api-access-mlhsk\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whknl\" (UniqueName: \"kubernetes.io/projected/22f1d02b-fe59-40fe-a47d-59589afef318-kube-api-access-whknl\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-serving-cert\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6896438b-9d7e-4637-ac39-63c12dbf0e66-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-etcd-client\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c83727c-a8f9-4a68-a14d-e589059d313c-cert\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39df9137-94c9-4052-bb18-458e1d29e081-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-stats-auth\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.707988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-metrics-certs\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/651b3241-1dc1-4356-8fe1-65eec3fcc47c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-etcd-client\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-config\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-client-ca\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-key\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708156 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4zt\" (UniqueName: \"kubernetes.io/projected/4a760eb2-4041-468b-a269-13e523796dd0-kube-api-access-sf4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-bjb5q\" (UID: \"4a760eb2-4041-468b-a269-13e523796dd0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-etcd-serving-ca\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708203 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-audit-dir\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651b3241-1dc1-4356-8fe1-65eec3fcc47c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-encryption-config\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6896438b-9d7e-4637-ac39-63c12dbf0e66-images\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a760eb2-4041-468b-a269-13e523796dd0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bjb5q\" (UID: \"4a760eb2-4041-468b-a269-13e523796dd0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks4tw\" (UniqueName: \"kubernetes.io/projected/d43c6f9c-a13c-406e-90f3-152b22768174-kube-api-access-ks4tw\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d87c8f0a-f206-4813-968e-8f060d659834-audit-dir\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-config\") pod 
\"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f10ad167-f30a-4c37-8a38-fb877eeb567a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/825ae202-8fd1-4554-b267-3720efe4954e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-audit\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56397290-3efc-4453-8cf4-ceda50f383f6-service-ca-bundle\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-serving-cert\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75mlc\" (UniqueName: \"kubernetes.io/projected/6896438b-9d7e-4637-ac39-63c12dbf0e66-kube-api-access-75mlc\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/080b0540-7292-4bb0-b198-50101893d856-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708817 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rw5\" (UniqueName: \"kubernetes.io/projected/56397290-3efc-4453-8cf4-ceda50f383f6-kube-api-access-j6rw5\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nswvl\" (UniqueName: \"kubernetes.io/projected/8a14118b-5f31-4136-82d5-b2f597f6554c-kube-api-access-nswvl\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-client-ca\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-oauth-config\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc 
kubenswrapper[4707]: I0121 15:04:06.708960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-audit-policies\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5999c94d-18a3-46e1-8a10-9f31b1b6577a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.708995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tjnr\" (UniqueName: \"kubernetes.io/projected/1c83727c-a8f9-4a68-a14d-e589059d313c-kube-api-access-5tjnr\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-cabundle\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-auth-proxy-config\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f10ad167-f30a-4c37-8a38-fb877eeb567a-serving-cert\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbch\" (UniqueName: \"kubernetes.io/projected/39df9137-94c9-4052-bb18-458e1d29e081-kube-api-access-kxbch\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709155 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5tclm\" (UniqueName: \"kubernetes.io/projected/3742167f-2077-4b3c-b385-2a108c04538c-kube-api-access-5tclm\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f1d02b-fe59-40fe-a47d-59589afef318-serving-cert\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-oauth-serving-cert\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpxzf\" (UniqueName: \"kubernetes.io/projected/825ae202-8fd1-4554-b267-3720efe4954e-kube-api-access-bpxzf\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-client\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk686\" (UniqueName: \"kubernetes.io/projected/080b0540-7292-4bb0-b198-50101893d856-kube-api-access-vk686\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3742167f-2077-4b3c-b385-2a108c04538c-audit-dir\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhcrf\" (UniqueName: 
\"kubernetes.io/projected/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-kube-api-access-mhcrf\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-machine-approver-tls\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxcm\" (UniqueName: \"kubernetes.io/projected/690ffd9a-e08c-4a06-9d78-990fd46651d8-kube-api-access-ddxcm\") pod \"downloads-7954f5f757-6kmlc\" (UID: \"690ffd9a-e08c-4a06-9d78-990fd46651d8\") " pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-config\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-428c7\" (UniqueName: \"kubernetes.io/projected/e82fcc54-393c-4981-a666-769ed5c9c92d-kube-api-access-428c7\") pod \"multus-admission-controller-857f4d67dd-kjvk4\" (UID: \"e82fcc54-393c-4981-a666-769ed5c9c92d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-config\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-encryption-config\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51519fd7-68a2-4048-8d76-affda2188f00-config\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896438b-9d7e-4637-ac39-63c12dbf0e66-config\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-config\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.709938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51519fd7-68a2-4048-8d76-affda2188f00-trusted-ca\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.710629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-etcd-serving-ca\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.710674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-audit-dir\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.710684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-client-ca\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.710692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-config\") pod \"apiserver-76f77b778f-v44w7\" (UID: 
\"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.710758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d87c8f0a-f206-4813-968e-8f060d659834-audit-dir\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.711065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.711205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-config\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.711217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651b3241-1dc1-4356-8fe1-65eec3fcc47c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.711571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-service-ca\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.711593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-audit-policies\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.711908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f10ad167-f30a-4c37-8a38-fb877eeb567a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-audit\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39df9137-94c9-4052-bb18-458e1d29e081-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51519fd7-68a2-4048-8d76-affda2188f00-config\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cbb465-ee79-49db-8488-943c0b44334b-serving-cert\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-service-ca-bundle\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.712980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.713262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.713296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-trusted-ca-bundle\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " 
pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.713558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-config\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.713900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.714075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f1d02b-fe59-40fe-a47d-59589afef318-config\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.714447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-config\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.714557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-client-ca\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.714973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6896438b-9d7e-4637-ac39-63c12dbf0e66-images\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-encryption-config\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-node-pullsecrets\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-serving-cert\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f4a311-31b0-4a1e-b5a0-9209f96340c7-serving-cert\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3742167f-2077-4b3c-b385-2a108c04538c-audit-dir\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.715833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/651b3241-1dc1-4356-8fe1-65eec3fcc47c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-audit-policies\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a1d8d5b-9f51-4661-9dda-d2e43a87eb85-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5jtpg\" (UID: \"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-serving-cert\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-auth-proxy-config\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716832 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.716961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-ca\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.717114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-oauth-serving-cert\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.717185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.717262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87c8f0a-f206-4813-968e-8f060d659834-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.717658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.719184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-encryption-config\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.719486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51519fd7-68a2-4048-8d76-affda2188f00-serving-cert\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " 
pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.719843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-serving-cert\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.719909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-etcd-client\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.719950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39df9137-94c9-4052-bb18-458e1d29e081-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.720027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d87c8f0a-f206-4813-968e-8f060d659834-serving-cert\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.720184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-oauth-config\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.720261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f1d02b-fe59-40fe-a47d-59589afef318-serving-cert\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.720314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6896438b-9d7e-4637-ac39-63c12dbf0e66-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.720804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f10ad167-f30a-4c37-8a38-fb877eeb567a-serving-cert\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.720932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.721053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-etcd-client\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.721431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.721671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.722041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-machine-approver-tls\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.722372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.722924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.726014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.738175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-client\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.739563 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pgf7w"] Jan 21 15:04:06 crc 
kubenswrapper[4707]: I0121 15:04:06.745442 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.748944 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.750710 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pgf7w"] Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.756692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-etcd-service-ca\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.765475 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.784789 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.804654 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlhsk\" (UniqueName: \"kubernetes.io/projected/029666b4-21ed-4caf-a28b-f78d1316de59-kube-api-access-mlhsk\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c83727c-a8f9-4a68-a14d-e589059d313c-cert\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-stats-auth\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-metrics-certs\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-key\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810548 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks4tw\" (UniqueName: \"kubernetes.io/projected/d43c6f9c-a13c-406e-90f3-152b22768174-kube-api-access-ks4tw\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56397290-3efc-4453-8cf4-ceda50f383f6-service-ca-bundle\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/080b0540-7292-4bb0-b198-50101893d856-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rw5\" (UniqueName: \"kubernetes.io/projected/56397290-3efc-4453-8cf4-ceda50f383f6-kube-api-access-j6rw5\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nswvl\" (UniqueName: \"kubernetes.io/projected/8a14118b-5f31-4136-82d5-b2f597f6554c-kube-api-access-nswvl\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tjnr\" (UniqueName: \"kubernetes.io/projected/1c83727c-a8f9-4a68-a14d-e589059d313c-kube-api-access-5tjnr\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-cabundle\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk686\" (UniqueName: \"kubernetes.io/projected/080b0540-7292-4bb0-b198-50101893d856-kube-api-access-vk686\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-428c7\" (UniqueName: \"kubernetes.io/projected/e82fcc54-393c-4981-a666-769ed5c9c92d-kube-api-access-428c7\") pod 
\"multus-admission-controller-857f4d67dd-kjvk4\" (UID: \"e82fcc54-393c-4981-a666-769ed5c9c92d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/029666b4-21ed-4caf-a28b-f78d1316de59-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.810986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/080b0540-7292-4bb0-b198-50101893d856-srv-cert\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.811002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-default-certificate\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.811043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/029666b4-21ed-4caf-a28b-f78d1316de59-proxy-tls\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.811098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e82fcc54-393c-4981-a666-769ed5c9c92d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kjvk4\" (UID: \"e82fcc54-393c-4981-a666-769ed5c9c92d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.811099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-config\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 
15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.811910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/029666b4-21ed-4caf-a28b-f78d1316de59-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.824919 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.845282 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.848620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/825ae202-8fd1-4554-b267-3720efe4954e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.864937 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.884974 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.892734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/825ae202-8fd1-4554-b267-3720efe4954e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.905132 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.925725 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.930353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-config\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.945518 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.965845 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.985014 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 15:04:06 crc kubenswrapper[4707]: I0121 15:04:06.995691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.025533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.032949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.044986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.065609 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.072570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.086523 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.106038 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.118104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5999c94d-18a3-46e1-8a10-9f31b1b6577a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.125401 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.145896 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.156078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5999c94d-18a3-46e1-8a10-9f31b1b6577a-config\") pod 
\"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.165498 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.186560 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.199333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a760eb2-4041-468b-a269-13e523796dd0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bjb5q\" (UID: \"4a760eb2-4041-468b-a269-13e523796dd0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.205929 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.225374 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.245947 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.265150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.285658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.309552 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.325267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.346128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.365179 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.385043 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.405028 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.415101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-default-certificate\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:07 crc 
kubenswrapper[4707]: I0121 15:04:07.425515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.445410 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.452118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56397290-3efc-4453-8cf4-ceda50f383f6-service-ca-bundle\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.465604 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.484964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.493405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-stats-auth\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.505789 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.525865 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.533215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56397290-3efc-4453-8cf4-ceda50f383f6-metrics-certs\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.545676 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.566069 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.586459 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.593586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e82fcc54-393c-4981-a666-769ed5c9c92d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kjvk4\" (UID: \"e82fcc54-393c-4981-a666-769ed5c9c92d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.605199 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.624876 4707 request.go:700] Waited for 1.011700897s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-controller-dockercfg-c2lfx&limit=500&resourceVersion=0 Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.626187 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.645626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.654393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/029666b4-21ed-4caf-a28b-f78d1316de59-proxy-tls\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.665880 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.673621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/080b0540-7292-4bb0-b198-50101893d856-srv-cert\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.686264 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.706196 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.714087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/080b0540-7292-4bb0-b198-50101893d856-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.725127 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.745846 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.765604 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.785368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.805384 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811242 4707 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: 
failed to sync secret cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811415 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-key podName:8a14118b-5f31-4136-82d5-b2f597f6554c nodeName:}" failed. No retries permitted until 2026-01-21 15:04:08.311397156 +0000 UTC m=+145.492913378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-key") pod "service-ca-9c57cc56f-ms2tj" (UID: "8a14118b-5f31-4136-82d5-b2f597f6554c") : failed to sync secret cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811258 4707 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811589 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c83727c-a8f9-4a68-a14d-e589059d313c-cert podName:1c83727c-a8f9-4a68-a14d-e589059d313c nodeName:}" failed. No retries permitted until 2026-01-21 15:04:08.311580898 +0000 UTC m=+145.493097121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c83727c-a8f9-4a68-a14d-e589059d313c-cert") pod "ingress-canary-pk8cn" (UID: "1c83727c-a8f9-4a68-a14d-e589059d313c") : failed to sync secret cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811285 4707 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811295 4707 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811729 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics podName:d43c6f9c-a13c-406e-90f3-152b22768174 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:08.311711371 +0000 UTC m=+145.493227603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics") pod "marketplace-operator-79b997595-xgg5r" (UID: "d43c6f9c-a13c-406e-90f3-152b22768174") : failed to sync secret cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811331 4707 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811747 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-cabundle podName:8a14118b-5f31-4136-82d5-b2f597f6554c nodeName:}" failed. No retries permitted until 2026-01-21 15:04:08.311740345 +0000 UTC m=+145.493256578 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-cabundle") pod "service-ca-9c57cc56f-ms2tj" (UID: "8a14118b-5f31-4136-82d5-b2f597f6554c") : failed to sync configmap cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: E0121 15:04:07.811764 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca podName:d43c6f9c-a13c-406e-90f3-152b22768174 nodeName:}" failed. No retries permitted until 2026-01-21 15:04:08.311753951 +0000 UTC m=+145.493270173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca") pod "marketplace-operator-79b997595-xgg5r" (UID: "d43c6f9c-a13c-406e-90f3-152b22768174") : failed to sync configmap cache: timed out waiting for the condition Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.825595 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.845903 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.865119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.885202 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.906273 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.925662 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.945548 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.965903 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 15:04:07 crc kubenswrapper[4707]: I0121 15:04:07.985725 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.005007 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.025257 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.045167 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.065416 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.085138 4707 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.105240 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.126043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.145237 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.165889 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.185337 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.205435 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.226056 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.245744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.265410 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.290085 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.305783 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.325841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.328353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-cabundle\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.328418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.328440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.328559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c83727c-a8f9-4a68-a14d-e589059d313c-cert\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.328585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-key\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.329328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.329420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-cabundle\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.331017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8a14118b-5f31-4136-82d5-b2f597f6554c-signing-key\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.331185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.345439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.365488 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.385879 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.405601 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.411330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c83727c-a8f9-4a68-a14d-e589059d313c-cert\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:08 crc 
kubenswrapper[4707]: I0121 15:04:08.428998 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.445257 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.466065 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.485517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.538034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tft84\" (UniqueName: \"kubernetes.io/projected/f10ad167-f30a-4c37-8a38-fb877eeb567a-kube-api-access-tft84\") pod \"openshift-config-operator-7777fb866f-lpvrj\" (UID: \"f10ad167-f30a-4c37-8a38-fb877eeb567a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.557794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nx8s\" (UniqueName: \"kubernetes.io/projected/5fa05f5b-3668-4d8f-99e9-a0eb8e644b25-kube-api-access-5nx8s\") pod \"etcd-operator-b45778765-njgpj\" (UID: \"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.576673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkwcs\" (UniqueName: \"kubernetes.io/projected/0ebc35b7-89c9-490c-9756-6d532a2411ed-kube-api-access-dkwcs\") pod \"console-f9d7485db-c6ckn\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.597037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl55g\" (UniqueName: \"kubernetes.io/projected/c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e-kube-api-access-gl55g\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrqrq\" (UID: \"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.616501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxcm\" (UniqueName: \"kubernetes.io/projected/690ffd9a-e08c-4a06-9d78-990fd46651d8-kube-api-access-ddxcm\") pod \"downloads-7954f5f757-6kmlc\" (UID: \"690ffd9a-e08c-4a06-9d78-990fd46651d8\") " pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.636237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xggd6\" (UniqueName: \"kubernetes.io/projected/651b3241-1dc1-4356-8fe1-65eec3fcc47c-kube-api-access-xggd6\") pod \"openshift-apiserver-operator-796bbdcf4f-bl9mx\" (UID: \"651b3241-1dc1-4356-8fe1-65eec3fcc47c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.644204 4707 request.go:700] Waited for 1.933053512s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.656617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn6pq\" (UniqueName: \"kubernetes.io/projected/60f4a311-31b0-4a1e-b5a0-9209f96340c7-kube-api-access-xn6pq\") pod \"controller-manager-879f6c89f-x9qt5\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.677201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mlc\" (UniqueName: \"kubernetes.io/projected/6896438b-9d7e-4637-ac39-63c12dbf0e66-kube-api-access-75mlc\") pod \"machine-api-operator-5694c8668f-ggf7f\" (UID: \"6896438b-9d7e-4637-ac39-63c12dbf0e66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.696261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2clmf\" (UniqueName: \"kubernetes.io/projected/60cbb465-ee79-49db-8488-943c0b44334b-kube-api-access-2clmf\") pod \"route-controller-manager-6576b87f9c-t6stx\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.712126 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.717672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpxzf\" (UniqueName: \"kubernetes.io/projected/825ae202-8fd1-4554-b267-3720efe4954e-kube-api-access-bpxzf\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j8l7\" (UID: \"825ae202-8fd1-4554-b267-3720efe4954e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.737932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfqrp\" (UniqueName: \"kubernetes.io/projected/51519fd7-68a2-4048-8d76-affda2188f00-kube-api-access-dfqrp\") pod \"console-operator-58897d9998-62wx6\" (UID: \"51519fd7-68a2-4048-8d76-affda2188f00\") " pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.743726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.757071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7af8a399-b5f7-49ab-97e2-ba8c219e40eb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fnkpm\" (UID: \"7af8a399-b5f7-49ab-97e2-ba8c219e40eb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.759527 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.775647 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.778955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5999c94d-18a3-46e1-8a10-9f31b1b6577a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m6khq\" (UID: \"5999c94d-18a3-46e1-8a10-9f31b1b6577a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.784730 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.789590 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.797748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf8n8\" (UniqueName: \"kubernetes.io/projected/6a1d8d5b-9f51-4661-9dda-d2e43a87eb85-kube-api-access-sf8n8\") pod \"cluster-samples-operator-665b6dd947-5jtpg\" (UID: \"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.804305 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.810888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.817483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbch\" (UniqueName: \"kubernetes.io/projected/39df9137-94c9-4052-bb18-458e1d29e081-kube-api-access-kxbch\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.845651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whknl\" (UniqueName: \"kubernetes.io/projected/22f1d02b-fe59-40fe-a47d-59589afef318-kube-api-access-whknl\") pod \"authentication-operator-69f744f599-dmhxk\" (UID: \"22f1d02b-fe59-40fe-a47d-59589afef318\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.855024 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx"] Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.860119 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.869371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vpp\" (UniqueName: \"kubernetes.io/projected/d87c8f0a-f206-4813-968e-8f060d659834-kube-api-access-c4vpp\") pod \"apiserver-7bbb656c7d-4t9v9\" (UID: \"d87c8f0a-f206-4813-968e-8f060d659834\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.869596 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.881970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f5ea184-c2df-41ad-9038-9602b5dbc1a1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8xcdf\" (UID: \"6f5ea184-c2df-41ad-9038-9602b5dbc1a1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.892964 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-c6ckn"] Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.904271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhcrf\" (UniqueName: \"kubernetes.io/projected/2c1f606a-2541-4b8d-a4ce-62919f0a44b0-kube-api-access-mhcrf\") pod \"apiserver-76f77b778f-v44w7\" (UID: \"2c1f606a-2541-4b8d-a4ce-62919f0a44b0\") " pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.918165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bsr7\" (UniqueName: \"kubernetes.io/projected/1ab8b695-baf0-46bb-bf98-a13bc585f7ce-kube-api-access-6bsr7\") pod \"machine-approver-56656f9798-xjv7l\" (UID: \"1ab8b695-baf0-46bb-bf98-a13bc585f7ce\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.940778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4zt\" (UniqueName: \"kubernetes.io/projected/4a760eb2-4041-468b-a269-13e523796dd0-kube-api-access-sf4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-bjb5q\" (UID: \"4a760eb2-4041-468b-a269-13e523796dd0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.955787 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.959589 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tclm\" (UniqueName: \"kubernetes.io/projected/3742167f-2077-4b3c-b385-2a108c04538c-kube-api-access-5tclm\") pod \"oauth-openshift-558db77b4-bmpc8\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.960920 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.969211 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.977312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39df9137-94c9-4052-bb18-458e1d29e081-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-96l4g\" (UID: \"39df9137-94c9-4052-bb18-458e1d29e081\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.982117 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" Jan 21 15:04:08 crc kubenswrapper[4707]: I0121 15:04:08.987092 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.006233 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.006786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.021101 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.025872 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.027937 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.033716 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.042686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.042926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.042996 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 15:06:11.042978922 +0000 UTC m=+268.224495144 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.044043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.044874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6kmlc"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.055065 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.061754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlhsk\" (UniqueName: \"kubernetes.io/projected/029666b4-21ed-4caf-a28b-f78d1316de59-kube-api-access-mlhsk\") pod \"machine-config-controller-84d6567774-knchs\" (UID: \"029666b4-21ed-4caf-a28b-f78d1316de59\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.065144 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.080114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks4tw\" (UniqueName: \"kubernetes.io/projected/d43c6f9c-a13c-406e-90f3-152b22768174-kube-api-access-ks4tw\") pod \"marketplace-operator-79b997595-xgg5r\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.086675 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-njgpj"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.103617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rw5\" (UniqueName: \"kubernetes.io/projected/56397290-3efc-4453-8cf4-ceda50f383f6-kube-api-access-j6rw5\") pod \"router-default-5444994796-k84wt\" (UID: \"56397290-3efc-4453-8cf4-ceda50f383f6\") " pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.119645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nswvl\" (UniqueName: \"kubernetes.io/projected/8a14118b-5f31-4136-82d5-b2f597f6554c-kube-api-access-nswvl\") pod \"service-ca-9c57cc56f-ms2tj\" (UID: \"8a14118b-5f31-4136-82d5-b2f597f6554c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.140129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tjnr\" (UniqueName: \"kubernetes.io/projected/1c83727c-a8f9-4a68-a14d-e589059d313c-kube-api-access-5tjnr\") pod \"ingress-canary-pk8cn\" (UID: \"1c83727c-a8f9-4a68-a14d-e589059d313c\") " pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.143592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.143629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.143667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.150017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.150120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.157176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.160312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-428c7\" (UniqueName: \"kubernetes.io/projected/e82fcc54-393c-4981-a666-769ed5c9c92d-kube-api-access-428c7\") pod \"multus-admission-controller-857f4d67dd-kjvk4\" (UID: \"e82fcc54-393c-4981-a666-769ed5c9c92d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.163110 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.175702 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.195921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.195950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9qt5"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.195959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ggf7f"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.197927 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.199949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.200767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk686\" (UniqueName: \"kubernetes.io/projected/080b0540-7292-4bb0-b198-50101893d856-kube-api-access-vk686\") pod \"olm-operator-6b444d44fb-6dncd\" (UID: \"080b0540-7292-4bb0-b198-50101893d856\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.204084 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.207973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.214014 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:09 crc kubenswrapper[4707]: W0121 15:04:09.216736 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f4a311_31b0_4a1e_b5a0_9209f96340c7.slice/crio-89e64b120d455663a394261526f2a0d78502ad22520872423138b0f07abbfecc WatchSource:0}: Error finding container 89e64b120d455663a394261526f2a0d78502ad22520872423138b0f07abbfecc: Status 404 returned error can't find the container with id 89e64b120d455663a394261526f2a0d78502ad22520872423138b0f07abbfecc Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.233230 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.239001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/152ee428-bdb0-4de2-a795-af600b7669fd-metrics-tls\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a6b585e-78c7-440d-81da-7ea7a3961948-metrics-tls\") pod \"dns-operator-744455d44c-h69c5\" (UID: \"7a6b585e-78c7-440d-81da-7ea7a3961948\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb976779-fc90-40ac-a41e-d93b53181836-proxy-tls\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb976779-fc90-40ac-a41e-d93b53181836-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-tls\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: 
\"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8079cbea-8f34-4384-b064-d979305eb076-certs\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcm94\" (UniqueName: \"kubernetes.io/projected/8079cbea-8f34-4384-b064-d979305eb076-kube-api-access-gcm94\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-certificates\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkfq\" (UniqueName: \"kubernetes.io/projected/152ee428-bdb0-4de2-a795-af600b7669fd-kube-api-access-9zkfq\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5537fff0-7562-4e93-8db8-d13d18966e3d-secret-volume\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-trusted-ca\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-serving-cert\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee0078f-3c54-4136-88a7-59585fc4b5ee-config-volume\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc 
kubenswrapper[4707]: I0121 15:04:09.245603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-srv-cert\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznln\" (UniqueName: \"kubernetes.io/projected/7a6b585e-78c7-440d-81da-7ea7a3961948-kube-api-access-sznln\") pod \"dns-operator-744455d44c-h69c5\" (UID: \"7a6b585e-78c7-440d-81da-7ea7a3961948\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvwh\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-kube-api-access-ksvwh\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c4cb519-1970-4b6b-99ce-1e505723ceca-apiservice-cert\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245920 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e72b99f9-e554-43e0-ae76-ee94b2613a88-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2c4cb519-1970-4b6b-99ce-1e505723ceca-tmpfs\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtc6\" (UniqueName: \"kubernetes.io/projected/2c4cb519-1970-4b6b-99ce-1e505723ceca-kube-api-access-fhtc6\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.245994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5537fff0-7562-4e93-8db8-d13d18966e3d-config-volume\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.246363 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.246431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7jqd\" (UniqueName: \"kubernetes.io/projected/a13a87c3-f0a5-4fa9-892d-e2371300124a-kube-api-access-x7jqd\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.246617 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.746603828 +0000 UTC m=+146.928120049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.246724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e72b99f9-e554-43e0-ae76-ee94b2613a88-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.246758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-bound-sa-token\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.246782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgfh\" (UniqueName: \"kubernetes.io/projected/5ee0078f-3c54-4136-88a7-59585fc4b5ee-kube-api-access-wcgfh\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.246939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfj5\" (UniqueName: \"kubernetes.io/projected/cb976779-fc90-40ac-a41e-d93b53181836-kube-api-access-bqfj5\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d29f4db-3090-4118-88c3-878f8ddc4ab1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7pl4s\" (UID: \"0d29f4db-3090-4118-88c3-878f8ddc4ab1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh79n\" (UniqueName: \"kubernetes.io/projected/5537fff0-7562-4e93-8db8-d13d18966e3d-kube-api-access-gh79n\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrcc\" (UniqueName: \"kubernetes.io/projected/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-kube-api-access-gwrcc\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152ee428-bdb0-4de2-a795-af600b7669fd-trusted-ca\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-profile-collector-cert\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8079cbea-8f34-4384-b064-d979305eb076-node-bootstrap-token\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee0078f-3c54-4136-88a7-59585fc4b5ee-metrics-tls\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wklgh\" (UniqueName: \"kubernetes.io/projected/0cc1862f-ca4e-4e64-95df-61589500669a-kube-api-access-wklgh\") pod \"migrator-59844c95c7-t9whl\" (UID: \"0cc1862f-ca4e-4e64-95df-61589500669a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.247587 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvvm\" (UniqueName: \"kubernetes.io/projected/0d29f4db-3090-4118-88c3-878f8ddc4ab1-kube-api-access-ggvvm\") pod \"package-server-manager-789f6589d5-7pl4s\" (UID: \"0d29f4db-3090-4118-88c3-878f8ddc4ab1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.248419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/152ee428-bdb0-4de2-a795-af600b7669fd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.248562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-config\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.248605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c4cb519-1970-4b6b-99ce-1e505723ceca-webhook-cert\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.248685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cb976779-fc90-40ac-a41e-d93b53181836-images\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: W0121 15:04:09.250767 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56397290_3efc_4453_8cf4_ceda50f383f6.slice/crio-a6e28fdd50096994f58f9cae5951e4800e4132c7389c5c628dd8eeffb4d60b22 WatchSource:0}: Error finding container a6e28fdd50096994f58f9cae5951e4800e4132c7389c5c628dd8eeffb4d60b22: Status 404 returned error can't find the container with id a6e28fdd50096994f58f9cae5951e4800e4132c7389c5c628dd8eeffb4d60b22 Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.275561 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.287260 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pk8cn" Jan 21 15:04:09 crc kubenswrapper[4707]: W0121 15:04:09.296538 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825ae202_8fd1_4554_b267_3720efe4954e.slice/crio-c666ba7d04bdea6710e3cd0c3b609c0de32ab89e561af64fc3b52576e57f7c2b WatchSource:0}: Error finding container c666ba7d04bdea6710e3cd0c3b609c0de32ab89e561af64fc3b52576e57f7c2b: Status 404 returned error can't find the container with id c666ba7d04bdea6710e3cd0c3b609c0de32ab89e561af64fc3b52576e57f7c2b Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.343081 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.346856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.349979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350156 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrcc\" (UniqueName: \"kubernetes.io/projected/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-kube-api-access-gwrcc\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-profile-collector-cert\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8079cbea-8f34-4384-b064-d979305eb076-node-bootstrap-token\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152ee428-bdb0-4de2-a795-af600b7669fd-trusted-ca\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.350277 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.850255826 +0000 UTC m=+147.031772048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee0078f-3c54-4136-88a7-59585fc4b5ee-metrics-tls\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvvm\" (UniqueName: \"kubernetes.io/projected/0d29f4db-3090-4118-88c3-878f8ddc4ab1-kube-api-access-ggvvm\") pod \"package-server-manager-789f6589d5-7pl4s\" (UID: \"0d29f4db-3090-4118-88c3-878f8ddc4ab1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wklgh\" (UniqueName: \"kubernetes.io/projected/0cc1862f-ca4e-4e64-95df-61589500669a-kube-api-access-wklgh\") pod \"migrator-59844c95c7-t9whl\" (UID: \"0cc1862f-ca4e-4e64-95df-61589500669a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-plugins-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2sb2\" (UniqueName: \"kubernetes.io/projected/6b4276ea-aeb9-4265-bff9-c7c339da152d-kube-api-access-z2sb2\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/152ee428-bdb0-4de2-a795-af600b7669fd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-registration-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-config\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c4cb519-1970-4b6b-99ce-1e505723ceca-webhook-cert\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-socket-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cb976779-fc90-40ac-a41e-d93b53181836-images\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/152ee428-bdb0-4de2-a795-af600b7669fd-metrics-tls\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a6b585e-78c7-440d-81da-7ea7a3961948-metrics-tls\") pod \"dns-operator-744455d44c-h69c5\" (UID: \"7a6b585e-78c7-440d-81da-7ea7a3961948\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-mountpoint-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb976779-fc90-40ac-a41e-d93b53181836-proxy-tls\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb976779-fc90-40ac-a41e-d93b53181836-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-tls\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8079cbea-8f34-4384-b064-d979305eb076-certs\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcm94\" (UniqueName: \"kubernetes.io/projected/8079cbea-8f34-4384-b064-d979305eb076-kube-api-access-gcm94\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkfq\" (UniqueName: \"kubernetes.io/projected/152ee428-bdb0-4de2-a795-af600b7669fd-kube-api-access-9zkfq\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-certificates\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5537fff0-7562-4e93-8db8-d13d18966e3d-secret-volume\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-trusted-ca\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.350989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-serving-cert\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-srv-cert\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee0078f-3c54-4136-88a7-59585fc4b5ee-config-volume\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sznln\" (UniqueName: \"kubernetes.io/projected/7a6b585e-78c7-440d-81da-7ea7a3961948-kube-api-access-sznln\") pod \"dns-operator-744455d44c-h69c5\" (UID: \"7a6b585e-78c7-440d-81da-7ea7a3961948\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksvwh\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-kube-api-access-ksvwh\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c4cb519-1970-4b6b-99ce-1e505723ceca-apiservice-cert\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e72b99f9-e554-43e0-ae76-ee94b2613a88-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2c4cb519-1970-4b6b-99ce-1e505723ceca-tmpfs\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtc6\" (UniqueName: \"kubernetes.io/projected/2c4cb519-1970-4b6b-99ce-1e505723ceca-kube-api-access-fhtc6\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5537fff0-7562-4e93-8db8-d13d18966e3d-config-volume\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" 
Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jqd\" (UniqueName: \"kubernetes.io/projected/a13a87c3-f0a5-4fa9-892d-e2371300124a-kube-api-access-x7jqd\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152ee428-bdb0-4de2-a795-af600b7669fd-trusted-ca\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e72b99f9-e554-43e0-ae76-ee94b2613a88-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-bound-sa-token\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgfh\" (UniqueName: \"kubernetes.io/projected/5ee0078f-3c54-4136-88a7-59585fc4b5ee-kube-api-access-wcgfh\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-csi-data-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfj5\" (UniqueName: \"kubernetes.io/projected/cb976779-fc90-40ac-a41e-d93b53181836-kube-api-access-bqfj5\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh79n\" 
(UniqueName: \"kubernetes.io/projected/5537fff0-7562-4e93-8db8-d13d18966e3d-kube-api-access-gh79n\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.351477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d29f4db-3090-4118-88c3-878f8ddc4ab1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7pl4s\" (UID: \"0d29f4db-3090-4118-88c3-878f8ddc4ab1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.354597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e72b99f9-e554-43e0-ae76-ee94b2613a88-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.354623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee0078f-3c54-4136-88a7-59585fc4b5ee-metrics-tls\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.356089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cb976779-fc90-40ac-a41e-d93b53181836-images\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.357704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-certificates\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.359846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c4cb519-1970-4b6b-99ce-1e505723ceca-webhook-cert\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.360476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d29f4db-3090-4118-88c3-878f8ddc4ab1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7pl4s\" (UID: \"0d29f4db-3090-4118-88c3-878f8ddc4ab1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.360869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-config\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.361516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb976779-fc90-40ac-a41e-d93b53181836-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.362668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-profile-collector-cert\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.362964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a6b585e-78c7-440d-81da-7ea7a3961948-metrics-tls\") pod \"dns-operator-744455d44c-h69c5\" (UID: \"7a6b585e-78c7-440d-81da-7ea7a3961948\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.363510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-trusted-ca\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.364168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/152ee428-bdb0-4de2-a795-af600b7669fd-metrics-tls\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.364397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee0078f-3c54-4136-88a7-59585fc4b5ee-config-volume\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.364792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb976779-fc90-40ac-a41e-d93b53181836-proxy-tls\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.365619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5537fff0-7562-4e93-8db8-d13d18966e3d-secret-volume\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.365917 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 15:04:09.865901371 +0000 UTC m=+147.047417593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.365960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e72b99f9-e554-43e0-ae76-ee94b2613a88-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.366259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5537fff0-7562-4e93-8db8-d13d18966e3d-config-volume\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.366405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2c4cb519-1970-4b6b-99ce-1e505723ceca-tmpfs\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.367062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-tls\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.367423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c4cb519-1970-4b6b-99ce-1e505723ceca-apiservice-cert\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.367573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8079cbea-8f34-4384-b064-d979305eb076-certs\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.369699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-srv-cert\") pod \"catalog-operator-68c6474976-xxr5n\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.371280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-serving-cert\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.374309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8079cbea-8f34-4384-b064-d979305eb076-node-bootstrap-token\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.393746 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.399089 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.403801 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.406205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrcc\" (UniqueName: \"kubernetes.io/projected/130b5f6d-939d-4b35-b0a4-5fc39dc8b077-kube-api-access-gwrcc\") pod \"service-ca-operator-777779d784-dnwth\" (UID: \"130b5f6d-939d-4b35-b0a4-5fc39dc8b077\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.428826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcm94\" (UniqueName: \"kubernetes.io/projected/8079cbea-8f34-4384-b064-d979305eb076-kube-api-access-gcm94\") pod \"machine-config-server-zdb5r\" (UID: \"8079cbea-8f34-4384-b064-d979305eb076\") " pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.452640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkfq\" (UniqueName: \"kubernetes.io/projected/152ee428-bdb0-4de2-a795-af600b7669fd-kube-api-access-9zkfq\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.456019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v44w7"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.456909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.457104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.457308 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.957284833 +0000 UTC m=+147.138801055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.457411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-csi-data-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.460288 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:09.960261596 +0000 UTC m=+147.141777819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-csi-data-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-plugins-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2sb2\" (UniqueName: \"kubernetes.io/projected/6b4276ea-aeb9-4265-bff9-c7c339da152d-kube-api-access-z2sb2\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-registration-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-socket-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-mountpoint-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-plugins-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-mountpoint-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-registration-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.460749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b4276ea-aeb9-4265-bff9-c7c339da152d-socket-dir\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.463538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvvm\" (UniqueName: \"kubernetes.io/projected/0d29f4db-3090-4118-88c3-878f8ddc4ab1-kube-api-access-ggvvm\") pod \"package-server-manager-789f6589d5-7pl4s\" (UID: \"0d29f4db-3090-4118-88c3-878f8ddc4ab1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.484153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wklgh\" (UniqueName: \"kubernetes.io/projected/0cc1862f-ca4e-4e64-95df-61589500669a-kube-api-access-wklgh\") pod \"migrator-59844c95c7-t9whl\" (UID: \"0cc1862f-ca4e-4e64-95df-61589500669a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.514907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksvwh\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-kube-api-access-ksvwh\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.521749 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.547078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/152ee428-bdb0-4de2-a795-af600b7669fd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qlnlt\" (UID: \"152ee428-bdb0-4de2-a795-af600b7669fd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.547505 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.553577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-bound-sa-token\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.558533 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.561648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.562404 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.062384829 +0000 UTC m=+147.243901050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.562478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.562921 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.062908243 +0000 UTC m=+147.244424465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.563877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgfh\" (UniqueName: \"kubernetes.io/projected/5ee0078f-3c54-4136-88a7-59585fc4b5ee-kube-api-access-wcgfh\") pod \"dns-default-z4txk\" (UID: \"5ee0078f-3c54-4136-88a7-59585fc4b5ee\") " pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.566561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.572123 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.572847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmpc8"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.577349 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-62wx6"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.581714 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.596648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfj5\" (UniqueName: \"kubernetes.io/projected/cb976779-fc90-40ac-a41e-d93b53181836-kube-api-access-bqfj5\") pod \"machine-config-operator-74547568cd-vk7ss\" (UID: \"cb976779-fc90-40ac-a41e-d93b53181836\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.597638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.601664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtc6\" (UniqueName: \"kubernetes.io/projected/2c4cb519-1970-4b6b-99ce-1e505723ceca-kube-api-access-fhtc6\") pod \"packageserver-d55dfcdfc-f7d2n\" (UID: \"2c4cb519-1970-4b6b-99ce-1e505723ceca\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.620175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznln\" (UniqueName: \"kubernetes.io/projected/7a6b585e-78c7-440d-81da-7ea7a3961948-kube-api-access-sznln\") pod \"dns-operator-744455d44c-h69c5\" (UID: \"7a6b585e-78c7-440d-81da-7ea7a3961948\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.645506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jqd\" (UniqueName: \"kubernetes.io/projected/a13a87c3-f0a5-4fa9-892d-e2371300124a-kube-api-access-x7jqd\") pod \"catalog-operator-68c6474976-xxr5n\" 
(UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.671099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.673638 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.173607756 +0000 UTC m=+147.355123978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.674267 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.676530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.676897 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.176880509 +0000 UTC m=+147.358396731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.681515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh79n\" (UniqueName: \"kubernetes.io/projected/5537fff0-7562-4e93-8db8-d13d18966e3d-kube-api-access-gh79n\") pod \"collect-profiles-29483460-rgbtm\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.683869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.687057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dmhxk"] Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.698206 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zdb5r" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.700593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c6ckn" event={"ID":"0ebc35b7-89c9-490c-9756-6d532a2411ed","Type":"ContainerStarted","Data":"f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.700622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c6ckn" event={"ID":"0ebc35b7-89c9-490c-9756-6d532a2411ed","Type":"ContainerStarted","Data":"10aa7ff284582adc14afab6e0dad6d9f6c329e2b4d1a261377e7daa4bfc59510"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.703714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2sb2\" (UniqueName: \"kubernetes.io/projected/6b4276ea-aeb9-4265-bff9-c7c339da152d-kube-api-access-z2sb2\") pod \"csi-hostpathplugin-pgf7w\" (UID: \"6b4276ea-aeb9-4265-bff9-c7c339da152d\") " pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.709674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6kmlc" event={"ID":"690ffd9a-e08c-4a06-9d78-990fd46651d8","Type":"ContainerStarted","Data":"f0d73a2b6eb6fe314399b9e1a1ffccc8820dfb42938995c9d6e91da466c451bd"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.709733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6kmlc" event={"ID":"690ffd9a-e08c-4a06-9d78-990fd46651d8","Type":"ContainerStarted","Data":"70c51a9a54a1b24619c0f1eff7c0ab89cf079af64e9e62c89706f334567d3e71"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.719673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" event={"ID":"651b3241-1dc1-4356-8fe1-65eec3fcc47c","Type":"ContainerStarted","Data":"f4de6e487f08e8056dfd093816d1703276b1d627639ef2ef16db52d71bd48bd2"} Jan 21 15:04:09 crc 
kubenswrapper[4707]: I0121 15:04:09.719709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" event={"ID":"651b3241-1dc1-4356-8fe1-65eec3fcc47c","Type":"ContainerStarted","Data":"6a9667834c313bd30aaf58965b38e1c5c0549ea95e856131ab3012d1737a65d2"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.723770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k84wt" event={"ID":"56397290-3efc-4453-8cf4-ceda50f383f6","Type":"ContainerStarted","Data":"d2d923fd8b41aab45cc1937180f2d8f36e5f5e8b139b05779d7069786f80c439"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.723795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k84wt" event={"ID":"56397290-3efc-4453-8cf4-ceda50f383f6","Type":"ContainerStarted","Data":"a6e28fdd50096994f58f9cae5951e4800e4132c7389c5c628dd8eeffb4d60b22"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.725674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" event={"ID":"60f4a311-31b0-4a1e-b5a0-9209f96340c7","Type":"ContainerStarted","Data":"8c7602952d14253b6519f06ca054bc1e5c86a7f8ff7f27134348c79d254defd4"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.725709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" event={"ID":"60f4a311-31b0-4a1e-b5a0-9209f96340c7","Type":"ContainerStarted","Data":"89e64b120d455663a394261526f2a0d78502ad22520872423138b0f07abbfecc"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.726027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.728431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" event={"ID":"825ae202-8fd1-4554-b267-3720efe4954e","Type":"ContainerStarted","Data":"3bd7deeb5060b91a657a2bc8fb05060f8281c8d7264aac3c7a6d197d2a69112a"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.728467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" event={"ID":"825ae202-8fd1-4554-b267-3720efe4954e","Type":"ContainerStarted","Data":"c666ba7d04bdea6710e3cd0c3b609c0de32ab89e561af64fc3b52576e57f7c2b"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.730654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" event={"ID":"1ab8b695-baf0-46bb-bf98-a13bc585f7ce","Type":"ContainerStarted","Data":"e680a25a0f4149385b119b78198cd69ff3a8f5faef399f433c8e8d8b28336d2f"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.730682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" event={"ID":"1ab8b695-baf0-46bb-bf98-a13bc585f7ce","Type":"ContainerStarted","Data":"be46f2cac39eb91f6e277d93007f032da4376ad6a9cfa338e4c9aeaf72a2c251"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.731788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" 
event={"ID":"7af8a399-b5f7-49ab-97e2-ba8c219e40eb","Type":"ContainerStarted","Data":"6b8452dec60175e0ff4fca4cb627becadff2d92f4eec9f1f3732dbccb5175437"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.732152 4707 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x9qt5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.732199 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" podUID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.735543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" event={"ID":"2c1f606a-2541-4b8d-a4ce-62919f0a44b0","Type":"ContainerStarted","Data":"cb440bb329327d706f1e8662b89a52f28f992f30dbbd0e3e899d9b4dcbd1430e"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.738388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" event={"ID":"5999c94d-18a3-46e1-8a10-9f31b1b6577a","Type":"ContainerStarted","Data":"39ed1da39b65fc0929be06c5f29b9d9c2b7eb50f35c474066f3d8ff414046b48"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.739744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" event={"ID":"6896438b-9d7e-4637-ac39-63c12dbf0e66","Type":"ContainerStarted","Data":"da0ccaf9a6425e8e6b95df5c00217708e279b9eea336c04ddbb278e32752df92"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.739774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" event={"ID":"6896438b-9d7e-4637-ac39-63c12dbf0e66","Type":"ContainerStarted","Data":"0c2f8647ec8f77fce45e17b6e1c5a60f1769b566d08892b00c7c1d748bf6f70d"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.744242 4707 generic.go:334] "Generic (PLEG): container finished" podID="f10ad167-f30a-4c37-8a38-fb877eeb567a" containerID="af8faa2c72a66ce73990e623a9d7828840fa3ddbfe98534722f79e211e317a7f" exitCode=0 Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.744612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" event={"ID":"f10ad167-f30a-4c37-8a38-fb877eeb567a","Type":"ContainerDied","Data":"af8faa2c72a66ce73990e623a9d7828840fa3ddbfe98534722f79e211e317a7f"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.744891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" event={"ID":"f10ad167-f30a-4c37-8a38-fb877eeb567a","Type":"ContainerStarted","Data":"0e839b90e598c4a16a0b04391973c04856771b63b21a00ac8367ecebd1556007"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.781286 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.781736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.783008 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.282987499 +0000 UTC m=+147.464503721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.788931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.789497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" event={"ID":"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e","Type":"ContainerStarted","Data":"66e3237b7fd366521e464e3ef770db2206413d30354207662a382ae50af64ade"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.789560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" event={"ID":"c4f644ce-33ec-4cd1-9f94-cc8e1c16bf2e","Type":"ContainerStarted","Data":"438d9aa40d86fef27baaa4739d6608a47bbba2b20f7a42681441c93d9cabf680"} Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.789997 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.289977166 +0000 UTC m=+147.471493388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.796729 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.799382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" event={"ID":"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25","Type":"ContainerStarted","Data":"c89efc5d3223485469ba090aab6d1b0fb65d9173ac70583dd196c9d0c30d0c26"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.799417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" event={"ID":"5fa05f5b-3668-4d8f-99e9-a0eb8e644b25","Type":"ContainerStarted","Data":"f4237782c04363de0e69fba00a45fc53e6c0807b53ebd710963febf018c3223a"} Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.827093 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.833969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.864336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.870937 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:09 crc kubenswrapper[4707]: W0121 15:04:09.874418 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd87c8f0a_f206_4813_968e_8f060d659834.slice/crio-15e487103999136e0cd431fdb2221759cf6bb2890e31095fd8df00ee6c8f93dd WatchSource:0}: Error finding container 15e487103999136e0cd431fdb2221759cf6bb2890e31095fd8df00ee6c8f93dd: Status 404 returned error can't find the container with id 15e487103999136e0cd431fdb2221759cf6bb2890e31095fd8df00ee6c8f93dd Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.889971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.890067 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.390053225 +0000 UTC m=+147.571569447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.890463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.890671 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.3906625 +0000 UTC m=+147.572178721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.912616 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.945725 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.945771 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.992952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:09 crc kubenswrapper[4707]: E0121 15:04:09.993342 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.493328002 +0000 UTC m=+147.674844223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:09 crc kubenswrapper[4707]: I0121 15:04:09.995758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kjvk4"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.002995 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.016940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-knchs"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.018745 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ms2tj"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.094292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.094682 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.59467064 +0000 UTC m=+147.776186861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.151011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pk8cn"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.195067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.195318 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.695279721 +0000 UTC m=+147.876795944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.195484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.195720 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.695710033 +0000 UTC m=+147.877226255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.195984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgg5r"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.199709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.203964 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:10 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:10 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:10 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.203998 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.275553 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrqrq" podStartSLOduration=124.275540221 podStartE2EDuration="2m4.275540221s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:10.273430803 +0000 UTC m=+147.454947024" watchObservedRunningTime="2026-01-21 15:04:10.275540221 +0000 UTC m=+147.457056443" 
Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.296371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.296758 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.796744567 +0000 UTC m=+147.978260790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.396461 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" podStartSLOduration=124.396447342 podStartE2EDuration="2m4.396447342s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:10.394571017 +0000 UTC m=+147.576087240" watchObservedRunningTime="2026-01-21 15:04:10.396447342 +0000 UTC m=+147.577963564" Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.397774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.398157 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:10.898146055 +0000 UTC m=+148.079662278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.445405 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6kmlc" podStartSLOduration=124.445383519 podStartE2EDuration="2m4.445383519s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:10.434598596 +0000 UTC m=+147.616114818" watchObservedRunningTime="2026-01-21 15:04:10.445383519 +0000 UTC m=+147.626899741" Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.461667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dnwth"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.502140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.506165 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.006134098 +0000 UTC m=+148.187650309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.511835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.512130 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.012116027 +0000 UTC m=+148.193632249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.555908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z4txk"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.555963 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.585155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.589570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pgf7w"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.600756 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.613057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.613409 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.113395478 +0000 UTC m=+148.294911700 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: W0121 15:04:10.637948 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee0078f_3c54_4136_88a7_59585fc4b5ee.slice/crio-9b3848a7dcc26921fcd791cee1360d95a0912c08a57ec6b31bac6445acdb6f18 WatchSource:0}: Error finding container 9b3848a7dcc26921fcd791cee1360d95a0912c08a57ec6b31bac6445acdb6f18: Status 404 returned error can't find the container with id 9b3848a7dcc26921fcd791cee1360d95a0912c08a57ec6b31bac6445acdb6f18 Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.710243 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.718455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.718696 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.218685637 +0000 UTC m=+148.400201860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.744631 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.806940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4bbb0d7351bec985f1ab4bf1449efa9a4c5466d22f1817c481e82da04407989c"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.809562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" event={"ID":"8a14118b-5f31-4136-82d5-b2f597f6554c","Type":"ContainerStarted","Data":"15abaa457295590c3719b7fe7e001326e298ddba6d6b564fd55e96750be8ca41"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.819538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.819802 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.319789119 +0000 UTC m=+148.501305341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.831024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" event={"ID":"029666b4-21ed-4caf-a28b-f78d1316de59","Type":"ContainerStarted","Data":"5595638d15d0cc7998338cc1d48ba0354ead6d18cae363a545f6cf0647413217"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.840462 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-k84wt" podStartSLOduration=124.840448732 podStartE2EDuration="2m4.840448732s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:10.8401633 +0000 UTC m=+148.021679522" watchObservedRunningTime="2026-01-21 15:04:10.840448732 +0000 UTC m=+148.021964953" Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.848313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.859020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" event={"ID":"d87c8f0a-f206-4813-968e-8f060d659834","Type":"ContainerStarted","Data":"15e487103999136e0cd431fdb2221759cf6bb2890e31095fd8df00ee6c8f93dd"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.872988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" event={"ID":"4a760eb2-4041-468b-a269-13e523796dd0","Type":"ContainerStarted","Data":"4f77454c235590ae864003751178720785ec24d932f56bc999c7f8522b3c9142"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.898137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" event={"ID":"6896438b-9d7e-4637-ac39-63c12dbf0e66","Type":"ContainerStarted","Data":"90f7fb06cd5302ac4707097938c69623c0bdf3cbd1c2603e49f6fdfb7c3f9fdd"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.909747 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h69c5"] Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.909784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pk8cn" event={"ID":"1c83727c-a8f9-4a68-a14d-e589059d313c","Type":"ContainerStarted","Data":"ef52734318f147128fe73444f7473601d734d578041df26992d36fb155507aee"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.920684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 
15:04:10 crc kubenswrapper[4707]: E0121 15:04:10.921109 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.421099738 +0000 UTC m=+148.602615960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.950541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" event={"ID":"6f5ea184-c2df-41ad-9038-9602b5dbc1a1","Type":"ContainerStarted","Data":"8362b489be998ed3de295d5860cd5204bc327c611c7e0077da9031ecd5d34c72"} Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.950583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" event={"ID":"6f5ea184-c2df-41ad-9038-9602b5dbc1a1","Type":"ContainerStarted","Data":"f8ba5fa335c5b4e6a1ff704a3ee4be7c2cf502f8ecfad571d0475c1997d9bf97"} Jan 21 15:04:10 crc kubenswrapper[4707]: W0121 15:04:10.990063 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6b585e_78c7_440d_81da_7ea7a3961948.slice/crio-efb0419c7a27ecc6dfcb2a19b2c1a6ed76db88d6303896c0dd1096af742b5556 WatchSource:0}: Error finding container efb0419c7a27ecc6dfcb2a19b2c1a6ed76db88d6303896c0dd1096af742b5556: Status 404 returned error can't find the container with id efb0419c7a27ecc6dfcb2a19b2c1a6ed76db88d6303896c0dd1096af742b5556 Jan 21 15:04:10 crc kubenswrapper[4707]: I0121 15:04:10.991088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c00c7a2fe40aa7f3ea605b831313f7c72ea04fa9acb714f4626d7187512149a8"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.029210 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.030106 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.530092693 +0000 UTC m=+148.711608915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.039445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" event={"ID":"d43c6f9c-a13c-406e-90f3-152b22768174","Type":"ContainerStarted","Data":"e2e6f46621b20839c82610db073d9e05dd372b17c126ef00da5f854c88773281"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.071712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" event={"ID":"080b0540-7292-4bb0-b198-50101893d856","Type":"ContainerStarted","Data":"5f76366e60c2a8de1d935a9c4c806cbe8331904d19f2148d770086a0d4481030"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.071757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" event={"ID":"080b0540-7292-4bb0-b198-50101893d856","Type":"ContainerStarted","Data":"aead47ac1a502671620b5220428258cca31dfef8a9e80b8ca30c6ef00ac5de83"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.072518 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.075393 4707 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6dncd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.075430 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" podUID="080b0540-7292-4bb0-b198-50101893d856" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.079124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" event={"ID":"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85","Type":"ContainerStarted","Data":"957fd87567313b4eac0617563d6eb18ffafae5a8f6055f54bfa2d0cd39770faf"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.085038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" event={"ID":"a13a87c3-f0a5-4fa9-892d-e2371300124a","Type":"ContainerStarted","Data":"a280de3748429292af9bd8cdd971f3edfea4a3a1463c47e1c74e1c31a0af19ec"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.102378 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j8l7" podStartSLOduration=125.10236144 podStartE2EDuration="2m5.10236144s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.101033006 +0000 UTC m=+148.282549228" watchObservedRunningTime="2026-01-21 15:04:11.10236144 +0000 UTC m=+148.283877662" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.131688 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.132614 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.632600149 +0000 UTC m=+148.814116371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.143074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" event={"ID":"22f1d02b-fe59-40fe-a47d-59589afef318","Type":"ContainerStarted","Data":"310aa2b02c666b3776e53c46e80d9f2f0be6b1b65b701e1ec421ec46085510eb"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.143115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" event={"ID":"22f1d02b-fe59-40fe-a47d-59589afef318","Type":"ContainerStarted","Data":"94ee7e1fb9b999f57749eb96e83b33ff9d76f00b359faacf96ec171d30162d31"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.173221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ce50a75cc4f7c1b5a817de2d4686879b976f88326436756693ac4f48bae9a204"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.217604 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:11 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:11 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:11 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.217653 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.235649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.236935 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.736922357 +0000 UTC m=+148.918438578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n"] Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" event={"ID":"39df9137-94c9-4052-bb18-458e1d29e081","Type":"ContainerStarted","Data":"a115e1ef4364a54d41099397131fb70ecee0c6d654dfcf634e721379eec1cdc9"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" event={"ID":"39df9137-94c9-4052-bb18-458e1d29e081","Type":"ContainerStarted","Data":"e61bb3be7163c422c833917568cb630adff2f70c94adee7c405337842fe17951"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" event={"ID":"0d29f4db-3090-4118-88c3-878f8ddc4ab1","Type":"ContainerStarted","Data":"593179b496d2db8ba7ec04655fcce30a58f1534fc38619537b33df2152e29367"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" event={"ID":"7af8a399-b5f7-49ab-97e2-ba8c219e40eb","Type":"ContainerStarted","Data":"9ac600eb9d978d371bad0b3db2ca3107e171a4249722fe075cbc463feda10661"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" event={"ID":"130b5f6d-939d-4b35-b0a4-5fc39dc8b077","Type":"ContainerStarted","Data":"4cb1349459ed1c1c570a6db837797c10e53eab70059af4615772d6b9e4d844b2"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.289187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" event={"ID":"0cc1862f-ca4e-4e64-95df-61589500669a","Type":"ContainerStarted","Data":"1b79967bc6acc6ac3ba8b0f4c38c0a021350910080e4d50a04044a6bedb83557"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.339291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.341055 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.841039522 +0000 UTC m=+149.022555743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.341756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" event={"ID":"1ab8b695-baf0-46bb-bf98-a13bc585f7ce","Type":"ContainerStarted","Data":"d0fe5c2f831407b210038ded18b3dc7e2131f093ef6a232b358e43f11fda613d"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.351403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z4txk" event={"ID":"5ee0078f-3c54-4136-88a7-59585fc4b5ee","Type":"ContainerStarted","Data":"9b3848a7dcc26921fcd791cee1360d95a0912c08a57ec6b31bac6445acdb6f18"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.354693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" event={"ID":"3742167f-2077-4b3c-b385-2a108c04538c","Type":"ContainerStarted","Data":"81503b2ec210da2e36d7d852635a9a152151b0d9460e8543ea05ca7626c6bce8"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.354737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" event={"ID":"3742167f-2077-4b3c-b385-2a108c04538c","Type":"ContainerStarted","Data":"9716e752da423cfc3ce05f25134660ebb91a1f53da27ea7436fbe3b768a0b355"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.355210 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bl9mx" podStartSLOduration=125.355195231 podStartE2EDuration="2m5.355195231s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.352382424 +0000 UTC m=+148.533898646" watchObservedRunningTime="2026-01-21 15:04:11.355195231 +0000 UTC m=+148.536711453" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.355599 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.365670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" event={"ID":"2c1f606a-2541-4b8d-a4ce-62919f0a44b0","Type":"ContainerDied","Data":"67ded3eb3c7361dd2cc3657f236c95142a056cb01a2498b7541eac51ed9d5b08"} Jan 21 15:04:11 crc kubenswrapper[4707]: 
I0121 15:04:11.366418 4707 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bmpc8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.366452 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" podUID="3742167f-2077-4b3c-b385-2a108c04538c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.373879 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c1f606a-2541-4b8d-a4ce-62919f0a44b0" containerID="67ded3eb3c7361dd2cc3657f236c95142a056cb01a2498b7541eac51ed9d5b08" exitCode=0 Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.395384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zdb5r" event={"ID":"8079cbea-8f34-4384-b064-d979305eb076","Type":"ContainerStarted","Data":"a82451e275544618e9e4522c0a81a6f5c55ed2a4a409662e0c0f8dc50412c38d"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.395427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zdb5r" event={"ID":"8079cbea-8f34-4384-b064-d979305eb076","Type":"ContainerStarted","Data":"b929601ec266e33e81eb551c1ef2cb2b8c211095a7bd3561b6b3941a037c6f2f"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.397707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" event={"ID":"6b4276ea-aeb9-4265-bff9-c7c339da152d","Type":"ContainerStarted","Data":"b566b4ca4ec15a2f93c466f29d17fdb5d17cead5b9237e50e96b338501ad2766"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.410572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" event={"ID":"5999c94d-18a3-46e1-8a10-9f31b1b6577a","Type":"ContainerStarted","Data":"3eff03fdd8bafa0a9fef8b2a2d83a397fa979dc590754019e3f971161a7ef102"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.411387 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-njgpj" podStartSLOduration=125.411375189 podStartE2EDuration="2m5.411375189s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.408719473 +0000 UTC m=+148.590235695" watchObservedRunningTime="2026-01-21 15:04:11.411375189 +0000 UTC m=+148.592891411" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.415532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" event={"ID":"60cbb465-ee79-49db-8488-943c0b44334b","Type":"ContainerStarted","Data":"c43e2b358a664619361465fee12959c77a9c53e9e1f0ffd92fb6a5da3fb07573"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.415560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" 
event={"ID":"60cbb465-ee79-49db-8488-943c0b44334b","Type":"ContainerStarted","Data":"58953f61c72096c829c04d2eb2679981b1f4d62e70d58fb41076b42d3f274310"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.416224 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.422413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-62wx6" event={"ID":"51519fd7-68a2-4048-8d76-affda2188f00","Type":"ContainerStarted","Data":"0d5db597197da97983cbae2c8d8635ca77477a468a403f0c55d0baafe9fd2e1e"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.422512 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-62wx6" event={"ID":"51519fd7-68a2-4048-8d76-affda2188f00","Type":"ContainerStarted","Data":"64f80162c209ec3800cd685a37617838a5aa3b2887deaf41b8ea454052f02ccb"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.422836 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.440316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.441410 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:11.941394379 +0000 UTC m=+149.122910602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.446580 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.448450 4707 csr.go:261] certificate signing request csr-fmj77 is approved, waiting to be issued Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.464988 4707 csr.go:257] certificate signing request csr-fmj77 is issued Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.496278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" event={"ID":"e82fcc54-393c-4981-a666-769ed5c9c92d","Type":"ContainerStarted","Data":"c09e2388848b868cae1eb194717e296e2f9eca77facf5becb0e641e9b3ecab77"} Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.496574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.497188 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-c6ckn" podStartSLOduration=125.497177418 podStartE2EDuration="2m5.497177418s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.465779069 +0000 UTC m=+148.647295291" watchObservedRunningTime="2026-01-21 15:04:11.497177418 +0000 UTC m=+148.678693640" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.497756 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-62wx6" podStartSLOduration=125.49775267 podStartE2EDuration="2m5.49775267s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.496385432 +0000 UTC m=+148.677901655" watchObservedRunningTime="2026-01-21 15:04:11.49775267 +0000 UTC m=+148.679268891" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.505978 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-6kmlc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.506017 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6kmlc" podUID="690ffd9a-e08c-4a06-9d78-990fd46651d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.511139 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 
15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.525101 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnkpm" podStartSLOduration=125.525088002 podStartE2EDuration="2m5.525088002s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.522873618 +0000 UTC m=+148.704389841" watchObservedRunningTime="2026-01-21 15:04:11.525088002 +0000 UTC m=+148.706604224" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.543512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.545016 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.045005823 +0000 UTC m=+149.226522045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.560770 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6khq" podStartSLOduration=125.560757966 podStartE2EDuration="2m5.560757966s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.559971381 +0000 UTC m=+148.741487604" watchObservedRunningTime="2026-01-21 15:04:11.560757966 +0000 UTC m=+148.742274189" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.619762 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-96l4g" podStartSLOduration=125.619747615 podStartE2EDuration="2m5.619747615s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.613065422 +0000 UTC m=+148.794581644" watchObservedRunningTime="2026-01-21 15:04:11.619747615 +0000 UTC m=+148.801263838" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.651103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 
15:04:11.652258 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjv7l" podStartSLOduration=125.652243679 podStartE2EDuration="2m5.652243679s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.650570323 +0000 UTC m=+148.832086544" watchObservedRunningTime="2026-01-21 15:04:11.652243679 +0000 UTC m=+148.833759900" Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.652487 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.152472735 +0000 UTC m=+149.333988957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.688168 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" podStartSLOduration=125.688138732 podStartE2EDuration="2m5.688138732s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.687867306 +0000 UTC m=+148.869383529" watchObservedRunningTime="2026-01-21 15:04:11.688138732 +0000 UTC m=+148.869654955" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.769455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.769723 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.269711003 +0000 UTC m=+149.451227226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.790002 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8xcdf" podStartSLOduration=125.789986 podStartE2EDuration="2m5.789986s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.787862225 +0000 UTC m=+148.969378446" watchObservedRunningTime="2026-01-21 15:04:11.789986 +0000 UTC m=+148.971502222" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.790854 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ggf7f" podStartSLOduration=125.790850449 podStartE2EDuration="2m5.790850449s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.746355191 +0000 UTC m=+148.927871414" watchObservedRunningTime="2026-01-21 15:04:11.790850449 +0000 UTC m=+148.972366672" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.821750 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dmhxk" podStartSLOduration=125.821735273 podStartE2EDuration="2m5.821735273s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.821223951 +0000 UTC m=+149.002740172" watchObservedRunningTime="2026-01-21 15:04:11.821735273 +0000 UTC m=+149.003251495" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.870400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.870936 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.370923359 +0000 UTC m=+149.552439582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.892567 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zdb5r" podStartSLOduration=5.892551395 podStartE2EDuration="5.892551395s" podCreationTimestamp="2026-01-21 15:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.860466076 +0000 UTC m=+149.041982298" watchObservedRunningTime="2026-01-21 15:04:11.892551395 +0000 UTC m=+149.074067617" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.893183 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" podStartSLOduration=125.893176499 podStartE2EDuration="2m5.893176499s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.891255843 +0000 UTC m=+149.072772064" watchObservedRunningTime="2026-01-21 15:04:11.893176499 +0000 UTC m=+149.074692721" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.969787 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" podStartSLOduration=125.969769739 podStartE2EDuration="2m5.969769739s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:11.968784455 +0000 UTC m=+149.150300676" watchObservedRunningTime="2026-01-21 15:04:11.969769739 +0000 UTC m=+149.151285961" Jan 21 15:04:11 crc kubenswrapper[4707]: I0121 15:04:11.971882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:11 crc kubenswrapper[4707]: E0121 15:04:11.972097 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.472086152 +0000 UTC m=+149.653602374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.009442 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.073327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.073657 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.573618324 +0000 UTC m=+149.755134546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.074136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.074599 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.574589262 +0000 UTC m=+149.756105484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.175120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.175772 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.675755071 +0000 UTC m=+149.857271292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.217955 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:12 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:12 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:12 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.218000 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.278120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.278432 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.778421905 +0000 UTC m=+149.959938127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.378826 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.379307 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.879286723 +0000 UTC m=+150.060802945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.379387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.379748 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.879735659 +0000 UTC m=+150.061251881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.423886 4707 patch_prober.go:28] interesting pod/console-operator-58897d9998-62wx6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.423937 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-62wx6" podUID="51519fd7-68a2-4048-8d76-affda2188f00" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.472876 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-21 14:59:11 +0000 UTC, rotation deadline is 2026-11-17 02:07:04.913015009 +0000 UTC Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.472909 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7187h2m52.440110934s for next certificate rotation Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.480246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.480526 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:12.980514276 +0000 UTC m=+150.162030499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.562616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" event={"ID":"f10ad167-f30a-4c37-8a38-fb877eeb567a","Type":"ContainerStarted","Data":"fb402a07504fade4203efecba028d36145629bfdbd861436a28c7e69bf7d011f"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.562676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.581521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.581871 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.081858708 +0000 UTC m=+150.263374930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.643357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" event={"ID":"e82fcc54-393c-4981-a666-769ed5c9c92d","Type":"ContainerStarted","Data":"77d531df36a461a869ffc881636885c7f50ecfa3433c6de7c8a4fe613dab4500"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.643507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" event={"ID":"e82fcc54-393c-4981-a666-769ed5c9c92d","Type":"ContainerStarted","Data":"82985b7e20f551e1f41455e6070d91c5e54c31bc2b27d5b792f75d08e5aa2e10"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.670239 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" podStartSLOduration=126.670224541 podStartE2EDuration="2m6.670224541s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.634399689 +0000 UTC m=+149.815915911" watchObservedRunningTime="2026-01-21 15:04:12.670224541 +0000 UTC m=+149.851740763" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.670667 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjvk4" podStartSLOduration=126.670662997 podStartE2EDuration="2m6.670662997s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.668868096 +0000 UTC m=+149.850384317" watchObservedRunningTime="2026-01-21 15:04:12.670662997 +0000 UTC m=+149.852179219" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.672863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"99fde6758c90ac44a3291313f82661a1ade772a7984769a56b8a8e367f8a8d2b"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.685286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.686072 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.186059509 +0000 UTC m=+150.367575730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.688577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" event={"ID":"2c1f606a-2541-4b8d-a4ce-62919f0a44b0","Type":"ContainerStarted","Data":"5073c1bdf2d7167c22a4c33953d2252ddad7580c38e2ca9f5885e31c1ac81a1f"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.719694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" event={"ID":"7a6b585e-78c7-440d-81da-7ea7a3961948","Type":"ContainerStarted","Data":"d353a9eb2c0b0d085b44c9c376d78697a2d7adb84435c5da7114a758ad7f63c9"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.719732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" event={"ID":"7a6b585e-78c7-440d-81da-7ea7a3961948","Type":"ContainerStarted","Data":"efb0419c7a27ecc6dfcb2a19b2c1a6ed76db88d6303896c0dd1096af742b5556"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.742203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" event={"ID":"5537fff0-7562-4e93-8db8-d13d18966e3d","Type":"ContainerStarted","Data":"3366326ca952000f6610fe4a617379c49e1cb863f2bc2229d35cc5564f663ccd"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.742247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" event={"ID":"5537fff0-7562-4e93-8db8-d13d18966e3d","Type":"ContainerStarted","Data":"ae73ba976a5a0b0c7da9f0c9f2f608f36318b085640a7c9f58bc0d38aa0da08e"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.763976 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" podStartSLOduration=126.763946759 podStartE2EDuration="2m6.763946759s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.762146166 +0000 UTC m=+149.943662388" watchObservedRunningTime="2026-01-21 15:04:12.763946759 +0000 UTC m=+149.945462981" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.780033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ad7309a7f5c69080df2fd908c4ec028db890766d746c12a649f260b4afa5525b"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.780676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.790337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.790683 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.290670944 +0000 UTC m=+150.472187166 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.812573 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" podStartSLOduration=126.812557811 podStartE2EDuration="2m6.812557811s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.780436805 +0000 UTC m=+149.961953028" watchObservedRunningTime="2026-01-21 15:04:12.812557811 +0000 UTC m=+149.994074033" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.830072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" event={"ID":"0d29f4db-3090-4118-88c3-878f8ddc4ab1","Type":"ContainerStarted","Data":"cadd8b2159a1c8d4d582aa2de2d3a028469a095d987fbf49e1b86cf65cbec2ae"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.830105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" event={"ID":"0d29f4db-3090-4118-88c3-878f8ddc4ab1","Type":"ContainerStarted","Data":"f221a95938cd6e5a074da93ddaf2061b396bec5c1c4696137ed45d910b14aec8"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.830168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.855013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" event={"ID":"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85","Type":"ContainerStarted","Data":"911dffd0c8cd01d11361e8005bc9a63a6f9a4eb0c3a8072f67e4c54b8ef32426"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.855054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" event={"ID":"6a1d8d5b-9f51-4661-9dda-d2e43a87eb85","Type":"ContainerStarted","Data":"35c1b8b6301895a677e5b07f8d577525663523ad2d97f7682d6f3a6029250dbb"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.888219 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" podStartSLOduration=126.88820474 podStartE2EDuration="2m6.88820474s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.865329972 +0000 UTC m=+150.046846194" watchObservedRunningTime="2026-01-21 15:04:12.88820474 +0000 UTC m=+150.069720961" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.891151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.892203 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.392189039 +0000 UTC m=+150.573705260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.893249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" event={"ID":"152ee428-bdb0-4de2-a795-af600b7669fd","Type":"ContainerStarted","Data":"fde862dbec1977ede2262ba84c04dc99daeafaad16e15d8a11bcc20b03bd3a29"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.893358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" event={"ID":"152ee428-bdb0-4de2-a795-af600b7669fd","Type":"ContainerStarted","Data":"ac228c18a4cc04c2797b6801463e841608331403acb0a19e3816e33c7cd9b946"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.893441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" event={"ID":"152ee428-bdb0-4de2-a795-af600b7669fd","Type":"ContainerStarted","Data":"1b7da4a6cc52df279c5a6e725ef3faabbb3aaf43c75a21ab6555e4b6ce7914e1"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.916629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z4txk" event={"ID":"5ee0078f-3c54-4136-88a7-59585fc4b5ee","Type":"ContainerStarted","Data":"6811a855a13c6fe15223f871230d95e24e1ed197ac469d342f326b9b5383974d"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.916925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.922881 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jtpg" podStartSLOduration=126.922858521 podStartE2EDuration="2m6.922858521s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.888532009 +0000 UTC m=+150.070048231" 
watchObservedRunningTime="2026-01-21 15:04:12.922858521 +0000 UTC m=+150.104374743" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.923409 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qlnlt" podStartSLOduration=126.92340032 podStartE2EDuration="2m6.92340032s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.921672753 +0000 UTC m=+150.103188975" watchObservedRunningTime="2026-01-21 15:04:12.92340032 +0000 UTC m=+150.104916541" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.935294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pk8cn" event={"ID":"1c83727c-a8f9-4a68-a14d-e589059d313c","Type":"ContainerStarted","Data":"571064cd5a80dcc447b05c41d7b987fe5dabf49979a68ce1d8f61e27a253e564"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.979355 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z4txk" podStartSLOduration=6.979340491 podStartE2EDuration="6.979340491s" podCreationTimestamp="2026-01-21 15:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.978026484 +0000 UTC m=+150.159542716" watchObservedRunningTime="2026-01-21 15:04:12.979340491 +0000 UTC m=+150.160856712" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.979740 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" event={"ID":"130b5f6d-939d-4b35-b0a4-5fc39dc8b077","Type":"ContainerStarted","Data":"54da445d9050c45c522fb7e51d7b6d68d70b16e93b37c8675a7ecd3562f4a9c4"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.990475 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pk8cn" podStartSLOduration=6.990462902 podStartE2EDuration="6.990462902s" podCreationTimestamp="2026-01-21 15:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:12.98975305 +0000 UTC m=+150.171269273" watchObservedRunningTime="2026-01-21 15:04:12.990462902 +0000 UTC m=+150.171979124" Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.990578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" event={"ID":"8a14118b-5f31-4136-82d5-b2f597f6554c","Type":"ContainerStarted","Data":"579f85b60e7851e6a06cd54fff945a1f5e0f8a69232493855d14c30be6321a63"} Jan 21 15:04:12 crc kubenswrapper[4707]: I0121 15:04:12.993350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:12 crc kubenswrapper[4707]: E0121 15:04:12.995596 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 15:04:13.495585171 +0000 UTC m=+150.677101393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.002960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" event={"ID":"029666b4-21ed-4caf-a28b-f78d1316de59","Type":"ContainerStarted","Data":"0a4d5adb16c9cacd3eff444104d4fc665f2322efd407b711a6c80eb4f5212190"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.003002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" event={"ID":"029666b4-21ed-4caf-a28b-f78d1316de59","Type":"ContainerStarted","Data":"2ff273ab7ac95c1bc2ec8923cfa28360b3e84af883e272f609b0254ec98ca42a"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.018912 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dnwth" podStartSLOduration=127.018900468 podStartE2EDuration="2m7.018900468s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.01816065 +0000 UTC m=+150.199676872" watchObservedRunningTime="2026-01-21 15:04:13.018900468 +0000 UTC m=+150.200416690" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.028612 4707 generic.go:334] "Generic (PLEG): container finished" podID="d87c8f0a-f206-4813-968e-8f060d659834" containerID="6affafa67d5547d7b469176d69f11fe4daec922d60da2777ca9ebba30c979fe2" exitCode=0 Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.028691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" event={"ID":"d87c8f0a-f206-4813-968e-8f060d659834","Type":"ContainerDied","Data":"6affafa67d5547d7b469176d69f11fe4daec922d60da2777ca9ebba30c979fe2"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.049703 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-knchs" podStartSLOduration=127.049687008 podStartE2EDuration="2m7.049687008s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.044835914 +0000 UTC m=+150.226352137" watchObservedRunningTime="2026-01-21 15:04:13.049687008 +0000 UTC m=+150.231203230" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.054058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" event={"ID":"a13a87c3-f0a5-4fa9-892d-e2371300124a","Type":"ContainerStarted","Data":"fdf5c3dc9a9fddaeed0daff9c1a05de484bfadd721c10c9f616a0c7b32569f54"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.054937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.067513 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ms2tj" podStartSLOduration=127.067498457 podStartE2EDuration="2m7.067498457s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.067232942 +0000 UTC m=+150.248749164" watchObservedRunningTime="2026-01-21 15:04:13.067498457 +0000 UTC m=+150.249014678" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.090066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.092630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" event={"ID":"4a760eb2-4041-468b-a269-13e523796dd0","Type":"ContainerStarted","Data":"6b3c2d5d62fd5c52c4c35e0776e52b01d9af62299c69312a46ce7fa082087898"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.094766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" event={"ID":"0cc1862f-ca4e-4e64-95df-61589500669a","Type":"ContainerStarted","Data":"cba0d9d8f1b04468370af82ea18d33090de9fdf422b1c82fd7248c06aee470c6"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.094794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" event={"ID":"0cc1862f-ca4e-4e64-95df-61589500669a","Type":"ContainerStarted","Data":"9ea1e034f75e7edf4cc409aef9203e6d85626f1a5fcb9491fdbca4dbea8ce98b"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.095103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.097878 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.597865405 +0000 UTC m=+150.779381626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.110161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"87b9b6e63cb2bd92c9d59ff6f37c18ba58f4344bd1e312c337a1ce4e0e551343"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.120218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" event={"ID":"d43c6f9c-a13c-406e-90f3-152b22768174","Type":"ContainerStarted","Data":"b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.120917 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.128172 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgg5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.128208 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" podUID="d43c6f9c-a13c-406e-90f3-152b22768174" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.128715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" event={"ID":"2c4cb519-1970-4b6b-99ce-1e505723ceca","Type":"ContainerStarted","Data":"e9e9c37682c4d0fe006bb878ece9db0de4eb9d6f561f5e68bf6e7ddfa8f60ab4"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.128746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" event={"ID":"2c4cb519-1970-4b6b-99ce-1e505723ceca","Type":"ContainerStarted","Data":"0b5230af6fb27822466c7017caa455f74b94f84b8942cc9c3e3c1902268383dc"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.129433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.139170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" event={"ID":"cb976779-fc90-40ac-a41e-d93b53181836","Type":"ContainerStarted","Data":"1eab8ed32605564c1bfb0f3f7515527e74fc4b9f10c69590fa8af2cf71dd520f"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.139200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" 
event={"ID":"cb976779-fc90-40ac-a41e-d93b53181836","Type":"ContainerStarted","Data":"00769c538be0aefce4f3e551ee24e56be63e2ef09e0234bb60f72c66b050d434"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.139210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" event={"ID":"cb976779-fc90-40ac-a41e-d93b53181836","Type":"ContainerStarted","Data":"463b17d71435d95f700b39dce065dbd22b52dd42a52d01903854fb95134f9c94"} Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.139526 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-6kmlc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.139601 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6kmlc" podUID="690ffd9a-e08c-4a06-9d78-990fd46651d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.152175 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-62wx6" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.162911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6dncd" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.163268 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" podStartSLOduration=127.163253259 podStartE2EDuration="2m7.163253259s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.162613637 +0000 UTC m=+150.344129860" watchObservedRunningTime="2026-01-21 15:04:13.163253259 +0000 UTC m=+150.344769471" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.164164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.197309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.212747 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t9whl" podStartSLOduration=127.21273344 podStartE2EDuration="2m7.21273344s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.211689907 +0000 UTC m=+150.393206129" watchObservedRunningTime="2026-01-21 15:04:13.21273344 +0000 UTC m=+150.394249662" Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.224078 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.72405917 +0000 UTC m=+150.905575392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.253487 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:13 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:13 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:13 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.253542 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.312825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.313793 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.813777461 +0000 UTC m=+150.995293684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.314534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.318349 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 15:04:13.818335069 +0000 UTC m=+150.999851291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.356623 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" podStartSLOduration=127.356608551 podStartE2EDuration="2m7.356608551s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.330274802 +0000 UTC m=+150.511791025" watchObservedRunningTime="2026-01-21 15:04:13.356608551 +0000 UTC m=+150.538124772" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.398357 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" podStartSLOduration=127.398342397 podStartE2EDuration="2m7.398342397s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.396800585 +0000 UTC m=+150.578316808" watchObservedRunningTime="2026-01-21 15:04:13.398342397 +0000 UTC m=+150.579858619" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.416891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.417639 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:13.91762258 +0000 UTC m=+151.099138802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.456901 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bjb5q" podStartSLOduration=127.456886607 podStartE2EDuration="2m7.456886607s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.433871047 +0000 UTC m=+150.615387270" watchObservedRunningTime="2026-01-21 15:04:13.456886607 +0000 UTC m=+150.638402828" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.490259 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk7ss" podStartSLOduration=127.490244104 podStartE2EDuration="2m7.490244104s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:13.45743092 +0000 UTC m=+150.638947142" watchObservedRunningTime="2026-01-21 15:04:13.490244104 +0000 UTC m=+150.671760326" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.521071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.521339 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.021329451 +0000 UTC m=+151.202845673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.622154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.622275 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.122260803 +0000 UTC m=+151.303777025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.622605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.622824 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.122797512 +0000 UTC m=+151.304313734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.723110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.723431 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.223418085 +0000 UTC m=+151.404934308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.785598 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f7d2n" Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.823980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.824328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.324315585 +0000 UTC m=+151.505831807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:13 crc kubenswrapper[4707]: I0121 15:04:13.924916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:13 crc kubenswrapper[4707]: E0121 15:04:13.925231 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.425213214 +0000 UTC m=+151.606729436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:14 crc kubenswrapper[4707]: E0121 15:04:14.026791 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.526778086 +0000 UTC m=+151.708294308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.027119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.128152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:14 crc kubenswrapper[4707]: E0121 15:04:14.128456 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.62844071 +0000 UTC m=+151.809956932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.144652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" event={"ID":"d87c8f0a-f206-4813-968e-8f060d659834","Type":"ContainerStarted","Data":"5266ffebfe068c282ff3bf5f5c8c7bac0358d028f254af74681e944f49f1b2b1"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.150697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" event={"ID":"6b4276ea-aeb9-4265-bff9-c7c339da152d","Type":"ContainerStarted","Data":"022ee9755c44775503084bf3748cc1cf791fa9f575764ae7636f30c18f657695"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.150733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" event={"ID":"6b4276ea-aeb9-4265-bff9-c7c339da152d","Type":"ContainerStarted","Data":"62ed7170e6277c2e6105f94d05ed4d1eb18e00a96a84168012678b1f32c9adbc"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.150743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" event={"ID":"6b4276ea-aeb9-4265-bff9-c7c339da152d","Type":"ContainerStarted","Data":"db3104487ccf62d0fe1638588124a8ce8d27bd7690d30cda7126857ce4e7fb6b"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.152561 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-z4txk" event={"ID":"5ee0078f-3c54-4136-88a7-59585fc4b5ee","Type":"ContainerStarted","Data":"49a358f94f38c5d8616827b3cfd0cc30ace6bc7aa66fe33e0e7821f4912aa255"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.155070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" event={"ID":"2c1f606a-2541-4b8d-a4ce-62919f0a44b0","Type":"ContainerStarted","Data":"a173e81e0da059265852719b9feba22b2f4291d88e74cbf238a422cac77cf35a"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.157280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h69c5" event={"ID":"7a6b585e-78c7-440d-81da-7ea7a3961948","Type":"ContainerStarted","Data":"4a1b00c8c52375d398e7779b47882504b0cdd2d9a407ad9d412fb46cc9411824"} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.167130 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" podStartSLOduration=128.167101644 podStartE2EDuration="2m8.167101644s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:14.164906476 +0000 UTC m=+151.346422698" watchObservedRunningTime="2026-01-21 15:04:14.167101644 +0000 UTC m=+151.348617867" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.178067 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.178917 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpvrj" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.181149 4707 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.183307 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" podStartSLOduration=128.183297043 podStartE2EDuration="2m8.183297043s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:14.181860568 +0000 UTC m=+151.363376790" watchObservedRunningTime="2026-01-21 15:04:14.183297043 +0000 UTC m=+151.364813265" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.200843 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:14 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:14 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:14 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.200883 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 
15:04:14.230077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:14 crc kubenswrapper[4707]: E0121 15:04:14.230414 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:04:14.730400077 +0000 UTC m=+151.911916299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q6l9f" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.312960 4707 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-21T15:04:14.181169129Z","Handler":null,"Name":""} Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.314782 4707 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.314838 4707 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.331086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.341670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.433829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.436162 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.436199 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.453058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q6l9f\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.586469 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.633822 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtrmt"] Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.634618 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.636006 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.652572 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtrmt"] Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.725895 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q6l9f"] Jan 21 15:04:14 crc kubenswrapper[4707]: W0121 15:04:14.730854 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode72b99f9_e554_43e0_ae76_ee94b2613a88.slice/crio-e56323bb131fbb166f3a33eb484c57e95faf09f151378cf1266522a809e27e32 WatchSource:0}: Error finding container e56323bb131fbb166f3a33eb484c57e95faf09f151378cf1266522a809e27e32: Status 404 returned error can't find the container with id e56323bb131fbb166f3a33eb484c57e95faf09f151378cf1266522a809e27e32 Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.737752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-utilities\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.737833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-catalog-content\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " 
pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.737971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdt8\" (UniqueName: \"kubernetes.io/projected/beb79892-2b5e-47cf-9381-08a5654df324-kube-api-access-8tdt8\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.831167 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ncnxd"] Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.832152 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.833697 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.837663 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncnxd"] Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.839519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-catalog-content\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.839553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdt8\" (UniqueName: \"kubernetes.io/projected/beb79892-2b5e-47cf-9381-08a5654df324-kube-api-access-8tdt8\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.839592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-utilities\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.839950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-utilities\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.840166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-catalog-content\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.863539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdt8\" (UniqueName: \"kubernetes.io/projected/beb79892-2b5e-47cf-9381-08a5654df324-kube-api-access-8tdt8\") pod \"certified-operators-dtrmt\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " 
pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.940790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-utilities\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.940849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4txd\" (UniqueName: \"kubernetes.io/projected/0e136188-e9e8-4eea-bd22-e01006fadce4-kube-api-access-t4txd\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.940907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-catalog-content\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:14 crc kubenswrapper[4707]: I0121 15:04:14.949617 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.032547 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4gph2"] Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.033789 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.041555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-utilities\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.041591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4txd\" (UniqueName: \"kubernetes.io/projected/0e136188-e9e8-4eea-bd22-e01006fadce4-kube-api-access-t4txd\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.041641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-catalog-content\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.042171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-utilities\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.042280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-catalog-content\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.043438 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gph2"] Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.059365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4txd\" (UniqueName: \"kubernetes.io/projected/0e136188-e9e8-4eea-bd22-e01006fadce4-kube-api-access-t4txd\") pod \"community-operators-ncnxd\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.103353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtrmt"] Jan 21 15:04:15 crc kubenswrapper[4707]: W0121 15:04:15.107656 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb79892_2b5e_47cf_9381_08a5654df324.slice/crio-99c79956b8828ff46306ffef19782b6af09c7b27202c04992faa00d5d3b7bcc7 WatchSource:0}: Error finding container 99c79956b8828ff46306ffef19782b6af09c7b27202c04992faa00d5d3b7bcc7: Status 404 returned error can't find the container with id 99c79956b8828ff46306ffef19782b6af09c7b27202c04992faa00d5d3b7bcc7 Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.142585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzkb\" (UniqueName: 
\"kubernetes.io/projected/1225cfb7-54e0-493f-b631-8743502cc57a-kube-api-access-vvzkb\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.142640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-catalog-content\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.142675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-utilities\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.151740 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.167778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtrmt" event={"ID":"beb79892-2b5e-47cf-9381-08a5654df324","Type":"ContainerStarted","Data":"99c79956b8828ff46306ffef19782b6af09c7b27202c04992faa00d5d3b7bcc7"} Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.170311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" event={"ID":"6b4276ea-aeb9-4265-bff9-c7c339da152d","Type":"ContainerStarted","Data":"e708c902134d4a7e88d3beabf2ae2ae915352e580b55f5d5879a3b3898f77e4d"} Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.171756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" event={"ID":"e72b99f9-e554-43e0-ae76-ee94b2613a88","Type":"ContainerStarted","Data":"94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705"} Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.171794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" event={"ID":"e72b99f9-e554-43e0-ae76-ee94b2613a88","Type":"ContainerStarted","Data":"e56323bb131fbb166f3a33eb484c57e95faf09f151378cf1266522a809e27e32"} Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.171978 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.173230 4707 generic.go:334] "Generic (PLEG): container finished" podID="5537fff0-7562-4e93-8db8-d13d18966e3d" containerID="3366326ca952000f6610fe4a617379c49e1cb863f2bc2229d35cc5564f663ccd" exitCode=0 Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.173369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" event={"ID":"5537fff0-7562-4e93-8db8-d13d18966e3d","Type":"ContainerDied","Data":"3366326ca952000f6610fe4a617379c49e1cb863f2bc2229d35cc5564f663ccd"} Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.191130 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pgf7w" 
podStartSLOduration=9.191111884 podStartE2EDuration="9.191111884s" podCreationTimestamp="2026-01-21 15:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:15.184722295 +0000 UTC m=+152.366238517" watchObservedRunningTime="2026-01-21 15:04:15.191111884 +0000 UTC m=+152.372628105" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.197247 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.201030 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:15 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:15 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:15 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.201068 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.214144 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" podStartSLOduration=129.214127012 podStartE2EDuration="2m9.214127012s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:15.213140936 +0000 UTC m=+152.394657158" watchObservedRunningTime="2026-01-21 15:04:15.214127012 +0000 UTC m=+152.395643234" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.226008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.229842 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.231293 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.231459 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.232926 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.236113 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sn2nd"] Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.236999 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.243685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzkb\" (UniqueName: \"kubernetes.io/projected/1225cfb7-54e0-493f-b631-8743502cc57a-kube-api-access-vvzkb\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.243757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-catalog-content\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.244169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-catalog-content\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.244207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-utilities\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.244440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-utilities\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.249616 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sn2nd"] Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.260560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzkb\" (UniqueName: \"kubernetes.io/projected/1225cfb7-54e0-493f-b631-8743502cc57a-kube-api-access-vvzkb\") pod \"certified-operators-4gph2\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.324778 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncnxd"] Jan 21 15:04:15 crc kubenswrapper[4707]: W0121 15:04:15.328775 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e136188_e9e8_4eea_bd22_e01006fadce4.slice/crio-20fae57f46a49e2df559fb5fb428760f1354d2040b7599a08c6febd705095a19 WatchSource:0}: Error finding container 20fae57f46a49e2df559fb5fb428760f1354d2040b7599a08c6febd705095a19: Status 404 returned error can't find the container with id 20fae57f46a49e2df559fb5fb428760f1354d2040b7599a08c6febd705095a19 Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.345966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f6508928-d944-40b0-9640-0193417ba7bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.346052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-catalog-content\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.346264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6508928-d944-40b0-9640-0193417ba7bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.346456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kfzn\" (UniqueName: \"kubernetes.io/projected/d6840e11-1e4b-47d4-8a4d-cc857f791915-kube-api-access-9kfzn\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.346523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-utilities\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.419067 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.447265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-utilities\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.447314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6508928-d944-40b0-9640-0193417ba7bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.447344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-catalog-content\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.447385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6508928-d944-40b0-9640-0193417ba7bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.447424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzn\" (UniqueName: \"kubernetes.io/projected/d6840e11-1e4b-47d4-8a4d-cc857f791915-kube-api-access-9kfzn\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.448128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-utilities\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.448233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-catalog-content\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.448106 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6508928-d944-40b0-9640-0193417ba7bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.462215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6508928-d944-40b0-9640-0193417ba7bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.465292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kfzn\" (UniqueName: \"kubernetes.io/projected/d6840e11-1e4b-47d4-8a4d-cc857f791915-kube-api-access-9kfzn\") pod \"community-operators-sn2nd\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.551034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gph2"] Jan 21 15:04:15 crc kubenswrapper[4707]: W0121 15:04:15.555702 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1225cfb7_54e0_493f_b631_8743502cc57a.slice/crio-3499d717d3075fe515aa4060713c72cdf02978dab0c2032a9e19dab13d8d4f73 WatchSource:0}: Error finding container 3499d717d3075fe515aa4060713c72cdf02978dab0c2032a9e19dab13d8d4f73: Status 404 returned error can't find the container with id 3499d717d3075fe515aa4060713c72cdf02978dab0c2032a9e19dab13d8d4f73 Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.598443 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.606296 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.943706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 15:04:15 crc kubenswrapper[4707]: W0121 15:04:15.946609 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf6508928_d944_40b0_9640_0193417ba7bb.slice/crio-ee27b711350a26c440c2a46bfd3b6f13f0157bfd794e57241ed9b2617ac6091b WatchSource:0}: Error finding container ee27b711350a26c440c2a46bfd3b6f13f0157bfd794e57241ed9b2617ac6091b: Status 404 returned error can't find the container with id ee27b711350a26c440c2a46bfd3b6f13f0157bfd794e57241ed9b2617ac6091b Jan 21 15:04:15 crc kubenswrapper[4707]: I0121 15:04:15.990105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sn2nd"] Jan 21 15:04:15 crc kubenswrapper[4707]: W0121 15:04:15.997252 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6840e11_1e4b_47d4_8a4d_cc857f791915.slice/crio-c0ae0bbdf7492a5bb716b25e3d2ccbccd4646481dff2903d26ae9b62870a0561 WatchSource:0}: Error finding container c0ae0bbdf7492a5bb716b25e3d2ccbccd4646481dff2903d26ae9b62870a0561: Status 404 returned error can't find the container with id c0ae0bbdf7492a5bb716b25e3d2ccbccd4646481dff2903d26ae9b62870a0561 Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.177858 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerID="ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc" exitCode=0 Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.177914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncnxd" event={"ID":"0e136188-e9e8-4eea-bd22-e01006fadce4","Type":"ContainerDied","Data":"ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc"} Jan 21 
15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.177935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncnxd" event={"ID":"0e136188-e9e8-4eea-bd22-e01006fadce4","Type":"ContainerStarted","Data":"20fae57f46a49e2df559fb5fb428760f1354d2040b7599a08c6febd705095a19"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.179333 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.179543 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerID="fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a" exitCode=0 Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.179598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2nd" event={"ID":"d6840e11-1e4b-47d4-8a4d-cc857f791915","Type":"ContainerDied","Data":"fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.179625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2nd" event={"ID":"d6840e11-1e4b-47d4-8a4d-cc857f791915","Type":"ContainerStarted","Data":"c0ae0bbdf7492a5bb716b25e3d2ccbccd4646481dff2903d26ae9b62870a0561"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.182030 4707 generic.go:334] "Generic (PLEG): container finished" podID="beb79892-2b5e-47cf-9381-08a5654df324" containerID="6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9" exitCode=0 Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.182084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtrmt" event={"ID":"beb79892-2b5e-47cf-9381-08a5654df324","Type":"ContainerDied","Data":"6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.183856 4707 generic.go:334] "Generic (PLEG): container finished" podID="1225cfb7-54e0-493f-b631-8743502cc57a" containerID="e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749" exitCode=0 Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.183896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gph2" event={"ID":"1225cfb7-54e0-493f-b631-8743502cc57a","Type":"ContainerDied","Data":"e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.183911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gph2" event={"ID":"1225cfb7-54e0-493f-b631-8743502cc57a","Type":"ContainerStarted","Data":"3499d717d3075fe515aa4060713c72cdf02978dab0c2032a9e19dab13d8d4f73"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.186160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f6508928-d944-40b0-9640-0193417ba7bb","Type":"ContainerStarted","Data":"5e1b622cf345675782c1da5774525a06c711fd5e872c5db28518c350231c397f"} Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.186182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f6508928-d944-40b0-9640-0193417ba7bb","Type":"ContainerStarted","Data":"ee27b711350a26c440c2a46bfd3b6f13f0157bfd794e57241ed9b2617ac6091b"} Jan 21 15:04:16 crc 
kubenswrapper[4707]: I0121 15:04:16.198004 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.197991183 podStartE2EDuration="1.197991183s" podCreationTimestamp="2026-01-21 15:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:16.196851411 +0000 UTC m=+153.378367633" watchObservedRunningTime="2026-01-21 15:04:16.197991183 +0000 UTC m=+153.379507405" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.206979 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:16 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:16 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:16 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.207027 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.364919 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.457286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh79n\" (UniqueName: \"kubernetes.io/projected/5537fff0-7562-4e93-8db8-d13d18966e3d-kube-api-access-gh79n\") pod \"5537fff0-7562-4e93-8db8-d13d18966e3d\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.457343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5537fff0-7562-4e93-8db8-d13d18966e3d-config-volume\") pod \"5537fff0-7562-4e93-8db8-d13d18966e3d\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.457382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5537fff0-7562-4e93-8db8-d13d18966e3d-secret-volume\") pod \"5537fff0-7562-4e93-8db8-d13d18966e3d\" (UID: \"5537fff0-7562-4e93-8db8-d13d18966e3d\") " Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.457955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5537fff0-7562-4e93-8db8-d13d18966e3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "5537fff0-7562-4e93-8db8-d13d18966e3d" (UID: "5537fff0-7562-4e93-8db8-d13d18966e3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.461616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5537fff0-7562-4e93-8db8-d13d18966e3d-kube-api-access-gh79n" (OuterVolumeSpecName: "kube-api-access-gh79n") pod "5537fff0-7562-4e93-8db8-d13d18966e3d" (UID: "5537fff0-7562-4e93-8db8-d13d18966e3d"). InnerVolumeSpecName "kube-api-access-gh79n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.461622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5537fff0-7562-4e93-8db8-d13d18966e3d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5537fff0-7562-4e93-8db8-d13d18966e3d" (UID: "5537fff0-7562-4e93-8db8-d13d18966e3d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.558731 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh79n\" (UniqueName: \"kubernetes.io/projected/5537fff0-7562-4e93-8db8-d13d18966e3d-kube-api-access-gh79n\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.558762 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5537fff0-7562-4e93-8db8-d13d18966e3d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.558772 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5537fff0-7562-4e93-8db8-d13d18966e3d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.832203 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pd85"] Jan 21 15:04:16 crc kubenswrapper[4707]: E0121 15:04:16.832378 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5537fff0-7562-4e93-8db8-d13d18966e3d" containerName="collect-profiles" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.832396 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5537fff0-7562-4e93-8db8-d13d18966e3d" containerName="collect-profiles" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.832479 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5537fff0-7562-4e93-8db8-d13d18966e3d" containerName="collect-profiles" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.833099 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.834756 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.841330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pd85"] Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.962953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-utilities\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.963009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsn8\" (UniqueName: \"kubernetes.io/projected/10e83da2-c4ad-415c-9c64-7507e2864b39-kube-api-access-clsn8\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:16 crc kubenswrapper[4707]: I0121 15:04:16.963040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-catalog-content\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.064275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-utilities\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.064373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clsn8\" (UniqueName: \"kubernetes.io/projected/10e83da2-c4ad-415c-9c64-7507e2864b39-kube-api-access-clsn8\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.064416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-catalog-content\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.065548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-catalog-content\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.066070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-utilities\") pod \"redhat-marketplace-6pd85\" (UID: 
\"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.082671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsn8\" (UniqueName: \"kubernetes.io/projected/10e83da2-c4ad-415c-9c64-7507e2864b39-kube-api-access-clsn8\") pod \"redhat-marketplace-6pd85\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.144156 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.199947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" event={"ID":"5537fff0-7562-4e93-8db8-d13d18966e3d","Type":"ContainerDied","Data":"ae73ba976a5a0b0c7da9f0c9f2f608f36318b085640a7c9f58bc0d38aa0da08e"} Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.199973 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.199983 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae73ba976a5a0b0c7da9f0c9f2f608f36318b085640a7c9f58bc0d38aa0da08e" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.202024 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:17 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:17 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:17 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.202082 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.204211 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6508928-d944-40b0-9640-0193417ba7bb" containerID="5e1b622cf345675782c1da5774525a06c711fd5e872c5db28518c350231c397f" exitCode=0 Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.204248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f6508928-d944-40b0-9640-0193417ba7bb","Type":"ContainerDied","Data":"5e1b622cf345675782c1da5774525a06c711fd5e872c5db28518c350231c397f"} Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.241522 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7lr"] Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.242680 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.249175 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7lr"] Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.372511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwc8\" (UniqueName: \"kubernetes.io/projected/637d112b-95a3-4c37-889c-273c86fee7a9-kube-api-access-fvwc8\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.372733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-catalog-content\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.372800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-utilities\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.423610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pd85"] Jan 21 15:04:17 crc kubenswrapper[4707]: W0121 15:04:17.452882 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e83da2_c4ad_415c_9c64_7507e2864b39.slice/crio-5f04c1e33ed9e70b91201d7533819725b096aea074e9c846aa4da5af374e6011 WatchSource:0}: Error finding container 5f04c1e33ed9e70b91201d7533819725b096aea074e9c846aa4da5af374e6011: Status 404 returned error can't find the container with id 5f04c1e33ed9e70b91201d7533819725b096aea074e9c846aa4da5af374e6011 Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.474275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-utilities\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.474348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwc8\" (UniqueName: \"kubernetes.io/projected/637d112b-95a3-4c37-889c-273c86fee7a9-kube-api-access-fvwc8\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.474368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-catalog-content\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.474776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-catalog-content\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.474989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-utilities\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.489104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwc8\" (UniqueName: \"kubernetes.io/projected/637d112b-95a3-4c37-889c-273c86fee7a9-kube-api-access-fvwc8\") pod \"redhat-marketplace-cf7lr\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.567102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.761634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7lr"] Jan 21 15:04:17 crc kubenswrapper[4707]: W0121 15:04:17.773988 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod637d112b_95a3_4c37_889c_273c86fee7a9.slice/crio-97692c92ca0527d38b4bcdce070fa8e3ef09ccd6c1b901c4e73de07871b32557 WatchSource:0}: Error finding container 97692c92ca0527d38b4bcdce070fa8e3ef09ccd6c1b901c4e73de07871b32557: Status 404 returned error can't find the container with id 97692c92ca0527d38b4bcdce070fa8e3ef09ccd6c1b901c4e73de07871b32557 Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.829457 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2vhtb"] Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.830452 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.834781 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.839407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vhtb"] Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.982994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-utilities\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.983178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cp86\" (UniqueName: \"kubernetes.io/projected/f4e7d97d-3ff8-43f3-86b0-14de31de7179-kube-api-access-6cp86\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:17 crc kubenswrapper[4707]: I0121 15:04:17.983237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-catalog-content\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.084790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cp86\" (UniqueName: \"kubernetes.io/projected/f4e7d97d-3ff8-43f3-86b0-14de31de7179-kube-api-access-6cp86\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.084870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-catalog-content\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.084927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-utilities\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.085322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-catalog-content\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.085326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-utilities\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " 
pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.103403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cp86\" (UniqueName: \"kubernetes.io/projected/f4e7d97d-3ff8-43f3-86b0-14de31de7179-kube-api-access-6cp86\") pod \"redhat-operators-2vhtb\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.159149 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.207289 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:18 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:18 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:18 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.207337 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.231349 4707 generic.go:334] "Generic (PLEG): container finished" podID="637d112b-95a3-4c37-889c-273c86fee7a9" containerID="898594e6842063de193765cab0d1466dd79f6dbb4612846b767573fc93a64507" exitCode=0 Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.231409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7lr" event={"ID":"637d112b-95a3-4c37-889c-273c86fee7a9","Type":"ContainerDied","Data":"898594e6842063de193765cab0d1466dd79f6dbb4612846b767573fc93a64507"} Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.231432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7lr" event={"ID":"637d112b-95a3-4c37-889c-273c86fee7a9","Type":"ContainerStarted","Data":"97692c92ca0527d38b4bcdce070fa8e3ef09ccd6c1b901c4e73de07871b32557"} Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.231932 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cjs7j"] Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.232844 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.240245 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjs7j"] Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.249562 4707 generic.go:334] "Generic (PLEG): container finished" podID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerID="b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2" exitCode=0 Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.250947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pd85" event={"ID":"10e83da2-c4ad-415c-9c64-7507e2864b39","Type":"ContainerDied","Data":"b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2"} Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.250986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pd85" event={"ID":"10e83da2-c4ad-415c-9c64-7507e2864b39","Type":"ContainerStarted","Data":"5f04c1e33ed9e70b91201d7533819725b096aea074e9c846aa4da5af374e6011"} Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.394141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-utilities\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.394198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-catalog-content\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.394250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wz5w\" (UniqueName: \"kubernetes.io/projected/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-kube-api-access-9wz5w\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.406609 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vhtb"] Jan 21 15:04:18 crc kubenswrapper[4707]: W0121 15:04:18.432068 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e7d97d_3ff8_43f3_86b0_14de31de7179.slice/crio-5900bb92016cb2d4aaca53587291bcd9dc1418158d5af7111b54a595ab4a7b46 WatchSource:0}: Error finding container 5900bb92016cb2d4aaca53587291bcd9dc1418158d5af7111b54a595ab4a7b46: Status 404 returned error can't find the container with id 5900bb92016cb2d4aaca53587291bcd9dc1418158d5af7111b54a595ab4a7b46 Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.459230 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.496396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-utilities\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.496528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-catalog-content\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.497090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-catalog-content\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.497406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-utilities\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.497671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wz5w\" (UniqueName: \"kubernetes.io/projected/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-kube-api-access-9wz5w\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.512567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wz5w\" (UniqueName: \"kubernetes.io/projected/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-kube-api-access-9wz5w\") pod \"redhat-operators-cjs7j\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.548180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.598827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6508928-d944-40b0-9640-0193417ba7bb-kubelet-dir\") pod \"f6508928-d944-40b0-9640-0193417ba7bb\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.598875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6508928-d944-40b0-9640-0193417ba7bb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f6508928-d944-40b0-9640-0193417ba7bb" (UID: "f6508928-d944-40b0-9640-0193417ba7bb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.598987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6508928-d944-40b0-9640-0193417ba7bb-kube-api-access\") pod \"f6508928-d944-40b0-9640-0193417ba7bb\" (UID: \"f6508928-d944-40b0-9640-0193417ba7bb\") " Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.599388 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6508928-d944-40b0-9640-0193417ba7bb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.601170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6508928-d944-40b0-9640-0193417ba7bb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f6508928-d944-40b0-9640-0193417ba7bb" (UID: "f6508928-d944-40b0-9640-0193417ba7bb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.701129 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6508928-d944-40b0-9640-0193417ba7bb-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.744304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.744337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.748331 4707 patch_prober.go:28] interesting pod/console-f9d7485db-c6ckn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.748412 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-c6ckn" podUID="0ebc35b7-89c9-490c-9756-6d532a2411ed" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.802136 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6kmlc" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.808759 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjs7j"] Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.961263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.961549 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.967959 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.997845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 15:04:18 
crc kubenswrapper[4707]: E0121 15:04:18.998332 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6508928-d944-40b0-9640-0193417ba7bb" containerName="pruner" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.998344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6508928-d944-40b0-9640-0193417ba7bb" containerName="pruner" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.998439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6508928-d944-40b0-9640-0193417ba7bb" containerName="pruner" Jan 21 15:04:18 crc kubenswrapper[4707]: I0121 15:04:18.998844 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.003911 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.006726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.014083 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.065830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.066066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.071652 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.105704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.105904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.198872 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.201800 4707 patch_prober.go:28] interesting pod/router-default-5444994796-k84wt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:04:19 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 21 15:04:19 crc kubenswrapper[4707]: [+]process-running ok Jan 21 15:04:19 crc kubenswrapper[4707]: healthz check failed Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.201897 4707 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-k84wt" podUID="56397290-3efc-4453-8cf4-ceda50f383f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.207656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.207729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.207928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.231296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.261084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f6508928-d944-40b0-9640-0193417ba7bb","Type":"ContainerDied","Data":"ee27b711350a26c440c2a46bfd3b6f13f0157bfd794e57241ed9b2617ac6091b"} Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.261125 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee27b711350a26c440c2a46bfd3b6f13f0157bfd794e57241ed9b2617ac6091b" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.261202 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.267496 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerID="fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c" exitCode=0 Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.267530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vhtb" event={"ID":"f4e7d97d-3ff8-43f3-86b0-14de31de7179","Type":"ContainerDied","Data":"fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c"} Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.267564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vhtb" event={"ID":"f4e7d97d-3ff8-43f3-86b0-14de31de7179","Type":"ContainerStarted","Data":"5900bb92016cb2d4aaca53587291bcd9dc1418158d5af7111b54a595ab4a7b46"} Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.274715 4707 generic.go:334] "Generic (PLEG): container finished" podID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerID="a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e" exitCode=0 Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.275009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjs7j" event={"ID":"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74","Type":"ContainerDied","Data":"a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e"} Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.275078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjs7j" event={"ID":"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74","Type":"ContainerStarted","Data":"69bea011173c32015ce719bd9a8128eb342f9fae7af21d63c57d716d1b87ab55"} Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.279895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-v44w7" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.283433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4t9v9" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.339127 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:19 crc kubenswrapper[4707]: I0121 15:04:19.673116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 15:04:20 crc kubenswrapper[4707]: I0121 15:04:20.205140 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:20 crc kubenswrapper[4707]: I0121 15:04:20.209307 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-k84wt" Jan 21 15:04:21 crc kubenswrapper[4707]: I0121 15:04:21.597131 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z4txk" Jan 21 15:04:22 crc kubenswrapper[4707]: W0121 15:04:22.683129 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1b749e86_4dc7_4888_a1ac_b1e3adc918be.slice/crio-b5e6f1472054e20ec2e65425c2f4b464b70de2234d013419fc2c3283a523d7a2 WatchSource:0}: Error finding container b5e6f1472054e20ec2e65425c2f4b464b70de2234d013419fc2c3283a523d7a2: Status 404 returned error can't find the container with id b5e6f1472054e20ec2e65425c2f4b464b70de2234d013419fc2c3283a523d7a2 Jan 21 15:04:23 crc kubenswrapper[4707]: I0121 15:04:23.318939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1b749e86-4dc7-4888-a1ac-b1e3adc918be","Type":"ContainerStarted","Data":"b5e6f1472054e20ec2e65425c2f4b464b70de2234d013419fc2c3283a523d7a2"} Jan 21 15:04:24 crc kubenswrapper[4707]: I0121 15:04:24.325503 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b749e86-4dc7-4888-a1ac-b1e3adc918be" containerID="bd08a394d5baf2916881e8ca0ea241c8ec1b0f6e8a41027433679aba90370138" exitCode=0 Jan 21 15:04:24 crc kubenswrapper[4707]: I0121 15:04:24.325657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1b749e86-4dc7-4888-a1ac-b1e3adc918be","Type":"ContainerDied","Data":"bd08a394d5baf2916881e8ca0ea241c8ec1b0f6e8a41027433679aba90370138"} Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.547770 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.685497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kube-api-access\") pod \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.685605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kubelet-dir\") pod \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\" (UID: \"1b749e86-4dc7-4888-a1ac-b1e3adc918be\") " Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.685710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1b749e86-4dc7-4888-a1ac-b1e3adc918be" (UID: "1b749e86-4dc7-4888-a1ac-b1e3adc918be"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.685917 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.692063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1b749e86-4dc7-4888-a1ac-b1e3adc918be" (UID: "1b749e86-4dc7-4888-a1ac-b1e3adc918be"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:25 crc kubenswrapper[4707]: I0121 15:04:25.787238 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b749e86-4dc7-4888-a1ac-b1e3adc918be-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:26 crc kubenswrapper[4707]: I0121 15:04:26.335348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1b749e86-4dc7-4888-a1ac-b1e3adc918be","Type":"ContainerDied","Data":"b5e6f1472054e20ec2e65425c2f4b464b70de2234d013419fc2c3283a523d7a2"} Jan 21 15:04:26 crc kubenswrapper[4707]: I0121 15:04:26.335386 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e6f1472054e20ec2e65425c2f4b464b70de2234d013419fc2c3283a523d7a2" Jan 21 15:04:26 crc kubenswrapper[4707]: I0121 15:04:26.335432 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:04:28 crc kubenswrapper[4707]: I0121 15:04:28.021339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:04:28 crc kubenswrapper[4707]: I0121 15:04:28.037972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5fb5fe4-8f42-4057-b731-b2c8da0661e3-metrics-certs\") pod \"network-metrics-daemon-62mww\" (UID: \"d5fb5fe4-8f42-4057-b731-b2c8da0661e3\") " pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:04:28 crc kubenswrapper[4707]: I0121 15:04:28.093653 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62mww" Jan 21 15:04:28 crc kubenswrapper[4707]: I0121 15:04:28.748044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:28 crc kubenswrapper[4707]: I0121 15:04:28.751195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:04:30 crc kubenswrapper[4707]: I0121 15:04:30.614115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9qt5"] Jan 21 15:04:30 crc kubenswrapper[4707]: I0121 15:04:30.614449 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" podUID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" containerName="controller-manager" containerID="cri-o://8c7602952d14253b6519f06ca054bc1e5c86a7f8ff7f27134348c79d254defd4" gracePeriod=30 Jan 21 15:04:30 crc kubenswrapper[4707]: I0121 15:04:30.634380 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx"] Jan 21 15:04:30 crc kubenswrapper[4707]: I0121 15:04:30.634554 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" podUID="60cbb465-ee79-49db-8488-943c0b44334b" containerName="route-controller-manager" containerID="cri-o://c43e2b358a664619361465fee12959c77a9c53e9e1f0ffd92fb6a5da3fb07573" gracePeriod=30 Jan 21 15:04:31 crc kubenswrapper[4707]: I0121 15:04:31.364223 4707 generic.go:334] "Generic (PLEG): container finished" podID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" containerID="8c7602952d14253b6519f06ca054bc1e5c86a7f8ff7f27134348c79d254defd4" exitCode=0 Jan 21 15:04:31 crc kubenswrapper[4707]: I0121 15:04:31.364286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" event={"ID":"60f4a311-31b0-4a1e-b5a0-9209f96340c7","Type":"ContainerDied","Data":"8c7602952d14253b6519f06ca054bc1e5c86a7f8ff7f27134348c79d254defd4"} Jan 21 15:04:31 crc kubenswrapper[4707]: I0121 15:04:31.365754 4707 generic.go:334] "Generic (PLEG): container finished" podID="60cbb465-ee79-49db-8488-943c0b44334b" containerID="c43e2b358a664619361465fee12959c77a9c53e9e1f0ffd92fb6a5da3fb07573" exitCode=0 Jan 21 15:04:31 crc kubenswrapper[4707]: I0121 15:04:31.365777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" event={"ID":"60cbb465-ee79-49db-8488-943c0b44334b","Type":"ContainerDied","Data":"c43e2b358a664619361465fee12959c77a9c53e9e1f0ffd92fb6a5da3fb07573"} Jan 21 15:04:34 crc kubenswrapper[4707]: I0121 15:04:34.593639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.280342 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.336557 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr"] Jan 21 15:04:35 crc kubenswrapper[4707]: E0121 15:04:35.336915 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cbb465-ee79-49db-8488-943c0b44334b" containerName="route-controller-manager" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.336934 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cbb465-ee79-49db-8488-943c0b44334b" containerName="route-controller-manager" Jan 21 15:04:35 crc kubenswrapper[4707]: E0121 15:04:35.336967 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b749e86-4dc7-4888-a1ac-b1e3adc918be" containerName="pruner" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.336974 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b749e86-4dc7-4888-a1ac-b1e3adc918be" containerName="pruner" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.337125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b749e86-4dc7-4888-a1ac-b1e3adc918be" containerName="pruner" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.337147 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cbb465-ee79-49db-8488-943c0b44334b" containerName="route-controller-manager" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.338220 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.340387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr"] Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.390144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" event={"ID":"60cbb465-ee79-49db-8488-943c0b44334b","Type":"ContainerDied","Data":"58953f61c72096c829c04d2eb2679981b1f4d62e70d58fb41076b42d3f274310"} Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.390187 4707 scope.go:117] "RemoveContainer" containerID="c43e2b358a664619361465fee12959c77a9c53e9e1f0ffd92fb6a5da3fb07573" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.390452 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cbb465-ee79-49db-8488-943c0b44334b-serving-cert\") pod \"60cbb465-ee79-49db-8488-943c0b44334b\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2clmf\" (UniqueName: \"kubernetes.io/projected/60cbb465-ee79-49db-8488-943c0b44334b-kube-api-access-2clmf\") pod \"60cbb465-ee79-49db-8488-943c0b44334b\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-client-ca\") pod \"60cbb465-ee79-49db-8488-943c0b44334b\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-config\") pod \"60cbb465-ee79-49db-8488-943c0b44334b\" (UID: \"60cbb465-ee79-49db-8488-943c0b44334b\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-client-ca\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-config\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlzn\" (UniqueName: \"kubernetes.io/projected/59709653-4cc2-41de-aa1d-0e64487a84e3-kube-api-access-xnlzn\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.429936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59709653-4cc2-41de-aa1d-0e64487a84e3-serving-cert\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.431131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-config" (OuterVolumeSpecName: 
"config") pod "60cbb465-ee79-49db-8488-943c0b44334b" (UID: "60cbb465-ee79-49db-8488-943c0b44334b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.431529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-client-ca" (OuterVolumeSpecName: "client-ca") pod "60cbb465-ee79-49db-8488-943c0b44334b" (UID: "60cbb465-ee79-49db-8488-943c0b44334b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.437182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cbb465-ee79-49db-8488-943c0b44334b-kube-api-access-2clmf" (OuterVolumeSpecName: "kube-api-access-2clmf") pod "60cbb465-ee79-49db-8488-943c0b44334b" (UID: "60cbb465-ee79-49db-8488-943c0b44334b"). InnerVolumeSpecName "kube-api-access-2clmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.438690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60cbb465-ee79-49db-8488-943c0b44334b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60cbb465-ee79-49db-8488-943c0b44334b" (UID: "60cbb465-ee79-49db-8488-943c0b44334b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.469551 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.485799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-62mww"] Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-config\") pod \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-client-ca\") pod \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-proxy-ca-bundles\") pod \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f4a311-31b0-4a1e-b5a0-9209f96340c7-serving-cert\") pod \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn6pq\" (UniqueName: \"kubernetes.io/projected/60f4a311-31b0-4a1e-b5a0-9209f96340c7-kube-api-access-xn6pq\") 
pod \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\" (UID: \"60f4a311-31b0-4a1e-b5a0-9209f96340c7\") " Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-client-ca\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-config\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlzn\" (UniqueName: \"kubernetes.io/projected/59709653-4cc2-41de-aa1d-0e64487a84e3-kube-api-access-xnlzn\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59709653-4cc2-41de-aa1d-0e64487a84e3-serving-cert\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530840 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2clmf\" (UniqueName: \"kubernetes.io/projected/60cbb465-ee79-49db-8488-943c0b44334b-kube-api-access-2clmf\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530853 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530862 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cbb465-ee79-49db-8488-943c0b44334b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.530871 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cbb465-ee79-49db-8488-943c0b44334b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.532608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "60f4a311-31b0-4a1e-b5a0-9209f96340c7" (UID: "60f4a311-31b0-4a1e-b5a0-9209f96340c7"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.532987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-config" (OuterVolumeSpecName: "config") pod "60f4a311-31b0-4a1e-b5a0-9209f96340c7" (UID: "60f4a311-31b0-4a1e-b5a0-9209f96340c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.533684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-config\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.534755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-client-ca\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.534857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "60f4a311-31b0-4a1e-b5a0-9209f96340c7" (UID: "60f4a311-31b0-4a1e-b5a0-9209f96340c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.534950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59709653-4cc2-41de-aa1d-0e64487a84e3-serving-cert\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.537452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f4a311-31b0-4a1e-b5a0-9209f96340c7-kube-api-access-xn6pq" (OuterVolumeSpecName: "kube-api-access-xn6pq") pod "60f4a311-31b0-4a1e-b5a0-9209f96340c7" (UID: "60f4a311-31b0-4a1e-b5a0-9209f96340c7"). InnerVolumeSpecName "kube-api-access-xn6pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.538089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f4a311-31b0-4a1e-b5a0-9209f96340c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60f4a311-31b0-4a1e-b5a0-9209f96340c7" (UID: "60f4a311-31b0-4a1e-b5a0-9209f96340c7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.546945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlzn\" (UniqueName: \"kubernetes.io/projected/59709653-4cc2-41de-aa1d-0e64487a84e3-kube-api-access-xnlzn\") pod \"route-controller-manager-cf855f677-kxvrr\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.632024 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.632059 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f4a311-31b0-4a1e-b5a0-9209f96340c7-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.632068 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn6pq\" (UniqueName: \"kubernetes.io/projected/60f4a311-31b0-4a1e-b5a0-9209f96340c7-kube-api-access-xn6pq\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.632081 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.632091 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f4a311-31b0-4a1e-b5a0-9209f96340c7-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.660098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.756983 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx"] Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.765055 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t6stx"] Jan 21 15:04:35 crc kubenswrapper[4707]: I0121 15:04:35.841634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr"] Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.396383 4707 generic.go:334] "Generic (PLEG): container finished" podID="beb79892-2b5e-47cf-9381-08a5654df324" containerID="5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.396425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtrmt" event={"ID":"beb79892-2b5e-47cf-9381-08a5654df324","Type":"ContainerDied","Data":"5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.400006 4707 generic.go:334] "Generic (PLEG): container finished" podID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerID="001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.400069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjs7j" event={"ID":"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74","Type":"ContainerDied","Data":"001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.402253 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.403720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9qt5" event={"ID":"60f4a311-31b0-4a1e-b5a0-9209f96340c7","Type":"ContainerDied","Data":"89e64b120d455663a394261526f2a0d78502ad22520872423138b0f07abbfecc"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.403760 4707 scope.go:117] "RemoveContainer" containerID="8c7602952d14253b6519f06ca054bc1e5c86a7f8ff7f27134348c79d254defd4" Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.405475 4707 generic.go:334] "Generic (PLEG): container finished" podID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerID="a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.405517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pd85" event={"ID":"10e83da2-c4ad-415c-9c64-7507e2864b39","Type":"ContainerDied","Data":"a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.407170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" event={"ID":"59709653-4cc2-41de-aa1d-0e64487a84e3","Type":"ContainerStarted","Data":"7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.407188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" event={"ID":"59709653-4cc2-41de-aa1d-0e64487a84e3","Type":"ContainerStarted","Data":"d23459563487b819df8f32fe2369113162820b26638b1bab33bcbfe617651a40"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.407697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.412161 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerID="9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.412233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2nd" event={"ID":"d6840e11-1e4b-47d4-8a4d-cc857f791915","Type":"ContainerDied","Data":"9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.413072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.414155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62mww" event={"ID":"d5fb5fe4-8f42-4057-b731-b2c8da0661e3","Type":"ContainerStarted","Data":"336b700398b282f6dfc50753dc367c63da73e2cd0a97a6dbe5b3c9f507e9ef48"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.414208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62mww" event={"ID":"d5fb5fe4-8f42-4057-b731-b2c8da0661e3","Type":"ContainerStarted","Data":"f737db4797229c21a5f85d309721c226a4e74ded875afe4dd383f7f604b44ab9"} Jan 21 
15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.414221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62mww" event={"ID":"d5fb5fe4-8f42-4057-b731-b2c8da0661e3","Type":"ContainerStarted","Data":"739ba8ad58123f9866dc8288d5a57155f7a0bc9281ddbbb4f869f85b55f3e58d"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.418987 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerID="4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.419038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vhtb" event={"ID":"f4e7d97d-3ff8-43f3-86b0-14de31de7179","Type":"ContainerDied","Data":"4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.424051 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" podStartSLOduration=6.424036666 podStartE2EDuration="6.424036666s" podCreationTimestamp="2026-01-21 15:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:36.423853675 +0000 UTC m=+173.605369897" watchObservedRunningTime="2026-01-21 15:04:36.424036666 +0000 UTC m=+173.605552888" Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.428626 4707 generic.go:334] "Generic (PLEG): container finished" podID="1225cfb7-54e0-493f-b631-8743502cc57a" containerID="7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.428696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gph2" event={"ID":"1225cfb7-54e0-493f-b631-8743502cc57a","Type":"ContainerDied","Data":"7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.431757 4707 generic.go:334] "Generic (PLEG): container finished" podID="637d112b-95a3-4c37-889c-273c86fee7a9" containerID="849961b0084be8fd803a03d67237f26fcf911b05a9ff8af8c79af2f3ddd92c1f" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.431797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7lr" event={"ID":"637d112b-95a3-4c37-889c-273c86fee7a9","Type":"ContainerDied","Data":"849961b0084be8fd803a03d67237f26fcf911b05a9ff8af8c79af2f3ddd92c1f"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.437285 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerID="c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538" exitCode=0 Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.437322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncnxd" event={"ID":"0e136188-e9e8-4eea-bd22-e01006fadce4","Type":"ContainerDied","Data":"c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538"} Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.489831 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9qt5"] Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.493011 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-x9qt5"] Jan 21 15:04:36 crc kubenswrapper[4707]: I0121 15:04:36.544499 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-62mww" podStartSLOduration=150.544485673 podStartE2EDuration="2m30.544485673s" podCreationTimestamp="2026-01-21 15:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:36.540994463 +0000 UTC m=+173.722510684" watchObservedRunningTime="2026-01-21 15:04:36.544485673 +0000 UTC m=+173.726001895" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.188196 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cbb465-ee79-49db-8488-943c0b44334b" path="/var/lib/kubelet/pods/60cbb465-ee79-49db-8488-943c0b44334b/volumes" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.188981 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" path="/var/lib/kubelet/pods/60f4a311-31b0-4a1e-b5a0-9209f96340c7/volumes" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.445396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vhtb" event={"ID":"f4e7d97d-3ff8-43f3-86b0-14de31de7179","Type":"ContainerStarted","Data":"b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.447739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjs7j" event={"ID":"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74","Type":"ContainerStarted","Data":"95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.450791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gph2" event={"ID":"1225cfb7-54e0-493f-b631-8743502cc57a","Type":"ContainerStarted","Data":"bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.452677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncnxd" event={"ID":"0e136188-e9e8-4eea-bd22-e01006fadce4","Type":"ContainerStarted","Data":"ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.455370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtrmt" event={"ID":"beb79892-2b5e-47cf-9381-08a5654df324","Type":"ContainerStarted","Data":"7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.456964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pd85" event={"ID":"10e83da2-c4ad-415c-9c64-7507e2864b39","Type":"ContainerStarted","Data":"9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.458391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7lr" event={"ID":"637d112b-95a3-4c37-889c-273c86fee7a9","Type":"ContainerStarted","Data":"089cd92dc4e5c6403556b1fe51ef61f87ceede8a5e07b6b647e163f77f6ef958"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.460550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-sn2nd" event={"ID":"d6840e11-1e4b-47d4-8a4d-cc857f791915","Type":"ContainerStarted","Data":"41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f"} Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.464415 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2vhtb" podStartSLOduration=2.562408381 podStartE2EDuration="20.46440592s" podCreationTimestamp="2026-01-21 15:04:17 +0000 UTC" firstStartedPulling="2026-01-21 15:04:19.268738494 +0000 UTC m=+156.450254716" lastFinishedPulling="2026-01-21 15:04:37.170736034 +0000 UTC m=+174.352252255" observedRunningTime="2026-01-21 15:04:37.464278573 +0000 UTC m=+174.645794795" watchObservedRunningTime="2026-01-21 15:04:37.46440592 +0000 UTC m=+174.645922142" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.480035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtrmt" podStartSLOduration=2.652613962 podStartE2EDuration="23.48002178s" podCreationTimestamp="2026-01-21 15:04:14 +0000 UTC" firstStartedPulling="2026-01-21 15:04:16.183251987 +0000 UTC m=+153.364768208" lastFinishedPulling="2026-01-21 15:04:37.010659803 +0000 UTC m=+174.192176026" observedRunningTime="2026-01-21 15:04:37.477884219 +0000 UTC m=+174.659400441" watchObservedRunningTime="2026-01-21 15:04:37.48002178 +0000 UTC m=+174.661538001" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.489851 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cf7lr" podStartSLOduration=1.5750843159999999 podStartE2EDuration="20.489834482s" podCreationTimestamp="2026-01-21 15:04:17 +0000 UTC" firstStartedPulling="2026-01-21 15:04:18.234275058 +0000 UTC m=+155.415791280" lastFinishedPulling="2026-01-21 15:04:37.149025224 +0000 UTC m=+174.330541446" observedRunningTime="2026-01-21 15:04:37.4881633 +0000 UTC m=+174.669679521" watchObservedRunningTime="2026-01-21 15:04:37.489834482 +0000 UTC m=+174.671350704" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.501266 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gph2" podStartSLOduration=1.531418323 podStartE2EDuration="22.501251372s" podCreationTimestamp="2026-01-21 15:04:15 +0000 UTC" firstStartedPulling="2026-01-21 15:04:16.18469836 +0000 UTC m=+153.366214581" lastFinishedPulling="2026-01-21 15:04:37.154531408 +0000 UTC m=+174.336047630" observedRunningTime="2026-01-21 15:04:37.49982723 +0000 UTC m=+174.681343442" watchObservedRunningTime="2026-01-21 15:04:37.501251372 +0000 UTC m=+174.682767595" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.516314 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sn2nd" podStartSLOduration=1.596867693 podStartE2EDuration="22.516299545s" podCreationTimestamp="2026-01-21 15:04:15 +0000 UTC" firstStartedPulling="2026-01-21 15:04:16.181026392 +0000 UTC m=+153.362542614" lastFinishedPulling="2026-01-21 15:04:37.100458244 +0000 UTC m=+174.281974466" observedRunningTime="2026-01-21 15:04:37.513978983 +0000 UTC m=+174.695495206" watchObservedRunningTime="2026-01-21 15:04:37.516299545 +0000 UTC m=+174.697815767" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.527041 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pd85" 
podStartSLOduration=2.71889691 podStartE2EDuration="21.527028404s" podCreationTimestamp="2026-01-21 15:04:16 +0000 UTC" firstStartedPulling="2026-01-21 15:04:18.254273829 +0000 UTC m=+155.435790051" lastFinishedPulling="2026-01-21 15:04:37.062405323 +0000 UTC m=+174.243921545" observedRunningTime="2026-01-21 15:04:37.52546418 +0000 UTC m=+174.706980403" watchObservedRunningTime="2026-01-21 15:04:37.527028404 +0000 UTC m=+174.708544626" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.540635 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cjs7j" podStartSLOduration=1.61715626 podStartE2EDuration="19.540620484s" podCreationTimestamp="2026-01-21 15:04:18 +0000 UTC" firstStartedPulling="2026-01-21 15:04:19.276702684 +0000 UTC m=+156.458218907" lastFinishedPulling="2026-01-21 15:04:37.200166909 +0000 UTC m=+174.381683131" observedRunningTime="2026-01-21 15:04:37.539411013 +0000 UTC m=+174.720927234" watchObservedRunningTime="2026-01-21 15:04:37.540620484 +0000 UTC m=+174.722136707" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.551563 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ncnxd" podStartSLOduration=2.6510970069999997 podStartE2EDuration="23.551541422s" podCreationTimestamp="2026-01-21 15:04:14 +0000 UTC" firstStartedPulling="2026-01-21 15:04:16.179121905 +0000 UTC m=+153.360638127" lastFinishedPulling="2026-01-21 15:04:37.07956632 +0000 UTC m=+174.261082542" observedRunningTime="2026-01-21 15:04:37.549553129 +0000 UTC m=+174.731069351" watchObservedRunningTime="2026-01-21 15:04:37.551541422 +0000 UTC m=+174.733057644" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.568056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.568104 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.761862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65874f94f5-5s7hf"] Jan 21 15:04:37 crc kubenswrapper[4707]: E0121 15:04:37.762082 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" containerName="controller-manager" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.762101 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" containerName="controller-manager" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.762201 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f4a311-31b0-4a1e-b5a0-9209f96340c7" containerName="controller-manager" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.762567 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.764874 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.764888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.765069 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.765456 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.766292 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.766371 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.788850 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.804081 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65874f94f5-5s7hf"] Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.863796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0ff684-393c-4583-a3cf-bc18f9da9d69-serving-cert\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.863915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-client-ca\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.863984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-proxy-ca-bundles\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.864034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lnpd\" (UniqueName: \"kubernetes.io/projected/4c0ff684-393c-4583-a3cf-bc18f9da9d69-kube-api-access-4lnpd\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.864144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-config\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.965525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lnpd\" (UniqueName: \"kubernetes.io/projected/4c0ff684-393c-4583-a3cf-bc18f9da9d69-kube-api-access-4lnpd\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.965589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-config\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.965630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0ff684-393c-4583-a3cf-bc18f9da9d69-serving-cert\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.965663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-client-ca\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.965727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-proxy-ca-bundles\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.966785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-proxy-ca-bundles\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.967921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-config\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.968982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-client-ca\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" 
Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.975377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0ff684-393c-4583-a3cf-bc18f9da9d69-serving-cert\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:37 crc kubenswrapper[4707]: I0121 15:04:37.984998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lnpd\" (UniqueName: \"kubernetes.io/projected/4c0ff684-393c-4583-a3cf-bc18f9da9d69-kube-api-access-4lnpd\") pod \"controller-manager-65874f94f5-5s7hf\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.074841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.159902 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.160299 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.474241 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65874f94f5-5s7hf"] Jan 21 15:04:38 crc kubenswrapper[4707]: W0121 15:04:38.479222 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0ff684_393c_4583_a3cf_bc18f9da9d69.slice/crio-0a23b8dca7bbe184efeb16e07cb991861ae8e9556f4d8d9b0ec3ec7d803827e2 WatchSource:0}: Error finding container 0a23b8dca7bbe184efeb16e07cb991861ae8e9556f4d8d9b0ec3ec7d803827e2: Status 404 returned error can't find the container with id 0a23b8dca7bbe184efeb16e07cb991861ae8e9556f4d8d9b0ec3ec7d803827e2 Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.548822 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.549004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:38 crc kubenswrapper[4707]: I0121 15:04:38.635424 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-cf7lr" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="registry-server" probeResult="failure" output=< Jan 21 15:04:38 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 21 15:04:38 crc kubenswrapper[4707]: > Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.202522 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2vhtb" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="registry-server" probeResult="failure" output=< Jan 21 15:04:39 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 21 15:04:39 crc kubenswrapper[4707]: > Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.476970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" 
event={"ID":"4c0ff684-393c-4583-a3cf-bc18f9da9d69","Type":"ContainerStarted","Data":"10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516"} Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.477003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" event={"ID":"4c0ff684-393c-4583-a3cf-bc18f9da9d69","Type":"ContainerStarted","Data":"0a23b8dca7bbe184efeb16e07cb991861ae8e9556f4d8d9b0ec3ec7d803827e2"} Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.477783 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.490108 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.500745 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" podStartSLOduration=9.500731061 podStartE2EDuration="9.500731061s" podCreationTimestamp="2026-01-21 15:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:39.498151156 +0000 UTC m=+176.679667379" watchObservedRunningTime="2026-01-21 15:04:39.500731061 +0000 UTC m=+176.682247273" Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.578517 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cjs7j" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="registry-server" probeResult="failure" output=< Jan 21 15:04:39 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 21 15:04:39 crc kubenswrapper[4707]: > Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.945237 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:04:39 crc kubenswrapper[4707]: I0121 15:04:39.945282 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:04:44 crc kubenswrapper[4707]: I0121 15:04:44.950355 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:44 crc kubenswrapper[4707]: I0121 15:04:44.950738 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:44 crc kubenswrapper[4707]: I0121 15:04:44.980578 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.152083 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.152129 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.178638 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.420143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.420276 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.446454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.531120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.534355 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.539210 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.607363 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.607409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:45 crc kubenswrapper[4707]: I0121 15:04:45.633014 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:46 crc kubenswrapper[4707]: I0121 15:04:46.533968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.144972 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.145145 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.170874 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.486846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmpc8"] Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.550852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.599911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.639763 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:47 crc kubenswrapper[4707]: I0121 15:04:47.801248 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gph2"] Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.189853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.221123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.517281 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gph2" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="registry-server" containerID="cri-o://bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500" gracePeriod=2 Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.576445 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.605010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.801722 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sn2nd"] Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.802119 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sn2nd" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="registry-server" containerID="cri-o://41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f" gracePeriod=2 Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.903280 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.995591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-utilities\") pod \"1225cfb7-54e0-493f-b631-8743502cc57a\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.995656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzkb\" (UniqueName: \"kubernetes.io/projected/1225cfb7-54e0-493f-b631-8743502cc57a-kube-api-access-vvzkb\") pod \"1225cfb7-54e0-493f-b631-8743502cc57a\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " Jan 21 15:04:48 crc kubenswrapper[4707]: I0121 15:04:48.995683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-catalog-content\") pod \"1225cfb7-54e0-493f-b631-8743502cc57a\" (UID: \"1225cfb7-54e0-493f-b631-8743502cc57a\") " Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:48.996738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-utilities" (OuterVolumeSpecName: "utilities") pod "1225cfb7-54e0-493f-b631-8743502cc57a" (UID: "1225cfb7-54e0-493f-b631-8743502cc57a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.012019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1225cfb7-54e0-493f-b631-8743502cc57a-kube-api-access-vvzkb" (OuterVolumeSpecName: "kube-api-access-vvzkb") pod "1225cfb7-54e0-493f-b631-8743502cc57a" (UID: "1225cfb7-54e0-493f-b631-8743502cc57a"). InnerVolumeSpecName "kube-api-access-vvzkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.035968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1225cfb7-54e0-493f-b631-8743502cc57a" (UID: "1225cfb7-54e0-493f-b631-8743502cc57a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.096764 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.096795 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvzkb\" (UniqueName: \"kubernetes.io/projected/1225cfb7-54e0-493f-b631-8743502cc57a-kube-api-access-vvzkb\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.096820 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1225cfb7-54e0-493f-b631-8743502cc57a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.162592 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.298110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kfzn\" (UniqueName: \"kubernetes.io/projected/d6840e11-1e4b-47d4-8a4d-cc857f791915-kube-api-access-9kfzn\") pod \"d6840e11-1e4b-47d4-8a4d-cc857f791915\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.298217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-utilities\") pod \"d6840e11-1e4b-47d4-8a4d-cc857f791915\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.298246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-catalog-content\") pod \"d6840e11-1e4b-47d4-8a4d-cc857f791915\" (UID: \"d6840e11-1e4b-47d4-8a4d-cc857f791915\") " Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.298906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-utilities" (OuterVolumeSpecName: "utilities") pod "d6840e11-1e4b-47d4-8a4d-cc857f791915" (UID: "d6840e11-1e4b-47d4-8a4d-cc857f791915"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.300160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6840e11-1e4b-47d4-8a4d-cc857f791915-kube-api-access-9kfzn" (OuterVolumeSpecName: "kube-api-access-9kfzn") pod "d6840e11-1e4b-47d4-8a4d-cc857f791915" (UID: "d6840e11-1e4b-47d4-8a4d-cc857f791915"). InnerVolumeSpecName "kube-api-access-9kfzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.344037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6840e11-1e4b-47d4-8a4d-cc857f791915" (UID: "d6840e11-1e4b-47d4-8a4d-cc857f791915"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.398925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.399183 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.399271 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6840e11-1e4b-47d4-8a4d-cc857f791915-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.399585 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kfzn\" (UniqueName: \"kubernetes.io/projected/d6840e11-1e4b-47d4-8a4d-cc857f791915-kube-api-access-9kfzn\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.522723 4707 generic.go:334] "Generic (PLEG): container finished" podID="1225cfb7-54e0-493f-b631-8743502cc57a" containerID="bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500" exitCode=0 Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.522788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gph2" event={"ID":"1225cfb7-54e0-493f-b631-8743502cc57a","Type":"ContainerDied","Data":"bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500"} Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.522870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gph2" event={"ID":"1225cfb7-54e0-493f-b631-8743502cc57a","Type":"ContainerDied","Data":"3499d717d3075fe515aa4060713c72cdf02978dab0c2032a9e19dab13d8d4f73"} Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.522903 4707 scope.go:117] "RemoveContainer" containerID="bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.523104 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gph2" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.525524 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerID="41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f" exitCode=0 Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.525586 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sn2nd" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.525631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2nd" event={"ID":"d6840e11-1e4b-47d4-8a4d-cc857f791915","Type":"ContainerDied","Data":"41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f"} Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.526146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pl4s" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.526168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2nd" event={"ID":"d6840e11-1e4b-47d4-8a4d-cc857f791915","Type":"ContainerDied","Data":"c0ae0bbdf7492a5bb716b25e3d2ccbccd4646481dff2903d26ae9b62870a0561"} Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.540228 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gph2"] Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.542457 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gph2"] Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.542805 4707 scope.go:117] "RemoveContainer" containerID="7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.564495 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sn2nd"] Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.566323 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sn2nd"] Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.567995 4707 scope.go:117] "RemoveContainer" containerID="e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.581920 4707 scope.go:117] "RemoveContainer" containerID="bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500" Jan 21 15:04:49 crc kubenswrapper[4707]: E0121 15:04:49.583274 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500\": container with ID starting with bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500 not found: ID does not exist" containerID="bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.583363 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500"} err="failed to get container status \"bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500\": rpc error: code = NotFound desc = could not find container 
\"bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500\": container with ID starting with bb3f5bca224a59ca9fed2ce6aa41e64d3efd58473ef9d778bb282cbe258fc500 not found: ID does not exist" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.583465 4707 scope.go:117] "RemoveContainer" containerID="7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f" Jan 21 15:04:49 crc kubenswrapper[4707]: E0121 15:04:49.584165 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f\": container with ID starting with 7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f not found: ID does not exist" containerID="7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.584200 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f"} err="failed to get container status \"7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f\": rpc error: code = NotFound desc = could not find container \"7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f\": container with ID starting with 7f110514c09f66dd8b51f51fbab6876df6af87a046aa19e1b1a856cbb7d6f79f not found: ID does not exist" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.584215 4707 scope.go:117] "RemoveContainer" containerID="e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749" Jan 21 15:04:49 crc kubenswrapper[4707]: E0121 15:04:49.584421 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749\": container with ID starting with e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749 not found: ID does not exist" containerID="e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.584437 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749"} err="failed to get container status \"e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749\": rpc error: code = NotFound desc = could not find container \"e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749\": container with ID starting with e984d4f0c7305caf349e9c2ce53de626a3dbd3707d52636c72b978a223c63749 not found: ID does not exist" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.584460 4707 scope.go:117] "RemoveContainer" containerID="41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.596300 4707 scope.go:117] "RemoveContainer" containerID="9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.606345 4707 scope.go:117] "RemoveContainer" containerID="fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.619381 4707 scope.go:117] "RemoveContainer" containerID="41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f" Jan 21 15:04:49 crc kubenswrapper[4707]: E0121 15:04:49.619722 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f\": container with ID starting with 41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f not found: ID does not exist" containerID="41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.619754 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f"} err="failed to get container status \"41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f\": rpc error: code = NotFound desc = could not find container \"41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f\": container with ID starting with 41902278ccd79fe0f97c2fdfb884d4bdf83c791c3a6592db07d7dc661419939f not found: ID does not exist" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.619776 4707 scope.go:117] "RemoveContainer" containerID="9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2" Jan 21 15:04:49 crc kubenswrapper[4707]: E0121 15:04:49.620071 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2\": container with ID starting with 9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2 not found: ID does not exist" containerID="9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.620095 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2"} err="failed to get container status \"9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2\": rpc error: code = NotFound desc = could not find container \"9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2\": container with ID starting with 9c4d194b36478f2bb9aa366ba107af8ef7089455914a73223be2fb537f72b5d2 not found: ID does not exist" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.620111 4707 scope.go:117] "RemoveContainer" containerID="fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a" Jan 21 15:04:49 crc kubenswrapper[4707]: E0121 15:04:49.620293 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a\": container with ID starting with fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a not found: ID does not exist" containerID="fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a" Jan 21 15:04:49 crc kubenswrapper[4707]: I0121 15:04:49.620387 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a"} err="failed to get container status \"fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a\": rpc error: code = NotFound desc = could not find container \"fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a\": container with ID starting with fd66bf067134174ab58f80f44b052f99adeb6f47ed7956b96fe10ee0fffdcb8a not found: ID does not exist" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.201781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7lr"] Jan 21 
15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.202005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cf7lr" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="registry-server" containerID="cri-o://089cd92dc4e5c6403556b1fe51ef61f87ceede8a5e07b6b647e163f77f6ef958" gracePeriod=2 Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.532480 4707 generic.go:334] "Generic (PLEG): container finished" podID="637d112b-95a3-4c37-889c-273c86fee7a9" containerID="089cd92dc4e5c6403556b1fe51ef61f87ceede8a5e07b6b647e163f77f6ef958" exitCode=0 Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.532573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7lr" event={"ID":"637d112b-95a3-4c37-889c-273c86fee7a9","Type":"ContainerDied","Data":"089cd92dc4e5c6403556b1fe51ef61f87ceede8a5e07b6b647e163f77f6ef958"} Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.572075 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65874f94f5-5s7hf"] Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.572442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" podUID="4c0ff684-393c-4583-a3cf-bc18f9da9d69" containerName="controller-manager" containerID="cri-o://10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516" gracePeriod=30 Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.576878 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.667880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr"] Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.668056 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" podUID="59709653-4cc2-41de-aa1d-0e64487a84e3" containerName="route-controller-manager" containerID="cri-o://7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a" gracePeriod=30 Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.716289 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-catalog-content\") pod \"637d112b-95a3-4c37-889c-273c86fee7a9\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.716400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-utilities\") pod \"637d112b-95a3-4c37-889c-273c86fee7a9\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.716441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvwc8\" (UniqueName: \"kubernetes.io/projected/637d112b-95a3-4c37-889c-273c86fee7a9-kube-api-access-fvwc8\") pod \"637d112b-95a3-4c37-889c-273c86fee7a9\" (UID: \"637d112b-95a3-4c37-889c-273c86fee7a9\") " Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.717134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-utilities" (OuterVolumeSpecName: "utilities") pod "637d112b-95a3-4c37-889c-273c86fee7a9" (UID: "637d112b-95a3-4c37-889c-273c86fee7a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.722473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637d112b-95a3-4c37-889c-273c86fee7a9-kube-api-access-fvwc8" (OuterVolumeSpecName: "kube-api-access-fvwc8") pod "637d112b-95a3-4c37-889c-273c86fee7a9" (UID: "637d112b-95a3-4c37-889c-273c86fee7a9"). InnerVolumeSpecName "kube-api-access-fvwc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.737060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "637d112b-95a3-4c37-889c-273c86fee7a9" (UID: "637d112b-95a3-4c37-889c-273c86fee7a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.817858 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.817889 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvwc8\" (UniqueName: \"kubernetes.io/projected/637d112b-95a3-4c37-889c-273c86fee7a9-kube-api-access-fvwc8\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:50 crc kubenswrapper[4707]: I0121 15:04:50.817899 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d112b-95a3-4c37-889c-273c86fee7a9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.039650 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.050024 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.127637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-client-ca\") pod \"59709653-4cc2-41de-aa1d-0e64487a84e3\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.127689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-config\") pod \"59709653-4cc2-41de-aa1d-0e64487a84e3\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.127778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlzn\" (UniqueName: \"kubernetes.io/projected/59709653-4cc2-41de-aa1d-0e64487a84e3-kube-api-access-xnlzn\") pod \"59709653-4cc2-41de-aa1d-0e64487a84e3\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.127834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59709653-4cc2-41de-aa1d-0e64487a84e3-serving-cert\") pod \"59709653-4cc2-41de-aa1d-0e64487a84e3\" (UID: \"59709653-4cc2-41de-aa1d-0e64487a84e3\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.128328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-client-ca" (OuterVolumeSpecName: "client-ca") pod "59709653-4cc2-41de-aa1d-0e64487a84e3" (UID: "59709653-4cc2-41de-aa1d-0e64487a84e3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.128544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-config" (OuterVolumeSpecName: "config") pod "59709653-4cc2-41de-aa1d-0e64487a84e3" (UID: "59709653-4cc2-41de-aa1d-0e64487a84e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.130294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59709653-4cc2-41de-aa1d-0e64487a84e3-kube-api-access-xnlzn" (OuterVolumeSpecName: "kube-api-access-xnlzn") pod "59709653-4cc2-41de-aa1d-0e64487a84e3" (UID: "59709653-4cc2-41de-aa1d-0e64487a84e3"). InnerVolumeSpecName "kube-api-access-xnlzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.130317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59709653-4cc2-41de-aa1d-0e64487a84e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59709653-4cc2-41de-aa1d-0e64487a84e3" (UID: "59709653-4cc2-41de-aa1d-0e64487a84e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.187085 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" path="/var/lib/kubelet/pods/1225cfb7-54e0-493f-b631-8743502cc57a/volumes" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.187630 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" path="/var/lib/kubelet/pods/d6840e11-1e4b-47d4-8a4d-cc857f791915/volumes" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.228644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-client-ca\") pod \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.228682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-config\") pod \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.228718 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0ff684-393c-4583-a3cf-bc18f9da9d69-serving-cert\") pod \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.228758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-proxy-ca-bundles\") pod \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.228794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lnpd\" (UniqueName: \"kubernetes.io/projected/4c0ff684-393c-4583-a3cf-bc18f9da9d69-kube-api-access-4lnpd\") pod \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\" (UID: \"4c0ff684-393c-4583-a3cf-bc18f9da9d69\") " Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229061 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlzn\" (UniqueName: \"kubernetes.io/projected/59709653-4cc2-41de-aa1d-0e64487a84e3-kube-api-access-xnlzn\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229076 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59709653-4cc2-41de-aa1d-0e64487a84e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229085 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229093 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59709653-4cc2-41de-aa1d-0e64487a84e3-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4c0ff684-393c-4583-a3cf-bc18f9da9d69" (UID: "4c0ff684-393c-4583-a3cf-bc18f9da9d69"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c0ff684-393c-4583-a3cf-bc18f9da9d69" (UID: "4c0ff684-393c-4583-a3cf-bc18f9da9d69"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.229311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-config" (OuterVolumeSpecName: "config") pod "4c0ff684-393c-4583-a3cf-bc18f9da9d69" (UID: "4c0ff684-393c-4583-a3cf-bc18f9da9d69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.247223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0ff684-393c-4583-a3cf-bc18f9da9d69-kube-api-access-4lnpd" (OuterVolumeSpecName: "kube-api-access-4lnpd") pod "4c0ff684-393c-4583-a3cf-bc18f9da9d69" (UID: "4c0ff684-393c-4583-a3cf-bc18f9da9d69"). InnerVolumeSpecName "kube-api-access-4lnpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.269256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0ff684-393c-4583-a3cf-bc18f9da9d69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c0ff684-393c-4583-a3cf-bc18f9da9d69" (UID: "4c0ff684-393c-4583-a3cf-bc18f9da9d69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.330504 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0ff684-393c-4583-a3cf-bc18f9da9d69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.330535 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.330548 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lnpd\" (UniqueName: \"kubernetes.io/projected/4c0ff684-393c-4583-a3cf-bc18f9da9d69-kube-api-access-4lnpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.330557 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.330566 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0ff684-393c-4583-a3cf-bc18f9da9d69-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.540250 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c0ff684-393c-4583-a3cf-bc18f9da9d69" containerID="10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516" exitCode=0 Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.540293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" event={"ID":"4c0ff684-393c-4583-a3cf-bc18f9da9d69","Type":"ContainerDied","Data":"10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516"} Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.540313 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.540351 4707 scope.go:117] "RemoveContainer" containerID="10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.540340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65874f94f5-5s7hf" event={"ID":"4c0ff684-393c-4583-a3cf-bc18f9da9d69","Type":"ContainerDied","Data":"0a23b8dca7bbe184efeb16e07cb991861ae8e9556f4d8d9b0ec3ec7d803827e2"} Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.542568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7lr" event={"ID":"637d112b-95a3-4c37-889c-273c86fee7a9","Type":"ContainerDied","Data":"97692c92ca0527d38b4bcdce070fa8e3ef09ccd6c1b901c4e73de07871b32557"} Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.542629 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7lr" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.544425 4707 generic.go:334] "Generic (PLEG): container finished" podID="59709653-4cc2-41de-aa1d-0e64487a84e3" containerID="7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a" exitCode=0 Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.544463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" event={"ID":"59709653-4cc2-41de-aa1d-0e64487a84e3","Type":"ContainerDied","Data":"7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a"} Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.544476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" event={"ID":"59709653-4cc2-41de-aa1d-0e64487a84e3","Type":"ContainerDied","Data":"d23459563487b819df8f32fe2369113162820b26638b1bab33bcbfe617651a40"} Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.544532 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.553663 4707 scope.go:117] "RemoveContainer" containerID="10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.556934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr"] Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.557024 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516\": container with ID starting with 10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516 not found: ID does not exist" containerID="10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.557068 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516"} err="failed to get container status \"10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516\": rpc error: code = NotFound desc = could not find container \"10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516\": container with ID starting with 10983d91177ba15425fb40084aa7f552806e19a4dfc73ad90b63c91338e71516 not found: ID does not exist" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.557101 4707 scope.go:117] "RemoveContainer" containerID="089cd92dc4e5c6403556b1fe51ef61f87ceede8a5e07b6b647e163f77f6ef958" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.561097 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf855f677-kxvrr"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.564853 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7lr"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.566277 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7lr"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.572229 4707 scope.go:117] "RemoveContainer" 
containerID="849961b0084be8fd803a03d67237f26fcf911b05a9ff8af8c79af2f3ddd92c1f" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.572553 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65874f94f5-5s7hf"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.574352 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65874f94f5-5s7hf"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.583135 4707 scope.go:117] "RemoveContainer" containerID="898594e6842063de193765cab0d1466dd79f6dbb4612846b767573fc93a64507" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.593178 4707 scope.go:117] "RemoveContainer" containerID="7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.602979 4707 scope.go:117] "RemoveContainer" containerID="7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.603180 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a\": container with ID starting with 7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a not found: ID does not exist" containerID="7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.603201 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a"} err="failed to get container status \"7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a\": rpc error: code = NotFound desc = could not find container \"7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a\": container with ID starting with 7d4760202ed3551aed74a32b986285b1d80fb318a86004f2c3c45eac57fd860a not found: ID does not exist" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768196 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5684d8dd54-xhg4w"] Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768370 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="extract-content" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="extract-content" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768393 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59709653-4cc2-41de-aa1d-0e64487a84e3" containerName="route-controller-manager" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768398 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="59709653-4cc2-41de-aa1d-0e64487a84e3" containerName="route-controller-manager" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768411 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="extract-content" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="extract-content" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768424 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="extract-utilities" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768462 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="extract-utilities" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768471 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768486 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="extract-utilities" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768500 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="extract-utilities" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768509 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0ff684-393c-4583-a3cf-bc18f9da9d69" containerName="controller-manager" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768514 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0ff684-393c-4583-a3cf-bc18f9da9d69" containerName="controller-manager" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768521 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="extract-utilities" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768526 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="extract-utilities" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768532 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="extract-content" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768537 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="extract-content" Jan 21 15:04:51 crc kubenswrapper[4707]: E0121 15:04:51.768545 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768552 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="59709653-4cc2-41de-aa1d-0e64487a84e3" containerName="route-controller-manager" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768636 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" containerName="registry-server" Jan 21 15:04:51 crc 
kubenswrapper[4707]: I0121 15:04:51.768643 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6840e11-1e4b-47d4-8a4d-cc857f791915" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768651 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0ff684-393c-4583-a3cf-bc18f9da9d69" containerName="controller-manager" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768658 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1225cfb7-54e0-493f-b631-8743502cc57a" containerName="registry-server" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.768964 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.769954 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.770346 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.773337 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.773639 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.773729 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.773924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774234 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774298 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774350 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774409 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774598 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774704 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774882 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.774990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.778391 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.778524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5684d8dd54-xhg4w"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.781213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj"] Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.937950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-client-ca\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-config\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-config\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a81a24-a627-44f9-8a97-740d11c21dcf-serving-cert\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-client-ca\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799km\" (UniqueName: \"kubernetes.io/projected/c9a81a24-a627-44f9-8a97-740d11c21dcf-kube-api-access-799km\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-proxy-ca-bundles\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") 
" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrp8\" (UniqueName: \"kubernetes.io/projected/2a4510d1-485f-4588-8dc3-3ff405a6080e-kube-api-access-thrp8\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:51 crc kubenswrapper[4707]: I0121 15:04:51.938584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4510d1-485f-4588-8dc3-3ff405a6080e-serving-cert\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-proxy-ca-bundles\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4510d1-485f-4588-8dc3-3ff405a6080e-serving-cert\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrp8\" (UniqueName: \"kubernetes.io/projected/2a4510d1-485f-4588-8dc3-3ff405a6080e-kube-api-access-thrp8\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-client-ca\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-config\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-config\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039333 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a81a24-a627-44f9-8a97-740d11c21dcf-serving-cert\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-client-ca\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.039383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799km\" (UniqueName: \"kubernetes.io/projected/c9a81a24-a627-44f9-8a97-740d11c21dcf-kube-api-access-799km\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.040303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-client-ca\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.040340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-config\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.040374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-proxy-ca-bundles\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.040932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-config\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.041650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-client-ca\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.043150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a81a24-a627-44f9-8a97-740d11c21dcf-serving-cert\") pod 
\"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.045332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4510d1-485f-4588-8dc3-3ff405a6080e-serving-cert\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.052945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799km\" (UniqueName: \"kubernetes.io/projected/c9a81a24-a627-44f9-8a97-740d11c21dcf-kube-api-access-799km\") pod \"route-controller-manager-7b94b688d5-jg8xj\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.054464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrp8\" (UniqueName: \"kubernetes.io/projected/2a4510d1-485f-4588-8dc3-3ff405a6080e-kube-api-access-thrp8\") pod \"controller-manager-5684d8dd54-xhg4w\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.086327 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.095260 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.426578 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj"] Jan 21 15:04:52 crc kubenswrapper[4707]: W0121 15:04:52.431247 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a81a24_a627_44f9_8a97_740d11c21dcf.slice/crio-ffc6a91a7a20056cb567b69072ac6f34a8f431fa1abd19e4825e39e49ae79a06 WatchSource:0}: Error finding container ffc6a91a7a20056cb567b69072ac6f34a8f431fa1abd19e4825e39e49ae79a06: Status 404 returned error can't find the container with id ffc6a91a7a20056cb567b69072ac6f34a8f431fa1abd19e4825e39e49ae79a06 Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.470222 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5684d8dd54-xhg4w"] Jan 21 15:04:52 crc kubenswrapper[4707]: W0121 15:04:52.474305 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4510d1_485f_4588_8dc3_3ff405a6080e.slice/crio-0bda58feebdcb3473c0fcaaf798cf885a297f4de366cefebc47cec8cd351e94a WatchSource:0}: Error finding container 0bda58feebdcb3473c0fcaaf798cf885a297f4de366cefebc47cec8cd351e94a: Status 404 returned error can't find the container with id 0bda58feebdcb3473c0fcaaf798cf885a297f4de366cefebc47cec8cd351e94a Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.550087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" event={"ID":"c9a81a24-a627-44f9-8a97-740d11c21dcf","Type":"ContainerStarted","Data":"efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76"} Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.550129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" event={"ID":"c9a81a24-a627-44f9-8a97-740d11c21dcf","Type":"ContainerStarted","Data":"ffc6a91a7a20056cb567b69072ac6f34a8f431fa1abd19e4825e39e49ae79a06"} Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.550304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.555033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" event={"ID":"2a4510d1-485f-4588-8dc3-3ff405a6080e","Type":"ContainerStarted","Data":"0bda58feebdcb3473c0fcaaf798cf885a297f4de366cefebc47cec8cd351e94a"} Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.555265 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.557293 4707 patch_prober.go:28] interesting pod/controller-manager-5684d8dd54-xhg4w container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.557335 4707 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" podUID="2a4510d1-485f-4588-8dc3-3ff405a6080e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.564111 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" podStartSLOduration=2.564099097 podStartE2EDuration="2.564099097s" podCreationTimestamp="2026-01-21 15:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:52.561725488 +0000 UTC m=+189.743241709" watchObservedRunningTime="2026-01-21 15:04:52.564099097 +0000 UTC m=+189.745615319" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.574491 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" podStartSLOduration=2.57447576 podStartE2EDuration="2.57447576s" podCreationTimestamp="2026-01-21 15:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:04:52.572789831 +0000 UTC m=+189.754306053" watchObservedRunningTime="2026-01-21 15:04:52.57447576 +0000 UTC m=+189.755991982" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.601971 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjs7j"] Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.602164 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cjs7j" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="registry-server" containerID="cri-o://95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5" gracePeriod=2 Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.821557 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:04:52 crc kubenswrapper[4707]: I0121 15:04:52.961225 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.055258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wz5w\" (UniqueName: \"kubernetes.io/projected/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-kube-api-access-9wz5w\") pod \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.055329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-catalog-content\") pod \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.055357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-utilities\") pod \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\" (UID: \"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74\") " Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.056170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-utilities" (OuterVolumeSpecName: "utilities") pod "3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" (UID: "3a45ac60-de0f-49bd-b84e-d3fccd5a3e74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.059045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-kube-api-access-9wz5w" (OuterVolumeSpecName: "kube-api-access-9wz5w") pod "3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" (UID: "3a45ac60-de0f-49bd-b84e-d3fccd5a3e74"). InnerVolumeSpecName "kube-api-access-9wz5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.139367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" (UID: "3a45ac60-de0f-49bd-b84e-d3fccd5a3e74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.157315 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wz5w\" (UniqueName: \"kubernetes.io/projected/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-kube-api-access-9wz5w\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.157345 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.157355 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.188036 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0ff684-393c-4583-a3cf-bc18f9da9d69" path="/var/lib/kubelet/pods/4c0ff684-393c-4583-a3cf-bc18f9da9d69/volumes" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.188501 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59709653-4cc2-41de-aa1d-0e64487a84e3" path="/var/lib/kubelet/pods/59709653-4cc2-41de-aa1d-0e64487a84e3/volumes" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.189024 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637d112b-95a3-4c37-889c-273c86fee7a9" path="/var/lib/kubelet/pods/637d112b-95a3-4c37-889c-273c86fee7a9/volumes" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.561703 4707 generic.go:334] "Generic (PLEG): container finished" podID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerID="95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5" exitCode=0 Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.561755 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cjs7j" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.561748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjs7j" event={"ID":"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74","Type":"ContainerDied","Data":"95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5"} Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.561837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjs7j" event={"ID":"3a45ac60-de0f-49bd-b84e-d3fccd5a3e74","Type":"ContainerDied","Data":"69bea011173c32015ce719bd9a8128eb342f9fae7af21d63c57d716d1b87ab55"} Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.561858 4707 scope.go:117] "RemoveContainer" containerID="95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.563163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" event={"ID":"2a4510d1-485f-4588-8dc3-3ff405a6080e","Type":"ContainerStarted","Data":"2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d"} Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.567818 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.576194 4707 scope.go:117] "RemoveContainer" containerID="001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.576418 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjs7j"] Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.578653 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cjs7j"] Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.591691 4707 scope.go:117] "RemoveContainer" containerID="a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.604503 4707 scope.go:117] "RemoveContainer" containerID="95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5" Jan 21 15:04:53 crc kubenswrapper[4707]: E0121 15:04:53.604905 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5\": container with ID starting with 95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5 not found: ID does not exist" containerID="95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.604934 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5"} err="failed to get container status \"95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5\": rpc error: code = NotFound desc = could not find container \"95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5\": container with ID starting with 95afef69dc31a16e301d19f5671bf1ab41c57955f41223b3fac454f6fe46dba5 not found: ID does not exist" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.604953 4707 scope.go:117] "RemoveContainer" containerID="001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558" 
Jan 21 15:04:53 crc kubenswrapper[4707]: E0121 15:04:53.605197 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558\": container with ID starting with 001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558 not found: ID does not exist" containerID="001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.605220 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558"} err="failed to get container status \"001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558\": rpc error: code = NotFound desc = could not find container \"001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558\": container with ID starting with 001d6ba523646cf96410887cbda126b74601c546c6df1c7763ea0c647690e558 not found: ID does not exist" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.605235 4707 scope.go:117] "RemoveContainer" containerID="a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e" Jan 21 15:04:53 crc kubenswrapper[4707]: E0121 15:04:53.605581 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e\": container with ID starting with a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e not found: ID does not exist" containerID="a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e" Jan 21 15:04:53 crc kubenswrapper[4707]: I0121 15:04:53.605603 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e"} err="failed to get container status \"a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e\": rpc error: code = NotFound desc = could not find container \"a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e\": container with ID starting with a28a403b3f05d8a50ff641a0ec4acb9126c7dc30b9784abeb41678837d4de94e not found: ID does not exist" Jan 21 15:04:55 crc kubenswrapper[4707]: I0121 15:04:55.193896 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" path="/var/lib/kubelet/pods/3a45ac60-de0f-49bd-b84e-d3fccd5a3e74/volumes" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.987943 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 15:04:56 crc kubenswrapper[4707]: E0121 15:04:56.988297 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="extract-utilities" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.988308 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="extract-utilities" Jan 21 15:04:56 crc kubenswrapper[4707]: E0121 15:04:56.988321 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="registry-server" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.988327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="registry-server" Jan 21 15:04:56 crc kubenswrapper[4707]: 
E0121 15:04:56.988349 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="extract-content" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.988355 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="extract-content" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.988434 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a45ac60-de0f-49bd-b84e-d3fccd5a3e74" containerName="registry-server" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.988744 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.990755 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.990838 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 15:04:56 crc kubenswrapper[4707]: I0121 15:04:56.996829 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.098572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.098646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.199369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.199440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.199514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.217761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.304348 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:57 crc kubenswrapper[4707]: I0121 15:04:57.672526 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 15:04:57 crc kubenswrapper[4707]: W0121 15:04:57.681739 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0192e4ac_ea08_4e74_a3f5_052d4b45f819.slice/crio-50c846706dc088bed58a60a7a72384a6c2dd69747c4c5d53c281b88e4b7af05e WatchSource:0}: Error finding container 50c846706dc088bed58a60a7a72384a6c2dd69747c4c5d53c281b88e4b7af05e: Status 404 returned error can't find the container with id 50c846706dc088bed58a60a7a72384a6c2dd69747c4c5d53c281b88e4b7af05e Jan 21 15:04:58 crc kubenswrapper[4707]: I0121 15:04:58.585665 4707 generic.go:334] "Generic (PLEG): container finished" podID="0192e4ac-ea08-4e74-a3f5-052d4b45f819" containerID="ce797361c04bf81eeeb6fb4687d680594f5755d6920fdaad198984ace025dded" exitCode=0 Jan 21 15:04:58 crc kubenswrapper[4707]: I0121 15:04:58.585700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0192e4ac-ea08-4e74-a3f5-052d4b45f819","Type":"ContainerDied","Data":"ce797361c04bf81eeeb6fb4687d680594f5755d6920fdaad198984ace025dded"} Jan 21 15:04:58 crc kubenswrapper[4707]: I0121 15:04:58.585947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0192e4ac-ea08-4e74-a3f5-052d4b45f819","Type":"ContainerStarted","Data":"50c846706dc088bed58a60a7a72384a6c2dd69747c4c5d53c281b88e4b7af05e"} Jan 21 15:04:59 crc kubenswrapper[4707]: I0121 15:04:59.816838 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:04:59 crc kubenswrapper[4707]: I0121 15:04:59.939076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kube-api-access\") pod \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " Jan 21 15:04:59 crc kubenswrapper[4707]: I0121 15:04:59.939155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kubelet-dir\") pod \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\" (UID: \"0192e4ac-ea08-4e74-a3f5-052d4b45f819\") " Jan 21 15:04:59 crc kubenswrapper[4707]: I0121 15:04:59.939223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0192e4ac-ea08-4e74-a3f5-052d4b45f819" (UID: "0192e4ac-ea08-4e74-a3f5-052d4b45f819"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:04:59 crc kubenswrapper[4707]: I0121 15:04:59.939494 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:04:59 crc kubenswrapper[4707]: I0121 15:04:59.945148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0192e4ac-ea08-4e74-a3f5-052d4b45f819" (UID: "0192e4ac-ea08-4e74-a3f5-052d4b45f819"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:00 crc kubenswrapper[4707]: I0121 15:05:00.041032 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0192e4ac-ea08-4e74-a3f5-052d4b45f819-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:00 crc kubenswrapper[4707]: I0121 15:05:00.595115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0192e4ac-ea08-4e74-a3f5-052d4b45f819","Type":"ContainerDied","Data":"50c846706dc088bed58a60a7a72384a6c2dd69747c4c5d53c281b88e4b7af05e"} Jan 21 15:05:00 crc kubenswrapper[4707]: I0121 15:05:00.595150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:05:00 crc kubenswrapper[4707]: I0121 15:05:00.595153 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50c846706dc088bed58a60a7a72384a6c2dd69747c4c5d53c281b88e4b7af05e" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.387040 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 15:05:02 crc kubenswrapper[4707]: E0121 15:05:02.387208 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0192e4ac-ea08-4e74-a3f5-052d4b45f819" containerName="pruner" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.387219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0192e4ac-ea08-4e74-a3f5-052d4b45f819" containerName="pruner" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.387317 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0192e4ac-ea08-4e74-a3f5-052d4b45f819" containerName="pruner" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.387608 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.388841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.388867 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.393787 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.569925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.569966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kube-api-access\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.570002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-var-lock\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.671051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.671093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kube-api-access\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.671147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-var-lock\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.671187 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.671259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-var-lock\") pod \"installer-9-crc\" (UID: 
\"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.684788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kube-api-access\") pod \"installer-9-crc\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:02 crc kubenswrapper[4707]: I0121 15:05:02.702590 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:03 crc kubenswrapper[4707]: I0121 15:05:03.044586 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 15:05:03 crc kubenswrapper[4707]: I0121 15:05:03.607615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"308450d9-b350-4362-9d3b-5e5c3a09ae9a","Type":"ContainerStarted","Data":"433af81b663997986e5842b3ce3673f5d46dd44c14c76658b4cc2dd291e8b55d"} Jan 21 15:05:03 crc kubenswrapper[4707]: I0121 15:05:03.607921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"308450d9-b350-4362-9d3b-5e5c3a09ae9a","Type":"ContainerStarted","Data":"d923ec6cb18300f0c8adb5c8ec0455634098cdbf91307a90a1a7163037e2f91c"} Jan 21 15:05:03 crc kubenswrapper[4707]: I0121 15:05:03.618585 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.618574623 podStartE2EDuration="1.618574623s" podCreationTimestamp="2026-01-21 15:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:03.617136153 +0000 UTC m=+200.798652385" watchObservedRunningTime="2026-01-21 15:05:03.618574623 +0000 UTC m=+200.800090845" Jan 21 15:05:09 crc kubenswrapper[4707]: I0121 15:05:09.945913 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:05:09 crc kubenswrapper[4707]: I0121 15:05:09.946277 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:05:09 crc kubenswrapper[4707]: I0121 15:05:09.946317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:05:09 crc kubenswrapper[4707]: I0121 15:05:09.946751 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:05:09 crc kubenswrapper[4707]: I0121 15:05:09.946801 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54" gracePeriod=600 Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.563172 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5684d8dd54-xhg4w"] Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.563641 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" podUID="2a4510d1-485f-4588-8dc3-3ff405a6080e" containerName="controller-manager" containerID="cri-o://2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d" gracePeriod=30 Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.576665 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj"] Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.576895 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" podUID="c9a81a24-a627-44f9-8a97-740d11c21dcf" containerName="route-controller-manager" containerID="cri-o://efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76" gracePeriod=30 Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.633582 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54" exitCode=0 Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.633618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54"} Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.633662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"bf3c9e6b6e14a46494620756c3c0f7583b42676c1dc0d8e1f947083111f31743"} Jan 21 15:05:10 crc kubenswrapper[4707]: I0121 15:05:10.981806 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.026833 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.158869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-config\") pod \"2a4510d1-485f-4588-8dc3-3ff405a6080e\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.158971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4510d1-485f-4588-8dc3-3ff405a6080e-serving-cert\") pod \"2a4510d1-485f-4588-8dc3-3ff405a6080e\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-config" (OuterVolumeSpecName: "config") pod "2a4510d1-485f-4588-8dc3-3ff405a6080e" (UID: "2a4510d1-485f-4588-8dc3-3ff405a6080e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thrp8\" (UniqueName: \"kubernetes.io/projected/2a4510d1-485f-4588-8dc3-3ff405a6080e-kube-api-access-thrp8\") pod \"2a4510d1-485f-4588-8dc3-3ff405a6080e\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-client-ca\") pod \"c9a81a24-a627-44f9-8a97-740d11c21dcf\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a81a24-a627-44f9-8a97-740d11c21dcf-serving-cert\") pod \"c9a81a24-a627-44f9-8a97-740d11c21dcf\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-client-ca\") pod \"2a4510d1-485f-4588-8dc3-3ff405a6080e\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-proxy-ca-bundles\") pod \"2a4510d1-485f-4588-8dc3-3ff405a6080e\" (UID: \"2a4510d1-485f-4588-8dc3-3ff405a6080e\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-config\") pod \"c9a81a24-a627-44f9-8a97-740d11c21dcf\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.159997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799km\" (UniqueName: 
\"kubernetes.io/projected/c9a81a24-a627-44f9-8a97-740d11c21dcf-kube-api-access-799km\") pod \"c9a81a24-a627-44f9-8a97-740d11c21dcf\" (UID: \"c9a81a24-a627-44f9-8a97-740d11c21dcf\") " Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.160311 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.160340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2a4510d1-485f-4588-8dc3-3ff405a6080e" (UID: "2a4510d1-485f-4588-8dc3-3ff405a6080e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.160386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-client-ca" (OuterVolumeSpecName: "client-ca") pod "2a4510d1-485f-4588-8dc3-3ff405a6080e" (UID: "2a4510d1-485f-4588-8dc3-3ff405a6080e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.160565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9a81a24-a627-44f9-8a97-740d11c21dcf" (UID: "c9a81a24-a627-44f9-8a97-740d11c21dcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.161048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-config" (OuterVolumeSpecName: "config") pod "c9a81a24-a627-44f9-8a97-740d11c21dcf" (UID: "c9a81a24-a627-44f9-8a97-740d11c21dcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.163178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a81a24-a627-44f9-8a97-740d11c21dcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9a81a24-a627-44f9-8a97-740d11c21dcf" (UID: "c9a81a24-a627-44f9-8a97-740d11c21dcf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.163208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4510d1-485f-4588-8dc3-3ff405a6080e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2a4510d1-485f-4588-8dc3-3ff405a6080e" (UID: "2a4510d1-485f-4588-8dc3-3ff405a6080e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.163208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4510d1-485f-4588-8dc3-3ff405a6080e-kube-api-access-thrp8" (OuterVolumeSpecName: "kube-api-access-thrp8") pod "2a4510d1-485f-4588-8dc3-3ff405a6080e" (UID: "2a4510d1-485f-4588-8dc3-3ff405a6080e"). InnerVolumeSpecName "kube-api-access-thrp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.163677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a81a24-a627-44f9-8a97-740d11c21dcf-kube-api-access-799km" (OuterVolumeSpecName: "kube-api-access-799km") pod "c9a81a24-a627-44f9-8a97-740d11c21dcf" (UID: "c9a81a24-a627-44f9-8a97-740d11c21dcf"). InnerVolumeSpecName "kube-api-access-799km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.261962 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799km\" (UniqueName: \"kubernetes.io/projected/c9a81a24-a627-44f9-8a97-740d11c21dcf-kube-api-access-799km\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.261995 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4510d1-485f-4588-8dc3-3ff405a6080e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.262005 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thrp8\" (UniqueName: \"kubernetes.io/projected/2a4510d1-485f-4588-8dc3-3ff405a6080e-kube-api-access-thrp8\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.262015 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.262022 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a81a24-a627-44f9-8a97-740d11c21dcf-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.262030 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.262037 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a4510d1-485f-4588-8dc3-3ff405a6080e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.262084 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a81a24-a627-44f9-8a97-740d11c21dcf-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.638548 4707 generic.go:334] "Generic (PLEG): container finished" podID="2a4510d1-485f-4588-8dc3-3ff405a6080e" containerID="2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d" exitCode=0 Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.638611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" event={"ID":"2a4510d1-485f-4588-8dc3-3ff405a6080e","Type":"ContainerDied","Data":"2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d"} Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.638625 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.638647 4707 scope.go:117] "RemoveContainer" containerID="2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.638636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5684d8dd54-xhg4w" event={"ID":"2a4510d1-485f-4588-8dc3-3ff405a6080e","Type":"ContainerDied","Data":"0bda58feebdcb3473c0fcaaf798cf885a297f4de366cefebc47cec8cd351e94a"} Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.640292 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9a81a24-a627-44f9-8a97-740d11c21dcf" containerID="efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76" exitCode=0 Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.640319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" event={"ID":"c9a81a24-a627-44f9-8a97-740d11c21dcf","Type":"ContainerDied","Data":"efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76"} Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.640336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" event={"ID":"c9a81a24-a627-44f9-8a97-740d11c21dcf","Type":"ContainerDied","Data":"ffc6a91a7a20056cb567b69072ac6f34a8f431fa1abd19e4825e39e49ae79a06"} Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.640393 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.651830 4707 scope.go:117] "RemoveContainer" containerID="2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d" Jan 21 15:05:11 crc kubenswrapper[4707]: E0121 15:05:11.652295 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d\": container with ID starting with 2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d not found: ID does not exist" containerID="2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.652371 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d"} err="failed to get container status \"2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d\": rpc error: code = NotFound desc = could not find container \"2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d\": container with ID starting with 2de0952a548eafc149d6488aa94cc1120315d31da0899fb38e04975fdc56a50d not found: ID does not exist" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.652390 4707 scope.go:117] "RemoveContainer" containerID="efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.654561 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5684d8dd54-xhg4w"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.662490 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-5684d8dd54-xhg4w"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.662539 4707 scope.go:117] "RemoveContainer" containerID="efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76" Jan 21 15:05:11 crc kubenswrapper[4707]: E0121 15:05:11.662920 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76\": container with ID starting with efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76 not found: ID does not exist" containerID="efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.662955 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76"} err="failed to get container status \"efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76\": rpc error: code = NotFound desc = could not find container \"efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76\": container with ID starting with efcdc78528a98398d489ee4c4542b1b152e8e828e236fd7347bb94971f7b3a76 not found: ID does not exist" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.664908 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.666716 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b94b688d5-jg8xj"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.780142 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-845bf7bd6-mfstq"] Jan 21 15:05:11 crc kubenswrapper[4707]: E0121 15:05:11.780367 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a81a24-a627-44f9-8a97-740d11c21dcf" containerName="route-controller-manager" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.780381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a81a24-a627-44f9-8a97-740d11c21dcf" containerName="route-controller-manager" Jan 21 15:05:11 crc kubenswrapper[4707]: E0121 15:05:11.780407 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4510d1-485f-4588-8dc3-3ff405a6080e" containerName="controller-manager" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.780414 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4510d1-485f-4588-8dc3-3ff405a6080e" containerName="controller-manager" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.780507 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4510d1-485f-4588-8dc3-3ff405a6080e" containerName="controller-manager" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.780523 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a81a24-a627-44f9-8a97-740d11c21dcf" containerName="route-controller-manager" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.780903 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.781945 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.782529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.783282 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.783409 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.783522 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.783637 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.786414 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.787157 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.787279 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.787348 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.787585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.787866 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.787983 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.788027 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.789754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845bf7bd6-mfstq"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.789851 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.791511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546"] Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.968898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77sx\" (UniqueName: 
\"kubernetes.io/projected/e312200c-b816-4716-b8be-ff5989fea607-kube-api-access-t77sx\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.968947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7f59015-8e9c-4593-9ae2-a26d2673543a-serving-cert\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.968979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-proxy-ca-bundles\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.969033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-client-ca\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.969049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnt9x\" (UniqueName: \"kubernetes.io/projected/d7f59015-8e9c-4593-9ae2-a26d2673543a-kube-api-access-fnt9x\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.969069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-config\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.969084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e312200c-b816-4716-b8be-ff5989fea607-serving-cert\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.969101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-client-ca\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:11 crc kubenswrapper[4707]: I0121 15:05:11.969143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-config\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77sx\" (UniqueName: \"kubernetes.io/projected/e312200c-b816-4716-b8be-ff5989fea607-kube-api-access-t77sx\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7f59015-8e9c-4593-9ae2-a26d2673543a-serving-cert\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-proxy-ca-bundles\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-client-ca\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnt9x\" (UniqueName: \"kubernetes.io/projected/d7f59015-8e9c-4593-9ae2-a26d2673543a-kube-api-access-fnt9x\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-config\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e312200c-b816-4716-b8be-ff5989fea607-serving-cert\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-client-ca\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: 
\"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.069915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-config\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.071034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-config\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.071639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-client-ca\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.071704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-config\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.071970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-client-ca\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.072123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-proxy-ca-bundles\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.075322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7f59015-8e9c-4593-9ae2-a26d2673543a-serving-cert\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.075321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e312200c-b816-4716-b8be-ff5989fea607-serving-cert\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.083308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t77sx\" (UniqueName: \"kubernetes.io/projected/e312200c-b816-4716-b8be-ff5989fea607-kube-api-access-t77sx\") pod \"route-controller-manager-968cb47b5-fq546\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.083331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnt9x\" (UniqueName: \"kubernetes.io/projected/d7f59015-8e9c-4593-9ae2-a26d2673543a-kube-api-access-fnt9x\") pod \"controller-manager-845bf7bd6-mfstq\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.097108 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.102195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.435173 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546"] Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.472470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845bf7bd6-mfstq"] Jan 21 15:05:12 crc kubenswrapper[4707]: W0121 15:05:12.477563 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7f59015_8e9c_4593_9ae2_a26d2673543a.slice/crio-6eeb7206b5a7d986383bbb6d1a29e77e17fc5778c2865a67457b6222e8f5e7a0 WatchSource:0}: Error finding container 6eeb7206b5a7d986383bbb6d1a29e77e17fc5778c2865a67457b6222e8f5e7a0: Status 404 returned error can't find the container with id 6eeb7206b5a7d986383bbb6d1a29e77e17fc5778c2865a67457b6222e8f5e7a0 Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.507786 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" podUID="3742167f-2077-4b3c-b385-2a108c04538c" containerName="oauth-openshift" containerID="cri-o://81503b2ec210da2e36d7d852635a9a152151b0d9460e8543ea05ca7626c6bce8" gracePeriod=15 Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.645659 4707 generic.go:334] "Generic (PLEG): container finished" podID="3742167f-2077-4b3c-b385-2a108c04538c" containerID="81503b2ec210da2e36d7d852635a9a152151b0d9460e8543ea05ca7626c6bce8" exitCode=0 Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.645796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" event={"ID":"3742167f-2077-4b3c-b385-2a108c04538c","Type":"ContainerDied","Data":"81503b2ec210da2e36d7d852635a9a152151b0d9460e8543ea05ca7626c6bce8"} Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.647424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" event={"ID":"e312200c-b816-4716-b8be-ff5989fea607","Type":"ContainerStarted","Data":"a0f8eaceeea68633d08090177a76a49aa53a51a9bc0dc80d97cc6326cc30c75e"} Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.647458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" event={"ID":"e312200c-b816-4716-b8be-ff5989fea607","Type":"ContainerStarted","Data":"2fe872a22aadf62d2cdf1d33dc722d2688cb182de544116233b409626b3ec737"} Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.647739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.688231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" event={"ID":"d7f59015-8e9c-4593-9ae2-a26d2673543a","Type":"ContainerStarted","Data":"0904bffa31bc1fe3b5e68e67c0e0f9aeed63b0baadb5888d22071f56b0fc406a"} Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.688282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" event={"ID":"d7f59015-8e9c-4593-9ae2-a26d2673543a","Type":"ContainerStarted","Data":"6eeb7206b5a7d986383bbb6d1a29e77e17fc5778c2865a67457b6222e8f5e7a0"} Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.688632 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.689592 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" podStartSLOduration=2.689581118 podStartE2EDuration="2.689581118s" podCreationTimestamp="2026-01-21 15:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:12.688155653 +0000 UTC m=+209.869671874" watchObservedRunningTime="2026-01-21 15:05:12.689581118 +0000 UTC m=+209.871097340" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.709175 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" podStartSLOduration=2.709160101 podStartE2EDuration="2.709160101s" podCreationTimestamp="2026-01-21 15:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:12.707346217 +0000 UTC m=+209.888862439" watchObservedRunningTime="2026-01-21 15:05:12.709160101 +0000 UTC m=+209.890676323" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.716756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.868979 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:12 crc kubenswrapper[4707]: I0121 15:05:12.913495 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tclm\" (UniqueName: \"kubernetes.io/projected/3742167f-2077-4b3c-b385-2a108c04538c-kube-api-access-5tclm\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3742167f-2077-4b3c-b385-2a108c04538c-audit-dir\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-idp-0-file-data\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-audit-policies\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-session\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-error\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.080725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3742167f-2077-4b3c-b385-2a108c04538c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-login\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-ocp-branding-template\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-provider-selection\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-serving-cert\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-cliconfig\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081446 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-router-certs\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-trusted-ca-bundle\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-service-ca\") pod \"3742167f-2077-4b3c-b385-2a108c04538c\" (UID: \"3742167f-2077-4b3c-b385-2a108c04538c\") " Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081663 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3742167f-2077-4b3c-b385-2a108c04538c-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081674 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.081969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.082007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.085046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.085553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.085731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.085963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.086076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.086258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.087177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3742167f-2077-4b3c-b385-2a108c04538c-kube-api-access-5tclm" (OuterVolumeSpecName: "kube-api-access-5tclm") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "kube-api-access-5tclm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.087601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.089069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3742167f-2077-4b3c-b385-2a108c04538c" (UID: "3742167f-2077-4b3c-b385-2a108c04538c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183104 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183317 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183328 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183339 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183348 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183356 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tclm\" (UniqueName: \"kubernetes.io/projected/3742167f-2077-4b3c-b385-2a108c04538c-kube-api-access-5tclm\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183365 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183375 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183383 4707 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183391 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183400 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.183410 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3742167f-2077-4b3c-b385-2a108c04538c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.188723 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4510d1-485f-4588-8dc3-3ff405a6080e" path="/var/lib/kubelet/pods/2a4510d1-485f-4588-8dc3-3ff405a6080e/volumes" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.189287 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a81a24-a627-44f9-8a97-740d11c21dcf" path="/var/lib/kubelet/pods/c9a81a24-a627-44f9-8a97-740d11c21dcf/volumes" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.693770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" event={"ID":"3742167f-2077-4b3c-b385-2a108c04538c","Type":"ContainerDied","Data":"9716e752da423cfc3ce05f25134660ebb91a1f53da27ea7436fbe3b768a0b355"} Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.693826 4707 scope.go:117] "RemoveContainer" containerID="81503b2ec210da2e36d7d852635a9a152151b0d9460e8543ea05ca7626c6bce8" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.693925 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmpc8" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.704884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmpc8"] Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.707500 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmpc8"] Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.781289 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-758c4c8f95-6bknm"] Jan 21 15:05:13 crc kubenswrapper[4707]: E0121 15:05:13.781620 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3742167f-2077-4b3c-b385-2a108c04538c" containerName="oauth-openshift" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.781698 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3742167f-2077-4b3c-b385-2a108c04538c" containerName="oauth-openshift" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.781872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3742167f-2077-4b3c-b385-2a108c04538c" containerName="oauth-openshift" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.782287 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.783760 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.784136 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.784225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.784294 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.787608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.787697 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.787770 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.787963 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.788113 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.788168 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.788116 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 
15:05:13.790150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.793690 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.794418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-758c4c8f95-6bknm"] Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.796980 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.800571 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-error\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjm57\" (UniqueName: \"kubernetes.io/projected/f4ca4a55-d8a8-41f6-9c82-390687a155a1-kube-api-access-wjm57\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-session\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-audit-policies\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: 
\"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-router-certs\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-login\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-service-ca\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.891731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca4a55-d8a8-41f6-9c82-390687a155a1-audit-dir\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-login\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-service-ca\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca4a55-d8a8-41f6-9c82-390687a155a1-audit-dir\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-error\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjm57\" (UniqueName: \"kubernetes.io/projected/f4ca4a55-d8a8-41f6-9c82-390687a155a1-kube-api-access-wjm57\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-session\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-audit-policies\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.992880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-router-certs\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.993044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca4a55-d8a8-41f6-9c82-390687a155a1-audit-dir\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.993983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-audit-policies\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: 
\"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.994123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.994311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.994349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-service-ca\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.996760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-error\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.997594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.997121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.997280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.997660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: 
\"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.996883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-user-template-login\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.997688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-router-certs\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:13 crc kubenswrapper[4707]: I0121 15:05:13.997910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f4ca4a55-d8a8-41f6-9c82-390687a155a1-v4-0-config-system-session\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:14 crc kubenswrapper[4707]: I0121 15:05:14.006927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjm57\" (UniqueName: \"kubernetes.io/projected/f4ca4a55-d8a8-41f6-9c82-390687a155a1-kube-api-access-wjm57\") pod \"oauth-openshift-758c4c8f95-6bknm\" (UID: \"f4ca4a55-d8a8-41f6-9c82-390687a155a1\") " pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:14 crc kubenswrapper[4707]: I0121 15:05:14.096305 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:14 crc kubenswrapper[4707]: I0121 15:05:14.430602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-758c4c8f95-6bknm"] Jan 21 15:05:14 crc kubenswrapper[4707]: W0121 15:05:14.436680 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4ca4a55_d8a8_41f6_9c82_390687a155a1.slice/crio-808d806c5d778b8b69daf2512b6e74abdddd0332e4736077bd10ebd31b275fe1 WatchSource:0}: Error finding container 808d806c5d778b8b69daf2512b6e74abdddd0332e4736077bd10ebd31b275fe1: Status 404 returned error can't find the container with id 808d806c5d778b8b69daf2512b6e74abdddd0332e4736077bd10ebd31b275fe1 Jan 21 15:05:14 crc kubenswrapper[4707]: I0121 15:05:14.699967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" event={"ID":"f4ca4a55-d8a8-41f6-9c82-390687a155a1","Type":"ContainerStarted","Data":"d3909d23123f970e5985c46c26de506a75d39a8e7e3b6cac4a0bef906a85f114"} Jan 21 15:05:14 crc kubenswrapper[4707]: I0121 15:05:14.700259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" event={"ID":"f4ca4a55-d8a8-41f6-9c82-390687a155a1","Type":"ContainerStarted","Data":"808d806c5d778b8b69daf2512b6e74abdddd0332e4736077bd10ebd31b275fe1"} Jan 21 15:05:14 crc kubenswrapper[4707]: I0121 15:05:14.714525 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" podStartSLOduration=27.714510514 podStartE2EDuration="27.714510514s" podCreationTimestamp="2026-01-21 15:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:14.712280096 +0000 UTC m=+211.893796318" watchObservedRunningTime="2026-01-21 15:05:14.714510514 +0000 UTC m=+211.896026736" Jan 21 15:05:15 crc kubenswrapper[4707]: I0121 15:05:15.188136 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3742167f-2077-4b3c-b385-2a108c04538c" path="/var/lib/kubelet/pods/3742167f-2077-4b3c-b385-2a108c04538c/volumes" Jan 21 15:05:15 crc kubenswrapper[4707]: I0121 15:05:15.703954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:15 crc kubenswrapper[4707]: I0121 15:05:15.708097 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-758c4c8f95-6bknm" Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.871209 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtrmt"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.872570 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dtrmt" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="registry-server" containerID="cri-o://7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543" gracePeriod=30 Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.878872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncnxd"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.879064 4707 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-marketplace/community-operators-ncnxd" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="registry-server" containerID="cri-o://ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c" gracePeriod=30 Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.888795 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgg5r"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.889026 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" podUID="d43c6f9c-a13c-406e-90f3-152b22768174" containerName="marketplace-operator" containerID="cri-o://b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8" gracePeriod=30 Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.891775 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pd85"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.892000 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pd85" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="registry-server" containerID="cri-o://9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8" gracePeriod=30 Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.893097 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2vhtb"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.893319 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2vhtb" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="registry-server" containerID="cri-o://b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" gracePeriod=30 Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.896498 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xcp6l"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.897118 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.906176 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xcp6l"] Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.941987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/654186e4-d936-41b5-8967-fb853141daf4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.942274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78chx\" (UniqueName: \"kubernetes.io/projected/654186e4-d936-41b5-8967-fb853141daf4-kube-api-access-78chx\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:27 crc kubenswrapper[4707]: I0121 15:05:27.942329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/654186e4-d936-41b5-8967-fb853141daf4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.043091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78chx\" (UniqueName: \"kubernetes.io/projected/654186e4-d936-41b5-8967-fb853141daf4-kube-api-access-78chx\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.043158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/654186e4-d936-41b5-8967-fb853141daf4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.043219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/654186e4-d936-41b5-8967-fb853141daf4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.044241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/654186e4-d936-41b5-8967-fb853141daf4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.052113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/654186e4-d936-41b5-8967-fb853141daf4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.060604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78chx\" (UniqueName: \"kubernetes.io/projected/654186e4-d936-41b5-8967-fb853141daf4-kube-api-access-78chx\") pod \"marketplace-operator-79b997595-xcp6l\" (UID: \"654186e4-d936-41b5-8967-fb853141daf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.164106 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4 is running failed: container process not found" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.164524 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4 is running failed: container process not found" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.165057 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4 is running failed: container process not found" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.165118 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-2vhtb" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="registry-server" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.212483 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.351284 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.483640 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.486438 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.526535 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.533550 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-utilities\") pod \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks4tw\" (UniqueName: \"kubernetes.io/projected/d43c6f9c-a13c-406e-90f3-152b22768174-kube-api-access-ks4tw\") pod \"d43c6f9c-a13c-406e-90f3-152b22768174\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca\") pod \"d43c6f9c-a13c-406e-90f3-152b22768174\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-utilities\") pod \"10e83da2-c4ad-415c-9c64-7507e2864b39\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4txd\" (UniqueName: \"kubernetes.io/projected/0e136188-e9e8-4eea-bd22-e01006fadce4-kube-api-access-t4txd\") pod \"0e136188-e9e8-4eea-bd22-e01006fadce4\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-catalog-content\") pod \"10e83da2-c4ad-415c-9c64-7507e2864b39\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clsn8\" (UniqueName: \"kubernetes.io/projected/10e83da2-c4ad-415c-9c64-7507e2864b39-kube-api-access-clsn8\") pod \"10e83da2-c4ad-415c-9c64-7507e2864b39\" (UID: \"10e83da2-c4ad-415c-9c64-7507e2864b39\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-catalog-content\") pod \"beb79892-2b5e-47cf-9381-08a5654df324\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdt8\" (UniqueName: \"kubernetes.io/projected/beb79892-2b5e-47cf-9381-08a5654df324-kube-api-access-8tdt8\") pod \"beb79892-2b5e-47cf-9381-08a5654df324\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " Jan 21 15:05:28 crc 
kubenswrapper[4707]: I0121 15:05:28.547636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-utilities\") pod \"beb79892-2b5e-47cf-9381-08a5654df324\" (UID: \"beb79892-2b5e-47cf-9381-08a5654df324\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-utilities\") pod \"0e136188-e9e8-4eea-bd22-e01006fadce4\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cp86\" (UniqueName: \"kubernetes.io/projected/f4e7d97d-3ff8-43f3-86b0-14de31de7179-kube-api-access-6cp86\") pod \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-catalog-content\") pod \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\" (UID: \"f4e7d97d-3ff8-43f3-86b0-14de31de7179\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics\") pod \"d43c6f9c-a13c-406e-90f3-152b22768174\" (UID: \"d43c6f9c-a13c-406e-90f3-152b22768174\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.547716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-catalog-content\") pod \"0e136188-e9e8-4eea-bd22-e01006fadce4\" (UID: \"0e136188-e9e8-4eea-bd22-e01006fadce4\") " Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.548136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-utilities" (OuterVolumeSpecName: "utilities") pod "f4e7d97d-3ff8-43f3-86b0-14de31de7179" (UID: "f4e7d97d-3ff8-43f3-86b0-14de31de7179"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.548665 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-utilities" (OuterVolumeSpecName: "utilities") pod "10e83da2-c4ad-415c-9c64-7507e2864b39" (UID: "10e83da2-c4ad-415c-9c64-7507e2864b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.549742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-utilities" (OuterVolumeSpecName: "utilities") pod "beb79892-2b5e-47cf-9381-08a5654df324" (UID: "beb79892-2b5e-47cf-9381-08a5654df324"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.550214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-utilities" (OuterVolumeSpecName: "utilities") pod "0e136188-e9e8-4eea-bd22-e01006fadce4" (UID: "0e136188-e9e8-4eea-bd22-e01006fadce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.550759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d43c6f9c-a13c-406e-90f3-152b22768174" (UID: "d43c6f9c-a13c-406e-90f3-152b22768174"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.551579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb79892-2b5e-47cf-9381-08a5654df324-kube-api-access-8tdt8" (OuterVolumeSpecName: "kube-api-access-8tdt8") pod "beb79892-2b5e-47cf-9381-08a5654df324" (UID: "beb79892-2b5e-47cf-9381-08a5654df324"). InnerVolumeSpecName "kube-api-access-8tdt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.553651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e136188-e9e8-4eea-bd22-e01006fadce4-kube-api-access-t4txd" (OuterVolumeSpecName: "kube-api-access-t4txd") pod "0e136188-e9e8-4eea-bd22-e01006fadce4" (UID: "0e136188-e9e8-4eea-bd22-e01006fadce4"). InnerVolumeSpecName "kube-api-access-t4txd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.554025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d43c6f9c-a13c-406e-90f3-152b22768174" (UID: "d43c6f9c-a13c-406e-90f3-152b22768174"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.556163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43c6f9c-a13c-406e-90f3-152b22768174-kube-api-access-ks4tw" (OuterVolumeSpecName: "kube-api-access-ks4tw") pod "d43c6f9c-a13c-406e-90f3-152b22768174" (UID: "d43c6f9c-a13c-406e-90f3-152b22768174"). InnerVolumeSpecName "kube-api-access-ks4tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.557259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e83da2-c4ad-415c-9c64-7507e2864b39-kube-api-access-clsn8" (OuterVolumeSpecName: "kube-api-access-clsn8") pod "10e83da2-c4ad-415c-9c64-7507e2864b39" (UID: "10e83da2-c4ad-415c-9c64-7507e2864b39"). InnerVolumeSpecName "kube-api-access-clsn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.557957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e7d97d-3ff8-43f3-86b0-14de31de7179-kube-api-access-6cp86" (OuterVolumeSpecName: "kube-api-access-6cp86") pod "f4e7d97d-3ff8-43f3-86b0-14de31de7179" (UID: "f4e7d97d-3ff8-43f3-86b0-14de31de7179"). InnerVolumeSpecName "kube-api-access-6cp86". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.577600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10e83da2-c4ad-415c-9c64-7507e2864b39" (UID: "10e83da2-c4ad-415c-9c64-7507e2864b39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.602605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "beb79892-2b5e-47cf-9381-08a5654df324" (UID: "beb79892-2b5e-47cf-9381-08a5654df324"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.607864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e136188-e9e8-4eea-bd22-e01006fadce4" (UID: "0e136188-e9e8-4eea-bd22-e01006fadce4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648755 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648785 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4txd\" (UniqueName: \"kubernetes.io/projected/0e136188-e9e8-4eea-bd22-e01006fadce4-kube-api-access-t4txd\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648796 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clsn8\" (UniqueName: \"kubernetes.io/projected/10e83da2-c4ad-415c-9c64-7507e2864b39-kube-api-access-clsn8\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648824 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e83da2-c4ad-415c-9c64-7507e2864b39-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648833 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648853 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdt8\" (UniqueName: \"kubernetes.io/projected/beb79892-2b5e-47cf-9381-08a5654df324-kube-api-access-8tdt8\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648862 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb79892-2b5e-47cf-9381-08a5654df324-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648870 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648877 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cp86\" (UniqueName: \"kubernetes.io/projected/f4e7d97d-3ff8-43f3-86b0-14de31de7179-kube-api-access-6cp86\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648886 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648907 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e136188-e9e8-4eea-bd22-e01006fadce4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648915 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648922 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks4tw\" (UniqueName: 
\"kubernetes.io/projected/d43c6f9c-a13c-406e-90f3-152b22768174-kube-api-access-ks4tw\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.648930 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43c6f9c-a13c-406e-90f3-152b22768174-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.655733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4e7d97d-3ff8-43f3-86b0-14de31de7179" (UID: "f4e7d97d-3ff8-43f3-86b0-14de31de7179"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.742864 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xcp6l"] Jan 21 15:05:28 crc kubenswrapper[4707]: W0121 15:05:28.745752 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod654186e4_d936_41b5_8967_fb853141daf4.slice/crio-aa52b3e7d91d5d424f1259bdaa54e3dfbf743a89bf3b41a8357178e956351918 WatchSource:0}: Error finding container aa52b3e7d91d5d424f1259bdaa54e3dfbf743a89bf3b41a8357178e956351918: Status 404 returned error can't find the container with id aa52b3e7d91d5d424f1259bdaa54e3dfbf743a89bf3b41a8357178e956351918 Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.750063 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4e7d97d-3ff8-43f3-86b0-14de31de7179-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.765834 4707 generic.go:334] "Generic (PLEG): container finished" podID="d43c6f9c-a13c-406e-90f3-152b22768174" containerID="b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8" exitCode=0 Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.765927 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.766137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" event={"ID":"d43c6f9c-a13c-406e-90f3-152b22768174","Type":"ContainerDied","Data":"b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.766182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgg5r" event={"ID":"d43c6f9c-a13c-406e-90f3-152b22768174","Type":"ContainerDied","Data":"e2e6f46621b20839c82610db073d9e05dd372b17c126ef00da5f854c88773281"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.766212 4707 scope.go:117] "RemoveContainer" containerID="b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.768389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" event={"ID":"654186e4-d936-41b5-8967-fb853141daf4","Type":"ContainerStarted","Data":"aa52b3e7d91d5d424f1259bdaa54e3dfbf743a89bf3b41a8357178e956351918"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.773838 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerID="ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c" exitCode=0 Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.773903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncnxd" event={"ID":"0e136188-e9e8-4eea-bd22-e01006fadce4","Type":"ContainerDied","Data":"ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.773928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncnxd" event={"ID":"0e136188-e9e8-4eea-bd22-e01006fadce4","Type":"ContainerDied","Data":"20fae57f46a49e2df559fb5fb428760f1354d2040b7599a08c6febd705095a19"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.774010 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncnxd" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.776381 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" exitCode=0 Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.776423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vhtb" event={"ID":"f4e7d97d-3ff8-43f3-86b0-14de31de7179","Type":"ContainerDied","Data":"b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.776438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vhtb" event={"ID":"f4e7d97d-3ff8-43f3-86b0-14de31de7179","Type":"ContainerDied","Data":"5900bb92016cb2d4aaca53587291bcd9dc1418158d5af7111b54a595ab4a7b46"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.776489 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2vhtb" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.781030 4707 generic.go:334] "Generic (PLEG): container finished" podID="beb79892-2b5e-47cf-9381-08a5654df324" containerID="7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543" exitCode=0 Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.781072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtrmt" event={"ID":"beb79892-2b5e-47cf-9381-08a5654df324","Type":"ContainerDied","Data":"7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.781264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtrmt" event={"ID":"beb79892-2b5e-47cf-9381-08a5654df324","Type":"ContainerDied","Data":"99c79956b8828ff46306ffef19782b6af09c7b27202c04992faa00d5d3b7bcc7"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.781110 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtrmt" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.782995 4707 generic.go:334] "Generic (PLEG): container finished" podID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerID="9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8" exitCode=0 Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.783033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pd85" event={"ID":"10e83da2-c4ad-415c-9c64-7507e2864b39","Type":"ContainerDied","Data":"9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.783056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pd85" event={"ID":"10e83da2-c4ad-415c-9c64-7507e2864b39","Type":"ContainerDied","Data":"5f04c1e33ed9e70b91201d7533819725b096aea074e9c846aa4da5af374e6011"} Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.783110 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pd85" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.797616 4707 scope.go:117] "RemoveContainer" containerID="b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.798071 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8\": container with ID starting with b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8 not found: ID does not exist" containerID="b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.798103 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8"} err="failed to get container status \"b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8\": rpc error: code = NotFound desc = could not find container \"b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8\": container with ID starting with b2e0c6013e312a94be0660b144ab426a9cbb616800e9c1bff9ed731a2bfad5f8 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.798124 4707 scope.go:117] "RemoveContainer" containerID="ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.808152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgg5r"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.812079 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgg5r"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.815382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2vhtb"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.817970 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2vhtb"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.831404 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncnxd"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.833647 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ncnxd"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.833762 4707 scope.go:117] "RemoveContainer" containerID="c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.838393 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtrmt"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.841446 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dtrmt"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.850688 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pd85"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.853145 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pd85"] Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.854893 4707 scope.go:117] "RemoveContainer" 
containerID="ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.865150 4707 scope.go:117] "RemoveContainer" containerID="ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.865529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c\": container with ID starting with ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c not found: ID does not exist" containerID="ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.865557 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c"} err="failed to get container status \"ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c\": rpc error: code = NotFound desc = could not find container \"ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c\": container with ID starting with ba45f91d0fc3a18709c3bbe39f188115de4b2cbb0fad4b48c79eb2cb05c84d9c not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.865578 4707 scope.go:117] "RemoveContainer" containerID="c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.865853 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538\": container with ID starting with c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538 not found: ID does not exist" containerID="c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.865873 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538"} err="failed to get container status \"c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538\": rpc error: code = NotFound desc = could not find container \"c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538\": container with ID starting with c19d99d1968cd9ce98357918b9848931483145bc8786cefd628b47ca622fe538 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.865888 4707 scope.go:117] "RemoveContainer" containerID="ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.866277 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc\": container with ID starting with ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc not found: ID does not exist" containerID="ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.866300 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc"} err="failed to get container status \"ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc\": rpc error: code = 
NotFound desc = could not find container \"ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc\": container with ID starting with ca1a21a6d8f5e8698cfcf628073260945ab483b05c3cea30cb8844af134043dc not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.866313 4707 scope.go:117] "RemoveContainer" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.877649 4707 scope.go:117] "RemoveContainer" containerID="4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.888687 4707 scope.go:117] "RemoveContainer" containerID="fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.898771 4707 scope.go:117] "RemoveContainer" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.899069 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4\": container with ID starting with b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4 not found: ID does not exist" containerID="b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.899103 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4"} err="failed to get container status \"b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4\": rpc error: code = NotFound desc = could not find container \"b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4\": container with ID starting with b5ed58e20143ede6beb95e5d3a1b481b2fa160456adb794477309a4c420ad4c4 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.899132 4707 scope.go:117] "RemoveContainer" containerID="4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.899427 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e\": container with ID starting with 4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e not found: ID does not exist" containerID="4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.899448 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e"} err="failed to get container status \"4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e\": rpc error: code = NotFound desc = could not find container \"4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e\": container with ID starting with 4da796b560e6dbc972359a7718d81b43852838f9b81e599d1bb2fe352521923e not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.899465 4707 scope.go:117] "RemoveContainer" containerID="fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.899739 4707 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c\": container with ID starting with fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c not found: ID does not exist" containerID="fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.899758 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c"} err="failed to get container status \"fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c\": rpc error: code = NotFound desc = could not find container \"fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c\": container with ID starting with fcb918115e458daf4f56d191642164e511bb2e1094a0cbefc7231d1d5384043c not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.899774 4707 scope.go:117] "RemoveContainer" containerID="7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.909041 4707 scope.go:117] "RemoveContainer" containerID="5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.921785 4707 scope.go:117] "RemoveContainer" containerID="6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.931849 4707 scope.go:117] "RemoveContainer" containerID="7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.932288 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543\": container with ID starting with 7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543 not found: ID does not exist" containerID="7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.932311 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543"} err="failed to get container status \"7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543\": rpc error: code = NotFound desc = could not find container \"7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543\": container with ID starting with 7c994503f292fd7a973af8f20263d7ae531f4ecf3f76601e2dd2a1e820011543 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.932329 4707 scope.go:117] "RemoveContainer" containerID="5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.932838 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13\": container with ID starting with 5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13 not found: ID does not exist" containerID="5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.932862 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13"} err="failed to get container status \"5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13\": rpc error: code = NotFound desc = could not find container \"5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13\": container with ID starting with 5dac53e9b57c0925a1970870e3538303cfe8eb2f3fe6818f1d728ae24bf7fe13 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.932880 4707 scope.go:117] "RemoveContainer" containerID="6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.933425 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9\": container with ID starting with 6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9 not found: ID does not exist" containerID="6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.933446 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9"} err="failed to get container status \"6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9\": rpc error: code = NotFound desc = could not find container \"6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9\": container with ID starting with 6edc70067b01c5cc29634452505398aa962ccec33f1750463f60f7f6df786ae9 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.933461 4707 scope.go:117] "RemoveContainer" containerID="9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.947916 4707 scope.go:117] "RemoveContainer" containerID="a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.961013 4707 scope.go:117] "RemoveContainer" containerID="b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.971292 4707 scope.go:117] "RemoveContainer" containerID="9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.971544 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8\": container with ID starting with 9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8 not found: ID does not exist" containerID="9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.971639 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8"} err="failed to get container status \"9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8\": rpc error: code = NotFound desc = could not find container \"9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8\": container with ID starting with 9cd45cdbf9a5f688c140390393c03847a651dd32ba87a40dddbedc73b344abc8 not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.971716 4707 
scope.go:117] "RemoveContainer" containerID="a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.972044 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e\": container with ID starting with a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e not found: ID does not exist" containerID="a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.972078 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e"} err="failed to get container status \"a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e\": rpc error: code = NotFound desc = could not find container \"a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e\": container with ID starting with a94f19aa3c41f477416496a94f924bbe23132c31fbf005e08ae3e69139ddde1e not found: ID does not exist" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.972102 4707 scope.go:117] "RemoveContainer" containerID="b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2" Jan 21 15:05:28 crc kubenswrapper[4707]: E0121 15:05:28.972455 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2\": container with ID starting with b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2 not found: ID does not exist" containerID="b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2" Jan 21 15:05:28 crc kubenswrapper[4707]: I0121 15:05:28.972485 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2"} err="failed to get container status \"b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2\": rpc error: code = NotFound desc = could not find container \"b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2\": container with ID starting with b95486613177aef81facf442b5a0e0a313adc989bde216f648ca6d45aac76cf2 not found: ID does not exist" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.189082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" path="/var/lib/kubelet/pods/0e136188-e9e8-4eea-bd22-e01006fadce4/volumes" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.190253 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" path="/var/lib/kubelet/pods/10e83da2-c4ad-415c-9c64-7507e2864b39/volumes" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.191029 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb79892-2b5e-47cf-9381-08a5654df324" path="/var/lib/kubelet/pods/beb79892-2b5e-47cf-9381-08a5654df324/volumes" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.192313 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d43c6f9c-a13c-406e-90f3-152b22768174" path="/var/lib/kubelet/pods/d43c6f9c-a13c-406e-90f3-152b22768174/volumes" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.192747 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" path="/var/lib/kubelet/pods/f4e7d97d-3ff8-43f3-86b0-14de31de7179/volumes" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.791837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" event={"ID":"654186e4-d936-41b5-8967-fb853141daf4","Type":"ContainerStarted","Data":"cf39b51e4f56bedf86a8549aedaebd26b75855013d6fd498265d13dfa3fa163d"} Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.792337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.794700 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" Jan 21 15:05:29 crc kubenswrapper[4707]: I0121 15:05:29.804864 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xcp6l" podStartSLOduration=2.804850463 podStartE2EDuration="2.804850463s" podCreationTimestamp="2026-01-21 15:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:29.803135895 +0000 UTC m=+226.984652117" watchObservedRunningTime="2026-01-21 15:05:29.804850463 +0000 UTC m=+226.986366686" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.084888 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjbnh"] Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43c6f9c-a13c-406e-90f3-152b22768174" containerName="marketplace-operator" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085153 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43c6f9c-a13c-406e-90f3-152b22768174" containerName="marketplace-operator" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085169 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085174 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085218 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085231 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085244 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085252 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085257 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085263 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085268 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085274 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085279 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085287 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085301 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085306 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085312 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085317 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="extract-content" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085345 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085350 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="extract-utilities" Jan 21 15:05:30 crc kubenswrapper[4707]: E0121 15:05:30.085358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085442 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e7d97d-3ff8-43f3-86b0-14de31de7179" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085468 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d43c6f9c-a13c-406e-90f3-152b22768174" containerName="marketplace-operator" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085478 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb79892-2b5e-47cf-9381-08a5654df324" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085484 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e83da2-c4ad-415c-9c64-7507e2864b39" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.085492 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e136188-e9e8-4eea-bd22-e01006fadce4" containerName="registry-server" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.086203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.087801 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.093177 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjbnh"] Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.264113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a715c-0a07-4197-9617-ae4c81c49d34-utilities\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.264180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a715c-0a07-4197-9617-ae4c81c49d34-catalog-content\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.264227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlxg\" (UniqueName: \"kubernetes.io/projected/978a715c-0a07-4197-9617-ae4c81c49d34-kube-api-access-xzlxg\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.286753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qt95"] Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.287760 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.289439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.293798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qt95"] Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.365494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a715c-0a07-4197-9617-ae4c81c49d34-utilities\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.365931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a715c-0a07-4197-9617-ae4c81c49d34-catalog-content\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.366084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlxg\" (UniqueName: \"kubernetes.io/projected/978a715c-0a07-4197-9617-ae4c81c49d34-kube-api-access-xzlxg\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.366229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a715c-0a07-4197-9617-ae4c81c49d34-utilities\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.366595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a715c-0a07-4197-9617-ae4c81c49d34-catalog-content\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.383440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlxg\" (UniqueName: \"kubernetes.io/projected/978a715c-0a07-4197-9617-ae4c81c49d34-kube-api-access-xzlxg\") pod \"redhat-marketplace-mjbnh\" (UID: \"978a715c-0a07-4197-9617-ae4c81c49d34\") " pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.403994 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.469116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1d939f-e4be-4740-83ef-b93efb9d0db8-utilities\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.469368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984t9\" (UniqueName: \"kubernetes.io/projected/ba1d939f-e4be-4740-83ef-b93efb9d0db8-kube-api-access-984t9\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.469504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1d939f-e4be-4740-83ef-b93efb9d0db8-catalog-content\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.563366 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-845bf7bd6-mfstq"] Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.563542 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" podUID="d7f59015-8e9c-4593-9ae2-a26d2673543a" containerName="controller-manager" containerID="cri-o://0904bffa31bc1fe3b5e68e67c0e0f9aeed63b0baadb5888d22071f56b0fc406a" gracePeriod=30 Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.571008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-984t9\" (UniqueName: \"kubernetes.io/projected/ba1d939f-e4be-4740-83ef-b93efb9d0db8-kube-api-access-984t9\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.571093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1d939f-e4be-4740-83ef-b93efb9d0db8-catalog-content\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.571141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1d939f-e4be-4740-83ef-b93efb9d0db8-utilities\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.571564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1d939f-e4be-4740-83ef-b93efb9d0db8-utilities\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.571600 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1d939f-e4be-4740-83ef-b93efb9d0db8-catalog-content\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.586146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-984t9\" (UniqueName: \"kubernetes.io/projected/ba1d939f-e4be-4740-83ef-b93efb9d0db8-kube-api-access-984t9\") pod \"redhat-operators-5qt95\" (UID: \"ba1d939f-e4be-4740-83ef-b93efb9d0db8\") " pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.599942 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.668301 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546"] Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.668470 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" podUID="e312200c-b816-4716-b8be-ff5989fea607" containerName="route-controller-manager" containerID="cri-o://a0f8eaceeea68633d08090177a76a49aa53a51a9bc0dc80d97cc6326cc30c75e" gracePeriod=30 Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.754437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjbnh"] Jan 21 15:05:30 crc kubenswrapper[4707]: W0121 15:05:30.764109 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod978a715c_0a07_4197_9617_ae4c81c49d34.slice/crio-5d03f34a5f964bd00216ba30fc3d4e62356de4c77b1bf6fa8b37546e51b85932 WatchSource:0}: Error finding container 5d03f34a5f964bd00216ba30fc3d4e62356de4c77b1bf6fa8b37546e51b85932: Status 404 returned error can't find the container with id 5d03f34a5f964bd00216ba30fc3d4e62356de4c77b1bf6fa8b37546e51b85932 Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.797092 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7f59015-8e9c-4593-9ae2-a26d2673543a" containerID="0904bffa31bc1fe3b5e68e67c0e0f9aeed63b0baadb5888d22071f56b0fc406a" exitCode=0 Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.797153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" event={"ID":"d7f59015-8e9c-4593-9ae2-a26d2673543a","Type":"ContainerDied","Data":"0904bffa31bc1fe3b5e68e67c0e0f9aeed63b0baadb5888d22071f56b0fc406a"} Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.798582 4707 generic.go:334] "Generic (PLEG): container finished" podID="e312200c-b816-4716-b8be-ff5989fea607" containerID="a0f8eaceeea68633d08090177a76a49aa53a51a9bc0dc80d97cc6326cc30c75e" exitCode=0 Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.798632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" event={"ID":"e312200c-b816-4716-b8be-ff5989fea607","Type":"ContainerDied","Data":"a0f8eaceeea68633d08090177a76a49aa53a51a9bc0dc80d97cc6326cc30c75e"} Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.799482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjbnh" 
event={"ID":"978a715c-0a07-4197-9617-ae4c81c49d34","Type":"ContainerStarted","Data":"5d03f34a5f964bd00216ba30fc3d4e62356de4c77b1bf6fa8b37546e51b85932"} Jan 21 15:05:30 crc kubenswrapper[4707]: I0121 15:05:30.967938 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qt95"] Jan 21 15:05:30 crc kubenswrapper[4707]: W0121 15:05:30.975734 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1d939f_e4be_4740_83ef_b93efb9d0db8.slice/crio-1db05d8f3a16a7b8a8a5eb1c77fc275de4fa2de9337967be144f85d1d39c5b67 WatchSource:0}: Error finding container 1db05d8f3a16a7b8a8a5eb1c77fc275de4fa2de9337967be144f85d1d39c5b67: Status 404 returned error can't find the container with id 1db05d8f3a16a7b8a8a5eb1c77fc275de4fa2de9337967be144f85d1d39c5b67 Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.042308 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.077740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e312200c-b816-4716-b8be-ff5989fea607-serving-cert\") pod \"e312200c-b816-4716-b8be-ff5989fea607\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.077879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-config\") pod \"e312200c-b816-4716-b8be-ff5989fea607\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.077962 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-client-ca\") pod \"e312200c-b816-4716-b8be-ff5989fea607\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.077986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t77sx\" (UniqueName: \"kubernetes.io/projected/e312200c-b816-4716-b8be-ff5989fea607-kube-api-access-t77sx\") pod \"e312200c-b816-4716-b8be-ff5989fea607\" (UID: \"e312200c-b816-4716-b8be-ff5989fea607\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.078591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-config" (OuterVolumeSpecName: "config") pod "e312200c-b816-4716-b8be-ff5989fea607" (UID: "e312200c-b816-4716-b8be-ff5989fea607"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.078600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-client-ca" (OuterVolumeSpecName: "client-ca") pod "e312200c-b816-4716-b8be-ff5989fea607" (UID: "e312200c-b816-4716-b8be-ff5989fea607"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.081569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e312200c-b816-4716-b8be-ff5989fea607-kube-api-access-t77sx" (OuterVolumeSpecName: "kube-api-access-t77sx") pod "e312200c-b816-4716-b8be-ff5989fea607" (UID: "e312200c-b816-4716-b8be-ff5989fea607"). InnerVolumeSpecName "kube-api-access-t77sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.081908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e312200c-b816-4716-b8be-ff5989fea607-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e312200c-b816-4716-b8be-ff5989fea607" (UID: "e312200c-b816-4716-b8be-ff5989fea607"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.152891 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.179046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-client-ca\") pod \"d7f59015-8e9c-4593-9ae2-a26d2673543a\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.179313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-config\") pod \"d7f59015-8e9c-4593-9ae2-a26d2673543a\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.179625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7f59015-8e9c-4593-9ae2-a26d2673543a" (UID: "d7f59015-8e9c-4593-9ae2-a26d2673543a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.180076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-config" (OuterVolumeSpecName: "config") pod "d7f59015-8e9c-4593-9ae2-a26d2673543a" (UID: "d7f59015-8e9c-4593-9ae2-a26d2673543a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.180165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnt9x\" (UniqueName: \"kubernetes.io/projected/d7f59015-8e9c-4593-9ae2-a26d2673543a-kube-api-access-fnt9x\") pod \"d7f59015-8e9c-4593-9ae2-a26d2673543a\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.180674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-proxy-ca-bundles\") pod \"d7f59015-8e9c-4593-9ae2-a26d2673543a\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.180701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7f59015-8e9c-4593-9ae2-a26d2673543a-serving-cert\") pod \"d7f59015-8e9c-4593-9ae2-a26d2673543a\" (UID: \"d7f59015-8e9c-4593-9ae2-a26d2673543a\") " Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d7f59015-8e9c-4593-9ae2-a26d2673543a" (UID: "d7f59015-8e9c-4593-9ae2-a26d2673543a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181504 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t77sx\" (UniqueName: \"kubernetes.io/projected/e312200c-b816-4716-b8be-ff5989fea607-kube-api-access-t77sx\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181521 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e312200c-b816-4716-b8be-ff5989fea607-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181531 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181539 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181546 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181553 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f59015-8e9c-4593-9ae2-a26d2673543a-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.181577 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e312200c-b816-4716-b8be-ff5989fea607-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.182436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d7f59015-8e9c-4593-9ae2-a26d2673543a-kube-api-access-fnt9x" (OuterVolumeSpecName: "kube-api-access-fnt9x") pod "d7f59015-8e9c-4593-9ae2-a26d2673543a" (UID: "d7f59015-8e9c-4593-9ae2-a26d2673543a"). InnerVolumeSpecName "kube-api-access-fnt9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.183592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f59015-8e9c-4593-9ae2-a26d2673543a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7f59015-8e9c-4593-9ae2-a26d2673543a" (UID: "d7f59015-8e9c-4593-9ae2-a26d2673543a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.282399 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnt9x\" (UniqueName: \"kubernetes.io/projected/d7f59015-8e9c-4593-9ae2-a26d2673543a-kube-api-access-fnt9x\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.282425 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7f59015-8e9c-4593-9ae2-a26d2673543a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.796332 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c56d749c-psphc"] Jan 21 15:05:31 crc kubenswrapper[4707]: E0121 15:05:31.796577 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e312200c-b816-4716-b8be-ff5989fea607" containerName="route-controller-manager" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.796590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e312200c-b816-4716-b8be-ff5989fea607" containerName="route-controller-manager" Jan 21 15:05:31 crc kubenswrapper[4707]: E0121 15:05:31.796608 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f59015-8e9c-4593-9ae2-a26d2673543a" containerName="controller-manager" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.796614 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f59015-8e9c-4593-9ae2-a26d2673543a" containerName="controller-manager" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.796699 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e312200c-b816-4716-b8be-ff5989fea607" containerName="route-controller-manager" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.796713 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f59015-8e9c-4593-9ae2-a26d2673543a" containerName="controller-manager" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.797103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.799382 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.800158 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.802360 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c56d749c-psphc"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.804083 4707 generic.go:334] "Generic (PLEG): container finished" podID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" containerID="96b1c1ee9e2c2d022381a767901f9b9742017c72d644738f29a7aed41e31d01c" exitCode=0 Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.804179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qt95" event={"ID":"ba1d939f-e4be-4740-83ef-b93efb9d0db8","Type":"ContainerDied","Data":"96b1c1ee9e2c2d022381a767901f9b9742017c72d644738f29a7aed41e31d01c"} Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.804216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qt95" event={"ID":"ba1d939f-e4be-4740-83ef-b93efb9d0db8","Type":"ContainerStarted","Data":"1db05d8f3a16a7b8a8a5eb1c77fc275de4fa2de9337967be144f85d1d39c5b67"} Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.804880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.805635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" event={"ID":"d7f59015-8e9c-4593-9ae2-a26d2673543a","Type":"ContainerDied","Data":"6eeb7206b5a7d986383bbb6d1a29e77e17fc5778c2865a67457b6222e8f5e7a0"} Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.805664 4707 scope.go:117] "RemoveContainer" containerID="0904bffa31bc1fe3b5e68e67c0e0f9aeed63b0baadb5888d22071f56b0fc406a" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.806058 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845bf7bd6-mfstq" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.808610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" event={"ID":"e312200c-b816-4716-b8be-ff5989fea607","Type":"ContainerDied","Data":"2fe872a22aadf62d2cdf1d33dc722d2688cb182de544116233b409626b3ec737"} Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.808646 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.811828 4707 generic.go:334] "Generic (PLEG): container finished" podID="978a715c-0a07-4197-9617-ae4c81c49d34" containerID="58fbef135fe33e99c252ca028718f4a9dc72fda4786004e9a09af91eed04688d" exitCode=0 Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.812365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjbnh" event={"ID":"978a715c-0a07-4197-9617-ae4c81c49d34","Type":"ContainerDied","Data":"58fbef135fe33e99c252ca028718f4a9dc72fda4786004e9a09af91eed04688d"} Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.827197 4707 scope.go:117] "RemoveContainer" containerID="a0f8eaceeea68633d08090177a76a49aa53a51a9bc0dc80d97cc6326cc30c75e" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.866130 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-845bf7bd6-mfstq"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.868077 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-845bf7bd6-mfstq"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.872330 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.874345 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-968cb47b5-fq546"] Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b73c935c-7930-46cd-93cf-8d56a06aed72-serving-cert\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-client-ca\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b73c935c-7930-46cd-93cf-8d56a06aed72-client-ca\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627ac71d-e70a-4755-ab21-e415cf155fe5-serving-cert\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-config\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73c935c-7930-46cd-93cf-8d56a06aed72-config\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889736 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8f5\" (UniqueName: \"kubernetes.io/projected/627ac71d-e70a-4755-ab21-e415cf155fe5-kube-api-access-wg8f5\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqwm\" (UniqueName: \"kubernetes.io/projected/b73c935c-7930-46cd-93cf-8d56a06aed72-kube-api-access-vkqwm\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.889771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-proxy-ca-bundles\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b73c935c-7930-46cd-93cf-8d56a06aed72-serving-cert\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-client-ca\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b73c935c-7930-46cd-93cf-8d56a06aed72-client-ca\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/627ac71d-e70a-4755-ab21-e415cf155fe5-serving-cert\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-config\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73c935c-7930-46cd-93cf-8d56a06aed72-config\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8f5\" (UniqueName: \"kubernetes.io/projected/627ac71d-e70a-4755-ab21-e415cf155fe5-kube-api-access-wg8f5\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991202 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqwm\" (UniqueName: \"kubernetes.io/projected/b73c935c-7930-46cd-93cf-8d56a06aed72-kube-api-access-vkqwm\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.991219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-proxy-ca-bundles\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.992376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-client-ca\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.992397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b73c935c-7930-46cd-93cf-8d56a06aed72-client-ca\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.992463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-proxy-ca-bundles\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " 
pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.992913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73c935c-7930-46cd-93cf-8d56a06aed72-config\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.993358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627ac71d-e70a-4755-ab21-e415cf155fe5-config\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.994795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627ac71d-e70a-4755-ab21-e415cf155fe5-serving-cert\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:31 crc kubenswrapper[4707]: I0121 15:05:31.995384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b73c935c-7930-46cd-93cf-8d56a06aed72-serving-cert\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.007048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqwm\" (UniqueName: \"kubernetes.io/projected/b73c935c-7930-46cd-93cf-8d56a06aed72-kube-api-access-vkqwm\") pod \"route-controller-manager-c56d749c-psphc\" (UID: \"b73c935c-7930-46cd-93cf-8d56a06aed72\") " pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.007274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8f5\" (UniqueName: \"kubernetes.io/projected/627ac71d-e70a-4755-ab21-e415cf155fe5-kube-api-access-wg8f5\") pod \"controller-manager-5477b5ddcd-pzv6q\" (UID: \"627ac71d-e70a-4755-ab21-e415cf155fe5\") " pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.125049 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.125456 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.486405 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r67jn"] Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.493162 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c56d749c-psphc"] Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.494079 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q"] Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.493262 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.498648 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 15:05:32 crc kubenswrapper[4707]: W0121 15:05:32.504042 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb73c935c_7930_46cd_93cf_8d56a06aed72.slice/crio-16e8e928e7527bb4c3b3a037d7ed3c7a7d1befd8a8d84fdd25266d64da53df26 WatchSource:0}: Error finding container 16e8e928e7527bb4c3b3a037d7ed3c7a7d1befd8a8d84fdd25266d64da53df26: Status 404 returned error can't find the container with id 16e8e928e7527bb4c3b3a037d7ed3c7a7d1befd8a8d84fdd25266d64da53df26 Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.513349 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r67jn"] Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.602240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmq4\" (UniqueName: \"kubernetes.io/projected/332f17e3-46fd-4f6a-a1bc-8021c3590a38-kube-api-access-wvmq4\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.602307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-utilities\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.602343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-catalog-content\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.693490 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdz4h"] Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.694517 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.697500 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.703242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-catalog-content\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.703370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmq4\" (UniqueName: \"kubernetes.io/projected/332f17e3-46fd-4f6a-a1bc-8021c3590a38-kube-api-access-wvmq4\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.703471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-utilities\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.703901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-catalog-content\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.703977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-utilities\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.709608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdz4h"] Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.722566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmq4\" (UniqueName: \"kubernetes.io/projected/332f17e3-46fd-4f6a-a1bc-8021c3590a38-kube-api-access-wvmq4\") pod \"certified-operators-r67jn\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.805277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-catalog-content\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.805341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-utilities\") pod \"community-operators-kdz4h\" (UID: 
\"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.805514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d9rn\" (UniqueName: \"kubernetes.io/projected/8123c2d1-a229-4e16-9845-aa86f7849baa-kube-api-access-9d9rn\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.819667 4707 generic.go:334] "Generic (PLEG): container finished" podID="978a715c-0a07-4197-9617-ae4c81c49d34" containerID="3b4b1366ffb4af43f7061d8e1e66bab25ab030f580ebb5fc8efc0d075f7f79e7" exitCode=0 Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.819727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjbnh" event={"ID":"978a715c-0a07-4197-9617-ae4c81c49d34","Type":"ContainerDied","Data":"3b4b1366ffb4af43f7061d8e1e66bab25ab030f580ebb5fc8efc0d075f7f79e7"} Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.821504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qt95" event={"ID":"ba1d939f-e4be-4740-83ef-b93efb9d0db8","Type":"ContainerStarted","Data":"aff15436808d8475deb18e118b259983a096f1d8bc0326429f5c24246dc15d68"} Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.823073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" event={"ID":"b73c935c-7930-46cd-93cf-8d56a06aed72","Type":"ContainerStarted","Data":"7179bbaf7dda7d6ec126eb5ab6e668e6b8b615ca203474b93a84a9ece771cd88"} Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.823110 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" event={"ID":"b73c935c-7930-46cd-93cf-8d56a06aed72","Type":"ContainerStarted","Data":"16e8e928e7527bb4c3b3a037d7ed3c7a7d1befd8a8d84fdd25266d64da53df26"} Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.823226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.826009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" event={"ID":"627ac71d-e70a-4755-ab21-e415cf155fe5","Type":"ContainerStarted","Data":"efd9101be843a71313348079c3d1c8192946c1d2f88f30d10b92baaafb1464ec"} Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.826045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" event={"ID":"627ac71d-e70a-4755-ab21-e415cf155fe5","Type":"ContainerStarted","Data":"2760b0bf21ce1f624175a3adee2f0b2bd0c729ca972a5d8056a4ed82198fe713"} Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.826319 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.834742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.857300 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.878293 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5477b5ddcd-pzv6q" podStartSLOduration=2.878278317 podStartE2EDuration="2.878278317s" podCreationTimestamp="2026-01-21 15:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:32.856797204 +0000 UTC m=+230.038313425" watchObservedRunningTime="2026-01-21 15:05:32.878278317 +0000 UTC m=+230.059794529" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.891051 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.899124 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c56d749c-psphc" podStartSLOduration=2.899111302 podStartE2EDuration="2.899111302s" podCreationTimestamp="2026-01-21 15:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:05:32.897472577 +0000 UTC m=+230.078988799" watchObservedRunningTime="2026-01-21 15:05:32.899111302 +0000 UTC m=+230.080627525" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.908919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d9rn\" (UniqueName: \"kubernetes.io/projected/8123c2d1-a229-4e16-9845-aa86f7849baa-kube-api-access-9d9rn\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.909018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-catalog-content\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.909048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-utilities\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.910627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-catalog-content\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.911127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-utilities\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:32 crc kubenswrapper[4707]: I0121 15:05:32.928733 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9d9rn\" (UniqueName: \"kubernetes.io/projected/8123c2d1-a229-4e16-9845-aa86f7849baa-kube-api-access-9d9rn\") pod \"community-operators-kdz4h\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.006441 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.187912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f59015-8e9c-4593-9ae2-a26d2673543a" path="/var/lib/kubelet/pods/d7f59015-8e9c-4593-9ae2-a26d2673543a/volumes" Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.188393 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e312200c-b816-4716-b8be-ff5989fea607" path="/var/lib/kubelet/pods/e312200c-b816-4716-b8be-ff5989fea607/volumes" Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.284008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r67jn"] Jan 21 15:05:33 crc kubenswrapper[4707]: W0121 15:05:33.297422 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod332f17e3_46fd_4f6a_a1bc_8021c3590a38.slice/crio-163dcf16f85e277eaee44b3b4493873e892fc2d6fd99f3320c02f28d433ec6cf WatchSource:0}: Error finding container 163dcf16f85e277eaee44b3b4493873e892fc2d6fd99f3320c02f28d433ec6cf: Status 404 returned error can't find the container with id 163dcf16f85e277eaee44b3b4493873e892fc2d6fd99f3320c02f28d433ec6cf Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.371017 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdz4h"] Jan 21 15:05:33 crc kubenswrapper[4707]: W0121 15:05:33.430017 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8123c2d1_a229_4e16_9845_aa86f7849baa.slice/crio-f17c9112882ec4d7c6aa93f79b489e38a29f5bd498ca02f0adf0701c43e746ea WatchSource:0}: Error finding container f17c9112882ec4d7c6aa93f79b489e38a29f5bd498ca02f0adf0701c43e746ea: Status 404 returned error can't find the container with id f17c9112882ec4d7c6aa93f79b489e38a29f5bd498ca02f0adf0701c43e746ea Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.830927 4707 generic.go:334] "Generic (PLEG): container finished" podID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerID="22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b" exitCode=0 Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.831159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdz4h" event={"ID":"8123c2d1-a229-4e16-9845-aa86f7849baa","Type":"ContainerDied","Data":"22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b"} Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.831207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdz4h" event={"ID":"8123c2d1-a229-4e16-9845-aa86f7849baa","Type":"ContainerStarted","Data":"f17c9112882ec4d7c6aa93f79b489e38a29f5bd498ca02f0adf0701c43e746ea"} Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.834476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjbnh" 
event={"ID":"978a715c-0a07-4197-9617-ae4c81c49d34","Type":"ContainerStarted","Data":"c5f8572b2b7e317a69473edc42f704aed80504a5c07e5f9c628d285e23da52cf"} Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.835667 4707 generic.go:334] "Generic (PLEG): container finished" podID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerID="8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c" exitCode=0 Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.835722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerDied","Data":"8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c"} Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.835742 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerStarted","Data":"163dcf16f85e277eaee44b3b4493873e892fc2d6fd99f3320c02f28d433ec6cf"} Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.838442 4707 generic.go:334] "Generic (PLEG): container finished" podID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" containerID="aff15436808d8475deb18e118b259983a096f1d8bc0326429f5c24246dc15d68" exitCode=0 Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.838735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qt95" event={"ID":"ba1d939f-e4be-4740-83ef-b93efb9d0db8","Type":"ContainerDied","Data":"aff15436808d8475deb18e118b259983a096f1d8bc0326429f5c24246dc15d68"} Jan 21 15:05:33 crc kubenswrapper[4707]: I0121 15:05:33.885203 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjbnh" podStartSLOduration=2.444275339 podStartE2EDuration="3.88517423s" podCreationTimestamp="2026-01-21 15:05:30 +0000 UTC" firstStartedPulling="2026-01-21 15:05:31.81352199 +0000 UTC m=+228.995038213" lastFinishedPulling="2026-01-21 15:05:33.254420882 +0000 UTC m=+230.435937104" observedRunningTime="2026-01-21 15:05:33.884144177 +0000 UTC m=+231.065660399" watchObservedRunningTime="2026-01-21 15:05:33.88517423 +0000 UTC m=+231.066690453" Jan 21 15:05:34 crc kubenswrapper[4707]: I0121 15:05:34.844481 4707 generic.go:334] "Generic (PLEG): container finished" podID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerID="a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b" exitCode=0 Jan 21 15:05:34 crc kubenswrapper[4707]: I0121 15:05:34.844830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdz4h" event={"ID":"8123c2d1-a229-4e16-9845-aa86f7849baa","Type":"ContainerDied","Data":"a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b"} Jan 21 15:05:34 crc kubenswrapper[4707]: I0121 15:05:34.847379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerStarted","Data":"cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915"} Jan 21 15:05:34 crc kubenswrapper[4707]: I0121 15:05:34.850352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qt95" event={"ID":"ba1d939f-e4be-4740-83ef-b93efb9d0db8","Type":"ContainerStarted","Data":"1fd076db9c4eed8607121ba4a0d6ca5df693f738c3cbfac096d814400824324f"} Jan 21 15:05:34 crc kubenswrapper[4707]: I0121 15:05:34.882607 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qt95" podStartSLOduration=2.345186463 podStartE2EDuration="4.882551546s" podCreationTimestamp="2026-01-21 15:05:30 +0000 UTC" firstStartedPulling="2026-01-21 15:05:31.805423407 +0000 UTC m=+228.986939630" lastFinishedPulling="2026-01-21 15:05:34.342788491 +0000 UTC m=+231.524304713" observedRunningTime="2026-01-21 15:05:34.880293076 +0000 UTC m=+232.061809308" watchObservedRunningTime="2026-01-21 15:05:34.882551546 +0000 UTC m=+232.064067769" Jan 21 15:05:35 crc kubenswrapper[4707]: I0121 15:05:35.856911 4707 generic.go:334] "Generic (PLEG): container finished" podID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerID="cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915" exitCode=0 Jan 21 15:05:35 crc kubenswrapper[4707]: I0121 15:05:35.856975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerDied","Data":"cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915"} Jan 21 15:05:36 crc kubenswrapper[4707]: I0121 15:05:36.864687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerStarted","Data":"9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a"} Jan 21 15:05:36 crc kubenswrapper[4707]: I0121 15:05:36.867490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdz4h" event={"ID":"8123c2d1-a229-4e16-9845-aa86f7849baa","Type":"ContainerStarted","Data":"1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985"} Jan 21 15:05:36 crc kubenswrapper[4707]: I0121 15:05:36.881503 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r67jn" podStartSLOduration=2.394444827 podStartE2EDuration="4.881490062s" podCreationTimestamp="2026-01-21 15:05:32 +0000 UTC" firstStartedPulling="2026-01-21 15:05:33.836573469 +0000 UTC m=+231.018089691" lastFinishedPulling="2026-01-21 15:05:36.323618705 +0000 UTC m=+233.505134926" observedRunningTime="2026-01-21 15:05:36.879135052 +0000 UTC m=+234.060651273" watchObservedRunningTime="2026-01-21 15:05:36.881490062 +0000 UTC m=+234.063006284" Jan 21 15:05:36 crc kubenswrapper[4707]: I0121 15:05:36.895674 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdz4h" podStartSLOduration=3.355684958 podStartE2EDuration="4.895659079s" podCreationTimestamp="2026-01-21 15:05:32 +0000 UTC" firstStartedPulling="2026-01-21 15:05:33.832331376 +0000 UTC m=+231.013847599" lastFinishedPulling="2026-01-21 15:05:35.372305498 +0000 UTC m=+232.553821720" observedRunningTime="2026-01-21 15:05:36.893562294 +0000 UTC m=+234.075078515" watchObservedRunningTime="2026-01-21 15:05:36.895659079 +0000 UTC m=+234.077175302" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.404697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.405192 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.435435 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.601031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.601072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.628791 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.668516 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669212 4707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669442 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669584 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6" gracePeriod=15 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669600 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e" gracePeriod=15 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669658 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d" gracePeriod=15 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669690 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592" gracePeriod=15 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.669721 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363" gracePeriod=15 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670231 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670454 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670466 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670475 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670480 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670512 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670520 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670525 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670533 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670539 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670555 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 15:05:40 crc kubenswrapper[4707]: E0121 15:05:40.670579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670757 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670938 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.670946 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.671203 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.695719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.796988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.797179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.886356 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.887597 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.888134 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e" exitCode=0 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.888173 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592" exitCode=0 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.888185 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d" exitCode=0 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.888194 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363" exitCode=2 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.888221 4707 scope.go:117] "RemoveContainer" containerID="f3eea3dcc43b6cd9c192f932a6906ad751a785065c279546d94285386a6e53bc" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.889577 4707 generic.go:334] "Generic (PLEG): container finished" podID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" containerID="433af81b663997986e5842b3ce3673f5d46dd44c14c76658b4cc2dd291e8b55d" exitCode=0 Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.889650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"308450d9-b350-4362-9d3b-5e5c3a09ae9a","Type":"ContainerDied","Data":"433af81b663997986e5842b3ce3673f5d46dd44c14c76658b4cc2dd291e8b55d"} Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.890910 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.891173 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.916413 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjbnh" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.916759 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.917061 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.917249 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc 
kubenswrapper[4707]: I0121 15:05:40.917255 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qt95" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.917513 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.917690 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.917863 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:40 crc kubenswrapper[4707]: I0121 15:05:40.918011 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:41 crc kubenswrapper[4707]: I0121 15:05:41.902631 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.178928 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.179299 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.179538 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.179854 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.214564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kubelet-dir\") pod \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.214636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-var-lock\") pod \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.214688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "308450d9-b350-4362-9d3b-5e5c3a09ae9a" (UID: "308450d9-b350-4362-9d3b-5e5c3a09ae9a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.214701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kube-api-access\") pod \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\" (UID: \"308450d9-b350-4362-9d3b-5e5c3a09ae9a\") " Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.214759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-var-lock" (OuterVolumeSpecName: "var-lock") pod "308450d9-b350-4362-9d3b-5e5c3a09ae9a" (UID: "308450d9-b350-4362-9d3b-5e5c3a09ae9a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.215174 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.215197 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.219016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "308450d9-b350-4362-9d3b-5e5c3a09ae9a" (UID: "308450d9-b350-4362-9d3b-5e5c3a09ae9a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.316558 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308450d9-b350-4362-9d3b-5e5c3a09ae9a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.892081 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.892121 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.908479 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.908959 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6" exitCode=0 Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.909008 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a0dcbcd1034e6b9fdc7b0624ec1ccf416d53697d9458faac4d171817ba051d" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.909908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"308450d9-b350-4362-9d3b-5e5c3a09ae9a","Type":"ContainerDied","Data":"d923ec6cb18300f0c8adb5c8ec0455634098cdbf91307a90a1a7163037e2f91c"} Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.909934 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d923ec6cb18300f0c8adb5c8ec0455634098cdbf91307a90a1a7163037e2f91c" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.909977 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.919323 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.919773 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.920069 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.920352 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.920528 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.928970 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.929258 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.929420 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.929601 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.930996 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.931486 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.931844 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.932046 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.932234 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.932424 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.932629 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.944263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.944572 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.944790 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.945018 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.945206 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:42 crc kubenswrapper[4707]: I0121 15:05:42.945401 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.006713 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.006953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025878 4707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025893 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.025902 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.031652 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.032065 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.032462 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.032674 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.032898 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.033119 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.033350 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.184651 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.189568 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.190093 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.190411 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.190772 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.191062 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.192182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.913831 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.914238 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.914430 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.914890 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.915078 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.915252 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.915432 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.916427 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.916627 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.916855 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.917108 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.917374 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.917557 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.947360 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.947661 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.947920 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.948260 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.948472 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.948678 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:43 crc kubenswrapper[4707]: I0121 15:05:43.948994 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:45 crc kubenswrapper[4707]: E0121 15:05:45.704332 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.165:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.704886 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:45 crc kubenswrapper[4707]: W0121 15:05:45.719393 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-439b1378214502e1d8e44b861ac0b4d944fbbd667cfd4194eb1f6e90b43a80d6 WatchSource:0}: Error finding container 439b1378214502e1d8e44b861ac0b4d944fbbd667cfd4194eb1f6e90b43a80d6: Status 404 returned error can't find the container with id 439b1378214502e1d8e44b861ac0b4d944fbbd667cfd4194eb1f6e90b43a80d6 Jan 21 15:05:45 crc kubenswrapper[4707]: E0121 15:05:45.722158 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.165:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188cc75aa18bbfd8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 15:05:45.72103676 +0000 UTC m=+242.902552982,LastTimestamp:2026-01-21 15:05:45.72103676 +0000 UTC m=+242.902552982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.922517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"abb4069de2d6843ebe57aa07fd2b2d92c8c766b02f89be31d55b40bf9de4aa4e"} Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.922721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"439b1378214502e1d8e44b861ac0b4d944fbbd667cfd4194eb1f6e90b43a80d6"} Jan 21 15:05:45 crc kubenswrapper[4707]: E0121 15:05:45.923221 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.165:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.923274 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.923509 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.923755 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.924051 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:45 crc kubenswrapper[4707]: I0121 15:05:45.924293 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.010515 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.010981 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.011228 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.011495 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.011861 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:48 crc 
kubenswrapper[4707]: I0121 15:05:48.011889 4707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.012100 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="200ms" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.213031 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="400ms" Jan 21 15:05:48 crc kubenswrapper[4707]: E0121 15:05:48.614090 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="800ms" Jan 21 15:05:49 crc kubenswrapper[4707]: E0121 15:05:49.414489 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="1.6s" Jan 21 15:05:51 crc kubenswrapper[4707]: E0121 15:05:51.015119 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="3.2s" Jan 21 15:05:53 crc kubenswrapper[4707]: I0121 15:05:53.183660 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:53 crc kubenswrapper[4707]: I0121 15:05:53.183908 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:53 crc kubenswrapper[4707]: I0121 15:05:53.184225 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:53 crc kubenswrapper[4707]: I0121 15:05:53.184529 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:53 crc kubenswrapper[4707]: I0121 15:05:53.184772 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.182177 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.182914 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.183192 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.183601 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.183921 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.184254 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.194205 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.194226 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:05:54 crc kubenswrapper[4707]: E0121 15:05:54.194499 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.194780 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:54 crc kubenswrapper[4707]: E0121 15:05:54.216384 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.165:6443: connect: connection refused" interval="6.4s" Jan 21 15:05:54 crc kubenswrapper[4707]: E0121 15:05:54.220993 4707 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.25.165:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" volumeName="registry-storage" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.958434 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.958475 4707 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d" exitCode=1 Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.958520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d"} Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.958963 4707 scope.go:117] "RemoveContainer" containerID="1740c3b652dac0dc7bcd3461ab8d47a53044493d0ac0dd7cd1b63a67f849060d" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.959456 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.959668 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.959964 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.960257 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc 
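The lease-controller entries above show the retry interval doubling after each failed attempt (interval="200ms", "400ms", "800ms", "1.6s", "3.2s", "6.4s") before the kubelet falls back to ensuring the lease exists. A short Go sketch of that doubling pattern; illustrative only, with updateLease as a hypothetical stand-in rather than the kubelet's actual lease code:

// Reproduces the doubling retry interval visible in the lease-controller lines
// above (200ms -> 400ms -> ... -> 6.4s). Not the kubelet implementation.
package main

import (
	"errors"
	"fmt"
	"time"
)

// updateLease is a hypothetical stand-in for the API call that keeps failing
// with "connection refused" while the apiserver is unavailable.
func updateLease() error { return errors.New("connect: connection refused") }

func main() {
	interval := 200 * time.Millisecond
	for attempt := 1; attempt <= 6; attempt++ {
		if err := updateLease(); err == nil {
			fmt.Println("lease updated")
			return
		}
		fmt.Printf("attempt %d failed, next retry in %v\n", attempt, interval)
		time.Sleep(interval)
		interval *= 2 // 200ms, 400ms, 800ms, 1.6s, 3.2s, 6.4s, as in the log
	}
	fmt.Println("giving up; falling back to ensuring the lease exists")
}
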
kubenswrapper[4707]: I0121 15:05:54.960511 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.960745 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.960942 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ee2c56501b4c75e8ffe8ef42c3e65b02f4bbdac49269992f49eb42da278bd7ef" exitCode=0 Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.960971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ee2c56501b4c75e8ffe8ef42c3e65b02f4bbdac49269992f49eb42da278bd7ef"} Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.960996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ea11da9093d47223693f866ce9767befe38cb551f57551892205048e5b91b3c3"} Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.961350 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.961407 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.961734 4707 status_manager.go:851] "Failed to get status for pod" podUID="978a715c-0a07-4197-9617-ae4c81c49d34" pod="openshift-marketplace/redhat-marketplace-mjbnh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mjbnh\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: E0121 15:05:54.961795 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.961952 4707 status_manager.go:851] "Failed to get status for pod" podUID="ba1d939f-e4be-4740-83ef-b93efb9d0db8" pod="openshift-marketplace/redhat-operators-5qt95" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5qt95\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.962247 4707 status_manager.go:851] "Failed to get status for pod" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.962574 4707 status_manager.go:851] "Failed to get status for pod" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" pod="openshift-marketplace/community-operators-kdz4h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kdz4h\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.962844 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:54 crc kubenswrapper[4707]: I0121 15:05:54.963276 4707 status_manager.go:851] "Failed to get status for pod" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" pod="openshift-marketplace/certified-operators-r67jn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-r67jn\": dial tcp 192.168.25.165:6443: connect: connection refused" Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.966575 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.966800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa7cf07a04f3f993ac887227b17f5731ecbfda458f7648c35988776d61820398"} Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"101f903b2ed39c2dbb2709ea605d20ef4502899949d8b1f00fbdca7d39983f82"} Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"60e3103fc8632a94abbfd4a13f6bd153955916163563b40a7b21efabff3b1017"} Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8bec533dc64b58c8c4ceb68f7827ce8f1b4512b76e0313dff9c037752d38068b"} Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63781242d696dfb377268dddbc5c8ad492a038c4edc3fa7ba0a971ac9aa61f53"} Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f34696a4b23ae36eb88afb47941d0fe8ae0fecbad96db73610932f153930569"} Jan 
21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969643 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:05:55 crc kubenswrapper[4707]: I0121 15:05:55.969681 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:05:58 crc kubenswrapper[4707]: I0121 15:05:58.815410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:05:58 crc kubenswrapper[4707]: I0121 15:05:58.818278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:05:58 crc kubenswrapper[4707]: I0121 15:05:58.981037 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:05:59 crc kubenswrapper[4707]: I0121 15:05:59.194970 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:59 crc kubenswrapper[4707]: I0121 15:05:59.195010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:05:59 crc kubenswrapper[4707]: I0121 15:05:59.199356 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:06:01 crc kubenswrapper[4707]: I0121 15:06:01.107115 4707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:06:01 crc kubenswrapper[4707]: I0121 15:06:01.994045 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:06:01 crc kubenswrapper[4707]: I0121 15:06:01.994084 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:06:01 crc kubenswrapper[4707]: I0121 15:06:01.997285 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:06:02 crc kubenswrapper[4707]: I0121 15:06:02.997527 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:06:02 crc kubenswrapper[4707]: I0121 15:06:02.997554 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e91fd434-0bd6-4c48-a7ae-a65aef6fd9b1" Jan 21 15:06:03 crc kubenswrapper[4707]: I0121 15:06:03.192642 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="324ea278-0cea-4405-8ded-49f7a7521751" Jan 21 15:06:10 crc kubenswrapper[4707]: I0121 15:06:10.746109 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 15:06:10 crc kubenswrapper[4707]: I0121 15:06:10.850791 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:06:10 crc kubenswrapper[4707]: I0121 15:06:10.900471 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 15:06:11 crc kubenswrapper[4707]: I0121 15:06:11.240759 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 15:06:11 crc kubenswrapper[4707]: I0121 15:06:11.487403 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 15:06:11 crc kubenswrapper[4707]: I0121 15:06:11.512284 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 15:06:11 crc kubenswrapper[4707]: I0121 15:06:11.803941 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 15:06:11 crc kubenswrapper[4707]: I0121 15:06:11.908857 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 15:06:11 crc kubenswrapper[4707]: I0121 15:06:11.937594 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.008504 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.092880 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.146388 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.181438 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.256493 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.387489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.774340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:06:12 crc kubenswrapper[4707]: I0121 15:06:12.955411 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.011774 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.230474 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.351268 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.420112 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.621946 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.664194 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 15:06:13 crc kubenswrapper[4707]: I0121 15:06:13.936985 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.116640 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.427479 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.528287 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.753975 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.776981 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.832225 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.896493 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 15:06:14 crc kubenswrapper[4707]: I0121 15:06:14.913521 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.141010 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.152768 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.158723 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.164667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.191861 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.248447 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.387401 4707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.487967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.504327 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.533582 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.592140 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.598317 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.673245 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.694934 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.770874 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.774629 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.949690 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 15:06:15 crc kubenswrapper[4707]: I0121 15:06:15.977345 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.029625 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.106320 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.116713 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.178353 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.331921 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.470296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.518251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.641304 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.641678 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.670055 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.690591 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.712917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.842930 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.896476 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 15:06:16 crc kubenswrapper[4707]: I0121 15:06:16.990201 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.077711 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.379923 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.437970 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.467899 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.505089 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.620218 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.647616 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.696685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.774326 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.877415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.907483 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.955077 4707 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 15:06:17 crc kubenswrapper[4707]: I0121 15:06:17.956950 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.096632 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.112492 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.170317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.214918 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.223376 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.360291 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.471651 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.481778 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.520836 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.542988 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.745789 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.759764 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.771073 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.850799 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.875381 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:06:18 crc kubenswrapper[4707]: I0121 15:06:18.985273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.033324 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.045242 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.060681 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.121056 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.177395 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.202084 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.336829 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.349876 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.397942 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.418185 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.483792 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.523464 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.733412 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.776871 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.817995 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.943135 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.966086 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 15:06:19 crc kubenswrapper[4707]: I0121 15:06:19.993561 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.057589 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.318128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.336842 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.347667 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.363199 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.420509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.509578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.512508 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.518510 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.627438 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.649527 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.695513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.729770 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.904273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 15:06:20 crc kubenswrapper[4707]: I0121 15:06:20.950310 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.041364 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.099953 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.144469 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.170307 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.189607 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.285330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 
15:06:21.373607 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.376675 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.400274 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.442539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.450498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.505066 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.527875 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.545868 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.626270 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.715768 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.716305 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.730193 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.747528 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.775107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.778486 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.868826 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 15:06:21 crc kubenswrapper[4707]: I0121 15:06:21.906240 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.050997 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.099273 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.201464 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.202396 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.220666 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.335707 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.346957 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.356032 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.413987 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.450468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.453452 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.548921 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.568095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.749927 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.847641 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.875841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.926926 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.973137 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 15:06:22 crc kubenswrapper[4707]: I0121 15:06:22.998346 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.006492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.007261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.101971 4707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.128925 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.185362 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.237062 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.245583 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.298354 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.311047 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.350391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.365715 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.422895 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.423933 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.443917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.508123 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.555265 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.557271 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.593994 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.645184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.659791 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.681990 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 15:06:23 crc kubenswrapper[4707]: I0121 15:06:23.683775 4707 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.010898 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.028987 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.035991 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.075249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.359531 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.449696 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.490664 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.544269 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.550729 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.553972 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.554025 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.557688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.572390 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.572378087 podStartE2EDuration="23.572378087s" podCreationTimestamp="2026-01-21 15:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:06:24.57005567 +0000 UTC m=+281.751571892" watchObservedRunningTime="2026-01-21 15:06:24.572378087 +0000 UTC m=+281.753894309" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.591214 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.636996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.657940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.731837 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
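[Editor's note] The pod_startup_latency_tracker entry above reports podStartSLOduration for kube-apiserver-crc as the observed-running timestamp minus podCreationTimestamp: 15:06:24.572 minus 15:06:01.000, about 23.57 s. The zero-value firstStartedPulling/lastFinishedPulling timestamps indicate no image-pull time was subtracted. A small sketch of the same subtraction (timestamps copied from the entry; which of the two nearly identical observed-running timestamps feeds the metric is an implementation detail):

```go
// Sketch: reproduce the podStartSLOduration arithmetic from the log entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matches the timestamps printed by the kubelet (time.Time.String()).
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, err := time.Parse(layout, "2026-01-21 15:06:01 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2026-01-21 15:06:24.572378087 +0000 UTC")
	if err != nil {
		panic(err)
	}

	// No image pull was recorded (zero-value pull timestamps), so the SLO
	// duration is simply observed-running time minus pod creation time.
	fmt.Println(running.Sub(created)) // 23.572378087s
}
```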
object-"openshift-console-operator"/"console-operator-config" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.740987 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.756236 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.789175 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.795981 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.861869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.886538 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 15:06:24 crc kubenswrapper[4707]: I0121 15:06:24.920336 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.131323 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.150917 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.219145 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.239606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.343147 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.372261 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.432336 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.442567 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.458527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.551536 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.653108 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.703401 4707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.741067 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 15:06:25 crc kubenswrapper[4707]: I0121 15:06:25.894790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.191290 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.303971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.367458 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.371249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.447471 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.568790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.568829 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.570888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.583653 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.744915 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.756505 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.794977 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.796626 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 15:06:26 crc kubenswrapper[4707]: I0121 15:06:26.837722 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.032508 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.155028 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.292308 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.345967 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.564133 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.602627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.606072 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.723978 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.809175 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 15:06:27 crc kubenswrapper[4707]: I0121 15:06:27.955324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 15:06:28 crc kubenswrapper[4707]: I0121 15:06:28.663459 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 15:06:28 crc kubenswrapper[4707]: I0121 15:06:28.812241 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 15:06:29 crc kubenswrapper[4707]: I0121 15:06:29.531513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 15:06:30 crc kubenswrapper[4707]: I0121 15:06:30.056111 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 15:06:33 crc kubenswrapper[4707]: I0121 15:06:33.812441 4707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 15:06:33 crc kubenswrapper[4707]: I0121 15:06:33.812777 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://abb4069de2d6843ebe57aa07fd2b2d92c8c766b02f89be31d55b40bf9de4aa4e" gracePeriod=5 Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.131745 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.131979 4707 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="abb4069de2d6843ebe57aa07fd2b2d92c8c766b02f89be31d55b40bf9de4aa4e" exitCode=137 Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.368877 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.368957 
4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397210 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397582 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397603 4707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397613 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.397624 4707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.403832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:06:39 crc kubenswrapper[4707]: I0121 15:06:39.498890 4707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:06:40 crc kubenswrapper[4707]: I0121 15:06:40.136300 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 21 15:06:40 crc kubenswrapper[4707]: I0121 15:06:40.136359 4707 scope.go:117] "RemoveContainer" containerID="abb4069de2d6843ebe57aa07fd2b2d92c8c766b02f89be31d55b40bf9de4aa4e" Jan 21 15:06:40 crc kubenswrapper[4707]: I0121 15:06:40.136394 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 15:06:41 crc kubenswrapper[4707]: I0121 15:06:41.187650 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 21 15:06:43 crc kubenswrapper[4707]: I0121 15:06:43.053963 4707 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.364963 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7sm6h"] Jan 21 15:07:01 crc kubenswrapper[4707]: E0121 15:07:01.365457 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.365468 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 15:07:01 crc kubenswrapper[4707]: E0121 15:07:01.365487 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" containerName="installer" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.365494 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" containerName="installer" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.365569 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="308450d9-b350-4362-9d3b-5e5c3a09ae9a" containerName="installer" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.365581 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.365947 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.376519 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7sm6h"] Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec81333f-d1d6-447b-8578-130f79d58136-trusted-ca\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-registry-tls\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77l7v\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-kube-api-access-77l7v\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-bound-sa-token\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec81333f-d1d6-447b-8578-130f79d58136-registry-certificates\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec81333f-d1d6-447b-8578-130f79d58136-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.534723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/ec81333f-d1d6-447b-8578-130f79d58136-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.603701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec81333f-d1d6-447b-8578-130f79d58136-trusted-ca\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-registry-tls\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77l7v\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-kube-api-access-77l7v\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-bound-sa-token\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec81333f-d1d6-447b-8578-130f79d58136-registry-certificates\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec81333f-d1d6-447b-8578-130f79d58136-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.635750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec81333f-d1d6-447b-8578-130f79d58136-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.636682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec81333f-d1d6-447b-8578-130f79d58136-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.637060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec81333f-d1d6-447b-8578-130f79d58136-registry-certificates\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.637230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec81333f-d1d6-447b-8578-130f79d58136-trusted-ca\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.640282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec81333f-d1d6-447b-8578-130f79d58136-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.640313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-registry-tls\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.650376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-bound-sa-token\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.651671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77l7v\" (UniqueName: \"kubernetes.io/projected/ec81333f-d1d6-447b-8578-130f79d58136-kube-api-access-77l7v\") pod \"image-registry-66df7c8f76-7sm6h\" (UID: \"ec81333f-d1d6-447b-8578-130f79d58136\") " pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:01 crc kubenswrapper[4707]: I0121 15:07:01.678389 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:02 crc kubenswrapper[4707]: I0121 15:07:02.012747 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7sm6h"] Jan 21 15:07:02 crc kubenswrapper[4707]: I0121 15:07:02.225244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" event={"ID":"ec81333f-d1d6-447b-8578-130f79d58136","Type":"ContainerStarted","Data":"c32bb1d4f40d17dfaead079b37b694ec8ea81b9f62a8ca8af1104ef9e1e15f9d"} Jan 21 15:07:02 crc kubenswrapper[4707]: I0121 15:07:02.225483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:02 crc kubenswrapper[4707]: I0121 15:07:02.225495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" event={"ID":"ec81333f-d1d6-447b-8578-130f79d58136","Type":"ContainerStarted","Data":"355009cb2e36be5b55832264ae63d6cb74d282334a298607f2102981a2d3d099"} Jan 21 15:07:02 crc kubenswrapper[4707]: I0121 15:07:02.240326 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" podStartSLOduration=1.240306849 podStartE2EDuration="1.240306849s" podCreationTimestamp="2026-01-21 15:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:07:02.239576279 +0000 UTC m=+319.421092500" watchObservedRunningTime="2026-01-21 15:07:02.240306849 +0000 UTC m=+319.421823071" Jan 21 15:07:21 crc kubenswrapper[4707]: I0121 15:07:21.683087 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7sm6h" Jan 21 15:07:21 crc kubenswrapper[4707]: I0121 15:07:21.715287 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q6l9f"] Jan 21 15:07:39 crc kubenswrapper[4707]: I0121 15:07:39.945316 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:07:39 crc kubenswrapper[4707]: I0121 15:07:39.945657 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:07:46 crc kubenswrapper[4707]: I0121 15:07:46.739132 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" podUID="e72b99f9-e554-43e0-ae76-ee94b2613a88" containerName="registry" containerID="cri-o://94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705" gracePeriod=30 Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.002575 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.140894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.140931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-bound-sa-token\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.140961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-certificates\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.141001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e72b99f9-e554-43e0-ae76-ee94b2613a88-ca-trust-extracted\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.141023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e72b99f9-e554-43e0-ae76-ee94b2613a88-installation-pull-secrets\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.141686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksvwh\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-kube-api-access-ksvwh\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.141950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-trusted-ca\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.142005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-tls\") pod \"e72b99f9-e554-43e0-ae76-ee94b2613a88\" (UID: \"e72b99f9-e554-43e0-ae76-ee94b2613a88\") " Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.142172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.142314 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.142772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.145655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-kube-api-access-ksvwh" (OuterVolumeSpecName: "kube-api-access-ksvwh") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "kube-api-access-ksvwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.145828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e72b99f9-e554-43e0-ae76-ee94b2613a88-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.146581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.147742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.147908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.154294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e72b99f9-e554-43e0-ae76-ee94b2613a88-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e72b99f9-e554-43e0-ae76-ee94b2613a88" (UID: "e72b99f9-e554-43e0-ae76-ee94b2613a88"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.243329 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e72b99f9-e554-43e0-ae76-ee94b2613a88-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.243377 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.243394 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.243405 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e72b99f9-e554-43e0-ae76-ee94b2613a88-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.243423 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e72b99f9-e554-43e0-ae76-ee94b2613a88-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.243433 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksvwh\" (UniqueName: \"kubernetes.io/projected/e72b99f9-e554-43e0-ae76-ee94b2613a88-kube-api-access-ksvwh\") on node \"crc\" DevicePath \"\"" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.399695 4707 generic.go:334] "Generic (PLEG): container finished" podID="e72b99f9-e554-43e0-ae76-ee94b2613a88" containerID="94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705" exitCode=0 Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.399734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" event={"ID":"e72b99f9-e554-43e0-ae76-ee94b2613a88","Type":"ContainerDied","Data":"94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705"} Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.399760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" event={"ID":"e72b99f9-e554-43e0-ae76-ee94b2613a88","Type":"ContainerDied","Data":"e56323bb131fbb166f3a33eb484c57e95faf09f151378cf1266522a809e27e32"} Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.399774 4707 scope.go:117] "RemoveContainer" containerID="94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.399896 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q6l9f" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.413762 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q6l9f"] Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.416553 4707 scope.go:117] "RemoveContainer" containerID="94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.416734 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q6l9f"] Jan 21 15:07:47 crc kubenswrapper[4707]: E0121 15:07:47.416857 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705\": container with ID starting with 94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705 not found: ID does not exist" containerID="94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705" Jan 21 15:07:47 crc kubenswrapper[4707]: I0121 15:07:47.416883 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705"} err="failed to get container status \"94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705\": rpc error: code = NotFound desc = could not find container \"94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705\": container with ID starting with 94e742c04d1f21ee40fed3cafe88ad47110b4f679052130a602b4dcf0a27e705 not found: ID does not exist" Jan 21 15:07:49 crc kubenswrapper[4707]: I0121 15:07:49.187905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e72b99f9-e554-43e0-ae76-ee94b2613a88" path="/var/lib/kubelet/pods/e72b99f9-e554-43e0-ae76-ee94b2613a88/volumes" Jan 21 15:08:09 crc kubenswrapper[4707]: I0121 15:08:09.945910 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:08:09 crc kubenswrapper[4707]: I0121 15:08:09.946637 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:08:39 crc kubenswrapper[4707]: I0121 15:08:39.945259 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:08:39 crc kubenswrapper[4707]: I0121 15:08:39.945640 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:08:39 crc kubenswrapper[4707]: I0121 15:08:39.945681 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:08:39 crc kubenswrapper[4707]: I0121 15:08:39.946190 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf3c9e6b6e14a46494620756c3c0f7583b42676c1dc0d8e1f947083111f31743"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:08:39 crc kubenswrapper[4707]: I0121 15:08:39.946253 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://bf3c9e6b6e14a46494620756c3c0f7583b42676c1dc0d8e1f947083111f31743" gracePeriod=600 Jan 21 15:08:40 crc kubenswrapper[4707]: I0121 15:08:40.611172 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="bf3c9e6b6e14a46494620756c3c0f7583b42676c1dc0d8e1f947083111f31743" exitCode=0 Jan 21 15:08:40 crc kubenswrapper[4707]: I0121 15:08:40.611246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"bf3c9e6b6e14a46494620756c3c0f7583b42676c1dc0d8e1f947083111f31743"} Jan 21 15:08:40 crc kubenswrapper[4707]: I0121 15:08:40.611428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"5f362a525166796306b00e86a9f5d281cb9947b8704d1d3ea656f1842b6d253f"} Jan 21 15:08:40 crc kubenswrapper[4707]: I0121 15:08:40.611448 4707 scope.go:117] "RemoveContainer" containerID="494e3e87c9636ebf10e133479f5ef49c8097d33faecf0271eaa368888f5e1d54" Jan 21 15:08:43 crc kubenswrapper[4707]: I0121 15:08:43.321566 4707 scope.go:117] "RemoveContainer" containerID="3f6c56ee0ff616913bbd62524a38fb2532ce736eb585f45a1761ed0d25e4b363" Jan 21 15:08:43 crc kubenswrapper[4707]: I0121 15:08:43.332746 4707 scope.go:117] "RemoveContainer" containerID="a7934f180911034afcb33476d9c25728d3dbefc52a12d0ae96d86e7fcec6a84e" Jan 21 15:08:43 crc kubenswrapper[4707]: I0121 15:08:43.343094 4707 scope.go:117] "RemoveContainer" containerID="e04351f4111cca003714d139c5a626e5f13cc5247651dd0fe86c6884338169d6" Jan 21 15:08:43 crc kubenswrapper[4707]: I0121 15:08:43.353153 4707 scope.go:117] "RemoveContainer" containerID="b2440c172fead9ac1fdf9d8101cce5018e571d1630cdea44a2556af00f7ea51f" Jan 21 15:08:43 crc kubenswrapper[4707]: I0121 15:08:43.366730 4707 scope.go:117] "RemoveContainer" containerID="9d86d784f7576291336b4eea57159faad778b881c39dfb40b88a3b7b532c6592" Jan 21 15:08:43 crc kubenswrapper[4707]: I0121 15:08:43.376462 4707 scope.go:117] "RemoveContainer" containerID="510b17402b5302be67edeaf109cc1610d196efd32743917340cce16b4faa229d" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.207146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-brcdw"] Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208473 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-controller" 
containerID="cri-o://02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208567 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="northd" containerID="cri-o://69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208535 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208675 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-acl-logging" containerID="cri-o://25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208687 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-node" containerID="cri-o://3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208905 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="sbdb" containerID="cri-o://c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.208946 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="nbdb" containerID="cri-o://56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.233376 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" containerID="cri-o://50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" gracePeriod=30 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.475196 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/3.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.476946 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovn-acl-logging/0.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.477417 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovn-controller/0.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.477820 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518207 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ffhpj"] Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518363 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="northd" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518373 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="northd" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518383 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518394 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e72b99f9-e554-43e0-ae76-ee94b2613a88" containerName="registry" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e72b99f9-e554-43e0-ae76-ee94b2613a88" containerName="registry" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518408 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="nbdb" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="nbdb" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518423 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-node" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518428 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-node" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518442 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518464 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="sbdb" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518469 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="sbdb" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518477 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518482 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kubecfg-setup" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518520 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kubecfg-setup" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518527 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-acl-logging" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518532 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-acl-logging" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518537 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518542 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518616 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518625 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="sbdb" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518633 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518638 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518645 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="nbdb" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518650 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518657 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518664 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518670 
4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518676 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovn-acl-logging" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518682 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e72b99f9-e554-43e0-ae76-ee94b2613a88" containerName="registry" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="kube-rbac-proxy-node" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518697 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="northd" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.518765 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.518772 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerName="ovnkube-controller" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.520023 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-systemd\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-netns\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-config\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-kubelet\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-ovn\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-script-lib\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-env-overrides\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-bin\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-var-lib-openvswitch\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-slash\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-ovn-kubernetes\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-etc-openvswitch\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-node-log\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovn-node-metrics-cert\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw78q\" 
(UniqueName: \"kubernetes.io/projected/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-kube-api-access-kw78q\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-log-socket\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-netd\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-systemd-units\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-openvswitch\") pod \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\" (UID: \"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30\") " Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-env-overrides\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-run-netns\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-slash" (OuterVolumeSpecName: "host-slash") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666117 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.665986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-node-log" (OuterVolumeSpecName: "node-log") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-log-socket" (OuterVolumeSpecName: "log-socket") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-ovn\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-log-socket\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-node-log\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-cni-netd\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-cni-bin\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-var-lib-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-kubelet\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phssw\" (UniqueName: 
\"kubernetes.io/projected/764455cf-2d25-4bae-8373-91782f261dbd-kube-api-access-phssw\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-systemd-units\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-etc-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/764455cf-2d25-4bae-8373-91782f261dbd-ovn-node-metrics-cert\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-ovnkube-config\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-systemd\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-ovnkube-script-lib\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-slash\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.666994 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667008 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667018 4707 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-slash\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667026 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667036 4707 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667043 4707 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-node-log\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667051 4707 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-log-socket\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667060 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667069 4707 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667077 4707 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667087 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667095 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667103 4707 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc 
kubenswrapper[4707]: I0121 15:10:11.667111 4707 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667119 4707 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.667533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.670313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-kube-api-access-kw78q" (OuterVolumeSpecName: "kube-api-access-kw78q") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "kube-api-access-kw78q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.670398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.675936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" (UID: "fa3aa8ac-04fb-4f52-8e61-ddb7676bca30"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-systemd-units\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-etc-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/764455cf-2d25-4bae-8373-91782f261dbd-ovn-node-metrics-cert\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-ovnkube-config\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-systemd\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-systemd-units\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-etc-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-ovnkube-script-lib\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.768928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-systemd\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 
15:10:11.768935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-slash\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-slash\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-env-overrides\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-run-netns\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-ovn\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-ovn\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-log-socket\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-run-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-node-log\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-node-log\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-cni-netd\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-log-socket\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-cni-bin\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-cni-bin\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-var-lib-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ffhpj\" 
(UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-kubelet\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-cni-netd\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phssw\" (UniqueName: \"kubernetes.io/projected/764455cf-2d25-4bae-8373-91782f261dbd-kube-api-access-phssw\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-run-netns\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-var-lib-openvswitch\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-kubelet\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769345 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/764455cf-2d25-4bae-8373-91782f261dbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-env-overrides\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769432 4707 reconciler_common.go:293] 
"Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769444 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw78q\" (UniqueName: \"kubernetes.io/projected/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-kube-api-access-kw78q\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769453 4707 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769462 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-ovnkube-config\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.769673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/764455cf-2d25-4bae-8373-91782f261dbd-ovnkube-script-lib\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.772313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/764455cf-2d25-4bae-8373-91782f261dbd-ovn-node-metrics-cert\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.781543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phssw\" (UniqueName: \"kubernetes.io/projected/764455cf-2d25-4bae-8373-91782f261dbd-kube-api-access-phssw\") pod \"ovnkube-node-ffhpj\" (UID: \"764455cf-2d25-4bae-8373-91782f261dbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.831189 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.957707 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/2.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.958106 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/1.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.958142 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2bdbb11-a196-4dc3-b197-64ef1bec8e8a" containerID="d6408cce6d76083d530e42528e1345d3faf4b05c560998bf0cd690fbd03e6e5a" exitCode=2 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.958201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerDied","Data":"d6408cce6d76083d530e42528e1345d3faf4b05c560998bf0cd690fbd03e6e5a"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.958240 4707 scope.go:117] "RemoveContainer" containerID="99a74fa9e2342f9346d3dc9d77d0eef2d02bd28275f6ca66a29d79bb738e0177" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.958669 4707 scope.go:117] "RemoveContainer" containerID="d6408cce6d76083d530e42528e1345d3faf4b05c560998bf0cd690fbd03e6e5a" Jan 21 15:10:11 crc kubenswrapper[4707]: E0121 15:10:11.958918 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lxkz2_openshift-multus(e2bdbb11-a196-4dc3-b197-64ef1bec8e8a)\"" pod="openshift-multus/multus-lxkz2" podUID="e2bdbb11-a196-4dc3-b197-64ef1bec8e8a" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.960467 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovnkube-controller/3.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.963048 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovn-acl-logging/0.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.963665 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brcdw_fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/ovn-controller/0.log" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964057 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" exitCode=0 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964082 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" exitCode=0 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964089 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" exitCode=0 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964097 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" exitCode=0 Jan 21 
15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964103 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" exitCode=0 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964109 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" exitCode=0 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964115 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" exitCode=143 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964122 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" exitCode=143 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964229 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964892 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964900 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964906 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964910 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964915 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964920 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964925 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964930 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964935 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964940 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964953 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964958 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964962 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964966 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964971 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964975 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964979 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964983 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964988 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964993 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.964998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965004 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965010 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965017 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965021 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965025 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965030 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965034 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965039 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965043 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965047 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brcdw" event={"ID":"fa3aa8ac-04fb-4f52-8e61-ddb7676bca30","Type":"ContainerDied","Data":"30f1e7d7b795e0dd15c982ee504cd337d62f29782263b1e84dfdadfdca67f83e"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965060 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965064 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965068 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965073 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965077 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965081 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965086 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965090 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965094 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.965099 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.966234 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="764455cf-2d25-4bae-8373-91782f261dbd" containerID="d40d1b5ea0f44ee3c3fc01230909056c9a4fd459acc1984fac3e302ed3fc9c0b" exitCode=0 Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.966261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerDied","Data":"d40d1b5ea0f44ee3c3fc01230909056c9a4fd459acc1984fac3e302ed3fc9c0b"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.966275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"bfb1328d7d94687585f7669aa505018c72683f2067416920897dae1a270fe7b8"} Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.978257 4707 scope.go:117] "RemoveContainer" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" Jan 21 15:10:11 crc kubenswrapper[4707]: I0121 15:10:11.989696 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.007963 4707 scope.go:117] "RemoveContainer" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.014795 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-brcdw"] Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.017602 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-brcdw"] Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.023720 4707 scope.go:117] "RemoveContainer" containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.037180 4707 scope.go:117] "RemoveContainer" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.049671 4707 scope.go:117] "RemoveContainer" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.078863 4707 scope.go:117] "RemoveContainer" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.099746 4707 scope.go:117] "RemoveContainer" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.108719 4707 scope.go:117] "RemoveContainer" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.119819 4707 scope.go:117] "RemoveContainer" containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.135826 4707 scope.go:117] "RemoveContainer" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.137008 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": container with ID starting with 50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd not found: ID does not exist" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" Jan 21 
15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.137053 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} err="failed to get container status \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": rpc error: code = NotFound desc = could not find container \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": container with ID starting with 50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.137073 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.137562 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": container with ID starting with 5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72 not found: ID does not exist" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.137593 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} err="failed to get container status \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": rpc error: code = NotFound desc = could not find container \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": container with ID starting with 5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.137606 4707 scope.go:117] "RemoveContainer" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.137934 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": container with ID starting with c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8 not found: ID does not exist" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.137957 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} err="failed to get container status \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": rpc error: code = NotFound desc = could not find container \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": container with ID starting with c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.137970 4707 scope.go:117] "RemoveContainer" containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.138841 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": container with ID starting with 
56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e not found: ID does not exist" containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.138862 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} err="failed to get container status \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": rpc error: code = NotFound desc = could not find container \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": container with ID starting with 56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.138910 4707 scope.go:117] "RemoveContainer" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.139153 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": container with ID starting with 69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3 not found: ID does not exist" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139169 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} err="failed to get container status \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": rpc error: code = NotFound desc = could not find container \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": container with ID starting with 69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139183 4707 scope.go:117] "RemoveContainer" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.139418 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": container with ID starting with 55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503 not found: ID does not exist" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139438 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} err="failed to get container status \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": rpc error: code = NotFound desc = could not find container \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": container with ID starting with 55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139450 4707 scope.go:117] "RemoveContainer" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.139632 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": container with ID starting with 3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8 not found: ID does not exist" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139649 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} err="failed to get container status \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": rpc error: code = NotFound desc = could not find container \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": container with ID starting with 3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139661 4707 scope.go:117] "RemoveContainer" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.139846 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": container with ID starting with 25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82 not found: ID does not exist" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139875 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} err="failed to get container status \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": rpc error: code = NotFound desc = could not find container \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": container with ID starting with 25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.139887 4707 scope.go:117] "RemoveContainer" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.140092 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": container with ID starting with 02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12 not found: ID does not exist" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140110 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} err="failed to get container status \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": rpc error: code = NotFound desc = could not find container \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": container with ID starting with 02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140125 4707 scope.go:117] "RemoveContainer" 
containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" Jan 21 15:10:12 crc kubenswrapper[4707]: E0121 15:10:12.140286 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": container with ID starting with 5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181 not found: ID does not exist" containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140305 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} err="failed to get container status \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": rpc error: code = NotFound desc = could not find container \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": container with ID starting with 5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140316 4707 scope.go:117] "RemoveContainer" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140471 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} err="failed to get container status \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": rpc error: code = NotFound desc = could not find container \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": container with ID starting with 50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140488 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140640 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} err="failed to get container status \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": rpc error: code = NotFound desc = could not find container \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": container with ID starting with 5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140657 4707 scope.go:117] "RemoveContainer" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140863 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} err="failed to get container status \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": rpc error: code = NotFound desc = could not find container \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": container with ID starting with c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.140881 4707 scope.go:117] "RemoveContainer" 
containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141042 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} err="failed to get container status \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": rpc error: code = NotFound desc = could not find container \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": container with ID starting with 56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141058 4707 scope.go:117] "RemoveContainer" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141203 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} err="failed to get container status \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": rpc error: code = NotFound desc = could not find container \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": container with ID starting with 69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141221 4707 scope.go:117] "RemoveContainer" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} err="failed to get container status \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": rpc error: code = NotFound desc = could not find container \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": container with ID starting with 55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141405 4707 scope.go:117] "RemoveContainer" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141556 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} err="failed to get container status \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": rpc error: code = NotFound desc = could not find container \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": container with ID starting with 3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141574 4707 scope.go:117] "RemoveContainer" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141715 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} err="failed to get container status \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": rpc error: code = NotFound desc = could not find 
container \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": container with ID starting with 25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.141734 4707 scope.go:117] "RemoveContainer" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142022 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} err="failed to get container status \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": rpc error: code = NotFound desc = could not find container \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": container with ID starting with 02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142040 4707 scope.go:117] "RemoveContainer" containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142309 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} err="failed to get container status \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": rpc error: code = NotFound desc = could not find container \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": container with ID starting with 5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142329 4707 scope.go:117] "RemoveContainer" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142515 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} err="failed to get container status \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": rpc error: code = NotFound desc = could not find container \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": container with ID starting with 50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142528 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142696 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} err="failed to get container status \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": rpc error: code = NotFound desc = could not find container \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": container with ID starting with 5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142708 4707 scope.go:117] "RemoveContainer" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142866 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} err="failed to get container status \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": rpc error: code = NotFound desc = could not find container \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": container with ID starting with c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.142880 4707 scope.go:117] "RemoveContainer" containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143040 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} err="failed to get container status \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": rpc error: code = NotFound desc = could not find container \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": container with ID starting with 56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143071 4707 scope.go:117] "RemoveContainer" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143210 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} err="failed to get container status \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": rpc error: code = NotFound desc = could not find container \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": container with ID starting with 69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143244 4707 scope.go:117] "RemoveContainer" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143397 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} err="failed to get container status \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": rpc error: code = NotFound desc = could not find container \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": container with ID starting with 55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143414 4707 scope.go:117] "RemoveContainer" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143584 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} err="failed to get container status \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": rpc error: code = NotFound desc = could not find container \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": container with ID starting with 
3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143601 4707 scope.go:117] "RemoveContainer" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143755 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} err="failed to get container status \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": rpc error: code = NotFound desc = could not find container \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": container with ID starting with 25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.143767 4707 scope.go:117] "RemoveContainer" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} err="failed to get container status \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": rpc error: code = NotFound desc = could not find container \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": container with ID starting with 02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144117 4707 scope.go:117] "RemoveContainer" containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144279 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} err="failed to get container status \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": rpc error: code = NotFound desc = could not find container \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": container with ID starting with 5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144317 4707 scope.go:117] "RemoveContainer" containerID="50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144459 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd"} err="failed to get container status \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": rpc error: code = NotFound desc = could not find container \"50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd\": container with ID starting with 50d0a1b88e1e8c94445ed21d45692a0af3359d0fc91d835be3cb48c20d9d56bd not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144475 4707 scope.go:117] "RemoveContainer" containerID="5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144669 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72"} err="failed to get container status \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": rpc error: code = NotFound desc = could not find container \"5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72\": container with ID starting with 5a52b68f8d57a66725cbb084eb6d7a9ef2a9d381e59c62806691627e21346a72 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144686 4707 scope.go:117] "RemoveContainer" containerID="c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144849 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8"} err="failed to get container status \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": rpc error: code = NotFound desc = could not find container \"c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8\": container with ID starting with c7e0abb4eb512b4cbe448bd82710212ab26397eeadfb5a513d048fb4c70822f8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.144865 4707 scope.go:117] "RemoveContainer" containerID="56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145028 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e"} err="failed to get container status \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": rpc error: code = NotFound desc = could not find container \"56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e\": container with ID starting with 56ef5dfbc38ce27e53baa3b1ed4c6c8d6e95a3879b8c42cbcb95a05ccc30c82e not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145044 4707 scope.go:117] "RemoveContainer" containerID="69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145201 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3"} err="failed to get container status \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": rpc error: code = NotFound desc = could not find container \"69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3\": container with ID starting with 69f6e433f7747064b11f18407020a4405fe55e90716653d5f093ad2fd288a5a3 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145236 4707 scope.go:117] "RemoveContainer" containerID="55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145379 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503"} err="failed to get container status \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": rpc error: code = NotFound desc = could not find container \"55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503\": container with ID starting with 55e6970940bd17c5b6de89f30565d935626fc363020e2c24239db74fc43d7503 not found: ID does not exist" Jan 
21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145396 4707 scope.go:117] "RemoveContainer" containerID="3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145545 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8"} err="failed to get container status \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": rpc error: code = NotFound desc = could not find container \"3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8\": container with ID starting with 3f6ca1f3bd0bf5a5240a5ddf34149836b3e946f1fa12988d4623f9568bd5a9e8 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145562 4707 scope.go:117] "RemoveContainer" containerID="25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145700 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82"} err="failed to get container status \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": rpc error: code = NotFound desc = could not find container \"25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82\": container with ID starting with 25401fbdd34746e9cfce061de544cef3f7e896254c7ca1082150a79c2f73cc82 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145716 4707 scope.go:117] "RemoveContainer" containerID="02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145877 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12"} err="failed to get container status \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": rpc error: code = NotFound desc = could not find container \"02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12\": container with ID starting with 02c39ca1b790153d524366b5cf56f52505f87b8c1365ba15108243b5d00edc12 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.145893 4707 scope.go:117] "RemoveContainer" containerID="5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.146028 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181"} err="failed to get container status \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": rpc error: code = NotFound desc = could not find container \"5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181\": container with ID starting with 5d6a02e558b9ec15f97c41cb091c76a7c6fff16735a35eb72b9d3e0247882181 not found: ID does not exist" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.972098 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/2.log" Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.979553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" 
event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"f756b2430a2c5b2670d541fd12b6ecaac0677cd65e473f08eb718c8ec6ff95bf"} Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.979603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"e218ccca28b99aaf6c41b4c3a177b423c02288a15ccbdfad1a9130e795e5788f"} Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.979615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"b94341d9d4d92c42e56dadf0de65bc3a51eabc232d8098f0740cc8e506363f9d"} Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.979625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"91685bbc0efd45896018fa1d69436610c81a989c124a0c1ca75a0f7eaaecd382"} Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.979634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"8071a68e96cf814f1ad8a4e4ef58ac634196f02538fe4e72ee0059e0848a8c10"} Jan 21 15:10:12 crc kubenswrapper[4707]: I0121 15:10:12.979658 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"6a0ab88e4a16e6ecddcda83be6dd5bc002a14b2d8723fe3ea64a826525969ffa"} Jan 21 15:10:13 crc kubenswrapper[4707]: I0121 15:10:13.187526 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3aa8ac-04fb-4f52-8e61-ddb7676bca30" path="/var/lib/kubelet/pods/fa3aa8ac-04fb-4f52-8e61-ddb7676bca30/volumes" Jan 21 15:10:14 crc kubenswrapper[4707]: I0121 15:10:14.991516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"6b4ac62017b268816ef56af6284e7ff7cb02550197245187fe839688fd3eb65f"} Jan 21 15:10:17 crc kubenswrapper[4707]: I0121 15:10:17.007066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" event={"ID":"764455cf-2d25-4bae-8373-91782f261dbd","Type":"ContainerStarted","Data":"75f41d7965be12d1a441463735e8399b6593d0cc059b16c6cb1228a19f826ed7"} Jan 21 15:10:17 crc kubenswrapper[4707]: I0121 15:10:17.007361 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:17 crc kubenswrapper[4707]: I0121 15:10:17.007392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:17 crc kubenswrapper[4707]: I0121 15:10:17.026885 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" podStartSLOduration=6.026871322 podStartE2EDuration="6.026871322s" podCreationTimestamp="2026-01-21 15:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:10:17.024584045 +0000 UTC m=+514.206100268" watchObservedRunningTime="2026-01-21 15:10:17.026871322 
+0000 UTC m=+514.208387544" Jan 21 15:10:17 crc kubenswrapper[4707]: I0121 15:10:17.029214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:18 crc kubenswrapper[4707]: I0121 15:10:18.011561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:18 crc kubenswrapper[4707]: I0121 15:10:18.031269 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:23 crc kubenswrapper[4707]: I0121 15:10:23.184376 4707 scope.go:117] "RemoveContainer" containerID="d6408cce6d76083d530e42528e1345d3faf4b05c560998bf0cd690fbd03e6e5a" Jan 21 15:10:23 crc kubenswrapper[4707]: E0121 15:10:23.184704 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lxkz2_openshift-multus(e2bdbb11-a196-4dc3-b197-64ef1bec8e8a)\"" pod="openshift-multus/multus-lxkz2" podUID="e2bdbb11-a196-4dc3-b197-64ef1bec8e8a" Jan 21 15:10:36 crc kubenswrapper[4707]: I0121 15:10:36.182408 4707 scope.go:117] "RemoveContainer" containerID="d6408cce6d76083d530e42528e1345d3faf4b05c560998bf0cd690fbd03e6e5a" Jan 21 15:10:37 crc kubenswrapper[4707]: I0121 15:10:37.079122 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/2.log" Jan 21 15:10:37 crc kubenswrapper[4707]: I0121 15:10:37.079361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lxkz2" event={"ID":"e2bdbb11-a196-4dc3-b197-64ef1bec8e8a","Type":"ContainerStarted","Data":"33e878496cb74a449af5f39ebf67c6692e3df76d5d87502554ef48ce7d068d2e"} Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.605867 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj"] Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.607789 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.609468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.613754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj"] Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.652031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzpgs\" (UniqueName: \"kubernetes.io/projected/4c693f1a-3959-41d4-8209-ff55cf48b2ea-kube-api-access-dzpgs\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.652259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.652358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.752700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzpgs\" (UniqueName: \"kubernetes.io/projected/4c693f1a-3959-41d4-8209-ff55cf48b2ea-kube-api-access-dzpgs\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.752884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.752994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.753253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.753290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.767535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzpgs\" (UniqueName: \"kubernetes.io/projected/4c693f1a-3959-41d4-8209-ff55cf48b2ea-kube-api-access-dzpgs\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:38 crc kubenswrapper[4707]: I0121 15:10:38.919424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:39 crc kubenswrapper[4707]: I0121 15:10:39.247966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj"] Jan 21 15:10:39 crc kubenswrapper[4707]: W0121 15:10:39.251433 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c693f1a_3959_41d4_8209_ff55cf48b2ea.slice/crio-a71cb56fc0c4fe75889a5eb237a08e0d572a34131e3f5a7cdc1849abd98b9040 WatchSource:0}: Error finding container a71cb56fc0c4fe75889a5eb237a08e0d572a34131e3f5a7cdc1849abd98b9040: Status 404 returned error can't find the container with id a71cb56fc0c4fe75889a5eb237a08e0d572a34131e3f5a7cdc1849abd98b9040 Jan 21 15:10:40 crc kubenswrapper[4707]: I0121 15:10:40.090387 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerID="93f1aa03d5db66de472dfcd2a5338cfd2d5a21aae89a96e4297e02fbc3353738" exitCode=0 Jan 21 15:10:40 crc kubenswrapper[4707]: I0121 15:10:40.090480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" event={"ID":"4c693f1a-3959-41d4-8209-ff55cf48b2ea","Type":"ContainerDied","Data":"93f1aa03d5db66de472dfcd2a5338cfd2d5a21aae89a96e4297e02fbc3353738"} Jan 21 15:10:40 crc kubenswrapper[4707]: I0121 15:10:40.091069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" event={"ID":"4c693f1a-3959-41d4-8209-ff55cf48b2ea","Type":"ContainerStarted","Data":"a71cb56fc0c4fe75889a5eb237a08e0d572a34131e3f5a7cdc1849abd98b9040"} Jan 21 15:10:40 crc kubenswrapper[4707]: I0121 15:10:40.091789 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:10:41 crc kubenswrapper[4707]: I0121 15:10:41.846092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ffhpj" Jan 21 15:10:42 crc 
kubenswrapper[4707]: I0121 15:10:42.100600 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerID="7ce5b9a17dea112009953d43eb9de41fa243729ae42c581238da2a11d0c03bb2" exitCode=0 Jan 21 15:10:42 crc kubenswrapper[4707]: I0121 15:10:42.100638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" event={"ID":"4c693f1a-3959-41d4-8209-ff55cf48b2ea","Type":"ContainerDied","Data":"7ce5b9a17dea112009953d43eb9de41fa243729ae42c581238da2a11d0c03bb2"} Jan 21 15:10:43 crc kubenswrapper[4707]: I0121 15:10:43.105800 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerID="6ba165c5b0c7db7df79c638576a2cf64a512132fdbbe311faddb94be5b8369cf" exitCode=0 Jan 21 15:10:43 crc kubenswrapper[4707]: I0121 15:10:43.105872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" event={"ID":"4c693f1a-3959-41d4-8209-ff55cf48b2ea","Type":"ContainerDied","Data":"6ba165c5b0c7db7df79c638576a2cf64a512132fdbbe311faddb94be5b8369cf"} Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.281259 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.414209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzpgs\" (UniqueName: \"kubernetes.io/projected/4c693f1a-3959-41d4-8209-ff55cf48b2ea-kube-api-access-dzpgs\") pod \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.414301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-bundle\") pod \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.414361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-util\") pod \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\" (UID: \"4c693f1a-3959-41d4-8209-ff55cf48b2ea\") " Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.415018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-bundle" (OuterVolumeSpecName: "bundle") pod "4c693f1a-3959-41d4-8209-ff55cf48b2ea" (UID: "4c693f1a-3959-41d4-8209-ff55cf48b2ea"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.418113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c693f1a-3959-41d4-8209-ff55cf48b2ea-kube-api-access-dzpgs" (OuterVolumeSpecName: "kube-api-access-dzpgs") pod "4c693f1a-3959-41d4-8209-ff55cf48b2ea" (UID: "4c693f1a-3959-41d4-8209-ff55cf48b2ea"). InnerVolumeSpecName "kube-api-access-dzpgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.423942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-util" (OuterVolumeSpecName: "util") pod "4c693f1a-3959-41d4-8209-ff55cf48b2ea" (UID: "4c693f1a-3959-41d4-8209-ff55cf48b2ea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.515784 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.515825 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c693f1a-3959-41d4-8209-ff55cf48b2ea-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:44 crc kubenswrapper[4707]: I0121 15:10:44.515835 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzpgs\" (UniqueName: \"kubernetes.io/projected/4c693f1a-3959-41d4-8209-ff55cf48b2ea-kube-api-access-dzpgs\") on node \"crc\" DevicePath \"\"" Jan 21 15:10:45 crc kubenswrapper[4707]: I0121 15:10:45.114550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" event={"ID":"4c693f1a-3959-41d4-8209-ff55cf48b2ea","Type":"ContainerDied","Data":"a71cb56fc0c4fe75889a5eb237a08e0d572a34131e3f5a7cdc1849abd98b9040"} Jan 21 15:10:45 crc kubenswrapper[4707]: I0121 15:10:45.114599 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj" Jan 21 15:10:45 crc kubenswrapper[4707]: I0121 15:10:45.114601 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a71cb56fc0c4fe75889a5eb237a08e0d572a34131e3f5a7cdc1849abd98b9040" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.653071 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-kvjjm"] Jan 21 15:10:46 crc kubenswrapper[4707]: E0121 15:10:46.653424 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerName="pull" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.653435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerName="pull" Jan 21 15:10:46 crc kubenswrapper[4707]: E0121 15:10:46.653453 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerName="util" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.653459 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerName="util" Jan 21 15:10:46 crc kubenswrapper[4707]: E0121 15:10:46.653467 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerName="extract" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.653472 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" containerName="extract" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.653572 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c693f1a-3959-41d4-8209-ff55cf48b2ea" 
containerName="extract" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.653910 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.655218 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.657684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gbbxg" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.658323 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.661003 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-kvjjm"] Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.840999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmmm\" (UniqueName: \"kubernetes.io/projected/c7e81409-5fc8-4b0d-9c98-dfaf6167892d-kube-api-access-bzmmm\") pod \"nmstate-operator-646758c888-kvjjm\" (UID: \"c7e81409-5fc8-4b0d-9c98-dfaf6167892d\") " pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.942249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmmm\" (UniqueName: \"kubernetes.io/projected/c7e81409-5fc8-4b0d-9c98-dfaf6167892d-kube-api-access-bzmmm\") pod \"nmstate-operator-646758c888-kvjjm\" (UID: \"c7e81409-5fc8-4b0d-9c98-dfaf6167892d\") " pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.956723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmmm\" (UniqueName: \"kubernetes.io/projected/c7e81409-5fc8-4b0d-9c98-dfaf6167892d-kube-api-access-bzmmm\") pod \"nmstate-operator-646758c888-kvjjm\" (UID: \"c7e81409-5fc8-4b0d-9c98-dfaf6167892d\") " pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" Jan 21 15:10:46 crc kubenswrapper[4707]: I0121 15:10:46.965367 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" Jan 21 15:10:47 crc kubenswrapper[4707]: I0121 15:10:47.299088 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-kvjjm"] Jan 21 15:10:47 crc kubenswrapper[4707]: W0121 15:10:47.307285 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e81409_5fc8_4b0d_9c98_dfaf6167892d.slice/crio-80879ea6bb8370d2fd2144099077d2b512d3525b1bcf3621eedccf338713015f WatchSource:0}: Error finding container 80879ea6bb8370d2fd2144099077d2b512d3525b1bcf3621eedccf338713015f: Status 404 returned error can't find the container with id 80879ea6bb8370d2fd2144099077d2b512d3525b1bcf3621eedccf338713015f Jan 21 15:10:48 crc kubenswrapper[4707]: I0121 15:10:48.126960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" event={"ID":"c7e81409-5fc8-4b0d-9c98-dfaf6167892d","Type":"ContainerStarted","Data":"80879ea6bb8370d2fd2144099077d2b512d3525b1bcf3621eedccf338713015f"} Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.135131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" event={"ID":"c7e81409-5fc8-4b0d-9c98-dfaf6167892d","Type":"ContainerStarted","Data":"6957d42808af4cb2fec7b7dea988184d2cdf789469510227cb25f733288e72db"} Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.148366 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-kvjjm" podStartSLOduration=2.157652361 podStartE2EDuration="4.148353311s" podCreationTimestamp="2026-01-21 15:10:46 +0000 UTC" firstStartedPulling="2026-01-21 15:10:47.308987012 +0000 UTC m=+544.490503235" lastFinishedPulling="2026-01-21 15:10:49.299687963 +0000 UTC m=+546.481204185" observedRunningTime="2026-01-21 15:10:50.146452324 +0000 UTC m=+547.327968546" watchObservedRunningTime="2026-01-21 15:10:50.148353311 +0000 UTC m=+547.329869533" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.878882 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-h2gvw"] Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.879879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.881754 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wk7pv" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.887053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpqp\" (UniqueName: \"kubernetes.io/projected/591eebf5-40db-445a-a3ab-23a5f88f501c-kube-api-access-nnpqp\") pod \"nmstate-metrics-54757c584b-h2gvw\" (UID: \"591eebf5-40db-445a-a3ab-23a5f88f501c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.890450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9"] Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.891011 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.892231 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.892918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-h2gvw"] Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.907069 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9"] Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.909556 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-m4cqg"] Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.910139 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.982830 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx"] Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.983353 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.984964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ghckf" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.986320 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.988334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpqp\" (UniqueName: \"kubernetes.io/projected/591eebf5-40db-445a-a3ab-23a5f88f501c-kube-api-access-nnpqp\") pod \"nmstate-metrics-54757c584b-h2gvw\" (UID: \"591eebf5-40db-445a-a3ab-23a5f88f501c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" Jan 21 15:10:50 crc kubenswrapper[4707]: I0121 15:10:50.988974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.002178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx"] Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.009004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpqp\" (UniqueName: \"kubernetes.io/projected/591eebf5-40db-445a-a3ab-23a5f88f501c-kube-api-access-nnpqp\") pod \"nmstate-metrics-54757c584b-h2gvw\" (UID: \"591eebf5-40db-445a-a3ab-23a5f88f501c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.089933 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-ovs-socket\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.089974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bf6f36b-51bd-4196-b732-7807475b38b2-nginx-conf\") pod 
\"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.089999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-dbus-socket\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.090014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbwzx\" (UniqueName: \"kubernetes.io/projected/3dc50ee6-0d24-4790-a13d-caaf61114685-kube-api-access-kbwzx\") pod \"nmstate-webhook-8474b5b9d8-64vq9\" (UID: \"3dc50ee6-0d24-4790-a13d-caaf61114685\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.090032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tptkz\" (UniqueName: \"kubernetes.io/projected/7bf6f36b-51bd-4196-b732-7807475b38b2-kube-api-access-tptkz\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.090065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-nmstate-lock\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.090080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzh9v\" (UniqueName: \"kubernetes.io/projected/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-kube-api-access-nzh9v\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.090108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bf6f36b-51bd-4196-b732-7807475b38b2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.090122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3dc50ee6-0d24-4790-a13d-caaf61114685-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-64vq9\" (UID: \"3dc50ee6-0d24-4790-a13d-caaf61114685\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.154875 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-56fdbdc57d-q67pn"] Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.155422 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.162706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56fdbdc57d-q67pn"] Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-nmstate-lock\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzh9v\" (UniqueName: \"kubernetes.io/projected/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-kube-api-access-nzh9v\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bf6f36b-51bd-4196-b732-7807475b38b2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3dc50ee6-0d24-4790-a13d-caaf61114685-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-64vq9\" (UID: \"3dc50ee6-0d24-4790-a13d-caaf61114685\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-ovs-socket\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bf6f36b-51bd-4196-b732-7807475b38b2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-dbus-socket\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbwzx\" (UniqueName: \"kubernetes.io/projected/3dc50ee6-0d24-4790-a13d-caaf61114685-kube-api-access-kbwzx\") pod \"nmstate-webhook-8474b5b9d8-64vq9\" (UID: \"3dc50ee6-0d24-4790-a13d-caaf61114685\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191327 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tptkz\" (UniqueName: \"kubernetes.io/projected/7bf6f36b-51bd-4196-b732-7807475b38b2-kube-api-access-tptkz\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.191560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-nmstate-lock\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.192618 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.192709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-dbus-socket\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.192866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-ovs-socket\") pod \"nmstate-handler-m4cqg\" (UID: \"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.193066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bf6f36b-51bd-4196-b732-7807475b38b2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.202168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bf6f36b-51bd-4196-b732-7807475b38b2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.204353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3dc50ee6-0d24-4790-a13d-caaf61114685-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-64vq9\" (UID: \"3dc50ee6-0d24-4790-a13d-caaf61114685\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.204845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tptkz\" (UniqueName: \"kubernetes.io/projected/7bf6f36b-51bd-4196-b732-7807475b38b2-kube-api-access-tptkz\") pod \"nmstate-console-plugin-7754f76f8b-g8mcx\" (UID: \"7bf6f36b-51bd-4196-b732-7807475b38b2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.205946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzh9v\" (UniqueName: \"kubernetes.io/projected/72f87fd4-c9ff-4410-ba80-fdc2525a37b8-kube-api-access-nzh9v\") pod \"nmstate-handler-m4cqg\" (UID: 
\"72f87fd4-c9ff-4410-ba80-fdc2525a37b8\") " pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.208173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbwzx\" (UniqueName: \"kubernetes.io/projected/3dc50ee6-0d24-4790-a13d-caaf61114685-kube-api-access-kbwzx\") pod \"nmstate-webhook-8474b5b9d8-64vq9\" (UID: \"3dc50ee6-0d24-4790-a13d-caaf61114685\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.222938 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:51 crc kubenswrapper[4707]: W0121 15:10:51.235958 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f87fd4_c9ff_4410_ba80_fdc2525a37b8.slice/crio-b55c984fec065ecf05bd34288a055f986e098fc676b4bb69ad2009015920488a WatchSource:0}: Error finding container b55c984fec065ecf05bd34288a055f986e098fc676b4bb69ad2009015920488a: Status 404 returned error can't find the container with id b55c984fec065ecf05bd34288a055f986e098fc676b4bb69ad2009015920488a Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.292843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-trusted-ca-bundle\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.293078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-oauth-serving-cert\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.293110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6jq\" (UniqueName: \"kubernetes.io/projected/d617195d-65f3-4215-a296-7db0f1ae2969-kube-api-access-2d6jq\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.293137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-service-ca\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.293197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-console-config\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.293217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d617195d-65f3-4215-a296-7db0f1ae2969-console-serving-cert\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.293255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d617195d-65f3-4215-a296-7db0f1ae2969-console-oauth-config\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.295725 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-console-config\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d617195d-65f3-4215-a296-7db0f1ae2969-console-serving-cert\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d617195d-65f3-4215-a296-7db0f1ae2969-console-oauth-config\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-trusted-ca-bundle\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-oauth-serving-cert\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6jq\" (UniqueName: \"kubernetes.io/projected/d617195d-65f3-4215-a296-7db0f1ae2969-kube-api-access-2d6jq\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.394838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-service-ca\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " 
pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.395876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-service-ca\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.395894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-console-config\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.396131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-oauth-serving-cert\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.397218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d617195d-65f3-4215-a296-7db0f1ae2969-trusted-ca-bundle\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.398339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d617195d-65f3-4215-a296-7db0f1ae2969-console-oauth-config\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.398599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d617195d-65f3-4215-a296-7db0f1ae2969-console-serving-cert\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.408079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6jq\" (UniqueName: \"kubernetes.io/projected/d617195d-65f3-4215-a296-7db0f1ae2969-kube-api-access-2d6jq\") pod \"console-56fdbdc57d-q67pn\" (UID: \"d617195d-65f3-4215-a296-7db0f1ae2969\") " pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.467029 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.501284 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.526074 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-h2gvw"] Jan 21 15:10:51 crc kubenswrapper[4707]: W0121 15:10:51.528270 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591eebf5_40db_445a_a3ab_23a5f88f501c.slice/crio-7cb25f684cb3cdccc2b3b5ded1497c0bb14ee66c7996aa7df6263a55d8d9874c WatchSource:0}: Error finding container 7cb25f684cb3cdccc2b3b5ded1497c0bb14ee66c7996aa7df6263a55d8d9874c: Status 404 returned error can't find the container with id 7cb25f684cb3cdccc2b3b5ded1497c0bb14ee66c7996aa7df6263a55d8d9874c Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.598111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56fdbdc57d-q67pn"] Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.622017 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx"] Jan 21 15:10:51 crc kubenswrapper[4707]: I0121 15:10:51.646279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9"] Jan 21 15:10:51 crc kubenswrapper[4707]: W0121 15:10:51.664432 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc50ee6_0d24_4790_a13d_caaf61114685.slice/crio-070cc490a0d028d129b0dcc466f28a0d56a2e8db8dd59f96d0c2fde200371587 WatchSource:0}: Error finding container 070cc490a0d028d129b0dcc466f28a0d56a2e8db8dd59f96d0c2fde200371587: Status 404 returned error can't find the container with id 070cc490a0d028d129b0dcc466f28a0d56a2e8db8dd59f96d0c2fde200371587 Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.144398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" event={"ID":"3dc50ee6-0d24-4790-a13d-caaf61114685","Type":"ContainerStarted","Data":"070cc490a0d028d129b0dcc466f28a0d56a2e8db8dd59f96d0c2fde200371587"} Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.145748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56fdbdc57d-q67pn" event={"ID":"d617195d-65f3-4215-a296-7db0f1ae2969","Type":"ContainerStarted","Data":"e536c583306b84a11d0895eea6fc0e1268847cd27b0d8df5f8ad363b2085e190"} Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.145780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56fdbdc57d-q67pn" event={"ID":"d617195d-65f3-4215-a296-7db0f1ae2969","Type":"ContainerStarted","Data":"0ed86e114eae57d89f689ae46b9fbd27cd5e17cfe73c23ad188028f873af41af"} Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.146836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m4cqg" event={"ID":"72f87fd4-c9ff-4410-ba80-fdc2525a37b8","Type":"ContainerStarted","Data":"b55c984fec065ecf05bd34288a055f986e098fc676b4bb69ad2009015920488a"} Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.147617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" event={"ID":"591eebf5-40db-445a-a3ab-23a5f88f501c","Type":"ContainerStarted","Data":"7cb25f684cb3cdccc2b3b5ded1497c0bb14ee66c7996aa7df6263a55d8d9874c"} Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.149072 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" event={"ID":"7bf6f36b-51bd-4196-b732-7807475b38b2","Type":"ContainerStarted","Data":"fd55c757e1887facb49cc417203d577cdfdc45e933df245ab8b5abc6486dbdce"} Jan 21 15:10:52 crc kubenswrapper[4707]: I0121 15:10:52.159796 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56fdbdc57d-q67pn" podStartSLOduration=1.159779069 podStartE2EDuration="1.159779069s" podCreationTimestamp="2026-01-21 15:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:10:52.15614492 +0000 UTC m=+549.337661142" watchObservedRunningTime="2026-01-21 15:10:52.159779069 +0000 UTC m=+549.341295291" Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.161486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" event={"ID":"3dc50ee6-0d24-4790-a13d-caaf61114685","Type":"ContainerStarted","Data":"0dc0b2eb07715ef39d82cc0e5aeb7d35bfeba121431d3c1b6c3731f8a286f6b6"} Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.161737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.163138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m4cqg" event={"ID":"72f87fd4-c9ff-4410-ba80-fdc2525a37b8","Type":"ContainerStarted","Data":"8639dcde7d0b155d70ceedd418b39c966e3f78125709b39183b0b6d2efc2a7ac"} Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.163263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.164316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" event={"ID":"591eebf5-40db-445a-a3ab-23a5f88f501c","Type":"ContainerStarted","Data":"abefdeb50539b9ca8a900338797ce2cb53369ab4546271c76fd80521f9a53b54"} Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.165414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" event={"ID":"7bf6f36b-51bd-4196-b732-7807475b38b2","Type":"ContainerStarted","Data":"c0f87118876b6e402a165ff024db1881dd04a3ffa9413bf6b8fad02840db13f8"} Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.173850 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" podStartSLOduration=2.061127986 podStartE2EDuration="4.173839683s" podCreationTimestamp="2026-01-21 15:10:50 +0000 UTC" firstStartedPulling="2026-01-21 15:10:51.66642155 +0000 UTC m=+548.847937772" lastFinishedPulling="2026-01-21 15:10:53.779133246 +0000 UTC m=+550.960649469" observedRunningTime="2026-01-21 15:10:54.171914441 +0000 UTC m=+551.353430662" watchObservedRunningTime="2026-01-21 15:10:54.173839683 +0000 UTC m=+551.355355905" Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.183196 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-m4cqg" podStartSLOduration=1.642262971 podStartE2EDuration="4.183184197s" podCreationTimestamp="2026-01-21 15:10:50 +0000 UTC" firstStartedPulling="2026-01-21 15:10:51.237481035 +0000 UTC m=+548.418997257" lastFinishedPulling="2026-01-21 
15:10:53.778402261 +0000 UTC m=+550.959918483" observedRunningTime="2026-01-21 15:10:54.18243665 +0000 UTC m=+551.363952873" watchObservedRunningTime="2026-01-21 15:10:54.183184197 +0000 UTC m=+551.364700419" Jan 21 15:10:54 crc kubenswrapper[4707]: I0121 15:10:54.195321 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-g8mcx" podStartSLOduration=2.03910705 podStartE2EDuration="4.19530758s" podCreationTimestamp="2026-01-21 15:10:50 +0000 UTC" firstStartedPulling="2026-01-21 15:10:51.630286556 +0000 UTC m=+548.811802777" lastFinishedPulling="2026-01-21 15:10:53.786487084 +0000 UTC m=+550.968003307" observedRunningTime="2026-01-21 15:10:54.193656593 +0000 UTC m=+551.375172815" watchObservedRunningTime="2026-01-21 15:10:54.19530758 +0000 UTC m=+551.376823801" Jan 21 15:10:56 crc kubenswrapper[4707]: I0121 15:10:56.176443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" event={"ID":"591eebf5-40db-445a-a3ab-23a5f88f501c","Type":"ContainerStarted","Data":"c057c74e183e7691d863b98e0db867ed36c741c1c44a323280fab56bb9707f41"} Jan 21 15:10:56 crc kubenswrapper[4707]: I0121 15:10:56.199460 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-h2gvw" podStartSLOduration=1.876952736 podStartE2EDuration="6.199444762s" podCreationTimestamp="2026-01-21 15:10:50 +0000 UTC" firstStartedPulling="2026-01-21 15:10:51.530562174 +0000 UTC m=+548.712078397" lastFinishedPulling="2026-01-21 15:10:55.853054201 +0000 UTC m=+553.034570423" observedRunningTime="2026-01-21 15:10:56.198031182 +0000 UTC m=+553.379547404" watchObservedRunningTime="2026-01-21 15:10:56.199444762 +0000 UTC m=+553.380960984" Jan 21 15:11:01 crc kubenswrapper[4707]: I0121 15:11:01.238890 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-m4cqg" Jan 21 15:11:01 crc kubenswrapper[4707]: I0121 15:11:01.467739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:11:01 crc kubenswrapper[4707]: I0121 15:11:01.467785 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:11:01 crc kubenswrapper[4707]: I0121 15:11:01.472041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:11:02 crc kubenswrapper[4707]: I0121 15:11:02.201748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56fdbdc57d-q67pn" Jan 21 15:11:02 crc kubenswrapper[4707]: I0121 15:11:02.262141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-c6ckn"] Jan 21 15:11:09 crc kubenswrapper[4707]: I0121 15:11:09.945151 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:11:09 crc kubenswrapper[4707]: I0121 15:11:09.945932 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:11:11 crc kubenswrapper[4707]: I0121 15:11:11.506374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-64vq9" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.512699 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq"] Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.518800 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.526744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.531028 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq"] Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.600495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8dd\" (UniqueName: \"kubernetes.io/projected/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-kube-api-access-bb8dd\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.600775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.600931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.701588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.702004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8dd\" (UniqueName: \"kubernetes.io/projected/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-kube-api-access-bb8dd\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 
15:11:19.702040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.702114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.702360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.721764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8dd\" (UniqueName: \"kubernetes.io/projected/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-kube-api-access-bb8dd\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:19 crc kubenswrapper[4707]: I0121 15:11:19.833911 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:20 crc kubenswrapper[4707]: I0121 15:11:20.170281 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq"] Jan 21 15:11:20 crc kubenswrapper[4707]: I0121 15:11:20.267885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" event={"ID":"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3","Type":"ContainerStarted","Data":"8d85121904608780848c708a35bcef507e58da09efa9986ddc33e43fe3c241fa"} Jan 21 15:11:20 crc kubenswrapper[4707]: I0121 15:11:20.268129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" event={"ID":"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3","Type":"ContainerStarted","Data":"ed4dd28497f0f8a3972a6c9dae1815ad4bdd749b5dd01c766becddc44ef23a34"} Jan 21 15:11:21 crc kubenswrapper[4707]: I0121 15:11:21.274089 4707 generic.go:334] "Generic (PLEG): container finished" podID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerID="8d85121904608780848c708a35bcef507e58da09efa9986ddc33e43fe3c241fa" exitCode=0 Jan 21 15:11:21 crc kubenswrapper[4707]: I0121 15:11:21.274132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" event={"ID":"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3","Type":"ContainerDied","Data":"8d85121904608780848c708a35bcef507e58da09efa9986ddc33e43fe3c241fa"} Jan 21 15:11:23 crc kubenswrapper[4707]: I0121 15:11:23.283699 4707 generic.go:334] "Generic (PLEG): container finished" podID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerID="a3832589a1d9ff375ce454d2398501994c48a7a2dec286a892e96f514cc98195" exitCode=0 Jan 21 15:11:23 crc kubenswrapper[4707]: I0121 15:11:23.283728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" event={"ID":"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3","Type":"ContainerDied","Data":"a3832589a1d9ff375ce454d2398501994c48a7a2dec286a892e96f514cc98195"} Jan 21 15:11:24 crc kubenswrapper[4707]: I0121 15:11:24.289418 4707 generic.go:334] "Generic (PLEG): container finished" podID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerID="c952b8b0dbf7285a4e1f1241d64df25007394125ea3c4fb5375299fcea2b58c3" exitCode=0 Jan 21 15:11:24 crc kubenswrapper[4707]: I0121 15:11:24.289455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" event={"ID":"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3","Type":"ContainerDied","Data":"c952b8b0dbf7285a4e1f1241d64df25007394125ea3c4fb5375299fcea2b58c3"} Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.456981 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.563227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-bundle\") pod \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.563270 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb8dd\" (UniqueName: \"kubernetes.io/projected/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-kube-api-access-bb8dd\") pod \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.563298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-util\") pod \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\" (UID: \"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3\") " Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.564071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-bundle" (OuterVolumeSpecName: "bundle") pod "ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" (UID: "ace82b3e-606d-4deb-ab5f-04fd56c3d8d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.567851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-kube-api-access-bb8dd" (OuterVolumeSpecName: "kube-api-access-bb8dd") pod "ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" (UID: "ace82b3e-606d-4deb-ab5f-04fd56c3d8d3"). InnerVolumeSpecName "kube-api-access-bb8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.573126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-util" (OuterVolumeSpecName: "util") pod "ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" (UID: "ace82b3e-606d-4deb-ab5f-04fd56c3d8d3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.665011 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.665039 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb8dd\" (UniqueName: \"kubernetes.io/projected/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-kube-api-access-bb8dd\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:25 crc kubenswrapper[4707]: I0121 15:11:25.665050 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace82b3e-606d-4deb-ab5f-04fd56c3d8d3-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:26 crc kubenswrapper[4707]: I0121 15:11:26.300875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" event={"ID":"ace82b3e-606d-4deb-ab5f-04fd56c3d8d3","Type":"ContainerDied","Data":"ed4dd28497f0f8a3972a6c9dae1815ad4bdd749b5dd01c766becddc44ef23a34"} Jan 21 15:11:26 crc kubenswrapper[4707]: I0121 15:11:26.300908 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq" Jan 21 15:11:26 crc kubenswrapper[4707]: I0121 15:11:26.300911 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed4dd28497f0f8a3972a6c9dae1815ad4bdd749b5dd01c766becddc44ef23a34" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.297371 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-c6ckn" podUID="0ebc35b7-89c9-490c-9756-6d532a2411ed" containerName="console" containerID="cri-o://f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2" gracePeriod=15 Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.586927 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c6ckn_0ebc35b7-89c9-490c-9756-6d532a2411ed/console/0.log" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.587156 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-oauth-serving-cert\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-trusted-ca-bundle\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-service-ca\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-oauth-config\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkwcs\" (UniqueName: \"kubernetes.io/projected/0ebc35b7-89c9-490c-9756-6d532a2411ed-kube-api-access-dkwcs\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-config\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.787730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-serving-cert\") pod \"0ebc35b7-89c9-490c-9756-6d532a2411ed\" (UID: \"0ebc35b7-89c9-490c-9756-6d532a2411ed\") " Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.788081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.788292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-service-ca" (OuterVolumeSpecName: "service-ca") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.788469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.788520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-config" (OuterVolumeSpecName: "console-config") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.789176 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.789245 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.789307 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.789364 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ebc35b7-89c9-490c-9756-6d532a2411ed-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.791888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebc35b7-89c9-490c-9756-6d532a2411ed-kube-api-access-dkwcs" (OuterVolumeSpecName: "kube-api-access-dkwcs") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "kube-api-access-dkwcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.794309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.794706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0ebc35b7-89c9-490c-9756-6d532a2411ed" (UID: "0ebc35b7-89c9-490c-9756-6d532a2411ed"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.890114 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.890147 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkwcs\" (UniqueName: \"kubernetes.io/projected/0ebc35b7-89c9-490c-9756-6d532a2411ed-kube-api-access-dkwcs\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:27 crc kubenswrapper[4707]: I0121 15:11:27.890158 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebc35b7-89c9-490c-9756-6d532a2411ed-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.311122 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c6ckn_0ebc35b7-89c9-490c-9756-6d532a2411ed/console/0.log" Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.311161 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ebc35b7-89c9-490c-9756-6d532a2411ed" containerID="f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2" exitCode=2 Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.311185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c6ckn" event={"ID":"0ebc35b7-89c9-490c-9756-6d532a2411ed","Type":"ContainerDied","Data":"f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2"} Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.311206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c6ckn" event={"ID":"0ebc35b7-89c9-490c-9756-6d532a2411ed","Type":"ContainerDied","Data":"10aa7ff284582adc14afab6e0dad6d9f6c329e2b4d1a261377e7daa4bfc59510"} Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.311220 4707 scope.go:117] "RemoveContainer" containerID="f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2" Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.311314 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-c6ckn" Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.329178 4707 scope.go:117] "RemoveContainer" containerID="f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2" Jan 21 15:11:28 crc kubenswrapper[4707]: E0121 15:11:28.329558 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2\": container with ID starting with f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2 not found: ID does not exist" containerID="f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2" Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.329596 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2"} err="failed to get container status \"f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2\": rpc error: code = NotFound desc = could not find container \"f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2\": container with ID starting with f568bb560e4344887e4256dc5557d6119742017d5c4be1747e4120e10b29e1e2 not found: ID does not exist" Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.336403 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-c6ckn"] Jan 21 15:11:28 crc kubenswrapper[4707]: I0121 15:11:28.339122 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-c6ckn"] Jan 21 15:11:29 crc kubenswrapper[4707]: I0121 15:11:29.187447 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebc35b7-89c9-490c-9756-6d532a2411ed" path="/var/lib/kubelet/pods/0ebc35b7-89c9-490c-9756-6d532a2411ed/volumes" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.591625 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-769647f86-fhptv"] Jan 21 15:11:35 crc kubenswrapper[4707]: E0121 15:11:35.591987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebc35b7-89c9-490c-9756-6d532a2411ed" containerName="console" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592000 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebc35b7-89c9-490c-9756-6d532a2411ed" containerName="console" Jan 21 15:11:35 crc kubenswrapper[4707]: E0121 15:11:35.592009 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="extract" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592016 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="extract" Jan 21 15:11:35 crc kubenswrapper[4707]: E0121 15:11:35.592027 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="util" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592032 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="util" Jan 21 15:11:35 crc kubenswrapper[4707]: E0121 15:11:35.592042 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="pull" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592047 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="pull" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebc35b7-89c9-490c-9756-6d532a2411ed" containerName="console" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592137 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace82b3e-606d-4deb-ab5f-04fd56c3d8d3" containerName="extract" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.592440 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.594045 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.594548 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-jq729" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.594584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.597212 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.599299 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.603046 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-769647f86-fhptv"] Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.770026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-apiservice-cert\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.770081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-webhook-cert\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.770184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cx5g\" (UniqueName: \"kubernetes.io/projected/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-kube-api-access-4cx5g\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.822346 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5"] Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.822934 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.824152 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.824260 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cmf5v" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.824704 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.833662 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5"] Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.871209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-webhook-cert\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.871270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/560343d7-3dae-42d2-b79f-e7ddbde3df71-webhook-cert\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.871304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hsvv\" (UniqueName: \"kubernetes.io/projected/560343d7-3dae-42d2-b79f-e7ddbde3df71-kube-api-access-8hsvv\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.871325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cx5g\" (UniqueName: \"kubernetes.io/projected/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-kube-api-access-4cx5g\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.871358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/560343d7-3dae-42d2-b79f-e7ddbde3df71-apiservice-cert\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.871384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-apiservice-cert\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 
15:11:35.876394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-apiservice-cert\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.876522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-webhook-cert\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.885983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cx5g\" (UniqueName: \"kubernetes.io/projected/3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78-kube-api-access-4cx5g\") pod \"metallb-operator-controller-manager-769647f86-fhptv\" (UID: \"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78\") " pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.905103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.972441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/560343d7-3dae-42d2-b79f-e7ddbde3df71-webhook-cert\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.972492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hsvv\" (UniqueName: \"kubernetes.io/projected/560343d7-3dae-42d2-b79f-e7ddbde3df71-kube-api-access-8hsvv\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.972527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/560343d7-3dae-42d2-b79f-e7ddbde3df71-apiservice-cert\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.975303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/560343d7-3dae-42d2-b79f-e7ddbde3df71-apiservice-cert\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.984352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/560343d7-3dae-42d2-b79f-e7ddbde3df71-webhook-cert\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " 
pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:35 crc kubenswrapper[4707]: I0121 15:11:35.997341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hsvv\" (UniqueName: \"kubernetes.io/projected/560343d7-3dae-42d2-b79f-e7ddbde3df71-kube-api-access-8hsvv\") pod \"metallb-operator-webhook-server-58ff8dbc9-jm9z5\" (UID: \"560343d7-3dae-42d2-b79f-e7ddbde3df71\") " pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:36 crc kubenswrapper[4707]: I0121 15:11:36.133138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:36 crc kubenswrapper[4707]: I0121 15:11:36.281169 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-769647f86-fhptv"] Jan 21 15:11:36 crc kubenswrapper[4707]: W0121 15:11:36.288017 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3dbff1_41f3_49a2_bf4d_9d4c1dcdef78.slice/crio-4e5db0b807f6e9bbb641c406e01b8c172fbe8433257b22d4a13ba67281d8dc74 WatchSource:0}: Error finding container 4e5db0b807f6e9bbb641c406e01b8c172fbe8433257b22d4a13ba67281d8dc74: Status 404 returned error can't find the container with id 4e5db0b807f6e9bbb641c406e01b8c172fbe8433257b22d4a13ba67281d8dc74 Jan 21 15:11:36 crc kubenswrapper[4707]: I0121 15:11:36.349922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" event={"ID":"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78","Type":"ContainerStarted","Data":"4e5db0b807f6e9bbb641c406e01b8c172fbe8433257b22d4a13ba67281d8dc74"} Jan 21 15:11:36 crc kubenswrapper[4707]: I0121 15:11:36.483122 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5"] Jan 21 15:11:36 crc kubenswrapper[4707]: W0121 15:11:36.485220 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod560343d7_3dae_42d2_b79f_e7ddbde3df71.slice/crio-2c8034078a933772f51594a63eb85510871cfc5029887722ba7024752eb85de2 WatchSource:0}: Error finding container 2c8034078a933772f51594a63eb85510871cfc5029887722ba7024752eb85de2: Status 404 returned error can't find the container with id 2c8034078a933772f51594a63eb85510871cfc5029887722ba7024752eb85de2 Jan 21 15:11:37 crc kubenswrapper[4707]: I0121 15:11:37.355709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" event={"ID":"560343d7-3dae-42d2-b79f-e7ddbde3df71","Type":"ContainerStarted","Data":"2c8034078a933772f51594a63eb85510871cfc5029887722ba7024752eb85de2"} Jan 21 15:11:39 crc kubenswrapper[4707]: I0121 15:11:39.366784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" event={"ID":"3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78","Type":"ContainerStarted","Data":"db673932d256cbfdabc7d01b663c520abd0d55b2ee3c83a3545344798ce5249a"} Jan 21 15:11:39 crc kubenswrapper[4707]: I0121 15:11:39.367082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:11:39 crc kubenswrapper[4707]: I0121 15:11:39.382841 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" podStartSLOduration=1.993829445 podStartE2EDuration="4.382823636s" podCreationTimestamp="2026-01-21 15:11:35 +0000 UTC" firstStartedPulling="2026-01-21 15:11:36.29060438 +0000 UTC m=+593.472120602" lastFinishedPulling="2026-01-21 15:11:38.679598571 +0000 UTC m=+595.861114793" observedRunningTime="2026-01-21 15:11:39.381249394 +0000 UTC m=+596.562765616" watchObservedRunningTime="2026-01-21 15:11:39.382823636 +0000 UTC m=+596.564339858" Jan 21 15:11:39 crc kubenswrapper[4707]: I0121 15:11:39.945766 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:11:39 crc kubenswrapper[4707]: I0121 15:11:39.945838 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:11:41 crc kubenswrapper[4707]: I0121 15:11:41.376043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" event={"ID":"560343d7-3dae-42d2-b79f-e7ddbde3df71","Type":"ContainerStarted","Data":"201d3d97a3456ae30aecd86782774f2ef95b719c1acf6f7c50daadfe799687d5"} Jan 21 15:11:41 crc kubenswrapper[4707]: I0121 15:11:41.376164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:11:41 crc kubenswrapper[4707]: I0121 15:11:41.389415 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" podStartSLOduration=1.910772194 podStartE2EDuration="6.389404831s" podCreationTimestamp="2026-01-21 15:11:35 +0000 UTC" firstStartedPulling="2026-01-21 15:11:36.487242743 +0000 UTC m=+593.668758965" lastFinishedPulling="2026-01-21 15:11:40.96587538 +0000 UTC m=+598.147391602" observedRunningTime="2026-01-21 15:11:41.388129602 +0000 UTC m=+598.569645823" watchObservedRunningTime="2026-01-21 15:11:41.389404831 +0000 UTC m=+598.570921053" Jan 21 15:11:56 crc kubenswrapper[4707]: I0121 15:11:56.136621 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-58ff8dbc9-jm9z5" Jan 21 15:12:09 crc kubenswrapper[4707]: I0121 15:12:09.945589 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:12:09 crc kubenswrapper[4707]: I0121 15:12:09.946017 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:12:09 crc kubenswrapper[4707]: I0121 15:12:09.946062 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:12:09 crc kubenswrapper[4707]: I0121 15:12:09.946578 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f362a525166796306b00e86a9f5d281cb9947b8704d1d3ea656f1842b6d253f"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:12:09 crc kubenswrapper[4707]: I0121 15:12:09.946632 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://5f362a525166796306b00e86a9f5d281cb9947b8704d1d3ea656f1842b6d253f" gracePeriod=600 Jan 21 15:12:10 crc kubenswrapper[4707]: I0121 15:12:10.497458 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="5f362a525166796306b00e86a9f5d281cb9947b8704d1d3ea656f1842b6d253f" exitCode=0 Jan 21 15:12:10 crc kubenswrapper[4707]: I0121 15:12:10.497618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"5f362a525166796306b00e86a9f5d281cb9947b8704d1d3ea656f1842b6d253f"} Jan 21 15:12:10 crc kubenswrapper[4707]: I0121 15:12:10.497709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"38eb7c71f338910a44fc52babc17693f7c677868f52954612f9a40b65ff03f90"} Jan 21 15:12:10 crc kubenswrapper[4707]: I0121 15:12:10.497732 4707 scope.go:117] "RemoveContainer" containerID="bf3c9e6b6e14a46494620756c3c0f7583b42676c1dc0d8e1f947083111f31743" Jan 21 15:12:15 crc kubenswrapper[4707]: I0121 15:12:15.908068 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-769647f86-fhptv" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.449826 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c"] Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.450571 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.452062 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.452401 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jxchz" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.459875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c"] Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.462933 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pdjrb"] Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.464655 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.467087 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.467132 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.533880 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-xwn92"] Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.534572 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.536478 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.544136 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-v97x2"] Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.544875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.547691 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.547756 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.548087 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wnqvq" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.548980 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.549890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-xwn92"] Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.621957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-reloader\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-metrics-certs\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-startup\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-sockets\") pod \"frr-k8s-pdjrb\" (UID: 
\"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v7mb\" (UniqueName: \"kubernetes.io/projected/a50f5957-c813-4be6-846c-18fa67c16942-kube-api-access-6v7mb\") pod \"frr-k8s-webhook-server-7df86c4f6c-czn2c\" (UID: \"a50f5957-c813-4be6-846c-18fa67c16942\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-conf\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-metrics\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622889 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8r6\" (UniqueName: \"kubernetes.io/projected/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-kube-api-access-pm8r6\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.622931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a50f5957-c813-4be6-846c-18fa67c16942-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-czn2c\" (UID: \"a50f5957-c813-4be6-846c-18fa67c16942\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8r6\" (UniqueName: \"kubernetes.io/projected/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-kube-api-access-pm8r6\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a50f5957-c813-4be6-846c-18fa67c16942-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-czn2c\" (UID: \"a50f5957-c813-4be6-846c-18fa67c16942\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-metrics-certs\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-cert\") pod \"controller-6968d8fdc4-xwn92\" (UID: 
\"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-metrics-certs\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-reloader\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-metrics-certs\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kxl\" (UniqueName: \"kubernetes.io/projected/e01d0445-0202-4f05-8e3b-cae0bd5b2468-kube-api-access-c2kxl\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-startup\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8d6\" (UniqueName: \"kubernetes.io/projected/d5bd042b-d9fe-489e-9d06-de8294e48164-kube-api-access-rj8d6\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-sockets\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.723866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v7mb\" (UniqueName: \"kubernetes.io/projected/a50f5957-c813-4be6-846c-18fa67c16942-kube-api-access-6v7mb\") pod \"frr-k8s-webhook-server-7df86c4f6c-czn2c\" (UID: \"a50f5957-c813-4be6-846c-18fa67c16942\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc 
kubenswrapper[4707]: I0121 15:12:16.724274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-reloader\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-sockets\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-conf\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-metrics\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e01d0445-0202-4f05-8e3b-cae0bd5b2468-metallb-excludel2\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-conf\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-frr-startup\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.724869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-metrics\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.727879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-metrics-certs\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.730314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a50f5957-c813-4be6-846c-18fa67c16942-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-czn2c\" (UID: \"a50f5957-c813-4be6-846c-18fa67c16942\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 
crc kubenswrapper[4707]: I0121 15:12:16.736134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8r6\" (UniqueName: \"kubernetes.io/projected/3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3-kube-api-access-pm8r6\") pod \"frr-k8s-pdjrb\" (UID: \"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3\") " pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.736467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v7mb\" (UniqueName: \"kubernetes.io/projected/a50f5957-c813-4be6-846c-18fa67c16942-kube-api-access-6v7mb\") pod \"frr-k8s-webhook-server-7df86c4f6c-czn2c\" (UID: \"a50f5957-c813-4be6-846c-18fa67c16942\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.762799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.774949 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8d6\" (UniqueName: \"kubernetes.io/projected/d5bd042b-d9fe-489e-9d06-de8294e48164-kube-api-access-rj8d6\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e01d0445-0202-4f05-8e3b-cae0bd5b2468-metallb-excludel2\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-metrics-certs\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-cert\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-metrics-certs\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.826850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c2kxl\" (UniqueName: \"kubernetes.io/projected/e01d0445-0202-4f05-8e3b-cae0bd5b2468-kube-api-access-c2kxl\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: E0121 15:12:16.826873 4707 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 21 15:12:16 crc kubenswrapper[4707]: E0121 15:12:16.826922 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-metrics-certs podName:d5bd042b-d9fe-489e-9d06-de8294e48164 nodeName:}" failed. No retries permitted until 2026-01-21 15:12:17.326908947 +0000 UTC m=+634.508425169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-metrics-certs") pod "controller-6968d8fdc4-xwn92" (UID: "d5bd042b-d9fe-489e-9d06-de8294e48164") : secret "controller-certs-secret" not found Jan 21 15:12:16 crc kubenswrapper[4707]: E0121 15:12:16.827389 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 21 15:12:16 crc kubenswrapper[4707]: E0121 15:12:16.827416 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist podName:e01d0445-0202-4f05-8e3b-cae0bd5b2468 nodeName:}" failed. No retries permitted until 2026-01-21 15:12:17.327408056 +0000 UTC m=+634.508924278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist") pod "speaker-v97x2" (UID: "e01d0445-0202-4f05-8e3b-cae0bd5b2468") : secret "metallb-memberlist" not found Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.827576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e01d0445-0202-4f05-8e3b-cae0bd5b2468-metallb-excludel2\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.829386 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.829643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-metrics-certs\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.841518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-cert\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.843613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kxl\" (UniqueName: \"kubernetes.io/projected/e01d0445-0202-4f05-8e3b-cae0bd5b2468-kube-api-access-c2kxl\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:16 crc kubenswrapper[4707]: I0121 15:12:16.845285 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8d6\" (UniqueName: \"kubernetes.io/projected/d5bd042b-d9fe-489e-9d06-de8294e48164-kube-api-access-rj8d6\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.112000 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c"] Jan 21 15:12:17 crc kubenswrapper[4707]: W0121 15:12:17.114933 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda50f5957_c813_4be6_846c_18fa67c16942.slice/crio-3711b4573eae9c5804ab568c0c24d6c47eea664a339f61d365f29d3497ab2f8f WatchSource:0}: Error finding container 3711b4573eae9c5804ab568c0c24d6c47eea664a339f61d365f29d3497ab2f8f: Status 404 returned error can't find the container with id 3711b4573eae9c5804ab568c0c24d6c47eea664a339f61d365f29d3497ab2f8f Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.333622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.333904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-metrics-certs\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:17 crc kubenswrapper[4707]: E0121 15:12:17.333795 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 21 15:12:17 crc kubenswrapper[4707]: E0121 15:12:17.334238 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist podName:e01d0445-0202-4f05-8e3b-cae0bd5b2468 nodeName:}" failed. No retries permitted until 2026-01-21 15:12:18.33421416 +0000 UTC m=+635.515730392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist") pod "speaker-v97x2" (UID: "e01d0445-0202-4f05-8e3b-cae0bd5b2468") : secret "metallb-memberlist" not found Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.337320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5bd042b-d9fe-489e-9d06-de8294e48164-metrics-certs\") pod \"controller-6968d8fdc4-xwn92\" (UID: \"d5bd042b-d9fe-489e-9d06-de8294e48164\") " pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.444558 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.528372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"aa17200fb0ee52aee67e2ddd63d01d58d64af15879084c7aed3b9ea479e2bda3"} Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.530590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" event={"ID":"a50f5957-c813-4be6-846c-18fa67c16942","Type":"ContainerStarted","Data":"3711b4573eae9c5804ab568c0c24d6c47eea664a339f61d365f29d3497ab2f8f"} Jan 21 15:12:17 crc kubenswrapper[4707]: I0121 15:12:17.578070 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-xwn92"] Jan 21 15:12:17 crc kubenswrapper[4707]: W0121 15:12:17.581973 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5bd042b_d9fe_489e_9d06_de8294e48164.slice/crio-d17bb8e9a9525ee9665e2a52f72b75de2c66fdc213cadfd75c86a8364ff07a47 WatchSource:0}: Error finding container d17bb8e9a9525ee9665e2a52f72b75de2c66fdc213cadfd75c86a8364ff07a47: Status 404 returned error can't find the container with id d17bb8e9a9525ee9665e2a52f72b75de2c66fdc213cadfd75c86a8364ff07a47 Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.348169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.351948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e01d0445-0202-4f05-8e3b-cae0bd5b2468-memberlist\") pod \"speaker-v97x2\" (UID: \"e01d0445-0202-4f05-8e3b-cae0bd5b2468\") " pod="metallb-system/speaker-v97x2" Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.354112 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-v97x2" Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.568397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v97x2" event={"ID":"e01d0445-0202-4f05-8e3b-cae0bd5b2468","Type":"ContainerStarted","Data":"0ce4a4e766698cad192885275c408f961d06407f4784de0c1970b2ee9e91e1ba"} Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.571786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-xwn92" event={"ID":"d5bd042b-d9fe-489e-9d06-de8294e48164","Type":"ContainerStarted","Data":"a2a024595023521db1dcb897a7f06670dbf74fdb8060834a93a59f8a11b8014c"} Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.571850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-xwn92" event={"ID":"d5bd042b-d9fe-489e-9d06-de8294e48164","Type":"ContainerStarted","Data":"dac116e29669b61a0bef25e87b5c6b7f13b2705ee8671497290cd573d68ec0b8"} Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.571862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-xwn92" event={"ID":"d5bd042b-d9fe-489e-9d06-de8294e48164","Type":"ContainerStarted","Data":"d17bb8e9a9525ee9665e2a52f72b75de2c66fdc213cadfd75c86a8364ff07a47"} Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.572922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:18 crc kubenswrapper[4707]: I0121 15:12:18.593415 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-xwn92" podStartSLOduration=2.593397043 podStartE2EDuration="2.593397043s" podCreationTimestamp="2026-01-21 15:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:12:18.592269492 +0000 UTC m=+635.773785714" watchObservedRunningTime="2026-01-21 15:12:18.593397043 +0000 UTC m=+635.774913266" Jan 21 15:12:19 crc kubenswrapper[4707]: I0121 15:12:19.587908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v97x2" event={"ID":"e01d0445-0202-4f05-8e3b-cae0bd5b2468","Type":"ContainerStarted","Data":"f043dc5af324712d92c4a64ed87b455c0d39658f64a5adb80aedccd85e8bbfde"} Jan 21 15:12:19 crc kubenswrapper[4707]: I0121 15:12:19.588130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v97x2" event={"ID":"e01d0445-0202-4f05-8e3b-cae0bd5b2468","Type":"ContainerStarted","Data":"8e001867954d71636514dcdbc404f083ec55d0be4874b6f2d532ee2a943d49cd"} Jan 21 15:12:19 crc kubenswrapper[4707]: I0121 15:12:19.588147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-v97x2" Jan 21 15:12:19 crc kubenswrapper[4707]: I0121 15:12:19.602764 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-v97x2" podStartSLOduration=3.6027480240000003 podStartE2EDuration="3.602748024s" podCreationTimestamp="2026-01-21 15:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:12:19.601944372 +0000 UTC m=+636.783460594" watchObservedRunningTime="2026-01-21 15:12:19.602748024 +0000 UTC m=+636.784264246" Jan 21 15:12:23 crc kubenswrapper[4707]: I0121 15:12:23.616113 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3" containerID="c10688d2ebabcd58b67d1dc088a71961f6694353cd0c484a3bfbb970662f8fab" exitCode=0 Jan 21 15:12:23 crc kubenswrapper[4707]: I0121 15:12:23.616169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerDied","Data":"c10688d2ebabcd58b67d1dc088a71961f6694353cd0c484a3bfbb970662f8fab"} Jan 21 15:12:23 crc kubenswrapper[4707]: I0121 15:12:23.617739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" event={"ID":"a50f5957-c813-4be6-846c-18fa67c16942","Type":"ContainerStarted","Data":"9bb86e5c0cf0ebe245a48f3a187ad62dd256ca3c5ad22ca8d1b00d905e2b1564"} Jan 21 15:12:23 crc kubenswrapper[4707]: I0121 15:12:23.617841 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:23 crc kubenswrapper[4707]: I0121 15:12:23.640360 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" podStartSLOduration=2.060640319 podStartE2EDuration="7.640347894s" podCreationTimestamp="2026-01-21 15:12:16 +0000 UTC" firstStartedPulling="2026-01-21 15:12:17.11678183 +0000 UTC m=+634.298298052" lastFinishedPulling="2026-01-21 15:12:22.696489404 +0000 UTC m=+639.878005627" observedRunningTime="2026-01-21 15:12:23.64018595 +0000 UTC m=+640.821702172" watchObservedRunningTime="2026-01-21 15:12:23.640347894 +0000 UTC m=+640.821864117" Jan 21 15:12:24 crc kubenswrapper[4707]: I0121 15:12:24.623100 4707 generic.go:334] "Generic (PLEG): container finished" podID="3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3" containerID="6b324022fe6c878bfb151f26178c62cff12af327f29b5245b0c2b102f38da8bf" exitCode=0 Jan 21 15:12:24 crc kubenswrapper[4707]: I0121 15:12:24.623171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerDied","Data":"6b324022fe6c878bfb151f26178c62cff12af327f29b5245b0c2b102f38da8bf"} Jan 21 15:12:25 crc kubenswrapper[4707]: I0121 15:12:25.628525 4707 generic.go:334] "Generic (PLEG): container finished" podID="3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3" containerID="529826fdbdd8799427651571796d8c33e5902812142033a7532d1b4bf345c56f" exitCode=0 Jan 21 15:12:25 crc kubenswrapper[4707]: I0121 15:12:25.628563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerDied","Data":"529826fdbdd8799427651571796d8c33e5902812142033a7532d1b4bf345c56f"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"117858e06fd28aba388b526a79e35bbd52b55f8e91353dd5380d472c443b7a28"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636658 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"41f764c8465f57309c24ca90415ac504560eedffe330a7065071308a61e6f241"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636699 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"836e25841d3b733763403f38b4e37d2f432614308d11e5b114b20ab446d54675"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"b5e9ffa81436c749705fbb215fcf5c64f88c61f3c212efcb6391a500074c6622"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"6d1f19aa4bb0b7bd6c180b2ceeebe3e1331be173000c69cf594cb32dd8171f99"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.636736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdjrb" event={"ID":"3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3","Type":"ContainerStarted","Data":"d02f0cc97d4debed6ca0791ed65533eb9b9290f374b1657ac46b544c8e5d0402"} Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.653414 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pdjrb" podStartSLOduration=4.855874141 podStartE2EDuration="10.653400762s" podCreationTimestamp="2026-01-21 15:12:16 +0000 UTC" firstStartedPulling="2026-01-21 15:12:16.885970006 +0000 UTC m=+634.067486228" lastFinishedPulling="2026-01-21 15:12:22.683496628 +0000 UTC m=+639.865012849" observedRunningTime="2026-01-21 15:12:26.651285992 +0000 UTC m=+643.832802214" watchObservedRunningTime="2026-01-21 15:12:26.653400762 +0000 UTC m=+643.834916984" Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.776067 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:26 crc kubenswrapper[4707]: I0121 15:12:26.804896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:27 crc kubenswrapper[4707]: I0121 15:12:27.448478 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-xwn92" Jan 21 15:12:28 crc kubenswrapper[4707]: I0121 15:12:28.357420 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-v97x2" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.501043 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k"] Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.502134 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.503578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.508826 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k"] Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.600854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.600944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxj7\" (UniqueName: \"kubernetes.io/projected/5ee91b4f-b397-4648-a2ab-dbf5699be347-kube-api-access-cxxj7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.601028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.701884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.701940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.702008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxj7\" (UniqueName: \"kubernetes.io/projected/5ee91b4f-b397-4648-a2ab-dbf5699be347-kube-api-access-cxxj7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.702536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.702578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.716431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxj7\" (UniqueName: \"kubernetes.io/projected/5ee91b4f-b397-4648-a2ab-dbf5699be347-kube-api-access-cxxj7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:29 crc kubenswrapper[4707]: I0121 15:12:29.816767 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:30 crc kubenswrapper[4707]: I0121 15:12:30.172603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k"] Jan 21 15:12:30 crc kubenswrapper[4707]: I0121 15:12:30.653647 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerID="59333d98473631dbd0f6f087b3d6c8396c0b2d921adcba4e849543037f58a1c0" exitCode=0 Jan 21 15:12:30 crc kubenswrapper[4707]: I0121 15:12:30.653687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" event={"ID":"5ee91b4f-b397-4648-a2ab-dbf5699be347","Type":"ContainerDied","Data":"59333d98473631dbd0f6f087b3d6c8396c0b2d921adcba4e849543037f58a1c0"} Jan 21 15:12:30 crc kubenswrapper[4707]: I0121 15:12:30.653736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" event={"ID":"5ee91b4f-b397-4648-a2ab-dbf5699be347","Type":"ContainerStarted","Data":"da1ac023357fdd164522b5eec3f8fcc8856f5c341a8e526d5ea62902e29ed249"} Jan 21 15:12:33 crc kubenswrapper[4707]: I0121 15:12:33.668516 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerID="bf5b428aaf5d3a61be6810d4bb56349fa3f78491a1ef36637d787d483d272a40" exitCode=0 Jan 21 15:12:33 crc kubenswrapper[4707]: I0121 15:12:33.668931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" event={"ID":"5ee91b4f-b397-4648-a2ab-dbf5699be347","Type":"ContainerDied","Data":"bf5b428aaf5d3a61be6810d4bb56349fa3f78491a1ef36637d787d483d272a40"} Jan 21 15:12:34 crc kubenswrapper[4707]: I0121 15:12:34.675093 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerID="4bea89a17a13d4adb370013e7129f3032c1c8448e2175e59d393c56956c3a309" exitCode=0 Jan 21 15:12:34 crc kubenswrapper[4707]: I0121 
15:12:34.675128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" event={"ID":"5ee91b4f-b397-4648-a2ab-dbf5699be347","Type":"ContainerDied","Data":"4bea89a17a13d4adb370013e7129f3032c1c8448e2175e59d393c56956c3a309"} Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.864602 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.972224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxj7\" (UniqueName: \"kubernetes.io/projected/5ee91b4f-b397-4648-a2ab-dbf5699be347-kube-api-access-cxxj7\") pod \"5ee91b4f-b397-4648-a2ab-dbf5699be347\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.972312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-bundle\") pod \"5ee91b4f-b397-4648-a2ab-dbf5699be347\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.972395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-util\") pod \"5ee91b4f-b397-4648-a2ab-dbf5699be347\" (UID: \"5ee91b4f-b397-4648-a2ab-dbf5699be347\") " Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.973301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-bundle" (OuterVolumeSpecName: "bundle") pod "5ee91b4f-b397-4648-a2ab-dbf5699be347" (UID: "5ee91b4f-b397-4648-a2ab-dbf5699be347"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.977016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee91b4f-b397-4648-a2ab-dbf5699be347-kube-api-access-cxxj7" (OuterVolumeSpecName: "kube-api-access-cxxj7") pod "5ee91b4f-b397-4648-a2ab-dbf5699be347" (UID: "5ee91b4f-b397-4648-a2ab-dbf5699be347"). InnerVolumeSpecName "kube-api-access-cxxj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:12:35 crc kubenswrapper[4707]: I0121 15:12:35.979709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-util" (OuterVolumeSpecName: "util") pod "5ee91b4f-b397-4648-a2ab-dbf5699be347" (UID: "5ee91b4f-b397-4648-a2ab-dbf5699be347"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.073293 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.073325 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxj7\" (UniqueName: \"kubernetes.io/projected/5ee91b4f-b397-4648-a2ab-dbf5699be347-kube-api-access-cxxj7\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.073335 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee91b4f-b397-4648-a2ab-dbf5699be347-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.685299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" event={"ID":"5ee91b4f-b397-4648-a2ab-dbf5699be347","Type":"ContainerDied","Data":"da1ac023357fdd164522b5eec3f8fcc8856f5c341a8e526d5ea62902e29ed249"} Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.685512 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1ac023357fdd164522b5eec3f8fcc8856f5c341a8e526d5ea62902e29ed249" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.685359 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.767647 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-czn2c" Jan 21 15:12:36 crc kubenswrapper[4707]: I0121 15:12:36.777156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pdjrb" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.024645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5"] Jan 21 15:12:43 crc kubenswrapper[4707]: E0121 15:12:43.025164 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="pull" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.025175 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="pull" Jan 21 15:12:43 crc kubenswrapper[4707]: E0121 15:12:43.025182 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="extract" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.025189 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="extract" Jan 21 15:12:43 crc kubenswrapper[4707]: E0121 15:12:43.025203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="util" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.025209 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="util" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.025296 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee91b4f-b397-4648-a2ab-dbf5699be347" containerName="extract" Jan 
21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.025621 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.027344 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.027379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-bxf6q" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.027521 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.035715 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5"] Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.147110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d61648ba-dbd2-4168-a42d-372f1b3e53fe-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-75hg5\" (UID: \"d61648ba-dbd2-4168-a42d-372f1b3e53fe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.147183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rtw\" (UniqueName: \"kubernetes.io/projected/d61648ba-dbd2-4168-a42d-372f1b3e53fe-kube-api-access-j8rtw\") pod \"cert-manager-operator-controller-manager-64cf6dff88-75hg5\" (UID: \"d61648ba-dbd2-4168-a42d-372f1b3e53fe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.248115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d61648ba-dbd2-4168-a42d-372f1b3e53fe-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-75hg5\" (UID: \"d61648ba-dbd2-4168-a42d-372f1b3e53fe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.248209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rtw\" (UniqueName: \"kubernetes.io/projected/d61648ba-dbd2-4168-a42d-372f1b3e53fe-kube-api-access-j8rtw\") pod \"cert-manager-operator-controller-manager-64cf6dff88-75hg5\" (UID: \"d61648ba-dbd2-4168-a42d-372f1b3e53fe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.248632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d61648ba-dbd2-4168-a42d-372f1b3e53fe-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-75hg5\" (UID: \"d61648ba-dbd2-4168-a42d-372f1b3e53fe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.260153 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.270355 4707 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.284104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rtw\" (UniqueName: \"kubernetes.io/projected/d61648ba-dbd2-4168-a42d-372f1b3e53fe-kube-api-access-j8rtw\") pod \"cert-manager-operator-controller-manager-64cf6dff88-75hg5\" (UID: \"d61648ba-dbd2-4168-a42d-372f1b3e53fe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.338826 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-bxf6q" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.347661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" Jan 21 15:12:43 crc kubenswrapper[4707]: I0121 15:12:43.711775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5"] Jan 21 15:12:44 crc kubenswrapper[4707]: I0121 15:12:44.720853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" event={"ID":"d61648ba-dbd2-4168-a42d-372f1b3e53fe","Type":"ContainerStarted","Data":"8506f1a066d124f7a6b232b2ece942b26acbb8e6f0729465bdc11614c6050977"} Jan 21 15:12:52 crc kubenswrapper[4707]: I0121 15:12:52.755996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" event={"ID":"d61648ba-dbd2-4168-a42d-372f1b3e53fe","Type":"ContainerStarted","Data":"bab261d22aa1ec85c826c9ef1b4be854348400b5e592477a4c8ed5089f0e2f38"} Jan 21 15:12:52 crc kubenswrapper[4707]: I0121 15:12:52.772931 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-75hg5" podStartSLOduration=1.189150482 podStartE2EDuration="9.772917602s" podCreationTimestamp="2026-01-21 15:12:43 +0000 UTC" firstStartedPulling="2026-01-21 15:12:43.726695594 +0000 UTC m=+660.908211817" lastFinishedPulling="2026-01-21 15:12:52.310462715 +0000 UTC m=+669.491978937" observedRunningTime="2026-01-21 15:12:52.768904432 +0000 UTC m=+669.950420655" watchObservedRunningTime="2026-01-21 15:12:52.772917602 +0000 UTC m=+669.954433823" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.212478 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6"] Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.213310 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.215182 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.215829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8b4pn" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.215931 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.220925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6"] Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.346611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18724316-30e9-4c09-baa1-d3d3971f97de-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-tbzv6\" (UID: \"18724316-30e9-4c09-baa1-d3d3971f97de\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.346685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbt6\" (UniqueName: \"kubernetes.io/projected/18724316-30e9-4c09-baa1-d3d3971f97de-kube-api-access-lpbt6\") pod \"cert-manager-cainjector-855d9ccff4-tbzv6\" (UID: \"18724316-30e9-4c09-baa1-d3d3971f97de\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.447861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18724316-30e9-4c09-baa1-d3d3971f97de-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-tbzv6\" (UID: \"18724316-30e9-4c09-baa1-d3d3971f97de\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.447913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbt6\" (UniqueName: \"kubernetes.io/projected/18724316-30e9-4c09-baa1-d3d3971f97de-kube-api-access-lpbt6\") pod \"cert-manager-cainjector-855d9ccff4-tbzv6\" (UID: \"18724316-30e9-4c09-baa1-d3d3971f97de\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.463489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18724316-30e9-4c09-baa1-d3d3971f97de-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-tbzv6\" (UID: \"18724316-30e9-4c09-baa1-d3d3971f97de\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.463603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbt6\" (UniqueName: \"kubernetes.io/projected/18724316-30e9-4c09-baa1-d3d3971f97de-kube-api-access-lpbt6\") pod \"cert-manager-cainjector-855d9ccff4-tbzv6\" (UID: \"18724316-30e9-4c09-baa1-d3d3971f97de\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.530076 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" Jan 21 15:12:59 crc kubenswrapper[4707]: I0121 15:12:59.879105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6"] Jan 21 15:13:00 crc kubenswrapper[4707]: I0121 15:13:00.791797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" event={"ID":"18724316-30e9-4c09-baa1-d3d3971f97de","Type":"ContainerStarted","Data":"ec37b293c90fb09b1f630cc2e8159e6d567f6825fd19e8819a42137bb5015ad8"} Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.052344 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-p7vr2"] Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.052985 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.055478 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-248np" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.064431 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-p7vr2"] Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.169309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t29nk\" (UniqueName: \"kubernetes.io/projected/d59f005b-d150-4190-b8ca-d757c89269f4-kube-api-access-t29nk\") pod \"cert-manager-webhook-f4fb5df64-p7vr2\" (UID: \"d59f005b-d150-4190-b8ca-d757c89269f4\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.169414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d59f005b-d150-4190-b8ca-d757c89269f4-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-p7vr2\" (UID: \"d59f005b-d150-4190-b8ca-d757c89269f4\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.270367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t29nk\" (UniqueName: \"kubernetes.io/projected/d59f005b-d150-4190-b8ca-d757c89269f4-kube-api-access-t29nk\") pod \"cert-manager-webhook-f4fb5df64-p7vr2\" (UID: \"d59f005b-d150-4190-b8ca-d757c89269f4\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.270447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d59f005b-d150-4190-b8ca-d757c89269f4-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-p7vr2\" (UID: \"d59f005b-d150-4190-b8ca-d757c89269f4\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.285269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d59f005b-d150-4190-b8ca-d757c89269f4-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-p7vr2\" (UID: \"d59f005b-d150-4190-b8ca-d757c89269f4\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.287236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t29nk\" (UniqueName: \"kubernetes.io/projected/d59f005b-d150-4190-b8ca-d757c89269f4-kube-api-access-t29nk\") pod \"cert-manager-webhook-f4fb5df64-p7vr2\" (UID: \"d59f005b-d150-4190-b8ca-d757c89269f4\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.366368 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.722509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-p7vr2"] Jan 21 15:13:01 crc kubenswrapper[4707]: I0121 15:13:01.796639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" event={"ID":"d59f005b-d150-4190-b8ca-d757c89269f4","Type":"ContainerStarted","Data":"d397aa7bdb1349219d3ef05b9b642e1ca4d124d4fc0601207c6f12544d98958d"} Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.827528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" event={"ID":"d59f005b-d150-4190-b8ca-d757c89269f4","Type":"ContainerStarted","Data":"33d3a594cdabe421e962ed5a14632b6b4dedc2fb49fc51f253c96362ded8438d"} Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.827921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.829010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" event={"ID":"18724316-30e9-4c09-baa1-d3d3971f97de","Type":"ContainerStarted","Data":"7f7afdaea4175dd98b324de499d20f4142cecd67c36dcbb327ea70ea54a70c6f"} Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.837439 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" podStartSLOduration=1.6446961679999998 podStartE2EDuration="5.837429949s" podCreationTimestamp="2026-01-21 15:13:01 +0000 UTC" firstStartedPulling="2026-01-21 15:13:01.734775239 +0000 UTC m=+678.916291462" lastFinishedPulling="2026-01-21 15:13:05.927509021 +0000 UTC m=+683.109025243" observedRunningTime="2026-01-21 15:13:06.836547609 +0000 UTC m=+684.018063832" watchObservedRunningTime="2026-01-21 15:13:06.837429949 +0000 UTC m=+684.018946171" Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.855586 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-tbzv6" podStartSLOduration=1.788692145 podStartE2EDuration="7.855569942s" podCreationTimestamp="2026-01-21 15:12:59 +0000 UTC" firstStartedPulling="2026-01-21 15:12:59.878960453 +0000 UTC m=+677.060476675" lastFinishedPulling="2026-01-21 15:13:05.94583825 +0000 UTC m=+683.127354472" observedRunningTime="2026-01-21 15:13:06.847218693 +0000 UTC m=+684.028734915" watchObservedRunningTime="2026-01-21 15:13:06.855569942 +0000 UTC m=+684.037086164" Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.857521 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-pdx6x"] Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.858136 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.859348 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-5w27t" Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.864578 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-pdx6x"] Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.938009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34c2061b-4028-4e99-9345-d8443062558a-bound-sa-token\") pod \"cert-manager-86cb77c54b-pdx6x\" (UID: \"34c2061b-4028-4e99-9345-d8443062558a\") " pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:06 crc kubenswrapper[4707]: I0121 15:13:06.938322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcdqr\" (UniqueName: \"kubernetes.io/projected/34c2061b-4028-4e99-9345-d8443062558a-kube-api-access-tcdqr\") pod \"cert-manager-86cb77c54b-pdx6x\" (UID: \"34c2061b-4028-4e99-9345-d8443062558a\") " pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.039546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcdqr\" (UniqueName: \"kubernetes.io/projected/34c2061b-4028-4e99-9345-d8443062558a-kube-api-access-tcdqr\") pod \"cert-manager-86cb77c54b-pdx6x\" (UID: \"34c2061b-4028-4e99-9345-d8443062558a\") " pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.039595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34c2061b-4028-4e99-9345-d8443062558a-bound-sa-token\") pod \"cert-manager-86cb77c54b-pdx6x\" (UID: \"34c2061b-4028-4e99-9345-d8443062558a\") " pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.054436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34c2061b-4028-4e99-9345-d8443062558a-bound-sa-token\") pod \"cert-manager-86cb77c54b-pdx6x\" (UID: \"34c2061b-4028-4e99-9345-d8443062558a\") " pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.054606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcdqr\" (UniqueName: \"kubernetes.io/projected/34c2061b-4028-4e99-9345-d8443062558a-kube-api-access-tcdqr\") pod \"cert-manager-86cb77c54b-pdx6x\" (UID: \"34c2061b-4028-4e99-9345-d8443062558a\") " pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.172147 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-pdx6x" Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.503093 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-pdx6x"] Jan 21 15:13:07 crc kubenswrapper[4707]: W0121 15:13:07.507093 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c2061b_4028_4e99_9345_d8443062558a.slice/crio-27b878def9702a94abb9f4d61230fcd102a6fb31e2dbc96db9800446fd1c32e5 WatchSource:0}: Error finding container 27b878def9702a94abb9f4d61230fcd102a6fb31e2dbc96db9800446fd1c32e5: Status 404 returned error can't find the container with id 27b878def9702a94abb9f4d61230fcd102a6fb31e2dbc96db9800446fd1c32e5 Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.834719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-pdx6x" event={"ID":"34c2061b-4028-4e99-9345-d8443062558a","Type":"ContainerStarted","Data":"0adaabf333e46b26a17b433635ae973855cf1fc50284bff8206ee353b802ef86"} Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.834776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-pdx6x" event={"ID":"34c2061b-4028-4e99-9345-d8443062558a","Type":"ContainerStarted","Data":"27b878def9702a94abb9f4d61230fcd102a6fb31e2dbc96db9800446fd1c32e5"} Jan 21 15:13:07 crc kubenswrapper[4707]: I0121 15:13:07.844974 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-pdx6x" podStartSLOduration=1.8449611639999999 podStartE2EDuration="1.844961164s" podCreationTimestamp="2026-01-21 15:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:13:07.843689883 +0000 UTC m=+685.025206106" watchObservedRunningTime="2026-01-21 15:13:07.844961164 +0000 UTC m=+685.026477386" Jan 21 15:13:11 crc kubenswrapper[4707]: I0121 15:13:11.369206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-p7vr2" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.357646 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pww79"] Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.358428 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.361384 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.361578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-plx8k" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.361684 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.379273 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pww79"] Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.511238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tjl5\" (UniqueName: \"kubernetes.io/projected/7eeb96f2-a82f-4fad-8511-94e31d4d23ee-kube-api-access-7tjl5\") pod \"openstack-operator-index-pww79\" (UID: \"7eeb96f2-a82f-4fad-8511-94e31d4d23ee\") " pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.612412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tjl5\" (UniqueName: \"kubernetes.io/projected/7eeb96f2-a82f-4fad-8511-94e31d4d23ee-kube-api-access-7tjl5\") pod \"openstack-operator-index-pww79\" (UID: \"7eeb96f2-a82f-4fad-8511-94e31d4d23ee\") " pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.627423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tjl5\" (UniqueName: \"kubernetes.io/projected/7eeb96f2-a82f-4fad-8511-94e31d4d23ee-kube-api-access-7tjl5\") pod \"openstack-operator-index-pww79\" (UID: \"7eeb96f2-a82f-4fad-8511-94e31d4d23ee\") " pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:13 crc kubenswrapper[4707]: I0121 15:13:13.671903 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:14 crc kubenswrapper[4707]: I0121 15:13:14.014757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pww79"] Jan 21 15:13:14 crc kubenswrapper[4707]: W0121 15:13:14.018872 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eeb96f2_a82f_4fad_8511_94e31d4d23ee.slice/crio-f22bd7bfaf35623982a850c85e5fd1116b335479206bbca6a2d627c51a425fed WatchSource:0}: Error finding container f22bd7bfaf35623982a850c85e5fd1116b335479206bbca6a2d627c51a425fed: Status 404 returned error can't find the container with id f22bd7bfaf35623982a850c85e5fd1116b335479206bbca6a2d627c51a425fed Jan 21 15:13:14 crc kubenswrapper[4707]: I0121 15:13:14.865671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pww79" event={"ID":"7eeb96f2-a82f-4fad-8511-94e31d4d23ee","Type":"ContainerStarted","Data":"f22bd7bfaf35623982a850c85e5fd1116b335479206bbca6a2d627c51a425fed"} Jan 21 15:13:15 crc kubenswrapper[4707]: I0121 15:13:15.877375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pww79" event={"ID":"7eeb96f2-a82f-4fad-8511-94e31d4d23ee","Type":"ContainerStarted","Data":"e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb"} Jan 21 15:13:15 crc kubenswrapper[4707]: I0121 15:13:15.888681 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pww79" podStartSLOduration=1.778092915 podStartE2EDuration="2.88866837s" podCreationTimestamp="2026-01-21 15:13:13 +0000 UTC" firstStartedPulling="2026-01-21 15:13:14.020705462 +0000 UTC m=+691.202221684" lastFinishedPulling="2026-01-21 15:13:15.131280918 +0000 UTC m=+692.312797139" observedRunningTime="2026-01-21 15:13:15.887385006 +0000 UTC m=+693.068901278" watchObservedRunningTime="2026-01-21 15:13:15.88866837 +0000 UTC m=+693.070184592" Jan 21 15:13:16 crc kubenswrapper[4707]: I0121 15:13:16.742359 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pww79"] Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.364238 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9hglm"] Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.366445 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.375184 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9hglm"] Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.460731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8kcj\" (UniqueName: \"kubernetes.io/projected/7193a0d6-108f-4c14-a4fe-678aa49bc9dd-kube-api-access-c8kcj\") pod \"openstack-operator-index-9hglm\" (UID: \"7193a0d6-108f-4c14-a4fe-678aa49bc9dd\") " pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.561765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8kcj\" (UniqueName: \"kubernetes.io/projected/7193a0d6-108f-4c14-a4fe-678aa49bc9dd-kube-api-access-c8kcj\") pod \"openstack-operator-index-9hglm\" (UID: \"7193a0d6-108f-4c14-a4fe-678aa49bc9dd\") " pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.576481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8kcj\" (UniqueName: \"kubernetes.io/projected/7193a0d6-108f-4c14-a4fe-678aa49bc9dd-kube-api-access-c8kcj\") pod \"openstack-operator-index-9hglm\" (UID: \"7193a0d6-108f-4c14-a4fe-678aa49bc9dd\") " pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.682638 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:17 crc kubenswrapper[4707]: I0121 15:13:17.886722 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-pww79" podUID="7eeb96f2-a82f-4fad-8511-94e31d4d23ee" containerName="registry-server" containerID="cri-o://e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb" gracePeriod=2 Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.017233 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9hglm"] Jan 21 15:13:18 crc kubenswrapper[4707]: W0121 15:13:18.023543 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7193a0d6_108f_4c14_a4fe_678aa49bc9dd.slice/crio-9ef0ead2e398d4bca1ad7f3bd12ead79aa420a4431f96466f258378ecec7ff34 WatchSource:0}: Error finding container 9ef0ead2e398d4bca1ad7f3bd12ead79aa420a4431f96466f258378ecec7ff34: Status 404 returned error can't find the container with id 9ef0ead2e398d4bca1ad7f3bd12ead79aa420a4431f96466f258378ecec7ff34 Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.151735 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.269255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tjl5\" (UniqueName: \"kubernetes.io/projected/7eeb96f2-a82f-4fad-8511-94e31d4d23ee-kube-api-access-7tjl5\") pod \"7eeb96f2-a82f-4fad-8511-94e31d4d23ee\" (UID: \"7eeb96f2-a82f-4fad-8511-94e31d4d23ee\") " Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.273367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eeb96f2-a82f-4fad-8511-94e31d4d23ee-kube-api-access-7tjl5" (OuterVolumeSpecName: "kube-api-access-7tjl5") pod "7eeb96f2-a82f-4fad-8511-94e31d4d23ee" (UID: "7eeb96f2-a82f-4fad-8511-94e31d4d23ee"). InnerVolumeSpecName "kube-api-access-7tjl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.370389 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tjl5\" (UniqueName: \"kubernetes.io/projected/7eeb96f2-a82f-4fad-8511-94e31d4d23ee-kube-api-access-7tjl5\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.892413 4707 generic.go:334] "Generic (PLEG): container finished" podID="7eeb96f2-a82f-4fad-8511-94e31d4d23ee" containerID="e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb" exitCode=0 Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.892452 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pww79" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.892467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pww79" event={"ID":"7eeb96f2-a82f-4fad-8511-94e31d4d23ee","Type":"ContainerDied","Data":"e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb"} Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.892768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pww79" event={"ID":"7eeb96f2-a82f-4fad-8511-94e31d4d23ee","Type":"ContainerDied","Data":"f22bd7bfaf35623982a850c85e5fd1116b335479206bbca6a2d627c51a425fed"} Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.892798 4707 scope.go:117] "RemoveContainer" containerID="e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.894534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9hglm" event={"ID":"7193a0d6-108f-4c14-a4fe-678aa49bc9dd","Type":"ContainerStarted","Data":"ce079610f88f7d056187fad72a6f6f1fb9e577c5ace7940757d24c6cc403e66c"} Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.894556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9hglm" event={"ID":"7193a0d6-108f-4c14-a4fe-678aa49bc9dd","Type":"ContainerStarted","Data":"9ef0ead2e398d4bca1ad7f3bd12ead79aa420a4431f96466f258378ecec7ff34"} Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.906367 4707 scope.go:117] "RemoveContainer" containerID="e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb" Jan 21 15:13:18 crc kubenswrapper[4707]: E0121 15:13:18.906769 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb\": container with ID starting with e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb not found: ID does not exist" containerID="e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.906898 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb"} err="failed to get container status \"e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb\": rpc error: code = NotFound desc = could not find container \"e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb\": container with ID starting with e71b97fb96fcc025ab332bd10eca64cd465d983364858b9d7639e2705674f3bb not found: ID does not exist" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.907668 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9hglm" podStartSLOduration=1.335529429 podStartE2EDuration="1.907659799s" podCreationTimestamp="2026-01-21 15:13:17 +0000 UTC" firstStartedPulling="2026-01-21 15:13:18.027113048 +0000 UTC m=+695.208629270" lastFinishedPulling="2026-01-21 15:13:18.599243418 +0000 UTC m=+695.780759640" observedRunningTime="2026-01-21 15:13:18.90417794 +0000 UTC m=+696.085694161" watchObservedRunningTime="2026-01-21 15:13:18.907659799 +0000 UTC m=+696.089176021" Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.912863 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pww79"] Jan 21 15:13:18 crc kubenswrapper[4707]: I0121 15:13:18.915605 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-pww79"] Jan 21 15:13:19 crc kubenswrapper[4707]: I0121 15:13:19.188588 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eeb96f2-a82f-4fad-8511-94e31d4d23ee" path="/var/lib/kubelet/pods/7eeb96f2-a82f-4fad-8511-94e31d4d23ee/volumes" Jan 21 15:13:27 crc kubenswrapper[4707]: I0121 15:13:27.683845 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:27 crc kubenswrapper[4707]: I0121 15:13:27.684214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:27 crc kubenswrapper[4707]: I0121 15:13:27.705849 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:27 crc kubenswrapper[4707]: I0121 15:13:27.952168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.728611 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8"] Jan 21 15:13:34 crc kubenswrapper[4707]: E0121 15:13:34.729022 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eeb96f2-a82f-4fad-8511-94e31d4d23ee" containerName="registry-server" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.729034 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeb96f2-a82f-4fad-8511-94e31d4d23ee" containerName="registry-server" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.729129 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7eeb96f2-a82f-4fad-8511-94e31d4d23ee" containerName="registry-server" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.729989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.731730 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.735329 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8"] Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.851777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.852053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.852182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lztzh\" (UniqueName: \"kubernetes.io/projected/2aa60207-9807-459c-9a86-65c1bd3d0064-kube-api-access-lztzh\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.953114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.953172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lztzh\" (UniqueName: \"kubernetes.io/projected/2aa60207-9807-459c-9a86-65c1bd3d0064-kube-api-access-lztzh\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.953303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " 
pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.953494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.953551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:34 crc kubenswrapper[4707]: I0121 15:13:34.968935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lztzh\" (UniqueName: \"kubernetes.io/projected/2aa60207-9807-459c-9a86-65c1bd3d0064-kube-api-access-lztzh\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:35 crc kubenswrapper[4707]: I0121 15:13:35.044859 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:35 crc kubenswrapper[4707]: I0121 15:13:35.387224 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8"] Jan 21 15:13:35 crc kubenswrapper[4707]: W0121 15:13:35.392075 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa60207_9807_459c_9a86_65c1bd3d0064.slice/crio-ce253b40a65a4ec177b77e7d8b301dd3cb0b83d34c5407a67c1f4a0852d57417 WatchSource:0}: Error finding container ce253b40a65a4ec177b77e7d8b301dd3cb0b83d34c5407a67c1f4a0852d57417: Status 404 returned error can't find the container with id ce253b40a65a4ec177b77e7d8b301dd3cb0b83d34c5407a67c1f4a0852d57417 Jan 21 15:13:35 crc kubenswrapper[4707]: I0121 15:13:35.972314 4707 generic.go:334] "Generic (PLEG): container finished" podID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerID="2c8f1e5580a5cce674b79cd942f3bade7b066cfdc65965c986e6ff0d22714bcf" exitCode=0 Jan 21 15:13:35 crc kubenswrapper[4707]: I0121 15:13:35.972498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" event={"ID":"2aa60207-9807-459c-9a86-65c1bd3d0064","Type":"ContainerDied","Data":"2c8f1e5580a5cce674b79cd942f3bade7b066cfdc65965c986e6ff0d22714bcf"} Jan 21 15:13:35 crc kubenswrapper[4707]: I0121 15:13:35.973090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" event={"ID":"2aa60207-9807-459c-9a86-65c1bd3d0064","Type":"ContainerStarted","Data":"ce253b40a65a4ec177b77e7d8b301dd3cb0b83d34c5407a67c1f4a0852d57417"} Jan 21 15:13:36 crc kubenswrapper[4707]: I0121 15:13:36.978923 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerID="504a643a38814d378695bb674a67d2ee66a91f3b1acb086daca0465a078de5b2" exitCode=0 Jan 21 15:13:36 crc kubenswrapper[4707]: I0121 15:13:36.979576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" event={"ID":"2aa60207-9807-459c-9a86-65c1bd3d0064","Type":"ContainerDied","Data":"504a643a38814d378695bb674a67d2ee66a91f3b1acb086daca0465a078de5b2"} Jan 21 15:13:37 crc kubenswrapper[4707]: I0121 15:13:37.985213 4707 generic.go:334] "Generic (PLEG): container finished" podID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerID="7057f26105b6453321d8c7da6188b80f8fd7183951b31d4c814de9519921057c" exitCode=0 Jan 21 15:13:37 crc kubenswrapper[4707]: I0121 15:13:37.985304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" event={"ID":"2aa60207-9807-459c-9a86-65c1bd3d0064","Type":"ContainerDied","Data":"7057f26105b6453321d8c7da6188b80f8fd7183951b31d4c814de9519921057c"} Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.175942 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.302318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lztzh\" (UniqueName: \"kubernetes.io/projected/2aa60207-9807-459c-9a86-65c1bd3d0064-kube-api-access-lztzh\") pod \"2aa60207-9807-459c-9a86-65c1bd3d0064\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.302360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-bundle\") pod \"2aa60207-9807-459c-9a86-65c1bd3d0064\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.302407 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-util\") pod \"2aa60207-9807-459c-9a86-65c1bd3d0064\" (UID: \"2aa60207-9807-459c-9a86-65c1bd3d0064\") " Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.303064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-bundle" (OuterVolumeSpecName: "bundle") pod "2aa60207-9807-459c-9a86-65c1bd3d0064" (UID: "2aa60207-9807-459c-9a86-65c1bd3d0064"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.306576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa60207-9807-459c-9a86-65c1bd3d0064-kube-api-access-lztzh" (OuterVolumeSpecName: "kube-api-access-lztzh") pod "2aa60207-9807-459c-9a86-65c1bd3d0064" (UID: "2aa60207-9807-459c-9a86-65c1bd3d0064"). InnerVolumeSpecName "kube-api-access-lztzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.313593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-util" (OuterVolumeSpecName: "util") pod "2aa60207-9807-459c-9a86-65c1bd3d0064" (UID: "2aa60207-9807-459c-9a86-65c1bd3d0064"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.403720 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lztzh\" (UniqueName: \"kubernetes.io/projected/2aa60207-9807-459c-9a86-65c1bd3d0064-kube-api-access-lztzh\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.403755 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.403764 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa60207-9807-459c-9a86-65c1bd3d0064-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.995871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" event={"ID":"2aa60207-9807-459c-9a86-65c1bd3d0064","Type":"ContainerDied","Data":"ce253b40a65a4ec177b77e7d8b301dd3cb0b83d34c5407a67c1f4a0852d57417"} Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.995909 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce253b40a65a4ec177b77e7d8b301dd3cb0b83d34c5407a67c1f4a0852d57417" Jan 21 15:13:39 crc kubenswrapper[4707]: I0121 15:13:39.995930 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.879273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj"] Jan 21 15:13:46 crc kubenswrapper[4707]: E0121 15:13:46.879629 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="pull" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.879640 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="pull" Jan 21 15:13:46 crc kubenswrapper[4707]: E0121 15:13:46.879651 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="extract" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.879656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="extract" Jan 21 15:13:46 crc kubenswrapper[4707]: E0121 15:13:46.879668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="util" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.879673 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="util" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.879755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" containerName="extract" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.881605 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.887972 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dx75l" Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.901907 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj"] Jan 21 15:13:46 crc kubenswrapper[4707]: I0121 15:13:46.985407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74ccm\" (UniqueName: \"kubernetes.io/projected/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf-kube-api-access-74ccm\") pod \"openstack-operator-controller-init-6d4d7d8545-bjcrj\" (UID: \"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:13:47 crc kubenswrapper[4707]: I0121 15:13:47.087105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74ccm\" (UniqueName: \"kubernetes.io/projected/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf-kube-api-access-74ccm\") pod \"openstack-operator-controller-init-6d4d7d8545-bjcrj\" (UID: \"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:13:47 crc kubenswrapper[4707]: I0121 15:13:47.103460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74ccm\" (UniqueName: \"kubernetes.io/projected/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf-kube-api-access-74ccm\") pod \"openstack-operator-controller-init-6d4d7d8545-bjcrj\" (UID: 
\"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:13:47 crc kubenswrapper[4707]: I0121 15:13:47.203400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:13:47 crc kubenswrapper[4707]: I0121 15:13:47.426449 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj"] Jan 21 15:13:48 crc kubenswrapper[4707]: I0121 15:13:48.030327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" event={"ID":"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf","Type":"ContainerStarted","Data":"469647dd32efc85773776a088c1ed3ef69c6f5394c9b24b1667952a2cb9a6ccc"} Jan 21 15:13:51 crc kubenswrapper[4707]: I0121 15:13:51.043831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" event={"ID":"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf","Type":"ContainerStarted","Data":"55d73a60944fe9f7d7ad79e64d1b64d20cb2043c83d889359f590d510c77308d"} Jan 21 15:13:51 crc kubenswrapper[4707]: I0121 15:13:51.044214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:13:51 crc kubenswrapper[4707]: I0121 15:13:51.066099 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" podStartSLOduration=1.7464132220000002 podStartE2EDuration="5.066086182s" podCreationTimestamp="2026-01-21 15:13:46 +0000 UTC" firstStartedPulling="2026-01-21 15:13:47.438858911 +0000 UTC m=+724.620375133" lastFinishedPulling="2026-01-21 15:13:50.758531872 +0000 UTC m=+727.940048093" observedRunningTime="2026-01-21 15:13:51.062243844 +0000 UTC m=+728.243760066" watchObservedRunningTime="2026-01-21 15:13:51.066086182 +0000 UTC m=+728.247602403" Jan 21 15:13:57 crc kubenswrapper[4707]: I0121 15:13:57.205333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.124480 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.125476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.126996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-m48rl" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.128341 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.129005 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.134997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.135925 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mr88l" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.139174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.143666 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.144261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.146118 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bl4b2" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.158313 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.160073 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.163094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h784z" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.180634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.205205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfmcx\" (UniqueName: \"kubernetes.io/projected/1ff533c3-a29f-498b-a575-1492c0e07aa9-kube-api-access-nfmcx\") pod \"cinder-operator-controller-manager-9b68f5989-q6cj4\" (UID: \"1ff533c3-a29f-498b-a575-1492c0e07aa9\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.205408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rntz6\" (UniqueName: \"kubernetes.io/projected/4adccf62-a1c3-430c-9137-a515f03b23e4-kube-api-access-rntz6\") pod \"barbican-operator-controller-manager-7ddb5c749-nhgpx\" (UID: \"4adccf62-a1c3-430c-9137-a515f03b23e4\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.205509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvggw\" (UniqueName: \"kubernetes.io/projected/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd-kube-api-access-hvggw\") pod \"glance-operator-controller-manager-c6994669c-xkgbq\" (UID: \"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd\") " 
pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.205624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flc4z\" (UniqueName: \"kubernetes.io/projected/6676ddd6-9d32-480d-9306-d3dc0676d2cd-kube-api-access-flc4z\") pod \"designate-operator-controller-manager-9f958b845-xpbv7\" (UID: \"6676ddd6-9d32-480d-9306-d3dc0676d2cd\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.214270 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.216590 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.231332 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.233892 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.234261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jt6h4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.234640 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.239187 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wdk75" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.273324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.280784 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.284667 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.285367 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.288094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vdt9b" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.288092 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.288961 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.290555 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.291051 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kdccs" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.302523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.303403 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.309180 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.309300 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-8fjl7" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6bf\" (UniqueName: \"kubernetes.io/projected/96a9240d-2c77-4718-8f9d-60633df4eee4-kube-api-access-7v6bf\") pod \"horizon-operator-controller-manager-77d5c5b54f-brpxr\" (UID: \"96a9240d-2c77-4718-8f9d-60633df4eee4\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flc4z\" (UniqueName: \"kubernetes.io/projected/6676ddd6-9d32-480d-9306-d3dc0676d2cd-kube-api-access-flc4z\") pod \"designate-operator-controller-manager-9f958b845-xpbv7\" (UID: \"6676ddd6-9d32-480d-9306-d3dc0676d2cd\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p96xf\" (UniqueName: \"kubernetes.io/projected/06c244e2-fb38-4ac0-8ed0-2f34d01c5228-kube-api-access-p96xf\") pod \"keystone-operator-controller-manager-767fdc4f47-hcgf8\" (UID: \"06c244e2-fb38-4ac0-8ed0-2f34d01c5228\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2kfs\" (UniqueName: \"kubernetes.io/projected/8951bb5e-fd60-4761-9c83-bc523b041b83-kube-api-access-r2kfs\") pod \"ironic-operator-controller-manager-78757b4889-jlngq\" (UID: \"8951bb5e-fd60-4761-9c83-bc523b041b83\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9tth\" (UniqueName: \"kubernetes.io/projected/e6ab0bb3-12a8-4314-83c7-280a512198f6-kube-api-access-d9tth\") pod 
\"heat-operator-controller-manager-594c8c9d5d-jr4nf\" (UID: \"e6ab0bb3-12a8-4314-83c7-280a512198f6\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfmcx\" (UniqueName: \"kubernetes.io/projected/1ff533c3-a29f-498b-a575-1492c0e07aa9-kube-api-access-nfmcx\") pod \"cinder-operator-controller-manager-9b68f5989-q6cj4\" (UID: \"1ff533c3-a29f-498b-a575-1492c0e07aa9\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4bk\" (UniqueName: \"kubernetes.io/projected/375b0905-f382-4e75-8c5c-4ece8f6ddde3-kube-api-access-qx4bk\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rntz6\" (UniqueName: \"kubernetes.io/projected/4adccf62-a1c3-430c-9137-a515f03b23e4-kube-api-access-rntz6\") pod \"barbican-operator-controller-manager-7ddb5c749-nhgpx\" (UID: \"4adccf62-a1c3-430c-9137-a515f03b23e4\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.310406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvggw\" (UniqueName: \"kubernetes.io/projected/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd-kube-api-access-hvggw\") pod \"glance-operator-controller-manager-c6994669c-xkgbq\" (UID: \"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.323255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.325394 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.326274 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.329116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-j75q2" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.329595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flc4z\" (UniqueName: \"kubernetes.io/projected/6676ddd6-9d32-480d-9306-d3dc0676d2cd-kube-api-access-flc4z\") pod \"designate-operator-controller-manager-9f958b845-xpbv7\" (UID: \"6676ddd6-9d32-480d-9306-d3dc0676d2cd\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.329645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvggw\" (UniqueName: \"kubernetes.io/projected/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd-kube-api-access-hvggw\") pod \"glance-operator-controller-manager-c6994669c-xkgbq\" (UID: \"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.330278 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.331002 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.332836 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-74b6x" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.332890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rntz6\" (UniqueName: \"kubernetes.io/projected/4adccf62-a1c3-430c-9137-a515f03b23e4-kube-api-access-rntz6\") pod \"barbican-operator-controller-manager-7ddb5c749-nhgpx\" (UID: \"4adccf62-a1c3-430c-9137-a515f03b23e4\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.335146 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.341900 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.347988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.353102 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.353755 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.355578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfmcx\" (UniqueName: \"kubernetes.io/projected/1ff533c3-a29f-498b-a575-1492c0e07aa9-kube-api-access-nfmcx\") pod \"cinder-operator-controller-manager-9b68f5989-q6cj4\" (UID: \"1ff533c3-a29f-498b-a575-1492c0e07aa9\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.356003 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nxqvz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.360333 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.371905 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.372570 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.373794 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-sdmgb" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.376595 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.377155 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.378446 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-n46zn" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.380779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.391998 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.395300 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.395963 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.397254 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vrqjn" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.401743 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.402855 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.404163 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tkt2q" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.404423 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.410645 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.411157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9tth\" (UniqueName: \"kubernetes.io/projected/e6ab0bb3-12a8-4314-83c7-280a512198f6-kube-api-access-d9tth\") pod \"heat-operator-controller-manager-594c8c9d5d-jr4nf\" (UID: \"e6ab0bb3-12a8-4314-83c7-280a512198f6\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.411880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlcth\" (UniqueName: \"kubernetes.io/projected/9356dd92-4c2e-476d-9a60-946f4f148564-kube-api-access-xlcth\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.411984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb588\" (UniqueName: \"kubernetes.io/projected/a674fe35-9ddd-4352-b770-1061b28fce34-kube-api-access-lb588\") pod \"neutron-operator-controller-manager-cb4666565-szzcm\" (UID: \"a674fe35-9ddd-4352-b770-1061b28fce34\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4bk\" (UniqueName: \"kubernetes.io/projected/375b0905-f382-4e75-8c5c-4ece8f6ddde3-kube-api-access-qx4bk\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6bf\" (UniqueName: \"kubernetes.io/projected/96a9240d-2c77-4718-8f9d-60633df4eee4-kube-api-access-7v6bf\") pod \"horizon-operator-controller-manager-77d5c5b54f-brpxr\" (UID: \"96a9240d-2c77-4718-8f9d-60633df4eee4\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 
15:14:13.412329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcmrg\" (UniqueName: \"kubernetes.io/projected/40734c06-7e00-4184-9487-35be214c9556-kube-api-access-mcmrg\") pod \"ovn-operator-controller-manager-55db956ddc-5tp6l\" (UID: \"40734c06-7e00-4184-9487-35be214c9556\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45x4v\" (UniqueName: \"kubernetes.io/projected/4d095e2a-7788-4a7e-af9b-eaed8407ef5a-kube-api-access-45x4v\") pod \"mariadb-operator-controller-manager-c87fff755-jf4st\" (UID: \"4d095e2a-7788-4a7e-af9b-eaed8407ef5a\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747pw\" (UniqueName: \"kubernetes.io/projected/62e25886-609f-4ef6-848c-e116c738df6a-kube-api-access-747pw\") pod \"octavia-operator-controller-manager-7fc9b76cf6-d9dr4\" (UID: \"62e25886-609f-4ef6-848c-e116c738df6a\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p96xf\" (UniqueName: \"kubernetes.io/projected/06c244e2-fb38-4ac0-8ed0-2f34d01c5228-kube-api-access-p96xf\") pod \"keystone-operator-controller-manager-767fdc4f47-hcgf8\" (UID: \"06c244e2-fb38-4ac0-8ed0-2f34d01c5228\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfrsj\" (UniqueName: \"kubernetes.io/projected/4f34a822-4c51-4dfc-9812-1287c9b3281d-kube-api-access-nfrsj\") pod \"nova-operator-controller-manager-65849867d6-wgmrm\" (UID: \"4f34a822-4c51-4dfc-9812-1287c9b3281d\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2kfs\" (UniqueName: \"kubernetes.io/projected/8951bb5e-fd60-4761-9c83-bc523b041b83-kube-api-access-r2kfs\") pod \"ironic-operator-controller-manager-78757b4889-jlngq\" (UID: \"8951bb5e-fd60-4761-9c83-bc523b041b83\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.412972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cx8v\" (UniqueName: \"kubernetes.io/projected/4c2dd112-6dc9-4b29-95f2-25b3c44452a2-kube-api-access-6cx8v\") pod 
\"manila-operator-controller-manager-864f6b75bf-kffk5\" (UID: \"4c2dd112-6dc9-4b29-95f2-25b3c44452a2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:13 crc kubenswrapper[4707]: E0121 15:14:13.412440 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:13 crc kubenswrapper[4707]: E0121 15:14:13.413148 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert podName:375b0905-f382-4e75-8c5c-4ece8f6ddde3 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:13.913133026 +0000 UTC m=+751.094649249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert") pod "infra-operator-controller-manager-77c48c7859-h6pkz" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.414198 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.440248 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.441874 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.443937 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.444505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9tth\" (UniqueName: \"kubernetes.io/projected/e6ab0bb3-12a8-4314-83c7-280a512198f6-kube-api-access-d9tth\") pod \"heat-operator-controller-manager-594c8c9d5d-jr4nf\" (UID: \"e6ab0bb3-12a8-4314-83c7-280a512198f6\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.446221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-g6p2g" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.447337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2kfs\" (UniqueName: \"kubernetes.io/projected/8951bb5e-fd60-4761-9c83-bc523b041b83-kube-api-access-r2kfs\") pod \"ironic-operator-controller-manager-78757b4889-jlngq\" (UID: \"8951bb5e-fd60-4761-9c83-bc523b041b83\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.450852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p96xf\" (UniqueName: \"kubernetes.io/projected/06c244e2-fb38-4ac0-8ed0-2f34d01c5228-kube-api-access-p96xf\") pod \"keystone-operator-controller-manager-767fdc4f47-hcgf8\" (UID: \"06c244e2-fb38-4ac0-8ed0-2f34d01c5228\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.453858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.459398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4bk\" (UniqueName: \"kubernetes.io/projected/375b0905-f382-4e75-8c5c-4ece8f6ddde3-kube-api-access-qx4bk\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.460626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6bf\" (UniqueName: \"kubernetes.io/projected/96a9240d-2c77-4718-8f9d-60633df4eee4-kube-api-access-7v6bf\") pod \"horizon-operator-controller-manager-77d5c5b54f-brpxr\" (UID: \"96a9240d-2c77-4718-8f9d-60633df4eee4\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.464159 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.468981 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.482868 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.487861 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.488752 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.492591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-dcgdh" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.501924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.516855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cx8v\" (UniqueName: \"kubernetes.io/projected/4c2dd112-6dc9-4b29-95f2-25b3c44452a2-kube-api-access-6cx8v\") pod \"manila-operator-controller-manager-864f6b75bf-kffk5\" (UID: \"4c2dd112-6dc9-4b29-95f2-25b3c44452a2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.516893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlcth\" (UniqueName: \"kubernetes.io/projected/9356dd92-4c2e-476d-9a60-946f4f148564-kube-api-access-xlcth\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.516916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zn9\" (UniqueName: \"kubernetes.io/projected/9f333daa-a37b-4c2e-bd13-e66416449d2c-kube-api-access-t7zn9\") pod \"placement-operator-controller-manager-686df47fcb-njwgg\" (UID: \"9f333daa-a37b-4c2e-bd13-e66416449d2c\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.516947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb588\" (UniqueName: \"kubernetes.io/projected/a674fe35-9ddd-4352-b770-1061b28fce34-kube-api-access-lb588\") pod \"neutron-operator-controller-manager-cb4666565-szzcm\" (UID: \"a674fe35-9ddd-4352-b770-1061b28fce34\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.517037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcmrg\" (UniqueName: \"kubernetes.io/projected/40734c06-7e00-4184-9487-35be214c9556-kube-api-access-mcmrg\") pod \"ovn-operator-controller-manager-55db956ddc-5tp6l\" (UID: \"40734c06-7e00-4184-9487-35be214c9556\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.517058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45x4v\" (UniqueName: \"kubernetes.io/projected/4d095e2a-7788-4a7e-af9b-eaed8407ef5a-kube-api-access-45x4v\") pod 
\"mariadb-operator-controller-manager-c87fff755-jf4st\" (UID: \"4d095e2a-7788-4a7e-af9b-eaed8407ef5a\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.517074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.517102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58ttw\" (UniqueName: \"kubernetes.io/projected/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4-kube-api-access-58ttw\") pod \"swift-operator-controller-manager-85dd56d4cc-79xkw\" (UID: \"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.517127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-747pw\" (UniqueName: \"kubernetes.io/projected/62e25886-609f-4ef6-848c-e116c738df6a-kube-api-access-747pw\") pod \"octavia-operator-controller-manager-7fc9b76cf6-d9dr4\" (UID: \"62e25886-609f-4ef6-848c-e116c738df6a\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.517145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfrsj\" (UniqueName: \"kubernetes.io/projected/4f34a822-4c51-4dfc-9812-1287c9b3281d-kube-api-access-nfrsj\") pod \"nova-operator-controller-manager-65849867d6-wgmrm\" (UID: \"4f34a822-4c51-4dfc-9812-1287c9b3281d\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:13 crc kubenswrapper[4707]: E0121 15:14:13.518231 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:13 crc kubenswrapper[4707]: E0121 15:14:13.518265 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert podName:9356dd92-4c2e-476d-9a60-946f4f148564 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:14.018254369 +0000 UTC m=+751.199770591 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dfskdx" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.542473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cx8v\" (UniqueName: \"kubernetes.io/projected/4c2dd112-6dc9-4b29-95f2-25b3c44452a2-kube-api-access-6cx8v\") pod \"manila-operator-controller-manager-864f6b75bf-kffk5\" (UID: \"4c2dd112-6dc9-4b29-95f2-25b3c44452a2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.545058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-747pw\" (UniqueName: \"kubernetes.io/projected/62e25886-609f-4ef6-848c-e116c738df6a-kube-api-access-747pw\") pod \"octavia-operator-controller-manager-7fc9b76cf6-d9dr4\" (UID: \"62e25886-609f-4ef6-848c-e116c738df6a\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.545384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlcth\" (UniqueName: \"kubernetes.io/projected/9356dd92-4c2e-476d-9a60-946f4f148564-kube-api-access-xlcth\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.546007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcmrg\" (UniqueName: \"kubernetes.io/projected/40734c06-7e00-4184-9487-35be214c9556-kube-api-access-mcmrg\") pod \"ovn-operator-controller-manager-55db956ddc-5tp6l\" (UID: \"40734c06-7e00-4184-9487-35be214c9556\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.548272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfrsj\" (UniqueName: \"kubernetes.io/projected/4f34a822-4c51-4dfc-9812-1287c9b3281d-kube-api-access-nfrsj\") pod \"nova-operator-controller-manager-65849867d6-wgmrm\" (UID: \"4f34a822-4c51-4dfc-9812-1287c9b3281d\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.549184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb588\" (UniqueName: \"kubernetes.io/projected/a674fe35-9ddd-4352-b770-1061b28fce34-kube-api-access-lb588\") pod \"neutron-operator-controller-manager-cb4666565-szzcm\" (UID: \"a674fe35-9ddd-4352-b770-1061b28fce34\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.551212 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.553661 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.554787 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.556660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45x4v\" (UniqueName: \"kubernetes.io/projected/4d095e2a-7788-4a7e-af9b-eaed8407ef5a-kube-api-access-45x4v\") pod \"mariadb-operator-controller-manager-c87fff755-jf4st\" (UID: \"4d095e2a-7788-4a7e-af9b-eaed8407ef5a\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.557518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.580217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-kz5nc" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.580455 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.598962 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.624068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58ttw\" (UniqueName: \"kubernetes.io/projected/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4-kube-api-access-58ttw\") pod \"swift-operator-controller-manager-85dd56d4cc-79xkw\" (UID: \"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.624139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xdt4\" (UniqueName: \"kubernetes.io/projected/0c59ef09-0beb-452c-8563-83bda9602961-kube-api-access-9xdt4\") pod \"telemetry-operator-controller-manager-5f8f495fcf-znz8t\" (UID: \"0c59ef09-0beb-452c-8563-83bda9602961\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.624173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zn9\" (UniqueName: \"kubernetes.io/projected/9f333daa-a37b-4c2e-bd13-e66416449d2c-kube-api-access-t7zn9\") pod \"placement-operator-controller-manager-686df47fcb-njwgg\" (UID: \"9f333daa-a37b-4c2e-bd13-e66416449d2c\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.633117 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.633882 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.638392 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zlgpc" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.639993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zn9\" (UniqueName: \"kubernetes.io/projected/9f333daa-a37b-4c2e-bd13-e66416449d2c-kube-api-access-t7zn9\") pod \"placement-operator-controller-manager-686df47fcb-njwgg\" (UID: \"9f333daa-a37b-4c2e-bd13-e66416449d2c\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.644155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58ttw\" (UniqueName: \"kubernetes.io/projected/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4-kube-api-access-58ttw\") pod \"swift-operator-controller-manager-85dd56d4cc-79xkw\" (UID: \"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.647787 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.682145 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.685995 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.698336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.703891 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.712916 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.726981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xdt4\" (UniqueName: \"kubernetes.io/projected/0c59ef09-0beb-452c-8563-83bda9602961-kube-api-access-9xdt4\") pod \"telemetry-operator-controller-manager-5f8f495fcf-znz8t\" (UID: \"0c59ef09-0beb-452c-8563-83bda9602961\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.727045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psl8g\" (UniqueName: \"kubernetes.io/projected/7a6b5573-a127-4713-919d-d79e38f60b87-kube-api-access-psl8g\") pod \"test-operator-controller-manager-7cd8bc9dbb-bvgfl\" (UID: \"7a6b5573-a127-4713-919d-d79e38f60b87\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.737257 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.738206 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-99p87"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.739032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.742182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-99p87"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.742340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2kqv9" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.742423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xdt4\" (UniqueName: \"kubernetes.io/projected/0c59ef09-0beb-452c-8563-83bda9602961-kube-api-access-9xdt4\") pod \"telemetry-operator-controller-manager-5f8f495fcf-znz8t\" (UID: \"0c59ef09-0beb-452c-8563-83bda9602961\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.786644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.808663 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.828917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psl8g\" (UniqueName: \"kubernetes.io/projected/7a6b5573-a127-4713-919d-d79e38f60b87-kube-api-access-psl8g\") pod \"test-operator-controller-manager-7cd8bc9dbb-bvgfl\" (UID: \"7a6b5573-a127-4713-919d-d79e38f60b87\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.828978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg6cv\" (UniqueName: \"kubernetes.io/projected/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6-kube-api-access-hg6cv\") pod \"watcher-operator-controller-manager-64cd966744-99p87\" (UID: \"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.844556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.846515 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.847215 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.856558 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.856714 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-47bc4" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.856859 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.876772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psl8g\" (UniqueName: \"kubernetes.io/projected/7a6b5573-a127-4713-919d-d79e38f60b87-kube-api-access-psl8g\") pod \"test-operator-controller-manager-7cd8bc9dbb-bvgfl\" (UID: \"7a6b5573-a127-4713-919d-d79e38f60b87\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.879322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.904894 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.921311 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.930867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hl4\" (UniqueName: \"kubernetes.io/projected/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-kube-api-access-l5hl4\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.930911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg6cv\" (UniqueName: \"kubernetes.io/projected/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6-kube-api-access-hg6cv\") pod \"watcher-operator-controller-manager-64cd966744-99p87\" (UID: \"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.930951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.930988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " 
pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.931045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:13 crc kubenswrapper[4707]: E0121 15:14:13.931160 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:13 crc kubenswrapper[4707]: E0121 15:14:13.931202 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert podName:375b0905-f382-4e75-8c5c-4ece8f6ddde3 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:14.931186987 +0000 UTC m=+752.112703208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert") pod "infra-operator-controller-manager-77c48c7859-h6pkz" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.940259 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.944665 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.946197 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zpkks" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.951100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc"] Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.966663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6cv\" (UniqueName: \"kubernetes.io/projected/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6-kube-api-access-hg6cv\") pod \"watcher-operator-controller-manager-64cd966744-99p87\" (UID: \"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:13 crc kubenswrapper[4707]: I0121 15:14:13.985099 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.032624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwfb\" (UniqueName: \"kubernetes.io/projected/d8707730-a731-419d-ad12-caf5fddafbd5-kube-api-access-4dwfb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lpptc\" (UID: \"d8707730-a731-419d-ad12-caf5fddafbd5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.032663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hl4\" (UniqueName: \"kubernetes.io/projected/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-kube-api-access-l5hl4\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.032707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.032728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.032756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.032925 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.032970 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:14.532956065 +0000 UTC m=+751.714472287 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "metrics-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.033218 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.033243 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:14.533234578 +0000 UTC m=+751.714750800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.033280 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.033297 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert podName:9356dd92-4c2e-476d-9a60-946f4f148564 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:15.033291206 +0000 UTC m=+752.214807427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dfskdx" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.059528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hl4\" (UniqueName: \"kubernetes.io/projected/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-kube-api-access-l5hl4\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.060356 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.065194 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq"] Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.095523 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7"] Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.106119 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.111886 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd457aa_b1c8_4b86_ada4_17f76c7ad4dd.slice/crio-4da72fd835c6b6a0e077ef11c59c9ed1a84e7cd455229909bfb1f2a92835217c WatchSource:0}: Error finding container 4da72fd835c6b6a0e077ef11c59c9ed1a84e7cd455229909bfb1f2a92835217c: Status 404 returned error can't find the container with id 4da72fd835c6b6a0e077ef11c59c9ed1a84e7cd455229909bfb1f2a92835217c Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.118703 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff533c3_a29f_498b_a575_1492c0e07aa9.slice/crio-022e1abc920591480fa3d50ea5892e57ae18e0ce41634939d04f6fa16dc35cd3 WatchSource:0}: Error finding container 022e1abc920591480fa3d50ea5892e57ae18e0ce41634939d04f6fa16dc35cd3: Status 404 returned error can't find the container with id 022e1abc920591480fa3d50ea5892e57ae18e0ce41634939d04f6fa16dc35cd3 Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.133887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwfb\" (UniqueName: \"kubernetes.io/projected/d8707730-a731-419d-ad12-caf5fddafbd5-kube-api-access-4dwfb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lpptc\" (UID: \"d8707730-a731-419d-ad12-caf5fddafbd5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.149465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" event={"ID":"4adccf62-a1c3-430c-9137-a515f03b23e4","Type":"ContainerStarted","Data":"e1181ff0d33bd2106492ac908180fc1e95d443aaeb3b20fa0870529df93627a5"} Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.151309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" event={"ID":"1ff533c3-a29f-498b-a575-1492c0e07aa9","Type":"ContainerStarted","Data":"022e1abc920591480fa3d50ea5892e57ae18e0ce41634939d04f6fa16dc35cd3"} Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.152910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwfb\" (UniqueName: \"kubernetes.io/projected/d8707730-a731-419d-ad12-caf5fddafbd5-kube-api-access-4dwfb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lpptc\" (UID: \"d8707730-a731-419d-ad12-caf5fddafbd5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.158131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" event={"ID":"6676ddd6-9d32-480d-9306-d3dc0676d2cd","Type":"ContainerStarted","Data":"9d02792b93962300a94de11e733191975263ecb6156a6227a26eed844b193908"} Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.160355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" event={"ID":"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd","Type":"ContainerStarted","Data":"4da72fd835c6b6a0e077ef11c59c9ed1a84e7cd455229909bfb1f2a92835217c"} Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.187590 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.194073 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a9240d_2c77_4718_8f9d_60633df4eee4.slice/crio-e5316412d015b5f7c3f3560b03a3a357463f0f6430836bf6e5fcdaf07e6f3619 WatchSource:0}: Error finding container e5316412d015b5f7c3f3560b03a3a357463f0f6430836bf6e5fcdaf07e6f3619: Status 404 returned error can't find the container with id e5316412d015b5f7c3f3560b03a3a357463f0f6430836bf6e5fcdaf07e6f3619 Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.197070 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.205902 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ab0bb3_12a8_4314_83c7_280a512198f6.slice/crio-657fc70bbae0560de034485e811af0a0e4ca8589cd2b996e2a67a813b8c7a1be WatchSource:0}: Error finding container 657fc70bbae0560de034485e811af0a0e4ca8589cd2b996e2a67a813b8c7a1be: Status 404 returned error can't find the container with id 657fc70bbae0560de034485e811af0a0e4ca8589cd2b996e2a67a813b8c7a1be Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.293091 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.386705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.388957 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8951bb5e_fd60_4761_9c83_bc523b041b83.slice/crio-ce4fd4fff028a04b0b493b691da329473637587b6188ac7cb63e9e073063ee80 WatchSource:0}: Error finding container ce4fd4fff028a04b0b493b691da329473637587b6188ac7cb63e9e073063ee80: Status 404 returned error can't find the container with id ce4fd4fff028a04b0b493b691da329473637587b6188ac7cb63e9e073063ee80 Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.390020 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st"] Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.395249 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm"] Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.401702 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.405113 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d095e2a_7788_4a7e_af9b_eaed8407ef5a.slice/crio-6afdd9a0ec2c36896a46fd19954ab8cbdade06f0eaa4f075f0870ff9c1530417 WatchSource:0}: Error finding container 6afdd9a0ec2c36896a46fd19954ab8cbdade06f0eaa4f075f0870ff9c1530417: Status 404 returned error can't find the container with id 6afdd9a0ec2c36896a46fd19954ab8cbdade06f0eaa4f075f0870ff9c1530417 Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.406215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.406670 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c2dd112_6dc9_4b29_95f2_25b3c44452a2.slice/crio-a186ff0e1402d8715927e0825d3dc08c84dc756abdea3c4f7cea33e59208719f WatchSource:0}: Error finding container a186ff0e1402d8715927e0825d3dc08c84dc756abdea3c4f7cea33e59208719f: Status 404 returned error can't find the container with id a186ff0e1402d8715927e0825d3dc08c84dc756abdea3c4f7cea33e59208719f Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.521293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4"] Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.526369 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm"] Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.539765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 
15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.539863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.539960 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.539971 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.540009 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:15.53999711 +0000 UTC m=+752.721513332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "metrics-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.540021 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:15.540016677 +0000 UTC m=+752.721532898 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.540864 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t"] Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.543320 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-747pw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-d9dr4_openstack-operators(62e25886-609f-4ef6-848c-e116c738df6a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.544704 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" podUID="62e25886-609f-4ef6-848c-e116c738df6a" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.545622 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58ttw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-79xkw_openstack-operators(5219c03b-89a8-4c73-b04f-2c6b1f8e29f4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.547480 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.549752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l"] Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.553671 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xdt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-znz8t_openstack-operators(0c59ef09-0beb-452c-8563-83bda9602961): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.555189 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" podUID="0c59ef09-0beb-452c-8563-83bda9602961" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.557329 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mcmrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-5tp6l_openstack-operators(40734c06-7e00-4184-9487-35be214c9556): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.557779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.558459 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f333daa_a37b_4c2e_bd13_e66416449d2c.slice/crio-40ce83d397ae6c679a56d729cfc138ff755f95f5553be692fe3834a343abfd66 WatchSource:0}: Error finding container 40ce83d397ae6c679a56d729cfc138ff755f95f5553be692fe3834a343abfd66: Status 404 returned error can't find the container with id 40ce83d397ae6c679a56d729cfc138ff755f95f5553be692fe3834a343abfd66 Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.558516 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" podUID="40734c06-7e00-4184-9487-35be214c9556" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.562658 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg"] Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.562678 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t7zn9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-njwgg_openstack-operators(9f333daa-a37b-4c2e-bd13-e66416449d2c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.563982 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.670137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.671916 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6b5573_a127_4713_919d_d79e38f60b87.slice/crio-441ce9e8ea74d8bf08ea2bf33e61e39b2087fd213c312b5aade143d12f83554a WatchSource:0}: Error finding container 441ce9e8ea74d8bf08ea2bf33e61e39b2087fd213c312b5aade143d12f83554a: Status 404 returned error can't find the container with id 441ce9e8ea74d8bf08ea2bf33e61e39b2087fd213c312b5aade143d12f83554a Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.672967 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d145ef_2bd3_4c20_ae4b_fff81dbfc4b6.slice/crio-ec6cdae23307b1bc8f303f95b6facba65ebad5c47f04c52271d3a8370c4bed89 WatchSource:0}: Error finding container ec6cdae23307b1bc8f303f95b6facba65ebad5c47f04c52271d3a8370c4bed89: Status 404 returned error can't find the container with id ec6cdae23307b1bc8f303f95b6facba65ebad5c47f04c52271d3a8370c4bed89 Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.673430 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-99p87"] Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.674354 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hg6cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-99p87_openstack-operators(79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.676276 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 
15:14:14.685287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc"] Jan 21 15:14:14 crc kubenswrapper[4707]: W0121 15:14:14.688407 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8707730_a731_419d_ad12_caf5fddafbd5.slice/crio-ee5c9c4f28d09358f5e4f684a85deb5b1c2de57d7f2b73a249fa4b280614f881 WatchSource:0}: Error finding container ee5c9c4f28d09358f5e4f684a85deb5b1c2de57d7f2b73a249fa4b280614f881: Status 404 returned error can't find the container with id ee5c9c4f28d09358f5e4f684a85deb5b1c2de57d7f2b73a249fa4b280614f881 Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.690605 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dwfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lpptc_openstack-operators(d8707730-a731-419d-ad12-caf5fddafbd5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.692009 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" Jan 21 15:14:14 crc kubenswrapper[4707]: I0121 15:14:14.942789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " 
pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.942987 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:14 crc kubenswrapper[4707]: E0121 15:14:14.943155 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert podName:375b0905-f382-4e75-8c5c-4ece8f6ddde3 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:16.943139752 +0000 UTC m=+754.124655974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert") pod "infra-operator-controller-manager-77c48c7859-h6pkz" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.043872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.043984 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.044056 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert podName:9356dd92-4c2e-476d-9a60-946f4f148564 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:17.044038983 +0000 UTC m=+754.225555205 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dfskdx" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.166046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" event={"ID":"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4","Type":"ContainerStarted","Data":"5af737c25c722e3f8df82b62942a9ca1d3e95970dccc2de59a013ce1eab7f526"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.166859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" event={"ID":"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6","Type":"ContainerStarted","Data":"ec6cdae23307b1bc8f303f95b6facba65ebad5c47f04c52271d3a8370c4bed89"} Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.167322 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.168005 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.168280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" event={"ID":"0c59ef09-0beb-452c-8563-83bda9602961","Type":"ContainerStarted","Data":"bcbff26adef1b2e1ec1e735705382ce38e79f7355d46171529c5ad812fc5ebf5"} Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.172787 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" podUID="0c59ef09-0beb-452c-8563-83bda9602961" Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.173083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" event={"ID":"4f34a822-4c51-4dfc-9812-1287c9b3281d","Type":"ContainerStarted","Data":"d764a7850aad2c6c911bd3737d1311a0dfb555c04e589572887c225ef28d544b"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.174477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" event={"ID":"e6ab0bb3-12a8-4314-83c7-280a512198f6","Type":"ContainerStarted","Data":"657fc70bbae0560de034485e811af0a0e4ca8589cd2b996e2a67a813b8c7a1be"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.177145 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" event={"ID":"8951bb5e-fd60-4761-9c83-bc523b041b83","Type":"ContainerStarted","Data":"ce4fd4fff028a04b0b493b691da329473637587b6188ac7cb63e9e073063ee80"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.180010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" event={"ID":"96a9240d-2c77-4718-8f9d-60633df4eee4","Type":"ContainerStarted","Data":"e5316412d015b5f7c3f3560b03a3a357463f0f6430836bf6e5fcdaf07e6f3619"} Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.183631 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.185005 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" podUID="40734c06-7e00-4184-9487-35be214c9556" Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.195786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" event={"ID":"d8707730-a731-419d-ad12-caf5fddafbd5","Type":"ContainerStarted","Data":"ee5c9c4f28d09358f5e4f684a85deb5b1c2de57d7f2b73a249fa4b280614f881"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.195839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" event={"ID":"40734c06-7e00-4184-9487-35be214c9556","Type":"ContainerStarted","Data":"5285b84a44266276fd32026dcaf0de016240472c642cd950fa8d30799db8f0f6"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.195851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" event={"ID":"7a6b5573-a127-4713-919d-d79e38f60b87","Type":"ContainerStarted","Data":"441ce9e8ea74d8bf08ea2bf33e61e39b2087fd213c312b5aade143d12f83554a"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.195859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" event={"ID":"4d095e2a-7788-4a7e-af9b-eaed8407ef5a","Type":"ContainerStarted","Data":"6afdd9a0ec2c36896a46fd19954ab8cbdade06f0eaa4f075f0870ff9c1530417"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.196859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" event={"ID":"62e25886-609f-4ef6-848c-e116c738df6a","Type":"ContainerStarted","Data":"2780ef155fd58bc61c7c38d5c95dcf76709fc13ec1935a89f985672e7a3b9a8d"} Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.202507 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" podUID="62e25886-609f-4ef6-848c-e116c738df6a" Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.204634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" event={"ID":"a674fe35-9ddd-4352-b770-1061b28fce34","Type":"ContainerStarted","Data":"fe1e0f07ce25b64d2de7e3cc60e5ae49f2faed3568d04a2a88207aae537e8721"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.205948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" event={"ID":"9f333daa-a37b-4c2e-bd13-e66416449d2c","Type":"ContainerStarted","Data":"40ce83d397ae6c679a56d729cfc138ff755f95f5553be692fe3834a343abfd66"} Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.209003 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.224463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" event={"ID":"06c244e2-fb38-4ac0-8ed0-2f34d01c5228","Type":"ContainerStarted","Data":"3b3fdc9a10007761ab66b639055919d6d4bc9530aa4b526bb3a5089bf964c0ba"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.225650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" event={"ID":"4c2dd112-6dc9-4b29-95f2-25b3c44452a2","Type":"ContainerStarted","Data":"a186ff0e1402d8715927e0825d3dc08c84dc756abdea3c4f7cea33e59208719f"} Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.551587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.551843 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.551919 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:17.551903357 +0000 UTC m=+754.733419579 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "metrics-server-cert" not found Jan 21 15:14:15 crc kubenswrapper[4707]: I0121 15:14:15.551990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.552320 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:14:15 crc kubenswrapper[4707]: E0121 15:14:15.552475 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:17.552405812 +0000 UTC m=+754.733922034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "webhook-server-cert" not found Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.233202 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" podUID="40734c06-7e00-4184-9487-35be214c9556" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.234166 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" podUID="62e25886-609f-4ef6-848c-e116c738df6a" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.234269 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.234320 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.234374 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.234737 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.235553 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" podUID="0c59ef09-0beb-452c-8563-83bda9602961" Jan 21 15:14:16 crc kubenswrapper[4707]: I0121 15:14:16.986295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.986425 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:16 crc kubenswrapper[4707]: E0121 15:14:16.986642 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert podName:375b0905-f382-4e75-8c5c-4ece8f6ddde3 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:20.986625271 +0000 UTC m=+758.168141492 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert") pod "infra-operator-controller-manager-77c48c7859-h6pkz" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:17 crc kubenswrapper[4707]: I0121 15:14:17.088390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:17 crc kubenswrapper[4707]: E0121 15:14:17.088565 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:17 crc kubenswrapper[4707]: E0121 15:14:17.088626 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert podName:9356dd92-4c2e-476d-9a60-946f4f148564 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:21.088609994 +0000 UTC m=+758.270126217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dfskdx" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:17 crc kubenswrapper[4707]: I0121 15:14:17.242698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" event={"ID":"4adccf62-a1c3-430c-9137-a515f03b23e4","Type":"ContainerStarted","Data":"e45b507f5954b9cce48efe9efde816a45062e6746141cb2cb4c92830af61bce1"} Jan 21 15:14:17 crc kubenswrapper[4707]: I0121 15:14:17.242919 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:17 crc kubenswrapper[4707]: I0121 15:14:17.257762 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" podStartSLOduration=2.0744039 podStartE2EDuration="4.257747872s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:13.978885367 +0000 UTC m=+751.160401590" lastFinishedPulling="2026-01-21 15:14:16.162229341 +0000 UTC m=+753.343745562" observedRunningTime="2026-01-21 15:14:17.255484495 +0000 UTC m=+754.437000717" watchObservedRunningTime="2026-01-21 15:14:17.257747872 +0000 UTC m=+754.439264084" Jan 21 15:14:17 crc kubenswrapper[4707]: I0121 15:14:17.594934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:17 crc kubenswrapper[4707]: I0121 15:14:17.594997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:17 crc kubenswrapper[4707]: E0121 15:14:17.595035 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:14:17 crc kubenswrapper[4707]: E0121 15:14:17.595094 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:21.595078321 +0000 UTC m=+758.776594543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "webhook-server-cert" not found Jan 21 15:14:17 crc kubenswrapper[4707]: E0121 15:14:17.595168 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:14:17 crc kubenswrapper[4707]: E0121 15:14:17.595578 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:21.595559156 +0000 UTC m=+758.777075378 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "metrics-server-cert" not found Jan 21 15:14:20 crc kubenswrapper[4707]: I0121 15:14:20.828881 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 15:14:21 crc kubenswrapper[4707]: I0121 15:14:21.038884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.039021 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.039085 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert podName:375b0905-f382-4e75-8c5c-4ece8f6ddde3 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:29.039069895 +0000 UTC m=+766.220586117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert") pod "infra-operator-controller-manager-77c48c7859-h6pkz" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: I0121 15:14:21.140323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.140511 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.140582 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert podName:9356dd92-4c2e-476d-9a60-946f4f148564 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:29.140565889 +0000 UTC m=+766.322082112 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dfskdx" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: I0121 15:14:21.644483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:21 crc kubenswrapper[4707]: I0121 15:14:21.644542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.644644 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.644685 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:29.644672644 +0000 UTC m=+766.826188866 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "metrics-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.644644 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:14:21 crc kubenswrapper[4707]: E0121 15:14:21.644759 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:29.644749118 +0000 UTC m=+766.826265340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "webhook-server-cert" not found Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.272688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" event={"ID":"4f34a822-4c51-4dfc-9812-1287c9b3281d","Type":"ContainerStarted","Data":"c813c3747e39edf631ae0a59c3bfe2ea36cc0254e81c018a901ec8d129da1f49"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.272928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.274931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" event={"ID":"e6ab0bb3-12a8-4314-83c7-280a512198f6","Type":"ContainerStarted","Data":"e4f42a8b26c44d79b0b42241e85e250bb46ac11e92535e51eb178335a37ba91b"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.274988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.276325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" event={"ID":"a674fe35-9ddd-4352-b770-1061b28fce34","Type":"ContainerStarted","Data":"c8595d9ca2bcaf4f53f6bc9205a1d3fae85f9f44532fae632b4d59583c03b6f7"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.276437 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.277717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" event={"ID":"6676ddd6-9d32-480d-9306-d3dc0676d2cd","Type":"ContainerStarted","Data":"2f7d7170e2337291b3525b16efe7ac13fde05e46c80b71e57c73c1795a5e8afa"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.277761 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.278777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" event={"ID":"06c244e2-fb38-4ac0-8ed0-2f34d01c5228","Type":"ContainerStarted","Data":"9f157a1954f415b2a222d4a1081d13ac7ca6e425474cd711399734533ef230cf"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.278931 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.282715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" event={"ID":"96a9240d-2c77-4718-8f9d-60633df4eee4","Type":"ContainerStarted","Data":"76e7d98d71cfa80838fca4badb283a780fc5d99d758975b4e61ffbaedfea58aa"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.282848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.283690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" event={"ID":"4c2dd112-6dc9-4b29-95f2-25b3c44452a2","Type":"ContainerStarted","Data":"54bd8be04d416835f1281493540d9e01504393fe58be46567299bee6842f6ce5"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.284003 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.285321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" event={"ID":"7a6b5573-a127-4713-919d-d79e38f60b87","Type":"ContainerStarted","Data":"33b413ceef8ffdcb86dbbdca687a971ccb0ac31560e3c2cc793fe5dde7400208"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.285705 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.286826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" event={"ID":"8951bb5e-fd60-4761-9c83-bc523b041b83","Type":"ContainerStarted","Data":"5a8b6331813d6ce1cc4c1f242ea893b7520dd2bafdb2bc296453a569d7df2394"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.287170 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.290143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" event={"ID":"4d095e2a-7788-4a7e-af9b-eaed8407ef5a","Type":"ContainerStarted","Data":"0629f352d3f4441913ab06b6e0a66cad5ad04c26a87acf9f4fafc1b2313ff041"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.290216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.291551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" 
event={"ID":"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd","Type":"ContainerStarted","Data":"970be3767e71031a742c429f6c28e936b6930af2dc7e5f41e212c357f8c9b82e"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.291645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.293052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" event={"ID":"1ff533c3-a29f-498b-a575-1492c0e07aa9","Type":"ContainerStarted","Data":"63c114e32b92172fc2d850e2811160cb9627b2ab520857ee5130971d61c990d6"} Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.293136 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.391326 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" podStartSLOduration=1.868115831 podStartE2EDuration="10.391312035s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.416388872 +0000 UTC m=+751.597905094" lastFinishedPulling="2026-01-21 15:14:22.939585077 +0000 UTC m=+760.121101298" observedRunningTime="2026-01-21 15:14:23.388145719 +0000 UTC m=+760.569661942" watchObservedRunningTime="2026-01-21 15:14:23.391312035 +0000 UTC m=+760.572828258" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.419111 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" podStartSLOduration=2.122088343 podStartE2EDuration="10.41909794s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.20760493 +0000 UTC m=+751.389121151" lastFinishedPulling="2026-01-21 15:14:22.504614526 +0000 UTC m=+759.686130748" observedRunningTime="2026-01-21 15:14:23.416011035 +0000 UTC m=+760.597527256" watchObservedRunningTime="2026-01-21 15:14:23.41909794 +0000 UTC m=+760.600614163" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.444104 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" podStartSLOduration=1.750912151 podStartE2EDuration="10.444074762s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.208462131 +0000 UTC m=+751.389978354" lastFinishedPulling="2026-01-21 15:14:22.901624743 +0000 UTC m=+760.083140965" observedRunningTime="2026-01-21 15:14:23.439585918 +0000 UTC m=+760.621102141" watchObservedRunningTime="2026-01-21 15:14:23.444074762 +0000 UTC m=+760.625590985" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.455497 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.476658 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" podStartSLOduration=2.277507837 podStartE2EDuration="10.476646078s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.673956828 +0000 UTC m=+751.855473050" lastFinishedPulling="2026-01-21 
15:14:22.873095069 +0000 UTC m=+760.054611291" observedRunningTime="2026-01-21 15:14:23.474233781 +0000 UTC m=+760.655750003" watchObservedRunningTime="2026-01-21 15:14:23.476646078 +0000 UTC m=+760.658162301" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.498088 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" podStartSLOduration=1.7531767409999999 podStartE2EDuration="10.498072462s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.128288134 +0000 UTC m=+751.309804356" lastFinishedPulling="2026-01-21 15:14:22.873183855 +0000 UTC m=+760.054700077" observedRunningTime="2026-01-21 15:14:23.494604749 +0000 UTC m=+760.676120970" watchObservedRunningTime="2026-01-21 15:14:23.498072462 +0000 UTC m=+760.679588684" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.521363 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" podStartSLOduration=1.733705027 podStartE2EDuration="10.521348455s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.114306816 +0000 UTC m=+751.295823038" lastFinishedPulling="2026-01-21 15:14:22.901950245 +0000 UTC m=+760.083466466" observedRunningTime="2026-01-21 15:14:23.518108009 +0000 UTC m=+760.699624231" watchObservedRunningTime="2026-01-21 15:14:23.521348455 +0000 UTC m=+760.702864676" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.550787 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" podStartSLOduration=2.063446267 podStartE2EDuration="10.550770246s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.414633741 +0000 UTC m=+751.596149963" lastFinishedPulling="2026-01-21 15:14:22.90195772 +0000 UTC m=+760.083473942" observedRunningTime="2026-01-21 15:14:23.546770874 +0000 UTC m=+760.728287096" watchObservedRunningTime="2026-01-21 15:14:23.550770246 +0000 UTC m=+760.732286468" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.575711 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" podStartSLOduration=2.083646967 podStartE2EDuration="10.575698457s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.408859159 +0000 UTC m=+751.590375381" lastFinishedPulling="2026-01-21 15:14:22.90091065 +0000 UTC m=+760.082426871" observedRunningTime="2026-01-21 15:14:23.573259599 +0000 UTC m=+760.754775821" watchObservedRunningTime="2026-01-21 15:14:23.575698457 +0000 UTC m=+760.757214668" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.595828 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" podStartSLOduration=2.234708663 podStartE2EDuration="10.595803635s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.538346925 +0000 UTC m=+751.719863148" lastFinishedPulling="2026-01-21 15:14:22.899441897 +0000 UTC m=+760.080958120" observedRunningTime="2026-01-21 15:14:23.593778135 +0000 UTC m=+760.775294357" watchObservedRunningTime="2026-01-21 15:14:23.595803635 +0000 UTC m=+760.777319857" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 
15:14:23.610816 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" podStartSLOduration=2.119167198 podStartE2EDuration="10.610792679s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.390398614 +0000 UTC m=+751.571914836" lastFinishedPulling="2026-01-21 15:14:22.882024094 +0000 UTC m=+760.063540317" observedRunningTime="2026-01-21 15:14:23.60869838 +0000 UTC m=+760.790214602" watchObservedRunningTime="2026-01-21 15:14:23.610792679 +0000 UTC m=+760.792308901" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.643043 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" podStartSLOduration=2.266351946 podStartE2EDuration="10.643029337s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.128022495 +0000 UTC m=+751.309538717" lastFinishedPulling="2026-01-21 15:14:22.504699886 +0000 UTC m=+759.686216108" observedRunningTime="2026-01-21 15:14:23.640603472 +0000 UTC m=+760.822119695" watchObservedRunningTime="2026-01-21 15:14:23.643029337 +0000 UTC m=+760.824545558" Jan 21 15:14:23 crc kubenswrapper[4707]: I0121 15:14:23.664354 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" podStartSLOduration=2.16988006 podStartE2EDuration="10.664340322s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.406940429 +0000 UTC m=+751.588456651" lastFinishedPulling="2026-01-21 15:14:22.901400692 +0000 UTC m=+760.082916913" observedRunningTime="2026-01-21 15:14:23.663785077 +0000 UTC m=+760.845301299" watchObservedRunningTime="2026-01-21 15:14:23.664340322 +0000 UTC m=+760.845856544" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.064739 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-26x25"] Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.065919 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.079305 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-26x25"] Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.176440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kktlm\" (UniqueName: \"kubernetes.io/projected/2510154b-bb33-4462-8993-3cd6c69e286b-kube-api-access-kktlm\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.176671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-utilities\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.176890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-catalog-content\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.278470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kktlm\" (UniqueName: \"kubernetes.io/projected/2510154b-bb33-4462-8993-3cd6c69e286b-kube-api-access-kktlm\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.278511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-utilities\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.278609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-catalog-content\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.279090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-catalog-content\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.279478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-utilities\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.293692 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kktlm\" (UniqueName: \"kubernetes.io/projected/2510154b-bb33-4462-8993-3cd6c69e286b-kube-api-access-kktlm\") pod \"certified-operators-26x25\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.377671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:24 crc kubenswrapper[4707]: I0121 15:14:24.852386 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-26x25"] Jan 21 15:14:25 crc kubenswrapper[4707]: I0121 15:14:25.308022 4707 generic.go:334] "Generic (PLEG): container finished" podID="2510154b-bb33-4462-8993-3cd6c69e286b" containerID="4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b" exitCode=0 Jan 21 15:14:25 crc kubenswrapper[4707]: I0121 15:14:25.308087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26x25" event={"ID":"2510154b-bb33-4462-8993-3cd6c69e286b","Type":"ContainerDied","Data":"4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b"} Jan 21 15:14:25 crc kubenswrapper[4707]: I0121 15:14:25.308415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26x25" event={"ID":"2510154b-bb33-4462-8993-3cd6c69e286b","Type":"ContainerStarted","Data":"ed397f24d31c8a42ef74bc1e32bd0997d2e379e79829e57419f77ea91044f65e"} Jan 21 15:14:26 crc kubenswrapper[4707]: I0121 15:14:26.315429 4707 generic.go:334] "Generic (PLEG): container finished" podID="2510154b-bb33-4462-8993-3cd6c69e286b" containerID="6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb" exitCode=0 Jan 21 15:14:26 crc kubenswrapper[4707]: I0121 15:14:26.315469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26x25" event={"ID":"2510154b-bb33-4462-8993-3cd6c69e286b","Type":"ContainerDied","Data":"6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb"} Jan 21 15:14:27 crc kubenswrapper[4707]: I0121 15:14:27.321698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26x25" event={"ID":"2510154b-bb33-4462-8993-3cd6c69e286b","Type":"ContainerStarted","Data":"4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1"} Jan 21 15:14:27 crc kubenswrapper[4707]: I0121 15:14:27.334885 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-26x25" podStartSLOduration=1.894174344 podStartE2EDuration="3.334871997s" podCreationTimestamp="2026-01-21 15:14:24 +0000 UTC" firstStartedPulling="2026-01-21 15:14:25.30975539 +0000 UTC m=+762.491271612" lastFinishedPulling="2026-01-21 15:14:26.750453044 +0000 UTC m=+763.931969265" observedRunningTime="2026-01-21 15:14:27.333566782 +0000 UTC m=+764.515083004" watchObservedRunningTime="2026-01-21 15:14:27.334871997 +0000 UTC m=+764.516388210" Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.138797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:29 crc 
kubenswrapper[4707]: E0121 15:14:29.139164 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:29 crc kubenswrapper[4707]: E0121 15:14:29.139212 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert podName:375b0905-f382-4e75-8c5c-4ece8f6ddde3 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:45.139198437 +0000 UTC m=+782.320714679 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert") pod "infra-operator-controller-manager-77c48c7859-h6pkz" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.241491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.248406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dfskdx\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.398072 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.646129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.646444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:29 crc kubenswrapper[4707]: E0121 15:14:29.646578 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:14:29 crc kubenswrapper[4707]: E0121 15:14:29.646629 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs podName:bd0195ed-9602-42a1-bd0e-a0d79f8a6197 nodeName:}" failed. No retries permitted until 2026-01-21 15:14:45.646614528 +0000 UTC m=+782.828130749 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-nns7n" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197") : secret "webhook-server-cert" not found Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.652937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:29 crc kubenswrapper[4707]: I0121 15:14:29.772172 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx"] Jan 21 15:14:30 crc kubenswrapper[4707]: W0121 15:14:30.116667 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9356dd92_4c2e_476d_9a60_946f4f148564.slice/crio-61fece91d28908eb9db10f6ecab6f1f4fb5ec195e5d5cf6e11426c58e089c800 WatchSource:0}: Error finding container 61fece91d28908eb9db10f6ecab6f1f4fb5ec195e5d5cf6e11426c58e089c800: Status 404 returned error can't find the container with id 61fece91d28908eb9db10f6ecab6f1f4fb5ec195e5d5cf6e11426c58e089c800 Jan 21 15:14:30 crc kubenswrapper[4707]: I0121 15:14:30.336824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" event={"ID":"9356dd92-4c2e-476d-9a60-946f4f148564","Type":"ContainerStarted","Data":"61fece91d28908eb9db10f6ecab6f1f4fb5ec195e5d5cf6e11426c58e089c800"} Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.457274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.470919 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.510866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.554126 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.584121 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.605130 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.685180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.688248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.706132 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.706994 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.715570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 15:14:33 crc kubenswrapper[4707]: I0121 15:14:33.988948 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 15:14:34 crc kubenswrapper[4707]: I0121 15:14:34.378108 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:34 crc kubenswrapper[4707]: I0121 15:14:34.378234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:34 crc kubenswrapper[4707]: I0121 15:14:34.407698 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:35 crc kubenswrapper[4707]: I0121 15:14:35.388514 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:35 crc kubenswrapper[4707]: I0121 15:14:35.417474 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-26x25"] Jan 21 15:14:37 crc kubenswrapper[4707]: I0121 15:14:37.370068 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-26x25" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="registry-server" containerID="cri-o://4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1" gracePeriod=2 Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.046611 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.154777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-catalog-content\") pod \"2510154b-bb33-4462-8993-3cd6c69e286b\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.155051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-utilities\") pod \"2510154b-bb33-4462-8993-3cd6c69e286b\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.155083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kktlm\" (UniqueName: \"kubernetes.io/projected/2510154b-bb33-4462-8993-3cd6c69e286b-kube-api-access-kktlm\") pod \"2510154b-bb33-4462-8993-3cd6c69e286b\" (UID: \"2510154b-bb33-4462-8993-3cd6c69e286b\") " Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.155631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-utilities" (OuterVolumeSpecName: "utilities") pod "2510154b-bb33-4462-8993-3cd6c69e286b" (UID: "2510154b-bb33-4462-8993-3cd6c69e286b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.160046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2510154b-bb33-4462-8993-3cd6c69e286b-kube-api-access-kktlm" (OuterVolumeSpecName: "kube-api-access-kktlm") pod "2510154b-bb33-4462-8993-3cd6c69e286b" (UID: "2510154b-bb33-4462-8993-3cd6c69e286b"). InnerVolumeSpecName "kube-api-access-kktlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.189435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2510154b-bb33-4462-8993-3cd6c69e286b" (UID: "2510154b-bb33-4462-8993-3cd6c69e286b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.256733 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.256764 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kktlm\" (UniqueName: \"kubernetes.io/projected/2510154b-bb33-4462-8993-3cd6c69e286b-kube-api-access-kktlm\") on node \"crc\" DevicePath \"\"" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.256774 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2510154b-bb33-4462-8993-3cd6c69e286b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.376463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" event={"ID":"d8707730-a731-419d-ad12-caf5fddafbd5","Type":"ContainerStarted","Data":"f81d6212e7b46c7cc4d9c03b8fdd9060c30d3305e97507e4bea0f56ee40e5424"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.378437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" event={"ID":"40734c06-7e00-4184-9487-35be214c9556","Type":"ContainerStarted","Data":"e592e10125a5ad6cbafa06a737cbe06c97e19b4290da7089fbef2ffc14cfde56"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.378654 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.380717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" event={"ID":"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6","Type":"ContainerStarted","Data":"39c10d68c0dbc799c2b840647380cd032397d3c2ea03aba0b3c0758af87a066a"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.380915 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.381993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" event={"ID":"9356dd92-4c2e-476d-9a60-946f4f148564","Type":"ContainerStarted","Data":"09c8747e2bac89103cb29b97e35d54f1e372a4dea0daecca785b4bd9414b0125"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.382053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.383152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" event={"ID":"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4","Type":"ContainerStarted","Data":"d17f33e08f973302863a6a158a1b0329e06116ee4116ba93b9392405aeaa6b17"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.383458 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.384539 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" event={"ID":"62e25886-609f-4ef6-848c-e116c738df6a","Type":"ContainerStarted","Data":"0046dd925d470a4baf4e0eb975a04283e1f25124964c96ed2b0c722a82a5b5aa"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.384950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.386145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" event={"ID":"0c59ef09-0beb-452c-8563-83bda9602961","Type":"ContainerStarted","Data":"8e0b35d221b6726be4c9bab8690c69dd64a57f1c4279e3e7101140eeba8d5358"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.386446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.387477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" event={"ID":"9f333daa-a37b-4c2e-bd13-e66416449d2c","Type":"ContainerStarted","Data":"39ac84a7db285687d6db26470e02e89c210c3ae4cac2d9bde592910bf8a6ac53"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.387825 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.389626 4707 generic.go:334] "Generic (PLEG): container finished" podID="2510154b-bb33-4462-8993-3cd6c69e286b" containerID="4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1" exitCode=0 Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.389653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26x25" event={"ID":"2510154b-bb33-4462-8993-3cd6c69e286b","Type":"ContainerDied","Data":"4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.389671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26x25" event={"ID":"2510154b-bb33-4462-8993-3cd6c69e286b","Type":"ContainerDied","Data":"ed397f24d31c8a42ef74bc1e32bd0997d2e379e79829e57419f77ea91044f65e"} Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.389687 4707 scope.go:117] "RemoveContainer" containerID="4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.389770 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-26x25" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.410063 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" podStartSLOduration=7.013435064 podStartE2EDuration="25.410049842s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.690524491 +0000 UTC m=+751.872040713" lastFinishedPulling="2026-01-21 15:14:33.087139269 +0000 UTC m=+770.268655491" observedRunningTime="2026-01-21 15:14:38.407139467 +0000 UTC m=+775.588655690" watchObservedRunningTime="2026-01-21 15:14:38.410049842 +0000 UTC m=+775.591566063" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.412657 4707 scope.go:117] "RemoveContainer" containerID="6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.429446 4707 scope.go:117] "RemoveContainer" containerID="4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.441242 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" podStartSLOduration=17.69833192 podStartE2EDuration="25.441227858s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:30.119047355 +0000 UTC m=+767.300563577" lastFinishedPulling="2026-01-21 15:14:37.861943294 +0000 UTC m=+775.043459515" observedRunningTime="2026-01-21 15:14:38.436170595 +0000 UTC m=+775.617686818" watchObservedRunningTime="2026-01-21 15:14:38.441227858 +0000 UTC m=+775.622744081" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.443835 4707 scope.go:117] "RemoveContainer" containerID="4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1" Jan 21 15:14:38 crc kubenswrapper[4707]: E0121 15:14:38.444324 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1\": container with ID starting with 4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1 not found: ID does not exist" containerID="4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.444353 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1"} err="failed to get container status \"4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1\": rpc error: code = NotFound desc = could not find container \"4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1\": container with ID starting with 4f0a7126033f8f27eec297636a0cee89266a22f492e07b2bc23de0339d8733f1 not found: ID does not exist" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.444374 4707 scope.go:117] "RemoveContainer" containerID="6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb" Jan 21 15:14:38 crc kubenswrapper[4707]: E0121 15:14:38.444630 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb\": container with ID starting with 6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb not found: ID does not 
exist" containerID="6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.444648 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb"} err="failed to get container status \"6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb\": rpc error: code = NotFound desc = could not find container \"6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb\": container with ID starting with 6dab76b026a4b7d49338392dcaf02ccdd55add810c4b8a063956b5a1001706cb not found: ID does not exist" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.444661 4707 scope.go:117] "RemoveContainer" containerID="4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b" Jan 21 15:14:38 crc kubenswrapper[4707]: E0121 15:14:38.444998 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b\": container with ID starting with 4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b not found: ID does not exist" containerID="4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.445016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b"} err="failed to get container status \"4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b\": rpc error: code = NotFound desc = could not find container \"4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b\": container with ID starting with 4270040f7ef22bbe25ff175fb43c4cd1fafe0fb3532e202e26651f6f99af1b6b not found: ID does not exist" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.502654 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" podStartSLOduration=2.195072356 podStartE2EDuration="25.502640034s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.543227456 +0000 UTC m=+751.724743678" lastFinishedPulling="2026-01-21 15:14:37.850795134 +0000 UTC m=+775.032311356" observedRunningTime="2026-01-21 15:14:38.462295717 +0000 UTC m=+775.643811938" watchObservedRunningTime="2026-01-21 15:14:38.502640034 +0000 UTC m=+775.684156257" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.507184 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" podStartSLOduration=2.201035114 podStartE2EDuration="25.50717792s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.557245404 +0000 UTC m=+751.738761626" lastFinishedPulling="2026-01-21 15:14:37.863388211 +0000 UTC m=+775.044904432" observedRunningTime="2026-01-21 15:14:38.481043792 +0000 UTC m=+775.662560014" watchObservedRunningTime="2026-01-21 15:14:38.50717792 +0000 UTC m=+775.688694142" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.523347 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-26x25"] Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.532616 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-26x25"] Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.535608 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" podStartSLOduration=7.33654688 podStartE2EDuration="25.535599963s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.562558547 +0000 UTC m=+751.744074770" lastFinishedPulling="2026-01-21 15:14:32.761611631 +0000 UTC m=+769.943127853" observedRunningTime="2026-01-21 15:14:38.516302664 +0000 UTC m=+775.697818875" watchObservedRunningTime="2026-01-21 15:14:38.535599963 +0000 UTC m=+775.717116184" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.538328 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" podStartSLOduration=2.241131916 podStartE2EDuration="25.538324297s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.545552811 +0000 UTC m=+751.727069032" lastFinishedPulling="2026-01-21 15:14:37.842745191 +0000 UTC m=+775.024261413" observedRunningTime="2026-01-21 15:14:38.528279853 +0000 UTC m=+775.709796075" watchObservedRunningTime="2026-01-21 15:14:38.538324297 +0000 UTC m=+775.719840519" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.539843 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" podStartSLOduration=2.250692139 podStartE2EDuration="25.539838255s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.553570631 +0000 UTC m=+751.735086854" lastFinishedPulling="2026-01-21 15:14:37.842716748 +0000 UTC m=+775.024232970" observedRunningTime="2026-01-21 15:14:38.537397624 +0000 UTC m=+775.718913835" watchObservedRunningTime="2026-01-21 15:14:38.539838255 +0000 UTC m=+775.721354477" Jan 21 15:14:38 crc kubenswrapper[4707]: I0121 15:14:38.548843 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" podStartSLOduration=2.361936347 podStartE2EDuration="25.548833274s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:14.674267282 +0000 UTC m=+751.855783503" lastFinishedPulling="2026-01-21 15:14:37.861164207 +0000 UTC m=+775.042680430" observedRunningTime="2026-01-21 15:14:38.545410787 +0000 UTC m=+775.726927009" watchObservedRunningTime="2026-01-21 15:14:38.548833274 +0000 UTC m=+775.730349496" Jan 21 15:14:39 crc kubenswrapper[4707]: I0121 15:14:39.189966 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" path="/var/lib/kubelet/pods/2510154b-bb33-4462-8993-3cd6c69e286b/volumes" Jan 21 15:14:39 crc kubenswrapper[4707]: I0121 15:14:39.946046 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:14:39 crc kubenswrapper[4707]: I0121 15:14:39.946110 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:14:43 crc kubenswrapper[4707]: I0121 15:14:43.739748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 15:14:43 crc kubenswrapper[4707]: I0121 15:14:43.789478 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 15:14:43 crc kubenswrapper[4707]: I0121 15:14:43.811146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 15:14:43 crc kubenswrapper[4707]: I0121 15:14:43.847748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 15:14:43 crc kubenswrapper[4707]: I0121 15:14:43.907153 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 15:14:44 crc kubenswrapper[4707]: I0121 15:14:44.063232 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.143467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.148044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"infra-operator-controller-manager-77c48c7859-h6pkz\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.410884 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kdccs" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.420386 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.650112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.653466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-nns7n\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.683726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-47bc4" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.692783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:45 crc kubenswrapper[4707]: I0121 15:14:45.774695 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz"] Jan 21 15:14:45 crc kubenswrapper[4707]: W0121 15:14:45.783409 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375b0905_f382_4e75_8c5c_4ece8f6ddde3.slice/crio-2d9cfa07cb89a18e2a8095e5cb11e0e960479d3bc16f74cc3fab5ce9ee6ca333 WatchSource:0}: Error finding container 2d9cfa07cb89a18e2a8095e5cb11e0e960479d3bc16f74cc3fab5ce9ee6ca333: Status 404 returned error can't find the container with id 2d9cfa07cb89a18e2a8095e5cb11e0e960479d3bc16f74cc3fab5ce9ee6ca333 Jan 21 15:14:46 crc kubenswrapper[4707]: I0121 15:14:46.058253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n"] Jan 21 15:14:46 crc kubenswrapper[4707]: W0121 15:14:46.059453 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd0195ed_9602_42a1_bd0e_a0d79f8a6197.slice/crio-2ab3ef5775dddc47d85bacd7254ab86ca56ddaeec082e2dd164c10b254cfb6c9 WatchSource:0}: Error finding container 2ab3ef5775dddc47d85bacd7254ab86ca56ddaeec082e2dd164c10b254cfb6c9: Status 404 returned error can't find the container with id 2ab3ef5775dddc47d85bacd7254ab86ca56ddaeec082e2dd164c10b254cfb6c9 Jan 21 15:14:46 crc kubenswrapper[4707]: I0121 15:14:46.427961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" event={"ID":"bd0195ed-9602-42a1-bd0e-a0d79f8a6197","Type":"ContainerStarted","Data":"04f2aae1a90962c24e8739e6bbd1f7d8124043d6d56f594bba3c8c868bc521b3"} Jan 21 15:14:46 crc kubenswrapper[4707]: I0121 15:14:46.429010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" 
event={"ID":"bd0195ed-9602-42a1-bd0e-a0d79f8a6197","Type":"ContainerStarted","Data":"2ab3ef5775dddc47d85bacd7254ab86ca56ddaeec082e2dd164c10b254cfb6c9"} Jan 21 15:14:46 crc kubenswrapper[4707]: I0121 15:14:46.429096 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:14:46 crc kubenswrapper[4707]: I0121 15:14:46.429154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" event={"ID":"375b0905-f382-4e75-8c5c-4ece8f6ddde3","Type":"ContainerStarted","Data":"2d9cfa07cb89a18e2a8095e5cb11e0e960479d3bc16f74cc3fab5ce9ee6ca333"} Jan 21 15:14:46 crc kubenswrapper[4707]: I0121 15:14:46.447211 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" podStartSLOduration=33.447199582 podStartE2EDuration="33.447199582s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:14:46.445326478 +0000 UTC m=+783.626842701" watchObservedRunningTime="2026-01-21 15:14:46.447199582 +0000 UTC m=+783.628715804" Jan 21 15:14:48 crc kubenswrapper[4707]: I0121 15:14:48.440781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" event={"ID":"375b0905-f382-4e75-8c5c-4ece8f6ddde3","Type":"ContainerStarted","Data":"098a977d584fdcb0ebe244d0fa4d9dc9081e754df204db8fed081b450701f4e2"} Jan 21 15:14:48 crc kubenswrapper[4707]: I0121 15:14:48.441056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:48 crc kubenswrapper[4707]: I0121 15:14:48.451027 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" podStartSLOduration=33.788456867 podStartE2EDuration="35.45101436s" podCreationTimestamp="2026-01-21 15:14:13 +0000 UTC" firstStartedPulling="2026-01-21 15:14:45.785114819 +0000 UTC m=+782.966631042" lastFinishedPulling="2026-01-21 15:14:47.447672323 +0000 UTC m=+784.629188535" observedRunningTime="2026-01-21 15:14:48.44942432 +0000 UTC m=+785.630940532" watchObservedRunningTime="2026-01-21 15:14:48.45101436 +0000 UTC m=+785.632530583" Jan 21 15:14:49 crc kubenswrapper[4707]: I0121 15:14:49.403489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 15:14:55 crc kubenswrapper[4707]: I0121 15:14:55.424907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 15:14:55 crc kubenswrapper[4707]: I0121 15:14:55.697486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.143462 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c"] Jan 21 15:15:00 crc kubenswrapper[4707]: E0121 15:15:00.143704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" 
containerName="extract-content" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.143716 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="extract-content" Jan 21 15:15:00 crc kubenswrapper[4707]: E0121 15:15:00.143728 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="registry-server" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.143733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="registry-server" Jan 21 15:15:00 crc kubenswrapper[4707]: E0121 15:15:00.143753 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="extract-utilities" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.143758 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="extract-utilities" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.143910 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2510154b-bb33-4462-8993-3cd6c69e286b" containerName="registry-server" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.144310 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.148278 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.148289 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.150799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c"] Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.311482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23aa5a49-72b2-4855-8655-448300642a47-secret-volume\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.311743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kscp9\" (UniqueName: \"kubernetes.io/projected/23aa5a49-72b2-4855-8655-448300642a47-kube-api-access-kscp9\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.311790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23aa5a49-72b2-4855-8655-448300642a47-config-volume\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.364352 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-sr7pd"] Jan 21 15:15:00 crc 
kubenswrapper[4707]: I0121 15:15:00.365167 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.366499 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.366663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.367197 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.367456 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.367729 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sr7pd"] Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.413717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23aa5a49-72b2-4855-8655-448300642a47-secret-volume\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.413770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kscp9\" (UniqueName: \"kubernetes.io/projected/23aa5a49-72b2-4855-8655-448300642a47-kube-api-access-kscp9\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.413804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23aa5a49-72b2-4855-8655-448300642a47-config-volume\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.414545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23aa5a49-72b2-4855-8655-448300642a47-config-volume\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.420051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23aa5a49-72b2-4855-8655-448300642a47-secret-volume\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.427574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kscp9\" (UniqueName: \"kubernetes.io/projected/23aa5a49-72b2-4855-8655-448300642a47-kube-api-access-kscp9\") pod \"collect-profiles-29483475-8984c\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc 
kubenswrapper[4707]: I0121 15:15:00.458653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.514648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dphrf\" (UniqueName: \"kubernetes.io/projected/6473b2ca-fd25-446b-8c6a-76e29eff2104-kube-api-access-dphrf\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.514707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6473b2ca-fd25-446b-8c6a-76e29eff2104-node-mnt\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.514754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6473b2ca-fd25-446b-8c6a-76e29eff2104-crc-storage\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.615602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dphrf\" (UniqueName: \"kubernetes.io/projected/6473b2ca-fd25-446b-8c6a-76e29eff2104-kube-api-access-dphrf\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.615669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6473b2ca-fd25-446b-8c6a-76e29eff2104-node-mnt\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.615717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6473b2ca-fd25-446b-8c6a-76e29eff2104-crc-storage\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.616255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6473b2ca-fd25-446b-8c6a-76e29eff2104-crc-storage\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.616621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6473b2ca-fd25-446b-8c6a-76e29eff2104-node-mnt\") pod \"crc-storage-crc-sr7pd\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.631070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dphrf\" (UniqueName: \"kubernetes.io/projected/6473b2ca-fd25-446b-8c6a-76e29eff2104-kube-api-access-dphrf\") pod \"crc-storage-crc-sr7pd\" (UID: 
\"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.675760 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:00 crc kubenswrapper[4707]: I0121 15:15:00.797066 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c"] Jan 21 15:15:00 crc kubenswrapper[4707]: W0121 15:15:00.799126 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23aa5a49_72b2_4855_8655_448300642a47.slice/crio-d8ab4f99ff578e2acd1c4da69b7bf32ac8c20b1ddaee722c488abd54ba7128e2 WatchSource:0}: Error finding container d8ab4f99ff578e2acd1c4da69b7bf32ac8c20b1ddaee722c488abd54ba7128e2: Status 404 returned error can't find the container with id d8ab4f99ff578e2acd1c4da69b7bf32ac8c20b1ddaee722c488abd54ba7128e2 Jan 21 15:15:01 crc kubenswrapper[4707]: I0121 15:15:01.012168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sr7pd"] Jan 21 15:15:01 crc kubenswrapper[4707]: W0121 15:15:01.067622 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6473b2ca_fd25_446b_8c6a_76e29eff2104.slice/crio-44a9e165651e082de9c3fd3f57dfb2732b644c06cd90fc27ffc59fc0d92b8035 WatchSource:0}: Error finding container 44a9e165651e082de9c3fd3f57dfb2732b644c06cd90fc27ffc59fc0d92b8035: Status 404 returned error can't find the container with id 44a9e165651e082de9c3fd3f57dfb2732b644c06cd90fc27ffc59fc0d92b8035 Jan 21 15:15:01 crc kubenswrapper[4707]: I0121 15:15:01.498561 4707 generic.go:334] "Generic (PLEG): container finished" podID="23aa5a49-72b2-4855-8655-448300642a47" containerID="31247faff77ca4ed1bf2d135533279d04b627fd3bed08f97d6daa1ac0c92750a" exitCode=0 Jan 21 15:15:01 crc kubenswrapper[4707]: I0121 15:15:01.498606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" event={"ID":"23aa5a49-72b2-4855-8655-448300642a47","Type":"ContainerDied","Data":"31247faff77ca4ed1bf2d135533279d04b627fd3bed08f97d6daa1ac0c92750a"} Jan 21 15:15:01 crc kubenswrapper[4707]: I0121 15:15:01.498648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" event={"ID":"23aa5a49-72b2-4855-8655-448300642a47","Type":"ContainerStarted","Data":"d8ab4f99ff578e2acd1c4da69b7bf32ac8c20b1ddaee722c488abd54ba7128e2"} Jan 21 15:15:01 crc kubenswrapper[4707]: I0121 15:15:01.499441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sr7pd" event={"ID":"6473b2ca-fd25-446b-8c6a-76e29eff2104","Type":"ContainerStarted","Data":"44a9e165651e082de9c3fd3f57dfb2732b644c06cd90fc27ffc59fc0d92b8035"} Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.713208 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.842126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kscp9\" (UniqueName: \"kubernetes.io/projected/23aa5a49-72b2-4855-8655-448300642a47-kube-api-access-kscp9\") pod \"23aa5a49-72b2-4855-8655-448300642a47\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.842209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23aa5a49-72b2-4855-8655-448300642a47-secret-volume\") pod \"23aa5a49-72b2-4855-8655-448300642a47\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.842280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23aa5a49-72b2-4855-8655-448300642a47-config-volume\") pod \"23aa5a49-72b2-4855-8655-448300642a47\" (UID: \"23aa5a49-72b2-4855-8655-448300642a47\") " Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.842922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23aa5a49-72b2-4855-8655-448300642a47-config-volume" (OuterVolumeSpecName: "config-volume") pod "23aa5a49-72b2-4855-8655-448300642a47" (UID: "23aa5a49-72b2-4855-8655-448300642a47"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.846134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23aa5a49-72b2-4855-8655-448300642a47-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23aa5a49-72b2-4855-8655-448300642a47" (UID: "23aa5a49-72b2-4855-8655-448300642a47"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.846223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23aa5a49-72b2-4855-8655-448300642a47-kube-api-access-kscp9" (OuterVolumeSpecName: "kube-api-access-kscp9") pod "23aa5a49-72b2-4855-8655-448300642a47" (UID: "23aa5a49-72b2-4855-8655-448300642a47"). InnerVolumeSpecName "kube-api-access-kscp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.944023 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23aa5a49-72b2-4855-8655-448300642a47-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.944226 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kscp9\" (UniqueName: \"kubernetes.io/projected/23aa5a49-72b2-4855-8655-448300642a47-kube-api-access-kscp9\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:02 crc kubenswrapper[4707]: I0121 15:15:02.944239 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23aa5a49-72b2-4855-8655-448300642a47-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:03 crc kubenswrapper[4707]: I0121 15:15:03.510117 4707 generic.go:334] "Generic (PLEG): container finished" podID="6473b2ca-fd25-446b-8c6a-76e29eff2104" containerID="f213684e000592d6c88fd06d620047c601b5465bd66e1c2eae23b0ba1f084b37" exitCode=0 Jan 21 15:15:03 crc kubenswrapper[4707]: I0121 15:15:03.510179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sr7pd" event={"ID":"6473b2ca-fd25-446b-8c6a-76e29eff2104","Type":"ContainerDied","Data":"f213684e000592d6c88fd06d620047c601b5465bd66e1c2eae23b0ba1f084b37"} Jan 21 15:15:03 crc kubenswrapper[4707]: I0121 15:15:03.511362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" event={"ID":"23aa5a49-72b2-4855-8655-448300642a47","Type":"ContainerDied","Data":"d8ab4f99ff578e2acd1c4da69b7bf32ac8c20b1ddaee722c488abd54ba7128e2"} Jan 21 15:15:03 crc kubenswrapper[4707]: I0121 15:15:03.511388 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ab4f99ff578e2acd1c4da69b7bf32ac8c20b1ddaee722c488abd54ba7128e2" Jan 21 15:15:03 crc kubenswrapper[4707]: I0121 15:15:03.511400 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.700238 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.864348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dphrf\" (UniqueName: \"kubernetes.io/projected/6473b2ca-fd25-446b-8c6a-76e29eff2104-kube-api-access-dphrf\") pod \"6473b2ca-fd25-446b-8c6a-76e29eff2104\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.864481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6473b2ca-fd25-446b-8c6a-76e29eff2104-node-mnt\") pod \"6473b2ca-fd25-446b-8c6a-76e29eff2104\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.864561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6473b2ca-fd25-446b-8c6a-76e29eff2104-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6473b2ca-fd25-446b-8c6a-76e29eff2104" (UID: "6473b2ca-fd25-446b-8c6a-76e29eff2104"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.864651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6473b2ca-fd25-446b-8c6a-76e29eff2104-crc-storage\") pod \"6473b2ca-fd25-446b-8c6a-76e29eff2104\" (UID: \"6473b2ca-fd25-446b-8c6a-76e29eff2104\") " Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.865082 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6473b2ca-fd25-446b-8c6a-76e29eff2104-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.867903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6473b2ca-fd25-446b-8c6a-76e29eff2104-kube-api-access-dphrf" (OuterVolumeSpecName: "kube-api-access-dphrf") pod "6473b2ca-fd25-446b-8c6a-76e29eff2104" (UID: "6473b2ca-fd25-446b-8c6a-76e29eff2104"). InnerVolumeSpecName "kube-api-access-dphrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.877177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6473b2ca-fd25-446b-8c6a-76e29eff2104-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6473b2ca-fd25-446b-8c6a-76e29eff2104" (UID: "6473b2ca-fd25-446b-8c6a-76e29eff2104"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.965912 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6473b2ca-fd25-446b-8c6a-76e29eff2104-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:04 crc kubenswrapper[4707]: I0121 15:15:04.966111 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dphrf\" (UniqueName: \"kubernetes.io/projected/6473b2ca-fd25-446b-8c6a-76e29eff2104-kube-api-access-dphrf\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:05 crc kubenswrapper[4707]: I0121 15:15:05.522613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sr7pd" event={"ID":"6473b2ca-fd25-446b-8c6a-76e29eff2104","Type":"ContainerDied","Data":"44a9e165651e082de9c3fd3f57dfb2732b644c06cd90fc27ffc59fc0d92b8035"} Jan 21 15:15:05 crc kubenswrapper[4707]: I0121 15:15:05.522651 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a9e165651e082de9c3fd3f57dfb2732b644c06cd90fc27ffc59fc0d92b8035" Jan 21 15:15:05 crc kubenswrapper[4707]: I0121 15:15:05.522711 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sr7pd" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.626045 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-sr7pd"] Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.629459 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-sr7pd"] Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.733226 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qjrxp"] Jan 21 15:15:07 crc kubenswrapper[4707]: E0121 15:15:07.733486 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23aa5a49-72b2-4855-8655-448300642a47" containerName="collect-profiles" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.733503 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23aa5a49-72b2-4855-8655-448300642a47" containerName="collect-profiles" Jan 21 15:15:07 crc kubenswrapper[4707]: E0121 15:15:07.733515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6473b2ca-fd25-446b-8c6a-76e29eff2104" containerName="storage" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.733521 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6473b2ca-fd25-446b-8c6a-76e29eff2104" containerName="storage" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.733640 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6473b2ca-fd25-446b-8c6a-76e29eff2104" containerName="storage" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.733652 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23aa5a49-72b2-4855-8655-448300642a47" containerName="collect-profiles" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.734057 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.735955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.735988 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.736069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.736150 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.738220 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qjrxp"] Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.798324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pdj\" (UniqueName: \"kubernetes.io/projected/2def2a51-5e8a-44c8-a7a0-99652042de6c-kube-api-access-p4pdj\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.798411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2def2a51-5e8a-44c8-a7a0-99652042de6c-node-mnt\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.798594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2def2a51-5e8a-44c8-a7a0-99652042de6c-crc-storage\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.899591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2def2a51-5e8a-44c8-a7a0-99652042de6c-crc-storage\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.899638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pdj\" (UniqueName: \"kubernetes.io/projected/2def2a51-5e8a-44c8-a7a0-99652042de6c-kube-api-access-p4pdj\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.899679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2def2a51-5e8a-44c8-a7a0-99652042de6c-node-mnt\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.899914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2def2a51-5e8a-44c8-a7a0-99652042de6c-node-mnt\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " 
pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.900272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2def2a51-5e8a-44c8-a7a0-99652042de6c-crc-storage\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:07 crc kubenswrapper[4707]: I0121 15:15:07.914508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pdj\" (UniqueName: \"kubernetes.io/projected/2def2a51-5e8a-44c8-a7a0-99652042de6c-kube-api-access-p4pdj\") pod \"crc-storage-crc-qjrxp\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:08 crc kubenswrapper[4707]: I0121 15:15:08.048557 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:08 crc kubenswrapper[4707]: I0121 15:15:08.388677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qjrxp"] Jan 21 15:15:08 crc kubenswrapper[4707]: I0121 15:15:08.536724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qjrxp" event={"ID":"2def2a51-5e8a-44c8-a7a0-99652042de6c","Type":"ContainerStarted","Data":"8a2e9d2e0c6e746cb7fab324f42981eacfc996065ec2b2ad0aa403b54c632a15"} Jan 21 15:15:09 crc kubenswrapper[4707]: I0121 15:15:09.188569 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6473b2ca-fd25-446b-8c6a-76e29eff2104" path="/var/lib/kubelet/pods/6473b2ca-fd25-446b-8c6a-76e29eff2104/volumes" Jan 21 15:15:09 crc kubenswrapper[4707]: I0121 15:15:09.542596 4707 generic.go:334] "Generic (PLEG): container finished" podID="2def2a51-5e8a-44c8-a7a0-99652042de6c" containerID="b7e4d4cbcccddede310a26fc6b0f9056fadab3f4b99c92bb4024f7818ae6104e" exitCode=0 Jan 21 15:15:09 crc kubenswrapper[4707]: I0121 15:15:09.542919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qjrxp" event={"ID":"2def2a51-5e8a-44c8-a7a0-99652042de6c","Type":"ContainerDied","Data":"b7e4d4cbcccddede310a26fc6b0f9056fadab3f4b99c92bb4024f7818ae6104e"} Jan 21 15:15:09 crc kubenswrapper[4707]: I0121 15:15:09.945600 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:15:09 crc kubenswrapper[4707]: I0121 15:15:09.945660 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:15:10 crc kubenswrapper[4707]: I0121 15:15:10.848098 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.035466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2def2a51-5e8a-44c8-a7a0-99652042de6c-crc-storage\") pod \"2def2a51-5e8a-44c8-a7a0-99652042de6c\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.035504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2def2a51-5e8a-44c8-a7a0-99652042de6c-node-mnt\") pod \"2def2a51-5e8a-44c8-a7a0-99652042de6c\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.035553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4pdj\" (UniqueName: \"kubernetes.io/projected/2def2a51-5e8a-44c8-a7a0-99652042de6c-kube-api-access-p4pdj\") pod \"2def2a51-5e8a-44c8-a7a0-99652042de6c\" (UID: \"2def2a51-5e8a-44c8-a7a0-99652042de6c\") " Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.035604 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2def2a51-5e8a-44c8-a7a0-99652042de6c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2def2a51-5e8a-44c8-a7a0-99652042de6c" (UID: "2def2a51-5e8a-44c8-a7a0-99652042de6c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.035907 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2def2a51-5e8a-44c8-a7a0-99652042de6c-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.049202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2def2a51-5e8a-44c8-a7a0-99652042de6c-kube-api-access-p4pdj" (OuterVolumeSpecName: "kube-api-access-p4pdj") pod "2def2a51-5e8a-44c8-a7a0-99652042de6c" (UID: "2def2a51-5e8a-44c8-a7a0-99652042de6c"). InnerVolumeSpecName "kube-api-access-p4pdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.049786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2def2a51-5e8a-44c8-a7a0-99652042de6c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2def2a51-5e8a-44c8-a7a0-99652042de6c" (UID: "2def2a51-5e8a-44c8-a7a0-99652042de6c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.137152 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4pdj\" (UniqueName: \"kubernetes.io/projected/2def2a51-5e8a-44c8-a7a0-99652042de6c-kube-api-access-p4pdj\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.137182 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2def2a51-5e8a-44c8-a7a0-99652042de6c-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.559677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qjrxp" event={"ID":"2def2a51-5e8a-44c8-a7a0-99652042de6c","Type":"ContainerDied","Data":"8a2e9d2e0c6e746cb7fab324f42981eacfc996065ec2b2ad0aa403b54c632a15"} Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.559716 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a2e9d2e0c6e746cb7fab324f42981eacfc996065ec2b2ad0aa403b54c632a15" Jan 21 15:15:11 crc kubenswrapper[4707]: I0121 15:15:11.559731 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qjrxp" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.910567 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst"] Jan 21 15:15:13 crc kubenswrapper[4707]: E0121 15:15:13.911047 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2def2a51-5e8a-44c8-a7a0-99652042de6c" containerName="storage" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.911060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2def2a51-5e8a-44c8-a7a0-99652042de6c" containerName="storage" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.911196 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2def2a51-5e8a-44c8-a7a0-99652042de6c" containerName="storage" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.911802 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.914355 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst"] Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.914366 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openshift-service-ca.crt" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.914399 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dnsmasq" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.914538 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kube-root-ca.crt" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.916069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dnsmasq-dnsmasq-dockercfg-wnmj9" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.966328 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9"] Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.967235 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.968863 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dnsmasq-svc" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.969618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.969662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrcg\" (UniqueName: \"kubernetes.io/projected/27118245-cbcd-4382-97b9-d1c36df77cee-kube-api-access-jvrcg\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.969687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2xs\" (UniqueName: \"kubernetes.io/projected/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-kube-api-access-lz2xs\") pod \"dnsmasq-dnsmasq-f5849d7b9-vwpst\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.969748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.969763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-config\") pod \"dnsmasq-dnsmasq-f5849d7b9-vwpst\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:13 crc kubenswrapper[4707]: I0121 15:15:13.976136 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9"] Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.070546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-config\") pod \"dnsmasq-dnsmasq-f5849d7b9-vwpst\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.070732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.070854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.071059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrcg\" (UniqueName: \"kubernetes.io/projected/27118245-cbcd-4382-97b9-d1c36df77cee-kube-api-access-jvrcg\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.071397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz2xs\" (UniqueName: \"kubernetes.io/projected/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-kube-api-access-lz2xs\") pod \"dnsmasq-dnsmasq-f5849d7b9-vwpst\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.071282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-config\") pod \"dnsmasq-dnsmasq-f5849d7b9-vwpst\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.071677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.071692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.085353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz2xs\" (UniqueName: \"kubernetes.io/projected/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-kube-api-access-lz2xs\") pod \"dnsmasq-dnsmasq-f5849d7b9-vwpst\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.085429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrcg\" (UniqueName: \"kubernetes.io/projected/27118245-cbcd-4382-97b9-d1c36df77cee-kube-api-access-jvrcg\") pod \"dnsmasq-dnsmasq-84b9f45d47-57kz9\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.227832 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.277588 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.673804 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst"] Jan 21 15:15:14 crc kubenswrapper[4707]: I0121 15:15:14.696932 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9"] Jan 21 15:15:14 crc kubenswrapper[4707]: W0121 15:15:14.698554 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27118245_cbcd_4382_97b9_d1c36df77cee.slice/crio-c494924c0291252805e9e7f1a499b9220c10e36ff9d9c7123d5beb687fb2994f WatchSource:0}: Error finding container c494924c0291252805e9e7f1a499b9220c10e36ff9d9c7123d5beb687fb2994f: Status 404 returned error can't find the container with id c494924c0291252805e9e7f1a499b9220c10e36ff9d9c7123d5beb687fb2994f Jan 21 15:15:15 crc kubenswrapper[4707]: I0121 15:15:15.585535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" event={"ID":"27118245-cbcd-4382-97b9-d1c36df77cee","Type":"ContainerStarted","Data":"c494924c0291252805e9e7f1a499b9220c10e36ff9d9c7123d5beb687fb2994f"} Jan 21 15:15:15 crc kubenswrapper[4707]: I0121 15:15:15.586743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" event={"ID":"74c0e744-8d0b-400a-a672-b9f40a2c8ec9","Type":"ContainerStarted","Data":"5d7b97faa88a803d3204b0c07c539f1c63f2ed1bc4e4b8a2c7b3acb4550671e3"} Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.468723 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s8jx4"] Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.471504 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.477213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8jx4"] Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.519928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-utilities\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.520016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-catalog-content\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.520049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2m7v\" (UniqueName: \"kubernetes.io/projected/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-kube-api-access-z2m7v\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.620802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-catalog-content\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.620940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2m7v\" (UniqueName: \"kubernetes.io/projected/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-kube-api-access-z2m7v\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.621059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-utilities\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.621150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-catalog-content\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.621614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-utilities\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.636704 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z2m7v\" (UniqueName: \"kubernetes.io/projected/d67efa67-67d4-4f70-ac0e-cdbd6f76d739-kube-api-access-z2m7v\") pod \"community-operators-s8jx4\" (UID: \"d67efa67-67d4-4f70-ac0e-cdbd6f76d739\") " pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:17 crc kubenswrapper[4707]: I0121 15:15:17.790102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.339688 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.341209 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.345294 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.345401 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.345529 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.345775 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.345971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-sz77f" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.346037 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.346321 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.346156 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x9qr\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-kube-api-access-9x9qr\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/945ded51-9961-411b-b00f-0543ac91a18a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396129 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/945ded51-9961-411b-b00f-0543ac91a18a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.396971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.397090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.397243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.498841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.498912 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9qr\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-kube-api-access-9x9qr\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.498930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/945ded51-9961-411b-b00f-0543ac91a18a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.498946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.498966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/945ded51-9961-411b-b00f-0543ac91a18a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.498987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.499000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.499014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.499028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.499050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.499082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.500200 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.500974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.500978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.501381 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.501942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.502173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.504792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.504793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/945ded51-9961-411b-b00f-0543ac91a18a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.504793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.505453 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/945ded51-9961-411b-b00f-0543ac91a18a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.513224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9qr\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-kube-api-access-9x9qr\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.517538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:23 crc kubenswrapper[4707]: I0121 15:15:23.660956 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.160009 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.160979 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.163950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.164520 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.164701 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.165044 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.165534 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.165915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-rj8mh" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.165927 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.175346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.206915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.206959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjd4\" (UniqueName: 
\"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-kube-api-access-8mjd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.207441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308584 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.309473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.310020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.310193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.308331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.314426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.314482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.314502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mjd4\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-kube-api-access-8mjd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.314540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.314838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.321135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.324427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.324881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.326449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.327252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjd4\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-kube-api-access-8mjd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.331740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:24 crc kubenswrapper[4707]: I0121 15:15:24.477419 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.521428 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.522742 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.525518 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.525629 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.525993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-6xxcv" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.526259 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.529558 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.529754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kolla-config\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5t5w\" (UniqueName: \"kubernetes.io/projected/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kube-api-access-j5t5w\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-default\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.731652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-default\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.832882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kolla-config\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 
15:15:25.832916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5t5w\" (UniqueName: \"kubernetes.io/projected/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kube-api-access-j5t5w\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.833263 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.833717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-default\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.833748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.834341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kolla-config\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.834875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.836505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.836563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.846467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.847081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5t5w\" (UniqueName: 
\"kubernetes.io/projected/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kube-api-access-j5t5w\") pod \"openstack-galera-0\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.872501 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8jx4"] Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.886222 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:25 crc kubenswrapper[4707]: I0121 15:15:25.949666 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:15:25 crc kubenswrapper[4707]: W0121 15:15:25.966487 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945ded51_9961_411b_b00f_0543ac91a18a.slice/crio-c52196e4c3b9f97fab04d935d73b2d868fa4a476add586a3c4280639cd2164fe WatchSource:0}: Error finding container c52196e4c3b9f97fab04d935d73b2d868fa4a476add586a3c4280639cd2164fe: Status 404 returned error can't find the container with id c52196e4c3b9f97fab04d935d73b2d868fa4a476add586a3c4280639cd2164fe Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.022475 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.317553 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.652933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"945ded51-9961-411b-b00f-0543ac91a18a","Type":"ContainerStarted","Data":"c52196e4c3b9f97fab04d935d73b2d868fa4a476add586a3c4280639cd2164fe"} Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.655349 4707 generic.go:334] "Generic (PLEG): container finished" podID="27118245-cbcd-4382-97b9-d1c36df77cee" containerID="7e535854d25c09adbae2b264433f7ae08897fe4ace48903a5398f03ba12b2dd3" exitCode=0 Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.655397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" event={"ID":"27118245-cbcd-4382-97b9-d1c36df77cee","Type":"ContainerDied","Data":"7e535854d25c09adbae2b264433f7ae08897fe4ace48903a5398f03ba12b2dd3"} Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.656859 4707 generic.go:334] "Generic (PLEG): container finished" podID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerID="de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882" exitCode=0 Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.656921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" event={"ID":"74c0e744-8d0b-400a-a672-b9f40a2c8ec9","Type":"ContainerDied","Data":"de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882"} Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.660022 4707 generic.go:334] "Generic (PLEG): container finished" podID="d67efa67-67d4-4f70-ac0e-cdbd6f76d739" containerID="2fb876329e4ef133afa58a7fa6372aaebdc0901f0b27c3f06e47014db1fc6c47" exitCode=0 Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.660078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8jx4" 
event={"ID":"d67efa67-67d4-4f70-ac0e-cdbd6f76d739","Type":"ContainerDied","Data":"2fb876329e4ef133afa58a7fa6372aaebdc0901f0b27c3f06e47014db1fc6c47"} Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.660096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8jx4" event={"ID":"d67efa67-67d4-4f70-ac0e-cdbd6f76d739","Type":"ContainerStarted","Data":"827a3beeae5affa3cf5973fc9b432f932d8d5a9ec136592189a0ecf91bf4e8dd"} Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.661566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"bce9bbed-aa2b-429e-85f7-c4202d8af65f","Type":"ContainerStarted","Data":"eeabadc126ba92429b09e06078c52edaa20284bf78c5a0f10ee560b5359e7f3e"} Jan 21 15:15:26 crc kubenswrapper[4707]: I0121 15:15:26.662626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0","Type":"ContainerStarted","Data":"4eb05e2d5e2d8eb167bc5585a41c3eeb5817bc995c97104c6b8e234d4dbfd1d8"} Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.009616 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.012137 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.014526 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.014733 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.014871 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-c6gd4" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.015094 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.021757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.148787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.148949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.148979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.149026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.149052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxfc\" (UniqueName: \"kubernetes.io/projected/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kube-api-access-6vxfc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.149097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.149121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.149138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.249879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.249936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.249983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.250015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxfc\" 
(UniqueName: \"kubernetes.io/projected/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kube-api-access-6vxfc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.250046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.250071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.250086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.250109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.250389 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.251174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.252592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.252682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.253065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.257447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.267355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxfc\" (UniqueName: \"kubernetes.io/projected/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kube-api-access-6vxfc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.271703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.276381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.283997 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.284980 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.288864 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.289490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-p6w6j" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.289617 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.298071 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.340199 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.455190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.456781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.456826 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-kolla-config\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.456948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7ph\" (UniqueName: \"kubernetes.io/projected/9a17de20-67a0-4578-92a4-e19819061791-kube-api-access-2w7ph\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.456985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-config-data\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.558689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7ph\" (UniqueName: \"kubernetes.io/projected/9a17de20-67a0-4578-92a4-e19819061791-kube-api-access-2w7ph\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.558745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-config-data\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.558789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.558855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: 
I0121 15:15:27.558872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-kolla-config\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.561509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-kolla-config\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.563887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-config-data\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.565985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.566058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.575097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7ph\" (UniqueName: \"kubernetes.io/projected/9a17de20-67a0-4578-92a4-e19819061791-kube-api-access-2w7ph\") pod \"memcached-0\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.622258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.670799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" event={"ID":"27118245-cbcd-4382-97b9-d1c36df77cee","Type":"ContainerStarted","Data":"17e5c220eb052e49499145259c9bfa39182e05fd853538d79c61f7b134299f6e"} Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.670980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.673653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" event={"ID":"74c0e744-8d0b-400a-a672-b9f40a2c8ec9","Type":"ContainerStarted","Data":"1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09"} Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.684730 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" podStartSLOduration=3.757723531 podStartE2EDuration="14.684711851s" podCreationTimestamp="2026-01-21 15:15:13 +0000 UTC" firstStartedPulling="2026-01-21 15:15:14.700286249 +0000 UTC m=+811.881802471" lastFinishedPulling="2026-01-21 15:15:25.627274569 +0000 UTC m=+822.808790791" observedRunningTime="2026-01-21 15:15:27.681499589 +0000 UTC m=+824.863015811" watchObservedRunningTime="2026-01-21 15:15:27.684711851 +0000 UTC m=+824.866228074" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.702404 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" podStartSLOduration=3.785173736 podStartE2EDuration="14.702388171s" podCreationTimestamp="2026-01-21 15:15:13 +0000 UTC" firstStartedPulling="2026-01-21 15:15:14.67043586 +0000 UTC m=+811.851952082" lastFinishedPulling="2026-01-21 15:15:25.587650296 +0000 UTC m=+822.769166517" observedRunningTime="2026-01-21 15:15:27.6962735 +0000 UTC m=+824.877789721" watchObservedRunningTime="2026-01-21 15:15:27.702388171 +0000 UTC m=+824.883904393" Jan 21 15:15:27 crc kubenswrapper[4707]: I0121 15:15:27.751595 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:15:28 crc kubenswrapper[4707]: I0121 15:15:28.682957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"12ab3a9a-6d12-45c8-b029-d95a3c4d4608","Type":"ContainerStarted","Data":"49372a50539087ac1f392ac1cf5c62f19e613df68d001ddee9eefdebb54cae9f"} Jan 21 15:15:28 crc kubenswrapper[4707]: I0121 15:15:28.683313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:28 crc kubenswrapper[4707]: I0121 15:15:28.856738 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:15:28 crc kubenswrapper[4707]: W0121 15:15:28.868549 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a17de20_67a0_4578_92a4_e19819061791.slice/crio-ebf4b45e59de68beb87c75489ddb1d6c6c25b416f28d16668ecaa31fccb9c7e4 WatchSource:0}: Error finding container ebf4b45e59de68beb87c75489ddb1d6c6c25b416f28d16668ecaa31fccb9c7e4: Status 404 returned error can't find the container with id 
ebf4b45e59de68beb87c75489ddb1d6c6c25b416f28d16668ecaa31fccb9c7e4 Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.176584 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.177493 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.179136 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-4cqsp" Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.191012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrchp\" (UniqueName: \"kubernetes.io/projected/73f7e559-7400-4d89-878e-7df47a936c82-kube-api-access-rrchp\") pod \"kube-state-metrics-0\" (UID: \"73f7e559-7400-4d89-878e-7df47a936c82\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.205151 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.293583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrchp\" (UniqueName: \"kubernetes.io/projected/73f7e559-7400-4d89-878e-7df47a936c82-kube-api-access-rrchp\") pod \"kube-state-metrics-0\" (UID: \"73f7e559-7400-4d89-878e-7df47a936c82\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.307785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrchp\" (UniqueName: \"kubernetes.io/projected/73f7e559-7400-4d89-878e-7df47a936c82-kube-api-access-rrchp\") pod \"kube-state-metrics-0\" (UID: \"73f7e559-7400-4d89-878e-7df47a936c82\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.503162 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.690193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9a17de20-67a0-4578-92a4-e19819061791","Type":"ContainerStarted","Data":"ebf4b45e59de68beb87c75489ddb1d6c6c25b416f28d16668ecaa31fccb9c7e4"} Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.691346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0","Type":"ContainerStarted","Data":"f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178"} Jan 21 15:15:29 crc kubenswrapper[4707]: I0121 15:15:29.694007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"945ded51-9961-411b-b00f-0543ac91a18a","Type":"ContainerStarted","Data":"3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0"} Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.023571 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.024871 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.028116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-p8kgz" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.028133 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.028239 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.028279 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.030317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.031683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.134696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.134760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.134844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.134861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.134884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.134967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc 
kubenswrapper[4707]: I0121 15:15:32.135009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-config\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.135032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfsdg\" (UniqueName: \"kubernetes.io/projected/807924a2-cb11-45c3-9a2a-f9864c1b3a78-kube-api-access-nfsdg\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.222105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-config\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.235945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfsdg\" (UniqueName: \"kubernetes.io/projected/807924a2-cb11-45c3-9a2a-f9864c1b3a78-kube-api-access-nfsdg\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.236135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.236405 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.237076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.237511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-config\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.239974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.239996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.240084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.250660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfsdg\" (UniqueName: \"kubernetes.io/projected/807924a2-cb11-45c3-9a2a-f9864c1b3a78-kube-api-access-nfsdg\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.253713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.344961 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.699728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.712725 4707 generic.go:334] "Generic (PLEG): container finished" podID="d67efa67-67d4-4f70-ac0e-cdbd6f76d739" containerID="3f9f2ffde9b2fb63b8972e8c09109cb2771629883dd0bd6ca25f78f8bc43493a" exitCode=0 Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.712824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8jx4" event={"ID":"d67efa67-67d4-4f70-ac0e-cdbd6f76d739","Type":"ContainerDied","Data":"3f9f2ffde9b2fb63b8972e8c09109cb2771629883dd0bd6ca25f78f8bc43493a"} Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.715570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"bce9bbed-aa2b-429e-85f7-c4202d8af65f","Type":"ContainerStarted","Data":"30cfdca99688f8422580aaa82fb53297290e4107488f50f74bafac30cdb605e2"} Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.716519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"73f7e559-7400-4d89-878e-7df47a936c82","Type":"ContainerStarted","Data":"ed63453753c0e42b120a0f0c169fd6bfb2161a53177cfded6b95974546919f06"} Jan 21 15:15:32 crc kubenswrapper[4707]: I0121 15:15:32.717746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"12ab3a9a-6d12-45c8-b029-d95a3c4d4608","Type":"ContainerStarted","Data":"492abccf81ca00d53959d35041fea378a062143c55d54af1e3316598d6f94cea"} Jan 21 15:15:33 crc kubenswrapper[4707]: W0121 15:15:33.063259 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807924a2_cb11_45c3_9a2a_f9864c1b3a78.slice/crio-4ab1d1446839667e5f37d07c92435d73b93a559783b97d35a3a4d356e98b74ac WatchSource:0}: Error finding container 4ab1d1446839667e5f37d07c92435d73b93a559783b97d35a3a4d356e98b74ac: Status 404 returned error can't find the container with id 4ab1d1446839667e5f37d07c92435d73b93a559783b97d35a3a4d356e98b74ac Jan 21 15:15:33 crc kubenswrapper[4707]: I0121 15:15:33.724500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9a17de20-67a0-4578-92a4-e19819061791","Type":"ContainerStarted","Data":"96a5811f181efafd4ec7d45b252f716be9d17274a6d4379830931b1b93eb6d88"} Jan 21 15:15:33 crc kubenswrapper[4707]: I0121 15:15:33.724804 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:33 crc kubenswrapper[4707]: I0121 15:15:33.725407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"807924a2-cb11-45c3-9a2a-f9864c1b3a78","Type":"ContainerStarted","Data":"4ab1d1446839667e5f37d07c92435d73b93a559783b97d35a3a4d356e98b74ac"} Jan 21 15:15:33 crc kubenswrapper[4707]: I0121 15:15:33.738339 4707 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.520034596 podStartE2EDuration="6.738326589s" podCreationTimestamp="2026-01-21 15:15:27 +0000 UTC" firstStartedPulling="2026-01-21 15:15:28.870358615 +0000 UTC m=+826.051874837" lastFinishedPulling="2026-01-21 15:15:33.088650608 +0000 UTC m=+830.270166830" observedRunningTime="2026-01-21 15:15:33.735708566 +0000 UTC m=+830.917224787" watchObservedRunningTime="2026-01-21 15:15:33.738326589 +0000 UTC m=+830.919842811" Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.228985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.281061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.317609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst"] Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.732449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8jx4" event={"ID":"d67efa67-67d4-4f70-ac0e-cdbd6f76d739","Type":"ContainerStarted","Data":"c3004758ba59b29296ead88a62cdee302745d034ce43b7aa4484897ba1d0d582"} Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.735222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"73f7e559-7400-4d89-878e-7df47a936c82","Type":"ContainerStarted","Data":"d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be"} Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.735257 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.735377 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerName="dnsmasq-dns" containerID="cri-o://1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09" gracePeriod=10 Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.749617 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s8jx4" podStartSLOduration=9.964670353 podStartE2EDuration="17.749603062s" podCreationTimestamp="2026-01-21 15:15:17 +0000 UTC" firstStartedPulling="2026-01-21 15:15:26.66133627 +0000 UTC m=+823.842852493" lastFinishedPulling="2026-01-21 15:15:34.44626898 +0000 UTC m=+831.627785202" observedRunningTime="2026-01-21 15:15:34.748258232 +0000 UTC m=+831.929774454" watchObservedRunningTime="2026-01-21 15:15:34.749603062 +0000 UTC m=+831.931119283" Jan 21 15:15:34 crc kubenswrapper[4707]: I0121 15:15:34.760607 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=3.535208894 podStartE2EDuration="5.760594146s" podCreationTimestamp="2026-01-21 15:15:29 +0000 UTC" firstStartedPulling="2026-01-21 15:15:32.231718728 +0000 UTC m=+829.413234950" lastFinishedPulling="2026-01-21 15:15:34.45710398 +0000 UTC m=+831.638620202" observedRunningTime="2026-01-21 15:15:34.75834228 +0000 UTC m=+831.939858502" watchObservedRunningTime="2026-01-21 15:15:34.760594146 +0000 UTC m=+831.942110368" Jan 21 15:15:35 
crc kubenswrapper[4707]: I0121 15:15:35.089532 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.091293 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.093337 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.093382 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-8h6xz" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.093617 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.093671 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.096664 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.150671 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-config\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186634 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whv2k\" (UniqueName: \"kubernetes.io/projected/2950a343-0190-497c-8775-4a5a6ddb13fa-kube-api-access-whv2k\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.186681 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-config\") pod \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz2xs\" (UniqueName: \"kubernetes.io/projected/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-kube-api-access-lz2xs\") pod \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\" (UID: \"74c0e744-8d0b-400a-a672-b9f40a2c8ec9\") " Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-config\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whv2k\" (UniqueName: \"kubernetes.io/projected/2950a343-0190-497c-8775-4a5a6ddb13fa-kube-api-access-whv2k\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") 
pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.287984 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.288067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.288973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.289944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-config\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.294099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.294597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-kube-api-access-lz2xs" (OuterVolumeSpecName: "kube-api-access-lz2xs") pod "74c0e744-8d0b-400a-a672-b9f40a2c8ec9" (UID: "74c0e744-8d0b-400a-a672-b9f40a2c8ec9"). InnerVolumeSpecName "kube-api-access-lz2xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.295057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.296329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.300125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whv2k\" (UniqueName: \"kubernetes.io/projected/2950a343-0190-497c-8775-4a5a6ddb13fa-kube-api-access-whv2k\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.304300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.318614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-config" (OuterVolumeSpecName: "config") pod "74c0e744-8d0b-400a-a672-b9f40a2c8ec9" (UID: "74c0e744-8d0b-400a-a672-b9f40a2c8ec9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.389552 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.389581 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz2xs\" (UniqueName: \"kubernetes.io/projected/74c0e744-8d0b-400a-a672-b9f40a2c8ec9-kube-api-access-lz2xs\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.460897 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.742752 4707 generic.go:334] "Generic (PLEG): container finished" podID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerID="492abccf81ca00d53959d35041fea378a062143c55d54af1e3316598d6f94cea" exitCode=0 Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.742820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"12ab3a9a-6d12-45c8-b029-d95a3c4d4608","Type":"ContainerDied","Data":"492abccf81ca00d53959d35041fea378a062143c55d54af1e3316598d6f94cea"} Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.744831 4707 generic.go:334] "Generic (PLEG): container finished" podID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerID="1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09" exitCode=0 Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.744886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" event={"ID":"74c0e744-8d0b-400a-a672-b9f40a2c8ec9","Type":"ContainerDied","Data":"1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09"} Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.744923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" event={"ID":"74c0e744-8d0b-400a-a672-b9f40a2c8ec9","Type":"ContainerDied","Data":"5d7b97faa88a803d3204b0c07c539f1c63f2ed1bc4e4b8a2c7b3acb4550671e3"} Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.744943 4707 scope.go:117] "RemoveContainer" containerID="1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.745055 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst" Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.747525 4707 generic.go:334] "Generic (PLEG): container finished" podID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerID="30cfdca99688f8422580aaa82fb53297290e4107488f50f74bafac30cdb605e2" exitCode=0 Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.747602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"bce9bbed-aa2b-429e-85f7-c4202d8af65f","Type":"ContainerDied","Data":"30cfdca99688f8422580aaa82fb53297290e4107488f50f74bafac30cdb605e2"} Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.792252 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst"] Jan 21 15:15:35 crc kubenswrapper[4707]: I0121 15:15:35.798731 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-vwpst"] Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.504516 4707 scope.go:117] "RemoveContainer" containerID="de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.621124 4707 scope.go:117] "RemoveContainer" containerID="1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09" Jan 21 15:15:36 crc kubenswrapper[4707]: E0121 15:15:36.622346 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09\": container with ID starting with 1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09 not found: ID does not exist" containerID="1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.622383 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09"} err="failed to get container status \"1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09\": rpc error: code = NotFound desc = could not find container \"1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09\": container with ID starting with 1b5ac5e7959f3f995c28a4935b00b32eac60da10c83e574a6545d3767b6a8a09 not found: ID does not exist" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.622440 4707 scope.go:117] "RemoveContainer" containerID="de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882" Jan 21 15:15:36 crc kubenswrapper[4707]: E0121 15:15:36.622998 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882\": container with ID starting with de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882 not found: ID does not exist" containerID="de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.623017 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882"} err="failed to get container status \"de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882\": rpc error: code = NotFound desc = could not find container \"de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882\": container with 
ID starting with de31dbeb663ee173924421270465a36ace3cbe66a24348eddb3b811072969882 not found: ID does not exist" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.755841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"12ab3a9a-6d12-45c8-b029-d95a3c4d4608","Type":"ContainerStarted","Data":"d8b0e90dec26a393971665049bd2f2d03a8dc6c3dd80036e2a8856b78a74cd21"} Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.759676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"bce9bbed-aa2b-429e-85f7-c4202d8af65f","Type":"ContainerStarted","Data":"93601405e0699ec0ef7e3daeaae4615c531db22fb7b3ae9899ac20bde048fe28"} Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.760837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"807924a2-cb11-45c3-9a2a-f9864c1b3a78","Type":"ContainerStarted","Data":"ca42ae0b900f3e6fb7e330dad2077769f96ba0b58c3627f0f8492e7eb0037984"} Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.771142 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=8.500212357 podStartE2EDuration="11.771127352s" podCreationTimestamp="2026-01-21 15:15:25 +0000 UTC" firstStartedPulling="2026-01-21 15:15:28.588290644 +0000 UTC m=+825.769806867" lastFinishedPulling="2026-01-21 15:15:31.85920564 +0000 UTC m=+829.040721862" observedRunningTime="2026-01-21 15:15:36.76905257 +0000 UTC m=+833.950568791" watchObservedRunningTime="2026-01-21 15:15:36.771127352 +0000 UTC m=+833.952643574" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.785428 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=7.252732142 podStartE2EDuration="12.78541132s" podCreationTimestamp="2026-01-21 15:15:24 +0000 UTC" firstStartedPulling="2026-01-21 15:15:26.325314363 +0000 UTC m=+823.506830584" lastFinishedPulling="2026-01-21 15:15:31.857993541 +0000 UTC m=+829.039509762" observedRunningTime="2026-01-21 15:15:36.782719135 +0000 UTC m=+833.964235368" watchObservedRunningTime="2026-01-21 15:15:36.78541132 +0000 UTC m=+833.966927542" Jan 21 15:15:36 crc kubenswrapper[4707]: I0121 15:15:36.900509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:15:36 crc kubenswrapper[4707]: W0121 15:15:36.902931 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2950a343_0190_497c_8775_4a5a6ddb13fa.slice/crio-cce4e6bec73efdd12ce027087c6ce202c46e22045d03e881efc56218dbee3581 WatchSource:0}: Error finding container cce4e6bec73efdd12ce027087c6ce202c46e22045d03e881efc56218dbee3581: Status 404 returned error can't find the container with id cce4e6bec73efdd12ce027087c6ce202c46e22045d03e881efc56218dbee3581 Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.188950 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" path="/var/lib/kubelet/pods/74c0e744-8d0b-400a-a672-b9f40a2c8ec9/volumes" Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.341175 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.341318 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.767234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2950a343-0190-497c-8775-4a5a6ddb13fa","Type":"ContainerStarted","Data":"cce4e6bec73efdd12ce027087c6ce202c46e22045d03e881efc56218dbee3581"} Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.791156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.791203 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:37 crc kubenswrapper[4707]: I0121 15:15:37.827219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.507375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.783625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"807924a2-cb11-45c3-9a2a-f9864c1b3a78","Type":"ContainerStarted","Data":"0bb066f5db13b28e6cfb66a090c7e690726483b9338472d0b4b47b4408310358"} Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.784916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2950a343-0190-497c-8775-4a5a6ddb13fa","Type":"ContainerStarted","Data":"ab66fa9636c2c8a7dc17154279e06b78d15894768f66c0e4dc03d9b7c66afb61"} Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.784947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2950a343-0190-497c-8775-4a5a6ddb13fa","Type":"ContainerStarted","Data":"ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383"} Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.798514 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.757313054 podStartE2EDuration="9.798498191s" podCreationTimestamp="2026-01-21 15:15:30 +0000 UTC" firstStartedPulling="2026-01-21 15:15:33.065760362 +0000 UTC m=+830.247276584" lastFinishedPulling="2026-01-21 15:15:39.106945509 +0000 UTC m=+836.288461721" observedRunningTime="2026-01-21 15:15:39.796975417 +0000 UTC m=+836.978491638" watchObservedRunningTime="2026-01-21 15:15:39.798498191 +0000 UTC m=+836.980014413" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.818498 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.626452018 podStartE2EDuration="5.818473184s" podCreationTimestamp="2026-01-21 15:15:34 +0000 UTC" firstStartedPulling="2026-01-21 15:15:36.904692027 +0000 UTC m=+834.086208249" lastFinishedPulling="2026-01-21 15:15:39.096713193 +0000 UTC m=+836.278229415" observedRunningTime="2026-01-21 15:15:39.811995159 +0000 UTC m=+836.993511371" watchObservedRunningTime="2026-01-21 15:15:39.818473184 +0000 UTC m=+836.999989406" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.945781 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.945854 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.945899 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.946605 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38eb7c71f338910a44fc52babc17693f7c677868f52954612f9a40b65ff03f90"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:15:39 crc kubenswrapper[4707]: I0121 15:15:39.946656 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://38eb7c71f338910a44fc52babc17693f7c677868f52954612f9a40b65ff03f90" gracePeriod=600 Jan 21 15:15:40 crc kubenswrapper[4707]: I0121 15:15:40.461187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:40 crc kubenswrapper[4707]: I0121 15:15:40.792570 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="38eb7c71f338910a44fc52babc17693f7c677868f52954612f9a40b65ff03f90" exitCode=0 Jan 21 15:15:40 crc kubenswrapper[4707]: I0121 15:15:40.792644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"38eb7c71f338910a44fc52babc17693f7c677868f52954612f9a40b65ff03f90"} Jan 21 15:15:40 crc kubenswrapper[4707]: I0121 15:15:40.792711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"cf1b4064e4934cd4c43802ba3e90cfa41897c007fa175bcbffaf2760e553c708"} Jan 21 15:15:40 crc kubenswrapper[4707]: I0121 15:15:40.792735 4707 scope.go:117] "RemoveContainer" containerID="5f362a525166796306b00e86a9f5d281cb9947b8704d1d3ea656f1842b6d253f" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.346542 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.372477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.395134 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.446187 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.461157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.798003 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:41 crc kubenswrapper[4707]: I0121 15:15:41.824616 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:15:42 crc kubenswrapper[4707]: I0121 15:15:42.623588 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.485713 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.512230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.644795 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:15:44 crc kubenswrapper[4707]: E0121 15:15:44.645120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerName="init" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.645134 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerName="init" Jan 21 15:15:44 crc kubenswrapper[4707]: E0121 15:15:44.645150 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerName="dnsmasq-dns" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.645156 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerName="dnsmasq-dns" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.645265 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c0e744-8d0b-400a-a672-b9f40a2c8ec9" containerName="dnsmasq-dns" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.645993 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.647403 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.647459 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.647934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.648028 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-46v4m" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.654418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-scripts\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd4mc\" (UniqueName: \"kubernetes.io/projected/9d458004-5d78-43b6-aae5-7afb3dfa0c31-kube-api-access-xd4mc\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.734979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-config\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-config\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-scripts\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd4mc\" (UniqueName: \"kubernetes.io/projected/9d458004-5d78-43b6-aae5-7afb3dfa0c31-kube-api-access-xd4mc\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.835982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.836849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-scripts\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.837226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.837373 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-config\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.840685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.840716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.841725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.848370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd4mc\" (UniqueName: \"kubernetes.io/projected/9d458004-5d78-43b6-aae5-7afb3dfa0c31-kube-api-access-xd4mc\") pod \"ovn-northd-0\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:44 crc kubenswrapper[4707]: I0121 15:15:44.960967 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:45 crc kubenswrapper[4707]: I0121 15:15:45.327234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:15:45 crc kubenswrapper[4707]: I0121 15:15:45.332364 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:15:45 crc kubenswrapper[4707]: I0121 15:15:45.820453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"9d458004-5d78-43b6-aae5-7afb3dfa0c31","Type":"ContainerStarted","Data":"bc7e5eb6486c9db4b4674e13aebddd0296ac1b5fd7302c7edf70fa0c8444ff41"} Jan 21 15:15:45 crc kubenswrapper[4707]: I0121 15:15:45.886550 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:45 crc kubenswrapper[4707]: I0121 15:15:45.886584 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:45 crc kubenswrapper[4707]: I0121 15:15:45.936586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.065356 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-cxjh9"] Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.066236 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.067570 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.071799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-cxjh9"] Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.158164 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-operator-scripts\") pod \"root-account-create-update-cxjh9\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.158225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vb6\" (UniqueName: \"kubernetes.io/projected/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-kube-api-access-55vb6\") pod \"root-account-create-update-cxjh9\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.259007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-operator-scripts\") pod \"root-account-create-update-cxjh9\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.259060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55vb6\" (UniqueName: \"kubernetes.io/projected/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-kube-api-access-55vb6\") pod \"root-account-create-update-cxjh9\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.259732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-operator-scripts\") pod \"root-account-create-update-cxjh9\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.278568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vb6\" (UniqueName: \"kubernetes.io/projected/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-kube-api-access-55vb6\") pod \"root-account-create-update-cxjh9\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.380803 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.726896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-cxjh9"] Jan 21 15:15:46 crc kubenswrapper[4707]: W0121 15:15:46.727979 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ab0a29_8e07_47aa_84d9_5cfc48d3b69f.slice/crio-b71b7b87792a50333604e5e0aa8e974818098a35c6aac09242ffcbfd3d0f529c WatchSource:0}: Error finding container b71b7b87792a50333604e5e0aa8e974818098a35c6aac09242ffcbfd3d0f529c: Status 404 returned error can't find the container with id b71b7b87792a50333604e5e0aa8e974818098a35c6aac09242ffcbfd3d0f529c Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.826980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"9d458004-5d78-43b6-aae5-7afb3dfa0c31","Type":"ContainerStarted","Data":"b31f4f9052448b785d44b5b6bb2e793bfd762ba6718264d80a5fa47c35b16237"} Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.827020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"9d458004-5d78-43b6-aae5-7afb3dfa0c31","Type":"ContainerStarted","Data":"9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77"} Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.827202 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.828105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" event={"ID":"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f","Type":"ContainerStarted","Data":"c54252794c0f19f6048f542f3e43812c6e224174c052c5045000b9e469f61138"} Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.828129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" event={"ID":"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f","Type":"ContainerStarted","Data":"b71b7b87792a50333604e5e0aa8e974818098a35c6aac09242ffcbfd3d0f529c"} Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.844004 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.852615486 podStartE2EDuration="2.843991034s" podCreationTimestamp="2026-01-21 15:15:44 +0000 UTC" firstStartedPulling="2026-01-21 15:15:45.332173251 +0000 UTC m=+842.513689473" lastFinishedPulling="2026-01-21 15:15:46.323548799 +0000 UTC m=+843.505065021" observedRunningTime="2026-01-21 15:15:46.838568985 +0000 UTC m=+844.020085206" watchObservedRunningTime="2026-01-21 15:15:46.843991034 +0000 UTC m=+844.025507256" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.852220 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" podStartSLOduration=0.852208262 podStartE2EDuration="852.208262ms" podCreationTimestamp="2026-01-21 15:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:15:46.847645929 +0000 UTC m=+844.029162141" watchObservedRunningTime="2026-01-21 15:15:46.852208262 +0000 UTC m=+844.033724484" Jan 21 15:15:46 crc kubenswrapper[4707]: I0121 15:15:46.879776 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.189508 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-zpcb6"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.190298 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.197193 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-zpcb6"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.257612 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.258344 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.259669 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.263775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.374260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8tgp\" (UniqueName: \"kubernetes.io/projected/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-kube-api-access-t8tgp\") pod \"keystone-db-create-zpcb6\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.374302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49l5q\" (UniqueName: \"kubernetes.io/projected/d5be7ae3-f509-470d-aa05-6369baaa2ec5-kube-api-access-49l5q\") pod \"keystone-37a4-account-create-update-jnqsq\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.374451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5be7ae3-f509-470d-aa05-6369baaa2ec5-operator-scripts\") pod \"keystone-37a4-account-create-update-jnqsq\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.374473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-operator-scripts\") pod \"keystone-db-create-zpcb6\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.475528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8tgp\" (UniqueName: \"kubernetes.io/projected/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-kube-api-access-t8tgp\") pod \"keystone-db-create-zpcb6\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 
21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.475566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49l5q\" (UniqueName: \"kubernetes.io/projected/d5be7ae3-f509-470d-aa05-6369baaa2ec5-kube-api-access-49l5q\") pod \"keystone-37a4-account-create-update-jnqsq\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.475642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5be7ae3-f509-470d-aa05-6369baaa2ec5-operator-scripts\") pod \"keystone-37a4-account-create-update-jnqsq\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.475661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-operator-scripts\") pod \"keystone-db-create-zpcb6\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.476384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-operator-scripts\") pod \"keystone-db-create-zpcb6\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.476448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5be7ae3-f509-470d-aa05-6369baaa2ec5-operator-scripts\") pod \"keystone-37a4-account-create-update-jnqsq\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.489120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8tgp\" (UniqueName: \"kubernetes.io/projected/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-kube-api-access-t8tgp\") pod \"keystone-db-create-zpcb6\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.489148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49l5q\" (UniqueName: \"kubernetes.io/projected/d5be7ae3-f509-470d-aa05-6369baaa2ec5-kube-api-access-49l5q\") pod \"keystone-37a4-account-create-update-jnqsq\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.501288 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.535218 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-r6zw4"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.536769 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.539617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-r6zw4"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.568555 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.644466 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.645481 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.648172 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.657701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.677982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc079f9-7c19-4944-9325-ce31a1688157-operator-scripts\") pod \"placement-db-create-r6zw4\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.678028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89trk\" (UniqueName: \"kubernetes.io/projected/6cc079f9-7c19-4944-9325-ce31a1688157-kube-api-access-89trk\") pod \"placement-db-create-r6zw4\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.779343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89trk\" (UniqueName: \"kubernetes.io/projected/6cc079f9-7c19-4944-9325-ce31a1688157-kube-api-access-89trk\") pod \"placement-db-create-r6zw4\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.779383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ebe52-43fe-40d8-909b-c1c1b617cd5a-operator-scripts\") pod \"placement-40f0-account-create-update-5hmgk\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.779574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc079f9-7c19-4944-9325-ce31a1688157-operator-scripts\") pod \"placement-db-create-r6zw4\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.779611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgb9c\" (UniqueName: 
\"kubernetes.io/projected/915ebe52-43fe-40d8-909b-c1c1b617cd5a-kube-api-access-zgb9c\") pod \"placement-40f0-account-create-update-5hmgk\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.780408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc079f9-7c19-4944-9325-ce31a1688157-operator-scripts\") pod \"placement-db-create-r6zw4\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.796273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89trk\" (UniqueName: \"kubernetes.io/projected/6cc079f9-7c19-4944-9325-ce31a1688157-kube-api-access-89trk\") pod \"placement-db-create-r6zw4\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.821200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s8jx4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.838931 4707 generic.go:334] "Generic (PLEG): container finished" podID="49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" containerID="c54252794c0f19f6048f542f3e43812c6e224174c052c5045000b9e469f61138" exitCode=0 Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.838979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" event={"ID":"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f","Type":"ContainerDied","Data":"c54252794c0f19f6048f542f3e43812c6e224174c052c5045000b9e469f61138"} Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.879053 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.881402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8jx4"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.881845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgb9c\" (UniqueName: \"kubernetes.io/projected/915ebe52-43fe-40d8-909b-c1c1b617cd5a-kube-api-access-zgb9c\") pod \"placement-40f0-account-create-update-5hmgk\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.881884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ebe52-43fe-40d8-909b-c1c1b617cd5a-operator-scripts\") pod \"placement-40f0-account-create-update-5hmgk\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.882803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ebe52-43fe-40d8-909b-c1c1b617cd5a-operator-scripts\") pod \"placement-40f0-account-create-update-5hmgk\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.895466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgb9c\" (UniqueName: \"kubernetes.io/projected/915ebe52-43fe-40d8-909b-c1c1b617cd5a-kube-api-access-zgb9c\") pod \"placement-40f0-account-create-update-5hmgk\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.928902 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdz4h"] Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.929101 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdz4h" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="registry-server" containerID="cri-o://1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985" gracePeriod=2 Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.934063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-zpcb6"] Jan 21 15:15:47 crc kubenswrapper[4707]: W0121 15:15:47.946058 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1559cd45_3a38_4dc8_8e5e_3e0a67542de4.slice/crio-7f119435b745f476125088d9abdc2344722bdf98162e2a3a177896773fb0ecbd WatchSource:0}: Error finding container 7f119435b745f476125088d9abdc2344722bdf98162e2a3a177896773fb0ecbd: Status 404 returned error can't find the container with id 7f119435b745f476125088d9abdc2344722bdf98162e2a3a177896773fb0ecbd Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.964556 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:47 crc kubenswrapper[4707]: I0121 15:15:47.991291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq"] Jan 21 15:15:48 crc kubenswrapper[4707]: W0121 15:15:48.021239 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5be7ae3_f509_470d_aa05_6369baaa2ec5.slice/crio-5e1cfd9bd57023d63279cf4b2b32e9a90d9022cff6f68fb896b5c587f81d074c WatchSource:0}: Error finding container 5e1cfd9bd57023d63279cf4b2b32e9a90d9022cff6f68fb896b5c587f81d074c: Status 404 returned error can't find the container with id 5e1cfd9bd57023d63279cf4b2b32e9a90d9022cff6f68fb896b5c587f81d074c Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.297703 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-r6zw4"] Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.395969 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk"] Jan 21 15:15:48 crc kubenswrapper[4707]: W0121 15:15:48.428960 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915ebe52_43fe_40d8_909b_c1c1b617cd5a.slice/crio-30b1e551110c63a086efa952437250472d6f6b12b2736cabaae8657a6b7b296b WatchSource:0}: Error finding container 30b1e551110c63a086efa952437250472d6f6b12b2736cabaae8657a6b7b296b: Status 404 returned error can't find the container with id 30b1e551110c63a086efa952437250472d6f6b12b2736cabaae8657a6b7b296b Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.534576 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.696437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-catalog-content\") pod \"8123c2d1-a229-4e16-9845-aa86f7849baa\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.696771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-utilities\") pod \"8123c2d1-a229-4e16-9845-aa86f7849baa\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.696855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d9rn\" (UniqueName: \"kubernetes.io/projected/8123c2d1-a229-4e16-9845-aa86f7849baa-kube-api-access-9d9rn\") pod \"8123c2d1-a229-4e16-9845-aa86f7849baa\" (UID: \"8123c2d1-a229-4e16-9845-aa86f7849baa\") " Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.697346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-utilities" (OuterVolumeSpecName: "utilities") pod "8123c2d1-a229-4e16-9845-aa86f7849baa" (UID: "8123c2d1-a229-4e16-9845-aa86f7849baa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.701160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8123c2d1-a229-4e16-9845-aa86f7849baa-kube-api-access-9d9rn" (OuterVolumeSpecName: "kube-api-access-9d9rn") pod "8123c2d1-a229-4e16-9845-aa86f7849baa" (UID: "8123c2d1-a229-4e16-9845-aa86f7849baa"). InnerVolumeSpecName "kube-api-access-9d9rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.741265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8123c2d1-a229-4e16-9845-aa86f7849baa" (UID: "8123c2d1-a229-4e16-9845-aa86f7849baa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.798784 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.798824 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123c2d1-a229-4e16-9845-aa86f7849baa-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.798835 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d9rn\" (UniqueName: \"kubernetes.io/projected/8123c2d1-a229-4e16-9845-aa86f7849baa-kube-api-access-9d9rn\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.845860 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5be7ae3-f509-470d-aa05-6369baaa2ec5" containerID="b795d9c6c95d2ea68b071f6114cbc0a1b83db6018018c159a2b056eaa4b5b499" exitCode=0 Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.845933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" event={"ID":"d5be7ae3-f509-470d-aa05-6369baaa2ec5","Type":"ContainerDied","Data":"b795d9c6c95d2ea68b071f6114cbc0a1b83db6018018c159a2b056eaa4b5b499"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.845957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" event={"ID":"d5be7ae3-f509-470d-aa05-6369baaa2ec5","Type":"ContainerStarted","Data":"5e1cfd9bd57023d63279cf4b2b32e9a90d9022cff6f68fb896b5c587f81d074c"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.848376 4707 generic.go:334] "Generic (PLEG): container finished" podID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerID="1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985" exitCode=0 Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.848410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdz4h" event={"ID":"8123c2d1-a229-4e16-9845-aa86f7849baa","Type":"ContainerDied","Data":"1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.848466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdz4h" 
event={"ID":"8123c2d1-a229-4e16-9845-aa86f7849baa","Type":"ContainerDied","Data":"f17c9112882ec4d7c6aa93f79b489e38a29f5bd498ca02f0adf0701c43e746ea"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.848468 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdz4h" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.848485 4707 scope.go:117] "RemoveContainer" containerID="1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.849656 4707 generic.go:334] "Generic (PLEG): container finished" podID="1559cd45-3a38-4dc8-8e5e-3e0a67542de4" containerID="0c70505b8377bd779309c265a19c1a4d599d7c7e55b279d3deb5f52fddc79c7f" exitCode=0 Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.849709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" event={"ID":"1559cd45-3a38-4dc8-8e5e-3e0a67542de4","Type":"ContainerDied","Data":"0c70505b8377bd779309c265a19c1a4d599d7c7e55b279d3deb5f52fddc79c7f"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.849741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" event={"ID":"1559cd45-3a38-4dc8-8e5e-3e0a67542de4","Type":"ContainerStarted","Data":"7f119435b745f476125088d9abdc2344722bdf98162e2a3a177896773fb0ecbd"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.855540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" event={"ID":"915ebe52-43fe-40d8-909b-c1c1b617cd5a","Type":"ContainerStarted","Data":"fb73b3d35ea2bf064a335bd348e92533a992b130f7ef5249fa1d76315e8e8a33"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.855579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" event={"ID":"915ebe52-43fe-40d8-909b-c1c1b617cd5a","Type":"ContainerStarted","Data":"30b1e551110c63a086efa952437250472d6f6b12b2736cabaae8657a6b7b296b"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.861469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-r6zw4" event={"ID":"6cc079f9-7c19-4944-9325-ce31a1688157","Type":"ContainerStarted","Data":"4dd228771194d7a02860fad493275cd1310e5a44053f4995cf7c2c0c9d5053e7"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.861500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-r6zw4" event={"ID":"6cc079f9-7c19-4944-9325-ce31a1688157","Type":"ContainerStarted","Data":"084a1d3990c6f468acd156480f7090d7ab8c85cf0a3d7f84955431beefd303f4"} Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.869146 4707 scope.go:117] "RemoveContainer" containerID="a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b" Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.882493 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" podStartSLOduration=1.882483529 podStartE2EDuration="1.882483529s" podCreationTimestamp="2026-01-21 15:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:15:48.880311282 +0000 UTC m=+846.061827505" watchObservedRunningTime="2026-01-21 15:15:48.882483529 +0000 UTC m=+846.063999751" Jan 21 15:15:48 crc 
kubenswrapper[4707]: I0121 15:15:48.907504 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdz4h"] Jan 21 15:15:48 crc kubenswrapper[4707]: I0121 15:15:48.914658 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdz4h"] Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.014162 4707 scope.go:117] "RemoveContainer" containerID="22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.085219 4707 scope.go:117] "RemoveContainer" containerID="1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985" Jan 21 15:15:49 crc kubenswrapper[4707]: E0121 15:15:49.085762 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985\": container with ID starting with 1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985 not found: ID does not exist" containerID="1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.085786 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985"} err="failed to get container status \"1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985\": rpc error: code = NotFound desc = could not find container \"1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985\": container with ID starting with 1a45b70c11d1cf918cd331ae90bfe9c1639fa4c5dcffcf8560b0b53e0387d985 not found: ID does not exist" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.085803 4707 scope.go:117] "RemoveContainer" containerID="a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b" Jan 21 15:15:49 crc kubenswrapper[4707]: E0121 15:15:49.086042 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b\": container with ID starting with a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b not found: ID does not exist" containerID="a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.086062 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b"} err="failed to get container status \"a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b\": rpc error: code = NotFound desc = could not find container \"a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b\": container with ID starting with a75a76d4a89f706f5631ecdb2e532283e8f827728443cc14955241d72c8ee75b not found: ID does not exist" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.086074 4707 scope.go:117] "RemoveContainer" containerID="22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b" Jan 21 15:15:49 crc kubenswrapper[4707]: E0121 15:15:49.086262 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b\": container with ID starting with 22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b not found: ID does not exist" 
containerID="22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.086279 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b"} err="failed to get container status \"22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b\": rpc error: code = NotFound desc = could not find container \"22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b\": container with ID starting with 22fcd3c35f05ad7a3c641a98d6ec8e467ff787e6b328955603efae4eb1d23b9b not found: ID does not exist" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.143171 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.189652 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" path="/var/lib/kubelet/pods/8123c2d1-a229-4e16-9845-aa86f7849baa/volumes" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.333408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55vb6\" (UniqueName: \"kubernetes.io/projected/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-kube-api-access-55vb6\") pod \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.333565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-operator-scripts\") pod \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\" (UID: \"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f\") " Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.334051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" (UID: "49ab0a29-8e07-47aa-84d9-5cfc48d3b69f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.337234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-kube-api-access-55vb6" (OuterVolumeSpecName: "kube-api-access-55vb6") pod "49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" (UID: "49ab0a29-8e07-47aa-84d9-5cfc48d3b69f"). InnerVolumeSpecName "kube-api-access-55vb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.437585 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55vb6\" (UniqueName: \"kubernetes.io/projected/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-kube-api-access-55vb6\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.437613 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.868177 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.868935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-cxjh9" event={"ID":"49ab0a29-8e07-47aa-84d9-5cfc48d3b69f","Type":"ContainerDied","Data":"b71b7b87792a50333604e5e0aa8e974818098a35c6aac09242ffcbfd3d0f529c"} Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.868964 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b71b7b87792a50333604e5e0aa8e974818098a35c6aac09242ffcbfd3d0f529c" Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.870045 4707 generic.go:334] "Generic (PLEG): container finished" podID="6cc079f9-7c19-4944-9325-ce31a1688157" containerID="4dd228771194d7a02860fad493275cd1310e5a44053f4995cf7c2c0c9d5053e7" exitCode=0 Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.870087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-r6zw4" event={"ID":"6cc079f9-7c19-4944-9325-ce31a1688157","Type":"ContainerDied","Data":"4dd228771194d7a02860fad493275cd1310e5a44053f4995cf7c2c0c9d5053e7"} Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.872130 4707 generic.go:334] "Generic (PLEG): container finished" podID="915ebe52-43fe-40d8-909b-c1c1b617cd5a" containerID="fb73b3d35ea2bf064a335bd348e92533a992b130f7ef5249fa1d76315e8e8a33" exitCode=0 Jan 21 15:15:49 crc kubenswrapper[4707]: I0121 15:15:49.872436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" event={"ID":"915ebe52-43fe-40d8-909b-c1c1b617cd5a","Type":"ContainerDied","Data":"fb73b3d35ea2bf064a335bd348e92533a992b130f7ef5249fa1d76315e8e8a33"} Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.189765 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.222382 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.226274 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.248793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49l5q\" (UniqueName: \"kubernetes.io/projected/d5be7ae3-f509-470d-aa05-6369baaa2ec5-kube-api-access-49l5q\") pod \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.248847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-operator-scripts\") pod \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc079f9-7c19-4944-9325-ce31a1688157-operator-scripts\") pod \"6cc079f9-7c19-4944-9325-ce31a1688157\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89trk\" (UniqueName: \"kubernetes.io/projected/6cc079f9-7c19-4944-9325-ce31a1688157-kube-api-access-89trk\") pod \"6cc079f9-7c19-4944-9325-ce31a1688157\" (UID: \"6cc079f9-7c19-4944-9325-ce31a1688157\") " Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5be7ae3-f509-470d-aa05-6369baaa2ec5-operator-scripts\") pod \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\" (UID: \"d5be7ae3-f509-470d-aa05-6369baaa2ec5\") " Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1559cd45-3a38-4dc8-8e5e-3e0a67542de4" (UID: "1559cd45-3a38-4dc8-8e5e-3e0a67542de4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8tgp\" (UniqueName: \"kubernetes.io/projected/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-kube-api-access-t8tgp\") pod \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\" (UID: \"1559cd45-3a38-4dc8-8e5e-3e0a67542de4\") " Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5be7ae3-f509-470d-aa05-6369baaa2ec5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5be7ae3-f509-470d-aa05-6369baaa2ec5" (UID: "d5be7ae3-f509-470d-aa05-6369baaa2ec5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc079f9-7c19-4944-9325-ce31a1688157-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cc079f9-7c19-4944-9325-ce31a1688157" (UID: "6cc079f9-7c19-4944-9325-ce31a1688157"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249750 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc079f9-7c19-4944-9325-ce31a1688157-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249763 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5be7ae3-f509-470d-aa05-6369baaa2ec5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.249771 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.251982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5be7ae3-f509-470d-aa05-6369baaa2ec5-kube-api-access-49l5q" (OuterVolumeSpecName: "kube-api-access-49l5q") pod "d5be7ae3-f509-470d-aa05-6369baaa2ec5" (UID: "d5be7ae3-f509-470d-aa05-6369baaa2ec5"). InnerVolumeSpecName "kube-api-access-49l5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.252204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc079f9-7c19-4944-9325-ce31a1688157-kube-api-access-89trk" (OuterVolumeSpecName: "kube-api-access-89trk") pod "6cc079f9-7c19-4944-9325-ce31a1688157" (UID: "6cc079f9-7c19-4944-9325-ce31a1688157"). InnerVolumeSpecName "kube-api-access-89trk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.252313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-kube-api-access-t8tgp" (OuterVolumeSpecName: "kube-api-access-t8tgp") pod "1559cd45-3a38-4dc8-8e5e-3e0a67542de4" (UID: "1559cd45-3a38-4dc8-8e5e-3e0a67542de4"). InnerVolumeSpecName "kube-api-access-t8tgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.351011 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8tgp\" (UniqueName: \"kubernetes.io/projected/1559cd45-3a38-4dc8-8e5e-3e0a67542de4-kube-api-access-t8tgp\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.351036 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49l5q\" (UniqueName: \"kubernetes.io/projected/d5be7ae3-f509-470d-aa05-6369baaa2ec5-kube-api-access-49l5q\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.351046 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89trk\" (UniqueName: \"kubernetes.io/projected/6cc079f9-7c19-4944-9325-ce31a1688157-kube-api-access-89trk\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.543520 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" containerName="mariadb-account-create-update" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543537 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" containerName="mariadb-account-create-update" Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.543556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="extract-content" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543562 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="extract-content" Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.543578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc079f9-7c19-4944-9325-ce31a1688157" containerName="mariadb-database-create" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543584 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc079f9-7c19-4944-9325-ce31a1688157" containerName="mariadb-database-create" Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.543600 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="registry-server" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543605 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="registry-server" Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.543616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="extract-utilities" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543622 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="extract-utilities" Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.543639 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5be7ae3-f509-470d-aa05-6369baaa2ec5" containerName="mariadb-account-create-update" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543645 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5be7ae3-f509-470d-aa05-6369baaa2ec5" containerName="mariadb-account-create-update" Jan 21 15:15:50 crc 
kubenswrapper[4707]: E0121 15:15:50.543656 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1559cd45-3a38-4dc8-8e5e-3e0a67542de4" containerName="mariadb-database-create" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1559cd45-3a38-4dc8-8e5e-3e0a67542de4" containerName="mariadb-database-create" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543777 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" containerName="mariadb-account-create-update" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543787 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8123c2d1-a229-4e16-9845-aa86f7849baa" containerName="registry-server" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5be7ae3-f509-470d-aa05-6369baaa2ec5" containerName="mariadb-account-create-update" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543825 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc079f9-7c19-4944-9325-ce31a1688157" containerName="mariadb-database-create" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.543835 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1559cd45-3a38-4dc8-8e5e-3e0a67542de4" containerName="mariadb-database-create" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.547301 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.549250 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.549303 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.549321 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-cz8ql" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.549589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.557504 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.655105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-lock\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.655363 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.655417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7426\" (UniqueName: 
\"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-kube-api-access-w7426\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.655439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-cache\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.655529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.756774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.756931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-lock\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.757004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.757054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7426\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-kube-api-access-w7426\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.757079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-cache\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.757129 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.757197 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.757218 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap 
"swift-ring-files" not found Jan 21 15:15:50 crc kubenswrapper[4707]: E0121 15:15:50.757274 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift podName:4a26dbbb-ff73-461b-9ef4-045186b349f0 nodeName:}" failed. No retries permitted until 2026-01-21 15:15:51.257258107 +0000 UTC m=+848.438774330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift") pod "swift-storage-0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0") : configmap "swift-ring-files" not found Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.757498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-lock\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.757514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-cache\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.773344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.773628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7426\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-kube-api-access-w7426\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.879201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-r6zw4" event={"ID":"6cc079f9-7c19-4944-9325-ce31a1688157","Type":"ContainerDied","Data":"084a1d3990c6f468acd156480f7090d7ab8c85cf0a3d7f84955431beefd303f4"} Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.879236 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="084a1d3990c6f468acd156480f7090d7ab8c85cf0a3d7f84955431beefd303f4" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.879219 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-r6zw4" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.880661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" event={"ID":"d5be7ae3-f509-470d-aa05-6369baaa2ec5","Type":"ContainerDied","Data":"5e1cfd9bd57023d63279cf4b2b32e9a90d9022cff6f68fb896b5c587f81d074c"} Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.880684 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.880702 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e1cfd9bd57023d63279cf4b2b32e9a90d9022cff6f68fb896b5c587f81d074c" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.883508 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.883519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-zpcb6" event={"ID":"1559cd45-3a38-4dc8-8e5e-3e0a67542de4","Type":"ContainerDied","Data":"7f119435b745f476125088d9abdc2344722bdf98162e2a3a177896773fb0ecbd"} Jan 21 15:15:50 crc kubenswrapper[4707]: I0121 15:15:50.883565 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f119435b745f476125088d9abdc2344722bdf98162e2a3a177896773fb0ecbd" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.042055 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x8xqg"] Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.043632 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.045205 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.045409 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.045538 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.047114 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x8xqg"] Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.141480 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.162844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-ring-data-devices\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.163057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-etc-swift\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.163169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-combined-ca-bundle\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.163277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl52r\" (UniqueName: \"kubernetes.io/projected/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-kube-api-access-wl52r\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.163374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-swiftconf\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.163524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-scripts\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.163587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-dispersionconf\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.264349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ebe52-43fe-40d8-909b-c1c1b617cd5a-operator-scripts\") pod \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.264723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgb9c\" (UniqueName: 
\"kubernetes.io/projected/915ebe52-43fe-40d8-909b-c1c1b617cd5a-kube-api-access-zgb9c\") pod \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\" (UID: \"915ebe52-43fe-40d8-909b-c1c1b617cd5a\") " Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.264999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-scripts\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915ebe52-43fe-40d8-909b-c1c1b617cd5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "915ebe52-43fe-40d8-909b-c1c1b617cd5a" (UID: "915ebe52-43fe-40d8-909b-c1c1b617cd5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-dispersionconf\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-ring-data-devices\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-etc-swift\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-combined-ca-bundle\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl52r\" (UniqueName: \"kubernetes.io/projected/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-kube-api-access-wl52r\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-swiftconf\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.265478 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ebe52-43fe-40d8-909b-c1c1b617cd5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:51 crc kubenswrapper[4707]: E0121 15:15:51.265623 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:15:51 crc kubenswrapper[4707]: E0121 15:15:51.265705 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:15:51 crc kubenswrapper[4707]: E0121 15:15:51.265803 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift podName:4a26dbbb-ff73-461b-9ef4-045186b349f0 nodeName:}" failed. No retries permitted until 2026-01-21 15:15:52.265783373 +0000 UTC m=+849.447299585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift") pod "swift-storage-0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0") : configmap "swift-ring-files" not found Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.266367 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-scripts\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.266733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-etc-swift\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.267009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-ring-data-devices\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.268693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-combined-ca-bundle\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.268733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-swiftconf\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.271192 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915ebe52-43fe-40d8-909b-c1c1b617cd5a-kube-api-access-zgb9c" (OuterVolumeSpecName: "kube-api-access-zgb9c") pod "915ebe52-43fe-40d8-909b-c1c1b617cd5a" (UID: "915ebe52-43fe-40d8-909b-c1c1b617cd5a"). InnerVolumeSpecName "kube-api-access-zgb9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.272561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-dispersionconf\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.282428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl52r\" (UniqueName: \"kubernetes.io/projected/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-kube-api-access-wl52r\") pod \"swift-ring-rebalance-x8xqg\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.356495 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.366928 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgb9c\" (UniqueName: \"kubernetes.io/projected/915ebe52-43fe-40d8-909b-c1c1b617cd5a-kube-api-access-zgb9c\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.754176 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x8xqg"] Jan 21 15:15:51 crc kubenswrapper[4707]: W0121 15:15:51.755015 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877b2ecd_b7f6_4057_a5a8_e4925a9cc743.slice/crio-1dfdfb7ade15666a8c5f1cdfe115d07b9368822a4129873e44619bd6901651a5 WatchSource:0}: Error finding container 1dfdfb7ade15666a8c5f1cdfe115d07b9368822a4129873e44619bd6901651a5: Status 404 returned error can't find the container with id 1dfdfb7ade15666a8c5f1cdfe115d07b9368822a4129873e44619bd6901651a5 Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.889983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" event={"ID":"877b2ecd-b7f6-4057-a5a8-e4925a9cc743","Type":"ContainerStarted","Data":"1dfdfb7ade15666a8c5f1cdfe115d07b9368822a4129873e44619bd6901651a5"} Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.891899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" event={"ID":"915ebe52-43fe-40d8-909b-c1c1b617cd5a","Type":"ContainerDied","Data":"30b1e551110c63a086efa952437250472d6f6b12b2736cabaae8657a6b7b296b"} Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.891934 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30b1e551110c63a086efa952437250472d6f6b12b2736cabaae8657a6b7b296b" Jan 21 15:15:51 crc kubenswrapper[4707]: I0121 15:15:51.891955 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.278547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:52 crc kubenswrapper[4707]: E0121 15:15:52.278670 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:15:52 crc kubenswrapper[4707]: E0121 15:15:52.278691 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:15:52 crc kubenswrapper[4707]: E0121 15:15:52.278739 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift podName:4a26dbbb-ff73-461b-9ef4-045186b349f0 nodeName:}" failed. No retries permitted until 2026-01-21 15:15:54.278724133 +0000 UTC m=+851.460240355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift") pod "swift-storage-0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0") : configmap "swift-ring-files" not found Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.760164 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-twzb9"] Jan 21 15:15:52 crc kubenswrapper[4707]: E0121 15:15:52.760604 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915ebe52-43fe-40d8-909b-c1c1b617cd5a" containerName="mariadb-account-create-update" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.760621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="915ebe52-43fe-40d8-909b-c1c1b617cd5a" containerName="mariadb-account-create-update" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.760747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="915ebe52-43fe-40d8-909b-c1c1b617cd5a" containerName="mariadb-account-create-update" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.761199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.770700 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-twzb9"] Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.872786 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz"] Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.873684 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.875559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.883791 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz"] Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.886934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v82q\" (UniqueName: \"kubernetes.io/projected/55f50403-e5d4-4cac-928c-a42bf62951a8-kube-api-access-8v82q\") pod \"glance-db-create-twzb9\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.887069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f50403-e5d4-4cac-928c-a42bf62951a8-operator-scripts\") pod \"glance-db-create-twzb9\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.988203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjr9\" (UniqueName: \"kubernetes.io/projected/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-kube-api-access-tcjr9\") pod \"glance-61f3-account-create-update-2xjtz\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.988263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-operator-scripts\") pod \"glance-61f3-account-create-update-2xjtz\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.988294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f50403-e5d4-4cac-928c-a42bf62951a8-operator-scripts\") pod \"glance-db-create-twzb9\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.988430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v82q\" (UniqueName: \"kubernetes.io/projected/55f50403-e5d4-4cac-928c-a42bf62951a8-kube-api-access-8v82q\") pod \"glance-db-create-twzb9\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:52 crc kubenswrapper[4707]: I0121 15:15:52.989105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f50403-e5d4-4cac-928c-a42bf62951a8-operator-scripts\") pod \"glance-db-create-twzb9\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.010318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v82q\" (UniqueName: 
\"kubernetes.io/projected/55f50403-e5d4-4cac-928c-a42bf62951a8-kube-api-access-8v82q\") pod \"glance-db-create-twzb9\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.077400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.090575 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjr9\" (UniqueName: \"kubernetes.io/projected/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-kube-api-access-tcjr9\") pod \"glance-61f3-account-create-update-2xjtz\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.090655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-operator-scripts\") pod \"glance-61f3-account-create-update-2xjtz\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.091249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-operator-scripts\") pod \"glance-61f3-account-create-update-2xjtz\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.104596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjr9\" (UniqueName: \"kubernetes.io/projected/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-kube-api-access-tcjr9\") pod \"glance-61f3-account-create-update-2xjtz\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:53 crc kubenswrapper[4707]: I0121 15:15:53.184455 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.308885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:54 crc kubenswrapper[4707]: E0121 15:15:54.309056 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:15:54 crc kubenswrapper[4707]: E0121 15:15:54.309176 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:15:54 crc kubenswrapper[4707]: E0121 15:15:54.309444 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift podName:4a26dbbb-ff73-461b-9ef4-045186b349f0 nodeName:}" failed. No retries permitted until 2026-01-21 15:15:58.30921237 +0000 UTC m=+855.490728592 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift") pod "swift-storage-0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0") : configmap "swift-ring-files" not found Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.494334 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-cxjh9"] Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.499677 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-cxjh9"] Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.582751 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpkcc"] Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.583709 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.585610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.589910 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpkcc"] Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.714691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a13938f-6542-42f2-a635-9529887f363c-operator-scripts\") pod \"root-account-create-update-kpkcc\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.714970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vmg\" (UniqueName: \"kubernetes.io/projected/8a13938f-6542-42f2-a635-9529887f363c-kube-api-access-99vmg\") pod \"root-account-create-update-kpkcc\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.816345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a13938f-6542-42f2-a635-9529887f363c-operator-scripts\") pod \"root-account-create-update-kpkcc\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.816417 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vmg\" (UniqueName: \"kubernetes.io/projected/8a13938f-6542-42f2-a635-9529887f363c-kube-api-access-99vmg\") pod \"root-account-create-update-kpkcc\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.817507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a13938f-6542-42f2-a635-9529887f363c-operator-scripts\") pod \"root-account-create-update-kpkcc\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.832465 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vmg\" (UniqueName: \"kubernetes.io/projected/8a13938f-6542-42f2-a635-9529887f363c-kube-api-access-99vmg\") pod \"root-account-create-update-kpkcc\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.910426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" event={"ID":"877b2ecd-b7f6-4057-a5a8-e4925a9cc743","Type":"ContainerStarted","Data":"a1746eb35de025fc0fec59c5f3fcfa37f77eeb90568db27ad9ffb35d3ca2b33f"} Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.915708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz"] Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.917487 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:54 crc kubenswrapper[4707]: W0121 15:15:54.918044 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55f50403_e5d4_4cac_928c_a42bf62951a8.slice/crio-e7dd67827dd7dd22274a3d7cc9a5edc6dfe92b652146408656762d8bf50fa446 WatchSource:0}: Error finding container e7dd67827dd7dd22274a3d7cc9a5edc6dfe92b652146408656762d8bf50fa446: Status 404 returned error can't find the container with id e7dd67827dd7dd22274a3d7cc9a5edc6dfe92b652146408656762d8bf50fa446 Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.920365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-twzb9"] Jan 21 15:15:54 crc kubenswrapper[4707]: I0121 15:15:54.932962 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" podStartSLOduration=1.150766922 podStartE2EDuration="3.93295379s" podCreationTimestamp="2026-01-21 15:15:51 +0000 UTC" firstStartedPulling="2026-01-21 15:15:51.756954002 +0000 UTC m=+848.938470224" lastFinishedPulling="2026-01-21 15:15:54.539140871 +0000 UTC m=+851.720657092" observedRunningTime="2026-01-21 15:15:54.930710811 +0000 UTC m=+852.112227032" watchObservedRunningTime="2026-01-21 15:15:54.93295379 +0000 UTC m=+852.114470011" Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.190256 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ab0a29-8e07-47aa-84d9-5cfc48d3b69f" path="/var/lib/kubelet/pods/49ab0a29-8e07-47aa-84d9-5cfc48d3b69f/volumes" Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.282547 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpkcc"] Jan 21 15:15:55 crc kubenswrapper[4707]: W0121 15:15:55.308363 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a13938f_6542_42f2_a635_9529887f363c.slice/crio-2eafac849f9b2819a44076d201ce36a8dd9f3780a464e1725853da15d5551afb WatchSource:0}: Error finding container 2eafac849f9b2819a44076d201ce36a8dd9f3780a464e1725853da15d5551afb: Status 404 returned error can't find the container with id 2eafac849f9b2819a44076d201ce36a8dd9f3780a464e1725853da15d5551afb Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.916412 4707 generic.go:334] "Generic (PLEG): container finished" podID="55f50403-e5d4-4cac-928c-a42bf62951a8" 
containerID="924e0ee83104507d86b0618f79b0bd26660930ec44ec0006755016cc489eea6d" exitCode=0 Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.916465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-twzb9" event={"ID":"55f50403-e5d4-4cac-928c-a42bf62951a8","Type":"ContainerDied","Data":"924e0ee83104507d86b0618f79b0bd26660930ec44ec0006755016cc489eea6d"} Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.916504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-twzb9" event={"ID":"55f50403-e5d4-4cac-928c-a42bf62951a8","Type":"ContainerStarted","Data":"e7dd67827dd7dd22274a3d7cc9a5edc6dfe92b652146408656762d8bf50fa446"} Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.918111 4707 generic.go:334] "Generic (PLEG): container finished" podID="a65869b4-a0c3-46ba-8c75-a194a22d9d2c" containerID="1e7579152ee0a2a533efbfa1b34360456cb8779e12e40941d34d9745d3fbe0db" exitCode=0 Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.918141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" event={"ID":"a65869b4-a0c3-46ba-8c75-a194a22d9d2c","Type":"ContainerDied","Data":"1e7579152ee0a2a533efbfa1b34360456cb8779e12e40941d34d9745d3fbe0db"} Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.918309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" event={"ID":"a65869b4-a0c3-46ba-8c75-a194a22d9d2c","Type":"ContainerStarted","Data":"0185466c74bc8c59061df6f977befb097b81e7d1c13a25bfe300e29d5ae6ace1"} Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.923641 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a13938f-6542-42f2-a635-9529887f363c" containerID="111c9b79aba7c1ce6070d460e7114904ba89c9d35a5fd542f6794479355ce3d8" exitCode=0 Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.923704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" event={"ID":"8a13938f-6542-42f2-a635-9529887f363c","Type":"ContainerDied","Data":"111c9b79aba7c1ce6070d460e7114904ba89c9d35a5fd542f6794479355ce3d8"} Jan 21 15:15:55 crc kubenswrapper[4707]: I0121 15:15:55.923726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" event={"ID":"8a13938f-6542-42f2-a635-9529887f363c","Type":"ContainerStarted","Data":"2eafac849f9b2819a44076d201ce36a8dd9f3780a464e1725853da15d5551afb"} Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.296990 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.358919 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.373396 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.455879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a13938f-6542-42f2-a635-9529887f363c-operator-scripts\") pod \"8a13938f-6542-42f2-a635-9529887f363c\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.455952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f50403-e5d4-4cac-928c-a42bf62951a8-operator-scripts\") pod \"55f50403-e5d4-4cac-928c-a42bf62951a8\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.455984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v82q\" (UniqueName: \"kubernetes.io/projected/55f50403-e5d4-4cac-928c-a42bf62951a8-kube-api-access-8v82q\") pod \"55f50403-e5d4-4cac-928c-a42bf62951a8\" (UID: \"55f50403-e5d4-4cac-928c-a42bf62951a8\") " Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjr9\" (UniqueName: \"kubernetes.io/projected/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-kube-api-access-tcjr9\") pod \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99vmg\" (UniqueName: \"kubernetes.io/projected/8a13938f-6542-42f2-a635-9529887f363c-kube-api-access-99vmg\") pod \"8a13938f-6542-42f2-a635-9529887f363c\" (UID: \"8a13938f-6542-42f2-a635-9529887f363c\") " Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-operator-scripts\") pod \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\" (UID: \"a65869b4-a0c3-46ba-8c75-a194a22d9d2c\") " Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a13938f-6542-42f2-a635-9529887f363c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a13938f-6542-42f2-a635-9529887f363c" (UID: "8a13938f-6542-42f2-a635-9529887f363c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456556 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a13938f-6542-42f2-a635-9529887f363c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a65869b4-a0c3-46ba-8c75-a194a22d9d2c" (UID: "a65869b4-a0c3-46ba-8c75-a194a22d9d2c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.456708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f50403-e5d4-4cac-928c-a42bf62951a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55f50403-e5d4-4cac-928c-a42bf62951a8" (UID: "55f50403-e5d4-4cac-928c-a42bf62951a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.460792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f50403-e5d4-4cac-928c-a42bf62951a8-kube-api-access-8v82q" (OuterVolumeSpecName: "kube-api-access-8v82q") pod "55f50403-e5d4-4cac-928c-a42bf62951a8" (UID: "55f50403-e5d4-4cac-928c-a42bf62951a8"). InnerVolumeSpecName "kube-api-access-8v82q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.461015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-kube-api-access-tcjr9" (OuterVolumeSpecName: "kube-api-access-tcjr9") pod "a65869b4-a0c3-46ba-8c75-a194a22d9d2c" (UID: "a65869b4-a0c3-46ba-8c75-a194a22d9d2c"). InnerVolumeSpecName "kube-api-access-tcjr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.461124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a13938f-6542-42f2-a635-9529887f363c-kube-api-access-99vmg" (OuterVolumeSpecName: "kube-api-access-99vmg") pod "8a13938f-6542-42f2-a635-9529887f363c" (UID: "8a13938f-6542-42f2-a635-9529887f363c"). InnerVolumeSpecName "kube-api-access-99vmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.557737 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f50403-e5d4-4cac-928c-a42bf62951a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.557761 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v82q\" (UniqueName: \"kubernetes.io/projected/55f50403-e5d4-4cac-928c-a42bf62951a8-kube-api-access-8v82q\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.557772 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjr9\" (UniqueName: \"kubernetes.io/projected/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-kube-api-access-tcjr9\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.557781 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99vmg\" (UniqueName: \"kubernetes.io/projected/8a13938f-6542-42f2-a635-9529887f363c-kube-api-access-99vmg\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.557841 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65869b4-a0c3-46ba-8c75-a194a22d9d2c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.969358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" event={"ID":"8a13938f-6542-42f2-a635-9529887f363c","Type":"ContainerDied","Data":"2eafac849f9b2819a44076d201ce36a8dd9f3780a464e1725853da15d5551afb"} Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.969706 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eafac849f9b2819a44076d201ce36a8dd9f3780a464e1725853da15d5551afb" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.969373 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpkcc" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.970404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-twzb9" event={"ID":"55f50403-e5d4-4cac-928c-a42bf62951a8","Type":"ContainerDied","Data":"e7dd67827dd7dd22274a3d7cc9a5edc6dfe92b652146408656762d8bf50fa446"} Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.970437 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7dd67827dd7dd22274a3d7cc9a5edc6dfe92b652146408656762d8bf50fa446" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.970470 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-twzb9" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.978332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" event={"ID":"a65869b4-a0c3-46ba-8c75-a194a22d9d2c","Type":"ContainerDied","Data":"0185466c74bc8c59061df6f977befb097b81e7d1c13a25bfe300e29d5ae6ace1"} Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.978355 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0185466c74bc8c59061df6f977befb097b81e7d1c13a25bfe300e29d5ae6ace1" Jan 21 15:15:57 crc kubenswrapper[4707]: I0121 15:15:57.978359 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz" Jan 21 15:15:58 crc kubenswrapper[4707]: I0121 15:15:58.368896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:15:58 crc kubenswrapper[4707]: E0121 15:15:58.369063 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:15:58 crc kubenswrapper[4707]: E0121 15:15:58.369083 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:15:58 crc kubenswrapper[4707]: E0121 15:15:58.369129 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift podName:4a26dbbb-ff73-461b-9ef4-045186b349f0 nodeName:}" failed. No retries permitted until 2026-01-21 15:16:06.369114542 +0000 UTC m=+863.550630764 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift") pod "swift-storage-0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0") : configmap "swift-ring-files" not found Jan 21 15:15:59 crc kubenswrapper[4707]: I0121 15:15:59.990679 4707 generic.go:334] "Generic (PLEG): container finished" podID="877b2ecd-b7f6-4057-a5a8-e4925a9cc743" containerID="a1746eb35de025fc0fec59c5f3fcfa37f77eeb90568db27ad9ffb35d3ca2b33f" exitCode=0 Jan 21 15:15:59 crc kubenswrapper[4707]: I0121 15:15:59.990752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" event={"ID":"877b2ecd-b7f6-4057-a5a8-e4925a9cc743","Type":"ContainerDied","Data":"a1746eb35de025fc0fec59c5f3fcfa37f77eeb90568db27ad9ffb35d3ca2b33f"} Jan 21 15:16:00 crc kubenswrapper[4707]: I0121 15:16:00.000333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:16:00 crc kubenswrapper[4707]: I0121 15:16:00.998083 4707 generic.go:334] "Generic (PLEG): container finished" podID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerID="f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178" exitCode=0 Jan 21 15:16:00 crc kubenswrapper[4707]: I0121 15:16:00.998147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0","Type":"ContainerDied","Data":"f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178"} Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.001018 4707 generic.go:334] "Generic (PLEG): container finished" podID="945ded51-9961-411b-b00f-0543ac91a18a" containerID="3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0" exitCode=0 Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.001105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"945ded51-9961-411b-b00f-0543ac91a18a","Type":"ContainerDied","Data":"3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0"} Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.247423 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-scripts\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-combined-ca-bundle\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-swiftconf\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl52r\" (UniqueName: \"kubernetes.io/projected/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-kube-api-access-wl52r\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-ring-data-devices\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-dispersionconf\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-etc-swift\") pod \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\" (UID: \"877b2ecd-b7f6-4057-a5a8-e4925a9cc743\") " Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.432905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.433426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.435732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-kube-api-access-wl52r" (OuterVolumeSpecName: "kube-api-access-wl52r") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "kube-api-access-wl52r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.438565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.447194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-scripts" (OuterVolumeSpecName: "scripts") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.448239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.448690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "877b2ecd-b7f6-4057-a5a8-e4925a9cc743" (UID: "877b2ecd-b7f6-4057-a5a8-e4925a9cc743"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534171 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534201 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534210 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534218 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534226 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534235 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:01 crc kubenswrapper[4707]: I0121 15:16:01.534243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl52r\" (UniqueName: \"kubernetes.io/projected/877b2ecd-b7f6-4057-a5a8-e4925a9cc743-kube-api-access-wl52r\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.008676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"945ded51-9961-411b-b00f-0543ac91a18a","Type":"ContainerStarted","Data":"c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4"} Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.008909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.010173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" event={"ID":"877b2ecd-b7f6-4057-a5a8-e4925a9cc743","Type":"ContainerDied","Data":"1dfdfb7ade15666a8c5f1cdfe115d07b9368822a4129873e44619bd6901651a5"} Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.010207 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfdfb7ade15666a8c5f1cdfe115d07b9368822a4129873e44619bd6901651a5" Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.010181 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x8xqg" Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.011424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0","Type":"ContainerStarted","Data":"3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7"} Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.011615 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.044755 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.404648859 podStartE2EDuration="40.044740415s" podCreationTimestamp="2026-01-21 15:15:22 +0000 UTC" firstStartedPulling="2026-01-21 15:15:25.968923832 +0000 UTC m=+823.150440054" lastFinishedPulling="2026-01-21 15:15:28.609015387 +0000 UTC m=+825.790531610" observedRunningTime="2026-01-21 15:16:02.038633376 +0000 UTC m=+859.220149598" watchObservedRunningTime="2026-01-21 15:16:02.044740415 +0000 UTC m=+859.226256637" Jan 21 15:16:02 crc kubenswrapper[4707]: I0121 15:16:02.061195 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.458568675 podStartE2EDuration="39.061175632s" podCreationTimestamp="2026-01-21 15:15:23 +0000 UTC" firstStartedPulling="2026-01-21 15:15:26.034090761 +0000 UTC m=+823.215606983" lastFinishedPulling="2026-01-21 15:15:28.636697719 +0000 UTC m=+825.818213940" observedRunningTime="2026-01-21 15:16:02.059899813 +0000 UTC m=+859.241416034" watchObservedRunningTime="2026-01-21 15:16:02.061175632 +0000 UTC m=+859.242691855" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.003266 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-2bdzl"] Jan 21 15:16:03 crc kubenswrapper[4707]: E0121 15:16:03.003752 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a13938f-6542-42f2-a635-9529887f363c" containerName="mariadb-account-create-update" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.003769 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a13938f-6542-42f2-a635-9529887f363c" containerName="mariadb-account-create-update" Jan 21 15:16:03 crc kubenswrapper[4707]: E0121 15:16:03.003784 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877b2ecd-b7f6-4057-a5a8-e4925a9cc743" containerName="swift-ring-rebalance" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.003790 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="877b2ecd-b7f6-4057-a5a8-e4925a9cc743" containerName="swift-ring-rebalance" Jan 21 15:16:03 crc kubenswrapper[4707]: E0121 15:16:03.003803 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f50403-e5d4-4cac-928c-a42bf62951a8" containerName="mariadb-database-create" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.003826 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f50403-e5d4-4cac-928c-a42bf62951a8" containerName="mariadb-database-create" Jan 21 15:16:03 crc kubenswrapper[4707]: E0121 15:16:03.003836 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65869b4-a0c3-46ba-8c75-a194a22d9d2c" containerName="mariadb-account-create-update" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.003841 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a65869b4-a0c3-46ba-8c75-a194a22d9d2c" containerName="mariadb-account-create-update" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.003993 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f50403-e5d4-4cac-928c-a42bf62951a8" containerName="mariadb-database-create" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.004003 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a13938f-6542-42f2-a635-9529887f363c" containerName="mariadb-account-create-update" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.004019 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65869b4-a0c3-46ba-8c75-a194a22d9d2c" containerName="mariadb-account-create-update" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.004027 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="877b2ecd-b7f6-4057-a5a8-e4925a9cc743" containerName="swift-ring-rebalance" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.004427 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.005926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.006037 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-zv4ct" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.019471 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-2bdzl"] Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.162485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4x4\" (UniqueName: \"kubernetes.io/projected/f0cdf697-e82c-44fa-992a-4843848b3d33-kube-api-access-7b4x4\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.162550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-db-sync-config-data\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.162569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-combined-ca-bundle\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.162612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-config-data\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.263896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4x4\" (UniqueName: 
\"kubernetes.io/projected/f0cdf697-e82c-44fa-992a-4843848b3d33-kube-api-access-7b4x4\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.263962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-db-sync-config-data\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.263981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-combined-ca-bundle\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.264008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-config-data\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.268104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-db-sync-config-data\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.270775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-config-data\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.271243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-combined-ca-bundle\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.277265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4x4\" (UniqueName: \"kubernetes.io/projected/f0cdf697-e82c-44fa-992a-4843848b3d33-kube-api-access-7b4x4\") pod \"glance-db-sync-2bdzl\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.317203 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:03 crc kubenswrapper[4707]: I0121 15:16:03.684377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-2bdzl"] Jan 21 15:16:03 crc kubenswrapper[4707]: W0121 15:16:03.688463 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0cdf697_e82c_44fa_992a_4843848b3d33.slice/crio-3c1e3de5a3da48c12997969d73d8901697e1eb6241258f1a1e15a5a350726fab WatchSource:0}: Error finding container 3c1e3de5a3da48c12997969d73d8901697e1eb6241258f1a1e15a5a350726fab: Status 404 returned error can't find the container with id 3c1e3de5a3da48c12997969d73d8901697e1eb6241258f1a1e15a5a350726fab Jan 21 15:16:04 crc kubenswrapper[4707]: I0121 15:16:04.021615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" event={"ID":"f0cdf697-e82c-44fa-992a-4843848b3d33","Type":"ContainerStarted","Data":"3c1e3de5a3da48c12997969d73d8901697e1eb6241258f1a1e15a5a350726fab"} Jan 21 15:16:06 crc kubenswrapper[4707]: I0121 15:16:06.411693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:16:06 crc kubenswrapper[4707]: I0121 15:16:06.417267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"swift-storage-0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:16:06 crc kubenswrapper[4707]: I0121 15:16:06.459330 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:16:06 crc kubenswrapper[4707]: I0121 15:16:06.841518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:16:06 crc kubenswrapper[4707]: W0121 15:16:06.850422 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a26dbbb_ff73_461b_9ef4_045186b349f0.slice/crio-62ab95306fcb789d20522fc96dd9fae599b20c2a379a72875d440584f75b425d WatchSource:0}: Error finding container 62ab95306fcb789d20522fc96dd9fae599b20c2a379a72875d440584f75b425d: Status 404 returned error can't find the container with id 62ab95306fcb789d20522fc96dd9fae599b20c2a379a72875d440584f75b425d Jan 21 15:16:07 crc kubenswrapper[4707]: I0121 15:16:07.040774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"62ab95306fcb789d20522fc96dd9fae599b20c2a379a72875d440584f75b425d"} Jan 21 15:16:13 crc kubenswrapper[4707]: I0121 15:16:13.077382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" event={"ID":"f0cdf697-e82c-44fa-992a-4843848b3d33","Type":"ContainerStarted","Data":"b439aa595d2654611a443e0a117c7e818373bc9a2b50676f7467d7a722473847"} Jan 21 15:16:13 crc kubenswrapper[4707]: I0121 15:16:13.079695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"0adfedc9ed55605ee7e287bb792ea2a2b8c433568d147932a3110f197664d94e"} Jan 21 15:16:13 crc kubenswrapper[4707]: I0121 15:16:13.093266 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" podStartSLOduration=2.874912052 podStartE2EDuration="11.093249747s" podCreationTimestamp="2026-01-21 15:16:02 +0000 UTC" firstStartedPulling="2026-01-21 15:16:03.690190378 +0000 UTC m=+860.871706600" lastFinishedPulling="2026-01-21 15:16:11.908528073 +0000 UTC m=+869.090044295" observedRunningTime="2026-01-21 15:16:13.090722863 +0000 UTC m=+870.272239085" watchObservedRunningTime="2026-01-21 15:16:13.093249747 +0000 UTC m=+870.274765968" Jan 21 15:16:13 crc kubenswrapper[4707]: I0121 15:16:13.662993 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:16:13 crc kubenswrapper[4707]: I0121 15:16:13.995289 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2bfhk"] Jan 21 15:16:13 crc kubenswrapper[4707]: I0121 15:16:13.996079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.001571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2bfhk"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.080359 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-bkjdx"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.081178 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.087848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"79465f4c64a300da6b2b166a7231cade9cafced28c54a63fb1dd42634ea4b73d"} Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.087879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"dfc3a06ef645fad85b8c1641aeb4b27af7cb580998503dd959db2c02aff177cd"} Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.087889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"0b91da6e45cc4e4749ed6b4f2b71d51ae0be19bba7d48c2ab451efd86fbf36c4"} Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.102389 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.103233 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.104864 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.117391 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.125093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b153bbb-6db9-4837-8010-dbe2bdc51be0-operator-scripts\") pod \"barbican-db-create-2bfhk\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.125139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbrgq\" (UniqueName: \"kubernetes.io/projected/6b153bbb-6db9-4837-8010-dbe2bdc51be0-kube-api-access-pbrgq\") pod \"barbican-db-create-2bfhk\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.146194 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-bkjdx"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.184240 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-hp9gw"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.185140 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.223873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-hp9gw"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.227097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5n2z\" (UniqueName: \"kubernetes.io/projected/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-kube-api-access-l5n2z\") pod \"cinder-db-create-bkjdx\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.227178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-operator-scripts\") pod \"cinder-db-create-bkjdx\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.227212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhh8w\" (UniqueName: \"kubernetes.io/projected/85b038b0-2f9c-475f-98e5-574468773068-kube-api-access-xhh8w\") pod \"neutron-8bd9-account-create-update-v85vk\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.227233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b038b0-2f9c-475f-98e5-574468773068-operator-scripts\") pod \"neutron-8bd9-account-create-update-v85vk\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.227310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b153bbb-6db9-4837-8010-dbe2bdc51be0-operator-scripts\") pod \"barbican-db-create-2bfhk\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.227365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbrgq\" (UniqueName: \"kubernetes.io/projected/6b153bbb-6db9-4837-8010-dbe2bdc51be0-kube-api-access-pbrgq\") pod \"barbican-db-create-2bfhk\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.229118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b153bbb-6db9-4837-8010-dbe2bdc51be0-operator-scripts\") pod \"barbican-db-create-2bfhk\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.243411 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.244612 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.247172 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.260498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbrgq\" (UniqueName: \"kubernetes.io/projected/6b153bbb-6db9-4837-8010-dbe2bdc51be0-kube-api-access-pbrgq\") pod \"barbican-db-create-2bfhk\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.269925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.280335 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vjd82"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.281259 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.282989 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.285991 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.286189 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5sl4v" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.291172 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.294581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vjd82"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.308468 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-e371-account-create-update-78ppn"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.309397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.314093 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.314461 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.322635 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-e371-account-create-update-78ppn"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.328842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1081a6-b1c0-499f-bd60-82256c944e34-operator-scripts\") pod \"neutron-db-create-hp9gw\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.328898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5n2z\" (UniqueName: \"kubernetes.io/projected/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-kube-api-access-l5n2z\") pod \"cinder-db-create-bkjdx\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.328936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs22q\" (UniqueName: \"kubernetes.io/projected/8dbdd352-f6e2-4bb5-a980-af302e912c67-kube-api-access-fs22q\") pod \"barbican-e217-account-create-update-xlbv7\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.329006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-operator-scripts\") pod \"cinder-db-create-bkjdx\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.329038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhh8w\" (UniqueName: \"kubernetes.io/projected/85b038b0-2f9c-475f-98e5-574468773068-kube-api-access-xhh8w\") pod \"neutron-8bd9-account-create-update-v85vk\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.329064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbdd352-f6e2-4bb5-a980-af302e912c67-operator-scripts\") pod \"barbican-e217-account-create-update-xlbv7\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.329080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b038b0-2f9c-475f-98e5-574468773068-operator-scripts\") pod \"neutron-8bd9-account-create-update-v85vk\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.329100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4n8\" (UniqueName: \"kubernetes.io/projected/de1081a6-b1c0-499f-bd60-82256c944e34-kube-api-access-6f4n8\") pod 
\"neutron-db-create-hp9gw\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.330092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-operator-scripts\") pod \"cinder-db-create-bkjdx\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.330509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b038b0-2f9c-475f-98e5-574468773068-operator-scripts\") pod \"neutron-8bd9-account-create-update-v85vk\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.344795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5n2z\" (UniqueName: \"kubernetes.io/projected/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-kube-api-access-l5n2z\") pod \"cinder-db-create-bkjdx\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.351624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhh8w\" (UniqueName: \"kubernetes.io/projected/85b038b0-2f9c-475f-98e5-574468773068-kube-api-access-xhh8w\") pod \"neutron-8bd9-account-create-update-v85vk\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.395452 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.414720 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs22q\" (UniqueName: \"kubernetes.io/projected/8dbdd352-f6e2-4bb5-a980-af302e912c67-kube-api-access-fs22q\") pod \"barbican-e217-account-create-update-xlbv7\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-config-data\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbdd352-f6e2-4bb5-a980-af302e912c67-operator-scripts\") pod \"barbican-e217-account-create-update-xlbv7\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpsmn\" (UniqueName: \"kubernetes.io/projected/99016202-c215-433c-95f2-e75a8fe1a995-kube-api-access-hpsmn\") pod \"cinder-e371-account-create-update-78ppn\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4n8\" (UniqueName: \"kubernetes.io/projected/de1081a6-b1c0-499f-bd60-82256c944e34-kube-api-access-6f4n8\") pod \"neutron-db-create-hp9gw\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99016202-c215-433c-95f2-e75a8fe1a995-operator-scripts\") pod \"cinder-e371-account-create-update-78ppn\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9v6\" (UniqueName: \"kubernetes.io/projected/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-kube-api-access-hq9v6\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.430549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-combined-ca-bundle\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 
15:16:14.430569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1081a6-b1c0-499f-bd60-82256c944e34-operator-scripts\") pod \"neutron-db-create-hp9gw\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.431384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbdd352-f6e2-4bb5-a980-af302e912c67-operator-scripts\") pod \"barbican-e217-account-create-update-xlbv7\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.431668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1081a6-b1c0-499f-bd60-82256c944e34-operator-scripts\") pod \"neutron-db-create-hp9gw\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.448030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs22q\" (UniqueName: \"kubernetes.io/projected/8dbdd352-f6e2-4bb5-a980-af302e912c67-kube-api-access-fs22q\") pod \"barbican-e217-account-create-update-xlbv7\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.453288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4n8\" (UniqueName: \"kubernetes.io/projected/de1081a6-b1c0-499f-bd60-82256c944e34-kube-api-access-6f4n8\") pod \"neutron-db-create-hp9gw\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.491750 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.506437 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.533110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-config-data\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.533214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpsmn\" (UniqueName: \"kubernetes.io/projected/99016202-c215-433c-95f2-e75a8fe1a995-kube-api-access-hpsmn\") pod \"cinder-e371-account-create-update-78ppn\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.533273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99016202-c215-433c-95f2-e75a8fe1a995-operator-scripts\") pod \"cinder-e371-account-create-update-78ppn\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.533302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9v6\" (UniqueName: \"kubernetes.io/projected/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-kube-api-access-hq9v6\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.533361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-combined-ca-bundle\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.534240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99016202-c215-433c-95f2-e75a8fe1a995-operator-scripts\") pod \"cinder-e371-account-create-update-78ppn\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.536860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-config-data\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.543651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-combined-ca-bundle\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.554150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpsmn\" (UniqueName: 
\"kubernetes.io/projected/99016202-c215-433c-95f2-e75a8fe1a995-kube-api-access-hpsmn\") pod \"cinder-e371-account-create-update-78ppn\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.556173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9v6\" (UniqueName: \"kubernetes.io/projected/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-kube-api-access-hq9v6\") pod \"keystone-db-sync-vjd82\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.588213 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.603394 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.624321 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.764251 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2bfhk"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.864986 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-bkjdx"] Jan 21 15:16:14 crc kubenswrapper[4707]: I0121 15:16:14.944494 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk"] Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.038745 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-hp9gw"] Jan 21 15:16:15 crc kubenswrapper[4707]: W0121 15:16:15.043598 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1081a6_b1c0_499f_bd60_82256c944e34.slice/crio-94b8ea23ce6e540075c3d68c4999d9fbbfcfe4bb1b1de97fe2aa19cf3cdd24ae WatchSource:0}: Error finding container 94b8ea23ce6e540075c3d68c4999d9fbbfcfe4bb1b1de97fe2aa19cf3cdd24ae: Status 404 returned error can't find the container with id 94b8ea23ce6e540075c3d68c4999d9fbbfcfe4bb1b1de97fe2aa19cf3cdd24ae Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.107437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" event={"ID":"de1081a6-b1c0-499f-bd60-82256c944e34","Type":"ContainerStarted","Data":"94b8ea23ce6e540075c3d68c4999d9fbbfcfe4bb1b1de97fe2aa19cf3cdd24ae"} Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.110971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" event={"ID":"6b153bbb-6db9-4837-8010-dbe2bdc51be0","Type":"ContainerStarted","Data":"e56f630af9a126bbbc370b2b6b2e43e907b3150b41ea828619518d5e49281bd3"} Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.119489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" event={"ID":"85b038b0-2f9c-475f-98e5-574468773068","Type":"ContainerStarted","Data":"8fc56089c805d1a49671c46837fd6016461d5135b47ac0e67c82784f89d2d037"} Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.127526 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vjd82"] Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.132362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" event={"ID":"f8dfc923-4af3-4ca0-9efa-f01fa14d7772","Type":"ContainerStarted","Data":"15638180935a736c0973f62a9acaf5a854d66b7f3159352d280273192ab1f042"} Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.137034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7"] Jan 21 15:16:15 crc kubenswrapper[4707]: I0121 15:16:15.140764 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-e371-account-create-update-78ppn"] Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.142169 4707 generic.go:334] "Generic (PLEG): container finished" podID="de1081a6-b1c0-499f-bd60-82256c944e34" containerID="6ef3e2522324b1b698104b8d16b38946069af9a3394c382864fecbc285f1a8ca" exitCode=0 Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.142578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" event={"ID":"de1081a6-b1c0-499f-bd60-82256c944e34","Type":"ContainerDied","Data":"6ef3e2522324b1b698104b8d16b38946069af9a3394c382864fecbc285f1a8ca"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.144271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" event={"ID":"78f1a9bb-73b3-4983-a81b-4ae1da3f521e","Type":"ContainerStarted","Data":"d90508f1cbf3651b119c08a55d0f0597a6e5bd6218b74adab843994f5824d103"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.145562 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b153bbb-6db9-4837-8010-dbe2bdc51be0" containerID="93752cb0214af1aa190397cad0406ed11bd3f6dba015c3fbf12e099f8830c859" exitCode=0 Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.145616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" event={"ID":"6b153bbb-6db9-4837-8010-dbe2bdc51be0","Type":"ContainerDied","Data":"93752cb0214af1aa190397cad0406ed11bd3f6dba015c3fbf12e099f8830c859"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.148086 4707 generic.go:334] "Generic (PLEG): container finished" podID="85b038b0-2f9c-475f-98e5-574468773068" containerID="f1984e9b20aa50544fba8536d85aecb53e0c63d0022c32cbaf91a219384719e0" exitCode=0 Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.148217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" event={"ID":"85b038b0-2f9c-475f-98e5-574468773068","Type":"ContainerDied","Data":"f1984e9b20aa50544fba8536d85aecb53e0c63d0022c32cbaf91a219384719e0"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.150290 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8dfc923-4af3-4ca0-9efa-f01fa14d7772" containerID="31896ebb7fd8a0ba8f4a099a589d633c45bbca89c2f32b8849942170749a152c" exitCode=0 Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.150343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" event={"ID":"f8dfc923-4af3-4ca0-9efa-f01fa14d7772","Type":"ContainerDied","Data":"31896ebb7fd8a0ba8f4a099a589d633c45bbca89c2f32b8849942170749a152c"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.154215 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="99016202-c215-433c-95f2-e75a8fe1a995" containerID="ca241c4a9b687cbabd145c1ec655336baf10d6aef14fd3951470b276d7bd3c4f" exitCode=0 Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.154313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" event={"ID":"99016202-c215-433c-95f2-e75a8fe1a995","Type":"ContainerDied","Data":"ca241c4a9b687cbabd145c1ec655336baf10d6aef14fd3951470b276d7bd3c4f"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.154334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" event={"ID":"99016202-c215-433c-95f2-e75a8fe1a995","Type":"ContainerStarted","Data":"ce90a4677766fd26189b8f56f701c847104c0a39a13a2c803a93ffba9a643305"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.156253 4707 generic.go:334] "Generic (PLEG): container finished" podID="8dbdd352-f6e2-4bb5-a980-af302e912c67" containerID="87541c3326360047b9a65f7edaf21d2685779154d9e8a57721bdf77d39ce2562" exitCode=0 Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.156333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" event={"ID":"8dbdd352-f6e2-4bb5-a980-af302e912c67","Type":"ContainerDied","Data":"87541c3326360047b9a65f7edaf21d2685779154d9e8a57721bdf77d39ce2562"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.156354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" event={"ID":"8dbdd352-f6e2-4bb5-a980-af302e912c67","Type":"ContainerStarted","Data":"59b91d3b9e771a52bf68cf730e880290524fbf7d6023a9258f163c39aedf059f"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.179971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"6ba3795292e347eb0052109476f7b8868a66ba15e5ffbef833b4771ba1eb7f0c"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.180014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"b57d8c8d95a43a7fe418e385ecf17ab93c271370b136b8b9ea9e9ac36a8eb7ec"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.180026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"c20ae0c2ed4b8564da8434d55d14c1622df2625d7eeb4be8dbb41dd1c95ea25d"} Jan 21 15:16:16 crc kubenswrapper[4707]: I0121 15:16:16.180035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"19f135825fe61fa9589705bc5673cab68133a4b8973d340cdca3519f686f9181"} Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.544221 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.632681 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.637970 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.644335 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.654466 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.658234 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.688492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b038b0-2f9c-475f-98e5-574468773068-operator-scripts\") pod \"85b038b0-2f9c-475f-98e5-574468773068\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.688538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhh8w\" (UniqueName: \"kubernetes.io/projected/85b038b0-2f9c-475f-98e5-574468773068-kube-api-access-xhh8w\") pod \"85b038b0-2f9c-475f-98e5-574468773068\" (UID: \"85b038b0-2f9c-475f-98e5-574468773068\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.689916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b038b0-2f9c-475f-98e5-574468773068-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85b038b0-2f9c-475f-98e5-574468773068" (UID: "85b038b0-2f9c-475f-98e5-574468773068"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.699195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b038b0-2f9c-475f-98e5-574468773068-kube-api-access-xhh8w" (OuterVolumeSpecName: "kube-api-access-xhh8w") pod "85b038b0-2f9c-475f-98e5-574468773068" (UID: "85b038b0-2f9c-475f-98e5-574468773068"). InnerVolumeSpecName "kube-api-access-xhh8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.789793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbdd352-f6e2-4bb5-a980-af302e912c67-operator-scripts\") pod \"8dbdd352-f6e2-4bb5-a980-af302e912c67\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.789870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99016202-c215-433c-95f2-e75a8fe1a995-operator-scripts\") pod \"99016202-c215-433c-95f2-e75a8fe1a995\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.789907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5n2z\" (UniqueName: \"kubernetes.io/projected/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-kube-api-access-l5n2z\") pod \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.789938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpsmn\" (UniqueName: \"kubernetes.io/projected/99016202-c215-433c-95f2-e75a8fe1a995-kube-api-access-hpsmn\") pod \"99016202-c215-433c-95f2-e75a8fe1a995\" (UID: \"99016202-c215-433c-95f2-e75a8fe1a995\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.789965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs22q\" (UniqueName: \"kubernetes.io/projected/8dbdd352-f6e2-4bb5-a980-af302e912c67-kube-api-access-fs22q\") pod \"8dbdd352-f6e2-4bb5-a980-af302e912c67\" (UID: \"8dbdd352-f6e2-4bb5-a980-af302e912c67\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.789988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1081a6-b1c0-499f-bd60-82256c944e34-operator-scripts\") pod \"de1081a6-b1c0-499f-bd60-82256c944e34\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-operator-scripts\") pod \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\" (UID: \"f8dfc923-4af3-4ca0-9efa-f01fa14d7772\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f4n8\" (UniqueName: \"kubernetes.io/projected/de1081a6-b1c0-499f-bd60-82256c944e34-kube-api-access-6f4n8\") pod \"de1081a6-b1c0-499f-bd60-82256c944e34\" (UID: \"de1081a6-b1c0-499f-bd60-82256c944e34\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbrgq\" (UniqueName: \"kubernetes.io/projected/6b153bbb-6db9-4837-8010-dbe2bdc51be0-kube-api-access-pbrgq\") pod \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b153bbb-6db9-4837-8010-dbe2bdc51be0-operator-scripts\") pod \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\" (UID: \"6b153bbb-6db9-4837-8010-dbe2bdc51be0\") " Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99016202-c215-433c-95f2-e75a8fe1a995-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99016202-c215-433c-95f2-e75a8fe1a995" (UID: "99016202-c215-433c-95f2-e75a8fe1a995"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dbdd352-f6e2-4bb5-a980-af302e912c67-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dbdd352-f6e2-4bb5-a980-af302e912c67" (UID: "8dbdd352-f6e2-4bb5-a980-af302e912c67"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790499 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b038b0-2f9c-475f-98e5-574468773068-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790511 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhh8w\" (UniqueName: \"kubernetes.io/projected/85b038b0-2f9c-475f-98e5-574468773068-kube-api-access-xhh8w\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790521 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbdd352-f6e2-4bb5-a980-af302e912c67-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790528 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99016202-c215-433c-95f2-e75a8fe1a995-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1081a6-b1c0-499f-bd60-82256c944e34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de1081a6-b1c0-499f-bd60-82256c944e34" (UID: "de1081a6-b1c0-499f-bd60-82256c944e34"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.790911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b153bbb-6db9-4837-8010-dbe2bdc51be0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b153bbb-6db9-4837-8010-dbe2bdc51be0" (UID: "6b153bbb-6db9-4837-8010-dbe2bdc51be0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.791308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8dfc923-4af3-4ca0-9efa-f01fa14d7772" (UID: "f8dfc923-4af3-4ca0-9efa-f01fa14d7772"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.793585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99016202-c215-433c-95f2-e75a8fe1a995-kube-api-access-hpsmn" (OuterVolumeSpecName: "kube-api-access-hpsmn") pod "99016202-c215-433c-95f2-e75a8fe1a995" (UID: "99016202-c215-433c-95f2-e75a8fe1a995"). InnerVolumeSpecName "kube-api-access-hpsmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.793551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-kube-api-access-l5n2z" (OuterVolumeSpecName: "kube-api-access-l5n2z") pod "f8dfc923-4af3-4ca0-9efa-f01fa14d7772" (UID: "f8dfc923-4af3-4ca0-9efa-f01fa14d7772"). InnerVolumeSpecName "kube-api-access-l5n2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.793701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1081a6-b1c0-499f-bd60-82256c944e34-kube-api-access-6f4n8" (OuterVolumeSpecName: "kube-api-access-6f4n8") pod "de1081a6-b1c0-499f-bd60-82256c944e34" (UID: "de1081a6-b1c0-499f-bd60-82256c944e34"). InnerVolumeSpecName "kube-api-access-6f4n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.793781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbdd352-f6e2-4bb5-a980-af302e912c67-kube-api-access-fs22q" (OuterVolumeSpecName: "kube-api-access-fs22q") pod "8dbdd352-f6e2-4bb5-a980-af302e912c67" (UID: "8dbdd352-f6e2-4bb5-a980-af302e912c67"). InnerVolumeSpecName "kube-api-access-fs22q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.794185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b153bbb-6db9-4837-8010-dbe2bdc51be0-kube-api-access-pbrgq" (OuterVolumeSpecName: "kube-api-access-pbrgq") pod "6b153bbb-6db9-4837-8010-dbe2bdc51be0" (UID: "6b153bbb-6db9-4837-8010-dbe2bdc51be0"). InnerVolumeSpecName "kube-api-access-pbrgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892280 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5n2z\" (UniqueName: \"kubernetes.io/projected/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-kube-api-access-l5n2z\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892312 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpsmn\" (UniqueName: \"kubernetes.io/projected/99016202-c215-433c-95f2-e75a8fe1a995-kube-api-access-hpsmn\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892322 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs22q\" (UniqueName: \"kubernetes.io/projected/8dbdd352-f6e2-4bb5-a980-af302e912c67-kube-api-access-fs22q\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892331 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1081a6-b1c0-499f-bd60-82256c944e34-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892339 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8dfc923-4af3-4ca0-9efa-f01fa14d7772-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892347 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f4n8\" (UniqueName: \"kubernetes.io/projected/de1081a6-b1c0-499f-bd60-82256c944e34-kube-api-access-6f4n8\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892355 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbrgq\" (UniqueName: \"kubernetes.io/projected/6b153bbb-6db9-4837-8010-dbe2bdc51be0-kube-api-access-pbrgq\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:17 crc kubenswrapper[4707]: I0121 15:16:17.892362 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b153bbb-6db9-4837-8010-dbe2bdc51be0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.203738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" event={"ID":"6b153bbb-6db9-4837-8010-dbe2bdc51be0","Type":"ContainerDied","Data":"e56f630af9a126bbbc370b2b6b2e43e907b3150b41ea828619518d5e49281bd3"} Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.203765 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2bfhk" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.203772 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56f630af9a126bbbc370b2b6b2e43e907b3150b41ea828619518d5e49281bd3" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.204705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" event={"ID":"85b038b0-2f9c-475f-98e5-574468773068","Type":"ContainerDied","Data":"8fc56089c805d1a49671c46837fd6016461d5135b47ac0e67c82784f89d2d037"} Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.204729 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fc56089c805d1a49671c46837fd6016461d5135b47ac0e67c82784f89d2d037" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.204709 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.205890 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.205925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-bkjdx" event={"ID":"f8dfc923-4af3-4ca0-9efa-f01fa14d7772","Type":"ContainerDied","Data":"15638180935a736c0973f62a9acaf5a854d66b7f3159352d280273192ab1f042"} Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.205966 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15638180935a736c0973f62a9acaf5a854d66b7f3159352d280273192ab1f042" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.207104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" event={"ID":"99016202-c215-433c-95f2-e75a8fe1a995","Type":"ContainerDied","Data":"ce90a4677766fd26189b8f56f701c847104c0a39a13a2c803a93ffba9a643305"} Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.207128 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce90a4677766fd26189b8f56f701c847104c0a39a13a2c803a93ffba9a643305" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.207128 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e371-account-create-update-78ppn" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.208184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" event={"ID":"8dbdd352-f6e2-4bb5-a980-af302e912c67","Type":"ContainerDied","Data":"59b91d3b9e771a52bf68cf730e880290524fbf7d6023a9258f163c39aedf059f"} Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.208209 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59b91d3b9e771a52bf68cf730e880290524fbf7d6023a9258f163c39aedf059f" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.208196 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.209196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" event={"ID":"de1081a6-b1c0-499f-bd60-82256c944e34","Type":"ContainerDied","Data":"94b8ea23ce6e540075c3d68c4999d9fbbfcfe4bb1b1de97fe2aa19cf3cdd24ae"} Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.209217 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-hp9gw" Jan 21 15:16:18 crc kubenswrapper[4707]: I0121 15:16:18.209220 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94b8ea23ce6e540075c3d68c4999d9fbbfcfe4bb1b1de97fe2aa19cf3cdd24ae" Jan 21 15:16:20 crc kubenswrapper[4707]: I0121 15:16:20.221587 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0cdf697-e82c-44fa-992a-4843848b3d33" containerID="b439aa595d2654611a443e0a117c7e818373bc9a2b50676f7467d7a722473847" exitCode=0 Jan 21 15:16:20 crc kubenswrapper[4707]: I0121 15:16:20.221776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" event={"ID":"f0cdf697-e82c-44fa-992a-4843848b3d33","Type":"ContainerDied","Data":"b439aa595d2654611a443e0a117c7e818373bc9a2b50676f7467d7a722473847"} Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.688762 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.848302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-config-data\") pod \"f0cdf697-e82c-44fa-992a-4843848b3d33\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.848362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4x4\" (UniqueName: \"kubernetes.io/projected/f0cdf697-e82c-44fa-992a-4843848b3d33-kube-api-access-7b4x4\") pod \"f0cdf697-e82c-44fa-992a-4843848b3d33\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.848387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-db-sync-config-data\") pod \"f0cdf697-e82c-44fa-992a-4843848b3d33\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.848445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-combined-ca-bundle\") pod \"f0cdf697-e82c-44fa-992a-4843848b3d33\" (UID: \"f0cdf697-e82c-44fa-992a-4843848b3d33\") " Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.854032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0cdf697-e82c-44fa-992a-4843848b3d33-kube-api-access-7b4x4" (OuterVolumeSpecName: "kube-api-access-7b4x4") pod "f0cdf697-e82c-44fa-992a-4843848b3d33" (UID: "f0cdf697-e82c-44fa-992a-4843848b3d33"). InnerVolumeSpecName "kube-api-access-7b4x4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.855680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f0cdf697-e82c-44fa-992a-4843848b3d33" (UID: "f0cdf697-e82c-44fa-992a-4843848b3d33"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.868328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0cdf697-e82c-44fa-992a-4843848b3d33" (UID: "f0cdf697-e82c-44fa-992a-4843848b3d33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.880533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-config-data" (OuterVolumeSpecName: "config-data") pod "f0cdf697-e82c-44fa-992a-4843848b3d33" (UID: "f0cdf697-e82c-44fa-992a-4843848b3d33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.949775 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.949820 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.949831 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4x4\" (UniqueName: \"kubernetes.io/projected/f0cdf697-e82c-44fa-992a-4843848b3d33-kube-api-access-7b4x4\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:21 crc kubenswrapper[4707]: I0121 15:16:21.949841 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f0cdf697-e82c-44fa-992a-4843848b3d33-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.236823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" event={"ID":"f0cdf697-e82c-44fa-992a-4843848b3d33","Type":"ContainerDied","Data":"3c1e3de5a3da48c12997969d73d8901697e1eb6241258f1a1e15a5a350726fab"} Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.236848 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-2bdzl" Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.236864 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c1e3de5a3da48c12997969d73d8901697e1eb6241258f1a1e15a5a350726fab" Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.250181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"91f811d60dd80febc0d665d8d1e4c22a670a4043e096ea0a70f5b837830a8b9f"} Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.250392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"7af2e04db14be25f173dd381e8aada1f6fca26d64ef2c1471213d94d70d441d7"} Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.255008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" event={"ID":"78f1a9bb-73b3-4983-a81b-4ae1da3f521e","Type":"ContainerStarted","Data":"e747338a48d85b7d6ace762a38846bdb6e3624ba8bf4af8b159c5363c6252c93"} Jan 21 15:16:22 crc kubenswrapper[4707]: I0121 15:16:22.269271 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" podStartSLOduration=1.4511799650000001 podStartE2EDuration="8.269257596s" podCreationTimestamp="2026-01-21 15:16:14 +0000 UTC" firstStartedPulling="2026-01-21 15:16:15.155432637 +0000 UTC m=+872.336948859" lastFinishedPulling="2026-01-21 15:16:21.973510268 +0000 UTC m=+879.155026490" observedRunningTime="2026-01-21 15:16:22.26643714 +0000 UTC m=+879.447953362" watchObservedRunningTime="2026-01-21 15:16:22.269257596 +0000 UTC m=+879.450773819" Jan 21 15:16:23 crc kubenswrapper[4707]: I0121 15:16:23.263889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"e8f6a181d9396f315c57c1fa718fed1215117042aa452cbef4eeb8c3a68400e0"} Jan 21 15:16:23 crc kubenswrapper[4707]: I0121 15:16:23.264113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"f59325ec6d78879ef0e5dc29fc782e9d998428ee8ceb5a83b319c6921289e416"} Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.275318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"767c60a84352183a0a1460f1f409ae4f3ff11f0dc3b14e0f08ef19858a8e4c10"} Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.275532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"5e3216158efccac8796405d01246e40f3bd76d0da2ad3160c835ee6479e21c40"} Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.275544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerStarted","Data":"d3dc8279851af3aeb56531212bc19a4fe8eaf713d9b19e99bb97e6317f18363f"} Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.276681 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="78f1a9bb-73b3-4983-a81b-4ae1da3f521e" containerID="e747338a48d85b7d6ace762a38846bdb6e3624ba8bf4af8b159c5363c6252c93" exitCode=0 Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.276718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" event={"ID":"78f1a9bb-73b3-4983-a81b-4ae1da3f521e","Type":"ContainerDied","Data":"e747338a48d85b7d6ace762a38846bdb6e3624ba8bf4af8b159c5363c6252c93"} Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.305378 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=20.189004646 podStartE2EDuration="35.305362749s" podCreationTimestamp="2026-01-21 15:15:49 +0000 UTC" firstStartedPulling="2026-01-21 15:16:06.852387622 +0000 UTC m=+864.033903844" lastFinishedPulling="2026-01-21 15:16:21.968745724 +0000 UTC m=+879.150261947" observedRunningTime="2026-01-21 15:16:24.29904228 +0000 UTC m=+881.480558501" watchObservedRunningTime="2026-01-21 15:16:24.305362749 +0000 UTC m=+881.486878971" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.519691 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45"] Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520020 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b153bbb-6db9-4837-8010-dbe2bdc51be0" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520038 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b153bbb-6db9-4837-8010-dbe2bdc51be0" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520048 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99016202-c215-433c-95f2-e75a8fe1a995" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520054 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99016202-c215-433c-95f2-e75a8fe1a995" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520065 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbdd352-f6e2-4bb5-a980-af302e912c67" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520071 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbdd352-f6e2-4bb5-a980-af302e912c67" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dfc923-4af3-4ca0-9efa-f01fa14d7772" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dfc923-4af3-4ca0-9efa-f01fa14d7772" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520098 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b038b0-2f9c-475f-98e5-574468773068" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520103 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b038b0-2f9c-475f-98e5-574468773068" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520113 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1081a6-b1c0-499f-bd60-82256c944e34" 
containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520118 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1081a6-b1c0-499f-bd60-82256c944e34" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: E0121 15:16:24.520129 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0cdf697-e82c-44fa-992a-4843848b3d33" containerName="glance-db-sync" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520135 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cdf697-e82c-44fa-992a-4843848b3d33" containerName="glance-db-sync" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520251 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b153bbb-6db9-4837-8010-dbe2bdc51be0" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520264 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbdd352-f6e2-4bb5-a980-af302e912c67" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520273 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0cdf697-e82c-44fa-992a-4843848b3d33" containerName="glance-db-sync" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520284 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1081a6-b1c0-499f-bd60-82256c944e34" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520291 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b038b0-2f9c-475f-98e5-574468773068" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520299 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="99016202-c215-433c-95f2-e75a8fe1a995" containerName="mariadb-account-create-update" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dfc923-4af3-4ca0-9efa-f01fa14d7772" containerName="mariadb-database-create" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.520966 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.523935 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.529105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45"] Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.686441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74kmf\" (UniqueName: \"kubernetes.io/projected/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-kube-api-access-74kmf\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.686493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.686778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.686852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-config\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.787884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.788158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-config\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.788321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74kmf\" (UniqueName: \"kubernetes.io/projected/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-kube-api-access-74kmf\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.788415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.788604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.789128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-config\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.789207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.803502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74kmf\" (UniqueName: \"kubernetes.io/projected/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-kube-api-access-74kmf\") pod \"dnsmasq-dnsmasq-84fc46597c-zfl45\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:24 crc kubenswrapper[4707]: I0121 15:16:24.846603 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.206093 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45"] Jan 21 15:16:25 crc kubenswrapper[4707]: W0121 15:16:25.212003 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb09ad3d_cc59_429f_bbe8_ba6aec2f02c0.slice/crio-00e604df0934a581604e8556520d8697b2ded5b6546bb60c05f3de1ae5cdd740 WatchSource:0}: Error finding container 00e604df0934a581604e8556520d8697b2ded5b6546bb60c05f3de1ae5cdd740: Status 404 returned error can't find the container with id 00e604df0934a581604e8556520d8697b2ded5b6546bb60c05f3de1ae5cdd740 Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.282531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" event={"ID":"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0","Type":"ContainerStarted","Data":"00e604df0934a581604e8556520d8697b2ded5b6546bb60c05f3de1ae5cdd740"} Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.473710 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.599491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq9v6\" (UniqueName: \"kubernetes.io/projected/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-kube-api-access-hq9v6\") pod \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.599576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-config-data\") pod \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.599668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-combined-ca-bundle\") pod \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\" (UID: \"78f1a9bb-73b3-4983-a81b-4ae1da3f521e\") " Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.602669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-kube-api-access-hq9v6" (OuterVolumeSpecName: "kube-api-access-hq9v6") pod "78f1a9bb-73b3-4983-a81b-4ae1da3f521e" (UID: "78f1a9bb-73b3-4983-a81b-4ae1da3f521e"). InnerVolumeSpecName "kube-api-access-hq9v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.615398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78f1a9bb-73b3-4983-a81b-4ae1da3f521e" (UID: "78f1a9bb-73b3-4983-a81b-4ae1da3f521e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.626579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-config-data" (OuterVolumeSpecName: "config-data") pod "78f1a9bb-73b3-4983-a81b-4ae1da3f521e" (UID: "78f1a9bb-73b3-4983-a81b-4ae1da3f521e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.701507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq9v6\" (UniqueName: \"kubernetes.io/projected/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-kube-api-access-hq9v6\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.701536 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:25 crc kubenswrapper[4707]: I0121 15:16:25.701547 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1a9bb-73b3-4983-a81b-4ae1da3f521e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.289306 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerID="9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32" exitCode=0 Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.289370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" event={"ID":"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0","Type":"ContainerDied","Data":"9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32"} Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.290485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" event={"ID":"78f1a9bb-73b3-4983-a81b-4ae1da3f521e","Type":"ContainerDied","Data":"d90508f1cbf3651b119c08a55d0f0597a6e5bd6218b74adab843994f5824d103"} Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.290512 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90508f1cbf3651b119c08a55d0f0597a6e5bd6218b74adab843994f5824d103" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.290556 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vjd82" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.485156 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dxt6x"] Jan 21 15:16:26 crc kubenswrapper[4707]: E0121 15:16:26.485635 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f1a9bb-73b3-4983-a81b-4ae1da3f521e" containerName="keystone-db-sync" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.485652 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f1a9bb-73b3-4983-a81b-4ae1da3f521e" containerName="keystone-db-sync" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.485843 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f1a9bb-73b3-4983-a81b-4ae1da3f521e" containerName="keystone-db-sync" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.486287 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.488618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.488700 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.488933 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.489059 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.489078 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5sl4v" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.506107 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dxt6x"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.579474 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjwjh"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.580736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.596821 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjwjh"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.613027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-config-data\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.613110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-fernet-keys\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.613145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kstks\" (UniqueName: \"kubernetes.io/projected/e9cad7a1-0127-4377-a62f-8bc34f7812ca-kube-api-access-kstks\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.613179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-credential-keys\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.613199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-combined-ca-bundle\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.613221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-scripts\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.665208 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-99p25"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.666065 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.667603 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.669276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.674860 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-99p25"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.677462 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-2ld2h" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.706049 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.707649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-config-data\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-catalog-content\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-utilities\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-fernet-keys\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kstks\" (UniqueName: \"kubernetes.io/projected/e9cad7a1-0127-4377-a62f-8bc34f7812ca-kube-api-access-kstks\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-credential-keys\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-combined-ca-bundle\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-scripts\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.714665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlktm\" (UniqueName: \"kubernetes.io/projected/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-kube-api-access-dlktm\") pod \"redhat-marketplace-mjwjh\" (UID: 
\"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.716282 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.732280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.732436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-scripts\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.732541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.733085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-combined-ca-bundle\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.733139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-credential-keys\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.733188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-config-data\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.737235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-fernet-keys\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.748315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kstks\" (UniqueName: \"kubernetes.io/projected/e9cad7a1-0127-4377-a62f-8bc34f7812ca-kube-api-access-kstks\") pod \"keystone-bootstrap-dxt6x\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.780247 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-87slz"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.781164 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.785617 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.785644 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.785912 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-pm765" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.796854 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-stp4w"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.797720 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.800261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-g88h6" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.800278 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.800463 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.807916 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-stp4w"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-combined-ca-bundle\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-scripts\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-catalog-content\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-utilities\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhrsl\" (UniqueName: 
\"kubernetes.io/projected/72983002-27f2-464c-9e83-cc2dd28ce9ea-kube-api-access-nhrsl\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-log-httpd\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72983002-27f2-464c-9e83-cc2dd28ce9ea-etc-machine-id\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-config-data\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-run-httpd\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-scripts\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlktm\" (UniqueName: \"kubernetes.io/projected/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-kube-api-access-dlktm\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-config-data\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckstq\" (UniqueName: 
\"kubernetes.io/projected/32aa4051-0ca7-4082-9693-ebd95e4ee508-kube-api-access-ckstq\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.815976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.816007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-db-sync-config-data\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.816420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-catalog-content\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.816727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-utilities\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.819980 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-87slz"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.825797 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-wkbgh"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.826903 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.830298 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.830712 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.830995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-9zkld" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.837232 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-wkbgh"] Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.852710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlktm\" (UniqueName: \"kubernetes.io/projected/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-kube-api-access-dlktm\") pod \"redhat-marketplace-mjwjh\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.893233 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzj4\" (UniqueName: \"kubernetes.io/projected/f80e12a4-1af2-441c-89f4-c8c54e590aee-kube-api-access-mfzj4\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80e12a4-1af2-441c-89f4-c8c54e590aee-logs\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhrsl\" (UniqueName: \"kubernetes.io/projected/72983002-27f2-464c-9e83-cc2dd28ce9ea-kube-api-access-nhrsl\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-log-httpd\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqb6t\" (UniqueName: \"kubernetes.io/projected/96a28438-f0ba-433d-84f7-11838f51c57c-kube-api-access-lqb6t\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72983002-27f2-464c-9e83-cc2dd28ce9ea-etc-machine-id\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-config-data\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-combined-ca-bundle\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-config-data\") pod \"placement-db-sync-wkbgh\" (UID: 
\"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-run-httpd\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqpxh\" (UniqueName: \"kubernetes.io/projected/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-kube-api-access-bqpxh\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-scripts\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-combined-ca-bundle\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-combined-ca-bundle\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-db-sync-config-data\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-config-data\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924504 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckstq\" (UniqueName: \"kubernetes.io/projected/32aa4051-0ca7-4082-9693-ebd95e4ee508-kube-api-access-ckstq\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-config\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-db-sync-config-data\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-scripts\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-combined-ca-bundle\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-scripts\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.924912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-run-httpd\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.925230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-log-httpd\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.925294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72983002-27f2-464c-9e83-cc2dd28ce9ea-etc-machine-id\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.927697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-scripts\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.929148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-config-data\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.929263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-combined-ca-bundle\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.932195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-db-sync-config-data\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.932377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.932501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-scripts\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.933033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-config-data\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.936882 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhrsl\" (UniqueName: \"kubernetes.io/projected/72983002-27f2-464c-9e83-cc2dd28ce9ea-kube-api-access-nhrsl\") pod \"cinder-db-sync-99p25\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.943332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckstq\" (UniqueName: \"kubernetes.io/projected/32aa4051-0ca7-4082-9693-ebd95e4ee508-kube-api-access-ckstq\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.945293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:26 crc kubenswrapper[4707]: I0121 15:16:26.979157 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80e12a4-1af2-441c-89f4-c8c54e590aee-logs\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqb6t\" (UniqueName: \"kubernetes.io/projected/96a28438-f0ba-433d-84f7-11838f51c57c-kube-api-access-lqb6t\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-combined-ca-bundle\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-config-data\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqpxh\" (UniqueName: \"kubernetes.io/projected/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-kube-api-access-bqpxh\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-combined-ca-bundle\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-combined-ca-bundle\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.033973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-db-sync-config-data\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.034006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-config\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.034061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-scripts\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.034126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzj4\" (UniqueName: \"kubernetes.io/projected/f80e12a4-1af2-441c-89f4-c8c54e590aee-kube-api-access-mfzj4\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.034748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80e12a4-1af2-441c-89f4-c8c54e590aee-logs\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.040430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-combined-ca-bundle\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.040820 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.041272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-scripts\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.047152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-combined-ca-bundle\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.047590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-db-sync-config-data\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.047737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-combined-ca-bundle\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.049944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-config\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.051544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqpxh\" (UniqueName: \"kubernetes.io/projected/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-kube-api-access-bqpxh\") pod \"neutron-db-sync-87slz\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.052510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-config-data\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.062265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqb6t\" (UniqueName: \"kubernetes.io/projected/96a28438-f0ba-433d-84f7-11838f51c57c-kube-api-access-lqb6t\") pod \"barbican-db-sync-stp4w\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.068677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzj4\" (UniqueName: \"kubernetes.io/projected/f80e12a4-1af2-441c-89f4-c8c54e590aee-kube-api-access-mfzj4\") pod \"placement-db-sync-wkbgh\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: 
I0121 15:16:27.102377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.197167 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.203049 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.237800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dxt6x"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.311375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" event={"ID":"e9cad7a1-0127-4377-a62f-8bc34f7812ca","Type":"ContainerStarted","Data":"7a332ff5c04b7d61ff569ed1de13cbdcfb8c7d6e40de02d4c667b953a69fd4d2"} Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.316863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" event={"ID":"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0","Type":"ContainerStarted","Data":"fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426"} Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.318933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.344869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjwjh"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.350463 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" podStartSLOduration=3.350447801 podStartE2EDuration="3.350447801s" podCreationTimestamp="2026-01-21 15:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:27.343473711 +0000 UTC m=+884.524989934" watchObservedRunningTime="2026-01-21 15:16:27.350447801 +0000 UTC m=+884.531964023" Jan 21 15:16:27 crc kubenswrapper[4707]: W0121 15:16:27.374705 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8d6b3e6_0dc7_425f_b6bf_5fefd802cf13.slice/crio-dbf5d88e0f57bacf7aba5dfd9f87ae8d038330ec154ca60991710c1455367203 WatchSource:0}: Error finding container dbf5d88e0f57bacf7aba5dfd9f87ae8d038330ec154ca60991710c1455367203: Status 404 returned error can't find the container with id dbf5d88e0f57bacf7aba5dfd9f87ae8d038330ec154ca60991710c1455367203 Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.461467 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-99p25"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.543529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.545000 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.554194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.554772 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-zv4ct" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.554926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.555381 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.558053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.566437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.593410 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.594553 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.598147 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.598432 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.610419 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8nc\" (UniqueName: \"kubernetes.io/projected/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-kube-api-access-kj8nc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-logs\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.645349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.686163 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-87slz"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.691365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-stp4w"] Jan 21 15:16:27 crc kubenswrapper[4707]: W0121 15:16:27.702317 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda331f9ff_ccbb_4a26_a4c7_3dac83efd4f7.slice/crio-2e92c4d2038ada567083e22338a043e8eab0f07602278d0603b3a6962dca6a7e WatchSource:0}: Error finding container 2e92c4d2038ada567083e22338a043e8eab0f07602278d0603b3a6962dca6a7e: Status 404 returned error can't find the container with id 2e92c4d2038ada567083e22338a043e8eab0f07602278d0603b3a6962dca6a7e Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvf6\" (UniqueName: \"kubernetes.io/projected/7306506e-0676-466d-a570-29076f4d6cfd-kube-api-access-2hvf6\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-logs\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747543 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.747730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8nc\" (UniqueName: \"kubernetes.io/projected/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-kube-api-access-kj8nc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.750200 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-logs\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.750378 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.750431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.753297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.753687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.757227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.762427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.763186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8nc\" (UniqueName: \"kubernetes.io/projected/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-kube-api-access-kj8nc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.779284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.828099 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-wkbgh"] Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.848705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.848805 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvf6\" (UniqueName: \"kubernetes.io/projected/7306506e-0676-466d-a570-29076f4d6cfd-kube-api-access-2hvf6\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.848854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.848871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.848918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.849012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.849071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.849097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.849274 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.849508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.852459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.854179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.854331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.854670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.854678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.862208 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hvf6\" (UniqueName: \"kubernetes.io/projected/7306506e-0676-466d-a570-29076f4d6cfd-kube-api-access-2hvf6\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.865971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.918955 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:27 crc kubenswrapper[4707]: I0121 15:16:27.929403 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.336493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerStarted","Data":"ac3e78c3af9f94516aec40662c1fbbd1519fe2966135fa21aa7e15150f6bc050"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.343949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.351294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-87slz" event={"ID":"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7","Type":"ContainerStarted","Data":"30eeef85500bce636f1829c179683cddf9eb3aacb9a8c8f4b817abaab927f81f"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.351345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-87slz" event={"ID":"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7","Type":"ContainerStarted","Data":"2e92c4d2038ada567083e22338a043e8eab0f07602278d0603b3a6962dca6a7e"} Jan 21 15:16:28 crc kubenswrapper[4707]: W0121 15:16:28.369028 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7adcdf5_70d1_4382_a90e_d1c9433bf37b.slice/crio-f6fe22e1d59629c94539b7913834beb6658e24e4555e518693d7a3b2a7c393a4 WatchSource:0}: Error finding container f6fe22e1d59629c94539b7913834beb6658e24e4555e518693d7a3b2a7c393a4: Status 404 returned error can't find the container with id f6fe22e1d59629c94539b7913834beb6658e24e4555e518693d7a3b2a7c393a4 Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.369881 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerID="bf9b865aca35b05004f872249588f4e527e2fe3bdbee5da191a8515248b37d20" exitCode=0 Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.369945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjwjh" event={"ID":"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13","Type":"ContainerDied","Data":"bf9b865aca35b05004f872249588f4e527e2fe3bdbee5da191a8515248b37d20"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.369982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjwjh" event={"ID":"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13","Type":"ContainerStarted","Data":"dbf5d88e0f57bacf7aba5dfd9f87ae8d038330ec154ca60991710c1455367203"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.373818 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-87slz" podStartSLOduration=2.373794898 podStartE2EDuration="2.373794898s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:28.371212129 +0000 UTC m=+885.552728351" watchObservedRunningTime="2026-01-21 15:16:28.373794898 +0000 UTC m=+885.555311120" Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.420137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" event={"ID":"f80e12a4-1af2-441c-89f4-c8c54e590aee","Type":"ContainerStarted","Data":"4f056b0d0eb5134adfc3ae0e548fdf8835859d2cde6dbed81d320896f9d8c0d7"} Jan 21 
15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.466441 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.492028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" event={"ID":"e9cad7a1-0127-4377-a62f-8bc34f7812ca","Type":"ContainerStarted","Data":"8e63927c2f73623e25aa4f9c0e08d2cdd5be5f7361f328bb4e017a36264b524f"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.510184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" event={"ID":"96a28438-f0ba-433d-84f7-11838f51c57c","Type":"ContainerStarted","Data":"1c19892dd5b150e7f3ffb5dbf25ec827fd20704c5337d1488a3cce9b734225a0"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.519427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-99p25" event={"ID":"72983002-27f2-464c-9e83-cc2dd28ce9ea","Type":"ContainerStarted","Data":"592874b4a90757ffca67e79afe1c81bd120a684c3b44942967a9802754fa9591"} Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.521564 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" podStartSLOduration=2.521550705 podStartE2EDuration="2.521550705s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:28.514604889 +0000 UTC m=+885.696121111" watchObservedRunningTime="2026-01-21 15:16:28.521550705 +0000 UTC m=+885.703066927" Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.637576 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.653405 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:16:28 crc kubenswrapper[4707]: I0121 15:16:28.706685 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:29 crc kubenswrapper[4707]: I0121 15:16:29.526755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c7adcdf5-70d1-4382-a90e-d1c9433bf37b","Type":"ContainerStarted","Data":"9ed30b81f679371eaa72d2636ee9c1c49d5a67930c8383b19ca227f2d96c1db9"} Jan 21 15:16:29 crc kubenswrapper[4707]: I0121 15:16:29.527013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c7adcdf5-70d1-4382-a90e-d1c9433bf37b","Type":"ContainerStarted","Data":"f6fe22e1d59629c94539b7913834beb6658e24e4555e518693d7a3b2a7c393a4"} Jan 21 15:16:29 crc kubenswrapper[4707]: I0121 15:16:29.528424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7306506e-0676-466d-a570-29076f4d6cfd","Type":"ContainerStarted","Data":"c4e90eeab155bdb0f94b3fd0c61445fa0a9cd0288005686edd13881491d8d204"} Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.551393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7306506e-0676-466d-a570-29076f4d6cfd","Type":"ContainerStarted","Data":"f1bd0a39dc35ecde21c687ca8d92237b86851cd03ee68656043e7e3f8f67184c"} Jan 21 15:16:30 crc 
kubenswrapper[4707]: I0121 15:16:30.551728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7306506e-0676-466d-a570-29076f4d6cfd","Type":"ContainerStarted","Data":"757af711bf504652a3318a6cefab095399aba3ad818de782aa6f2b411c9a269c"} Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.551518 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-httpd" containerID="cri-o://757af711bf504652a3318a6cefab095399aba3ad818de782aa6f2b411c9a269c" gracePeriod=30 Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.551480 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-log" containerID="cri-o://f1bd0a39dc35ecde21c687ca8d92237b86851cd03ee68656043e7e3f8f67184c" gracePeriod=30 Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.553437 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerID="9fa8ae9358ce43401a9d0f4bbb4883f9ca1e9fcfb429d47df2e15d51f6f817c6" exitCode=0 Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.553589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjwjh" event={"ID":"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13","Type":"ContainerDied","Data":"9fa8ae9358ce43401a9d0f4bbb4883f9ca1e9fcfb429d47df2e15d51f6f817c6"} Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.558518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c7adcdf5-70d1-4382-a90e-d1c9433bf37b","Type":"ContainerStarted","Data":"e4c50d1e93862882939da23f3551cfa98be0f63964bcfe8d651a33021eb5ef9a"} Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.558626 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-log" containerID="cri-o://9ed30b81f679371eaa72d2636ee9c1c49d5a67930c8383b19ca227f2d96c1db9" gracePeriod=30 Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.558780 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-httpd" containerID="cri-o://e4c50d1e93862882939da23f3551cfa98be0f63964bcfe8d651a33021eb5ef9a" gracePeriod=30 Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.560952 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9cad7a1-0127-4377-a62f-8bc34f7812ca" containerID="8e63927c2f73623e25aa4f9c0e08d2cdd5be5f7361f328bb4e017a36264b524f" exitCode=0 Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.561005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" event={"ID":"e9cad7a1-0127-4377-a62f-8bc34f7812ca","Type":"ContainerDied","Data":"8e63927c2f73623e25aa4f9c0e08d2cdd5be5f7361f328bb4e017a36264b524f"} Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.571596 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.571580824 podStartE2EDuration="4.571580824s" 
podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:30.569770809 +0000 UTC m=+887.751287031" watchObservedRunningTime="2026-01-21 15:16:30.571580824 +0000 UTC m=+887.753097046" Jan 21 15:16:30 crc kubenswrapper[4707]: I0121 15:16:30.598475 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.59846052 podStartE2EDuration="4.59846052s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:30.595986756 +0000 UTC m=+887.777502979" watchObservedRunningTime="2026-01-21 15:16:30.59846052 +0000 UTC m=+887.779976742" Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.569331 4707 generic.go:334] "Generic (PLEG): container finished" podID="7306506e-0676-466d-a570-29076f4d6cfd" containerID="757af711bf504652a3318a6cefab095399aba3ad818de782aa6f2b411c9a269c" exitCode=0 Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.569516 4707 generic.go:334] "Generic (PLEG): container finished" podID="7306506e-0676-466d-a570-29076f4d6cfd" containerID="f1bd0a39dc35ecde21c687ca8d92237b86851cd03ee68656043e7e3f8f67184c" exitCode=143 Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.569553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7306506e-0676-466d-a570-29076f4d6cfd","Type":"ContainerDied","Data":"757af711bf504652a3318a6cefab095399aba3ad818de782aa6f2b411c9a269c"} Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.569577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7306506e-0676-466d-a570-29076f4d6cfd","Type":"ContainerDied","Data":"f1bd0a39dc35ecde21c687ca8d92237b86851cd03ee68656043e7e3f8f67184c"} Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.571299 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerID="e4c50d1e93862882939da23f3551cfa98be0f63964bcfe8d651a33021eb5ef9a" exitCode=0 Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.571317 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerID="9ed30b81f679371eaa72d2636ee9c1c49d5a67930c8383b19ca227f2d96c1db9" exitCode=143 Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.571480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c7adcdf5-70d1-4382-a90e-d1c9433bf37b","Type":"ContainerDied","Data":"e4c50d1e93862882939da23f3551cfa98be0f63964bcfe8d651a33021eb5ef9a"} Jan 21 15:16:31 crc kubenswrapper[4707]: I0121 15:16:31.571499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c7adcdf5-70d1-4382-a90e-d1c9433bf37b","Type":"ContainerDied","Data":"9ed30b81f679371eaa72d2636ee9c1c49d5a67930c8383b19ca227f2d96c1db9"} Jan 21 15:16:34 crc kubenswrapper[4707]: I0121 15:16:34.601224 4707 generic.go:334] "Generic (PLEG): container finished" podID="a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" containerID="30eeef85500bce636f1829c179683cddf9eb3aacb9a8c8f4b817abaab927f81f" exitCode=0 Jan 21 15:16:34 crc kubenswrapper[4707]: I0121 15:16:34.601306 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-87slz" event={"ID":"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7","Type":"ContainerDied","Data":"30eeef85500bce636f1829c179683cddf9eb3aacb9a8c8f4b817abaab927f81f"} Jan 21 15:16:34 crc kubenswrapper[4707]: I0121 15:16:34.848375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:16:34 crc kubenswrapper[4707]: I0121 15:16:34.903153 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9"] Jan 21 15:16:34 crc kubenswrapper[4707]: I0121 15:16:34.903354 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="dnsmasq-dns" containerID="cri-o://17e5c220eb052e49499145259c9bfa39182e05fd853538d79c61f7b134299f6e" gracePeriod=10 Jan 21 15:16:35 crc kubenswrapper[4707]: I0121 15:16:35.608314 4707 generic.go:334] "Generic (PLEG): container finished" podID="27118245-cbcd-4382-97b9-d1c36df77cee" containerID="17e5c220eb052e49499145259c9bfa39182e05fd853538d79c61f7b134299f6e" exitCode=0 Jan 21 15:16:35 crc kubenswrapper[4707]: I0121 15:16:35.608393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" event={"ID":"27118245-cbcd-4382-97b9-d1c36df77cee","Type":"ContainerDied","Data":"17e5c220eb052e49499145259c9bfa39182e05fd853538d79c61f7b134299f6e"} Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.494015 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.616002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" event={"ID":"e9cad7a1-0127-4377-a62f-8bc34f7812ca","Type":"ContainerDied","Data":"7a332ff5c04b7d61ff569ed1de13cbdcfb8c7d6e40de02d4c667b953a69fd4d2"} Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.616030 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a332ff5c04b7d61ff569ed1de13cbdcfb8c7d6e40de02d4c667b953a69fd4d2" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.616051 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dxt6x" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.617562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-config-data\") pod \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.617603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-scripts\") pod \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.617660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kstks\" (UniqueName: \"kubernetes.io/projected/e9cad7a1-0127-4377-a62f-8bc34f7812ca-kube-api-access-kstks\") pod \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.617682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-fernet-keys\") pod \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.617717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-combined-ca-bundle\") pod \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.617793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-credential-keys\") pod \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\" (UID: \"e9cad7a1-0127-4377-a62f-8bc34f7812ca\") " Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.623131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e9cad7a1-0127-4377-a62f-8bc34f7812ca" (UID: "e9cad7a1-0127-4377-a62f-8bc34f7812ca"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.625339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-scripts" (OuterVolumeSpecName: "scripts") pod "e9cad7a1-0127-4377-a62f-8bc34f7812ca" (UID: "e9cad7a1-0127-4377-a62f-8bc34f7812ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.625371 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9cad7a1-0127-4377-a62f-8bc34f7812ca-kube-api-access-kstks" (OuterVolumeSpecName: "kube-api-access-kstks") pod "e9cad7a1-0127-4377-a62f-8bc34f7812ca" (UID: "e9cad7a1-0127-4377-a62f-8bc34f7812ca"). InnerVolumeSpecName "kube-api-access-kstks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.637115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9cad7a1-0127-4377-a62f-8bc34f7812ca" (UID: "e9cad7a1-0127-4377-a62f-8bc34f7812ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.638893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-config-data" (OuterVolumeSpecName: "config-data") pod "e9cad7a1-0127-4377-a62f-8bc34f7812ca" (UID: "e9cad7a1-0127-4377-a62f-8bc34f7812ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.641832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e9cad7a1-0127-4377-a62f-8bc34f7812ca" (UID: "e9cad7a1-0127-4377-a62f-8bc34f7812ca"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.719751 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kstks\" (UniqueName: \"kubernetes.io/projected/e9cad7a1-0127-4377-a62f-8bc34f7812ca-kube-api-access-kstks\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.719779 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.719789 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.719798 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.719823 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:36 crc kubenswrapper[4707]: I0121 15:16:36.719831 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cad7a1-0127-4377-a62f-8bc34f7812ca-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.610119 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dxt6x"] Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.614782 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dxt6x"] Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.714187 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kq52z"] Jan 21 15:16:37 crc kubenswrapper[4707]: E0121 15:16:37.714611 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9cad7a1-0127-4377-a62f-8bc34f7812ca" containerName="keystone-bootstrap" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.714696 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9cad7a1-0127-4377-a62f-8bc34f7812ca" containerName="keystone-bootstrap" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.715075 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9cad7a1-0127-4377-a62f-8bc34f7812ca" containerName="keystone-bootstrap" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.715781 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.720333 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.720365 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5sl4v" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.720442 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.720514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.720531 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.730605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kq52z"] Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.842637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-fernet-keys\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.842775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-combined-ca-bundle\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.842847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-credential-keys\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.842878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2njc5\" (UniqueName: \"kubernetes.io/projected/1c33dad8-6a4e-40fd-8f84-1329e367ffac-kube-api-access-2njc5\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.842906 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-scripts\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.842981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-config-data\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.944911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-combined-ca-bundle\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.945009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-credential-keys\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.945046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2njc5\" (UniqueName: \"kubernetes.io/projected/1c33dad8-6a4e-40fd-8f84-1329e367ffac-kube-api-access-2njc5\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.945071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-scripts\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.945130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-config-data\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.945158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-fernet-keys\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.949948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-config-data\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.950239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-credential-keys\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.950413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-fernet-keys\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.950510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-combined-ca-bundle\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.960439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-scripts\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:37 crc kubenswrapper[4707]: I0121 15:16:37.961877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2njc5\" (UniqueName: \"kubernetes.io/projected/1c33dad8-6a4e-40fd-8f84-1329e367ffac-kube-api-access-2njc5\") pod \"keystone-bootstrap-kq52z\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:38 crc kubenswrapper[4707]: I0121 15:16:38.036406 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:39 crc kubenswrapper[4707]: I0121 15:16:39.189558 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9cad7a1-0127-4377-a62f-8bc34f7812ca" path="/var/lib/kubelet/pods/e9cad7a1-0127-4377-a62f-8bc34f7812ca/volumes" Jan 21 15:16:39 crc kubenswrapper[4707]: I0121 15:16:39.278395 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: connect: connection refused" Jan 21 15:16:42 crc kubenswrapper[4707]: E0121 15:16:42.579833 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Jan 21 15:16:42 crc kubenswrapper[4707]: E0121 15:16:42.580229 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h8fh8ch55fhc7hfh655h5b9hd5h65dh588h55fh554h5f9hb9h5d9h95h678h54dhd6h9ch58bh8bh75hd9hcdh648h5c5h67bh684hc5h65cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckstq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ceilometer-0_openstack-kuttl-tests(32aa4051-0ca7-4082-9693-ebd95e4ee508): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:16:42 crc kubenswrapper[4707]: E0121 15:16:42.945054 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Jan 21 15:16:42 crc kubenswrapper[4707]: E0121 15:16:42.945374 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqb6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-stp4w_openstack-kuttl-tests(96a28438-f0ba-433d-84f7-11838f51c57c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:16:42 crc kubenswrapper[4707]: E0121 15:16:42.946902 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" podUID="96a28438-f0ba-433d-84f7-11838f51c57c" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.002361 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.007300 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.010605 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrcg\" (UniqueName: \"kubernetes.io/projected/27118245-cbcd-4382-97b9-d1c36df77cee-kube-api-access-jvrcg\") pod \"27118245-cbcd-4382-97b9-d1c36df77cee\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj8nc\" (UniqueName: \"kubernetes.io/projected/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-kube-api-access-kj8nc\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-config\") pod \"27118245-cbcd-4382-97b9-d1c36df77cee\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-public-tls-certs\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-config-data\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-combined-ca-bundle\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqpxh\" (UniqueName: \"kubernetes.io/projected/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-kube-api-access-bqpxh\") pod \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-config\") pod \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\" (UID: \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-combined-ca-bundle\") pod \"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\" (UID: 
\"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.127992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-dnsmasq-svc\") pod \"27118245-cbcd-4382-97b9-d1c36df77cee\" (UID: \"27118245-cbcd-4382-97b9-d1c36df77cee\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.128024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-logs\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.128038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-scripts\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.128051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-httpd-run\") pod \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\" (UID: \"c7adcdf5-70d1-4382-a90e-d1c9433bf37b\") " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.128651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.129371 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-logs" (OuterVolumeSpecName: "logs") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.132415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-kube-api-access-kj8nc" (OuterVolumeSpecName: "kube-api-access-kj8nc") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "kube-api-access-kj8nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.133484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-scripts" (OuterVolumeSpecName: "scripts") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.133891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27118245-cbcd-4382-97b9-d1c36df77cee-kube-api-access-jvrcg" (OuterVolumeSpecName: "kube-api-access-jvrcg") pod "27118245-cbcd-4382-97b9-d1c36df77cee" (UID: "27118245-cbcd-4382-97b9-d1c36df77cee"). InnerVolumeSpecName "kube-api-access-jvrcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.139728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-kube-api-access-bqpxh" (OuterVolumeSpecName: "kube-api-access-bqpxh") pod "a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" (UID: "a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7"). InnerVolumeSpecName "kube-api-access-bqpxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.139839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.147365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-config" (OuterVolumeSpecName: "config") pod "a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" (UID: "a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.152027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.154935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" (UID: "a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.160297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-config" (OuterVolumeSpecName: "config") pod "27118245-cbcd-4382-97b9-d1c36df77cee" (UID: "27118245-cbcd-4382-97b9-d1c36df77cee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.163272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "27118245-cbcd-4382-97b9-d1c36df77cee" (UID: "27118245-cbcd-4382-97b9-d1c36df77cee"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.165501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.182988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-config-data" (OuterVolumeSpecName: "config-data") pod "c7adcdf5-70d1-4382-a90e-d1c9433bf37b" (UID: "c7adcdf5-70d1-4382-a90e-d1c9433bf37b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229399 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrcg\" (UniqueName: \"kubernetes.io/projected/27118245-cbcd-4382-97b9-d1c36df77cee-kube-api-access-jvrcg\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229499 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229554 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj8nc\" (UniqueName: \"kubernetes.io/projected/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-kube-api-access-kj8nc\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229614 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229668 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229719 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229764 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229822 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqpxh\" (UniqueName: \"kubernetes.io/projected/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-kube-api-access-bqpxh\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229870 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229912 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.229963 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/27118245-cbcd-4382-97b9-d1c36df77cee-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.230026 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.230069 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.230117 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7adcdf5-70d1-4382-a90e-d1c9433bf37b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.242390 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.332417 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.657874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-87slz" event={"ID":"a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7","Type":"ContainerDied","Data":"2e92c4d2038ada567083e22338a043e8eab0f07602278d0603b3a6962dca6a7e"} Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.658116 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e92c4d2038ada567083e22338a043e8eab0f07602278d0603b3a6962dca6a7e" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.657884 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-87slz" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.660725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c7adcdf5-70d1-4382-a90e-d1c9433bf37b","Type":"ContainerDied","Data":"f6fe22e1d59629c94539b7913834beb6658e24e4555e518693d7a3b2a7c393a4"} Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.660757 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.660785 4707 scope.go:117] "RemoveContainer" containerID="e4c50d1e93862882939da23f3551cfa98be0f63964bcfe8d651a33021eb5ef9a" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.663299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" event={"ID":"27118245-cbcd-4382-97b9-d1c36df77cee","Type":"ContainerDied","Data":"c494924c0291252805e9e7f1a499b9220c10e36ff9d9c7123d5beb687fb2994f"} Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.663324 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.664707 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" podUID="96a28438-f0ba-433d-84f7-11838f51c57c" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.710662 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.718331 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.724298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9"] Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.729904 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-57kz9"] Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.736462 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.736783 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-httpd" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.736795 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-httpd" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.736827 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="dnsmasq-dns" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.736834 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="dnsmasq-dns" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.736852 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" containerName="neutron-db-sync" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.736858 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" containerName="neutron-db-sync" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.736872 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="init" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.736877 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="init" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.736891 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-log" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.736898 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-log" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.737035 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" 
containerName="glance-httpd" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.737048 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" containerName="dnsmasq-dns" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.737062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" containerName="glance-log" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.737067 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" containerName="neutron-db-sync" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.737786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.742989 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.745170 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.745312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-logs\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840680 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jzb8\" (UniqueName: \"kubernetes.io/projected/a0149516-3649-42aa-a3c7-fe89b3a453b7-kube-api-access-9jzb8\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.840793 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.897755 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.898250 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhrsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-99p25_openstack-kuttl-tests(72983002-27f2-464c-9e83-cc2dd28ce9ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:16:43 crc kubenswrapper[4707]: E0121 15:16:43.899408 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/cinder-db-sync-99p25" podUID="72983002-27f2-464c-9e83-cc2dd28ce9ea" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.903475 4707 scope.go:117] "RemoveContainer" containerID="9ed30b81f679371eaa72d2636ee9c1c49d5a67930c8383b19ca227f2d96c1db9" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jzb8\" (UniqueName: \"kubernetes.io/projected/a0149516-3649-42aa-a3c7-fe89b3a453b7-kube-api-access-9jzb8\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-logs\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.944949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.945135 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.945666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-logs\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.945953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.950578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.951158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.951847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.955444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.962735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jzb8\" (UniqueName: \"kubernetes.io/projected/a0149516-3649-42aa-a3c7-fe89b3a453b7-kube-api-access-9jzb8\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:43 crc kubenswrapper[4707]: I0121 15:16:43.987239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.020408 4707 scope.go:117] "RemoveContainer" containerID="17e5c220eb052e49499145259c9bfa39182e05fd853538d79c61f7b134299f6e" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.055991 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.085580 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.094491 4707 scope.go:117] "RemoveContainer" containerID="7e535854d25c09adbae2b264433f7ae08897fe4ace48903a5398f03ba12b2dd3" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.203382 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-8488cbf69-kmsfx"] Jan 21 15:16:44 crc kubenswrapper[4707]: E0121 15:16:44.203869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-log" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.203885 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-log" Jan 21 15:16:44 crc kubenswrapper[4707]: E0121 15:16:44.203917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-httpd" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.203922 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-httpd" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.204071 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-log" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.204093 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7306506e-0676-466d-a570-29076f4d6cfd" containerName="glance-httpd" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.204888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.211302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.211363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-pm765" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.211743 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.211935 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.223143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8488cbf69-kmsfx"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-combined-ca-bundle\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hvf6\" (UniqueName: 
\"kubernetes.io/projected/7306506e-0676-466d-a570-29076f4d6cfd-kube-api-access-2hvf6\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-config-data\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-httpd-run\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-logs\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-internal-tls-certs\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.251448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-scripts\") pod \"7306506e-0676-466d-a570-29076f4d6cfd\" (UID: \"7306506e-0676-466d-a570-29076f4d6cfd\") " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.252527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-logs" (OuterVolumeSpecName: "logs") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.252764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.255911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.258899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-scripts" (OuterVolumeSpecName: "scripts") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.260416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7306506e-0676-466d-a570-29076f4d6cfd-kube-api-access-2hvf6" (OuterVolumeSpecName: "kube-api-access-2hvf6") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "kube-api-access-2hvf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.282373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.282634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kq52z"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.296281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-config-data" (OuterVolumeSpecName: "config-data") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.296485 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.308266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7306506e-0676-466d-a570-29076f4d6cfd" (UID: "7306506e-0676-466d-a570-29076f4d6cfd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.352995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dmn\" (UniqueName: \"kubernetes.io/projected/f3586105-9aee-47cc-98fb-0d83f6214b4b-kube-api-access-n5dmn\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-httpd-config\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-config\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-combined-ca-bundle\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-ovndb-tls-certs\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353217 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353228 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353235 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7306506e-0676-466d-a570-29076f4d6cfd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353243 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353251 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353269 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353277 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7306506e-0676-466d-a570-29076f4d6cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.353285 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hvf6\" (UniqueName: \"kubernetes.io/projected/7306506e-0676-466d-a570-29076f4d6cfd-kube-api-access-2hvf6\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.373872 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.397608 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbm6k"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.399000 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.405385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbm6k"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.456585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-ovndb-tls-certs\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.456645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dmn\" (UniqueName: \"kubernetes.io/projected/f3586105-9aee-47cc-98fb-0d83f6214b4b-kube-api-access-n5dmn\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.456706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-httpd-config\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.456725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-config\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.456785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-combined-ca-bundle\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.456834 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") 
on node \"crc\" DevicePath \"\"" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.468107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-httpd-config\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.468734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-ovndb-tls-certs\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.469118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-config\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.478427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-combined-ca-bundle\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.492441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dmn\" (UniqueName: \"kubernetes.io/projected/f3586105-9aee-47cc-98fb-0d83f6214b4b-kube-api-access-n5dmn\") pod \"neutron-8488cbf69-kmsfx\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.520759 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.529189 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.557602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-catalog-content\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.557834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc284\" (UniqueName: \"kubernetes.io/projected/d32d166e-4675-461c-8340-c9a1e99451e3-kube-api-access-qc284\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.557935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-utilities\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.659336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc284\" (UniqueName: \"kubernetes.io/projected/d32d166e-4675-461c-8340-c9a1e99451e3-kube-api-access-qc284\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.659405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-utilities\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.659437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-catalog-content\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.659920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-catalog-content\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.660172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-utilities\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.675617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qc284\" (UniqueName: \"kubernetes.io/projected/d32d166e-4675-461c-8340-c9a1e99451e3-kube-api-access-qc284\") pod \"redhat-operators-tbm6k\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.686323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a0149516-3649-42aa-a3c7-fe89b3a453b7","Type":"ContainerStarted","Data":"6ff6c5735aa95722c1831d59137680ff8803be3ac8dafac73a5349b3a742abd4"} Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.694173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" event={"ID":"1c33dad8-6a4e-40fd-8f84-1329e367ffac","Type":"ContainerStarted","Data":"2788f083d09f6b70bb2ba314399bb3a57cca0351ffd15839c0880290b15ad16c"} Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.694203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" event={"ID":"1c33dad8-6a4e-40fd-8f84-1329e367ffac","Type":"ContainerStarted","Data":"b591347a2b6e6e83b00e82b518aef2c8f381fe8a6626e060e4d6634a54cda0bd"} Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.701572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7306506e-0676-466d-a570-29076f4d6cfd","Type":"ContainerDied","Data":"c4e90eeab155bdb0f94b3fd0c61445fa0a9cd0288005686edd13881491d8d204"} Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.701620 4707 scope.go:117] "RemoveContainer" containerID="757af711bf504652a3318a6cefab095399aba3ad818de782aa6f2b411c9a269c" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.701697 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.707705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjwjh" event={"ID":"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13","Type":"ContainerStarted","Data":"9b258857d6fb4477c4a2bcf0b755a02f3d16ca940587313326345bf8bea790fc"} Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.709697 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" podStartSLOduration=7.709687233 podStartE2EDuration="7.709687233s" podCreationTimestamp="2026-01-21 15:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:44.709543753 +0000 UTC m=+901.891059975" watchObservedRunningTime="2026-01-21 15:16:44.709687233 +0000 UTC m=+901.891203456" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.720363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" event={"ID":"f80e12a4-1af2-441c-89f4-c8c54e590aee","Type":"ContainerStarted","Data":"0a2191db90bc16f94bfd958e8b8824d4ccdd945c494ba2d8e86a5b5a077adeca"} Jan 21 15:16:44 crc kubenswrapper[4707]: E0121 15:16:44.721734 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack-kuttl-tests/cinder-db-sync-99p25" podUID="72983002-27f2-464c-9e83-cc2dd28ce9ea" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.729174 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.732037 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjwjh" podStartSLOduration=3.202187146 podStartE2EDuration="18.732028302s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="2026-01-21 15:16:28.377951116 +0000 UTC m=+885.559467338" lastFinishedPulling="2026-01-21 15:16:43.907792282 +0000 UTC m=+901.089308494" observedRunningTime="2026-01-21 15:16:44.728280241 +0000 UTC m=+901.909796463" watchObservedRunningTime="2026-01-21 15:16:44.732028302 +0000 UTC m=+901.913544524" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.768242 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" podStartSLOduration=3.665573346 podStartE2EDuration="18.768226968s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="2026-01-21 15:16:27.835407596 +0000 UTC m=+885.016923818" lastFinishedPulling="2026-01-21 15:16:42.938061217 +0000 UTC m=+900.119577440" observedRunningTime="2026-01-21 15:16:44.762634077 +0000 UTC m=+901.944150300" watchObservedRunningTime="2026-01-21 15:16:44.768226968 +0000 UTC m=+901.949743191" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.782060 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.790088 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.806469 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.808258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.812192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.812371 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.814037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.948423 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8488cbf69-kmsfx"] Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:44 crc kubenswrapper[4707]: I0121 15:16:44.966946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62dd\" (UniqueName: \"kubernetes.io/projected/8475154b-1028-4331-a68d-83679b4d78fe-kube-api-access-r62dd\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068587 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.068935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.069002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62dd\" (UniqueName: \"kubernetes.io/projected/8475154b-1028-4331-a68d-83679b4d78fe-kube-api-access-r62dd\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.069005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.069253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.079867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.079999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.080123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.080442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.084214 4707 scope.go:117] "RemoveContainer" containerID="f1bd0a39dc35ecde21c687ca8d92237b86851cd03ee68656043e7e3f8f67184c" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.089956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62dd\" (UniqueName: \"kubernetes.io/projected/8475154b-1028-4331-a68d-83679b4d78fe-kube-api-access-r62dd\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 
15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.104555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.121195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.196365 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27118245-cbcd-4382-97b9-d1c36df77cee" path="/var/lib/kubelet/pods/27118245-cbcd-4382-97b9-d1c36df77cee/volumes" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.197351 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7306506e-0676-466d-a570-29076f4d6cfd" path="/var/lib/kubelet/pods/7306506e-0676-466d-a570-29076f4d6cfd/volumes" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.198227 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7adcdf5-70d1-4382-a90e-d1c9433bf37b" path="/var/lib/kubelet/pods/c7adcdf5-70d1-4382-a90e-d1c9433bf37b/volumes" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.571869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbm6k"] Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.678792 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.731890 4707 generic.go:334] "Generic (PLEG): container finished" podID="d32d166e-4675-461c-8340-c9a1e99451e3" containerID="12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b" exitCode=0 Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.732098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerDied","Data":"12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.732122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerStarted","Data":"c6a160060edee6e60bbd4fbb8d43fc37816a5032aa7996fce67dece185428430"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.745822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerStarted","Data":"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.746779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8475154b-1028-4331-a68d-83679b4d78fe","Type":"ContainerStarted","Data":"46bc4a7be99d864e6a1feae11c0b8b4225554e35619510f67a98c5dc79d21e50"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.750124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a0149516-3649-42aa-a3c7-fe89b3a453b7","Type":"ContainerStarted","Data":"5de2ad12831c86f17cb7a8db414b2f4c49c21fa9d4d8b5ef0892cbbc9d50e7fc"} Jan 21 15:16:45 crc 
kubenswrapper[4707]: I0121 15:16:45.761969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" event={"ID":"f3586105-9aee-47cc-98fb-0d83f6214b4b","Type":"ContainerStarted","Data":"b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.762012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" event={"ID":"f3586105-9aee-47cc-98fb-0d83f6214b4b","Type":"ContainerStarted","Data":"aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.762022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" event={"ID":"f3586105-9aee-47cc-98fb-0d83f6214b4b","Type":"ContainerStarted","Data":"e5fd869ba52bec6c144ee2772e41a87b57cd6741cc6c9803c7446dff6cb36e84"} Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.762722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:16:45 crc kubenswrapper[4707]: I0121 15:16:45.783994 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" podStartSLOduration=1.783972165 podStartE2EDuration="1.783972165s" podCreationTimestamp="2026-01-21 15:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:45.77304615 +0000 UTC m=+902.954562372" watchObservedRunningTime="2026-01-21 15:16:45.783972165 +0000 UTC m=+902.965488386" Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.775499 4707 generic.go:334] "Generic (PLEG): container finished" podID="f80e12a4-1af2-441c-89f4-c8c54e590aee" containerID="0a2191db90bc16f94bfd958e8b8824d4ccdd945c494ba2d8e86a5b5a077adeca" exitCode=0 Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.775879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" event={"ID":"f80e12a4-1af2-441c-89f4-c8c54e590aee","Type":"ContainerDied","Data":"0a2191db90bc16f94bfd958e8b8824d4ccdd945c494ba2d8e86a5b5a077adeca"} Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.779342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8475154b-1028-4331-a68d-83679b4d78fe","Type":"ContainerStarted","Data":"8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1"} Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.782722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a0149516-3649-42aa-a3c7-fe89b3a453b7","Type":"ContainerStarted","Data":"39f668b37d80f373a8b899a91d73bbd8673c02c349c732988db8667e4da64a7e"} Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.888706 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.888691959 podStartE2EDuration="3.888691959s" podCreationTimestamp="2026-01-21 15:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:46.800711881 +0000 UTC m=+903.982228103" watchObservedRunningTime="2026-01-21 15:16:46.888691959 +0000 UTC m=+904.070208181" Jan 21 15:16:46 crc 
kubenswrapper[4707]: I0121 15:16:46.890326 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5f98bc6758-lh9vt"] Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.893044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.894580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.894901 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.897419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.897488 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:46 crc kubenswrapper[4707]: I0121 15:16:46.912492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5f98bc6758-lh9vt"] Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.003624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmqq\" (UniqueName: \"kubernetes.io/projected/7713e80a-5469-4748-a300-4c581d9fd503-kube-api-access-kpmqq\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.003750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-config\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.003772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-internal-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.003845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-combined-ca-bundle\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.003943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-public-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.004020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-httpd-config\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.004047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-ovndb-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.105033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-ovndb-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.105262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmqq\" (UniqueName: \"kubernetes.io/projected/7713e80a-5469-4748-a300-4c581d9fd503-kube-api-access-kpmqq\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.105438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-config\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.105520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-internal-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.105641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-combined-ca-bundle\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.105782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-public-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.106253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-httpd-config\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.116207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-public-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.117370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-httpd-config\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.131005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmqq\" (UniqueName: \"kubernetes.io/projected/7713e80a-5469-4748-a300-4c581d9fd503-kube-api-access-kpmqq\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.136669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-ovndb-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.137391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-internal-tls-certs\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.138003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-config\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.155617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-combined-ca-bundle\") pod \"neutron-5f98bc6758-lh9vt\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.227852 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.694359 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5f98bc6758-lh9vt"] Jan 21 15:16:47 crc kubenswrapper[4707]: W0121 15:16:47.694934 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7713e80a_5469_4748_a300_4c581d9fd503.slice/crio-d06a984dae75f2619b616314b69fc8158103d5c68bcc4bb569c6ea5cdd3be708 WatchSource:0}: Error finding container d06a984dae75f2619b616314b69fc8158103d5c68bcc4bb569c6ea5cdd3be708: Status 404 returned error can't find the container with id d06a984dae75f2619b616314b69fc8158103d5c68bcc4bb569c6ea5cdd3be708 Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.814413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8475154b-1028-4331-a68d-83679b4d78fe","Type":"ContainerStarted","Data":"ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab"} Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.816373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" event={"ID":"7713e80a-5469-4748-a300-4c581d9fd503","Type":"ContainerStarted","Data":"d06a984dae75f2619b616314b69fc8158103d5c68bcc4bb569c6ea5cdd3be708"} Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.827287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerStarted","Data":"14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba"} Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.842796 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.84278281 podStartE2EDuration="3.84278281s" podCreationTimestamp="2026-01-21 15:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:47.839218215 +0000 UTC m=+905.020734437" watchObservedRunningTime="2026-01-21 15:16:47.84278281 +0000 UTC m=+905.024299033" Jan 21 15:16:47 crc kubenswrapper[4707]: I0121 15:16:47.961794 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mjwjh" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="registry-server" probeResult="failure" output=< Jan 21 15:16:47 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 21 15:16:47 crc kubenswrapper[4707]: > Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.084292 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.129595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-combined-ca-bundle\") pod \"f80e12a4-1af2-441c-89f4-c8c54e590aee\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.129951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80e12a4-1af2-441c-89f4-c8c54e590aee-logs\") pod \"f80e12a4-1af2-441c-89f4-c8c54e590aee\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.129977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzj4\" (UniqueName: \"kubernetes.io/projected/f80e12a4-1af2-441c-89f4-c8c54e590aee-kube-api-access-mfzj4\") pod \"f80e12a4-1af2-441c-89f4-c8c54e590aee\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.130006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-config-data\") pod \"f80e12a4-1af2-441c-89f4-c8c54e590aee\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.130038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-scripts\") pod \"f80e12a4-1af2-441c-89f4-c8c54e590aee\" (UID: \"f80e12a4-1af2-441c-89f4-c8c54e590aee\") " Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.131373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80e12a4-1af2-441c-89f4-c8c54e590aee-logs" (OuterVolumeSpecName: "logs") pod "f80e12a4-1af2-441c-89f4-c8c54e590aee" (UID: "f80e12a4-1af2-441c-89f4-c8c54e590aee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.135248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-scripts" (OuterVolumeSpecName: "scripts") pod "f80e12a4-1af2-441c-89f4-c8c54e590aee" (UID: "f80e12a4-1af2-441c-89f4-c8c54e590aee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.138750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e12a4-1af2-441c-89f4-c8c54e590aee-kube-api-access-mfzj4" (OuterVolumeSpecName: "kube-api-access-mfzj4") pod "f80e12a4-1af2-441c-89f4-c8c54e590aee" (UID: "f80e12a4-1af2-441c-89f4-c8c54e590aee"). InnerVolumeSpecName "kube-api-access-mfzj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.151438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f80e12a4-1af2-441c-89f4-c8c54e590aee" (UID: "f80e12a4-1af2-441c-89f4-c8c54e590aee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.151884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-config-data" (OuterVolumeSpecName: "config-data") pod "f80e12a4-1af2-441c-89f4-c8c54e590aee" (UID: "f80e12a4-1af2-441c-89f4-c8c54e590aee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.233285 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.233317 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80e12a4-1af2-441c-89f4-c8c54e590aee-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.233328 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfzj4\" (UniqueName: \"kubernetes.io/projected/f80e12a4-1af2-441c-89f4-c8c54e590aee-kube-api-access-mfzj4\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.233340 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.233347 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f80e12a4-1af2-441c-89f4-c8c54e590aee-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.850960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" event={"ID":"7713e80a-5469-4748-a300-4c581d9fd503","Type":"ContainerStarted","Data":"0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e"} Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.851380 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.851406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" event={"ID":"7713e80a-5469-4748-a300-4c581d9fd503","Type":"ContainerStarted","Data":"451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a"} Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.852882 4707 generic.go:334] "Generic (PLEG): container finished" podID="d32d166e-4675-461c-8340-c9a1e99451e3" containerID="14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba" exitCode=0 Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.852933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerDied","Data":"14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba"} Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.855688 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.856385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-wkbgh" event={"ID":"f80e12a4-1af2-441c-89f4-c8c54e590aee","Type":"ContainerDied","Data":"4f056b0d0eb5134adfc3ae0e548fdf8835859d2cde6dbed81d320896f9d8c0d7"} Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.856411 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f056b0d0eb5134adfc3ae0e548fdf8835859d2cde6dbed81d320896f9d8c0d7" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.876381 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" podStartSLOduration=2.876362587 podStartE2EDuration="2.876362587s" podCreationTimestamp="2026-01-21 15:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:48.867628736 +0000 UTC m=+906.049144959" watchObservedRunningTime="2026-01-21 15:16:48.876362587 +0000 UTC m=+906.057878809" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.893601 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5b8c8cff84-8wl5w"] Jan 21 15:16:48 crc kubenswrapper[4707]: E0121 15:16:48.893935 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e12a4-1af2-441c-89f4-c8c54e590aee" containerName="placement-db-sync" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.893947 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e12a4-1af2-441c-89f4-c8c54e590aee" containerName="placement-db-sync" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.894114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80e12a4-1af2-441c-89f4-c8c54e590aee" containerName="placement-db-sync" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.894849 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.897864 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5b8c8cff84-8wl5w"] Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.900304 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-9zkld" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.900659 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.900837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.901027 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.901189 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-config-data\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grx49\" (UniqueName: \"kubernetes.io/projected/38833865-b749-47e4-b140-8a541562f359-kube-api-access-grx49\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-internal-tls-certs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-public-tls-certs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-scripts\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38833865-b749-47e4-b140-8a541562f359-logs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: 
\"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:48 crc kubenswrapper[4707]: I0121 15:16:48.944728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-combined-ca-bundle\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.048966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grx49\" (UniqueName: \"kubernetes.io/projected/38833865-b749-47e4-b140-8a541562f359-kube-api-access-grx49\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-internal-tls-certs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-public-tls-certs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-scripts\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38833865-b749-47e4-b140-8a541562f359-logs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-combined-ca-bundle\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-config-data\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.049584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38833865-b749-47e4-b140-8a541562f359-logs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: 
\"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.051952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-public-tls-certs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.052374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-combined-ca-bundle\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.054395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-config-data\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.055253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-internal-tls-certs\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.056153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-scripts\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.062469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grx49\" (UniqueName: \"kubernetes.io/projected/38833865-b749-47e4-b140-8a541562f359-kube-api-access-grx49\") pod \"placement-5b8c8cff84-8wl5w\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.209553 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.868162 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c33dad8-6a4e-40fd-8f84-1329e367ffac" containerID="2788f083d09f6b70bb2ba314399bb3a57cca0351ffd15839c0880290b15ad16c" exitCode=0 Jan 21 15:16:49 crc kubenswrapper[4707]: I0121 15:16:49.868241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" event={"ID":"1c33dad8-6a4e-40fd-8f84-1329e367ffac","Type":"ContainerDied","Data":"2788f083d09f6b70bb2ba314399bb3a57cca0351ffd15839c0880290b15ad16c"} Jan 21 15:16:51 crc kubenswrapper[4707]: I0121 15:16:51.915195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" event={"ID":"1c33dad8-6a4e-40fd-8f84-1329e367ffac","Type":"ContainerDied","Data":"b591347a2b6e6e83b00e82b518aef2c8f381fe8a6626e060e4d6634a54cda0bd"} Jan 21 15:16:51 crc kubenswrapper[4707]: I0121 15:16:51.915492 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b591347a2b6e6e83b00e82b518aef2c8f381fe8a6626e060e4d6634a54cda0bd" Jan 21 15:16:51 crc kubenswrapper[4707]: I0121 15:16:51.971431 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.124523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-combined-ca-bundle\") pod \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.124617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2njc5\" (UniqueName: \"kubernetes.io/projected/1c33dad8-6a4e-40fd-8f84-1329e367ffac-kube-api-access-2njc5\") pod \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.124722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-config-data\") pod \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.124740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-scripts\") pod \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.124755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-fernet-keys\") pod \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.124856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-credential-keys\") pod \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\" (UID: \"1c33dad8-6a4e-40fd-8f84-1329e367ffac\") " Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 
15:16:52.132753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1c33dad8-6a4e-40fd-8f84-1329e367ffac" (UID: "1c33dad8-6a4e-40fd-8f84-1329e367ffac"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.133007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-scripts" (OuterVolumeSpecName: "scripts") pod "1c33dad8-6a4e-40fd-8f84-1329e367ffac" (UID: "1c33dad8-6a4e-40fd-8f84-1329e367ffac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.132973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c33dad8-6a4e-40fd-8f84-1329e367ffac-kube-api-access-2njc5" (OuterVolumeSpecName: "kube-api-access-2njc5") pod "1c33dad8-6a4e-40fd-8f84-1329e367ffac" (UID: "1c33dad8-6a4e-40fd-8f84-1329e367ffac"). InnerVolumeSpecName "kube-api-access-2njc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.135917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1c33dad8-6a4e-40fd-8f84-1329e367ffac" (UID: "1c33dad8-6a4e-40fd-8f84-1329e367ffac"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.149344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c33dad8-6a4e-40fd-8f84-1329e367ffac" (UID: "1c33dad8-6a4e-40fd-8f84-1329e367ffac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.150054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-config-data" (OuterVolumeSpecName: "config-data") pod "1c33dad8-6a4e-40fd-8f84-1329e367ffac" (UID: "1c33dad8-6a4e-40fd-8f84-1329e367ffac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.226052 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.226080 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2njc5\" (UniqueName: \"kubernetes.io/projected/1c33dad8-6a4e-40fd-8f84-1329e367ffac-kube-api-access-2njc5\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.226091 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.226099 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.226108 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.226115 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c33dad8-6a4e-40fd-8f84-1329e367ffac-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.231462 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5b8c8cff84-8wl5w"] Jan 21 15:16:52 crc kubenswrapper[4707]: W0121 15:16:52.231823 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38833865_b749_47e4_b140_8a541562f359.slice/crio-e62df5d82f7edafba1e2a62dff5f4267be6d7822c90811d6342ee07fb129707e WatchSource:0}: Error finding container e62df5d82f7edafba1e2a62dff5f4267be6d7822c90811d6342ee07fb129707e: Status 404 returned error can't find the container with id e62df5d82f7edafba1e2a62dff5f4267be6d7822c90811d6342ee07fb129707e Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.921589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" event={"ID":"38833865-b749-47e4-b140-8a541562f359","Type":"ContainerStarted","Data":"62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376"} Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.921823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" event={"ID":"38833865-b749-47e4-b140-8a541562f359","Type":"ContainerStarted","Data":"c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5"} Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.921838 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.921849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" event={"ID":"38833865-b749-47e4-b140-8a541562f359","Type":"ContainerStarted","Data":"e62df5d82f7edafba1e2a62dff5f4267be6d7822c90811d6342ee07fb129707e"} Jan 21 15:16:52 crc 
kubenswrapper[4707]: I0121 15:16:52.921858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.924868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerStarted","Data":"38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7"} Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.926570 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kq52z" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.929581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerStarted","Data":"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc"} Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.935920 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" podStartSLOduration=4.9359088589999995 podStartE2EDuration="4.935908859s" podCreationTimestamp="2026-01-21 15:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:52.933573817 +0000 UTC m=+910.115090039" watchObservedRunningTime="2026-01-21 15:16:52.935908859 +0000 UTC m=+910.117425081" Jan 21 15:16:52 crc kubenswrapper[4707]: I0121 15:16:52.952761 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbm6k" podStartSLOduration=2.849177495 podStartE2EDuration="8.952746915s" podCreationTimestamp="2026-01-21 15:16:44 +0000 UTC" firstStartedPulling="2026-01-21 15:16:45.736288652 +0000 UTC m=+902.917804875" lastFinishedPulling="2026-01-21 15:16:51.839858073 +0000 UTC m=+909.021374295" observedRunningTime="2026-01-21 15:16:52.948973457 +0000 UTC m=+910.130489678" watchObservedRunningTime="2026-01-21 15:16:52.952746915 +0000 UTC m=+910.134263137" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.031139 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-84f545d4f5-d8v92"] Jan 21 15:16:53 crc kubenswrapper[4707]: E0121 15:16:53.031422 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c33dad8-6a4e-40fd-8f84-1329e367ffac" containerName="keystone-bootstrap" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.031437 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c33dad8-6a4e-40fd-8f84-1329e367ffac" containerName="keystone-bootstrap" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.031575 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c33dad8-6a4e-40fd-8f84-1329e367ffac" containerName="keystone-bootstrap" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.032052 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.050022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.050381 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.050683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.051752 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.051869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.051871 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5sl4v" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.056063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-84f545d4f5-d8v92"] Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-internal-tls-certs\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-config-data\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-combined-ca-bundle\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnmq\" (UniqueName: \"kubernetes.io/projected/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-kube-api-access-cgnmq\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-credential-keys\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-public-tls-certs\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-scripts\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.139456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-fernet-keys\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.240827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-config-data\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.240879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-combined-ca-bundle\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.240933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnmq\" (UniqueName: \"kubernetes.io/projected/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-kube-api-access-cgnmq\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.240949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-credential-keys\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.240983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-public-tls-certs\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.241028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-scripts\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.241051 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-fernet-keys\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.241111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-internal-tls-certs\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.245248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-credential-keys\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.245332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-fernet-keys\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.245719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-scripts\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.245984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-internal-tls-certs\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.246035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-combined-ca-bundle\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.247661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-config-data\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.251027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-public-tls-certs\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.255237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnmq\" (UniqueName: 
\"kubernetes.io/projected/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-kube-api-access-cgnmq\") pod \"keystone-84f545d4f5-d8v92\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.342742 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.726670 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-84f545d4f5-d8v92"] Jan 21 15:16:53 crc kubenswrapper[4707]: W0121 15:16:53.736170 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd41aa1cf_56db_4d5a_91e9_478a8ce591d0.slice/crio-4a7f21396374a414dfad0a29d9cb9bc668499fd4dfba414eed3a54dd78ab02f6 WatchSource:0}: Error finding container 4a7f21396374a414dfad0a29d9cb9bc668499fd4dfba414eed3a54dd78ab02f6: Status 404 returned error can't find the container with id 4a7f21396374a414dfad0a29d9cb9bc668499fd4dfba414eed3a54dd78ab02f6 Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.936898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" event={"ID":"d41aa1cf-56db-4d5a-91e9-478a8ce591d0","Type":"ContainerStarted","Data":"cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee"} Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.937394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" event={"ID":"d41aa1cf-56db-4d5a-91e9-478a8ce591d0","Type":"ContainerStarted","Data":"4a7f21396374a414dfad0a29d9cb9bc668499fd4dfba414eed3a54dd78ab02f6"} Jan 21 15:16:53 crc kubenswrapper[4707]: I0121 15:16:53.952277 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" podStartSLOduration=0.952264793 podStartE2EDuration="952.264793ms" podCreationTimestamp="2026-01-21 15:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:16:53.949515021 +0000 UTC m=+911.131031243" watchObservedRunningTime="2026-01-21 15:16:53.952264793 +0000 UTC m=+911.133781015" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.056967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.057034 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.090027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.099940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.730166 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.730233 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:16:54 crc 
kubenswrapper[4707]: I0121 15:16:54.942426 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.942462 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:54 crc kubenswrapper[4707]: I0121 15:16:54.942473 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.122247 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.123156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.158415 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.158644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.774368 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbm6k" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="registry-server" probeResult="failure" output=< Jan 21 15:16:55 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 21 15:16:55 crc kubenswrapper[4707]: > Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.951125 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:55 crc kubenswrapper[4707]: I0121 15:16:55.951155 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:56 crc kubenswrapper[4707]: I0121 15:16:56.575907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:56 crc kubenswrapper[4707]: I0121 15:16:56.577082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:16:56 crc kubenswrapper[4707]: I0121 15:16:56.930059 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:56 crc kubenswrapper[4707]: I0121 15:16:56.971041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:57 crc kubenswrapper[4707]: I0121 15:16:57.579367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:57 crc kubenswrapper[4707]: I0121 15:16:57.604926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:16:57 crc kubenswrapper[4707]: I0121 15:16:57.780170 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjwjh"] Jan 21 15:16:57 crc kubenswrapper[4707]: I0121 15:16:57.963263 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mjwjh" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="registry-server" containerID="cri-o://9b258857d6fb4477c4a2bcf0b755a02f3d16ca940587313326345bf8bea790fc" gracePeriod=2 Jan 21 15:16:58 crc kubenswrapper[4707]: I0121 15:16:58.972920 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerID="9b258857d6fb4477c4a2bcf0b755a02f3d16ca940587313326345bf8bea790fc" exitCode=0 Jan 21 15:16:58 crc kubenswrapper[4707]: I0121 15:16:58.973054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjwjh" event={"ID":"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13","Type":"ContainerDied","Data":"9b258857d6fb4477c4a2bcf0b755a02f3d16ca940587313326345bf8bea790fc"} Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.270337 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:59 crc kubenswrapper[4707]: E0121 15:16:59.335799 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/ceilometer-0" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.357642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlktm\" (UniqueName: \"kubernetes.io/projected/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-kube-api-access-dlktm\") pod \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.357680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-catalog-content\") pod \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.357725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-utilities\") pod \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\" (UID: \"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13\") " Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.358269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-utilities" (OuterVolumeSpecName: "utilities") pod "f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" (UID: "f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.361148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-kube-api-access-dlktm" (OuterVolumeSpecName: "kube-api-access-dlktm") pod "f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" (UID: "f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13"). InnerVolumeSpecName "kube-api-access-dlktm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.370470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" (UID: "f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.459924 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.459952 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlktm\" (UniqueName: \"kubernetes.io/projected/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-kube-api-access-dlktm\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.459962 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.980054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" event={"ID":"96a28438-f0ba-433d-84f7-11838f51c57c","Type":"ContainerStarted","Data":"13f131c209e26209d96710f9a60dc9ab4422ae7154a9afed2d58a0a38c71156f"} Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.984482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerStarted","Data":"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd"} Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.984520 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="ceilometer-notification-agent" containerID="cri-o://8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" gracePeriod=30 Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.984542 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.984559 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="sg-core" containerID="cri-o://c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" gracePeriod=30 Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.984567 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="proxy-httpd" containerID="cri-o://31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" gracePeriod=30 Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.993529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjwjh" event={"ID":"f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13","Type":"ContainerDied","Data":"dbf5d88e0f57bacf7aba5dfd9f87ae8d038330ec154ca60991710c1455367203"} Jan 21 15:16:59 crc 
kubenswrapper[4707]: I0121 15:16:59.993572 4707 scope.go:117] "RemoveContainer" containerID="9b258857d6fb4477c4a2bcf0b755a02f3d16ca940587313326345bf8bea790fc" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.993675 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjwjh" Jan 21 15:16:59 crc kubenswrapper[4707]: I0121 15:16:59.999573 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" podStartSLOduration=2.598201751 podStartE2EDuration="33.999559841s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="2026-01-21 15:16:27.704357425 +0000 UTC m=+884.885873647" lastFinishedPulling="2026-01-21 15:16:59.105715515 +0000 UTC m=+916.287231737" observedRunningTime="2026-01-21 15:16:59.991056324 +0000 UTC m=+917.172572547" watchObservedRunningTime="2026-01-21 15:16:59.999559841 +0000 UTC m=+917.181076064" Jan 21 15:17:00 crc kubenswrapper[4707]: I0121 15:17:00.012746 4707 scope.go:117] "RemoveContainer" containerID="9fa8ae9358ce43401a9d0f4bbb4883f9ca1e9fcfb429d47df2e15d51f6f817c6" Jan 21 15:17:00 crc kubenswrapper[4707]: I0121 15:17:00.028213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjwjh"] Jan 21 15:17:00 crc kubenswrapper[4707]: I0121 15:17:00.034042 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjwjh"] Jan 21 15:17:00 crc kubenswrapper[4707]: I0121 15:17:00.034457 4707 scope.go:117] "RemoveContainer" containerID="bf9b865aca35b05004f872249588f4e527e2fe3bdbee5da191a8515248b37d20" Jan 21 15:17:00 crc kubenswrapper[4707]: I0121 15:17:00.973581 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.001325 4707 generic.go:334] "Generic (PLEG): container finished" podID="96a28438-f0ba-433d-84f7-11838f51c57c" containerID="13f131c209e26209d96710f9a60dc9ab4422ae7154a9afed2d58a0a38c71156f" exitCode=0 Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.001386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" event={"ID":"96a28438-f0ba-433d-84f7-11838f51c57c","Type":"ContainerDied","Data":"13f131c209e26209d96710f9a60dc9ab4422ae7154a9afed2d58a0a38c71156f"} Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.003259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-99p25" event={"ID":"72983002-27f2-464c-9e83-cc2dd28ce9ea","Type":"ContainerStarted","Data":"37c0a0b5683af8cf2897340a05e69b1b3c4f76f2984fc6f86bd2d0a6a22c70ee"} Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005113 4707 generic.go:334] "Generic (PLEG): container finished" podID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerID="31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" exitCode=0 Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005131 4707 generic.go:334] "Generic (PLEG): container finished" podID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerID="c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" exitCode=2 Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005138 4707 generic.go:334] "Generic (PLEG): container finished" podID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerID="8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" exitCode=0 Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005175 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerDied","Data":"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd"} Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerDied","Data":"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc"} Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerDied","Data":"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619"} Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"32aa4051-0ca7-4082-9693-ebd95e4ee508","Type":"ContainerDied","Data":"ac3e78c3af9f94516aec40662c1fbbd1519fe2966135fa21aa7e15150f6bc050"} Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.005328 4707 scope.go:117] "RemoveContainer" containerID="31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.023363 4707 scope.go:117] "RemoveContainer" containerID="c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.025394 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-99p25" podStartSLOduration=2.666486598 podStartE2EDuration="35.02538505s" podCreationTimestamp="2026-01-21 15:16:26 +0000 UTC" firstStartedPulling="2026-01-21 15:16:27.476336862 +0000 UTC m=+884.657853085" lastFinishedPulling="2026-01-21 15:16:59.835235315 +0000 UTC m=+917.016751537" observedRunningTime="2026-01-21 15:17:01.022741095 +0000 UTC m=+918.204257318" watchObservedRunningTime="2026-01-21 15:17:01.02538505 +0000 UTC m=+918.206901271" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.036266 4707 scope.go:117] "RemoveContainer" containerID="8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.049649 4707 scope.go:117] "RemoveContainer" containerID="31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.049945 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": container with ID starting with 31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd not found: ID does not exist" containerID="31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.049972 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd"} err="failed to get container status \"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": rpc error: code = NotFound desc = could not find container 
\"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": container with ID starting with 31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.049988 4707 scope.go:117] "RemoveContainer" containerID="c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.050202 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": container with ID starting with c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc not found: ID does not exist" containerID="c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050222 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc"} err="failed to get container status \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": rpc error: code = NotFound desc = could not find container \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": container with ID starting with c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050233 4707 scope.go:117] "RemoveContainer" containerID="8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.050385 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": container with ID starting with 8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619 not found: ID does not exist" containerID="8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050403 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619"} err="failed to get container status \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": rpc error: code = NotFound desc = could not find container \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": container with ID starting with 8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619 not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050413 4707 scope.go:117] "RemoveContainer" containerID="31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050626 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd"} err="failed to get container status \"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": rpc error: code = NotFound desc = could not find container \"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": container with ID starting with 31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050644 4707 scope.go:117] "RemoveContainer" 
containerID="c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050847 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc"} err="failed to get container status \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": rpc error: code = NotFound desc = could not find container \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": container with ID starting with c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.050864 4707 scope.go:117] "RemoveContainer" containerID="8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051106 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619"} err="failed to get container status \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": rpc error: code = NotFound desc = could not find container \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": container with ID starting with 8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619 not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051123 4707 scope.go:117] "RemoveContainer" containerID="31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051266 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd"} err="failed to get container status \"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": rpc error: code = NotFound desc = could not find container \"31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd\": container with ID starting with 31704d28605a6a27c7a217f4e4bc37bbab238cebadb1b3e7e650b7d88ff39bfd not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051283 4707 scope.go:117] "RemoveContainer" containerID="c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051432 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc"} err="failed to get container status \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": rpc error: code = NotFound desc = could not find container \"c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc\": container with ID starting with c81459431c5c0d1df4b5b9de102640346df9f5849bd054ff27e0ebbf4bc5c5fc not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051447 4707 scope.go:117] "RemoveContainer" containerID="8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.051577 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619"} err="failed to get container status \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": rpc error: code = NotFound desc = could not find 
container \"8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619\": container with ID starting with 8eaac4e62f1c8a630db3de8ee3bbe1e1336f66594892568e9d237dde6e6b5619 not found: ID does not exist" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.084959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-sg-core-conf-yaml\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.085027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-config-data\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.085051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-scripts\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.085112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-combined-ca-bundle\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.085166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-run-httpd\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.085199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-log-httpd\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.085228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckstq\" (UniqueName: \"kubernetes.io/projected/32aa4051-0ca7-4082-9693-ebd95e4ee508-kube-api-access-ckstq\") pod \"32aa4051-0ca7-4082-9693-ebd95e4ee508\" (UID: \"32aa4051-0ca7-4082-9693-ebd95e4ee508\") " Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.086739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.087052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.090727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-scripts" (OuterVolumeSpecName: "scripts") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.090853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32aa4051-0ca7-4082-9693-ebd95e4ee508-kube-api-access-ckstq" (OuterVolumeSpecName: "kube-api-access-ckstq") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "kube-api-access-ckstq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.104316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.127012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.138235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-config-data" (OuterVolumeSpecName: "config-data") pod "32aa4051-0ca7-4082-9693-ebd95e4ee508" (UID: "32aa4051-0ca7-4082-9693-ebd95e4ee508"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186860 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186886 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186896 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32aa4051-0ca7-4082-9693-ebd95e4ee508-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186904 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckstq\" (UniqueName: \"kubernetes.io/projected/32aa4051-0ca7-4082-9693-ebd95e4ee508-kube-api-access-ckstq\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186914 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186921 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.186928 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32aa4051-0ca7-4082-9693-ebd95e4ee508-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.191336 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" path="/var/lib/kubelet/pods/f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13/volumes" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.350367 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.362870 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.369110 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.369542 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="ceilometer-notification-agent" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.369610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="ceilometer-notification-agent" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.369669 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="registry-server" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.369732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="registry-server" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.369795 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="proxy-httpd" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.369955 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="proxy-httpd" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.370026 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="extract-utilities" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.370492 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="extract-utilities" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.370547 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="sg-core" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.370590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="sg-core" Jan 21 15:17:01 crc kubenswrapper[4707]: E0121 15:17:01.370637 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="extract-content" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.370683 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="extract-content" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.371263 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="sg-core" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.371361 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d6b3e6-0dc7-425f-b6bf-5fefd802cf13" containerName="registry-server" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.371414 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="ceilometer-notification-agent" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.371466 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" containerName="proxy-httpd" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.375711 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.377581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.377821 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.380139 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-scripts\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-run-httpd\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-log-httpd\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-config-data\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phqpq\" (UniqueName: \"kubernetes.io/projected/fd22c263-f40d-47d1-85d2-e5227784aae2-kube-api-access-phqpq\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.492889 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.593900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-config-data\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phqpq\" (UniqueName: \"kubernetes.io/projected/fd22c263-f40d-47d1-85d2-e5227784aae2-kube-api-access-phqpq\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-scripts\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-run-httpd\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-log-httpd\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-run-httpd\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.594919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-log-httpd\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.597085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.597524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-scripts\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.597783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.600562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-config-data\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.610549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phqpq\" (UniqueName: \"kubernetes.io/projected/fd22c263-f40d-47d1-85d2-e5227784aae2-kube-api-access-phqpq\") pod \"ceilometer-0\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:01 crc kubenswrapper[4707]: I0121 15:17:01.697121 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.056520 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:02 crc kubenswrapper[4707]: W0121 15:17:02.062360 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd22c263_f40d_47d1_85d2_e5227784aae2.slice/crio-3d1a14cd68aceffbb375dc5d138b20f76116dd19828eaf25fd0f874e2666d550 WatchSource:0}: Error finding container 3d1a14cd68aceffbb375dc5d138b20f76116dd19828eaf25fd0f874e2666d550: Status 404 returned error can't find the container with id 3d1a14cd68aceffbb375dc5d138b20f76116dd19828eaf25fd0f874e2666d550 Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.288087 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.406026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqb6t\" (UniqueName: \"kubernetes.io/projected/96a28438-f0ba-433d-84f7-11838f51c57c-kube-api-access-lqb6t\") pod \"96a28438-f0ba-433d-84f7-11838f51c57c\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.406204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-db-sync-config-data\") pod \"96a28438-f0ba-433d-84f7-11838f51c57c\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.406371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-combined-ca-bundle\") pod \"96a28438-f0ba-433d-84f7-11838f51c57c\" (UID: \"96a28438-f0ba-433d-84f7-11838f51c57c\") " Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.410125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a28438-f0ba-433d-84f7-11838f51c57c-kube-api-access-lqb6t" (OuterVolumeSpecName: "kube-api-access-lqb6t") pod "96a28438-f0ba-433d-84f7-11838f51c57c" (UID: "96a28438-f0ba-433d-84f7-11838f51c57c"). InnerVolumeSpecName "kube-api-access-lqb6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.416881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "96a28438-f0ba-433d-84f7-11838f51c57c" (UID: "96a28438-f0ba-433d-84f7-11838f51c57c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.426040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96a28438-f0ba-433d-84f7-11838f51c57c" (UID: "96a28438-f0ba-433d-84f7-11838f51c57c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.507992 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqb6t\" (UniqueName: \"kubernetes.io/projected/96a28438-f0ba-433d-84f7-11838f51c57c-kube-api-access-lqb6t\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.508029 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:02 crc kubenswrapper[4707]: I0121 15:17:02.508039 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a28438-f0ba-433d-84f7-11838f51c57c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.019466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" event={"ID":"96a28438-f0ba-433d-84f7-11838f51c57c","Type":"ContainerDied","Data":"1c19892dd5b150e7f3ffb5dbf25ec827fd20704c5337d1488a3cce9b734225a0"} Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.019711 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c19892dd5b150e7f3ffb5dbf25ec827fd20704c5337d1488a3cce9b734225a0" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.019506 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-stp4w" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.021061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerStarted","Data":"55ed378367ca41e2768d23e704055f2902ab694029b0245c1e2193c2cf12de3a"} Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.021096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerStarted","Data":"3d1a14cd68aceffbb375dc5d138b20f76116dd19828eaf25fd0f874e2666d550"} Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.022642 4707 generic.go:334] "Generic (PLEG): container finished" podID="72983002-27f2-464c-9e83-cc2dd28ce9ea" containerID="37c0a0b5683af8cf2897340a05e69b1b3c4f76f2984fc6f86bd2d0a6a22c70ee" exitCode=0 Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.022679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-99p25" event={"ID":"72983002-27f2-464c-9e83-cc2dd28ce9ea","Type":"ContainerDied","Data":"37c0a0b5683af8cf2897340a05e69b1b3c4f76f2984fc6f86bd2d0a6a22c70ee"} Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.191493 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32aa4051-0ca7-4082-9693-ebd95e4ee508" path="/var/lib/kubelet/pods/32aa4051-0ca7-4082-9693-ebd95e4ee508/volumes" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.222519 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw"] Jan 21 15:17:03 crc kubenswrapper[4707]: E0121 15:17:03.223032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a28438-f0ba-433d-84f7-11838f51c57c" containerName="barbican-db-sync" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.223098 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="96a28438-f0ba-433d-84f7-11838f51c57c" containerName="barbican-db-sync" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.223311 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a28438-f0ba-433d-84f7-11838f51c57c" containerName="barbican-db-sync" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.224133 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.228891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-78998996df-sk4d7"] Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.230039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.240372 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.240490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-g88h6" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.240489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.240592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.249373 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw"] Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.294784 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-78998996df-sk4d7"] Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2wl\" (UniqueName: \"kubernetes.io/projected/ed6331a2-4b0a-407e-8ee0-3b1a82343071-kube-api-access-km2wl\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df067294-2809-413f-b245-6cd57c0cbf11-logs\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data-custom\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6331a2-4b0a-407e-8ee0-3b1a82343071-logs\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.321965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-combined-ca-bundle\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.322038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggk7r\" (UniqueName: \"kubernetes.io/projected/df067294-2809-413f-b245-6cd57c0cbf11-kube-api-access-ggk7r\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.322106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data-custom\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.335709 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2"] Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.336980 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.349414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2"] Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.353606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.423735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-logs\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-combined-ca-bundle\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2wl\" (UniqueName: \"kubernetes.io/projected/ed6331a2-4b0a-407e-8ee0-3b1a82343071-kube-api-access-km2wl\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df067294-2809-413f-b245-6cd57c0cbf11-logs\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data-custom\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6331a2-4b0a-407e-8ee0-3b1a82343071-logs\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mms99\" (UniqueName: \"kubernetes.io/projected/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-kube-api-access-mms99\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-combined-ca-bundle\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggk7r\" (UniqueName: \"kubernetes.io/projected/df067294-2809-413f-b245-6cd57c0cbf11-kube-api-access-ggk7r\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data-custom\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.424386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data-custom\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.425110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6331a2-4b0a-407e-8ee0-3b1a82343071-logs\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.425225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df067294-2809-413f-b245-6cd57c0cbf11-logs\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.427137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data-custom\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.427519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-combined-ca-bundle\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.428104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.428340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.429176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.430018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data-custom\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.439292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2wl\" (UniqueName: \"kubernetes.io/projected/ed6331a2-4b0a-407e-8ee0-3b1a82343071-kube-api-access-km2wl\") pod \"barbican-keystone-listener-6b9f974bb6-5sgjw\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.440261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggk7r\" (UniqueName: 
\"kubernetes.io/projected/df067294-2809-413f-b245-6cd57c0cbf11-kube-api-access-ggk7r\") pod \"barbican-worker-78998996df-sk4d7\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.525405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.525439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mms99\" (UniqueName: \"kubernetes.io/projected/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-kube-api-access-mms99\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.525500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data-custom\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.525546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-logs\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.525600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-combined-ca-bundle\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.527785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-logs\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.528800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data-custom\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.529362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.530099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-combined-ca-bundle\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.540951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mms99\" (UniqueName: \"kubernetes.io/projected/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-kube-api-access-mms99\") pod \"barbican-api-66969f6cd6-htsz2\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.542970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.548342 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.731749 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:03 crc kubenswrapper[4707]: W0121 15:17:03.928286 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded6331a2_4b0a_407e_8ee0_3b1a82343071.slice/crio-fb752dfa05e58405fd49147fa5e2b87c86854c23fa4745a68926b27b773715af WatchSource:0}: Error finding container fb752dfa05e58405fd49147fa5e2b87c86854c23fa4745a68926b27b773715af: Status 404 returned error can't find the container with id fb752dfa05e58405fd49147fa5e2b87c86854c23fa4745a68926b27b773715af Jan 21 15:17:03 crc kubenswrapper[4707]: I0121 15:17:03.929323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw"] Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.000370 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-78998996df-sk4d7"] Jan 21 15:17:04 crc kubenswrapper[4707]: W0121 15:17:04.005740 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf067294_2809_413f_b245_6cd57c0cbf11.slice/crio-fb2ef70e2aeec9824e1137b5af1b57c5b43f6f86a9b0c6ecb002ffc1a47aa6fa WatchSource:0}: Error finding container fb2ef70e2aeec9824e1137b5af1b57c5b43f6f86a9b0c6ecb002ffc1a47aa6fa: Status 404 returned error can't find the container with id fb2ef70e2aeec9824e1137b5af1b57c5b43f6f86a9b0c6ecb002ffc1a47aa6fa Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.029211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" event={"ID":"df067294-2809-413f-b245-6cd57c0cbf11","Type":"ContainerStarted","Data":"fb2ef70e2aeec9824e1137b5af1b57c5b43f6f86a9b0c6ecb002ffc1a47aa6fa"} Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.030398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" event={"ID":"ed6331a2-4b0a-407e-8ee0-3b1a82343071","Type":"ContainerStarted","Data":"fb752dfa05e58405fd49147fa5e2b87c86854c23fa4745a68926b27b773715af"} Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.031961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerStarted","Data":"5a92e9c55e7a6ebbcef5d232865bcf9d49d571117d63fc2036d14254e55dce2d"} Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.150917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2"] Jan 21 15:17:04 crc kubenswrapper[4707]: W0121 15:17:04.168432 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f97fda_b4d4_4f2c_88b9_8e91f147479c.slice/crio-64d76e4e6473af9156cf3a38b2c31767c7d71226e45d28dd2c5033d6fb00fc79 WatchSource:0}: Error finding container 64d76e4e6473af9156cf3a38b2c31767c7d71226e45d28dd2c5033d6fb00fc79: Status 404 returned error can't find the container with id 64d76e4e6473af9156cf3a38b2c31767c7d71226e45d28dd2c5033d6fb00fc79 Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.242316 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.346585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-db-sync-config-data\") pod \"72983002-27f2-464c-9e83-cc2dd28ce9ea\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.346637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhrsl\" (UniqueName: \"kubernetes.io/projected/72983002-27f2-464c-9e83-cc2dd28ce9ea-kube-api-access-nhrsl\") pod \"72983002-27f2-464c-9e83-cc2dd28ce9ea\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.346654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-config-data\") pod \"72983002-27f2-464c-9e83-cc2dd28ce9ea\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.346715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-combined-ca-bundle\") pod \"72983002-27f2-464c-9e83-cc2dd28ce9ea\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.346755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72983002-27f2-464c-9e83-cc2dd28ce9ea-etc-machine-id\") pod \"72983002-27f2-464c-9e83-cc2dd28ce9ea\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.346771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-scripts\") pod \"72983002-27f2-464c-9e83-cc2dd28ce9ea\" (UID: \"72983002-27f2-464c-9e83-cc2dd28ce9ea\") " Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.347161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72983002-27f2-464c-9e83-cc2dd28ce9ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "72983002-27f2-464c-9e83-cc2dd28ce9ea" (UID: 
"72983002-27f2-464c-9e83-cc2dd28ce9ea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.349799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-scripts" (OuterVolumeSpecName: "scripts") pod "72983002-27f2-464c-9e83-cc2dd28ce9ea" (UID: "72983002-27f2-464c-9e83-cc2dd28ce9ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.349939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72983002-27f2-464c-9e83-cc2dd28ce9ea" (UID: "72983002-27f2-464c-9e83-cc2dd28ce9ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.350466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72983002-27f2-464c-9e83-cc2dd28ce9ea-kube-api-access-nhrsl" (OuterVolumeSpecName: "kube-api-access-nhrsl") pod "72983002-27f2-464c-9e83-cc2dd28ce9ea" (UID: "72983002-27f2-464c-9e83-cc2dd28ce9ea"). InnerVolumeSpecName "kube-api-access-nhrsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.364248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72983002-27f2-464c-9e83-cc2dd28ce9ea" (UID: "72983002-27f2-464c-9e83-cc2dd28ce9ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.380523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-config-data" (OuterVolumeSpecName: "config-data") pod "72983002-27f2-464c-9e83-cc2dd28ce9ea" (UID: "72983002-27f2-464c-9e83-cc2dd28ce9ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.448468 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhrsl\" (UniqueName: \"kubernetes.io/projected/72983002-27f2-464c-9e83-cc2dd28ce9ea-kube-api-access-nhrsl\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.448498 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.448508 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.448517 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72983002-27f2-464c-9e83-cc2dd28ce9ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.448524 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:04 crc kubenswrapper[4707]: I0121 15:17:04.448532 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72983002-27f2-464c-9e83-cc2dd28ce9ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.041119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" event={"ID":"b8f97fda-b4d4-4f2c-88b9-8e91f147479c","Type":"ContainerStarted","Data":"3b500bb7e40583e7b21a96e40dea3af99d0120e7b2ed2f9a45dcff38e9a6fd31"} Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.041299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" event={"ID":"b8f97fda-b4d4-4f2c-88b9-8e91f147479c","Type":"ContainerStarted","Data":"2dc93d3cda490fc3cbd6dd820c186b43e7850a11a7ddaff0ced763b88d4b3b4d"} Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.041311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" event={"ID":"b8f97fda-b4d4-4f2c-88b9-8e91f147479c","Type":"ContainerStarted","Data":"64d76e4e6473af9156cf3a38b2c31767c7d71226e45d28dd2c5033d6fb00fc79"} Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.041724 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.041850 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.042994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerStarted","Data":"14dcfb2ae7b3e3a53af3dbe31d6ae4d53fbe97c121e3651041bb4c0516b327ca"} Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.044267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-99p25" 
event={"ID":"72983002-27f2-464c-9e83-cc2dd28ce9ea","Type":"ContainerDied","Data":"592874b4a90757ffca67e79afe1c81bd120a684c3b44942967a9802754fa9591"} Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.044291 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-99p25" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.044294 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="592874b4a90757ffca67e79afe1c81bd120a684c3b44942967a9802754fa9591" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.064372 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" podStartSLOduration=2.064357779 podStartE2EDuration="2.064357779s" podCreationTimestamp="2026-01-21 15:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:05.053429781 +0000 UTC m=+922.234946003" watchObservedRunningTime="2026-01-21 15:17:05.064357779 +0000 UTC m=+922.245874001" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.202135 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:05 crc kubenswrapper[4707]: E0121 15:17:05.202436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72983002-27f2-464c-9e83-cc2dd28ce9ea" containerName="cinder-db-sync" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.202448 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72983002-27f2-464c-9e83-cc2dd28ce9ea" containerName="cinder-db-sync" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.202649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="72983002-27f2-464c-9e83-cc2dd28ce9ea" containerName="cinder-db-sync" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.204351 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.208212 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.208377 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.208484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-2ld2h" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.214355 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.214559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.268620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.268988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.269029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsmff\" (UniqueName: \"kubernetes.io/projected/3c638341-68df-4182-b322-77cc26ed319e-kube-api-access-xsmff\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.269110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.269136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.269160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c638341-68df-4182-b322-77cc26ed319e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.371639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3c638341-68df-4182-b322-77cc26ed319e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.371733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.371925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.371969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsmff\" (UniqueName: \"kubernetes.io/projected/3c638341-68df-4182-b322-77cc26ed319e-kube-api-access-xsmff\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.372067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.372100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.373877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c638341-68df-4182-b322-77cc26ed319e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.381078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.381268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.382581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.394231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.425296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsmff\" (UniqueName: \"kubernetes.io/projected/3c638341-68df-4182-b322-77cc26ed319e-kube-api-access-xsmff\") pod \"cinder-scheduler-0\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.425357 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.426544 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.429024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.429720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckmth\" (UniqueName: \"kubernetes.io/projected/c71ee2cc-4277-43de-bcde-3164505d2ea2-kube-api-access-ckmth\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71ee2cc-4277-43de-bcde-3164505d2ea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71ee2cc-4277-43de-bcde-3164505d2ea2-logs\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473792 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-scripts\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.473864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.517260 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.574966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-scripts\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.575031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.575075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckmth\" (UniqueName: \"kubernetes.io/projected/c71ee2cc-4277-43de-bcde-3164505d2ea2-kube-api-access-ckmth\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.575096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71ee2cc-4277-43de-bcde-3164505d2ea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.575557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71ee2cc-4277-43de-bcde-3164505d2ea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.576086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.576120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.576787 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71ee2cc-4277-43de-bcde-3164505d2ea2-logs\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.577114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71ee2cc-4277-43de-bcde-3164505d2ea2-logs\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.580848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.581207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.586150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-scripts\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.588837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckmth\" (UniqueName: \"kubernetes.io/projected/c71ee2cc-4277-43de-bcde-3164505d2ea2-kube-api-access-ckmth\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.600482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.764727 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbm6k" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="registry-server" probeResult="failure" output=< Jan 21 15:17:05 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 21 15:17:05 crc kubenswrapper[4707]: > Jan 21 15:17:05 crc kubenswrapper[4707]: I0121 15:17:05.797487 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:06 crc kubenswrapper[4707]: I0121 15:17:06.425318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:06 crc kubenswrapper[4707]: W0121 15:17:06.428082 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71ee2cc_4277_43de_bcde_3164505d2ea2.slice/crio-93af4f1cc90a6a53516f1f41e282942f3e5d3e31f5c9a5579cc497ddb1922333 WatchSource:0}: Error finding container 93af4f1cc90a6a53516f1f41e282942f3e5d3e31f5c9a5579cc497ddb1922333: Status 404 returned error can't find the container with id 93af4f1cc90a6a53516f1f41e282942f3e5d3e31f5c9a5579cc497ddb1922333 Jan 21 15:17:06 crc kubenswrapper[4707]: I0121 15:17:06.516176 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:06 crc kubenswrapper[4707]: W0121 15:17:06.524989 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c638341_68df_4182_b322_77cc26ed319e.slice/crio-7f1ac1938b25bad1693084f582e067d669ee9869a1228bb6edd2bdbd45da54de WatchSource:0}: Error finding container 7f1ac1938b25bad1693084f582e067d669ee9869a1228bb6edd2bdbd45da54de: Status 404 returned error can't find the container with id 7f1ac1938b25bad1693084f582e067d669ee9869a1228bb6edd2bdbd45da54de Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.061186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerStarted","Data":"1a96fb1cd7fca9884505d9e415196af3e7d2a10cc56064cf332b930c9c566236"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.062580 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.063614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c638341-68df-4182-b322-77cc26ed319e","Type":"ContainerStarted","Data":"7f1ac1938b25bad1693084f582e067d669ee9869a1228bb6edd2bdbd45da54de"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.065600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" event={"ID":"df067294-2809-413f-b245-6cd57c0cbf11","Type":"ContainerStarted","Data":"42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.065699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" event={"ID":"df067294-2809-413f-b245-6cd57c0cbf11","Type":"ContainerStarted","Data":"2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.069584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" event={"ID":"ed6331a2-4b0a-407e-8ee0-3b1a82343071","Type":"ContainerStarted","Data":"00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.069650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" 
event={"ID":"ed6331a2-4b0a-407e-8ee0-3b1a82343071","Type":"ContainerStarted","Data":"8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.070670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c71ee2cc-4277-43de-bcde-3164505d2ea2","Type":"ContainerStarted","Data":"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.070692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c71ee2cc-4277-43de-bcde-3164505d2ea2","Type":"ContainerStarted","Data":"93af4f1cc90a6a53516f1f41e282942f3e5d3e31f5c9a5579cc497ddb1922333"} Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.085569 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.126261029 podStartE2EDuration="6.085553394s" podCreationTimestamp="2026-01-21 15:17:01 +0000 UTC" firstStartedPulling="2026-01-21 15:17:02.063935596 +0000 UTC m=+919.245451818" lastFinishedPulling="2026-01-21 15:17:06.023227961 +0000 UTC m=+923.204744183" observedRunningTime="2026-01-21 15:17:07.079927471 +0000 UTC m=+924.261443692" watchObservedRunningTime="2026-01-21 15:17:07.085553394 +0000 UTC m=+924.267069617" Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.104175 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" podStartSLOduration=2.085788276 podStartE2EDuration="4.104156571s" podCreationTimestamp="2026-01-21 15:17:03 +0000 UTC" firstStartedPulling="2026-01-21 15:17:04.007751316 +0000 UTC m=+921.189267538" lastFinishedPulling="2026-01-21 15:17:06.026119622 +0000 UTC m=+923.207635833" observedRunningTime="2026-01-21 15:17:07.102570697 +0000 UTC m=+924.284086919" watchObservedRunningTime="2026-01-21 15:17:07.104156571 +0000 UTC m=+924.285672792" Jan 21 15:17:07 crc kubenswrapper[4707]: I0121 15:17:07.118853 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" podStartSLOduration=2.025119524 podStartE2EDuration="4.11883857s" podCreationTimestamp="2026-01-21 15:17:03 +0000 UTC" firstStartedPulling="2026-01-21 15:17:03.930206278 +0000 UTC m=+921.111722500" lastFinishedPulling="2026-01-21 15:17:06.023925324 +0000 UTC m=+923.205441546" observedRunningTime="2026-01-21 15:17:07.114930819 +0000 UTC m=+924.296447041" watchObservedRunningTime="2026-01-21 15:17:07.11883857 +0000 UTC m=+924.300354792" Jan 21 15:17:08 crc kubenswrapper[4707]: I0121 15:17:08.078016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c638341-68df-4182-b322-77cc26ed319e","Type":"ContainerStarted","Data":"bebf25222697ac5df42b3e35ade3fd7785796fac1d2f3d0bd18f992e9a4d3cef"} Jan 21 15:17:08 crc kubenswrapper[4707]: I0121 15:17:08.080915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c71ee2cc-4277-43de-bcde-3164505d2ea2","Type":"ContainerStarted","Data":"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394"} Jan 21 15:17:08 crc kubenswrapper[4707]: I0121 15:17:08.081666 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:08 crc kubenswrapper[4707]: I0121 15:17:08.099391 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.099354584 podStartE2EDuration="3.099354584s" podCreationTimestamp="2026-01-21 15:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:08.092902916 +0000 UTC m=+925.274419138" watchObservedRunningTime="2026-01-21 15:17:08.099354584 +0000 UTC m=+925.280870806" Jan 21 15:17:09 crc kubenswrapper[4707]: I0121 15:17:09.091438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c638341-68df-4182-b322-77cc26ed319e","Type":"ContainerStarted","Data":"99eb57f52202cc408b7c8645ecd36f4c6e8909b7e8bcca2637281ff0d9c6ed39"} Jan 21 15:17:09 crc kubenswrapper[4707]: I0121 15:17:09.110169 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.304466779 podStartE2EDuration="4.110155889s" podCreationTimestamp="2026-01-21 15:17:05 +0000 UTC" firstStartedPulling="2026-01-21 15:17:06.528062345 +0000 UTC m=+923.709578566" lastFinishedPulling="2026-01-21 15:17:07.333751444 +0000 UTC m=+924.515267676" observedRunningTime="2026-01-21 15:17:09.106202061 +0000 UTC m=+926.287718283" watchObservedRunningTime="2026-01-21 15:17:09.110155889 +0000 UTC m=+926.291672110" Jan 21 15:17:09 crc kubenswrapper[4707]: I0121 15:17:09.681499 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.125155 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-69647dd576-j6h6n"] Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.126394 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.127908 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.130740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.139855 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-69647dd576-j6h6n"] Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fee1703-b855-46f5-9bc3-89c66b3377ea-logs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69sb7\" (UniqueName: \"kubernetes.io/projected/1fee1703-b855-46f5-9bc3-89c66b3377ea-kube-api-access-69sb7\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.273593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " 
pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fee1703-b855-46f5-9bc3-89c66b3377ea-logs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69sb7\" (UniqueName: \"kubernetes.io/projected/1fee1703-b855-46f5-9bc3-89c66b3377ea-kube-api-access-69sb7\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.375659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.377072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fee1703-b855-46f5-9bc3-89c66b3377ea-logs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.381639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: 
\"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.382225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.384307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.387141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.388387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.388635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69sb7\" (UniqueName: \"kubernetes.io/projected/1fee1703-b855-46f5-9bc3-89c66b3377ea-kube-api-access-69sb7\") pod \"barbican-api-69647dd576-j6h6n\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.441088 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.517390 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:10 crc kubenswrapper[4707]: I0121 15:17:10.856250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-69647dd576-j6h6n"] Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.106712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" event={"ID":"1fee1703-b855-46f5-9bc3-89c66b3377ea","Type":"ContainerStarted","Data":"8dbd5877a5c421fb4b87c4bb6c848927f876f78802f4b31d44917a93f5d9c274"} Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.107098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" event={"ID":"1fee1703-b855-46f5-9bc3-89c66b3377ea","Type":"ContainerStarted","Data":"213eab423feed883e957edd4043262addaa6f7acc6a2303daa55c0be03a83ae6"} Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.106891 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api-log" containerID="cri-o://44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853" gracePeriod=30 Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.107584 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api" containerID="cri-o://07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394" gracePeriod=30 Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.561833 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.696715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-combined-ca-bundle\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.696765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71ee2cc-4277-43de-bcde-3164505d2ea2-logs\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.696801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71ee2cc-4277-43de-bcde-3164505d2ea2-etc-machine-id\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.696979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c71ee2cc-4277-43de-bcde-3164505d2ea2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.697082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.697124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data-custom\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.697217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckmth\" (UniqueName: \"kubernetes.io/projected/c71ee2cc-4277-43de-bcde-3164505d2ea2-kube-api-access-ckmth\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.697259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-scripts\") pod \"c71ee2cc-4277-43de-bcde-3164505d2ea2\" (UID: \"c71ee2cc-4277-43de-bcde-3164505d2ea2\") " Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.697270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71ee2cc-4277-43de-bcde-3164505d2ea2-logs" (OuterVolumeSpecName: "logs") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.698199 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71ee2cc-4277-43de-bcde-3164505d2ea2-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.698224 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71ee2cc-4277-43de-bcde-3164505d2ea2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.701890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.702432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71ee2cc-4277-43de-bcde-3164505d2ea2-kube-api-access-ckmth" (OuterVolumeSpecName: "kube-api-access-ckmth") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "kube-api-access-ckmth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.702801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-scripts" (OuterVolumeSpecName: "scripts") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.720684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.733327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data" (OuterVolumeSpecName: "config-data") pod "c71ee2cc-4277-43de-bcde-3164505d2ea2" (UID: "c71ee2cc-4277-43de-bcde-3164505d2ea2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.799302 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.799351 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.799363 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckmth\" (UniqueName: \"kubernetes.io/projected/c71ee2cc-4277-43de-bcde-3164505d2ea2-kube-api-access-ckmth\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.799373 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:11 crc kubenswrapper[4707]: I0121 15:17:11.799382 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71ee2cc-4277-43de-bcde-3164505d2ea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.114355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" event={"ID":"1fee1703-b855-46f5-9bc3-89c66b3377ea","Type":"ContainerStarted","Data":"acc95ab6fc21a9f96f200600c32518407710117ccce4e8740117fccde8402fe5"} Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.114570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116288 4707 generic.go:334] "Generic (PLEG): container finished" podID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerID="07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394" exitCode=0 Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116320 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerID="44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853" exitCode=143 Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116327 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c71ee2cc-4277-43de-bcde-3164505d2ea2","Type":"ContainerDied","Data":"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394"} Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c71ee2cc-4277-43de-bcde-3164505d2ea2","Type":"ContainerDied","Data":"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853"} Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c71ee2cc-4277-43de-bcde-3164505d2ea2","Type":"ContainerDied","Data":"93af4f1cc90a6a53516f1f41e282942f3e5d3e31f5c9a5579cc497ddb1922333"} Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.116389 4707 scope.go:117] "RemoveContainer" containerID="07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.134385 4707 scope.go:117] "RemoveContainer" containerID="44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.143433 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" podStartSLOduration=2.143417788 podStartE2EDuration="2.143417788s" podCreationTimestamp="2026-01-21 15:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:12.133509169 +0000 UTC m=+929.315025391" watchObservedRunningTime="2026-01-21 15:17:12.143417788 +0000 UTC m=+929.324934011" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.146867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.152876 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.166043 4707 scope.go:117] "RemoveContainer" containerID="07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394" Jan 21 15:17:12 crc kubenswrapper[4707]: E0121 15:17:12.169906 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394\": container with ID starting with 07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394 not found: ID does not exist" containerID="07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.169941 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394"} err="failed to get container status \"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394\": rpc error: code = NotFound desc = could 
not find container \"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394\": container with ID starting with 07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394 not found: ID does not exist" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.169964 4707 scope.go:117] "RemoveContainer" containerID="44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853" Jan 21 15:17:12 crc kubenswrapper[4707]: E0121 15:17:12.170314 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853\": container with ID starting with 44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853 not found: ID does not exist" containerID="44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.170353 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853"} err="failed to get container status \"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853\": rpc error: code = NotFound desc = could not find container \"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853\": container with ID starting with 44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853 not found: ID does not exist" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.170380 4707 scope.go:117] "RemoveContainer" containerID="07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.170646 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394"} err="failed to get container status \"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394\": rpc error: code = NotFound desc = could not find container \"07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394\": container with ID starting with 07b9cacf5226b2fcfacbfeaf5e8431d230cb27c0f46c9b5f18cef9cda1bad394 not found: ID does not exist" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.170674 4707 scope.go:117] "RemoveContainer" containerID="44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.170959 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853"} err="failed to get container status \"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853\": rpc error: code = NotFound desc = could not find container \"44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853\": container with ID starting with 44d9012cfe012b9fd0b828a7f1e553b0b6113f0edafad7fdea74a557ea4e8853 not found: ID does not exist" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.172974 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:12 crc kubenswrapper[4707]: E0121 15:17:12.173289 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api-log" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.173302 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api-log" Jan 
21 15:17:12 crc kubenswrapper[4707]: E0121 15:17:12.173317 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.173323 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.173485 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api-log" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.173524 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" containerName="cinder-api" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.174327 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.176899 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.177115 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.177634 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.180283 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.309459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-scripts\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.309495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f429a9-592b-4ff8-a548-b5e59c4c481e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.309693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.309748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57f429a9-592b-4ff8-a548-b5e59c4c481e-logs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.309828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.309879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghvgv\" (UniqueName: \"kubernetes.io/projected/57f429a9-592b-4ff8-a548-b5e59c4c481e-kube-api-access-ghvgv\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.310058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.310097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data-custom\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.310148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.410992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data-custom\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-scripts\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f429a9-592b-4ff8-a548-b5e59c4c481e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411262 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57f429a9-592b-4ff8-a548-b5e59c4c481e-logs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghvgv\" (UniqueName: \"kubernetes.io/projected/57f429a9-592b-4ff8-a548-b5e59c4c481e-kube-api-access-ghvgv\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f429a9-592b-4ff8-a548-b5e59c4c481e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.411843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57f429a9-592b-4ff8-a548-b5e59c4c481e-logs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.414260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.414282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.414307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-scripts\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.414373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data-custom\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc 
kubenswrapper[4707]: I0121 15:17:12.414473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.415848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.428105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghvgv\" (UniqueName: \"kubernetes.io/projected/57f429a9-592b-4ff8-a548-b5e59c4c481e-kube-api-access-ghvgv\") pod \"cinder-api-0\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.490483 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:12 crc kubenswrapper[4707]: I0121 15:17:12.847146 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:17:12 crc kubenswrapper[4707]: W0121 15:17:12.853883 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57f429a9_592b_4ff8_a548_b5e59c4c481e.slice/crio-a70e5a882345599d7fdd32496b92c15f7341dd3f3843e593a68dd6238b7c3677 WatchSource:0}: Error finding container a70e5a882345599d7fdd32496b92c15f7341dd3f3843e593a68dd6238b7c3677: Status 404 returned error can't find the container with id a70e5a882345599d7fdd32496b92c15f7341dd3f3843e593a68dd6238b7c3677 Jan 21 15:17:13 crc kubenswrapper[4707]: I0121 15:17:13.127770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"57f429a9-592b-4ff8-a548-b5e59c4c481e","Type":"ContainerStarted","Data":"a70e5a882345599d7fdd32496b92c15f7341dd3f3843e593a68dd6238b7c3677"} Jan 21 15:17:13 crc kubenswrapper[4707]: I0121 15:17:13.127829 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:13 crc kubenswrapper[4707]: I0121 15:17:13.199837 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71ee2cc-4277-43de-bcde-3164505d2ea2" path="/var/lib/kubelet/pods/c71ee2cc-4277-43de-bcde-3164505d2ea2/volumes" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.138117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"57f429a9-592b-4ff8-a548-b5e59c4c481e","Type":"ContainerStarted","Data":"291188ab585da8ca8a57ab46efd037f8e3984e81e67c3a618faa0dd418123a81"} Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.138324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"57f429a9-592b-4ff8-a548-b5e59c4c481e","Type":"ContainerStarted","Data":"97bae6a3ad6e694b48ede1784c23518cbd9dbfc8a5d2d44ec3582c962c55a005"} Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.138357 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 
15:17:14.156763 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.156747607 podStartE2EDuration="2.156747607s" podCreationTimestamp="2026-01-21 15:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:14.155168676 +0000 UTC m=+931.336684899" watchObservedRunningTime="2026-01-21 15:17:14.156747607 +0000 UTC m=+931.338263829" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.528795 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.764872 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.805326 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.935068 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:14 crc kubenswrapper[4707]: I0121 15:17:14.967612 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:15 crc kubenswrapper[4707]: I0121 15:17:15.600257 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbm6k"] Jan 21 15:17:15 crc kubenswrapper[4707]: I0121 15:17:15.682118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:15 crc kubenswrapper[4707]: I0121 15:17:15.708642 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.149175 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbm6k" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="registry-server" containerID="cri-o://38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7" gracePeriod=2 Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.149714 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="cinder-scheduler" containerID="cri-o://bebf25222697ac5df42b3e35ade3fd7785796fac1d2f3d0bd18f992e9a4d3cef" gracePeriod=30 Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.149759 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="probe" containerID="cri-o://99eb57f52202cc408b7c8645ecd36f4c6e8909b7e8bcca2637281ff0d9c6ed39" gracePeriod=30 Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.547850 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.680697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-utilities\") pod \"d32d166e-4675-461c-8340-c9a1e99451e3\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.680747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc284\" (UniqueName: \"kubernetes.io/projected/d32d166e-4675-461c-8340-c9a1e99451e3-kube-api-access-qc284\") pod \"d32d166e-4675-461c-8340-c9a1e99451e3\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.680871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-catalog-content\") pod \"d32d166e-4675-461c-8340-c9a1e99451e3\" (UID: \"d32d166e-4675-461c-8340-c9a1e99451e3\") " Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.681416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-utilities" (OuterVolumeSpecName: "utilities") pod "d32d166e-4675-461c-8340-c9a1e99451e3" (UID: "d32d166e-4675-461c-8340-c9a1e99451e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.684977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d32d166e-4675-461c-8340-c9a1e99451e3-kube-api-access-qc284" (OuterVolumeSpecName: "kube-api-access-qc284") pod "d32d166e-4675-461c-8340-c9a1e99451e3" (UID: "d32d166e-4675-461c-8340-c9a1e99451e3"). InnerVolumeSpecName "kube-api-access-qc284". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.752995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d32d166e-4675-461c-8340-c9a1e99451e3" (UID: "d32d166e-4675-461c-8340-c9a1e99451e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.756582 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.782821 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.782845 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc284\" (UniqueName: \"kubernetes.io/projected/d32d166e-4675-461c-8340-c9a1e99451e3-kube-api-access-qc284\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:16 crc kubenswrapper[4707]: I0121 15:17:16.782856 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d32d166e-4675-461c-8340-c9a1e99451e3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.157448 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c638341-68df-4182-b322-77cc26ed319e" containerID="99eb57f52202cc408b7c8645ecd36f4c6e8909b7e8bcca2637281ff0d9c6ed39" exitCode=0 Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.157522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c638341-68df-4182-b322-77cc26ed319e","Type":"ContainerDied","Data":"99eb57f52202cc408b7c8645ecd36f4c6e8909b7e8bcca2637281ff0d9c6ed39"} Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.159777 4707 generic.go:334] "Generic (PLEG): container finished" podID="d32d166e-4675-461c-8340-c9a1e99451e3" containerID="38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7" exitCode=0 Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.159803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerDied","Data":"38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7"} Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.159835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm6k" event={"ID":"d32d166e-4675-461c-8340-c9a1e99451e3","Type":"ContainerDied","Data":"c6a160060edee6e60bbd4fbb8d43fc37816a5032aa7996fce67dece185428430"} Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.159850 4707 scope.go:117] "RemoveContainer" containerID="38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.159871 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm6k" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.177894 4707 scope.go:117] "RemoveContainer" containerID="14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.208892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbm6k"] Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.211437 4707 scope.go:117] "RemoveContainer" containerID="12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.211491 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbm6k"] Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.226495 4707 scope.go:117] "RemoveContainer" containerID="38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7" Jan 21 15:17:17 crc kubenswrapper[4707]: E0121 15:17:17.226779 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7\": container with ID starting with 38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7 not found: ID does not exist" containerID="38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.226828 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7"} err="failed to get container status \"38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7\": rpc error: code = NotFound desc = could not find container \"38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7\": container with ID starting with 38c908150023a00a7b86a462b52b782301ee14a14202933d851c27a20860a6a7 not found: ID does not exist" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.226848 4707 scope.go:117] "RemoveContainer" containerID="14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba" Jan 21 15:17:17 crc kubenswrapper[4707]: E0121 15:17:17.227071 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba\": container with ID starting with 14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba not found: ID does not exist" containerID="14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.227091 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba"} err="failed to get container status \"14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba\": rpc error: code = NotFound desc = could not find container \"14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba\": container with ID starting with 14ffa1f523ec8715513e0ab6fda040e2d54bf50c6afd4749fcb8fe9a59ce86ba not found: ID does not exist" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.227103 4707 scope.go:117] "RemoveContainer" containerID="12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b" Jan 21 15:17:17 crc kubenswrapper[4707]: E0121 15:17:17.227399 4707 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b\": container with ID starting with 12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b not found: ID does not exist" containerID="12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.227416 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b"} err="failed to get container status \"12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b\": rpc error: code = NotFound desc = could not find container \"12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b\": container with ID starting with 12abe59ab1589a207cbf7f64c4796a4c08e209f3d8cf3af5dcdf336d8ebaa06b not found: ID does not exist" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.237463 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.293769 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8488cbf69-kmsfx"] Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.295927 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-api" containerID="cri-o://aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe" gracePeriod=30 Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.296000 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-httpd" containerID="cri-o://b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7" gracePeriod=30 Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.935336 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.974859 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2"] Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.976106 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api" containerID="cri-o://3b500bb7e40583e7b21a96e40dea3af99d0120e7b2ed2f9a45dcff38e9a6fd31" gracePeriod=30 Jan 21 15:17:17 crc kubenswrapper[4707]: I0121 15:17:17.975980 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api-log" containerID="cri-o://2dc93d3cda490fc3cbd6dd820c186b43e7850a11a7ddaff0ced763b88d4b3b4d" gracePeriod=30 Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.169210 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerID="b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7" exitCode=0 Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.169274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" event={"ID":"f3586105-9aee-47cc-98fb-0d83f6214b4b","Type":"ContainerDied","Data":"b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7"} Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.170688 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c638341-68df-4182-b322-77cc26ed319e" containerID="bebf25222697ac5df42b3e35ade3fd7785796fac1d2f3d0bd18f992e9a4d3cef" exitCode=0 Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.170740 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c638341-68df-4182-b322-77cc26ed319e","Type":"ContainerDied","Data":"bebf25222697ac5df42b3e35ade3fd7785796fac1d2f3d0bd18f992e9a4d3cef"} Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.172057 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerID="2dc93d3cda490fc3cbd6dd820c186b43e7850a11a7ddaff0ced763b88d4b3b4d" exitCode=143 Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.172078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" event={"ID":"b8f97fda-b4d4-4f2c-88b9-8e91f147479c","Type":"ContainerDied","Data":"2dc93d3cda490fc3cbd6dd820c186b43e7850a11a7ddaff0ced763b88d4b3b4d"} Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.428115 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.517395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c638341-68df-4182-b322-77cc26ed319e-etc-machine-id\") pod \"3c638341-68df-4182-b322-77cc26ed319e\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.517437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data-custom\") pod \"3c638341-68df-4182-b322-77cc26ed319e\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.517497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsmff\" (UniqueName: \"kubernetes.io/projected/3c638341-68df-4182-b322-77cc26ed319e-kube-api-access-xsmff\") pod \"3c638341-68df-4182-b322-77cc26ed319e\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.517504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c638341-68df-4182-b322-77cc26ed319e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3c638341-68df-4182-b322-77cc26ed319e" (UID: "3c638341-68df-4182-b322-77cc26ed319e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.518218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data\") pod \"3c638341-68df-4182-b322-77cc26ed319e\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.518267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-scripts\") pod \"3c638341-68df-4182-b322-77cc26ed319e\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.518289 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-combined-ca-bundle\") pod \"3c638341-68df-4182-b322-77cc26ed319e\" (UID: \"3c638341-68df-4182-b322-77cc26ed319e\") " Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.518672 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c638341-68df-4182-b322-77cc26ed319e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.522304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c638341-68df-4182-b322-77cc26ed319e-kube-api-access-xsmff" (OuterVolumeSpecName: "kube-api-access-xsmff") pod "3c638341-68df-4182-b322-77cc26ed319e" (UID: "3c638341-68df-4182-b322-77cc26ed319e"). InnerVolumeSpecName "kube-api-access-xsmff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.522795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c638341-68df-4182-b322-77cc26ed319e" (UID: "3c638341-68df-4182-b322-77cc26ed319e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.524062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-scripts" (OuterVolumeSpecName: "scripts") pod "3c638341-68df-4182-b322-77cc26ed319e" (UID: "3c638341-68df-4182-b322-77cc26ed319e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.555876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c638341-68df-4182-b322-77cc26ed319e" (UID: "3c638341-68df-4182-b322-77cc26ed319e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.582752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data" (OuterVolumeSpecName: "config-data") pod "3c638341-68df-4182-b322-77cc26ed319e" (UID: "3c638341-68df-4182-b322-77cc26ed319e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.620320 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.620435 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsmff\" (UniqueName: \"kubernetes.io/projected/3c638341-68df-4182-b322-77cc26ed319e-kube-api-access-xsmff\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.620493 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.620542 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:18 crc kubenswrapper[4707]: I0121 15:17:18.620594 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c638341-68df-4182-b322-77cc26ed319e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.183844 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.189934 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" path="/var/lib/kubelet/pods/d32d166e-4675-461c-8340-c9a1e99451e3/volumes" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.190723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c638341-68df-4182-b322-77cc26ed319e","Type":"ContainerDied","Data":"7f1ac1938b25bad1693084f582e067d669ee9869a1228bb6edd2bdbd45da54de"} Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.190756 4707 scope.go:117] "RemoveContainer" containerID="99eb57f52202cc408b7c8645ecd36f4c6e8909b7e8bcca2637281ff0d9c6ed39" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.219689 4707 scope.go:117] "RemoveContainer" containerID="bebf25222697ac5df42b3e35ade3fd7785796fac1d2f3d0bd18f992e9a4d3cef" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.226756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.237837 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245230 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:19 crc kubenswrapper[4707]: E0121 15:17:19.245568 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="extract-utilities" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245586 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="extract-utilities" Jan 21 15:17:19 crc kubenswrapper[4707]: E0121 15:17:19.245602 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="extract-content" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245608 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="extract-content" Jan 21 15:17:19 crc kubenswrapper[4707]: E0121 15:17:19.245635 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="probe" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245641 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="probe" Jan 21 15:17:19 crc kubenswrapper[4707]: E0121 15:17:19.245656 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="registry-server" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="registry-server" Jan 21 15:17:19 crc kubenswrapper[4707]: E0121 15:17:19.245687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="cinder-scheduler" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="cinder-scheduler" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245888 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="cinder-scheduler" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c638341-68df-4182-b322-77cc26ed319e" containerName="probe" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.245937 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d32d166e-4675-461c-8340-c9a1e99451e3" containerName="registry-server" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.246728 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.248651 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.251235 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.430972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt4jf\" (UniqueName: \"kubernetes.io/projected/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-kube-api-access-zt4jf\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.431028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-scripts\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.431078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.431152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.431203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.431277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-scripts\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533504 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.533672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt4jf\" (UniqueName: \"kubernetes.io/projected/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-kube-api-access-zt4jf\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.537953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.538071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.538640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-scripts\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.539721 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.550048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-zt4jf\" (UniqueName: \"kubernetes.io/projected/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-kube-api-access-zt4jf\") pod \"cinder-scheduler-0\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:19 crc kubenswrapper[4707]: I0121 15:17:19.566822 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:20 crc kubenswrapper[4707]: I0121 15:17:20.001962 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:17:20 crc kubenswrapper[4707]: W0121 15:17:20.012375 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf44cdae5_61bb_40c3_a1d7_e10d4d64c203.slice/crio-c09323e85c1282cf63762ad55eecc61b7b5585b62eca846baaa50e2dad473df8 WatchSource:0}: Error finding container c09323e85c1282cf63762ad55eecc61b7b5585b62eca846baaa50e2dad473df8: Status 404 returned error can't find the container with id c09323e85c1282cf63762ad55eecc61b7b5585b62eca846baaa50e2dad473df8 Jan 21 15:17:20 crc kubenswrapper[4707]: I0121 15:17:20.179066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:17:20 crc kubenswrapper[4707]: I0121 15:17:20.201220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f44cdae5-61bb-40c3-a1d7-e10d4d64c203","Type":"ContainerStarted","Data":"c09323e85c1282cf63762ad55eecc61b7b5585b62eca846baaa50e2dad473df8"} Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.112189 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": read tcp 10.217.0.2:41444->10.217.0.147:9311: read: connection reset by peer" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.112214 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": read tcp 10.217.0.2:41446->10.217.0.147:9311: read: connection reset by peer" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.195759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c638341-68df-4182-b322-77cc26ed319e" path="/var/lib/kubelet/pods/3c638341-68df-4182-b322-77cc26ed319e/volumes" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.214734 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerID="3b500bb7e40583e7b21a96e40dea3af99d0120e7b2ed2f9a45dcff38e9a6fd31" exitCode=0 Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.214797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" event={"ID":"b8f97fda-b4d4-4f2c-88b9-8e91f147479c","Type":"ContainerDied","Data":"3b500bb7e40583e7b21a96e40dea3af99d0120e7b2ed2f9a45dcff38e9a6fd31"} Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.215445 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 
15:17:21.217181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f44cdae5-61bb-40c3-a1d7-e10d4d64c203","Type":"ContainerStarted","Data":"5775ffa72419df9485ac7699c70a095e09fa60e7badfeb84512d63346e47f9f6"} Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.217220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f44cdae5-61bb-40c3-a1d7-e10d4d64c203","Type":"ContainerStarted","Data":"9fa2d6082708378d81429a182459dfd951edb40a1d798ddd9d54ac8fa55269de"} Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.272123 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.272109688 podStartE2EDuration="2.272109688s" podCreationTimestamp="2026-01-21 15:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:21.256037042 +0000 UTC m=+938.437553264" watchObservedRunningTime="2026-01-21 15:17:21.272109688 +0000 UTC m=+938.453625910" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.566047 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.666774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data\") pod \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.667115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-combined-ca-bundle\") pod \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.667282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data-custom\") pod \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.667353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-logs\") pod \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.667410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mms99\" (UniqueName: \"kubernetes.io/projected/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-kube-api-access-mms99\") pod \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\" (UID: \"b8f97fda-b4d4-4f2c-88b9-8e91f147479c\") " Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.668730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-logs" (OuterVolumeSpecName: "logs") pod "b8f97fda-b4d4-4f2c-88b9-8e91f147479c" (UID: "b8f97fda-b4d4-4f2c-88b9-8e91f147479c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.675362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-kube-api-access-mms99" (OuterVolumeSpecName: "kube-api-access-mms99") pod "b8f97fda-b4d4-4f2c-88b9-8e91f147479c" (UID: "b8f97fda-b4d4-4f2c-88b9-8e91f147479c"). InnerVolumeSpecName "kube-api-access-mms99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.688828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b8f97fda-b4d4-4f2c-88b9-8e91f147479c" (UID: "b8f97fda-b4d4-4f2c-88b9-8e91f147479c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.693661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8f97fda-b4d4-4f2c-88b9-8e91f147479c" (UID: "b8f97fda-b4d4-4f2c-88b9-8e91f147479c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.713591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data" (OuterVolumeSpecName: "config-data") pod "b8f97fda-b4d4-4f2c-88b9-8e91f147479c" (UID: "b8f97fda-b4d4-4f2c-88b9-8e91f147479c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.769101 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.769129 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.769142 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mms99\" (UniqueName: \"kubernetes.io/projected/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-kube-api-access-mms99\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.769152 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:21 crc kubenswrapper[4707]: I0121 15:17:21.769160 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f97fda-b4d4-4f2c-88b9-8e91f147479c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:22 crc kubenswrapper[4707]: I0121 15:17:22.225652 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" Jan 21 15:17:22 crc kubenswrapper[4707]: I0121 15:17:22.225662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2" event={"ID":"b8f97fda-b4d4-4f2c-88b9-8e91f147479c","Type":"ContainerDied","Data":"64d76e4e6473af9156cf3a38b2c31767c7d71226e45d28dd2c5033d6fb00fc79"} Jan 21 15:17:22 crc kubenswrapper[4707]: I0121 15:17:22.226397 4707 scope.go:117] "RemoveContainer" containerID="3b500bb7e40583e7b21a96e40dea3af99d0120e7b2ed2f9a45dcff38e9a6fd31" Jan 21 15:17:22 crc kubenswrapper[4707]: I0121 15:17:22.254242 4707 scope.go:117] "RemoveContainer" containerID="2dc93d3cda490fc3cbd6dd820c186b43e7850a11a7ddaff0ced763b88d4b3b4d" Jan 21 15:17:22 crc kubenswrapper[4707]: I0121 15:17:22.254989 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2"] Jan 21 15:17:22 crc kubenswrapper[4707]: I0121 15:17:22.261148 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-66969f6cd6-htsz2"] Jan 21 15:17:23 crc kubenswrapper[4707]: I0121 15:17:23.190179 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" path="/var/lib/kubelet/pods/b8f97fda-b4d4-4f2c-88b9-8e91f147479c/volumes" Jan 21 15:17:24 crc kubenswrapper[4707]: I0121 15:17:24.001457 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:17:24 crc kubenswrapper[4707]: I0121 15:17:24.567871 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:24 crc kubenswrapper[4707]: I0121 15:17:24.630245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.589630 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.771693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-httpd-config\") pod \"f3586105-9aee-47cc-98fb-0d83f6214b4b\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.771847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-ovndb-tls-certs\") pod \"f3586105-9aee-47cc-98fb-0d83f6214b4b\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.771928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5dmn\" (UniqueName: \"kubernetes.io/projected/f3586105-9aee-47cc-98fb-0d83f6214b4b-kube-api-access-n5dmn\") pod \"f3586105-9aee-47cc-98fb-0d83f6214b4b\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.772019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-combined-ca-bundle\") pod \"f3586105-9aee-47cc-98fb-0d83f6214b4b\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.772058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-config\") pod \"f3586105-9aee-47cc-98fb-0d83f6214b4b\" (UID: \"f3586105-9aee-47cc-98fb-0d83f6214b4b\") " Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.776002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f3586105-9aee-47cc-98fb-0d83f6214b4b" (UID: "f3586105-9aee-47cc-98fb-0d83f6214b4b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.784363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3586105-9aee-47cc-98fb-0d83f6214b4b-kube-api-access-n5dmn" (OuterVolumeSpecName: "kube-api-access-n5dmn") pod "f3586105-9aee-47cc-98fb-0d83f6214b4b" (UID: "f3586105-9aee-47cc-98fb-0d83f6214b4b"). InnerVolumeSpecName "kube-api-access-n5dmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.807433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-config" (OuterVolumeSpecName: "config") pod "f3586105-9aee-47cc-98fb-0d83f6214b4b" (UID: "f3586105-9aee-47cc-98fb-0d83f6214b4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.811970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3586105-9aee-47cc-98fb-0d83f6214b4b" (UID: "f3586105-9aee-47cc-98fb-0d83f6214b4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.819951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f3586105-9aee-47cc-98fb-0d83f6214b4b" (UID: "f3586105-9aee-47cc-98fb-0d83f6214b4b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.873887 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.873914 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.873925 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5dmn\" (UniqueName: \"kubernetes.io/projected/f3586105-9aee-47cc-98fb-0d83f6214b4b-kube-api-access-n5dmn\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.873933 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:27 crc kubenswrapper[4707]: I0121 15:17:27.873942 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3586105-9aee-47cc-98fb-0d83f6214b4b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.265955 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerID="aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe" exitCode=0 Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.266003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" event={"ID":"f3586105-9aee-47cc-98fb-0d83f6214b4b","Type":"ContainerDied","Data":"aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe"} Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.266202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" event={"ID":"f3586105-9aee-47cc-98fb-0d83f6214b4b","Type":"ContainerDied","Data":"e5fd869ba52bec6c144ee2772e41a87b57cd6741cc6c9803c7446dff6cb36e84"} Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.266034 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8488cbf69-kmsfx" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.266269 4707 scope.go:117] "RemoveContainer" containerID="b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.283280 4707 scope.go:117] "RemoveContainer" containerID="aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.295492 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8488cbf69-kmsfx"] Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.305457 4707 scope.go:117] "RemoveContainer" containerID="b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7" Jan 21 15:17:28 crc kubenswrapper[4707]: E0121 15:17:28.306200 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7\": container with ID starting with b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7 not found: ID does not exist" containerID="b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.306228 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7"} err="failed to get container status \"b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7\": rpc error: code = NotFound desc = could not find container \"b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7\": container with ID starting with b419df83e0f7af1268b88166d5efe75467435260e841d9b8f8b2ef1c8794d8b7 not found: ID does not exist" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.306253 4707 scope.go:117] "RemoveContainer" containerID="aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe" Jan 21 15:17:28 crc kubenswrapper[4707]: E0121 15:17:28.307175 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe\": container with ID starting with aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe not found: ID does not exist" containerID="aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.307197 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe"} err="failed to get container status \"aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe\": rpc error: code = NotFound desc = could not find container \"aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe\": container with ID starting with aec13187296d4ce16d42a0eb7b69580601c4cfec92d26fb72d2d04f56819f0fe not found: ID does not exist" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.308966 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-8488cbf69-kmsfx"] Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.801306 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:17:28 crc kubenswrapper[4707]: E0121 15:17:28.802231 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802295 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api" Jan 21 15:17:28 crc kubenswrapper[4707]: E0121 15:17:28.802347 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-httpd" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802397 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-httpd" Jan 21 15:17:28 crc kubenswrapper[4707]: E0121 15:17:28.802448 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-api" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802490 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-api" Jan 21 15:17:28 crc kubenswrapper[4707]: E0121 15:17:28.802533 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api-log" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802574 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api-log" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802758 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-api" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802920 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" containerName="neutron-httpd" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.802977 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f97fda-b4d4-4f2c-88b9-8e91f147479c" containerName="barbican-api-log" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.803494 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.805100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.805356 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-rpvwp" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.805575 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.807088 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.990866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.990942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config-secret\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.990973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdbw\" (UniqueName: \"kubernetes.io/projected/9220cbec-82de-4d8a-9bca-575edd0d7482-kube-api-access-9fdbw\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:28 crc kubenswrapper[4707]: I0121 15:17:28.991071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.092480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.092573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.092611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config-secret\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" 
Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.092638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fdbw\" (UniqueName: \"kubernetes.io/projected/9220cbec-82de-4d8a-9bca-575edd0d7482-kube-api-access-9fdbw\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.093312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.095856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config-secret\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.102226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.105125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fdbw\" (UniqueName: \"kubernetes.io/projected/9220cbec-82de-4d8a-9bca-575edd0d7482-kube-api-access-9fdbw\") pod \"openstackclient\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.115229 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.193679 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3586105-9aee-47cc-98fb-0d83f6214b4b" path="/var/lib/kubelet/pods/f3586105-9aee-47cc-98fb-0d83f6214b4b/volumes" Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.481191 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:17:29 crc kubenswrapper[4707]: W0121 15:17:29.487210 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9220cbec_82de_4d8a_9bca_575edd0d7482.slice/crio-901eb9b1e6f905c7ecceea72cfc473c81de78201fd750b65b379529c7c688fd0 WatchSource:0}: Error finding container 901eb9b1e6f905c7ecceea72cfc473c81de78201fd750b65b379529c7c688fd0: Status 404 returned error can't find the container with id 901eb9b1e6f905c7ecceea72cfc473c81de78201fd750b65b379529c7c688fd0 Jan 21 15:17:29 crc kubenswrapper[4707]: I0121 15:17:29.742583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:17:30 crc kubenswrapper[4707]: I0121 15:17:30.285777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"9220cbec-82de-4d8a-9bca-575edd0d7482","Type":"ContainerStarted","Data":"901eb9b1e6f905c7ecceea72cfc473c81de78201fd750b65b379529c7c688fd0"} Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.406408 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb"] Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.407706 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.408947 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.409568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.409728 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.425330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb"] Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.529349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-public-tls-certs\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.529575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-log-httpd\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.529682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfs9k\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-kube-api-access-pfs9k\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.529752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-internal-tls-certs\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.529916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-combined-ca-bundle\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.530070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-run-httpd\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.530303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-etc-swift\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.530450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-config-data\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-combined-ca-bundle\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-run-httpd\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-etc-swift\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-config-data\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-public-tls-certs\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-log-httpd\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfs9k\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-kube-api-access-pfs9k\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.634793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-internal-tls-certs\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.635186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-run-httpd\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.635446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-log-httpd\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.639845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-public-tls-certs\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.639936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-combined-ca-bundle\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.640321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-internal-tls-certs\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.640788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-etc-swift\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.647000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-config-data\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.655064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfs9k\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-kube-api-access-pfs9k\") pod \"swift-proxy-869fbc79bd-dzdrb\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.702300 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:31 crc kubenswrapper[4707]: I0121 15:17:31.723202 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:32 crc kubenswrapper[4707]: I0121 15:17:32.146328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb"] Jan 21 15:17:32 crc kubenswrapper[4707]: W0121 15:17:32.153987 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd582f12e_341b_45f5_a54a_8d016aed473a.slice/crio-3faa107af3aae21ad8749ef4ff20cfd921e18536a1be93695d7f985392e440fe WatchSource:0}: Error finding container 3faa107af3aae21ad8749ef4ff20cfd921e18536a1be93695d7f985392e440fe: Status 404 returned error can't find the container with id 3faa107af3aae21ad8749ef4ff20cfd921e18536a1be93695d7f985392e440fe Jan 21 15:17:32 crc kubenswrapper[4707]: I0121 15:17:32.302579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" event={"ID":"d582f12e-341b-45f5-a54a-8d016aed473a","Type":"ContainerStarted","Data":"3faa107af3aae21ad8749ef4ff20cfd921e18536a1be93695d7f985392e440fe"} Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.325525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" event={"ID":"d582f12e-341b-45f5-a54a-8d016aed473a","Type":"ContainerStarted","Data":"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214"} Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.325562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" event={"ID":"d582f12e-341b-45f5-a54a-8d016aed473a","Type":"ContainerStarted","Data":"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6"} Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.326084 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.344192 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" podStartSLOduration=2.344178464 podStartE2EDuration="2.344178464s" podCreationTimestamp="2026-01-21 15:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:33.339365671 +0000 UTC m=+950.520881894" watchObservedRunningTime="2026-01-21 15:17:33.344178464 +0000 UTC m=+950.525694687" Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.464455 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.464664 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-central-agent" containerID="cri-o://55ed378367ca41e2768d23e704055f2902ab694029b0245c1e2193c2cf12de3a" gracePeriod=30 Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.464739 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="sg-core" 
containerID="cri-o://14dcfb2ae7b3e3a53af3dbe31d6ae4d53fbe97c121e3651041bb4c0516b327ca" gracePeriod=30 Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.464745 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="proxy-httpd" containerID="cri-o://1a96fb1cd7fca9884505d9e415196af3e7d2a10cc56064cf332b930c9c566236" gracePeriod=30 Jan 21 15:17:33 crc kubenswrapper[4707]: I0121 15:17:33.464739 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-notification-agent" containerID="cri-o://5a92e9c55e7a6ebbcef5d232865bcf9d49d571117d63fc2036d14254e55dce2d" gracePeriod=30 Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.334235 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerID="1a96fb1cd7fca9884505d9e415196af3e7d2a10cc56064cf332b930c9c566236" exitCode=0 Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.334434 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerID="14dcfb2ae7b3e3a53af3dbe31d6ae4d53fbe97c121e3651041bb4c0516b327ca" exitCode=2 Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.334441 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerID="55ed378367ca41e2768d23e704055f2902ab694029b0245c1e2193c2cf12de3a" exitCode=0 Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.335028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerDied","Data":"1a96fb1cd7fca9884505d9e415196af3e7d2a10cc56064cf332b930c9c566236"} Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.335065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerDied","Data":"14dcfb2ae7b3e3a53af3dbe31d6ae4d53fbe97c121e3651041bb4c0516b327ca"} Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.335075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerDied","Data":"55ed378367ca41e2768d23e704055f2902ab694029b0245c1e2193c2cf12de3a"} Jan 21 15:17:34 crc kubenswrapper[4707]: I0121 15:17:34.335092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:37 crc kubenswrapper[4707]: I0121 15:17:37.371193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"9220cbec-82de-4d8a-9bca-575edd0d7482","Type":"ContainerStarted","Data":"30a82c874f0417a2ea868196fc295a6b7cdf393c79ed490a067806d77d373590"} Jan 21 15:17:37 crc kubenswrapper[4707]: I0121 15:17:37.385154 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.377332356 podStartE2EDuration="9.385140266s" podCreationTimestamp="2026-01-21 15:17:28 +0000 UTC" firstStartedPulling="2026-01-21 15:17:29.489202419 +0000 UTC m=+946.670718641" lastFinishedPulling="2026-01-21 15:17:36.49701033 +0000 UTC m=+953.678526551" observedRunningTime="2026-01-21 15:17:37.382642859 +0000 UTC 
m=+954.564159080" watchObservedRunningTime="2026-01-21 15:17:37.385140266 +0000 UTC m=+954.566656488" Jan 21 15:17:37 crc kubenswrapper[4707]: I0121 15:17:37.744241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:17:37 crc kubenswrapper[4707]: I0121 15:17:37.744426 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="73f7e559-7400-4d89-878e-7df47a936c82" containerName="kube-state-metrics" containerID="cri-o://d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be" gracePeriod=30 Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.111651 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wktmc"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.112699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.121557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wktmc"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.159471 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.172045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmnsb\" (UniqueName: \"kubernetes.io/projected/c9fe7962-fff4-44ce-a841-dab09e490d7c-kube-api-access-lmnsb\") pod \"nova-api-db-create-wktmc\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.172088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9fe7962-fff4-44ce-a841-dab09e490d7c-operator-scripts\") pod \"nova-api-db-create-wktmc\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.213698 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-rl5lm"] Jan 21 15:17:38 crc kubenswrapper[4707]: E0121 15:17:38.214104 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f7e559-7400-4d89-878e-7df47a936c82" containerName="kube-state-metrics" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.214121 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f7e559-7400-4d89-878e-7df47a936c82" containerName="kube-state-metrics" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.214285 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f7e559-7400-4d89-878e-7df47a936c82" containerName="kube-state-metrics" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.214786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.223482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.224582 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.226315 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.230414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.242599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-rl5lm"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.273381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrchp\" (UniqueName: \"kubernetes.io/projected/73f7e559-7400-4d89-878e-7df47a936c82-kube-api-access-rrchp\") pod \"73f7e559-7400-4d89-878e-7df47a936c82\" (UID: \"73f7e559-7400-4d89-878e-7df47a936c82\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.273626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9fe7962-fff4-44ce-a841-dab09e490d7c-operator-scripts\") pod \"nova-api-db-create-wktmc\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.273796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da0d4c7-0c65-4e22-be82-afdc94255989-operator-scripts\") pod \"nova-cell0-db-create-rl5lm\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.273978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec05603-c838-4ce1-bb91-5a2d047b4263-operator-scripts\") pod \"nova-api-aef2-account-create-update-s7lls\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.274017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v25k2\" (UniqueName: \"kubernetes.io/projected/cec05603-c838-4ce1-bb91-5a2d047b4263-kube-api-access-v25k2\") pod \"nova-api-aef2-account-create-update-s7lls\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.274045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pktm\" (UniqueName: \"kubernetes.io/projected/5da0d4c7-0c65-4e22-be82-afdc94255989-kube-api-access-2pktm\") pod \"nova-cell0-db-create-rl5lm\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.274071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmnsb\" (UniqueName: \"kubernetes.io/projected/c9fe7962-fff4-44ce-a841-dab09e490d7c-kube-api-access-lmnsb\") pod \"nova-api-db-create-wktmc\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " 
pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.275750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9fe7962-fff4-44ce-a841-dab09e490d7c-operator-scripts\") pod \"nova-api-db-create-wktmc\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.281123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f7e559-7400-4d89-878e-7df47a936c82-kube-api-access-rrchp" (OuterVolumeSpecName: "kube-api-access-rrchp") pod "73f7e559-7400-4d89-878e-7df47a936c82" (UID: "73f7e559-7400-4d89-878e-7df47a936c82"). InnerVolumeSpecName "kube-api-access-rrchp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.294268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmnsb\" (UniqueName: \"kubernetes.io/projected/c9fe7962-fff4-44ce-a841-dab09e490d7c-kube-api-access-lmnsb\") pod \"nova-api-db-create-wktmc\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.375923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da0d4c7-0c65-4e22-be82-afdc94255989-operator-scripts\") pod \"nova-cell0-db-create-rl5lm\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.376521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da0d4c7-0c65-4e22-be82-afdc94255989-operator-scripts\") pod \"nova-cell0-db-create-rl5lm\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.376692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec05603-c838-4ce1-bb91-5a2d047b4263-operator-scripts\") pod \"nova-api-aef2-account-create-update-s7lls\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.380246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec05603-c838-4ce1-bb91-5a2d047b4263-operator-scripts\") pod \"nova-api-aef2-account-create-update-s7lls\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.380336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v25k2\" (UniqueName: \"kubernetes.io/projected/cec05603-c838-4ce1-bb91-5a2d047b4263-kube-api-access-v25k2\") pod \"nova-api-aef2-account-create-update-s7lls\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.380620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2pktm\" (UniqueName: \"kubernetes.io/projected/5da0d4c7-0c65-4e22-be82-afdc94255989-kube-api-access-2pktm\") pod \"nova-cell0-db-create-rl5lm\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.380967 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrchp\" (UniqueName: \"kubernetes.io/projected/73f7e559-7400-4d89-878e-7df47a936c82-kube-api-access-rrchp\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.387957 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerID="5a92e9c55e7a6ebbcef5d232865bcf9d49d571117d63fc2036d14254e55dce2d" exitCode=0 Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.388014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerDied","Data":"5a92e9c55e7a6ebbcef5d232865bcf9d49d571117d63fc2036d14254e55dce2d"} Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.389088 4707 generic.go:334] "Generic (PLEG): container finished" podID="73f7e559-7400-4d89-878e-7df47a936c82" containerID="d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be" exitCode=2 Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.390581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"73f7e559-7400-4d89-878e-7df47a936c82","Type":"ContainerDied","Data":"d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be"} Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.390621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"73f7e559-7400-4d89-878e-7df47a936c82","Type":"ContainerDied","Data":"ed63453753c0e42b120a0f0c169fd6bfb2161a53177cfded6b95974546919f06"} Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.390640 4707 scope.go:117] "RemoveContainer" containerID="d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.391194 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.398078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pktm\" (UniqueName: \"kubernetes.io/projected/5da0d4c7-0c65-4e22-be82-afdc94255989-kube-api-access-2pktm\") pod \"nova-cell0-db-create-rl5lm\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.407417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v25k2\" (UniqueName: \"kubernetes.io/projected/cec05603-c838-4ce1-bb91-5a2d047b4263-kube-api-access-v25k2\") pod \"nova-api-aef2-account-create-update-s7lls\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.417255 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gpd8j"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.418241 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.429415 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.430270 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.432258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.449385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gpd8j"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.454093 4707 scope.go:117] "RemoveContainer" containerID="d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be" Jan 21 15:17:38 crc kubenswrapper[4707]: E0121 15:17:38.457982 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be\": container with ID starting with d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be not found: ID does not exist" containerID="d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.458015 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be"} err="failed to get container status \"d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be\": rpc error: code = NotFound desc = could not find container \"d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be\": container with ID starting with d8a3910121e6a30a4051812196e39bf11847590a9a7012791a9e689f82b144be not found: ID does not exist" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.469078 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.473909 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.491056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-operator-scripts\") pod \"nova-cell1-db-create-gpd8j\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.495189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026fa0a1-b506-474b-9a47-b98541421464-operator-scripts\") pod \"nova-cell0-e1b5-account-create-update-qrhx7\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.495253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmsw\" (UniqueName: \"kubernetes.io/projected/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-kube-api-access-mdmsw\") pod \"nova-cell1-db-create-gpd8j\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.495398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm4qp\" (UniqueName: \"kubernetes.io/projected/026fa0a1-b506-474b-9a47-b98541421464-kube-api-access-fm4qp\") pod \"nova-cell0-e1b5-account-create-update-qrhx7\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.526397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.528956 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.533296 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.538497 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.540734 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.541765 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.543154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.543472 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.563576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.576245 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-operator-scripts\") pod \"nova-cell1-db-create-gpd8j\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026fa0a1-b506-474b-9a47-b98541421464-operator-scripts\") pod \"nova-cell0-e1b5-account-create-update-qrhx7\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmsw\" (UniqueName: \"kubernetes.io/projected/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-kube-api-access-mdmsw\") pod \"nova-cell1-db-create-gpd8j\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm4qp\" (UniqueName: \"kubernetes.io/projected/026fa0a1-b506-474b-9a47-b98541421464-kube-api-access-fm4qp\") pod \"nova-cell0-e1b5-account-create-update-qrhx7\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.610604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkpbt\" (UniqueName: \"kubernetes.io/projected/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-api-access-nkpbt\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.623867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-operator-scripts\") pod \"nova-cell1-db-create-gpd8j\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.627607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026fa0a1-b506-474b-9a47-b98541421464-operator-scripts\") pod \"nova-cell0-e1b5-account-create-update-qrhx7\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.634522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm4qp\" (UniqueName: \"kubernetes.io/projected/026fa0a1-b506-474b-9a47-b98541421464-kube-api-access-fm4qp\") pod \"nova-cell0-e1b5-account-create-update-qrhx7\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640199 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2"] Jan 21 15:17:38 crc kubenswrapper[4707]: E0121 15:17:38.640700 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-notification-agent" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640719 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-notification-agent" Jan 21 15:17:38 crc kubenswrapper[4707]: E0121 15:17:38.640736 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="sg-core" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640742 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="sg-core" Jan 21 15:17:38 crc kubenswrapper[4707]: E0121 15:17:38.640751 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-central-agent" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640757 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-central-agent" Jan 21 15:17:38 crc kubenswrapper[4707]: E0121 15:17:38.640771 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="proxy-httpd" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 
15:17:38.640776 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="proxy-httpd" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640944 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-central-agent" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640966 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="proxy-httpd" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640977 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="ceilometer-notification-agent" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.640986 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" containerName="sg-core" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.641717 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.643287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmsw\" (UniqueName: \"kubernetes.io/projected/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-kube-api-access-mdmsw\") pod \"nova-cell1-db-create-gpd8j\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.643873 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.646207 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.714052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-scripts\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.714103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-combined-ca-bundle\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.714772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-sg-core-conf-yaml\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.715003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phqpq\" (UniqueName: \"kubernetes.io/projected/fd22c263-f40d-47d1-85d2-e5227784aae2-kube-api-access-phqpq\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.715048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-config-data\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.715665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-run-httpd\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.715750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-log-httpd\") pod \"fd22c263-f40d-47d1-85d2-e5227784aae2\" (UID: \"fd22c263-f40d-47d1-85d2-e5227784aae2\") " Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm7n5\" (UniqueName: \"kubernetes.io/projected/36fdb657-664d-435d-9ab2-edd0c800506c-kube-api-access-dm7n5\") pod \"nova-cell1-fe38-account-create-update-wpnm2\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fdb657-664d-435d-9ab2-edd0c800506c-operator-scripts\") pod \"nova-cell1-fe38-account-create-update-wpnm2\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkpbt\" (UniqueName: \"kubernetes.io/projected/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-api-access-nkpbt\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716760 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.716771 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd22c263-f40d-47d1-85d2-e5227784aae2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.717155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-scripts" (OuterVolumeSpecName: "scripts") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.718125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd22c263-f40d-47d1-85d2-e5227784aae2-kube-api-access-phqpq" (OuterVolumeSpecName: "kube-api-access-phqpq") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "kube-api-access-phqpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.719905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.722228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.725174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.731548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkpbt\" (UniqueName: \"kubernetes.io/projected/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-api-access-nkpbt\") pod \"kube-state-metrics-0\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.733693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.765519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.777283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.782514 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.809714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-config-data" (OuterVolumeSpecName: "config-data") pod "fd22c263-f40d-47d1-85d2-e5227784aae2" (UID: "fd22c263-f40d-47d1-85d2-e5227784aae2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7n5\" (UniqueName: \"kubernetes.io/projected/36fdb657-664d-435d-9ab2-edd0c800506c-kube-api-access-dm7n5\") pod \"nova-cell1-fe38-account-create-update-wpnm2\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fdb657-664d-435d-9ab2-edd0c800506c-operator-scripts\") pod \"nova-cell1-fe38-account-create-update-wpnm2\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818574 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818586 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phqpq\" (UniqueName: \"kubernetes.io/projected/fd22c263-f40d-47d1-85d2-e5227784aae2-kube-api-access-phqpq\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818594 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818602 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.818609 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd22c263-f40d-47d1-85d2-e5227784aae2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.819778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fdb657-664d-435d-9ab2-edd0c800506c-operator-scripts\") pod \"nova-cell1-fe38-account-create-update-wpnm2\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.832110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7n5\" (UniqueName: \"kubernetes.io/projected/36fdb657-664d-435d-9ab2-edd0c800506c-kube-api-access-dm7n5\") pod \"nova-cell1-fe38-account-create-update-wpnm2\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.863026 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.922183 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wktmc"] Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.973914 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:38 crc kubenswrapper[4707]: I0121 15:17:38.988639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.019414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-rl5lm"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.199471 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f7e559-7400-4d89-878e-7df47a936c82" path="/var/lib/kubelet/pods/73f7e559-7400-4d89-878e-7df47a936c82/volumes" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.201002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gpd8j"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.268769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7"] Jan 21 15:17:39 crc kubenswrapper[4707]: W0121 15:17:39.331962 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod026fa0a1_b506_474b_9a47_b98541421464.slice/crio-f8b4c83fa6c0cd9df3942fba48e08e6510e7e80babfc4470e1e3434337916064 WatchSource:0}: Error finding container f8b4c83fa6c0cd9df3942fba48e08e6510e7e80babfc4470e1e3434337916064: Status 404 returned error can't find the container with id f8b4c83fa6c0cd9df3942fba48e08e6510e7e80babfc4470e1e3434337916064 Jan 21 15:17:39 crc kubenswrapper[4707]: W0121 15:17:39.357706 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30efcc6d_82eb_4c51_82e4_a14fdbdd8ab6.slice/crio-e52ea25d6c240f866fc10f050fe8e68c9cffae76930f1987e8813744d89d5810 WatchSource:0}: Error finding container e52ea25d6c240f866fc10f050fe8e68c9cffae76930f1987e8813744d89d5810: Status 404 returned error can't find the container with id e52ea25d6c240f866fc10f050fe8e68c9cffae76930f1987e8813744d89d5810 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.359134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.396536 4707 generic.go:334] "Generic (PLEG): container finished" podID="5da0d4c7-0c65-4e22-be82-afdc94255989" containerID="14cc43f0b41c7090b3769eaa7c3c1157933a68ecf8e7d9d7aa35a88d58bb5fa5" exitCode=0 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.396611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" event={"ID":"5da0d4c7-0c65-4e22-be82-afdc94255989","Type":"ContainerDied","Data":"14cc43f0b41c7090b3769eaa7c3c1157933a68ecf8e7d9d7aa35a88d58bb5fa5"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.396673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" 
event={"ID":"5da0d4c7-0c65-4e22-be82-afdc94255989","Type":"ContainerStarted","Data":"11743849a2fe8c7b992a69c6daa5e13da9236d38eacd3eae87e6717e19fba5f2"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.397910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" event={"ID":"fec7d56d-7ffe-452b-a5fc-d1febd4218a6","Type":"ContainerStarted","Data":"3bd94c36268aea9cfa512b21c4fd0dbdba4e5ab6445d5390d147f8748ffcb799"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.397964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" event={"ID":"fec7d56d-7ffe-452b-a5fc-d1febd4218a6","Type":"ContainerStarted","Data":"5f127516e296a737b076b83f89ec3c5cecc7639d642e6c54fea8acd05daf501a"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.400286 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9fe7962-fff4-44ce-a841-dab09e490d7c" containerID="c727367a8fb1551f421d3c7b7ffddfef9fb2b665be85437347357093e4b2ff22" exitCode=0 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.400341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" event={"ID":"c9fe7962-fff4-44ce-a841-dab09e490d7c","Type":"ContainerDied","Data":"c727367a8fb1551f421d3c7b7ffddfef9fb2b665be85437347357093e4b2ff22"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.400361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" event={"ID":"c9fe7962-fff4-44ce-a841-dab09e490d7c","Type":"ContainerStarted","Data":"9eefd8b3bd35c6fc342033ed457dca4d5c534fe25a9563792219e14d8b7b3d3c"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.401590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6","Type":"ContainerStarted","Data":"e52ea25d6c240f866fc10f050fe8e68c9cffae76930f1987e8813744d89d5810"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.402746 4707 generic.go:334] "Generic (PLEG): container finished" podID="cec05603-c838-4ce1-bb91-5a2d047b4263" containerID="7398d99b8c2d024f0d741e0979a2b5f13e02ba91e8bee355cd0a60ea43c7fc69" exitCode=0 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.402795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" event={"ID":"cec05603-c838-4ce1-bb91-5a2d047b4263","Type":"ContainerDied","Data":"7398d99b8c2d024f0d741e0979a2b5f13e02ba91e8bee355cd0a60ea43c7fc69"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.402833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" event={"ID":"cec05603-c838-4ce1-bb91-5a2d047b4263","Type":"ContainerStarted","Data":"84fcb730c85819e0ff7a139b866c4dc06d7962821fd1604ef57608e375c32e2c"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.403739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" event={"ID":"026fa0a1-b506-474b-9a47-b98541421464","Type":"ContainerStarted","Data":"f8b4c83fa6c0cd9df3942fba48e08e6510e7e80babfc4470e1e3434337916064"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.417512 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"fd22c263-f40d-47d1-85d2-e5227784aae2","Type":"ContainerDied","Data":"3d1a14cd68aceffbb375dc5d138b20f76116dd19828eaf25fd0f874e2666d550"} Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.417541 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.417549 4707 scope.go:117] "RemoveContainer" containerID="1a96fb1cd7fca9884505d9e415196af3e7d2a10cc56064cf332b930c9c566236" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.445035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.447349 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" podStartSLOduration=1.447334568 podStartE2EDuration="1.447334568s" podCreationTimestamp="2026-01-21 15:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:39.439668346 +0000 UTC m=+956.621184568" watchObservedRunningTime="2026-01-21 15:17:39.447334568 +0000 UTC m=+956.628850790" Jan 21 15:17:39 crc kubenswrapper[4707]: W0121 15:17:39.531093 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36fdb657_664d_435d_9ab2_edd0c800506c.slice/crio-2082c25be76f3f17fffa21d26b3fdc96dbcc74e3566223635332ae7a1006a738 WatchSource:0}: Error finding container 2082c25be76f3f17fffa21d26b3fdc96dbcc74e3566223635332ae7a1006a738: Status 404 returned error can't find the container with id 2082c25be76f3f17fffa21d26b3fdc96dbcc74e3566223635332ae7a1006a738 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.533033 4707 scope.go:117] "RemoveContainer" containerID="14dcfb2ae7b3e3a53af3dbe31d6ae4d53fbe97c121e3651041bb4c0516b327ca" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.547860 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.556467 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.565982 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.568439 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.576140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.576467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.579860 4707 scope.go:117] "RemoveContainer" containerID="5a92e9c55e7a6ebbcef5d232865bcf9d49d571117d63fc2036d14254e55dce2d" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.583605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.608186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.608406 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-log" containerID="cri-o://8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1" gracePeriod=30 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.608524 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-httpd" containerID="cri-o://ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab" gracePeriod=30 Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.648983 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shktg\" (UniqueName: \"kubernetes.io/projected/6cfe3994-17c3-4306-a5a5-50f62e79f215-kube-api-access-shktg\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-run-httpd\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-log-httpd\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-scripts\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-config-data\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.649557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: E0121 15:17:39.652669 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-shktg log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ceilometer-0" podUID="6cfe3994-17c3-4306-a5a5-50f62e79f215" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.660939 4707 scope.go:117] "RemoveContainer" containerID="55ed378367ca41e2768d23e704055f2902ab694029b0245c1e2193c2cf12de3a" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-run-httpd\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-log-httpd\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-scripts\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-config-data\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.751488 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shktg\" (UniqueName: \"kubernetes.io/projected/6cfe3994-17c3-4306-a5a5-50f62e79f215-kube-api-access-shktg\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.752223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-log-httpd\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.752512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-run-httpd\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.755115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.755469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.755536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-scripts\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.756306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-config-data\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:39 crc kubenswrapper[4707]: I0121 15:17:39.766744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shktg\" (UniqueName: \"kubernetes.io/projected/6cfe3994-17c3-4306-a5a5-50f62e79f215-kube-api-access-shktg\") pod \"ceilometer-0\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.184404 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.184877 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-log" 
containerID="cri-o://5de2ad12831c86f17cb7a8db414b2f4c49c21fa9d4d8b5ef0892cbbc9d50e7fc" gracePeriod=30 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.184874 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-httpd" containerID="cri-o://39f668b37d80f373a8b899a91d73bbd8673c02c349c732988db8667e4da64a7e" gracePeriod=30 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.424632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6","Type":"ContainerStarted","Data":"48c5225159a3ffc9586877ac0dcbb13aa01282ae7849cfd26facedad54d07afc"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.424861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.427015 4707 generic.go:334] "Generic (PLEG): container finished" podID="026fa0a1-b506-474b-9a47-b98541421464" containerID="be4c978b2be2e32de5ace8ea709ece67310a8ad3c2471018b5a376668dc85715" exitCode=0 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.427132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" event={"ID":"026fa0a1-b506-474b-9a47-b98541421464","Type":"ContainerDied","Data":"be4c978b2be2e32de5ace8ea709ece67310a8ad3c2471018b5a376668dc85715"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.431923 4707 generic.go:334] "Generic (PLEG): container finished" podID="36fdb657-664d-435d-9ab2-edd0c800506c" containerID="841e9305d4ad8534d58b141a550858d835ba6a91bc9f3d6a0944858c473b8ca5" exitCode=0 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.431972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" event={"ID":"36fdb657-664d-435d-9ab2-edd0c800506c","Type":"ContainerDied","Data":"841e9305d4ad8534d58b141a550858d835ba6a91bc9f3d6a0944858c473b8ca5"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.431987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" event={"ID":"36fdb657-664d-435d-9ab2-edd0c800506c","Type":"ContainerStarted","Data":"2082c25be76f3f17fffa21d26b3fdc96dbcc74e3566223635332ae7a1006a738"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.433314 4707 generic.go:334] "Generic (PLEG): container finished" podID="fec7d56d-7ffe-452b-a5fc-d1febd4218a6" containerID="3bd94c36268aea9cfa512b21c4fd0dbdba4e5ab6445d5390d147f8748ffcb799" exitCode=0 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.433375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" event={"ID":"fec7d56d-7ffe-452b-a5fc-d1febd4218a6","Type":"ContainerDied","Data":"3bd94c36268aea9cfa512b21c4fd0dbdba4e5ab6445d5390d147f8748ffcb799"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.445028 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.142409271 podStartE2EDuration="2.445005874s" podCreationTimestamp="2026-01-21 15:17:38 +0000 UTC" firstStartedPulling="2026-01-21 15:17:39.359788177 +0000 UTC m=+956.541304398" lastFinishedPulling="2026-01-21 15:17:39.66238478 +0000 UTC m=+956.843901001" 
observedRunningTime="2026-01-21 15:17:40.435802039 +0000 UTC m=+957.617318261" watchObservedRunningTime="2026-01-21 15:17:40.445005874 +0000 UTC m=+957.626522096" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.445664 4707 generic.go:334] "Generic (PLEG): container finished" podID="8475154b-1028-4331-a68d-83679b4d78fe" containerID="8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1" exitCode=143 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.445726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8475154b-1028-4331-a68d-83679b4d78fe","Type":"ContainerDied","Data":"8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.454637 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerID="5de2ad12831c86f17cb7a8db414b2f4c49c21fa9d4d8b5ef0892cbbc9d50e7fc" exitCode=143 Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.454748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a0149516-3649-42aa-a3c7-fe89b3a453b7","Type":"ContainerDied","Data":"5de2ad12831c86f17cb7a8db414b2f4c49c21fa9d4d8b5ef0892cbbc9d50e7fc"} Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.454847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.465061 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.570764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-run-httpd\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.570831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shktg\" (UniqueName: \"kubernetes.io/projected/6cfe3994-17c3-4306-a5a5-50f62e79f215-kube-api-access-shktg\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.570887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-log-httpd\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.570988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-sg-core-conf-yaml\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-config-data\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571058 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-combined-ca-bundle\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-scripts\") pod \"6cfe3994-17c3-4306-a5a5-50f62e79f215\" (UID: \"6cfe3994-17c3-4306-a5a5-50f62e79f215\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571525 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.571539 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfe3994-17c3-4306-a5a5-50f62e79f215-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.575375 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.575652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.576028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-config-data" (OuterVolumeSpecName: "config-data") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.577980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfe3994-17c3-4306-a5a5-50f62e79f215-kube-api-access-shktg" (OuterVolumeSpecName: "kube-api-access-shktg") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "kube-api-access-shktg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.579335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-scripts" (OuterVolumeSpecName: "scripts") pod "6cfe3994-17c3-4306-a5a5-50f62e79f215" (UID: "6cfe3994-17c3-4306-a5a5-50f62e79f215"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.673198 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.673403 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.673415 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.673423 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfe3994-17c3-4306-a5a5-50f62e79f215-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.673433 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shktg\" (UniqueName: \"kubernetes.io/projected/6cfe3994-17c3-4306-a5a5-50f62e79f215-kube-api-access-shktg\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.738451 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.787291 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.802655 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.877105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pktm\" (UniqueName: \"kubernetes.io/projected/5da0d4c7-0c65-4e22-be82-afdc94255989-kube-api-access-2pktm\") pod \"5da0d4c7-0c65-4e22-be82-afdc94255989\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.877250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9fe7962-fff4-44ce-a841-dab09e490d7c-operator-scripts\") pod \"c9fe7962-fff4-44ce-a841-dab09e490d7c\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.877897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9fe7962-fff4-44ce-a841-dab09e490d7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9fe7962-fff4-44ce-a841-dab09e490d7c" (UID: "c9fe7962-fff4-44ce-a841-dab09e490d7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.878172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da0d4c7-0c65-4e22-be82-afdc94255989-operator-scripts\") pod \"5da0d4c7-0c65-4e22-be82-afdc94255989\" (UID: \"5da0d4c7-0c65-4e22-be82-afdc94255989\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.878230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmnsb\" (UniqueName: \"kubernetes.io/projected/c9fe7962-fff4-44ce-a841-dab09e490d7c-kube-api-access-lmnsb\") pod \"c9fe7962-fff4-44ce-a841-dab09e490d7c\" (UID: \"c9fe7962-fff4-44ce-a841-dab09e490d7c\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.878562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da0d4c7-0c65-4e22-be82-afdc94255989-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5da0d4c7-0c65-4e22-be82-afdc94255989" (UID: "5da0d4c7-0c65-4e22-be82-afdc94255989"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.879118 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9fe7962-fff4-44ce-a841-dab09e490d7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.879139 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da0d4c7-0c65-4e22-be82-afdc94255989-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.881515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da0d4c7-0c65-4e22-be82-afdc94255989-kube-api-access-2pktm" (OuterVolumeSpecName: "kube-api-access-2pktm") pod "5da0d4c7-0c65-4e22-be82-afdc94255989" (UID: "5da0d4c7-0c65-4e22-be82-afdc94255989"). InnerVolumeSpecName "kube-api-access-2pktm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.881587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fe7962-fff4-44ce-a841-dab09e490d7c-kube-api-access-lmnsb" (OuterVolumeSpecName: "kube-api-access-lmnsb") pod "c9fe7962-fff4-44ce-a841-dab09e490d7c" (UID: "c9fe7962-fff4-44ce-a841-dab09e490d7c"). InnerVolumeSpecName "kube-api-access-lmnsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.979939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v25k2\" (UniqueName: \"kubernetes.io/projected/cec05603-c838-4ce1-bb91-5a2d047b4263-kube-api-access-v25k2\") pod \"cec05603-c838-4ce1-bb91-5a2d047b4263\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.979986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec05603-c838-4ce1-bb91-5a2d047b4263-operator-scripts\") pod \"cec05603-c838-4ce1-bb91-5a2d047b4263\" (UID: \"cec05603-c838-4ce1-bb91-5a2d047b4263\") " Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.980467 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmnsb\" (UniqueName: \"kubernetes.io/projected/c9fe7962-fff4-44ce-a841-dab09e490d7c-kube-api-access-lmnsb\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.980483 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pktm\" (UniqueName: \"kubernetes.io/projected/5da0d4c7-0c65-4e22-be82-afdc94255989-kube-api-access-2pktm\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.980552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cec05603-c838-4ce1-bb91-5a2d047b4263-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cec05603-c838-4ce1-bb91-5a2d047b4263" (UID: "cec05603-c838-4ce1-bb91-5a2d047b4263"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:17:40 crc kubenswrapper[4707]: I0121 15:17:40.982825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec05603-c838-4ce1-bb91-5a2d047b4263-kube-api-access-v25k2" (OuterVolumeSpecName: "kube-api-access-v25k2") pod "cec05603-c838-4ce1-bb91-5a2d047b4263" (UID: "cec05603-c838-4ce1-bb91-5a2d047b4263"). InnerVolumeSpecName "kube-api-access-v25k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.081919 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v25k2\" (UniqueName: \"kubernetes.io/projected/cec05603-c838-4ce1-bb91-5a2d047b4263-kube-api-access-v25k2\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.081945 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec05603-c838-4ce1-bb91-5a2d047b4263-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.189708 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd22c263-f40d-47d1-85d2-e5227784aae2" path="/var/lib/kubelet/pods/fd22c263-f40d-47d1-85d2-e5227784aae2/volumes" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.462516 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.462524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-rl5lm" event={"ID":"5da0d4c7-0c65-4e22-be82-afdc94255989","Type":"ContainerDied","Data":"11743849a2fe8c7b992a69c6daa5e13da9236d38eacd3eae87e6717e19fba5f2"} Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.462554 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11743849a2fe8c7b992a69c6daa5e13da9236d38eacd3eae87e6717e19fba5f2" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.464829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" event={"ID":"c9fe7962-fff4-44ce-a841-dab09e490d7c","Type":"ContainerDied","Data":"9eefd8b3bd35c6fc342033ed457dca4d5c534fe25a9563792219e14d8b7b3d3c"} Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.464863 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eefd8b3bd35c6fc342033ed457dca4d5c534fe25a9563792219e14d8b7b3d3c" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.464916 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wktmc" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.468117 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.468195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls" event={"ID":"cec05603-c838-4ce1-bb91-5a2d047b4263","Type":"ContainerDied","Data":"84fcb730c85819e0ff7a139b866c4dc06d7962821fd1604ef57608e375c32e2c"} Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.468232 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84fcb730c85819e0ff7a139b866c4dc06d7962821fd1604ef57608e375c32e2c" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.468206 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.509277 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.517334 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.528860 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:41 crc kubenswrapper[4707]: E0121 15:17:41.529172 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fe7962-fff4-44ce-a841-dab09e490d7c" containerName="mariadb-database-create" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.529183 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fe7962-fff4-44ce-a841-dab09e490d7c" containerName="mariadb-database-create" Jan 21 15:17:41 crc kubenswrapper[4707]: E0121 15:17:41.529198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec05603-c838-4ce1-bb91-5a2d047b4263" containerName="mariadb-account-create-update" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.529204 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec05603-c838-4ce1-bb91-5a2d047b4263" containerName="mariadb-account-create-update" Jan 21 15:17:41 crc kubenswrapper[4707]: E0121 15:17:41.529212 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da0d4c7-0c65-4e22-be82-afdc94255989" containerName="mariadb-database-create" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.529218 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da0d4c7-0c65-4e22-be82-afdc94255989" containerName="mariadb-database-create" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.529358 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da0d4c7-0c65-4e22-be82-afdc94255989" containerName="mariadb-database-create" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.529376 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec05603-c838-4ce1-bb91-5a2d047b4263" containerName="mariadb-account-create-update" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.529388 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fe7962-fff4-44ce-a841-dab09e490d7c" containerName="mariadb-database-create" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.530609 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.532744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.532908 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.533095 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.552634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-scripts\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-config-data\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.690795 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljx25\" (UniqueName: \"kubernetes.io/projected/3ce84b34-d191-48a2-a613-e5794f571c4b-kube-api-access-ljx25\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.731926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.739226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.755930 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-scripts\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-config-data\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljx25\" (UniqueName: \"kubernetes.io/projected/3ce84b34-d191-48a2-a613-e5794f571c4b-kube-api-access-ljx25\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794286 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.794839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.795067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.799035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.799767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.799869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.801352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-scripts\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.820211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-config-data\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.851832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljx25\" (UniqueName: \"kubernetes.io/projected/3ce84b34-d191-48a2-a613-e5794f571c4b-kube-api-access-ljx25\") pod \"ceilometer-0\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.895048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-operator-scripts\") pod \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " Jan 21 15:17:41 crc kubenswrapper[4707]: 
I0121 15:17:41.895118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmsw\" (UniqueName: \"kubernetes.io/projected/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-kube-api-access-mdmsw\") pod \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\" (UID: \"fec7d56d-7ffe-452b-a5fc-d1febd4218a6\") " Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.895376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fec7d56d-7ffe-452b-a5fc-d1febd4218a6" (UID: "fec7d56d-7ffe-452b-a5fc-d1febd4218a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.895783 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.898230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-kube-api-access-mdmsw" (OuterVolumeSpecName: "kube-api-access-mdmsw") pod "fec7d56d-7ffe-452b-a5fc-d1febd4218a6" (UID: "fec7d56d-7ffe-452b-a5fc-d1febd4218a6"). InnerVolumeSpecName "kube-api-access-mdmsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.950304 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.954344 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:41 crc kubenswrapper[4707]: I0121 15:17:41.997708 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdmsw\" (UniqueName: \"kubernetes.io/projected/fec7d56d-7ffe-452b-a5fc-d1febd4218a6-kube-api-access-mdmsw\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.098850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026fa0a1-b506-474b-9a47-b98541421464-operator-scripts\") pod \"026fa0a1-b506-474b-9a47-b98541421464\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.098906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fdb657-664d-435d-9ab2-edd0c800506c-operator-scripts\") pod \"36fdb657-664d-435d-9ab2-edd0c800506c\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.098969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm7n5\" (UniqueName: \"kubernetes.io/projected/36fdb657-664d-435d-9ab2-edd0c800506c-kube-api-access-dm7n5\") pod \"36fdb657-664d-435d-9ab2-edd0c800506c\" (UID: \"36fdb657-664d-435d-9ab2-edd0c800506c\") " Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.099135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm4qp\" (UniqueName: \"kubernetes.io/projected/026fa0a1-b506-474b-9a47-b98541421464-kube-api-access-fm4qp\") pod \"026fa0a1-b506-474b-9a47-b98541421464\" (UID: \"026fa0a1-b506-474b-9a47-b98541421464\") " Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.099252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026fa0a1-b506-474b-9a47-b98541421464-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "026fa0a1-b506-474b-9a47-b98541421464" (UID: "026fa0a1-b506-474b-9a47-b98541421464"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.099447 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/026fa0a1-b506-474b-9a47-b98541421464-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.099545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fdb657-664d-435d-9ab2-edd0c800506c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36fdb657-664d-435d-9ab2-edd0c800506c" (UID: "36fdb657-664d-435d-9ab2-edd0c800506c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.102200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026fa0a1-b506-474b-9a47-b98541421464-kube-api-access-fm4qp" (OuterVolumeSpecName: "kube-api-access-fm4qp") pod "026fa0a1-b506-474b-9a47-b98541421464" (UID: "026fa0a1-b506-474b-9a47-b98541421464"). InnerVolumeSpecName "kube-api-access-fm4qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.102560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fdb657-664d-435d-9ab2-edd0c800506c-kube-api-access-dm7n5" (OuterVolumeSpecName: "kube-api-access-dm7n5") pod "36fdb657-664d-435d-9ab2-edd0c800506c" (UID: "36fdb657-664d-435d-9ab2-edd0c800506c"). InnerVolumeSpecName "kube-api-access-dm7n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.143797 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.200822 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm7n5\" (UniqueName: \"kubernetes.io/projected/36fdb657-664d-435d-9ab2-edd0c800506c-kube-api-access-dm7n5\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.200850 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm4qp\" (UniqueName: \"kubernetes.io/projected/026fa0a1-b506-474b-9a47-b98541421464-kube-api-access-fm4qp\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.200860 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fdb657-664d-435d-9ab2-edd0c800506c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.475147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" event={"ID":"36fdb657-664d-435d-9ab2-edd0c800506c","Type":"ContainerDied","Data":"2082c25be76f3f17fffa21d26b3fdc96dbcc74e3566223635332ae7a1006a738"} Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.475184 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2082c25be76f3f17fffa21d26b3fdc96dbcc74e3566223635332ae7a1006a738" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.475149 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.476235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" event={"ID":"fec7d56d-7ffe-452b-a5fc-d1febd4218a6","Type":"ContainerDied","Data":"5f127516e296a737b076b83f89ec3c5cecc7639d642e6c54fea8acd05daf501a"} Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.476257 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gpd8j" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.476260 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f127516e296a737b076b83f89ec3c5cecc7639d642e6c54fea8acd05daf501a" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.477326 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.477311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7" event={"ID":"026fa0a1-b506-474b-9a47-b98541421464","Type":"ContainerDied","Data":"f8b4c83fa6c0cd9df3942fba48e08e6510e7e80babfc4470e1e3434337916064"} Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.477434 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8b4c83fa6c0cd9df3942fba48e08e6510e7e80babfc4470e1e3434337916064" Jan 21 15:17:42 crc kubenswrapper[4707]: W0121 15:17:42.541034 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ce84b34_d191_48a2_a613_e5794f571c4b.slice/crio-7f253abf3928eb305ce34cd1b077d44fc0fec8063f0cf40342d3a1278a4c30f5 WatchSource:0}: Error finding container 7f253abf3928eb305ce34cd1b077d44fc0fec8063f0cf40342d3a1278a4c30f5: Status 404 returned error can't find the container with id 7f253abf3928eb305ce34cd1b077d44fc0fec8063f0cf40342d3a1278a4c30f5 Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.543275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:42 crc kubenswrapper[4707]: I0121 15:17:42.993499 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.159229 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.197318 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfe3994-17c3-4306-a5a5-50f62e79f215" path="/var/lib/kubelet/pods/6cfe3994-17c3-4306-a5a5-50f62e79f215/volumes" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.320682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-config-data\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-internal-tls-certs\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-scripts\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r62dd\" (UniqueName: \"kubernetes.io/projected/8475154b-1028-4331-a68d-83679b4d78fe-kube-api-access-r62dd\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-httpd-run\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-combined-ca-bundle\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-logs\") pod \"8475154b-1028-4331-a68d-83679b4d78fe\" (UID: \"8475154b-1028-4331-a68d-83679b4d78fe\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.321735 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.322589 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-logs" (OuterVolumeSpecName: "logs") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.325101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8475154b-1028-4331-a68d-83679b4d78fe-kube-api-access-r62dd" (OuterVolumeSpecName: "kube-api-access-r62dd") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "kube-api-access-r62dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.325500 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-scripts" (OuterVolumeSpecName: "scripts") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.328023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.345344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.363103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.370903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-config-data" (OuterVolumeSpecName: "config-data") pod "8475154b-1028-4331-a68d-83679b4d78fe" (UID: "8475154b-1028-4331-a68d-83679b4d78fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.423086 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.424072 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.424372 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r62dd\" (UniqueName: \"kubernetes.io/projected/8475154b-1028-4331-a68d-83679b4d78fe-kube-api-access-r62dd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.424418 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.424432 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.424441 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8475154b-1028-4331-a68d-83679b4d78fe-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.424449 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475154b-1028-4331-a68d-83679b4d78fe-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.452500 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.484652 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerStarted","Data":"b8f8d78c1a69eb77beb67ef7906ecf6a1d6bb3824ff6ae8e2e9f9d9202110932"} Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.484695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerStarted","Data":"7f253abf3928eb305ce34cd1b077d44fc0fec8063f0cf40342d3a1278a4c30f5"} Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.485864 4707 generic.go:334] "Generic (PLEG): container finished" podID="8475154b-1028-4331-a68d-83679b4d78fe" containerID="ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab" exitCode=0 Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.485905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8475154b-1028-4331-a68d-83679b4d78fe","Type":"ContainerDied","Data":"ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab"} Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.485929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8475154b-1028-4331-a68d-83679b4d78fe","Type":"ContainerDied","Data":"46bc4a7be99d864e6a1feae11c0b8b4225554e35619510f67a98c5dc79d21e50"} Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.485946 4707 scope.go:117] "RemoveContainer" containerID="ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.486072 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.490891 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerID="39f668b37d80f373a8b899a91d73bbd8673c02c349c732988db8667e4da64a7e" exitCode=0 Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.490935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a0149516-3649-42aa-a3c7-fe89b3a453b7","Type":"ContainerDied","Data":"39f668b37d80f373a8b899a91d73bbd8673c02c349c732988db8667e4da64a7e"} Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.510858 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.525752 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.530844 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.533758 4707 scope.go:117] "RemoveContainer" containerID="8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.558923 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568149 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568399 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec7d56d-7ffe-452b-a5fc-d1febd4218a6" containerName="mariadb-database-create" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec7d56d-7ffe-452b-a5fc-d1febd4218a6" containerName="mariadb-database-create" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568434 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026fa0a1-b506-474b-9a47-b98541421464" containerName="mariadb-account-create-update" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="026fa0a1-b506-474b-9a47-b98541421464" containerName="mariadb-account-create-update" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568457 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-httpd" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568462 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-httpd" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568469 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-httpd" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568474 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-httpd" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568490 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-log" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568495 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-log" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-log" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-log" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.568529 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="36fdb657-664d-435d-9ab2-edd0c800506c" containerName="mariadb-account-create-update" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568535 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fdb657-664d-435d-9ab2-edd0c800506c" containerName="mariadb-account-create-update" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568679 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-httpd" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec7d56d-7ffe-452b-a5fc-d1febd4218a6" containerName="mariadb-database-create" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568700 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-httpd" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568707 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fdb657-664d-435d-9ab2-edd0c800506c" containerName="mariadb-account-create-update" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568714 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" containerName="glance-log" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568720 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8475154b-1028-4331-a68d-83679b4d78fe" containerName="glance-log" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.568731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="026fa0a1-b506-474b-9a47-b98541421464" containerName="mariadb-account-create-update" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.569486 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.576797 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.577352 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.602116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.619109 4707 scope.go:117] "RemoveContainer" containerID="ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.621181 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab\": container with ID starting with ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab not found: ID does not exist" containerID="ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.621213 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab"} err="failed to get container status \"ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab\": rpc error: code = NotFound desc = could not find container \"ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab\": container with ID starting with ea0f6614faf6251f2ad7ded33a15bdd2208a89cca692dba1248e81e087c834ab not found: ID does not exist" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.621233 4707 scope.go:117] "RemoveContainer" containerID="8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1" Jan 21 15:17:43 crc kubenswrapper[4707]: E0121 15:17:43.623019 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1\": container with ID starting with 8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1 not found: ID does not exist" containerID="8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.623613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1"} err="failed to get container status \"8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1\": rpc error: code = NotFound desc = could not find container \"8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1\": container with ID starting with 8f7426df355746c9e37aa4d7f5e558ee2d386f32fe240ff93b59727c6302bde1 not found: ID does not exist" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.626567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-scripts\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.626613 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-public-tls-certs\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.626726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jzb8\" (UniqueName: \"kubernetes.io/projected/a0149516-3649-42aa-a3c7-fe89b3a453b7-kube-api-access-9jzb8\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.626775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-logs\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.626897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.627015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-config-data\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.627298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-combined-ca-bundle\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.627354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-httpd-run\") pod \"a0149516-3649-42aa-a3c7-fe89b3a453b7\" (UID: \"a0149516-3649-42aa-a3c7-fe89b3a453b7\") " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.627628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-logs" (OuterVolumeSpecName: "logs") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.627916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.630433 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.630868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-scripts" (OuterVolumeSpecName: "scripts") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.632342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.632736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0149516-3649-42aa-a3c7-fe89b3a453b7-kube-api-access-9jzb8" (OuterVolumeSpecName: "kube-api-access-9jzb8") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "kube-api-access-9jzb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.672414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.693443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.716109 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5"] Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.717060 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.722584 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5"] Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.723414 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.723453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.723741 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-vbrdc" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.724385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-config-data" (OuterVolumeSpecName: "config-data") pod "a0149516-3649-42aa-a3c7-fe89b3a453b7" (UID: "a0149516-3649-42aa-a3c7-fe89b3a453b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-logs\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfhbx\" (UniqueName: \"kubernetes.io/projected/933eeedf-0a55-44e6-84f5-062e3c2716d9-kube-api-access-vfhbx\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.731872 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.732057 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.732080 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jzb8\" (UniqueName: \"kubernetes.io/projected/a0149516-3649-42aa-a3c7-fe89b3a453b7-kube-api-access-9jzb8\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.732101 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.732111 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.732119 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0149516-3649-42aa-a3c7-fe89b3a453b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.732127 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0149516-3649-42aa-a3c7-fe89b3a453b7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.747395 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 
15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-scripts\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmc56\" (UniqueName: \"kubernetes.io/projected/7dc26c7f-689c-4ded-8243-67e3cb4c7603-kube-api-access-bmc56\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-logs\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-config-data\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfhbx\" (UniqueName: \"kubernetes.io/projected/933eeedf-0a55-44e6-84f5-062e3c2716d9-kube-api-access-vfhbx\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833656 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833782 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-logs\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.833990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.836545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.837025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.837478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: 
I0121 15:17:43.849512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.850377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfhbx\" (UniqueName: \"kubernetes.io/projected/933eeedf-0a55-44e6-84f5-062e3c2716d9-kube-api-access-vfhbx\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.856650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.905733 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.934849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-config-data\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.934948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-scripts\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.934971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.934998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmc56\" (UniqueName: \"kubernetes.io/projected/7dc26c7f-689c-4ded-8243-67e3cb4c7603-kube-api-access-bmc56\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.938384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.938717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-config-data\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.940096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-scripts\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:43 crc kubenswrapper[4707]: I0121 15:17:43.950703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmc56\" (UniqueName: \"kubernetes.io/projected/7dc26c7f-689c-4ded-8243-67e3cb4c7603-kube-api-access-bmc56\") pod \"nova-cell0-conductor-db-sync-sqjw5\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.034150 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.363063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.463157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5"] Jan 21 15:17:44 crc kubenswrapper[4707]: W0121 15:17:44.466818 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc26c7f_689c_4ded_8243_67e3cb4c7603.slice/crio-9749179180ce2b2933250247fe84458d8a0175e773cb1d1c0c1bbcffd963d8c0 WatchSource:0}: Error finding container 9749179180ce2b2933250247fe84458d8a0175e773cb1d1c0c1bbcffd963d8c0: Status 404 returned error can't find the container with id 9749179180ce2b2933250247fe84458d8a0175e773cb1d1c0c1bbcffd963d8c0 Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.499694 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.499690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a0149516-3649-42aa-a3c7-fe89b3a453b7","Type":"ContainerDied","Data":"6ff6c5735aa95722c1831d59137680ff8803be3ac8dafac73a5349b3a742abd4"} Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.499757 4707 scope.go:117] "RemoveContainer" containerID="39f668b37d80f373a8b899a91d73bbd8673c02c349c732988db8667e4da64a7e" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.501719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerStarted","Data":"ca98d718c47ed378038fe7df95d2f699834093a794f92ae9aef0b348916a2f30"} Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.504502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" event={"ID":"7dc26c7f-689c-4ded-8243-67e3cb4c7603","Type":"ContainerStarted","Data":"9749179180ce2b2933250247fe84458d8a0175e773cb1d1c0c1bbcffd963d8c0"} Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.506520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"933eeedf-0a55-44e6-84f5-062e3c2716d9","Type":"ContainerStarted","Data":"85b1010a0d52b0e3dd1f6e1bd11d438e313245e860e0396d821932b3d81e1a85"} Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.528705 4707 scope.go:117] "RemoveContainer" containerID="5de2ad12831c86f17cb7a8db414b2f4c49c21fa9d4d8b5ef0892cbbc9d50e7fc" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.530958 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.537121 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.552531 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.557953 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.568388 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.568964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.570318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.648399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.648593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.649285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.649387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lrx7\" (UniqueName: \"kubernetes.io/projected/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-kube-api-access-7lrx7\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.649446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-logs\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.649473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.649764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 
15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.649939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lrx7\" (UniqueName: \"kubernetes.io/projected/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-kube-api-access-7lrx7\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-logs\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.751397 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.752938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.753500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-logs\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.755312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.755554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.757181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.758568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.764564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lrx7\" (UniqueName: \"kubernetes.io/projected/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-kube-api-access-7lrx7\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.775252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:44 crc kubenswrapper[4707]: I0121 15:17:44.883978 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.191604 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8475154b-1028-4331-a68d-83679b4d78fe" path="/var/lib/kubelet/pods/8475154b-1028-4331-a68d-83679b4d78fe/volumes" Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.192633 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0149516-3649-42aa-a3c7-fe89b3a453b7" path="/var/lib/kubelet/pods/a0149516-3649-42aa-a3c7-fe89b3a453b7/volumes" Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.280323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:17:45 crc kubenswrapper[4707]: W0121 15:17:45.286883 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02f3955e_a0f2_4a8e_8a04_c6b7e069d0ec.slice/crio-9ba3c7e54b4e1e596830637369113346a235506fb1ebb596f106d9ee54396fd1 WatchSource:0}: Error finding container 9ba3c7e54b4e1e596830637369113346a235506fb1ebb596f106d9ee54396fd1: Status 404 returned error can't find the container with id 9ba3c7e54b4e1e596830637369113346a235506fb1ebb596f106d9ee54396fd1 Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.515474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"933eeedf-0a55-44e6-84f5-062e3c2716d9","Type":"ContainerStarted","Data":"cd62a3dc2c61a7c191ccbda0e7ac771e13ca81a7f503f321bfd2768094965edd"} Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.515693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"933eeedf-0a55-44e6-84f5-062e3c2716d9","Type":"ContainerStarted","Data":"74d9d54b14de2c921f565409d9067db30aa7eea74a9bfef927fc5a41ce19f3e5"} Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.517254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec","Type":"ContainerStarted","Data":"9ba3c7e54b4e1e596830637369113346a235506fb1ebb596f106d9ee54396fd1"} Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.518956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerStarted","Data":"c0892d23e8c04ef92e90d0d760078d96e6415bc7a7329eaafda050c2690b9fd1"} Jan 21 15:17:45 crc kubenswrapper[4707]: I0121 15:17:45.534795 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.534779565 podStartE2EDuration="2.534779565s" podCreationTimestamp="2026-01-21 15:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:45.530038677 +0000 UTC m=+962.711554899" watchObservedRunningTime="2026-01-21 15:17:45.534779565 +0000 UTC m=+962.716295788" Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.528484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerStarted","Data":"a7150f1051360f92a9f0e5b088d60627d5673203037cb805b18b7152c14ec772"} Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.528790 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-central-agent" containerID="cri-o://b8f8d78c1a69eb77beb67ef7906ecf6a1d6bb3824ff6ae8e2e9f9d9202110932" gracePeriod=30 Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.528885 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="sg-core" containerID="cri-o://c0892d23e8c04ef92e90d0d760078d96e6415bc7a7329eaafda050c2690b9fd1" gracePeriod=30 Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.528929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-notification-agent" containerID="cri-o://ca98d718c47ed378038fe7df95d2f699834093a794f92ae9aef0b348916a2f30" gracePeriod=30 Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.528877 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="proxy-httpd" containerID="cri-o://a7150f1051360f92a9f0e5b088d60627d5673203037cb805b18b7152c14ec772" gracePeriod=30 Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.533867 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.542769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec","Type":"ContainerStarted","Data":"abbe09e446aa5427d732558aa8e1909e7b43ad26726df39c2a2cc877fe7297c3"} Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.542799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec","Type":"ContainerStarted","Data":"eccce3cbf45c52b0fa9e2d35adb70605731d496af2d57867762c823eecd5c9d6"} Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.549011 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.161234149 podStartE2EDuration="5.548996315s" podCreationTimestamp="2026-01-21 15:17:41 +0000 UTC" firstStartedPulling="2026-01-21 15:17:42.54299915 +0000 UTC m=+959.724515372" lastFinishedPulling="2026-01-21 15:17:45.930761315 +0000 UTC m=+963.112277538" observedRunningTime="2026-01-21 15:17:46.548321366 +0000 UTC m=+963.729837588" watchObservedRunningTime="2026-01-21 15:17:46.548996315 +0000 UTC m=+963.730512538" Jan 21 15:17:46 crc kubenswrapper[4707]: I0121 15:17:46.570637 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.5706209859999998 podStartE2EDuration="2.570620986s" podCreationTimestamp="2026-01-21 15:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:46.56849712 +0000 UTC m=+963.750013333" 
watchObservedRunningTime="2026-01-21 15:17:46.570620986 +0000 UTC m=+963.752137208" Jan 21 15:17:47 crc kubenswrapper[4707]: I0121 15:17:47.551617 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerID="a7150f1051360f92a9f0e5b088d60627d5673203037cb805b18b7152c14ec772" exitCode=0 Jan 21 15:17:47 crc kubenswrapper[4707]: I0121 15:17:47.551823 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerID="c0892d23e8c04ef92e90d0d760078d96e6415bc7a7329eaafda050c2690b9fd1" exitCode=2 Jan 21 15:17:47 crc kubenswrapper[4707]: I0121 15:17:47.551834 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerID="ca98d718c47ed378038fe7df95d2f699834093a794f92ae9aef0b348916a2f30" exitCode=0 Jan 21 15:17:47 crc kubenswrapper[4707]: I0121 15:17:47.551692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerDied","Data":"a7150f1051360f92a9f0e5b088d60627d5673203037cb805b18b7152c14ec772"} Jan 21 15:17:47 crc kubenswrapper[4707]: I0121 15:17:47.552086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerDied","Data":"c0892d23e8c04ef92e90d0d760078d96e6415bc7a7329eaafda050c2690b9fd1"} Jan 21 15:17:47 crc kubenswrapper[4707]: I0121 15:17:47.552099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerDied","Data":"ca98d718c47ed378038fe7df95d2f699834093a794f92ae9aef0b348916a2f30"} Jan 21 15:17:48 crc kubenswrapper[4707]: I0121 15:17:48.877608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:17:50 crc kubenswrapper[4707]: I0121 15:17:50.573988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" event={"ID":"7dc26c7f-689c-4ded-8243-67e3cb4c7603","Type":"ContainerStarted","Data":"12e34aafd762407b78d68f9a01a747e62412bb16c46c84d0bc5833e5a2e16f5d"} Jan 21 15:17:50 crc kubenswrapper[4707]: I0121 15:17:50.590608 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" podStartSLOduration=1.6666066229999998 podStartE2EDuration="7.590595577s" podCreationTimestamp="2026-01-21 15:17:43 +0000 UTC" firstStartedPulling="2026-01-21 15:17:44.468749664 +0000 UTC m=+961.650265887" lastFinishedPulling="2026-01-21 15:17:50.392738619 +0000 UTC m=+967.574254841" observedRunningTime="2026-01-21 15:17:50.588679031 +0000 UTC m=+967.770195253" watchObservedRunningTime="2026-01-21 15:17:50.590595577 +0000 UTC m=+967.772111799" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.583909 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerID="b8f8d78c1a69eb77beb67ef7906ecf6a1d6bb3824ff6ae8e2e9f9d9202110932" exitCode=0 Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.583980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerDied","Data":"b8f8d78c1a69eb77beb67ef7906ecf6a1d6bb3824ff6ae8e2e9f9d9202110932"} Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 
15:17:51.584014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3ce84b34-d191-48a2-a613-e5794f571c4b","Type":"ContainerDied","Data":"7f253abf3928eb305ce34cd1b077d44fc0fec8063f0cf40342d3a1278a4c30f5"} Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.584025 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f253abf3928eb305ce34cd1b077d44fc0fec8063f0cf40342d3a1278a4c30f5" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.603700 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljx25\" (UniqueName: \"kubernetes.io/projected/3ce84b34-d191-48a2-a613-e5794f571c4b-kube-api-access-ljx25\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-log-httpd\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-combined-ca-bundle\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-sg-core-conf-yaml\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-run-httpd\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-config-data\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-ceilometer-tls-certs\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.700964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-scripts\") pod \"3ce84b34-d191-48a2-a613-e5794f571c4b\" (UID: \"3ce84b34-d191-48a2-a613-e5794f571c4b\") " Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.701239 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.701255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.701678 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.701772 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ce84b34-d191-48a2-a613-e5794f571c4b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.708919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-scripts" (OuterVolumeSpecName: "scripts") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.708923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce84b34-d191-48a2-a613-e5794f571c4b-kube-api-access-ljx25" (OuterVolumeSpecName: "kube-api-access-ljx25") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "kube-api-access-ljx25". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.721068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.749481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.751038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.766089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-config-data" (OuterVolumeSpecName: "config-data") pod "3ce84b34-d191-48a2-a613-e5794f571c4b" (UID: "3ce84b34-d191-48a2-a613-e5794f571c4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.803110 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljx25\" (UniqueName: \"kubernetes.io/projected/3ce84b34-d191-48a2-a613-e5794f571c4b-kube-api-access-ljx25\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.803136 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.803145 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.803153 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.803161 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:51 crc kubenswrapper[4707]: I0121 15:17:51.803168 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce84b34-d191-48a2-a613-e5794f571c4b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.590840 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.614300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.620214 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632279 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:52 crc kubenswrapper[4707]: E0121 15:17:52.632671 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="proxy-httpd" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632690 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="proxy-httpd" Jan 21 15:17:52 crc kubenswrapper[4707]: E0121 15:17:52.632712 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-central-agent" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632720 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-central-agent" Jan 21 15:17:52 crc kubenswrapper[4707]: E0121 15:17:52.632747 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="sg-core" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632753 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="sg-core" Jan 21 15:17:52 crc kubenswrapper[4707]: E0121 15:17:52.632764 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-notification-agent" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632770 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-notification-agent" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-notification-agent" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632961 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="sg-core" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="ceilometer-central-agent" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.632989 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" containerName="proxy-httpd" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.634359 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.636579 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.636618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.636782 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.638316 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.817794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-run-httpd\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.817875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.817910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.817970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcnl4\" (UniqueName: \"kubernetes.io/projected/c129b2b0-9811-4d58-9b65-de842874c901-kube-api-access-wcnl4\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.818035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-scripts\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.818078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.818094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-log-httpd\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.818155 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-config-data\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcnl4\" (UniqueName: \"kubernetes.io/projected/c129b2b0-9811-4d58-9b65-de842874c901-kube-api-access-wcnl4\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-scripts\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-log-httpd\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-config-data\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-run-httpd\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.919959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.920992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-log-httpd\") pod \"ceilometer-0\" (UID: 
\"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.921401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-run-httpd\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.923942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.923961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-config-data\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.924088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.924688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-scripts\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.928241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.939467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcnl4\" (UniqueName: \"kubernetes.io/projected/c129b2b0-9811-4d58-9b65-de842874c901-kube-api-access-wcnl4\") pod \"ceilometer-0\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:52 crc kubenswrapper[4707]: I0121 15:17:52.952589 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.189656 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce84b34-d191-48a2-a613-e5794f571c4b" path="/var/lib/kubelet/pods/3ce84b34-d191-48a2-a613-e5794f571c4b/volumes" Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.337013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:17:53 crc kubenswrapper[4707]: W0121 15:17:53.341679 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc129b2b0_9811_4d58_9b65_de842874c901.slice/crio-09245aae2040da0299fd7e54e0fc52418948cf1bf1f9df100fe5bdacf17693a4 WatchSource:0}: Error finding container 09245aae2040da0299fd7e54e0fc52418948cf1bf1f9df100fe5bdacf17693a4: Status 404 returned error can't find the container with id 09245aae2040da0299fd7e54e0fc52418948cf1bf1f9df100fe5bdacf17693a4 Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.597537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerStarted","Data":"09245aae2040da0299fd7e54e0fc52418948cf1bf1f9df100fe5bdacf17693a4"} Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.905847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.905905 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.926695 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:53 crc kubenswrapper[4707]: I0121 15:17:53.936805 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.606622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerStarted","Data":"e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13"} Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.606665 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.606763 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.884317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.884353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.908566 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:54 crc kubenswrapper[4707]: I0121 15:17:54.914457 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:55 crc kubenswrapper[4707]: I0121 15:17:55.614201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerStarted","Data":"21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97"} Jan 21 15:17:55 crc kubenswrapper[4707]: I0121 15:17:55.614252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:55 crc kubenswrapper[4707]: I0121 15:17:55.614269 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:56 crc kubenswrapper[4707]: I0121 15:17:56.233349 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:56 crc kubenswrapper[4707]: I0121 15:17:56.234408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:17:56 crc kubenswrapper[4707]: I0121 15:17:56.619714 4707 generic.go:334] "Generic (PLEG): container finished" podID="7dc26c7f-689c-4ded-8243-67e3cb4c7603" containerID="12e34aafd762407b78d68f9a01a747e62412bb16c46c84d0bc5833e5a2e16f5d" exitCode=0 Jan 21 15:17:56 crc kubenswrapper[4707]: I0121 15:17:56.619774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" event={"ID":"7dc26c7f-689c-4ded-8243-67e3cb4c7603","Type":"ContainerDied","Data":"12e34aafd762407b78d68f9a01a747e62412bb16c46c84d0bc5833e5a2e16f5d"} Jan 21 15:17:56 crc kubenswrapper[4707]: I0121 15:17:56.621483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerStarted","Data":"485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009"} Jan 21 15:17:57 crc kubenswrapper[4707]: I0121 15:17:57.230484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:57 crc kubenswrapper[4707]: I0121 15:17:57.238275 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:17:57 crc kubenswrapper[4707]: I0121 15:17:57.629603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerStarted","Data":"38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4"} Jan 21 15:17:57 crc kubenswrapper[4707]: I0121 15:17:57.884684 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:57 crc kubenswrapper[4707]: I0121 15:17:57.901232 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.812335501 podStartE2EDuration="5.901210326s" podCreationTimestamp="2026-01-21 15:17:52 +0000 UTC" firstStartedPulling="2026-01-21 15:17:53.344993809 +0000 UTC m=+970.526510031" lastFinishedPulling="2026-01-21 15:17:57.433868634 +0000 UTC m=+974.615384856" observedRunningTime="2026-01-21 15:17:57.656427891 +0000 UTC m=+974.837944114" watchObservedRunningTime="2026-01-21 15:17:57.901210326 +0000 UTC m=+975.082726548" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.008577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-combined-ca-bundle\") pod \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.008676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-scripts\") pod \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.008707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmc56\" (UniqueName: \"kubernetes.io/projected/7dc26c7f-689c-4ded-8243-67e3cb4c7603-kube-api-access-bmc56\") pod \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.008760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-config-data\") pod \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\" (UID: \"7dc26c7f-689c-4ded-8243-67e3cb4c7603\") " Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.011919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc26c7f-689c-4ded-8243-67e3cb4c7603-kube-api-access-bmc56" (OuterVolumeSpecName: "kube-api-access-bmc56") pod "7dc26c7f-689c-4ded-8243-67e3cb4c7603" (UID: "7dc26c7f-689c-4ded-8243-67e3cb4c7603"). InnerVolumeSpecName "kube-api-access-bmc56". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.012488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-scripts" (OuterVolumeSpecName: "scripts") pod "7dc26c7f-689c-4ded-8243-67e3cb4c7603" (UID: "7dc26c7f-689c-4ded-8243-67e3cb4c7603"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.027142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-config-data" (OuterVolumeSpecName: "config-data") pod "7dc26c7f-689c-4ded-8243-67e3cb4c7603" (UID: "7dc26c7f-689c-4ded-8243-67e3cb4c7603"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.027419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dc26c7f-689c-4ded-8243-67e3cb4c7603" (UID: "7dc26c7f-689c-4ded-8243-67e3cb4c7603"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.110043 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.110083 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.110093 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmc56\" (UniqueName: \"kubernetes.io/projected/7dc26c7f-689c-4ded-8243-67e3cb4c7603-kube-api-access-bmc56\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.110103 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc26c7f-689c-4ded-8243-67e3cb4c7603-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.637426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" event={"ID":"7dc26c7f-689c-4ded-8243-67e3cb4c7603","Type":"ContainerDied","Data":"9749179180ce2b2933250247fe84458d8a0175e773cb1d1c0c1bbcffd963d8c0"} Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.637466 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9749179180ce2b2933250247fe84458d8a0175e773cb1d1c0c1bbcffd963d8c0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.637492 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.637741 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.722188 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:17:58 crc kubenswrapper[4707]: E0121 15:17:58.722505 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc26c7f-689c-4ded-8243-67e3cb4c7603" containerName="nova-cell0-conductor-db-sync" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.722522 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc26c7f-689c-4ded-8243-67e3cb4c7603" containerName="nova-cell0-conductor-db-sync" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.722699 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc26c7f-689c-4ded-8243-67e3cb4c7603" containerName="nova-cell0-conductor-db-sync" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.723205 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.728229 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-vbrdc" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.728357 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.738156 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.822358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx2cx\" (UniqueName: \"kubernetes.io/projected/36457603-5ef9-4f79-bf2c-11413e422937-kube-api-access-gx2cx\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.822714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.822930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.924430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx2cx\" (UniqueName: \"kubernetes.io/projected/36457603-5ef9-4f79-bf2c-11413e422937-kube-api-access-gx2cx\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.924541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.924619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.935518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.936289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:58 crc kubenswrapper[4707]: I0121 15:17:58.937206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx2cx\" (UniqueName: \"kubernetes.io/projected/36457603-5ef9-4f79-bf2c-11413e422937-kube-api-access-gx2cx\") pod \"nova-cell0-conductor-0\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:59 crc kubenswrapper[4707]: I0121 15:17:59.037673 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:59 crc kubenswrapper[4707]: W0121 15:17:59.413450 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36457603_5ef9_4f79_bf2c_11413e422937.slice/crio-6313ee5f80e449fe65c0ff7184c6fae072def507e528921ce6f85b2fc64f17b7 WatchSource:0}: Error finding container 6313ee5f80e449fe65c0ff7184c6fae072def507e528921ce6f85b2fc64f17b7: Status 404 returned error can't find the container with id 6313ee5f80e449fe65c0ff7184c6fae072def507e528921ce6f85b2fc64f17b7 Jan 21 15:17:59 crc kubenswrapper[4707]: I0121 15:17:59.420213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:17:59 crc kubenswrapper[4707]: I0121 15:17:59.645008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"36457603-5ef9-4f79-bf2c-11413e422937","Type":"ContainerStarted","Data":"f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b"} Jan 21 15:17:59 crc kubenswrapper[4707]: I0121 15:17:59.645041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"36457603-5ef9-4f79-bf2c-11413e422937","Type":"ContainerStarted","Data":"6313ee5f80e449fe65c0ff7184c6fae072def507e528921ce6f85b2fc64f17b7"} Jan 21 15:17:59 crc kubenswrapper[4707]: I0121 15:17:59.645091 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:17:59 crc kubenswrapper[4707]: I0121 15:17:59.658853 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.658835964 podStartE2EDuration="1.658835964s" podCreationTimestamp="2026-01-21 15:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:17:59.6579566 +0000 UTC m=+976.839472822" watchObservedRunningTime="2026-01-21 15:17:59.658835964 +0000 UTC m=+976.840352186" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.058201 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.430961 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.434591 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.442387 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.442498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.450790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.508303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-scripts\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.508374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.508409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4n7\" (UniqueName: \"kubernetes.io/projected/bfb72ff0-20b0-4878-a229-85528d202449-kube-api-access-9x4n7\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.508443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-config-data\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.584784 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.586273 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.596197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.596737 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.598564 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.604241 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.604313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.609904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-scripts\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.609971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.609999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4n7\" (UniqueName: \"kubernetes.io/projected/bfb72ff0-20b0-4878-a229-85528d202449-kube-api-access-9x4n7\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.610032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-config-data\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.609904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.615519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-config-data\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.618232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-scripts\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.631396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.649183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4n7\" (UniqueName: 
\"kubernetes.io/projected/bfb72ff0-20b0-4878-a229-85528d202449-kube-api-access-9x4n7\") pod \"nova-cell0-cell-mapping-rwlwq\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.652591 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.653519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.660708 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.682131 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.711205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-config-data\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.711426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.711618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.711774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64dc582-c8d6-412a-9f87-1c4a88154ee1-logs\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.711876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-config-data\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.711944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615ee5fc-ef35-43da-a9e6-0ffd523a1375-logs\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.712048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdkrz\" (UniqueName: \"kubernetes.io/projected/615ee5fc-ef35-43da-a9e6-0ffd523a1375-kube-api-access-jdkrz\") pod \"nova-api-0\" (UID: 
\"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.712182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbrd\" (UniqueName: \"kubernetes.io/projected/c64dc582-c8d6-412a-9f87-1c4a88154ee1-kube-api-access-9bbrd\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.719517 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.720640 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.722924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.742577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.768148 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64dc582-c8d6-412a-9f87-1c4a88154ee1-logs\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-config-data\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615ee5fc-ef35-43da-a9e6-0ffd523a1375-logs\") 
pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdkrz\" (UniqueName: \"kubernetes.io/projected/615ee5fc-ef35-43da-a9e6-0ffd523a1375-kube-api-access-jdkrz\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbrd\" (UniqueName: \"kubernetes.io/projected/c64dc582-c8d6-412a-9f87-1c4a88154ee1-kube-api-access-9bbrd\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lvb\" (UniqueName: \"kubernetes.io/projected/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-kube-api-access-v7lvb\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.818992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-config-data\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.819025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.819111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsrn\" (UniqueName: \"kubernetes.io/projected/27442a8c-bc16-4113-800f-4d8f3046ad99-kube-api-access-vqsrn\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.819160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.820137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64dc582-c8d6-412a-9f87-1c4a88154ee1-logs\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.821182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615ee5fc-ef35-43da-a9e6-0ffd523a1375-logs\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.822043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.824658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.825382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-config-data\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.826286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-config-data\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.840020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bbrd\" (UniqueName: \"kubernetes.io/projected/c64dc582-c8d6-412a-9f87-1c4a88154ee1-kube-api-access-9bbrd\") pod \"nova-metadata-0\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.844776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdkrz\" (UniqueName: \"kubernetes.io/projected/615ee5fc-ef35-43da-a9e6-0ffd523a1375-kube-api-access-jdkrz\") pod \"nova-api-0\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.920644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.920728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.920750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.921362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lvb\" (UniqueName: \"kubernetes.io/projected/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-kube-api-access-v7lvb\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.921427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.921562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsrn\" (UniqueName: \"kubernetes.io/projected/27442a8c-bc16-4113-800f-4d8f3046ad99-kube-api-access-vqsrn\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.924026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.924564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.924615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.925288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.934366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lvb\" (UniqueName: \"kubernetes.io/projected/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-kube-api-access-v7lvb\") pod \"nova-scheduler-0\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.936251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsrn\" (UniqueName: \"kubernetes.io/projected/27442a8c-bc16-4113-800f-4d8f3046ad99-kube-api-access-vqsrn\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.954989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:04 crc kubenswrapper[4707]: I0121 15:18:04.995181 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.002761 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.036041 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.151433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq"] Jan 21 15:18:05 crc kubenswrapper[4707]: W0121 15:18:05.156550 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb72ff0_20b0_4878_a229_85528d202449.slice/crio-2dfc155c6e0cd440978ae744a2887897d61e788aa696fe40d76c71657b3b51b3 WatchSource:0}: Error finding container 2dfc155c6e0cd440978ae744a2887897d61e788aa696fe40d76c71657b3b51b3: Status 404 returned error can't find the container with id 2dfc155c6e0cd440978ae744a2887897d61e788aa696fe40d76c71657b3b51b3 Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.273660 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr"] Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.274660 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.276482 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.276790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.287088 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr"] Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.393656 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:05 crc kubenswrapper[4707]: W0121 15:18:05.396910 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc64dc582_c8d6_412a_9f87_1c4a88154ee1.slice/crio-c8cd06e958b5c784a81e62d27706f99128b99cd888599296d83fefe6b4f0e69d WatchSource:0}: Error finding container c8cd06e958b5c784a81e62d27706f99128b99cd888599296d83fefe6b4f0e69d: Status 404 returned error can't find the container with id c8cd06e958b5c784a81e62d27706f99128b99cd888599296d83fefe6b4f0e69d Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.436356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-config-data\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.436437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drnd4\" (UniqueName: \"kubernetes.io/projected/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-kube-api-access-drnd4\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.436536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-scripts\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.436733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.454490 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:05 crc kubenswrapper[4707]: W0121 15:18:05.455538 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615ee5fc_ef35_43da_a9e6_0ffd523a1375.slice/crio-db6bf76d7b8bf8575ac92b40c069804278da0a89e056459c4b09c9b5f46f519c WatchSource:0}: 
Error finding container db6bf76d7b8bf8575ac92b40c069804278da0a89e056459c4b09c9b5f46f519c: Status 404 returned error can't find the container with id db6bf76d7b8bf8575ac92b40c069804278da0a89e056459c4b09c9b5f46f519c Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.460692 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.538164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.538298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-config-data\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.538365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drnd4\" (UniqueName: \"kubernetes.io/projected/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-kube-api-access-drnd4\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.538404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-scripts\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.542535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.542572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-config-data\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.542804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-scripts\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.556781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnd4\" (UniqueName: \"kubernetes.io/projected/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-kube-api-access-drnd4\") pod \"nova-cell1-conductor-db-sync-672zr\" (UID: 
\"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.563159 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:05 crc kubenswrapper[4707]: W0121 15:18:05.566527 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27442a8c_bc16_4113_800f_4d8f3046ad99.slice/crio-631e3c2ed390f1fdd0d4502dfe8f1be14f405dc2371772fc8b674d4aeb46ce33 WatchSource:0}: Error finding container 631e3c2ed390f1fdd0d4502dfe8f1be14f405dc2371772fc8b674d4aeb46ce33: Status 404 returned error can't find the container with id 631e3c2ed390f1fdd0d4502dfe8f1be14f405dc2371772fc8b674d4aeb46ce33 Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.591145 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.695750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" event={"ID":"bfb72ff0-20b0-4878-a229-85528d202449","Type":"ContainerStarted","Data":"a1be8011bc6045d525baadc48d11d4837fc2704a537669c89fedb85109e454de"} Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.695793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" event={"ID":"bfb72ff0-20b0-4878-a229-85528d202449","Type":"ContainerStarted","Data":"2dfc155c6e0cd440978ae744a2887897d61e788aa696fe40d76c71657b3b51b3"} Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.697488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c64dc582-c8d6-412a-9f87-1c4a88154ee1","Type":"ContainerStarted","Data":"c8cd06e958b5c784a81e62d27706f99128b99cd888599296d83fefe6b4f0e69d"} Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.698382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"27442a8c-bc16-4113-800f-4d8f3046ad99","Type":"ContainerStarted","Data":"631e3c2ed390f1fdd0d4502dfe8f1be14f405dc2371772fc8b674d4aeb46ce33"} Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.700401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27","Type":"ContainerStarted","Data":"46af158bb52b1ebe16c017c70a11ece4ac78d0d7604e7850f09791cc27408b34"} Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.701418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"615ee5fc-ef35-43da-a9e6-0ffd523a1375","Type":"ContainerStarted","Data":"db6bf76d7b8bf8575ac92b40c069804278da0a89e056459c4b09c9b5f46f519c"} Jan 21 15:18:05 crc kubenswrapper[4707]: I0121 15:18:05.706689 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" podStartSLOduration=1.7066783810000001 podStartE2EDuration="1.706678381s" podCreationTimestamp="2026-01-21 15:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:05.706290201 +0000 UTC m=+982.887806423" watchObservedRunningTime="2026-01-21 15:18:05.706678381 +0000 UTC m=+982.888194603" Jan 21 15:18:05 crc 
kubenswrapper[4707]: I0121 15:18:05.994138 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr"] Jan 21 15:18:06 crc kubenswrapper[4707]: I0121 15:18:06.708604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" event={"ID":"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9","Type":"ContainerStarted","Data":"073faa94385891588bd04cabc70df3b815c7cb775dc37edc546cf23853834b62"} Jan 21 15:18:06 crc kubenswrapper[4707]: I0121 15:18:06.708821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" event={"ID":"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9","Type":"ContainerStarted","Data":"fe888825cc1c1bad9fa6571a530e52597d5247bd74d37303ce8969fd169abe70"} Jan 21 15:18:07 crc kubenswrapper[4707]: I0121 15:18:07.726572 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" podStartSLOduration=2.7265581709999998 podStartE2EDuration="2.726558171s" podCreationTimestamp="2026-01-21 15:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:07.725984261 +0000 UTC m=+984.907500483" watchObservedRunningTime="2026-01-21 15:18:07.726558171 +0000 UTC m=+984.908074393" Jan 21 15:18:08 crc kubenswrapper[4707]: I0121 15:18:08.351505 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:08 crc kubenswrapper[4707]: I0121 15:18:08.367218 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.754291 4707 generic.go:334] "Generic (PLEG): container finished" podID="3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" containerID="073faa94385891588bd04cabc70df3b815c7cb775dc37edc546cf23853834b62" exitCode=0 Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.754539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" event={"ID":"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9","Type":"ContainerDied","Data":"073faa94385891588bd04cabc70df3b815c7cb775dc37edc546cf23853834b62"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.757585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"27442a8c-bc16-4113-800f-4d8f3046ad99","Type":"ContainerStarted","Data":"5c80dd2658c929c67f6346efbdf320642e36107ab897452d3c02ab7d7505a7be"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.757642 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="27442a8c-bc16-4113-800f-4d8f3046ad99" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5c80dd2658c929c67f6346efbdf320642e36107ab897452d3c02ab7d7505a7be" gracePeriod=30 Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.758974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27","Type":"ContainerStarted","Data":"99b32f0c2b6aee998a94da8e45a516c8e93c20285f028ff18766f1f9df424d4c"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.760628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"615ee5fc-ef35-43da-a9e6-0ffd523a1375","Type":"ContainerStarted","Data":"8820f80a53124cd31ccbe0b68e5495d761a5834dccd7ae1953beea5e4bc9886a"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.760651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"615ee5fc-ef35-43da-a9e6-0ffd523a1375","Type":"ContainerStarted","Data":"79d3e5c487fe483466928d3803a04398dcec16dfd71fc55446774426c66dd7d7"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.771197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c64dc582-c8d6-412a-9f87-1c4a88154ee1","Type":"ContainerStarted","Data":"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.771253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c64dc582-c8d6-412a-9f87-1c4a88154ee1","Type":"ContainerStarted","Data":"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920"} Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.771282 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-log" containerID="cri-o://1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920" gracePeriod=30 Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.771337 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-metadata" containerID="cri-o://7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60" gracePeriod=30 Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.786489 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.485740018 podStartE2EDuration="5.786474517s" podCreationTimestamp="2026-01-21 15:18:04 +0000 UTC" firstStartedPulling="2026-01-21 15:18:05.459969494 +0000 UTC m=+982.641485715" lastFinishedPulling="2026-01-21 15:18:08.760703992 +0000 UTC m=+985.942220214" observedRunningTime="2026-01-21 15:18:09.786218445 +0000 UTC m=+986.967734667" watchObservedRunningTime="2026-01-21 15:18:09.786474517 +0000 UTC m=+986.967990740" Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.802352 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.507932308 podStartE2EDuration="5.802340505s" podCreationTimestamp="2026-01-21 15:18:04 +0000 UTC" firstStartedPulling="2026-01-21 15:18:05.464397453 +0000 UTC m=+982.645913675" lastFinishedPulling="2026-01-21 15:18:08.758805651 +0000 UTC m=+985.940321872" observedRunningTime="2026-01-21 15:18:09.794692046 +0000 UTC m=+986.976208268" watchObservedRunningTime="2026-01-21 15:18:09.802340505 +0000 UTC m=+986.983856726" Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.813401 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.616861964 podStartE2EDuration="5.81338416s" podCreationTimestamp="2026-01-21 15:18:04 +0000 UTC" firstStartedPulling="2026-01-21 15:18:05.568790226 +0000 UTC m=+982.750306448" lastFinishedPulling="2026-01-21 15:18:08.765312421 +0000 UTC m=+985.946828644" 
observedRunningTime="2026-01-21 15:18:09.808759229 +0000 UTC m=+986.990275452" watchObservedRunningTime="2026-01-21 15:18:09.81338416 +0000 UTC m=+986.994900381" Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.824335 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.464097906 podStartE2EDuration="5.82432399s" podCreationTimestamp="2026-01-21 15:18:04 +0000 UTC" firstStartedPulling="2026-01-21 15:18:05.398517491 +0000 UTC m=+982.580033712" lastFinishedPulling="2026-01-21 15:18:08.758743574 +0000 UTC m=+985.940259796" observedRunningTime="2026-01-21 15:18:09.819968927 +0000 UTC m=+987.001485150" watchObservedRunningTime="2026-01-21 15:18:09.82432399 +0000 UTC m=+987.005840212" Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.946028 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.946097 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.956098 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:09 crc kubenswrapper[4707]: I0121 15:18:09.956160 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.003718 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.037183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.247784 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.318540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64dc582-c8d6-412a-9f87-1c4a88154ee1-logs\") pod \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.318597 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-config-data\") pod \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.318643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-combined-ca-bundle\") pod \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.318693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bbrd\" (UniqueName: \"kubernetes.io/projected/c64dc582-c8d6-412a-9f87-1c4a88154ee1-kube-api-access-9bbrd\") pod \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\" (UID: \"c64dc582-c8d6-412a-9f87-1c4a88154ee1\") " Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.319249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c64dc582-c8d6-412a-9f87-1c4a88154ee1-logs" (OuterVolumeSpecName: "logs") pod "c64dc582-c8d6-412a-9f87-1c4a88154ee1" (UID: "c64dc582-c8d6-412a-9f87-1c4a88154ee1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.324427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64dc582-c8d6-412a-9f87-1c4a88154ee1-kube-api-access-9bbrd" (OuterVolumeSpecName: "kube-api-access-9bbrd") pod "c64dc582-c8d6-412a-9f87-1c4a88154ee1" (UID: "c64dc582-c8d6-412a-9f87-1c4a88154ee1"). InnerVolumeSpecName "kube-api-access-9bbrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.339661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c64dc582-c8d6-412a-9f87-1c4a88154ee1" (UID: "c64dc582-c8d6-412a-9f87-1c4a88154ee1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.341402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-config-data" (OuterVolumeSpecName: "config-data") pod "c64dc582-c8d6-412a-9f87-1c4a88154ee1" (UID: "c64dc582-c8d6-412a-9f87-1c4a88154ee1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.420349 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bbrd\" (UniqueName: \"kubernetes.io/projected/c64dc582-c8d6-412a-9f87-1c4a88154ee1-kube-api-access-9bbrd\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.420374 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64dc582-c8d6-412a-9f87-1c4a88154ee1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.420384 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.420391 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64dc582-c8d6-412a-9f87-1c4a88154ee1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779191 4707 generic.go:334] "Generic (PLEG): container finished" podID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerID="7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60" exitCode=0 Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779388 4707 generic.go:334] "Generic (PLEG): container finished" podID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerID="1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920" exitCode=143 Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779597 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c64dc582-c8d6-412a-9f87-1c4a88154ee1","Type":"ContainerDied","Data":"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60"} Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c64dc582-c8d6-412a-9f87-1c4a88154ee1","Type":"ContainerDied","Data":"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920"} Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c64dc582-c8d6-412a-9f87-1c4a88154ee1","Type":"ContainerDied","Data":"c8cd06e958b5c784a81e62d27706f99128b99cd888599296d83fefe6b4f0e69d"} Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.779838 4707 scope.go:117] "RemoveContainer" containerID="7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.818193 4707 scope.go:117] "RemoveContainer" containerID="1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.821146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.828931 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.836032 4707 scope.go:117] "RemoveContainer" 
containerID="7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60" Jan 21 15:18:10 crc kubenswrapper[4707]: E0121 15:18:10.838444 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60\": container with ID starting with 7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60 not found: ID does not exist" containerID="7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.838476 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60"} err="failed to get container status \"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60\": rpc error: code = NotFound desc = could not find container \"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60\": container with ID starting with 7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60 not found: ID does not exist" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.838499 4707 scope.go:117] "RemoveContainer" containerID="1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920" Jan 21 15:18:10 crc kubenswrapper[4707]: E0121 15:18:10.839238 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920\": container with ID starting with 1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920 not found: ID does not exist" containerID="1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.839275 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920"} err="failed to get container status \"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920\": rpc error: code = NotFound desc = could not find container \"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920\": container with ID starting with 1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920 not found: ID does not exist" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.839300 4707 scope.go:117] "RemoveContainer" containerID="7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840115 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60"} err="failed to get container status \"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60\": rpc error: code = NotFound desc = could not find container \"7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60\": container with ID starting with 7e48ef845a6d79266dd4438885e1033601d2745b4b727fb52b2930c11a882f60 not found: ID does not exist" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840146 4707 scope.go:117] "RemoveContainer" containerID="1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840173 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840371 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920"} err="failed to get container status \"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920\": rpc error: code = NotFound desc = could not find container \"1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920\": container with ID starting with 1baa8cc32c2b43c059dab72729480f688449840726414845867e05ec79761920 not found: ID does not exist" Jan 21 15:18:10 crc kubenswrapper[4707]: E0121 15:18:10.840515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-log" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-log" Jan 21 15:18:10 crc kubenswrapper[4707]: E0121 15:18:10.840559 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-metadata" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840565 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-metadata" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-log" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.840850 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" containerName="nova-metadata-metadata" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.843934 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.847578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.847685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.859676 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.927794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-config-data\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.927942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.927985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79fca8c-2748-4e8c-8f52-389f2b82677f-logs\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.928098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:10 crc kubenswrapper[4707]: I0121 15:18:10.928130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqsl\" (UniqueName: \"kubernetes.io/projected/f79fca8c-2748-4e8c-8f52-389f2b82677f-kube-api-access-9xqsl\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.029967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-config-data\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.030036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.030063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f79fca8c-2748-4e8c-8f52-389f2b82677f-logs\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.030131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.030159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqsl\" (UniqueName: \"kubernetes.io/projected/f79fca8c-2748-4e8c-8f52-389f2b82677f-kube-api-access-9xqsl\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.031052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79fca8c-2748-4e8c-8f52-389f2b82677f-logs\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.034674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.034766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.035862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-config-data\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.046302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqsl\" (UniqueName: \"kubernetes.io/projected/f79fca8c-2748-4e8c-8f52-389f2b82677f-kube-api-access-9xqsl\") pod \"nova-metadata-0\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.090871 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.164449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.192145 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64dc582-c8d6-412a-9f87-1c4a88154ee1" path="/var/lib/kubelet/pods/c64dc582-c8d6-412a-9f87-1c4a88154ee1/volumes" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.232924 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-combined-ca-bundle\") pod \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.232969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-config-data\") pod \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.233014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drnd4\" (UniqueName: \"kubernetes.io/projected/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-kube-api-access-drnd4\") pod \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.233187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-scripts\") pod \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\" (UID: \"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9\") " Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.238350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-kube-api-access-drnd4" (OuterVolumeSpecName: "kube-api-access-drnd4") pod "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" (UID: "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9"). InnerVolumeSpecName "kube-api-access-drnd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.238569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-scripts" (OuterVolumeSpecName: "scripts") pod "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" (UID: "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.253964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-config-data" (OuterVolumeSpecName: "config-data") pod "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" (UID: "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.258969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" (UID: "3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.335595 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.335619 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.335631 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.335639 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drnd4\" (UniqueName: \"kubernetes.io/projected/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9-kube-api-access-drnd4\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.533042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:11 crc kubenswrapper[4707]: W0121 15:18:11.534495 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79fca8c_2748_4e8c_8f52_389f2b82677f.slice/crio-25694234df08bb237d9a9b88497c6f0feac978e421339a991f02360f744b1ee6 WatchSource:0}: Error finding container 25694234df08bb237d9a9b88497c6f0feac978e421339a991f02360f744b1ee6: Status 404 returned error can't find the container with id 25694234df08bb237d9a9b88497c6f0feac978e421339a991f02360f744b1ee6 Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.787576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" event={"ID":"3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9","Type":"ContainerDied","Data":"fe888825cc1c1bad9fa6571a530e52597d5247bd74d37303ce8969fd169abe70"} Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.787775 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe888825cc1c1bad9fa6571a530e52597d5247bd74d37303ce8969fd169abe70" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.787852 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.795906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f79fca8c-2748-4e8c-8f52-389f2b82677f","Type":"ContainerStarted","Data":"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2"} Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.795951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f79fca8c-2748-4e8c-8f52-389f2b82677f","Type":"ContainerStarted","Data":"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95"} Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.795962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f79fca8c-2748-4e8c-8f52-389f2b82677f","Type":"ContainerStarted","Data":"25694234df08bb237d9a9b88497c6f0feac978e421339a991f02360f744b1ee6"} Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.798035 4707 generic.go:334] "Generic (PLEG): container finished" podID="bfb72ff0-20b0-4878-a229-85528d202449" containerID="a1be8011bc6045d525baadc48d11d4837fc2704a537669c89fedb85109e454de" exitCode=0 Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.798110 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" event={"ID":"bfb72ff0-20b0-4878-a229-85528d202449","Type":"ContainerDied","Data":"a1be8011bc6045d525baadc48d11d4837fc2704a537669c89fedb85109e454de"} Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.821557 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.821542318 podStartE2EDuration="1.821542318s" podCreationTimestamp="2026-01-21 15:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:11.810093792 +0000 UTC m=+988.991610014" watchObservedRunningTime="2026-01-21 15:18:11.821542318 +0000 UTC m=+989.003058541" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.828581 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:18:11 crc kubenswrapper[4707]: E0121 15:18:11.828931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" containerName="nova-cell1-conductor-db-sync" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.828948 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" containerName="nova-cell1-conductor-db-sync" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.829125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" containerName="nova-cell1-conductor-db-sync" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.830542 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.833217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.856069 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.949347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qlcg\" (UniqueName: \"kubernetes.io/projected/0e303ace-04be-4505-9437-af54b61d41e5-kube-api-access-2qlcg\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.949380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:11 crc kubenswrapper[4707]: I0121 15:18:11.949480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.050754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qlcg\" (UniqueName: \"kubernetes.io/projected/0e303ace-04be-4505-9437-af54b61d41e5-kube-api-access-2qlcg\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.050785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.050869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.054220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.054219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.062934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qlcg\" (UniqueName: \"kubernetes.io/projected/0e303ace-04be-4505-9437-af54b61d41e5-kube-api-access-2qlcg\") pod \"nova-cell1-conductor-0\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.147924 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.502472 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:18:12 crc kubenswrapper[4707]: W0121 15:18:12.504826 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e303ace_04be_4505_9437_af54b61d41e5.slice/crio-5201152a1f5520603563c6524f449144f5b0a32eafae894ee206e527ff039534 WatchSource:0}: Error finding container 5201152a1f5520603563c6524f449144f5b0a32eafae894ee206e527ff039534: Status 404 returned error can't find the container with id 5201152a1f5520603563c6524f449144f5b0a32eafae894ee206e527ff039534 Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.807896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0e303ace-04be-4505-9437-af54b61d41e5","Type":"ContainerStarted","Data":"c9b449437d00fe1991a9d06649066f2fc2cf5629c28ad87cf426887a6dbf94ac"} Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.807943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0e303ace-04be-4505-9437-af54b61d41e5","Type":"ContainerStarted","Data":"5201152a1f5520603563c6524f449144f5b0a32eafae894ee206e527ff039534"} Jan 21 15:18:12 crc kubenswrapper[4707]: I0121 15:18:12.821565 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.821548887 podStartE2EDuration="1.821548887s" podCreationTimestamp="2026-01-21 15:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:12.818856623 +0000 UTC m=+990.000372845" watchObservedRunningTime="2026-01-21 15:18:12.821548887 +0000 UTC m=+990.003065109" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.123971 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.269353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x4n7\" (UniqueName: \"kubernetes.io/projected/bfb72ff0-20b0-4878-a229-85528d202449-kube-api-access-9x4n7\") pod \"bfb72ff0-20b0-4878-a229-85528d202449\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.269442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-combined-ca-bundle\") pod \"bfb72ff0-20b0-4878-a229-85528d202449\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.269508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-config-data\") pod \"bfb72ff0-20b0-4878-a229-85528d202449\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.269582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-scripts\") pod \"bfb72ff0-20b0-4878-a229-85528d202449\" (UID: \"bfb72ff0-20b0-4878-a229-85528d202449\") " Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.280252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb72ff0-20b0-4878-a229-85528d202449-kube-api-access-9x4n7" (OuterVolumeSpecName: "kube-api-access-9x4n7") pod "bfb72ff0-20b0-4878-a229-85528d202449" (UID: "bfb72ff0-20b0-4878-a229-85528d202449"). InnerVolumeSpecName "kube-api-access-9x4n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.281032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-scripts" (OuterVolumeSpecName: "scripts") pod "bfb72ff0-20b0-4878-a229-85528d202449" (UID: "bfb72ff0-20b0-4878-a229-85528d202449"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.292280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfb72ff0-20b0-4878-a229-85528d202449" (UID: "bfb72ff0-20b0-4878-a229-85528d202449"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.293949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-config-data" (OuterVolumeSpecName: "config-data") pod "bfb72ff0-20b0-4878-a229-85528d202449" (UID: "bfb72ff0-20b0-4878-a229-85528d202449"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.371836 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.371867 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.371877 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfb72ff0-20b0-4878-a229-85528d202449-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.371885 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x4n7\" (UniqueName: \"kubernetes.io/projected/bfb72ff0-20b0-4878-a229-85528d202449-kube-api-access-9x4n7\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.815470 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.815465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq" event={"ID":"bfb72ff0-20b0-4878-a229-85528d202449","Type":"ContainerDied","Data":"2dfc155c6e0cd440978ae744a2887897d61e788aa696fe40d76c71657b3b51b3"} Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.815518 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dfc155c6e0cd440978ae744a2887897d61e788aa696fe40d76c71657b3b51b3" Jan 21 15:18:13 crc kubenswrapper[4707]: I0121 15:18:13.815553 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.033589 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.034009 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" containerName="nova-scheduler-scheduler" containerID="cri-o://99b32f0c2b6aee998a94da8e45a516c8e93c20285f028ff18766f1f9df424d4c" gracePeriod=30 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.062318 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.062514 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-log" containerID="cri-o://79d3e5c487fe483466928d3803a04398dcec16dfd71fc55446774426c66dd7d7" gracePeriod=30 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.062915 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-api" containerID="cri-o://8820f80a53124cd31ccbe0b68e5495d761a5834dccd7ae1953beea5e4bc9886a" gracePeriod=30 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.166009 4707 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.166372 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-log" containerID="cri-o://95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95" gracePeriod=30 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.166892 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-metadata" containerID="cri-o://e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2" gracePeriod=30 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.674504 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.796992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xqsl\" (UniqueName: \"kubernetes.io/projected/f79fca8c-2748-4e8c-8f52-389f2b82677f-kube-api-access-9xqsl\") pod \"f79fca8c-2748-4e8c-8f52-389f2b82677f\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.797098 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-combined-ca-bundle\") pod \"f79fca8c-2748-4e8c-8f52-389f2b82677f\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.797186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-config-data\") pod \"f79fca8c-2748-4e8c-8f52-389f2b82677f\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.797247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-nova-metadata-tls-certs\") pod \"f79fca8c-2748-4e8c-8f52-389f2b82677f\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.797311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79fca8c-2748-4e8c-8f52-389f2b82677f-logs\") pod \"f79fca8c-2748-4e8c-8f52-389f2b82677f\" (UID: \"f79fca8c-2748-4e8c-8f52-389f2b82677f\") " Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.798359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79fca8c-2748-4e8c-8f52-389f2b82677f-logs" (OuterVolumeSpecName: "logs") pod "f79fca8c-2748-4e8c-8f52-389f2b82677f" (UID: "f79fca8c-2748-4e8c-8f52-389f2b82677f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.809284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79fca8c-2748-4e8c-8f52-389f2b82677f-kube-api-access-9xqsl" (OuterVolumeSpecName: "kube-api-access-9xqsl") pod "f79fca8c-2748-4e8c-8f52-389f2b82677f" (UID: "f79fca8c-2748-4e8c-8f52-389f2b82677f"). 
InnerVolumeSpecName "kube-api-access-9xqsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.819995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f79fca8c-2748-4e8c-8f52-389f2b82677f" (UID: "f79fca8c-2748-4e8c-8f52-389f2b82677f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.828866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-config-data" (OuterVolumeSpecName: "config-data") pod "f79fca8c-2748-4e8c-8f52-389f2b82677f" (UID: "f79fca8c-2748-4e8c-8f52-389f2b82677f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.829972 4707 generic.go:334] "Generic (PLEG): container finished" podID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerID="e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2" exitCode=0 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.830001 4707 generic.go:334] "Generic (PLEG): container finished" podID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerID="95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95" exitCode=143 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.830036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f79fca8c-2748-4e8c-8f52-389f2b82677f","Type":"ContainerDied","Data":"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2"} Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.830063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f79fca8c-2748-4e8c-8f52-389f2b82677f","Type":"ContainerDied","Data":"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95"} Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.830074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f79fca8c-2748-4e8c-8f52-389f2b82677f","Type":"ContainerDied","Data":"25694234df08bb237d9a9b88497c6f0feac978e421339a991f02360f744b1ee6"} Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.830100 4707 scope.go:117] "RemoveContainer" containerID="e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.830228 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.833660 4707 generic.go:334] "Generic (PLEG): container finished" podID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerID="8820f80a53124cd31ccbe0b68e5495d761a5834dccd7ae1953beea5e4bc9886a" exitCode=0 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.833691 4707 generic.go:334] "Generic (PLEG): container finished" podID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerID="79d3e5c487fe483466928d3803a04398dcec16dfd71fc55446774426c66dd7d7" exitCode=143 Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.833719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"615ee5fc-ef35-43da-a9e6-0ffd523a1375","Type":"ContainerDied","Data":"8820f80a53124cd31ccbe0b68e5495d761a5834dccd7ae1953beea5e4bc9886a"} Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.833743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"615ee5fc-ef35-43da-a9e6-0ffd523a1375","Type":"ContainerDied","Data":"79d3e5c487fe483466928d3803a04398dcec16dfd71fc55446774426c66dd7d7"} Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.839778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f79fca8c-2748-4e8c-8f52-389f2b82677f" (UID: "f79fca8c-2748-4e8c-8f52-389f2b82677f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.855161 4707 scope.go:117] "RemoveContainer" containerID="95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.878747 4707 scope.go:117] "RemoveContainer" containerID="e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2" Jan 21 15:18:14 crc kubenswrapper[4707]: E0121 15:18:14.879252 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2\": container with ID starting with e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2 not found: ID does not exist" containerID="e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.879317 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2"} err="failed to get container status \"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2\": rpc error: code = NotFound desc = could not find container \"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2\": container with ID starting with e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2 not found: ID does not exist" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.879349 4707 scope.go:117] "RemoveContainer" containerID="95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95" Jan 21 15:18:14 crc kubenswrapper[4707]: E0121 15:18:14.887646 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95\": container with ID starting with 
95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95 not found: ID does not exist" containerID="95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.887706 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95"} err="failed to get container status \"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95\": rpc error: code = NotFound desc = could not find container \"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95\": container with ID starting with 95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95 not found: ID does not exist" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.887741 4707 scope.go:117] "RemoveContainer" containerID="e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.888671 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2"} err="failed to get container status \"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2\": rpc error: code = NotFound desc = could not find container \"e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2\": container with ID starting with e039dcedbb20a259d2cf001ddbbf19c4235cd2597ee0d89a91d293aac246e5d2 not found: ID does not exist" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.888712 4707 scope.go:117] "RemoveContainer" containerID="95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.894128 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95"} err="failed to get container status \"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95\": rpc error: code = NotFound desc = could not find container \"95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95\": container with ID starting with 95aabbbc972f1f7c1ebeceebfe5265ede6f4bd046186dfa2108d6dd238097c95 not found: ID does not exist" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.899378 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xqsl\" (UniqueName: \"kubernetes.io/projected/f79fca8c-2748-4e8c-8f52-389f2b82677f-kube-api-access-9xqsl\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.899405 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.899415 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.899424 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79fca8c-2748-4e8c-8f52-389f2b82677f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.899433 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f79fca8c-2748-4e8c-8f52-389f2b82677f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:14 crc kubenswrapper[4707]: I0121 15:18:14.917167 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.000311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdkrz\" (UniqueName: \"kubernetes.io/projected/615ee5fc-ef35-43da-a9e6-0ffd523a1375-kube-api-access-jdkrz\") pod \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.000511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615ee5fc-ef35-43da-a9e6-0ffd523a1375-logs\") pod \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.000563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-config-data\") pod \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.000795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-combined-ca-bundle\") pod \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\" (UID: \"615ee5fc-ef35-43da-a9e6-0ffd523a1375\") " Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.001997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615ee5fc-ef35-43da-a9e6-0ffd523a1375-logs" (OuterVolumeSpecName: "logs") pod "615ee5fc-ef35-43da-a9e6-0ffd523a1375" (UID: "615ee5fc-ef35-43da-a9e6-0ffd523a1375"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.004015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615ee5fc-ef35-43da-a9e6-0ffd523a1375-kube-api-access-jdkrz" (OuterVolumeSpecName: "kube-api-access-jdkrz") pod "615ee5fc-ef35-43da-a9e6-0ffd523a1375" (UID: "615ee5fc-ef35-43da-a9e6-0ffd523a1375"). InnerVolumeSpecName "kube-api-access-jdkrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.020909 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-config-data" (OuterVolumeSpecName: "config-data") pod "615ee5fc-ef35-43da-a9e6-0ffd523a1375" (UID: "615ee5fc-ef35-43da-a9e6-0ffd523a1375"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.020982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "615ee5fc-ef35-43da-a9e6-0ffd523a1375" (UID: "615ee5fc-ef35-43da-a9e6-0ffd523a1375"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.103950 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.103990 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdkrz\" (UniqueName: \"kubernetes.io/projected/615ee5fc-ef35-43da-a9e6-0ffd523a1375-kube-api-access-jdkrz\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.104002 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615ee5fc-ef35-43da-a9e6-0ffd523a1375-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.104011 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615ee5fc-ef35-43da-a9e6-0ffd523a1375-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.163881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.176101 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.193124 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" path="/var/lib/kubelet/pods/f79fca8c-2748-4e8c-8f52-389f2b82677f/volumes" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.194599 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: E0121 15:18:15.197520 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-api" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.197559 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-api" Jan 21 15:18:15 crc kubenswrapper[4707]: E0121 15:18:15.197585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb72ff0-20b0-4878-a229-85528d202449" containerName="nova-manage" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.197594 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb72ff0-20b0-4878-a229-85528d202449" containerName="nova-manage" Jan 21 15:18:15 crc kubenswrapper[4707]: E0121 15:18:15.197603 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-metadata" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.197610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-metadata" Jan 21 15:18:15 crc kubenswrapper[4707]: E0121 15:18:15.197621 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-log" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.197627 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-log" Jan 21 15:18:15 crc kubenswrapper[4707]: E0121 15:18:15.197663 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-log" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.197672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-log" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.197997 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-metadata" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.198021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-log" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.198032 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb72ff0-20b0-4878-a229-85528d202449" containerName="nova-manage" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.198054 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" containerName="nova-api-api" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.198061 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79fca8c-2748-4e8c-8f52-389f2b82677f" containerName="nova-metadata-log" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.199662 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.202134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.202363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.205335 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.307716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a9ec4e-857c-43a8-b31e-dc38360c6527-logs\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.308130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.308190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrxd\" (UniqueName: \"kubernetes.io/projected/12a9ec4e-857c-43a8-b31e-dc38360c6527-kube-api-access-gkrxd\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.308280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-config-data\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.308308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.410656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a9ec4e-857c-43a8-b31e-dc38360c6527-logs\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.410825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.410898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkrxd\" (UniqueName: \"kubernetes.io/projected/12a9ec4e-857c-43a8-b31e-dc38360c6527-kube-api-access-gkrxd\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.410998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-config-data\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.411046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.412277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a9ec4e-857c-43a8-b31e-dc38360c6527-logs\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.416366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-config-data\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.421926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.431406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gkrxd\" (UniqueName: \"kubernetes.io/projected/12a9ec4e-857c-43a8-b31e-dc38360c6527-kube-api-access-gkrxd\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.431911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.538341 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.841038 4707 generic.go:334] "Generic (PLEG): container finished" podID="04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" containerID="99b32f0c2b6aee998a94da8e45a516c8e93c20285f028ff18766f1f9df424d4c" exitCode=0 Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.841241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27","Type":"ContainerDied","Data":"99b32f0c2b6aee998a94da8e45a516c8e93c20285f028ff18766f1f9df424d4c"} Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.845574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"615ee5fc-ef35-43da-a9e6-0ffd523a1375","Type":"ContainerDied","Data":"db6bf76d7b8bf8575ac92b40c069804278da0a89e056459c4b09c9b5f46f519c"} Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.845606 4707 scope.go:117] "RemoveContainer" containerID="8820f80a53124cd31ccbe0b68e5495d761a5834dccd7ae1953beea5e4bc9886a" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.845718 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.877162 4707 scope.go:117] "RemoveContainer" containerID="79d3e5c487fe483466928d3803a04398dcec16dfd71fc55446774426c66dd7d7" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.884534 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.894691 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.901859 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.903203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.906608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.912864 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.927324 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:15 crc kubenswrapper[4707]: I0121 15:18:15.944656 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.020894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-config-data\") pod \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.021048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle\") pod \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.021167 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7lvb\" (UniqueName: \"kubernetes.io/projected/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-kube-api-access-v7lvb\") pod \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.021396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.021467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmp4\" (UniqueName: \"kubernetes.io/projected/63acb604-a9d8-4dd4-8898-67365790ab8d-kube-api-access-7zmp4\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.021590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63acb604-a9d8-4dd4-8898-67365790ab8d-logs\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.021634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-config-data\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.025182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-kube-api-access-v7lvb" (OuterVolumeSpecName: "kube-api-access-v7lvb") pod "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" (UID: "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27"). InnerVolumeSpecName "kube-api-access-v7lvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:16 crc kubenswrapper[4707]: E0121 15:18:16.041838 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle podName:04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27 nodeName:}" failed. No retries permitted until 2026-01-21 15:18:16.541761228 +0000 UTC m=+993.723277450 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle") pod "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" (UID: "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27") : error deleting /var/lib/kubelet/pods/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27/volume-subpaths: remove /var/lib/kubelet/pods/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27/volume-subpaths: no such file or directory Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.045267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-config-data" (OuterVolumeSpecName: "config-data") pod "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" (UID: "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.124717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63acb604-a9d8-4dd4-8898-67365790ab8d-logs\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.124768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-config-data\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.124834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.124873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmp4\" (UniqueName: \"kubernetes.io/projected/63acb604-a9d8-4dd4-8898-67365790ab8d-kube-api-access-7zmp4\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.124940 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7lvb\" (UniqueName: \"kubernetes.io/projected/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-kube-api-access-v7lvb\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.124955 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.126741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63acb604-a9d8-4dd4-8898-67365790ab8d-logs\") pod 
\"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.131398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-config-data\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.131833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.140176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmp4\" (UniqueName: \"kubernetes.io/projected/63acb604-a9d8-4dd4-8898-67365790ab8d-kube-api-access-7zmp4\") pod \"nova-api-0\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.228743 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.592717 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:16 crc kubenswrapper[4707]: W0121 15:18:16.595168 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63acb604_a9d8_4dd4_8898_67365790ab8d.slice/crio-b7637115018493a92d8bc1d7e8521a48e1fb82b46a60752ac65b1a2df1a190e2 WatchSource:0}: Error finding container b7637115018493a92d8bc1d7e8521a48e1fb82b46a60752ac65b1a2df1a190e2: Status 404 returned error can't find the container with id b7637115018493a92d8bc1d7e8521a48e1fb82b46a60752ac65b1a2df1a190e2 Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.633665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle\") pod \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\" (UID: \"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27\") " Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.637579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" (UID: "04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.736344 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.858044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"63acb604-a9d8-4dd4-8898-67365790ab8d","Type":"ContainerStarted","Data":"512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.858289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"63acb604-a9d8-4dd4-8898-67365790ab8d","Type":"ContainerStarted","Data":"501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.858301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"63acb604-a9d8-4dd4-8898-67365790ab8d","Type":"ContainerStarted","Data":"b7637115018493a92d8bc1d7e8521a48e1fb82b46a60752ac65b1a2df1a190e2"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.859536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27","Type":"ContainerDied","Data":"46af158bb52b1ebe16c017c70a11ece4ac78d0d7604e7850f09791cc27408b34"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.859579 4707 scope.go:117] "RemoveContainer" containerID="99b32f0c2b6aee998a94da8e45a516c8e93c20285f028ff18766f1f9df424d4c" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.859588 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.861250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"12a9ec4e-857c-43a8-b31e-dc38360c6527","Type":"ContainerStarted","Data":"a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.861285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"12a9ec4e-857c-43a8-b31e-dc38360c6527","Type":"ContainerStarted","Data":"6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.861297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"12a9ec4e-857c-43a8-b31e-dc38360c6527","Type":"ContainerStarted","Data":"bb70124027dccd1ed8810ae6a481045d7877b3f03aa77152e651a8587b6a68bb"} Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.874883 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.8748687259999999 podStartE2EDuration="1.874868726s" podCreationTimestamp="2026-01-21 15:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:16.870277449 +0000 UTC m=+994.051793672" watchObservedRunningTime="2026-01-21 15:18:16.874868726 +0000 UTC m=+994.056384978" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.890032 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.8900176549999999 podStartE2EDuration="1.890017655s" podCreationTimestamp="2026-01-21 15:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:16.884610673 +0000 UTC m=+994.066126895" watchObservedRunningTime="2026-01-21 15:18:16.890017655 +0000 UTC m=+994.071533877" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.899238 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.904753 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.912389 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:16 crc kubenswrapper[4707]: E0121 15:18:16.912751 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" containerName="nova-scheduler-scheduler" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.912774 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" containerName="nova-scheduler-scheduler" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.912962 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" containerName="nova-scheduler-scheduler" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.913560 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.915570 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:18:16 crc kubenswrapper[4707]: I0121 15:18:16.929356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.041014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfg9\" (UniqueName: \"kubernetes.io/projected/551e55fc-c235-41c7-8963-da7a6ebe4082-kube-api-access-qrfg9\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.041262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-config-data\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.041519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.142885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.142993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfg9\" (UniqueName: \"kubernetes.io/projected/551e55fc-c235-41c7-8963-da7a6ebe4082-kube-api-access-qrfg9\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.143027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-config-data\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.145974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.146407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-config-data\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.155931 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfg9\" (UniqueName: \"kubernetes.io/projected/551e55fc-c235-41c7-8963-da7a6ebe4082-kube-api-access-qrfg9\") pod \"nova-scheduler-0\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.167975 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.190531 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27" path="/var/lib/kubelet/pods/04c0e6b4-f0a0-4ce2-aa37-09b148c0aa27/volumes" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.191143 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615ee5fc-ef35-43da-a9e6-0ffd523a1375" path="/var/lib/kubelet/pods/615ee5fc-ef35-43da-a9e6-0ffd523a1375/volumes" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.226170 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.578391 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:18:17 crc kubenswrapper[4707]: W0121 15:18:17.581318 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod551e55fc_c235_41c7_8963_da7a6ebe4082.slice/crio-c23fbc4dbd806e45ac757c64bb779c4bb98c0dab221153bbee750d52a4c37c0d WatchSource:0}: Error finding container c23fbc4dbd806e45ac757c64bb779c4bb98c0dab221153bbee750d52a4c37c0d: Status 404 returned error can't find the container with id c23fbc4dbd806e45ac757c64bb779c4bb98c0dab221153bbee750d52a4c37c0d Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.870205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"551e55fc-c235-41c7-8963-da7a6ebe4082","Type":"ContainerStarted","Data":"04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540"} Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.870249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"551e55fc-c235-41c7-8963-da7a6ebe4082","Type":"ContainerStarted","Data":"c23fbc4dbd806e45ac757c64bb779c4bb98c0dab221153bbee750d52a4c37c0d"} Jan 21 15:18:17 crc kubenswrapper[4707]: I0121 15:18:17.886446 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.886432527 podStartE2EDuration="1.886432527s" podCreationTimestamp="2026-01-21 15:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:17.880648054 +0000 UTC m=+995.062164276" watchObservedRunningTime="2026-01-21 15:18:17.886432527 +0000 UTC m=+995.067948749" Jan 21 15:18:20 crc kubenswrapper[4707]: I0121 15:18:20.539168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:20 crc kubenswrapper[4707]: I0121 15:18:20.539405 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:22 crc kubenswrapper[4707]: I0121 15:18:22.226270 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:22 crc kubenswrapper[4707]: I0121 15:18:22.957894 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:25 crc kubenswrapper[4707]: I0121 15:18:25.538988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:25 crc kubenswrapper[4707]: I0121 15:18:25.539399 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:26 crc kubenswrapper[4707]: I0121 15:18:26.229060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:26 crc kubenswrapper[4707]: I0121 15:18:26.229118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:26 crc kubenswrapper[4707]: I0121 15:18:26.552945 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:18:26 crc kubenswrapper[4707]: I0121 15:18:26.552994 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:18:27 crc kubenswrapper[4707]: I0121 15:18:27.226377 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:27 crc kubenswrapper[4707]: I0121 15:18:27.247369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:27 crc kubenswrapper[4707]: I0121 15:18:27.310923 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:18:27 crc kubenswrapper[4707]: I0121 15:18:27.310935 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:18:27 crc kubenswrapper[4707]: I0121 15:18:27.961457 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:18:35 crc kubenswrapper[4707]: I0121 15:18:35.544268 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:35 crc kubenswrapper[4707]: I0121 15:18:35.544646 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:35 crc kubenswrapper[4707]: I0121 15:18:35.549475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:35 crc kubenswrapper[4707]: I0121 15:18:35.549760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:18:36 crc kubenswrapper[4707]: I0121 15:18:36.232716 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:36 crc kubenswrapper[4707]: I0121 15:18:36.233035 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:36 crc kubenswrapper[4707]: I0121 15:18:36.233344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:36 crc kubenswrapper[4707]: I0121 15:18:36.233385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:36 crc kubenswrapper[4707]: I0121 15:18:36.235898 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:36 crc kubenswrapper[4707]: I0121 15:18:36.238440 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:37 crc kubenswrapper[4707]: I0121 15:18:37.956160 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:18:37 crc kubenswrapper[4707]: I0121 15:18:37.956624 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-central-agent" containerID="cri-o://e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13" gracePeriod=30 Jan 21 15:18:37 crc kubenswrapper[4707]: I0121 15:18:37.956745 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="proxy-httpd" containerID="cri-o://38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4" gracePeriod=30 Jan 21 15:18:37 crc kubenswrapper[4707]: I0121 15:18:37.956779 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="sg-core" containerID="cri-o://485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009" gracePeriod=30 Jan 21 15:18:37 crc kubenswrapper[4707]: I0121 15:18:37.956829 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-notification-agent" containerID="cri-o://21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97" gracePeriod=30 Jan 21 15:18:38 crc kubenswrapper[4707]: I0121 15:18:38.518039 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008120 4707 generic.go:334] "Generic (PLEG): container finished" podID="c129b2b0-9811-4d58-9b65-de842874c901" containerID="38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4" exitCode=0 Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008372 4707 generic.go:334] "Generic (PLEG): container finished" podID="c129b2b0-9811-4d58-9b65-de842874c901" containerID="485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009" exitCode=2 Jan 21 
15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008382 4707 generic.go:334] "Generic (PLEG): container finished" podID="c129b2b0-9811-4d58-9b65-de842874c901" containerID="e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13" exitCode=0 Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerDied","Data":"38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4"} Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerDied","Data":"485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009"} Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerDied","Data":"e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13"} Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008538 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-log" containerID="cri-o://501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea" gracePeriod=30 Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.008614 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-api" containerID="cri-o://512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9" gracePeriod=30 Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.945285 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:18:39 crc kubenswrapper[4707]: I0121 15:18:39.945326 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.017926 4707 generic.go:334] "Generic (PLEG): container finished" podID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerID="501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea" exitCode=143 Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.017999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"63acb604-a9d8-4dd4-8898-67365790ab8d","Type":"ContainerDied","Data":"501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea"} Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.019330 4707 generic.go:334] "Generic (PLEG): container finished" podID="27442a8c-bc16-4113-800f-4d8f3046ad99" containerID="5c80dd2658c929c67f6346efbdf320642e36107ab897452d3c02ab7d7505a7be" exitCode=137 Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.019362 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"27442a8c-bc16-4113-800f-4d8f3046ad99","Type":"ContainerDied","Data":"5c80dd2658c929c67f6346efbdf320642e36107ab897452d3c02ab7d7505a7be"} Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.085234 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.203805 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqsrn\" (UniqueName: \"kubernetes.io/projected/27442a8c-bc16-4113-800f-4d8f3046ad99-kube-api-access-vqsrn\") pod \"27442a8c-bc16-4113-800f-4d8f3046ad99\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.203959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-combined-ca-bundle\") pod \"27442a8c-bc16-4113-800f-4d8f3046ad99\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.204077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-config-data\") pod \"27442a8c-bc16-4113-800f-4d8f3046ad99\" (UID: \"27442a8c-bc16-4113-800f-4d8f3046ad99\") " Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.208767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27442a8c-bc16-4113-800f-4d8f3046ad99-kube-api-access-vqsrn" (OuterVolumeSpecName: "kube-api-access-vqsrn") pod "27442a8c-bc16-4113-800f-4d8f3046ad99" (UID: "27442a8c-bc16-4113-800f-4d8f3046ad99"). InnerVolumeSpecName "kube-api-access-vqsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.223755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-config-data" (OuterVolumeSpecName: "config-data") pod "27442a8c-bc16-4113-800f-4d8f3046ad99" (UID: "27442a8c-bc16-4113-800f-4d8f3046ad99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.225313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27442a8c-bc16-4113-800f-4d8f3046ad99" (UID: "27442a8c-bc16-4113-800f-4d8f3046ad99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.305789 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.305832 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27442a8c-bc16-4113-800f-4d8f3046ad99-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:40 crc kubenswrapper[4707]: I0121 15:18:40.305841 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqsrn\" (UniqueName: \"kubernetes.io/projected/27442a8c-bc16-4113-800f-4d8f3046ad99-kube-api-access-vqsrn\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.026266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"27442a8c-bc16-4113-800f-4d8f3046ad99","Type":"ContainerDied","Data":"631e3c2ed390f1fdd0d4502dfe8f1be14f405dc2371772fc8b674d4aeb46ce33"} Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.026315 4707 scope.go:117] "RemoveContainer" containerID="5c80dd2658c929c67f6346efbdf320642e36107ab897452d3c02ab7d7505a7be" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.026322 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.051605 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.060264 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.069732 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:41 crc kubenswrapper[4707]: E0121 15:18:41.070098 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27442a8c-bc16-4113-800f-4d8f3046ad99" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.070129 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27442a8c-bc16-4113-800f-4d8f3046ad99" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.070281 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27442a8c-bc16-4113-800f-4d8f3046ad99" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.071017 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.072843 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.073647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.073722 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.082006 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.189429 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27442a8c-bc16-4113-800f-4d8f3046ad99" path="/var/lib/kubelet/pods/27442a8c-bc16-4113-800f-4d8f3046ad99/volumes" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.220249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.220301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrgt\" (UniqueName: \"kubernetes.io/projected/43e09a5c-6892-4ff5-9c5b-a30556a193ed-kube-api-access-mbrgt\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.221081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.221234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.221392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.322528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.322584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.322621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbrgt\" (UniqueName: \"kubernetes.io/projected/43e09a5c-6892-4ff5-9c5b-a30556a193ed-kube-api-access-mbrgt\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.322689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.322729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.326998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.333194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.334465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.334611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.346136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbrgt\" (UniqueName: \"kubernetes.io/projected/43e09a5c-6892-4ff5-9c5b-a30556a193ed-kube-api-access-mbrgt\") pod \"nova-cell1-novncproxy-0\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.386627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.475040 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-log-httpd\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-run-httpd\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-ceilometer-tls-certs\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-combined-ca-bundle\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-scripts\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-sg-core-conf-yaml\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcnl4\" (UniqueName: \"kubernetes.io/projected/c129b2b0-9811-4d58-9b65-de842874c901-kube-api-access-wcnl4\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-config-data\") pod \"c129b2b0-9811-4d58-9b65-de842874c901\" (UID: \"c129b2b0-9811-4d58-9b65-de842874c901\") " Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.628982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: 
"c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.629008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.629506 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.629525 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c129b2b0-9811-4d58-9b65-de842874c901-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.632127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c129b2b0-9811-4d58-9b65-de842874c901-kube-api-access-wcnl4" (OuterVolumeSpecName: "kube-api-access-wcnl4") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "kube-api-access-wcnl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.632242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-scripts" (OuterVolumeSpecName: "scripts") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.647692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.665413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.694132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-config-data" (OuterVolumeSpecName: "config-data") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.701157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c129b2b0-9811-4d58-9b65-de842874c901" (UID: "c129b2b0-9811-4d58-9b65-de842874c901"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.731050 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.731071 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.731083 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.731092 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.731100 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c129b2b0-9811-4d58-9b65-de842874c901-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.731119 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcnl4\" (UniqueName: \"kubernetes.io/projected/c129b2b0-9811-4d58-9b65-de842874c901-kube-api-access-wcnl4\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:41 crc kubenswrapper[4707]: I0121 15:18:41.762181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.034570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"43e09a5c-6892-4ff5-9c5b-a30556a193ed","Type":"ContainerStarted","Data":"f42ade328f4a73251db136eabdb66bd21328d2aa0a7aed9f3d9b848f325c979b"} Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.034839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"43e09a5c-6892-4ff5-9c5b-a30556a193ed","Type":"ContainerStarted","Data":"294783e116ea9bfadd890a0f528e163107600b79c2a6f6bbc76eda62332af07d"} Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.037487 4707 generic.go:334] "Generic (PLEG): container finished" podID="c129b2b0-9811-4d58-9b65-de842874c901" containerID="21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97" exitCode=0 Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.037508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerDied","Data":"21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97"} Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 
15:18:42.037538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c129b2b0-9811-4d58-9b65-de842874c901","Type":"ContainerDied","Data":"09245aae2040da0299fd7e54e0fc52418948cf1bf1f9df100fe5bdacf17693a4"} Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.037556 4707 scope.go:117] "RemoveContainer" containerID="38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.037578 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.048675 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.048661149 podStartE2EDuration="1.048661149s" podCreationTimestamp="2026-01-21 15:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:42.046298725 +0000 UTC m=+1019.227814948" watchObservedRunningTime="2026-01-21 15:18:42.048661149 +0000 UTC m=+1019.230177371" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.060134 4707 scope.go:117] "RemoveContainer" containerID="485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.064008 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.071092 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.084597 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.085317 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-central-agent" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085342 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-central-agent" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.085359 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="proxy-httpd" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085365 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="proxy-httpd" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.085376 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-notification-agent" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085382 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-notification-agent" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.085388 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="sg-core" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085393 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="sg-core" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085532 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-central-agent" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085544 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="sg-core" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085555 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="proxy-httpd" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.085565 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c129b2b0-9811-4d58-9b65-de842874c901" containerName="ceilometer-notification-agent" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.086316 4707 scope.go:117] "RemoveContainer" containerID="21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.090292 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.091981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.092087 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.092182 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.098295 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.162861 4707 scope.go:117] "RemoveContainer" containerID="e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.181569 4707 scope.go:117] "RemoveContainer" containerID="38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.181854 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4\": container with ID starting with 38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4 not found: ID does not exist" containerID="38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.181889 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4"} err="failed to get container status \"38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4\": rpc error: code = NotFound desc = could not find container \"38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4\": container with ID starting with 38bd07ac54503c3d700225d31c3423acff57a77dee534c7fce56841f4518f8a4 not found: ID does not exist" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.181914 4707 scope.go:117] "RemoveContainer" containerID="485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.182229 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009\": container with ID starting with 485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009 not found: ID does not exist" containerID="485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.182255 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009"} err="failed to get container status \"485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009\": rpc error: code = NotFound desc = could not find container \"485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009\": container with ID starting with 485b86c695eb9d603200784fedb4be8cde2cea72bca127a8d11c60b4aed35009 not found: ID does not exist" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.182275 4707 scope.go:117] "RemoveContainer" containerID="21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.182664 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97\": container with ID starting with 21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97 not found: ID does not exist" containerID="21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.182684 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97"} err="failed to get container status \"21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97\": rpc error: code = NotFound desc = could not find container \"21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97\": container with ID starting with 21330dbcc39ae1985d5c3beb86869bfbdb5396a75ae2a9b0c718774a7d32ab97 not found: ID does not exist" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.182699 4707 scope.go:117] "RemoveContainer" containerID="e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13" Jan 21 15:18:42 crc kubenswrapper[4707]: E0121 15:18:42.182895 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13\": container with ID starting with e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13 not found: ID does not exist" containerID="e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.182917 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13"} err="failed to get container status \"e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13\": rpc error: code = NotFound desc = could not find container \"e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13\": container with ID starting with e0f55d1595e9b4aa6603a3236983cfff1cd8078570a761195ea14161cfae1a13 not found: ID does not exist" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.239849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wjkmg\" (UniqueName: \"kubernetes.io/projected/b16efd30-cb63-4031-aab4-ec6ca21d7528-kube-api-access-wjkmg\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-config-data\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-scripts\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-run-httpd\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-log-httpd\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.240519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjkmg\" (UniqueName: \"kubernetes.io/projected/b16efd30-cb63-4031-aab4-ec6ca21d7528-kube-api-access-wjkmg\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-config-data\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-scripts\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-run-httpd\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-log-httpd\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.343913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.344425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-log-httpd\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.344554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-run-httpd\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.349267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.349290 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-scripts\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.350097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-config-data\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.356142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.356755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.359949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjkmg\" (UniqueName: \"kubernetes.io/projected/b16efd30-cb63-4031-aab4-ec6ca21d7528-kube-api-access-wjkmg\") pod \"ceilometer-0\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.413237 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.467557 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.546566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zmp4\" (UniqueName: \"kubernetes.io/projected/63acb604-a9d8-4dd4-8898-67365790ab8d-kube-api-access-7zmp4\") pod \"63acb604-a9d8-4dd4-8898-67365790ab8d\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.546683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-config-data\") pod \"63acb604-a9d8-4dd4-8898-67365790ab8d\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.546880 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-combined-ca-bundle\") pod \"63acb604-a9d8-4dd4-8898-67365790ab8d\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.547001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63acb604-a9d8-4dd4-8898-67365790ab8d-logs\") pod \"63acb604-a9d8-4dd4-8898-67365790ab8d\" (UID: \"63acb604-a9d8-4dd4-8898-67365790ab8d\") " Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.547355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63acb604-a9d8-4dd4-8898-67365790ab8d-logs" (OuterVolumeSpecName: "logs") pod "63acb604-a9d8-4dd4-8898-67365790ab8d" (UID: "63acb604-a9d8-4dd4-8898-67365790ab8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.547792 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63acb604-a9d8-4dd4-8898-67365790ab8d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.556164 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63acb604-a9d8-4dd4-8898-67365790ab8d-kube-api-access-7zmp4" (OuterVolumeSpecName: "kube-api-access-7zmp4") pod "63acb604-a9d8-4dd4-8898-67365790ab8d" (UID: "63acb604-a9d8-4dd4-8898-67365790ab8d"). InnerVolumeSpecName "kube-api-access-7zmp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.567152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63acb604-a9d8-4dd4-8898-67365790ab8d" (UID: "63acb604-a9d8-4dd4-8898-67365790ab8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.573767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-config-data" (OuterVolumeSpecName: "config-data") pod "63acb604-a9d8-4dd4-8898-67365790ab8d" (UID: "63acb604-a9d8-4dd4-8898-67365790ab8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.649750 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.649792 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zmp4\" (UniqueName: \"kubernetes.io/projected/63acb604-a9d8-4dd4-8898-67365790ab8d-kube-api-access-7zmp4\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.649805 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63acb604-a9d8-4dd4-8898-67365790ab8d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:42 crc kubenswrapper[4707]: I0121 15:18:42.858201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.047315 4707 generic.go:334] "Generic (PLEG): container finished" podID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerID="512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9" exitCode=0 Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.047363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"63acb604-a9d8-4dd4-8898-67365790ab8d","Type":"ContainerDied","Data":"512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9"} Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.047385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"63acb604-a9d8-4dd4-8898-67365790ab8d","Type":"ContainerDied","Data":"b7637115018493a92d8bc1d7e8521a48e1fb82b46a60752ac65b1a2df1a190e2"} Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.047402 4707 scope.go:117] "RemoveContainer" containerID="512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.047475 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.052038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerStarted","Data":"a7277f4a240e0d1d1cc9dbdec25ae84f35054764e8e6314a29c9df618f186901"} Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.070625 4707 scope.go:117] "RemoveContainer" containerID="501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.080892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.085658 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.091090 4707 scope.go:117] "RemoveContainer" containerID="512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9" Jan 21 15:18:43 crc kubenswrapper[4707]: E0121 15:18:43.091505 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9\": container with ID starting with 512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9 not found: ID does not exist" containerID="512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.091528 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9"} err="failed to get container status \"512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9\": rpc error: code = NotFound desc = could not find container \"512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9\": container with ID starting with 512c83ab99b477a6d948d34ec41cf8487961c04f784422a4bf62040718bceab9 not found: ID does not exist" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.091546 4707 scope.go:117] "RemoveContainer" containerID="501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea" Jan 21 15:18:43 crc kubenswrapper[4707]: E0121 15:18:43.091926 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea\": container with ID starting with 501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea not found: ID does not exist" containerID="501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.091952 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea"} err="failed to get container status \"501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea\": rpc error: code = NotFound desc = could not find container \"501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea\": container with ID starting with 501533eb6195958bd47e13ce257cd9f8a05bd170f28104ff0a9d44a7c167aeea not found: ID does not exist" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.098212 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:43 crc kubenswrapper[4707]: E0121 
15:18:43.098616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-log" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.098638 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-log" Jan 21 15:18:43 crc kubenswrapper[4707]: E0121 15:18:43.098646 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-api" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.098653 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-api" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.098895 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-api" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.098919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" containerName="nova-api-log" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.099786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.101836 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.102107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.102269 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.106513 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.157551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3b2c03-674f-48ab-a80c-b3e99cefb265-logs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.157601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-config-data\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.157672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56zw\" (UniqueName: \"kubernetes.io/projected/cb3b2c03-674f-48ab-a80c-b3e99cefb265-kube-api-access-m56zw\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.157715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc 
kubenswrapper[4707]: I0121 15:18:43.157787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-public-tls-certs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.157956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.190802 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63acb604-a9d8-4dd4-8898-67365790ab8d" path="/var/lib/kubelet/pods/63acb604-a9d8-4dd4-8898-67365790ab8d/volumes" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.191598 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c129b2b0-9811-4d58-9b65-de842874c901" path="/var/lib/kubelet/pods/c129b2b0-9811-4d58-9b65-de842874c901/volumes" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.259517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.259586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-public-tls-certs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.259668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.259736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3b2c03-674f-48ab-a80c-b3e99cefb265-logs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.259752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-config-data\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.259779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56zw\" (UniqueName: \"kubernetes.io/projected/cb3b2c03-674f-48ab-a80c-b3e99cefb265-kube-api-access-m56zw\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.260538 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3b2c03-674f-48ab-a80c-b3e99cefb265-logs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.262928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.265683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.265773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.265893 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.272897 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-public-tls-certs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.273676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-config-data\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.274232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.276725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56zw\" (UniqueName: \"kubernetes.io/projected/cb3b2c03-674f-48ab-a80c-b3e99cefb265-kube-api-access-m56zw\") pod \"nova-api-0\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.411714 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:43 crc kubenswrapper[4707]: I0121 15:18:43.774211 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:18:43 crc kubenswrapper[4707]: W0121 15:18:43.775625 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3b2c03_674f_48ab_a80c_b3e99cefb265.slice/crio-72476484ee919b456b5d721173c75a22fc6ee3fccd6ee4ab9dfd6e1627063b00 WatchSource:0}: Error finding container 72476484ee919b456b5d721173c75a22fc6ee3fccd6ee4ab9dfd6e1627063b00: Status 404 returned error can't find the container with id 72476484ee919b456b5d721173c75a22fc6ee3fccd6ee4ab9dfd6e1627063b00 Jan 21 15:18:44 crc kubenswrapper[4707]: I0121 15:18:44.060310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerStarted","Data":"3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211"} Jan 21 15:18:44 crc kubenswrapper[4707]: I0121 15:18:44.062424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb3b2c03-674f-48ab-a80c-b3e99cefb265","Type":"ContainerStarted","Data":"207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773"} Jan 21 15:18:44 crc kubenswrapper[4707]: I0121 15:18:44.062451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb3b2c03-674f-48ab-a80c-b3e99cefb265","Type":"ContainerStarted","Data":"79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4"} Jan 21 15:18:44 crc kubenswrapper[4707]: I0121 15:18:44.062461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb3b2c03-674f-48ab-a80c-b3e99cefb265","Type":"ContainerStarted","Data":"72476484ee919b456b5d721173c75a22fc6ee3fccd6ee4ab9dfd6e1627063b00"} Jan 21 15:18:44 crc kubenswrapper[4707]: I0121 15:18:44.088422 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.088408484 podStartE2EDuration="1.088408484s" podCreationTimestamp="2026-01-21 15:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:18:44.07870005 +0000 UTC m=+1021.260216273" watchObservedRunningTime="2026-01-21 15:18:44.088408484 +0000 UTC m=+1021.269924705" Jan 21 15:18:45 crc kubenswrapper[4707]: I0121 15:18:45.081322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerStarted","Data":"d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558"} Jan 21 15:18:46 crc kubenswrapper[4707]: I0121 15:18:46.087991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerStarted","Data":"823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8"} Jan 21 15:18:46 crc kubenswrapper[4707]: I0121 15:18:46.387773 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:47 crc kubenswrapper[4707]: I0121 15:18:47.097731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerStarted","Data":"9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c"} Jan 21 15:18:47 crc kubenswrapper[4707]: I0121 15:18:47.098021 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:18:47 crc kubenswrapper[4707]: I0121 15:18:47.121217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.44416113 podStartE2EDuration="5.121203016s" podCreationTimestamp="2026-01-21 15:18:42 +0000 UTC" firstStartedPulling="2026-01-21 15:18:42.857923249 +0000 UTC m=+1020.039439472" lastFinishedPulling="2026-01-21 15:18:46.534965137 +0000 UTC m=+1023.716481358" observedRunningTime="2026-01-21 15:18:47.111230816 +0000 UTC m=+1024.292747038" watchObservedRunningTime="2026-01-21 15:18:47.121203016 +0000 UTC m=+1024.302719238" Jan 21 15:18:51 crc kubenswrapper[4707]: I0121 15:18:51.387862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:51 crc kubenswrapper[4707]: I0121 15:18:51.402669 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.145969 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.238615 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k"] Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.239629 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.241239 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.242779 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.254567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k"] Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.306408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-scripts\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.306556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8s9b\" (UniqueName: \"kubernetes.io/projected/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-kube-api-access-d8s9b\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.306618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-config-data\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.306748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.408365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8s9b\" (UniqueName: \"kubernetes.io/projected/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-kube-api-access-d8s9b\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.408426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-config-data\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.408514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc 
kubenswrapper[4707]: I0121 15:18:52.408566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-scripts\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.413475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-config-data\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.413499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.414564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-scripts\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.422757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8s9b\" (UniqueName: \"kubernetes.io/projected/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-kube-api-access-d8s9b\") pod \"nova-cell1-cell-mapping-6wh8k\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.554566 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:52 crc kubenswrapper[4707]: I0121 15:18:52.912996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k"] Jan 21 15:18:53 crc kubenswrapper[4707]: I0121 15:18:53.141440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" event={"ID":"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c","Type":"ContainerStarted","Data":"ea553eff67dde957d1a586908e14078e3b873dcfe703ba1e658cd6842e6dbe17"} Jan 21 15:18:53 crc kubenswrapper[4707]: I0121 15:18:53.141490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" event={"ID":"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c","Type":"ContainerStarted","Data":"a3b947bce06fab6d1e0fda88ba01f474f5fe673d7ca957a40c5be845cfdac36e"} Jan 21 15:18:53 crc kubenswrapper[4707]: I0121 15:18:53.411944 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:53 crc kubenswrapper[4707]: I0121 15:18:53.412283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:18:54 crc kubenswrapper[4707]: I0121 15:18:54.432636 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:18:54 crc kubenswrapper[4707]: I0121 15:18:54.432956 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:18:58 crc kubenswrapper[4707]: I0121 15:18:58.185325 4707 generic.go:334] "Generic (PLEG): container finished" podID="dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" containerID="ea553eff67dde957d1a586908e14078e3b873dcfe703ba1e658cd6842e6dbe17" exitCode=0 Jan 21 15:18:58 crc kubenswrapper[4707]: I0121 15:18:58.185422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" event={"ID":"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c","Type":"ContainerDied","Data":"ea553eff67dde957d1a586908e14078e3b873dcfe703ba1e658cd6842e6dbe17"} Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.495388 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.618669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-config-data\") pod \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.618939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8s9b\" (UniqueName: \"kubernetes.io/projected/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-kube-api-access-d8s9b\") pod \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.619083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-combined-ca-bundle\") pod \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.619293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-scripts\") pod \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\" (UID: \"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c\") " Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.623031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-scripts" (OuterVolumeSpecName: "scripts") pod "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" (UID: "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.624236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-kube-api-access-d8s9b" (OuterVolumeSpecName: "kube-api-access-d8s9b") pod "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" (UID: "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c"). InnerVolumeSpecName "kube-api-access-d8s9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.638511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" (UID: "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.638703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-config-data" (OuterVolumeSpecName: "config-data") pod "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" (UID: "dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.720747 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8s9b\" (UniqueName: \"kubernetes.io/projected/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-kube-api-access-d8s9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.720775 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.720784 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:18:59 crc kubenswrapper[4707]: I0121 15:18:59.720792 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.199342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" event={"ID":"dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c","Type":"ContainerDied","Data":"a3b947bce06fab6d1e0fda88ba01f474f5fe673d7ca957a40c5be845cfdac36e"} Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.199374 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k" Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.199384 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3b947bce06fab6d1e0fda88ba01f474f5fe673d7ca957a40c5be845cfdac36e" Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.356186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.356385 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-log" containerID="cri-o://79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4" gracePeriod=30 Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.356437 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-api" containerID="cri-o://207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773" gracePeriod=30 Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.366935 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.367160 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="551e55fc-c235-41c7-8963-da7a6ebe4082" containerName="nova-scheduler-scheduler" containerID="cri-o://04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540" gracePeriod=30 Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.387884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.388111 4707 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-log" containerID="cri-o://6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a" gracePeriod=30 Jan 21 15:19:00 crc kubenswrapper[4707]: I0121 15:19:00.388207 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-metadata" containerID="cri-o://a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085" gracePeriod=30 Jan 21 15:19:01 crc kubenswrapper[4707]: I0121 15:19:01.207277 4707 generic.go:334] "Generic (PLEG): container finished" podID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerID="6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a" exitCode=143 Jan 21 15:19:01 crc kubenswrapper[4707]: I0121 15:19:01.207358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"12a9ec4e-857c-43a8-b31e-dc38360c6527","Type":"ContainerDied","Data":"6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a"} Jan 21 15:19:01 crc kubenswrapper[4707]: I0121 15:19:01.209888 4707 generic.go:334] "Generic (PLEG): container finished" podID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerID="79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4" exitCode=143 Jan 21 15:19:01 crc kubenswrapper[4707]: I0121 15:19:01.209950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb3b2c03-674f-48ab-a80c-b3e99cefb265","Type":"ContainerDied","Data":"79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4"} Jan 21 15:19:02 crc kubenswrapper[4707]: E0121 15:19:02.228343 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:19:02 crc kubenswrapper[4707]: E0121 15:19:02.229350 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:19:02 crc kubenswrapper[4707]: E0121 15:19:02.230368 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:19:02 crc kubenswrapper[4707]: E0121 15:19:02.230398 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="551e55fc-c235-41c7-8963-da7a6ebe4082" containerName="nova-scheduler-scheduler" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.504586 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:56226->10.217.0.177:8775: read: connection reset by peer" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.504616 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:56220->10.217.0.177:8775: read: connection reset by peer" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.921093 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.925330 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.990983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-internal-tls-certs\") pod \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991028 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-combined-ca-bundle\") pod \"12a9ec4e-857c-43a8-b31e-dc38360c6527\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-public-tls-certs\") pod \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3b2c03-674f-48ab-a80c-b3e99cefb265-logs\") pod \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-nova-metadata-tls-certs\") pod \"12a9ec4e-857c-43a8-b31e-dc38360c6527\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkrxd\" (UniqueName: \"kubernetes.io/projected/12a9ec4e-857c-43a8-b31e-dc38360c6527-kube-api-access-gkrxd\") pod \"12a9ec4e-857c-43a8-b31e-dc38360c6527\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-config-data\") pod \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991307 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-config-data\") pod \"12a9ec4e-857c-43a8-b31e-dc38360c6527\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56zw\" (UniqueName: \"kubernetes.io/projected/cb3b2c03-674f-48ab-a80c-b3e99cefb265-kube-api-access-m56zw\") pod \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a9ec4e-857c-43a8-b31e-dc38360c6527-logs\") pod \"12a9ec4e-857c-43a8-b31e-dc38360c6527\" (UID: \"12a9ec4e-857c-43a8-b31e-dc38360c6527\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.991387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-combined-ca-bundle\") pod \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\" (UID: \"cb3b2c03-674f-48ab-a80c-b3e99cefb265\") " Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.992488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12a9ec4e-857c-43a8-b31e-dc38360c6527-logs" (OuterVolumeSpecName: "logs") pod "12a9ec4e-857c-43a8-b31e-dc38360c6527" (UID: "12a9ec4e-857c-43a8-b31e-dc38360c6527"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.995642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3b2c03-674f-48ab-a80c-b3e99cefb265-logs" (OuterVolumeSpecName: "logs") pod "cb3b2c03-674f-48ab-a80c-b3e99cefb265" (UID: "cb3b2c03-674f-48ab-a80c-b3e99cefb265"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:03 crc kubenswrapper[4707]: I0121 15:19:03.999925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a9ec4e-857c-43a8-b31e-dc38360c6527-kube-api-access-gkrxd" (OuterVolumeSpecName: "kube-api-access-gkrxd") pod "12a9ec4e-857c-43a8-b31e-dc38360c6527" (UID: "12a9ec4e-857c-43a8-b31e-dc38360c6527"). InnerVolumeSpecName "kube-api-access-gkrxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.002507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3b2c03-674f-48ab-a80c-b3e99cefb265-kube-api-access-m56zw" (OuterVolumeSpecName: "kube-api-access-m56zw") pod "cb3b2c03-674f-48ab-a80c-b3e99cefb265" (UID: "cb3b2c03-674f-48ab-a80c-b3e99cefb265"). InnerVolumeSpecName "kube-api-access-m56zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.017287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-config-data" (OuterVolumeSpecName: "config-data") pod "cb3b2c03-674f-48ab-a80c-b3e99cefb265" (UID: "cb3b2c03-674f-48ab-a80c-b3e99cefb265"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.019995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12a9ec4e-857c-43a8-b31e-dc38360c6527" (UID: "12a9ec4e-857c-43a8-b31e-dc38360c6527"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.020399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb3b2c03-674f-48ab-a80c-b3e99cefb265" (UID: "cb3b2c03-674f-48ab-a80c-b3e99cefb265"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.025927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-config-data" (OuterVolumeSpecName: "config-data") pod "12a9ec4e-857c-43a8-b31e-dc38360c6527" (UID: "12a9ec4e-857c-43a8-b31e-dc38360c6527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.040953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb3b2c03-674f-48ab-a80c-b3e99cefb265" (UID: "cb3b2c03-674f-48ab-a80c-b3e99cefb265"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.042487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cb3b2c03-674f-48ab-a80c-b3e99cefb265" (UID: "cb3b2c03-674f-48ab-a80c-b3e99cefb265"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.048224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "12a9ec4e-857c-43a8-b31e-dc38360c6527" (UID: "12a9ec4e-857c-43a8-b31e-dc38360c6527"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093274 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56zw\" (UniqueName: \"kubernetes.io/projected/cb3b2c03-674f-48ab-a80c-b3e99cefb265-kube-api-access-m56zw\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093302 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a9ec4e-857c-43a8-b31e-dc38360c6527-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093312 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093320 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093328 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093335 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093344 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3b2c03-674f-48ab-a80c-b3e99cefb265-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093351 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093360 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkrxd\" (UniqueName: \"kubernetes.io/projected/12a9ec4e-857c-43a8-b31e-dc38360c6527-kube-api-access-gkrxd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093368 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3b2c03-674f-48ab-a80c-b3e99cefb265-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.093395 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a9ec4e-857c-43a8-b31e-dc38360c6527-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.228925 4707 generic.go:334] "Generic (PLEG): container finished" podID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerID="a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085" exitCode=0 Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.228960 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.228989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"12a9ec4e-857c-43a8-b31e-dc38360c6527","Type":"ContainerDied","Data":"a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085"} Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.229031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"12a9ec4e-857c-43a8-b31e-dc38360c6527","Type":"ContainerDied","Data":"bb70124027dccd1ed8810ae6a481045d7877b3f03aa77152e651a8587b6a68bb"} Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.229048 4707 scope.go:117] "RemoveContainer" containerID="a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.231113 4707 generic.go:334] "Generic (PLEG): container finished" podID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerID="207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773" exitCode=0 Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.231159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb3b2c03-674f-48ab-a80c-b3e99cefb265","Type":"ContainerDied","Data":"207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773"} Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.231166 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.231181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb3b2c03-674f-48ab-a80c-b3e99cefb265","Type":"ContainerDied","Data":"72476484ee919b456b5d721173c75a22fc6ee3fccd6ee4ab9dfd6e1627063b00"} Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.250650 4707 scope.go:117] "RemoveContainer" containerID="6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.252565 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.259909 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.266580 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.277055 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.277493 4707 scope.go:117] "RemoveContainer" containerID="a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.277951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085\": container with ID starting with a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085 not found: ID does not exist" containerID="a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.277992 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085"} err="failed to get container status \"a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085\": rpc error: code = NotFound desc = could not find container \"a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085\": container with ID starting with a68eea306b34ebbacad6797ab5238d7ba18c5abfca18491b13d05434f7e35085 not found: ID does not exist" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.278015 4707 scope.go:117] "RemoveContainer" containerID="6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.278419 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a\": container with ID starting with 6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a not found: ID does not exist" containerID="6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.278455 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a"} err="failed to get container status \"6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a\": rpc error: code = NotFound desc = could not find container \"6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a\": container with ID starting with 6fc0eb1fd0b451816978ff60e1ac5edee094182447ae93ab575a445fcaedaf2a not found: ID does not exist" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.278480 4707 scope.go:117] "RemoveContainer" containerID="207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284243 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.284571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-log" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284589 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-log" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.284598 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" containerName="nova-manage" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284604 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" containerName="nova-manage" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.284618 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-log" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284625 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-log" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.284633 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-metadata" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284638 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-metadata" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.284659 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-api" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284664 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-api" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284834 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-log" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" containerName="nova-manage" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284852 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" containerName="nova-api-api" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284858 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-log" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.284869 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" containerName="nova-metadata-metadata" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.285700 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.287613 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.287780 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.291744 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.293202 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.297315 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.297856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.297968 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.300016 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.303642 4707 scope.go:117] "RemoveContainer" containerID="79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.303739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.323307 4707 scope.go:117] "RemoveContainer" containerID="207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.323743 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773\": container with ID starting with 207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773 not found: ID does not exist" containerID="207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.323782 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773"} err="failed to get container status \"207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773\": rpc error: code = NotFound desc = could not find container \"207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773\": container with ID starting with 207c46e46cb620cae2fda9615c22173d193341eb976d5152b9ddf883029b0773 not found: ID does not exist" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.323853 4707 scope.go:117] "RemoveContainer" containerID="79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4" Jan 21 15:19:04 crc kubenswrapper[4707]: E0121 15:19:04.324288 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4\": container with ID starting with 79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4 not found: ID does not exist" containerID="79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.324335 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4"} err="failed to get container status \"79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4\": rpc error: code = NotFound desc = could not find container \"79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4\": container with ID starting with 79533eb838be446f5a438ef80f9d5c075fbaf4370e0af450cedefaeb6b7c9ec4 not found: ID 
does not exist" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.397875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19c4756-e139-49f4-a1e8-99afc8ed8088-logs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.397949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mgj\" (UniqueName: \"kubernetes.io/projected/a19c4756-e139-49f4-a1e8-99afc8ed8088-kube-api-access-55mgj\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.397996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sllq\" (UniqueName: \"kubernetes.io/projected/bffc824e-1c16-4342-b73f-91205d93c451-kube-api-access-2sllq\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bffc824e-1c16-4342-b73f-91205d93c451-logs\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-config-data\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-config-data\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc 
kubenswrapper[4707]: I0121 15:19:04.398246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-public-tls-certs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.398290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19c4756-e139-49f4-a1e8-99afc8ed8088-logs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mgj\" (UniqueName: \"kubernetes.io/projected/a19c4756-e139-49f4-a1e8-99afc8ed8088-kube-api-access-55mgj\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sllq\" (UniqueName: \"kubernetes.io/projected/bffc824e-1c16-4342-b73f-91205d93c451-kube-api-access-2sllq\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bffc824e-1c16-4342-b73f-91205d93c451-logs\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19c4756-e139-49f4-a1e8-99afc8ed8088-logs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-config-data\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-config-data\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.499877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-public-tls-certs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.500030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.500105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bffc824e-1c16-4342-b73f-91205d93c451-logs\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.502386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.502527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.503172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-public-tls-certs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.503470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 
15:19:04.504002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-config-data\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.504141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-config-data\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.504443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.512300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mgj\" (UniqueName: \"kubernetes.io/projected/a19c4756-e139-49f4-a1e8-99afc8ed8088-kube-api-access-55mgj\") pod \"nova-api-0\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.512797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sllq\" (UniqueName: \"kubernetes.io/projected/bffc824e-1c16-4342-b73f-91205d93c451-kube-api-access-2sllq\") pod \"nova-metadata-0\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.599036 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.613624 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:04 crc kubenswrapper[4707]: I0121 15:19:04.994452 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:04 crc kubenswrapper[4707]: W0121 15:19:04.995914 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbffc824e_1c16_4342_b73f_91205d93c451.slice/crio-2da51b0b7c1edc422a0096f34b4bd252cf641d3bf0f0719d394ad58642f44597 WatchSource:0}: Error finding container 2da51b0b7c1edc422a0096f34b4bd252cf641d3bf0f0719d394ad58642f44597: Status 404 returned error can't find the container with id 2da51b0b7c1edc422a0096f34b4bd252cf641d3bf0f0719d394ad58642f44597 Jan 21 15:19:05 crc kubenswrapper[4707]: W0121 15:19:05.063337 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda19c4756_e139_49f4_a1e8_99afc8ed8088.slice/crio-5576423c75052e9487e5da2c85414974ae20e4dcf5a9378746d80cc70a1979b8 WatchSource:0}: Error finding container 5576423c75052e9487e5da2c85414974ae20e4dcf5a9378746d80cc70a1979b8: Status 404 returned error can't find the container with id 5576423c75052e9487e5da2c85414974ae20e4dcf5a9378746d80cc70a1979b8 Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.063936 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.190338 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a9ec4e-857c-43a8-b31e-dc38360c6527" path="/var/lib/kubelet/pods/12a9ec4e-857c-43a8-b31e-dc38360c6527/volumes" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.191613 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3b2c03-674f-48ab-a80c-b3e99cefb265" path="/var/lib/kubelet/pods/cb3b2c03-674f-48ab-a80c-b3e99cefb265/volumes" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.240061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a19c4756-e139-49f4-a1e8-99afc8ed8088","Type":"ContainerStarted","Data":"68f94dbce7c88a21b49fc9a3b3510d01bf3d9de428d0ff716630f26929ab3d41"} Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.240087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a19c4756-e139-49f4-a1e8-99afc8ed8088","Type":"ContainerStarted","Data":"5576423c75052e9487e5da2c85414974ae20e4dcf5a9378746d80cc70a1979b8"} Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.241043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bffc824e-1c16-4342-b73f-91205d93c451","Type":"ContainerStarted","Data":"4f9de40936195255f1db32b4c266a0cc65cf32487afa43b4821fdb159dd1e243"} Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.241072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bffc824e-1c16-4342-b73f-91205d93c451","Type":"ContainerStarted","Data":"2da51b0b7c1edc422a0096f34b4bd252cf641d3bf0f0719d394ad58642f44597"} Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.242186 4707 generic.go:334] "Generic (PLEG): container finished" podID="551e55fc-c235-41c7-8963-da7a6ebe4082" containerID="04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540" exitCode=0 Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.242212 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"551e55fc-c235-41c7-8963-da7a6ebe4082","Type":"ContainerDied","Data":"04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540"} Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.297092 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.421942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-config-data\") pod \"551e55fc-c235-41c7-8963-da7a6ebe4082\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.422274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-combined-ca-bundle\") pod \"551e55fc-c235-41c7-8963-da7a6ebe4082\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.422373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrfg9\" (UniqueName: \"kubernetes.io/projected/551e55fc-c235-41c7-8963-da7a6ebe4082-kube-api-access-qrfg9\") pod \"551e55fc-c235-41c7-8963-da7a6ebe4082\" (UID: \"551e55fc-c235-41c7-8963-da7a6ebe4082\") " Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.427082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551e55fc-c235-41c7-8963-da7a6ebe4082-kube-api-access-qrfg9" (OuterVolumeSpecName: "kube-api-access-qrfg9") pod "551e55fc-c235-41c7-8963-da7a6ebe4082" (UID: "551e55fc-c235-41c7-8963-da7a6ebe4082"). InnerVolumeSpecName "kube-api-access-qrfg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.442869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-config-data" (OuterVolumeSpecName: "config-data") pod "551e55fc-c235-41c7-8963-da7a6ebe4082" (UID: "551e55fc-c235-41c7-8963-da7a6ebe4082"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.442921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "551e55fc-c235-41c7-8963-da7a6ebe4082" (UID: "551e55fc-c235-41c7-8963-da7a6ebe4082"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.524869 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrfg9\" (UniqueName: \"kubernetes.io/projected/551e55fc-c235-41c7-8963-da7a6ebe4082-kube-api-access-qrfg9\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.524901 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:05 crc kubenswrapper[4707]: I0121 15:19:05.524911 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551e55fc-c235-41c7-8963-da7a6ebe4082-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.249508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bffc824e-1c16-4342-b73f-91205d93c451","Type":"ContainerStarted","Data":"132d4a7b6c8eb04012605d47a3c3a13b3ca5900e48ed093c7525683f623c3630"} Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.251687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.251688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"551e55fc-c235-41c7-8963-da7a6ebe4082","Type":"ContainerDied","Data":"c23fbc4dbd806e45ac757c64bb779c4bb98c0dab221153bbee750d52a4c37c0d"} Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.251829 4707 scope.go:117] "RemoveContainer" containerID="04b05a7b3e7a0eda6bd36b474347f33248746fb143327226a015fb3e14bac540" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.253515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a19c4756-e139-49f4-a1e8-99afc8ed8088","Type":"ContainerStarted","Data":"c48740a97a789fa8e24952640d3171c5efb85d529be89bc23207d43578d4750d"} Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.266058 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.266042624 podStartE2EDuration="2.266042624s" podCreationTimestamp="2026-01-21 15:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:06.261360786 +0000 UTC m=+1043.442877008" watchObservedRunningTime="2026-01-21 15:19:06.266042624 +0000 UTC m=+1043.447558846" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.279245 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.279230763 podStartE2EDuration="2.279230763s" podCreationTimestamp="2026-01-21 15:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:06.277020626 +0000 UTC m=+1043.458536848" watchObservedRunningTime="2026-01-21 15:19:06.279230763 +0000 UTC m=+1043.460746985" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.295421 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.301882 4707 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.309645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:06 crc kubenswrapper[4707]: E0121 15:19:06.309988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551e55fc-c235-41c7-8963-da7a6ebe4082" containerName="nova-scheduler-scheduler" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.310005 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="551e55fc-c235-41c7-8963-da7a6ebe4082" containerName="nova-scheduler-scheduler" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.310195 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="551e55fc-c235-41c7-8963-da7a6ebe4082" containerName="nova-scheduler-scheduler" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.310727 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.312081 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.315025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.438866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.439020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-config-data\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.439093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbb6\" (UniqueName: \"kubernetes.io/projected/290dd378-ee4b-464d-9321-2491ef417d10-kube-api-access-mcbb6\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.540761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.540849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-config-data\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.540890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcbb6\" (UniqueName: 
\"kubernetes.io/projected/290dd378-ee4b-464d-9321-2491ef417d10-kube-api-access-mcbb6\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.543666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.543674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-config-data\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.552775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcbb6\" (UniqueName: \"kubernetes.io/projected/290dd378-ee4b-464d-9321-2491ef417d10-kube-api-access-mcbb6\") pod \"nova-scheduler-0\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:06 crc kubenswrapper[4707]: I0121 15:19:06.623261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:07 crc kubenswrapper[4707]: I0121 15:19:07.010592 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:07 crc kubenswrapper[4707]: W0121 15:19:07.012481 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290dd378_ee4b_464d_9321_2491ef417d10.slice/crio-011230ad515ad40a394ff943baa5f2d0d858e3fef0e10aa6039fcac21c55d075 WatchSource:0}: Error finding container 011230ad515ad40a394ff943baa5f2d0d858e3fef0e10aa6039fcac21c55d075: Status 404 returned error can't find the container with id 011230ad515ad40a394ff943baa5f2d0d858e3fef0e10aa6039fcac21c55d075 Jan 21 15:19:07 crc kubenswrapper[4707]: I0121 15:19:07.189753 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551e55fc-c235-41c7-8963-da7a6ebe4082" path="/var/lib/kubelet/pods/551e55fc-c235-41c7-8963-da7a6ebe4082/volumes" Jan 21 15:19:07 crc kubenswrapper[4707]: I0121 15:19:07.262646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"290dd378-ee4b-464d-9321-2491ef417d10","Type":"ContainerStarted","Data":"c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a"} Jan 21 15:19:07 crc kubenswrapper[4707]: I0121 15:19:07.262696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"290dd378-ee4b-464d-9321-2491ef417d10","Type":"ContainerStarted","Data":"011230ad515ad40a394ff943baa5f2d0d858e3fef0e10aa6039fcac21c55d075"} Jan 21 15:19:07 crc kubenswrapper[4707]: I0121 15:19:07.275934 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.275921805 podStartE2EDuration="1.275921805s" podCreationTimestamp="2026-01-21 15:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:07.273325461 
+0000 UTC m=+1044.454841682" watchObservedRunningTime="2026-01-21 15:19:07.275921805 +0000 UTC m=+1044.457438027" Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.600172 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.600484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.945208 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.945419 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.945454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.945868 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf1b4064e4934cd4c43802ba3e90cfa41897c007fa175bcbffaf2760e553c708"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:19:09 crc kubenswrapper[4707]: I0121 15:19:09.945924 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://cf1b4064e4934cd4c43802ba3e90cfa41897c007fa175bcbffaf2760e553c708" gracePeriod=600 Jan 21 15:19:10 crc kubenswrapper[4707]: I0121 15:19:10.293662 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="cf1b4064e4934cd4c43802ba3e90cfa41897c007fa175bcbffaf2760e553c708" exitCode=0 Jan 21 15:19:10 crc kubenswrapper[4707]: I0121 15:19:10.293731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"cf1b4064e4934cd4c43802ba3e90cfa41897c007fa175bcbffaf2760e553c708"} Jan 21 15:19:10 crc kubenswrapper[4707]: I0121 15:19:10.293918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"8bb97b7a6a7cd703c4244311ac8e9e2660a8fee90a5d7f829eaf760d2141c0e2"} Jan 21 15:19:10 crc kubenswrapper[4707]: I0121 15:19:10.293938 4707 scope.go:117] "RemoveContainer" containerID="38eb7c71f338910a44fc52babc17693f7c677868f52954612f9a40b65ff03f90" Jan 21 15:19:11 crc kubenswrapper[4707]: I0121 15:19:11.624396 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:12 crc 
kubenswrapper[4707]: I0121 15:19:12.474494 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:19:14 crc kubenswrapper[4707]: I0121 15:19:14.599163 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:14 crc kubenswrapper[4707]: I0121 15:19:14.599490 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:14 crc kubenswrapper[4707]: I0121 15:19:14.613756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:14 crc kubenswrapper[4707]: I0121 15:19:14.613796 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:15 crc kubenswrapper[4707]: I0121 15:19:15.610923 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:19:15 crc kubenswrapper[4707]: I0121 15:19:15.610947 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:19:15 crc kubenswrapper[4707]: I0121 15:19:15.626946 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:19:15 crc kubenswrapper[4707]: I0121 15:19:15.626973 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:19:16 crc kubenswrapper[4707]: I0121 15:19:16.624375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:16 crc kubenswrapper[4707]: I0121 15:19:16.644840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:17 crc kubenswrapper[4707]: I0121 15:19:17.373976 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 15:19:24.604081 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 15:19:24.605246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 15:19:24.608132 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 
15:19:24.620098 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 15:19:24.620410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 15:19:24.622137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:24 crc kubenswrapper[4707]: I0121 15:19:24.625454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:25 crc kubenswrapper[4707]: I0121 15:19:25.398606 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:25 crc kubenswrapper[4707]: I0121 15:19:25.402259 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:25 crc kubenswrapper[4707]: I0121 15:19:25.402784 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.435108 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.435671 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="9220cbec-82de-4d8a-9bca-575edd0d7482" containerName="openstackclient" containerID="cri-o://30a82c874f0417a2ea868196fc295a6b7cdf393c79ed490a067806d77d373590" gracePeriod=2 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.454260 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.469653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.469870 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="290dd378-ee4b-464d-9321-2491ef417d10" containerName="nova-scheduler-scheduler" containerID="cri-o://c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.484001 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.484200 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-log" containerID="cri-o://4f9de40936195255f1db32b4c266a0cc65cf32487afa43b4821fdb159dd1e243" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.484324 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-metadata" containerID="cri-o://132d4a7b6c8eb04012605d47a3c3a13b3ca5900e48ed093c7525683f623c3630" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.491964 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.492129 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="ovn-northd" containerID="cri-o://9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.492232 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="openstack-network-exporter" containerID="cri-o://b31f4f9052448b785d44b5b6bb2e793bfd762ba6718264d80a5fa47c35b16237" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.501371 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc"] Jan 21 15:19:29 crc kubenswrapper[4707]: E0121 15:19:29.501757 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9220cbec-82de-4d8a-9bca-575edd0d7482" containerName="openstackclient" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.501769 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9220cbec-82de-4d8a-9bca-575edd0d7482" containerName="openstackclient" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.501937 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9220cbec-82de-4d8a-9bca-575edd0d7482" containerName="openstackclient" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.502767 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.517633 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.518954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.519033 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.534861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.535114 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-log" containerID="cri-o://74d9d54b14de2c921f565409d9067db30aa7eea74a9bfef927fc5a41ce19f3e5" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.535278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-httpd" containerID="cri-o://cd62a3dc2c61a7c191ccbda0e7ac771e13ca81a7f503f321bfd2768094965edd" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.537113 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.538108 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="openstack-network-exporter" containerID="cri-o://0bb066f5db13b28e6cfb66a090c7e690726483b9338472d0b4b47b4408310358" gracePeriod=300 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.545018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.556977 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.557320 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="36457603-5ef9-4f79-bf2c-11413e422937" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.565773 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.566297 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="9a17de20-67a0-4578-92a4-e19819061791" containerName="memcached" containerID="cri-o://96a5811f181efafd4ec7d45b252f716be9d17274a6d4379830931b1b93eb6d88" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.602858 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.603068 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-log" containerID="cri-o://68f94dbce7c88a21b49fc9a3b3510d01bf3d9de428d0ff716630f26929ab3d41" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.603210 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-api" 
containerID="cri-o://c48740a97a789fa8e24952640d3171c5efb85d529be89bc23207d43578d4750d" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.613955 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.615315 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.642330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twr6c\" (UniqueName: \"kubernetes.io/projected/3602ad56-25ca-4375-8eac-ea931ec48243-kube-api-access-twr6c\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.644011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.644242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.644429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.645424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ssq\" (UniqueName: \"kubernetes.io/projected/aa920435-67d6-49dd-9e83-7d14c4316fef-kube-api-access-h5ssq\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.645577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data-custom\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.645742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" 
Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.645935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa920435-67d6-49dd-9e83-7d14c4316fef-logs\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.646040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-combined-ca-bundle\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.646159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3602ad56-25ca-4375-8eac-ea931ec48243-logs\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.755926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.755972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp2x8\" (UniqueName: \"kubernetes.io/projected/e476808c-b16e-4aa7-9f4b-71e6a5c32576-kube-api-access-pp2x8\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.755995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ssq\" (UniqueName: \"kubernetes.io/projected/aa920435-67d6-49dd-9e83-7d14c4316fef-kube-api-access-h5ssq\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-public-tls-certs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data-custom\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-internal-tls-certs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa920435-67d6-49dd-9e83-7d14c4316fef-logs\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-combined-ca-bundle\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3602ad56-25ca-4375-8eac-ea931ec48243-logs\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476808c-b16e-4aa7-9f4b-71e6a5c32576-logs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data-custom\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " 
pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twr6c\" (UniqueName: \"kubernetes.io/projected/3602ad56-25ca-4375-8eac-ea931ec48243-kube-api-access-twr6c\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.756309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-combined-ca-bundle\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.765827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.767113 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.767250 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.767527 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="cinder-scheduler" containerID="cri-o://9fa2d6082708378d81429a182459dfd951edb40a1d798ddd9d54ac8fa55269de" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.767771 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f42ade328f4a73251db136eabdb66bd21328d2aa0a7aed9f3d9b848f325c979b" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.768176 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="probe" containerID="cri-o://5775ffa72419df9485ac7699c70a095e09fa60e7badfeb84512d63346e47f9f6" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.774970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3602ad56-25ca-4375-8eac-ea931ec48243-logs\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.775488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa920435-67d6-49dd-9e83-7d14c4316fef-logs\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 
21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.775511 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.779576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.781829 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0e303ace-04be-4505-9437-af54b61d41e5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c9b449437d00fe1991a9d06649066f2fc2cf5629c28ad87cf426887a6dbf94ac" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.783245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data-custom\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.794489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.797654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-combined-ca-bundle\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.813907 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.816864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.817836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.821396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ssq\" (UniqueName: \"kubernetes.io/projected/aa920435-67d6-49dd-9e83-7d14c4316fef-kube-api-access-h5ssq\") pod \"barbican-keystone-listener-5f47d8d9b-swktc\" (UID: 
\"aa920435-67d6-49dd-9e83-7d14c4316fef\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.835281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twr6c\" (UniqueName: \"kubernetes.io/projected/3602ad56-25ca-4375-8eac-ea931ec48243-kube-api-access-twr6c\") pod \"barbican-worker-77f98c6b77-r9vnd\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.858236 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.858684 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api-log" containerID="cri-o://97bae6a3ad6e694b48ede1784c23518cbd9dbfc8a5d2d44ec3582c962c55a005" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.859119 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api" containerID="cri-o://291188ab585da8ca8a57ab46efd037f8e3984e81e67c3a618faa0dd418123a81" gracePeriod=30 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.864762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.865528 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp2x8\" (UniqueName: \"kubernetes.io/projected/e476808c-b16e-4aa7-9f4b-71e6a5c32576-kube-api-access-pp2x8\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-public-tls-certs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-internal-tls-certs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476808c-b16e-4aa7-9f4b-71e6a5c32576-logs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data-custom\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.866867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-combined-ca-bundle\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.869095 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.871658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476808c-b16e-4aa7-9f4b-71e6a5c32576-logs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.880129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-public-tls-certs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.881237 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.881570 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="openstack-network-exporter" containerID="cri-o://ab66fa9636c2c8a7dc17154279e06b78d15894768f66c0e4dc03d9b7c66afb61" gracePeriod=300 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.891627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp2x8\" (UniqueName: \"kubernetes.io/projected/e476808c-b16e-4aa7-9f4b-71e6a5c32576-kube-api-access-pp2x8\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.893306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-combined-ca-bundle\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.899370 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.906095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.907858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data-custom\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.909560 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.915996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-internal-tls-certs\") pod \"barbican-api-55bdf489cb-7thg9\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.927254 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="ovsdbserver-nb" containerID="cri-o://ca42ae0b900f3e6fb7e330dad2077769f96ba0b58c3627f0f8492e7eb0037984" gracePeriod=300 Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.939854 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-fc8578799-v69q9"] Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.940877 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.969972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-public-tls-certs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-scripts\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d67e12-9371-4324-a476-3befcc10743d-logs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-combined-ca-bundle\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-config-data\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-internal-tls-certs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dqb\" (UniqueName: \"kubernetes.io/projected/73d67e12-9371-4324-a476-3befcc10743d-kube-api-access-w2dqb\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:29 crc kubenswrapper[4707]: I0121 15:19:29.970896 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:19:29 crc kubenswrapper[4707]: E0121 15:19:29.972446 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:19:29 crc 
kubenswrapper[4707]: E0121 15:19:29.988912 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.008599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fc8578799-v69q9"] Jan 21 15:19:30 crc kubenswrapper[4707]: E0121 15:19:30.032925 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:19:30 crc kubenswrapper[4707]: E0121 15:19:30.032982 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="ovn-northd" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.064864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.065134 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-log" containerID="cri-o://eccce3cbf45c52b0fa9e2d35adb70605731d496af2d57867762c823eecd5c9d6" gracePeriod=30 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.065283 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-httpd" containerID="cri-o://abbe09e446aa5427d732558aa8e1909e7b43ad26726df39c2a2cc877fe7297c3" gracePeriod=30 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-combined-ca-bundle\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-config-data\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-credential-keys\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076785 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-internal-tls-certs\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-internal-tls-certs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dqb\" (UniqueName: \"kubernetes.io/projected/73d67e12-9371-4324-a476-3befcc10743d-kube-api-access-w2dqb\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-combined-ca-bundle\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.076984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-config-data\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-public-tls-certs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-fernet-keys\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxwrf\" (UniqueName: \"kubernetes.io/projected/a533a135-2b5a-4edb-aead-4924368ab8ab-kube-api-access-mxwrf\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-public-tls-certs\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " 
pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-scripts\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-scripts\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d67e12-9371-4324-a476-3befcc10743d-logs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.077632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d67e12-9371-4324-a476-3befcc10743d-logs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.092291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-combined-ca-bundle\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.107182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-internal-tls-certs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.107305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-public-tls-certs\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.107465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-scripts\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.111009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dqb\" (UniqueName: \"kubernetes.io/projected/73d67e12-9371-4324-a476-3befcc10743d-kube-api-access-w2dqb\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc 
kubenswrapper[4707]: I0121 15:19:30.137363 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.137584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-config-data\") pod \"placement-6ddf8fc7c8-ztvhs\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.144718 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-c77875b78-c9kg9"] Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.148467 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.156842 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="ovsdbserver-sb" containerID="cri-o://ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383" gracePeriod=300 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.163927 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-c77875b78-c9kg9"] Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.185704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-combined-ca-bundle\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.185757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-config-data\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.185907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-fernet-keys\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.185948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxwrf\" (UniqueName: \"kubernetes.io/projected/a533a135-2b5a-4edb-aead-4924368ab8ab-kube-api-access-mxwrf\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.185994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-public-tls-certs\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.186072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-scripts\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.186178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-credential-keys\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.186212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-internal-tls-certs\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.191946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-combined-ca-bundle\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.193345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-public-tls-certs\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.196412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-scripts\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.196612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-credential-keys\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.200770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-internal-tls-certs\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.200771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-fernet-keys\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.211728 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.212653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-config-data\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.215877 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.217186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxwrf\" (UniqueName: \"kubernetes.io/projected/a533a135-2b5a-4edb-aead-4924368ab8ab-kube-api-access-mxwrf\") pod \"keystone-fc8578799-v69q9\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.282731 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-ovndb-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhw6\" (UniqueName: \"kubernetes.io/projected/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-kube-api-access-6jhw6\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-httpd-config\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-internal-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-public-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-config\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.288457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-combined-ca-bundle\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.351020 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerName="galera" containerID="cri-o://d8b0e90dec26a393971665049bd2f2d03a8dc6c3dd80036e2a8856b78a74cd21" gracePeriod=30 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.354755 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerName="galera" containerID="cri-o://93601405e0699ec0ef7e3daeaae4615c531db22fb7b3ae9899ac20bde048fe28" gracePeriod=30 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.391862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-config\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.391913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-combined-ca-bundle\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.392082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-ovndb-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.392240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhw6\" (UniqueName: \"kubernetes.io/projected/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-kube-api-access-6jhw6\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.392283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-httpd-config\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.392303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-internal-tls-certs\") pod 
\"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.392696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-public-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.397701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-combined-ca-bundle\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.398250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-httpd-config\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.398799 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-internal-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.400591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-config\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.400945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-ovndb-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.401953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-public-tls-certs\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.413407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhw6\" (UniqueName: \"kubernetes.io/projected/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-kube-api-access-6jhw6\") pod \"neutron-c77875b78-c9kg9\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: E0121 15:19:30.465683 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383 is running failed: container process not found" 
containerID="ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:19:30 crc kubenswrapper[4707]: E0121 15:19:30.465999 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383 is running failed: container process not found" containerID="ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:19:30 crc kubenswrapper[4707]: E0121 15:19:30.467791 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383 is running failed: container process not found" containerID="ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:19:30 crc kubenswrapper[4707]: E0121 15:19:30.467848 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="ovsdbserver-sb" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.477287 4707 generic.go:334] "Generic (PLEG): container finished" podID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerID="68f94dbce7c88a21b49fc9a3b3510d01bf3d9de428d0ff716630f26929ab3d41" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.477343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a19c4756-e139-49f4-a1e8-99afc8ed8088","Type":"ContainerDied","Data":"68f94dbce7c88a21b49fc9a3b3510d01bf3d9de428d0ff716630f26929ab3d41"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.481485 4707 generic.go:334] "Generic (PLEG): container finished" podID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerID="74d9d54b14de2c921f565409d9067db30aa7eea74a9bfef927fc5a41ce19f3e5" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.481522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"933eeedf-0a55-44e6-84f5-062e3c2716d9","Type":"ContainerDied","Data":"74d9d54b14de2c921f565409d9067db30aa7eea74a9bfef927fc5a41ce19f3e5"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.490187 4707 generic.go:334] "Generic (PLEG): container finished" podID="bffc824e-1c16-4342-b73f-91205d93c451" containerID="4f9de40936195255f1db32b4c266a0cc65cf32487afa43b4821fdb159dd1e243" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.490253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bffc824e-1c16-4342-b73f-91205d93c451","Type":"ContainerDied","Data":"4f9de40936195255f1db32b4c266a0cc65cf32487afa43b4821fdb159dd1e243"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.497394 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_2950a343-0190-497c-8775-4a5a6ddb13fa/ovsdbserver-sb/0.log" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.497429 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerID="ab66fa9636c2c8a7dc17154279e06b78d15894768f66c0e4dc03d9b7c66afb61" exitCode=2 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.497443 4707 generic.go:334] "Generic (PLEG): container finished" podID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerID="ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.497486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2950a343-0190-497c-8775-4a5a6ddb13fa","Type":"ContainerDied","Data":"ab66fa9636c2c8a7dc17154279e06b78d15894768f66c0e4dc03d9b7c66afb61"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.497507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2950a343-0190-497c-8775-4a5a6ddb13fa","Type":"ContainerDied","Data":"ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.497648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.504267 4707 generic.go:334] "Generic (PLEG): container finished" podID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerID="eccce3cbf45c52b0fa9e2d35adb70605731d496af2d57867762c823eecd5c9d6" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.504333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec","Type":"ContainerDied","Data":"eccce3cbf45c52b0fa9e2d35adb70605731d496af2d57867762c823eecd5c9d6"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.507229 4707 generic.go:334] "Generic (PLEG): container finished" podID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerID="97bae6a3ad6e694b48ede1784c23518cbd9dbfc8a5d2d44ec3582c962c55a005" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.507273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"57f429a9-592b-4ff8-a548-b5e59c4c481e","Type":"ContainerDied","Data":"97bae6a3ad6e694b48ede1784c23518cbd9dbfc8a5d2d44ec3582c962c55a005"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.515589 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerID="b31f4f9052448b785d44b5b6bb2e793bfd762ba6718264d80a5fa47c35b16237" exitCode=2 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.515673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"9d458004-5d78-43b6-aae5-7afb3dfa0c31","Type":"ContainerDied","Data":"b31f4f9052448b785d44b5b6bb2e793bfd762ba6718264d80a5fa47c35b16237"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.522154 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_807924a2-cb11-45c3-9a2a-f9864c1b3a78/ovsdbserver-nb/0.log" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.522200 4707 generic.go:334] "Generic (PLEG): container finished" podID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerID="0bb066f5db13b28e6cfb66a090c7e690726483b9338472d0b4b47b4408310358" exitCode=2 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.522214 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerID="ca42ae0b900f3e6fb7e330dad2077769f96ba0b58c3627f0f8492e7eb0037984" exitCode=143 Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.522232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"807924a2-cb11-45c3-9a2a-f9864c1b3a78","Type":"ContainerDied","Data":"0bb066f5db13b28e6cfb66a090c7e690726483b9338472d0b4b47b4408310358"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.522250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"807924a2-cb11-45c3-9a2a-f9864c1b3a78","Type":"ContainerDied","Data":"ca42ae0b900f3e6fb7e330dad2077769f96ba0b58c3627f0f8492e7eb0037984"} Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.662082 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd"] Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.883657 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_807924a2-cb11-45c3-9a2a-f9864c1b3a78/ovsdbserver-nb/0.log" Jan 21 15:19:30 crc kubenswrapper[4707]: I0121 15:19:30.883901 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-scripts\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdb-rundir\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-combined-ca-bundle\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-metrics-certs-tls-certs\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-config\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfsdg\" (UniqueName: \"kubernetes.io/projected/807924a2-cb11-45c3-9a2a-f9864c1b3a78-kube-api-access-nfsdg\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016905 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.016920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdbserver-nb-tls-certs\") pod \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\" (UID: \"807924a2-cb11-45c3-9a2a-f9864c1b3a78\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.017386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-scripts" (OuterVolumeSpecName: "scripts") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.017784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.018127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-config" (OuterVolumeSpecName: "config") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.030833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.085338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807924a2-cb11-45c3-9a2a-f9864c1b3a78-kube-api-access-nfsdg" (OuterVolumeSpecName: "kube-api-access-nfsdg") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "kube-api-access-nfsdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.118896 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.118927 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.118937 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.118947 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807924a2-cb11-45c3-9a2a-f9864c1b3a78-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.118956 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfsdg\" (UniqueName: \"kubernetes.io/projected/807924a2-cb11-45c3-9a2a-f9864c1b3a78-kube-api-access-nfsdg\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.149743 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.221031 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.247473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: W0121 15:19:31.253979 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa920435_67d6_49dd_9e83_7d14c4316fef.slice/crio-00a2439156d8b81cc43e616a05610dbabec8f4463065fad9bd62e75a687aceee WatchSource:0}: Error finding container 00a2439156d8b81cc43e616a05610dbabec8f4463065fad9bd62e75a687aceee: Status 404 returned error can't find the container with id 00a2439156d8b81cc43e616a05610dbabec8f4463065fad9bd62e75a687aceee Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.267748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.280679 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.307540 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc"] Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.307914 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="ovsdbserver-nb" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.307928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="ovsdbserver-nb" Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.307941 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="openstack-network-exporter" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.307946 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="openstack-network-exporter" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.308101 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="openstack-network-exporter" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.308125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" containerName="ovsdbserver-nb" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.310951 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.315496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.319052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "807924a2-cb11-45c3-9a2a-f9864c1b3a78" (UID: "807924a2-cb11-45c3-9a2a-f9864c1b3a78"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.323064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.336149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.366671 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.368483 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.368502 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807924a2-cb11-45c3-9a2a-f9864c1b3a78-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.368511 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.378371 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.388593 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-xlbv7"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.414557 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fe38-account-create-update-wpnm2"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.446761 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-2bdzl"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.470296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqmrr\" (UniqueName: \"kubernetes.io/projected/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-kube-api-access-zqmrr\") pod \"barbican-e217-account-create-update-d7fcc\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.471260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts\") pod \"barbican-e217-account-create-update-d7fcc\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.473232 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.477340 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data podName:f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:19:31.977323969 +0000 UTC m=+1069.158840191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.479448 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-2bdzl"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.481181 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-84f6c"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.482243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.513442 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.517049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-84f6c"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.531729 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-87slz"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.576234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts\") pod \"root-account-create-update-84f6c\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.576378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts\") pod \"barbican-e217-account-create-update-d7fcc\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.576704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqmrr\" (UniqueName: \"kubernetes.io/projected/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-kube-api-access-zqmrr\") pod \"barbican-e217-account-create-update-d7fcc\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.577107 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-87slz"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.577146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"43e09a5c-6892-4ff5-9c5b-a30556a193ed","Type":"ContainerDied","Data":"f42ade328f4a73251db136eabdb66bd21328d2aa0a7aed9f3d9b848f325c979b"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.577143 4707 generic.go:334] "Generic (PLEG): container finished" podID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" containerID="f42ade328f4a73251db136eabdb66bd21328d2aa0a7aed9f3d9b848f325c979b" exitCode=0 Jan 21 15:19:31 crc 
kubenswrapper[4707]: I0121 15:19:31.582183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts\") pod \"barbican-e217-account-create-update-d7fcc\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.583910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqrld\" (UniqueName: \"kubernetes.io/projected/2db08188-97b2-43c0-9b46-7ea6639617ed-kube-api-access-jqrld\") pod \"root-account-create-update-84f6c\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.596469 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.607588 4707 generic.go:334] "Generic (PLEG): container finished" podID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerID="d8b0e90dec26a393971665049bd2f2d03a8dc6c3dd80036e2a8856b78a74cd21" exitCode=0 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.607877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"12ab3a9a-6d12-45c8-b029-d95a3c4d4608","Type":"ContainerDied","Data":"d8b0e90dec26a393971665049bd2f2d03a8dc6c3dd80036e2a8856b78a74cd21"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.613854 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a17de20-67a0-4578-92a4-e19819061791" containerID="96a5811f181efafd4ec7d45b252f716be9d17274a6d4379830931b1b93eb6d88" exitCode=0 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.613984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9a17de20-67a0-4578-92a4-e19819061791","Type":"ContainerDied","Data":"96a5811f181efafd4ec7d45b252f716be9d17274a6d4379830931b1b93eb6d88"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.615307 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e1b5-account-create-update-qrhx7"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.622344 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_807924a2-cb11-45c3-9a2a-f9864c1b3a78/ovsdbserver-nb/0.log" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.622519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"807924a2-cb11-45c3-9a2a-f9864c1b3a78","Type":"ContainerDied","Data":"4ab1d1446839667e5f37d07c92435d73b93a559783b97d35a3a4d356e98b74ac"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.622615 4707 scope.go:117] "RemoveContainer" containerID="0bb066f5db13b28e6cfb66a090c7e690726483b9338472d0b4b47b4408310358" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.622777 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.623979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqmrr\" (UniqueName: \"kubernetes.io/projected/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-kube-api-access-zqmrr\") pod \"barbican-e217-account-create-update-d7fcc\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.625397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" event={"ID":"3602ad56-25ca-4375-8eac-ea931ec48243","Type":"ContainerStarted","Data":"a682dd07acd9a2d981504e197e1db5a83338efa8c1c6fcc84c7603ddcc25f7c8"} Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.630074 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.631771 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.634391 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.634429 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="290dd378-ee4b-464d-9321-2491ef417d10" containerName="nova-scheduler-scheduler" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.634719 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e303ace-04be-4505-9437-af54b61d41e5" containerID="c9b449437d00fe1991a9d06649066f2fc2cf5629c28ad87cf426887a6dbf94ac" exitCode=0 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.634782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0e303ace-04be-4505-9437-af54b61d41e5","Type":"ContainerDied","Data":"c9b449437d00fe1991a9d06649066f2fc2cf5629c28ad87cf426887a6dbf94ac"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.639272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" event={"ID":"aa920435-67d6-49dd-9e83-7d14c4316fef","Type":"ContainerStarted","Data":"00a2439156d8b81cc43e616a05610dbabec8f4463065fad9bd62e75a687aceee"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.640324 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpkcc"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 
15:19:31.676197 4707 generic.go:334] "Generic (PLEG): container finished" podID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerID="5775ffa72419df9485ac7699c70a095e09fa60e7badfeb84512d63346e47f9f6" exitCode=0 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.683983 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpkcc"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.684041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f44cdae5-61bb-40c3-a1d7-e10d4d64c203","Type":"ContainerDied","Data":"5775ffa72419df9485ac7699c70a095e09fa60e7badfeb84512d63346e47f9f6"} Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.741735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts\") pod \"root-account-create-update-84f6c\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.742138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrld\" (UniqueName: \"kubernetes.io/projected/2db08188-97b2-43c0-9b46-7ea6639617ed-kube-api-access-jqrld\") pod \"root-account-create-update-84f6c\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.769655 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.773129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts\") pod \"root-account-create-update-84f6c\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.796538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrld\" (UniqueName: \"kubernetes.io/projected/2db08188-97b2-43c0-9b46-7ea6639617ed-kube-api-access-jqrld\") pod \"root-account-create-update-84f6c\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.814036 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.829422 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.852857 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rwlwq"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.866893 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-e371-account-create-update-78ppn"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.883341 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k"] Jan 21 15:19:31 crc kubenswrapper[4707]: W0121 15:19:31.891556 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73d67e12_9371_4324_a476_3befcc10743d.slice/crio-7f3bb00b4cc309a855d46e8c9b331cdbdec4f887410567914b414e1ffc3ec7f4 WatchSource:0}: Error finding container 7f3bb00b4cc309a855d46e8c9b331cdbdec4f887410567914b414e1ffc3ec7f4: Status 404 returned error can't find the container with id 7f3bb00b4cc309a855d46e8c9b331cdbdec4f887410567914b414e1ffc3ec7f4 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.898367 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6wh8k"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.899220 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_2950a343-0190-497c-8775-4a5a6ddb13fa/ovsdbserver-sb/0.log" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.899275 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.903503 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.905057 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-e371-account-create-update-78ppn"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.919040 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.928537 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.939413 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-40f0-account-create-update-5hmgk"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.964296 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.964863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-config-data\") pod \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.964894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-combined-ca-bundle\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.964914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbrgt\" (UniqueName: \"kubernetes.io/projected/43e09a5c-6892-4ff5-9c5b-a30556a193ed-kube-api-access-mbrgt\") pod \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.964967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-metrics-certs-tls-certs\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdb-rundir\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-config\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whv2k\" (UniqueName: \"kubernetes.io/projected/2950a343-0190-497c-8775-4a5a6ddb13fa-kube-api-access-whv2k\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdbserver-sb-tls-certs\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-scripts\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-vencrypt-tls-certs\") pod 
\"43e09a5c-6892-4ff5-9c5b-a30556a193ed\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2950a343-0190-497c-8775-4a5a6ddb13fa\" (UID: \"2950a343-0190-497c-8775-4a5a6ddb13fa\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-nova-novncproxy-tls-certs\") pod \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.965324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-combined-ca-bundle\") pod \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\" (UID: \"43e09a5c-6892-4ff5-9c5b-a30556a193ed\") " Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.966941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-scripts" (OuterVolumeSpecName: "scripts") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.971679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e09a5c-6892-4ff5-9c5b-a30556a193ed-kube-api-access-mbrgt" (OuterVolumeSpecName: "kube-api-access-mbrgt") pod "43e09a5c-6892-4ff5-9c5b-a30556a193ed" (UID: "43e09a5c-6892-4ff5-9c5b-a30556a193ed"). InnerVolumeSpecName "kube-api-access-mbrgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.974950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-config" (OuterVolumeSpecName: "config") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.975260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.975320 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:19:31 crc kubenswrapper[4707]: E0121 15:19:31.975355 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data podName:945ded51-9961-411b-b00f-0543ac91a18a nodeName:}" failed. No retries permitted until 2026-01-21 15:19:32.475341387 +0000 UTC m=+1069.656857609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data") pod "rabbitmq-server-0" (UID: "945ded51-9961-411b-b00f-0543ac91a18a") : configmap "rabbitmq-config-data" not found Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.977435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2950a343-0190-497c-8775-4a5a6ddb13fa-kube-api-access-whv2k" (OuterVolumeSpecName: "kube-api-access-whv2k") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "kube-api-access-whv2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.978603 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.978863 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-central-agent" containerID="cri-o://3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211" gracePeriod=30 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.978992 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="proxy-httpd" containerID="cri-o://9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c" gracePeriod=30 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.979039 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="sg-core" containerID="cri-o://823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8" gracePeriod=30 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.979071 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-notification-agent" containerID="cri-o://d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558" gracePeriod=30 Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.979771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:31 crc kubenswrapper[4707]: I0121 15:19:31.979947 4707 scope.go:117] "RemoveContainer" containerID="ca42ae0b900f3e6fb7e330dad2077769f96ba0b58c3627f0f8492e7eb0037984" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.042996 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-99p25"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.066604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qlcg\" (UniqueName: \"kubernetes.io/projected/0e303ace-04be-4505-9437-af54b61d41e5-kube-api-access-2qlcg\") pod \"0e303ace-04be-4505-9437-af54b61d41e5\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.066658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-combined-ca-bundle\") pod \"0e303ace-04be-4505-9437-af54b61d41e5\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.066725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-config-data\") pod \"0e303ace-04be-4505-9437-af54b61d41e5\" (UID: \"0e303ace-04be-4505-9437-af54b61d41e5\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.067126 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbrgt\" (UniqueName: \"kubernetes.io/projected/43e09a5c-6892-4ff5-9c5b-a30556a193ed-kube-api-access-mbrgt\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.067138 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.067147 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.067155 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whv2k\" (UniqueName: \"kubernetes.io/projected/2950a343-0190-497c-8775-4a5a6ddb13fa-kube-api-access-whv2k\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.067173 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2950a343-0190-497c-8775-4a5a6ddb13fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.067189 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.076156 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.076228 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data podName:f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:19:33.076211964 +0000 UTC m=+1070.257728187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.100876 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-99p25"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.104309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43e09a5c-6892-4ff5-9c5b-a30556a193ed" (UID: "43e09a5c-6892-4ff5-9c5b-a30556a193ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.104396 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.135010 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.147409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e303ace-04be-4505-9437-af54b61d41e5-kube-api-access-2qlcg" (OuterVolumeSpecName: "kube-api-access-2qlcg") pod "0e303ace-04be-4505-9437-af54b61d41e5" (UID: "0e303ace-04be-4505-9437-af54b61d41e5"). InnerVolumeSpecName "kube-api-access-2qlcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.170397 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qlcg\" (UniqueName: \"kubernetes.io/projected/0e303ace-04be-4505-9437-af54b61d41e5-kube-api-access-2qlcg\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.170421 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.170430 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.178503 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.225840 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.276536 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.314328 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-stp4w"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.315974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-config-data" (OuterVolumeSpecName: "config-data") pod "43e09a5c-6892-4ff5-9c5b-a30556a193ed" (UID: "43e09a5c-6892-4ff5-9c5b-a30556a193ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.327251 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-stp4w"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.344007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "43e09a5c-6892-4ff5-9c5b-a30556a193ed" (UID: "43e09a5c-6892-4ff5-9c5b-a30556a193ed"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.378962 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.378989 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.379002 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.383287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.398032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.410621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e303ace-04be-4505-9437-af54b61d41e5" (UID: "0e303ace-04be-4505-9437-af54b61d41e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.416581 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-wkbgh"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.431575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-config-data" (OuterVolumeSpecName: "config-data") pod "0e303ace-04be-4505-9437-af54b61d41e5" (UID: "0e303ace-04be-4505-9437-af54b61d41e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.434894 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-wkbgh"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.452969 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5f98bc6758-lh9vt"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.453200 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-api" containerID="cri-o://451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.453561 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-httpd" containerID="cri-o://0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.454893 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-c77875b78-c9kg9"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.456321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "43e09a5c-6892-4ff5-9c5b-a30556a193ed" (UID: "43e09a5c-6892-4ff5-9c5b-a30556a193ed"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.482306 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.482331 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.482341 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e303ace-04be-4505-9437-af54b61d41e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.482350 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e09a5c-6892-4ff5-9c5b-a30556a193ed-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.482397 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.482433 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data podName:945ded51-9961-411b-b00f-0543ac91a18a nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.482419681 +0000 UTC m=+1070.663935903 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data") pod "rabbitmq-server-0" (UID: "945ded51-9961-411b-b00f-0543ac91a18a") : configmap "rabbitmq-config-data" not found Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.491887 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.518673 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.520462 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x8xqg"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.539989 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x8xqg"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.586206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-operator-scripts\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.587427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-combined-ca-bundle\") pod \"9a17de20-67a0-4578-92a4-e19819061791\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.587481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-combined-ca-bundle\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.587498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-kolla-config\") pod \"9a17de20-67a0-4578-92a4-e19819061791\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.587514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-default\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.587533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-generated\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.587641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w7ph\" (UniqueName: \"kubernetes.io/projected/9a17de20-67a0-4578-92a4-e19819061791-kube-api-access-2w7ph\") pod \"9a17de20-67a0-4578-92a4-e19819061791\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " Jan 21 15:19:32 crc 
kubenswrapper[4707]: I0121 15:19:32.588474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vxfc\" (UniqueName: \"kubernetes.io/projected/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kube-api-access-6vxfc\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.588514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-memcached-tls-certs\") pod \"9a17de20-67a0-4578-92a4-e19819061791\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.588560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.588606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-galera-tls-certs\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.588632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.588678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-config-data\") pod \"9a17de20-67a0-4578-92a4-e19819061791\" (UID: \"9a17de20-67a0-4578-92a4-e19819061791\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.588781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kolla-config\") pod \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\" (UID: \"12ab3a9a-6d12-45c8-b029-d95a3c4d4608\") " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.591470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.595049 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.595488 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" containerName="kube-state-metrics" containerID="cri-o://48c5225159a3ffc9586877ac0dcbb13aa01282ae7849cfd26facedad54d07afc" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.597051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9a17de20-67a0-4578-92a4-e19819061791" (UID: "9a17de20-67a0-4578-92a4-e19819061791"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.597901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.599037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-config-data" (OuterVolumeSpecName: "config-data") pod "9a17de20-67a0-4578-92a4-e19819061791" (UID: "9a17de20-67a0-4578-92a4-e19819061791"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.599513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kube-api-access-6vxfc" (OuterVolumeSpecName: "kube-api-access-6vxfc") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "kube-api-access-6vxfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.601966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.662193 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a17de20-67a0-4578-92a4-e19819061791-kube-api-access-2w7ph" (OuterVolumeSpecName: "kube-api-access-2w7ph") pod "9a17de20-67a0-4578-92a4-e19819061791" (UID: "9a17de20-67a0-4578-92a4-e19819061791"). InnerVolumeSpecName "kube-api-access-2w7ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699685 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w7ph\" (UniqueName: \"kubernetes.io/projected/9a17de20-67a0-4578-92a4-e19819061791-kube-api-access-2w7ph\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699706 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vxfc\" (UniqueName: \"kubernetes.io/projected/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kube-api-access-6vxfc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699726 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699735 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699744 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699752 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a17de20-67a0-4578-92a4-e19819061791-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699877 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.699888 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.709525 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fc8578799-v69q9"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.722600 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.747949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.756917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e303ace-04be-4505-9437-af54b61d41e5" containerName="nova-cell1-conductor-conductor" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.756957 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e303ace-04be-4505-9437-af54b61d41e5" containerName="nova-cell1-conductor-conductor" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.757000 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerName="galera" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.757006 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerName="galera" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.757026 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="ovsdbserver-sb" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.757032 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="ovsdbserver-sb" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.757053 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="openstack-network-exporter" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.757058 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="openstack-network-exporter" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.757074 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.757083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.757108 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerName="mysql-bootstrap" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.757114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerName="mysql-bootstrap" Jan 21 15:19:32 crc kubenswrapper[4707]: E0121 15:19:32.757129 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a17de20-67a0-4578-92a4-e19819061791" containerName="memcached" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.757134 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a17de20-67a0-4578-92a4-e19819061791" containerName="memcached" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.758917 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="ovsdbserver-sb" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.758945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.758961 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a17de20-67a0-4578-92a4-e19819061791" containerName="memcached" Jan 21 15:19:32 crc kubenswrapper[4707]: 
I0121 15:19:32.758978 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" containerName="galera" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.758993 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e303ace-04be-4505-9437-af54b61d41e5" containerName="nova-cell1-conductor-conductor" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.759007 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" containerName="openstack-network-exporter" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.760541 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.760626 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.764671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.768316 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_2950a343-0190-497c-8775-4a5a6ddb13fa/ovsdbserver-sb/0.log" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.768483 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.768708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2950a343-0190-497c-8775-4a5a6ddb13fa","Type":"ContainerDied","Data":"cce4e6bec73efdd12ce027087c6ce202c46e22045d03e881efc56218dbee3581"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.768758 4707 scope.go:117] "RemoveContainer" containerID="ab66fa9636c2c8a7dc17154279e06b78d15894768f66c0e4dc03d9b7c66afb61" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.802503 4707 generic.go:334] "Generic (PLEG): container finished" podID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerID="93601405e0699ec0ef7e3daeaae4615c531db22fb7b3ae9899ac20bde048fe28" exitCode=0 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.802603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"bce9bbed-aa2b-429e-85f7-c4202d8af65f","Type":"ContainerDied","Data":"93601405e0699ec0ef7e3daeaae4615c531db22fb7b3ae9899ac20bde048fe28"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.811472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9a17de20-67a0-4578-92a4-e19819061791","Type":"ContainerDied","Data":"ebf4b45e59de68beb87c75489ddb1d6c6c25b416f28d16668ecaa31fccb9c7e4"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.811575 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.815679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" event={"ID":"73d67e12-9371-4324-a476-3befcc10743d","Type":"ContainerStarted","Data":"7f3bb00b4cc309a855d46e8c9b331cdbdec4f887410567914b414e1ffc3ec7f4"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.826204 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-hp9gw"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.839046 4707 generic.go:334] "Generic (PLEG): container finished" podID="9220cbec-82de-4d8a-9bca-575edd0d7482" containerID="30a82c874f0417a2ea868196fc295a6b7cdf393c79ed490a067806d77d373590" exitCode=137 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.848848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"43e09a5c-6892-4ff5-9c5b-a30556a193ed","Type":"ContainerDied","Data":"294783e116ea9bfadd890a0f528e163107600b79c2a6f6bbc76eda62332af07d"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.849028 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.851416 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-hp9gw"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.850859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fhpq\" (UniqueName: \"kubernetes.io/projected/251778b0-6da6-47ba-805d-3d0469f6969d-kube-api-access-2fhpq\") pod \"keystone-37a4-account-create-update-rtnnw\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.852737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts\") pod \"keystone-37a4-account-create-update-rtnnw\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.873678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2950a343-0190-497c-8775-4a5a6ddb13fa" (UID: "2950a343-0190-497c-8775-4a5a6ddb13fa"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.877194 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-c77875b78-c9kg9"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.885026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"12ab3a9a-6d12-45c8-b029-d95a3c4d4608","Type":"ContainerDied","Data":"49372a50539087ac1f392ac1cf5c62f19e613df68d001ddee9eefdebb54cae9f"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.885157 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.886210 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.889535 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2950a343-0190-497c-8775-4a5a6ddb13fa-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.923188 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.940373 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-8bd9-account-create-update-v85vk"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.941051 4707 generic.go:334] "Generic (PLEG): container finished" podID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerID="9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c" exitCode=0 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.941078 4707 generic.go:334] "Generic (PLEG): container finished" podID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerID="823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8" exitCode=2 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.941121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerDied","Data":"9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.941141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerDied","Data":"823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.946994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "9a17de20-67a0-4578-92a4-e19819061791" (UID: "9a17de20-67a0-4578-92a4-e19819061791"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.954563 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.956905 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-server" containerID="cri-o://7af2e04db14be25f173dd381e8aada1f6fca26d64ef2c1471213d94d70d441d7" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.957673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" event={"ID":"3602ad56-25ca-4375-8eac-ea931ec48243","Type":"ContainerStarted","Data":"69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22"} Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958157 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="swift-recon-cron" containerID="cri-o://767c60a84352183a0a1460f1f409ae4f3ff11f0dc3b14e0f08ef19858a8e4c10" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958253 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="rsync" containerID="cri-o://5e3216158efccac8796405d01246e40f3bd76d0da2ad3160c835ee6479e21c40" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958298 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-auditor" containerID="cri-o://f59325ec6d78879ef0e5dc29fc782e9d998428ee8ceb5a83b319c6921289e416" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958333 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-updater" containerID="cri-o://e8f6a181d9396f315c57c1fa718fed1215117042aa452cbef4eeb8c3a68400e0" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958334 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" secret="" err="secret \"barbican-barbican-dockercfg-g88h6\" not found" Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958370 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-server" containerID="cri-o://19f135825fe61fa9589705bc5673cab68133a4b8973d340cdca3519f686f9181" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958402 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-updater" containerID="cri-o://6ba3795292e347eb0052109476f7b8868a66ba15e5ffbef833b4771ba1eb7f0c" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958435 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-auditor" containerID="cri-o://b57d8c8d95a43a7fe418e385ecf17ab93c271370b136b8b9ea9e9ac36a8eb7ec" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958464 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-replicator" containerID="cri-o://c20ae0c2ed4b8564da8434d55d14c1622df2625d7eeb4be8dbb41dd1c95ea25d" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958497 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-replicator" containerID="cri-o://91f811d60dd80febc0d665d8d1e4c22a670a4043e096ea0a70f5b837830a8b9f" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958531 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-reaper" containerID="cri-o://79465f4c64a300da6b2b166a7231cade9cafced28c54a63fb1dd42634ea4b73d" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958534 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-auditor" containerID="cri-o://dfc3a06ef645fad85b8c1641aeb4b27af7cb580998503dd959db2c02aff177cd" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958567 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-replicator" containerID="cri-o://0b91da6e45cc4e4749ed6b4f2b71d51ae0be19bba7d48c2ab451efd86fbf36c4" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.958615 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-expirer" containerID="cri-o://d3dc8279851af3aeb56531212bc19a4fe8eaf713d9b19e99bb97e6317f18363f" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.955132 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-storage-0" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-server" containerID="cri-o://0adfedc9ed55605ee7e287bb792ea2a2b8c433568d147932a3110f197664d94e" gracePeriod=30 Jan 21 15:19:32 crc kubenswrapper[4707]: I0121 15:19:32.976174 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.992778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5b8c8cff84-8wl5w"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.993029 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-log" containerID="cri-o://c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.993118 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-api" containerID="cri-o://62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.993428 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.993478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0e303ace-04be-4505-9437-af54b61d41e5","Type":"ContainerDied","Data":"5201152a1f5520603563c6524f449144f5b0a32eafae894ee206e527ff039534"} Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.998153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fhpq\" (UniqueName: \"kubernetes.io/projected/251778b0-6da6-47ba-805d-3d0469f6969d-kube-api-access-2fhpq\") pod \"keystone-37a4-account-create-update-rtnnw\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:32.998263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts\") pod \"keystone-37a4-account-create-update-rtnnw\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.001137 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.003087 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-twzb9"] Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.003442 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.003586 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle 
podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.503570868 +0000 UTC m=+1070.685087090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.003724 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.003751 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.503743613 +0000 UTC m=+1070.685259835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.004238 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.004266 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.504257209 +0000 UTC m=+1070.685773431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.005248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts\") pod \"keystone-37a4-account-create-update-rtnnw\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.008218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" event={"ID":"aa920435-67d6-49dd-9e83-7d14c4316fef","Type":"ContainerStarted","Data":"92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762"} Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.030330 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-twzb9"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.030460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" event={"ID":"e476808c-b16e-4aa7-9f4b-71e6a5c32576","Type":"ContainerStarted","Data":"b31de5931197a89702e85a64f235ad8862ba6805b13d1f2e6e303426a53ef2f8"} Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.030961 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" podStartSLOduration=4.030951155 podStartE2EDuration="4.030951155s" podCreationTimestamp="2026-01-21 15:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:33.005247042 +0000 UTC m=+1070.186763264" watchObservedRunningTime="2026-01-21 15:19:33.030951155 +0000 UTC m=+1070.212467378" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.051967 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gpd8j"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.063647 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gpd8j"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.070524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.076082 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.077871 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.084440 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.095685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fhpq\" (UniqueName: \"kubernetes.io/projected/251778b0-6da6-47ba-805d-3d0469f6969d-kube-api-access-2fhpq\") pod \"keystone-37a4-account-create-update-rtnnw\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.102427 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.103021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.103161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgg7f\" (UniqueName: \"kubernetes.io/projected/fc35eebb-a329-4334-a277-5a27b1bfba38-kube-api-access-vgg7f\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.103361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.103538 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.103577 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data podName:f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:35.10356097 +0000 UTC m=+1072.285077192 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.117854 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-61f3-account-create-update-2xjtz"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.128022 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-rl5lm"] Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.202162 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:19:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:19:33 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:19:33 crc kubenswrapper[4707]: else Jan 21 15:19:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:19:33 crc kubenswrapper[4707]: fi Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:19:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:19:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:19:33 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:19:33 crc kubenswrapper[4707]: # support updates Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.203233 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" podUID="a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.203759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.204770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgg7f\" (UniqueName: \"kubernetes.io/projected/fc35eebb-a329-4334-a277-5a27b1bfba38-kube-api-access-vgg7f\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.205017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.205078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.205148 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.205921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.214586 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.151:8776/healthcheck\": read tcp 10.217.0.2:49370->10.217.0.151:8776: read: connection reset by peer" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.214629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.221647 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.231483 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" secret="" err="secret \"barbican-barbican-dockercfg-g88h6\" not found" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.232176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a17de20-67a0-4578-92a4-e19819061791" (UID: "9a17de20-67a0-4578-92a4-e19819061791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.233838 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026fa0a1-b506-474b-9a47-b98541421464" path="/var/lib/kubelet/pods/026fa0a1-b506-474b-9a47-b98541421464/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.234353 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fdb657-664d-435d-9ab2-edd0c800506c" path="/var/lib/kubelet/pods/36fdb657-664d-435d-9ab2-edd0c800506c/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.234826 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f50403-e5d4-4cac-928c-a42bf62951a8" path="/var/lib/kubelet/pods/55f50403-e5d4-4cac-928c-a42bf62951a8/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.238308 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:50138->10.217.0.184:8775: read: connection reset by peer" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.238417 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:50124->10.217.0.184:8775: read: connection reset by peer" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.241496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgg7f\" (UniqueName: \"kubernetes.io/projected/fc35eebb-a329-4334-a277-5a27b1bfba38-kube-api-access-vgg7f\") pod \"dnsmasq-dnsmasq-84b9f45d47-m2s6n\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.245198 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72983002-27f2-464c-9e83-cc2dd28ce9ea" path="/var/lib/kubelet/pods/72983002-27f2-464c-9e83-cc2dd28ce9ea/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.250144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807924a2-cb11-45c3-9a2a-f9864c1b3a78" path="/var/lib/kubelet/pods/807924a2-cb11-45c3-9a2a-f9864c1b3a78/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.251004 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b038b0-2f9c-475f-98e5-574468773068" path="/var/lib/kubelet/pods/85b038b0-2f9c-475f-98e5-574468773068/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.251692 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877b2ecd-b7f6-4057-a5a8-e4925a9cc743" path="/var/lib/kubelet/pods/877b2ecd-b7f6-4057-a5a8-e4925a9cc743/volumes" Jan 21 15:19:33 crc 
kubenswrapper[4707]: I0121 15:19:33.254277 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a13938f-6542-42f2-a635-9529887f363c" path="/var/lib/kubelet/pods/8a13938f-6542-42f2-a635-9529887f363c/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.255104 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dbdd352-f6e2-4bb5-a980-af302e912c67" path="/var/lib/kubelet/pods/8dbdd352-f6e2-4bb5-a980-af302e912c67/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.258239 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915ebe52-43fe-40d8-909b-c1c1b617cd5a" path="/var/lib/kubelet/pods/915ebe52-43fe-40d8-909b-c1c1b617cd5a/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.259057 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a28438-f0ba-433d-84f7-11838f51c57c" path="/var/lib/kubelet/pods/96a28438-f0ba-433d-84f7-11838f51c57c/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.263369 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99016202-c215-433c-95f2-e75a8fe1a995" path="/var/lib/kubelet/pods/99016202-c215-433c-95f2-e75a8fe1a995/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.263928 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7" path="/var/lib/kubelet/pods/a331f9ff-ccbb-4a26-a4c7-3dac83efd4f7/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.274415 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65869b4-a0c3-46ba-8c75-a194a22d9d2c" path="/var/lib/kubelet/pods/a65869b4-a0c3-46ba-8c75-a194a22d9d2c/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.276789 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb72ff0-20b0-4878-a229-85528d202449" path="/var/lib/kubelet/pods/bfb72ff0-20b0-4878-a229-85528d202449/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.278770 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c" path="/var/lib/kubelet/pods/dc2e27ad-a39a-48d4-bc51-c0e0e0d6c71c/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.281592 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1081a6-b1c0-499f-bd60-82256c944e34" path="/var/lib/kubelet/pods/de1081a6-b1c0-499f-bd60-82256c944e34/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.288113 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0cdf697-e82c-44fa-992a-4843848b3d33" path="/var/lib/kubelet/pods/f0cdf697-e82c-44fa-992a-4843848b3d33/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.291360 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e12a4-1af2-441c-89f4-c8c54e590aee" path="/var/lib/kubelet/pods/f80e12a4-1af2-441c-89f4-c8c54e590aee/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.291947 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec7d56d-7ffe-452b-a5fc-d1febd4218a6" path="/var/lib/kubelet/pods/fec7d56d-7ffe-452b-a5fc-d1febd4218a6/volumes" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.305874 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17de20-67a0-4578-92a4-e19819061791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 
15:19:33.306056 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306108 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306140 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.806129491 +0000 UTC m=+1070.987645713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-public-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306192 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306211 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.806204131 +0000 UTC m=+1070.987720353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306239 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306255 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.806250247 +0000 UTC m=+1070.987766469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-internal-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306393 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306423 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.80641609 +0000 UTC m=+1070.987932311 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306459 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.306476 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:33.806470492 +0000 UTC m=+1070.987986714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-api-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.314187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "12ab3a9a-6d12-45c8-b029-d95a3c4d4608" (UID: "12ab3a9a-6d12-45c8-b029-d95a3c4d4608"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.422321 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ab3a9a-6d12-45c8-b029-d95a3c4d4608-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.533953 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.534250 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data podName:945ded51-9961-411b-b00f-0543ac91a18a nodeName:}" failed. No retries permitted until 2026-01-21 15:19:35.534221356 +0000 UTC m=+1072.715737577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data") pod "rabbitmq-server-0" (UID: "945ded51-9961-411b-b00f-0543ac91a18a") : configmap "rabbitmq-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.550481 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.550556 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.550539523 +0000 UTC m=+1071.732055746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.551115 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.551151 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.551142097 +0000 UTC m=+1071.732658310 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.576825 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.576905 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.576875237 +0000 UTC m=+1071.758391458 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.829483 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:19:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 15:19:33 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 15:19:33 crc kubenswrapper[4707]: else Jan 21 15:19:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:19:33 crc kubenswrapper[4707]: fi Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:19:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:19:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:19:33 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:19:33 crc kubenswrapper[4707]: # support updates Jan 21 15:19:33 crc kubenswrapper[4707]: Jan 21 15:19:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.830666 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-84f6c" podUID="2db08188-97b2-43c0-9b46-7ea6639617ed" Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868058 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868125 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868125 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.868107962 +0000 UTC m=+1072.049624175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868068 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868272 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.868252674 +0000 UTC m=+1072.049768897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-api-config-data" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868178 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868287 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868298 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.868282821 +0000 UTC m=+1072.049799043 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-internal-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868313 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.868307337 +0000 UTC m=+1072.049823559 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-public-svc" not found Jan 21 15:19:33 crc kubenswrapper[4707]: E0121 15:19:33.868323 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.868318979 +0000 UTC m=+1072.049835201 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "combined-ca-bundle" not found Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982569 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-rl5lm"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982868 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982888 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982899 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-sqjw5"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982912 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982921 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-672zr"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982932 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-bkjdx"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982941 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-bkjdx"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982950 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-r6zw4"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982960 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-r6zw4"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982977 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-84f6c"] Jan 21 
15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.982993 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.983003 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-jnqsq"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.983018 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2bfhk"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.983026 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2bfhk"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.983037 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wktmc"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.983948 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wktmc"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984028 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984042 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984053 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-aef2-account-create-update-s7lls"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984063 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-78998996df-sk4d7"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984073 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984091 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-69647dd576-j6h6n"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984100 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984117 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984127 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vjd82"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984144 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vjd82"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/keystone-bootstrap-kq52z"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984160 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kq52z"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984179 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-84f545d4f5-d8v92"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984191 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fc8578799-v69q9"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984200 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984209 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-zpcb6"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984218 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-zpcb6"] Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984383 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" podUID="d41aa1cf-56db-4d5a-91e9-478a8ce591d0" containerName="keystone-api" containerID="cri-o://cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984711 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker-log" containerID="cri-o://2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.984843 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener-log" containerID="cri-o://8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.985968 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-httpd" containerID="cri-o://4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.987310 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker" containerID="cri-o://42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.987505 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener" containerID="cri-o://00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.987582 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-server" containerID="cri-o://591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214" gracePeriod=30 Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.987909 4707 scope.go:117] "RemoveContainer" containerID="ff6b6fbaae9f2a28db796281a96b694a3e6ca8c67eaa98f3017c83471aeee383" Jan 21 15:19:33 crc kubenswrapper[4707]: I0121 15:19:33.989470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-84f6c"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.040790 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.048087 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.049150 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="945ded51-9961-411b-b00f-0543ac91a18a" containerName="rabbitmq" containerID="cri-o://c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4" gracePeriod=604800 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.049378 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="rabbitmq" containerID="cri-o://3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7" gracePeriod=604800 Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.050067 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.050501 4707 generic.go:334] "Generic (PLEG): container finished" podID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerID="291188ab585da8ca8a57ab46efd037f8e3984e81e67c3a618faa0dd418123a81" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.050568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"57f429a9-592b-4ff8-a548-b5e59c4c481e","Type":"ContainerDied","Data":"291188ab585da8ca8a57ab46efd037f8e3984e81e67c3a618faa0dd418123a81"} Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.051893 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.051942 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="36457603-5ef9-4f79-bf2c-11413e422937" containerName="nova-cell0-conductor-conductor" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.054039 4707 generic.go:334] "Generic (PLEG): container finished" podID="bffc824e-1c16-4342-b73f-91205d93c451" containerID="132d4a7b6c8eb04012605d47a3c3a13b3ca5900e48ed093c7525683f623c3630" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.054097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bffc824e-1c16-4342-b73f-91205d93c451","Type":"ContainerDied","Data":"132d4a7b6c8eb04012605d47a3c3a13b3ca5900e48ed093c7525683f623c3630"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.056879 4707 generic.go:334] "Generic (PLEG): container finished" podID="290dd378-ee4b-464d-9321-2491ef417d10" containerID="c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.057007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"290dd378-ee4b-464d-9321-2491ef417d10","Type":"ContainerDied","Data":"c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.057075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"290dd378-ee4b-464d-9321-2491ef417d10","Type":"ContainerDied","Data":"011230ad515ad40a394ff943baa5f2d0d858e3fef0e10aa6039fcac21c55d075"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.057154 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011230ad515ad40a394ff943baa5f2d0d858e3fef0e10aa6039fcac21c55d075" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.059246 4707 generic.go:334] "Generic (PLEG): container finished" podID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerID="abbe09e446aa5427d732558aa8e1909e7b43ad26726df39c2a2cc877fe7297c3" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.059354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec","Type":"ContainerDied","Data":"abbe09e446aa5427d732558aa8e1909e7b43ad26726df39c2a2cc877fe7297c3"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.063205 4707 generic.go:334] "Generic (PLEG): container finished" podID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerID="3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.063242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerDied","Data":"3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.064666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-84f6c" event={"ID":"2db08188-97b2-43c0-9b46-7ea6639617ed","Type":"ContainerStarted","Data":"ed96ad8010085d02da3c06f4251b0dd27d1caa285e493ed14e244faf3e736055"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.067523 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-84f6c" secret="" err="secret \"galera-openstack-dockercfg-6xxcv\" not found" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.073638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" event={"ID":"14b480ba-3ec1-4c37-a14c-d3b8a20d9837","Type":"ContainerStarted","Data":"0d0a2fb8f085d3c1e5fbaa104b3389c405f40c0df93816589c7cc92ed250ceb6"} Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.073947 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:19:34 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 15:19:34 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 15:19:34 crc kubenswrapper[4707]: else Jan 21 15:19:34 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:19:34 crc kubenswrapper[4707]: fi Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:19:34 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:19:34 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:19:34 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:19:34 crc kubenswrapper[4707]: # support updates Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.074995 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-84f6c" podUID="2db08188-97b2-43c0-9b46-7ea6639617ed" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.089265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"bce9bbed-aa2b-429e-85f7-c4202d8af65f","Type":"ContainerDied","Data":"eeabadc126ba92429b09e06078c52edaa20284bf78c5a0f10ee560b5359e7f3e"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.089309 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeabadc126ba92429b09e06078c52edaa20284bf78c5a0f10ee560b5359e7f3e" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.110050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" event={"ID":"a533a135-2b5a-4edb-aead-4924368ab8ab","Type":"ContainerStarted","Data":"a5427fd48efe38bc4168a074c22ec2c0df9496874a3232b354a558abe6a546e8"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.119952 4707 generic.go:334] "Generic (PLEG): container finished" podID="38833865-b749-47e4-b140-8a541562f359" containerID="c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5" exitCode=143 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.120078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" event={"ID":"38833865-b749-47e4-b140-8a541562f359","Type":"ContainerDied","Data":"c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.129575 4707 generic.go:334] "Generic (PLEG): container finished" podID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerID="cd62a3dc2c61a7c191ccbda0e7ac771e13ca81a7f503f321bfd2768094965edd" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.129669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"933eeedf-0a55-44e6-84f5-062e3c2716d9","Type":"ContainerDied","Data":"cd62a3dc2c61a7c191ccbda0e7ac771e13ca81a7f503f321bfd2768094965edd"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.133431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" event={"ID":"73d67e12-9371-4324-a476-3befcc10743d","Type":"ContainerStarted","Data":"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.141162 4707 scope.go:117] "RemoveContainer" containerID="96a5811f181efafd4ec7d45b252f716be9d17274a6d4379830931b1b93eb6d88" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150011 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="5e3216158efccac8796405d01246e40f3bd76d0da2ad3160c835ee6479e21c40" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150038 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="d3dc8279851af3aeb56531212bc19a4fe8eaf713d9b19e99bb97e6317f18363f" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150047 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="e8f6a181d9396f315c57c1fa718fed1215117042aa452cbef4eeb8c3a68400e0" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150054 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="f59325ec6d78879ef0e5dc29fc782e9d998428ee8ceb5a83b319c6921289e416" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150059 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="91f811d60dd80febc0d665d8d1e4c22a670a4043e096ea0a70f5b837830a8b9f" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150064 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="7af2e04db14be25f173dd381e8aada1f6fca26d64ef2c1471213d94d70d441d7" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150070 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="6ba3795292e347eb0052109476f7b8868a66ba15e5ffbef833b4771ba1eb7f0c" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150075 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="b57d8c8d95a43a7fe418e385ecf17ab93c271370b136b8b9ea9e9ac36a8eb7ec" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150081 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="c20ae0c2ed4b8564da8434d55d14c1622df2625d7eeb4be8dbb41dd1c95ea25d" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150087 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="19f135825fe61fa9589705bc5673cab68133a4b8973d340cdca3519f686f9181" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150093 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="79465f4c64a300da6b2b166a7231cade9cafced28c54a63fb1dd42634ea4b73d" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150098 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="dfc3a06ef645fad85b8c1641aeb4b27af7cb580998503dd959db2c02aff177cd" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150104 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="0b91da6e45cc4e4749ed6b4f2b71d51ae0be19bba7d48c2ab451efd86fbf36c4" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150109 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="0adfedc9ed55605ee7e287bb792ea2a2b8c433568d147932a3110f197664d94e" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"5e3216158efccac8796405d01246e40f3bd76d0da2ad3160c835ee6479e21c40"} Jan 21 15:19:34 crc 
kubenswrapper[4707]: I0121 15:19:34.150202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"d3dc8279851af3aeb56531212bc19a4fe8eaf713d9b19e99bb97e6317f18363f"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"e8f6a181d9396f315c57c1fa718fed1215117042aa452cbef4eeb8c3a68400e0"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"f59325ec6d78879ef0e5dc29fc782e9d998428ee8ceb5a83b319c6921289e416"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"91f811d60dd80febc0d665d8d1e4c22a670a4043e096ea0a70f5b837830a8b9f"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"7af2e04db14be25f173dd381e8aada1f6fca26d64ef2c1471213d94d70d441d7"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"6ba3795292e347eb0052109476f7b8868a66ba15e5ffbef833b4771ba1eb7f0c"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"b57d8c8d95a43a7fe418e385ecf17ab93c271370b136b8b9ea9e9ac36a8eb7ec"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"c20ae0c2ed4b8564da8434d55d14c1622df2625d7eeb4be8dbb41dd1c95ea25d"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"19f135825fe61fa9589705bc5673cab68133a4b8973d340cdca3519f686f9181"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"79465f4c64a300da6b2b166a7231cade9cafced28c54a63fb1dd42634ea4b73d"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"dfc3a06ef645fad85b8c1641aeb4b27af7cb580998503dd959db2c02aff177cd"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"0b91da6e45cc4e4749ed6b4f2b71d51ae0be19bba7d48c2ab451efd86fbf36c4"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.150303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"0adfedc9ed55605ee7e287bb792ea2a2b8c433568d147932a3110f197664d94e"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.153130 4707 generic.go:334] "Generic (PLEG): container finished" podID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerID="9fa2d6082708378d81429a182459dfd951edb40a1d798ddd9d54ac8fa55269de" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.153179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f44cdae5-61bb-40c3-a1d7-e10d4d64c203","Type":"ContainerDied","Data":"9fa2d6082708378d81429a182459dfd951edb40a1d798ddd9d54ac8fa55269de"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.160316 4707 generic.go:334] "Generic (PLEG): container finished" podID="30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" containerID="48c5225159a3ffc9586877ac0dcbb13aa01282ae7849cfd26facedad54d07afc" exitCode=2 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.160478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6","Type":"ContainerDied","Data":"48c5225159a3ffc9586877ac0dcbb13aa01282ae7849cfd26facedad54d07afc"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.160592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6","Type":"ContainerDied","Data":"e52ea25d6c240f866fc10f050fe8e68c9cffae76930f1987e8813744d89d5810"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.160676 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52ea25d6c240f866fc10f050fe8e68c9cffae76930f1987e8813744d89d5810" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.184001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config\") pod \"9220cbec-82de-4d8a-9bca-575edd0d7482\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.184062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fdbw\" (UniqueName: \"kubernetes.io/projected/9220cbec-82de-4d8a-9bca-575edd0d7482-kube-api-access-9fdbw\") pod \"9220cbec-82de-4d8a-9bca-575edd0d7482\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.184206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-combined-ca-bundle\") pod \"9220cbec-82de-4d8a-9bca-575edd0d7482\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.197367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config-secret\") pod 
\"9220cbec-82de-4d8a-9bca-575edd0d7482\" (UID: \"9220cbec-82de-4d8a-9bca-575edd0d7482\") " Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.199148 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.199315 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts podName:2db08188-97b2-43c0-9b46-7ea6639617ed nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.699301151 +0000 UTC m=+1071.880817374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts") pod "root-account-create-update-84f6c" (UID: "2db08188-97b2-43c0-9b46-7ea6639617ed") : configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.223595 4707 generic.go:334] "Generic (PLEG): container finished" podID="7713e80a-5469-4748-a300-4c581d9fd503" containerID="0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.223673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" event={"ID":"7713e80a-5469-4748-a300-4c581d9fd503","Type":"ContainerDied","Data":"0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.225847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9220cbec-82de-4d8a-9bca-575edd0d7482-kube-api-access-9fdbw" (OuterVolumeSpecName: "kube-api-access-9fdbw") pod "9220cbec-82de-4d8a-9bca-575edd0d7482" (UID: "9220cbec-82de-4d8a-9bca-575edd0d7482"). InnerVolumeSpecName "kube-api-access-9fdbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.237038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9220cbec-82de-4d8a-9bca-575edd0d7482" (UID: "9220cbec-82de-4d8a-9bca-575edd0d7482"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.237801 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.245183 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.245717 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.245825 4707 generic.go:334] "Generic (PLEG): container finished" podID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerID="c48740a97a789fa8e24952640d3171c5efb85d529be89bc23207d43578d4750d" exitCode=0 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.245872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a19c4756-e139-49f4-a1e8-99afc8ed8088","Type":"ContainerDied","Data":"c48740a97a789fa8e24952640d3171c5efb85d529be89bc23207d43578d4750d"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.257230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" event={"ID":"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8","Type":"ContainerStarted","Data":"7c78520c3afb65fc8ed8fa39db23776760da91c22c5288edc2b28c9a26cb76a2"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.257323 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" secret="" err="secret \"galera-openstack-dockercfg-6xxcv\" not found" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.257738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.257769 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" secret="" err="secret \"galera-openstack-dockercfg-6xxcv\" not found" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.288487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" event={"ID":"3602ad56-25ca-4375-8eac-ea931ec48243","Type":"ContainerStarted","Data":"3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf"} Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.289330 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" secret="" err="secret \"barbican-barbican-dockercfg-g88h6\" not found" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.290012 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.291598 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.291771 4707 scope.go:117] "RemoveContainer" containerID="f42ade328f4a73251db136eabdb66bd21328d2aa0a7aed9f3d9b848f325c979b" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.292235 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.293869 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api-log" containerID="cri-o://8dbd5877a5c421fb4b87c4bb6c848927f876f78802f4b31d44917a93f5d9c274" gracePeriod=30 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.294048 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.294537 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api" containerID="cri-o://acc95ab6fc21a9f96f200600c32518407710117ccce4e8740117fccde8402fe5" gracePeriod=30 Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.310803 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.311941 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.311973 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fdbw\" (UniqueName: \"kubernetes.io/projected/9220cbec-82de-4d8a-9bca-575edd0d7482-kube-api-access-9fdbw\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.312577 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.327090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.327398 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.347922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.355846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.358495 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.365009 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.369145 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:19:34 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:19:34 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:19:34 crc kubenswrapper[4707]: else Jan 21 15:19:34 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:19:34 crc kubenswrapper[4707]: fi Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:19:34 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:19:34 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:19:34 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:19:34 crc kubenswrapper[4707]: # support updates Jan 21 15:19:34 crc kubenswrapper[4707]: Jan 21 15:19:34 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.370546 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" podUID="a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.370605 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.382397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.395546 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.404025 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.415875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-default\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-combined-ca-bundle\") pod \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-combined-ca-bundle\") pod \"bffc824e-1c16-4342-b73f-91205d93c451\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data-custom\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-nova-metadata-tls-certs\") pod \"bffc824e-1c16-4342-b73f-91205d93c451\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-certs\") pod \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-scripts\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-public-tls-certs\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-2sllq\" (UniqueName: \"kubernetes.io/projected/bffc824e-1c16-4342-b73f-91205d93c451-kube-api-access-2sllq\") pod \"bffc824e-1c16-4342-b73f-91205d93c451\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-config-data\") pod \"bffc824e-1c16-4342-b73f-91205d93c451\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcbb6\" (UniqueName: \"kubernetes.io/projected/290dd378-ee4b-464d-9321-2491ef417d10-kube-api-access-mcbb6\") pod \"290dd378-ee4b-464d-9321-2491ef417d10\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-internal-tls-certs\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5t5w\" (UniqueName: \"kubernetes.io/projected/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kube-api-access-j5t5w\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f429a9-592b-4ff8-a548-b5e59c4c481e-etc-machine-id\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kolla-config\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57f429a9-592b-4ff8-a548-b5e59c4c481e-logs\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkpbt\" 
(UniqueName: \"kubernetes.io/projected/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-api-access-nkpbt\") pod \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.416986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-generated\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-config-data\") pod \"290dd378-ee4b-464d-9321-2491ef417d10\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-galera-tls-certs\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-operator-scripts\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bffc824e-1c16-4342-b73f-91205d93c451-logs\") pod \"bffc824e-1c16-4342-b73f-91205d93c451\" (UID: \"bffc824e-1c16-4342-b73f-91205d93c451\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-combined-ca-bundle\") pod \"290dd378-ee4b-464d-9321-2491ef417d10\" (UID: \"290dd378-ee4b-464d-9321-2491ef417d10\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-combined-ca-bundle\") pod \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\" (UID: \"bce9bbed-aa2b-429e-85f7-c4202d8af65f\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghvgv\" (UniqueName: \"kubernetes.io/projected/57f429a9-592b-4ff8-a548-b5e59c4c481e-kube-api-access-ghvgv\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-combined-ca-bundle\") pod \"57f429a9-592b-4ff8-a548-b5e59c4c481e\" (UID: \"57f429a9-592b-4ff8-a548-b5e59c4c481e\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.417206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-config\") pod \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\" (UID: \"30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.418664 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.419703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.420259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffc824e-1c16-4342-b73f-91205d93c451-logs" (OuterVolumeSpecName: "logs") pod "bffc824e-1c16-4342-b73f-91205d93c451" (UID: "bffc824e-1c16-4342-b73f-91205d93c451"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.426910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.427995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.429015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f429a9-592b-4ff8-a548-b5e59c4c481e-logs" (OuterVolumeSpecName: "logs") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.431309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57f429a9-592b-4ff8-a548-b5e59c4c481e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.436133 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.436197 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts podName:251778b0-6da6-47ba-805d-3d0469f6969d nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.936183032 +0000 UTC m=+1072.117699254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts") pod "keystone-37a4-account-create-update-rtnnw" (UID: "251778b0-6da6-47ba-805d-3d0469f6969d") : configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.436235 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.436255 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts podName:a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:34.936249698 +0000 UTC m=+1072.117765921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts") pod "barbican-e217-account-create-update-d7fcc" (UID: "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8") : configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.489232 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.508046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bffc824e-1c16-4342-b73f-91205d93c451-kube-api-access-2sllq" (OuterVolumeSpecName: "kube-api-access-2sllq") pod "bffc824e-1c16-4342-b73f-91205d93c451" (UID: "bffc824e-1c16-4342-b73f-91205d93c451"). InnerVolumeSpecName "kube-api-access-2sllq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.508338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f429a9-592b-4ff8-a548-b5e59c4c481e-kube-api-access-ghvgv" (OuterVolumeSpecName: "kube-api-access-ghvgv") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "kube-api-access-ghvgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfhbx\" (UniqueName: \"kubernetes.io/projected/933eeedf-0a55-44e6-84f5-062e3c2716d9-kube-api-access-vfhbx\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-logs\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-etc-machine-id\") pod \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data-custom\") pod \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-config-data\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-combined-ca-bundle\") pod \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-scripts\") pod \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data\") pod \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " Jan 21 15:19:34 crc 
kubenswrapper[4707]: I0121 15:19:34.523696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt4jf\" (UniqueName: \"kubernetes.io/projected/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-kube-api-access-zt4jf\") pod \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\" (UID: \"f44cdae5-61bb-40c3-a1d7-e10d4d64c203\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-scripts\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-httpd-run\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.523882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-combined-ca-bundle\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524299 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f429a9-592b-4ff8-a548-b5e59c4c481e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524311 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524320 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57f429a9-592b-4ff8-a548-b5e59c4c481e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524336 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bce9bbed-aa2b-429e-85f7-c4202d8af65f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524346 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce9bbed-aa2b-429e-85f7-c4202d8af65f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524354 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bffc824e-1c16-4342-b73f-91205d93c451-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524362 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghvgv\" (UniqueName: \"kubernetes.io/projected/57f429a9-592b-4ff8-a548-b5e59c4c481e-kube-api-access-ghvgv\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.524371 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sllq\" (UniqueName: \"kubernetes.io/projected/bffc824e-1c16-4342-b73f-91205d93c451-kube-api-access-2sllq\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc 
kubenswrapper[4707]: I0121 15:19:34.531273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-logs" (OuterVolumeSpecName: "logs") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.537753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.537829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f44cdae5-61bb-40c3-a1d7-e10d4d64c203" (UID: "f44cdae5-61bb-40c3-a1d7-e10d4d64c203"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.571086 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.573836 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290dd378-ee4b-464d-9321-2491ef417d10-kube-api-access-mcbb6" (OuterVolumeSpecName: "kube-api-access-mcbb6") pod "290dd378-ee4b-464d-9321-2491ef417d10" (UID: "290dd378-ee4b-464d-9321-2491ef417d10"). InnerVolumeSpecName "kube-api-access-mcbb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.574085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kube-api-access-j5t5w" (OuterVolumeSpecName: "kube-api-access-j5t5w") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "kube-api-access-j5t5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.574618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.574885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9220cbec-82de-4d8a-9bca-575edd0d7482" (UID: "9220cbec-82de-4d8a-9bca-575edd0d7482"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.575717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-api-access-nkpbt" (OuterVolumeSpecName: "kube-api-access-nkpbt") pod "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" (UID: "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6"). InnerVolumeSpecName "kube-api-access-nkpbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.582990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-scripts" (OuterVolumeSpecName: "scripts") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627410 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627439 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627448 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627457 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcbb6\" (UniqueName: \"kubernetes.io/projected/290dd378-ee4b-464d-9321-2491ef417d10-kube-api-access-mcbb6\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627465 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5t5w\" (UniqueName: \"kubernetes.io/projected/bce9bbed-aa2b-429e-85f7-c4202d8af65f-kube-api-access-j5t5w\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627475 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkpbt\" (UniqueName: \"kubernetes.io/projected/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-api-access-nkpbt\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627483 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627491 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.627498 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933eeedf-0a55-44e6-84f5-062e3c2716d9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.627567 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:34 crc 
kubenswrapper[4707]: E0121 15:19:34.627601 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.627588263 +0000 UTC m=+1073.809104485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.627830 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.627853 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.627846309 +0000 UTC m=+1073.809362531 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.627884 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.627900 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.627895332 +0000 UTC m=+1073.809411553 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.677286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-kube-api-access-zt4jf" (OuterVolumeSpecName: "kube-api-access-zt4jf") pod "f44cdae5-61bb-40c3-a1d7-e10d4d64c203" (UID: "f44cdae5-61bb-40c3-a1d7-e10d4d64c203"). InnerVolumeSpecName "kube-api-access-zt4jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.685901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933eeedf-0a55-44e6-84f5-062e3c2716d9-kube-api-access-vfhbx" (OuterVolumeSpecName: "kube-api-access-vfhbx") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "kube-api-access-vfhbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.686446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-scripts" (OuterVolumeSpecName: "scripts") pod "f44cdae5-61bb-40c3-a1d7-e10d4d64c203" (UID: "f44cdae5-61bb-40c3-a1d7-e10d4d64c203"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.687093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.688912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-scripts" (OuterVolumeSpecName: "scripts") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.690196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.709302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f44cdae5-61bb-40c3-a1d7-e10d4d64c203" (UID: "f44cdae5-61bb-40c3-a1d7-e10d4d64c203"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.728473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-config-data\") pod \"a19c4756-e139-49f4-a1e8-99afc8ed8088\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.728521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-internal-tls-certs\") pod \"a19c4756-e139-49f4-a1e8-99afc8ed8088\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.728543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mgj\" (UniqueName: \"kubernetes.io/projected/a19c4756-e139-49f4-a1e8-99afc8ed8088-kube-api-access-55mgj\") pod \"a19c4756-e139-49f4-a1e8-99afc8ed8088\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.728586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19c4756-e139-49f4-a1e8-99afc8ed8088-logs\") pod \"a19c4756-e139-49f4-a1e8-99afc8ed8088\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.728732 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-public-tls-certs\") pod \"a19c4756-e139-49f4-a1e8-99afc8ed8088\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.728865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-combined-ca-bundle\") pod \"a19c4756-e139-49f4-a1e8-99afc8ed8088\" (UID: \"a19c4756-e139-49f4-a1e8-99afc8ed8088\") " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729796 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfhbx\" (UniqueName: \"kubernetes.io/projected/933eeedf-0a55-44e6-84f5-062e3c2716d9-kube-api-access-vfhbx\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729928 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729939 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729955 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729964 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt4jf\" (UniqueName: \"kubernetes.io/projected/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-kube-api-access-zt4jf\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.729974 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.732013 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.732150 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts podName:2db08188-97b2-43c0-9b46-7ea6639617ed nodeName:}" failed. No retries permitted until 2026-01-21 15:19:35.732134265 +0000 UTC m=+1072.913650486 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts") pod "root-account-create-update-84f6c" (UID: "2db08188-97b2-43c0-9b46-7ea6639617ed") : configmap "openstack-scripts" not found Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.736034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19c4756-e139-49f4-a1e8-99afc8ed8088-logs" (OuterVolumeSpecName: "logs") pod "a19c4756-e139-49f4-a1e8-99afc8ed8088" (UID: "a19c4756-e139-49f4-a1e8-99afc8ed8088"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.767156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19c4756-e139-49f4-a1e8-99afc8ed8088-kube-api-access-55mgj" (OuterVolumeSpecName: "kube-api-access-55mgj") pod "a19c4756-e139-49f4-a1e8-99afc8ed8088" (UID: "a19c4756-e139-49f4-a1e8-99afc8ed8088"). InnerVolumeSpecName "kube-api-access-55mgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.783459 4707 scope.go:117] "RemoveContainer" containerID="d8b0e90dec26a393971665049bd2f2d03a8dc6c3dd80036e2a8856b78a74cd21" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.832314 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mgj\" (UniqueName: \"kubernetes.io/projected/a19c4756-e139-49f4-a1e8-99afc8ed8088-kube-api-access-55mgj\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: I0121 15:19:34.832347 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19c4756-e139-49f4-a1e8-99afc8ed8088-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934062 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934717 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.934663462 +0000 UTC m=+1074.116179684 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-api-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934145 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934199 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934955 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.934929272 +0000 UTC m=+1074.116445495 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-internal-svc" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.935019 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.934993102 +0000 UTC m=+1074.116509324 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-config-data" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934229 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.935061 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.935055589 +0000 UTC m=+1074.116571802 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-public-svc" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.934259 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.935084 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.935080046 +0000 UTC m=+1074.116596267 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "combined-ca-bundle" not found Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.974677 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.981580 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.984671 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:19:34 crc kubenswrapper[4707]: E0121 15:19:34.984704 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="ovn-northd" Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.049067 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.049149 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts podName:251778b0-6da6-47ba-805d-3d0469f6969d nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.049107384 +0000 UTC m=+1073.230623606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts") pod "keystone-37a4-account-create-update-rtnnw" (UID: "251778b0-6da6-47ba-805d-3d0469f6969d") : configmap "openstack-scripts" not found Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.049238 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.049263 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts podName:a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:36.049255703 +0000 UTC m=+1073.230771925 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts") pod "barbican-e217-account-create-update-d7fcc" (UID: "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8") : configmap "openstack-scripts" not found Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.077018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-config-data" (OuterVolumeSpecName: "config-data") pod "290dd378-ee4b-464d-9321-2491ef417d10" (UID: "290dd378-ee4b-464d-9321-2491ef417d10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.151159 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.151417 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.151461 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data podName:f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:39.151446644 +0000 UTC m=+1076.332962867 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.171914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "290dd378-ee4b-464d-9321-2491ef417d10" (UID: "290dd378-ee4b-464d-9321-2491ef417d10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.210270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e303ace-04be-4505-9437-af54b61d41e5" path="/var/lib/kubelet/pods/0e303ace-04be-4505-9437-af54b61d41e5/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.227950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a19c4756-e139-49f4-a1e8-99afc8ed8088" (UID: "a19c4756-e139-49f4-a1e8-99afc8ed8088"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.241576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bffc824e-1c16-4342-b73f-91205d93c451" (UID: "bffc824e-1c16-4342-b73f-91205d93c451"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.251465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ab3a9a-6d12-45c8-b029-d95a3c4d4608" path="/var/lib/kubelet/pods/12ab3a9a-6d12-45c8-b029-d95a3c4d4608/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.252308 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1559cd45-3a38-4dc8-8e5e-3e0a67542de4" path="/var/lib/kubelet/pods/1559cd45-3a38-4dc8-8e5e-3e0a67542de4/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.258649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c33dad8-6a4e-40fd-8f84-1329e367ffac" path="/var/lib/kubelet/pods/1c33dad8-6a4e-40fd-8f84-1329e367ffac/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.257483 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.261714 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290dd378-ee4b-464d-9321-2491ef417d10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.261728 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.262530 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2950a343-0190-497c-8775-4a5a6ddb13fa" path="/var/lib/kubelet/pods/2950a343-0190-497c-8775-4a5a6ddb13fa/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.269603 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9" path="/var/lib/kubelet/pods/3aabcb81-ad84-4d77-9bbd-5532f9dd3ef9/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.270061 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" path="/var/lib/kubelet/pods/43e09a5c-6892-4ff5-9c5b-a30556a193ed/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.270506 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da0d4c7-0c65-4e22-be82-afdc94255989" path="/var/lib/kubelet/pods/5da0d4c7-0c65-4e22-be82-afdc94255989/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.271742 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b153bbb-6db9-4837-8010-dbe2bdc51be0" path="/var/lib/kubelet/pods/6b153bbb-6db9-4837-8010-dbe2bdc51be0/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.274268 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc079f9-7c19-4944-9325-ce31a1688157" path="/var/lib/kubelet/pods/6cc079f9-7c19-4944-9325-ce31a1688157/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.275441 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f1a9bb-73b3-4983-a81b-4ae1da3f521e" path="/var/lib/kubelet/pods/78f1a9bb-73b3-4983-a81b-4ae1da3f521e/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.276506 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc26c7f-689c-4ded-8243-67e3cb4c7603" 
path="/var/lib/kubelet/pods/7dc26c7f-689c-4ded-8243-67e3cb4c7603/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.277062 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a17de20-67a0-4578-92a4-e19819061791" path="/var/lib/kubelet/pods/9a17de20-67a0-4578-92a4-e19819061791/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.277574 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9fe7962-fff4-44ce-a841-dab09e490d7c" path="/var/lib/kubelet/pods/c9fe7962-fff4-44ce-a841-dab09e490d7c/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.278661 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec05603-c838-4ce1-bb91-5a2d047b4263" path="/var/lib/kubelet/pods/cec05603-c838-4ce1-bb91-5a2d047b4263/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.279162 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5be7ae3-f509-470d-aa05-6369baaa2ec5" path="/var/lib/kubelet/pods/d5be7ae3-f509-470d-aa05-6369baaa2ec5/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.279617 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8dfc923-4af3-4ca0-9efa-f01fa14d7772" path="/var/lib/kubelet/pods/f8dfc923-4af3-4ca0-9efa-f01fa14d7772/volumes" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.287913 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.302682 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.303051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.325471 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.334653 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.335370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.346271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.348919 4707 generic.go:334] "Generic (PLEG): container finished" podID="d582f12e-341b-45f5-a54a-8d016aed473a" containerID="591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214" exitCode=0 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.348941 4707 generic.go:334] "Generic (PLEG): container finished" podID="d582f12e-341b-45f5-a54a-8d016aed473a" containerID="4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6" exitCode=0 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.373743 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.373770 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.373781 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.373789 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.373797 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.381840 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerID="8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb" exitCode=143 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.393321 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.404975 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-log" containerID="cri-o://a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.405061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-api" containerID="cri-o://b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.409863 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener-log" containerID="cri-o://92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.409952 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener" containerID="cri-o://d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.423362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-config-data" (OuterVolumeSpecName: "config-data") pod "a19c4756-e139-49f4-a1e8-99afc8ed8088" (UID: "a19c4756-e139-49f4-a1e8-99afc8ed8088"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.434128 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" podUID="a533a135-2b5a-4edb-aead-4924368ab8ab" containerName="keystone-api" containerID="cri-o://f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.438017 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" podStartSLOduration=6.438005788 podStartE2EDuration="6.438005788s" podCreationTimestamp="2026-01-21 15:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:35.431211206 +0000 UTC m=+1072.612727428" watchObservedRunningTime="2026-01-21 15:19:35.438005788 +0000 UTC m=+1072.619522009" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.443617 4707 generic.go:334] "Generic (PLEG): container finished" podID="df067294-2809-413f-b245-6cd57c0cbf11" containerID="2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a" exitCode=143 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.444487 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.455947 4707 scope.go:117] "RemoveContainer" containerID="492abccf81ca00d53959d35041fea378a062143c55d54af1e3316598d6f94cea" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.463247 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.464217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" podStartSLOduration=6.464204502 podStartE2EDuration="6.464204502s" podCreationTimestamp="2026-01-21 15:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:35.456982987 +0000 UTC m=+1072.638499229" watchObservedRunningTime="2026-01-21 15:19:35.464204502 +0000 UTC m=+1072.645720724" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.464908 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.469300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" (UID: "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.482859 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.482888 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.488983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n"] Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489020 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f44cdae5-61bb-40c3-a1d7-e10d4d64c203","Type":"ContainerDied","Data":"c09323e85c1282cf63762ad55eecc61b7b5585b62eca846baaa50e2dad473df8"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" 
event={"ID":"57f429a9-592b-4ff8-a548-b5e59c4c481e","Type":"ContainerDied","Data":"a70e5a882345599d7fdd32496b92c15f7341dd3f3843e593a68dd6238b7c3677"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" event={"ID":"d582f12e-341b-45f5-a54a-8d016aed473a","Type":"ContainerDied","Data":"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" event={"ID":"d582f12e-341b-45f5-a54a-8d016aed473a","Type":"ContainerDied","Data":"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" event={"ID":"d582f12e-341b-45f5-a54a-8d016aed473a","Type":"ContainerDied","Data":"3faa107af3aae21ad8749ef4ff20cfd921e18536a1be93695d7f985392e440fe"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" event={"ID":"ed6331a2-4b0a-407e-8ee0-3b1a82343071","Type":"ContainerDied","Data":"8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bffc824e-1c16-4342-b73f-91205d93c451","Type":"ContainerDied","Data":"2da51b0b7c1edc422a0096f34b4bd252cf641d3bf0f0719d394ad58642f44597"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" event={"ID":"73d67e12-9371-4324-a476-3befcc10743d","Type":"ContainerStarted","Data":"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec","Type":"ContainerDied","Data":"9ba3c7e54b4e1e596830637369113346a235506fb1ebb596f106d9ee54396fd1"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" event={"ID":"aa920435-67d6-49dd-9e83-7d14c4316fef","Type":"ContainerStarted","Data":"d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" event={"ID":"e476808c-b16e-4aa7-9f4b-71e6a5c32576","Type":"ContainerStarted","Data":"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" event={"ID":"a533a135-2b5a-4edb-aead-4924368ab8ab","Type":"ContainerStarted","Data":"f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" event={"ID":"df067294-2809-413f-b245-6cd57c0cbf11","Type":"ContainerDied","Data":"2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.489200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"933eeedf-0a55-44e6-84f5-062e3c2716d9","Type":"ContainerDied","Data":"85b1010a0d52b0e3dd1f6e1bd11d438e313245e860e0396d821932b3d81e1a85"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.495232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" event={"ID":"14b480ba-3ec1-4c37-a14c-d3b8a20d9837","Type":"ContainerStarted","Data":"22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.504498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.507649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" (UID: "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.517934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data" (OuterVolumeSpecName: "config-data") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.519514 4707 generic.go:334] "Generic (PLEG): container finished" podID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerID="8dbd5877a5c421fb4b87c4bb6c848927f876f78802f4b31d44917a93f5d9c274" exitCode=143 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.519568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" event={"ID":"1fee1703-b855-46f5-9bc3-89c66b3377ea","Type":"ContainerDied","Data":"8dbd5877a5c421fb4b87c4bb6c848927f876f78802f4b31d44917a93f5d9c274"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.520693 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw"] Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.521731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9220cbec-82de-4d8a-9bca-575edd0d7482" (UID: "9220cbec-82de-4d8a-9bca-575edd0d7482"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.523198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a19c4756-e139-49f4-a1e8-99afc8ed8088","Type":"ContainerDied","Data":"5576423c75052e9487e5da2c85414974ae20e4dcf5a9378746d80cc70a1979b8"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.523237 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.523396 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" podStartSLOduration=6.523385473 podStartE2EDuration="6.523385473s" podCreationTimestamp="2026-01-21 15:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:35.51812113 +0000 UTC m=+1072.699637352" watchObservedRunningTime="2026-01-21 15:19:35.523385473 +0000 UTC m=+1072.704901695" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.525331 4707 generic.go:334] "Generic (PLEG): container finished" podID="36457603-5ef9-4f79-bf2c-11413e422937" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" exitCode=0 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.525503 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.525873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"36457603-5ef9-4f79-bf2c-11413e422937","Type":"ContainerDied","Data":"f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.525913 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"36457603-5ef9-4f79-bf2c-11413e422937","Type":"ContainerDied","Data":"6313ee5f80e449fe65c0ff7184c6fae072def507e528921ce6f85b2fc64f17b7"} Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.525997 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.526592 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.526949 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.528676 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker-log" containerID="cri-o://69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.529683 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker" containerID="cri-o://3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf" gracePeriod=30 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.588896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.594673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-combined-ca-bundle\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.594721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-etc-swift\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.594753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-scripts\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.594772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-combined-ca-bundle\") pod \"36457603-5ef9-4f79-bf2c-11413e422937\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-public-tls-certs\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-internal-tls-certs\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-log-httpd\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-config-data\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-config-data\") pod \"36457603-5ef9-4f79-bf2c-11413e422937\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-httpd-run\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx2cx\" (UniqueName: \"kubernetes.io/projected/36457603-5ef9-4f79-bf2c-11413e422937-kube-api-access-gx2cx\") pod \"36457603-5ef9-4f79-bf2c-11413e422937\" (UID: \"36457603-5ef9-4f79-bf2c-11413e422937\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfs9k\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-kube-api-access-pfs9k\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-combined-ca-bundle\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-run-httpd\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-config-data\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-public-tls-certs\") pod \"d582f12e-341b-45f5-a54a-8d016aed473a\" (UID: \"d582f12e-341b-45f5-a54a-8d016aed473a\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.598974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs\") pod \"933eeedf-0a55-44e6-84f5-062e3c2716d9\" (UID: \"933eeedf-0a55-44e6-84f5-062e3c2716d9\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-logs\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lrx7\" (UniqueName: \"kubernetes.io/projected/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-kube-api-access-7lrx7\") pod \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\" (UID: \"02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec\") " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.599826 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.599877 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data podName:945ded51-9961-411b-b00f-0543ac91a18a nodeName:}" failed. No retries permitted until 2026-01-21 15:19:39.599864054 +0000 UTC m=+1076.781380277 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data") pod "rabbitmq-server-0" (UID: "945ded51-9961-411b-b00f-0543ac91a18a") : configmap "rabbitmq-config-data" not found Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599905 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599921 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599931 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.599942 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9220cbec-82de-4d8a-9bca-575edd0d7482-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.605352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-config-data" (OuterVolumeSpecName: "config-data") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.610878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-config-data" (OuterVolumeSpecName: "config-data") pod "bffc824e-1c16-4342-b73f-91205d93c451" (UID: "bffc824e-1c16-4342-b73f-91205d93c451"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.611331 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.611541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.611637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-logs" (OuterVolumeSpecName: "logs") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: W0121 15:19:35.611701 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/933eeedf-0a55-44e6-84f5-062e3c2716d9/volumes/kubernetes.io~secret/internal-tls-certs Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.611713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "933eeedf-0a55-44e6-84f5-062e3c2716d9" (UID: "933eeedf-0a55-44e6-84f5-062e3c2716d9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.611727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.618085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36457603-5ef9-4f79-bf2c-11413e422937-kube-api-access-gx2cx" (OuterVolumeSpecName: "kube-api-access-gx2cx") pod "36457603-5ef9-4f79-bf2c-11413e422937" (UID: "36457603-5ef9-4f79-bf2c-11413e422937"). InnerVolumeSpecName "kube-api-access-gx2cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.622882 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.623043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-scripts" (OuterVolumeSpecName: "scripts") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.624310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.628578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.628936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.629192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-kube-api-access-pfs9k" (OuterVolumeSpecName: "kube-api-access-pfs9k") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "kube-api-access-pfs9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.629338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-kube-api-access-7lrx7" (OuterVolumeSpecName: "kube-api-access-7lrx7") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "kube-api-access-7lrx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: W0121 15:19:35.631413 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod251778b0_6da6_47ba_805d_3d0469f6969d.slice/crio-aee3d212ff8e73e59c8aa1843a8355ae3c199cae005f9ff60b7c5f62f41d67a7 WatchSource:0}: Error finding container aee3d212ff8e73e59c8aa1843a8355ae3c199cae005f9ff60b7c5f62f41d67a7: Status 404 returned error can't find the container with id aee3d212ff8e73e59c8aa1843a8355ae3c199cae005f9ff60b7c5f62f41d67a7 Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.648476 4707 scope.go:117] "RemoveContainer" containerID="c9b449437d00fe1991a9d06649066f2fc2cf5629c28ad87cf426887a6dbf94ac" Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.679620 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:19:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:19:35 crc kubenswrapper[4707]: Jan 21 15:19:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:19:35 crc kubenswrapper[4707]: Jan 21 15:19:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:19:35 crc kubenswrapper[4707]: Jan 21 15:19:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:19:35 crc kubenswrapper[4707]: Jan 21 15:19:35 crc kubenswrapper[4707]: if [ -n "keystone" ]; then Jan 21 15:19:35 crc kubenswrapper[4707]: GRANT_DATABASE="keystone" Jan 21 15:19:35 crc kubenswrapper[4707]: else Jan 21 15:19:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:19:35 crc kubenswrapper[4707]: fi Jan 21 15:19:35 crc kubenswrapper[4707]: Jan 21 15:19:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:19:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:19:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:19:35 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:19:35 crc kubenswrapper[4707]: # support updates Jan 21 15:19:35 crc kubenswrapper[4707]: Jan 21 15:19:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.681825 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"keystone-db-secret\\\" not found\"" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" podUID="251778b0-6da6-47ba-805d-3d0469f6969d" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a19c4756-e139-49f4-a1e8-99afc8ed8088" (UID: "a19c4756-e139-49f4-a1e8-99afc8ed8088"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701796 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701829 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701839 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701847 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701855 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701863 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lrx7\" (UniqueName: \"kubernetes.io/projected/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-kube-api-access-7lrx7\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701870 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933eeedf-0a55-44e6-84f5-062e3c2716d9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701877 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701884 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701893 4707 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701900 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701908 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d582f12e-341b-45f5-a54a-8d016aed473a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701916 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx2cx\" (UniqueName: \"kubernetes.io/projected/36457603-5ef9-4f79-bf2c-11413e422937-kube-api-access-gx2cx\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.701924 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfs9k\" (UniqueName: \"kubernetes.io/projected/d582f12e-341b-45f5-a54a-8d016aed473a-kube-api-access-pfs9k\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.716028 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.728983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bffc824e-1c16-4342-b73f-91205d93c451" (UID: "bffc824e-1c16-4342-b73f-91205d93c451"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.803938 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.804116 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bffc824e-1c16-4342-b73f-91205d93c451-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.804158 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:35 crc kubenswrapper[4707]: E0121 15:19:35.804235 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts podName:2db08188-97b2-43c0-9b46-7ea6639617ed nodeName:}" failed. No retries permitted until 2026-01-21 15:19:37.80422083 +0000 UTC m=+1074.985737052 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts") pod "root-account-create-update-84f6c" (UID: "2db08188-97b2-43c0-9b46-7ea6639617ed") : configmap "openstack-scripts" not found Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.859461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f44cdae5-61bb-40c3-a1d7-e10d4d64c203" (UID: "f44cdae5-61bb-40c3-a1d7-e10d4d64c203"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.907963 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.922096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" (UID: "30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.922345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "57f429a9-592b-4ff8-a548-b5e59c4c481e" (UID: "57f429a9-592b-4ff8-a548-b5e59c4c481e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.954999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36457603-5ef9-4f79-bf2c-11413e422937" (UID: "36457603-5ef9-4f79-bf2c-11413e422937"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:35 crc kubenswrapper[4707]: I0121 15:19:35.957701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "bce9bbed-aa2b-429e-85f7-c4202d8af65f" (UID: "bce9bbed-aa2b-429e-85f7-c4202d8af65f"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.009989 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.010020 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f429a9-592b-4ff8-a548-b5e59c4c481e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.010031 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.010041 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce9bbed-aa2b-429e-85f7-c4202d8af65f-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.016916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-config-data" (OuterVolumeSpecName: "config-data") pod "36457603-5ef9-4f79-bf2c-11413e422937" (UID: "36457603-5ef9-4f79-bf2c-11413e422937"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.040623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.068659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a19c4756-e139-49f4-a1e8-99afc8ed8088" (UID: "a19c4756-e139-49f4-a1e8-99afc8ed8088"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.069877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.069943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.078998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.084833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-config-data" (OuterVolumeSpecName: "config-data") pod "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" (UID: "02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.097114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.097231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data" (OuterVolumeSpecName: "config-data") pod "f44cdae5-61bb-40c3-a1d7-e10d4d64c203" (UID: "f44cdae5-61bb-40c3-a1d7-e10d4d64c203"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112222 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36457603-5ef9-4f79-bf2c-11413e422937-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112246 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112256 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112267 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112275 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112283 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112291 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f44cdae5-61bb-40c3-a1d7-e10d4d64c203-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112298 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.112309 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a19c4756-e139-49f4-a1e8-99afc8ed8088-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.112224 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.112358 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts podName:a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:38.112345563 +0000 UTC m=+1075.293861785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts") pod "barbican-e217-account-create-update-d7fcc" (UID: "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8") : configmap "openstack-scripts" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.112253 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.112627 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts podName:251778b0-6da6-47ba-805d-3d0469f6969d nodeName:}" failed. No retries permitted until 2026-01-21 15:19:38.112619549 +0000 UTC m=+1075.294135770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts") pod "keystone-37a4-account-create-update-rtnnw" (UID: "251778b0-6da6-47ba-805d-3d0469f6969d") : configmap "openstack-scripts" not found Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.113892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-config-data" (OuterVolumeSpecName: "config-data") pod "d582f12e-341b-45f5-a54a-8d016aed473a" (UID: "d582f12e-341b-45f5-a54a-8d016aed473a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.136387 4707 scope.go:117] "RemoveContainer" containerID="30a82c874f0417a2ea868196fc295a6b7cdf393c79ed490a067806d77d373590" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.219183 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d582f12e-341b-45f5-a54a-8d016aed473a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.240411 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.249782 4707 scope.go:117] "RemoveContainer" containerID="5775ffa72419df9485ac7699c70a095e09fa60e7badfeb84512d63346e47f9f6" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.271368 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.295969 4707 scope.go:117] "RemoveContainer" containerID="9fa2d6082708378d81429a182459dfd951edb40a1d798ddd9d54ac8fa55269de" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.306529 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.313690 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.316639 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.319607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts\") pod \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.319647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqmrr\" (UniqueName: \"kubernetes.io/projected/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-kube-api-access-zqmrr\") pod \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\" (UID: \"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.319680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts\") pod \"2db08188-97b2-43c0-9b46-7ea6639617ed\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.319709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqrld\" (UniqueName: \"kubernetes.io/projected/2db08188-97b2-43c0-9b46-7ea6639617ed-kube-api-access-jqrld\") pod \"2db08188-97b2-43c0-9b46-7ea6639617ed\" (UID: \"2db08188-97b2-43c0-9b46-7ea6639617ed\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.320754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8" (UID: "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.320897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2db08188-97b2-43c0-9b46-7ea6639617ed" (UID: "2db08188-97b2-43c0-9b46-7ea6639617ed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.323136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-kube-api-access-zqmrr" (OuterVolumeSpecName: "kube-api-access-zqmrr") pod "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8" (UID: "a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8"). InnerVolumeSpecName "kube-api-access-zqmrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.325745 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.327360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db08188-97b2-43c0-9b46-7ea6639617ed-kube-api-access-jqrld" (OuterVolumeSpecName: "kube-api-access-jqrld") pod "2db08188-97b2-43c0-9b46-7ea6639617ed" (UID: "2db08188-97b2-43c0-9b46-7ea6639617ed"). InnerVolumeSpecName "kube-api-access-jqrld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.340568 4707 scope.go:117] "RemoveContainer" containerID="291188ab585da8ca8a57ab46efd037f8e3984e81e67c3a618faa0dd418123a81" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.351940 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.366252 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.385789 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.388312 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="43e09a5c-6892-4ff5-9c5b-a30556a193ed" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.180:6080/vnc_lite.html\": context deadline exceeded" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.423137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-internal-tls-certs\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.424097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-scripts\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.424268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-config-data\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.424328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d67e12-9371-4324-a476-3befcc10743d-logs\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: 
\"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.424344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-public-tls-certs\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.424363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-combined-ca-bundle\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.424384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2dqb\" (UniqueName: \"kubernetes.io/projected/73d67e12-9371-4324-a476-3befcc10743d-kube-api-access-w2dqb\") pod \"73d67e12-9371-4324-a476-3befcc10743d\" (UID: \"73d67e12-9371-4324-a476-3befcc10743d\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.425039 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.425058 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqmrr\" (UniqueName: \"kubernetes.io/projected/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8-kube-api-access-zqmrr\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.425068 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db08188-97b2-43c0-9b46-7ea6639617ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.425077 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqrld\" (UniqueName: \"kubernetes.io/projected/2db08188-97b2-43c0-9b46-7ea6639617ed-kube-api-access-jqrld\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.425359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d67e12-9371-4324-a476-3befcc10743d-logs" (OuterVolumeSpecName: "logs") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.426412 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.427319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-scripts" (OuterVolumeSpecName: "scripts") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.435911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d67e12-9371-4324-a476-3befcc10743d-kube-api-access-w2dqb" (OuterVolumeSpecName: "kube-api-access-w2dqb") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "kube-api-access-w2dqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.439004 4707 scope.go:117] "RemoveContainer" containerID="97bae6a3ad6e694b48ede1784c23518cbd9dbfc8a5d2d44ec3582c962c55a005" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.452272 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.458316 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.465342 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.471981 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.475112 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.477403 4707 scope.go:117] "RemoveContainer" containerID="591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.477958 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.481500 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.487888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.487933 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.492619 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.492664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-config-data" (OuterVolumeSpecName: "config-data") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.496388 4707 scope.go:117] "RemoveContainer" containerID="4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.515745 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.519845 4707 scope.go:117] "RemoveContainer" containerID="591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214" Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.520193 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214\": container with ID starting with 591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214 not found: ID does not exist" containerID="591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.520225 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214"} err="failed to get container status \"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214\": rpc error: code = NotFound desc = could not find container \"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214\": container with ID starting with 591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214 not found: ID does not exist" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.520249 4707 scope.go:117] "RemoveContainer" containerID="4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6" Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.520580 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6\": container with ID starting with 4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6 not found: ID does not exist" containerID="4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.520609 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6"} err="failed to get container status \"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6\": rpc error: code = NotFound desc = could not find container \"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6\": container with ID starting with 4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6 not found: ID does not exist" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.520630 4707 scope.go:117] "RemoveContainer" containerID="591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.520932 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214"} err="failed to get container status \"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214\": rpc error: code = NotFound desc = could not find container \"591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214\": container with ID starting with 591b5206469ef1dae29bf227605f004793b7def21ddfdea16e907ffe2fe98214 not found: ID does not exist" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.521058 4707 scope.go:117] "RemoveContainer" containerID="4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 
15:19:36.521334 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6"} err="failed to get container status \"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6\": rpc error: code = NotFound desc = could not find container \"4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6\": container with ID starting with 4861f31ed10683aeea34aa322f4432042daa62e05c4c4567eb7bfb2bdd5309b6 not found: ID does not exist" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.521427 4707 scope.go:117] "RemoveContainer" containerID="132d4a7b6c8eb04012605d47a3c3a13b3ca5900e48ed093c7525683f623c3630" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.527204 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.527302 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d67e12-9371-4324-a476-3befcc10743d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.527359 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.527409 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2dqb\" (UniqueName: \"kubernetes.io/projected/73d67e12-9371-4324-a476-3befcc10743d-kube-api-access-w2dqb\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.527458 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.532294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.536887 4707 scope.go:117] "RemoveContainer" containerID="4f9de40936195255f1db32b4c266a0cc65cf32487afa43b4821fdb159dd1e243" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.537302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "73d67e12-9371-4324-a476-3befcc10743d" (UID: "73d67e12-9371-4324-a476-3befcc10743d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.555042 4707 generic.go:334] "Generic (PLEG): container finished" podID="73d67e12-9371-4324-a476-3befcc10743d" containerID="b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803" exitCode=0 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.555135 4707 generic.go:334] "Generic (PLEG): container finished" podID="73d67e12-9371-4324-a476-3befcc10743d" containerID="a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528" exitCode=143 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.555248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" event={"ID":"73d67e12-9371-4324-a476-3befcc10743d","Type":"ContainerDied","Data":"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.555354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" event={"ID":"73d67e12-9371-4324-a476-3befcc10743d","Type":"ContainerDied","Data":"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.555437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" event={"ID":"73d67e12-9371-4324-a476-3befcc10743d","Type":"ContainerDied","Data":"7f3bb00b4cc309a855d46e8c9b331cdbdec4f887410567914b414e1ffc3ec7f4"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.555675 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.556265 4707 scope.go:117] "RemoveContainer" containerID="abbe09e446aa5427d732558aa8e1909e7b43ad26726df39c2a2cc877fe7297c3" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.558596 4707 generic.go:334] "Generic (PLEG): container finished" podID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerID="92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762" exitCode=143 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.558656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" event={"ID":"aa920435-67d6-49dd-9e83-7d14c4316fef","Type":"ContainerDied","Data":"92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.559769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" event={"ID":"e476808c-b16e-4aa7-9f4b-71e6a5c32576","Type":"ContainerStarted","Data":"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.559919 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api-log" containerID="cri-o://ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13" gracePeriod=30 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.560139 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.560164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.560207 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api" containerID="cri-o://45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e" gracePeriod=30 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.566588 4707 generic.go:334] "Generic (PLEG): container finished" podID="38833865-b749-47e4-b140-8a541562f359" containerID="62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376" exitCode=0 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.566643 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.566661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" event={"ID":"38833865-b749-47e4-b140-8a541562f359","Type":"ContainerDied","Data":"62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.566679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b8c8cff84-8wl5w" event={"ID":"38833865-b749-47e4-b140-8a541562f359","Type":"ContainerDied","Data":"e62df5d82f7edafba1e2a62dff5f4267be6d7822c90811d6342ee07fb129707e"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.570537 4707 generic.go:334] "Generic (PLEG): container finished" podID="3602ad56-25ca-4375-8eac-ea931ec48243" containerID="69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22" exitCode=143 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.570610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" event={"ID":"3602ad56-25ca-4375-8eac-ea931ec48243","Type":"ContainerDied","Data":"69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.582782 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" podStartSLOduration=7.582767379 podStartE2EDuration="7.582767379s" podCreationTimestamp="2026-01-21 15:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:36.576848535 +0000 UTC m=+1073.758364756" watchObservedRunningTime="2026-01-21 15:19:36.582767379 +0000 UTC m=+1073.764283601" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.583544 4707 scope.go:117] "RemoveContainer" containerID="eccce3cbf45c52b0fa9e2d35adb70605731d496af2d57867762c823eecd5c9d6" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.597281 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.604972 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.606282 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.607106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc" event={"ID":"a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8","Type":"ContainerDied","Data":"7c78520c3afb65fc8ed8fa39db23776760da91c22c5288edc2b28c9a26cb76a2"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.611316 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-84f6c" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.611478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-84f6c" event={"ID":"2db08188-97b2-43c0-9b46-7ea6639617ed","Type":"ContainerDied","Data":"ed96ad8010085d02da3c06f4251b0dd27d1caa285e493ed14e244faf3e736055"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.611679 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6ddf8fc7c8-ztvhs"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.613030 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerID="ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32" exitCode=0 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.613082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" event={"ID":"fc35eebb-a329-4334-a277-5a27b1bfba38","Type":"ContainerDied","Data":"ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.613101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" event={"ID":"fc35eebb-a329-4334-a277-5a27b1bfba38","Type":"ContainerStarted","Data":"16f6f2238fc0146bb9c73f0b3597565fcdf99f25f26e4fac1c2f215b7fe00c96"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.623179 4707 scope.go:117] "RemoveContainer" containerID="cd62a3dc2c61a7c191ccbda0e7ac771e13ca81a7f503f321bfd2768094965edd" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.627924 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628095 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-config-data\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38833865-b749-47e4-b140-8a541562f359-logs\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-scripts\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-public-tls-certs\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-combined-ca-bundle\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grx49\" (UniqueName: \"kubernetes.io/projected/38833865-b749-47e4-b140-8a541562f359-kube-api-access-grx49\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-internal-tls-certs\") pod \"38833865-b749-47e4-b140-8a541562f359\" (UID: \"38833865-b749-47e4-b140-8a541562f359\") " Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38833865-b749-47e4-b140-8a541562f359-logs" (OuterVolumeSpecName: "logs") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.628882 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.628923 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.628911284 +0000 UTC m=+1077.810427506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.628855 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.629074 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.629058912 +0000 UTC m=+1077.810575134 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.629128 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.629311 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.629300316 +0000 UTC m=+1077.810816538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.628778 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.629435 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73d67e12-9371-4324-a476-3befcc10743d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.629724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" event={"ID":"14b480ba-3ec1-4c37-a14c-d3b8a20d9837","Type":"ContainerStarted","Data":"9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.630151 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-api" containerID="cri-o://22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4" gracePeriod=30 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.630299 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-httpd" containerID="cri-o://9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997" gracePeriod=30 Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.632507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" event={"ID":"251778b0-6da6-47ba-805d-3d0469f6969d","Type":"ContainerStarted","Data":"aee3d212ff8e73e59c8aa1843a8355ae3c199cae005f9ff60b7c5f62f41d67a7"} Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.633265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-scripts" (OuterVolumeSpecName: "scripts") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.636127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-869fbc79bd-dzdrb"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.639565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38833865-b749-47e4-b140-8a541562f359-kube-api-access-grx49" (OuterVolumeSpecName: "kube-api-access-grx49") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "kube-api-access-grx49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.642867 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.659377 4707 scope.go:117] "RemoveContainer" containerID="74d9d54b14de2c921f565409d9067db30aa7eea74a9bfef927fc5a41ce19f3e5" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.685355 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" podStartSLOduration=7.685337223 podStartE2EDuration="7.685337223s" podCreationTimestamp="2026-01-21 15:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:36.670729563 +0000 UTC m=+1073.852245785" watchObservedRunningTime="2026-01-21 15:19:36.685337223 +0000 UTC m=+1073.866853444" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.707470 4707 scope.go:117] "RemoveContainer" containerID="c48740a97a789fa8e24952640d3171c5efb85d529be89bc23207d43578d4750d" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.723847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.724415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-config-data" (OuterVolumeSpecName: "config-data") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.727861 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-e217-account-create-update-d7fcc"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.735275 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grx49\" (UniqueName: \"kubernetes.io/projected/38833865-b749-47e4-b140-8a541562f359-kube-api-access-grx49\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.735536 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.735708 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38833865-b749-47e4-b140-8a541562f359-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.735991 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.736334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.754360 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-84f6c"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.755823 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-84f6c"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.771997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.775131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "38833865-b749-47e4-b140-8a541562f359" (UID: "38833865-b749-47e4-b140-8a541562f359"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.806574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.809392 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.837128 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.837585 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.837649 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38833865-b749-47e4-b140-8a541562f359-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.921477 4707 scope.go:117] "RemoveContainer" containerID="68f94dbce7c88a21b49fc9a3b3510d01bf3d9de428d0ff716630f26929ab3d41" Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939320 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939341 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939377 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.939363136 +0000 UTC m=+1078.120879357 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939381 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939392 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.939386729 +0000 UTC m=+1078.120902951 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-internal-svc" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939455 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939485 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.939476309 +0000 UTC m=+1078.120992530 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "combined-ca-bundle" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939526 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939547 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.939540739 +0000 UTC m=+1078.121056961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "barbican-api-config-data" not found Jan 21 15:19:36 crc kubenswrapper[4707]: E0121 15:19:36.939581 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs podName:1fee1703-b855-46f5-9bc3-89c66b3377ea nodeName:}" failed. No retries permitted until 2026-01-21 15:19:40.939566467 +0000 UTC m=+1078.121082689 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs") pod "barbican-api-69647dd576-j6h6n" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea") : secret "cert-barbican-public-svc" not found Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.972426 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5b8c8cff84-8wl5w"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.976486 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5b8c8cff84-8wl5w"] Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.978842 4707 scope.go:117] "RemoveContainer" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" Jan 21 15:19:36 crc kubenswrapper[4707]: I0121 15:19:36.987468 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.033372 4707 scope.go:117] "RemoveContainer" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" Jan 21 15:19:37 crc kubenswrapper[4707]: E0121 15:19:37.035914 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b\": container with ID starting with f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b not found: ID does not exist" containerID="f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.035961 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b"} err="failed to get container status \"f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b\": rpc error: code = NotFound desc = could not find container \"f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b\": container with ID starting with f696f36818ced88ea01e4e6f662d25f63050aba88a1974af06864c5c02c2198b not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.036040 4707 scope.go:117] "RemoveContainer" containerID="b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.040941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts\") pod \"251778b0-6da6-47ba-805d-3d0469f6969d\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.041023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fhpq\" (UniqueName: \"kubernetes.io/projected/251778b0-6da6-47ba-805d-3d0469f6969d-kube-api-access-2fhpq\") pod \"251778b0-6da6-47ba-805d-3d0469f6969d\" (UID: \"251778b0-6da6-47ba-805d-3d0469f6969d\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.041688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "251778b0-6da6-47ba-805d-3d0469f6969d" (UID: "251778b0-6da6-47ba-805d-3d0469f6969d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.044562 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251778b0-6da6-47ba-805d-3d0469f6969d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.047113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251778b0-6da6-47ba-805d-3d0469f6969d-kube-api-access-2fhpq" (OuterVolumeSpecName: "kube-api-access-2fhpq") pod "251778b0-6da6-47ba-805d-3d0469f6969d" (UID: "251778b0-6da6-47ba-805d-3d0469f6969d"). InnerVolumeSpecName "kube-api-access-2fhpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.053942 4707 scope.go:117] "RemoveContainer" containerID="a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.084710 4707 scope.go:117] "RemoveContainer" containerID="b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803" Jan 21 15:19:37 crc kubenswrapper[4707]: E0121 15:19:37.085057 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803\": container with ID starting with b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803 not found: ID does not exist" containerID="b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085092 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803"} err="failed to get container status \"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803\": rpc error: code = NotFound desc = could not find container \"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803\": container with ID starting with b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803 not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085116 4707 scope.go:117] "RemoveContainer" containerID="a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528" Jan 21 15:19:37 crc kubenswrapper[4707]: E0121 15:19:37.085372 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528\": container with ID starting with a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528 not found: ID does not exist" containerID="a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085394 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528"} err="failed to get container status \"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528\": rpc error: code = NotFound desc = could not find container \"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528\": container with ID starting with a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528 not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085410 4707 scope.go:117] "RemoveContainer" containerID="b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085639 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803"} err="failed to get container status \"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803\": rpc error: code = NotFound desc = could not find container \"b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803\": container with ID starting with b679f5c8be13e8ee4cc8382d9b68ec731e49abffc6c648e68fae5274844c3803 not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085656 4707 
scope.go:117] "RemoveContainer" containerID="a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085849 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528"} err="failed to get container status \"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528\": rpc error: code = NotFound desc = could not find container \"a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528\": container with ID starting with a605877cf3b4ac8c1db37c116c331258a0aa934e0d2028f9ac6a64fc4893b528 not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.085865 4707 scope.go:117] "RemoveContainer" containerID="62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.092903 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.103637 4707 scope.go:117] "RemoveContainer" containerID="c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.128709 4707 scope.go:117] "RemoveContainer" containerID="62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376" Jan 21 15:19:37 crc kubenswrapper[4707]: E0121 15:19:37.129120 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376\": container with ID starting with 62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376 not found: ID does not exist" containerID="62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.129158 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376"} err="failed to get container status \"62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376\": rpc error: code = NotFound desc = could not find container \"62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376\": container with ID starting with 62856d997c7d8e6edfb1fb21e027e6da7fcd0fb828f406a6c098c52b142dc376 not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.129193 4707 scope.go:117] "RemoveContainer" containerID="c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5" Jan 21 15:19:37 crc kubenswrapper[4707]: E0121 15:19:37.129394 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5\": container with ID starting with c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5 not found: ID does not exist" containerID="c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.129410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5"} err="failed to get container status \"c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5\": rpc error: code = NotFound desc = could not find container 
\"c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5\": container with ID starting with c37cde87b41314e89d86dd96e35aebc1fc6455dcac0789f46d231c2207087fe5 not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data-custom\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp2x8\" (UniqueName: \"kubernetes.io/projected/e476808c-b16e-4aa7-9f4b-71e6a5c32576-kube-api-access-pp2x8\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476808c-b16e-4aa7-9f4b-71e6a5c32576-logs\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-internal-tls-certs\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-combined-ca-bundle\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.145548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-public-tls-certs\") pod \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\" (UID: \"e476808c-b16e-4aa7-9f4b-71e6a5c32576\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.146306 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fhpq\" (UniqueName: \"kubernetes.io/projected/251778b0-6da6-47ba-805d-3d0469f6969d-kube-api-access-2fhpq\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.147753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e476808c-b16e-4aa7-9f4b-71e6a5c32576-logs" (OuterVolumeSpecName: "logs") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.151271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e476808c-b16e-4aa7-9f4b-71e6a5c32576-kube-api-access-pp2x8" (OuterVolumeSpecName: "kube-api-access-pp2x8") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "kube-api-access-pp2x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.151748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.168785 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.176971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data" (OuterVolumeSpecName: "config-data") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.178990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.189703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e476808c-b16e-4aa7-9f4b-71e6a5c32576" (UID: "e476808c-b16e-4aa7-9f4b-71e6a5c32576"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.190123 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" path="/var/lib/kubelet/pods/02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.190788 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290dd378-ee4b-464d-9321-2491ef417d10" path="/var/lib/kubelet/pods/290dd378-ee4b-464d-9321-2491ef417d10/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.191329 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db08188-97b2-43c0-9b46-7ea6639617ed" path="/var/lib/kubelet/pods/2db08188-97b2-43c0-9b46-7ea6639617ed/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.191646 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" path="/var/lib/kubelet/pods/30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.192493 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36457603-5ef9-4f79-bf2c-11413e422937" path="/var/lib/kubelet/pods/36457603-5ef9-4f79-bf2c-11413e422937/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.192956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38833865-b749-47e4-b140-8a541562f359" path="/var/lib/kubelet/pods/38833865-b749-47e4-b140-8a541562f359/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.193528 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" path="/var/lib/kubelet/pods/57f429a9-592b-4ff8-a548-b5e59c4c481e/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.194575 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d67e12-9371-4324-a476-3befcc10743d" path="/var/lib/kubelet/pods/73d67e12-9371-4324-a476-3befcc10743d/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.195110 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9220cbec-82de-4d8a-9bca-575edd0d7482" path="/var/lib/kubelet/pods/9220cbec-82de-4d8a-9bca-575edd0d7482/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.195931 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" path="/var/lib/kubelet/pods/933eeedf-0a55-44e6-84f5-062e3c2716d9/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.196508 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" path="/var/lib/kubelet/pods/a19c4756-e139-49f4-a1e8-99afc8ed8088/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.197353 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8" path="/var/lib/kubelet/pods/a803fdf0-9ccc-4bb6-abe6-6de56dc97cc8/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.197747 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" path="/var/lib/kubelet/pods/bce9bbed-aa2b-429e-85f7-c4202d8af65f/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.198375 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bffc824e-1c16-4342-b73f-91205d93c451" path="/var/lib/kubelet/pods/bffc824e-1c16-4342-b73f-91205d93c451/volumes" 
Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.199259 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" path="/var/lib/kubelet/pods/d582f12e-341b-45f5-a54a-8d016aed473a/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.199711 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" path="/var/lib/kubelet/pods/f44cdae5-61bb-40c3-a1d7-e10d4d64c203/volumes" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247434 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476808c-b16e-4aa7-9f4b-71e6a5c32576-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247453 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247463 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247472 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247480 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247488 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476808c-b16e-4aa7-9f4b-71e6a5c32576-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.247496 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp2x8\" (UniqueName: \"kubernetes.io/projected/e476808c-b16e-4aa7-9f4b-71e6a5c32576-kube-api-access-pp2x8\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.552009 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.592509 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": read tcp 10.217.0.2:37784->10.217.0.150:9311: read: connection reset by peer" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.592636 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": read tcp 10.217.0.2:37778->10.217.0.150:9311: read: connection reset by peer" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-public-tls-certs\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-config-data\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-credential-keys\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-scripts\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-fernet-keys\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnmq\" (UniqueName: \"kubernetes.io/projected/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-kube-api-access-cgnmq\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-internal-tls-certs\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.655708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-combined-ca-bundle\") pod \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\" (UID: \"d41aa1cf-56db-4d5a-91e9-478a8ce591d0\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.664943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-scripts" (OuterVolumeSpecName: "scripts") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.665420 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_9d458004-5d78-43b6-aae5-7afb3dfa0c31/ovn-northd/0.log" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.665455 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" exitCode=139 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.665500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"9d458004-5d78-43b6-aae5-7afb3dfa0c31","Type":"ContainerDied","Data":"9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.665524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"9d458004-5d78-43b6-aae5-7afb3dfa0c31","Type":"ContainerDied","Data":"bc7e5eb6486c9db4b4674e13aebddd0296ac1b5fd7302c7edf70fa0c8444ff41"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.665563 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7e5eb6486c9db4b4674e13aebddd0296ac1b5fd7302c7edf70fa0c8444ff41" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.669113 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_9d458004-5d78-43b6-aae5-7afb3dfa0c31/ovn-northd/0.log" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.669183 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.671025 4707 generic.go:334] "Generic (PLEG): container finished" podID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerID="acc95ab6fc21a9f96f200600c32518407710117ccce4e8740117fccde8402fe5" exitCode=0 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.671074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" event={"ID":"1fee1703-b855-46f5-9bc3-89c66b3377ea","Type":"ContainerDied","Data":"acc95ab6fc21a9f96f200600c32518407710117ccce4e8740117fccde8402fe5"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.672004 4707 generic.go:334] "Generic (PLEG): container finished" podID="d41aa1cf-56db-4d5a-91e9-478a8ce591d0" containerID="cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee" exitCode=0 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.672040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" event={"ID":"d41aa1cf-56db-4d5a-91e9-478a8ce591d0","Type":"ContainerDied","Data":"cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.672054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" event={"ID":"d41aa1cf-56db-4d5a-91e9-478a8ce591d0","Type":"ContainerDied","Data":"4a7f21396374a414dfad0a29d9cb9bc668499fd4dfba414eed3a54dd78ab02f6"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.672068 4707 scope.go:117] "RemoveContainer" containerID="cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.672131 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-84f545d4f5-d8v92" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.708905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.708977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.709044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-kube-api-access-cgnmq" (OuterVolumeSpecName: "kube-api-access-cgnmq") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "kube-api-access-cgnmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.720123 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.722429 4707 generic.go:334] "Generic (PLEG): container finished" podID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerID="9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997" exitCode=0 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.722501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" event={"ID":"14b480ba-3ec1-4c37-a14c-d3b8a20d9837","Type":"ContainerDied","Data":"9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.725990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" event={"ID":"251778b0-6da6-47ba-805d-3d0469f6969d","Type":"ContainerDied","Data":"aee3d212ff8e73e59c8aa1843a8355ae3c199cae005f9ff60b7c5f62f41d67a7"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.726062 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.735975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.756980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-sg-core-conf-yaml\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-rundir\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjkmg\" (UniqueName: \"kubernetes.io/projected/b16efd30-cb63-4031-aab4-ec6ca21d7528-kube-api-access-wjkmg\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-metrics-certs-tls-certs\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-combined-ca-bundle\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757131 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-scripts\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-ceilometer-tls-certs\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-northd-tls-certs\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-config-data\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-config\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-log-httpd\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-combined-ca-bundle\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-scripts\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd4mc\" (UniqueName: \"kubernetes.io/projected/9d458004-5d78-43b6-aae5-7afb3dfa0c31-kube-api-access-xd4mc\") pod \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\" (UID: \"9d458004-5d78-43b6-aae5-7afb3dfa0c31\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-run-httpd\") pod \"b16efd30-cb63-4031-aab4-ec6ca21d7528\" (UID: \"b16efd30-cb63-4031-aab4-ec6ca21d7528\") " Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757725 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnmq\" 
(UniqueName: \"kubernetes.io/projected/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-kube-api-access-cgnmq\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757865 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757881 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757890 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757898 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.757929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.758099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759233 4707 generic.go:334] "Generic (PLEG): container finished" podID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerID="45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e" exitCode=0 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759257 4707 generic.go:334] "Generic (PLEG): container finished" podID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerID="ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13" exitCode=143 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" event={"ID":"e476808c-b16e-4aa7-9f4b-71e6a5c32576","Type":"ContainerDied","Data":"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" event={"ID":"e476808c-b16e-4aa7-9f4b-71e6a5c32576","Type":"ContainerDied","Data":"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" event={"ID":"e476808c-b16e-4aa7-9f4b-71e6a5c32576","Type":"ContainerDied","Data":"b31de5931197a89702e85a64f235ad8862ba6805b13d1f2e6e303426a53ef2f8"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759405 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-config-data" (OuterVolumeSpecName: "config-data") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.759920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.761924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-scripts" (OuterVolumeSpecName: "scripts") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.764057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.764136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-config" (OuterVolumeSpecName: "config") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.769120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" event={"ID":"fc35eebb-a329-4334-a277-5a27b1bfba38","Type":"ContainerStarted","Data":"a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.775253 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.791996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16efd30-cb63-4031-aab4-ec6ca21d7528-kube-api-access-wjkmg" (OuterVolumeSpecName: "kube-api-access-wjkmg") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "kube-api-access-wjkmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.792051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d458004-5d78-43b6-aae5-7afb3dfa0c31-kube-api-access-xd4mc" (OuterVolumeSpecName: "kube-api-access-xd4mc") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "kube-api-access-xd4mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.798992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-scripts" (OuterVolumeSpecName: "scripts") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.808507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d41aa1cf-56db-4d5a-91e9-478a8ce591d0" (UID: "d41aa1cf-56db-4d5a-91e9-478a8ce591d0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.817012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.817312 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" podStartSLOduration=4.817300117 podStartE2EDuration="4.817300117s" podCreationTimestamp="2026-01-21 15:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:19:37.816470556 +0000 UTC m=+1074.997986778" watchObservedRunningTime="2026-01-21 15:19:37.817300117 +0000 UTC m=+1074.998816339" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.838790 4707 generic.go:334] "Generic (PLEG): container finished" podID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerID="d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558" exitCode=0 Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.838864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerDied","Data":"d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.838893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b16efd30-cb63-4031-aab4-ec6ca21d7528","Type":"ContainerDied","Data":"a7277f4a240e0d1d1cc9dbdec25ae84f35054764e8e6314a29c9df618f186901"} Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.838952 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.857374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860506 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860532 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860541 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjkmg\" (UniqueName: \"kubernetes.io/projected/b16efd30-cb63-4031-aab4-ec6ca21d7528-kube-api-access-wjkmg\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860551 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860584 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860593 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860601 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860609 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860617 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860626 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41aa1cf-56db-4d5a-91e9-478a8ce591d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860634 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d458004-5d78-43b6-aae5-7afb3dfa0c31-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860665 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd4mc\" (UniqueName: \"kubernetes.io/projected/9d458004-5d78-43b6-aae5-7afb3dfa0c31-kube-api-access-xd4mc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.860674 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b16efd30-cb63-4031-aab4-ec6ca21d7528-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 
15:19:37.887583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.915280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "9d458004-5d78-43b6-aae5-7afb3dfa0c31" (UID: "9d458004-5d78-43b6-aae5-7afb3dfa0c31"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.916993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.931158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.939051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-config-data" (OuterVolumeSpecName: "config-data") pod "b16efd30-cb63-4031-aab4-ec6ca21d7528" (UID: "b16efd30-cb63-4031-aab4-ec6ca21d7528"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.962287 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.962317 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.962327 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.962335 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d458004-5d78-43b6-aae5-7afb3dfa0c31-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.962344 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16efd30-cb63-4031-aab4-ec6ca21d7528-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.966369 4707 scope.go:117] "RemoveContainer" containerID="cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee" Jan 21 15:19:37 crc kubenswrapper[4707]: E0121 15:19:37.970002 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee\": container with ID starting with cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee not found: ID does not exist" containerID="cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.970032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee"} err="failed to get container status \"cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee\": rpc error: code = NotFound desc = could not find container \"cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee\": container with ID starting with cffb1594ca860b75c87ae56b8474562b40c4b24a1c0a8457c02620d90242a1ee not found: ID does not exist" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.970051 4707 scope.go:117] "RemoveContainer" containerID="45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.971093 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw"] Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.976943 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-37a4-account-create-update-rtnnw"] Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.978305 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.984719 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9"] Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.994178 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-55bdf489cb-7thg9"] Jan 21 15:19:37 crc kubenswrapper[4707]: I0121 15:19:37.999029 4707 scope.go:117] "RemoveContainer" containerID="ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.009552 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-84f545d4f5-d8v92"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.016565 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-84f545d4f5-d8v92"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.019287 4707 scope.go:117] "RemoveContainer" containerID="45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e" Jan 21 15:19:38 crc kubenswrapper[4707]: E0121 15:19:38.019594 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e\": container with ID starting with 45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e not found: ID does not exist" containerID="45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.019619 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e"} err="failed to get container status \"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e\": rpc error: code = NotFound desc = could not find container \"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e\": container with ID starting with 45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.019640 4707 scope.go:117] "RemoveContainer" containerID="ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13" Jan 21 15:19:38 crc kubenswrapper[4707]: E0121 15:19:38.019851 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13\": container with ID starting with ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13 not found: ID does not exist" containerID="ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.019872 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13"} err="failed to get container status \"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13\": rpc error: code = NotFound desc = could not find container \"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13\": container with ID starting with ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13 not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.019884 4707 scope.go:117] "RemoveContainer" 
containerID="45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.020097 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e"} err="failed to get container status \"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e\": rpc error: code = NotFound desc = could not find container \"45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e\": container with ID starting with 45a895eea53c4d35ff1117d05b957e3c2efd5b4ae2d6ac5cc0c061fc0e17a78e not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.020115 4707 scope.go:117] "RemoveContainer" containerID="ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.020345 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13"} err="failed to get container status \"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13\": rpc error: code = NotFound desc = could not find container \"ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13\": container with ID starting with ab100370a07339038d81d08dbd4983fd7c808f4f91f0e4c62d234baa0cd1bf13 not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.020382 4707 scope.go:117] "RemoveContainer" containerID="9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.035635 4707 scope.go:117] "RemoveContainer" containerID="823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.051262 4707 scope.go:117] "RemoveContainer" containerID="d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.066978 4707 scope.go:117] "RemoveContainer" containerID="3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.081242 4707 scope.go:117] "RemoveContainer" containerID="9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c" Jan 21 15:19:38 crc kubenswrapper[4707]: E0121 15:19:38.081529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c\": container with ID starting with 9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c not found: ID does not exist" containerID="9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.081555 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c"} err="failed to get container status \"9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c\": rpc error: code = NotFound desc = could not find container \"9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c\": container with ID starting with 9344039b27c2d8c4d1683d57c7e6f74e70ca941c0d5e9e5caf6e8a1c6bc1b57c not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.081574 4707 scope.go:117] "RemoveContainer" 
containerID="823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8" Jan 21 15:19:38 crc kubenswrapper[4707]: E0121 15:19:38.081878 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8\": container with ID starting with 823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8 not found: ID does not exist" containerID="823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.081923 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8"} err="failed to get container status \"823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8\": rpc error: code = NotFound desc = could not find container \"823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8\": container with ID starting with 823c80c30eb9eaeccef9da15a29c63d9689ea2f424c3730e5df5b6fccca388c8 not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.081942 4707 scope.go:117] "RemoveContainer" containerID="d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558" Jan 21 15:19:38 crc kubenswrapper[4707]: E0121 15:19:38.082231 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558\": container with ID starting with d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558 not found: ID does not exist" containerID="d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.082253 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558"} err="failed to get container status \"d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558\": rpc error: code = NotFound desc = could not find container \"d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558\": container with ID starting with d696d2cbe2cc66061a5a25a3376cde1f306630dea9c3e885cb73e7bf84730558 not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.082266 4707 scope.go:117] "RemoveContainer" containerID="3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211" Jan 21 15:19:38 crc kubenswrapper[4707]: E0121 15:19:38.082473 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211\": container with ID starting with 3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211 not found: ID does not exist" containerID="3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.082513 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211"} err="failed to get container status \"3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211\": rpc error: code = NotFound desc = could not find container \"3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211\": container with ID starting with 
3560bd7d90d8285bb1d6f4fd0eb0de36ad621507b5e608fd50550f2d85228211 not found: ID does not exist" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fee1703-b855-46f5-9bc3-89c66b3377ea-logs\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69sb7\" (UniqueName: \"kubernetes.io/projected/1fee1703-b855-46f5-9bc3-89c66b3377ea-kube-api-access-69sb7\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.165491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs\") pod \"1fee1703-b855-46f5-9bc3-89c66b3377ea\" (UID: \"1fee1703-b855-46f5-9bc3-89c66b3377ea\") " Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.166072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fee1703-b855-46f5-9bc3-89c66b3377ea-logs" (OuterVolumeSpecName: "logs") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.170389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.171383 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.176450 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.181210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fee1703-b855-46f5-9bc3-89c66b3377ea-kube-api-access-69sb7" (OuterVolumeSpecName: "kube-api-access-69sb7") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "kube-api-access-69sb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.185783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.194214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.197466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data" (OuterVolumeSpecName: "config-data") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.197947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fee1703-b855-46f5-9bc3-89c66b3377ea" (UID: "1fee1703-b855-46f5-9bc3-89c66b3377ea"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267640 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fee1703-b855-46f5-9bc3-89c66b3377ea-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267672 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69sb7\" (UniqueName: \"kubernetes.io/projected/1fee1703-b855-46f5-9bc3-89c66b3377ea-kube-api-access-69sb7\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267683 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267692 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267701 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267711 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.267718 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fee1703-b855-46f5-9bc3-89c66b3377ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.860641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" event={"ID":"1fee1703-b855-46f5-9bc3-89c66b3377ea","Type":"ContainerDied","Data":"213eab423feed883e957edd4043262addaa6f7acc6a2303daa55c0be03a83ae6"} Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.860869 4707 scope.go:117] "RemoveContainer" containerID="acc95ab6fc21a9f96f200600c32518407710117ccce4e8740117fccde8402fe5" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.860953 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-69647dd576-j6h6n" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.870659 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.883339 4707 scope.go:117] "RemoveContainer" containerID="8dbd5877a5c421fb4b87c4bb6c848927f876f78802f4b31d44917a93f5d9c274" Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.892913 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-69647dd576-j6h6n"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.900908 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-69647dd576-j6h6n"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.901003 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:19:38 crc kubenswrapper[4707]: I0121 15:19:38.904911 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:19:39 crc kubenswrapper[4707]: E0121 15:19:39.181734 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:39 crc kubenswrapper[4707]: E0121 15:19:39.181800 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data podName:f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:47.181785932 +0000 UTC m=+1084.363302153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:19:39 crc kubenswrapper[4707]: I0121 15:19:39.191325 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" path="/var/lib/kubelet/pods/1fee1703-b855-46f5-9bc3-89c66b3377ea/volumes" Jan 21 15:19:39 crc kubenswrapper[4707]: I0121 15:19:39.191847 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251778b0-6da6-47ba-805d-3d0469f6969d" path="/var/lib/kubelet/pods/251778b0-6da6-47ba-805d-3d0469f6969d/volumes" Jan 21 15:19:39 crc kubenswrapper[4707]: I0121 15:19:39.192232 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" path="/var/lib/kubelet/pods/9d458004-5d78-43b6-aae5-7afb3dfa0c31/volumes" Jan 21 15:19:39 crc kubenswrapper[4707]: I0121 15:19:39.193157 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" path="/var/lib/kubelet/pods/b16efd30-cb63-4031-aab4-ec6ca21d7528/volumes" Jan 21 15:19:39 crc kubenswrapper[4707]: I0121 15:19:39.193777 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41aa1cf-56db-4d5a-91e9-478a8ce591d0" path="/var/lib/kubelet/pods/d41aa1cf-56db-4d5a-91e9-478a8ce591d0/volumes" Jan 21 15:19:39 crc kubenswrapper[4707]: I0121 15:19:39.194269 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" path="/var/lib/kubelet/pods/e476808c-b16e-4aa7-9f4b-71e6a5c32576/volumes" Jan 21 15:19:39 crc kubenswrapper[4707]: E0121 15:19:39.689106 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:19:39 crc kubenswrapper[4707]: E0121 15:19:39.689169 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data podName:945ded51-9961-411b-b00f-0543ac91a18a nodeName:}" failed. No retries permitted until 2026-01-21 15:19:47.689155684 +0000 UTC m=+1084.870671906 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data") pod "rabbitmq-server-0" (UID: "945ded51-9961-411b-b00f-0543ac91a18a") : configmap "rabbitmq-config-data" not found Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.607104 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mjd4\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-kube-api-access-8mjd4\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-pod-info\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-erlang-cookie\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-plugins\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-server-conf\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-plugins-conf\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-confd\") pod 
\"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.610995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.611008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-tls\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.611032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-erlang-cookie-secret\") pod \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\" (UID: \"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.612056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.612403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.615235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.615247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.615334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-kube-api-access-8mjd4" (OuterVolumeSpecName: "kube-api-access-8mjd4") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "kube-api-access-8mjd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.615748 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.616510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-pod-info" (OuterVolumeSpecName: "pod-info") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.616844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.628648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.631298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data" (OuterVolumeSpecName: "config-data") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.652517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-server-conf" (OuterVolumeSpecName: "server-conf") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.678642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" (UID: "f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-erlang-cookie\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-server-conf\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x9qr\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-kube-api-access-9x9qr\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-confd\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-plugins\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-plugins-conf\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-tls\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/945ded51-9961-411b-b00f-0543ac91a18a-erlang-cookie-secret\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/945ded51-9961-411b-b00f-0543ac91a18a-pod-info\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: 
\"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.711999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data\") pod \"945ded51-9961-411b-b00f-0543ac91a18a\" (UID: \"945ded51-9961-411b-b00f-0543ac91a18a\") " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mjd4\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-kube-api-access-8mjd4\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712415 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712424 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712435 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712444 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712452 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712462 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712469 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712486 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712495 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.712504 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.717342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.717446 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.717499 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:48.717484212 +0000 UTC m=+1085.899000434 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.717504 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.717596 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:48.71754172 +0000 UTC m=+1085.899057942 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.717730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.717828 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.717856 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:19:48.717848898 +0000 UTC m=+1085.899365120 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.719580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.719607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-kube-api-access-9x9qr" (OuterVolumeSpecName: "kube-api-access-9x9qr") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "kube-api-access-9x9qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.719766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.719955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.720395 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945ded51-9961-411b-b00f-0543ac91a18a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.721726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/945ded51-9961-411b-b00f-0543ac91a18a-pod-info" (OuterVolumeSpecName: "pod-info") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.729802 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.730594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data" (OuterVolumeSpecName: "config-data") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.748004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-server-conf" (OuterVolumeSpecName: "server-conf") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.767136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "945ded51-9961-411b-b00f-0543ac91a18a" (UID: "945ded51-9961-411b-b00f-0543ac91a18a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.813822 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.813998 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x9qr\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-kube-api-access-9x9qr\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814073 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814134 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814196 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814247 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814295 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/945ded51-9961-411b-b00f-0543ac91a18a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814362 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814477 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/945ded51-9961-411b-b00f-0543ac91a18a-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814535 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/945ded51-9961-411b-b00f-0543ac91a18a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814592 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.814650 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/945ded51-9961-411b-b00f-0543ac91a18a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.826045 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.886527 4707 generic.go:334] "Generic (PLEG): container finished" podID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerID="3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7" exitCode=0 Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.886591 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.886590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0","Type":"ContainerDied","Data":"3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7"} Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.886715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0","Type":"ContainerDied","Data":"4eb05e2d5e2d8eb167bc5585a41c3eeb5817bc995c97104c6b8e234d4dbfd1d8"} Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.886744 4707 scope.go:117] "RemoveContainer" containerID="3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.888847 4707 generic.go:334] "Generic (PLEG): container finished" podID="945ded51-9961-411b-b00f-0543ac91a18a" containerID="c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4" exitCode=0 Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.888875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"945ded51-9961-411b-b00f-0543ac91a18a","Type":"ContainerDied","Data":"c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4"} Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.888889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"945ded51-9961-411b-b00f-0543ac91a18a","Type":"ContainerDied","Data":"c52196e4c3b9f97fab04d935d73b2d868fa4a476add586a3c4280639cd2164fe"} Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.888889 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.901783 4707 scope.go:117] "RemoveContainer" containerID="f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.915897 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.926715 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.933440 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.935760 4707 scope.go:117] "RemoveContainer" containerID="3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7" Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.936271 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7\": container with ID starting with 3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7 not found: ID does not exist" containerID="3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.936302 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7"} err="failed to get container status \"3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7\": rpc error: code = NotFound desc = could not find container \"3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7\": container with ID starting with 3e89514bb3ef81286fa823342045e8d9a8b7d3bbdf0da4b5e799d28d28c555f7 not found: ID does not exist" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.936325 4707 scope.go:117] "RemoveContainer" containerID="f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178" Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.936592 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178\": container with ID starting with f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178 not found: ID does not exist" containerID="f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.936629 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178"} err="failed to get container status \"f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178\": rpc error: code = NotFound desc = could not find container \"f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178\": container with ID starting with f9d2942fe604e14511fed681fb9e40c2ac78fd9c4979576c63e2a2935c162178 not found: ID does not exist" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.936649 4707 scope.go:117] "RemoveContainer" containerID="c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.939045 
4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.944281 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.950927 4707 scope.go:117] "RemoveContainer" containerID="3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.975523 4707 scope.go:117] "RemoveContainer" containerID="c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4" Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.975971 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4\": container with ID starting with c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4 not found: ID does not exist" containerID="c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.975996 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4"} err="failed to get container status \"c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4\": rpc error: code = NotFound desc = could not find container \"c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4\": container with ID starting with c7cfd48cfefe3036574c13500f63ab4eba9d40cb05f3cbdfadf4b90d5a1af4f4 not found: ID does not exist" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.976014 4707 scope.go:117] "RemoveContainer" containerID="3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0" Jan 21 15:19:40 crc kubenswrapper[4707]: E0121 15:19:40.976303 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0\": container with ID starting with 3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0 not found: ID does not exist" containerID="3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0" Jan 21 15:19:40 crc kubenswrapper[4707]: I0121 15:19:40.976335 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0"} err="failed to get container status \"3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0\": rpc error: code = NotFound desc = could not find container \"3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0\": container with ID starting with 3c7c33794e5373ffa9660de47464bcb0a2625565893f2f84cbae1b0bdfe6f4a0 not found: ID does not exist" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.192350 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945ded51-9961-411b-b00f-0543ac91a18a" path="/var/lib/kubelet/pods/945ded51-9961-411b-b00f-0543ac91a18a/volumes" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.200899 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" path="/var/lib/kubelet/pods/f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0/volumes" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.712735 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.716700 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727393 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data-custom\") pod \"df067294-2809-413f-b245-6cd57c0cbf11\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data\") pod \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data-custom\") pod \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-combined-ca-bundle\") pod \"df067294-2809-413f-b245-6cd57c0cbf11\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df067294-2809-413f-b245-6cd57c0cbf11-logs\") pod \"df067294-2809-413f-b245-6cd57c0cbf11\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggk7r\" (UniqueName: \"kubernetes.io/projected/df067294-2809-413f-b245-6cd57c0cbf11-kube-api-access-ggk7r\") pod \"df067294-2809-413f-b245-6cd57c0cbf11\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.727956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df067294-2809-413f-b245-6cd57c0cbf11-logs" (OuterVolumeSpecName: "logs") pod "df067294-2809-413f-b245-6cd57c0cbf11" (UID: "df067294-2809-413f-b245-6cd57c0cbf11"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.728358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data\") pod \"df067294-2809-413f-b245-6cd57c0cbf11\" (UID: \"df067294-2809-413f-b245-6cd57c0cbf11\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.728394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km2wl\" (UniqueName: \"kubernetes.io/projected/ed6331a2-4b0a-407e-8ee0-3b1a82343071-kube-api-access-km2wl\") pod \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.728706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-combined-ca-bundle\") pod \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.728729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6331a2-4b0a-407e-8ee0-3b1a82343071-logs\") pod \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\" (UID: \"ed6331a2-4b0a-407e-8ee0-3b1a82343071\") " Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.729436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6331a2-4b0a-407e-8ee0-3b1a82343071-logs" (OuterVolumeSpecName: "logs") pod "ed6331a2-4b0a-407e-8ee0-3b1a82343071" (UID: "ed6331a2-4b0a-407e-8ee0-3b1a82343071"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.730236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df067294-2809-413f-b245-6cd57c0cbf11" (UID: "df067294-2809-413f-b245-6cd57c0cbf11"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.730528 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6331a2-4b0a-407e-8ee0-3b1a82343071-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.730592 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.730646 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df067294-2809-413f-b245-6cd57c0cbf11-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.731367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6331a2-4b0a-407e-8ee0-3b1a82343071-kube-api-access-km2wl" (OuterVolumeSpecName: "kube-api-access-km2wl") pod "ed6331a2-4b0a-407e-8ee0-3b1a82343071" (UID: "ed6331a2-4b0a-407e-8ee0-3b1a82343071"). InnerVolumeSpecName "kube-api-access-km2wl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.731865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ed6331a2-4b0a-407e-8ee0-3b1a82343071" (UID: "ed6331a2-4b0a-407e-8ee0-3b1a82343071"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.736554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df067294-2809-413f-b245-6cd57c0cbf11-kube-api-access-ggk7r" (OuterVolumeSpecName: "kube-api-access-ggk7r") pod "df067294-2809-413f-b245-6cd57c0cbf11" (UID: "df067294-2809-413f-b245-6cd57c0cbf11"). InnerVolumeSpecName "kube-api-access-ggk7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.748246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df067294-2809-413f-b245-6cd57c0cbf11" (UID: "df067294-2809-413f-b245-6cd57c0cbf11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.749730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed6331a2-4b0a-407e-8ee0-3b1a82343071" (UID: "ed6331a2-4b0a-407e-8ee0-3b1a82343071"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.761146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data" (OuterVolumeSpecName: "config-data") pod "ed6331a2-4b0a-407e-8ee0-3b1a82343071" (UID: "ed6331a2-4b0a-407e-8ee0-3b1a82343071"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.762900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data" (OuterVolumeSpecName: "config-data") pod "df067294-2809-413f-b245-6cd57c0cbf11" (UID: "df067294-2809-413f-b245-6cd57c0cbf11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832591 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km2wl\" (UniqueName: \"kubernetes.io/projected/ed6331a2-4b0a-407e-8ee0-3b1a82343071-kube-api-access-km2wl\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832619 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832628 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832638 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed6331a2-4b0a-407e-8ee0-3b1a82343071-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832645 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggk7r\" (UniqueName: \"kubernetes.io/projected/df067294-2809-413f-b245-6cd57c0cbf11-kube-api-access-ggk7r\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.832661 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df067294-2809-413f-b245-6cd57c0cbf11-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.897314 4707 generic.go:334] "Generic (PLEG): container finished" podID="df067294-2809-413f-b245-6cd57c0cbf11" containerID="42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964" exitCode=0 Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.897348 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.897375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" event={"ID":"df067294-2809-413f-b245-6cd57c0cbf11","Type":"ContainerDied","Data":"42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964"} Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.897402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-78998996df-sk4d7" event={"ID":"df067294-2809-413f-b245-6cd57c0cbf11","Type":"ContainerDied","Data":"fb2ef70e2aeec9824e1137b5af1b57c5b43f6f86a9b0c6ecb002ffc1a47aa6fa"} Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.897417 4707 scope.go:117] "RemoveContainer" containerID="42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.900645 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerID="00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0" exitCode=0 Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.900690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" event={"ID":"ed6331a2-4b0a-407e-8ee0-3b1a82343071","Type":"ContainerDied","Data":"00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0"} Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.900723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.900732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw" event={"ID":"ed6331a2-4b0a-407e-8ee0-3b1a82343071","Type":"ContainerDied","Data":"fb752dfa05e58405fd49147fa5e2b87c86854c23fa4745a68926b27b773715af"} Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.915072 4707 scope.go:117] "RemoveContainer" containerID="2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.926472 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-78998996df-sk4d7"] Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.934549 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-78998996df-sk4d7"] Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.943536 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw"] Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.947676 4707 scope.go:117] "RemoveContainer" containerID="42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.948901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b9f974bb6-5sgjw"] Jan 21 15:19:41 crc kubenswrapper[4707]: E0121 15:19:41.950140 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964\": container with ID starting with 42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964 not found: ID does not 
exist" containerID="42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.950186 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964"} err="failed to get container status \"42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964\": rpc error: code = NotFound desc = could not find container \"42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964\": container with ID starting with 42d62ef5f282edeb2563f2db5a11716f5b0650b5fac2965656092e5ac9f1f964 not found: ID does not exist" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.950216 4707 scope.go:117] "RemoveContainer" containerID="2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a" Jan 21 15:19:41 crc kubenswrapper[4707]: E0121 15:19:41.950674 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a\": container with ID starting with 2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a not found: ID does not exist" containerID="2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.950745 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a"} err="failed to get container status \"2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a\": rpc error: code = NotFound desc = could not find container \"2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a\": container with ID starting with 2b73fdcef853a349abde55baa394f064f420199add5dfb38f140e266b674446a not found: ID does not exist" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.950795 4707 scope.go:117] "RemoveContainer" containerID="00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.970688 4707 scope.go:117] "RemoveContainer" containerID="8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.985124 4707 scope.go:117] "RemoveContainer" containerID="00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0" Jan 21 15:19:41 crc kubenswrapper[4707]: E0121 15:19:41.985411 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0\": container with ID starting with 00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0 not found: ID does not exist" containerID="00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.985439 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0"} err="failed to get container status \"00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0\": rpc error: code = NotFound desc = could not find container \"00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0\": container with ID starting with 00ebdcc30cf164e2af5d364b6142e27fb221bb60135cf89869fc4900fb6c82e0 not found: ID does not exist" Jan 21 15:19:41 crc kubenswrapper[4707]: 
I0121 15:19:41.985457 4707 scope.go:117] "RemoveContainer" containerID="8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb" Jan 21 15:19:41 crc kubenswrapper[4707]: E0121 15:19:41.985733 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb\": container with ID starting with 8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb not found: ID does not exist" containerID="8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb" Jan 21 15:19:41 crc kubenswrapper[4707]: I0121 15:19:41.985771 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb"} err="failed to get container status \"8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb\": rpc error: code = NotFound desc = could not find container \"8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb\": container with ID starting with 8b76a4e4bbc292d4024533440328167e158045b382f20e0cb99b329dc81abeeb not found: ID does not exist" Jan 21 15:19:43 crc kubenswrapper[4707]: I0121 15:19:43.189451 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df067294-2809-413f-b245-6cd57c0cbf11" path="/var/lib/kubelet/pods/df067294-2809-413f-b245-6cd57c0cbf11/volumes" Jan 21 15:19:43 crc kubenswrapper[4707]: I0121 15:19:43.190154 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" path="/var/lib/kubelet/pods/ed6331a2-4b0a-407e-8ee0-3b1a82343071/volumes" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.238933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.291067 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45"] Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.291293 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerName="dnsmasq-dns" containerID="cri-o://fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426" gracePeriod=10 Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.651109 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.671157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dnsmasq-svc\") pod \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.671258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-config\") pod \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.671279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74kmf\" (UniqueName: \"kubernetes.io/projected/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-kube-api-access-74kmf\") pod \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.671917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dns-swift-storage-0\") pod \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\" (UID: \"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0\") " Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.678045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-kube-api-access-74kmf" (OuterVolumeSpecName: "kube-api-access-74kmf") pod "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" (UID: "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0"). InnerVolumeSpecName "kube-api-access-74kmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.699061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-config" (OuterVolumeSpecName: "config") pod "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" (UID: "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.699331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" (UID: "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.718565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" (UID: "fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.773336 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.773478 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.773664 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74kmf\" (UniqueName: \"kubernetes.io/projected/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-kube-api-access-74kmf\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.773733 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.934500 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerID="fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426" exitCode=0 Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.934541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" event={"ID":"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0","Type":"ContainerDied","Data":"fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426"} Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.934559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" event={"ID":"fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0","Type":"ContainerDied","Data":"00e604df0934a581604e8556520d8697b2ded5b6546bb60c05f3de1ae5cdd740"} Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.934588 4707 scope.go:117] "RemoveContainer" containerID="fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.934661 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.965660 4707 scope.go:117] "RemoveContainer" containerID="9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.971493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45"] Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.975446 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84fc46597c-zfl45"] Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.982871 4707 scope.go:117] "RemoveContainer" containerID="fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426" Jan 21 15:19:44 crc kubenswrapper[4707]: E0121 15:19:44.983338 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426\": container with ID starting with fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426 not found: ID does not exist" containerID="fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.983373 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426"} err="failed to get container status \"fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426\": rpc error: code = NotFound desc = could not find container \"fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426\": container with ID starting with fc506ee3960651e9bd9fda575184b0e6a358133346f23dcd37dd864efa34e426 not found: ID does not exist" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.983398 4707 scope.go:117] "RemoveContainer" containerID="9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32" Jan 21 15:19:44 crc kubenswrapper[4707]: E0121 15:19:44.983736 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32\": container with ID starting with 9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32 not found: ID does not exist" containerID="9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32" Jan 21 15:19:44 crc kubenswrapper[4707]: I0121 15:19:44.983760 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32"} err="failed to get container status \"9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32\": rpc error: code = NotFound desc = could not find container \"9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32\": container with ID starting with 9649e1a9b08cb3982617491d5482c9167d9695019fa305f105d606a008e27c32 not found: ID does not exist" Jan 21 15:19:45 crc kubenswrapper[4707]: I0121 15:19:45.189469 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" path="/var/lib/kubelet/pods/fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0/volumes" Jan 21 15:19:47 crc kubenswrapper[4707]: I0121 15:19:47.229243 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" 
podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.141:9696/\": dial tcp 10.217.0.141:9696: connect: connection refused" Jan 21 15:19:48 crc kubenswrapper[4707]: E0121 15:19:48.732492 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:19:48 crc kubenswrapper[4707]: E0121 15:19:48.732492 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:19:48 crc kubenswrapper[4707]: E0121 15:19:48.732734 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:04.732720562 +0000 UTC m=+1101.914236785 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:19:48 crc kubenswrapper[4707]: E0121 15:19:48.732497 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:19:48 crc kubenswrapper[4707]: E0121 15:19:48.732786 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:04.732768282 +0000 UTC m=+1101.914284504 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:19:48 crc kubenswrapper[4707]: E0121 15:19:48.732862 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:04.732844306 +0000 UTC m=+1101.914360538 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:20:00 crc kubenswrapper[4707]: I0121 15:20:00.499011 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:20:00 crc kubenswrapper[4707]: I0121 15:20:00.501158 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.192:9696/\": dial tcp 10.217.0.192:9696: connect: connection refused" Jan 21 15:20:01 crc kubenswrapper[4707]: I0121 15:20:01.611370 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.758421 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5f98bc6758-lh9vt_7713e80a-5469-4748-a300-4c581d9fd503/neutron-api/0.log" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.758487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpmqq\" (UniqueName: \"kubernetes.io/projected/7713e80a-5469-4748-a300-4c581d9fd503-kube-api-access-kpmqq\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-internal-tls-certs\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-config\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-ovndb-tls-certs\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-public-tls-certs\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-combined-ca-bundle\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: 
\"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.812734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-httpd-config\") pod \"7713e80a-5469-4748-a300-4c581d9fd503\" (UID: \"7713e80a-5469-4748-a300-4c581d9fd503\") " Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.816736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.816776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7713e80a-5469-4748-a300-4c581d9fd503-kube-api-access-kpmqq" (OuterVolumeSpecName: "kube-api-access-kpmqq") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "kube-api-access-kpmqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.839375 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.839538 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.840965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-config" (OuterVolumeSpecName: "config") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.842262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.850588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7713e80a-5469-4748-a300-4c581d9fd503" (UID: "7713e80a-5469-4748-a300-4c581d9fd503"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914559 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914593 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914603 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914611 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914622 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpmqq\" (UniqueName: \"kubernetes.io/projected/7713e80a-5469-4748-a300-4c581d9fd503-kube-api-access-kpmqq\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914631 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:02 crc kubenswrapper[4707]: I0121 15:20:02.914639 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7713e80a-5469-4748-a300-4c581d9fd503-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.051590 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5f98bc6758-lh9vt_7713e80a-5469-4748-a300-4c581d9fd503/neutron-api/0.log" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.051757 4707 generic.go:334] "Generic (PLEG): container finished" podID="7713e80a-5469-4748-a300-4c581d9fd503" containerID="451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a" exitCode=137 Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.051801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" event={"ID":"7713e80a-5469-4748-a300-4c581d9fd503","Type":"ContainerDied","Data":"451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a"} Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.051841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" event={"ID":"7713e80a-5469-4748-a300-4c581d9fd503","Type":"ContainerDied","Data":"d06a984dae75f2619b616314b69fc8158103d5c68bcc4bb569c6ea5cdd3be708"} Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.051845 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5f98bc6758-lh9vt" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.051856 4707 scope.go:117] "RemoveContainer" containerID="0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.076544 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerID="767c60a84352183a0a1460f1f409ae4f3ff11f0dc3b14e0f08ef19858a8e4c10" exitCode=137 Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.076582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"767c60a84352183a0a1460f1f409ae4f3ff11f0dc3b14e0f08ef19858a8e4c10"} Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.077748 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5f98bc6758-lh9vt"] Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.083704 4707 scope.go:117] "RemoveContainer" containerID="451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.088041 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5f98bc6758-lh9vt"] Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.097511 4707 scope.go:117] "RemoveContainer" containerID="0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e" Jan 21 15:20:03 crc kubenswrapper[4707]: E0121 15:20:03.097769 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e\": container with ID starting with 0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e not found: ID does not exist" containerID="0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.097795 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e"} err="failed to get container status \"0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e\": rpc error: code = NotFound desc = could not find container \"0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e\": container with ID starting with 0eb8a35efad70803cf665a35b2d82d4c0dc80e96d0fe04cc7aad313f2f4b450e not found: ID does not exist" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.097832 4707 scope.go:117] "RemoveContainer" containerID="451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a" Jan 21 15:20:03 crc kubenswrapper[4707]: E0121 15:20:03.098015 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a\": container with ID starting with 451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a not found: ID does not exist" containerID="451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.098039 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a"} err="failed to get container status 
\"451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a\": rpc error: code = NotFound desc = could not find container \"451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a\": container with ID starting with 451dbc01bf9532de7339ae804c774197e5b081b1883ba36e172558dd32363e1a not found: ID does not exist" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.189297 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7713e80a-5469-4748-a300-4c581d9fd503" path="/var/lib/kubelet/pods/7713e80a-5469-4748-a300-4c581d9fd503/volumes" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.208334 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.217989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-lock\") pod \"4a26dbbb-ff73-461b-9ef4-045186b349f0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4a26dbbb-ff73-461b-9ef4-045186b349f0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") pod \"4a26dbbb-ff73-461b-9ef4-045186b349f0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7426\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-kube-api-access-w7426\") pod \"4a26dbbb-ff73-461b-9ef4-045186b349f0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-cache\") pod \"4a26dbbb-ff73-461b-9ef4-045186b349f0\" (UID: \"4a26dbbb-ff73-461b-9ef4-045186b349f0\") " Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-lock" (OuterVolumeSpecName: "lock") pod "4a26dbbb-ff73-461b-9ef4-045186b349f0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218415 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.218762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-cache" (OuterVolumeSpecName: "cache") pod "4a26dbbb-ff73-461b-9ef4-045186b349f0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.220387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "4a26dbbb-ff73-461b-9ef4-045186b349f0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.220457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4a26dbbb-ff73-461b-9ef4-045186b349f0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.221485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-kube-api-access-w7426" (OuterVolumeSpecName: "kube-api-access-w7426") pod "4a26dbbb-ff73-461b-9ef4-045186b349f0" (UID: "4a26dbbb-ff73-461b-9ef4-045186b349f0"). InnerVolumeSpecName "kube-api-access-w7426". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.320357 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7426\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-kube-api-access-w7426\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.320387 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4a26dbbb-ff73-461b-9ef4-045186b349f0-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.320413 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.320422 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4a26dbbb-ff73-461b-9ef4-045186b349f0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.333335 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 21 15:20:03 crc kubenswrapper[4707]: I0121 15:20:03.421136 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.087417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4a26dbbb-ff73-461b-9ef4-045186b349f0","Type":"ContainerDied","Data":"62ab95306fcb789d20522fc96dd9fae599b20c2a379a72875d440584f75b425d"} Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.087448 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.087467 4707 scope.go:117] "RemoveContainer" containerID="767c60a84352183a0a1460f1f409ae4f3ff11f0dc3b14e0f08ef19858a8e4c10" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.102067 4707 scope.go:117] "RemoveContainer" containerID="5e3216158efccac8796405d01246e40f3bd76d0da2ad3160c835ee6479e21c40" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.114438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.118647 4707 scope.go:117] "RemoveContainer" containerID="d3dc8279851af3aeb56531212bc19a4fe8eaf713d9b19e99bb97e6317f18363f" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.119152 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.130837 4707 scope.go:117] "RemoveContainer" containerID="e8f6a181d9396f315c57c1fa718fed1215117042aa452cbef4eeb8c3a68400e0" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.144473 4707 scope.go:117] "RemoveContainer" containerID="f59325ec6d78879ef0e5dc29fc782e9d998428ee8ceb5a83b319c6921289e416" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.158227 4707 scope.go:117] "RemoveContainer" containerID="91f811d60dd80febc0d665d8d1e4c22a670a4043e096ea0a70f5b837830a8b9f" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.169894 4707 scope.go:117] "RemoveContainer" containerID="7af2e04db14be25f173dd381e8aada1f6fca26d64ef2c1471213d94d70d441d7" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.181967 4707 scope.go:117] "RemoveContainer" containerID="6ba3795292e347eb0052109476f7b8868a66ba15e5ffbef833b4771ba1eb7f0c" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.198528 4707 scope.go:117] "RemoveContainer" containerID="b57d8c8d95a43a7fe418e385ecf17ab93c271370b136b8b9ea9e9ac36a8eb7ec" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.211984 4707 scope.go:117] "RemoveContainer" containerID="c20ae0c2ed4b8564da8434d55d14c1622df2625d7eeb4be8dbb41dd1c95ea25d" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.226968 4707 scope.go:117] "RemoveContainer" containerID="19f135825fe61fa9589705bc5673cab68133a4b8973d340cdca3519f686f9181" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.240429 4707 scope.go:117] "RemoveContainer" containerID="79465f4c64a300da6b2b166a7231cade9cafced28c54a63fb1dd42634ea4b73d" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.253547 4707 scope.go:117] "RemoveContainer" containerID="dfc3a06ef645fad85b8c1641aeb4b27af7cb580998503dd959db2c02aff177cd" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.264355 4707 scope.go:117] "RemoveContainer" containerID="0b91da6e45cc4e4749ed6b4f2b71d51ae0be19bba7d48c2ab451efd86fbf36c4" Jan 21 15:20:04 crc kubenswrapper[4707]: I0121 15:20:04.277005 4707 scope.go:117] "RemoveContainer" containerID="0adfedc9ed55605ee7e287bb792ea2a2b8c433568d147932a3110f197664d94e" Jan 21 15:20:04 crc kubenswrapper[4707]: E0121 15:20:04.738123 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:20:04 crc kubenswrapper[4707]: E0121 15:20:04.738179 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 15:20:04 crc kubenswrapper[4707]: E0121 15:20:04.738134 4707 secret.go:188] 
Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 15:20:04 crc kubenswrapper[4707]: E0121 15:20:04.738190 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:36.738175432 +0000 UTC m=+1133.919691664 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "combined-ca-bundle" not found Jan 21 15:20:04 crc kubenswrapper[4707]: E0121 15:20:04.738285 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:36.738271542 +0000 UTC m=+1133.919787764 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-config-data" not found Jan 21 15:20:04 crc kubenswrapper[4707]: E0121 15:20:04.738301 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom podName:3602ad56-25ca-4375-8eac-ea931ec48243 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:36.738294205 +0000 UTC m=+1133.919810427 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom") pod "barbican-worker-77f98c6b77-r9vnd" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243") : secret "barbican-worker-config-data" not found Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.189221 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" path="/var/lib/kubelet/pods/4a26dbbb-ff73-461b-9ef4-045186b349f0/volumes" Jan 21 15:20:05 crc kubenswrapper[4707]: E0121 15:20:05.613362 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda533a135_2b5a_4edb_aead_4924368ab8ab.slice/crio-f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda533a135_2b5a_4edb_aead_4924368ab8ab.slice/crio-conmon-f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.756916 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.761957 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.791512 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.855874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-fernet-keys\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.855916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-combined-ca-bundle\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.855961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twr6c\" (UniqueName: \"kubernetes.io/projected/3602ad56-25ca-4375-8eac-ea931ec48243-kube-api-access-twr6c\") pod \"3602ad56-25ca-4375-8eac-ea931ec48243\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.855992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxwrf\" (UniqueName: \"kubernetes.io/projected/a533a135-2b5a-4edb-aead-4924368ab8ab-kube-api-access-mxwrf\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3602ad56-25ca-4375-8eac-ea931ec48243-logs\") pod \"3602ad56-25ca-4375-8eac-ea931ec48243\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-config-data\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-scripts\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-combined-ca-bundle\") pod \"aa920435-67d6-49dd-9e83-7d14c4316fef\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5ssq\" (UniqueName: \"kubernetes.io/projected/aa920435-67d6-49dd-9e83-7d14c4316fef-kube-api-access-h5ssq\") pod \"aa920435-67d6-49dd-9e83-7d14c4316fef\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-public-tls-certs\") pod 
\"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856158 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data\") pod \"3602ad56-25ca-4375-8eac-ea931ec48243\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data-custom\") pod \"aa920435-67d6-49dd-9e83-7d14c4316fef\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-credential-keys\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-internal-tls-certs\") pod \"a533a135-2b5a-4edb-aead-4924368ab8ab\" (UID: \"a533a135-2b5a-4edb-aead-4924368ab8ab\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom\") pod \"3602ad56-25ca-4375-8eac-ea931ec48243\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data\") pod \"aa920435-67d6-49dd-9e83-7d14c4316fef\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle\") pod \"3602ad56-25ca-4375-8eac-ea931ec48243\" (UID: \"3602ad56-25ca-4375-8eac-ea931ec48243\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa920435-67d6-49dd-9e83-7d14c4316fef-logs\") pod \"aa920435-67d6-49dd-9e83-7d14c4316fef\" (UID: \"aa920435-67d6-49dd-9e83-7d14c4316fef\") " Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3602ad56-25ca-4375-8eac-ea931ec48243-logs" (OuterVolumeSpecName: "logs") pod "3602ad56-25ca-4375-8eac-ea931ec48243" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa920435-67d6-49dd-9e83-7d14c4316fef-logs" (OuterVolumeSpecName: "logs") pod "aa920435-67d6-49dd-9e83-7d14c4316fef" (UID: "aa920435-67d6-49dd-9e83-7d14c4316fef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.856805 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3602ad56-25ca-4375-8eac-ea931ec48243-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.860551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-scripts" (OuterVolumeSpecName: "scripts") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.860603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa920435-67d6-49dd-9e83-7d14c4316fef" (UID: "aa920435-67d6-49dd-9e83-7d14c4316fef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.860608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3602ad56-25ca-4375-8eac-ea931ec48243" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.860789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a533a135-2b5a-4edb-aead-4924368ab8ab-kube-api-access-mxwrf" (OuterVolumeSpecName: "kube-api-access-mxwrf") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "kube-api-access-mxwrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.860883 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3602ad56-25ca-4375-8eac-ea931ec48243-kube-api-access-twr6c" (OuterVolumeSpecName: "kube-api-access-twr6c") pod "3602ad56-25ca-4375-8eac-ea931ec48243" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243"). InnerVolumeSpecName "kube-api-access-twr6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.860935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa920435-67d6-49dd-9e83-7d14c4316fef-kube-api-access-h5ssq" (OuterVolumeSpecName: "kube-api-access-h5ssq") pod "aa920435-67d6-49dd-9e83-7d14c4316fef" (UID: "aa920435-67d6-49dd-9e83-7d14c4316fef"). InnerVolumeSpecName "kube-api-access-h5ssq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.861173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.862385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.873099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa920435-67d6-49dd-9e83-7d14c4316fef" (UID: "aa920435-67d6-49dd-9e83-7d14c4316fef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.874166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.874338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-config-data" (OuterVolumeSpecName: "config-data") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.874447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3602ad56-25ca-4375-8eac-ea931ec48243" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.884514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.884937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data" (OuterVolumeSpecName: "config-data") pod "3602ad56-25ca-4375-8eac-ea931ec48243" (UID: "3602ad56-25ca-4375-8eac-ea931ec48243"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.885514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data" (OuterVolumeSpecName: "config-data") pod "aa920435-67d6-49dd-9e83-7d14c4316fef" (UID: "aa920435-67d6-49dd-9e83-7d14c4316fef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.885840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a533a135-2b5a-4edb-aead-4924368ab8ab" (UID: "a533a135-2b5a-4edb-aead-4924368ab8ab"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958238 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958264 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958276 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twr6c\" (UniqueName: \"kubernetes.io/projected/3602ad56-25ca-4375-8eac-ea931ec48243-kube-api-access-twr6c\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958285 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxwrf\" (UniqueName: \"kubernetes.io/projected/a533a135-2b5a-4edb-aead-4924368ab8ab-kube-api-access-mxwrf\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958295 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958303 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958311 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958320 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5ssq\" (UniqueName: \"kubernetes.io/projected/aa920435-67d6-49dd-9e83-7d14c4316fef-kube-api-access-h5ssq\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958329 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958337 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958346 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958353 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958360 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a533a135-2b5a-4edb-aead-4924368ab8ab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958368 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958375 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa920435-67d6-49dd-9e83-7d14c4316fef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958383 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602ad56-25ca-4375-8eac-ea931ec48243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:05 crc kubenswrapper[4707]: I0121 15:20:05.958390 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa920435-67d6-49dd-9e83-7d14c4316fef-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.108902 4707 generic.go:334] "Generic (PLEG): container finished" podID="3602ad56-25ca-4375-8eac-ea931ec48243" containerID="3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf" exitCode=137 Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.109047 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.109271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" event={"ID":"3602ad56-25ca-4375-8eac-ea931ec48243","Type":"ContainerDied","Data":"3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf"} Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.109316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd" event={"ID":"3602ad56-25ca-4375-8eac-ea931ec48243","Type":"ContainerDied","Data":"a682dd07acd9a2d981504e197e1db5a83338efa8c1c6fcc84c7603ddcc25f7c8"} Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.109336 4707 scope.go:117] "RemoveContainer" containerID="3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.111020 4707 generic.go:334] "Generic (PLEG): container finished" podID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerID="d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57" exitCode=137 Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.111063 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.111083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" event={"ID":"aa920435-67d6-49dd-9e83-7d14c4316fef","Type":"ContainerDied","Data":"d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57"} Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.111109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc" event={"ID":"aa920435-67d6-49dd-9e83-7d14c4316fef","Type":"ContainerDied","Data":"00a2439156d8b81cc43e616a05610dbabec8f4463065fad9bd62e75a687aceee"} Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.112845 4707 generic.go:334] "Generic (PLEG): container finished" podID="a533a135-2b5a-4edb-aead-4924368ab8ab" containerID="f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b" exitCode=137 Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.112888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" event={"ID":"a533a135-2b5a-4edb-aead-4924368ab8ab","Type":"ContainerDied","Data":"f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b"} Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.112919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" event={"ID":"a533a135-2b5a-4edb-aead-4924368ab8ab","Type":"ContainerDied","Data":"a5427fd48efe38bc4168a074c22ec2c0df9496874a3232b354a558abe6a546e8"} Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.112934 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fc8578799-v69q9" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.126856 4707 scope.go:117] "RemoveContainer" containerID="69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.138005 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd"] Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.145248 4707 scope.go:117] "RemoveContainer" containerID="3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf" Jan 21 15:20:06 crc kubenswrapper[4707]: E0121 15:20:06.145565 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf\": container with ID starting with 3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf not found: ID does not exist" containerID="3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.145592 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf"} err="failed to get container status \"3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf\": rpc error: code = NotFound desc = could not find container \"3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf\": container with ID starting with 3ea9ee416cf1a3f5f736e68da1a62dc1a853b87e01c51d02ec0c8b3d1ddf6cdf not found: ID does not exist" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.145609 4707 scope.go:117] "RemoveContainer" containerID="69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.145843 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77f98c6b77-r9vnd"] Jan 21 15:20:06 crc kubenswrapper[4707]: E0121 15:20:06.145974 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22\": container with ID starting with 69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22 not found: ID does not exist" containerID="69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.145999 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22"} err="failed to get container status \"69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22\": rpc error: code = NotFound desc = could not find container \"69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22\": container with ID starting with 69c934fc7acb400a307d55ac593a9517ad81ee7dbc9da7811789290cc432cc22 not found: ID does not exist" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.146021 4707 scope.go:117] "RemoveContainer" containerID="d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.152877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fc8578799-v69q9"] Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.156789 4707 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/keystone-fc8578799-v69q9"] Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.160581 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc"] Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.161777 4707 scope.go:117] "RemoveContainer" containerID="92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.164159 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5f47d8d9b-swktc"] Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.172822 4707 scope.go:117] "RemoveContainer" containerID="d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57" Jan 21 15:20:06 crc kubenswrapper[4707]: E0121 15:20:06.173076 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57\": container with ID starting with d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57 not found: ID does not exist" containerID="d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.173105 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57"} err="failed to get container status \"d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57\": rpc error: code = NotFound desc = could not find container \"d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57\": container with ID starting with d95319ce653cbd4445d5ae4919d8f4d45ac808ec48abf6fe44e83256b4bb4e57 not found: ID does not exist" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.173127 4707 scope.go:117] "RemoveContainer" containerID="92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762" Jan 21 15:20:06 crc kubenswrapper[4707]: E0121 15:20:06.173889 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762\": container with ID starting with 92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762 not found: ID does not exist" containerID="92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.173914 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762"} err="failed to get container status \"92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762\": rpc error: code = NotFound desc = could not find container \"92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762\": container with ID starting with 92cd8579c7649cd56da1344fd8ab8f5e239ed89d6505d3b439022807a4c1d762 not found: ID does not exist" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.173929 4707 scope.go:117] "RemoveContainer" containerID="f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.187738 4707 scope.go:117] "RemoveContainer" containerID="f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b" Jan 21 15:20:06 crc kubenswrapper[4707]: E0121 15:20:06.187962 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b\": container with ID starting with f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b not found: ID does not exist" containerID="f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.188021 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b"} err="failed to get container status \"f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b\": rpc error: code = NotFound desc = could not find container \"f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b\": container with ID starting with f14d32e76fb155e360e4ad67c7da86e56d516778b0e6e215b32116b839e89b2b not found: ID does not exist" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.897260 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-c77875b78-c9kg9_14b480ba-3ec1-4c37-a14c-d3b8a20d9837/neutron-api/0.log" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.897449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.968907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-public-tls-certs\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.968961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-httpd-config\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.969011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-combined-ca-bundle\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.969053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-config\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.969072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-ovndb-tls-certs\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.969092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-internal-tls-certs\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: 
I0121 15:20:06.969162 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhw6\" (UniqueName: \"kubernetes.io/projected/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-kube-api-access-6jhw6\") pod \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\" (UID: \"14b480ba-3ec1-4c37-a14c-d3b8a20d9837\") " Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.972946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.973162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-kube-api-access-6jhw6" (OuterVolumeSpecName: "kube-api-access-6jhw6") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "kube-api-access-6jhw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.994415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.995130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.995828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-config" (OuterVolumeSpecName: "config") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:06 crc kubenswrapper[4707]: I0121 15:20:06.996280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.005636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "14b480ba-3ec1-4c37-a14c-d3b8a20d9837" (UID: "14b480ba-3ec1-4c37-a14c-d3b8a20d9837"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070790 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070829 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070842 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070851 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070859 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhw6\" (UniqueName: \"kubernetes.io/projected/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-kube-api-access-6jhw6\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070869 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.070877 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/14b480ba-3ec1-4c37-a14c-d3b8a20d9837-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.122768 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-c77875b78-c9kg9_14b480ba-3ec1-4c37-a14c-d3b8a20d9837/neutron-api/0.log" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.122820 4707 generic.go:334] "Generic (PLEG): container finished" podID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerID="22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4" exitCode=137 Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.122869 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.122872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" event={"ID":"14b480ba-3ec1-4c37-a14c-d3b8a20d9837","Type":"ContainerDied","Data":"22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4"} Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.122894 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-c77875b78-c9kg9" event={"ID":"14b480ba-3ec1-4c37-a14c-d3b8a20d9837","Type":"ContainerDied","Data":"0d0a2fb8f085d3c1e5fbaa104b3389c405f40c0df93816589c7cc92ed250ceb6"} Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.122908 4707 scope.go:117] "RemoveContainer" containerID="9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.141564 4707 scope.go:117] "RemoveContainer" containerID="22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.145428 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-c77875b78-c9kg9"] Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.149163 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-c77875b78-c9kg9"] Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.154931 4707 scope.go:117] "RemoveContainer" containerID="9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997" Jan 21 15:20:07 crc kubenswrapper[4707]: E0121 15:20:07.155189 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997\": container with ID starting with 9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997 not found: ID does not exist" containerID="9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.155294 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997"} err="failed to get container status \"9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997\": rpc error: code = NotFound desc = could not find container \"9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997\": container with ID starting with 9f283182b8b40bf3c530dbb3a7cf8743b6eb3f5d280ff257fe420a9aa7bd7997 not found: ID does not exist" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.155372 4707 scope.go:117] "RemoveContainer" containerID="22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4" Jan 21 15:20:07 crc kubenswrapper[4707]: E0121 15:20:07.155659 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4\": container with ID starting with 22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4 not found: ID does not exist" containerID="22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.155684 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4"} err="failed to get 
container status \"22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4\": rpc error: code = NotFound desc = could not find container \"22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4\": container with ID starting with 22231810d937f2e2fb11b45802d7531d8132ca935f7ddad2926e2d2364b313e4 not found: ID does not exist" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.188188 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" path="/var/lib/kubelet/pods/14b480ba-3ec1-4c37-a14c-d3b8a20d9837/volumes" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.188881 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" path="/var/lib/kubelet/pods/3602ad56-25ca-4375-8eac-ea931ec48243/volumes" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.189433 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a533a135-2b5a-4edb-aead-4924368ab8ab" path="/var/lib/kubelet/pods/a533a135-2b5a-4edb-aead-4924368ab8ab/volumes" Jan 21 15:20:07 crc kubenswrapper[4707]: I0121 15:20:07.190355 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" path="/var/lib/kubelet/pods/aa920435-67d6-49dd-9e83-7d14c4316fef/volumes" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.441763 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442514 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442537 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442542 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442560 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442576 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442582 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442587 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442593 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442598 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442605 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-server" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442620 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442630 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442635 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442647 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442653 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="ovn-northd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442658 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="ovn-northd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442665 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442670 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442678 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442683 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442690 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 
15:20:17.442704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442708 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442716 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442722 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442729 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442734 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442743 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a533a135-2b5a-4edb-aead-4924368ab8ab" containerName="keystone-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442748 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a533a135-2b5a-4edb-aead-4924368ab8ab" containerName="keystone-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442754 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-central-agent" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442760 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-central-agent" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442766 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442771 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442778 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" containerName="kube-state-metrics" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442783 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" containerName="kube-state-metrics" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442791 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442797 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442802 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="openstack-network-exporter" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442825 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" 
containerName="openstack-network-exporter" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442833 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-metadata" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442837 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-metadata" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442843 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-server" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442856 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442861 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442875 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442882 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442887 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442895 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442908 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-notification-agent" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442912 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-notification-agent" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442923 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-server" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442935 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442941 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36457603-5ef9-4f79-bf2c-11413e422937" containerName="nova-cell0-conductor-conductor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442952 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36457603-5ef9-4f79-bf2c-11413e422937" containerName="nova-cell0-conductor-conductor" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442962 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442969 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442974 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442988 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.442995 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="rsync" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.442999 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="rsync" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443008 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443013 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443021 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-reaper" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-reaper" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-server" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443045 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 
15:20:17.443050 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443059 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="proxy-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443065 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="proxy-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443072 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41aa1cf-56db-4d5a-91e9-478a8ce591d0" containerName="keystone-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443077 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41aa1cf-56db-4d5a-91e9-478a8ce591d0" containerName="keystone-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443094 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerName="mysql-bootstrap" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerName="mysql-bootstrap" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443104 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443109 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443118 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-updater" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443122 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-updater" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443128 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290dd378-ee4b-464d-9321-2491ef417d10" containerName="nova-scheduler-scheduler" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443133 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="290dd378-ee4b-464d-9321-2491ef417d10" containerName="nova-scheduler-scheduler" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443155 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" 
containerName="container-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443160 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443168 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443173 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-api" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443179 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="swift-recon-cron" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443185 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="swift-recon-cron" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443191 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443196 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerName="init" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443209 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerName="init" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="cinder-scheduler" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443231 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="cinder-scheduler" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerName="dnsmasq-dns" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerName="dnsmasq-dns" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443253 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443258 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443264 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="setup-container" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443269 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="setup-container" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443274 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="rabbitmq" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443279 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="rabbitmq" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443285 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-expirer" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443289 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-expirer" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443296 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="probe" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443301 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="probe" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443309 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443314 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-log" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443322 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945ded51-9961-411b-b00f-0543ac91a18a" containerName="setup-container" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="945ded51-9961-411b-b00f-0543ac91a18a" containerName="setup-container" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443333 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443338 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443345 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945ded51-9961-411b-b00f-0543ac91a18a" containerName="rabbitmq" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443350 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="945ded51-9961-411b-b00f-0543ac91a18a" containerName="rabbitmq" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443357 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-updater" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-updater" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443370 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="sg-core" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443374 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="sg-core" Jan 21 15:20:17 crc kubenswrapper[4707]: E0121 15:20:17.443381 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" 
containerName="galera" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443386 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerName="galera" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443496 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443505 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-central-agent" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443513 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="sg-core" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443522 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a533a135-2b5a-4edb-aead-4924368ab8ab" containerName="keystone-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443530 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-expirer" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443536 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443543 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="swift-recon-cron" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443550 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443556 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce9bbed-aa2b-429e-85f7-c4202d8af65f" containerName="galera" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443561 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443569 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="290dd378-ee4b-464d-9321-2491ef417d10" containerName="nova-scheduler-scheduler" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443576 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443589 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443595 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443601 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="ceilometer-notification-agent" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443615 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f429a9-592b-4ff8-a548-b5e59c4c481e" containerName="cinder-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443623 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffc824e-1c16-4342-b73f-91205d93c451" containerName="nova-metadata-metadata" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443634 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="openstack-network-exporter" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443644 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443655 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443663 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-auditor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443670 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="cinder-scheduler" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443675 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36457603-5ef9-4f79-bf2c-11413e422937" containerName="nova-cell0-conductor-conductor" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443681 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b480ba-3ec1-4c37-a14c-d3b8a20d9837" containerName="neutron-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443688 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443693 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16efd30-cb63-4031-aab4-ec6ca21d7528" containerName="proxy-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443700 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443708 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-updater" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443715 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3602ad56-25ca-4375-8eac-ea931ec48243" containerName="barbican-worker-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443721 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-reaper" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443726 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443733 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443739 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="account-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443752 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-replicator" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443757 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa920435-67d6-49dd-9e83-7d14c4316fef" containerName="barbican-keystone-listener" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443764 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fee1703-b855-46f5-9bc3-89c66b3377ea" containerName="barbican-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443771 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e476808c-b16e-4aa7-9f4b-71e6a5c32576" containerName="barbican-api-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443776 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d582f12e-341b-45f5-a54a-8d016aed473a" containerName="proxy-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="container-updater" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443789 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="38833865-b749-47e4-b140-8a541562f359" containerName="placement-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6331a2-4b0a-407e-8ee0-3b1a82343071" containerName="barbican-keystone-listener-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443821 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-log" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443829 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d67e12-9371-4324-a476-3befcc10743d" containerName="placement-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443836 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41aa1cf-56db-4d5a-91e9-478a8ce591d0" containerName="keystone-api" Jan 21 
15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443843 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb09ad3d-cc59-429f-bbe8-ba6aec2f02c0" containerName="dnsmasq-dns" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443851 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="02f3955e-a0f2-4a8e-8a04-c6b7e069d0ec" containerName="glance-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443858 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="rsync" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443866 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7713e80a-5469-4748-a300-4c581d9fd503" containerName="neutron-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="933eeedf-0a55-44e6-84f5-062e3c2716d9" containerName="glance-httpd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443878 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d458004-5d78-43b6-aae5-7afb3dfa0c31" containerName="ovn-northd" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443883 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30efcc6d-82eb-4c51-82e4-a14fdbdd8ab6" containerName="kube-state-metrics" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443891 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19c4756-e139-49f4-a1e8-99afc8ed8088" containerName="nova-api-api" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="945ded51-9961-411b-b00f-0543ac91a18a" containerName="rabbitmq" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443907 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44cdae5-61bb-40c3-a1d7-e10d4d64c203" containerName="probe" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443912 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16a2ec1-5c97-4eb4-b234-2d6cbf4188a0" containerName="rabbitmq" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df067294-2809-413f-b245-6cd57c0cbf11" containerName="barbican-worker" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.443926 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26dbbb-ff73-461b-9ef4-045186b349f0" containerName="object-server" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.444516 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.446893 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.446903 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.447022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.447107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.447204 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.447227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-j2qhx" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.447309 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.454339 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bc7660e-aaa9-4865-a873-4db17c1db628-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516784 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.516829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.517033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.517092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bc7660e-aaa9-4865-a873-4db17c1db628-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.517123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx8kv\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-kube-api-access-qx8kv\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.517286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bc7660e-aaa9-4865-a873-4db17c1db628-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx8kv\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-kube-api-access-qx8kv\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618775 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bc7660e-aaa9-4865-a873-4db17c1db628-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.618945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.619170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.619471 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.619928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.620083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.620370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.620546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.620851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.624992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bc7660e-aaa9-4865-a873-4db17c1db628-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.625142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.626549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.626726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bc7660e-aaa9-4865-a873-4db17c1db628-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.635691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx8kv\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-kube-api-access-qx8kv\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 
15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.642156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.725315 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.727926 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.730228 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.730229 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.730592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.731761 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.731918 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-4c552" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.731922 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.731963 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.732100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.760837 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fbbd17b-fed9-463c-9c25-435f84d0205e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvjrs\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-kube-api-access-fvjrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fbbd17b-fed9-463c-9c25-435f84d0205e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.821917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.923140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fbbd17b-fed9-463c-9c25-435f84d0205e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.923243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvjrs\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-kube-api-access-fvjrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.923309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.923346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fbbd17b-fed9-463c-9c25-435f84d0205e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924566 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.924664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.925232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.925443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.928196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.930681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.930778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.930966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fbbd17b-fed9-463c-9c25-435f84d0205e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.932291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fbbd17b-fed9-463c-9c25-435f84d0205e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.937673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvjrs\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-kube-api-access-fvjrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:17 crc kubenswrapper[4707]: I0121 15:20:17.939273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.045065 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.172402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.194052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1bc7660e-aaa9-4865-a873-4db17c1db628","Type":"ContainerStarted","Data":"ecbce59d03cb46ca5d6d98464312966ab1a453320f74319b63602870b531911d"} Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.324623 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.325711 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.329474 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.329910 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.330024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-kvgwr" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.330405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.334047 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.335321 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-kolla-config\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjgrc\" (UniqueName: \"kubernetes.io/projected/f9190471-72fb-4298-85b6-bf07b32b6de5-kube-api-access-bjgrc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.434890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-default\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.521001 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:20:18 crc kubenswrapper[4707]: W0121 15:20:18.523154 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fbbd17b_fed9_463c_9c25_435f84d0205e.slice/crio-598e556105658c465cda9e50d32a3202a18a9737a6c1bcdb2fbfad62f3d1f31b WatchSource:0}: Error finding container 598e556105658c465cda9e50d32a3202a18a9737a6c1bcdb2fbfad62f3d1f31b: Status 404 returned error can't find the container with id 598e556105658c465cda9e50d32a3202a18a9737a6c1bcdb2fbfad62f3d1f31b Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.535671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.535941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-default\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.535984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-kolla-config\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjgrc\" (UniqueName: \"kubernetes.io/projected/f9190471-72fb-4298-85b6-bf07b32b6de5-kube-api-access-bjgrc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536429 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.536670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.537134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-default\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.537272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-kolla-config\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.537272 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.539127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.540083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.553367 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.647881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjgrc\" (UniqueName: \"kubernetes.io/projected/f9190471-72fb-4298-85b6-bf07b32b6de5-kube-api-access-bjgrc\") pod \"openstack-galera-0\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:18 crc kubenswrapper[4707]: I0121 15:20:18.943312 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.200329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"4fbbd17b-fed9-463c-9c25-435f84d0205e","Type":"ContainerStarted","Data":"598e556105658c465cda9e50d32a3202a18a9737a6c1bcdb2fbfad62f3d1f31b"} Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.202950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1bc7660e-aaa9-4865-a873-4db17c1db628","Type":"ContainerStarted","Data":"f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b"} Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.334744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.660431 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.661597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.663885 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-chnzh" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.664184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.664281 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.664376 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.677459 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755956 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65tfb\" (UniqueName: \"kubernetes.io/projected/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kube-api-access-65tfb\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.755970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65tfb\" (UniqueName: \"kubernetes.io/projected/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kube-api-access-65tfb\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.857801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc 
kubenswrapper[4707]: I0121 15:20:19.857839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.858089 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.858143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.858244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.858415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.858918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.870467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.871861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.872479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 
15:20:19.874335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65tfb\" (UniqueName: \"kubernetes.io/projected/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kube-api-access-65tfb\") pod \"openstack-cell1-galera-0\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:19 crc kubenswrapper[4707]: I0121 15:20:19.978334 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:20 crc kubenswrapper[4707]: I0121 15:20:20.210701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f9190471-72fb-4298-85b6-bf07b32b6de5","Type":"ContainerStarted","Data":"2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572"} Jan 21 15:20:20 crc kubenswrapper[4707]: I0121 15:20:20.210985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f9190471-72fb-4298-85b6-bf07b32b6de5","Type":"ContainerStarted","Data":"0bca8923ca333e053b4bc11f78c49c8ebf18710aaec94c05f9867d5a0f9e125a"} Jan 21 15:20:20 crc kubenswrapper[4707]: I0121 15:20:20.212409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"4fbbd17b-fed9-463c-9c25-435f84d0205e","Type":"ContainerStarted","Data":"f96114fc5af659c21483e07417feb1e1da0f1dbf8acbb58d75419c23defc0836"} Jan 21 15:20:20 crc kubenswrapper[4707]: I0121 15:20:20.342796 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:20:20 crc kubenswrapper[4707]: W0121 15:20:20.345750 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d4d90d7_a400_45ab_8c5b_7e68ced57657.slice/crio-d03a51987a21b4d709dac3b6f6ad6c6355f12a24465c9d1ae438bce6450456ba WatchSource:0}: Error finding container d03a51987a21b4d709dac3b6f6ad6c6355f12a24465c9d1ae438bce6450456ba: Status 404 returned error can't find the container with id d03a51987a21b4d709dac3b6f6ad6c6355f12a24465c9d1ae438bce6450456ba Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.058500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.059443 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.062226 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.062629 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-s49b2" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.069965 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.070515 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.176439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.176475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478rf\" (UniqueName: \"kubernetes.io/projected/09bf8b72-b690-4084-854a-0245cd1a61b2-kube-api-access-478rf\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.176520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-kolla-config\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.176568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-config-data\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.176653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.221285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"8d4d90d7-a400-45ab-8c5b-7e68ced57657","Type":"ContainerStarted","Data":"e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce"} Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.221359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"8d4d90d7-a400-45ab-8c5b-7e68ced57657","Type":"ContainerStarted","Data":"d03a51987a21b4d709dac3b6f6ad6c6355f12a24465c9d1ae438bce6450456ba"} Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.277634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-kolla-config\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.277698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-config-data\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.277726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.277780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.277798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478rf\" (UniqueName: \"kubernetes.io/projected/09bf8b72-b690-4084-854a-0245cd1a61b2-kube-api-access-478rf\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.278463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-kolla-config\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.278501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-config-data\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.282783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.292117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478rf\" (UniqueName: \"kubernetes.io/projected/09bf8b72-b690-4084-854a-0245cd1a61b2-kube-api-access-478rf\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.294120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.373434 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:21 crc kubenswrapper[4707]: I0121 15:20:21.862766 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:20:21 crc kubenswrapper[4707]: W0121 15:20:21.865981 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bf8b72_b690_4084_854a_0245cd1a61b2.slice/crio-f2b7a8281dc592388d996fc191b88446c86a5f018e9c8d74de99a1d8705ca737 WatchSource:0}: Error finding container f2b7a8281dc592388d996fc191b88446c86a5f018e9c8d74de99a1d8705ca737: Status 404 returned error can't find the container with id f2b7a8281dc592388d996fc191b88446c86a5f018e9c8d74de99a1d8705ca737 Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.228336 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerID="2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572" exitCode=0 Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.228407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f9190471-72fb-4298-85b6-bf07b32b6de5","Type":"ContainerDied","Data":"2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572"} Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.230107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"09bf8b72-b690-4084-854a-0245cd1a61b2","Type":"ContainerStarted","Data":"ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d"} Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.230155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"09bf8b72-b690-4084-854a-0245cd1a61b2","Type":"ContainerStarted","Data":"f2b7a8281dc592388d996fc191b88446c86a5f018e9c8d74de99a1d8705ca737"} Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.261883 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.26186745 podStartE2EDuration="1.26186745s" podCreationTimestamp="2026-01-21 15:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:22.259184844 +0000 UTC m=+1119.440701066" watchObservedRunningTime="2026-01-21 15:20:22.26186745 +0000 UTC m=+1119.443383672" Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.880452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.881367 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.884684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-xwn5n" Jan 21 15:20:22 crc kubenswrapper[4707]: I0121 15:20:22.889759 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.004420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n67zn\" (UniqueName: \"kubernetes.io/projected/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30-kube-api-access-n67zn\") pod \"kube-state-metrics-0\" (UID: \"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.105604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n67zn\" (UniqueName: \"kubernetes.io/projected/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30-kube-api-access-n67zn\") pod \"kube-state-metrics-0\" (UID: \"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.120255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n67zn\" (UniqueName: \"kubernetes.io/projected/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30-kube-api-access-n67zn\") pod \"kube-state-metrics-0\" (UID: \"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.213430 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.238667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f9190471-72fb-4298-85b6-bf07b32b6de5","Type":"ContainerStarted","Data":"eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee"} Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.240704 4707 generic.go:334] "Generic (PLEG): container finished" podID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerID="e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce" exitCode=0 Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.240783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"8d4d90d7-a400-45ab-8c5b-7e68ced57657","Type":"ContainerDied","Data":"e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce"} Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.240973 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.262426 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.262411793 podStartE2EDuration="5.262411793s" podCreationTimestamp="2026-01-21 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:23.259208006 +0000 UTC m=+1120.440724227" watchObservedRunningTime="2026-01-21 15:20:23.262411793 +0000 UTC m=+1120.443928015" Jan 21 15:20:23 crc kubenswrapper[4707]: W0121 15:20:23.624823 4707 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b269364_bf60_4c6a_8a37_3c9a4cf7ae30.slice/crio-aaee06e3e6f6d20add5b83f0226746849583a0c74616f3485c03c6a2281655d6 WatchSource:0}: Error finding container aaee06e3e6f6d20add5b83f0226746849583a0c74616f3485c03c6a2281655d6: Status 404 returned error can't find the container with id aaee06e3e6f6d20add5b83f0226746849583a0c74616f3485c03c6a2281655d6 Jan 21 15:20:23 crc kubenswrapper[4707]: I0121 15:20:23.625694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:20:24 crc kubenswrapper[4707]: I0121 15:20:24.248853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"8d4d90d7-a400-45ab-8c5b-7e68ced57657","Type":"ContainerStarted","Data":"589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f"} Jan 21 15:20:24 crc kubenswrapper[4707]: I0121 15:20:24.250200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30","Type":"ContainerStarted","Data":"359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e"} Jan 21 15:20:24 crc kubenswrapper[4707]: I0121 15:20:24.250257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30","Type":"ContainerStarted","Data":"aaee06e3e6f6d20add5b83f0226746849583a0c74616f3485c03c6a2281655d6"} Jan 21 15:20:24 crc kubenswrapper[4707]: I0121 15:20:24.250365 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:24 crc kubenswrapper[4707]: I0121 15:20:24.265053 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=5.265030357 podStartE2EDuration="5.265030357s" podCreationTimestamp="2026-01-21 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:24.262640933 +0000 UTC m=+1121.444157155" watchObservedRunningTime="2026-01-21 15:20:24.265030357 +0000 UTC m=+1121.446546580" Jan 21 15:20:24 crc kubenswrapper[4707]: I0121 15:20:24.274889 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.000383733 podStartE2EDuration="2.274878042s" podCreationTimestamp="2026-01-21 15:20:22 +0000 UTC" firstStartedPulling="2026-01-21 15:20:23.626866715 +0000 UTC m=+1120.808382937" lastFinishedPulling="2026-01-21 15:20:23.901361023 +0000 UTC m=+1121.082877246" observedRunningTime="2026-01-21 15:20:24.272840229 +0000 UTC m=+1121.454356451" watchObservedRunningTime="2026-01-21 15:20:24.274878042 +0000 UTC m=+1121.456394254" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.680585 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.682012 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.683782 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.683858 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-7h2tt" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.683931 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.684401 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.684420 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.690913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.842493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.842953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-config\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.842989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.843048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.843075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.843105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtw5f\" (UniqueName: \"kubernetes.io/projected/09ff63a8-6d5b-4529-9c0a-9e78957b1618-kube-api-access-xtw5f\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 
15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.843283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.843339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-config\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtw5f\" 
(UniqueName: \"kubernetes.io/projected/09ff63a8-6d5b-4529-9c0a-9e78957b1618-kube-api-access-xtw5f\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.945664 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.946300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.946597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-config\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.946683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.956494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.963913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtw5f\" (UniqueName: \"kubernetes.io/projected/09ff63a8-6d5b-4529-9c0a-9e78957b1618-kube-api-access-xtw5f\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.966778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:25 crc kubenswrapper[4707]: I0121 15:20:25.966964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:26 crc kubenswrapper[4707]: I0121 15:20:26.001178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:26 crc kubenswrapper[4707]: I0121 15:20:26.297513 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:26 crc kubenswrapper[4707]: I0121 15:20:26.675649 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.269829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"09ff63a8-6d5b-4529-9c0a-9e78957b1618","Type":"ContainerStarted","Data":"e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02"} Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.270086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"09ff63a8-6d5b-4529-9c0a-9e78957b1618","Type":"ContainerStarted","Data":"71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148"} Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.270099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"09ff63a8-6d5b-4529-9c0a-9e78957b1618","Type":"ContainerStarted","Data":"b9a9c406c96065b17fbb4c289d41e92c3683825f0b68c04dc336484f5aa59d64"} Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.282249 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.282239437 podStartE2EDuration="2.282239437s" podCreationTimestamp="2026-01-21 15:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:27.281321932 +0000 UTC m=+1124.462838164" watchObservedRunningTime="2026-01-21 15:20:27.282239437 +0000 UTC m=+1124.463755659" Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.952588 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.953975 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.955553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.955850 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.956312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-kk8qr" Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.956893 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:20:27 crc kubenswrapper[4707]: I0121 15:20:27.958473 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-config\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.074823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjf6s\" (UniqueName: \"kubernetes.io/projected/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-kube-api-access-vjf6s\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.176569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.177095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.177046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.177179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.177985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjf6s\" (UniqueName: \"kubernetes.io/projected/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-kube-api-access-vjf6s\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.178108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.178158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-config\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.178276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.178329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.178587 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.178846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-config\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.179004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.182569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.183171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.184642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.193039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjf6s\" (UniqueName: \"kubernetes.io/projected/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-kube-api-access-vjf6s\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.196420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.269690 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.635740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:20:28 crc kubenswrapper[4707]: W0121 15:20:28.642010 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6a9b240_8c1c_45c5_98a0_0a5f624a0506.slice/crio-bfd929ee5d3cdf25d0f6db4f2d30d043cfe39d8e3021ee358414e3738de7e986 WatchSource:0}: Error finding container bfd929ee5d3cdf25d0f6db4f2d30d043cfe39d8e3021ee358414e3738de7e986: Status 404 returned error can't find the container with id bfd929ee5d3cdf25d0f6db4f2d30d043cfe39d8e3021ee358414e3738de7e986 Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.944041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.944567 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:28 crc kubenswrapper[4707]: I0121 15:20:28.999867 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.282344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"b6a9b240-8c1c-45c5-98a0-0a5f624a0506","Type":"ContainerStarted","Data":"e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718"} Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.282384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"b6a9b240-8c1c-45c5-98a0-0a5f624a0506","Type":"ContainerStarted","Data":"9ea2a83aab093f5ad72f13e2481f0741c78195045448ff4cfb2387b4ed4d0678"} Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.282398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"b6a9b240-8c1c-45c5-98a0-0a5f624a0506","Type":"ContainerStarted","Data":"bfd929ee5d3cdf25d0f6db4f2d30d043cfe39d8e3021ee358414e3738de7e986"} Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.298976 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.301126 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.301110504 podStartE2EDuration="2.301110504s" podCreationTimestamp="2026-01-21 15:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:29.296717199 +0000 UTC m=+1126.478233421" watchObservedRunningTime="2026-01-21 15:20:29.301110504 +0000 UTC m=+1126.482626726" Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.329605 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.341171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.979517 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:29 crc kubenswrapper[4707]: I0121 15:20:29.979833 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:30 crc kubenswrapper[4707]: I0121 15:20:30.033014 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:30 crc kubenswrapper[4707]: I0121 15:20:30.287776 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:30 crc kubenswrapper[4707]: I0121 15:20:30.334314 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:20:30 crc kubenswrapper[4707]: I0121 15:20:30.960152 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-lgngm"] Jan 21 15:20:30 crc kubenswrapper[4707]: I0121 15:20:30.960934 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:30 crc kubenswrapper[4707]: I0121 15:20:30.967533 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-lgngm"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.022701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph798\" (UniqueName: \"kubernetes.io/projected/cee367ab-367a-4b7e-bf75-0c7ae0964933-kube-api-access-ph798\") pod \"keystone-db-create-lgngm\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.022788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee367ab-367a-4b7e-bf75-0c7ae0964933-operator-scripts\") pod \"keystone-db-create-lgngm\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.082218 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.083069 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.085526 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.096656 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.124513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-operator-scripts\") pod \"keystone-2c83-account-create-update-nkz5v\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.124559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbkh\" (UniqueName: \"kubernetes.io/projected/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-kube-api-access-fwbkh\") pod \"keystone-2c83-account-create-update-nkz5v\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.124596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee367ab-367a-4b7e-bf75-0c7ae0964933-operator-scripts\") pod \"keystone-db-create-lgngm\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.124744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph798\" (UniqueName: \"kubernetes.io/projected/cee367ab-367a-4b7e-bf75-0c7ae0964933-kube-api-access-ph798\") pod \"keystone-db-create-lgngm\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.125241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee367ab-367a-4b7e-bf75-0c7ae0964933-operator-scripts\") pod \"keystone-db-create-lgngm\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.140018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph798\" (UniqueName: \"kubernetes.io/projected/cee367ab-367a-4b7e-bf75-0c7ae0964933-kube-api-access-ph798\") pod \"keystone-db-create-lgngm\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.226168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-operator-scripts\") pod \"keystone-2c83-account-create-update-nkz5v\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.226207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fwbkh\" (UniqueName: \"kubernetes.io/projected/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-kube-api-access-fwbkh\") pod \"keystone-2c83-account-create-update-nkz5v\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.226999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-operator-scripts\") pod \"keystone-2c83-account-create-update-nkz5v\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.241893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbkh\" (UniqueName: \"kubernetes.io/projected/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-kube-api-access-fwbkh\") pod \"keystone-2c83-account-create-update-nkz5v\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.270312 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.280727 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.290310 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-vqr2f"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.291186 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.295889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vqr2f"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.300547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.322904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.374716 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.407210 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.419064 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-4cm8d"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.420052 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.421765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.431119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvfrd\" (UniqueName: \"kubernetes.io/projected/5838abad-cfc4-4b71-a97d-0a96e327ddd7-kube-api-access-vvfrd\") pod \"placement-db-create-vqr2f\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.431251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5838abad-cfc4-4b71-a97d-0a96e327ddd7-operator-scripts\") pod \"placement-db-create-vqr2f\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.449741 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-4cm8d"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.499582 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-gj2xh"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.500408 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.520687 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gj2xh"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.533935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvfrd\" (UniqueName: \"kubernetes.io/projected/5838abad-cfc4-4b71-a97d-0a96e327ddd7-kube-api-access-vvfrd\") pod \"placement-db-create-vqr2f\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.534012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5c5x\" (UniqueName: \"kubernetes.io/projected/5a8304c0-9267-4892-8907-574c07ead717-kube-api-access-p5c5x\") pod \"placement-a414-account-create-update-4cm8d\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.534042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5838abad-cfc4-4b71-a97d-0a96e327ddd7-operator-scripts\") pod \"placement-db-create-vqr2f\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.534097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8304c0-9267-4892-8907-574c07ead717-operator-scripts\") pod \"placement-a414-account-create-update-4cm8d\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " 
pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.535254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5838abad-cfc4-4b71-a97d-0a96e327ddd7-operator-scripts\") pod \"placement-db-create-vqr2f\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.549011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvfrd\" (UniqueName: \"kubernetes.io/projected/5838abad-cfc4-4b71-a97d-0a96e327ddd7-kube-api-access-vvfrd\") pod \"placement-db-create-vqr2f\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.603197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-s5sw9"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.604015 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.606344 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.608559 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-s5sw9"] Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.635114 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.635591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443064ab-05cf-4215-9029-9d1c8adb1934-operator-scripts\") pod \"glance-db-create-gj2xh\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.635673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5c5x\" (UniqueName: \"kubernetes.io/projected/5a8304c0-9267-4892-8907-574c07ead717-kube-api-access-p5c5x\") pod \"placement-a414-account-create-update-4cm8d\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.635732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88rh\" (UniqueName: \"kubernetes.io/projected/443064ab-05cf-4215-9029-9d1c8adb1934-kube-api-access-t88rh\") pod \"glance-db-create-gj2xh\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.635771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8304c0-9267-4892-8907-574c07ead717-operator-scripts\") pod \"placement-a414-account-create-update-4cm8d\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc 
kubenswrapper[4707]: I0121 15:20:31.637829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8304c0-9267-4892-8907-574c07ead717-operator-scripts\") pod \"placement-a414-account-create-update-4cm8d\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.653340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5c5x\" (UniqueName: \"kubernetes.io/projected/5a8304c0-9267-4892-8907-574c07ead717-kube-api-access-p5c5x\") pod \"placement-a414-account-create-update-4cm8d\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.736994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gs98\" (UniqueName: \"kubernetes.io/projected/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-kube-api-access-2gs98\") pod \"glance-8403-account-create-update-s5sw9\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.737173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88rh\" (UniqueName: \"kubernetes.io/projected/443064ab-05cf-4215-9029-9d1c8adb1934-kube-api-access-t88rh\") pod \"glance-db-create-gj2xh\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.737472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-operator-scripts\") pod \"glance-8403-account-create-update-s5sw9\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.737513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443064ab-05cf-4215-9029-9d1c8adb1934-operator-scripts\") pod \"glance-db-create-gj2xh\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.738249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443064ab-05cf-4215-9029-9d1c8adb1934-operator-scripts\") pod \"glance-db-create-gj2xh\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.744020 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.754895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88rh\" (UniqueName: \"kubernetes.io/projected/443064ab-05cf-4215-9029-9d1c8adb1934-kube-api-access-t88rh\") pod \"glance-db-create-gj2xh\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.788599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-lgngm"] Jan 21 15:20:31 crc kubenswrapper[4707]: W0121 15:20:31.806062 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcee367ab_367a_4b7e_bf75_0c7ae0964933.slice/crio-a881a874df1436e46cdac9f0b5e780ae0919d8b3b4bfbe511a266332a038c41f WatchSource:0}: Error finding container a881a874df1436e46cdac9f0b5e780ae0919d8b3b4bfbe511a266332a038c41f: Status 404 returned error can't find the container with id a881a874df1436e46cdac9f0b5e780ae0919d8b3b4bfbe511a266332a038c41f Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.824473 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.838600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-operator-scripts\") pod \"glance-8403-account-create-update-s5sw9\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.838683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gs98\" (UniqueName: \"kubernetes.io/projected/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-kube-api-access-2gs98\") pod \"glance-8403-account-create-update-s5sw9\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.839587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-operator-scripts\") pod \"glance-8403-account-create-update-s5sw9\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.846837 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v"] Jan 21 15:20:31 crc kubenswrapper[4707]: W0121 15:20:31.852489 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode94be5be_28c9_4cf4_92bd_f08538ff3bd8.slice/crio-13b53250452ab6190be974634d9fb87eaca96d9b613ce6abc248a7fff6287c4c WatchSource:0}: Error finding container 13b53250452ab6190be974634d9fb87eaca96d9b613ce6abc248a7fff6287c4c: Status 404 returned error can't find the container with id 13b53250452ab6190be974634d9fb87eaca96d9b613ce6abc248a7fff6287c4c Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.857378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2gs98\" (UniqueName: \"kubernetes.io/projected/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-kube-api-access-2gs98\") pod \"glance-8403-account-create-update-s5sw9\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:31 crc kubenswrapper[4707]: I0121 15:20:31.920452 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.029120 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vqr2f"] Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.224403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-4cm8d"] Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.309417 4707 generic.go:334] "Generic (PLEG): container finished" podID="e94be5be-28c9-4cf4-92bd-f08538ff3bd8" containerID="aa23da1e8ad14c7c75f4c95b67498188f812619ba11cff54ffcb28ddef51018c" exitCode=0 Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.309479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" event={"ID":"e94be5be-28c9-4cf4-92bd-f08538ff3bd8","Type":"ContainerDied","Data":"aa23da1e8ad14c7c75f4c95b67498188f812619ba11cff54ffcb28ddef51018c"} Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.309505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" event={"ID":"e94be5be-28c9-4cf4-92bd-f08538ff3bd8","Type":"ContainerStarted","Data":"13b53250452ab6190be974634d9fb87eaca96d9b613ce6abc248a7fff6287c4c"} Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.310424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gj2xh"] Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.311943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-vqr2f" event={"ID":"5838abad-cfc4-4b71-a97d-0a96e327ddd7","Type":"ContainerStarted","Data":"ee04cdad4b1da5323aca6084fc5bbc598c67c70ada9e09b164a6a23f2b22762e"} Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.313323 4707 generic.go:334] "Generic (PLEG): container finished" podID="cee367ab-367a-4b7e-bf75-0c7ae0964933" containerID="16dd7b38c695d1db80f48d375ca8303e3621f5b7878940f4a752db4a7794b23c" exitCode=0 Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.313357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-lgngm" event={"ID":"cee367ab-367a-4b7e-bf75-0c7ae0964933","Type":"ContainerDied","Data":"16dd7b38c695d1db80f48d375ca8303e3621f5b7878940f4a752db4a7794b23c"} Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.313370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-lgngm" event={"ID":"cee367ab-367a-4b7e-bf75-0c7ae0964933","Type":"ContainerStarted","Data":"a881a874df1436e46cdac9f0b5e780ae0919d8b3b4bfbe511a266332a038c41f"} Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.315073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" event={"ID":"5a8304c0-9267-4892-8907-574c07ead717","Type":"ContainerStarted","Data":"caaca7736951b0334d689e91025d52aef6ab214800993df6f55813719049298b"} Jan 21 15:20:32 crc 
kubenswrapper[4707]: I0121 15:20:32.315093 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:32 crc kubenswrapper[4707]: I0121 15:20:32.379192 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-s5sw9"] Jan 21 15:20:32 crc kubenswrapper[4707]: W0121 15:20:32.382019 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa1c474d_9462_45ca_8889_0e62e9fa5fe4.slice/crio-5f4b0fef729a6f0f83851536dd3303b252189f358effa207ac3e907ba41522c9 WatchSource:0}: Error finding container 5f4b0fef729a6f0f83851536dd3303b252189f358effa207ac3e907ba41522c9: Status 404 returned error can't find the container with id 5f4b0fef729a6f0f83851536dd3303b252189f358effa207ac3e907ba41522c9 Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.142972 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.147090 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.150837 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.151054 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-rsc44" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.151684 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.155025 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.176204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.228397 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.269378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-lock\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.269508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7zxq\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-kube-api-access-j7zxq\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.269563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.269650 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.269674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-cache\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.279596 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-bdwtz"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.280503 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.291221 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-bdwtz"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.304448 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.314834 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.316427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.320023 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.347979 4707 generic.go:334] "Generic (PLEG): container finished" podID="aa1c474d-9462-45ca-8889-0e62e9fa5fe4" containerID="e48534f9c080cffaac35d1488292d55f1059c686dc837998c490ccb3e1c2e0fb" exitCode=0 Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.348098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" event={"ID":"aa1c474d-9462-45ca-8889-0e62e9fa5fe4","Type":"ContainerDied","Data":"e48534f9c080cffaac35d1488292d55f1059c686dc837998c490ccb3e1c2e0fb"} Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.348120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" event={"ID":"aa1c474d-9462-45ca-8889-0e62e9fa5fe4","Type":"ContainerStarted","Data":"5f4b0fef729a6f0f83851536dd3303b252189f358effa207ac3e907ba41522c9"} Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.348768 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-bdwtz"] Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.349321 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-q2m82 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" podUID="4b630ab3-a5e6-430b-96be-6b09bbd08276" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.352302 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="443064ab-05cf-4215-9029-9d1c8adb1934" containerID="d4289b6efa74e80d01a680e30c7ad22037e55e14b8070a9387edcaf4166f1213" exitCode=0 Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.352355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gj2xh" event={"ID":"443064ab-05cf-4215-9029-9d1c8adb1934","Type":"ContainerDied","Data":"d4289b6efa74e80d01a680e30c7ad22037e55e14b8070a9387edcaf4166f1213"} Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.352371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gj2xh" event={"ID":"443064ab-05cf-4215-9029-9d1c8adb1934","Type":"ContainerStarted","Data":"caf4b515d6ab67cbd7801e86882aa885bc290fd1adf2f31ce50cc403b3389908"} Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.353283 4707 generic.go:334] "Generic (PLEG): container finished" podID="5838abad-cfc4-4b71-a97d-0a96e327ddd7" containerID="1b506835d91987c2c094b1c1d50eee1a83ac3f42c3cedbbbcadfbdf94de1575a" exitCode=0 Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.353321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-vqr2f" event={"ID":"5838abad-cfc4-4b71-a97d-0a96e327ddd7","Type":"ContainerDied","Data":"1b506835d91987c2c094b1c1d50eee1a83ac3f42c3cedbbbcadfbdf94de1575a"} Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.358042 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a8304c0-9267-4892-8907-574c07ead717" containerID="6fad03905ac9924024bbd83b187773be3d1d8859cb027e58c2f343735ac4b6a5" exitCode=0 Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.358699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" event={"ID":"5a8304c0-9267-4892-8907-574c07ead717","Type":"ContainerDied","Data":"6fad03905ac9924024bbd83b187773be3d1d8859cb027e58c2f343735ac4b6a5"} Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.358948 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-zlwl4"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.360279 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d89d96bb-d73b-45b0-a807-1644c19c9a4c-etc-swift\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-swiftconf\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-ring-data-devices\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7zxq\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-kube-api-access-j7zxq\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-ring-data-devices\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-dispersionconf\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-swiftconf\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405744 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-scripts\") pod 
\"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-dispersionconf\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-combined-ca-bundle\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.405856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b630ab3-a5e6-430b-96be-6b09bbd08276-etc-swift\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-scripts\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-cache\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq5j8\" (UniqueName: \"kubernetes.io/projected/d89d96bb-d73b-45b0-a807-1644c19c9a4c-kube-api-access-lq5j8\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-combined-ca-bundle\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-lock\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.407788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2m82\" (UniqueName: \"kubernetes.io/projected/4b630ab3-a5e6-430b-96be-6b09bbd08276-kube-api-access-q2m82\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.408042 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.408077 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.408119 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift podName:c9958390-b164-4ae8-b834-d1df8a561b95 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:33.908106018 +0000 UTC m=+1131.089622241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift") pod "swift-storage-0" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95") : configmap "swift-ring-files" not found Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.408803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-cache\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.410481 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.412859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-lock\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.423738 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-zlwl4"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.471030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7zxq\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-kube-api-access-j7zxq\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.487165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: 
\"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-ring-data-devices\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-ring-data-devices\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-dispersionconf\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-swiftconf\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-scripts\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-dispersionconf\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-combined-ca-bundle\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b630ab3-a5e6-430b-96be-6b09bbd08276-etc-swift\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-scripts\") pod \"swift-ring-rebalance-zlwl4\" (UID: 
\"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq5j8\" (UniqueName: \"kubernetes.io/projected/d89d96bb-d73b-45b0-a807-1644c19c9a4c-kube-api-access-lq5j8\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-combined-ca-bundle\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2m82\" (UniqueName: \"kubernetes.io/projected/4b630ab3-a5e6-430b-96be-6b09bbd08276-kube-api-access-q2m82\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d89d96bb-d73b-45b0-a807-1644c19c9a4c-etc-swift\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.521917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-swiftconf\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.522537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-scripts\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.523624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-ring-data-devices\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.525618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-swiftconf\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.526775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-ring-data-devices\") pod \"swift-ring-rebalance-bdwtz\" (UID: 
\"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.528155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b630ab3-a5e6-430b-96be-6b09bbd08276-etc-swift\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.528401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d89d96bb-d73b-45b0-a807-1644c19c9a4c-etc-swift\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.528843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-scripts\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.528857 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-dispersionconf\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.533086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-swiftconf\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.543248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-combined-ca-bundle\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.544306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-dispersionconf\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.547637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq5j8\" (UniqueName: \"kubernetes.io/projected/d89d96bb-d73b-45b0-a807-1644c19c9a4c-kube-api-access-lq5j8\") pod \"swift-ring-rebalance-zlwl4\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.554757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-combined-ca-bundle\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.575704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2m82\" (UniqueName: \"kubernetes.io/projected/4b630ab3-a5e6-430b-96be-6b09bbd08276-kube-api-access-q2m82\") pod \"swift-ring-rebalance-bdwtz\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.687214 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.728331 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.733654 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.738597 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.738950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.744042 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.744240 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-br4g6" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.746776 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.769107 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.876841 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee367ab-367a-4b7e-bf75-0c7ae0964933-operator-scripts\") pod \"cee367ab-367a-4b7e-bf75-0c7ae0964933\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbkh\" (UniqueName: \"kubernetes.io/projected/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-kube-api-access-fwbkh\") pod \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph798\" (UniqueName: \"kubernetes.io/projected/cee367ab-367a-4b7e-bf75-0c7ae0964933-kube-api-access-ph798\") pod \"cee367ab-367a-4b7e-bf75-0c7ae0964933\" (UID: \"cee367ab-367a-4b7e-bf75-0c7ae0964933\") " Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-operator-scripts\") pod \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\" (UID: \"e94be5be-28c9-4cf4-92bd-f08538ff3bd8\") " Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-scripts\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzcgj\" (UniqueName: \"kubernetes.io/projected/11022c93-b7e5-4241-812c-eed16cc40ce2-kube-api-access-xzcgj\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928884 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-config\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.928957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.933032 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.933072 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.933118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee367ab-367a-4b7e-bf75-0c7ae0964933-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cee367ab-367a-4b7e-bf75-0c7ae0964933" (UID: "cee367ab-367a-4b7e-bf75-0c7ae0964933"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:33 crc kubenswrapper[4707]: E0121 15:20:33.933146 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift podName:c9958390-b164-4ae8-b834-d1df8a561b95 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:34.933117411 +0000 UTC m=+1132.114633633 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift") pod "swift-storage-0" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95") : configmap "swift-ring-files" not found Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.933515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e94be5be-28c9-4cf4-92bd-f08538ff3bd8" (UID: "e94be5be-28c9-4cf4-92bd-f08538ff3bd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.939135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-kube-api-access-fwbkh" (OuterVolumeSpecName: "kube-api-access-fwbkh") pod "e94be5be-28c9-4cf4-92bd-f08538ff3bd8" (UID: "e94be5be-28c9-4cf4-92bd-f08538ff3bd8"). InnerVolumeSpecName "kube-api-access-fwbkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:33 crc kubenswrapper[4707]: I0121 15:20:33.939423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee367ab-367a-4b7e-bf75-0c7ae0964933-kube-api-access-ph798" (OuterVolumeSpecName: "kube-api-access-ph798") pod "cee367ab-367a-4b7e-bf75-0c7ae0964933" (UID: "cee367ab-367a-4b7e-bf75-0c7ae0964933"). InnerVolumeSpecName "kube-api-access-ph798". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-scripts\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzcgj\" (UniqueName: \"kubernetes.io/projected/11022c93-b7e5-4241-812c-eed16cc40ce2-kube-api-access-xzcgj\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-config\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031630 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee367ab-367a-4b7e-bf75-0c7ae0964933-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031642 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwbkh\" (UniqueName: 
\"kubernetes.io/projected/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-kube-api-access-fwbkh\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph798\" (UniqueName: \"kubernetes.io/projected/cee367ab-367a-4b7e-bf75-0c7ae0964933-kube-api-access-ph798\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.031664 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e94be5be-28c9-4cf4-92bd-f08538ff3bd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.032062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.032261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-scripts\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.032465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-config\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.034962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.036021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.036097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.046913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzcgj\" (UniqueName: \"kubernetes.io/projected/11022c93-b7e5-4241-812c-eed16cc40ce2-kube-api-access-xzcgj\") pod \"ovn-northd-0\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.082501 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.120320 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-zlwl4"] Jan 21 15:20:34 crc kubenswrapper[4707]: W0121 15:20:34.124088 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd89d96bb_d73b_45b0_a807_1644c19c9a4c.slice/crio-80dacc17718bf33c8470fbb1f4ead0256e1ff609da892b81218ebb9d48e5e4ae WatchSource:0}: Error finding container 80dacc17718bf33c8470fbb1f4ead0256e1ff609da892b81218ebb9d48e5e4ae: Status 404 returned error can't find the container with id 80dacc17718bf33c8470fbb1f4ead0256e1ff609da892b81218ebb9d48e5e4ae Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.365703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" event={"ID":"e94be5be-28c9-4cf4-92bd-f08538ff3bd8","Type":"ContainerDied","Data":"13b53250452ab6190be974634d9fb87eaca96d9b613ce6abc248a7fff6287c4c"} Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.366024 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b53250452ab6190be974634d9fb87eaca96d9b613ce6abc248a7fff6287c4c" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.365720 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.366836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" event={"ID":"d89d96bb-d73b-45b0-a807-1644c19c9a4c","Type":"ContainerStarted","Data":"ab3eeddc6598c297fab3d07fa539d52160b58955adf32604c651489c3a40b876"} Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.366871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" event={"ID":"d89d96bb-d73b-45b0-a807-1644c19c9a4c","Type":"ContainerStarted","Data":"80dacc17718bf33c8470fbb1f4ead0256e1ff609da892b81218ebb9d48e5e4ae"} Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.368228 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-lgngm" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.368252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-lgngm" event={"ID":"cee367ab-367a-4b7e-bf75-0c7ae0964933","Type":"ContainerDied","Data":"a881a874df1436e46cdac9f0b5e780ae0919d8b3b4bfbe511a266332a038c41f"} Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.368287 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a881a874df1436e46cdac9f0b5e780ae0919d8b3b4bfbe511a266332a038c41f" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.368393 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.382497 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.393606 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" podStartSLOduration=1.393588268 podStartE2EDuration="1.393588268s" podCreationTimestamp="2026-01-21 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:34.392179128 +0000 UTC m=+1131.573695350" watchObservedRunningTime="2026-01-21 15:20:34.393588268 +0000 UTC m=+1131.575104490" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.478492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:20:34 crc kubenswrapper[4707]: W0121 15:20:34.485034 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11022c93_b7e5_4241_812c_eed16cc40ce2.slice/crio-412a687571ac458ce8139915f80a087668d12812bd398a0a6eabba5476dde1a9 WatchSource:0}: Error finding container 412a687571ac458ce8139915f80a087668d12812bd398a0a6eabba5476dde1a9: Status 404 returned error can't find the container with id 412a687571ac458ce8139915f80a087668d12812bd398a0a6eabba5476dde1a9 Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-dispersionconf\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-combined-ca-bundle\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-swiftconf\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-scripts\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b630ab3-a5e6-430b-96be-6b09bbd08276-etc-swift\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2m82\" (UniqueName: \"kubernetes.io/projected/4b630ab3-a5e6-430b-96be-6b09bbd08276-kube-api-access-q2m82\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541505 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-ring-data-devices\") pod \"4b630ab3-a5e6-430b-96be-6b09bbd08276\" (UID: \"4b630ab3-a5e6-430b-96be-6b09bbd08276\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.541924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b630ab3-a5e6-430b-96be-6b09bbd08276-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.542374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-scripts" (OuterVolumeSpecName: "scripts") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.542831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.542967 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b630ab3-a5e6-430b-96be-6b09bbd08276-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.544758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.548649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.548852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.552855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b630ab3-a5e6-430b-96be-6b09bbd08276-kube-api-access-q2m82" (OuterVolumeSpecName: "kube-api-access-q2m82") pod "4b630ab3-a5e6-430b-96be-6b09bbd08276" (UID: "4b630ab3-a5e6-430b-96be-6b09bbd08276"). 
InnerVolumeSpecName "kube-api-access-q2m82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.636881 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.644466 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.644492 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.644503 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b630ab3-a5e6-430b-96be-6b09bbd08276-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.644513 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.644521 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2m82\" (UniqueName: \"kubernetes.io/projected/4b630ab3-a5e6-430b-96be-6b09bbd08276-kube-api-access-q2m82\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.644528 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b630ab3-a5e6-430b-96be-6b09bbd08276-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.745637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t88rh\" (UniqueName: \"kubernetes.io/projected/443064ab-05cf-4215-9029-9d1c8adb1934-kube-api-access-t88rh\") pod \"443064ab-05cf-4215-9029-9d1c8adb1934\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.746051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443064ab-05cf-4215-9029-9d1c8adb1934-operator-scripts\") pod \"443064ab-05cf-4215-9029-9d1c8adb1934\" (UID: \"443064ab-05cf-4215-9029-9d1c8adb1934\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.747366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443064ab-05cf-4215-9029-9d1c8adb1934-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "443064ab-05cf-4215-9029-9d1c8adb1934" (UID: "443064ab-05cf-4215-9029-9d1c8adb1934"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.751096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443064ab-05cf-4215-9029-9d1c8adb1934-kube-api-access-t88rh" (OuterVolumeSpecName: "kube-api-access-t88rh") pod "443064ab-05cf-4215-9029-9d1c8adb1934" (UID: "443064ab-05cf-4215-9029-9d1c8adb1934"). InnerVolumeSpecName "kube-api-access-t88rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.770714 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.789255 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.791056 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.850260 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t88rh\" (UniqueName: \"kubernetes.io/projected/443064ab-05cf-4215-9029-9d1c8adb1934-kube-api-access-t88rh\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.850292 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443064ab-05cf-4215-9029-9d1c8adb1934-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.951612 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5c5x\" (UniqueName: \"kubernetes.io/projected/5a8304c0-9267-4892-8907-574c07ead717-kube-api-access-p5c5x\") pod \"5a8304c0-9267-4892-8907-574c07ead717\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.951978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8304c0-9267-4892-8907-574c07ead717-operator-scripts\") pod \"5a8304c0-9267-4892-8907-574c07ead717\" (UID: \"5a8304c0-9267-4892-8907-574c07ead717\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvfrd\" (UniqueName: \"kubernetes.io/projected/5838abad-cfc4-4b71-a97d-0a96e327ddd7-kube-api-access-vvfrd\") pod \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-operator-scripts\") pod \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5838abad-cfc4-4b71-a97d-0a96e327ddd7-operator-scripts\") pod \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\" (UID: \"5838abad-cfc4-4b71-a97d-0a96e327ddd7\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8304c0-9267-4892-8907-574c07ead717-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a8304c0-9267-4892-8907-574c07ead717" (UID: "5a8304c0-9267-4892-8907-574c07ead717"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gs98\" (UniqueName: \"kubernetes.io/projected/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-kube-api-access-2gs98\") pod \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\" (UID: \"aa1c474d-9462-45ca-8889-0e62e9fa5fe4\") " Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa1c474d-9462-45ca-8889-0e62e9fa5fe4" (UID: "aa1c474d-9462-45ca-8889-0e62e9fa5fe4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.952897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5838abad-cfc4-4b71-a97d-0a96e327ddd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5838abad-cfc4-4b71-a97d-0a96e327ddd7" (UID: "5838abad-cfc4-4b71-a97d-0a96e327ddd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: E0121 15:20:34.953089 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:20:34 crc kubenswrapper[4707]: E0121 15:20:34.953110 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:20:34 crc kubenswrapper[4707]: E0121 15:20:34.953168 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift podName:c9958390-b164-4ae8-b834-d1df8a561b95 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:36.953149433 +0000 UTC m=+1134.134665655 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift") pod "swift-storage-0" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95") : configmap "swift-ring-files" not found Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.953256 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.953321 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5838abad-cfc4-4b71-a97d-0a96e327ddd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.953373 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8304c0-9267-4892-8907-574c07ead717-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.954574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8304c0-9267-4892-8907-574c07ead717-kube-api-access-p5c5x" (OuterVolumeSpecName: "kube-api-access-p5c5x") pod "5a8304c0-9267-4892-8907-574c07ead717" (UID: "5a8304c0-9267-4892-8907-574c07ead717"). InnerVolumeSpecName "kube-api-access-p5c5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.955190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5838abad-cfc4-4b71-a97d-0a96e327ddd7-kube-api-access-vvfrd" (OuterVolumeSpecName: "kube-api-access-vvfrd") pod "5838abad-cfc4-4b71-a97d-0a96e327ddd7" (UID: "5838abad-cfc4-4b71-a97d-0a96e327ddd7"). InnerVolumeSpecName "kube-api-access-vvfrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:34 crc kubenswrapper[4707]: I0121 15:20:34.957289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-kube-api-access-2gs98" (OuterVolumeSpecName: "kube-api-access-2gs98") pod "aa1c474d-9462-45ca-8889-0e62e9fa5fe4" (UID: "aa1c474d-9462-45ca-8889-0e62e9fa5fe4"). InnerVolumeSpecName "kube-api-access-2gs98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.055257 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5c5x\" (UniqueName: \"kubernetes.io/projected/5a8304c0-9267-4892-8907-574c07ead717-kube-api-access-p5c5x\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.055287 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvfrd\" (UniqueName: \"kubernetes.io/projected/5838abad-cfc4-4b71-a97d-0a96e327ddd7-kube-api-access-vvfrd\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.055296 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gs98\" (UniqueName: \"kubernetes.io/projected/aa1c474d-9462-45ca-8889-0e62e9fa5fe4-kube-api-access-2gs98\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.379380 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" event={"ID":"5a8304c0-9267-4892-8907-574c07ead717","Type":"ContainerDied","Data":"caaca7736951b0334d689e91025d52aef6ab214800993df6f55813719049298b"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.379436 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caaca7736951b0334d689e91025d52aef6ab214800993df6f55813719049298b" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.379527 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-4cm8d" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.382247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" event={"ID":"aa1c474d-9462-45ca-8889-0e62e9fa5fe4","Type":"ContainerDied","Data":"5f4b0fef729a6f0f83851536dd3303b252189f358effa207ac3e907ba41522c9"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.382282 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f4b0fef729a6f0f83851536dd3303b252189f358effa207ac3e907ba41522c9" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.382313 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-s5sw9" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.383926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gj2xh" event={"ID":"443064ab-05cf-4215-9029-9d1c8adb1934","Type":"ContainerDied","Data":"caf4b515d6ab67cbd7801e86882aa885bc290fd1adf2f31ce50cc403b3389908"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.383965 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf4b515d6ab67cbd7801e86882aa885bc290fd1adf2f31ce50cc403b3389908" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.384034 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gj2xh" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.385741 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vqr2f" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.385798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-vqr2f" event={"ID":"5838abad-cfc4-4b71-a97d-0a96e327ddd7","Type":"ContainerDied","Data":"ee04cdad4b1da5323aca6084fc5bbc598c67c70ada9e09b164a6a23f2b22762e"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.385839 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee04cdad4b1da5323aca6084fc5bbc598c67c70ada9e09b164a6a23f2b22762e" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.387384 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-bdwtz" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.388189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"11022c93-b7e5-4241-812c-eed16cc40ce2","Type":"ContainerStarted","Data":"eeacdf781fdc40d12fa1f766ca9a1e9ce44c84afa1271e3f9baffd10cf9d9255"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.388218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.388229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"11022c93-b7e5-4241-812c-eed16cc40ce2","Type":"ContainerStarted","Data":"8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.388249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"11022c93-b7e5-4241-812c-eed16cc40ce2","Type":"ContainerStarted","Data":"412a687571ac458ce8139915f80a087668d12812bd398a0a6eabba5476dde1a9"} Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.407275 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.4072638729999998 podStartE2EDuration="2.407263873s" podCreationTimestamp="2026-01-21 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:35.406508992 +0000 UTC m=+1132.588025214" watchObservedRunningTime="2026-01-21 15:20:35.407263873 +0000 UTC m=+1132.588780095" Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.438657 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-bdwtz"] Jan 21 15:20:35 crc kubenswrapper[4707]: I0121 15:20:35.443317 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-bdwtz"] Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.910594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4bftn"] Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.911257 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1c474d-9462-45ca-8889-0e62e9fa5fe4" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911271 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1c474d-9462-45ca-8889-0e62e9fa5fe4" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.911284 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5a8304c0-9267-4892-8907-574c07ead717" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911290 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8304c0-9267-4892-8907-574c07ead717" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.911301 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee367ab-367a-4b7e-bf75-0c7ae0964933" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee367ab-367a-4b7e-bf75-0c7ae0964933" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.911316 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443064ab-05cf-4215-9029-9d1c8adb1934" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911321 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="443064ab-05cf-4215-9029-9d1c8adb1934" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.911334 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5838abad-cfc4-4b71-a97d-0a96e327ddd7" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911340 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5838abad-cfc4-4b71-a97d-0a96e327ddd7" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.911361 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94be5be-28c9-4cf4-92bd-f08538ff3bd8" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911368 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94be5be-28c9-4cf4-92bd-f08538ff3bd8" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911529 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="443064ab-05cf-4215-9029-9d1c8adb1934" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911541 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8304c0-9267-4892-8907-574c07ead717" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911548 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5838abad-cfc4-4b71-a97d-0a96e327ddd7" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911556 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee367ab-367a-4b7e-bf75-0c7ae0964933" containerName="mariadb-database-create" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1c474d-9462-45ca-8889-0e62e9fa5fe4" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.911572 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94be5be-28c9-4cf4-92bd-f08538ff3bd8" containerName="mariadb-account-create-update" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.912183 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.914108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.914696 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-kn6vz" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.926476 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4bftn"] Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.988465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.988635 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.988780 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.988752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-db-sync-config-data\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.988895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-config-data\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.989002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-combined-ca-bundle\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:36 crc kubenswrapper[4707]: I0121 15:20:36.989047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsjmz\" (UniqueName: \"kubernetes.io/projected/6541749d-0704-4c9c-adec-82e776820a93-kube-api-access-hsjmz\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:36 crc kubenswrapper[4707]: E0121 15:20:36.989150 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift podName:c9958390-b164-4ae8-b834-d1df8a561b95 nodeName:}" failed. No retries permitted until 2026-01-21 15:20:40.989134572 +0000 UTC m=+1138.170650794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift") pod "swift-storage-0" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95") : configmap "swift-ring-files" not found Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.089876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-combined-ca-bundle\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.089923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsjmz\" (UniqueName: \"kubernetes.io/projected/6541749d-0704-4c9c-adec-82e776820a93-kube-api-access-hsjmz\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.089981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-db-sync-config-data\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.090006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-config-data\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.096420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-config-data\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.097688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-db-sync-config-data\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.105306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsjmz\" (UniqueName: \"kubernetes.io/projected/6541749d-0704-4c9c-adec-82e776820a93-kube-api-access-hsjmz\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.109871 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-combined-ca-bundle\") pod \"glance-db-sync-4bftn\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.189987 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b630ab3-a5e6-430b-96be-6b09bbd08276" 
path="/var/lib/kubelet/pods/4b630ab3-a5e6-430b-96be-6b09bbd08276/volumes" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.227769 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:37 crc kubenswrapper[4707]: I0121 15:20:37.620197 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4bftn"] Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.304014 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpzhh"] Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.305083 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.307050 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.317652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpzhh"] Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.325572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl2tb\" (UniqueName: \"kubernetes.io/projected/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-kube-api-access-gl2tb\") pod \"root-account-create-update-kpzhh\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.325608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-operator-scripts\") pod \"root-account-create-update-kpzhh\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.410754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4bftn" event={"ID":"6541749d-0704-4c9c-adec-82e776820a93","Type":"ContainerStarted","Data":"9cf6b1987a60900afa662dfbbf59d1ae2b96a810d41798201bc72c7371a322e5"} Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.411114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4bftn" event={"ID":"6541749d-0704-4c9c-adec-82e776820a93","Type":"ContainerStarted","Data":"2b182a5686cb82cea3427d0889e877de8dd9cbbf6d87b5374ef87b41c252ee57"} Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.426316 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-4bftn" podStartSLOduration=2.426302033 podStartE2EDuration="2.426302033s" podCreationTimestamp="2026-01-21 15:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:38.421508005 +0000 UTC m=+1135.603024237" watchObservedRunningTime="2026-01-21 15:20:38.426302033 +0000 UTC m=+1135.607818255" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.427075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl2tb\" (UniqueName: \"kubernetes.io/projected/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-kube-api-access-gl2tb\") pod 
\"root-account-create-update-kpzhh\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.427114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-operator-scripts\") pod \"root-account-create-update-kpzhh\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.427926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-operator-scripts\") pod \"root-account-create-update-kpzhh\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.450647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl2tb\" (UniqueName: \"kubernetes.io/projected/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-kube-api-access-gl2tb\") pod \"root-account-create-update-kpzhh\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:38 crc kubenswrapper[4707]: I0121 15:20:38.623188 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:39 crc kubenswrapper[4707]: I0121 15:20:39.002328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpzhh"] Jan 21 15:20:39 crc kubenswrapper[4707]: I0121 15:20:39.418462 4707 generic.go:334] "Generic (PLEG): container finished" podID="777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" containerID="ef39347322b8be3a3ce0794824042d8b99eb534953ab353e81ab65e6783f7ddf" exitCode=0 Jan 21 15:20:39 crc kubenswrapper[4707]: I0121 15:20:39.418540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" event={"ID":"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2","Type":"ContainerDied","Data":"ef39347322b8be3a3ce0794824042d8b99eb534953ab353e81ab65e6783f7ddf"} Jan 21 15:20:39 crc kubenswrapper[4707]: I0121 15:20:39.418562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" event={"ID":"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2","Type":"ContainerStarted","Data":"acf97176bb86961bd92552e61353dedde41a271590b2a0c13d7628822a0955a1"} Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.425621 4707 generic.go:334] "Generic (PLEG): container finished" podID="6541749d-0704-4c9c-adec-82e776820a93" containerID="9cf6b1987a60900afa662dfbbf59d1ae2b96a810d41798201bc72c7371a322e5" exitCode=0 Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.425687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4bftn" event={"ID":"6541749d-0704-4c9c-adec-82e776820a93","Type":"ContainerDied","Data":"9cf6b1987a60900afa662dfbbf59d1ae2b96a810d41798201bc72c7371a322e5"} Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.427414 4707 generic.go:334] "Generic (PLEG): container finished" podID="d89d96bb-d73b-45b0-a807-1644c19c9a4c" containerID="ab3eeddc6598c297fab3d07fa539d52160b58955adf32604c651489c3a40b876" exitCode=0 Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 
15:20:40.427494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" event={"ID":"d89d96bb-d73b-45b0-a807-1644c19c9a4c","Type":"ContainerDied","Data":"ab3eeddc6598c297fab3d07fa539d52160b58955adf32604c651489c3a40b876"} Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.691110 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.758965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl2tb\" (UniqueName: \"kubernetes.io/projected/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-kube-api-access-gl2tb\") pod \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.759196 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-operator-scripts\") pod \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\" (UID: \"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2\") " Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.759754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" (UID: "777ebaa8-4dd7-4ad8-aafa-81e19c8441c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.764227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-kube-api-access-gl2tb" (OuterVolumeSpecName: "kube-api-access-gl2tb") pod "777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" (UID: "777ebaa8-4dd7-4ad8-aafa-81e19c8441c2"). InnerVolumeSpecName "kube-api-access-gl2tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.862110 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:40 crc kubenswrapper[4707]: I0121 15:20:40.862142 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl2tb\" (UniqueName: \"kubernetes.io/projected/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2-kube-api-access-gl2tb\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.064989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.069026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"swift-storage-0\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.262103 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.448208 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.448249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kpzhh" event={"ID":"777ebaa8-4dd7-4ad8-aafa-81e19c8441c2","Type":"ContainerDied","Data":"acf97176bb86961bd92552e61353dedde41a271590b2a0c13d7628822a0955a1"} Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.448543 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf97176bb86961bd92552e61353dedde41a271590b2a0c13d7628822a0955a1" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.619376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:20:41 crc kubenswrapper[4707]: W0121 15:20:41.648132 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9958390_b164_4ae8_b834_d1df8a561b95.slice/crio-13f4b075478eda21b5827d16a44012dda1eeca3faa6e611a7cae524d0ec4ca5b WatchSource:0}: Error finding container 13f4b075478eda21b5827d16a44012dda1eeca3faa6e611a7cae524d0ec4ca5b: Status 404 returned error can't find the container with id 13f4b075478eda21b5827d16a44012dda1eeca3faa6e611a7cae524d0ec4ca5b Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.795220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.816038 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.872940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-dispersionconf\") pod \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-swiftconf\") pod \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-ring-data-devices\") pod \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsjmz\" (UniqueName: \"kubernetes.io/projected/6541749d-0704-4c9c-adec-82e776820a93-kube-api-access-hsjmz\") pod \"6541749d-0704-4c9c-adec-82e776820a93\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq5j8\" (UniqueName: \"kubernetes.io/projected/d89d96bb-d73b-45b0-a807-1644c19c9a4c-kube-api-access-lq5j8\") pod \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d89d96bb-d73b-45b0-a807-1644c19c9a4c-etc-swift\") pod \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-config-data\") pod \"6541749d-0704-4c9c-adec-82e776820a93\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-scripts\") pod \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-db-sync-config-data\") pod \"6541749d-0704-4c9c-adec-82e776820a93\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-combined-ca-bundle\") pod 
\"d89d96bb-d73b-45b0-a807-1644c19c9a4c\" (UID: \"d89d96bb-d73b-45b0-a807-1644c19c9a4c\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.873505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-combined-ca-bundle\") pod \"6541749d-0704-4c9c-adec-82e776820a93\" (UID: \"6541749d-0704-4c9c-adec-82e776820a93\") " Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.874140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89d96bb-d73b-45b0-a807-1644c19c9a4c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.874569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.877933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89d96bb-d73b-45b0-a807-1644c19c9a4c-kube-api-access-lq5j8" (OuterVolumeSpecName: "kube-api-access-lq5j8") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "kube-api-access-lq5j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.879566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6541749d-0704-4c9c-adec-82e776820a93" (UID: "6541749d-0704-4c9c-adec-82e776820a93"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.880445 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.881464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6541749d-0704-4c9c-adec-82e776820a93-kube-api-access-hsjmz" (OuterVolumeSpecName: "kube-api-access-hsjmz") pod "6541749d-0704-4c9c-adec-82e776820a93" (UID: "6541749d-0704-4c9c-adec-82e776820a93"). InnerVolumeSpecName "kube-api-access-hsjmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.890986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-scripts" (OuterVolumeSpecName: "scripts") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.891625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.893416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6541749d-0704-4c9c-adec-82e776820a93" (UID: "6541749d-0704-4c9c-adec-82e776820a93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.899975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d89d96bb-d73b-45b0-a807-1644c19c9a4c" (UID: "d89d96bb-d73b-45b0-a807-1644c19c9a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.909844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-config-data" (OuterVolumeSpecName: "config-data") pod "6541749d-0704-4c9c-adec-82e776820a93" (UID: "6541749d-0704-4c9c-adec-82e776820a93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.974956 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.974983 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.974993 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsjmz\" (UniqueName: \"kubernetes.io/projected/6541749d-0704-4c9c-adec-82e776820a93-kube-api-access-hsjmz\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975011 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq5j8\" (UniqueName: \"kubernetes.io/projected/d89d96bb-d73b-45b0-a807-1644c19c9a4c-kube-api-access-lq5j8\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975019 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d89d96bb-d73b-45b0-a807-1644c19c9a4c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975027 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975035 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89d96bb-d73b-45b0-a807-1644c19c9a4c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975042 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975049 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89d96bb-d73b-45b0-a807-1644c19c9a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:41 crc kubenswrapper[4707]: I0121 15:20:41.975056 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6541749d-0704-4c9c-adec-82e776820a93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.466532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"13f4b075478eda21b5827d16a44012dda1eeca3faa6e611a7cae524d0ec4ca5b"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.470269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" event={"ID":"d89d96bb-d73b-45b0-a807-1644c19c9a4c","Type":"ContainerDied","Data":"80dacc17718bf33c8470fbb1f4ead0256e1ff609da892b81218ebb9d48e5e4ae"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.470310 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80dacc17718bf33c8470fbb1f4ead0256e1ff609da892b81218ebb9d48e5e4ae" Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.470377 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-zlwl4" Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.475347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4bftn" event={"ID":"6541749d-0704-4c9c-adec-82e776820a93","Type":"ContainerDied","Data":"2b182a5686cb82cea3427d0889e877de8dd9cbbf6d87b5374ef87b41c252ee57"} Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.475384 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b182a5686cb82cea3427d0889e877de8dd9cbbf6d87b5374ef87b41c252ee57" Jan 21 15:20:42 crc kubenswrapper[4707]: I0121 15:20:42.475395 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4bftn" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072"} Jan 21 
15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.494869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerStarted","Data":"a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877"} Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.527522 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=10.52750464 podStartE2EDuration="10.52750464s" podCreationTimestamp="2026-01-21 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:43.518531109 +0000 UTC m=+1140.700047331" watchObservedRunningTime="2026-01-21 15:20:43.52750464 +0000 UTC m=+1140.709020861" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646305 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7"] Jan 21 15:20:43 crc kubenswrapper[4707]: E0121 15:20:43.646559 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" containerName="mariadb-account-create-update" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646574 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" containerName="mariadb-account-create-update" Jan 21 15:20:43 crc kubenswrapper[4707]: E0121 15:20:43.646594 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6541749d-0704-4c9c-adec-82e776820a93" containerName="glance-db-sync" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646599 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6541749d-0704-4c9c-adec-82e776820a93" containerName="glance-db-sync" Jan 21 15:20:43 crc kubenswrapper[4707]: E0121 15:20:43.646606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89d96bb-d73b-45b0-a807-1644c19c9a4c" containerName="swift-ring-rebalance" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646611 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89d96bb-d73b-45b0-a807-1644c19c9a4c" containerName="swift-ring-rebalance" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646736 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89d96bb-d73b-45b0-a807-1644c19c9a4c" containerName="swift-ring-rebalance" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646746 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6541749d-0704-4c9c-adec-82e776820a93" containerName="glance-db-sync" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.646753 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" containerName="mariadb-account-create-update" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.647399 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.649363 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.654323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7"] Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.802762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpxl\" (UniqueName: \"kubernetes.io/projected/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-kube-api-access-pvpxl\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.802861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-config\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.802927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.803055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.904884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpxl\" (UniqueName: \"kubernetes.io/projected/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-kube-api-access-pvpxl\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.904967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-config\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.905024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.905059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" 
(UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.905831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.906043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.906587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-config\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.920631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpxl\" (UniqueName: \"kubernetes.io/projected/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-kube-api-access-pvpxl\") pod \"dnsmasq-dnsmasq-7b69d68bf7-xmkr7\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:43 crc kubenswrapper[4707]: I0121 15:20:43.981872 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:44 crc kubenswrapper[4707]: I0121 15:20:44.130833 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:20:44 crc kubenswrapper[4707]: I0121 15:20:44.358756 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7"] Jan 21 15:20:44 crc kubenswrapper[4707]: I0121 15:20:44.505396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" event={"ID":"4543cf32-1b8e-4b5d-8189-2cef6a154e9b","Type":"ContainerStarted","Data":"229d40c510ef1bc6a30ce808f2d81467dcae253935998ca27a2f9d7c2e6acd52"} Jan 21 15:20:44 crc kubenswrapper[4707]: I0121 15:20:44.713414 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpzhh"] Jan 21 15:20:44 crc kubenswrapper[4707]: I0121 15:20:44.720290 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kpzhh"] Jan 21 15:20:45 crc kubenswrapper[4707]: I0121 15:20:45.191884 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777ebaa8-4dd7-4ad8-aafa-81e19c8441c2" path="/var/lib/kubelet/pods/777ebaa8-4dd7-4ad8-aafa-81e19c8441c2/volumes" Jan 21 15:20:45 crc kubenswrapper[4707]: I0121 15:20:45.512727 4707 generic.go:334] "Generic (PLEG): container finished" podID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerID="5f28403aebcb12ef2e6138d7601af2b5ea7ec05817ec3373881a8597876e713e" exitCode=0 Jan 21 15:20:45 crc kubenswrapper[4707]: I0121 15:20:45.512775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" event={"ID":"4543cf32-1b8e-4b5d-8189-2cef6a154e9b","Type":"ContainerDied","Data":"5f28403aebcb12ef2e6138d7601af2b5ea7ec05817ec3373881a8597876e713e"} Jan 21 15:20:46 crc kubenswrapper[4707]: I0121 15:20:46.525983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" event={"ID":"4543cf32-1b8e-4b5d-8189-2cef6a154e9b","Type":"ContainerStarted","Data":"302ea15ef3aee89402cad8d53e32bed232af74eea08c703bbb3d5a07199d6fc4"} Jan 21 15:20:46 crc kubenswrapper[4707]: I0121 15:20:46.526274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:46 crc kubenswrapper[4707]: I0121 15:20:46.543115 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" podStartSLOduration=3.543099522 podStartE2EDuration="3.543099522s" podCreationTimestamp="2026-01-21 15:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:46.538562017 +0000 UTC m=+1143.720078239" watchObservedRunningTime="2026-01-21 15:20:46.543099522 +0000 UTC m=+1143.724615744" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.735534 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pdhwl"] Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.736590 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.738537 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.751212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pdhwl"] Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.791965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96hhq\" (UniqueName: \"kubernetes.io/projected/5185f1f9-4519-435b-a4fd-53fc033ad864-kube-api-access-96hhq\") pod \"root-account-create-update-pdhwl\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.792010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5185f1f9-4519-435b-a4fd-53fc033ad864-operator-scripts\") pod \"root-account-create-update-pdhwl\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.894921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96hhq\" (UniqueName: \"kubernetes.io/projected/5185f1f9-4519-435b-a4fd-53fc033ad864-kube-api-access-96hhq\") pod \"root-account-create-update-pdhwl\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.895005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5185f1f9-4519-435b-a4fd-53fc033ad864-operator-scripts\") pod \"root-account-create-update-pdhwl\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.895793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5185f1f9-4519-435b-a4fd-53fc033ad864-operator-scripts\") pod \"root-account-create-update-pdhwl\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:49 crc kubenswrapper[4707]: I0121 15:20:49.913653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96hhq\" (UniqueName: \"kubernetes.io/projected/5185f1f9-4519-435b-a4fd-53fc033ad864-kube-api-access-96hhq\") pod \"root-account-create-update-pdhwl\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:50 crc kubenswrapper[4707]: I0121 15:20:50.049758 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:50 crc kubenswrapper[4707]: I0121 15:20:50.418137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pdhwl"] Jan 21 15:20:50 crc kubenswrapper[4707]: I0121 15:20:50.557210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" event={"ID":"5185f1f9-4519-435b-a4fd-53fc033ad864","Type":"ContainerStarted","Data":"2aa76b2b39f9c9928ae119ff9fce57b787162d559109e1ae1c054d3f3360f9ca"} Jan 21 15:20:51 crc kubenswrapper[4707]: I0121 15:20:51.567546 4707 generic.go:334] "Generic (PLEG): container finished" podID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerID="f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b" exitCode=0 Jan 21 15:20:51 crc kubenswrapper[4707]: I0121 15:20:51.567629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1bc7660e-aaa9-4865-a873-4db17c1db628","Type":"ContainerDied","Data":"f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b"} Jan 21 15:20:51 crc kubenswrapper[4707]: I0121 15:20:51.570307 4707 generic.go:334] "Generic (PLEG): container finished" podID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerID="f96114fc5af659c21483e07417feb1e1da0f1dbf8acbb58d75419c23defc0836" exitCode=0 Jan 21 15:20:51 crc kubenswrapper[4707]: I0121 15:20:51.570388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"4fbbd17b-fed9-463c-9c25-435f84d0205e","Type":"ContainerDied","Data":"f96114fc5af659c21483e07417feb1e1da0f1dbf8acbb58d75419c23defc0836"} Jan 21 15:20:51 crc kubenswrapper[4707]: I0121 15:20:51.573905 4707 generic.go:334] "Generic (PLEG): container finished" podID="5185f1f9-4519-435b-a4fd-53fc033ad864" containerID="dfafa233b16792821e92a67c522a5cbe5d8dd451e5c5bbe6fab6dcd63eb69869" exitCode=0 Jan 21 15:20:51 crc kubenswrapper[4707]: I0121 15:20:51.574019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" event={"ID":"5185f1f9-4519-435b-a4fd-53fc033ad864","Type":"ContainerDied","Data":"dfafa233b16792821e92a67c522a5cbe5d8dd451e5c5bbe6fab6dcd63eb69869"} Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.584801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"4fbbd17b-fed9-463c-9c25-435f84d0205e","Type":"ContainerStarted","Data":"bbadb76a87972ac7320d59c53dfb93227092e217045548f169a94fd6a5c429ce"} Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.586712 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.587161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1bc7660e-aaa9-4865-a873-4db17c1db628","Type":"ContainerStarted","Data":"a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8"} Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.587404 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.615278 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.615259317 
podStartE2EDuration="36.615259317s" podCreationTimestamp="2026-01-21 15:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:52.607018945 +0000 UTC m=+1149.788535168" watchObservedRunningTime="2026-01-21 15:20:52.615259317 +0000 UTC m=+1149.796775539" Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.628859 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.628843159 podStartE2EDuration="36.628843159s" podCreationTimestamp="2026-01-21 15:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:20:52.625404671 +0000 UTC m=+1149.806920893" watchObservedRunningTime="2026-01-21 15:20:52.628843159 +0000 UTC m=+1149.810359381" Jan 21 15:20:52 crc kubenswrapper[4707]: I0121 15:20:52.912357 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.046455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5185f1f9-4519-435b-a4fd-53fc033ad864-operator-scripts\") pod \"5185f1f9-4519-435b-a4fd-53fc033ad864\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.046564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96hhq\" (UniqueName: \"kubernetes.io/projected/5185f1f9-4519-435b-a4fd-53fc033ad864-kube-api-access-96hhq\") pod \"5185f1f9-4519-435b-a4fd-53fc033ad864\" (UID: \"5185f1f9-4519-435b-a4fd-53fc033ad864\") " Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.047961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5185f1f9-4519-435b-a4fd-53fc033ad864-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5185f1f9-4519-435b-a4fd-53fc033ad864" (UID: "5185f1f9-4519-435b-a4fd-53fc033ad864"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.051628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5185f1f9-4519-435b-a4fd-53fc033ad864-kube-api-access-96hhq" (OuterVolumeSpecName: "kube-api-access-96hhq") pod "5185f1f9-4519-435b-a4fd-53fc033ad864" (UID: "5185f1f9-4519-435b-a4fd-53fc033ad864"). InnerVolumeSpecName "kube-api-access-96hhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.148278 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5185f1f9-4519-435b-a4fd-53fc033ad864-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.148324 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96hhq\" (UniqueName: \"kubernetes.io/projected/5185f1f9-4519-435b-a4fd-53fc033ad864-kube-api-access-96hhq\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.594102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" event={"ID":"5185f1f9-4519-435b-a4fd-53fc033ad864","Type":"ContainerDied","Data":"2aa76b2b39f9c9928ae119ff9fce57b787162d559109e1ae1c054d3f3360f9ca"} Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.594164 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa76b2b39f9c9928ae119ff9fce57b787162d559109e1ae1c054d3f3360f9ca" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.594132 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pdhwl" Jan 21 15:20:53 crc kubenswrapper[4707]: I0121 15:20:53.982968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.031487 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n"] Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.031775 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="dnsmasq-dns" containerID="cri-o://a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef" gracePeriod=10 Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.428775 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.572609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgg7f\" (UniqueName: \"kubernetes.io/projected/fc35eebb-a329-4334-a277-5a27b1bfba38-kube-api-access-vgg7f\") pod \"fc35eebb-a329-4334-a277-5a27b1bfba38\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.572725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-dnsmasq-svc\") pod \"fc35eebb-a329-4334-a277-5a27b1bfba38\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.572801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-config\") pod \"fc35eebb-a329-4334-a277-5a27b1bfba38\" (UID: \"fc35eebb-a329-4334-a277-5a27b1bfba38\") " Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.579655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc35eebb-a329-4334-a277-5a27b1bfba38-kube-api-access-vgg7f" (OuterVolumeSpecName: "kube-api-access-vgg7f") pod "fc35eebb-a329-4334-a277-5a27b1bfba38" (UID: "fc35eebb-a329-4334-a277-5a27b1bfba38"). InnerVolumeSpecName "kube-api-access-vgg7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.618458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "fc35eebb-a329-4334-a277-5a27b1bfba38" (UID: "fc35eebb-a329-4334-a277-5a27b1bfba38"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.620572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" event={"ID":"fc35eebb-a329-4334-a277-5a27b1bfba38","Type":"ContainerDied","Data":"a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef"} Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.620626 4707 scope.go:117] "RemoveContainer" containerID="a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.620778 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.620872 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerID="a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef" exitCode=0 Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.620898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" event={"ID":"fc35eebb-a329-4334-a277-5a27b1bfba38","Type":"ContainerDied","Data":"16f6f2238fc0146bb9c73f0b3597565fcdf99f25f26e4fac1c2f215b7fe00c96"} Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.624475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-config" (OuterVolumeSpecName: "config") pod "fc35eebb-a329-4334-a277-5a27b1bfba38" (UID: "fc35eebb-a329-4334-a277-5a27b1bfba38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.654059 4707 scope.go:117] "RemoveContainer" containerID="ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.669486 4707 scope.go:117] "RemoveContainer" containerID="a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef" Jan 21 15:20:54 crc kubenswrapper[4707]: E0121 15:20:54.669866 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef\": container with ID starting with a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef not found: ID does not exist" containerID="a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.669965 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef"} err="failed to get container status \"a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef\": rpc error: code = NotFound desc = could not find container \"a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef\": container with ID starting with a94c2c8e09962ee74fab2b790cb4410be22198a3bdd430357e9a637a7d1989ef not found: ID does not exist" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.670046 4707 scope.go:117] "RemoveContainer" containerID="ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32" Jan 21 15:20:54 crc kubenswrapper[4707]: E0121 15:20:54.670522 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32\": container with ID starting with ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32 not found: ID does not exist" containerID="ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.670546 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32"} err="failed to get container status \"ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32\": rpc error: code = NotFound desc = could not find container 
\"ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32\": container with ID starting with ccb684f31d139c9ff38e04d1accf687aa9f8eac8a4f5547e024b094d642bac32 not found: ID does not exist" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.674519 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.674600 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgg7f\" (UniqueName: \"kubernetes.io/projected/fc35eebb-a329-4334-a277-5a27b1bfba38-kube-api-access-vgg7f\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.674667 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fc35eebb-a329-4334-a277-5a27b1bfba38-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.949797 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n"] Jan 21 15:20:54 crc kubenswrapper[4707]: I0121 15:20:54.963787 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n"] Jan 21 15:20:55 crc kubenswrapper[4707]: I0121 15:20:55.188783 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" path="/var/lib/kubelet/pods/fc35eebb-a329-4334-a277-5a27b1bfba38/volumes" Jan 21 15:20:59 crc kubenswrapper[4707]: I0121 15:20:59.238430 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-m2s6n" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.196:5353: i/o timeout" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.763947 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.990072 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4zsrk"] Jan 21 15:21:07 crc kubenswrapper[4707]: E0121 15:21:07.990525 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="dnsmasq-dns" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.990542 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="dnsmasq-dns" Jan 21 15:21:07 crc kubenswrapper[4707]: E0121 15:21:07.990565 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="init" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.990571 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="init" Jan 21 15:21:07 crc kubenswrapper[4707]: E0121 15:21:07.990585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5185f1f9-4519-435b-a4fd-53fc033ad864" containerName="mariadb-account-create-update" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.990590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5185f1f9-4519-435b-a4fd-53fc033ad864" containerName="mariadb-account-create-update" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.990728 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5185f1f9-4519-435b-a4fd-53fc033ad864" containerName="mariadb-account-create-update" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.990741 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc35eebb-a329-4334-a277-5a27b1bfba38" containerName="dnsmasq-dns" Jan 21 15:21:07 crc kubenswrapper[4707]: I0121 15:21:07.991139 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.006477 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4zsrk"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.048142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.102452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zjkkw"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.103309 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.112216 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-llxnn"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.113187 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.115852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.116943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zjkkw"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.121978 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-llxnn"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.156774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-operator-scripts\") pod \"cinder-db-create-4zsrk\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.156833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8kld\" (UniqueName: \"kubernetes.io/projected/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-kube-api-access-f8kld\") pod \"cinder-db-create-4zsrk\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.195068 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.195885 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.198507 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.205794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.257841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5560e962-f3b1-420e-9e8a-f443c99d1e43-operator-scripts\") pod \"neutron-06ca-account-create-update-2ztql\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.258061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-operator-scripts\") pod \"barbican-db-create-zjkkw\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.258160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlpqh\" (UniqueName: \"kubernetes.io/projected/5560e962-f3b1-420e-9e8a-f443c99d1e43-kube-api-access-jlpqh\") pod \"neutron-06ca-account-create-update-2ztql\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.258249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgs6\" (UniqueName: \"kubernetes.io/projected/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-kube-api-access-tdgs6\") pod \"barbican-db-create-zjkkw\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.258343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb97r\" (UniqueName: \"kubernetes.io/projected/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-kube-api-access-lb97r\") pod \"barbican-6090-account-create-update-llxnn\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.258437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-operator-scripts\") pod \"barbican-6090-account-create-update-llxnn\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.258534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-operator-scripts\") pod \"cinder-db-create-4zsrk\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 
15:21:08.258597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8kld\" (UniqueName: \"kubernetes.io/projected/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-kube-api-access-f8kld\") pod \"cinder-db-create-4zsrk\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.259449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-operator-scripts\") pod \"cinder-db-create-4zsrk\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.276581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8kld\" (UniqueName: \"kubernetes.io/projected/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-kube-api-access-f8kld\") pod \"cinder-db-create-4zsrk\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.304044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.305076 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.305913 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.310044 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.321478 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.359537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5560e962-f3b1-420e-9e8a-f443c99d1e43-operator-scripts\") pod \"neutron-06ca-account-create-update-2ztql\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.359580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-operator-scripts\") pod \"barbican-db-create-zjkkw\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.359628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlpqh\" (UniqueName: \"kubernetes.io/projected/5560e962-f3b1-420e-9e8a-f443c99d1e43-kube-api-access-jlpqh\") pod \"neutron-06ca-account-create-update-2ztql\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.359675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgs6\" (UniqueName: 
\"kubernetes.io/projected/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-kube-api-access-tdgs6\") pod \"barbican-db-create-zjkkw\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.359699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb97r\" (UniqueName: \"kubernetes.io/projected/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-kube-api-access-lb97r\") pod \"barbican-6090-account-create-update-llxnn\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.359755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-operator-scripts\") pod \"barbican-6090-account-create-update-llxnn\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.360429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-operator-scripts\") pod \"barbican-6090-account-create-update-llxnn\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.360885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5560e962-f3b1-420e-9e8a-f443c99d1e43-operator-scripts\") pod \"neutron-06ca-account-create-update-2ztql\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.361921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-operator-scripts\") pod \"barbican-db-create-zjkkw\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.376943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgs6\" (UniqueName: \"kubernetes.io/projected/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-kube-api-access-tdgs6\") pod \"barbican-db-create-zjkkw\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.377474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlpqh\" (UniqueName: \"kubernetes.io/projected/5560e962-f3b1-420e-9e8a-f443c99d1e43-kube-api-access-jlpqh\") pod \"neutron-06ca-account-create-update-2ztql\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.377595 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-98266"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.378466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-98266"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 
15:21:08.378589 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.380461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb97r\" (UniqueName: \"kubernetes.io/projected/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-kube-api-access-lb97r\") pod \"barbican-6090-account-create-update-llxnn\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.388655 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.388716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4bkq4" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.388903 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.388980 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.395857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-ws576"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.396663 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.405868 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-ws576"] Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.418124 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.429197 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-combined-ca-bundle\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-config-data\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-operator-scripts\") pod \"cinder-8672-account-create-update-6qhrw\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxpcz\" (UniqueName: \"kubernetes.io/projected/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-kube-api-access-rxpcz\") pod \"neutron-db-create-ws576\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-operator-scripts\") pod \"neutron-db-create-ws576\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-kube-api-access-lqzfc\") pod \"cinder-8672-account-create-update-6qhrw\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.461825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wxp\" (UniqueName: \"kubernetes.io/projected/e0f1cb8a-c64f-4997-a896-0d7127aa309a-kube-api-access-m5wxp\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.511135 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-config-data\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-operator-scripts\") pod \"cinder-8672-account-create-update-6qhrw\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxpcz\" (UniqueName: \"kubernetes.io/projected/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-kube-api-access-rxpcz\") pod \"neutron-db-create-ws576\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-operator-scripts\") pod \"neutron-db-create-ws576\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-kube-api-access-lqzfc\") pod \"cinder-8672-account-create-update-6qhrw\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5wxp\" (UniqueName: \"kubernetes.io/projected/e0f1cb8a-c64f-4997-a896-0d7127aa309a-kube-api-access-m5wxp\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.569772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-combined-ca-bundle\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.573475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-operator-scripts\") pod \"cinder-8672-account-create-update-6qhrw\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.573656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-operator-scripts\") pod \"neutron-db-create-ws576\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.584518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-config-data\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.588323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-combined-ca-bundle\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.598393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-kube-api-access-lqzfc\") pod \"cinder-8672-account-create-update-6qhrw\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.605705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxpcz\" (UniqueName: \"kubernetes.io/projected/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-kube-api-access-rxpcz\") pod \"neutron-db-create-ws576\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.606229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5wxp\" (UniqueName: \"kubernetes.io/projected/e0f1cb8a-c64f-4997-a896-0d7127aa309a-kube-api-access-m5wxp\") pod \"keystone-db-sync-98266\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.619232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.754730 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.763458 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:08 crc kubenswrapper[4707]: I0121 15:21:08.927262 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4zsrk"] Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.039346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zjkkw"] Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.053568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-llxnn"] Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.137439 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql"] Jan 21 15:21:09 crc kubenswrapper[4707]: W0121 15:21:09.148144 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5560e962_f3b1_420e_9e8a_f443c99d1e43.slice/crio-37573d468d2d9d1a7a6770d7ae3b711ca6125baf7c8300e49d1c4b3027db4b8c WatchSource:0}: Error finding container 37573d468d2d9d1a7a6770d7ae3b711ca6125baf7c8300e49d1c4b3027db4b8c: Status 404 returned error can't find the container with id 37573d468d2d9d1a7a6770d7ae3b711ca6125baf7c8300e49d1c4b3027db4b8c Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.202563 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw"] Jan 21 15:21:09 crc kubenswrapper[4707]: W0121 15:21:09.212026 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14c1b37e_4ff9_4d56_97c1_744f743a2f9a.slice/crio-4b8b8c77955e11fe55648638aedf051e380ccd8db49cd207eebaa0b3cc1cc2fd WatchSource:0}: Error finding container 4b8b8c77955e11fe55648638aedf051e380ccd8db49cd207eebaa0b3cc1cc2fd: Status 404 returned error can't find the container with id 4b8b8c77955e11fe55648638aedf051e380ccd8db49cd207eebaa0b3cc1cc2fd Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.256355 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-98266"] Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.275120 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-ws576"] Jan 21 15:21:09 crc kubenswrapper[4707]: W0121 15:21:09.305641 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82b2b5f_76f3_4da4_b413_fd8fc4d520c0.slice/crio-48e0c9e97d0648f761edba7fbc873351708302da7df45338cefcf4ad254d3f52 WatchSource:0}: Error finding container 48e0c9e97d0648f761edba7fbc873351708302da7df45338cefcf4ad254d3f52: Status 404 returned error can't find the container with id 48e0c9e97d0648f761edba7fbc873351708302da7df45338cefcf4ad254d3f52 Jan 21 15:21:09 crc kubenswrapper[4707]: W0121 15:21:09.307585 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0f1cb8a_c64f_4997_a896_0d7127aa309a.slice/crio-75dd82d0aa18bcc54c340258bdc468373a2e835f13cfa48cea7c4af642233655 WatchSource:0}: Error finding container 75dd82d0aa18bcc54c340258bdc468373a2e835f13cfa48cea7c4af642233655: Status 404 returned error can't find the container with id 75dd82d0aa18bcc54c340258bdc468373a2e835f13cfa48cea7c4af642233655 Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.718060 
4707 generic.go:334] "Generic (PLEG): container finished" podID="5560e962-f3b1-420e-9e8a-f443c99d1e43" containerID="6d331224286c383855737ab2779d81c7a80a7051be238066760e4621490f70b0" exitCode=0 Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.718104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" event={"ID":"5560e962-f3b1-420e-9e8a-f443c99d1e43","Type":"ContainerDied","Data":"6d331224286c383855737ab2779d81c7a80a7051be238066760e4621490f70b0"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.718148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" event={"ID":"5560e962-f3b1-420e-9e8a-f443c99d1e43","Type":"ContainerStarted","Data":"37573d468d2d9d1a7a6770d7ae3b711ca6125baf7c8300e49d1c4b3027db4b8c"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.719744 4707 generic.go:334] "Generic (PLEG): container finished" podID="4dfb5d13-8ce5-49f7-a24d-4b913d85df26" containerID="f88d80aa5a2d179739c143a0050ab27e4ec78c655f22cd1f2e1091b62bc6e73b" exitCode=0 Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.719825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" event={"ID":"4dfb5d13-8ce5-49f7-a24d-4b913d85df26","Type":"ContainerDied","Data":"f88d80aa5a2d179739c143a0050ab27e4ec78c655f22cd1f2e1091b62bc6e73b"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.719861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" event={"ID":"4dfb5d13-8ce5-49f7-a24d-4b913d85df26","Type":"ContainerStarted","Data":"7a551ce9d02b395aa1997321f26dc8b59cc9f28325669f621103939e52da601e"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.721193 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a360c79-ca70-4364-87a9-5d0b24cb9b2e" containerID="a2d9743e9e081360d25dfe8246d36ca6184ede8b708a4387ffff1b0b955bd74a" exitCode=0 Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.721218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" event={"ID":"9a360c79-ca70-4364-87a9-5d0b24cb9b2e","Type":"ContainerDied","Data":"a2d9743e9e081360d25dfe8246d36ca6184ede8b708a4387ffff1b0b955bd74a"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.721241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" event={"ID":"9a360c79-ca70-4364-87a9-5d0b24cb9b2e","Type":"ContainerStarted","Data":"69f0dd70fa0f3483d33731186d854362df9d7dd01976856699d519fb1fb20249"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.722554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-98266" event={"ID":"e0f1cb8a-c64f-4997-a896-0d7127aa309a","Type":"ContainerStarted","Data":"74830281e4e80b570ce296677137e2b13d42928b30d53e08074c5321973fff8d"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.722601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-98266" event={"ID":"e0f1cb8a-c64f-4997-a896-0d7127aa309a","Type":"ContainerStarted","Data":"75dd82d0aa18bcc54c340258bdc468373a2e835f13cfa48cea7c4af642233655"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.723718 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a8c3cab-2625-44be-97b5-1d4d5f210b6e" containerID="898270f329b3e42bbbee5e74a2988492248116b7c718075edc0eb8b621339c63" 
exitCode=0 Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.723785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" event={"ID":"5a8c3cab-2625-44be-97b5-1d4d5f210b6e","Type":"ContainerDied","Data":"898270f329b3e42bbbee5e74a2988492248116b7c718075edc0eb8b621339c63"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.723819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" event={"ID":"5a8c3cab-2625-44be-97b5-1d4d5f210b6e","Type":"ContainerStarted","Data":"2097c4076af9513ca187562b829ed6dba9c31f378557195b8f88c687662b1f39"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.725052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-ws576" event={"ID":"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0","Type":"ContainerStarted","Data":"3d83cb67dec455e46dc1ce200dbbd0a61b39254b95b751d67e78d0b70627b898"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.725140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-ws576" event={"ID":"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0","Type":"ContainerStarted","Data":"48e0c9e97d0648f761edba7fbc873351708302da7df45338cefcf4ad254d3f52"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.726352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" event={"ID":"14c1b37e-4ff9-4d56-97c1-744f743a2f9a","Type":"ContainerStarted","Data":"55b0115da79e304b9f7eb8b41f3a28735875a035dbcde7cbf64e8cdf08faa4a9"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.726377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" event={"ID":"14c1b37e-4ff9-4d56-97c1-744f743a2f9a","Type":"ContainerStarted","Data":"4b8b8c77955e11fe55648638aedf051e380ccd8db49cd207eebaa0b3cc1cc2fd"} Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.746408 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-98266" podStartSLOduration=1.746395822 podStartE2EDuration="1.746395822s" podCreationTimestamp="2026-01-21 15:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:09.741163751 +0000 UTC m=+1166.922679973" watchObservedRunningTime="2026-01-21 15:21:09.746395822 +0000 UTC m=+1166.927912035" Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.782794 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" podStartSLOduration=1.782778712 podStartE2EDuration="1.782778712s" podCreationTimestamp="2026-01-21 15:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:09.77662708 +0000 UTC m=+1166.958143302" watchObservedRunningTime="2026-01-21 15:21:09.782778712 +0000 UTC m=+1166.964294933" Jan 21 15:21:09 crc kubenswrapper[4707]: I0121 15:21:09.791369 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-create-ws576" podStartSLOduration=1.791357079 podStartE2EDuration="1.791357079s" podCreationTimestamp="2026-01-21 15:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:09.787622724 +0000 UTC m=+1166.969138946" watchObservedRunningTime="2026-01-21 15:21:09.791357079 +0000 UTC m=+1166.972873291" Jan 21 15:21:10 crc kubenswrapper[4707]: I0121 15:21:10.734707 4707 generic.go:334] "Generic (PLEG): container finished" podID="b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" containerID="3d83cb67dec455e46dc1ce200dbbd0a61b39254b95b751d67e78d0b70627b898" exitCode=0 Jan 21 15:21:10 crc kubenswrapper[4707]: I0121 15:21:10.734742 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-ws576" event={"ID":"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0","Type":"ContainerDied","Data":"3d83cb67dec455e46dc1ce200dbbd0a61b39254b95b751d67e78d0b70627b898"} Jan 21 15:21:10 crc kubenswrapper[4707]: I0121 15:21:10.736251 4707 generic.go:334] "Generic (PLEG): container finished" podID="14c1b37e-4ff9-4d56-97c1-744f743a2f9a" containerID="55b0115da79e304b9f7eb8b41f3a28735875a035dbcde7cbf64e8cdf08faa4a9" exitCode=0 Jan 21 15:21:10 crc kubenswrapper[4707]: I0121 15:21:10.736424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" event={"ID":"14c1b37e-4ff9-4d56-97c1-744f743a2f9a","Type":"ContainerDied","Data":"55b0115da79e304b9f7eb8b41f3a28735875a035dbcde7cbf64e8cdf08faa4a9"} Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.127944 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.132409 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.136188 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.147665 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208446 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdgs6\" (UniqueName: \"kubernetes.io/projected/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-kube-api-access-tdgs6\") pod \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-operator-scripts\") pod \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlpqh\" (UniqueName: \"kubernetes.io/projected/5560e962-f3b1-420e-9e8a-f443c99d1e43-kube-api-access-jlpqh\") pod \"5560e962-f3b1-420e-9e8a-f443c99d1e43\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8kld\" (UniqueName: \"kubernetes.io/projected/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-kube-api-access-f8kld\") pod \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\" (UID: \"4dfb5d13-8ce5-49f7-a24d-4b913d85df26\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-operator-scripts\") pod \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5560e962-f3b1-420e-9e8a-f443c99d1e43-operator-scripts\") pod \"5560e962-f3b1-420e-9e8a-f443c99d1e43\" (UID: \"5560e962-f3b1-420e-9e8a-f443c99d1e43\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-operator-scripts\") pod \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\" (UID: \"9a360c79-ca70-4364-87a9-5d0b24cb9b2e\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb97r\" (UniqueName: \"kubernetes.io/projected/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-kube-api-access-lb97r\") pod \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\" (UID: \"5a8c3cab-2625-44be-97b5-1d4d5f210b6e\") " Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.208994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4dfb5d13-8ce5-49f7-a24d-4b913d85df26" (UID: "4dfb5d13-8ce5-49f7-a24d-4b913d85df26"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.209323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5560e962-f3b1-420e-9e8a-f443c99d1e43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5560e962-f3b1-420e-9e8a-f443c99d1e43" (UID: "5560e962-f3b1-420e-9e8a-f443c99d1e43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.209353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a8c3cab-2625-44be-97b5-1d4d5f210b6e" (UID: "5a8c3cab-2625-44be-97b5-1d4d5f210b6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.209422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a360c79-ca70-4364-87a9-5d0b24cb9b2e" (UID: "9a360c79-ca70-4364-87a9-5d0b24cb9b2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.213318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-kube-api-access-f8kld" (OuterVolumeSpecName: "kube-api-access-f8kld") pod "4dfb5d13-8ce5-49f7-a24d-4b913d85df26" (UID: "4dfb5d13-8ce5-49f7-a24d-4b913d85df26"). InnerVolumeSpecName "kube-api-access-f8kld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.213577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5560e962-f3b1-420e-9e8a-f443c99d1e43-kube-api-access-jlpqh" (OuterVolumeSpecName: "kube-api-access-jlpqh") pod "5560e962-f3b1-420e-9e8a-f443c99d1e43" (UID: "5560e962-f3b1-420e-9e8a-f443c99d1e43"). InnerVolumeSpecName "kube-api-access-jlpqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.213778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-kube-api-access-lb97r" (OuterVolumeSpecName: "kube-api-access-lb97r") pod "5a8c3cab-2625-44be-97b5-1d4d5f210b6e" (UID: "5a8c3cab-2625-44be-97b5-1d4d5f210b6e"). InnerVolumeSpecName "kube-api-access-lb97r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.213950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-kube-api-access-tdgs6" (OuterVolumeSpecName: "kube-api-access-tdgs6") pod "9a360c79-ca70-4364-87a9-5d0b24cb9b2e" (UID: "9a360c79-ca70-4364-87a9-5d0b24cb9b2e"). InnerVolumeSpecName "kube-api-access-tdgs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311007 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311310 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5560e962-f3b1-420e-9e8a-f443c99d1e43-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311402 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311488 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb97r\" (UniqueName: \"kubernetes.io/projected/5a8c3cab-2625-44be-97b5-1d4d5f210b6e-kube-api-access-lb97r\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311562 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdgs6\" (UniqueName: \"kubernetes.io/projected/9a360c79-ca70-4364-87a9-5d0b24cb9b2e-kube-api-access-tdgs6\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311619 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311689 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlpqh\" (UniqueName: \"kubernetes.io/projected/5560e962-f3b1-420e-9e8a-f443c99d1e43-kube-api-access-jlpqh\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.311761 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8kld\" (UniqueName: \"kubernetes.io/projected/4dfb5d13-8ce5-49f7-a24d-4b913d85df26-kube-api-access-f8kld\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.742984 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.742986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zjkkw" event={"ID":"9a360c79-ca70-4364-87a9-5d0b24cb9b2e","Type":"ContainerDied","Data":"69f0dd70fa0f3483d33731186d854362df9d7dd01976856699d519fb1fb20249"} Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.743508 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f0dd70fa0f3483d33731186d854362df9d7dd01976856699d519fb1fb20249" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.744520 4707 generic.go:334] "Generic (PLEG): container finished" podID="e0f1cb8a-c64f-4997-a896-0d7127aa309a" containerID="74830281e4e80b570ce296677137e2b13d42928b30d53e08074c5321973fff8d" exitCode=0 Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.744615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-98266" event={"ID":"e0f1cb8a-c64f-4997-a896-0d7127aa309a","Type":"ContainerDied","Data":"74830281e4e80b570ce296677137e2b13d42928b30d53e08074c5321973fff8d"} Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.745948 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.745968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6090-account-create-update-llxnn" event={"ID":"5a8c3cab-2625-44be-97b5-1d4d5f210b6e","Type":"ContainerDied","Data":"2097c4076af9513ca187562b829ed6dba9c31f378557195b8f88c687662b1f39"} Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.746160 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2097c4076af9513ca187562b829ed6dba9c31f378557195b8f88c687662b1f39" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.747232 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.747241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-4zsrk" event={"ID":"4dfb5d13-8ce5-49f7-a24d-4b913d85df26","Type":"ContainerDied","Data":"7a551ce9d02b395aa1997321f26dc8b59cc9f28325669f621103939e52da601e"} Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.747381 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a551ce9d02b395aa1997321f26dc8b59cc9f28325669f621103939e52da601e" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.748632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" event={"ID":"5560e962-f3b1-420e-9e8a-f443c99d1e43","Type":"ContainerDied","Data":"37573d468d2d9d1a7a6770d7ae3b711ca6125baf7c8300e49d1c4b3027db4b8c"} Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.748658 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37573d468d2d9d1a7a6770d7ae3b711ca6125baf7c8300e49d1c4b3027db4b8c" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.748741 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.975196 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:11 crc kubenswrapper[4707]: I0121 15:21:11.996464 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.121498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-operator-scripts\") pod \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.121666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-kube-api-access-lqzfc\") pod \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\" (UID: \"14c1b37e-4ff9-4d56-97c1-744f743a2f9a\") " Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.121703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxpcz\" (UniqueName: \"kubernetes.io/projected/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-kube-api-access-rxpcz\") pod \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.121741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-operator-scripts\") pod \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\" (UID: \"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0\") " Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.122053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14c1b37e-4ff9-4d56-97c1-744f743a2f9a" (UID: "14c1b37e-4ff9-4d56-97c1-744f743a2f9a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.122395 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" (UID: "b82b2b5f-76f3-4da4-b413-fd8fc4d520c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.125546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-kube-api-access-rxpcz" (OuterVolumeSpecName: "kube-api-access-rxpcz") pod "b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" (UID: "b82b2b5f-76f3-4da4-b413-fd8fc4d520c0"). InnerVolumeSpecName "kube-api-access-rxpcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.125652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-kube-api-access-lqzfc" (OuterVolumeSpecName: "kube-api-access-lqzfc") pod "14c1b37e-4ff9-4d56-97c1-744f743a2f9a" (UID: "14c1b37e-4ff9-4d56-97c1-744f743a2f9a"). InnerVolumeSpecName "kube-api-access-lqzfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.223138 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-kube-api-access-lqzfc\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.223168 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxpcz\" (UniqueName: \"kubernetes.io/projected/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-kube-api-access-rxpcz\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.223178 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.223185 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c1b37e-4ff9-4d56-97c1-744f743a2f9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.756478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-ws576" event={"ID":"b82b2b5f-76f3-4da4-b413-fd8fc4d520c0","Type":"ContainerDied","Data":"48e0c9e97d0648f761edba7fbc873351708302da7df45338cefcf4ad254d3f52"} Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.756673 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48e0c9e97d0648f761edba7fbc873351708302da7df45338cefcf4ad254d3f52" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.756498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-ws576" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.757862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" event={"ID":"14c1b37e-4ff9-4d56-97c1-744f743a2f9a","Type":"ContainerDied","Data":"4b8b8c77955e11fe55648638aedf051e380ccd8db49cd207eebaa0b3cc1cc2fd"} Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.757879 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.757890 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b8b8c77955e11fe55648638aedf051e380ccd8db49cd207eebaa0b3cc1cc2fd" Jan 21 15:21:12 crc kubenswrapper[4707]: I0121 15:21:12.982675 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.134950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5wxp\" (UniqueName: \"kubernetes.io/projected/e0f1cb8a-c64f-4997-a896-0d7127aa309a-kube-api-access-m5wxp\") pod \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.135125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-config-data\") pod \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.135154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-combined-ca-bundle\") pod \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\" (UID: \"e0f1cb8a-c64f-4997-a896-0d7127aa309a\") " Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.138959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f1cb8a-c64f-4997-a896-0d7127aa309a-kube-api-access-m5wxp" (OuterVolumeSpecName: "kube-api-access-m5wxp") pod "e0f1cb8a-c64f-4997-a896-0d7127aa309a" (UID: "e0f1cb8a-c64f-4997-a896-0d7127aa309a"). InnerVolumeSpecName "kube-api-access-m5wxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.152279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0f1cb8a-c64f-4997-a896-0d7127aa309a" (UID: "e0f1cb8a-c64f-4997-a896-0d7127aa309a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.164760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-config-data" (OuterVolumeSpecName: "config-data") pod "e0f1cb8a-c64f-4997-a896-0d7127aa309a" (UID: "e0f1cb8a-c64f-4997-a896-0d7127aa309a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.236330 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.236521 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1cb8a-c64f-4997-a896-0d7127aa309a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.236532 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5wxp\" (UniqueName: \"kubernetes.io/projected/e0f1cb8a-c64f-4997-a896-0d7127aa309a-kube-api-access-m5wxp\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.765415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-98266" event={"ID":"e0f1cb8a-c64f-4997-a896-0d7127aa309a","Type":"ContainerDied","Data":"75dd82d0aa18bcc54c340258bdc468373a2e835f13cfa48cea7c4af642233655"} Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.765449 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75dd82d0aa18bcc54c340258bdc468373a2e835f13cfa48cea7c4af642233655" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.765457 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-98266" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.847982 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6sdhj"] Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848263 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfb5d13-8ce5-49f7-a24d-4b913d85df26" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848288 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfb5d13-8ce5-49f7-a24d-4b913d85df26" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848301 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f1cb8a-c64f-4997-a896-0d7127aa309a" containerName="keystone-db-sync" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f1cb8a-c64f-4997-a896-0d7127aa309a" containerName="keystone-db-sync" Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848314 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8c3cab-2625-44be-97b5-1d4d5f210b6e" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848320 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8c3cab-2625-44be-97b5-1d4d5f210b6e" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848327 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a360c79-ca70-4364-87a9-5d0b24cb9b2e" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848332 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a360c79-ca70-4364-87a9-5d0b24cb9b2e" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="14c1b37e-4ff9-4d56-97c1-744f743a2f9a" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848345 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c1b37e-4ff9-4d56-97c1-744f743a2f9a" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848363 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5560e962-f3b1-420e-9e8a-f443c99d1e43" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848368 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5560e962-f3b1-420e-9e8a-f443c99d1e43" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: E0121 15:21:13.848374 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848379 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a360c79-ca70-4364-87a9-5d0b24cb9b2e" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f1cb8a-c64f-4997-a896-0d7127aa309a" containerName="keystone-db-sync" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848519 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c1b37e-4ff9-4d56-97c1-744f743a2f9a" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848529 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfb5d13-8ce5-49f7-a24d-4b913d85df26" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848535 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5560e962-f3b1-420e-9e8a-f443c99d1e43" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848542 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8c3cab-2625-44be-97b5-1d4d5f210b6e" containerName="mariadb-account-create-update" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848551 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" containerName="mariadb-database-create" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.848984 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.850415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.850635 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4bkq4" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.850969 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.851014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.851117 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.857774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6sdhj"] Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.898608 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.899701 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.901309 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.901432 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.901831 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-kn6vz" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.904211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.914996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.948193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-config-data\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.948243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-combined-ca-bundle\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.948267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-scripts\") pod \"keystone-bootstrap-6sdhj\" (UID: 
\"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.948365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjw49\" (UniqueName: \"kubernetes.io/projected/a0850def-7e03-4120-8306-b2938afba2d6-kube-api-access-tjw49\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.948422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-credential-keys\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.948590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-fernet-keys\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.979427 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.980606 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.982722 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.985393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.989028 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.990671 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.992393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:21:13 crc kubenswrapper[4707]: I0121 15:21:13.992493 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.003115 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.016352 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.049905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.049946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.049967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-config-data\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-scripts\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n56m9\" (UniqueName: \"kubernetes.io/projected/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-kube-api-access-n56m9\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-combined-ca-bundle\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-scripts\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjw49\" (UniqueName: \"kubernetes.io/projected/a0850def-7e03-4120-8306-b2938afba2d6-kube-api-access-tjw49\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-scripts\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050363 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-credential-keys\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblb6\" (UniqueName: \"kubernetes.io/projected/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-kube-api-access-lblb6\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4twm4\" (UniqueName: \"kubernetes.io/projected/1f414bee-4347-4f3f-bada-d7a2ee12f495-kube-api-access-4twm4\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-logs\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 
15:21:14.050935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-fernet-keys\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-log-httpd\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.050988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-config-data\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.051010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.051024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-run-httpd\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.055555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-combined-ca-bundle\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.060570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-fernet-keys\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.060709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-config-data\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.065930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-credential-keys\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.066247 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/cinder-db-sync-7rjvw"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.067161 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.069522 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.069688 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rcmqr" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.069773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.073375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjw49\" (UniqueName: \"kubernetes.io/projected/a0850def-7e03-4120-8306-b2938afba2d6-kube-api-access-tjw49\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.074202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-scripts\") pod \"keystone-bootstrap-6sdhj\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.085627 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7rjvw"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.092624 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-v66nf"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.120635 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.130449 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-28qg4" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.131169 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.148667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-v66nf"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.152945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-log-httpd\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.152980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-config-data\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-run-httpd\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n56m9\" (UniqueName: \"kubernetes.io/projected/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-kube-api-access-n56m9\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-config-data\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-combined-ca-bundle\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p279c\" (UniqueName: \"kubernetes.io/projected/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-kube-api-access-p279c\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-scripts\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-logs\") pod 
\"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-etc-machine-id\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-config-data\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv8f6\" (UniqueName: \"kubernetes.io/projected/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-kube-api-access-fv8f6\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-scripts\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lblb6\" (UniqueName: \"kubernetes.io/projected/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-kube-api-access-lblb6\") 
pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-db-sync-config-data\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4twm4\" (UniqueName: \"kubernetes.io/projected/1f414bee-4347-4f3f-bada-d7a2ee12f495-kube-api-access-4twm4\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-scripts\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-combined-ca-bundle\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.153668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-logs\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.154110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-logs\") pod 
\"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.154140 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-log-httpd\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.172371 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.198600 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.203588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.209945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.221797 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.232577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-config-data\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.232887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.233347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.251392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.252210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.252404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.252614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-run-httpd\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-config-data\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-combined-ca-bundle\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p279c\" (UniqueName: \"kubernetes.io/projected/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-kube-api-access-p279c\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-logs\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-etc-machine-id\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-config-data\") pod \"placement-db-sync-v66nf\" 
(UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv8f6\" (UniqueName: \"kubernetes.io/projected/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-kube-api-access-fv8f6\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-scripts\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-db-sync-config-data\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-scripts\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.256295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-combined-ca-bundle\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.262936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-etc-machine-id\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.265400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.265591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-logs\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.267217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.269114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.270427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.270947 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.276023 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-xczbb"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.276993 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.287417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n56m9\" (UniqueName: \"kubernetes.io/projected/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-kube-api-access-n56m9\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.291401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4twm4\" (UniqueName: \"kubernetes.io/projected/1f414bee-4347-4f3f-bada-d7a2ee12f495-kube-api-access-4twm4\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.291927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-scripts\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.292730 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.292909 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.293115 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-74qf4" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.297506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-combined-ca-bundle\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.297872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-config-data\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.303503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.303899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-combined-ca-bundle\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.305194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-db-sync-config-data\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.306116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-config-data\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.306204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-scripts\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.306514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv8f6\" (UniqueName: \"kubernetes.io/projected/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-kube-api-access-fv8f6\") pod \"cinder-db-sync-7rjvw\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.307437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-scripts\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.309833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-kdl56"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.310720 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.314491 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-wqj57" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.323113 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.323371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblb6\" (UniqueName: \"kubernetes.io/projected/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-kube-api-access-lblb6\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.329885 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-xczbb"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.332324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p279c\" (UniqueName: \"kubernetes.io/projected/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-kube-api-access-p279c\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.332578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-scripts\") pod \"placement-db-sync-v66nf\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.361843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.367138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.367352 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-kdl56"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.431627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.431797 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.464657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-config\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.464853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-combined-ca-bundle\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.464902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-combined-ca-bundle\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.465109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbs7g\" (UniqueName: \"kubernetes.io/projected/1a3a71af-3540-4d7c-9b47-04bf4098aa40-kube-api-access-bbs7g\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.465216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdw5t\" (UniqueName: \"kubernetes.io/projected/b4de4059-fde2-431a-b7e8-7f2bc23218dd-kube-api-access-mdw5t\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.465269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-db-sync-config-data\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.511296 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.566552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbs7g\" (UniqueName: \"kubernetes.io/projected/1a3a71af-3540-4d7c-9b47-04bf4098aa40-kube-api-access-bbs7g\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.566614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdw5t\" (UniqueName: \"kubernetes.io/projected/b4de4059-fde2-431a-b7e8-7f2bc23218dd-kube-api-access-mdw5t\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.566635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-db-sync-config-data\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.566664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-config\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.566706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-combined-ca-bundle\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.566728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-combined-ca-bundle\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.571317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-combined-ca-bundle\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.572198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-db-sync-config-data\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.573025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-config\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " 
pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.574319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-combined-ca-bundle\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.585591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbs7g\" (UniqueName: \"kubernetes.io/projected/1a3a71af-3540-4d7c-9b47-04bf4098aa40-kube-api-access-bbs7g\") pod \"neutron-db-sync-xczbb\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.589143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdw5t\" (UniqueName: \"kubernetes.io/projected/b4de4059-fde2-431a-b7e8-7f2bc23218dd-kube-api-access-mdw5t\") pod \"barbican-db-sync-kdl56\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.592691 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.602474 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.730337 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6sdhj"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.737976 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.747591 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.773384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" event={"ID":"a0850def-7e03-4120-8306-b2938afba2d6","Type":"ContainerStarted","Data":"18518663776b598afad4ced2106d86881f07df9604e1edd2fcfd4c7783ecd23f"} Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.887707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-v66nf"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.943648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7rjvw"] Jan 21 15:21:14 crc kubenswrapper[4707]: I0121 15:21:14.952057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.130595 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.143689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.158179 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.254087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-xczbb"] Jan 21 15:21:15 crc kubenswrapper[4707]: W0121 15:21:15.260011 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4de4059_fde2_431a_b7e8_7f2bc23218dd.slice/crio-a21f191a05916fef7437e83f4cea64857cd80ca19cc8df83ecd3ee3aa858dcd4 WatchSource:0}: Error finding container a21f191a05916fef7437e83f4cea64857cd80ca19cc8df83ecd3ee3aa858dcd4: Status 404 returned error can't find the container with id a21f191a05916fef7437e83f4cea64857cd80ca19cc8df83ecd3ee3aa858dcd4 Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.269780 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-kdl56"] Jan 21 15:21:15 crc kubenswrapper[4707]: W0121 15:21:15.311315 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3a71af_3540_4d7c_9b47_04bf4098aa40.slice/crio-d5717f0a92ce31c7acfc9c0b2a36039fad51f3037c9abe80d38ac9605c405e72 WatchSource:0}: Error finding container d5717f0a92ce31c7acfc9c0b2a36039fad51f3037c9abe80d38ac9605c405e72: Status 404 returned error can't find the container with id d5717f0a92ce31c7acfc9c0b2a36039fad51f3037c9abe80d38ac9605c405e72 Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.792625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" event={"ID":"685bf29d-d0cc-43bc-9d01-6585cc8b0f18","Type":"ContainerStarted","Data":"6974d5bf2b2381c2c14cf2c91fc90afcde48c9637fed19ce45073e9c9e80bec1"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.792859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" event={"ID":"685bf29d-d0cc-43bc-9d01-6585cc8b0f18","Type":"ContainerStarted","Data":"75b67b4f4a2055e9798f49ba00203ddca9e418046f8ac75c6d81a9890274846a"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.801198 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" event={"ID":"a0850def-7e03-4120-8306-b2938afba2d6","Type":"ContainerStarted","Data":"7cbbc5f91d32fbfe39900eb972fe527aa33aacdf4a104c1e7339806c89526fae"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.804356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" event={"ID":"b4de4059-fde2-431a-b7e8-7f2bc23218dd","Type":"ContainerStarted","Data":"1e5286050c55001bc380d39d61363c908de1f2d1b096180f6b900cc316a11aad"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.804385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" event={"ID":"b4de4059-fde2-431a-b7e8-7f2bc23218dd","Type":"ContainerStarted","Data":"a21f191a05916fef7437e83f4cea64857cd80ca19cc8df83ecd3ee3aa858dcd4"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.806491 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" podStartSLOduration=1.806477937 podStartE2EDuration="1.806477937s" podCreationTimestamp="2026-01-21 15:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:15.805796215 +0000 UTC m=+1172.987312437" watchObservedRunningTime="2026-01-21 15:21:15.806477937 +0000 UTC m=+1172.987994159" Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.827203 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" podStartSLOduration=2.8271876540000003 podStartE2EDuration="2.827187654s" podCreationTimestamp="2026-01-21 15:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:15.819115941 +0000 UTC m=+1173.000632163" watchObservedRunningTime="2026-01-21 15:21:15.827187654 +0000 UTC m=+1173.008703876" Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.831030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" event={"ID":"1a3a71af-3540-4d7c-9b47-04bf4098aa40","Type":"ContainerStarted","Data":"d8ca5318999b7bf8d6303c3e592edd3d377c05a2dce6855ef7cb9b43aa92a648"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.831075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" event={"ID":"1a3a71af-3540-4d7c-9b47-04bf4098aa40","Type":"ContainerStarted","Data":"d5717f0a92ce31c7acfc9c0b2a36039fad51f3037c9abe80d38ac9605c405e72"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.833890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-v66nf" event={"ID":"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980","Type":"ContainerStarted","Data":"eb34cbdfd71f1c9dd2163feb2da32572dcbf417daea1ca6cfab5189985d32ea4"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.833920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-v66nf" event={"ID":"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980","Type":"ContainerStarted","Data":"d0da7d987ecaa42a9208a753c1d43ca05e0ff9ce88c84f5ffc5a46c2b7fa536e"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.839333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerStarted","Data":"272c231a4704f9f1eca4313adbfe0a7c4777a679b89314cc3e1e4804deee90c9"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.851405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1","Type":"ContainerStarted","Data":"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.851449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1","Type":"ContainerStarted","Data":"ab27e31f4778d67964313190a5dadc4230159af8a95688f73abc230c454a3240"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.853726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7a1f1ed8-847e-4c27-bf41-2275c8ae1551","Type":"ContainerStarted","Data":"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.853765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7a1f1ed8-847e-4c27-bf41-2275c8ae1551","Type":"ContainerStarted","Data":"8205523efb04ca8ce5c404bf9d1f5734da8efcaf4ddb6913991db99cbe11422f"} Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.855789 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" podStartSLOduration=1.855771964 podStartE2EDuration="1.855771964s" podCreationTimestamp="2026-01-21 15:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:15.844779296 +0000 UTC m=+1173.026295518" watchObservedRunningTime="2026-01-21 15:21:15.855771964 +0000 UTC m=+1173.037288186" Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.877484 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-v66nf" podStartSLOduration=1.877468207 podStartE2EDuration="1.877468207s" podCreationTimestamp="2026-01-21 15:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:15.864034165 +0000 UTC m=+1173.045550387" watchObservedRunningTime="2026-01-21 15:21:15.877468207 +0000 UTC m=+1173.058984428" Jan 21 15:21:15 crc kubenswrapper[4707]: I0121 15:21:15.899674 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" podStartSLOduration=1.899657647 podStartE2EDuration="1.899657647s" podCreationTimestamp="2026-01-21 15:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:15.875830435 +0000 UTC m=+1173.057346667" watchObservedRunningTime="2026-01-21 15:21:15.899657647 +0000 UTC m=+1173.081173869" Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.552025 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.578621 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:16 crc 
kubenswrapper[4707]: I0121 15:21:16.614783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.862559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1","Type":"ContainerStarted","Data":"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97"} Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.865192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7a1f1ed8-847e-4c27-bf41-2275c8ae1551","Type":"ContainerStarted","Data":"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08"} Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.866495 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4de4059-fde2-431a-b7e8-7f2bc23218dd" containerID="1e5286050c55001bc380d39d61363c908de1f2d1b096180f6b900cc316a11aad" exitCode=0 Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.866544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" event={"ID":"b4de4059-fde2-431a-b7e8-7f2bc23218dd","Type":"ContainerDied","Data":"1e5286050c55001bc380d39d61363c908de1f2d1b096180f6b900cc316a11aad"} Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.869001 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" containerID="eb34cbdfd71f1c9dd2163feb2da32572dcbf417daea1ca6cfab5189985d32ea4" exitCode=0 Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.869069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-v66nf" event={"ID":"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980","Type":"ContainerDied","Data":"eb34cbdfd71f1c9dd2163feb2da32572dcbf417daea1ca6cfab5189985d32ea4"} Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.871094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerStarted","Data":"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007"} Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.871120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerStarted","Data":"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9"} Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.883608 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.8835932079999997 podStartE2EDuration="3.883593208s" podCreationTimestamp="2026-01-21 15:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:16.876945503 +0000 UTC m=+1174.058461725" watchObservedRunningTime="2026-01-21 15:21:16.883593208 +0000 UTC m=+1174.065109430" Jan 21 15:21:16 crc kubenswrapper[4707]: I0121 15:21:16.899632 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.89961723 podStartE2EDuration="3.89961723s" podCreationTimestamp="2026-01-21 15:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:16.8989099 +0000 UTC m=+1174.080426122" watchObservedRunningTime="2026-01-21 15:21:16.89961723 +0000 UTC m=+1174.081133452" Jan 21 15:21:17 crc kubenswrapper[4707]: I0121 15:21:17.879912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerStarted","Data":"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e"} Jan 21 15:21:17 crc kubenswrapper[4707]: I0121 15:21:17.880167 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-log" containerID="cri-o://eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b" gracePeriod=30 Jan 21 15:21:17 crc kubenswrapper[4707]: I0121 15:21:17.880326 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-httpd" containerID="cri-o://d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97" gracePeriod=30 Jan 21 15:21:17 crc kubenswrapper[4707]: I0121 15:21:17.880711 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-httpd" containerID="cri-o://1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08" gracePeriod=30 Jan 21 15:21:17 crc kubenswrapper[4707]: I0121 15:21:17.880699 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-log" containerID="cri-o://34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7" gracePeriod=30 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.287577 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.302723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.376177 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.434697 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.453844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-scripts\") pod \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.453999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdw5t\" (UniqueName: \"kubernetes.io/projected/b4de4059-fde2-431a-b7e8-7f2bc23218dd-kube-api-access-mdw5t\") pod \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-combined-ca-bundle\") pod \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-combined-ca-bundle\") pod \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-config-data\") pod \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-logs\") pod \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p279c\" (UniqueName: \"kubernetes.io/projected/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-kube-api-access-p279c\") pod \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\" (UID: \"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-db-sync-config-data\") pod \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\" (UID: \"b4de4059-fde2-431a-b7e8-7f2bc23218dd\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.454677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-logs" (OuterVolumeSpecName: "logs") pod "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" (UID: "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.455088 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.461960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b4de4059-fde2-431a-b7e8-7f2bc23218dd" (UID: "b4de4059-fde2-431a-b7e8-7f2bc23218dd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.475641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4de4059-fde2-431a-b7e8-7f2bc23218dd-kube-api-access-mdw5t" (OuterVolumeSpecName: "kube-api-access-mdw5t") pod "b4de4059-fde2-431a-b7e8-7f2bc23218dd" (UID: "b4de4059-fde2-431a-b7e8-7f2bc23218dd"). InnerVolumeSpecName "kube-api-access-mdw5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.476385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-kube-api-access-p279c" (OuterVolumeSpecName: "kube-api-access-p279c") pod "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" (UID: "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980"). InnerVolumeSpecName "kube-api-access-p279c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.481068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-scripts" (OuterVolumeSpecName: "scripts") pod "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" (UID: "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.484517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" (UID: "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.488621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4de4059-fde2-431a-b7e8-7f2bc23218dd" (UID: "b4de4059-fde2-431a-b7e8-7f2bc23218dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.490102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-config-data" (OuterVolumeSpecName: "config-data") pod "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" (UID: "c3dcb05d-aaf0-4646-a1dd-b52ada0e0980"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.555784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-scripts\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.555920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-logs\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.555947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-combined-ca-bundle\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.555969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.555983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-internal-tls-certs\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-config-data\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-logs\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-httpd-run\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-public-tls-certs\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556156 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-scripts\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lblb6\" (UniqueName: \"kubernetes.io/projected/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-kube-api-access-lblb6\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-httpd-run\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-combined-ca-bundle\") pod \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\" (UID: \"7a1f1ed8-847e-4c27-bf41-2275c8ae1551\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-config-data\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n56m9\" (UniqueName: \"kubernetes.io/projected/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-kube-api-access-n56m9\") pod \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\" (UID: \"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1\") " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-logs" (OuterVolumeSpecName: "logs") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556870 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556891 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556901 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p279c\" (UniqueName: \"kubernetes.io/projected/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-kube-api-access-p279c\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556921 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4de4059-fde2-431a-b7e8-7f2bc23218dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556928 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556937 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.556946 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdw5t\" (UniqueName: \"kubernetes.io/projected/b4de4059-fde2-431a-b7e8-7f2bc23218dd-kube-api-access-mdw5t\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.557406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-logs" (OuterVolumeSpecName: "logs") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.557505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.557697 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.558726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-scripts" (OuterVolumeSpecName: "scripts") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.560718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-kube-api-access-lblb6" (OuterVolumeSpecName: "kube-api-access-lblb6") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "kube-api-access-lblb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.560732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-scripts" (OuterVolumeSpecName: "scripts") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.560900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-kube-api-access-n56m9" (OuterVolumeSpecName: "kube-api-access-n56m9") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "kube-api-access-n56m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.560906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.560903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.575295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.575690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.597360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-config-data" (OuterVolumeSpecName: "config-data") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.599692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.599830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" (UID: "87627e0c-8283-45f8-a2bf-b2abd8dbfdb1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.601285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-config-data" (OuterVolumeSpecName: "config-data") pod "7a1f1ed8-847e-4c27-bf41-2275c8ae1551" (UID: "7a1f1ed8-847e-4c27-bf41-2275c8ae1551"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659011 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659040 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659051 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lblb6\" (UniqueName: \"kubernetes.io/projected/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-kube-api-access-lblb6\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659063 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659072 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659080 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659088 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n56m9\" (UniqueName: 
\"kubernetes.io/projected/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-kube-api-access-n56m9\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659096 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659103 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659111 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659150 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659159 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659172 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659180 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f1ed8-847e-4c27-bf41-2275c8ae1551-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.659187 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.671688 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.672550 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.770263 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.770473 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888250 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerID="1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08" exitCode=0 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888284 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerID="34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7" exitCode=143 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888302 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7a1f1ed8-847e-4c27-bf41-2275c8ae1551","Type":"ContainerDied","Data":"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7a1f1ed8-847e-4c27-bf41-2275c8ae1551","Type":"ContainerDied","Data":"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7a1f1ed8-847e-4c27-bf41-2275c8ae1551","Type":"ContainerDied","Data":"8205523efb04ca8ce5c404bf9d1f5734da8efcaf4ddb6913991db99cbe11422f"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.888381 4707 scope.go:117] "RemoveContainer" containerID="1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.889872 4707 generic.go:334] "Generic (PLEG): container finished" podID="685bf29d-d0cc-43bc-9d01-6585cc8b0f18" containerID="6974d5bf2b2381c2c14cf2c91fc90afcde48c9637fed19ce45073e9c9e80bec1" exitCode=0 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.889916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" event={"ID":"685bf29d-d0cc-43bc-9d01-6585cc8b0f18","Type":"ContainerDied","Data":"6974d5bf2b2381c2c14cf2c91fc90afcde48c9637fed19ce45073e9c9e80bec1"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.891341 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0850def-7e03-4120-8306-b2938afba2d6" containerID="7cbbc5f91d32fbfe39900eb972fe527aa33aacdf4a104c1e7339806c89526fae" exitCode=0 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.891366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" event={"ID":"a0850def-7e03-4120-8306-b2938afba2d6","Type":"ContainerDied","Data":"7cbbc5f91d32fbfe39900eb972fe527aa33aacdf4a104c1e7339806c89526fae"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.892592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" event={"ID":"b4de4059-fde2-431a-b7e8-7f2bc23218dd","Type":"ContainerDied","Data":"a21f191a05916fef7437e83f4cea64857cd80ca19cc8df83ecd3ee3aa858dcd4"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.892634 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21f191a05916fef7437e83f4cea64857cd80ca19cc8df83ecd3ee3aa858dcd4" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.892690 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-kdl56" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.901658 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a3a71af-3540-4d7c-9b47-04bf4098aa40" containerID="d8ca5318999b7bf8d6303c3e592edd3d377c05a2dce6855ef7cb9b43aa92a648" exitCode=0 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.901745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" event={"ID":"1a3a71af-3540-4d7c-9b47-04bf4098aa40","Type":"ContainerDied","Data":"d8ca5318999b7bf8d6303c3e592edd3d377c05a2dce6855ef7cb9b43aa92a648"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.905432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-v66nf" event={"ID":"c3dcb05d-aaf0-4646-a1dd-b52ada0e0980","Type":"ContainerDied","Data":"d0da7d987ecaa42a9208a753c1d43ca05e0ff9ce88c84f5ffc5a46c2b7fa536e"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.905465 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0da7d987ecaa42a9208a753c1d43ca05e0ff9ce88c84f5ffc5a46c2b7fa536e" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.905510 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-v66nf" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.906885 4707 generic.go:334] "Generic (PLEG): container finished" podID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerID="d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97" exitCode=0 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.906906 4707 generic.go:334] "Generic (PLEG): container finished" podID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerID="eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b" exitCode=143 Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.906922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1","Type":"ContainerDied","Data":"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.906939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1","Type":"ContainerDied","Data":"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.906947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"87627e0c-8283-45f8-a2bf-b2abd8dbfdb1","Type":"ContainerDied","Data":"ab27e31f4778d67964313190a5dadc4230159af8a95688f73abc230c454a3240"} Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.906984 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.911086 4707 scope.go:117] "RemoveContainer" containerID="34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.959940 4707 scope.go:117] "RemoveContainer" containerID="1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.965044 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08\": container with ID starting with 1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08 not found: ID does not exist" containerID="1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.965077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08"} err="failed to get container status \"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08\": rpc error: code = NotFound desc = could not find container \"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08\": container with ID starting with 1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08 not found: ID does not exist" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.965098 4707 scope.go:117] "RemoveContainer" containerID="34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.971047 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7\": container with ID starting with 34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7 not found: ID does not exist" containerID="34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.971080 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7"} err="failed to get container status \"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7\": rpc error: code = NotFound desc = could not find container \"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7\": container with ID starting with 34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7 not found: ID does not exist" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.971102 4707 scope.go:117] "RemoveContainer" containerID="1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.975091 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.975334 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08"} err="failed to get container status \"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08\": rpc error: code = NotFound desc = could not find container 
\"1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08\": container with ID starting with 1fdcbdb9cadd0dba93d4ecd2a025b4fc461fe26a5cee8999574028ff8b3f5e08 not found: ID does not exist" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.975370 4707 scope.go:117] "RemoveContainer" containerID="34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.982746 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7"} err="failed to get container status \"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7\": rpc error: code = NotFound desc = could not find container \"34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7\": container with ID starting with 34c172f8a65233d26d78a9b61593267050ccf0db417b995b0bbeaf0dd31817d7 not found: ID does not exist" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.982789 4707 scope.go:117] "RemoveContainer" containerID="d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.987798 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996489 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.996822 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-httpd" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996841 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-httpd" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.996852 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" containerName="placement-db-sync" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996858 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" containerName="placement-db-sync" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.996868 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-log" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996874 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-log" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.996892 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-httpd" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996897 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-httpd" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.996916 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-log" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996922 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-log" Jan 21 15:21:18 crc kubenswrapper[4707]: E0121 15:21:18.996932 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b4de4059-fde2-431a-b7e8-7f2bc23218dd" containerName="barbican-db-sync" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.996937 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4de4059-fde2-431a-b7e8-7f2bc23218dd" containerName="barbican-db-sync" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997075 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-log" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997086 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" containerName="glance-httpd" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997097 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-httpd" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997104 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4de4059-fde2-431a-b7e8-7f2bc23218dd" containerName="barbican-db-sync" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" containerName="glance-log" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997130 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" containerName="placement-db-sync" Jan 21 15:21:18 crc kubenswrapper[4707]: I0121 15:21:18.997857 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.003654 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.003788 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.004570 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-kn6vz" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.009764 4707 scope.go:117] "RemoveContainer" containerID="eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.014046 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.019006 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.040145 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.051476 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.059643 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.060831 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.064914 4707 scope.go:117] "RemoveContainer" containerID="d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.065190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.065321 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:21:19 crc kubenswrapper[4707]: E0121 15:21:19.066110 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97\": container with ID starting with d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97 not found: ID does not exist" containerID="d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.066133 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97"} err="failed to get container status \"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97\": rpc error: code = NotFound desc = could not find container \"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97\": container with ID starting with d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97 not found: ID does not exist" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.066154 4707 scope.go:117] "RemoveContainer" containerID="eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b" Jan 21 15:21:19 crc kubenswrapper[4707]: E0121 15:21:19.067786 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b\": container with ID starting with eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b not found: ID does not exist" containerID="eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.067828 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b"} err="failed to get container status \"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b\": rpc error: code = NotFound desc = could not find container \"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b\": container with ID starting with eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b not found: ID does not exist" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.067849 4707 scope.go:117] "RemoveContainer" containerID="d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.068468 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97"} err="failed to get container status \"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97\": rpc error: code = NotFound desc = could not find container 
\"d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97\": container with ID starting with d77684e272d3067cee123f9557a6fd98594032dad929188f7fdb262dd252ee97 not found: ID does not exist" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.068491 4707 scope.go:117] "RemoveContainer" containerID="eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.068723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b"} err="failed to get container status \"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b\": rpc error: code = NotFound desc = could not find container \"eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b\": container with ID starting with eef89dc1f4f3d75c8e652585bbf61030b2ee7a5ecfcaf4641016cfc8bec3b56b not found: ID does not exist" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.071062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.074067 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-79b5787b96-vlq6n"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.075316 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.076647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-28qg4" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.077200 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.077370 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.078204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-79b5787b96-vlq6n"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-logs\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175231 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-scripts\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-config-data\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9dhq\" (UniqueName: \"kubernetes.io/projected/47279210-6924-452f-abb7-8d5d7ff90ec9-kube-api-access-z9dhq\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d380df51-2feb-4db3-ac56-f2dc791690c9-logs\") pod \"placement-79b5787b96-vlq6n\" 
(UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-combined-ca-bundle\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfzz\" (UniqueName: \"kubernetes.io/projected/d380df51-2feb-4db3-ac56-f2dc791690c9-kube-api-access-6kfzz\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-config-data\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxd7\" (UniqueName: \"kubernetes.io/projected/b6243463-c1a1-42a5-8697-11a221d45be7-kube-api-access-dlxd7\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-scripts\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.175652 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.188749 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1f1ed8-847e-4c27-bf41-2275c8ae1551" path="/var/lib/kubelet/pods/7a1f1ed8-847e-4c27-bf41-2275c8ae1551/volumes" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.189453 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87627e0c-8283-45f8-a2bf-b2abd8dbfdb1" path="/var/lib/kubelet/pods/87627e0c-8283-45f8-a2bf-b2abd8dbfdb1/volumes" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-logs\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-scripts\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276612 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-config-data\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9dhq\" (UniqueName: \"kubernetes.io/projected/47279210-6924-452f-abb7-8d5d7ff90ec9-kube-api-access-z9dhq\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d380df51-2feb-4db3-ac56-f2dc791690c9-logs\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-combined-ca-bundle\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfzz\" (UniqueName: \"kubernetes.io/projected/d380df51-2feb-4db3-ac56-f2dc791690c9-kube-api-access-6kfzz\") pod \"placement-79b5787b96-vlq6n\" (UID: 
\"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-config-data\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-scripts\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxd7\" (UniqueName: \"kubernetes.io/projected/b6243463-c1a1-42a5-8697-11a221d45be7-kube-api-access-dlxd7\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.276934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.277640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.278263 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.278871 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.279043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.279137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-logs\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.279652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d380df51-2feb-4db3-ac56-f2dc791690c9-logs\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.280433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.283765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-scripts\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.285429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-config-data\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.286856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-combined-ca-bundle\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.286998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-config-data\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.288586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.288770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" 
Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.288855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.289856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.290370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.290457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.290577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-scripts\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.292567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9dhq\" (UniqueName: \"kubernetes.io/projected/47279210-6924-452f-abb7-8d5d7ff90ec9-kube-api-access-z9dhq\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.292848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfzz\" (UniqueName: \"kubernetes.io/projected/d380df51-2feb-4db3-ac56-f2dc791690c9-kube-api-access-6kfzz\") pod \"placement-79b5787b96-vlq6n\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.294167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxd7\" (UniqueName: \"kubernetes.io/projected/b6243463-c1a1-42a5-8697-11a221d45be7-kube-api-access-dlxd7\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.316992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 
crc kubenswrapper[4707]: I0121 15:21:19.322209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.322389 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.380779 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.413636 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.529032 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.530461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.534170 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.534317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-wqj57" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.534330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.538031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.553173 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.554978 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.557477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.559203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.600671 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-fb4888b88-mz44p"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.601922 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.609295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.625821 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb4888b88-mz44p"] Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.643740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:21:19 crc kubenswrapper[4707]: W0121 15:21:19.656763 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47279210_6924_452f_abb7_8d5d7ff90ec9.slice/crio-12ab4c266a95bb5fa67439fef7acd5ab5e9df1b9be55f315b9df742eb6bb6f66 WatchSource:0}: Error finding container 12ab4c266a95bb5fa67439fef7acd5ab5e9df1b9be55f315b9df742eb6bb6f66: Status 404 returned error can't find the container with id 12ab4c266a95bb5fa67439fef7acd5ab5e9df1b9be55f315b9df742eb6bb6f66 Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.683939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data-custom\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.683976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/cf2ca736-6fb3-48fd-b827-17934d478419-kube-api-access-6pqqt\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.683996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-logs\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-combined-ca-bundle\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-logs\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data\") 
pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data-custom\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-combined-ca-bundle\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ca736-6fb3-48fd-b827-17934d478419-logs\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvj6\" (UniqueName: \"kubernetes.io/projected/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-kube-api-access-rgvj6\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data-custom\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-combined-ca-bundle\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: 
\"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.684361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqzp\" (UniqueName: \"kubernetes.io/projected/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-kube-api-access-nnqzp\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data-custom\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-combined-ca-bundle\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ca736-6fb3-48fd-b827-17934d478419-logs\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvj6\" (UniqueName: \"kubernetes.io/projected/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-kube-api-access-rgvj6\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data-custom\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791370 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-combined-ca-bundle\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnqzp\" (UniqueName: \"kubernetes.io/projected/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-kube-api-access-nnqzp\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data-custom\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/cf2ca736-6fb3-48fd-b827-17934d478419-kube-api-access-6pqqt\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-logs\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-combined-ca-bundle\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-logs\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.791520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.797201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ca736-6fb3-48fd-b827-17934d478419-logs\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " 
pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.797831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-logs\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.798145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-logs\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.802690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-combined-ca-bundle\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.804032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.809759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqzp\" (UniqueName: \"kubernetes.io/projected/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-kube-api-access-nnqzp\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.810618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-combined-ca-bundle\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.810636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data-custom\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.810684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data-custom\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.810697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data-custom\") pod 
\"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.810978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/cf2ca736-6fb3-48fd-b827-17934d478419-kube-api-access-6pqqt\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.811208 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-combined-ca-bundle\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.811578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data\") pod \"barbican-api-fb4888b88-mz44p\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.812420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvj6\" (UniqueName: \"kubernetes.io/projected/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-kube-api-access-rgvj6\") pod \"barbican-keystone-listener-7fc4c9546b-n8wbl\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.823341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data\") pod \"barbican-worker-7b9b58ff55-8q9qd\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.917301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"47279210-6924-452f-abb7-8d5d7ff90ec9","Type":"ContainerStarted","Data":"12ab4c266a95bb5fa67439fef7acd5ab5e9df1b9be55f315b9df742eb6bb6f66"} Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.924614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerStarted","Data":"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151"} Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.924766 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-central-agent" containerID="cri-o://f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" gracePeriod=30 Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.925916 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.925916 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="sg-core" containerID="cri-o://5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" gracePeriod=30 Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.925919 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="proxy-httpd" containerID="cri-o://5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" gracePeriod=30 Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.925957 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-notification-agent" containerID="cri-o://4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" gracePeriod=30 Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.936355 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.941301 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.942199 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.115242271 podStartE2EDuration="6.942182538s" podCreationTimestamp="2026-01-21 15:21:13 +0000 UTC" firstStartedPulling="2026-01-21 15:21:15.157971469 +0000 UTC m=+1172.339487691" lastFinishedPulling="2026-01-21 15:21:18.984911735 +0000 UTC m=+1176.166427958" observedRunningTime="2026-01-21 15:21:19.938357763 +0000 UTC m=+1177.119873976" watchObservedRunningTime="2026-01-21 15:21:19.942182538 +0000 UTC m=+1177.123698760" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.946910 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:19 crc kubenswrapper[4707]: I0121 15:21:19.980016 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:21:20 crc kubenswrapper[4707]: W0121 15:21:20.002097 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6243463_c1a1_42a5_8697_11a221d45be7.slice/crio-6cf65c2bba9d9a8d94be9db8039c5d8f1d4729776345ad2374399cffc76de588 WatchSource:0}: Error finding container 6cf65c2bba9d9a8d94be9db8039c5d8f1d4729776345ad2374399cffc76de588: Status 404 returned error can't find the container with id 6cf65c2bba9d9a8d94be9db8039c5d8f1d4729776345ad2374399cffc76de588 Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.056835 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-79b5787b96-vlq6n"] Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.267835 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.369701 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-55779dfdd6-4x8fn"] Jan 21 15:21:20 crc kubenswrapper[4707]: E0121 15:21:20.370044 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685bf29d-d0cc-43bc-9d01-6585cc8b0f18" containerName="cinder-db-sync" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.370062 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="685bf29d-d0cc-43bc-9d01-6585cc8b0f18" containerName="cinder-db-sync" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.370209 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="685bf29d-d0cc-43bc-9d01-6585cc8b0f18" containerName="cinder-db-sync" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.370929 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.372501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.376436 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.395057 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.401357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-db-sync-config-data\") pod \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.401421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-etc-machine-id\") pod \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.401481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-combined-ca-bundle\") pod \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.401505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-config-data\") pod \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.401537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-scripts\") pod \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.401568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv8f6\" (UniqueName: 
\"kubernetes.io/projected/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-kube-api-access-fv8f6\") pod \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\" (UID: \"685bf29d-d0cc-43bc-9d01-6585cc8b0f18\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.404030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-55779dfdd6-4x8fn"] Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.405666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-kube-api-access-fv8f6" (OuterVolumeSpecName: "kube-api-access-fv8f6") pod "685bf29d-d0cc-43bc-9d01-6585cc8b0f18" (UID: "685bf29d-d0cc-43bc-9d01-6585cc8b0f18"). InnerVolumeSpecName "kube-api-access-fv8f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.408147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-scripts" (OuterVolumeSpecName: "scripts") pod "685bf29d-d0cc-43bc-9d01-6585cc8b0f18" (UID: "685bf29d-d0cc-43bc-9d01-6585cc8b0f18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.408582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "685bf29d-d0cc-43bc-9d01-6585cc8b0f18" (UID: "685bf29d-d0cc-43bc-9d01-6585cc8b0f18"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.425556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "685bf29d-d0cc-43bc-9d01-6585cc8b0f18" (UID: "685bf29d-d0cc-43bc-9d01-6585cc8b0f18"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.485303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-config-data" (OuterVolumeSpecName: "config-data") pod "685bf29d-d0cc-43bc-9d01-6585cc8b0f18" (UID: "685bf29d-d0cc-43bc-9d01-6585cc8b0f18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.497880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "685bf29d-d0cc-43bc-9d01-6585cc8b0f18" (UID: "685bf29d-d0cc-43bc-9d01-6585cc8b0f18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.509819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-scripts\") pod \"a0850def-7e03-4120-8306-b2938afba2d6\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.509958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-credential-keys\") pod \"a0850def-7e03-4120-8306-b2938afba2d6\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-combined-ca-bundle\") pod \"a0850def-7e03-4120-8306-b2938afba2d6\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjw49\" (UniqueName: \"kubernetes.io/projected/a0850def-7e03-4120-8306-b2938afba2d6-kube-api-access-tjw49\") pod \"a0850def-7e03-4120-8306-b2938afba2d6\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-config-data\") pod \"a0850def-7e03-4120-8306-b2938afba2d6\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-fernet-keys\") pod \"a0850def-7e03-4120-8306-b2938afba2d6\" (UID: \"a0850def-7e03-4120-8306-b2938afba2d6\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-combined-ca-bundle\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316bd962-7ce5-4797-8a99-79aacd85c3fb-logs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-internal-tls-certs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-tvddn\" (UniqueName: \"kubernetes.io/projected/316bd962-7ce5-4797-8a99-79aacd85c3fb-kube-api-access-tvddn\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-config-data\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-scripts\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-public-tls-certs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510588 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510603 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510611 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510619 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510628 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv8f6\" (UniqueName: \"kubernetes.io/projected/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-kube-api-access-fv8f6\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.510637 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bf29d-d0cc-43bc-9d01-6585cc8b0f18-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.526239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-scripts" (OuterVolumeSpecName: "scripts") pod "a0850def-7e03-4120-8306-b2938afba2d6" (UID: "a0850def-7e03-4120-8306-b2938afba2d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.532933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0850def-7e03-4120-8306-b2938afba2d6-kube-api-access-tjw49" (OuterVolumeSpecName: "kube-api-access-tjw49") pod "a0850def-7e03-4120-8306-b2938afba2d6" (UID: "a0850def-7e03-4120-8306-b2938afba2d6"). InnerVolumeSpecName "kube-api-access-tjw49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.574961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a0850def-7e03-4120-8306-b2938afba2d6" (UID: "a0850def-7e03-4120-8306-b2938afba2d6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.576062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-config-data" (OuterVolumeSpecName: "config-data") pod "a0850def-7e03-4120-8306-b2938afba2d6" (UID: "a0850def-7e03-4120-8306-b2938afba2d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.578944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0850def-7e03-4120-8306-b2938afba2d6" (UID: "a0850def-7e03-4120-8306-b2938afba2d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.588962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a0850def-7e03-4120-8306-b2938afba2d6" (UID: "a0850def-7e03-4120-8306-b2938afba2d6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-combined-ca-bundle\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316bd962-7ce5-4797-8a99-79aacd85c3fb-logs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-internal-tls-certs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvddn\" (UniqueName: \"kubernetes.io/projected/316bd962-7ce5-4797-8a99-79aacd85c3fb-kube-api-access-tvddn\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-config-data\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-scripts\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-public-tls-certs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614689 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614704 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614712 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-credential-keys\") 
on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614721 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614730 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjw49\" (UniqueName: \"kubernetes.io/projected/a0850def-7e03-4120-8306-b2938afba2d6-kube-api-access-tjw49\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.614737 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0850def-7e03-4120-8306-b2938afba2d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.616026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316bd962-7ce5-4797-8a99-79aacd85c3fb-logs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.620270 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.620678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-internal-tls-certs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.622577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-config-data\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.623493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-scripts\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.623999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-combined-ca-bundle\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.629586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-public-tls-certs\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.634592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvddn\" (UniqueName: 
\"kubernetes.io/projected/316bd962-7ce5-4797-8a99-79aacd85c3fb-kube-api-access-tvddn\") pod \"placement-55779dfdd6-4x8fn\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.695253 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.716506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbs7g\" (UniqueName: \"kubernetes.io/projected/1a3a71af-3540-4d7c-9b47-04bf4098aa40-kube-api-access-bbs7g\") pod \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.716538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-config\") pod \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.716619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-combined-ca-bundle\") pod \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\" (UID: \"1a3a71af-3540-4d7c-9b47-04bf4098aa40\") " Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.720746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3a71af-3540-4d7c-9b47-04bf4098aa40-kube-api-access-bbs7g" (OuterVolumeSpecName: "kube-api-access-bbs7g") pod "1a3a71af-3540-4d7c-9b47-04bf4098aa40" (UID: "1a3a71af-3540-4d7c-9b47-04bf4098aa40"). InnerVolumeSpecName "kube-api-access-bbs7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.726240 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd"] Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.754000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a3a71af-3540-4d7c-9b47-04bf4098aa40" (UID: "1a3a71af-3540-4d7c-9b47-04bf4098aa40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.776886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl"] Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.779888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-config" (OuterVolumeSpecName: "config") pod "1a3a71af-3540-4d7c-9b47-04bf4098aa40" (UID: "1a3a71af-3540-4d7c-9b47-04bf4098aa40"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.788555 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb4888b88-mz44p"] Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.818512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbs7g\" (UniqueName: \"kubernetes.io/projected/1a3a71af-3540-4d7c-9b47-04bf4098aa40-kube-api-access-bbs7g\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.818538 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.818550 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3a71af-3540-4d7c-9b47-04bf4098aa40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:20 crc kubenswrapper[4707]: W0121 15:21:20.819369 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf2ca736_6fb3_48fd_b827_17934d478419.slice/crio-c3de0a15d39aa2eb90940c9527ea5e94131c99be22da1c90edac2aafa08f88e6 WatchSource:0}: Error finding container c3de0a15d39aa2eb90940c9527ea5e94131c99be22da1c90edac2aafa08f88e6: Status 404 returned error can't find the container with id c3de0a15d39aa2eb90940c9527ea5e94131c99be22da1c90edac2aafa08f88e6 Jan 21 15:21:20 crc kubenswrapper[4707]: W0121 15:21:20.848148 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode41c8dac_d0ea_40c7_a4f2_0d43d31f58b1.slice/crio-73bddd9a9b9955c56c3a5111745237517c39a313a7a2863a240fa17953561f34 WatchSource:0}: Error finding container 73bddd9a9b9955c56c3a5111745237517c39a313a7a2863a240fa17953561f34: Status 404 returned error can't find the container with id 73bddd9a9b9955c56c3a5111745237517c39a313a7a2863a240fa17953561f34 Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.860099 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.984302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" event={"ID":"685bf29d-d0cc-43bc-9d01-6585cc8b0f18","Type":"ContainerDied","Data":"75b67b4f4a2055e9798f49ba00203ddca9e418046f8ac75c6d81a9890274846a"} Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.984344 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75b67b4f4a2055e9798f49ba00203ddca9e418046f8ac75c6d81a9890274846a" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.984393 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.992549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" event={"ID":"1a3a71af-3540-4d7c-9b47-04bf4098aa40","Type":"ContainerDied","Data":"d5717f0a92ce31c7acfc9c0b2a36039fad51f3037c9abe80d38ac9605c405e72"} Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.992583 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5717f0a92ce31c7acfc9c0b2a36039fad51f3037c9abe80d38ac9605c405e72" Jan 21 15:21:20 crc kubenswrapper[4707]: I0121 15:21:20.992653 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-xczbb" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.005690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" event={"ID":"ee0ec627-76fd-4938-9a2b-db220a1ac5d5","Type":"ContainerStarted","Data":"69af5616e1526ccaf0294310ea014397d1c00ffb9cde6dfbef8e59f0e4eafc9c"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.007590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" event={"ID":"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1","Type":"ContainerStarted","Data":"73bddd9a9b9955c56c3a5111745237517c39a313a7a2863a240fa17953561f34"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.011043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b6243463-c1a1-42a5-8697-11a221d45be7","Type":"ContainerStarted","Data":"71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.011075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b6243463-c1a1-42a5-8697-11a221d45be7","Type":"ContainerStarted","Data":"6cf65c2bba9d9a8d94be9db8039c5d8f1d4729776345ad2374399cffc76de588"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.015229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" event={"ID":"a0850def-7e03-4120-8306-b2938afba2d6","Type":"ContainerDied","Data":"18518663776b598afad4ced2106d86881f07df9604e1edd2fcfd4c7783ecd23f"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.015247 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18518663776b598afad4ced2106d86881f07df9604e1edd2fcfd4c7783ecd23f" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.015332 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6sdhj" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4twm4\" (UniqueName: \"kubernetes.io/projected/1f414bee-4347-4f3f-bada-d7a2ee12f495-kube-api-access-4twm4\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-config-data\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-log-httpd\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-scripts\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-sg-core-conf-yaml\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-run-httpd\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.029522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-combined-ca-bundle\") pod \"1f414bee-4347-4f3f-bada-d7a2ee12f495\" (UID: \"1f414bee-4347-4f3f-bada-d7a2ee12f495\") " Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.032069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.037824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.040298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-scripts" (OuterVolumeSpecName: "scripts") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.041156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f414bee-4347-4f3f-bada-d7a2ee12f495-kube-api-access-4twm4" (OuterVolumeSpecName: "kube-api-access-4twm4") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "kube-api-access-4twm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.051412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"47279210-6924-452f-abb7-8d5d7ff90ec9","Type":"ContainerStarted","Data":"729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099361 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" exitCode=0 Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099393 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" exitCode=2 Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099401 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" exitCode=0 Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099408 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" exitCode=0 Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerDied","Data":"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerDied","Data":"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerDied","Data":"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerDied","Data":"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9"} Jan 21 15:21:21 crc 
kubenswrapper[4707]: I0121 15:21:21.099500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f414bee-4347-4f3f-bada-d7a2ee12f495","Type":"ContainerDied","Data":"272c231a4704f9f1eca4313adbfe0a7c4777a679b89314cc3e1e4804deee90c9"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099513 4707 scope.go:117] "RemoveContainer" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.099646 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.112191 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6sdhj"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.113362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" event={"ID":"cf2ca736-6fb3-48fd-b827-17934d478419","Type":"ContainerStarted","Data":"c3de0a15d39aa2eb90940c9527ea5e94131c99be22da1c90edac2aafa08f88e6"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.131872 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.131902 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4twm4\" (UniqueName: \"kubernetes.io/projected/1f414bee-4347-4f3f-bada-d7a2ee12f495-kube-api-access-4twm4\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.131913 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f414bee-4347-4f3f-bada-d7a2ee12f495-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.131920 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.151141 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6sdhj"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.165175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.165202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" event={"ID":"d380df51-2feb-4db3-ac56-f2dc791690c9","Type":"ContainerStarted","Data":"27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.165248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" event={"ID":"d380df51-2feb-4db3-ac56-f2dc791690c9","Type":"ContainerStarted","Data":"923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.165259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" event={"ID":"d380df51-2feb-4db3-ac56-f2dc791690c9","Type":"ContainerStarted","Data":"cf132100523696cf3d340712971e050843538e96f77e623d017912998013fd5c"} Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.165913 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.165959 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177095 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-d5c68ff65-qkk97"] Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.177471 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="proxy-httpd" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="proxy-httpd" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.177502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3a71af-3540-4d7c-9b47-04bf4098aa40" containerName="neutron-db-sync" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3a71af-3540-4d7c-9b47-04bf4098aa40" containerName="neutron-db-sync" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.177528 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-central-agent" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177533 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-central-agent" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.177544 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-notification-agent" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177549 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-notification-agent" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.177558 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="sg-core" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177564 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" 
containerName="sg-core" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.177580 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0850def-7e03-4120-8306-b2938afba2d6" containerName="keystone-bootstrap" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0850def-7e03-4120-8306-b2938afba2d6" containerName="keystone-bootstrap" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177725 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="proxy-httpd" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177737 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3a71af-3540-4d7c-9b47-04bf4098aa40" containerName="neutron-db-sync" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177745 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="sg-core" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-central-agent" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0850def-7e03-4120-8306-b2938afba2d6" containerName="keystone-bootstrap" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.177775 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" containerName="ceilometer-notification-agent" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.178524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.186220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-74qf4" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.186454 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.186621 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.191469 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.216960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-config-data" (OuterVolumeSpecName: "config-data") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.220832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f414bee-4347-4f3f-bada-d7a2ee12f495" (UID: "1f414bee-4347-4f3f-bada-d7a2ee12f495"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.233592 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.233617 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.233626 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f414bee-4347-4f3f-bada-d7a2ee12f495-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.256338 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" podStartSLOduration=2.256321661 podStartE2EDuration="2.256321661s" podCreationTimestamp="2026-01-21 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:21.186788964 +0000 UTC m=+1178.368305186" watchObservedRunningTime="2026-01-21 15:21:21.256321661 +0000 UTC m=+1178.437837883" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.289941 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0850def-7e03-4120-8306-b2938afba2d6" path="/var/lib/kubelet/pods/a0850def-7e03-4120-8306-b2938afba2d6/volumes" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.290830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-d5c68ff65-qkk97"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.290851 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-v9gl6"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.291590 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.292619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-v9gl6"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.292634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.292646 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.293180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.293479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.293501 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-55779dfdd6-4x8fn"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.293570 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.293573 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296131 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rcmqr" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296397 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296693 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4bkq4" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296853 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.296981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.297098 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.297511 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.297626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.335413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-combined-ca-bundle\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.335645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9q5j\" (UniqueName: \"kubernetes.io/projected/9f52f02f-4ec5-4642-a742-004ac2668fe8-kube-api-access-d9q5j\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.335666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-ovndb-tls-certs\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.335697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-httpd-config\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.335719 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-config\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.338260 4707 scope.go:117] "RemoveContainer" containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.400401 4707 scope.go:117] "RemoveContainer" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.441986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-scripts\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f271b0ae-294d-4331-a6c5-54e7f991ab4f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2flgd\" (UniqueName: \"kubernetes.io/projected/f271b0ae-294d-4331-a6c5-54e7f991ab4f-kube-api-access-2flgd\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpqt\" (UniqueName: \"kubernetes.io/projected/aac82614-0eca-4386-8dd7-2ae364a95de1-kube-api-access-hhpqt\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442785 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f271b0ae-294d-4331-a6c5-54e7f991ab4f-logs\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-combined-ca-bundle\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-scripts\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.442954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data-custom\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9q5j\" (UniqueName: \"kubernetes.io/projected/9f52f02f-4ec5-4642-a742-004ac2668fe8-kube-api-access-d9q5j\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-credential-keys\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-ovndb-tls-certs\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-httpd-config\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443247 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-config-data\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-scripts\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-fernet-keys\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-combined-ca-bundle\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlj2b\" (UniqueName: \"kubernetes.io/projected/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-kube-api-access-mlj2b\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aac82614-0eca-4386-8dd7-2ae364a95de1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.443391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-config\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.446969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-combined-ca-bundle\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 
15:21:21.447550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-config\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.448521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-httpd-config\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.459267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9q5j\" (UniqueName: \"kubernetes.io/projected/9f52f02f-4ec5-4642-a742-004ac2668fe8-kube-api-access-d9q5j\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.459309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-ovndb-tls-certs\") pod \"neutron-d5c68ff65-qkk97\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f271b0ae-294d-4331-a6c5-54e7f991ab4f-logs\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-scripts\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data-custom\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-credential-keys\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-config-data\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-scripts\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-fernet-keys\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-combined-ca-bundle\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlj2b\" (UniqueName: \"kubernetes.io/projected/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-kube-api-access-mlj2b\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aac82614-0eca-4386-8dd7-2ae364a95de1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.545783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-scripts\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aac82614-0eca-4386-8dd7-2ae364a95de1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f271b0ae-294d-4331-a6c5-54e7f991ab4f-logs\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f271b0ae-294d-4331-a6c5-54e7f991ab4f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2flgd\" (UniqueName: \"kubernetes.io/projected/f271b0ae-294d-4331-a6c5-54e7f991ab4f-kube-api-access-2flgd\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.546315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpqt\" (UniqueName: \"kubernetes.io/projected/aac82614-0eca-4386-8dd7-2ae364a95de1-kube-api-access-hhpqt\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.559095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data-custom\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.559104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-fernet-keys\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.559265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-scripts\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.559514 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-scripts\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.559755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-config-data\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.559797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-credential-keys\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.560099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-scripts\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.560272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.560359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpqt\" (UniqueName: \"kubernetes.io/projected/aac82614-0eca-4386-8dd7-2ae364a95de1-kube-api-access-hhpqt\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.563297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-combined-ca-bundle\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.563667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlj2b\" (UniqueName: \"kubernetes.io/projected/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-kube-api-access-mlj2b\") pod \"keystone-bootstrap-v9gl6\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.564462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f271b0ae-294d-4331-a6c5-54e7f991ab4f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.566801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.569848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.570526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.578023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2flgd\" (UniqueName: \"kubernetes.io/projected/f271b0ae-294d-4331-a6c5-54e7f991ab4f-kube-api-access-2flgd\") pod \"cinder-api-0\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.578243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.600669 4707 scope.go:117] "RemoveContainer" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.662740 4707 scope.go:117] "RemoveContainer" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.664035 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": container with ID starting with 5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151 not found: ID does not exist" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.664078 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151"} err="failed to get container status \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": rpc error: code = NotFound desc = could not find container \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": container with ID starting with 5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.664100 4707 scope.go:117] "RemoveContainer" containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.665635 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": container with ID starting with 5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e not found: ID does not exist" 
containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.665674 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e"} err="failed to get container status \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": rpc error: code = NotFound desc = could not find container \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": container with ID starting with 5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.665700 4707 scope.go:117] "RemoveContainer" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.667682 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": container with ID starting with 4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007 not found: ID does not exist" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.667724 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007"} err="failed to get container status \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": rpc error: code = NotFound desc = could not find container \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": container with ID starting with 4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.667743 4707 scope.go:117] "RemoveContainer" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" Jan 21 15:21:21 crc kubenswrapper[4707]: E0121 15:21:21.671830 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": container with ID starting with f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9 not found: ID does not exist" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.671864 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9"} err="failed to get container status \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": rpc error: code = NotFound desc = could not find container \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": container with ID starting with f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.671883 4707 scope.go:117] "RemoveContainer" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.675934 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151"} err="failed to get container status \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": rpc error: code = NotFound desc = could not find container \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": container with ID starting with 5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.675958 4707 scope.go:117] "RemoveContainer" containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676245 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e"} err="failed to get container status \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": rpc error: code = NotFound desc = could not find container \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": container with ID starting with 5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676272 4707 scope.go:117] "RemoveContainer" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676473 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007"} err="failed to get container status \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": rpc error: code = NotFound desc = could not find container \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": container with ID starting with 4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676496 4707 scope.go:117] "RemoveContainer" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676671 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9"} err="failed to get container status \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": rpc error: code = NotFound desc = could not find container \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": container with ID starting with f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676695 4707 scope.go:117] "RemoveContainer" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676907 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151"} err="failed to get container status \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": rpc error: code = NotFound desc = could not find container \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": container with ID starting with 5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151 not found: ID does not exist" Jan 
21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.676926 4707 scope.go:117] "RemoveContainer" containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677092 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e"} err="failed to get container status \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": rpc error: code = NotFound desc = could not find container \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": container with ID starting with 5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677110 4707 scope.go:117] "RemoveContainer" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677290 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007"} err="failed to get container status \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": rpc error: code = NotFound desc = could not find container \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": container with ID starting with 4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677318 4707 scope.go:117] "RemoveContainer" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677471 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9"} err="failed to get container status \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": rpc error: code = NotFound desc = could not find container \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": container with ID starting with f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677487 4707 scope.go:117] "RemoveContainer" containerID="5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677684 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151"} err="failed to get container status \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": rpc error: code = NotFound desc = could not find container \"5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151\": container with ID starting with 5791ee2e2d53a57eced84b806b64b2d6624a67a0a5d6b570087a1ea86a72f151 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.677804 4707 scope.go:117] "RemoveContainer" containerID="5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.678165 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e"} err="failed to get container status 
\"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": rpc error: code = NotFound desc = could not find container \"5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e\": container with ID starting with 5fa5d928fb5deb9ffa7d64dfdca0a616441f9a601281b97d0bbed2df62c9102e not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.678199 4707 scope.go:117] "RemoveContainer" containerID="4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.678431 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007"} err="failed to get container status \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": rpc error: code = NotFound desc = could not find container \"4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007\": container with ID starting with 4db14441c927889774296d577e1e193423033604b054736e80c1eed3c5fb7007 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.678452 4707 scope.go:117] "RemoveContainer" containerID="f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.680095 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9"} err="failed to get container status \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": rpc error: code = NotFound desc = could not find container \"f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9\": container with ID starting with f58608fbfbb12b5c88a16ce29c2dae4d5071ebaff68654407ef4e8f364de74c9 not found: ID does not exist" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.680540 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.681613 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.682817 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.703359 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.713668 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.750540 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.758645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.760319 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.762050 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.765443 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.811164 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.851989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-config-data\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.852193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-run-httpd\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.852221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-scripts\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.852247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxw4\" (UniqueName: \"kubernetes.io/projected/8216f328-2076-4cd7-a713-1ac9d9a48557-kube-api-access-nbxw4\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.852364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-log-httpd\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.852390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.852418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-config-data\") pod \"ceilometer-0\" (UID: 
\"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-run-httpd\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-scripts\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxw4\" (UniqueName: \"kubernetes.io/projected/8216f328-2076-4cd7-a713-1ac9d9a48557-kube-api-access-nbxw4\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-log-httpd\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.953898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.954596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-run-httpd\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.955127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-log-httpd\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.957910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-config-data\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.960104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.961034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-scripts\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.967205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:21 crc kubenswrapper[4707]: I0121 15:21:21.970105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxw4\" (UniqueName: \"kubernetes.io/projected/8216f328-2076-4cd7-a713-1ac9d9a48557-kube-api-access-nbxw4\") pod \"ceilometer-0\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.093187 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.198952 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-v9gl6"] Jan 21 15:21:22 crc kubenswrapper[4707]: W0121 15:21:22.205901 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1cabab_f9cf_4c1c_8690_daedf78a0a53.slice/crio-d2cb4e6c893a7aa52f1fd09fccbb1d9b5fd2dc84653c9c3153c3a44162a5dab2 WatchSource:0}: Error finding container d2cb4e6c893a7aa52f1fd09fccbb1d9b5fd2dc84653c9c3153c3a44162a5dab2: Status 404 returned error can't find the container with id d2cb4e6c893a7aa52f1fd09fccbb1d9b5fd2dc84653c9c3153c3a44162a5dab2 Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.206046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"47279210-6924-452f-abb7-8d5d7ff90ec9","Type":"ContainerStarted","Data":"577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.229712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" event={"ID":"316bd962-7ce5-4797-8a99-79aacd85c3fb","Type":"ContainerStarted","Data":"5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.229933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" event={"ID":"316bd962-7ce5-4797-8a99-79aacd85c3fb","Type":"ContainerStarted","Data":"84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.229947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" event={"ID":"316bd962-7ce5-4797-8a99-79aacd85c3fb","Type":"ContainerStarted","Data":"c6efe1634e159efa5d15d25489c07dd0eaac269d0facb952d945215a65da5eba"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.229982 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.230002 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.232010 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.231985833 podStartE2EDuration="4.231985833s" podCreationTimestamp="2026-01-21 15:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:22.229020394 +0000 UTC m=+1179.410536637" watchObservedRunningTime="2026-01-21 15:21:22.231985833 +0000 UTC m=+1179.413502055" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.235118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" event={"ID":"ee0ec627-76fd-4938-9a2b-db220a1ac5d5","Type":"ContainerStarted","Data":"1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.235157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" event={"ID":"ee0ec627-76fd-4938-9a2b-db220a1ac5d5","Type":"ContainerStarted","Data":"a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.238585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" event={"ID":"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1","Type":"ContainerStarted","Data":"ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.238614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" event={"ID":"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1","Type":"ContainerStarted","Data":"ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.246378 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" podStartSLOduration=2.246365833 podStartE2EDuration="2.246365833s" podCreationTimestamp="2026-01-21 15:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:22.245825386 +0000 UTC m=+1179.427341618" watchObservedRunningTime="2026-01-21 15:21:22.246365833 +0000 UTC m=+1179.427882054" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.252743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" event={"ID":"cf2ca736-6fb3-48fd-b827-17934d478419","Type":"ContainerStarted","Data":"ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.252775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" event={"ID":"cf2ca736-6fb3-48fd-b827-17934d478419","Type":"ContainerStarted","Data":"b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.252791 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.252856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.255395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b6243463-c1a1-42a5-8697-11a221d45be7","Type":"ContainerStarted","Data":"138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610"} Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.264559 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" podStartSLOduration=3.264548656 podStartE2EDuration="3.264548656s" podCreationTimestamp="2026-01-21 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:22.259019776 +0000 UTC m=+1179.440535998" watchObservedRunningTime="2026-01-21 15:21:22.264548656 +0000 UTC m=+1179.446064878" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.277733 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" podStartSLOduration=3.277716746 podStartE2EDuration="3.277716746s" podCreationTimestamp="2026-01-21 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:22.277344175 +0000 UTC m=+1179.458860388" watchObservedRunningTime="2026-01-21 15:21:22.277716746 +0000 UTC m=+1179.459232968" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.300685 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" podStartSLOduration=3.300668721 podStartE2EDuration="3.300668721s" podCreationTimestamp="2026-01-21 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:22.297989832 +0000 UTC m=+1179.479506054" watchObservedRunningTime="2026-01-21 15:21:22.300668721 +0000 UTC m=+1179.482184943" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.322985 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.322968027 podStartE2EDuration="3.322968027s" podCreationTimestamp="2026-01-21 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:22.313800683 +0000 UTC m=+1179.495316905" watchObservedRunningTime="2026-01-21 15:21:22.322968027 +0000 UTC m=+1179.504484250" Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.351013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.359417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.459738 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-d5c68ff65-qkk97"] Jan 21 15:21:22 crc kubenswrapper[4707]: W0121 15:21:22.471172 4707 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f52f02f_4ec5_4642_a742_004ac2668fe8.slice/crio-98c39d949d751211f34cdf48694908dcfa4e28b0f99bd1b9cfb12803b7fb15bd WatchSource:0}: Error finding container 98c39d949d751211f34cdf48694908dcfa4e28b0f99bd1b9cfb12803b7fb15bd: Status 404 returned error can't find the container with id 98c39d949d751211f34cdf48694908dcfa4e28b0f99bd1b9cfb12803b7fb15bd Jan 21 15:21:22 crc kubenswrapper[4707]: I0121 15:21:22.671159 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:21:22 crc kubenswrapper[4707]: W0121 15:21:22.687594 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8216f328_2076_4cd7_a713_1ac9d9a48557.slice/crio-b603e7cb3adfb798ffa88529cc2462b503a82c97cdc54a4a3e8d432197604bb1 WatchSource:0}: Error finding container b603e7cb3adfb798ffa88529cc2462b503a82c97cdc54a4a3e8d432197604bb1: Status 404 returned error can't find the container with id b603e7cb3adfb798ffa88529cc2462b503a82c97cdc54a4a3e8d432197604bb1 Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.196452 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f414bee-4347-4f3f-bada-d7a2ee12f495" path="/var/lib/kubelet/pods/1f414bee-4347-4f3f-bada-d7a2ee12f495/volumes" Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.272605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerStarted","Data":"b603e7cb3adfb798ffa88529cc2462b503a82c97cdc54a4a3e8d432197604bb1"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.315626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" event={"ID":"9f52f02f-4ec5-4642-a742-004ac2668fe8","Type":"ContainerStarted","Data":"16208ea82b812a06cbd8678b6432f56905a4fec41f434ee8846390c6c7f1477c"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.315675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" event={"ID":"9f52f02f-4ec5-4642-a742-004ac2668fe8","Type":"ContainerStarted","Data":"e555f5b33a22f3ba35e49640ecdfcebc760af82162eb8d8cac885aa76cc9b7b2"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.315686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" event={"ID":"9f52f02f-4ec5-4642-a742-004ac2668fe8","Type":"ContainerStarted","Data":"98c39d949d751211f34cdf48694908dcfa4e28b0f99bd1b9cfb12803b7fb15bd"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.342096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f271b0ae-294d-4331-a6c5-54e7f991ab4f","Type":"ContainerStarted","Data":"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.342140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f271b0ae-294d-4331-a6c5-54e7f991ab4f","Type":"ContainerStarted","Data":"573ec6265771862c78d9b73c4d6bf86903ae4994441b6bf965997bcbeeb0a3f1"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.347203 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" podStartSLOduration=2.347192826 
podStartE2EDuration="2.347192826s" podCreationTimestamp="2026-01-21 15:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:23.334336111 +0000 UTC m=+1180.515852333" watchObservedRunningTime="2026-01-21 15:21:23.347192826 +0000 UTC m=+1180.528709048" Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.349003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"aac82614-0eca-4386-8dd7-2ae364a95de1","Type":"ContainerStarted","Data":"12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.349029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"aac82614-0eca-4386-8dd7-2ae364a95de1","Type":"ContainerStarted","Data":"14d131d2bbde0df99b1ef3f7df18a610fdb7c7397f2ab08e7fe9722254fc0c03"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.355196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" event={"ID":"3c1cabab-f9cf-4c1c-8690-daedf78a0a53","Type":"ContainerStarted","Data":"fee0cb39511a55b0679602f9c7a78524a3f99c3be04ba4a4ce0a4fd68b351c5c"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.355219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" event={"ID":"3c1cabab-f9cf-4c1c-8690-daedf78a0a53","Type":"ContainerStarted","Data":"d2cb4e6c893a7aa52f1fd09fccbb1d9b5fd2dc84653c9c3153c3a44162a5dab2"} Jan 21 15:21:23 crc kubenswrapper[4707]: I0121 15:21:23.370016 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" podStartSLOduration=2.370005509 podStartE2EDuration="2.370005509s" podCreationTimestamp="2026-01-21 15:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:23.366945883 +0000 UTC m=+1180.548462106" watchObservedRunningTime="2026-01-21 15:21:23.370005509 +0000 UTC m=+1180.551521731" Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.371085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f271b0ae-294d-4331-a6c5-54e7f991ab4f","Type":"ContainerStarted","Data":"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7"} Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.371584 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.374609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"aac82614-0eca-4386-8dd7-2ae364a95de1","Type":"ContainerStarted","Data":"608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da"} Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.378497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerStarted","Data":"c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf"} Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.378515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerStarted","Data":"ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e"} Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.378779 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.388879 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.388868661 podStartE2EDuration="3.388868661s" podCreationTimestamp="2026-01-21 15:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:24.384299667 +0000 UTC m=+1181.565815909" watchObservedRunningTime="2026-01-21 15:21:24.388868661 +0000 UTC m=+1181.570384883" Jan 21 15:21:24 crc kubenswrapper[4707]: I0121 15:21:24.410719 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.410710768 podStartE2EDuration="3.410710768s" podCreationTimestamp="2026-01-21 15:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:24.405298258 +0000 UTC m=+1181.586814479" watchObservedRunningTime="2026-01-21 15:21:24.410710768 +0000 UTC m=+1181.592226991" Jan 21 15:21:25 crc kubenswrapper[4707]: I0121 15:21:25.387488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerStarted","Data":"41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595"} Jan 21 15:21:25 crc kubenswrapper[4707]: I0121 15:21:25.388610 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c1cabab-f9cf-4c1c-8690-daedf78a0a53" containerID="fee0cb39511a55b0679602f9c7a78524a3f99c3be04ba4a4ce0a4fd68b351c5c" exitCode=0 Jan 21 15:21:25 crc kubenswrapper[4707]: I0121 15:21:25.388681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" event={"ID":"3c1cabab-f9cf-4c1c-8690-daedf78a0a53","Type":"ContainerDied","Data":"fee0cb39511a55b0679602f9c7a78524a3f99c3be04ba4a4ce0a4fd68b351c5c"} Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.683912 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.720763 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.857425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-config-data\") pod \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.857754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-fernet-keys\") pod \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.857835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlj2b\" (UniqueName: \"kubernetes.io/projected/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-kube-api-access-mlj2b\") pod \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.857904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-scripts\") pod \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.857982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-combined-ca-bundle\") pod \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.858007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-credential-keys\") pod \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\" (UID: \"3c1cabab-f9cf-4c1c-8690-daedf78a0a53\") " Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.863211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-scripts" (OuterVolumeSpecName: "scripts") pod "3c1cabab-f9cf-4c1c-8690-daedf78a0a53" (UID: "3c1cabab-f9cf-4c1c-8690-daedf78a0a53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.863461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-kube-api-access-mlj2b" (OuterVolumeSpecName: "kube-api-access-mlj2b") pod "3c1cabab-f9cf-4c1c-8690-daedf78a0a53" (UID: "3c1cabab-f9cf-4c1c-8690-daedf78a0a53"). InnerVolumeSpecName "kube-api-access-mlj2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.863918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3c1cabab-f9cf-4c1c-8690-daedf78a0a53" (UID: "3c1cabab-f9cf-4c1c-8690-daedf78a0a53"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.864190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3c1cabab-f9cf-4c1c-8690-daedf78a0a53" (UID: "3c1cabab-f9cf-4c1c-8690-daedf78a0a53"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.879684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c1cabab-f9cf-4c1c-8690-daedf78a0a53" (UID: "3c1cabab-f9cf-4c1c-8690-daedf78a0a53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.892920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-config-data" (OuterVolumeSpecName: "config-data") pod "3c1cabab-f9cf-4c1c-8690-daedf78a0a53" (UID: "3c1cabab-f9cf-4c1c-8690-daedf78a0a53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.960583 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.960851 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.960927 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.960982 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.961030 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlj2b\" (UniqueName: \"kubernetes.io/projected/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-kube-api-access-mlj2b\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:26 crc kubenswrapper[4707]: I0121 15:21:26.961080 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c1cabab-f9cf-4c1c-8690-daedf78a0a53-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.298893 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.299092 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api-log" containerID="cri-o://5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114" gracePeriod=30 Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.299174 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api" containerID="cri-o://f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7" gracePeriod=30 Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.405181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" event={"ID":"3c1cabab-f9cf-4c1c-8690-daedf78a0a53","Type":"ContainerDied","Data":"d2cb4e6c893a7aa52f1fd09fccbb1d9b5fd2dc84653c9c3153c3a44162a5dab2"} Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.405591 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2cb4e6c893a7aa52f1fd09fccbb1d9b5fd2dc84653c9c3153c3a44162a5dab2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.405231 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-v9gl6" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.409403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerStarted","Data":"ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6"} Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.409598 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.434951 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.561330117 podStartE2EDuration="6.434934964s" podCreationTimestamp="2026-01-21 15:21:21 +0000 UTC" firstStartedPulling="2026-01-21 15:21:22.690882538 +0000 UTC m=+1179.872398760" lastFinishedPulling="2026-01-21 15:21:26.564487384 +0000 UTC m=+1183.746003607" observedRunningTime="2026-01-21 15:21:27.42955809 +0000 UTC m=+1184.611074312" watchObservedRunningTime="2026-01-21 15:21:27.434934964 +0000 UTC m=+1184.616451186" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.551377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7f68c764d7-98wg2"] Jan 21 15:21:27 crc kubenswrapper[4707]: E0121 15:21:27.553317 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1cabab-f9cf-4c1c-8690-daedf78a0a53" containerName="keystone-bootstrap" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.553404 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1cabab-f9cf-4c1c-8690-daedf78a0a53" containerName="keystone-bootstrap" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.553606 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1cabab-f9cf-4c1c-8690-daedf78a0a53" containerName="keystone-bootstrap" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.554119 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.555982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.556315 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.556471 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.556715 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.557043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.557184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4bkq4" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.570580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7f68c764d7-98wg2"] Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.672733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-scripts\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.672803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-combined-ca-bundle\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.672848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-config-data\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.672871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-fernet-keys\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.672948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zjd\" (UniqueName: \"kubernetes.io/projected/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-kube-api-access-j6zjd\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.672978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-credential-keys\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.673034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-internal-tls-certs\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.673067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-public-tls-certs\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zjd\" (UniqueName: \"kubernetes.io/projected/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-kube-api-access-j6zjd\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-credential-keys\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-internal-tls-certs\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-public-tls-certs\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-scripts\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-combined-ca-bundle\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774832 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-config-data\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.774853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-fernet-keys\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.786464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-credential-keys\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.786978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-combined-ca-bundle\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.787698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-fernet-keys\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.788025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-scripts\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.788278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-internal-tls-certs\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.789482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-public-tls-certs\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.793501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zjd\" (UniqueName: \"kubernetes.io/projected/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-kube-api-access-j6zjd\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.795160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-config-data\") pod \"keystone-7f68c764d7-98wg2\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.886860 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.925575 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data-custom\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2flgd\" (UniqueName: \"kubernetes.io/projected/f271b0ae-294d-4331-a6c5-54e7f991ab4f-kube-api-access-2flgd\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f271b0ae-294d-4331-a6c5-54e7f991ab4f-logs\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-combined-ca-bundle\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-scripts\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.977803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f271b0ae-294d-4331-a6c5-54e7f991ab4f-etc-machine-id\") pod \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\" (UID: \"f271b0ae-294d-4331-a6c5-54e7f991ab4f\") " Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.978159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f271b0ae-294d-4331-a6c5-54e7f991ab4f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.978973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f271b0ae-294d-4331-a6c5-54e7f991ab4f-logs" (OuterVolumeSpecName: "logs") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.981719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.983931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f271b0ae-294d-4331-a6c5-54e7f991ab4f-kube-api-access-2flgd" (OuterVolumeSpecName: "kube-api-access-2flgd") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "kube-api-access-2flgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:27 crc kubenswrapper[4707]: I0121 15:21:27.984605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-scripts" (OuterVolumeSpecName: "scripts") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.002033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.043674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data" (OuterVolumeSpecName: "config-data") pod "f271b0ae-294d-4331-a6c5-54e7f991ab4f" (UID: "f271b0ae-294d-4331-a6c5-54e7f991ab4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079803 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f271b0ae-294d-4331-a6c5-54e7f991ab4f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079849 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079859 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2flgd\" (UniqueName: \"kubernetes.io/projected/f271b0ae-294d-4331-a6c5-54e7f991ab4f-kube-api-access-2flgd\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079869 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079878 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f271b0ae-294d-4331-a6c5-54e7f991ab4f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079888 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.079895 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f271b0ae-294d-4331-a6c5-54e7f991ab4f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.301479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7f68c764d7-98wg2"] Jan 21 15:21:28 crc kubenswrapper[4707]: W0121 15:21:28.310780 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aee120c_53c3_4a72_a81e_2dd10cc5cb9a.slice/crio-a4d25ef1cd513130c256ed45fabe1218d3833a2b42fbd941a7c36c0b9ca09e02 WatchSource:0}: Error finding container a4d25ef1cd513130c256ed45fabe1218d3833a2b42fbd941a7c36c0b9ca09e02: Status 404 returned error can't find the container with id a4d25ef1cd513130c256ed45fabe1218d3833a2b42fbd941a7c36c0b9ca09e02 Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.418961 4707 generic.go:334] "Generic (PLEG): container finished" podID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerID="f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7" exitCode=0 Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.418989 4707 generic.go:334] "Generic (PLEG): container finished" podID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerID="5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114" exitCode=143 Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.419022 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.419051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f271b0ae-294d-4331-a6c5-54e7f991ab4f","Type":"ContainerDied","Data":"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7"} Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.419076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f271b0ae-294d-4331-a6c5-54e7f991ab4f","Type":"ContainerDied","Data":"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114"} Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.419088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f271b0ae-294d-4331-a6c5-54e7f991ab4f","Type":"ContainerDied","Data":"573ec6265771862c78d9b73c4d6bf86903ae4994441b6bf965997bcbeeb0a3f1"} Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.419104 4707 scope.go:117] "RemoveContainer" containerID="f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.421692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" event={"ID":"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a","Type":"ContainerStarted","Data":"a4d25ef1cd513130c256ed45fabe1218d3833a2b42fbd941a7c36c0b9ca09e02"} Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.451474 4707 scope.go:117] "RemoveContainer" containerID="5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.453824 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.486745 4707 scope.go:117] "RemoveContainer" containerID="f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.491939 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:28 crc kubenswrapper[4707]: E0121 15:21:28.492064 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7\": container with ID starting with f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7 not found: ID does not exist" containerID="f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.492111 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7"} err="failed to get container status \"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7\": rpc error: code = NotFound desc = could not find container \"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7\": container with ID starting with f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7 not found: ID does not exist" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.492137 4707 scope.go:117] "RemoveContainer" containerID="5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114" Jan 21 15:21:28 crc kubenswrapper[4707]: E0121 15:21:28.492540 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114\": container with ID starting with 5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114 not found: ID does not exist" containerID="5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.492568 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114"} err="failed to get container status \"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114\": rpc error: code = NotFound desc = could not find container \"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114\": container with ID starting with 5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114 not found: ID does not exist" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.492600 4707 scope.go:117] "RemoveContainer" containerID="f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.492854 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7"} err="failed to get container status \"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7\": rpc error: code = NotFound desc = could not find container \"f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7\": container with ID starting with f74941fb7e853183b1035d42219b044d238ef028f43f35a8be6b2d57432622b7 not found: ID does not exist" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.492912 4707 scope.go:117] "RemoveContainer" containerID="5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.493121 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114"} err="failed to get container status \"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114\": rpc error: code = NotFound desc = could not find container \"5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114\": container with ID starting with 5e7e9be9802611547f9abbf3c8de3fd793be223bcea7378e75348f55eb336114 not found: ID does not exist" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.502177 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:28 crc kubenswrapper[4707]: E0121 15:21:28.502911 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api-log" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.502932 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api-log" Jan 21 15:21:28 crc kubenswrapper[4707]: E0121 15:21:28.502948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.502955 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.503265 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api-log" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.503309 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" containerName="cinder-api" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.505055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.509406 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.509460 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.509668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.522067 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.589865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data-custom\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.589994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-public-tls-certs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c1a990-5286-4274-a26a-8740ee506a12-logs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-scripts\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590134 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nlk\" (UniqueName: \"kubernetes.io/projected/90c1a990-5286-4274-a26a-8740ee506a12-kube-api-access-v6nlk\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.590267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90c1a990-5286-4274-a26a-8740ee506a12-etc-machine-id\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.691492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c1a990-5286-4274-a26a-8740ee506a12-logs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.691747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-scripts\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.691778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.691798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nlk\" (UniqueName: \"kubernetes.io/projected/90c1a990-5286-4274-a26a-8740ee506a12-kube-api-access-v6nlk\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.691922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90c1a990-5286-4274-a26a-8740ee506a12-etc-machine-id\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.691960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data-custom\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.692013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-internal-tls-certs\") 
pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.692051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.692063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-public-tls-certs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.693071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90c1a990-5286-4274-a26a-8740ee506a12-etc-machine-id\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.693546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c1a990-5286-4274-a26a-8740ee506a12-logs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.696751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data-custom\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.696752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.696972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.697092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-scripts\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.697575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-public-tls-certs\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.698047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.708668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nlk\" (UniqueName: \"kubernetes.io/projected/90c1a990-5286-4274-a26a-8740ee506a12-kube-api-access-v6nlk\") pod \"cinder-api-0\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:28 crc kubenswrapper[4707]: I0121 15:21:28.830875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.191673 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f271b0ae-294d-4331-a6c5-54e7f991ab4f" path="/var/lib/kubelet/pods/f271b0ae-294d-4331-a6c5-54e7f991ab4f/volumes" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.214875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:21:29 crc kubenswrapper[4707]: W0121 15:21:29.220880 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90c1a990_5286_4274_a26a_8740ee506a12.slice/crio-2cfc23d8af7f51b787084f15c145ea5d95b8afe2ea3d608d68f00e4b34f65866 WatchSource:0}: Error finding container 2cfc23d8af7f51b787084f15c145ea5d95b8afe2ea3d608d68f00e4b34f65866: Status 404 returned error can't find the container with id 2cfc23d8af7f51b787084f15c145ea5d95b8afe2ea3d608d68f00e4b34f65866 Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.323497 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.323543 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.353774 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.354461 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.381361 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.381422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.423714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.435996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.436919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" 
event={"ID":"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a","Type":"ContainerStarted","Data":"464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed"} Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.437581 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.453623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"90c1a990-5286-4274-a26a-8740ee506a12","Type":"ContainerStarted","Data":"2cfc23d8af7f51b787084f15c145ea5d95b8afe2ea3d608d68f00e4b34f65866"} Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.455575 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.460226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.460249 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.460259 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.477960 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" podStartSLOduration=2.477942664 podStartE2EDuration="2.477942664s" podCreationTimestamp="2026-01-21 15:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:29.472587721 +0000 UTC m=+1186.654103943" watchObservedRunningTime="2026-01-21 15:21:29.477942664 +0000 UTC m=+1186.659458886" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.790565 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-855bf96656-swjds"] Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.792496 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.800830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.801040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.806561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-855bf96656-swjds"] Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.913284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-public-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.913352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-config\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.913371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-httpd-config\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.913551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-internal-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.914047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-ovndb-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.914075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbd8b\" (UniqueName: \"kubernetes.io/projected/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-kube-api-access-lbd8b\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:29 crc kubenswrapper[4707]: I0121 15:21:29.914105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-combined-ca-bundle\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 
15:21:30.016063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbd8b\" (UniqueName: \"kubernetes.io/projected/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-kube-api-access-lbd8b\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.016358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-combined-ca-bundle\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.016496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-public-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.016536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-config\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.016554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-httpd-config\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.016611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-internal-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.016748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-ovndb-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.024711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-internal-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.024835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-public-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.025412 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-combined-ca-bundle\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.025781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-ovndb-tls-certs\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.039959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-httpd-config\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.040189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-config\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.048379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbd8b\" (UniqueName: \"kubernetes.io/projected/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-kube-api-access-lbd8b\") pod \"neutron-855bf96656-swjds\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.166392 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.486530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"90c1a990-5286-4274-a26a-8740ee506a12","Type":"ContainerStarted","Data":"2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390"} Jan 21 15:21:30 crc kubenswrapper[4707]: I0121 15:21:30.702240 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-855bf96656-swjds"] Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.477447 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.480335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.496966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"90c1a990-5286-4274-a26a-8740ee506a12","Type":"ContainerStarted","Data":"1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447"} Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.497004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.498801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" event={"ID":"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e","Type":"ContainerStarted","Data":"41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51"} Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.498862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" event={"ID":"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e","Type":"ContainerStarted","Data":"9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686"} Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.498874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" event={"ID":"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e","Type":"ContainerStarted","Data":"dca8c21676d2517dfc849b4ad0a7ebb3326a9fcac4ed08cb9058357a9dcf254f"} Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.498961 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.498987 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.499738 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.540124 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" podStartSLOduration=2.540108873 podStartE2EDuration="2.540108873s" podCreationTimestamp="2026-01-21 15:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:31.539186858 +0000 UTC m=+1188.720703079" watchObservedRunningTime="2026-01-21 15:21:31.540108873 +0000 UTC m=+1188.721625094" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.547281 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.561468 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.561455948 podStartE2EDuration="3.561455948s" podCreationTimestamp="2026-01-21 15:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:31.554546771 +0000 UTC m=+1188.736062993" watchObservedRunningTime="2026-01-21 15:21:31.561455948 +0000 UTC m=+1188.742972170" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.622895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.629680 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.740593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.885324 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:31 crc kubenswrapper[4707]: I0121 15:21:31.924861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.506014 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="cinder-scheduler" containerID="cri-o://12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f" gracePeriod=30 Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.506050 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="probe" containerID="cri-o://608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da" gracePeriod=30 Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.573371 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-55976cc56-2tgtc"] Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.574723 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.576477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.576659 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.585004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-55976cc56-2tgtc"] Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data-custom\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-internal-tls-certs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-public-tls-certs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-combined-ca-bundle\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-logs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.766793 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl522\" (UniqueName: \"kubernetes.io/projected/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-kube-api-access-xl522\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " 
pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl522\" (UniqueName: \"kubernetes.io/projected/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-kube-api-access-xl522\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data-custom\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-internal-tls-certs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-public-tls-certs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-combined-ca-bundle\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.868923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-logs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.869282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-logs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.873217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-public-tls-certs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: 
\"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.873775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data-custom\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.874656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.878503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-combined-ca-bundle\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.881201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-internal-tls-certs\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:32 crc kubenswrapper[4707]: I0121 15:21:32.890041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl522\" (UniqueName: \"kubernetes.io/projected/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-kube-api-access-xl522\") pod \"barbican-api-55976cc56-2tgtc\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.189521 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.514304 4707 generic.go:334] "Generic (PLEG): container finished" podID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerID="608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da" exitCode=0 Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.514337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"aac82614-0eca-4386-8dd7-2ae364a95de1","Type":"ContainerDied","Data":"608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da"} Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.555427 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-55976cc56-2tgtc"] Jan 21 15:21:33 crc kubenswrapper[4707]: W0121 15:21:33.593872 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb9fece3_dd79_4ad1_a2bc_65f4c66a6c6e.slice/crio-ec7397528e2161c9bd690ee1124510074c0cbc3d4639f9045e8c178ecfaeedf4 WatchSource:0}: Error finding container ec7397528e2161c9bd690ee1124510074c0cbc3d4639f9045e8c178ecfaeedf4: Status 404 returned error can't find the container with id ec7397528e2161c9bd690ee1124510074c0cbc3d4639f9045e8c178ecfaeedf4 Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.819563 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.986782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhpqt\" (UniqueName: \"kubernetes.io/projected/aac82614-0eca-4386-8dd7-2ae364a95de1-kube-api-access-hhpqt\") pod \"aac82614-0eca-4386-8dd7-2ae364a95de1\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.986844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data-custom\") pod \"aac82614-0eca-4386-8dd7-2ae364a95de1\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.986963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-scripts\") pod \"aac82614-0eca-4386-8dd7-2ae364a95de1\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.987055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-combined-ca-bundle\") pod \"aac82614-0eca-4386-8dd7-2ae364a95de1\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.987071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aac82614-0eca-4386-8dd7-2ae364a95de1-etc-machine-id\") pod \"aac82614-0eca-4386-8dd7-2ae364a95de1\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.987096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data\") pod \"aac82614-0eca-4386-8dd7-2ae364a95de1\" (UID: \"aac82614-0eca-4386-8dd7-2ae364a95de1\") " Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.987225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aac82614-0eca-4386-8dd7-2ae364a95de1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aac82614-0eca-4386-8dd7-2ae364a95de1" (UID: "aac82614-0eca-4386-8dd7-2ae364a95de1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.987595 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aac82614-0eca-4386-8dd7-2ae364a95de1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.991747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-scripts" (OuterVolumeSpecName: "scripts") pod "aac82614-0eca-4386-8dd7-2ae364a95de1" (UID: "aac82614-0eca-4386-8dd7-2ae364a95de1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.992937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aac82614-0eca-4386-8dd7-2ae364a95de1" (UID: "aac82614-0eca-4386-8dd7-2ae364a95de1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:33 crc kubenswrapper[4707]: I0121 15:21:33.996008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac82614-0eca-4386-8dd7-2ae364a95de1-kube-api-access-hhpqt" (OuterVolumeSpecName: "kube-api-access-hhpqt") pod "aac82614-0eca-4386-8dd7-2ae364a95de1" (UID: "aac82614-0eca-4386-8dd7-2ae364a95de1"). InnerVolumeSpecName "kube-api-access-hhpqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.047556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aac82614-0eca-4386-8dd7-2ae364a95de1" (UID: "aac82614-0eca-4386-8dd7-2ae364a95de1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.072906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data" (OuterVolumeSpecName: "config-data") pod "aac82614-0eca-4386-8dd7-2ae364a95de1" (UID: "aac82614-0eca-4386-8dd7-2ae364a95de1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.088586 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.088613 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.088622 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhpqt\" (UniqueName: \"kubernetes.io/projected/aac82614-0eca-4386-8dd7-2ae364a95de1-kube-api-access-hhpqt\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.088631 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.088640 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aac82614-0eca-4386-8dd7-2ae364a95de1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.523466 4707 generic.go:334] "Generic (PLEG): container finished" podID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerID="12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f" exitCode=0 Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.523534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"aac82614-0eca-4386-8dd7-2ae364a95de1","Type":"ContainerDied","Data":"12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f"} Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.523563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"aac82614-0eca-4386-8dd7-2ae364a95de1","Type":"ContainerDied","Data":"14d131d2bbde0df99b1ef3f7df18a610fdb7c7397f2ab08e7fe9722254fc0c03"} Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.523579 4707 scope.go:117] "RemoveContainer" containerID="608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.524316 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.525132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" event={"ID":"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e","Type":"ContainerStarted","Data":"90be3266ccbe5e96e44486114865333b48d5affe4c0b643d4de2af49390b91a5"} Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.525191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" event={"ID":"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e","Type":"ContainerStarted","Data":"d959b8af71e65676b69991aa2187dfa38383cb263af1e369f5864d8fe2893795"} Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.525205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" event={"ID":"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e","Type":"ContainerStarted","Data":"ec7397528e2161c9bd690ee1124510074c0cbc3d4639f9045e8c178ecfaeedf4"} Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.525243 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.525272 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.553987 4707 scope.go:117] "RemoveContainer" containerID="12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.567874 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" podStartSLOduration=2.567859501 podStartE2EDuration="2.567859501s" podCreationTimestamp="2026-01-21 15:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:34.561004416 +0000 UTC m=+1191.742520648" watchObservedRunningTime="2026-01-21 15:21:34.567859501 +0000 UTC m=+1191.749375723" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.579325 4707 scope.go:117] "RemoveContainer" containerID="608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.579397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:34 crc kubenswrapper[4707]: E0121 15:21:34.579599 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da\": container with ID starting with 608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da not found: ID does not exist" containerID="608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.579622 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da"} err="failed to get container status \"608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da\": rpc error: code = NotFound desc = could not find container \"608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da\": container with ID starting with 
608c23565cba5d7cbda3601089f15dac173b8e17021c6687ce2422e7494ec9da not found: ID does not exist" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.579637 4707 scope.go:117] "RemoveContainer" containerID="12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f" Jan 21 15:21:34 crc kubenswrapper[4707]: E0121 15:21:34.579832 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f\": container with ID starting with 12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f not found: ID does not exist" containerID="12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.579853 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f"} err="failed to get container status \"12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f\": rpc error: code = NotFound desc = could not find container \"12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f\": container with ID starting with 12b1ad3abe09018cf07d3ebb59215f6a19d848c5a071177e8692c74a9453ba7f not found: ID does not exist" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.580913 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.594658 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:34 crc kubenswrapper[4707]: E0121 15:21:34.596802 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="probe" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.596842 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="probe" Jan 21 15:21:34 crc kubenswrapper[4707]: E0121 15:21:34.596855 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="cinder-scheduler" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.596864 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="cinder-scheduler" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.597060 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="probe" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.597083 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" containerName="cinder-scheduler" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.597880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.604064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.619417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.701569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.701607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2km\" (UniqueName: \"kubernetes.io/projected/3c937d1c-d74a-4296-a78d-c88deef6b011-kube-api-access-dw2km\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.701668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.701685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.701724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.701776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c937d1c-d74a-4296-a78d-c88deef6b011-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c937d1c-d74a-4296-a78d-c88deef6b011-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c937d1c-d74a-4296-a78d-c88deef6b011-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw2km\" (UniqueName: \"kubernetes.io/projected/3c937d1c-d74a-4296-a78d-c88deef6b011-kube-api-access-dw2km\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.803884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.806321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.806504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.806534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.806646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.830736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dw2km\" (UniqueName: \"kubernetes.io/projected/3c937d1c-d74a-4296-a78d-c88deef6b011-kube-api-access-dw2km\") pod \"cinder-scheduler-0\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:34 crc kubenswrapper[4707]: I0121 15:21:34.910020 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:35 crc kubenswrapper[4707]: I0121 15:21:35.190184 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac82614-0eca-4386-8dd7-2ae364a95de1" path="/var/lib/kubelet/pods/aac82614-0eca-4386-8dd7-2ae364a95de1/volumes" Jan 21 15:21:35 crc kubenswrapper[4707]: I0121 15:21:35.272131 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:21:35 crc kubenswrapper[4707]: W0121 15:21:35.275342 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c937d1c_d74a_4296_a78d_c88deef6b011.slice/crio-2e7ec64e38025020b723d6365c387b0f6466491d9052919899b2096edc5ee2cd WatchSource:0}: Error finding container 2e7ec64e38025020b723d6365c387b0f6466491d9052919899b2096edc5ee2cd: Status 404 returned error can't find the container with id 2e7ec64e38025020b723d6365c387b0f6466491d9052919899b2096edc5ee2cd Jan 21 15:21:35 crc kubenswrapper[4707]: I0121 15:21:35.537189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c937d1c-d74a-4296-a78d-c88deef6b011","Type":"ContainerStarted","Data":"2e7ec64e38025020b723d6365c387b0f6466491d9052919899b2096edc5ee2cd"} Jan 21 15:21:36 crc kubenswrapper[4707]: I0121 15:21:36.546674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c937d1c-d74a-4296-a78d-c88deef6b011","Type":"ContainerStarted","Data":"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e"} Jan 21 15:21:36 crc kubenswrapper[4707]: I0121 15:21:36.546943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c937d1c-d74a-4296-a78d-c88deef6b011","Type":"ContainerStarted","Data":"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444"} Jan 21 15:21:36 crc kubenswrapper[4707]: I0121 15:21:36.569319 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.569305119 podStartE2EDuration="2.569305119s" podCreationTimestamp="2026-01-21 15:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:21:36.564056867 +0000 UTC m=+1193.745573089" watchObservedRunningTime="2026-01-21 15:21:36.569305119 +0000 UTC m=+1193.750821341" Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.435688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.459833 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.518831 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb4888b88-mz44p"] Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.519055 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api-log" containerID="cri-o://b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499" gracePeriod=30 Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.519195 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api" containerID="cri-o://ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1" gracePeriod=30 Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.910938 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.945558 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:21:39 crc kubenswrapper[4707]: I0121 15:21:39.945766 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:21:40 crc kubenswrapper[4707]: I0121 15:21:40.413343 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:21:40 crc kubenswrapper[4707]: I0121 15:21:40.579346 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf2ca736-6fb3-48fd-b827-17934d478419" containerID="b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499" exitCode=143 Jan 21 15:21:40 crc kubenswrapper[4707]: I0121 15:21:40.579382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" event={"ID":"cf2ca736-6fb3-48fd-b827-17934d478419","Type":"ContainerDied","Data":"b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499"} Jan 21 15:21:42 crc kubenswrapper[4707]: I0121 15:21:42.655972 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.239:9311/healthcheck\": read tcp 10.217.0.2:59626->10.217.0.239:9311: read: connection reset by peer" Jan 21 15:21:42 crc kubenswrapper[4707]: I0121 15:21:42.656585 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.239:9311/healthcheck\": read tcp 10.217.0.2:59622->10.217.0.239:9311: read: connection reset by peer" Jan 21 15:21:42 crc kubenswrapper[4707]: I0121 15:21:42.997861 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.134832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-combined-ca-bundle\") pod \"cf2ca736-6fb3-48fd-b827-17934d478419\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.134941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data\") pod \"cf2ca736-6fb3-48fd-b827-17934d478419\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.135029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data-custom\") pod \"cf2ca736-6fb3-48fd-b827-17934d478419\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.135051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/cf2ca736-6fb3-48fd-b827-17934d478419-kube-api-access-6pqqt\") pod \"cf2ca736-6fb3-48fd-b827-17934d478419\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.135119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ca736-6fb3-48fd-b827-17934d478419-logs\") pod \"cf2ca736-6fb3-48fd-b827-17934d478419\" (UID: \"cf2ca736-6fb3-48fd-b827-17934d478419\") " Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.135544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2ca736-6fb3-48fd-b827-17934d478419-logs" (OuterVolumeSpecName: "logs") pod "cf2ca736-6fb3-48fd-b827-17934d478419" (UID: "cf2ca736-6fb3-48fd-b827-17934d478419"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.135681 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ca736-6fb3-48fd-b827-17934d478419-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.139994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf2ca736-6fb3-48fd-b827-17934d478419" (UID: "cf2ca736-6fb3-48fd-b827-17934d478419"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.140223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2ca736-6fb3-48fd-b827-17934d478419-kube-api-access-6pqqt" (OuterVolumeSpecName: "kube-api-access-6pqqt") pod "cf2ca736-6fb3-48fd-b827-17934d478419" (UID: "cf2ca736-6fb3-48fd-b827-17934d478419"). InnerVolumeSpecName "kube-api-access-6pqqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.156531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf2ca736-6fb3-48fd-b827-17934d478419" (UID: "cf2ca736-6fb3-48fd-b827-17934d478419"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.196145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data" (OuterVolumeSpecName: "config-data") pod "cf2ca736-6fb3-48fd-b827-17934d478419" (UID: "cf2ca736-6fb3-48fd-b827-17934d478419"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.236700 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.236732 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.236742 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/cf2ca736-6fb3-48fd-b827-17934d478419-kube-api-access-6pqqt\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.236752 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ca736-6fb3-48fd-b827-17934d478419-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.611694 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf2ca736-6fb3-48fd-b827-17934d478419" containerID="ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1" exitCode=0 Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.611745 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.611762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" event={"ID":"cf2ca736-6fb3-48fd-b827-17934d478419","Type":"ContainerDied","Data":"ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1"} Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.612159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-fb4888b88-mz44p" event={"ID":"cf2ca736-6fb3-48fd-b827-17934d478419","Type":"ContainerDied","Data":"c3de0a15d39aa2eb90940c9527ea5e94131c99be22da1c90edac2aafa08f88e6"} Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.612178 4707 scope.go:117] "RemoveContainer" containerID="ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.636470 4707 scope.go:117] "RemoveContainer" containerID="b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.643411 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb4888b88-mz44p"] Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.649234 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb4888b88-mz44p"] Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.656727 4707 scope.go:117] "RemoveContainer" containerID="ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1" Jan 21 15:21:43 crc kubenswrapper[4707]: E0121 15:21:43.657621 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1\": container with ID starting with ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1 not found: ID does not exist" containerID="ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.657660 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1"} err="failed to get container status \"ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1\": rpc error: code = NotFound desc = could not find container \"ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1\": container with ID starting with ecc694650b57109a6d5ad16c341e259d3fc4cbcd4cd1dfd09a1875d127c440f1 not found: ID does not exist" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.657702 4707 scope.go:117] "RemoveContainer" containerID="b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499" Jan 21 15:21:43 crc kubenswrapper[4707]: E0121 15:21:43.657961 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499\": container with ID starting with b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499 not found: ID does not exist" containerID="b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499" Jan 21 15:21:43 crc kubenswrapper[4707]: I0121 15:21:43.657989 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499"} 
err="failed to get container status \"b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499\": rpc error: code = NotFound desc = could not find container \"b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499\": container with ID starting with b45931530d76456a54275e750ff1006b1520ce0484a07a400f68f7208b3aa499 not found: ID does not exist" Jan 21 15:21:44 crc kubenswrapper[4707]: I0121 15:21:44.814321 4707 scope.go:117] "RemoveContainer" containerID="f213684e000592d6c88fd06d620047c601b5465bd66e1c2eae23b0ba1f084b37" Jan 21 15:21:44 crc kubenswrapper[4707]: I0121 15:21:44.832758 4707 scope.go:117] "RemoveContainer" containerID="30cfdca99688f8422580aaa82fb53297290e4107488f50f74bafac30cdb605e2" Jan 21 15:21:44 crc kubenswrapper[4707]: I0121 15:21:44.871262 4707 scope.go:117] "RemoveContainer" containerID="93601405e0699ec0ef7e3daeaae4615c531db22fb7b3ae9899ac20bde048fe28" Jan 21 15:21:45 crc kubenswrapper[4707]: I0121 15:21:45.102734 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:21:45 crc kubenswrapper[4707]: I0121 15:21:45.192096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" path="/var/lib/kubelet/pods/cf2ca736-6fb3-48fd-b827-17934d478419/volumes" Jan 21 15:21:50 crc kubenswrapper[4707]: I0121 15:21:50.284537 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:50 crc kubenswrapper[4707]: I0121 15:21:50.295766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.298182 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod685bf29d-d0cc-43bc-9d01-6585cc8b0f18"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod685bf29d-d0cc-43bc-9d01-6585cc8b0f18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod685bf29d_d0cc_43bc_9d01_6585cc8b0f18.slice" Jan 21 15:21:51 crc kubenswrapper[4707]: E0121 15:21:51.298236 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod685bf29d-d0cc-43bc-9d01-6585cc8b0f18] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod685bf29d-d0cc-43bc-9d01-6585cc8b0f18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod685bf29d_d0cc_43bc_9d01_6585cc8b0f18.slice" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" podUID="685bf29d-d0cc-43bc-9d01-6585cc8b0f18" Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.567722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.568702 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.618612 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-79b5787b96-vlq6n"] Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.665076 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7rjvw" Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.665458 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-log" containerID="cri-o://923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99" gracePeriod=30 Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.665504 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-api" containerID="cri-o://27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612" gracePeriod=30 Jan 21 15:21:51 crc kubenswrapper[4707]: I0121 15:21:51.688889 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:21:52 crc kubenswrapper[4707]: I0121 15:21:52.097840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:21:52 crc kubenswrapper[4707]: I0121 15:21:52.674269 4707 generic.go:334] "Generic (PLEG): container finished" podID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerID="923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99" exitCode=143 Jan 21 15:21:52 crc kubenswrapper[4707]: I0121 15:21:52.674350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" event={"ID":"d380df51-2feb-4db3-ac56-f2dc791690c9","Type":"ContainerDied","Data":"923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99"} Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.013271 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.021590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-config-data\") pod \"d380df51-2feb-4db3-ac56-f2dc791690c9\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.056083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-config-data" (OuterVolumeSpecName: "config-data") pod "d380df51-2feb-4db3-ac56-f2dc791690c9" (UID: "d380df51-2feb-4db3-ac56-f2dc791690c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.123356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-scripts\") pod \"d380df51-2feb-4db3-ac56-f2dc791690c9\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.123632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d380df51-2feb-4db3-ac56-f2dc791690c9-logs\") pod \"d380df51-2feb-4db3-ac56-f2dc791690c9\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.123689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfzz\" (UniqueName: \"kubernetes.io/projected/d380df51-2feb-4db3-ac56-f2dc791690c9-kube-api-access-6kfzz\") pod \"d380df51-2feb-4db3-ac56-f2dc791690c9\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.123775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-combined-ca-bundle\") pod \"d380df51-2feb-4db3-ac56-f2dc791690c9\" (UID: \"d380df51-2feb-4db3-ac56-f2dc791690c9\") " Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.124024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d380df51-2feb-4db3-ac56-f2dc791690c9-logs" (OuterVolumeSpecName: "logs") pod "d380df51-2feb-4db3-ac56-f2dc791690c9" (UID: "d380df51-2feb-4db3-ac56-f2dc791690c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.124294 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d380df51-2feb-4db3-ac56-f2dc791690c9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.124320 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.126350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-scripts" (OuterVolumeSpecName: "scripts") pod "d380df51-2feb-4db3-ac56-f2dc791690c9" (UID: "d380df51-2feb-4db3-ac56-f2dc791690c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.126499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d380df51-2feb-4db3-ac56-f2dc791690c9-kube-api-access-6kfzz" (OuterVolumeSpecName: "kube-api-access-6kfzz") pod "d380df51-2feb-4db3-ac56-f2dc791690c9" (UID: "d380df51-2feb-4db3-ac56-f2dc791690c9"). InnerVolumeSpecName "kube-api-access-6kfzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.158304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d380df51-2feb-4db3-ac56-f2dc791690c9" (UID: "d380df51-2feb-4db3-ac56-f2dc791690c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.225912 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.225939 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d380df51-2feb-4db3-ac56-f2dc791690c9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.225948 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kfzz\" (UniqueName: \"kubernetes.io/projected/d380df51-2feb-4db3-ac56-f2dc791690c9-kube-api-access-6kfzz\") on node \"crc\" DevicePath \"\"" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.695103 4707 generic.go:334] "Generic (PLEG): container finished" podID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerID="27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612" exitCode=0 Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.695197 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.695193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" event={"ID":"d380df51-2feb-4db3-ac56-f2dc791690c9","Type":"ContainerDied","Data":"27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612"} Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.695446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b5787b96-vlq6n" event={"ID":"d380df51-2feb-4db3-ac56-f2dc791690c9","Type":"ContainerDied","Data":"cf132100523696cf3d340712971e050843538e96f77e623d017912998013fd5c"} Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.695464 4707 scope.go:117] "RemoveContainer" containerID="27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.713895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-79b5787b96-vlq6n"] Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.719263 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-79b5787b96-vlq6n"] Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.722042 4707 scope.go:117] "RemoveContainer" containerID="923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.737363 4707 scope.go:117] "RemoveContainer" containerID="27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612" Jan 21 15:21:55 crc kubenswrapper[4707]: E0121 15:21:55.737616 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612\": 
container with ID starting with 27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612 not found: ID does not exist" containerID="27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.737710 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612"} err="failed to get container status \"27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612\": rpc error: code = NotFound desc = could not find container \"27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612\": container with ID starting with 27fe536fb02ea05db5c422ea0e866ecdcac5b0d4d8492ac36195caf4a39a3612 not found: ID does not exist" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.737789 4707 scope.go:117] "RemoveContainer" containerID="923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99" Jan 21 15:21:55 crc kubenswrapper[4707]: E0121 15:21:55.738164 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99\": container with ID starting with 923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99 not found: ID does not exist" containerID="923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99" Jan 21 15:21:55 crc kubenswrapper[4707]: I0121 15:21:55.738233 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99"} err="failed to get container status \"923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99\": rpc error: code = NotFound desc = could not find container \"923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99\": container with ID starting with 923387058ff942f047b5ba251f607182d1228b2cce474fd2d227328993d79f99 not found: ID does not exist" Jan 21 15:21:57 crc kubenswrapper[4707]: I0121 15:21:57.190781 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" path="/var/lib/kubelet/pods/d380df51-2feb-4db3-ac56-f2dc791690c9/volumes" Jan 21 15:21:59 crc kubenswrapper[4707]: I0121 15:21:59.233334 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:22:00 crc kubenswrapper[4707]: I0121 15:22:00.177420 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:22:00 crc kubenswrapper[4707]: I0121 15:22:00.231866 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-d5c68ff65-qkk97"] Jan 21 15:22:00 crc kubenswrapper[4707]: I0121 15:22:00.232074 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-api" containerID="cri-o://e555f5b33a22f3ba35e49640ecdfcebc760af82162eb8d8cac885aa76cc9b7b2" gracePeriod=30 Jan 21 15:22:00 crc kubenswrapper[4707]: I0121 15:22:00.232148 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-httpd" 
containerID="cri-o://16208ea82b812a06cbd8678b6432f56905a4fec41f434ee8846390c6c7f1477c" gracePeriod=30 Jan 21 15:22:00 crc kubenswrapper[4707]: I0121 15:22:00.727997 4707 generic.go:334] "Generic (PLEG): container finished" podID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerID="16208ea82b812a06cbd8678b6432f56905a4fec41f434ee8846390c6c7f1477c" exitCode=0 Jan 21 15:22:00 crc kubenswrapper[4707]: I0121 15:22:00.728065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" event={"ID":"9f52f02f-4ec5-4642-a742-004ac2668fe8","Type":"ContainerDied","Data":"16208ea82b812a06cbd8678b6432f56905a4fec41f434ee8846390c6c7f1477c"} Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582237 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:22:02 crc kubenswrapper[4707]: E0121 15:22:02.582564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api-log" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582578 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api-log" Jan 21 15:22:02 crc kubenswrapper[4707]: E0121 15:22:02.582592 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-api" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582597 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-api" Jan 21 15:22:02 crc kubenswrapper[4707]: E0121 15:22:02.582613 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582619 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api" Jan 21 15:22:02 crc kubenswrapper[4707]: E0121 15:22:02.582629 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-log" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582634 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-log" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582803 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-api" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582827 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d380df51-2feb-4db3-ac56-f2dc791690c9" containerName="placement-log" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.582837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2ca736-6fb3-48fd-b827-17934d478419" containerName="barbican-api-log" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.583302 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.584582 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-mhgcw" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.584901 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.585373 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.593231 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.733978 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.734192 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-central-agent" containerID="cri-o://ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e" gracePeriod=30 Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.734286 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-notification-agent" containerID="cri-o://c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf" gracePeriod=30 Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.734265 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="proxy-httpd" containerID="cri-o://ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6" gracePeriod=30 Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.734280 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="sg-core" containerID="cri-o://41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595" gracePeriod=30 Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.739492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.739607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9mb\" (UniqueName: \"kubernetes.io/projected/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-kube-api-access-fz9mb\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.739629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config\") pod \"openstackclient\" (UID: 
\"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.739654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.841583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.841686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9mb\" (UniqueName: \"kubernetes.io/projected/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-kube-api-access-fz9mb\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.841706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.841730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.846147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.846873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.847982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.857660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9mb\" (UniqueName: \"kubernetes.io/projected/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-kube-api-access-fz9mb\") pod \"openstackclient\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 
15:22:02 crc kubenswrapper[4707]: I0121 15:22:02.900728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.194125 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.750573 4707 generic.go:334] "Generic (PLEG): container finished" podID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerID="ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6" exitCode=0 Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.750755 4707 generic.go:334] "Generic (PLEG): container finished" podID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerID="41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595" exitCode=2 Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.750765 4707 generic.go:334] "Generic (PLEG): container finished" podID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerID="ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e" exitCode=0 Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.750701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerDied","Data":"ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6"} Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.750836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerDied","Data":"41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595"} Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.750850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerDied","Data":"ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e"} Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.757272 4707 generic.go:334] "Generic (PLEG): container finished" podID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerID="e555f5b33a22f3ba35e49640ecdfcebc760af82162eb8d8cac885aa76cc9b7b2" exitCode=0 Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.757317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" event={"ID":"9f52f02f-4ec5-4642-a742-004ac2668fe8","Type":"ContainerDied","Data":"e555f5b33a22f3ba35e49640ecdfcebc760af82162eb8d8cac885aa76cc9b7b2"} Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.759019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa","Type":"ContainerStarted","Data":"3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a"} Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.759040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa","Type":"ContainerStarted","Data":"85bd6607cc2e7c819596219f167e0eadc7994bc2f6c5ddf4294fad7af635f8e4"} Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.916732 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:22:03 crc kubenswrapper[4707]: I0121 15:22:03.934054 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.934040231 podStartE2EDuration="1.934040231s" podCreationTimestamp="2026-01-21 15:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:03.772129132 +0000 UTC m=+1220.953645354" watchObservedRunningTime="2026-01-21 15:22:03.934040231 +0000 UTC m=+1221.115556453" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.063380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9q5j\" (UniqueName: \"kubernetes.io/projected/9f52f02f-4ec5-4642-a742-004ac2668fe8-kube-api-access-d9q5j\") pod \"9f52f02f-4ec5-4642-a742-004ac2668fe8\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.063460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-combined-ca-bundle\") pod \"9f52f02f-4ec5-4642-a742-004ac2668fe8\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.063509 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-config\") pod \"9f52f02f-4ec5-4642-a742-004ac2668fe8\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.063566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-httpd-config\") pod \"9f52f02f-4ec5-4642-a742-004ac2668fe8\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.063609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-ovndb-tls-certs\") pod \"9f52f02f-4ec5-4642-a742-004ac2668fe8\" (UID: \"9f52f02f-4ec5-4642-a742-004ac2668fe8\") " Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.067303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f52f02f-4ec5-4642-a742-004ac2668fe8-kube-api-access-d9q5j" (OuterVolumeSpecName: "kube-api-access-d9q5j") pod "9f52f02f-4ec5-4642-a742-004ac2668fe8" (UID: "9f52f02f-4ec5-4642-a742-004ac2668fe8"). InnerVolumeSpecName "kube-api-access-d9q5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.075937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9f52f02f-4ec5-4642-a742-004ac2668fe8" (UID: "9f52f02f-4ec5-4642-a742-004ac2668fe8"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.100140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-config" (OuterVolumeSpecName: "config") pod "9f52f02f-4ec5-4642-a742-004ac2668fe8" (UID: "9f52f02f-4ec5-4642-a742-004ac2668fe8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.102271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f52f02f-4ec5-4642-a742-004ac2668fe8" (UID: "9f52f02f-4ec5-4642-a742-004ac2668fe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.131951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9f52f02f-4ec5-4642-a742-004ac2668fe8" (UID: "9f52f02f-4ec5-4642-a742-004ac2668fe8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.165465 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9q5j\" (UniqueName: \"kubernetes.io/projected/9f52f02f-4ec5-4642-a742-004ac2668fe8-kube-api-access-d9q5j\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.165495 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.165505 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.165515 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.165524 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f52f02f-4ec5-4642-a742-004ac2668fe8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.686863 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l"] Jan 21 15:22:04 crc kubenswrapper[4707]: E0121 15:22:04.690771 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-httpd" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.690796 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-httpd" Jan 21 15:22:04 crc kubenswrapper[4707]: E0121 15:22:04.690827 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-api" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.690834 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-api" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.691229 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-httpd" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.691269 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" containerName="neutron-api" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.695619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.697457 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.699507 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.710591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l"] Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.710648 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.771156 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.771194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d5c68ff65-qkk97" event={"ID":"9f52f02f-4ec5-4642-a742-004ac2668fe8","Type":"ContainerDied","Data":"98c39d949d751211f34cdf48694908dcfa4e28b0f99bd1b9cfb12803b7fb15bd"} Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.771225 4707 scope.go:117] "RemoveContainer" containerID="16208ea82b812a06cbd8678b6432f56905a4fec41f434ee8846390c6c7f1477c" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.787974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b267z\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-kube-api-access-b267z\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-etc-swift\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-run-httpd\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-internal-tls-certs\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-combined-ca-bundle\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-config-data\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-log-httpd\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.788226 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-public-tls-certs\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.789344 4707 scope.go:117] "RemoveContainer" containerID="e555f5b33a22f3ba35e49640ecdfcebc760af82162eb8d8cac885aa76cc9b7b2" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.802194 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-d5c68ff65-qkk97"] Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.809388 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-d5c68ff65-qkk97"] Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-public-tls-certs\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b267z\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-kube-api-access-b267z\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-etc-swift\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: 
\"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-run-httpd\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-internal-tls-certs\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-combined-ca-bundle\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-config-data\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.888954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-log-httpd\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.889310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-log-httpd\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.889917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-run-httpd\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.892878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-public-tls-certs\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.893186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-etc-swift\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: 
\"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.893445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-combined-ca-bundle\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.894038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-config-data\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.900343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-internal-tls-certs\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:04 crc kubenswrapper[4707]: I0121 15:22:04.903191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b267z\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-kube-api-access-b267z\") pod \"swift-proxy-6f8b448588-x2v5l\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.029050 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.190846 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f52f02f-4ec5-4642-a742-004ac2668fe8" path="/var/lib/kubelet/pods/9f52f02f-4ec5-4642-a742-004ac2668fe8/volumes" Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.404366 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l"] Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.788912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" event={"ID":"00ff5dfe-af7e-4827-949e-926b7a41731b","Type":"ContainerStarted","Data":"383bb9acff21cf11ac127e42ba50d992286bc0e2ae3f73e405b397f0688b28c8"} Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.789130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" event={"ID":"00ff5dfe-af7e-4827-949e-926b7a41731b","Type":"ContainerStarted","Data":"7b5e320769635ae68967dbf380846b5fcaacb83053ce0325c620c2d8aacab67e"} Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.789142 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" event={"ID":"00ff5dfe-af7e-4827-949e-926b7a41731b","Type":"ContainerStarted","Data":"38686d7331429b59d331120099879c8b5e6bc17322a5c249f322d7268652131c"} Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.789574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:05 crc kubenswrapper[4707]: I0121 15:22:05.805913 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" podStartSLOduration=1.805898353 podStartE2EDuration="1.805898353s" podCreationTimestamp="2026-01-21 15:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:05.801538021 +0000 UTC m=+1222.983054244" watchObservedRunningTime="2026-01-21 15:22:05.805898353 +0000 UTC m=+1222.987414575" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.194344 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-log-httpd\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-config-data\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-combined-ca-bundle\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbxw4\" (UniqueName: \"kubernetes.io/projected/8216f328-2076-4cd7-a713-1ac9d9a48557-kube-api-access-nbxw4\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-sg-core-conf-yaml\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-run-httpd\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.207610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-scripts\") pod \"8216f328-2076-4cd7-a713-1ac9d9a48557\" (UID: \"8216f328-2076-4cd7-a713-1ac9d9a48557\") " Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.208771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.213349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.213841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8216f328-2076-4cd7-a713-1ac9d9a48557-kube-api-access-nbxw4" (OuterVolumeSpecName: "kube-api-access-nbxw4") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "kube-api-access-nbxw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.220303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-scripts" (OuterVolumeSpecName: "scripts") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.253352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.277794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.289517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-config-data" (OuterVolumeSpecName: "config-data") pod "8216f328-2076-4cd7-a713-1ac9d9a48557" (UID: "8216f328-2076-4cd7-a713-1ac9d9a48557"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308534 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308557 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308570 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbxw4\" (UniqueName: \"kubernetes.io/projected/8216f328-2076-4cd7-a713-1ac9d9a48557-kube-api-access-nbxw4\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308579 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308587 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308594 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8216f328-2076-4cd7-a713-1ac9d9a48557-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.308601 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8216f328-2076-4cd7-a713-1ac9d9a48557-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.801293 4707 generic.go:334] "Generic (PLEG): container finished" podID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerID="c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf" exitCode=0 Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.802083 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.806531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerDied","Data":"c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf"} Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.806616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8216f328-2076-4cd7-a713-1ac9d9a48557","Type":"ContainerDied","Data":"b603e7cb3adfb798ffa88529cc2462b503a82c97cdc54a4a3e8d432197604bb1"} Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.806639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.806663 4707 scope.go:117] "RemoveContainer" containerID="ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.823511 4707 scope.go:117] "RemoveContainer" containerID="41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.832188 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.838870 4707 scope.go:117] "RemoveContainer" containerID="c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.847960 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.854542 4707 scope.go:117] "RemoveContainer" containerID="ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855202 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.855477 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-central-agent" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855493 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-central-agent" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.855512 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="sg-core" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="sg-core" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.855543 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="proxy-httpd" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855548 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="proxy-httpd" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.855558 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-notification-agent" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855564 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-notification-agent" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855698 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="proxy-httpd" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855708 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-notification-agent" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855715 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="sg-core" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.855730 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" containerName="ceilometer-central-agent" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.857003 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.858649 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.858800 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.870573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.881339 4707 scope.go:117] "RemoveContainer" containerID="ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.887234 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6\": container with ID starting with ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6 not found: ID does not exist" containerID="ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.887274 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6"} err="failed to get container status \"ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6\": rpc error: code = NotFound desc = could not find container \"ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6\": container with ID starting with ae399ffe22caabb1e1a13238f6776021badeeacd6f69e4a66f2c7885d52f07f6 not found: ID does not exist" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.887296 4707 scope.go:117] "RemoveContainer" containerID="41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.887541 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595\": container with ID starting with 41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595 not found: ID does not exist" containerID="41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595" Jan 21 15:22:06 crc 
kubenswrapper[4707]: I0121 15:22:06.887559 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595"} err="failed to get container status \"41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595\": rpc error: code = NotFound desc = could not find container \"41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595\": container with ID starting with 41f8fcd7c51835d9b2213c7a8e5a800c327b1e4d5bb3cb371f771ec5fc813595 not found: ID does not exist" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.887572 4707 scope.go:117] "RemoveContainer" containerID="c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.887743 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf\": container with ID starting with c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf not found: ID does not exist" containerID="c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.887757 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf"} err="failed to get container status \"c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf\": rpc error: code = NotFound desc = could not find container \"c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf\": container with ID starting with c30fbe9d9daa379e72a258d1281995c3c14310526e9f6d5e65de0d16870e89bf not found: ID does not exist" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.887771 4707 scope.go:117] "RemoveContainer" containerID="ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e" Jan 21 15:22:06 crc kubenswrapper[4707]: E0121 15:22:06.887950 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e\": container with ID starting with ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e not found: ID does not exist" containerID="ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.887971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e"} err="failed to get container status \"ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e\": rpc error: code = NotFound desc = could not find container \"ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e\": container with ID starting with ca4be961c3fc0a390454590504ac8c98ca6059c285133b4366611b2af68f153e not found: ID does not exist" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.917460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-log-httpd\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.917578 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-config-data\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.917703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-run-httpd\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.917898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x4vq\" (UniqueName: \"kubernetes.io/projected/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-kube-api-access-5x4vq\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.918005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-scripts\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.918150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:06 crc kubenswrapper[4707]: I0121 15:22:06.918180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-log-httpd\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-config-data\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-run-httpd\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x4vq\" (UniqueName: 
\"kubernetes.io/projected/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-kube-api-access-5x4vq\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-scripts\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.019610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.020156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-run-httpd\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.020723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-log-httpd\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.024298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-scripts\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.027157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.034046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-config-data\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.034104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x4vq\" (UniqueName: \"kubernetes.io/projected/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-kube-api-access-5x4vq\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.034208 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.179044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.190067 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8216f328-2076-4cd7-a713-1ac9d9a48557" path="/var/lib/kubelet/pods/8216f328-2076-4cd7-a713-1ac9d9a48557/volumes" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.196475 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.196657 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-log" containerID="cri-o://729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2" gracePeriod=30 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.196770 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-httpd" containerID="cri-o://577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86" gracePeriod=30 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.584471 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:07 crc kubenswrapper[4707]: W0121 15:22:07.584466 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e71e2a7_d752_4ad2_a739_bd0d3cc2ec7c.slice/crio-b3514b475dcee7bc7b9b5399a141af9508530f2832be80372e1291ef7391a946 WatchSource:0}: Error finding container b3514b475dcee7bc7b9b5399a141af9508530f2832be80372e1291ef7391a946: Status 404 returned error can't find the container with id b3514b475dcee7bc7b9b5399a141af9508530f2832be80372e1291ef7391a946 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.649475 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-dwf5z"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.651268 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.661484 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-dwf5z"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.691123 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.691507 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-log" containerID="cri-o://71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374" gracePeriod=30 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.691988 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-httpd" containerID="cri-o://138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610" gracePeriod=30 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.738596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vzw\" (UniqueName: \"kubernetes.io/projected/863c9468-2e71-46c6-b6ab-212a5591921d-kube-api-access-68vzw\") pod \"nova-api-db-create-dwf5z\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.738702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/863c9468-2e71-46c6-b6ab-212a5591921d-operator-scripts\") pod \"nova-api-db-create-dwf5z\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.753856 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z4djf"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.755038 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.766133 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z4djf"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.815871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerStarted","Data":"b3514b475dcee7bc7b9b5399a141af9508530f2832be80372e1291ef7391a946"} Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.817842 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6243463-c1a1-42a5-8697-11a221d45be7" containerID="71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374" exitCode=143 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.817964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b6243463-c1a1-42a5-8697-11a221d45be7","Type":"ContainerDied","Data":"71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374"} Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.819288 4707 generic.go:334] "Generic (PLEG): container finished" podID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerID="729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2" exitCode=143 Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.819337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"47279210-6924-452f-abb7-8d5d7ff90ec9","Type":"ContainerDied","Data":"729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2"} Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.840465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vzw\" (UniqueName: \"kubernetes.io/projected/863c9468-2e71-46c6-b6ab-212a5591921d-kube-api-access-68vzw\") pod \"nova-api-db-create-dwf5z\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.840833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/863c9468-2e71-46c6-b6ab-212a5591921d-operator-scripts\") pod \"nova-api-db-create-dwf5z\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.841007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcf62a-53d8-4003-8e75-747b6e768ffd-operator-scripts\") pod \"nova-cell0-db-create-z4djf\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.841111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn8j9\" (UniqueName: \"kubernetes.io/projected/0ffcf62a-53d8-4003-8e75-747b6e768ffd-kube-api-access-gn8j9\") pod \"nova-cell0-db-create-z4djf\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.842214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/863c9468-2e71-46c6-b6ab-212a5591921d-operator-scripts\") pod \"nova-api-db-create-dwf5z\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.849079 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-msrjr"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.849951 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.855789 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.856780 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.859254 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.859479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vzw\" (UniqueName: \"kubernetes.io/projected/863c9468-2e71-46c6-b6ab-212a5591921d-kube-api-access-68vzw\") pod \"nova-api-db-create-dwf5z\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.862800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-msrjr"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.869914 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l"] Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.942431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f95w\" (UniqueName: \"kubernetes.io/projected/15a1405c-3d18-46d7-888c-2273a3ea7e6e-kube-api-access-9f95w\") pod \"nova-api-e09b-account-create-update-fns8l\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.942478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qn9\" (UniqueName: \"kubernetes.io/projected/000e0803-0c00-4afa-863f-23269a570787-kube-api-access-67qn9\") pod \"nova-cell1-db-create-msrjr\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.942549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/000e0803-0c00-4afa-863f-23269a570787-operator-scripts\") pod \"nova-cell1-db-create-msrjr\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.942576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcf62a-53d8-4003-8e75-747b6e768ffd-operator-scripts\") pod \"nova-cell0-db-create-z4djf\" (UID: 
\"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.942603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15a1405c-3d18-46d7-888c-2273a3ea7e6e-operator-scripts\") pod \"nova-api-e09b-account-create-update-fns8l\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.942623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn8j9\" (UniqueName: \"kubernetes.io/projected/0ffcf62a-53d8-4003-8e75-747b6e768ffd-kube-api-access-gn8j9\") pod \"nova-cell0-db-create-z4djf\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.943394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcf62a-53d8-4003-8e75-747b6e768ffd-operator-scripts\") pod \"nova-cell0-db-create-z4djf\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.955768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn8j9\" (UniqueName: \"kubernetes.io/projected/0ffcf62a-53d8-4003-8e75-747b6e768ffd-kube-api-access-gn8j9\") pod \"nova-cell0-db-create-z4djf\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:07 crc kubenswrapper[4707]: I0121 15:22:07.966058 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.043794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f95w\" (UniqueName: \"kubernetes.io/projected/15a1405c-3d18-46d7-888c-2273a3ea7e6e-kube-api-access-9f95w\") pod \"nova-api-e09b-account-create-update-fns8l\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.044001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qn9\" (UniqueName: \"kubernetes.io/projected/000e0803-0c00-4afa-863f-23269a570787-kube-api-access-67qn9\") pod \"nova-cell1-db-create-msrjr\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.044079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/000e0803-0c00-4afa-863f-23269a570787-operator-scripts\") pod \"nova-cell1-db-create-msrjr\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.044124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15a1405c-3d18-46d7-888c-2273a3ea7e6e-operator-scripts\") pod \"nova-api-e09b-account-create-update-fns8l\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.044910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15a1405c-3d18-46d7-888c-2273a3ea7e6e-operator-scripts\") pod \"nova-api-e09b-account-create-update-fns8l\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.045019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/000e0803-0c00-4afa-863f-23269a570787-operator-scripts\") pod \"nova-cell1-db-create-msrjr\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.067160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f95w\" (UniqueName: \"kubernetes.io/projected/15a1405c-3d18-46d7-888c-2273a3ea7e6e-kube-api-access-9f95w\") pod \"nova-api-e09b-account-create-update-fns8l\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.068797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qn9\" (UniqueName: \"kubernetes.io/projected/000e0803-0c00-4afa-863f-23269a570787-kube-api-access-67qn9\") pod \"nova-cell1-db-create-msrjr\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.070479 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.070861 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.071859 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.074633 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.097274 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.145631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c579748e-251c-4267-b9bf-64849d6baaea-operator-scripts\") pod \"nova-cell0-0d6b-account-create-update-cphl6\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.145699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsvv\" (UniqueName: \"kubernetes.io/projected/c579748e-251c-4267-b9bf-64849d6baaea-kube-api-access-xbsvv\") pod \"nova-cell0-0d6b-account-create-update-cphl6\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.163739 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.167491 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-dwf5z"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.170002 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.247255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c579748e-251c-4267-b9bf-64849d6baaea-operator-scripts\") pod \"nova-cell0-0d6b-account-create-update-cphl6\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.247537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsvv\" (UniqueName: \"kubernetes.io/projected/c579748e-251c-4267-b9bf-64849d6baaea-kube-api-access-xbsvv\") pod \"nova-cell0-0d6b-account-create-update-cphl6\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.248954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c579748e-251c-4267-b9bf-64849d6baaea-operator-scripts\") pod \"nova-cell0-0d6b-account-create-update-cphl6\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.264273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.265306 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.270313 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.270645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsvv\" (UniqueName: \"kubernetes.io/projected/c579748e-251c-4267-b9bf-64849d6baaea-kube-api-access-xbsvv\") pod \"nova-cell0-0d6b-account-create-update-cphl6\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.276777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.417288 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.435802 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.451024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhzw\" (UniqueName: \"kubernetes.io/projected/2d41a013-882b-4118-a008-d1fe0ec11a22-kube-api-access-xwhzw\") pod \"nova-cell1-d429-account-create-update-htwbh\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.451121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d41a013-882b-4118-a008-d1fe0ec11a22-operator-scripts\") pod \"nova-cell1-d429-account-create-update-htwbh\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.552890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d41a013-882b-4118-a008-d1fe0ec11a22-operator-scripts\") pod \"nova-cell1-d429-account-create-update-htwbh\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.553033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhzw\" (UniqueName: \"kubernetes.io/projected/2d41a013-882b-4118-a008-d1fe0ec11a22-kube-api-access-xwhzw\") pod \"nova-cell1-d429-account-create-update-htwbh\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.553622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d41a013-882b-4118-a008-d1fe0ec11a22-operator-scripts\") pod \"nova-cell1-d429-account-create-update-htwbh\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.557431 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z4djf"] Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.568737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhzw\" (UniqueName: \"kubernetes.io/projected/2d41a013-882b-4118-a008-d1fe0ec11a22-kube-api-access-xwhzw\") pod \"nova-cell1-d429-account-create-update-htwbh\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.587089 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:08 crc kubenswrapper[4707]: W0121 15:22:08.594632 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ffcf62a_53d8_4003_8e75_747b6e768ffd.slice/crio-a97c07b56d539571de2b9158f25f06cb4ef1d76711e34f46f7c824727be4def0 WatchSource:0}: Error finding container a97c07b56d539571de2b9158f25f06cb4ef1d76711e34f46f7c824727be4def0: Status 404 returned error can't find the container with id a97c07b56d539571de2b9158f25f06cb4ef1d76711e34f46f7c824727be4def0 Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.640671 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-msrjr"] Jan 21 15:22:08 crc kubenswrapper[4707]: W0121 15:22:08.654517 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod000e0803_0c00_4afa_863f_23269a570787.slice/crio-2714d73dfe15fe4cec9ad2411652f5f1815e06fa60f3527c399c988cb567f292 WatchSource:0}: Error finding container 2714d73dfe15fe4cec9ad2411652f5f1815e06fa60f3527c399c988cb567f292: Status 404 returned error can't find the container with id 2714d73dfe15fe4cec9ad2411652f5f1815e06fa60f3527c399c988cb567f292 Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.734721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l"] Jan 21 15:22:08 crc kubenswrapper[4707]: W0121 15:22:08.736409 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15a1405c_3d18_46d7_888c_2273a3ea7e6e.slice/crio-dda4a4653692228326ab6bc9a1f946207f0ea641bf60a3d5146a607a86557229 WatchSource:0}: Error finding container dda4a4653692228326ab6bc9a1f946207f0ea641bf60a3d5146a607a86557229: Status 404 returned error can't find the container with id dda4a4653692228326ab6bc9a1f946207f0ea641bf60a3d5146a607a86557229 Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.820130 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6"] Jan 21 15:22:08 crc kubenswrapper[4707]: W0121 15:22:08.823426 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc579748e_251c_4267_b9bf_64849d6baaea.slice/crio-42fd64258f21b75049f89dd19fc0403511a29264aeea9b8d1c34489e5c38a0d6 WatchSource:0}: Error finding container 42fd64258f21b75049f89dd19fc0403511a29264aeea9b8d1c34489e5c38a0d6: Status 404 returned error can't find the container with id 42fd64258f21b75049f89dd19fc0403511a29264aeea9b8d1c34489e5c38a0d6 Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.828937 4707 generic.go:334] "Generic (PLEG): container finished" podID="863c9468-2e71-46c6-b6ab-212a5591921d" containerID="cac73fa5858eb140e1c949b3821698dd7fa162b5fddb80b7fc1a91e69dde4eb2" exitCode=0 Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.829158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" event={"ID":"863c9468-2e71-46c6-b6ab-212a5591921d","Type":"ContainerDied","Data":"cac73fa5858eb140e1c949b3821698dd7fa162b5fddb80b7fc1a91e69dde4eb2"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.829184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" event={"ID":"863c9468-2e71-46c6-b6ab-212a5591921d","Type":"ContainerStarted","Data":"969b1816dc0f7d601b722b98713450f6b0d48e13aa1a3b9a6260e2a53c5a1587"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.831757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" event={"ID":"0ffcf62a-53d8-4003-8e75-747b6e768ffd","Type":"ContainerStarted","Data":"719f9a3afb9fefe6adde1cda13f4842deb8edc9afdc5cb7b8c9599908b918256"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.831787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" event={"ID":"0ffcf62a-53d8-4003-8e75-747b6e768ffd","Type":"ContainerStarted","Data":"a97c07b56d539571de2b9158f25f06cb4ef1d76711e34f46f7c824727be4def0"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.834768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerStarted","Data":"98f027dc4997b7c69b319ba3e43d33780581a93b0b7cd5adb05faf23a6d5ed86"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.836102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" event={"ID":"000e0803-0c00-4afa-863f-23269a570787","Type":"ContainerStarted","Data":"12a30114a287ea421974e1f93e5f0be3fcc1108f286a1c2443ce8bf22eafa491"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.836129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" event={"ID":"000e0803-0c00-4afa-863f-23269a570787","Type":"ContainerStarted","Data":"2714d73dfe15fe4cec9ad2411652f5f1815e06fa60f3527c399c988cb567f292"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.840036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" event={"ID":"15a1405c-3d18-46d7-888c-2273a3ea7e6e","Type":"ContainerStarted","Data":"dda4a4653692228326ab6bc9a1f946207f0ea641bf60a3d5146a607a86557229"} Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.854562 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" podStartSLOduration=1.854551042 podStartE2EDuration="1.854551042s" podCreationTimestamp="2026-01-21 15:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:08.851446251 +0000 UTC m=+1226.032962473" watchObservedRunningTime="2026-01-21 15:22:08.854551042 +0000 UTC m=+1226.036067264" Jan 21 15:22:08 crc kubenswrapper[4707]: I0121 15:22:08.863561 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" podStartSLOduration=1.863552523 podStartE2EDuration="1.863552523s" podCreationTimestamp="2026-01-21 15:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:08.860433036 +0000 UTC m=+1226.041949258" watchObservedRunningTime="2026-01-21 15:22:08.863552523 +0000 UTC m=+1226.045068745" Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.011734 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh"] Jan 21 15:22:09 crc kubenswrapper[4707]: W0121 
15:22:09.058599 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d41a013_882b_4118_a008_d1fe0ec11a22.slice/crio-4b2725ba258c94a860225fde30e8a17f2fc63536ed4bf3a5cf4772e6ab0f29d8 WatchSource:0}: Error finding container 4b2725ba258c94a860225fde30e8a17f2fc63536ed4bf3a5cf4772e6ab0f29d8: Status 404 returned error can't find the container with id 4b2725ba258c94a860225fde30e8a17f2fc63536ed4bf3a5cf4772e6ab0f29d8 Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.851064 4707 generic.go:334] "Generic (PLEG): container finished" podID="c579748e-251c-4267-b9bf-64849d6baaea" containerID="cc8f6b16f50be7d1df064a7a111f75a22b38a2e3850ed9f555db741209b2eea5" exitCode=0 Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.851200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" event={"ID":"c579748e-251c-4267-b9bf-64849d6baaea","Type":"ContainerDied","Data":"cc8f6b16f50be7d1df064a7a111f75a22b38a2e3850ed9f555db741209b2eea5"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.851348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" event={"ID":"c579748e-251c-4267-b9bf-64849d6baaea","Type":"ContainerStarted","Data":"42fd64258f21b75049f89dd19fc0403511a29264aeea9b8d1c34489e5c38a0d6"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.853577 4707 generic.go:334] "Generic (PLEG): container finished" podID="2d41a013-882b-4118-a008-d1fe0ec11a22" containerID="2945cc35ca1844c3a12635b486f13b44293757cfb3e36de61c6e12c658f3b87a" exitCode=0 Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.853621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" event={"ID":"2d41a013-882b-4118-a008-d1fe0ec11a22","Type":"ContainerDied","Data":"2945cc35ca1844c3a12635b486f13b44293757cfb3e36de61c6e12c658f3b87a"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.853638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" event={"ID":"2d41a013-882b-4118-a008-d1fe0ec11a22","Type":"ContainerStarted","Data":"4b2725ba258c94a860225fde30e8a17f2fc63536ed4bf3a5cf4772e6ab0f29d8"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.854907 4707 generic.go:334] "Generic (PLEG): container finished" podID="000e0803-0c00-4afa-863f-23269a570787" containerID="12a30114a287ea421974e1f93e5f0be3fcc1108f286a1c2443ce8bf22eafa491" exitCode=0 Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.854947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" event={"ID":"000e0803-0c00-4afa-863f-23269a570787","Type":"ContainerDied","Data":"12a30114a287ea421974e1f93e5f0be3fcc1108f286a1c2443ce8bf22eafa491"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.856109 4707 generic.go:334] "Generic (PLEG): container finished" podID="15a1405c-3d18-46d7-888c-2273a3ea7e6e" containerID="07a095da7d9200c541e009e8d6b9f784e0acfae65c8c93ddbf83ff6faf8458b1" exitCode=0 Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.856146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" event={"ID":"15a1405c-3d18-46d7-888c-2273a3ea7e6e","Type":"ContainerDied","Data":"07a095da7d9200c541e009e8d6b9f784e0acfae65c8c93ddbf83ff6faf8458b1"} Jan 21 15:22:09 
crc kubenswrapper[4707]: I0121 15:22:09.857388 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ffcf62a-53d8-4003-8e75-747b6e768ffd" containerID="719f9a3afb9fefe6adde1cda13f4842deb8edc9afdc5cb7b8c9599908b918256" exitCode=0 Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.857435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" event={"ID":"0ffcf62a-53d8-4003-8e75-747b6e768ffd","Type":"ContainerDied","Data":"719f9a3afb9fefe6adde1cda13f4842deb8edc9afdc5cb7b8c9599908b918256"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.859421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerStarted","Data":"95b2f38bfc84afb1647dad340ce49f61fbe1866f669768a5f37cfdb6eee6deb4"} Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.945763 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:22:09 crc kubenswrapper[4707]: I0121 15:22:09.945853 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.045155 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.046390 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.149390 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.287057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vzw\" (UniqueName: \"kubernetes.io/projected/863c9468-2e71-46c6-b6ab-212a5591921d-kube-api-access-68vzw\") pod \"863c9468-2e71-46c6-b6ab-212a5591921d\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.287364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/863c9468-2e71-46c6-b6ab-212a5591921d-operator-scripts\") pod \"863c9468-2e71-46c6-b6ab-212a5591921d\" (UID: \"863c9468-2e71-46c6-b6ab-212a5591921d\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.287908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863c9468-2e71-46c6-b6ab-212a5591921d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "863c9468-2e71-46c6-b6ab-212a5591921d" (UID: "863c9468-2e71-46c6-b6ab-212a5591921d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.293968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863c9468-2e71-46c6-b6ab-212a5591921d-kube-api-access-68vzw" (OuterVolumeSpecName: "kube-api-access-68vzw") pod "863c9468-2e71-46c6-b6ab-212a5591921d" (UID: "863c9468-2e71-46c6-b6ab-212a5591921d"). InnerVolumeSpecName "kube-api-access-68vzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.389277 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vzw\" (UniqueName: \"kubernetes.io/projected/863c9468-2e71-46c6-b6ab-212a5591921d-kube-api-access-68vzw\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.389304 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/863c9468-2e71-46c6-b6ab-212a5591921d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.638969 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-httpd-run\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-public-tls-certs\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-logs\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-config-data\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-scripts\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9dhq\" (UniqueName: 
\"kubernetes.io/projected/47279210-6924-452f-abb7-8d5d7ff90ec9-kube-api-access-z9dhq\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.800757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-combined-ca-bundle\") pod \"47279210-6924-452f-abb7-8d5d7ff90ec9\" (UID: \"47279210-6924-452f-abb7-8d5d7ff90ec9\") " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.801369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-logs" (OuterVolumeSpecName: "logs") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.801524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.812971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-scripts" (OuterVolumeSpecName: "scripts") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.813908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.814006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47279210-6924-452f-abb7-8d5d7ff90ec9-kube-api-access-z9dhq" (OuterVolumeSpecName: "kube-api-access-z9dhq") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "kube-api-access-z9dhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.856592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.874922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.884831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-config-data" (OuterVolumeSpecName: "config-data") pod "47279210-6924-452f-abb7-8d5d7ff90ec9" (UID: "47279210-6924-452f-abb7-8d5d7ff90ec9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.885544 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.885739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-dwf5z" event={"ID":"863c9468-2e71-46c6-b6ab-212a5591921d","Type":"ContainerDied","Data":"969b1816dc0f7d601b722b98713450f6b0d48e13aa1a3b9a6260e2a53c5a1587"} Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.885774 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969b1816dc0f7d601b722b98713450f6b0d48e13aa1a3b9a6260e2a53c5a1587" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.891574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerStarted","Data":"f6d456c62accf70f9fb589bed7142f11924ed6a740b1b735c9246cf1193a0764"} Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.900133 4707 generic.go:334] "Generic (PLEG): container finished" podID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerID="577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86" exitCode=0 Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.900822 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.900867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"47279210-6924-452f-abb7-8d5d7ff90ec9","Type":"ContainerDied","Data":"577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86"} Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.900903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"47279210-6924-452f-abb7-8d5d7ff90ec9","Type":"ContainerDied","Data":"12ab4c266a95bb5fa67439fef7acd5ab5e9df1b9be55f315b9df742eb6bb6f66"} Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.900923 4707 scope.go:117] "RemoveContainer" containerID="577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902720 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902744 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902752 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902760 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9dhq\" (UniqueName: \"kubernetes.io/projected/47279210-6924-452f-abb7-8d5d7ff90ec9-kube-api-access-z9dhq\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902769 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902776 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47279210-6924-452f-abb7-8d5d7ff90ec9-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902784 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47279210-6924-452f-abb7-8d5d7ff90ec9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.902804 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:22:10 crc kubenswrapper[4707]: I0121 15:22:10.939973 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.007527 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.014229 4707 
scope.go:117] "RemoveContainer" containerID="729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.028008 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.035288 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049163 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:22:11 crc kubenswrapper[4707]: E0121 15:22:11.049466 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-log" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-log" Jan 21 15:22:11 crc kubenswrapper[4707]: E0121 15:22:11.049487 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863c9468-2e71-46c6-b6ab-212a5591921d" containerName="mariadb-database-create" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049492 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="863c9468-2e71-46c6-b6ab-212a5591921d" containerName="mariadb-database-create" Jan 21 15:22:11 crc kubenswrapper[4707]: E0121 15:22:11.049522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-httpd" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-httpd" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049867 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-httpd" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="863c9468-2e71-46c6-b6ab-212a5591921d" containerName="mariadb-database-create" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.049896 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" containerName="glance-log" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.050832 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.064695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.064894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.078347 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.094822 4707 scope.go:117] "RemoveContainer" containerID="577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86" Jan 21 15:22:11 crc kubenswrapper[4707]: E0121 15:22:11.096927 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86\": container with ID starting with 577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86 not found: ID does not exist" containerID="577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.096962 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86"} err="failed to get container status \"577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86\": rpc error: code = NotFound desc = could not find container \"577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86\": container with ID starting with 577310382bc7d989d25a0e56e240423d993a8dda115df6f47195d8e2c1dacf86 not found: ID does not exist" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.096986 4707 scope.go:117] "RemoveContainer" containerID="729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2" Jan 21 15:22:11 crc kubenswrapper[4707]: E0121 15:22:11.097314 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2\": container with ID starting with 729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2 not found: ID does not exist" containerID="729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.097342 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2"} err="failed to get container status \"729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2\": rpc error: code = NotFound desc = could not find container \"729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2\": container with ID starting with 729abdf7fe6d409dce4390c1f7940a5c748541f8849f0ac522736d02000da4d2 not found: ID does not exist" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.174860 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.203253 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47279210-6924-452f-abb7-8d5d7ff90ec9" path="/var/lib/kubelet/pods/47279210-6924-452f-abb7-8d5d7ff90ec9/volumes" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.210907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-scripts\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.210940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-logs\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.210966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.211052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.211088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.211167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghzb6\" (UniqueName: \"kubernetes.io/projected/edf40ef7-2790-4e9b-92dd-b92ab951383d-kube-api-access-ghzb6\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.211241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-config-data\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.211276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.256207 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.280769 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.312539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbsvv\" (UniqueName: \"kubernetes.io/projected/c579748e-251c-4267-b9bf-64849d6baaea-kube-api-access-xbsvv\") pod \"c579748e-251c-4267-b9bf-64849d6baaea\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.312757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c579748e-251c-4267-b9bf-64849d6baaea-operator-scripts\") pod \"c579748e-251c-4267-b9bf-64849d6baaea\" (UID: \"c579748e-251c-4267-b9bf-64849d6baaea\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-config-data\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-scripts\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-logs\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.313539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghzb6\" (UniqueName: \"kubernetes.io/projected/edf40ef7-2790-4e9b-92dd-b92ab951383d-kube-api-access-ghzb6\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.314019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c579748e-251c-4267-b9bf-64849d6baaea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c579748e-251c-4267-b9bf-64849d6baaea" (UID: "c579748e-251c-4267-b9bf-64849d6baaea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.314473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-logs\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.314622 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.315639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.323662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-config-data\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.323673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c579748e-251c-4267-b9bf-64849d6baaea-kube-api-access-xbsvv" (OuterVolumeSpecName: "kube-api-access-xbsvv") pod "c579748e-251c-4267-b9bf-64849d6baaea" (UID: "c579748e-251c-4267-b9bf-64849d6baaea"). InnerVolumeSpecName "kube-api-access-xbsvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.324278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.326427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-scripts\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.328411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.332226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghzb6\" (UniqueName: \"kubernetes.io/projected/edf40ef7-2790-4e9b-92dd-b92ab951383d-kube-api-access-ghzb6\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.346644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.389285 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.398864 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.405125 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.417471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/000e0803-0c00-4afa-863f-23269a570787-operator-scripts\") pod \"000e0803-0c00-4afa-863f-23269a570787\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.417678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15a1405c-3d18-46d7-888c-2273a3ea7e6e-operator-scripts\") pod \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.417704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67qn9\" (UniqueName: \"kubernetes.io/projected/000e0803-0c00-4afa-863f-23269a570787-kube-api-access-67qn9\") pod \"000e0803-0c00-4afa-863f-23269a570787\" (UID: \"000e0803-0c00-4afa-863f-23269a570787\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.417786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f95w\" (UniqueName: \"kubernetes.io/projected/15a1405c-3d18-46d7-888c-2273a3ea7e6e-kube-api-access-9f95w\") pod \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\" (UID: \"15a1405c-3d18-46d7-888c-2273a3ea7e6e\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.417881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e0803-0c00-4afa-863f-23269a570787-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "000e0803-0c00-4afa-863f-23269a570787" (UID: "000e0803-0c00-4afa-863f-23269a570787"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.418143 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/000e0803-0c00-4afa-863f-23269a570787-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.418155 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbsvv\" (UniqueName: \"kubernetes.io/projected/c579748e-251c-4267-b9bf-64849d6baaea-kube-api-access-xbsvv\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.418166 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c579748e-251c-4267-b9bf-64849d6baaea-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.418230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a1405c-3d18-46d7-888c-2273a3ea7e6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15a1405c-3d18-46d7-888c-2273a3ea7e6e" (UID: "15a1405c-3d18-46d7-888c-2273a3ea7e6e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.422446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a1405c-3d18-46d7-888c-2273a3ea7e6e-kube-api-access-9f95w" (OuterVolumeSpecName: "kube-api-access-9f95w") pod "15a1405c-3d18-46d7-888c-2273a3ea7e6e" (UID: "15a1405c-3d18-46d7-888c-2273a3ea7e6e"). InnerVolumeSpecName "kube-api-access-9f95w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.425619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000e0803-0c00-4afa-863f-23269a570787-kube-api-access-67qn9" (OuterVolumeSpecName: "kube-api-access-67qn9") pod "000e0803-0c00-4afa-863f-23269a570787" (UID: "000e0803-0c00-4afa-863f-23269a570787"). InnerVolumeSpecName "kube-api-access-67qn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.512573 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.519586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcf62a-53d8-4003-8e75-747b6e768ffd-operator-scripts\") pod \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.519635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d41a013-882b-4118-a008-d1fe0ec11a22-operator-scripts\") pod \"2d41a013-882b-4118-a008-d1fe0ec11a22\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.519719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn8j9\" (UniqueName: \"kubernetes.io/projected/0ffcf62a-53d8-4003-8e75-747b6e768ffd-kube-api-access-gn8j9\") pod \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\" (UID: \"0ffcf62a-53d8-4003-8e75-747b6e768ffd\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.519775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwhzw\" (UniqueName: \"kubernetes.io/projected/2d41a013-882b-4118-a008-d1fe0ec11a22-kube-api-access-xwhzw\") pod \"2d41a013-882b-4118-a008-d1fe0ec11a22\" (UID: \"2d41a013-882b-4118-a008-d1fe0ec11a22\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.520026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ffcf62a-53d8-4003-8e75-747b6e768ffd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ffcf62a-53d8-4003-8e75-747b6e768ffd" (UID: "0ffcf62a-53d8-4003-8e75-747b6e768ffd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.520424 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15a1405c-3d18-46d7-888c-2273a3ea7e6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.520450 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67qn9\" (UniqueName: \"kubernetes.io/projected/000e0803-0c00-4afa-863f-23269a570787-kube-api-access-67qn9\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.520461 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ffcf62a-53d8-4003-8e75-747b6e768ffd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.520470 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f95w\" (UniqueName: \"kubernetes.io/projected/15a1405c-3d18-46d7-888c-2273a3ea7e6e-kube-api-access-9f95w\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.520750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d41a013-882b-4118-a008-d1fe0ec11a22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d41a013-882b-4118-a008-d1fe0ec11a22" (UID: "2d41a013-882b-4118-a008-d1fe0ec11a22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.522958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d41a013-882b-4118-a008-d1fe0ec11a22-kube-api-access-xwhzw" (OuterVolumeSpecName: "kube-api-access-xwhzw") pod "2d41a013-882b-4118-a008-d1fe0ec11a22" (UID: "2d41a013-882b-4118-a008-d1fe0ec11a22"). InnerVolumeSpecName "kube-api-access-xwhzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.522979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffcf62a-53d8-4003-8e75-747b6e768ffd-kube-api-access-gn8j9" (OuterVolumeSpecName: "kube-api-access-gn8j9") pod "0ffcf62a-53d8-4003-8e75-747b6e768ffd" (UID: "0ffcf62a-53d8-4003-8e75-747b6e768ffd"). InnerVolumeSpecName "kube-api-access-gn8j9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-combined-ca-bundle\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-httpd-run\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-internal-tls-certs\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlxd7\" (UniqueName: \"kubernetes.io/projected/b6243463-c1a1-42a5-8697-11a221d45be7-kube-api-access-dlxd7\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-logs\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-scripts\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-config-data\") pod \"b6243463-c1a1-42a5-8697-11a221d45be7\" (UID: \"b6243463-c1a1-42a5-8697-11a221d45be7\") " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.621833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.622269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-logs" (OuterVolumeSpecName: "logs") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.622374 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d41a013-882b-4118-a008-d1fe0ec11a22-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.622390 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn8j9\" (UniqueName: \"kubernetes.io/projected/0ffcf62a-53d8-4003-8e75-747b6e768ffd-kube-api-access-gn8j9\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.622400 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.622408 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwhzw\" (UniqueName: \"kubernetes.io/projected/2d41a013-882b-4118-a008-d1fe0ec11a22-kube-api-access-xwhzw\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.625671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.625760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6243463-c1a1-42a5-8697-11a221d45be7-kube-api-access-dlxd7" (OuterVolumeSpecName: "kube-api-access-dlxd7") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "kube-api-access-dlxd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.626208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-scripts" (OuterVolumeSpecName: "scripts") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.646235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.658118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-config-data" (OuterVolumeSpecName: "config-data") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.658341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b6243463-c1a1-42a5-8697-11a221d45be7" (UID: "b6243463-c1a1-42a5-8697-11a221d45be7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.723744 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.723924 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlxd7\" (UniqueName: \"kubernetes.io/projected/b6243463-c1a1-42a5-8697-11a221d45be7-kube-api-access-dlxd7\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.724002 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6243463-c1a1-42a5-8697-11a221d45be7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.724057 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.724105 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.724158 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6243463-c1a1-42a5-8697-11a221d45be7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.724235 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.737494 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.803957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:22:11 crc kubenswrapper[4707]: W0121 15:22:11.812572 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedf40ef7_2790_4e9b_92dd_b92ab951383d.slice/crio-b5932ed4d3d6079a9ff1cf10c3d5a106f93d59da74038f55064b644c7f9a5966 WatchSource:0}: Error finding container 
b5932ed4d3d6079a9ff1cf10c3d5a106f93d59da74038f55064b644c7f9a5966: Status 404 returned error can't find the container with id b5932ed4d3d6079a9ff1cf10c3d5a106f93d59da74038f55064b644c7f9a5966 Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.825597 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.909129 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6243463-c1a1-42a5-8697-11a221d45be7" containerID="138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610" exitCode=0 Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.909171 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.909210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b6243463-c1a1-42a5-8697-11a221d45be7","Type":"ContainerDied","Data":"138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.909282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b6243463-c1a1-42a5-8697-11a221d45be7","Type":"ContainerDied","Data":"6cf65c2bba9d9a8d94be9db8039c5d8f1d4729776345ad2374399cffc76de588"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.909581 4707 scope.go:117] "RemoveContainer" containerID="138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.911440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"edf40ef7-2790-4e9b-92dd-b92ab951383d","Type":"ContainerStarted","Data":"b5932ed4d3d6079a9ff1cf10c3d5a106f93d59da74038f55064b644c7f9a5966"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.914676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" event={"ID":"c579748e-251c-4267-b9bf-64849d6baaea","Type":"ContainerDied","Data":"42fd64258f21b75049f89dd19fc0403511a29264aeea9b8d1c34489e5c38a0d6"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.914703 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42fd64258f21b75049f89dd19fc0403511a29264aeea9b8d1c34489e5c38a0d6" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.914753 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.932385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" event={"ID":"000e0803-0c00-4afa-863f-23269a570787","Type":"ContainerDied","Data":"2714d73dfe15fe4cec9ad2411652f5f1815e06fa60f3527c399c988cb567f292"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.932417 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2714d73dfe15fe4cec9ad2411652f5f1815e06fa60f3527c399c988cb567f292" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.932472 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-msrjr" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.940883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" event={"ID":"2d41a013-882b-4118-a008-d1fe0ec11a22","Type":"ContainerDied","Data":"4b2725ba258c94a860225fde30e8a17f2fc63536ed4bf3a5cf4772e6ab0f29d8"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.940914 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b2725ba258c94a860225fde30e8a17f2fc63536ed4bf3a5cf4772e6ab0f29d8" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.940964 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.943853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" event={"ID":"15a1405c-3d18-46d7-888c-2273a3ea7e6e","Type":"ContainerDied","Data":"dda4a4653692228326ab6bc9a1f946207f0ea641bf60a3d5146a607a86557229"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.943875 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda4a4653692228326ab6bc9a1f946207f0ea641bf60a3d5146a607a86557229" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.943890 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.952576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" event={"ID":"0ffcf62a-53d8-4003-8e75-747b6e768ffd","Type":"ContainerDied","Data":"a97c07b56d539571de2b9158f25f06cb4ef1d76711e34f46f7c824727be4def0"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.952607 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97c07b56d539571de2b9158f25f06cb4ef1d76711e34f46f7c824727be4def0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.952657 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z4djf" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.956121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerStarted","Data":"c676e22752e26f185147bce45effa67c926b63a3ea8247ae141a6ed02d29f828"} Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.956289 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-central-agent" containerID="cri-o://98f027dc4997b7c69b319ba3e43d33780581a93b0b7cd5adb05faf23a6d5ed86" gracePeriod=30 Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.956665 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="sg-core" containerID="cri-o://f6d456c62accf70f9fb589bed7142f11924ed6a740b1b735c9246cf1193a0764" gracePeriod=30 Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.956687 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.956696 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="proxy-httpd" containerID="cri-o://c676e22752e26f185147bce45effa67c926b63a3ea8247ae141a6ed02d29f828" gracePeriod=30 Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.956730 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-notification-agent" containerID="cri-o://95b2f38bfc84afb1647dad340ce49f61fbe1866f669768a5f37cfdb6eee6deb4" gracePeriod=30 Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.994650 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.595453194 podStartE2EDuration="5.994633636s" podCreationTimestamp="2026-01-21 15:22:06 +0000 UTC" firstStartedPulling="2026-01-21 15:22:07.587220809 +0000 UTC m=+1224.768737032" lastFinishedPulling="2026-01-21 15:22:10.986401252 +0000 UTC m=+1228.167917474" observedRunningTime="2026-01-21 15:22:11.974291592 +0000 UTC m=+1229.155807814" watchObservedRunningTime="2026-01-21 15:22:11.994633636 +0000 UTC m=+1229.176149859" Jan 21 15:22:11 crc kubenswrapper[4707]: I0121 15:22:11.995763 4707 scope.go:117] "RemoveContainer" containerID="71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.019669 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.039679 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.053656 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c579748e-251c-4267-b9bf-64849d6baaea" 
containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054246 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c579748e-251c-4267-b9bf-64849d6baaea" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054269 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-log" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054318 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-log" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054341 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffcf62a-53d8-4003-8e75-747b6e768ffd" containerName="mariadb-database-create" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054349 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffcf62a-53d8-4003-8e75-747b6e768ffd" containerName="mariadb-database-create" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054398 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-httpd" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054404 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-httpd" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a1405c-3d18-46d7-888c-2273a3ea7e6e" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a1405c-3d18-46d7-888c-2273a3ea7e6e" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d41a013-882b-4118-a008-d1fe0ec11a22" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054431 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d41a013-882b-4118-a008-d1fe0ec11a22" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.054438 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000e0803-0c00-4afa-863f-23269a570787" containerName="mariadb-database-create" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054463 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="000e0803-0c00-4afa-863f-23269a570787" containerName="mariadb-database-create" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054661 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffcf62a-53d8-4003-8e75-747b6e768ffd" containerName="mariadb-database-create" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054678 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d41a013-882b-4118-a008-d1fe0ec11a22" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a1405c-3d18-46d7-888c-2273a3ea7e6e" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054697 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="000e0803-0c00-4afa-863f-23269a570787" 
containerName="mariadb-database-create" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054705 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-httpd" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054717 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" containerName="glance-log" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.054729 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c579748e-251c-4267-b9bf-64849d6baaea" containerName="mariadb-account-create-update" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.055620 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.057938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.058054 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.059792 4707 scope.go:117] "RemoveContainer" containerID="138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.060152 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610\": container with ID starting with 138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610 not found: ID does not exist" containerID="138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.060188 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610"} err="failed to get container status \"138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610\": rpc error: code = NotFound desc = could not find container \"138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610\": container with ID starting with 138b7698d6608140c7d1bd1a838459bc3279c108c1cc39de2f141700ec34d610 not found: ID does not exist" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.060210 4707 scope.go:117] "RemoveContainer" containerID="71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374" Jan 21 15:22:12 crc kubenswrapper[4707]: E0121 15:22:12.061535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374\": container with ID starting with 71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374 not found: ID does not exist" containerID="71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.061564 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374"} err="failed to get container status \"71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374\": rpc error: code = NotFound desc = could not find container 
\"71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374\": container with ID starting with 71ffeb5a06f6ff09af20eb55503f1a490fe8bebf9c44ba51856ff570dc383374 not found: ID does not exist" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.072716 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6st5g\" (UniqueName: \"kubernetes.io/projected/63816993-7ef8-4f21-9a3c-18040cb843bd-kube-api-access-6st5g\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.151471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6st5g\" (UniqueName: \"kubernetes.io/projected/63816993-7ef8-4f21-9a3c-18040cb843bd-kube-api-access-6st5g\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.253965 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.254357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.257137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.257650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.258405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.259278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.270518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6st5g\" (UniqueName: \"kubernetes.io/projected/63816993-7ef8-4f21-9a3c-18040cb843bd-kube-api-access-6st5g\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.279062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.367567 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.753647 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:22:12 crc kubenswrapper[4707]: W0121 15:22:12.759749 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63816993_7ef8_4f21_9a3c_18040cb843bd.slice/crio-2e5be557c46d199ec3ad1ee1b26a68bcf71ee595c54cc79d8bf9c925aaad8154 WatchSource:0}: Error finding container 2e5be557c46d199ec3ad1ee1b26a68bcf71ee595c54cc79d8bf9c925aaad8154: Status 404 returned error can't find the container with id 2e5be557c46d199ec3ad1ee1b26a68bcf71ee595c54cc79d8bf9c925aaad8154 Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.966732 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerID="c676e22752e26f185147bce45effa67c926b63a3ea8247ae141a6ed02d29f828" exitCode=0 Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.966965 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerID="f6d456c62accf70f9fb589bed7142f11924ed6a740b1b735c9246cf1193a0764" exitCode=2 Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.966976 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerID="95b2f38bfc84afb1647dad340ce49f61fbe1866f669768a5f37cfdb6eee6deb4" exitCode=0 Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.966778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerDied","Data":"c676e22752e26f185147bce45effa67c926b63a3ea8247ae141a6ed02d29f828"} Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.967060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerDied","Data":"f6d456c62accf70f9fb589bed7142f11924ed6a740b1b735c9246cf1193a0764"} Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.967074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerDied","Data":"95b2f38bfc84afb1647dad340ce49f61fbe1866f669768a5f37cfdb6eee6deb4"} Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.969901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"edf40ef7-2790-4e9b-92dd-b92ab951383d","Type":"ContainerStarted","Data":"5a6924b802768405402d6b86cb657dfd7334a39647b9f65f1a3ca7e21dd5e716"} Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.969925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"edf40ef7-2790-4e9b-92dd-b92ab951383d","Type":"ContainerStarted","Data":"bdd54c33530224d64a1afe448ee39fd6390f3f98a84ea3afe2b57e8006166cbd"} Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.971038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"63816993-7ef8-4f21-9a3c-18040cb843bd","Type":"ContainerStarted","Data":"2e5be557c46d199ec3ad1ee1b26a68bcf71ee595c54cc79d8bf9c925aaad8154"} Jan 21 15:22:12 crc kubenswrapper[4707]: I0121 15:22:12.990779 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=1.9907642220000001 podStartE2EDuration="1.990764222s" podCreationTimestamp="2026-01-21 15:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:12.983602219 +0000 UTC m=+1230.165118441" watchObservedRunningTime="2026-01-21 15:22:12.990764222 +0000 UTC m=+1230.172280444" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.208006 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6243463-c1a1-42a5-8697-11a221d45be7" path="/var/lib/kubelet/pods/b6243463-c1a1-42a5-8697-11a221d45be7/volumes" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.320401 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq"] Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.321468 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.328672 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq"] Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.329468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.329704 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-2hbv2" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.329890 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.370963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-scripts\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.370999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jkw\" (UniqueName: \"kubernetes.io/projected/e5177656-16d0-431f-b865-455378e95bd3-kube-api-access-w7jkw\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.371047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.371083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-config-data\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.472194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-scripts\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.472228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jkw\" (UniqueName: \"kubernetes.io/projected/e5177656-16d0-431f-b865-455378e95bd3-kube-api-access-w7jkw\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.472270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.472303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-config-data\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.476123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.476365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-scripts\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.476567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-config-data\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.484943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jkw\" (UniqueName: \"kubernetes.io/projected/e5177656-16d0-431f-b865-455378e95bd3-kube-api-access-w7jkw\") pod \"nova-cell0-conductor-db-sync-rffsq\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.645880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.982165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"63816993-7ef8-4f21-9a3c-18040cb843bd","Type":"ContainerStarted","Data":"b4638cfb34b740af95a7bc03409e4baaf7acbfe3168548cc0715b59e19f6d136"} Jan 21 15:22:13 crc kubenswrapper[4707]: I0121 15:22:13.982372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"63816993-7ef8-4f21-9a3c-18040cb843bd","Type":"ContainerStarted","Data":"cdead9cf6c5698fee36e61676ef626b53d0a70cdd040ba9566e1451f134de51e"} Jan 21 15:22:14 crc kubenswrapper[4707]: I0121 15:22:14.004161 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.004143708 podStartE2EDuration="2.004143708s" podCreationTimestamp="2026-01-21 15:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:13.995496514 +0000 UTC m=+1231.177012736" watchObservedRunningTime="2026-01-21 15:22:14.004143708 +0000 UTC m=+1231.185659929" Jan 21 15:22:14 crc kubenswrapper[4707]: I0121 15:22:14.060023 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq"] Jan 21 15:22:14 crc kubenswrapper[4707]: W0121 15:22:14.073037 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5177656_16d0_431f_b865_455378e95bd3.slice/crio-8bacc73f326141f7ba94b1886d53313789e5067fb9096a293c463707fa9f50f8 WatchSource:0}: Error finding container 8bacc73f326141f7ba94b1886d53313789e5067fb9096a293c463707fa9f50f8: Status 404 returned error can't find the container with id 8bacc73f326141f7ba94b1886d53313789e5067fb9096a293c463707fa9f50f8 Jan 21 15:22:14 crc kubenswrapper[4707]: I0121 15:22:14.992148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" event={"ID":"e5177656-16d0-431f-b865-455378e95bd3","Type":"ContainerStarted","Data":"271534ce4b9b3e231c7bb12b86091e9f944b8dccc5e23a788d117e456d901635"} Jan 21 15:22:14 crc kubenswrapper[4707]: I0121 15:22:14.992354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" event={"ID":"e5177656-16d0-431f-b865-455378e95bd3","Type":"ContainerStarted","Data":"8bacc73f326141f7ba94b1886d53313789e5067fb9096a293c463707fa9f50f8"} Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.015104 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5177656-16d0-431f-b865-455378e95bd3" containerID="271534ce4b9b3e231c7bb12b86091e9f944b8dccc5e23a788d117e456d901635" exitCode=0 Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.015164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" event={"ID":"e5177656-16d0-431f-b865-455378e95bd3","Type":"ContainerDied","Data":"271534ce4b9b3e231c7bb12b86091e9f944b8dccc5e23a788d117e456d901635"} Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.018684 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerID="98f027dc4997b7c69b319ba3e43d33780581a93b0b7cd5adb05faf23a6d5ed86" 
exitCode=0 Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.018719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerDied","Data":"98f027dc4997b7c69b319ba3e43d33780581a93b0b7cd5adb05faf23a6d5ed86"} Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.150803 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.345566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-log-httpd\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.345872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-run-httpd\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.345895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x4vq\" (UniqueName: \"kubernetes.io/projected/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-kube-api-access-5x4vq\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.345944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-combined-ca-bundle\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.345995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-config-data\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.346063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-scripts\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.346085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-sg-core-conf-yaml\") pod \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\" (UID: \"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c\") " Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.346138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.346281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.346688 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.346708 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.350481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-scripts" (OuterVolumeSpecName: "scripts") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.350712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-kube-api-access-5x4vq" (OuterVolumeSpecName: "kube-api-access-5x4vq") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "kube-api-access-5x4vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.366405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.397144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.410432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-config-data" (OuterVolumeSpecName: "config-data") pod "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" (UID: "2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.448059 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x4vq\" (UniqueName: \"kubernetes.io/projected/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-kube-api-access-5x4vq\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.448094 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.448105 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.448114 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:18 crc kubenswrapper[4707]: I0121 15:22:18.448121 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.027476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c","Type":"ContainerDied","Data":"b3514b475dcee7bc7b9b5399a141af9508530f2832be80372e1291ef7391a946"} Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.027710 4707 scope.go:117] "RemoveContainer" containerID="c676e22752e26f185147bce45effa67c926b63a3ea8247ae141a6ed02d29f828" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.027525 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.056831 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.068378 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079032 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:19 crc kubenswrapper[4707]: E0121 15:22:19.079403 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="proxy-httpd" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079423 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="proxy-httpd" Jan 21 15:22:19 crc kubenswrapper[4707]: E0121 15:22:19.079440 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="sg-core" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="sg-core" Jan 21 15:22:19 crc kubenswrapper[4707]: E0121 15:22:19.079463 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-notification-agent" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079470 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-notification-agent" Jan 21 15:22:19 crc kubenswrapper[4707]: E0121 15:22:19.079488 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-central-agent" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079494 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-central-agent" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079774 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-notification-agent" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079793 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="ceilometer-central-agent" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079819 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="sg-core" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.079827 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" containerName="proxy-httpd" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.081218 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.085402 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.085799 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.095576 4707 scope.go:117] "RemoveContainer" containerID="f6d456c62accf70f9fb589bed7142f11924ed6a740b1b735c9246cf1193a0764" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.097024 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.123928 4707 scope.go:117] "RemoveContainer" containerID="95b2f38bfc84afb1647dad340ce49f61fbe1866f669768a5f37cfdb6eee6deb4" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.141760 4707 scope.go:117] "RemoveContainer" containerID="98f027dc4997b7c69b319ba3e43d33780581a93b0b7cd5adb05faf23a6d5ed86" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.204559 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c" path="/var/lib/kubelet/pods/2e71e2a7-d752-4ad2-a739-bd0d3cc2ec7c/volumes" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.262023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvdrd\" (UniqueName: \"kubernetes.io/projected/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-kube-api-access-vvdrd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.263048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.263115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-scripts\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.263173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-config-data\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.263191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-log-httpd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.263224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.263245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-run-httpd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.327054 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvdrd\" (UniqueName: \"kubernetes.io/projected/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-kube-api-access-vvdrd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-scripts\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-config-data\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-log-httpd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.366717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-run-httpd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.367726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-run-httpd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.367987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-log-httpd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.371466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.371709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.371891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-scripts\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.372260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-config-data\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.383317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvdrd\" (UniqueName: \"kubernetes.io/projected/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-kube-api-access-vvdrd\") pod \"ceilometer-0\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.410793 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.467886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-config-data\") pod \"e5177656-16d0-431f-b865-455378e95bd3\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.468030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-combined-ca-bundle\") pod \"e5177656-16d0-431f-b865-455378e95bd3\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.468085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-scripts\") pod \"e5177656-16d0-431f-b865-455378e95bd3\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.468119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7jkw\" (UniqueName: \"kubernetes.io/projected/e5177656-16d0-431f-b865-455378e95bd3-kube-api-access-w7jkw\") pod \"e5177656-16d0-431f-b865-455378e95bd3\" (UID: \"e5177656-16d0-431f-b865-455378e95bd3\") " Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.470621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5177656-16d0-431f-b865-455378e95bd3-kube-api-access-w7jkw" (OuterVolumeSpecName: "kube-api-access-w7jkw") pod "e5177656-16d0-431f-b865-455378e95bd3" (UID: "e5177656-16d0-431f-b865-455378e95bd3"). InnerVolumeSpecName "kube-api-access-w7jkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.470953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-scripts" (OuterVolumeSpecName: "scripts") pod "e5177656-16d0-431f-b865-455378e95bd3" (UID: "e5177656-16d0-431f-b865-455378e95bd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.491054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-config-data" (OuterVolumeSpecName: "config-data") pod "e5177656-16d0-431f-b865-455378e95bd3" (UID: "e5177656-16d0-431f-b865-455378e95bd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.492184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5177656-16d0-431f-b865-455378e95bd3" (UID: "e5177656-16d0-431f-b865-455378e95bd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.569950 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.569974 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.569984 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5177656-16d0-431f-b865-455378e95bd3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.569992 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7jkw\" (UniqueName: \"kubernetes.io/projected/e5177656-16d0-431f-b865-455378e95bd3-kube-api-access-w7jkw\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:19 crc kubenswrapper[4707]: I0121 15:22:19.776130 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:19 crc kubenswrapper[4707]: W0121 15:22:19.779209 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod542a1ea2_98ab_42cb_a7f0_7624b26c3ccf.slice/crio-9cedfc984d06dc321fb3049ec3839fc3baa5c6776a6552d48aa6dd01be4cbc51 WatchSource:0}: Error finding container 9cedfc984d06dc321fb3049ec3839fc3baa5c6776a6552d48aa6dd01be4cbc51: Status 404 returned error can't find the container with id 9cedfc984d06dc321fb3049ec3839fc3baa5c6776a6552d48aa6dd01be4cbc51 Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.036587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" event={"ID":"e5177656-16d0-431f-b865-455378e95bd3","Type":"ContainerDied","Data":"8bacc73f326141f7ba94b1886d53313789e5067fb9096a293c463707fa9f50f8"} Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.036626 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bacc73f326141f7ba94b1886d53313789e5067fb9096a293c463707fa9f50f8" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.036599 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.038797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerStarted","Data":"9cedfc984d06dc321fb3049ec3839fc3baa5c6776a6552d48aa6dd01be4cbc51"} Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.075206 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:20 crc kubenswrapper[4707]: E0121 15:22:20.075505 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5177656-16d0-431f-b865-455378e95bd3" containerName="nova-cell0-conductor-db-sync" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.075520 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5177656-16d0-431f-b865-455378e95bd3" containerName="nova-cell0-conductor-db-sync" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.075674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5177656-16d0-431f-b865-455378e95bd3" containerName="nova-cell0-conductor-db-sync" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.076148 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.077825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.078541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-2hbv2" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.081988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.178549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.178592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gdms\" (UniqueName: \"kubernetes.io/projected/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-kube-api-access-6gdms\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.178645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.279512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.279548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gdms\" (UniqueName: \"kubernetes.io/projected/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-kube-api-access-6gdms\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.279602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.283629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.283827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.292844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gdms\" (UniqueName: \"kubernetes.io/projected/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-kube-api-access-6gdms\") pod \"nova-cell0-conductor-0\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.388757 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:20 crc kubenswrapper[4707]: I0121 15:22:20.765762 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.046076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d4d93b8f-0a09-4105-9590-e3bbd2ee2953","Type":"ContainerStarted","Data":"4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095"} Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.046119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d4d93b8f-0a09-4105-9590-e3bbd2ee2953","Type":"ContainerStarted","Data":"78c5d4e772ff8b20a9a8acffa96711bac16a41572b0a2c524ee1d5af2b333c17"} Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.046136 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.049730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerStarted","Data":"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d"} Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.062379 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.062364942 podStartE2EDuration="1.062364942s" podCreationTimestamp="2026-01-21 15:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:21.061502401 +0000 UTC m=+1238.243018623" watchObservedRunningTime="2026-01-21 15:22:21.062364942 +0000 UTC m=+1238.243881164" Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.390140 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.390377 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.413173 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:21 crc kubenswrapper[4707]: I0121 15:22:21.419635 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.058680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerStarted","Data":"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d"} Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.058889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerStarted","Data":"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca"} Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.059352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" 
Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.059369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.368390 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.368430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.391526 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.401562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:22 crc kubenswrapper[4707]: I0121 15:22:22.690319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.064895 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="d4d93b8f-0a09-4105-9590-e3bbd2ee2953" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095" gracePeriod=30 Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.065344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.065366 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.854963 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.920964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.935563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gdms\" (UniqueName: \"kubernetes.io/projected/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-kube-api-access-6gdms\") pod \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.935671 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-config-data\") pod \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.935860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-combined-ca-bundle\") pod \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\" (UID: \"d4d93b8f-0a09-4105-9590-e3bbd2ee2953\") " Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.942430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-kube-api-access-6gdms" (OuterVolumeSpecName: "kube-api-access-6gdms") pod "d4d93b8f-0a09-4105-9590-e3bbd2ee2953" (UID: "d4d93b8f-0a09-4105-9590-e3bbd2ee2953"). InnerVolumeSpecName "kube-api-access-6gdms". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.960361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4d93b8f-0a09-4105-9590-e3bbd2ee2953" (UID: "d4d93b8f-0a09-4105-9590-e3bbd2ee2953"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.966276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-config-data" (OuterVolumeSpecName: "config-data") pod "d4d93b8f-0a09-4105-9590-e3bbd2ee2953" (UID: "d4d93b8f-0a09-4105-9590-e3bbd2ee2953"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:23 crc kubenswrapper[4707]: I0121 15:22:23.983282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.037427 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.037460 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.037472 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gdms\" (UniqueName: \"kubernetes.io/projected/d4d93b8f-0a09-4105-9590-e3bbd2ee2953-kube-api-access-6gdms\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.072888 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4d93b8f-0a09-4105-9590-e3bbd2ee2953" containerID="4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095" exitCode=0 Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.072942 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.072937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d4d93b8f-0a09-4105-9590-e3bbd2ee2953","Type":"ContainerDied","Data":"4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095"} Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.073866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d4d93b8f-0a09-4105-9590-e3bbd2ee2953","Type":"ContainerDied","Data":"78c5d4e772ff8b20a9a8acffa96711bac16a41572b0a2c524ee1d5af2b333c17"} Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.073889 4707 scope.go:117] "RemoveContainer" containerID="4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.078298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerStarted","Data":"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8"} Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.078777 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.097542 4707 scope.go:117] "RemoveContainer" containerID="4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.100239 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.42164735 podStartE2EDuration="5.100222642s" podCreationTimestamp="2026-01-21 15:22:19 +0000 UTC" firstStartedPulling="2026-01-21 15:22:19.781275741 +0000 UTC m=+1236.962791962" lastFinishedPulling="2026-01-21 15:22:23.459851032 +0000 UTC m=+1240.641367254" observedRunningTime="2026-01-21 15:22:24.093236891 +0000 UTC m=+1241.274753114" 
watchObservedRunningTime="2026-01-21 15:22:24.100222642 +0000 UTC m=+1241.281738864" Jan 21 15:22:24 crc kubenswrapper[4707]: E0121 15:22:24.107398 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095\": container with ID starting with 4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095 not found: ID does not exist" containerID="4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.107438 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095"} err="failed to get container status \"4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095\": rpc error: code = NotFound desc = could not find container \"4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095\": container with ID starting with 4c48476db315e935094d1cb8f184654fd90805672f828fa994a1be1f10ff6095 not found: ID does not exist" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.117456 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.130860 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.146677 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:24 crc kubenswrapper[4707]: E0121 15:22:24.147010 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d93b8f-0a09-4105-9590-e3bbd2ee2953" containerName="nova-cell0-conductor-conductor" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.147021 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d93b8f-0a09-4105-9590-e3bbd2ee2953" containerName="nova-cell0-conductor-conductor" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.147194 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d93b8f-0a09-4105-9590-e3bbd2ee2953" containerName="nova-cell0-conductor-conductor" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.147666 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.150370 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.150539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-2hbv2" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.150847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.240343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.240506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85p7z\" (UniqueName: \"kubernetes.io/projected/9e516808-af42-455d-b94b-7a72e6439115-kube-api-access-85p7z\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.240650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.342276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.343320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.343534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85p7z\" (UniqueName: \"kubernetes.io/projected/9e516808-af42-455d-b94b-7a72e6439115-kube-api-access-85p7z\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.347347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.347455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.367231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85p7z\" (UniqueName: \"kubernetes.io/projected/9e516808-af42-455d-b94b-7a72e6439115-kube-api-access-85p7z\") pod \"nova-cell0-conductor-0\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.465627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.495241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.886107 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.919673 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:24 crc kubenswrapper[4707]: I0121 15:22:24.969945 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:22:25 crc kubenswrapper[4707]: I0121 15:22:25.094662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9e516808-af42-455d-b94b-7a72e6439115","Type":"ContainerStarted","Data":"f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965"} Jan 21 15:22:25 crc kubenswrapper[4707]: I0121 15:22:25.094699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9e516808-af42-455d-b94b-7a72e6439115","Type":"ContainerStarted","Data":"68487121fe82059f865f78f2184415aaaa8ac74f1b7747a58152abb1b509dcaf"} Jan 21 15:22:25 crc kubenswrapper[4707]: I0121 15:22:25.095571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:25 crc kubenswrapper[4707]: I0121 15:22:25.108653 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.10864468 podStartE2EDuration="1.10864468s" podCreationTimestamp="2026-01-21 15:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:25.106622068 +0000 UTC m=+1242.288138290" watchObservedRunningTime="2026-01-21 15:22:25.10864468 +0000 UTC m=+1242.290160902" Jan 21 15:22:25 crc kubenswrapper[4707]: I0121 15:22:25.190521 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d93b8f-0a09-4105-9590-e3bbd2ee2953" path="/var/lib/kubelet/pods/d4d93b8f-0a09-4105-9590-e3bbd2ee2953/volumes" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.101564 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-central-agent" containerID="cri-o://802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" 
gracePeriod=30 Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.102284 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="proxy-httpd" containerID="cri-o://b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" gracePeriod=30 Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.102351 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="sg-core" containerID="cri-o://66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" gracePeriod=30 Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.102348 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-notification-agent" containerID="cri-o://6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" gracePeriod=30 Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.679487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-scripts\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-log-httpd\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvdrd\" (UniqueName: \"kubernetes.io/projected/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-kube-api-access-vvdrd\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-run-httpd\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-combined-ca-bundle\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-config-data\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-sg-core-conf-yaml\") pod \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\" (UID: \"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf\") " Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.782851 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.783112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.786727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-scripts" (OuterVolumeSpecName: "scripts") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.786925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-kube-api-access-vvdrd" (OuterVolumeSpecName: "kube-api-access-vvdrd") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "kube-api-access-vvdrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.802436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.830649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.843892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-config-data" (OuterVolumeSpecName: "config-data") pod "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" (UID: "542a1ea2-98ab-42cb-a7f0-7624b26c3ccf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.883757 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.883784 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.883795 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.883802 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.883825 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:26 crc kubenswrapper[4707]: I0121 15:22:26.883832 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvdrd\" (UniqueName: \"kubernetes.io/projected/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf-kube-api-access-vvdrd\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110853 4707 generic.go:334] "Generic (PLEG): container finished" podID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" exitCode=0 Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110886 4707 generic.go:334] "Generic (PLEG): container finished" podID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" exitCode=2 Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110895 4707 generic.go:334] "Generic (PLEG): container finished" podID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" exitCode=0 Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110903 4707 generic.go:334] "Generic (PLEG): container finished" podID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" exitCode=0 Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110909 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerDied","Data":"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8"} Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerDied","Data":"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d"} Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerDied","Data":"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca"} Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerDied","Data":"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d"} Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.111002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"542a1ea2-98ab-42cb-a7f0-7624b26c3ccf","Type":"ContainerDied","Data":"9cedfc984d06dc321fb3049ec3839fc3baa5c6776a6552d48aa6dd01be4cbc51"} Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.110982 4707 scope.go:117] "RemoveContainer" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.130479 4707 scope.go:117] "RemoveContainer" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.141596 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.154479 4707 scope.go:117] "RemoveContainer" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.159228 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168002 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.168388 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-central-agent" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168401 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-central-agent" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.168419 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-notification-agent" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168426 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-notification-agent" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.168447 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="sg-core" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168452 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="sg-core" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.168463 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="proxy-httpd" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168469 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="proxy-httpd" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168610 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-notification-agent" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168618 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="ceilometer-central-agent" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168625 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="sg-core" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.168640 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" containerName="proxy-httpd" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.170029 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.171946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.172395 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.172672 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.172743 4707 scope.go:117] "RemoveContainer" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.186537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-scripts\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.186568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtnch\" (UniqueName: \"kubernetes.io/projected/4bc8db82-0945-4c71-955c-6e80d92a8c15-kube-api-access-vtnch\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.186600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-config-data\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc 
kubenswrapper[4707]: I0121 15:22:27.186668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.186684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.186747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-run-httpd\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.186785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-log-httpd\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.192230 4707 scope.go:117] "RemoveContainer" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.192670 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": container with ID starting with b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8 not found: ID does not exist" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.192701 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8"} err="failed to get container status \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": rpc error: code = NotFound desc = could not find container \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": container with ID starting with b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8 not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.192722 4707 scope.go:117] "RemoveContainer" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.193043 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": container with ID starting with 66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d not found: ID does not exist" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.193077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d"} err="failed to get container status \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": rpc error: code = NotFound desc = could not find container \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": container with ID starting with 66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.193103 4707 scope.go:117] "RemoveContainer" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.193475 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": container with ID starting with 6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca not found: ID does not exist" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.193500 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca"} err="failed to get container status \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": rpc error: code = NotFound desc = could not find container \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": container with ID starting with 6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.193514 4707 scope.go:117] "RemoveContainer" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" Jan 21 15:22:27 crc kubenswrapper[4707]: E0121 15:22:27.193795 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": container with ID starting with 802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d not found: ID does not exist" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.193832 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d"} err="failed to get container status \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": rpc error: code = NotFound desc = could not find container \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": container with ID starting with 802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.193847 4707 scope.go:117] "RemoveContainer" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194090 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8"} err="failed to get container status \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": rpc error: code = NotFound desc = could not find container 
\"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": container with ID starting with b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8 not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194108 4707 scope.go:117] "RemoveContainer" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194356 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d"} err="failed to get container status \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": rpc error: code = NotFound desc = could not find container \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": container with ID starting with 66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194385 4707 scope.go:117] "RemoveContainer" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194622 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca"} err="failed to get container status \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": rpc error: code = NotFound desc = could not find container \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": container with ID starting with 6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194644 4707 scope.go:117] "RemoveContainer" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194885 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d"} err="failed to get container status \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": rpc error: code = NotFound desc = could not find container \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": container with ID starting with 802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.194904 4707 scope.go:117] "RemoveContainer" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.195092 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8"} err="failed to get container status \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": rpc error: code = NotFound desc = could not find container \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": container with ID starting with b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8 not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.195117 4707 scope.go:117] "RemoveContainer" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.196419 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542a1ea2-98ab-42cb-a7f0-7624b26c3ccf" path="/var/lib/kubelet/pods/542a1ea2-98ab-42cb-a7f0-7624b26c3ccf/volumes" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.197704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d"} err="failed to get container status \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": rpc error: code = NotFound desc = could not find container \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": container with ID starting with 66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.197723 4707 scope.go:117] "RemoveContainer" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.199680 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca"} err="failed to get container status \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": rpc error: code = NotFound desc = could not find container \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": container with ID starting with 6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.199710 4707 scope.go:117] "RemoveContainer" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.200216 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d"} err="failed to get container status \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": rpc error: code = NotFound desc = could not find container \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": container with ID starting with 802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.200237 4707 scope.go:117] "RemoveContainer" containerID="b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.200487 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8"} err="failed to get container status \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": rpc error: code = NotFound desc = could not find container \"b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8\": container with ID starting with b69596a47a2a87ee940d36c3b4f832924385e7f252a76b1504140e541b7efbb8 not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.200515 4707 scope.go:117] "RemoveContainer" containerID="66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.200694 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d"} err="failed to get container status 
\"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": rpc error: code = NotFound desc = could not find container \"66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d\": container with ID starting with 66f2e0a6a5c7d267b5df6678ee6a9fbd40df291f185fdf1dbca60a22c4cca63d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.200714 4707 scope.go:117] "RemoveContainer" containerID="6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.201037 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca"} err="failed to get container status \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": rpc error: code = NotFound desc = could not find container \"6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca\": container with ID starting with 6eb69e9cd0378ca1dd2db07ec0e5a81833d3818b64dbcf71b997fe1f5e0ffaca not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.201063 4707 scope.go:117] "RemoveContainer" containerID="802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.201301 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d"} err="failed to get container status \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": rpc error: code = NotFound desc = could not find container \"802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d\": container with ID starting with 802834ae332b7566620a2a141ef03915e593a2415f0e7208879888f5dce34b2d not found: ID does not exist" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-scripts\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtnch\" (UniqueName: \"kubernetes.io/projected/4bc8db82-0945-4c71-955c-6e80d92a8c15-kube-api-access-vtnch\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-config-data\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-run-httpd\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-log-httpd\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-run-httpd\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.287982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-log-httpd\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.290657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-scripts\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.290697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.290880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.291789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-config-data\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.301223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtnch\" (UniqueName: \"kubernetes.io/projected/4bc8db82-0945-4c71-955c-6e80d92a8c15-kube-api-access-vtnch\") pod \"ceilometer-0\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.488478 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:27 crc kubenswrapper[4707]: I0121 15:22:27.857446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:22:27 crc kubenswrapper[4707]: W0121 15:22:27.863717 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc8db82_0945_4c71_955c_6e80d92a8c15.slice/crio-07485e3a2f530453d18e47166fe32a67b0f55077a72f26ed3ffb0faafaf672ea WatchSource:0}: Error finding container 07485e3a2f530453d18e47166fe32a67b0f55077a72f26ed3ffb0faafaf672ea: Status 404 returned error can't find the container with id 07485e3a2f530453d18e47166fe32a67b0f55077a72f26ed3ffb0faafaf672ea Jan 21 15:22:28 crc kubenswrapper[4707]: I0121 15:22:28.118284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerStarted","Data":"07485e3a2f530453d18e47166fe32a67b0f55077a72f26ed3ffb0faafaf672ea"} Jan 21 15:22:29 crc kubenswrapper[4707]: I0121 15:22:29.128950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerStarted","Data":"e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf"} Jan 21 15:22:30 crc kubenswrapper[4707]: I0121 15:22:30.143429 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerStarted","Data":"91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe"} Jan 21 15:22:30 crc kubenswrapper[4707]: I0121 15:22:30.143645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerStarted","Data":"7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f"} Jan 21 15:22:31 crc kubenswrapper[4707]: I0121 15:22:31.154971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerStarted","Data":"23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1"} Jan 21 15:22:31 crc kubenswrapper[4707]: I0121 15:22:31.155382 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:31 crc kubenswrapper[4707]: I0121 15:22:31.178554 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.084893863 podStartE2EDuration="4.17853992s" podCreationTimestamp="2026-01-21 15:22:27 +0000 UTC" firstStartedPulling="2026-01-21 15:22:27.865701917 +0000 UTC m=+1245.047218140" lastFinishedPulling="2026-01-21 15:22:30.959347974 +0000 UTC m=+1248.140864197" observedRunningTime="2026-01-21 15:22:31.175310249 +0000 UTC m=+1248.356826472" watchObservedRunningTime="2026-01-21 15:22:31.17853992 +0000 UTC m=+1248.360056143" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.487356 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.868944 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl"] Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.870040 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.871766 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.872978 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.877432 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl"] Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.924871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.924963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-scripts\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.925012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mfs\" (UniqueName: \"kubernetes.io/projected/024458dc-51a9-4318-a563-439e24104ffd-kube-api-access-h2mfs\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.925099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-config-data\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.968831 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.970066 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.979077 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:34 crc kubenswrapper[4707]: I0121 15:22:34.979451 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.006007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.007301 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.012280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.024507 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-config-data\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25325d39-69af-40b1-86e3-34c53a08b2f9-logs\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfgz8\" (UniqueName: \"kubernetes.io/projected/25325d39-69af-40b1-86e3-34c53a08b2f9-kube-api-access-gfgz8\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hndb\" (UniqueName: \"kubernetes.io/projected/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-kube-api-access-4hndb\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-config-data\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-scripts\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mfs\" (UniqueName: \"kubernetes.io/projected/024458dc-51a9-4318-a563-439e24104ffd-kube-api-access-h2mfs\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.027877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-config-data\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.033577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.036506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-scripts\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.037742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-config-data\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.068622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mfs\" (UniqueName: \"kubernetes.io/projected/024458dc-51a9-4318-a563-439e24104ffd-kube-api-access-h2mfs\") pod \"nova-cell0-cell-mapping-z72fl\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.071865 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.073250 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.078562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.083319 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.122391 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.123297 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.125721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.129501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.129560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.129582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfgz8\" (UniqueName: \"kubernetes.io/projected/25325d39-69af-40b1-86e3-34c53a08b2f9-kube-api-access-gfgz8\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.129613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hndb\" (UniqueName: \"kubernetes.io/projected/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-kube-api-access-4hndb\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.129685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.129725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-config-data\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.130357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxjqt\" (UniqueName: 
\"kubernetes.io/projected/fc99a262-4249-4be1-88dd-3ea033b1fdcc-kube-api-access-sxjqt\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.130468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.130619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-config-data\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.130692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8fb0570-d57a-4361-9747-073f2ffd5a3f-logs\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.130845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-config-data\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.130932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c4kw\" (UniqueName: \"kubernetes.io/projected/b8fb0570-d57a-4361-9747-073f2ffd5a3f-kube-api-access-2c4kw\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.131012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.131091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25325d39-69af-40b1-86e3-34c53a08b2f9-logs\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.131707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25325d39-69af-40b1-86e3-34c53a08b2f9-logs\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.133662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.135405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.139289 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.140134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-config-data\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.140684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-config-data\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.149282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hndb\" (UniqueName: \"kubernetes.io/projected/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-kube-api-access-4hndb\") pod \"nova-scheduler-0\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.151254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfgz8\" (UniqueName: \"kubernetes.io/projected/25325d39-69af-40b1-86e3-34c53a08b2f9-kube-api-access-gfgz8\") pod \"nova-metadata-0\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.184537 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.233696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxjqt\" (UniqueName: \"kubernetes.io/projected/fc99a262-4249-4be1-88dd-3ea033b1fdcc-kube-api-access-sxjqt\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.234085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-config-data\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.234203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8fb0570-d57a-4361-9747-073f2ffd5a3f-logs\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.234567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8fb0570-d57a-4361-9747-073f2ffd5a3f-logs\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.234583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c4kw\" (UniqueName: \"kubernetes.io/projected/b8fb0570-d57a-4361-9747-073f2ffd5a3f-kube-api-access-2c4kw\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.234778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.234954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.235724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.238015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-config-data\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.241437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.243435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.245866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.253301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c4kw\" (UniqueName: \"kubernetes.io/projected/b8fb0570-d57a-4361-9747-073f2ffd5a3f-kube-api-access-2c4kw\") pod \"nova-api-0\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.256140 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxjqt\" (UniqueName: \"kubernetes.io/projected/fc99a262-4249-4be1-88dd-3ea033b1fdcc-kube-api-access-sxjqt\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.299497 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.324064 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.408865 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.529665 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.567490 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.708955 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: W0121 15:22:35.719113 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07e2299_1ba0_4747_90ca_7fe3c0c099ee.slice/crio-42496bfe87172086269c7b63219b05874837ae2388d4234896c4042071c8ac78 WatchSource:0}: Error finding container 42496bfe87172086269c7b63219b05874837ae2388d4234896c4042071c8ac78: Status 404 returned error can't find the container with id 42496bfe87172086269c7b63219b05874837ae2388d4234896c4042071c8ac78 Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.722273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.723305 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.724920 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.725403 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.732610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t"] Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.745220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-config-data\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.745281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjtjz\" (UniqueName: \"kubernetes.io/projected/af3add83-4dbe-498e-82af-461847ca1175-kube-api-access-sjtjz\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.745344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-scripts\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.745394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.778131 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: W0121 15:22:35.784025 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25325d39_69af_40b1_86e3_34c53a08b2f9.slice/crio-4c09e52c1f0a39598f52e75cf98e4a48bf5baf3382951debc460017333d158e4 WatchSource:0}: Error finding container 4c09e52c1f0a39598f52e75cf98e4a48bf5baf3382951debc460017333d158e4: Status 404 returned error can't find the container with id 4c09e52c1f0a39598f52e75cf98e4a48bf5baf3382951debc460017333d158e4 Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.861730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.862474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-config-data\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.862960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjtjz\" (UniqueName: \"kubernetes.io/projected/af3add83-4dbe-498e-82af-461847ca1175-kube-api-access-sjtjz\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.863600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-scripts\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.866759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-config-data\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.868727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-scripts\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.876096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjtjz\" (UniqueName: \"kubernetes.io/projected/af3add83-4dbe-498e-82af-461847ca1175-kube-api-access-sjtjz\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.876737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7m69t\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:35 crc kubenswrapper[4707]: I0121 15:22:35.898462 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:35 crc kubenswrapper[4707]: W0121 15:22:35.903546 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8fb0570_d57a_4361_9747_073f2ffd5a3f.slice/crio-9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f WatchSource:0}: Error finding container 9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f: Status 404 returned error can't find the container with id 9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.003313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.053521 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.196595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" event={"ID":"024458dc-51a9-4318-a563-439e24104ffd","Type":"ContainerStarted","Data":"147a04b8d4127c77f223fa46ba02a222d6252316f9d1f34d4f7164c46976c44a"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.196791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" event={"ID":"024458dc-51a9-4318-a563-439e24104ffd","Type":"ContainerStarted","Data":"f2a9408260abda75e79b33e275d47b7e8f2b6fdf295dc30e51397a0b56b8bacd"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.211660 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" podStartSLOduration=2.211646393 podStartE2EDuration="2.211646393s" podCreationTimestamp="2026-01-21 15:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:36.210039613 +0000 UTC m=+1253.391555836" watchObservedRunningTime="2026-01-21 15:22:36.211646393 +0000 UTC m=+1253.393162615" Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.213201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b8fb0570-d57a-4361-9747-073f2ffd5a3f","Type":"ContainerStarted","Data":"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.213232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b8fb0570-d57a-4361-9747-073f2ffd5a3f","Type":"ContainerStarted","Data":"9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.214950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"25325d39-69af-40b1-86e3-34c53a08b2f9","Type":"ContainerStarted","Data":"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.214974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"25325d39-69af-40b1-86e3-34c53a08b2f9","Type":"ContainerStarted","Data":"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.214982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"25325d39-69af-40b1-86e3-34c53a08b2f9","Type":"ContainerStarted","Data":"4c09e52c1f0a39598f52e75cf98e4a48bf5baf3382951debc460017333d158e4"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.218631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"fc99a262-4249-4be1-88dd-3ea033b1fdcc","Type":"ContainerStarted","Data":"298ab8a16c766786e4da12700e4e042d5ca3ece0319b1693f80a256ffadb703f"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.231560 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.231545919 podStartE2EDuration="2.231545919s" podCreationTimestamp="2026-01-21 15:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:36.229394224 +0000 UTC m=+1253.410910446" watchObservedRunningTime="2026-01-21 15:22:36.231545919 +0000 UTC m=+1253.413062141" Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.234493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f07e2299-1ba0-4747-90ca-7fe3c0c099ee","Type":"ContainerStarted","Data":"f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.234521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f07e2299-1ba0-4747-90ca-7fe3c0c099ee","Type":"ContainerStarted","Data":"42496bfe87172086269c7b63219b05874837ae2388d4234896c4042071c8ac78"} Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.249536 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.249527259 podStartE2EDuration="2.249527259s" podCreationTimestamp="2026-01-21 15:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:36.248553138 +0000 UTC m=+1253.430069380" watchObservedRunningTime="2026-01-21 15:22:36.249527259 +0000 UTC m=+1253.431043480" Jan 21 15:22:36 crc kubenswrapper[4707]: I0121 15:22:36.514732 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t"] Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.243964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b8fb0570-d57a-4361-9747-073f2ffd5a3f","Type":"ContainerStarted","Data":"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723"} Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.245412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" event={"ID":"af3add83-4dbe-498e-82af-461847ca1175","Type":"ContainerStarted","Data":"82702cae16237653df2b9f3bc5adea1229ba33351b62b15c664127ebacb3731b"} Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.245438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" event={"ID":"af3add83-4dbe-498e-82af-461847ca1175","Type":"ContainerStarted","Data":"748cee54b3a7127e2380c105db2e61b52d725614d0fe7715c1b8e4797350965e"} Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.247385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"fc99a262-4249-4be1-88dd-3ea033b1fdcc","Type":"ContainerStarted","Data":"050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6"} Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.260355 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.260344167 podStartE2EDuration="2.260344167s" podCreationTimestamp="2026-01-21 15:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:37.259111922 +0000 UTC m=+1254.440628144" watchObservedRunningTime="2026-01-21 15:22:37.260344167 +0000 UTC m=+1254.441860389" Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.278202 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" podStartSLOduration=2.278191065 podStartE2EDuration="2.278191065s" podCreationTimestamp="2026-01-21 15:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:37.268875815 +0000 UTC m=+1254.450392037" watchObservedRunningTime="2026-01-21 15:22:37.278191065 +0000 UTC m=+1254.459707287" Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.289865 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.289849178 podStartE2EDuration="2.289849178s" podCreationTimestamp="2026-01-21 15:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:37.287117705 +0000 UTC m=+1254.468633926" watchObservedRunningTime="2026-01-21 15:22:37.289849178 +0000 UTC m=+1254.471365400" Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.347733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:37 crc kubenswrapper[4707]: I0121 15:22:37.354295 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.256593 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-log" containerID="cri-o://9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1" gracePeriod=30 Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.256594 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" 
containerName="nova-metadata-metadata" containerID="cri-o://bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd" gracePeriod=30 Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.644630 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.738750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-combined-ca-bundle\") pod \"25325d39-69af-40b1-86e3-34c53a08b2f9\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.738907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-config-data\") pod \"25325d39-69af-40b1-86e3-34c53a08b2f9\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.738945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfgz8\" (UniqueName: \"kubernetes.io/projected/25325d39-69af-40b1-86e3-34c53a08b2f9-kube-api-access-gfgz8\") pod \"25325d39-69af-40b1-86e3-34c53a08b2f9\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.738968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25325d39-69af-40b1-86e3-34c53a08b2f9-logs\") pod \"25325d39-69af-40b1-86e3-34c53a08b2f9\" (UID: \"25325d39-69af-40b1-86e3-34c53a08b2f9\") " Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.739563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25325d39-69af-40b1-86e3-34c53a08b2f9-logs" (OuterVolumeSpecName: "logs") pod "25325d39-69af-40b1-86e3-34c53a08b2f9" (UID: "25325d39-69af-40b1-86e3-34c53a08b2f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.743669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25325d39-69af-40b1-86e3-34c53a08b2f9-kube-api-access-gfgz8" (OuterVolumeSpecName: "kube-api-access-gfgz8") pod "25325d39-69af-40b1-86e3-34c53a08b2f9" (UID: "25325d39-69af-40b1-86e3-34c53a08b2f9"). InnerVolumeSpecName "kube-api-access-gfgz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.759174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-config-data" (OuterVolumeSpecName: "config-data") pod "25325d39-69af-40b1-86e3-34c53a08b2f9" (UID: "25325d39-69af-40b1-86e3-34c53a08b2f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.760029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25325d39-69af-40b1-86e3-34c53a08b2f9" (UID: "25325d39-69af-40b1-86e3-34c53a08b2f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.840949 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.840974 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfgz8\" (UniqueName: \"kubernetes.io/projected/25325d39-69af-40b1-86e3-34c53a08b2f9-kube-api-access-gfgz8\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.841140 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25325d39-69af-40b1-86e3-34c53a08b2f9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:38 crc kubenswrapper[4707]: I0121 15:22:38.841151 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25325d39-69af-40b1-86e3-34c53a08b2f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.264841 4707 generic.go:334] "Generic (PLEG): container finished" podID="af3add83-4dbe-498e-82af-461847ca1175" containerID="82702cae16237653df2b9f3bc5adea1229ba33351b62b15c664127ebacb3731b" exitCode=0 Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.264893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" event={"ID":"af3add83-4dbe-498e-82af-461847ca1175","Type":"ContainerDied","Data":"82702cae16237653df2b9f3bc5adea1229ba33351b62b15c664127ebacb3731b"} Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.267470 4707 generic.go:334] "Generic (PLEG): container finished" podID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerID="bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd" exitCode=0 Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.267490 4707 generic.go:334] "Generic (PLEG): container finished" podID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerID="9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1" exitCode=143 Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.267584 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="fc99a262-4249-4be1-88dd-3ea033b1fdcc" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6" gracePeriod=30 Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.267779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.268275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"25325d39-69af-40b1-86e3-34c53a08b2f9","Type":"ContainerDied","Data":"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd"} Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.268296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"25325d39-69af-40b1-86e3-34c53a08b2f9","Type":"ContainerDied","Data":"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1"} Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.268306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"25325d39-69af-40b1-86e3-34c53a08b2f9","Type":"ContainerDied","Data":"4c09e52c1f0a39598f52e75cf98e4a48bf5baf3382951debc460017333d158e4"} Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.268329 4707 scope.go:117] "RemoveContainer" containerID="bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.294967 4707 scope.go:117] "RemoveContainer" containerID="9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.309041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.323931 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.325751 4707 scope.go:117] "RemoveContainer" containerID="bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd" Jan 21 15:22:39 crc kubenswrapper[4707]: E0121 15:22:39.328230 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd\": container with ID starting with bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd not found: ID does not exist" containerID="bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.328264 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd"} err="failed to get container status \"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd\": rpc error: code = NotFound desc = could not find container \"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd\": container with ID starting with bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd not found: ID does not exist" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.328285 4707 scope.go:117] "RemoveContainer" containerID="9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1" Jan 21 15:22:39 crc kubenswrapper[4707]: E0121 15:22:39.328645 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1\": container with ID starting with 9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1 not found: ID does not exist" 
containerID="9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.328669 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1"} err="failed to get container status \"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1\": rpc error: code = NotFound desc = could not find container \"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1\": container with ID starting with 9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1 not found: ID does not exist" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.328683 4707 scope.go:117] "RemoveContainer" containerID="bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.329084 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd"} err="failed to get container status \"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd\": rpc error: code = NotFound desc = could not find container \"bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd\": container with ID starting with bce2fc03d75c1f81fd7d6b99e7054eb40f4f81f5e9b83e8cdac6bc6ab1d41ecd not found: ID does not exist" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.329108 4707 scope.go:117] "RemoveContainer" containerID="9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.329640 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1"} err="failed to get container status \"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1\": rpc error: code = NotFound desc = could not find container \"9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1\": container with ID starting with 9d48e0090fcd64a3e1bfd032164ebe7931992477761ee3ed449a9927325360f1 not found: ID does not exist" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.330437 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:39 crc kubenswrapper[4707]: E0121 15:22:39.330772 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-log" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.330789 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-log" Jan 21 15:22:39 crc kubenswrapper[4707]: E0121 15:22:39.330831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-metadata" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.330839 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-metadata" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.330975 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-metadata" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.331003 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" containerName="nova-metadata-log" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.331767 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.335461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.336061 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.339226 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.450606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-config-data\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.450672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjc7\" (UniqueName: \"kubernetes.io/projected/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-kube-api-access-ckjc7\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.450692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.450740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-logs\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.450789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.552581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjc7\" (UniqueName: \"kubernetes.io/projected/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-kube-api-access-ckjc7\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.552613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 
21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.552643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-logs\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.552703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.552763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-config-data\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.554232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-logs\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.555643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.556073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.556107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-config-data\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.572086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjc7\" (UniqueName: \"kubernetes.io/projected/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-kube-api-access-ckjc7\") pod \"nova-metadata-0\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.653562 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.860910 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.945698 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.945748 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.945794 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.946462 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bb97b7a6a7cd703c4244311ac8e9e2660a8fee90a5d7f829eaf760d2141c0e2"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.946516 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://8bb97b7a6a7cd703c4244311ac8e9e2660a8fee90a5d7f829eaf760d2141c0e2" gracePeriod=600 Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.959262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxjqt\" (UniqueName: \"kubernetes.io/projected/fc99a262-4249-4be1-88dd-3ea033b1fdcc-kube-api-access-sxjqt\") pod \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.959295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-combined-ca-bundle\") pod \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.959441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-config-data\") pod \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\" (UID: \"fc99a262-4249-4be1-88dd-3ea033b1fdcc\") " Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.963417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc99a262-4249-4be1-88dd-3ea033b1fdcc-kube-api-access-sxjqt" (OuterVolumeSpecName: "kube-api-access-sxjqt") pod "fc99a262-4249-4be1-88dd-3ea033b1fdcc" (UID: "fc99a262-4249-4be1-88dd-3ea033b1fdcc"). InnerVolumeSpecName "kube-api-access-sxjqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.978549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc99a262-4249-4be1-88dd-3ea033b1fdcc" (UID: "fc99a262-4249-4be1-88dd-3ea033b1fdcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:39 crc kubenswrapper[4707]: I0121 15:22:39.981566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-config-data" (OuterVolumeSpecName: "config-data") pod "fc99a262-4249-4be1-88dd-3ea033b1fdcc" (UID: "fc99a262-4249-4be1-88dd-3ea033b1fdcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:40 crc kubenswrapper[4707]: W0121 15:22:40.038187 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddcb4cca_cedf_42a9_bf7d_bc80ccf3950a.slice/crio-ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe WatchSource:0}: Error finding container ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe: Status 404 returned error can't find the container with id ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.039037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.060733 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.060757 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxjqt\" (UniqueName: \"kubernetes.io/projected/fc99a262-4249-4be1-88dd-3ea033b1fdcc-kube-api-access-sxjqt\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.060767 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc99a262-4249-4be1-88dd-3ea033b1fdcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.276377 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc99a262-4249-4be1-88dd-3ea033b1fdcc" containerID="050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6" exitCode=0 Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.276556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"fc99a262-4249-4be1-88dd-3ea033b1fdcc","Type":"ContainerDied","Data":"050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.276579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"fc99a262-4249-4be1-88dd-3ea033b1fdcc","Type":"ContainerDied","Data":"298ab8a16c766786e4da12700e4e042d5ca3ece0319b1693f80a256ffadb703f"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.276593 4707 scope.go:117] "RemoveContainer" containerID="050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6" Jan 21 15:22:40 crc 
kubenswrapper[4707]: I0121 15:22:40.276705 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.281100 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="8bb97b7a6a7cd703c4244311ac8e9e2660a8fee90a5d7f829eaf760d2141c0e2" exitCode=0 Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.281174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"8bb97b7a6a7cd703c4244311ac8e9e2660a8fee90a5d7f829eaf760d2141c0e2"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.281402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"8593fceae8f46f5fa87aa536502a4e355cae8485d3fdcdf096b66dd3daf7b85e"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.283542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a","Type":"ContainerStarted","Data":"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.283698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a","Type":"ContainerStarted","Data":"ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.285029 4707 generic.go:334] "Generic (PLEG): container finished" podID="024458dc-51a9-4318-a563-439e24104ffd" containerID="147a04b8d4127c77f223fa46ba02a222d6252316f9d1f34d4f7164c46976c44a" exitCode=0 Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.285184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" event={"ID":"024458dc-51a9-4318-a563-439e24104ffd","Type":"ContainerDied","Data":"147a04b8d4127c77f223fa46ba02a222d6252316f9d1f34d4f7164c46976c44a"} Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.300262 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.304501 4707 scope.go:117] "RemoveContainer" containerID="050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6" Jan 21 15:22:40 crc kubenswrapper[4707]: E0121 15:22:40.305414 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6\": container with ID starting with 050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6 not found: ID does not exist" containerID="050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.305448 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6"} err="failed to get container status \"050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6\": rpc error: code = NotFound desc = could not find container 
\"050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6\": container with ID starting with 050e5d350d774b574c78cbf30ad531968e4396f2b94a31668b5dab15d004b2b6 not found: ID does not exist" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.305467 4707 scope.go:117] "RemoveContainer" containerID="cf1b4064e4934cd4c43802ba3e90cfa41897c007fa175bcbffaf2760e553c708" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.326135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.330508 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.342420 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:40 crc kubenswrapper[4707]: E0121 15:22:40.342948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc99a262-4249-4be1-88dd-3ea033b1fdcc" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.342966 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc99a262-4249-4be1-88dd-3ea033b1fdcc" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.343202 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc99a262-4249-4be1-88dd-3ea033b1fdcc" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.343780 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.346637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.346883 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.347026 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.351718 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.468030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9phjz\" (UniqueName: \"kubernetes.io/projected/117c40c7-7452-42e7-8a5a-0a1123ac030e-kube-api-access-9phjz\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.468254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.468341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.468420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.468761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.548895 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.570707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.570776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.570868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.570995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.571051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9phjz\" (UniqueName: \"kubernetes.io/projected/117c40c7-7452-42e7-8a5a-0a1123ac030e-kube-api-access-9phjz\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.575532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.575875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.576864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.581853 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.586921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9phjz\" (UniqueName: \"kubernetes.io/projected/117c40c7-7452-42e7-8a5a-0a1123ac030e-kube-api-access-9phjz\") pod \"nova-cell1-novncproxy-0\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.664936 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.672224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-scripts\") pod \"af3add83-4dbe-498e-82af-461847ca1175\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.672295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjtjz\" (UniqueName: \"kubernetes.io/projected/af3add83-4dbe-498e-82af-461847ca1175-kube-api-access-sjtjz\") pod \"af3add83-4dbe-498e-82af-461847ca1175\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.672328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-config-data\") pod \"af3add83-4dbe-498e-82af-461847ca1175\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.672431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-combined-ca-bundle\") pod \"af3add83-4dbe-498e-82af-461847ca1175\" (UID: \"af3add83-4dbe-498e-82af-461847ca1175\") " Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.675120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3add83-4dbe-498e-82af-461847ca1175-kube-api-access-sjtjz" (OuterVolumeSpecName: "kube-api-access-sjtjz") pod "af3add83-4dbe-498e-82af-461847ca1175" (UID: "af3add83-4dbe-498e-82af-461847ca1175"). InnerVolumeSpecName "kube-api-access-sjtjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.675254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-scripts" (OuterVolumeSpecName: "scripts") pod "af3add83-4dbe-498e-82af-461847ca1175" (UID: "af3add83-4dbe-498e-82af-461847ca1175"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.692249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-config-data" (OuterVolumeSpecName: "config-data") pod "af3add83-4dbe-498e-82af-461847ca1175" (UID: "af3add83-4dbe-498e-82af-461847ca1175"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.693029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af3add83-4dbe-498e-82af-461847ca1175" (UID: "af3add83-4dbe-498e-82af-461847ca1175"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.774315 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.774345 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.774354 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjtjz\" (UniqueName: \"kubernetes.io/projected/af3add83-4dbe-498e-82af-461847ca1175-kube-api-access-sjtjz\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:40 crc kubenswrapper[4707]: I0121 15:22:40.774366 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3add83-4dbe-498e-82af-461847ca1175-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.019773 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:22:41 crc kubenswrapper[4707]: W0121 15:22:41.021016 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117c40c7_7452_42e7_8a5a_0a1123ac030e.slice/crio-fba276945f55be0cd800a6fcb1dac6fdf6e1f8722c4d5600d17ea16ba4bd3261 WatchSource:0}: Error finding container fba276945f55be0cd800a6fcb1dac6fdf6e1f8722c4d5600d17ea16ba4bd3261: Status 404 returned error can't find the container with id fba276945f55be0cd800a6fcb1dac6fdf6e1f8722c4d5600d17ea16ba4bd3261 Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.191215 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25325d39-69af-40b1-86e3-34c53a08b2f9" path="/var/lib/kubelet/pods/25325d39-69af-40b1-86e3-34c53a08b2f9/volumes" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.191935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc99a262-4249-4be1-88dd-3ea033b1fdcc" path="/var/lib/kubelet/pods/fc99a262-4249-4be1-88dd-3ea033b1fdcc/volumes" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.294181 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.294180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t" event={"ID":"af3add83-4dbe-498e-82af-461847ca1175","Type":"ContainerDied","Data":"748cee54b3a7127e2380c105db2e61b52d725614d0fe7715c1b8e4797350965e"} Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.294279 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="748cee54b3a7127e2380c105db2e61b52d725614d0fe7715c1b8e4797350965e" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.296511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"117c40c7-7452-42e7-8a5a-0a1123ac030e","Type":"ContainerStarted","Data":"3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f"} Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.296543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"117c40c7-7452-42e7-8a5a-0a1123ac030e","Type":"ContainerStarted","Data":"fba276945f55be0cd800a6fcb1dac6fdf6e1f8722c4d5600d17ea16ba4bd3261"} Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.321000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a","Type":"ContainerStarted","Data":"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002"} Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.323257 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.3232459890000001 podStartE2EDuration="1.323245989s" podCreationTimestamp="2026-01-21 15:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:41.315362261 +0000 UTC m=+1258.496878483" watchObservedRunningTime="2026-01-21 15:22:41.323245989 +0000 UTC m=+1258.504762212" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.354069 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.354052678 podStartE2EDuration="2.354052678s" podCreationTimestamp="2026-01-21 15:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:41.351996823 +0000 UTC m=+1258.533513045" watchObservedRunningTime="2026-01-21 15:22:41.354052678 +0000 UTC m=+1258.535568901" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.360893 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:22:41 crc kubenswrapper[4707]: E0121 15:22:41.361276 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3add83-4dbe-498e-82af-461847ca1175" containerName="nova-cell1-conductor-db-sync" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.361292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3add83-4dbe-498e-82af-461847ca1175" containerName="nova-cell1-conductor-db-sync" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.361465 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3add83-4dbe-498e-82af-461847ca1175" containerName="nova-cell1-conductor-db-sync" 
Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.362031 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.364661 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.379609 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.485798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.485859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.486006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s4hp\" (UniqueName: \"kubernetes.io/projected/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-kube-api-access-7s4hp\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.587163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s4hp\" (UniqueName: \"kubernetes.io/projected/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-kube-api-access-7s4hp\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.587248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.587280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.591785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.591919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.603469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s4hp\" (UniqueName: \"kubernetes.io/projected/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-kube-api-access-7s4hp\") pod \"nova-cell1-conductor-0\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.666491 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.679604 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.789947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-config-data\") pod \"024458dc-51a9-4318-a563-439e24104ffd\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.790020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-combined-ca-bundle\") pod \"024458dc-51a9-4318-a563-439e24104ffd\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.790044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-scripts\") pod \"024458dc-51a9-4318-a563-439e24104ffd\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.790069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2mfs\" (UniqueName: \"kubernetes.io/projected/024458dc-51a9-4318-a563-439e24104ffd-kube-api-access-h2mfs\") pod \"024458dc-51a9-4318-a563-439e24104ffd\" (UID: \"024458dc-51a9-4318-a563-439e24104ffd\") " Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.793595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024458dc-51a9-4318-a563-439e24104ffd-kube-api-access-h2mfs" (OuterVolumeSpecName: "kube-api-access-h2mfs") pod "024458dc-51a9-4318-a563-439e24104ffd" (UID: "024458dc-51a9-4318-a563-439e24104ffd"). InnerVolumeSpecName "kube-api-access-h2mfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.794312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-scripts" (OuterVolumeSpecName: "scripts") pod "024458dc-51a9-4318-a563-439e24104ffd" (UID: "024458dc-51a9-4318-a563-439e24104ffd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.810962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-config-data" (OuterVolumeSpecName: "config-data") pod "024458dc-51a9-4318-a563-439e24104ffd" (UID: "024458dc-51a9-4318-a563-439e24104ffd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.816529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "024458dc-51a9-4318-a563-439e24104ffd" (UID: "024458dc-51a9-4318-a563-439e24104ffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.891850 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.892034 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.892043 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2mfs\" (UniqueName: \"kubernetes.io/projected/024458dc-51a9-4318-a563-439e24104ffd-kube-api-access-h2mfs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:41 crc kubenswrapper[4707]: I0121 15:22:41.892053 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/024458dc-51a9-4318-a563-439e24104ffd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:42 crc kubenswrapper[4707]: W0121 15:22:42.068959 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9606e288_80ca_48db_a0a6_8f0f4ec67eaa.slice/crio-7c3d086ce662659164b45b8e96e8bba61663245fff6912cde3428dbf310367af WatchSource:0}: Error finding container 7c3d086ce662659164b45b8e96e8bba61663245fff6912cde3428dbf310367af: Status 404 returned error can't find the container with id 7c3d086ce662659164b45b8e96e8bba61663245fff6912cde3428dbf310367af Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.069275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.328669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9606e288-80ca-48db-a0a6-8f0f4ec67eaa","Type":"ContainerStarted","Data":"eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320"} Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.328712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9606e288-80ca-48db-a0a6-8f0f4ec67eaa","Type":"ContainerStarted","Data":"7c3d086ce662659164b45b8e96e8bba61663245fff6912cde3428dbf310367af"} Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.329534 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:42 crc 
kubenswrapper[4707]: I0121 15:22:42.340002 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.340889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl" event={"ID":"024458dc-51a9-4318-a563-439e24104ffd","Type":"ContainerDied","Data":"f2a9408260abda75e79b33e275d47b7e8f2b6fdf295dc30e51397a0b56b8bacd"} Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.340960 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a9408260abda75e79b33e275d47b7e8f2b6fdf295dc30e51397a0b56b8bacd" Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.364133 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.364113976 podStartE2EDuration="1.364113976s" podCreationTimestamp="2026-01-21 15:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:42.346290153 +0000 UTC m=+1259.527806375" watchObservedRunningTime="2026-01-21 15:22:42.364113976 +0000 UTC m=+1259.545630199" Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.486395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.486639 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-log" containerID="cri-o://f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a" gracePeriod=30 Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.486736 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-api" containerID="cri-o://ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723" gracePeriod=30 Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.508229 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.508409 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="f07e2299-1ba0-4747-90ca-7fe3c0c099ee" containerName="nova-scheduler-scheduler" containerID="cri-o://f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985" gracePeriod=30 Jan 21 15:22:42 crc kubenswrapper[4707]: I0121 15:22:42.543769 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.286425 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348562 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerID="ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723" exitCode=0 Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348591 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerID="f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a" exitCode=143 Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348615 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b8fb0570-d57a-4361-9747-073f2ffd5a3f","Type":"ContainerDied","Data":"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723"} Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b8fb0570-d57a-4361-9747-073f2ffd5a3f","Type":"ContainerDied","Data":"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a"} Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b8fb0570-d57a-4361-9747-073f2ffd5a3f","Type":"ContainerDied","Data":"9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f"} Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348752 4707 scope.go:117] "RemoveContainer" containerID="ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348933 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-log" containerID="cri-o://966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa" gracePeriod=30 Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.348984 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-metadata" containerID="cri-o://de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002" gracePeriod=30 Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.369988 4707 scope.go:117] "RemoveContainer" containerID="f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.384135 4707 scope.go:117] "RemoveContainer" containerID="ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723" Jan 21 15:22:43 crc kubenswrapper[4707]: E0121 15:22:43.384435 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723\": container with ID starting with ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723 not found: ID does not exist" containerID="ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.384469 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723"} err="failed to get container status \"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723\": rpc error: code = NotFound desc = could not find container \"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723\": container with ID starting with ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723 not found: ID does not exist" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.384495 4707 scope.go:117] "RemoveContainer" containerID="f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a" Jan 21 15:22:43 crc kubenswrapper[4707]: E0121 15:22:43.384747 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a\": container with ID starting with f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a not found: ID does not exist" containerID="f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.384775 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a"} err="failed to get container status \"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a\": rpc error: code = NotFound desc = could not find container \"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a\": container with ID starting with f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a not found: ID does not exist" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.384796 4707 scope.go:117] "RemoveContainer" containerID="ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.385008 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723"} err="failed to get container status \"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723\": rpc error: code = NotFound desc = could not find container \"ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723\": container with ID starting with ba90683d3cb73df12334d888084fff4bed630ed067e289e296bc989f59e21723 not found: ID does not exist" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.385030 4707 scope.go:117] "RemoveContainer" containerID="f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.385219 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a"} err="failed to get container status \"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a\": rpc error: code = NotFound desc = could not find container \"f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a\": container with ID starting with f607de354fa7ca3cd22575ae8f7c4f379ad332474f40dd4648a93eb9762da75a not found: ID does not exist" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.417333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c4kw\" (UniqueName: \"kubernetes.io/projected/b8fb0570-d57a-4361-9747-073f2ffd5a3f-kube-api-access-2c4kw\") pod 
\"b8fb0570-d57a-4361-9747-073f2ffd5a3f\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.417484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8fb0570-d57a-4361-9747-073f2ffd5a3f-logs\") pod \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.417555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-config-data\") pod \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.417589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-combined-ca-bundle\") pod \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\" (UID: \"b8fb0570-d57a-4361-9747-073f2ffd5a3f\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.417705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8fb0570-d57a-4361-9747-073f2ffd5a3f-logs" (OuterVolumeSpecName: "logs") pod "b8fb0570-d57a-4361-9747-073f2ffd5a3f" (UID: "b8fb0570-d57a-4361-9747-073f2ffd5a3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.418030 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8fb0570-d57a-4361-9747-073f2ffd5a3f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.421786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fb0570-d57a-4361-9747-073f2ffd5a3f-kube-api-access-2c4kw" (OuterVolumeSpecName: "kube-api-access-2c4kw") pod "b8fb0570-d57a-4361-9747-073f2ffd5a3f" (UID: "b8fb0570-d57a-4361-9747-073f2ffd5a3f"). InnerVolumeSpecName "kube-api-access-2c4kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.438242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8fb0570-d57a-4361-9747-073f2ffd5a3f" (UID: "b8fb0570-d57a-4361-9747-073f2ffd5a3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.440734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-config-data" (OuterVolumeSpecName: "config-data") pod "b8fb0570-d57a-4361-9747-073f2ffd5a3f" (UID: "b8fb0570-d57a-4361-9747-073f2ffd5a3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.520106 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c4kw\" (UniqueName: \"kubernetes.io/projected/b8fb0570-d57a-4361-9747-073f2ffd5a3f-kube-api-access-2c4kw\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.520131 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.520143 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fb0570-d57a-4361-9747-073f2ffd5a3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.685728 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.700626 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716090 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:43 crc kubenswrapper[4707]: E0121 15:22:43.716468 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024458dc-51a9-4318-a563-439e24104ffd" containerName="nova-manage" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716480 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="024458dc-51a9-4318-a563-439e24104ffd" containerName="nova-manage" Jan 21 15:22:43 crc kubenswrapper[4707]: E0121 15:22:43.716501 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-api" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-api" Jan 21 15:22:43 crc kubenswrapper[4707]: E0121 15:22:43.716522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-log" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-log" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716680 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-log" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716701 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="024458dc-51a9-4318-a563-439e24104ffd" containerName="nova-manage" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.716711 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" containerName="nova-api-api" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.717570 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.720399 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.724902 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.737341 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.824567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-logs\") pod \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.824702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckjc7\" (UniqueName: \"kubernetes.io/projected/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-kube-api-access-ckjc7\") pod \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.824792 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-config-data\") pod \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.824875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-nova-metadata-tls-certs\") pod \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.824913 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-combined-ca-bundle\") pod \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\" (UID: \"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a\") " Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.825184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-logs\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.825240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27tjd\" (UniqueName: \"kubernetes.io/projected/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-kube-api-access-27tjd\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.825277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: 
I0121 15:22:43.825186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-logs" (OuterVolumeSpecName: "logs") pod "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" (UID: "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.825416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-config-data\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.825684 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.828176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-kube-api-access-ckjc7" (OuterVolumeSpecName: "kube-api-access-ckjc7") pod "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" (UID: "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a"). InnerVolumeSpecName "kube-api-access-ckjc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.844876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-config-data" (OuterVolumeSpecName: "config-data") pod "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" (UID: "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.855881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" (UID: "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.863364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" (UID: "ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-logs\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27tjd\" (UniqueName: \"kubernetes.io/projected/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-kube-api-access-27tjd\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-config-data\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927547 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckjc7\" (UniqueName: \"kubernetes.io/projected/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-kube-api-access-ckjc7\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927565 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927574 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.927585 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.928222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-logs\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.930303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-config-data\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.930980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:43 crc kubenswrapper[4707]: I0121 15:22:43.941048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27tjd\" (UniqueName: \"kubernetes.io/projected/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-kube-api-access-27tjd\") pod \"nova-api-0\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.045391 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357507 4707 generic.go:334] "Generic (PLEG): container finished" podID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerID="de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002" exitCode=0 Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357684 4707 generic.go:334] "Generic (PLEG): container finished" podID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerID="966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa" exitCode=143 Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357540 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a","Type":"ContainerDied","Data":"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002"} Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a","Type":"ContainerDied","Data":"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa"} Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a","Type":"ContainerDied","Data":"ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe"} Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.357768 4707 scope.go:117] "RemoveContainer" containerID="de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.386290 4707 scope.go:117] "RemoveContainer" containerID="966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.388609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.395137 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.409268 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:44 crc kubenswrapper[4707]: E0121 15:22:44.410290 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-log" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.410306 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-log" Jan 21 15:22:44 crc 
kubenswrapper[4707]: E0121 15:22:44.410336 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-metadata" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.410342 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-metadata" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.410502 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-metadata" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.410510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" containerName="nova-metadata-log" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.411387 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.413750 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.414001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.422717 4707 scope.go:117] "RemoveContainer" containerID="de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002" Jan 21 15:22:44 crc kubenswrapper[4707]: E0121 15:22:44.423172 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002\": container with ID starting with de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002 not found: ID does not exist" containerID="de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423195 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002"} err="failed to get container status \"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002\": rpc error: code = NotFound desc = could not find container \"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002\": container with ID starting with de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002 not found: ID does not exist" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423211 4707 scope.go:117] "RemoveContainer" containerID="966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa" Jan 21 15:22:44 crc kubenswrapper[4707]: E0121 15:22:44.423456 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa\": container with ID starting with 966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa not found: ID does not exist" containerID="966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423471 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa"} err="failed to get container status 
\"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa\": rpc error: code = NotFound desc = could not find container \"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa\": container with ID starting with 966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa not found: ID does not exist" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423484 4707 scope.go:117] "RemoveContainer" containerID="de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423659 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002"} err="failed to get container status \"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002\": rpc error: code = NotFound desc = could not find container \"de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002\": container with ID starting with de35219dd41ad4fd2c26aff66ea0c4bd0e9b5a25d02fccab394368a181558002 not found: ID does not exist" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423673 4707 scope.go:117] "RemoveContainer" containerID="966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.423853 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa"} err="failed to get container status \"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa\": rpc error: code = NotFound desc = could not find container \"966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa\": container with ID starting with 966e590d761c3ea4f60dc157291736dfb5ec4a89b3e854597a3bd28e730622aa not found: ID does not exist" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.432884 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.440478 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.536160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.536389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9360b9c-9e26-4d1c-981f-7d84b28555cf-logs\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.536563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf749\" (UniqueName: \"kubernetes.io/projected/e9360b9c-9e26-4d1c-981f-7d84b28555cf-kube-api-access-zf749\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.536654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-config-data\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.536747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.638396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf749\" (UniqueName: \"kubernetes.io/projected/e9360b9c-9e26-4d1c-981f-7d84b28555cf-kube-api-access-zf749\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.638440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-config-data\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.638467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.638542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.638615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9360b9c-9e26-4d1c-981f-7d84b28555cf-logs\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.638978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9360b9c-9e26-4d1c-981f-7d84b28555cf-logs\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.641361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.641905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.642045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-config-data\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.652605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf749\" (UniqueName: \"kubernetes.io/projected/e9360b9c-9e26-4d1c-981f-7d84b28555cf-kube-api-access-zf749\") pod \"nova-metadata-0\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:44 crc kubenswrapper[4707]: I0121 15:22:44.733791 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.034834 4707 scope.go:117] "RemoveContainer" containerID="a1746eb35de025fc0fec59c5f3fcfa37f77eeb90568db27ad9ffb35d3ca2b33f" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.052582 4707 scope.go:117] "RemoveContainer" containerID="ca241c4a9b687cbabd145c1ec655336baf10d6aef14fd3951470b276d7bd3c4f" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.068392 4707 scope.go:117] "RemoveContainer" containerID="1e7579152ee0a2a533efbfa1b34360456cb8779e12e40941d34d9745d3fbe0db" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.084958 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.093208 4707 scope.go:117] "RemoveContainer" containerID="b439aa595d2654611a443e0a117c7e818373bc9a2b50676f7467d7a722473847" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.118884 4707 scope.go:117] "RemoveContainer" containerID="924e0ee83104507d86b0618f79b0bd26660930ec44ec0006755016cc489eea6d" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.143509 4707 scope.go:117] "RemoveContainer" containerID="6ef3e2522324b1b698104b8d16b38946069af9a3394c382864fecbc285f1a8ca" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.158547 4707 scope.go:117] "RemoveContainer" containerID="93752cb0214af1aa190397cad0406ed11bd3f6dba015c3fbf12e099f8830c859" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.177180 4707 scope.go:117] "RemoveContainer" containerID="c54252794c0f19f6048f542f3e43812c6e224174c052c5045000b9e469f61138" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.191045 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fb0570-d57a-4361-9747-073f2ffd5a3f" path="/var/lib/kubelet/pods/b8fb0570-d57a-4361-9747-073f2ffd5a3f/volumes" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.191617 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a" path="/var/lib/kubelet/pods/ddcb4cca-cedf-42a9-bf7d-bc80ccf3950a/volumes" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.196474 4707 scope.go:117] "RemoveContainer" containerID="87541c3326360047b9a65f7edaf21d2685779154d9e8a57721bdf77d39ce2562" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.218881 4707 scope.go:117] "RemoveContainer" containerID="f1984e9b20aa50544fba8536d85aecb53e0c63d0022c32cbaf91a219384719e0" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.253933 4707 scope.go:117] "RemoveContainer" 
containerID="9fb51b0f5a10f7dd754fc8560172fa188afb70f88828db786b4b522fc4d90f77" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.277864 4707 scope.go:117] "RemoveContainer" containerID="31896ebb7fd8a0ba8f4a099a589d633c45bbca89c2f32b8849942170749a152c" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.294065 4707 scope.go:117] "RemoveContainer" containerID="fb73b3d35ea2bf064a335bd348e92533a992b130f7ef5249fa1d76315e8e8a33" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.313576 4707 scope.go:117] "RemoveContainer" containerID="0a2191db90bc16f94bfd958e8b8824d4ccdd945c494ba2d8e86a5b5a077adeca" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.349098 4707 scope.go:117] "RemoveContainer" containerID="2788f083d09f6b70bb2ba314399bb3a57cca0351ffd15839c0880290b15ad16c" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.367838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1731c030-db8a-4a3f-a597-3fd64a5aeb3d","Type":"ContainerStarted","Data":"2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d"} Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.367879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1731c030-db8a-4a3f-a597-3fd64a5aeb3d","Type":"ContainerStarted","Data":"8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d"} Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.367890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1731c030-db8a-4a3f-a597-3fd64a5aeb3d","Type":"ContainerStarted","Data":"054f9fe98df0600551c3df948922a1978b2871917448dca48b57673283941490"} Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.376008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"e9360b9c-9e26-4d1c-981f-7d84b28555cf","Type":"ContainerStarted","Data":"616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1"} Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.376039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"e9360b9c-9e26-4d1c-981f-7d84b28555cf","Type":"ContainerStarted","Data":"d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858"} Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.376049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"e9360b9c-9e26-4d1c-981f-7d84b28555cf","Type":"ContainerStarted","Data":"78e5b09c14924594d218000376500c167d585204f83d0e92c98033d03bb5aa0f"} Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.377480 4707 scope.go:117] "RemoveContainer" containerID="111c9b79aba7c1ce6070d460e7114904ba89c9d35a5fd542f6794479355ce3d8" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.386950 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.386938849 podStartE2EDuration="2.386938849s" podCreationTimestamp="2026-01-21 15:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:45.381518982 +0000 UTC m=+1262.563035204" watchObservedRunningTime="2026-01-21 15:22:45.386938849 +0000 UTC m=+1262.568455072" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.400470 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.400458433 podStartE2EDuration="1.400458433s" podCreationTimestamp="2026-01-21 15:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:45.393050257 +0000 UTC m=+1262.574566479" watchObservedRunningTime="2026-01-21 15:22:45.400458433 +0000 UTC m=+1262.581974655" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.406015 4707 scope.go:117] "RemoveContainer" containerID="8e63927c2f73623e25aa4f9c0e08d2cdd5be5f7361f328bb4e017a36264b524f" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.435564 4707 scope.go:117] "RemoveContainer" containerID="e747338a48d85b7d6ace762a38846bdb6e3624ba8bf4af8b159c5363c6252c93" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.459784 4707 scope.go:117] "RemoveContainer" containerID="4dd228771194d7a02860fad493275cd1310e5a44053f4995cf7c2c0c9d5053e7" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.473840 4707 scope.go:117] "RemoveContainer" containerID="b795d9c6c95d2ea68b071f6114cbc0a1b83db6018018c159a2b056eaa4b5b499" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.489781 4707 scope.go:117] "RemoveContainer" containerID="0c70505b8377bd779309c265a19c1a4d599d7c7e55b279d3deb5f52fddc79c7f" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.505284 4707 scope.go:117] "RemoveContainer" containerID="b31f4f9052448b785d44b5b6bb2e793bfd762ba6718264d80a5fa47c35b16237" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.519844 4707 scope.go:117] "RemoveContainer" containerID="30eeef85500bce636f1829c179683cddf9eb3aacb9a8c8f4b817abaab927f81f" Jan 21 15:22:45 crc kubenswrapper[4707]: I0121 15:22:45.665122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:46 crc kubenswrapper[4707]: E0121 15:22:46.719202 4707 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddcb4cca_cedf_42a9_bf7d_bc80ccf3950a.slice/crio-ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe: Error finding container ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe: Status 404 returned error can't find the container with id ac315a85e1750542f198a7387065a8871486b1d8121af78139eba907efe2bdfe Jan 21 15:22:46 crc kubenswrapper[4707]: E0121 15:22:46.730517 4707 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8fb0570_d57a_4361_9747_073f2ffd5a3f.slice/crio-9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f: Error finding container 9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f: Status 404 returned error can't find the container with id 9f12015809dd2e85fadd29d8b6d629b3d60b2a058bef3599da6e94b70e77c62f Jan 21 15:22:46 crc kubenswrapper[4707]: I0121 15:22:46.995598 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.088558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-combined-ca-bundle\") pod \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.088932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-config-data\") pod \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.089157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hndb\" (UniqueName: \"kubernetes.io/projected/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-kube-api-access-4hndb\") pod \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\" (UID: \"f07e2299-1ba0-4747-90ca-7fe3c0c099ee\") " Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.093337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-kube-api-access-4hndb" (OuterVolumeSpecName: "kube-api-access-4hndb") pod "f07e2299-1ba0-4747-90ca-7fe3c0c099ee" (UID: "f07e2299-1ba0-4747-90ca-7fe3c0c099ee"). InnerVolumeSpecName "kube-api-access-4hndb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.109380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-config-data" (OuterVolumeSpecName: "config-data") pod "f07e2299-1ba0-4747-90ca-7fe3c0c099ee" (UID: "f07e2299-1ba0-4747-90ca-7fe3c0c099ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.109756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f07e2299-1ba0-4747-90ca-7fe3c0c099ee" (UID: "f07e2299-1ba0-4747-90ca-7fe3c0c099ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.191368 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.191395 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hndb\" (UniqueName: \"kubernetes.io/projected/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-kube-api-access-4hndb\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.191406 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2299-1ba0-4747-90ca-7fe3c0c099ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.408537 4707 generic.go:334] "Generic (PLEG): container finished" podID="f07e2299-1ba0-4747-90ca-7fe3c0c099ee" containerID="f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985" exitCode=0 Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.408572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f07e2299-1ba0-4747-90ca-7fe3c0c099ee","Type":"ContainerDied","Data":"f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985"} Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.408590 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.408605 4707 scope.go:117] "RemoveContainer" containerID="f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.408595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f07e2299-1ba0-4747-90ca-7fe3c0c099ee","Type":"ContainerDied","Data":"42496bfe87172086269c7b63219b05874837ae2388d4234896c4042071c8ac78"} Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.425573 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.431666 4707 scope.go:117] "RemoveContainer" containerID="f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985" Jan 21 15:22:47 crc kubenswrapper[4707]: E0121 15:22:47.432755 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985\": container with ID starting with f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985 not found: ID does not exist" containerID="f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.432919 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985"} err="failed to get container status \"f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985\": rpc error: code = NotFound desc = could not find container \"f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985\": container with ID starting with f681823570f0cc9188a92821deec2464c41401d5d4c3da93af4cc08d98d1e985 not found: ID does not exist" Jan 21 15:22:47 crc 
kubenswrapper[4707]: I0121 15:22:47.433736 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.440785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:47 crc kubenswrapper[4707]: E0121 15:22:47.441110 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07e2299-1ba0-4747-90ca-7fe3c0c099ee" containerName="nova-scheduler-scheduler" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.441125 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07e2299-1ba0-4747-90ca-7fe3c0c099ee" containerName="nova-scheduler-scheduler" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.441299 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07e2299-1ba0-4747-90ca-7fe3c0c099ee" containerName="nova-scheduler-scheduler" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.441799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.443297 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.448062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.495613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv26g\" (UniqueName: \"kubernetes.io/projected/c7a3d3fb-6c49-4863-a30f-fe60113bca49-kube-api-access-kv26g\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.495881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-config-data\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.495916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.597735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-config-data\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.597829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.598007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kv26g\" (UniqueName: \"kubernetes.io/projected/c7a3d3fb-6c49-4863-a30f-fe60113bca49-kube-api-access-kv26g\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.600850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-config-data\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.600880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.610415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv26g\" (UniqueName: \"kubernetes.io/projected/c7a3d3fb-6c49-4863-a30f-fe60113bca49-kube-api-access-kv26g\") pod \"nova-scheduler-0\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:47 crc kubenswrapper[4707]: I0121 15:22:47.761247 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:48 crc kubenswrapper[4707]: I0121 15:22:48.122558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:48 crc kubenswrapper[4707]: I0121 15:22:48.417140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c7a3d3fb-6c49-4863-a30f-fe60113bca49","Type":"ContainerStarted","Data":"2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74"} Jan 21 15:22:48 crc kubenswrapper[4707]: I0121 15:22:48.417347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c7a3d3fb-6c49-4863-a30f-fe60113bca49","Type":"ContainerStarted","Data":"e61a5e9bd18cacd845a50d50d0e160dec685fa63453fd0c1d1527642aeeebb01"} Jan 21 15:22:48 crc kubenswrapper[4707]: I0121 15:22:48.431771 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.431755642 podStartE2EDuration="1.431755642s" podCreationTimestamp="2026-01-21 15:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:48.42839239 +0000 UTC m=+1265.609908611" watchObservedRunningTime="2026-01-21 15:22:48.431755642 +0000 UTC m=+1265.613271864" Jan 21 15:22:49 crc kubenswrapper[4707]: I0121 15:22:49.189417 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07e2299-1ba0-4747-90ca-7fe3c0c099ee" path="/var/lib/kubelet/pods/f07e2299-1ba0-4747-90ca-7fe3c0c099ee/volumes" Jan 21 15:22:49 crc kubenswrapper[4707]: I0121 15:22:49.734155 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:49 crc kubenswrapper[4707]: I0121 15:22:49.734263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:50 crc kubenswrapper[4707]: 
I0121 15:22:50.665961 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:50 crc kubenswrapper[4707]: I0121 15:22:50.680772 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:51 crc kubenswrapper[4707]: I0121 15:22:51.450157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:22:51 crc kubenswrapper[4707]: I0121 15:22:51.700230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.060423 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv"] Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.061508 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.063869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.066102 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.070615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv"] Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.172836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-config-data\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.173002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-scripts\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.173227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.173350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fbj\" (UniqueName: \"kubernetes.io/projected/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-kube-api-access-d6fbj\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.274867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-scripts\") 
pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.274956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.274997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fbj\" (UniqueName: \"kubernetes.io/projected/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-kube-api-access-d6fbj\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.275069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-config-data\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.279293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-scripts\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.279430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.279987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-config-data\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.293211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fbj\" (UniqueName: \"kubernetes.io/projected/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-kube-api-access-d6fbj\") pod \"nova-cell1-cell-mapping-fdzkv\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.376291 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:52 crc kubenswrapper[4707]: I0121 15:22:52.761979 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:53 crc kubenswrapper[4707]: I0121 15:22:53.248868 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv"] Jan 21 15:22:53 crc kubenswrapper[4707]: W0121 15:22:53.252638 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16ff817_63cb_4b7d_8ce2_40979f6a2cc6.slice/crio-636c34c57c676b40ca697007a132202f4bab0c0d234887b1903844a071fe12fd WatchSource:0}: Error finding container 636c34c57c676b40ca697007a132202f4bab0c0d234887b1903844a071fe12fd: Status 404 returned error can't find the container with id 636c34c57c676b40ca697007a132202f4bab0c0d234887b1903844a071fe12fd Jan 21 15:22:53 crc kubenswrapper[4707]: I0121 15:22:53.459478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" event={"ID":"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6","Type":"ContainerStarted","Data":"349d7ce8226bdb20d55a2f74cfccc3655f5c42f37dd06ac1618bc6d7d059c8d3"} Jan 21 15:22:53 crc kubenswrapper[4707]: I0121 15:22:53.459522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" event={"ID":"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6","Type":"ContainerStarted","Data":"636c34c57c676b40ca697007a132202f4bab0c0d234887b1903844a071fe12fd"} Jan 21 15:22:53 crc kubenswrapper[4707]: I0121 15:22:53.479161 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" podStartSLOduration=1.479148499 podStartE2EDuration="1.479148499s" podCreationTimestamp="2026-01-21 15:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:22:53.472669419 +0000 UTC m=+1270.654185642" watchObservedRunningTime="2026-01-21 15:22:53.479148499 +0000 UTC m=+1270.660664721" Jan 21 15:22:54 crc kubenswrapper[4707]: I0121 15:22:54.046899 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:54 crc kubenswrapper[4707]: I0121 15:22:54.047219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:22:54 crc kubenswrapper[4707]: I0121 15:22:54.734359 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:54 crc kubenswrapper[4707]: I0121 15:22:54.734410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:22:55 crc kubenswrapper[4707]: I0121 15:22:55.129971 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.20:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:22:55 crc kubenswrapper[4707]: I0121 15:22:55.129987 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-api" 
probeResult="failure" output="Get \"http://10.217.1.20:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:22:55 crc kubenswrapper[4707]: I0121 15:22:55.746923 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.21:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:22:55 crc kubenswrapper[4707]: I0121 15:22:55.746946 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.21:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:22:57 crc kubenswrapper[4707]: I0121 15:22:57.489149 4707 generic.go:334] "Generic (PLEG): container finished" podID="e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" containerID="349d7ce8226bdb20d55a2f74cfccc3655f5c42f37dd06ac1618bc6d7d059c8d3" exitCode=0 Jan 21 15:22:57 crc kubenswrapper[4707]: I0121 15:22:57.489241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" event={"ID":"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6","Type":"ContainerDied","Data":"349d7ce8226bdb20d55a2f74cfccc3655f5c42f37dd06ac1618bc6d7d059c8d3"} Jan 21 15:22:57 crc kubenswrapper[4707]: I0121 15:22:57.492922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:22:57 crc kubenswrapper[4707]: I0121 15:22:57.761485 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:57 crc kubenswrapper[4707]: I0121 15:22:57.781871 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.517033 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.760078 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.879624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-scripts\") pod \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.879669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-combined-ca-bundle\") pod \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.879747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-config-data\") pod \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.879854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6fbj\" (UniqueName: \"kubernetes.io/projected/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-kube-api-access-d6fbj\") pod \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\" (UID: \"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6\") " Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.884297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-kube-api-access-d6fbj" (OuterVolumeSpecName: "kube-api-access-d6fbj") pod "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" (UID: "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6"). InnerVolumeSpecName "kube-api-access-d6fbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.884978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-scripts" (OuterVolumeSpecName: "scripts") pod "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" (UID: "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.899257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" (UID: "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.900212 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-config-data" (OuterVolumeSpecName: "config-data") pod "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" (UID: "e16ff817-63cb-4b7d-8ce2-40979f6a2cc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.982227 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.982254 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.982266 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:58 crc kubenswrapper[4707]: I0121 15:22:58.982274 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6fbj\" (UniqueName: \"kubernetes.io/projected/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6-kube-api-access-d6fbj\") on node \"crc\" DevicePath \"\"" Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.503904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" event={"ID":"e16ff817-63cb-4b7d-8ce2-40979f6a2cc6","Type":"ContainerDied","Data":"636c34c57c676b40ca697007a132202f4bab0c0d234887b1903844a071fe12fd"} Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.503937 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv" Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.503942 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="636c34c57c676b40ca697007a132202f4bab0c0d234887b1903844a071fe12fd" Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.580501 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.580716 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-log" containerID="cri-o://8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d" gracePeriod=30 Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.580783 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-api" containerID="cri-o://2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d" gracePeriod=30 Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.589924 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.596848 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.597095 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-log" containerID="cri-o://d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858" gracePeriod=30 Jan 21 15:22:59 crc kubenswrapper[4707]: I0121 15:22:59.597126 4707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-metadata" containerID="cri-o://616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1" gracePeriod=30 Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.456412 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.456759 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" containerName="kube-state-metrics" containerID="cri-o://359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e" gracePeriod=30 Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.511979 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerID="d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858" exitCode=143 Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.512029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"e9360b9c-9e26-4d1c-981f-7d84b28555cf","Type":"ContainerDied","Data":"d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858"} Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.513337 4707 generic.go:334] "Generic (PLEG): container finished" podID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerID="8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d" exitCode=143 Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.513473 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" containerName="nova-scheduler-scheduler" containerID="cri-o://2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" gracePeriod=30 Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.513669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1731c030-db8a-4a3f-a597-3fd64a5aeb3d","Type":"ContainerDied","Data":"8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d"} Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.859139 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.914121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n67zn\" (UniqueName: \"kubernetes.io/projected/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30-kube-api-access-n67zn\") pod \"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30\" (UID: \"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30\") " Jan 21 15:23:00 crc kubenswrapper[4707]: I0121 15:23:00.919203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30-kube-api-access-n67zn" (OuterVolumeSpecName: "kube-api-access-n67zn") pod "3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" (UID: "3b269364-bf60-4c6a-8a37-3c9a4cf7ae30"). InnerVolumeSpecName "kube-api-access-n67zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.016119 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n67zn\" (UniqueName: \"kubernetes.io/projected/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30-kube-api-access-n67zn\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.521171 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" containerID="359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e" exitCode=2 Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.521208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30","Type":"ContainerDied","Data":"359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e"} Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.521229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b269364-bf60-4c6a-8a37-3c9a4cf7ae30","Type":"ContainerDied","Data":"aaee06e3e6f6d20add5b83f0226746849583a0c74616f3485c03c6a2281655d6"} Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.521244 4707 scope.go:117] "RemoveContainer" containerID="359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.521350 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.540840 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.542376 4707 scope.go:117] "RemoveContainer" containerID="359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e" Jan 21 15:23:01 crc kubenswrapper[4707]: E0121 15:23:01.542694 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e\": container with ID starting with 359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e not found: ID does not exist" containerID="359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.542723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e"} err="failed to get container status \"359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e\": rpc error: code = NotFound desc = could not find container \"359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e\": container with ID starting with 359be425da09de9815d57dca69f59f38b6cc1c76c0377dd7ca355e1fdf73074e not found: ID does not exist" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.547557 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.551709 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:23:01 crc kubenswrapper[4707]: E0121 15:23:01.552028 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" containerName="nova-manage" Jan 21 15:23:01 
crc kubenswrapper[4707]: I0121 15:23:01.552048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" containerName="nova-manage" Jan 21 15:23:01 crc kubenswrapper[4707]: E0121 15:23:01.552066 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" containerName="kube-state-metrics" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.552073 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" containerName="kube-state-metrics" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.552265 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" containerName="kube-state-metrics" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.552283 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" containerName="nova-manage" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.552796 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.554474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.556598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.556607 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.624465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.624510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.624612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4rc4\" (UniqueName: \"kubernetes.io/projected/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-api-access-q4rc4\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.624771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.726256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q4rc4\" (UniqueName: \"kubernetes.io/projected/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-api-access-q4rc4\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.726353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.726560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.726606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.729999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.730183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.730345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.739861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4rc4\" (UniqueName: \"kubernetes.io/projected/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-api-access-q4rc4\") pod \"kube-state-metrics-0\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.832593 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.832890 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-central-agent" 
containerID="cri-o://e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf" gracePeriod=30 Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.832942 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="proxy-httpd" containerID="cri-o://23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1" gracePeriod=30 Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.832980 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-notification-agent" containerID="cri-o://91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe" gracePeriod=30 Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.832956 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="sg-core" containerID="cri-o://7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f" gracePeriod=30 Jan 21 15:23:01 crc kubenswrapper[4707]: I0121 15:23:01.870588 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.254769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:23:02 crc kubenswrapper[4707]: W0121 15:23:02.256450 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f8fb41_6a21_4135_ace9_7b9b36eaaac5.slice/crio-a5a75f299af371c49eff8cbc12e19805b5fa166fdbee5038db4acec1d3a0e78c WatchSource:0}: Error finding container a5a75f299af371c49eff8cbc12e19805b5fa166fdbee5038db4acec1d3a0e78c: Status 404 returned error can't find the container with id a5a75f299af371c49eff8cbc12e19805b5fa166fdbee5038db4acec1d3a0e78c Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.528917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"37f8fb41-6a21-4135-ace9-7b9b36eaaac5","Type":"ContainerStarted","Data":"a5a75f299af371c49eff8cbc12e19805b5fa166fdbee5038db4acec1d3a0e78c"} Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.531288 4707 generic.go:334] "Generic (PLEG): container finished" podID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerID="23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1" exitCode=0 Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.531328 4707 generic.go:334] "Generic (PLEG): container finished" podID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerID="7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f" exitCode=2 Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.531338 4707 generic.go:334] "Generic (PLEG): container finished" podID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerID="e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf" exitCode=0 Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.531372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerDied","Data":"23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1"} Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.531408 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerDied","Data":"7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f"} Jan 21 15:23:02 crc kubenswrapper[4707]: I0121 15:23:02.531418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerDied","Data":"e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf"} Jan 21 15:23:02 crc kubenswrapper[4707]: E0121 15:23:02.762933 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:23:02 crc kubenswrapper[4707]: E0121 15:23:02.763989 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:23:02 crc kubenswrapper[4707]: E0121 15:23:02.765109 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:23:02 crc kubenswrapper[4707]: E0121 15:23:02.765144 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" containerName="nova-scheduler-scheduler" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.009074 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.126621 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.147428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-nova-metadata-tls-certs\") pod \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.147557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf749\" (UniqueName: \"kubernetes.io/projected/e9360b9c-9e26-4d1c-981f-7d84b28555cf-kube-api-access-zf749\") pod \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.147608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9360b9c-9e26-4d1c-981f-7d84b28555cf-logs\") pod \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.147706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-combined-ca-bundle\") pod \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.148268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-config-data\") pod \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\" (UID: \"e9360b9c-9e26-4d1c-981f-7d84b28555cf\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.148201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9360b9c-9e26-4d1c-981f-7d84b28555cf-logs" (OuterVolumeSpecName: "logs") pod "e9360b9c-9e26-4d1c-981f-7d84b28555cf" (UID: "e9360b9c-9e26-4d1c-981f-7d84b28555cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.149065 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9360b9c-9e26-4d1c-981f-7d84b28555cf-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.151782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9360b9c-9e26-4d1c-981f-7d84b28555cf-kube-api-access-zf749" (OuterVolumeSpecName: "kube-api-access-zf749") pod "e9360b9c-9e26-4d1c-981f-7d84b28555cf" (UID: "e9360b9c-9e26-4d1c-981f-7d84b28555cf"). InnerVolumeSpecName "kube-api-access-zf749". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.167405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9360b9c-9e26-4d1c-981f-7d84b28555cf" (UID: "e9360b9c-9e26-4d1c-981f-7d84b28555cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.168082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-config-data" (OuterVolumeSpecName: "config-data") pod "e9360b9c-9e26-4d1c-981f-7d84b28555cf" (UID: "e9360b9c-9e26-4d1c-981f-7d84b28555cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.181793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e9360b9c-9e26-4d1c-981f-7d84b28555cf" (UID: "e9360b9c-9e26-4d1c-981f-7d84b28555cf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.190863 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b269364-bf60-4c6a-8a37-3c9a4cf7ae30" path="/var/lib/kubelet/pods/3b269364-bf60-4c6a-8a37-3c9a4cf7ae30/volumes" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.249845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-config-data\") pod \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.249891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-combined-ca-bundle\") pod \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.249953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27tjd\" (UniqueName: \"kubernetes.io/projected/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-kube-api-access-27tjd\") pod \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.250089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-logs\") pod \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\" (UID: \"1731c030-db8a-4a3f-a597-3fd64a5aeb3d\") " Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.250503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-logs" (OuterVolumeSpecName: "logs") pod "1731c030-db8a-4a3f-a597-3fd64a5aeb3d" (UID: "1731c030-db8a-4a3f-a597-3fd64a5aeb3d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.251624 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf749\" (UniqueName: \"kubernetes.io/projected/e9360b9c-9e26-4d1c-981f-7d84b28555cf-kube-api-access-zf749\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.251708 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.251736 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.251747 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.251756 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9360b9c-9e26-4d1c-981f-7d84b28555cf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.252874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-kube-api-access-27tjd" (OuterVolumeSpecName: "kube-api-access-27tjd") pod "1731c030-db8a-4a3f-a597-3fd64a5aeb3d" (UID: "1731c030-db8a-4a3f-a597-3fd64a5aeb3d"). InnerVolumeSpecName "kube-api-access-27tjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.269501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-config-data" (OuterVolumeSpecName: "config-data") pod "1731c030-db8a-4a3f-a597-3fd64a5aeb3d" (UID: "1731c030-db8a-4a3f-a597-3fd64a5aeb3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.270803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1731c030-db8a-4a3f-a597-3fd64a5aeb3d" (UID: "1731c030-db8a-4a3f-a597-3fd64a5aeb3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.353698 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27tjd\" (UniqueName: \"kubernetes.io/projected/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-kube-api-access-27tjd\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.353916 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.353927 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731c030-db8a-4a3f-a597-3fd64a5aeb3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.539826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"37f8fb41-6a21-4135-ace9-7b9b36eaaac5","Type":"ContainerStarted","Data":"d0cdbb4ec686eadd5528f51fab83e9c9497d8a3d56a4bb906dfe28f48de85b3b"} Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.539887 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.541299 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerID="616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1" exitCode=0 Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.541338 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.541563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"e9360b9c-9e26-4d1c-981f-7d84b28555cf","Type":"ContainerDied","Data":"616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1"} Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.541606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"e9360b9c-9e26-4d1c-981f-7d84b28555cf","Type":"ContainerDied","Data":"78e5b09c14924594d218000376500c167d585204f83d0e92c98033d03bb5aa0f"} Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.541624 4707 scope.go:117] "RemoveContainer" containerID="616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.543780 4707 generic.go:334] "Generic (PLEG): container finished" podID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerID="2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d" exitCode=0 Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.543828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1731c030-db8a-4a3f-a597-3fd64a5aeb3d","Type":"ContainerDied","Data":"2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d"} Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.543854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1731c030-db8a-4a3f-a597-3fd64a5aeb3d","Type":"ContainerDied","Data":"054f9fe98df0600551c3df948922a1978b2871917448dca48b57673283941490"} Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.543860 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.557878 4707 scope.go:117] "RemoveContainer" containerID="d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.561998 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.271483655 podStartE2EDuration="2.561952641s" podCreationTimestamp="2026-01-21 15:23:01 +0000 UTC" firstStartedPulling="2026-01-21 15:23:02.258331957 +0000 UTC m=+1279.439848179" lastFinishedPulling="2026-01-21 15:23:02.548800943 +0000 UTC m=+1279.730317165" observedRunningTime="2026-01-21 15:23:03.56052663 +0000 UTC m=+1280.742042852" watchObservedRunningTime="2026-01-21 15:23:03.561952641 +0000 UTC m=+1280.743468863" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.584369 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.594074 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.601972 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.618528 4707 scope.go:117] "RemoveContainer" containerID="616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.618933 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1\": container with ID starting with 616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1 not found: ID does not exist" containerID="616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.618965 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1"} err="failed to get container status \"616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1\": rpc error: code = NotFound desc = could not find container \"616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1\": container with ID starting with 616ae2ed6516bd017cae72feb046616a3da8e45cf60762208e225f6d80e7bcc1 not found: ID does not exist" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.619014 4707 scope.go:117] "RemoveContainer" containerID="d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.619299 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858\": container with ID starting with d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858 not found: ID does not exist" containerID="d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.619352 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858"} err="failed to get container status 
\"d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858\": rpc error: code = NotFound desc = could not find container \"d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858\": container with ID starting with d8fbb4c4c338ff07304aaf0af43075b21d848feae02951dc5c7ae9037cac8858 not found: ID does not exist" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.619383 4707 scope.go:117] "RemoveContainer" containerID="2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.630230 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.645707 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.646083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-api" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-api" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.646140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-log" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-log" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.646162 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-metadata" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-metadata" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.646179 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-log" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646184 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-log" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646413 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-metadata" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646432 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" containerName="nova-metadata-log" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646446 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-api" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.646459 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" containerName="nova-api-log" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.647522 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.649885 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.651144 4707 scope.go:117] "RemoveContainer" containerID="8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.653050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.659769 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.661708 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.663828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.664720 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.668905 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.684898 4707 scope.go:117] "RemoveContainer" containerID="2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.685147 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d\": container with ID starting with 2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d not found: ID does not exist" containerID="2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.685182 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d"} err="failed to get container status \"2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d\": rpc error: code = NotFound desc = could not find container \"2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d\": container with ID starting with 2607a5be5d5d161945a04e0ff37589caca26cc90f73b133b5ebee1c05473d23d not found: ID does not exist" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.685206 4707 scope.go:117] "RemoveContainer" containerID="8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d" Jan 21 15:23:03 crc kubenswrapper[4707]: E0121 15:23:03.685398 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d\": container with ID starting with 8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d not found: ID does not exist" containerID="8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.685419 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d"} err="failed to get container status \"8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d\": rpc error: code = NotFound desc = could not find container \"8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d\": container with ID starting with 8c37b7f3965896c8d612f751b8dd625d99fd708cc70215cf1a3889344840e44d not found: ID does not exist" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw29m\" (UniqueName: \"kubernetes.io/projected/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-kube-api-access-lw29m\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-logs\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-config-data\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-config-data\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb688fd-a99d-494a-a688-b2c465f0ae48-logs\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2tn5\" (UniqueName: \"kubernetes.io/projected/3eb688fd-a99d-494a-a688-b2c465f0ae48-kube-api-access-v2tn5\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" 
Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.763791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2tn5\" (UniqueName: \"kubernetes.io/projected/3eb688fd-a99d-494a-a688-b2c465f0ae48-kube-api-access-v2tn5\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw29m\" (UniqueName: \"kubernetes.io/projected/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-kube-api-access-lw29m\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-logs\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-config-data\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-config-data\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb688fd-a99d-494a-a688-b2c465f0ae48-logs\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.868672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.869302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-logs\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.869736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb688fd-a99d-494a-a688-b2c465f0ae48-logs\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.872178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-config-data\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.872321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-config-data\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.872965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.874219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.874687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.881435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2tn5\" (UniqueName: \"kubernetes.io/projected/3eb688fd-a99d-494a-a688-b2c465f0ae48-kube-api-access-v2tn5\") pod \"nova-api-0\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.882600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw29m\" (UniqueName: \"kubernetes.io/projected/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-kube-api-access-lw29m\") pod \"nova-metadata-0\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.964430 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.972551 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:03 crc kubenswrapper[4707]: I0121 15:23:03.974882 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.069994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-run-httpd\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-config-data\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070589 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-combined-ca-bundle\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-sg-core-conf-yaml\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtnch\" (UniqueName: \"kubernetes.io/projected/4bc8db82-0945-4c71-955c-6e80d92a8c15-kube-api-access-vtnch\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-log-httpd\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.070900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-scripts\") pod \"4bc8db82-0945-4c71-955c-6e80d92a8c15\" (UID: \"4bc8db82-0945-4c71-955c-6e80d92a8c15\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.071671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.072006 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.072020 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bc8db82-0945-4c71-955c-6e80d92a8c15-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.075243 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc8db82-0945-4c71-955c-6e80d92a8c15-kube-api-access-vtnch" (OuterVolumeSpecName: "kube-api-access-vtnch") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "kube-api-access-vtnch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.076913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-scripts" (OuterVolumeSpecName: "scripts") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.102605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.119363 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.158301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.172487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-config-data" (OuterVolumeSpecName: "config-data") pod "4bc8db82-0945-4c71-955c-6e80d92a8c15" (UID: "4bc8db82-0945-4c71-955c-6e80d92a8c15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.172870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-combined-ca-bundle\") pod \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.172905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv26g\" (UniqueName: \"kubernetes.io/projected/c7a3d3fb-6c49-4863-a30f-fe60113bca49-kube-api-access-kv26g\") pod \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.172960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-config-data\") pod \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\" (UID: \"c7a3d3fb-6c49-4863-a30f-fe60113bca49\") " Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.173347 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.173366 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.173374 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.173383 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bc8db82-0945-4c71-955c-6e80d92a8c15-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.173391 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtnch\" (UniqueName: \"kubernetes.io/projected/4bc8db82-0945-4c71-955c-6e80d92a8c15-kube-api-access-vtnch\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.175983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a3d3fb-6c49-4863-a30f-fe60113bca49-kube-api-access-kv26g" (OuterVolumeSpecName: "kube-api-access-kv26g") pod "c7a3d3fb-6c49-4863-a30f-fe60113bca49" (UID: "c7a3d3fb-6c49-4863-a30f-fe60113bca49"). InnerVolumeSpecName "kube-api-access-kv26g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.198864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-config-data" (OuterVolumeSpecName: "config-data") pod "c7a3d3fb-6c49-4863-a30f-fe60113bca49" (UID: "c7a3d3fb-6c49-4863-a30f-fe60113bca49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.203514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7a3d3fb-6c49-4863-a30f-fe60113bca49" (UID: "c7a3d3fb-6c49-4863-a30f-fe60113bca49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.275242 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.275362 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv26g\" (UniqueName: \"kubernetes.io/projected/c7a3d3fb-6c49-4863-a30f-fe60113bca49-kube-api-access-kv26g\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.275736 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a3d3fb-6c49-4863-a30f-fe60113bca49-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.409575 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: W0121 15:23:04.415557 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb688fd_a99d_494a_a688_b2c465f0ae48.slice/crio-a96d2221f6310a907f99c6f2f2d18d1f1fcd4480f6431930cdabf28e289664c0 WatchSource:0}: Error finding container a96d2221f6310a907f99c6f2f2d18d1f1fcd4480f6431930cdabf28e289664c0: Status 404 returned error can't find the container with id a96d2221f6310a907f99c6f2f2d18d1f1fcd4480f6431930cdabf28e289664c0 Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.451803 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: W0121 15:23:04.452794 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded7400b4_4f9e_4571_9156_5cf8f5e0df67.slice/crio-aba8650dca5babd848a93663f9d9b471727332299482c7f107b319fa5acbb7bb WatchSource:0}: Error finding container aba8650dca5babd848a93663f9d9b471727332299482c7f107b319fa5acbb7bb: Status 404 returned error can't find the container with id aba8650dca5babd848a93663f9d9b471727332299482c7f107b319fa5acbb7bb Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.559100 4707 generic.go:334] "Generic (PLEG): container finished" podID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerID="91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe" exitCode=0 Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.559144 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.559165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerDied","Data":"91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.559193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4bc8db82-0945-4c71-955c-6e80d92a8c15","Type":"ContainerDied","Data":"07485e3a2f530453d18e47166fe32a67b0f55077a72f26ed3ffb0faafaf672ea"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.559209 4707 scope.go:117] "RemoveContainer" containerID="23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.560468 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" exitCode=0 Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.560514 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.560522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c7a3d3fb-6c49-4863-a30f-fe60113bca49","Type":"ContainerDied","Data":"2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.560544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c7a3d3fb-6c49-4863-a30f-fe60113bca49","Type":"ContainerDied","Data":"e61a5e9bd18cacd845a50d50d0e160dec685fa63453fd0c1d1527642aeeebb01"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.562443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3eb688fd-a99d-494a-a688-b2c465f0ae48","Type":"ContainerStarted","Data":"ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.562467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3eb688fd-a99d-494a-a688-b2c465f0ae48","Type":"ContainerStarted","Data":"a96d2221f6310a907f99c6f2f2d18d1f1fcd4480f6431930cdabf28e289664c0"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.564949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed7400b4-4f9e-4571-9156-5cf8f5e0df67","Type":"ContainerStarted","Data":"add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.564980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed7400b4-4f9e-4571-9156-5cf8f5e0df67","Type":"ContainerStarted","Data":"aba8650dca5babd848a93663f9d9b471727332299482c7f107b319fa5acbb7bb"} Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.575130 4707 scope.go:117] "RemoveContainer" containerID="7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.587929 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:23:04 crc 
kubenswrapper[4707]: I0121 15:23:04.595496 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.606435 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.620025 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627162 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.627519 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" containerName="nova-scheduler-scheduler" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627537 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" containerName="nova-scheduler-scheduler" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.627549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-central-agent" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627555 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-central-agent" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.627564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-notification-agent" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627570 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-notification-agent" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.627580 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="proxy-httpd" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="proxy-httpd" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.627602 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="sg-core" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627609 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="sg-core" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627761 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" containerName="nova-scheduler-scheduler" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627773 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="sg-core" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-central-agent" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627790 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="ceilometer-notification-agent" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.627829 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" containerName="proxy-httpd" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.628340 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.630260 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.635385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.636965 4707 scope.go:117] "RemoveContainer" containerID="91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.641689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.643288 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.644409 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.645217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.645382 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.647959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.667020 4707 scope.go:117] "RemoveContainer" containerID="e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.681279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stbxl\" (UniqueName: \"kubernetes.io/projected/c1addb55-dbd5-4f5c-957b-592c89033dab-kube-api-access-stbxl\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.681326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-config-data\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.681356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.690568 4707 scope.go:117] "RemoveContainer" containerID="23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.690876 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1\": container with ID starting with 23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1 not found: ID does not exist" containerID="23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.690910 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1"} err="failed to get container status \"23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1\": rpc error: code = NotFound desc = could not find container \"23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1\": container with ID starting with 23f6809f8ff6bb29f89ec4f4c81041ba69bb4f33cfe8cde3c55b8ed4297833c1 not found: ID does not exist" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.690933 4707 scope.go:117] "RemoveContainer" containerID="7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.691194 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f\": container with ID starting with 7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f not found: ID does not exist" containerID="7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.691221 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f"} err="failed to get container status \"7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f\": rpc error: code = NotFound desc = could not find container \"7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f\": container with ID starting with 7f940c71584d147b994b670ca7b69ba8ae4d2fba093797aee53f1b462f41ba2f not found: ID does not exist" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.691239 4707 scope.go:117] "RemoveContainer" containerID="91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.691653 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe\": container with ID starting with 91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe not found: ID does not exist" containerID="91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.691688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe"} err="failed to get container status \"91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe\": rpc error: code = NotFound desc = could not find container \"91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe\": container with ID starting with 91a113329c1b800b9d4d5449b520a9abd47dd961c29c5cc2c8cb11da2cc8d0fe not found: ID does not exist" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.691704 4707 scope.go:117] "RemoveContainer" 
containerID="e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.692000 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf\": container with ID starting with e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf not found: ID does not exist" containerID="e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.692019 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf"} err="failed to get container status \"e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf\": rpc error: code = NotFound desc = could not find container \"e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf\": container with ID starting with e614ccfcc4d348d788c7ce6dadcfabad742639bfe2d63938c16ac56b042d5baf not found: ID does not exist" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.692031 4707 scope.go:117] "RemoveContainer" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.706633 4707 scope.go:117] "RemoveContainer" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" Jan 21 15:23:04 crc kubenswrapper[4707]: E0121 15:23:04.706930 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74\": container with ID starting with 2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74 not found: ID does not exist" containerID="2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.706955 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74"} err="failed to get container status \"2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74\": rpc error: code = NotFound desc = could not find container \"2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74\": container with ID starting with 2c4fbe911410c9a7859eef9d5062991f2bf54e76d754325495fb79193f2d4d74 not found: ID does not exist" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-config-data\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-log-httpd\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-scripts\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.786990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88tmx\" (UniqueName: \"kubernetes.io/projected/33d67064-63b6-4a0d-8310-3e41a2e52c3f-kube-api-access-88tmx\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.787013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-run-httpd\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.787033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.787051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stbxl\" (UniqueName: \"kubernetes.io/projected/c1addb55-dbd5-4f5c-957b-592c89033dab-kube-api-access-stbxl\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.787072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-config-data\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.789673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-config-data\") pod \"nova-scheduler-0\" (UID: 
\"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.792258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.806256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stbxl\" (UniqueName: \"kubernetes.io/projected/c1addb55-dbd5-4f5c-957b-592c89033dab-kube-api-access-stbxl\") pod \"nova-scheduler-0\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-config-data\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-log-httpd\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-scripts\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88tmx\" (UniqueName: \"kubernetes.io/projected/33d67064-63b6-4a0d-8310-3e41a2e52c3f-kube-api-access-88tmx\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-run-httpd\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.890910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.891260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-log-httpd\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.896056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-run-httpd\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.901886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-config-data\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.902421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-scripts\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.903307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.906460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.907327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.931126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88tmx\" (UniqueName: \"kubernetes.io/projected/33d67064-63b6-4a0d-8310-3e41a2e52c3f-kube-api-access-88tmx\") pod \"ceilometer-0\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.959354 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:04 crc kubenswrapper[4707]: I0121 15:23:04.967204 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.191655 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1731c030-db8a-4a3f-a597-3fd64a5aeb3d" path="/var/lib/kubelet/pods/1731c030-db8a-4a3f-a597-3fd64a5aeb3d/volumes" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.192481 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc8db82-0945-4c71-955c-6e80d92a8c15" path="/var/lib/kubelet/pods/4bc8db82-0945-4c71-955c-6e80d92a8c15/volumes" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.193109 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a3d3fb-6c49-4863-a30f-fe60113bca49" path="/var/lib/kubelet/pods/c7a3d3fb-6c49-4863-a30f-fe60113bca49/volumes" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.193999 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9360b9c-9e26-4d1c-981f-7d84b28555cf" path="/var/lib/kubelet/pods/e9360b9c-9e26-4d1c-981f-7d84b28555cf/volumes" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.352056 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.408036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:05 crc kubenswrapper[4707]: W0121 15:23:05.419371 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33d67064_63b6_4a0d_8310_3e41a2e52c3f.slice/crio-8385229cb7da50f06dcb1ead316cc786b3986fd2abf4ef36869d4de23d960c20 WatchSource:0}: Error finding container 8385229cb7da50f06dcb1ead316cc786b3986fd2abf4ef36869d4de23d960c20: Status 404 returned error can't find the container with id 8385229cb7da50f06dcb1ead316cc786b3986fd2abf4ef36869d4de23d960c20 Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.576988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1addb55-dbd5-4f5c-957b-592c89033dab","Type":"ContainerStarted","Data":"192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec"} Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.577031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1addb55-dbd5-4f5c-957b-592c89033dab","Type":"ContainerStarted","Data":"7e5676fce46cf35e84ce7792b9247621f82d5c13788fa5ea074f6841a2a9a833"} Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.580090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerStarted","Data":"8385229cb7da50f06dcb1ead316cc786b3986fd2abf4ef36869d4de23d960c20"} Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.581889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3eb688fd-a99d-494a-a688-b2c465f0ae48","Type":"ContainerStarted","Data":"8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f"} Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.583226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed7400b4-4f9e-4571-9156-5cf8f5e0df67","Type":"ContainerStarted","Data":"6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da"} Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 
15:23:05.593357 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.593330234 podStartE2EDuration="1.593330234s" podCreationTimestamp="2026-01-21 15:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:23:05.588015834 +0000 UTC m=+1282.769532056" watchObservedRunningTime="2026-01-21 15:23:05.593330234 +0000 UTC m=+1282.774846456" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.604102 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.604092203 podStartE2EDuration="2.604092203s" podCreationTimestamp="2026-01-21 15:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:23:05.599219494 +0000 UTC m=+1282.780735716" watchObservedRunningTime="2026-01-21 15:23:05.604092203 +0000 UTC m=+1282.785608425" Jan 21 15:23:05 crc kubenswrapper[4707]: I0121 15:23:05.614032 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.614022469 podStartE2EDuration="2.614022469s" podCreationTimestamp="2026-01-21 15:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:23:05.611458489 +0000 UTC m=+1282.792974711" watchObservedRunningTime="2026-01-21 15:23:05.614022469 +0000 UTC m=+1282.795538691" Jan 21 15:23:06 crc kubenswrapper[4707]: I0121 15:23:06.592063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerStarted","Data":"c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30"} Jan 21 15:23:07 crc kubenswrapper[4707]: I0121 15:23:07.600716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerStarted","Data":"4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0"} Jan 21 15:23:07 crc kubenswrapper[4707]: I0121 15:23:07.600948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerStarted","Data":"05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c"} Jan 21 15:23:08 crc kubenswrapper[4707]: I0121 15:23:08.975629 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:08 crc kubenswrapper[4707]: I0121 15:23:08.975862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:09 crc kubenswrapper[4707]: I0121 15:23:09.615238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerStarted","Data":"1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91"} Jan 21 15:23:09 crc kubenswrapper[4707]: I0121 15:23:09.615393 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:09 crc kubenswrapper[4707]: I0121 15:23:09.630178 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.486203263 podStartE2EDuration="5.630169407s" podCreationTimestamp="2026-01-21 15:23:04 +0000 UTC" firstStartedPulling="2026-01-21 15:23:05.423646038 +0000 UTC m=+1282.605162259" lastFinishedPulling="2026-01-21 15:23:08.567612181 +0000 UTC m=+1285.749128403" observedRunningTime="2026-01-21 15:23:09.628938282 +0000 UTC m=+1286.810454505" watchObservedRunningTime="2026-01-21 15:23:09.630169407 +0000 UTC m=+1286.811685629" Jan 21 15:23:09 crc kubenswrapper[4707]: I0121 15:23:09.959916 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:11 crc kubenswrapper[4707]: I0121 15:23:11.879612 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:23:13 crc kubenswrapper[4707]: I0121 15:23:13.965230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:13 crc kubenswrapper[4707]: I0121 15:23:13.965291 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:13 crc kubenswrapper[4707]: I0121 15:23:13.976154 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:13 crc kubenswrapper[4707]: I0121 15:23:13.976196 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:14 crc kubenswrapper[4707]: I0121 15:23:14.960553 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:14 crc kubenswrapper[4707]: I0121 15:23:14.982200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:15 crc kubenswrapper[4707]: I0121 15:23:15.058991 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.26:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:23:15 crc kubenswrapper[4707]: I0121 15:23:15.058989 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.25:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:23:15 crc kubenswrapper[4707]: I0121 15:23:15.059505 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.25:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:23:15 crc kubenswrapper[4707]: I0121 15:23:15.059611 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.26:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:23:15 crc kubenswrapper[4707]: I0121 15:23:15.693707 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.968221 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.968833 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.969592 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.973700 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.980136 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.983261 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:23 crc kubenswrapper[4707]: I0121 15:23:23.984386 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:24 crc kubenswrapper[4707]: I0121 15:23:24.738537 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:24 crc kubenswrapper[4707]: I0121 15:23:24.741090 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:24 crc kubenswrapper[4707]: I0121 15:23:24.742636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.305121 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.306172 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-central-agent" containerID="cri-o://c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30" gracePeriod=30 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.306452 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="proxy-httpd" containerID="cri-o://1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91" gracePeriod=30 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.306529 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-notification-agent" containerID="cri-o://05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c" gracePeriod=30 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.306679 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="sg-core" containerID="cri-o://4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0" gracePeriod=30 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.315789 4707 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.28:3000/\": EOF" Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.752666 4707 generic.go:334] "Generic (PLEG): container finished" podID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerID="1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91" exitCode=0 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.752913 4707 generic.go:334] "Generic (PLEG): container finished" podID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerID="4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0" exitCode=2 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.752922 4707 generic.go:334] "Generic (PLEG): container finished" podID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerID="c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30" exitCode=0 Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.752754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerDied","Data":"1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91"} Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.753014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerDied","Data":"4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0"} Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.753028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerDied","Data":"c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30"} Jan 21 15:23:26 crc kubenswrapper[4707]: I0121 15:23:26.925493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:27 crc kubenswrapper[4707]: I0121 15:23:27.759163 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-log" containerID="cri-o://ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4" gracePeriod=30 Jan 21 15:23:27 crc kubenswrapper[4707]: I0121 15:23:27.759226 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-api" containerID="cri-o://8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f" gracePeriod=30 Jan 21 15:23:28 crc kubenswrapper[4707]: I0121 15:23:28.768194 4707 generic.go:334] "Generic (PLEG): container finished" podID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerID="ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4" exitCode=143 Jan 21 15:23:28 crc kubenswrapper[4707]: I0121 15:23:28.768263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3eb688fd-a99d-494a-a688-b2c465f0ae48","Type":"ContainerDied","Data":"ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4"} Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.263628 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.329025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb688fd-a99d-494a-a688-b2c465f0ae48-logs\") pod \"3eb688fd-a99d-494a-a688-b2c465f0ae48\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.329090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-config-data\") pod \"3eb688fd-a99d-494a-a688-b2c465f0ae48\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.329164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2tn5\" (UniqueName: \"kubernetes.io/projected/3eb688fd-a99d-494a-a688-b2c465f0ae48-kube-api-access-v2tn5\") pod \"3eb688fd-a99d-494a-a688-b2c465f0ae48\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.329325 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-combined-ca-bundle\") pod \"3eb688fd-a99d-494a-a688-b2c465f0ae48\" (UID: \"3eb688fd-a99d-494a-a688-b2c465f0ae48\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.329529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb688fd-a99d-494a-a688-b2c465f0ae48-logs" (OuterVolumeSpecName: "logs") pod "3eb688fd-a99d-494a-a688-b2c465f0ae48" (UID: "3eb688fd-a99d-494a-a688-b2c465f0ae48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.329857 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb688fd-a99d-494a-a688-b2c465f0ae48-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.333865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb688fd-a99d-494a-a688-b2c465f0ae48-kube-api-access-v2tn5" (OuterVolumeSpecName: "kube-api-access-v2tn5") pod "3eb688fd-a99d-494a-a688-b2c465f0ae48" (UID: "3eb688fd-a99d-494a-a688-b2c465f0ae48"). InnerVolumeSpecName "kube-api-access-v2tn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.350660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-config-data" (OuterVolumeSpecName: "config-data") pod "3eb688fd-a99d-494a-a688-b2c465f0ae48" (UID: "3eb688fd-a99d-494a-a688-b2c465f0ae48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.357258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eb688fd-a99d-494a-a688-b2c465f0ae48" (UID: "3eb688fd-a99d-494a-a688-b2c465f0ae48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.431649 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.432009 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb688fd-a99d-494a-a688-b2c465f0ae48-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.432022 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2tn5\" (UniqueName: \"kubernetes.io/projected/3eb688fd-a99d-494a-a688-b2c465f0ae48-kube-api-access-v2tn5\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.485464 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-sg-core-conf-yaml\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-config-data\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-log-httpd\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-run-httpd\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-scripts\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88tmx\" (UniqueName: \"kubernetes.io/projected/33d67064-63b6-4a0d-8310-3e41a2e52c3f-kube-api-access-88tmx\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-combined-ca-bundle\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 
15:23:31.533638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-ceilometer-tls-certs\") pod \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\" (UID: \"33d67064-63b6-4a0d-8310-3e41a2e52c3f\") " Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.533995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.534095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.534278 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.534308 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33d67064-63b6-4a0d-8310-3e41a2e52c3f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.537236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-scripts" (OuterVolumeSpecName: "scripts") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.537606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d67064-63b6-4a0d-8310-3e41a2e52c3f-kube-api-access-88tmx" (OuterVolumeSpecName: "kube-api-access-88tmx") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "kube-api-access-88tmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.553625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.567353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.581129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.596661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-config-data" (OuterVolumeSpecName: "config-data") pod "33d67064-63b6-4a0d-8310-3e41a2e52c3f" (UID: "33d67064-63b6-4a0d-8310-3e41a2e52c3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.636718 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.636749 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.636762 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.636773 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.636782 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33d67064-63b6-4a0d-8310-3e41a2e52c3f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.636792 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88tmx\" (UniqueName: \"kubernetes.io/projected/33d67064-63b6-4a0d-8310-3e41a2e52c3f-kube-api-access-88tmx\") on node \"crc\" DevicePath \"\"" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.787879 4707 generic.go:334] "Generic (PLEG): container finished" podID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerID="05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c" exitCode=0 Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.787943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerDied","Data":"05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c"} Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.787975 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.787993 4707 scope.go:117] "RemoveContainer" containerID="1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.787981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"33d67064-63b6-4a0d-8310-3e41a2e52c3f","Type":"ContainerDied","Data":"8385229cb7da50f06dcb1ead316cc786b3986fd2abf4ef36869d4de23d960c20"} Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.789878 4707 generic.go:334] "Generic (PLEG): container finished" podID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerID="8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f" exitCode=0 Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.789907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3eb688fd-a99d-494a-a688-b2c465f0ae48","Type":"ContainerDied","Data":"8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f"} Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.789922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3eb688fd-a99d-494a-a688-b2c465f0ae48","Type":"ContainerDied","Data":"a96d2221f6310a907f99c6f2f2d18d1f1fcd4480f6431930cdabf28e289664c0"} Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.789965 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.812417 4707 scope.go:117] "RemoveContainer" containerID="4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.819973 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.833557 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.847793 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.849593 4707 scope.go:117] "RemoveContainer" containerID="05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.856022 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.874627 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.874985 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-notification-agent" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-notification-agent" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.875017 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="proxy-httpd" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875023 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" 
containerName="proxy-httpd" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.875032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-central-agent" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875038 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-central-agent" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.875064 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-log" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875069 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-log" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.875083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="sg-core" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875089 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="sg-core" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.875099 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-api" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875105 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-api" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875252 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-api" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875263 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="proxy-httpd" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875270 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="sg-core" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875278 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-central-agent" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875287 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" containerName="nova-api-log" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.875294 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" containerName="ceilometer-notification-agent" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.876290 4707 scope.go:117] "RemoveContainer" containerID="c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.876718 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.878877 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.878975 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.879147 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.882939 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.884070 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.885398 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.885856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.886179 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.891949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.899867 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.903050 4707 scope.go:117] "RemoveContainer" containerID="1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.903647 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91\": container with ID starting with 1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91 not found: ID does not exist" containerID="1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.903679 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91"} err="failed to get container status \"1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91\": rpc error: code = NotFound desc = could not find container \"1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91\": container with ID starting with 1ba5c607705d8dcf62fb3f3ed36324be655fff4a5f2893c94b286b491f734a91 not found: ID does not exist" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.903699 4707 scope.go:117] "RemoveContainer" containerID="4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.904374 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0\": container with ID starting with 
4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0 not found: ID does not exist" containerID="4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.904407 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0"} err="failed to get container status \"4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0\": rpc error: code = NotFound desc = could not find container \"4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0\": container with ID starting with 4addd6e1c79d79b9dedc0b72b717cc525080e613fd83225c5ad1a0b65cc77af0 not found: ID does not exist" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.904430 4707 scope.go:117] "RemoveContainer" containerID="05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.906098 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c\": container with ID starting with 05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c not found: ID does not exist" containerID="05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.906120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c"} err="failed to get container status \"05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c\": rpc error: code = NotFound desc = could not find container \"05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c\": container with ID starting with 05d7da9e44c4ecc31fb153eb628f95649be2dc631edd5567671acc997bf2e34c not found: ID does not exist" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.906134 4707 scope.go:117] "RemoveContainer" containerID="c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.907838 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30\": container with ID starting with c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30 not found: ID does not exist" containerID="c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.907914 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30"} err="failed to get container status \"c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30\": rpc error: code = NotFound desc = could not find container \"c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30\": container with ID starting with c82e55131984c93b084526e4f1cd2aa0b2158e6a2ffe44c2b408377df626ca30 not found: ID does not exist" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.907938 4707 scope.go:117] "RemoveContainer" containerID="8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.927648 4707 scope.go:117] "RemoveContainer" 
containerID="ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.941950 4707 scope.go:117] "RemoveContainer" containerID="8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.943319 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f\": container with ID starting with 8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f not found: ID does not exist" containerID="8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943386 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f"} err="failed to get container status \"8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f\": rpc error: code = NotFound desc = could not find container \"8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f\": container with ID starting with 8c2cc41fe6c722a6599750f2cbb77a9fc226470bc81ba49334f452e8233e3a2f not found: ID does not exist" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943415 4707 scope.go:117] "RemoveContainer" containerID="ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-config-data\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94sdq\" (UniqueName: \"kubernetes.io/projected/a7658f54-ea3a-4d2a-bec0-6a885e661223-kube-api-access-94sdq\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-scripts\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7658f54-ea3a-4d2a-bec0-6a885e661223-logs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.943998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-public-tls-certs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhl2\" (UniqueName: \"kubernetes.io/projected/07daec18-3928-4cae-8fbf-4de0b30c690c-kube-api-access-wnhl2\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-config-data\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-log-httpd\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-run-httpd\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:31 crc kubenswrapper[4707]: E0121 15:23:31.944246 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4\": container with ID starting with ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4 not found: ID does not exist" containerID="ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4" Jan 21 15:23:31 crc kubenswrapper[4707]: I0121 15:23:31.944285 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4"} err="failed to get container status \"ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4\": rpc error: code = NotFound desc = could not find container \"ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4\": container with ID starting with ec994bfcbe4050835d2934c29cec71e24d183c1fd13fbefd15b5186b8c68e6e4 not found: ID does not exist" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94sdq\" (UniqueName: \"kubernetes.io/projected/a7658f54-ea3a-4d2a-bec0-6a885e661223-kube-api-access-94sdq\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-scripts\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7658f54-ea3a-4d2a-bec0-6a885e661223-logs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-public-tls-certs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046415 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wnhl2\" (UniqueName: \"kubernetes.io/projected/07daec18-3928-4cae-8fbf-4de0b30c690c-kube-api-access-wnhl2\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-config-data\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-log-httpd\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-run-httpd\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-config-data\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.046708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7658f54-ea3a-4d2a-bec0-6a885e661223-logs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.047018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-log-httpd\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.047334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-run-httpd\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 
15:23:32.049356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.049590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.049439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.049708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.050028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-scripts\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.050502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-config-data\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.050624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-public-tls-certs\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.051348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-config-data\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.063797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94sdq\" (UniqueName: \"kubernetes.io/projected/a7658f54-ea3a-4d2a-bec0-6a885e661223-kube-api-access-94sdq\") pod \"nova-api-0\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.064683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 
21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.069352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhl2\" (UniqueName: \"kubernetes.io/projected/07daec18-3928-4cae-8fbf-4de0b30c690c-kube-api-access-wnhl2\") pod \"ceilometer-0\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.198566 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.205824 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.613558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.672209 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:23:32 crc kubenswrapper[4707]: W0121 15:23:32.678221 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07daec18_3928_4cae_8fbf_4de0b30c690c.slice/crio-1313d39be45c107bc5cce37c7a7b8160ceb0128f42ec84e9bbdab1c18ca06b51 WatchSource:0}: Error finding container 1313d39be45c107bc5cce37c7a7b8160ceb0128f42ec84e9bbdab1c18ca06b51: Status 404 returned error can't find the container with id 1313d39be45c107bc5cce37c7a7b8160ceb0128f42ec84e9bbdab1c18ca06b51 Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.801373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerStarted","Data":"1313d39be45c107bc5cce37c7a7b8160ceb0128f42ec84e9bbdab1c18ca06b51"} Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.802542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a7658f54-ea3a-4d2a-bec0-6a885e661223","Type":"ContainerStarted","Data":"e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386"} Jan 21 15:23:32 crc kubenswrapper[4707]: I0121 15:23:32.802567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a7658f54-ea3a-4d2a-bec0-6a885e661223","Type":"ContainerStarted","Data":"4d06506824cfce7e30e471be7a467e782bc3996c9e181fba2fe71f33c59f8834"} Jan 21 15:23:33 crc kubenswrapper[4707]: I0121 15:23:33.196469 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d67064-63b6-4a0d-8310-3e41a2e52c3f" path="/var/lib/kubelet/pods/33d67064-63b6-4a0d-8310-3e41a2e52c3f/volumes" Jan 21 15:23:33 crc kubenswrapper[4707]: I0121 15:23:33.197437 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb688fd-a99d-494a-a688-b2c465f0ae48" path="/var/lib/kubelet/pods/3eb688fd-a99d-494a-a688-b2c465f0ae48/volumes" Jan 21 15:23:33 crc kubenswrapper[4707]: I0121 15:23:33.809864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerStarted","Data":"a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858"} Jan 21 15:23:33 crc kubenswrapper[4707]: I0121 15:23:33.811350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"a7658f54-ea3a-4d2a-bec0-6a885e661223","Type":"ContainerStarted","Data":"e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde"} Jan 21 15:23:33 crc kubenswrapper[4707]: I0121 15:23:33.839647 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.839635545 podStartE2EDuration="2.839635545s" podCreationTimestamp="2026-01-21 15:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:23:33.828108117 +0000 UTC m=+1311.009624339" watchObservedRunningTime="2026-01-21 15:23:33.839635545 +0000 UTC m=+1311.021151767" Jan 21 15:23:34 crc kubenswrapper[4707]: I0121 15:23:34.820997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerStarted","Data":"9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e"} Jan 21 15:23:35 crc kubenswrapper[4707]: I0121 15:23:35.840969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerStarted","Data":"a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec"} Jan 21 15:23:36 crc kubenswrapper[4707]: I0121 15:23:36.848801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerStarted","Data":"10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695"} Jan 21 15:23:36 crc kubenswrapper[4707]: I0121 15:23:36.849793 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:23:36 crc kubenswrapper[4707]: I0121 15:23:36.880294 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.40707427 podStartE2EDuration="5.880273824s" podCreationTimestamp="2026-01-21 15:23:31 +0000 UTC" firstStartedPulling="2026-01-21 15:23:32.68083982 +0000 UTC m=+1309.862356043" lastFinishedPulling="2026-01-21 15:23:36.154039375 +0000 UTC m=+1313.335555597" observedRunningTime="2026-01-21 15:23:36.866612795 +0000 UTC m=+1314.048129017" watchObservedRunningTime="2026-01-21 15:23:36.880273824 +0000 UTC m=+1314.061790045" Jan 21 15:23:42 crc kubenswrapper[4707]: I0121 15:23:42.206438 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:42 crc kubenswrapper[4707]: I0121 15:23:42.206904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:43 crc kubenswrapper[4707]: I0121 15:23:43.220923 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.30:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:23:43 crc kubenswrapper[4707]: I0121 15:23:43.220923 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.30:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 
15:23:45 crc kubenswrapper[4707]: I0121 15:23:45.911694 4707 scope.go:117] "RemoveContainer" containerID="841e9305d4ad8534d58b141a550858d835ba6a91bc9f3d6a0944858c473b8ca5" Jan 21 15:23:45 crc kubenswrapper[4707]: I0121 15:23:45.940127 4707 scope.go:117] "RemoveContainer" containerID="37c0a0b5683af8cf2897340a05e69b1b3c4f76f2984fc6f86bd2d0a6a22c70ee" Jan 21 15:23:45 crc kubenswrapper[4707]: I0121 15:23:45.984037 4707 scope.go:117] "RemoveContainer" containerID="13f131c209e26209d96710f9a60dc9ab4422ae7154a9afed2d58a0a38c71156f" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.004269 4707 scope.go:117] "RemoveContainer" containerID="7398d99b8c2d024f0d741e0979a2b5f13e02ba91e8bee355cd0a60ea43c7fc69" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.027680 4707 scope.go:117] "RemoveContainer" containerID="14cc43f0b41c7090b3769eaa7c3c1157933a68ecf8e7d9d7aa35a88d58bb5fa5" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.051181 4707 scope.go:117] "RemoveContainer" containerID="3bd94c36268aea9cfa512b21c4fd0dbdba4e5ab6445d5390d147f8748ffcb799" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.076244 4707 scope.go:117] "RemoveContainer" containerID="c727367a8fb1551f421d3c7b7ffddfef9fb2b665be85437347357093e4b2ff22" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.090981 4707 scope.go:117] "RemoveContainer" containerID="be4c978b2be2e32de5ace8ea709ece67310a8ad3c2471018b5a376668dc85715" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.107286 4707 scope.go:117] "RemoveContainer" containerID="ca98d718c47ed378038fe7df95d2f699834093a794f92ae9aef0b348916a2f30" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.122118 4707 scope.go:117] "RemoveContainer" containerID="b8f8d78c1a69eb77beb67ef7906ecf6a1d6bb3824ff6ae8e2e9f9d9202110932" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.136457 4707 scope.go:117] "RemoveContainer" containerID="48c5225159a3ffc9586877ac0dcbb13aa01282ae7849cfd26facedad54d07afc" Jan 21 15:23:46 crc kubenswrapper[4707]: I0121 15:23:46.151546 4707 scope.go:117] "RemoveContainer" containerID="c0892d23e8c04ef92e90d0d760078d96e6415bc7a7329eaafda050c2690b9fd1" Jan 21 15:23:52 crc kubenswrapper[4707]: I0121 15:23:52.212178 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:52 crc kubenswrapper[4707]: I0121 15:23:52.212591 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:52 crc kubenswrapper[4707]: I0121 15:23:52.212918 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:52 crc kubenswrapper[4707]: I0121 15:23:52.212938 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:52 crc kubenswrapper[4707]: I0121 15:23:52.216875 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:23:52 crc kubenswrapper[4707]: I0121 15:23:52.218128 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:24:02 crc kubenswrapper[4707]: I0121 15:24:02.206237 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.267267 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-wwtsw"] Jan 21 15:24:06 crc 
kubenswrapper[4707]: I0121 15:24:06.268820 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.272057 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovncontroller-ovndbs" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.272082 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncontroller-ovncontroller-dockercfg-tzstr" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.272068 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-scripts" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.277798 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-4vm8l"] Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.279342 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.284254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-wwtsw"] Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-lib\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run-ovn\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-run\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-ovn-controller-tls-certs\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-log-ovn\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-etc-ovs\") pod 
\"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797ea71-9c2c-489f-90d2-a94e803d0bf4-scripts\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72p4\" (UniqueName: \"kubernetes.io/projected/6797ea71-9c2c-489f-90d2-a94e803d0bf4-kube-api-access-v72p4\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-log\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbcz\" (UniqueName: \"kubernetes.io/projected/5feebb91-5bf4-40c1-8f20-b32f37c78561-kube-api-access-xzbcz\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.310796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-combined-ca-bundle\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.319628 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-4vm8l"] Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.376436 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-bxmzj"] Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.377557 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.380384 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-metrics-config" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.390418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-bxmzj"] Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72p4\" (UniqueName: \"kubernetes.io/projected/6797ea71-9c2c-489f-90d2-a94e803d0bf4-kube-api-access-v72p4\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovs-rundir\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-log\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbcz\" (UniqueName: \"kubernetes.io/projected/5feebb91-5bf4-40c1-8f20-b32f37c78561-kube-api-access-xzbcz\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-combined-ca-bundle\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-lib\") pod \"ovn-controller-ovs-4vm8l\" (UID: 
\"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-combined-ca-bundle\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run-ovn\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovn-rundir\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-run\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-ovn-controller-tls-certs\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-log-ovn\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.416953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-etc-ovs\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.417006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797ea71-9c2c-489f-90d2-a94e803d0bf4-scripts\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.417420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run-ovn\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " 
pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.417577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-lib\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.417913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-run\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.418338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-etc-ovs\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.418429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-log-ovn\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.419414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797ea71-9c2c-489f-90d2-a94e803d0bf4-scripts\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.419498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d9af08-3fde-4d03-b82c-db3342193a45-config\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.419761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hzdq\" (UniqueName: \"kubernetes.io/projected/43d9af08-3fde-4d03-b82c-db3342193a45-kube-api-access-5hzdq\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.421274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.422216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-log\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.423419 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-combined-ca-bundle\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.423871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.424047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.430201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-ovn-controller-tls-certs\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.437974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzbcz\" (UniqueName: \"kubernetes.io/projected/5feebb91-5bf4-40c1-8f20-b32f37c78561-kube-api-access-xzbcz\") pod \"ovn-controller-ovs-4vm8l\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.440300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72p4\" (UniqueName: \"kubernetes.io/projected/6797ea71-9c2c-489f-90d2-a94e803d0bf4-kube-api-access-v72p4\") pod \"ovn-controller-wwtsw\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovs-rundir\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovs-rundir\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526369 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-combined-ca-bundle\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovn-rundir\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovn-rundir\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d9af08-3fde-4d03-b82c-db3342193a45-config\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.526552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzdq\" (UniqueName: \"kubernetes.io/projected/43d9af08-3fde-4d03-b82c-db3342193a45-kube-api-access-5hzdq\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.527358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d9af08-3fde-4d03-b82c-db3342193a45-config\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.529095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.529374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-combined-ca-bundle\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.542844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzdq\" (UniqueName: \"kubernetes.io/projected/43d9af08-3fde-4d03-b82c-db3342193a45-kube-api-access-5hzdq\") pod \"ovn-controller-metrics-bxmzj\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.584940 4707 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.598961 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.694742 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:06 crc kubenswrapper[4707]: I0121 15:24:06.993912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-wwtsw"] Jan 21 15:24:07 crc kubenswrapper[4707]: I0121 15:24:07.077443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-wwtsw" event={"ID":"6797ea71-9c2c-489f-90d2-a94e803d0bf4","Type":"ContainerStarted","Data":"a6c09659c3cb3ec9c2750211a7fa2614e278e0fbf4025fe6e626bd201b44c0be"} Jan 21 15:24:07 crc kubenswrapper[4707]: I0121 15:24:07.102078 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-4vm8l"] Jan 21 15:24:07 crc kubenswrapper[4707]: W0121 15:24:07.142929 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43d9af08_3fde_4d03_b82c_db3342193a45.slice/crio-3d4b4d43f0a463024d5ba0db44609cd9252d9e44a4dca6d4f90dd72cbd6210d9 WatchSource:0}: Error finding container 3d4b4d43f0a463024d5ba0db44609cd9252d9e44a4dca6d4f90dd72cbd6210d9: Status 404 returned error can't find the container with id 3d4b4d43f0a463024d5ba0db44609cd9252d9e44a4dca6d4f90dd72cbd6210d9 Jan 21 15:24:07 crc kubenswrapper[4707]: I0121 15:24:07.144299 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-bxmzj"] Jan 21 15:24:08 crc kubenswrapper[4707]: I0121 15:24:08.084877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" event={"ID":"43d9af08-3fde-4d03-b82c-db3342193a45","Type":"ContainerStarted","Data":"c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473"} Jan 21 15:24:08 crc kubenswrapper[4707]: I0121 15:24:08.085305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" event={"ID":"43d9af08-3fde-4d03-b82c-db3342193a45","Type":"ContainerStarted","Data":"3d4b4d43f0a463024d5ba0db44609cd9252d9e44a4dca6d4f90dd72cbd6210d9"} Jan 21 15:24:08 crc kubenswrapper[4707]: I0121 15:24:08.086875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerStarted","Data":"6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f"} Jan 21 15:24:08 crc kubenswrapper[4707]: I0121 15:24:08.086907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerStarted","Data":"564c2c405213d004f600994f33753d0cc72db63610f97799f2e39b6ebea6547d"} Jan 21 15:24:08 crc kubenswrapper[4707]: I0121 15:24:08.101067 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" podStartSLOduration=2.101055547 podStartE2EDuration="2.101055547s" podCreationTimestamp="2026-01-21 15:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-21 15:24:08.101055047 +0000 UTC m=+1345.282571268" watchObservedRunningTime="2026-01-21 15:24:08.101055547 +0000 UTC m=+1345.282571770" Jan 21 15:24:09 crc kubenswrapper[4707]: I0121 15:24:09.094391 4707 generic.go:334] "Generic (PLEG): container finished" podID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerID="6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f" exitCode=0 Jan 21 15:24:09 crc kubenswrapper[4707]: I0121 15:24:09.094467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerDied","Data":"6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f"} Jan 21 15:24:09 crc kubenswrapper[4707]: I0121 15:24:09.096442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-wwtsw" event={"ID":"6797ea71-9c2c-489f-90d2-a94e803d0bf4","Type":"ContainerStarted","Data":"1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2"} Jan 21 15:24:09 crc kubenswrapper[4707]: I0121 15:24:09.121203 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-wwtsw" podStartSLOduration=2.082331131 podStartE2EDuration="3.121187606s" podCreationTimestamp="2026-01-21 15:24:06 +0000 UTC" firstStartedPulling="2026-01-21 15:24:07.004150829 +0000 UTC m=+1344.185667051" lastFinishedPulling="2026-01-21 15:24:08.043007304 +0000 UTC m=+1345.224523526" observedRunningTime="2026-01-21 15:24:09.120490626 +0000 UTC m=+1346.302006848" watchObservedRunningTime="2026-01-21 15:24:09.121187606 +0000 UTC m=+1346.302703828" Jan 21 15:24:10 crc kubenswrapper[4707]: I0121 15:24:10.103909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerStarted","Data":"d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac"} Jan 21 15:24:10 crc kubenswrapper[4707]: I0121 15:24:10.104107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerStarted","Data":"978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb"} Jan 21 15:24:10 crc kubenswrapper[4707]: I0121 15:24:10.104121 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:10 crc kubenswrapper[4707]: I0121 15:24:10.104132 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:10 crc kubenswrapper[4707]: I0121 15:24:10.117174 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" podStartSLOduration=3.510983247 podStartE2EDuration="4.117161054s" podCreationTimestamp="2026-01-21 15:24:06 +0000 UTC" firstStartedPulling="2026-01-21 15:24:07.106295651 +0000 UTC m=+1344.287811873" lastFinishedPulling="2026-01-21 15:24:07.712473457 +0000 UTC m=+1344.893989680" observedRunningTime="2026-01-21 15:24:10.116373683 +0000 UTC m=+1347.297889906" watchObservedRunningTime="2026-01-21 15:24:10.117161054 +0000 UTC m=+1347.298677276" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.109740 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:11 crc 
kubenswrapper[4707]: I0121 15:24:11.439312 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-wwtsw"] Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.447628 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-bxmzj"] Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.447780 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" podUID="43d9af08-3fde-4d03-b82c-db3342193a45" containerName="openstack-network-exporter" containerID="cri-o://c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473" gracePeriod=30 Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.470976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-4vm8l"] Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.851058 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-bxmzj_43d9af08-3fde-4d03-b82c-db3342193a45/openstack-network-exporter/0.log" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.851279 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.908823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-combined-ca-bundle\") pod \"43d9af08-3fde-4d03-b82c-db3342193a45\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.908892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hzdq\" (UniqueName: \"kubernetes.io/projected/43d9af08-3fde-4d03-b82c-db3342193a45-kube-api-access-5hzdq\") pod \"43d9af08-3fde-4d03-b82c-db3342193a45\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.908918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-metrics-certs-tls-certs\") pod \"43d9af08-3fde-4d03-b82c-db3342193a45\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovn-rundir\") pod \"43d9af08-3fde-4d03-b82c-db3342193a45\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovs-rundir\") pod \"43d9af08-3fde-4d03-b82c-db3342193a45\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d9af08-3fde-4d03-b82c-db3342193a45-config\") pod \"43d9af08-3fde-4d03-b82c-db3342193a45\" (UID: \"43d9af08-3fde-4d03-b82c-db3342193a45\") " Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909256 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "43d9af08-3fde-4d03-b82c-db3342193a45" (UID: "43d9af08-3fde-4d03-b82c-db3342193a45"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "43d9af08-3fde-4d03-b82c-db3342193a45" (UID: "43d9af08-3fde-4d03-b82c-db3342193a45"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d9af08-3fde-4d03-b82c-db3342193a45-config" (OuterVolumeSpecName: "config") pod "43d9af08-3fde-4d03-b82c-db3342193a45" (UID: "43d9af08-3fde-4d03-b82c-db3342193a45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909913 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909929 4707 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43d9af08-3fde-4d03-b82c-db3342193a45-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.909936 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d9af08-3fde-4d03-b82c-db3342193a45-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.913350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d9af08-3fde-4d03-b82c-db3342193a45-kube-api-access-5hzdq" (OuterVolumeSpecName: "kube-api-access-5hzdq") pod "43d9af08-3fde-4d03-b82c-db3342193a45" (UID: "43d9af08-3fde-4d03-b82c-db3342193a45"). InnerVolumeSpecName "kube-api-access-5hzdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.929510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43d9af08-3fde-4d03-b82c-db3342193a45" (UID: "43d9af08-3fde-4d03-b82c-db3342193a45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:11 crc kubenswrapper[4707]: I0121 15:24:11.961410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "43d9af08-3fde-4d03-b82c-db3342193a45" (UID: "43d9af08-3fde-4d03-b82c-db3342193a45"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.011261 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.011357 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hzdq\" (UniqueName: \"kubernetes.io/projected/43d9af08-3fde-4d03-b82c-db3342193a45-kube-api-access-5hzdq\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.011424 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43d9af08-3fde-4d03-b82c-db3342193a45-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.117645 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-bxmzj_43d9af08-3fde-4d03-b82c-db3342193a45/openstack-network-exporter/0.log" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.117690 4707 generic.go:334] "Generic (PLEG): container finished" podID="43d9af08-3fde-4d03-b82c-db3342193a45" containerID="c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473" exitCode=2 Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.117727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" event={"ID":"43d9af08-3fde-4d03-b82c-db3342193a45","Type":"ContainerDied","Data":"c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473"} Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.117752 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.117767 4707 scope.go:117] "RemoveContainer" containerID="c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.117756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-bxmzj" event={"ID":"43d9af08-3fde-4d03-b82c-db3342193a45","Type":"ContainerDied","Data":"3d4b4d43f0a463024d5ba0db44609cd9252d9e44a4dca6d4f90dd72cbd6210d9"} Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.118351 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" secret="" err="secret \"ovncontroller-ovncontroller-dockercfg-tzstr\" not found" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.134890 4707 scope.go:117] "RemoveContainer" containerID="c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473" Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.137055 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473\": container with ID starting with c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473 not found: ID does not exist" containerID="c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.137087 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473"} err="failed to get container status \"c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473\": rpc error: code = NotFound desc = could not find container \"c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473\": container with ID starting with c9606d3587ece37994ada9014e94a450a980d4e70eab80b171efa88e321ab473 not found: ID does not exist" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.139965 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-bxmzj"] Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.146745 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-bxmzj"] Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.189028 4707 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack-kuttl-tests/ovn-controller-wwtsw" message=< Jan 21 15:24:12 crc kubenswrapper[4707]: Exiting ovn-controller (1) [ OK ] Jan 21 15:24:12 crc kubenswrapper[4707]: > Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.189067 4707 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack-kuttl-tests/ovn-controller-wwtsw" podUID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" containerName="ovn-controller" containerID="cri-o://1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.189103 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-wwtsw" podUID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" containerName="ovn-controller" containerID="cri-o://1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2" gracePeriod=30 Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.214574 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.214637 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts podName:5feebb91-5bf4-40c1-8f20-b32f37c78561 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:24:12.714624069 +0000 UTC m=+1349.896140290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts") pod "ovn-controller-ovs-4vm8l" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561") : configmap "ovncontroller-scripts" not found Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.401654 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-log-ovn\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-ovn-controller-tls-certs\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72p4\" (UniqueName: \"kubernetes.io/projected/6797ea71-9c2c-489f-90d2-a94e803d0bf4-kube-api-access-v72p4\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run-ovn\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-combined-ca-bundle\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797ea71-9c2c-489f-90d2-a94e803d0bf4-scripts\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run\") pod \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\" (UID: \"6797ea71-9c2c-489f-90d2-a94e803d0bf4\") " Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.418722 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.419006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run" (OuterVolumeSpecName: "var-run") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.419521 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.419540 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.419548 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797ea71-9c2c-489f-90d2-a94e803d0bf4-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.420547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6797ea71-9c2c-489f-90d2-a94e803d0bf4-scripts" (OuterVolumeSpecName: "scripts") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.421242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6797ea71-9c2c-489f-90d2-a94e803d0bf4-kube-api-access-v72p4" (OuterVolumeSpecName: "kube-api-access-v72p4") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "kube-api-access-v72p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.437788 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.470255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "6797ea71-9c2c-489f-90d2-a94e803d0bf4" (UID: "6797ea71-9c2c-489f-90d2-a94e803d0bf4"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.521338 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.521361 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797ea71-9c2c-489f-90d2-a94e803d0bf4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.521372 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6797ea71-9c2c-489f-90d2-a94e803d0bf4-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: I0121 15:24:12.521381 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72p4\" (UniqueName: \"kubernetes.io/projected/6797ea71-9c2c-489f-90d2-a94e803d0bf4-kube-api-access-v72p4\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.724272 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 21 15:24:12 crc kubenswrapper[4707]: E0121 15:24:12.724359 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts podName:5feebb91-5bf4-40c1-8f20-b32f37c78561 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:13.724326641 +0000 UTC m=+1350.905842862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts") pod "ovn-controller-ovs-4vm8l" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561") : configmap "ovncontroller-scripts" not found Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.126585 4707 generic.go:334] "Generic (PLEG): container finished" podID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" containerID="1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2" exitCode=0 Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.126709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-wwtsw" event={"ID":"6797ea71-9c2c-489f-90d2-a94e803d0bf4","Type":"ContainerDied","Data":"1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2"} Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.127139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-wwtsw" event={"ID":"6797ea71-9c2c-489f-90d2-a94e803d0bf4","Type":"ContainerDied","Data":"a6c09659c3cb3ec9c2750211a7fa2614e278e0fbf4025fe6e626bd201b44c0be"} Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.127159 4707 scope.go:117] "RemoveContainer" containerID="1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.126780 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-wwtsw" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.149606 4707 scope.go:117] "RemoveContainer" containerID="1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2" Jan 21 15:24:13 crc kubenswrapper[4707]: E0121 15:24:13.150833 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2\": container with ID starting with 1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2 not found: ID does not exist" containerID="1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.150870 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2"} err="failed to get container status \"1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2\": rpc error: code = NotFound desc = could not find container \"1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2\": container with ID starting with 1260aefbea75487795f11b8372d3b214b985e4dda76155b01ccb19dccf4151c2 not found: ID does not exist" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.155180 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-wwtsw"] Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.162163 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-wwtsw"] Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.189762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d9af08-3fde-4d03-b82c-db3342193a45" path="/var/lib/kubelet/pods/43d9af08-3fde-4d03-b82c-db3342193a45/volumes" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.190465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" path="/var/lib/kubelet/pods/6797ea71-9c2c-489f-90d2-a94e803d0bf4/volumes" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.379673 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovs-vswitchd" containerID="cri-o://d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" gracePeriod=30 Jan 21 15:24:13 crc kubenswrapper[4707]: E0121 15:24:13.719619 4707 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 21 15:24:13 crc kubenswrapper[4707]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 21 15:24:13 crc kubenswrapper[4707]: + source /usr/local/bin/container-scripts/functions Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNBridge=br-int Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNRemote=tcp:localhost:6642 Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNEncapType=geneve Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNAvailabilityZones= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ EnableChassisAsGateway=true Jan 21 15:24:13 crc kubenswrapper[4707]: ++ PhysicalNetworks= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNHostName= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 21 15:24:13 crc kubenswrapper[4707]: ++ ovs_dir=/var/lib/openvswitch Jan 21 
15:24:13 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 21 15:24:13 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 21 15:24:13 crc kubenswrapper[4707]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:24:13 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:24:13 crc kubenswrapper[4707]: + sleep 0.5 Jan 21 15:24:13 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:24:13 crc kubenswrapper[4707]: + cleanup_ovsdb_server_semaphore Jan 21 15:24:13 crc kubenswrapper[4707]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:24:13 crc kubenswrapper[4707]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 21 15:24:13 crc kubenswrapper[4707]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" message=< Jan 21 15:24:13 crc kubenswrapper[4707]: Exiting ovsdb-server (5) [ OK ] Jan 21 15:24:13 crc kubenswrapper[4707]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 21 15:24:13 crc kubenswrapper[4707]: + source /usr/local/bin/container-scripts/functions Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNBridge=br-int Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNRemote=tcp:localhost:6642 Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNEncapType=geneve Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNAvailabilityZones= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ EnableChassisAsGateway=true Jan 21 15:24:13 crc kubenswrapper[4707]: ++ PhysicalNetworks= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNHostName= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 21 15:24:13 crc kubenswrapper[4707]: ++ ovs_dir=/var/lib/openvswitch Jan 21 15:24:13 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 21 15:24:13 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 21 15:24:13 crc kubenswrapper[4707]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:24:13 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:24:13 crc kubenswrapper[4707]: + sleep 0.5 Jan 21 15:24:13 crc kubenswrapper[4707]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:24:13 crc kubenswrapper[4707]: + cleanup_ovsdb_server_semaphore Jan 21 15:24:13 crc kubenswrapper[4707]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:24:13 crc kubenswrapper[4707]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 21 15:24:13 crc kubenswrapper[4707]: > Jan 21 15:24:13 crc kubenswrapper[4707]: E0121 15:24:13.719654 4707 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 21 15:24:13 crc kubenswrapper[4707]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 21 15:24:13 crc kubenswrapper[4707]: + source /usr/local/bin/container-scripts/functions Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNBridge=br-int Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNRemote=tcp:localhost:6642 Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNEncapType=geneve Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNAvailabilityZones= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ EnableChassisAsGateway=true Jan 21 15:24:13 crc kubenswrapper[4707]: ++ PhysicalNetworks= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ OVNHostName= Jan 21 15:24:13 crc kubenswrapper[4707]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 21 15:24:13 crc kubenswrapper[4707]: ++ ovs_dir=/var/lib/openvswitch Jan 21 15:24:13 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 21 15:24:13 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 21 15:24:13 crc kubenswrapper[4707]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:24:13 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:24:13 crc kubenswrapper[4707]: + sleep 0.5 Jan 21 15:24:13 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:24:13 crc kubenswrapper[4707]: + cleanup_ovsdb_server_semaphore Jan 21 15:24:13 crc kubenswrapper[4707]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:24:13 crc kubenswrapper[4707]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 21 15:24:13 crc kubenswrapper[4707]: > pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server" containerID="cri-o://978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" Jan 21 15:24:13 crc kubenswrapper[4707]: I0121 15:24:13.719683 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server" containerID="cri-o://978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" gracePeriod=30 Jan 21 15:24:13 crc kubenswrapper[4707]: E0121 15:24:13.739683 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 21 15:24:13 crc kubenswrapper[4707]: E0121 15:24:13.739732 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts podName:5feebb91-5bf4-40c1-8f20-b32f37c78561 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:15.739719621 +0000 UTC m=+1352.921235843 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts") pod "ovn-controller-ovs-4vm8l" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561") : configmap "ovncontroller-scripts" not found Jan 21 15:24:14 crc kubenswrapper[4707]: I0121 15:24:14.135249 4707 generic.go:334] "Generic (PLEG): container finished" podID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" exitCode=0 Jan 21 15:24:14 crc kubenswrapper[4707]: I0121 15:24:14.135322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerDied","Data":"978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb"} Jan 21 15:24:15 crc kubenswrapper[4707]: E0121 15:24:15.768446 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 21 15:24:15 crc kubenswrapper[4707]: E0121 15:24:15.768725 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts podName:5feebb91-5bf4-40c1-8f20-b32f37c78561 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:19.76871083 +0000 UTC m=+1356.950227051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts") pod "ovn-controller-ovs-4vm8l" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561") : configmap "ovncontroller-scripts" not found Jan 21 15:24:19 crc kubenswrapper[4707]: E0121 15:24:19.827066 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 21 15:24:19 crc kubenswrapper[4707]: E0121 15:24:19.827278 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts podName:5feebb91-5bf4-40c1-8f20-b32f37c78561 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:27.827265481 +0000 UTC m=+1365.008781702 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts") pod "ovn-controller-ovs-4vm8l" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561") : configmap "ovncontroller-scripts" not found Jan 21 15:24:27 crc kubenswrapper[4707]: E0121 15:24:27.833027 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 21 15:24:27 crc kubenswrapper[4707]: E0121 15:24:27.833363 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts podName:5feebb91-5bf4-40c1-8f20-b32f37c78561 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:43.83334961 +0000 UTC m=+1381.014865833 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts") pod "ovn-controller-ovs-4vm8l" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561") : configmap "ovncontroller-scripts" not found Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.600251 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb is running failed: container process not found" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.601009 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb is running failed: container process not found" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.601223 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb is running failed: container process not found" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.601254 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server" Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.601656 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.602706 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.603576 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 21 15:24:41 crc kubenswrapper[4707]: E0121 15:24:41.603603 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovs-vswitchd" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.709930 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-4vm8l_5feebb91-5bf4-40c1-8f20-b32f37c78561/ovs-vswitchd/0.log" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.711430 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-log\") pod \"5feebb91-5bf4-40c1-8f20-b32f37c78561\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-log" (OuterVolumeSpecName: "var-log") pod "5feebb91-5bf4-40c1-8f20-b32f37c78561" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-etc-ovs\") pod \"5feebb91-5bf4-40c1-8f20-b32f37c78561\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts\") pod \"5feebb91-5bf4-40c1-8f20-b32f37c78561\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765839 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-lib\") pod \"5feebb91-5bf4-40c1-8f20-b32f37c78561\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "5feebb91-5bf4-40c1-8f20-b32f37c78561" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-run\") pod \"5feebb91-5bf4-40c1-8f20-b32f37c78561\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzbcz\" (UniqueName: \"kubernetes.io/projected/5feebb91-5bf4-40c1-8f20-b32f37c78561-kube-api-access-xzbcz\") pod \"5feebb91-5bf4-40c1-8f20-b32f37c78561\" (UID: \"5feebb91-5bf4-40c1-8f20-b32f37c78561\") " Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-lib" (OuterVolumeSpecName: "var-lib") pod "5feebb91-5bf4-40c1-8f20-b32f37c78561" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.765988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-run" (OuterVolumeSpecName: "var-run") pod "5feebb91-5bf4-40c1-8f20-b32f37c78561" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.766434 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.766451 4707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-log\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.766458 4707 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.766466 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5feebb91-5bf4-40c1-8f20-b32f37c78561-var-lib\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.766586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts" (OuterVolumeSpecName: "scripts") pod "5feebb91-5bf4-40c1-8f20-b32f37c78561" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.770445 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5feebb91-5bf4-40c1-8f20-b32f37c78561-kube-api-access-xzbcz" (OuterVolumeSpecName: "kube-api-access-xzbcz") pod "5feebb91-5bf4-40c1-8f20-b32f37c78561" (UID: "5feebb91-5bf4-40c1-8f20-b32f37c78561"). InnerVolumeSpecName "kube-api-access-xzbcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.868595 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5feebb91-5bf4-40c1-8f20-b32f37c78561-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:43 crc kubenswrapper[4707]: I0121 15:24:43.868624 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzbcz\" (UniqueName: \"kubernetes.io/projected/5feebb91-5bf4-40c1-8f20-b32f37c78561-kube-api-access-xzbcz\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.325272 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-4vm8l_5feebb91-5bf4-40c1-8f20-b32f37c78561/ovs-vswitchd/0.log" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.326415 4707 generic.go:334] "Generic (PLEG): container finished" podID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" exitCode=137 Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.326451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerDied","Data":"d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac"} Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.326465 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.326476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-4vm8l" event={"ID":"5feebb91-5bf4-40c1-8f20-b32f37c78561","Type":"ContainerDied","Data":"564c2c405213d004f600994f33753d0cc72db63610f97799f2e39b6ebea6547d"} Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.326493 4707 scope.go:117] "RemoveContainer" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.344127 4707 scope.go:117] "RemoveContainer" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.349787 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-4vm8l"] Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.356065 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-4vm8l"] Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.360347 4707 scope.go:117] "RemoveContainer" containerID="6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.379370 4707 scope.go:117] "RemoveContainer" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" Jan 21 15:24:44 crc kubenswrapper[4707]: E0121 15:24:44.379734 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac\": container with ID starting with d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac not found: ID does not exist" containerID="d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.379764 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac"} err="failed to get container status \"d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac\": rpc error: code = NotFound desc = could not find container \"d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac\": container with ID starting with d5e61863accdd6616ac9d891745b0b197ad59dea4afe691f03e5bf60d4701cac not found: ID does not exist" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.379784 4707 scope.go:117] "RemoveContainer" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" Jan 21 15:24:44 crc kubenswrapper[4707]: E0121 15:24:44.380389 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb\": container with ID starting with 978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb not found: ID does not exist" containerID="978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.380422 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb"} err="failed to get container status \"978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb\": rpc error: code = NotFound desc = could not find container \"978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb\": container with ID starting with 978d22fff8b61d1566a916a897dd8e89895f6aa6c7fbe6d9920da7bcdf41bfeb not found: ID does not exist" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.380440 4707 scope.go:117] "RemoveContainer" containerID="6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f" Jan 21 15:24:44 crc kubenswrapper[4707]: E0121 15:24:44.380706 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f\": container with ID starting with 6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f not found: ID does not exist" containerID="6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f" Jan 21 15:24:44 crc kubenswrapper[4707]: I0121 15:24:44.380739 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f"} err="failed to get container status \"6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f\": rpc error: code = NotFound desc = could not find container \"6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f\": container with ID starting with 6edcc954afa71306e3fb5d41fcfe73b5d534ee55ac5e049b1ac1dd627c6ee15f not found: ID does not exist" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.190747 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" path="/var/lib/kubelet/pods/5feebb91-5bf4-40c1-8f20-b32f37c78561/volumes" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.505134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.523687 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/placement-a414-account-create-update-xpbnj"] Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.524043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" containerName="ovn-controller" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524061 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" containerName="ovn-controller" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.524078 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524084 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.524100 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server-init" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524106 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server-init" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.524116 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovs-vswitchd" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524120 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovs-vswitchd" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.524140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d9af08-3fde-4d03-b82c-db3342193a45" containerName="openstack-network-exporter" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d9af08-3fde-4d03-b82c-db3342193a45" containerName="openstack-network-exporter" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524289 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6797ea71-9c2c-489f-90d2-a94e803d0bf4" containerName="ovn-controller" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524299 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovsdb-server" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524309 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5feebb91-5bf4-40c1-8f20-b32f37c78561" containerName="ovs-vswitchd" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524320 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d9af08-3fde-4d03-b82c-db3342193a45" containerName="openstack-network-exporter" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.524784 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.532641 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.561630 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tnlbm"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.562558 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.570786 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.595123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/574e045d-bacf-4c78-8091-c33d0beb09c9-operator-scripts\") pod \"placement-a414-account-create-update-xpbnj\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.595172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww75t\" (UniqueName: \"kubernetes.io/projected/7b113905-777a-4919-96c0-704f8e6f4026-kube-api-access-ww75t\") pod \"root-account-create-update-tnlbm\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.595192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdbq7\" (UniqueName: \"kubernetes.io/projected/574e045d-bacf-4c78-8091-c33d0beb09c9-kube-api-access-fdbq7\") pod \"placement-a414-account-create-update-xpbnj\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.595248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts\") pod \"root-account-create-update-tnlbm\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.613709 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-4cm8d"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.640938 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.641093 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" containerName="openstackclient" containerID="cri-o://3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a" gracePeriod=2 Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.667455 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf"] Jan 21 15:24:45 crc 
kubenswrapper[4707]: I0121 15:24:45.668743 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.679356 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-4cm8d"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.690083 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.691564 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.695887 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.699740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/574e045d-bacf-4c78-8091-c33d0beb09c9-operator-scripts\") pod \"placement-a414-account-create-update-xpbnj\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.699797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww75t\" (UniqueName: \"kubernetes.io/projected/7b113905-777a-4919-96c0-704f8e6f4026-kube-api-access-ww75t\") pod \"root-account-create-update-tnlbm\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.699832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdbq7\" (UniqueName: \"kubernetes.io/projected/574e045d-bacf-4c78-8091-c33d0beb09c9-kube-api-access-fdbq7\") pod \"placement-a414-account-create-update-xpbnj\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.699900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts\") pod \"root-account-create-update-tnlbm\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.701060 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.701103 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data podName:4fbbd17b-fed9-463c-9c25-435f84d0205e nodeName:}" failed. No retries permitted until 2026-01-21 15:24:46.201091289 +0000 UTC m=+1383.382607512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data") pod "rabbitmq-cell1-server-0" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.701621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts\") pod \"root-account-create-update-tnlbm\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.701752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/574e045d-bacf-4c78-8091-c33d0beb09c9-operator-scripts\") pod \"placement-a414-account-create-update-xpbnj\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.705401 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-xpbnj"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.713871 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tnlbm"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.726996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.748999 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.754919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdbq7\" (UniqueName: \"kubernetes.io/projected/574e045d-bacf-4c78-8091-c33d0beb09c9-kube-api-access-fdbq7\") pod \"placement-a414-account-create-update-xpbnj\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.761297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww75t\" (UniqueName: \"kubernetes.io/projected/7b113905-777a-4919-96c0-704f8e6f4026-kube-api-access-ww75t\") pod \"root-account-create-update-tnlbm\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da066dd9-9af4-46cf-9fea-e8088a738220-logs\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc 
kubenswrapper[4707]: I0121 15:24:45.802488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsbr\" (UniqueName: \"kubernetes.io/projected/577d42b7-93f8-47f8-987a-58995f68dd9b-kube-api-access-6vsbr\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdtbg\" (UniqueName: \"kubernetes.io/projected/da066dd9-9af4-46cf-9fea-e8088a738220-kube-api-access-zdtbg\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d42b7-93f8-47f8-987a-58995f68dd9b-logs\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data-custom\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.802846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data-custom\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc 
kubenswrapper[4707]: I0121 15:24:45.835906 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pdhwl"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.841175 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.876517 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.904193 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pdhwl"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsbr\" (UniqueName: \"kubernetes.io/projected/577d42b7-93f8-47f8-987a-58995f68dd9b-kube-api-access-6vsbr\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdtbg\" (UniqueName: \"kubernetes.io/projected/da066dd9-9af4-46cf-9fea-e8088a738220-kube-api-access-zdtbg\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d42b7-93f8-47f8-987a-58995f68dd9b-logs\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle\") pod 
\"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data-custom\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data-custom\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.905624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da066dd9-9af4-46cf-9fea-e8088a738220-logs\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.905794 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.905855 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle podName:da066dd9-9af4-46cf-9fea-e8088a738220 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:46.40584151 +0000 UTC m=+1383.587357733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle") pod "barbican-worker-5f75796cbf-qqfgf" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220") : secret "combined-ca-bundle" not found Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.906667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d42b7-93f8-47f8-987a-58995f68dd9b-logs\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.912648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.912985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da066dd9-9af4-46cf-9fea-e8088a738220-logs\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.913158 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.913273 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle podName:577d42b7-93f8-47f8-987a-58995f68dd9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:46.413249042 +0000 UTC m=+1383.594765263 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle") pod "barbican-keystone-listener-7b97d48fcb-bq5h5" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b") : secret "combined-ca-bundle" not found Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.917999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.930775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data-custom\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.950309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data-custom\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.950837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdtbg\" (UniqueName: \"kubernetes.io/projected/da066dd9-9af4-46cf-9fea-e8088a738220-kube-api-access-zdtbg\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.954892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsbr\" (UniqueName: \"kubernetes.io/projected/577d42b7-93f8-47f8-987a-58995f68dd9b-kube-api-access-6vsbr\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.959215 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.959545 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="openstack-network-exporter" containerID="cri-o://e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718" gracePeriod=300 Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.991486 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-fpc6b"] Jan 21 15:24:45 crc kubenswrapper[4707]: E0121 15:24:45.991951 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" containerName="openstackclient" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.991970 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" containerName="openstackclient" Jan 21 15:24:45 crc kubenswrapper[4707]: 
I0121 15:24:45.992150 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" containerName="openstackclient" Jan 21 15:24:45 crc kubenswrapper[4707]: I0121 15:24:45.992679 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.003585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.010336 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-s5sw9"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.038022 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7rjvw"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.042781 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="ovsdbserver-sb" containerID="cri-o://9ea2a83aab093f5ad72f13e2481f0741c78195045448ff4cfb2387b4ed4d0678" gracePeriod=300 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.059927 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7rjvw"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.081465 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-s5sw9"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.086848 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.087861 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.093511 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.112100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts\") pod \"glance-8403-account-create-update-fpc6b\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.112143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k879\" (UniqueName: \"kubernetes.io/projected/d8e220c0-ad06-4f2b-be03-642acdd3a23a-kube-api-access-2k879\") pod \"glance-8403-account-create-update-fpc6b\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.118637 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-fpc6b"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.129759 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-xr874"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.137351 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.146003 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.149717 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.177530 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-xr874"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.188148 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.214639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxwb\" (UniqueName: \"kubernetes.io/projected/c86afe68-6c61-4a64-bdb3-2f489efbea33-kube-api-access-hbxwb\") pod \"neutron-06ca-account-create-update-49sxr\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.214729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts\") pod \"neutron-06ca-account-create-update-49sxr\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.214849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts\") pod \"glance-8403-account-create-update-fpc6b\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.214881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k879\" (UniqueName: \"kubernetes.io/projected/d8e220c0-ad06-4f2b-be03-642acdd3a23a-kube-api-access-2k879\") pod \"glance-8403-account-create-update-fpc6b\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.214912 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-2ztql"] Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.214990 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.215029 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data podName:4fbbd17b-fed9-463c-9c25-435f84d0205e nodeName:}" failed. No retries permitted until 2026-01-21 15:24:47.215016103 +0000 UTC m=+1384.396532325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data") pod "rabbitmq-cell1-server-0" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.215621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts\") pod \"glance-8403-account-create-update-fpc6b\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.246861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.247428 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="openstack-network-exporter" containerID="cri-o://e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02" gracePeriod=300 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.276272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k879\" (UniqueName: \"kubernetes.io/projected/d8e220c0-ad06-4f2b-be03-642acdd3a23a-kube-api-access-2k879\") pod \"glance-8403-account-create-update-fpc6b\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.306799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.321760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxwb\" (UniqueName: 
\"kubernetes.io/projected/c86afe68-6c61-4a64-bdb3-2f489efbea33-kube-api-access-hbxwb\") pod \"neutron-06ca-account-create-update-49sxr\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.321847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts\") pod \"neutron-06ca-account-create-update-49sxr\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.321897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45dd8867-a733-447e-adc9-68b1250fb2ba-operator-scripts\") pod \"barbican-6090-account-create-update-xr874\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.321922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwnx\" (UniqueName: \"kubernetes.io/projected/45dd8867-a733-447e-adc9-68b1250fb2ba-kube-api-access-wmwnx\") pod \"barbican-6090-account-create-update-xr874\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.322661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts\") pod \"neutron-06ca-account-create-update-49sxr\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.328108 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.328455 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-llxnn"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.367852 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-llxnn"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.409590 4707 scope.go:117] "RemoveContainer" containerID="073faa94385891588bd04cabc70df3b815c7cb775dc37edc546cf23853834b62" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.417308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbxwb\" (UniqueName: \"kubernetes.io/projected/c86afe68-6c61-4a64-bdb3-2f489efbea33-kube-api-access-hbxwb\") pod \"neutron-06ca-account-create-update-49sxr\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.418677 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.423874 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.424899 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.433035 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.436409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45dd8867-a733-447e-adc9-68b1250fb2ba-operator-scripts\") pod \"barbican-6090-account-create-update-xr874\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.436474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwnx\" (UniqueName: \"kubernetes.io/projected/45dd8867-a733-447e-adc9-68b1250fb2ba-kube-api-access-wmwnx\") pod \"barbican-6090-account-create-update-xr874\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.437225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45dd8867-a733-447e-adc9-68b1250fb2ba-operator-scripts\") pod \"barbican-6090-account-create-update-xr874\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.447464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.447746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.448277 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.448312 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle podName:da066dd9-9af4-46cf-9fea-e8088a738220 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:47.448299285 +0000 UTC m=+1384.629815507 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle") pod "barbican-worker-5f75796cbf-qqfgf" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220") : secret "combined-ca-bundle" not found Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.448883 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.448907 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle podName:577d42b7-93f8-47f8-987a-58995f68dd9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:47.448899133 +0000 UTC m=+1384.630415355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle") pod "barbican-keystone-listener-7b97d48fcb-bq5h5" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b") : secret "combined-ca-bundle" not found Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.449008 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.450157 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data podName:1bc7660e-aaa9-4865-a873-4db17c1db628 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:46.950144994 +0000 UTC m=+1384.131661216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data") pod "rabbitmq-server-0" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628") : configmap "rabbitmq-config-data" not found Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.476867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-v66nf"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.494544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwnx\" (UniqueName: \"kubernetes.io/projected/45dd8867-a733-447e-adc9-68b1250fb2ba-kube-api-access-wmwnx\") pod \"barbican-6090-account-create-update-xr874\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.520532 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="ovsdbserver-nb" containerID="cri-o://71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148" gracePeriod=300 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.530646 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-v66nf"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.532886 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_b6a9b240-8c1c-45c5-98a0-0a5f624a0506/ovsdbserver-sb/0.log" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.532920 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" 
containerID="e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718" exitCode=2 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.532934 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerID="9ea2a83aab093f5ad72f13e2481f0741c78195045448ff4cfb2387b4ed4d0678" exitCode=143 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.532953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"b6a9b240-8c1c-45c5-98a0-0a5f624a0506","Type":"ContainerDied","Data":"e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718"} Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.532974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"b6a9b240-8c1c-45c5-98a0-0a5f624a0506","Type":"ContainerDied","Data":"9ea2a83aab093f5ad72f13e2481f0741c78195045448ff4cfb2387b4ed4d0678"} Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.548009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.554466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44gkn\" (UniqueName: \"kubernetes.io/projected/9ff945cd-386c-4385-91aa-0fb0b9b861c5-kube-api-access-44gkn\") pod \"nova-api-e09b-account-create-update-q69js\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.554585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff945cd-386c-4385-91aa-0fb0b9b861c5-operator-scripts\") pod \"nova-api-e09b-account-create-update-q69js\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.559030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.588575 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.589634 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.593410 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.599870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.600885 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.602368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.653667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.658941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44gkn\" (UniqueName: \"kubernetes.io/projected/9ff945cd-386c-4385-91aa-0fb0b9b861c5-kube-api-access-44gkn\") pod \"nova-api-e09b-account-create-update-q69js\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.659119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff945cd-386c-4385-91aa-0fb0b9b861c5-operator-scripts\") pod \"nova-api-e09b-account-create-update-q69js\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.659149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a45393-4050-40be-a82d-69550d6cb89e-operator-scripts\") pod \"nova-cell1-d429-account-create-update-4xh55\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.659238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvt9\" (UniqueName: \"kubernetes.io/projected/01a45393-4050-40be-a82d-69550d6cb89e-kube-api-access-nwvt9\") pod \"nova-cell1-d429-account-create-update-4xh55\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.659288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac603bf-ede6-4aba-a1fd-e34a94470ada-operator-scripts\") pod \"nova-cell0-0d6b-account-create-update-qkcbk\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.659306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4gb\" (UniqueName: \"kubernetes.io/projected/2ac603bf-ede6-4aba-a1fd-e34a94470ada-kube-api-access-mj4gb\") pod \"nova-cell0-0d6b-account-create-update-qkcbk\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.660102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff945cd-386c-4385-91aa-0fb0b9b861c5-operator-scripts\") pod \"nova-api-e09b-account-create-update-q69js\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " 
pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.662851 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.674682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-zlwl4"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.686410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44gkn\" (UniqueName: \"kubernetes.io/projected/9ff945cd-386c-4385-91aa-0fb0b9b861c5-kube-api-access-44gkn\") pod \"nova-api-e09b-account-create-update-q69js\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.706786 4707 scope.go:117] "RemoveContainer" containerID="a7150f1051360f92a9f0e5b088d60627d5673203037cb805b18b7152c14ec772" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.709175 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-zlwl4"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.754777 4707 scope.go:117] "RemoveContainer" containerID="a1be8011bc6045d525baadc48d11d4837fc2704a537669c89fedb85109e454de" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.785489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a45393-4050-40be-a82d-69550d6cb89e-operator-scripts\") pod \"nova-cell1-d429-account-create-update-4xh55\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.785742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvt9\" (UniqueName: \"kubernetes.io/projected/01a45393-4050-40be-a82d-69550d6cb89e-kube-api-access-nwvt9\") pod \"nova-cell1-d429-account-create-update-4xh55\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.785785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac603bf-ede6-4aba-a1fd-e34a94470ada-operator-scripts\") pod \"nova-cell0-0d6b-account-create-update-qkcbk\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.785824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj4gb\" (UniqueName: \"kubernetes.io/projected/2ac603bf-ede6-4aba-a1fd-e34a94470ada-kube-api-access-mj4gb\") pod \"nova-cell0-0d6b-account-create-update-qkcbk\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.797325 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.798105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a45393-4050-40be-a82d-69550d6cb89e-operator-scripts\") pod \"nova-cell1-d429-account-create-update-4xh55\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.803141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac603bf-ede6-4aba-a1fd-e34a94470ada-operator-scripts\") pod \"nova-cell0-0d6b-account-create-update-qkcbk\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.808138 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:46 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:46 crc kubenswrapper[4707]: Jan 21 15:24:46 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:46 crc kubenswrapper[4707]: Jan 21 15:24:46 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:46 crc kubenswrapper[4707]: Jan 21 15:24:46 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:46 crc kubenswrapper[4707]: Jan 21 15:24:46 crc kubenswrapper[4707]: if [ -n "placement" ]; then Jan 21 15:24:46 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 15:24:46 crc kubenswrapper[4707]: else Jan 21 15:24:46 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:46 crc kubenswrapper[4707]: fi Jan 21 15:24:46 crc kubenswrapper[4707]: Jan 21 15:24:46 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:46 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:46 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:46 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:46 crc kubenswrapper[4707]: # support updates Jan 21 15:24:46 crc kubenswrapper[4707]: Jan 21 15:24:46 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:46 crc kubenswrapper[4707]: E0121 15:24:46.809875 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" podUID="574e045d-bacf-4c78-8091-c33d0beb09c9" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.827303 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.827560 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="ovn-northd" containerID="cri-o://8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.827991 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="openstack-network-exporter" containerID="cri-o://eeacdf781fdc40d12fa1f766ca9a1e9ce44c84afa1271e3f9baffd10cf9d9255" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.875170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvt9\" (UniqueName: \"kubernetes.io/projected/01a45393-4050-40be-a82d-69550d6cb89e-kube-api-access-nwvt9\") pod \"nova-cell1-d429-account-create-update-4xh55\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.885723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj4gb\" (UniqueName: \"kubernetes.io/projected/2ac603bf-ede6-4aba-a1fd-e34a94470ada-kube-api-access-mj4gb\") pod \"nova-cell0-0d6b-account-create-update-qkcbk\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.919450 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.919841 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-server" containerID="cri-o://cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920173 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="swift-recon-cron" containerID="cri-o://367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920226 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="rsync" 
containerID="cri-o://eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920255 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-expirer" containerID="cri-o://e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920296 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-updater" containerID="cri-o://944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920337 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-auditor" containerID="cri-o://f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920371 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-replicator" containerID="cri-o://15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.920397 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-server" containerID="cri-o://1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921727 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-updater" containerID="cri-o://a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921774 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-auditor" containerID="cri-o://1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921820 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-replicator" containerID="cri-o://c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921851 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-server" containerID="cri-o://2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921878 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" 
podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-reaper" containerID="cri-o://5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921914 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-auditor" containerID="cri-o://8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.921954 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-replicator" containerID="cri-o://9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.923510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.934077 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-xczbb"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.949371 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.966076 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.975012 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.975213 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="cinder-scheduler" containerID="cri-o://1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444" gracePeriod=30 Jan 21 15:24:46 crc kubenswrapper[4707]: I0121 15:24:46.975558 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="probe" containerID="cri-o://c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.002676 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-xczbb"] Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.019278 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.019331 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data podName:1bc7660e-aaa9-4865-a873-4db17c1db628 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:48.019317743 +0000 UTC m=+1385.200833965 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data") pod "rabbitmq-server-0" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628") : configmap "rabbitmq-config-data" not found Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.022836 4707 scope.go:117] "RemoveContainer" containerID="12e34aafd762407b78d68f9a01a747e62412bb16c46c84d0bc5833e5a2e16f5d" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.023894 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-fns8l"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.091114 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.092970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.099925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.100244 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_b6a9b240-8c1c-45c5-98a0-0a5f624a0506/ovsdbserver-sb/0.log" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.100332 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.136305 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tnlbm"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.147930 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.147981 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-cphl6"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.151039 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.155495 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-htwbh"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.161026 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-xpbnj"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.169801 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4bftn"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.169866 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4bftn"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.217454 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a1405c-3d18-46d7-888c-2273a3ea7e6e" path="/var/lib/kubelet/pods/15a1405c-3d18-46d7-888c-2273a3ea7e6e/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.217975 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3a71af-3540-4d7c-9b47-04bf4098aa40" 
path="/var/lib/kubelet/pods/1a3a71af-3540-4d7c-9b47-04bf4098aa40/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.218486 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d41a013-882b-4118-a008-d1fe0ec11a22" path="/var/lib/kubelet/pods/2d41a013-882b-4118-a008-d1fe0ec11a22/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.219073 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5185f1f9-4519-435b-a4fd-53fc033ad864" path="/var/lib/kubelet/pods/5185f1f9-4519-435b-a4fd-53fc033ad864/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.221148 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5560e962-f3b1-420e-9e8a-f443c99d1e43" path="/var/lib/kubelet/pods/5560e962-f3b1-420e-9e8a-f443c99d1e43/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.221668 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8304c0-9267-4892-8907-574c07ead717" path="/var/lib/kubelet/pods/5a8304c0-9267-4892-8907-574c07ead717/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.222161 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8c3cab-2625-44be-97b5-1d4d5f210b6e" path="/var/lib/kubelet/pods/5a8c3cab-2625-44be-97b5-1d4d5f210b6e/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.223106 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6541749d-0704-4c9c-adec-82e776820a93" path="/var/lib/kubelet/pods/6541749d-0704-4c9c-adec-82e776820a93/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.223702 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685bf29d-d0cc-43bc-9d01-6585cc8b0f18" path="/var/lib/kubelet/pods/685bf29d-d0cc-43bc-9d01-6585cc8b0f18/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.224249 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1c474d-9462-45ca-8889-0e62e9fa5fe4" path="/var/lib/kubelet/pods/aa1c474d-9462-45ca-8889-0e62e9fa5fe4/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.224761 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3dcb05d-aaf0-4646-a1dd-b52ada0e0980" path="/var/lib/kubelet/pods/c3dcb05d-aaf0-4646-a1dd-b52ada0e0980/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.229235 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c579748e-251c-4267-b9bf-64849d6baaea" path="/var/lib/kubelet/pods/c579748e-251c-4267-b9bf-64849d6baaea/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.231361 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89d96bb-d73b-45b0-a807-1644c19c9a4c" path="/var/lib/kubelet/pods/d89d96bb-d73b-45b0-a807-1644c19c9a4c/volumes" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.237629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-scripts\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.238104 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.238126 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-kdl56"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 
15:24:47.238290 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api-log" containerID="cri-o://2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.238551 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api" containerID="cri-o://1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.239552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-scripts" (OuterVolumeSpecName: "scripts") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.246934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-combined-ca-bundle\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdb-rundir\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjf6s\" (UniqueName: \"kubernetes.io/projected/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-kube-api-access-vjf6s\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-config\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-metrics-certs-tls-certs\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdbserver-sb-tls-certs\") pod \"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\" (UID: 
\"b6a9b240-8c1c-45c5-98a0-0a5f624a0506\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247638 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-config" (OuterVolumeSpecName: "config") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.247903 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp8zk\" (UniqueName: \"kubernetes.io/projected/80b52f69-1c27-4d55-a847-c304552c27d0-kube-api-access-wp8zk\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.248080 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.248094 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.248102 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.248216 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.248361 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data podName:4fbbd17b-fed9-463c-9c25-435f84d0205e nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.248290565 +0000 UTC m=+1386.429806787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data") pod "rabbitmq-cell1-server-0" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.251855 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-kdl56"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.272564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-kube-api-access-vjf6s" (OuterVolumeSpecName: "kube-api-access-vjf6s") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "kube-api-access-vjf6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.275487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.326206 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4zsrk"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.336626 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4zsrk"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.343481 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.343917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.350763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.350829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp8zk\" (UniqueName: \"kubernetes.io/projected/80b52f69-1c27-4d55-a847-c304552c27d0-kube-api-access-wp8zk\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.350930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.351036 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.351048 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjf6s\" (UniqueName: \"kubernetes.io/projected/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-kube-api-access-vjf6s\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.351067 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.352218 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-8672-account-create-update-6qhrw"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.352576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.352660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.365850 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-55779dfdd6-4x8fn"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.366045 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-log" 
containerID="cri-o://84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.366141 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-api" containerID="cri-o://5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.403072 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.403663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp8zk\" (UniqueName: \"kubernetes.io/projected/80b52f69-1c27-4d55-a847-c304552c27d0-kube-api-access-wp8zk\") pod \"dnsmasq-dnsmasq-84b9f45d47-sj9hn\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.406117 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.413723 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fdzkv"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.434049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.434056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.436210 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.445229 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-z72fl"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.452372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.452571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.452772 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.452784 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.452868 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.452916 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle podName:da066dd9-9af4-46cf-9fea-e8088a738220 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.452895242 +0000 UTC m=+1386.634411465 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle") pod "barbican-worker-5f75796cbf-qqfgf" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220") : secret "combined-ca-bundle" not found Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.453282 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.453315 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle podName:577d42b7-93f8-47f8-987a-58995f68dd9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.453306616 +0000 UTC m=+1386.634822838 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle") pod "barbican-keystone-listener-7b97d48fcb-bq5h5" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b") : secret "combined-ca-bundle" not found Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.469419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "b6a9b240-8c1c-45c5-98a0-0a5f624a0506" (UID: "b6a9b240-8c1c-45c5-98a0-0a5f624a0506"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.503970 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.504527 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-central-agent" containerID="cri-o://a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.504959 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="proxy-httpd" containerID="cri-o://10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.505015 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-notification-agent" containerID="cri-o://9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.505112 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="sg-core" containerID="cri-o://a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.562242 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a9b240-8c1c-45c5-98a0-0a5f624a0506-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.596899 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-xpbnj"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.623571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" event={"ID":"574e045d-bacf-4c78-8091-c33d0beb09c9","Type":"ContainerStarted","Data":"9bf877b23da7cdec1c2ba15b16e6e798241f418eda143f2c822993f8a73c3370"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.628195 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vqr2f"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.634023 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_09ff63a8-6d5b-4529-9c0a-9e78957b1618/ovsdbserver-nb/0.log" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.634069 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.634393 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:47 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: if [ -n "placement" ]; then Jan 21 15:24:47 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 15:24:47 crc kubenswrapper[4707]: else Jan 21 15:24:47 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:47 crc kubenswrapper[4707]: fi Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:47 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:47 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:47 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:47 crc kubenswrapper[4707]: # support updates Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.635838 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" podUID="574e045d-bacf-4c78-8091-c33d0beb09c9" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.635877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" event={"ID":"7b113905-777a-4919-96c0-704f8e6f4026","Type":"ContainerStarted","Data":"3886a813b1e41e66c41c1764196d19969a06be87ec2eb9f998f64d26c965490e"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.644625 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vqr2f"] Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.659155 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:47 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 15:24:47 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 15:24:47 crc kubenswrapper[4707]: else Jan 21 15:24:47 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:47 crc kubenswrapper[4707]: fi Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:47 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:47 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:47 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:47 crc kubenswrapper[4707]: # support updates Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.659475 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:47 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:24:47 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:24:47 crc kubenswrapper[4707]: else Jan 21 15:24:47 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:47 crc kubenswrapper[4707]: fi Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:47 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:47 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:47 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:47 crc kubenswrapper[4707]: # support updates Jan 21 15:24:47 crc kubenswrapper[4707]: Jan 21 15:24:47 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.662251 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" podUID="d8e220c0-ad06-4f2b-be03-642acdd3a23a" Jan 21 15:24:47 crc kubenswrapper[4707]: E0121 15:24:47.662404 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" podUID="c86afe68-6c61-4a64-bdb3-2f489efbea33" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.670238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdb-rundir\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.671528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdbserver-nb-tls-certs\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc 
kubenswrapper[4707]: I0121 15:24:47.671630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-config\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.671725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-metrics-certs-tls-certs\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.671752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtw5f\" (UniqueName: \"kubernetes.io/projected/09ff63a8-6d5b-4529-9c0a-9e78957b1618-kube-api-access-xtw5f\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.671778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-scripts\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.671865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.671895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-combined-ca-bundle\") pod \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\" (UID: \"09ff63a8-6d5b-4529-9c0a-9e78957b1618\") " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.680969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.683755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-scripts" (OuterVolumeSpecName: "scripts") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.684155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-config" (OuterVolumeSpecName: "config") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.697488 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-855bf96656-swjds"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.697671 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-api" containerID="cri-o://9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.698005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-httpd" containerID="cri-o://41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717607 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717630 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717638 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717646 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717653 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717659 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717667 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717675 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717682 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717688 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17" exitCode=0 Jan 21 15:24:47 crc 
kubenswrapper[4707]: I0121 15:24:47.717693 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717699 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b" exitCode=0 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.717967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.725662 4707 generic.go:334] "Generic (PLEG): container finished" podID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerID="eeacdf781fdc40d12fa1f766ca9a1e9ce44c84afa1271e3f9baffd10cf9d9255" exitCode=2 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.725700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"11022c93-b7e5-4241-812c-eed16cc40ce2","Type":"ContainerDied","Data":"eeacdf781fdc40d12fa1f766ca9a1e9ce44c84afa1271e3f9baffd10cf9d9255"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.756137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.757792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ff63a8-6d5b-4529-9c0a-9e78957b1618-kube-api-access-xtw5f" (OuterVolumeSpecName: "kube-api-access-xtw5f") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "kube-api-access-xtw5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.761518 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.761734 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-httpd" containerID="cri-o://7b5e320769635ae68967dbf380846b5fcaacb83053ce0325c620c2d8aacab67e" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.761837 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-server" containerID="cri-o://383bb9acff21cf11ac127e42ba50d992286bc0e2ae3f73e405b397f0688b28c8" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.765754 4707 generic.go:334] "Generic (PLEG): container finished" podID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerID="84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078" exitCode=143 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.765789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" event={"ID":"316bd962-7ce5-4797-8a99-79aacd85c3fb","Type":"ContainerDied","Data":"84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768341 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_09ff63a8-6d5b-4529-9c0a-9e78957b1618/ovsdbserver-nb/0.log" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768374 4707 generic.go:334] "Generic (PLEG): container finished" podID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerID="e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02" exitCode=2 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768387 4707 generic.go:334] "Generic (PLEG): container finished" podID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerID="71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148" exitCode=143 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"09ff63a8-6d5b-4529-9c0a-9e78957b1618","Type":"ContainerDied","Data":"e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"09ff63a8-6d5b-4529-9c0a-9e78957b1618","Type":"ContainerDied","Data":"71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768459 4707 scope.go:117] "RemoveContainer" containerID="e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.768548 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.772375 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.775680 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.775696 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.775705 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtw5f\" (UniqueName: \"kubernetes.io/projected/09ff63a8-6d5b-4529-9c0a-9e78957b1618-kube-api-access-xtw5f\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.775714 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09ff63a8-6d5b-4529-9c0a-9e78957b1618-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.775729 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.790180 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_b6a9b240-8c1c-45c5-98a0-0a5f624a0506/ovsdbserver-sb/0.log" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.790239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"b6a9b240-8c1c-45c5-98a0-0a5f624a0506","Type":"ContainerDied","Data":"bfd929ee5d3cdf25d0f6db4f2d30d043cfe39d8e3021ee358414e3738de7e986"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.790324 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.811094 4707 generic.go:334] "Generic (PLEG): container finished" podID="90c1a990-5286-4274-a26a-8740ee506a12" containerID="2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390" exitCode=143 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.811305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"90c1a990-5286-4274-a26a-8740ee506a12","Type":"ContainerDied","Data":"2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390"} Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.814105 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.830879 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-fpc6b"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.869464 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.869767 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-log" containerID="cri-o://cdead9cf6c5698fee36e61676ef626b53d0a70cdd040ba9566e1451f134de51e" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.870090 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-httpd" containerID="cri-o://b4638cfb34b740af95a7bc03409e4baaf7acbfe3168548cc0715b59e19f6d136" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.887878 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.905232 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-ws576"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.922240 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-ws576"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.938760 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.939513 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-log" containerID="cri-o://bdd54c33530224d64a1afe448ee39fd6390f3f98a84ea3afe2b57e8006166cbd" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.939865 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-httpd" containerID="cri-o://5a6924b802768405402d6b86cb657dfd7334a39647b9f65f1a3ca7e21dd5e716" gracePeriod=30 Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.953938 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gj2xh"] 
Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.967525 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gj2xh"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.979086 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-fpc6b"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.989194 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zjkkw"] Jan 21 15:24:47 crc kubenswrapper[4707]: I0121 15:24:47.999379 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zjkkw"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.003538 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-xr874"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.008033 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.013856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.019717 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-dwf5z"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.021302 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:48 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:24:48 crc kubenswrapper[4707]: else Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:48 crc kubenswrapper[4707]: fi Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:48 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:48 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:48 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:48 crc kubenswrapper[4707]: # support updates Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.023399 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" podUID="45dd8867-a733-447e-adc9-68b1250fb2ba" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.026026 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-dwf5z"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.035129 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z4djf"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.047795 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z4djf"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.082586 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.082681 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.082845 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data podName:1bc7660e-aaa9-4865-a873-4db17c1db628 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:50.082828508 +0000 UTC m=+1387.264344730 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data") pod "rabbitmq-server-0" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628") : configmap "rabbitmq-config-data" not found Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.094877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.095061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-log" containerID="cri-o://e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.096974 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-api" containerID="cri-o://e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.097973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.143356 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.183899 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.192558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.193538 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="37f8fb41-6a21-4135-ace9-7b9b36eaaac5" containerName="kube-state-metrics" containerID="cri-o://d0cdbb4ec686eadd5528f51fab83e9c9497d8a3d56a4bb906dfe28f48de85b3b" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.205682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: W0121 15:24:48.233181 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ac603bf_ede6_4aba_a1fd_e34a94470ada.slice/crio-77ed561a5c72fd3b28269ce72255d0aa48cc1d2300c5b31b644673bc8799d8ca WatchSource:0}: Error finding container 77ed561a5c72fd3b28269ce72255d0aa48cc1d2300c5b31b644673bc8799d8ca: Status 404 returned error can't find the container with id 77ed561a5c72fd3b28269ce72255d0aa48cc1d2300c5b31b644673bc8799d8ca Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.233663 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerName="rabbitmq" containerID="cri-o://bbadb76a87972ac7320d59c53dfb93227092e217045548f169a94fd6a5c429ce" gracePeriod=604800 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.234052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.234779 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:48 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 15:24:48 crc kubenswrapper[4707]: else Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:48 crc kubenswrapper[4707]: fi Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:48 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:48 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:48 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:48 crc kubenswrapper[4707]: # support updates Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.237104 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" podUID="9ff945cd-386c-4385-91aa-0fb0b9b861c5" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.249356 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.278384 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-msrjr"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.287796 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.295439 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:48 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a 
DatabasePassword variable."} Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:24:48 crc kubenswrapper[4707]: else Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:48 crc kubenswrapper[4707]: fi Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:48 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:48 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:48 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:48 crc kubenswrapper[4707]: # support updates Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.295633 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-xr874"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.299077 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" podUID="2ac603bf-ede6-4aba-a1fd-e34a94470ada" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.314957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "09ff63a8-6d5b-4529-9c0a-9e78957b1618" (UID: "09ff63a8-6d5b-4529-9c0a-9e78957b1618"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.315009 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-msrjr"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.317201 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.317478 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-log" containerID="cri-o://add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.319173 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-metadata" containerID="cri-o://6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.352843 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.396686 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ff63a8-6d5b-4529-9c0a-9e78957b1618-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.464976 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.466038 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="ovsdbserver-nb" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="ovsdbserver-nb" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.466178 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="ovsdbserver-sb" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="ovsdbserver-sb" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.466296 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="openstack-network-exporter" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466348 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="openstack-network-exporter" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.466403 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="openstack-network-exporter" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466446 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="openstack-network-exporter" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466691 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="openstack-network-exporter" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466750 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="openstack-network-exporter" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" containerName="ovsdbserver-nb" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.466880 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" containerName="ovsdbserver-sb" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.469726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.473837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.479884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.495527 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.507189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dbh\" (UniqueName: \"kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.507419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.511873 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.512563 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" podUID="577d42b7-93f8-47f8-987a-58995f68dd9b" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.525793 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.526022 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener-log" containerID="cri-o://ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 
15:24:48.526381 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener" containerID="cri-o://ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.543186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.543373 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker-log" containerID="cri-o://a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.543629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker" containerID="cri-o://1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.564777 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.565612 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" podUID="da066dd9-9af4-46cf-9fea-e8088a738220" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.566950 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.573440 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:48 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:24:48 crc kubenswrapper[4707]: else Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:48 crc kubenswrapper[4707]: fi Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:48 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:48 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:48 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:48 crc kubenswrapper[4707]: # support updates Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.574860 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" podUID="01a45393-4050-40be-a82d-69550d6cb89e" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.579395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-55976cc56-2tgtc"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.579633 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api-log" containerID="cri-o://d959b8af71e65676b69991aa2187dfa38383cb263af1e369f5864d8fe2893795" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.579802 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api" containerID="cri-o://90be3266ccbe5e96e44486114865333b48d5affe4c0b643d4de2af49390b91a5" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.590982 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.591127 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="117c40c7-7452-42e7-8a5a-0a1123ac030e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.604779 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tnlbm"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.608853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dbh\" (UniqueName: \"kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.609008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.610098 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.610135 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts podName:77fb4097-22e4-44a9-a955-937dbfb75ee5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.110123879 +0000 UTC m=+1386.291640100 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts") pod "keystone-2c83-account-create-update-9mbgh" (UID: "77fb4097-22e4-44a9-a955-937dbfb75ee5") : configmap "openstack-scripts" not found Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.614313 4707 projected.go:194] Error preparing data for projected volume kube-api-access-27dbh for pod openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.614355 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh podName:77fb4097-22e4-44a9-a955-937dbfb75ee5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.114343678 +0000 UTC m=+1386.295859901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-27dbh" (UniqueName: "kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh") pod "keystone-2c83-account-create-update-9mbgh" (UID: "77fb4097-22e4-44a9-a955-937dbfb75ee5") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.617955 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.667454 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.673530 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-nkz5v"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.674518 4707 scope.go:117] "RemoveContainer" containerID="71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.691932 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.697182 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerName="rabbitmq" containerID="cri-o://a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8" gracePeriod=604800 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.706346 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-98266"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.710502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config\") pod \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.710560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-combined-ca-bundle\") pod \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.710591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config-secret\") pod \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.710624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9mb\" (UniqueName: \"kubernetes.io/projected/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-kube-api-access-fz9mb\") pod \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\" (UID: \"3c972bb1-d869-48c5-a0a1-d56f6cbe61aa\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.712413 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-v9gl6"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.716876 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-v9gl6"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.723444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-kube-api-access-fz9mb" (OuterVolumeSpecName: "kube-api-access-fz9mb") pod "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" (UID: "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa"). InnerVolumeSpecName "kube-api-access-fz9mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.723498 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-98266"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.727223 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7f68c764d7-98wg2"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.727430 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" podUID="3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" containerName="keystone-api" containerID="cri-o://464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.733088 4707 scope.go:117] "RemoveContainer" containerID="e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.735310 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02\": container with ID starting with e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02 not found: ID does not exist" containerID="e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.735348 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02"} err="failed to get container status \"e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02\": rpc error: code = NotFound desc = could not find container 
\"e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02\": container with ID starting with e1a234a0567ba7df7fb86623e50d22bba47b4e8ea791c30b0e6e2e95669f2a02 not found: ID does not exist" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.735366 4707 scope.go:117] "RemoveContainer" containerID="71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.735895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.738572 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148\": container with ID starting with 71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148 not found: ID does not exist" containerID="71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.738599 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148"} err="failed to get container status \"71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148\": rpc error: code = NotFound desc = could not find container \"71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148\": container with ID starting with 71299762bb19a78057d995480e0e3026a4fe61404c748098d5d7c2606a254148 not found: ID does not exist" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.738636 4707 scope.go:117] "RemoveContainer" containerID="e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.739221 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerName="galera" containerID="cri-o://589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.739939 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-7m69t"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.744828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" (UID: "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.750026 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.750162 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" containerName="nova-cell1-conductor-conductor" containerID="cri-o://eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.756757 4707 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/ovsdbserver-sb-0_openstack-kuttl-tests_openstack-network-exporter-e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718.log: no such file or directory" path="/var/log/containers/ovsdbserver-sb-0_openstack-kuttl-tests_openstack-network-exporter-e9c76da884763d8ec8111f38ea6a811f7c45fca9db9d9d3fa8c038f1f9944718.log" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.758169 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.772137 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.785351 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rffsq"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.789051 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.789218 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9e516808-af42-455d-b94b-7a72e6439115" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.791334 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-lgngm"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.795514 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.795733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" (UID: "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.795852 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="09bf8b72-b690-4084-854a-0245cd1a61b2" containerName="memcached" containerID="cri-o://ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.799250 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-lgngm"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.803862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.804034 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerName="nova-scheduler-scheduler" containerID="cri-o://192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" gracePeriod=30 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.806966 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh"] Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.807643 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-27dbh operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" podUID="77fb4097-22e4-44a9-a955-937dbfb75ee5" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.816579 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.816609 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.818349 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9mb\" (UniqueName: \"kubernetes.io/projected/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-kube-api-access-fz9mb\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.818365 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.818374 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.819331 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.823731 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.826412 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn"] Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.826438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" 
event={"ID":"80b52f69-1c27-4d55-a847-c304552c27d0","Type":"ContainerStarted","Data":"14fdfc0e29849c8c5109d73b4da45c2195ad39a81a58d32cc6317c513faff7dc"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.827663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" event={"ID":"2ac603bf-ede6-4aba-a1fd-e34a94470ada","Type":"ContainerStarted","Data":"77ed561a5c72fd3b28269ce72255d0aa48cc1d2300c5b31b644673bc8799d8ca"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.836703 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerID="d959b8af71e65676b69991aa2187dfa38383cb263af1e369f5864d8fe2893795" exitCode=143 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.836741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" event={"ID":"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e","Type":"ContainerDied","Data":"d959b8af71e65676b69991aa2187dfa38383cb263af1e369f5864d8fe2893795"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.841214 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerID="e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386" exitCode=143 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.841279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a7658f54-ea3a-4d2a-bec0-6a885e661223","Type":"ContainerDied","Data":"e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.845691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" (UID: "3c972bb1-d869-48c5-a0a1-d56f6cbe61aa"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.853571 4707 generic.go:334] "Generic (PLEG): container finished" podID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerID="383bb9acff21cf11ac127e42ba50d992286bc0e2ae3f73e405b397f0688b28c8" exitCode=0 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.853590 4707 generic.go:334] "Generic (PLEG): container finished" podID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerID="7b5e320769635ae68967dbf380846b5fcaacb83053ce0325c620c2d8aacab67e" exitCode=0 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.853622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" event={"ID":"00ff5dfe-af7e-4827-949e-926b7a41731b","Type":"ContainerDied","Data":"383bb9acff21cf11ac127e42ba50d992286bc0e2ae3f73e405b397f0688b28c8"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.853638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" event={"ID":"00ff5dfe-af7e-4827-949e-926b7a41731b","Type":"ContainerDied","Data":"7b5e320769635ae68967dbf380846b5fcaacb83053ce0325c620c2d8aacab67e"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.855104 4707 generic.go:334] "Generic (PLEG): container finished" podID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerID="bdd54c33530224d64a1afe448ee39fd6390f3f98a84ea3afe2b57e8006166cbd" exitCode=143 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.855137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"edf40ef7-2790-4e9b-92dd-b92ab951383d","Type":"ContainerDied","Data":"bdd54c33530224d64a1afe448ee39fd6390f3f98a84ea3afe2b57e8006166cbd"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.856004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" event={"ID":"9ff945cd-386c-4385-91aa-0fb0b9b861c5","Type":"ContainerStarted","Data":"8cc87ed52ab41128678cacf1b87946799e2cd2079fb60706fce98dfbf03a947c"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.877644 4707 generic.go:334] "Generic (PLEG): container finished" podID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerID="cdead9cf6c5698fee36e61676ef626b53d0a70cdd040ba9566e1451f134de51e" exitCode=143 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.877693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"63816993-7ef8-4f21-9a3c-18040cb843bd","Type":"ContainerDied","Data":"cdead9cf6c5698fee36e61676ef626b53d0a70cdd040ba9566e1451f134de51e"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.887403 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerID="a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3" exitCode=143 Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.887460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" event={"ID":"ee0ec627-76fd-4938-9a2b-db220a1ac5d5","Type":"ContainerDied","Data":"a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.891110 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.900085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" event={"ID":"45dd8867-a733-447e-adc9-68b1250fb2ba","Type":"ContainerStarted","Data":"48af315254b9b26ca6fb562d7100875164b108c6597e4da1bd7edb224207df65"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.906871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" event={"ID":"d8e220c0-ad06-4f2b-be03-642acdd3a23a","Type":"ContainerStarted","Data":"62d467298f9c0c82284e8fe0454fad8b6f08f5085f130b12206ab1cd95e69302"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.907275 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" secret="" err="secret \"galera-openstack-dockercfg-kvgwr\" not found" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.911416 4707 scope.go:117] "RemoveContainer" containerID="9ea2a83aab093f5ad72f13e2481f0741c78195045448ff4cfb2387b4ed4d0678" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.914411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" event={"ID":"01a45393-4050-40be-a82d-69550d6cb89e","Type":"ContainerStarted","Data":"6191c485161f11673978ad2e23e63703e2968751f3a5671ce72a99d32d960cc2"} Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.921460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data\") pod \"3c937d1c-d74a-4296-a78d-c88deef6b011\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.921486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw2km\" (UniqueName: \"kubernetes.io/projected/3c937d1c-d74a-4296-a78d-c88deef6b011-kube-api-access-dw2km\") pod \"3c937d1c-d74a-4296-a78d-c88deef6b011\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.921527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-combined-ca-bundle\") pod \"3c937d1c-d74a-4296-a78d-c88deef6b011\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.921548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c937d1c-d74a-4296-a78d-c88deef6b011-etc-machine-id\") pod \"3c937d1c-d74a-4296-a78d-c88deef6b011\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.921644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-scripts\") pod \"3c937d1c-d74a-4296-a78d-c88deef6b011\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.921667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data-custom\") pod \"3c937d1c-d74a-4296-a78d-c88deef6b011\" (UID: \"3c937d1c-d74a-4296-a78d-c88deef6b011\") " Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.922377 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.926159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c937d1c-d74a-4296-a78d-c88deef6b011-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3c937d1c-d74a-4296-a78d-c88deef6b011" (UID: "3c937d1c-d74a-4296-a78d-c88deef6b011"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.927060 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.927130 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts podName:d8e220c0-ad06-4f2b-be03-642acdd3a23a nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.427114029 +0000 UTC m=+1386.608630250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts") pod "glance-8403-account-create-update-fpc6b" (UID: "d8e220c0-ad06-4f2b-be03-642acdd3a23a") : configmap "openstack-scripts" not found Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.945452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-scripts" (OuterVolumeSpecName: "scripts") pod "3c937d1c-d74a-4296-a78d-c88deef6b011" (UID: "3c937d1c-d74a-4296-a78d-c88deef6b011"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.951012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c937d1c-d74a-4296-a78d-c88deef6b011" (UID: "3c937d1c-d74a-4296-a78d-c88deef6b011"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: I0121 15:24:48.963481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c937d1c-d74a-4296-a78d-c88deef6b011-kube-api-access-dw2km" (OuterVolumeSpecName: "kube-api-access-dw2km") pod "3c937d1c-d74a-4296-a78d-c88deef6b011" (UID: "3c937d1c-d74a-4296-a78d-c88deef6b011"). InnerVolumeSpecName "kube-api-access-dw2km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:48 crc kubenswrapper[4707]: E0121 15:24:48.998932 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:48 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:24:48 crc kubenswrapper[4707]: else Jan 21 15:24:48 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:48 crc kubenswrapper[4707]: fi Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:48 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:48 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:48 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:48 crc kubenswrapper[4707]: # support updates Jan 21 15:24:48 crc kubenswrapper[4707]: Jan 21 15:24:48 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.000204 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" podUID="d8e220c0-ad06-4f2b-be03-642acdd3a23a" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.017093 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b113905-777a-4919-96c0-704f8e6f4026" containerID="210aa9472f7d900c0ca04d5cf2c536a85b2621ac36455c5b52c667c54ca18f1a" exitCode=1 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.017160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" event={"ID":"7b113905-777a-4919-96c0-704f8e6f4026","Type":"ContainerDied","Data":"210aa9472f7d900c0ca04d5cf2c536a85b2621ac36455c5b52c667c54ca18f1a"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.017537 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-tnlbm" secret="" err="secret \"galera-openstack-cell1-dockercfg-chnzh\" not found" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.017568 4707 scope.go:117] "RemoveContainer" containerID="210aa9472f7d900c0ca04d5cf2c536a85b2621ac36455c5b52c667c54ca18f1a" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.026969 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.027033 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.027047 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.027094 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts podName:7b113905-777a-4919-96c0-704f8e6f4026 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.527078358 +0000 UTC m=+1386.708594580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts") pod "root-account-create-update-tnlbm" (UID: "7b113905-777a-4919-96c0-704f8e6f4026") : configmap "openstack-cell1-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.027117 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw2km\" (UniqueName: \"kubernetes.io/projected/3c937d1c-d74a-4296-a78d-c88deef6b011-kube-api-access-dw2km\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.027131 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c937d1c-d74a-4296-a78d-c88deef6b011-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.042918 4707 generic.go:334] "Generic (PLEG): container finished" podID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerID="ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341" exitCode=143 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.042995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" event={"ID":"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1","Type":"ContainerDied","Data":"ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.046008 4707 generic.go:334] "Generic (PLEG): container finished" podID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerID="10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.046035 4707 generic.go:334] "Generic (PLEG): container finished" podID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerID="a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec" exitCode=2 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.046042 4707 generic.go:334] "Generic (PLEG): container finished" podID="07daec18-3928-4cae-8fbf-4de0b30c690c" 
containerID="a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.046083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerDied","Data":"10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.046104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerDied","Data":"a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.046114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerDied","Data":"a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.069145 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.069166 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.069219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.069241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.071995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c937d1c-d74a-4296-a78d-c88deef6b011" (UID: "3c937d1c-d74a-4296-a78d-c88deef6b011"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.072179 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" containerID="3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a" exitCode=137 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.072288 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.087175 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.089353 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.089550 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="galera" containerID="cri-o://eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee" gracePeriod=30 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.090430 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerID="add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868" exitCode=143 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.090487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed7400b4-4f9e-4571-9156-5cf8f5e0df67","Type":"ContainerDied","Data":"add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868"} Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.090697 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.090730 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="ovn-northd" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.094539 4707 generic.go:334] "Generic (PLEG): container finished" podID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerID="41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.094612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" event={"ID":"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e","Type":"ContainerDied","Data":"41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.095241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data" (OuterVolumeSpecName: "config-data") pod "3c937d1c-d74a-4296-a78d-c88deef6b011" (UID: "3c937d1c-d74a-4296-a78d-c88deef6b011"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.096337 4707 generic.go:334] "Generic (PLEG): container finished" podID="37f8fb41-6a21-4135-ace9-7b9b36eaaac5" containerID="d0cdbb4ec686eadd5528f51fab83e9c9497d8a3d56a4bb906dfe28f48de85b3b" exitCode=2 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.096391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"37f8fb41-6a21-4135-ace9-7b9b36eaaac5","Type":"ContainerDied","Data":"d0cdbb4ec686eadd5528f51fab83e9c9497d8a3d56a4bb906dfe28f48de85b3b"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.098219 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerID="c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.098237 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerID="1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444" exitCode=0 Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.098279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c937d1c-d74a-4296-a78d-c88deef6b011","Type":"ContainerDied","Data":"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.098294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c937d1c-d74a-4296-a78d-c88deef6b011","Type":"ContainerDied","Data":"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.098304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c937d1c-d74a-4296-a78d-c88deef6b011","Type":"ContainerDied","Data":"2e7ec64e38025020b723d6365c387b0f6466491d9052919899b2096edc5ee2cd"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.098352 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.105510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.105937 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" secret="" err="secret \"galera-openstack-dockercfg-kvgwr\" not found" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.108453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" event={"ID":"c86afe68-6c61-4a64-bdb3-2f489efbea33","Type":"ContainerStarted","Data":"4191c90e78f75ac630bab39ee6c9bd2ef9b9833b23492e2d0e44d3b0812d06da"} Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.108529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.108651 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.110044 4707 scope.go:117] "RemoveContainer" containerID="3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.110497 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:24:49 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:24:49 crc kubenswrapper[4707]: Jan 21 15:24:49 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:24:49 crc kubenswrapper[4707]: Jan 21 15:24:49 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:24:49 crc kubenswrapper[4707]: Jan 21 15:24:49 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:24:49 crc kubenswrapper[4707]: Jan 21 15:24:49 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 15:24:49 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 15:24:49 crc kubenswrapper[4707]: else Jan 21 15:24:49 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:24:49 crc kubenswrapper[4707]: fi Jan 21 15:24:49 crc kubenswrapper[4707]: Jan 21 15:24:49 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:24:49 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:24:49 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:24:49 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:24:49 crc kubenswrapper[4707]: # support updates Jan 21 15:24:49 crc kubenswrapper[4707]: Jan 21 15:24:49 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.111554 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" podUID="c86afe68-6c61-4a64-bdb3-2f489efbea33" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.129172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dbh\" (UniqueName: \"kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.129297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.129458 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.129506 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.129519 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c937d1c-d74a-4296-a78d-c88deef6b011-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.129532 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.129561 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts podName:77fb4097-22e4-44a9-a955-937dbfb75ee5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:50.129548297 +0000 UTC m=+1387.311064519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts") pod "keystone-2c83-account-create-update-9mbgh" (UID: "77fb4097-22e4-44a9-a955-937dbfb75ee5") : configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.129624 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts podName:c86afe68-6c61-4a64-bdb3-2f489efbea33 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.629605925 +0000 UTC m=+1386.811122147 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts") pod "neutron-06ca-account-create-update-49sxr" (UID: "c86afe68-6c61-4a64-bdb3-2f489efbea33") : configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.137122 4707 projected.go:194] Error preparing data for projected volume kube-api-access-27dbh for pod openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.137181 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh podName:77fb4097-22e4-44a9-a955-937dbfb75ee5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:50.137159911 +0000 UTC m=+1387.318676133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-27dbh" (UniqueName: "kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh") pod "keystone-2c83-account-create-update-9mbgh" (UID: "77fb4097-22e4-44a9-a955-937dbfb75ee5") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.148921 4707 scope.go:117] "RemoveContainer" containerID="3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.149390 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a\": container with ID starting with 3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a not found: ID does not exist" containerID="3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.149825 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a"} err="failed to get container status \"3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a\": rpc error: code = NotFound desc = could not find container \"3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a\": container with ID starting with 3acd296f56ef482ef5a24eee99a83a7fdf0f585cc9c08884cdcb2bf050a8715a not found: ID does not exist" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.149855 4707 scope.go:117] "RemoveContainer" containerID="c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.181680 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.187041 4707 scope.go:117] "RemoveContainer" containerID="1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.193010 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000e0803-0c00-4afa-863f-23269a570787" path="/var/lib/kubelet/pods/000e0803-0c00-4afa-863f-23269a570787/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.193517 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024458dc-51a9-4318-a563-439e24104ffd" path="/var/lib/kubelet/pods/024458dc-51a9-4318-a563-439e24104ffd/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.194096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ff63a8-6d5b-4529-9c0a-9e78957b1618" path="/var/lib/kubelet/pods/09ff63a8-6d5b-4529-9c0a-9e78957b1618/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.195105 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffcf62a-53d8-4003-8e75-747b6e768ffd" path="/var/lib/kubelet/pods/0ffcf62a-53d8-4003-8e75-747b6e768ffd/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.195582 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c1b37e-4ff9-4d56-97c1-744f743a2f9a" path="/var/lib/kubelet/pods/14c1b37e-4ff9-4d56-97c1-744f743a2f9a/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.196323 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1cabab-f9cf-4c1c-8690-daedf78a0a53" path="/var/lib/kubelet/pods/3c1cabab-f9cf-4c1c-8690-daedf78a0a53/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.197521 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c972bb1-d869-48c5-a0a1-d56f6cbe61aa" path="/var/lib/kubelet/pods/3c972bb1-d869-48c5-a0a1-d56f6cbe61aa/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.198000 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443064ab-05cf-4215-9029-9d1c8adb1934" path="/var/lib/kubelet/pods/443064ab-05cf-4215-9029-9d1c8adb1934/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.198478 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dfb5d13-8ce5-49f7-a24d-4b913d85df26" path="/var/lib/kubelet/pods/4dfb5d13-8ce5-49f7-a24d-4b913d85df26/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.198972 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5838abad-cfc4-4b71-a97d-0a96e327ddd7" path="/var/lib/kubelet/pods/5838abad-cfc4-4b71-a97d-0a96e327ddd7/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.204957 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863c9468-2e71-46c6-b6ab-212a5591921d" path="/var/lib/kubelet/pods/863c9468-2e71-46c6-b6ab-212a5591921d/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.205477 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a360c79-ca70-4364-87a9-5d0b24cb9b2e" path="/var/lib/kubelet/pods/9a360c79-ca70-4364-87a9-5d0b24cb9b2e/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.206006 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3add83-4dbe-498e-82af-461847ca1175" path="/var/lib/kubelet/pods/af3add83-4dbe-498e-82af-461847ca1175/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 
15:24:49.211560 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4de4059-fde2-431a-b7e8-7f2bc23218dd" path="/var/lib/kubelet/pods/b4de4059-fde2-431a-b7e8-7f2bc23218dd/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.212124 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a9b240-8c1c-45c5-98a0-0a5f624a0506" path="/var/lib/kubelet/pods/b6a9b240-8c1c-45c5-98a0-0a5f624a0506/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.227824 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.228331 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82b2b5f-76f3-4da4-b413-fd8fc4d520c0" path="/var/lib/kubelet/pods/b82b2b5f-76f3-4da4-b413-fd8fc4d520c0/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.230846 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee367ab-367a-4b7e-bf75-0c7ae0964933" path="/var/lib/kubelet/pods/cee367ab-367a-4b7e-bf75-0c7ae0964933/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.231719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-combined-ca-bundle\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.232842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-log-httpd\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.232876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-run-httpd\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.234238 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f1cb8a-c64f-4997-a896-0d7127aa309a" path="/var/lib/kubelet/pods/e0f1cb8a-c64f-4997-a896-0d7127aa309a/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.235801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-config\") pod \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-certs\") pod \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-config-data\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") 
" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-public-tls-certs\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-internal-tls-certs\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4rc4\" (UniqueName: \"kubernetes.io/projected/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-api-access-q4rc4\") pod \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-etc-swift\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-combined-ca-bundle\") pod \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\" (UID: \"37f8fb41-6a21-4135-ace9-7b9b36eaaac5\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.236313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b267z\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-kube-api-access-b267z\") pod \"00ff5dfe-af7e-4827-949e-926b7a41731b\" (UID: \"00ff5dfe-af7e-4827-949e-926b7a41731b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.238621 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16ff817-63cb-4b7d-8ce2-40979f6a2cc6" path="/var/lib/kubelet/pods/e16ff817-63cb-4b7d-8ce2-40979f6a2cc6/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.238849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.243241 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5177656-16d0-431f-b865-455378e95bd3" path="/var/lib/kubelet/pods/e5177656-16d0-431f-b865-455378e95bd3/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.243741 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94be5be-28c9-4cf4-92bd-f08538ff3bd8" path="/var/lib/kubelet/pods/e94be5be-28c9-4cf4-92bd-f08538ff3bd8/volumes" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.246550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.246957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.254172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-api-access-q4rc4" (OuterVolumeSpecName: "kube-api-access-q4rc4") pod "37f8fb41-6a21-4135-ace9-7b9b36eaaac5" (UID: "37f8fb41-6a21-4135-ace9-7b9b36eaaac5"). InnerVolumeSpecName "kube-api-access-q4rc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.254863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-kube-api-access-b267z" (OuterVolumeSpecName: "kube-api-access-b267z") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "kube-api-access-b267z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.256577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.256634 4707 scope.go:117] "RemoveContainer" containerID="c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.258108 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e\": container with ID starting with c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e not found: ID does not exist" containerID="c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.258134 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e"} err="failed to get container status \"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e\": rpc error: code = NotFound desc = could not find container \"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e\": container with ID starting with c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e not found: ID does not exist" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.258151 4707 scope.go:117] "RemoveContainer" containerID="1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.260982 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444\": container with ID starting with 1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444 not found: ID does not exist" containerID="1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.261019 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444"} err="failed to get container status \"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444\": rpc error: code = NotFound desc = could not find container \"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444\": container with ID starting with 1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444 not found: ID does not exist" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.261043 4707 scope.go:117] "RemoveContainer" containerID="c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.261324 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e"} err="failed to get container status \"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e\": rpc error: code = NotFound desc = could not find container \"c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e\": container with ID starting with c04d203f887576dc216962acb2bc559675fb57a5125eb45330d0630f0bd05d8e not found: ID does not exist" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.261354 4707 scope.go:117] "RemoveContainer" containerID="1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.261587 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444"} err="failed to get container status \"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444\": rpc error: code = NotFound desc = could not find container \"1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444\": container with ID starting with 1b80f49a554a56fdf703a6f931c3f56d7d00f226daccdc71901fce5493dc6444 not found: ID does not exist" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.279446 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.279474 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.290509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.295989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.302514 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.305254 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdtbg\" (UniqueName: \"kubernetes.io/projected/da066dd9-9af4-46cf-9fea-e8088a738220-kube-api-access-zdtbg\") pod \"da066dd9-9af4-46cf-9fea-e8088a738220\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data\") pod \"577d42b7-93f8-47f8-987a-58995f68dd9b\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vsbr\" (UniqueName: \"kubernetes.io/projected/577d42b7-93f8-47f8-987a-58995f68dd9b-kube-api-access-6vsbr\") pod \"577d42b7-93f8-47f8-987a-58995f68dd9b\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data-custom\") pod \"577d42b7-93f8-47f8-987a-58995f68dd9b\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj4gb\" (UniqueName: \"kubernetes.io/projected/2ac603bf-ede6-4aba-a1fd-e34a94470ada-kube-api-access-mj4gb\") pod \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 
15:24:49.339641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac603bf-ede6-4aba-a1fd-e34a94470ada-operator-scripts\") pod \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\" (UID: \"2ac603bf-ede6-4aba-a1fd-e34a94470ada\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da066dd9-9af4-46cf-9fea-e8088a738220-logs\") pod \"da066dd9-9af4-46cf-9fea-e8088a738220\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d42b7-93f8-47f8-987a-58995f68dd9b-logs\") pod \"577d42b7-93f8-47f8-987a-58995f68dd9b\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff945cd-386c-4385-91aa-0fb0b9b861c5-operator-scripts\") pod \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data\") pod \"da066dd9-9af4-46cf-9fea-e8088a738220\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44gkn\" (UniqueName: \"kubernetes.io/projected/9ff945cd-386c-4385-91aa-0fb0b9b861c5-kube-api-access-44gkn\") pod \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\" (UID: \"9ff945cd-386c-4385-91aa-0fb0b9b861c5\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.339884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data-custom\") pod \"da066dd9-9af4-46cf-9fea-e8088a738220\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340552 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4rc4\" (UniqueName: \"kubernetes.io/projected/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-api-access-q4rc4\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340564 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340574 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b267z\" (UniqueName: \"kubernetes.io/projected/00ff5dfe-af7e-4827-949e-926b7a41731b-kube-api-access-b267z\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340582 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340589 
4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00ff5dfe-af7e-4827-949e-926b7a41731b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.340635 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.340667 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data podName:4fbbd17b-fed9-463c-9c25-435f84d0205e nodeName:}" failed. No retries permitted until 2026-01-21 15:24:53.340654973 +0000 UTC m=+1390.522171196 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data") pod "rabbitmq-cell1-server-0" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da066dd9-9af4-46cf-9fea-e8088a738220-logs" (OuterVolumeSpecName: "logs") pod "da066dd9-9af4-46cf-9fea-e8088a738220" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.340962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac603bf-ede6-4aba-a1fd-e34a94470ada-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ac603bf-ede6-4aba-a1fd-e34a94470ada" (UID: "2ac603bf-ede6-4aba-a1fd-e34a94470ada"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.341100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577d42b7-93f8-47f8-987a-58995f68dd9b-logs" (OuterVolumeSpecName: "logs") pod "577d42b7-93f8-47f8-987a-58995f68dd9b" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.344093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff945cd-386c-4385-91aa-0fb0b9b861c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ff945cd-386c-4385-91aa-0fb0b9b861c5" (UID: "9ff945cd-386c-4385-91aa-0fb0b9b861c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.351995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data" (OuterVolumeSpecName: "config-data") pod "577d42b7-93f8-47f8-987a-58995f68dd9b" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "37f8fb41-6a21-4135-ace9-7b9b36eaaac5" (UID: "37f8fb41-6a21-4135-ace9-7b9b36eaaac5"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da066dd9-9af4-46cf-9fea-e8088a738220" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data" (OuterVolumeSpecName: "config-data") pod "da066dd9-9af4-46cf-9fea-e8088a738220" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac603bf-ede6-4aba-a1fd-e34a94470ada-kube-api-access-mj4gb" (OuterVolumeSpecName: "kube-api-access-mj4gb") pod "2ac603bf-ede6-4aba-a1fd-e34a94470ada" (UID: "2ac603bf-ede6-4aba-a1fd-e34a94470ada"). InnerVolumeSpecName "kube-api-access-mj4gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da066dd9-9af4-46cf-9fea-e8088a738220-kube-api-access-zdtbg" (OuterVolumeSpecName: "kube-api-access-zdtbg") pod "da066dd9-9af4-46cf-9fea-e8088a738220" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220"). InnerVolumeSpecName "kube-api-access-zdtbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577d42b7-93f8-47f8-987a-58995f68dd9b-kube-api-access-6vsbr" (OuterVolumeSpecName: "kube-api-access-6vsbr") pod "577d42b7-93f8-47f8-987a-58995f68dd9b" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b"). InnerVolumeSpecName "kube-api-access-6vsbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.352721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff945cd-386c-4385-91aa-0fb0b9b861c5-kube-api-access-44gkn" (OuterVolumeSpecName: "kube-api-access-44gkn") pod "9ff945cd-386c-4385-91aa-0fb0b9b861c5" (UID: "9ff945cd-386c-4385-91aa-0fb0b9b861c5"). InnerVolumeSpecName "kube-api-access-44gkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.353561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "577d42b7-93f8-47f8-987a-58995f68dd9b" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.366571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37f8fb41-6a21-4135-ace9-7b9b36eaaac5" (UID: "37f8fb41-6a21-4135-ace9-7b9b36eaaac5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.369619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-config-data" (OuterVolumeSpecName: "config-data") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.371363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "37f8fb41-6a21-4135-ace9-7b9b36eaaac5" (UID: "37f8fb41-6a21-4135-ace9-7b9b36eaaac5"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.373514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.405569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.415894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00ff5dfe-af7e-4827-949e-926b7a41731b" (UID: "00ff5dfe-af7e-4827-949e-926b7a41731b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441219 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj4gb\" (UniqueName: \"kubernetes.io/projected/2ac603bf-ede6-4aba-a1fd-e34a94470ada-kube-api-access-mj4gb\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441244 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac603bf-ede6-4aba-a1fd-e34a94470ada-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441254 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441274 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da066dd9-9af4-46cf-9fea-e8088a738220-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441283 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d42b7-93f8-47f8-987a-58995f68dd9b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441291 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441300 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff945cd-386c-4385-91aa-0fb0b9b861c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441308 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441474 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441484 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441493 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44gkn\" (UniqueName: \"kubernetes.io/projected/9ff945cd-386c-4385-91aa-0fb0b9b861c5-kube-api-access-44gkn\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441501 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441508 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-internal-tls-certs\") on node 
\"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441516 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdtbg\" (UniqueName: \"kubernetes.io/projected/da066dd9-9af4-46cf-9fea-e8088a738220-kube-api-access-zdtbg\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441523 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441531 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f8fb41-6a21-4135-ace9-7b9b36eaaac5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441539 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vsbr\" (UniqueName: \"kubernetes.io/projected/577d42b7-93f8-47f8-987a-58995f68dd9b-kube-api-access-6vsbr\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441546 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff5dfe-af7e-4827-949e-926b7a41731b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.441554 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.441455 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.441592 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts podName:d8e220c0-ad06-4f2b-be03-642acdd3a23a nodeName:}" failed. No retries permitted until 2026-01-21 15:24:50.44158081 +0000 UTC m=+1387.623097032 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts") pod "glance-8403-account-create-update-fpc6b" (UID: "d8e220c0-ad06-4f2b-be03-642acdd3a23a") : configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.469210 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.471218 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.473687 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.473720 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9e516808-af42-455d-b94b-7a72e6439115" containerName="nova-cell0-conductor-conductor" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.486914 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="galera" probeResult="failure" output="" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.543098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle\") pod \"barbican-keystone-listener-7b97d48fcb-bq5h5\" (UID: \"577d42b7-93f8-47f8-987a-58995f68dd9b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.543306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle\") pod \"barbican-worker-5f75796cbf-qqfgf\" (UID: \"da066dd9-9af4-46cf-9fea-e8088a738220\") " pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.543564 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.543707 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle podName:da066dd9-9af4-46cf-9fea-e8088a738220 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:53.543672349 +0000 UTC m=+1390.725188570 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle") pod "barbican-worker-5f75796cbf-qqfgf" (UID: "da066dd9-9af4-46cf-9fea-e8088a738220") : secret "combined-ca-bundle" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.543833 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.544140 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle podName:577d42b7-93f8-47f8-987a-58995f68dd9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:53.54413094 +0000 UTC m=+1390.725647163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle") pod "barbican-keystone-listener-7b97d48fcb-bq5h5" (UID: "577d42b7-93f8-47f8-987a-58995f68dd9b") : secret "combined-ca-bundle" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.544362 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.544454 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts podName:7b113905-777a-4919-96c0-704f8e6f4026 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:50.544446804 +0000 UTC m=+1387.725963025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts") pod "root-account-create-update-tnlbm" (UID: "7b113905-777a-4919-96c0-704f8e6f4026") : configmap "openstack-cell1-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.593996 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.602719 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.644155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a45393-4050-40be-a82d-69550d6cb89e-operator-scripts\") pod \"01a45393-4050-40be-a82d-69550d6cb89e\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.644508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdbq7\" (UniqueName: \"kubernetes.io/projected/574e045d-bacf-4c78-8091-c33d0beb09c9-kube-api-access-fdbq7\") pod \"574e045d-bacf-4c78-8091-c33d0beb09c9\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.644723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/574e045d-bacf-4c78-8091-c33d0beb09c9-operator-scripts\") pod \"574e045d-bacf-4c78-8091-c33d0beb09c9\" (UID: \"574e045d-bacf-4c78-8091-c33d0beb09c9\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.645411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwvt9\" (UniqueName: \"kubernetes.io/projected/01a45393-4050-40be-a82d-69550d6cb89e-kube-api-access-nwvt9\") pod \"01a45393-4050-40be-a82d-69550d6cb89e\" (UID: \"01a45393-4050-40be-a82d-69550d6cb89e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.644642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a45393-4050-40be-a82d-69550d6cb89e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01a45393-4050-40be-a82d-69550d6cb89e" (UID: "01a45393-4050-40be-a82d-69550d6cb89e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.645353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/574e045d-bacf-4c78-8091-c33d0beb09c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "574e045d-bacf-4c78-8091-c33d0beb09c9" (UID: "574e045d-bacf-4c78-8091-c33d0beb09c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.646237 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.646419 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts podName:c86afe68-6c61-4a64-bdb3-2f489efbea33 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:50.646342353 +0000 UTC m=+1387.827858575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts") pod "neutron-06ca-account-create-update-49sxr" (UID: "c86afe68-6c61-4a64-bdb3-2f489efbea33") : configmap "openstack-scripts" not found Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.647089 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/574e045d-bacf-4c78-8091-c33d0beb09c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.647174 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a45393-4050-40be-a82d-69550d6cb89e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.649366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574e045d-bacf-4c78-8091-c33d0beb09c9-kube-api-access-fdbq7" (OuterVolumeSpecName: "kube-api-access-fdbq7") pod "574e045d-bacf-4c78-8091-c33d0beb09c9" (UID: "574e045d-bacf-4c78-8091-c33d0beb09c9"). InnerVolumeSpecName "kube-api-access-fdbq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.649501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a45393-4050-40be-a82d-69550d6cb89e-kube-api-access-nwvt9" (OuterVolumeSpecName: "kube-api-access-nwvt9") pod "01a45393-4050-40be-a82d-69550d6cb89e" (UID: "01a45393-4050-40be-a82d-69550d6cb89e"). InnerVolumeSpecName "kube-api-access-nwvt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.661395 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.671307 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.677247 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.683828 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.747857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmwnx\" (UniqueName: \"kubernetes.io/projected/45dd8867-a733-447e-adc9-68b1250fb2ba-kube-api-access-wmwnx\") pod \"45dd8867-a733-447e-adc9-68b1250fb2ba\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-operator-scripts\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-generated\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-galera-tls-certs\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-default\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65tfb\" (UniqueName: \"kubernetes.io/projected/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kube-api-access-65tfb\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data\") pod \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.748933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-config-data\") pod \"117c40c7-7452-42e7-8a5a-0a1123ac030e\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.749576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-default" (OuterVolumeSpecName: "config-data-default") pod 
"8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.750328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.751130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.751342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45dd8867-a733-447e-adc9-68b1250fb2ba-kube-api-access-wmwnx" (OuterVolumeSpecName: "kube-api-access-wmwnx") pod "45dd8867-a733-447e-adc9-68b1250fb2ba" (UID: "45dd8867-a733-447e-adc9-68b1250fb2ba"). InnerVolumeSpecName "kube-api-access-wmwnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.751425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-nova-novncproxy-tls-certs\") pod \"117c40c7-7452-42e7-8a5a-0a1123ac030e\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.751722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-combined-ca-bundle\") pod \"117c40c7-7452-42e7-8a5a-0a1123ac030e\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.751871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvj6\" (UniqueName: \"kubernetes.io/projected/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-kube-api-access-rgvj6\") pod \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kolla-config\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data-custom\") pod \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/45dd8867-a733-447e-adc9-68b1250fb2ba-operator-scripts\") pod \"45dd8867-a733-447e-adc9-68b1250fb2ba\" (UID: \"45dd8867-a733-447e-adc9-68b1250fb2ba\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-combined-ca-bundle\") pod \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9phjz\" (UniqueName: \"kubernetes.io/projected/117c40c7-7452-42e7-8a5a-0a1123ac030e-kube-api-access-9phjz\") pod \"117c40c7-7452-42e7-8a5a-0a1123ac030e\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-combined-ca-bundle\") pod \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\" (UID: \"8d4d90d7-a400-45ab-8c5b-7e68ced57657\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.752952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-logs\") pod \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\" (UID: \"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.753027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-vencrypt-tls-certs\") pod \"117c40c7-7452-42e7-8a5a-0a1123ac030e\" (UID: \"117c40c7-7452-42e7-8a5a-0a1123ac030e\") " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.753573 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwvt9\" (UniqueName: \"kubernetes.io/projected/01a45393-4050-40be-a82d-69550d6cb89e-kube-api-access-nwvt9\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.753633 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmwnx\" (UniqueName: \"kubernetes.io/projected/45dd8867-a733-447e-adc9-68b1250fb2ba-kube-api-access-wmwnx\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.753682 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.753728 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.753787 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.755896 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdbq7\" (UniqueName: 
\"kubernetes.io/projected/574e045d-bacf-4c78-8091-c33d0beb09c9-kube-api-access-fdbq7\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.754505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45dd8867-a733-447e-adc9-68b1250fb2ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45dd8867-a733-447e-adc9-68b1250fb2ba" (UID: "45dd8867-a733-447e-adc9-68b1250fb2ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.756499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-logs" (OuterVolumeSpecName: "logs") pod "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" (UID: "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.759229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.760354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kube-api-access-65tfb" (OuterVolumeSpecName: "kube-api-access-65tfb") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "kube-api-access-65tfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.762607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117c40c7-7452-42e7-8a5a-0a1123ac030e-kube-api-access-9phjz" (OuterVolumeSpecName: "kube-api-access-9phjz") pod "117c40c7-7452-42e7-8a5a-0a1123ac030e" (UID: "117c40c7-7452-42e7-8a5a-0a1123ac030e"). InnerVolumeSpecName "kube-api-access-9phjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.765546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" (UID: "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.769159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-kube-api-access-rgvj6" (OuterVolumeSpecName: "kube-api-access-rgvj6") pod "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" (UID: "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1"). InnerVolumeSpecName "kube-api-access-rgvj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.826404 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-config-data" (OuterVolumeSpecName: "config-data") pod "117c40c7-7452-42e7-8a5a-0a1123ac030e" (UID: "117c40c7-7452-42e7-8a5a-0a1123ac030e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.826247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.852933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "117c40c7-7452-42e7-8a5a-0a1123ac030e" (UID: "117c40c7-7452-42e7-8a5a-0a1123ac030e"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857414 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857525 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65tfb\" (UniqueName: \"kubernetes.io/projected/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kube-api-access-65tfb\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857590 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857641 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857703 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgvj6\" (UniqueName: \"kubernetes.io/projected/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-kube-api-access-rgvj6\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857782 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4d90d7-a400-45ab-8c5b-7e68ced57657-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857861 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.857912 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45dd8867-a733-447e-adc9-68b1250fb2ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 
15:24:49.857957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9phjz\" (UniqueName: \"kubernetes.io/projected/117c40c7-7452-42e7-8a5a-0a1123ac030e-kube-api-access-9phjz\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.858011 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.861828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "117c40c7-7452-42e7-8a5a-0a1123ac030e" (UID: "117c40c7-7452-42e7-8a5a-0a1123ac030e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.864676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.879586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8d4d90d7-a400-45ab-8c5b-7e68ced57657" (UID: "8d4d90d7-a400-45ab-8c5b-7e68ced57657"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.882557 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.882632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data" (OuterVolumeSpecName: "config-data") pod "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" (UID: "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.889625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" (UID: "e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.889802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "117c40c7-7452-42e7-8a5a-0a1123ac030e" (UID: "117c40c7-7452-42e7-8a5a-0a1123ac030e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959414 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959439 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959448 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959457 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959466 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c40c7-7452-42e7-8a5a-0a1123ac030e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959473 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.959481 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4d90d7-a400-45ab-8c5b-7e68ced57657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.963273 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.964400 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.965338 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:24:49 crc kubenswrapper[4707]: E0121 15:24:49.965362 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerName="nova-scheduler-scheduler" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 
15:24:49.985185 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:24:49 crc kubenswrapper[4707]: I0121 15:24:49.989444 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.061465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-combined-ca-bundle\") pod \"09bf8b72-b690-4084-854a-0245cd1a61b2\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.061543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-combined-ca-bundle\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.061567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-kolla-config\") pod \"09bf8b72-b690-4084-854a-0245cd1a61b2\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.061594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-operator-scripts\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "09bf8b72-b690-4084-854a-0245cd1a61b2" (UID: "09bf8b72-b690-4084-854a-0245cd1a61b2"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.061626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-kolla-config\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-config-data\") pod \"09bf8b72-b690-4084-854a-0245cd1a61b2\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-generated\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062789 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-memcached-tls-certs\") pod \"09bf8b72-b690-4084-854a-0245cd1a61b2\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062826 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-478rf\" (UniqueName: \"kubernetes.io/projected/09bf8b72-b690-4084-854a-0245cd1a61b2-kube-api-access-478rf\") pod \"09bf8b72-b690-4084-854a-0245cd1a61b2\" (UID: \"09bf8b72-b690-4084-854a-0245cd1a61b2\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-default\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjgrc\" (UniqueName: \"kubernetes.io/projected/f9190471-72fb-4298-85b6-bf07b32b6de5-kube-api-access-bjgrc\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.062947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-galera-tls-certs\") pod \"f9190471-72fb-4298-85b6-bf07b32b6de5\" (UID: \"f9190471-72fb-4298-85b6-bf07b32b6de5\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.063413 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc 
kubenswrapper[4707]: I0121 15:24:50.063426 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.063434 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.067175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.067285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-config-data" (OuterVolumeSpecName: "config-data") pod "09bf8b72-b690-4084-854a-0245cd1a61b2" (UID: "09bf8b72-b690-4084-854a-0245cd1a61b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.067614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.072046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09bf8b72-b690-4084-854a-0245cd1a61b2-kube-api-access-478rf" (OuterVolumeSpecName: "kube-api-access-478rf") pod "09bf8b72-b690-4084-854a-0245cd1a61b2" (UID: "09bf8b72-b690-4084-854a-0245cd1a61b2"). InnerVolumeSpecName "kube-api-access-478rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.072645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9190471-72fb-4298-85b6-bf07b32b6de5-kube-api-access-bjgrc" (OuterVolumeSpecName: "kube-api-access-bjgrc") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "kube-api-access-bjgrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.075152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.079037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09bf8b72-b690-4084-854a-0245cd1a61b2" (UID: "09bf8b72-b690-4084-854a-0245cd1a61b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.081335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.096709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f9190471-72fb-4298-85b6-bf07b32b6de5" (UID: "f9190471-72fb-4298-85b6-bf07b32b6de5"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.101441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "09bf8b72-b690-4084-854a-0245cd1a61b2" (UID: "09bf8b72-b690-4084-854a-0245cd1a61b2"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.114861 4707 generic.go:334] "Generic (PLEG): container finished" podID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerID="ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1" exitCode=0 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.114918 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.114938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" event={"ID":"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1","Type":"ContainerDied","Data":"ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.114972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl" event={"ID":"e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1","Type":"ContainerDied","Data":"73bddd9a9b9955c56c3a5111745237517c39a313a7a2863a240fa17953561f34"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.114988 4707 scope.go:117] "RemoveContainer" containerID="ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.117015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" event={"ID":"2ac603bf-ede6-4aba-a1fd-e34a94470ada","Type":"ContainerDied","Data":"77ed561a5c72fd3b28269ce72255d0aa48cc1d2300c5b31b644673bc8799d8ca"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.117145 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.130696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"37f8fb41-6a21-4135-ace9-7b9b36eaaac5","Type":"ContainerDied","Data":"a5a75f299af371c49eff8cbc12e19805b5fa166fdbee5038db4acec1d3a0e78c"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.130757 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.149095 4707 scope.go:117] "RemoveContainer" containerID="ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.150922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" event={"ID":"00ff5dfe-af7e-4827-949e-926b7a41731b","Type":"ContainerDied","Data":"38686d7331429b59d331120099879c8b5e6bc17322a5c249f322d7268652131c"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.150995 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.160005 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.163181 4707 generic.go:334] "Generic (PLEG): container finished" podID="80b52f69-1c27-4d55-a847-c304552c27d0" containerID="9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20" exitCode=0 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.163300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" event={"ID":"80b52f69-1c27-4d55-a847-c304552c27d0","Type":"ContainerDied","Data":"9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.167882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dbh\" (UniqueName: \"kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.167976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts\") pod \"keystone-2c83-account-create-update-9mbgh\" (UID: \"77fb4097-22e4-44a9-a955-937dbfb75ee5\") " pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168023 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168034 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168043 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09bf8b72-b690-4084-854a-0245cd1a61b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168052 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bf8b72-b690-4084-854a-0245cd1a61b2-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168061 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168071 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-478rf\" (UniqueName: \"kubernetes.io/projected/09bf8b72-b690-4084-854a-0245cd1a61b2-kube-api-access-478rf\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168080 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/f9190471-72fb-4298-85b6-bf07b32b6de5-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168096 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168105 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjgrc\" (UniqueName: \"kubernetes.io/projected/f9190471-72fb-4298-85b6-bf07b32b6de5-kube-api-access-bjgrc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.168113 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9190471-72fb-4298-85b6-bf07b32b6de5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.168619 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.168670 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts podName:77fb4097-22e4-44a9-a955-937dbfb75ee5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:52.16865517 +0000 UTC m=+1389.350171392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts") pod "keystone-2c83-account-create-update-9mbgh" (UID: "77fb4097-22e4-44a9-a955-937dbfb75ee5") : configmap "openstack-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.168701 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.168913 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data podName:1bc7660e-aaa9-4865-a873-4db17c1db628 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:54.168898096 +0000 UTC m=+1391.350414338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data") pod "rabbitmq-server-0" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628") : configmap "rabbitmq-config-data" not found Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.169044 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7fc4c9546b-n8wbl"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.169402 4707 generic.go:334] "Generic (PLEG): container finished" podID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerID="589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f" exitCode=0 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.169527 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.171669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"8d4d90d7-a400-45ab-8c5b-7e68ced57657","Type":"ContainerDied","Data":"589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.172665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"8d4d90d7-a400-45ab-8c5b-7e68ced57657","Type":"ContainerDied","Data":"d03a51987a21b4d709dac3b6f6ad6c6355f12a24465c9d1ae438bce6450456ba"} Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.175205 4707 projected.go:194] Error preparing data for projected volume kube-api-access-27dbh for pod openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.175280 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh podName:77fb4097-22e4-44a9-a955-937dbfb75ee5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:52.175255814 +0000 UTC m=+1389.356772037 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-27dbh" (UniqueName: "kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh") pod "keystone-2c83-account-create-update-9mbgh" (UID: "77fb4097-22e4-44a9-a955-937dbfb75ee5") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.177152 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.177117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-a414-account-create-update-xpbnj" event={"ID":"574e045d-bacf-4c78-8091-c33d0beb09c9","Type":"ContainerDied","Data":"9bf877b23da7cdec1c2ba15b16e6e798241f418eda143f2c822993f8a73c3370"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.182957 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.191871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" event={"ID":"01a45393-4050-40be-a82d-69550d6cb89e","Type":"ContainerDied","Data":"6191c485161f11673978ad2e23e63703e2968751f3a5671ce72a99d32d960cc2"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.191938 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.194023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" event={"ID":"45dd8867-a733-447e-adc9-68b1250fb2ba","Type":"ContainerDied","Data":"48af315254b9b26ca6fb562d7100875164b108c6597e4da1bd7edb224207df65"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.194094 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-6090-account-create-update-xr874" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.196406 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.197336 4707 scope.go:117] "RemoveContainer" containerID="ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.198191 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1\": container with ID starting with ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1 not found: ID does not exist" containerID="ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.198297 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1"} err="failed to get container status \"ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1\": rpc error: code = NotFound desc = could not find container \"ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1\": container with ID starting with ededcec46bd9c2b9addeadc7cb120acfee8ee72bfe946228f3454300362b95e1 not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.198513 4707 scope.go:117] "RemoveContainer" containerID="ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.199050 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341\": container with ID starting with ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341 not found: ID does not exist" containerID="ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.199128 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341"} err="failed to get container status \"ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341\": rpc error: code = NotFound desc = could not find container \"ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341\": container with ID starting with ad8e23ef54bbc01a998ebda55d392c2c06bf9f2547e93e73e288c3a7f4a53341 not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.199198 4707 scope.go:117] "RemoveContainer" containerID="d0cdbb4ec686eadd5528f51fab83e9c9497d8a3d56a4bb906dfe28f48de85b3b" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.199712 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerID="eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee" exitCode=0 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.199868 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.199941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js" event={"ID":"9ff945cd-386c-4385-91aa-0fb0b9b861c5","Type":"ContainerDied","Data":"8cc87ed52ab41128678cacf1b87946799e2cd2079fb60706fce98dfbf03a947c"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.200653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f9190471-72fb-4298-85b6-bf07b32b6de5","Type":"ContainerDied","Data":"eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.200667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f9190471-72fb-4298-85b6-bf07b32b6de5","Type":"ContainerDied","Data":"0bca8923ca333e053b4bc11f78c49c8ebf18710aaec94c05f9867d5a0f9e125a"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.231752 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.231794 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0d6b-account-create-update-qkcbk"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.232088 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b113905-777a-4919-96c0-704f8e6f4026" containerID="7e3078d1d6d9b820e4927ce9abc6e267ee99aa8fe7c1503c8b16b8ea3eed53da" exitCode=1 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.232731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" event={"ID":"7b113905-777a-4919-96c0-704f8e6f4026","Type":"ContainerDied","Data":"7e3078d1d6d9b820e4927ce9abc6e267ee99aa8fe7c1503c8b16b8ea3eed53da"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.251409 4707 generic.go:334] "Generic (PLEG): container finished" podID="09bf8b72-b690-4084-854a-0245cd1a61b2" containerID="ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d" exitCode=0 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.252120 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.258535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"09bf8b72-b690-4084-854a-0245cd1a61b2","Type":"ContainerDied","Data":"ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.258572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"09bf8b72-b690-4084-854a-0245cd1a61b2","Type":"ContainerDied","Data":"f2b7a8281dc592388d996fc191b88446c86a5f018e9c8d74de99a1d8705ca737"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.260102 4707 scope.go:117] "RemoveContainer" containerID="383bb9acff21cf11ac127e42ba50d992286bc0e2ae3f73e405b397f0688b28c8" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.261455 4707 generic.go:334] "Generic (PLEG): container finished" podID="117c40c7-7452-42e7-8a5a-0a1123ac030e" containerID="3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f" exitCode=0 Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.261564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"117c40c7-7452-42e7-8a5a-0a1123ac030e","Type":"ContainerDied","Data":"3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.261589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"117c40c7-7452-42e7-8a5a-0a1123ac030e","Type":"ContainerDied","Data":"fba276945f55be0cd800a6fcb1dac6fdf6e1f8722c4d5600d17ea16ba4bd3261"} Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.266000 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.272009 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.274202 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.275030 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.275072 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.276217 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.287953 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.304168 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.310941 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.314099 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.330936 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6f8b448588-x2v5l"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.342752 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-xpbnj"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.347377 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-a414-account-create-update-xpbnj"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.417176 4707 scope.go:117] "RemoveContainer" containerID="7b5e320769635ae68967dbf380846b5fcaacb83053ce0325c620c2d8aacab67e" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.453010 4707 scope.go:117] "RemoveContainer" containerID="589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.455121 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.460522 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.247:8776/healthcheck\": read tcp 10.217.0.2:56458->10.217.0.247:8776: read: connection reset by peer" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.465315 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.475396 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.475451 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts podName:d8e220c0-ad06-4f2b-be03-642acdd3a23a nodeName:}" failed. No retries permitted until 2026-01-21 15:24:52.475437918 +0000 UTC m=+1389.656954129 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts") pod "glance-8403-account-create-update-fpc6b" (UID: "d8e220c0-ad06-4f2b-be03-642acdd3a23a") : configmap "openstack-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.480883 4707 scope.go:117] "RemoveContainer" containerID="e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.480944 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.489692 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.513637 4707 scope.go:117] "RemoveContainer" containerID="589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.524728 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f\": container with ID starting with 589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f not found: ID does not exist" containerID="589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.524765 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f"} err="failed to get container status \"589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f\": rpc error: code = NotFound desc = could not find container \"589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f\": container with ID starting with 589255943db50cc08d7cc074bd83b7a269736d8b858a2fae7a80adfdfe9da30f not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.524842 4707 scope.go:117] "RemoveContainer" containerID="e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.530378 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce\": container with ID starting with e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce not found: ID does not exist" containerID="e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.530405 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce"} err="failed to get container status \"e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce\": rpc error: code = NotFound desc = could not find container \"e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce\": container with ID starting with e040e6aec68d216664185da0053355da57390d1f0a46e7d2e9cc6de279b8bfce not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.530421 4707 scope.go:117] "RemoveContainer" containerID="eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.539312 4707 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.546095 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d429-account-create-update-4xh55"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.553002 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.556978 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-e09b-account-create-update-q69js"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.561117 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.565153 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.573099 4707 scope.go:117] "RemoveContainer" containerID="2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.576508 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.576563 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts podName:7b113905-777a-4919-96c0-704f8e6f4026 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:52.576550126 +0000 UTC m=+1389.758066347 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts") pod "root-account-create-update-tnlbm" (UID: "7b113905-777a-4919-96c0-704f8e6f4026") : configmap "openstack-cell1-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.576880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.593163 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-2c83-account-create-update-9mbgh"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.607384 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.608914 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7b97d48fcb-bq5h5"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.615860 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.619697 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5f75796cbf-qqfgf"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.631768 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-xr874"] Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.642947 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-6090-account-create-update-xr874"] Jan 21 
15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.665998 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff945cd_386c_4385_91aa_0fb0b9b861c5.slice/crio-8cc87ed52ab41128678cacf1b87946799e2cd2079fb60706fce98dfbf03a947c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bf8b72_b690_4084_854a_0245cd1a61b2.slice/crio-f2b7a8281dc592388d996fc191b88446c86a5f018e9c8d74de99a1d8705ca737\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bf8b72_b690_4084_854a_0245cd1a61b2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90c1a990_5286_4274_a26a_8740ee506a12.slice/crio-1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod577d42b7_93f8_47f8_987a_58995f68dd9b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff945cd_386c_4385_91aa_0fb0b9b861c5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda066dd9_9af4_46cf_9fea_e8088a738220.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45dd8867_a733_447e_adc9_68b1250fb2ba.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117c40c7_7452_42e7_8a5a_0a1123ac030e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45dd8867_a733_447e_adc9_68b1250fb2ba.slice/crio-48af315254b9b26ca6fb562d7100875164b108c6597e4da1bd7edb224207df65\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77fb4097_22e4_44a9_a955_937dbfb75ee5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117c40c7_7452_42e7_8a5a_0a1123ac030e.slice/crio-fba276945f55be0cd800a6fcb1dac6fdf6e1f8722c4d5600d17ea16ba4bd3261\": RecentStats: unable to find data in memory cache]" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.676097 4707 scope.go:117] "RemoveContainer" containerID="eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.676440 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee\": container with ID starting with eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee not found: ID does not exist" containerID="eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.676633 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee"} err="failed to get container status \"eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee\": rpc error: code = NotFound desc = could not find container 
\"eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee\": container with ID starting with eb419fbf544019cae6196057b5734612a475964a2b2331db5ee37b0acf0431ee not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.676651 4707 scope.go:117] "RemoveContainer" containerID="2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.676849 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572\": container with ID starting with 2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572 not found: ID does not exist" containerID="2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.676864 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572"} err="failed to get container status \"2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572\": rpc error: code = NotFound desc = could not find container \"2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572\": container with ID starting with 2e0645a9f3c7b4ed8201ea6073905eade13a57a3c291bf068cb314fb514ce572 not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.676876 4707 scope.go:117] "RemoveContainer" containerID="210aa9472f7d900c0ca04d5cf2c536a85b2621ac36455c5b52c667c54ca18f1a" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.677527 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da066dd9-9af4-46cf-9fea-e8088a738220-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.677546 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77fb4097-22e4-44a9-a955-937dbfb75ee5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.677555 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27dbh\" (UniqueName: \"kubernetes.io/projected/77fb4097-22e4-44a9-a955-937dbfb75ee5-kube-api-access-27dbh\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.677564 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d42b7-93f8-47f8-987a-58995f68dd9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.677626 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.677684 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts podName:c86afe68-6c61-4a64-bdb3-2f489efbea33 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:52.677670337 +0000 UTC m=+1389.859186559 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts") pod "neutron-06ca-account-create-update-49sxr" (UID: "c86afe68-6c61-4a64-bdb3-2f489efbea33") : configmap "openstack-scripts" not found Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.697776 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.240:8778/\": dial tcp 10.217.0.240:8778: connect: connection refused" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.698485 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.240:8778/\": dial tcp 10.217.0.240:8778: connect: connection refused" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.719559 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.721157 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.727578 4707 scope.go:117] "RemoveContainer" containerID="ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.778402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts\") pod \"7b113905-777a-4919-96c0-704f8e6f4026\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.778542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts\") pod \"c86afe68-6c61-4a64-bdb3-2f489efbea33\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.779194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b113905-777a-4919-96c0-704f8e6f4026" (UID: "7b113905-777a-4919-96c0-704f8e6f4026"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.781116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww75t\" (UniqueName: \"kubernetes.io/projected/7b113905-777a-4919-96c0-704f8e6f4026-kube-api-access-ww75t\") pod \"7b113905-777a-4919-96c0-704f8e6f4026\" (UID: \"7b113905-777a-4919-96c0-704f8e6f4026\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.781160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbxwb\" (UniqueName: \"kubernetes.io/projected/c86afe68-6c61-4a64-bdb3-2f489efbea33-kube-api-access-hbxwb\") pod \"c86afe68-6c61-4a64-bdb3-2f489efbea33\" (UID: \"c86afe68-6c61-4a64-bdb3-2f489efbea33\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.781494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c86afe68-6c61-4a64-bdb3-2f489efbea33" (UID: "c86afe68-6c61-4a64-bdb3-2f489efbea33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.782547 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86afe68-6c61-4a64-bdb3-2f489efbea33-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.782563 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b113905-777a-4919-96c0-704f8e6f4026-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.799496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86afe68-6c61-4a64-bdb3-2f489efbea33-kube-api-access-hbxwb" (OuterVolumeSpecName: "kube-api-access-hbxwb") pod "c86afe68-6c61-4a64-bdb3-2f489efbea33" (UID: "c86afe68-6c61-4a64-bdb3-2f489efbea33"). InnerVolumeSpecName "kube-api-access-hbxwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.822112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b113905-777a-4919-96c0-704f8e6f4026-kube-api-access-ww75t" (OuterVolumeSpecName: "kube-api-access-ww75t") pod "7b113905-777a-4919-96c0-704f8e6f4026" (UID: "7b113905-777a-4919-96c0-704f8e6f4026"). InnerVolumeSpecName "kube-api-access-ww75t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.836495 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.876031 4707 scope.go:117] "RemoveContainer" containerID="ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.879064 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d\": container with ID starting with ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d not found: ID does not exist" containerID="ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.879092 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d"} err="failed to get container status \"ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d\": rpc error: code = NotFound desc = could not find container \"ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d\": container with ID starting with ceb9b5905b05138114c7d05ceb2dea82a8e29fea6c987b05664b5ad98857a71d not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.879110 4707 scope.go:117] "RemoveContainer" containerID="3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.890418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k879\" (UniqueName: \"kubernetes.io/projected/d8e220c0-ad06-4f2b-be03-642acdd3a23a-kube-api-access-2k879\") pod \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.890983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts\") pod \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\" (UID: \"d8e220c0-ad06-4f2b-be03-642acdd3a23a\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.891394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww75t\" (UniqueName: \"kubernetes.io/projected/7b113905-777a-4919-96c0-704f8e6f4026-kube-api-access-ww75t\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.891412 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbxwb\" (UniqueName: \"kubernetes.io/projected/c86afe68-6c61-4a64-bdb3-2f489efbea33-kube-api-access-hbxwb\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.891889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8e220c0-ad06-4f2b-be03-642acdd3a23a" (UID: "d8e220c0-ad06-4f2b-be03-642acdd3a23a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.899188 4707 scope.go:117] "RemoveContainer" containerID="3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.899599 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f\": container with ID starting with 3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f not found: ID does not exist" containerID="3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.899634 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f"} err="failed to get container status \"3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f\": rpc error: code = NotFound desc = could not find container \"3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f\": container with ID starting with 3ffb017f216a3e1e720ba664a0b738200c50d63bea4577cabde7509a3030be3f not found: ID does not exist" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.910536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e220c0-ad06-4f2b-be03-642acdd3a23a-kube-api-access-2k879" (OuterVolumeSpecName: "kube-api-access-2k879") pod "d8e220c0-ad06-4f2b-be03-642acdd3a23a" (UID: "d8e220c0-ad06-4f2b-be03-642acdd3a23a"). InnerVolumeSpecName "kube-api-access-2k879". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: E0121 15:24:50.940474 4707 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.949577 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.992728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-combined-ca-bundle\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.992978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data-custom\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90c1a990-5286-4274-a26a-8740ee506a12-etc-machine-id\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c1a990-5286-4274-a26a-8740ee506a12-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6nlk\" (UniqueName: \"kubernetes.io/projected/90c1a990-5286-4274-a26a-8740ee506a12-kube-api-access-v6nlk\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-internal-tls-certs\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-scripts\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-public-tls-certs\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.993986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c1a990-5286-4274-a26a-8740ee506a12-logs\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.994212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data\") pod \"90c1a990-5286-4274-a26a-8740ee506a12\" (UID: \"90c1a990-5286-4274-a26a-8740ee506a12\") " Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.994843 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k879\" (UniqueName: \"kubernetes.io/projected/d8e220c0-ad06-4f2b-be03-642acdd3a23a-kube-api-access-2k879\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.994919 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90c1a990-5286-4274-a26a-8740ee506a12-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.994973 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e220c0-ad06-4f2b-be03-642acdd3a23a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.995023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90c1a990-5286-4274-a26a-8740ee506a12-logs" (OuterVolumeSpecName: "logs") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:50 crc kubenswrapper[4707]: I0121 15:24:50.995790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:50.997949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-scripts" (OuterVolumeSpecName: "scripts") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.010390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c1a990-5286-4274-a26a-8740ee506a12-kube-api-access-v6nlk" (OuterVolumeSpecName: "kube-api-access-v6nlk") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "kube-api-access-v6nlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.026020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data" (OuterVolumeSpecName: "config-data") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.029688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.035981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.043033 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.052621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "90c1a990-5286-4274-a26a-8740ee506a12" (UID: "90c1a990-5286-4274-a26a-8740ee506a12"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.096474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvddn\" (UniqueName: \"kubernetes.io/projected/316bd962-7ce5-4797-8a99-79aacd85c3fb-kube-api-access-tvddn\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.096713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-scripts\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.096934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-config-data\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-combined-ca-bundle\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-public-tls-certs\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316bd962-7ce5-4797-8a99-79aacd85c3fb-logs\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" (UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-internal-tls-certs\") pod \"316bd962-7ce5-4797-8a99-79aacd85c3fb\" 
(UID: \"316bd962-7ce5-4797-8a99-79aacd85c3fb\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316bd962-7ce5-4797-8a99-79aacd85c3fb-logs" (OuterVolumeSpecName: "logs") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097869 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316bd962-7ce5-4797-8a99-79aacd85c3fb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097942 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.097995 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6nlk\" (UniqueName: \"kubernetes.io/projected/90c1a990-5286-4274-a26a-8740ee506a12-kube-api-access-v6nlk\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.098044 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.098115 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.098174 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.098286 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c1a990-5286-4274-a26a-8740ee506a12-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.098378 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.098427 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c1a990-5286-4274-a26a-8740ee506a12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.100660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316bd962-7ce5-4797-8a99-79aacd85c3fb-kube-api-access-tvddn" (OuterVolumeSpecName: "kube-api-access-tvddn") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "kube-api-access-tvddn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.100728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-scripts" (OuterVolumeSpecName: "scripts") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.128737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.133490 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-config-data" (OuterVolumeSpecName: "config-data") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.158651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.172148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "316bd962-7ce5-4797-8a99-79aacd85c3fb" (UID: "316bd962-7ce5-4797-8a99-79aacd85c3fb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.190919 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" path="/var/lib/kubelet/pods/00ff5dfe-af7e-4827-949e-926b7a41731b/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.191423 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a45393-4050-40be-a82d-69550d6cb89e" path="/var/lib/kubelet/pods/01a45393-4050-40be-a82d-69550d6cb89e/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.191755 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09bf8b72-b690-4084-854a-0245cd1a61b2" path="/var/lib/kubelet/pods/09bf8b72-b690-4084-854a-0245cd1a61b2/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.192321 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117c40c7-7452-42e7-8a5a-0a1123ac030e" path="/var/lib/kubelet/pods/117c40c7-7452-42e7-8a5a-0a1123ac030e/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.193322 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac603bf-ede6-4aba-a1fd-e34a94470ada" path="/var/lib/kubelet/pods/2ac603bf-ede6-4aba-a1fd-e34a94470ada/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.193626 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f8fb41-6a21-4135-ace9-7b9b36eaaac5" path="/var/lib/kubelet/pods/37f8fb41-6a21-4135-ace9-7b9b36eaaac5/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.194067 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" path="/var/lib/kubelet/pods/3c937d1c-d74a-4296-a78d-c88deef6b011/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.195073 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45dd8867-a733-447e-adc9-68b1250fb2ba" path="/var/lib/kubelet/pods/45dd8867-a733-447e-adc9-68b1250fb2ba/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.195838 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574e045d-bacf-4c78-8091-c33d0beb09c9" path="/var/lib/kubelet/pods/574e045d-bacf-4c78-8091-c33d0beb09c9/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.196245 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577d42b7-93f8-47f8-987a-58995f68dd9b" path="/var/lib/kubelet/pods/577d42b7-93f8-47f8-987a-58995f68dd9b/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.196476 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fb4097-22e4-44a9-a955-937dbfb75ee5" path="/var/lib/kubelet/pods/77fb4097-22e4-44a9-a955-937dbfb75ee5/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.196845 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" path="/var/lib/kubelet/pods/8d4d90d7-a400-45ab-8c5b-7e68ced57657/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.197711 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff945cd-386c-4385-91aa-0fb0b9b861c5" path="/var/lib/kubelet/pods/9ff945cd-386c-4385-91aa-0fb0b9b861c5/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.198062 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da066dd9-9af4-46cf-9fea-e8088a738220" path="/var/lib/kubelet/pods/da066dd9-9af4-46cf-9fea-e8088a738220/volumes" 
Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.198411 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" path="/var/lib/kubelet/pods/e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.199841 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.199857 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvddn\" (UniqueName: \"kubernetes.io/projected/316bd962-7ce5-4797-8a99-79aacd85c3fb-kube-api-access-tvddn\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.199867 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.199875 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.199883 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.199890 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316bd962-7ce5-4797-8a99-79aacd85c3fb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.200085 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" path="/var/lib/kubelet/pods/f9190471-72fb-4298-85b6-bf07b32b6de5/volumes" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.281445 4707 generic.go:334] "Generic (PLEG): container finished" podID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerID="b4638cfb34b740af95a7bc03409e4baaf7acbfe3168548cc0715b59e19f6d136" exitCode=0 Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.281515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"63816993-7ef8-4f21-9a3c-18040cb843bd","Type":"ContainerDied","Data":"b4638cfb34b740af95a7bc03409e4baaf7acbfe3168548cc0715b59e19f6d136"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.283803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" event={"ID":"80b52f69-1c27-4d55-a847-c304552c27d0","Type":"ContainerStarted","Data":"2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.283876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.291057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" event={"ID":"c86afe68-6c61-4a64-bdb3-2f489efbea33","Type":"ContainerDied","Data":"4191c90e78f75ac630bab39ee6c9bd2ef9b9833b23492e2d0e44d3b0812d06da"} Jan 21 
15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.291126 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.293198 4707 generic.go:334] "Generic (PLEG): container finished" podID="90c1a990-5286-4274-a26a-8740ee506a12" containerID="1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447" exitCode=0 Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.293237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"90c1a990-5286-4274-a26a-8740ee506a12","Type":"ContainerDied","Data":"1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.293293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"90c1a990-5286-4274-a26a-8740ee506a12","Type":"ContainerDied","Data":"2cfc23d8af7f51b787084f15c145ea5d95b8afe2ea3d608d68f00e4b34f65866"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.293308 4707 scope.go:117] "RemoveContainer" containerID="1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.293433 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.312250 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" podStartSLOduration=5.312236679 podStartE2EDuration="5.312236679s" podCreationTimestamp="2026-01-21 15:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:24:51.299975978 +0000 UTC m=+1388.481492200" watchObservedRunningTime="2026-01-21 15:24:51.312236679 +0000 UTC m=+1388.493752901" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.331236 4707 scope.go:117] "RemoveContainer" containerID="2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.332101 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.332387 4707 generic.go:334] "Generic (PLEG): container finished" podID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerID="5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b" exitCode=0 Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.332433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" event={"ID":"316bd962-7ce5-4797-8a99-79aacd85c3fb","Type":"ContainerDied","Data":"5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.332456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" event={"ID":"316bd962-7ce5-4797-8a99-79aacd85c3fb","Type":"ContainerDied","Data":"c6efe1634e159efa5d15d25489c07dd0eaac269d0facb952d945215a65da5eba"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.332493 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-55779dfdd6-4x8fn" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.336036 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.340859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-06ca-account-create-update-49sxr"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.344474 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.348803 4707 generic.go:334] "Generic (PLEG): container finished" podID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerID="5a6924b802768405402d6b86cb657dfd7334a39647b9f65f1a3ca7e21dd5e716" exitCode=0 Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.348886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"edf40ef7-2790-4e9b-92dd-b92ab951383d","Type":"ContainerDied","Data":"5a6924b802768405402d6b86cb657dfd7334a39647b9f65f1a3ca7e21dd5e716"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.348989 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.352861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" event={"ID":"d8e220c0-ad06-4f2b-be03-642acdd3a23a","Type":"ContainerDied","Data":"62d467298f9c0c82284e8fe0454fad8b6f08f5085f130b12206ab1cd95e69302"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.352911 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-8403-account-create-update-fpc6b" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.355223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" event={"ID":"7b113905-777a-4919-96c0-704f8e6f4026","Type":"ContainerDied","Data":"3886a813b1e41e66c41c1764196d19969a06be87ec2eb9f998f64d26c965490e"} Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.355311 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tnlbm" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.356937 4707 scope.go:117] "RemoveContainer" containerID="1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447" Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.359879 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447\": container with ID starting with 1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447 not found: ID does not exist" containerID="1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.359910 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447"} err="failed to get container status \"1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447\": rpc error: code = NotFound desc = could not find container \"1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447\": container with ID starting with 1cae5247dae32784b83d5e2b8f4b032df2b1d67f729bc3e99af73430254e1447 not found: ID does not exist" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.359941 4707 scope.go:117] "RemoveContainer" containerID="2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390" Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.361029 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390\": container with ID starting with 2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390 not found: ID does not exist" containerID="2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.361055 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390"} err="failed to get container status \"2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390\": rpc error: code = NotFound desc = could not find container \"2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390\": container with ID starting with 2e18baa98c11ec371e16f5003fa841209c64a3fad418a1747d29a36a3ac01390 not found: ID does not exist" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.361071 4707 scope.go:117] "RemoveContainer" containerID="5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.364853 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_11022c93-b7e5-4241-812c-eed16cc40ce2/ovn-northd/0.log" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.364882 4707 generic.go:334] "Generic (PLEG): container finished" podID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerID="8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832" exitCode=139 Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.364904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"11022c93-b7e5-4241-812c-eed16cc40ce2","Type":"ContainerDied","Data":"8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832"} Jan 21 15:24:51 crc 
kubenswrapper[4707]: I0121 15:24:51.383796 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-fpc6b"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.390174 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-8403-account-create-update-fpc6b"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.396073 4707 scope.go:117] "RemoveContainer" containerID="84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.396394 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-55779dfdd6-4x8fn"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-config-data\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-logs\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6st5g\" (UniqueName: \"kubernetes.io/projected/63816993-7ef8-4f21-9a3c-18040cb843bd-kube-api-access-6st5g\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-combined-ca-bundle\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-scripts\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-httpd-run\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.402508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-internal-tls-certs\") pod \"63816993-7ef8-4f21-9a3c-18040cb843bd\" (UID: \"63816993-7ef8-4f21-9a3c-18040cb843bd\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.403600 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.404115 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-55779dfdd6-4x8fn"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.413249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63816993-7ef8-4f21-9a3c-18040cb843bd-kube-api-access-6st5g" (OuterVolumeSpecName: "kube-api-access-6st5g") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "kube-api-access-6st5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.413454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-logs" (OuterVolumeSpecName: "logs") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.413904 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tnlbm"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.413917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-scripts" (OuterVolumeSpecName: "scripts") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.416415 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.417847 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tnlbm"] Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.418711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.424253 4707 scope.go:117] "RemoveContainer" containerID="5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b" Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.424583 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b\": container with ID starting with 5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b not found: ID does not exist" containerID="5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.424606 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b"} err="failed to get container status \"5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b\": rpc error: code = NotFound desc = could not find container \"5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b\": container with ID starting with 5a203b1fa6c5065269d52d7e5b754f609e8e96c67737a2804ef369a5eed8e88b not found: ID does not exist" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.424627 4707 scope.go:117] "RemoveContainer" containerID="84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078" Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.424864 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078\": container with ID starting with 84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078 not found: ID does not exist" containerID="84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.424885 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078"} err="failed to get container status \"84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078\": rpc error: code = NotFound desc = could not find container \"84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078\": container with ID starting with 84e2e45907bd04c24dcd723dcadc23908436fe084f932c8f521f4d0367c2a078 not found: ID does not exist" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.424901 4707 scope.go:117] "RemoveContainer" containerID="7e3078d1d6d9b820e4927ce9abc6e267ee99aa8fe7c1503c8b16b8ea3eed53da" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.428491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.456165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.463695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-config-data" (OuterVolumeSpecName: "config-data") pod "63816993-7ef8-4f21-9a3c-18040cb843bd" (UID: "63816993-7ef8-4f21-9a3c-18040cb843bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-public-tls-certs\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-combined-ca-bundle\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghzb6\" (UniqueName: \"kubernetes.io/projected/edf40ef7-2790-4e9b-92dd-b92ab951383d-kube-api-access-ghzb6\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-httpd-run\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-scripts\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-config-data\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.503622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-logs\") pod \"edf40ef7-2790-4e9b-92dd-b92ab951383d\" (UID: \"edf40ef7-2790-4e9b-92dd-b92ab951383d\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504098 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504115 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6st5g\" (UniqueName: \"kubernetes.io/projected/63816993-7ef8-4f21-9a3c-18040cb843bd-kube-api-access-6st5g\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504126 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504135 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504142 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63816993-7ef8-4f21-9a3c-18040cb843bd-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504149 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504166 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504174 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63816993-7ef8-4f21-9a3c-18040cb843bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.504772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.505297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-logs" (OuterVolumeSpecName: "logs") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.507202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.507594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf40ef7-2790-4e9b-92dd-b92ab951383d-kube-api-access-ghzb6" (OuterVolumeSpecName: "kube-api-access-ghzb6") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "kube-api-access-ghzb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.511645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-scripts" (OuterVolumeSpecName: "scripts") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.522204 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.524394 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_11022c93-b7e5-4241-812c-eed16cc40ce2/ovn-northd/0.log" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.524450 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.528087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.551128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-config-data" (OuterVolumeSpecName: "config-data") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.559820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "edf40ef7-2790-4e9b-92dd-b92ab951383d" (UID: "edf40ef7-2790-4e9b-92dd-b92ab951383d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-config\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-combined-ca-bundle\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-scripts\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-rundir\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzcgj\" (UniqueName: \"kubernetes.io/projected/11022c93-b7e5-4241-812c-eed16cc40ce2-kube-api-access-xzcgj\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-metrics-certs-tls-certs\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-northd-tls-certs\") pod \"11022c93-b7e5-4241-812c-eed16cc40ce2\" (UID: \"11022c93-b7e5-4241-812c-eed16cc40ce2\") " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605896 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605913 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605920 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605929 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-public-tls-certs\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605939 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf40ef7-2790-4e9b-92dd-b92ab951383d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605947 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605956 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghzb6\" (UniqueName: \"kubernetes.io/projected/edf40ef7-2790-4e9b-92dd-b92ab951383d-kube-api-access-ghzb6\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605972 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.605980 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/edf40ef7-2790-4e9b-92dd-b92ab951383d-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.606349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.606668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-config" (OuterVolumeSpecName: "config") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.615151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-scripts" (OuterVolumeSpecName: "scripts") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.616643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11022c93-b7e5-4241-812c-eed16cc40ce2-kube-api-access-xzcgj" (OuterVolumeSpecName: "kube-api-access-xzcgj") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "kube-api-access-xzcgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.617216 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.624218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.658028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.658657 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "11022c93-b7e5-4241-812c-eed16cc40ce2" (UID: "11022c93-b7e5-4241-812c-eed16cc40ce2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.685791 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.687120 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.692981 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:51 crc kubenswrapper[4707]: E0121 15:24:51.693033 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" containerName="nova-cell1-conductor-conductor" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707774 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 
15:24:51.707799 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707821 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707829 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707837 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11022c93-b7e5-4241-812c-eed16cc40ce2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707847 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11022c93-b7e5-4241-812c-eed16cc40ce2-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707855 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzcgj\" (UniqueName: \"kubernetes.io/projected/11022c93-b7e5-4241-812c-eed16cc40ce2-kube-api-access-xzcgj\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.707865 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11022c93-b7e5-4241-812c-eed16cc40ce2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.887344 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.26:8775/\": read tcp 10.217.0.2:43094->10.217.1.26:8775: read: connection reset by peer" Jan 21 15:24:51 crc kubenswrapper[4707]: I0121 15:24:51.887518 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.26:8775/\": read tcp 10.217.0.2:43100->10.217.1.26:8775: read: connection reset by peer" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.211498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.237291 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.262076 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.292321 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-combined-ca-bundle\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnhl2\" (UniqueName: \"kubernetes.io/projected/07daec18-3928-4cae-8fbf-4de0b30c690c-kube-api-access-wnhl2\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-combined-ca-bundle\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-log-httpd\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7658f54-ea3a-4d2a-bec0-6a885e661223-logs\") pod \"a7658f54-ea3a-4d2a-bec0-6a885e661223\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-logs\") pod \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-combined-ca-bundle\") pod \"a7658f54-ea3a-4d2a-bec0-6a885e661223\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-config-data\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-config-data\") pod \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-run-httpd\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: 
\"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-fernet-keys\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-config-data\") pod \"a7658f54-ea3a-4d2a-bec0-6a885e661223\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-internal-tls-certs\") pod \"a7658f54-ea3a-4d2a-bec0-6a885e661223\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-internal-tls-certs\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-scripts\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-public-tls-certs\") pod \"a7658f54-ea3a-4d2a-bec0-6a885e661223\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-scripts\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-ceilometer-tls-certs\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94sdq\" (UniqueName: \"kubernetes.io/projected/a7658f54-ea3a-4d2a-bec0-6a885e661223-kube-api-access-94sdq\") pod \"a7658f54-ea3a-4d2a-bec0-6a885e661223\" (UID: \"a7658f54-ea3a-4d2a-bec0-6a885e661223\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-sg-core-conf-yaml\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: 
\"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw29m\" (UniqueName: \"kubernetes.io/projected/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-kube-api-access-lw29m\") pod \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-credential-keys\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6zjd\" (UniqueName: \"kubernetes.io/projected/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-kube-api-access-j6zjd\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-combined-ca-bundle\") pod \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-nova-metadata-tls-certs\") pod \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\" (UID: \"ed7400b4-4f9e-4571-9156-5cf8f5e0df67\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-config-data\") pod \"07daec18-3928-4cae-8fbf-4de0b30c690c\" (UID: \"07daec18-3928-4cae-8fbf-4de0b30c690c\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.321728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-public-tls-certs\") pod \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\" (UID: \"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.328769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-kube-api-access-lw29m" (OuterVolumeSpecName: "kube-api-access-lw29m") pod "ed7400b4-4f9e-4571-9156-5cf8f5e0df67" (UID: "ed7400b4-4f9e-4571-9156-5cf8f5e0df67"). InnerVolumeSpecName "kube-api-access-lw29m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.332113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-logs" (OuterVolumeSpecName: "logs") pod "ed7400b4-4f9e-4571-9156-5cf8f5e0df67" (UID: "ed7400b4-4f9e-4571-9156-5cf8f5e0df67"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.355004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.355095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.357303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.357570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7658f54-ea3a-4d2a-bec0-6a885e661223-logs" (OuterVolumeSpecName: "logs") pod "a7658f54-ea3a-4d2a-bec0-6a885e661223" (UID: "a7658f54-ea3a-4d2a-bec0-6a885e661223"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.364907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07daec18-3928-4cae-8fbf-4de0b30c690c-kube-api-access-wnhl2" (OuterVolumeSpecName: "kube-api-access-wnhl2") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "kube-api-access-wnhl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.365142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-kube-api-access-j6zjd" (OuterVolumeSpecName: "kube-api-access-j6zjd") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "kube-api-access-j6zjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.366078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430709 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw29m\" (UniqueName: \"kubernetes.io/projected/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-kube-api-access-lw29m\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430741 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430750 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6zjd\" (UniqueName: \"kubernetes.io/projected/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-kube-api-access-j6zjd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430758 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnhl2\" (UniqueName: \"kubernetes.io/projected/07daec18-3928-4cae-8fbf-4de0b30c690c-kube-api-access-wnhl2\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430768 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430775 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7658f54-ea3a-4d2a-bec0-6a885e661223-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430783 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430790 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07daec18-3928-4cae-8fbf-4de0b30c690c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430798 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.430925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-scripts" (OuterVolumeSpecName: "scripts") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.433671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-scripts" (OuterVolumeSpecName: "scripts") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.464907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7658f54-ea3a-4d2a-bec0-6a885e661223-kube-api-access-94sdq" (OuterVolumeSpecName: "kube-api-access-94sdq") pod "a7658f54-ea3a-4d2a-bec0-6a885e661223" (UID: "a7658f54-ea3a-4d2a-bec0-6a885e661223"). InnerVolumeSpecName "kube-api-access-94sdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.469884 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerID="e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde" exitCode=0 Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.469946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a7658f54-ea3a-4d2a-bec0-6a885e661223","Type":"ContainerDied","Data":"e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.469970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a7658f54-ea3a-4d2a-bec0-6a885e661223","Type":"ContainerDied","Data":"4d06506824cfce7e30e471be7a467e782bc3996c9e181fba2fe71f33c59f8834"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.469986 4707 scope.go:117] "RemoveContainer" containerID="e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.470091 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.480474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-config-data" (OuterVolumeSpecName: "config-data") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.487178 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerID="6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da" exitCode=0 Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.487243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed7400b4-4f9e-4571-9156-5cf8f5e0df67","Type":"ContainerDied","Data":"6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.487279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed7400b4-4f9e-4571-9156-5cf8f5e0df67","Type":"ContainerDied","Data":"aba8650dca5babd848a93663f9d9b471727332299482c7f107b319fa5acbb7bb"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.487689 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.491294 4707 generic.go:334] "Generic (PLEG): container finished" podID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerID="9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e" exitCode=0 Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.491322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerDied","Data":"9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.491350 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.491869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"07daec18-3928-4cae-8fbf-4de0b30c690c","Type":"ContainerDied","Data":"1313d39be45c107bc5cce37c7a7b8160ceb0128f42ec84e9bbdab1c18ca06b51"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.493799 4707 scope.go:117] "RemoveContainer" containerID="e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.494067 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerID="90be3266ccbe5e96e44486114865333b48d5affe4c0b643d4de2af49390b91a5" exitCode=0 Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.494124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" event={"ID":"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e","Type":"ContainerDied","Data":"90be3266ccbe5e96e44486114865333b48d5affe4c0b643d4de2af49390b91a5"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.494521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.495627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed7400b4-4f9e-4571-9156-5cf8f5e0df67" (UID: "ed7400b4-4f9e-4571-9156-5cf8f5e0df67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.496411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"edf40ef7-2790-4e9b-92dd-b92ab951383d","Type":"ContainerDied","Data":"b5932ed4d3d6079a9ff1cf10c3d5a106f93d59da74038f55064b644c7f9a5966"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.496428 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.498611 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_11022c93-b7e5-4241-812c-eed16cc40ce2/ovn-northd/0.log" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.498670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"11022c93-b7e5-4241-812c-eed16cc40ce2","Type":"ContainerDied","Data":"412a687571ac458ce8139915f80a087668d12812bd398a0a6eabba5476dde1a9"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.498721 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.502775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.507833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-config-data" (OuterVolumeSpecName: "config-data") pod "a7658f54-ea3a-4d2a-bec0-6a885e661223" (UID: "a7658f54-ea3a-4d2a-bec0-6a885e661223"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.508320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"63816993-7ef8-4f21-9a3c-18040cb843bd","Type":"ContainerDied","Data":"2e5be557c46d199ec3ad1ee1b26a68bcf71ee595c54cc79d8bf9c925aaad8154"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.508587 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.513532 4707 generic.go:334] "Generic (PLEG): container finished" podID="3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" containerID="464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed" exitCode=0 Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.513573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" event={"ID":"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a","Type":"ContainerDied","Data":"464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.513593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" event={"ID":"3aee120c-53c3-4a72-a81e-2dd10cc5cb9a","Type":"ContainerDied","Data":"a4d25ef1cd513130c256ed45fabe1218d3833a2b42fbd941a7c36c0b9ca09e02"} Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.513627 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7f68c764d7-98wg2" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.513666 4707 scope.go:117] "RemoveContainer" containerID="e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.513981 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde\": container with ID starting with e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde not found: ID does not exist" containerID="e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.514016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde"} err="failed to get container status \"e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde\": rpc error: code = NotFound desc = could not find container \"e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde\": container with ID starting with e32de085e59780987751760faa376193338a3af0838ac9e84b9eb5648b22dcde not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.514032 4707 scope.go:117] "RemoveContainer" containerID="e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.516329 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386\": container with ID starting with e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386 not found: ID does not exist" containerID="e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.516352 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386"} err="failed to get container status \"e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386\": rpc error: code = NotFound desc = could not find container \"e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386\": container with ID starting with e71b7ba98d32af91d1fe3c371a060be201b721355fb81ad51222cc660d946386 not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.516368 4707 scope.go:117] "RemoveContainer" containerID="6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.519881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-config-data" (OuterVolumeSpecName: "config-data") pod "ed7400b4-4f9e-4571-9156-5cf8f5e0df67" (UID: "ed7400b4-4f9e-4571-9156-5cf8f5e0df67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.529668 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532518 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532540 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532550 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532559 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532567 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532596 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94sdq\" (UniqueName: \"kubernetes.io/projected/a7658f54-ea3a-4d2a-bec0-6a885e661223-kube-api-access-94sdq\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532604 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532612 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.532619 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.533507 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.541624 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7658f54-ea3a-4d2a-bec0-6a885e661223" (UID: "a7658f54-ea3a-4d2a-bec0-6a885e661223"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.541683 4707 scope.go:117] "RemoveContainer" containerID="add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.547882 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.553392 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.553670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a7658f54-ea3a-4d2a-bec0-6a885e661223" (UID: "a7658f54-ea3a-4d2a-bec0-6a885e661223"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.557123 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.560650 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.561990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.570666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ed7400b4-4f9e-4571-9156-5cf8f5e0df67" (UID: "ed7400b4-4f9e-4571-9156-5cf8f5e0df67"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.572743 4707 scope.go:117] "RemoveContainer" containerID="6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.573190 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da\": container with ID starting with 6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da not found: ID does not exist" containerID="6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.573190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" (UID: "3aee120c-53c3-4a72-a81e-2dd10cc5cb9a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.573215 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da"} err="failed to get container status \"6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da\": rpc error: code = NotFound desc = could not find container \"6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da\": container with ID starting with 6094ffe3e04e3aefd5218a72044156248f14c22f70394fa002f9f8e8abf856da not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.573309 4707 scope.go:117] "RemoveContainer" containerID="add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.573962 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868\": container with ID starting with add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868 not found: ID does not exist" containerID="add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.573986 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868"} err="failed to get container status \"add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868\": rpc error: code = NotFound desc = could not find container \"add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868\": container with ID starting with add60b3df92ea9fe7578358d219b4102de8165eb955505a64a7c5c9a6a332868 not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.574025 4707 scope.go:117] "RemoveContainer" containerID="10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.574432 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.576131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.576205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.582721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7658f54-ea3a-4d2a-bec0-6a885e661223" (UID: "a7658f54-ea3a-4d2a-bec0-6a885e661223"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.595937 4707 scope.go:117] "RemoveContainer" containerID="a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.610136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-config-data" (OuterVolumeSpecName: "config-data") pod "07daec18-3928-4cae-8fbf-4de0b30c690c" (UID: "07daec18-3928-4cae-8fbf-4de0b30c690c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.610309 4707 scope.go:117] "RemoveContainer" containerID="9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.626185 4707 scope.go:117] "RemoveContainer" containerID="a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-public-tls-certs\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-combined-ca-bundle\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl522\" (UniqueName: \"kubernetes.io/projected/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-kube-api-access-xl522\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-logs\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data-custom\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-internal-tls-certs\") pod \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\" (UID: \"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e\") " Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 
15:24:52.637720 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637737 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637747 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637756 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7658f54-ea3a-4d2a-bec0-6a885e661223-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637764 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637771 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637779 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7400b4-4f9e-4571-9156-5cf8f5e0df67-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637787 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637794 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07daec18-3928-4cae-8fbf-4de0b30c690c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.637997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-logs" (OuterVolumeSpecName: "logs") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.640314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-kube-api-access-xl522" (OuterVolumeSpecName: "kube-api-access-xl522") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "kube-api-access-xl522". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.640300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.641186 4707 scope.go:117] "RemoveContainer" containerID="10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.641514 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695\": container with ID starting with 10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695 not found: ID does not exist" containerID="10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.641538 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695"} err="failed to get container status \"10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695\": rpc error: code = NotFound desc = could not find container \"10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695\": container with ID starting with 10b02745006bee5d67ba385a0486a8c961f55a577ceb5d1997970237e6a14695 not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.641555 4707 scope.go:117] "RemoveContainer" containerID="a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.641790 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec\": container with ID starting with a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec not found: ID does not exist" containerID="a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.641843 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec"} err="failed to get container status \"a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec\": rpc error: code = NotFound desc = could not find container \"a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec\": container with ID starting with a70a45b11b65b5a880fae09a02d40c269e680823a89d90b4c7968bc225e695ec not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.641858 4707 scope.go:117] "RemoveContainer" containerID="9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.642084 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e\": container with ID starting with 9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e not found: ID does not exist" 
containerID="9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.642105 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e"} err="failed to get container status \"9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e\": rpc error: code = NotFound desc = could not find container \"9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e\": container with ID starting with 9cf29b78fa7036872f9972cec3f54bf9edc9f795626bc1539663f3b5ae966a4e not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.642117 4707 scope.go:117] "RemoveContainer" containerID="a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.642342 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858\": container with ID starting with a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858 not found: ID does not exist" containerID="a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.642367 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858"} err="failed to get container status \"a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858\": rpc error: code = NotFound desc = could not find container \"a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858\": container with ID starting with a6202584e65627ba827ed5c6eab4608cbe50ad66257d62028da0572f0ced1858 not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.642380 4707 scope.go:117] "RemoveContainer" containerID="5a6924b802768405402d6b86cb657dfd7334a39647b9f65f1a3ca7e21dd5e716" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.653724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.657908 4707 scope.go:117] "RemoveContainer" containerID="bdd54c33530224d64a1afe448ee39fd6390f3f98a84ea3afe2b57e8006166cbd" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.662711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data" (OuterVolumeSpecName: "config-data") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.664718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.666643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" (UID: "bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.672997 4707 scope.go:117] "RemoveContainer" containerID="eeacdf781fdc40d12fa1f766ca9a1e9ce44c84afa1271e3f9baffd10cf9d9255" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.695658 4707 scope.go:117] "RemoveContainer" containerID="8eb790e157aa673fadd935d740017ed4db8e762fcaad08bdb4716c7e456ed832" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.709383 4707 scope.go:117] "RemoveContainer" containerID="b4638cfb34b740af95a7bc03409e4baaf7acbfe3168548cc0715b59e19f6d136" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.724454 4707 scope.go:117] "RemoveContainer" containerID="cdead9cf6c5698fee36e61676ef626b53d0a70cdd040ba9566e1451f134de51e" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.737512 4707 scope.go:117] "RemoveContainer" containerID="464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738249 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738276 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738285 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738293 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738301 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl522\" (UniqueName: \"kubernetes.io/projected/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-kube-api-access-xl522\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738308 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.738315 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.754102 4707 scope.go:117] "RemoveContainer" containerID="464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed" Jan 21 15:24:52 crc kubenswrapper[4707]: E0121 15:24:52.756107 4707 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed\": container with ID starting with 464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed not found: ID does not exist" containerID="464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.756134 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed"} err="failed to get container status \"464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed\": rpc error: code = NotFound desc = could not find container \"464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed\": container with ID starting with 464df9929baf58c7aa7eb50d3b07e2c0f5fd054f5ad9290daa34242b2a0cbaed not found: ID does not exist" Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.875087 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.881825 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.887722 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.893095 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.899603 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.901801 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.905539 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7f68c764d7-98wg2"] Jan 21 15:24:52 crc kubenswrapper[4707]: I0121 15:24:52.908973 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7f68c764d7-98wg2"] Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.188935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" path="/var/lib/kubelet/pods/07daec18-3928-4cae-8fbf-4de0b30c690c/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.190055 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" path="/var/lib/kubelet/pods/11022c93-b7e5-4241-812c-eed16cc40ce2/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.190584 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" path="/var/lib/kubelet/pods/316bd962-7ce5-4797-8a99-79aacd85c3fb/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.191083 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" path="/var/lib/kubelet/pods/3aee120c-53c3-4a72-a81e-2dd10cc5cb9a/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.191711 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" path="/var/lib/kubelet/pods/63816993-7ef8-4f21-9a3c-18040cb843bd/volumes" Jan 21 15:24:53 crc 
kubenswrapper[4707]: I0121 15:24:53.192220 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b113905-777a-4919-96c0-704f8e6f4026" path="/var/lib/kubelet/pods/7b113905-777a-4919-96c0-704f8e6f4026/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.192760 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c1a990-5286-4274-a26a-8740ee506a12" path="/var/lib/kubelet/pods/90c1a990-5286-4274-a26a-8740ee506a12/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.194211 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" path="/var/lib/kubelet/pods/a7658f54-ea3a-4d2a-bec0-6a885e661223/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.194688 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86afe68-6c61-4a64-bdb3-2f489efbea33" path="/var/lib/kubelet/pods/c86afe68-6c61-4a64-bdb3-2f489efbea33/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.195008 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e220c0-ad06-4f2b-be03-642acdd3a23a" path="/var/lib/kubelet/pods/d8e220c0-ad06-4f2b-be03-642acdd3a23a/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.195730 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" path="/var/lib/kubelet/pods/ed7400b4-4f9e-4571-9156-5cf8f5e0df67/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.196328 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" path="/var/lib/kubelet/pods/edf40ef7-2790-4e9b-92dd-b92ab951383d/volumes" Jan 21 15:24:53 crc kubenswrapper[4707]: E0121 15:24:53.245370 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 15:24:53 crc kubenswrapper[4707]: E0121 15:24:53.245487 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0 podName:4543cf32-1b8e-4b5d-8189-2cef6a154e9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:53.745473444 +0000 UTC m=+1390.926989666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0") pod "dnsmasq-dnsmasq-7b69d68bf7-xmkr7" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b") : configmap "dns-swift-storage-0" not found Jan 21 15:24:53 crc kubenswrapper[4707]: E0121 15:24:53.345945 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:53 crc kubenswrapper[4707]: E0121 15:24:53.346539 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data podName:4fbbd17b-fed9-463c-9c25-435f84d0205e nodeName:}" failed. No retries permitted until 2026-01-21 15:25:01.346527361 +0000 UTC m=+1398.528043583 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data") pod "rabbitmq-cell1-server-0" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.524596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" event={"ID":"bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e","Type":"ContainerDied","Data":"ec7397528e2161c9bd690ee1124510074c0cbc3d4639f9045e8c178ecfaeedf4"} Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.524797 4707 scope.go:117] "RemoveContainer" containerID="90be3266ccbe5e96e44486114865333b48d5affe4c0b643d4de2af49390b91a5" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.524990 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-55976cc56-2tgtc" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.544298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-55976cc56-2tgtc"] Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.551029 4707 scope.go:117] "RemoveContainer" containerID="d959b8af71e65676b69991aa2187dfa38383cb263af1e369f5864d8fe2893795" Jan 21 15:24:53 crc kubenswrapper[4707]: I0121 15:24:53.552387 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-55976cc56-2tgtc"] Jan 21 15:24:53 crc kubenswrapper[4707]: E0121 15:24:53.757727 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 15:24:53 crc kubenswrapper[4707]: E0121 15:24:53.757787 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0 podName:4543cf32-1b8e-4b5d-8189-2cef6a154e9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:54.757773334 +0000 UTC m=+1391.939289556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0") pod "dnsmasq-dnsmasq-7b69d68bf7-xmkr7" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b") : configmap "dns-swift-storage-0" not found Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.268193 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.268268 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data podName:1bc7660e-aaa9-4865-a873-4db17c1db628 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:02.268242113 +0000 UTC m=+1399.449758335 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data") pod "rabbitmq-server-0" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628") : configmap "rabbitmq-config-data" not found Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.468068 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.469426 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.470416 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.470443 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9e516808-af42-455d-b94b-7a72e6439115" containerName="nova-cell0-conductor-conductor" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.556443 4707 generic.go:334] "Generic (PLEG): container finished" podID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerID="bbadb76a87972ac7320d59c53dfb93227092e217045548f169a94fd6a5c429ce" exitCode=0 Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.556524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"4fbbd17b-fed9-463c-9c25-435f84d0205e","Type":"ContainerDied","Data":"bbadb76a87972ac7320d59c53dfb93227092e217045548f169a94fd6a5c429ce"} Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.678560 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fbbd17b-fed9-463c-9c25-435f84d0205e-erlang-cookie-secret\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-plugins-conf\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-confd\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-tls\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-erlang-cookie\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fbbd17b-fed9-463c-9c25-435f84d0205e-pod-info\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-server-conf\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-plugins\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvjrs\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-kube-api-access-fvjrs\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" 
(UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data\") pod \"4fbbd17b-fed9-463c-9c25-435f84d0205e\" (UID: \"4fbbd17b-fed9-463c-9c25-435f84d0205e\") " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.776910 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.776953 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.776985 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0 podName:4543cf32-1b8e-4b5d-8189-2cef6a154e9b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:56.776973496 +0000 UTC m=+1393.958489718 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0") pod "dnsmasq-dnsmasq-7b69d68bf7-xmkr7" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b") : configmap "dns-swift-storage-0" not found Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.777332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.778492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.782345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4fbbd17b-fed9-463c-9c25-435f84d0205e-pod-info" (OuterVolumeSpecName: "pod-info") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.782368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-kube-api-access-fvjrs" (OuterVolumeSpecName: "kube-api-access-fvjrs") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "kube-api-access-fvjrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.782430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "persistence") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.782434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.782609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbbd17b-fed9-463c-9c25-435f84d0205e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.796739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data" (OuterVolumeSpecName: "config-data") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.814858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-server-conf" (OuterVolumeSpecName: "server-conf") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.833156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4fbbd17b-fed9-463c-9c25-435f84d0205e" (UID: "4fbbd17b-fed9-463c-9c25-435f84d0205e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.878855 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879024 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879088 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fbbd17b-fed9-463c-9c25-435f84d0205e-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879140 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879206 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879270 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvjrs\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-kube-api-access-fvjrs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879323 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fbbd17b-fed9-463c-9c25-435f84d0205e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879369 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fbbd17b-fed9-463c-9c25-435f84d0205e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879436 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.879493 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fbbd17b-fed9-463c-9c25-435f84d0205e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.889056 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.962039 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.965088 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.972774 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:24:54 crc kubenswrapper[4707]: E0121 15:24:54.972903 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerName="nova-scheduler-scheduler" Jan 21 15:24:54 crc kubenswrapper[4707]: I0121 15:24:54.981333 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.035448 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-erlang-cookie\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bc7660e-aaa9-4865-a873-4db17c1db628-erlang-cookie-secret\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx8kv\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-kube-api-access-qx8kv\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-confd\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bc7660e-aaa9-4865-a873-4db17c1db628-pod-info\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: 
\"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.183992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-plugins-conf\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.184008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.184039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-server-conf\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.184074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-plugins\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.184101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-tls\") pod \"1bc7660e-aaa9-4865-a873-4db17c1db628\" (UID: \"1bc7660e-aaa9-4865-a873-4db17c1db628\") " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.186629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.187085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.187290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.187401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.188262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc7660e-aaa9-4865-a873-4db17c1db628-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.188801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1bc7660e-aaa9-4865-a873-4db17c1db628-pod-info" (OuterVolumeSpecName: "pod-info") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.189328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-kube-api-access-qx8kv" (OuterVolumeSpecName: "kube-api-access-qx8kv") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "kube-api-access-qx8kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.189417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "persistence") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.191310 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" path="/var/lib/kubelet/pods/bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e/volumes" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.203468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data" (OuterVolumeSpecName: "config-data") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.215020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-server-conf" (OuterVolumeSpecName: "server-conf") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.235016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1bc7660e-aaa9-4865-a873-4db17c1db628" (UID: "1bc7660e-aaa9-4865-a873-4db17c1db628"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286626 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286666 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286679 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286692 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286704 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bc7660e-aaa9-4865-a873-4db17c1db628-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286714 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx8kv\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-kube-api-access-qx8kv\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286723 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bc7660e-aaa9-4865-a873-4db17c1db628-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286732 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286740 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bc7660e-aaa9-4865-a873-4db17c1db628-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286751 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bc7660e-aaa9-4865-a873-4db17c1db628-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.286784 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.299505 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.388207 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.567714 4707 
generic.go:334] "Generic (PLEG): container finished" podID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerID="a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8" exitCode=0 Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.567761 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.567842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1bc7660e-aaa9-4865-a873-4db17c1db628","Type":"ContainerDied","Data":"a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8"} Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.567933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1bc7660e-aaa9-4865-a873-4db17c1db628","Type":"ContainerDied","Data":"ecbce59d03cb46ca5d6d98464312966ab1a453320f74319b63602870b531911d"} Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.567960 4707 scope.go:117] "RemoveContainer" containerID="a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.571033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"4fbbd17b-fed9-463c-9c25-435f84d0205e","Type":"ContainerDied","Data":"598e556105658c465cda9e50d32a3202a18a9737a6c1bcdb2fbfad62f3d1f31b"} Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.571114 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.599146 4707 scope.go:117] "RemoveContainer" containerID="f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.599276 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.604127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.613110 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.617744 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.618348 4707 scope.go:117] "RemoveContainer" containerID="a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8" Jan 21 15:24:55 crc kubenswrapper[4707]: E0121 15:24:55.618621 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8\": container with ID starting with a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8 not found: ID does not exist" containerID="a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.618650 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8"} err="failed to get container status \"a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8\": rpc error: code = 
NotFound desc = could not find container \"a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8\": container with ID starting with a5351dc023c4570855b72c4a86e7ce817c8b8a1d9bf7a547a2bd3260d706e8a8 not found: ID does not exist" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.618670 4707 scope.go:117] "RemoveContainer" containerID="f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b" Jan 21 15:24:55 crc kubenswrapper[4707]: E0121 15:24:55.618951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b\": container with ID starting with f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b not found: ID does not exist" containerID="f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.618991 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b"} err="failed to get container status \"f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b\": rpc error: code = NotFound desc = could not find container \"f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b\": container with ID starting with f96c639ee81e20092c9e4b28252908f18d405e1f7bcafc3f2655077788a5c44b not found: ID does not exist" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.619023 4707 scope.go:117] "RemoveContainer" containerID="bbadb76a87972ac7320d59c53dfb93227092e217045548f169a94fd6a5c429ce" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.638251 4707 scope.go:117] "RemoveContainer" containerID="f96114fc5af659c21483e07417feb1e1da0f1dbf8acbb58d75419c23defc0836" Jan 21 15:24:55 crc kubenswrapper[4707]: I0121 15:24:55.991603 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.098786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-config-data\") pod \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.098869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-combined-ca-bundle\") pod \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.099012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s4hp\" (UniqueName: \"kubernetes.io/projected/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-kube-api-access-7s4hp\") pod \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\" (UID: \"9606e288-80ca-48db-a0a6-8f0f4ec67eaa\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.102486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-kube-api-access-7s4hp" (OuterVolumeSpecName: "kube-api-access-7s4hp") pod "9606e288-80ca-48db-a0a6-8f0f4ec67eaa" (UID: "9606e288-80ca-48db-a0a6-8f0f4ec67eaa"). InnerVolumeSpecName "kube-api-access-7s4hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.116726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-config-data" (OuterVolumeSpecName: "config-data") pod "9606e288-80ca-48db-a0a6-8f0f4ec67eaa" (UID: "9606e288-80ca-48db-a0a6-8f0f4ec67eaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.117496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9606e288-80ca-48db-a0a6-8f0f4ec67eaa" (UID: "9606e288-80ca-48db-a0a6-8f0f4ec67eaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.195375 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.200271 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s4hp\" (UniqueName: \"kubernetes.io/projected/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-kube-api-access-7s4hp\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.200293 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.200302 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9606e288-80ca-48db-a0a6-8f0f4ec67eaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.307169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-combined-ca-bundle\") pod \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.307311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnqzp\" (UniqueName: \"kubernetes.io/projected/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-kube-api-access-nnqzp\") pod \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.307345 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data\") pod \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.307378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-logs\") pod \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.307444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data-custom\") pod \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\" (UID: \"ee0ec627-76fd-4938-9a2b-db220a1ac5d5\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.314915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-kube-api-access-nnqzp" (OuterVolumeSpecName: "kube-api-access-nnqzp") pod "ee0ec627-76fd-4938-9a2b-db220a1ac5d5" (UID: "ee0ec627-76fd-4938-9a2b-db220a1ac5d5"). InnerVolumeSpecName "kube-api-access-nnqzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.318163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-logs" (OuterVolumeSpecName: "logs") pod "ee0ec627-76fd-4938-9a2b-db220a1ac5d5" (UID: "ee0ec627-76fd-4938-9a2b-db220a1ac5d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.320890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee0ec627-76fd-4938-9a2b-db220a1ac5d5" (UID: "ee0ec627-76fd-4938-9a2b-db220a1ac5d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.366950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data" (OuterVolumeSpecName: "config-data") pod "ee0ec627-76fd-4938-9a2b-db220a1ac5d5" (UID: "ee0ec627-76fd-4938-9a2b-db220a1ac5d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.371973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee0ec627-76fd-4938-9a2b-db220a1ac5d5" (UID: "ee0ec627-76fd-4938-9a2b-db220a1ac5d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.412228 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnqzp\" (UniqueName: \"kubernetes.io/projected/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-kube-api-access-nnqzp\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.412270 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.412280 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.412288 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.412295 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ec627-76fd-4938-9a2b-db220a1ac5d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.426531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.513109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle\") pod \"9e516808-af42-455d-b94b-7a72e6439115\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.513218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-config-data\") pod \"9e516808-af42-455d-b94b-7a72e6439115\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.513287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85p7z\" (UniqueName: \"kubernetes.io/projected/9e516808-af42-455d-b94b-7a72e6439115-kube-api-access-85p7z\") pod \"9e516808-af42-455d-b94b-7a72e6439115\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.516110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e516808-af42-455d-b94b-7a72e6439115-kube-api-access-85p7z" (OuterVolumeSpecName: "kube-api-access-85p7z") pod "9e516808-af42-455d-b94b-7a72e6439115" (UID: "9e516808-af42-455d-b94b-7a72e6439115"). InnerVolumeSpecName "kube-api-access-85p7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.527060 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle podName:9e516808-af42-455d-b94b-7a72e6439115 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:57.027020037 +0000 UTC m=+1394.208536259 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle") pod "9e516808-af42-455d-b94b-7a72e6439115" (UID: "9e516808-af42-455d-b94b-7a72e6439115") : error deleting /var/lib/kubelet/pods/9e516808-af42-455d-b94b-7a72e6439115/volume-subpaths: remove /var/lib/kubelet/pods/9e516808-af42-455d-b94b-7a72e6439115/volume-subpaths: no such file or directory Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.528787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-config-data" (OuterVolumeSpecName: "config-data") pod "9e516808-af42-455d-b94b-7a72e6439115" (UID: "9e516808-af42-455d-b94b-7a72e6439115"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.565611 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.584274 4707 generic.go:334] "Generic (PLEG): container finished" podID="9e516808-af42-455d-b94b-7a72e6439115" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" exitCode=0 Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.584319 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.584335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9e516808-af42-455d-b94b-7a72e6439115","Type":"ContainerDied","Data":"f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.584363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9e516808-af42-455d-b94b-7a72e6439115","Type":"ContainerDied","Data":"68487121fe82059f865f78f2184415aaaa8ac74f1b7747a58152abb1b509dcaf"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.584379 4707 scope.go:117] "RemoveContainer" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.587615 4707 generic.go:334] "Generic (PLEG): container finished" podID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" exitCode=0 Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.587735 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.587905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9606e288-80ca-48db-a0a6-8f0f4ec67eaa","Type":"ContainerDied","Data":"eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.587937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9606e288-80ca-48db-a0a6-8f0f4ec67eaa","Type":"ContainerDied","Data":"7c3d086ce662659164b45b8e96e8bba61663245fff6912cde3428dbf310367af"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.589108 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" exitCode=0 Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.589166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1addb55-dbd5-4f5c-957b-592c89033dab","Type":"ContainerDied","Data":"192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.589195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1addb55-dbd5-4f5c-957b-592c89033dab","Type":"ContainerDied","Data":"7e5676fce46cf35e84ce7792b9247621f82d5c13788fa5ea074f6841a2a9a833"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.589244 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.590479 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerID="1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f" exitCode=0 Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.590502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" event={"ID":"ee0ec627-76fd-4938-9a2b-db220a1ac5d5","Type":"ContainerDied","Data":"1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.590519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" event={"ID":"ee0ec627-76fd-4938-9a2b-db220a1ac5d5","Type":"ContainerDied","Data":"69af5616e1526ccaf0294310ea014397d1c00ffb9cde6dfbef8e59f0e4eafc9c"} Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.590578 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.606143 4707 scope.go:117] "RemoveContainer" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.606891 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965\": container with ID starting with f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965 not found: ID does not exist" containerID="f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.606922 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965"} err="failed to get container status \"f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965\": rpc error: code = NotFound desc = could not find container \"f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965\": container with ID starting with f61f10460c9b9a94b37d14b54fb334eaeb5da5b152f65846299cdaa17e8ab965 not found: ID does not exist" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.606938 4707 scope.go:117] "RemoveContainer" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.614886 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85p7z\" (UniqueName: \"kubernetes.io/projected/9e516808-af42-455d-b94b-7a72e6439115-kube-api-access-85p7z\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.614978 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.629300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.631610 4707 scope.go:117] "RemoveContainer" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.632069 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320\": container with ID starting with eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320 not found: ID does not exist" containerID="eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.632100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320"} err="failed to get container status \"eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320\": rpc error: code = NotFound desc = could not find container \"eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320\": container with ID starting with eceb032f9f7338b006c275c707adeab04bc193d26a5ace8f4694135424515320 not found: ID does not exist" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.632120 4707 scope.go:117] "RemoveContainer" 
containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.633980 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.638116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd"] Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.642020 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b9b58ff55-8q9qd"] Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.646865 4707 scope.go:117] "RemoveContainer" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.647199 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec\": container with ID starting with 192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec not found: ID does not exist" containerID="192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.647235 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec"} err="failed to get container status \"192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec\": rpc error: code = NotFound desc = could not find container \"192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec\": container with ID starting with 192f67111f4c267bdea149c92788fc59040b648b32589c4b216fac6bc86a0fec not found: ID does not exist" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.647269 4707 scope.go:117] "RemoveContainer" containerID="1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.661307 4707 scope.go:117] "RemoveContainer" containerID="a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.680329 4707 scope.go:117] "RemoveContainer" containerID="1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.680681 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f\": container with ID starting with 1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f not found: ID does not exist" containerID="1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.680713 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f"} err="failed to get container status \"1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f\": rpc error: code = NotFound desc = could not find container \"1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f\": container with ID starting with 1b1bd018ecfa9d550854e6f42621682d5d8cdab27ad589a8515e060cb197d60f not found: ID does not exist" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.680736 4707 scope.go:117] "RemoveContainer" 
containerID="a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.680982 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3\": container with ID starting with a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3 not found: ID does not exist" containerID="a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.681056 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3"} err="failed to get container status \"a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3\": rpc error: code = NotFound desc = could not find container \"a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3\": container with ID starting with a09793d3163bfca67622d7f46f4274199e0a398a5ed2a8c714f4f9417b77def3 not found: ID does not exist" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.716307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stbxl\" (UniqueName: \"kubernetes.io/projected/c1addb55-dbd5-4f5c-957b-592c89033dab-kube-api-access-stbxl\") pod \"c1addb55-dbd5-4f5c-957b-592c89033dab\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.716343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-config-data\") pod \"c1addb55-dbd5-4f5c-957b-592c89033dab\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.716371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-combined-ca-bundle\") pod \"c1addb55-dbd5-4f5c-957b-592c89033dab\" (UID: \"c1addb55-dbd5-4f5c-957b-592c89033dab\") " Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.718719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1addb55-dbd5-4f5c-957b-592c89033dab-kube-api-access-stbxl" (OuterVolumeSpecName: "kube-api-access-stbxl") pod "c1addb55-dbd5-4f5c-957b-592c89033dab" (UID: "c1addb55-dbd5-4f5c-957b-592c89033dab"). InnerVolumeSpecName "kube-api-access-stbxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.730441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-config-data" (OuterVolumeSpecName: "config-data") pod "c1addb55-dbd5-4f5c-957b-592c89033dab" (UID: "c1addb55-dbd5-4f5c-957b-592c89033dab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.730778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1addb55-dbd5-4f5c-957b-592c89033dab" (UID: "c1addb55-dbd5-4f5c-957b-592c89033dab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.818399 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stbxl\" (UniqueName: \"kubernetes.io/projected/c1addb55-dbd5-4f5c-957b-592c89033dab-kube-api-access-stbxl\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.818426 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.818436 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1addb55-dbd5-4f5c-957b-592c89033dab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.818483 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 15:24:56 crc kubenswrapper[4707]: E0121 15:24:56.818517 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0 podName:4543cf32-1b8e-4b5d-8189-2cef6a154e9b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:00.818505978 +0000 UTC m=+1398.000022201 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0") pod "dnsmasq-dnsmasq-7b69d68bf7-xmkr7" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b") : configmap "dns-swift-storage-0" not found Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.912013 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:24:56 crc kubenswrapper[4707]: I0121 15:24:56.916018 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.122349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle\") pod \"9e516808-af42-455d-b94b-7a72e6439115\" (UID: \"9e516808-af42-455d-b94b-7a72e6439115\") " Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.124769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e516808-af42-455d-b94b-7a72e6439115" (UID: "9e516808-af42-455d-b94b-7a72e6439115"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.189452 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" path="/var/lib/kubelet/pods/1bc7660e-aaa9-4865-a873-4db17c1db628/volumes" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.190149 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" path="/var/lib/kubelet/pods/4fbbd17b-fed9-463c-9c25-435f84d0205e/volumes" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.191034 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" path="/var/lib/kubelet/pods/9606e288-80ca-48db-a0a6-8f0f4ec67eaa/volumes" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.191466 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" path="/var/lib/kubelet/pods/c1addb55-dbd5-4f5c-957b-592c89033dab/volumes" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.191938 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" path="/var/lib/kubelet/pods/ee0ec627-76fd-4938-9a2b-db220a1ac5d5/volumes" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.208264 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.211455 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.224209 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e516808-af42-455d-b94b-7a72e6439115-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.440232 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.471007 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7"] Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.471321 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerName="dnsmasq-dns" containerID="cri-o://302ea15ef3aee89402cad8d53e32bed232af74eea08c703bbb3d5a07199d6fc4" gracePeriod=10 Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.601675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" event={"ID":"4543cf32-1b8e-4b5d-8189-2cef6a154e9b","Type":"ContainerDied","Data":"302ea15ef3aee89402cad8d53e32bed232af74eea08c703bbb3d5a07199d6fc4"} Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.601658 4707 generic.go:334] "Generic (PLEG): container finished" podID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerID="302ea15ef3aee89402cad8d53e32bed232af74eea08c703bbb3d5a07199d6fc4" exitCode=0 Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.823073 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.945800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-config\") pod \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.946000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvpxl\" (UniqueName: \"kubernetes.io/projected/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-kube-api-access-pvpxl\") pod \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.946022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dnsmasq-svc\") pod \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.946047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0\") pod \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\" (UID: \"4543cf32-1b8e-4b5d-8189-2cef6a154e9b\") " Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.950335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-kube-api-access-pvpxl" (OuterVolumeSpecName: "kube-api-access-pvpxl") pod "4543cf32-1b8e-4b5d-8189-2cef6a154e9b" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b"). InnerVolumeSpecName "kube-api-access-pvpxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.969988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "4543cf32-1b8e-4b5d-8189-2cef6a154e9b" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.970092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4543cf32-1b8e-4b5d-8189-2cef6a154e9b" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:57 crc kubenswrapper[4707]: I0121 15:24:57.978455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-config" (OuterVolumeSpecName: "config") pod "4543cf32-1b8e-4b5d-8189-2cef6a154e9b" (UID: "4543cf32-1b8e-4b5d-8189-2cef6a154e9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.048567 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvpxl\" (UniqueName: \"kubernetes.io/projected/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-kube-api-access-pvpxl\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.048993 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.049057 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.049109 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4543cf32-1b8e-4b5d-8189-2cef6a154e9b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.609319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" event={"ID":"4543cf32-1b8e-4b5d-8189-2cef6a154e9b","Type":"ContainerDied","Data":"229d40c510ef1bc6a30ce808f2d81467dcae253935998ca27a2f9d7c2e6acd52"} Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.610019 4707 scope.go:117] "RemoveContainer" containerID="302ea15ef3aee89402cad8d53e32bed232af74eea08c703bbb3d5a07199d6fc4" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.610166 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.629874 4707 scope.go:117] "RemoveContainer" containerID="5f28403aebcb12ef2e6138d7601af2b5ea7ec05817ec3373881a8597876e713e" Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.646575 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7"] Jan 21 15:24:58 crc kubenswrapper[4707]: I0121 15:24:58.649895 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7b69d68bf7-xmkr7"] Jan 21 15:24:59 crc kubenswrapper[4707]: I0121 15:24:59.189578 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" path="/var/lib/kubelet/pods/4543cf32-1b8e-4b5d-8189-2cef6a154e9b/volumes" Jan 21 15:24:59 crc kubenswrapper[4707]: I0121 15:24:59.190348 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e516808-af42-455d-b94b-7a72e6439115" path="/var/lib/kubelet/pods/9e516808-af42-455d-b94b-7a72e6439115/volumes" Jan 21 15:25:00 crc kubenswrapper[4707]: I0121 15:25:00.169452 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.248:9696/\": dial tcp 10.217.0.248:9696: connect: connection refused" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.134800 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-combined-ca-bundle\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-public-tls-certs\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-ovndb-tls-certs\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbd8b\" (UniqueName: \"kubernetes.io/projected/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-kube-api-access-lbd8b\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249541 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-httpd-config\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-internal-tls-certs\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.249623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-config\") pod \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\" (UID: \"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e\") " Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.253276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-kube-api-access-lbd8b" (OuterVolumeSpecName: "kube-api-access-lbd8b") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "kube-api-access-lbd8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.253307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.275375 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.276037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.276303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.276458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-config" (OuterVolumeSpecName: "config") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.285437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" (UID: "dc5eeb45-87cc-4837-8d10-1a6e85a78b1e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351286 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351445 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351576 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbd8b\" (UniqueName: \"kubernetes.io/projected/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-kube-api-access-lbd8b\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351643 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351706 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351757 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.351835 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.669608 4707 generic.go:334] "Generic (PLEG): container finished" podID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerID="9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686" exitCode=0 Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.669658 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" event={"ID":"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e","Type":"ContainerDied","Data":"9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686"} Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.669681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" event={"ID":"dc5eeb45-87cc-4837-8d10-1a6e85a78b1e","Type":"ContainerDied","Data":"dca8c21676d2517dfc849b4ad0a7ebb3326a9fcac4ed08cb9058357a9dcf254f"} Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.669681 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-855bf96656-swjds" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.669696 4707 scope.go:117] "RemoveContainer" containerID="41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.685331 4707 scope.go:117] "RemoveContainer" containerID="9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.691617 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-855bf96656-swjds"] Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.695846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-855bf96656-swjds"] Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.700506 4707 scope.go:117] "RemoveContainer" containerID="41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51" Jan 21 15:25:06 crc kubenswrapper[4707]: E0121 15:25:06.700858 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51\": container with ID starting with 41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51 not found: ID does not exist" containerID="41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.700889 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51"} err="failed to get container status \"41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51\": rpc error: code = NotFound desc = could not find container \"41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51\": container with ID starting with 41e7917402d30a261f4d9c5f7dd025ff3b677739892d501c36e84b301dee0f51 not found: ID does not exist" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.700908 4707 scope.go:117] "RemoveContainer" containerID="9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686" Jan 21 15:25:06 crc kubenswrapper[4707]: E0121 15:25:06.701263 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686\": container with ID starting with 9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686 not found: ID does not exist" containerID="9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686" Jan 21 15:25:06 crc kubenswrapper[4707]: I0121 15:25:06.701304 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686"} err="failed to get container status \"9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686\": rpc error: code = NotFound desc = could not find container \"9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686\": container with ID starting with 9d38f5de06440a695b777da9be657be80230638e6b4dced6ddbcf25531ddb686 not found: ID does not exist" Jan 21 15:25:07 crc kubenswrapper[4707]: I0121 15:25:07.188295 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" path="/var/lib/kubelet/pods/dc5eeb45-87cc-4837-8d10-1a6e85a78b1e/volumes" Jan 21 15:25:09 crc kubenswrapper[4707]: I0121 
15:25:09.945593 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:25:09 crc kubenswrapper[4707]: I0121 15:25:09.945782 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.266256 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.381975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c9958390-b164-4ae8-b834-d1df8a561b95\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.382293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-cache\") pod \"c9958390-b164-4ae8-b834-d1df8a561b95\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.382360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7zxq\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-kube-api-access-j7zxq\") pod \"c9958390-b164-4ae8-b834-d1df8a561b95\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.382376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") pod \"c9958390-b164-4ae8-b834-d1df8a561b95\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.382415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-lock\") pod \"c9958390-b164-4ae8-b834-d1df8a561b95\" (UID: \"c9958390-b164-4ae8-b834-d1df8a561b95\") " Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.383113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-cache" (OuterVolumeSpecName: "cache") pod "c9958390-b164-4ae8-b834-d1df8a561b95" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.383224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-lock" (OuterVolumeSpecName: "lock") pod "c9958390-b164-4ae8-b834-d1df8a561b95" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.387082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-kube-api-access-j7zxq" (OuterVolumeSpecName: "kube-api-access-j7zxq") pod "c9958390-b164-4ae8-b834-d1df8a561b95" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95"). InnerVolumeSpecName "kube-api-access-j7zxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.387358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c9958390-b164-4ae8-b834-d1df8a561b95" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.387361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "c9958390-b164-4ae8-b834-d1df8a561b95" (UID: "c9958390-b164-4ae8-b834-d1df8a561b95"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.484517 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.484543 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.484553 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7zxq\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-kube-api-access-j7zxq\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.484562 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9958390-b164-4ae8-b834-d1df8a561b95-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.484569 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9958390-b164-4ae8-b834-d1df8a561b95-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.494346 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.585680 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.735644 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9958390-b164-4ae8-b834-d1df8a561b95" containerID="367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222" exitCode=137 Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.735680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222"} Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.735704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"c9958390-b164-4ae8-b834-d1df8a561b95","Type":"ContainerDied","Data":"13f4b075478eda21b5827d16a44012dda1eeca3faa6e611a7cae524d0ec4ca5b"} Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.735721 4707 scope.go:117] "RemoveContainer" containerID="367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.735723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.749666 4707 scope.go:117] "RemoveContainer" containerID="eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.758143 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.763300 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.766312 4707 scope.go:117] "RemoveContainer" containerID="e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.784368 4707 scope.go:117] "RemoveContainer" containerID="944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.798442 4707 scope.go:117] "RemoveContainer" containerID="f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.814747 4707 scope.go:117] "RemoveContainer" containerID="15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.828515 4707 scope.go:117] "RemoveContainer" containerID="1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.840720 4707 scope.go:117] "RemoveContainer" containerID="a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.851600 4707 scope.go:117] "RemoveContainer" containerID="1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.863771 4707 scope.go:117] "RemoveContainer" containerID="c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.875679 4707 scope.go:117] "RemoveContainer" containerID="2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.886949 4707 scope.go:117] "RemoveContainer" containerID="5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.898492 4707 scope.go:117] "RemoveContainer" containerID="8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.911242 4707 scope.go:117] "RemoveContainer" containerID="9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.923921 4707 
scope.go:117] "RemoveContainer" containerID="cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.935087 4707 scope.go:117] "RemoveContainer" containerID="367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.935488 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222\": container with ID starting with 367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222 not found: ID does not exist" containerID="367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.935516 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222"} err="failed to get container status \"367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222\": rpc error: code = NotFound desc = could not find container \"367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222\": container with ID starting with 367c209052096d8cc5cb4f728719c2902da4d3ae66c00614ef826d320c5c6222 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.935535 4707 scope.go:117] "RemoveContainer" containerID="eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.935799 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3\": container with ID starting with eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3 not found: ID does not exist" containerID="eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.935839 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3"} err="failed to get container status \"eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3\": rpc error: code = NotFound desc = could not find container \"eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3\": container with ID starting with eced0d2218a83913b5b3123f43867f53eb97740de7d97982dd46e7d38a3dd8f3 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.935859 4707 scope.go:117] "RemoveContainer" containerID="e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.936121 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87\": container with ID starting with e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87 not found: ID does not exist" containerID="e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936143 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87"} err="failed to get container status 
\"e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87\": rpc error: code = NotFound desc = could not find container \"e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87\": container with ID starting with e590d431ee27f8a5c094476c90b41ad90990ccbf8baa9132e5a9bb3938a2cb87 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936154 4707 scope.go:117] "RemoveContainer" containerID="944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.936430 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759\": container with ID starting with 944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759 not found: ID does not exist" containerID="944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936454 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759"} err="failed to get container status \"944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759\": rpc error: code = NotFound desc = could not find container \"944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759\": container with ID starting with 944c4b9cc7d6d4de9b66619915501bd45569474bbfda614403ee849ba036d759 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936465 4707 scope.go:117] "RemoveContainer" containerID="f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.936660 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229\": container with ID starting with f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229 not found: ID does not exist" containerID="f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936679 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229"} err="failed to get container status \"f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229\": rpc error: code = NotFound desc = could not find container \"f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229\": container with ID starting with f2c873717a98214b6c559379f72a055cf30df874f7d388b8f8d16fa0a6a92229 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936689 4707 scope.go:117] "RemoveContainer" containerID="15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.936957 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072\": container with ID starting with 15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072 not found: ID does not exist" containerID="15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936978 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072"} err="failed to get container status \"15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072\": rpc error: code = NotFound desc = could not find container \"15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072\": container with ID starting with 15e8967e97e3acb14aeadcaba2fa81783c57a6ef765643de4fea5e210960e072 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.936990 4707 scope.go:117] "RemoveContainer" containerID="1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.937182 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a\": container with ID starting with 1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a not found: ID does not exist" containerID="1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937201 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a"} err="failed to get container status \"1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a\": rpc error: code = NotFound desc = could not find container \"1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a\": container with ID starting with 1f702c731228daf13ce4b05307bf2b37f4aa5c5356fa7deea60142f0cf3f825a not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937212 4707 scope.go:117] "RemoveContainer" containerID="a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.937399 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877\": container with ID starting with a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877 not found: ID does not exist" containerID="a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937417 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877"} err="failed to get container status \"a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877\": rpc error: code = NotFound desc = could not find container \"a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877\": container with ID starting with a7c3a946df22cad983d00483e1374bae57d35d5653fd2ebbf313d8366f715877 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937428 4707 scope.go:117] "RemoveContainer" containerID="1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.937686 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa\": container with ID starting with 1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa not found: ID does not exist" 
containerID="1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937703 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa"} err="failed to get container status \"1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa\": rpc error: code = NotFound desc = could not find container \"1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa\": container with ID starting with 1f1f0fcc3ac019896339b5fb63871b7dc51444ba01b335ddf096d576724ea2fa not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937715 4707 scope.go:117] "RemoveContainer" containerID="c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.937920 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a\": container with ID starting with c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a not found: ID does not exist" containerID="c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937938 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a"} err="failed to get container status \"c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a\": rpc error: code = NotFound desc = could not find container \"c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a\": container with ID starting with c1aff70275c167fa76b4b71afd47dd63953f3fddd53d0eeb53b09c7c5534a68a not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.937949 4707 scope.go:117] "RemoveContainer" containerID="2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.938138 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39\": container with ID starting with 2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39 not found: ID does not exist" containerID="2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.938156 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39"} err="failed to get container status \"2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39\": rpc error: code = NotFound desc = could not find container \"2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39\": container with ID starting with 2c0d00d9d59a34f557feba0d8020eeea6b627ff829fad79bac322c9a42246e39 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.938169 4707 scope.go:117] "RemoveContainer" containerID="5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.938458 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8\": container with ID starting with 5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8 not found: ID does not exist" containerID="5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.938472 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8"} err="failed to get container status \"5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8\": rpc error: code = NotFound desc = could not find container \"5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8\": container with ID starting with 5916ff9ea3bd141bbcffabd589c7586f701a47dcd9bb2f4ae7cb5fcc0b5b46a8 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.938483 4707 scope.go:117] "RemoveContainer" containerID="8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.938688 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17\": container with ID starting with 8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17 not found: ID does not exist" containerID="8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.938705 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17"} err="failed to get container status \"8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17\": rpc error: code = NotFound desc = could not find container \"8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17\": container with ID starting with 8247175d54797937bc0939b92b7f1d454ab5950c219a43a897092dd34f8b3d17 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.938717 4707 scope.go:117] "RemoveContainer" containerID="9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663" Jan 21 15:25:17 crc kubenswrapper[4707]: E0121 15:25:17.938986 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663\": container with ID starting with 9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663 not found: ID does not exist" containerID="9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.939005 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663"} err="failed to get container status \"9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663\": rpc error: code = NotFound desc = could not find container \"9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663\": container with ID starting with 9b25c68732b0ab773a995ae14bb66a20785422bd52fa4d6c978d878bc992d663 not found: ID does not exist" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.939015 4707 scope.go:117] "RemoveContainer" containerID="cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b" Jan 21 15:25:17 crc 
kubenswrapper[4707]: E0121 15:25:17.939210 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b\": container with ID starting with cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b not found: ID does not exist" containerID="cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b" Jan 21 15:25:17 crc kubenswrapper[4707]: I0121 15:25:17.939227 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b"} err="failed to get container status \"cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b\": rpc error: code = NotFound desc = could not find container \"cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b\": container with ID starting with cec328de3738b8689b11099e2034c3d4e5fcb9a61ab5f6c149cfce163cfe098b not found: ID does not exist" Jan 21 15:25:19 crc kubenswrapper[4707]: I0121 15:25:19.188432 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" path="/var/lib/kubelet/pods/c9958390-b164-4ae8-b834-d1df8a561b95/volumes" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.126237 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qjrxp"] Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.129833 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qjrxp"] Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.189654 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2def2a51-5e8a-44c8-a7a0-99652042de6c" path="/var/lib/kubelet/pods/2def2a51-5e8a-44c8-a7a0-99652042de6c/volumes" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231599 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-rpjvx"] Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231832 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="probe" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231847 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="probe" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231863 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerName="setup-container" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerName="setup-container" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231880 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231886 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231894 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" 
containerName="placement-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231907 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="mysql-bootstrap" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231912 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="mysql-bootstrap" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231919 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231925 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231933 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231940 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-api" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231958 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231963 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231975 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231984 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-expirer" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.231989 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-expirer" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.231996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerName="init" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232002 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerName="init" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232009 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerName="dnsmasq-dns" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232014 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerName="dnsmasq-dns" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232023 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-updater" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232029 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" 
containerName="container-updater" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232037 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232042 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="openstack-network-exporter" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232055 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="openstack-network-exporter" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232061 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-updater" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232066 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-updater" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232074 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerName="rabbitmq" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232079 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerName="rabbitmq" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232085 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e516808-af42-455d-b94b-7a72e6439115" containerName="nova-cell0-conductor-conductor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e516808-af42-455d-b94b-7a72e6439115" containerName="nova-cell0-conductor-conductor" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232099 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232105 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232115 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-server" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232124 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="sg-core" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232128 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="sg-core" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232137 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232142 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-server" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b113905-777a-4919-96c0-704f8e6f4026" containerName="mariadb-account-create-update" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232152 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b113905-777a-4919-96c0-704f8e6f4026" containerName="mariadb-account-create-update" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232160 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232165 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232172 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232177 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232185 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerName="rabbitmq" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerName="rabbitmq" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232200 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="cinder-scheduler" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232205 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="cinder-scheduler" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232212 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232217 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232224 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-metadata" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232238 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-metadata" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232247 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="galera" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232252 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="galera" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232258 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117c40c7-7452-42e7-8a5a-0a1123ac030e" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232262 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="117c40c7-7452-42e7-8a5a-0a1123ac030e" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232270 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="swift-recon-cron" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232276 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="swift-recon-cron" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232283 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232288 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-api" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232296 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-notification-agent" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232301 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-notification-agent" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerName="setup-container" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232315 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerName="setup-container" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232321 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-server" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232335 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b113905-777a-4919-96c0-704f8e6f4026" containerName="mariadb-account-create-update" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232340 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b113905-777a-4919-96c0-704f8e6f4026" containerName="mariadb-account-create-update" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232352 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232359 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232363 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232373 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232390 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232399 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09bf8b72-b690-4084-854a-0245cd1a61b2" containerName="memcached" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232403 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bf8b72-b690-4084-854a-0245cd1a61b2" containerName="memcached" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232409 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232419 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="proxy-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232423 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="proxy-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232429 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232443 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232448 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232454 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232459 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232466 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232479 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232484 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-api" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232491 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232496 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232504 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-server" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232516 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerName="galera" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232521 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerName="galera" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232527 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f8fb41-6a21-4135-ace9-7b9b36eaaac5" containerName="kube-state-metrics" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232531 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f8fb41-6a21-4135-ace9-7b9b36eaaac5" containerName="kube-state-metrics" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-reaper" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232543 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-reaper" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232551 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="ovn-northd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232556 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="ovn-northd" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" containerName="keystone-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232569 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" containerName="keystone-api" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232580 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker-log" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232588 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" containerName="nova-cell1-conductor-conductor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232593 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" containerName="nova-cell1-conductor-conductor" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232600 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="rsync" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232605 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="rsync" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232612 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232617 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232625 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232630 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232638 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerName="nova-scheduler-scheduler" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232642 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerName="nova-scheduler-scheduler" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerName="mysql-bootstrap" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerName="mysql-bootstrap" Jan 21 15:25:27 crc kubenswrapper[4707]: E0121 15:25:27.232660 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-central-agent" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232666 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-central-agent" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232762 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232771 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-metadata" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232786 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: 
I0121 15:25:27.232791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232798 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="rsync" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232824 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1addb55-dbd5-4f5c-957b-592c89033dab" containerName="nova-scheduler-scheduler" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232833 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="ovn-northd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232841 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232850 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="cinder-scheduler" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-updater" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232867 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232877 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232883 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc7660e-aaa9-4865-a873-4db17c1db628" containerName="rabbitmq" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232890 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="container-updater" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232898 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b113905-777a-4919-96c0-704f8e6f4026" containerName="mariadb-account-create-update" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232904 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4d90d7-a400-45ab-8c5b-7e68ced57657" containerName="galera" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232912 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-notification-agent" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232918 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09bf8b72-b690-4084-854a-0245cd1a61b2" containerName="memcached" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232925 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11022c93-b7e5-4241-812c-eed16cc40ce2" containerName="openstack-network-exporter" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232931 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-replicator" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232938 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-expirer" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232944 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c937d1c-d74a-4296-a78d-c88deef6b011" containerName="probe" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232950 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4543cf32-1b8e-4b5d-8189-2cef6a154e9b" containerName="dnsmasq-dns" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232958 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c1a990-5286-4274-a26a-8740ee506a12" containerName="cinder-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232964 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ff5dfe-af7e-4827-949e-926b7a41731b" containerName="proxy-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232972 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232980 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7400b4-4f9e-4571-9156-5cf8f5e0df67" containerName="nova-metadata-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232986 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-reaper" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232993 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.232999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e516808-af42-455d-b94b-7a72e6439115" containerName="nova-cell0-conductor-conductor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233007 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233012 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="sg-core" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233018 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233026 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9190471-72fb-4298-85b6-bf07b32b6de5" containerName="galera" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233034 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="account-auditor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233040 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="swift-recon-cron" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233045 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 
15:25:27.233050 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5eeb45-87cc-4837-8d10-1a6e85a78b1e" containerName="neutron-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233056 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf40ef7-2790-4e9b-92dd-b92ab951383d" containerName="glance-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233061 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="ceilometer-central-agent" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233068 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="316bd962-7ce5-4797-8a99-79aacd85c3fb" containerName="placement-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233074 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233079 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aee120c-53c3-4a72-a81e-2dd10cc5cb9a" containerName="keystone-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="117c40c7-7452-42e7-8a5a-0a1123ac030e" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233091 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f8fb41-6a21-4135-ace9-7b9b36eaaac5" containerName="kube-state-metrics" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233097 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbbd17b-fed9-463c-9c25-435f84d0205e" containerName="rabbitmq" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233103 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233111 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="63816993-7ef8-4f21-9a3c-18040cb843bd" containerName="glance-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233115 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233121 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="07daec18-3928-4cae-8fbf-4de0b30c690c" containerName="proxy-httpd" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233129 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9606e288-80ca-48db-a0a6-8f0f4ec67eaa" containerName="nova-cell1-conductor-conductor" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233136 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7658f54-ea3a-4d2a-bec0-6a885e661223" containerName="nova-api-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9fece3-dd79-4ad1-a2bc-65f4c66a6c6e" containerName="barbican-api" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233148 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958390-b164-4ae8-b834-d1df8a561b95" containerName="object-server" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233156 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ee0ec627-76fd-4938-9a2b-db220a1ac5d5" containerName="barbican-worker" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233162 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41c8dac-d0ea-40c7-a4f2-0d43d31f58b1" containerName="barbican-keystone-listener-log" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.233518 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.236342 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.236408 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.236710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.236960 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.239151 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rpjvx"] Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.405795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-crc-storage\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.405860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7454z\" (UniqueName: \"kubernetes.io/projected/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-kube-api-access-7454z\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.405886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-node-mnt\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.506928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-crc-storage\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.506971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7454z\" (UniqueName: \"kubernetes.io/projected/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-kube-api-access-7454z\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.506991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-node-mnt\") pod \"crc-storage-crc-rpjvx\" (UID: 
\"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.507166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-node-mnt\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.507592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-crc-storage\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.523986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7454z\" (UniqueName: \"kubernetes.io/projected/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-kube-api-access-7454z\") pod \"crc-storage-crc-rpjvx\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.545508 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:27 crc kubenswrapper[4707]: I0121 15:25:27.892553 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rpjvx"] Jan 21 15:25:28 crc kubenswrapper[4707]: I0121 15:25:28.797499 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" containerID="a71ad25a4023689928703304509832c24880dba0eaca081e4ac7b84a9f39e2d7" exitCode=0 Jan 21 15:25:28 crc kubenswrapper[4707]: I0121 15:25:28.797537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rpjvx" event={"ID":"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf","Type":"ContainerDied","Data":"a71ad25a4023689928703304509832c24880dba0eaca081e4ac7b84a9f39e2d7"} Jan 21 15:25:28 crc kubenswrapper[4707]: I0121 15:25:28.797701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rpjvx" event={"ID":"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf","Type":"ContainerStarted","Data":"378aa011b33f8a87893c9fdbd8de7c94497e88f1bb0e6e9abf1680999873e2d5"} Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.012620 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.137332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7454z\" (UniqueName: \"kubernetes.io/projected/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-kube-api-access-7454z\") pod \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.137424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-node-mnt\") pod \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.137448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-crc-storage\") pod \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\" (UID: \"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf\") " Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.137498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" (UID: "6b73719e-f8b4-4ce2-95b0-51f65e1ecabf"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.137760 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.141181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-kube-api-access-7454z" (OuterVolumeSpecName: "kube-api-access-7454z") pod "6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" (UID: "6b73719e-f8b4-4ce2-95b0-51f65e1ecabf"). InnerVolumeSpecName "kube-api-access-7454z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.150777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" (UID: "6b73719e-f8b4-4ce2-95b0-51f65e1ecabf"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.238503 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7454z\" (UniqueName: \"kubernetes.io/projected/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-kube-api-access-7454z\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.238574 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.810094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rpjvx" event={"ID":"6b73719e-f8b4-4ce2-95b0-51f65e1ecabf","Type":"ContainerDied","Data":"378aa011b33f8a87893c9fdbd8de7c94497e88f1bb0e6e9abf1680999873e2d5"} Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.810126 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="378aa011b33f8a87893c9fdbd8de7c94497e88f1bb0e6e9abf1680999873e2d5" Jan 21 15:25:30 crc kubenswrapper[4707]: I0121 15:25:30.810143 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rpjvx" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.848394 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k9gkb"] Jan 21 15:25:31 crc kubenswrapper[4707]: E0121 15:25:31.848790 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" containerName="storage" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.848802 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" containerName="storage" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.848939 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b113905-777a-4919-96c0-704f8e6f4026" containerName="mariadb-account-create-update" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.848953 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" containerName="storage" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.849708 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.860078 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9gkb"] Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.956753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22983f7b-d717-456f-9efe-3ff6a7342a68-catalog-content\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.956792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqcz\" (UniqueName: \"kubernetes.io/projected/22983f7b-d717-456f-9efe-3ff6a7342a68-kube-api-access-tcqcz\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:31 crc kubenswrapper[4707]: I0121 15:25:31.956869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22983f7b-d717-456f-9efe-3ff6a7342a68-utilities\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.057696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22983f7b-d717-456f-9efe-3ff6a7342a68-catalog-content\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.057917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcqcz\" (UniqueName: \"kubernetes.io/projected/22983f7b-d717-456f-9efe-3ff6a7342a68-kube-api-access-tcqcz\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.058034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22983f7b-d717-456f-9efe-3ff6a7342a68-utilities\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.058171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22983f7b-d717-456f-9efe-3ff6a7342a68-catalog-content\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.058377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22983f7b-d717-456f-9efe-3ff6a7342a68-utilities\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.076659 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tcqcz\" (UniqueName: \"kubernetes.io/projected/22983f7b-d717-456f-9efe-3ff6a7342a68-kube-api-access-tcqcz\") pod \"certified-operators-k9gkb\" (UID: \"22983f7b-d717-456f-9efe-3ff6a7342a68\") " pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.167729 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.560975 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9gkb"] Jan 21 15:25:32 crc kubenswrapper[4707]: W0121 15:25:32.564846 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22983f7b_d717_456f_9efe_3ff6a7342a68.slice/crio-a860e180a698ebd47420818ebc48bcf1cc22d55b8856fcbdd0d6a8867f82e49e WatchSource:0}: Error finding container a860e180a698ebd47420818ebc48bcf1cc22d55b8856fcbdd0d6a8867f82e49e: Status 404 returned error can't find the container with id a860e180a698ebd47420818ebc48bcf1cc22d55b8856fcbdd0d6a8867f82e49e Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.823318 4707 generic.go:334] "Generic (PLEG): container finished" podID="22983f7b-d717-456f-9efe-3ff6a7342a68" containerID="557f21e0a27ca20dfa5651ea33f81c0937efb0a9451d4602e8eb341a966e4198" exitCode=0 Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.823358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9gkb" event={"ID":"22983f7b-d717-456f-9efe-3ff6a7342a68","Type":"ContainerDied","Data":"557f21e0a27ca20dfa5651ea33f81c0937efb0a9451d4602e8eb341a966e4198"} Jan 21 15:25:32 crc kubenswrapper[4707]: I0121 15:25:32.823531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9gkb" event={"ID":"22983f7b-d717-456f-9efe-3ff6a7342a68","Type":"ContainerStarted","Data":"a860e180a698ebd47420818ebc48bcf1cc22d55b8856fcbdd0d6a8867f82e49e"} Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.030345 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-rpjvx"] Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.033794 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-rpjvx"] Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.138097 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-n6mcc"] Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.138790 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.140827 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.142316 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.142335 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.142407 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.142837 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-n6mcc"] Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.170325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-crc-storage\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.170416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-node-mnt\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.170461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szrdv\" (UniqueName: \"kubernetes.io/projected/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-kube-api-access-szrdv\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.191185 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b73719e-f8b4-4ce2-95b0-51f65e1ecabf" path="/var/lib/kubelet/pods/6b73719e-f8b4-4ce2-95b0-51f65e1ecabf/volumes" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.271597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-node-mnt\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.271833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-node-mnt\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.271843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szrdv\" (UniqueName: \"kubernetes.io/projected/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-kube-api-access-szrdv\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.271983 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-crc-storage\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.272588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-crc-storage\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.286796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szrdv\" (UniqueName: \"kubernetes.io/projected/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-kube-api-access-szrdv\") pod \"crc-storage-crc-n6mcc\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.451001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.813928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-n6mcc"] Jan 21 15:25:33 crc kubenswrapper[4707]: I0121 15:25:33.830989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-n6mcc" event={"ID":"5e8fb795-c82f-48ae-9bdf-19c03a086d7b","Type":"ContainerStarted","Data":"ba6c6d7de300a9e0c5db5655e869766f6ce0f9a183a7bf76c99bd1dde787f5b1"} Jan 21 15:25:34 crc kubenswrapper[4707]: I0121 15:25:34.839371 4707 generic.go:334] "Generic (PLEG): container finished" podID="5e8fb795-c82f-48ae-9bdf-19c03a086d7b" containerID="bb631c2b0c64cf60fe5fbbe07c38027ecccd9801ec84b61405f3b27a88764a99" exitCode=0 Jan 21 15:25:34 crc kubenswrapper[4707]: I0121 15:25:34.839422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-n6mcc" event={"ID":"5e8fb795-c82f-48ae-9bdf-19c03a086d7b","Type":"ContainerDied","Data":"bb631c2b0c64cf60fe5fbbe07c38027ecccd9801ec84b61405f3b27a88764a99"} Jan 21 15:25:35 crc kubenswrapper[4707]: I0121 15:25:35.854927 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9gkb" event={"ID":"22983f7b-d717-456f-9efe-3ff6a7342a68","Type":"ContainerStarted","Data":"e6c1752060841cd727cd73ec71ec8c458dd7f3cc50b8b8bf9c297e5d4b330d8d"} Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.061413 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.136476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-node-mnt\") pod \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.136573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szrdv\" (UniqueName: \"kubernetes.io/projected/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-kube-api-access-szrdv\") pod \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.136571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5e8fb795-c82f-48ae-9bdf-19c03a086d7b" (UID: "5e8fb795-c82f-48ae-9bdf-19c03a086d7b"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.136644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-crc-storage\") pod \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\" (UID: \"5e8fb795-c82f-48ae-9bdf-19c03a086d7b\") " Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.136990 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.140476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-kube-api-access-szrdv" (OuterVolumeSpecName: "kube-api-access-szrdv") pod "5e8fb795-c82f-48ae-9bdf-19c03a086d7b" (UID: "5e8fb795-c82f-48ae-9bdf-19c03a086d7b"). InnerVolumeSpecName "kube-api-access-szrdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.149954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5e8fb795-c82f-48ae-9bdf-19c03a086d7b" (UID: "5e8fb795-c82f-48ae-9bdf-19c03a086d7b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.239315 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szrdv\" (UniqueName: \"kubernetes.io/projected/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-kube-api-access-szrdv\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.239515 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e8fb795-c82f-48ae-9bdf-19c03a086d7b-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.861629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-n6mcc" event={"ID":"5e8fb795-c82f-48ae-9bdf-19c03a086d7b","Type":"ContainerDied","Data":"ba6c6d7de300a9e0c5db5655e869766f6ce0f9a183a7bf76c99bd1dde787f5b1"} Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.861678 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6c6d7de300a9e0c5db5655e869766f6ce0f9a183a7bf76c99bd1dde787f5b1" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.861644 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n6mcc" Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.863103 4707 generic.go:334] "Generic (PLEG): container finished" podID="22983f7b-d717-456f-9efe-3ff6a7342a68" containerID="e6c1752060841cd727cd73ec71ec8c458dd7f3cc50b8b8bf9c297e5d4b330d8d" exitCode=0 Jan 21 15:25:36 crc kubenswrapper[4707]: I0121 15:25:36.863128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9gkb" event={"ID":"22983f7b-d717-456f-9efe-3ff6a7342a68","Type":"ContainerDied","Data":"e6c1752060841cd727cd73ec71ec8c458dd7f3cc50b8b8bf9c297e5d4b330d8d"} Jan 21 15:25:37 crc kubenswrapper[4707]: I0121 15:25:37.870908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9gkb" event={"ID":"22983f7b-d717-456f-9efe-3ff6a7342a68","Type":"ContainerStarted","Data":"71ff97a3e759e0ba2a65b43477b427054bc50a8c242154a504b34f707e39985d"} Jan 21 15:25:37 crc kubenswrapper[4707]: I0121 15:25:37.886415 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k9gkb" podStartSLOduration=2.341773174 podStartE2EDuration="6.886401469s" podCreationTimestamp="2026-01-21 15:25:31 +0000 UTC" firstStartedPulling="2026-01-21 15:25:32.824358422 +0000 UTC m=+1430.005874644" lastFinishedPulling="2026-01-21 15:25:37.368986717 +0000 UTC m=+1434.550502939" observedRunningTime="2026-01-21 15:25:37.88427865 +0000 UTC m=+1435.065794872" watchObservedRunningTime="2026-01-21 15:25:37.886401469 +0000 UTC m=+1435.067917692" Jan 21 15:25:39 crc kubenswrapper[4707]: I0121 15:25:39.946031 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:25:39 crc kubenswrapper[4707]: I0121 15:25:39.946085 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.168747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.168990 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.197941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.926543 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k9gkb" Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.968393 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9gkb"] Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.994989 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r67jn"] Jan 21 15:25:42 crc kubenswrapper[4707]: I0121 15:25:42.995177 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r67jn" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="registry-server" containerID="cri-o://9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a" gracePeriod=2 Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.842471 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.905319 4707 generic.go:334] "Generic (PLEG): container finished" podID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerID="9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a" exitCode=0 Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.905399 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r67jn" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.905389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerDied","Data":"9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a"} Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.905468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r67jn" event={"ID":"332f17e3-46fd-4f6a-a1bc-8021c3590a38","Type":"ContainerDied","Data":"163dcf16f85e277eaee44b3b4493873e892fc2d6fd99f3320c02f28d433ec6cf"} Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.905490 4707 scope.go:117] "RemoveContainer" containerID="9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.921674 4707 scope.go:117] "RemoveContainer" containerID="cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.936051 4707 scope.go:117] "RemoveContainer" containerID="8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.942145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-utilities\") pod \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.942282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmq4\" (UniqueName: \"kubernetes.io/projected/332f17e3-46fd-4f6a-a1bc-8021c3590a38-kube-api-access-wvmq4\") pod \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.942338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-catalog-content\") pod \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\" (UID: \"332f17e3-46fd-4f6a-a1bc-8021c3590a38\") " Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.942568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-utilities" (OuterVolumeSpecName: "utilities") pod "332f17e3-46fd-4f6a-a1bc-8021c3590a38" (UID: "332f17e3-46fd-4f6a-a1bc-8021c3590a38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.948327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332f17e3-46fd-4f6a-a1bc-8021c3590a38-kube-api-access-wvmq4" (OuterVolumeSpecName: "kube-api-access-wvmq4") pod "332f17e3-46fd-4f6a-a1bc-8021c3590a38" (UID: "332f17e3-46fd-4f6a-a1bc-8021c3590a38"). InnerVolumeSpecName "kube-api-access-wvmq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.958919 4707 scope.go:117] "RemoveContainer" containerID="9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a" Jan 21 15:25:43 crc kubenswrapper[4707]: E0121 15:25:43.959381 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a\": container with ID starting with 9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a not found: ID does not exist" containerID="9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.959411 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a"} err="failed to get container status \"9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a\": rpc error: code = NotFound desc = could not find container \"9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a\": container with ID starting with 9c8b324aa656c213bab43ed070e044a52637f2807a0dbd640aa791186a974e9a not found: ID does not exist" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.959428 4707 scope.go:117] "RemoveContainer" containerID="cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915" Jan 21 15:25:43 crc kubenswrapper[4707]: E0121 15:25:43.959716 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915\": container with ID starting with cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915 not found: ID does not exist" containerID="cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.959736 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915"} err="failed to get container status \"cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915\": rpc error: code = NotFound desc = could not find container \"cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915\": container with ID starting with cfaa7841a05fa4df2aec8394bd203859d8dc80b58e921e2a62e4a4d2c7f4d915 not found: ID does not exist" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.959748 4707 scope.go:117] "RemoveContainer" containerID="8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c" Jan 21 15:25:43 crc kubenswrapper[4707]: E0121 15:25:43.960063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c\": container with ID starting with 8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c not found: ID does not exist" containerID="8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.960156 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c"} err="failed to get container status \"8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c\": rpc error: code = NotFound desc = could not 
find container \"8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c\": container with ID starting with 8c1c37f7d206d2e651eaa8e9c2fe369543f0e478ce75b3a82ce10feb6df5cb9c not found: ID does not exist" Jan 21 15:25:43 crc kubenswrapper[4707]: I0121 15:25:43.982855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "332f17e3-46fd-4f6a-a1bc-8021c3590a38" (UID: "332f17e3-46fd-4f6a-a1bc-8021c3590a38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:25:44 crc kubenswrapper[4707]: I0121 15:25:44.043708 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:44 crc kubenswrapper[4707]: I0121 15:25:44.043738 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332f17e3-46fd-4f6a-a1bc-8021c3590a38-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:44 crc kubenswrapper[4707]: I0121 15:25:44.043748 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvmq4\" (UniqueName: \"kubernetes.io/projected/332f17e3-46fd-4f6a-a1bc-8021c3590a38-kube-api-access-wvmq4\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:44 crc kubenswrapper[4707]: I0121 15:25:44.226466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r67jn"] Jan 21 15:25:44 crc kubenswrapper[4707]: I0121 15:25:44.230617 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r67jn"] Jan 21 15:25:45 crc kubenswrapper[4707]: I0121 15:25:45.188277 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" path="/var/lib/kubelet/pods/332f17e3-46fd-4f6a-a1bc-8021c3590a38/volumes" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.027683 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s6j4l"] Jan 21 15:25:46 crc kubenswrapper[4707]: E0121 15:25:46.028034 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="extract-utilities" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.028052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="extract-utilities" Jan 21 15:25:46 crc kubenswrapper[4707]: E0121 15:25:46.028064 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8fb795-c82f-48ae-9bdf-19c03a086d7b" containerName="storage" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.028070 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8fb795-c82f-48ae-9bdf-19c03a086d7b" containerName="storage" Jan 21 15:25:46 crc kubenswrapper[4707]: E0121 15:25:46.028078 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="extract-content" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.028084 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="extract-content" Jan 21 15:25:46 crc kubenswrapper[4707]: E0121 15:25:46.028095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="registry-server" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.028101 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="registry-server" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.028243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="332f17e3-46fd-4f6a-a1bc-8021c3590a38" containerName="registry-server" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.028253 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8fb795-c82f-48ae-9bdf-19c03a086d7b" containerName="storage" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.029120 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.031785 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6j4l"] Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.169359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bln5r\" (UniqueName: \"kubernetes.io/projected/b9673fc0-793d-4a03-a441-3b9f78a1e106-kube-api-access-bln5r\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.169412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-utilities\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.169528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-catalog-content\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.271422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bln5r\" (UniqueName: \"kubernetes.io/projected/b9673fc0-793d-4a03-a441-3b9f78a1e106-kube-api-access-bln5r\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.272063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-utilities\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.272195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-catalog-content\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.272582 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-utilities\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.272765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-catalog-content\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.286997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bln5r\" (UniqueName: \"kubernetes.io/projected/b9673fc0-793d-4a03-a441-3b9f78a1e106-kube-api-access-bln5r\") pod \"community-operators-s6j4l\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.340620 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.759492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6j4l"] Jan 21 15:25:46 crc kubenswrapper[4707]: W0121 15:25:46.770987 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9673fc0_793d_4a03_a441_3b9f78a1e106.slice/crio-a847b091549d797f3eb1f04a9a6e156325b1bb4a46251d655cb35966b9d990c7 WatchSource:0}: Error finding container a847b091549d797f3eb1f04a9a6e156325b1bb4a46251d655cb35966b9d990c7: Status 404 returned error can't find the container with id a847b091549d797f3eb1f04a9a6e156325b1bb4a46251d655cb35966b9d990c7 Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.924928 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerID="e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2" exitCode=0 Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.925098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerDied","Data":"e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2"} Jan 21 15:25:46 crc kubenswrapper[4707]: I0121 15:25:46.925145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerStarted","Data":"a847b091549d797f3eb1f04a9a6e156325b1bb4a46251d655cb35966b9d990c7"} Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.710849 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.711844 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.715645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.715754 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.717049 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.717275 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.717392 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.717508 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-t5c7w" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.717680 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.721648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bac3bc5-b4ca-4333-9fc6-851430c48708-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 
15:25:47.791626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmvf\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-kube-api-access-5qmvf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.791718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bac3bc5-b4ca-4333-9fc6-851430c48708-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893355 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5qmvf\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-kube-api-access-5qmvf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bac3bc5-b4ca-4333-9fc6-851430c48708-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bac3bc5-b4ca-4333-9fc6-851430c48708-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893653 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.893981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.894794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.894956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.895198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.898792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.899020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.902632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bac3bc5-b4ca-4333-9fc6-851430c48708-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.903465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bac3bc5-b4ca-4333-9fc6-851430c48708-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.905136 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.906070 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.910719 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.911065 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-hm5wm" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.911196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmvf\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-kube-api-access-5qmvf\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.911631 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.911839 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.911955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.912052 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.912150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.914107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.915782 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.937179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerStarted","Data":"1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365"} Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.994930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995110 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d54b9100-265d-49a3-a2ab-72e8787510df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8ws\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-kube-api-access-hd8ws\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d54b9100-265d-49a3-a2ab-72e8787510df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.995470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:47 crc kubenswrapper[4707]: I0121 15:25:47.996325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.036127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8ws\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-kube-api-access-hd8ws\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d54b9100-265d-49a3-a2ab-72e8787510df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.097931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.098243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.098314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.098367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.098427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d54b9100-265d-49a3-a2ab-72e8787510df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.099906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.100266 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.100449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.100729 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.101190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.101379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d54b9100-265d-49a3-a2ab-72e8787510df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.101992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.102842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.103075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d54b9100-265d-49a3-a2ab-72e8787510df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.107783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.112276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8ws\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-kube-api-access-hd8ws\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.119154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.251296 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.401774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:25:48 crc kubenswrapper[4707]: W0121 15:25:48.446536 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bac3bc5_b4ca_4333_9fc6_851430c48708.slice/crio-cb9f7fedda804b83122a99e4767503c5e6a4ca6a1b966e175e2f50e17e8b5268 WatchSource:0}: Error finding container cb9f7fedda804b83122a99e4767503c5e6a4ca6a1b966e175e2f50e17e8b5268: Status 404 returned error can't find the container with id cb9f7fedda804b83122a99e4767503c5e6a4ca6a1b966e175e2f50e17e8b5268 Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.610901 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:25:48 crc kubenswrapper[4707]: W0121 15:25:48.621677 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54b9100_265d_49a3_a2ab_72e8787510df.slice/crio-3f3eb732254be7188935cbb1415cb838e3af6835b0129dbe6e6868919046514a WatchSource:0}: Error finding container 3f3eb732254be7188935cbb1415cb838e3af6835b0129dbe6e6868919046514a: Status 404 returned error can't find the container with id 3f3eb732254be7188935cbb1415cb838e3af6835b0129dbe6e6868919046514a Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.765827 4707 scope.go:117] "RemoveContainer" containerID="ea553eff67dde957d1a586908e14078e3b873dcfe703ba1e658cd6842e6dbe17" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.790540 4707 scope.go:117] "RemoveContainer" containerID="c6260bbf19dfc6a9ac3a5408173a09e768e183ee3df3567857551dbfb60cc47a" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.806261 4707 scope.go:117] "RemoveContainer" containerID="b7e4d4cbcccddede310a26fc6b0f9056fadab3f4b99c92bb4024f7818ae6104e" Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.943951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"d54b9100-265d-49a3-a2ab-72e8787510df","Type":"ContainerStarted","Data":"3f3eb732254be7188935cbb1415cb838e3af6835b0129dbe6e6868919046514a"} Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.948565 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerID="1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365" exitCode=0 Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.948636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerDied","Data":"1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365"} Jan 21 15:25:48 crc kubenswrapper[4707]: I0121 15:25:48.950282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6bac3bc5-b4ca-4333-9fc6-851430c48708","Type":"ContainerStarted","Data":"cb9f7fedda804b83122a99e4767503c5e6a4ca6a1b966e175e2f50e17e8b5268"} Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.574479 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.575591 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.577959 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.578117 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-brfjv" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.578283 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.579547 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.584285 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.586488 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kolla-config\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ssx6\" (UniqueName: \"kubernetes.io/projected/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kube-api-access-5ssx6\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-default\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.716865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.817784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.817864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.817902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-default\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.817919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.817960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kolla-config\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818092 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ssx6\" (UniqueName: \"kubernetes.io/projected/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kube-api-access-5ssx6\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-default\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818840 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.818944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kolla-config\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.820041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.823272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.826345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.831138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ssx6\" (UniqueName: \"kubernetes.io/projected/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kube-api-access-5ssx6\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.838245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.912112 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.964930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerStarted","Data":"d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7"} Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.972979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"d54b9100-265d-49a3-a2ab-72e8787510df","Type":"ContainerStarted","Data":"db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a"} Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.976779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6bac3bc5-b4ca-4333-9fc6-851430c48708","Type":"ContainerStarted","Data":"37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855"} Jan 21 15:25:49 crc kubenswrapper[4707]: I0121 15:25:49.979951 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s6j4l" podStartSLOduration=1.366585629 podStartE2EDuration="3.97994079s" podCreationTimestamp="2026-01-21 15:25:46 +0000 UTC" firstStartedPulling="2026-01-21 15:25:46.926282079 +0000 UTC m=+1444.107798302" lastFinishedPulling="2026-01-21 15:25:49.539637241 +0000 UTC m=+1446.721153463" observedRunningTime="2026-01-21 15:25:49.977655355 +0000 UTC m=+1447.159171577" watchObservedRunningTime="2026-01-21 15:25:49.97994079 +0000 UTC m=+1447.161457012" Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.301996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:25:50 crc kubenswrapper[4707]: W0121 15:25:50.306827 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8d97aa5_7c98_4709_ba78_b2d9fb1a3f16.slice/crio-5c135dfe8016b4dc350f6e5dec8b85a7607e309dfc08a6771fa518f3c77f0797 WatchSource:0}: Error finding container 5c135dfe8016b4dc350f6e5dec8b85a7607e309dfc08a6771fa518f3c77f0797: Status 404 returned error can't find the container with id 5c135dfe8016b4dc350f6e5dec8b85a7607e309dfc08a6771fa518f3c77f0797 Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.912836 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.913945 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.915319 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-zxrdl" Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.916399 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.916620 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.918221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.921381 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.982478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16","Type":"ContainerStarted","Data":"c26a6d88ea140a2e846151b29e903b6447b9113d6b93c6dacb375b5ce715545b"} Jan 21 15:25:50 crc kubenswrapper[4707]: I0121 15:25:50.982529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16","Type":"ContainerStarted","Data":"5c135dfe8016b4dc350f6e5dec8b85a7607e309dfc08a6771fa518f3c77f0797"} Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5sfm\" (UniqueName: \"kubernetes.io/projected/280f96dd-c140-464c-a906-d5428cda5b5c-kube-api-access-m5sfm\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.135717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.236853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5sfm\" (UniqueName: \"kubernetes.io/projected/280f96dd-c140-464c-a906-d5428cda5b5c-kube-api-access-m5sfm\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237557 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237715 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.237729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.238396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.238755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.241075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.241323 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.250750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5sfm\" (UniqueName: \"kubernetes.io/projected/280f96dd-c140-464c-a906-d5428cda5b5c-kube-api-access-m5sfm\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.253218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.341559 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.342485 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.346317 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.346497 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-7qwxr" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.347104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.356341 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.440775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtfkg\" (UniqueName: \"kubernetes.io/projected/7ff4b887-d336-427e-811b-0b0694985417-kube-api-access-xtfkg\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.440956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-config-data\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.440983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-kolla-config\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.441296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.441417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.528823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.542636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtfkg\" (UniqueName: \"kubernetes.io/projected/7ff4b887-d336-427e-811b-0b0694985417-kube-api-access-xtfkg\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.542740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-config-data\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.542759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-kolla-config\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.542844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.542870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.543670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-kolla-config\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.544164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-config-data\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.549469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.549495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.554985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtfkg\" (UniqueName: \"kubernetes.io/projected/7ff4b887-d336-427e-811b-0b0694985417-kube-api-access-xtfkg\") pod \"memcached-0\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.655543 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.909229 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:25:51 crc kubenswrapper[4707]: W0121 15:25:51.911824 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod280f96dd_c140_464c_a906_d5428cda5b5c.slice/crio-70c83635b682fa19e7974e177dba118ba5dcfb9ff1eba4cb7d4d1724f57f2a9d WatchSource:0}: Error finding container 70c83635b682fa19e7974e177dba118ba5dcfb9ff1eba4cb7d4d1724f57f2a9d: Status 404 returned error can't find the container with id 70c83635b682fa19e7974e177dba118ba5dcfb9ff1eba4cb7d4d1724f57f2a9d Jan 21 15:25:51 crc kubenswrapper[4707]: I0121 15:25:51.989720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"280f96dd-c140-464c-a906-d5428cda5b5c","Type":"ContainerStarted","Data":"70c83635b682fa19e7974e177dba118ba5dcfb9ff1eba4cb7d4d1724f57f2a9d"} Jan 21 15:25:52 crc kubenswrapper[4707]: I0121 15:25:52.016098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:25:52 crc kubenswrapper[4707]: W0121 15:25:52.017187 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ff4b887_d336_427e_811b_0b0694985417.slice/crio-fb28ad225e48afe7ff2833a326e7ba0a6e53e51f643187850de8cd07df44d0ed WatchSource:0}: Error finding container fb28ad225e48afe7ff2833a326e7ba0a6e53e51f643187850de8cd07df44d0ed: Status 404 returned error can't find the container with id fb28ad225e48afe7ff2833a326e7ba0a6e53e51f643187850de8cd07df44d0ed Jan 21 15:25:52 crc kubenswrapper[4707]: I0121 15:25:52.997028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"280f96dd-c140-464c-a906-d5428cda5b5c","Type":"ContainerStarted","Data":"5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150"} Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.000881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7ff4b887-d336-427e-811b-0b0694985417","Type":"ContainerStarted","Data":"4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b"} Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.000907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" 
event={"ID":"7ff4b887-d336-427e-811b-0b0694985417","Type":"ContainerStarted","Data":"fb28ad225e48afe7ff2833a326e7ba0a6e53e51f643187850de8cd07df44d0ed"} Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.000983 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.022773 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.022758618 podStartE2EDuration="2.022758618s" podCreationTimestamp="2026-01-21 15:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:53.022455337 +0000 UTC m=+1450.203971559" watchObservedRunningTime="2026-01-21 15:25:53.022758618 +0000 UTC m=+1450.204274840" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.166500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.167269 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.168773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-xcz5p" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.178475 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.264937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwshg\" (UniqueName: \"kubernetes.io/projected/fc5e58d5-06c3-4fe0-a609-e227c50fdf22-kube-api-access-hwshg\") pod \"kube-state-metrics-0\" (UID: \"fc5e58d5-06c3-4fe0-a609-e227c50fdf22\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.366404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwshg\" (UniqueName: \"kubernetes.io/projected/fc5e58d5-06c3-4fe0-a609-e227c50fdf22-kube-api-access-hwshg\") pod \"kube-state-metrics-0\" (UID: \"fc5e58d5-06c3-4fe0-a609-e227c50fdf22\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.380803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwshg\" (UniqueName: \"kubernetes.io/projected/fc5e58d5-06c3-4fe0-a609-e227c50fdf22-kube-api-access-hwshg\") pod \"kube-state-metrics-0\" (UID: \"fc5e58d5-06c3-4fe0-a609-e227c50fdf22\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.489070 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:25:53 crc kubenswrapper[4707]: I0121 15:25:53.853606 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:25:53 crc kubenswrapper[4707]: W0121 15:25:53.856862 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc5e58d5_06c3_4fe0_a609_e227c50fdf22.slice/crio-e09ab96e3853b04ee07450cee131c32fa7b19b511a1e501c530c7833513f1e70 WatchSource:0}: Error finding container e09ab96e3853b04ee07450cee131c32fa7b19b511a1e501c530c7833513f1e70: Status 404 returned error can't find the container with id e09ab96e3853b04ee07450cee131c32fa7b19b511a1e501c530c7833513f1e70 Jan 21 15:25:54 crc kubenswrapper[4707]: I0121 15:25:54.008353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"fc5e58d5-06c3-4fe0-a609-e227c50fdf22","Type":"ContainerStarted","Data":"e09ab96e3853b04ee07450cee131c32fa7b19b511a1e501c530c7833513f1e70"} Jan 21 15:25:54 crc kubenswrapper[4707]: I0121 15:25:54.010105 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerID="c26a6d88ea140a2e846151b29e903b6447b9113d6b93c6dacb375b5ce715545b" exitCode=0 Jan 21 15:25:54 crc kubenswrapper[4707]: I0121 15:25:54.010193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16","Type":"ContainerDied","Data":"c26a6d88ea140a2e846151b29e903b6447b9113d6b93c6dacb375b5ce715545b"} Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.019519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16","Type":"ContainerStarted","Data":"338edc6004b39eadad308bfb8bb14c8882a1995789ae4fdf1b5e3424d971b9e4"} Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.021484 4707 generic.go:334] "Generic (PLEG): container finished" podID="280f96dd-c140-464c-a906-d5428cda5b5c" containerID="5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150" exitCode=0 Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.021560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"280f96dd-c140-464c-a906-d5428cda5b5c","Type":"ContainerDied","Data":"5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150"} Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.023110 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"fc5e58d5-06c3-4fe0-a609-e227c50fdf22","Type":"ContainerStarted","Data":"4ee46ed5aecbf7debc0d1510a301c8a5f0a04f9e2a5b16175bc375ef2d1d8911"} Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.023373 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.038653 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=7.038641236 podStartE2EDuration="7.038641236s" podCreationTimestamp="2026-01-21 15:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:55.038140816 +0000 UTC m=+1452.219657038" 
watchObservedRunningTime="2026-01-21 15:25:55.038641236 +0000 UTC m=+1452.220157459" Jan 21 15:25:55 crc kubenswrapper[4707]: I0121 15:25:55.057696 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.792102722 podStartE2EDuration="2.057680555s" podCreationTimestamp="2026-01-21 15:25:53 +0000 UTC" firstStartedPulling="2026-01-21 15:25:53.858857423 +0000 UTC m=+1451.040373645" lastFinishedPulling="2026-01-21 15:25:54.124435256 +0000 UTC m=+1451.305951478" observedRunningTime="2026-01-21 15:25:55.054994899 +0000 UTC m=+1452.236511121" watchObservedRunningTime="2026-01-21 15:25:55.057680555 +0000 UTC m=+1452.239196778" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.030027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"280f96dd-c140-464c-a906-d5428cda5b5c","Type":"ContainerStarted","Data":"690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533"} Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.045113 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.045100355 podStartE2EDuration="7.045100355s" podCreationTimestamp="2026-01-21 15:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:56.042599425 +0000 UTC m=+1453.224115648" watchObservedRunningTime="2026-01-21 15:25:56.045100355 +0000 UTC m=+1453.226616587" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.340967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.341024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.413389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.590950 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.591978 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.593344 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.593496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.596457 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.601248 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.596475 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.596695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-q55pt" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d2r\" (UniqueName: \"kubernetes.io/projected/c475086c-e924-418c-bdd8-0b61fe31f950-kube-api-access-n6d2r\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc 
kubenswrapper[4707]: I0121 15:25:56.710662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-config\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.710679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d2r\" (UniqueName: \"kubernetes.io/projected/c475086c-e924-418c-bdd8-0b61fe31f950-kube-api-access-n6d2r\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-config\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812500 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.812758 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.815600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.815883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.816933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-config\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.817018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.817365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.819669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.827606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d2r\" (UniqueName: \"kubernetes.io/projected/c475086c-e924-418c-bdd8-0b61fe31f950-kube-api-access-n6d2r\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.830026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:56 crc kubenswrapper[4707]: I0121 15:25:56.904704 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:57 crc kubenswrapper[4707]: I0121 15:25:57.065476 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:57 crc kubenswrapper[4707]: I0121 15:25:57.098067 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6j4l"] Jan 21 15:25:57 crc kubenswrapper[4707]: I0121 15:25:57.258720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:25:57 crc kubenswrapper[4707]: W0121 15:25:57.262085 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc475086c_e924_418c_bdd8_0b61fe31f950.slice/crio-3b9e51c0fe6c76aec32ebbc8d0bb9d6ca15609a0cf57bcbf6cb6ab8cd7e52ecf WatchSource:0}: Error finding container 3b9e51c0fe6c76aec32ebbc8d0bb9d6ca15609a0cf57bcbf6cb6ab8cd7e52ecf: Status 404 returned error can't find the container with id 3b9e51c0fe6c76aec32ebbc8d0bb9d6ca15609a0cf57bcbf6cb6ab8cd7e52ecf Jan 21 15:25:58 crc kubenswrapper[4707]: I0121 15:25:58.042441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"c475086c-e924-418c-bdd8-0b61fe31f950","Type":"ContainerStarted","Data":"d9976eaa71a516b23e46173e6991270fa567b0fd2fdf3a5753ae210e1a2cdd0f"} Jan 21 15:25:58 crc kubenswrapper[4707]: I0121 15:25:58.042787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"c475086c-e924-418c-bdd8-0b61fe31f950","Type":"ContainerStarted","Data":"6773818b9477157356db610d5c8e6c5f70dba1e6f5a99dffecbb55f651495e61"} Jan 21 15:25:58 crc kubenswrapper[4707]: I0121 15:25:58.042799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"c475086c-e924-418c-bdd8-0b61fe31f950","Type":"ContainerStarted","Data":"3b9e51c0fe6c76aec32ebbc8d0bb9d6ca15609a0cf57bcbf6cb6ab8cd7e52ecf"} Jan 21 15:25:58 crc kubenswrapper[4707]: I0121 15:25:58.058346 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.05833112 podStartE2EDuration="3.05833112s" podCreationTimestamp="2026-01-21 15:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:58.055562546 +0000 UTC m=+1455.237078768" watchObservedRunningTime="2026-01-21 15:25:58.05833112 +0000 UTC m=+1455.239847341" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.048446 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s6j4l" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="registry-server" containerID="cri-o://d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7" gracePeriod=2 Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.381356 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.445903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-utilities\") pod \"b9673fc0-793d-4a03-a441-3b9f78a1e106\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.446052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-catalog-content\") pod \"b9673fc0-793d-4a03-a441-3b9f78a1e106\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.446081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bln5r\" (UniqueName: \"kubernetes.io/projected/b9673fc0-793d-4a03-a441-3b9f78a1e106-kube-api-access-bln5r\") pod \"b9673fc0-793d-4a03-a441-3b9f78a1e106\" (UID: \"b9673fc0-793d-4a03-a441-3b9f78a1e106\") " Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.446621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-utilities" (OuterVolumeSpecName: "utilities") pod "b9673fc0-793d-4a03-a441-3b9f78a1e106" (UID: "b9673fc0-793d-4a03-a441-3b9f78a1e106"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.451831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9673fc0-793d-4a03-a441-3b9f78a1e106-kube-api-access-bln5r" (OuterVolumeSpecName: "kube-api-access-bln5r") pod "b9673fc0-793d-4a03-a441-3b9f78a1e106" (UID: "b9673fc0-793d-4a03-a441-3b9f78a1e106"). InnerVolumeSpecName "kube-api-access-bln5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.484126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9673fc0-793d-4a03-a441-3b9f78a1e106" (UID: "b9673fc0-793d-4a03-a441-3b9f78a1e106"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.547740 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.547770 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bln5r\" (UniqueName: \"kubernetes.io/projected/b9673fc0-793d-4a03-a441-3b9f78a1e106-kube-api-access-bln5r\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.547781 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9673fc0-793d-4a03-a441-3b9f78a1e106-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.788522 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:25:59 crc kubenswrapper[4707]: E0121 15:25:59.788742 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="registry-server" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.788758 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="registry-server" Jan 21 15:25:59 crc kubenswrapper[4707]: E0121 15:25:59.788778 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="extract-content" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.788785 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="extract-content" Jan 21 15:25:59 crc kubenswrapper[4707]: E0121 15:25:59.788795 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="extract-utilities" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.788800 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="extract-utilities" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.788935 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerName="registry-server" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.789553 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.790694 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.791010 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.791069 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.791357 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hhfjn" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.798209 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2ft\" (UniqueName: \"kubernetes.io/projected/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-kube-api-access-mg2ft\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-config\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.851672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.905426 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.912654 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.912688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.952917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.952952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.952986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.953023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.953037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.953051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2ft\" (UniqueName: \"kubernetes.io/projected/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-kube-api-access-mg2ft\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.953132 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-config\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.953159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.953576 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.954326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.954353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.954636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-config\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.956892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.958105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.958234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.968396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2ft\" (UniqueName: \"kubernetes.io/projected/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-kube-api-access-mg2ft\") pod \"ovsdbserver-sb-0\" (UID: 
\"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:25:59 crc kubenswrapper[4707]: I0121 15:25:59.968566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.055923 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9673fc0-793d-4a03-a441-3b9f78a1e106" containerID="d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7" exitCode=0 Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.055995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerDied","Data":"d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7"} Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.056042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6j4l" event={"ID":"b9673fc0-793d-4a03-a441-3b9f78a1e106","Type":"ContainerDied","Data":"a847b091549d797f3eb1f04a9a6e156325b1bb4a46251d655cb35966b9d990c7"} Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.056060 4707 scope.go:117] "RemoveContainer" containerID="d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.056574 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6j4l" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.080291 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6j4l"] Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.080825 4707 scope.go:117] "RemoveContainer" containerID="1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.084581 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s6j4l"] Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.094717 4707 scope.go:117] "RemoveContainer" containerID="e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.101117 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.107982 4707 scope.go:117] "RemoveContainer" containerID="d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7" Jan 21 15:26:00 crc kubenswrapper[4707]: E0121 15:26:00.108356 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7\": container with ID starting with d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7 not found: ID does not exist" containerID="d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.108388 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7"} err="failed to get container status \"d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7\": rpc error: code = NotFound desc = could not find container \"d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7\": container with ID starting with d6c5fee873b8b1ad64965b9877fdab4c5a8dd1d2ff67b33edd7126f9c91cb4f7 not found: ID does not exist" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.108408 4707 scope.go:117] "RemoveContainer" containerID="1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365" Jan 21 15:26:00 crc kubenswrapper[4707]: E0121 15:26:00.108707 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365\": container with ID starting with 1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365 not found: ID does not exist" containerID="1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.108743 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365"} err="failed to get container status \"1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365\": rpc error: code = NotFound desc = could not find container \"1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365\": container with ID starting with 1809cbc90783cf3aa01d4348fc6dc6463888408514a819f11a8d765e42f4e365 not found: ID does not exist" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.108767 4707 scope.go:117] "RemoveContainer" containerID="e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2" Jan 21 15:26:00 crc kubenswrapper[4707]: E0121 15:26:00.109042 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2\": container with ID starting with e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2 not found: ID does not exist" containerID="e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.109070 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2"} err="failed to get container status \"e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2\": rpc error: code = NotFound 
desc = could not find container \"e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2\": container with ID starting with e60311a24ceb855b2ec13db189a99d8479f014f5462840ffa14191256b928af2 not found: ID does not exist" Jan 21 15:26:00 crc kubenswrapper[4707]: I0121 15:26:00.494596 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.063880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e","Type":"ContainerStarted","Data":"e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6"} Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.063916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e","Type":"ContainerStarted","Data":"229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43"} Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.063929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e","Type":"ContainerStarted","Data":"7463dce460246bcc9c4ccb82217244ca5cd983747a19134b492c77bafd55e439"} Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.080438 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.080425613 podStartE2EDuration="3.080425613s" podCreationTimestamp="2026-01-21 15:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:01.075059859 +0000 UTC m=+1458.256576081" watchObservedRunningTime="2026-01-21 15:26:01.080425613 +0000 UTC m=+1458.261941835" Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.189181 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9673fc0-793d-4a03-a441-3b9f78a1e106" path="/var/lib/kubelet/pods/b9673fc0-793d-4a03-a441-3b9f78a1e106/volumes" Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.529195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.529234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.593248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.656120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:26:01 crc kubenswrapper[4707]: I0121 15:26:01.905088 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:26:02 crc kubenswrapper[4707]: I0121 15:26:02.118762 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:26:02 crc kubenswrapper[4707]: I0121 15:26:02.153239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:26:02 crc kubenswrapper[4707]: I0121 15:26:02.210900 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:26:02 crc kubenswrapper[4707]: I0121 15:26:02.932221 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:26:02 crc kubenswrapper[4707]: I0121 15:26:02.958788 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:26:03 crc kubenswrapper[4707]: I0121 15:26:03.101397 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:26:03 crc kubenswrapper[4707]: I0121 15:26:03.126529 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:26:03 crc kubenswrapper[4707]: I0121 15:26:03.544378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.080451 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.429192 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.433176 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.434669 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.435268 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.435775 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.437083 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-ckd2t" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.442766 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.625589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-lock\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.625715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.625893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc 
kubenswrapper[4707]: I0121 15:26:04.625993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589cr\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-kube-api-access-589cr\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.626073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-cache\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.727626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-lock\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.727693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.727742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.727776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-589cr\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-kube-api-access-589cr\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.727827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-cache\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.727905 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.727929 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.727978 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift podName:d2152261-adb2-4974-bd02-1e61bbd3d0b4 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:05.227963462 +0000 UTC m=+1462.409479684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift") pod "swift-storage-0" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4") : configmap "swift-ring-files" not found Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.727994 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.728085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-lock\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.728179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-cache\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.743511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-589cr\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-kube-api-access-589cr\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.744325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.830849 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jt7wv"] Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.831650 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: W0121 15:26:04.832645 4707 reflector.go:561] object-"openstack-kuttl-tests"/"swift-proxy-config-data": failed to list *v1.Secret: secrets "swift-proxy-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.832685 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"swift-proxy-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"swift-proxy-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 15:26:04 crc kubenswrapper[4707]: W0121 15:26:04.832782 4707 reflector.go:561] object-"openstack-kuttl-tests"/"swift-ring-scripts": failed to list *v1.ConfigMap: configmaps "swift-ring-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.832797 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"swift-ring-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"swift-ring-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 15:26:04 crc kubenswrapper[4707]: W0121 15:26:04.834301 4707 reflector.go:561] object-"openstack-kuttl-tests"/"swift-ring-config-data": failed to list *v1.ConfigMap: configmaps "swift-ring-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.834326 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"swift-ring-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"swift-ring-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.844058 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jt7wv"] Jan 21 15:26:04 crc kubenswrapper[4707]: E0121 15:26:04.846993 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-tpg6s ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-tpg6s ring-data-devices scripts swiftconf]: context canceled" pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" podUID="0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.862324 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/swift-ring-rebalance-jw749"] Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.863193 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.866641 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jt7wv"] Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.871359 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jw749"] Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-combined-ca-bundle\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-scripts\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-swiftconf\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/625c54ec-3c40-4fd5-a2f9-583dedf97df7-etc-swift\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-ring-data-devices\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs4tx\" (UniqueName: \"kubernetes.io/projected/625c54ec-3c40-4fd5-a2f9-583dedf97df7-kube-api-access-fs4tx\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-ring-data-devices\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 
15:26:04.930627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpg6s\" (UniqueName: \"kubernetes.io/projected/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-kube-api-access-tpg6s\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-etc-swift\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-dispersionconf\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-combined-ca-bundle\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-swiftconf\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-dispersionconf\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:04 crc kubenswrapper[4707]: I0121 15:26:04.930787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-scripts\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-combined-ca-bundle\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-scripts\") pod \"swift-ring-rebalance-jw749\" (UID: 
\"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-swiftconf\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/625c54ec-3c40-4fd5-a2f9-583dedf97df7-etc-swift\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-ring-data-devices\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs4tx\" (UniqueName: \"kubernetes.io/projected/625c54ec-3c40-4fd5-a2f9-583dedf97df7-kube-api-access-fs4tx\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-ring-data-devices\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpg6s\" (UniqueName: \"kubernetes.io/projected/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-kube-api-access-tpg6s\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-etc-swift\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-dispersionconf\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-combined-ca-bundle\") pod 
\"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-swiftconf\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-dispersionconf\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-scripts\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.032850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/625c54ec-3c40-4fd5-a2f9-583dedf97df7-etc-swift\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.033098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-etc-swift\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.035249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-combined-ca-bundle\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.036031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-swiftconf\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.036058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-swiftconf\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.036230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-combined-ca-bundle\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.044869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs4tx\" (UniqueName: \"kubernetes.io/projected/625c54ec-3c40-4fd5-a2f9-583dedf97df7-kube-api-access-fs4tx\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.045487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpg6s\" (UniqueName: \"kubernetes.io/projected/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-kube-api-access-tpg6s\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.085951 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.094431 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.113036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.195749 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.196732 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.198573 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.198730 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.198944 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.199091 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-cq69b" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.206701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.235704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpg6s\" (UniqueName: \"kubernetes.io/projected/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-kube-api-access-tpg6s\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.235799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-combined-ca-bundle\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.235885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-etc-swift\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.235967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-swiftconf\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.236140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:05 crc kubenswrapper[4707]: E0121 15:26:05.236952 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:26:05 crc kubenswrapper[4707]: E0121 15:26:05.236981 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:26:05 crc kubenswrapper[4707]: E0121 15:26:05.237011 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift podName:d2152261-adb2-4974-bd02-1e61bbd3d0b4 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:06.2369999 +0000 UTC m=+1463.418516121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift") pod "swift-storage-0" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4") : configmap "swift-ring-files" not found Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.238752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.238917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.238971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.239541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-kube-api-access-tpg6s" (OuterVolumeSpecName: "kube-api-access-tpg6s") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "kube-api-access-tpg6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.337999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jftpv\" (UniqueName: \"kubernetes.io/projected/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-kube-api-access-jftpv\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-config\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-scripts\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338659 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338735 
4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpg6s\" (UniqueName: \"kubernetes.io/projected/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-kube-api-access-tpg6s\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338792 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.338876 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.439872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.439905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.439932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.439976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-scripts\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.439993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.440041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jftpv\" (UniqueName: \"kubernetes.io/projected/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-kube-api-access-jftpv\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.440118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-config\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.440873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-config\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.440931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-scripts\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.441129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.443220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.443561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.443838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.453283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jftpv\" (UniqueName: \"kubernetes.io/projected/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-kube-api-access-jftpv\") pod \"ovn-northd-0\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.512605 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.682949 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.693766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-ring-data-devices\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.693910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-ring-data-devices\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.763576 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.777500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-dispersionconf\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.777511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-dispersionconf\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.845099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-ring-data-devices\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.845489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.847860 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.853117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-scripts\") pod \"swift-ring-rebalance-jw749\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.853546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-scripts\") pod \"swift-ring-rebalance-jt7wv\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.856777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:26:05 crc kubenswrapper[4707]: W0121 15:26:05.860669 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8daec53c_1342_4ca3_aa8b_a9d50bcf44b5.slice/crio-0d93ca850483dce3ccacf5567e498fbfa039f1a818b38dcb8d4c66cd4a367682 WatchSource:0}: Error finding container 0d93ca850483dce3ccacf5567e498fbfa039f1a818b38dcb8d4c66cd4a367682: Status 404 returned error can't find the container with id 0d93ca850483dce3ccacf5567e498fbfa039f1a818b38dcb8d4c66cd4a367682 Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.946527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-dispersionconf\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.946957 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:05 crc kubenswrapper[4707]: I0121 15:26:05.949708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.047752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-scripts\") pod \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\" (UID: \"0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e\") " Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.048158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-scripts" (OuterVolumeSpecName: "scripts") pod "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" (UID: "0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.048336 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.048354 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.073610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.093726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jt7wv" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.093938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5","Type":"ContainerStarted","Data":"2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c"} Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.093971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5","Type":"ContainerStarted","Data":"a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7"} Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.093981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5","Type":"ContainerStarted","Data":"0d93ca850483dce3ccacf5567e498fbfa039f1a818b38dcb8d4c66cd4a367682"} Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.094231 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.117784 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.11777337 podStartE2EDuration="1.11777337s" podCreationTimestamp="2026-01-21 15:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:06.114960666 +0000 UTC m=+1463.296476887" watchObservedRunningTime="2026-01-21 15:26:06.11777337 +0000 UTC m=+1463.299289593" Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.172057 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jt7wv"] Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.183975 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jt7wv"] Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.277403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:06 crc kubenswrapper[4707]: E0121 15:26:06.277757 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:26:06 crc 
kubenswrapper[4707]: E0121 15:26:06.277777 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:26:06 crc kubenswrapper[4707]: E0121 15:26:06.277824 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift podName:d2152261-adb2-4974-bd02-1e61bbd3d0b4 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:08.277798421 +0000 UTC m=+1465.459314644 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift") pod "swift-storage-0" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4") : configmap "swift-ring-files" not found Jan 21 15:26:06 crc kubenswrapper[4707]: I0121 15:26:06.486600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jw749"] Jan 21 15:26:07 crc kubenswrapper[4707]: I0121 15:26:07.099928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" event={"ID":"625c54ec-3c40-4fd5-a2f9-583dedf97df7","Type":"ContainerStarted","Data":"d1e0f194bb5efcc61be86150df728c212bfcbb440d89e9a94e9e2ba84136344f"} Jan 21 15:26:07 crc kubenswrapper[4707]: I0121 15:26:07.100118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" event={"ID":"625c54ec-3c40-4fd5-a2f9-583dedf97df7","Type":"ContainerStarted","Data":"fff5d697fc86d2a95b7d5abbca2f5920b6fe7bba746a1244275d0b4c51e6423b"} Jan 21 15:26:07 crc kubenswrapper[4707]: I0121 15:26:07.114871 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" podStartSLOduration=3.114861048 podStartE2EDuration="3.114861048s" podCreationTimestamp="2026-01-21 15:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:07.110916845 +0000 UTC m=+1464.292433068" watchObservedRunningTime="2026-01-21 15:26:07.114861048 +0000 UTC m=+1464.296377271" Jan 21 15:26:07 crc kubenswrapper[4707]: I0121 15:26:07.188761 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e" path="/var/lib/kubelet/pods/0e08f792-e6f0-4cdc-91f5-ee6cd6b49a6e/volumes" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.305642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:08 crc kubenswrapper[4707]: E0121 15:26:08.305866 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:26:08 crc kubenswrapper[4707]: E0121 15:26:08.305891 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:26:08 crc kubenswrapper[4707]: E0121 15:26:08.305942 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift podName:d2152261-adb2-4974-bd02-1e61bbd3d0b4 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:26:12.305927594 +0000 UTC m=+1469.487443816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift") pod "swift-storage-0" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4") : configmap "swift-ring-files" not found Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.562668 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzxvp"] Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.563582 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.565735 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.567891 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzxvp"] Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.711081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c9637bb-5239-4d8b-b416-64992331127c-operator-scripts\") pod \"root-account-create-update-jzxvp\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.711237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/3c9637bb-5239-4d8b-b416-64992331127c-kube-api-access-mbkln\") pod \"root-account-create-update-jzxvp\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.812900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/3c9637bb-5239-4d8b-b416-64992331127c-kube-api-access-mbkln\") pod \"root-account-create-update-jzxvp\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.813044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c9637bb-5239-4d8b-b416-64992331127c-operator-scripts\") pod \"root-account-create-update-jzxvp\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.813660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c9637bb-5239-4d8b-b416-64992331127c-operator-scripts\") pod \"root-account-create-update-jzxvp\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.828383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/3c9637bb-5239-4d8b-b416-64992331127c-kube-api-access-mbkln\") pod \"root-account-create-update-jzxvp\" (UID: 
\"3c9637bb-5239-4d8b-b416-64992331127c\") " pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:08 crc kubenswrapper[4707]: I0121 15:26:08.876543 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:09 crc kubenswrapper[4707]: I0121 15:26:09.243062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzxvp"] Jan 21 15:26:09 crc kubenswrapper[4707]: W0121 15:26:09.250662 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c9637bb_5239_4d8b_b416_64992331127c.slice/crio-ab90bc7f9be7440b5cd7c726e565f63eb23633f67ab7817854dbb6c20be36a1d WatchSource:0}: Error finding container ab90bc7f9be7440b5cd7c726e565f63eb23633f67ab7817854dbb6c20be36a1d: Status 404 returned error can't find the container with id ab90bc7f9be7440b5cd7c726e565f63eb23633f67ab7817854dbb6c20be36a1d Jan 21 15:26:09 crc kubenswrapper[4707]: I0121 15:26:09.945792 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:26:09 crc kubenswrapper[4707]: I0121 15:26:09.946634 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:26:09 crc kubenswrapper[4707]: I0121 15:26:09.946737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:26:09 crc kubenswrapper[4707]: I0121 15:26:09.947257 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8593fceae8f46f5fa87aa536502a4e355cae8485d3fdcdf096b66dd3daf7b85e"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:26:09 crc kubenswrapper[4707]: I0121 15:26:09.947378 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://8593fceae8f46f5fa87aa536502a4e355cae8485d3fdcdf096b66dd3daf7b85e" gracePeriod=600 Jan 21 15:26:10 crc kubenswrapper[4707]: I0121 15:26:10.122575 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c9637bb-5239-4d8b-b416-64992331127c" containerID="0912e37e74ceeb9d3d9eb417056786f9f77c16bb553ec5e0be0c220aa1c5ddeb" exitCode=0 Jan 21 15:26:10 crc kubenswrapper[4707]: I0121 15:26:10.122625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jzxvp" event={"ID":"3c9637bb-5239-4d8b-b416-64992331127c","Type":"ContainerDied","Data":"0912e37e74ceeb9d3d9eb417056786f9f77c16bb553ec5e0be0c220aa1c5ddeb"} Jan 21 15:26:10 crc kubenswrapper[4707]: I0121 15:26:10.122675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/root-account-create-update-jzxvp" event={"ID":"3c9637bb-5239-4d8b-b416-64992331127c","Type":"ContainerStarted","Data":"ab90bc7f9be7440b5cd7c726e565f63eb23633f67ab7817854dbb6c20be36a1d"} Jan 21 15:26:10 crc kubenswrapper[4707]: I0121 15:26:10.126161 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="8593fceae8f46f5fa87aa536502a4e355cae8485d3fdcdf096b66dd3daf7b85e" exitCode=0 Jan 21 15:26:10 crc kubenswrapper[4707]: I0121 15:26:10.126201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"8593fceae8f46f5fa87aa536502a4e355cae8485d3fdcdf096b66dd3daf7b85e"} Jan 21 15:26:10 crc kubenswrapper[4707]: I0121 15:26:10.126254 4707 scope.go:117] "RemoveContainer" containerID="8bb97b7a6a7cd703c4244311ac8e9e2660a8fee90a5d7f829eaf760d2141c0e2" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.133662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a"} Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.271436 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-52g8k"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.272316 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.277156 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-52g8k"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.368493 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.369288 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.371066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.372957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.425044 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.469760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sc4g\" (UniqueName: \"kubernetes.io/projected/4fb37062-6ccc-4f42-9bc2-e553db1e788b-kube-api-access-4sc4g\") pod \"keystone-db-create-52g8k\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.469863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqm89\" (UniqueName: \"kubernetes.io/projected/2856c284-39e8-469c-b288-422cc5f3205c-kube-api-access-vqm89\") pod \"keystone-b6c7-account-create-update-4rttk\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.469888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2856c284-39e8-469c-b288-422cc5f3205c-operator-scripts\") pod \"keystone-b6c7-account-create-update-4rttk\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.469950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb37062-6ccc-4f42-9bc2-e553db1e788b-operator-scripts\") pod \"keystone-db-create-52g8k\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.562217 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-7xszt"] Jan 21 15:26:11 crc kubenswrapper[4707]: E0121 15:26:11.562492 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c9637bb-5239-4d8b-b416-64992331127c" containerName="mariadb-account-create-update" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.562508 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c9637bb-5239-4d8b-b416-64992331127c" containerName="mariadb-account-create-update" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.562642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c9637bb-5239-4d8b-b416-64992331127c" containerName="mariadb-account-create-update" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.563334 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.571266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/3c9637bb-5239-4d8b-b416-64992331127c-kube-api-access-mbkln\") pod \"3c9637bb-5239-4d8b-b416-64992331127c\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.571430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c9637bb-5239-4d8b-b416-64992331127c-operator-scripts\") pod \"3c9637bb-5239-4d8b-b416-64992331127c\" (UID: \"3c9637bb-5239-4d8b-b416-64992331127c\") " Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.572983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c9637bb-5239-4d8b-b416-64992331127c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c9637bb-5239-4d8b-b416-64992331127c" (UID: "3c9637bb-5239-4d8b-b416-64992331127c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sc4g\" (UniqueName: \"kubernetes.io/projected/4fb37062-6ccc-4f42-9bc2-e553db1e788b-kube-api-access-4sc4g\") pod \"keystone-db-create-52g8k\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdlw9\" (UniqueName: \"kubernetes.io/projected/8e4c7891-17ea-4609-865e-ed35416688f0-kube-api-access-tdlw9\") pod \"placement-db-create-7xszt\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e4c7891-17ea-4609-865e-ed35416688f0-operator-scripts\") pod \"placement-db-create-7xszt\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqm89\" (UniqueName: \"kubernetes.io/projected/2856c284-39e8-469c-b288-422cc5f3205c-kube-api-access-vqm89\") pod \"keystone-b6c7-account-create-update-4rttk\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2856c284-39e8-469c-b288-422cc5f3205c-operator-scripts\") pod \"keystone-b6c7-account-create-update-4rttk\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/4fb37062-6ccc-4f42-9bc2-e553db1e788b-operator-scripts\") pod \"keystone-db-create-52g8k\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.574623 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c9637bb-5239-4d8b-b416-64992331127c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.575171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb37062-6ccc-4f42-9bc2-e553db1e788b-operator-scripts\") pod \"keystone-db-create-52g8k\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.575229 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-7xszt"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.575796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2856c284-39e8-469c-b288-422cc5f3205c-operator-scripts\") pod \"keystone-b6c7-account-create-update-4rttk\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.592216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqm89\" (UniqueName: \"kubernetes.io/projected/2856c284-39e8-469c-b288-422cc5f3205c-kube-api-access-vqm89\") pod \"keystone-b6c7-account-create-update-4rttk\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.592988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sc4g\" (UniqueName: \"kubernetes.io/projected/4fb37062-6ccc-4f42-9bc2-e553db1e788b-kube-api-access-4sc4g\") pod \"keystone-db-create-52g8k\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.597946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c9637bb-5239-4d8b-b416-64992331127c-kube-api-access-mbkln" (OuterVolumeSpecName: "kube-api-access-mbkln") pod "3c9637bb-5239-4d8b-b416-64992331127c" (UID: "3c9637bb-5239-4d8b-b416-64992331127c"). InnerVolumeSpecName "kube-api-access-mbkln". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.672055 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-408c-account-create-update-7pgf5"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.673130 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.676133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdlw9\" (UniqueName: \"kubernetes.io/projected/8e4c7891-17ea-4609-865e-ed35416688f0-kube-api-access-tdlw9\") pod \"placement-db-create-7xszt\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.676256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e4c7891-17ea-4609-865e-ed35416688f0-operator-scripts\") pod \"placement-db-create-7xszt\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.676385 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/3c9637bb-5239-4d8b-b416-64992331127c-kube-api-access-mbkln\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.676802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e4c7891-17ea-4609-865e-ed35416688f0-operator-scripts\") pod \"placement-db-create-7xszt\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.678032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.683069 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-408c-account-create-update-7pgf5"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.692437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdlw9\" (UniqueName: \"kubernetes.io/projected/8e4c7891-17ea-4609-865e-ed35416688f0-kube-api-access-tdlw9\") pod \"placement-db-create-7xszt\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.723296 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.766869 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-48sg8"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.767717 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.771672 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-48sg8"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.777621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-operator-scripts\") pod \"glance-db-create-48sg8\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.777668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rblhv\" (UniqueName: \"kubernetes.io/projected/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-kube-api-access-rblhv\") pod \"glance-db-create-48sg8\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.777925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnlhb\" (UniqueName: \"kubernetes.io/projected/832d985c-f2bb-47c8-b4ae-95cbfc58b024-kube-api-access-vnlhb\") pod \"placement-408c-account-create-update-7pgf5\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.777963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/832d985c-f2bb-47c8-b4ae-95cbfc58b024-operator-scripts\") pod \"placement-408c-account-create-update-7pgf5\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: E0121 15:26:11.826844 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625c54ec_3c40_4fd5_a2f9_583dedf97df7.slice/crio-d1e0f194bb5efcc61be86150df728c212bfcbb440d89e9a94e9e2ba84136344f.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.866715 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.867777 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.869409 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.872150 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl"] Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.879991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-operator-scripts\") pod \"glance-db-create-48sg8\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.880145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rblhv\" (UniqueName: \"kubernetes.io/projected/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-kube-api-access-rblhv\") pod \"glance-db-create-48sg8\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.880316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9n5d\" (UniqueName: \"kubernetes.io/projected/ac82c533-38ae-4570-9826-a85958afa2c0-kube-api-access-s9n5d\") pod \"glance-d5d6-account-create-update-kt8jl\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.880832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-operator-scripts\") pod \"glance-db-create-48sg8\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.881976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlhb\" (UniqueName: \"kubernetes.io/projected/832d985c-f2bb-47c8-b4ae-95cbfc58b024-kube-api-access-vnlhb\") pod \"placement-408c-account-create-update-7pgf5\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.882727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/832d985c-f2bb-47c8-b4ae-95cbfc58b024-operator-scripts\") pod \"placement-408c-account-create-update-7pgf5\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.884157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/832d985c-f2bb-47c8-b4ae-95cbfc58b024-operator-scripts\") pod \"placement-408c-account-create-update-7pgf5\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.884306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac82c533-38ae-4570-9826-a85958afa2c0-operator-scripts\") pod \"glance-d5d6-account-create-update-kt8jl\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.884408 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.893767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rblhv\" (UniqueName: \"kubernetes.io/projected/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-kube-api-access-rblhv\") pod \"glance-db-create-48sg8\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.894728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlhb\" (UniqueName: \"kubernetes.io/projected/832d985c-f2bb-47c8-b4ae-95cbfc58b024-kube-api-access-vnlhb\") pod \"placement-408c-account-create-update-7pgf5\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.926239 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.985727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9n5d\" (UniqueName: \"kubernetes.io/projected/ac82c533-38ae-4570-9826-a85958afa2c0-kube-api-access-s9n5d\") pod \"glance-d5d6-account-create-update-kt8jl\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.985850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac82c533-38ae-4570-9826-a85958afa2c0-operator-scripts\") pod \"glance-d5d6-account-create-update-kt8jl\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.986442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac82c533-38ae-4570-9826-a85958afa2c0-operator-scripts\") pod \"glance-d5d6-account-create-update-kt8jl\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:11 crc kubenswrapper[4707]: I0121 15:26:11.988144 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.003931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9n5d\" (UniqueName: \"kubernetes.io/projected/ac82c533-38ae-4570-9826-a85958afa2c0-kube-api-access-s9n5d\") pod \"glance-d5d6-account-create-update-kt8jl\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.098788 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.106740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk"] Jan 21 15:26:12 crc kubenswrapper[4707]: W0121 15:26:12.119693 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2856c284_39e8_469c_b288_422cc5f3205c.slice/crio-b9ddfdc26b64ce7f5e8469ae2a3e8794c0c2e552031c2e47ef6aee7c70c9f0c3 WatchSource:0}: Error finding container b9ddfdc26b64ce7f5e8469ae2a3e8794c0c2e552031c2e47ef6aee7c70c9f0c3: Status 404 returned error can't find the container with id b9ddfdc26b64ce7f5e8469ae2a3e8794c0c2e552031c2e47ef6aee7c70c9f0c3 Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.142098 4707 generic.go:334] "Generic (PLEG): container finished" podID="625c54ec-3c40-4fd5-a2f9-583dedf97df7" containerID="d1e0f194bb5efcc61be86150df728c212bfcbb440d89e9a94e9e2ba84136344f" exitCode=0 Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.142155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" event={"ID":"625c54ec-3c40-4fd5-a2f9-583dedf97df7","Type":"ContainerDied","Data":"d1e0f194bb5efcc61be86150df728c212bfcbb440d89e9a94e9e2ba84136344f"} Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.144053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jzxvp" event={"ID":"3c9637bb-5239-4d8b-b416-64992331127c","Type":"ContainerDied","Data":"ab90bc7f9be7440b5cd7c726e565f63eb23633f67ab7817854dbb6c20be36a1d"} Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.144078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzxvp" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.144088 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab90bc7f9be7440b5cd7c726e565f63eb23633f67ab7817854dbb6c20be36a1d" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.145037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" event={"ID":"2856c284-39e8-469c-b288-422cc5f3205c","Type":"ContainerStarted","Data":"b9ddfdc26b64ce7f5e8469ae2a3e8794c0c2e552031c2e47ef6aee7c70c9f0c3"} Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.187102 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:12 crc kubenswrapper[4707]: W0121 15:26:12.265674 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb37062_6ccc_4f42_9bc2_e553db1e788b.slice/crio-c8ebc22fcb692e5e3113fb40e2a15d307dee336feacfa56a2605098ecd4e797a WatchSource:0}: Error finding container c8ebc22fcb692e5e3113fb40e2a15d307dee336feacfa56a2605098ecd4e797a: Status 404 returned error can't find the container with id c8ebc22fcb692e5e3113fb40e2a15d307dee336feacfa56a2605098ecd4e797a Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.268211 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-52g8k"] Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.334144 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-7xszt"] Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.392798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.398274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"swift-storage-0\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.426179 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-408c-account-create-update-7pgf5"] Jan 21 15:26:12 crc kubenswrapper[4707]: W0121 15:26:12.426864 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod832d985c_f2bb_47c8_b4ae_95cbfc58b024.slice/crio-fdd868a7188d3911824e1ff59a006c88a3d483c5a3418761e5aa56f4e72bd227 WatchSource:0}: Error finding container fdd868a7188d3911824e1ff59a006c88a3d483c5a3418761e5aa56f4e72bd227: Status 404 returned error can't find the container with id fdd868a7188d3911824e1ff59a006c88a3d483c5a3418761e5aa56f4e72bd227 Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.498235 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-48sg8"] Jan 21 15:26:12 crc kubenswrapper[4707]: W0121 15:26:12.515384 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b55957d_e7aa_4c3b_b4d7_d6e14c3b5927.slice/crio-22fd3c2f476c382e8d12046db2f2c532f736a9deef6486ddb03d67d37a070cfc WatchSource:0}: Error finding container 22fd3c2f476c382e8d12046db2f2c532f736a9deef6486ddb03d67d37a070cfc: Status 404 returned error can't find the container with id 22fd3c2f476c382e8d12046db2f2c532f736a9deef6486ddb03d67d37a070cfc Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.547901 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.574094 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl"] Jan 21 15:26:12 crc kubenswrapper[4707]: W0121 15:26:12.600565 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac82c533_38ae_4570_9826_a85958afa2c0.slice/crio-82dcf3d707c7d105002597d21d97cafbef66d961a134dfa11d897513de249643 WatchSource:0}: Error finding container 82dcf3d707c7d105002597d21d97cafbef66d961a134dfa11d897513de249643: Status 404 returned error can't find the container with id 82dcf3d707c7d105002597d21d97cafbef66d961a134dfa11d897513de249643 Jan 21 15:26:12 crc kubenswrapper[4707]: I0121 15:26:12.915874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.151629 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac82c533-38ae-4570-9826-a85958afa2c0" containerID="02e28dadd8fc9e9b163dde4dbf029a252158949885ad335282d6766892c01e0e" exitCode=0 Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.151721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" event={"ID":"ac82c533-38ae-4570-9826-a85958afa2c0","Type":"ContainerDied","Data":"02e28dadd8fc9e9b163dde4dbf029a252158949885ad335282d6766892c01e0e"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.151756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" event={"ID":"ac82c533-38ae-4570-9826-a85958afa2c0","Type":"ContainerStarted","Data":"82dcf3d707c7d105002597d21d97cafbef66d961a134dfa11d897513de249643"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.152649 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" containerID="1bea5fb4ea7834020b5704135d41ce7f5437b5498e017d423fba0f6f0c7c8d38" exitCode=0 Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.152693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-48sg8" event={"ID":"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927","Type":"ContainerDied","Data":"1bea5fb4ea7834020b5704135d41ce7f5437b5498e017d423fba0f6f0c7c8d38"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.152707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-48sg8" event={"ID":"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927","Type":"ContainerStarted","Data":"22fd3c2f476c382e8d12046db2f2c532f736a9deef6486ddb03d67d37a070cfc"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.155123 4707 generic.go:334] "Generic (PLEG): container finished" podID="832d985c-f2bb-47c8-b4ae-95cbfc58b024" containerID="8becf27fcce8f470863c589f92c7787a471ea570cea12e938de611fc3ca3f044" exitCode=0 Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.155185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" event={"ID":"832d985c-f2bb-47c8-b4ae-95cbfc58b024","Type":"ContainerDied","Data":"8becf27fcce8f470863c589f92c7787a471ea570cea12e938de611fc3ca3f044"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.155222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" event={"ID":"832d985c-f2bb-47c8-b4ae-95cbfc58b024","Type":"ContainerStarted","Data":"fdd868a7188d3911824e1ff59a006c88a3d483c5a3418761e5aa56f4e72bd227"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.156338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.156364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"51bd4a509c9056061034c28ba83642bb715ed15d21b2887dc9e334d4ca3fde09"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.158881 4707 generic.go:334] "Generic (PLEG): container finished" podID="2856c284-39e8-469c-b288-422cc5f3205c" containerID="aa6c42ba172b95b7cc1127e455450f045ffbfb0a3f4c0d221599cca192bdf08e" exitCode=0 Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.158934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" event={"ID":"2856c284-39e8-469c-b288-422cc5f3205c","Type":"ContainerDied","Data":"aa6c42ba172b95b7cc1127e455450f045ffbfb0a3f4c0d221599cca192bdf08e"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.163120 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e4c7891-17ea-4609-865e-ed35416688f0" containerID="ee4846a1605c12513d522993ab369007eadfe8110bdee639d33d4b47837aad9c" exitCode=0 Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.163175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-7xszt" event={"ID":"8e4c7891-17ea-4609-865e-ed35416688f0","Type":"ContainerDied","Data":"ee4846a1605c12513d522993ab369007eadfe8110bdee639d33d4b47837aad9c"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.163202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-7xszt" event={"ID":"8e4c7891-17ea-4609-865e-ed35416688f0","Type":"ContainerStarted","Data":"16b79989123ddae60dfb5b970a08cb978bda996b8f510c0ac818f763db9e19cf"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.164425 4707 generic.go:334] "Generic (PLEG): container finished" podID="4fb37062-6ccc-4f42-9bc2-e553db1e788b" containerID="a33add28ee54c3f16a7ed662f08ebebecc8c61219584e3106caabddc6c3f4203" exitCode=0 Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.164583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-52g8k" event={"ID":"4fb37062-6ccc-4f42-9bc2-e553db1e788b","Type":"ContainerDied","Data":"a33add28ee54c3f16a7ed662f08ebebecc8c61219584e3106caabddc6c3f4203"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.164604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-52g8k" event={"ID":"4fb37062-6ccc-4f42-9bc2-e553db1e788b","Type":"ContainerStarted","Data":"c8ebc22fcb692e5e3113fb40e2a15d307dee336feacfa56a2605098ecd4e797a"} Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.466981 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-ring-data-devices\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs4tx\" (UniqueName: \"kubernetes.io/projected/625c54ec-3c40-4fd5-a2f9-583dedf97df7-kube-api-access-fs4tx\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-swiftconf\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-dispersionconf\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-scripts\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-combined-ca-bundle\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/625c54ec-3c40-4fd5-a2f9-583dedf97df7-etc-swift\") pod \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\" (UID: \"625c54ec-3c40-4fd5-a2f9-583dedf97df7\") " Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.516849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.517342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/625c54ec-3c40-4fd5-a2f9-583dedf97df7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.520701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625c54ec-3c40-4fd5-a2f9-583dedf97df7-kube-api-access-fs4tx" (OuterVolumeSpecName: "kube-api-access-fs4tx") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "kube-api-access-fs4tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.523972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.542168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.542952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-scripts" (OuterVolumeSpecName: "scripts") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.558871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "625c54ec-3c40-4fd5-a2f9-583dedf97df7" (UID: "625c54ec-3c40-4fd5-a2f9-583dedf97df7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617590 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617617 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617628 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/625c54ec-3c40-4fd5-a2f9-583dedf97df7-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617637 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/625c54ec-3c40-4fd5-a2f9-583dedf97df7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617645 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs4tx\" (UniqueName: \"kubernetes.io/projected/625c54ec-3c40-4fd5-a2f9-583dedf97df7-kube-api-access-fs4tx\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617652 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:13 crc kubenswrapper[4707]: I0121 15:26:13.617660 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/625c54ec-3c40-4fd5-a2f9-583dedf97df7-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174966 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.174988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.175693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" event={"ID":"625c54ec-3c40-4fd5-a2f9-583dedf97df7","Type":"ContainerDied","Data":"fff5d697fc86d2a95b7d5abbca2f5920b6fe7bba746a1244275d0b4c51e6423b"} Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.175723 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff5d697fc86d2a95b7d5abbca2f5920b6fe7bba746a1244275d0b4c51e6423b" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.175725 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-jw749" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.631974 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.742182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqm89\" (UniqueName: \"kubernetes.io/projected/2856c284-39e8-469c-b288-422cc5f3205c-kube-api-access-vqm89\") pod \"2856c284-39e8-469c-b288-422cc5f3205c\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.742426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2856c284-39e8-469c-b288-422cc5f3205c-operator-scripts\") pod \"2856c284-39e8-469c-b288-422cc5f3205c\" (UID: \"2856c284-39e8-469c-b288-422cc5f3205c\") " Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.743396 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2856c284-39e8-469c-b288-422cc5f3205c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2856c284-39e8-469c-b288-422cc5f3205c" (UID: "2856c284-39e8-469c-b288-422cc5f3205c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.764356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2856c284-39e8-469c-b288-422cc5f3205c-kube-api-access-vqm89" (OuterVolumeSpecName: "kube-api-access-vqm89") pod "2856c284-39e8-469c-b288-422cc5f3205c" (UID: "2856c284-39e8-469c-b288-422cc5f3205c"). InnerVolumeSpecName "kube-api-access-vqm89". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.821062 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.844272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-operator-scripts\") pod \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.844318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rblhv\" (UniqueName: \"kubernetes.io/projected/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-kube-api-access-rblhv\") pod \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\" (UID: \"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927\") " Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.845734 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqm89\" (UniqueName: \"kubernetes.io/projected/2856c284-39e8-469c-b288-422cc5f3205c-kube-api-access-vqm89\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.845763 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2856c284-39e8-469c-b288-422cc5f3205c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.849976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" (UID: "3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.856465 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-kube-api-access-rblhv" (OuterVolumeSpecName: "kube-api-access-rblhv") pod "3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" (UID: "3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927"). InnerVolumeSpecName "kube-api-access-rblhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.886941 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.922283 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzxvp"] Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.930408 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzxvp"] Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.946113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/832d985c-f2bb-47c8-b4ae-95cbfc58b024-operator-scripts\") pod \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.946308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnlhb\" (UniqueName: \"kubernetes.io/projected/832d985c-f2bb-47c8-b4ae-95cbfc58b024-kube-api-access-vnlhb\") pod \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\" (UID: \"832d985c-f2bb-47c8-b4ae-95cbfc58b024\") " Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.946580 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.946598 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rblhv\" (UniqueName: \"kubernetes.io/projected/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927-kube-api-access-rblhv\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.946931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/832d985c-f2bb-47c8-b4ae-95cbfc58b024-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "832d985c-f2bb-47c8-b4ae-95cbfc58b024" (UID: "832d985c-f2bb-47c8-b4ae-95cbfc58b024"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.949479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832d985c-f2bb-47c8-b4ae-95cbfc58b024-kube-api-access-vnlhb" (OuterVolumeSpecName: "kube-api-access-vnlhb") pod "832d985c-f2bb-47c8-b4ae-95cbfc58b024" (UID: "832d985c-f2bb-47c8-b4ae-95cbfc58b024"). InnerVolumeSpecName "kube-api-access-vnlhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.951211 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.965053 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:14 crc kubenswrapper[4707]: I0121 15:26:14.977988 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e4c7891-17ea-4609-865e-ed35416688f0-operator-scripts\") pod \"8e4c7891-17ea-4609-865e-ed35416688f0\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sc4g\" (UniqueName: \"kubernetes.io/projected/4fb37062-6ccc-4f42-9bc2-e553db1e788b-kube-api-access-4sc4g\") pod \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9n5d\" (UniqueName: \"kubernetes.io/projected/ac82c533-38ae-4570-9826-a85958afa2c0-kube-api-access-s9n5d\") pod \"ac82c533-38ae-4570-9826-a85958afa2c0\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac82c533-38ae-4570-9826-a85958afa2c0-operator-scripts\") pod \"ac82c533-38ae-4570-9826-a85958afa2c0\" (UID: \"ac82c533-38ae-4570-9826-a85958afa2c0\") " Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb37062-6ccc-4f42-9bc2-e553db1e788b-operator-scripts\") pod \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\" (UID: \"4fb37062-6ccc-4f42-9bc2-e553db1e788b\") " Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdlw9\" (UniqueName: \"kubernetes.io/projected/8e4c7891-17ea-4609-865e-ed35416688f0-kube-api-access-tdlw9\") pod \"8e4c7891-17ea-4609-865e-ed35416688f0\" (UID: \"8e4c7891-17ea-4609-865e-ed35416688f0\") " Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e4c7891-17ea-4609-865e-ed35416688f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e4c7891-17ea-4609-865e-ed35416688f0" (UID: "8e4c7891-17ea-4609-865e-ed35416688f0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047761 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnlhb\" (UniqueName: \"kubernetes.io/projected/832d985c-f2bb-47c8-b4ae-95cbfc58b024-kube-api-access-vnlhb\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047773 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/832d985c-f2bb-47c8-b4ae-95cbfc58b024-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac82c533-38ae-4570-9826-a85958afa2c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac82c533-38ae-4570-9826-a85958afa2c0" (UID: "ac82c533-38ae-4570-9826-a85958afa2c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.047890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb37062-6ccc-4f42-9bc2-e553db1e788b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fb37062-6ccc-4f42-9bc2-e553db1e788b" (UID: "4fb37062-6ccc-4f42-9bc2-e553db1e788b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.049977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac82c533-38ae-4570-9826-a85958afa2c0-kube-api-access-s9n5d" (OuterVolumeSpecName: "kube-api-access-s9n5d") pod "ac82c533-38ae-4570-9826-a85958afa2c0" (UID: "ac82c533-38ae-4570-9826-a85958afa2c0"). InnerVolumeSpecName "kube-api-access-s9n5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.050421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4c7891-17ea-4609-865e-ed35416688f0-kube-api-access-tdlw9" (OuterVolumeSpecName: "kube-api-access-tdlw9") pod "8e4c7891-17ea-4609-865e-ed35416688f0" (UID: "8e4c7891-17ea-4609-865e-ed35416688f0"). InnerVolumeSpecName "kube-api-access-tdlw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.051937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb37062-6ccc-4f42-9bc2-e553db1e788b-kube-api-access-4sc4g" (OuterVolumeSpecName: "kube-api-access-4sc4g") pod "4fb37062-6ccc-4f42-9bc2-e553db1e788b" (UID: "4fb37062-6ccc-4f42-9bc2-e553db1e788b"). InnerVolumeSpecName "kube-api-access-4sc4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.149329 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e4c7891-17ea-4609-865e-ed35416688f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.149358 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sc4g\" (UniqueName: \"kubernetes.io/projected/4fb37062-6ccc-4f42-9bc2-e553db1e788b-kube-api-access-4sc4g\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.149370 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9n5d\" (UniqueName: \"kubernetes.io/projected/ac82c533-38ae-4570-9826-a85958afa2c0-kube-api-access-s9n5d\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.149378 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac82c533-38ae-4570-9826-a85958afa2c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.149386 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb37062-6ccc-4f42-9bc2-e553db1e788b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.149394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdlw9\" (UniqueName: \"kubernetes.io/projected/8e4c7891-17ea-4609-865e-ed35416688f0-kube-api-access-tdlw9\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.188071 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.188800 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c9637bb-5239-4d8b-b416-64992331127c" path="/var/lib/kubelet/pods/3c9637bb-5239-4d8b-b416-64992331127c/volumes" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.190520 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-7xszt" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerStarted","Data":"f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk" event={"ID":"2856c284-39e8-469c-b288-422cc5f3205c","Type":"ContainerDied","Data":"b9ddfdc26b64ce7f5e8469ae2a3e8794c0c2e552031c2e47ef6aee7c70c9f0c3"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196355 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9ddfdc26b64ce7f5e8469ae2a3e8794c0c2e552031c2e47ef6aee7c70c9f0c3" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-7xszt" event={"ID":"8e4c7891-17ea-4609-865e-ed35416688f0","Type":"ContainerDied","Data":"16b79989123ddae60dfb5b970a08cb978bda996b8f510c0ac818f763db9e19cf"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.196373 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16b79989123ddae60dfb5b970a08cb978bda996b8f510c0ac818f763db9e19cf" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.202850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-52g8k" event={"ID":"4fb37062-6ccc-4f42-9bc2-e553db1e788b","Type":"ContainerDied","Data":"c8ebc22fcb692e5e3113fb40e2a15d307dee336feacfa56a2605098ecd4e797a"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.202883 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8ebc22fcb692e5e3113fb40e2a15d307dee336feacfa56a2605098ecd4e797a" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.202858 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-52g8k" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.204135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" event={"ID":"ac82c533-38ae-4570-9826-a85958afa2c0","Type":"ContainerDied","Data":"82dcf3d707c7d105002597d21d97cafbef66d961a134dfa11d897513de249643"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.204148 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.204153 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82dcf3d707c7d105002597d21d97cafbef66d961a134dfa11d897513de249643" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.206457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-48sg8" event={"ID":"3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927","Type":"ContainerDied","Data":"22fd3c2f476c382e8d12046db2f2c532f736a9deef6486ddb03d67d37a070cfc"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.206483 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22fd3c2f476c382e8d12046db2f2c532f736a9deef6486ddb03d67d37a070cfc" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.206524 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-48sg8" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.213651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" event={"ID":"832d985c-f2bb-47c8-b4ae-95cbfc58b024","Type":"ContainerDied","Data":"fdd868a7188d3911824e1ff59a006c88a3d483c5a3418761e5aa56f4e72bd227"} Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.213682 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd868a7188d3911824e1ff59a006c88a3d483c5a3418761e5aa56f4e72bd227" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.213689 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-408c-account-create-update-7pgf5" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.217728 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=12.217713119 podStartE2EDuration="12.217713119s" podCreationTimestamp="2026-01-21 15:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:15.207175967 +0000 UTC m=+1472.388692189" watchObservedRunningTime="2026-01-21 15:26:15.217713119 +0000 UTC m=+1472.399229341" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.314867 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m"] Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315130 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb37062-6ccc-4f42-9bc2-e553db1e788b" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315147 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb37062-6ccc-4f42-9bc2-e553db1e788b" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315159 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832d985c-f2bb-47c8-b4ae-95cbfc58b024" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315165 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="832d985c-f2bb-47c8-b4ae-95cbfc58b024" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315173 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2856c284-39e8-469c-b288-422cc5f3205c" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2856c284-39e8-469c-b288-422cc5f3205c" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac82c533-38ae-4570-9826-a85958afa2c0" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315203 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac82c533-38ae-4570-9826-a85958afa2c0" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315214 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625c54ec-3c40-4fd5-a2f9-583dedf97df7" containerName="swift-ring-rebalance" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="625c54ec-3c40-4fd5-a2f9-583dedf97df7" containerName="swift-ring-rebalance" Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315228 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315234 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: E0121 15:26:15.315240 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4c7891-17ea-4609-865e-ed35416688f0" 
containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4c7891-17ea-4609-865e-ed35416688f0" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315394 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="625c54ec-3c40-4fd5-a2f9-583dedf97df7" containerName="swift-ring-rebalance" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315408 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb37062-6ccc-4f42-9bc2-e553db1e788b" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315416 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315427 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="832d985c-f2bb-47c8-b4ae-95cbfc58b024" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac82c533-38ae-4570-9826-a85958afa2c0" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315445 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2856c284-39e8-469c-b288-422cc5f3205c" containerName="mariadb-account-create-update" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.315458 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4c7891-17ea-4609-865e-ed35416688f0" containerName="mariadb-database-create" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.316118 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.317463 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.326014 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m"] Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.351864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.351900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-config\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.351971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.352004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbcdl\" (UniqueName: \"kubernetes.io/projected/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-kube-api-access-jbcdl\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.452965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.453216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-config\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.453261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.453286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbcdl\" 
(UniqueName: \"kubernetes.io/projected/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-kube-api-access-jbcdl\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.453976 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-config\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.454086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.454545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.466532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbcdl\" (UniqueName: \"kubernetes.io/projected/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-kube-api-access-jbcdl\") pod \"dnsmasq-dnsmasq-6bf5dd6689-jrt6m\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.558257 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:26:15 crc kubenswrapper[4707]: I0121 15:26:15.628743 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:16 crc kubenswrapper[4707]: I0121 15:26:16.018162 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m"] Jan 21 15:26:16 crc kubenswrapper[4707]: W0121 15:26:16.021096 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd327f80a_69ba_4e3d_97fa_fff0bfc98dd5.slice/crio-2b007c6a88d89f317074db587956c50c7cc1f817e8de9007ad276a848af17810 WatchSource:0}: Error finding container 2b007c6a88d89f317074db587956c50c7cc1f817e8de9007ad276a848af17810: Status 404 returned error can't find the container with id 2b007c6a88d89f317074db587956c50c7cc1f817e8de9007ad276a848af17810 Jan 21 15:26:16 crc kubenswrapper[4707]: I0121 15:26:16.220317 4707 generic.go:334] "Generic (PLEG): container finished" podID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerID="083e4cd32b052f682b3d16aab2992cbad4e1ff2d3385eda7a355f1297585116c" exitCode=0 Jan 21 15:26:16 crc kubenswrapper[4707]: I0121 15:26:16.220366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" event={"ID":"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5","Type":"ContainerDied","Data":"083e4cd32b052f682b3d16aab2992cbad4e1ff2d3385eda7a355f1297585116c"} Jan 21 15:26:16 crc kubenswrapper[4707]: I0121 15:26:16.220558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" event={"ID":"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5","Type":"ContainerStarted","Data":"2b007c6a88d89f317074db587956c50c7cc1f817e8de9007ad276a848af17810"} Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.012821 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-9mljp"] Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.013782 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.016960 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-9hnf8" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.017159 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.019669 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-9mljp"] Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.074956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-db-sync-config-data\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.075009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-combined-ca-bundle\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.075050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666vq\" (UniqueName: \"kubernetes.io/projected/0ce47035-1fcb-413a-b5ff-d3207523b1ef-kube-api-access-666vq\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.075279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-config-data\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.176594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-config-data\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.176660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-db-sync-config-data\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.176699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-combined-ca-bundle\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.176736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-666vq\" (UniqueName: \"kubernetes.io/projected/0ce47035-1fcb-413a-b5ff-d3207523b1ef-kube-api-access-666vq\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.180320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-db-sync-config-data\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.180365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-config-data\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.180391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-combined-ca-bundle\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.195007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666vq\" (UniqueName: \"kubernetes.io/projected/0ce47035-1fcb-413a-b5ff-d3207523b1ef-kube-api-access-666vq\") pod \"glance-db-sync-9mljp\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.227366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" event={"ID":"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5","Type":"ContainerStarted","Data":"9ee60ee038e2abfb8251554c9075cf75988811fa8396a5cc948a9afaf1fcf228"} Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.227486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.241540 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" podStartSLOduration=2.241529847 podStartE2EDuration="2.241529847s" podCreationTimestamp="2026-01-21 15:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:17.238025802 +0000 UTC m=+1474.419542024" watchObservedRunningTime="2026-01-21 15:26:17.241529847 +0000 UTC m=+1474.423046069" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.329174 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:17 crc kubenswrapper[4707]: I0121 15:26:17.679528 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-9mljp"] Jan 21 15:26:18 crc kubenswrapper[4707]: I0121 15:26:18.234267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-9mljp" event={"ID":"0ce47035-1fcb-413a-b5ff-d3207523b1ef","Type":"ContainerStarted","Data":"56b307e8ae01dc1efee73790e46167abc0507fd0d2427ed1c12896d92831d80b"} Jan 21 15:26:18 crc kubenswrapper[4707]: I0121 15:26:18.234490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-9mljp" event={"ID":"0ce47035-1fcb-413a-b5ff-d3207523b1ef","Type":"ContainerStarted","Data":"537b1ced5d2c487afc80bd40abbce6a101cb19a2f29055177cf8e023f15c05ea"} Jan 21 15:26:18 crc kubenswrapper[4707]: I0121 15:26:18.250403 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-9mljp" podStartSLOduration=1.250387413 podStartE2EDuration="1.250387413s" podCreationTimestamp="2026-01-21 15:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:18.243030096 +0000 UTC m=+1475.424546318" watchObservedRunningTime="2026-01-21 15:26:18.250387413 +0000 UTC m=+1475.431903635" Jan 21 15:26:19 crc kubenswrapper[4707]: I0121 15:26:19.924057 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g78qf"] Jan 21 15:26:19 crc kubenswrapper[4707]: I0121 15:26:19.925169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:19 crc kubenswrapper[4707]: I0121 15:26:19.931930 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g78qf"] Jan 21 15:26:19 crc kubenswrapper[4707]: I0121 15:26:19.937139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.015740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-operator-scripts\") pod \"root-account-create-update-g78qf\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.016023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmwnl\" (UniqueName: \"kubernetes.io/projected/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-kube-api-access-zmwnl\") pod \"root-account-create-update-g78qf\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.117240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-operator-scripts\") pod \"root-account-create-update-g78qf\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 
15:26:20.117360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmwnl\" (UniqueName: \"kubernetes.io/projected/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-kube-api-access-zmwnl\") pod \"root-account-create-update-g78qf\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.117962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-operator-scripts\") pod \"root-account-create-update-g78qf\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.132802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmwnl\" (UniqueName: \"kubernetes.io/projected/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-kube-api-access-zmwnl\") pod \"root-account-create-update-g78qf\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.238237 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:20 crc kubenswrapper[4707]: I0121 15:26:20.598768 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g78qf"] Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.280057 4707 generic.go:334] "Generic (PLEG): container finished" podID="d54b9100-265d-49a3-a2ab-72e8787510df" containerID="db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a" exitCode=0 Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.280129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"d54b9100-265d-49a3-a2ab-72e8787510df","Type":"ContainerDied","Data":"db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a"} Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.281539 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerID="37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855" exitCode=0 Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.281564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6bac3bc5-b4ca-4333-9fc6-851430c48708","Type":"ContainerDied","Data":"37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855"} Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.282700 4707 generic.go:334] "Generic (PLEG): container finished" podID="983aa7f0-fb2b-46d5-912a-c0d8b26bff20" containerID="158ef82383ebc483ed65ed5a15084ea8125ee0ba76ff7dc77f0ac7243069e6fa" exitCode=0 Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.282720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g78qf" event={"ID":"983aa7f0-fb2b-46d5-912a-c0d8b26bff20","Type":"ContainerDied","Data":"158ef82383ebc483ed65ed5a15084ea8125ee0ba76ff7dc77f0ac7243069e6fa"} Jan 21 15:26:21 crc kubenswrapper[4707]: I0121 15:26:21.282736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g78qf" 
event={"ID":"983aa7f0-fb2b-46d5-912a-c0d8b26bff20","Type":"ContainerStarted","Data":"84e2073d296e337d01404ffecc459d7d701d8ddc0b84476f847f5a99ea161d2d"} Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.290539 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ce47035-1fcb-413a-b5ff-d3207523b1ef" containerID="56b307e8ae01dc1efee73790e46167abc0507fd0d2427ed1c12896d92831d80b" exitCode=0 Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.290616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-9mljp" event={"ID":"0ce47035-1fcb-413a-b5ff-d3207523b1ef","Type":"ContainerDied","Data":"56b307e8ae01dc1efee73790e46167abc0507fd0d2427ed1c12896d92831d80b"} Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.292384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"d54b9100-265d-49a3-a2ab-72e8787510df","Type":"ContainerStarted","Data":"ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d"} Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.292571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.294129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6bac3bc5-b4ca-4333-9fc6-851430c48708","Type":"ContainerStarted","Data":"001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287"} Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.294374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.328372 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.328358083 podStartE2EDuration="36.328358083s" podCreationTimestamp="2026-01-21 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:22.324097226 +0000 UTC m=+1479.505613448" watchObservedRunningTime="2026-01-21 15:26:22.328358083 +0000 UTC m=+1479.509874306" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.340359 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.340328087 podStartE2EDuration="36.340328087s" podCreationTimestamp="2026-01-21 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:22.338468023 +0000 UTC m=+1479.519984244" watchObservedRunningTime="2026-01-21 15:26:22.340328087 +0000 UTC m=+1479.521844310" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.549165 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.652762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-operator-scripts\") pod \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.652899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmwnl\" (UniqueName: \"kubernetes.io/projected/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-kube-api-access-zmwnl\") pod \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\" (UID: \"983aa7f0-fb2b-46d5-912a-c0d8b26bff20\") " Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.653110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "983aa7f0-fb2b-46d5-912a-c0d8b26bff20" (UID: "983aa7f0-fb2b-46d5-912a-c0d8b26bff20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.653291 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.661214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-kube-api-access-zmwnl" (OuterVolumeSpecName: "kube-api-access-zmwnl") pod "983aa7f0-fb2b-46d5-912a-c0d8b26bff20" (UID: "983aa7f0-fb2b-46d5-912a-c0d8b26bff20"). InnerVolumeSpecName "kube-api-access-zmwnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:22 crc kubenswrapper[4707]: I0121 15:26:22.754775 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmwnl\" (UniqueName: \"kubernetes.io/projected/983aa7f0-fb2b-46d5-912a-c0d8b26bff20-kube-api-access-zmwnl\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.305954 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g78qf" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.306491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g78qf" event={"ID":"983aa7f0-fb2b-46d5-912a-c0d8b26bff20","Type":"ContainerDied","Data":"84e2073d296e337d01404ffecc459d7d701d8ddc0b84476f847f5a99ea161d2d"} Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.306511 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e2073d296e337d01404ffecc459d7d701d8ddc0b84476f847f5a99ea161d2d" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.575014 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.665765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-config-data\") pod \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.665856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-db-sync-config-data\") pod \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.666001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-combined-ca-bundle\") pod \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.666039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-666vq\" (UniqueName: \"kubernetes.io/projected/0ce47035-1fcb-413a-b5ff-d3207523b1ef-kube-api-access-666vq\") pod \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\" (UID: \"0ce47035-1fcb-413a-b5ff-d3207523b1ef\") " Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.669869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ce47035-1fcb-413a-b5ff-d3207523b1ef" (UID: "0ce47035-1fcb-413a-b5ff-d3207523b1ef"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.670119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce47035-1fcb-413a-b5ff-d3207523b1ef-kube-api-access-666vq" (OuterVolumeSpecName: "kube-api-access-666vq") pod "0ce47035-1fcb-413a-b5ff-d3207523b1ef" (UID: "0ce47035-1fcb-413a-b5ff-d3207523b1ef"). InnerVolumeSpecName "kube-api-access-666vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.693164 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ce47035-1fcb-413a-b5ff-d3207523b1ef" (UID: "0ce47035-1fcb-413a-b5ff-d3207523b1ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.700133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-config-data" (OuterVolumeSpecName: "config-data") pod "0ce47035-1fcb-413a-b5ff-d3207523b1ef" (UID: "0ce47035-1fcb-413a-b5ff-d3207523b1ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.767984 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.768013 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-666vq\" (UniqueName: \"kubernetes.io/projected/0ce47035-1fcb-413a-b5ff-d3207523b1ef-kube-api-access-666vq\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.768025 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:23 crc kubenswrapper[4707]: I0121 15:26:23.768034 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ce47035-1fcb-413a-b5ff-d3207523b1ef-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:24 crc kubenswrapper[4707]: I0121 15:26:24.311734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-9mljp" event={"ID":"0ce47035-1fcb-413a-b5ff-d3207523b1ef","Type":"ContainerDied","Data":"537b1ced5d2c487afc80bd40abbce6a101cb19a2f29055177cf8e023f15c05ea"} Jan 21 15:26:24 crc kubenswrapper[4707]: I0121 15:26:24.311764 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="537b1ced5d2c487afc80bd40abbce6a101cb19a2f29055177cf8e023f15c05ea" Jan 21 15:26:24 crc kubenswrapper[4707]: I0121 15:26:24.311775 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-9mljp" Jan 21 15:26:25 crc kubenswrapper[4707]: I0121 15:26:25.630746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:26:25 crc kubenswrapper[4707]: I0121 15:26:25.666157 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn"] Jan 21 15:26:25 crc kubenswrapper[4707]: I0121 15:26:25.666353 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" containerName="dnsmasq-dns" containerID="cri-o://2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98" gracePeriod=10 Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.033383 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.202386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-config\") pod \"80b52f69-1c27-4d55-a847-c304552c27d0\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.202505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-dnsmasq-svc\") pod \"80b52f69-1c27-4d55-a847-c304552c27d0\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.202557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp8zk\" (UniqueName: \"kubernetes.io/projected/80b52f69-1c27-4d55-a847-c304552c27d0-kube-api-access-wp8zk\") pod \"80b52f69-1c27-4d55-a847-c304552c27d0\" (UID: \"80b52f69-1c27-4d55-a847-c304552c27d0\") " Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.206729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b52f69-1c27-4d55-a847-c304552c27d0-kube-api-access-wp8zk" (OuterVolumeSpecName: "kube-api-access-wp8zk") pod "80b52f69-1c27-4d55-a847-c304552c27d0" (UID: "80b52f69-1c27-4d55-a847-c304552c27d0"). InnerVolumeSpecName "kube-api-access-wp8zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.231708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-config" (OuterVolumeSpecName: "config") pod "80b52f69-1c27-4d55-a847-c304552c27d0" (UID: "80b52f69-1c27-4d55-a847-c304552c27d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.238763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "80b52f69-1c27-4d55-a847-c304552c27d0" (UID: "80b52f69-1c27-4d55-a847-c304552c27d0"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.304144 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.304408 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp8zk\" (UniqueName: \"kubernetes.io/projected/80b52f69-1c27-4d55-a847-c304552c27d0-kube-api-access-wp8zk\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.304440 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b52f69-1c27-4d55-a847-c304552c27d0-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.324526 4707 generic.go:334] "Generic (PLEG): container finished" podID="80b52f69-1c27-4d55-a847-c304552c27d0" containerID="2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98" exitCode=0 Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.324561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" event={"ID":"80b52f69-1c27-4d55-a847-c304552c27d0","Type":"ContainerDied","Data":"2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98"} Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.324585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" event={"ID":"80b52f69-1c27-4d55-a847-c304552c27d0","Type":"ContainerDied","Data":"14fdfc0e29849c8c5109d73b4da45c2195ad39a81a58d32cc6317c513faff7dc"} Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.324601 4707 scope.go:117] "RemoveContainer" containerID="2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.324700 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.347513 4707 scope.go:117] "RemoveContainer" containerID="9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.359560 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn"] Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.364058 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sj9hn"] Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.381092 4707 scope.go:117] "RemoveContainer" containerID="2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98" Jan 21 15:26:26 crc kubenswrapper[4707]: E0121 15:26:26.385151 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98\": container with ID starting with 2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98 not found: ID does not exist" containerID="2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.385199 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98"} err="failed to get container status \"2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98\": rpc error: code = NotFound desc = could not find container \"2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98\": container with ID starting with 2d354c160ab52fc77afd71f93a034a645d433035cb79109084507cdb89cefd98 not found: ID does not exist" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.385226 4707 scope.go:117] "RemoveContainer" containerID="9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20" Jan 21 15:26:26 crc kubenswrapper[4707]: E0121 15:26:26.389039 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20\": container with ID starting with 9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20 not found: ID does not exist" containerID="9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20" Jan 21 15:26:26 crc kubenswrapper[4707]: I0121 15:26:26.389176 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20"} err="failed to get container status \"9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20\": rpc error: code = NotFound desc = could not find container \"9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20\": container with ID starting with 9e0a0da8e3b96e0f8f6bbfd1e7f9f88706978dff97d095db15aa5bc8d19adc20 not found: ID does not exist" Jan 21 15:26:27 crc kubenswrapper[4707]: I0121 15:26:27.189930 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" path="/var/lib/kubelet/pods/80b52f69-1c27-4d55-a847-c304552c27d0/volumes" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.040022 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:26:38 crc 
kubenswrapper[4707]: I0121 15:26:38.242029 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-fwzmm"] Jan 21 15:26:38 crc kubenswrapper[4707]: E0121 15:26:38.242336 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983aa7f0-fb2b-46d5-912a-c0d8b26bff20" containerName="mariadb-account-create-update" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242353 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="983aa7f0-fb2b-46d5-912a-c0d8b26bff20" containerName="mariadb-account-create-update" Jan 21 15:26:38 crc kubenswrapper[4707]: E0121 15:26:38.242364 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce47035-1fcb-413a-b5ff-d3207523b1ef" containerName="glance-db-sync" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242370 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce47035-1fcb-413a-b5ff-d3207523b1ef" containerName="glance-db-sync" Jan 21 15:26:38 crc kubenswrapper[4707]: E0121 15:26:38.242383 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" containerName="init" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242389 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" containerName="init" Jan 21 15:26:38 crc kubenswrapper[4707]: E0121 15:26:38.242397 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" containerName="dnsmasq-dns" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" containerName="dnsmasq-dns" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242551 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="983aa7f0-fb2b-46d5-912a-c0d8b26bff20" containerName="mariadb-account-create-update" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242567 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b52f69-1c27-4d55-a847-c304552c27d0" containerName="dnsmasq-dns" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.242576 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce47035-1fcb-413a-b5ff-d3207523b1ef" containerName="glance-db-sync" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.243079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.247541 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-fwzmm"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.254230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.345100 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lxdwg"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.345969 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.354966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b384d69-2797-4368-9779-c0860b850738-operator-scripts\") pod \"cinder-db-create-fwzmm\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.355105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xh7\" (UniqueName: \"kubernetes.io/projected/0b384d69-2797-4368-9779-c0860b850738-kube-api-access-l6xh7\") pod \"cinder-db-create-fwzmm\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.360073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lxdwg"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.369367 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.370162 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.374089 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.374722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.451027 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.452014 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.456306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b384d69-2797-4368-9779-c0860b850738-operator-scripts\") pod \"cinder-db-create-fwzmm\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.456395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxv5w\" (UniqueName: \"kubernetes.io/projected/89dba5d8-7aea-4717-a816-d26edb1da323-kube-api-access-bxv5w\") pod \"barbican-db-create-lxdwg\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.456448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89dba5d8-7aea-4717-a816-d26edb1da323-operator-scripts\") pod \"barbican-db-create-lxdwg\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.456475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xh7\" (UniqueName: \"kubernetes.io/projected/0b384d69-2797-4368-9779-c0860b850738-kube-api-access-l6xh7\") pod \"cinder-db-create-fwzmm\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.457054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b384d69-2797-4368-9779-c0860b850738-operator-scripts\") pod \"cinder-db-create-fwzmm\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.458192 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.458986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.478041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xh7\" (UniqueName: \"kubernetes.io/projected/0b384d69-2797-4368-9779-c0860b850738-kube-api-access-l6xh7\") pod \"cinder-db-create-fwzmm\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.548746 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g8rcm"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.549556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.554541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-btx76"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.555372 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.556525 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.556876 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.557388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6bebd0-9910-4926-8ee0-0510b9b99d90-operator-scripts\") pod \"neutron-11a0-account-create-update-wp6hh\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.557437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd32f87d-9ff2-481d-82ac-0ceb13492b31-operator-scripts\") pod \"cinder-9d5a-account-create-update-dxf2p\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.557483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7kd9\" (UniqueName: \"kubernetes.io/projected/9e6bebd0-9910-4926-8ee0-0510b9b99d90-kube-api-access-c7kd9\") pod \"neutron-11a0-account-create-update-wp6hh\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.557559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxv5w\" (UniqueName: \"kubernetes.io/projected/89dba5d8-7aea-4717-a816-d26edb1da323-kube-api-access-bxv5w\") pod \"barbican-db-create-lxdwg\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.557599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztb25\" (UniqueName: \"kubernetes.io/projected/bd32f87d-9ff2-481d-82ac-0ceb13492b31-kube-api-access-ztb25\") pod \"cinder-9d5a-account-create-update-dxf2p\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.557628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89dba5d8-7aea-4717-a816-d26edb1da323-operator-scripts\") pod \"barbican-db-create-lxdwg\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.558262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89dba5d8-7aea-4717-a816-d26edb1da323-operator-scripts\") pod \"barbican-db-create-lxdwg\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.559505 4707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g8rcm"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.563277 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-btx76"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.578543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxv5w\" (UniqueName: \"kubernetes.io/projected/89dba5d8-7aea-4717-a816-d26edb1da323-kube-api-access-bxv5w\") pod \"barbican-db-create-lxdwg\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.599792 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fxgzb"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.601322 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.603988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.604118 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-wp4kf" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.604286 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.604677 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.607615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fxgzb"] Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.657848 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-operator-scripts\") pod \"barbican-7408-account-create-update-btx76\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6bebd0-9910-4926-8ee0-0510b9b99d90-operator-scripts\") pod \"neutron-11a0-account-create-update-wp6hh\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd32f87d-9ff2-481d-82ac-0ceb13492b31-operator-scripts\") pod \"cinder-9d5a-account-create-update-dxf2p\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swxmc\" (UniqueName: \"kubernetes.io/projected/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-kube-api-access-swxmc\") pod \"barbican-7408-account-create-update-btx76\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7kd9\" (UniqueName: \"kubernetes.io/projected/9e6bebd0-9910-4926-8ee0-0510b9b99d90-kube-api-access-c7kd9\") pod \"neutron-11a0-account-create-update-wp6hh\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrfq4\" (UniqueName: \"kubernetes.io/projected/d2969b7b-1db8-4940-ad96-4f68d565fe0d-kube-api-access-vrfq4\") pod \"neutron-db-create-g8rcm\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2969b7b-1db8-4940-ad96-4f68d565fe0d-operator-scripts\") pod \"neutron-db-create-g8rcm\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.658776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztb25\" (UniqueName: \"kubernetes.io/projected/bd32f87d-9ff2-481d-82ac-0ceb13492b31-kube-api-access-ztb25\") pod \"cinder-9d5a-account-create-update-dxf2p\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " 
pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.659283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd32f87d-9ff2-481d-82ac-0ceb13492b31-operator-scripts\") pod \"cinder-9d5a-account-create-update-dxf2p\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.659374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6bebd0-9910-4926-8ee0-0510b9b99d90-operator-scripts\") pod \"neutron-11a0-account-create-update-wp6hh\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.673348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7kd9\" (UniqueName: \"kubernetes.io/projected/9e6bebd0-9910-4926-8ee0-0510b9b99d90-kube-api-access-c7kd9\") pod \"neutron-11a0-account-create-update-wp6hh\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.675960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztb25\" (UniqueName: \"kubernetes.io/projected/bd32f87d-9ff2-481d-82ac-0ceb13492b31-kube-api-access-ztb25\") pod \"cinder-9d5a-account-create-update-dxf2p\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.687198 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-combined-ca-bundle\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swxmc\" (UniqueName: \"kubernetes.io/projected/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-kube-api-access-swxmc\") pod \"barbican-7408-account-create-update-btx76\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrfq4\" (UniqueName: \"kubernetes.io/projected/d2969b7b-1db8-4940-ad96-4f68d565fe0d-kube-api-access-vrfq4\") pod \"neutron-db-create-g8rcm\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-config-data\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2969b7b-1db8-4940-ad96-4f68d565fe0d-operator-scripts\") pod \"neutron-db-create-g8rcm\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-operator-scripts\") pod \"barbican-7408-account-create-update-btx76\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.759779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwbcl\" (UniqueName: \"kubernetes.io/projected/be59b54f-310b-4813-8485-69cd6af8424e-kube-api-access-hwbcl\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.760420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2969b7b-1db8-4940-ad96-4f68d565fe0d-operator-scripts\") pod \"neutron-db-create-g8rcm\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.760735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-operator-scripts\") pod \"barbican-7408-account-create-update-btx76\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.770771 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.776896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrfq4\" (UniqueName: \"kubernetes.io/projected/d2969b7b-1db8-4940-ad96-4f68d565fe0d-kube-api-access-vrfq4\") pod \"neutron-db-create-g8rcm\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.777198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swxmc\" (UniqueName: \"kubernetes.io/projected/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-kube-api-access-swxmc\") pod \"barbican-7408-account-create-update-btx76\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.861374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwbcl\" (UniqueName: \"kubernetes.io/projected/be59b54f-310b-4813-8485-69cd6af8424e-kube-api-access-hwbcl\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.861629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-combined-ca-bundle\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.861668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-config-data\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.862390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.864829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-combined-ca-bundle\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.865092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-config-data\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.868615 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.877966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwbcl\" (UniqueName: \"kubernetes.io/projected/be59b54f-310b-4813-8485-69cd6af8424e-kube-api-access-hwbcl\") pod \"keystone-db-sync-fxgzb\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.935644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:38 crc kubenswrapper[4707]: I0121 15:26:38.994149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-fwzmm"] Jan 21 15:26:39 crc kubenswrapper[4707]: W0121 15:26:39.000323 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b384d69_2797_4368_9779_c0860b850738.slice/crio-933b62fff6a5a0252fb5e0ccf37f1ca2846e098f3fb91c834108d151c6cc8830 WatchSource:0}: Error finding container 933b62fff6a5a0252fb5e0ccf37f1ca2846e098f3fb91c834108d151c6cc8830: Status 404 returned error can't find the container with id 933b62fff6a5a0252fb5e0ccf37f1ca2846e098f3fb91c834108d151c6cc8830 Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.096138 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lxdwg"] Jan 21 15:26:39 crc kubenswrapper[4707]: W0121 15:26:39.098507 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89dba5d8_7aea_4717_a816_d26edb1da323.slice/crio-5e0ec8f6a0f983b2d990995d148f383e1513e6914b25bd128717b381a5dba21e WatchSource:0}: Error finding container 5e0ec8f6a0f983b2d990995d148f383e1513e6914b25bd128717b381a5dba21e: Status 404 returned error can't find the container with id 5e0ec8f6a0f983b2d990995d148f383e1513e6914b25bd128717b381a5dba21e Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.144777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh"] Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.216688 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p"] Jan 21 15:26:39 crc kubenswrapper[4707]: W0121 15:26:39.221229 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd32f87d_9ff2_481d_82ac_0ceb13492b31.slice/crio-f608a3f17fcda52451c8b746bc99ef9db3edb273406a2d91cbc77bd4c1a894f1 WatchSource:0}: Error finding container f608a3f17fcda52451c8b746bc99ef9db3edb273406a2d91cbc77bd4c1a894f1: Status 404 returned error can't find the container with id f608a3f17fcda52451c8b746bc99ef9db3edb273406a2d91cbc77bd4c1a894f1 Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.287255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-btx76"] Jan 21 15:26:39 crc kubenswrapper[4707]: W0121 15:26:39.293533 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a246ce3_51b4_4914_9f3e_4cc6b4ca58d8.slice/crio-0478341ccb9f661112b1c113c6b10f7706bb00efeba6509f55a8ee8bcc161e59 WatchSource:0}: Error finding 
container 0478341ccb9f661112b1c113c6b10f7706bb00efeba6509f55a8ee8bcc161e59: Status 404 returned error can't find the container with id 0478341ccb9f661112b1c113c6b10f7706bb00efeba6509f55a8ee8bcc161e59 Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.353943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g8rcm"] Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.360158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fxgzb"] Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.425581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" event={"ID":"be59b54f-310b-4813-8485-69cd6af8424e","Type":"ContainerStarted","Data":"0692fcb91ecde36d4e555bf8248d5afa7c0ee7219785aad4849e4bd31fe9a512"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.430167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" event={"ID":"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8","Type":"ContainerStarted","Data":"0478341ccb9f661112b1c113c6b10f7706bb00efeba6509f55a8ee8bcc161e59"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.431696 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b384d69-2797-4368-9779-c0860b850738" containerID="d92bfb954044f93a1d2d296e0bc8b3543217d2a0fefaa7612d5415b7cfb8164c" exitCode=0 Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.431746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" event={"ID":"0b384d69-2797-4368-9779-c0860b850738","Type":"ContainerDied","Data":"d92bfb954044f93a1d2d296e0bc8b3543217d2a0fefaa7612d5415b7cfb8164c"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.431763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" event={"ID":"0b384d69-2797-4368-9779-c0860b850738","Type":"ContainerStarted","Data":"933b62fff6a5a0252fb5e0ccf37f1ca2846e098f3fb91c834108d151c6cc8830"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.440942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" event={"ID":"9e6bebd0-9910-4926-8ee0-0510b9b99d90","Type":"ContainerStarted","Data":"67029361353c98122cfe1c9817668815bab4e2b6c15bd6afd746615a9e8b5ee9"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.440977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" event={"ID":"9e6bebd0-9910-4926-8ee0-0510b9b99d90","Type":"ContainerStarted","Data":"fbe244632d674281a63ed7b32254a2b8c9f5ec45dc7deae055e8f7aa82bd1fa4"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.453264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" event={"ID":"89dba5d8-7aea-4717-a816-d26edb1da323","Type":"ContainerStarted","Data":"440a9bfead233d507ab9aea7087704ef0f819f59103ccdbb90e238448f2e9fc0"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.453294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" event={"ID":"89dba5d8-7aea-4717-a816-d26edb1da323","Type":"ContainerStarted","Data":"5e0ec8f6a0f983b2d990995d148f383e1513e6914b25bd128717b381a5dba21e"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.459128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" event={"ID":"bd32f87d-9ff2-481d-82ac-0ceb13492b31","Type":"ContainerStarted","Data":"a202955f6373494fbeb43b833189545087a0c39e0d98b4567a8ab1c1d4688e12"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.459169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" event={"ID":"bd32f87d-9ff2-481d-82ac-0ceb13492b31","Type":"ContainerStarted","Data":"f608a3f17fcda52451c8b746bc99ef9db3edb273406a2d91cbc77bd4c1a894f1"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.459732 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" podStartSLOduration=1.459720721 podStartE2EDuration="1.459720721s" podCreationTimestamp="2026-01-21 15:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:39.458158063 +0000 UTC m=+1496.639674285" watchObservedRunningTime="2026-01-21 15:26:39.459720721 +0000 UTC m=+1496.641236943" Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.462963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" event={"ID":"d2969b7b-1db8-4940-ad96-4f68d565fe0d","Type":"ContainerStarted","Data":"c4e45186c10c259acf5715bc455c8d930739ec9edb125aebfce9b4e61de00cfb"} Jan 21 15:26:39 crc kubenswrapper[4707]: I0121 15:26:39.493441 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" podStartSLOduration=1.493427209 podStartE2EDuration="1.493427209s" podCreationTimestamp="2026-01-21 15:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:39.487115834 +0000 UTC m=+1496.668632056" watchObservedRunningTime="2026-01-21 15:26:39.493427209 +0000 UTC m=+1496.674943431" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.470881 4707 generic.go:334] "Generic (PLEG): container finished" podID="9e6bebd0-9910-4926-8ee0-0510b9b99d90" containerID="67029361353c98122cfe1c9817668815bab4e2b6c15bd6afd746615a9e8b5ee9" exitCode=0 Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.470973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" event={"ID":"9e6bebd0-9910-4926-8ee0-0510b9b99d90","Type":"ContainerDied","Data":"67029361353c98122cfe1c9817668815bab4e2b6c15bd6afd746615a9e8b5ee9"} Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.472715 4707 generic.go:334] "Generic (PLEG): container finished" podID="89dba5d8-7aea-4717-a816-d26edb1da323" containerID="440a9bfead233d507ab9aea7087704ef0f819f59103ccdbb90e238448f2e9fc0" exitCode=0 Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.472881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" event={"ID":"89dba5d8-7aea-4717-a816-d26edb1da323","Type":"ContainerDied","Data":"440a9bfead233d507ab9aea7087704ef0f819f59103ccdbb90e238448f2e9fc0"} Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.474761 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd32f87d-9ff2-481d-82ac-0ceb13492b31" containerID="a202955f6373494fbeb43b833189545087a0c39e0d98b4567a8ab1c1d4688e12" exitCode=0 Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.474792 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" event={"ID":"bd32f87d-9ff2-481d-82ac-0ceb13492b31","Type":"ContainerDied","Data":"a202955f6373494fbeb43b833189545087a0c39e0d98b4567a8ab1c1d4688e12"} Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.475939 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2969b7b-1db8-4940-ad96-4f68d565fe0d" containerID="208c0662db0015e98e63b50f6623e6c048e772144dda6cb679b0e325256766b1" exitCode=0 Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.475989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" event={"ID":"d2969b7b-1db8-4940-ad96-4f68d565fe0d","Type":"ContainerDied","Data":"208c0662db0015e98e63b50f6623e6c048e772144dda6cb679b0e325256766b1"} Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.481375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" event={"ID":"be59b54f-310b-4813-8485-69cd6af8424e","Type":"ContainerStarted","Data":"473132bae3b4f1a358a9fce0e4dd49f77c32fea544af9d8275f4b91ef2bdb12f"} Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.483947 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" containerID="5b2b95e85c5e4eda7cdfebc4edbda64106a9ba4ff11ec9077edac721fc1d7244" exitCode=0 Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.484089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" event={"ID":"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8","Type":"ContainerDied","Data":"5b2b95e85c5e4eda7cdfebc4edbda64106a9ba4ff11ec9077edac721fc1d7244"} Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.500783 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" podStartSLOduration=2.500771671 podStartE2EDuration="2.500771671s" podCreationTimestamp="2026-01-21 15:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:40.496010611 +0000 UTC m=+1497.677526832" watchObservedRunningTime="2026-01-21 15:26:40.500771671 +0000 UTC m=+1497.682287893" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.827624 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.831935 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.989878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxv5w\" (UniqueName: \"kubernetes.io/projected/89dba5d8-7aea-4717-a816-d26edb1da323-kube-api-access-bxv5w\") pod \"89dba5d8-7aea-4717-a816-d26edb1da323\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.989997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b384d69-2797-4368-9779-c0860b850738-operator-scripts\") pod \"0b384d69-2797-4368-9779-c0860b850738\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.990058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89dba5d8-7aea-4717-a816-d26edb1da323-operator-scripts\") pod \"89dba5d8-7aea-4717-a816-d26edb1da323\" (UID: \"89dba5d8-7aea-4717-a816-d26edb1da323\") " Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.990092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xh7\" (UniqueName: \"kubernetes.io/projected/0b384d69-2797-4368-9779-c0860b850738-kube-api-access-l6xh7\") pod \"0b384d69-2797-4368-9779-c0860b850738\" (UID: \"0b384d69-2797-4368-9779-c0860b850738\") " Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.990676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b384d69-2797-4368-9779-c0860b850738-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b384d69-2797-4368-9779-c0860b850738" (UID: "0b384d69-2797-4368-9779-c0860b850738"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.990752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89dba5d8-7aea-4717-a816-d26edb1da323-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89dba5d8-7aea-4717-a816-d26edb1da323" (UID: "89dba5d8-7aea-4717-a816-d26edb1da323"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.994606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b384d69-2797-4368-9779-c0860b850738-kube-api-access-l6xh7" (OuterVolumeSpecName: "kube-api-access-l6xh7") pod "0b384d69-2797-4368-9779-c0860b850738" (UID: "0b384d69-2797-4368-9779-c0860b850738"). InnerVolumeSpecName "kube-api-access-l6xh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:40 crc kubenswrapper[4707]: I0121 15:26:40.994744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dba5d8-7aea-4717-a816-d26edb1da323-kube-api-access-bxv5w" (OuterVolumeSpecName: "kube-api-access-bxv5w") pod "89dba5d8-7aea-4717-a816-d26edb1da323" (UID: "89dba5d8-7aea-4717-a816-d26edb1da323"). InnerVolumeSpecName "kube-api-access-bxv5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.091518 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xh7\" (UniqueName: \"kubernetes.io/projected/0b384d69-2797-4368-9779-c0860b850738-kube-api-access-l6xh7\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.091548 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxv5w\" (UniqueName: \"kubernetes.io/projected/89dba5d8-7aea-4717-a816-d26edb1da323-kube-api-access-bxv5w\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.091559 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b384d69-2797-4368-9779-c0860b850738-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.091567 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89dba5d8-7aea-4717-a816-d26edb1da323-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.491222 4707 generic.go:334] "Generic (PLEG): container finished" podID="be59b54f-310b-4813-8485-69cd6af8424e" containerID="473132bae3b4f1a358a9fce0e4dd49f77c32fea544af9d8275f4b91ef2bdb12f" exitCode=0 Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.491267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" event={"ID":"be59b54f-310b-4813-8485-69cd6af8424e","Type":"ContainerDied","Data":"473132bae3b4f1a358a9fce0e4dd49f77c32fea544af9d8275f4b91ef2bdb12f"} Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.492873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" event={"ID":"0b384d69-2797-4368-9779-c0860b850738","Type":"ContainerDied","Data":"933b62fff6a5a0252fb5e0ccf37f1ca2846e098f3fb91c834108d151c6cc8830"} Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.492932 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="933b62fff6a5a0252fb5e0ccf37f1ca2846e098f3fb91c834108d151c6cc8830" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.492890 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-fwzmm" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.494664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" event={"ID":"89dba5d8-7aea-4717-a816-d26edb1da323","Type":"ContainerDied","Data":"5e0ec8f6a0f983b2d990995d148f383e1513e6914b25bd128717b381a5dba21e"} Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.494687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lxdwg" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.494693 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0ec8f6a0f983b2d990995d148f383e1513e6914b25bd128717b381a5dba21e" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.769986 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.873790 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.877627 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.883267 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.903184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd32f87d-9ff2-481d-82ac-0ceb13492b31-operator-scripts\") pod \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.903350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztb25\" (UniqueName: \"kubernetes.io/projected/bd32f87d-9ff2-481d-82ac-0ceb13492b31-kube-api-access-ztb25\") pod \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\" (UID: \"bd32f87d-9ff2-481d-82ac-0ceb13492b31\") " Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.903893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd32f87d-9ff2-481d-82ac-0ceb13492b31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd32f87d-9ff2-481d-82ac-0ceb13492b31" (UID: "bd32f87d-9ff2-481d-82ac-0ceb13492b31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:41 crc kubenswrapper[4707]: I0121 15:26:41.913918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd32f87d-9ff2-481d-82ac-0ceb13492b31-kube-api-access-ztb25" (OuterVolumeSpecName: "kube-api-access-ztb25") pod "bd32f87d-9ff2-481d-82ac-0ceb13492b31" (UID: "bd32f87d-9ff2-481d-82ac-0ceb13492b31"). InnerVolumeSpecName "kube-api-access-ztb25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.004514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swxmc\" (UniqueName: \"kubernetes.io/projected/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-kube-api-access-swxmc\") pod \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.004598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-operator-scripts\") pod \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\" (UID: \"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.004677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2969b7b-1db8-4940-ad96-4f68d565fe0d-operator-scripts\") pod \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.004730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6bebd0-9910-4926-8ee0-0510b9b99d90-operator-scripts\") pod \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.004760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7kd9\" (UniqueName: \"kubernetes.io/projected/9e6bebd0-9910-4926-8ee0-0510b9b99d90-kube-api-access-c7kd9\") pod \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\" (UID: \"9e6bebd0-9910-4926-8ee0-0510b9b99d90\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.004789 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrfq4\" (UniqueName: \"kubernetes.io/projected/d2969b7b-1db8-4940-ad96-4f68d565fe0d-kube-api-access-vrfq4\") pod \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\" (UID: \"d2969b7b-1db8-4940-ad96-4f68d565fe0d\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.005205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" (UID: "1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.005218 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztb25\" (UniqueName: \"kubernetes.io/projected/bd32f87d-9ff2-481d-82ac-0ceb13492b31-kube-api-access-ztb25\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.005270 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd32f87d-9ff2-481d-82ac-0ceb13492b31-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.005302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2969b7b-1db8-4940-ad96-4f68d565fe0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2969b7b-1db8-4940-ad96-4f68d565fe0d" (UID: "d2969b7b-1db8-4940-ad96-4f68d565fe0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.005325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6bebd0-9910-4926-8ee0-0510b9b99d90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e6bebd0-9910-4926-8ee0-0510b9b99d90" (UID: "9e6bebd0-9910-4926-8ee0-0510b9b99d90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.006962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-kube-api-access-swxmc" (OuterVolumeSpecName: "kube-api-access-swxmc") pod "1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" (UID: "1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8"). InnerVolumeSpecName "kube-api-access-swxmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.007358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6bebd0-9910-4926-8ee0-0510b9b99d90-kube-api-access-c7kd9" (OuterVolumeSpecName: "kube-api-access-c7kd9") pod "9e6bebd0-9910-4926-8ee0-0510b9b99d90" (UID: "9e6bebd0-9910-4926-8ee0-0510b9b99d90"). InnerVolumeSpecName "kube-api-access-c7kd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.007733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2969b7b-1db8-4940-ad96-4f68d565fe0d-kube-api-access-vrfq4" (OuterVolumeSpecName: "kube-api-access-vrfq4") pod "d2969b7b-1db8-4940-ad96-4f68d565fe0d" (UID: "d2969b7b-1db8-4940-ad96-4f68d565fe0d"). InnerVolumeSpecName "kube-api-access-vrfq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.106746 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6bebd0-9910-4926-8ee0-0510b9b99d90-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.106774 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7kd9\" (UniqueName: \"kubernetes.io/projected/9e6bebd0-9910-4926-8ee0-0510b9b99d90-kube-api-access-c7kd9\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.106785 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrfq4\" (UniqueName: \"kubernetes.io/projected/d2969b7b-1db8-4940-ad96-4f68d565fe0d-kube-api-access-vrfq4\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.106794 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swxmc\" (UniqueName: \"kubernetes.io/projected/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-kube-api-access-swxmc\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.106801 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.106823 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2969b7b-1db8-4940-ad96-4f68d565fe0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.502656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" event={"ID":"1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8","Type":"ContainerDied","Data":"0478341ccb9f661112b1c113c6b10f7706bb00efeba6509f55a8ee8bcc161e59"} Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.502897 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0478341ccb9f661112b1c113c6b10f7706bb00efeba6509f55a8ee8bcc161e59" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.502829 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-btx76" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.503964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" event={"ID":"9e6bebd0-9910-4926-8ee0-0510b9b99d90","Type":"ContainerDied","Data":"fbe244632d674281a63ed7b32254a2b8c9f5ec45dc7deae055e8f7aa82bd1fa4"} Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.503985 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe244632d674281a63ed7b32254a2b8c9f5ec45dc7deae055e8f7aa82bd1fa4" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.504016 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.513279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" event={"ID":"bd32f87d-9ff2-481d-82ac-0ceb13492b31","Type":"ContainerDied","Data":"f608a3f17fcda52451c8b746bc99ef9db3edb273406a2d91cbc77bd4c1a894f1"} Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.513303 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f608a3f17fcda52451c8b746bc99ef9db3edb273406a2d91cbc77bd4c1a894f1" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.513341 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.515992 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.518869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g8rcm" event={"ID":"d2969b7b-1db8-4940-ad96-4f68d565fe0d","Type":"ContainerDied","Data":"c4e45186c10c259acf5715bc455c8d930739ec9edb125aebfce9b4e61de00cfb"} Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.518891 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e45186c10c259acf5715bc455c8d930739ec9edb125aebfce9b4e61de00cfb" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.708780 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.815714 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-combined-ca-bundle\") pod \"be59b54f-310b-4813-8485-69cd6af8424e\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.815769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwbcl\" (UniqueName: \"kubernetes.io/projected/be59b54f-310b-4813-8485-69cd6af8424e-kube-api-access-hwbcl\") pod \"be59b54f-310b-4813-8485-69cd6af8424e\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.815800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-config-data\") pod \"be59b54f-310b-4813-8485-69cd6af8424e\" (UID: \"be59b54f-310b-4813-8485-69cd6af8424e\") " Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.819749 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be59b54f-310b-4813-8485-69cd6af8424e-kube-api-access-hwbcl" (OuterVolumeSpecName: "kube-api-access-hwbcl") pod "be59b54f-310b-4813-8485-69cd6af8424e" (UID: "be59b54f-310b-4813-8485-69cd6af8424e"). InnerVolumeSpecName "kube-api-access-hwbcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.842202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be59b54f-310b-4813-8485-69cd6af8424e" (UID: "be59b54f-310b-4813-8485-69cd6af8424e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.853601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-config-data" (OuterVolumeSpecName: "config-data") pod "be59b54f-310b-4813-8485-69cd6af8424e" (UID: "be59b54f-310b-4813-8485-69cd6af8424e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.917686 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.917718 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwbcl\" (UniqueName: \"kubernetes.io/projected/be59b54f-310b-4813-8485-69cd6af8424e-kube-api-access-hwbcl\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:42 crc kubenswrapper[4707]: I0121 15:26:42.917730 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be59b54f-310b-4813-8485-69cd6af8424e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.525796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" event={"ID":"be59b54f-310b-4813-8485-69cd6af8424e","Type":"ContainerDied","Data":"0692fcb91ecde36d4e555bf8248d5afa7c0ee7219785aad4849e4bd31fe9a512"} Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.525866 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0692fcb91ecde36d4e555bf8248d5afa7c0ee7219785aad4849e4bd31fe9a512" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.525879 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fxgzb" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650247 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rpsr2"] Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd32f87d-9ff2-481d-82ac-0ceb13492b31" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650551 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd32f87d-9ff2-481d-82ac-0ceb13492b31" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650559 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dba5d8-7aea-4717-a816-d26edb1da323" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650566 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dba5d8-7aea-4717-a816-d26edb1da323" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6bebd0-9910-4926-8ee0-0510b9b99d90" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650580 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6bebd0-9910-4926-8ee0-0510b9b99d90" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650590 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2969b7b-1db8-4940-ad96-4f68d565fe0d" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2969b7b-1db8-4940-ad96-4f68d565fe0d" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650614 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650620 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650630 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b384d69-2797-4368-9779-c0860b850738" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650636 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b384d69-2797-4368-9779-c0860b850738" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: E0121 15:26:43.650657 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59b54f-310b-4813-8485-69cd6af8424e" containerName="keystone-db-sync" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59b54f-310b-4813-8485-69cd6af8424e" containerName="keystone-db-sync" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650830 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be59b54f-310b-4813-8485-69cd6af8424e" containerName="keystone-db-sync" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650838 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" 
containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650852 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd32f87d-9ff2-481d-82ac-0ceb13492b31" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650859 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b384d69-2797-4368-9779-c0860b850738" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650868 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2969b7b-1db8-4940-ad96-4f68d565fe0d" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dba5d8-7aea-4717-a816-d26edb1da323" containerName="mariadb-database-create" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.650884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6bebd0-9910-4926-8ee0-0510b9b99d90" containerName="mariadb-account-create-update" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.651397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.656192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.658192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.658331 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.658451 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.658869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-wp4kf" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.660746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rpsr2"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.753372 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.754861 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.756795 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.757032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.769351 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.828502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-scripts\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.828585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-combined-ca-bundle\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.828607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-credential-keys\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.828679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvxl\" (UniqueName: \"kubernetes.io/projected/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-kube-api-access-xkvxl\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.828713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-fernet-keys\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.828755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-config-data\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.835617 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-g2jcl"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.836475 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.840333 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.840433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.840477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-j2bpz" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.852117 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-g2jcl"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.861787 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-98k7p"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.862594 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.867192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.867657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-f68hj" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.870257 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.881905 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-98k7p"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.897758 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-g9gwr"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.898632 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.900540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-6x6jk" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.900788 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.900916 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.925462 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-g9gwr"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlkkz\" (UniqueName: \"kubernetes.io/projected/eb2d6ace-515d-4728-8340-aff4c8419520-kube-api-access-tlkkz\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-run-httpd\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-log-httpd\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-combined-ca-bundle\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-credential-keys\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhlmc\" (UniqueName: \"kubernetes.io/projected/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-kube-api-access-dhlmc\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:43 crc 
kubenswrapper[4707]: I0121 15:26:43.929704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-scripts\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-combined-ca-bundle\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvxl\" (UniqueName: \"kubernetes.io/projected/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-kube-api-access-xkvxl\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-fernet-keys\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-config-data\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-config-data\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-config\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.929897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-scripts\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.933348 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-combined-ca-bundle\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.935247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-scripts\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.938508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-credential-keys\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.938849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-fernet-keys\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.938942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-config-data\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.948797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvxl\" (UniqueName: \"kubernetes.io/projected/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-kube-api-access-xkvxl\") pod \"keystone-bootstrap-rpsr2\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.956408 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-bb59f"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.957309 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.959402 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.960488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-6vfbz" Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.965450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-bb59f"] Jan 21 15:26:43 crc kubenswrapper[4707]: I0121 15:26:43.970194 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpll\" (UniqueName: \"kubernetes.io/projected/7ec845a4-b468-4927-a900-58afe8305a5c-kube-api-access-dwpll\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-config-data\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-config-data\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-db-sync-config-data\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac97b49-c5fc-425e-91cb-8dd30213b600-logs\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-config-data\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec845a4-b468-4927-a900-58afe8305a5c-etc-machine-id\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-combined-ca-bundle\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-config\") pod \"neutron-db-sync-g2jcl\" (UID: 
\"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.031778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-scripts\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cst2m\" (UniqueName: \"kubernetes.io/projected/6ac97b49-c5fc-425e-91cb-8dd30213b600-kube-api-access-cst2m\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlkkz\" (UniqueName: \"kubernetes.io/projected/eb2d6ace-515d-4728-8340-aff4c8419520-kube-api-access-tlkkz\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-run-httpd\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-log-httpd\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-combined-ca-bundle\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-scripts\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhlmc\" (UniqueName: \"kubernetes.io/projected/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-kube-api-access-dhlmc\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " 
pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.032367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-scripts\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.033435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-run-httpd\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.034423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-log-httpd\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.034496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.034529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-combined-ca-bundle\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.038477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.039741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-config-data\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.041785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-combined-ca-bundle\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.042108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-config\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.042917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-scripts\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.044443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.049445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlkkz\" (UniqueName: \"kubernetes.io/projected/eb2d6ace-515d-4728-8340-aff4c8419520-kube-api-access-tlkkz\") pod \"ceilometer-0\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.050360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhlmc\" (UniqueName: \"kubernetes.io/projected/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-kube-api-access-dhlmc\") pod \"neutron-db-sync-g2jcl\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.071114 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-db-sync-config-data\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-combined-ca-bundle\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-scripts\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwpll\" (UniqueName: \"kubernetes.io/projected/7ec845a4-b468-4927-a900-58afe8305a5c-kube-api-access-dwpll\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-config-data\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139587 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-config-data\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-combined-ca-bundle\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-db-sync-config-data\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac97b49-c5fc-425e-91cb-8dd30213b600-logs\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec845a4-b468-4927-a900-58afe8305a5c-etc-machine-id\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-combined-ca-bundle\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139818 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-scripts\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cst2m\" (UniqueName: \"kubernetes.io/projected/6ac97b49-c5fc-425e-91cb-8dd30213b600-kube-api-access-cst2m\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.139882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9k97\" (UniqueName: \"kubernetes.io/projected/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-kube-api-access-l9k97\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.140223 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec845a4-b468-4927-a900-58afe8305a5c-etc-machine-id\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.140400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac97b49-c5fc-425e-91cb-8dd30213b600-logs\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.145304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-scripts\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.145344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-db-sync-config-data\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.145514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-config-data\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.145617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-combined-ca-bundle\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.145683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-config-data\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.145694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-combined-ca-bundle\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.147166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-scripts\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.155325 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.157335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwpll\" (UniqueName: \"kubernetes.io/projected/7ec845a4-b468-4927-a900-58afe8305a5c-kube-api-access-dwpll\") pod \"cinder-db-sync-g9gwr\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.158565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cst2m\" (UniqueName: \"kubernetes.io/projected/6ac97b49-c5fc-425e-91cb-8dd30213b600-kube-api-access-cst2m\") pod \"placement-db-sync-98k7p\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.175585 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.219276 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.246040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9k97\" (UniqueName: \"kubernetes.io/projected/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-kube-api-access-l9k97\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.246137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-db-sync-config-data\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.248921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-db-sync-config-data\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.249120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-combined-ca-bundle\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.260267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-combined-ca-bundle\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.263155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9k97\" (UniqueName: \"kubernetes.io/projected/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-kube-api-access-l9k97\") pod \"barbican-db-sync-bb59f\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " 
pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.362997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rpsr2"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.397055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.464389 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:44 crc kubenswrapper[4707]: W0121 15:26:44.488998 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb2d6ace_515d_4728_8340_aff4c8419520.slice/crio-8cd04acbc5ede1276abbf82885d24785a55c3e77f8c3d9c9611a2f7ed178e5f6 WatchSource:0}: Error finding container 8cd04acbc5ede1276abbf82885d24785a55c3e77f8c3d9c9611a2f7ed178e5f6: Status 404 returned error can't find the container with id 8cd04acbc5ede1276abbf82885d24785a55c3e77f8c3d9c9611a2f7ed178e5f6 Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.491611 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.534433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" event={"ID":"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01","Type":"ContainerStarted","Data":"5c36dcae3ed936bb2d60dcf3ecca40ea3f567386b90ad431c335c8aa57debe7e"} Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.534470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" event={"ID":"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01","Type":"ContainerStarted","Data":"b47190e0b160f376ff30e464b217d6346a029ab027913083374ca5b3e655448e"} Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.538471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerStarted","Data":"8cd04acbc5ede1276abbf82885d24785a55c3e77f8c3d9c9611a2f7ed178e5f6"} Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.544231 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-g2jcl"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.551570 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" podStartSLOduration=1.551557579 podStartE2EDuration="1.551557579s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:44.545918798 +0000 UTC m=+1501.727435020" watchObservedRunningTime="2026-01-21 15:26:44.551557579 +0000 UTC m=+1501.733073801" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.659549 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-g9gwr"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.665928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-98k7p"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.702735 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 
15:26:44.704004 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.706044 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.706155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.706369 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-9hnf8" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.706506 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.712692 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.788706 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.803880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.803967 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.805920 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.807843 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.835860 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-bb59f"] Jan 21 15:26:44 crc kubenswrapper[4707]: W0121 15:26:44.838451 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde2f4f3c_d3df_40d2_91e4_4f98a4487bbf.slice/crio-d606bef206eabd531add18604e1e1d8bf4c02f4b0bbe47b744c7a3c985dcbe25 WatchSource:0}: Error finding container d606bef206eabd531add18604e1e1d8bf4c02f4b0bbe47b744c7a3c985dcbe25: Status 404 returned error can't find the container with id d606bef206eabd531add18604e1e1d8bf4c02f4b0bbe47b744c7a3c985dcbe25 Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc 
kubenswrapper[4707]: I0121 15:26:44.864247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplz6\" (UniqueName: \"kubernetes.io/projected/1773e7c4-491e-4ef7-a558-d61e6fa3439e-kube-api-access-nplz6\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.864557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-logs\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfk6w\" (UniqueName: \"kubernetes.io/projected/4c852d83-608f-44c0-961c-1a3f38a88224-kube-api-access-hfk6w\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplz6\" (UniqueName: \"kubernetes.io/projected/1773e7c4-491e-4ef7-a558-d61e6fa3439e-kube-api-access-nplz6\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.966843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.967139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.968465 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.970430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.971147 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.971381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.972981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.980639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplz6\" (UniqueName: \"kubernetes.io/projected/1773e7c4-491e-4ef7-a558-d61e6fa3439e-kube-api-access-nplz6\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:44 crc kubenswrapper[4707]: I0121 15:26:44.999722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.026980 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-logs\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfk6w\" (UniqueName: \"kubernetes.io/projected/4c852d83-608f-44c0-961c-1a3f38a88224-kube-api-access-hfk6w\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-logs\") pod \"glance-default-external-api-0\" (UID: 
\"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.068480 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.069162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.073282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.078472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.083490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.084465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.091956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfk6w\" (UniqueName: \"kubernetes.io/projected/4c852d83-608f-44c0-961c-1a3f38a88224-kube-api-access-hfk6w\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.102682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.127151 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.457350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:45 crc kubenswrapper[4707]: W0121 15:26:45.467846 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1773e7c4_491e_4ef7_a558_d61e6fa3439e.slice/crio-2afdbb360c4bbb5eab1400f7578d8afcc782eeb6d405afa4140ac7ce1968bf4f WatchSource:0}: Error finding container 2afdbb360c4bbb5eab1400f7578d8afcc782eeb6d405afa4140ac7ce1968bf4f: Status 404 returned error can't find the container with id 2afdbb360c4bbb5eab1400f7578d8afcc782eeb6d405afa4140ac7ce1968bf4f Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.562184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-98k7p" event={"ID":"6ac97b49-c5fc-425e-91cb-8dd30213b600","Type":"ContainerStarted","Data":"30177fc1fa90681a04365549e6ac7cf1dbd60c425bb4bb3154414d2533f14fa8"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.562230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-98k7p" event={"ID":"6ac97b49-c5fc-425e-91cb-8dd30213b600","Type":"ContainerStarted","Data":"cf15a74f017c4fe4b2c1e3886b96606a9c9beb7ebdef20680c71b044bacb4a3e"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.565127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" event={"ID":"7ec845a4-b468-4927-a900-58afe8305a5c","Type":"ContainerStarted","Data":"d1c16be54872c293cc9299362e17bd0ce9ffc2c91beeb312a1ec0ca339b4c71b"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.565168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" event={"ID":"7ec845a4-b468-4927-a900-58afe8305a5c","Type":"ContainerStarted","Data":"1e7f10cf3a09839e9b128dbe3648bc4f4a94bf885717074e5682da3ca209459a"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.572080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" event={"ID":"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9","Type":"ContainerStarted","Data":"359db13546bda46dfe24fc62a12209a680cd159a00de210a56790fbb67e0c993"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.572124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" event={"ID":"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9","Type":"ContainerStarted","Data":"0b286cc7c90b6329aef6cece2cfe683fd7830de99081b2e5253a70c6fbefd778"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.578877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerStarted","Data":"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.585035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1773e7c4-491e-4ef7-a558-d61e6fa3439e","Type":"ContainerStarted","Data":"2afdbb360c4bbb5eab1400f7578d8afcc782eeb6d405afa4140ac7ce1968bf4f"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.587488 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-98k7p" 
podStartSLOduration=2.58747607 podStartE2EDuration="2.58747607s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:45.578378437 +0000 UTC m=+1502.759894660" watchObservedRunningTime="2026-01-21 15:26:45.58747607 +0000 UTC m=+1502.768992292" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.594851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" event={"ID":"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf","Type":"ContainerStarted","Data":"0123d569abdf67071620acec970d42c7b8e7f5e4554d50dce12151ac907d347e"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.594880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" event={"ID":"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf","Type":"ContainerStarted","Data":"d606bef206eabd531add18604e1e1d8bf4c02f4b0bbe47b744c7a3c985dcbe25"} Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.598855 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.606284 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" podStartSLOduration=2.60626328 podStartE2EDuration="2.60626328s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:45.59161984 +0000 UTC m=+1502.773136062" watchObservedRunningTime="2026-01-21 15:26:45.60626328 +0000 UTC m=+1502.787779502" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.612069 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" podStartSLOduration=2.61205623 podStartE2EDuration="2.61205623s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:45.603718265 +0000 UTC m=+1502.785234487" watchObservedRunningTime="2026-01-21 15:26:45.61205623 +0000 UTC m=+1502.793572453" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.620262 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" podStartSLOduration=2.620250425 podStartE2EDuration="2.620250425s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:45.615048546 +0000 UTC m=+1502.796564768" watchObservedRunningTime="2026-01-21 15:26:45.620250425 +0000 UTC m=+1502.801766647" Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.813107 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.842696 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:45 crc kubenswrapper[4707]: I0121 15:26:45.877835 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:46 crc kubenswrapper[4707]: I0121 15:26:46.609316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1773e7c4-491e-4ef7-a558-d61e6fa3439e","Type":"ContainerStarted","Data":"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687"} Jan 21 15:26:46 crc kubenswrapper[4707]: I0121 15:26:46.613097 4707 generic.go:334] "Generic (PLEG): container finished" podID="6ac97b49-c5fc-425e-91cb-8dd30213b600" containerID="30177fc1fa90681a04365549e6ac7cf1dbd60c425bb4bb3154414d2533f14fa8" exitCode=0 Jan 21 15:26:46 crc kubenswrapper[4707]: I0121 15:26:46.613136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-98k7p" event={"ID":"6ac97b49-c5fc-425e-91cb-8dd30213b600","Type":"ContainerDied","Data":"30177fc1fa90681a04365549e6ac7cf1dbd60c425bb4bb3154414d2533f14fa8"} Jan 21 15:26:46 crc kubenswrapper[4707]: I0121 15:26:46.616438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerStarted","Data":"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021"} Jan 21 15:26:46 crc kubenswrapper[4707]: I0121 15:26:46.619194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4c852d83-608f-44c0-961c-1a3f38a88224","Type":"ContainerStarted","Data":"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee"} Jan 21 15:26:46 crc kubenswrapper[4707]: I0121 15:26:46.619219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4c852d83-608f-44c0-961c-1a3f38a88224","Type":"ContainerStarted","Data":"d6a4e8e0b9cd4954af9ec2f460f2dd1eb5c0d0ed7f1fe7a3f390199a7d539012"} Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.626467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1773e7c4-491e-4ef7-a558-d61e6fa3439e","Type":"ContainerStarted","Data":"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9"} Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.626921 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-log" containerID="cri-o://cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687" gracePeriod=30 Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.627299 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-httpd" containerID="cri-o://b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9" gracePeriod=30 Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.631140 4707 generic.go:334] "Generic (PLEG): container finished" podID="de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" containerID="0123d569abdf67071620acec970d42c7b8e7f5e4554d50dce12151ac907d347e" exitCode=0 Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.631191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" event={"ID":"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf","Type":"ContainerDied","Data":"0123d569abdf67071620acec970d42c7b8e7f5e4554d50dce12151ac907d347e"} Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.632712 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" containerID="5c36dcae3ed936bb2d60dcf3ecca40ea3f567386b90ad431c335c8aa57debe7e" exitCode=0 Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.632752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" event={"ID":"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01","Type":"ContainerDied","Data":"5c36dcae3ed936bb2d60dcf3ecca40ea3f567386b90ad431c335c8aa57debe7e"} Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.634436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerStarted","Data":"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd"} Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.635801 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-log" containerID="cri-o://bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee" gracePeriod=30 Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.635989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4c852d83-608f-44c0-961c-1a3f38a88224","Type":"ContainerStarted","Data":"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2"} Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.636036 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-httpd" containerID="cri-o://daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2" gracePeriod=30 Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.645885 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.645871383 podStartE2EDuration="4.645871383s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:47.639499364 +0000 UTC m=+1504.821015576" watchObservedRunningTime="2026-01-21 15:26:47.645871383 +0000 UTC m=+1504.827387605" Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.669172 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.669157891 podStartE2EDuration="4.669157891s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:47.66888575 +0000 UTC m=+1504.850401971" watchObservedRunningTime="2026-01-21 15:26:47.669157891 +0000 UTC m=+1504.850674113" Jan 21 15:26:47 crc kubenswrapper[4707]: I0121 15:26:47.995218 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.123918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-combined-ca-bundle\") pod \"6ac97b49-c5fc-425e-91cb-8dd30213b600\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.124524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-config-data\") pod \"6ac97b49-c5fc-425e-91cb-8dd30213b600\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.125034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cst2m\" (UniqueName: \"kubernetes.io/projected/6ac97b49-c5fc-425e-91cb-8dd30213b600-kube-api-access-cst2m\") pod \"6ac97b49-c5fc-425e-91cb-8dd30213b600\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.125184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-scripts\") pod \"6ac97b49-c5fc-425e-91cb-8dd30213b600\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.125244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac97b49-c5fc-425e-91cb-8dd30213b600-logs\") pod \"6ac97b49-c5fc-425e-91cb-8dd30213b600\" (UID: \"6ac97b49-c5fc-425e-91cb-8dd30213b600\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.125865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac97b49-c5fc-425e-91cb-8dd30213b600-logs" (OuterVolumeSpecName: "logs") pod "6ac97b49-c5fc-425e-91cb-8dd30213b600" (UID: "6ac97b49-c5fc-425e-91cb-8dd30213b600"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.127852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-scripts" (OuterVolumeSpecName: "scripts") pod "6ac97b49-c5fc-425e-91cb-8dd30213b600" (UID: "6ac97b49-c5fc-425e-91cb-8dd30213b600"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.141213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac97b49-c5fc-425e-91cb-8dd30213b600-kube-api-access-cst2m" (OuterVolumeSpecName: "kube-api-access-cst2m") pod "6ac97b49-c5fc-425e-91cb-8dd30213b600" (UID: "6ac97b49-c5fc-425e-91cb-8dd30213b600"). InnerVolumeSpecName "kube-api-access-cst2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.143725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-config-data" (OuterVolumeSpecName: "config-data") pod "6ac97b49-c5fc-425e-91cb-8dd30213b600" (UID: "6ac97b49-c5fc-425e-91cb-8dd30213b600"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.144098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac97b49-c5fc-425e-91cb-8dd30213b600" (UID: "6ac97b49-c5fc-425e-91cb-8dd30213b600"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.208046 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.213922 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.227509 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cst2m\" (UniqueName: \"kubernetes.io/projected/6ac97b49-c5fc-425e-91cb-8dd30213b600-kube-api-access-cst2m\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.227596 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.227652 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac97b49-c5fc-425e-91cb-8dd30213b600-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.227703 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.227766 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac97b49-c5fc-425e-91cb-8dd30213b600-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.328942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.328995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-logs\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-logs\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-scripts\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 
15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-public-tls-certs\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-internal-tls-certs\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-config-data\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-httpd-run\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nplz6\" (UniqueName: \"kubernetes.io/projected/1773e7c4-491e-4ef7-a558-d61e6fa3439e-kube-api-access-nplz6\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-combined-ca-bundle\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-combined-ca-bundle\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-config-data\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-logs" (OuterVolumeSpecName: "logs") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-httpd-run\") pod \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\" (UID: \"1773e7c4-491e-4ef7-a558-d61e6fa3439e\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-scripts\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.329453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfk6w\" (UniqueName: \"kubernetes.io/projected/4c852d83-608f-44c0-961c-1a3f38a88224-kube-api-access-hfk6w\") pod \"4c852d83-608f-44c0-961c-1a3f38a88224\" (UID: \"4c852d83-608f-44c0-961c-1a3f38a88224\") " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.330029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-logs" (OuterVolumeSpecName: "logs") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.330042 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.330568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.333461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c852d83-608f-44c0-961c-1a3f38a88224-kube-api-access-hfk6w" (OuterVolumeSpecName: "kube-api-access-hfk6w") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "kube-api-access-hfk6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.333761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.334141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.337933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-scripts" (OuterVolumeSpecName: "scripts") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.338008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1773e7c4-491e-4ef7-a558-d61e6fa3439e-kube-api-access-nplz6" (OuterVolumeSpecName: "kube-api-access-nplz6") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "kube-api-access-nplz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.337947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-scripts" (OuterVolumeSpecName: "scripts") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.347675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.357084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.361346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.366270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-config-data" (OuterVolumeSpecName: "config-data") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.370191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4c852d83-608f-44c0-961c-1a3f38a88224" (UID: "4c852d83-608f-44c0-961c-1a3f38a88224"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.379980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-config-data" (OuterVolumeSpecName: "config-data") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.380553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1773e7c4-491e-4ef7-a558-d61e6fa3439e" (UID: "1773e7c4-491e-4ef7-a558-d61e6fa3439e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431491 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nplz6\" (UniqueName: \"kubernetes.io/projected/1773e7c4-491e-4ef7-a558-d61e6fa3439e-kube-api-access-nplz6\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431522 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431532 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431541 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431551 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1773e7c4-491e-4ef7-a558-d61e6fa3439e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431558 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431566 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfk6w\" (UniqueName: \"kubernetes.io/projected/4c852d83-608f-44c0-961c-1a3f38a88224-kube-api-access-hfk6w\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431595 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431603 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431610 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-scripts\") on node \"crc\" DevicePath 
\"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431618 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431625 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1773e7c4-491e-4ef7-a558-d61e6fa3439e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431633 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c852d83-608f-44c0-961c-1a3f38a88224-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431640 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c852d83-608f-44c0-961c-1a3f38a88224-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.431651 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.444908 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.445008 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.533077 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.533107 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.643191 4707 generic.go:334] "Generic (PLEG): container finished" podID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerID="b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9" exitCode=0 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.643218 4707 generic.go:334] "Generic (PLEG): container finished" podID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerID="cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687" exitCode=143 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.643265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1773e7c4-491e-4ef7-a558-d61e6fa3439e","Type":"ContainerDied","Data":"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.643296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1773e7c4-491e-4ef7-a558-d61e6fa3439e","Type":"ContainerDied","Data":"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.643308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1773e7c4-491e-4ef7-a558-d61e6fa3439e","Type":"ContainerDied","Data":"2afdbb360c4bbb5eab1400f7578d8afcc782eeb6d405afa4140ac7ce1968bf4f"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.643322 4707 scope.go:117] "RemoveContainer" containerID="b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.644141 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.645093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-98k7p" event={"ID":"6ac97b49-c5fc-425e-91cb-8dd30213b600","Type":"ContainerDied","Data":"cf15a74f017c4fe4b2c1e3886b96606a9c9beb7ebdef20680c71b044bacb4a3e"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.645339 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf15a74f017c4fe4b2c1e3886b96606a9c9beb7ebdef20680c71b044bacb4a3e" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.645437 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-98k7p" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.646587 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ec845a4-b468-4927-a900-58afe8305a5c" containerID="d1c16be54872c293cc9299362e17bd0ce9ffc2c91beeb312a1ec0ca339b4c71b" exitCode=0 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.646623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" event={"ID":"7ec845a4-b468-4927-a900-58afe8305a5c","Type":"ContainerDied","Data":"d1c16be54872c293cc9299362e17bd0ce9ffc2c91beeb312a1ec0ca339b4c71b"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.649409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerStarted","Data":"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.649518 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-central-agent" containerID="cri-o://dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" gracePeriod=30 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.649686 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.649725 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="proxy-httpd" containerID="cri-o://cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" gracePeriod=30 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.649769 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="sg-core" containerID="cri-o://814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" gracePeriod=30 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.649798 4707 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-notification-agent" containerID="cri-o://da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" gracePeriod=30 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.656612 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c852d83-608f-44c0-961c-1a3f38a88224" containerID="daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2" exitCode=0 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.656631 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c852d83-608f-44c0-961c-1a3f38a88224" containerID="bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee" exitCode=143 Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.656755 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.659232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4c852d83-608f-44c0-961c-1a3f38a88224","Type":"ContainerDied","Data":"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.659281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4c852d83-608f-44c0-961c-1a3f38a88224","Type":"ContainerDied","Data":"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.659291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4c852d83-608f-44c0-961c-1a3f38a88224","Type":"ContainerDied","Data":"d6a4e8e0b9cd4954af9ec2f460f2dd1eb5c0d0ed7f1fe7a3f390199a7d539012"} Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.692644 4707 scope.go:117] "RemoveContainer" containerID="cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.701971 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.121967129 podStartE2EDuration="5.701958s" podCreationTimestamp="2026-01-21 15:26:43 +0000 UTC" firstStartedPulling="2026-01-21 15:26:44.49142509 +0000 UTC m=+1501.672941312" lastFinishedPulling="2026-01-21 15:26:48.071415961 +0000 UTC m=+1505.252932183" observedRunningTime="2026-01-21 15:26:48.693044363 +0000 UTC m=+1505.874560585" watchObservedRunningTime="2026-01-21 15:26:48.701958 +0000 UTC m=+1505.883474223" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.725410 4707 scope.go:117] "RemoveContainer" containerID="b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.725725 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9\": container with ID starting with b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9 not found: ID does not exist" containerID="b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.725749 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9"} err="failed to get container status \"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9\": rpc error: code = NotFound desc = could not find container \"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9\": container with ID starting with b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9 not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.725767 4707 scope.go:117] "RemoveContainer" containerID="cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.726000 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687\": container with ID starting with cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687 not found: ID does not exist" containerID="cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.726020 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687"} err="failed to get container status \"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687\": rpc error: code = NotFound desc = could not find container \"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687\": container with ID starting with cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687 not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.726040 4707 scope.go:117] "RemoveContainer" containerID="b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.726274 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9"} err="failed to get container status \"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9\": rpc error: code = NotFound desc = could not find container \"b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9\": container with ID starting with b028c3a8d31388d6192b7095ebe797087b68f2565f377b5f0f43e04068a366b9 not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.726302 4707 scope.go:117] "RemoveContainer" containerID="cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.726568 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687"} err="failed to get container status \"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687\": rpc error: code = NotFound desc = could not find container \"cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687\": container with ID starting with cd3e1f07baba8f39dcaff158cc50ee99b19b7a7819177f75456d839de7a18687 not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.726587 4707 scope.go:117] "RemoveContainer" containerID="daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.736311 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/placement-c4967b5bb-n8cvb"] Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.736851 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac97b49-c5fc-425e-91cb-8dd30213b600" containerName="placement-db-sync" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.736868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac97b49-c5fc-425e-91cb-8dd30213b600" containerName="placement-db-sync" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.736883 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-httpd" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.736888 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-httpd" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.736900 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-log" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.736905 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-log" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.736914 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-log" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.736920 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-log" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.736939 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-httpd" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.736944 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-httpd" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.737062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-httpd" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.737073 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac97b49-c5fc-425e-91cb-8dd30213b600" containerName="placement-db-sync" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.737089 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-httpd" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.737098 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" containerName="glance-log" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.737109 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" containerName="glance-log" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.737861 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.741496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.747081 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.747382 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-f68hj" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.758214 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-c4967b5bb-n8cvb"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.759964 4707 scope.go:117] "RemoveContainer" containerID="bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.778523 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.781886 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.787679 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.792420 4707 scope.go:117] "RemoveContainer" containerID="daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2" Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.792911 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2\": container with ID starting with daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2 not found: ID does not exist" containerID="daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.792946 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2"} err="failed to get container status \"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2\": rpc error: code = NotFound desc = could not find container \"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2\": container with ID starting with daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2 not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.792971 4707 scope.go:117] "RemoveContainer" containerID="bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.793875 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: E0121 15:26:48.810186 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee\": container with ID starting with bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee not found: ID does not exist" 
containerID="bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.810225 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee"} err="failed to get container status \"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee\": rpc error: code = NotFound desc = could not find container \"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee\": container with ID starting with bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.810249 4707 scope.go:117] "RemoveContainer" containerID="daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.820865 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.822044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.828920 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2"} err="failed to get container status \"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2\": rpc error: code = NotFound desc = could not find container \"daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2\": container with ID starting with daac1a1d2c1b738d4aaf5e97a1cc5d89ccbd1204c328617c4abea65c01c50be2 not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.828962 4707 scope.go:117] "RemoveContainer" containerID="bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.831126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.831344 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.831450 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-9hnf8" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.831560 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.839876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-config-data\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.839953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-scripts\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc 
kubenswrapper[4707]: I0121 15:26:48.840001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wz7j\" (UniqueName: \"kubernetes.io/projected/4e9925f1-bf87-4f65-b5bd-4827d6946de5-kube-api-access-7wz7j\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.840030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-combined-ca-bundle\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.840046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9925f1-bf87-4f65-b5bd-4827d6946de5-logs\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.842801 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.847543 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee"} err="failed to get container status \"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee\": rpc error: code = NotFound desc = could not find container \"bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee\": container with ID starting with bba25ed8ed5ed1157c5a331fd0f45128f12888561203efcd6ec06a9de06d9bee not found: ID does not exist" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.849489 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.850928 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.854014 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.857625 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.857823 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gmvc\" (UniqueName: \"kubernetes.io/projected/d8665ece-e4f3-481b-b4eb-92126976834e-kube-api-access-8gmvc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9925f1-bf87-4f65-b5bd-4827d6946de5-logs\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-logs\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: 
I0121 15:26:48.941615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-config-data\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-scripts\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wz7j\" (UniqueName: \"kubernetes.io/projected/4e9925f1-bf87-4f65-b5bd-4827d6946de5-kube-api-access-7wz7j\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.941747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-combined-ca-bundle\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.942596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9925f1-bf87-4f65-b5bd-4827d6946de5-logs\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.955244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-scripts\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.955388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-combined-ca-bundle\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.968636 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-config-data\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:48 crc kubenswrapper[4707]: I0121 15:26:48.987356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wz7j\" (UniqueName: \"kubernetes.io/projected/4e9925f1-bf87-4f65-b5bd-4827d6946de5-kube-api-access-7wz7j\") pod \"placement-c4967b5bb-n8cvb\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-logs\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gmvc\" (UniqueName: \"kubernetes.io/projected/d8665ece-e4f3-481b-b4eb-92126976834e-kube-api-access-8gmvc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4sxs\" (UniqueName: \"kubernetes.io/projected/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-kube-api-access-h4sxs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc 
kubenswrapper[4707]: I0121 15:26:49.043787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.043981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.044014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.044449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-logs\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.048097 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.051092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.052662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.052884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.053583 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.060378 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.064446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gmvc\" (UniqueName: \"kubernetes.io/projected/d8665ece-e4f3-481b-b4eb-92126976834e-kube-api-access-8gmvc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.065473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.069073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.144004 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4sxs\" (UniqueName: \"kubernetes.io/projected/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-kube-api-access-h4sxs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148475 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.148515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.150204 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.150539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.150566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.158214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.158690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.158700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.164251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-h4sxs\" (UniqueName: \"kubernetes.io/projected/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-kube-api-access-h4sxs\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.165942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.168188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.184245 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.193912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1773e7c4-491e-4ef7-a558-d61e6fa3439e" path="/var/lib/kubelet/pods/1773e7c4-491e-4ef7-a558-d61e6fa3439e/volumes" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.194626 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c852d83-608f-44c0-961c-1a3f38a88224" path="/var/lib/kubelet/pods/4c852d83-608f-44c0-961c-1a3f38a88224/volumes" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.203602 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.206664 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.225102 4707 scope.go:117] "RemoveContainer" containerID="9cf6b1987a60900afa662dfbbf59d1ae2b96a810d41798201bc72c7371a322e5" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.264001 4707 scope.go:117] "RemoveContainer" containerID="ef39347322b8be3a3ce0794824042d8b99eb534953ab353e81ab65e6783f7ddf" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.292645 4707 scope.go:117] "RemoveContainer" containerID="16dd7b38c695d1db80f48d375ca8303e3621f5b7878940f4a752db4a7794b23c" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.313295 4707 scope.go:117] "RemoveContainer" containerID="aa23da1e8ad14c7c75f4c95b67498188f812619ba11cff54ffcb28ddef51018c" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.338270 4707 scope.go:117] "RemoveContainer" containerID="6fad03905ac9924024bbd83b187773be3d1d8859cb027e58c2f343735ac4b6a5" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-db-sync-config-data\") pod \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkvxl\" (UniqueName: \"kubernetes.io/projected/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-kube-api-access-xkvxl\") pod \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-credential-keys\") pod \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9k97\" (UniqueName: \"kubernetes.io/projected/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-kube-api-access-l9k97\") pod \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-config-data\") pod \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-combined-ca-bundle\") pod \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-scripts\") pod \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351585 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-fernet-keys\") pod \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\" (UID: \"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.351608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-combined-ca-bundle\") pod \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\" (UID: \"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.356400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-scripts" (OuterVolumeSpecName: "scripts") pod "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" (UID: "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.361958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" (UID: "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.361968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-kube-api-access-xkvxl" (OuterVolumeSpecName: "kube-api-access-xkvxl") pod "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" (UID: "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01"). InnerVolumeSpecName "kube-api-access-xkvxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.361989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-kube-api-access-l9k97" (OuterVolumeSpecName: "kube-api-access-l9k97") pod "de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" (UID: "de2f4f3c-d3df-40d2-91e4-4f98a4487bbf"). InnerVolumeSpecName "kube-api-access-l9k97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.362011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" (UID: "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.363498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" (UID: "de2f4f3c-d3df-40d2-91e4-4f98a4487bbf"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.377299 4707 scope.go:117] "RemoveContainer" containerID="e48534f9c080cffaac35d1488292d55f1059c686dc837998c490ccb3e1c2e0fb" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.378715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" (UID: "de2f4f3c-d3df-40d2-91e4-4f98a4487bbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.382730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" (UID: "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.391861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-config-data" (OuterVolumeSpecName: "config-data") pod "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" (UID: "0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.400725 4707 scope.go:117] "RemoveContainer" containerID="ab3eeddc6598c297fab3d07fa539d52160b58955adf32604c651489c3a40b876" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.421173 4707 scope.go:117] "RemoveContainer" containerID="d4289b6efa74e80d01a680e30c7ad22037e55e14b8070a9387edcaf4166f1213" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.442461 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.444640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-c4967b5bb-n8cvb"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453336 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453360 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453370 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453379 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453387 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkvxl\" (UniqueName: \"kubernetes.io/projected/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-kube-api-access-xkvxl\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453395 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453402 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9k97\" (UniqueName: \"kubernetes.io/projected/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf-kube-api-access-l9k97\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453409 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.453416 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.454306 4707 scope.go:117] "RemoveContainer" containerID="1b506835d91987c2c094b1c1d50eee1a83ac3f42c3cedbbbcadfbdf94de1575a" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-config-data\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-run-httpd\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc 
kubenswrapper[4707]: I0121 15:26:49.554211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlkkz\" (UniqueName: \"kubernetes.io/projected/eb2d6ace-515d-4728-8340-aff4c8419520-kube-api-access-tlkkz\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-scripts\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-sg-core-conf-yaml\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-combined-ca-bundle\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-log-httpd\") pod \"eb2d6ace-515d-4728-8340-aff4c8419520\" (UID: \"eb2d6ace-515d-4728-8340-aff4c8419520\") " Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.554992 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.555076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.557730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-scripts" (OuterVolumeSpecName: "scripts") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.557743 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2d6ace-515d-4728-8340-aff4c8419520-kube-api-access-tlkkz" (OuterVolumeSpecName: "kube-api-access-tlkkz") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "kube-api-access-tlkkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.573927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.609896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.633708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-config-data" (OuterVolumeSpecName: "config-data") pod "eb2d6ace-515d-4728-8340-aff4c8419520" (UID: "eb2d6ace-515d-4728-8340-aff4c8419520"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.640906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:26:49 crc kubenswrapper[4707]: W0121 15:26:49.649339 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8665ece_e4f3_481b_b4eb_92126976834e.slice/crio-5df24f191ad32d87727a20232e5220227000c90950cd69f0d4401595dae00926 WatchSource:0}: Error finding container 5df24f191ad32d87727a20232e5220227000c90950cd69f0d4401595dae00926: Status 404 returned error can't find the container with id 5df24f191ad32d87727a20232e5220227000c90950cd69f0d4401595dae00926 Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.656157 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.656193 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.656202 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb2d6ace-515d-4728-8340-aff4c8419520-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.656212 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.656221 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlkkz\" (UniqueName: \"kubernetes.io/projected/eb2d6ace-515d-4728-8340-aff4c8419520-kube-api-access-tlkkz\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.656229 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb2d6ace-515d-4728-8340-aff4c8419520-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.673523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8665ece-e4f3-481b-b4eb-92126976834e","Type":"ContainerStarted","Data":"5df24f191ad32d87727a20232e5220227000c90950cd69f0d4401595dae00926"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.682193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" event={"ID":"de2f4f3c-d3df-40d2-91e4-4f98a4487bbf","Type":"ContainerDied","Data":"d606bef206eabd531add18604e1e1d8bf4c02f4b0bbe47b744c7a3c985dcbe25"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.682240 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d606bef206eabd531add18604e1e1d8bf4c02f4b0bbe47b744c7a3c985dcbe25" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.682309 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-bb59f" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.691395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" event={"ID":"4e9925f1-bf87-4f65-b5bd-4827d6946de5","Type":"ContainerStarted","Data":"639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.691437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" event={"ID":"4e9925f1-bf87-4f65-b5bd-4827d6946de5","Type":"ContainerStarted","Data":"f407d8cb0f06c50076577d76bb830ca84b508b33df40e00201e5a537cb1fb334"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.696716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" event={"ID":"0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01","Type":"ContainerDied","Data":"b47190e0b160f376ff30e464b217d6346a029ab027913083374ca5b3e655448e"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.696756 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b47190e0b160f376ff30e464b217d6346a029ab027913083374ca5b3e655448e" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.696849 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rpsr2" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.698906 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb2d6ace-515d-4728-8340-aff4c8419520" containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" exitCode=0 Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.698925 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb2d6ace-515d-4728-8340-aff4c8419520" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" exitCode=2 Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.698934 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb2d6ace-515d-4728-8340-aff4c8419520" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" exitCode=0 Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.698941 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb2d6ace-515d-4728-8340-aff4c8419520" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" exitCode=0 Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699138 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerDied","Data":"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerDied","Data":"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerDied","Data":"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerDied","Data":"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb2d6ace-515d-4728-8340-aff4c8419520","Type":"ContainerDied","Data":"8cd04acbc5ede1276abbf82885d24785a55c3e77f8c3d9c9611a2f7ed178e5f6"} Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.699298 4707 scope.go:117] "RemoveContainer" containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.777397 4707 scope.go:117] "RemoveContainer" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.789561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808103 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b555f485-zww44"] Jan 21 15:26:49 crc kubenswrapper[4707]: E0121 15:26:49.808640 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="proxy-httpd" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808653 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="proxy-httpd" Jan 21 15:26:49 crc kubenswrapper[4707]: E0121 15:26:49.808666 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-central-agent" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-central-agent" Jan 21 15:26:49 crc kubenswrapper[4707]: E0121 15:26:49.808685 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="sg-core" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808690 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="sg-core" Jan 21 15:26:49 crc kubenswrapper[4707]: E0121 
15:26:49.808704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" containerName="keystone-bootstrap" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" containerName="keystone-bootstrap" Jan 21 15:26:49 crc kubenswrapper[4707]: E0121 15:26:49.808726 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" containerName="barbican-db-sync" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808731 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" containerName="barbican-db-sync" Jan 21 15:26:49 crc kubenswrapper[4707]: E0121 15:26:49.808741 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-notification-agent" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808746 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-notification-agent" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808878 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="proxy-httpd" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808889 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" containerName="keystone-bootstrap" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808896 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-central-agent" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808909 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="ceilometer-notification-agent" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808918 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" containerName="barbican-db-sync" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.808934 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" containerName="sg-core" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.809697 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.812773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-6vfbz" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.812929 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.813048 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.815915 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b555f485-zww44"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.824745 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.837083 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.839906 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.842546 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.852918 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.860903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.863295 4707 scope.go:117] "RemoveContainer" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.865252 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rpsr2"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.870447 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rpsr2"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.894900 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.898216 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.901734 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.901920 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.909734 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-nz8gm"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.910950 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.914718 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-wp4kf" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.914941 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.915038 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.915128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.915294 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.915376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.926904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-nz8gm"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.943119 4707 scope.go:117] "RemoveContainer" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.948167 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.949519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.955168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx"] Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.956032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4jzx\" (UniqueName: \"kubernetes.io/projected/8a287941-c7ad-4671-904b-03172bc803e8-kube-api-access-l4jzx\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a287941-c7ad-4671-904b-03172bc803e8-logs\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-combined-ca-bundle\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963568 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data-custom\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c27903-58da-47dc-97d0-b1328c58303d-logs\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-combined-ca-bundle\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v862r\" (UniqueName: \"kubernetes.io/projected/d4c27903-58da-47dc-97d0-b1328c58303d-kube-api-access-v862r\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:49 crc kubenswrapper[4707]: I0121 15:26:49.963716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data-custom\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.050441 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5977d44c4b-9wznx"] Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.051696 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.056697 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.056900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.063651 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5977d44c4b-9wznx"] Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.064849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-combined-ca-bundle\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.064886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-combined-ca-bundle\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.064908 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-combined-ca-bundle\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.064934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data-custom\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.064954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c27903-58da-47dc-97d0-b1328c58303d-logs\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.064979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-config-data\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065310 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-combined-ca-bundle\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qlj\" (UniqueName: \"kubernetes.io/projected/d646bb51-5d6a-429f-9a18-915d0a69bd1b-kube-api-access-75qlj\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-scripts\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v862r\" (UniqueName: \"kubernetes.io/projected/d4c27903-58da-47dc-97d0-b1328c58303d-kube-api-access-v862r\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data-custom\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-scripts\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-config-data\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-fernet-keys\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619a1e68-03a6-42a3-b3ae-b9ff576688b3-logs\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-credential-keys\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4jzx\" (UniqueName: \"kubernetes.io/projected/8a287941-c7ad-4671-904b-03172bc803e8-kube-api-access-l4jzx\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data-custom\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8xz\" (UniqueName: 
\"kubernetes.io/projected/619a1e68-03a6-42a3-b3ae-b9ff576688b3-kube-api-access-4v8xz\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwtm8\" (UniqueName: \"kubernetes.io/projected/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-kube-api-access-rwtm8\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c27903-58da-47dc-97d0-b1328c58303d-logs\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.065916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a287941-c7ad-4671-904b-03172bc803e8-logs\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.068931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a287941-c7ad-4671-904b-03172bc803e8-logs\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.071772 4707 scope.go:117] "RemoveContainer" containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.077281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data-custom\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.083628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: E0121 15:26:50.084550 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": container with ID starting with 
cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88 not found: ID does not exist" containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.084578 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88"} err="failed to get container status \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": rpc error: code = NotFound desc = could not find container \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": container with ID starting with cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.084607 4707 scope.go:117] "RemoveContainer" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.088298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v862r\" (UniqueName: \"kubernetes.io/projected/d4c27903-58da-47dc-97d0-b1328c58303d-kube-api-access-v862r\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.096942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4jzx\" (UniqueName: \"kubernetes.io/projected/8a287941-c7ad-4671-904b-03172bc803e8-kube-api-access-l4jzx\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: E0121 15:26:50.097015 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": container with ID starting with 814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd not found: ID does not exist" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.097033 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd"} err="failed to get container status \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": rpc error: code = NotFound desc = could not find container \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": container with ID starting with 814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.097048 4707 scope.go:117] "RemoveContainer" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.098034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data-custom\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.098517 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.100413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-combined-ca-bundle\") pod \"barbican-keystone-listener-d5d8c859b-2fbnc\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.100483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-combined-ca-bundle\") pod \"barbican-worker-7b555f485-zww44\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: E0121 15:26:50.100545 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": container with ID starting with da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021 not found: ID does not exist" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.100561 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021"} err="failed to get container status \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": rpc error: code = NotFound desc = could not find container \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": container with ID starting with da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.100573 4707 scope.go:117] "RemoveContainer" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" Jan 21 15:26:50 crc kubenswrapper[4707]: E0121 15:26:50.102209 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": container with ID starting with dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc not found: ID does not exist" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.102237 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc"} err="failed to get container status \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": rpc error: code = NotFound desc = could not find container \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": container with ID starting with dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.102249 4707 scope.go:117] "RemoveContainer" 
containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.103747 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88"} err="failed to get container status \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": rpc error: code = NotFound desc = could not find container \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": container with ID starting with cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.104039 4707 scope.go:117] "RemoveContainer" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.104470 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd"} err="failed to get container status \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": rpc error: code = NotFound desc = could not find container \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": container with ID starting with 814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.104526 4707 scope.go:117] "RemoveContainer" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.104780 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021"} err="failed to get container status \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": rpc error: code = NotFound desc = could not find container \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": container with ID starting with da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.104801 4707 scope.go:117] "RemoveContainer" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.106276 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc"} err="failed to get container status \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": rpc error: code = NotFound desc = could not find container \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": container with ID starting with dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.106298 4707 scope.go:117] "RemoveContainer" containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.106642 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88"} err="failed to get container status \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": rpc error: code = NotFound desc = could not find 
container \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": container with ID starting with cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.106686 4707 scope.go:117] "RemoveContainer" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.108013 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd"} err="failed to get container status \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": rpc error: code = NotFound desc = could not find container \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": container with ID starting with 814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.108055 4707 scope.go:117] "RemoveContainer" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.109956 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021"} err="failed to get container status \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": rpc error: code = NotFound desc = could not find container \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": container with ID starting with da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.109995 4707 scope.go:117] "RemoveContainer" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110192 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc"} err="failed to get container status \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": rpc error: code = NotFound desc = could not find container \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": container with ID starting with dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110212 4707 scope.go:117] "RemoveContainer" containerID="cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110372 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88"} err="failed to get container status \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": rpc error: code = NotFound desc = could not find container \"cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88\": container with ID starting with cbf071f57b51593a45fb0db28ae33fee45df21c0547467709327f13ea7567c88 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110385 4707 scope.go:117] "RemoveContainer" containerID="814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110565 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd"} err="failed to get container status \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": rpc error: code = NotFound desc = could not find container \"814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd\": container with ID starting with 814f8331022019e26304062d7a538c272fd5db3a74820eb6d98983aa7fa07cbd not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110583 4707 scope.go:117] "RemoveContainer" containerID="da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110760 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021"} err="failed to get container status \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": rpc error: code = NotFound desc = could not find container \"da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021\": container with ID starting with da0267eb16b818d868ef8570d6116b86d51dc88eeced8d975d5533d64dc73021 not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.110893 4707 scope.go:117] "RemoveContainer" containerID="dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.111088 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc"} err="failed to get container status \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": rpc error: code = NotFound desc = could not find container \"dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc\": container with ID starting with dbea07cf79c2c61bfa6323b347a6efde350ad08bb67cc366979ccbe633f963bc not found: ID does not exist" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.141184 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.156395 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-combined-ca-bundle\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-combined-ca-bundle\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-logs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-public-tls-certs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-config-data\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qlj\" (UniqueName: \"kubernetes.io/projected/d646bb51-5d6a-429f-9a18-915d0a69bd1b-kube-api-access-75qlj\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-scripts\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-scripts\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.176961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-config-data\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-fernet-keys\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619a1e68-03a6-42a3-b3ae-b9ff576688b3-logs\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjn6\" (UniqueName: \"kubernetes.io/projected/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-kube-api-access-fnjn6\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-credential-keys\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-combined-ca-bundle\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc 
kubenswrapper[4707]: I0121 15:26:50.177432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-internal-tls-certs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.177503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data-custom\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.178565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-config-data\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.178695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8xz\" (UniqueName: \"kubernetes.io/projected/619a1e68-03a6-42a3-b3ae-b9ff576688b3-kube-api-access-4v8xz\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.178763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwtm8\" (UniqueName: \"kubernetes.io/projected/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-kube-api-access-rwtm8\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.178851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-scripts\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.178950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.182329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-scripts\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.183464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619a1e68-03a6-42a3-b3ae-b9ff576688b3-logs\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" 
Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.184043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.187393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-fernet-keys\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.189091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-config-data\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.190649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.192101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.178472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.193246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-combined-ca-bundle\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.194289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.192764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-credential-keys\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.195011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-combined-ca-bundle\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.197559 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-config-data\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.197675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data-custom\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.200620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-scripts\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.201838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwtm8\" (UniqueName: \"kubernetes.io/projected/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-kube-api-access-rwtm8\") pod \"keystone-bootstrap-nz8gm\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.202223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qlj\" (UniqueName: \"kubernetes.io/projected/d646bb51-5d6a-429f-9a18-915d0a69bd1b-kube-api-access-75qlj\") pod \"ceilometer-0\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.208706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8xz\" (UniqueName: \"kubernetes.io/projected/619a1e68-03a6-42a3-b3ae-b9ff576688b3-kube-api-access-4v8xz\") pod \"barbican-api-6999c6fbd4-scrnx\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.209646 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.230526 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.237447 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-logs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-public-tls-certs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjn6\" (UniqueName: \"kubernetes.io/projected/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-kube-api-access-fnjn6\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-combined-ca-bundle\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-internal-tls-certs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-config-data\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-scripts\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.280961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-logs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.284423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-scripts\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " 
pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.287026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-internal-tls-certs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.301166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-combined-ca-bundle\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.301474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-public-tls-certs\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.302538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-config-data\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.303699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjn6\" (UniqueName: \"kubernetes.io/projected/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-kube-api-access-fnjn6\") pod \"placement-5977d44c4b-9wznx\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.366505 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-scripts\") pod \"7ec845a4-b468-4927-a900-58afe8305a5c\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwpll\" (UniqueName: \"kubernetes.io/projected/7ec845a4-b468-4927-a900-58afe8305a5c-kube-api-access-dwpll\") pod \"7ec845a4-b468-4927-a900-58afe8305a5c\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382541 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec845a4-b468-4927-a900-58afe8305a5c-etc-machine-id\") pod \"7ec845a4-b468-4927-a900-58afe8305a5c\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-db-sync-config-data\") pod \"7ec845a4-b468-4927-a900-58afe8305a5c\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-config-data\") pod \"7ec845a4-b468-4927-a900-58afe8305a5c\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-combined-ca-bundle\") pod \"7ec845a4-b468-4927-a900-58afe8305a5c\" (UID: \"7ec845a4-b468-4927-a900-58afe8305a5c\") " Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.382671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ec845a4-b468-4927-a900-58afe8305a5c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ec845a4-b468-4927-a900-58afe8305a5c" (UID: "7ec845a4-b468-4927-a900-58afe8305a5c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.383233 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ec845a4-b468-4927-a900-58afe8305a5c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.388015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec845a4-b468-4927-a900-58afe8305a5c-kube-api-access-dwpll" (OuterVolumeSpecName: "kube-api-access-dwpll") pod "7ec845a4-b468-4927-a900-58afe8305a5c" (UID: "7ec845a4-b468-4927-a900-58afe8305a5c"). InnerVolumeSpecName "kube-api-access-dwpll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.389886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-scripts" (OuterVolumeSpecName: "scripts") pod "7ec845a4-b468-4927-a900-58afe8305a5c" (UID: "7ec845a4-b468-4927-a900-58afe8305a5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:50 crc kubenswrapper[4707]: I0121 15:26:50.392202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7ec845a4-b468-4927-a900-58afe8305a5c" (UID: "7ec845a4-b468-4927-a900-58afe8305a5c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.420605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ec845a4-b468-4927-a900-58afe8305a5c" (UID: "7ec845a4-b468-4927-a900-58afe8305a5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.431377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b555f485-zww44"] Jan 21 15:26:51 crc kubenswrapper[4707]: W0121 15:26:50.444583 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a287941_c7ad_4671_904b_03172bc803e8.slice/crio-511a8eb1b7daae3f71a300332cef4579f5623d1cbe859d8ebe91b9de425890c3 WatchSource:0}: Error finding container 511a8eb1b7daae3f71a300332cef4579f5623d1cbe859d8ebe91b9de425890c3: Status 404 returned error can't find the container with id 511a8eb1b7daae3f71a300332cef4579f5623d1cbe859d8ebe91b9de425890c3 Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.471260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-config-data" (OuterVolumeSpecName: "config-data") pod "7ec845a4-b468-4927-a900-58afe8305a5c" (UID: "7ec845a4-b468-4927-a900-58afe8305a5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.473961 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.485008 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.485028 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwpll\" (UniqueName: \"kubernetes.io/projected/7ec845a4-b468-4927-a900-58afe8305a5c-kube-api-access-dwpll\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.485037 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.485045 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.485053 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec845a4-b468-4927-a900-58afe8305a5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.692731 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.728397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" event={"ID":"8a287941-c7ad-4671-904b-03172bc803e8","Type":"ContainerStarted","Data":"7d5e60913431e94a855dc86f1bcab868f337b843d01bafdb458a1a24b751facd"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.728485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" event={"ID":"8a287941-c7ad-4671-904b-03172bc803e8","Type":"ContainerStarted","Data":"511a8eb1b7daae3f71a300332cef4579f5623d1cbe859d8ebe91b9de425890c3"} Jan 21 15:26:51 crc kubenswrapper[4707]: W0121 15:26:50.736980 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4c27903_58da_47dc_97d0_b1328c58303d.slice/crio-3adbc289fd06dd5710f77b95284b3975c02718651b316f0da0678395de0803c2 WatchSource:0}: Error finding container 3adbc289fd06dd5710f77b95284b3975c02718651b316f0da0678395de0803c2: Status 404 returned error can't find the container with id 3adbc289fd06dd5710f77b95284b3975c02718651b316f0da0678395de0803c2 Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.737095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" event={"ID":"4e9925f1-bf87-4f65-b5bd-4827d6946de5","Type":"ContainerStarted","Data":"4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.737146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.737157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 
15:26:50.749407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4a9ee08-b47e-42c9-a557-35c13dbc9b16","Type":"ContainerStarted","Data":"780f1ff1126457e163f15d285904bf20997e1fa8d3a62e958bafdb6e38ea3073"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.749446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4a9ee08-b47e-42c9-a557-35c13dbc9b16","Type":"ContainerStarted","Data":"15bcf13514063cfbb4c689e47fe96488036211848dbf21d83d780560a3e9fffb"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.761016 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.766208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-g9gwr" event={"ID":"7ec845a4-b468-4927-a900-58afe8305a5c","Type":"ContainerDied","Data":"1e7f10cf3a09839e9b128dbe3648bc4f4a94bf885717074e5682da3ca209459a"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.766246 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7f10cf3a09839e9b128dbe3648bc4f4a94bf885717074e5682da3ca209459a" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.776112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8665ece-e4f3-481b-b4eb-92126976834e","Type":"ContainerStarted","Data":"c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.802785 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" podStartSLOduration=2.802770356 podStartE2EDuration="2.802770356s" podCreationTimestamp="2026-01-21 15:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:50.760836351 +0000 UTC m=+1507.942352573" watchObservedRunningTime="2026-01-21 15:26:50.802770356 +0000 UTC m=+1507.984286568" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.807692 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: E0121 15:26:50.808008 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec845a4-b468-4927-a900-58afe8305a5c" containerName="cinder-db-sync" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.808020 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec845a4-b468-4927-a900-58afe8305a5c" containerName="cinder-db-sync" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.808293 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec845a4-b468-4927-a900-58afe8305a5c" containerName="cinder-db-sync" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.809125 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.816270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-6x6jk" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.816450 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.816585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.816694 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.818687 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.844791 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.850386 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.850902 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.852090 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.892976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.893014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.893034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7tbm\" (UniqueName: \"kubernetes.io/projected/5c632c77-1494-4a0d-a5de-b52325cbb042-kube-api-access-h7tbm\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.893073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c632c77-1494-4a0d-a5de-b52325cbb042-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.893644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.893709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f15d68-b4b0-40de-8e34-2e245ef31890-etc-machine-id\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-scripts\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7tbm\" (UniqueName: \"kubernetes.io/projected/5c632c77-1494-4a0d-a5de-b52325cbb042-kube-api-access-h7tbm\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data-custom\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c632c77-1494-4a0d-a5de-b52325cbb042-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq62g\" (UniqueName: \"kubernetes.io/projected/32f15d68-b4b0-40de-8e34-2e245ef31890-kube-api-access-rq62g\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995889 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f15d68-b4b0-40de-8e34-2e245ef31890-logs\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.995904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:50.997922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c632c77-1494-4a0d-a5de-b52325cbb042-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.000114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.000129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.000713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.003281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.012916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7tbm\" (UniqueName: \"kubernetes.io/projected/5c632c77-1494-4a0d-a5de-b52325cbb042-kube-api-access-h7tbm\") pod \"cinder-scheduler-0\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-scripts\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data-custom\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq62g\" (UniqueName: \"kubernetes.io/projected/32f15d68-b4b0-40de-8e34-2e245ef31890-kube-api-access-rq62g\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f15d68-b4b0-40de-8e34-2e245ef31890-logs\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f15d68-b4b0-40de-8e34-2e245ef31890-etc-machine-id\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.097997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.099211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f15d68-b4b0-40de-8e34-2e245ef31890-logs\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc 
kubenswrapper[4707]: I0121 15:26:51.099271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f15d68-b4b0-40de-8e34-2e245ef31890-etc-machine-id\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.101575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data-custom\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.101717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.101989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.102909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-scripts\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.126224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq62g\" (UniqueName: \"kubernetes.io/projected/32f15d68-b4b0-40de-8e34-2e245ef31890-kube-api-access-rq62g\") pod \"cinder-api-0\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.145506 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.183060 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.213983 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01" path="/var/lib/kubelet/pods/0716a5b9-4ed3-4a9c-a6cd-3f99993cdb01/volumes" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.214670 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2d6ace-515d-4728-8340-aff4c8419520" path="/var/lib/kubelet/pods/eb2d6ace-515d-4728-8340-aff4c8419520/volumes" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.443109 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-nz8gm"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.448834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: W0121 15:26:51.453352 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd646bb51_5d6a_429f_9a18_915d0a69bd1b.slice/crio-1d94c0376445ba919bdce62b1c153c6cff526ef8ac4dc17d0116f559380ed42a WatchSource:0}: Error finding container 1d94c0376445ba919bdce62b1c153c6cff526ef8ac4dc17d0116f559380ed42a: Status 404 returned error can't find the container with id 1d94c0376445ba919bdce62b1c153c6cff526ef8ac4dc17d0116f559380ed42a Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.619926 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5977d44c4b-9wznx"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.626557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.733125 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.740705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:51 crc kubenswrapper[4707]: W0121 15:26:51.769004 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c632c77_1494_4a0d_a5de_b52325cbb042.slice/crio-c69ff0081dc1987eaf08dc8b3249cc4ba49aff800c3335d21c049ad549ad0255 WatchSource:0}: Error finding container c69ff0081dc1987eaf08dc8b3249cc4ba49aff800c3335d21c049ad549ad0255: Status 404 returned error can't find the container with id c69ff0081dc1987eaf08dc8b3249cc4ba49aff800c3335d21c049ad549ad0255 Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.825832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5c632c77-1494-4a0d-a5de-b52325cbb042","Type":"ContainerStarted","Data":"c69ff0081dc1987eaf08dc8b3249cc4ba49aff800c3335d21c049ad549ad0255"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.834923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" event={"ID":"faf162ab-5917-4ddc-9b89-573d2e3bc7e6","Type":"ContainerStarted","Data":"b013e244262fdaa69d3b4ac19690becf57cae587ef55290f611f4f84e672f7bb"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.837506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"e4a9ee08-b47e-42c9-a557-35c13dbc9b16","Type":"ContainerStarted","Data":"10f61a011e329ec1d672daefa3c014bb1fa30336866182ba8c21a0dcaf122b87"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.841214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" event={"ID":"d4c27903-58da-47dc-97d0-b1328c58303d","Type":"ContainerStarted","Data":"858a7831d2593cf2e3d08f763a792322ce65f0af93f68b5ca794de32f64a614f"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.841254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" event={"ID":"d4c27903-58da-47dc-97d0-b1328c58303d","Type":"ContainerStarted","Data":"b4573b28ecd3cce9248e8a085493d4d875d269b88b74b45cf9bfa26ecf1c1f4a"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.841265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" event={"ID":"d4c27903-58da-47dc-97d0-b1328c58303d","Type":"ContainerStarted","Data":"3adbc289fd06dd5710f77b95284b3975c02718651b316f0da0678395de0803c2"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.843447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8665ece-e4f3-481b-b4eb-92126976834e","Type":"ContainerStarted","Data":"ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.844445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" event={"ID":"619a1e68-03a6-42a3-b3ae-b9ff576688b3","Type":"ContainerStarted","Data":"db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.844473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" event={"ID":"619a1e68-03a6-42a3-b3ae-b9ff576688b3","Type":"ContainerStarted","Data":"ca6b97a769e92173985c7a7df94f949ae9a1a273d65d44b9fb33c4d838c515bc"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.848565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"32f15d68-b4b0-40de-8e34-2e245ef31890","Type":"ContainerStarted","Data":"3c403c7a3dbb9a60414ce75a87ab3ab9c4f98e9d1850717d2336fd5cae6d1411"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.849528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerStarted","Data":"1d94c0376445ba919bdce62b1c153c6cff526ef8ac4dc17d0116f559380ed42a"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.855799 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.855788815 podStartE2EDuration="3.855788815s" podCreationTimestamp="2026-01-21 15:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:51.852802911 +0000 UTC m=+1509.034319134" watchObservedRunningTime="2026-01-21 15:26:51.855788815 +0000 UTC m=+1509.037305037" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.861366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" 
event={"ID":"8a287941-c7ad-4671-904b-03172bc803e8","Type":"ContainerStarted","Data":"f81c42fb84cf6dc6db11797efd104f4de9e6929e765ab0ffe83b4f838fac8280"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.864340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" event={"ID":"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9","Type":"ContainerStarted","Data":"5936c739ea4cec2b5636f30e7f2f0827fbed8af38286bc32dafbc452cfcb0151"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.864377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" event={"ID":"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9","Type":"ContainerStarted","Data":"767a757762c059515954ca490afde88787e88a6d356cd6789b8672564609587c"} Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.882111 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.882093328 podStartE2EDuration="3.882093328s" podCreationTimestamp="2026-01-21 15:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:51.879047532 +0000 UTC m=+1509.060563754" watchObservedRunningTime="2026-01-21 15:26:51.882093328 +0000 UTC m=+1509.063609550" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.907773 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" podStartSLOduration=2.907760792 podStartE2EDuration="2.907760792s" podCreationTimestamp="2026-01-21 15:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:51.907013087 +0000 UTC m=+1509.088529308" watchObservedRunningTime="2026-01-21 15:26:51.907760792 +0000 UTC m=+1509.089277014" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.951438 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" podStartSLOduration=2.951424158 podStartE2EDuration="2.951424158s" podCreationTimestamp="2026-01-21 15:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:51.933307459 +0000 UTC m=+1509.114823681" watchObservedRunningTime="2026-01-21 15:26:51.951424158 +0000 UTC m=+1509.132940381" Jan 21 15:26:51 crc kubenswrapper[4707]: I0121 15:26:51.968785 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" podStartSLOduration=2.968771692 podStartE2EDuration="2.968771692s" podCreationTimestamp="2026-01-21 15:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:51.949116761 +0000 UTC m=+1509.130632983" watchObservedRunningTime="2026-01-21 15:26:51.968771692 +0000 UTC m=+1509.150287914" Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.875944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" event={"ID":"faf162ab-5917-4ddc-9b89-573d2e3bc7e6","Type":"ContainerStarted","Data":"f89ab992dfc657152c4b63f182969f628ca255996d2039d51a99b906f27be1d7"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.876252 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" event={"ID":"faf162ab-5917-4ddc-9b89-573d2e3bc7e6","Type":"ContainerStarted","Data":"762d5296ac937d5e18cdab76d4b50154815f31ea6c484242500f5fa8c55bfbe3"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.876297 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.876319 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.878727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" event={"ID":"619a1e68-03a6-42a3-b3ae-b9ff576688b3","Type":"ContainerStarted","Data":"1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.878786 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.878861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.879972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerStarted","Data":"7070afec529f9d56c29393f342c1eb1b439a894789e2ac24b3d987a7804f4866"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.881001 4707 generic.go:334] "Generic (PLEG): container finished" podID="ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" containerID="359db13546bda46dfe24fc62a12209a680cd159a00de210a56790fbb67e0c993" exitCode=0 Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.881053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" event={"ID":"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9","Type":"ContainerDied","Data":"359db13546bda46dfe24fc62a12209a680cd159a00de210a56790fbb67e0c993"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.882152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"32f15d68-b4b0-40de-8e34-2e245ef31890","Type":"ContainerStarted","Data":"4b43dc70a4582485430ace8442dc121386a9ec12fd575aa2826f3ac4ab1a4ee1"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.883555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5c632c77-1494-4a0d-a5de-b52325cbb042","Type":"ContainerStarted","Data":"e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75"} Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.895756 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" podStartSLOduration=2.895741772 podStartE2EDuration="2.895741772s" podCreationTimestamp="2026-01-21 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:52.893499917 +0000 UTC m=+1510.075016139" watchObservedRunningTime="2026-01-21 15:26:52.895741772 +0000 UTC m=+1510.077257994" Jan 21 15:26:52 crc kubenswrapper[4707]: I0121 15:26:52.910590 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" podStartSLOduration=3.910573335 podStartE2EDuration="3.910573335s" podCreationTimestamp="2026-01-21 15:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:52.910098501 +0000 UTC m=+1510.091614724" watchObservedRunningTime="2026-01-21 15:26:52.910573335 +0000 UTC m=+1510.092089557" Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.891791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5c632c77-1494-4a0d-a5de-b52325cbb042","Type":"ContainerStarted","Data":"9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483"} Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.893937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerStarted","Data":"642edbea8c4d70be6bbceee8f568996d8a9009063407fe39ae292e3d4b7f4599"} Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.893967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerStarted","Data":"3209d9296b843327d03e27d6b54b024ed915b3086eafcf41994f7ec3896c4560"} Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.896798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"32f15d68-b4b0-40de-8e34-2e245ef31890","Type":"ContainerStarted","Data":"715e420cc0dbeaabde165b3dd077734a97ad55aed2a6aaa82e365654e9607a0d"} Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.896837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.909293 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.909278826 podStartE2EDuration="3.909278826s" podCreationTimestamp="2026-01-21 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:53.905050728 +0000 UTC m=+1511.086566949" watchObservedRunningTime="2026-01-21 15:26:53.909278826 +0000 UTC m=+1511.090795048" Jan 21 15:26:53 crc kubenswrapper[4707]: I0121 15:26:53.954864 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.954847975 podStartE2EDuration="3.954847975s" podCreationTimestamp="2026-01-21 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:53.920264429 +0000 UTC m=+1511.101780652" watchObservedRunningTime="2026-01-21 15:26:53.954847975 +0000 UTC m=+1511.136364198" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.267198 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.362521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-combined-ca-bundle\") pod \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.362595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhlmc\" (UniqueName: \"kubernetes.io/projected/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-kube-api-access-dhlmc\") pod \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.362697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-config\") pod \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\" (UID: \"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9\") " Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.367137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-kube-api-access-dhlmc" (OuterVolumeSpecName: "kube-api-access-dhlmc") pod "ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" (UID: "ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9"). InnerVolumeSpecName "kube-api-access-dhlmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.382006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-config" (OuterVolumeSpecName: "config") pod "ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" (UID: "ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.384205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" (UID: "ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.464182 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.464214 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhlmc\" (UniqueName: \"kubernetes.io/projected/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-kube-api-access-dhlmc\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.464227 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.904004 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" containerID="5936c739ea4cec2b5636f30e7f2f0827fbed8af38286bc32dafbc452cfcb0151" exitCode=0 Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.904052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" event={"ID":"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9","Type":"ContainerDied","Data":"5936c739ea4cec2b5636f30e7f2f0827fbed8af38286bc32dafbc452cfcb0151"} Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.906040 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.908470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-g2jcl" event={"ID":"ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9","Type":"ContainerDied","Data":"0b286cc7c90b6329aef6cece2cfe683fd7830de99081b2e5253a70c6fbefd778"} Jan 21 15:26:54 crc kubenswrapper[4707]: I0121 15:26:54.908597 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b286cc7c90b6329aef6cece2cfe683fd7830de99081b2e5253a70c6fbefd778" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.025348 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-686d8c7c65-gqrw6"] Jan 21 15:26:55 crc kubenswrapper[4707]: E0121 15:26:55.025761 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" containerName="neutron-db-sync" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.025846 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" containerName="neutron-db-sync" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.026073 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" containerName="neutron-db-sync" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.026856 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.030070 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.030162 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-j2bpz" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.030296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.030340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.039852 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-686d8c7c65-gqrw6"] Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.073003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-config\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.073358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txxsg\" (UniqueName: \"kubernetes.io/projected/1acf0601-50f7-429b-b0ff-a5db620b581e-kube-api-access-txxsg\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.073462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-combined-ca-bundle\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.073538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-httpd-config\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.073569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-ovndb-tls-certs\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.175356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-httpd-config\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.175416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-ovndb-tls-certs\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.175484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-config\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.175503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxsg\" (UniqueName: \"kubernetes.io/projected/1acf0601-50f7-429b-b0ff-a5db620b581e-kube-api-access-txxsg\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.175542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-combined-ca-bundle\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.179510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-ovndb-tls-certs\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.180372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-combined-ca-bundle\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.180534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-httpd-config\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.186879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-config\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.197248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxsg\" (UniqueName: \"kubernetes.io/projected/1acf0601-50f7-429b-b0ff-a5db620b581e-kube-api-access-txxsg\") pod \"neutron-686d8c7c65-gqrw6\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.338843 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.725947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-686d8c7c65-gqrw6"] Jan 21 15:26:55 crc kubenswrapper[4707]: W0121 15:26:55.732605 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1acf0601_50f7_429b_b0ff_a5db620b581e.slice/crio-97995bc5e9daa3b3fbb230f54f5546808e2a2cafc7a540976692094cff28ae9d WatchSource:0}: Error finding container 97995bc5e9daa3b3fbb230f54f5546808e2a2cafc7a540976692094cff28ae9d: Status 404 returned error can't find the container with id 97995bc5e9daa3b3fbb230f54f5546808e2a2cafc7a540976692094cff28ae9d Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.912864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" event={"ID":"1acf0601-50f7-429b-b0ff-a5db620b581e","Type":"ContainerStarted","Data":"066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a"} Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.913072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" event={"ID":"1acf0601-50f7-429b-b0ff-a5db620b581e","Type":"ContainerStarted","Data":"97995bc5e9daa3b3fbb230f54f5546808e2a2cafc7a540976692094cff28ae9d"} Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.917276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerStarted","Data":"ed215b0d90d236ebda6b3d56bffa7d616362beff92e265c250edc13c4aca5a9e"} Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.917417 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:26:55 crc kubenswrapper[4707]: I0121 15:26:55.935917 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.554074532 podStartE2EDuration="6.935905855s" podCreationTimestamp="2026-01-21 15:26:49 +0000 UTC" firstStartedPulling="2026-01-21 15:26:51.461586731 +0000 UTC m=+1508.643102953" lastFinishedPulling="2026-01-21 15:26:54.843418054 +0000 UTC m=+1512.024934276" observedRunningTime="2026-01-21 15:26:55.933877101 +0000 UTC m=+1513.115393323" watchObservedRunningTime="2026-01-21 15:26:55.935905855 +0000 UTC m=+1513.117422077" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.146099 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.150789 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.193687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-fernet-keys\") pod \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.193727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-credential-keys\") pod \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.193756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-combined-ca-bundle\") pod \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.193780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-scripts\") pod \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.194204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwtm8\" (UniqueName: \"kubernetes.io/projected/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-kube-api-access-rwtm8\") pod \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.194249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-config-data\") pod \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\" (UID: \"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9\") " Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.198278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" (UID: "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.200309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-kube-api-access-rwtm8" (OuterVolumeSpecName: "kube-api-access-rwtm8") pod "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" (UID: "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9"). InnerVolumeSpecName "kube-api-access-rwtm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.202793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-scripts" (OuterVolumeSpecName: "scripts") pod "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" (UID: "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.202988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" (UID: "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.236716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-config-data" (OuterVolumeSpecName: "config-data") pod "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" (UID: "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.238347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" (UID: "7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.296321 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.296351 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.296361 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwtm8\" (UniqueName: \"kubernetes.io/projected/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-kube-api-access-rwtm8\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.296371 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.296379 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.296387 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.591471 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.591823 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api-log" containerID="cri-o://4b43dc70a4582485430ace8442dc121386a9ec12fd575aa2826f3ac4ab1a4ee1" gracePeriod=30 Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.591858 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api" containerID="cri-o://715e420cc0dbeaabde165b3dd077734a97ad55aed2a6aaa82e365654e9607a0d" gracePeriod=30 Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.929724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" event={"ID":"1acf0601-50f7-429b-b0ff-a5db620b581e","Type":"ContainerStarted","Data":"0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0"} Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.929792 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.936862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" event={"ID":"7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9","Type":"ContainerDied","Data":"767a757762c059515954ca490afde88787e88a6d356cd6789b8672564609587c"} Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.936917 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767a757762c059515954ca490afde88787e88a6d356cd6789b8672564609587c" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.936958 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-nz8gm" Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.951051 4707 generic.go:334] "Generic (PLEG): container finished" podID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerID="715e420cc0dbeaabde165b3dd077734a97ad55aed2a6aaa82e365654e9607a0d" exitCode=0 Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.951075 4707 generic.go:334] "Generic (PLEG): container finished" podID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerID="4b43dc70a4582485430ace8442dc121386a9ec12fd575aa2826f3ac4ab1a4ee1" exitCode=143 Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.968798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"32f15d68-b4b0-40de-8e34-2e245ef31890","Type":"ContainerDied","Data":"715e420cc0dbeaabde165b3dd077734a97ad55aed2a6aaa82e365654e9607a0d"} Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.969108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"32f15d68-b4b0-40de-8e34-2e245ef31890","Type":"ContainerDied","Data":"4b43dc70a4582485430ace8442dc121386a9ec12fd575aa2826f3ac4ab1a4ee1"} Jan 21 15:26:56 crc kubenswrapper[4707]: I0121 15:26:56.987065 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" podStartSLOduration=1.9870456330000001 podStartE2EDuration="1.987045633s" podCreationTimestamp="2026-01-21 15:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:56.968967286 +0000 UTC m=+1514.150483508" watchObservedRunningTime="2026-01-21 15:26:56.987045633 +0000 UTC m=+1514.168561854" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.055331 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-664998d598-nfbb8"] Jan 21 15:26:57 crc kubenswrapper[4707]: E0121 15:26:57.055626 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" containerName="keystone-bootstrap" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.055644 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" containerName="keystone-bootstrap" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.055837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" containerName="keystone-bootstrap" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.056375 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.059030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.059568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-wp4kf" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.060066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.060222 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.060331 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.060435 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.063943 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.087843 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-664998d598-nfbb8"] Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-scripts\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq62g\" (UniqueName: \"kubernetes.io/projected/32f15d68-b4b0-40de-8e34-2e245ef31890-kube-api-access-rq62g\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data-custom\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f15d68-b4b0-40de-8e34-2e245ef31890-logs\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-combined-ca-bundle\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.113843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f15d68-b4b0-40de-8e34-2e245ef31890-etc-machine-id\") pod \"32f15d68-b4b0-40de-8e34-2e245ef31890\" (UID: \"32f15d68-b4b0-40de-8e34-2e245ef31890\") " Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2p5d\" (UniqueName: \"kubernetes.io/projected/4638ccbe-c266-4bff-8771-67eb02be0a6a-kube-api-access-w2p5d\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-fernet-keys\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc 
kubenswrapper[4707]: I0121 15:26:57.114117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-config-data\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-internal-tls-certs\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-combined-ca-bundle\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-public-tls-certs\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-scripts\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.114261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-credential-keys\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.117275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32f15d68-b4b0-40de-8e34-2e245ef31890-logs" (OuterVolumeSpecName: "logs") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.120915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f15d68-b4b0-40de-8e34-2e245ef31890-kube-api-access-rq62g" (OuterVolumeSpecName: "kube-api-access-rq62g") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "kube-api-access-rq62g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.126077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.126241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f15d68-b4b0-40de-8e34-2e245ef31890-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.133351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-scripts" (OuterVolumeSpecName: "scripts") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.154593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.184525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data" (OuterVolumeSpecName: "config-data") pod "32f15d68-b4b0-40de-8e34-2e245ef31890" (UID: "32f15d68-b4b0-40de-8e34-2e245ef31890"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-fernet-keys\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-config-data\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-internal-tls-certs\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-combined-ca-bundle\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-public-tls-certs\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-scripts\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-credential-keys\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2p5d\" (UniqueName: \"kubernetes.io/projected/4638ccbe-c266-4bff-8771-67eb02be0a6a-kube-api-access-w2p5d\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215517 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215533 4707 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f15d68-b4b0-40de-8e34-2e245ef31890-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215550 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215558 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f15d68-b4b0-40de-8e34-2e245ef31890-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215567 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215574 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f15d68-b4b0-40de-8e34-2e245ef31890-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.215582 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq62g\" (UniqueName: \"kubernetes.io/projected/32f15d68-b4b0-40de-8e34-2e245ef31890-kube-api-access-rq62g\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.218437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-credential-keys\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.219129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-scripts\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.220132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-fernet-keys\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.220461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-public-tls-certs\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.220723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-internal-tls-certs\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.221788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-combined-ca-bundle\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.228537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-config-data\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.234789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2p5d\" (UniqueName: \"kubernetes.io/projected/4638ccbe-c266-4bff-8771-67eb02be0a6a-kube-api-access-w2p5d\") pod \"keystone-664998d598-nfbb8\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.371579 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.736897 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-664998d598-nfbb8"] Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.959107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"32f15d68-b4b0-40de-8e34-2e245ef31890","Type":"ContainerDied","Data":"3c403c7a3dbb9a60414ce75a87ab3ab9c4f98e9d1850717d2336fd5cae6d1411"} Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.959354 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.959374 4707 scope.go:117] "RemoveContainer" containerID="715e420cc0dbeaabde165b3dd077734a97ad55aed2a6aaa82e365654e9607a0d" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.961912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" event={"ID":"4638ccbe-c266-4bff-8771-67eb02be0a6a","Type":"ContainerStarted","Data":"aeed4f7ea398c8aec570dded3f936a9753106f96a565795b7770c4040b2cca32"} Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.961952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" event={"ID":"4638ccbe-c266-4bff-8771-67eb02be0a6a","Type":"ContainerStarted","Data":"47ee72cee17f0eca5fbbb671c8bed2952ca0a7b93a87b1f42a74ccac4d163da4"} Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.975848 4707 scope.go:117] "RemoveContainer" containerID="4b43dc70a4582485430ace8442dc121386a9ec12fd575aa2826f3ac4ab1a4ee1" Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.979607 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:57 crc kubenswrapper[4707]: I0121 15:26:57.994293 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.003276 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:58 crc kubenswrapper[4707]: E0121 15:26:58.003750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.003768 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api" Jan 21 15:26:58 crc kubenswrapper[4707]: E0121 15:26:58.003789 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api-log" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.003795 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api-log" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.003961 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api-log" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.003987 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" containerName="cinder-api" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.004913 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.005205 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" podStartSLOduration=1.005184809 podStartE2EDuration="1.005184809s" podCreationTimestamp="2026-01-21 15:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:57.988051669 +0000 UTC m=+1515.169567890" watchObservedRunningTime="2026-01-21 15:26:58.005184809 +0000 UTC m=+1515.186701031" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.006553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.010375 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.013903 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-scripts\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef7c730d-de41-4543-a84e-d42a48259f71-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7c730d-de41-4543-a84e-d42a48259f71-logs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028703 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data-custom\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.028784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrb7v\" (UniqueName: \"kubernetes.io/projected/ef7c730d-de41-4543-a84e-d42a48259f71-kube-api-access-zrb7v\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.033277 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7c730d-de41-4543-a84e-d42a48259f71-logs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data-custom\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrb7v\" (UniqueName: \"kubernetes.io/projected/ef7c730d-de41-4543-a84e-d42a48259f71-kube-api-access-zrb7v\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130880 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-scripts\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef7c730d-de41-4543-a84e-d42a48259f71-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.130945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.131294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef7c730d-de41-4543-a84e-d42a48259f71-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.135441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7c730d-de41-4543-a84e-d42a48259f71-logs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.138676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.139799 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.139831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.139848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data-custom\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.141196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-scripts\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc 
kubenswrapper[4707]: I0121 15:26:58.144077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.147205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrb7v\" (UniqueName: \"kubernetes.io/projected/ef7c730d-de41-4543-a84e-d42a48259f71-kube-api-access-zrb7v\") pod \"cinder-api-0\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.326282 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.740037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:26:58 crc kubenswrapper[4707]: W0121 15:26:58.757779 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef7c730d_de41_4543_a84e_d42a48259f71.slice/crio-b5018d8edc96d0f356ecd6ea74a027ceea4c811c7650c799ac49932ad9b0fd27 WatchSource:0}: Error finding container b5018d8edc96d0f356ecd6ea74a027ceea4c811c7650c799ac49932ad9b0fd27: Status 404 returned error can't find the container with id b5018d8edc96d0f356ecd6ea74a027ceea4c811c7650c799ac49932ad9b0fd27 Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.973704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ef7c730d-de41-4543-a84e-d42a48259f71","Type":"ContainerStarted","Data":"b5018d8edc96d0f356ecd6ea74a027ceea4c811c7650c799ac49932ad9b0fd27"} Jan 21 15:26:58 crc kubenswrapper[4707]: I0121 15:26:58.975931 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.145506 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.145550 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.193324 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f15d68-b4b0-40de-8e34-2e245ef31890" path="/var/lib/kubelet/pods/32f15d68-b4b0-40de-8e34-2e245ef31890/volumes" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.193973 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.193998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.194010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.194049 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: 
I0121 15:26:59.213466 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.221623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.989591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ef7c730d-de41-4543-a84e-d42a48259f71","Type":"ContainerStarted","Data":"51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f"} Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.989851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ef7c730d-de41-4543-a84e-d42a48259f71","Type":"ContainerStarted","Data":"8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f"} Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.990380 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.990594 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.990769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:26:59 crc kubenswrapper[4707]: I0121 15:26:59.990788 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.013834 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.013821316 podStartE2EDuration="3.013821316s" podCreationTimestamp="2026-01-21 15:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:00.008564123 +0000 UTC m=+1517.190080334" watchObservedRunningTime="2026-01-21 15:27:00.013821316 +0000 UTC m=+1517.195337538" Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.981689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r"] Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.983473 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.985690 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.985942 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.992526 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r"] Jan 21 15:27:00 crc kubenswrapper[4707]: I0121 15:27:00.999270 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkxd4\" (UniqueName: \"kubernetes.io/projected/deb292be-f24c-461f-ac51-f1d4daa3f261-kube-api-access-rkxd4\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-internal-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-httpd-config\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-ovndb-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-public-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-config\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.109746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-combined-ca-bundle\") pod 
\"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.210836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-combined-ca-bundle\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.210875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkxd4\" (UniqueName: \"kubernetes.io/projected/deb292be-f24c-461f-ac51-f1d4daa3f261-kube-api-access-rkxd4\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.210906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-internal-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.210945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-httpd-config\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.210993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-ovndb-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.211024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-public-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.211042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-config\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.217768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-config\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.218004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-internal-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: 
\"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.222978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-ovndb-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.224461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-httpd-config\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.228934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-combined-ca-bundle\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.237224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-public-tls-certs\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.237593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkxd4\" (UniqueName: \"kubernetes.io/projected/deb292be-f24c-461f-ac51-f1d4daa3f261-kube-api-access-rkxd4\") pod \"neutron-7d8fb7c856-ftg5r\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.302551 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.354546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.398071 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.722659 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.740475 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r"] Jan 21 15:27:01 crc kubenswrapper[4707]: W0121 15:27:01.742088 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb292be_f24c_461f_ac51_f1d4daa3f261.slice/crio-04f14defb88ef7caea788209d0125afde6c7de00f6224dbe6cac5e26b0b269a1 WatchSource:0}: Error finding container 04f14defb88ef7caea788209d0125afde6c7de00f6224dbe6cac5e26b0b269a1: Status 404 returned error can't find the container with id 04f14defb88ef7caea788209d0125afde6c7de00f6224dbe6cac5e26b0b269a1 Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.802783 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.807482 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.808923 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.879489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:01 crc kubenswrapper[4707]: I0121 15:27:01.887860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.009840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" event={"ID":"deb292be-f24c-461f-ac51-f1d4daa3f261","Type":"ContainerStarted","Data":"4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b"} Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.009882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" event={"ID":"deb292be-f24c-461f-ac51-f1d4daa3f261","Type":"ContainerStarted","Data":"04f14defb88ef7caea788209d0125afde6c7de00f6224dbe6cac5e26b0b269a1"} Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.010312 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="cinder-scheduler" containerID="cri-o://e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75" gracePeriod=30 Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.010441 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" 
containerName="probe" containerID="cri-o://9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483" gracePeriod=30 Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.553081 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2"] Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.558221 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.562775 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.564611 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.590318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2"] Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.644666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-internal-tls-certs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.644754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qjx\" (UniqueName: \"kubernetes.io/projected/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-kube-api-access-74qjx\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.644782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data-custom\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.645102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.645219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-combined-ca-bundle\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.645253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-public-tls-certs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " 
pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.645320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-logs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-internal-tls-certs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qjx\" (UniqueName: \"kubernetes.io/projected/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-kube-api-access-74qjx\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data-custom\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-combined-ca-bundle\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-public-tls-certs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.749381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-logs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.750571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-logs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: 
\"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.756197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-internal-tls-certs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.757523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.759639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-combined-ca-bundle\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.759664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data-custom\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.762162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-public-tls-certs\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.769898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qjx\" (UniqueName: \"kubernetes.io/projected/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-kube-api-access-74qjx\") pod \"barbican-api-5d4684d9b4-fdrm2\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:02 crc kubenswrapper[4707]: I0121 15:27:02.878742 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.026244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" event={"ID":"deb292be-f24c-461f-ac51-f1d4daa3f261","Type":"ContainerStarted","Data":"05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46"} Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.027276 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.032940 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerID="9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483" exitCode=0 Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.033525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5c632c77-1494-4a0d-a5de-b52325cbb042","Type":"ContainerDied","Data":"9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483"} Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.047990 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" podStartSLOduration=3.047973436 podStartE2EDuration="3.047973436s" podCreationTimestamp="2026-01-21 15:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:03.044096187 +0000 UTC m=+1520.225612409" watchObservedRunningTime="2026-01-21 15:27:03.047973436 +0000 UTC m=+1520.229489658" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.345181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2"] Jan 21 15:27:03 crc kubenswrapper[4707]: W0121 15:27:03.353409 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62c3da3c_d7d6_4839_94a7_55b3d07bf5ce.slice/crio-3c361f6701b07397fac8443eefa35d147f466ea8c4940adb2baa943dd886807a WatchSource:0}: Error finding container 3c361f6701b07397fac8443eefa35d147f466ea8c4940adb2baa943dd886807a: Status 404 returned error can't find the container with id 3c361f6701b07397fac8443eefa35d147f466ea8c4940adb2baa943dd886807a Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.440037 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.564443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-scripts\") pod \"5c632c77-1494-4a0d-a5de-b52325cbb042\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.564486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data\") pod \"5c632c77-1494-4a0d-a5de-b52325cbb042\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.564576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-combined-ca-bundle\") pod \"5c632c77-1494-4a0d-a5de-b52325cbb042\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.564640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7tbm\" (UniqueName: \"kubernetes.io/projected/5c632c77-1494-4a0d-a5de-b52325cbb042-kube-api-access-h7tbm\") pod \"5c632c77-1494-4a0d-a5de-b52325cbb042\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.564711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c632c77-1494-4a0d-a5de-b52325cbb042-etc-machine-id\") pod \"5c632c77-1494-4a0d-a5de-b52325cbb042\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.564724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data-custom\") pod \"5c632c77-1494-4a0d-a5de-b52325cbb042\" (UID: \"5c632c77-1494-4a0d-a5de-b52325cbb042\") " Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.566181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c632c77-1494-4a0d-a5de-b52325cbb042-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c632c77-1494-4a0d-a5de-b52325cbb042" (UID: "5c632c77-1494-4a0d-a5de-b52325cbb042"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.568667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5c632c77-1494-4a0d-a5de-b52325cbb042" (UID: "5c632c77-1494-4a0d-a5de-b52325cbb042"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.571935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c632c77-1494-4a0d-a5de-b52325cbb042-kube-api-access-h7tbm" (OuterVolumeSpecName: "kube-api-access-h7tbm") pod "5c632c77-1494-4a0d-a5de-b52325cbb042" (UID: "5c632c77-1494-4a0d-a5de-b52325cbb042"). InnerVolumeSpecName "kube-api-access-h7tbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.571952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-scripts" (OuterVolumeSpecName: "scripts") pod "5c632c77-1494-4a0d-a5de-b52325cbb042" (UID: "5c632c77-1494-4a0d-a5de-b52325cbb042"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.617348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c632c77-1494-4a0d-a5de-b52325cbb042" (UID: "5c632c77-1494-4a0d-a5de-b52325cbb042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.668300 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7tbm\" (UniqueName: \"kubernetes.io/projected/5c632c77-1494-4a0d-a5de-b52325cbb042-kube-api-access-h7tbm\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.668327 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c632c77-1494-4a0d-a5de-b52325cbb042-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.668336 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.668343 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.668352 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.671635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data" (OuterVolumeSpecName: "config-data") pod "5c632c77-1494-4a0d-a5de-b52325cbb042" (UID: "5c632c77-1494-4a0d-a5de-b52325cbb042"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:03 crc kubenswrapper[4707]: I0121 15:27:03.769265 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c632c77-1494-4a0d-a5de-b52325cbb042-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.041086 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerID="e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75" exitCode=0 Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.041125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5c632c77-1494-4a0d-a5de-b52325cbb042","Type":"ContainerDied","Data":"e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75"} Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.041166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5c632c77-1494-4a0d-a5de-b52325cbb042","Type":"ContainerDied","Data":"c69ff0081dc1987eaf08dc8b3249cc4ba49aff800c3335d21c049ad549ad0255"} Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.041135 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.041197 4707 scope.go:117] "RemoveContainer" containerID="9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.043044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" event={"ID":"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce","Type":"ContainerStarted","Data":"8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb"} Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.043100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" event={"ID":"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce","Type":"ContainerStarted","Data":"9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59"} Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.043113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" event={"ID":"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce","Type":"ContainerStarted","Data":"3c361f6701b07397fac8443eefa35d147f466ea8c4940adb2baa943dd886807a"} Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.069269 4707 scope.go:117] "RemoveContainer" containerID="e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.072432 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" podStartSLOduration=2.072409913 podStartE2EDuration="2.072409913s" podCreationTimestamp="2026-01-21 15:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:04.069739842 +0000 UTC m=+1521.251256065" watchObservedRunningTime="2026-01-21 15:27:04.072409913 +0000 UTC m=+1521.253926135" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.092465 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.095757 4707 
scope.go:117] "RemoveContainer" containerID="9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483" Jan 21 15:27:04 crc kubenswrapper[4707]: E0121 15:27:04.096133 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483\": container with ID starting with 9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483 not found: ID does not exist" containerID="9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.096171 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483"} err="failed to get container status \"9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483\": rpc error: code = NotFound desc = could not find container \"9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483\": container with ID starting with 9b4b587398d2743c5cd260d0468515c9f75c0c71654577e23b905235b4843483 not found: ID does not exist" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.096207 4707 scope.go:117] "RemoveContainer" containerID="e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75" Jan 21 15:27:04 crc kubenswrapper[4707]: E0121 15:27:04.096429 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75\": container with ID starting with e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75 not found: ID does not exist" containerID="e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.096462 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75"} err="failed to get container status \"e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75\": rpc error: code = NotFound desc = could not find container \"e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75\": container with ID starting with e401bdbe6ca3f6613df8c7aff7530bdf9f6ca8aced2ac842102686d49295db75 not found: ID does not exist" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.101329 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.117241 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:27:04 crc kubenswrapper[4707]: E0121 15:27:04.117764 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="cinder-scheduler" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.117782 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="cinder-scheduler" Jan 21 15:27:04 crc kubenswrapper[4707]: E0121 15:27:04.117804 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="probe" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.117828 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="probe" Jan 21 15:27:04 crc kubenswrapper[4707]: 
I0121 15:27:04.118125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="cinder-scheduler" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.118159 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" containerName="probe" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.120329 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.123799 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.126908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.177706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.177780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.177882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.177902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjrtt\" (UniqueName: \"kubernetes.io/projected/f161ce76-80a9-470b-8776-df5af1799b7e-kube-api-access-jjrtt\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.177934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.177951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f161ce76-80a9-470b-8776-df5af1799b7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.278912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data-custom\") 
pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.278952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjrtt\" (UniqueName: \"kubernetes.io/projected/f161ce76-80a9-470b-8776-df5af1799b7e-kube-api-access-jjrtt\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.278988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.279006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f161ce76-80a9-470b-8776-df5af1799b7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.279037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.279083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.279536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f161ce76-80a9-470b-8776-df5af1799b7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.282724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.282855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.285109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 
15:27:04.287495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.292837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjrtt\" (UniqueName: \"kubernetes.io/projected/f161ce76-80a9-470b-8776-df5af1799b7e-kube-api-access-jjrtt\") pod \"cinder-scheduler-0\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.437260 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:04 crc kubenswrapper[4707]: W0121 15:27:04.806292 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf161ce76_80a9_470b_8776_df5af1799b7e.slice/crio-d4ed07cf8ed2006ff07f3aceb398c12221ee8f00f1f21939ff393ac7b6149fbe WatchSource:0}: Error finding container d4ed07cf8ed2006ff07f3aceb398c12221ee8f00f1f21939ff393ac7b6149fbe: Status 404 returned error can't find the container with id d4ed07cf8ed2006ff07f3aceb398c12221ee8f00f1f21939ff393ac7b6149fbe Jan 21 15:27:04 crc kubenswrapper[4707]: I0121 15:27:04.811796 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:27:05 crc kubenswrapper[4707]: I0121 15:27:05.052063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f161ce76-80a9-470b-8776-df5af1799b7e","Type":"ContainerStarted","Data":"d4ed07cf8ed2006ff07f3aceb398c12221ee8f00f1f21939ff393ac7b6149fbe"} Jan 21 15:27:05 crc kubenswrapper[4707]: I0121 15:27:05.052157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:05 crc kubenswrapper[4707]: I0121 15:27:05.052171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:05 crc kubenswrapper[4707]: I0121 15:27:05.190042 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c632c77-1494-4a0d-a5de-b52325cbb042" path="/var/lib/kubelet/pods/5c632c77-1494-4a0d-a5de-b52325cbb042/volumes" Jan 21 15:27:06 crc kubenswrapper[4707]: I0121 15:27:06.062398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f161ce76-80a9-470b-8776-df5af1799b7e","Type":"ContainerStarted","Data":"aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5"} Jan 21 15:27:06 crc kubenswrapper[4707]: I0121 15:27:06.062619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f161ce76-80a9-470b-8776-df5af1799b7e","Type":"ContainerStarted","Data":"51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4"} Jan 21 15:27:06 crc kubenswrapper[4707]: I0121 15:27:06.082926 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.082911355 podStartE2EDuration="2.082911355s" podCreationTimestamp="2026-01-21 15:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:06.074260903 +0000 UTC m=+1523.255777125" watchObservedRunningTime="2026-01-21 15:27:06.082911355 +0000 UTC m=+1523.264427577" Jan 21 15:27:09 crc kubenswrapper[4707]: I0121 15:27:09.437926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:09 crc kubenswrapper[4707]: I0121 15:27:09.836437 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:27:14 crc kubenswrapper[4707]: I0121 15:27:14.020823 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:14 crc kubenswrapper[4707]: I0121 15:27:14.039946 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:27:14 crc kubenswrapper[4707]: I0121 15:27:14.085940 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx"] Jan 21 15:27:14 crc kubenswrapper[4707]: I0121 15:27:14.086125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api-log" containerID="cri-o://db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c" gracePeriod=30 Jan 21 15:27:14 crc kubenswrapper[4707]: I0121 15:27:14.086254 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api" containerID="cri-o://1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb" gracePeriod=30 Jan 21 15:27:14 crc kubenswrapper[4707]: I0121 15:27:14.611476 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.121283 4707 generic.go:334] "Generic (PLEG): container finished" podID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerID="db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c" exitCode=143 Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.121321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" event={"ID":"619a1e68-03a6-42a3-b3ae-b9ff576688b3","Type":"ContainerDied","Data":"db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c"} Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.581760 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgd6"] Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.583772 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.592187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgd6"] Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.661339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-utilities\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.661407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm2k6\" (UniqueName: \"kubernetes.io/projected/ea08963c-1f84-4a19-9bf7-8e64278d7c48-kube-api-access-mm2k6\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.661481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-catalog-content\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.762531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-catalog-content\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.762670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-utilities\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.762696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm2k6\" (UniqueName: \"kubernetes.io/projected/ea08963c-1f84-4a19-9bf7-8e64278d7c48-kube-api-access-mm2k6\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.763299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-catalog-content\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.763516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-utilities\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.778233 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mm2k6\" (UniqueName: \"kubernetes.io/projected/ea08963c-1f84-4a19-9bf7-8e64278d7c48-kube-api-access-mm2k6\") pod \"redhat-marketplace-hdgd6\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:15 crc kubenswrapper[4707]: I0121 15:27:15.897445 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:16 crc kubenswrapper[4707]: I0121 15:27:16.257956 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgd6"] Jan 21 15:27:16 crc kubenswrapper[4707]: W0121 15:27:16.260519 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea08963c_1f84_4a19_9bf7_8e64278d7c48.slice/crio-e47c5d781d92cd3458b52e312a28b64e4749fadd07d89b4a64ca34bb7c3875ef WatchSource:0}: Error finding container e47c5d781d92cd3458b52e312a28b64e4749fadd07d89b4a64ca34bb7c3875ef: Status 404 returned error can't find the container with id e47c5d781d92cd3458b52e312a28b64e4749fadd07d89b4a64ca34bb7c3875ef Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.134771 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerID="b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554" exitCode=0 Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.134836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgd6" event={"ID":"ea08963c-1f84-4a19-9bf7-8e64278d7c48","Type":"ContainerDied","Data":"b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554"} Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.135021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgd6" event={"ID":"ea08963c-1f84-4a19-9bf7-8e64278d7c48","Type":"ContainerStarted","Data":"e47c5d781d92cd3458b52e312a28b64e4749fadd07d89b4a64ca34bb7c3875ef"} Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.215252 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.95:9311/healthcheck\": read tcp 10.217.0.2:34410->10.217.1.95:9311: read: connection reset by peer" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.215282 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.95:9311/healthcheck\": read tcp 10.217.0.2:34418->10.217.1.95:9311: read: connection reset by peer" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.594164 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.688356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-combined-ca-bundle\") pod \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.688398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data-custom\") pod \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.688531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619a1e68-03a6-42a3-b3ae-b9ff576688b3-logs\") pod \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.688564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data\") pod \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.688606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v8xz\" (UniqueName: \"kubernetes.io/projected/619a1e68-03a6-42a3-b3ae-b9ff576688b3-kube-api-access-4v8xz\") pod \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\" (UID: \"619a1e68-03a6-42a3-b3ae-b9ff576688b3\") " Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.688961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619a1e68-03a6-42a3-b3ae-b9ff576688b3-logs" (OuterVolumeSpecName: "logs") pod "619a1e68-03a6-42a3-b3ae-b9ff576688b3" (UID: "619a1e68-03a6-42a3-b3ae-b9ff576688b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.689201 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619a1e68-03a6-42a3-b3ae-b9ff576688b3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.692474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619a1e68-03a6-42a3-b3ae-b9ff576688b3-kube-api-access-4v8xz" (OuterVolumeSpecName: "kube-api-access-4v8xz") pod "619a1e68-03a6-42a3-b3ae-b9ff576688b3" (UID: "619a1e68-03a6-42a3-b3ae-b9ff576688b3"). InnerVolumeSpecName "kube-api-access-4v8xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.700138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "619a1e68-03a6-42a3-b3ae-b9ff576688b3" (UID: "619a1e68-03a6-42a3-b3ae-b9ff576688b3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.709359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "619a1e68-03a6-42a3-b3ae-b9ff576688b3" (UID: "619a1e68-03a6-42a3-b3ae-b9ff576688b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.725301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data" (OuterVolumeSpecName: "config-data") pod "619a1e68-03a6-42a3-b3ae-b9ff576688b3" (UID: "619a1e68-03a6-42a3-b3ae-b9ff576688b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.791227 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.791253 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v8xz\" (UniqueName: \"kubernetes.io/projected/619a1e68-03a6-42a3-b3ae-b9ff576688b3-kube-api-access-4v8xz\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.791263 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:17 crc kubenswrapper[4707]: I0121 15:27:17.791272 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619a1e68-03a6-42a3-b3ae-b9ff576688b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.142558 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerID="86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243" exitCode=0 Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.142664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgd6" event={"ID":"ea08963c-1f84-4a19-9bf7-8e64278d7c48","Type":"ContainerDied","Data":"86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243"} Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.145398 4707 generic.go:334] "Generic (PLEG): container finished" podID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerID="1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb" exitCode=0 Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.145434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" event={"ID":"619a1e68-03a6-42a3-b3ae-b9ff576688b3","Type":"ContainerDied","Data":"1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb"} Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.145444 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.145457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx" event={"ID":"619a1e68-03a6-42a3-b3ae-b9ff576688b3","Type":"ContainerDied","Data":"ca6b97a769e92173985c7a7df94f949ae9a1a273d65d44b9fb33c4d838c515bc"} Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.145475 4707 scope.go:117] "RemoveContainer" containerID="1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.165348 4707 scope.go:117] "RemoveContainer" containerID="db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.179440 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx"] Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.185576 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6999c6fbd4-scrnx"] Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.185650 4707 scope.go:117] "RemoveContainer" containerID="1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb" Jan 21 15:27:18 crc kubenswrapper[4707]: E0121 15:27:18.186005 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb\": container with ID starting with 1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb not found: ID does not exist" containerID="1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.186034 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb"} err="failed to get container status \"1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb\": rpc error: code = NotFound desc = could not find container \"1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb\": container with ID starting with 1f149c66d7420cfacb686df15e0b38b1dbd649c359188cd6b68d62adc757c5cb not found: ID does not exist" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.186053 4707 scope.go:117] "RemoveContainer" containerID="db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c" Jan 21 15:27:18 crc kubenswrapper[4707]: E0121 15:27:18.186351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c\": container with ID starting with db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c not found: ID does not exist" containerID="db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c" Jan 21 15:27:18 crc kubenswrapper[4707]: I0121 15:27:18.186385 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c"} err="failed to get container status \"db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c\": rpc error: code = NotFound desc = could not find container \"db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c\": container with ID starting with 
db3a181beab6d9f3ed25b62e394f3e125aeed45e1829430bb225a5c6cece8d7c not found: ID does not exist" Jan 21 15:27:19 crc kubenswrapper[4707]: I0121 15:27:19.152661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgd6" event={"ID":"ea08963c-1f84-4a19-9bf7-8e64278d7c48","Type":"ContainerStarted","Data":"da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347"} Jan 21 15:27:19 crc kubenswrapper[4707]: I0121 15:27:19.170349 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hdgd6" podStartSLOduration=2.705575192 podStartE2EDuration="4.170333395s" podCreationTimestamp="2026-01-21 15:27:15 +0000 UTC" firstStartedPulling="2026-01-21 15:27:17.136106229 +0000 UTC m=+1534.317622451" lastFinishedPulling="2026-01-21 15:27:18.600864432 +0000 UTC m=+1535.782380654" observedRunningTime="2026-01-21 15:27:19.166545735 +0000 UTC m=+1536.348061957" watchObservedRunningTime="2026-01-21 15:27:19.170333395 +0000 UTC m=+1536.351849617" Jan 21 15:27:19 crc kubenswrapper[4707]: I0121 15:27:19.191456 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" path="/var/lib/kubelet/pods/619a1e68-03a6-42a3-b3ae-b9ff576688b3/volumes" Jan 21 15:27:19 crc kubenswrapper[4707]: I0121 15:27:19.917381 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:27:19 crc kubenswrapper[4707]: I0121 15:27:19.943135 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:27:20 crc kubenswrapper[4707]: I0121 15:27:20.235374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.337735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.337848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.385116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-c4967b5bb-n8cvb"] Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.385585 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-log" containerID="cri-o://639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c" gracePeriod=30 Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.385627 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-api" containerID="cri-o://4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b" gracePeriod=30 Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.976419 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g2jth"] Jan 21 15:27:21 crc kubenswrapper[4707]: E0121 15:27:21.977052 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api-log" Jan 21 15:27:21 crc 
kubenswrapper[4707]: I0121 15:27:21.977070 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api-log" Jan 21 15:27:21 crc kubenswrapper[4707]: E0121 15:27:21.977093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.977100 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.977281 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.977311 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="619a1e68-03a6-42a3-b3ae-b9ff576688b3" containerName="barbican-api-log" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.978421 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:21 crc kubenswrapper[4707]: I0121 15:27:21.984755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2jth"] Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.058371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-catalog-content\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.058554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-utilities\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.058750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jz7\" (UniqueName: \"kubernetes.io/projected/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-kube-api-access-59jz7\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.160933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jz7\" (UniqueName: \"kubernetes.io/projected/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-kube-api-access-59jz7\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.161021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-catalog-content\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.161080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-utilities\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.161486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-utilities\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.161607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-catalog-content\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.178925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jz7\" (UniqueName: \"kubernetes.io/projected/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-kube-api-access-59jz7\") pod \"redhat-operators-g2jth\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.179324 4707 generic.go:334] "Generic (PLEG): container finished" podID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerID="639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c" exitCode=143 Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.179367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" event={"ID":"4e9925f1-bf87-4f65-b5bd-4827d6946de5","Type":"ContainerDied","Data":"639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c"} Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.292899 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:22 crc kubenswrapper[4707]: I0121 15:27:22.533753 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2jth"] Jan 21 15:27:22 crc kubenswrapper[4707]: W0121 15:27:22.541856 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea1c2c4_98c7_47fc_9fd6_475e6a80acaa.slice/crio-d7697d7161cce1cec80681be474513a7e289e4d208c527831167c7a10929bc80 WatchSource:0}: Error finding container d7697d7161cce1cec80681be474513a7e289e4d208c527831167c7a10929bc80: Status 404 returned error can't find the container with id d7697d7161cce1cec80681be474513a7e289e4d208c527831167c7a10929bc80 Jan 21 15:27:23 crc kubenswrapper[4707]: I0121 15:27:23.186611 4707 generic.go:334] "Generic (PLEG): container finished" podID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerID="56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0" exitCode=0 Jan 21 15:27:23 crc kubenswrapper[4707]: I0121 15:27:23.188994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerDied","Data":"56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0"} Jan 21 15:27:23 crc kubenswrapper[4707]: I0121 15:27:23.189028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerStarted","Data":"d7697d7161cce1cec80681be474513a7e289e4d208c527831167c7a10929bc80"} Jan 21 15:27:24 crc kubenswrapper[4707]: I0121 15:27:24.210666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerStarted","Data":"7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb"} Jan 21 15:27:24 crc kubenswrapper[4707]: I0121 15:27:24.885245 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.013449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-combined-ca-bundle\") pod \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.013562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9925f1-bf87-4f65-b5bd-4827d6946de5-logs\") pod \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.013669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-scripts\") pod \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.013707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wz7j\" (UniqueName: \"kubernetes.io/projected/4e9925f1-bf87-4f65-b5bd-4827d6946de5-kube-api-access-7wz7j\") pod \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.013747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-config-data\") pod \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\" (UID: \"4e9925f1-bf87-4f65-b5bd-4827d6946de5\") " Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.014054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9925f1-bf87-4f65-b5bd-4827d6946de5-logs" (OuterVolumeSpecName: "logs") pod "4e9925f1-bf87-4f65-b5bd-4827d6946de5" (UID: "4e9925f1-bf87-4f65-b5bd-4827d6946de5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.014505 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9925f1-bf87-4f65-b5bd-4827d6946de5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.018276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9925f1-bf87-4f65-b5bd-4827d6946de5-kube-api-access-7wz7j" (OuterVolumeSpecName: "kube-api-access-7wz7j") pod "4e9925f1-bf87-4f65-b5bd-4827d6946de5" (UID: "4e9925f1-bf87-4f65-b5bd-4827d6946de5"). InnerVolumeSpecName "kube-api-access-7wz7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.018401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-scripts" (OuterVolumeSpecName: "scripts") pod "4e9925f1-bf87-4f65-b5bd-4827d6946de5" (UID: "4e9925f1-bf87-4f65-b5bd-4827d6946de5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.046280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e9925f1-bf87-4f65-b5bd-4827d6946de5" (UID: "4e9925f1-bf87-4f65-b5bd-4827d6946de5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.048202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-config-data" (OuterVolumeSpecName: "config-data") pod "4e9925f1-bf87-4f65-b5bd-4827d6946de5" (UID: "4e9925f1-bf87-4f65-b5bd-4827d6946de5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.116500 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.116529 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wz7j\" (UniqueName: \"kubernetes.io/projected/4e9925f1-bf87-4f65-b5bd-4827d6946de5-kube-api-access-7wz7j\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.116540 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.116549 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9925f1-bf87-4f65-b5bd-4827d6946de5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.218568 4707 generic.go:334] "Generic (PLEG): container finished" podID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerID="4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b" exitCode=0 Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.218634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" event={"ID":"4e9925f1-bf87-4f65-b5bd-4827d6946de5","Type":"ContainerDied","Data":"4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b"} Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.218661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" event={"ID":"4e9925f1-bf87-4f65-b5bd-4827d6946de5","Type":"ContainerDied","Data":"f407d8cb0f06c50076577d76bb830ca84b508b33df40e00201e5a537cb1fb334"} Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.218638 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c4967b5bb-n8cvb" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.218701 4707 scope.go:117] "RemoveContainer" containerID="4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.220219 4707 generic.go:334] "Generic (PLEG): container finished" podID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerID="7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb" exitCode=0 Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.220248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerDied","Data":"7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb"} Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.237167 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-c4967b5bb-n8cvb"] Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.242904 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-c4967b5bb-n8cvb"] Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.248533 4707 scope.go:117] "RemoveContainer" containerID="639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.269143 4707 scope.go:117] "RemoveContainer" containerID="4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b" Jan 21 15:27:25 crc kubenswrapper[4707]: E0121 15:27:25.269765 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b\": container with ID starting with 4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b not found: ID does not exist" containerID="4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.269799 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b"} err="failed to get container status \"4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b\": rpc error: code = NotFound desc = could not find container \"4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b\": container with ID starting with 4ded8248ab748fcd684e0e5c1fe2f8068fd40cfe70c53fd00c452f38014ccb9b not found: ID does not exist" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.269834 4707 scope.go:117] "RemoveContainer" containerID="639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c" Jan 21 15:27:25 crc kubenswrapper[4707]: E0121 15:27:25.270253 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c\": container with ID starting with 639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c not found: ID does not exist" containerID="639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.270295 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c"} err="failed to get container status 
\"639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c\": rpc error: code = NotFound desc = could not find container \"639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c\": container with ID starting with 639444b49ce970bad4e8f322be8019d2b546d1205c62a2f670f44c300e23cc9c not found: ID does not exist" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.347778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.898415 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.898465 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:25 crc kubenswrapper[4707]: I0121 15:27:25.934179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:26 crc kubenswrapper[4707]: I0121 15:27:26.229956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerStarted","Data":"d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238"} Jan 21 15:27:26 crc kubenswrapper[4707]: I0121 15:27:26.248632 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g2jth" podStartSLOduration=2.739369115 podStartE2EDuration="5.248616503s" podCreationTimestamp="2026-01-21 15:27:21 +0000 UTC" firstStartedPulling="2026-01-21 15:27:23.187764687 +0000 UTC m=+1540.369280909" lastFinishedPulling="2026-01-21 15:27:25.697012075 +0000 UTC m=+1542.878528297" observedRunningTime="2026-01-21 15:27:26.242377864 +0000 UTC m=+1543.423894086" watchObservedRunningTime="2026-01-21 15:27:26.248616503 +0000 UTC m=+1543.430132725" Jan 21 15:27:26 crc kubenswrapper[4707]: I0121 15:27:26.263046 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:27 crc kubenswrapper[4707]: I0121 15:27:27.191511 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" path="/var/lib/kubelet/pods/4e9925f1-bf87-4f65-b5bd-4827d6946de5/volumes" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.176486 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgd6"] Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.245315 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hdgd6" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="registry-server" containerID="cri-o://da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347" gracePeriod=2 Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.657494 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.770027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.787891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-catalog-content\") pod \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.788081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm2k6\" (UniqueName: \"kubernetes.io/projected/ea08963c-1f84-4a19-9bf7-8e64278d7c48-kube-api-access-mm2k6\") pod \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.788159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-utilities\") pod \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\" (UID: \"ea08963c-1f84-4a19-9bf7-8e64278d7c48\") " Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.788919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-utilities" (OuterVolumeSpecName: "utilities") pod "ea08963c-1f84-4a19-9bf7-8e64278d7c48" (UID: "ea08963c-1f84-4a19-9bf7-8e64278d7c48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.794208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea08963c-1f84-4a19-9bf7-8e64278d7c48-kube-api-access-mm2k6" (OuterVolumeSpecName: "kube-api-access-mm2k6") pod "ea08963c-1f84-4a19-9bf7-8e64278d7c48" (UID: "ea08963c-1f84-4a19-9bf7-8e64278d7c48"). InnerVolumeSpecName "kube-api-access-mm2k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.809502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea08963c-1f84-4a19-9bf7-8e64278d7c48" (UID: "ea08963c-1f84-4a19-9bf7-8e64278d7c48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.891820 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.891854 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm2k6\" (UniqueName: \"kubernetes.io/projected/ea08963c-1f84-4a19-9bf7-8e64278d7c48-kube-api-access-mm2k6\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:28 crc kubenswrapper[4707]: I0121 15:27:28.891866 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea08963c-1f84-4a19-9bf7-8e64278d7c48-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.253634 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerID="da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347" exitCode=0 Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.253689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgd6" event={"ID":"ea08963c-1f84-4a19-9bf7-8e64278d7c48","Type":"ContainerDied","Data":"da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347"} Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.253733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgd6" event={"ID":"ea08963c-1f84-4a19-9bf7-8e64278d7c48","Type":"ContainerDied","Data":"e47c5d781d92cd3458b52e312a28b64e4749fadd07d89b4a64ca34bb7c3875ef"} Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.253743 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgd6" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.253758 4707 scope.go:117] "RemoveContainer" containerID="da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.271062 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgd6"] Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.273097 4707 scope.go:117] "RemoveContainer" containerID="86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.277827 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgd6"] Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.292657 4707 scope.go:117] "RemoveContainer" containerID="b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.323229 4707 scope.go:117] "RemoveContainer" containerID="da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347" Jan 21 15:27:29 crc kubenswrapper[4707]: E0121 15:27:29.323624 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347\": container with ID starting with da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347 not found: ID does not exist" containerID="da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.323656 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347"} err="failed to get container status \"da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347\": rpc error: code = NotFound desc = could not find container \"da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347\": container with ID starting with da8b305a4440a0850ed03a5e7f073ab53bebf02d1bc2232f005e1e7325f37347 not found: ID does not exist" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.323676 4707 scope.go:117] "RemoveContainer" containerID="86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243" Jan 21 15:27:29 crc kubenswrapper[4707]: E0121 15:27:29.324003 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243\": container with ID starting with 86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243 not found: ID does not exist" containerID="86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.324036 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243"} err="failed to get container status \"86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243\": rpc error: code = NotFound desc = could not find container \"86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243\": container with ID starting with 86dd0aef72cc0182ba685442dff2c23a24466d09a19ae4ab39cad55739cd1243 not found: ID does not exist" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.324058 4707 scope.go:117] "RemoveContainer" 
containerID="b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554" Jan 21 15:27:29 crc kubenswrapper[4707]: E0121 15:27:29.324513 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554\": container with ID starting with b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554 not found: ID does not exist" containerID="b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554" Jan 21 15:27:29 crc kubenswrapper[4707]: I0121 15:27:29.324535 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554"} err="failed to get container status \"b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554\": rpc error: code = NotFound desc = could not find container \"b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554\": container with ID starting with b4437360582d21660f32d837f05e5b0fd160a2c73c5f98b3eba62e50473a1554 not found: ID does not exist" Jan 21 15:27:31 crc kubenswrapper[4707]: I0121 15:27:31.190151 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" path="/var/lib/kubelet/pods/ea08963c-1f84-4a19-9bf7-8e64278d7c48/volumes" Jan 21 15:27:31 crc kubenswrapper[4707]: I0121 15:27:31.314343 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:27:31 crc kubenswrapper[4707]: I0121 15:27:31.373494 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-686d8c7c65-gqrw6"] Jan 21 15:27:31 crc kubenswrapper[4707]: I0121 15:27:31.373685 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-api" containerID="cri-o://066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a" gracePeriod=30 Jan 21 15:27:31 crc kubenswrapper[4707]: I0121 15:27:31.373994 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-httpd" containerID="cri-o://0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0" gracePeriod=30 Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.279769 4707 generic.go:334] "Generic (PLEG): container finished" podID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerID="0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0" exitCode=0 Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.279832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" event={"ID":"1acf0601-50f7-429b-b0ff-a5db620b581e","Type":"ContainerDied","Data":"0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0"} Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.293185 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.293312 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.327449 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.890485 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:27:32 crc kubenswrapper[4707]: E0121 15:27:32.890761 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-api" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.890779 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-api" Jan 21 15:27:32 crc kubenswrapper[4707]: E0121 15:27:32.890803 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-log" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.890822 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-log" Jan 21 15:27:32 crc kubenswrapper[4707]: E0121 15:27:32.890830 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="extract-utilities" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.890835 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="extract-utilities" Jan 21 15:27:32 crc kubenswrapper[4707]: E0121 15:27:32.890846 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="extract-content" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.890851 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="extract-content" Jan 21 15:27:32 crc kubenswrapper[4707]: E0121 15:27:32.890864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="registry-server" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.890869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="registry-server" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.891015 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-api" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.891027 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9925f1-bf87-4f65-b5bd-4827d6946de5" containerName="placement-log" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.891046 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea08963c-1f84-4a19-9bf7-8e64278d7c48" containerName="registry-server" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.891516 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.892709 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-h6mzs" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.893028 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.893462 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.903405 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.964677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6s4c\" (UniqueName: \"kubernetes.io/projected/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-kube-api-access-g6s4c\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.964723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.964785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config-secret\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:32 crc kubenswrapper[4707]: I0121 15:27:32.964822 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.065951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6s4c\" (UniqueName: \"kubernetes.io/projected/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-kube-api-access-g6s4c\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.065987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.066047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config-secret\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.066068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.066860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.073259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config-secret\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.073347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.089256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6s4c\" (UniqueName: \"kubernetes.io/projected/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-kube-api-access-g6s4c\") pod \"openstackclient\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.204922 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.334061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.368540 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2jth"] Jan 21 15:27:33 crc kubenswrapper[4707]: I0121 15:27:33.598156 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:27:34 crc kubenswrapper[4707]: I0121 15:27:34.295576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc","Type":"ContainerStarted","Data":"07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7"} Jan 21 15:27:34 crc kubenswrapper[4707]: I0121 15:27:34.295781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc","Type":"ContainerStarted","Data":"2aafa3fb1e4d6ab50793bc50b9eba43763782a2abfb3ab153a761f2d556c364a"} Jan 21 15:27:34 crc kubenswrapper[4707]: I0121 15:27:34.309730 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.309720809 podStartE2EDuration="2.309720809s" podCreationTimestamp="2026-01-21 15:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:34.307299968 +0000 UTC m=+1551.488816190" watchObservedRunningTime="2026-01-21 15:27:34.309720809 +0000 UTC m=+1551.491237031" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.301539 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g2jth" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="registry-server" containerID="cri-o://d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238" gracePeriod=2 Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.686251 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.808309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59jz7\" (UniqueName: \"kubernetes.io/projected/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-kube-api-access-59jz7\") pod \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.808372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-catalog-content\") pod \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.808456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-utilities\") pod \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\" (UID: \"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa\") " Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.809209 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-utilities" (OuterVolumeSpecName: "utilities") pod "eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" (UID: "eea1c2c4-98c7-47fc-9fd6-475e6a80acaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.809922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.810133 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-central-agent" containerID="cri-o://7070afec529f9d56c29393f342c1eb1b439a894789e2ac24b3d987a7804f4866" gracePeriod=30 Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.810183 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="proxy-httpd" containerID="cri-o://ed215b0d90d236ebda6b3d56bffa7d616362beff92e265c250edc13c4aca5a9e" gracePeriod=30 Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.810264 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-notification-agent" containerID="cri-o://3209d9296b843327d03e27d6b54b024ed915b3086eafcf41994f7ec3896c4560" gracePeriod=30 Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.810257 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="sg-core" containerID="cri-o://642edbea8c4d70be6bbceee8f568996d8a9009063407fe39ae292e3d4b7f4599" gracePeriod=30 Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.819333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-kube-api-access-59jz7" (OuterVolumeSpecName: "kube-api-access-59jz7") pod "eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" (UID: "eea1c2c4-98c7-47fc-9fd6-475e6a80acaa"). 
InnerVolumeSpecName "kube-api-access-59jz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.892119 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl"] Jan 21 15:27:35 crc kubenswrapper[4707]: E0121 15:27:35.892469 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="registry-server" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.892487 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="registry-server" Jan 21 15:27:35 crc kubenswrapper[4707]: E0121 15:27:35.892503 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="extract-utilities" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.892509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="extract-utilities" Jan 21 15:27:35 crc kubenswrapper[4707]: E0121 15:27:35.892525 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="extract-content" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.892531 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="extract-content" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.892702 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerName="registry-server" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.893542 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.898219 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.898232 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.898738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.903719 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl"] Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.911625 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59jz7\" (UniqueName: \"kubernetes.io/projected/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-kube-api-access-59jz7\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.911647 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:35 crc kubenswrapper[4707]: I0121 15:27:35.931770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" (UID: "eea1c2c4-98c7-47fc-9fd6-475e6a80acaa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-combined-ca-bundle\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kqp8\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-kube-api-access-8kqp8\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-public-tls-certs\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-internal-tls-certs\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-etc-swift\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-run-httpd\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-log-httpd\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-config-data\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.014557 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-internal-tls-certs\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-etc-swift\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-run-httpd\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-log-httpd\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-config-data\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-combined-ca-bundle\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kqp8\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-kube-api-access-8kqp8\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.118944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-public-tls-certs\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.119253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-run-httpd\") pod \"swift-proxy-f8c4bdc4-wf2dl\" 
(UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.120075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-log-httpd\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.122508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-internal-tls-certs\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.124138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-config-data\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.124359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-etc-swift\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.124610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-public-tls-certs\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.131504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-combined-ca-bundle\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.134666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kqp8\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-kube-api-access-8kqp8\") pod \"swift-proxy-f8c4bdc4-wf2dl\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.211071 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.313248 4707 generic.go:334] "Generic (PLEG): container finished" podID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" containerID="d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238" exitCode=0 Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.313306 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2jth" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.313303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerDied","Data":"d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238"} Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.313601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2jth" event={"ID":"eea1c2c4-98c7-47fc-9fd6-475e6a80acaa","Type":"ContainerDied","Data":"d7697d7161cce1cec80681be474513a7e289e4d208c527831167c7a10929bc80"} Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.313620 4707 scope.go:117] "RemoveContainer" containerID="d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.320420 4707 generic.go:334] "Generic (PLEG): container finished" podID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerID="ed215b0d90d236ebda6b3d56bffa7d616362beff92e265c250edc13c4aca5a9e" exitCode=0 Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.320445 4707 generic.go:334] "Generic (PLEG): container finished" podID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerID="642edbea8c4d70be6bbceee8f568996d8a9009063407fe39ae292e3d4b7f4599" exitCode=2 Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.320455 4707 generic.go:334] "Generic (PLEG): container finished" podID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerID="7070afec529f9d56c29393f342c1eb1b439a894789e2ac24b3d987a7804f4866" exitCode=0 Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.320472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerDied","Data":"ed215b0d90d236ebda6b3d56bffa7d616362beff92e265c250edc13c4aca5a9e"} Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.320492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerDied","Data":"642edbea8c4d70be6bbceee8f568996d8a9009063407fe39ae292e3d4b7f4599"} Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.320502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerDied","Data":"7070afec529f9d56c29393f342c1eb1b439a894789e2ac24b3d987a7804f4866"} Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.337140 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2jth"] Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.338165 4707 scope.go:117] "RemoveContainer" containerID="7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.345901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g2jth"] Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.361991 4707 scope.go:117] "RemoveContainer" containerID="56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.378759 4707 scope.go:117] "RemoveContainer" containerID="d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238" Jan 21 15:27:36 crc kubenswrapper[4707]: E0121 15:27:36.379265 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238\": container with ID starting with d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238 not found: ID does not exist" containerID="d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.379300 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238"} err="failed to get container status \"d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238\": rpc error: code = NotFound desc = could not find container \"d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238\": container with ID starting with d97bbd484e60e0861f13916339503d43a34858925e00d3fb8c4e4ced9ef50238 not found: ID does not exist" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.379326 4707 scope.go:117] "RemoveContainer" containerID="7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb" Jan 21 15:27:36 crc kubenswrapper[4707]: E0121 15:27:36.380133 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb\": container with ID starting with 7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb not found: ID does not exist" containerID="7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.380162 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb"} err="failed to get container status \"7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb\": rpc error: code = NotFound desc = could not find container \"7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb\": container with ID starting with 7e0477aee8fa9dbb0d3fa51d60199b4a89bb6b2e933a16a2771b8a71264be1cb not found: ID does not exist" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.380187 4707 scope.go:117] "RemoveContainer" containerID="56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0" Jan 21 15:27:36 crc kubenswrapper[4707]: E0121 15:27:36.380454 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0\": container with ID starting with 56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0 not found: ID does not exist" containerID="56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.380488 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0"} err="failed to get container status \"56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0\": rpc error: code = NotFound desc = could not find container \"56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0\": container with ID starting with 56ad11fb79a768c4b743a6629beab3570c85447e31f2dd17efd0aaa80548fdf0 not found: ID does not exist" Jan 21 15:27:36 crc kubenswrapper[4707]: I0121 15:27:36.600599 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl"] Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.093962 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.190060 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea1c2c4-98c7-47fc-9fd6-475e6a80acaa" path="/var/lib/kubelet/pods/eea1c2c4-98c7-47fc-9fd6-475e6a80acaa/volumes" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.237193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-httpd-config\") pod \"1acf0601-50f7-429b-b0ff-a5db620b581e\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.237240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-ovndb-tls-certs\") pod \"1acf0601-50f7-429b-b0ff-a5db620b581e\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.237267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-combined-ca-bundle\") pod \"1acf0601-50f7-429b-b0ff-a5db620b581e\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.237328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txxsg\" (UniqueName: \"kubernetes.io/projected/1acf0601-50f7-429b-b0ff-a5db620b581e-kube-api-access-txxsg\") pod \"1acf0601-50f7-429b-b0ff-a5db620b581e\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.237723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-config\") pod \"1acf0601-50f7-429b-b0ff-a5db620b581e\" (UID: \"1acf0601-50f7-429b-b0ff-a5db620b581e\") " Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.241022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1acf0601-50f7-429b-b0ff-a5db620b581e" (UID: "1acf0601-50f7-429b-b0ff-a5db620b581e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.241046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acf0601-50f7-429b-b0ff-a5db620b581e-kube-api-access-txxsg" (OuterVolumeSpecName: "kube-api-access-txxsg") pod "1acf0601-50f7-429b-b0ff-a5db620b581e" (UID: "1acf0601-50f7-429b-b0ff-a5db620b581e"). InnerVolumeSpecName "kube-api-access-txxsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.273072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1acf0601-50f7-429b-b0ff-a5db620b581e" (UID: "1acf0601-50f7-429b-b0ff-a5db620b581e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.274228 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-config" (OuterVolumeSpecName: "config") pod "1acf0601-50f7-429b-b0ff-a5db620b581e" (UID: "1acf0601-50f7-429b-b0ff-a5db620b581e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.289752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1acf0601-50f7-429b-b0ff-a5db620b581e" (UID: "1acf0601-50f7-429b-b0ff-a5db620b581e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.331057 4707 generic.go:334] "Generic (PLEG): container finished" podID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerID="066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a" exitCode=0 Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.331098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" event={"ID":"1acf0601-50f7-429b-b0ff-a5db620b581e","Type":"ContainerDied","Data":"066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a"} Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.331143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" event={"ID":"1acf0601-50f7-429b-b0ff-a5db620b581e","Type":"ContainerDied","Data":"97995bc5e9daa3b3fbb230f54f5546808e2a2cafc7a540976692094cff28ae9d"} Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.331162 4707 scope.go:117] "RemoveContainer" containerID="0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.331114 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-686d8c7c65-gqrw6" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.334585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" event={"ID":"700a8d9c-852c-4611-9a98-50418d7df800","Type":"ContainerStarted","Data":"c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729"} Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.334739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" event={"ID":"700a8d9c-852c-4611-9a98-50418d7df800","Type":"ContainerStarted","Data":"afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f"} Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.334750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" event={"ID":"700a8d9c-852c-4611-9a98-50418d7df800","Type":"ContainerStarted","Data":"f4089e8c475b1b148e9ed549045646922131eb537aabf15c8c54fe90e7d657d0"} Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.335559 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.335582 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.339329 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txxsg\" (UniqueName: \"kubernetes.io/projected/1acf0601-50f7-429b-b0ff-a5db620b581e-kube-api-access-txxsg\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.339349 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.339358 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.339367 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.339376 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acf0601-50f7-429b-b0ff-a5db620b581e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.350442 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" podStartSLOduration=2.350428936 podStartE2EDuration="2.350428936s" podCreationTimestamp="2026-01-21 15:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:37.346121057 +0000 UTC m=+1554.527637279" watchObservedRunningTime="2026-01-21 15:27:37.350428936 +0000 UTC m=+1554.531945158" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.356407 4707 scope.go:117] "RemoveContainer" containerID="066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a" Jan 21 15:27:37 crc 
kubenswrapper[4707]: I0121 15:27:37.362986 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-686d8c7c65-gqrw6"] Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.370403 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-686d8c7c65-gqrw6"] Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.378401 4707 scope.go:117] "RemoveContainer" containerID="0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0" Jan 21 15:27:37 crc kubenswrapper[4707]: E0121 15:27:37.378825 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0\": container with ID starting with 0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0 not found: ID does not exist" containerID="0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.378881 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0"} err="failed to get container status \"0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0\": rpc error: code = NotFound desc = could not find container \"0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0\": container with ID starting with 0c66d6f41beae3e57bf5014192883796c928ef59bff31e88873122aec63754e0 not found: ID does not exist" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.378907 4707 scope.go:117] "RemoveContainer" containerID="066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a" Jan 21 15:27:37 crc kubenswrapper[4707]: E0121 15:27:37.379263 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a\": container with ID starting with 066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a not found: ID does not exist" containerID="066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a" Jan 21 15:27:37 crc kubenswrapper[4707]: I0121 15:27:37.379515 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a"} err="failed to get container status \"066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a\": rpc error: code = NotFound desc = could not find container \"066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a\": container with ID starting with 066b752ef07a8a8641bb4336274429f81a3379d7442c8f0f36c3ce83a52d118a not found: ID does not exist" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.205625 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" path="/var/lib/kubelet/pods/1acf0601-50f7-429b-b0ff-a5db620b581e/volumes" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.360162 4707 generic.go:334] "Generic (PLEG): container finished" podID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerID="3209d9296b843327d03e27d6b54b024ed915b3086eafcf41994f7ec3896c4560" exitCode=0 Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.360856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerDied","Data":"3209d9296b843327d03e27d6b54b024ed915b3086eafcf41994f7ec3896c4560"} Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.360884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d646bb51-5d6a-429f-9a18-915d0a69bd1b","Type":"ContainerDied","Data":"1d94c0376445ba919bdce62b1c153c6cff526ef8ac4dc17d0116f559380ed42a"} Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.360895 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d94c0376445ba919bdce62b1c153c6cff526ef8ac4dc17d0116f559380ed42a" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.381905 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-config-data\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-scripts\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-run-httpd\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qlj\" (UniqueName: \"kubernetes.io/projected/d646bb51-5d6a-429f-9a18-915d0a69bd1b-kube-api-access-75qlj\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-combined-ca-bundle\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-log-httpd\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.468945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-sg-core-conf-yaml\") pod \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\" (UID: \"d646bb51-5d6a-429f-9a18-915d0a69bd1b\") " Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.469417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-run-httpd" (OuterVolumeSpecName: 
"run-httpd") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.469446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.476078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-scripts" (OuterVolumeSpecName: "scripts") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.476897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d646bb51-5d6a-429f-9a18-915d0a69bd1b-kube-api-access-75qlj" (OuterVolumeSpecName: "kube-api-access-75qlj") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "kube-api-access-75qlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.495991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.523739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.536960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-config-data" (OuterVolumeSpecName: "config-data") pod "d646bb51-5d6a-429f-9a18-915d0a69bd1b" (UID: "d646bb51-5d6a-429f-9a18-915d0a69bd1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571343 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qlj\" (UniqueName: \"kubernetes.io/projected/d646bb51-5d6a-429f-9a18-915d0a69bd1b-kube-api-access-75qlj\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571366 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571377 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571387 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571395 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571402 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d646bb51-5d6a-429f-9a18-915d0a69bd1b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:39 crc kubenswrapper[4707]: I0121 15:27:39.571409 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d646bb51-5d6a-429f-9a18-915d0a69bd1b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.366520 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.389396 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.394319 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.413781 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:40 crc kubenswrapper[4707]: E0121 15:27:40.414096 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-httpd" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-httpd" Jan 21 15:27:40 crc kubenswrapper[4707]: E0121 15:27:40.414120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="sg-core" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414125 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="sg-core" Jan 21 15:27:40 crc kubenswrapper[4707]: E0121 15:27:40.414133 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="proxy-httpd" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414139 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="proxy-httpd" Jan 21 15:27:40 crc kubenswrapper[4707]: E0121 15:27:40.414152 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-central-agent" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414157 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-central-agent" Jan 21 15:27:40 crc kubenswrapper[4707]: E0121 15:27:40.414165 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-api" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-api" Jan 21 15:27:40 crc kubenswrapper[4707]: E0121 15:27:40.414193 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-notification-agent" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414199 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-notification-agent" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-notification-agent" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414347 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="ceilometer-central-agent" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" 
containerName="sg-core" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414364 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-httpd" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414374 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acf0601-50f7-429b-b0ff-a5db620b581e" containerName="neutron-api" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.414385 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" containerName="proxy-httpd" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.416226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.418587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.418587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.422128 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.484801 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.484963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-config-data\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.485004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-log-httpd\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.485164 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.485203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhdj\" (UniqueName: \"kubernetes.io/projected/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-kube-api-access-kwhdj\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.485349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-run-httpd\") pod \"ceilometer-0\" (UID: 
\"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.485407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-scripts\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-config-data\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-log-httpd\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhdj\" (UniqueName: \"kubernetes.io/projected/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-kube-api-access-kwhdj\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-run-httpd\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.586856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-scripts\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.587289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-run-httpd\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.587389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-log-httpd\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.590584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.590727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.597220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-scripts\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.599505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhdj\" (UniqueName: \"kubernetes.io/projected/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-kube-api-access-kwhdj\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.599649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-config-data\") pod \"ceilometer-0\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:40 crc kubenswrapper[4707]: I0121 15:27:40.742100 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.047679 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-jfctd"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.048775 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.057711 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-jfctd"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.106009 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:41 crc kubenswrapper[4707]: W0121 15:27:41.107966 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189bb580_1bb8_4175_a41d_20cb5dd4a2ec.slice/crio-deaef8cc24e1e84fc32a447b330e490a9f59e807ba4385fdec731417bd695926 WatchSource:0}: Error finding container deaef8cc24e1e84fc32a447b330e490a9f59e807ba4385fdec731417bd695926: Status 404 returned error can't find the container with id deaef8cc24e1e84fc32a447b330e490a9f59e807ba4385fdec731417bd695926 Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.159151 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-tcjlc"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.160258 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.170144 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-tcjlc"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.177994 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.179016 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.180482 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.192408 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d646bb51-5d6a-429f-9a18-915d0a69bd1b" path="/var/lib/kubelet/pods/d646bb51-5d6a-429f-9a18-915d0a69bd1b/volumes" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.193199 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.199260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e4cf85-ea8a-4428-b07c-3f1659259c77-operator-scripts\") pod \"nova-api-db-create-jfctd\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.199305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxklb\" (UniqueName: \"kubernetes.io/projected/d0e4cf85-ea8a-4428-b07c-3f1659259c77-kube-api-access-qxklb\") pod \"nova-api-db-create-jfctd\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.221315 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.222386 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.265319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.265559 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-log" containerID="cri-o://c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94" gracePeriod=30 Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.265666 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-httpd" containerID="cri-o://ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9" gracePeriod=30 Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.302267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e4cf85-ea8a-4428-b07c-3f1659259c77-operator-scripts\") pod \"nova-api-db-create-jfctd\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.302358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxklb\" (UniqueName: \"kubernetes.io/projected/d0e4cf85-ea8a-4428-b07c-3f1659259c77-kube-api-access-qxklb\") pod 
\"nova-api-db-create-jfctd\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.302425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-operator-scripts\") pod \"nova-api-85d1-account-create-update-w6f54\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.302523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-operator-scripts\") pod \"nova-cell0-db-create-tcjlc\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.302561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xcl\" (UniqueName: \"kubernetes.io/projected/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-kube-api-access-x7xcl\") pod \"nova-api-85d1-account-create-update-w6f54\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.302646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dczw\" (UniqueName: \"kubernetes.io/projected/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-kube-api-access-4dczw\") pod \"nova-cell0-db-create-tcjlc\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.303069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e4cf85-ea8a-4428-b07c-3f1659259c77-operator-scripts\") pod \"nova-api-db-create-jfctd\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.319365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxklb\" (UniqueName: \"kubernetes.io/projected/d0e4cf85-ea8a-4428-b07c-3f1659259c77-kube-api-access-qxklb\") pod \"nova-api-db-create-jfctd\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.354939 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.355946 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.358185 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.360884 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-h728t"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.361923 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.362168 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.367333 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-h728t"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.374243 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.380656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerStarted","Data":"deaef8cc24e1e84fc32a447b330e490a9f59e807ba4385fdec731417bd695926"} Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.404697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-operator-scripts\") pod \"nova-api-85d1-account-create-update-w6f54\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.404841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-operator-scripts\") pod \"nova-cell0-db-create-tcjlc\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.404953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xcl\" (UniqueName: \"kubernetes.io/projected/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-kube-api-access-x7xcl\") pod \"nova-api-85d1-account-create-update-w6f54\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.405033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dczw\" (UniqueName: \"kubernetes.io/projected/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-kube-api-access-4dczw\") pod \"nova-cell0-db-create-tcjlc\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.405799 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-operator-scripts\") pod \"nova-api-85d1-account-create-update-w6f54\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.406875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-operator-scripts\") pod \"nova-cell0-db-create-tcjlc\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.419097 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4dczw\" (UniqueName: \"kubernetes.io/projected/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-kube-api-access-4dczw\") pod \"nova-cell0-db-create-tcjlc\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.419460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xcl\" (UniqueName: \"kubernetes.io/projected/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-kube-api-access-x7xcl\") pod \"nova-api-85d1-account-create-update-w6f54\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.456690 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.460885 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.464189 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.475470 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.475889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.493526 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.506183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-operator-scripts\") pod \"nova-cell1-db-create-h728t\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.506244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399b071b-395f-4701-b7b9-3c3868acae7f-operator-scripts\") pod \"nova-cell0-62dd-account-create-update-cgs94\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.506341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtsb\" (UniqueName: \"kubernetes.io/projected/399b071b-395f-4701-b7b9-3c3868acae7f-kube-api-access-kqtsb\") pod \"nova-cell0-62dd-account-create-update-cgs94\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.506400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz4hl\" (UniqueName: \"kubernetes.io/projected/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-kube-api-access-vz4hl\") pod \"nova-cell1-db-create-h728t\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399b071b-395f-4701-b7b9-3c3868acae7f-operator-scripts\") pod \"nova-cell0-62dd-account-create-update-cgs94\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/120902fc-6e0c-4568-af50-7cc575b4be15-operator-scripts\") pod \"nova-cell1-80bd-account-create-update-8gwrb\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtsb\" (UniqueName: \"kubernetes.io/projected/399b071b-395f-4701-b7b9-3c3868acae7f-kube-api-access-kqtsb\") pod \"nova-cell0-62dd-account-create-update-cgs94\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jfg\" (UniqueName: \"kubernetes.io/projected/120902fc-6e0c-4568-af50-7cc575b4be15-kube-api-access-k6jfg\") pod \"nova-cell1-80bd-account-create-update-8gwrb\" (UID: 
\"120902fc-6e0c-4568-af50-7cc575b4be15\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz4hl\" (UniqueName: \"kubernetes.io/projected/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-kube-api-access-vz4hl\") pod \"nova-cell1-db-create-h728t\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-operator-scripts\") pod \"nova-cell1-db-create-h728t\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.608974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399b071b-395f-4701-b7b9-3c3868acae7f-operator-scripts\") pod \"nova-cell0-62dd-account-create-update-cgs94\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.609351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-operator-scripts\") pod \"nova-cell1-db-create-h728t\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.624545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtsb\" (UniqueName: \"kubernetes.io/projected/399b071b-395f-4701-b7b9-3c3868acae7f-kube-api-access-kqtsb\") pod \"nova-cell0-62dd-account-create-update-cgs94\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.625559 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz4hl\" (UniqueName: \"kubernetes.io/projected/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-kube-api-access-vz4hl\") pod \"nova-cell1-db-create-h728t\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.676681 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.710095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/120902fc-6e0c-4568-af50-7cc575b4be15-operator-scripts\") pod \"nova-cell1-80bd-account-create-update-8gwrb\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.710143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jfg\" (UniqueName: \"kubernetes.io/projected/120902fc-6e0c-4568-af50-7cc575b4be15-kube-api-access-k6jfg\") pod \"nova-cell1-80bd-account-create-update-8gwrb\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.711018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/120902fc-6e0c-4568-af50-7cc575b4be15-operator-scripts\") pod \"nova-cell1-80bd-account-create-update-8gwrb\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.724353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jfg\" (UniqueName: \"kubernetes.io/projected/120902fc-6e0c-4568-af50-7cc575b4be15-kube-api-access-k6jfg\") pod \"nova-cell1-80bd-account-create-update-8gwrb\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.770354 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-jfctd"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.789460 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.802762 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.909341 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54"] Jan 21 15:27:41 crc kubenswrapper[4707]: I0121 15:27:41.967059 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-tcjlc"] Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.097324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94"] Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.208497 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.208867 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-log" containerID="cri-o://780f1ff1126457e163f15d285904bf20997e1fa8d3a62e958bafdb6e38ea3073" gracePeriod=30 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.209102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-httpd" containerID="cri-o://10f61a011e329ec1d672daefa3c014bb1fa30336866182ba8c21a0dcaf122b87" gracePeriod=30 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.246564 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-h728t"] Jan 21 15:27:42 crc kubenswrapper[4707]: W0121 15:27:42.257942 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef17a342_1cc1_4fe8_8900_ef6f41eb4a51.slice/crio-484268a6fb1356a45bb72d8d84d87edde116b7cbd96aa4975b219572288d51c4 WatchSource:0}: Error finding container 484268a6fb1356a45bb72d8d84d87edde116b7cbd96aa4975b219572288d51c4: Status 404 returned error can't find the container with id 484268a6fb1356a45bb72d8d84d87edde116b7cbd96aa4975b219572288d51c4 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.297673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb"] Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.399834 4707 generic.go:334] "Generic (PLEG): container finished" podID="abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" containerID="bbef33fc947becdd1d5558d859c4bca846cce597aa59ea98ba205b9d3e17dcc1" exitCode=0 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.399894 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" event={"ID":"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e","Type":"ContainerDied","Data":"bbef33fc947becdd1d5558d859c4bca846cce597aa59ea98ba205b9d3e17dcc1"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.399917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" event={"ID":"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e","Type":"ContainerStarted","Data":"aca393eeb3e75609b3bca33ec7e960d38b55c46f73d8f2db92cbb11e69b7cbbc"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.404371 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="d8665ece-e4f3-481b-b4eb-92126976834e" containerID="c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94" exitCode=143 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.404432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8665ece-e4f3-481b-b4eb-92126976834e","Type":"ContainerDied","Data":"c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.410125 4707 generic.go:334] "Generic (PLEG): container finished" podID="d0e4cf85-ea8a-4428-b07c-3f1659259c77" containerID="a38a1e887943798f2b7cda2bb98ad15eef8f01dd0e9584fb4d01c907a2a4857a" exitCode=0 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.410194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" event={"ID":"d0e4cf85-ea8a-4428-b07c-3f1659259c77","Type":"ContainerDied","Data":"a38a1e887943798f2b7cda2bb98ad15eef8f01dd0e9584fb4d01c907a2a4857a"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.410212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" event={"ID":"d0e4cf85-ea8a-4428-b07c-3f1659259c77","Type":"ContainerStarted","Data":"d861a21997ba4ac36b8c2f812eed73446cd7a25e1850bbe0a7e611c0b2184c55"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.411228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" event={"ID":"120902fc-6e0c-4568-af50-7cc575b4be15","Type":"ContainerStarted","Data":"e351a87e441da597ea10e84c81babc6730023232111dceff709d3755c4460a2d"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.412692 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerID="780f1ff1126457e163f15d285904bf20997e1fa8d3a62e958bafdb6e38ea3073" exitCode=143 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.412750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4a9ee08-b47e-42c9-a557-35c13dbc9b16","Type":"ContainerDied","Data":"780f1ff1126457e163f15d285904bf20997e1fa8d3a62e958bafdb6e38ea3073"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.413727 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" containerID="e27f7fb12ce72b230d303a6c5514f136d2f3b7438b371e165d8f5c1d2e221f5e" exitCode=0 Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.413802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" event={"ID":"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe","Type":"ContainerDied","Data":"e27f7fb12ce72b230d303a6c5514f136d2f3b7438b371e165d8f5c1d2e221f5e"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.413848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" event={"ID":"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe","Type":"ContainerStarted","Data":"5fb7360e567f9146c17a4cf3f4fbf08003aec658a0f8c6ee69005f8e415c6d66"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.415082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerStarted","Data":"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 
15:27:42.416022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" event={"ID":"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51","Type":"ContainerStarted","Data":"484268a6fb1356a45bb72d8d84d87edde116b7cbd96aa4975b219572288d51c4"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.419027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" event={"ID":"399b071b-395f-4701-b7b9-3c3868acae7f","Type":"ContainerStarted","Data":"3afacb2fae26e1db59aef8536ad9c7fae3e346112f1acdf155aed4ac62deca07"} Jan 21 15:27:42 crc kubenswrapper[4707]: I0121 15:27:42.918928 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.426667 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" containerID="d567d73287d29adb0661d01c4c04f9722e6892b396a786bbaae79534a669a0dd" exitCode=0 Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.426850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" event={"ID":"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51","Type":"ContainerDied","Data":"d567d73287d29adb0661d01c4c04f9722e6892b396a786bbaae79534a669a0dd"} Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.428153 4707 generic.go:334] "Generic (PLEG): container finished" podID="120902fc-6e0c-4568-af50-7cc575b4be15" containerID="da9ab0487ecb52d6afd10b061c395b405c6b662ea77f16f3551cb9bae154af0e" exitCode=0 Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.428204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" event={"ID":"120902fc-6e0c-4568-af50-7cc575b4be15","Type":"ContainerDied","Data":"da9ab0487ecb52d6afd10b061c395b405c6b662ea77f16f3551cb9bae154af0e"} Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.429851 4707 generic.go:334] "Generic (PLEG): container finished" podID="399b071b-395f-4701-b7b9-3c3868acae7f" containerID="33ee19dacd4b8460c57308ceb725cbdeac6589f0322cef31b06d130e43027761" exitCode=0 Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.429889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" event={"ID":"399b071b-395f-4701-b7b9-3c3868acae7f","Type":"ContainerDied","Data":"33ee19dacd4b8460c57308ceb725cbdeac6589f0322cef31b06d130e43027761"} Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.432756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerStarted","Data":"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063"} Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.432792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerStarted","Data":"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4"} Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.802455 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.810649 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.813518 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.950411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7xcl\" (UniqueName: \"kubernetes.io/projected/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-kube-api-access-x7xcl\") pod \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.950710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e4cf85-ea8a-4428-b07c-3f1659259c77-operator-scripts\") pod \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.951238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e4cf85-ea8a-4428-b07c-3f1659259c77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0e4cf85-ea8a-4428-b07c-3f1659259c77" (UID: "d0e4cf85-ea8a-4428-b07c-3f1659259c77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.951313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dczw\" (UniqueName: \"kubernetes.io/projected/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-kube-api-access-4dczw\") pod \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.951335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxklb\" (UniqueName: \"kubernetes.io/projected/d0e4cf85-ea8a-4428-b07c-3f1659259c77-kube-api-access-qxklb\") pod \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\" (UID: \"d0e4cf85-ea8a-4428-b07c-3f1659259c77\") " Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.951365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-operator-scripts\") pod \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\" (UID: \"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e\") " Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.951403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-operator-scripts\") pod \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\" (UID: \"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe\") " Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.951775 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e4cf85-ea8a-4428-b07c-3f1659259c77-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.952018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" (UID: "4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.952519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" (UID: "abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.954988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e4cf85-ea8a-4428-b07c-3f1659259c77-kube-api-access-qxklb" (OuterVolumeSpecName: "kube-api-access-qxklb") pod "d0e4cf85-ea8a-4428-b07c-3f1659259c77" (UID: "d0e4cf85-ea8a-4428-b07c-3f1659259c77"). InnerVolumeSpecName "kube-api-access-qxklb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.955035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-kube-api-access-x7xcl" (OuterVolumeSpecName: "kube-api-access-x7xcl") pod "abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" (UID: "abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e"). InnerVolumeSpecName "kube-api-access-x7xcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:43 crc kubenswrapper[4707]: I0121 15:27:43.955191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-kube-api-access-4dczw" (OuterVolumeSpecName: "kube-api-access-4dczw") pod "4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" (UID: "4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe"). InnerVolumeSpecName "kube-api-access-4dczw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.053657 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dczw\" (UniqueName: \"kubernetes.io/projected/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-kube-api-access-4dczw\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.053679 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxklb\" (UniqueName: \"kubernetes.io/projected/d0e4cf85-ea8a-4428-b07c-3f1659259c77-kube-api-access-qxklb\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.053689 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.053697 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.053704 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7xcl\" (UniqueName: \"kubernetes.io/projected/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e-kube-api-access-x7xcl\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.443197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" event={"ID":"d0e4cf85-ea8a-4428-b07c-3f1659259c77","Type":"ContainerDied","Data":"d861a21997ba4ac36b8c2f812eed73446cd7a25e1850bbe0a7e611c0b2184c55"} Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.443238 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d861a21997ba4ac36b8c2f812eed73446cd7a25e1850bbe0a7e611c0b2184c55" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.443243 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-jfctd" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.445370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" event={"ID":"abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e","Type":"ContainerDied","Data":"aca393eeb3e75609b3bca33ec7e960d38b55c46f73d8f2db92cbb11e69b7cbbc"} Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.445402 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.445421 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aca393eeb3e75609b3bca33ec7e960d38b55c46f73d8f2db92cbb11e69b7cbbc" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.447302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" event={"ID":"4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe","Type":"ContainerDied","Data":"5fb7360e567f9146c17a4cf3f4fbf08003aec658a0f8c6ee69005f8e415c6d66"} Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.447358 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fb7360e567f9146c17a4cf3f4fbf08003aec658a0f8c6ee69005f8e415c6d66" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.447367 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-tcjlc" Jan 21 15:27:44 crc kubenswrapper[4707]: E0121 15:27:44.671524 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2a0d64_5c2e_48a0_92d4_34f1536f7cbe.slice/crio-5fb7360e567f9146c17a4cf3f4fbf08003aec658a0f8c6ee69005f8e415c6d66\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e4cf85_ea8a_4428_b07c_3f1659259c77.slice/crio-d861a21997ba4ac36b8c2f812eed73446cd7a25e1850bbe0a7e611c0b2184c55\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabcbb0c2_58e3_45e6_87ab_a1ef3bc43d0e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabcbb0c2_58e3_45e6_87ab_a1ef3bc43d0e.slice/crio-aca393eeb3e75609b3bca33ec7e960d38b55c46f73d8f2db92cbb11e69b7cbbc\": RecentStats: unable to find data in memory cache]" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.726857 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.821305 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.823626 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.869362 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.882123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz4hl\" (UniqueName: \"kubernetes.io/projected/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-kube-api-access-vz4hl\") pod \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.882201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-operator-scripts\") pod \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\" (UID: \"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.882919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" (UID: "ef17a342-1cc1-4fe8-8900-ef6f41eb4a51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.890001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-kube-api-access-vz4hl" (OuterVolumeSpecName: "kube-api-access-vz4hl") pod "ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" (UID: "ef17a342-1cc1-4fe8-8900-ef6f41eb4a51"). InnerVolumeSpecName "kube-api-access-vz4hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jfg\" (UniqueName: \"kubernetes.io/projected/120902fc-6e0c-4568-af50-7cc575b4be15-kube-api-access-k6jfg\") pod \"120902fc-6e0c-4568-af50-7cc575b4be15\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-scripts\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqtsb\" (UniqueName: \"kubernetes.io/projected/399b071b-395f-4701-b7b9-3c3868acae7f-kube-api-access-kqtsb\") pod \"399b071b-395f-4701-b7b9-3c3868acae7f\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-logs\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399b071b-395f-4701-b7b9-3c3868acae7f-operator-scripts\") pod \"399b071b-395f-4701-b7b9-3c3868acae7f\" (UID: \"399b071b-395f-4701-b7b9-3c3868acae7f\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983905 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/120902fc-6e0c-4568-af50-7cc575b4be15-operator-scripts\") pod \"120902fc-6e0c-4568-af50-7cc575b4be15\" (UID: \"120902fc-6e0c-4568-af50-7cc575b4be15\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-public-tls-certs\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.983991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gmvc\" (UniqueName: \"kubernetes.io/projected/d8665ece-e4f3-481b-b4eb-92126976834e-kube-api-access-8gmvc\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.984022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-combined-ca-bundle\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.984065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-config-data\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.984081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-httpd-run\") pod \"d8665ece-e4f3-481b-b4eb-92126976834e\" (UID: \"d8665ece-e4f3-481b-b4eb-92126976834e\") " Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.984622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-logs" (OuterVolumeSpecName: "logs") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.984894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399b071b-395f-4701-b7b9-3c3868acae7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "399b071b-395f-4701-b7b9-3c3868acae7f" (UID: "399b071b-395f-4701-b7b9-3c3868acae7f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120902fc-6e0c-4568-af50-7cc575b4be15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "120902fc-6e0c-4568-af50-7cc575b4be15" (UID: "120902fc-6e0c-4568-af50-7cc575b4be15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985199 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985215 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985226 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399b071b-395f-4701-b7b9-3c3868acae7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985233 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/120902fc-6e0c-4568-af50-7cc575b4be15-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985241 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz4hl\" (UniqueName: \"kubernetes.io/projected/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51-kube-api-access-vz4hl\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.985278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.987858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120902fc-6e0c-4568-af50-7cc575b4be15-kube-api-access-k6jfg" (OuterVolumeSpecName: "kube-api-access-k6jfg") pod "120902fc-6e0c-4568-af50-7cc575b4be15" (UID: "120902fc-6e0c-4568-af50-7cc575b4be15"). InnerVolumeSpecName "kube-api-access-k6jfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.988162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399b071b-395f-4701-b7b9-3c3868acae7f-kube-api-access-kqtsb" (OuterVolumeSpecName: "kube-api-access-kqtsb") pod "399b071b-395f-4701-b7b9-3c3868acae7f" (UID: "399b071b-395f-4701-b7b9-3c3868acae7f"). InnerVolumeSpecName "kube-api-access-kqtsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.988182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8665ece-e4f3-481b-b4eb-92126976834e-kube-api-access-8gmvc" (OuterVolumeSpecName: "kube-api-access-8gmvc") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "kube-api-access-8gmvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.989287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:27:44 crc kubenswrapper[4707]: I0121 15:27:44.989637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-scripts" (OuterVolumeSpecName: "scripts") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.007148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.025334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-config-data" (OuterVolumeSpecName: "config-data") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.025973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d8665ece-e4f3-481b-b4eb-92126976834e" (UID: "d8665ece-e4f3-481b-b4eb-92126976834e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086526 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086788 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086799 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gmvc\" (UniqueName: \"kubernetes.io/projected/d8665ece-e4f3-481b-b4eb-92126976834e-kube-api-access-8gmvc\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086823 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086832 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086839 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8665ece-e4f3-481b-b4eb-92126976834e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086848 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6jfg\" (UniqueName: \"kubernetes.io/projected/120902fc-6e0c-4568-af50-7cc575b4be15-kube-api-access-k6jfg\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086856 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8665ece-e4f3-481b-b4eb-92126976834e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.086866 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqtsb\" (UniqueName: \"kubernetes.io/projected/399b071b-395f-4701-b7b9-3c3868acae7f-kube-api-access-kqtsb\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.103664 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.188446 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.468375 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerID="10f61a011e329ec1d672daefa3c014bb1fa30336866182ba8c21a0dcaf122b87" exitCode=0 Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.468709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4a9ee08-b47e-42c9-a557-35c13dbc9b16","Type":"ContainerDied","Data":"10f61a011e329ec1d672daefa3c014bb1fa30336866182ba8c21a0dcaf122b87"} Jan 21 15:27:45 crc 
kubenswrapper[4707]: I0121 15:27:45.475365 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8665ece-e4f3-481b-b4eb-92126976834e" containerID="ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9" exitCode=0 Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.475492 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.475490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8665ece-e4f3-481b-b4eb-92126976834e","Type":"ContainerDied","Data":"ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9"} Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.475639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8665ece-e4f3-481b-b4eb-92126976834e","Type":"ContainerDied","Data":"5df24f191ad32d87727a20232e5220227000c90950cd69f0d4401595dae00926"} Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.475690 4707 scope.go:117] "RemoveContainer" containerID="ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.483042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerStarted","Data":"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af"} Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.483398 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-central-agent" containerID="cri-o://dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" gracePeriod=30 Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.483494 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="sg-core" containerID="cri-o://e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" gracePeriod=30 Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.483513 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-notification-agent" containerID="cri-o://d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" gracePeriod=30 Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.483425 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.483628 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="proxy-httpd" containerID="cri-o://a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" gracePeriod=30 Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.500306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" event={"ID":"ef17a342-1cc1-4fe8-8900-ef6f41eb4a51","Type":"ContainerDied","Data":"484268a6fb1356a45bb72d8d84d87edde116b7cbd96aa4975b219572288d51c4"} Jan 21 15:27:45 crc 
kubenswrapper[4707]: I0121 15:27:45.500339 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484268a6fb1356a45bb72d8d84d87edde116b7cbd96aa4975b219572288d51c4" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.500410 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-h728t" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.508369 4707 scope.go:117] "RemoveContainer" containerID="c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.508450 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.510438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" event={"ID":"120902fc-6e0c-4568-af50-7cc575b4be15","Type":"ContainerDied","Data":"e351a87e441da597ea10e84c81babc6730023232111dceff709d3755c4460a2d"} Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.510478 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e351a87e441da597ea10e84c81babc6730023232111dceff709d3755c4460a2d" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.510597 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.523239 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.525057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" event={"ID":"399b071b-395f-4701-b7b9-3c3868acae7f","Type":"ContainerDied","Data":"3afacb2fae26e1db59aef8536ad9c7fae3e346112f1acdf155aed4ac62deca07"} Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.525085 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3afacb2fae26e1db59aef8536ad9c7fae3e346112f1acdf155aed4ac62deca07" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.525164 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534014 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534334 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120902fc-6e0c-4568-af50-7cc575b4be15" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534349 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="120902fc-6e0c-4568-af50-7cc575b4be15" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534369 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e4cf85-ea8a-4428-b07c-3f1659259c77" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534375 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e4cf85-ea8a-4428-b07c-3f1659259c77" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534386 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534404 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399b071b-395f-4701-b7b9-3c3868acae7f" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534409 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="399b071b-395f-4701-b7b9-3c3868acae7f" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534416 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-httpd" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534421 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-httpd" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-log" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-log" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534461 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.534472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" 
containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="399b071b-395f-4701-b7b9-3c3868acae7f" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534626 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-httpd" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534631 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534640 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="120902fc-6e0c-4568-af50-7cc575b4be15" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534657 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e4cf85-ea8a-4428-b07c-3f1659259c77" containerName="mariadb-database-create" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" containerName="mariadb-account-create-update" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.534674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" containerName="glance-log" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.535545 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.539680 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.3242620880000002 podStartE2EDuration="5.539662283s" podCreationTimestamp="2026-01-21 15:27:40 +0000 UTC" firstStartedPulling="2026-01-21 15:27:41.110729907 +0000 UTC m=+1558.292246128" lastFinishedPulling="2026-01-21 15:27:44.326130101 +0000 UTC m=+1561.507646323" observedRunningTime="2026-01-21 15:27:45.528286287 +0000 UTC m=+1562.709802509" watchObservedRunningTime="2026-01-21 15:27:45.539662283 +0000 UTC m=+1562.721178505" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.540149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.541712 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.542243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.549164 4707 scope.go:117] "RemoveContainer" containerID="ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.549557 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9\": container with ID starting with ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9 not found: ID does not exist" containerID="ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.549583 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9"} err="failed to get container status \"ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9\": rpc error: code = NotFound desc = could not find container \"ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9\": container with ID starting with ce0fc1c22ee8dfb1303a4c0b3ee61e935e75e0f74afbfbc2d208290bcafdbdf9 not found: ID does not exist" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.549597 4707 scope.go:117] "RemoveContainer" containerID="c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94" Jan 21 15:27:45 crc kubenswrapper[4707]: E0121 15:27:45.549770 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94\": container with ID starting with c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94 not found: ID does not exist" containerID="c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.549788 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94"} err="failed to get container status \"c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94\": rpc error: code = NotFound desc = could not find container 
\"c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94\": container with ID starting with c4960db130d48f0b6290d7989f7dd73279e63313f7e276fb973789584971dc94 not found: ID does not exist" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.704587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-scripts\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsj9\" (UniqueName: \"kubernetes.io/projected/09b00a3d-6e52-458e-8fb3-6c48d658ff86-kube-api-access-svsj9\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-config-data\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.705734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-logs\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: 
I0121 15:27:45.745301 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-config-data\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-logs\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-scripts\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsj9\" (UniqueName: \"kubernetes.io/projected/09b00a3d-6e52-458e-8fb3-6c48d658ff86-kube-api-access-svsj9\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.808943 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.810539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-logs\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.812698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.817253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-scripts\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.818911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.820726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.822306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-config-data\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.826706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsj9\" (UniqueName: \"kubernetes.io/projected/09b00a3d-6e52-458e-8fb3-6c48d658ff86-kube-api-access-svsj9\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.831714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.858559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.909876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4sxs\" (UniqueName: \"kubernetes.io/projected/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-kube-api-access-h4sxs\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-httpd-run\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-logs\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-scripts\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910330 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-internal-tls-certs\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-combined-ca-bundle\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-config-data\") pod \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\" (UID: \"e4a9ee08-b47e-42c9-a557-35c13dbc9b16\") " Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.910620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-logs" (OuterVolumeSpecName: "logs") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.912394 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.912417 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.913817 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-kube-api-access-h4sxs" (OuterVolumeSpecName: "kube-api-access-h4sxs") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "kube-api-access-h4sxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.913853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.914518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-scripts" (OuterVolumeSpecName: "scripts") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.940514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.961526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-config-data" (OuterVolumeSpecName: "config-data") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:45 crc kubenswrapper[4707]: I0121 15:27:45.962712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4a9ee08-b47e-42c9-a557-35c13dbc9b16" (UID: "e4a9ee08-b47e-42c9-a557-35c13dbc9b16"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.014375 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4sxs\" (UniqueName: \"kubernetes.io/projected/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-kube-api-access-h4sxs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.014403 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.014435 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.014445 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.014454 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.014462 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a9ee08-b47e-42c9-a557-35c13dbc9b16-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.030914 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.116551 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: W0121 15:27:46.246932 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09b00a3d_6e52_458e_8fb3_6c48d658ff86.slice/crio-9f6933ab473e2c8d0a445426066fe44bdcd81dfe5af76f5a868ef5c37f97527a WatchSource:0}: Error finding container 9f6933ab473e2c8d0a445426066fe44bdcd81dfe5af76f5a868ef5c37f97527a: Status 404 returned error can't find the container with id 9f6933ab473e2c8d0a445426066fe44bdcd81dfe5af76f5a868ef5c37f97527a Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.247100 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.248486 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.434839 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-sg-core-conf-yaml\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.434901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-scripts\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.434976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhdj\" (UniqueName: \"kubernetes.io/projected/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-kube-api-access-kwhdj\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.435075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-run-httpd\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.435093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-combined-ca-bundle\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.435117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-config-data\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.435141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-log-httpd\") pod \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\" (UID: \"189bb580-1bb8-4175-a41d-20cb5dd4a2ec\") " Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.435834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.436119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.443931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-kube-api-access-kwhdj" (OuterVolumeSpecName: "kube-api-access-kwhdj") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "kube-api-access-kwhdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.446991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-scripts" (OuterVolumeSpecName: "scripts") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.459154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.496904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.512305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-config-data" (OuterVolumeSpecName: "config-data") pod "189bb580-1bb8-4175-a41d-20cb5dd4a2ec" (UID: "189bb580-1bb8-4175-a41d-20cb5dd4a2ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537056 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537084 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537096 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537104 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537112 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537120 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.537127 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhdj\" (UniqueName: \"kubernetes.io/projected/189bb580-1bb8-4175-a41d-20cb5dd4a2ec-kube-api-access-kwhdj\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.539089 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.539117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4a9ee08-b47e-42c9-a557-35c13dbc9b16","Type":"ContainerDied","Data":"15bcf13514063cfbb4c689e47fe96488036211848dbf21d83d780560a3e9fffb"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.539183 4707 scope.go:117] "RemoveContainer" containerID="10f61a011e329ec1d672daefa3c014bb1fa30336866182ba8c21a0dcaf122b87" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.543122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"09b00a3d-6e52-458e-8fb3-6c48d658ff86","Type":"ContainerStarted","Data":"9f6933ab473e2c8d0a445426066fe44bdcd81dfe5af76f5a868ef5c37f97527a"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546310 4707 generic.go:334] "Generic (PLEG): container finished" podID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" exitCode=0 Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546334 4707 generic.go:334] "Generic (PLEG): container finished" podID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" exitCode=2 Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546344 4707 generic.go:334] "Generic (PLEG): container finished" podID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" exitCode=0 Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546353 4707 generic.go:334] "Generic (PLEG): container finished" podID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" exitCode=0 Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerDied","Data":"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerDied","Data":"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerDied","Data":"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerDied","Data":"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"189bb580-1bb8-4175-a41d-20cb5dd4a2ec","Type":"ContainerDied","Data":"deaef8cc24e1e84fc32a447b330e490a9f59e807ba4385fdec731417bd695926"} Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.546370 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.569070 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.573593 4707 scope.go:117] "RemoveContainer" containerID="780f1ff1126457e163f15d285904bf20997e1fa8d3a62e958bafdb6e38ea3073" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.576976 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.591791 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.597606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-central-agent" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597624 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-central-agent" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.597645 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="sg-core" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597652 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="sg-core" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.597665 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-log" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597670 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-log" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.597683 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-httpd" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597688 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-httpd" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.597702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-notification-agent" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597707 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-notification-agent" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.597722 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="proxy-httpd" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597728 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="proxy-httpd" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597897 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" 
containerName="ceilometer-central-agent" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="ceilometer-notification-agent" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597922 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-log" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597934 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="sg-core" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597942 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" containerName="glance-httpd" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.597953 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" containerName="proxy-httpd" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.598701 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.603593 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.603923 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.606192 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.612771 4707 scope.go:117] "RemoveContainer" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.619038 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.625574 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.627855 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.630241 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.630540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.630683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.651383 4707 scope.go:117] "RemoveContainer" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.670296 4707 scope.go:117] "RemoveContainer" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.687838 4707 scope.go:117] "RemoveContainer" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.703418 4707 scope.go:117] "RemoveContainer" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.704044 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": container with ID starting with a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af not found: ID does not exist" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af"} err="failed to get container status \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": rpc error: code = NotFound desc = could not find container \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": container with ID starting with a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704096 4707 scope.go:117] "RemoveContainer" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.704330 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": container with ID starting with e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063 not found: ID does not exist" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704363 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063"} err="failed to get container status \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": rpc error: code = NotFound desc = could not find container \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": container with ID starting with 
e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704389 4707 scope.go:117] "RemoveContainer" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.704594 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": container with ID starting with d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4 not found: ID does not exist" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704626 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4"} err="failed to get container status \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": rpc error: code = NotFound desc = could not find container \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": container with ID starting with d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704640 4707 scope.go:117] "RemoveContainer" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" Jan 21 15:27:46 crc kubenswrapper[4707]: E0121 15:27:46.704837 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": container with ID starting with dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc not found: ID does not exist" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704858 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc"} err="failed to get container status \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": rpc error: code = NotFound desc = could not find container \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": container with ID starting with dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.704883 4707 scope.go:117] "RemoveContainer" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705049 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af"} err="failed to get container status \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": rpc error: code = NotFound desc = could not find container \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": container with ID starting with a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705069 4707 scope.go:117] "RemoveContainer" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" Jan 21 15:27:46 crc 
kubenswrapper[4707]: I0121 15:27:46.705229 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063"} err="failed to get container status \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": rpc error: code = NotFound desc = could not find container \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": container with ID starting with e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705249 4707 scope.go:117] "RemoveContainer" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705400 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4"} err="failed to get container status \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": rpc error: code = NotFound desc = could not find container \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": container with ID starting with d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705415 4707 scope.go:117] "RemoveContainer" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705563 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc"} err="failed to get container status \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": rpc error: code = NotFound desc = could not find container \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": container with ID starting with dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705582 4707 scope.go:117] "RemoveContainer" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705735 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af"} err="failed to get container status \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": rpc error: code = NotFound desc = could not find container \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": container with ID starting with a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.705754 4707 scope.go:117] "RemoveContainer" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.706328 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063"} err="failed to get container status \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": rpc error: code = NotFound desc = could not find container \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": container with ID 
starting with e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.706362 4707 scope.go:117] "RemoveContainer" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.706928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4"} err="failed to get container status \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": rpc error: code = NotFound desc = could not find container \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": container with ID starting with d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.706948 4707 scope.go:117] "RemoveContainer" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.707154 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc"} err="failed to get container status \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": rpc error: code = NotFound desc = could not find container \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": container with ID starting with dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.707181 4707 scope.go:117] "RemoveContainer" containerID="a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.707997 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af"} err="failed to get container status \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": rpc error: code = NotFound desc = could not find container \"a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af\": container with ID starting with a3d20a71b64a23496a2cb395d6f2cc2a0e0b81d5b535aae52742e488ce0f46af not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.708018 4707 scope.go:117] "RemoveContainer" containerID="e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.708250 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063"} err="failed to get container status \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": rpc error: code = NotFound desc = could not find container \"e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063\": container with ID starting with e4f12cf61256f68372505b51152091e8a8f4bd15f40c7ae4de2881eac6f12063 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.708271 4707 scope.go:117] "RemoveContainer" containerID="d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.708438 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4"} err="failed to get container status \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": rpc error: code = NotFound desc = could not find container \"d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4\": container with ID starting with d3abb348466bd2ea48861de26fe4a858299bdb9994175a7c7280565c425392a4 not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.708474 4707 scope.go:117] "RemoveContainer" containerID="dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.708672 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc"} err="failed to get container status \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": rpc error: code = NotFound desc = could not find container \"dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc\": container with ID starting with dd49f38efa616e9bd71d6f4147656402a8d32566a21bda92985828a94d9c2bfc not found: ID does not exist" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd658\" (UniqueName: \"kubernetes.io/projected/33a83157-f122-493a-acd7-ba671cfce233-kube-api-access-zd658\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-scripts\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-log-httpd\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-config-data\") pod 
\"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-scripts\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqr9s\" (UniqueName: \"kubernetes.io/projected/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-kube-api-access-pqr9s\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-config-data\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-run-httpd\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-logs\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.740772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.758690 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.781781 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.787244 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.787446 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-5sbk6" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.787655 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.787888 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf"] Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.841972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-config-data\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-scripts\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqr9s\" (UniqueName: \"kubernetes.io/projected/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-kube-api-access-pqr9s\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-config-data\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-run-httpd\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-logs\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd658\" (UniqueName: \"kubernetes.io/projected/33a83157-f122-493a-acd7-ba671cfce233-kube-api-access-zd658\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-scripts\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-log-httpd\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.842423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc 
kubenswrapper[4707]: I0121 15:27:46.842711 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.843860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-logs\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.843980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.844833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-log-httpd\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.844942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-run-httpd\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.847524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.848441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-scripts\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.850419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.850425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.850945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.851612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-config-data\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.852142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-config-data\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.852968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-scripts\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.857902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd658\" (UniqueName: \"kubernetes.io/projected/33a83157-f122-493a-acd7-ba671cfce233-kube-api-access-zd658\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.858002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqr9s\" (UniqueName: \"kubernetes.io/projected/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-kube-api-access-pqr9s\") pod \"ceilometer-0\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.867369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.927408 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.941407 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.944316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-scripts\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.944367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-config-data\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.944414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psr9p\" (UniqueName: \"kubernetes.io/projected/35adac90-7099-431d-b0a3-58ada4999e21-kube-api-access-psr9p\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:46 crc kubenswrapper[4707]: I0121 15:27:46.944446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.045725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.045858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-scripts\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.045887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-config-data\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.045921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psr9p\" (UniqueName: \"kubernetes.io/projected/35adac90-7099-431d-b0a3-58ada4999e21-kube-api-access-psr9p\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.050044 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-config-data\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.052317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.055710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-scripts\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.060340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psr9p\" (UniqueName: \"kubernetes.io/projected/35adac90-7099-431d-b0a3-58ada4999e21-kube-api-access-psr9p\") pod \"nova-cell0-conductor-db-sync-p44sf\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.097079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.199321 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189bb580-1bb8-4175-a41d-20cb5dd4a2ec" path="/var/lib/kubelet/pods/189bb580-1bb8-4175-a41d-20cb5dd4a2ec/volumes" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.200349 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8665ece-e4f3-481b-b4eb-92126976834e" path="/var/lib/kubelet/pods/d8665ece-e4f3-481b-b4eb-92126976834e/volumes" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.201729 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a9ee08-b47e-42c9-a557-35c13dbc9b16" path="/var/lib/kubelet/pods/e4a9ee08-b47e-42c9-a557-35c13dbc9b16/volumes" Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.353869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.421509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.551705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf"] Jan 21 15:27:47 crc kubenswrapper[4707]: W0121 15:27:47.582996 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35adac90_7099_431d_b0a3_58ada4999e21.slice/crio-f339a93df0d8ec65206045cecb599ef99305447f3623449ab57448deb32a42fa WatchSource:0}: Error finding container f339a93df0d8ec65206045cecb599ef99305447f3623449ab57448deb32a42fa: Status 404 returned error can't find the container with id f339a93df0d8ec65206045cecb599ef99305447f3623449ab57448deb32a42fa Jan 21 15:27:47 crc kubenswrapper[4707]: 
I0121 15:27:47.601948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"09b00a3d-6e52-458e-8fb3-6c48d658ff86","Type":"ContainerStarted","Data":"dac905514aca127784ff10619aa0102a1724955c3593a17e7fa6fb6888775848"} Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.602000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"09b00a3d-6e52-458e-8fb3-6c48d658ff86","Type":"ContainerStarted","Data":"0f79ffcc51a1df0f60b92a65eed0727ebe1a87e5686c3cfa9b48daa4dee20df1"} Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.611967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"33a83157-f122-493a-acd7-ba671cfce233","Type":"ContainerStarted","Data":"9ccc2ad60f67c465be08cd21b0bdc4939b90da2d9c2ce4491b84bb1d1d8e5901"} Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.614219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerStarted","Data":"c6fa9a0c334f83916545d25b9e8c09e2700d03f606820a39b9838c6ac33d888f"} Jan 21 15:27:47 crc kubenswrapper[4707]: I0121 15:27:47.633744 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.633730881 podStartE2EDuration="2.633730881s" podCreationTimestamp="2026-01-21 15:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:47.631866788 +0000 UTC m=+1564.813383009" watchObservedRunningTime="2026-01-21 15:27:47.633730881 +0000 UTC m=+1564.815247104" Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.010027 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.626128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"33a83157-f122-493a-acd7-ba671cfce233","Type":"ContainerStarted","Data":"a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233"} Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.626523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"33a83157-f122-493a-acd7-ba671cfce233","Type":"ContainerStarted","Data":"940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5"} Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.628669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" event={"ID":"35adac90-7099-431d-b0a3-58ada4999e21","Type":"ContainerStarted","Data":"27ef52ef9ab5a10942362bb34bdfe0da4ea9479f559ae274a3e66b2fb8b006ae"} Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.628693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" event={"ID":"35adac90-7099-431d-b0a3-58ada4999e21","Type":"ContainerStarted","Data":"f339a93df0d8ec65206045cecb599ef99305447f3623449ab57448deb32a42fa"} Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.631492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerStarted","Data":"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487"} Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.664581 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.664566888 podStartE2EDuration="2.664566888s" podCreationTimestamp="2026-01-21 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:48.652143532 +0000 UTC m=+1565.833659754" watchObservedRunningTime="2026-01-21 15:27:48.664566888 +0000 UTC m=+1565.846083109" Jan 21 15:27:48 crc kubenswrapper[4707]: I0121 15:27:48.668241 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" podStartSLOduration=2.668231687 podStartE2EDuration="2.668231687s" podCreationTimestamp="2026-01-21 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:48.66279165 +0000 UTC m=+1565.844307892" watchObservedRunningTime="2026-01-21 15:27:48.668231687 +0000 UTC m=+1565.849747910" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.645188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerStarted","Data":"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4"} Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.667452 4707 scope.go:117] "RemoveContainer" containerID="a2d9743e9e081360d25dfe8246d36ca6184ede8b708a4387ffff1b0b955bd74a" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.699460 4707 scope.go:117] "RemoveContainer" containerID="6974d5bf2b2381c2c14cf2c91fc90afcde48c9637fed19ce45073e9c9e80bec1" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.751566 4707 scope.go:117] "RemoveContainer" containerID="d8ca5318999b7bf8d6303c3e592edd3d377c05a2dce6855ef7cb9b43aa92a648" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.783957 4707 scope.go:117] "RemoveContainer" containerID="7cbbc5f91d32fbfe39900eb972fe527aa33aacdf4a104c1e7339806c89526fae" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.822673 4707 scope.go:117] "RemoveContainer" containerID="898270f329b3e42bbbee5e74a2988492248116b7c718075edc0eb8b621339c63" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.872342 4707 scope.go:117] "RemoveContainer" containerID="f88d80aa5a2d179739c143a0050ab27e4ec78c655f22cd1f2e1091b62bc6e73b" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.903837 4707 scope.go:117] "RemoveContainer" containerID="dfafa233b16792821e92a67c522a5cbe5d8dd451e5c5bbe6fab6dcd63eb69869" Jan 21 15:27:49 crc kubenswrapper[4707]: I0121 15:27:49.930035 4707 scope.go:117] "RemoveContainer" containerID="74830281e4e80b570ce296677137e2b13d42928b30d53e08074c5321973fff8d" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.001260 4707 scope.go:117] "RemoveContainer" containerID="fee0cb39511a55b0679602f9c7a78524a3f99c3be04ba4a4ce0a4fd68b351c5c" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.049261 4707 scope.go:117] "RemoveContainer" containerID="eb34cbdfd71f1c9dd2163feb2da32572dcbf417daea1ca6cfab5189985d32ea4" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.076957 4707 scope.go:117] "RemoveContainer" 
containerID="3d83cb67dec455e46dc1ce200dbbd0a61b39254b95b751d67e78d0b70627b898" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.096148 4707 scope.go:117] "RemoveContainer" containerID="1e5286050c55001bc380d39d61363c908de1f2d1b096180f6b900cc316a11aad" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.126508 4707 scope.go:117] "RemoveContainer" containerID="55b0115da79e304b9f7eb8b41f3a28735875a035dbcde7cbf64e8cdf08faa4a9" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.194265 4707 scope.go:117] "RemoveContainer" containerID="6d331224286c383855737ab2779d81c7a80a7051be238066760e4621490f70b0" Jan 21 15:27:50 crc kubenswrapper[4707]: I0121 15:27:50.654124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerStarted","Data":"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc"} Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.662151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerStarted","Data":"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09"} Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.662405 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-notification-agent" containerID="cri-o://a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" gracePeriod=30 Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.662426 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.662387 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="proxy-httpd" containerID="cri-o://a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" gracePeriod=30 Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.662258 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-central-agent" containerID="cri-o://f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" gracePeriod=30 Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.662350 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="sg-core" containerID="cri-o://e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" gracePeriod=30 Jan 21 15:27:51 crc kubenswrapper[4707]: I0121 15:27:51.683699 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.975564072 podStartE2EDuration="5.683686638s" podCreationTimestamp="2026-01-21 15:27:46 +0000 UTC" firstStartedPulling="2026-01-21 15:27:47.463624614 +0000 UTC m=+1564.645140826" lastFinishedPulling="2026-01-21 15:27:51.17174717 +0000 UTC m=+1568.353263392" observedRunningTime="2026-01-21 15:27:51.678287559 +0000 UTC m=+1568.859803791" watchObservedRunningTime="2026-01-21 15:27:51.683686638 +0000 UTC m=+1568.865202860" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.244023 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-run-httpd\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-log-httpd\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-config-data\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-combined-ca-bundle\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-scripts\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqr9s\" (UniqueName: \"kubernetes.io/projected/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-kube-api-access-pqr9s\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.351873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-sg-core-conf-yaml\") pod \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\" (UID: \"92eb87ab-58c5-4f9d-90ac-56eca074fa5e\") " Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.352048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.352520 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.352539 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.366952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-scripts" (OuterVolumeSpecName: "scripts") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.383974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-kube-api-access-pqr9s" (OuterVolumeSpecName: "kube-api-access-pqr9s") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "kube-api-access-pqr9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.401922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.453747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.453967 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.454043 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.454112 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqr9s\" (UniqueName: \"kubernetes.io/projected/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-kube-api-access-pqr9s\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.454177 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.468345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-config-data" (OuterVolumeSpecName: "config-data") pod "92eb87ab-58c5-4f9d-90ac-56eca074fa5e" (UID: "92eb87ab-58c5-4f9d-90ac-56eca074fa5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.555600 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eb87ab-58c5-4f9d-90ac-56eca074fa5e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.673824 4707 generic.go:334] "Generic (PLEG): container finished" podID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" exitCode=0 Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.673887 4707 generic.go:334] "Generic (PLEG): container finished" podID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" exitCode=2 Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.673897 4707 generic.go:334] "Generic (PLEG): container finished" podID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" exitCode=0 Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.673907 4707 generic.go:334] "Generic (PLEG): container finished" podID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" exitCode=0 Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.673948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerDied","Data":"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09"} Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.673996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerDied","Data":"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc"} Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.674010 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerDied","Data":"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4"} Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.674023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerDied","Data":"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487"} Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.674035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92eb87ab-58c5-4f9d-90ac-56eca074fa5e","Type":"ContainerDied","Data":"c6fa9a0c334f83916545d25b9e8c09e2700d03f606820a39b9838c6ac33d888f"} Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.674066 4707 scope.go:117] "RemoveContainer" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.674336 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.702905 4707 scope.go:117] "RemoveContainer" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.705638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.716963 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723206 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.723522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-central-agent" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723535 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-central-agent" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.723548 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-notification-agent" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723554 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-notification-agent" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.723615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="proxy-httpd" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="proxy-httpd" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.723634 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="sg-core" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723641 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="sg-core" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723826 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-central-agent" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723842 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="proxy-httpd" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723856 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="sg-core" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.723868 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" containerName="ceilometer-notification-agent" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.725337 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.725908 4707 scope.go:117] "RemoveContainer" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.727301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.731035 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.732186 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.747588 4707 scope.go:117] "RemoveContainer" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.764410 4707 scope.go:117] "RemoveContainer" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.764804 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": container with ID starting with a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09 not found: ID does not exist" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.764859 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09"} err="failed to get container status \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": rpc error: code = NotFound desc = could not find container \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": container with ID starting with a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.764885 4707 scope.go:117] "RemoveContainer" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.765129 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": container with ID starting with 
e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc not found: ID does not exist" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.765154 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc"} err="failed to get container status \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": rpc error: code = NotFound desc = could not find container \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": container with ID starting with e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.765182 4707 scope.go:117] "RemoveContainer" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.765415 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": container with ID starting with a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4 not found: ID does not exist" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.765441 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4"} err="failed to get container status \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": rpc error: code = NotFound desc = could not find container \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": container with ID starting with a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.765453 4707 scope.go:117] "RemoveContainer" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" Jan 21 15:27:52 crc kubenswrapper[4707]: E0121 15:27:52.766111 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": container with ID starting with f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487 not found: ID does not exist" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766147 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487"} err="failed to get container status \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": rpc error: code = NotFound desc = could not find container \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": container with ID starting with f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766179 4707 scope.go:117] "RemoveContainer" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09"} err="failed to get container status \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": rpc error: code = NotFound desc = could not find container \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": container with ID starting with a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766449 4707 scope.go:117] "RemoveContainer" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766646 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc"} err="failed to get container status \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": rpc error: code = NotFound desc = could not find container \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": container with ID starting with e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766671 4707 scope.go:117] "RemoveContainer" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766887 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4"} err="failed to get container status \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": rpc error: code = NotFound desc = could not find container \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": container with ID starting with a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.766907 4707 scope.go:117] "RemoveContainer" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767096 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487"} err="failed to get container status \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": rpc error: code = NotFound desc = could not find container \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": container with ID starting with f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767114 4707 scope.go:117] "RemoveContainer" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767310 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09"} err="failed to get container status \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": rpc error: code = NotFound desc = could not find container \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": container with ID starting with a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09 not found: ID does not exist" Jan 
21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767326 4707 scope.go:117] "RemoveContainer" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767493 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc"} err="failed to get container status \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": rpc error: code = NotFound desc = could not find container \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": container with ID starting with e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767508 4707 scope.go:117] "RemoveContainer" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767654 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4"} err="failed to get container status \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": rpc error: code = NotFound desc = could not find container \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": container with ID starting with a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767669 4707 scope.go:117] "RemoveContainer" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767821 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487"} err="failed to get container status \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": rpc error: code = NotFound desc = could not find container \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": container with ID starting with f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767838 4707 scope.go:117] "RemoveContainer" containerID="a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.767996 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09"} err="failed to get container status \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": rpc error: code = NotFound desc = could not find container \"a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09\": container with ID starting with a65441553f9acba84093a3d62baa1059e2e06c72ffd2a4d74d98c481669a4f09 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.768011 4707 scope.go:117] "RemoveContainer" containerID="e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.768151 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc"} err="failed to get container status 
\"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": rpc error: code = NotFound desc = could not find container \"e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc\": container with ID starting with e6237a96913bb25c7570460d99512f04a354d887c925f5af68630dfde994a1bc not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.768166 4707 scope.go:117] "RemoveContainer" containerID="a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.768323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4"} err="failed to get container status \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": rpc error: code = NotFound desc = could not find container \"a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4\": container with ID starting with a47fa5924f4fdc0e2665c7d78195df2a4d39a7fa55390f65ff71d421de0f52d4 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.768340 4707 scope.go:117] "RemoveContainer" containerID="f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.768499 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487"} err="failed to get container status \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": rpc error: code = NotFound desc = could not find container \"f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487\": container with ID starting with f5f7593ba7d7c686bce299b3fb048f092707a55559d052f188128b0934b10487 not found: ID does not exist" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5677\" (UniqueName: \"kubernetes.io/projected/db84f146-2a89-41df-bf0a-e8bef54a64d9-kube-api-access-p5677\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-scripts\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-log-httpd\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860447 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-config-data\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.860490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-run-httpd\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-scripts\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961721 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-log-httpd\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-config-data\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-run-httpd\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.961886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5677\" (UniqueName: \"kubernetes.io/projected/db84f146-2a89-41df-bf0a-e8bef54a64d9-kube-api-access-p5677\") pod \"ceilometer-0\" (UID: 
\"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.962182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-log-httpd\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.962243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-run-httpd\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.964952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.965262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-config-data\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.965587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.966528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-scripts\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:52 crc kubenswrapper[4707]: I0121 15:27:52.977427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5677\" (UniqueName: \"kubernetes.io/projected/db84f146-2a89-41df-bf0a-e8bef54a64d9-kube-api-access-p5677\") pod \"ceilometer-0\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:53 crc kubenswrapper[4707]: I0121 15:27:53.052426 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:53 crc kubenswrapper[4707]: I0121 15:27:53.198043 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92eb87ab-58c5-4f9d-90ac-56eca074fa5e" path="/var/lib/kubelet/pods/92eb87ab-58c5-4f9d-90ac-56eca074fa5e/volumes" Jan 21 15:27:53 crc kubenswrapper[4707]: I0121 15:27:53.418316 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:27:53 crc kubenswrapper[4707]: I0121 15:27:53.681235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerStarted","Data":"72afeda33debe140662938cf01628880c528e6094bdf624669cb9121157c72a2"} Jan 21 15:27:53 crc kubenswrapper[4707]: I0121 15:27:53.684666 4707 generic.go:334] "Generic (PLEG): container finished" podID="35adac90-7099-431d-b0a3-58ada4999e21" containerID="27ef52ef9ab5a10942362bb34bdfe0da4ea9479f559ae274a3e66b2fb8b006ae" exitCode=0 Jan 21 15:27:53 crc kubenswrapper[4707]: I0121 15:27:53.684703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" event={"ID":"35adac90-7099-431d-b0a3-58ada4999e21","Type":"ContainerDied","Data":"27ef52ef9ab5a10942362bb34bdfe0da4ea9479f559ae274a3e66b2fb8b006ae"} Jan 21 15:27:54 crc kubenswrapper[4707]: I0121 15:27:54.701268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerStarted","Data":"8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec"} Jan 21 15:27:54 crc kubenswrapper[4707]: I0121 15:27:54.932694 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.094444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-scripts\") pod \"35adac90-7099-431d-b0a3-58ada4999e21\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.094488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-config-data\") pod \"35adac90-7099-431d-b0a3-58ada4999e21\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.094555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psr9p\" (UniqueName: \"kubernetes.io/projected/35adac90-7099-431d-b0a3-58ada4999e21-kube-api-access-psr9p\") pod \"35adac90-7099-431d-b0a3-58ada4999e21\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.094600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-combined-ca-bundle\") pod \"35adac90-7099-431d-b0a3-58ada4999e21\" (UID: \"35adac90-7099-431d-b0a3-58ada4999e21\") " Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.099037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-scripts" (OuterVolumeSpecName: "scripts") pod "35adac90-7099-431d-b0a3-58ada4999e21" (UID: "35adac90-7099-431d-b0a3-58ada4999e21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.099257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35adac90-7099-431d-b0a3-58ada4999e21-kube-api-access-psr9p" (OuterVolumeSpecName: "kube-api-access-psr9p") pod "35adac90-7099-431d-b0a3-58ada4999e21" (UID: "35adac90-7099-431d-b0a3-58ada4999e21"). InnerVolumeSpecName "kube-api-access-psr9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.114137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-config-data" (OuterVolumeSpecName: "config-data") pod "35adac90-7099-431d-b0a3-58ada4999e21" (UID: "35adac90-7099-431d-b0a3-58ada4999e21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.116989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35adac90-7099-431d-b0a3-58ada4999e21" (UID: "35adac90-7099-431d-b0a3-58ada4999e21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.196917 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.196944 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.196954 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psr9p\" (UniqueName: \"kubernetes.io/projected/35adac90-7099-431d-b0a3-58ada4999e21-kube-api-access-psr9p\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.196963 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adac90-7099-431d-b0a3-58ada4999e21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.710719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" event={"ID":"35adac90-7099-431d-b0a3-58ada4999e21","Type":"ContainerDied","Data":"f339a93df0d8ec65206045cecb599ef99305447f3623449ab57448deb32a42fa"} Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.710966 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f339a93df0d8ec65206045cecb599ef99305447f3623449ab57448deb32a42fa" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.711011 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.713698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerStarted","Data":"143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313"} Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.713726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerStarted","Data":"9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628"} Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.751501 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:27:55 crc kubenswrapper[4707]: E0121 15:27:55.751822 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35adac90-7099-431d-b0a3-58ada4999e21" containerName="nova-cell0-conductor-db-sync" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.751841 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35adac90-7099-431d-b0a3-58ada4999e21" containerName="nova-cell0-conductor-db-sync" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.751991 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35adac90-7099-431d-b0a3-58ada4999e21" containerName="nova-cell0-conductor-db-sync" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.752449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.753798 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.754727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-5sbk6" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.774856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.859200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.859243 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.879542 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.887184 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.907034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrq8\" (UniqueName: \"kubernetes.io/projected/f5fe6793-4878-480a-a359-225be30e006c-kube-api-access-2rrq8\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.907088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:55 crc kubenswrapper[4707]: I0121 15:27:55.907459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.008741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.008856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrq8\" (UniqueName: \"kubernetes.io/projected/f5fe6793-4878-480a-a359-225be30e006c-kube-api-access-2rrq8\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.008900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.013837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.014198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.035198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrq8\" (UniqueName: \"kubernetes.io/projected/f5fe6793-4878-480a-a359-225be30e006c-kube-api-access-2rrq8\") pod \"nova-cell0-conductor-0\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.067264 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.443891 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:27:56 crc kubenswrapper[4707]: W0121 15:27:56.445741 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5fe6793_4878_480a_a359_225be30e006c.slice/crio-a3993deda44d1d6434b3e8c25a3820a13a697f2a0bba99561425621c0cb08bd3 WatchSource:0}: Error finding container a3993deda44d1d6434b3e8c25a3820a13a697f2a0bba99561425621c0cb08bd3: Status 404 returned error can't find the container with id a3993deda44d1d6434b3e8c25a3820a13a697f2a0bba99561425621c0cb08bd3 Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.727515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f5fe6793-4878-480a-a359-225be30e006c","Type":"ContainerStarted","Data":"4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d"} Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.727570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.727581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f5fe6793-4878-480a-a359-225be30e006c","Type":"ContainerStarted","Data":"a3993deda44d1d6434b3e8c25a3820a13a697f2a0bba99561425621c0cb08bd3"} Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.727591 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.727598 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:56 crc 
kubenswrapper[4707]: I0121 15:27:56.742154 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.742143833 podStartE2EDuration="1.742143833s" podCreationTimestamp="2026-01-21 15:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:56.738285399 +0000 UTC m=+1573.919801621" watchObservedRunningTime="2026-01-21 15:27:56.742143833 +0000 UTC m=+1573.923660055" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.928912 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.929119 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.961358 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:56 crc kubenswrapper[4707]: I0121 15:27:56.962954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:57 crc kubenswrapper[4707]: I0121 15:27:57.734721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerStarted","Data":"347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3"} Jan 21 15:27:57 crc kubenswrapper[4707]: I0121 15:27:57.735158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:57 crc kubenswrapper[4707]: I0121 15:27:57.735354 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:57 crc kubenswrapper[4707]: I0121 15:27:57.750480 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.611540188 podStartE2EDuration="5.750466674s" podCreationTimestamp="2026-01-21 15:27:52 +0000 UTC" firstStartedPulling="2026-01-21 15:27:53.431908209 +0000 UTC m=+1570.613424431" lastFinishedPulling="2026-01-21 15:27:56.570834694 +0000 UTC m=+1573.752350917" observedRunningTime="2026-01-21 15:27:57.748991682 +0000 UTC m=+1574.930507904" watchObservedRunningTime="2026-01-21 15:27:57.750466674 +0000 UTC m=+1574.931982897" Jan 21 15:27:58 crc kubenswrapper[4707]: I0121 15:27:58.330428 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:58 crc kubenswrapper[4707]: I0121 15:27:58.353406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:27:58 crc kubenswrapper[4707]: I0121 15:27:58.740919 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:27:59 crc kubenswrapper[4707]: I0121 15:27:59.365837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:27:59 crc kubenswrapper[4707]: I0121 15:27:59.375526 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.088528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.156334 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.156544 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-central-agent" containerID="cri-o://8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec" gracePeriod=30 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.156599 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="sg-core" containerID="cri-o://143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313" gracePeriod=30 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.156621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="proxy-httpd" containerID="cri-o://347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3" gracePeriod=30 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.156627 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-notification-agent" containerID="cri-o://9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628" gracePeriod=30 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.603270 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.604390 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.605993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.607043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.617003 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.701334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-config-data\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.701482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.701507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-scripts\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.701566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9nn\" (UniqueName: \"kubernetes.io/projected/ab9ae62d-9e10-4db0-a711-28078f1ba665-kube-api-access-2s9nn\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.709300 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.710552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.724164 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.731766 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.733155 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.734647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.745789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.759509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.763886 4707 generic.go:334] "Generic (PLEG): container finished" podID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerID="347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3" exitCode=0 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.763981 4707 generic.go:334] "Generic (PLEG): container finished" podID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerID="143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313" exitCode=2 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.764031 4707 generic.go:334] "Generic (PLEG): container finished" podID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerID="8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec" exitCode=0 Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.764097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerDied","Data":"347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3"} Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.764201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerDied","Data":"143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313"} Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.764265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerDied","Data":"8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec"} Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.783246 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.784398 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.795656 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.796776 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-config-data\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-config-data\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpf8k\" (UniqueName: \"kubernetes.io/projected/3f213eea-c901-4586-b267-ee407ceddfc3-kube-api-access-dpf8k\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-config-data\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f213eea-c901-4586-b267-ee407ceddfc3-logs\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-scripts\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 
crc kubenswrapper[4707]: I0121 15:28:01.802937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvdr\" (UniqueName: \"kubernetes.io/projected/43a512b6-d97e-4b5e-8e28-74ecefe99134-kube-api-access-8jvdr\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9nn\" (UniqueName: \"kubernetes.io/projected/ab9ae62d-9e10-4db0-a711-28078f1ba665-kube-api-access-2s9nn\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.802993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a512b6-d97e-4b5e-8e28-74ecefe99134-logs\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.812592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.819349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-config-data\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.825899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-scripts\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.835428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9nn\" (UniqueName: \"kubernetes.io/projected/ab9ae62d-9e10-4db0-a711-28078f1ba665-kube-api-access-2s9nn\") pod \"nova-cell0-cell-mapping-zjhmd\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.848962 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.863618 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.865334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.882678 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.904963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-config-data\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-config-data\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f213eea-c901-4586-b267-ee407ceddfc3-logs\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvdr\" (UniqueName: \"kubernetes.io/projected/43a512b6-d97e-4b5e-8e28-74ecefe99134-kube-api-access-8jvdr\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a512b6-d97e-4b5e-8e28-74ecefe99134-logs\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-config-data\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckvg9\" (UniqueName: \"kubernetes.io/projected/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-kube-api-access-ckvg9\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpf8k\" (UniqueName: \"kubernetes.io/projected/3f213eea-c901-4586-b267-ee407ceddfc3-kube-api-access-dpf8k\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.905973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a512b6-d97e-4b5e-8e28-74ecefe99134-logs\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.907245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f213eea-c901-4586-b267-ee407ceddfc3-logs\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.912276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.912360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.912521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-config-data\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.912759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-config-data\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.926115 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.932828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpf8k\" (UniqueName: \"kubernetes.io/projected/3f213eea-c901-4586-b267-ee407ceddfc3-kube-api-access-dpf8k\") pod \"nova-metadata-0\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:01 crc kubenswrapper[4707]: I0121 15:28:01.933407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvdr\" (UniqueName: \"kubernetes.io/projected/43a512b6-d97e-4b5e-8e28-74ecefe99134-kube-api-access-8jvdr\") pod \"nova-api-0\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.006574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.007028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.007192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckvg9\" (UniqueName: \"kubernetes.io/projected/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-kube-api-access-ckvg9\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.007293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.007360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-config-data\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.007437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp89t\" (UniqueName: \"kubernetes.io/projected/0b70861f-5fa0-4bad-b1d6-174581d9d478-kube-api-access-fp89t\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.014369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.027009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-config-data\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.027484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.030452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckvg9\" (UniqueName: \"kubernetes.io/projected/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-kube-api-access-ckvg9\") pod \"nova-scheduler-0\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.049238 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.108719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.109053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp89t\" (UniqueName: \"kubernetes.io/projected/0b70861f-5fa0-4bad-b1d6-174581d9d478-kube-api-access-fp89t\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.109180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.112101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.112572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.125499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp89t\" (UniqueName: \"kubernetes.io/projected/0b70861f-5fa0-4bad-b1d6-174581d9d478-kube-api-access-fp89t\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.166955 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.185333 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.400692 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.514832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5677\" (UniqueName: \"kubernetes.io/projected/db84f146-2a89-41df-bf0a-e8bef54a64d9-kube-api-access-p5677\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-run-httpd\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-scripts\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-config-data\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-combined-ca-bundle\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-log-httpd\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-sg-core-conf-yaml\") pod \"db84f146-2a89-41df-bf0a-e8bef54a64d9\" (UID: \"db84f146-2a89-41df-bf0a-e8bef54a64d9\") " Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.515748 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.516085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.523174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db84f146-2a89-41df-bf0a-e8bef54a64d9-kube-api-access-p5677" (OuterVolumeSpecName: "kube-api-access-p5677") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). InnerVolumeSpecName "kube-api-access-p5677". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.529900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-scripts" (OuterVolumeSpecName: "scripts") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.532551 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.543958 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.623125 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5677\" (UniqueName: \"kubernetes.io/projected/db84f146-2a89-41df-bf0a-e8bef54a64d9-kube-api-access-p5677\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.623152 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.623162 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db84f146-2a89-41df-bf0a-e8bef54a64d9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.633440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.642022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.647501 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr"] Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.647868 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="sg-core" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.647879 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="sg-core" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.647892 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-central-agent" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.647898 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-central-agent" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.647926 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="proxy-httpd" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.647931 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="proxy-httpd" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.647944 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-notification-agent" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.647950 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-notification-agent" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.648091 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-central-agent" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.648104 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="proxy-httpd" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.648114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="ceilometer-notification-agent" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.648123 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerName="sg-core" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.648649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.650902 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.651099 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.652050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-config-data" (OuterVolumeSpecName: "config-data") pod "db84f146-2a89-41df-bf0a-e8bef54a64d9" (UID: "db84f146-2a89-41df-bf0a-e8bef54a64d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.664928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.669411 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.698837 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.706394 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-config-data\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtf8w\" (UniqueName: \"kubernetes.io/projected/c899fead-a20d-4d1e-ba2f-168c79e4c953-kube-api-access-xtf8w\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-scripts\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724292 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724307 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.724316 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db84f146-2a89-41df-bf0a-e8bef54a64d9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.776018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b70861f-5fa0-4bad-b1d6-174581d9d478","Type":"ContainerStarted","Data":"0626e456e472716be5016272e0d159cb1c88167a891c1899f9497231ecd92802"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.779201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" event={"ID":"ab9ae62d-9e10-4db0-a711-28078f1ba665","Type":"ContainerStarted","Data":"3503df076c1647da3089d817fbbad13aa54971586487b7b20c8c6ada30b2c149"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.780285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"43a512b6-d97e-4b5e-8e28-74ecefe99134","Type":"ContainerStarted","Data":"953996f157617cfe71a36945020e0e3a1d67f2f0ce81d0404707906301079f77"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.781680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3f213eea-c901-4586-b267-ee407ceddfc3","Type":"ContainerStarted","Data":"980afee481faa54aa1bd528328a85c4eb270eacb947811d29527437bf6183f3e"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.783772 4707 generic.go:334] "Generic (PLEG): container finished" podID="db84f146-2a89-41df-bf0a-e8bef54a64d9" containerID="9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628" exitCode=0 Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.783834 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.783849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerDied","Data":"9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.783869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db84f146-2a89-41df-bf0a-e8bef54a64d9","Type":"ContainerDied","Data":"72afeda33debe140662938cf01628880c528e6094bdf624669cb9121157c72a2"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.783884 4707 scope.go:117] "RemoveContainer" containerID="347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.785105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce","Type":"ContainerStarted","Data":"7f383b627e9f96e05184d29953c503bae98940c34590ba90ee113941c4c955c6"} Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.806059 4707 scope.go:117] "RemoveContainer" containerID="143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.825476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-config-data\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.825571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtf8w\" (UniqueName: \"kubernetes.io/projected/c899fead-a20d-4d1e-ba2f-168c79e4c953-kube-api-access-xtf8w\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.825620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.825710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-scripts\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.829681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-config-data\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.830993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.842364 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.848886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-scripts\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.849500 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.850964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtf8w\" (UniqueName: \"kubernetes.io/projected/c899fead-a20d-4d1e-ba2f-168c79e4c953-kube-api-access-xtf8w\") pod \"nova-cell1-conductor-db-sync-b6gqr\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.870517 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.871140 4707 scope.go:117] "RemoveContainer" containerID="9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.872282 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.873801 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.876349 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.878111 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.907351 4707 scope.go:117] "RemoveContainer" containerID="8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-run-httpd\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-config-data\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-log-httpd\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvx7r\" (UniqueName: \"kubernetes.io/projected/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-kube-api-access-tvx7r\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.927513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-scripts\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.931412 4707 scope.go:117] "RemoveContainer" 
containerID="347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.931696 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3\": container with ID starting with 347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3 not found: ID does not exist" containerID="347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.931732 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3"} err="failed to get container status \"347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3\": rpc error: code = NotFound desc = could not find container \"347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3\": container with ID starting with 347022c9abbae3152d3a6f918c7449cd4e8aea7b71b45960d333e5521a9d38f3 not found: ID does not exist" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.931754 4707 scope.go:117] "RemoveContainer" containerID="143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.932084 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313\": container with ID starting with 143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313 not found: ID does not exist" containerID="143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.932113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313"} err="failed to get container status \"143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313\": rpc error: code = NotFound desc = could not find container \"143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313\": container with ID starting with 143c3cf68bd5625d88d19c189a0c36fa5c07dacae494f4029ad6dfe4c17dd313 not found: ID does not exist" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.932135 4707 scope.go:117] "RemoveContainer" containerID="9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.932379 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628\": container with ID starting with 9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628 not found: ID does not exist" containerID="9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.932395 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628"} err="failed to get container status \"9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628\": rpc error: code = NotFound desc = could not find container \"9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628\": container with ID starting with 
9f783993639ca7392fb94f1530b82432d3923a9b4bbcf4bf25f5afdb5e238628 not found: ID does not exist" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.932412 4707 scope.go:117] "RemoveContainer" containerID="8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec" Jan 21 15:28:02 crc kubenswrapper[4707]: E0121 15:28:02.932583 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec\": container with ID starting with 8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec not found: ID does not exist" containerID="8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec" Jan 21 15:28:02 crc kubenswrapper[4707]: I0121 15:28:02.932600 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec"} err="failed to get container status \"8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec\": rpc error: code = NotFound desc = could not find container \"8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec\": container with ID starting with 8b485c9abda136df517546f92efb81df2944fe5c79dd92258ccffceeb41feaec not found: ID does not exist" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.028445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-scripts\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.028496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.028569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-run-httpd\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.028600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-config-data\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.028624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-log-httpd\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.028678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvx7r\" (UniqueName: \"kubernetes.io/projected/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-kube-api-access-tvx7r\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc 
kubenswrapper[4707]: I0121 15:28:03.028696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.029072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-run-httpd\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.029278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-log-httpd\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.031078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.034551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-scripts\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.035038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.038373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.038908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-config-data\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.049776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvx7r\" (UniqueName: \"kubernetes.io/projected/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-kube-api-access-tvx7r\") pod \"ceilometer-0\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.191764 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.192989 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db84f146-2a89-41df-bf0a-e8bef54a64d9" path="/var/lib/kubelet/pods/db84f146-2a89-41df-bf0a-e8bef54a64d9/volumes" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.466854 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr"] Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.672803 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:03 crc kubenswrapper[4707]: W0121 15:28:03.683183 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3532e287_d2df_4b9d_b5ff_e7fdcd35a3e9.slice/crio-2631a37910b998e0ab23421b9bf4f8525b7c97c1a3e62efd1813b22320e7d392 WatchSource:0}: Error finding container 2631a37910b998e0ab23421b9bf4f8525b7c97c1a3e62efd1813b22320e7d392: Status 404 returned error can't find the container with id 2631a37910b998e0ab23421b9bf4f8525b7c97c1a3e62efd1813b22320e7d392 Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.792474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b70861f-5fa0-4bad-b1d6-174581d9d478","Type":"ContainerStarted","Data":"766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.794716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerStarted","Data":"2631a37910b998e0ab23421b9bf4f8525b7c97c1a3e62efd1813b22320e7d392"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.797598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" event={"ID":"ab9ae62d-9e10-4db0-a711-28078f1ba665","Type":"ContainerStarted","Data":"4b415b2394f6a6b1a0194164dcb9f6acb2393c4d7e63cbac8c3202ab06d0df36"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.800793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"43a512b6-d97e-4b5e-8e28-74ecefe99134","Type":"ContainerStarted","Data":"086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.800835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"43a512b6-d97e-4b5e-8e28-74ecefe99134","Type":"ContainerStarted","Data":"3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.811775 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.8117630829999998 podStartE2EDuration="2.811763083s" podCreationTimestamp="2026-01-21 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:03.80780428 +0000 UTC m=+1580.989354847" watchObservedRunningTime="2026-01-21 15:28:03.811763083 +0000 UTC m=+1580.993279305" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.827253 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" 
podStartSLOduration=2.827241091 podStartE2EDuration="2.827241091s" podCreationTimestamp="2026-01-21 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:03.819562305 +0000 UTC m=+1581.001078528" watchObservedRunningTime="2026-01-21 15:28:03.827241091 +0000 UTC m=+1581.008757312" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.829358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" event={"ID":"c899fead-a20d-4d1e-ba2f-168c79e4c953","Type":"ContainerStarted","Data":"3db19e9379751f044b218011cb0a5dd0bb55284281358d73c960c33d0fdb73c9"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.829408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" event={"ID":"c899fead-a20d-4d1e-ba2f-168c79e4c953","Type":"ContainerStarted","Data":"dc33e13d18f5f45383eef50f12371543eb6cf72af02fa4d717a4a2dca694cab1"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.831591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3f213eea-c901-4586-b267-ee407ceddfc3","Type":"ContainerStarted","Data":"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.831668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3f213eea-c901-4586-b267-ee407ceddfc3","Type":"ContainerStarted","Data":"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.843486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce","Type":"ContainerStarted","Data":"04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a"} Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.844441 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.844421009 podStartE2EDuration="2.844421009s" podCreationTimestamp="2026-01-21 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:03.833588233 +0000 UTC m=+1581.015104475" watchObservedRunningTime="2026-01-21 15:28:03.844421009 +0000 UTC m=+1581.025937231" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.859185 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.8591627429999997 podStartE2EDuration="2.859162743s" podCreationTimestamp="2026-01-21 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:03.84761308 +0000 UTC m=+1581.029129322" watchObservedRunningTime="2026-01-21 15:28:03.859162743 +0000 UTC m=+1581.040678965" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.871903 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" podStartSLOduration=1.871891594 podStartE2EDuration="1.871891594s" podCreationTimestamp="2026-01-21 15:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-21 15:28:03.86529889 +0000 UTC m=+1581.046815112" watchObservedRunningTime="2026-01-21 15:28:03.871891594 +0000 UTC m=+1581.053407815" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.884025 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.8840013300000003 podStartE2EDuration="2.88400133s" podCreationTimestamp="2026-01-21 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:03.88161328 +0000 UTC m=+1581.063129512" watchObservedRunningTime="2026-01-21 15:28:03.88400133 +0000 UTC m=+1581.065517552" Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.928523 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:03 crc kubenswrapper[4707]: I0121 15:28:03.941061 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:04 crc kubenswrapper[4707]: I0121 15:28:04.850947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerStarted","Data":"7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884"} Jan 21 15:28:05 crc kubenswrapper[4707]: I0121 15:28:05.859748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerStarted","Data":"f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126"} Jan 21 15:28:05 crc kubenswrapper[4707]: I0121 15:28:05.859890 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-log" containerID="cri-o://1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492" gracePeriod=30 Jan 21 15:28:05 crc kubenswrapper[4707]: I0121 15:28:05.859930 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-metadata" containerID="cri-o://93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d" gracePeriod=30 Jan 21 15:28:05 crc kubenswrapper[4707]: I0121 15:28:05.860002 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="0b70861f-5fa0-4bad-b1d6-174581d9d478" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf" gracePeriod=30 Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.323886 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.392421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpf8k\" (UniqueName: \"kubernetes.io/projected/3f213eea-c901-4586-b267-ee407ceddfc3-kube-api-access-dpf8k\") pod \"3f213eea-c901-4586-b267-ee407ceddfc3\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.392492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-combined-ca-bundle\") pod \"3f213eea-c901-4586-b267-ee407ceddfc3\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.392702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-config-data\") pod \"3f213eea-c901-4586-b267-ee407ceddfc3\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.393362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f213eea-c901-4586-b267-ee407ceddfc3-logs\") pod \"3f213eea-c901-4586-b267-ee407ceddfc3\" (UID: \"3f213eea-c901-4586-b267-ee407ceddfc3\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.393787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f213eea-c901-4586-b267-ee407ceddfc3-logs" (OuterVolumeSpecName: "logs") pod "3f213eea-c901-4586-b267-ee407ceddfc3" (UID: "3f213eea-c901-4586-b267-ee407ceddfc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.394085 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f213eea-c901-4586-b267-ee407ceddfc3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.411469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f213eea-c901-4586-b267-ee407ceddfc3-kube-api-access-dpf8k" (OuterVolumeSpecName: "kube-api-access-dpf8k") pod "3f213eea-c901-4586-b267-ee407ceddfc3" (UID: "3f213eea-c901-4586-b267-ee407ceddfc3"). InnerVolumeSpecName "kube-api-access-dpf8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.440068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f213eea-c901-4586-b267-ee407ceddfc3" (UID: "3f213eea-c901-4586-b267-ee407ceddfc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.441424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-config-data" (OuterVolumeSpecName: "config-data") pod "3f213eea-c901-4586-b267-ee407ceddfc3" (UID: "3f213eea-c901-4586-b267-ee407ceddfc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.482444 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.497683 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.497721 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpf8k\" (UniqueName: \"kubernetes.io/projected/3f213eea-c901-4586-b267-ee407ceddfc3-kube-api-access-dpf8k\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.497732 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f213eea-c901-4586-b267-ee407ceddfc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.599077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp89t\" (UniqueName: \"kubernetes.io/projected/0b70861f-5fa0-4bad-b1d6-174581d9d478-kube-api-access-fp89t\") pod \"0b70861f-5fa0-4bad-b1d6-174581d9d478\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.599125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-combined-ca-bundle\") pod \"0b70861f-5fa0-4bad-b1d6-174581d9d478\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.599151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-config-data\") pod \"0b70861f-5fa0-4bad-b1d6-174581d9d478\" (UID: \"0b70861f-5fa0-4bad-b1d6-174581d9d478\") " Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.605096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b70861f-5fa0-4bad-b1d6-174581d9d478-kube-api-access-fp89t" (OuterVolumeSpecName: "kube-api-access-fp89t") pod "0b70861f-5fa0-4bad-b1d6-174581d9d478" (UID: "0b70861f-5fa0-4bad-b1d6-174581d9d478"). InnerVolumeSpecName "kube-api-access-fp89t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.622104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b70861f-5fa0-4bad-b1d6-174581d9d478" (UID: "0b70861f-5fa0-4bad-b1d6-174581d9d478"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.624024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-config-data" (OuterVolumeSpecName: "config-data") pod "0b70861f-5fa0-4bad-b1d6-174581d9d478" (UID: "0b70861f-5fa0-4bad-b1d6-174581d9d478"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.702089 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp89t\" (UniqueName: \"kubernetes.io/projected/0b70861f-5fa0-4bad-b1d6-174581d9d478-kube-api-access-fp89t\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.702118 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.702144 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b70861f-5fa0-4bad-b1d6-174581d9d478-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.868521 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f213eea-c901-4586-b267-ee407ceddfc3" containerID="93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d" exitCode=0 Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.868550 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f213eea-c901-4586-b267-ee407ceddfc3" containerID="1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492" exitCode=143 Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.868623 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.871861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3f213eea-c901-4586-b267-ee407ceddfc3","Type":"ContainerDied","Data":"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d"} Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.871907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3f213eea-c901-4586-b267-ee407ceddfc3","Type":"ContainerDied","Data":"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492"} Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.871918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3f213eea-c901-4586-b267-ee407ceddfc3","Type":"ContainerDied","Data":"980afee481faa54aa1bd528328a85c4eb270eacb947811d29527437bf6183f3e"} Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.871935 4707 scope.go:117] "RemoveContainer" containerID="93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.873502 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b70861f-5fa0-4bad-b1d6-174581d9d478" containerID="766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf" exitCode=0 Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.873559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b70861f-5fa0-4bad-b1d6-174581d9d478","Type":"ContainerDied","Data":"766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf"} Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.873583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" 
event={"ID":"0b70861f-5fa0-4bad-b1d6-174581d9d478","Type":"ContainerDied","Data":"0626e456e472716be5016272e0d159cb1c88167a891c1899f9497231ecd92802"} Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.873623 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.875903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerStarted","Data":"03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60"} Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.898349 4707 scope.go:117] "RemoveContainer" containerID="1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.899935 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.916312 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.926707 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.934648 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.953869 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: E0121 15:28:06.954230 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b70861f-5fa0-4bad-b1d6-174581d9d478" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.954247 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b70861f-5fa0-4bad-b1d6-174581d9d478" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:28:06 crc kubenswrapper[4707]: E0121 15:28:06.954261 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-log" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.954268 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-log" Jan 21 15:28:06 crc kubenswrapper[4707]: E0121 15:28:06.954275 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-metadata" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.954280 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-metadata" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.954441 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-log" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.954468 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b70861f-5fa0-4bad-b1d6-174581d9d478" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.954478 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" containerName="nova-metadata-metadata" Jan 
21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.955016 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.956901 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.957623 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.957657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.963916 4707 scope.go:117] "RemoveContainer" containerID="93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d" Jan 21 15:28:06 crc kubenswrapper[4707]: E0121 15:28:06.964515 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d\": container with ID starting with 93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d not found: ID does not exist" containerID="93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.964554 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d"} err="failed to get container status \"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d\": rpc error: code = NotFound desc = could not find container \"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d\": container with ID starting with 93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d not found: ID does not exist" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.964579 4707 scope.go:117] "RemoveContainer" containerID="1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492" Jan 21 15:28:06 crc kubenswrapper[4707]: E0121 15:28:06.965940 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492\": container with ID starting with 1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492 not found: ID does not exist" containerID="1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.965969 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492"} err="failed to get container status \"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492\": rpc error: code = NotFound desc = could not find container \"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492\": container with ID starting with 1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492 not found: ID does not exist" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.965989 4707 scope.go:117] "RemoveContainer" containerID="93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.966432 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d"} err="failed to get container status \"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d\": rpc error: code = NotFound desc = could not find container \"93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d\": container with ID starting with 93461b467b6967a20837c63edc9d781bad38c7e85e50847d823d9fa35f0d3c4d not found: ID does not exist" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.966455 4707 scope.go:117] "RemoveContainer" containerID="1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.966800 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492"} err="failed to get container status \"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492\": rpc error: code = NotFound desc = could not find container \"1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492\": container with ID starting with 1c68b102fdb7c702acae2cc48349e73de24d9e41ead9f9bef4ca630500485492 not found: ID does not exist" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.966837 4707 scope.go:117] "RemoveContainer" containerID="766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.968849 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.969986 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.976216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.976339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.983725 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:06 crc kubenswrapper[4707]: I0121 15:28:06.990506 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.006566 4707 scope.go:117] "RemoveContainer" containerID="766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf" Jan 21 15:28:07 crc kubenswrapper[4707]: E0121 15:28:07.006977 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf\": container with ID starting with 766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf not found: ID does not exist" containerID="766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007008 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf"} err="failed to get container status \"766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf\": rpc error: code = NotFound desc = could not find container 
\"766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf\": container with ID starting with 766153fb8b67d919ea1abb1f4d657eb0582b463b9f0520f89b7b08ff277f3baf not found: ID does not exist" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bl57\" (UniqueName: \"kubernetes.io/projected/1cd14261-de88-46af-aace-fd8b860f6522-kube-api-access-6bl57\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-config-data\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007826 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cd14261-de88-46af-aace-fd8b860f6522-logs\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007954 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkrwc\" (UniqueName: \"kubernetes.io/projected/184ed54c-4ce8-4d12-ae34-255b44708d27-kube-api-access-qkrwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.007988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.109569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.109900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-config-data\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cd14261-de88-46af-aace-fd8b860f6522-logs\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkrwc\" (UniqueName: \"kubernetes.io/projected/184ed54c-4ce8-4d12-ae34-255b44708d27-kube-api-access-qkrwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110757 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.110784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bl57\" (UniqueName: \"kubernetes.io/projected/1cd14261-de88-46af-aace-fd8b860f6522-kube-api-access-6bl57\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.111043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cd14261-de88-46af-aace-fd8b860f6522-logs\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.111233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.113143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-config-data\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.114546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.115028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.116014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.118109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.118501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.120947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.125062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkrwc\" (UniqueName: \"kubernetes.io/projected/184ed54c-4ce8-4d12-ae34-255b44708d27-kube-api-access-qkrwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.128115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bl57\" (UniqueName: \"kubernetes.io/projected/1cd14261-de88-46af-aace-fd8b860f6522-kube-api-access-6bl57\") pod \"nova-metadata-0\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.168304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.192364 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b70861f-5fa0-4bad-b1d6-174581d9d478" path="/var/lib/kubelet/pods/0b70861f-5fa0-4bad-b1d6-174581d9d478/volumes" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.192991 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f213eea-c901-4586-b267-ee407ceddfc3" path="/var/lib/kubelet/pods/3f213eea-c901-4586-b267-ee407ceddfc3/volumes" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.278228 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.298153 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.687746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.744280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:07 crc kubenswrapper[4707]: W0121 15:28:07.756279 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cd14261_de88_46af_aace_fd8b860f6522.slice/crio-d4491a9f7f528ae71e4eeac0c9078221eb3644405f25cfe978162f6d7fd6e524 WatchSource:0}: Error finding container d4491a9f7f528ae71e4eeac0c9078221eb3644405f25cfe978162f6d7fd6e524: Status 404 returned error can't find the container with id d4491a9f7f528ae71e4eeac0c9078221eb3644405f25cfe978162f6d7fd6e524 Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.887081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"184ed54c-4ce8-4d12-ae34-255b44708d27","Type":"ContainerStarted","Data":"d307e13550eef112188a32aad9f6cbeff277a1d8ec256fe5d4efc8f52d5a4c77"} Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.890676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1cd14261-de88-46af-aace-fd8b860f6522","Type":"ContainerStarted","Data":"d4491a9f7f528ae71e4eeac0c9078221eb3644405f25cfe978162f6d7fd6e524"} Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.893540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerStarted","Data":"bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6"} Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.894966 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.896802 4707 generic.go:334] "Generic (PLEG): container finished" podID="c899fead-a20d-4d1e-ba2f-168c79e4c953" containerID="3db19e9379751f044b218011cb0a5dd0bb55284281358d73c960c33d0fdb73c9" exitCode=0 Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.897647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" event={"ID":"c899fead-a20d-4d1e-ba2f-168c79e4c953","Type":"ContainerDied","Data":"3db19e9379751f044b218011cb0a5dd0bb55284281358d73c960c33d0fdb73c9"} Jan 21 15:28:07 crc kubenswrapper[4707]: I0121 15:28:07.916380 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.534203931 podStartE2EDuration="5.916361968s" podCreationTimestamp="2026-01-21 15:28:02 +0000 UTC" firstStartedPulling="2026-01-21 15:28:03.68529778 +0000 UTC m=+1580.866814002" lastFinishedPulling="2026-01-21 15:28:07.067455817 +0000 UTC m=+1584.248972039" observedRunningTime="2026-01-21 15:28:07.911826621 +0000 UTC m=+1585.093342843" watchObservedRunningTime="2026-01-21 15:28:07.916361968 +0000 UTC m=+1585.097878190" Jan 21 15:28:08 crc kubenswrapper[4707]: I0121 15:28:08.911686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"1cd14261-de88-46af-aace-fd8b860f6522","Type":"ContainerStarted","Data":"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e"} Jan 21 15:28:08 crc kubenswrapper[4707]: I0121 15:28:08.912619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1cd14261-de88-46af-aace-fd8b860f6522","Type":"ContainerStarted","Data":"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0"} Jan 21 15:28:08 crc kubenswrapper[4707]: I0121 15:28:08.915859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"184ed54c-4ce8-4d12-ae34-255b44708d27","Type":"ContainerStarted","Data":"fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121"} Jan 21 15:28:08 crc kubenswrapper[4707]: I0121 15:28:08.929931 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.929916615 podStartE2EDuration="2.929916615s" podCreationTimestamp="2026-01-21 15:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:08.924091354 +0000 UTC m=+1586.105607576" watchObservedRunningTime="2026-01-21 15:28:08.929916615 +0000 UTC m=+1586.111432837" Jan 21 15:28:08 crc kubenswrapper[4707]: I0121 15:28:08.949637 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.94962111 podStartE2EDuration="2.94962111s" podCreationTimestamp="2026-01-21 15:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:08.938588068 +0000 UTC m=+1586.120104290" watchObservedRunningTime="2026-01-21 15:28:08.94962111 +0000 UTC m=+1586.131137332" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.292085 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.361957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-config-data\") pod \"c899fead-a20d-4d1e-ba2f-168c79e4c953\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.362190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-combined-ca-bundle\") pod \"c899fead-a20d-4d1e-ba2f-168c79e4c953\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.362281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-scripts\") pod \"c899fead-a20d-4d1e-ba2f-168c79e4c953\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.362829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtf8w\" (UniqueName: \"kubernetes.io/projected/c899fead-a20d-4d1e-ba2f-168c79e4c953-kube-api-access-xtf8w\") pod \"c899fead-a20d-4d1e-ba2f-168c79e4c953\" (UID: \"c899fead-a20d-4d1e-ba2f-168c79e4c953\") " Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.367992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-scripts" (OuterVolumeSpecName: "scripts") pod "c899fead-a20d-4d1e-ba2f-168c79e4c953" (UID: "c899fead-a20d-4d1e-ba2f-168c79e4c953"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.368432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c899fead-a20d-4d1e-ba2f-168c79e4c953-kube-api-access-xtf8w" (OuterVolumeSpecName: "kube-api-access-xtf8w") pod "c899fead-a20d-4d1e-ba2f-168c79e4c953" (UID: "c899fead-a20d-4d1e-ba2f-168c79e4c953"). InnerVolumeSpecName "kube-api-access-xtf8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.388567 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c899fead-a20d-4d1e-ba2f-168c79e4c953" (UID: "c899fead-a20d-4d1e-ba2f-168c79e4c953"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.390950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-config-data" (OuterVolumeSpecName: "config-data") pod "c899fead-a20d-4d1e-ba2f-168c79e4c953" (UID: "c899fead-a20d-4d1e-ba2f-168c79e4c953"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.465303 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.465335 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.465345 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtf8w\" (UniqueName: \"kubernetes.io/projected/c899fead-a20d-4d1e-ba2f-168c79e4c953-kube-api-access-xtf8w\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.465356 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c899fead-a20d-4d1e-ba2f-168c79e4c953-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.923633 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab9ae62d-9e10-4db0-a711-28078f1ba665" containerID="4b415b2394f6a6b1a0194164dcb9f6acb2393c4d7e63cbac8c3202ab06d0df36" exitCode=0 Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.923688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" event={"ID":"ab9ae62d-9e10-4db0-a711-28078f1ba665","Type":"ContainerDied","Data":"4b415b2394f6a6b1a0194164dcb9f6acb2393c4d7e63cbac8c3202ab06d0df36"} Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.925698 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.929731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr" event={"ID":"c899fead-a20d-4d1e-ba2f-168c79e4c953","Type":"ContainerDied","Data":"dc33e13d18f5f45383eef50f12371543eb6cf72af02fa4d717a4a2dca694cab1"} Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.929758 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc33e13d18f5f45383eef50f12371543eb6cf72af02fa4d717a4a2dca694cab1" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.985197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:28:09 crc kubenswrapper[4707]: E0121 15:28:09.985484 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c899fead-a20d-4d1e-ba2f-168c79e4c953" containerName="nova-cell1-conductor-db-sync" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.985501 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c899fead-a20d-4d1e-ba2f-168c79e4c953" containerName="nova-cell1-conductor-db-sync" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.985658 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c899fead-a20d-4d1e-ba2f-168c79e4c953" containerName="nova-cell1-conductor-db-sync" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.986139 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.987465 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:28:09 crc kubenswrapper[4707]: I0121 15:28:09.996072 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.076882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2zd\" (UniqueName: \"kubernetes.io/projected/efb1c248-d3f2-4038-8f84-01a4de59d39e-kube-api-access-9h2zd\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.077132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.077285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.178570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.178642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.178766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2zd\" (UniqueName: \"kubernetes.io/projected/efb1c248-d3f2-4038-8f84-01a4de59d39e-kube-api-access-9h2zd\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.191407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.192310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.197193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2zd\" (UniqueName: \"kubernetes.io/projected/efb1c248-d3f2-4038-8f84-01a4de59d39e-kube-api-access-9h2zd\") pod \"nova-cell1-conductor-0\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.304916 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.698364 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.934157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"efb1c248-d3f2-4038-8f84-01a4de59d39e","Type":"ContainerStarted","Data":"056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7"} Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.934438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"efb1c248-d3f2-4038-8f84-01a4de59d39e","Type":"ContainerStarted","Data":"b1f69fa4770e07211e077fa247a73d82e466f808d4f1c9ed4a53bfe3a403d016"} Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.934586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:10 crc kubenswrapper[4707]: I0121 15:28:10.951470 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.951457582 podStartE2EDuration="1.951457582s" podCreationTimestamp="2026-01-21 15:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:10.944856453 +0000 UTC m=+1588.126372675" watchObservedRunningTime="2026-01-21 15:28:10.951457582 +0000 UTC m=+1588.132973795" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.225687 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.302332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s9nn\" (UniqueName: \"kubernetes.io/projected/ab9ae62d-9e10-4db0-a711-28078f1ba665-kube-api-access-2s9nn\") pod \"ab9ae62d-9e10-4db0-a711-28078f1ba665\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.302417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-combined-ca-bundle\") pod \"ab9ae62d-9e10-4db0-a711-28078f1ba665\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.302563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-config-data\") pod \"ab9ae62d-9e10-4db0-a711-28078f1ba665\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.302593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-scripts\") pod \"ab9ae62d-9e10-4db0-a711-28078f1ba665\" (UID: \"ab9ae62d-9e10-4db0-a711-28078f1ba665\") " Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.314905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9ae62d-9e10-4db0-a711-28078f1ba665-kube-api-access-2s9nn" (OuterVolumeSpecName: "kube-api-access-2s9nn") pod "ab9ae62d-9e10-4db0-a711-28078f1ba665" (UID: "ab9ae62d-9e10-4db0-a711-28078f1ba665"). InnerVolumeSpecName "kube-api-access-2s9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.315178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-scripts" (OuterVolumeSpecName: "scripts") pod "ab9ae62d-9e10-4db0-a711-28078f1ba665" (UID: "ab9ae62d-9e10-4db0-a711-28078f1ba665"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.323683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab9ae62d-9e10-4db0-a711-28078f1ba665" (UID: "ab9ae62d-9e10-4db0-a711-28078f1ba665"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.329578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-config-data" (OuterVolumeSpecName: "config-data") pod "ab9ae62d-9e10-4db0-a711-28078f1ba665" (UID: "ab9ae62d-9e10-4db0-a711-28078f1ba665"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.404286 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s9nn\" (UniqueName: \"kubernetes.io/projected/ab9ae62d-9e10-4db0-a711-28078f1ba665-kube-api-access-2s9nn\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.404313 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.404323 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.404331 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ae62d-9e10-4db0-a711-28078f1ba665-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.944575 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.945275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd" event={"ID":"ab9ae62d-9e10-4db0-a711-28078f1ba665","Type":"ContainerDied","Data":"3503df076c1647da3089d817fbbad13aa54971586487b7b20c8c6ada30b2c149"} Jan 21 15:28:11 crc kubenswrapper[4707]: I0121 15:28:11.945309 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3503df076c1647da3089d817fbbad13aa54971586487b7b20c8c6ada30b2c149" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.028599 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.028650 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.122495 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.122685 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" containerName="nova-scheduler-scheduler" containerID="cri-o://04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a" gracePeriod=30 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.140331 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.155262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.155564 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-log" containerID="cri-o://7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0" gracePeriod=30 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.156123 4707 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-metadata" containerID="cri-o://ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e" gracePeriod=30 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.278962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.298213 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.298253 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.685081 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.831972 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bl57\" (UniqueName: \"kubernetes.io/projected/1cd14261-de88-46af-aace-fd8b860f6522-kube-api-access-6bl57\") pod \"1cd14261-de88-46af-aace-fd8b860f6522\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.832145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-nova-metadata-tls-certs\") pod \"1cd14261-de88-46af-aace-fd8b860f6522\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.832223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-config-data\") pod \"1cd14261-de88-46af-aace-fd8b860f6522\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.832292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-combined-ca-bundle\") pod \"1cd14261-de88-46af-aace-fd8b860f6522\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.832334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cd14261-de88-46af-aace-fd8b860f6522-logs\") pod \"1cd14261-de88-46af-aace-fd8b860f6522\" (UID: \"1cd14261-de88-46af-aace-fd8b860f6522\") " Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.834136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd14261-de88-46af-aace-fd8b860f6522-logs" (OuterVolumeSpecName: "logs") pod "1cd14261-de88-46af-aace-fd8b860f6522" (UID: "1cd14261-de88-46af-aace-fd8b860f6522"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.842477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd14261-de88-46af-aace-fd8b860f6522-kube-api-access-6bl57" (OuterVolumeSpecName: "kube-api-access-6bl57") pod "1cd14261-de88-46af-aace-fd8b860f6522" (UID: "1cd14261-de88-46af-aace-fd8b860f6522"). 
InnerVolumeSpecName "kube-api-access-6bl57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.855655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-config-data" (OuterVolumeSpecName: "config-data") pod "1cd14261-de88-46af-aace-fd8b860f6522" (UID: "1cd14261-de88-46af-aace-fd8b860f6522"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.865850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cd14261-de88-46af-aace-fd8b860f6522" (UID: "1cd14261-de88-46af-aace-fd8b860f6522"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.884205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1cd14261-de88-46af-aace-fd8b860f6522" (UID: "1cd14261-de88-46af-aace-fd8b860f6522"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.934471 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.934496 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.934506 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd14261-de88-46af-aace-fd8b860f6522-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.934514 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cd14261-de88-46af-aace-fd8b860f6522-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.934522 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bl57\" (UniqueName: \"kubernetes.io/projected/1cd14261-de88-46af-aace-fd8b860f6522-kube-api-access-6bl57\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.956374 4707 generic.go:334] "Generic (PLEG): container finished" podID="1cd14261-de88-46af-aace-fd8b860f6522" containerID="ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e" exitCode=0 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.956399 4707 generic.go:334] "Generic (PLEG): container finished" podID="1cd14261-de88-46af-aace-fd8b860f6522" containerID="7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0" exitCode=143 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957073 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1cd14261-de88-46af-aace-fd8b860f6522","Type":"ContainerDied","Data":"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e"} Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1cd14261-de88-46af-aace-fd8b860f6522","Type":"ContainerDied","Data":"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0"} Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1cd14261-de88-46af-aace-fd8b860f6522","Type":"ContainerDied","Data":"d4491a9f7f528ae71e4eeac0c9078221eb3644405f25cfe978162f6d7fd6e524"} Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957156 4707 scope.go:117] "RemoveContainer" containerID="ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957431 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-log" containerID="cri-o://3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8" gracePeriod=30 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.957462 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-api" containerID="cri-o://086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd" gracePeriod=30 Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.963639 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": EOF" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.974061 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": EOF" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.981003 4707 scope.go:117] "RemoveContainer" containerID="7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0" Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.988933 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:12 crc kubenswrapper[4707]: I0121 15:28:12.995826 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.004974 4707 scope.go:117] "RemoveContainer" containerID="ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.008074 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:13 crc kubenswrapper[4707]: E0121 15:28:13.013150 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e\": container with ID starting with ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e not found: ID does not exist" containerID="ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.013206 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e"} err="failed to get container status \"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e\": rpc error: code = NotFound desc = could not find container \"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e\": container with ID starting with ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e not found: ID does not exist" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.013231 4707 scope.go:117] "RemoveContainer" containerID="7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0" Jan 21 15:28:13 crc kubenswrapper[4707]: E0121 15:28:13.015384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-metadata" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.015408 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-metadata" Jan 21 15:28:13 crc kubenswrapper[4707]: E0121 15:28:13.015430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-log" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.015436 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-log" Jan 21 15:28:13 crc kubenswrapper[4707]: E0121 15:28:13.015451 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9ae62d-9e10-4db0-a711-28078f1ba665" containerName="nova-manage" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.015457 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9ae62d-9e10-4db0-a711-28078f1ba665" containerName="nova-manage" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.015652 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-log" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.015670 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9ae62d-9e10-4db0-a711-28078f1ba665" containerName="nova-manage" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.015684 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd14261-de88-46af-aace-fd8b860f6522" containerName="nova-metadata-metadata" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.016516 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.019066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.019289 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.020783 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:13 crc kubenswrapper[4707]: E0121 15:28:13.022473 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0\": container with ID starting with 7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0 not found: ID does not exist" containerID="7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.022502 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0"} err="failed to get container status \"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0\": rpc error: code = NotFound desc = could not find container \"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0\": container with ID starting with 7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0 not found: ID does not exist" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.022525 4707 scope.go:117] "RemoveContainer" containerID="ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.026900 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e"} err="failed to get container status \"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e\": rpc error: code = NotFound desc = could not find container \"ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e\": container with ID starting with ae731ec196659eaa483e7525f7e62c216ad1f9259ae6be9eb656b492bbc7a18e not found: ID does not exist" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.026929 4707 scope.go:117] "RemoveContainer" containerID="7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.028774 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0"} err="failed to get container status \"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0\": rpc error: code = NotFound desc = could not find container \"7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0\": container with ID starting with 7a0b082613e2a767ad3588697672a5b58b0882e7196173726ea32364eca217f0 not found: ID does not exist" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.139035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-config-data\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.139366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.139488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5484b931-622d-4cfb-9610-5bebf742a351-logs\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.139620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qdc\" (UniqueName: \"kubernetes.io/projected/5484b931-622d-4cfb-9610-5bebf742a351-kube-api-access-b4qdc\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.139924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.192783 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd14261-de88-46af-aace-fd8b860f6522" path="/var/lib/kubelet/pods/1cd14261-de88-46af-aace-fd8b860f6522/volumes" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.242129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-config-data\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.242224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.242253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5484b931-622d-4cfb-9610-5bebf742a351-logs\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.242291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qdc\" (UniqueName: \"kubernetes.io/projected/5484b931-622d-4cfb-9610-5bebf742a351-kube-api-access-b4qdc\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.242323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.243679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5484b931-622d-4cfb-9610-5bebf742a351-logs\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.246317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.246488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.246591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-config-data\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.256665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qdc\" (UniqueName: \"kubernetes.io/projected/5484b931-622d-4cfb-9610-5bebf742a351-kube-api-access-b4qdc\") pod \"nova-metadata-0\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.340266 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.764752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.966966 4707 generic.go:334] "Generic (PLEG): container finished" podID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerID="3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8" exitCode=143 Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.967835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"43a512b6-d97e-4b5e-8e28-74ecefe99134","Type":"ContainerDied","Data":"3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8"} Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.973322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5484b931-622d-4cfb-9610-5bebf742a351","Type":"ContainerStarted","Data":"b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5"} Jan 21 15:28:13 crc kubenswrapper[4707]: I0121 15:28:13.973374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5484b931-622d-4cfb-9610-5bebf742a351","Type":"ContainerStarted","Data":"ce56b051c2f5145d7d1388be05aa1ad51df53d98e9ff6cbca176c1f9b6651622"} Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.521728 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.572126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-combined-ca-bundle\") pod \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.572192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckvg9\" (UniqueName: \"kubernetes.io/projected/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-kube-api-access-ckvg9\") pod \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.572354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-config-data\") pod \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\" (UID: \"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce\") " Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.577697 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-kube-api-access-ckvg9" (OuterVolumeSpecName: "kube-api-access-ckvg9") pod "bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" (UID: "bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce"). InnerVolumeSpecName "kube-api-access-ckvg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.592953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" (UID: "bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.594529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-config-data" (OuterVolumeSpecName: "config-data") pod "bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" (UID: "bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.674881 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.674911 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.674924 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckvg9\" (UniqueName: \"kubernetes.io/projected/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce-kube-api-access-ckvg9\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.985112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5484b931-622d-4cfb-9610-5bebf742a351","Type":"ContainerStarted","Data":"dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599"} Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.986700 4707 generic.go:334] "Generic (PLEG): container finished" podID="bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" containerID="04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a" exitCode=0 Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.986750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce","Type":"ContainerDied","Data":"04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a"} Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.986825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce","Type":"ContainerDied","Data":"7f383b627e9f96e05184d29953c503bae98940c34590ba90ee113941c4c955c6"} Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.986851 4707 scope.go:117] "RemoveContainer" containerID="04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a" Jan 21 15:28:14 crc kubenswrapper[4707]: I0121 15:28:14.987029 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.004006 4707 scope.go:117] "RemoveContainer" containerID="04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a" Jan 21 15:28:15 crc kubenswrapper[4707]: E0121 15:28:15.007095 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a\": container with ID starting with 04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a not found: ID does not exist" containerID="04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.007137 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a"} err="failed to get container status \"04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a\": rpc error: code = NotFound desc = could not find container \"04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a\": container with ID starting with 04c61371bf46477eec24ae0c323371de8baece53c550d8b733a577d07632645a not found: ID does not exist" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.009061 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.009045639 podStartE2EDuration="3.009045639s" podCreationTimestamp="2026-01-21 15:28:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:15.001145415 +0000 UTC m=+1592.182661638" watchObservedRunningTime="2026-01-21 15:28:15.009045639 +0000 UTC m=+1592.190561860" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.020684 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.032113 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.039845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:15 crc kubenswrapper[4707]: E0121 15:28:15.040190 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" containerName="nova-scheduler-scheduler" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.040210 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" containerName="nova-scheduler-scheduler" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.040400 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" containerName="nova-scheduler-scheduler" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.040961 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.042440 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.047668 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.182835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-config-data\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.183066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff82n\" (UniqueName: \"kubernetes.io/projected/c980b301-ad6b-464f-9aef-5079e5dfbe69-kube-api-access-ff82n\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.183210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.191259 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce" path="/var/lib/kubelet/pods/bea8bb1b-ce0b-4c88-8349-bbbefd4d11ce/volumes" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.285291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-config-data\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.285600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff82n\" (UniqueName: \"kubernetes.io/projected/c980b301-ad6b-464f-9aef-5079e5dfbe69-kube-api-access-ff82n\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.285646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.288832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.289380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-config-data\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.299904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff82n\" (UniqueName: \"kubernetes.io/projected/c980b301-ad6b-464f-9aef-5079e5dfbe69-kube-api-access-ff82n\") pod \"nova-scheduler-0\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.325211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.355350 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:15 crc kubenswrapper[4707]: I0121 15:28:15.727254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:15 crc kubenswrapper[4707]: W0121 15:28:15.728886 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc980b301_ad6b_464f_9aef_5079e5dfbe69.slice/crio-6ca29ac562be2d735bdc0bba8a78e888c525ba410112fc2be192fa50e6ac3dde WatchSource:0}: Error finding container 6ca29ac562be2d735bdc0bba8a78e888c525ba410112fc2be192fa50e6ac3dde: Status 404 returned error can't find the container with id 6ca29ac562be2d735bdc0bba8a78e888c525ba410112fc2be192fa50e6ac3dde Jan 21 15:28:16 crc kubenswrapper[4707]: I0121 15:28:16.000321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c980b301-ad6b-464f-9aef-5079e5dfbe69","Type":"ContainerStarted","Data":"4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee"} Jan 21 15:28:16 crc kubenswrapper[4707]: I0121 15:28:16.000592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c980b301-ad6b-464f-9aef-5079e5dfbe69","Type":"ContainerStarted","Data":"6ca29ac562be2d735bdc0bba8a78e888c525ba410112fc2be192fa50e6ac3dde"} Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.279376 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.293974 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.310609 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.310594706 podStartE2EDuration="2.310594706s" podCreationTimestamp="2026-01-21 15:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:16.012450825 +0000 UTC m=+1593.193967047" watchObservedRunningTime="2026-01-21 15:28:17.310594706 +0000 UTC m=+1594.492110928" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.672941 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.729762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a512b6-d97e-4b5e-8e28-74ecefe99134-logs\") pod \"43a512b6-d97e-4b5e-8e28-74ecefe99134\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.729841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-combined-ca-bundle\") pod \"43a512b6-d97e-4b5e-8e28-74ecefe99134\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.729882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-config-data\") pod \"43a512b6-d97e-4b5e-8e28-74ecefe99134\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.729960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jvdr\" (UniqueName: \"kubernetes.io/projected/43a512b6-d97e-4b5e-8e28-74ecefe99134-kube-api-access-8jvdr\") pod \"43a512b6-d97e-4b5e-8e28-74ecefe99134\" (UID: \"43a512b6-d97e-4b5e-8e28-74ecefe99134\") " Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.730138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a512b6-d97e-4b5e-8e28-74ecefe99134-logs" (OuterVolumeSpecName: "logs") pod "43a512b6-d97e-4b5e-8e28-74ecefe99134" (UID: "43a512b6-d97e-4b5e-8e28-74ecefe99134"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.730418 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a512b6-d97e-4b5e-8e28-74ecefe99134-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.733888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a512b6-d97e-4b5e-8e28-74ecefe99134-kube-api-access-8jvdr" (OuterVolumeSpecName: "kube-api-access-8jvdr") pod "43a512b6-d97e-4b5e-8e28-74ecefe99134" (UID: "43a512b6-d97e-4b5e-8e28-74ecefe99134"). InnerVolumeSpecName "kube-api-access-8jvdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.749300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43a512b6-d97e-4b5e-8e28-74ecefe99134" (UID: "43a512b6-d97e-4b5e-8e28-74ecefe99134"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.772067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-config-data" (OuterVolumeSpecName: "config-data") pod "43a512b6-d97e-4b5e-8e28-74ecefe99134" (UID: "43a512b6-d97e-4b5e-8e28-74ecefe99134"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.831845 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.831989 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a512b6-d97e-4b5e-8e28-74ecefe99134-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:17 crc kubenswrapper[4707]: I0121 15:28:17.832058 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jvdr\" (UniqueName: \"kubernetes.io/projected/43a512b6-d97e-4b5e-8e28-74ecefe99134-kube-api-access-8jvdr\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.019766 4707 generic.go:334] "Generic (PLEG): container finished" podID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerID="086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd" exitCode=0 Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.020950 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.028948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"43a512b6-d97e-4b5e-8e28-74ecefe99134","Type":"ContainerDied","Data":"086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd"} Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.028977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"43a512b6-d97e-4b5e-8e28-74ecefe99134","Type":"ContainerDied","Data":"953996f157617cfe71a36945020e0e3a1d67f2f0ce81d0404707906301079f77"} Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.028993 4707 scope.go:117] "RemoveContainer" containerID="086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.046835 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.047148 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.047953 4707 scope.go:117] "RemoveContainer" containerID="3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.056318 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.071313 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:18 crc kubenswrapper[4707]: E0121 15:28:18.071676 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-log" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.071693 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-log" Jan 21 15:28:18 crc kubenswrapper[4707]: E0121 15:28:18.071710 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-api" Jan 21 15:28:18 crc 
kubenswrapper[4707]: I0121 15:28:18.071716 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-api" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.071879 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-api" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.071900 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" containerName="nova-api-log" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.072659 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.074602 4707 scope.go:117] "RemoveContainer" containerID="086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd" Jan 21 15:28:18 crc kubenswrapper[4707]: E0121 15:28:18.074907 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd\": container with ID starting with 086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd not found: ID does not exist" containerID="086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.074932 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd"} err="failed to get container status \"086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd\": rpc error: code = NotFound desc = could not find container \"086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd\": container with ID starting with 086f6169ca89cdb5943005f7a5c7a83bfa42cafa60b864c25f724e5eb54b06bd not found: ID does not exist" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.074949 4707 scope.go:117] "RemoveContainer" containerID="3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8" Jan 21 15:28:18 crc kubenswrapper[4707]: E0121 15:28:18.075153 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8\": container with ID starting with 3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8 not found: ID does not exist" containerID="3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.075188 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8"} err="failed to get container status \"3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8\": rpc error: code = NotFound desc = could not find container \"3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8\": container with ID starting with 3858ed93018da42c60415cb59ce44ec128dcaa040496179fd4d2a5ec7ddab5d8 not found: ID does not exist" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.075953 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.098651 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.137553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznl6\" (UniqueName: \"kubernetes.io/projected/80026a7b-bf72-4cb9-9e62-297018be7c36-kube-api-access-tznl6\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.137635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-config-data\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.137766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.137846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80026a7b-bf72-4cb9-9e62-297018be7c36-logs\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.185960 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd"] Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.186925 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.188440 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.190389 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.191649 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd"] Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.239677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-config-data\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.239829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.239896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80026a7b-bf72-4cb9-9e62-297018be7c36-logs\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.239949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tznl6\" (UniqueName: \"kubernetes.io/projected/80026a7b-bf72-4cb9-9e62-297018be7c36-kube-api-access-tznl6\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.241486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80026a7b-bf72-4cb9-9e62-297018be7c36-logs\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.243876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.244016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-config-data\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.255205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznl6\" (UniqueName: \"kubernetes.io/projected/80026a7b-bf72-4cb9-9e62-297018be7c36-kube-api-access-tznl6\") pod \"nova-api-0\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 
15:28:18.340843 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.341558 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.341705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxlj\" (UniqueName: \"kubernetes.io/projected/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-kube-api-access-sfxlj\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.341734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.341851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-config-data\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.341893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-scripts\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.384762 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.443174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxlj\" (UniqueName: \"kubernetes.io/projected/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-kube-api-access-sfxlj\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.443207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.443267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-config-data\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.443291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-scripts\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.449513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-scripts\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.451938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.453383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-config-data\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.477269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxlj\" (UniqueName: \"kubernetes.io/projected/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-kube-api-access-sfxlj\") pod \"nova-cell1-cell-mapping-q6vqd\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.505137 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.897687 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:18 crc kubenswrapper[4707]: W0121 15:28:18.899971 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80026a7b_bf72_4cb9_9e62_297018be7c36.slice/crio-4dd9f36ed7383d4ddc22a1111ac7714de18f431414588f42a85f74f17236b81e WatchSource:0}: Error finding container 4dd9f36ed7383d4ddc22a1111ac7714de18f431414588f42a85f74f17236b81e: Status 404 returned error can't find the container with id 4dd9f36ed7383d4ddc22a1111ac7714de18f431414588f42a85f74f17236b81e Jan 21 15:28:18 crc kubenswrapper[4707]: I0121 15:28:18.966213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd"] Jan 21 15:28:18 crc kubenswrapper[4707]: W0121 15:28:18.968343 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1a438bc_0bc6_4de6_b01d_7376dbf25a5c.slice/crio-bc6c6e8f15081e78a162d2ce7126a132f6a1f28633acdbb63d46ad54afb1ef9a WatchSource:0}: Error finding container bc6c6e8f15081e78a162d2ce7126a132f6a1f28633acdbb63d46ad54afb1ef9a: Status 404 returned error can't find the container with id bc6c6e8f15081e78a162d2ce7126a132f6a1f28633acdbb63d46ad54afb1ef9a Jan 21 15:28:19 crc kubenswrapper[4707]: I0121 15:28:19.028427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" event={"ID":"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c","Type":"ContainerStarted","Data":"bc6c6e8f15081e78a162d2ce7126a132f6a1f28633acdbb63d46ad54afb1ef9a"} Jan 21 15:28:19 crc kubenswrapper[4707]: I0121 15:28:19.029687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"80026a7b-bf72-4cb9-9e62-297018be7c36","Type":"ContainerStarted","Data":"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c"} Jan 21 15:28:19 crc kubenswrapper[4707]: I0121 15:28:19.029729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"80026a7b-bf72-4cb9-9e62-297018be7c36","Type":"ContainerStarted","Data":"4dd9f36ed7383d4ddc22a1111ac7714de18f431414588f42a85f74f17236b81e"} Jan 21 15:28:19 crc kubenswrapper[4707]: I0121 15:28:19.190129 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a512b6-d97e-4b5e-8e28-74ecefe99134" path="/var/lib/kubelet/pods/43a512b6-d97e-4b5e-8e28-74ecefe99134/volumes" Jan 21 15:28:20 crc kubenswrapper[4707]: I0121 15:28:20.038211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" event={"ID":"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c","Type":"ContainerStarted","Data":"9301f08dba6fe0925acba1d6c85e32b4cc1ed0d37ee9a2c6412884e1b955f53d"} Jan 21 15:28:20 crc kubenswrapper[4707]: I0121 15:28:20.039637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"80026a7b-bf72-4cb9-9e62-297018be7c36","Type":"ContainerStarted","Data":"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11"} Jan 21 15:28:20 crc kubenswrapper[4707]: I0121 15:28:20.049727 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" 
podStartSLOduration=2.049712678 podStartE2EDuration="2.049712678s" podCreationTimestamp="2026-01-21 15:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:20.048892917 +0000 UTC m=+1597.230409139" watchObservedRunningTime="2026-01-21 15:28:20.049712678 +0000 UTC m=+1597.231228900" Jan 21 15:28:20 crc kubenswrapper[4707]: I0121 15:28:20.065156 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.0651446 podStartE2EDuration="2.0651446s" podCreationTimestamp="2026-01-21 15:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:20.060206116 +0000 UTC m=+1597.241722339" watchObservedRunningTime="2026-01-21 15:28:20.0651446 +0000 UTC m=+1597.246660823" Jan 21 15:28:20 crc kubenswrapper[4707]: I0121 15:28:20.355571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:23 crc kubenswrapper[4707]: I0121 15:28:23.060451 4707 generic.go:334] "Generic (PLEG): container finished" podID="b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" containerID="9301f08dba6fe0925acba1d6c85e32b4cc1ed0d37ee9a2c6412884e1b955f53d" exitCode=0 Jan 21 15:28:23 crc kubenswrapper[4707]: I0121 15:28:23.060527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" event={"ID":"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c","Type":"ContainerDied","Data":"9301f08dba6fe0925acba1d6c85e32b4cc1ed0d37ee9a2c6412884e1b955f53d"} Jan 21 15:28:23 crc kubenswrapper[4707]: I0121 15:28:23.341000 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:23 crc kubenswrapper[4707]: I0121 15:28:23.341195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.353198 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.353912 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.132:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.353931 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.132:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.428561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfxlj\" (UniqueName: \"kubernetes.io/projected/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-kube-api-access-sfxlj\") pod \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.428664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-combined-ca-bundle\") pod \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.428708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-config-data\") pod \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.428774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-scripts\") pod \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\" (UID: \"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c\") " Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.434861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-kube-api-access-sfxlj" (OuterVolumeSpecName: "kube-api-access-sfxlj") pod "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" (UID: "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c"). InnerVolumeSpecName "kube-api-access-sfxlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.434870 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-scripts" (OuterVolumeSpecName: "scripts") pod "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" (UID: "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.449461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" (UID: "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.450636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-config-data" (OuterVolumeSpecName: "config-data") pod "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" (UID: "b1a438bc-0bc6-4de6-b01d-7376dbf25a5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.531224 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfxlj\" (UniqueName: \"kubernetes.io/projected/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-kube-api-access-sfxlj\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.531247 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.531256 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:24 crc kubenswrapper[4707]: I0121 15:28:24.531265 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.075385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" event={"ID":"b1a438bc-0bc6-4de6-b01d-7376dbf25a5c","Type":"ContainerDied","Data":"bc6c6e8f15081e78a162d2ce7126a132f6a1f28633acdbb63d46ad54afb1ef9a"} Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.075423 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc6c6e8f15081e78a162d2ce7126a132f6a1f28633acdbb63d46ad54afb1ef9a" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.075443 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.232118 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.232327 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-log" containerID="cri-o://acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c" gracePeriod=30 Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.232384 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-api" containerID="cri-o://1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11" gracePeriod=30 Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.238529 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.238857 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c980b301-ad6b-464f-9aef-5079e5dfbe69" containerName="nova-scheduler-scheduler" containerID="cri-o://4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee" gracePeriod=30 Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.244496 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.244665 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-log" containerID="cri-o://b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5" gracePeriod=30 Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.244705 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-metadata" containerID="cri-o://dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599" gracePeriod=30 Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.716244 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.850347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tznl6\" (UniqueName: \"kubernetes.io/projected/80026a7b-bf72-4cb9-9e62-297018be7c36-kube-api-access-tznl6\") pod \"80026a7b-bf72-4cb9-9e62-297018be7c36\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.850391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-combined-ca-bundle\") pod \"80026a7b-bf72-4cb9-9e62-297018be7c36\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.850428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-config-data\") pod \"80026a7b-bf72-4cb9-9e62-297018be7c36\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.850487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80026a7b-bf72-4cb9-9e62-297018be7c36-logs\") pod \"80026a7b-bf72-4cb9-9e62-297018be7c36\" (UID: \"80026a7b-bf72-4cb9-9e62-297018be7c36\") " Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.850834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80026a7b-bf72-4cb9-9e62-297018be7c36-logs" (OuterVolumeSpecName: "logs") pod "80026a7b-bf72-4cb9-9e62-297018be7c36" (UID: "80026a7b-bf72-4cb9-9e62-297018be7c36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.851149 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80026a7b-bf72-4cb9-9e62-297018be7c36-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.854067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80026a7b-bf72-4cb9-9e62-297018be7c36-kube-api-access-tznl6" (OuterVolumeSpecName: "kube-api-access-tznl6") pod "80026a7b-bf72-4cb9-9e62-297018be7c36" (UID: "80026a7b-bf72-4cb9-9e62-297018be7c36"). InnerVolumeSpecName "kube-api-access-tznl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.869833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80026a7b-bf72-4cb9-9e62-297018be7c36" (UID: "80026a7b-bf72-4cb9-9e62-297018be7c36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.874381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-config-data" (OuterVolumeSpecName: "config-data") pod "80026a7b-bf72-4cb9-9e62-297018be7c36" (UID: "80026a7b-bf72-4cb9-9e62-297018be7c36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.953020 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tznl6\" (UniqueName: \"kubernetes.io/projected/80026a7b-bf72-4cb9-9e62-297018be7c36-kube-api-access-tznl6\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.953044 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:25 crc kubenswrapper[4707]: I0121 15:28:25.953052 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80026a7b-bf72-4cb9-9e62-297018be7c36-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.083127 4707 generic.go:334] "Generic (PLEG): container finished" podID="5484b931-622d-4cfb-9610-5bebf742a351" containerID="b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5" exitCode=143 Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.083208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5484b931-622d-4cfb-9610-5bebf742a351","Type":"ContainerDied","Data":"b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5"} Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084826 4707 generic.go:334] "Generic (PLEG): container finished" podID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerID="1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11" exitCode=0 Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084844 4707 generic.go:334] "Generic (PLEG): container finished" podID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerID="acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c" exitCode=143 Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084860 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"80026a7b-bf72-4cb9-9e62-297018be7c36","Type":"ContainerDied","Data":"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11"} Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"80026a7b-bf72-4cb9-9e62-297018be7c36","Type":"ContainerDied","Data":"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c"} Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"80026a7b-bf72-4cb9-9e62-297018be7c36","Type":"ContainerDied","Data":"4dd9f36ed7383d4ddc22a1111ac7714de18f431414588f42a85f74f17236b81e"} Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.084903 4707 scope.go:117] "RemoveContainer" containerID="1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.101268 4707 scope.go:117] "RemoveContainer" containerID="acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.109580 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.116070 4707 scope.go:117] "RemoveContainer" containerID="1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11" Jan 21 15:28:26 crc kubenswrapper[4707]: E0121 15:28:26.116407 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11\": container with ID starting with 1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11 not found: ID does not exist" containerID="1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.116430 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11"} err="failed to get container status \"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11\": rpc error: code = NotFound desc = could not find container \"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11\": container with ID starting with 1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11 not found: ID does not exist" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.116448 4707 scope.go:117] "RemoveContainer" containerID="acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c" Jan 21 15:28:26 crc kubenswrapper[4707]: E0121 15:28:26.118206 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c\": container with ID starting with acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c not found: ID does not exist" containerID="acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.118233 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c"} err="failed to get container status \"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c\": rpc error: code = NotFound desc = could not find container \"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c\": container with ID starting with acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c not found: ID does not exist" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.118251 4707 scope.go:117] "RemoveContainer" containerID="1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.118480 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11"} err="failed to get container status \"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11\": rpc error: code = NotFound desc = could not find container \"1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11\": container with ID starting with 1f0c7e763e6f970dec2b9838bc9a15e2be846dae99ace87087f656cc28a83f11 not found: ID does not exist" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.118500 4707 scope.go:117] "RemoveContainer" containerID="acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.119142 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c"} err="failed to get container status \"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c\": rpc error: code = NotFound desc = could not find container \"acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c\": container with ID starting with acf96eb27657c8f99cd1f500898d783a07b67d005e529de123b5da80086e7f0c not found: ID does not exist" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.121728 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132332 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:26 crc kubenswrapper[4707]: E0121 15:28:26.132655 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-log" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132674 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-log" Jan 21 15:28:26 crc kubenswrapper[4707]: E0121 15:28:26.132688 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-api" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-api" Jan 21 15:28:26 crc kubenswrapper[4707]: E0121 15:28:26.132726 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" containerName="nova-manage" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" containerName="nova-manage" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132896 
4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" containerName="nova-manage" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-log" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.132936 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" containerName="nova-api-api" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.133690 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.136659 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.138324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.257593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.257839 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-config-data\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.257924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-logs\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.258025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5g4g\" (UniqueName: \"kubernetes.io/projected/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-kube-api-access-c5g4g\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.359639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.359857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-config-data\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.359886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-logs\") pod 
\"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.359953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5g4g\" (UniqueName: \"kubernetes.io/projected/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-kube-api-access-c5g4g\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.360291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-logs\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.363375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-config-data\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.370483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.372154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5g4g\" (UniqueName: \"kubernetes.io/projected/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-kube-api-access-c5g4g\") pod \"nova-api-0\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.446205 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.702093 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.765557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-config-data\") pod \"c980b301-ad6b-464f-9aef-5079e5dfbe69\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.765614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-combined-ca-bundle\") pod \"c980b301-ad6b-464f-9aef-5079e5dfbe69\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.765679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff82n\" (UniqueName: \"kubernetes.io/projected/c980b301-ad6b-464f-9aef-5079e5dfbe69-kube-api-access-ff82n\") pod \"c980b301-ad6b-464f-9aef-5079e5dfbe69\" (UID: \"c980b301-ad6b-464f-9aef-5079e5dfbe69\") " Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.769698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c980b301-ad6b-464f-9aef-5079e5dfbe69-kube-api-access-ff82n" (OuterVolumeSpecName: "kube-api-access-ff82n") pod "c980b301-ad6b-464f-9aef-5079e5dfbe69" (UID: "c980b301-ad6b-464f-9aef-5079e5dfbe69"). InnerVolumeSpecName "kube-api-access-ff82n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.785712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-config-data" (OuterVolumeSpecName: "config-data") pod "c980b301-ad6b-464f-9aef-5079e5dfbe69" (UID: "c980b301-ad6b-464f-9aef-5079e5dfbe69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.787917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c980b301-ad6b-464f-9aef-5079e5dfbe69" (UID: "c980b301-ad6b-464f-9aef-5079e5dfbe69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.862941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.867238 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.867261 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c980b301-ad6b-464f-9aef-5079e5dfbe69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:26 crc kubenswrapper[4707]: I0121 15:28:26.867271 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff82n\" (UniqueName: \"kubernetes.io/projected/c980b301-ad6b-464f-9aef-5079e5dfbe69-kube-api-access-ff82n\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.104909 4707 generic.go:334] "Generic (PLEG): container finished" podID="c980b301-ad6b-464f-9aef-5079e5dfbe69" containerID="4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee" exitCode=0 Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.104957 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.104971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c980b301-ad6b-464f-9aef-5079e5dfbe69","Type":"ContainerDied","Data":"4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee"} Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.105240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c980b301-ad6b-464f-9aef-5079e5dfbe69","Type":"ContainerDied","Data":"6ca29ac562be2d735bdc0bba8a78e888c525ba410112fc2be192fa50e6ac3dde"} Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.105256 4707 scope.go:117] "RemoveContainer" containerID="4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.111041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b28d4f89-f7dd-4acb-80a0-95f5cb31b459","Type":"ContainerStarted","Data":"b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726"} Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.111080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b28d4f89-f7dd-4acb-80a0-95f5cb31b459","Type":"ContainerStarted","Data":"5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4"} Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.111089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b28d4f89-f7dd-4acb-80a0-95f5cb31b459","Type":"ContainerStarted","Data":"86fac2029585f909af57bf80ef1a27153271cbf9c428ba6fd7b15687b06e1a9f"} Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.128195 4707 scope.go:117] "RemoveContainer" containerID="4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee" Jan 21 15:28:27 crc kubenswrapper[4707]: E0121 15:28:27.128487 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee\": container with ID starting with 4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee not found: ID does not exist" containerID="4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.128518 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee"} err="failed to get container status \"4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee\": rpc error: code = NotFound desc = could not find container \"4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee\": container with ID starting with 4b203a09af5df41167a7e1f422d77c1b3799cb75678cc42d88a36036f027d5ee not found: ID does not exist" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.133512 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.133497739 podStartE2EDuration="1.133497739s" podCreationTimestamp="2026-01-21 15:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:27.123353458 +0000 UTC m=+1604.304869680" watchObservedRunningTime="2026-01-21 15:28:27.133497739 +0000 UTC m=+1604.315013961" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.146103 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.154317 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.165022 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:27 crc kubenswrapper[4707]: E0121 15:28:27.165358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c980b301-ad6b-464f-9aef-5079e5dfbe69" containerName="nova-scheduler-scheduler" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.165374 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c980b301-ad6b-464f-9aef-5079e5dfbe69" containerName="nova-scheduler-scheduler" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.165557 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c980b301-ad6b-464f-9aef-5079e5dfbe69" containerName="nova-scheduler-scheduler" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.166078 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.168018 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.171564 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.189624 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80026a7b-bf72-4cb9-9e62-297018be7c36" path="/var/lib/kubelet/pods/80026a7b-bf72-4cb9-9e62-297018be7c36/volumes" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.190217 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c980b301-ad6b-464f-9aef-5079e5dfbe69" path="/var/lib/kubelet/pods/c980b301-ad6b-464f-9aef-5079e5dfbe69/volumes" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.273944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.273997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-config-data\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.274222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46h96\" (UniqueName: \"kubernetes.io/projected/a651b22a-3eb9-4d6e-96d5-95ece889686f-kube-api-access-46h96\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.375596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46h96\" (UniqueName: \"kubernetes.io/projected/a651b22a-3eb9-4d6e-96d5-95ece889686f-kube-api-access-46h96\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.375793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.375846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-config-data\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.379226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-config-data\") pod \"nova-scheduler-0\" (UID: 
\"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.379294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.388432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46h96\" (UniqueName: \"kubernetes.io/projected/a651b22a-3eb9-4d6e-96d5-95ece889686f-kube-api-access-46h96\") pod \"nova-scheduler-0\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.483656 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:27 crc kubenswrapper[4707]: I0121 15:28:27.836338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:28:27 crc kubenswrapper[4707]: W0121 15:28:27.836734 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda651b22a_3eb9_4d6e_96d5_95ece889686f.slice/crio-843a8eb2c1f4442d9021632abceae272b31c88bc4936db7ddd8deddb6cde59c0 WatchSource:0}: Error finding container 843a8eb2c1f4442d9021632abceae272b31c88bc4936db7ddd8deddb6cde59c0: Status 404 returned error can't find the container with id 843a8eb2c1f4442d9021632abceae272b31c88bc4936db7ddd8deddb6cde59c0 Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.118501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a651b22a-3eb9-4d6e-96d5-95ece889686f","Type":"ContainerStarted","Data":"83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af"} Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.118920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a651b22a-3eb9-4d6e-96d5-95ece889686f","Type":"ContainerStarted","Data":"843a8eb2c1f4442d9021632abceae272b31c88bc4936db7ddd8deddb6cde59c0"} Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.748779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.771248 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.771231926 podStartE2EDuration="1.771231926s" podCreationTimestamp="2026-01-21 15:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:28.13705883 +0000 UTC m=+1605.318575051" watchObservedRunningTime="2026-01-21 15:28:28.771231926 +0000 UTC m=+1605.952748167" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.798256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5484b931-622d-4cfb-9610-5bebf742a351-logs\") pod \"5484b931-622d-4cfb-9610-5bebf742a351\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.798311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4qdc\" (UniqueName: \"kubernetes.io/projected/5484b931-622d-4cfb-9610-5bebf742a351-kube-api-access-b4qdc\") pod \"5484b931-622d-4cfb-9610-5bebf742a351\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.798347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-config-data\") pod \"5484b931-622d-4cfb-9610-5bebf742a351\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.798375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-nova-metadata-tls-certs\") pod \"5484b931-622d-4cfb-9610-5bebf742a351\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.798458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-combined-ca-bundle\") pod \"5484b931-622d-4cfb-9610-5bebf742a351\" (UID: \"5484b931-622d-4cfb-9610-5bebf742a351\") " Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.798763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5484b931-622d-4cfb-9610-5bebf742a351-logs" (OuterVolumeSpecName: "logs") pod "5484b931-622d-4cfb-9610-5bebf742a351" (UID: "5484b931-622d-4cfb-9610-5bebf742a351"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.799156 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5484b931-622d-4cfb-9610-5bebf742a351-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.801897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5484b931-622d-4cfb-9610-5bebf742a351-kube-api-access-b4qdc" (OuterVolumeSpecName: "kube-api-access-b4qdc") pod "5484b931-622d-4cfb-9610-5bebf742a351" (UID: "5484b931-622d-4cfb-9610-5bebf742a351"). InnerVolumeSpecName "kube-api-access-b4qdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.818539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5484b931-622d-4cfb-9610-5bebf742a351" (UID: "5484b931-622d-4cfb-9610-5bebf742a351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.818841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-config-data" (OuterVolumeSpecName: "config-data") pod "5484b931-622d-4cfb-9610-5bebf742a351" (UID: "5484b931-622d-4cfb-9610-5bebf742a351"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.852212 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5484b931-622d-4cfb-9610-5bebf742a351" (UID: "5484b931-622d-4cfb-9610-5bebf742a351"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.900565 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4qdc\" (UniqueName: \"kubernetes.io/projected/5484b931-622d-4cfb-9610-5bebf742a351-kube-api-access-b4qdc\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.900593 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.900604 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:28 crc kubenswrapper[4707]: I0121 15:28:28.900613 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484b931-622d-4cfb-9610-5bebf742a351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.127497 4707 generic.go:334] "Generic (PLEG): container finished" podID="5484b931-622d-4cfb-9610-5bebf742a351" containerID="dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599" exitCode=0 Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.127551 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.127582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5484b931-622d-4cfb-9610-5bebf742a351","Type":"ContainerDied","Data":"dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599"} Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.127630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5484b931-622d-4cfb-9610-5bebf742a351","Type":"ContainerDied","Data":"ce56b051c2f5145d7d1388be05aa1ad51df53d98e9ff6cbca176c1f9b6651622"} Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.127649 4707 scope.go:117] "RemoveContainer" containerID="dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.152566 4707 scope.go:117] "RemoveContainer" containerID="b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.165446 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.174900 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.176206 4707 scope.go:117] "RemoveContainer" containerID="dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599" Jan 21 15:28:29 crc kubenswrapper[4707]: E0121 15:28:29.176681 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599\": container with ID starting with dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599 not found: ID does not exist" containerID="dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.176713 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599"} err="failed to get container status \"dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599\": rpc error: code = NotFound desc = could not find container \"dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599\": container with ID starting with dde1306fbdd3036523b05746d291554793c63314a07c1ffa7dcd150c79ee1599 not found: ID does not exist" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.176734 4707 scope.go:117] "RemoveContainer" containerID="b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5" Jan 21 15:28:29 crc kubenswrapper[4707]: E0121 15:28:29.177230 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5\": container with ID starting with b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5 not found: ID does not exist" containerID="b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.177268 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5"} err="failed to get container status 
\"b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5\": rpc error: code = NotFound desc = could not find container \"b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5\": container with ID starting with b4ab380bfb940a0fdb71dbf1dab6f0c52cea5373e24262b70ef9eaca064897b5 not found: ID does not exist" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.179895 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:29 crc kubenswrapper[4707]: E0121 15:28:29.180237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-metadata" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.180248 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-metadata" Jan 21 15:28:29 crc kubenswrapper[4707]: E0121 15:28:29.180259 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-log" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.180265 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-log" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.180400 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-log" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.180417 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484b931-622d-4cfb-9610-5bebf742a351" containerName="nova-metadata-metadata" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.181221 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.183498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.183693 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.195434 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5484b931-622d-4cfb-9610-5bebf742a351" path="/var/lib/kubelet/pods/5484b931-622d-4cfb-9610-5bebf742a351/volumes" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.196469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.306346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-config-data\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.306393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.306417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.306443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccc6250e-7692-4325-896f-33f84c241afb-logs\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.306457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hccm4\" (UniqueName: \"kubernetes.io/projected/ccc6250e-7692-4325-896f-33f84c241afb-kube-api-access-hccm4\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.408290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-config-data\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.408333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.408353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.408378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccc6250e-7692-4325-896f-33f84c241afb-logs\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.408394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hccm4\" (UniqueName: \"kubernetes.io/projected/ccc6250e-7692-4325-896f-33f84c241afb-kube-api-access-hccm4\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.409305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccc6250e-7692-4325-896f-33f84c241afb-logs\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.411915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.412076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.412372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-config-data\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.421223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccm4\" (UniqueName: \"kubernetes.io/projected/ccc6250e-7692-4325-896f-33f84c241afb-kube-api-access-hccm4\") pod \"nova-metadata-0\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.497455 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:29 crc kubenswrapper[4707]: W0121 15:28:29.877509 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccc6250e_7692_4325_896f_33f84c241afb.slice/crio-b57c33ea34d91d71336a11e04971f169963d1c8464483ff59c6605bae24dc745 WatchSource:0}: Error finding container b57c33ea34d91d71336a11e04971f169963d1c8464483ff59c6605bae24dc745: Status 404 returned error can't find the container with id b57c33ea34d91d71336a11e04971f169963d1c8464483ff59c6605bae24dc745 Jan 21 15:28:29 crc kubenswrapper[4707]: I0121 15:28:29.878580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:28:30 crc kubenswrapper[4707]: I0121 15:28:30.149508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ccc6250e-7692-4325-896f-33f84c241afb","Type":"ContainerStarted","Data":"1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b"} Jan 21 15:28:30 crc kubenswrapper[4707]: I0121 15:28:30.149699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ccc6250e-7692-4325-896f-33f84c241afb","Type":"ContainerStarted","Data":"3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221"} Jan 21 15:28:30 crc kubenswrapper[4707]: I0121 15:28:30.149711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ccc6250e-7692-4325-896f-33f84c241afb","Type":"ContainerStarted","Data":"b57c33ea34d91d71336a11e04971f169963d1c8464483ff59c6605bae24dc745"} Jan 21 15:28:30 crc kubenswrapper[4707]: I0121 15:28:30.163011 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.162998826 podStartE2EDuration="1.162998826s" podCreationTimestamp="2026-01-21 15:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:30.161285666 +0000 UTC m=+1607.342801887" watchObservedRunningTime="2026-01-21 15:28:30.162998826 +0000 UTC m=+1607.344515049" Jan 21 15:28:32 crc kubenswrapper[4707]: I0121 15:28:32.484005 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:33 crc kubenswrapper[4707]: I0121 15:28:33.195699 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:34 crc kubenswrapper[4707]: I0121 15:28:34.497936 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:34 crc kubenswrapper[4707]: I0121 15:28:34.497974 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:35 crc kubenswrapper[4707]: I0121 15:28:35.898176 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:28:35 crc kubenswrapper[4707]: I0121 15:28:35.898350 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="fc5e58d5-06c3-4fe0-a609-e227c50fdf22" containerName="kube-state-metrics" containerID="cri-o://4ee46ed5aecbf7debc0d1510a301c8a5f0a04f9e2a5b16175bc375ef2d1d8911" gracePeriod=30 Jan 21 15:28:36 crc 
kubenswrapper[4707]: I0121 15:28:36.188045 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc5e58d5-06c3-4fe0-a609-e227c50fdf22" containerID="4ee46ed5aecbf7debc0d1510a301c8a5f0a04f9e2a5b16175bc375ef2d1d8911" exitCode=2 Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.188127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"fc5e58d5-06c3-4fe0-a609-e227c50fdf22","Type":"ContainerDied","Data":"4ee46ed5aecbf7debc0d1510a301c8a5f0a04f9e2a5b16175bc375ef2d1d8911"} Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.265467 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.304610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwshg\" (UniqueName: \"kubernetes.io/projected/fc5e58d5-06c3-4fe0-a609-e227c50fdf22-kube-api-access-hwshg\") pod \"fc5e58d5-06c3-4fe0-a609-e227c50fdf22\" (UID: \"fc5e58d5-06c3-4fe0-a609-e227c50fdf22\") " Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.308942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5e58d5-06c3-4fe0-a609-e227c50fdf22-kube-api-access-hwshg" (OuterVolumeSpecName: "kube-api-access-hwshg") pod "fc5e58d5-06c3-4fe0-a609-e227c50fdf22" (UID: "fc5e58d5-06c3-4fe0-a609-e227c50fdf22"). InnerVolumeSpecName "kube-api-access-hwshg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.407131 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwshg\" (UniqueName: \"kubernetes.io/projected/fc5e58d5-06c3-4fe0-a609-e227c50fdf22-kube-api-access-hwshg\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.447063 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:36 crc kubenswrapper[4707]: I0121 15:28:36.447111 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.203911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"fc5e58d5-06c3-4fe0-a609-e227c50fdf22","Type":"ContainerDied","Data":"e09ab96e3853b04ee07450cee131c32fa7b19b511a1e501c530c7833513f1e70"} Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.204499 4707 scope.go:117] "RemoveContainer" containerID="4ee46ed5aecbf7debc0d1510a301c8a5f0a04f9e2a5b16175bc375ef2d1d8911" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.203946 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.221444 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.234435 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.243873 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:28:37 crc kubenswrapper[4707]: E0121 15:28:37.244236 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e58d5-06c3-4fe0-a609-e227c50fdf22" containerName="kube-state-metrics" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.244254 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e58d5-06c3-4fe0-a609-e227c50fdf22" containerName="kube-state-metrics" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.244423 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e58d5-06c3-4fe0-a609-e227c50fdf22" containerName="kube-state-metrics" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.244932 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.248006 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.248740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.249614 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.292692 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.292918 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-central-agent" containerID="cri-o://7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884" gracePeriod=30 Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.292974 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="proxy-httpd" containerID="cri-o://bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6" gracePeriod=30 Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.293142 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-notification-agent" containerID="cri-o://f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126" gracePeriod=30 Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.293213 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="sg-core" containerID="cri-o://03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60" gracePeriod=30 Jan 21 15:28:37 crc 
kubenswrapper[4707]: I0121 15:28:37.327542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2zfx\" (UniqueName: \"kubernetes.io/projected/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-api-access-x2zfx\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.327617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.327770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.327866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.428910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.428971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.429011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2zfx\" (UniqueName: \"kubernetes.io/projected/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-api-access-x2zfx\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.429038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.433291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-config\") 
pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.434217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.435162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.443490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2zfx\" (UniqueName: \"kubernetes.io/projected/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-api-access-x2zfx\") pod \"kube-state-metrics-0\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.484509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.516538 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.530023 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.136:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.530122 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.136:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.563185 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:37 crc kubenswrapper[4707]: I0121 15:28:37.938434 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.212857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7d909d33-58f9-4a53-b98a-aa1927df3ea3","Type":"ContainerStarted","Data":"a6cdf48eaa31f4e1b11a335b8bbc6d8dd7c28dc053821351f56ade5f300bd172"} Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.219688 4707 generic.go:334] "Generic (PLEG): container finished" podID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerID="bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6" exitCode=0 Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.219716 4707 generic.go:334] "Generic (PLEG): container finished" podID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerID="03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60" exitCode=2 Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.219724 4707 generic.go:334] "Generic (PLEG): container finished" podID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerID="7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884" exitCode=0 Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.219727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerDied","Data":"bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6"} Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.219783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerDied","Data":"03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60"} Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.219795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerDied","Data":"7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884"} Jan 21 15:28:38 crc kubenswrapper[4707]: I0121 15:28:38.240549 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.189534 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5e58d5-06c3-4fe0-a609-e227c50fdf22" path="/var/lib/kubelet/pods/fc5e58d5-06c3-4fe0-a609-e227c50fdf22/volumes" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.228406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7d909d33-58f9-4a53-b98a-aa1927df3ea3","Type":"ContainerStarted","Data":"c79dfd4c99dbc23a0bdc487e4cc960e31cd013b21ba5d1830d4fab5e1cc9d84d"} Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.228465 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.240782 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.9802451699999999 podStartE2EDuration="2.240768851s" podCreationTimestamp="2026-01-21 15:28:37 +0000 UTC" firstStartedPulling="2026-01-21 15:28:37.953179031 +0000 
UTC m=+1615.134695252" lastFinishedPulling="2026-01-21 15:28:38.213702711 +0000 UTC m=+1615.395218933" observedRunningTime="2026-01-21 15:28:39.238971702 +0000 UTC m=+1616.420487923" watchObservedRunningTime="2026-01-21 15:28:39.240768851 +0000 UTC m=+1616.422285073" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.497949 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.498156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.946056 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.946104 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:28:39 crc kubenswrapper[4707]: I0121 15:28:39.958502 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.075686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-config-data\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.075789 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-sg-core-conf-yaml\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.075879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-combined-ca-bundle\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.075973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvx7r\" (UniqueName: \"kubernetes.io/projected/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-kube-api-access-tvx7r\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.076016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-log-httpd\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.076102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-run-httpd\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.076124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-scripts\") pod \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\" (UID: \"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9\") " Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.076435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.076467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.087915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-kube-api-access-tvx7r" (OuterVolumeSpecName: "kube-api-access-tvx7r") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "kube-api-access-tvx7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.087926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-scripts" (OuterVolumeSpecName: "scripts") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.097057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.143964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.160098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-config-data" (OuterVolumeSpecName: "config-data") pod "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" (UID: "3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177699 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177723 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177734 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177742 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvx7r\" (UniqueName: \"kubernetes.io/projected/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-kube-api-access-tvx7r\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177751 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177758 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.177767 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.237280 4707 generic.go:334] "Generic (PLEG): container finished" podID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerID="f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126" exitCode=0 Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.237319 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.237344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerDied","Data":"f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126"} Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.237365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9","Type":"ContainerDied","Data":"2631a37910b998e0ab23421b9bf4f8525b7c97c1a3e62efd1813b22320e7d392"} Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.237379 4707 scope.go:117] "RemoveContainer" containerID="bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.260641 4707 scope.go:117] "RemoveContainer" containerID="03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.266638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.284954 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.285278 4707 scope.go:117] "RemoveContainer" containerID="f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.298654 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.299014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="sg-core" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299034 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="sg-core" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.299042 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-notification-agent" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-notification-agent" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.299081 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-central-agent" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299087 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-central-agent" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.299097 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="proxy-httpd" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299103 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="proxy-httpd" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299396 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="proxy-httpd" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 
15:28:40.299417 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="sg-core" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299429 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-notification-agent" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.299439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" containerName="ceilometer-central-agent" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.300690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.300766 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.303145 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.303250 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.303496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.310687 4707 scope.go:117] "RemoveContainer" containerID="7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.335461 4707 scope.go:117] "RemoveContainer" containerID="bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.335781 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6\": container with ID starting with bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6 not found: ID does not exist" containerID="bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.335936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6"} err="failed to get container status \"bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6\": rpc error: code = NotFound desc = could not find container \"bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6\": container with ID starting with bb20fc4409a135b0644ca0017e95d0badac62f7043a6f65898fa9b8584099eb6 not found: ID does not exist" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.335970 4707 scope.go:117] "RemoveContainer" containerID="03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.336304 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60\": container with ID starting with 03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60 not found: ID does not exist" containerID="03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60" Jan 21 15:28:40 crc 
kubenswrapper[4707]: I0121 15:28:40.336335 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60"} err="failed to get container status \"03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60\": rpc error: code = NotFound desc = could not find container \"03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60\": container with ID starting with 03cc1d1131f7e05324009801cd61f6fcf9dee9a27800785227a6037ca981af60 not found: ID does not exist" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.336383 4707 scope.go:117] "RemoveContainer" containerID="f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.336604 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126\": container with ID starting with f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126 not found: ID does not exist" containerID="f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.336629 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126"} err="failed to get container status \"f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126\": rpc error: code = NotFound desc = could not find container \"f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126\": container with ID starting with f972de600385841d6ac3254038ef430eafe7a7a16f9928029b2d09bd4e77f126 not found: ID does not exist" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.336642 4707 scope.go:117] "RemoveContainer" containerID="7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884" Jan 21 15:28:40 crc kubenswrapper[4707]: E0121 15:28:40.336985 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884\": container with ID starting with 7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884 not found: ID does not exist" containerID="7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.337024 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884"} err="failed to get container status \"7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884\": rpc error: code = NotFound desc = could not find container \"7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884\": container with ID starting with 7d099ffd567cee3064a0b26c091bf10b217eae7710e070e00fa4f769deb4e884 not found: ID does not exist" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.380873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-scripts\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.380913 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-run-httpd\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.380954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.381255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhnw\" (UniqueName: \"kubernetes.io/projected/693c8080-ea38-41e0-b7c3-6da8bfc64979-kube-api-access-bnhnw\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.381324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-config-data\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.381345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.381379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.381456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-log-httpd\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhnw\" (UniqueName: \"kubernetes.io/projected/693c8080-ea38-41e0-b7c3-6da8bfc64979-kube-api-access-bnhnw\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-config-data\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-log-httpd\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-scripts\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-run-httpd\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-log-httpd\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.483952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-run-httpd\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.486079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.486353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-config-data\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.486498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.486782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-scripts\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.487324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.498120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhnw\" (UniqueName: \"kubernetes.io/projected/693c8080-ea38-41e0-b7c3-6da8bfc64979-kube-api-access-bnhnw\") pod \"ceilometer-0\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.508915 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.138:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.508931 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.138:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:28:40 crc kubenswrapper[4707]: I0121 15:28:40.614946 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:41 crc kubenswrapper[4707]: I0121 15:28:41.022261 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:41 crc kubenswrapper[4707]: I0121 15:28:41.190533 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9" path="/var/lib/kubelet/pods/3532e287-d2df-4b9d-b5ff-e7fdcd35a3e9/volumes" Jan 21 15:28:41 crc kubenswrapper[4707]: I0121 15:28:41.246390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerStarted","Data":"ae8adf5b3bcd6ccd5dd909cd87587de5acd472b03edeebbfc56c2f446cc0b9e4"} Jan 21 15:28:42 crc kubenswrapper[4707]: I0121 15:28:42.253836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerStarted","Data":"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7"} Jan 21 15:28:43 crc kubenswrapper[4707]: I0121 15:28:43.262021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerStarted","Data":"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d"} Jan 21 15:28:43 crc kubenswrapper[4707]: I0121 15:28:43.262516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerStarted","Data":"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806"} Jan 21 15:28:45 crc kubenswrapper[4707]: I0121 15:28:45.287547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerStarted","Data":"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00"} Jan 21 15:28:45 crc kubenswrapper[4707]: I0121 15:28:45.290200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:45 crc kubenswrapper[4707]: I0121 15:28:45.307557 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.212057028 podStartE2EDuration="5.307542936s" podCreationTimestamp="2026-01-21 15:28:40 +0000 UTC" firstStartedPulling="2026-01-21 15:28:41.027276127 +0000 UTC m=+1618.208792349" lastFinishedPulling="2026-01-21 15:28:44.122762035 +0000 UTC m=+1621.304278257" observedRunningTime="2026-01-21 15:28:45.306408553 +0000 UTC m=+1622.487924775" watchObservedRunningTime="2026-01-21 15:28:45.307542936 +0000 UTC m=+1622.489059158" Jan 21 15:28:46 crc kubenswrapper[4707]: I0121 15:28:46.449965 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:46 crc kubenswrapper[4707]: I0121 15:28:46.450217 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:46 crc kubenswrapper[4707]: I0121 15:28:46.450491 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:46 crc kubenswrapper[4707]: I0121 15:28:46.450519 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:46 crc 
kubenswrapper[4707]: I0121 15:28:46.452800 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:46 crc kubenswrapper[4707]: I0121 15:28:46.453392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:47 crc kubenswrapper[4707]: I0121 15:28:47.571796 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.007388 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.304356 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-central-agent" containerID="cri-o://b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" gracePeriod=30 Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.304419 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="proxy-httpd" containerID="cri-o://dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" gracePeriod=30 Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.304430 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="sg-core" containerID="cri-o://6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" gracePeriod=30 Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.304474 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-notification-agent" containerID="cri-o://d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" gracePeriod=30 Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.655990 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:48 crc kubenswrapper[4707]: I0121 15:28:48.923609 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnhnw\" (UniqueName: \"kubernetes.io/projected/693c8080-ea38-41e0-b7c3-6da8bfc64979-kube-api-access-bnhnw\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-ceilometer-tls-certs\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-sg-core-conf-yaml\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-config-data\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-log-httpd\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-run-httpd\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.010956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-scripts\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.011007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-combined-ca-bundle\") pod \"693c8080-ea38-41e0-b7c3-6da8bfc64979\" (UID: \"693c8080-ea38-41e0-b7c3-6da8bfc64979\") " Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.012268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.012339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.015939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-scripts" (OuterVolumeSpecName: "scripts") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.016516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693c8080-ea38-41e0-b7c3-6da8bfc64979-kube-api-access-bnhnw" (OuterVolumeSpecName: "kube-api-access-bnhnw") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "kube-api-access-bnhnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.031085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.055674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.059287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.079368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-config-data" (OuterVolumeSpecName: "config-data") pod "693c8080-ea38-41e0-b7c3-6da8bfc64979" (UID: "693c8080-ea38-41e0-b7c3-6da8bfc64979"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113832 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnhnw\" (UniqueName: \"kubernetes.io/projected/693c8080-ea38-41e0-b7c3-6da8bfc64979-kube-api-access-bnhnw\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113863 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113873 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113881 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113889 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113897 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693c8080-ea38-41e0-b7c3-6da8bfc64979-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113903 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.113911 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693c8080-ea38-41e0-b7c3-6da8bfc64979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.313284 4707 generic.go:334] "Generic (PLEG): container finished" podID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" exitCode=0 Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.313312 4707 generic.go:334] "Generic (PLEG): container finished" podID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" exitCode=2 Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.313320 4707 generic.go:334] "Generic (PLEG): container finished" podID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" exitCode=0 Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.313327 4707 generic.go:334] "Generic (PLEG): container finished" podID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" exitCode=0 Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.313477 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-log" containerID="cri-o://5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4" 
gracePeriod=30 Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.313737 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerDied","Data":"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00"} Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerDied","Data":"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d"} Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerDied","Data":"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806"} Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerDied","Data":"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7"} Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"693c8080-ea38-41e0-b7c3-6da8bfc64979","Type":"ContainerDied","Data":"ae8adf5b3bcd6ccd5dd909cd87587de5acd472b03edeebbfc56c2f446cc0b9e4"} Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314462 4707 scope.go:117] "RemoveContainer" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.314709 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-api" containerID="cri-o://b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726" gracePeriod=30 Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.334240 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.346912 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354208 4707 scope.go:117] "RemoveContainer" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354462 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.354746 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-notification-agent" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354764 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-notification-agent" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.354775 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="sg-core" 
Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="sg-core" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.354791 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-central-agent" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354797 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-central-agent" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.354831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="proxy-httpd" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354837 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="proxy-httpd" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.354984 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="proxy-httpd" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.355002 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-notification-agent" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.355016 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="sg-core" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.355026 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" containerName="ceilometer-central-agent" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.356280 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.358665 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.358796 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.359223 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.366148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.380228 4707 scope.go:117] "RemoveContainer" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.395702 4707 scope.go:117] "RemoveContainer" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.409957 4707 scope.go:117] "RemoveContainer" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.410259 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": container with ID starting with dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00 not found: ID does not exist" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410284 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00"} err="failed to get container status \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": rpc error: code = NotFound desc = could not find container \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": container with ID starting with dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410303 4707 scope.go:117] "RemoveContainer" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.410519 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": container with ID starting with 6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d not found: ID does not exist" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410539 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d"} err="failed to get container status \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": rpc error: code = NotFound desc = could not find container \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": container with ID starting with 
6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410550 4707 scope.go:117] "RemoveContainer" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.410773 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": container with ID starting with d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806 not found: ID does not exist" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410794 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806"} err="failed to get container status \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": rpc error: code = NotFound desc = could not find container \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": container with ID starting with d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410820 4707 scope.go:117] "RemoveContainer" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.410974 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": container with ID starting with b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7 not found: ID does not exist" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.410993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7"} err="failed to get container status \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": rpc error: code = NotFound desc = could not find container \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": container with ID starting with b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.411004 4707 scope.go:117] "RemoveContainer" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.411144 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00"} err="failed to get container status \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": rpc error: code = NotFound desc = could not find container \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": container with ID starting with dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.411173 4707 scope.go:117] "RemoveContainer" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" Jan 21 15:28:49 crc 
kubenswrapper[4707]: I0121 15:28:49.411313 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d"} err="failed to get container status \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": rpc error: code = NotFound desc = could not find container \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": container with ID starting with 6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.411332 4707 scope.go:117] "RemoveContainer" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.416870 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806"} err="failed to get container status \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": rpc error: code = NotFound desc = could not find container \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": container with ID starting with d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.416909 4707 scope.go:117] "RemoveContainer" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.417780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4swqn\" (UniqueName: \"kubernetes.io/projected/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-kube-api-access-4swqn\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-run-httpd\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-config-data\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418274 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-scripts\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.418354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-log-httpd\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421055 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7"} err="failed to get container status \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": rpc error: code = NotFound desc = could not find container \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": container with ID starting with b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421095 4707 scope.go:117] "RemoveContainer" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421344 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00"} err="failed to get container status \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": rpc error: code = NotFound desc = could not find container \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": container with ID starting with dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421380 4707 scope.go:117] "RemoveContainer" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421570 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d"} err="failed to get container status \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": rpc error: code = NotFound desc = could not find container \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": container with ID starting with 6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421589 4707 scope.go:117] "RemoveContainer" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421824 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806"} err="failed to get container status \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": rpc error: code = NotFound desc = could not find container \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": container with ID starting with d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.421843 4707 scope.go:117] "RemoveContainer" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422030 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7"} err="failed to get container status \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": rpc error: code = NotFound desc = could not find container \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": container with ID starting with b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422047 4707 scope.go:117] "RemoveContainer" containerID="dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422242 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00"} err="failed to get container status \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": rpc error: code = NotFound desc = could not find container \"dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00\": container with ID starting with dd23afe1b967d828a7ab92884fd37b54422db14e4cef324b9bc71f32e7e86c00 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422259 4707 scope.go:117] "RemoveContainer" containerID="6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422443 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d"} err="failed to get container status \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": rpc error: code = NotFound desc = could not find container \"6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d\": container with ID starting with 6871a4b809492a7897e7da2be354f891fd5dd81a2a703c1516da439c3c64544d not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422460 4707 scope.go:117] "RemoveContainer" containerID="d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422618 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806"} err="failed to get container status \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": rpc error: code = NotFound desc = could not find container \"d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806\": container with ID starting with d7779dce0e549053dd3cb08e6a516ca66501fa9c69462dc155989a57d3c99806 not found: ID does not exist" Jan 
21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422634 4707 scope.go:117] "RemoveContainer" containerID="b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.422837 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7"} err="failed to get container status \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": rpc error: code = NotFound desc = could not find container \"b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7\": container with ID starting with b92a4f82f62de29fd8870c406ec1688a5c9d0bf984d2fe19c7b65d62f8b269d7 not found: ID does not exist" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.507830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.508386 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:49 crc kubenswrapper[4707]: E0121 15:28:49.509002 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-4swqn log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ceilometer-0" podUID="3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.509044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.519394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-run-httpd\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.519429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-config-data\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.519446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.519957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.520002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.520022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-scripts\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.520045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-log-httpd\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.520079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4swqn\" (UniqueName: \"kubernetes.io/projected/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-kube-api-access-4swqn\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.520057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-run-httpd\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.520558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-log-httpd\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.526033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.533756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.534212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.536190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-scripts\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.536538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-config-data\") pod \"ceilometer-0\" (UID: 
\"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.546313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4swqn\" (UniqueName: \"kubernetes.io/projected/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-kube-api-access-4swqn\") pod \"ceilometer-0\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:49 crc kubenswrapper[4707]: I0121 15:28:49.548743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.321792 4707 generic.go:334] "Generic (PLEG): container finished" podID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerID="5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4" exitCode=143 Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.321838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b28d4f89-f7dd-4acb-80a0-95f5cb31b459","Type":"ContainerDied","Data":"5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4"} Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.321906 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.326294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.330285 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.434549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-ceilometer-tls-certs\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.434765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-combined-ca-bundle\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.434835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-scripts\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.434864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-config-data\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.434928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-log-httpd\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 
15:28:50.434979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4swqn\" (UniqueName: \"kubernetes.io/projected/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-kube-api-access-4swqn\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.435006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-run-httpd\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.435020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-sg-core-conf-yaml\") pod \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\" (UID: \"3ba8ed6a-1fb4-4f33-8f53-a77b92e40612\") " Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.435925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.436717 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.437004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.438930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.439000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-config-data" (OuterVolumeSpecName: "config-data") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.439471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.439938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-kube-api-access-4swqn" (OuterVolumeSpecName: "kube-api-access-4swqn") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "kube-api-access-4swqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.444277 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.455255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-scripts" (OuterVolumeSpecName: "scripts") pod "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" (UID: "3ba8ed6a-1fb4-4f33-8f53-a77b92e40612"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.537975 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4swqn\" (UniqueName: \"kubernetes.io/projected/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-kube-api-access-4swqn\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.538008 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.538018 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.538026 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.538034 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.538042 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.538050 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.585186 4707 scope.go:117] "RemoveContainer" containerID="cac73fa5858eb140e1c949b3821698dd7fa162b5fddb80b7fc1a91e69dde4eb2" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.601477 4707 scope.go:117] "RemoveContainer" 
containerID="82702cae16237653df2b9f3bc5adea1229ba33351b62b15c664127ebacb3731b" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.644725 4707 scope.go:117] "RemoveContainer" containerID="cc8f6b16f50be7d1df064a7a111f75a22b38a2e3850ed9f555db741209b2eea5" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.660586 4707 scope.go:117] "RemoveContainer" containerID="719f9a3afb9fefe6adde1cda13f4842deb8edc9afdc5cb7b8c9599908b918256" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.684772 4707 scope.go:117] "RemoveContainer" containerID="12a30114a287ea421974e1f93e5f0be3fcc1108f286a1c2443ce8bf22eafa491" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.698353 4707 scope.go:117] "RemoveContainer" containerID="07a095da7d9200c541e009e8d6b9f784e0acfae65c8c93ddbf83ff6faf8458b1" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.715994 4707 scope.go:117] "RemoveContainer" containerID="271534ce4b9b3e231c7bb12b86091e9f944b8dccc5e23a788d117e456d901635" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.743216 4707 scope.go:117] "RemoveContainer" containerID="2945cc35ca1844c3a12635b486f13b44293757cfb3e36de61c6e12c658f3b87a" Jan 21 15:28:50 crc kubenswrapper[4707]: I0121 15:28:50.762400 4707 scope.go:117] "RemoveContainer" containerID="147a04b8d4127c77f223fa46ba02a222d6252316f9d1f34d4f7164c46976c44a" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.189330 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693c8080-ea38-41e0-b7c3-6da8bfc64979" path="/var/lib/kubelet/pods/693c8080-ea38-41e0-b7c3-6da8bfc64979/volumes" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.327852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.385375 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.392467 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.398214 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.400687 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.402617 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.403011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.403054 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.403275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.459967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-config-data\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-run-httpd\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxd7z\" (UniqueName: \"kubernetes.io/projected/46be47ce-9ea4-4319-b7af-1a299354d2ab-kube-api-access-dxd7z\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-scripts\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.460996 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-log-httpd\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-log-httpd\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-config-data\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-run-httpd\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxd7z\" (UniqueName: \"kubernetes.io/projected/46be47ce-9ea4-4319-b7af-1a299354d2ab-kube-api-access-dxd7z\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-scripts\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-log-httpd\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") 
" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.562863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-run-httpd\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.565594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.565803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.566090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-config-data\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.566524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-scripts\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.567980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.577868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxd7z\" (UniqueName: \"kubernetes.io/projected/46be47ce-9ea4-4319-b7af-1a299354d2ab-kube-api-access-dxd7z\") pod \"ceilometer-0\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:51 crc kubenswrapper[4707]: I0121 15:28:51.715101 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:52 crc kubenswrapper[4707]: W0121 15:28:52.081664 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46be47ce_9ea4_4319_b7af_1a299354d2ab.slice/crio-c76a85d742cec86291e6eff4c33261f801a64a9c4ddbcbbf88328d1168087e82 WatchSource:0}: Error finding container c76a85d742cec86291e6eff4c33261f801a64a9c4ddbcbbf88328d1168087e82: Status 404 returned error can't find the container with id c76a85d742cec86291e6eff4c33261f801a64a9c4ddbcbbf88328d1168087e82 Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.081733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.333531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerStarted","Data":"c76a85d742cec86291e6eff4c33261f801a64a9c4ddbcbbf88328d1168087e82"} Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.753704 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.881705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-config-data\") pod \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.881766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5g4g\" (UniqueName: \"kubernetes.io/projected/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-kube-api-access-c5g4g\") pod \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.881879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-logs\") pod \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.881985 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-combined-ca-bundle\") pod \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\" (UID: \"b28d4f89-f7dd-4acb-80a0-95f5cb31b459\") " Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.882355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-logs" (OuterVolumeSpecName: "logs") pod "b28d4f89-f7dd-4acb-80a0-95f5cb31b459" (UID: "b28d4f89-f7dd-4acb-80a0-95f5cb31b459"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.885488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-kube-api-access-c5g4g" (OuterVolumeSpecName: "kube-api-access-c5g4g") pod "b28d4f89-f7dd-4acb-80a0-95f5cb31b459" (UID: "b28d4f89-f7dd-4acb-80a0-95f5cb31b459"). InnerVolumeSpecName "kube-api-access-c5g4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.900494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-config-data" (OuterVolumeSpecName: "config-data") pod "b28d4f89-f7dd-4acb-80a0-95f5cb31b459" (UID: "b28d4f89-f7dd-4acb-80a0-95f5cb31b459"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.901149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b28d4f89-f7dd-4acb-80a0-95f5cb31b459" (UID: "b28d4f89-f7dd-4acb-80a0-95f5cb31b459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.983429 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.983458 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.983468 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5g4g\" (UniqueName: \"kubernetes.io/projected/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-kube-api-access-c5g4g\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:52 crc kubenswrapper[4707]: I0121 15:28:52.983478 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b28d4f89-f7dd-4acb-80a0-95f5cb31b459-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.193371 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba8ed6a-1fb4-4f33-8f53-a77b92e40612" path="/var/lib/kubelet/pods/3ba8ed6a-1fb4-4f33-8f53-a77b92e40612/volumes" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.356374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerStarted","Data":"a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554"} Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.358036 4707 generic.go:334] "Generic (PLEG): container finished" podID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerID="b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726" exitCode=0 Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.358064 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b28d4f89-f7dd-4acb-80a0-95f5cb31b459","Type":"ContainerDied","Data":"b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726"} Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.358081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b28d4f89-f7dd-4acb-80a0-95f5cb31b459","Type":"ContainerDied","Data":"86fac2029585f909af57bf80ef1a27153271cbf9c428ba6fd7b15687b06e1a9f"} Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.358096 4707 scope.go:117] "RemoveContainer" 
containerID="b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.358211 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.382038 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.385425 4707 scope.go:117] "RemoveContainer" containerID="5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.393652 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.400796 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:53 crc kubenswrapper[4707]: E0121 15:28:53.401184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-api" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.401202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-api" Jan 21 15:28:53 crc kubenswrapper[4707]: E0121 15:28:53.401216 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-log" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.401224 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-log" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.401367 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-log" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.401390 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" containerName="nova-api-api" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.405246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.406710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.407022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.407188 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.415954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.416081 4707 scope.go:117] "RemoveContainer" containerID="b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726" Jan 21 15:28:53 crc kubenswrapper[4707]: E0121 15:28:53.417089 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726\": container with ID starting with b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726 not found: ID does not exist" containerID="b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.417137 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726"} err="failed to get container status \"b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726\": rpc error: code = NotFound desc = could not find container \"b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726\": container with ID starting with b8a3f7de91124c53c9bd815d12a7d3eb0183c2fa267b7e80e83d84b18b60c726 not found: ID does not exist" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.417154 4707 scope.go:117] "RemoveContainer" containerID="5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4" Jan 21 15:28:53 crc kubenswrapper[4707]: E0121 15:28:53.417409 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4\": container with ID starting with 5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4 not found: ID does not exist" containerID="5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.417454 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4"} err="failed to get container status \"5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4\": rpc error: code = NotFound desc = could not find container \"5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4\": container with ID starting with 5b6abb900d5cb040d5cab912f6cbd87fdcd77850929f506798ad4476226c4bc4 not found: ID does not exist" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.498519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f3e600-3622-4bc5-9481-86560969ee79-logs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.498624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.498695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.498725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-config-data\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.498746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ph2h\" (UniqueName: \"kubernetes.io/projected/a9f3e600-3622-4bc5-9481-86560969ee79-kube-api-access-2ph2h\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.498766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.600494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.600553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.600570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-config-data\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.600589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ph2h\" (UniqueName: \"kubernetes.io/projected/a9f3e600-3622-4bc5-9481-86560969ee79-kube-api-access-2ph2h\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.600618 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.600664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f3e600-3622-4bc5-9481-86560969ee79-logs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.601021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f3e600-3622-4bc5-9481-86560969ee79-logs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.610693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.610880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.610902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-config-data\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.612404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.613125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ph2h\" (UniqueName: \"kubernetes.io/projected/a9f3e600-3622-4bc5-9481-86560969ee79-kube-api-access-2ph2h\") pod \"nova-api-0\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:53 crc kubenswrapper[4707]: I0121 15:28:53.735601 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.111505 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:28:54 crc kubenswrapper[4707]: W0121 15:28:54.113876 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f3e600_3622_4bc5_9481_86560969ee79.slice/crio-d6c85ef282dec9f375cff60a34b332b21d801ab2f061fae93a3be587d21b1575 WatchSource:0}: Error finding container d6c85ef282dec9f375cff60a34b332b21d801ab2f061fae93a3be587d21b1575: Status 404 returned error can't find the container with id d6c85ef282dec9f375cff60a34b332b21d801ab2f061fae93a3be587d21b1575 Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.365794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a9f3e600-3622-4bc5-9481-86560969ee79","Type":"ContainerStarted","Data":"e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7"} Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.366011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a9f3e600-3622-4bc5-9481-86560969ee79","Type":"ContainerStarted","Data":"2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b"} Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.366024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a9f3e600-3622-4bc5-9481-86560969ee79","Type":"ContainerStarted","Data":"d6c85ef282dec9f375cff60a34b332b21d801ab2f061fae93a3be587d21b1575"} Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.367927 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerStarted","Data":"eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4"} Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.367961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerStarted","Data":"e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488"} Jan 21 15:28:54 crc kubenswrapper[4707]: I0121 15:28:54.384820 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.384796588 podStartE2EDuration="1.384796588s" podCreationTimestamp="2026-01-21 15:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:54.377918337 +0000 UTC m=+1631.559434559" watchObservedRunningTime="2026-01-21 15:28:54.384796588 +0000 UTC m=+1631.566312809" Jan 21 15:28:55 crc kubenswrapper[4707]: I0121 15:28:55.190305 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28d4f89-f7dd-4acb-80a0-95f5cb31b459" path="/var/lib/kubelet/pods/b28d4f89-f7dd-4acb-80a0-95f5cb31b459/volumes" Jan 21 15:28:56 crc kubenswrapper[4707]: I0121 15:28:56.381407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerStarted","Data":"37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad"} Jan 21 15:28:56 crc kubenswrapper[4707]: I0121 15:28:56.381699 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:28:56 crc kubenswrapper[4707]: I0121 15:28:56.402384 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.007045958 podStartE2EDuration="5.40237177s" podCreationTimestamp="2026-01-21 15:28:51 +0000 UTC" firstStartedPulling="2026-01-21 15:28:52.083797585 +0000 UTC m=+1629.265313806" lastFinishedPulling="2026-01-21 15:28:55.479123396 +0000 UTC m=+1632.660639618" observedRunningTime="2026-01-21 15:28:56.39518606 +0000 UTC m=+1633.576702282" watchObservedRunningTime="2026-01-21 15:28:56.40237177 +0000 UTC m=+1633.583887992" Jan 21 15:29:03 crc kubenswrapper[4707]: I0121 15:29:03.736043 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:03 crc kubenswrapper[4707]: I0121 15:29:03.736282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:04 crc kubenswrapper[4707]: I0121 15:29:04.751943 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.143:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:29:04 crc kubenswrapper[4707]: I0121 15:29:04.751965 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.143:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:29:09 crc kubenswrapper[4707]: I0121 15:29:09.945875 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:29:09 crc kubenswrapper[4707]: I0121 15:29:09.946254 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:29:13 crc kubenswrapper[4707]: I0121 15:29:13.741673 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:13 crc kubenswrapper[4707]: I0121 15:29:13.742097 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:13 crc kubenswrapper[4707]: I0121 15:29:13.742493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:13 crc kubenswrapper[4707]: I0121 15:29:13.742848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:13 crc kubenswrapper[4707]: I0121 15:29:13.747118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:13 crc kubenswrapper[4707]: I0121 15:29:13.747339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:21 crc kubenswrapper[4707]: I0121 15:29:21.721419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.753280 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.754561 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.756636 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncontroller-ovncontroller-dockercfg-9bwst" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.757648 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovncontroller-ovndbs" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.757745 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-scripts" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.762326 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-67zzd"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.764070 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.787886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.797089 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-67zzd"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.801899 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-tbrg2"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.802882 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.804696 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-metrics-config" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.813433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-tbrg2"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.865562 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.866654 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.871553 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-extra-scripts" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.877569 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk"] Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjsc\" (UniqueName: \"kubernetes.io/projected/18374136-0195-4797-8436-413b88fd3984-kube-api-access-wbjsc\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18374136-0195-4797-8436-413b88fd3984-config\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0ec7846-0394-4865-9143-d7cced1b7994-scripts\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-combined-ca-bundle\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-lib\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-run\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovs-rundir\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-log\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovn-rundir\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-log-ovn\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5520417-48eb-47ef-afaa-5528cb122d9e-scripts\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqnc\" (UniqueName: \"kubernetes.io/projected/f5520417-48eb-47ef-afaa-5528cb122d9e-kube-api-access-9vqnc\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run-ovn\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-ovn-controller-tls-certs\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894706 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-combined-ca-bundle\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-etc-ovs\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.894749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ccm\" (UniqueName: \"kubernetes.io/projected/f0ec7846-0394-4865-9143-d7cced1b7994-kube-api-access-l2ccm\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.995833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-run\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.995895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.995914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovs-rundir\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.995950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-log-ovn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.995973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-log\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovn-rundir\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996122 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-log-ovn\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5520417-48eb-47ef-afaa-5528cb122d9e-scripts\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqnc\" (UniqueName: \"kubernetes.io/projected/f5520417-48eb-47ef-afaa-5528cb122d9e-kube-api-access-9vqnc\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run-ovn\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovs-rundir\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovn-rundir\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-run\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcxn\" (UniqueName: \"kubernetes.io/projected/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-kube-api-access-sxcxn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-log-ovn\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run-ovn\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-ovn-controller-tls-certs\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-combined-ca-bundle\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-etc-ovs\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ccm\" (UniqueName: \"kubernetes.io/projected/f0ec7846-0394-4865-9143-d7cced1b7994-kube-api-access-l2ccm\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-additional-scripts\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-etc-ovs\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run-ovn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjsc\" (UniqueName: \"kubernetes.io/projected/18374136-0195-4797-8436-413b88fd3984-kube-api-access-wbjsc\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996659 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-scripts\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18374136-0195-4797-8436-413b88fd3984-config\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0ec7846-0394-4865-9143-d7cced1b7994-scripts\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-combined-ca-bundle\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-log\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-lib\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.996990 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-lib\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.997314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18374136-0195-4797-8436-413b88fd3984-config\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.998227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5520417-48eb-47ef-afaa-5528cb122d9e-scripts\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:25 crc kubenswrapper[4707]: I0121 15:29:25.998823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0ec7846-0394-4865-9143-d7cced1b7994-scripts\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.002141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.002458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-combined-ca-bundle\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.003000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-combined-ca-bundle\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.003509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-ovn-controller-tls-certs\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.009102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqnc\" (UniqueName: \"kubernetes.io/projected/f5520417-48eb-47ef-afaa-5528cb122d9e-kube-api-access-9vqnc\") pod \"ovn-controller-bggbm\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.009683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjsc\" (UniqueName: 
\"kubernetes.io/projected/18374136-0195-4797-8436-413b88fd3984-kube-api-access-wbjsc\") pod \"ovn-controller-metrics-tbrg2\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.011184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ccm\" (UniqueName: \"kubernetes.io/projected/f0ec7846-0394-4865-9143-d7cced1b7994-kube-api-access-l2ccm\") pod \"ovn-controller-ovs-67zzd\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.088282 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.094472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.098554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-log-ovn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.098651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcxn\" (UniqueName: \"kubernetes.io/projected/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-kube-api-access-sxcxn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.098732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-additional-scripts\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.098778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.098823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run-ovn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.098856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-scripts\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.100425 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-additional-scripts\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.100513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-log-ovn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.100653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-scripts\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.100740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.100781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run-ovn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.114179 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.128338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcxn\" (UniqueName: \"kubernetes.io/projected/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-kube-api-access-sxcxn\") pod \"ovn-controller-bggbm-config-8d9fk\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.187221 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.602638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-tbrg2"] Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.691890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm"] Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.697470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk"] Jan 21 15:29:26 crc kubenswrapper[4707]: W0121 15:29:26.703185 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ec7846_0394_4865_9143_d7cced1b7994.slice/crio-c2d714bb77dc074624fd4dab229f086cfb6dc2abd4b10a619cbc0efc51a9fe52 WatchSource:0}: Error finding container c2d714bb77dc074624fd4dab229f086cfb6dc2abd4b10a619cbc0efc51a9fe52: Status 404 returned error can't find the container with id c2d714bb77dc074624fd4dab229f086cfb6dc2abd4b10a619cbc0efc51a9fe52 Jan 21 15:29:26 crc kubenswrapper[4707]: I0121 15:29:26.704100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-67zzd"] Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.589105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" event={"ID":"c53ded6e-23a8-4f32-8ca5-404098ea8cb6","Type":"ContainerStarted","Data":"11a1605bcc319d9ed814d9d67aa7a9e3e51111b55b8c53c91e8414081bb3326b"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.589496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" event={"ID":"c53ded6e-23a8-4f32-8ca5-404098ea8cb6","Type":"ContainerStarted","Data":"1d6b5d46b672652fc2363d8aebf0a9d2f4d00841a22de6435cf5ae4bf3375de2"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.592217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm" event={"ID":"f5520417-48eb-47ef-afaa-5528cb122d9e","Type":"ContainerStarted","Data":"6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.592272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm" event={"ID":"f5520417-48eb-47ef-afaa-5528cb122d9e","Type":"ContainerStarted","Data":"a2f9f42427fbd4b454676c6ba30fe64ac9a45e8866128dd3a74e817eef961038"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.592329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.593515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" event={"ID":"18374136-0195-4797-8436-413b88fd3984","Type":"ContainerStarted","Data":"a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.593545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" event={"ID":"18374136-0195-4797-8436-413b88fd3984","Type":"ContainerStarted","Data":"70b6bb72e03265d4705fd792ee6d36344fefb1e1a70785e6ddd4a6abbacc6aaf"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.595213 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="f0ec7846-0394-4865-9143-d7cced1b7994" containerID="d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129" exitCode=0 Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.595251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerDied","Data":"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.595272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerStarted","Data":"c2d714bb77dc074624fd4dab229f086cfb6dc2abd4b10a619cbc0efc51a9fe52"} Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.607253 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" podStartSLOduration=2.607239611 podStartE2EDuration="2.607239611s" podCreationTimestamp="2026-01-21 15:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:27.60189903 +0000 UTC m=+1664.783415252" watchObservedRunningTime="2026-01-21 15:29:27.607239611 +0000 UTC m=+1664.788755832" Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.613154 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" podStartSLOduration=2.613137028 podStartE2EDuration="2.613137028s" podCreationTimestamp="2026-01-21 15:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:27.612256863 +0000 UTC m=+1664.793773095" watchObservedRunningTime="2026-01-21 15:29:27.613137028 +0000 UTC m=+1664.794653250" Jan 21 15:29:27 crc kubenswrapper[4707]: I0121 15:29:27.648520 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-bggbm" podStartSLOduration=2.64850545 podStartE2EDuration="2.64850545s" podCreationTimestamp="2026-01-21 15:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:27.645018845 +0000 UTC m=+1664.826535067" watchObservedRunningTime="2026-01-21 15:29:27.64850545 +0000 UTC m=+1664.830021672" Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.603385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerStarted","Data":"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702"} Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.603720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerStarted","Data":"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6"} Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.603737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.603747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.604902 4707 generic.go:334] "Generic (PLEG): container finished" podID="c53ded6e-23a8-4f32-8ca5-404098ea8cb6" containerID="11a1605bcc319d9ed814d9d67aa7a9e3e51111b55b8c53c91e8414081bb3326b" exitCode=0 Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.604945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" event={"ID":"c53ded6e-23a8-4f32-8ca5-404098ea8cb6","Type":"ContainerDied","Data":"11a1605bcc319d9ed814d9d67aa7a9e3e51111b55b8c53c91e8414081bb3326b"} Jan 21 15:29:28 crc kubenswrapper[4707]: I0121 15:29:28.622279 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" podStartSLOduration=3.6222658450000003 podStartE2EDuration="3.622265845s" podCreationTimestamp="2026-01-21 15:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:28.617555779 +0000 UTC m=+1665.799072002" watchObservedRunningTime="2026-01-21 15:29:28.622265845 +0000 UTC m=+1665.803782066" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.852932 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.965690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-log-ovn\") pod \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.965773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-additional-scripts\") pod \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.965846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run-ovn\") pod \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.965863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-scripts\") pod \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.965926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run\") pod \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.965975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxcxn\" (UniqueName: \"kubernetes.io/projected/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-kube-api-access-sxcxn\") pod \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\" (UID: \"c53ded6e-23a8-4f32-8ca5-404098ea8cb6\") " Jan 21 15:29:29 
crc kubenswrapper[4707]: I0121 15:29:29.966105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c53ded6e-23a8-4f32-8ca5-404098ea8cb6" (UID: "c53ded6e-23a8-4f32-8ca5-404098ea8cb6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run" (OuterVolumeSpecName: "var-run") pod "c53ded6e-23a8-4f32-8ca5-404098ea8cb6" (UID: "c53ded6e-23a8-4f32-8ca5-404098ea8cb6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c53ded6e-23a8-4f32-8ca5-404098ea8cb6" (UID: "c53ded6e-23a8-4f32-8ca5-404098ea8cb6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966449 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966467 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966482 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c53ded6e-23a8-4f32-8ca5-404098ea8cb6" (UID: "c53ded6e-23a8-4f32-8ca5-404098ea8cb6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.966867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-scripts" (OuterVolumeSpecName: "scripts") pod "c53ded6e-23a8-4f32-8ca5-404098ea8cb6" (UID: "c53ded6e-23a8-4f32-8ca5-404098ea8cb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:29 crc kubenswrapper[4707]: I0121 15:29:29.970496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-kube-api-access-sxcxn" (OuterVolumeSpecName: "kube-api-access-sxcxn") pod "c53ded6e-23a8-4f32-8ca5-404098ea8cb6" (UID: "c53ded6e-23a8-4f32-8ca5-404098ea8cb6"). InnerVolumeSpecName "kube-api-access-sxcxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.067585 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxcxn\" (UniqueName: \"kubernetes.io/projected/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-kube-api-access-sxcxn\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.067612 4707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.067623 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ded6e-23a8-4f32-8ca5-404098ea8cb6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.120987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-67zzd"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.128292 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.141865 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.147424 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-tbrg2"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.147574 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" podUID="18374136-0195-4797-8436-413b88fd3984" containerName="openstack-network-exporter" containerID="cri-o://a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f" gracePeriod=30 Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.167665 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk"] Jan 21 15:29:30 crc kubenswrapper[4707]: E0121 15:29:30.206698 4707 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack-kuttl-tests/ovn-controller-bggbm" message=< Jan 21 15:29:30 crc kubenswrapper[4707]: Exiting ovn-controller (1) [ OK ] Jan 21 15:29:30 crc kubenswrapper[4707]: > Jan 21 15:29:30 crc kubenswrapper[4707]: E0121 15:29:30.206736 4707 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack-kuttl-tests/ovn-controller-bggbm" podUID="f5520417-48eb-47ef-afaa-5528cb122d9e" containerName="ovn-controller" containerID="cri-o://6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.206774 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-bggbm" podUID="f5520417-48eb-47ef-afaa-5528cb122d9e" containerName="ovn-controller" containerID="cri-o://6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b" gracePeriod=30 Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.426318 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.474984 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-tbrg2_18374136-0195-4797-8436-413b88fd3984/openstack-network-exporter/0.log" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.475211 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run-ovn\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-log-ovn\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run" (OuterVolumeSpecName: "var-run") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-combined-ca-bundle\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vqnc\" (UniqueName: \"kubernetes.io/projected/f5520417-48eb-47ef-afaa-5528cb122d9e-kube-api-access-9vqnc\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-ovn-controller-tls-certs\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.574956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5520417-48eb-47ef-afaa-5528cb122d9e-scripts\") pod \"f5520417-48eb-47ef-afaa-5528cb122d9e\" (UID: \"f5520417-48eb-47ef-afaa-5528cb122d9e\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.575329 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.575347 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.575356 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5520417-48eb-47ef-afaa-5528cb122d9e-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.575896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5520417-48eb-47ef-afaa-5528cb122d9e-scripts" (OuterVolumeSpecName: "scripts") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.578621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5520417-48eb-47ef-afaa-5528cb122d9e-kube-api-access-9vqnc" (OuterVolumeSpecName: "kube-api-access-9vqnc") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "kube-api-access-9vqnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.593792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.619547 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5520417-48eb-47ef-afaa-5528cb122d9e" containerID="6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b" exitCode=0 Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.619588 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.619614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm" event={"ID":"f5520417-48eb-47ef-afaa-5528cb122d9e","Type":"ContainerDied","Data":"6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b"} Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.619638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-bggbm" event={"ID":"f5520417-48eb-47ef-afaa-5528cb122d9e","Type":"ContainerDied","Data":"a2f9f42427fbd4b454676c6ba30fe64ac9a45e8866128dd3a74e817eef961038"} Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.619653 4707 scope.go:117] "RemoveContainer" containerID="6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.620972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "f5520417-48eb-47ef-afaa-5528cb122d9e" (UID: "f5520417-48eb-47ef-afaa-5528cb122d9e"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.621967 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-tbrg2_18374136-0195-4797-8436-413b88fd3984/openstack-network-exporter/0.log" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.622001 4707 generic.go:334] "Generic (PLEG): container finished" podID="18374136-0195-4797-8436-413b88fd3984" containerID="a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f" exitCode=2 Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.622068 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.622287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" event={"ID":"18374136-0195-4797-8436-413b88fd3984","Type":"ContainerDied","Data":"a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f"} Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.622332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-tbrg2" event={"ID":"18374136-0195-4797-8436-413b88fd3984","Type":"ContainerDied","Data":"70b6bb72e03265d4705fd792ee6d36344fefb1e1a70785e6ddd4a6abbacc6aaf"} Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.625447 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6b5d46b672652fc2363d8aebf0a9d2f4d00841a22de6435cf5ae4bf3375de2" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.625449 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-bggbm-config-8d9fk" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.641467 4707 scope.go:117] "RemoveContainer" containerID="6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b" Jan 21 15:29:30 crc kubenswrapper[4707]: E0121 15:29:30.641781 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b\": container with ID starting with 6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b not found: ID does not exist" containerID="6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.641829 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b"} err="failed to get container status \"6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b\": rpc error: code = NotFound desc = could not find container \"6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b\": container with ID starting with 6c7fc3211defba28bd51c1a23c32a36388176ead29b2edf25de56f0671a1c91b not found: ID does not exist" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.641853 4707 scope.go:117] "RemoveContainer" containerID="a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.659015 4707 scope.go:117] "RemoveContainer" containerID="a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f" Jan 21 15:29:30 crc kubenswrapper[4707]: E0121 15:29:30.659513 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f\": container with ID starting with a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f not found: ID does not exist" containerID="a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.659543 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f"} err="failed to get container status \"a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f\": rpc error: code = NotFound desc = could not find container \"a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f\": container with ID starting with a07755a00c716e7bb4653aa688231c6a6126aec66c1cadd3bb05eefeb476455f not found: ID does not exist" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18374136-0195-4797-8436-413b88fd3984-config\") pod \"18374136-0195-4797-8436-413b88fd3984\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-combined-ca-bundle\") pod \"18374136-0195-4797-8436-413b88fd3984\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676382 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovs-rundir\") pod \"18374136-0195-4797-8436-413b88fd3984\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovn-rundir\") pod \"18374136-0195-4797-8436-413b88fd3984\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbjsc\" (UniqueName: \"kubernetes.io/projected/18374136-0195-4797-8436-413b88fd3984-kube-api-access-wbjsc\") pod \"18374136-0195-4797-8436-413b88fd3984\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-metrics-certs-tls-certs\") pod \"18374136-0195-4797-8436-413b88fd3984\" (UID: \"18374136-0195-4797-8436-413b88fd3984\") " Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "18374136-0195-4797-8436-413b88fd3984" (UID: "18374136-0195-4797-8436-413b88fd3984"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "18374136-0195-4797-8436-413b88fd3984" (UID: "18374136-0195-4797-8436-413b88fd3984"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676873 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vqnc\" (UniqueName: \"kubernetes.io/projected/f5520417-48eb-47ef-afaa-5528cb122d9e-kube-api-access-9vqnc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676890 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676899 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5520417-48eb-47ef-afaa-5528cb122d9e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676907 4707 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676914 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18374136-0195-4797-8436-413b88fd3984-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676922 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5520417-48eb-47ef-afaa-5528cb122d9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.676993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18374136-0195-4797-8436-413b88fd3984-config" (OuterVolumeSpecName: "config") pod "18374136-0195-4797-8436-413b88fd3984" (UID: "18374136-0195-4797-8436-413b88fd3984"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.678714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18374136-0195-4797-8436-413b88fd3984-kube-api-access-wbjsc" (OuterVolumeSpecName: "kube-api-access-wbjsc") pod "18374136-0195-4797-8436-413b88fd3984" (UID: "18374136-0195-4797-8436-413b88fd3984"). InnerVolumeSpecName "kube-api-access-wbjsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.694541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18374136-0195-4797-8436-413b88fd3984" (UID: "18374136-0195-4797-8436-413b88fd3984"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.721593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "18374136-0195-4797-8436-413b88fd3984" (UID: "18374136-0195-4797-8436-413b88fd3984"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.778562 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18374136-0195-4797-8436-413b88fd3984-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.778596 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.778606 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbjsc\" (UniqueName: \"kubernetes.io/projected/18374136-0195-4797-8436-413b88fd3984-kube-api-access-wbjsc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.778615 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18374136-0195-4797-8436-413b88fd3984-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.919722 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovs-vswitchd" containerID="cri-o://f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702" gracePeriod=30 Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.946241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.952093 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-bggbm"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.957501 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-tbrg2"] Jan 21 15:29:30 crc kubenswrapper[4707]: I0121 15:29:30.963904 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-tbrg2"] Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.190636 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18374136-0195-4797-8436-413b88fd3984" path="/var/lib/kubelet/pods/18374136-0195-4797-8436-413b88fd3984/volumes" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.191124 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53ded6e-23a8-4f32-8ca5-404098ea8cb6" path="/var/lib/kubelet/pods/c53ded6e-23a8-4f32-8ca5-404098ea8cb6/volumes" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.191607 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5520417-48eb-47ef-afaa-5528cb122d9e" path="/var/lib/kubelet/pods/f5520417-48eb-47ef-afaa-5528cb122d9e/volumes" Jan 21 15:29:31 crc kubenswrapper[4707]: E0121 15:29:31.199216 4707 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 21 15:29:31 crc kubenswrapper[4707]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 21 15:29:31 crc kubenswrapper[4707]: + source /usr/local/bin/container-scripts/functions Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNBridge=br-int Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNRemote=tcp:localhost:6642 Jan 21 15:29:31 crc kubenswrapper[4707]: ++ 
OVNEncapType=geneve Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNAvailabilityZones= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ EnableChassisAsGateway=true Jan 21 15:29:31 crc kubenswrapper[4707]: ++ PhysicalNetworks= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNHostName= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 21 15:29:31 crc kubenswrapper[4707]: ++ ovs_dir=/var/lib/openvswitch Jan 21 15:29:31 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 21 15:29:31 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 21 15:29:31 crc kubenswrapper[4707]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:29:31 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:29:31 crc kubenswrapper[4707]: + sleep 0.5 Jan 21 15:29:31 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:29:31 crc kubenswrapper[4707]: + cleanup_ovsdb_server_semaphore Jan 21 15:29:31 crc kubenswrapper[4707]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:29:31 crc kubenswrapper[4707]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 21 15:29:31 crc kubenswrapper[4707]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" message=< Jan 21 15:29:31 crc kubenswrapper[4707]: Exiting ovsdb-server (5) [ OK ] Jan 21 15:29:31 crc kubenswrapper[4707]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 21 15:29:31 crc kubenswrapper[4707]: + source /usr/local/bin/container-scripts/functions Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNBridge=br-int Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNRemote=tcp:localhost:6642 Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNEncapType=geneve Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNAvailabilityZones= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ EnableChassisAsGateway=true Jan 21 15:29:31 crc kubenswrapper[4707]: ++ PhysicalNetworks= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNHostName= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 21 15:29:31 crc kubenswrapper[4707]: ++ ovs_dir=/var/lib/openvswitch Jan 21 15:29:31 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 21 15:29:31 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 21 15:29:31 crc kubenswrapper[4707]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:29:31 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:29:31 crc kubenswrapper[4707]: + sleep 0.5 Jan 21 15:29:31 crc kubenswrapper[4707]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:29:31 crc kubenswrapper[4707]: + cleanup_ovsdb_server_semaphore Jan 21 15:29:31 crc kubenswrapper[4707]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:29:31 crc kubenswrapper[4707]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 21 15:29:31 crc kubenswrapper[4707]: > Jan 21 15:29:31 crc kubenswrapper[4707]: E0121 15:29:31.199262 4707 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 21 15:29:31 crc kubenswrapper[4707]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 21 15:29:31 crc kubenswrapper[4707]: + source /usr/local/bin/container-scripts/functions Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNBridge=br-int Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNRemote=tcp:localhost:6642 Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNEncapType=geneve Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNAvailabilityZones= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ EnableChassisAsGateway=true Jan 21 15:29:31 crc kubenswrapper[4707]: ++ PhysicalNetworks= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ OVNHostName= Jan 21 15:29:31 crc kubenswrapper[4707]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 21 15:29:31 crc kubenswrapper[4707]: ++ ovs_dir=/var/lib/openvswitch Jan 21 15:29:31 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 21 15:29:31 crc kubenswrapper[4707]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 21 15:29:31 crc kubenswrapper[4707]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:29:31 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:29:31 crc kubenswrapper[4707]: + sleep 0.5 Jan 21 15:29:31 crc kubenswrapper[4707]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 21 15:29:31 crc kubenswrapper[4707]: + cleanup_ovsdb_server_semaphore Jan 21 15:29:31 crc kubenswrapper[4707]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 21 15:29:31 crc kubenswrapper[4707]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 21 15:29:31 crc kubenswrapper[4707]: > pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server" containerID="cri-o://19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.199308 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server" containerID="cri-o://19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" gracePeriod=30 Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.491445 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-67zzd_f0ec7846-0394-4865-9143-d7cced1b7994/ovs-vswitchd/0.log" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.492155 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-log\") pod \"f0ec7846-0394-4865-9143-d7cced1b7994\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-log" (OuterVolumeSpecName: "var-log") pod "f0ec7846-0394-4865-9143-d7cced1b7994" (UID: "f0ec7846-0394-4865-9143-d7cced1b7994"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0ec7846-0394-4865-9143-d7cced1b7994-scripts\") pod \"f0ec7846-0394-4865-9143-d7cced1b7994\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-lib\") pod \"f0ec7846-0394-4865-9143-d7cced1b7994\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2ccm\" (UniqueName: \"kubernetes.io/projected/f0ec7846-0394-4865-9143-d7cced1b7994-kube-api-access-l2ccm\") pod \"f0ec7846-0394-4865-9143-d7cced1b7994\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-etc-ovs\") pod \"f0ec7846-0394-4865-9143-d7cced1b7994\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-lib" (OuterVolumeSpecName: "var-lib") pod "f0ec7846-0394-4865-9143-d7cced1b7994" (UID: "f0ec7846-0394-4865-9143-d7cced1b7994"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.590876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-run\") pod \"f0ec7846-0394-4865-9143-d7cced1b7994\" (UID: \"f0ec7846-0394-4865-9143-d7cced1b7994\") " Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "f0ec7846-0394-4865-9143-d7cced1b7994" (UID: "f0ec7846-0394-4865-9143-d7cced1b7994"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-run" (OuterVolumeSpecName: "var-run") pod "f0ec7846-0394-4865-9143-d7cced1b7994" (UID: "f0ec7846-0394-4865-9143-d7cced1b7994"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591493 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-lib\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591506 4707 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591523 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591532 4707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f0ec7846-0394-4865-9143-d7cced1b7994-var-log\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.591675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ec7846-0394-4865-9143-d7cced1b7994-scripts" (OuterVolumeSpecName: "scripts") pod "f0ec7846-0394-4865-9143-d7cced1b7994" (UID: "f0ec7846-0394-4865-9143-d7cced1b7994"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.594291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ec7846-0394-4865-9143-d7cced1b7994-kube-api-access-l2ccm" (OuterVolumeSpecName: "kube-api-access-l2ccm") pod "f0ec7846-0394-4865-9143-d7cced1b7994" (UID: "f0ec7846-0394-4865-9143-d7cced1b7994"). InnerVolumeSpecName "kube-api-access-l2ccm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.634534 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-67zzd_f0ec7846-0394-4865-9143-d7cced1b7994/ovs-vswitchd/0.log" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635121 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0ec7846-0394-4865-9143-d7cced1b7994" containerID="f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702" exitCode=143 Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635142 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0ec7846-0394-4865-9143-d7cced1b7994" containerID="19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" exitCode=0 Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerDied","Data":"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702"} Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerDied","Data":"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6"} Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" event={"ID":"f0ec7846-0394-4865-9143-d7cced1b7994","Type":"ContainerDied","Data":"c2d714bb77dc074624fd4dab229f086cfb6dc2abd4b10a619cbc0efc51a9fe52"} Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635203 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-67zzd" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.635216 4707 scope.go:117] "RemoveContainer" containerID="f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.656829 4707 scope.go:117] "RemoveContainer" containerID="19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.661144 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-67zzd"] Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.667147 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-67zzd"] Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.672707 4707 scope.go:117] "RemoveContainer" containerID="d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.692358 4707 scope.go:117] "RemoveContainer" containerID="f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702" Jan 21 15:29:31 crc kubenswrapper[4707]: E0121 15:29:31.692671 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702\": container with ID starting with f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702 not found: ID does not exist" containerID="f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.692698 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702"} err="failed to get container status \"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702\": rpc error: code = NotFound desc = could not find container \"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702\": container with ID starting with f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702 not found: ID does not exist" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.692716 4707 scope.go:117] "RemoveContainer" containerID="19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" Jan 21 15:29:31 crc kubenswrapper[4707]: E0121 15:29:31.692972 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6\": container with ID starting with 19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6 not found: ID does not exist" containerID="19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.692994 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6"} err="failed to get container status \"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6\": rpc error: code = NotFound desc = could not find container \"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6\": container with ID starting with 19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6 not found: ID does not exist" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693009 4707 scope.go:117] "RemoveContainer" 
containerID="d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693282 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2ccm\" (UniqueName: \"kubernetes.io/projected/f0ec7846-0394-4865-9143-d7cced1b7994-kube-api-access-l2ccm\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693314 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0ec7846-0394-4865-9143-d7cced1b7994-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:31 crc kubenswrapper[4707]: E0121 15:29:31.693326 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129\": container with ID starting with d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129 not found: ID does not exist" containerID="d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693357 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129"} err="failed to get container status \"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129\": rpc error: code = NotFound desc = could not find container \"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129\": container with ID starting with d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129 not found: ID does not exist" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693380 4707 scope.go:117] "RemoveContainer" containerID="f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693626 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702"} err="failed to get container status \"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702\": rpc error: code = NotFound desc = could not find container \"f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702\": container with ID starting with f231aec5694d1443c3ae109d5e79b4d2c85e5937d6a8637ce332fe7ecbd05702 not found: ID does not exist" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.693646 4707 scope.go:117] "RemoveContainer" containerID="19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.694003 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6"} err="failed to get container status \"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6\": rpc error: code = NotFound desc = could not find container \"19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6\": container with ID starting with 19cfd3ef3a97fa31ccdd09e1f7edb883480a419074036d7cfe5e6abcb1697ed6 not found: ID does not exist" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.694021 4707 scope.go:117] "RemoveContainer" containerID="d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129" Jan 21 15:29:31 crc kubenswrapper[4707]: I0121 15:29:31.694255 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129"} err="failed to get container status \"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129\": rpc error: code = NotFound desc = could not find container \"d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129\": container with ID starting with d15f87e5b3ec1af00a2abb3ec7158e4836bcbc144c658cda6f58bb62954cd129 not found: ID does not exist" Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.865585 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.865956 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" containerName="openstackclient" containerID="cri-o://07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7" gracePeriod=2 Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.877583 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.897402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.897603 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="cinder-scheduler" containerID="cri-o://51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4" gracePeriod=30 Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.897957 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="probe" containerID="cri-o://aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5" gracePeriod=30 Jan 21 15:29:32 crc kubenswrapper[4707]: I0121 15:29:32.954236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.117865 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.118102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api-log" containerID="cri-o://8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.118455 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api" containerID="cri-o://51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.127877 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.127932 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data podName:6bac3bc5-b4ca-4333-9fc6-851430c48708 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:29:33.627917444 +0000 UTC m=+1670.809433666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data") pod "rabbitmq-server-0" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708") : configmap "rabbitmq-config-data" not found Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.147027 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm"] Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.147537 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" containerName="openstackclient" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.147607 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" containerName="openstackclient" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.147664 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53ded6e-23a8-4f32-8ca5-404098ea8cb6" containerName="ovn-config" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.147709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53ded6e-23a8-4f32-8ca5-404098ea8cb6" containerName="ovn-config" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.147770 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server-init" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.147838 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server-init" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.147898 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovs-vswitchd" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.147951 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovs-vswitchd" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.148006 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.148052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.148100 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5520417-48eb-47ef-afaa-5528cb122d9e" containerName="ovn-controller" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.148153 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5520417-48eb-47ef-afaa-5528cb122d9e" containerName="ovn-controller" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.148243 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18374136-0195-4797-8436-413b88fd3984" containerName="openstack-network-exporter" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.148294 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18374136-0195-4797-8436-413b88fd3984" containerName="openstack-network-exporter" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.148486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5520417-48eb-47ef-afaa-5528cb122d9e" containerName="ovn-controller" Jan 21 
15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.152776 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18374136-0195-4797-8436-413b88fd3984" containerName="openstack-network-exporter" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.152859 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" containerName="openstackclient" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.152922 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53ded6e-23a8-4f32-8ca5-404098ea8cb6" containerName="ovn-config" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.152985 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovsdb-server" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.153038 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" containerName="ovs-vswitchd" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.153920 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.186098 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-b89f5977c-485np"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.201496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.337215 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ec7846-0394-4865-9143-d7cced1b7994" path="/var/lib/kubelet/pods/f0ec7846-0394-4865-9143-d7cced1b7994/volumes" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.337940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.337959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-b89f5977c-485np"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.337969 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.338821 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.338882 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.338927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data-custom\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks9tb\" (UniqueName: \"kubernetes.io/projected/0080fa38-5728-4417-968b-156c1a7df0bf-kube-api-access-ks9tb\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data-custom\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a8e47-360d-481e-bf1c-9d58ad545f77-logs\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " 
pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080fa38-5728-4417-968b-156c1a7df0bf-logs\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.339203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvwlb\" (UniqueName: \"kubernetes.io/projected/249a8e47-360d-481e-bf1c-9d58ad545f77-kube-api-access-qvwlb\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.352744 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.353309 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="openstack-network-exporter" containerID="cri-o://e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6" gracePeriod=300 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.363141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.382577 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.408048 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzkq7"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.409290 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.411990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks9tb\" (UniqueName: \"kubernetes.io/projected/0080fa38-5728-4417-968b-156c1a7df0bf-kube-api-access-ks9tb\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data-custom\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a8e47-360d-481e-bf1c-9d58ad545f77-logs\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080fa38-5728-4417-968b-156c1a7df0bf-logs\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvwlb\" (UniqueName: \"kubernetes.io/projected/249a8e47-360d-481e-bf1c-9d58ad545f77-kube-api-access-qvwlb\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts\") pod \"glance-d5d6-account-create-update-zcbtm\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp5k\" (UniqueName: \"kubernetes.io/projected/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-kube-api-access-nkp5k\") pod \"glance-d5d6-account-create-update-zcbtm\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.443639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data-custom\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.444568 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.444612 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle podName:249a8e47-360d-481e-bf1c-9d58ad545f77 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:33.944599241 +0000 UTC m=+1671.126115462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle") pod "barbican-worker-b89f5977c-485np" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77") : secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.450989 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.451249 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle podName:0080fa38-5728-4417-968b-156c1a7df0bf nodeName:}" failed. No retries permitted until 2026-01-21 15:29:33.951234704 +0000 UTC m=+1671.132750927 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle") pod "barbican-keystone-listener-6457bf97d6-6nrnm" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf") : secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.451786 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-kt8jl"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.455397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a8e47-360d-481e-bf1c-9d58ad545f77-logs\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.456209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080fa38-5728-4417-968b-156c1a7df0bf-logs\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.458048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.465465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.470364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data-custom\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.472373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data-custom\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.496459 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g78qf"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.505676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvwlb\" (UniqueName: \"kubernetes.io/projected/249a8e47-360d-481e-bf1c-9d58ad545f77-kube-api-access-qvwlb\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.514303 
4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g78qf"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.530083 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzkq7"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.546369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts\") pod \"glance-d5d6-account-create-update-zcbtm\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.546432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6zg6\" (UniqueName: \"kubernetes.io/projected/e8daa249-22b3-4782-b64b-d9dca84a8777-kube-api-access-v6zg6\") pod \"root-account-create-update-jzkq7\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.546461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp5k\" (UniqueName: \"kubernetes.io/projected/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-kube-api-access-nkp5k\") pod \"glance-d5d6-account-create-update-zcbtm\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.546479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts\") pod \"root-account-create-update-jzkq7\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.548074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks9tb\" (UniqueName: \"kubernetes.io/projected/0080fa38-5728-4417-968b-156c1a7df0bf-kube-api-access-ks9tb\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.548519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts\") pod \"glance-d5d6-account-create-update-zcbtm\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.549273 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="ovsdbserver-sb" containerID="cri-o://229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43" gracePeriod=300 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.560549 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.585139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-nkp5k\" (UniqueName: \"kubernetes.io/projected/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-kube-api-access-nkp5k\") pod \"glance-d5d6-account-create-update-zcbtm\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.585347 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.586452 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.588464 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.604143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.632127 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.647692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts\") pod \"root-account-create-update-jzkq7\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.648082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6zg6\" (UniqueName: \"kubernetes.io/projected/e8daa249-22b3-4782-b64b-d9dca84a8777-kube-api-access-v6zg6\") pod \"root-account-create-update-jzkq7\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.648416 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.648463 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data podName:6bac3bc5-b4ca-4333-9fc6-851430c48708 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:34.648450106 +0000 UTC m=+1671.829966329 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data") pod "rabbitmq-server-0" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708") : configmap "rabbitmq-config-data" not found Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.648472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts\") pod \"root-account-create-update-jzkq7\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.648535 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.648574 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data podName:d54b9100-265d-49a3-a2ab-72e8787510df nodeName:}" failed. No retries permitted until 2026-01-21 15:29:34.148562649 +0000 UTC m=+1671.330078871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data") pod "rabbitmq-cell1-server-0" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.650680 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.657264 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.662974 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-dxf2p"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.670728 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.671695 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_2b37dca3-6de3-4496-93ad-e1cff3fc0e3e/ovsdbserver-sb/0.log" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.671753 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerID="e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6" exitCode=2 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.671803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e","Type":"ContainerDied","Data":"e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6"} Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.672992 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.682878 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-wp6hh"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.683226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6zg6\" (UniqueName: \"kubernetes.io/projected/e8daa249-22b3-4782-b64b-d9dca84a8777-kube-api-access-v6zg6\") pod \"root-account-create-update-jzkq7\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.697979 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef7c730d-de41-4543-a84e-d42a48259f71" containerID="8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f" exitCode=143 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.698021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ef7c730d-de41-4543-a84e-d42a48259f71","Type":"ContainerDied","Data":"8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f"} Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.701531 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.708987 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.727897 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.729037 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.733204 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.735117 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.750584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-operator-scripts\") pod \"cinder-9d5a-account-create-update-cdskv\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.750704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvtt\" (UniqueName: \"kubernetes.io/projected/5d4d5044-73fa-4995-8594-0541964e35e8-kube-api-access-wrvtt\") pod \"neutron-11a0-account-create-update-qjfm5\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.750724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fxtt\" (UniqueName: \"kubernetes.io/projected/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-kube-api-access-9fxtt\") pod \"cinder-9d5a-account-create-update-cdskv\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.750795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d5044-73fa-4995-8594-0541964e35e8-operator-scripts\") pod \"neutron-11a0-account-create-update-qjfm5\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.770362 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.782820 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-btx76"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.809851 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.810102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="openstack-network-exporter" containerID="cri-o://2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.810320 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="ovn-northd" containerID="cri-o://a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" gracePeriod=30 Jan 21 
15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.820866 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.821125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="openstack-network-exporter" containerID="cri-o://d9976eaa71a516b23e46173e6991270fa567b0fd2fdf3a5753ae210e1a2cdd0f" gracePeriod=300 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.830490 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-btx76"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.854254 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-98k7p"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.857090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b669faa-0600-4ffd-918d-b33788b18eb9-operator-scripts\") pod \"barbican-7408-account-create-update-wrhz7\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.857200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvtt\" (UniqueName: \"kubernetes.io/projected/5d4d5044-73fa-4995-8594-0541964e35e8-kube-api-access-wrvtt\") pod \"neutron-11a0-account-create-update-qjfm5\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.857232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fxtt\" (UniqueName: \"kubernetes.io/projected/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-kube-api-access-9fxtt\") pod \"cinder-9d5a-account-create-update-cdskv\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.857378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d5044-73fa-4995-8594-0541964e35e8-operator-scripts\") pod \"neutron-11a0-account-create-update-qjfm5\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.857419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtf2q\" (UniqueName: \"kubernetes.io/projected/5b669faa-0600-4ffd-918d-b33788b18eb9-kube-api-access-qtf2q\") pod \"barbican-7408-account-create-update-wrhz7\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.857449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-operator-scripts\") pod \"cinder-9d5a-account-create-update-cdskv\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc 
kubenswrapper[4707]: I0121 15:29:33.859221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d5044-73fa-4995-8594-0541964e35e8-operator-scripts\") pod \"neutron-11a0-account-create-update-qjfm5\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.861042 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-98k7p"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.870116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-operator-scripts\") pod \"cinder-9d5a-account-create-update-cdskv\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.872772 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873221 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-server" containerID="cri-o://d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873546 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="swift-recon-cron" containerID="cri-o://3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873589 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="rsync" containerID="cri-o://45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873619 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-expirer" containerID="cri-o://e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873649 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-updater" containerID="cri-o://eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873678 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-auditor" containerID="cri-o://f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873708 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" 
containerName="object-replicator" containerID="cri-o://132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873736 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-server" containerID="cri-o://ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873766 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-updater" containerID="cri-o://4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873793 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-auditor" containerID="cri-o://5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873840 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-replicator" containerID="cri-o://f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873867 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-server" containerID="cri-o://1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873901 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-reaper" containerID="cri-o://f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-auditor" containerID="cri-o://292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.873967 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-replicator" containerID="cri-o://b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d" gracePeriod=30 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.894610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fxtt\" (UniqueName: \"kubernetes.io/projected/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-kube-api-access-9fxtt\") pod \"cinder-9d5a-account-create-update-cdskv\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.898688 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvtt\" (UniqueName: \"kubernetes.io/projected/5d4d5044-73fa-4995-8594-0541964e35e8-kube-api-access-wrvtt\") pod \"neutron-11a0-account-create-update-qjfm5\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.924275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.945983 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-g2jcl"] Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.968470 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.968845 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle podName:249a8e47-360d-481e-bf1c-9d58ad545f77 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:34.968804569 +0000 UTC m=+1672.150320791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle") pod "barbican-worker-b89f5977c-485np" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77") : secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.968379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.969409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtf2q\" (UniqueName: \"kubernetes.io/projected/5b669faa-0600-4ffd-918d-b33788b18eb9-kube-api-access-qtf2q\") pod \"barbican-7408-account-create-update-wrhz7\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.969587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b669faa-0600-4ffd-918d-b33788b18eb9-operator-scripts\") pod \"barbican-7408-account-create-update-wrhz7\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.969651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.969862 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: E0121 15:29:33.969896 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle podName:0080fa38-5728-4417-968b-156c1a7df0bf nodeName:}" failed. No retries permitted until 2026-01-21 15:29:34.969887054 +0000 UTC m=+1672.151403276 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle") pod "barbican-keystone-listener-6457bf97d6-6nrnm" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf") : secret "combined-ca-bundle" not found Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.971177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b669faa-0600-4ffd-918d-b33788b18eb9-operator-scripts\") pod \"barbican-7408-account-create-update-wrhz7\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.976023 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="ovsdbserver-nb" containerID="cri-o://6773818b9477157356db610d5c8e6c5f70dba1e6f5a99dffecbb55f651495e61" gracePeriod=300 Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.985859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-g2jcl"] Jan 21 15:29:33 crc kubenswrapper[4707]: I0121 15:29:33.994280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtf2q\" (UniqueName: \"kubernetes.io/projected/5b669faa-0600-4ffd-918d-b33788b18eb9-kube-api-access-qtf2q\") pod \"barbican-7408-account-create-update-wrhz7\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.029821 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.055533 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.056617 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.061623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.061805 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.069603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.119652 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.120610 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.137635 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jw749"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.138069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.168832 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-jw749"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.175716 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.193145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-operator-scripts\") pod \"nova-api-85d1-account-create-update-dpm4s\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.193240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwn7\" (UniqueName: \"kubernetes.io/projected/5ae763d3-de1a-4a94-94bf-41c06bfe239f-kube-api-access-4lwn7\") pod \"nova-cell0-62dd-account-create-update-v8v9d\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.193292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-249tc\" (UniqueName: \"kubernetes.io/projected/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-kube-api-access-249tc\") pod \"nova-api-85d1-account-create-update-dpm4s\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.193346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae763d3-de1a-4a94-94bf-41c06bfe239f-operator-scripts\") pod \"nova-cell0-62dd-account-create-update-v8v9d\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.193501 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.193537 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data podName:d54b9100-265d-49a3-a2ab-72e8787510df nodeName:}" failed. No retries permitted until 2026-01-21 15:29:35.19352485 +0000 UTC m=+1672.375041071 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data") pod "rabbitmq-cell1-server-0" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.199196 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.218975 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-w6f54"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.258358 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.259435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.261180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.286684 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.294584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-operator-scripts\") pod \"nova-api-85d1-account-create-update-dpm4s\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.294697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwn7\" (UniqueName: \"kubernetes.io/projected/5ae763d3-de1a-4a94-94bf-41c06bfe239f-kube-api-access-4lwn7\") pod \"nova-cell0-62dd-account-create-update-v8v9d\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.294756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-249tc\" (UniqueName: \"kubernetes.io/projected/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-kube-api-access-249tc\") pod \"nova-api-85d1-account-create-update-dpm4s\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.294840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae763d3-de1a-4a94-94bf-41c06bfe239f-operator-scripts\") pod \"nova-cell0-62dd-account-create-update-v8v9d\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.295630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae763d3-de1a-4a94-94bf-41c06bfe239f-operator-scripts\") pod \"nova-cell0-62dd-account-create-update-v8v9d\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " 
pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.295730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-operator-scripts\") pod \"nova-api-85d1-account-create-update-dpm4s\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.304484 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.313093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwn7\" (UniqueName: \"kubernetes.io/projected/5ae763d3-de1a-4a94-94bf-41c06bfe239f-kube-api-access-4lwn7\") pod \"nova-cell0-62dd-account-create-update-v8v9d\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.318720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-249tc\" (UniqueName: \"kubernetes.io/projected/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-kube-api-access-249tc\") pod \"nova-api-85d1-account-create-update-dpm4s\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.322694 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-cgs94"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.327099 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_2b37dca3-6de3-4496-93ad-e1cff3fc0e3e/ovsdbserver-sb/0.log" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.327188 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.336854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-g9gwr"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.351233 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-g9gwr"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.370790 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt"] Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.371228 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="ovsdbserver-sb" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.371240 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="ovsdbserver-sb" Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.371253 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="openstack-network-exporter" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.371259 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="openstack-network-exporter" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.371395 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="openstack-network-exporter" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.371408 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerName="ovsdbserver-sb" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.372236 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.376750 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.391209 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-9mljp"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdb-rundir\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-scripts\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-combined-ca-bundle\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-metrics-certs-tls-certs\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdbserver-sb-tls-certs\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg2ft\" (UniqueName: \"kubernetes.io/projected/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-kube-api-access-mg2ft\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.395828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-config\") pod \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\" (UID: \"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e\") " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.396057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whkz\" (UniqueName: \"kubernetes.io/projected/48f50e81-5c99-4197-823e-f6967c42a20c-kube-api-access-8whkz\") 
pod \"nova-cell1-80bd-account-create-update-4cxdh\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.396147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f50e81-5c99-4197-823e-f6967c42a20c-operator-scripts\") pod \"nova-cell1-80bd-account-create-update-4cxdh\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.396875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.400923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-scripts" (OuterVolumeSpecName: "scripts") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.407879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-config" (OuterVolumeSpecName: "config") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.408103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-kube-api-access-mg2ft" (OuterVolumeSpecName: "kube-api-access-mg2ft") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "kube-api-access-mg2ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.408154 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-9mljp"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.425082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.426756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.444556 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-8gwrb"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.451664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5977d44c4b-9wznx"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.451968 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-log" containerID="cri-o://762d5296ac937d5e18cdab76d4b50154815f31ea6c484242500f5fa8c55bfbe3" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.452311 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-api" containerID="cri-o://f89ab992dfc657152c4b63f182969f628ca255996d2039d51a99b906f27be1d7" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.462191 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.469560 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.469602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.506690 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.506903 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-api" containerID="cri-o://4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.507020 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-httpd" containerID="cri-o://05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.507982 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whkz\" (UniqueName: \"kubernetes.io/projected/48f50e81-5c99-4197-823e-f6967c42a20c-kube-api-access-8whkz\") pod \"nova-cell1-80bd-account-create-update-4cxdh\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f50e81-5c99-4197-823e-f6967c42a20c-operator-scripts\") pod \"nova-cell1-80bd-account-create-update-4cxdh\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgxw4\" (UniqueName: \"kubernetes.io/projected/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-kube-api-access-hgxw4\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528408 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528421 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg2ft\" (UniqueName: \"kubernetes.io/projected/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-kube-api-access-mg2ft\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528432 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528440 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528448 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-scripts\") on node \"crc\" DevicePath 
\"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.528455 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.531755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f50e81-5c99-4197-823e-f6967c42a20c-operator-scripts\") pod \"nova-cell1-80bd-account-create-update-4cxdh\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.600101 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="rabbitmq" containerID="cri-o://001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287" gracePeriod=604800 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.601209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whkz\" (UniqueName: \"kubernetes.io/projected/48f50e81-5c99-4197-823e-f6967c42a20c-kube-api-access-8whkz\") pod \"nova-cell1-80bd-account-create-update-4cxdh\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.604283 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-bb59f"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.620874 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-bb59f"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.631239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgxw4\" (UniqueName: \"kubernetes.io/projected/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-kube-api-access-hgxw4\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.631304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.631387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.634698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.635331 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.648157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.664915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgxw4\" (UniqueName: \"kubernetes.io/projected/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-kube-api-access-hgxw4\") pod \"dnsmasq-dnsmasq-84b9f45d47-vgrnt\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.680877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.695383 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-q6vqd"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.704586 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.706338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" (UID: "2b37dca3-6de3-4496-93ad-e1cff3fc0e3e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.724389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.735928 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.736191 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.736210 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.738088 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.738149 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data podName:6bac3bc5-b4ca-4333-9fc6-851430c48708 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.738132524 +0000 UTC m=+1673.919648746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data") pod "rabbitmq-server-0" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708") : configmap "rabbitmq-config-data" not found Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.762849 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-zjhmd"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771260 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771294 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771302 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771309 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771314 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771322 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" 
containerID="4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771331 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771338 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771343 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771348 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771353 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a"} Jan 21 
15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.771484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.774879 4707 generic.go:334] "Generic (PLEG): container finished" podID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerID="762d5296ac937d5e18cdab76d4b50154815f31ea6c484242500f5fa8c55bfbe3" exitCode=143 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.774932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" event={"ID":"faf162ab-5917-4ddc-9b89-573d2e3bc7e6","Type":"ContainerDied","Data":"762d5296ac937d5e18cdab76d4b50154815f31ea6c484242500f5fa8c55bfbe3"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.776930 4707 generic.go:334] "Generic (PLEG): container finished" podID="f161ce76-80a9-470b-8776-df5af1799b7e" containerID="aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5" exitCode=0 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.776972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f161ce76-80a9-470b-8776-df5af1799b7e","Type":"ContainerDied","Data":"aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.778895 4707 generic.go:334] "Generic (PLEG): container finished" podID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerID="2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c" exitCode=2 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.778934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5","Type":"ContainerDied","Data":"2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.800981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_2b37dca3-6de3-4496-93ad-e1cff3fc0e3e/ovsdbserver-sb/0.log" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.801018 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" containerID="229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43" exitCode=143 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.801137 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.801299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e","Type":"ContainerDied","Data":"229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.801342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"2b37dca3-6de3-4496-93ad-e1cff3fc0e3e","Type":"ContainerDied","Data":"7463dce460246bcc9c4ccb82217244ca5cd983747a19134b492c77bafd55e439"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.801359 4707 scope.go:117] "RemoveContainer" containerID="e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6" Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.810660 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:34 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:34 crc kubenswrapper[4707]: Jan 21 15:29:34 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:34 crc kubenswrapper[4707]: Jan 21 15:29:34 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:34 crc kubenswrapper[4707]: Jan 21 15:29:34 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:34 crc kubenswrapper[4707]: Jan 21 15:29:34 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:29:34 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:29:34 crc kubenswrapper[4707]: else Jan 21 15:29:34 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:34 crc kubenswrapper[4707]: fi Jan 21 15:29:34 crc kubenswrapper[4707]: Jan 21 15:29:34 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:34 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:34 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:34 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:34 crc kubenswrapper[4707]: # support updates Jan 21 15:29:34 crc kubenswrapper[4707]: Jan 21 15:29:34 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.814996 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_c475086c-e924-418c-bdd8-0b61fe31f950/ovsdbserver-nb/0.log" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.815037 4707 generic.go:334] "Generic (PLEG): container finished" podID="c475086c-e924-418c-bdd8-0b61fe31f950" containerID="d9976eaa71a516b23e46173e6991270fa567b0fd2fdf3a5753ae210e1a2cdd0f" exitCode=2 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.815051 4707 generic.go:334] "Generic (PLEG): container finished" podID="c475086c-e924-418c-bdd8-0b61fe31f950" containerID="6773818b9477157356db610d5c8e6c5f70dba1e6f5a99dffecbb55f651495e61" exitCode=143 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.815093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"c475086c-e924-418c-bdd8-0b61fe31f950","Type":"ContainerDied","Data":"d9976eaa71a516b23e46173e6991270fa567b0fd2fdf3a5753ae210e1a2cdd0f"} Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.815116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"c475086c-e924-418c-bdd8-0b61fe31f950","Type":"ContainerDied","Data":"6773818b9477157356db610d5c8e6c5f70dba1e6f5a99dffecbb55f651495e61"} Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.815407 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" podUID="8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.857142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-7xszt"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.870406 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-7xszt"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.882939 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.883125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-httpd" containerID="cri-o://afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.883474 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-server" containerID="cri-o://c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.888864 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.897981 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-408c-account-create-update-7pgf5"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.916744 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-408c-account-create-update-7pgf5"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.924492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzkq7"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.930115 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.936584 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.936842 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-log" containerID="cri-o://0f79ffcc51a1df0f60b92a65eed0727ebe1a87e5686c3cfa9b48daa4dee20df1" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.936867 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-httpd" containerID="cri-o://dac905514aca127784ff10619aa0102a1724955c3593a17e7fa6fb6888775848" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.941012 4707 scope.go:117] "RemoveContainer" containerID="229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.941247 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.941715 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g8rcm"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.950541 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g8rcm"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.960751 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.961220 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-log" containerID="cri-o://940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.961738 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-httpd" containerID="cri-o://a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233" gracePeriod=30 Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.970420 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.973697 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-fwzmm"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.979633 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-fwzmm"] Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.989240 4707 scope.go:117] "RemoveContainer" containerID="e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.989379 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv"] Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.990063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6\": container with ID starting with e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6 not found: ID does not exist" containerID="e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.990090 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6"} err="failed to get container status \"e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6\": rpc error: code = NotFound desc = could not find container \"e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6\": container with ID starting with e2550bd662895b0eae3a788982d692c47494d3810cd8c559b44b7fa4a570e9a6 not found: ID does not exist" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.990112 4707 scope.go:117] "RemoveContainer" containerID="229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43" Jan 21 15:29:34 crc kubenswrapper[4707]: E0121 15:29:34.994402 4707 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43\": container with ID starting with 229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43 not found: ID does not exist" containerID="229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.994421 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43"} err="failed to get container status \"229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43\": rpc error: code = NotFound desc = could not find container \"229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43\": container with ID starting with 229565198f8a6d8fb7e772dcc866fb2968ed9f6df327433ee518623e61eada43 not found: ID does not exist" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.996096 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_c475086c-e924-418c-bdd8-0b61fe31f950/ovsdbserver-nb/0.log" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.996142 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:29:34 crc kubenswrapper[4707]: I0121 15:29:34.997082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.006877 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.033195 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-48sg8"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.039131 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-48sg8"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.043316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.043360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.043612 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.043654 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle podName:249a8e47-360d-481e-bf1c-9d58ad545f77 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.043641676 +0000 UTC m=+1674.225157898 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle") pod "barbican-worker-b89f5977c-485np" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77") : secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.043963 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.043995 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle podName:0080fa38-5728-4417-968b-156c1a7df0bf nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.043987998 +0000 UTC m=+1674.225504220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle") pod "barbican-keystone-listener-6457bf97d6-6nrnm" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf") : secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.051406 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.063405 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lxdwg"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.075858 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lxdwg"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.093074 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7"] Jan 21 15:29:35 crc kubenswrapper[4707]: W0121 15:29:35.099654 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4d5044_73fa_4995_8594_0541964e35e8.slice/crio-afca4d3242e761f4e1914710f4e42d3a067b4206e7cb051b53eb5ef1cf98e00f WatchSource:0}: Error finding container afca4d3242e761f4e1914710f4e42d3a067b4206e7cb051b53eb5ef1cf98e00f: Status 404 returned error can't find the container with id afca4d3242e761f4e1914710f4e42d3a067b4206e7cb051b53eb5ef1cf98e00f Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.110146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.110198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.110373 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-log" containerID="cri-o://2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.110753 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-api" containerID="cri-o://e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.119366 4707 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-tcjlc"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.121876 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.123462 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" podUID="5d4d5044-73fa-4995-8594-0541964e35e8" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.125655 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-tcjlc"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.135732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-jfctd"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.141289 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-jfctd"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-config\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-combined-ca-bundle\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144517 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-scripts\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdb-rundir\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144597 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6d2r\" (UniqueName: \"kubernetes.io/projected/c475086c-e924-418c-bdd8-0b61fe31f950-kube-api-access-n6d2r\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-metrics-certs-tls-certs\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.144740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdbserver-nb-tls-certs\") pod \"c475086c-e924-418c-bdd8-0b61fe31f950\" (UID: \"c475086c-e924-418c-bdd8-0b61fe31f950\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.147880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-scripts" (OuterVolumeSpecName: "scripts") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.148343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-config" (OuterVolumeSpecName: "config") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.149299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.153306 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.166430 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.166732 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-log" containerID="cri-o://3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.167854 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-metadata" containerID="cri-o://1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.177259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.179507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c475086c-e924-418c-bdd8-0b61fe31f950-kube-api-access-n6d2r" (OuterVolumeSpecName: "kube-api-access-n6d2r") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "kube-api-access-n6d2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.188115 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" secret="" err="secret \"galera-openstack-cell1-dockercfg-zxrdl\" not found" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.215553 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.217331 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" podUID="5b669faa-0600-4ffd-918d-b33788b18eb9" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.225455 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b384d69-2797-4368-9779-c0860b850738" path="/var/lib/kubelet/pods/0b384d69-2797-4368-9779-c0860b850738/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.225972 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce47035-1fcb-413a-b5ff-d3207523b1ef" path="/var/lib/kubelet/pods/0ce47035-1fcb-413a-b5ff-d3207523b1ef/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.230488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.231033 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120902fc-6e0c-4568-af50-7cc575b4be15" path="/var/lib/kubelet/pods/120902fc-6e0c-4568-af50-7cc575b4be15/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.232082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8" path="/var/lib/kubelet/pods/1a246ce3-51b4-4914-9f3e-4cc6b4ca58d8/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.234440 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b37dca3-6de3-4496-93ad-e1cff3fc0e3e" path="/var/lib/kubelet/pods/2b37dca3-6de3-4496-93ad-e1cff3fc0e3e/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.235251 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399b071b-395f-4701-b7b9-3c3868acae7f" path="/var/lib/kubelet/pods/399b071b-395f-4701-b7b9-3c3868acae7f/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.241585 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927" path="/var/lib/kubelet/pods/3b55957d-e7aa-4c3b-b4d7-d6e14c3b5927/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.243506 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe" path="/var/lib/kubelet/pods/4a2a0d64-5c2e-48a0-92d4-34f1536f7cbe/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.244044 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625c54ec-3c40-4fd5-a2f9-583dedf97df7" path="/var/lib/kubelet/pods/625c54ec-3c40-4fd5-a2f9-583dedf97df7/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.244708 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac97b49-c5fc-425e-91cb-8dd30213b600" path="/var/lib/kubelet/pods/6ac97b49-c5fc-425e-91cb-8dd30213b600/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.250078 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.250146 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data podName:d54b9100-265d-49a3-a2ab-72e8787510df nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.250128499 +0000 UTC m=+1674.431644722 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data") pod "rabbitmq-cell1-server-0" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.250545 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6d2r\" (UniqueName: \"kubernetes.io/projected/c475086c-e924-418c-bdd8-0b61fe31f950-kube-api-access-n6d2r\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.250597 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.250610 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.250619 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.250627 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c475086c-e924-418c-bdd8-0b61fe31f950-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.250635 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.250757 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.250781 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:35.750773863 +0000 UTC m=+1672.932290084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.250833 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.250854 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:35.750848432 +0000 UTC m=+1672.932364654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.280374 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.286539 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec845a4-b468-4927-a900-58afe8305a5c" path="/var/lib/kubelet/pods/7ec845a4-b468-4927-a900-58afe8305a5c/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.287659 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832d985c-f2bb-47c8-b4ae-95cbfc58b024" path="/var/lib/kubelet/pods/832d985c-f2bb-47c8-b4ae-95cbfc58b024/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.288129 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:35.788109653 +0000 UTC m=+1672.969625874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-scripts" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.288199 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.288227 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:35.788220271 +0000 UTC m=+1672.969736493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.288282 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.288316 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:35.788308587 +0000 UTC m=+1672.969824808 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.288411 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dba5d8-7aea-4717-a816-d26edb1da323" path="/var/lib/kubelet/pods/89dba5d8-7aea-4717-a816-d26edb1da323/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.289333 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4c7891-17ea-4609-865e-ed35416688f0" path="/var/lib/kubelet/pods/8e4c7891-17ea-4609-865e-ed35416688f0/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.291768 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="983aa7f0-fb2b-46d5-912a-c0d8b26bff20" path="/var/lib/kubelet/pods/983aa7f0-fb2b-46d5-912a-c0d8b26bff20/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.293713 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6bebd0-9910-4926-8ee0-0510b9b99d90" path="/var/lib/kubelet/pods/9e6bebd0-9910-4926-8ee0-0510b9b99d90/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.303591 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9ae62d-9e10-4db0-a711-28078f1ba665" path="/var/lib/kubelet/pods/ab9ae62d-9e10-4db0-a711-28078f1ba665/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.304250 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e" path="/var/lib/kubelet/pods/abcbb0c2-58e3-45e6-87ab-a1ef3bc43d0e/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.304894 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac82c533-38ae-4570-9826-a85958afa2c0" path="/var/lib/kubelet/pods/ac82c533-38ae-4570-9826-a85958afa2c0/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.305446 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9" path="/var/lib/kubelet/pods/ae47c2e2-4e54-4c89-b4c2-80f35c34ccc9/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.306403 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a438bc-0bc6-4de6-b01d-7376dbf25a5c" path="/var/lib/kubelet/pods/b1a438bc-0bc6-4de6-b01d-7376dbf25a5c/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.306935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd32f87d-9ff2-481d-82ac-0ceb13492b31" path="/var/lib/kubelet/pods/bd32f87d-9ff2-481d-82ac-0ceb13492b31/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.307452 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e4cf85-ea8a-4428-b07c-3f1659259c77" path="/var/lib/kubelet/pods/d0e4cf85-ea8a-4428-b07c-3f1659259c77/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.308973 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.313685 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2969b7b-1db8-4940-ad96-4f68d565fe0d" path="/var/lib/kubelet/pods/d2969b7b-1db8-4940-ad96-4f68d565fe0d/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: 
I0121 15:29:35.314326 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2f4f3c-d3df-40d2-91e4-4f98a4487bbf" path="/var/lib/kubelet/pods/de2f4f3c-d3df-40d2-91e4-4f98a4487bbf/volumes" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.314897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331311 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331504 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331518 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331529 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331539 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-h728t"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331555 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-h728t"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.331569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.335794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c475086c-e924-418c-bdd8-0b61fe31f950" (UID: "c475086c-e924-418c-bdd8-0b61fe31f950"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.342456 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.345850 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener-log" containerID="cri-o://b4573b28ecd3cce9248e8a085493d4d875d269b88b74b45cf9bfa26ecf1c1f4a" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.346312 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener" containerID="cri-o://858a7831d2593cf2e3d08f763a792322ce65f0af93f68b5ca794de32f64a614f" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.353141 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.353213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.353362 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api-log" containerID="cri-o://9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.353647 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api" containerID="cri-o://8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.355753 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" podUID="fbb7bc95-1acd-4048-89a4-01ea8ae34d34" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.362457 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.362479 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.362488 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c475086c-e924-418c-bdd8-0b61fe31f950-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.368390 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.369000 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" podUID="0080fa38-5728-4417-968b-156c1a7df0bf" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.373767 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.374001 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="184ed54c-4ce8-4d12-ae34-255b44708d27" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.379836 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b555f485-zww44"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.379986 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker-log" containerID="cri-o://7d5e60913431e94a855dc86f1bcab868f337b843d01bafdb458a1a24b751facd" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.380005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker" containerID="cri-o://f81c42fb84cf6dc6db11797efd104f4de9e6929e765ab0ffe83b4f838fac8280" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.380274 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="rabbitmq" containerID="cri-o://ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d" gracePeriod=604800 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.391512 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-b89f5977c-485np"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.392006 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" podUID="249a8e47-360d-481e-bf1c-9d58ad545f77" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.404104 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.404127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-b6gqr"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.404139 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.404251 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="efb1c248-d3f2-4038-8f84-01a4de59d39e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.409435 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.434967 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-p44sf"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.440905 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.441112 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f5fe6793-4878-480a-a359-225be30e006c" containerName="nova-cell0-conductor-conductor" 
containerID="cri-o://4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.446102 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.498234 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.498479 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-central-agent" containerID="cri-o://a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.498854 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="proxy-httpd" containerID="cri-o://37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.498918 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="sg-core" containerID="cri-o://eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.498951 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-notification-agent" containerID="cri-o://e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.546415 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.547518 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" podUID="5ae763d3-de1a-4a94-94bf-41c06bfe239f" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.548332 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.552420 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.568117 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.570001 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.570222 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="7d909d33-58f9-4a53-b98a-aa1927df3ea3" containerName="kube-state-metrics" containerID="cri-o://c79dfd4c99dbc23a0bdc487e4cc960e31cd013b21ba5d1830d4fab5e1cc9d84d" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.577334 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.584559 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 
15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.585754 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" podUID="c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.588243 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.588298 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="ovn-northd" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.620732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.652473 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4rttk"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.656023 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.657969 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.658325 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="openstack-network-exporter" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.658337 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="openstack-network-exporter" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.658357 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="ovsdbserver-nb" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.658363 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="ovsdbserver-nb" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.658515 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="openstack-network-exporter" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.658530 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" containerName="ovsdbserver-nb" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.659649 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.661924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.662154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.713191 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-nz8gm"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.746614 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-nz8gm"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.760954 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fxgzb"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.767966 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fxgzb"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.793518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-combined-ca-bundle\") pod \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.794070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config-secret\") pod \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.794132 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6s4c\" (UniqueName: \"kubernetes.io/projected/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-kube-api-access-g6s4c\") pod \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.794241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config\") pod \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\" (UID: \"f6edfee9-38b1-44ae-bbf1-ff909ee38bfc\") " Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.794620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lsln\" (UniqueName: \"kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.794689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.795062 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.795176 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.795109385 +0000 UTC m=+1673.976625607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : secret "combined-ca-bundle" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.798016 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.798069 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.798059801 +0000 UTC m=+1673.979576023 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-scripts" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.801257 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.801303 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.801291286 +0000 UTC m=+1673.982807509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.801340 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.801359 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.801353083 +0000 UTC m=+1673.982869305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.801387 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.801405 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.801399941 +0000 UTC m=+1673.982916163 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-config-data" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.807049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-kube-api-access-g6s4c" (OuterVolumeSpecName: "kube-api-access-g6s4c") pod "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" (UID: "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc"). InnerVolumeSpecName "kube-api-access-g6s4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.820984 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-664998d598-nfbb8"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.821170 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" podUID="4638ccbe-c266-4bff-8771-67eb02be0a6a" containerName="keystone-api" containerID="cri-o://aeed4f7ea398c8aec570dded3f936a9753106f96a565795b7770c4040b2cca32" gracePeriod=30 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.829579 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.840262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-52g8k"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.841660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" (UID: "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.849224 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-52g8k"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.852973 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db"] Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.853688 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2lsln operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" podUID="901b197a-7bb5-4257-983b-560632e8987f" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.857026 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzkq7"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.864707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" (UID: "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.864773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" event={"ID":"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8","Type":"ContainerStarted","Data":"89002195f52b9c360b417246f7df9ce298eaecfea9be208be4a2cc5928cc3cef"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.864949 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" secret="" err="secret \"galera-openstack-dockercfg-brfjv\" not found" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.866871 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.867553 4707 generic.go:334] "Generic (PLEG): container finished" podID="ccc6250e-7692-4325-896f-33f84c241afb" containerID="3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.867603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ccc6250e-7692-4325-896f-33f84c241afb","Type":"ContainerDied","Data":"3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221"} Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.869298 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.870865 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" podUID="8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.874969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" (UID: "f6edfee9-38b1-44ae-bbf1-ff909ee38bfc"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.885422 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.885444 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.885451 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.885492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.885509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.885518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.894067 4707 generic.go:334] "Generic (PLEG): container finished" podID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerID="0f79ffcc51a1df0f60b92a65eed0727ebe1a87e5686c3cfa9b48daa4dee20df1" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.894108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"09b00a3d-6e52-458e-8fb3-6c48d658ff86","Type":"ContainerDied","Data":"0f79ffcc51a1df0f60b92a65eed0727ebe1a87e5686c3cfa9b48daa4dee20df1"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.898366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.898530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lsln\" (UniqueName: \"kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.898571 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.898583 4707 
reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.898593 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6s4c\" (UniqueName: \"kubernetes.io/projected/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-kube-api-access-g6s4c\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.898602 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.898795 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.898889 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts podName:901b197a-7bb5-4257-983b-560632e8987f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.398877387 +0000 UTC m=+1673.580393610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts") pod "keystone-b6c7-account-create-update-4r7db" (UID: "901b197a-7bb5-4257-983b-560632e8987f") : configmap "openstack-scripts" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.899574 4707 generic.go:334] "Generic (PLEG): container finished" podID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerID="9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.900065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" event={"ID":"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce","Type":"ContainerDied","Data":"9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59"} Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.903179 4707 projected.go:194] Error preparing data for projected volume kube-api-access-2lsln for pod openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.903255 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln podName:901b197a-7bb5-4257-983b-560632e8987f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.403232725 +0000 UTC m=+1673.584748947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2lsln" (UniqueName: "kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln") pod "keystone-b6c7-account-create-update-4r7db" (UID: "901b197a-7bb5-4257-983b-560632e8987f") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.910942 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9f3e600-3622-4bc5-9481-86560969ee79" containerID="2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.911005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a9f3e600-3622-4bc5-9481-86560969ee79","Type":"ContainerDied","Data":"2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b"} Jan 21 15:29:35 crc kubenswrapper[4707]: W0121 15:29:35.911351 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2b6336_8cd0_4321_b5ed_6bd1e321b93c.slice/crio-709b1c2333202b4ae32673fb0530e208c401bfad1dbf3db56bf8916bfb22d484 WatchSource:0}: Error finding container 709b1c2333202b4ae32673fb0530e208c401bfad1dbf3db56bf8916bfb22d484: Status 404 returned error can't find the container with id 709b1c2333202b4ae32673fb0530e208c401bfad1dbf3db56bf8916bfb22d484 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.916042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.921097 4707 generic.go:334] "Generic (PLEG): container finished" podID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerID="37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.921122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerDied","Data":"37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.921153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerDied","Data":"eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.921173 4707 generic.go:334] "Generic (PLEG): container finished" podID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerID="eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4" exitCode=2 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.924989 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh"] Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.933312 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4c27903-58da-47dc-97d0-b1328c58303d" containerID="b4573b28ecd3cce9248e8a085493d4d875d269b88b74b45cf9bfa26ecf1c1f4a" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.933364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" event={"ID":"d4c27903-58da-47dc-97d0-b1328c58303d","Type":"ContainerDied","Data":"b4573b28ecd3cce9248e8a085493d4d875d269b88b74b45cf9bfa26ecf1c1f4a"} Jan 21 15:29:35 crc 
kubenswrapper[4707]: I0121 15:29:35.935257 4707 generic.go:334] "Generic (PLEG): container finished" podID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerID="05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.935321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" event={"ID":"deb292be-f24c-461f-ac51-f1d4daa3f261","Type":"ContainerDied","Data":"05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46"} Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.936184 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:29:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:29:35 crc kubenswrapper[4707]: else Jan 21 15:29:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:29:35 crc kubenswrapper[4707]: fi Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:29:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:29:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:29:35 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:29:35 crc kubenswrapper[4707]: # support updates Jan 21 15:29:35 crc kubenswrapper[4707]: Jan 21 15:29:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.936901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" event={"ID":"5ae763d3-de1a-4a94-94bf-41c06bfe239f","Type":"ContainerStarted","Data":"f6b22179bacde6d2bfd9e499ee207e8287515c8a31d3daf2a80e1caa021038c9"} Jan 21 15:29:35 crc kubenswrapper[4707]: E0121 15:29:35.937372 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" podUID="48f50e81-5c99-4197-823e-f6967c42a20c" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.940692 4707 generic.go:334] "Generic (PLEG): container finished" podID="700a8d9c-852c-4611-9a98-50418d7df800" containerID="c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.940713 4707 generic.go:334] "Generic (PLEG): container finished" podID="700a8d9c-852c-4611-9a98-50418d7df800" containerID="afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f" exitCode=0 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.940743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" event={"ID":"700a8d9c-852c-4611-9a98-50418d7df800","Type":"ContainerDied","Data":"c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.940759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" event={"ID":"700a8d9c-852c-4611-9a98-50418d7df800","Type":"ContainerDied","Data":"afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.940774 4707 scope.go:117] "RemoveContainer" containerID="c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.940886 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.948698 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_c475086c-e924-418c-bdd8-0b61fe31f950/ovsdbserver-nb/0.log" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.948796 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.948867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"c475086c-e924-418c-bdd8-0b61fe31f950","Type":"ContainerDied","Data":"3b9e51c0fe6c76aec32ebbc8d0bb9d6ca15609a0cf57bcbf6cb6ab8cd7e52ecf"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.954356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" event={"ID":"5b669faa-0600-4ffd-918d-b33788b18eb9","Type":"ContainerStarted","Data":"83e855f3f3c81f300cf9ddf93f5bd50e5e0b1e865c7b58edb6ce098881bb182f"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.969017 4707 generic.go:334] "Generic (PLEG): container finished" podID="33a83157-f122-493a-acd7-ba671cfce233" containerID="940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.969085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"33a83157-f122-493a-acd7-ba671cfce233","Type":"ContainerDied","Data":"940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.972641 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a287941-c7ad-4671-904b-03172bc803e8" containerID="7d5e60913431e94a855dc86f1bcab868f337b843d01bafdb458a1a24b751facd" exitCode=143 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.972701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" event={"ID":"8a287941-c7ad-4671-904b-03172bc803e8","Type":"ContainerDied","Data":"7d5e60913431e94a855dc86f1bcab868f337b843d01bafdb458a1a24b751facd"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.973924 4707 generic.go:334] "Generic (PLEG): container finished" podID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerID="44277af5c0f4f6569bed8d7b6aa6359e8dac4fb7085d41d7548def3d3ce9aecb" exitCode=1 Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.973964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" event={"ID":"e8daa249-22b3-4782-b64b-d9dca84a8777","Type":"ContainerDied","Data":"44277af5c0f4f6569bed8d7b6aa6359e8dac4fb7085d41d7548def3d3ce9aecb"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.973981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" event={"ID":"e8daa249-22b3-4782-b64b-d9dca84a8777","Type":"ContainerStarted","Data":"2840ce53faf0cf4ee686293752433c05461b4ca44d152320dfd77e3cd22f20e5"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.974393 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-jzkq7" secret="" err="secret \"galera-openstack-dockercfg-brfjv\" not found" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.974425 4707 scope.go:117] "RemoveContainer" containerID="44277af5c0f4f6569bed8d7b6aa6359e8dac4fb7085d41d7548def3d3ce9aecb" Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.982107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" event={"ID":"fbb7bc95-1acd-4048-89a4-01ea8ae34d34","Type":"ContainerStarted","Data":"ff18eac4e43d9db65c8c34b9558763ffa857e042ca6b003d96fb99506cc59ba5"} Jan 21 15:29:35 crc kubenswrapper[4707]: I0121 15:29:35.994052 4707 scope.go:117] "RemoveContainer" containerID="afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.000022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-log-httpd\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.000084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-etc-swift\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.000246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-internal-tls-certs\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.000333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-combined-ca-bundle\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.000837 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.001246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-config-data\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.001313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-public-tls-certs\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.001337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kqp8\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-kube-api-access-8kqp8\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.001436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-run-httpd\") pod \"700a8d9c-852c-4611-9a98-50418d7df800\" (UID: \"700a8d9c-852c-4611-9a98-50418d7df800\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.003392 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.003463 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.003501 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts podName:8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.503485368 +0000 UTC m=+1673.685001590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts") pod "glance-d5d6-account-create-update-zcbtm" (UID: "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8") : configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.014462 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d909d33-58f9-4a53-b98a-aa1927df3ea3" containerID="c79dfd4c99dbc23a0bdc487e4cc960e31cd013b21ba5d1830d4fab5e1cc9d84d" exitCode=2 Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.014536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7d909d33-58f9-4a53-b98a-aa1927df3ea3","Type":"ContainerDied","Data":"c79dfd4c99dbc23a0bdc487e4cc960e31cd013b21ba5d1830d4fab5e1cc9d84d"} Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.018007 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" containerID="07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7" exitCode=137 Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.018134 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.021863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.026665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" event={"ID":"5d4d5044-73fa-4995-8594-0541964e35e8","Type":"ContainerStarted","Data":"afca4d3242e761f4e1914710f4e42d3a067b4206e7cb051b53eb5ef1cf98e00f"} Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.035815 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.036272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" event={"ID":"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc","Type":"ContainerStarted","Data":"717edc04b22288af6646f344f0f3f2b7d5cc903aeec61f4f11538142fe30ce97"} Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.037666 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.049116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-kube-api-access-8kqp8" (OuterVolumeSpecName: "kube-api-access-8kqp8") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "kube-api-access-8kqp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.049356 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.052573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.072052 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.073566 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.075007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.083493 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.083700 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f5fe6793-4878-480a-a359-225be30e006c" containerName="nova-cell0-conductor-conductor" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.087369 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerName="galera" containerID="cri-o://338edc6004b39eadad308bfb8bb14c8882a1995789ae4fdf1b5e3424d971b9e4" gracePeriod=30 Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.087692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.101728 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.102074 4707 scope.go:117] "RemoveContainer" containerID="c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.104623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data\") pod \"0080fa38-5728-4417-968b-156c1a7df0bf\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.104694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks9tb\" (UniqueName: \"kubernetes.io/projected/0080fa38-5728-4417-968b-156c1a7df0bf-kube-api-access-ks9tb\") pod \"0080fa38-5728-4417-968b-156c1a7df0bf\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.104876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data-custom\") pod \"0080fa38-5728-4417-968b-156c1a7df0bf\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.104992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080fa38-5728-4417-968b-156c1a7df0bf-logs\") pod \"0080fa38-5728-4417-968b-156c1a7df0bf\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.105446 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.105463 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.105472 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.105481 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kqp8\" (UniqueName: \"kubernetes.io/projected/700a8d9c-852c-4611-9a98-50418d7df800-kube-api-access-8kqp8\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.105489 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/700a8d9c-852c-4611-9a98-50418d7df800-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.105536 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.105571 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts 
podName:e8daa249-22b3-4782-b64b-d9dca84a8777 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:36.605559497 +0000 UTC m=+1673.787075719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts") pod "root-account-create-update-jzkq7" (UID: "e8daa249-22b3-4782-b64b-d9dca84a8777") : configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.106995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0080fa38-5728-4417-968b-156c1a7df0bf-logs" (OuterVolumeSpecName: "logs") pod "0080fa38-5728-4417-968b-156c1a7df0bf" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.108373 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729\": container with ID starting with c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729 not found: ID does not exist" containerID="c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.108417 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729"} err="failed to get container status \"c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729\": rpc error: code = NotFound desc = could not find container \"c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729\": container with ID starting with c4520f976e63dc2125a8b15cebfb24a64ed69a422e285c577658cd1fc9856729 not found: ID does not exist" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.108440 4707 scope.go:117] "RemoveContainer" containerID="afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.109064 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f\": container with ID starting with afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f not found: ID does not exist" containerID="afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.109107 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f"} err="failed to get container status \"afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f\": rpc error: code = NotFound desc = could not find container \"afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f\": container with ID starting with afed057a770cedc7d723966183e5ed71b71a615fbcc9928054a0094dc60ebf5f not found: ID does not exist" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.109137 4707 scope.go:117] "RemoveContainer" containerID="d9976eaa71a516b23e46173e6991270fa567b0fd2fdf3a5753ae210e1a2cdd0f" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.109230 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.114080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0080fa38-5728-4417-968b-156c1a7df0bf" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.117559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data" (OuterVolumeSpecName: "config-data") pod "0080fa38-5728-4417-968b-156c1a7df0bf" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.118311 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.120597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0080fa38-5728-4417-968b-156c1a7df0bf-kube-api-access-ks9tb" (OuterVolumeSpecName: "kube-api-access-ks9tb") pod "0080fa38-5728-4417-968b-156c1a7df0bf" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf"). InnerVolumeSpecName "kube-api-access-ks9tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.136052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.150848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-config-data" (OuterVolumeSpecName: "config-data") pod "700a8d9c-852c-4611-9a98-50418d7df800" (UID: "700a8d9c-852c-4611-9a98-50418d7df800"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.156059 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.157612 4707 scope.go:117] "RemoveContainer" containerID="6773818b9477157356db610d5c8e6c5f70dba1e6f5a99dffecbb55f651495e61" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.202247 4707 scope.go:117] "RemoveContainer" containerID="07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-certs\") pod \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a8e47-360d-481e-bf1c-9d58ad545f77-logs\") pod \"249a8e47-360d-481e-bf1c-9d58ad545f77\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-combined-ca-bundle\") pod \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-config\") pod \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data\") pod \"249a8e47-360d-481e-bf1c-9d58ad545f77\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2zfx\" (UniqueName: \"kubernetes.io/projected/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-api-access-x2zfx\") pod \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\" (UID: \"7d909d33-58f9-4a53-b98a-aa1927df3ea3\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvwlb\" (UniqueName: \"kubernetes.io/projected/249a8e47-360d-481e-bf1c-9d58ad545f77-kube-api-access-qvwlb\") pod \"249a8e47-360d-481e-bf1c-9d58ad545f77\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.206708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data-custom\") pod \"249a8e47-360d-481e-bf1c-9d58ad545f77\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.207252 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.207272 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.207281 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080fa38-5728-4417-968b-156c1a7df0bf-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.207290 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700a8d9c-852c-4611-9a98-50418d7df800-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.207297 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.207306 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks9tb\" (UniqueName: \"kubernetes.io/projected/0080fa38-5728-4417-968b-156c1a7df0bf-kube-api-access-ks9tb\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.208197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/249a8e47-360d-481e-bf1c-9d58ad545f77-logs" (OuterVolumeSpecName: "logs") pod "249a8e47-360d-481e-bf1c-9d58ad545f77" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.209268 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" containerName="galera" containerID="cri-o://690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533" gracePeriod=30 Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.212313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249a8e47-360d-481e-bf1c-9d58ad545f77-kube-api-access-qvwlb" (OuterVolumeSpecName: "kube-api-access-qvwlb") pod "249a8e47-360d-481e-bf1c-9d58ad545f77" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77"). InnerVolumeSpecName "kube-api-access-qvwlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.212413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "249a8e47-360d-481e-bf1c-9d58ad545f77" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.214045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data" (OuterVolumeSpecName: "config-data") pod "249a8e47-360d-481e-bf1c-9d58ad545f77" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.222738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-api-access-x2zfx" (OuterVolumeSpecName: "kube-api-access-x2zfx") pod "7d909d33-58f9-4a53-b98a-aa1927df3ea3" (UID: "7d909d33-58f9-4a53-b98a-aa1927df3ea3"). InnerVolumeSpecName "kube-api-access-x2zfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.238958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "7d909d33-58f9-4a53-b98a-aa1927df3ea3" (UID: "7d909d33-58f9-4a53-b98a-aa1927df3ea3"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.246140 4707 scope.go:117] "RemoveContainer" containerID="07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.252176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d909d33-58f9-4a53-b98a-aa1927df3ea3" (UID: "7d909d33-58f9-4a53-b98a-aa1927df3ea3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.253431 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7\": container with ID starting with 07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7 not found: ID does not exist" containerID="07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.253468 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7"} err="failed to get container status \"07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7\": rpc error: code = NotFound desc = could not find container \"07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7\": container with ID starting with 07fa62ac6d45e86baeb56e31ddba3e88b84c404c2ec17b879acc00478317a7d7 not found: ID does not exist" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.273417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "7d909d33-58f9-4a53-b98a-aa1927df3ea3" (UID: "7d909d33-58f9-4a53-b98a-aa1927df3ea3"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.273469 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl"] Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.278061 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f8c4bdc4-wf2dl"] Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309152 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309185 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309198 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2zfx\" (UniqueName: \"kubernetes.io/projected/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-api-access-x2zfx\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309207 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvwlb\" (UniqueName: \"kubernetes.io/projected/249a8e47-360d-481e-bf1c-9d58ad545f77-kube-api-access-qvwlb\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309216 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309224 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309231 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a8e47-360d-481e-bf1c-9d58ad545f77-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.309238 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d909d33-58f9-4a53-b98a-aa1927df3ea3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.410843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lsln\" (UniqueName: \"kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.410913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.411185 4707 
configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.411249 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts podName:901b197a-7bb5-4257-983b-560632e8987f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.411234791 +0000 UTC m=+1674.592751013 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts") pod "keystone-b6c7-account-create-update-4r7db" (UID: "901b197a-7bb5-4257-983b-560632e8987f") : configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.414220 4707 projected.go:194] Error preparing data for projected volume kube-api-access-2lsln for pod openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.414284 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln podName:901b197a-7bb5-4257-983b-560632e8987f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.414272272 +0000 UTC m=+1674.595788494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2lsln" (UniqueName: "kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln") pod "keystone-b6c7-account-create-update-4r7db" (UID: "901b197a-7bb5-4257-983b-560632e8987f") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.469634 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.469923 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.517199 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.517264 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts podName:8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.517250329 +0000 UTC m=+1674.698766551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts") pod "glance-d5d6-account-create-update-zcbtm" (UID: "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8") : configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.597646 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.615631 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.617758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lwn7\" (UniqueName: \"kubernetes.io/projected/5ae763d3-de1a-4a94-94bf-41c06bfe239f-kube-api-access-4lwn7\") pod \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.617823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b669faa-0600-4ffd-918d-b33788b18eb9-operator-scripts\") pod \"5b669faa-0600-4ffd-918d-b33788b18eb9\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.617998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtf2q\" (UniqueName: \"kubernetes.io/projected/5b669faa-0600-4ffd-918d-b33788b18eb9-kube-api-access-qtf2q\") pod \"5b669faa-0600-4ffd-918d-b33788b18eb9\" (UID: \"5b669faa-0600-4ffd-918d-b33788b18eb9\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.618057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae763d3-de1a-4a94-94bf-41c06bfe239f-operator-scripts\") pod \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\" (UID: \"5ae763d3-de1a-4a94-94bf-41c06bfe239f\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.618670 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.618735 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.618772 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts podName:e8daa249-22b3-4782-b64b-d9dca84a8777 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:37.618760778 +0000 UTC m=+1674.800276999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts") pod "root-account-create-update-jzkq7" (UID: "e8daa249-22b3-4782-b64b-d9dca84a8777") : configmap "openstack-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.618797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae763d3-de1a-4a94-94bf-41c06bfe239f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ae763d3-de1a-4a94-94bf-41c06bfe239f" (UID: "5ae763d3-de1a-4a94-94bf-41c06bfe239f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.619078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b669faa-0600-4ffd-918d-b33788b18eb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b669faa-0600-4ffd-918d-b33788b18eb9" (UID: "5b669faa-0600-4ffd-918d-b33788b18eb9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.623109 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.101:8776/healthcheck\": read tcp 10.217.0.2:48338->10.217.1.101:8776: read: connection reset by peer" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.623417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b669faa-0600-4ffd-918d-b33788b18eb9-kube-api-access-qtf2q" (OuterVolumeSpecName: "kube-api-access-qtf2q") pod "5b669faa-0600-4ffd-918d-b33788b18eb9" (UID: "5b669faa-0600-4ffd-918d-b33788b18eb9"). InnerVolumeSpecName "kube-api-access-qtf2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.627291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae763d3-de1a-4a94-94bf-41c06bfe239f-kube-api-access-4lwn7" (OuterVolumeSpecName: "kube-api-access-4lwn7") pod "5ae763d3-de1a-4a94-94bf-41c06bfe239f" (UID: "5ae763d3-de1a-4a94-94bf-41c06bfe239f"). InnerVolumeSpecName "kube-api-access-4lwn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.642108 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-nova-novncproxy-tls-certs\") pod \"184ed54c-4ce8-4d12-ae34-255b44708d27\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719407 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d5044-73fa-4995-8594-0541964e35e8-operator-scripts\") pod \"5d4d5044-73fa-4995-8594-0541964e35e8\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkrwc\" (UniqueName: \"kubernetes.io/projected/184ed54c-4ce8-4d12-ae34-255b44708d27-kube-api-access-qkrwc\") pod \"184ed54c-4ce8-4d12-ae34-255b44708d27\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-operator-scripts\") pod \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrvtt\" (UniqueName: \"kubernetes.io/projected/5d4d5044-73fa-4995-8594-0541964e35e8-kube-api-access-wrvtt\") pod \"5d4d5044-73fa-4995-8594-0541964e35e8\" (UID: \"5d4d5044-73fa-4995-8594-0541964e35e8\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-combined-ca-bundle\") pod \"184ed54c-4ce8-4d12-ae34-255b44708d27\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-operator-scripts\") pod \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4d5044-73fa-4995-8594-0541964e35e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d4d5044-73fa-4995-8594-0541964e35e8" (UID: "5d4d5044-73fa-4995-8594-0541964e35e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fxtt\" (UniqueName: \"kubernetes.io/projected/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-kube-api-access-9fxtt\") pod \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\" (UID: \"fbb7bc95-1acd-4048-89a4-01ea8ae34d34\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249tc\" (UniqueName: \"kubernetes.io/projected/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-kube-api-access-249tc\") pod \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\" (UID: \"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.719979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-config-data\") pod \"184ed54c-4ce8-4d12-ae34-255b44708d27\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-vencrypt-tls-certs\") pod \"184ed54c-4ce8-4d12-ae34-255b44708d27\" (UID: \"184ed54c-4ce8-4d12-ae34-255b44708d27\") " Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc" (UID: "c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720425 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae763d3-de1a-4a94-94bf-41c06bfe239f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720443 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lwn7\" (UniqueName: \"kubernetes.io/projected/5ae763d3-de1a-4a94-94bf-41c06bfe239f-kube-api-access-4lwn7\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720453 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b669faa-0600-4ffd-918d-b33788b18eb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720461 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d5044-73fa-4995-8594-0541964e35e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720469 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720478 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtf2q\" (UniqueName: \"kubernetes.io/projected/5b669faa-0600-4ffd-918d-b33788b18eb9-kube-api-access-qtf2q\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.720991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbb7bc95-1acd-4048-89a4-01ea8ae34d34" (UID: "fbb7bc95-1acd-4048-89a4-01ea8ae34d34"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.728024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4d5044-73fa-4995-8594-0541964e35e8-kube-api-access-wrvtt" (OuterVolumeSpecName: "kube-api-access-wrvtt") pod "5d4d5044-73fa-4995-8594-0541964e35e8" (UID: "5d4d5044-73fa-4995-8594-0541964e35e8"). InnerVolumeSpecName "kube-api-access-wrvtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.735264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184ed54c-4ce8-4d12-ae34-255b44708d27-kube-api-access-qkrwc" (OuterVolumeSpecName: "kube-api-access-qkrwc") pod "184ed54c-4ce8-4d12-ae34-255b44708d27" (UID: "184ed54c-4ce8-4d12-ae34-255b44708d27"). InnerVolumeSpecName "kube-api-access-qkrwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.747553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "184ed54c-4ce8-4d12-ae34-255b44708d27" (UID: "184ed54c-4ce8-4d12-ae34-255b44708d27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.748410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-kube-api-access-9fxtt" (OuterVolumeSpecName: "kube-api-access-9fxtt") pod "fbb7bc95-1acd-4048-89a4-01ea8ae34d34" (UID: "fbb7bc95-1acd-4048-89a4-01ea8ae34d34"). InnerVolumeSpecName "kube-api-access-9fxtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.751102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-kube-api-access-249tc" (OuterVolumeSpecName: "kube-api-access-249tc") pod "c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc" (UID: "c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc"). InnerVolumeSpecName "kube-api-access-249tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.757265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-config-data" (OuterVolumeSpecName: "config-data") pod "184ed54c-4ce8-4d12-ae34-255b44708d27" (UID: "184ed54c-4ce8-4d12-ae34-255b44708d27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.763734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "184ed54c-4ce8-4d12-ae34-255b44708d27" (UID: "184ed54c-4ce8-4d12-ae34-255b44708d27"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.773010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "184ed54c-4ce8-4d12-ae34-255b44708d27" (UID: "184ed54c-4ce8-4d12-ae34-255b44708d27"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838426 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838630 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838642 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkrwc\" (UniqueName: \"kubernetes.io/projected/184ed54c-4ce8-4d12-ae34-255b44708d27-kube-api-access-qkrwc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838651 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrvtt\" (UniqueName: \"kubernetes.io/projected/5d4d5044-73fa-4995-8594-0541964e35e8-kube-api-access-wrvtt\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838660 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838669 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838677 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fxtt\" (UniqueName: \"kubernetes.io/projected/fbb7bc95-1acd-4048-89a4-01ea8ae34d34-kube-api-access-9fxtt\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838686 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249tc\" (UniqueName: \"kubernetes.io/projected/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc-kube-api-access-249tc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.838694 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184ed54c-4ce8-4d12-ae34-255b44708d27-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838751 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838790 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:38.838777465 +0000 UTC m=+1676.020293687 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-config-data" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838836 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838857 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:38.838850784 +0000 UTC m=+1676.020367005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-scripts" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838904 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838922 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:38.838917028 +0000 UTC m=+1676.020433250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838946 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838962 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:38.838956472 +0000 UTC m=+1676.020472694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : configmap "openstack-cell1-config-data" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.838989 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.839005 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle podName:280f96dd-c140-464c-a906-d5428cda5b5c nodeName:}" failed. No retries permitted until 2026-01-21 15:29:38.838999492 +0000 UTC m=+1676.020515714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c") : secret "combined-ca-bundle" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.839026 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:29:36 crc kubenswrapper[4707]: E0121 15:29:36.839040 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data podName:6bac3bc5-b4ca-4333-9fc6-851430c48708 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:40.839036131 +0000 UTC m=+1678.020552353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data") pod "rabbitmq-server-0" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708") : configmap "rabbitmq-config-data" not found Jan 21 15:29:36 crc kubenswrapper[4707]: I0121 15:29:36.919212 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.006402 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.045541 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-internal-tls-certs\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.045591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef7c730d-de41-4543-a84e-d42a48259f71-etc-machine-id\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.045621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data-custom\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.045969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef7c730d-de41-4543-a84e-d42a48259f71-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.048017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.048046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-public-tls-certs\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.048486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrb7v\" (UniqueName: \"kubernetes.io/projected/ef7c730d-de41-4543-a84e-d42a48259f71-kube-api-access-zrb7v\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.048867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-scripts\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.048971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7c730d-de41-4543-a84e-d42a48259f71-logs\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.049027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-combined-ca-bundle\") pod \"ef7c730d-de41-4543-a84e-d42a48259f71\" (UID: \"ef7c730d-de41-4543-a84e-d42a48259f71\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.049447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7c730d-de41-4543-a84e-d42a48259f71-logs" (OuterVolumeSpecName: "logs") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.050309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle\") pod \"barbican-keystone-listener-6457bf97d6-6nrnm\" (UID: \"0080fa38-5728-4417-968b-156c1a7df0bf\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.051614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle\") pod \"barbican-worker-b89f5977c-485np\" (UID: \"249a8e47-360d-481e-bf1c-9d58ad545f77\") " pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.052667 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.052969 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.053594 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle podName:0080fa38-5728-4417-968b-156c1a7df0bf nodeName:}" failed. No retries permitted until 2026-01-21 15:29:41.052854267 +0000 UTC m=+1678.234370489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle") pod "barbican-keystone-listener-6457bf97d6-6nrnm" (UID: "0080fa38-5728-4417-968b-156c1a7df0bf") : secret "combined-ca-bundle" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.053612 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle podName:249a8e47-360d-481e-bf1c-9d58ad545f77 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:41.053603985 +0000 UTC m=+1678.235120208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle") pod "barbican-worker-b89f5977c-485np" (UID: "249a8e47-360d-481e-bf1c-9d58ad545f77") : secret "combined-ca-bundle" not found Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.054732 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7c730d-de41-4543-a84e-d42a48259f71-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.054745 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef7c730d-de41-4543-a84e-d42a48259f71-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.075900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7c730d-de41-4543-a84e-d42a48259f71-kube-api-access-zrb7v" (OuterVolumeSpecName: "kube-api-access-zrb7v") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "kube-api-access-zrb7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.076004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-scripts" (OuterVolumeSpecName: "scripts") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.076748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.084020 4707 generic.go:334] "Generic (PLEG): container finished" podID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerID="a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.084080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerDied","Data":"a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.086411 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.086479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7" event={"ID":"5b669faa-0600-4ffd-918d-b33788b18eb9","Type":"ContainerDied","Data":"83e855f3f3c81f300cf9ddf93f5bd50e5e0b1e865c7b58edb6ce098881bb182f"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.087523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" event={"ID":"5d4d5044-73fa-4995-8594-0541964e35e8","Type":"ContainerDied","Data":"afca4d3242e761f4e1914710f4e42d3a067b4206e7cb051b53eb5ef1cf98e00f"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.087696 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.090436 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.090426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s" event={"ID":"c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc","Type":"ContainerDied","Data":"717edc04b22288af6646f344f0f3f2b7d5cc903aeec61f4f11538142fe30ce97"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.093050 4707 generic.go:334] "Generic (PLEG): container finished" podID="184ed54c-4ce8-4d12-ae34-255b44708d27" containerID="fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.093088 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.093180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"184ed54c-4ce8-4d12-ae34-255b44708d27","Type":"ContainerDied","Data":"fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.093202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"184ed54c-4ce8-4d12-ae34-255b44708d27","Type":"ContainerDied","Data":"d307e13550eef112188a32aad9f6cbeff277a1d8ec256fe5d4efc8f52d5a4c77"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.093218 4707 scope.go:117] "RemoveContainer" containerID="fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.098833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7d909d33-58f9-4a53-b98a-aa1927df3ea3","Type":"ContainerDied","Data":"a6cdf48eaa31f4e1b11a335b8bbc6d8dd7c28dc053821351f56ade5f300bd172"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.098924 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.101288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" event={"ID":"5ae763d3-de1a-4a94-94bf-41c06bfe239f","Type":"ContainerDied","Data":"f6b22179bacde6d2bfd9e499ee207e8287515c8a31d3daf2a80e1caa021038c9"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.101353 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.106311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.106533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" event={"ID":"48f50e81-5c99-4197-823e-f6967c42a20c","Type":"ContainerStarted","Data":"3bf8f1e6ff03b331032bc9209251ddc3f0f214034d1ef493eb9dcc9ad643d3c1"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.109068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.111320 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerID="338edc6004b39eadad308bfb8bb14c8882a1995789ae4fdf1b5e3424d971b9e4" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.111375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16","Type":"ContainerDied","Data":"338edc6004b39eadad308bfb8bb14c8882a1995789ae4fdf1b5e3424d971b9e4"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.113660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" event={"ID":"fbb7bc95-1acd-4048-89a4-01ea8ae34d34","Type":"ContainerDied","Data":"ff18eac4e43d9db65c8c34b9558763ffa857e042ca6b003d96fb99506cc59ba5"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.113717 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.114131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data" (OuterVolumeSpecName: "config-data") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.119862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef7c730d-de41-4543-a84e-d42a48259f71" (UID: "ef7c730d-de41-4543-a84e-d42a48259f71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.119918 4707 generic.go:334] "Generic (PLEG): container finished" podID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerID="11d18989f69fc04fab3e70e9f5484f5fa9775c1bd917f37a8ee509295c912833" exitCode=1 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.119988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" event={"ID":"e8daa249-22b3-4782-b64b-d9dca84a8777","Type":"ContainerDied","Data":"11d18989f69fc04fab3e70e9f5484f5fa9775c1bd917f37a8ee509295c912833"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.123457 4707 generic.go:334] "Generic (PLEG): container finished" podID="280f96dd-c140-464c-a906-d5428cda5b5c" containerID="690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.123503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"280f96dd-c140-464c-a906-d5428cda5b5c","Type":"ContainerDied","Data":"690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.123526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"280f96dd-c140-464c-a906-d5428cda5b5c","Type":"ContainerDied","Data":"70c83635b682fa19e7974e177dba118ba5dcfb9ff1eba4cb7d4d1724f57f2a9d"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.123572 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.127357 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef7c730d-de41-4543-a84e-d42a48259f71" containerID="51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.127397 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.127436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ef7c730d-de41-4543-a84e-d42a48259f71","Type":"ContainerDied","Data":"51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.127455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ef7c730d-de41-4543-a84e-d42a48259f71","Type":"ContainerDied","Data":"b5018d8edc96d0f356ecd6ea74a027ceea4c811c7650c799ac49932ad9b0fd27"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.129140 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerID="4ee7a2e40272780a6533082a718502f3763d3518cb8a640cca7433f0c0ba28b0" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.129182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" event={"ID":"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c","Type":"ContainerDied","Data":"4ee7a2e40272780a6533082a718502f3763d3518cb8a640cca7433f0c0ba28b0"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.129221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" event={"ID":"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c","Type":"ContainerStarted","Data":"709b1c2333202b4ae32673fb0530e208c401bfad1dbf3db56bf8916bfb22d484"} Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.129226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.129196 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-b89f5977c-485np" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.129241 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-generated\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5sfm\" (UniqueName: \"kubernetes.io/projected/280f96dd-c140-464c-a906-d5428cda5b5c-kube-api-access-m5sfm\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.160771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.161081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts\") pod \"280f96dd-c140-464c-a906-d5428cda5b5c\" (UID: \"280f96dd-c140-464c-a906-d5428cda5b5c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.162941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.163349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.164348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.165854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.166711 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167269 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167321 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167334 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167348 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167356 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/280f96dd-c140-464c-a906-d5428cda5b5c-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167365 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167400 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-public-tls-certs\") on node \"crc\" DevicePath 
\"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167410 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrb7v\" (UniqueName: \"kubernetes.io/projected/ef7c730d-de41-4543-a84e-d42a48259f71-kube-api-access-zrb7v\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167419 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7c730d-de41-4543-a84e-d42a48259f71-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.167427 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/280f96dd-c140-464c-a906-d5428cda5b5c-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.169183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280f96dd-c140-464c-a906-d5428cda5b5c-kube-api-access-m5sfm" (OuterVolumeSpecName: "kube-api-access-m5sfm") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "kube-api-access-m5sfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.191083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.196547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.201922 4707 scope.go:117] "RemoveContainer" containerID="fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.207864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "280f96dd-c140-464c-a906-d5428cda5b5c" (UID: "280f96dd-c140-464c-a906-d5428cda5b5c"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.211906 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121\": container with ID starting with fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121 not found: ID does not exist" containerID="fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.211946 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121"} err="failed to get container status \"fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121\": rpc error: code = NotFound desc = could not find container \"fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121\": container with ID starting with fd613c0f48da6f6492611b32739979124f93b6a6a77edad4fc8e835e86bca121 not found: ID does not exist" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.211968 4707 scope.go:117] "RemoveContainer" containerID="c79dfd4c99dbc23a0bdc487e4cc960e31cd013b21ba5d1830d4fab5e1cc9d84d" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.245414 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2856c284-39e8-469c-b288-422cc5f3205c" path="/var/lib/kubelet/pods/2856c284-39e8-469c-b288-422cc5f3205c/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.245897 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35adac90-7099-431d-b0a3-58ada4999e21" path="/var/lib/kubelet/pods/35adac90-7099-431d-b0a3-58ada4999e21/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.246338 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb37062-6ccc-4f42-9bc2-e553db1e788b" path="/var/lib/kubelet/pods/4fb37062-6ccc-4f42-9bc2-e553db1e788b/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.247550 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700a8d9c-852c-4611-9a98-50418d7df800" path="/var/lib/kubelet/pods/700a8d9c-852c-4611-9a98-50418d7df800/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.248340 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9" path="/var/lib/kubelet/pods/7f7677d5-e9ba-491e-bbe8-6ebb6f8de1c9/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.248789 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be59b54f-310b-4813-8485-69cd6af8424e" path="/var/lib/kubelet/pods/be59b54f-310b-4813-8485-69cd6af8424e/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.249755 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c475086c-e924-418c-bdd8-0b61fe31f950" path="/var/lib/kubelet/pods/c475086c-e924-418c-bdd8-0b61fe31f950/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.250706 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c899fead-a20d-4d1e-ba2f-168c79e4c953" path="/var/lib/kubelet/pods/c899fead-a20d-4d1e-ba2f-168c79e4c953/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.251179 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef17a342-1cc1-4fe8-8900-ef6f41eb4a51" path="/var/lib/kubelet/pods/ef17a342-1cc1-4fe8-8900-ef6f41eb4a51/volumes" Jan 21 
15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.252095 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6edfee9-38b1-44ae-bbf1-ff909ee38bfc" path="/var/lib/kubelet/pods/f6edfee9-38b1-44ae-bbf1-ff909ee38bfc/volumes" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.259634 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.267220 4707 scope.go:117] "RemoveContainer" containerID="44277af5c0f4f6569bed8d7b6aa6359e8dac4fb7085d41d7548def3d3ce9aecb" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.269006 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.269019 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280f96dd-c140-464c-a906-d5428cda5b5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.269028 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5sfm\" (UniqueName: \"kubernetes.io/projected/280f96dd-c140-464c-a906-d5428cda5b5c-kube-api-access-m5sfm\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.269045 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.270365 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.270403 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data podName:d54b9100-265d-49a3-a2ab-72e8787510df nodeName:}" failed. No retries permitted until 2026-01-21 15:29:41.270390301 +0000 UTC m=+1678.451906523 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data") pod "rabbitmq-cell1-server-0" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.305651 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.313244 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-85d1-account-create-update-dpm4s"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.325940 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.330583 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.335546 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.344632 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.348154 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-11a0-account-create-update-qjfm5"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.354596 4707 scope.go:117] "RemoveContainer" containerID="690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.357626 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.361768 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-62dd-account-create-update-v8v9d"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.371336 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.473126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lsln\" (UniqueName: \"kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.473463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts\") pod \"keystone-b6c7-account-create-update-4r7db\" (UID: \"901b197a-7bb5-4257-983b-560632e8987f\") " pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.474039 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.474540 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts podName:901b197a-7bb5-4257-983b-560632e8987f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:39.47422654 +0000 UTC m=+1676.655742762 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts") pod "keystone-b6c7-account-create-update-4r7db" (UID: "901b197a-7bb5-4257-983b-560632e8987f") : configmap "openstack-scripts" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.475646 4707 projected.go:194] Error preparing data for projected volume kube-api-access-2lsln for pod openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.475684 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln podName:901b197a-7bb5-4257-983b-560632e8987f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:39.475673871 +0000 UTC m=+1676.657190093 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2lsln" (UniqueName: "kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln") pod "keystone-b6c7-account-create-update-4r7db" (UID: "901b197a-7bb5-4257-983b-560632e8987f") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.576022 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.576347 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts podName:8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:39.576332017 +0000 UTC m=+1676.757848239 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts") pod "glance-d5d6-account-create-update-zcbtm" (UID: "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8") : configmap "openstack-scripts" not found Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.632642 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.640739 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.656284 4707 scope.go:117] "RemoveContainer" containerID="5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.679573 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.679657 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts podName:e8daa249-22b3-4782-b64b-d9dca84a8777 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:29:39.679642238 +0000 UTC m=+1676.861158461 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts") pod "root-account-create-update-jzkq7" (UID: "e8daa249-22b3-4782-b64b-d9dca84a8777") : configmap "openstack-scripts" not found Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.719465 4707 scope.go:117] "RemoveContainer" containerID="690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.721794 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533\": container with ID starting with 690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533 not found: ID does not exist" containerID="690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.721875 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533"} err="failed to get container status \"690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533\": rpc error: code = NotFound desc = could not find container \"690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533\": container with ID starting with 690ea76c1a6c175ba8ba03cb774e821019694d3306bb385cd818807bec8d0533 not found: ID does not exist" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.721901 4707 scope.go:117] "RemoveContainer" containerID="5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.722620 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150\": container with ID starting with 5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150 not found: ID does not exist" containerID="5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.722641 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150"} err="failed to get container status \"5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150\": rpc error: code = NotFound desc = could not find container \"5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150\": container with ID starting with 5a28a21ad2369b11757592bf2cfa2e0fcf60cfbf19884e7346cc3a3518b10150 not found: ID does not exist" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.722654 4707 scope.go:117] "RemoveContainer" containerID="51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.743125 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.752203 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.753542 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.769740 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.778483 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ssx6\" (UniqueName: \"kubernetes.io/projected/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kube-api-access-5ssx6\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-galera-tls-certs\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kolla-config\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-generated\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f50e81-5c99-4197-823e-f6967c42a20c-operator-scripts\") pod \"48f50e81-5c99-4197-823e-f6967c42a20c\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-combined-ca-bundle\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-operator-scripts\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-default\") pod \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\" (UID: \"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.780503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8whkz\" (UniqueName: \"kubernetes.io/projected/48f50e81-5c99-4197-823e-f6967c42a20c-kube-api-access-8whkz\") pod \"48f50e81-5c99-4197-823e-f6967c42a20c\" (UID: \"48f50e81-5c99-4197-823e-f6967c42a20c\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.784513 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.786829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f50e81-5c99-4197-823e-f6967c42a20c-kube-api-access-8whkz" (OuterVolumeSpecName: "kube-api-access-8whkz") pod "48f50e81-5c99-4197-823e-f6967c42a20c" (UID: "48f50e81-5c99-4197-823e-f6967c42a20c"). InnerVolumeSpecName "kube-api-access-8whkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.787928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.788333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f50e81-5c99-4197-823e-f6967c42a20c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48f50e81-5c99-4197-823e-f6967c42a20c" (UID: "48f50e81-5c99-4197-823e-f6967c42a20c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.788636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.788750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.789466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.789775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kube-api-access-5ssx6" (OuterVolumeSpecName: "kube-api-access-5ssx6") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "kube-api-access-5ssx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.798935 4707 scope.go:117] "RemoveContainer" containerID="8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.799047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-b89f5977c-485np"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.804298 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-b89f5977c-485np"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.811773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "mysql-db") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.826957 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.851357 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-7408-account-create-update-wrhz7"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.856541 4707 scope.go:117] "RemoveContainer" containerID="51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.856986 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f\": container with ID starting with 51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f not found: ID does not exist" containerID="51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.857025 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f"} err="failed to get container status \"51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f\": rpc error: code = NotFound desc = could not find container \"51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f\": container with ID starting with 51aab980356c72435dfeaf96906c2985bf24e43414ae5f3286d345079191e16f not found: ID does not exist" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.857046 4707 scope.go:117] "RemoveContainer" containerID="8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f" Jan 21 15:29:37 crc kubenswrapper[4707]: E0121 15:29:37.857342 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f\": container with ID starting 
with 8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f not found: ID does not exist" containerID="8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.857366 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f"} err="failed to get container status \"8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f\": rpc error: code = NotFound desc = could not find container \"8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f\": container with ID starting with 8b9cadb45f2a886f43e9e68086124591a7b5acc42fc3e1ce369815f25670db4f not found: ID does not exist" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.881913 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts\") pod \"e8daa249-22b3-4782-b64b-d9dca84a8777\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.882250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkp5k\" (UniqueName: \"kubernetes.io/projected/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-kube-api-access-nkp5k\") pod \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.882269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts\") pod \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\" (UID: \"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.882323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6zg6\" (UniqueName: \"kubernetes.io/projected/e8daa249-22b3-4782-b64b-d9dca84a8777-kube-api-access-v6zg6\") pod \"e8daa249-22b3-4782-b64b-d9dca84a8777\" (UID: \"e8daa249-22b3-4782-b64b-d9dca84a8777\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883036 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f50e81-5c99-4197-823e-f6967c42a20c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883056 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883066 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a8e47-360d-481e-bf1c-9d58ad545f77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883087 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883096 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-default\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883106 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8whkz\" (UniqueName: \"kubernetes.io/projected/48f50e81-5c99-4197-823e-f6967c42a20c-kube-api-access-8whkz\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883118 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ssx6\" (UniqueName: \"kubernetes.io/projected/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kube-api-access-5ssx6\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883127 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.883134 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.891516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8" (UID: "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.893716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8daa249-22b3-4782-b64b-d9dca84a8777" (UID: "e8daa249-22b3-4782-b64b-d9dca84a8777"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.894556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.899879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8daa249-22b3-4782-b64b-d9dca84a8777-kube-api-access-v6zg6" (OuterVolumeSpecName: "kube-api-access-v6zg6") pod "e8daa249-22b3-4782-b64b-d9dca84a8777" (UID: "e8daa249-22b3-4782-b64b-d9dca84a8777"). InnerVolumeSpecName "kube-api-access-v6zg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.910157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-kube-api-access-nkp5k" (OuterVolumeSpecName: "kube-api-access-nkp5k") pod "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8" (UID: "8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8"). InnerVolumeSpecName "kube-api-access-nkp5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.915951 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.916091 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_8daec53c-1342-4ca3-aa8b-a9d50bcf44b5/ovn-northd/0.log" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.916192 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.921586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" (UID: "d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.927202 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6457bf97d6-6nrnm"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.939355 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.941193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.949891 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.967330 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.971080 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-9d5a-account-create-update-cdskv"] Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.983680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-scripts\") pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.983723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-northd-tls-certs\") pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.983835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-rundir\") pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.983866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jftpv\" (UniqueName: \"kubernetes.io/projected/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-kube-api-access-jftpv\") 
pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.983895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-config\") pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.983925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-combined-ca-bundle\") pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-metrics-certs-tls-certs\") pod \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\" (UID: \"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5\") " Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-config" (OuterVolumeSpecName: "config") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-scripts" (OuterVolumeSpecName: "scripts") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984690 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984711 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984721 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8daa249-22b3-4782-b64b-d9dca84a8777-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984730 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984739 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984747 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984756 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984764 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984772 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkp5k\" (UniqueName: \"kubernetes.io/projected/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8-kube-api-access-nkp5k\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.984780 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6zg6\" (UniqueName: \"kubernetes.io/projected/e8daa249-22b3-4782-b64b-d9dca84a8777-kube-api-access-v6zg6\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:37 crc kubenswrapper[4707]: I0121 15:29:37.989643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-kube-api-access-jftpv" (OuterVolumeSpecName: "kube-api-access-jftpv") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). InnerVolumeSpecName "kube-api-access-jftpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.001959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.034081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.036841 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.50:5671: connect: connection refused" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.037891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" (UID: "8daec53c-1342-4ca3-aa8b-a9d50bcf44b5"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.086223 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jftpv\" (UniqueName: \"kubernetes.io/projected/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-kube-api-access-jftpv\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.086256 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.086266 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.086291 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080fa38-5728-4417-968b-156c1a7df0bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.086300 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.144220 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.144192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh" event={"ID":"48f50e81-5c99-4197-823e-f6967c42a20c","Type":"ContainerDied","Data":"3bf8f1e6ff03b331032bc9209251ddc3f0f214034d1ef493eb9dcc9ad643d3c1"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.149688 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_8daec53c-1342-4ca3-aa8b-a9d50bcf44b5/ovn-northd/0.log" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.149752 4707 generic.go:334] "Generic (PLEG): container finished" podID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" exitCode=139 Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.149850 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.150106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5","Type":"ContainerDied","Data":"a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.150152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8daec53c-1342-4ca3-aa8b-a9d50bcf44b5","Type":"ContainerDied","Data":"0d93ca850483dce3ccacf5567e498fbfa039f1a818b38dcb8d4c66cd4a367682"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.150181 4707 scope.go:117] "RemoveContainer" containerID="2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.160475 4707 generic.go:334] "Generic (PLEG): container finished" podID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerID="f89ab992dfc657152c4b63f182969f628ca255996d2039d51a99b906f27be1d7" exitCode=0 Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.160516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" event={"ID":"faf162ab-5917-4ddc-9b89-573d2e3bc7e6","Type":"ContainerDied","Data":"f89ab992dfc657152c4b63f182969f628ca255996d2039d51a99b906f27be1d7"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.160531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" event={"ID":"faf162ab-5917-4ddc-9b89-573d2e3bc7e6","Type":"ContainerDied","Data":"b013e244262fdaa69d3b4ac19690becf57cae587ef55290f611f4f84e672f7bb"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.160542 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b013e244262fdaa69d3b4ac19690becf57cae587ef55290f611f4f84e672f7bb" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.162570 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4c27903-58da-47dc-97d0-b1328c58303d" containerID="858a7831d2593cf2e3d08f763a792322ce65f0af93f68b5ca794de32f64a614f" exitCode=0 Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.162603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" 
event={"ID":"d4c27903-58da-47dc-97d0-b1328c58303d","Type":"ContainerDied","Data":"858a7831d2593cf2e3d08f763a792322ce65f0af93f68b5ca794de32f64a614f"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.163914 4707 generic.go:334] "Generic (PLEG): container finished" podID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerID="dac905514aca127784ff10619aa0102a1724955c3593a17e7fa6fb6888775848" exitCode=0 Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.163946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"09b00a3d-6e52-458e-8fb3-6c48d658ff86","Type":"ContainerDied","Data":"dac905514aca127784ff10619aa0102a1724955c3593a17e7fa6fb6888775848"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.169896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" event={"ID":"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c","Type":"ContainerStarted","Data":"291ac9488458f15e17db78c4845ab63022b0bc1c6ee640c51ad561be142062ce"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.170271 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.171570 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a287941-c7ad-4671-904b-03172bc803e8" containerID="f81c42fb84cf6dc6db11797efd104f4de9e6929e765ab0ffe83b4f838fac8280" exitCode=0 Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.171632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" event={"ID":"8a287941-c7ad-4671-904b-03172bc803e8","Type":"ContainerDied","Data":"f81c42fb84cf6dc6db11797efd104f4de9e6929e765ab0ffe83b4f838fac8280"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.173029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" event={"ID":"e8daa249-22b3-4782-b64b-d9dca84a8777","Type":"ContainerDied","Data":"2840ce53faf0cf4ee686293752433c05461b4ca44d152320dfd77e3cd22f20e5"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.173088 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jzkq7" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.192025 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.192376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16","Type":"ContainerDied","Data":"5c135dfe8016b4dc350f6e5dec8b85a7607e309dfc08a6771fa518f3c77f0797"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.204693 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.205478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" event={"ID":"8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8","Type":"ContainerDied","Data":"89002195f52b9c360b417246f7df9ce298eaecfea9be208be4a2cc5928cc3cef"} Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.205570 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.220174 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.245611 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" podStartSLOduration=4.245601213 podStartE2EDuration="4.245601213s" podCreationTimestamp="2026-01-21 15:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:38.192140829 +0000 UTC m=+1675.373657051" watchObservedRunningTime="2026-01-21 15:29:38.245601213 +0000 UTC m=+1675.427117435" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.246234 4707 scope.go:117] "RemoveContainer" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.251848 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.51:5671: connect: connection refused" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.269741 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.277582 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-d5d6-account-create-update-zcbtm"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.281717 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.290421 4707 scope.go:117] "RemoveContainer" containerID="2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.290791 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:29:38 crc kubenswrapper[4707]: E0121 15:29:38.293946 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c\": container with ID starting with 2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c not found: ID does not exist" containerID="2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.293984 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c"} err="failed to get container status \"2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c\": rpc error: code = NotFound desc = could not find container \"2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c\": container with ID starting with 2a430b5cd3cf8cd04ef5ff4805ec6a430df168a2c5cca743b42edb2b6879017c not found: ID does not exist" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.294007 4707 scope.go:117] "RemoveContainer" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" Jan 21 15:29:38 crc kubenswrapper[4707]: 
I0121 15:29:38.296385 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.304064 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:29:38 crc kubenswrapper[4707]: E0121 15:29:38.307036 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7\": container with ID starting with a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7 not found: ID does not exist" containerID="a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.307076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7"} err="failed to get container status \"a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7\": rpc error: code = NotFound desc = could not find container \"a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7\": container with ID starting with a8ed4d53147054b63aa311864bb72f70abd446f8bc24012f710f1a0016cbb9f7 not found: ID does not exist" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.307101 4707 scope.go:117] "RemoveContainer" containerID="11d18989f69fc04fab3e70e9f5484f5fa9775c1bd917f37a8ee509295c912833" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.324892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.329781 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.336697 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b6c7-account-create-update-4r7db"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.344637 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzkq7"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.351432 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jzkq7"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.361141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.365630 4707 scope.go:117] "RemoveContainer" containerID="338edc6004b39eadad308bfb8bb14c8882a1995789ae4fdf1b5e3424d971b9e4" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.368821 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-80bd-account-create-update-4cxdh"] Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-internal-tls-certs\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-scripts\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnjn6\" (UniqueName: \"kubernetes.io/projected/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-kube-api-access-fnjn6\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-config-data\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-public-tls-certs\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-combined-ca-bundle\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.396759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-logs\") pod \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\" (UID: \"faf162ab-5917-4ddc-9b89-573d2e3bc7e6\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.397431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-logs" (OuterVolumeSpecName: "logs") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.400418 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-scripts" (OuterVolumeSpecName: "scripts") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.403535 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-kube-api-access-fnjn6" (OuterVolumeSpecName: "kube-api-access-fnjn6") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "kube-api-access-fnjn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.460736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.463295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-config-data" (OuterVolumeSpecName: "config-data") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.479783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.500539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-combined-ca-bundle\") pod \"8a287941-c7ad-4671-904b-03172bc803e8\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.500605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data\") pod \"8a287941-c7ad-4671-904b-03172bc803e8\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.500624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data-custom\") pod \"8a287941-c7ad-4671-904b-03172bc803e8\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.500704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a287941-c7ad-4671-904b-03172bc803e8-logs\") pod \"8a287941-c7ad-4671-904b-03172bc803e8\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.500730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4jzx\" (UniqueName: \"kubernetes.io/projected/8a287941-c7ad-4671-904b-03172bc803e8-kube-api-access-l4jzx\") pod \"8a287941-c7ad-4671-904b-03172bc803e8\" (UID: \"8a287941-c7ad-4671-904b-03172bc803e8\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501215 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501227 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501234 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnjn6\" (UniqueName: \"kubernetes.io/projected/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-kube-api-access-fnjn6\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501252 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501260 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501267 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lsln\" 
(UniqueName: \"kubernetes.io/projected/901b197a-7bb5-4257-983b-560632e8987f-kube-api-access-2lsln\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.501275 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/901b197a-7bb5-4257-983b-560632e8987f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.504198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a287941-c7ad-4671-904b-03172bc803e8-logs" (OuterVolumeSpecName: "logs") pod "8a287941-c7ad-4671-904b-03172bc803e8" (UID: "8a287941-c7ad-4671-904b-03172bc803e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.504929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "faf162ab-5917-4ddc-9b89-573d2e3bc7e6" (UID: "faf162ab-5917-4ddc-9b89-573d2e3bc7e6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.507336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a287941-c7ad-4671-904b-03172bc803e8" (UID: "8a287941-c7ad-4671-904b-03172bc803e8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.510957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a287941-c7ad-4671-904b-03172bc803e8-kube-api-access-l4jzx" (OuterVolumeSpecName: "kube-api-access-l4jzx") pod "8a287941-c7ad-4671-904b-03172bc803e8" (UID: "8a287941-c7ad-4671-904b-03172bc803e8"). InnerVolumeSpecName "kube-api-access-l4jzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.511405 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.103:9311/healthcheck\": read tcp 10.217.0.2:50898->10.217.1.103:9311: read: connection reset by peer" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.511376 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.103:9311/healthcheck\": read tcp 10.217.0.2:50896->10.217.1.103:9311: read: connection reset by peer" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.535093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a287941-c7ad-4671-904b-03172bc803e8" (UID: "8a287941-c7ad-4671-904b-03172bc803e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.540468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data" (OuterVolumeSpecName: "config-data") pod "8a287941-c7ad-4671-904b-03172bc803e8" (UID: "8a287941-c7ad-4671-904b-03172bc803e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.603424 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf162ab-5917-4ddc-9b89-573d2e3bc7e6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.603550 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.603606 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.603656 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a287941-c7ad-4671-904b-03172bc803e8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.603706 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a287941-c7ad-4671-904b-03172bc803e8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.603772 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4jzx\" (UniqueName: \"kubernetes.io/projected/8a287941-c7ad-4671-904b-03172bc803e8-kube-api-access-l4jzx\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.705987 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.725977 4707 scope.go:117] "RemoveContainer" containerID="c26a6d88ea140a2e846151b29e903b6447b9113d6b93c6dacb375b5ce715545b" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.731776 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.737534 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.741926 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.809698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-combined-ca-bundle\") pod \"d4c27903-58da-47dc-97d0-b1328c58303d\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.809738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data\") pod \"d4c27903-58da-47dc-97d0-b1328c58303d\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.809761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v862r\" (UniqueName: \"kubernetes.io/projected/d4c27903-58da-47dc-97d0-b1328c58303d-kube-api-access-v862r\") pod \"d4c27903-58da-47dc-97d0-b1328c58303d\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.809885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c27903-58da-47dc-97d0-b1328c58303d-logs\") pod \"d4c27903-58da-47dc-97d0-b1328c58303d\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.809950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data-custom\") pod \"d4c27903-58da-47dc-97d0-b1328c58303d\" (UID: \"d4c27903-58da-47dc-97d0-b1328c58303d\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.810461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4c27903-58da-47dc-97d0-b1328c58303d-logs" (OuterVolumeSpecName: "logs") pod "d4c27903-58da-47dc-97d0-b1328c58303d" (UID: "d4c27903-58da-47dc-97d0-b1328c58303d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.812673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c27903-58da-47dc-97d0-b1328c58303d-kube-api-access-v862r" (OuterVolumeSpecName: "kube-api-access-v862r") pod "d4c27903-58da-47dc-97d0-b1328c58303d" (UID: "d4c27903-58da-47dc-97d0-b1328c58303d"). InnerVolumeSpecName "kube-api-access-v862r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.812880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4c27903-58da-47dc-97d0-b1328c58303d" (UID: "d4c27903-58da-47dc-97d0-b1328c58303d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.822917 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.826965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4c27903-58da-47dc-97d0-b1328c58303d" (UID: "d4c27903-58da-47dc-97d0-b1328c58303d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.862340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data" (OuterVolumeSpecName: "config-data") pod "d4c27903-58da-47dc-97d0-b1328c58303d" (UID: "d4c27903-58da-47dc-97d0-b1328c58303d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-httpd-run\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-logs\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-combined-ca-bundle\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd658\" (UniqueName: \"kubernetes.io/projected/33a83157-f122-493a-acd7-ba671cfce233-kube-api-access-zd658\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-config-data\") pod \"ccc6250e-7692-4325-896f-33f84c241afb\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911407 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-combined-ca-bundle\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-svsj9\" (UniqueName: \"kubernetes.io/projected/09b00a3d-6e52-458e-8fb3-6c48d658ff86-kube-api-access-svsj9\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-internal-tls-certs\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data-custom\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-public-tls-certs\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-nova-metadata-tls-certs\") pod \"ccc6250e-7692-4325-896f-33f84c241afb\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-public-tls-certs\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911541 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-config-data\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74qjx\" (UniqueName: \"kubernetes.io/projected/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-kube-api-access-74qjx\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-combined-ca-bundle\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-internal-tls-certs\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccc6250e-7692-4325-896f-33f84c241afb-logs\") pod \"ccc6250e-7692-4325-896f-33f84c241afb\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-logs\") pod \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\" (UID: \"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hccm4\" (UniqueName: \"kubernetes.io/projected/ccc6250e-7692-4325-896f-33f84c241afb-kube-api-access-hccm4\") pod \"ccc6250e-7692-4325-896f-33f84c241afb\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-combined-ca-bundle\") pod \"ccc6250e-7692-4325-896f-33f84c241afb\" (UID: \"ccc6250e-7692-4325-896f-33f84c241afb\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-scripts\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-config-data\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-httpd-run\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-logs\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc 
kubenswrapper[4707]: I0121 15:29:38.911904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\" (UID: \"09b00a3d-6e52-458e-8fb3-6c48d658ff86\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.911919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-scripts\") pod \"33a83157-f122-493a-acd7-ba671cfce233\" (UID: \"33a83157-f122-493a-acd7-ba671cfce233\") " Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.912254 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.912266 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c27903-58da-47dc-97d0-b1328c58303d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.912274 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.912283 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.912291 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c27903-58da-47dc-97d0-b1328c58303d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.912298 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v862r\" (UniqueName: \"kubernetes.io/projected/d4c27903-58da-47dc-97d0-b1328c58303d-kube-api-access-v862r\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.913484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.913540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-logs" (OuterVolumeSpecName: "logs") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.913645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc6250e-7692-4325-896f-33f84c241afb-logs" (OuterVolumeSpecName: "logs") pod "ccc6250e-7692-4325-896f-33f84c241afb" (UID: "ccc6250e-7692-4325-896f-33f84c241afb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.914146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-logs" (OuterVolumeSpecName: "logs") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.914290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.916003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.916177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a83157-f122-493a-acd7-ba671cfce233-kube-api-access-zd658" (OuterVolumeSpecName: "kube-api-access-zd658") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "kube-api-access-zd658". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.916361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-scripts" (OuterVolumeSpecName: "scripts") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.916631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-logs" (OuterVolumeSpecName: "logs") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.919920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-scripts" (OuterVolumeSpecName: "scripts") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.931937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.947839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.948938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.950977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-kube-api-access-74qjx" (OuterVolumeSpecName: "kube-api-access-74qjx") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "kube-api-access-74qjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.951282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc6250e-7692-4325-896f-33f84c241afb-kube-api-access-hccm4" (OuterVolumeSpecName: "kube-api-access-hccm4") pod "ccc6250e-7692-4325-896f-33f84c241afb" (UID: "ccc6250e-7692-4325-896f-33f84c241afb"). InnerVolumeSpecName "kube-api-access-hccm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.951392 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccc6250e-7692-4325-896f-33f84c241afb" (UID: "ccc6250e-7692-4325-896f-33f84c241afb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.951899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b00a3d-6e52-458e-8fb3-6c48d658ff86-kube-api-access-svsj9" (OuterVolumeSpecName: "kube-api-access-svsj9") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "kube-api-access-svsj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.981215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.982283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.982503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ccc6250e-7692-4325-896f-33f84c241afb" (UID: "ccc6250e-7692-4325-896f-33f84c241afb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.983328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.983995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4707]: I0121 15:29:38.985201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.000132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data" (OuterVolumeSpecName: "config-data") pod "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" (UID: "62c3da3c-d7d6-4839-94a7-55b3d07bf5ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.000373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-config-data" (OuterVolumeSpecName: "config-data") pod "33a83157-f122-493a-acd7-ba671cfce233" (UID: "33a83157-f122-493a-acd7-ba671cfce233"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.006224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-config-data" (OuterVolumeSpecName: "config-data") pod "ccc6250e-7692-4325-896f-33f84c241afb" (UID: "ccc6250e-7692-4325-896f-33f84c241afb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.010942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-config-data" (OuterVolumeSpecName: "config-data") pod "09b00a3d-6e52-458e-8fb3-6c48d658ff86" (UID: "09b00a3d-6e52-458e-8fb3-6c48d658ff86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013832 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013860 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd658\" (UniqueName: \"kubernetes.io/projected/33a83157-f122-493a-acd7-ba671cfce233-kube-api-access-zd658\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013871 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013879 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013889 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svsj9\" (UniqueName: \"kubernetes.io/projected/09b00a3d-6e52-458e-8fb3-6c48d658ff86-kube-api-access-svsj9\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013896 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013904 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013911 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013918 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013926 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013934 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013941 4707 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-74qjx\" (UniqueName: \"kubernetes.io/projected/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-kube-api-access-74qjx\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013975 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013985 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.013995 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014002 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccc6250e-7692-4325-896f-33f84c241afb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014009 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014017 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hccm4\" (UniqueName: \"kubernetes.io/projected/ccc6250e-7692-4325-896f-33f84c241afb-kube-api-access-hccm4\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014025 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6250e-7692-4325-896f-33f84c241afb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014032 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014039 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b00a3d-6e52-458e-8fb3-6c48d658ff86-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014046 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014053 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09b00a3d-6e52-458e-8fb3-6c48d658ff86-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014064 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014072 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a83157-f122-493a-acd7-ba671cfce233-scripts\") on node \"crc\" DevicePath 
\"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014079 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33a83157-f122-493a-acd7-ba671cfce233-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.014086 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.025476 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.027736 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.115663 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.115688 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.158778 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.190845 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0080fa38-5728-4417-968b-156c1a7df0bf" path="/var/lib/kubelet/pods/0080fa38-5728-4417-968b-156c1a7df0bf/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.191378 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184ed54c-4ce8-4d12-ae34-255b44708d27" path="/var/lib/kubelet/pods/184ed54c-4ce8-4d12-ae34-255b44708d27/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.192054 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249a8e47-360d-481e-bf1c-9d58ad545f77" path="/var/lib/kubelet/pods/249a8e47-360d-481e-bf1c-9d58ad545f77/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.192568 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" path="/var/lib/kubelet/pods/280f96dd-c140-464c-a906-d5428cda5b5c/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.193478 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f50e81-5c99-4197-823e-f6967c42a20c" path="/var/lib/kubelet/pods/48f50e81-5c99-4197-823e-f6967c42a20c/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.193774 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae763d3-de1a-4a94-94bf-41c06bfe239f" path="/var/lib/kubelet/pods/5ae763d3-de1a-4a94-94bf-41c06bfe239f/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.194091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b669faa-0600-4ffd-918d-b33788b18eb9" path="/var/lib/kubelet/pods/5b669faa-0600-4ffd-918d-b33788b18eb9/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.194396 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5d4d5044-73fa-4995-8594-0541964e35e8" path="/var/lib/kubelet/pods/5d4d5044-73fa-4995-8594-0541964e35e8/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.194688 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d909d33-58f9-4a53-b98a-aa1927df3ea3" path="/var/lib/kubelet/pods/7d909d33-58f9-4a53-b98a-aa1927df3ea3/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.198964 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8" path="/var/lib/kubelet/pods/8cb01f6b-d9cb-4e96-a6a7-9f3206da50f8/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.199398 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" path="/var/lib/kubelet/pods/8daec53c-1342-4ca3-aa8b-a9d50bcf44b5/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.201431 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901b197a-7bb5-4257-983b-560632e8987f" path="/var/lib/kubelet/pods/901b197a-7bb5-4257-983b-560632e8987f/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.201697 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc" path="/var/lib/kubelet/pods/c2766ddc-e7f3-4c82-b9a4-f7df41a6cacc/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.202719 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" path="/var/lib/kubelet/pods/d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.203237 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" path="/var/lib/kubelet/pods/e8daa249-22b3-4782-b64b-d9dca84a8777/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.203718 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" path="/var/lib/kubelet/pods/ef7c730d-de41-4543-a84e-d42a48259f71/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.204638 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbb7bc95-1acd-4048-89a4-01ea8ae34d34" path="/var/lib/kubelet/pods/fbb7bc95-1acd-4048-89a4-01ea8ae34d34/volumes" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.224734 4707 generic.go:334] "Generic (PLEG): container finished" podID="ccc6250e-7692-4325-896f-33f84c241afb" containerID="1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b" exitCode=0 Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.224770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ccc6250e-7692-4325-896f-33f84c241afb","Type":"ContainerDied","Data":"1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.224854 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.225082 4707 scope.go:117] "RemoveContainer" containerID="1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.225071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ccc6250e-7692-4325-896f-33f84c241afb","Type":"ContainerDied","Data":"b57c33ea34d91d71336a11e04971f169963d1c8464483ff59c6605bae24dc745"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.226843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" event={"ID":"d4c27903-58da-47dc-97d0-b1328c58303d","Type":"ContainerDied","Data":"3adbc289fd06dd5710f77b95284b3975c02718651b316f0da0678395de0803c2"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.226855 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.228682 4707 generic.go:334] "Generic (PLEG): container finished" podID="4638ccbe-c266-4bff-8771-67eb02be0a6a" containerID="aeed4f7ea398c8aec570dded3f936a9753106f96a565795b7770c4040b2cca32" exitCode=0 Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.228725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" event={"ID":"4638ccbe-c266-4bff-8771-67eb02be0a6a","Type":"ContainerDied","Data":"aeed4f7ea398c8aec570dded3f936a9753106f96a565795b7770c4040b2cca32"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.230100 4707 generic.go:334] "Generic (PLEG): container finished" podID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerID="8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb" exitCode=0 Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.230144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" event={"ID":"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce","Type":"ContainerDied","Data":"8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.230173 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.230174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2" event={"ID":"62c3da3c-d7d6-4839-94a7-55b3d07bf5ce","Type":"ContainerDied","Data":"3c361f6701b07397fac8443eefa35d147f466ea8c4940adb2baa943dd886807a"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.231777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" event={"ID":"8a287941-c7ad-4671-904b-03172bc803e8","Type":"ContainerDied","Data":"511a8eb1b7daae3f71a300332cef4579f5623d1cbe859d8ebe91b9de425890c3"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.231899 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b555f485-zww44" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.244041 4707 generic.go:334] "Generic (PLEG): container finished" podID="efb1c248-d3f2-4038-8f84-01a4de59d39e" containerID="056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7" exitCode=0 Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.244110 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.244146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"efb1c248-d3f2-4038-8f84-01a4de59d39e","Type":"ContainerDied","Data":"056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.244228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"efb1c248-d3f2-4038-8f84-01a4de59d39e","Type":"ContainerDied","Data":"b1f69fa4770e07211e077fa247a73d82e466f808d4f1c9ed4a53bfe3a403d016"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.246837 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.246839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"09b00a3d-6e52-458e-8fb3-6c48d658ff86","Type":"ContainerDied","Data":"9f6933ab473e2c8d0a445426066fe44bdcd81dfe5af76f5a868ef5c37f97527a"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.249650 4707 generic.go:334] "Generic (PLEG): container finished" podID="33a83157-f122-493a-acd7-ba671cfce233" containerID="a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233" exitCode=0 Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.250515 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.252124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"33a83157-f122-493a-acd7-ba671cfce233","Type":"ContainerDied","Data":"a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.252148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"33a83157-f122-493a-acd7-ba671cfce233","Type":"ContainerDied","Data":"9ccc2ad60f67c465be08cd21b0bdc4939b90da2d9c2ce4491b84bb1d1d8e5901"} Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.252207 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5977d44c4b-9wznx" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.265740 4707 scope.go:117] "RemoveContainer" containerID="3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.294767 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.301520 4707 scope.go:117] "RemoveContainer" containerID="1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.302154 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b\": container with ID starting with 1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b not found: ID does not exist" containerID="1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.302196 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b"} err="failed to get container status \"1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b\": rpc error: code = NotFound desc = could not find container \"1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b\": container with ID starting with 1945f1f8b0e92e356a083611179b225b9fb09aa0ba4bbcd7b371b6390155413b not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.302214 4707 scope.go:117] "RemoveContainer" containerID="3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.302690 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221\": container with ID starting with 3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221 not found: ID does not exist" containerID="3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.302732 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221"} err="failed to get container status \"3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221\": rpc error: code = NotFound desc = could not find container \"3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221\": container with ID starting with 3c5082c8f267c73d97b1859c661b0ef48a1978aff1024893f071380f39d9c221 not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.302755 4707 scope.go:117] "RemoveContainer" containerID="858a7831d2593cf2e3d08f763a792322ce65f0af93f68b5ca794de32f64a614f" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.303866 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.308297 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.315519 4707 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/barbican-api-5d4684d9b4-fdrm2"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.316871 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.319948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-combined-ca-bundle\") pod \"efb1c248-d3f2-4038-8f84-01a4de59d39e\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.320069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-config-data\") pod \"efb1c248-d3f2-4038-8f84-01a4de59d39e\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.320435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h2zd\" (UniqueName: \"kubernetes.io/projected/efb1c248-d3f2-4038-8f84-01a4de59d39e-kube-api-access-9h2zd\") pod \"efb1c248-d3f2-4038-8f84-01a4de59d39e\" (UID: \"efb1c248-d3f2-4038-8f84-01a4de59d39e\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.324769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb1c248-d3f2-4038-8f84-01a4de59d39e-kube-api-access-9h2zd" (OuterVolumeSpecName: "kube-api-access-9h2zd") pod "efb1c248-d3f2-4038-8f84-01a4de59d39e" (UID: "efb1c248-d3f2-4038-8f84-01a4de59d39e"). InnerVolumeSpecName "kube-api-access-9h2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.327773 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.333891 4707 scope.go:117] "RemoveContainer" containerID="b4573b28ecd3cce9248e8a085493d4d875d269b88b74b45cf9bfa26ecf1c1f4a" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.334434 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-d5d8c859b-2fbnc"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.338686 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b555f485-zww44"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.340096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb1c248-d3f2-4038-8f84-01a4de59d39e" (UID: "efb1c248-d3f2-4038-8f84-01a4de59d39e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.345576 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b555f485-zww44"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.347422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-config-data" (OuterVolumeSpecName: "config-data") pod "efb1c248-d3f2-4038-8f84-01a4de59d39e" (UID: "efb1c248-d3f2-4038-8f84-01a4de59d39e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.347922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.352597 4707 scope.go:117] "RemoveContainer" containerID="8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.353691 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.358739 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5977d44c4b-9wznx"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.363593 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5977d44c4b-9wznx"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.369168 4707 scope.go:117] "RemoveContainer" containerID="9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.369250 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.372781 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-fernet-keys\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-internal-tls-certs\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-public-tls-certs\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-combined-ca-bundle\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-credential-keys\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-config-data\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " 
Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2p5d\" (UniqueName: \"kubernetes.io/projected/4638ccbe-c266-4bff-8771-67eb02be0a6a-kube-api-access-w2p5d\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.422353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-scripts\") pod \"4638ccbe-c266-4bff-8771-67eb02be0a6a\" (UID: \"4638ccbe-c266-4bff-8771-67eb02be0a6a\") " Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.424235 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h2zd\" (UniqueName: \"kubernetes.io/projected/efb1c248-d3f2-4038-8f84-01a4de59d39e-kube-api-access-9h2zd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.424252 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.424277 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb1c248-d3f2-4038-8f84-01a4de59d39e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.425372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.425464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.425847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-scripts" (OuterVolumeSpecName: "scripts") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.428462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4638ccbe-c266-4bff-8771-67eb02be0a6a-kube-api-access-w2p5d" (OuterVolumeSpecName: "kube-api-access-w2p5d") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "kube-api-access-w2p5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.439700 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.441259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-config-data" (OuterVolumeSpecName: "config-data") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.449427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.451197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4638ccbe-c266-4bff-8771-67eb02be0a6a" (UID: "4638ccbe-c266-4bff-8771-67eb02be0a6a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.497729 4707 scope.go:117] "RemoveContainer" containerID="8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.498081 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb\": container with ID starting with 8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb not found: ID does not exist" containerID="8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.498111 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb"} err="failed to get container status \"8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb\": rpc error: code = NotFound desc = could not find container \"8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb\": container with ID starting with 8dcc3be62054f7363d50d006d05aeee93007b007079415c043a94616e1bbbbbb not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.498131 4707 scope.go:117] "RemoveContainer" containerID="9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.498349 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59\": container with ID starting with 
9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59 not found: ID does not exist" containerID="9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.498374 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59"} err="failed to get container status \"9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59\": rpc error: code = NotFound desc = could not find container \"9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59\": container with ID starting with 9230a32292e8f08f6cdafdf517f48f2097ddac8fe36f4e1346c06f088a729d59 not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.498388 4707 scope.go:117] "RemoveContainer" containerID="f81c42fb84cf6dc6db11797efd104f4de9e6929e765ab0ffe83b4f838fac8280" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.512379 4707 scope.go:117] "RemoveContainer" containerID="7d5e60913431e94a855dc86f1bcab868f337b843d01bafdb458a1a24b751facd" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.525998 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526019 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526030 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526039 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526047 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526055 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526062 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2p5d\" (UniqueName: \"kubernetes.io/projected/4638ccbe-c266-4bff-8771-67eb02be0a6a-kube-api-access-w2p5d\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.526072 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4638ccbe-c266-4bff-8771-67eb02be0a6a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.529182 4707 scope.go:117] "RemoveContainer" containerID="056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.543090 4707 scope.go:117] "RemoveContainer" 
containerID="056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.543379 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7\": container with ID starting with 056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7 not found: ID does not exist" containerID="056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.543438 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7"} err="failed to get container status \"056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7\": rpc error: code = NotFound desc = could not find container \"056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7\": container with ID starting with 056a5a8af181d028999cedac1b4a90f0b00dd9edd74adeeaeef7670b34bfd6b7 not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.543454 4707 scope.go:117] "RemoveContainer" containerID="dac905514aca127784ff10619aa0102a1724955c3593a17e7fa6fb6888775848" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.560391 4707 scope.go:117] "RemoveContainer" containerID="0f79ffcc51a1df0f60b92a65eed0727ebe1a87e5686c3cfa9b48daa4dee20df1" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.613583 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.620003 4707 scope.go:117] "RemoveContainer" containerID="a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.625326 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.637127 4707 scope.go:117] "RemoveContainer" containerID="940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.661497 4707 scope.go:117] "RemoveContainer" containerID="a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.663947 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233\": container with ID starting with a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233 not found: ID does not exist" containerID="a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.664085 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233"} err="failed to get container status \"a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233\": rpc error: code = NotFound desc = could not find container \"a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233\": container with ID starting with a5b339477eba9505b78026cdcf78d7b17e44e41f066c6503d57e0d9ab76a5233 not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.664274 4707 scope.go:117] "RemoveContainer" 
containerID="940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5" Jan 21 15:29:39 crc kubenswrapper[4707]: E0121 15:29:39.664766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5\": container with ID starting with 940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5 not found: ID does not exist" containerID="940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.664789 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5"} err="failed to get container status \"940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5\": rpc error: code = NotFound desc = could not find container \"940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5\": container with ID starting with 940a52152aa821540d6ef2ff4b216762b4ad9685af3ac470b44964bde67fe2b5 not found: ID does not exist" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.905285 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.946253 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.946301 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.946340 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.946717 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:29:39 crc kubenswrapper[4707]: I0121 15:29:39.946770 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" gracePeriod=600 Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.036620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-config-data\") pod \"a9f3e600-3622-4bc5-9481-86560969ee79\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.036703 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-internal-tls-certs\") pod \"a9f3e600-3622-4bc5-9481-86560969ee79\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.036725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f3e600-3622-4bc5-9481-86560969ee79-logs\") pod \"a9f3e600-3622-4bc5-9481-86560969ee79\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.036765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ph2h\" (UniqueName: \"kubernetes.io/projected/a9f3e600-3622-4bc5-9481-86560969ee79-kube-api-access-2ph2h\") pod \"a9f3e600-3622-4bc5-9481-86560969ee79\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.036802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-combined-ca-bundle\") pod \"a9f3e600-3622-4bc5-9481-86560969ee79\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.036881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-public-tls-certs\") pod \"a9f3e600-3622-4bc5-9481-86560969ee79\" (UID: \"a9f3e600-3622-4bc5-9481-86560969ee79\") " Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.038302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f3e600-3622-4bc5-9481-86560969ee79-logs" (OuterVolumeSpecName: "logs") pod "a9f3e600-3622-4bc5-9481-86560969ee79" (UID: "a9f3e600-3622-4bc5-9481-86560969ee79"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.039996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f3e600-3622-4bc5-9481-86560969ee79-kube-api-access-2ph2h" (OuterVolumeSpecName: "kube-api-access-2ph2h") pod "a9f3e600-3622-4bc5-9481-86560969ee79" (UID: "a9f3e600-3622-4bc5-9481-86560969ee79"). InnerVolumeSpecName "kube-api-access-2ph2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.052327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-config-data" (OuterVolumeSpecName: "config-data") pod "a9f3e600-3622-4bc5-9481-86560969ee79" (UID: "a9f3e600-3622-4bc5-9481-86560969ee79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.055479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9f3e600-3622-4bc5-9481-86560969ee79" (UID: "a9f3e600-3622-4bc5-9481-86560969ee79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.060731 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.063449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a9f3e600-3622-4bc5-9481-86560969ee79" (UID: "a9f3e600-3622-4bc5-9481-86560969ee79"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.064310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9f3e600-3622-4bc5-9481-86560969ee79" (UID: "a9f3e600-3622-4bc5-9481-86560969ee79"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.138210 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.138239 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.138249 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f3e600-3622-4bc5-9481-86560969ee79-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.138257 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ph2h\" (UniqueName: \"kubernetes.io/projected/a9f3e600-3622-4bc5-9481-86560969ee79-kube-api-access-2ph2h\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.138267 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.138275 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f3e600-3622-4bc5-9481-86560969ee79-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.182133 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/nova-scheduler-0" secret="" err="secret \"nova-nova-dockercfg-5sbk6\" not found" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.262673 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" exitCode=0 Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.262932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a"} Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.262959 4707 scope.go:117] "RemoveContainer" containerID="8593fceae8f46f5fa87aa536502a4e355cae8485d3fdcdf096b66dd3daf7b85e" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.263317 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.264075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.265341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" event={"ID":"4638ccbe-c266-4bff-8771-67eb02be0a6a","Type":"ContainerDied","Data":"47ee72cee17f0eca5fbbb671c8bed2952ca0a7b93a87b1f42a74ccac4d163da4"} Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.265403 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-664998d598-nfbb8" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.276367 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9f3e600-3622-4bc5-9481-86560969ee79" containerID="e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7" exitCode=0 Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.276393 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.276404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a9f3e600-3622-4bc5-9481-86560969ee79","Type":"ContainerDied","Data":"e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7"} Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.276453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a9f3e600-3622-4bc5-9481-86560969ee79","Type":"ContainerDied","Data":"d6c85ef282dec9f375cff60a34b332b21d801ab2f061fae93a3be587d21b1575"} Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.287775 4707 scope.go:117] "RemoveContainer" containerID="aeed4f7ea398c8aec570dded3f936a9753106f96a565795b7770c4040b2cca32" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.308832 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-664998d598-nfbb8"] Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.308978 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-664998d598-nfbb8"] Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.316544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.320871 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.323280 4707 scope.go:117] "RemoveContainer" containerID="e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7" Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.343177 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.343227 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle podName:a651b22a-3eb9-4d6e-96d5-95ece889686f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:40.843213784 +0000 UTC m=+1678.024730006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle") pod "nova-scheduler-0" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f") : secret "combined-ca-bundle" not found Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.349879 4707 scope.go:117] "RemoveContainer" containerID="2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.365493 4707 scope.go:117] "RemoveContainer" containerID="e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7" Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.365959 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7\": container with ID starting with e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7 not found: ID does not exist" containerID="e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.366016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7"} err="failed to get container status \"e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7\": rpc error: code = NotFound desc = could not find container \"e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7\": container with ID starting with e7de3d12fbc57ca51181e9605e9df23138dd5abf9180d2cd089882ae45e03bc7 not found: ID does not exist" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.366052 4707 scope.go:117] "RemoveContainer" containerID="2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b" Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.366503 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b\": container with ID starting with 2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b not found: ID does not exist" containerID="2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b" Jan 21 15:29:40 crc kubenswrapper[4707]: I0121 15:29:40.366526 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b"} err="failed to get container status \"2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b\": rpc error: code = NotFound desc = could not find container \"2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b\": container with ID starting with 2fd6dab3e7eb68c63049c51eae1a024f3bb1be0a3a670e3cd2838b89a700ac8b not found: ID does not exist" Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.847684 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.847750 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data podName:6bac3bc5-b4ca-4333-9fc6-851430c48708 nodeName:}" failed. No retries permitted until 2026-01-21 15:29:48.847735056 +0000 UTC m=+1686.029251278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data") pod "rabbitmq-server-0" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708") : configmap "rabbitmq-config-data" not found Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.847767 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:40 crc kubenswrapper[4707]: E0121 15:29:40.847833 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle podName:a651b22a-3eb9-4d6e-96d5-95ece889686f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:41.847820617 +0000 UTC m=+1679.029336839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle") pod "nova-scheduler-0" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f") : secret "combined-ca-bundle" not found Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.069749 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.070882 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.072063 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.072090 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f5fe6793-4878-480a-a359-225be30e006c" containerName="nova-cell0-conductor-conductor" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.072794 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.189279 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" path="/var/lib/kubelet/pods/09b00a3d-6e52-458e-8fb3-6c48d658ff86/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.190258 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a83157-f122-493a-acd7-ba671cfce233" path="/var/lib/kubelet/pods/33a83157-f122-493a-acd7-ba671cfce233/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.190825 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4638ccbe-c266-4bff-8771-67eb02be0a6a" path="/var/lib/kubelet/pods/4638ccbe-c266-4bff-8771-67eb02be0a6a/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.191844 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" path="/var/lib/kubelet/pods/62c3da3c-d7d6-4839-94a7-55b3d07bf5ce/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.192386 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a287941-c7ad-4671-904b-03172bc803e8" path="/var/lib/kubelet/pods/8a287941-c7ad-4671-904b-03172bc803e8/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.193367 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" path="/var/lib/kubelet/pods/a9f3e600-3622-4bc5-9481-86560969ee79/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.193916 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc6250e-7692-4325-896f-33f84c241afb" path="/var/lib/kubelet/pods/ccc6250e-7692-4325-896f-33f84c241afb/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.194429 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" path="/var/lib/kubelet/pods/d4c27903-58da-47dc-97d0-b1328c58303d/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.195306 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb1c248-d3f2-4038-8f84-01a4de59d39e" path="/var/lib/kubelet/pods/efb1c248-d3f2-4038-8f84-01a4de59d39e/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.195796 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" path="/var/lib/kubelet/pods/faf162ab-5917-4ddc-9b89-573d2e3bc7e6/volumes" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.256986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-tls\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-confd\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-plugins-conf\") pod 
\"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-erlang-cookie\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-plugins\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qmvf\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-kube-api-access-5qmvf\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-server-conf\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bac3bc5-b4ca-4333-9fc6-851430c48708-erlang-cookie-secret\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.257337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bac3bc5-b4ca-4333-9fc6-851430c48708-pod-info\") pod \"6bac3bc5-b4ca-4333-9fc6-851430c48708\" (UID: \"6bac3bc5-b4ca-4333-9fc6-851430c48708\") " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.258617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.259141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.259369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.262486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-kube-api-access-5qmvf" (OuterVolumeSpecName: "kube-api-access-5qmvf") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "kube-api-access-5qmvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.262708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "persistence") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.262799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6bac3bc5-b4ca-4333-9fc6-851430c48708-pod-info" (OuterVolumeSpecName: "pod-info") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.263016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bac3bc5-b4ca-4333-9fc6-851430c48708-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.263424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.272784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data" (OuterVolumeSpecName: "config-data") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.284989 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerID="001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287" exitCode=0 Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.285083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6bac3bc5-b4ca-4333-9fc6-851430c48708","Type":"ContainerDied","Data":"001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287"} Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.285133 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.285156 4707 scope.go:117] "RemoveContainer" containerID="001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.285147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6bac3bc5-b4ca-4333-9fc6-851430c48708","Type":"ContainerDied","Data":"cb9f7fedda804b83122a99e4767503c5e6a4ca6a1b966e175e2f50e17e8b5268"} Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.285373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-server-conf" (OuterVolumeSpecName: "server-conf") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.303326 4707 scope.go:117] "RemoveContainer" containerID="37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.316862 4707 scope.go:117] "RemoveContainer" containerID="001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287" Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.317103 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287\": container with ID starting with 001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287 not found: ID does not exist" containerID="001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.317139 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287"} err="failed to get container status \"001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287\": rpc error: code = NotFound desc = could not find container \"001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287\": container with ID starting with 001f7f6d82d027ccafa115dd861814e4cfe64187912f2799fdbb65bbcabfa287 not found: ID does not exist" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.317169 4707 scope.go:117] "RemoveContainer" containerID="37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855" Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.317455 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855\": container with ID starting with 37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855 not found: ID does not exist" containerID="37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.317488 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855"} err="failed to get container status \"37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855\": rpc error: code = NotFound desc = could not find container \"37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855\": container with ID starting with 37434dc25ced5b562a4a62ef8fc6e955d45f2ed549832b6eb489283266546855 not found: ID does not exist" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.328821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6bac3bc5-b4ca-4333-9fc6-851430c48708" (UID: "6bac3bc5-b4ca-4333-9fc6-851430c48708"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358834 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358858 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358869 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358878 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qmvf\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-kube-api-access-5qmvf\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358886 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358893 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bac3bc5-b4ca-4333-9fc6-851430c48708-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358911 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358919 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bac3bc5-b4ca-4333-9fc6-851430c48708-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358927 4707 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bac3bc5-b4ca-4333-9fc6-851430c48708-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358934 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.358942 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bac3bc5-b4ca-4333-9fc6-851430c48708-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.359618 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.359665 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data podName:d54b9100-265d-49a3-a2ab-72e8787510df nodeName:}" failed. No retries permitted until 2026-01-21 15:29:49.3596504 +0000 UTC m=+1686.541166622 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data") pod "rabbitmq-cell1-server-0" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.372584 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.460795 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.663759 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.668527 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.866423 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:41 crc kubenswrapper[4707]: E0121 15:29:41.866676 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle podName:a651b22a-3eb9-4d6e-96d5-95ece889686f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:43.86666009 +0000 UTC m=+1681.048176312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle") pod "nova-scheduler-0" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f") : secret "combined-ca-bundle" not found Jan 21 15:29:41 crc kubenswrapper[4707]: I0121 15:29:41.908329 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.039285 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.068863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-confd\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.068913 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.068947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-erlang-cookie\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.068994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd8ws\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-kube-api-access-hd8ws\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-tls\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-server-conf\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d54b9100-265d-49a3-a2ab-72e8787510df-pod-info\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-plugins-conf\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-plugins\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: 
\"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.069204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d54b9100-265d-49a3-a2ab-72e8787510df-erlang-cookie-secret\") pod \"d54b9100-265d-49a3-a2ab-72e8787510df\" (UID: \"d54b9100-265d-49a3-a2ab-72e8787510df\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.071436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.072521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.072635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.077710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-kube-api-access-hd8ws" (OuterVolumeSpecName: "kube-api-access-hd8ws") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "kube-api-access-hd8ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.077859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54b9100-265d-49a3-a2ab-72e8787510df-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.077928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d54b9100-265d-49a3-a2ab-72e8787510df-pod-info" (OuterVolumeSpecName: "pod-info") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.077862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.085151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.086559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data" (OuterVolumeSpecName: "config-data") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.098099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-server-conf" (OuterVolumeSpecName: "server-conf") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.124394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d54b9100-265d-49a3-a2ab-72e8787510df" (UID: "d54b9100-265d-49a3-a2ab-72e8787510df"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-config-data\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-combined-ca-bundle\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-sg-core-conf-yaml\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170355 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-log-httpd\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxd7z\" (UniqueName: \"kubernetes.io/projected/46be47ce-9ea4-4319-b7af-1a299354d2ab-kube-api-access-dxd7z\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " 
Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-run-httpd\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-ceilometer-tls-certs\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-scripts\") pod \"46be47ce-9ea4-4319-b7af-1a299354d2ab\" (UID: \"46be47ce-9ea4-4319-b7af-1a299354d2ab\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170852 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170872 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170881 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170890 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170905 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170914 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd8ws\" (UniqueName: \"kubernetes.io/projected/d54b9100-265d-49a3-a2ab-72e8787510df-kube-api-access-hd8ws\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170922 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170930 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d54b9100-265d-49a3-a2ab-72e8787510df-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170937 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d54b9100-265d-49a3-a2ab-72e8787510df-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170946 4707 reconciler_common.go:293] "Volume detached 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d54b9100-265d-49a3-a2ab-72e8787510df-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170953 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d54b9100-265d-49a3-a2ab-72e8787510df-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.170996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.173587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46be47ce-9ea4-4319-b7af-1a299354d2ab-kube-api-access-dxd7z" (OuterVolumeSpecName: "kube-api-access-dxd7z") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "kube-api-access-dxd7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.173963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-scripts" (OuterVolumeSpecName: "scripts") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.187760 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.189355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.199138 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.201387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.213307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.229727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-config-data" (OuterVolumeSpecName: "config-data") pod "46be47ce-9ea4-4319-b7af-1a299354d2ab" (UID: "46be47ce-9ea4-4319-b7af-1a299354d2ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272654 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272684 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272695 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272704 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxd7z\" (UniqueName: \"kubernetes.io/projected/46be47ce-9ea4-4319-b7af-1a299354d2ab-kube-api-access-dxd7z\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272712 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46be47ce-9ea4-4319-b7af-1a299354d2ab-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272720 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272727 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272734 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.272740 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46be47ce-9ea4-4319-b7af-1a299354d2ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.297291 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5fe6793-4878-480a-a359-225be30e006c" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" 
exitCode=0 Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.297329 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.297343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f5fe6793-4878-480a-a359-225be30e006c","Type":"ContainerDied","Data":"4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d"} Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.297385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f5fe6793-4878-480a-a359-225be30e006c","Type":"ContainerDied","Data":"a3993deda44d1d6434b3e8c25a3820a13a697f2a0bba99561425621c0cb08bd3"} Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.297407 4707 scope.go:117] "RemoveContainer" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.300643 4707 generic.go:334] "Generic (PLEG): container finished" podID="d54b9100-265d-49a3-a2ab-72e8787510df" containerID="ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d" exitCode=0 Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.300686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"d54b9100-265d-49a3-a2ab-72e8787510df","Type":"ContainerDied","Data":"ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d"} Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.300702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"d54b9100-265d-49a3-a2ab-72e8787510df","Type":"ContainerDied","Data":"3f3eb732254be7188935cbb1415cb838e3af6835b0129dbe6e6868919046514a"} Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.300754 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.303441 4707 generic.go:334] "Generic (PLEG): container finished" podID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerID="e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488" exitCode=0 Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.303492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerDied","Data":"e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488"} Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.303509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"46be47ce-9ea4-4319-b7af-1a299354d2ab","Type":"ContainerDied","Data":"c76a85d742cec86291e6eff4c33261f801a64a9c4ddbcbbf88328d1168087e82"} Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.303528 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.324231 4707 scope.go:117] "RemoveContainer" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.324743 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d\": container with ID starting with 4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d not found: ID does not exist" containerID="4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.324772 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d"} err="failed to get container status \"4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d\": rpc error: code = NotFound desc = could not find container \"4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d\": container with ID starting with 4b562acf9a7b4c17d2ad4a75dc1d40576e3a1b09e4e6493b81eef3135babc59d not found: ID does not exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.324795 4707 scope.go:117] "RemoveContainer" containerID="ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.333789 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.339701 4707 scope.go:117] "RemoveContainer" containerID="db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.339943 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.352402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.358120 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.371304 4707 scope.go:117] "RemoveContainer" containerID="ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.371799 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d\": container with ID starting with ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d not found: ID does not exist" containerID="ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.371846 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d"} err="failed to get container status \"ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d\": rpc error: code = NotFound desc = could not find container \"ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d\": container with ID starting with ec68dd61b62e9fa5f9669aa95e53375db25422fd5899ea36d22538536d9e8e7d not found: ID does not 
exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.371866 4707 scope.go:117] "RemoveContainer" containerID="db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.372292 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a\": container with ID starting with db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a not found: ID does not exist" containerID="db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.372374 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a"} err="failed to get container status \"db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a\": rpc error: code = NotFound desc = could not find container \"db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a\": container with ID starting with db1d13a227145db7f12e1976d90a2c94584f4336966404f65e61be592f74641a not found: ID does not exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.372475 4707 scope.go:117] "RemoveContainer" containerID="37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.373176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-config-data\") pod \"f5fe6793-4878-480a-a359-225be30e006c\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.373262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-combined-ca-bundle\") pod \"f5fe6793-4878-480a-a359-225be30e006c\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.373378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rrq8\" (UniqueName: \"kubernetes.io/projected/f5fe6793-4878-480a-a359-225be30e006c-kube-api-access-2rrq8\") pod \"f5fe6793-4878-480a-a359-225be30e006c\" (UID: \"f5fe6793-4878-480a-a359-225be30e006c\") " Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.376747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5fe6793-4878-480a-a359-225be30e006c-kube-api-access-2rrq8" (OuterVolumeSpecName: "kube-api-access-2rrq8") pod "f5fe6793-4878-480a-a359-225be30e006c" (UID: "f5fe6793-4878-480a-a359-225be30e006c"). InnerVolumeSpecName "kube-api-access-2rrq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.390312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-config-data" (OuterVolumeSpecName: "config-data") pod "f5fe6793-4878-480a-a359-225be30e006c" (UID: "f5fe6793-4878-480a-a359-225be30e006c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.391405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5fe6793-4878-480a-a359-225be30e006c" (UID: "f5fe6793-4878-480a-a359-225be30e006c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.455747 4707 scope.go:117] "RemoveContainer" containerID="eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.467152 4707 scope.go:117] "RemoveContainer" containerID="e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.475222 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rrq8\" (UniqueName: \"kubernetes.io/projected/f5fe6793-4878-480a-a359-225be30e006c-kube-api-access-2rrq8\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.475252 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.475264 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fe6793-4878-480a-a359-225be30e006c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.485280 4707 scope.go:117] "RemoveContainer" containerID="a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.500857 4707 scope.go:117] "RemoveContainer" containerID="37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.501273 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad\": container with ID starting with 37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad not found: ID does not exist" containerID="37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.501310 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad"} err="failed to get container status \"37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad\": rpc error: code = NotFound desc = could not find container \"37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad\": container with ID starting with 37d0b9703efda8a10eb9f28e3aa9a80a703e781e78bb69157e5cd75dbbf5e7ad not found: ID does not exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.501332 4707 scope.go:117] "RemoveContainer" containerID="eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.501625 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4\": container with ID starting 
with eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4 not found: ID does not exist" containerID="eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.501749 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4"} err="failed to get container status \"eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4\": rpc error: code = NotFound desc = could not find container \"eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4\": container with ID starting with eccdd198ee3107526f2a61897a43312d53d9197f3a91b9127ec2f3edfd1822f4 not found: ID does not exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.501857 4707 scope.go:117] "RemoveContainer" containerID="e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.502323 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488\": container with ID starting with e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488 not found: ID does not exist" containerID="e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.502347 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488"} err="failed to get container status \"e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488\": rpc error: code = NotFound desc = could not find container \"e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488\": container with ID starting with e96550d8950cc83f06dbd809cf09aa11bd66a121da69a27352d2ff0216c21488 not found: ID does not exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.502362 4707 scope.go:117] "RemoveContainer" containerID="a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554" Jan 21 15:29:42 crc kubenswrapper[4707]: E0121 15:29:42.502694 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554\": container with ID starting with a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554 not found: ID does not exist" containerID="a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.502719 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554"} err="failed to get container status \"a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554\": rpc error: code = NotFound desc = could not find container \"a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554\": container with ID starting with a19c7181f18fc66edf934cb448c2411947c763b5f1ba6ede3f6a1ed141c12554 not found: ID does not exist" Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.622405 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:29:42 crc kubenswrapper[4707]: I0121 15:29:42.626430 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:29:43 crc kubenswrapper[4707]: I0121 15:29:43.190310 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" path="/var/lib/kubelet/pods/46be47ce-9ea4-4319-b7af-1a299354d2ab/volumes" Jan 21 15:29:43 crc kubenswrapper[4707]: I0121 15:29:43.191117 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" path="/var/lib/kubelet/pods/6bac3bc5-b4ca-4333-9fc6-851430c48708/volumes" Jan 21 15:29:43 crc kubenswrapper[4707]: I0121 15:29:43.192144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" path="/var/lib/kubelet/pods/d54b9100-265d-49a3-a2ab-72e8787510df/volumes" Jan 21 15:29:43 crc kubenswrapper[4707]: I0121 15:29:43.192615 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5fe6793-4878-480a-a359-225be30e006c" path="/var/lib/kubelet/pods/f5fe6793-4878-480a-a359-225be30e006c/volumes" Jan 21 15:29:43 crc kubenswrapper[4707]: E0121 15:29:43.893541 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:43 crc kubenswrapper[4707]: E0121 15:29:43.893631 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle podName:a651b22a-3eb9-4d6e-96d5-95ece889686f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:47.893617961 +0000 UTC m=+1685.075134182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle") pod "nova-scheduler-0" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f") : secret "combined-ca-bundle" not found Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.703856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.704294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="7ff4b887-d336-427e-811b-0b0694985417" containerName="memcached" containerID="cri-o://4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b" gracePeriod=30 Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.715201 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.715390 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="a651b22a-3eb9-4d6e-96d5-95ece889686f" containerName="nova-scheduler-scheduler" containerID="cri-o://83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" gracePeriod=30 Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.942952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.982452 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m"] Jan 21 15:29:44 crc kubenswrapper[4707]: I0121 15:29:44.982644 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" 
containerName="dnsmasq-dns" containerID="cri-o://9ee60ee038e2abfb8251554c9075cf75988811fa8396a5cc948a9afaf1fcf228" gracePeriod=10 Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.332121 4707 generic.go:334] "Generic (PLEG): container finished" podID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerID="9ee60ee038e2abfb8251554c9075cf75988811fa8396a5cc948a9afaf1fcf228" exitCode=0 Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.332156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" event={"ID":"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5","Type":"ContainerDied","Data":"9ee60ee038e2abfb8251554c9075cf75988811fa8396a5cc948a9afaf1fcf228"} Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.332187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" event={"ID":"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5","Type":"ContainerDied","Data":"2b007c6a88d89f317074db587956c50c7cc1f817e8de9007ad276a848af17810"} Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.332198 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b007c6a88d89f317074db587956c50c7cc1f817e8de9007ad276a848af17810" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.341519 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.515636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dns-swift-storage-0\") pod \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.515748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dnsmasq-svc\") pod \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.515785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbcdl\" (UniqueName: \"kubernetes.io/projected/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-kube-api-access-jbcdl\") pod \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.515803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-config\") pod \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\" (UID: \"d327f80a-69ba-4e3d-97fa-fff0bfc98dd5\") " Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.520215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-kube-api-access-jbcdl" (OuterVolumeSpecName: "kube-api-access-jbcdl") pod "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" (UID: "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5"). InnerVolumeSpecName "kube-api-access-jbcdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.541014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" (UID: "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.542444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-config" (OuterVolumeSpecName: "config") pod "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" (UID: "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.552663 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" (UID: "d327f80a-69ba-4e3d-97fa-fff0bfc98dd5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.617445 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.617470 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbcdl\" (UniqueName: \"kubernetes.io/projected/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-kube-api-access-jbcdl\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.617481 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:45 crc kubenswrapper[4707]: I0121 15:29:45.617489 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.024334 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.225255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-combined-ca-bundle\") pod \"7ff4b887-d336-427e-811b-0b0694985417\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.225294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-memcached-tls-certs\") pod \"7ff4b887-d336-427e-811b-0b0694985417\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.225329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-config-data\") pod \"7ff4b887-d336-427e-811b-0b0694985417\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.225358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtfkg\" (UniqueName: \"kubernetes.io/projected/7ff4b887-d336-427e-811b-0b0694985417-kube-api-access-xtfkg\") pod \"7ff4b887-d336-427e-811b-0b0694985417\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.225384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-kolla-config\") pod \"7ff4b887-d336-427e-811b-0b0694985417\" (UID: \"7ff4b887-d336-427e-811b-0b0694985417\") " Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.226127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-config-data" (OuterVolumeSpecName: "config-data") pod "7ff4b887-d336-427e-811b-0b0694985417" (UID: "7ff4b887-d336-427e-811b-0b0694985417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.226136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7ff4b887-d336-427e-811b-0b0694985417" (UID: "7ff4b887-d336-427e-811b-0b0694985417"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.236549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff4b887-d336-427e-811b-0b0694985417-kube-api-access-xtfkg" (OuterVolumeSpecName: "kube-api-access-xtfkg") pod "7ff4b887-d336-427e-811b-0b0694985417" (UID: "7ff4b887-d336-427e-811b-0b0694985417"). InnerVolumeSpecName "kube-api-access-xtfkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.239145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ff4b887-d336-427e-811b-0b0694985417" (UID: "7ff4b887-d336-427e-811b-0b0694985417"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.249629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "7ff4b887-d336-427e-811b-0b0694985417" (UID: "7ff4b887-d336-427e-811b-0b0694985417"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.326823 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.326846 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.326856 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtfkg\" (UniqueName: \"kubernetes.io/projected/7ff4b887-d336-427e-811b-0b0694985417-kube-api-access-xtfkg\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.326866 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ff4b887-d336-427e-811b-0b0694985417-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.326874 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff4b887-d336-427e-811b-0b0694985417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.341047 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ff4b887-d336-427e-811b-0b0694985417" containerID="4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b" exitCode=0 Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.341188 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.341252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7ff4b887-d336-427e-811b-0b0694985417","Type":"ContainerDied","Data":"4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b"} Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.341297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7ff4b887-d336-427e-811b-0b0694985417","Type":"ContainerDied","Data":"fb28ad225e48afe7ff2833a326e7ba0a6e53e51f643187850de8cd07df44d0ed"} Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.341315 4707 scope.go:117] "RemoveContainer" containerID="4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.341559 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.367959 4707 scope.go:117] "RemoveContainer" containerID="4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b" Jan 21 15:29:46 crc kubenswrapper[4707]: E0121 15:29:46.368670 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b\": container with ID starting with 4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b not found: ID does not exist" containerID="4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.368702 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b"} err="failed to get container status \"4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b\": rpc error: code = NotFound desc = could not find container \"4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b\": container with ID starting with 4ddc77924a0b449c39c1bd38be0cbb029a71f472d2c237ed91ec55f1fc6cdf7b not found: ID does not exist" Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.373959 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m"] Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.377501 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6bf5dd6689-jrt6m"] Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.386412 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:29:46 crc kubenswrapper[4707]: I0121 15:29:46.390043 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:29:47 crc kubenswrapper[4707]: I0121 15:29:47.188861 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff4b887-d336-427e-811b-0b0694985417" path="/var/lib/kubelet/pods/7ff4b887-d336-427e-811b-0b0694985417/volumes" Jan 21 15:29:47 crc kubenswrapper[4707]: I0121 15:29:47.190014 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" path="/var/lib/kubelet/pods/d327f80a-69ba-4e3d-97fa-fff0bfc98dd5/volumes" Jan 21 15:29:47 crc kubenswrapper[4707]: E0121 15:29:47.485755 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:29:47 crc kubenswrapper[4707]: E0121 15:29:47.486902 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:29:47 crc kubenswrapper[4707]: E0121 15:29:47.487946 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:29:47 crc kubenswrapper[4707]: E0121 15:29:47.487978 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="a651b22a-3eb9-4d6e-96d5-95ece889686f" containerName="nova-scheduler-scheduler" Jan 21 15:29:47 crc kubenswrapper[4707]: E0121 15:29:47.945138 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:29:47 crc kubenswrapper[4707]: E0121 15:29:47.945203 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle podName:a651b22a-3eb9-4d6e-96d5-95ece889686f nodeName:}" failed. No retries permitted until 2026-01-21 15:29:55.945191429 +0000 UTC m=+1693.126707652 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle") pod "nova-scheduler-0" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f") : secret "combined-ca-bundle" not found Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.275961 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.360322 4707 generic.go:334] "Generic (PLEG): container finished" podID="a651b22a-3eb9-4d6e-96d5-95ece889686f" containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" exitCode=0 Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.360380 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.360381 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a651b22a-3eb9-4d6e-96d5-95ece889686f","Type":"ContainerDied","Data":"83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af"} Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.360438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a651b22a-3eb9-4d6e-96d5-95ece889686f","Type":"ContainerDied","Data":"843a8eb2c1f4442d9021632abceae272b31c88bc4936db7ddd8deddb6cde59c0"} Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.360457 4707 scope.go:117] "RemoveContainer" containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.377200 4707 scope.go:117] "RemoveContainer" containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" Jan 21 15:29:49 crc kubenswrapper[4707]: E0121 15:29:49.377755 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af\": container with ID starting with 83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af not found: ID does not exist" containerID="83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.377793 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af"} err="failed to get container status \"83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af\": rpc error: code = NotFound desc = could not find container \"83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af\": container with ID starting with 83200278b24d453790bc68bc9717cbda1f9611c757f51502fcbaedc6fb14e9af not found: ID does not exist" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.465842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-config-data\") pod \"a651b22a-3eb9-4d6e-96d5-95ece889686f\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.466130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle\") pod \"a651b22a-3eb9-4d6e-96d5-95ece889686f\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.466242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46h96\" (UniqueName: \"kubernetes.io/projected/a651b22a-3eb9-4d6e-96d5-95ece889686f-kube-api-access-46h96\") pod \"a651b22a-3eb9-4d6e-96d5-95ece889686f\" (UID: \"a651b22a-3eb9-4d6e-96d5-95ece889686f\") " Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.470623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a651b22a-3eb9-4d6e-96d5-95ece889686f-kube-api-access-46h96" (OuterVolumeSpecName: "kube-api-access-46h96") pod "a651b22a-3eb9-4d6e-96d5-95ece889686f" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f"). 
InnerVolumeSpecName "kube-api-access-46h96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.481726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-config-data" (OuterVolumeSpecName: "config-data") pod "a651b22a-3eb9-4d6e-96d5-95ece889686f" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.486995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a651b22a-3eb9-4d6e-96d5-95ece889686f" (UID: "a651b22a-3eb9-4d6e-96d5-95ece889686f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.568536 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.568724 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46h96\" (UniqueName: \"kubernetes.io/projected/a651b22a-3eb9-4d6e-96d5-95ece889686f-kube-api-access-46h96\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.568803 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a651b22a-3eb9-4d6e-96d5-95ece889686f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.684094 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:29:49 crc kubenswrapper[4707]: I0121 15:29:49.699025 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:29:51 crc kubenswrapper[4707]: I0121 15:29:51.027385 4707 scope.go:117] "RemoveContainer" containerID="349d7ce8226bdb20d55a2f74cfccc3655f5c42f37dd06ac1618bc6d7d059c8d3" Jan 21 15:29:51 crc kubenswrapper[4707]: I0121 15:29:51.190174 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a651b22a-3eb9-4d6e-96d5-95ece889686f" path="/var/lib/kubelet/pods/a651b22a-3eb9-4d6e-96d5-95ece889686f/volumes" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.089701 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-ovndb-tls-certs\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-combined-ca-bundle\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-httpd-config\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-public-tls-certs\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkxd4\" (UniqueName: \"kubernetes.io/projected/deb292be-f24c-461f-ac51-f1d4daa3f261-kube-api-access-rkxd4\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-config\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.100448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-internal-tls-certs\") pod \"deb292be-f24c-461f-ac51-f1d4daa3f261\" (UID: \"deb292be-f24c-461f-ac51-f1d4daa3f261\") " Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.105587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb292be-f24c-461f-ac51-f1d4daa3f261-kube-api-access-rkxd4" (OuterVolumeSpecName: "kube-api-access-rkxd4") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "kube-api-access-rkxd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.111825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.130481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.132869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.134879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.135469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-config" (OuterVolumeSpecName: "config") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.146017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "deb292be-f24c-461f-ac51-f1d4daa3f261" (UID: "deb292be-f24c-461f-ac51-f1d4daa3f261"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201880 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201905 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201914 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201921 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201930 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkxd4\" (UniqueName: \"kubernetes.io/projected/deb292be-f24c-461f-ac51-f1d4daa3f261-kube-api-access-rkxd4\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201938 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.201946 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb292be-f24c-461f-ac51-f1d4daa3f261-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.379296 4707 generic.go:334] "Generic (PLEG): container finished" podID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerID="4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b" exitCode=0 Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.379327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" event={"ID":"deb292be-f24c-461f-ac51-f1d4daa3f261","Type":"ContainerDied","Data":"4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b"} Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.379347 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.379362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r" event={"ID":"deb292be-f24c-461f-ac51-f1d4daa3f261","Type":"ContainerDied","Data":"04f14defb88ef7caea788209d0125afde6c7de00f6224dbe6cac5e26b0b269a1"} Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.379379 4707 scope.go:117] "RemoveContainer" containerID="05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.394766 4707 scope.go:117] "RemoveContainer" containerID="4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.399058 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r"] Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.403197 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7d8fb7c856-ftg5r"] Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.408972 4707 scope.go:117] "RemoveContainer" containerID="05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46" Jan 21 15:29:52 crc kubenswrapper[4707]: E0121 15:29:52.409273 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46\": container with ID starting with 05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46 not found: ID does not exist" containerID="05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.409310 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46"} err="failed to get container status \"05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46\": rpc error: code = NotFound desc = could not find container \"05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46\": container with ID starting with 05f14806dca01c7afe8cf431693d11141246b50a0a67cb1912d8cb743403aa46 not found: ID does not exist" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.409348 4707 scope.go:117] "RemoveContainer" containerID="4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b" Jan 21 15:29:52 crc kubenswrapper[4707]: E0121 15:29:52.409616 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b\": container with ID starting with 4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b not found: ID does not exist" containerID="4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b" Jan 21 15:29:52 crc kubenswrapper[4707]: I0121 15:29:52.409645 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b"} err="failed to get container status \"4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b\": rpc error: code = NotFound desc = could not find container \"4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b\": container with ID starting with 4a0da38673cee1cf373f1d7a2da4fc263d3eb1dd7d74c6fa33f5baa76942314b not found: 
ID does not exist" Jan 21 15:29:53 crc kubenswrapper[4707]: I0121 15:29:53.189629 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" path="/var/lib/kubelet/pods/deb292be-f24c-461f-ac51-f1d4daa3f261/volumes" Jan 21 15:29:55 crc kubenswrapper[4707]: I0121 15:29:55.184455 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:29:55 crc kubenswrapper[4707]: E0121 15:29:55.185967 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129272 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8"] Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129680 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129705 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff4b887-d336-427e-811b-0b0694985417" containerName="memcached" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff4b887-d336-427e-811b-0b0694985417" containerName="memcached" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerName="galera" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129725 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerName="galera" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129732 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-server" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129737 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-server" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129748 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="sg-core" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129753 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="sg-core" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129762 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129767 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129775 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="proxy-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="proxy-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129786 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d909d33-58f9-4a53-b98a-aa1927df3ea3" containerName="kube-state-metrics" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129792 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d909d33-58f9-4a53-b98a-aa1927df3ea3" containerName="kube-state-metrics" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129820 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerName="init" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129825 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerName="init" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129835 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129841 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129850 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="ovn-northd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129855 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="ovn-northd" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5fe6793-4878-480a-a359-225be30e006c" containerName="nova-cell0-conductor-conductor" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5fe6793-4878-480a-a359-225be30e006c" containerName="nova-cell0-conductor-conductor" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129881 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-api" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129887 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="rabbitmq" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129892 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="rabbitmq" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129898 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129902 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-api" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129911 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129916 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129924 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-central-agent" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129930 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-central-agent" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129938 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129945 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-api" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129954 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="rabbitmq" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129959 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="rabbitmq" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129964 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129969 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129975 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="openstack-network-exporter" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129980 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="openstack-network-exporter" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.129988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.129993 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130002 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" containerName="galera" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130008 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" containerName="galera" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-notification-agent" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130019 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-notification-agent" Jan 21 15:30:00 crc 
kubenswrapper[4707]: E0121 15:30:00.130027 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130032 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130038 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" containerName="mysql-bootstrap" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130043 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" containerName="mysql-bootstrap" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerName="dnsmasq-dns" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130053 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerName="dnsmasq-dns" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130059 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerName="mysql-bootstrap" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130064 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerName="mysql-bootstrap" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130072 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a651b22a-3eb9-4d6e-96d5-95ece889686f" containerName="nova-scheduler-scheduler" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130077 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a651b22a-3eb9-4d6e-96d5-95ece889686f" containerName="nova-scheduler-scheduler" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130086 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="setup-container" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130091 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="setup-container" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130097 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb1c248-d3f2-4038-8f84-01a4de59d39e" containerName="nova-cell1-conductor-conductor" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130116 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb1c248-d3f2-4038-8f84-01a4de59d39e" containerName="nova-cell1-conductor-conductor" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130124 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130128 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130135 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130140 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130154 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184ed54c-4ce8-4d12-ae34-255b44708d27" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130176 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="184ed54c-4ce8-4d12-ae34-255b44708d27" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130185 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130196 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="setup-container" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="setup-container" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130208 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-metadata" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130214 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-metadata" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130224 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130229 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130242 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130250 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 
15:30:00.130255 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener-log" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130262 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130274 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130285 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerName="mariadb-account-create-update" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130290 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerName="mariadb-account-create-update" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerName="mariadb-account-create-update" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130304 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerName="mariadb-account-create-update" Jan 21 15:30:00 crc kubenswrapper[4707]: E0121 15:30:00.130310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638ccbe-c266-4bff-8771-67eb02be0a6a" containerName="keystone-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130315 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638ccbe-c266-4bff-8771-67eb02be0a6a" containerName="keystone-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130414 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="sg-core" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130422 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="proxy-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130430 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130438 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130447 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-central-agent" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130456 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerName="mariadb-account-create-update" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130461 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d8d97aa5-7c98-4709-ba78-b2d9fb1a3f16" containerName="galera" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130471 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130477 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="openstack-network-exporter" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130484 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d327f80a-69ba-4e3d-97fa-fff0bfc98dd5" containerName="dnsmasq-dns" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130491 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb292be-f24c-461f-ac51-f1d4daa3f261" containerName="neutron-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130504 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bac3bc5-b4ca-4333-9fc6-851430c48708" containerName="rabbitmq" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130517 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c3da3c-d7d6-4839-94a7-55b3d07bf5ce" containerName="barbican-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130524 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5fe6793-4878-480a-a359-225be30e006c" containerName="nova-cell0-conductor-conductor" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130533 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130540 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a83157-f122-493a-acd7-ba671cfce233" containerName="glance-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130549 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d909d33-58f9-4a53-b98a-aa1927df3ea3" containerName="kube-state-metrics" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130557 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4638ccbe-c266-4bff-8771-67eb02be0a6a" containerName="keystone-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130574 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8daec53c-1342-4ca3-aa8b-a9d50bcf44b5" containerName="ovn-northd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130590 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f3e600-3622-4bc5-9481-86560969ee79" containerName="nova-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 
15:30:00.130595 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130603 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="280f96dd-c140-464c-a906-d5428cda5b5c" containerName="galera" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff4b887-d336-427e-811b-0b0694985417" containerName="memcached" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130616 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf162ab-5917-4ddc-9b89-573d2e3bc7e6" containerName="placement-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130622 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a651b22a-3eb9-4d6e-96d5-95ece889686f" containerName="nova-scheduler-scheduler" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46be47ce-9ea4-4319-b7af-1a299354d2ab" containerName="ceilometer-notification-agent" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130635 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="700a8d9c-852c-4611-9a98-50418d7df800" containerName="proxy-server" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130643 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c27903-58da-47dc-97d0-b1328c58303d" containerName="barbican-keystone-listener" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130652 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130658 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130665 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="184ed54c-4ce8-4d12-ae34-255b44708d27" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130672 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a287941-c7ad-4671-904b-03172bc803e8" containerName="barbican-worker-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130677 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb1c248-d3f2-4038-8f84-01a4de59d39e" containerName="nova-cell1-conductor-conductor" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130685 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b00a3d-6e52-458e-8fb3-6c48d658ff86" containerName="glance-httpd" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130691 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8daa249-22b3-4782-b64b-d9dca84a8777" containerName="mariadb-account-create-update" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130696 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6250e-7692-4325-896f-33f84c241afb" containerName="nova-metadata-metadata" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130704 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7c730d-de41-4543-a84e-d42a48259f71" containerName="cinder-api-log" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.130710 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d54b9100-265d-49a3-a2ab-72e8787510df" containerName="rabbitmq" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.131102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.132783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.133027 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.136748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8"] Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.292098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60b25168-75cb-4128-9228-1547e29e5af9-config-volume\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.292152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xpc\" (UniqueName: \"kubernetes.io/projected/60b25168-75cb-4128-9228-1547e29e5af9-kube-api-access-85xpc\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.292335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60b25168-75cb-4128-9228-1547e29e5af9-secret-volume\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.393308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60b25168-75cb-4128-9228-1547e29e5af9-config-volume\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.393349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85xpc\" (UniqueName: \"kubernetes.io/projected/60b25168-75cb-4128-9228-1547e29e5af9-kube-api-access-85xpc\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.393403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60b25168-75cb-4128-9228-1547e29e5af9-secret-volume\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc 
kubenswrapper[4707]: I0121 15:30:00.394100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60b25168-75cb-4128-9228-1547e29e5af9-config-volume\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.397660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60b25168-75cb-4128-9228-1547e29e5af9-secret-volume\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.406024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xpc\" (UniqueName: \"kubernetes.io/projected/60b25168-75cb-4128-9228-1547e29e5af9-kube-api-access-85xpc\") pod \"collect-profiles-29483490-9p6v8\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.443745 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:00 crc kubenswrapper[4707]: I0121 15:30:00.844572 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8"] Jan 21 15:30:01 crc kubenswrapper[4707]: I0121 15:30:01.430510 4707 generic.go:334] "Generic (PLEG): container finished" podID="60b25168-75cb-4128-9228-1547e29e5af9" containerID="a39188bac11b6baaf80c99303d7a4fcc4cc0ab5f6d005043fe277c2caf45f82c" exitCode=0 Jan 21 15:30:01 crc kubenswrapper[4707]: I0121 15:30:01.430612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" event={"ID":"60b25168-75cb-4128-9228-1547e29e5af9","Type":"ContainerDied","Data":"a39188bac11b6baaf80c99303d7a4fcc4cc0ab5f6d005043fe277c2caf45f82c"} Jan 21 15:30:01 crc kubenswrapper[4707]: I0121 15:30:01.430704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" event={"ID":"60b25168-75cb-4128-9228-1547e29e5af9","Type":"ContainerStarted","Data":"b6b356755809bd087e1823ec6bfb6cfb86fa1025c6f3770ee3c7f486ee471955"} Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.656632 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.822071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60b25168-75cb-4128-9228-1547e29e5af9-config-volume\") pod \"60b25168-75cb-4128-9228-1547e29e5af9\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.822212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60b25168-75cb-4128-9228-1547e29e5af9-secret-volume\") pod \"60b25168-75cb-4128-9228-1547e29e5af9\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.822234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85xpc\" (UniqueName: \"kubernetes.io/projected/60b25168-75cb-4128-9228-1547e29e5af9-kube-api-access-85xpc\") pod \"60b25168-75cb-4128-9228-1547e29e5af9\" (UID: \"60b25168-75cb-4128-9228-1547e29e5af9\") " Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.822718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b25168-75cb-4128-9228-1547e29e5af9-config-volume" (OuterVolumeSpecName: "config-volume") pod "60b25168-75cb-4128-9228-1547e29e5af9" (UID: "60b25168-75cb-4128-9228-1547e29e5af9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.826250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b25168-75cb-4128-9228-1547e29e5af9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "60b25168-75cb-4128-9228-1547e29e5af9" (UID: "60b25168-75cb-4128-9228-1547e29e5af9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.826321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b25168-75cb-4128-9228-1547e29e5af9-kube-api-access-85xpc" (OuterVolumeSpecName: "kube-api-access-85xpc") pod "60b25168-75cb-4128-9228-1547e29e5af9" (UID: "60b25168-75cb-4128-9228-1547e29e5af9"). InnerVolumeSpecName "kube-api-access-85xpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.923389 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60b25168-75cb-4128-9228-1547e29e5af9-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.923417 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85xpc\" (UniqueName: \"kubernetes.io/projected/60b25168-75cb-4128-9228-1547e29e5af9-kube-api-access-85xpc\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:02 crc kubenswrapper[4707]: I0121 15:30:02.923427 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60b25168-75cb-4128-9228-1547e29e5af9-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.158080 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjrtt\" (UniqueName: \"kubernetes.io/projected/f161ce76-80a9-470b-8776-df5af1799b7e-kube-api-access-jjrtt\") pod \"f161ce76-80a9-470b-8776-df5af1799b7e\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data-custom\") pod \"f161ce76-80a9-470b-8776-df5af1799b7e\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f161ce76-80a9-470b-8776-df5af1799b7e-etc-machine-id\") pod \"f161ce76-80a9-470b-8776-df5af1799b7e\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data\") pod \"f161ce76-80a9-470b-8776-df5af1799b7e\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-scripts\") pod \"f161ce76-80a9-470b-8776-df5af1799b7e\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-combined-ca-bundle\") pod \"f161ce76-80a9-470b-8776-df5af1799b7e\" (UID: \"f161ce76-80a9-470b-8776-df5af1799b7e\") " Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.329842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f161ce76-80a9-470b-8776-df5af1799b7e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f161ce76-80a9-470b-8776-df5af1799b7e" (UID: "f161ce76-80a9-470b-8776-df5af1799b7e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.332108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f161ce76-80a9-470b-8776-df5af1799b7e" (UID: "f161ce76-80a9-470b-8776-df5af1799b7e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.332291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-scripts" (OuterVolumeSpecName: "scripts") pod "f161ce76-80a9-470b-8776-df5af1799b7e" (UID: "f161ce76-80a9-470b-8776-df5af1799b7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.332324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f161ce76-80a9-470b-8776-df5af1799b7e-kube-api-access-jjrtt" (OuterVolumeSpecName: "kube-api-access-jjrtt") pod "f161ce76-80a9-470b-8776-df5af1799b7e" (UID: "f161ce76-80a9-470b-8776-df5af1799b7e"). InnerVolumeSpecName "kube-api-access-jjrtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.354991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f161ce76-80a9-470b-8776-df5af1799b7e" (UID: "f161ce76-80a9-470b-8776-df5af1799b7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.373590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data" (OuterVolumeSpecName: "config-data") pod "f161ce76-80a9-470b-8776-df5af1799b7e" (UID: "f161ce76-80a9-470b-8776-df5af1799b7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.430564 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f161ce76-80a9-470b-8776-df5af1799b7e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.430593 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.430601 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.430610 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.430620 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjrtt\" (UniqueName: \"kubernetes.io/projected/f161ce76-80a9-470b-8776-df5af1799b7e-kube-api-access-jjrtt\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.430629 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f161ce76-80a9-470b-8776-df5af1799b7e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.442336 4707 generic.go:334] "Generic (PLEG): container finished" podID="f161ce76-80a9-470b-8776-df5af1799b7e" containerID="51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4" exitCode=137 Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.442379 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.442515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f161ce76-80a9-470b-8776-df5af1799b7e","Type":"ContainerDied","Data":"51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4"} Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.442616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f161ce76-80a9-470b-8776-df5af1799b7e","Type":"ContainerDied","Data":"d4ed07cf8ed2006ff07f3aceb398c12221ee8f00f1f21939ff393ac7b6149fbe"} Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.442645 4707 scope.go:117] "RemoveContainer" containerID="aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.444639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" event={"ID":"60b25168-75cb-4128-9228-1547e29e5af9","Type":"ContainerDied","Data":"b6b356755809bd087e1823ec6bfb6cfb86fa1025c6f3770ee3c7f486ee471955"} Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.444729 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b356755809bd087e1823ec6bfb6cfb86fa1025c6f3770ee3c7f486ee471955" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.444691 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.458504 4707 scope.go:117] "RemoveContainer" containerID="51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.463118 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.467668 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.472827 4707 scope.go:117] "RemoveContainer" containerID="aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5" Jan 21 15:30:03 crc kubenswrapper[4707]: E0121 15:30:03.473075 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5\": container with ID starting with aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5 not found: ID does not exist" containerID="aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.473104 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5"} err="failed to get container status \"aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5\": rpc error: code = NotFound desc = could not find container \"aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5\": container with ID starting with aa7af08fba2965ba8908b6ffc31c678e0ca116090a1ba0e71164e6cdd08fe3d5 not found: ID does not exist" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.473124 4707 scope.go:117] "RemoveContainer" 
containerID="51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4" Jan 21 15:30:03 crc kubenswrapper[4707]: E0121 15:30:03.473357 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4\": container with ID starting with 51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4 not found: ID does not exist" containerID="51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4" Jan 21 15:30:03 crc kubenswrapper[4707]: I0121 15:30:03.473385 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4"} err="failed to get container status \"51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4\": rpc error: code = NotFound desc = could not find container \"51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4\": container with ID starting with 51c6dc21238eacf1375c3393701e1bedee3d380f47ab4f8591f7b9259fadfef4 not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.207869 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.239918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-589cr\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-kube-api-access-589cr\") pod \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.239991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") pod \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.240070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.240090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-lock\") pod \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.240103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-cache\") pod \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\" (UID: \"d2152261-adb2-4974-bd02-1e61bbd3d0b4\") " Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.240515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-lock" (OuterVolumeSpecName: "lock") pod "d2152261-adb2-4974-bd02-1e61bbd3d0b4" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.240619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-cache" (OuterVolumeSpecName: "cache") pod "d2152261-adb2-4974-bd02-1e61bbd3d0b4" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.243057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "d2152261-adb2-4974-bd02-1e61bbd3d0b4" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.243181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d2152261-adb2-4974-bd02-1e61bbd3d0b4" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.243550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-kube-api-access-589cr" (OuterVolumeSpecName: "kube-api-access-589cr") pod "d2152261-adb2-4974-bd02-1e61bbd3d0b4" (UID: "d2152261-adb2-4974-bd02-1e61bbd3d0b4"). InnerVolumeSpecName "kube-api-access-589cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.341178 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-589cr\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-kube-api-access-589cr\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.341206 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2152261-adb2-4974-bd02-1e61bbd3d0b4-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.341239 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.341248 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.341256 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2152261-adb2-4974-bd02-1e61bbd3d0b4-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.351501 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.441707 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:04 
crc kubenswrapper[4707]: I0121 15:30:04.455777 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerID="3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a" exitCode=137 Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.455821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a"} Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.456044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d2152261-adb2-4974-bd02-1e61bbd3d0b4","Type":"ContainerDied","Data":"51bd4a509c9056061034c28ba83642bb715ed15d21b2887dc9e334d4ca3fde09"} Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.455867 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.456106 4707 scope.go:117] "RemoveContainer" containerID="3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.471231 4707 scope.go:117] "RemoveContainer" containerID="45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.478402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.483672 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.485394 4707 scope.go:117] "RemoveContainer" containerID="e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.499597 4707 scope.go:117] "RemoveContainer" containerID="eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.512682 4707 scope.go:117] "RemoveContainer" containerID="f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.523333 4707 scope.go:117] "RemoveContainer" containerID="132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.535694 4707 scope.go:117] "RemoveContainer" containerID="ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.547961 4707 scope.go:117] "RemoveContainer" containerID="4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.560539 4707 scope.go:117] "RemoveContainer" containerID="5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.573667 4707 scope.go:117] "RemoveContainer" containerID="f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.586905 4707 scope.go:117] "RemoveContainer" containerID="1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.598063 4707 scope.go:117] "RemoveContainer" containerID="f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c" Jan 21 15:30:04 crc 
kubenswrapper[4707]: I0121 15:30:04.611486 4707 scope.go:117] "RemoveContainer" containerID="292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.638536 4707 scope.go:117] "RemoveContainer" containerID="b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.652634 4707 scope.go:117] "RemoveContainer" containerID="d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.664674 4707 scope.go:117] "RemoveContainer" containerID="3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.664984 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a\": container with ID starting with 3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a not found: ID does not exist" containerID="3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665023 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a"} err="failed to get container status \"3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a\": rpc error: code = NotFound desc = could not find container \"3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a\": container with ID starting with 3c8304a289357e1bd2c223e5517ad916d511ccae2e6c1c4bec0d4faa9b31981a not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665048 4707 scope.go:117] "RemoveContainer" containerID="45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.665296 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949\": container with ID starting with 45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949 not found: ID does not exist" containerID="45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665324 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949"} err="failed to get container status \"45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949\": rpc error: code = NotFound desc = could not find container \"45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949\": container with ID starting with 45f5ba6c380272f31e351f19d3ad9bd995865d83fa20c3976d8ba51da728f949 not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665343 4707 scope.go:117] "RemoveContainer" containerID="e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.665564 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02\": container with ID starting with e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02 not found: ID does not 
exist" containerID="e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665588 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02"} err="failed to get container status \"e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02\": rpc error: code = NotFound desc = could not find container \"e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02\": container with ID starting with e908f5cb1df3921aa357f03c523241a3d4a308e1a56bf6d418bebf983e0b6f02 not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665607 4707 scope.go:117] "RemoveContainer" containerID="eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.665823 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa\": container with ID starting with eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa not found: ID does not exist" containerID="eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665848 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa"} err="failed to get container status \"eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa\": rpc error: code = NotFound desc = could not find container \"eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa\": container with ID starting with eb52547ad21d9a03b880024195a562634f3522abde4be0a90e000efbe0b3c0fa not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.665863 4707 scope.go:117] "RemoveContainer" containerID="f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.666092 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec\": container with ID starting with f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec not found: ID does not exist" containerID="f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec"} err="failed to get container status \"f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec\": rpc error: code = NotFound desc = could not find container \"f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec\": container with ID starting with f5852acda4e4e96d7772105ec90831e69b57360144d709c659b2f938e7f3abec not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666128 4707 scope.go:117] "RemoveContainer" containerID="132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.666334 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a\": container with ID starting with 132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a not found: ID does not exist" containerID="132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666355 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a"} err="failed to get container status \"132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a\": rpc error: code = NotFound desc = could not find container \"132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a\": container with ID starting with 132cb9c6833913a23500ab5577d72319993893a1ebf2071c9cb68158dc81d56a not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666368 4707 scope.go:117] "RemoveContainer" containerID="ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.666542 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d\": container with ID starting with ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d not found: ID does not exist" containerID="ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666578 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d"} err="failed to get container status \"ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d\": rpc error: code = NotFound desc = could not find container \"ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d\": container with ID starting with ce39f48336a2a4c642576637f06e33db3e4c79113916acb80150641d9419660d not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666589 4707 scope.go:117] "RemoveContainer" containerID="4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.666787 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89\": container with ID starting with 4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89 not found: ID does not exist" containerID="4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666824 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89"} err="failed to get container status \"4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89\": rpc error: code = NotFound desc = could not find container \"4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89\": container with ID starting with 4c221f56196b9016c421c0a21a454b12f4f4185ea8bb81ce8a90051d3c46fe89 not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.666836 4707 scope.go:117] "RemoveContainer" containerID="5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a" Jan 21 15:30:04 crc 
kubenswrapper[4707]: E0121 15:30:04.667012 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a\": container with ID starting with 5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a not found: ID does not exist" containerID="5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667029 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a"} err="failed to get container status \"5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a\": rpc error: code = NotFound desc = could not find container \"5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a\": container with ID starting with 5a7b230559f9083b33f60a08125192e027322e5191fe3786f495d7e2a3fa344a not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667094 4707 scope.go:117] "RemoveContainer" containerID="f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.667288 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39\": container with ID starting with f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39 not found: ID does not exist" containerID="f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667308 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39"} err="failed to get container status \"f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39\": rpc error: code = NotFound desc = could not find container \"f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39\": container with ID starting with f5df66ee8f6571d0aec9ff5b7b2bce825c2d48325e56a10546b37dfb035a7d39 not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667321 4707 scope.go:117] "RemoveContainer" containerID="1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.667499 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58\": container with ID starting with 1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58 not found: ID does not exist" containerID="1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667519 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58"} err="failed to get container status \"1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58\": rpc error: code = NotFound desc = could not find container \"1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58\": container with ID starting with 1def97637d30d5ae73d24a5894742fae01c152285d0f132da07d3feab4961a58 not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: 
I0121 15:30:04.667531 4707 scope.go:117] "RemoveContainer" containerID="f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.667696 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c\": container with ID starting with f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c not found: ID does not exist" containerID="f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667717 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c"} err="failed to get container status \"f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c\": rpc error: code = NotFound desc = could not find container \"f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c\": container with ID starting with f0bb86e0b8b50d4ffff0fb908ef389581d5d26b93c8c93154d733627f83d062c not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667728 4707 scope.go:117] "RemoveContainer" containerID="292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.667913 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d\": container with ID starting with 292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d not found: ID does not exist" containerID="292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667932 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d"} err="failed to get container status \"292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d\": rpc error: code = NotFound desc = could not find container \"292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d\": container with ID starting with 292d249099873d4df1221d2870105d70fdfafe65983c675fabd9ee200d411e4d not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.667943 4707 scope.go:117] "RemoveContainer" containerID="b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.668108 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d\": container with ID starting with b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d not found: ID does not exist" containerID="b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.668149 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d"} err="failed to get container status \"b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d\": rpc error: code = NotFound desc = could not find container \"b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d\": container 
with ID starting with b828db751e54255592b64dec316efe2c667e46a8b1ddb2282182ff30176b3c1d not found: ID does not exist" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.668173 4707 scope.go:117] "RemoveContainer" containerID="d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9" Jan 21 15:30:04 crc kubenswrapper[4707]: E0121 15:30:04.668676 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9\": container with ID starting with d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9 not found: ID does not exist" containerID="d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9" Jan 21 15:30:04 crc kubenswrapper[4707]: I0121 15:30:04.668721 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9"} err="failed to get container status \"d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9\": rpc error: code = NotFound desc = could not find container \"d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9\": container with ID starting with d56c1f43e80436263a24954dd469c8f40c593759f1c9757c44a3cc7047a658d9 not found: ID does not exist" Jan 21 15:30:05 crc kubenswrapper[4707]: I0121 15:30:05.189019 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" path="/var/lib/kubelet/pods/d2152261-adb2-4974-bd02-1e61bbd3d0b4/volumes" Jan 21 15:30:05 crc kubenswrapper[4707]: I0121 15:30:05.190797 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" path="/var/lib/kubelet/pods/f161ce76-80a9-470b-8776-df5af1799b7e/volumes" Jan 21 15:30:08 crc kubenswrapper[4707]: I0121 15:30:08.182961 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:30:08 crc kubenswrapper[4707]: E0121 15:30:08.184052 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.334221 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-n6mcc"] Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.337912 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-n6mcc"] Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447142 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-l6rps"] Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447388 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="swift-recon-cron" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447407 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="swift-recon-cron" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447419 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-expirer" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447425 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-expirer" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447462 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-updater" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447469 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-updater" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-updater" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447486 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-updater" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447500 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447505 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="rsync" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447510 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="rsync" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447519 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447525 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447531 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-reaper" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447536 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-reaper" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447544 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="cinder-scheduler" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447549 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="cinder-scheduler" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-server" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-server" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447569 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-server" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447573 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-server" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447582 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447587 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447592 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b25168-75cb-4128-9228-1547e29e5af9" containerName="collect-profiles" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447597 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b25168-75cb-4128-9228-1547e29e5af9" containerName="collect-profiles" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447605 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447619 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-server" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447624 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-server" Jan 21 15:30:12 crc kubenswrapper[4707]: E0121 15:30:12.447631 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="probe" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447636 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="probe" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447753 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447764 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b25168-75cb-4128-9228-1547e29e5af9" containerName="collect-profiles" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447771 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-updater" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447781 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-reaper" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447786 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-server" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447796 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="cinder-scheduler" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447828 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="swift-recon-cron" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-expirer" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="rsync" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447851 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="account-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447858 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-replicator" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447864 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f161ce76-80a9-470b-8776-df5af1799b7e" containerName="probe" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-updater" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447883 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-server" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447894 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447901 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="container-server" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.447908 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2152261-adb2-4974-bd02-1e61bbd3d0b4" containerName="object-auditor" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.448298 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.449460 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.450141 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.450279 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.452564 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.453175 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-l6rps"] Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.537008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghps\" (UniqueName: \"kubernetes.io/projected/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-kube-api-access-lghps\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.537049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-node-mnt\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.537194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-crc-storage\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.638108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-crc-storage\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.638147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghps\" (UniqueName: \"kubernetes.io/projected/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-kube-api-access-lghps\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.638175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-node-mnt\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.638340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-node-mnt\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " 
pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.638845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-crc-storage\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.653409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghps\" (UniqueName: \"kubernetes.io/projected/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-kube-api-access-lghps\") pod \"crc-storage-crc-l6rps\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:12 crc kubenswrapper[4707]: I0121 15:30:12.768064 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:13 crc kubenswrapper[4707]: I0121 15:30:13.117073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-l6rps"] Jan 21 15:30:13 crc kubenswrapper[4707]: I0121 15:30:13.189186 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8fb795-c82f-48ae-9bdf-19c03a086d7b" path="/var/lib/kubelet/pods/5e8fb795-c82f-48ae-9bdf-19c03a086d7b/volumes" Jan 21 15:30:13 crc kubenswrapper[4707]: I0121 15:30:13.514284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l6rps" event={"ID":"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4","Type":"ContainerStarted","Data":"b73c1d8de8dd6c62cbf6f3d5224eff55750d8d46737a8c6b78e6f00a25d60c31"} Jan 21 15:30:14 crc kubenswrapper[4707]: I0121 15:30:14.521362 4707 generic.go:334] "Generic (PLEG): container finished" podID="d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" containerID="2959c05294ef22b7eddaf9fa7c698a04d0ddd1042daf82642dac605658641a3b" exitCode=0 Jan 21 15:30:14 crc kubenswrapper[4707]: I0121 15:30:14.521459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l6rps" event={"ID":"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4","Type":"ContainerDied","Data":"2959c05294ef22b7eddaf9fa7c698a04d0ddd1042daf82642dac605658641a3b"} Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.786373 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.977526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-node-mnt\") pod \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.977563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-crc-storage\") pod \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.977594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lghps\" (UniqueName: \"kubernetes.io/projected/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-kube-api-access-lghps\") pod \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\" (UID: \"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4\") " Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.977596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" (UID: "d317adf0-9538-4bb0-a7b7-1aa9ccb547a4"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.985906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-kube-api-access-lghps" (OuterVolumeSpecName: "kube-api-access-lghps") pod "d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" (UID: "d317adf0-9538-4bb0-a7b7-1aa9ccb547a4"). InnerVolumeSpecName "kube-api-access-lghps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:15.990648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" (UID: "d317adf0-9538-4bb0-a7b7-1aa9ccb547a4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:16.078526 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:16.078560 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:16.078570 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lghps\" (UniqueName: \"kubernetes.io/projected/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4-kube-api-access-lghps\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:16.533317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l6rps" event={"ID":"d317adf0-9538-4bb0-a7b7-1aa9ccb547a4","Type":"ContainerDied","Data":"b73c1d8de8dd6c62cbf6f3d5224eff55750d8d46737a8c6b78e6f00a25d60c31"} Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:16.533352 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b73c1d8de8dd6c62cbf6f3d5224eff55750d8d46737a8c6b78e6f00a25d60c31" Jan 21 15:30:16 crc kubenswrapper[4707]: I0121 15:30:16.533355 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l6rps" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.656252 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-l6rps"] Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.661030 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-l6rps"] Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.764243 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qpnsx"] Jan 21 15:30:18 crc kubenswrapper[4707]: E0121 15:30:18.764624 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" containerName="storage" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.764698 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" containerName="storage" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.764945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" containerName="storage" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.765385 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.767606 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.767689 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.767624 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.767669 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.769626 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qpnsx"] Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.814834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-crc-storage\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.814876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md4mm\" (UniqueName: \"kubernetes.io/projected/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-kube-api-access-md4mm\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.815073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-node-mnt\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.916096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-crc-storage\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.916136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md4mm\" (UniqueName: \"kubernetes.io/projected/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-kube-api-access-md4mm\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.916224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-node-mnt\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.916412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-node-mnt\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " 
pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.916792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-crc-storage\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:18 crc kubenswrapper[4707]: I0121 15:30:18.929698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md4mm\" (UniqueName: \"kubernetes.io/projected/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-kube-api-access-md4mm\") pod \"crc-storage-crc-qpnsx\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:19 crc kubenswrapper[4707]: I0121 15:30:19.076830 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:19 crc kubenswrapper[4707]: I0121 15:30:19.198939 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d317adf0-9538-4bb0-a7b7-1aa9ccb547a4" path="/var/lib/kubelet/pods/d317adf0-9538-4bb0-a7b7-1aa9ccb547a4/volumes" Jan 21 15:30:19 crc kubenswrapper[4707]: I0121 15:30:19.417255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qpnsx"] Jan 21 15:30:19 crc kubenswrapper[4707]: I0121 15:30:19.553731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qpnsx" event={"ID":"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb","Type":"ContainerStarted","Data":"7054cfc8ceccb16f934090a6dd5412454081073f9249ba685aa4a1da3002d5d6"} Jan 21 15:30:20 crc kubenswrapper[4707]: I0121 15:30:20.561304 4707 generic.go:334] "Generic (PLEG): container finished" podID="a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" containerID="1a70f6bf277e47355538d924e7369f28b3980ffde1ce60e65e4ba86178bb6cc7" exitCode=0 Jan 21 15:30:20 crc kubenswrapper[4707]: I0121 15:30:20.561339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qpnsx" event={"ID":"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb","Type":"ContainerDied","Data":"1a70f6bf277e47355538d924e7369f28b3980ffde1ce60e65e4ba86178bb6cc7"} Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.182706 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:30:21 crc kubenswrapper[4707]: E0121 15:30:21.185566 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.778355 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.952065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md4mm\" (UniqueName: \"kubernetes.io/projected/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-kube-api-access-md4mm\") pod \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.952125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-crc-storage\") pod \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.952143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-node-mnt\") pod \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\" (UID: \"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb\") " Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.952430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" (UID: "a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.956117 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-kube-api-access-md4mm" (OuterVolumeSpecName: "kube-api-access-md4mm") pod "a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" (UID: "a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb"). InnerVolumeSpecName "kube-api-access-md4mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:21 crc kubenswrapper[4707]: I0121 15:30:21.965474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" (UID: "a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:30:22 crc kubenswrapper[4707]: I0121 15:30:22.053076 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md4mm\" (UniqueName: \"kubernetes.io/projected/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-kube-api-access-md4mm\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:22 crc kubenswrapper[4707]: I0121 15:30:22.053101 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:22 crc kubenswrapper[4707]: I0121 15:30:22.053111 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:22 crc kubenswrapper[4707]: I0121 15:30:22.574086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qpnsx" event={"ID":"a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb","Type":"ContainerDied","Data":"7054cfc8ceccb16f934090a6dd5412454081073f9249ba685aa4a1da3002d5d6"} Jan 21 15:30:22 crc kubenswrapper[4707]: I0121 15:30:22.574119 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7054cfc8ceccb16f934090a6dd5412454081073f9249ba685aa4a1da3002d5d6" Jan 21 15:30:22 crc kubenswrapper[4707]: I0121 15:30:22.574128 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qpnsx" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.120450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh"] Jan 21 15:30:27 crc kubenswrapper[4707]: E0121 15:30:27.120859 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" containerName="storage" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.120872 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" containerName="storage" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.121006 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" containerName="storage" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.121821 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.126415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.127334 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh"] Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.312409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.312478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqk8b\" (UniqueName: \"kubernetes.io/projected/d5e44698-abdb-4d2b-8058-ec06409cd1d3-kube-api-access-pqk8b\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.312584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.413680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqk8b\" (UniqueName: \"kubernetes.io/projected/d5e44698-abdb-4d2b-8058-ec06409cd1d3-kube-api-access-pqk8b\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.413735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.413942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.414247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.414308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.428364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqk8b\" (UniqueName: \"kubernetes.io/projected/d5e44698-abdb-4d2b-8058-ec06409cd1d3-kube-api-access-pqk8b\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.435032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:27 crc kubenswrapper[4707]: I0121 15:30:27.782567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh"] Jan 21 15:30:28 crc kubenswrapper[4707]: I0121 15:30:28.609490 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerID="75df47ba669c5e8933cd2fa4e1787d4a58c04c83a5f629ee99b8f4453fb64dfa" exitCode=0 Jan 21 15:30:28 crc kubenswrapper[4707]: I0121 15:30:28.609535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" event={"ID":"d5e44698-abdb-4d2b-8058-ec06409cd1d3","Type":"ContainerDied","Data":"75df47ba669c5e8933cd2fa4e1787d4a58c04c83a5f629ee99b8f4453fb64dfa"} Jan 21 15:30:28 crc kubenswrapper[4707]: I0121 15:30:28.609706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" event={"ID":"d5e44698-abdb-4d2b-8058-ec06409cd1d3","Type":"ContainerStarted","Data":"02d472202596cad34ac6761f4ddb94a9357de9b894adbe010bdbca52c89f410f"} Jan 21 15:30:30 crc kubenswrapper[4707]: I0121 15:30:30.621432 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerID="4463fd7f6d929e33583521043b21aa0adddd721c4916cbb036527bf7547d76c4" exitCode=0 Jan 21 15:30:30 crc kubenswrapper[4707]: I0121 15:30:30.621500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" event={"ID":"d5e44698-abdb-4d2b-8058-ec06409cd1d3","Type":"ContainerDied","Data":"4463fd7f6d929e33583521043b21aa0adddd721c4916cbb036527bf7547d76c4"} Jan 21 15:30:31 crc kubenswrapper[4707]: I0121 15:30:31.629141 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerID="0fc73b3896d68c62ceebb7cb193d05351e3457490479b0523e5688f7ca4658ca" exitCode=0 Jan 21 15:30:31 crc kubenswrapper[4707]: I0121 
15:30:31.629206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" event={"ID":"d5e44698-abdb-4d2b-8058-ec06409cd1d3","Type":"ContainerDied","Data":"0fc73b3896d68c62ceebb7cb193d05351e3457490479b0523e5688f7ca4658ca"} Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.846255 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.983729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-util\") pod \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.983795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqk8b\" (UniqueName: \"kubernetes.io/projected/d5e44698-abdb-4d2b-8058-ec06409cd1d3-kube-api-access-pqk8b\") pod \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.983839 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-bundle\") pod \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\" (UID: \"d5e44698-abdb-4d2b-8058-ec06409cd1d3\") " Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.986054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-bundle" (OuterVolumeSpecName: "bundle") pod "d5e44698-abdb-4d2b-8058-ec06409cd1d3" (UID: "d5e44698-abdb-4d2b-8058-ec06409cd1d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.988649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e44698-abdb-4d2b-8058-ec06409cd1d3-kube-api-access-pqk8b" (OuterVolumeSpecName: "kube-api-access-pqk8b") pod "d5e44698-abdb-4d2b-8058-ec06409cd1d3" (UID: "d5e44698-abdb-4d2b-8058-ec06409cd1d3"). InnerVolumeSpecName "kube-api-access-pqk8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:30:32 crc kubenswrapper[4707]: I0121 15:30:32.994890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-util" (OuterVolumeSpecName: "util") pod "d5e44698-abdb-4d2b-8058-ec06409cd1d3" (UID: "d5e44698-abdb-4d2b-8058-ec06409cd1d3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:30:33 crc kubenswrapper[4707]: I0121 15:30:33.085914 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:33 crc kubenswrapper[4707]: I0121 15:30:33.085937 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqk8b\" (UniqueName: \"kubernetes.io/projected/d5e44698-abdb-4d2b-8058-ec06409cd1d3-kube-api-access-pqk8b\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:33 crc kubenswrapper[4707]: I0121 15:30:33.085947 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5e44698-abdb-4d2b-8058-ec06409cd1d3-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:30:33 crc kubenswrapper[4707]: I0121 15:30:33.640726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" event={"ID":"d5e44698-abdb-4d2b-8058-ec06409cd1d3","Type":"ContainerDied","Data":"02d472202596cad34ac6761f4ddb94a9357de9b894adbe010bdbca52c89f410f"} Jan 21 15:30:33 crc kubenswrapper[4707]: I0121 15:30:33.640921 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d472202596cad34ac6761f4ddb94a9357de9b894adbe010bdbca52c89f410f" Jan 21 15:30:33 crc kubenswrapper[4707]: I0121 15:30:33.640774 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh" Jan 21 15:30:36 crc kubenswrapper[4707]: I0121 15:30:36.182423 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:30:36 crc kubenswrapper[4707]: E0121 15:30:36.182607 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.771485 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr"] Jan 21 15:30:43 crc kubenswrapper[4707]: E0121 15:30:43.772105 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="pull" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.772119 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="pull" Jan 21 15:30:43 crc kubenswrapper[4707]: E0121 15:30:43.772132 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="extract" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.772137 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="extract" Jan 21 15:30:43 crc kubenswrapper[4707]: E0121 15:30:43.772150 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="util" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.772170 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="util" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.772301 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e44698-abdb-4d2b-8058-ec06409cd1d3" containerName="extract" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.772664 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.774448 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.774611 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.774731 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-j95xk" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.787990 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr"] Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.810088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflbk\" (UniqueName: \"kubernetes.io/projected/b57455b4-f5cc-4e50-972f-299a31cc4eba-kube-api-access-fflbk\") pod \"obo-prometheus-operator-68bc856cb9-vcvdr\" (UID: \"b57455b4-f5cc-4e50-972f-299a31cc4eba\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.886952 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc"] Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.887841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.893731 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qtlqp" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.893912 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.903425 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc"] Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.904141 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.908980 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc"] Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.910978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50a8c4cf-f6ea-4042-900f-37b63445f2d3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc\" (UID: \"50a8c4cf-f6ea-4042-900f-37b63445f2d3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.911018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc\" (UID: \"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.911060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflbk\" (UniqueName: \"kubernetes.io/projected/b57455b4-f5cc-4e50-972f-299a31cc4eba-kube-api-access-fflbk\") pod \"obo-prometheus-operator-68bc856cb9-vcvdr\" (UID: \"b57455b4-f5cc-4e50-972f-299a31cc4eba\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.911099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50a8c4cf-f6ea-4042-900f-37b63445f2d3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc\" (UID: \"50a8c4cf-f6ea-4042-900f-37b63445f2d3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.911144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc\" (UID: \"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.926345 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc"] Jan 21 15:30:43 crc kubenswrapper[4707]: I0121 15:30:43.942112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflbk\" (UniqueName: \"kubernetes.io/projected/b57455b4-f5cc-4e50-972f-299a31cc4eba-kube-api-access-fflbk\") pod \"obo-prometheus-operator-68bc856cb9-vcvdr\" (UID: \"b57455b4-f5cc-4e50-972f-299a31cc4eba\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.011982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50a8c4cf-f6ea-4042-900f-37b63445f2d3-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc\" (UID: \"50a8c4cf-f6ea-4042-900f-37b63445f2d3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.012249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc\" (UID: \"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.012275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50a8c4cf-f6ea-4042-900f-37b63445f2d3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc\" (UID: \"50a8c4cf-f6ea-4042-900f-37b63445f2d3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.012299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc\" (UID: \"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.015317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50a8c4cf-f6ea-4042-900f-37b63445f2d3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc\" (UID: \"50a8c4cf-f6ea-4042-900f-37b63445f2d3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.016212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc\" (UID: \"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.018063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50a8c4cf-f6ea-4042-900f-37b63445f2d3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc\" (UID: \"50a8c4cf-f6ea-4042-900f-37b63445f2d3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.019186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc\" (UID: \"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.021474 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hc5fq"] Jan 21 15:30:44 crc 
kubenswrapper[4707]: I0121 15:30:44.022280 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.026014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.026174 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-fxptm" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.049408 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hc5fq"] Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.086664 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.116043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/de08c513-221d-41e8-a1b3-fc4b21d19173-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hc5fq\" (UID: \"de08c513-221d-41e8-a1b3-fc4b21d19173\") " pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.116125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c77g8\" (UniqueName: \"kubernetes.io/projected/de08c513-221d-41e8-a1b3-fc4b21d19173-kube-api-access-c77g8\") pod \"observability-operator-59bdc8b94-hc5fq\" (UID: \"de08c513-221d-41e8-a1b3-fc4b21d19173\") " pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.168773 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-n7h88"] Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.172892 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.177190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-t69c5" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.179106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-n7h88"] Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.207046 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.216077 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.217111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/de08c513-221d-41e8-a1b3-fc4b21d19173-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hc5fq\" (UID: \"de08c513-221d-41e8-a1b3-fc4b21d19173\") " pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.217209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c77g8\" (UniqueName: \"kubernetes.io/projected/de08c513-221d-41e8-a1b3-fc4b21d19173-kube-api-access-c77g8\") pod \"observability-operator-59bdc8b94-hc5fq\" (UID: \"de08c513-221d-41e8-a1b3-fc4b21d19173\") " pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.225315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/de08c513-221d-41e8-a1b3-fc4b21d19173-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hc5fq\" (UID: \"de08c513-221d-41e8-a1b3-fc4b21d19173\") " pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.237482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c77g8\" (UniqueName: \"kubernetes.io/projected/de08c513-221d-41e8-a1b3-fc4b21d19173-kube-api-access-c77g8\") pod \"observability-operator-59bdc8b94-hc5fq\" (UID: \"de08c513-221d-41e8-a1b3-fc4b21d19173\") " pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.318292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6n5q\" (UniqueName: \"kubernetes.io/projected/a77834d8-cde4-449d-b56f-6d3b53b0cadd-kube-api-access-x6n5q\") pod \"perses-operator-5bf474d74f-n7h88\" (UID: \"a77834d8-cde4-449d-b56f-6d3b53b0cadd\") " pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.318345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a77834d8-cde4-449d-b56f-6d3b53b0cadd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-n7h88\" (UID: \"a77834d8-cde4-449d-b56f-6d3b53b0cadd\") " pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.367676 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.419421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6n5q\" (UniqueName: \"kubernetes.io/projected/a77834d8-cde4-449d-b56f-6d3b53b0cadd-kube-api-access-x6n5q\") pod \"perses-operator-5bf474d74f-n7h88\" (UID: \"a77834d8-cde4-449d-b56f-6d3b53b0cadd\") " pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.419644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a77834d8-cde4-449d-b56f-6d3b53b0cadd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-n7h88\" (UID: \"a77834d8-cde4-449d-b56f-6d3b53b0cadd\") " pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.420662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a77834d8-cde4-449d-b56f-6d3b53b0cadd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-n7h88\" (UID: \"a77834d8-cde4-449d-b56f-6d3b53b0cadd\") " pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.436075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6n5q\" (UniqueName: \"kubernetes.io/projected/a77834d8-cde4-449d-b56f-6d3b53b0cadd-kube-api-access-x6n5q\") pod \"perses-operator-5bf474d74f-n7h88\" (UID: \"a77834d8-cde4-449d-b56f-6d3b53b0cadd\") " pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.492683 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.611390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr"] Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.713843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" event={"ID":"b57455b4-f5cc-4e50-972f-299a31cc4eba","Type":"ContainerStarted","Data":"2e832f42d956fb7186aabba5a6f6feefbc2b8e934c3672fb45b6b3a20d8d402d"} Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.725628 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc"] Jan 21 15:30:44 crc kubenswrapper[4707]: W0121 15:30:44.730232 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a8c4cf_f6ea_4042_900f_37b63445f2d3.slice/crio-a979eeb753ce8d9094b23fdc36e09298caf104271440f1215986db741434564d WatchSource:0}: Error finding container a979eeb753ce8d9094b23fdc36e09298caf104271440f1215986db741434564d: Status 404 returned error can't find the container with id a979eeb753ce8d9094b23fdc36e09298caf104271440f1215986db741434564d Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.761770 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc"] Jan 21 15:30:44 crc kubenswrapper[4707]: W0121 15:30:44.765023 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf13b2937_d7b0_4dcb_8589_fd4c2c19cd5d.slice/crio-bc3a7ac457f1557e190e7ada8da56df0145d9f51008fb2587c9f70c1d2e39802 WatchSource:0}: Error finding container bc3a7ac457f1557e190e7ada8da56df0145d9f51008fb2587c9f70c1d2e39802: Status 404 returned error can't find the container with id bc3a7ac457f1557e190e7ada8da56df0145d9f51008fb2587c9f70c1d2e39802 Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.827383 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hc5fq"] Jan 21 15:30:44 crc kubenswrapper[4707]: W0121 15:30:44.829960 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde08c513_221d_41e8_a1b3_fc4b21d19173.slice/crio-a7ab69ebd30534cbb5860f5c335557eb6f4611f4be3810e64573bee5d0eac0d6 WatchSource:0}: Error finding container a7ab69ebd30534cbb5860f5c335557eb6f4611f4be3810e64573bee5d0eac0d6: Status 404 returned error can't find the container with id a7ab69ebd30534cbb5860f5c335557eb6f4611f4be3810e64573bee5d0eac0d6 Jan 21 15:30:44 crc kubenswrapper[4707]: I0121 15:30:44.917361 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-n7h88"] Jan 21 15:30:44 crc kubenswrapper[4707]: W0121 15:30:44.919533 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda77834d8_cde4_449d_b56f_6d3b53b0cadd.slice/crio-11691a91ffc6a4a2ee2258108c42be516920c5e04688f5afb539a8795f2741a5 WatchSource:0}: Error finding container 11691a91ffc6a4a2ee2258108c42be516920c5e04688f5afb539a8795f2741a5: Status 404 returned error can't find the container with id 11691a91ffc6a4a2ee2258108c42be516920c5e04688f5afb539a8795f2741a5 Jan 21 15:30:45 crc 
kubenswrapper[4707]: I0121 15:30:45.720974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" event={"ID":"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d","Type":"ContainerStarted","Data":"bc3a7ac457f1557e190e7ada8da56df0145d9f51008fb2587c9f70c1d2e39802"} Jan 21 15:30:45 crc kubenswrapper[4707]: I0121 15:30:45.722008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" event={"ID":"a77834d8-cde4-449d-b56f-6d3b53b0cadd","Type":"ContainerStarted","Data":"11691a91ffc6a4a2ee2258108c42be516920c5e04688f5afb539a8795f2741a5"} Jan 21 15:30:45 crc kubenswrapper[4707]: I0121 15:30:45.722975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" event={"ID":"de08c513-221d-41e8-a1b3-fc4b21d19173","Type":"ContainerStarted","Data":"a7ab69ebd30534cbb5860f5c335557eb6f4611f4be3810e64573bee5d0eac0d6"} Jan 21 15:30:45 crc kubenswrapper[4707]: I0121 15:30:45.723874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" event={"ID":"50a8c4cf-f6ea-4042-900f-37b63445f2d3","Type":"ContainerStarted","Data":"a979eeb753ce8d9094b23fdc36e09298caf104271440f1215986db741434564d"} Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.150582 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.151888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.153571 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.153685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.155977 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.156188 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.157007 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.157419 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-ftt6f" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.157540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.177082 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.183920 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc 
kubenswrapper[4707]: I0121 15:30:48.183994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmz8r\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-kube-api-access-wmz8r\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 
15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.184455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmz8r\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-kube-api-access-wmz8r\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289777 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.289876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.290397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.290637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.291148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.291943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.292849 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.293035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.302594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.303313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.304585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmz8r\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-kube-api-access-wmz8r\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.304978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.307635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.314632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-0\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:48 crc kubenswrapper[4707]: I0121 15:30:48.473096 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.227878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.534917 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.536121 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.540260 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-8p8k7" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.540416 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.542067 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.542313 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.549218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.559655 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk64m\" (UniqueName: \"kubernetes.io/projected/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kube-api-access-bk64m\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-default\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kolla-config\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.611506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk64m\" (UniqueName: \"kubernetes.io/projected/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kube-api-access-bk64m\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-default\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kolla-config\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712439 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.712468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.713105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-default\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.713107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kolla-config\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.713548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.722576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.722665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.725549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk64m\" (UniqueName: \"kubernetes.io/projected/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kube-api-access-bk64m\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.729388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.754144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" event={"ID":"50a8c4cf-f6ea-4042-900f-37b63445f2d3","Type":"ContainerStarted","Data":"1178ab2fd212ee459c631be454034aa98033debc77717fee42aa15c801a504d3"} Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.756565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" event={"ID":"b57455b4-f5cc-4e50-972f-299a31cc4eba","Type":"ContainerStarted","Data":"9ef39431d32b9dc2814ee46791b9f956cd576f3a879b4572fd610ccae3247609"} Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.758738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" event={"ID":"f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d","Type":"ContainerStarted","Data":"8599068be595e3835328185cfe25a7ce9d7e9245907747c94da3d4570b1e8595"} Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.761174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" event={"ID":"a77834d8-cde4-449d-b56f-6d3b53b0cadd","Type":"ContainerStarted","Data":"735b99fd97e2047ab7593f39b393f2090f1cb9aef81775268f07b43b56f9ca17"} Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.761576 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.774261 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc" podStartSLOduration=2.68794858 podStartE2EDuration="6.774248873s" podCreationTimestamp="2026-01-21 15:30:43 +0000 UTC" firstStartedPulling="2026-01-21 15:30:44.732433886 +0000 UTC m=+1741.913950108" lastFinishedPulling="2026-01-21 15:30:48.818734179 +0000 UTC m=+1746.000250401" observedRunningTime="2026-01-21 15:30:49.771097819 +0000 UTC m=+1746.952614042" watchObservedRunningTime="2026-01-21 15:30:49.774248873 +0000 UTC m=+1746.955765095" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.804774 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc" podStartSLOduration=2.753909503 podStartE2EDuration="6.804760073s" podCreationTimestamp="2026-01-21 15:30:43 +0000 UTC" firstStartedPulling="2026-01-21 15:30:44.767467418 +0000 UTC m=+1741.948983639" lastFinishedPulling="2026-01-21 15:30:48.818317986 +0000 UTC m=+1745.999834209" observedRunningTime="2026-01-21 15:30:49.800058535 +0000 UTC m=+1746.981574756" watchObservedRunningTime="2026-01-21 15:30:49.804760073 +0000 UTC m=+1746.986276295" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.830962 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vcvdr" podStartSLOduration=2.627600915 podStartE2EDuration="6.83095013s" podCreationTimestamp="2026-01-21 15:30:43 +0000 UTC" firstStartedPulling="2026-01-21 15:30:44.622572687 +0000 UTC m=+1741.804088909" lastFinishedPulling="2026-01-21 
15:30:48.825921902 +0000 UTC m=+1746.007438124" observedRunningTime="2026-01-21 15:30:49.826200811 +0000 UTC m=+1747.007717034" watchObservedRunningTime="2026-01-21 15:30:49.83095013 +0000 UTC m=+1747.012466352" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.850639 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:49 crc kubenswrapper[4707]: I0121 15:30:49.855510 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" podStartSLOduration=1.958402558 podStartE2EDuration="5.855473834s" podCreationTimestamp="2026-01-21 15:30:44 +0000 UTC" firstStartedPulling="2026-01-21 15:30:44.921606637 +0000 UTC m=+1742.103122859" lastFinishedPulling="2026-01-21 15:30:48.818677914 +0000 UTC m=+1746.000194135" observedRunningTime="2026-01-21 15:30:49.855393814 +0000 UTC m=+1747.036910046" watchObservedRunningTime="2026-01-21 15:30:49.855473834 +0000 UTC m=+1747.036990056" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.129315 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.130113 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.131657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-v67vm" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.131702 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.131927 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.135826 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.319393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.319699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kolla-config\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.319721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-config-data\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.319740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbjs\" (UniqueName: \"kubernetes.io/projected/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kube-api-access-zhbjs\") pod \"memcached-0\" (UID: 
\"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.319803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.420663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kolla-config\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.420700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-config-data\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.420722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbjs\" (UniqueName: \"kubernetes.io/projected/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kube-api-access-zhbjs\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.420755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.420787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.421689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kolla-config\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.423841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-config-data\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.438176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.440663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.442864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbjs\" (UniqueName: \"kubernetes.io/projected/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kube-api-access-zhbjs\") pod \"memcached-0\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:50 crc kubenswrapper[4707]: I0121 15:30:50.443823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:51 crc kubenswrapper[4707]: I0121 15:30:51.182878 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:30:51 crc kubenswrapper[4707]: E0121 15:30:51.183203 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:30:51 crc kubenswrapper[4707]: W0121 15:30:51.261095 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a47423_6eba_4cf7_ab2d_db338f4f28a6.slice/crio-395074a19a86cfe868b78ac61c2facf68e32e71f3a65253b963a8750379a1c39 WatchSource:0}: Error finding container 395074a19a86cfe868b78ac61c2facf68e32e71f3a65253b963a8750379a1c39: Status 404 returned error can't find the container with id 395074a19a86cfe868b78ac61c2facf68e32e71f3a65253b963a8750379a1c39 Jan 21 15:30:51 crc kubenswrapper[4707]: I0121 15:30:51.774371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"c1a47423-6eba-4cf7-ab2d-db338f4f28a6","Type":"ContainerStarted","Data":"395074a19a86cfe868b78ac61c2facf68e32e71f3a65253b963a8750379a1c39"} Jan 21 15:30:51 crc kubenswrapper[4707]: I0121 15:30:51.776042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" event={"ID":"de08c513-221d-41e8-a1b3-fc4b21d19173","Type":"ContainerStarted","Data":"905ebaf79d31121f7e239e1402adecde897113d8da8f74168afa20de100bea64"} Jan 21 15:30:51 crc kubenswrapper[4707]: I0121 15:30:51.796793 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" podStartSLOduration=2.301026668 podStartE2EDuration="8.796778576s" podCreationTimestamp="2026-01-21 15:30:43 +0000 UTC" firstStartedPulling="2026-01-21 15:30:44.832254777 +0000 UTC m=+1742.013770999" lastFinishedPulling="2026-01-21 15:30:51.328006685 +0000 UTC m=+1748.509522907" observedRunningTime="2026-01-21 15:30:51.793732257 +0000 UTC m=+1748.975248479" watchObservedRunningTime="2026-01-21 15:30:51.796778576 +0000 UTC m=+1748.978294798" Jan 21 15:30:51 crc kubenswrapper[4707]: I0121 15:30:51.850623 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:30:51 crc kubenswrapper[4707]: W0121 15:30:51.858067 4707 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b6c2ba2_268a_46f5_9ee1_93ff501d79de.slice/crio-1acfd68d1ac2310962a80937170e8b20c84d7103d5faaa175ef4a1e777180840 WatchSource:0}: Error finding container 1acfd68d1ac2310962a80937170e8b20c84d7103d5faaa175ef4a1e777180840: Status 404 returned error can't find the container with id 1acfd68d1ac2310962a80937170e8b20c84d7103d5faaa175ef4a1e777180840 Jan 21 15:30:51 crc kubenswrapper[4707]: I0121 15:30:51.921218 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.334245 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.334979 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.337227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-hrq5h" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.345834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.374819 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gdv\" (UniqueName: \"kubernetes.io/projected/0a86b73c-13a2-4142-819c-00d3db4ce9cd-kube-api-access-x7gdv\") pod \"kube-state-metrics-0\" (UID: \"0a86b73c-13a2-4142-819c-00d3db4ce9cd\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.475904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gdv\" (UniqueName: \"kubernetes.io/projected/0a86b73c-13a2-4142-819c-00d3db4ce9cd-kube-api-access-x7gdv\") pod \"kube-state-metrics-0\" (UID: \"0a86b73c-13a2-4142-819c-00d3db4ce9cd\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.492341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gdv\" (UniqueName: \"kubernetes.io/projected/0a86b73c-13a2-4142-819c-00d3db4ce9cd-kube-api-access-x7gdv\") pod \"kube-state-metrics-0\" (UID: \"0a86b73c-13a2-4142-819c-00d3db4ce9cd\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.647191 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.783337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"27f0dcee-d9f1-4bcd-9084-402a6f132ddc","Type":"ContainerStarted","Data":"84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a"} Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.783554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"27f0dcee-d9f1-4bcd-9084-402a6f132ddc","Type":"ContainerStarted","Data":"b529af99fddb0566ae94ef7081ab9434ade2f06a08d41401683903cbf948d077"} Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.785020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6b6c2ba2-268a-46f5-9ee1-93ff501d79de","Type":"ContainerStarted","Data":"beb5043a095378415d9c0c0cc410632d5c83cf638d9ebdc6c46b7366a9b1df05"} Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.785043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6b6c2ba2-268a-46f5-9ee1-93ff501d79de","Type":"ContainerStarted","Data":"1acfd68d1ac2310962a80937170e8b20c84d7103d5faaa175ef4a1e777180840"} Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.785143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.786054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"c1a47423-6eba-4cf7-ab2d-db338f4f28a6","Type":"ContainerStarted","Data":"a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d"} Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.786389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.787983 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-hc5fq" Jan 21 15:30:52 crc kubenswrapper[4707]: I0121 15:30:52.842185 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.8421701390000003 podStartE2EDuration="2.842170139s" podCreationTimestamp="2026-01-21 15:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:30:52.84162766 +0000 UTC m=+1750.023143882" watchObservedRunningTime="2026-01-21 15:30:52.842170139 +0000 UTC m=+1750.023686361" Jan 21 15:30:53 crc kubenswrapper[4707]: I0121 15:30:53.010833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:30:53 crc kubenswrapper[4707]: I0121 15:30:53.796846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0a86b73c-13a2-4142-819c-00d3db4ce9cd","Type":"ContainerStarted","Data":"a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5"} Jan 21 15:30:53 crc kubenswrapper[4707]: I0121 15:30:53.797068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" 
event={"ID":"0a86b73c-13a2-4142-819c-00d3db4ce9cd","Type":"ContainerStarted","Data":"45bd6777f067edf5baa854d100a2c0711696b9eb094ccab412e052761220acd9"} Jan 21 15:30:53 crc kubenswrapper[4707]: I0121 15:30:53.819657 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.515406961 podStartE2EDuration="1.819645643s" podCreationTimestamp="2026-01-21 15:30:52 +0000 UTC" firstStartedPulling="2026-01-21 15:30:53.019376035 +0000 UTC m=+1750.200892258" lastFinishedPulling="2026-01-21 15:30:53.323614718 +0000 UTC m=+1750.505130940" observedRunningTime="2026-01-21 15:30:53.815047697 +0000 UTC m=+1750.996563919" watchObservedRunningTime="2026-01-21 15:30:53.819645643 +0000 UTC m=+1751.001161865" Jan 21 15:30:54 crc kubenswrapper[4707]: I0121 15:30:54.495685 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-n7h88" Jan 21 15:30:54 crc kubenswrapper[4707]: I0121 15:30:54.802597 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:30:55 crc kubenswrapper[4707]: I0121 15:30:55.817602 4707 generic.go:334] "Generic (PLEG): container finished" podID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerID="84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a" exitCode=0 Jan 21 15:30:55 crc kubenswrapper[4707]: I0121 15:30:55.817681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"27f0dcee-d9f1-4bcd-9084-402a6f132ddc","Type":"ContainerDied","Data":"84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a"} Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.769457 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.770864 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.773517 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.773828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.778513 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.778828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.778868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-wwqsv" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.789536 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.826338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"27f0dcee-d9f1-4bcd-9084-402a6f132ddc","Type":"ContainerStarted","Data":"c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517"} Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.858226 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=8.858211212 podStartE2EDuration="8.858211212s" podCreationTimestamp="2026-01-21 15:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:30:56.856862627 +0000 UTC m=+1754.038378848" watchObservedRunningTime="2026-01-21 15:30:56.858211212 +0000 UTC m=+1754.039727434" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879x2\" (UniqueName: \"kubernetes.io/projected/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-kube-api-access-879x2\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-config\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:56 crc kubenswrapper[4707]: I0121 15:30:56.938914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-config\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879x2\" (UniqueName: \"kubernetes.io/projected/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-kube-api-access-879x2\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.040951 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.041229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.041666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-config\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.042662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.045442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.049339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.053841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.055235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879x2\" (UniqueName: \"kubernetes.io/projected/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-kube-api-access-879x2\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.065442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.084169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.646226 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:30:57 crc kubenswrapper[4707]: I0121 15:30:57.881265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5","Type":"ContainerStarted","Data":"42ac5c94e6abcfd6351ccd163c49c14943f7eb6b1e65192b2c6f324be55b3d61"} Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.333385 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.334652 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.337639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-web-config" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.339067 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-generated" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.339302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-tls-assets-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.342126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"metric-storage-alertmanager-dockercfg-dx2d2" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.342197 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-cluster-tls-config" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.350769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlnsz\" 
(UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-kube-api-access-hlnsz\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.465877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlnsz\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-kube-api-access-hlnsz\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.566942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: 
I0121 15:30:58.568205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.571152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.574238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.575231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.584133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.587193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.618218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlnsz\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-kube-api-access-hlnsz\") pod \"alertmanager-metric-storage-0\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.647103 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.907123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5","Type":"ContainerStarted","Data":"b87047431e2fca4a31a430cebbb8e750f9e99cbfe863d8680da80f4d138bf6ce"} Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.907402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5","Type":"ContainerStarted","Data":"6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5"} Jan 21 15:30:58 crc kubenswrapper[4707]: I0121 15:30:58.943203 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.943183384 podStartE2EDuration="3.943183384s" podCreationTimestamp="2026-01-21 15:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:30:58.942849195 +0000 UTC m=+1756.124365417" watchObservedRunningTime="2026-01-21 15:30:58.943183384 +0000 UTC m=+1756.124699606" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.024375 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.026509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.037592 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.037648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.037740 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-1" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.037837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.037851 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-web-config" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.037961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"metric-storage-prometheus-dockercfg-kh2t7" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.040204 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-tls-assets-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.040350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.040471 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-2" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.114856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7077aeb3-cc3a-41f0-882c-a29171cdda53-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-config\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177596 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.177627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnxj\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-kube-api-access-8mnxj\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnxj\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-kube-api-access-8mnxj\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7077aeb3-cc3a-41f0-882c-a29171cdda53-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.279619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-config\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.281613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.282041 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.282732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.283182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.287486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7077aeb3-cc3a-41f0-882c-a29171cdda53-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: 
I0121 15:30:59.290245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-config\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.292331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.305414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.308519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.335475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnxj\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-kube-api-access-8mnxj\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.361020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.614038 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.615326 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.622382 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.622668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-r9fj2" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.622875 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.623024 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.632659 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.650843 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vxd\" (UniqueName: \"kubernetes.io/projected/8376e997-f7b0-4432-997e-dbc14561fe5c-kube-api-access-95vxd\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-config\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc 
kubenswrapper[4707]: I0121 15:30:59.684869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.684921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vxd\" (UniqueName: \"kubernetes.io/projected/8376e997-f7b0-4432-997e-dbc14561fe5c-kube-api-access-95vxd\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-config\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787390 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.787462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.788392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-config\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.788431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.792085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.792686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.802472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.810003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.823414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vxd\" (UniqueName: \"kubernetes.io/projected/8376e997-f7b0-4432-997e-dbc14561fe5c-kube-api-access-95vxd\") pod \"ovsdbserver-sb-0\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.851276 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.851319 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.919277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerStarted","Data":"57e6012eff8c9cc8be5b8be5fe6af8ea9ffa9100011c6c1cede087099cce1640"} Jan 21 15:30:59 crc kubenswrapper[4707]: I0121 15:30:59.929181 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.084984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.113462 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.120993 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:31:00 crc kubenswrapper[4707]: W0121 15:31:00.125365 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7077aeb3_cc3a_41f0_882c_a29171cdda53.slice/crio-aea07a785d69297b821c4370ae38b084c5efe52194ba3afe34ce409577e9ec28 WatchSource:0}: Error finding container aea07a785d69297b821c4370ae38b084c5efe52194ba3afe34ce409577e9ec28: Status 404 returned error can't find the container with id aea07a785d69297b821c4370ae38b084c5efe52194ba3afe34ce409577e9ec28 Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.415510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.444921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:31:00 crc kubenswrapper[4707]: W0121 15:31:00.458879 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8376e997_f7b0_4432_997e_dbc14561fe5c.slice/crio-fa02a16dc3c529c5d340f129dbaee472fc66fdc1f3dccc2e1783fc42f2092f59 WatchSource:0}: Error finding container fa02a16dc3c529c5d340f129dbaee472fc66fdc1f3dccc2e1783fc42f2092f59: Status 404 returned error can't find the container with id fa02a16dc3c529c5d340f129dbaee472fc66fdc1f3dccc2e1783fc42f2092f59 Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.926264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerStarted","Data":"aea07a785d69297b821c4370ae38b084c5efe52194ba3afe34ce409577e9ec28"} Jan 21 15:31:00 crc kubenswrapper[4707]: I0121 15:31:00.927374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8376e997-f7b0-4432-997e-dbc14561fe5c","Type":"ContainerStarted","Data":"fa02a16dc3c529c5d340f129dbaee472fc66fdc1f3dccc2e1783fc42f2092f59"} Jan 21 15:31:00 crc 
kubenswrapper[4707]: I0121 15:31:00.927568 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:31:01 crc kubenswrapper[4707]: I0121 15:31:01.935095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8376e997-f7b0-4432-997e-dbc14561fe5c","Type":"ContainerStarted","Data":"e55d327aaedda39102f9591ab335da29319ef38cfe196bb243d1ce6b2b26dfed"} Jan 21 15:31:01 crc kubenswrapper[4707]: I0121 15:31:01.935135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8376e997-f7b0-4432-997e-dbc14561fe5c","Type":"ContainerStarted","Data":"8846341a10e36d0ac832f88ab6dd92585a6e11983d8927e565d6d370ee499686"} Jan 21 15:31:01 crc kubenswrapper[4707]: I0121 15:31:01.956228 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.9562130890000002 podStartE2EDuration="3.956213089s" podCreationTimestamp="2026-01-21 15:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:01.947461755 +0000 UTC m=+1759.128977976" watchObservedRunningTime="2026-01-21 15:31:01.956213089 +0000 UTC m=+1759.137729311" Jan 21 15:31:02 crc kubenswrapper[4707]: I0121 15:31:02.116679 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:31:02 crc kubenswrapper[4707]: I0121 15:31:02.651694 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:31:02 crc kubenswrapper[4707]: I0121 15:31:02.930056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.814845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.818953 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.820695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-4bg8m" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.820712 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.821195 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.821625 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.860610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.876588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-lock\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.876640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-cache\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.876666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x568p\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-kube-api-access-x568p\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.876689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.876708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " 
pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-lock\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-cache\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x568p\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-kube-api-access-x568p\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: E0121 15:31:03.978405 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:31:03 crc kubenswrapper[4707]: E0121 15:31:03.978429 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:31:03 crc kubenswrapper[4707]: E0121 15:31:03.978464 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift podName:4d1f2b5b-f47a-496d-ba02-5299848b53a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:31:04.478450402 +0000 UTC m=+1761.659966624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift") pod "swift-storage-0" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5") : configmap "swift-ring-files" not found Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978410 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-lock\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.978751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-cache\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:03 crc kubenswrapper[4707]: I0121 15:31:03.980071 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.000892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x568p\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-kube-api-access-x568p\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.019964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.041280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.182438 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:31:04 crc kubenswrapper[4707]: E0121 15:31:04.182884 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.198938 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-pprp9"] Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.199765 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.201893 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.206216 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.207955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.211592 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-pprp9"] Jan 21 15:31:04 crc kubenswrapper[4707]: E0121 15:31:04.213981 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-j5q72 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-j5q72 ring-data-devices scripts swiftconf]: context canceled" pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" podUID="7593cccd-e50d-4d53-88fe-dcc773ed4d5a" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.223694 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-dw9hb"] Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.224560 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.237614 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-dw9hb"] Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.269382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-pprp9"] Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.281625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-ring-data-devices\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.281688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m795\" (UniqueName: \"kubernetes.io/projected/6113492f-9c05-4511-956e-8f19e045260d-kube-api-access-6m795\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.281770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-combined-ca-bundle\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.281961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-swiftconf\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-ring-data-devices\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-dispersionconf\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113492f-9c05-4511-956e-8f19e045260d-etc-swift\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-etc-swift\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5q72\" (UniqueName: \"kubernetes.io/projected/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-kube-api-access-j5q72\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-swiftconf\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-scripts\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-dispersionconf\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 
15:31:04.282327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-scripts\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.282469 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-ring-data-devices\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-dispersionconf\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113492f-9c05-4511-956e-8f19e045260d-etc-swift\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-etc-swift\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5q72\" (UniqueName: \"kubernetes.io/projected/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-kube-api-access-j5q72\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-swiftconf\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-scripts\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc 
kubenswrapper[4707]: I0121 15:31:04.384215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-scripts\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-dispersionconf\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-ring-data-devices\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m795\" (UniqueName: \"kubernetes.io/projected/6113492f-9c05-4511-956e-8f19e045260d-kube-api-access-6m795\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-combined-ca-bundle\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-swiftconf\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113492f-9c05-4511-956e-8f19e045260d-etc-swift\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.384528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-etc-swift\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc 
kubenswrapper[4707]: I0121 15:31:04.384794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-ring-data-devices\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.385089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-scripts\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.385153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-scripts\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.389261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-ring-data-devices\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.393459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.394346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-dispersionconf\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.394565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-combined-ca-bundle\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.394970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-dispersionconf\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.397673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-swiftconf\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.398100 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-swiftconf\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.398371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5q72\" (UniqueName: \"kubernetes.io/projected/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-kube-api-access-j5q72\") pod \"swift-ring-rebalance-pprp9\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.431974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m795\" (UniqueName: \"kubernetes.io/projected/6113492f-9c05-4511-956e-8f19e045260d-kube-api-access-6m795\") pod \"swift-ring-rebalance-dw9hb\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.485450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:04 crc kubenswrapper[4707]: E0121 15:31:04.485581 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:31:04 crc kubenswrapper[4707]: E0121 15:31:04.485606 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:31:04 crc kubenswrapper[4707]: E0121 15:31:04.485652 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift podName:4d1f2b5b-f47a-496d-ba02-5299848b53a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:31:05.485636352 +0000 UTC m=+1762.667152574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift") pod "swift-storage-0" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5") : configmap "swift-ring-files" not found Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.536411 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.929842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.954999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerStarted","Data":"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892"} Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.956320 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.956318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerStarted","Data":"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a"} Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.965055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.993872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-dispersionconf\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-scripts\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-scripts" (OuterVolumeSpecName: "scripts") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-ring-data-devices\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-etc-swift\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-combined-ca-bundle\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-swiftconf\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.994761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5q72\" (UniqueName: \"kubernetes.io/projected/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-kube-api-access-j5q72\") pod \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\" (UID: \"7593cccd-e50d-4d53-88fe-dcc773ed4d5a\") " Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.996046 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.996085 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:04 crc kubenswrapper[4707]: I0121 15:31:04.996096 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:04.999936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.000101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.001938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.001952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-kube-api-access-j5q72" (OuterVolumeSpecName: "kube-api-access-j5q72") pod "7593cccd-e50d-4d53-88fe-dcc773ed4d5a" (UID: "7593cccd-e50d-4d53-88fe-dcc773ed4d5a"). InnerVolumeSpecName "kube-api-access-j5q72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.009558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-dw9hb"] Jan 21 15:31:05 crc kubenswrapper[4707]: W0121 15:31:05.030253 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6113492f_9c05_4511_956e_8f19e045260d.slice/crio-a663067d3f8c6b5fc6fa3d85cb651ec6a01483ba0b0f5d4fafa7c1be4bf7e1a2 WatchSource:0}: Error finding container a663067d3f8c6b5fc6fa3d85cb651ec6a01483ba0b0f5d4fafa7c1be4bf7e1a2: Status 404 returned error can't find the container with id a663067d3f8c6b5fc6fa3d85cb651ec6a01483ba0b0f5d4fafa7c1be4bf7e1a2 Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.097266 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.097295 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.097307 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5q72\" (UniqueName: \"kubernetes.io/projected/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-kube-api-access-j5q72\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.097317 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7593cccd-e50d-4d53-88fe-dcc773ed4d5a-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.502711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:05 crc kubenswrapper[4707]: E0121 15:31:05.502912 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:31:05 crc kubenswrapper[4707]: E0121 15:31:05.502936 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:31:05 crc kubenswrapper[4707]: E0121 
15:31:05.502990 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift podName:4d1f2b5b-f47a-496d-ba02-5299848b53a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:31:07.502974878 +0000 UTC m=+1764.684491100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift") pod "swift-storage-0" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5") : configmap "swift-ring-files" not found Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.961874 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.963623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-pprp9" Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.963670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" event={"ID":"6113492f-9c05-4511-956e-8f19e045260d","Type":"ContainerStarted","Data":"eddf6b664ed9079fcbf3b5395ca851df02ea61755652f68ba2a34e4c789441bb"} Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.963708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" event={"ID":"6113492f-9c05-4511-956e-8f19e045260d","Type":"ContainerStarted","Data":"a663067d3f8c6b5fc6fa3d85cb651ec6a01483ba0b0f5d4fafa7c1be4bf7e1a2"} Jan 21 15:31:05 crc kubenswrapper[4707]: I0121 15:31:05.987067 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" podStartSLOduration=1.98705401 podStartE2EDuration="1.98705401s" podCreationTimestamp="2026-01-21 15:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:05.985579257 +0000 UTC m=+1763.167095479" watchObservedRunningTime="2026-01-21 15:31:05.98705401 +0000 UTC m=+1763.168570232" Jan 21 15:31:06 crc kubenswrapper[4707]: I0121 15:31:06.002641 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-pprp9"] Jan 21 15:31:06 crc kubenswrapper[4707]: I0121 15:31:06.006467 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-pprp9"] Jan 21 15:31:07 crc kubenswrapper[4707]: I0121 15:31:07.189985 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7593cccd-e50d-4d53-88fe-dcc773ed4d5a" path="/var/lib/kubelet/pods/7593cccd-e50d-4d53-88fe-dcc773ed4d5a/volumes" Jan 21 15:31:07 crc kubenswrapper[4707]: I0121 15:31:07.532057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:07 crc kubenswrapper[4707]: E0121 15:31:07.532216 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:31:07 crc kubenswrapper[4707]: E0121 15:31:07.532239 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not 
found Jan 21 15:31:07 crc kubenswrapper[4707]: E0121 15:31:07.532284 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift podName:4d1f2b5b-f47a-496d-ba02-5299848b53a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:31:11.53226927 +0000 UTC m=+1768.713785492 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift") pod "swift-storage-0" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5") : configmap "swift-ring-files" not found Jan 21 15:31:07 crc kubenswrapper[4707]: I0121 15:31:07.976428 4707 generic.go:334] "Generic (PLEG): container finished" podID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerID="2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892" exitCode=0 Jan 21 15:31:07 crc kubenswrapper[4707]: I0121 15:31:07.976464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerDied","Data":"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892"} Jan 21 15:31:08 crc kubenswrapper[4707]: E0121 15:31:08.104262 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e034ee_7d6e_45f4_a7a5_d3ddc3a9bc7b.slice/crio-conmon-2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.526010 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-6nmsr"] Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.527087 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.529625 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.535380 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-6nmsr"] Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.550854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae5397d9-c1f9-472c-b182-06dbbc26381a-operator-scripts\") pod \"root-account-create-update-6nmsr\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.550959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dcrv\" (UniqueName: \"kubernetes.io/projected/ae5397d9-c1f9-472c-b182-06dbbc26381a-kube-api-access-8dcrv\") pod \"root-account-create-update-6nmsr\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.651927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae5397d9-c1f9-472c-b182-06dbbc26381a-operator-scripts\") pod \"root-account-create-update-6nmsr\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.652061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dcrv\" (UniqueName: \"kubernetes.io/projected/ae5397d9-c1f9-472c-b182-06dbbc26381a-kube-api-access-8dcrv\") pod \"root-account-create-update-6nmsr\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.652650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae5397d9-c1f9-472c-b182-06dbbc26381a-operator-scripts\") pod \"root-account-create-update-6nmsr\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.666659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dcrv\" (UniqueName: \"kubernetes.io/projected/ae5397d9-c1f9-472c-b182-06dbbc26381a-kube-api-access-8dcrv\") pod \"root-account-create-update-6nmsr\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.838966 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.990786 4707 generic.go:334] "Generic (PLEG): container finished" podID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerID="354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a" exitCode=0 Jan 21 15:31:08 crc kubenswrapper[4707]: I0121 15:31:08.990870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerDied","Data":"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a"} Jan 21 15:31:09 crc kubenswrapper[4707]: I0121 15:31:09.234316 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-6nmsr"] Jan 21 15:31:09 crc kubenswrapper[4707]: W0121 15:31:09.241996 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae5397d9_c1f9_472c_b182_06dbbc26381a.slice/crio-65ff0e0d0d87c2f747f2b409088c66720a9de662398b8b763df845242bfa28eb WatchSource:0}: Error finding container 65ff0e0d0d87c2f747f2b409088c66720a9de662398b8b763df845242bfa28eb: Status 404 returned error can't find the container with id 65ff0e0d0d87c2f747f2b409088c66720a9de662398b8b763df845242bfa28eb Jan 21 15:31:09 crc kubenswrapper[4707]: I0121 15:31:09.959899 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.001373 4707 generic.go:334] "Generic (PLEG): container finished" podID="ae5397d9-c1f9-472c-b182-06dbbc26381a" containerID="853a930f49c1f2edf2a101608783efd9deb9725c99a7dea6d7832faaaf26cc05" exitCode=0 Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.001415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" event={"ID":"ae5397d9-c1f9-472c-b182-06dbbc26381a","Type":"ContainerDied","Data":"853a930f49c1f2edf2a101608783efd9deb9725c99a7dea6d7832faaaf26cc05"} Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.001683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" event={"ID":"ae5397d9-c1f9-472c-b182-06dbbc26381a","Type":"ContainerStarted","Data":"65ff0e0d0d87c2f747f2b409088c66720a9de662398b8b763df845242bfa28eb"} Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.011009 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-qc6lv"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.017302 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.022745 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-qc6lv"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.114428 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.115335 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.118274 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.126776 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.173797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rnd\" (UniqueName: \"kubernetes.io/projected/630582cc-d708-415d-88da-06be4d720728-kube-api-access-j2rnd\") pod \"keystone-db-create-qc6lv\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.173912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630582cc-d708-415d-88da-06be4d720728-operator-scripts\") pod \"keystone-db-create-qc6lv\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.229848 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.230957 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.232322 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-blszt" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.232470 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.232494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.232997 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.239287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.275778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jckfn\" (UniqueName: \"kubernetes.io/projected/b2678839-9b6d-4d68-aa48-c2b42793ea0b-kube-api-access-jckfn\") pod \"keystone-cd48-account-create-update-cnjfw\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.275979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2678839-9b6d-4d68-aa48-c2b42793ea0b-operator-scripts\") pod \"keystone-cd48-account-create-update-cnjfw\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.276013 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j2rnd\" (UniqueName: \"kubernetes.io/projected/630582cc-d708-415d-88da-06be4d720728-kube-api-access-j2rnd\") pod \"keystone-db-create-qc6lv\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.276036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630582cc-d708-415d-88da-06be4d720728-operator-scripts\") pod \"keystone-db-create-qc6lv\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.276887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630582cc-d708-415d-88da-06be4d720728-operator-scripts\") pod \"keystone-db-create-qc6lv\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.295960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rnd\" (UniqueName: \"kubernetes.io/projected/630582cc-d708-415d-88da-06be4d720728-kube-api-access-j2rnd\") pod \"keystone-db-create-qc6lv\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.331608 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.343050 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-2mnwh"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.344132 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.349540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-2mnwh"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.377635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.377695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-scripts\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.377740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.377791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jckfn\" (UniqueName: \"kubernetes.io/projected/b2678839-9b6d-4d68-aa48-c2b42793ea0b-kube-api-access-jckfn\") pod \"keystone-cd48-account-create-update-cnjfw\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.377861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwt5b\" (UniqueName: \"kubernetes.io/projected/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-kube-api-access-xwt5b\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.378188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.378440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-config\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.378529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2678839-9b6d-4d68-aa48-c2b42793ea0b-operator-scripts\") pod \"keystone-cd48-account-create-update-cnjfw\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.378778 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.379706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2678839-9b6d-4d68-aa48-c2b42793ea0b-operator-scripts\") pod \"keystone-cd48-account-create-update-cnjfw\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.391885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jckfn\" (UniqueName: \"kubernetes.io/projected/b2678839-9b6d-4d68-aa48-c2b42793ea0b-kube-api-access-jckfn\") pod \"keystone-cd48-account-create-update-cnjfw\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.430637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d991e26-3d69-43bb-a18b-67aa7819a210-operator-scripts\") pod \"placement-db-create-2mnwh\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-config\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-scripts\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnf9h\" (UniqueName: \"kubernetes.io/projected/6d991e26-3d69-43bb-a18b-67aa7819a210-kube-api-access-pnf9h\") pod 
\"placement-db-create-2mnwh\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwt5b\" (UniqueName: \"kubernetes.io/projected/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-kube-api-access-xwt5b\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.480468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.481669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-scripts\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.482211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-config\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.482247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.486775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.488489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.493338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.493388 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/placement-9339-account-create-update-2f5kd"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.494325 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.497583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.498112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwt5b\" (UniqueName: \"kubernetes.io/projected/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-kube-api-access-xwt5b\") pod \"ovn-northd-0\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.513601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-2f5kd"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.544357 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.582460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnf9h\" (UniqueName: \"kubernetes.io/projected/6d991e26-3d69-43bb-a18b-67aa7819a210-kube-api-access-pnf9h\") pod \"placement-db-create-2mnwh\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.582708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d991e26-3d69-43bb-a18b-67aa7819a210-operator-scripts\") pod \"placement-db-create-2mnwh\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.582774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp42c\" (UniqueName: \"kubernetes.io/projected/5c499192-7911-4564-b833-1a941ba515a9-kube-api-access-wp42c\") pod \"placement-9339-account-create-update-2f5kd\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.582799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c499192-7911-4564-b833-1a941ba515a9-operator-scripts\") pod \"placement-9339-account-create-update-2f5kd\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.583792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d991e26-3d69-43bb-a18b-67aa7819a210-operator-scripts\") pod \"placement-db-create-2mnwh\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.600099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnf9h\" (UniqueName: 
\"kubernetes.io/projected/6d991e26-3d69-43bb-a18b-67aa7819a210-kube-api-access-pnf9h\") pod \"placement-db-create-2mnwh\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.664998 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.684327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp42c\" (UniqueName: \"kubernetes.io/projected/5c499192-7911-4564-b833-1a941ba515a9-kube-api-access-wp42c\") pod \"placement-9339-account-create-update-2f5kd\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.684375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c499192-7911-4564-b833-1a941ba515a9-operator-scripts\") pod \"placement-9339-account-create-update-2f5kd\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.685030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c499192-7911-4564-b833-1a941ba515a9-operator-scripts\") pod \"placement-9339-account-create-update-2f5kd\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.701215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp42c\" (UniqueName: \"kubernetes.io/projected/5c499192-7911-4564-b833-1a941ba515a9-kube-api-access-wp42c\") pod \"placement-9339-account-create-update-2f5kd\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.717188 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-r8f6v"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.718026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.740697 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-r8f6v"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.819326 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-2t28b"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.820207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.822846 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.827036 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.836280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-2t28b"] Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.888023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2st\" (UniqueName: \"kubernetes.io/projected/11d59352-da6e-45fa-a721-5f7dc6361200-kube-api-access-lq2st\") pod \"glance-db-create-r8f6v\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.888090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d59352-da6e-45fa-a721-5f7dc6361200-operator-scripts\") pod \"glance-db-create-r8f6v\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.932881 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-qc6lv"] Jan 21 15:31:10 crc kubenswrapper[4707]: W0121 15:31:10.951624 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630582cc_d708_415d_88da_06be4d720728.slice/crio-adce826b43c57a6c0193db3e54a584e32fc9ac4ef0b2d993bda0aabdd0d68582 WatchSource:0}: Error finding container adce826b43c57a6c0193db3e54a584e32fc9ac4ef0b2d993bda0aabdd0d68582: Status 404 returned error can't find the container with id adce826b43c57a6c0193db3e54a584e32fc9ac4ef0b2d993bda0aabdd0d68582 Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.989431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2st\" (UniqueName: \"kubernetes.io/projected/11d59352-da6e-45fa-a721-5f7dc6361200-kube-api-access-lq2st\") pod \"glance-db-create-r8f6v\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.989529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c38ceab-e8d7-47cb-bf82-437d544633dc-operator-scripts\") pod \"glance-4d50-account-create-update-2t28b\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.989589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d59352-da6e-45fa-a721-5f7dc6361200-operator-scripts\") pod \"glance-db-create-r8f6v\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:10 crc kubenswrapper[4707]: I0121 15:31:10.989637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt7ps\" (UniqueName: \"kubernetes.io/projected/5c38ceab-e8d7-47cb-bf82-437d544633dc-kube-api-access-rt7ps\") pod \"glance-4d50-account-create-update-2t28b\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:10 crc 
kubenswrapper[4707]: I0121 15:31:10.990405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d59352-da6e-45fa-a721-5f7dc6361200-operator-scripts\") pod \"glance-db-create-r8f6v\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.015497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerStarted","Data":"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1"} Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.021084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" event={"ID":"630582cc-d708-415d-88da-06be4d720728","Type":"ContainerStarted","Data":"adce826b43c57a6c0193db3e54a584e32fc9ac4ef0b2d993bda0aabdd0d68582"} Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.023149 4707 generic.go:334] "Generic (PLEG): container finished" podID="6113492f-9c05-4511-956e-8f19e045260d" containerID="eddf6b664ed9079fcbf3b5395ca851df02ea61755652f68ba2a34e4c789441bb" exitCode=0 Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.023228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" event={"ID":"6113492f-9c05-4511-956e-8f19e045260d","Type":"ContainerDied","Data":"eddf6b664ed9079fcbf3b5395ca851df02ea61755652f68ba2a34e4c789441bb"} Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.053585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2st\" (UniqueName: \"kubernetes.io/projected/11d59352-da6e-45fa-a721-5f7dc6361200-kube-api-access-lq2st\") pod \"glance-db-create-r8f6v\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.066331 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw"] Jan 21 15:31:11 crc kubenswrapper[4707]: W0121 15:31:11.073701 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2678839_9b6d_4d68_aa48_c2b42793ea0b.slice/crio-6d53c61e40536302919e23fd82014ecf72a27edc45d4906592eed44c43efde2a WatchSource:0}: Error finding container 6d53c61e40536302919e23fd82014ecf72a27edc45d4906592eed44c43efde2a: Status 404 returned error can't find the container with id 6d53c61e40536302919e23fd82014ecf72a27edc45d4906592eed44c43efde2a Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.091670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c38ceab-e8d7-47cb-bf82-437d544633dc-operator-scripts\") pod \"glance-4d50-account-create-update-2t28b\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.091792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt7ps\" (UniqueName: \"kubernetes.io/projected/5c38ceab-e8d7-47cb-bf82-437d544633dc-kube-api-access-rt7ps\") pod \"glance-4d50-account-create-update-2t28b\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " 
pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.092329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c38ceab-e8d7-47cb-bf82-437d544633dc-operator-scripts\") pod \"glance-4d50-account-create-update-2t28b\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.105249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt7ps\" (UniqueName: \"kubernetes.io/projected/5c38ceab-e8d7-47cb-bf82-437d544633dc-kube-api-access-rt7ps\") pod \"glance-4d50-account-create-update-2t28b\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.119372 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:31:11 crc kubenswrapper[4707]: W0121 15:31:11.123712 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod781be74c_c8d5_4cfa_8bb1_5a0434bb3823.slice/crio-77fb2fc44af5392306954d6db181bef200ccdb5ef65ec7a483ce8ae026d82ffa WatchSource:0}: Error finding container 77fb2fc44af5392306954d6db181bef200ccdb5ef65ec7a483ce8ae026d82ffa: Status 404 returned error can't find the container with id 77fb2fc44af5392306954d6db181bef200ccdb5ef65ec7a483ce8ae026d82ffa Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.146378 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.191321 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-2mnwh"] Jan 21 15:31:11 crc kubenswrapper[4707]: W0121 15:31:11.197526 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d991e26_3d69_43bb_a18b_67aa7819a210.slice/crio-88063fb1d50901b0d5327864191e3de3bc2f866ba57be53c06ccf4f44e207d78 WatchSource:0}: Error finding container 88063fb1d50901b0d5327864191e3de3bc2f866ba57be53c06ccf4f44e207d78: Status 404 returned error can't find the container with id 88063fb1d50901b0d5327864191e3de3bc2f866ba57be53c06ccf4f44e207d78 Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.347883 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.360455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-2f5kd"] Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.593102 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.596367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-2t28b"] Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.602724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.610761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"swift-storage-0\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:11 crc kubenswrapper[4707]: W0121 15:31:11.611280 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c38ceab_e8d7_47cb_bf82_437d544633dc.slice/crio-7bd553f81712a5a7672d3137e6f4e3ce6f00d97e0110fffd7520924c8000af0d WatchSource:0}: Error finding container 7bd553f81712a5a7672d3137e6f4e3ce6f00d97e0110fffd7520924c8000af0d: Status 404 returned error can't find the container with id 7bd553f81712a5a7672d3137e6f4e3ce6f00d97e0110fffd7520924c8000af0d Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.631109 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.703339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dcrv\" (UniqueName: \"kubernetes.io/projected/ae5397d9-c1f9-472c-b182-06dbbc26381a-kube-api-access-8dcrv\") pod \"ae5397d9-c1f9-472c-b182-06dbbc26381a\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.703576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae5397d9-c1f9-472c-b182-06dbbc26381a-operator-scripts\") pod \"ae5397d9-c1f9-472c-b182-06dbbc26381a\" (UID: \"ae5397d9-c1f9-472c-b182-06dbbc26381a\") " Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.704473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5397d9-c1f9-472c-b182-06dbbc26381a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae5397d9-c1f9-472c-b182-06dbbc26381a" (UID: "ae5397d9-c1f9-472c-b182-06dbbc26381a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.710045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5397d9-c1f9-472c-b182-06dbbc26381a-kube-api-access-8dcrv" (OuterVolumeSpecName: "kube-api-access-8dcrv") pod "ae5397d9-c1f9-472c-b182-06dbbc26381a" (UID: "ae5397d9-c1f9-472c-b182-06dbbc26381a"). InnerVolumeSpecName "kube-api-access-8dcrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.786970 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-r8f6v"] Jan 21 15:31:11 crc kubenswrapper[4707]: W0121 15:31:11.794656 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11d59352_da6e_45fa_a721_5f7dc6361200.slice/crio-3dd9fd3c703c3cd2065ebfee52e30409bfa829aa14eed7ca4d8e4d6a5017d11c WatchSource:0}: Error finding container 3dd9fd3c703c3cd2065ebfee52e30409bfa829aa14eed7ca4d8e4d6a5017d11c: Status 404 returned error can't find the container with id 3dd9fd3c703c3cd2065ebfee52e30409bfa829aa14eed7ca4d8e4d6a5017d11c Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.806170 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae5397d9-c1f9-472c-b182-06dbbc26381a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:11 crc kubenswrapper[4707]: I0121 15:31:11.806198 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dcrv\" (UniqueName: \"kubernetes.io/projected/ae5397d9-c1f9-472c-b182-06dbbc26381a-kube-api-access-8dcrv\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.029411 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.034336 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2678839-9b6d-4d68-aa48-c2b42793ea0b" containerID="f0d68dbc7cf9b3d8e58f408cefe224a119cce64c438f26782bacec119e7d78d0" exitCode=0 Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.034409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" event={"ID":"b2678839-9b6d-4d68-aa48-c2b42793ea0b","Type":"ContainerDied","Data":"f0d68dbc7cf9b3d8e58f408cefe224a119cce64c438f26782bacec119e7d78d0"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.034614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" event={"ID":"b2678839-9b6d-4d68-aa48-c2b42793ea0b","Type":"ContainerStarted","Data":"6d53c61e40536302919e23fd82014ecf72a27edc45d4906592eed44c43efde2a"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.038017 4707 generic.go:334] "Generic (PLEG): container finished" podID="11d59352-da6e-45fa-a721-5f7dc6361200" containerID="28fc022c66807df1e793d0893ff1d6a282fe23d532bc5ee33036ce44ac8e5248" exitCode=0 Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.038055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-r8f6v" event={"ID":"11d59352-da6e-45fa-a721-5f7dc6361200","Type":"ContainerDied","Data":"28fc022c66807df1e793d0893ff1d6a282fe23d532bc5ee33036ce44ac8e5248"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.038070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-r8f6v" event={"ID":"11d59352-da6e-45fa-a721-5f7dc6361200","Type":"ContainerStarted","Data":"3dd9fd3c703c3cd2065ebfee52e30409bfa829aa14eed7ca4d8e4d6a5017d11c"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.038960 4707 generic.go:334] "Generic (PLEG): container finished" podID="630582cc-d708-415d-88da-06be4d720728" 
containerID="4c572740964e4167e045c95d8fd1045f5bdbeb75371d01f2192b08d69680f85b" exitCode=0 Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.038989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" event={"ID":"630582cc-d708-415d-88da-06be4d720728","Type":"ContainerDied","Data":"4c572740964e4167e045c95d8fd1045f5bdbeb75371d01f2192b08d69680f85b"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.045644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"781be74c-c8d5-4cfa-8bb1-5a0434bb3823","Type":"ContainerStarted","Data":"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.045670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"781be74c-c8d5-4cfa-8bb1-5a0434bb3823","Type":"ContainerStarted","Data":"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.045682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"781be74c-c8d5-4cfa-8bb1-5a0434bb3823","Type":"ContainerStarted","Data":"77fb2fc44af5392306954d6db181bef200ccdb5ef65ec7a483ce8ae026d82ffa"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.045770 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.047675 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c499192-7911-4564-b833-1a941ba515a9" containerID="0a6802cdbd1648fbd84c2134028f81c3fbdce8d1f1ef02e2484c6860d637820d" exitCode=0 Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.047721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" event={"ID":"5c499192-7911-4564-b833-1a941ba515a9","Type":"ContainerDied","Data":"0a6802cdbd1648fbd84c2134028f81c3fbdce8d1f1ef02e2484c6860d637820d"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.047742 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" event={"ID":"5c499192-7911-4564-b833-1a941ba515a9","Type":"ContainerStarted","Data":"23488882f99c4ce90e77df15400848545884b1b80d54df1733e4be0c41304ba4"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.066515 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d991e26-3d69-43bb-a18b-67aa7819a210" containerID="c11a387cd3a3b4d0292ec4d1ea2b8b17f7cb347c49dd9f6cb4898b7d8632656c" exitCode=0 Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.066567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-2mnwh" event={"ID":"6d991e26-3d69-43bb-a18b-67aa7819a210","Type":"ContainerDied","Data":"c11a387cd3a3b4d0292ec4d1ea2b8b17f7cb347c49dd9f6cb4898b7d8632656c"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.066587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-2mnwh" event={"ID":"6d991e26-3d69-43bb-a18b-67aa7819a210","Type":"ContainerStarted","Data":"88063fb1d50901b0d5327864191e3de3bc2f866ba57be53c06ccf4f44e207d78"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.081973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" 
event={"ID":"ae5397d9-c1f9-472c-b182-06dbbc26381a","Type":"ContainerDied","Data":"65ff0e0d0d87c2f747f2b409088c66720a9de662398b8b763df845242bfa28eb"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.081999 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65ff0e0d0d87c2f747f2b409088c66720a9de662398b8b763df845242bfa28eb" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.082041 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-6nmsr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.092861 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c38ceab-e8d7-47cb-bf82-437d544633dc" containerID="7338268f8d3bd6f4dbade36699445b162a93a58535d079d9e6e3ae94779deb2b" exitCode=0 Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.092915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" event={"ID":"5c38ceab-e8d7-47cb-bf82-437d544633dc","Type":"ContainerDied","Data":"7338268f8d3bd6f4dbade36699445b162a93a58535d079d9e6e3ae94779deb2b"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.092957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" event={"ID":"5c38ceab-e8d7-47cb-bf82-437d544633dc","Type":"ContainerStarted","Data":"7bd553f81712a5a7672d3137e6f4e3ce6f00d97e0110fffd7520924c8000af0d"} Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.108928 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.108917628 podStartE2EDuration="2.108917628s" podCreationTimestamp="2026-01-21 15:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:12.108533967 +0000 UTC m=+1769.290050189" watchObservedRunningTime="2026-01-21 15:31:12.108917628 +0000 UTC m=+1769.290433850" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.550375 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.624307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-swiftconf\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.625586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m795\" (UniqueName: \"kubernetes.io/projected/6113492f-9c05-4511-956e-8f19e045260d-kube-api-access-6m795\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.625614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-ring-data-devices\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.628445 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.629839 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-scripts\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.629915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-dispersionconf\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.629936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-combined-ca-bundle\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.629961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113492f-9c05-4511-956e-8f19e045260d-etc-swift\") pod \"6113492f-9c05-4511-956e-8f19e045260d\" (UID: \"6113492f-9c05-4511-956e-8f19e045260d\") " Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.630297 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.630957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6113492f-9c05-4511-956e-8f19e045260d-kube-api-access-6m795" (OuterVolumeSpecName: "kube-api-access-6m795") pod 
"6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "kube-api-access-6m795". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.631227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6113492f-9c05-4511-956e-8f19e045260d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.635832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.644521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-scripts" (OuterVolumeSpecName: "scripts") pod "6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.644623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.650322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6113492f-9c05-4511-956e-8f19e045260d" (UID: "6113492f-9c05-4511-956e-8f19e045260d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.731256 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113492f-9c05-4511-956e-8f19e045260d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.731482 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.731494 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.731502 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113492f-9c05-4511-956e-8f19e045260d-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.731510 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113492f-9c05-4511-956e-8f19e045260d-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.731517 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m795\" (UniqueName: \"kubernetes.io/projected/6113492f-9c05-4511-956e-8f19e045260d-kube-api-access-6m795\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.744286 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs"] Jan 21 15:31:12 crc kubenswrapper[4707]: E0121 15:31:12.744579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5397d9-c1f9-472c-b182-06dbbc26381a" containerName="mariadb-account-create-update" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.744597 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5397d9-c1f9-472c-b182-06dbbc26381a" containerName="mariadb-account-create-update" Jan 21 15:31:12 crc kubenswrapper[4707]: E0121 15:31:12.744614 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6113492f-9c05-4511-956e-8f19e045260d" containerName="swift-ring-rebalance" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.744621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6113492f-9c05-4511-956e-8f19e045260d" containerName="swift-ring-rebalance" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.744754 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5397d9-c1f9-472c-b182-06dbbc26381a" containerName="mariadb-account-create-update" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.744782 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6113492f-9c05-4511-956e-8f19e045260d" containerName="swift-ring-rebalance" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.745641 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.748443 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-db-secret" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.750987 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-db-create-tnspr"] Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.751936 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.761696 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs"] Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.766198 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-tnspr"] Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.833147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96slf\" (UniqueName: \"kubernetes.io/projected/15438c80-41a4-474c-8953-1cb1146c4dab-kube-api-access-96slf\") pod \"watcher-db-create-tnspr\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.833213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk4tl\" (UniqueName: \"kubernetes.io/projected/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-kube-api-access-sk4tl\") pod \"watcher-6b65-account-create-update-gmzcs\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.833297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-operator-scripts\") pod \"watcher-6b65-account-create-update-gmzcs\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.833319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15438c80-41a4-474c-8953-1cb1146c4dab-operator-scripts\") pod \"watcher-db-create-tnspr\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.934445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96slf\" (UniqueName: \"kubernetes.io/projected/15438c80-41a4-474c-8953-1cb1146c4dab-kube-api-access-96slf\") pod \"watcher-db-create-tnspr\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.934495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk4tl\" (UniqueName: \"kubernetes.io/projected/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-kube-api-access-sk4tl\") pod \"watcher-6b65-account-create-update-gmzcs\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " 
pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.936730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-operator-scripts\") pod \"watcher-6b65-account-create-update-gmzcs\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.936934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-operator-scripts\") pod \"watcher-6b65-account-create-update-gmzcs\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.936971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15438c80-41a4-474c-8953-1cb1146c4dab-operator-scripts\") pod \"watcher-db-create-tnspr\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.937604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15438c80-41a4-474c-8953-1cb1146c4dab-operator-scripts\") pod \"watcher-db-create-tnspr\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.949242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96slf\" (UniqueName: \"kubernetes.io/projected/15438c80-41a4-474c-8953-1cb1146c4dab-kube-api-access-96slf\") pod \"watcher-db-create-tnspr\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:12 crc kubenswrapper[4707]: I0121 15:31:12.951240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk4tl\" (UniqueName: \"kubernetes.io/projected/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-kube-api-access-sk4tl\") pod \"watcher-6b65-account-create-update-gmzcs\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.072777 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.081834 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.103456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" event={"ID":"6113492f-9c05-4511-956e-8f19e045260d","Type":"ContainerDied","Data":"a663067d3f8c6b5fc6fa3d85cb651ec6a01483ba0b0f5d4fafa7c1be4bf7e1a2"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.103500 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a663067d3f8c6b5fc6fa3d85cb651ec6a01483ba0b0f5d4fafa7c1be4bf7e1a2" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.103552 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-dw9hb" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.108686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerStarted","Data":"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.109477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.113466 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.118548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.118589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.118600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.118609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.118618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.118625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"51bf0791aef6a119d2d47241c693b87fee6f1f2b506c7de05b60681dba464880"} Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.136434 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podStartSLOduration=3.655313014 podStartE2EDuration="15.136419585s" podCreationTimestamp="2026-01-21 15:30:58 +0000 UTC" firstStartedPulling="2026-01-21 15:30:59.016278463 +0000 UTC m=+1756.197794685" lastFinishedPulling="2026-01-21 15:31:10.497385024 +0000 UTC m=+1767.678901256" observedRunningTime="2026-01-21 15:31:13.130349472 +0000 UTC m=+1770.311865694" watchObservedRunningTime="2026-01-21 15:31:13.136419585 +0000 UTC m=+1770.317935807" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.799559 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.822876 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.828211 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.845601 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.848476 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.852006 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.876616 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs"] Jan 21 15:31:13 crc kubenswrapper[4707]: W0121 15:31:13.887427 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2eac07c_10c9_47d6_a2cd_6da78fb97ee0.slice/crio-c8af95eb0d4b7c7ba34bb5354f06965cc8297693b92f45fa7c3c0ac3c3b63e06 WatchSource:0}: Error finding container c8af95eb0d4b7c7ba34bb5354f06965cc8297693b92f45fa7c3c0ac3c3b63e06: Status 404 returned error can't find the container with id c8af95eb0d4b7c7ba34bb5354f06965cc8297693b92f45fa7c3c0ac3c3b63e06 Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969167 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d59352-da6e-45fa-a721-5f7dc6361200-operator-scripts\") pod \"11d59352-da6e-45fa-a721-5f7dc6361200\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jckfn\" (UniqueName: \"kubernetes.io/projected/b2678839-9b6d-4d68-aa48-c2b42793ea0b-kube-api-access-jckfn\") pod \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt7ps\" (UniqueName: \"kubernetes.io/projected/5c38ceab-e8d7-47cb-bf82-437d544633dc-kube-api-access-rt7ps\") pod \"5c38ceab-e8d7-47cb-bf82-437d544633dc\" (UID: 
\"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnf9h\" (UniqueName: \"kubernetes.io/projected/6d991e26-3d69-43bb-a18b-67aa7819a210-kube-api-access-pnf9h\") pod \"6d991e26-3d69-43bb-a18b-67aa7819a210\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2rnd\" (UniqueName: \"kubernetes.io/projected/630582cc-d708-415d-88da-06be4d720728-kube-api-access-j2rnd\") pod \"630582cc-d708-415d-88da-06be4d720728\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c499192-7911-4564-b833-1a941ba515a9-operator-scripts\") pod \"5c499192-7911-4564-b833-1a941ba515a9\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630582cc-d708-415d-88da-06be4d720728-operator-scripts\") pod \"630582cc-d708-415d-88da-06be4d720728\" (UID: \"630582cc-d708-415d-88da-06be4d720728\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq2st\" (UniqueName: \"kubernetes.io/projected/11d59352-da6e-45fa-a721-5f7dc6361200-kube-api-access-lq2st\") pod \"11d59352-da6e-45fa-a721-5f7dc6361200\" (UID: \"11d59352-da6e-45fa-a721-5f7dc6361200\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2678839-9b6d-4d68-aa48-c2b42793ea0b-operator-scripts\") pod \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\" (UID: \"b2678839-9b6d-4d68-aa48-c2b42793ea0b\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp42c\" (UniqueName: \"kubernetes.io/projected/5c499192-7911-4564-b833-1a941ba515a9-kube-api-access-wp42c\") pod \"5c499192-7911-4564-b833-1a941ba515a9\" (UID: \"5c499192-7911-4564-b833-1a941ba515a9\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c38ceab-e8d7-47cb-bf82-437d544633dc-operator-scripts\") pod \"5c38ceab-e8d7-47cb-bf82-437d544633dc\" (UID: \"5c38ceab-e8d7-47cb-bf82-437d544633dc\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d991e26-3d69-43bb-a18b-67aa7819a210-operator-scripts\") pod \"6d991e26-3d69-43bb-a18b-67aa7819a210\" (UID: \"6d991e26-3d69-43bb-a18b-67aa7819a210\") " Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.969946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d59352-da6e-45fa-a721-5f7dc6361200-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "11d59352-da6e-45fa-a721-5f7dc6361200" (UID: "11d59352-da6e-45fa-a721-5f7dc6361200"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.970105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2678839-9b6d-4d68-aa48-c2b42793ea0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2678839-9b6d-4d68-aa48-c2b42793ea0b" (UID: "b2678839-9b6d-4d68-aa48-c2b42793ea0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.970215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d991e26-3d69-43bb-a18b-67aa7819a210-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d991e26-3d69-43bb-a18b-67aa7819a210" (UID: "6d991e26-3d69-43bb-a18b-67aa7819a210"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.970457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c499192-7911-4564-b833-1a941ba515a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c499192-7911-4564-b833-1a941ba515a9" (UID: "5c499192-7911-4564-b833-1a941ba515a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.970760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630582cc-d708-415d-88da-06be4d720728-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "630582cc-d708-415d-88da-06be4d720728" (UID: "630582cc-d708-415d-88da-06be4d720728"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.971391 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2678839-9b6d-4d68-aa48-c2b42793ea0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.971565 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d991e26-3d69-43bb-a18b-67aa7819a210-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.971575 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d59352-da6e-45fa-a721-5f7dc6361200-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.971579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c38ceab-e8d7-47cb-bf82-437d544633dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c38ceab-e8d7-47cb-bf82-437d544633dc" (UID: "5c38ceab-e8d7-47cb-bf82-437d544633dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.971587 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c499192-7911-4564-b833-1a941ba515a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.971613 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630582cc-d708-415d-88da-06be4d720728-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.974238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2678839-9b6d-4d68-aa48-c2b42793ea0b-kube-api-access-jckfn" (OuterVolumeSpecName: "kube-api-access-jckfn") pod "b2678839-9b6d-4d68-aa48-c2b42793ea0b" (UID: "b2678839-9b6d-4d68-aa48-c2b42793ea0b"). InnerVolumeSpecName "kube-api-access-jckfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.974273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630582cc-d708-415d-88da-06be4d720728-kube-api-access-j2rnd" (OuterVolumeSpecName: "kube-api-access-j2rnd") pod "630582cc-d708-415d-88da-06be4d720728" (UID: "630582cc-d708-415d-88da-06be4d720728"). InnerVolumeSpecName "kube-api-access-j2rnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.974292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c38ceab-e8d7-47cb-bf82-437d544633dc-kube-api-access-rt7ps" (OuterVolumeSpecName: "kube-api-access-rt7ps") pod "5c38ceab-e8d7-47cb-bf82-437d544633dc" (UID: "5c38ceab-e8d7-47cb-bf82-437d544633dc"). InnerVolumeSpecName "kube-api-access-rt7ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.974640 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d59352-da6e-45fa-a721-5f7dc6361200-kube-api-access-lq2st" (OuterVolumeSpecName: "kube-api-access-lq2st") pod "11d59352-da6e-45fa-a721-5f7dc6361200" (UID: "11d59352-da6e-45fa-a721-5f7dc6361200"). InnerVolumeSpecName "kube-api-access-lq2st". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.974797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d991e26-3d69-43bb-a18b-67aa7819a210-kube-api-access-pnf9h" (OuterVolumeSpecName: "kube-api-access-pnf9h") pod "6d991e26-3d69-43bb-a18b-67aa7819a210" (UID: "6d991e26-3d69-43bb-a18b-67aa7819a210"). InnerVolumeSpecName "kube-api-access-pnf9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:13 crc kubenswrapper[4707]: I0121 15:31:13.975127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c499192-7911-4564-b833-1a941ba515a9-kube-api-access-wp42c" (OuterVolumeSpecName: "kube-api-access-wp42c") pod "5c499192-7911-4564-b833-1a941ba515a9" (UID: "5c499192-7911-4564-b833-1a941ba515a9"). InnerVolumeSpecName "kube-api-access-wp42c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.027119 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-tnspr"] Jan 21 15:31:14 crc kubenswrapper[4707]: W0121 15:31:14.030096 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15438c80_41a4_474c_8953_1cb1146c4dab.slice/crio-46f118172728b37acd77eec324ff5c5732eee240939bc155f550cf24c959d670 WatchSource:0}: Error finding container 46f118172728b37acd77eec324ff5c5732eee240939bc155f550cf24c959d670: Status 404 returned error can't find the container with id 46f118172728b37acd77eec324ff5c5732eee240939bc155f550cf24c959d670 Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072478 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jckfn\" (UniqueName: \"kubernetes.io/projected/b2678839-9b6d-4d68-aa48-c2b42793ea0b-kube-api-access-jckfn\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072502 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt7ps\" (UniqueName: \"kubernetes.io/projected/5c38ceab-e8d7-47cb-bf82-437d544633dc-kube-api-access-rt7ps\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2rnd\" (UniqueName: \"kubernetes.io/projected/630582cc-d708-415d-88da-06be4d720728-kube-api-access-j2rnd\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072521 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnf9h\" (UniqueName: \"kubernetes.io/projected/6d991e26-3d69-43bb-a18b-67aa7819a210-kube-api-access-pnf9h\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072530 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq2st\" (UniqueName: \"kubernetes.io/projected/11d59352-da6e-45fa-a721-5f7dc6361200-kube-api-access-lq2st\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072556 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp42c\" (UniqueName: \"kubernetes.io/projected/5c499192-7911-4564-b833-1a941ba515a9-kube-api-access-wp42c\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.072564 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c38ceab-e8d7-47cb-bf82-437d544633dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.126022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-r8f6v" event={"ID":"11d59352-da6e-45fa-a721-5f7dc6361200","Type":"ContainerDied","Data":"3dd9fd3c703c3cd2065ebfee52e30409bfa829aa14eed7ca4d8e4d6a5017d11c"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.126062 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd9fd3c703c3cd2065ebfee52e30409bfa829aa14eed7ca4d8e4d6a5017d11c" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.126075 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-r8f6v" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.127282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" event={"ID":"630582cc-d708-415d-88da-06be4d720728","Type":"ContainerDied","Data":"adce826b43c57a6c0193db3e54a584e32fc9ac4ef0b2d993bda0aabdd0d68582"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.127303 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adce826b43c57a6c0193db3e54a584e32fc9ac4ef0b2d993bda0aabdd0d68582" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.127302 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-qc6lv" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.128193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-create-tnspr" event={"ID":"15438c80-41a4-474c-8953-1cb1146c4dab","Type":"ContainerStarted","Data":"46f118172728b37acd77eec324ff5c5732eee240939bc155f550cf24c959d670"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.130258 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.130302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4d50-account-create-update-2t28b" event={"ID":"5c38ceab-e8d7-47cb-bf82-437d544633dc","Type":"ContainerDied","Data":"7bd553f81712a5a7672d3137e6f4e3ce6f00d97e0110fffd7520924c8000af0d"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.130322 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd553f81712a5a7672d3137e6f4e3ce6f00d97e0110fffd7520924c8000af0d" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.131548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" event={"ID":"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0","Type":"ContainerStarted","Data":"c0fe7f58e8940e18c6eb0ada86b5033201da9ecf5cc7b720a5e7ba73cfd6bb9f"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.131570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" event={"ID":"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0","Type":"ContainerStarted","Data":"c8af95eb0d4b7c7ba34bb5354f06965cc8297693b92f45fa7c3c0ac3c3b63e06"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.135459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" event={"ID":"b2678839-9b6d-4d68-aa48-c2b42793ea0b","Type":"ContainerDied","Data":"6d53c61e40536302919e23fd82014ecf72a27edc45d4906592eed44c43efde2a"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.135481 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d53c61e40536302919e23fd82014ecf72a27edc45d4906592eed44c43efde2a" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.135572 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.149661 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" podStartSLOduration=2.14964592 podStartE2EDuration="2.14964592s" podCreationTimestamp="2026-01-21 15:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:14.143957423 +0000 UTC m=+1771.325473645" watchObservedRunningTime="2026-01-21 15:31:14.14964592 +0000 UTC m=+1771.331162142" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.150931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.151234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.151246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.151255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.151264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.151272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.162230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" event={"ID":"5c499192-7911-4564-b833-1a941ba515a9","Type":"ContainerDied","Data":"23488882f99c4ce90e77df15400848545884b1b80d54df1733e4be0c41304ba4"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.162260 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23488882f99c4ce90e77df15400848545884b1b80d54df1733e4be0c41304ba4" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.162326 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-2f5kd" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.176089 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-2mnwh" Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.176985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-2mnwh" event={"ID":"6d991e26-3d69-43bb-a18b-67aa7819a210","Type":"ContainerDied","Data":"88063fb1d50901b0d5327864191e3de3bc2f866ba57be53c06ccf4f44e207d78"} Jan 21 15:31:14 crc kubenswrapper[4707]: I0121 15:31:14.177015 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88063fb1d50901b0d5327864191e3de3bc2f866ba57be53c06ccf4f44e207d78" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.187281 4707 generic.go:334] "Generic (PLEG): container finished" podID="15438c80-41a4-474c-8953-1cb1146c4dab" containerID="1fe533f65e70741554fde5962a4585ba7bcb52018aab459c4513cf5fe868f95b" exitCode=0 Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.190035 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" containerID="c0fe7f58e8940e18c6eb0ada86b5033201da9ecf5cc7b720a5e7ba73cfd6bb9f" exitCode=0 Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.190124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-create-tnspr" event={"ID":"15438c80-41a4-474c-8953-1cb1146c4dab","Type":"ContainerDied","Data":"1fe533f65e70741554fde5962a4585ba7bcb52018aab459c4513cf5fe868f95b"} Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.190155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" event={"ID":"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0","Type":"ContainerDied","Data":"c0fe7f58e8940e18c6eb0ada86b5033201da9ecf5cc7b720a5e7ba73cfd6bb9f"} Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.195515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7"} Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.195554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d"} Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.195567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97"} Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975441 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-j4zfc"] Jan 21 15:31:15 crc kubenswrapper[4707]: E0121 15:31:15.975718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c38ceab-e8d7-47cb-bf82-437d544633dc" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975735 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c38ceab-e8d7-47cb-bf82-437d544633dc" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: E0121 15:31:15.975748 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d59352-da6e-45fa-a721-5f7dc6361200" containerName="mariadb-database-create" Jan 21 15:31:15 crc 
kubenswrapper[4707]: I0121 15:31:15.975754 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d59352-da6e-45fa-a721-5f7dc6361200" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: E0121 15:31:15.975768 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d991e26-3d69-43bb-a18b-67aa7819a210" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975774 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d991e26-3d69-43bb-a18b-67aa7819a210" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: E0121 15:31:15.975786 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c499192-7911-4564-b833-1a941ba515a9" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975791 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c499192-7911-4564-b833-1a941ba515a9" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: E0121 15:31:15.975800 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630582cc-d708-415d-88da-06be4d720728" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975821 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="630582cc-d708-415d-88da-06be4d720728" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: E0121 15:31:15.975830 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2678839-9b6d-4d68-aa48-c2b42793ea0b" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975836 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2678839-9b6d-4d68-aa48-c2b42793ea0b" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975986 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2678839-9b6d-4d68-aa48-c2b42793ea0b" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.975999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c499192-7911-4564-b833-1a941ba515a9" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.976008 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="630582cc-d708-415d-88da-06be4d720728" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.976021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d991e26-3d69-43bb-a18b-67aa7819a210" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.976029 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d59352-da6e-45fa-a721-5f7dc6361200" containerName="mariadb-database-create" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.976040 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c38ceab-e8d7-47cb-bf82-437d544633dc" containerName="mariadb-account-create-update" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.976489 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.977796 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-9hvs4" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.977856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:31:15 crc kubenswrapper[4707]: I0121 15:31:15.983197 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-j4zfc"] Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.103780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79lm6\" (UniqueName: \"kubernetes.io/projected/2288c539-d799-4de2-8f62-020faad333ec-kube-api-access-79lm6\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.103902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-db-sync-config-data\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.103935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-config-data\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.103970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-combined-ca-bundle\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.182358 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:31:16 crc kubenswrapper[4707]: E0121 15:31:16.182592 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.204951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-db-sync-config-data\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.204983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-config-data\") pod 
\"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.205010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-combined-ca-bundle\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.205103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79lm6\" (UniqueName: \"kubernetes.io/projected/2288c539-d799-4de2-8f62-020faad333ec-kube-api-access-79lm6\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.209775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-config-data\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.224207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-db-sync-config-data\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.224575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-combined-ca-bundle\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.232275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79lm6\" (UniqueName: \"kubernetes.io/projected/2288c539-d799-4de2-8f62-020faad333ec-kube-api-access-79lm6\") pod \"glance-db-sync-j4zfc\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.297793 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.589484 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.612267 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.714070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk4tl\" (UniqueName: \"kubernetes.io/projected/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-kube-api-access-sk4tl\") pod \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.714207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15438c80-41a4-474c-8953-1cb1146c4dab-operator-scripts\") pod \"15438c80-41a4-474c-8953-1cb1146c4dab\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.714439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-operator-scripts\") pod \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\" (UID: \"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0\") " Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.714507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96slf\" (UniqueName: \"kubernetes.io/projected/15438c80-41a4-474c-8953-1cb1146c4dab-kube-api-access-96slf\") pod \"15438c80-41a4-474c-8953-1cb1146c4dab\" (UID: \"15438c80-41a4-474c-8953-1cb1146c4dab\") " Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.714695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15438c80-41a4-474c-8953-1cb1146c4dab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15438c80-41a4-474c-8953-1cb1146c4dab" (UID: "15438c80-41a4-474c-8953-1cb1146c4dab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.714780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" (UID: "e2eac07c-10c9-47d6-a2cd-6da78fb97ee0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.715171 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15438c80-41a4-474c-8953-1cb1146c4dab-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.715195 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.722673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-kube-api-access-sk4tl" (OuterVolumeSpecName: "kube-api-access-sk4tl") pod "e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" (UID: "e2eac07c-10c9-47d6-a2cd-6da78fb97ee0"). InnerVolumeSpecName "kube-api-access-sk4tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.722728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15438c80-41a4-474c-8953-1cb1146c4dab-kube-api-access-96slf" (OuterVolumeSpecName: "kube-api-access-96slf") pod "15438c80-41a4-474c-8953-1cb1146c4dab" (UID: "15438c80-41a4-474c-8953-1cb1146c4dab"). InnerVolumeSpecName "kube-api-access-96slf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.819001 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk4tl\" (UniqueName: \"kubernetes.io/projected/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0-kube-api-access-sk4tl\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.819347 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96slf\" (UniqueName: \"kubernetes.io/projected/15438c80-41a4-474c-8953-1cb1146c4dab-kube-api-access-96slf\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:16 crc kubenswrapper[4707]: I0121 15:31:16.877404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-j4zfc"] Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.216086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerStarted","Data":"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97"} Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.223272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerStarted","Data":"abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266"} Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.227854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-create-tnspr" event={"ID":"15438c80-41a4-474c-8953-1cb1146c4dab","Type":"ContainerDied","Data":"46f118172728b37acd77eec324ff5c5732eee240939bc155f550cf24c959d670"} Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.227881 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f118172728b37acd77eec324ff5c5732eee240939bc155f550cf24c959d670" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.227984 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-tnspr" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.229516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" event={"ID":"e2eac07c-10c9-47d6-a2cd-6da78fb97ee0","Type":"ContainerDied","Data":"c8af95eb0d4b7c7ba34bb5354f06965cc8297693b92f45fa7c3c0ac3c3b63e06"} Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.229537 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8af95eb0d4b7c7ba34bb5354f06965cc8297693b92f45fa7c3c0ac3c3b63e06" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.229572 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.238289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" event={"ID":"2288c539-d799-4de2-8f62-020faad333ec","Type":"ContainerStarted","Data":"17ce73e60766b2bbde6399c44c3a9aae4f6eb9dc1a913079d9939f718a9312e8"} Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.251015 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=15.251004678 podStartE2EDuration="15.251004678s" podCreationTimestamp="2026-01-21 15:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:17.244632226 +0000 UTC m=+1774.426148448" watchObservedRunningTime="2026-01-21 15:31:17.251004678 +0000 UTC m=+1774.432520900" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.347512 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn"] Jan 21 15:31:17 crc kubenswrapper[4707]: E0121 15:31:17.347892 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15438c80-41a4-474c-8953-1cb1146c4dab" containerName="mariadb-database-create" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.347907 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15438c80-41a4-474c-8953-1cb1146c4dab" containerName="mariadb-database-create" Jan 21 15:31:17 crc kubenswrapper[4707]: E0121 15:31:17.347937 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" containerName="mariadb-account-create-update" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.347943 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" containerName="mariadb-account-create-update" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.348113 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15438c80-41a4-474c-8953-1cb1146c4dab" containerName="mariadb-database-create" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.348127 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" containerName="mariadb-account-create-update" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.349011 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.350872 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.351742 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn"] Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.530177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.530559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-config\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.530611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.530636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjbg8\" (UniqueName: \"kubernetes.io/projected/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-kube-api-access-wjbg8\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.631963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.632066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-config\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.632105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.632125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjbg8\" (UniqueName: 
\"kubernetes.io/projected/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-kube-api-access-wjbg8\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.633244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.633259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-config\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.633937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.747085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjbg8\" (UniqueName: \"kubernetes.io/projected/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-kube-api-access-wjbg8\") pod \"dnsmasq-dnsmasq-859b49f69-4jtmn\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:17 crc kubenswrapper[4707]: I0121 15:31:17.965489 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:18 crc kubenswrapper[4707]: I0121 15:31:18.253952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" event={"ID":"2288c539-d799-4de2-8f62-020faad333ec","Type":"ContainerStarted","Data":"806e8dc586e5d2329e8f32eec444b83190a7950e3e3032a6311dac6fef009c29"} Jan 21 15:31:18 crc kubenswrapper[4707]: I0121 15:31:18.333833 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" podStartSLOduration=3.333802325 podStartE2EDuration="3.333802325s" podCreationTimestamp="2026-01-21 15:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:18.265199404 +0000 UTC m=+1775.446715626" watchObservedRunningTime="2026-01-21 15:31:18.333802325 +0000 UTC m=+1775.515318548" Jan 21 15:31:18 crc kubenswrapper[4707]: I0121 15:31:18.335466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn"] Jan 21 15:31:19 crc kubenswrapper[4707]: I0121 15:31:19.263698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerStarted","Data":"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9"} Jan 21 15:31:19 crc kubenswrapper[4707]: I0121 15:31:19.266710 4707 generic.go:334] "Generic (PLEG): container finished" podID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerID="b19a11ff1e744d04142c11ec5127bbafe457bae5474e682a369e5bab0699a6ac" exitCode=0 Jan 21 15:31:19 crc kubenswrapper[4707]: I0121 15:31:19.266770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" event={"ID":"91bd3c13-ed9b-4929-9c5d-3d650badf3e9","Type":"ContainerDied","Data":"b19a11ff1e744d04142c11ec5127bbafe457bae5474e682a369e5bab0699a6ac"} Jan 21 15:31:19 crc kubenswrapper[4707]: I0121 15:31:19.266828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" event={"ID":"91bd3c13-ed9b-4929-9c5d-3d650badf3e9","Type":"ContainerStarted","Data":"e079037a2fecb2ddda6a612ee1b3f2b5acb044b51b0df30dde88676cc5c0c267"} Jan 21 15:31:20 crc kubenswrapper[4707]: I0121 15:31:20.275580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" event={"ID":"91bd3c13-ed9b-4929-9c5d-3d650badf3e9","Type":"ContainerStarted","Data":"9cb3ce555c3c18849f88cb950839ddcad6eacdd2264d87d1e1ee4ea3ccfb67eb"} Jan 21 15:31:20 crc kubenswrapper[4707]: I0121 15:31:20.276026 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:20 crc kubenswrapper[4707]: I0121 15:31:20.294597 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" podStartSLOduration=3.294582261 podStartE2EDuration="3.294582261s" podCreationTimestamp="2026-01-21 15:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:20.292001429 +0000 UTC m=+1777.473517651" watchObservedRunningTime="2026-01-21 15:31:20.294582261 +0000 UTC m=+1777.476098483" Jan 21 15:31:21 crc kubenswrapper[4707]: I0121 
15:31:21.286612 4707 generic.go:334] "Generic (PLEG): container finished" podID="2288c539-d799-4de2-8f62-020faad333ec" containerID="806e8dc586e5d2329e8f32eec444b83190a7950e3e3032a6311dac6fef009c29" exitCode=0 Jan 21 15:31:21 crc kubenswrapper[4707]: I0121 15:31:21.286687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" event={"ID":"2288c539-d799-4de2-8f62-020faad333ec","Type":"ContainerDied","Data":"806e8dc586e5d2329e8f32eec444b83190a7950e3e3032a6311dac6fef009c29"} Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.570036 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.716953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-combined-ca-bundle\") pod \"2288c539-d799-4de2-8f62-020faad333ec\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.717003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-config-data\") pod \"2288c539-d799-4de2-8f62-020faad333ec\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.717085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79lm6\" (UniqueName: \"kubernetes.io/projected/2288c539-d799-4de2-8f62-020faad333ec-kube-api-access-79lm6\") pod \"2288c539-d799-4de2-8f62-020faad333ec\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.717105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-db-sync-config-data\") pod \"2288c539-d799-4de2-8f62-020faad333ec\" (UID: \"2288c539-d799-4de2-8f62-020faad333ec\") " Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.722971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2288c539-d799-4de2-8f62-020faad333ec-kube-api-access-79lm6" (OuterVolumeSpecName: "kube-api-access-79lm6") pod "2288c539-d799-4de2-8f62-020faad333ec" (UID: "2288c539-d799-4de2-8f62-020faad333ec"). InnerVolumeSpecName "kube-api-access-79lm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.723874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2288c539-d799-4de2-8f62-020faad333ec" (UID: "2288c539-d799-4de2-8f62-020faad333ec"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.736919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2288c539-d799-4de2-8f62-020faad333ec" (UID: "2288c539-d799-4de2-8f62-020faad333ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.749372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-config-data" (OuterVolumeSpecName: "config-data") pod "2288c539-d799-4de2-8f62-020faad333ec" (UID: "2288c539-d799-4de2-8f62-020faad333ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.819098 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.819231 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.819292 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79lm6\" (UniqueName: \"kubernetes.io/projected/2288c539-d799-4de2-8f62-020faad333ec-kube-api-access-79lm6\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:22 crc kubenswrapper[4707]: I0121 15:31:22.819344 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2288c539-d799-4de2-8f62-020faad333ec-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:23 crc kubenswrapper[4707]: I0121 15:31:23.300555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" event={"ID":"2288c539-d799-4de2-8f62-020faad333ec","Type":"ContainerDied","Data":"17ce73e60766b2bbde6399c44c3a9aae4f6eb9dc1a913079d9939f718a9312e8"} Jan 21 15:31:23 crc kubenswrapper[4707]: I0121 15:31:23.300581 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-j4zfc" Jan 21 15:31:23 crc kubenswrapper[4707]: I0121 15:31:23.300587 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17ce73e60766b2bbde6399c44c3a9aae4f6eb9dc1a913079d9939f718a9312e8" Jan 21 15:31:24 crc kubenswrapper[4707]: I0121 15:31:24.314331 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerID="a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d" exitCode=0 Jan 21 15:31:24 crc kubenswrapper[4707]: I0121 15:31:24.314395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"c1a47423-6eba-4cf7-ab2d-db338f4f28a6","Type":"ContainerDied","Data":"a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d"} Jan 21 15:31:25 crc kubenswrapper[4707]: I0121 15:31:25.322086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"c1a47423-6eba-4cf7-ab2d-db338f4f28a6","Type":"ContainerStarted","Data":"23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce"} Jan 21 15:31:25 crc kubenswrapper[4707]: I0121 15:31:25.322444 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:31:25 crc kubenswrapper[4707]: I0121 15:31:25.324035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerStarted","Data":"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5"} Jan 21 15:31:25 crc kubenswrapper[4707]: I0121 15:31:25.340183 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=38.340173753 podStartE2EDuration="38.340173753s" podCreationTimestamp="2026-01-21 15:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:25.336664906 +0000 UTC m=+1782.518181128" watchObservedRunningTime="2026-01-21 15:31:25.340173753 +0000 UTC m=+1782.521689975" Jan 21 15:31:25 crc kubenswrapper[4707]: I0121 15:31:25.362082 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podStartSLOduration=3.884582227 podStartE2EDuration="28.362059393s" podCreationTimestamp="2026-01-21 15:30:57 +0000 UTC" firstStartedPulling="2026-01-21 15:31:00.128264062 +0000 UTC m=+1757.309780284" lastFinishedPulling="2026-01-21 15:31:24.605741227 +0000 UTC m=+1781.787257450" observedRunningTime="2026-01-21 15:31:25.356576565 +0000 UTC m=+1782.538092787" watchObservedRunningTime="2026-01-21 15:31:25.362059393 +0000 UTC m=+1782.543575616" Jan 21 15:31:25 crc kubenswrapper[4707]: I0121 15:31:25.639137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:31:27 crc kubenswrapper[4707]: I0121 15:31:27.967930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.002709 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt"] Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.002949 4707 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerName="dnsmasq-dns" containerID="cri-o://291ac9488458f15e17db78c4845ab63022b0bc1c6ee640c51ad561be142062ce" gracePeriod=10 Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.345091 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerID="291ac9488458f15e17db78c4845ab63022b0bc1c6ee640c51ad561be142062ce" exitCode=0 Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.345172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" event={"ID":"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c","Type":"ContainerDied","Data":"291ac9488458f15e17db78c4845ab63022b0bc1c6ee640c51ad561be142062ce"} Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.394845 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.511744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-config\") pod \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.512266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-dnsmasq-svc\") pod \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.512500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgxw4\" (UniqueName: \"kubernetes.io/projected/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-kube-api-access-hgxw4\") pod \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\" (UID: \"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c\") " Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.516437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-kube-api-access-hgxw4" (OuterVolumeSpecName: "kube-api-access-hgxw4") pod "4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" (UID: "4a2b6336-8cd0-4321-b5ed-6bd1e321b93c"). InnerVolumeSpecName "kube-api-access-hgxw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.539112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" (UID: "4a2b6336-8cd0-4321-b5ed-6bd1e321b93c"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.539444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-config" (OuterVolumeSpecName: "config") pod "4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" (UID: "4a2b6336-8cd0-4321-b5ed-6bd1e321b93c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.613759 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.613791 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgxw4\" (UniqueName: \"kubernetes.io/projected/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-kube-api-access-hgxw4\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:28 crc kubenswrapper[4707]: I0121 15:31:28.613803 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.182078 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:31:29 crc kubenswrapper[4707]: E0121 15:31:29.182338 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.352768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" event={"ID":"4a2b6336-8cd0-4321-b5ed-6bd1e321b93c","Type":"ContainerDied","Data":"709b1c2333202b4ae32673fb0530e208c401bfad1dbf3db56bf8916bfb22d484"} Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.352830 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.352842 4707 scope.go:117] "RemoveContainer" containerID="291ac9488458f15e17db78c4845ab63022b0bc1c6ee640c51ad561be142062ce" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.369437 4707 scope.go:117] "RemoveContainer" containerID="4ee7a2e40272780a6533082a718502f3763d3518cb8a640cca7433f0c0ba28b0" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.369542 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt"] Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.374979 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-vgrnt"] Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.651404 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.651441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:29 crc kubenswrapper[4707]: I0121 15:31:29.653927 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:30 crc kubenswrapper[4707]: I0121 15:31:30.360283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:31 crc kubenswrapper[4707]: I0121 15:31:31.189668 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" path="/var/lib/kubelet/pods/4a2b6336-8cd0-4321-b5ed-6bd1e321b93c/volumes" Jan 21 15:31:32 crc kubenswrapper[4707]: I0121 15:31:32.270183 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:32 crc kubenswrapper[4707]: I0121 15:31:32.371567 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="prometheus" containerID="cri-o://30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" gracePeriod=600 Jan 21 15:31:32 crc kubenswrapper[4707]: I0121 15:31:32.371629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="config-reloader" containerID="cri-o://81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" gracePeriod=600 Jan 21 15:31:32 crc kubenswrapper[4707]: I0121 15:31:32.371631 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="thanos-sidecar" containerID="cri-o://2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" gracePeriod=600 Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.242223 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-2\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-1\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mnxj\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-kube-api-access-8mnxj\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-tls-assets\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-web-config\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-config\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7077aeb3-cc3a-41f0-882c-a29171cdda53-config-out\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-thanos-prometheus-http-client-file\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-0\") pod \"7077aeb3-cc3a-41f0-882c-a29171cdda53\" (UID: \"7077aeb3-cc3a-41f0-882c-a29171cdda53\") " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.371936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.372230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.372307 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.372318 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.372530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.381667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-kube-api-access-8mnxj" (OuterVolumeSpecName: "kube-api-access-8mnxj") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "kube-api-access-8mnxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.383498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.384151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7077aeb3-cc3a-41f0-882c-a29171cdda53-config-out" (OuterVolumeSpecName: "config-out") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.384255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-config" (OuterVolumeSpecName: "config") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.385363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.385850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.388620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerDied","Data":"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5"} Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.388931 4707 scope.go:117] "RemoveContainer" containerID="2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.388601 4707 generic.go:334] "Generic (PLEG): container finished" podID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerID="2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" exitCode=0 Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.388982 4707 generic.go:334] "Generic (PLEG): container finished" podID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerID="81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" exitCode=0 Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.389013 4707 generic.go:334] "Generic (PLEG): container finished" podID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerID="30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" exitCode=0 Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.389028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerDied","Data":"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9"} Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.389047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerDied","Data":"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97"} Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.388651 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.389056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"7077aeb3-cc3a-41f0-882c-a29171cdda53","Type":"ContainerDied","Data":"aea07a785d69297b821c4370ae38b084c5efe52194ba3afe34ce409577e9ec28"} Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.399971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-web-config" (OuterVolumeSpecName: "web-config") pod "7077aeb3-cc3a-41f0-882c-a29171cdda53" (UID: "7077aeb3-cc3a-41f0-882c-a29171cdda53"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.419223 4707 scope.go:117] "RemoveContainer" containerID="81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.433691 4707 scope.go:117] "RemoveContainer" containerID="30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.448430 4707 scope.go:117] "RemoveContainer" containerID="354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.465331 4707 scope.go:117] "RemoveContainer" containerID="2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.466263 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": container with ID starting with 2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5 not found: ID does not exist" containerID="2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.466304 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5"} err="failed to get container status \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": rpc error: code = NotFound desc = could not find container \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": container with ID starting with 2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.466331 4707 scope.go:117] "RemoveContainer" containerID="81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.466691 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": container with ID starting with 81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9 not found: ID does not exist" 
containerID="81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.466723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9"} err="failed to get container status \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": rpc error: code = NotFound desc = could not find container \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": container with ID starting with 81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.466746 4707 scope.go:117] "RemoveContainer" containerID="30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.467323 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": container with ID starting with 30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97 not found: ID does not exist" containerID="30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.467355 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97"} err="failed to get container status \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": rpc error: code = NotFound desc = could not find container \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": container with ID starting with 30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.467376 4707 scope.go:117] "RemoveContainer" containerID="354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.468454 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": container with ID starting with 354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a not found: ID does not exist" containerID="354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.468493 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a"} err="failed to get container status \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": rpc error: code = NotFound desc = could not find container \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": container with ID starting with 354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.468511 4707 scope.go:117] "RemoveContainer" containerID="2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.468733 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5"} err="failed to get container status \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": rpc error: code = NotFound desc = could not find container \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": container with ID starting with 2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.468752 4707 scope.go:117] "RemoveContainer" containerID="81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.469042 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9"} err="failed to get container status \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": rpc error: code = NotFound desc = could not find container \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": container with ID starting with 81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.469064 4707 scope.go:117] "RemoveContainer" containerID="30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.469450 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97"} err="failed to get container status \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": rpc error: code = NotFound desc = could not find container \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": container with ID starting with 30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.469472 4707 scope.go:117] "RemoveContainer" containerID="354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.469798 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a"} err="failed to get container status \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": rpc error: code = NotFound desc = could not find container \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": container with ID starting with 354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.469832 4707 scope.go:117] "RemoveContainer" containerID="2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.470072 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5"} err="failed to get container status \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": rpc error: code = NotFound desc = could not find container \"2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5\": container with ID starting with 2e85a98706eca8cc5b392d47b67c10b1515eaac51c9d7c0e24f381211ce3ace5 not found: ID does not exist" Jan 
21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.470091 4707 scope.go:117] "RemoveContainer" containerID="81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.470940 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9"} err="failed to get container status \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": rpc error: code = NotFound desc = could not find container \"81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9\": container with ID starting with 81de11e014e49b967357b3892d75fa88f6b2486c8668b60543a25a7ea01b74e9 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.470958 4707 scope.go:117] "RemoveContainer" containerID="30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.471212 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97"} err="failed to get container status \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": rpc error: code = NotFound desc = could not find container \"30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97\": container with ID starting with 30b09b4fa838ebcd71c26721acbc11f5de58be3627bd5b9b8c5c5ade3668fa97 not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.471234 4707 scope.go:117] "RemoveContainer" containerID="354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.471451 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a"} err="failed to get container status \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": rpc error: code = NotFound desc = could not find container \"354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a\": container with ID starting with 354e6de121bdbeb4c7eb9771321ff8cdafa064e6ecdacc26b0915d59562a3d5a not found: ID does not exist" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473168 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7077aeb3-cc3a-41f0-882c-a29171cdda53-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473192 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mnxj\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-kube-api-access-8mnxj\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473204 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7077aeb3-cc3a-41f0-882c-a29171cdda53-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473212 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-web-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473220 4707 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473227 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7077aeb3-cc3a-41f0-882c-a29171cdda53-config-out\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473254 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.473262 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7077aeb3-cc3a-41f0-882c-a29171cdda53-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.486388 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.574302 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.712152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.716660 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.727930 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.728172 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="config-reloader" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728186 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="config-reloader" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.728198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerName="dnsmasq-dns" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728204 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerName="dnsmasq-dns" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.728214 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="prometheus" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="prometheus" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.728228 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="init-config-reloader" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728233 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="init-config-reloader" Jan 21 15:31:33 crc 
kubenswrapper[4707]: E0121 15:31:33.728251 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2288c539-d799-4de2-8f62-020faad333ec" containerName="glance-db-sync" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728256 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2288c539-d799-4de2-8f62-020faad333ec" containerName="glance-db-sync" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.728270 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerName="init" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728275 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerName="init" Jan 21 15:31:33 crc kubenswrapper[4707]: E0121 15:31:33.728283 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="thanos-sidecar" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728288 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="thanos-sidecar" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728399 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="config-reloader" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728415 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2288c539-d799-4de2-8f62-020faad333ec" containerName="glance-db-sync" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728426 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="prometheus" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" containerName="thanos-sidecar" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.728444 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2b6336-8cd0-4321-b5ed-6bd1e321b93c" containerName="dnsmasq-dns" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.731890 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.733703 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-1" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.733778 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.734346 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.734750 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-web-config" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.734783 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-2" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.735235 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.735598 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"metric-storage-prometheus-dockercfg-kh2t7" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.736312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-metric-storage-prometheus-svc" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.740711 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.744040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-tls-assets-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.876964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27bcd\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-kube-api-access-27bcd\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877319 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.877365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27bcd\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-kube-api-access-27bcd\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.978977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " 
pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.979179 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.982519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.982537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.982594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.982724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.983093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.983575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.983707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.984783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.993359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27bcd\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-kube-api-access-27bcd\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:33 crc kubenswrapper[4707]: I0121 15:31:33.996790 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"prometheus-metric-storage-0\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:34 crc kubenswrapper[4707]: I0121 15:31:34.047066 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:34 crc kubenswrapper[4707]: I0121 15:31:34.436305 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:31:35 crc kubenswrapper[4707]: I0121 15:31:35.190349 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7077aeb3-cc3a-41f0-882c-a29171cdda53" path="/var/lib/kubelet/pods/7077aeb3-cc3a-41f0-882c-a29171cdda53/volumes" Jan 21 15:31:35 crc kubenswrapper[4707]: I0121 15:31:35.404923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerStarted","Data":"b9ba6ef9736e4e3423de4537a070ef7c9438832cc3b04653d27954688e69a445"} Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.423908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerStarted","Data":"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23"} Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.475941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.698621 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-qcqbl"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.699557 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.707571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-qcqbl"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.750520 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-jrrrh"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.751402 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.752913 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-watcher-dockercfg-zx5g8" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.752979 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-config-data" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.761648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-jrrrh"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.805732 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lmmvb"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.806708 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.810298 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-69db-account-create-update-p864k"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.811294 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.812569 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.814576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lmmvb"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.818746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-69db-account-create-update-p864k"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.846784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-config-data\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.847016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-combined-ca-bundle\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.847122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfgjb\" (UniqueName: \"kubernetes.io/projected/5ab2b0b7-4f0b-49d9-b217-893b083a786c-kube-api-access-gfgjb\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.847247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-db-sync-config-data\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.847333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10033319-b8c1-4443-9604-9e94be4e4159-operator-scripts\") pod \"cinder-db-create-qcqbl\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.847404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkx2p\" (UniqueName: \"kubernetes.io/projected/10033319-b8c1-4443-9604-9e94be4e4159-kube-api-access-xkx2p\") pod \"cinder-db-create-qcqbl\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.899297 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.900151 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.906075 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.909800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg"] Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.948886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfgjb\" (UniqueName: \"kubernetes.io/projected/5ab2b0b7-4f0b-49d9-b217-893b083a786c-kube-api-access-gfgjb\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24548ac0-7326-443b-ad9c-81ace3a3dd45-operator-scripts\") pod \"barbican-db-create-lmmvb\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-db-sync-config-data\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10033319-b8c1-4443-9604-9e94be4e4159-operator-scripts\") pod \"cinder-db-create-qcqbl\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkx2p\" (UniqueName: \"kubernetes.io/projected/10033319-b8c1-4443-9604-9e94be4e4159-kube-api-access-xkx2p\") pod \"cinder-db-create-qcqbl\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fs2q\" (UniqueName: \"kubernetes.io/projected/24548ac0-7326-443b-ad9c-81ace3a3dd45-kube-api-access-9fs2q\") pod \"barbican-db-create-lmmvb\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fglff\" (UniqueName: \"kubernetes.io/projected/811c1581-db32-4ad5-abfd-cb2bd09575d4-kube-api-access-fglff\") pod \"neutron-69db-account-create-update-p864k\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-config-data\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811c1581-db32-4ad5-abfd-cb2bd09575d4-operator-scripts\") pod \"neutron-69db-account-create-update-p864k\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.949569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-combined-ca-bundle\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.950571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10033319-b8c1-4443-9604-9e94be4e4159-operator-scripts\") pod \"cinder-db-create-qcqbl\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.960654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-config-data\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.964280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-combined-ca-bundle\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.966624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfgjb\" (UniqueName: \"kubernetes.io/projected/5ab2b0b7-4f0b-49d9-b217-893b083a786c-kube-api-access-gfgjb\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.967121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-db-sync-config-data\") pod \"watcher-db-sync-jrrrh\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:38 crc kubenswrapper[4707]: I0121 15:31:38.975840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkx2p\" (UniqueName: \"kubernetes.io/projected/10033319-b8c1-4443-9604-9e94be4e4159-kube-api-access-xkx2p\") pod \"cinder-db-create-qcqbl\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.010831 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7"] Jan 21 
15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.011671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.011903 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.013136 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.020080 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.050981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndgm5\" (UniqueName: \"kubernetes.io/projected/e972c93b-19a1-455e-a134-ba9bd7f56854-kube-api-access-ndgm5\") pod \"barbican-5b6c-account-create-update-84qjg\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.051103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24548ac0-7326-443b-ad9c-81ace3a3dd45-operator-scripts\") pod \"barbican-db-create-lmmvb\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.051274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fs2q\" (UniqueName: \"kubernetes.io/projected/24548ac0-7326-443b-ad9c-81ace3a3dd45-kube-api-access-9fs2q\") pod \"barbican-db-create-lmmvb\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.051629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fglff\" (UniqueName: \"kubernetes.io/projected/811c1581-db32-4ad5-abfd-cb2bd09575d4-kube-api-access-fglff\") pod \"neutron-69db-account-create-update-p864k\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.051699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811c1581-db32-4ad5-abfd-cb2bd09575d4-operator-scripts\") pod \"neutron-69db-account-create-update-p864k\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.051743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e972c93b-19a1-455e-a134-ba9bd7f56854-operator-scripts\") pod \"barbican-5b6c-account-create-update-84qjg\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.051789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/24548ac0-7326-443b-ad9c-81ace3a3dd45-operator-scripts\") pod \"barbican-db-create-lmmvb\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.052261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811c1581-db32-4ad5-abfd-cb2bd09575d4-operator-scripts\") pod \"neutron-69db-account-create-update-p864k\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.052546 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-d6kbk"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.054339 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.056334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.057042 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.057217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-hswdl" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.057329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.062313 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.064609 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-d6kbk"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.068241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fs2q\" (UniqueName: \"kubernetes.io/projected/24548ac0-7326-443b-ad9c-81ace3a3dd45-kube-api-access-9fs2q\") pod \"barbican-db-create-lmmvb\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.087314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fglff\" (UniqueName: \"kubernetes.io/projected/811c1581-db32-4ad5-abfd-cb2bd09575d4-kube-api-access-fglff\") pod \"neutron-69db-account-create-update-p864k\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.097872 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jzlsb"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.098790 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.108148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jzlsb"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.119258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.127897 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e972c93b-19a1-455e-a134-ba9bd7f56854-operator-scripts\") pod \"barbican-5b6c-account-create-update-84qjg\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndgm5\" (UniqueName: \"kubernetes.io/projected/e972c93b-19a1-455e-a134-ba9bd7f56854-kube-api-access-ndgm5\") pod \"barbican-5b6c-account-create-update-84qjg\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-combined-ca-bundle\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3602a2ea-5d10-4d05-9963-99da2382de51-operator-scripts\") pod \"cinder-8e80-account-create-update-t9nz7\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-config-data\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sb58\" (UniqueName: \"kubernetes.io/projected/3602a2ea-5d10-4d05-9963-99da2382de51-kube-api-access-2sb58\") pod \"cinder-8e80-account-create-update-t9nz7\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.152790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghwx\" (UniqueName: \"kubernetes.io/projected/f26df112-b756-487f-ad9b-5d39ed9f1754-kube-api-access-8ghwx\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.153370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e972c93b-19a1-455e-a134-ba9bd7f56854-operator-scripts\") pod 
\"barbican-5b6c-account-create-update-84qjg\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.175044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndgm5\" (UniqueName: \"kubernetes.io/projected/e972c93b-19a1-455e-a134-ba9bd7f56854-kube-api-access-ndgm5\") pod \"barbican-5b6c-account-create-update-84qjg\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.215454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.253885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-operator-scripts\") pod \"neutron-db-create-jzlsb\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.253937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3602a2ea-5d10-4d05-9963-99da2382de51-operator-scripts\") pod \"cinder-8e80-account-create-update-t9nz7\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.253965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-config-data\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.253992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sb58\" (UniqueName: \"kubernetes.io/projected/3602a2ea-5d10-4d05-9963-99da2382de51-kube-api-access-2sb58\") pod \"cinder-8e80-account-create-update-t9nz7\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.254578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3602a2ea-5d10-4d05-9963-99da2382de51-operator-scripts\") pod \"cinder-8e80-account-create-update-t9nz7\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.254931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghwx\" (UniqueName: \"kubernetes.io/projected/f26df112-b756-487f-ad9b-5d39ed9f1754-kube-api-access-8ghwx\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.260171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-config-data\") pod 
\"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.260267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-combined-ca-bundle\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.260308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8sm\" (UniqueName: \"kubernetes.io/projected/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-kube-api-access-6g8sm\") pod \"neutron-db-create-jzlsb\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.264006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-combined-ca-bundle\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.277072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghwx\" (UniqueName: \"kubernetes.io/projected/f26df112-b756-487f-ad9b-5d39ed9f1754-kube-api-access-8ghwx\") pod \"keystone-db-sync-d6kbk\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.281891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sb58\" (UniqueName: \"kubernetes.io/projected/3602a2ea-5d10-4d05-9963-99da2382de51-kube-api-access-2sb58\") pod \"cinder-8e80-account-create-update-t9nz7\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.329362 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.362177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8sm\" (UniqueName: \"kubernetes.io/projected/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-kube-api-access-6g8sm\") pod \"neutron-db-create-jzlsb\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.362236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-operator-scripts\") pod \"neutron-db-create-jzlsb\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.363860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-operator-scripts\") pod \"neutron-db-create-jzlsb\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.375653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8sm\" (UniqueName: \"kubernetes.io/projected/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-kube-api-access-6g8sm\") pod \"neutron-db-create-jzlsb\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.411653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.415553 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.492690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-qcqbl"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.558431 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-jrrrh"] Jan 21 15:31:39 crc kubenswrapper[4707]: W0121 15:31:39.567050 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ab2b0b7_4f0b_49d9_b217_893b083a786c.slice/crio-f8c62ac174a794f2ecc07ae300cce28e2ad6005cbd8966d103af1ad246490ec4 WatchSource:0}: Error finding container f8c62ac174a794f2ecc07ae300cce28e2ad6005cbd8966d103af1ad246490ec4: Status 404 returned error can't find the container with id f8c62ac174a794f2ecc07ae300cce28e2ad6005cbd8966d103af1ad246490ec4 Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.619904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lmmvb"] Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.625612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-69db-account-create-update-p864k"] Jan 21 15:31:39 crc kubenswrapper[4707]: W0121 15:31:39.627399 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod811c1581_db32_4ad5_abfd_cb2bd09575d4.slice/crio-3cdd972148ae68be22fda8da93b0c66ad8be6982103d332370d7152dc22bdbf3 WatchSource:0}: Error finding container 3cdd972148ae68be22fda8da93b0c66ad8be6982103d332370d7152dc22bdbf3: Status 404 returned error can't find the container with id 3cdd972148ae68be22fda8da93b0c66ad8be6982103d332370d7152dc22bdbf3 Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.726475 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg"] Jan 21 15:31:39 crc kubenswrapper[4707]: W0121 15:31:39.729522 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode972c93b_19a1_455e_a134_ba9bd7f56854.slice/crio-92516b02103bb49b98a1ff8167c2281e84736db0fcc43385163f364f12a5f2cb WatchSource:0}: Error finding container 92516b02103bb49b98a1ff8167c2281e84736db0fcc43385163f364f12a5f2cb: Status 404 returned error can't find the container with id 92516b02103bb49b98a1ff8167c2281e84736db0fcc43385163f364f12a5f2cb Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.778832 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7"] Jan 21 15:31:39 crc kubenswrapper[4707]: W0121 15:31:39.791567 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3602a2ea_5d10_4d05_9963_99da2382de51.slice/crio-227139ecd11074dbb1e0cc4a6886abb47dc1ed7cdd43fccd47cbd11e0903834b WatchSource:0}: Error finding container 227139ecd11074dbb1e0cc4a6886abb47dc1ed7cdd43fccd47cbd11e0903834b: Status 404 returned error can't find the container with id 227139ecd11074dbb1e0cc4a6886abb47dc1ed7cdd43fccd47cbd11e0903834b Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.867568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-d6kbk"] Jan 21 15:31:39 crc kubenswrapper[4707]: W0121 15:31:39.875535 4707 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26df112_b756_487f_ad9b_5d39ed9f1754.slice/crio-f20b83e3b747ef18225c5ad0d599f2b71765f78e889da68806e1ea0bc3f3c5c7 WatchSource:0}: Error finding container f20b83e3b747ef18225c5ad0d599f2b71765f78e889da68806e1ea0bc3f3c5c7: Status 404 returned error can't find the container with id f20b83e3b747ef18225c5ad0d599f2b71765f78e889da68806e1ea0bc3f3c5c7 Jan 21 15:31:39 crc kubenswrapper[4707]: I0121 15:31:39.883485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jzlsb"] Jan 21 15:31:39 crc kubenswrapper[4707]: W0121 15:31:39.887253 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77cbdfe0_44a8_4d32_8cf9_f050e71e5299.slice/crio-2e0269480158f97a6afc8d15ce3aecaad3397a6273964cc32cafc9af63df864d WatchSource:0}: Error finding container 2e0269480158f97a6afc8d15ce3aecaad3397a6273964cc32cafc9af63df864d: Status 404 returned error can't find the container with id 2e0269480158f97a6afc8d15ce3aecaad3397a6273964cc32cafc9af63df864d Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.443688 4707 generic.go:334] "Generic (PLEG): container finished" podID="e972c93b-19a1-455e-a134-ba9bd7f56854" containerID="aa0e78e27cfa36ab1db2cdd44a88c11e0e7e739c2624edfaec77ac05d1ae9737" exitCode=0 Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.443753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" event={"ID":"e972c93b-19a1-455e-a134-ba9bd7f56854","Type":"ContainerDied","Data":"aa0e78e27cfa36ab1db2cdd44a88c11e0e7e739c2624edfaec77ac05d1ae9737"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.443778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" event={"ID":"e972c93b-19a1-455e-a134-ba9bd7f56854","Type":"ContainerStarted","Data":"92516b02103bb49b98a1ff8167c2281e84736db0fcc43385163f364f12a5f2cb"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.448733 4707 generic.go:334] "Generic (PLEG): container finished" podID="3602a2ea-5d10-4d05-9963-99da2382de51" containerID="d554013233633c782741923e73a8d8723a0219cb7dc1c3c0326bf28aba43877a" exitCode=0 Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.448787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" event={"ID":"3602a2ea-5d10-4d05-9963-99da2382de51","Type":"ContainerDied","Data":"d554013233633c782741923e73a8d8723a0219cb7dc1c3c0326bf28aba43877a"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.448824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" event={"ID":"3602a2ea-5d10-4d05-9963-99da2382de51","Type":"ContainerStarted","Data":"227139ecd11074dbb1e0cc4a6886abb47dc1ed7cdd43fccd47cbd11e0903834b"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.451854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" event={"ID":"5ab2b0b7-4f0b-49d9-b217-893b083a786c","Type":"ContainerStarted","Data":"f8c62ac174a794f2ecc07ae300cce28e2ad6005cbd8966d103af1ad246490ec4"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.457204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" 
event={"ID":"f26df112-b756-487f-ad9b-5d39ed9f1754","Type":"ContainerStarted","Data":"37412d7e5fd2bcf3a43d5fa7b581590cb3a673423d13c7149b4e26b6f815c70f"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.457229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" event={"ID":"f26df112-b756-487f-ad9b-5d39ed9f1754","Type":"ContainerStarted","Data":"f20b83e3b747ef18225c5ad0d599f2b71765f78e889da68806e1ea0bc3f3c5c7"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.460375 4707 generic.go:334] "Generic (PLEG): container finished" podID="24548ac0-7326-443b-ad9c-81ace3a3dd45" containerID="97d6d731114606325ecfbbec11eeb91fce426d49196a794390a7e9a4c8b3edd5" exitCode=0 Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.460460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" event={"ID":"24548ac0-7326-443b-ad9c-81ace3a3dd45","Type":"ContainerDied","Data":"97d6d731114606325ecfbbec11eeb91fce426d49196a794390a7e9a4c8b3edd5"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.460486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" event={"ID":"24548ac0-7326-443b-ad9c-81ace3a3dd45","Type":"ContainerStarted","Data":"df3094247dc422e3d49f60b50ce563c083f6d8b8432b1cdbd07901a2069df569"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.461932 4707 generic.go:334] "Generic (PLEG): container finished" podID="77cbdfe0-44a8-4d32-8cf9-f050e71e5299" containerID="e13b9cba489637d49d8262a8ecc2efb194e4e6f29440a9c7831758ea2e38f05e" exitCode=0 Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.461954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" event={"ID":"77cbdfe0-44a8-4d32-8cf9-f050e71e5299","Type":"ContainerDied","Data":"e13b9cba489637d49d8262a8ecc2efb194e4e6f29440a9c7831758ea2e38f05e"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.461979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" event={"ID":"77cbdfe0-44a8-4d32-8cf9-f050e71e5299","Type":"ContainerStarted","Data":"2e0269480158f97a6afc8d15ce3aecaad3397a6273964cc32cafc9af63df864d"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.463299 4707 generic.go:334] "Generic (PLEG): container finished" podID="811c1581-db32-4ad5-abfd-cb2bd09575d4" containerID="9314272c08eb3f5ed1dc4224ebb999ae74acb5f87849d8be469e0d3cd8c9eb28" exitCode=0 Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.463353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" event={"ID":"811c1581-db32-4ad5-abfd-cb2bd09575d4","Type":"ContainerDied","Data":"9314272c08eb3f5ed1dc4224ebb999ae74acb5f87849d8be469e0d3cd8c9eb28"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.463377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" event={"ID":"811c1581-db32-4ad5-abfd-cb2bd09575d4","Type":"ContainerStarted","Data":"3cdd972148ae68be22fda8da93b0c66ad8be6982103d332370d7152dc22bdbf3"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.467836 4707 generic.go:334] "Generic (PLEG): container finished" podID="10033319-b8c1-4443-9604-9e94be4e4159" containerID="5bedc67db4a2e84d7e5a7540beeaab9c861fcf915bd616f51818ad66fc63504e" exitCode=0 Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.467931 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" event={"ID":"10033319-b8c1-4443-9604-9e94be4e4159","Type":"ContainerDied","Data":"5bedc67db4a2e84d7e5a7540beeaab9c861fcf915bd616f51818ad66fc63504e"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.468000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" event={"ID":"10033319-b8c1-4443-9604-9e94be4e4159","Type":"ContainerStarted","Data":"3f44364dc64dea53c614fdd3372ea80939ceb69d5031dd77e30a46ba4824f787"} Jan 21 15:31:40 crc kubenswrapper[4707]: I0121 15:31:40.496229 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" podStartSLOduration=1.49621316 podStartE2EDuration="1.49621316s" podCreationTimestamp="2026-01-21 15:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:40.483865343 +0000 UTC m=+1797.665381565" watchObservedRunningTime="2026-01-21 15:31:40.49621316 +0000 UTC m=+1797.677729382" Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.788478 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.809520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e972c93b-19a1-455e-a134-ba9bd7f56854-operator-scripts\") pod \"e972c93b-19a1-455e-a134-ba9bd7f56854\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.809623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndgm5\" (UniqueName: \"kubernetes.io/projected/e972c93b-19a1-455e-a134-ba9bd7f56854-kube-api-access-ndgm5\") pod \"e972c93b-19a1-455e-a134-ba9bd7f56854\" (UID: \"e972c93b-19a1-455e-a134-ba9bd7f56854\") " Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.811102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e972c93b-19a1-455e-a134-ba9bd7f56854-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e972c93b-19a1-455e-a134-ba9bd7f56854" (UID: "e972c93b-19a1-455e-a134-ba9bd7f56854"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.835436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e972c93b-19a1-455e-a134-ba9bd7f56854-kube-api-access-ndgm5" (OuterVolumeSpecName: "kube-api-access-ndgm5") pod "e972c93b-19a1-455e-a134-ba9bd7f56854" (UID: "e972c93b-19a1-455e-a134-ba9bd7f56854"). InnerVolumeSpecName "kube-api-access-ndgm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.910977 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e972c93b-19a1-455e-a134-ba9bd7f56854-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:41 crc kubenswrapper[4707]: I0121 15:31:41.911004 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndgm5\" (UniqueName: \"kubernetes.io/projected/e972c93b-19a1-455e-a134-ba9bd7f56854-kube-api-access-ndgm5\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:42 crc kubenswrapper[4707]: I0121 15:31:42.482631 4707 generic.go:334] "Generic (PLEG): container finished" podID="f26df112-b756-487f-ad9b-5d39ed9f1754" containerID="37412d7e5fd2bcf3a43d5fa7b581590cb3a673423d13c7149b4e26b6f815c70f" exitCode=0 Jan 21 15:31:42 crc kubenswrapper[4707]: I0121 15:31:42.482684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" event={"ID":"f26df112-b756-487f-ad9b-5d39ed9f1754","Type":"ContainerDied","Data":"37412d7e5fd2bcf3a43d5fa7b581590cb3a673423d13c7149b4e26b6f815c70f"} Jan 21 15:31:42 crc kubenswrapper[4707]: I0121 15:31:42.484273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" event={"ID":"e972c93b-19a1-455e-a134-ba9bd7f56854","Type":"ContainerDied","Data":"92516b02103bb49b98a1ff8167c2281e84736db0fcc43385163f364f12a5f2cb"} Jan 21 15:31:42 crc kubenswrapper[4707]: I0121 15:31:42.484297 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92516b02103bb49b98a1ff8167c2281e84736db0fcc43385163f364f12a5f2cb" Jan 21 15:31:42 crc kubenswrapper[4707]: I0121 15:31:42.484327 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.196846 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:31:43 crc kubenswrapper[4707]: E0121 15:31:43.200834 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.491829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" event={"ID":"10033319-b8c1-4443-9604-9e94be4e4159","Type":"ContainerDied","Data":"3f44364dc64dea53c614fdd3372ea80939ceb69d5031dd77e30a46ba4824f787"} Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.491864 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f44364dc64dea53c614fdd3372ea80939ceb69d5031dd77e30a46ba4824f787" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.493533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" event={"ID":"3602a2ea-5d10-4d05-9963-99da2382de51","Type":"ContainerDied","Data":"227139ecd11074dbb1e0cc4a6886abb47dc1ed7cdd43fccd47cbd11e0903834b"} Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.493572 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="227139ecd11074dbb1e0cc4a6886abb47dc1ed7cdd43fccd47cbd11e0903834b" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.494188 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.494965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" event={"ID":"24548ac0-7326-443b-ad9c-81ace3a3dd45","Type":"ContainerDied","Data":"df3094247dc422e3d49f60b50ce563c083f6d8b8432b1cdbd07901a2069df569"} Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.494987 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df3094247dc422e3d49f60b50ce563c083f6d8b8432b1cdbd07901a2069df569" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.496712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" event={"ID":"77cbdfe0-44a8-4d32-8cf9-f050e71e5299","Type":"ContainerDied","Data":"2e0269480158f97a6afc8d15ce3aecaad3397a6273964cc32cafc9af63df864d"} Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.496741 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e0269480158f97a6afc8d15ce3aecaad3397a6273964cc32cafc9af63df864d" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.498255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" event={"ID":"811c1581-db32-4ad5-abfd-cb2bd09575d4","Type":"ContainerDied","Data":"3cdd972148ae68be22fda8da93b0c66ad8be6982103d332370d7152dc22bdbf3"} Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.498290 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cdd972148ae68be22fda8da93b0c66ad8be6982103d332370d7152dc22bdbf3" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.499638 4707 generic.go:334] "Generic (PLEG): container finished" podID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerID="659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23" exitCode=0 Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.499704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerDied","Data":"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23"} Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.502901 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.509786 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.511517 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.517514 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fs2q\" (UniqueName: \"kubernetes.io/projected/24548ac0-7326-443b-ad9c-81ace3a3dd45-kube-api-access-9fs2q\") pod \"24548ac0-7326-443b-ad9c-81ace3a3dd45\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkx2p\" (UniqueName: \"kubernetes.io/projected/10033319-b8c1-4443-9604-9e94be4e4159-kube-api-access-xkx2p\") pod \"10033319-b8c1-4443-9604-9e94be4e4159\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811c1581-db32-4ad5-abfd-cb2bd09575d4-operator-scripts\") pod \"811c1581-db32-4ad5-abfd-cb2bd09575d4\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630880 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g8sm\" (UniqueName: \"kubernetes.io/projected/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-kube-api-access-6g8sm\") pod \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fglff\" (UniqueName: \"kubernetes.io/projected/811c1581-db32-4ad5-abfd-cb2bd09575d4-kube-api-access-fglff\") pod \"811c1581-db32-4ad5-abfd-cb2bd09575d4\" (UID: \"811c1581-db32-4ad5-abfd-cb2bd09575d4\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sb58\" (UniqueName: \"kubernetes.io/projected/3602a2ea-5d10-4d05-9963-99da2382de51-kube-api-access-2sb58\") pod \"3602a2ea-5d10-4d05-9963-99da2382de51\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.630982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3602a2ea-5d10-4d05-9963-99da2382de51-operator-scripts\") pod \"3602a2ea-5d10-4d05-9963-99da2382de51\" (UID: \"3602a2ea-5d10-4d05-9963-99da2382de51\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.631019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24548ac0-7326-443b-ad9c-81ace3a3dd45-operator-scripts\") pod \"24548ac0-7326-443b-ad9c-81ace3a3dd45\" (UID: \"24548ac0-7326-443b-ad9c-81ace3a3dd45\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.631033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-operator-scripts\") pod \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\" (UID: \"77cbdfe0-44a8-4d32-8cf9-f050e71e5299\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.631049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/10033319-b8c1-4443-9604-9e94be4e4159-operator-scripts\") pod \"10033319-b8c1-4443-9604-9e94be4e4159\" (UID: \"10033319-b8c1-4443-9604-9e94be4e4159\") " Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.631656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3602a2ea-5d10-4d05-9963-99da2382de51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3602a2ea-5d10-4d05-9963-99da2382de51" (UID: "3602a2ea-5d10-4d05-9963-99da2382de51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.632049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/811c1581-db32-4ad5-abfd-cb2bd09575d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "811c1581-db32-4ad5-abfd-cb2bd09575d4" (UID: "811c1581-db32-4ad5-abfd-cb2bd09575d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.632429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24548ac0-7326-443b-ad9c-81ace3a3dd45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24548ac0-7326-443b-ad9c-81ace3a3dd45" (UID: "24548ac0-7326-443b-ad9c-81ace3a3dd45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.632432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77cbdfe0-44a8-4d32-8cf9-f050e71e5299" (UID: "77cbdfe0-44a8-4d32-8cf9-f050e71e5299"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.632719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10033319-b8c1-4443-9604-9e94be4e4159-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10033319-b8c1-4443-9604-9e94be4e4159" (UID: "10033319-b8c1-4443-9604-9e94be4e4159"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.648681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3602a2ea-5d10-4d05-9963-99da2382de51-kube-api-access-2sb58" (OuterVolumeSpecName: "kube-api-access-2sb58") pod "3602a2ea-5d10-4d05-9963-99da2382de51" (UID: "3602a2ea-5d10-4d05-9963-99da2382de51"). InnerVolumeSpecName "kube-api-access-2sb58". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.648706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-kube-api-access-6g8sm" (OuterVolumeSpecName: "kube-api-access-6g8sm") pod "77cbdfe0-44a8-4d32-8cf9-f050e71e5299" (UID: "77cbdfe0-44a8-4d32-8cf9-f050e71e5299"). InnerVolumeSpecName "kube-api-access-6g8sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.648720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24548ac0-7326-443b-ad9c-81ace3a3dd45-kube-api-access-9fs2q" (OuterVolumeSpecName: "kube-api-access-9fs2q") pod "24548ac0-7326-443b-ad9c-81ace3a3dd45" (UID: "24548ac0-7326-443b-ad9c-81ace3a3dd45"). InnerVolumeSpecName "kube-api-access-9fs2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.649050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811c1581-db32-4ad5-abfd-cb2bd09575d4-kube-api-access-fglff" (OuterVolumeSpecName: "kube-api-access-fglff") pod "811c1581-db32-4ad5-abfd-cb2bd09575d4" (UID: "811c1581-db32-4ad5-abfd-cb2bd09575d4"). InnerVolumeSpecName "kube-api-access-fglff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.649183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10033319-b8c1-4443-9604-9e94be4e4159-kube-api-access-xkx2p" (OuterVolumeSpecName: "kube-api-access-xkx2p") pod "10033319-b8c1-4443-9604-9e94be4e4159" (UID: "10033319-b8c1-4443-9604-9e94be4e4159"). InnerVolumeSpecName "kube-api-access-xkx2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733146 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fglff\" (UniqueName: \"kubernetes.io/projected/811c1581-db32-4ad5-abfd-cb2bd09575d4-kube-api-access-fglff\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733179 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sb58\" (UniqueName: \"kubernetes.io/projected/3602a2ea-5d10-4d05-9963-99da2382de51-kube-api-access-2sb58\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733190 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3602a2ea-5d10-4d05-9963-99da2382de51-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733198 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24548ac0-7326-443b-ad9c-81ace3a3dd45-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733206 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733213 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10033319-b8c1-4443-9604-9e94be4e4159-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733221 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fs2q\" (UniqueName: \"kubernetes.io/projected/24548ac0-7326-443b-ad9c-81ace3a3dd45-kube-api-access-9fs2q\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733228 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkx2p\" (UniqueName: 
\"kubernetes.io/projected/10033319-b8c1-4443-9604-9e94be4e4159-kube-api-access-xkx2p\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733235 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811c1581-db32-4ad5-abfd-cb2bd09575d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:43 crc kubenswrapper[4707]: I0121 15:31:43.733243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g8sm\" (UniqueName: \"kubernetes.io/projected/77cbdfe0-44a8-4d32-8cf9-f050e71e5299-kube-api-access-6g8sm\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.205116 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.340879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ghwx\" (UniqueName: \"kubernetes.io/projected/f26df112-b756-487f-ad9b-5d39ed9f1754-kube-api-access-8ghwx\") pod \"f26df112-b756-487f-ad9b-5d39ed9f1754\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.340939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-combined-ca-bundle\") pod \"f26df112-b756-487f-ad9b-5d39ed9f1754\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.340991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-config-data\") pod \"f26df112-b756-487f-ad9b-5d39ed9f1754\" (UID: \"f26df112-b756-487f-ad9b-5d39ed9f1754\") " Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.344029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26df112-b756-487f-ad9b-5d39ed9f1754-kube-api-access-8ghwx" (OuterVolumeSpecName: "kube-api-access-8ghwx") pod "f26df112-b756-487f-ad9b-5d39ed9f1754" (UID: "f26df112-b756-487f-ad9b-5d39ed9f1754"). InnerVolumeSpecName "kube-api-access-8ghwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.358071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f26df112-b756-487f-ad9b-5d39ed9f1754" (UID: "f26df112-b756-487f-ad9b-5d39ed9f1754"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.371536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-config-data" (OuterVolumeSpecName: "config-data") pod "f26df112-b756-487f-ad9b-5d39ed9f1754" (UID: "f26df112-b756-487f-ad9b-5d39ed9f1754"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.443363 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ghwx\" (UniqueName: \"kubernetes.io/projected/f26df112-b756-487f-ad9b-5d39ed9f1754-kube-api-access-8ghwx\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.443390 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.443401 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26df112-b756-487f-ad9b-5d39ed9f1754-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.519429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jzlsb" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.522236 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-lmmvb" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.524738 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.527546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-d6kbk" event={"ID":"f26df112-b756-487f-ad9b-5d39ed9f1754","Type":"ContainerDied","Data":"f20b83e3b747ef18225c5ad0d599f2b71765f78e889da68806e1ea0bc3f3c5c7"} Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.527634 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f20b83e3b747ef18225c5ad0d599f2b71765f78e889da68806e1ea0bc3f3c5c7" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.527994 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-qcqbl" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.528213 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.529115 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-69db-account-create-update-p864k" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.674912 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dvz8h"] Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675510 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24548ac0-7326-443b-ad9c-81ace3a3dd45" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675531 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24548ac0-7326-443b-ad9c-81ace3a3dd45" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675551 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10033319-b8c1-4443-9604-9e94be4e4159" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10033319-b8c1-4443-9604-9e94be4e4159" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811c1581-db32-4ad5-abfd-cb2bd09575d4" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675578 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="811c1581-db32-4ad5-abfd-cb2bd09575d4" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675605 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26df112-b756-487f-ad9b-5d39ed9f1754" containerName="keystone-db-sync" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675612 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26df112-b756-487f-ad9b-5d39ed9f1754" containerName="keystone-db-sync" Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675625 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cbdfe0-44a8-4d32-8cf9-f050e71e5299" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675632 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cbdfe0-44a8-4d32-8cf9-f050e71e5299" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675649 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e972c93b-19a1-455e-a134-ba9bd7f56854" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e972c93b-19a1-455e-a134-ba9bd7f56854" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: E0121 15:31:44.675669 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3602a2ea-5d10-4d05-9963-99da2382de51" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.675676 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3602a2ea-5d10-4d05-9963-99da2382de51" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677514 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77cbdfe0-44a8-4d32-8cf9-f050e71e5299" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677545 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f26df112-b756-487f-ad9b-5d39ed9f1754" containerName="keystone-db-sync" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e972c93b-19a1-455e-a134-ba9bd7f56854" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677579 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10033319-b8c1-4443-9604-9e94be4e4159" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677596 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="811c1581-db32-4ad5-abfd-cb2bd09575d4" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677613 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24548ac0-7326-443b-ad9c-81ace3a3dd45" containerName="mariadb-database-create" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.677630 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3602a2ea-5d10-4d05-9963-99da2382de51" containerName="mariadb-account-create-update" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.678341 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.681309 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.681504 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-hswdl" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.681645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.681760 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.681823 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.684007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dvz8h"] Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.762735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-combined-ca-bundle\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.762861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-credential-keys\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.762886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-config-data\") pod \"keystone-bootstrap-dvz8h\" (UID: 
\"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.762910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-scripts\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.762965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-fernet-keys\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.763005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/1e1fae94-8cda-4956-bab0-04acd48d8d14-kube-api-access-rv5b6\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.775099 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.778055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.789554 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.789708 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.799396 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.854656 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dvtbt"] Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.855575 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.863041 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.863233 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.863405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-dk2bl" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962jn\" (UniqueName: \"kubernetes.io/projected/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-kube-api-access-962jn\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-combined-ca-bundle\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-run-httpd\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-scripts\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-credential-keys\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-config-data\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-scripts\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-config-data\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-fernet-keys\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-log-httpd\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/1e1fae94-8cda-4956-bab0-04acd48d8d14-kube-api-access-rv5b6\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.864365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dvtbt"] Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.869180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-config-data\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.875044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-scripts\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.877127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-fernet-keys\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.877366 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-combined-ca-bundle\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.878489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-credential-keys\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.880366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/1e1fae94-8cda-4956-bab0-04acd48d8d14-kube-api-access-rv5b6\") pod \"keystone-bootstrap-dvz8h\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-combined-ca-bundle\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-config-data\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhsc\" (UniqueName: \"kubernetes.io/projected/73c8603f-b572-4c68-a083-e68ebd2faa67-kube-api-access-xhhsc\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-log-httpd\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965612 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-scripts\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962jn\" (UniqueName: \"kubernetes.io/projected/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-kube-api-access-962jn\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c8603f-b572-4c68-a083-e68ebd2faa67-logs\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-run-httpd\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-scripts\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.965800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-config-data\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.966671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-run-httpd\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.967370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-log-httpd\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.969407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-scripts\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.969590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-config-data\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.973037 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.975933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:44 crc kubenswrapper[4707]: I0121 15:31:44.979512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962jn\" (UniqueName: \"kubernetes.io/projected/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-kube-api-access-962jn\") pod \"ceilometer-0\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.001132 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.066850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c8603f-b572-4c68-a083-e68ebd2faa67-logs\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.066973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-combined-ca-bundle\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.066994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-config-data\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.067016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhsc\" (UniqueName: \"kubernetes.io/projected/73c8603f-b572-4c68-a083-e68ebd2faa67-kube-api-access-xhhsc\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.067053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-scripts\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.067272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c8603f-b572-4c68-a083-e68ebd2faa67-logs\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc 
kubenswrapper[4707]: I0121 15:31:45.071051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-combined-ca-bundle\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.071180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-scripts\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.072285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-config-data\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.082340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhsc\" (UniqueName: \"kubernetes.io/projected/73c8603f-b572-4c68-a083-e68ebd2faa67-kube-api-access-xhhsc\") pod \"placement-db-sync-dvtbt\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.111982 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.220796 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.730344 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.731996 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.733782 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-9hvs4" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.738308 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.739135 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.739158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.739218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.774686 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.776058 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.778865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.778917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.778975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs62\" (UniqueName: \"kubernetes.io/projected/fec8458e-7f99-4bbf-bb1e-f4e855189154-kube-api-access-fqs62\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.779049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-logs\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.779071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.779094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.779147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-scripts\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.779217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-config-data\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.780707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.781351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.781427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqs62\" (UniqueName: \"kubernetes.io/projected/fec8458e-7f99-4bbf-bb1e-f4e855189154-kube-api-access-fqs62\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880822 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-logs\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.880968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-config-data\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmt2\" (UniqueName: \"kubernetes.io/projected/1ef058e4-6e88-4c98-885f-cce6c978ce50-kube-api-access-sxmt2\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-logs\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.881999 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.886862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.888110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-scripts\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.888365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.889743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-config-data\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.894448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqs62\" (UniqueName: \"kubernetes.io/projected/fec8458e-7f99-4bbf-bb1e-f4e855189154-kube-api-access-fqs62\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.901146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.982799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.982937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.982971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmt2\" (UniqueName: \"kubernetes.io/projected/1ef058e4-6e88-4c98-885f-cce6c978ce50-kube-api-access-sxmt2\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.983949 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.984561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.986642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.987301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.988088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.989954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:45 crc kubenswrapper[4707]: I0121 15:31:45.997320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmt2\" (UniqueName: \"kubernetes.io/projected/1ef058e4-6e88-4c98-885f-cce6c978ce50-kube-api-access-sxmt2\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:46 crc kubenswrapper[4707]: I0121 15:31:46.003627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:46 crc kubenswrapper[4707]: I0121 15:31:46.052370 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:46 crc kubenswrapper[4707]: I0121 15:31:46.090479 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:46 crc kubenswrapper[4707]: I0121 15:31:46.497775 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:46 crc kubenswrapper[4707]: I0121 15:31:46.550402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:46 crc kubenswrapper[4707]: I0121 15:31:46.565478 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.218266 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-sbh8l"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.219380 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.222955 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-xmj86" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.222989 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.225715 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-sbh8l"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.228059 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s87zt\" (UniqueName: \"kubernetes.io/projected/37cd4c00-bb9a-406e-af6c-427a36af5e74-kube-api-access-s87zt\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.228335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-db-sync-config-data\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.228782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-combined-ca-bundle\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.312920 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-rt85p"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.313756 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.316700 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.316839 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.320940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-jvd5f" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.322002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-rt85p"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-combined-ca-bundle\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htfw\" (UniqueName: \"kubernetes.io/projected/fb71e17c-c141-4845-a86e-c7d8643681e0-kube-api-access-5htfw\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb71e17c-c141-4845-a86e-c7d8643681e0-etc-machine-id\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87zt\" (UniqueName: \"kubernetes.io/projected/37cd4c00-bb9a-406e-af6c-427a36af5e74-kube-api-access-s87zt\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-combined-ca-bundle\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-db-sync-config-data\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-config-data\") pod \"cinder-db-sync-rt85p\" (UID: 
\"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-scripts\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.330970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-db-sync-config-data\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.335488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-combined-ca-bundle\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.344040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-db-sync-config-data\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.349392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87zt\" (UniqueName: \"kubernetes.io/projected/37cd4c00-bb9a-406e-af6c-427a36af5e74-kube-api-access-s87zt\") pod \"barbican-db-sync-sbh8l\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.435754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htfw\" (UniqueName: \"kubernetes.io/projected/fb71e17c-c141-4845-a86e-c7d8643681e0-kube-api-access-5htfw\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.435961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb71e17c-c141-4845-a86e-c7d8643681e0-etc-machine-id\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.436120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb71e17c-c141-4845-a86e-c7d8643681e0-etc-machine-id\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.436162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-combined-ca-bundle\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " 
pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.436203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-db-sync-config-data\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.436265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-config-data\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.436286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-scripts\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.440738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-scripts\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.444320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-db-sync-config-data\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.445094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-config-data\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.445520 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-n7pdl"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.446185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-combined-ca-bundle\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.447186 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.449695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htfw\" (UniqueName: \"kubernetes.io/projected/fb71e17c-c141-4845-a86e-c7d8643681e0-kube-api-access-5htfw\") pod \"cinder-db-sync-rt85p\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.450729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.451017 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-9kpsx" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.451581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.454596 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-n7pdl"] Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.537574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-combined-ca-bundle\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.537658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f74wt\" (UniqueName: \"kubernetes.io/projected/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-kube-api-access-f74wt\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.537679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-config\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.538007 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.632514 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.638945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f74wt\" (UniqueName: \"kubernetes.io/projected/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-kube-api-access-f74wt\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.638973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-config\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.639038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-combined-ca-bundle\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.643834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-combined-ca-bundle\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.643913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-config\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.652058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f74wt\" (UniqueName: \"kubernetes.io/projected/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-kube-api-access-f74wt\") pod \"neutron-db-sync-n7pdl\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:49 crc kubenswrapper[4707]: I0121 15:31:49.800652 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.565910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" event={"ID":"5ab2b0b7-4f0b-49d9-b217-893b083a786c","Type":"ContainerStarted","Data":"af3f3dd74a9c8d48aa97c107bfa8b058c3e1d0de4496d824674dc535fe8ed785"} Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.577021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerStarted","Data":"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24"} Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.590315 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" podStartSLOduration=1.97048697 podStartE2EDuration="12.590301266s" podCreationTimestamp="2026-01-21 15:31:38 +0000 UTC" firstStartedPulling="2026-01-21 15:31:39.568890047 +0000 UTC m=+1796.750406269" lastFinishedPulling="2026-01-21 15:31:50.188704343 +0000 UTC m=+1807.370220565" observedRunningTime="2026-01-21 15:31:50.588052458 +0000 UTC m=+1807.769568679" watchObservedRunningTime="2026-01-21 15:31:50.590301266 +0000 UTC m=+1807.771817488" Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.619488 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dvtbt"] Jan 21 15:31:50 crc kubenswrapper[4707]: W0121 15:31:50.633226 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c8603f_b572_4c68_a083_e68ebd2faa67.slice/crio-469119bc7ecbc6b33abe1c60299ebef6e62ec4691ebcd1789210102488ec4fc1 WatchSource:0}: Error finding container 469119bc7ecbc6b33abe1c60299ebef6e62ec4691ebcd1789210102488ec4fc1: Status 404 returned error can't find the container with id 469119bc7ecbc6b33abe1c60299ebef6e62ec4691ebcd1789210102488ec4fc1 Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.705550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dvz8h"] Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.722031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-rt85p"] Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.728470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:50 crc kubenswrapper[4707]: W0121 15:31:50.733677 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb71e17c_c141_4845_a86e_c7d8643681e0.slice/crio-d54e98b42bc1f5cf3133dc616ba41a2ed7ac91c5b60c647811b00d845882023d WatchSource:0}: Error finding container d54e98b42bc1f5cf3133dc616ba41a2ed7ac91c5b60c647811b00d845882023d: Status 404 returned error can't find the container with id d54e98b42bc1f5cf3133dc616ba41a2ed7ac91c5b60c647811b00d845882023d Jan 21 15:31:50 crc kubenswrapper[4707]: W0121 15:31:50.738308 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ef058e4_6e88_4c98_885f_cce6c978ce50.slice/crio-eea358a716229f867bdfb37e0f8c8272ecbffc6b309e2d7fc05ef9b2aa19015e WatchSource:0}: Error finding container eea358a716229f867bdfb37e0f8c8272ecbffc6b309e2d7fc05ef9b2aa19015e: Status 404 
returned error can't find the container with id eea358a716229f867bdfb37e0f8c8272ecbffc6b309e2d7fc05ef9b2aa19015e Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.860186 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-n7pdl"] Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.871701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-sbh8l"] Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.878943 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.885068 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:50 crc kubenswrapper[4707]: I0121 15:31:50.887519 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:31:50 crc kubenswrapper[4707]: W0121 15:31:50.890242 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfec8458e_7f99_4bbf_bb1e_f4e855189154.slice/crio-6349f7200f6dda09bf3067d20f870049a74e9027665aaee021fe08d6d1bae9cd WatchSource:0}: Error finding container 6349f7200f6dda09bf3067d20f870049a74e9027665aaee021fe08d6d1bae9cd: Status 404 returned error can't find the container with id 6349f7200f6dda09bf3067d20f870049a74e9027665aaee021fe08d6d1bae9cd Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.552891 4707 scope.go:117] "RemoveContainer" containerID="a71ad25a4023689928703304509832c24880dba0eaca081e4ac7b84a9f39e2d7" Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.596767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fec8458e-7f99-4bbf-bb1e-f4e855189154","Type":"ContainerStarted","Data":"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.596826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fec8458e-7f99-4bbf-bb1e-f4e855189154","Type":"ContainerStarted","Data":"6349f7200f6dda09bf3067d20f870049a74e9027665aaee021fe08d6d1bae9cd"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.613674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" event={"ID":"73c8603f-b572-4c68-a083-e68ebd2faa67","Type":"ContainerStarted","Data":"d63fd0beea1df3c3498f461cec49c9f5f0eccbac73e86aa32c9f464f4767b393"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.613715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" event={"ID":"73c8603f-b572-4c68-a083-e68ebd2faa67","Type":"ContainerStarted","Data":"469119bc7ecbc6b33abe1c60299ebef6e62ec4691ebcd1789210102488ec4fc1"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.616626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" event={"ID":"37cd4c00-bb9a-406e-af6c-427a36af5e74","Type":"ContainerStarted","Data":"9f4c9093ac2066cbb5d4e1f9ec419fe00db495ed354586dc8985b8d701102d20"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.616652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" 
event={"ID":"37cd4c00-bb9a-406e-af6c-427a36af5e74","Type":"ContainerStarted","Data":"60577bafc57986dee471737abbfbba1a8c655e50aa0d0b6f55fa9f684c777120"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.621456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" event={"ID":"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984","Type":"ContainerStarted","Data":"ff17dfb054823684830a7337448f62fc48f01c6fe76d2357e00911976e1b3f4e"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.621485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" event={"ID":"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984","Type":"ContainerStarted","Data":"53e0d516f1bbb339da9206f50c5c1edb59e31e6faed7826dab5177c4f72b16e8"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.623072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" event={"ID":"1e1fae94-8cda-4956-bab0-04acd48d8d14","Type":"ContainerStarted","Data":"d769b7a269a2696c705c404c162be86ce7e930410be6def7eb2ac8306f38854a"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.623132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" event={"ID":"1e1fae94-8cda-4956-bab0-04acd48d8d14","Type":"ContainerStarted","Data":"2ddbec6f68a851c62c290f99c66cc56c8d7ff32309921b682939d57601c73eba"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.635237 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" podStartSLOduration=7.635226764 podStartE2EDuration="7.635226764s" podCreationTimestamp="2026-01-21 15:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:51.626392042 +0000 UTC m=+1808.807908283" watchObservedRunningTime="2026-01-21 15:31:51.635226764 +0000 UTC m=+1808.816742986" Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.635926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" event={"ID":"fb71e17c-c141-4845-a86e-c7d8643681e0","Type":"ContainerStarted","Data":"3b5c736d9b656e958f76a45f5a50a7625d90c3cbb7ffb20133329c0ca0bb0e57"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.636032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" event={"ID":"fb71e17c-c141-4845-a86e-c7d8643681e0","Type":"ContainerStarted","Data":"d54e98b42bc1f5cf3133dc616ba41a2ed7ac91c5b60c647811b00d845882023d"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.642879 4707 scope.go:117] "RemoveContainer" containerID="bb631c2b0c64cf60fe5fbbe07c38027ecccd9801ec84b61405f3b27a88764a99" Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.643046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1ef058e4-6e88-4c98-885f-cce6c978ce50","Type":"ContainerStarted","Data":"eea358a716229f867bdfb37e0f8c8272ecbffc6b309e2d7fc05ef9b2aa19015e"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.656908 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" podStartSLOduration=2.656894986 podStartE2EDuration="2.656894986s" podCreationTimestamp="2026-01-21 15:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:51.653026311 +0000 UTC m=+1808.834542534" watchObservedRunningTime="2026-01-21 15:31:51.656894986 +0000 UTC m=+1808.838411208" Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.658057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerStarted","Data":"46959d9013644cd3e3fef3fadede1a6eba58d48f1087aa2a430164480b5dfcc0"} Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.676469 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" podStartSLOduration=7.676453952 podStartE2EDuration="7.676453952s" podCreationTimestamp="2026-01-21 15:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:51.668265215 +0000 UTC m=+1808.849781437" watchObservedRunningTime="2026-01-21 15:31:51.676453952 +0000 UTC m=+1808.857970173" Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.689873 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" podStartSLOduration=2.689862433 podStartE2EDuration="2.689862433s" podCreationTimestamp="2026-01-21 15:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:51.688519468 +0000 UTC m=+1808.870035690" watchObservedRunningTime="2026-01-21 15:31:51.689862433 +0000 UTC m=+1808.871378655" Jan 21 15:31:51 crc kubenswrapper[4707]: I0121 15:31:51.700503 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" podStartSLOduration=2.700493743 podStartE2EDuration="2.700493743s" podCreationTimestamp="2026-01-21 15:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:51.699787876 +0000 UTC m=+1808.881304098" watchObservedRunningTime="2026-01-21 15:31:51.700493743 +0000 UTC m=+1808.882009965" Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.666636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerStarted","Data":"e1dc84b86ab11bb0ba2285f043a3749f0360512fb16701006354fcd7887d01ff"} Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.669300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerStarted","Data":"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f"} Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.671313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fec8458e-7f99-4bbf-bb1e-f4e855189154","Type":"ContainerStarted","Data":"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa"} Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.671377 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-log" containerID="cri-o://41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68" 
gracePeriod=30 Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.671423 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-httpd" containerID="cri-o://21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa" gracePeriod=30 Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.673022 4707 generic.go:334] "Generic (PLEG): container finished" podID="73c8603f-b572-4c68-a083-e68ebd2faa67" containerID="d63fd0beea1df3c3498f461cec49c9f5f0eccbac73e86aa32c9f464f4767b393" exitCode=0 Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.673091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" event={"ID":"73c8603f-b572-4c68-a083-e68ebd2faa67","Type":"ContainerDied","Data":"d63fd0beea1df3c3498f461cec49c9f5f0eccbac73e86aa32c9f464f4767b393"} Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.676158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1ef058e4-6e88-4c98-885f-cce6c978ce50","Type":"ContainerStarted","Data":"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f"} Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.676193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1ef058e4-6e88-4c98-885f-cce6c978ce50","Type":"ContainerStarted","Data":"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df"} Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.676368 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-httpd" containerID="cri-o://41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f" gracePeriod=30 Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.677034 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-log" containerID="cri-o://a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df" gracePeriod=30 Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.689977 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=8.68996393 podStartE2EDuration="8.68996393s" podCreationTimestamp="2026-01-21 15:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:52.686741371 +0000 UTC m=+1809.868257592" watchObservedRunningTime="2026-01-21 15:31:52.68996393 +0000 UTC m=+1809.871480152" Jan 21 15:31:52 crc kubenswrapper[4707]: I0121 15:31:52.724412 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=8.724398907 podStartE2EDuration="8.724398907s" podCreationTimestamp="2026-01-21 15:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:52.721461905 +0000 UTC m=+1809.902978127" watchObservedRunningTime="2026-01-21 15:31:52.724398907 +0000 UTC m=+1809.905915129" Jan 21 15:31:53 crc 
kubenswrapper[4707]: I0121 15:31:53.040654 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-config-data\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-combined-ca-bundle\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-internal-tls-certs\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-httpd-run\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-logs\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-scripts\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxmt2\" (UniqueName: \"kubernetes.io/projected/1ef058e4-6e88-4c98-885f-cce6c978ce50-kube-api-access-sxmt2\") pod \"1ef058e4-6e88-4c98-885f-cce6c978ce50\" (UID: \"1ef058e4-6e88-4c98-885f-cce6c978ce50\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-logs" (OuterVolumeSpecName: "logs") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.104677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.107801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.108326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef058e4-6e88-4c98-885f-cce6c978ce50-kube-api-access-sxmt2" (OuterVolumeSpecName: "kube-api-access-sxmt2") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "kube-api-access-sxmt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.110902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-scripts" (OuterVolumeSpecName: "scripts") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.123064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.137617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.141271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-config-data" (OuterVolumeSpecName: "config-data") pod "1ef058e4-6e88-4c98-885f-cce6c978ce50" (UID: "1ef058e4-6e88-4c98-885f-cce6c978ce50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.149289 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-config-data\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqs62\" (UniqueName: \"kubernetes.io/projected/fec8458e-7f99-4bbf-bb1e-f4e855189154-kube-api-access-fqs62\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-scripts\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-logs\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-public-tls-certs\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-httpd-run\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.205768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-combined-ca-bundle\") pod \"fec8458e-7f99-4bbf-bb1e-f4e855189154\" (UID: \"fec8458e-7f99-4bbf-bb1e-f4e855189154\") " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-logs" (OuterVolumeSpecName: "logs") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206311 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206324 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206333 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206343 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206351 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206358 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef058e4-6e88-4c98-885f-cce6c978ce50-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206373 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206381 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ef058e4-6e88-4c98-885f-cce6c978ce50-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.206390 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxmt2\" (UniqueName: \"kubernetes.io/projected/1ef058e4-6e88-4c98-885f-cce6c978ce50-kube-api-access-sxmt2\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.210318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.211609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.213912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec8458e-7f99-4bbf-bb1e-f4e855189154-kube-api-access-fqs62" (OuterVolumeSpecName: "kube-api-access-fqs62") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "kube-api-access-fqs62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.214724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-scripts" (OuterVolumeSpecName: "scripts") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.221376 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.228863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.238483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-config-data" (OuterVolumeSpecName: "config-data") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.243774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fec8458e-7f99-4bbf-bb1e-f4e855189154" (UID: "fec8458e-7f99-4bbf-bb1e-f4e855189154"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.307880 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308030 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fec8458e-7f99-4bbf-bb1e-f4e855189154-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308099 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308189 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308248 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308300 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308357 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqs62\" (UniqueName: \"kubernetes.io/projected/fec8458e-7f99-4bbf-bb1e-f4e855189154-kube-api-access-fqs62\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.308414 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec8458e-7f99-4bbf-bb1e-f4e855189154-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.324945 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.410085 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686037 4707 generic.go:334] "Generic (PLEG): container finished" podID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerID="21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa" exitCode=0 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686237 4707 generic.go:334] "Generic (PLEG): container finished" podID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerID="41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68" exitCode=143 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fec8458e-7f99-4bbf-bb1e-f4e855189154","Type":"ContainerDied","Data":"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686275 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fec8458e-7f99-4bbf-bb1e-f4e855189154","Type":"ContainerDied","Data":"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686203 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686294 4707 scope.go:117] "RemoveContainer" containerID="21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.686285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fec8458e-7f99-4bbf-bb1e-f4e855189154","Type":"ContainerDied","Data":"6349f7200f6dda09bf3067d20f870049a74e9027665aaee021fe08d6d1bae9cd"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.690098 4707 generic.go:334] "Generic (PLEG): container finished" podID="37cd4c00-bb9a-406e-af6c-427a36af5e74" containerID="9f4c9093ac2066cbb5d4e1f9ec419fe00db495ed354586dc8985b8d701102d20" exitCode=0 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.690159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" event={"ID":"37cd4c00-bb9a-406e-af6c-427a36af5e74","Type":"ContainerDied","Data":"9f4c9093ac2066cbb5d4e1f9ec419fe00db495ed354586dc8985b8d701102d20"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.691738 4707 generic.go:334] "Generic (PLEG): container finished" podID="1e1fae94-8cda-4956-bab0-04acd48d8d14" containerID="d769b7a269a2696c705c404c162be86ce7e930410be6def7eb2ac8306f38854a" exitCode=0 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.691777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" event={"ID":"1e1fae94-8cda-4956-bab0-04acd48d8d14","Type":"ContainerDied","Data":"d769b7a269a2696c705c404c162be86ce7e930410be6def7eb2ac8306f38854a"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.693772 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ab2b0b7-4f0b-49d9-b217-893b083a786c" containerID="af3f3dd74a9c8d48aa97c107bfa8b058c3e1d0de4496d824674dc535fe8ed785" exitCode=0 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.693848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" event={"ID":"5ab2b0b7-4f0b-49d9-b217-893b083a786c","Type":"ContainerDied","Data":"af3f3dd74a9c8d48aa97c107bfa8b058c3e1d0de4496d824674dc535fe8ed785"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.695756 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerID="41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f" exitCode=143 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.695775 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerID="a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df" exitCode=143 Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.695824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1ef058e4-6e88-4c98-885f-cce6c978ce50","Type":"ContainerDied","Data":"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 
15:31:53.695847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1ef058e4-6e88-4c98-885f-cce6c978ce50","Type":"ContainerDied","Data":"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.695856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1ef058e4-6e88-4c98-885f-cce6c978ce50","Type":"ContainerDied","Data":"eea358a716229f867bdfb37e0f8c8272ecbffc6b309e2d7fc05ef9b2aa19015e"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.695905 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.700118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerStarted","Data":"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.705743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerStarted","Data":"de2a44b24ac64021d947b2eed4378c9c44d16a303720d8747987c6a37f36b369"} Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.709666 4707 scope.go:117] "RemoveContainer" containerID="41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.733151 4707 scope.go:117] "RemoveContainer" containerID="21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa" Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.738830 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa\": container with ID starting with 21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa not found: ID does not exist" containerID="21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.738864 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa"} err="failed to get container status \"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa\": rpc error: code = NotFound desc = could not find container \"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa\": container with ID starting with 21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.738885 4707 scope.go:117] "RemoveContainer" containerID="41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68" Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.745908 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68\": container with ID starting with 41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68 not found: ID does not exist" containerID="41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 
15:31:53.745938 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68"} err="failed to get container status \"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68\": rpc error: code = NotFound desc = could not find container \"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68\": container with ID starting with 41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68 not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.745962 4707 scope.go:117] "RemoveContainer" containerID="21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.746305 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa"} err="failed to get container status \"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa\": rpc error: code = NotFound desc = could not find container \"21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa\": container with ID starting with 21d5d74e0861974ff81119fc84f3bf967cd778423ec894759145268b809989aa not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.746340 4707 scope.go:117] "RemoveContainer" containerID="41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.746427 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.746561 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68"} err="failed to get container status \"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68\": rpc error: code = NotFound desc = could not find container \"41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68\": container with ID starting with 41376ff39c884adde4ac277d4a098c876d1fa6fd404a6b5549b6730f5b1f0b68 not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.746582 4707 scope.go:117] "RemoveContainer" containerID="41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.757920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.766256 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.776197 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.783866 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.784217 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-httpd" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784234 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-httpd" Jan 21 
15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.784265 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-httpd" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-httpd" Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.784293 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-log" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784299 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-log" Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.784308 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-log" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784313 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-log" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784447 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-httpd" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784464 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-log" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784473 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" containerName="glance-log" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.784481 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" containerName="glance-httpd" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.786439 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.790025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.796158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.796258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.796306 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.796328 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-9hvs4" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.798526 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.799785 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.804998 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.805221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.807595 4707 scope.go:117] "RemoveContainer" containerID="a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.813244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-scripts\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ml5k\" (UniqueName: \"kubernetes.io/projected/35d81c05-e993-4076-9d9f-f5a1a10ba669-kube-api-access-7ml5k\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.817980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-logs\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.818020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-config-data\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.834337 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podStartSLOduration=20.834319974 podStartE2EDuration="20.834319974s" podCreationTimestamp="2026-01-21 15:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:53.805331518 +0000 UTC m=+1810.986847750" watchObservedRunningTime="2026-01-21 15:31:53.834319974 +0000 UTC m=+1811.015836196" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.863840 4707 scope.go:117] "RemoveContainer" containerID="41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f" Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.865700 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f\": container with ID starting with 41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f not found: ID does not exist" containerID="41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.865848 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f"} err="failed to get container status \"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f\": rpc error: code = NotFound desc = could not find container \"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f\": container with ID starting with 41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.865873 4707 scope.go:117] "RemoveContainer" containerID="a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df" Jan 21 15:31:53 crc kubenswrapper[4707]: E0121 15:31:53.866302 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df\": container with ID starting with a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df not found: ID does not exist" containerID="a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.866428 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df"} err="failed to get container status \"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df\": rpc error: code = NotFound desc = could not find container 
\"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df\": container with ID starting with a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.866452 4707 scope.go:117] "RemoveContainer" containerID="41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.866830 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f"} err="failed to get container status \"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f\": rpc error: code = NotFound desc = could not find container \"41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f\": container with ID starting with 41974aaeca2eff6dfa87ac07791173f7b393dbf70ba8c346444bfa64599c361f not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.866862 4707 scope.go:117] "RemoveContainer" containerID="a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.867229 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df"} err="failed to get container status \"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df\": rpc error: code = NotFound desc = could not find container \"a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df\": container with ID starting with a5f66059465ebfb29b1560eaf4bd17975a64ddfdf9bb9963ba5c961084f523df not found: ID does not exist" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9fs\" (UniqueName: \"kubernetes.io/projected/179986b8-3ec0-4966-8eb9-621f8465e876-kube-api-access-qq9fs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919729 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-scripts\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ml5k\" (UniqueName: \"kubernetes.io/projected/35d81c05-e993-4076-9d9f-f5a1a10ba669-kube-api-access-7ml5k\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-logs\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.919916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-config-data\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.920573 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc 
kubenswrapper[4707]: I0121 15:31:53.921336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.921383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-logs\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.921687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-logs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.921727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-config-data\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.921783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-scripts\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.921841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.923637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.923772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-scripts\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.930426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc 
kubenswrapper[4707]: I0121 15:31:53.937728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-config-data\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.943660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:53 crc kubenswrapper[4707]: I0121 15:31:53.952629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ml5k\" (UniqueName: \"kubernetes.io/projected/35d81c05-e993-4076-9d9f-f5a1a10ba669-kube-api-access-7ml5k\") pod \"glance-default-internal-api-0\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-config-data\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-logs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-scripts\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9fs\" (UniqueName: \"kubernetes.io/projected/179986b8-3ec0-4966-8eb9-621f8465e876-kube-api-access-qq9fs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc 
kubenswrapper[4707]: I0121 15:31:54.023957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.023976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.024091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.024130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-logs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.024331 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.028065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.028414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-scripts\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.030475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-config-data\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.035760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 
15:31:54.038622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9fs\" (UniqueName: \"kubernetes.io/projected/179986b8-3ec0-4966-8eb9-621f8465e876-kube-api-access-qq9fs\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.043123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.045357 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.047423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.125053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-config-data\") pod \"73c8603f-b572-4c68-a083-e68ebd2faa67\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.125094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-combined-ca-bundle\") pod \"73c8603f-b572-4c68-a083-e68ebd2faa67\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.125118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-scripts\") pod \"73c8603f-b572-4c68-a083-e68ebd2faa67\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.125150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhhsc\" (UniqueName: \"kubernetes.io/projected/73c8603f-b572-4c68-a083-e68ebd2faa67-kube-api-access-xhhsc\") pod \"73c8603f-b572-4c68-a083-e68ebd2faa67\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.125239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c8603f-b572-4c68-a083-e68ebd2faa67-logs\") pod \"73c8603f-b572-4c68-a083-e68ebd2faa67\" (UID: \"73c8603f-b572-4c68-a083-e68ebd2faa67\") " Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.125867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73c8603f-b572-4c68-a083-e68ebd2faa67-logs" (OuterVolumeSpecName: "logs") pod "73c8603f-b572-4c68-a083-e68ebd2faa67" (UID: "73c8603f-b572-4c68-a083-e68ebd2faa67"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.128271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c8603f-b572-4c68-a083-e68ebd2faa67-kube-api-access-xhhsc" (OuterVolumeSpecName: "kube-api-access-xhhsc") pod "73c8603f-b572-4c68-a083-e68ebd2faa67" (UID: "73c8603f-b572-4c68-a083-e68ebd2faa67"). InnerVolumeSpecName "kube-api-access-xhhsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.128506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-scripts" (OuterVolumeSpecName: "scripts") pod "73c8603f-b572-4c68-a083-e68ebd2faa67" (UID: "73c8603f-b572-4c68-a083-e68ebd2faa67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.135789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.144376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c8603f-b572-4c68-a083-e68ebd2faa67" (UID: "73c8603f-b572-4c68-a083-e68ebd2faa67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.145542 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.154561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-config-data" (OuterVolumeSpecName: "config-data") pod "73c8603f-b572-4c68-a083-e68ebd2faa67" (UID: "73c8603f-b572-4c68-a083-e68ebd2faa67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.227611 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.227792 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.227803 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c8603f-b572-4c68-a083-e68ebd2faa67-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.227824 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhhsc\" (UniqueName: \"kubernetes.io/projected/73c8603f-b572-4c68-a083-e68ebd2faa67-kube-api-access-xhhsc\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.227832 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c8603f-b572-4c68-a083-e68ebd2faa67-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:54 crc kubenswrapper[4707]: W0121 15:31:54.529481 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179986b8_3ec0_4966_8eb9_621f8465e876.slice/crio-59c853e3e09db444bd481ca9d2da0d0d788936cffe1b2f16b541c7a1e89e5d10 WatchSource:0}: Error finding container 59c853e3e09db444bd481ca9d2da0d0d788936cffe1b2f16b541c7a1e89e5d10: Status 404 returned error can't find the container with id 59c853e3e09db444bd481ca9d2da0d0d788936cffe1b2f16b541c7a1e89e5d10 Jan 21 15:31:54 crc kubenswrapper[4707]: W0121 15:31:54.532276 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35d81c05_e993_4076_9d9f_f5a1a10ba669.slice/crio-873fd06338066688f0ba7caec377999a244bd13499209d45c5ad99e28d1102b4 WatchSource:0}: Error finding container 873fd06338066688f0ba7caec377999a244bd13499209d45c5ad99e28d1102b4: Status 404 returned error can't find the container with id 873fd06338066688f0ba7caec377999a244bd13499209d45c5ad99e28d1102b4 Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.534185 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.540487 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.725720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerStarted","Data":"d3fc3939079983be3bc7bcc20c7035ba575c4bdc487b761a918920a4bf396595"} Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.728217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" event={"ID":"73c8603f-b572-4c68-a083-e68ebd2faa67","Type":"ContainerDied","Data":"469119bc7ecbc6b33abe1c60299ebef6e62ec4691ebcd1789210102488ec4fc1"} Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.728241 4707 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="469119bc7ecbc6b33abe1c60299ebef6e62ec4691ebcd1789210102488ec4fc1" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.728289 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dvtbt" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.731003 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb71e17c-c141-4845-a86e-c7d8643681e0" containerID="3b5c736d9b656e958f76a45f5a50a7625d90c3cbb7ffb20133329c0ca0bb0e57" exitCode=0 Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.731069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" event={"ID":"fb71e17c-c141-4845-a86e-c7d8643681e0","Type":"ContainerDied","Data":"3b5c736d9b656e958f76a45f5a50a7625d90c3cbb7ffb20133329c0ca0bb0e57"} Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.734534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"179986b8-3ec0-4966-8eb9-621f8465e876","Type":"ContainerStarted","Data":"59c853e3e09db444bd481ca9d2da0d0d788936cffe1b2f16b541c7a1e89e5d10"} Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.774298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"35d81c05-e993-4076-9d9f-f5a1a10ba669","Type":"ContainerStarted","Data":"873fd06338066688f0ba7caec377999a244bd13499209d45c5ad99e28d1102b4"} Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.895870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6ccd55b46b-7mqqb"] Jan 21 15:31:54 crc kubenswrapper[4707]: E0121 15:31:54.896574 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c8603f-b572-4c68-a083-e68ebd2faa67" containerName="placement-db-sync" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.896594 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c8603f-b572-4c68-a083-e68ebd2faa67" containerName="placement-db-sync" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.896962 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c8603f-b572-4c68-a083-e68ebd2faa67" containerName="placement-db-sync" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.898300 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.939473 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-dk2bl" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.939658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.939831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6ccd55b46b-7mqqb"] Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.940178 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.940349 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:31:54 crc kubenswrapper[4707]: I0121 15:31:54.940489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.052425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-scripts\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.052633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a85cef69-0d30-4ed6-9c73-c169b3636f3a-logs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.052762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-internal-tls-certs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.052919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2scc\" (UniqueName: \"kubernetes.io/projected/a85cef69-0d30-4ed6-9c73-c169b3636f3a-kube-api-access-h2scc\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.052975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-public-tls-certs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.052992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-config-data\") pod \"placement-6ccd55b46b-7mqqb\" (UID: 
\"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.053005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-combined-ca-bundle\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.110584 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-internal-tls-certs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2scc\" (UniqueName: \"kubernetes.io/projected/a85cef69-0d30-4ed6-9c73-c169b3636f3a-kube-api-access-h2scc\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-public-tls-certs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-combined-ca-bundle\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-config-data\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155816 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-scripts\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.155831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a85cef69-0d30-4ed6-9c73-c169b3636f3a-logs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.156160 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a85cef69-0d30-4ed6-9c73-c169b3636f3a-logs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.163266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-public-tls-certs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.163650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-scripts\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.169360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-internal-tls-certs\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.173500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-config-data\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.173639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-combined-ca-bundle\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.182695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2scc\" (UniqueName: \"kubernetes.io/projected/a85cef69-0d30-4ed6-9c73-c169b3636f3a-kube-api-access-h2scc\") pod \"placement-6ccd55b46b-7mqqb\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.191724 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef058e4-6e88-4c98-885f-cce6c978ce50" path="/var/lib/kubelet/pods/1ef058e4-6e88-4c98-885f-cce6c978ce50/volumes" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.192440 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec8458e-7f99-4bbf-bb1e-f4e855189154" path="/var/lib/kubelet/pods/fec8458e-7f99-4bbf-bb1e-f4e855189154/volumes" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.219720 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.256312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-scripts\") pod \"1e1fae94-8cda-4956-bab0-04acd48d8d14\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.256391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-combined-ca-bundle\") pod \"1e1fae94-8cda-4956-bab0-04acd48d8d14\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.256499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/1e1fae94-8cda-4956-bab0-04acd48d8d14-kube-api-access-rv5b6\") pod \"1e1fae94-8cda-4956-bab0-04acd48d8d14\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.256533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-credential-keys\") pod \"1e1fae94-8cda-4956-bab0-04acd48d8d14\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.256567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-fernet-keys\") pod \"1e1fae94-8cda-4956-bab0-04acd48d8d14\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.256615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-config-data\") pod \"1e1fae94-8cda-4956-bab0-04acd48d8d14\" (UID: \"1e1fae94-8cda-4956-bab0-04acd48d8d14\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.262762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-scripts" (OuterVolumeSpecName: "scripts") pod "1e1fae94-8cda-4956-bab0-04acd48d8d14" (UID: "1e1fae94-8cda-4956-bab0-04acd48d8d14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.262951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1e1fae94-8cda-4956-bab0-04acd48d8d14" (UID: "1e1fae94-8cda-4956-bab0-04acd48d8d14"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.263939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1fae94-8cda-4956-bab0-04acd48d8d14-kube-api-access-rv5b6" (OuterVolumeSpecName: "kube-api-access-rv5b6") pod "1e1fae94-8cda-4956-bab0-04acd48d8d14" (UID: "1e1fae94-8cda-4956-bab0-04acd48d8d14"). InnerVolumeSpecName "kube-api-access-rv5b6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.265314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1e1fae94-8cda-4956-bab0-04acd48d8d14" (UID: "1e1fae94-8cda-4956-bab0-04acd48d8d14"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.295963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e1fae94-8cda-4956-bab0-04acd48d8d14" (UID: "1e1fae94-8cda-4956-bab0-04acd48d8d14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.295983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-config-data" (OuterVolumeSpecName: "config-data") pod "1e1fae94-8cda-4956-bab0-04acd48d8d14" (UID: "1e1fae94-8cda-4956-bab0-04acd48d8d14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.297050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.301202 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-config-data\") pod \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfgjb\" (UniqueName: \"kubernetes.io/projected/5ab2b0b7-4f0b-49d9-b217-893b083a786c-kube-api-access-gfgjb\") pod \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-db-sync-config-data\") pod \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-combined-ca-bundle\") pod \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\" (UID: \"5ab2b0b7-4f0b-49d9-b217-893b083a786c\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358529 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/1e1fae94-8cda-4956-bab0-04acd48d8d14-kube-api-access-rv5b6\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358542 4707 
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358550 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358579 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358586 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.358594 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1fae94-8cda-4956-bab0-04acd48d8d14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.361210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab2b0b7-4f0b-49d9-b217-893b083a786c-kube-api-access-gfgjb" (OuterVolumeSpecName: "kube-api-access-gfgjb") pod "5ab2b0b7-4f0b-49d9-b217-893b083a786c" (UID: "5ab2b0b7-4f0b-49d9-b217-893b083a786c"). InnerVolumeSpecName "kube-api-access-gfgjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.361948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5ab2b0b7-4f0b-49d9-b217-893b083a786c" (UID: "5ab2b0b7-4f0b-49d9-b217-893b083a786c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.382704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ab2b0b7-4f0b-49d9-b217-893b083a786c" (UID: "5ab2b0b7-4f0b-49d9-b217-893b083a786c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.395797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-config-data" (OuterVolumeSpecName: "config-data") pod "5ab2b0b7-4f0b-49d9-b217-893b083a786c" (UID: "5ab2b0b7-4f0b-49d9-b217-893b083a786c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.459798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s87zt\" (UniqueName: \"kubernetes.io/projected/37cd4c00-bb9a-406e-af6c-427a36af5e74-kube-api-access-s87zt\") pod \"37cd4c00-bb9a-406e-af6c-427a36af5e74\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.460062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-combined-ca-bundle\") pod \"37cd4c00-bb9a-406e-af6c-427a36af5e74\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.460136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-db-sync-config-data\") pod \"37cd4c00-bb9a-406e-af6c-427a36af5e74\" (UID: \"37cd4c00-bb9a-406e-af6c-427a36af5e74\") " Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.460516 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfgjb\" (UniqueName: \"kubernetes.io/projected/5ab2b0b7-4f0b-49d9-b217-893b083a786c-kube-api-access-gfgjb\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.460552 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.460561 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.460569 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab2b0b7-4f0b-49d9-b217-893b083a786c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.475957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cd4c00-bb9a-406e-af6c-427a36af5e74-kube-api-access-s87zt" (OuterVolumeSpecName: "kube-api-access-s87zt") pod "37cd4c00-bb9a-406e-af6c-427a36af5e74" (UID: "37cd4c00-bb9a-406e-af6c-427a36af5e74"). InnerVolumeSpecName "kube-api-access-s87zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.488315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "37cd4c00-bb9a-406e-af6c-427a36af5e74" (UID: "37cd4c00-bb9a-406e-af6c-427a36af5e74"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.498130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37cd4c00-bb9a-406e-af6c-427a36af5e74" (UID: "37cd4c00-bb9a-406e-af6c-427a36af5e74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.562257 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.562288 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37cd4c00-bb9a-406e-af6c-427a36af5e74-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.562299 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s87zt\" (UniqueName: \"kubernetes.io/projected/37cd4c00-bb9a-406e-af6c-427a36af5e74-kube-api-access-s87zt\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.748774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" event={"ID":"5ab2b0b7-4f0b-49d9-b217-893b083a786c","Type":"ContainerDied","Data":"f8c62ac174a794f2ecc07ae300cce28e2ad6005cbd8966d103af1ad246490ec4"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.748825 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8c62ac174a794f2ecc07ae300cce28e2ad6005cbd8966d103af1ad246490ec4" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.748880 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-jrrrh" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.753981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"179986b8-3ec0-4966-8eb9-621f8465e876","Type":"ContainerStarted","Data":"2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.754023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"179986b8-3ec0-4966-8eb9-621f8465e876","Type":"ContainerStarted","Data":"f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.757821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"35d81c05-e993-4076-9d9f-f5a1a10ba669","Type":"ContainerStarted","Data":"ca4216f956996cbc2e5bcace21a45a2b9480a81abafc0d0274deb43923eca1ff"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.757860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"35d81c05-e993-4076-9d9f-f5a1a10ba669","Type":"ContainerStarted","Data":"976f48e2b189f2381af11b61d8d7b7156ce939a83f766ece2a01389510c7f928"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.759788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerStarted","Data":"9ca0586ca08a5b7ac55528bf9c9c384d53a238e7d86e57be1dca72eaa544a092"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.759868 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-central-agent" 
containerID="cri-o://e1dc84b86ab11bb0ba2285f043a3749f0360512fb16701006354fcd7887d01ff" gracePeriod=30 Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.759883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.759922 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="sg-core" containerID="cri-o://d3fc3939079983be3bc7bcc20c7035ba575c4bdc487b761a918920a4bf396595" gracePeriod=30 Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.759958 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="proxy-httpd" containerID="cri-o://9ca0586ca08a5b7ac55528bf9c9c384d53a238e7d86e57be1dca72eaa544a092" gracePeriod=30 Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.759982 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-notification-agent" containerID="cri-o://de2a44b24ac64021d947b2eed4378c9c44d16a303720d8747987c6a37f36b369" gracePeriod=30 Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.763070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" event={"ID":"37cd4c00-bb9a-406e-af6c-427a36af5e74","Type":"ContainerDied","Data":"60577bafc57986dee471737abbfbba1a8c655e50aa0d0b6f55fa9f684c777120"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.763089 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60577bafc57986dee471737abbfbba1a8c655e50aa0d0b6f55fa9f684c777120" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.763124 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-sbh8l" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.777905 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.777894168 podStartE2EDuration="2.777894168s" podCreationTimestamp="2026-01-21 15:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:55.768153263 +0000 UTC m=+1812.949669485" watchObservedRunningTime="2026-01-21 15:31:55.777894168 +0000 UTC m=+1812.959410391" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.778350 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.778493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dvz8h" event={"ID":"1e1fae94-8cda-4956-bab0-04acd48d8d14","Type":"ContainerDied","Data":"2ddbec6f68a851c62c290f99c66cc56c8d7ff32309921b682939d57601c73eba"} Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.778521 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ddbec6f68a851c62c290f99c66cc56c8d7ff32309921b682939d57601c73eba" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.796015 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6ccd55b46b-7mqqb"] Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.796666 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.796655445 podStartE2EDuration="2.796655445s" podCreationTimestamp="2026-01-21 15:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:55.786533513 +0000 UTC m=+1812.968049735" watchObservedRunningTime="2026-01-21 15:31:55.796655445 +0000 UTC m=+1812.978171667" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.820019 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=7.502777071 podStartE2EDuration="11.820005549s" podCreationTimestamp="2026-01-21 15:31:44 +0000 UTC" firstStartedPulling="2026-01-21 15:31:50.887323569 +0000 UTC m=+1808.068839791" lastFinishedPulling="2026-01-21 15:31:55.204552047 +0000 UTC m=+1812.386068269" observedRunningTime="2026-01-21 15:31:55.816064179 +0000 UTC m=+1812.997580401" watchObservedRunningTime="2026-01-21 15:31:55.820005549 +0000 UTC m=+1813.001521771" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.850662 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dvz8h"] Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.862832 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dvz8h"] Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.903941 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb"] Jan 21 15:31:55 crc kubenswrapper[4707]: E0121 15:31:55.904303 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cd4c00-bb9a-406e-af6c-427a36af5e74" containerName="barbican-db-sync" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.904320 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cd4c00-bb9a-406e-af6c-427a36af5e74" containerName="barbican-db-sync" Jan 21 15:31:55 crc kubenswrapper[4707]: E0121 15:31:55.904337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1fae94-8cda-4956-bab0-04acd48d8d14" containerName="keystone-bootstrap" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.904343 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1fae94-8cda-4956-bab0-04acd48d8d14" containerName="keystone-bootstrap" Jan 21 15:31:55 crc kubenswrapper[4707]: E0121 15:31:55.904355 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab2b0b7-4f0b-49d9-b217-893b083a786c" containerName="watcher-db-sync" Jan 21 15:31:55 crc 
kubenswrapper[4707]: I0121 15:31:55.904360 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab2b0b7-4f0b-49d9-b217-893b083a786c" containerName="watcher-db-sync" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.904506 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1fae94-8cda-4956-bab0-04acd48d8d14" containerName="keystone-bootstrap" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.904520 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab2b0b7-4f0b-49d9-b217-893b083a786c" containerName="watcher-db-sync" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.904537 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cd4c00-bb9a-406e-af6c-427a36af5e74" containerName="barbican-db-sync" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.905401 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.915232 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.915433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.915571 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-xmj86" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.936626 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt"] Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.939435 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.945439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.958006 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb"] Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.970714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-combined-ca-bundle\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.970748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcg9h\" (UniqueName: \"kubernetes.io/projected/12bfd31d-1d19-4286-87f6-83397fa62338-kube-api-access-bcg9h\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.970825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data-custom\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.970864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.970908 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bfd31d-1d19-4286-87f6-83397fa62338-logs\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:55 crc kubenswrapper[4707]: I0121 15:31:55.983229 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.022653 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-x8kr2"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.023694 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.029180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.029367 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.029494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.029661 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-hswdl" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.029763 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.055137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-x8kr2"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-fernet-keys\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-combined-ca-bundle\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcg9h\" (UniqueName: \"kubernetes.io/projected/12bfd31d-1d19-4286-87f6-83397fa62338-kube-api-access-bcg9h\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-config-data\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data-custom\") pod \"barbican-worker-7f55fcd6b9-77npt\" 
(UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-scripts\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-combined-ca-bundle\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data-custom\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-combined-ca-bundle\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-credential-keys\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc34703-f7b2-455d-b4a6-4865a4c90f45-logs\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnhh\" (UniqueName: \"kubernetes.io/projected/cbc34703-f7b2-455d-b4a6-4865a4c90f45-kube-api-access-7hnhh\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072382 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/ec7ce142-4555-4604-a31b-155176988d1a-kube-api-access-wkt4g\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bfd31d-1d19-4286-87f6-83397fa62338-logs\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.072737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bfd31d-1d19-4286-87f6-83397fa62338-logs\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.080654 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.082039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.084261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data-custom\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.085302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-watcher-dockercfg-zx5g8" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.085468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-api-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.087383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.091780 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-7764cc588-x5xsc"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.092899 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.100276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.102887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-combined-ca-bundle\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.111027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7764cc588-x5xsc"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.148744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcg9h\" (UniqueName: \"kubernetes.io/projected/12bfd31d-1d19-4286-87f6-83397fa62338-kube-api-access-bcg9h\") pod \"barbican-keystone-listener-7d44c8db94-rqgvb\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.153799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-fernet-keys\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wghw\" (UniqueName: \"kubernetes.io/projected/52b4da74-d2e7-4bca-8aa3-492e4b041925-kube-api-access-7wghw\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-config-data\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-combined-ca-bundle\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data-custom\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 
15:31:56.173863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-scripts\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-config-data\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-combined-ca-bundle\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.173971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-combined-ca-bundle\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj285\" (UniqueName: \"kubernetes.io/projected/2b691f7a-1d61-46ef-bddc-5e692d197f4e-kube-api-access-hj285\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-credential-keys\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc34703-f7b2-455d-b4a6-4865a4c90f45-logs\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 
15:31:56.174099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnhh\" (UniqueName: \"kubernetes.io/projected/cbc34703-f7b2-455d-b4a6-4865a4c90f45-kube-api-access-7hnhh\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b4da74-d2e7-4bca-8aa3-492e4b041925-logs\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b691f7a-1d61-46ef-bddc-5e692d197f4e-logs\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/ec7ce142-4555-4604-a31b-155176988d1a-kube-api-access-wkt4g\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.174237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data-custom\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.178082 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.180209 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.180973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-fernet-keys\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.185502 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-decision-engine-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.186657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data-custom\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.187252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-combined-ca-bundle\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.189313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc34703-f7b2-455d-b4a6-4865a4c90f45-logs\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.190405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-config-data\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.198034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-credential-keys\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.198592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-combined-ca-bundle\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.199600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.200253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-scripts\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.200391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/ec7ce142-4555-4604-a31b-155176988d1a-kube-api-access-wkt4g\") pod \"keystone-bootstrap-x8kr2\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.201918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.219304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnhh\" (UniqueName: \"kubernetes.io/projected/cbc34703-f7b2-455d-b4a6-4865a4c90f45-kube-api-access-7hnhh\") pod \"barbican-worker-7f55fcd6b9-77npt\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.228886 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.229907 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.231997 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-applier-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.234719 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.266410 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275083 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4cr\" (UniqueName: \"kubernetes.io/projected/4eebbb84-d09f-44b4-9ad2-5f715fd16644-kube-api-access-xv4cr\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b4da74-d2e7-4bca-8aa3-492e4b041925-logs\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b691f7a-1d61-46ef-bddc-5e692d197f4e-logs\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.275972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data-custom\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wghw\" (UniqueName: \"kubernetes.io/projected/52b4da74-d2e7-4bca-8aa3-492e4b041925-kube-api-access-7wghw\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-combined-ca-bundle\") pod 
\"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-config-data\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-config-data\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933e570a-28bd-42d8-891f-0734578bc040-logs\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj285\" (UniqueName: \"kubernetes.io/projected/2b691f7a-1d61-46ef-bddc-5e692d197f4e-kube-api-access-hj285\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw4rq\" (UniqueName: \"kubernetes.io/projected/933e570a-28bd-42d8-891f-0734578bc040-kube-api-access-fw4rq\") pod 
\"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.276380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eebbb84-d09f-44b4-9ad2-5f715fd16644-logs\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.278385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b691f7a-1d61-46ef-bddc-5e692d197f4e-logs\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.278634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b4da74-d2e7-4bca-8aa3-492e4b041925-logs\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.280119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.280897 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.281440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data-custom\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.281760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.282873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-config-data\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.288412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-combined-ca-bundle\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 
15:31:56.292289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wghw\" (UniqueName: \"kubernetes.io/projected/52b4da74-d2e7-4bca-8aa3-492e4b041925-kube-api-access-7wghw\") pod \"barbican-api-7764cc588-x5xsc\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.293782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj285\" (UniqueName: \"kubernetes.io/projected/2b691f7a-1d61-46ef-bddc-5e692d197f4e-kube-api-access-hj285\") pod \"watcher-api-0\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.300040 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.355951 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw4rq\" (UniqueName: \"kubernetes.io/projected/933e570a-28bd-42d8-891f-0734578bc040-kube-api-access-fw4rq\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eebbb84-d09f-44b4-9ad2-5f715fd16644-logs\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4cr\" (UniqueName: \"kubernetes.io/projected/4eebbb84-d09f-44b4-9ad2-5f715fd16644-kube-api-access-xv4cr\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377821 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-config-data\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.377978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933e570a-28bd-42d8-891f-0734578bc040-logs\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.378326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933e570a-28bd-42d8-891f-0734578bc040-logs\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.378572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eebbb84-d09f-44b4-9ad2-5f715fd16644-logs\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.382692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.384035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.384179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-config-data\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.384928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.386382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-combined-ca-bundle\") pod 
\"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.401012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4cr\" (UniqueName: \"kubernetes.io/projected/4eebbb84-d09f-44b4-9ad2-5f715fd16644-kube-api-access-xv4cr\") pod \"watcher-decision-engine-0\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.408617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw4rq\" (UniqueName: \"kubernetes.io/projected/933e570a-28bd-42d8-891f-0734578bc040-kube-api-access-fw4rq\") pod \"watcher-applier-0\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.463481 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.472964 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb71e17c-c141-4845-a86e-c7d8643681e0-etc-machine-id\") pod \"fb71e17c-c141-4845-a86e-c7d8643681e0\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-scripts\") pod \"fb71e17c-c141-4845-a86e-c7d8643681e0\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-combined-ca-bundle\") pod \"fb71e17c-c141-4845-a86e-c7d8643681e0\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-config-data\") pod \"fb71e17c-c141-4845-a86e-c7d8643681e0\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-db-sync-config-data\") pod \"fb71e17c-c141-4845-a86e-c7d8643681e0\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htfw\" (UniqueName: \"kubernetes.io/projected/fb71e17c-c141-4845-a86e-c7d8643681e0-kube-api-access-5htfw\") pod \"fb71e17c-c141-4845-a86e-c7d8643681e0\" (UID: \"fb71e17c-c141-4845-a86e-c7d8643681e0\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.480553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/fb71e17c-c141-4845-a86e-c7d8643681e0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fb71e17c-c141-4845-a86e-c7d8643681e0" (UID: "fb71e17c-c141-4845-a86e-c7d8643681e0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.483843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fb71e17c-c141-4845-a86e-c7d8643681e0" (UID: "fb71e17c-c141-4845-a86e-c7d8643681e0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.486990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb71e17c-c141-4845-a86e-c7d8643681e0-kube-api-access-5htfw" (OuterVolumeSpecName: "kube-api-access-5htfw") pod "fb71e17c-c141-4845-a86e-c7d8643681e0" (UID: "fb71e17c-c141-4845-a86e-c7d8643681e0"). InnerVolumeSpecName "kube-api-access-5htfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.493892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-scripts" (OuterVolumeSpecName: "scripts") pod "fb71e17c-c141-4845-a86e-c7d8643681e0" (UID: "fb71e17c-c141-4845-a86e-c7d8643681e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.508478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb71e17c-c141-4845-a86e-c7d8643681e0" (UID: "fb71e17c-c141-4845-a86e-c7d8643681e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.514329 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.536190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-config-data" (OuterVolumeSpecName: "config-data") pod "fb71e17c-c141-4845-a86e-c7d8643681e0" (UID: "fb71e17c-c141-4845-a86e-c7d8643681e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.543796 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.582786 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.582936 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.582954 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.582964 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb71e17c-c141-4845-a86e-c7d8643681e0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.582973 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htfw\" (UniqueName: \"kubernetes.io/projected/fb71e17c-c141-4845-a86e-c7d8643681e0-kube-api-access-5htfw\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.582983 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb71e17c-c141-4845-a86e-c7d8643681e0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.740898 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt"] Jan 21 15:31:56 crc kubenswrapper[4707]: W0121 15:31:56.745951 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbc34703_f7b2_455d_b4a6_4865a4c90f45.slice/crio-2efb102eaf13375814dfd29d9c25bfcf44141dcbf6141b4b5cf3ad5313b9f2e2 WatchSource:0}: Error finding container 2efb102eaf13375814dfd29d9c25bfcf44141dcbf6141b4b5cf3ad5313b9f2e2: Status 404 returned error can't find the container with id 2efb102eaf13375814dfd29d9c25bfcf44141dcbf6141b4b5cf3ad5313b9f2e2 Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.795105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" event={"ID":"cbc34703-f7b2-455d-b4a6-4865a4c90f45","Type":"ContainerStarted","Data":"2efb102eaf13375814dfd29d9c25bfcf44141dcbf6141b4b5cf3ad5313b9f2e2"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.802389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" event={"ID":"a85cef69-0d30-4ed6-9c73-c169b3636f3a","Type":"ContainerStarted","Data":"62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.802433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" event={"ID":"a85cef69-0d30-4ed6-9c73-c169b3636f3a","Type":"ContainerStarted","Data":"5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.802444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" 
event={"ID":"a85cef69-0d30-4ed6-9c73-c169b3636f3a","Type":"ContainerStarted","Data":"5428a1a45696df7bc48859a81108b69da0b13364f64b946c410377962db7a57e"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.803889 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.803930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.807825 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.807824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-rt85p" event={"ID":"fb71e17c-c141-4845-a86e-c7d8643681e0","Type":"ContainerDied","Data":"d54e98b42bc1f5cf3133dc616ba41a2ed7ac91c5b60c647811b00d845882023d"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.808089 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54e98b42bc1f5cf3133dc616ba41a2ed7ac91c5b60c647811b00d845882023d" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819106 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerID="9ca0586ca08a5b7ac55528bf9c9c384d53a238e7d86e57be1dca72eaa544a092" exitCode=0 Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819137 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerID="d3fc3939079983be3bc7bcc20c7035ba575c4bdc487b761a918920a4bf396595" exitCode=2 Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819145 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerID="de2a44b24ac64021d947b2eed4378c9c44d16a303720d8747987c6a37f36b369" exitCode=0 Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819153 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerID="e1dc84b86ab11bb0ba2285f043a3749f0360512fb16701006354fcd7887d01ff" exitCode=0 Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerDied","Data":"9ca0586ca08a5b7ac55528bf9c9c384d53a238e7d86e57be1dca72eaa544a092"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerDied","Data":"d3fc3939079983be3bc7bcc20c7035ba575c4bdc487b761a918920a4bf396595"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerDied","Data":"de2a44b24ac64021d947b2eed4378c9c44d16a303720d8747987c6a37f36b369"} Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.819498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerDied","Data":"e1dc84b86ab11bb0ba2285f043a3749f0360512fb16701006354fcd7887d01ff"} Jan 21 15:31:56 crc kubenswrapper[4707]: 
I0121 15:31:56.823697 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" podStartSLOduration=2.823684252 podStartE2EDuration="2.823684252s" podCreationTimestamp="2026-01-21 15:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:56.821662521 +0000 UTC m=+1814.003178742" watchObservedRunningTime="2026-01-21 15:31:56.823684252 +0000 UTC m=+1814.005200475" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.858671 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.893553 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: E0121 15:31:56.903135 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb71e17c-c141-4845-a86e-c7d8643681e0" containerName="cinder-db-sync" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.903156 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb71e17c-c141-4845-a86e-c7d8643681e0" containerName="cinder-db-sync" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.903357 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb71e17c-c141-4845-a86e-c7d8643681e0" containerName="cinder-db-sync" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.904134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.904774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.905981 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.907917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.908207 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.910434 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-jvd5f" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.910775 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.990846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-sg-core-conf-yaml\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.990938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-config-data\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.990994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-log-httpd\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.991446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.991511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-run-httpd\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.991619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-scripts\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.991659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-combined-ca-bundle\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.991682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-962jn\" (UniqueName: \"kubernetes.io/projected/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-kube-api-access-962jn\") pod \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\" (UID: \"f5508f21-d9fc-47ff-8b5a-1a58eea0d045\") " Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42600f96-228b-4f9d-ab60-d65367d09512-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-scripts\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdl4k\" (UniqueName: \"kubernetes.io/projected/42600f96-228b-4f9d-ab60-d65367d09512-kube-api-access-hdl4k\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992676 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:56 crc kubenswrapper[4707]: I0121 15:31:56.992970 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.001593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.005445 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-x8kr2"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.006919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-scripts" (OuterVolumeSpecName: "scripts") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.011964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-kube-api-access-962jn" (OuterVolumeSpecName: "kube-api-access-962jn") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "kube-api-access-962jn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.022785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: E0121 15:31:57.023212 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-notification-agent" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.023302 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-notification-agent" Jan 21 15:31:57 crc kubenswrapper[4707]: E0121 15:31:57.023374 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="proxy-httpd" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.023431 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="proxy-httpd" Jan 21 15:31:57 crc kubenswrapper[4707]: E0121 15:31:57.023500 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="sg-core" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.023564 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="sg-core" Jan 21 15:31:57 crc kubenswrapper[4707]: E0121 15:31:57.023632 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-central-agent" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.023949 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-central-agent" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.024183 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-central-agent" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.024255 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="sg-core" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.024340 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="proxy-httpd" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.024410 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" containerName="ceilometer-notification-agent" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.025286 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.029999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.032790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.034509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-scripts\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba850224-736f-46c0-abd7-5c0ad1d55712-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42600f96-228b-4f9d-ab60-d65367d09512-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba850224-736f-46c0-abd7-5c0ad1d55712-logs\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-scripts\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hdl4k\" (UniqueName: \"kubernetes.io/projected/42600f96-228b-4f9d-ab60-d65367d09512-kube-api-access-hdl4k\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwjdc\" (UniqueName: \"kubernetes.io/projected/ba850224-736f-46c0-abd7-5c0ad1d55712-kube-api-access-kwjdc\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094831 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094843 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094852 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094861 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-962jn\" (UniqueName: \"kubernetes.io/projected/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-kube-api-access-962jn\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.094913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42600f96-228b-4f9d-ab60-d65367d09512-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.098724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.098969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.100146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-scripts\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.100796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.111178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdl4k\" (UniqueName: \"kubernetes.io/projected/42600f96-228b-4f9d-ab60-d65367d09512-kube-api-access-hdl4k\") pod \"cinder-scheduler-0\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.158130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.177048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-config-data" (OuterVolumeSpecName: "config-data") pod "f5508f21-d9fc-47ff-8b5a-1a58eea0d045" (UID: "f5508f21-d9fc-47ff-8b5a-1a58eea0d045"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.183301 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:31:57 crc kubenswrapper[4707]: E0121 15:31:57.183573 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.191965 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1fae94-8cda-4956-bab0-04acd48d8d14" path="/var/lib/kubelet/pods/1e1fae94-8cda-4956-bab0-04acd48d8d14/volumes" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.196349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwjdc\" (UniqueName: \"kubernetes.io/projected/ba850224-736f-46c0-abd7-5c0ad1d55712-kube-api-access-kwjdc\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.196410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-scripts\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.196435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba850224-736f-46c0-abd7-5c0ad1d55712-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.196515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba850224-736f-46c0-abd7-5c0ad1d55712-logs\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.196573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.196631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba850224-736f-46c0-abd7-5c0ad1d55712-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.197679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc 
kubenswrapper[4707]: I0121 15:31:57.197766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.197866 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.197885 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5508f21-d9fc-47ff-8b5a-1a58eea0d045-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.202269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba850224-736f-46c0-abd7-5c0ad1d55712-logs\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.204899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.205017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-scripts\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.205456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.223207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.223890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwjdc\" (UniqueName: \"kubernetes.io/projected/ba850224-736f-46c0-abd7-5c0ad1d55712-kube-api-access-kwjdc\") pod \"cinder-api-0\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.244580 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.361696 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.414030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.449845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.463514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7764cc588-x5xsc"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.469247 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.724947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.837092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"42600f96-228b-4f9d-ab60-d65367d09512","Type":"ContainerStarted","Data":"4007d73d2c0dfdc1beb4cf4a6c1542a04d389394ba51de99dcf4544ec8723e80"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.838491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" event={"ID":"ec7ce142-4555-4604-a31b-155176988d1a","Type":"ContainerStarted","Data":"d7d98a1f6b0a50dff9ed5bf7d3edb969798ff4f76e9a3eb5436c368c2529ba04"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.838521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" event={"ID":"ec7ce142-4555-4604-a31b-155176988d1a","Type":"ContainerStarted","Data":"2629857ece79ec5de0dccc1f043990e2ebd01db6456679be1130b239b332de9c"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.840520 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.846360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f5508f21-d9fc-47ff-8b5a-1a58eea0d045","Type":"ContainerDied","Data":"46959d9013644cd3e3fef3fadede1a6eba58d48f1087aa2a430164480b5dfcc0"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.846401 4707 scope.go:117] "RemoveContainer" containerID="9ca0586ca08a5b7ac55528bf9c9c384d53a238e7d86e57be1dca72eaa544a092" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.846525 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.851936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" event={"ID":"cbc34703-f7b2-455d-b4a6-4865a4c90f45","Type":"ContainerStarted","Data":"220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.851966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" event={"ID":"cbc34703-f7b2-455d-b4a6-4865a4c90f45","Type":"ContainerStarted","Data":"266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.861412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" event={"ID":"12bfd31d-1d19-4286-87f6-83397fa62338","Type":"ContainerStarted","Data":"ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.861594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" event={"ID":"12bfd31d-1d19-4286-87f6-83397fa62338","Type":"ContainerStarted","Data":"d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.861606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" event={"ID":"12bfd31d-1d19-4286-87f6-83397fa62338","Type":"ContainerStarted","Data":"49a2054eafba79b9a86349076a67bc4a52c0675f0324064f5d58e27ba10275a0"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.861418 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" podStartSLOduration=2.861405123 podStartE2EDuration="2.861405123s" podCreationTimestamp="2026-01-21 15:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:57.85365491 +0000 UTC m=+1815.035171132" watchObservedRunningTime="2026-01-21 15:31:57.861405123 +0000 UTC m=+1815.042921344" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.873012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" event={"ID":"52b4da74-d2e7-4bca-8aa3-492e4b041925","Type":"ContainerStarted","Data":"6d4cc6dc2fbda90d29592d93dffc5e58dce8b4b4d7475854f4553e5732daaf13"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.873046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" event={"ID":"52b4da74-d2e7-4bca-8aa3-492e4b041925","Type":"ContainerStarted","Data":"3512429b9d3d64e5ea9920cadc30d8196b5a7fe2ab7f24557369e62e8ccd3a22"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.873059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" event={"ID":"52b4da74-d2e7-4bca-8aa3-492e4b041925","Type":"ContainerStarted","Data":"4edc87e94c48dfae268b1d775f49947081a07b7eb7ecd90e10c4bf94793a944b"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.873109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 
15:31:57.884544 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" podStartSLOduration=2.8845331400000003 podStartE2EDuration="2.88453314s" podCreationTimestamp="2026-01-21 15:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:57.869446292 +0000 UTC m=+1815.050962515" watchObservedRunningTime="2026-01-21 15:31:57.88453314 +0000 UTC m=+1815.066049362" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.886948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4eebbb84-d09f-44b4-9ad2-5f715fd16644","Type":"ContainerStarted","Data":"0049b938274b30339fb6c454a8ce1d0e43900a63aa1201c7d0de7dfb4cf4bcd2"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.897398 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.913395 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.920102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"933e570a-28bd-42d8-891f-0734578bc040","Type":"ContainerStarted","Data":"7cc53e13d68ae65e774f9d51ed557b0b0a16e61aa3e9983335212210cbc1815e"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.926754 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.928649 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.930003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"2b691f7a-1d61-46ef-bddc-5e692d197f4e","Type":"ContainerStarted","Data":"cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.930029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"2b691f7a-1d61-46ef-bddc-5e692d197f4e","Type":"ContainerStarted","Data":"06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.930041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"2b691f7a-1d61-46ef-bddc-5e692d197f4e","Type":"ContainerStarted","Data":"1738a1de39e7acc8d860c1550f165bb6c77f149dc5efea54d0d69bc77639cecd"} Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.930762 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.932434 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.932501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.936052 4707 scope.go:117] "RemoveContainer" containerID="d3fc3939079983be3bc7bcc20c7035ba575c4bdc487b761a918920a4bf396595" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.936059 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.936556 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.1.216:9322/\": dial tcp 10.217.1.216:9322: connect: connection refused" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.939049 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" podStartSLOduration=2.939037924 podStartE2EDuration="2.939037924s" podCreationTimestamp="2026-01-21 15:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:57.920317865 +0000 UTC m=+1815.101834088" watchObservedRunningTime="2026-01-21 15:31:57.939037924 +0000 UTC m=+1815.120554146" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.947183 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" podStartSLOduration=2.94716244 podStartE2EDuration="2.94716244s" podCreationTimestamp="2026-01-21 15:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:57.945080746 +0000 UTC m=+1815.126596968" watchObservedRunningTime="2026-01-21 15:31:57.94716244 +0000 UTC m=+1815.128678683" Jan 21 15:31:57 crc kubenswrapper[4707]: I0121 15:31:57.996907 4707 scope.go:117] "RemoveContainer" containerID="de2a44b24ac64021d947b2eed4378c9c44d16a303720d8747987c6a37f36b369" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-scripts\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-run-httpd\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-log-httpd\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h827p\" (UniqueName: 
\"kubernetes.io/projected/c5b5fbb1-f7b8-4920-8329-08ff2b954651-kube-api-access-h827p\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-config-data\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.018430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.085040 4707 scope.go:117] "RemoveContainer" containerID="e1dc84b86ab11bb0ba2285f043a3749f0360512fb16701006354fcd7887d01ff" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.120904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-scripts\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.120939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-run-httpd\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.120958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.121089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-log-httpd\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.121118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h827p\" (UniqueName: \"kubernetes.io/projected/c5b5fbb1-f7b8-4920-8329-08ff2b954651-kube-api-access-h827p\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.121142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-config-data\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.121161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.121956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-log-httpd\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.122186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-run-httpd\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.127765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-scripts\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.128680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.129650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-config-data\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.130295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.138292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h827p\" (UniqueName: \"kubernetes.io/projected/c5b5fbb1-f7b8-4920-8329-08ff2b954651-kube-api-access-h827p\") pod \"ceilometer-0\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.254942 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.717698 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-api-0" podStartSLOduration=3.71767734 podStartE2EDuration="3.71767734s" podCreationTimestamp="2026-01-21 15:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:31:57.981841577 +0000 UTC m=+1815.163357789" watchObservedRunningTime="2026-01-21 15:31:58.71767734 +0000 UTC m=+1815.899193573" Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.725460 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.938714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"42600f96-228b-4f9d-ab60-d65367d09512","Type":"ContainerStarted","Data":"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7"} Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.941731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ba850224-736f-46c0-abd7-5c0ad1d55712","Type":"ContainerStarted","Data":"846b728b3b2d09ea433532e9fe0bbc32c3f23517ce3593d76efd12f4e719ba57"} Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.941764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ba850224-736f-46c0-abd7-5c0ad1d55712","Type":"ContainerStarted","Data":"019100d458eb74326687f39ad6f7bdc16abd5f5d387000e330e8cf0c733379e4"} Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.947455 4707 generic.go:334] "Generic (PLEG): container finished" podID="eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" containerID="ff17dfb054823684830a7337448f62fc48f01c6fe76d2357e00911976e1b3f4e" exitCode=0 Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.947521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" event={"ID":"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984","Type":"ContainerDied","Data":"ff17dfb054823684830a7337448f62fc48f01c6fe76d2357e00911976e1b3f4e"} Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.953032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerStarted","Data":"83ee88dde55dcc2f86a16dd2acb5290a35a8fac570c276dc7996a346e922cbff"} Jan 21 15:31:58 crc kubenswrapper[4707]: I0121 15:31:58.954180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:31:59 crc kubenswrapper[4707]: I0121 15:31:59.204506 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5508f21-d9fc-47ff-8b5a-1a58eea0d045" path="/var/lib/kubelet/pods/f5508f21-d9fc-47ff-8b5a-1a58eea0d045/volumes" Jan 21 15:31:59 crc kubenswrapper[4707]: I0121 15:31:59.986207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"933e570a-28bd-42d8-891f-0734578bc040","Type":"ContainerStarted","Data":"a5cfda8d9c91d4e73ddfaa7ddcab6f6a580e4abf30eecf6674e01a38f42e5830"} Jan 21 15:31:59 crc kubenswrapper[4707]: I0121 15:31:59.999320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerStarted","Data":"c209b57fbadaa82a717e8034092cbff5667f1050841c14612dd7edd344643d8d"} Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.009932 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-applier-0" podStartSLOduration=2.019969857 podStartE2EDuration="4.009918649s" podCreationTimestamp="2026-01-21 15:31:56 +0000 UTC" firstStartedPulling="2026-01-21 15:31:57.515339935 +0000 UTC m=+1814.696856157" lastFinishedPulling="2026-01-21 15:31:59.505288727 +0000 UTC m=+1816.686804949" observedRunningTime="2026-01-21 15:32:00.008976198 +0000 UTC m=+1817.190492420" watchObservedRunningTime="2026-01-21 15:32:00.009918649 +0000 UTC m=+1817.191434872" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.010353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"42600f96-228b-4f9d-ab60-d65367d09512","Type":"ContainerStarted","Data":"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d"} Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.012687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4eebbb84-d09f-44b4-9ad2-5f715fd16644","Type":"ContainerStarted","Data":"1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b"} Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.051608 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=4.0515936 podStartE2EDuration="4.0515936s" podCreationTimestamp="2026-01-21 15:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:00.049433859 +0000 UTC m=+1817.230950081" watchObservedRunningTime="2026-01-21 15:32:00.0515936 +0000 UTC m=+1817.233109821" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.079319 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-decision-engine-0" podStartSLOduration=2.0187159 podStartE2EDuration="4.079305075s" podCreationTimestamp="2026-01-21 15:31:56 +0000 UTC" firstStartedPulling="2026-01-21 15:31:57.448302877 +0000 UTC m=+1814.629819099" lastFinishedPulling="2026-01-21 15:31:59.508892053 +0000 UTC m=+1816.690408274" observedRunningTime="2026-01-21 15:32:00.076027232 +0000 UTC m=+1817.257543454" watchObservedRunningTime="2026-01-21 15:32:00.079305075 +0000 UTC m=+1817.260821298" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.487114 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.580420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-combined-ca-bundle\") pod \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.580629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f74wt\" (UniqueName: \"kubernetes.io/projected/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-kube-api-access-f74wt\") pod \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.580673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-config\") pod \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\" (UID: \"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984\") " Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.597304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-kube-api-access-f74wt" (OuterVolumeSpecName: "kube-api-access-f74wt") pod "eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" (UID: "eeff42ad-81d5-4dfc-a4d9-c68d7f37f984"). InnerVolumeSpecName "kube-api-access-f74wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.614520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-config" (OuterVolumeSpecName: "config") pod "eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" (UID: "eeff42ad-81d5-4dfc-a4d9-c68d7f37f984"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.614621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" (UID: "eeff42ad-81d5-4dfc-a4d9-c68d7f37f984"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.683289 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f74wt\" (UniqueName: \"kubernetes.io/projected/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-kube-api-access-f74wt\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.683318 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.683329 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:00 crc kubenswrapper[4707]: I0121 15:32:00.893632 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.020779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" event={"ID":"eeff42ad-81d5-4dfc-a4d9-c68d7f37f984","Type":"ContainerDied","Data":"53e0d516f1bbb339da9206f50c5c1edb59e31e6faed7826dab5177c4f72b16e8"} Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.020843 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53e0d516f1bbb339da9206f50c5c1edb59e31e6faed7826dab5177c4f72b16e8" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.020798 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-n7pdl" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.022321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerStarted","Data":"f147ab0f86f668f3ec8bf52d936c30782c1949aa7ed10de4fc9e740b145f5072"} Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.025235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ba850224-736f-46c0-abd7-5c0ad1d55712","Type":"ContainerStarted","Data":"c365cf36b4f2fc00031629ee5599fc047c09c4b0b3d0e2f0351fc420f78fdbcf"} Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.025278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.026110 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec7ce142-4555-4604-a31b-155176988d1a" containerID="d7d98a1f6b0a50dff9ed5bf7d3edb969798ff4f76e9a3eb5436c368c2529ba04" exitCode=0 Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.026712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" event={"ID":"ec7ce142-4555-4604-a31b-155176988d1a","Type":"ContainerDied","Data":"d7d98a1f6b0a50dff9ed5bf7d3edb969798ff4f76e9a3eb5436c368c2529ba04"} Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.044504 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=4.044490761 podStartE2EDuration="4.044490761s" podCreationTimestamp="2026-01-21 15:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
15:32:01.043083395 +0000 UTC m=+1818.224599616" watchObservedRunningTime="2026-01-21 15:32:01.044490761 +0000 UTC m=+1818.226006982" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.327719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7fbc45cc89-glgj4"] Jan 21 15:32:01 crc kubenswrapper[4707]: E0121 15:32:01.328289 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" containerName="neutron-db-sync" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.328308 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" containerName="neutron-db-sync" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.328479 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" containerName="neutron-db-sync" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.329312 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.331741 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.332119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-9kpsx" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.332938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.334964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.341894 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7fbc45cc89-glgj4"] Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.397563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5xv\" (UniqueName: \"kubernetes.io/projected/7c12f4c3-101b-4361-8d90-6adbf916cbda-kube-api-access-mg5xv\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.397620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-combined-ca-bundle\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.397717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-ovndb-tls-certs\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.397754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-httpd-config\") pod \"neutron-7fbc45cc89-glgj4\" (UID: 
\"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.397783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-config\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.464423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.464517 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.498933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-ovndb-tls-certs\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.498982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-httpd-config\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.499012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-config\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.499257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5xv\" (UniqueName: \"kubernetes.io/projected/7c12f4c3-101b-4361-8d90-6adbf916cbda-kube-api-access-mg5xv\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.499288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-combined-ca-bundle\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.505437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-combined-ca-bundle\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.505926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-ovndb-tls-certs\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " 
pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.519847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-httpd-config\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.520290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-config\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.521946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5xv\" (UniqueName: \"kubernetes.io/projected/7c12f4c3-101b-4361-8d90-6adbf916cbda-kube-api-access-mg5xv\") pod \"neutron-7fbc45cc89-glgj4\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.544987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:32:01 crc kubenswrapper[4707]: I0121 15:32:01.643051 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.033395 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api-log" containerID="cri-o://846b728b3b2d09ea433532e9fe0bbc32c3f23517ce3593d76efd12f4e719ba57" gracePeriod=30 Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.033425 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api" containerID="cri-o://c365cf36b4f2fc00031629ee5599fc047c09c4b0b3d0e2f0351fc420f78fdbcf" gracePeriod=30 Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.043458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7fbc45cc89-glgj4"] Jan 21 15:32:02 crc kubenswrapper[4707]: W0121 15:32:02.048136 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c12f4c3_101b_4361_8d90_6adbf916cbda.slice/crio-8540af325387fba5474f5dcaaeb6f2a15f06e83d3f08a875bde2af5c7eea2071 WatchSource:0}: Error finding container 8540af325387fba5474f5dcaaeb6f2a15f06e83d3f08a875bde2af5c7eea2071: Status 404 returned error can't find the container with id 8540af325387fba5474f5dcaaeb6f2a15f06e83d3f08a875bde2af5c7eea2071 Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.246441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.306130 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.312417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-scripts\") pod \"ec7ce142-4555-4604-a31b-155176988d1a\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.312461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-fernet-keys\") pod \"ec7ce142-4555-4604-a31b-155176988d1a\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.312487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-config-data\") pod \"ec7ce142-4555-4604-a31b-155176988d1a\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.312538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-credential-keys\") pod \"ec7ce142-4555-4604-a31b-155176988d1a\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.312592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/ec7ce142-4555-4604-a31b-155176988d1a-kube-api-access-wkt4g\") pod \"ec7ce142-4555-4604-a31b-155176988d1a\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.312649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-combined-ca-bundle\") pod \"ec7ce142-4555-4604-a31b-155176988d1a\" (UID: \"ec7ce142-4555-4604-a31b-155176988d1a\") " Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.315575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ec7ce142-4555-4604-a31b-155176988d1a" (UID: "ec7ce142-4555-4604-a31b-155176988d1a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.316432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7ce142-4555-4604-a31b-155176988d1a-kube-api-access-wkt4g" (OuterVolumeSpecName: "kube-api-access-wkt4g") pod "ec7ce142-4555-4604-a31b-155176988d1a" (UID: "ec7ce142-4555-4604-a31b-155176988d1a"). InnerVolumeSpecName "kube-api-access-wkt4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.319135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ec7ce142-4555-4604-a31b-155176988d1a" (UID: "ec7ce142-4555-4604-a31b-155176988d1a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.319696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-scripts" (OuterVolumeSpecName: "scripts") pod "ec7ce142-4555-4604-a31b-155176988d1a" (UID: "ec7ce142-4555-4604-a31b-155176988d1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.341059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec7ce142-4555-4604-a31b-155176988d1a" (UID: "ec7ce142-4555-4604-a31b-155176988d1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.351702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-config-data" (OuterVolumeSpecName: "config-data") pod "ec7ce142-4555-4604-a31b-155176988d1a" (UID: "ec7ce142-4555-4604-a31b-155176988d1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.415139 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/ec7ce142-4555-4604-a31b-155176988d1a-kube-api-access-wkt4g\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.415181 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.415192 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.415203 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.415212 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:02 crc kubenswrapper[4707]: I0121 15:32:02.415219 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec7ce142-4555-4604-a31b-155176988d1a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.040752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" event={"ID":"ec7ce142-4555-4604-a31b-155176988d1a","Type":"ContainerDied","Data":"2629857ece79ec5de0dccc1f043990e2ebd01db6456679be1130b239b332de9c"} Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.040781 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-x8kr2" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.040793 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2629857ece79ec5de0dccc1f043990e2ebd01db6456679be1130b239b332de9c" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.049262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" event={"ID":"7c12f4c3-101b-4361-8d90-6adbf916cbda","Type":"ContainerStarted","Data":"8540af325387fba5474f5dcaaeb6f2a15f06e83d3f08a875bde2af5c7eea2071"} Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.050519 4707 generic.go:334] "Generic (PLEG): container finished" podID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerID="846b728b3b2d09ea433532e9fe0bbc32c3f23517ce3593d76efd12f4e719ba57" exitCode=143 Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.050548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ba850224-736f-46c0-abd7-5c0ad1d55712","Type":"ContainerDied","Data":"846b728b3b2d09ea433532e9fe0bbc32c3f23517ce3593d76efd12f4e719ba57"} Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.148465 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6885874467-qknsk"] Jan 21 15:32:03 crc kubenswrapper[4707]: E0121 15:32:03.148784 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7ce142-4555-4604-a31b-155176988d1a" containerName="keystone-bootstrap" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.148800 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7ce142-4555-4604-a31b-155176988d1a" containerName="keystone-bootstrap" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.148961 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec7ce142-4555-4604-a31b-155176988d1a" containerName="keystone-bootstrap" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.149455 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.154397 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.154539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.154585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.154779 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.154796 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-hswdl" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.154922 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.165145 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6885874467-qknsk"] Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.227429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-fernet-keys\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.227476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-public-tls-certs\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.227557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-credential-keys\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.227592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm7vg\" (UniqueName: \"kubernetes.io/projected/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-kube-api-access-jm7vg\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.228155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-config-data\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.228214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-scripts\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.228308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-combined-ca-bundle\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.228333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-internal-tls-certs\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.329583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-public-tls-certs\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.329707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-credential-keys\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.330445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm7vg\" (UniqueName: \"kubernetes.io/projected/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-kube-api-access-jm7vg\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.330553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-config-data\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.330597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-scripts\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.330731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-combined-ca-bundle\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.330759 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-internal-tls-certs\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.330791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-fernet-keys\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.333538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-scripts\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.334245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-credential-keys\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.335375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-config-data\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.335374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-fernet-keys\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.346551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-internal-tls-certs\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.350189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm7vg\" (UniqueName: \"kubernetes.io/projected/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-kube-api-access-jm7vg\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.352310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-combined-ca-bundle\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.356430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-public-tls-certs\") pod \"keystone-6885874467-qknsk\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.463597 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:03 crc kubenswrapper[4707]: I0121 15:32:03.852436 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6885874467-qknsk"] Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.048625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.060293 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.071547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerStarted","Data":"1aee2995c3d4ccdc925eeebc3ca9feb5c8ee77bef3bbab34e325a26d995dac36"} Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.085353 4707 generic.go:334] "Generic (PLEG): container finished" podID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerID="c365cf36b4f2fc00031629ee5599fc047c09c4b0b3d0e2f0351fc420f78fdbcf" exitCode=0 Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.085454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ba850224-736f-46c0-abd7-5c0ad1d55712","Type":"ContainerDied","Data":"c365cf36b4f2fc00031629ee5599fc047c09c4b0b3d0e2f0351fc420f78fdbcf"} Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.094355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" event={"ID":"7c12f4c3-101b-4361-8d90-6adbf916cbda","Type":"ContainerStarted","Data":"a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5"} Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.094413 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.098459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" event={"ID":"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed","Type":"ContainerStarted","Data":"42be8515c1468a13f580b0b6a2ff8764b5cd99e2e34240e7dc95492be18ffdb6"} Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.103382 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.121515 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" podStartSLOduration=3.121501457 podStartE2EDuration="3.121501457s" podCreationTimestamp="2026-01-21 15:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:04.112000403 +0000 UTC m=+1821.293516624" watchObservedRunningTime="2026-01-21 15:32:04.121501457 +0000 UTC m=+1821.303017680" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.142357 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.142402 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.147224 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.149124 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.179555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.185205 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.189352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.198554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.414084 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwjdc\" (UniqueName: \"kubernetes.io/projected/ba850224-736f-46c0-abd7-5c0ad1d55712-kube-api-access-kwjdc\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-scripts\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data-custom\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba850224-736f-46c0-abd7-5c0ad1d55712-etc-machine-id\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550870 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba850224-736f-46c0-abd7-5c0ad1d55712-logs\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.550897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-combined-ca-bundle\") pod \"ba850224-736f-46c0-abd7-5c0ad1d55712\" (UID: \"ba850224-736f-46c0-abd7-5c0ad1d55712\") " Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.551658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba850224-736f-46c0-abd7-5c0ad1d55712-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.551738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba850224-736f-46c0-abd7-5c0ad1d55712-logs" (OuterVolumeSpecName: "logs") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.556956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-scripts" (OuterVolumeSpecName: "scripts") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.556958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba850224-736f-46c0-abd7-5c0ad1d55712-kube-api-access-kwjdc" (OuterVolumeSpecName: "kube-api-access-kwjdc") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "kube-api-access-kwjdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.559648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.572930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.582664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data" (OuterVolumeSpecName: "config-data") pod "ba850224-736f-46c0-abd7-5c0ad1d55712" (UID: "ba850224-736f-46c0-abd7-5c0ad1d55712"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652421 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwjdc\" (UniqueName: \"kubernetes.io/projected/ba850224-736f-46c0-abd7-5c0ad1d55712-kube-api-access-kwjdc\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652448 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652458 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652466 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652475 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba850224-736f-46c0-abd7-5c0ad1d55712-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652483 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba850224-736f-46c0-abd7-5c0ad1d55712-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:04 crc kubenswrapper[4707]: I0121 15:32:04.652490 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba850224-736f-46c0-abd7-5c0ad1d55712-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.106777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" event={"ID":"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed","Type":"ContainerStarted","Data":"8a845675965250cb9cb44e3ab713dc4f43ebdac5af17938f0005b1ca1e29b6e7"} Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.109105 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.109759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.109894 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ba850224-736f-46c0-abd7-5c0ad1d55712","Type":"ContainerDied","Data":"019100d458eb74326687f39ad6f7bdc16abd5f5d387000e330e8cf0c733379e4"} Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.110118 4707 scope.go:117] "RemoveContainer" containerID="c365cf36b4f2fc00031629ee5599fc047c09c4b0b3d0e2f0351fc420f78fdbcf" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.110647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" event={"ID":"7c12f4c3-101b-4361-8d90-6adbf916cbda","Type":"ContainerStarted","Data":"72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083"} Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.112453 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.114403 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.114498 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.114583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.122250 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" podStartSLOduration=2.122236404 podStartE2EDuration="2.122236404s" podCreationTimestamp="2026-01-21 15:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:05.120586112 +0000 UTC m=+1822.302102334" watchObservedRunningTime="2026-01-21 15:32:05.122236404 +0000 UTC m=+1822.303752616" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.137528 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.139496 4707 scope.go:117] "RemoveContainer" containerID="846b728b3b2d09ea433532e9fe0bbc32c3f23517ce3593d76efd12f4e719ba57" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.143791 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.158377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:32:05 crc kubenswrapper[4707]: E0121 15:32:05.158681 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.158698 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api" Jan 21 15:32:05 crc kubenswrapper[4707]: E0121 15:32:05.158722 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api-log" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.158728 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api-log" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.158898 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api-log" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.158917 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" containerName="cinder-api" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.160527 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.163635 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.166408 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.166545 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.176365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.197472 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba850224-736f-46c0-abd7-5c0ad1d55712" path="/var/lib/kubelet/pods/ba850224-736f-46c0-abd7-5c0ad1d55712/volumes" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data-custom\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fd72c11-1e49-4359-811c-f7a96288df7b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260681 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zhzbp\" (UniqueName: \"kubernetes.io/projected/1fd72c11-1e49-4359-811c-f7a96288df7b-kube-api-access-zhzbp\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.260947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd72c11-1e49-4359-811c-f7a96288df7b-logs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.261118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.262662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-scripts\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.363991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd72c11-1e49-4359-811c-f7a96288df7b-logs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-scripts\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data-custom\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fd72c11-1e49-4359-811c-f7a96288df7b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhzbp\" (UniqueName: \"kubernetes.io/projected/1fd72c11-1e49-4359-811c-f7a96288df7b-kube-api-access-zhzbp\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd72c11-1e49-4359-811c-f7a96288df7b-logs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.364893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fd72c11-1e49-4359-811c-f7a96288df7b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.367486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.367653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data-custom\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.368457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.369025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.370992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.371158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-scripts\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.381216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhzbp\" (UniqueName: \"kubernetes.io/projected/1fd72c11-1e49-4359-811c-f7a96288df7b-kube-api-access-zhzbp\") pod \"cinder-api-0\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.483991 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:05 crc kubenswrapper[4707]: I0121 15:32:05.870312 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:32:05 crc kubenswrapper[4707]: W0121 15:32:05.888912 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fd72c11_1e49_4359_811c_f7a96288df7b.slice/crio-dc10bff20660ffdc16c6a79b7c00c6d62566b78e994dfe809bcac732dea7fa82 WatchSource:0}: Error finding container dc10bff20660ffdc16c6a79b7c00c6d62566b78e994dfe809bcac732dea7fa82: Status 404 returned error can't find the container with id dc10bff20660ffdc16c6a79b7c00c6d62566b78e994dfe809bcac732dea7fa82 Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.125750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"1fd72c11-1e49-4359-811c-f7a96288df7b","Type":"ContainerStarted","Data":"dc10bff20660ffdc16c6a79b7c00c6d62566b78e994dfe809bcac732dea7fa82"} Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.132685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerStarted","Data":"5f78dadf11135148036c0ad6ef343c31a86b8b3aa982408f9c591e3a8fe22eb0"} Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.133852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.153878 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.570137952 podStartE2EDuration="9.153868237s" podCreationTimestamp="2026-01-21 15:31:57 +0000 UTC" firstStartedPulling="2026-01-21 15:31:58.729955927 +0000 UTC m=+1815.911472150" lastFinishedPulling="2026-01-21 15:32:05.313686213 +0000 UTC m=+1822.495202435" observedRunningTime="2026-01-21 15:32:06.1479777 +0000 UTC m=+1823.329493922" watchObservedRunningTime="2026-01-21 15:32:06.153868237 +0000 UTC 
m=+1823.335384458" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.460183 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-78749c84c4-t8xcs"] Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.461522 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.465532 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.466258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.467315 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.480154 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78749c84c4-t8xcs"] Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.515024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.542321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.546078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.582725 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffsr2\" (UniqueName: \"kubernetes.io/projected/27a7cc38-e5af-494e-880c-45e5278e927a-kube-api-access-ffsr2\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-config\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-combined-ca-bundle\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-ovndb-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589342 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-internal-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-public-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.589461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-httpd-config\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.590408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-internal-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-public-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-httpd-config\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffsr2\" (UniqueName: \"kubernetes.io/projected/27a7cc38-e5af-494e-880c-45e5278e927a-kube-api-access-ffsr2\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-config\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-combined-ca-bundle\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.691494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-ovndb-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.695565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-combined-ca-bundle\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.697518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-ovndb-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.698053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-public-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.698983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-config\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.699632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-httpd-config\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.702281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-internal-tls-certs\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.710368 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffsr2\" (UniqueName: \"kubernetes.io/projected/27a7cc38-e5af-494e-880c-45e5278e927a-kube-api-access-ffsr2\") pod \"neutron-78749c84c4-t8xcs\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.775938 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.920558 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j"] Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.925830 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.927894 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j"] Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.930422 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.930639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.996554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t767x\" (UniqueName: \"kubernetes.io/projected/de3b367f-2f6e-4ab7-af9b-5390359d3140-kube-api-access-t767x\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.996607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de3b367f-2f6e-4ab7-af9b-5390359d3140-logs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.996627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-combined-ca-bundle\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.996655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data-custom\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.996691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-internal-tls-certs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc kubenswrapper[4707]: I0121 15:32:06.996729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:06 crc 
kubenswrapper[4707]: I0121 15:32:06.996786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-public-tls-certs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.102507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de3b367f-2f6e-4ab7-af9b-5390359d3140-logs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.102692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-combined-ca-bundle\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.102726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data-custom\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.102773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-internal-tls-certs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.102856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.102922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-public-tls-certs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.103003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de3b367f-2f6e-4ab7-af9b-5390359d3140-logs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.103099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t767x\" (UniqueName: \"kubernetes.io/projected/de3b367f-2f6e-4ab7-af9b-5390359d3140-kube-api-access-t767x\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " 
pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.111511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-internal-tls-certs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.112892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-combined-ca-bundle\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.113125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data-custom\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.113335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-public-tls-certs\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.135307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.140027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t767x\" (UniqueName: \"kubernetes.io/projected/de3b367f-2f6e-4ab7-af9b-5390359d3140-kube-api-access-t767x\") pod \"barbican-api-696bc54df8-zwm2j\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.148032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"1fd72c11-1e49-4359-811c-f7a96288df7b","Type":"ContainerStarted","Data":"035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57"} Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.148538 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.148556 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.148931 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.148952 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.150313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:07 crc kubenswrapper[4707]: 
I0121 15:32:07.165123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.201201 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.201239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.201280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.241603 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.438254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78749c84c4-t8xcs"] Jan 21 15:32:07 crc kubenswrapper[4707]: W0121 15:32:07.447922 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27a7cc38_e5af_494e_880c_45e5278e927a.slice/crio-cd8d6acf3eb7cca90e881c498961d6928c15748039fc080c5c11e30d2c93bbb2 WatchSource:0}: Error finding container cd8d6acf3eb7cca90e881c498961d6928c15748039fc080c5c11e30d2c93bbb2: Status 404 returned error can't find the container with id cd8d6acf3eb7cca90e881c498961d6928c15748039fc080c5c11e30d2c93bbb2 Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.624894 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.671601 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.695984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.778490 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.869311 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j"] Jan 21 15:32:07 crc kubenswrapper[4707]: I0121 15:32:07.981297 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.159399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"1fd72c11-1e49-4359-811c-f7a96288df7b","Type":"ContainerStarted","Data":"9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab"} Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.160371 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.161958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" event={"ID":"de3b367f-2f6e-4ab7-af9b-5390359d3140","Type":"ContainerStarted","Data":"2f689f4a8b3655a649e060bcd6e63b5d3f7054c1d19adef9d2866c4790ef6ad2"} Jan 21 
15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.161978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" event={"ID":"de3b367f-2f6e-4ab7-af9b-5390359d3140","Type":"ContainerStarted","Data":"c87c0a6abff6f08bad44ecfdc1aecf62544eeeb17c94e85c8cc600145d952048"} Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.163880 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="cinder-scheduler" containerID="cri-o://4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7" gracePeriod=30 Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.164535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" event={"ID":"27a7cc38-e5af-494e-880c-45e5278e927a","Type":"ContainerStarted","Data":"88c64ecf8afd910bd3c88e6e7d95ee4f70a40752b48f92f051811e5a100e6877"} Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.164559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" event={"ID":"27a7cc38-e5af-494e-880c-45e5278e927a","Type":"ContainerStarted","Data":"f13d26d52d87a08e099fcd0268946a2ae4f7db7d6a421b118ad9da2d50e25f98"} Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.164569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" event={"ID":"27a7cc38-e5af-494e-880c-45e5278e927a","Type":"ContainerStarted","Data":"cd8d6acf3eb7cca90e881c498961d6928c15748039fc080c5c11e30d2c93bbb2"} Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.164580 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.166855 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="probe" containerID="cri-o://74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d" gracePeriod=30 Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.187320 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.187310417 podStartE2EDuration="3.187310417s" podCreationTimestamp="2026-01-21 15:32:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:08.182801229 +0000 UTC m=+1825.364317451" watchObservedRunningTime="2026-01-21 15:32:08.187310417 +0000 UTC m=+1825.368826639" Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.208282 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" podStartSLOduration=2.20827146 podStartE2EDuration="2.20827146s" podCreationTimestamp="2026-01-21 15:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:08.201317825 +0000 UTC m=+1825.382834047" watchObservedRunningTime="2026-01-21 15:32:08.20827146 +0000 UTC m=+1825.389787682" Jan 21 15:32:08 crc kubenswrapper[4707]: I0121 15:32:08.246128 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:32:08 crc 
kubenswrapper[4707]: I0121 15:32:08.959093 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:32:09 crc kubenswrapper[4707]: I0121 15:32:09.179385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" event={"ID":"de3b367f-2f6e-4ab7-af9b-5390359d3140","Type":"ContainerStarted","Data":"0eabbf082b84f4e174b9daa40c33805625e90ebc29b3dfef4ea2017e10d6f5c0"} Jan 21 15:32:09 crc kubenswrapper[4707]: I0121 15:32:09.179423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:09 crc kubenswrapper[4707]: I0121 15:32:09.179442 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:09 crc kubenswrapper[4707]: I0121 15:32:09.218499 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" podStartSLOduration=3.218484779 podStartE2EDuration="3.218484779s" podCreationTimestamp="2026-01-21 15:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:09.218115104 +0000 UTC m=+1826.399631325" watchObservedRunningTime="2026-01-21 15:32:09.218484779 +0000 UTC m=+1826.400001001" Jan 21 15:32:09 crc kubenswrapper[4707]: E0121 15:32:09.220405 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42600f96_228b_4f9d_ab60_d65367d09512.slice/crio-74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.131492 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data-custom\") pod \"42600f96-228b-4f9d-ab60-d65367d09512\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdl4k\" (UniqueName: \"kubernetes.io/projected/42600f96-228b-4f9d-ab60-d65367d09512-kube-api-access-hdl4k\") pod \"42600f96-228b-4f9d-ab60-d65367d09512\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42600f96-228b-4f9d-ab60-d65367d09512-etc-machine-id\") pod \"42600f96-228b-4f9d-ab60-d65367d09512\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data\") pod \"42600f96-228b-4f9d-ab60-d65367d09512\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-scripts\") pod \"42600f96-228b-4f9d-ab60-d65367d09512\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42600f96-228b-4f9d-ab60-d65367d09512-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "42600f96-228b-4f9d-ab60-d65367d09512" (UID: "42600f96-228b-4f9d-ab60-d65367d09512"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-combined-ca-bundle\") pod \"42600f96-228b-4f9d-ab60-d65367d09512\" (UID: \"42600f96-228b-4f9d-ab60-d65367d09512\") " Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.163738 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42600f96-228b-4f9d-ab60-d65367d09512-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.170098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42600f96-228b-4f9d-ab60-d65367d09512" (UID: "42600f96-228b-4f9d-ab60-d65367d09512"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.170352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-scripts" (OuterVolumeSpecName: "scripts") pod "42600f96-228b-4f9d-ab60-d65367d09512" (UID: "42600f96-228b-4f9d-ab60-d65367d09512"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.171708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42600f96-228b-4f9d-ab60-d65367d09512-kube-api-access-hdl4k" (OuterVolumeSpecName: "kube-api-access-hdl4k") pod "42600f96-228b-4f9d-ab60-d65367d09512" (UID: "42600f96-228b-4f9d-ab60-d65367d09512"). InnerVolumeSpecName "kube-api-access-hdl4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190294 4707 generic.go:334] "Generic (PLEG): container finished" podID="42600f96-228b-4f9d-ab60-d65367d09512" containerID="74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d" exitCode=0 Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190321 4707 generic.go:334] "Generic (PLEG): container finished" podID="42600f96-228b-4f9d-ab60-d65367d09512" containerID="4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7" exitCode=0 Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190369 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"42600f96-228b-4f9d-ab60-d65367d09512","Type":"ContainerDied","Data":"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d"} Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"42600f96-228b-4f9d-ab60-d65367d09512","Type":"ContainerDied","Data":"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7"} Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"42600f96-228b-4f9d-ab60-d65367d09512","Type":"ContainerDied","Data":"4007d73d2c0dfdc1beb4cf4a6c1542a04d389394ba51de99dcf4544ec8723e80"} Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.190450 4707 scope.go:117] "RemoveContainer" containerID="74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.207957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42600f96-228b-4f9d-ab60-d65367d09512" (UID: "42600f96-228b-4f9d-ab60-d65367d09512"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.239492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data" (OuterVolumeSpecName: "config-data") pod "42600f96-228b-4f9d-ab60-d65367d09512" (UID: "42600f96-228b-4f9d-ab60-d65367d09512"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.264300 4707 scope.go:117] "RemoveContainer" containerID="4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.265452 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.265481 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.265491 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdl4k\" (UniqueName: \"kubernetes.io/projected/42600f96-228b-4f9d-ab60-d65367d09512-kube-api-access-hdl4k\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.265500 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.265509 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42600f96-228b-4f9d-ab60-d65367d09512-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.281131 4707 scope.go:117] "RemoveContainer" containerID="74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d" Jan 21 15:32:10 crc kubenswrapper[4707]: E0121 15:32:10.281557 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d\": container with ID starting with 74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d not found: ID does not exist" containerID="74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.281595 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d"} err="failed to get container status \"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d\": rpc error: code = NotFound desc = could not find container \"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d\": container with ID starting with 74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d not found: ID does not exist" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.281618 4707 scope.go:117] "RemoveContainer" containerID="4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7" Jan 21 15:32:10 crc kubenswrapper[4707]: E0121 15:32:10.281934 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7\": container with ID starting with 4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7 not found: ID does not exist" containerID="4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.281958 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7"} err="failed to get container status \"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7\": rpc error: code = NotFound desc = could not find container \"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7\": container with ID starting with 4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7 not found: ID does not exist" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.281973 4707 scope.go:117] "RemoveContainer" containerID="74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.282314 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d"} err="failed to get container status \"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d\": rpc error: code = NotFound desc = could not find container \"74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d\": container with ID starting with 74bb9ccb289169182279b363a46877b0814fa384c2b906513e949dae2bf4c32d not found: ID does not exist" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.282336 4707 scope.go:117] "RemoveContainer" containerID="4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.282608 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7"} err="failed to get container status \"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7\": rpc error: code = NotFound desc = could not find container \"4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7\": container with ID starting with 4cae36dee5e0c18f2b8035d637b4e48e9069e4c09fbc83570934e24366afd8c7 not found: ID does not exist" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.519540 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.526762 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.537637 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:32:10 crc kubenswrapper[4707]: E0121 15:32:10.538001 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="cinder-scheduler" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.538020 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="cinder-scheduler" Jan 21 15:32:10 crc kubenswrapper[4707]: E0121 15:32:10.538033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="probe" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.538038 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="probe" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.538231 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="probe" Jan 21 15:32:10 crc 
kubenswrapper[4707]: I0121 15:32:10.538247 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42600f96-228b-4f9d-ab60-d65367d09512" containerName="cinder-scheduler" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.539162 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.541973 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.557403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.575142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9hgv\" (UniqueName: \"kubernetes.io/projected/c893a809-f898-4540-9624-c092e6ea4e8b-kube-api-access-p9hgv\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.575225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.575259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.575290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.575320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.575335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c893a809-f898-4540-9624-c092e6ea4e8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.615450 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.615657 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api-log" 
containerID="cri-o://06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c" gracePeriod=30 Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.616051 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api" containerID="cri-o://cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3" gracePeriod=30 Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c893a809-f898-4540-9624-c092e6ea4e8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9hgv\" (UniqueName: \"kubernetes.io/projected/c893a809-f898-4540-9624-c092e6ea4e8b-kube-api-access-p9hgv\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.677466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c893a809-f898-4540-9624-c092e6ea4e8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.682720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.683366 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.684380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.685121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.696226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9hgv\" (UniqueName: \"kubernetes.io/projected/c893a809-f898-4540-9624-c092e6ea4e8b-kube-api-access-p9hgv\") pod \"cinder-scheduler-0\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:10 crc kubenswrapper[4707]: I0121 15:32:10.857725 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:11 crc kubenswrapper[4707]: I0121 15:32:11.190757 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42600f96-228b-4f9d-ab60-d65367d09512" path="/var/lib/kubelet/pods/42600f96-228b-4f9d-ab60-d65367d09512/volumes" Jan 21 15:32:11 crc kubenswrapper[4707]: I0121 15:32:11.198911 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerID="06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c" exitCode=143 Jan 21 15:32:11 crc kubenswrapper[4707]: I0121 15:32:11.198945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"2b691f7a-1d61-46ef-bddc-5e692d197f4e","Type":"ContainerDied","Data":"06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c"} Jan 21 15:32:11 crc kubenswrapper[4707]: I0121 15:32:11.274242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:32:12 crc kubenswrapper[4707]: I0121 15:32:12.183034 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:32:12 crc kubenswrapper[4707]: E0121 15:32:12.183448 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:32:12 crc kubenswrapper[4707]: I0121 15:32:12.216703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" 
event={"ID":"c893a809-f898-4540-9624-c092e6ea4e8b","Type":"ContainerStarted","Data":"ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a"} Jan 21 15:32:12 crc kubenswrapper[4707]: I0121 15:32:12.216735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c893a809-f898-4540-9624-c092e6ea4e8b","Type":"ContainerStarted","Data":"e7e88f35fdc4832f8b3e4ad1f62c9ce429c46af0a5419ca8b699df2dbf68678d"} Jan 21 15:32:13 crc kubenswrapper[4707]: I0121 15:32:13.226989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c893a809-f898-4540-9624-c092e6ea4e8b","Type":"ContainerStarted","Data":"630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a"} Jan 21 15:32:13 crc kubenswrapper[4707]: I0121 15:32:13.242718 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.242701895 podStartE2EDuration="3.242701895s" podCreationTimestamp="2026-01-21 15:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:13.239789749 +0000 UTC m=+1830.421305971" watchObservedRunningTime="2026-01-21 15:32:13.242701895 +0000 UTC m=+1830.424218117" Jan 21 15:32:13 crc kubenswrapper[4707]: I0121 15:32:13.579983 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:13 crc kubenswrapper[4707]: I0121 15:32:13.746997 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.1.216:9322/\": read tcp 10.217.0.2:57218->10.217.1.216:9322: read: connection reset by peer" Jan 21 15:32:13 crc kubenswrapper[4707]: I0121 15:32:13.747006 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.1.216:9322/\": read tcp 10.217.0.2:57204->10.217.1.216:9322: read: connection reset by peer" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.194338 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.242536 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerID="cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3" exitCode=0 Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.243344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"2b691f7a-1d61-46ef-bddc-5e692d197f4e","Type":"ContainerDied","Data":"cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3"} Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.243408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"2b691f7a-1d61-46ef-bddc-5e692d197f4e","Type":"ContainerDied","Data":"1738a1de39e7acc8d860c1550f165bb6c77f149dc5efea54d0d69bc77639cecd"} Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.243431 4707 scope.go:117] "RemoveContainer" containerID="cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.243523 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.289694 4707 scope.go:117] "RemoveContainer" containerID="06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.311876 4707 scope.go:117] "RemoveContainer" containerID="cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3" Jan 21 15:32:14 crc kubenswrapper[4707]: E0121 15:32:14.312320 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3\": container with ID starting with cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3 not found: ID does not exist" containerID="cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.312370 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3"} err="failed to get container status \"cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3\": rpc error: code = NotFound desc = could not find container \"cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3\": container with ID starting with cb88e4282c95083891782a9722a6f7fa3b45eb5aa504dec07ae9733352a4c6d3 not found: ID does not exist" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.312402 4707 scope.go:117] "RemoveContainer" containerID="06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c" Jan 21 15:32:14 crc kubenswrapper[4707]: E0121 15:32:14.312754 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c\": container with ID starting with 06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c not found: ID does not exist" containerID="06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.312795 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c"} err="failed to get container status \"06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c\": rpc error: code = NotFound desc = could not find container \"06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c\": container with ID starting with 06974b2175bdc82ab94b58c19eb0f90af58ee7e284567b9d508617c74644f52c not found: ID does not exist" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.342966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-combined-ca-bundle\") pod \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.343106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-custom-prometheus-ca\") pod \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.343142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj285\" (UniqueName: \"kubernetes.io/projected/2b691f7a-1d61-46ef-bddc-5e692d197f4e-kube-api-access-hj285\") pod \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.343225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-config-data\") pod \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.343290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b691f7a-1d61-46ef-bddc-5e692d197f4e-logs\") pod \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\" (UID: \"2b691f7a-1d61-46ef-bddc-5e692d197f4e\") " Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.345928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b691f7a-1d61-46ef-bddc-5e692d197f4e-logs" (OuterVolumeSpecName: "logs") pod "2b691f7a-1d61-46ef-bddc-5e692d197f4e" (UID: "2b691f7a-1d61-46ef-bddc-5e692d197f4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.349968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b691f7a-1d61-46ef-bddc-5e692d197f4e-kube-api-access-hj285" (OuterVolumeSpecName: "kube-api-access-hj285") pod "2b691f7a-1d61-46ef-bddc-5e692d197f4e" (UID: "2b691f7a-1d61-46ef-bddc-5e692d197f4e"). InnerVolumeSpecName "kube-api-access-hj285". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.370754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b691f7a-1d61-46ef-bddc-5e692d197f4e" (UID: "2b691f7a-1d61-46ef-bddc-5e692d197f4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.374295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2b691f7a-1d61-46ef-bddc-5e692d197f4e" (UID: "2b691f7a-1d61-46ef-bddc-5e692d197f4e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.394699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-config-data" (OuterVolumeSpecName: "config-data") pod "2b691f7a-1d61-46ef-bddc-5e692d197f4e" (UID: "2b691f7a-1d61-46ef-bddc-5e692d197f4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.445281 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.445307 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b691f7a-1d61-46ef-bddc-5e692d197f4e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.445316 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.445327 4707 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2b691f7a-1d61-46ef-bddc-5e692d197f4e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.445336 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj285\" (UniqueName: \"kubernetes.io/projected/2b691f7a-1d61-46ef-bddc-5e692d197f4e-kube-api-access-hj285\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.570181 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.577326 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.590685 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:32:14 crc kubenswrapper[4707]: E0121 15:32:14.590988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api-log" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.591007 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api-log" Jan 21 15:32:14 crc kubenswrapper[4707]: E0121 15:32:14.591021 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.591028 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api" Jan 21 15:32:14 
crc kubenswrapper[4707]: I0121 15:32:14.591190 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.591221 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" containerName="watcher-api-log" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.592052 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.594030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-watcher-internal-svc" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.594837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-api-config-data" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.602381 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-watcher-public-svc" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.604290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.749846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.750212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.750244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.750306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-config-data\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.750572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22fbbd00-3dc5-42f1-a015-24e2aac9be07-logs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.750705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-public-tls-certs\") pod \"watcher-api-0\" (UID: 
\"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.750795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6sfx\" (UniqueName: \"kubernetes.io/projected/22fbbd00-3dc5-42f1-a015-24e2aac9be07-kube-api-access-t6sfx\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.824416 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.852733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.852827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-config-data\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.853048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22fbbd00-3dc5-42f1-a015-24e2aac9be07-logs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.853152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-public-tls-certs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.853225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6sfx\" (UniqueName: \"kubernetes.io/projected/22fbbd00-3dc5-42f1-a015-24e2aac9be07-kube-api-access-t6sfx\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.853290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.853321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.853789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22fbbd00-3dc5-42f1-a015-24e2aac9be07-logs\") pod \"watcher-api-0\" (UID: 
\"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.858727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.860913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.861396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.868180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-public-tls-certs\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.873790 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-config-data\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.875730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6sfx\" (UniqueName: \"kubernetes.io/projected/22fbbd00-3dc5-42f1-a015-24e2aac9be07-kube-api-access-t6sfx\") pod \"watcher-api-0\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.885986 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7764cc588-x5xsc"] Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.886202 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api-log" containerID="cri-o://3512429b9d3d64e5ea9920cadc30d8196b5a7fe2ab7f24557369e62e8ccd3a22" gracePeriod=30 Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.886330 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api" containerID="cri-o://6d4cc6dc2fbda90d29592d93dffc5e58dce8b4b4d7475854f4553e5732daaf13" gracePeriod=30 Jan 21 15:32:14 crc kubenswrapper[4707]: I0121 15:32:14.906787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:15 crc kubenswrapper[4707]: I0121 15:32:15.194468 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b691f7a-1d61-46ef-bddc-5e692d197f4e" path="/var/lib/kubelet/pods/2b691f7a-1d61-46ef-bddc-5e692d197f4e/volumes" Jan 21 15:32:15 crc kubenswrapper[4707]: I0121 15:32:15.260177 4707 generic.go:334] "Generic (PLEG): container finished" podID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerID="3512429b9d3d64e5ea9920cadc30d8196b5a7fe2ab7f24557369e62e8ccd3a22" exitCode=143 Jan 21 15:32:15 crc kubenswrapper[4707]: I0121 15:32:15.260222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" event={"ID":"52b4da74-d2e7-4bca-8aa3-492e4b041925","Type":"ContainerDied","Data":"3512429b9d3d64e5ea9920cadc30d8196b5a7fe2ab7f24557369e62e8ccd3a22"} Jan 21 15:32:15 crc kubenswrapper[4707]: I0121 15:32:15.341948 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:32:15 crc kubenswrapper[4707]: W0121 15:32:15.346825 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22fbbd00_3dc5_42f1_a015_24e2aac9be07.slice/crio-7f3ebdec4e685d493529eb54ba646508de7eb1d1971cd20244c84cf17f9c0f4d WatchSource:0}: Error finding container 7f3ebdec4e685d493529eb54ba646508de7eb1d1971cd20244c84cf17f9c0f4d: Status 404 returned error can't find the container with id 7f3ebdec4e685d493529eb54ba646508de7eb1d1971cd20244c84cf17f9c0f4d Jan 21 15:32:15 crc kubenswrapper[4707]: I0121 15:32:15.858014 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:16 crc kubenswrapper[4707]: I0121 15:32:16.268267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"22fbbd00-3dc5-42f1-a015-24e2aac9be07","Type":"ContainerStarted","Data":"9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb"} Jan 21 15:32:16 crc kubenswrapper[4707]: I0121 15:32:16.268503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"22fbbd00-3dc5-42f1-a015-24e2aac9be07","Type":"ContainerStarted","Data":"b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d"} Jan 21 15:32:16 crc kubenswrapper[4707]: I0121 15:32:16.268514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"22fbbd00-3dc5-42f1-a015-24e2aac9be07","Type":"ContainerStarted","Data":"7f3ebdec4e685d493529eb54ba646508de7eb1d1971cd20244c84cf17f9c0f4d"} Jan 21 15:32:16 crc kubenswrapper[4707]: I0121 15:32:16.270954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:16 crc kubenswrapper[4707]: I0121 15:32:16.285974 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-api-0" podStartSLOduration=2.285962613 podStartE2EDuration="2.285962613s" podCreationTimestamp="2026-01-21 15:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:16.28319579 +0000 UTC m=+1833.464712013" watchObservedRunningTime="2026-01-21 15:32:16.285962613 +0000 UTC m=+1833.467478835" Jan 21 15:32:17 crc kubenswrapper[4707]: I0121 15:32:17.279088 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:32:17 crc kubenswrapper[4707]: I0121 15:32:17.967959 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.016041 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.217:9311/healthcheck\": read tcp 10.217.0.2:46830->10.217.1.217:9311: read: connection reset by peer" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.016111 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.217:9311/healthcheck\": read tcp 10.217.0.2:46836->10.217.1.217:9311: read: connection reset by peer" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.282289 4707 generic.go:334] "Generic (PLEG): container finished" podID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerID="6d4cc6dc2fbda90d29592d93dffc5e58dce8b4b4d7475854f4553e5732daaf13" exitCode=0 Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.282359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" event={"ID":"52b4da74-d2e7-4bca-8aa3-492e4b041925","Type":"ContainerDied","Data":"6d4cc6dc2fbda90d29592d93dffc5e58dce8b4b4d7475854f4553e5732daaf13"} Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.366931 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.522773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data-custom\") pod \"52b4da74-d2e7-4bca-8aa3-492e4b041925\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.522840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data\") pod \"52b4da74-d2e7-4bca-8aa3-492e4b041925\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.522890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wghw\" (UniqueName: \"kubernetes.io/projected/52b4da74-d2e7-4bca-8aa3-492e4b041925-kube-api-access-7wghw\") pod \"52b4da74-d2e7-4bca-8aa3-492e4b041925\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.522972 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-combined-ca-bundle\") pod \"52b4da74-d2e7-4bca-8aa3-492e4b041925\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.523029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b4da74-d2e7-4bca-8aa3-492e4b041925-logs\") pod \"52b4da74-d2e7-4bca-8aa3-492e4b041925\" (UID: \"52b4da74-d2e7-4bca-8aa3-492e4b041925\") " Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.523887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b4da74-d2e7-4bca-8aa3-492e4b041925-logs" (OuterVolumeSpecName: "logs") pod "52b4da74-d2e7-4bca-8aa3-492e4b041925" (UID: "52b4da74-d2e7-4bca-8aa3-492e4b041925"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.527562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52b4da74-d2e7-4bca-8aa3-492e4b041925" (UID: "52b4da74-d2e7-4bca-8aa3-492e4b041925"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.534651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b4da74-d2e7-4bca-8aa3-492e4b041925-kube-api-access-7wghw" (OuterVolumeSpecName: "kube-api-access-7wghw") pod "52b4da74-d2e7-4bca-8aa3-492e4b041925" (UID: "52b4da74-d2e7-4bca-8aa3-492e4b041925"). InnerVolumeSpecName "kube-api-access-7wghw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.550198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52b4da74-d2e7-4bca-8aa3-492e4b041925" (UID: "52b4da74-d2e7-4bca-8aa3-492e4b041925"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.574126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data" (OuterVolumeSpecName: "config-data") pod "52b4da74-d2e7-4bca-8aa3-492e4b041925" (UID: "52b4da74-d2e7-4bca-8aa3-492e4b041925"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.625484 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wghw\" (UniqueName: \"kubernetes.io/projected/52b4da74-d2e7-4bca-8aa3-492e4b041925-kube-api-access-7wghw\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.625510 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.625521 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b4da74-d2e7-4bca-8aa3-492e4b041925-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.625529 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:18 crc kubenswrapper[4707]: I0121 15:32:18.625539 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b4da74-d2e7-4bca-8aa3-492e4b041925-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.290036 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.290060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7764cc588-x5xsc" event={"ID":"52b4da74-d2e7-4bca-8aa3-492e4b041925","Type":"ContainerDied","Data":"4edc87e94c48dfae268b1d775f49947081a07b7eb7ecd90e10c4bf94793a944b"} Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.290108 4707 scope.go:117] "RemoveContainer" containerID="6d4cc6dc2fbda90d29592d93dffc5e58dce8b4b4d7475854f4553e5732daaf13" Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.309040 4707 scope.go:117] "RemoveContainer" containerID="3512429b9d3d64e5ea9920cadc30d8196b5a7fe2ab7f24557369e62e8ccd3a22" Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.323671 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7764cc588-x5xsc"] Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.329796 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-7764cc588-x5xsc"] Jan 21 15:32:19 crc kubenswrapper[4707]: I0121 15:32:19.908004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:21 crc kubenswrapper[4707]: I0121 15:32:21.013940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:32:21 crc kubenswrapper[4707]: I0121 15:32:21.190311 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" path="/var/lib/kubelet/pods/52b4da74-d2e7-4bca-8aa3-492e4b041925/volumes" Jan 21 15:32:24 crc kubenswrapper[4707]: I0121 15:32:24.908097 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:24 crc kubenswrapper[4707]: I0121 15:32:24.914369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:25 crc kubenswrapper[4707]: I0121 15:32:25.360245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:32:26 crc kubenswrapper[4707]: I0121 15:32:26.182322 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:32:26 crc kubenswrapper[4707]: E0121 15:32:26.182529 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:32:26 crc kubenswrapper[4707]: I0121 15:32:26.204247 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:32:27 crc kubenswrapper[4707]: I0121 15:32:27.203352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:32:28 crc kubenswrapper[4707]: I0121 15:32:28.259052 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:31 crc kubenswrapper[4707]: I0121 
15:32:31.650554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:34 crc kubenswrapper[4707]: I0121 15:32:34.749306 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.784655 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.834693 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fbc45cc89-glgj4"] Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.834910 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-api" containerID="cri-o://a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5" gracePeriod=30 Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.835252 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-httpd" containerID="cri-o://72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083" gracePeriod=30 Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.994605 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:36 crc kubenswrapper[4707]: E0121 15:32:36.994944 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.994961 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api" Jan 21 15:32:36 crc kubenswrapper[4707]: E0121 15:32:36.994973 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api-log" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.994979 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api-log" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.995132 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.995153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b4da74-d2e7-4bca-8aa3-492e4b041925" containerName="barbican-api-log" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.995662 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.998824 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.998871 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:32:36 crc kubenswrapper[4707]: I0121 15:32:36.999008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-xd9zv" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.002473 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.005614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config-secret\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.005682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.005869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d64\" (UniqueName: \"kubernetes.io/projected/94270f61-4b16-47e0-b2ec-66b8c3409eb0-kube-api-access-s2d64\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.005978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.012502 4707 status_manager.go:875] "Failed to update status for pod" pod="openstack-kuttl-tests/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94270f61-4b16-47e0-b2ec-66b8c3409eb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:32:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:32:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:32:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:32:36Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2d64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:32:36Z\\\"}}\" for pod \"openstack-kuttl-tests\"/\"openstackclient\": pods \"openstackclient\" not found" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.013783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:37 crc kubenswrapper[4707]: E0121 15:32:37.014416 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-s2d64 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.023759 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.032000 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.032903 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.035057 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.044087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config-secret\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtfpx\" (UniqueName: \"kubernetes.io/projected/11f832c1-2aa2-45a2-b766-f0483cfda9e0-kube-api-access-wtfpx\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.108404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d64\" (UniqueName: 
\"kubernetes.io/projected/94270f61-4b16-47e0-b2ec-66b8c3409eb0-kube-api-access-s2d64\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.109472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: E0121 15:32:37.109902 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2d64 for pod openstack-kuttl-tests/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (94270f61-4b16-47e0-b2ec-66b8c3409eb0) does not match the UID in record. The object might have been deleted and then recreated Jan 21 15:32:37 crc kubenswrapper[4707]: E0121 15:32:37.109958 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94270f61-4b16-47e0-b2ec-66b8c3409eb0-kube-api-access-s2d64 podName:94270f61-4b16-47e0-b2ec-66b8c3409eb0 nodeName:}" failed. No retries permitted until 2026-01-21 15:32:37.609943752 +0000 UTC m=+1854.791459974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2d64" (UniqueName: "kubernetes.io/projected/94270f61-4b16-47e0-b2ec-66b8c3409eb0-kube-api-access-s2d64") pod "openstackclient" (UID: "94270f61-4b16-47e0-b2ec-66b8c3409eb0") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (94270f61-4b16-47e0-b2ec-66b8c3409eb0) does not match the UID in record. 
The object might have been deleted and then recreated Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.113292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.113595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config-secret\") pod \"openstackclient\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.209871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.210087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.210111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtfpx\" (UniqueName: \"kubernetes.io/projected/11f832c1-2aa2-45a2-b766-f0483cfda9e0-kube-api-access-wtfpx\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.210138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.210758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.212916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.220742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.223596 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtfpx\" (UniqueName: \"kubernetes.io/projected/11f832c1-2aa2-45a2-b766-f0483cfda9e0-kube-api-access-wtfpx\") pod \"openstackclient\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.348500 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.426133 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerID="72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083" exitCode=0 Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.426224 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.426699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" event={"ID":"7c12f4c3-101b-4361-8d90-6adbf916cbda","Type":"ContainerDied","Data":"72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083"} Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.433446 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.439920 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.442082 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.512987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config\") pod \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.513085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-combined-ca-bundle\") pod \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.513162 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config-secret\") pod \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\" (UID: \"94270f61-4b16-47e0-b2ec-66b8c3409eb0\") " Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.513365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "94270f61-4b16-47e0-b2ec-66b8c3409eb0" (UID: "94270f61-4b16-47e0-b2ec-66b8c3409eb0"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.513429 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2d64\" (UniqueName: \"kubernetes.io/projected/94270f61-4b16-47e0-b2ec-66b8c3409eb0-kube-api-access-s2d64\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.517288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94270f61-4b16-47e0-b2ec-66b8c3409eb0" (UID: "94270f61-4b16-47e0-b2ec-66b8c3409eb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.522110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "94270f61-4b16-47e0-b2ec-66b8c3409eb0" (UID: "94270f61-4b16-47e0-b2ec-66b8c3409eb0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.615148 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.615187 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/94270f61-4b16-47e0-b2ec-66b8c3409eb0-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.615197 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94270f61-4b16-47e0-b2ec-66b8c3409eb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:37 crc kubenswrapper[4707]: I0121 15:32:37.723855 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:32:37 crc kubenswrapper[4707]: W0121 15:32:37.726903 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11f832c1_2aa2_45a2_b766_f0483cfda9e0.slice/crio-7d0a37da2c953c642fa554b84897987695c343940cdb94b547d5fdf93b5154ab WatchSource:0}: Error finding container 7d0a37da2c953c642fa554b84897987695c343940cdb94b547d5fdf93b5154ab: Status 404 returned error can't find the container with id 7d0a37da2c953c642fa554b84897987695c343940cdb94b547d5fdf93b5154ab Jan 21 15:32:38 crc kubenswrapper[4707]: I0121 15:32:38.182983 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:32:38 crc kubenswrapper[4707]: E0121 15:32:38.183955 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:32:38 crc kubenswrapper[4707]: I0121 15:32:38.434002 4707 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:32:38 crc kubenswrapper[4707]: I0121 15:32:38.434020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"11f832c1-2aa2-45a2-b766-f0483cfda9e0","Type":"ContainerStarted","Data":"9998f408842af3a6508e5983cb44f2ecb94e754e96c36fd322e289d1600aab84"} Jan 21 15:32:38 crc kubenswrapper[4707]: I0121 15:32:38.434061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"11f832c1-2aa2-45a2-b766-f0483cfda9e0","Type":"ContainerStarted","Data":"7d0a37da2c953c642fa554b84897987695c343940cdb94b547d5fdf93b5154ab"} Jan 21 15:32:38 crc kubenswrapper[4707]: I0121 15:32:38.435877 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" Jan 21 15:32:38 crc kubenswrapper[4707]: I0121 15:32:38.450767 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.190316 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94270f61-4b16-47e0-b2ec-66b8c3409eb0" path="/var/lib/kubelet/pods/94270f61-4b16-47e0-b2ec-66b8c3409eb0/volumes" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.734343 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.751218 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.751203186 podStartE2EDuration="2.751203186s" podCreationTimestamp="2026-01-21 15:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:38.449090005 +0000 UTC m=+1855.630606227" watchObservedRunningTime="2026-01-21 15:32:39.751203186 +0000 UTC m=+1856.932719408" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.753845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5xv\" (UniqueName: \"kubernetes.io/projected/7c12f4c3-101b-4361-8d90-6adbf916cbda-kube-api-access-mg5xv\") pod \"7c12f4c3-101b-4361-8d90-6adbf916cbda\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.753882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-config\") pod \"7c12f4c3-101b-4361-8d90-6adbf916cbda\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.753907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-ovndb-tls-certs\") pod \"7c12f4c3-101b-4361-8d90-6adbf916cbda\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.753943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-httpd-config\") pod \"7c12f4c3-101b-4361-8d90-6adbf916cbda\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.753983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-combined-ca-bundle\") pod \"7c12f4c3-101b-4361-8d90-6adbf916cbda\" (UID: \"7c12f4c3-101b-4361-8d90-6adbf916cbda\") " Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.763492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7c12f4c3-101b-4361-8d90-6adbf916cbda" (UID: "7c12f4c3-101b-4361-8d90-6adbf916cbda"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.771050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c12f4c3-101b-4361-8d90-6adbf916cbda-kube-api-access-mg5xv" (OuterVolumeSpecName: "kube-api-access-mg5xv") pod "7c12f4c3-101b-4361-8d90-6adbf916cbda" (UID: "7c12f4c3-101b-4361-8d90-6adbf916cbda"). InnerVolumeSpecName "kube-api-access-mg5xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.794852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-config" (OuterVolumeSpecName: "config") pod "7c12f4c3-101b-4361-8d90-6adbf916cbda" (UID: "7c12f4c3-101b-4361-8d90-6adbf916cbda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.806611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c12f4c3-101b-4361-8d90-6adbf916cbda" (UID: "7c12f4c3-101b-4361-8d90-6adbf916cbda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.807006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7c12f4c3-101b-4361-8d90-6adbf916cbda" (UID: "7c12f4c3-101b-4361-8d90-6adbf916cbda"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.855532 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5xv\" (UniqueName: \"kubernetes.io/projected/7c12f4c3-101b-4361-8d90-6adbf916cbda-kube-api-access-mg5xv\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.855555 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.855566 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.855574 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:39 crc kubenswrapper[4707]: I0121 15:32:39.855583 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12f4c3-101b-4361-8d90-6adbf916cbda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.449036 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerID="a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5" exitCode=0 Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.449071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" event={"ID":"7c12f4c3-101b-4361-8d90-6adbf916cbda","Type":"ContainerDied","Data":"a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5"} Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.449094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" event={"ID":"7c12f4c3-101b-4361-8d90-6adbf916cbda","Type":"ContainerDied","Data":"8540af325387fba5474f5dcaaeb6f2a15f06e83d3f08a875bde2af5c7eea2071"} Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.449093 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbc45cc89-glgj4" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.449107 4707 scope.go:117] "RemoveContainer" containerID="72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.482391 4707 scope.go:117] "RemoveContainer" containerID="a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.496616 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fbc45cc89-glgj4"] Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.502762 4707 scope.go:117] "RemoveContainer" containerID="72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.502997 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7fbc45cc89-glgj4"] Jan 21 15:32:40 crc kubenswrapper[4707]: E0121 15:32:40.503220 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083\": container with ID starting with 72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083 not found: ID does not exist" containerID="72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.503250 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083"} err="failed to get container status \"72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083\": rpc error: code = NotFound desc = could not find container \"72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083\": container with ID starting with 72f7ce289c6eda2e8f6a8765f0c35a2774f0920b6ea83438eb264a5849f52083 not found: ID does not exist" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.503271 4707 scope.go:117] "RemoveContainer" containerID="a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5" Jan 21 15:32:40 crc kubenswrapper[4707]: E0121 15:32:40.503563 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5\": container with ID starting with a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5 not found: ID does not exist" containerID="a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.503598 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5"} err="failed to get container status \"a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5\": rpc error: code = NotFound desc = could not find container \"a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5\": container with ID starting with a484ce4579285730efce60d6a83a7b5d595a1dba0dfcc6e2227834a763060ba5 not found: ID does not exist" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.636148 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.637404 4707 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-httpd" containerID="cri-o://2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a" gracePeriod=30 Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.637597 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-log" containerID="cri-o://f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723" gracePeriod=30 Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.818957 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb"] Jan 21 15:32:40 crc kubenswrapper[4707]: E0121 15:32:40.819444 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-httpd" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.819460 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-httpd" Jan 21 15:32:40 crc kubenswrapper[4707]: E0121 15:32:40.819478 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-api" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.819483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-api" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.819630 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-httpd" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.819653 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" containerName="neutron-api" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.820450 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.821853 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.822614 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.827642 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.836852 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb"] Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-internal-tls-certs\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xv96\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-kube-api-access-5xv96\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-run-httpd\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-log-httpd\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-config-data\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-combined-ca-bundle\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-etc-swift\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.871726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-public-tls-certs\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-internal-tls-certs\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xv96\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-kube-api-access-5xv96\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-run-httpd\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-log-httpd\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-config-data\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-combined-ca-bundle\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.973995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-etc-swift\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.974097 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-run-httpd\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.974253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-public-tls-certs\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.974233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-log-httpd\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.980646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-combined-ca-bundle\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.980695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-public-tls-certs\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.980739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-config-data\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.980957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-etc-swift\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.981575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-internal-tls-certs\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:40 crc kubenswrapper[4707]: I0121 15:32:40.989764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xv96\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-kube-api-access-5xv96\") pod \"swift-proxy-84c8d5c4db-f9znb\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.133440 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.189647 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c12f4c3-101b-4361-8d90-6adbf916cbda" path="/var/lib/kubelet/pods/7c12f4c3-101b-4361-8d90-6adbf916cbda/volumes" Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.330883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.331282 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-log" containerID="cri-o://976f48e2b189f2381af11b61d8d7b7156ce939a83f766ece2a01389510c7f928" gracePeriod=30 Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.331635 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-httpd" containerID="cri-o://ca4216f956996cbc2e5bcace21a45a2b9480a81abafc0d0274deb43923eca1ff" gracePeriod=30 Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.459522 4707 generic.go:334] "Generic (PLEG): container finished" podID="179986b8-3ec0-4966-8eb9-621f8465e876" containerID="f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723" exitCode=143 Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.459579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"179986b8-3ec0-4966-8eb9-621f8465e876","Type":"ContainerDied","Data":"f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723"} Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.461398 4707 generic.go:334] "Generic (PLEG): container finished" podID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerID="976f48e2b189f2381af11b61d8d7b7156ce939a83f766ece2a01389510c7f928" exitCode=143 Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.461439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"35d81c05-e993-4076-9d9f-f5a1a10ba669","Type":"ContainerDied","Data":"976f48e2b189f2381af11b61d8d7b7156ce939a83f766ece2a01389510c7f928"} Jan 21 15:32:41 crc kubenswrapper[4707]: I0121 15:32:41.509437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb"] Jan 21 15:32:41 crc kubenswrapper[4707]: W0121 15:32:41.512451 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2af1e8b_8059_4390_9ab4_a4ed516f509d.slice/crio-bad4d798cab896195e02b8d2b431ac1f2c3c484485a32284f142a084fc84f9c9 WatchSource:0}: Error finding container bad4d798cab896195e02b8d2b431ac1f2c3c484485a32284f142a084fc84f9c9: Status 404 returned error can't find the container with id bad4d798cab896195e02b8d2b431ac1f2c3c484485a32284f142a084fc84f9c9 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.140622 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.141074 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="sg-core" 
containerID="cri-o://1aee2995c3d4ccdc925eeebc3ca9feb5c8ee77bef3bbab34e325a26d995dac36" gracePeriod=30 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.141125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-notification-agent" containerID="cri-o://f147ab0f86f668f3ec8bf52d936c30782c1949aa7ed10de4fc9e740b145f5072" gracePeriod=30 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.141217 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="proxy-httpd" containerID="cri-o://5f78dadf11135148036c0ad6ef343c31a86b8b3aa982408f9c591e3a8fe22eb0" gracePeriod=30 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.141039 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-central-agent" containerID="cri-o://c209b57fbadaa82a717e8034092cbff5667f1050841c14612dd7edd344643d8d" gracePeriod=30 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.475396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" event={"ID":"c2af1e8b-8059-4390-9ab4-a4ed516f509d","Type":"ContainerStarted","Data":"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.476033 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.476097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" event={"ID":"c2af1e8b-8059-4390-9ab4-a4ed516f509d","Type":"ContainerStarted","Data":"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.476153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" event={"ID":"c2af1e8b-8059-4390-9ab4-a4ed516f509d","Type":"ContainerStarted","Data":"bad4d798cab896195e02b8d2b431ac1f2c3c484485a32284f142a084fc84f9c9"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480013 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerID="5f78dadf11135148036c0ad6ef343c31a86b8b3aa982408f9c591e3a8fe22eb0" exitCode=0 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480094 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerID="1aee2995c3d4ccdc925eeebc3ca9feb5c8ee77bef3bbab34e325a26d995dac36" exitCode=2 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerDied","Data":"5f78dadf11135148036c0ad6ef343c31a86b8b3aa982408f9c591e3a8fe22eb0"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerDied","Data":"1aee2995c3d4ccdc925eeebc3ca9feb5c8ee77bef3bbab34e325a26d995dac36"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 
15:32:42.480214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerDied","Data":"f147ab0f86f668f3ec8bf52d936c30782c1949aa7ed10de4fc9e740b145f5072"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480157 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerID="f147ab0f86f668f3ec8bf52d936c30782c1949aa7ed10de4fc9e740b145f5072" exitCode=0 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480231 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerID="c209b57fbadaa82a717e8034092cbff5667f1050841c14612dd7edd344643d8d" exitCode=0 Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.480246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerDied","Data":"c209b57fbadaa82a717e8034092cbff5667f1050841c14612dd7edd344643d8d"} Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.752707 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.777004 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" podStartSLOduration=2.776988796 podStartE2EDuration="2.776988796s" podCreationTimestamp="2026-01-21 15:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:42.495337534 +0000 UTC m=+1859.676853756" watchObservedRunningTime="2026-01-21 15:32:42.776988796 +0000 UTC m=+1859.958505018" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-combined-ca-bundle\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h827p\" (UniqueName: \"kubernetes.io/projected/c5b5fbb1-f7b8-4920-8329-08ff2b954651-kube-api-access-h827p\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-scripts\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-run-httpd\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-log-httpd\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" 
(UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-config-data\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.799735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-sg-core-conf-yaml\") pod \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\" (UID: \"c5b5fbb1-f7b8-4920-8329-08ff2b954651\") " Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.800082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.800138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.805796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-scripts" (OuterVolumeSpecName: "scripts") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.808955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b5fbb1-f7b8-4920-8329-08ff2b954651-kube-api-access-h827p" (OuterVolumeSpecName: "kube-api-access-h827p") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "kube-api-access-h827p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.821941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.864462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.870480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-config-data" (OuterVolumeSpecName: "config-data") pod "c5b5fbb1-f7b8-4920-8329-08ff2b954651" (UID: "c5b5fbb1-f7b8-4920-8329-08ff2b954651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901439 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901535 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5b5fbb1-f7b8-4920-8329-08ff2b954651-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901589 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901647 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901717 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901777 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h827p\" (UniqueName: \"kubernetes.io/projected/c5b5fbb1-f7b8-4920-8329-08ff2b954651-kube-api-access-h827p\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:42 crc kubenswrapper[4707]: I0121 15:32:42.901841 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5b5fbb1-f7b8-4920-8329-08ff2b954651-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.490440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c5b5fbb1-f7b8-4920-8329-08ff2b954651","Type":"ContainerDied","Data":"83ee88dde55dcc2f86a16dd2acb5290a35a8fac570c276dc7996a346e922cbff"} Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.490504 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.490510 4707 scope.go:117] "RemoveContainer" containerID="5f78dadf11135148036c0ad6ef343c31a86b8b3aa982408f9c591e3a8fe22eb0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.490573 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.508727 4707 scope.go:117] "RemoveContainer" containerID="1aee2995c3d4ccdc925eeebc3ca9feb5c8ee77bef3bbab34e325a26d995dac36" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.510003 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.517793 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.534083 4707 scope.go:117] "RemoveContainer" containerID="f147ab0f86f668f3ec8bf52d936c30782c1949aa7ed10de4fc9e740b145f5072" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537340 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:43 crc kubenswrapper[4707]: E0121 15:32:43.537659 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="sg-core" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537676 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="sg-core" Jan 21 15:32:43 crc kubenswrapper[4707]: E0121 15:32:43.537687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-central-agent" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-central-agent" Jan 21 15:32:43 crc kubenswrapper[4707]: E0121 15:32:43.537704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-notification-agent" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-notification-agent" Jan 21 15:32:43 crc kubenswrapper[4707]: E0121 15:32:43.537727 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="proxy-httpd" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="proxy-httpd" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537908 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-central-agent" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537927 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="ceilometer-notification-agent" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.537937 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="proxy-httpd" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 
15:32:43.537951 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" containerName="sg-core" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.539342 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.541657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.541730 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.551977 4707 scope.go:117] "RemoveContainer" containerID="c209b57fbadaa82a717e8034092cbff5667f1050841c14612dd7edd344643d8d" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.561296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-run-httpd\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-log-httpd\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-scripts\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhk9q\" (UniqueName: \"kubernetes.io/projected/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-kube-api-access-vhk9q\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-config-data\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.625877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.727842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-config-data\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.727883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.727932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.727987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-run-httpd\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.728002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-log-httpd\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.728027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-scripts\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.728044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhk9q\" (UniqueName: \"kubernetes.io/projected/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-kube-api-access-vhk9q\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.728556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-log-httpd\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.728610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-run-httpd\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.731144 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-config-data\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.732049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-scripts\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.735366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.735796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.752480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhk9q\" (UniqueName: \"kubernetes.io/projected/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-kube-api-access-vhk9q\") pod \"ceilometer-0\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:43 crc kubenswrapper[4707]: I0121 15:32:43.851791 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.207157 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.234851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-scripts\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.234892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-httpd-run\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.234957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-combined-ca-bundle\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.234988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9fs\" (UniqueName: \"kubernetes.io/projected/179986b8-3ec0-4966-8eb9-621f8465e876-kube-api-access-qq9fs\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.235009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.235054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-public-tls-certs\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.235120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-config-data\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.235171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-logs\") pod \"179986b8-3ec0-4966-8eb9-621f8465e876\" (UID: \"179986b8-3ec0-4966-8eb9-621f8465e876\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.236776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.236792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-logs" (OuterVolumeSpecName: "logs") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.246121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-scripts" (OuterVolumeSpecName: "scripts") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.248943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179986b8-3ec0-4966-8eb9-621f8465e876-kube-api-access-qq9fs" (OuterVolumeSpecName: "kube-api-access-qq9fs") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "kube-api-access-qq9fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.272233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.275383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.280106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.292801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-config-data" (OuterVolumeSpecName: "config-data") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.292836 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "179986b8-3ec0-4966-8eb9-621f8465e876" (UID: "179986b8-3ec0-4966-8eb9-621f8465e876"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337409 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337437 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337448 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337457 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9fs\" (UniqueName: \"kubernetes.io/projected/179986b8-3ec0-4966-8eb9-621f8465e876-kube-api-access-qq9fs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337495 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337505 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337514 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179986b8-3ec0-4966-8eb9-621f8465e876-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.337521 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179986b8-3ec0-4966-8eb9-621f8465e876-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.352773 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.439055 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.501063 4707 generic.go:334] "Generic (PLEG): container finished" podID="179986b8-3ec0-4966-8eb9-621f8465e876" containerID="2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a" exitCode=0 Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.501269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"179986b8-3ec0-4966-8eb9-621f8465e876","Type":"ContainerDied","Data":"2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a"} Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.501293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"179986b8-3ec0-4966-8eb9-621f8465e876","Type":"ContainerDied","Data":"59c853e3e09db444bd481ca9d2da0d0d788936cffe1b2f16b541c7a1e89e5d10"} Jan 21 15:32:44 crc 
kubenswrapper[4707]: I0121 15:32:44.501310 4707 scope.go:117] "RemoveContainer" containerID="2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.501403 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.517252 4707 generic.go:334] "Generic (PLEG): container finished" podID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerID="ca4216f956996cbc2e5bcace21a45a2b9480a81abafc0d0274deb43923eca1ff" exitCode=0 Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.517315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"35d81c05-e993-4076-9d9f-f5a1a10ba669","Type":"ContainerDied","Data":"ca4216f956996cbc2e5bcace21a45a2b9480a81abafc0d0274deb43923eca1ff"} Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.524694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerStarted","Data":"9146257ff3e11f72048c59f73df7b16b3ba346374d9460bfc5df080f6c15e078"} Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.539651 4707 scope.go:117] "RemoveContainer" containerID="f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.541298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.556692 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.565316 4707 scope.go:117] "RemoveContainer" containerID="2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a" Jan 21 15:32:44 crc kubenswrapper[4707]: E0121 15:32:44.565660 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a\": container with ID starting with 2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a not found: ID does not exist" containerID="2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.565687 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a"} err="failed to get container status \"2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a\": rpc error: code = NotFound desc = could not find container \"2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a\": container with ID starting with 2a269650790dede75c521160bf08e0d2201e3cf38faf8b1fdfb207a9c1a4d55a not found: ID does not exist" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.565704 4707 scope.go:117] "RemoveContainer" containerID="f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723" Jan 21 15:32:44 crc kubenswrapper[4707]: E0121 15:32:44.565928 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723\": container with ID starting with 
f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723 not found: ID does not exist" containerID="f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.565951 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723"} err="failed to get container status \"f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723\": rpc error: code = NotFound desc = could not find container \"f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723\": container with ID starting with f9738d2e35414778c6c8537e4886d0dabeaede098b4c2d6e9da4ae3b7387e723 not found: ID does not exist" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.570523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:32:44 crc kubenswrapper[4707]: E0121 15:32:44.570852 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-log" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.570868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-log" Jan 21 15:32:44 crc kubenswrapper[4707]: E0121 15:32:44.570906 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-httpd" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.570913 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-httpd" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.571074 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-log" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.571090 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" containerName="glance-httpd" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.571878 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.574359 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.574739 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.577609 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msp5\" (UniqueName: \"kubernetes.io/projected/7b5a0b6a-5b7d-4431-961c-984efc37166a-kube-api-access-5msp5\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.641715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc 
kubenswrapper[4707]: I0121 15:32:44.641730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.731034 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msp5\" (UniqueName: \"kubernetes.io/projected/7b5a0b6a-5b7d-4431-961c-984efc37166a-kube-api-access-5msp5\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743628 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.743676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.746731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.749237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.751622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.757498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.767948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.768125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msp5\" (UniqueName: 
\"kubernetes.io/projected/7b5a0b6a-5b7d-4431-961c-984efc37166a-kube-api-access-5msp5\") pod \"glance-default-external-api-0\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.843982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-combined-ca-bundle\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-scripts\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-config-data\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ml5k\" (UniqueName: \"kubernetes.io/projected/35d81c05-e993-4076-9d9f-f5a1a10ba669-kube-api-access-7ml5k\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-logs\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844213 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-internal-tls-certs\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-httpd-run\") pod \"35d81c05-e993-4076-9d9f-f5a1a10ba669\" (UID: \"35d81c05-e993-4076-9d9f-f5a1a10ba669\") " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.844709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-logs" (OuterVolumeSpecName: "logs") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.846610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.849720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-scripts" (OuterVolumeSpecName: "scripts") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.850399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d81c05-e993-4076-9d9f-f5a1a10ba669-kube-api-access-7ml5k" (OuterVolumeSpecName: "kube-api-access-7ml5k") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "kube-api-access-7ml5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.874956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.881298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-config-data" (OuterVolumeSpecName: "config-data") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.890873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35d81c05-e993-4076-9d9f-f5a1a10ba669" (UID: "35d81c05-e993-4076-9d9f-f5a1a10ba669"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.895206 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946639 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946668 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946678 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946686 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946695 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d81c05-e993-4076-9d9f-f5a1a10ba669-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946703 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ml5k\" (UniqueName: \"kubernetes.io/projected/35d81c05-e993-4076-9d9f-f5a1a10ba669-kube-api-access-7ml5k\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946712 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35d81c05-e993-4076-9d9f-f5a1a10ba669-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.946740 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:32:44 crc kubenswrapper[4707]: I0121 15:32:44.961720 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.056879 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.190858 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="179986b8-3ec0-4966-8eb9-621f8465e876" path="/var/lib/kubelet/pods/179986b8-3ec0-4966-8eb9-621f8465e876/volumes" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.191612 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b5fbb1-f7b8-4920-8329-08ff2b954651" path="/var/lib/kubelet/pods/c5b5fbb1-f7b8-4920-8329-08ff2b954651/volumes" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.273110 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:32:45 crc kubenswrapper[4707]: W0121 15:32:45.276624 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b5a0b6a_5b7d_4431_961c_984efc37166a.slice/crio-067e532ebd8f036c470c5d6529c752eaf43c6d10357df88f3560830645434aa2 WatchSource:0}: Error finding container 067e532ebd8f036c470c5d6529c752eaf43c6d10357df88f3560830645434aa2: Status 404 returned error can't find the container with id 067e532ebd8f036c470c5d6529c752eaf43c6d10357df88f3560830645434aa2 Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.384774 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-cr6fw"] Jan 21 15:32:45 crc kubenswrapper[4707]: E0121 15:32:45.385105 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-log" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.385122 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-log" Jan 21 15:32:45 crc kubenswrapper[4707]: E0121 15:32:45.385163 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-httpd" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.385168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-httpd" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.385340 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-httpd" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.385360 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" containerName="glance-log" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.385870 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.396308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-cr6fw"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.465427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtfgf\" (UniqueName: \"kubernetes.io/projected/5681f814-1ae0-4737-bc71-5e487d92c291-kube-api-access-rtfgf\") pod \"nova-api-db-create-cr6fw\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.465592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5681f814-1ae0-4737-bc71-5e487d92c291-operator-scripts\") pod \"nova-api-db-create-cr6fw\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.497584 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-h7lxj"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.498540 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.510662 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.511599 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.516141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.523946 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-h7lxj"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.529758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.554471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7b5a0b6a-5b7d-4431-961c-984efc37166a","Type":"ContainerStarted","Data":"067e532ebd8f036c470c5d6529c752eaf43c6d10357df88f3560830645434aa2"} Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.555549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerStarted","Data":"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9"} Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.557694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"35d81c05-e993-4076-9d9f-f5a1a10ba669","Type":"ContainerDied","Data":"873fd06338066688f0ba7caec377999a244bd13499209d45c5ad99e28d1102b4"} Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.557725 4707 scope.go:117] "RemoveContainer" containerID="ca4216f956996cbc2e5bcace21a45a2b9480a81abafc0d0274deb43923eca1ff" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.557874 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.567205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxnq\" (UniqueName: \"kubernetes.io/projected/18f92f6b-e051-483b-91ae-9af16316bb7c-kube-api-access-sfxnq\") pod \"nova-api-78f7-account-create-update-hsj98\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.567254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5681f814-1ae0-4737-bc71-5e487d92c291-operator-scripts\") pod \"nova-api-db-create-cr6fw\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.567318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18f92f6b-e051-483b-91ae-9af16316bb7c-operator-scripts\") pod \"nova-api-78f7-account-create-update-hsj98\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.567338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtfgf\" (UniqueName: \"kubernetes.io/projected/5681f814-1ae0-4737-bc71-5e487d92c291-kube-api-access-rtfgf\") pod \"nova-api-db-create-cr6fw\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.567389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/916916dd-fb61-4e93-8aff-948350956ec8-operator-scripts\") pod \"nova-cell0-db-create-h7lxj\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.567410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6p7\" (UniqueName: \"kubernetes.io/projected/916916dd-fb61-4e93-8aff-948350956ec8-kube-api-access-2q6p7\") pod \"nova-cell0-db-create-h7lxj\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.568057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5681f814-1ae0-4737-bc71-5e487d92c291-operator-scripts\") pod \"nova-api-db-create-cr6fw\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.597651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtfgf\" (UniqueName: \"kubernetes.io/projected/5681f814-1ae0-4737-bc71-5e487d92c291-kube-api-access-rtfgf\") pod \"nova-api-db-create-cr6fw\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.600145 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.606912 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.630819 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5q9tj"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.631875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.637908 4707 scope.go:117] "RemoveContainer" containerID="976f48e2b189f2381af11b61d8d7b7156ce939a83f766ece2a01389510c7f928" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.643908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5q9tj"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.650960 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.652232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.657786 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.657866 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.658026 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.668768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmh42\" (UniqueName: \"kubernetes.io/projected/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-kube-api-access-tmh42\") pod \"nova-cell1-db-create-5q9tj\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.668852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/916916dd-fb61-4e93-8aff-948350956ec8-operator-scripts\") pod \"nova-cell0-db-create-h7lxj\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.668887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6p7\" (UniqueName: \"kubernetes.io/projected/916916dd-fb61-4e93-8aff-948350956ec8-kube-api-access-2q6p7\") pod \"nova-cell0-db-create-h7lxj\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.668959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxnq\" (UniqueName: \"kubernetes.io/projected/18f92f6b-e051-483b-91ae-9af16316bb7c-kube-api-access-sfxnq\") pod \"nova-api-78f7-account-create-update-hsj98\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " 
pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.669088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18f92f6b-e051-483b-91ae-9af16316bb7c-operator-scripts\") pod \"nova-api-78f7-account-create-update-hsj98\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.669104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-operator-scripts\") pod \"nova-cell1-db-create-5q9tj\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.669682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/916916dd-fb61-4e93-8aff-948350956ec8-operator-scripts\") pod \"nova-cell0-db-create-h7lxj\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.670466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18f92f6b-e051-483b-91ae-9af16316bb7c-operator-scripts\") pod \"nova-api-78f7-account-create-update-hsj98\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.688618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6p7\" (UniqueName: \"kubernetes.io/projected/916916dd-fb61-4e93-8aff-948350956ec8-kube-api-access-2q6p7\") pod \"nova-cell0-db-create-h7lxj\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.692202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxnq\" (UniqueName: \"kubernetes.io/projected/18f92f6b-e051-483b-91ae-9af16316bb7c-kube-api-access-sfxnq\") pod \"nova-api-78f7-account-create-update-hsj98\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.696305 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.697334 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.698549 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.708518 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.714869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.771248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.771518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-operator-scripts\") pod \"nova-cell0-2a48-account-create-update-sfbrk\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.771663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.771758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.771904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wxs\" (UniqueName: \"kubernetes.io/projected/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-kube-api-access-t4wxs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.771993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.772093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-operator-scripts\") pod \"nova-cell1-db-create-5q9tj\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.776761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4rpf\" (UniqueName: \"kubernetes.io/projected/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-kube-api-access-p4rpf\") pod 
\"nova-cell0-2a48-account-create-update-sfbrk\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.776932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.777042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmh42\" (UniqueName: \"kubernetes.io/projected/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-kube-api-access-tmh42\") pod \"nova-cell1-db-create-5q9tj\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.777191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.777316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.776580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-operator-scripts\") pod \"nova-cell1-db-create-5q9tj\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.796631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmh42\" (UniqueName: \"kubernetes.io/projected/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-kube-api-access-tmh42\") pod \"nova-cell1-db-create-5q9tj\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.842067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.847020 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wxs\" (UniqueName: \"kubernetes.io/projected/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-kube-api-access-t4wxs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4rpf\" (UniqueName: \"kubernetes.io/projected/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-kube-api-access-p4rpf\") pod \"nova-cell0-2a48-account-create-update-sfbrk\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-operator-scripts\") pod \"nova-cell0-2a48-account-create-update-sfbrk\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.878704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.879120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.879584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-operator-scripts\") pod \"nova-cell0-2a48-account-create-update-sfbrk\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.879902 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.880013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.889139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.890326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.897065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4rpf\" (UniqueName: \"kubernetes.io/projected/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-kube-api-access-p4rpf\") pod \"nova-cell0-2a48-account-create-update-sfbrk\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.897620 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.902285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wxs\" (UniqueName: \"kubernetes.io/projected/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-kube-api-access-t4wxs\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.903547 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.905266 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.905378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.909057 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.911239 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6"] Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.958793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.965244 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.987127 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.988146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz89j\" (UniqueName: \"kubernetes.io/projected/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-kube-api-access-bz89j\") pod \"nova-cell1-6cf9-account-create-update-26rc6\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:45 crc kubenswrapper[4707]: I0121 15:32:45.988252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-operator-scripts\") pod \"nova-cell1-6cf9-account-create-update-26rc6\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.089681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz89j\" (UniqueName: \"kubernetes.io/projected/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-kube-api-access-bz89j\") pod \"nova-cell1-6cf9-account-create-update-26rc6\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.089761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-operator-scripts\") pod \"nova-cell1-6cf9-account-create-update-26rc6\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.090596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-operator-scripts\") pod \"nova-cell1-6cf9-account-create-update-26rc6\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:46 crc kubenswrapper[4707]: W0121 15:32:46.110759 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5681f814_1ae0_4737_bc71_5e487d92c291.slice/crio-018249e0c5d18a4bde6fec01806be23c4f2d04ef9141c85e485cc11e6929d54e WatchSource:0}: Error finding container 018249e0c5d18a4bde6fec01806be23c4f2d04ef9141c85e485cc11e6929d54e: Status 404 returned error can't find the container with id 018249e0c5d18a4bde6fec01806be23c4f2d04ef9141c85e485cc11e6929d54e Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.111265 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-cr6fw"] Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.111482 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.115014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz89j\" (UniqueName: \"kubernetes.io/projected/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-kube-api-access-bz89j\") pod \"nova-cell1-6cf9-account-create-update-26rc6\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.167484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.170183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.244205 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.347481 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-h7lxj"] Jan 21 15:32:46 crc kubenswrapper[4707]: W0121 15:32:46.347956 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod916916dd_fb61_4e93_8aff_948350956ec8.slice/crio-cb7e99981dea0725794d2519c93dbfecac76d6757203d5f36cf62fed9f4425f0 WatchSource:0}: Error finding container cb7e99981dea0725794d2519c93dbfecac76d6757203d5f36cf62fed9f4425f0: Status 404 returned error can't find the container with id cb7e99981dea0725794d2519c93dbfecac76d6757203d5f36cf62fed9f4425f0 Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.437955 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98"] Jan 21 15:32:46 crc kubenswrapper[4707]: W0121 15:32:46.445822 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18f92f6b_e051_483b_91ae_9af16316bb7c.slice/crio-d154d4713bd200b45befb6f012d615b81671c2246569575203c664c84f51cec8 WatchSource:0}: Error finding container d154d4713bd200b45befb6f012d615b81671c2246569575203c664c84f51cec8: Status 404 returned error can't find the container with id d154d4713bd200b45befb6f012d615b81671c2246569575203c664c84f51cec8 Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.513192 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5q9tj"] Jan 21 15:32:46 crc kubenswrapper[4707]: W0121 15:32:46.527374 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod605f5170_dbeb_4c0a_a4e0_ca8ba1b93453.slice/crio-fb5e120ea5829f885fc296c1599c698a653c2dbd90bb1ec1a81219f0a54af66c WatchSource:0}: Error finding container fb5e120ea5829f885fc296c1599c698a653c2dbd90bb1ec1a81219f0a54af66c: Status 404 returned error can't find the container with id fb5e120ea5829f885fc296c1599c698a653c2dbd90bb1ec1a81219f0a54af66c Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.591608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.609127 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7b5a0b6a-5b7d-4431-961c-984efc37166a","Type":"ContainerStarted","Data":"2f3a47f92a10db39e366cd34b4513029b880fd9122a25ff4919afeff80052643"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.611488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" event={"ID":"5681f814-1ae0-4737-bc71-5e487d92c291","Type":"ContainerStarted","Data":"883f6570fece7c67c8b79e14447abbdbb70fd24d3a0f4805e5af9f115a6620cd"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.611513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" event={"ID":"5681f814-1ae0-4737-bc71-5e487d92c291","Type":"ContainerStarted","Data":"018249e0c5d18a4bde6fec01806be23c4f2d04ef9141c85e485cc11e6929d54e"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.617416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerStarted","Data":"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.620953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" event={"ID":"18f92f6b-e051-483b-91ae-9af16316bb7c","Type":"ContainerStarted","Data":"d154d4713bd200b45befb6f012d615b81671c2246569575203c664c84f51cec8"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.624033 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" podStartSLOduration=1.624024763 podStartE2EDuration="1.624024763s" podCreationTimestamp="2026-01-21 15:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:46.621200582 +0000 UTC m=+1863.802716804" watchObservedRunningTime="2026-01-21 15:32:46.624024763 +0000 UTC m=+1863.805540985" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.656312 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk"] Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.665372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" event={"ID":"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453","Type":"ContainerStarted","Data":"fb5e120ea5829f885fc296c1599c698a653c2dbd90bb1ec1a81219f0a54af66c"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.669589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" event={"ID":"916916dd-fb61-4e93-8aff-948350956ec8","Type":"ContainerStarted","Data":"85176d83288ee8e0bcd7b88b5097eff0c9c6bf00f9758386dd0f8ace36c01fce"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.669633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" event={"ID":"916916dd-fb61-4e93-8aff-948350956ec8","Type":"ContainerStarted","Data":"cb7e99981dea0725794d2519c93dbfecac76d6757203d5f36cf62fed9f4425f0"} Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.687200 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" podStartSLOduration=1.687159844 podStartE2EDuration="1.687159844s" 
podCreationTimestamp="2026-01-21 15:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:46.681544726 +0000 UTC m=+1863.863060948" watchObservedRunningTime="2026-01-21 15:32:46.687159844 +0000 UTC m=+1863.868676066" Jan 21 15:32:46 crc kubenswrapper[4707]: I0121 15:32:46.772260 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6"] Jan 21 15:32:46 crc kubenswrapper[4707]: W0121 15:32:46.860502 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b9b4ed_3277_4cbf_b22b_377bb4fb33f7.slice/crio-f5adc41a2608a546419839efef21802285e4d7a5a4bc87329099b3de9d7f158c WatchSource:0}: Error finding container f5adc41a2608a546419839efef21802285e4d7a5a4bc87329099b3de9d7f158c: Status 404 returned error can't find the container with id f5adc41a2608a546419839efef21802285e4d7a5a4bc87329099b3de9d7f158c Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.190437 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d81c05-e993-4076-9d9f-f5a1a10ba669" path="/var/lib/kubelet/pods/35d81c05-e993-4076-9d9f-f5a1a10ba669/volumes" Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.679470 4707 generic.go:334] "Generic (PLEG): container finished" podID="5681f814-1ae0-4737-bc71-5e487d92c291" containerID="883f6570fece7c67c8b79e14447abbdbb70fd24d3a0f4805e5af9f115a6620cd" exitCode=0 Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.679511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" event={"ID":"5681f814-1ae0-4737-bc71-5e487d92c291","Type":"ContainerDied","Data":"883f6570fece7c67c8b79e14447abbdbb70fd24d3a0f4805e5af9f115a6620cd"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.682724 4707 generic.go:334] "Generic (PLEG): container finished" podID="08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" containerID="f36d8c9033a891382c33bba12614c135ec6171dacb14457377832a840b45c6be" exitCode=0 Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.682783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" event={"ID":"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7","Type":"ContainerDied","Data":"f36d8c9033a891382c33bba12614c135ec6171dacb14457377832a840b45c6be"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.682844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" event={"ID":"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7","Type":"ContainerStarted","Data":"f5adc41a2608a546419839efef21802285e4d7a5a4bc87329099b3de9d7f158c"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.688379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7b5a0b6a-5b7d-4431-961c-984efc37166a","Type":"ContainerStarted","Data":"73da13975b9a0fea75976713ffc06ad0487d928c65d1dd4b7b9f66279d719295"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.702218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerStarted","Data":"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.704083 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="18f92f6b-e051-483b-91ae-9af16316bb7c" containerID="2d423c2230ae9d1a2174a39d585283c85105561e536de899f464d70c0a04a0eb" exitCode=0 Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.704141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" event={"ID":"18f92f6b-e051-483b-91ae-9af16316bb7c","Type":"ContainerDied","Data":"2d423c2230ae9d1a2174a39d585283c85105561e536de899f464d70c0a04a0eb"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.706335 4707 generic.go:334] "Generic (PLEG): container finished" podID="605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" containerID="9e9209849ce6017488f4669fc17def86f04557718a632745fb949e3858aed61d" exitCode=0 Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.706400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" event={"ID":"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453","Type":"ContainerDied","Data":"9e9209849ce6017488f4669fc17def86f04557718a632745fb949e3858aed61d"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.708110 4707 generic.go:334] "Generic (PLEG): container finished" podID="916916dd-fb61-4e93-8aff-948350956ec8" containerID="85176d83288ee8e0bcd7b88b5097eff0c9c6bf00f9758386dd0f8ace36c01fce" exitCode=0 Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.708138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" event={"ID":"916916dd-fb61-4e93-8aff-948350956ec8","Type":"ContainerDied","Data":"85176d83288ee8e0bcd7b88b5097eff0c9c6bf00f9758386dd0f8ace36c01fce"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.716751 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcddd37c-c543-435e-94d7-cf9c9fd75f0a" containerID="a24de2a1a0d2d1105d4f042d523feff72032cf29f32b7f88df585a754e253a44" exitCode=0 Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.716850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" event={"ID":"fcddd37c-c543-435e-94d7-cf9c9fd75f0a","Type":"ContainerDied","Data":"a24de2a1a0d2d1105d4f042d523feff72032cf29f32b7f88df585a754e253a44"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.716867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" event={"ID":"fcddd37c-c543-435e-94d7-cf9c9fd75f0a","Type":"ContainerStarted","Data":"c963ac72f925a33a00abdf773a01ac0798563107b463bcf5903682e8f4e33ad0"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.721292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360","Type":"ContainerStarted","Data":"a33bcbf9b6be5d2e1482cb5138bc02de1e5c5fd932db478a845ca93ad2949f3b"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.721361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360","Type":"ContainerStarted","Data":"47ac127f3b5cc10e7c294ce42c9f26d5ecf141b91f6b7b707cd93e19f2a24533"} Jan 21 15:32:47 crc kubenswrapper[4707]: I0121 15:32:47.727243 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.727230293 podStartE2EDuration="3.727230293s" podCreationTimestamp="2026-01-21 15:32:44 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:47.717082862 +0000 UTC m=+1864.898599104" watchObservedRunningTime="2026-01-21 15:32:47.727230293 +0000 UTC m=+1864.908746515" Jan 21 15:32:48 crc kubenswrapper[4707]: I0121 15:32:48.273620 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:48 crc kubenswrapper[4707]: I0121 15:32:48.731849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerStarted","Data":"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df"} Jan 21 15:32:48 crc kubenswrapper[4707]: I0121 15:32:48.731957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:48 crc kubenswrapper[4707]: I0121 15:32:48.733453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360","Type":"ContainerStarted","Data":"b02f577e02d6cede6688dd7b75593a1e9627c1ae835ea04f9d891f4fea9be490"} Jan 21 15:32:48 crc kubenswrapper[4707]: I0121 15:32:48.759947 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.163701443 podStartE2EDuration="5.759933469s" podCreationTimestamp="2026-01-21 15:32:43 +0000 UTC" firstStartedPulling="2026-01-21 15:32:44.282200849 +0000 UTC m=+1861.463717072" lastFinishedPulling="2026-01-21 15:32:47.878432876 +0000 UTC m=+1865.059949098" observedRunningTime="2026-01-21 15:32:48.753604809 +0000 UTC m=+1865.935121032" watchObservedRunningTime="2026-01-21 15:32:48.759933469 +0000 UTC m=+1865.941449691" Jan 21 15:32:48 crc kubenswrapper[4707]: I0121 15:32:48.778572 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.778557417 podStartE2EDuration="3.778557417s" podCreationTimestamp="2026-01-21 15:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:48.775198011 +0000 UTC m=+1865.956714233" watchObservedRunningTime="2026-01-21 15:32:48.778557417 +0000 UTC m=+1865.960073639" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.064568 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.247749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/916916dd-fb61-4e93-8aff-948350956ec8-operator-scripts\") pod \"916916dd-fb61-4e93-8aff-948350956ec8\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.247780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6p7\" (UniqueName: \"kubernetes.io/projected/916916dd-fb61-4e93-8aff-948350956ec8-kube-api-access-2q6p7\") pod \"916916dd-fb61-4e93-8aff-948350956ec8\" (UID: \"916916dd-fb61-4e93-8aff-948350956ec8\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.248278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916916dd-fb61-4e93-8aff-948350956ec8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "916916dd-fb61-4e93-8aff-948350956ec8" (UID: "916916dd-fb61-4e93-8aff-948350956ec8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.248681 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/916916dd-fb61-4e93-8aff-948350956ec8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.254990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916916dd-fb61-4e93-8aff-948350956ec8-kube-api-access-2q6p7" (OuterVolumeSpecName: "kube-api-access-2q6p7") pod "916916dd-fb61-4e93-8aff-948350956ec8" (UID: "916916dd-fb61-4e93-8aff-948350956ec8"). InnerVolumeSpecName "kube-api-access-2q6p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.258564 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.295885 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.301611 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.307964 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.317277 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.350916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-operator-scripts\") pod \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.350959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmh42\" (UniqueName: \"kubernetes.io/projected/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-kube-api-access-tmh42\") pod \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\" (UID: \"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5681f814-1ae0-4737-bc71-5e487d92c291-operator-scripts\") pod \"5681f814-1ae0-4737-bc71-5e487d92c291\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18f92f6b-e051-483b-91ae-9af16316bb7c-operator-scripts\") pod \"18f92f6b-e051-483b-91ae-9af16316bb7c\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-operator-scripts\") pod \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfxnq\" (UniqueName: \"kubernetes.io/projected/18f92f6b-e051-483b-91ae-9af16316bb7c-kube-api-access-sfxnq\") pod \"18f92f6b-e051-483b-91ae-9af16316bb7c\" (UID: \"18f92f6b-e051-483b-91ae-9af16316bb7c\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18f92f6b-e051-483b-91ae-9af16316bb7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18f92f6b-e051-483b-91ae-9af16316bb7c" (UID: "18f92f6b-e051-483b-91ae-9af16316bb7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5681f814-1ae0-4737-bc71-5e487d92c291-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5681f814-1ae0-4737-bc71-5e487d92c291" (UID: "5681f814-1ae0-4737-bc71-5e487d92c291"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351841 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5681f814-1ae0-4737-bc71-5e487d92c291-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351859 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18f92f6b-e051-483b-91ae-9af16316bb7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351870 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6p7\" (UniqueName: \"kubernetes.io/projected/916916dd-fb61-4e93-8aff-948350956ec8-kube-api-access-2q6p7\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.351850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" (UID: "605f5170-dbeb-4c0a-a4e0-ca8ba1b93453"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.352705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcddd37c-c543-435e-94d7-cf9c9fd75f0a" (UID: "fcddd37c-c543-435e-94d7-cf9c9fd75f0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.353770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-kube-api-access-tmh42" (OuterVolumeSpecName: "kube-api-access-tmh42") pod "605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" (UID: "605f5170-dbeb-4c0a-a4e0-ca8ba1b93453"). InnerVolumeSpecName "kube-api-access-tmh42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.354041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f92f6b-e051-483b-91ae-9af16316bb7c-kube-api-access-sfxnq" (OuterVolumeSpecName: "kube-api-access-sfxnq") pod "18f92f6b-e051-483b-91ae-9af16316bb7c" (UID: "18f92f6b-e051-483b-91ae-9af16316bb7c"). InnerVolumeSpecName "kube-api-access-sfxnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4rpf\" (UniqueName: \"kubernetes.io/projected/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-kube-api-access-p4rpf\") pod \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\" (UID: \"fcddd37c-c543-435e-94d7-cf9c9fd75f0a\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtfgf\" (UniqueName: \"kubernetes.io/projected/5681f814-1ae0-4737-bc71-5e487d92c291-kube-api-access-rtfgf\") pod \"5681f814-1ae0-4737-bc71-5e487d92c291\" (UID: \"5681f814-1ae0-4737-bc71-5e487d92c291\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-operator-scripts\") pod \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz89j\" (UniqueName: \"kubernetes.io/projected/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-kube-api-access-bz89j\") pod \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\" (UID: \"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7\") " Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452692 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452710 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfxnq\" (UniqueName: \"kubernetes.io/projected/18f92f6b-e051-483b-91ae-9af16316bb7c-kube-api-access-sfxnq\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452721 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.452729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmh42\" (UniqueName: \"kubernetes.io/projected/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453-kube-api-access-tmh42\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.453292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" (UID: "08b9b4ed-3277-4cbf-b22b-377bb4fb33f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.455200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-kube-api-access-bz89j" (OuterVolumeSpecName: "kube-api-access-bz89j") pod "08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" (UID: "08b9b4ed-3277-4cbf-b22b-377bb4fb33f7"). InnerVolumeSpecName "kube-api-access-bz89j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.455652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5681f814-1ae0-4737-bc71-5e487d92c291-kube-api-access-rtfgf" (OuterVolumeSpecName: "kube-api-access-rtfgf") pod "5681f814-1ae0-4737-bc71-5e487d92c291" (UID: "5681f814-1ae0-4737-bc71-5e487d92c291"). InnerVolumeSpecName "kube-api-access-rtfgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.456674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-kube-api-access-p4rpf" (OuterVolumeSpecName: "kube-api-access-p4rpf") pod "fcddd37c-c543-435e-94d7-cf9c9fd75f0a" (UID: "fcddd37c-c543-435e-94d7-cf9c9fd75f0a"). InnerVolumeSpecName "kube-api-access-p4rpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.553594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtfgf\" (UniqueName: \"kubernetes.io/projected/5681f814-1ae0-4737-bc71-5e487d92c291-kube-api-access-rtfgf\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.553627 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.553637 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz89j\" (UniqueName: \"kubernetes.io/projected/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7-kube-api-access-bz89j\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.553645 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4rpf\" (UniqueName: \"kubernetes.io/projected/fcddd37c-c543-435e-94d7-cf9c9fd75f0a-kube-api-access-p4rpf\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.743215 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.743206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk" event={"ID":"fcddd37c-c543-435e-94d7-cf9c9fd75f0a","Type":"ContainerDied","Data":"c963ac72f925a33a00abdf773a01ac0798563107b463bcf5903682e8f4e33ad0"} Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.743340 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c963ac72f925a33a00abdf773a01ac0798563107b463bcf5903682e8f4e33ad0" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.744663 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.744659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-cr6fw" event={"ID":"5681f814-1ae0-4737-bc71-5e487d92c291","Type":"ContainerDied","Data":"018249e0c5d18a4bde6fec01806be23c4f2d04ef9141c85e485cc11e6929d54e"} Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.744777 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018249e0c5d18a4bde6fec01806be23c4f2d04ef9141c85e485cc11e6929d54e" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.745972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" event={"ID":"18f92f6b-e051-483b-91ae-9af16316bb7c","Type":"ContainerDied","Data":"d154d4713bd200b45befb6f012d615b81671c2246569575203c664c84f51cec8"} Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.746005 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d154d4713bd200b45befb6f012d615b81671c2246569575203c664c84f51cec8" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.745985 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.747252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" event={"ID":"605f5170-dbeb-4c0a-a4e0-ca8ba1b93453","Type":"ContainerDied","Data":"fb5e120ea5829f885fc296c1599c698a653c2dbd90bb1ec1a81219f0a54af66c"} Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.747489 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb5e120ea5829f885fc296c1599c698a653c2dbd90bb1ec1a81219f0a54af66c" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.747278 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5q9tj" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.748258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" event={"ID":"916916dd-fb61-4e93-8aff-948350956ec8","Type":"ContainerDied","Data":"cb7e99981dea0725794d2519c93dbfecac76d6757203d5f36cf62fed9f4425f0"} Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.748283 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7e99981dea0725794d2519c93dbfecac76d6757203d5f36cf62fed9f4425f0" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.748319 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-h7lxj" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" event={"ID":"08b9b4ed-3277-4cbf-b22b-377bb4fb33f7","Type":"ContainerDied","Data":"f5adc41a2608a546419839efef21802285e4d7a5a4bc87329099b3de9d7f158c"} Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751164 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5adc41a2608a546419839efef21802285e4d7a5a4bc87329099b3de9d7f158c" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751230 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6" Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751343 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-central-agent" containerID="cri-o://f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" gracePeriod=30 Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751348 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="proxy-httpd" containerID="cri-o://b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" gracePeriod=30 Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751371 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="sg-core" containerID="cri-o://25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" gracePeriod=30 Jan 21 15:32:49 crc kubenswrapper[4707]: I0121 15:32:49.751392 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-notification-agent" containerID="cri-o://f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" gracePeriod=30 Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.054350 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1b9e5a_e5ba_4a67_9326_a588a05f73fa.slice/crio-conmon-f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.358113 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.469716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhk9q\" (UniqueName: \"kubernetes.io/projected/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-kube-api-access-vhk9q\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.469875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-run-httpd\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.470115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-log-httpd\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.470373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-combined-ca-bundle\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.470683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-config-data\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.470975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.471019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-scripts\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.471078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-sg-core-conf-yaml\") pod \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\" (UID: \"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa\") " Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.471139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.475565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-kube-api-access-vhk9q" (OuterVolumeSpecName: "kube-api-access-vhk9q") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "kube-api-access-vhk9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.476605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-scripts" (OuterVolumeSpecName: "scripts") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.491998 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.492023 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhk9q\" (UniqueName: \"kubernetes.io/projected/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-kube-api-access-vhk9q\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.492036 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.492046 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.496091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.531222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.546378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-config-data" (OuterVolumeSpecName: "config-data") pod "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" (UID: "ec1b9e5a-e5ba-4a67-9326-a588a05f73fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.594210 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.594243 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.594253 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760751 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" exitCode=0 Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760783 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" exitCode=2 Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760792 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" exitCode=0 Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760799 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" exitCode=0 Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerDied","Data":"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df"} Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerDied","Data":"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4"} Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerDied","Data":"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1"} Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerDied","Data":"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9"} Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ec1b9e5a-e5ba-4a67-9326-a588a05f73fa","Type":"ContainerDied","Data":"9146257ff3e11f72048c59f73df7b16b3ba346374d9460bfc5df080f6c15e078"} Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.760907 4707 scope.go:117] "RemoveContainer" 
containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.761017 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.795060 4707 scope.go:117] "RemoveContainer" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.805693 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.815517 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.818834 4707 scope.go:117] "RemoveContainer" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.822783 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823219 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5681f814-1ae0-4737-bc71-5e487d92c291" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823236 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5681f814-1ae0-4737-bc71-5e487d92c291" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823252 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-notification-agent" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823259 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-notification-agent" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823270 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="sg-core" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823276 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="sg-core" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823285 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f92f6b-e051-483b-91ae-9af16316bb7c" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823291 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f92f6b-e051-483b-91ae-9af16316bb7c" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823311 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-central-agent" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823317 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-central-agent" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823324 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcddd37c-c543-435e-94d7-cf9c9fd75f0a" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823330 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcddd37c-c543-435e-94d7-cf9c9fd75f0a" 
containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823345 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823351 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823360 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916916dd-fb61-4e93-8aff-948350956ec8" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823365 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="916916dd-fb61-4e93-8aff-948350956ec8" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823372 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="proxy-httpd" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823377 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="proxy-httpd" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.823384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823389 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823544 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823553 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823562 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcddd37c-c543-435e-94d7-cf9c9fd75f0a" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823572 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5681f814-1ae0-4737-bc71-5e487d92c291" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823580 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-central-agent" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823588 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f92f6b-e051-483b-91ae-9af16316bb7c" containerName="mariadb-account-create-update" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823594 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="ceilometer-notification-agent" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823604 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" containerName="sg-core" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823612 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" 
containerName="proxy-httpd" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.823623 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="916916dd-fb61-4e93-8aff-948350956ec8" containerName="mariadb-database-create" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.825232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.827370 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.828189 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.830922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.854078 4707 scope.go:117] "RemoveContainer" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.881655 4707 scope.go:117] "RemoveContainer" containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.881982 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": container with ID starting with b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df not found: ID does not exist" containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.882021 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df"} err="failed to get container status \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": rpc error: code = NotFound desc = could not find container \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": container with ID starting with b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.882045 4707 scope.go:117] "RemoveContainer" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.882379 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": container with ID starting with 25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4 not found: ID does not exist" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.882402 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4"} err="failed to get container status \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": rpc error: code = NotFound desc = could not find container \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": container with ID starting with 
25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.882417 4707 scope.go:117] "RemoveContainer" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.883363 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": container with ID starting with f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1 not found: ID does not exist" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.883386 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1"} err="failed to get container status \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": rpc error: code = NotFound desc = could not find container \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": container with ID starting with f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.883399 4707 scope.go:117] "RemoveContainer" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" Jan 21 15:32:50 crc kubenswrapper[4707]: E0121 15:32:50.883744 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": container with ID starting with f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9 not found: ID does not exist" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.883785 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9"} err="failed to get container status \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": rpc error: code = NotFound desc = could not find container \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": container with ID starting with f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.883825 4707 scope.go:117] "RemoveContainer" containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.884445 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df"} err="failed to get container status \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": rpc error: code = NotFound desc = could not find container \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": container with ID starting with b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.884469 4707 scope.go:117] "RemoveContainer" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" Jan 21 15:32:50 crc 
kubenswrapper[4707]: I0121 15:32:50.884794 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4"} err="failed to get container status \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": rpc error: code = NotFound desc = could not find container \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": container with ID starting with 25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.884943 4707 scope.go:117] "RemoveContainer" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.887595 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1"} err="failed to get container status \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": rpc error: code = NotFound desc = could not find container \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": container with ID starting with f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.887631 4707 scope.go:117] "RemoveContainer" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.892593 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9"} err="failed to get container status \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": rpc error: code = NotFound desc = could not find container \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": container with ID starting with f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.892628 4707 scope.go:117] "RemoveContainer" containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893043 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df"} err="failed to get container status \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": rpc error: code = NotFound desc = could not find container \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": container with ID starting with b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893066 4707 scope.go:117] "RemoveContainer" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893389 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4"} err="failed to get container status \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": rpc error: code = NotFound desc = could not find container \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": container with ID 
starting with 25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893407 4707 scope.go:117] "RemoveContainer" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893649 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1"} err="failed to get container status \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": rpc error: code = NotFound desc = could not find container \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": container with ID starting with f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893672 4707 scope.go:117] "RemoveContainer" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893945 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9"} err="failed to get container status \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": rpc error: code = NotFound desc = could not find container \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": container with ID starting with f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.893966 4707 scope.go:117] "RemoveContainer" containerID="b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894268 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df"} err="failed to get container status \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": rpc error: code = NotFound desc = could not find container \"b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df\": container with ID starting with b4b46db13bfbd958c44e1c79618525757e858754182d3fb496c53a46947b79df not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894289 4707 scope.go:117] "RemoveContainer" containerID="25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894495 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4"} err="failed to get container status \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": rpc error: code = NotFound desc = could not find container \"25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4\": container with ID starting with 25e7e55443e185fe25a61d98de37397f2ae0dcab52aa2031c5a39411dc82f1e4 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894515 4707 scope.go:117] "RemoveContainer" containerID="f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894719 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1"} err="failed to get container status \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": rpc error: code = NotFound desc = could not find container \"f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1\": container with ID starting with f8a0d04c0423590b512bd69b89d80899696d5068a416ea203bd971756f752bf1 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894746 4707 scope.go:117] "RemoveContainer" containerID="f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.894990 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9"} err="failed to get container status \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": rpc error: code = NotFound desc = could not find container \"f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9\": container with ID starting with f11fc757d580f689d8f50841872ce790766be6196353318e2c7d2beeed0727e9 not found: ID does not exist" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-log-httpd\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-config-data\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqttz\" (UniqueName: \"kubernetes.io/projected/fb51953b-a321-4fd3-830a-0109c2c8c207-kube-api-access-sqttz\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-run-httpd\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:50 crc kubenswrapper[4707]: I0121 15:32:50.899603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-scripts\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.000941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-config-data\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.000984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqttz\" (UniqueName: \"kubernetes.io/projected/fb51953b-a321-4fd3-830a-0109c2c8c207-kube-api-access-sqttz\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.001064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-run-httpd\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.001096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-scripts\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.001145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.001174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.001200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-log-httpd\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.001949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-run-httpd\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.003830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-log-httpd\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.012426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.012437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-scripts\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.013298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.014372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-config-data\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.016750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqttz\" (UniqueName: \"kubernetes.io/projected/fb51953b-a321-4fd3-830a-0109c2c8c207-kube-api-access-sqttz\") pod \"ceilometer-0\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.071773 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns"] Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.072704 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.085556 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns"] Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.089829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.089971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rsjbw" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.090349 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.102687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-config-data\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.102730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-scripts\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.102845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qtj\" (UniqueName: \"kubernetes.io/projected/9e7eb849-a2d7-4ac3-88cc-977992b13490-kube-api-access-q7qtj\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.102905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.149488 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.182749 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:32:51 crc kubenswrapper[4707]: E0121 15:32:51.183165 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.191913 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1b9e5a-e5ba-4a67-9326-a588a05f73fa" path="/var/lib/kubelet/pods/ec1b9e5a-e5ba-4a67-9326-a588a05f73fa/volumes" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.203737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7qtj\" (UniqueName: \"kubernetes.io/projected/9e7eb849-a2d7-4ac3-88cc-977992b13490-kube-api-access-q7qtj\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.203920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.203978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-config-data\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.204006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-scripts\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.207475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-scripts\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.208275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-config-data\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.211305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.221716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7qtj\" (UniqueName: \"kubernetes.io/projected/9e7eb849-a2d7-4ac3-88cc-977992b13490-kube-api-access-q7qtj\") pod \"nova-cell0-conductor-db-sync-7bdns\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.392546 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.538591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.725394 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.725626 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-decision-engine-0" podUID="4eebbb84-d09f-44b4-9ad2-5f715fd16644" containerName="watcher-decision-engine" containerID="cri-o://1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b" gracePeriod=30 Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.772116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerStarted","Data":"cc93b6cbfb74f1bb56e2bbedd74ef00fcd6be497f7acfbd3bafc155a284deb7e"} Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.795477 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns"] Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.898318 4707 scope.go:117] "RemoveContainer" containerID="158ef82383ebc483ed65ed5a15084ea8125ee0ba76ff7dc77f0ac7243069e6fa" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.919687 4707 scope.go:117] "RemoveContainer" containerID="473132bae3b4f1a358a9fce0e4dd49f77c32fea544af9d8275f4b91ef2bdb12f" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.954949 4707 scope.go:117] "RemoveContainer" containerID="5b2b95e85c5e4eda7cdfebc4edbda64106a9ba4ff11ec9077edac721fc1d7244" Jan 21 15:32:51 crc kubenswrapper[4707]: I0121 15:32:51.979277 4707 scope.go:117] "RemoveContainer" containerID="762d5296ac937d5e18cdab76d4b50154815f31ea6c484242500f5fa8c55bfbe3" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.000878 4707 scope.go:117] "RemoveContainer" containerID="9ee60ee038e2abfb8251554c9075cf75988811fa8396a5cc948a9afaf1fcf228" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.051297 4707 scope.go:117] "RemoveContainer" containerID="0912e37e74ceeb9d3d9eb417056786f9f77c16bb553ec5e0be0c220aa1c5ddeb" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.079501 4707 scope.go:117] "RemoveContainer" containerID="67029361353c98122cfe1c9817668815bab4e2b6c15bd6afd746615a9e8b5ee9" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.189718 4707 scope.go:117] "RemoveContainer" containerID="ee4846a1605c12513d522993ab369007eadfe8110bdee639d33d4b47837aad9c" Jan 21 15:32:52 crc 
kubenswrapper[4707]: I0121 15:32:52.255649 4707 scope.go:117] "RemoveContainer" containerID="a33add28ee54c3f16a7ed662f08ebebecc8c61219584e3106caabddc6c3f4203" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.311607 4707 scope.go:117] "RemoveContainer" containerID="359db13546bda46dfe24fc62a12209a680cd159a00de210a56790fbb67e0c993" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.377173 4707 scope.go:117] "RemoveContainer" containerID="8becf27fcce8f470863c589f92c7787a471ea570cea12e938de611fc3ca3f044" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.402125 4707 scope.go:117] "RemoveContainer" containerID="440a9bfead233d507ab9aea7087704ef0f819f59103ccdbb90e238448f2e9fc0" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.420648 4707 scope.go:117] "RemoveContainer" containerID="d92bfb954044f93a1d2d296e0bc8b3543217d2a0fefaa7612d5415b7cfb8164c" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.443727 4707 scope.go:117] "RemoveContainer" containerID="aa6c42ba172b95b7cc1127e455450f045ffbfb0a3f4c0d221599cca192bdf08e" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.461854 4707 scope.go:117] "RemoveContainer" containerID="a202955f6373494fbeb43b833189545087a0c39e0d98b4567a8ab1c1d4688e12" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.482408 4707 scope.go:117] "RemoveContainer" containerID="1bea5fb4ea7834020b5704135d41ce7f5437b5498e017d423fba0f6f0c7c8d38" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.502126 4707 scope.go:117] "RemoveContainer" containerID="083e4cd32b052f682b3d16aab2992cbad4e1ff2d3385eda7a355f1297585116c" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.525969 4707 scope.go:117] "RemoveContainer" containerID="5936c739ea4cec2b5636f30e7f2f0827fbed8af38286bc32dafbc452cfcb0151" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.555131 4707 scope.go:117] "RemoveContainer" containerID="208c0662db0015e98e63b50f6623e6c048e772144dda6cb679b0e325256766b1" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.591400 4707 scope.go:117] "RemoveContainer" containerID="0123d569abdf67071620acec970d42c7b8e7f5e4554d50dce12151ac907d347e" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.632983 4707 scope.go:117] "RemoveContainer" containerID="02e28dadd8fc9e9b163dde4dbf029a252158949885ad335282d6766892c01e0e" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.666093 4707 scope.go:117] "RemoveContainer" containerID="d1c16be54872c293cc9299362e17bd0ce9ffc2c91beeb312a1ec0ca339b4c71b" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.710758 4707 scope.go:117] "RemoveContainer" containerID="56b307e8ae01dc1efee73790e46167abc0507fd0d2427ed1c12896d92831d80b" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.737575 4707 scope.go:117] "RemoveContainer" containerID="d1e0f194bb5efcc61be86150df728c212bfcbb440d89e9a94e9e2ba84136344f" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.758368 4707 scope.go:117] "RemoveContainer" containerID="30177fc1fa90681a04365549e6ac7cf1dbd60c425bb4bb3154414d2533f14fa8" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.786981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" event={"ID":"9e7eb849-a2d7-4ac3-88cc-977992b13490","Type":"ContainerStarted","Data":"3ab85ee94bca50a4accee416f902c8f5749d6ac2a33fa89d6c447eb295d21860"} Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.787020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" 
event={"ID":"9e7eb849-a2d7-4ac3-88cc-977992b13490","Type":"ContainerStarted","Data":"2e0b99ee6e64a07cc321a9c12d33a744581da6cf5a2b90219f000d9d4a2f41b9"} Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.802372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerStarted","Data":"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13"} Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.808511 4707 scope.go:117] "RemoveContainer" containerID="5c36dcae3ed936bb2d60dcf3ecca40ea3f567386b90ad431c335c8aa57debe7e" Jan 21 15:32:52 crc kubenswrapper[4707]: I0121 15:32:52.809193 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" podStartSLOduration=1.809166791 podStartE2EDuration="1.809166791s" podCreationTimestamp="2026-01-21 15:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:52.80268213 +0000 UTC m=+1869.984198351" watchObservedRunningTime="2026-01-21 15:32:52.809166791 +0000 UTC m=+1869.990683014" Jan 21 15:32:53 crc kubenswrapper[4707]: I0121 15:32:53.119191 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:53 crc kubenswrapper[4707]: I0121 15:32:53.846792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerStarted","Data":"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025"} Jan 21 15:32:53 crc kubenswrapper[4707]: I0121 15:32:53.846883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerStarted","Data":"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d"} Jan 21 15:32:54 crc kubenswrapper[4707]: I0121 15:32:54.895951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:54 crc kubenswrapper[4707]: I0121 15:32:54.896197 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:54 crc kubenswrapper[4707]: I0121 15:32:54.927250 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:54 crc kubenswrapper[4707]: I0121 15:32:54.931700 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.242931 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.379397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-custom-prometheus-ca\") pod \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.379450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-config-data\") pod \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.379469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-combined-ca-bundle\") pod \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.379579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eebbb84-d09f-44b4-9ad2-5f715fd16644-logs\") pod \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.379724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4cr\" (UniqueName: \"kubernetes.io/projected/4eebbb84-d09f-44b4-9ad2-5f715fd16644-kube-api-access-xv4cr\") pod \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\" (UID: \"4eebbb84-d09f-44b4-9ad2-5f715fd16644\") " Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.380516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eebbb84-d09f-44b4-9ad2-5f715fd16644-logs" (OuterVolumeSpecName: "logs") pod "4eebbb84-d09f-44b4-9ad2-5f715fd16644" (UID: "4eebbb84-d09f-44b4-9ad2-5f715fd16644"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.380769 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eebbb84-d09f-44b4-9ad2-5f715fd16644-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.382999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eebbb84-d09f-44b4-9ad2-5f715fd16644-kube-api-access-xv4cr" (OuterVolumeSpecName: "kube-api-access-xv4cr") pod "4eebbb84-d09f-44b4-9ad2-5f715fd16644" (UID: "4eebbb84-d09f-44b4-9ad2-5f715fd16644"). InnerVolumeSpecName "kube-api-access-xv4cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.405793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "4eebbb84-d09f-44b4-9ad2-5f715fd16644" (UID: "4eebbb84-d09f-44b4-9ad2-5f715fd16644"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.409248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4eebbb84-d09f-44b4-9ad2-5f715fd16644" (UID: "4eebbb84-d09f-44b4-9ad2-5f715fd16644"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.420501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-config-data" (OuterVolumeSpecName: "config-data") pod "4eebbb84-d09f-44b4-9ad2-5f715fd16644" (UID: "4eebbb84-d09f-44b4-9ad2-5f715fd16644"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.482546 4707 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.482579 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.482590 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eebbb84-d09f-44b4-9ad2-5f715fd16644-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.482603 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv4cr\" (UniqueName: \"kubernetes.io/projected/4eebbb84-d09f-44b4-9ad2-5f715fd16644-kube-api-access-xv4cr\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.865270 4707 generic.go:334] "Generic (PLEG): container finished" podID="4eebbb84-d09f-44b4-9ad2-5f715fd16644" containerID="1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b" exitCode=0 Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.865318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4eebbb84-d09f-44b4-9ad2-5f715fd16644","Type":"ContainerDied","Data":"1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b"} Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.865379 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.865702 4707 scope.go:117] "RemoveContainer" containerID="1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.865683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4eebbb84-d09f-44b4-9ad2-5f715fd16644","Type":"ContainerDied","Data":"0049b938274b30339fb6c454a8ce1d0e43900a63aa1201c7d0de7dfb4cf4bcd2"} Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerStarted","Data":"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a"} Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868801 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="proxy-httpd" containerID="cri-o://4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" gracePeriod=30 Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868783 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-central-agent" containerID="cri-o://626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" gracePeriod=30 Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868847 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="sg-core" containerID="cri-o://f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" gracePeriod=30 Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868910 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-notification-agent" containerID="cri-o://10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" gracePeriod=30 Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868944 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868966 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.868978 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.896166 4707 scope.go:117] "RemoveContainer" containerID="1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b" Jan 21 15:32:55 crc kubenswrapper[4707]: E0121 15:32:55.896579 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b\": container with ID starting with 1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b not found: ID does not exist" containerID="1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b" Jan 21 15:32:55 crc 
kubenswrapper[4707]: I0121 15:32:55.896607 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b"} err="failed to get container status \"1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b\": rpc error: code = NotFound desc = could not find container \"1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b\": container with ID starting with 1b1732db666357ea09b905bdefa98a8a67b2d14514c95720573b1d31517ce00b not found: ID does not exist" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.899513 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.394872534 podStartE2EDuration="5.899496542s" podCreationTimestamp="2026-01-21 15:32:50 +0000 UTC" firstStartedPulling="2026-01-21 15:32:51.549826105 +0000 UTC m=+1868.731342317" lastFinishedPulling="2026-01-21 15:32:55.054450102 +0000 UTC m=+1872.235966325" observedRunningTime="2026-01-21 15:32:55.891144058 +0000 UTC m=+1873.072660280" watchObservedRunningTime="2026-01-21 15:32:55.899496542 +0000 UTC m=+1873.081012765" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.916074 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.923379 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.933171 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:32:55 crc kubenswrapper[4707]: E0121 15:32:55.933514 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eebbb84-d09f-44b4-9ad2-5f715fd16644" containerName="watcher-decision-engine" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.933531 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eebbb84-d09f-44b4-9ad2-5f715fd16644" containerName="watcher-decision-engine" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.933696 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eebbb84-d09f-44b4-9ad2-5f715fd16644" containerName="watcher-decision-engine" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.934228 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.940246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-decision-engine-config-data" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.945742 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.988554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:55 crc kubenswrapper[4707]: I0121 15:32:55.988735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.023714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.031487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.093576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5tm7\" (UniqueName: \"kubernetes.io/projected/4904f4a7-c8e7-4d41-9254-32ccb097df02-kube-api-access-w5tm7\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.093802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.094059 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4904f4a7-c8e7-4d41-9254-32ccb097df02-logs\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.094198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.094325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.195855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4904f4a7-c8e7-4d41-9254-32ccb097df02-logs\") pod 
\"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.195920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.195952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.196011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5tm7\" (UniqueName: \"kubernetes.io/projected/4904f4a7-c8e7-4d41-9254-32ccb097df02-kube-api-access-w5tm7\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.196045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.197130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4904f4a7-c8e7-4d41-9254-32ccb097df02-logs\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.201191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.201193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.205511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.220208 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5tm7\" (UniqueName: \"kubernetes.io/projected/4904f4a7-c8e7-4d41-9254-32ccb097df02-kube-api-access-w5tm7\") pod 
\"watcher-decision-engine-0\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.249400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.479651 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-config-data\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-run-httpd\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-sg-core-conf-yaml\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-scripts\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-log-httpd\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-combined-ca-bundle\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.604942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqttz\" (UniqueName: \"kubernetes.io/projected/fb51953b-a321-4fd3-830a-0109c2c8c207-kube-api-access-sqttz\") pod \"fb51953b-a321-4fd3-830a-0109c2c8c207\" (UID: \"fb51953b-a321-4fd3-830a-0109c2c8c207\") " Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.605065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.605377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.605833 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.605853 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb51953b-a321-4fd3-830a-0109c2c8c207-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.610037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-scripts" (OuterVolumeSpecName: "scripts") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.610457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb51953b-a321-4fd3-830a-0109c2c8c207-kube-api-access-sqttz" (OuterVolumeSpecName: "kube-api-access-sqttz") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "kube-api-access-sqttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.627542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.661570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.675035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-config-data" (OuterVolumeSpecName: "config-data") pod "fb51953b-a321-4fd3-830a-0109c2c8c207" (UID: "fb51953b-a321-4fd3-830a-0109c2c8c207"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.703834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.709342 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.709368 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.709378 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqttz\" (UniqueName: \"kubernetes.io/projected/fb51953b-a321-4fd3-830a-0109c2c8c207-kube-api-access-sqttz\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.709387 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.709396 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb51953b-a321-4fd3-830a-0109c2c8c207-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.877981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4904f4a7-c8e7-4d41-9254-32ccb097df02","Type":"ContainerStarted","Data":"dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e"} Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.878050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4904f4a7-c8e7-4d41-9254-32ccb097df02","Type":"ContainerStarted","Data":"4e0fc1adb7f245844cd128ff624b3ca945acd7dcda58b9658732106372498717"} Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881625 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" exitCode=0 Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881652 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" exitCode=2 Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881669 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" exitCode=0 Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881678 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" exitCode=0 Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerDied","Data":"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a"} Jan 21 
15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerDied","Data":"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025"} Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerDied","Data":"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d"} Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerDied","Data":"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13"} Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"fb51953b-a321-4fd3-830a-0109c2c8c207","Type":"ContainerDied","Data":"cc93b6cbfb74f1bb56e2bbedd74ef00fcd6be497f7acfbd3bafc155a284deb7e"} Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.881772 4707 scope.go:117] "RemoveContainer" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.882023 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.882266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.882318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.892129 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-decision-engine-0" podStartSLOduration=1.892113436 podStartE2EDuration="1.892113436s" podCreationTimestamp="2026-01-21 15:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:32:56.889074522 +0000 UTC m=+1874.070590744" watchObservedRunningTime="2026-01-21 15:32:56.892113436 +0000 UTC m=+1874.073629658" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.903586 4707 scope.go:117] "RemoveContainer" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.918756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.929063 4707 scope.go:117] "RemoveContainer" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.934856 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.945474 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:56 crc kubenswrapper[4707]: E0121 15:32:56.945845 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-notification-agent" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.945862 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-notification-agent" Jan 21 15:32:56 crc kubenswrapper[4707]: E0121 15:32:56.945881 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="proxy-httpd" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.945887 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="proxy-httpd" Jan 21 15:32:56 crc kubenswrapper[4707]: E0121 15:32:56.945902 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="sg-core" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.945907 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="sg-core" Jan 21 15:32:56 crc kubenswrapper[4707]: E0121 15:32:56.945913 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-central-agent" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.945918 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-central-agent" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.946074 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="proxy-httpd" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.946085 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-notification-agent" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.946095 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="ceilometer-central-agent" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.946111 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" containerName="sg-core" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.947491 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.954001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.954208 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.961416 4707 scope.go:117] "RemoveContainer" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" Jan 21 15:32:56 crc kubenswrapper[4707]: I0121 15:32:56.965253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.003995 4707 scope.go:117] "RemoveContainer" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" Jan 21 15:32:57 crc kubenswrapper[4707]: E0121 15:32:57.004444 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": container with ID starting with 4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a not found: ID does not exist" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.004474 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a"} err="failed to get container status \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": rpc error: code = NotFound desc = could not find container \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": container with ID starting with 4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.004493 4707 scope.go:117] "RemoveContainer" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" Jan 21 15:32:57 crc kubenswrapper[4707]: E0121 15:32:57.005056 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": container with ID starting with f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025 not found: ID does not exist" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.005079 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025"} err="failed to get container status \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": rpc error: code = NotFound desc = could not find container \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": container with ID starting with f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.005095 4707 scope.go:117] "RemoveContainer" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" Jan 21 15:32:57 crc kubenswrapper[4707]: E0121 15:32:57.006033 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": container with ID starting with 10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d not found: ID does not exist" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.006058 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d"} err="failed to get container status \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": rpc error: code = NotFound desc = could not find container \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": container with ID starting with 10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.006072 4707 scope.go:117] "RemoveContainer" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" Jan 21 15:32:57 crc kubenswrapper[4707]: E0121 15:32:57.013938 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": container with ID starting with 626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13 not found: ID does not exist" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.013978 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13"} err="failed to get container status \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": rpc error: code = NotFound desc = could not find container \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": container with ID starting with 626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.014003 4707 scope.go:117] "RemoveContainer" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.015274 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a"} err="failed to get container status \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": rpc error: code = NotFound desc = could not find container \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": container with ID starting with 4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.015294 4707 scope.go:117] "RemoveContainer" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.017668 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025"} err="failed to get container status \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": rpc error: code = NotFound desc = could not find container 
\"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": container with ID starting with f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.017688 4707 scope.go:117] "RemoveContainer" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.017909 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d"} err="failed to get container status \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": rpc error: code = NotFound desc = could not find container \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": container with ID starting with 10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.017925 4707 scope.go:117] "RemoveContainer" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.018087 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13"} err="failed to get container status \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": rpc error: code = NotFound desc = could not find container \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": container with ID starting with 626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.018099 4707 scope.go:117] "RemoveContainer" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.018266 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a"} err="failed to get container status \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": rpc error: code = NotFound desc = could not find container \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": container with ID starting with 4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.018278 4707 scope.go:117] "RemoveContainer" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.019678 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025"} err="failed to get container status \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": rpc error: code = NotFound desc = could not find container \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": container with ID starting with f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.019771 4707 scope.go:117] "RemoveContainer" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.020031 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d"} err="failed to get container status \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": rpc error: code = NotFound desc = could not find container \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": container with ID starting with 10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.020119 4707 scope.go:117] "RemoveContainer" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.020341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13"} err="failed to get container status \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": rpc error: code = NotFound desc = could not find container \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": container with ID starting with 626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.020410 4707 scope.go:117] "RemoveContainer" containerID="4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.020634 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a"} err="failed to get container status \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": rpc error: code = NotFound desc = could not find container \"4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a\": container with ID starting with 4f8825ed84aecd5897dfddbb71cc788bd93d824cbae8158ac570a4624c9a6c6a not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.020719 4707 scope.go:117] "RemoveContainer" containerID="f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.022164 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025"} err="failed to get container status \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": rpc error: code = NotFound desc = could not find container \"f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025\": container with ID starting with f730fb0d302744d91b733b290a040e46ed201fed90fbe9fca507b2b23f0b4025 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.022250 4707 scope.go:117] "RemoveContainer" containerID="10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.022527 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d"} err="failed to get container status \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": rpc error: code = NotFound desc = could not find container \"10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d\": container with ID starting with 
10a2d2a3a7bca4f32efc6511821ea212a38a99156d22584da3635e997f2cbb5d not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.022595 4707 scope.go:117] "RemoveContainer" containerID="626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.022838 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13"} err="failed to get container status \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": rpc error: code = NotFound desc = could not find container \"626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13\": container with ID starting with 626b0bb0edac56cd0d8389627f5eb3a04a7880eeb89e6007cba9b64f8a9cac13 not found: ID does not exist" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.117750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.117829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-scripts\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.117930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-log-httpd\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.118035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrgr\" (UniqueName: \"kubernetes.io/projected/a72530e0-ee2a-4de5-9b09-666e0b759c09-kube-api-access-kgrgr\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.118063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.118077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-run-httpd\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.118299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-config-data\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.203348 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eebbb84-d09f-44b4-9ad2-5f715fd16644" path="/var/lib/kubelet/pods/4eebbb84-d09f-44b4-9ad2-5f715fd16644/volumes" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.204950 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb51953b-a321-4fd3-830a-0109c2c8c207" path="/var/lib/kubelet/pods/fb51953b-a321-4fd3-830a-0109c2c8c207/volumes" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.220778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.220849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-scripts\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.220915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-log-httpd\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.220981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrgr\" (UniqueName: \"kubernetes.io/projected/a72530e0-ee2a-4de5-9b09-666e0b759c09-kube-api-access-kgrgr\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.221003 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.221017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-run-httpd\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.221086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-config-data\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.222048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-log-httpd\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.224930 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.227247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-config-data\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.227378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-scripts\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.227727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-run-httpd\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.235894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.242996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrgr\" (UniqueName: \"kubernetes.io/projected/a72530e0-ee2a-4de5-9b09-666e0b759c09-kube-api-access-kgrgr\") pod \"ceilometer-0\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.283915 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.637604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.711585 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:57 crc kubenswrapper[4707]: W0121 15:32:57.712265 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda72530e0_ee2a_4de5_9b09_666e0b759c09.slice/crio-fd8598b3eea8545f7ec24cf01de5232db470b6df0c27c7ad28a3face1398d839 WatchSource:0}: Error finding container fd8598b3eea8545f7ec24cf01de5232db470b6df0c27c7ad28a3face1398d839: Status 404 returned error can't find the container with id fd8598b3eea8545f7ec24cf01de5232db470b6df0c27c7ad28a3face1398d839 Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.891994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerStarted","Data":"fd8598b3eea8545f7ec24cf01de5232db470b6df0c27c7ad28a3face1398d839"} Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.893193 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:32:57 crc kubenswrapper[4707]: I0121 15:32:57.900687 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:32:58 crc kubenswrapper[4707]: I0121 15:32:58.615702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:32:58 crc kubenswrapper[4707]: I0121 15:32:58.652747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:58 crc kubenswrapper[4707]: I0121 15:32:58.653430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:32:58 crc kubenswrapper[4707]: I0121 15:32:58.901050 4707 generic.go:334] "Generic (PLEG): container finished" podID="9e7eb849-a2d7-4ac3-88cc-977992b13490" containerID="3ab85ee94bca50a4accee416f902c8f5749d6ac2a33fa89d6c447eb295d21860" exitCode=0 Jan 21 15:32:58 crc kubenswrapper[4707]: I0121 15:32:58.901105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" event={"ID":"9e7eb849-a2d7-4ac3-88cc-977992b13490","Type":"ContainerDied","Data":"3ab85ee94bca50a4accee416f902c8f5749d6ac2a33fa89d6c447eb295d21860"} Jan 21 15:32:58 crc kubenswrapper[4707]: I0121 15:32:58.903457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerStarted","Data":"987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e"} Jan 21 15:32:59 crc kubenswrapper[4707]: I0121 15:32:59.912998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerStarted","Data":"3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0"} Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.229151 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.372137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-config-data\") pod \"9e7eb849-a2d7-4ac3-88cc-977992b13490\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.372176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-scripts\") pod \"9e7eb849-a2d7-4ac3-88cc-977992b13490\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.372275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-combined-ca-bundle\") pod \"9e7eb849-a2d7-4ac3-88cc-977992b13490\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.372324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7qtj\" (UniqueName: \"kubernetes.io/projected/9e7eb849-a2d7-4ac3-88cc-977992b13490-kube-api-access-q7qtj\") pod \"9e7eb849-a2d7-4ac3-88cc-977992b13490\" (UID: \"9e7eb849-a2d7-4ac3-88cc-977992b13490\") " Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.381147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7eb849-a2d7-4ac3-88cc-977992b13490-kube-api-access-q7qtj" (OuterVolumeSpecName: "kube-api-access-q7qtj") pod "9e7eb849-a2d7-4ac3-88cc-977992b13490" (UID: "9e7eb849-a2d7-4ac3-88cc-977992b13490"). InnerVolumeSpecName "kube-api-access-q7qtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.393232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-scripts" (OuterVolumeSpecName: "scripts") pod "9e7eb849-a2d7-4ac3-88cc-977992b13490" (UID: "9e7eb849-a2d7-4ac3-88cc-977992b13490"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.395919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e7eb849-a2d7-4ac3-88cc-977992b13490" (UID: "9e7eb849-a2d7-4ac3-88cc-977992b13490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.396017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-config-data" (OuterVolumeSpecName: "config-data") pod "9e7eb849-a2d7-4ac3-88cc-977992b13490" (UID: "9e7eb849-a2d7-4ac3-88cc-977992b13490"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.474535 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.474567 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.474578 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7eb849-a2d7-4ac3-88cc-977992b13490-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.474590 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7qtj\" (UniqueName: \"kubernetes.io/projected/9e7eb849-a2d7-4ac3-88cc-977992b13490-kube-api-access-q7qtj\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.922559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerStarted","Data":"a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88"} Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.923986 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.923976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns" event={"ID":"9e7eb849-a2d7-4ac3-88cc-977992b13490","Type":"ContainerDied","Data":"2e0b99ee6e64a07cc321a9c12d33a744581da6cf5a2b90219f000d9d4a2f41b9"} Jan 21 15:33:00 crc kubenswrapper[4707]: I0121 15:33:00.924051 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e0b99ee6e64a07cc321a9c12d33a744581da6cf5a2b90219f000d9d4a2f41b9" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.420696 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q"] Jan 21 15:33:01 crc kubenswrapper[4707]: E0121 15:33:01.421059 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7eb849-a2d7-4ac3-88cc-977992b13490" containerName="nova-cell0-conductor-db-sync" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.421073 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7eb849-a2d7-4ac3-88cc-977992b13490" containerName="nova-cell0-conductor-db-sync" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.421235 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7eb849-a2d7-4ac3-88cc-977992b13490" containerName="nova-cell0-conductor-db-sync" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.421763 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.424406 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rsjbw" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.424668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.424802 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.437000 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.498476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.498990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-scripts\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.499204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdh9\" (UniqueName: \"kubernetes.io/projected/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-kube-api-access-khdh9\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.499384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-config-data\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.558565 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.560734 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.563531 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.593660 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.604780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-config-data\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.604920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.605056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-scripts\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.605176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdh9\" (UniqueName: \"kubernetes.io/projected/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-kube-api-access-khdh9\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.615583 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-config-data\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.615864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.629172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-scripts\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.660397 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.661906 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.662576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdh9\" (UniqueName: \"kubernetes.io/projected/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-kube-api-access-khdh9\") pod \"nova-cell0-cell-mapping-kb84q\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.671975 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.687222 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.707883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-config-data\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.707944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-logs\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.708008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.708108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qn96\" (UniqueName: \"kubernetes.io/projected/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-kube-api-access-8qn96\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.713432 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.714520 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.717437 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.721027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.737226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.751046 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.752079 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.757115 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.760255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.812908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qn96\" (UniqueName: \"kubernetes.io/projected/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-kube-api-access-8qn96\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.812961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-config-data\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.812985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-config-data\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-logs\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29c0fc2-933e-4700-96b0-68cb6dadcf48-logs\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813160 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjk6s\" (UniqueName: \"kubernetes.io/projected/7cb9bb6c-cd90-4b4a-a986-4141116ed215-kube-api-access-vjk6s\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd2x6\" (UniqueName: \"kubernetes.io/projected/c29c0fc2-933e-4700-96b0-68cb6dadcf48-kube-api-access-hd2x6\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.813951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-logs\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.817885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-config-data\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.818820 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.832633 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qn96\" (UniqueName: \"kubernetes.io/projected/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-kube-api-access-8qn96\") pod \"nova-api-0\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.917700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.917744 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn64w\" (UniqueName: \"kubernetes.io/projected/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-kube-api-access-nn64w\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.917772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-config-data\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.917822 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.917852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.917911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29c0fc2-933e-4700-96b0-68cb6dadcf48-logs\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.918417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29c0fc2-933e-4700-96b0-68cb6dadcf48-logs\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.918474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.918494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjk6s\" (UniqueName: \"kubernetes.io/projected/7cb9bb6c-cd90-4b4a-a986-4141116ed215-kube-api-access-vjk6s\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.918517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd2x6\" (UniqueName: \"kubernetes.io/projected/c29c0fc2-933e-4700-96b0-68cb6dadcf48-kube-api-access-hd2x6\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.918583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-config-data\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.922350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") 
" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.923822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-config-data\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.925379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.929353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.938892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjk6s\" (UniqueName: \"kubernetes.io/projected/7cb9bb6c-cd90-4b4a-a986-4141116ed215-kube-api-access-vjk6s\") pod \"nova-cell1-novncproxy-0\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.942276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd2x6\" (UniqueName: \"kubernetes.io/projected/c29c0fc2-933e-4700-96b0-68cb6dadcf48-kube-api-access-hd2x6\") pod \"nova-metadata-0\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.947097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerStarted","Data":"b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b"} Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.947309 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-central-agent" containerID="cri-o://987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e" gracePeriod=30 Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.947585 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.947785 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="proxy-httpd" containerID="cri-o://b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b" gracePeriod=30 Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.947991 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-notification-agent" containerID="cri-o://3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0" gracePeriod=30 Jan 21 15:33:01 crc 
kubenswrapper[4707]: I0121 15:33:01.949138 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="sg-core" containerID="cri-o://a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88" gracePeriod=30 Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.976785 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.1957860670000002 podStartE2EDuration="5.976763437s" podCreationTimestamp="2026-01-21 15:32:56 +0000 UTC" firstStartedPulling="2026-01-21 15:32:57.716705366 +0000 UTC m=+1874.898221587" lastFinishedPulling="2026-01-21 15:33:01.497682734 +0000 UTC m=+1878.679198957" observedRunningTime="2026-01-21 15:33:01.965286197 +0000 UTC m=+1879.146802419" watchObservedRunningTime="2026-01-21 15:33:01.976763437 +0000 UTC m=+1879.158279660" Jan 21 15:33:01 crc kubenswrapper[4707]: I0121 15:33:01.999653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.005053 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.022737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn64w\" (UniqueName: \"kubernetes.io/projected/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-kube-api-access-nn64w\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.022827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-config-data\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.022978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.034439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-config-data\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.040474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.041461 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.072328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn64w\" (UniqueName: \"kubernetes.io/projected/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-kube-api-access-nn64w\") pod \"nova-scheduler-0\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.223508 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q"] Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.373729 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.611980 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.672326 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.775114 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.953143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:02 crc kubenswrapper[4707]: W0121 15:33:02.975621 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2adc5e20_edde_499c_b40f_e92dcdb8dcb7.slice/crio-e46dd610954e91fa728b4b6a5f57d3e02a65021a69f3316d095e9c02aed9935c WatchSource:0}: Error finding container e46dd610954e91fa728b4b6a5f57d3e02a65021a69f3316d095e9c02aed9935c: Status 404 returned error can't find the container with id e46dd610954e91fa728b4b6a5f57d3e02a65021a69f3316d095e9c02aed9935c Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.976888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c29c0fc2-933e-4700-96b0-68cb6dadcf48","Type":"ContainerStarted","Data":"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.976930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c29c0fc2-933e-4700-96b0-68cb6dadcf48","Type":"ContainerStarted","Data":"0a1e0c25ed03f46fec1500ca27651fb74ecbd4d7a0d83b6053a916c03d65fa7e"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.978338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7cb9bb6c-cd90-4b4a-a986-4141116ed215","Type":"ContainerStarted","Data":"a0091800fc863f512ff782e1f6727ea4c04c3d5fb4d9b3811518324a9f6c7ba6"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.981348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a","Type":"ContainerStarted","Data":"53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.981389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a","Type":"ContainerStarted","Data":"6183131366ba1aefeed714b5f54eb155d771d8f08a7797f6cfdedc00dcedae3d"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.984249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" event={"ID":"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde","Type":"ContainerStarted","Data":"034eae89f5e612ade9f163f60388849a5f0b555f0c70a7dd88537a0410cf853b"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.984277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" event={"ID":"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde","Type":"ContainerStarted","Data":"86856da99c5e8506933ea62dbf316c1378db23b8beea448f6e2822be73cde41c"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.992529 4707 generic.go:334] "Generic (PLEG): container finished" podID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerID="b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b" exitCode=0 Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.992555 4707 generic.go:334] "Generic (PLEG): container finished" podID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerID="a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88" exitCode=2 Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.992563 4707 generic.go:334] "Generic (PLEG): container finished" podID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerID="3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0" exitCode=0 Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.992577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerDied","Data":"b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.992592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerDied","Data":"a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88"} Jan 21 15:33:02 crc kubenswrapper[4707]: I0121 15:33:02.992602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerDied","Data":"3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0"} Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.003727 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" podStartSLOduration=2.003717124 podStartE2EDuration="2.003717124s" podCreationTimestamp="2026-01-21 15:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:02.998433017 +0000 UTC m=+1880.179949239" watchObservedRunningTime="2026-01-21 15:33:03.003717124 +0000 UTC m=+1880.185233345" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.339092 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq"] Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.340061 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.342823 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.343474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.351159 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq"] Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.467940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.468275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-config-data\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.468458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgfx6\" (UniqueName: \"kubernetes.io/projected/79071a6b-873b-4668-9af8-45aafd0facb3-kube-api-access-hgfx6\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.468956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-scripts\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.571642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-config-data\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.571733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgfx6\" (UniqueName: \"kubernetes.io/projected/79071a6b-873b-4668-9af8-45aafd0facb3-kube-api-access-hgfx6\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.571957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-scripts\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.572040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.579277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.580332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-config-data\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.599282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-scripts\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.612747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgfx6\" (UniqueName: \"kubernetes.io/projected/79071a6b-873b-4668-9af8-45aafd0facb3-kube-api-access-hgfx6\") pod \"nova-cell1-conductor-db-sync-9nsxq\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:03 crc kubenswrapper[4707]: I0121 15:33:03.656353 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.004758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7cb9bb6c-cd90-4b4a-a986-4141116ed215","Type":"ContainerStarted","Data":"95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f"} Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.006684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a","Type":"ContainerStarted","Data":"1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3"} Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.017203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c29c0fc2-933e-4700-96b0-68cb6dadcf48","Type":"ContainerStarted","Data":"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd"} Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.019045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2adc5e20-edde-499c-b40f-e92dcdb8dcb7","Type":"ContainerStarted","Data":"90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681"} Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.019073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2adc5e20-edde-499c-b40f-e92dcdb8dcb7","Type":"ContainerStarted","Data":"e46dd610954e91fa728b4b6a5f57d3e02a65021a69f3316d095e9c02aed9935c"} Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.025220 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.02520945 podStartE2EDuration="3.02520945s" podCreationTimestamp="2026-01-21 15:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:04.023285152 +0000 UTC m=+1881.204801373" watchObservedRunningTime="2026-01-21 15:33:04.02520945 +0000 UTC m=+1881.206725671" Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.046447 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.04642981 podStartE2EDuration="3.04642981s" podCreationTimestamp="2026-01-21 15:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:04.042733971 +0000 UTC m=+1881.224250193" watchObservedRunningTime="2026-01-21 15:33:04.04642981 +0000 UTC m=+1881.227946032" Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.071244 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.071228408 podStartE2EDuration="3.071228408s" podCreationTimestamp="2026-01-21 15:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:04.063022258 +0000 UTC m=+1881.244538480" watchObservedRunningTime="2026-01-21 15:33:04.071228408 +0000 UTC m=+1881.252744630" Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.091116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq"] Jan 
21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.092943 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.092921968 podStartE2EDuration="3.092921968s" podCreationTimestamp="2026-01-21 15:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:04.083830714 +0000 UTC m=+1881.265346936" watchObservedRunningTime="2026-01-21 15:33:04.092921968 +0000 UTC m=+1881.274438190" Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.182729 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:33:04 crc kubenswrapper[4707]: E0121 15:33:04.183066 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.488102 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:04 crc kubenswrapper[4707]: I0121 15:33:04.505414 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:05 crc kubenswrapper[4707]: I0121 15:33:05.027440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" event={"ID":"79071a6b-873b-4668-9af8-45aafd0facb3","Type":"ContainerStarted","Data":"40c226bf4e51d0f7d6f1a51b8c7febdbedfe2ce1d5184582b150377ae5d63e42"} Jan 21 15:33:05 crc kubenswrapper[4707]: I0121 15:33:05.027491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" event={"ID":"79071a6b-873b-4668-9af8-45aafd0facb3","Type":"ContainerStarted","Data":"5e75524d2db167877b93f8ae7994333ca04154ed7078e588507751d08c03dde8"} Jan 21 15:33:05 crc kubenswrapper[4707]: I0121 15:33:05.047476 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" podStartSLOduration=2.047460794 podStartE2EDuration="2.047460794s" podCreationTimestamp="2026-01-21 15:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:05.039065358 +0000 UTC m=+1882.220581580" watchObservedRunningTime="2026-01-21 15:33:05.047460794 +0000 UTC m=+1882.228977016" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.036715 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="7cb9bb6c-cd90-4b4a-a986-4141116ed215" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f" gracePeriod=30 Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.037098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-log" 
containerID="cri-o://a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f" gracePeriod=30 Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.037132 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-metadata" containerID="cri-o://88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd" gracePeriod=30 Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.250583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.280466 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.463890 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.551316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-config-data\") pod \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.551639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29c0fc2-933e-4700-96b0-68cb6dadcf48-logs\") pod \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.551822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle\") pod \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.551945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd2x6\" (UniqueName: \"kubernetes.io/projected/c29c0fc2-933e-4700-96b0-68cb6dadcf48-kube-api-access-hd2x6\") pod \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.551922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29c0fc2-933e-4700-96b0-68cb6dadcf48-logs" (OuterVolumeSpecName: "logs") pod "c29c0fc2-933e-4700-96b0-68cb6dadcf48" (UID: "c29c0fc2-933e-4700-96b0-68cb6dadcf48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.552792 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29c0fc2-933e-4700-96b0-68cb6dadcf48-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.571839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29c0fc2-933e-4700-96b0-68cb6dadcf48-kube-api-access-hd2x6" (OuterVolumeSpecName: "kube-api-access-hd2x6") pod "c29c0fc2-933e-4700-96b0-68cb6dadcf48" (UID: "c29c0fc2-933e-4700-96b0-68cb6dadcf48"). InnerVolumeSpecName "kube-api-access-hd2x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:06 crc kubenswrapper[4707]: E0121 15:33:06.576100 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle podName:c29c0fc2-933e-4700-96b0-68cb6dadcf48 nodeName:}" failed. No retries permitted until 2026-01-21 15:33:07.076077795 +0000 UTC m=+1884.257594017 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle") pod "c29c0fc2-933e-4700-96b0-68cb6dadcf48" (UID: "c29c0fc2-933e-4700-96b0-68cb6dadcf48") : error deleting /var/lib/kubelet/pods/c29c0fc2-933e-4700-96b0-68cb6dadcf48/volume-subpaths: remove /var/lib/kubelet/pods/c29c0fc2-933e-4700-96b0-68cb6dadcf48/volume-subpaths: no such file or directory Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.579329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-config-data" (OuterVolumeSpecName: "config-data") pod "c29c0fc2-933e-4700-96b0-68cb6dadcf48" (UID: "c29c0fc2-933e-4700-96b0-68cb6dadcf48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.626925 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.653468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjk6s\" (UniqueName: \"kubernetes.io/projected/7cb9bb6c-cd90-4b4a-a986-4141116ed215-kube-api-access-vjk6s\") pod \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.653670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-config-data\") pod \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.653773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-combined-ca-bundle\") pod \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\" (UID: \"7cb9bb6c-cd90-4b4a-a986-4141116ed215\") " Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.654374 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.654453 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd2x6\" (UniqueName: \"kubernetes.io/projected/c29c0fc2-933e-4700-96b0-68cb6dadcf48-kube-api-access-hd2x6\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.655945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb9bb6c-cd90-4b4a-a986-4141116ed215-kube-api-access-vjk6s" (OuterVolumeSpecName: "kube-api-access-vjk6s") pod "7cb9bb6c-cd90-4b4a-a986-4141116ed215" (UID: "7cb9bb6c-cd90-4b4a-a986-4141116ed215"). InnerVolumeSpecName "kube-api-access-vjk6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.674742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cb9bb6c-cd90-4b4a-a986-4141116ed215" (UID: "7cb9bb6c-cd90-4b4a-a986-4141116ed215"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.687890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-config-data" (OuterVolumeSpecName: "config-data") pod "7cb9bb6c-cd90-4b4a-a986-4141116ed215" (UID: "7cb9bb6c-cd90-4b4a-a986-4141116ed215"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.755858 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjk6s\" (UniqueName: \"kubernetes.io/projected/7cb9bb6c-cd90-4b4a-a986-4141116ed215-kube-api-access-vjk6s\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.755887 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:06 crc kubenswrapper[4707]: I0121 15:33:06.755899 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9bb6c-cd90-4b4a-a986-4141116ed215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.045372 4707 generic.go:334] "Generic (PLEG): container finished" podID="7cb9bb6c-cd90-4b4a-a986-4141116ed215" containerID="95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f" exitCode=0 Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.045621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7cb9bb6c-cd90-4b4a-a986-4141116ed215","Type":"ContainerDied","Data":"95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f"} Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.045655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7cb9bb6c-cd90-4b4a-a986-4141116ed215","Type":"ContainerDied","Data":"a0091800fc863f512ff782e1f6727ea4c04c3d5fb4d9b3811518324a9f6c7ba6"} Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.045673 4707 scope.go:117] "RemoveContainer" containerID="95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.046296 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.046730 4707 generic.go:334] "Generic (PLEG): container finished" podID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerID="88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd" exitCode=0 Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.046746 4707 generic.go:334] "Generic (PLEG): container finished" podID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerID="a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f" exitCode=143 Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.046768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c29c0fc2-933e-4700-96b0-68cb6dadcf48","Type":"ContainerDied","Data":"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd"} Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.046782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c29c0fc2-933e-4700-96b0-68cb6dadcf48","Type":"ContainerDied","Data":"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f"} Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.046791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c29c0fc2-933e-4700-96b0-68cb6dadcf48","Type":"ContainerDied","Data":"0a1e0c25ed03f46fec1500ca27651fb74ecbd4d7a0d83b6053a916c03d65fa7e"} Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.048678 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.049758 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.074045 4707 scope.go:117] "RemoveContainer" containerID="95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.074461 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:33:07 crc kubenswrapper[4707]: E0121 15:33:07.075170 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f\": container with ID starting with 95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f not found: ID does not exist" containerID="95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.075235 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f"} err="failed to get container status \"95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f\": rpc error: code = NotFound desc = could not find container \"95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f\": container with ID starting with 95916eeda0760686585a269d92eea2ff59106cf0c590f1fd074617df0113a98f not found: ID does not exist" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.075261 4707 scope.go:117] "RemoveContainer" containerID="88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 
15:33:07.075842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.081508 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.102130 4707 scope.go:117] "RemoveContainer" containerID="a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.108594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: E0121 15:33:07.109054 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-log" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109077 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-log" Jan 21 15:33:07 crc kubenswrapper[4707]: E0121 15:33:07.109093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-metadata" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-metadata" Jan 21 15:33:07 crc kubenswrapper[4707]: E0121 15:33:07.109113 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb9bb6c-cd90-4b4a-a986-4141116ed215" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109121 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb9bb6c-cd90-4b4a-a986-4141116ed215" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109293 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-metadata" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" containerName="nova-metadata-log" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109320 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb9bb6c-cd90-4b4a-a986-4141116ed215" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.109880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.113835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.114014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.114124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.131170 4707 scope.go:117] "RemoveContainer" containerID="88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd" Jan 21 15:33:07 crc kubenswrapper[4707]: E0121 15:33:07.131596 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd\": container with ID starting with 88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd not found: ID does not exist" containerID="88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.131628 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd"} err="failed to get container status \"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd\": rpc error: code = NotFound desc = could not find container \"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd\": container with ID starting with 88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd not found: ID does not exist" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.131648 4707 scope.go:117] "RemoveContainer" containerID="a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f" Jan 21 15:33:07 crc kubenswrapper[4707]: E0121 15:33:07.131858 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f\": container with ID starting with a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f not found: ID does not exist" containerID="a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.131884 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f"} err="failed to get container status \"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f\": rpc error: code = NotFound desc = could not find container \"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f\": container with ID starting with a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f not found: ID does not exist" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.131902 4707 scope.go:117] "RemoveContainer" containerID="88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.132168 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd"} err="failed to 
get container status \"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd\": rpc error: code = NotFound desc = could not find container \"88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd\": container with ID starting with 88d521b943a3bb4f3c12eec0da8bbecd4f4116bac65af2ed2098bc4dbbd693bd not found: ID does not exist" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.132210 4707 scope.go:117] "RemoveContainer" containerID="a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.132424 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f"} err="failed to get container status \"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f\": rpc error: code = NotFound desc = could not find container \"a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f\": container with ID starting with a8a58b81fc3e81301de1b8e1eaa926d440f1ba17c9669d6cb975dfb6f006764f not found: ID does not exist" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.152960 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.161575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle\") pod \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\" (UID: \"c29c0fc2-933e-4700-96b0-68cb6dadcf48\") " Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.162238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.162288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv4z4\" (UniqueName: \"kubernetes.io/projected/317f9a2c-fa7d-417b-9899-9bce822ffd74-kube-api-access-wv4z4\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.162323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.162425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.162476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.164931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c29c0fc2-933e-4700-96b0-68cb6dadcf48" (UID: "c29c0fc2-933e-4700-96b0-68cb6dadcf48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.209470 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb9bb6c-cd90-4b4a-a986-4141116ed215" path="/var/lib/kubelet/pods/7cb9bb6c-cd90-4b4a-a986-4141116ed215/volumes" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.263739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv4z4\" (UniqueName: \"kubernetes.io/projected/317f9a2c-fa7d-417b-9899-9bce822ffd74-kube-api-access-wv4z4\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.263799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.263916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.264007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.264148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.264214 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29c0fc2-933e-4700-96b0-68cb6dadcf48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.268149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.270208 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.273500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.276234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.277622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv4z4\" (UniqueName: \"kubernetes.io/projected/317f9a2c-fa7d-417b-9899-9bce822ffd74-kube-api-access-wv4z4\") pod \"nova-cell1-novncproxy-0\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.372086 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.375199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.383459 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.393008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.394747 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.396144 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.396907 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.416949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.431055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.571620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxrv\" (UniqueName: \"kubernetes.io/projected/9d12ead2-d0f8-40c9-b23c-f38e37339651-kube-api-access-mdxrv\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.572048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-config-data\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.572122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d12ead2-d0f8-40c9-b23c-f38e37339651-logs\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.572172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.572217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.674137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.674261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxrv\" (UniqueName: \"kubernetes.io/projected/9d12ead2-d0f8-40c9-b23c-f38e37339651-kube-api-access-mdxrv\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.674298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-config-data\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.674370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d12ead2-d0f8-40c9-b23c-f38e37339651-logs\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 
15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.674423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.676331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d12ead2-d0f8-40c9-b23c-f38e37339651-logs\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.679201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.679540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-config-data\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.679545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.688264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxrv\" (UniqueName: \"kubernetes.io/projected/9d12ead2-d0f8-40c9-b23c-f38e37339651-kube-api-access-mdxrv\") pod \"nova-metadata-0\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.708560 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:07 crc kubenswrapper[4707]: W0121 15:33:07.845018 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod317f9a2c_fa7d_417b_9899_9bce822ffd74.slice/crio-4e2e40b027020e927ccc3a4b83fa3b9cf999e97161998ace26065b2a0b11a5f4 WatchSource:0}: Error finding container 4e2e40b027020e927ccc3a4b83fa3b9cf999e97161998ace26065b2a0b11a5f4: Status 404 returned error can't find the container with id 4e2e40b027020e927ccc3a4b83fa3b9cf999e97161998ace26065b2a0b11a5f4 Jan 21 15:33:07 crc kubenswrapper[4707]: I0121 15:33:07.848752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:33:08 crc kubenswrapper[4707]: I0121 15:33:08.058095 4707 generic.go:334] "Generic (PLEG): container finished" podID="79071a6b-873b-4668-9af8-45aafd0facb3" containerID="40c226bf4e51d0f7d6f1a51b8c7febdbedfe2ce1d5184582b150377ae5d63e42" exitCode=0 Jan 21 15:33:08 crc kubenswrapper[4707]: I0121 15:33:08.058198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" event={"ID":"79071a6b-873b-4668-9af8-45aafd0facb3","Type":"ContainerDied","Data":"40c226bf4e51d0f7d6f1a51b8c7febdbedfe2ce1d5184582b150377ae5d63e42"} Jan 21 15:33:08 crc kubenswrapper[4707]: I0121 15:33:08.060305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"317f9a2c-fa7d-417b-9899-9bce822ffd74","Type":"ContainerStarted","Data":"3936a2d7a7a92681b283ae06c8af878facd4800d2852311442ea58c67b2c19cc"} Jan 21 15:33:08 crc kubenswrapper[4707]: I0121 15:33:08.060342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"317f9a2c-fa7d-417b-9899-9bce822ffd74","Type":"ContainerStarted","Data":"4e2e40b027020e927ccc3a4b83fa3b9cf999e97161998ace26065b2a0b11a5f4"} Jan 21 15:33:08 crc kubenswrapper[4707]: I0121 15:33:08.089709 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.089689371 podStartE2EDuration="1.089689371s" podCreationTimestamp="2026-01-21 15:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:08.080459907 +0000 UTC m=+1885.261976129" watchObservedRunningTime="2026-01-21 15:33:08.089689371 +0000 UTC m=+1885.271205593" Jan 21 15:33:08 crc kubenswrapper[4707]: I0121 15:33:08.130481 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:08 crc kubenswrapper[4707]: W0121 15:33:08.134105 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d12ead2_d0f8_40c9_b23c_f38e37339651.slice/crio-9262dfc5ae36a5b1f7f8e5ff880054bb17e62efce9f2bd0aede8b2621a7508a3 WatchSource:0}: Error finding container 9262dfc5ae36a5b1f7f8e5ff880054bb17e62efce9f2bd0aede8b2621a7508a3: Status 404 returned error can't find the container with id 9262dfc5ae36a5b1f7f8e5ff880054bb17e62efce9f2bd0aede8b2621a7508a3 Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.069910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"9d12ead2-d0f8-40c9-b23c-f38e37339651","Type":"ContainerStarted","Data":"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd"} Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.070437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d12ead2-d0f8-40c9-b23c-f38e37339651","Type":"ContainerStarted","Data":"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938"} Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.070452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d12ead2-d0f8-40c9-b23c-f38e37339651","Type":"ContainerStarted","Data":"9262dfc5ae36a5b1f7f8e5ff880054bb17e62efce9f2bd0aede8b2621a7508a3"} Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.087491 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.087474552 podStartE2EDuration="2.087474552s" podCreationTimestamp="2026-01-21 15:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:09.083243116 +0000 UTC m=+1886.264759338" watchObservedRunningTime="2026-01-21 15:33:09.087474552 +0000 UTC m=+1886.268990774" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.193133 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29c0fc2-933e-4700-96b0-68cb6dadcf48" path="/var/lib/kubelet/pods/c29c0fc2-933e-4700-96b0-68cb6dadcf48/volumes" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.389439 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.512221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-config-data\") pod \"79071a6b-873b-4668-9af8-45aafd0facb3\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.512255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-scripts\") pod \"79071a6b-873b-4668-9af8-45aafd0facb3\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.512358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-combined-ca-bundle\") pod \"79071a6b-873b-4668-9af8-45aafd0facb3\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.512424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgfx6\" (UniqueName: \"kubernetes.io/projected/79071a6b-873b-4668-9af8-45aafd0facb3-kube-api-access-hgfx6\") pod \"79071a6b-873b-4668-9af8-45aafd0facb3\" (UID: \"79071a6b-873b-4668-9af8-45aafd0facb3\") " Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.517999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79071a6b-873b-4668-9af8-45aafd0facb3-kube-api-access-hgfx6" (OuterVolumeSpecName: "kube-api-access-hgfx6") pod "79071a6b-873b-4668-9af8-45aafd0facb3" (UID: 
"79071a6b-873b-4668-9af8-45aafd0facb3"). InnerVolumeSpecName "kube-api-access-hgfx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.518019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-scripts" (OuterVolumeSpecName: "scripts") pod "79071a6b-873b-4668-9af8-45aafd0facb3" (UID: "79071a6b-873b-4668-9af8-45aafd0facb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.535484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-config-data" (OuterVolumeSpecName: "config-data") pod "79071a6b-873b-4668-9af8-45aafd0facb3" (UID: "79071a6b-873b-4668-9af8-45aafd0facb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.535864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79071a6b-873b-4668-9af8-45aafd0facb3" (UID: "79071a6b-873b-4668-9af8-45aafd0facb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.614477 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgfx6\" (UniqueName: \"kubernetes.io/projected/79071a6b-873b-4668-9af8-45aafd0facb3-kube-api-access-hgfx6\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.614756 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.614765 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.614777 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79071a6b-873b-4668-9af8-45aafd0facb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:09 crc kubenswrapper[4707]: I0121 15:33:09.823573 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgrgr\" (UniqueName: \"kubernetes.io/projected/a72530e0-ee2a-4de5-9b09-666e0b759c09-kube-api-access-kgrgr\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-run-httpd\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-config-data\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-sg-core-conf-yaml\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-combined-ca-bundle\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-scripts\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-log-httpd\") pod \"a72530e0-ee2a-4de5-9b09-666e0b759c09\" (UID: \"a72530e0-ee2a-4de5-9b09-666e0b759c09\") " Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025644 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.025971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.030124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72530e0-ee2a-4de5-9b09-666e0b759c09-kube-api-access-kgrgr" (OuterVolumeSpecName: "kube-api-access-kgrgr") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "kube-api-access-kgrgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.031857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-scripts" (OuterVolumeSpecName: "scripts") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.048163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.080306 4707 generic.go:334] "Generic (PLEG): container finished" podID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerID="987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e" exitCode=0 Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.080405 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.081031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerDied","Data":"987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e"} Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.081121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a72530e0-ee2a-4de5-9b09-666e0b759c09","Type":"ContainerDied","Data":"fd8598b3eea8545f7ec24cf01de5232db470b6df0c27c7ad28a3face1398d839"} Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.081156 4707 scope.go:117] "RemoveContainer" containerID="b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.084093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.085614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" event={"ID":"79071a6b-873b-4668-9af8-45aafd0facb3","Type":"ContainerDied","Data":"5e75524d2db167877b93f8ae7994333ca04154ed7078e588507751d08c03dde8"} Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.085656 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e75524d2db167877b93f8ae7994333ca04154ed7078e588507751d08c03dde8" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.085687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.088158 4707 generic.go:334] "Generic (PLEG): container finished" podID="4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" containerID="034eae89f5e612ade9f163f60388849a5f0b555f0c70a7dd88537a0410cf853b" exitCode=0 Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.088206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" event={"ID":"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde","Type":"ContainerDied","Data":"034eae89f5e612ade9f163f60388849a5f0b555f0c70a7dd88537a0410cf853b"} Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.103745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-config-data" (OuterVolumeSpecName: "config-data") pod "a72530e0-ee2a-4de5-9b09-666e0b759c09" (UID: "a72530e0-ee2a-4de5-9b09-666e0b759c09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.109325 4707 scope.go:117] "RemoveContainer" containerID="a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127555 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127586 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgrgr\" (UniqueName: \"kubernetes.io/projected/a72530e0-ee2a-4de5-9b09-666e0b759c09-kube-api-access-kgrgr\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127597 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a72530e0-ee2a-4de5-9b09-666e0b759c09-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127607 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127616 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127626 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.127634 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a72530e0-ee2a-4de5-9b09-666e0b759c09-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.130723 4707 scope.go:117] "RemoveContainer" containerID="3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.137415 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.137770 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79071a6b-873b-4668-9af8-45aafd0facb3" containerName="nova-cell1-conductor-db-sync" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.137788 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79071a6b-873b-4668-9af8-45aafd0facb3" containerName="nova-cell1-conductor-db-sync" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.137865 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="sg-core" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.137878 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="sg-core" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.137890 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-central-agent" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.137898 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-central-agent" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.137916 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-notification-agent" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.137923 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-notification-agent" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.137941 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="proxy-httpd" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.137946 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="proxy-httpd" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.138108 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="sg-core" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.138126 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-central-agent" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.138142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="proxy-httpd" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.138151 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79071a6b-873b-4668-9af8-45aafd0facb3" 
containerName="nova-cell1-conductor-db-sync" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.138160 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" containerName="ceilometer-notification-agent" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.138765 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.140450 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.146946 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.157057 4707 scope.go:117] "RemoveContainer" containerID="987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.174936 4707 scope.go:117] "RemoveContainer" containerID="b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.175591 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b\": container with ID starting with b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b not found: ID does not exist" containerID="b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.175624 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b"} err="failed to get container status \"b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b\": rpc error: code = NotFound desc = could not find container \"b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b\": container with ID starting with b88746d79daf8678bbd462355c6ebe884ec014e2ae894a9cab017b327596f97b not found: ID does not exist" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.175645 4707 scope.go:117] "RemoveContainer" containerID="a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.176146 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88\": container with ID starting with a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88 not found: ID does not exist" containerID="a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.176215 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88"} err="failed to get container status \"a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88\": rpc error: code = NotFound desc = could not find container \"a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88\": container with ID starting with a2e466696b9b824811c031d0ca61af8a0daf04bc7cf0144519630071a7201b88 not found: ID does not exist" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.176243 4707 scope.go:117] 
"RemoveContainer" containerID="3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.176523 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0\": container with ID starting with 3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0 not found: ID does not exist" containerID="3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.176555 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0"} err="failed to get container status \"3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0\": rpc error: code = NotFound desc = could not find container \"3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0\": container with ID starting with 3d26ff763826f75042872fc1c900e79dc4bf0b4400f828dd4eb28249426e83c0 not found: ID does not exist" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.176578 4707 scope.go:117] "RemoveContainer" containerID="987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e" Jan 21 15:33:10 crc kubenswrapper[4707]: E0121 15:33:10.176928 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e\": container with ID starting with 987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e not found: ID does not exist" containerID="987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.176956 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e"} err="failed to get container status \"987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e\": rpc error: code = NotFound desc = could not find container \"987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e\": container with ID starting with 987aa634067d45eb144db4523b630d915a3d3a4546511601571dcb2db71b815e not found: ID does not exist" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.331853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbbt\" (UniqueName: \"kubernetes.io/projected/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-kube-api-access-fbbbt\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.331895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.332132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.410885 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.423346 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.436861 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.438916 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.438951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.438988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbbt\" (UniqueName: \"kubernetes.io/projected/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-kube-api-access-fbbbt\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.439085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.439217 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.442294 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.442453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.446690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.458922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbbt\" (UniqueName: \"kubernetes.io/projected/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-kube-api-access-fbbbt\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.467468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.540764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-config-data\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.540804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b62w\" (UniqueName: \"kubernetes.io/projected/0fe2f662-8806-442a-81fe-a5d62acdc376-kube-api-access-7b62w\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.540868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-log-httpd\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.540887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-run-httpd\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.540930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 
15:33:10.541005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-scripts\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.541042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-scripts\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-config-data\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b62w\" (UniqueName: \"kubernetes.io/projected/0fe2f662-8806-442a-81fe-a5d62acdc376-kube-api-access-7b62w\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-log-httpd\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.642861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-run-httpd\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.643152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-log-httpd\") pod \"ceilometer-0\" (UID: 
\"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.643169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-run-httpd\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.653860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.653873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-scripts\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.654070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-config-data\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.655618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.655911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b62w\" (UniqueName: \"kubernetes.io/projected/0fe2f662-8806-442a-81fe-a5d62acdc376-kube-api-access-7b62w\") pod \"ceilometer-0\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.765919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:10 crc kubenswrapper[4707]: I0121 15:33:10.814383 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.200887 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72530e0-ee2a-4de5-9b09-666e0b759c09" path="/var/lib/kubelet/pods/a72530e0-ee2a-4de5-9b09-666e0b759c09/volumes" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.201873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:33:11 crc kubenswrapper[4707]: W0121 15:33:11.205109 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb3cd8ce_d377_43dc_a92e_7d63c8ab5a87.slice/crio-6c9e2c0a4c181ddaf67e3f47f776476fd05edd48320343c19a9ec8cdb852b4f3 WatchSource:0}: Error finding container 6c9e2c0a4c181ddaf67e3f47f776476fd05edd48320343c19a9ec8cdb852b4f3: Status 404 returned error can't find the container with id 6c9e2c0a4c181ddaf67e3f47f776476fd05edd48320343c19a9ec8cdb852b4f3 Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.249570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.371206 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.563010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-scripts\") pod \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.563067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-config-data\") pod \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.563396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-combined-ca-bundle\") pod \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.563499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khdh9\" (UniqueName: \"kubernetes.io/projected/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-kube-api-access-khdh9\") pod \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\" (UID: \"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde\") " Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.578644 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-kube-api-access-khdh9" (OuterVolumeSpecName: "kube-api-access-khdh9") pod "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" (UID: "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde"). InnerVolumeSpecName "kube-api-access-khdh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.582079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-scripts" (OuterVolumeSpecName: "scripts") pod "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" (UID: "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.590507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" (UID: "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.594838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-config-data" (OuterVolumeSpecName: "config-data") pod "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" (UID: "4329f46d-8b4d-45e8-a6a1-59ae09d3bdde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.666745 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.666779 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khdh9\" (UniqueName: \"kubernetes.io/projected/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-kube-api-access-khdh9\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.666792 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:11 crc kubenswrapper[4707]: I0121 15:33:11.666801 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.001569 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.002010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.108677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87","Type":"ContainerStarted","Data":"1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84"} Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.108740 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87","Type":"ContainerStarted","Data":"6c9e2c0a4c181ddaf67e3f47f776476fd05edd48320343c19a9ec8cdb852b4f3"} Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.110459 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.110918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" event={"ID":"4329f46d-8b4d-45e8-a6a1-59ae09d3bdde","Type":"ContainerDied","Data":"86856da99c5e8506933ea62dbf316c1378db23b8beea448f6e2822be73cde41c"} Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.110947 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86856da99c5e8506933ea62dbf316c1378db23b8beea448f6e2822be73cde41c" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.110951 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.130136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerStarted","Data":"0272d7a8588ea0c3d1b48fe05807dd58adda544e876612236fa96ef68acdd7ca"} Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.130200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerStarted","Data":"ea049c371492d375d13c0b996c8075e592d0f9297393006f929d4c0045ae78b3"} Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.138893 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.13887746 podStartE2EDuration="2.13887746s" podCreationTimestamp="2026-01-21 15:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:12.130063476 +0000 UTC m=+1889.311579699" watchObservedRunningTime="2026-01-21 15:33:12.13887746 +0000 UTC m=+1889.320393681" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.269603 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.270450 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-log" containerID="cri-o://53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4" gracePeriod=30 Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.271258 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-api" containerID="cri-o://1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3" gracePeriod=30 Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.287241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.287515 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="2adc5e20-edde-499c-b40f-e92dcdb8dcb7" containerName="nova-scheduler-scheduler" containerID="cri-o://90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681" gracePeriod=30 Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.289407 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.247:8774/\": EOF" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.298428 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.247:8774/\": EOF" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.304975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.305856 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-log" containerID="cri-o://0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938" gracePeriod=30 Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.305913 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-metadata" containerID="cri-o://493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd" gracePeriod=30 Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.432598 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.708872 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.709091 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.815942 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.889584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d12ead2-d0f8-40c9-b23c-f38e37339651-logs\") pod \"9d12ead2-d0f8-40c9-b23c-f38e37339651\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.889704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdxrv\" (UniqueName: \"kubernetes.io/projected/9d12ead2-d0f8-40c9-b23c-f38e37339651-kube-api-access-mdxrv\") pod \"9d12ead2-d0f8-40c9-b23c-f38e37339651\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.889763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-combined-ca-bundle\") pod \"9d12ead2-d0f8-40c9-b23c-f38e37339651\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.889824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-config-data\") pod \"9d12ead2-d0f8-40c9-b23c-f38e37339651\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.889850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-nova-metadata-tls-certs\") pod \"9d12ead2-d0f8-40c9-b23c-f38e37339651\" (UID: \"9d12ead2-d0f8-40c9-b23c-f38e37339651\") " Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.891046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d12ead2-d0f8-40c9-b23c-f38e37339651-logs" (OuterVolumeSpecName: "logs") pod "9d12ead2-d0f8-40c9-b23c-f38e37339651" (UID: "9d12ead2-d0f8-40c9-b23c-f38e37339651"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.893219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d12ead2-d0f8-40c9-b23c-f38e37339651-kube-api-access-mdxrv" (OuterVolumeSpecName: "kube-api-access-mdxrv") pod "9d12ead2-d0f8-40c9-b23c-f38e37339651" (UID: "9d12ead2-d0f8-40c9-b23c-f38e37339651"). InnerVolumeSpecName "kube-api-access-mdxrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.912093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-config-data" (OuterVolumeSpecName: "config-data") pod "9d12ead2-d0f8-40c9-b23c-f38e37339651" (UID: "9d12ead2-d0f8-40c9-b23c-f38e37339651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.914492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d12ead2-d0f8-40c9-b23c-f38e37339651" (UID: "9d12ead2-d0f8-40c9-b23c-f38e37339651"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.933723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9d12ead2-d0f8-40c9-b23c-f38e37339651" (UID: "9d12ead2-d0f8-40c9-b23c-f38e37339651"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.991313 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdxrv\" (UniqueName: \"kubernetes.io/projected/9d12ead2-d0f8-40c9-b23c-f38e37339651-kube-api-access-mdxrv\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.991338 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.991348 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.991356 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d12ead2-d0f8-40c9-b23c-f38e37339651-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:12 crc kubenswrapper[4707]: I0121 15:33:12.991367 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d12ead2-d0f8-40c9-b23c-f38e37339651-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.137595 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerID="53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4" exitCode=143 Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.137649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a","Type":"ContainerDied","Data":"53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4"} Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.138936 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerID="493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd" exitCode=0 Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.138956 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerID="0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938" exitCode=143 Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.139957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d12ead2-d0f8-40c9-b23c-f38e37339651","Type":"ContainerDied","Data":"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd"} Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.140235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"9d12ead2-d0f8-40c9-b23c-f38e37339651","Type":"ContainerDied","Data":"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938"} Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.140249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d12ead2-d0f8-40c9-b23c-f38e37339651","Type":"ContainerDied","Data":"9262dfc5ae36a5b1f7f8e5ff880054bb17e62efce9f2bd0aede8b2621a7508a3"} Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.140265 4707 scope.go:117] "RemoveContainer" containerID="493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.140364 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.150092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerStarted","Data":"81ccf685e562c5b05d32070822c0e14cb8cb62485bd2c5d4fab0dd270ab2932e"} Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.169947 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.178104 4707 scope.go:117] "RemoveContainer" containerID="0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.198572 4707 scope.go:117] "RemoveContainer" containerID="493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd" Jan 21 15:33:13 crc kubenswrapper[4707]: E0121 15:33:13.201584 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd\": container with ID starting with 493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd not found: ID does not exist" containerID="493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.201622 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd"} err="failed to get container status \"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd\": rpc error: code = NotFound desc = could not find container \"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd\": container with ID starting with 493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd not found: ID does not exist" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.201646 4707 scope.go:117] "RemoveContainer" containerID="0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938" Jan 21 15:33:13 crc kubenswrapper[4707]: E0121 15:33:13.202178 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938\": container with ID starting with 0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938 not found: ID does not exist" containerID="0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.202223 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938"} err="failed to get container status \"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938\": rpc error: code = NotFound desc = could not find container \"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938\": container with ID starting with 0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938 not found: ID does not exist" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.202248 4707 scope.go:117] "RemoveContainer" containerID="493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.203323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd"} err="failed to get container status \"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd\": rpc error: code = NotFound desc = could not find container \"493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd\": container with ID starting with 493173e771a4cc6f7deb35d6d5307d5a0a67d796ef5891744bed79d063cf47fd not found: ID does not exist" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.203344 4707 scope.go:117] "RemoveContainer" containerID="0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.204279 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938"} err="failed to get container status \"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938\": rpc error: code = NotFound desc = could not find container \"0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938\": container with ID starting with 0846d28896fac2a65bbb481463e90abb8e8285282e6b5cdb1750184576940938 not found: ID does not exist" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207059 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207099 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:13 crc kubenswrapper[4707]: E0121 15:33:13.207372 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" containerName="nova-manage" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207389 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" containerName="nova-manage" Jan 21 15:33:13 crc kubenswrapper[4707]: E0121 15:33:13.207401 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-log" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207407 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-log" Jan 21 15:33:13 crc kubenswrapper[4707]: E0121 15:33:13.207421 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-metadata" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207429 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-metadata" Jan 21 15:33:13 crc 
kubenswrapper[4707]: I0121 15:33:13.207628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-log" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207648 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" containerName="nova-metadata-metadata" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.207656 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" containerName="nova-manage" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.208555 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.212560 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.213935 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.224925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.297592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.297686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-config-data\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.297748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.297772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvjt\" (UniqueName: \"kubernetes.io/projected/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-kube-api-access-9nvjt\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.297848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-logs\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.399175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.399227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvjt\" (UniqueName: \"kubernetes.io/projected/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-kube-api-access-9nvjt\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.399246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-logs\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.399312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.399356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-config-data\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.399963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-logs\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.402177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.403460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.407303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-config-data\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.413214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvjt\" (UniqueName: \"kubernetes.io/projected/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-kube-api-access-9nvjt\") pod \"nova-metadata-0\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.526345 4707 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:13 crc kubenswrapper[4707]: I0121 15:33:13.927245 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:13 crc kubenswrapper[4707]: W0121 15:33:13.935037 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf3e88a6_0f55_4de2_82db_7055f0ae3dfc.slice/crio-84911ff753273bd5d3e7da6e24c7017eeca76c3af23ccc7b035065294a985b1d WatchSource:0}: Error finding container 84911ff753273bd5d3e7da6e24c7017eeca76c3af23ccc7b035065294a985b1d: Status 404 returned error can't find the container with id 84911ff753273bd5d3e7da6e24c7017eeca76c3af23ccc7b035065294a985b1d Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.158473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"df3e88a6-0f55-4de2-82db-7055f0ae3dfc","Type":"ContainerStarted","Data":"7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f"} Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.158829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"df3e88a6-0f55-4de2-82db-7055f0ae3dfc","Type":"ContainerStarted","Data":"84911ff753273bd5d3e7da6e24c7017eeca76c3af23ccc7b035065294a985b1d"} Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.161542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerStarted","Data":"deae1009cf5b2522db18f13d04c635a3640ffb4fb6438965665632d3c4e7544a"} Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.504336 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.620576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-config-data\") pod \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.620601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-combined-ca-bundle\") pod \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.620627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn64w\" (UniqueName: \"kubernetes.io/projected/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-kube-api-access-nn64w\") pod \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\" (UID: \"2adc5e20-edde-499c-b40f-e92dcdb8dcb7\") " Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.623476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-kube-api-access-nn64w" (OuterVolumeSpecName: "kube-api-access-nn64w") pod "2adc5e20-edde-499c-b40f-e92dcdb8dcb7" (UID: "2adc5e20-edde-499c-b40f-e92dcdb8dcb7"). InnerVolumeSpecName "kube-api-access-nn64w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.645875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-config-data" (OuterVolumeSpecName: "config-data") pod "2adc5e20-edde-499c-b40f-e92dcdb8dcb7" (UID: "2adc5e20-edde-499c-b40f-e92dcdb8dcb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.650354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2adc5e20-edde-499c-b40f-e92dcdb8dcb7" (UID: "2adc5e20-edde-499c-b40f-e92dcdb8dcb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.722955 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.722986 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:14 crc kubenswrapper[4707]: I0121 15:33:14.722997 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn64w\" (UniqueName: \"kubernetes.io/projected/2adc5e20-edde-499c-b40f-e92dcdb8dcb7-kube-api-access-nn64w\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.170831 4707 generic.go:334] "Generic (PLEG): container finished" podID="2adc5e20-edde-499c-b40f-e92dcdb8dcb7" containerID="90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681" exitCode=0 Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.170881 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.170896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2adc5e20-edde-499c-b40f-e92dcdb8dcb7","Type":"ContainerDied","Data":"90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681"} Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.171363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2adc5e20-edde-499c-b40f-e92dcdb8dcb7","Type":"ContainerDied","Data":"e46dd610954e91fa728b4b6a5f57d3e02a65021a69f3316d095e9c02aed9935c"} Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.171386 4707 scope.go:117] "RemoveContainer" containerID="90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.173073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"df3e88a6-0f55-4de2-82db-7055f0ae3dfc","Type":"ContainerStarted","Data":"00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf"} Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.174936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerStarted","Data":"a0fff428bdde860dd842fe26e36209a4da5de38749108578e230170a328e4bc1"} Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.175261 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.191983 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d12ead2-d0f8-40c9-b23c-f38e37339651" path="/var/lib/kubelet/pods/9d12ead2-d0f8-40c9-b23c-f38e37339651/volumes" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.193383 4707 scope.go:117] "RemoveContainer" containerID="90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681" Jan 21 15:33:15 crc kubenswrapper[4707]: E0121 15:33:15.193833 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681\": container with ID starting with 90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681 not found: ID does not exist" containerID="90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.193863 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681"} err="failed to get container status \"90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681\": rpc error: code = NotFound desc = could not find container \"90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681\": container with ID starting with 90ea97769257903ab071dc9f7b849742b857cf85b265c74bacd095bd5b805681 not found: ID does not exist" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.194971 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.194955507 podStartE2EDuration="2.194955507s" podCreationTimestamp="2026-01-21 15:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-21 15:33:15.19355833 +0000 UTC m=+1892.375074553" watchObservedRunningTime="2026-01-21 15:33:15.194955507 +0000 UTC m=+1892.376471730" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.220056 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.89398276 podStartE2EDuration="5.220045384s" podCreationTimestamp="2026-01-21 15:33:10 +0000 UTC" firstStartedPulling="2026-01-21 15:33:11.281162559 +0000 UTC m=+1888.462678771" lastFinishedPulling="2026-01-21 15:33:14.607225173 +0000 UTC m=+1891.788741395" observedRunningTime="2026-01-21 15:33:15.213234116 +0000 UTC m=+1892.394750339" watchObservedRunningTime="2026-01-21 15:33:15.220045384 +0000 UTC m=+1892.401561625" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.229694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.239615 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.250303 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:15 crc kubenswrapper[4707]: E0121 15:33:15.250687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adc5e20-edde-499c-b40f-e92dcdb8dcb7" containerName="nova-scheduler-scheduler" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.250704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adc5e20-edde-499c-b40f-e92dcdb8dcb7" containerName="nova-scheduler-scheduler" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.250884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adc5e20-edde-499c-b40f-e92dcdb8dcb7" containerName="nova-scheduler-scheduler" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.251467 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.257285 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.259631 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.433532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-config-data\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.433618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc4zj\" (UniqueName: \"kubernetes.io/projected/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-kube-api-access-lc4zj\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.433684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.534906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc4zj\" (UniqueName: \"kubernetes.io/projected/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-kube-api-access-lc4zj\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.534986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.535050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-config-data\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.540014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-config-data\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.541489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.557735 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc4zj\" (UniqueName: \"kubernetes.io/projected/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-kube-api-access-lc4zj\") pod \"nova-scheduler-0\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.567347 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:15 crc kubenswrapper[4707]: I0121 15:33:15.956275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:15 crc kubenswrapper[4707]: W0121 15:33:15.974270 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fcf3fe5_5cf0_4334_aec8_e4929894c12e.slice/crio-4f1f8f8a737b0e7b8a035edcddf28479b57065589c706fa7635c9bd6028e4c2b WatchSource:0}: Error finding container 4f1f8f8a737b0e7b8a035edcddf28479b57065589c706fa7635c9bd6028e4c2b: Status 404 returned error can't find the container with id 4f1f8f8a737b0e7b8a035edcddf28479b57065589c706fa7635c9bd6028e4c2b Jan 21 15:33:16 crc kubenswrapper[4707]: I0121 15:33:16.184636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9fcf3fe5-5cf0-4334-aec8-e4929894c12e","Type":"ContainerStarted","Data":"460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a"} Jan 21 15:33:16 crc kubenswrapper[4707]: I0121 15:33:16.184881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9fcf3fe5-5cf0-4334-aec8-e4929894c12e","Type":"ContainerStarted","Data":"4f1f8f8a737b0e7b8a035edcddf28479b57065589c706fa7635c9bd6028e4c2b"} Jan 21 15:33:16 crc kubenswrapper[4707]: I0121 15:33:16.201663 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.201652344 podStartE2EDuration="1.201652344s" podCreationTimestamp="2026-01-21 15:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:16.197752863 +0000 UTC m=+1893.379269085" watchObservedRunningTime="2026-01-21 15:33:16.201652344 +0000 UTC m=+1893.383168567" Jan 21 15:33:17 crc kubenswrapper[4707]: I0121 15:33:17.182555 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:33:17 crc kubenswrapper[4707]: E0121 15:33:17.183030 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:33:17 crc kubenswrapper[4707]: I0121 15:33:17.190112 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2adc5e20-edde-499c-b40f-e92dcdb8dcb7" path="/var/lib/kubelet/pods/2adc5e20-edde-499c-b40f-e92dcdb8dcb7/volumes" Jan 21 15:33:17 crc kubenswrapper[4707]: I0121 15:33:17.431918 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:17 crc 
kubenswrapper[4707]: I0121 15:33:17.447373 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.178589 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.209080 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerID="1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3" exitCode=0 Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.209143 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.209170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a","Type":"ContainerDied","Data":"1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3"} Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.209219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a","Type":"ContainerDied","Data":"6183131366ba1aefeed714b5f54eb155d771d8f08a7797f6cfdedc00dcedae3d"} Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.209236 4707 scope.go:117] "RemoveContainer" containerID="1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.223847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.227179 4707 scope.go:117] "RemoveContainer" containerID="53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.244927 4707 scope.go:117] "RemoveContainer" containerID="1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3" Jan 21 15:33:18 crc kubenswrapper[4707]: E0121 15:33:18.245320 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3\": container with ID starting with 1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3 not found: ID does not exist" containerID="1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.245344 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3"} err="failed to get container status \"1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3\": rpc error: code = NotFound desc = could not find container \"1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3\": container with ID starting with 1e138d9de24186e39d858f0cc07f4b226bd1e228be4e8969bd9bcfa18cadfba3 not found: ID does not exist" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.245362 4707 scope.go:117] "RemoveContainer" containerID="53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4" Jan 21 15:33:18 crc kubenswrapper[4707]: E0121 15:33:18.249861 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4\": container with ID starting with 53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4 not found: ID does not exist" containerID="53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.249898 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4"} err="failed to get container status \"53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4\": rpc error: code = NotFound desc = could not find container \"53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4\": container with ID starting with 53e91d544cebc9064ed2ea800976f4a53b3d1ea040da1be51a1c8a2041dcb2a4 not found: ID does not exist" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.387372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-combined-ca-bundle\") pod \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.387611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-config-data\") pod \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.387712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-logs\") pod \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.387849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qn96\" (UniqueName: \"kubernetes.io/projected/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-kube-api-access-8qn96\") pod \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\" (UID: \"d5bab875-2c5e-4ae1-b67e-bcb7b818b55a\") " Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.389758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-logs" (OuterVolumeSpecName: "logs") pod "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" (UID: "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.392984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-kube-api-access-8qn96" (OuterVolumeSpecName: "kube-api-access-8qn96") pod "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" (UID: "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a"). InnerVolumeSpecName "kube-api-access-8qn96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.409013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-config-data" (OuterVolumeSpecName: "config-data") pod "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" (UID: "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.411121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" (UID: "d5bab875-2c5e-4ae1-b67e-bcb7b818b55a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.490343 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.490436 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qn96\" (UniqueName: \"kubernetes.io/projected/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-kube-api-access-8qn96\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.490506 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.490559 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.527688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.527849 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.536424 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.542306 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.555541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:18 crc kubenswrapper[4707]: E0121 15:33:18.555923 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-api" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.555941 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-api" Jan 21 15:33:18 crc kubenswrapper[4707]: E0121 15:33:18.555952 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-log" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.555958 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-log" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.556126 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" containerName="nova-api-log" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.556153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" 
containerName="nova-api-api" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.557032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.560086 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.562945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.694213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.694904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-config-data\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.694993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfws2\" (UniqueName: \"kubernetes.io/projected/c859e562-fd51-4fad-bc54-79c8b39fe941-kube-api-access-mfws2\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.695058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c859e562-fd51-4fad-bc54-79c8b39fe941-logs\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.796958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfws2\" (UniqueName: \"kubernetes.io/projected/c859e562-fd51-4fad-bc54-79c8b39fe941-kube-api-access-mfws2\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.797013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c859e562-fd51-4fad-bc54-79c8b39fe941-logs\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.797113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.797143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-config-data\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc 
kubenswrapper[4707]: I0121 15:33:18.797426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c859e562-fd51-4fad-bc54-79c8b39fe941-logs\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.800176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-config-data\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.800380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.809218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfws2\" (UniqueName: \"kubernetes.io/projected/c859e562-fd51-4fad-bc54-79c8b39fe941-kube-api-access-mfws2\") pod \"nova-api-0\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:18 crc kubenswrapper[4707]: I0121 15:33:18.876823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:19 crc kubenswrapper[4707]: I0121 15:33:19.200650 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bab875-2c5e-4ae1-b67e-bcb7b818b55a" path="/var/lib/kubelet/pods/d5bab875-2c5e-4ae1-b67e-bcb7b818b55a/volumes" Jan 21 15:33:19 crc kubenswrapper[4707]: W0121 15:33:19.258195 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc859e562_fd51_4fad_bc54_79c8b39fe941.slice/crio-0cb37a33f096eee2a70d822b71a4eaf62064d867609235551b523b860201f780 WatchSource:0}: Error finding container 0cb37a33f096eee2a70d822b71a4eaf62064d867609235551b523b860201f780: Status 404 returned error can't find the container with id 0cb37a33f096eee2a70d822b71a4eaf62064d867609235551b523b860201f780 Jan 21 15:33:19 crc kubenswrapper[4707]: I0121 15:33:19.258451 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:20 crc kubenswrapper[4707]: I0121 15:33:20.224254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c859e562-fd51-4fad-bc54-79c8b39fe941","Type":"ContainerStarted","Data":"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701"} Jan 21 15:33:20 crc kubenswrapper[4707]: I0121 15:33:20.224286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c859e562-fd51-4fad-bc54-79c8b39fe941","Type":"ContainerStarted","Data":"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408"} Jan 21 15:33:20 crc kubenswrapper[4707]: I0121 15:33:20.224296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c859e562-fd51-4fad-bc54-79c8b39fe941","Type":"ContainerStarted","Data":"0cb37a33f096eee2a70d822b71a4eaf62064d867609235551b523b860201f780"} Jan 21 15:33:20 crc kubenswrapper[4707]: I0121 15:33:20.235775 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.235761961 podStartE2EDuration="2.235761961s" podCreationTimestamp="2026-01-21 15:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:20.234227726 +0000 UTC m=+1897.415743938" watchObservedRunningTime="2026-01-21 15:33:20.235761961 +0000 UTC m=+1897.417278183" Jan 21 15:33:20 crc kubenswrapper[4707]: I0121 15:33:20.567428 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:20 crc kubenswrapper[4707]: I0121 15:33:20.787642 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.153604 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp"] Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.154644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.156403 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.156550 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.162483 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp"] Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.334631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgmsz\" (UniqueName: \"kubernetes.io/projected/c829483b-7aed-46a5-892c-1c6983e8aaf2-kube-api-access-qgmsz\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.334697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-scripts\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.334725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.334853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-config-data\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.436541 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qgmsz\" (UniqueName: \"kubernetes.io/projected/c829483b-7aed-46a5-892c-1c6983e8aaf2-kube-api-access-qgmsz\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.436601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-scripts\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.436632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.436717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-config-data\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.443246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-scripts\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.444384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-config-data\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.444741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.452614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgmsz\" (UniqueName: \"kubernetes.io/projected/c829483b-7aed-46a5-892c-1c6983e8aaf2-kube-api-access-qgmsz\") pod \"nova-cell1-cell-mapping-cfgpp\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.471016 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:21 crc kubenswrapper[4707]: I0121 15:33:21.867847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp"] Jan 21 15:33:21 crc kubenswrapper[4707]: W0121 15:33:21.878100 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc829483b_7aed_46a5_892c_1c6983e8aaf2.slice/crio-f904c22b2e4a1f3f4b9b7736c5a194e9ff6c9cdea608a9f5409f92fe6cbd0173 WatchSource:0}: Error finding container f904c22b2e4a1f3f4b9b7736c5a194e9ff6c9cdea608a9f5409f92fe6cbd0173: Status 404 returned error can't find the container with id f904c22b2e4a1f3f4b9b7736c5a194e9ff6c9cdea608a9f5409f92fe6cbd0173 Jan 21 15:33:22 crc kubenswrapper[4707]: I0121 15:33:22.238759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" event={"ID":"c829483b-7aed-46a5-892c-1c6983e8aaf2","Type":"ContainerStarted","Data":"f3d23c22831b8b61a1371a54cbf9cc055881f895c54df9153e45ee9a3ca7ebde"} Jan 21 15:33:22 crc kubenswrapper[4707]: I0121 15:33:22.239226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" event={"ID":"c829483b-7aed-46a5-892c-1c6983e8aaf2","Type":"ContainerStarted","Data":"f904c22b2e4a1f3f4b9b7736c5a194e9ff6c9cdea608a9f5409f92fe6cbd0173"} Jan 21 15:33:22 crc kubenswrapper[4707]: I0121 15:33:22.250782 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" podStartSLOduration=1.2507693 podStartE2EDuration="1.2507693s" podCreationTimestamp="2026-01-21 15:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:22.249776854 +0000 UTC m=+1899.431293075" watchObservedRunningTime="2026-01-21 15:33:22.2507693 +0000 UTC m=+1899.432285522" Jan 21 15:33:23 crc kubenswrapper[4707]: I0121 15:33:23.527237 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:23 crc kubenswrapper[4707]: I0121 15:33:23.527280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:24 crc kubenswrapper[4707]: I0121 15:33:24.539917 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:33:24 crc kubenswrapper[4707]: I0121 15:33:24.539939 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:33:25 crc kubenswrapper[4707]: I0121 15:33:25.567866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:25 crc kubenswrapper[4707]: I0121 15:33:25.589081 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:26 
crc kubenswrapper[4707]: I0121 15:33:26.273988 4707 generic.go:334] "Generic (PLEG): container finished" podID="c829483b-7aed-46a5-892c-1c6983e8aaf2" containerID="f3d23c22831b8b61a1371a54cbf9cc055881f895c54df9153e45ee9a3ca7ebde" exitCode=0 Jan 21 15:33:26 crc kubenswrapper[4707]: I0121 15:33:26.274491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" event={"ID":"c829483b-7aed-46a5-892c-1c6983e8aaf2","Type":"ContainerDied","Data":"f3d23c22831b8b61a1371a54cbf9cc055881f895c54df9153e45ee9a3ca7ebde"} Jan 21 15:33:26 crc kubenswrapper[4707]: I0121 15:33:26.296029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.543610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.733256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-config-data\") pod \"c829483b-7aed-46a5-892c-1c6983e8aaf2\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.733306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-scripts\") pod \"c829483b-7aed-46a5-892c-1c6983e8aaf2\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.733384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-combined-ca-bundle\") pod \"c829483b-7aed-46a5-892c-1c6983e8aaf2\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.733447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgmsz\" (UniqueName: \"kubernetes.io/projected/c829483b-7aed-46a5-892c-1c6983e8aaf2-kube-api-access-qgmsz\") pod \"c829483b-7aed-46a5-892c-1c6983e8aaf2\" (UID: \"c829483b-7aed-46a5-892c-1c6983e8aaf2\") " Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.737255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-scripts" (OuterVolumeSpecName: "scripts") pod "c829483b-7aed-46a5-892c-1c6983e8aaf2" (UID: "c829483b-7aed-46a5-892c-1c6983e8aaf2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.737295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c829483b-7aed-46a5-892c-1c6983e8aaf2-kube-api-access-qgmsz" (OuterVolumeSpecName: "kube-api-access-qgmsz") pod "c829483b-7aed-46a5-892c-1c6983e8aaf2" (UID: "c829483b-7aed-46a5-892c-1c6983e8aaf2"). InnerVolumeSpecName "kube-api-access-qgmsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.753279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c829483b-7aed-46a5-892c-1c6983e8aaf2" (UID: "c829483b-7aed-46a5-892c-1c6983e8aaf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.753684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-config-data" (OuterVolumeSpecName: "config-data") pod "c829483b-7aed-46a5-892c-1c6983e8aaf2" (UID: "c829483b-7aed-46a5-892c-1c6983e8aaf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.835469 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgmsz\" (UniqueName: \"kubernetes.io/projected/c829483b-7aed-46a5-892c-1c6983e8aaf2-kube-api-access-qgmsz\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.835497 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.835524 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:27 crc kubenswrapper[4707]: I0121 15:33:27.835533 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c829483b-7aed-46a5-892c-1c6983e8aaf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.289356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" event={"ID":"c829483b-7aed-46a5-892c-1c6983e8aaf2","Type":"ContainerDied","Data":"f904c22b2e4a1f3f4b9b7736c5a194e9ff6c9cdea608a9f5409f92fe6cbd0173"} Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.289536 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f904c22b2e4a1f3f4b9b7736c5a194e9ff6c9cdea608a9f5409f92fe6cbd0173" Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.289407 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp" Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.437452 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.437841 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-log" containerID="cri-o://ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408" gracePeriod=30 Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.437883 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-api" containerID="cri-o://29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701" gracePeriod=30 Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.444769 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.444936 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" containerName="nova-scheduler-scheduler" containerID="cri-o://460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" gracePeriod=30 Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.474109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.474311 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-log" containerID="cri-o://7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f" gracePeriod=30 Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.474395 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-metadata" containerID="cri-o://00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf" gracePeriod=30 Jan 21 15:33:28 crc kubenswrapper[4707]: I0121 15:33:28.914143 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.054117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c859e562-fd51-4fad-bc54-79c8b39fe941-logs\") pod \"c859e562-fd51-4fad-bc54-79c8b39fe941\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.054414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-config-data\") pod \"c859e562-fd51-4fad-bc54-79c8b39fe941\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.054474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-combined-ca-bundle\") pod \"c859e562-fd51-4fad-bc54-79c8b39fe941\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.054521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfws2\" (UniqueName: \"kubernetes.io/projected/c859e562-fd51-4fad-bc54-79c8b39fe941-kube-api-access-mfws2\") pod \"c859e562-fd51-4fad-bc54-79c8b39fe941\" (UID: \"c859e562-fd51-4fad-bc54-79c8b39fe941\") " Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.054579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c859e562-fd51-4fad-bc54-79c8b39fe941-logs" (OuterVolumeSpecName: "logs") pod "c859e562-fd51-4fad-bc54-79c8b39fe941" (UID: "c859e562-fd51-4fad-bc54-79c8b39fe941"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.055017 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c859e562-fd51-4fad-bc54-79c8b39fe941-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.058019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c859e562-fd51-4fad-bc54-79c8b39fe941-kube-api-access-mfws2" (OuterVolumeSpecName: "kube-api-access-mfws2") pod "c859e562-fd51-4fad-bc54-79c8b39fe941" (UID: "c859e562-fd51-4fad-bc54-79c8b39fe941"). InnerVolumeSpecName "kube-api-access-mfws2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.073513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-config-data" (OuterVolumeSpecName: "config-data") pod "c859e562-fd51-4fad-bc54-79c8b39fe941" (UID: "c859e562-fd51-4fad-bc54-79c8b39fe941"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.074641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c859e562-fd51-4fad-bc54-79c8b39fe941" (UID: "c859e562-fd51-4fad-bc54-79c8b39fe941"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.156344 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.156368 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c859e562-fd51-4fad-bc54-79c8b39fe941-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.156380 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfws2\" (UniqueName: \"kubernetes.io/projected/c859e562-fd51-4fad-bc54-79c8b39fe941-kube-api-access-mfws2\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.182646 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:33:29 crc kubenswrapper[4707]: E0121 15:33:29.182854 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297302 4707 generic.go:334] "Generic (PLEG): container finished" podID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerID="29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701" exitCode=0 Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297497 4707 generic.go:334] "Generic (PLEG): container finished" podID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerID="ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408" exitCode=143 Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c859e562-fd51-4fad-bc54-79c8b39fe941","Type":"ContainerDied","Data":"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701"} Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c859e562-fd51-4fad-bc54-79c8b39fe941","Type":"ContainerDied","Data":"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408"} Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297404 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297586 4707 scope.go:117] "RemoveContainer" containerID="29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.297575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c859e562-fd51-4fad-bc54-79c8b39fe941","Type":"ContainerDied","Data":"0cb37a33f096eee2a70d822b71a4eaf62064d867609235551b523b860201f780"} Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.299534 4707 generic.go:334] "Generic (PLEG): container finished" podID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerID="7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f" exitCode=143 Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.299558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"df3e88a6-0f55-4de2-82db-7055f0ae3dfc","Type":"ContainerDied","Data":"7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f"} Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.313532 4707 scope.go:117] "RemoveContainer" containerID="ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.314134 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.327488 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.328637 4707 scope.go:117] "RemoveContainer" containerID="29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701" Jan 21 15:33:29 crc kubenswrapper[4707]: E0121 15:33:29.329132 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701\": container with ID starting with 29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701 not found: ID does not exist" containerID="29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.329257 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701"} err="failed to get container status \"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701\": rpc error: code = NotFound desc = could not find container \"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701\": container with ID starting with 29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701 not found: ID does not exist" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.329336 4707 scope.go:117] "RemoveContainer" containerID="ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408" Jan 21 15:33:29 crc kubenswrapper[4707]: E0121 15:33:29.329647 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408\": container with ID starting with ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408 not found: ID does not exist" containerID="ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 
15:33:29.329686 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408"} err="failed to get container status \"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408\": rpc error: code = NotFound desc = could not find container \"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408\": container with ID starting with ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408 not found: ID does not exist" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.329712 4707 scope.go:117] "RemoveContainer" containerID="29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.329961 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701"} err="failed to get container status \"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701\": rpc error: code = NotFound desc = could not find container \"29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701\": container with ID starting with 29d1b1dc2862563d1ae0f4a0dc9f0f731f74323af82afe3013fb7739adcef701 not found: ID does not exist" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.330031 4707 scope.go:117] "RemoveContainer" containerID="ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.330255 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408"} err="failed to get container status \"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408\": rpc error: code = NotFound desc = could not find container \"ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408\": container with ID starting with ab7775e7bf049f735333b01bfe9b59187961d83740b2bd0ea190f09cdfa95408 not found: ID does not exist" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337129 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:29 crc kubenswrapper[4707]: E0121 15:33:29.337460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-api" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-api" Jan 21 15:33:29 crc kubenswrapper[4707]: E0121 15:33:29.337498 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c829483b-7aed-46a5-892c-1c6983e8aaf2" containerName="nova-manage" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337504 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c829483b-7aed-46a5-892c-1c6983e8aaf2" containerName="nova-manage" Jan 21 15:33:29 crc kubenswrapper[4707]: E0121 15:33:29.337513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-log" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-log" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337672 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-api" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c829483b-7aed-46a5-892c-1c6983e8aaf2" containerName="nova-manage" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.337704 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" containerName="nova-api-log" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.338568 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.343289 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.346927 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.371255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qh9\" (UniqueName: \"kubernetes.io/projected/ce71b678-6366-4a72-9175-3428db36ed53-kube-api-access-m4qh9\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.371339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-config-data\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.371369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce71b678-6366-4a72-9175-3428db36ed53-logs\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.371433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.472965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.473089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qh9\" (UniqueName: \"kubernetes.io/projected/ce71b678-6366-4a72-9175-3428db36ed53-kube-api-access-m4qh9\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.473124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-config-data\") pod \"nova-api-0\" (UID: 
\"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.473143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce71b678-6366-4a72-9175-3428db36ed53-logs\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.473614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce71b678-6366-4a72-9175-3428db36ed53-logs\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.477057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-config-data\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.477182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.485573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qh9\" (UniqueName: \"kubernetes.io/projected/ce71b678-6366-4a72-9175-3428db36ed53-kube-api-access-m4qh9\") pod \"nova-api-0\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:29 crc kubenswrapper[4707]: I0121 15:33:29.654430 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:30 crc kubenswrapper[4707]: I0121 15:33:30.024488 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:30 crc kubenswrapper[4707]: I0121 15:33:30.308327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ce71b678-6366-4a72-9175-3428db36ed53","Type":"ContainerStarted","Data":"9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d"} Jan 21 15:33:30 crc kubenswrapper[4707]: I0121 15:33:30.308526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ce71b678-6366-4a72-9175-3428db36ed53","Type":"ContainerStarted","Data":"7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6"} Jan 21 15:33:30 crc kubenswrapper[4707]: I0121 15:33:30.308537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ce71b678-6366-4a72-9175-3428db36ed53","Type":"ContainerStarted","Data":"67b19545205492bc2592b88f091fd36f6c7f33cc997495826233a5861be2f226"} Jan 21 15:33:30 crc kubenswrapper[4707]: I0121 15:33:30.327541 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.327531115 podStartE2EDuration="1.327531115s" podCreationTimestamp="2026-01-21 15:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:30.321799559 +0000 UTC m=+1907.503315780" watchObservedRunningTime="2026-01-21 15:33:30.327531115 +0000 UTC m=+1907.509047338" Jan 21 15:33:30 crc kubenswrapper[4707]: E0121 15:33:30.569063 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:33:30 crc kubenswrapper[4707]: E0121 15:33:30.570089 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:33:30 crc kubenswrapper[4707]: E0121 15:33:30.571906 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:33:30 crc kubenswrapper[4707]: E0121 15:33:30.571937 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" containerName="nova-scheduler-scheduler" Jan 21 15:33:31 crc kubenswrapper[4707]: I0121 15:33:31.191065 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c859e562-fd51-4fad-bc54-79c8b39fe941" path="/var/lib/kubelet/pods/c859e562-fd51-4fad-bc54-79c8b39fe941/volumes" Jan 21 15:33:31 crc 
kubenswrapper[4707]: I0121 15:33:31.982415 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.101743 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.110468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-config-data\") pod \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.110539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-nova-metadata-tls-certs\") pod \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.110640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nvjt\" (UniqueName: \"kubernetes.io/projected/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-kube-api-access-9nvjt\") pod \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.110661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-logs\") pod \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.110762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-combined-ca-bundle\") pod \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\" (UID: \"df3e88a6-0f55-4de2-82db-7055f0ae3dfc\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.111097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-logs" (OuterVolumeSpecName: "logs") pod "df3e88a6-0f55-4de2-82db-7055f0ae3dfc" (UID: "df3e88a6-0f55-4de2-82db-7055f0ae3dfc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.111436 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.116139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-kube-api-access-9nvjt" (OuterVolumeSpecName: "kube-api-access-9nvjt") pod "df3e88a6-0f55-4de2-82db-7055f0ae3dfc" (UID: "df3e88a6-0f55-4de2-82db-7055f0ae3dfc"). InnerVolumeSpecName "kube-api-access-9nvjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.134686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-config-data" (OuterVolumeSpecName: "config-data") pod "df3e88a6-0f55-4de2-82db-7055f0ae3dfc" (UID: "df3e88a6-0f55-4de2-82db-7055f0ae3dfc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.142632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df3e88a6-0f55-4de2-82db-7055f0ae3dfc" (UID: "df3e88a6-0f55-4de2-82db-7055f0ae3dfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.154298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "df3e88a6-0f55-4de2-82db-7055f0ae3dfc" (UID: "df3e88a6-0f55-4de2-82db-7055f0ae3dfc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.212482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-config-data\") pod \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.212632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc4zj\" (UniqueName: \"kubernetes.io/projected/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-kube-api-access-lc4zj\") pod \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.212678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-combined-ca-bundle\") pod \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\" (UID: \"9fcf3fe5-5cf0-4334-aec8-e4929894c12e\") " Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.213261 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nvjt\" (UniqueName: \"kubernetes.io/projected/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-kube-api-access-9nvjt\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.213280 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.213289 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.213297 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e88a6-0f55-4de2-82db-7055f0ae3dfc-nova-metadata-tls-certs\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.214874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-kube-api-access-lc4zj" (OuterVolumeSpecName: "kube-api-access-lc4zj") pod "9fcf3fe5-5cf0-4334-aec8-e4929894c12e" (UID: "9fcf3fe5-5cf0-4334-aec8-e4929894c12e"). InnerVolumeSpecName "kube-api-access-lc4zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.229479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-config-data" (OuterVolumeSpecName: "config-data") pod "9fcf3fe5-5cf0-4334-aec8-e4929894c12e" (UID: "9fcf3fe5-5cf0-4334-aec8-e4929894c12e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.229573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fcf3fe5-5cf0-4334-aec8-e4929894c12e" (UID: "9fcf3fe5-5cf0-4334-aec8-e4929894c12e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.315171 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.315346 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc4zj\" (UniqueName: \"kubernetes.io/projected/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-kube-api-access-lc4zj\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.315418 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcf3fe5-5cf0-4334-aec8-e4929894c12e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.330051 4707 generic.go:334] "Generic (PLEG): container finished" podID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerID="00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf" exitCode=0 Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.330107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"df3e88a6-0f55-4de2-82db-7055f0ae3dfc","Type":"ContainerDied","Data":"00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf"} Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.330132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"df3e88a6-0f55-4de2-82db-7055f0ae3dfc","Type":"ContainerDied","Data":"84911ff753273bd5d3e7da6e24c7017eeca76c3af23ccc7b035065294a985b1d"} Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.330150 4707 scope.go:117] "RemoveContainer" containerID="00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.330259 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.332488 4707 generic.go:334] "Generic (PLEG): container finished" podID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" exitCode=0 Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.332525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9fcf3fe5-5cf0-4334-aec8-e4929894c12e","Type":"ContainerDied","Data":"460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a"} Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.332547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9fcf3fe5-5cf0-4334-aec8-e4929894c12e","Type":"ContainerDied","Data":"4f1f8f8a737b0e7b8a035edcddf28479b57065589c706fa7635c9bd6028e4c2b"} Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.332590 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.351426 4707 scope.go:117] "RemoveContainer" containerID="7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.362927 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.377799 4707 scope.go:117] "RemoveContainer" containerID="00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf" Jan 21 15:33:32 crc kubenswrapper[4707]: E0121 15:33:32.380869 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf\": container with ID starting with 00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf not found: ID does not exist" containerID="00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.382387 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf"} err="failed to get container status \"00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf\": rpc error: code = NotFound desc = could not find container \"00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf\": container with ID starting with 00d715ef274c76694465e83a08551006235546ba851b96104dcb399fa1f76fcf not found: ID does not exist" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.382473 4707 scope.go:117] "RemoveContainer" containerID="7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.382053 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: E0121 15:33:32.382892 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f\": container with ID starting with 7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f not found: ID does not exist" containerID="7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f" Jan 21 15:33:32 crc 
kubenswrapper[4707]: I0121 15:33:32.382928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f"} err="failed to get container status \"7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f\": rpc error: code = NotFound desc = could not find container \"7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f\": container with ID starting with 7e8acebf119cad493c04d2edf1ef4a5635d40bf26b2367be6dc2a159f7e64d9f not found: ID does not exist" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.382947 4707 scope.go:117] "RemoveContainer" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.405275 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.406053 4707 scope.go:117] "RemoveContainer" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" Jan 21 15:33:32 crc kubenswrapper[4707]: E0121 15:33:32.411559 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a\": container with ID starting with 460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a not found: ID does not exist" containerID="460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.411591 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a"} err="failed to get container status \"460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a\": rpc error: code = NotFound desc = could not find container \"460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a\": container with ID starting with 460a8886d1b2388c102419ead561d3fb403b5902c9b2796b98b405a13f06e01a not found: ID does not exist" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.417890 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: E0121 15:33:32.418240 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-log" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.418254 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-log" Jan 21 15:33:32 crc kubenswrapper[4707]: E0121 15:33:32.418266 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-metadata" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.418271 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-metadata" Jan 21 15:33:32 crc kubenswrapper[4707]: E0121 15:33:32.418298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" containerName="nova-scheduler-scheduler" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.418303 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" containerName="nova-scheduler-scheduler" Jan 21 
15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.418463 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-metadata" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.418480 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" containerName="nova-scheduler-scheduler" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.418498 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" containerName="nova-metadata-log" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.419476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.421324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.421521 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.424413 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.429757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.436009 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.437059 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.438687 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.441630 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.619622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.619702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.619799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e1893-2e97-4caa-91e1-5c6621395737-logs\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.619892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-config-data\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.619932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2rtv\" (UniqueName: \"kubernetes.io/projected/952e1893-2e97-4caa-91e1-5c6621395737-kube-api-access-x2rtv\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.620063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.620116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxll6\" (UniqueName: \"kubernetes.io/projected/849d228c-4673-486b-9e38-e7567fd1ddb8-kube-api-access-sxll6\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.620151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-config-data\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e1893-2e97-4caa-91e1-5c6621395737-logs\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-config-data\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2rtv\" (UniqueName: \"kubernetes.io/projected/952e1893-2e97-4caa-91e1-5c6621395737-kube-api-access-x2rtv\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxll6\" (UniqueName: \"kubernetes.io/projected/849d228c-4673-486b-9e38-e7567fd1ddb8-kube-api-access-sxll6\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-config-data\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.721640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.723604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e1893-2e97-4caa-91e1-5c6621395737-logs\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.725057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.725071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.725067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-config-data\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.725756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-config-data\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.726291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc 
kubenswrapper[4707]: I0121 15:33:32.735092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2rtv\" (UniqueName: \"kubernetes.io/projected/952e1893-2e97-4caa-91e1-5c6621395737-kube-api-access-x2rtv\") pod \"nova-metadata-0\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.735856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxll6\" (UniqueName: \"kubernetes.io/projected/849d228c-4673-486b-9e38-e7567fd1ddb8-kube-api-access-sxll6\") pod \"nova-scheduler-0\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.737647 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:32 crc kubenswrapper[4707]: I0121 15:33:32.752030 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.113615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:33:33 crc kubenswrapper[4707]: W0121 15:33:33.178242 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod849d228c_4673_486b_9e38_e7567fd1ddb8.slice/crio-186014d69fbbb3a8b2ce9b8010a7ed7fde308ee8968888f97d9bcbfdd6212ed6 WatchSource:0}: Error finding container 186014d69fbbb3a8b2ce9b8010a7ed7fde308ee8968888f97d9bcbfdd6212ed6: Status 404 returned error can't find the container with id 186014d69fbbb3a8b2ce9b8010a7ed7fde308ee8968888f97d9bcbfdd6212ed6 Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.179149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.195792 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fcf3fe5-5cf0-4334-aec8-e4929894c12e" path="/var/lib/kubelet/pods/9fcf3fe5-5cf0-4334-aec8-e4929894c12e/volumes" Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.196396 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3e88a6-0f55-4de2-82db-7055f0ae3dfc" path="/var/lib/kubelet/pods/df3e88a6-0f55-4de2-82db-7055f0ae3dfc/volumes" Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.341224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"849d228c-4673-486b-9e38-e7567fd1ddb8","Type":"ContainerStarted","Data":"6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83"} Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.341404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"849d228c-4673-486b-9e38-e7567fd1ddb8","Type":"ContainerStarted","Data":"186014d69fbbb3a8b2ce9b8010a7ed7fde308ee8968888f97d9bcbfdd6212ed6"} Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.343945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"952e1893-2e97-4caa-91e1-5c6621395737","Type":"ContainerStarted","Data":"694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39"} Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.343982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"952e1893-2e97-4caa-91e1-5c6621395737","Type":"ContainerStarted","Data":"9f63b6c580a02d7a47db45c4b560fb26239699dbcab6e104dacbc4511e91128e"} Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.356199 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.356179388 podStartE2EDuration="1.356179388s" podCreationTimestamp="2026-01-21 15:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:33.351414308 +0000 UTC m=+1910.532930530" watchObservedRunningTime="2026-01-21 15:33:33.356179388 +0000 UTC m=+1910.537695610" Jan 21 15:33:33 crc kubenswrapper[4707]: I0121 15:33:33.367906 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.367894125 podStartE2EDuration="1.367894125s" podCreationTimestamp="2026-01-21 15:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:33.363616883 +0000 UTC m=+1910.545133105" watchObservedRunningTime="2026-01-21 15:33:33.367894125 +0000 UTC m=+1910.549410347" Jan 21 15:33:34 crc kubenswrapper[4707]: I0121 15:33:34.355076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"952e1893-2e97-4caa-91e1-5c6621395737","Type":"ContainerStarted","Data":"ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3"} Jan 21 15:33:37 crc kubenswrapper[4707]: I0121 15:33:37.738329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:37 crc kubenswrapper[4707]: I0121 15:33:37.738515 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:37 crc kubenswrapper[4707]: I0121 15:33:37.752987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:39 crc kubenswrapper[4707]: I0121 15:33:39.655390 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:39 crc kubenswrapper[4707]: I0121 15:33:39.655444 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:40 crc kubenswrapper[4707]: I0121 15:33:40.736953 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.56:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:33:40 crc kubenswrapper[4707]: I0121 15:33:40.736961 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.56:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:33:40 crc kubenswrapper[4707]: I0121 15:33:40.818234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:42 crc kubenswrapper[4707]: I0121 15:33:42.738685 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:42 crc kubenswrapper[4707]: I0121 15:33:42.738739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:42 crc kubenswrapper[4707]: I0121 15:33:42.753245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:42 crc kubenswrapper[4707]: I0121 15:33:42.774325 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:43 crc kubenswrapper[4707]: I0121 15:33:43.501590 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:33:43 crc kubenswrapper[4707]: I0121 15:33:43.747927 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.69:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:33:43 crc kubenswrapper[4707]: I0121 15:33:43.747964 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.69:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.070083 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.070271 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="0a86b73c-13a2-4142-819c-00d3db4ce9cd" containerName="kube-state-metrics" containerID="cri-o://a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5" gracePeriod=30 Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.186276 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:33:44 crc kubenswrapper[4707]: E0121 15:33:44.186494 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.452070 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.459576 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a86b73c-13a2-4142-819c-00d3db4ce9cd" containerID="a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5" exitCode=2 Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.459630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0a86b73c-13a2-4142-819c-00d3db4ce9cd","Type":"ContainerDied","Data":"a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5"} Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.459676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0a86b73c-13a2-4142-819c-00d3db4ce9cd","Type":"ContainerDied","Data":"45bd6777f067edf5baa854d100a2c0711696b9eb094ccab412e052761220acd9"} Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.459695 4707 scope.go:117] "RemoveContainer" containerID="a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.459641 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.481344 4707 scope.go:117] "RemoveContainer" containerID="a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5" Jan 21 15:33:44 crc kubenswrapper[4707]: E0121 15:33:44.481610 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5\": container with ID starting with a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5 not found: ID does not exist" containerID="a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.481637 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5"} err="failed to get container status \"a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5\": rpc error: code = NotFound desc = could not find container \"a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5\": container with ID starting with a493fa778374f48929fb8bf5b692b907da196adbec6e5c6106d38e95f3177ff5 not found: ID does not exist" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.517085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7gdv\" (UniqueName: \"kubernetes.io/projected/0a86b73c-13a2-4142-819c-00d3db4ce9cd-kube-api-access-x7gdv\") pod \"0a86b73c-13a2-4142-819c-00d3db4ce9cd\" (UID: \"0a86b73c-13a2-4142-819c-00d3db4ce9cd\") " Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.521739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a86b73c-13a2-4142-819c-00d3db4ce9cd-kube-api-access-x7gdv" (OuterVolumeSpecName: "kube-api-access-x7gdv") pod "0a86b73c-13a2-4142-819c-00d3db4ce9cd" (UID: "0a86b73c-13a2-4142-819c-00d3db4ce9cd"). InnerVolumeSpecName "kube-api-access-x7gdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.619565 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7gdv\" (UniqueName: \"kubernetes.io/projected/0a86b73c-13a2-4142-819c-00d3db4ce9cd-kube-api-access-x7gdv\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.783030 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.788659 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.798725 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:33:44 crc kubenswrapper[4707]: E0121 15:33:44.799101 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a86b73c-13a2-4142-819c-00d3db4ce9cd" containerName="kube-state-metrics" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.799117 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a86b73c-13a2-4142-819c-00d3db4ce9cd" containerName="kube-state-metrics" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.799332 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a86b73c-13a2-4142-819c-00d3db4ce9cd" containerName="kube-state-metrics" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.799926 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.801671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.801699 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.805093 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.821969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.822042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74pq\" (UniqueName: \"kubernetes.io/projected/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-api-access-c74pq\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.822213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.822373 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.924400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.924496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74pq\" (UniqueName: \"kubernetes.io/projected/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-api-access-c74pq\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.924593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.924688 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.927832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.927933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.928574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:44 crc kubenswrapper[4707]: I0121 15:33:44.938684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74pq\" (UniqueName: \"kubernetes.io/projected/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-api-access-c74pq\") pod \"kube-state-metrics-0\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 
15:33:45.122081 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.193493 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a86b73c-13a2-4142-819c-00d3db4ce9cd" path="/var/lib/kubelet/pods/0a86b73c-13a2-4142-819c-00d3db4ce9cd/volumes" Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.533228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.661133 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.661367 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-central-agent" containerID="cri-o://0272d7a8588ea0c3d1b48fe05807dd58adda544e876612236fa96ef68acdd7ca" gracePeriod=30 Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.661458 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="sg-core" containerID="cri-o://deae1009cf5b2522db18f13d04c635a3640ffb4fb6438965665632d3c4e7544a" gracePeriod=30 Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.661471 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-notification-agent" containerID="cri-o://81ccf685e562c5b05d32070822c0e14cb8cb62485bd2c5d4fab0dd270ab2932e" gracePeriod=30 Jan 21 15:33:45 crc kubenswrapper[4707]: I0121 15:33:45.661511 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="proxy-httpd" containerID="cri-o://a0fff428bdde860dd842fe26e36209a4da5de38749108578e230170a328e4bc1" gracePeriod=30 Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.477311 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerID="a0fff428bdde860dd842fe26e36209a4da5de38749108578e230170a328e4bc1" exitCode=0 Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.477515 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerID="deae1009cf5b2522db18f13d04c635a3640ffb4fb6438965665632d3c4e7544a" exitCode=2 Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.477524 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerID="0272d7a8588ea0c3d1b48fe05807dd58adda544e876612236fa96ef68acdd7ca" exitCode=0 Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.477370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerDied","Data":"a0fff428bdde860dd842fe26e36209a4da5de38749108578e230170a328e4bc1"} Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.477585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerDied","Data":"deae1009cf5b2522db18f13d04c635a3640ffb4fb6438965665632d3c4e7544a"} Jan 
21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.477599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerDied","Data":"0272d7a8588ea0c3d1b48fe05807dd58adda544e876612236fa96ef68acdd7ca"} Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.478876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"618755bc-3ad6-4081-9de4-d0f1f7c14020","Type":"ContainerStarted","Data":"3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e"} Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.478907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"618755bc-3ad6-4081-9de4-d0f1f7c14020","Type":"ContainerStarted","Data":"22d482e15c3e317cca75a50234e35f28c5f7da65fc761d014e05518c8f48768f"} Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.479007 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:46 crc kubenswrapper[4707]: I0121 15:33:46.492232 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.220717222 podStartE2EDuration="2.492220282s" podCreationTimestamp="2026-01-21 15:33:44 +0000 UTC" firstStartedPulling="2026-01-21 15:33:45.536868597 +0000 UTC m=+1922.718384819" lastFinishedPulling="2026-01-21 15:33:45.808371658 +0000 UTC m=+1922.989887879" observedRunningTime="2026-01-21 15:33:46.488297074 +0000 UTC m=+1923.669813297" watchObservedRunningTime="2026-01-21 15:33:46.492220282 +0000 UTC m=+1923.673736503" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.490167 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerID="81ccf685e562c5b05d32070822c0e14cb8cb62485bd2c5d4fab0dd270ab2932e" exitCode=0 Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.490232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerDied","Data":"81ccf685e562c5b05d32070822c0e14cb8cb62485bd2c5d4fab0dd270ab2932e"} Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.550562 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-combined-ca-bundle\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-sg-core-conf-yaml\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-run-httpd\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-log-httpd\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-scripts\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b62w\" (UniqueName: \"kubernetes.io/projected/0fe2f662-8806-442a-81fe-a5d62acdc376-kube-api-access-7b62w\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.570899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-config-data\") pod \"0fe2f662-8806-442a-81fe-a5d62acdc376\" (UID: \"0fe2f662-8806-442a-81fe-a5d62acdc376\") " Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.572875 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.572892 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe2f662-8806-442a-81fe-a5d62acdc376-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.576241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-scripts" (OuterVolumeSpecName: "scripts") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.577960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe2f662-8806-442a-81fe-a5d62acdc376-kube-api-access-7b62w" (OuterVolumeSpecName: "kube-api-access-7b62w") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "kube-api-access-7b62w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.597933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.629931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.646482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-config-data" (OuterVolumeSpecName: "config-data") pod "0fe2f662-8806-442a-81fe-a5d62acdc376" (UID: "0fe2f662-8806-442a-81fe-a5d62acdc376"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.673905 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.673929 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b62w\" (UniqueName: \"kubernetes.io/projected/0fe2f662-8806-442a-81fe-a5d62acdc376-kube-api-access-7b62w\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.673940 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.673949 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:47 crc kubenswrapper[4707]: I0121 15:33:47.673957 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe2f662-8806-442a-81fe-a5d62acdc376-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.499341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0fe2f662-8806-442a-81fe-a5d62acdc376","Type":"ContainerDied","Data":"ea049c371492d375d13c0b996c8075e592d0f9297393006f929d4c0045ae78b3"} Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.499393 4707 scope.go:117] "RemoveContainer" containerID="a0fff428bdde860dd842fe26e36209a4da5de38749108578e230170a328e4bc1" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.499406 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.525721 4707 scope.go:117] "RemoveContainer" containerID="deae1009cf5b2522db18f13d04c635a3640ffb4fb6438965665632d3c4e7544a" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.526793 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.535756 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.545379 4707 scope.go:117] "RemoveContainer" containerID="81ccf685e562c5b05d32070822c0e14cb8cb62485bd2c5d4fab0dd270ab2932e" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.551726 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:48 crc kubenswrapper[4707]: E0121 15:33:48.552078 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-central-agent" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552095 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-central-agent" Jan 21 15:33:48 crc kubenswrapper[4707]: E0121 15:33:48.552111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-notification-agent" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552118 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-notification-agent" Jan 21 15:33:48 crc kubenswrapper[4707]: E0121 15:33:48.552127 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="sg-core" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552132 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="sg-core" Jan 21 15:33:48 crc kubenswrapper[4707]: E0121 15:33:48.552145 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="proxy-httpd" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="proxy-httpd" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552300 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-central-agent" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552316 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="sg-core" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552322 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="proxy-httpd" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.552343 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" containerName="ceilometer-notification-agent" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.553779 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.559152 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.559371 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.559411 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.561174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.576155 4707 scope.go:117] "RemoveContainer" containerID="0272d7a8588ea0c3d1b48fe05807dd58adda544e876612236fa96ef68acdd7ca" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-log-httpd\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkw2r\" (UniqueName: \"kubernetes.io/projected/aed83d62-c811-4d58-93c0-75d2af7f4474-kube-api-access-dkw2r\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-scripts\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584839 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-run-httpd\") pod 
\"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.584912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-config-data\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-log-httpd\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkw2r\" (UniqueName: \"kubernetes.io/projected/aed83d62-c811-4d58-93c0-75d2af7f4474-kube-api-access-dkw2r\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-scripts\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-run-httpd\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-config-data\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.686942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-log-httpd\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.687153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-run-httpd\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.690167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.690173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.690282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-scripts\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.691042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.691425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-config-data\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.699926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkw2r\" (UniqueName: \"kubernetes.io/projected/aed83d62-c811-4d58-93c0-75d2af7f4474-kube-api-access-dkw2r\") pod \"ceilometer-0\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:48 crc kubenswrapper[4707]: I0121 15:33:48.871894 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.190665 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fe2f662-8806-442a-81fe-a5d62acdc376" path="/var/lib/kubelet/pods/0fe2f662-8806-442a-81fe-a5d62acdc376/volumes" Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.249525 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:49 crc kubenswrapper[4707]: W0121 15:33:49.250802 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed83d62_c811_4d58_93c0_75d2af7f4474.slice/crio-a967aa3c9f2067c8ffe52cbe6d5fcb923dfcbfa5f1dcf4eec926c50f3c7ea91f WatchSource:0}: Error finding container a967aa3c9f2067c8ffe52cbe6d5fcb923dfcbfa5f1dcf4eec926c50f3c7ea91f: Status 404 returned error can't find the container with id a967aa3c9f2067c8ffe52cbe6d5fcb923dfcbfa5f1dcf4eec926c50f3c7ea91f Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.507377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerStarted","Data":"a967aa3c9f2067c8ffe52cbe6d5fcb923dfcbfa5f1dcf4eec926c50f3c7ea91f"} Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.658317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.658706 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.660486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:49 crc kubenswrapper[4707]: I0121 15:33:49.660530 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:50 crc kubenswrapper[4707]: I0121 15:33:50.525010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerStarted","Data":"d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a"} Jan 21 15:33:50 crc kubenswrapper[4707]: I0121 15:33:50.525221 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:50 crc kubenswrapper[4707]: I0121 15:33:50.528852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:51 crc kubenswrapper[4707]: I0121 15:33:51.531624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerStarted","Data":"8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49"} Jan 21 15:33:52 crc kubenswrapper[4707]: I0121 15:33:52.543277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerStarted","Data":"c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf"} Jan 21 15:33:52 crc kubenswrapper[4707]: I0121 15:33:52.743145 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:52 crc kubenswrapper[4707]: I0121 15:33:52.749797 
4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:52 crc kubenswrapper[4707]: I0121 15:33:52.762592 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.012234 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.164402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.476150 4707 scope.go:117] "RemoveContainer" containerID="bbef33fc947becdd1d5558d859c4bca846cce597aa59ea98ba205b9d3e17dcc1" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.496955 4707 scope.go:117] "RemoveContainer" containerID="a38a1e887943798f2b7cda2bb98ad15eef8f01dd0e9584fb4d01c907a2a4857a" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.552241 4707 scope.go:117] "RemoveContainer" containerID="7070afec529f9d56c29393f342c1eb1b439a894789e2ac24b3d987a7804f4866" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.578381 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerStarted","Data":"e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec"} Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.579293 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.580671 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-log" containerID="cri-o://7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6" gracePeriod=30 Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.581555 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-api" containerID="cri-o://9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d" gracePeriod=30 Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.597590 4707 scope.go:117] "RemoveContainer" containerID="d567d73287d29adb0661d01c4c04f9722e6892b396a786bbaae79534a669a0dd" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.605224 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.610837 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.069441703 podStartE2EDuration="5.610823656s" podCreationTimestamp="2026-01-21 15:33:48 +0000 UTC" firstStartedPulling="2026-01-21 15:33:49.252757681 +0000 UTC m=+1926.434273903" lastFinishedPulling="2026-01-21 15:33:52.794139644 +0000 UTC m=+1929.975655856" observedRunningTime="2026-01-21 15:33:53.609293789 +0000 UTC m=+1930.790810011" watchObservedRunningTime="2026-01-21 15:33:53.610823656 +0000 UTC m=+1930.792339877" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.642056 4707 scope.go:117] "RemoveContainer" containerID="3209d9296b843327d03e27d6b54b024ed915b3086eafcf41994f7ec3896c4560" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 
15:33:53.689907 4707 scope.go:117] "RemoveContainer" containerID="33ee19dacd4b8460c57308ceb725cbdeac6589f0322cef31b06d130e43027761" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.735516 4707 scope.go:117] "RemoveContainer" containerID="da9ab0487ecb52d6afd10b061c395b405c6b662ea77f16f3551cb9bae154af0e" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.762625 4707 scope.go:117] "RemoveContainer" containerID="f89ab992dfc657152c4b63f182969f628ca255996d2039d51a99b906f27be1d7" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.806754 4707 scope.go:117] "RemoveContainer" containerID="e27f7fb12ce72b230d303a6c5514f136d2f3b7438b371e165d8f5c1d2e221f5e" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.878600 4707 scope.go:117] "RemoveContainer" containerID="27ef52ef9ab5a10942362bb34bdfe0da4ea9479f559ae274a3e66b2fb8b006ae" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.925001 4707 scope.go:117] "RemoveContainer" containerID="ed215b0d90d236ebda6b3d56bffa7d616362beff92e265c250edc13c4aca5a9e" Jan 21 15:33:53 crc kubenswrapper[4707]: I0121 15:33:53.965320 4707 scope.go:117] "RemoveContainer" containerID="642edbea8c4d70be6bbceee8f568996d8a9009063407fe39ae292e3d4b7f4599" Jan 21 15:33:54 crc kubenswrapper[4707]: I0121 15:33:54.588084 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce71b678-6366-4a72-9175-3428db36ed53" containerID="7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6" exitCode=143 Jan 21 15:33:54 crc kubenswrapper[4707]: I0121 15:33:54.588469 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-central-agent" containerID="cri-o://d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a" gracePeriod=30 Jan 21 15:33:54 crc kubenswrapper[4707]: I0121 15:33:54.588687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ce71b678-6366-4a72-9175-3428db36ed53","Type":"ContainerDied","Data":"7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6"} Jan 21 15:33:54 crc kubenswrapper[4707]: I0121 15:33:54.589514 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="proxy-httpd" containerID="cri-o://e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec" gracePeriod=30 Jan 21 15:33:54 crc kubenswrapper[4707]: I0121 15:33:54.589573 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="sg-core" containerID="cri-o://c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf" gracePeriod=30 Jan 21 15:33:54 crc kubenswrapper[4707]: I0121 15:33:54.589608 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-notification-agent" containerID="cri-o://8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49" gracePeriod=30 Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.128769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.606503 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerID="e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec" exitCode=0 Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.606530 4707 generic.go:334] "Generic (PLEG): container finished" podID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerID="c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf" exitCode=2 Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.606538 4707 generic.go:334] "Generic (PLEG): container finished" podID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerID="8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49" exitCode=0 Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.607511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerDied","Data":"e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec"} Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.607541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerDied","Data":"c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf"} Jan 21 15:33:55 crc kubenswrapper[4707]: I0121 15:33:55.607552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerDied","Data":"8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49"} Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.487858 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkw2r\" (UniqueName: \"kubernetes.io/projected/aed83d62-c811-4d58-93c0-75d2af7f4474-kube-api-access-dkw2r\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-ceilometer-tls-certs\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-log-httpd\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-scripts\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-combined-ca-bundle\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc 
kubenswrapper[4707]: I0121 15:33:56.555778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-run-httpd\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555833 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-sg-core-conf-yaml\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.555891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-config-data\") pod \"aed83d62-c811-4d58-93c0-75d2af7f4474\" (UID: \"aed83d62-c811-4d58-93c0-75d2af7f4474\") " Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.557515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.557650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.561783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed83d62-c811-4d58-93c0-75d2af7f4474-kube-api-access-dkw2r" (OuterVolumeSpecName: "kube-api-access-dkw2r") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "kube-api-access-dkw2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.563974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-scripts" (OuterVolumeSpecName: "scripts") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.577107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.592786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.611907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.616255 4707 generic.go:334] "Generic (PLEG): container finished" podID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerID="d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a" exitCode=0 Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.616287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerDied","Data":"d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a"} Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.616311 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.616327 4707 scope.go:117] "RemoveContainer" containerID="e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.616316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aed83d62-c811-4d58-93c0-75d2af7f4474","Type":"ContainerDied","Data":"a967aa3c9f2067c8ffe52cbe6d5fcb923dfcbfa5f1dcf4eec926c50f3c7ea91f"} Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.626029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-config-data" (OuterVolumeSpecName: "config-data") pod "aed83d62-c811-4d58-93c0-75d2af7f4474" (UID: "aed83d62-c811-4d58-93c0-75d2af7f4474"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.651126 4707 scope.go:117] "RemoveContainer" containerID="c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658618 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658642 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkw2r\" (UniqueName: \"kubernetes.io/projected/aed83d62-c811-4d58-93c0-75d2af7f4474-kube-api-access-dkw2r\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658662 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658670 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658678 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658685 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed83d62-c811-4d58-93c0-75d2af7f4474-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.658693 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aed83d62-c811-4d58-93c0-75d2af7f4474-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.670342 4707 scope.go:117] "RemoveContainer" containerID="8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.686011 4707 scope.go:117] "RemoveContainer" containerID="d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.700579 4707 scope.go:117] "RemoveContainer" containerID="e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.700890 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec\": container with ID starting with e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec not found: ID does not exist" containerID="e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.700931 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec"} err="failed to get container status \"e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec\": rpc error: code = NotFound desc = could not find container \"e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec\": container with ID starting with e010d988f8ac18133e04ecaf870c7ccf75ac1124f73eb4b00fcf1a70b21a2cec not found: ID does not exist" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.700960 4707 scope.go:117] "RemoveContainer" containerID="c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.701204 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf\": container with ID starting with c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf not found: ID does not exist" containerID="c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.701234 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf"} err="failed to get container status \"c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf\": rpc error: code = NotFound desc = could not find container \"c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf\": container with ID starting with c437786a947117e49a64f0178ff4d65a054444c3272179282b823f6c18fe6fbf not found: ID does not exist" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.701253 4707 scope.go:117] "RemoveContainer" containerID="8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.701505 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49\": container with ID starting with 8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49 not found: ID does not exist" containerID="8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.701524 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49"} err="failed to get container status \"8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49\": rpc error: code = NotFound desc = could not find container \"8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49\": container with ID starting with 8265bb2178c96f59ba4022f417991e309f110077c1c6826adbf61c11a2115d49 not found: ID does not exist" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.701536 4707 scope.go:117] "RemoveContainer" containerID="d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.701838 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a\": container with ID starting with d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a not found: ID does not exist" 
containerID="d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.701872 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a"} err="failed to get container status \"d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a\": rpc error: code = NotFound desc = could not find container \"d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a\": container with ID starting with d9f79ad0178bd740a36248c395dd34f50780eca0ea8b7c6b703388778124508a not found: ID does not exist" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.944030 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.956877 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.965528 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.965899 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="sg-core" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.965915 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="sg-core" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.965943 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-central-agent" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.965948 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-central-agent" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.965966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-notification-agent" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.965971 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-notification-agent" Jan 21 15:33:56 crc kubenswrapper[4707]: E0121 15:33:56.965978 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="proxy-httpd" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.965984 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="proxy-httpd" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.966142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="proxy-httpd" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.966153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="sg-core" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.966161 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" containerName="ceilometer-central-agent" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.966170 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" 
containerName="ceilometer-notification-agent" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.975733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.975841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.978946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.979159 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:33:56 crc kubenswrapper[4707]: I0121 15:33:56.979410 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-log-httpd\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-scripts\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4t9\" (UniqueName: \"kubernetes.io/projected/598e50f3-fe68-4276-9dab-a81d954b1505-kube-api-access-mm4t9\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-config-data\") pod \"ceilometer-0\" (UID: 
\"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.064466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-run-httpd\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.165894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.165931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.165950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-log-httpd\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.165976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-scripts\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.165995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.166029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4t9\" (UniqueName: \"kubernetes.io/projected/598e50f3-fe68-4276-9dab-a81d954b1505-kube-api-access-mm4t9\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.166098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-config-data\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.166139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-run-httpd\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.166494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-run-httpd\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.166493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-log-httpd\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.170161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.170319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-config-data\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.171014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.171663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-scripts\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.171847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.179856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4t9\" (UniqueName: \"kubernetes.io/projected/598e50f3-fe68-4276-9dab-a81d954b1505-kube-api-access-mm4t9\") pod \"ceilometer-0\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.203649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed83d62-c811-4d58-93c0-75d2af7f4474" path="/var/lib/kubelet/pods/aed83d62-c811-4d58-93c0-75d2af7f4474/volumes" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.288827 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.381135 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.471449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce71b678-6366-4a72-9175-3428db36ed53-logs\") pod \"ce71b678-6366-4a72-9175-3428db36ed53\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.471953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-combined-ca-bundle\") pod \"ce71b678-6366-4a72-9175-3428db36ed53\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.472527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce71b678-6366-4a72-9175-3428db36ed53-logs" (OuterVolumeSpecName: "logs") pod "ce71b678-6366-4a72-9175-3428db36ed53" (UID: "ce71b678-6366-4a72-9175-3428db36ed53"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.472606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-config-data\") pod \"ce71b678-6366-4a72-9175-3428db36ed53\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.472859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4qh9\" (UniqueName: \"kubernetes.io/projected/ce71b678-6366-4a72-9175-3428db36ed53-kube-api-access-m4qh9\") pod \"ce71b678-6366-4a72-9175-3428db36ed53\" (UID: \"ce71b678-6366-4a72-9175-3428db36ed53\") " Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.473514 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce71b678-6366-4a72-9175-3428db36ed53-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.477175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce71b678-6366-4a72-9175-3428db36ed53-kube-api-access-m4qh9" (OuterVolumeSpecName: "kube-api-access-m4qh9") pod "ce71b678-6366-4a72-9175-3428db36ed53" (UID: "ce71b678-6366-4a72-9175-3428db36ed53"). InnerVolumeSpecName "kube-api-access-m4qh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.493514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce71b678-6366-4a72-9175-3428db36ed53" (UID: "ce71b678-6366-4a72-9175-3428db36ed53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.512994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-config-data" (OuterVolumeSpecName: "config-data") pod "ce71b678-6366-4a72-9175-3428db36ed53" (UID: "ce71b678-6366-4a72-9175-3428db36ed53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.575416 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.575448 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce71b678-6366-4a72-9175-3428db36ed53-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.575459 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4qh9\" (UniqueName: \"kubernetes.io/projected/ce71b678-6366-4a72-9175-3428db36ed53-kube-api-access-m4qh9\") on node \"crc\" DevicePath \"\"" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.624875 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce71b678-6366-4a72-9175-3428db36ed53" containerID="9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d" exitCode=0 Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.624908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ce71b678-6366-4a72-9175-3428db36ed53","Type":"ContainerDied","Data":"9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d"} Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.624928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ce71b678-6366-4a72-9175-3428db36ed53","Type":"ContainerDied","Data":"67b19545205492bc2592b88f091fd36f6c7f33cc997495826233a5861be2f226"} Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.624945 4707 scope.go:117] "RemoveContainer" containerID="9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.625664 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.654616 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.659803 4707 scope.go:117] "RemoveContainer" containerID="7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.663303 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.673946 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:57 crc kubenswrapper[4707]: E0121 15:33:57.674267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-log" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.674278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-log" Jan 21 15:33:57 crc kubenswrapper[4707]: E0121 15:33:57.674296 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-api" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.674301 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-api" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.674470 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-log" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.674479 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce71b678-6366-4a72-9175-3428db36ed53" containerName="nova-api-api" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.675329 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.682394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.683980 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.684212 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.686308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.701122 4707 scope.go:117] "RemoveContainer" containerID="9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.706186 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:33:57 crc kubenswrapper[4707]: E0121 15:33:57.709112 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d\": container with ID starting with 9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d not found: ID does not exist" containerID="9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.709151 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d"} err="failed to get container status \"9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d\": rpc error: code = NotFound desc = could not find container \"9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d\": container with ID starting with 9b9000eeecff5a53acad4205b09f1d93f2ce901cc457f12b3675c4ad9b5bbf7d not found: ID does not exist" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.709171 4707 scope.go:117] "RemoveContainer" containerID="7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6" Jan 21 15:33:57 crc kubenswrapper[4707]: E0121 15:33:57.710604 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6\": container with ID starting with 7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6 not found: ID does not exist" containerID="7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.710630 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6"} err="failed to get container status \"7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6\": rpc error: code = NotFound desc = could not find container \"7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6\": container with ID starting with 7184d389e383fcaf9506fff8469fe3d52d3274e709c7de909a4c37fc34a0c4d6 not found: ID does not exist" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.779263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-config-data\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.779416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-public-tls-certs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.779468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-logs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.779489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.779569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvx5l\" (UniqueName: \"kubernetes.io/projected/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-kube-api-access-qvx5l\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.779610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.881772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-public-tls-certs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.881853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-logs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.881878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.881988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvx5l\" (UniqueName: \"kubernetes.io/projected/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-kube-api-access-qvx5l\") pod \"nova-api-0\" (UID: 
\"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.882014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.882237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-config-data\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.883298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-logs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.886643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.887300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-config-data\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.891170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.891727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-public-tls-certs\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:57 crc kubenswrapper[4707]: I0121 15:33:57.896243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvx5l\" (UniqueName: \"kubernetes.io/projected/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-kube-api-access-qvx5l\") pod \"nova-api-0\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.000713 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.182664 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:33:58 crc kubenswrapper[4707]: E0121 15:33:58.183208 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.365026 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.638386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerStarted","Data":"e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da"} Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.638637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerStarted","Data":"47b0f8325f2eb1dbe5f9d291dbed57bab8ef080a8f802363739d9209d8176fab"} Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.642706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b4d7dc0f-9472-43f7-8f37-adb51c82f84e","Type":"ContainerStarted","Data":"95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e"} Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.642735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b4d7dc0f-9472-43f7-8f37-adb51c82f84e","Type":"ContainerStarted","Data":"e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd"} Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.642745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b4d7dc0f-9472-43f7-8f37-adb51c82f84e","Type":"ContainerStarted","Data":"bbf921ac27aaf3a7f162f8de1af4bf9ce73a971377aac2ab5423750b82f0183d"} Jan 21 15:33:58 crc kubenswrapper[4707]: I0121 15:33:58.667044 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.66703248 podStartE2EDuration="1.66703248s" podCreationTimestamp="2026-01-21 15:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:33:58.663973308 +0000 UTC m=+1935.845489530" watchObservedRunningTime="2026-01-21 15:33:58.66703248 +0000 UTC m=+1935.848548702" Jan 21 15:33:59 crc kubenswrapper[4707]: I0121 15:33:59.191629 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce71b678-6366-4a72-9175-3428db36ed53" path="/var/lib/kubelet/pods/ce71b678-6366-4a72-9175-3428db36ed53/volumes" Jan 21 15:33:59 crc kubenswrapper[4707]: I0121 15:33:59.655349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerStarted","Data":"4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a"} Jan 21 15:33:59 crc kubenswrapper[4707]: I0121 15:33:59.655392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerStarted","Data":"3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19"} Jan 21 15:34:01 crc kubenswrapper[4707]: I0121 15:34:01.674202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerStarted","Data":"6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339"} Jan 21 15:34:01 crc kubenswrapper[4707]: I0121 15:34:01.674421 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:34:01 crc kubenswrapper[4707]: I0121 15:34:01.696359 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.782082986 podStartE2EDuration="5.696343508s" podCreationTimestamp="2026-01-21 15:33:56 +0000 UTC" firstStartedPulling="2026-01-21 15:33:57.712083182 +0000 UTC m=+1934.893599404" lastFinishedPulling="2026-01-21 15:34:00.626343704 +0000 UTC m=+1937.807859926" observedRunningTime="2026-01-21 15:34:01.687265609 +0000 UTC m=+1938.868781831" watchObservedRunningTime="2026-01-21 15:34:01.696343508 +0000 UTC m=+1938.877859730" Jan 21 15:34:08 crc kubenswrapper[4707]: I0121 15:34:08.001685 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:08 crc kubenswrapper[4707]: I0121 15:34:08.002097 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:09 crc kubenswrapper[4707]: I0121 15:34:09.015929 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.95:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:34:09 crc kubenswrapper[4707]: I0121 15:34:09.015977 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.95:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:34:11 crc kubenswrapper[4707]: I0121 15:34:11.183122 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:34:11 crc kubenswrapper[4707]: E0121 15:34:11.183636 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:34:18 crc kubenswrapper[4707]: I0121 15:34:18.006450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:18 crc 
kubenswrapper[4707]: I0121 15:34:18.007018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:18 crc kubenswrapper[4707]: I0121 15:34:18.008433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:18 crc kubenswrapper[4707]: I0121 15:34:18.011645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:18 crc kubenswrapper[4707]: I0121 15:34:18.790940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:18 crc kubenswrapper[4707]: I0121 15:34:18.799051 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:23 crc kubenswrapper[4707]: I0121 15:34:23.187029 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:34:23 crc kubenswrapper[4707]: E0121 15:34:23.188407 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:34:27 crc kubenswrapper[4707]: I0121 15:34:27.294562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.410994 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.412085 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" containerName="openstackclient" containerID="cri-o://9998f408842af3a6508e5983cb44f2ecb94e754e96c36fd322e289d1600aab84" gracePeriod=2 Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.428720 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.469402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.542860 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-69db-account-create-update-p864k"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.558555 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-69db-account-create-update-p864k"] Jan 21 15:34:30 crc kubenswrapper[4707]: E0121 15:34:30.589880 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:34:30 crc kubenswrapper[4707]: E0121 15:34:30.589928 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data podName:c1a47423-6eba-4cf7-ab2d-db338f4f28a6 nodeName:}" failed. No retries permitted until 2026-01-21 15:34:31.089913686 +0000 UTC m=+1968.271429897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data") pod "rabbitmq-server-0" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6") : configmap "rabbitmq-config-data" not found Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.590097 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-2t28b"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.607899 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-2t28b"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.678859 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-vpps7"] Jan 21 15:34:30 crc kubenswrapper[4707]: E0121 15:34:30.679275 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" containerName="openstackclient" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.679292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" containerName="openstackclient" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.679456 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" containerName="openstackclient" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.680083 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.686244 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.694918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-vpps7"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.707872 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.708926 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.712852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.724372 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.733836 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.739237 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-26rc6"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.791897 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-2tzhb"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.793002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-operator-scripts\") pod \"nova-cell1-6cf9-account-create-update-kbxc6\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.793096 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.793244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-operator-scripts\") pod \"glance-4d50-account-create-update-vpps7\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.793333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k7dc\" (UniqueName: \"kubernetes.io/projected/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-kube-api-access-7k7dc\") pod \"nova-cell1-6cf9-account-create-update-kbxc6\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.793432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5297m\" (UniqueName: \"kubernetes.io/projected/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-kube-api-access-5297m\") pod \"glance-4d50-account-create-update-vpps7\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.839455 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.842272 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.859614 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.862889 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.866578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.873933 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.908984 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.920607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-operator-scripts\") pod \"nova-cell1-6cf9-account-create-update-kbxc6\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.910539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-operator-scripts\") pod \"nova-cell1-6cf9-account-create-update-kbxc6\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.923130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-operator-scripts\") pod \"glance-4d50-account-create-update-vpps7\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.923161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46f052f-cdf5-442f-9628-b1aacddca32d-operator-scripts\") pod \"root-account-create-update-2tzhb\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.923192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7dc\" (UniqueName: \"kubernetes.io/projected/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-kube-api-access-7k7dc\") pod \"nova-cell1-6cf9-account-create-update-kbxc6\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.931097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-operator-scripts\") pod \"glance-4d50-account-create-update-vpps7\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.931594 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/root-account-create-update-2tzhb"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.934224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5297m\" (UniqueName: \"kubernetes.io/projected/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-kube-api-access-5297m\") pod \"glance-4d50-account-create-update-vpps7\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.934274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6659\" (UniqueName: \"kubernetes.io/projected/e46f052f-cdf5-442f-9628-b1aacddca32d-kube-api-access-f6659\") pod \"root-account-create-update-2tzhb\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.967918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6"] Jan 21 15:34:30 crc kubenswrapper[4707]: I0121 15:34:30.993277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k7dc\" (UniqueName: \"kubernetes.io/projected/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-kube-api-access-7k7dc\") pod \"nova-cell1-6cf9-account-create-update-kbxc6\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.003240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5297m\" (UniqueName: \"kubernetes.io/projected/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-kube-api-access-5297m\") pod \"glance-4d50-account-create-update-vpps7\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.010447 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.024230 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.036975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84510089-d853-4404-9d37-9130c56ad8aa-operator-scripts\") pod \"nova-cell0-2a48-account-create-update-qfct4\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.037178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqksn\" (UniqueName: \"kubernetes.io/projected/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-kube-api-access-hqksn\") pod \"cinder-8e80-account-create-update-sdxm6\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.037212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-operator-scripts\") pod \"cinder-8e80-account-create-update-sdxm6\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.037627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.038220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46f052f-cdf5-442f-9628-b1aacddca32d-operator-scripts\") pod \"root-account-create-update-2tzhb\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.038410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljxxn\" (UniqueName: \"kubernetes.io/projected/84510089-d853-4404-9d37-9130c56ad8aa-kube-api-access-ljxxn\") pod \"nova-cell0-2a48-account-create-update-qfct4\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.038496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6659\" (UniqueName: \"kubernetes.io/projected/e46f052f-cdf5-442f-9628-b1aacddca32d-kube-api-access-f6659\") pod \"root-account-create-update-2tzhb\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.038778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46f052f-cdf5-442f-9628-b1aacddca32d-operator-scripts\") pod \"root-account-create-update-2tzhb\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.044011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:34:31 crc 
kubenswrapper[4707]: I0121 15:34:31.044359 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="openstack-network-exporter" containerID="cri-o://e55d327aaedda39102f9591ab335da29319ef38cfe196bb243d1ce6b2b26dfed" gracePeriod=300 Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.062395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.076227 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-6b65-account-create-update-gmzcs"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.103221 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-8xfcb"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.104341 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.108988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.109426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6659\" (UniqueName: \"kubernetes.io/projected/e46f052f-cdf5-442f-9628-b1aacddca32d-kube-api-access-f6659\") pod \"root-account-create-update-2tzhb\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.119824 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.141078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84510089-d853-4404-9d37-9130c56ad8aa-operator-scripts\") pod \"nova-cell0-2a48-account-create-update-qfct4\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.141242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqksn\" (UniqueName: \"kubernetes.io/projected/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-kube-api-access-hqksn\") pod \"cinder-8e80-account-create-update-sdxm6\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.141920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-operator-scripts\") pod \"cinder-8e80-account-create-update-sdxm6\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.142067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts\") pod \"placement-9339-account-create-update-8xfcb\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.142149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84510089-d853-4404-9d37-9130c56ad8aa-operator-scripts\") pod \"nova-cell0-2a48-account-create-update-qfct4\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.141211 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-8xfcb"] Jan 21 15:34:31 crc kubenswrapper[4707]: E0121 15:34:31.142313 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.142433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9t4l\" (UniqueName: \"kubernetes.io/projected/9f46d10f-704e-4815-8636-7d8dcb666e0c-kube-api-access-c9t4l\") pod \"placement-9339-account-create-update-8xfcb\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.142504 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljxxn\" (UniqueName: \"kubernetes.io/projected/84510089-d853-4404-9d37-9130c56ad8aa-kube-api-access-ljxxn\") pod \"nova-cell0-2a48-account-create-update-qfct4\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " 
pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.142783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-operator-scripts\") pod \"cinder-8e80-account-create-update-sdxm6\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: E0121 15:34:31.142843 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data podName:c1a47423-6eba-4cf7-ab2d-db338f4f28a6 nodeName:}" failed. No retries permitted until 2026-01-21 15:34:32.142801095 +0000 UTC m=+1969.324317317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data") pod "rabbitmq-server-0" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6") : configmap "rabbitmq-config-data" not found Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.168143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljxxn\" (UniqueName: \"kubernetes.io/projected/84510089-d853-4404-9d37-9130c56ad8aa-kube-api-access-ljxxn\") pod \"nova-cell0-2a48-account-create-update-qfct4\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.176598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqksn\" (UniqueName: \"kubernetes.io/projected/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-kube-api-access-hqksn\") pod \"cinder-8e80-account-create-update-sdxm6\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.222649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b9b4ed-3277-4cbf-b22b-377bb4fb33f7" path="/var/lib/kubelet/pods/08b9b4ed-3277-4cbf-b22b-377bb4fb33f7/volumes" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.223184 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c38ceab-e8d7-47cb-bf82-437d544633dc" path="/var/lib/kubelet/pods/5c38ceab-e8d7-47cb-bf82-437d544633dc/volumes" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.223658 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811c1581-db32-4ad5-abfd-cb2bd09575d4" path="/var/lib/kubelet/pods/811c1581-db32-4ad5-abfd-cb2bd09575d4/volumes" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.224148 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2eac07c-10c9-47d6-a2cd-6da78fb97ee0" path="/var/lib/kubelet/pods/e2eac07c-10c9-47d6-a2cd-6da78fb97ee0/volumes" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.243387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts\") pod \"placement-9339-account-create-update-8xfcb\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.243455 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-c9t4l\" (UniqueName: \"kubernetes.io/projected/9f46d10f-704e-4815-8636-7d8dcb666e0c-kube-api-access-c9t4l\") pod \"placement-9339-account-create-update-8xfcb\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.244297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts\") pod \"placement-9339-account-create-update-8xfcb\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.251350 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.268961 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.291843 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-6nmsr"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.297940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9t4l\" (UniqueName: \"kubernetes.io/projected/9f46d10f-704e-4815-8636-7d8dcb666e0c-kube-api-access-c9t4l\") pod \"placement-9339-account-create-update-8xfcb\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.317968 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-n7pdl"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.342931 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.361384 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-n7pdl"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.389569 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-6nmsr"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.410025 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-sfbrk"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.447853 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.454133 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="ovsdbserver-sb" containerID="cri-o://8846341a10e36d0ac832f88ab6dd92585a6e11983d8927e565d6d370ee499686" gracePeriod=300 Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.461442 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-t9nz7"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.478447 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.479063 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="openstack-network-exporter" containerID="cri-o://b87047431e2fca4a31a430cebbb8e750f9e99cbfe863d8680da80f4d138bf6ce" gracePeriod=300 Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.489470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-2f5kd"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.521860 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-2f5kd"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.528009 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-j4zfc"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.550995 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-j4zfc"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.578359 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.583141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-jrrrh"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.604356 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-jrrrh"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.691737 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="ovsdbserver-nb" containerID="cri-o://6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5" gracePeriod=300 Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.783889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.798645 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-cfgpp"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.809010 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-rt85p"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.812784 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-rt85p"] Jan 21 15:34:31 crc kubenswrapper[4707]: E0121 15:34:31.932019 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:31 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:31 crc kubenswrapper[4707]: Jan 21 15:34:31 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:31 crc kubenswrapper[4707]: Jan 21 15:34:31 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:31 crc kubenswrapper[4707]: Jan 21 15:34:31 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u 
root -P 3306" Jan 21 15:34:31 crc kubenswrapper[4707]: Jan 21 15:34:31 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:34:31 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:34:31 crc kubenswrapper[4707]: else Jan 21 15:34:31 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:31 crc kubenswrapper[4707]: fi Jan 21 15:34:31 crc kubenswrapper[4707]: Jan 21 15:34:31 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:31 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:31 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:31 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:31 crc kubenswrapper[4707]: # support updates Jan 21 15:34:31 crc kubenswrapper[4707]: Jan 21 15:34:31 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:31 crc kubenswrapper[4707]: E0121 15:34:31.934335 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" podUID="e3b620a6-bbfd-429f-bcbc-16235afb8fc5" Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.957043 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q"] Jan 21 15:34:31 crc kubenswrapper[4707]: I0121 15:34:31.989422 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kb84q"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.010794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-vpps7"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.022951 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_8376e997-f7b0-4432-997e-dbc14561fe5c/ovsdbserver-sb/0.log" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.022993 4707 generic.go:334] "Generic (PLEG): container finished" podID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerID="e55d327aaedda39102f9591ab335da29319ef38cfe196bb243d1ce6b2b26dfed" exitCode=2 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.023007 4707 generic.go:334] "Generic (PLEG): container finished" podID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerID="8846341a10e36d0ac832f88ab6dd92585a6e11983d8927e565d6d370ee499686" exitCode=143 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.023059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8376e997-f7b0-4432-997e-dbc14561fe5c","Type":"ContainerDied","Data":"e55d327aaedda39102f9591ab335da29319ef38cfe196bb243d1ce6b2b26dfed"} Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.023083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8376e997-f7b0-4432-997e-dbc14561fe5c","Type":"ContainerDied","Data":"8846341a10e36d0ac832f88ab6dd92585a6e11983d8927e565d6d370ee499686"} Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.024115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" 
event={"ID":"e3b620a6-bbfd-429f-bcbc-16235afb8fc5","Type":"ContainerStarted","Data":"8d67078be2917cdb59b82e62b39f46cab4989883cf2002516ec0453d21833d66"} Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.029860 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:32 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:34:32 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:34:32 crc kubenswrapper[4707]: else Jan 21 15:34:32 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:32 crc kubenswrapper[4707]: fi Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:32 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:32 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:32 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:32 crc kubenswrapper[4707]: # support updates Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.035707 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_b2bf03b0-3f33-46ca-8d6a-59d30612d2a5/ovsdbserver-nb/0.log" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.035759 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerID="b87047431e2fca4a31a430cebbb8e750f9e99cbfe863d8680da80f4d138bf6ce" exitCode=2 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.035775 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerID="6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5" exitCode=143 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.035797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5","Type":"ContainerDied","Data":"b87047431e2fca4a31a430cebbb8e750f9e99cbfe863d8680da80f4d138bf6ce"} Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.035834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5","Type":"ContainerDied","Data":"6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5"} Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.035883 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not 
found\"" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" podUID="e3b620a6-bbfd-429f-bcbc-16235afb8fc5" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.037162 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.037482 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="openstack-network-exporter" containerID="cri-o://0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.037470 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="ovn-northd" containerID="cri-o://a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.091586 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5 is running failed: container process not found" containerID="6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.099931 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5 is running failed: container process not found" containerID="6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.102496 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5 is running failed: container process not found" containerID="6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.102562 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="ovsdbserver-nb" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.120224 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-sbh8l"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.128794 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-sbh8l"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.136211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dvtbt"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.143559 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dvtbt"] Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.176434 4707 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod781be74c_c8d5_4cfa_8bb1_5a0434bb3823.slice/crio-conmon-0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod781be74c_c8d5_4cfa_8bb1_5a0434bb3823.slice/crio-a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.193237 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.199022 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data podName:c1a47423-6eba-4cf7-ab2d-db338f4f28a6 nodeName:}" failed. No retries permitted until 2026-01-21 15:34:34.198994521 +0000 UTC m=+1971.380510742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data") pod "rabbitmq-server-0" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6") : configmap "rabbitmq-config-data" not found Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.214433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-2tzhb"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.263662 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78749c84c4-t8xcs"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.263921 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-api" containerID="cri-o://f13d26d52d87a08e099fcd0268946a2ae4f7db7d6a421b118ad9da2d50e25f98" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.264283 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-httpd" containerID="cri-o://88c64ecf8afd910bd3c88e6e7d95ee4f70a40752b48f92f051811e5a100e6877" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.281185 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jzlsb"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.294073 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jzlsb"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.363416 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.363883 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-log" containerID="cri-o://2f3a47f92a10db39e366cd34b4513029b880fd9122a25ff4919afeff80052643" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.364006 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-httpd" containerID="cri-o://73da13975b9a0fea75976713ffc06ad0487d928c65d1dd4b7b9f66279d719295" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.411298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.411521 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-log" containerID="cri-o://a33bcbf9b6be5d2e1482cb5138bc02de1e5c5fd932db478a845ca93ad2949f3b" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.412060 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-httpd" containerID="cri-o://b02f577e02d6cede6688dd7b75593a1e9627c1ae835ea04f9d891f4fea9be490" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.432512 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-r8f6v"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.447074 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-r8f6v"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.449346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.456950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.487485 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.487908 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-server" containerID="cri-o://bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488216 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-updater" containerID="cri-o://f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488230 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-reaper" containerID="cri-o://df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488253 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-replicator" containerID="cri-o://d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488281 
4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-auditor" containerID="cri-o://56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488340 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-server" containerID="cri-o://105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488355 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-replicator" containerID="cri-o://69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488342 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-auditor" containerID="cri-o://4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488366 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-server" containerID="cri-o://befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488354 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-replicator" containerID="cri-o://1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488407 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-expirer" containerID="cri-o://857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488404 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="rsync" containerID="cri-o://0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488430 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="swift-recon-cron" containerID="cri-o://abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488394 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-auditor" containerID="cri-o://936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146" gracePeriod=30 
Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.488743 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-updater" containerID="cri-o://fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.527281 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.527501 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="cinder-scheduler" containerID="cri-o://ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.527865 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="probe" containerID="cri-o://630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.544770 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.545001 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-applier-0" podUID="933e570a-28bd-42d8-891f-0734578bc040" containerName="watcher-applier" containerID="cri-o://a5cfda8d9c91d4e73ddfaa7ddcab6f6a580e4abf30eecf6674e01a38f42e5830" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.566390 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:32 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:34:32 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:34:32 crc kubenswrapper[4707]: else Jan 21 15:34:32 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:32 crc kubenswrapper[4707]: fi Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:32 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:32 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:32 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:32 crc kubenswrapper[4707]: # support updates Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.567760 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" podUID="1a712880-3b94-4d6a-b137-bbe10fbf4ed4" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.575524 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-vpps7"] Jan 21 15:34:32 crc kubenswrapper[4707]: W0121 15:34:32.583144 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84510089_d853_4404_9d37_9130c56ad8aa.slice/crio-73a71acd78062f06c8b964037089777bc5495b64f2bb66bf5371a908de0bf27d WatchSource:0}: Error finding container 73a71acd78062f06c8b964037089777bc5495b64f2bb66bf5371a908de0bf27d: Status 404 returned error can't find the container with id 73a71acd78062f06c8b964037089777bc5495b64f2bb66bf5371a908de0bf27d Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.613409 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.614341 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api-log" containerID="cri-o://035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.614500 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api" containerID="cri-o://9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.623830 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:32 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:34:32 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:34:32 crc kubenswrapper[4707]: else Jan 21 15:34:32 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:32 crc kubenswrapper[4707]: fi Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:32 crc 
kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:32 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:32 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:32 crc kubenswrapper[4707]: # support updates Jan 21 15:34:32 crc kubenswrapper[4707]: Jan 21 15:34:32 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.626204 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6"] Jan 21 15:34:32 crc kubenswrapper[4707]: E0121 15:34:32.629051 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" podUID="84510089-d853-4404-9d37-9130c56ad8aa" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.631803 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.650677 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_b2bf03b0-3f33-46ca-8d6a-59d30612d2a5/ovsdbserver-nb/0.log" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.650758 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.657325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.681178 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_8376e997-f7b0-4432-997e-dbc14561fe5c/ovsdbserver-sb/0.log" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.681257 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.684111 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.684315 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-central-agent" containerID="cri-o://e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.684404 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="proxy-httpd" containerID="cri-o://6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.684441 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="sg-core" containerID="cri-o://4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.684470 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-notification-agent" containerID="cri-o://3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.752697 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.752984 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api-log" containerID="cri-o://b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.753122 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api" containerID="cri-o://9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.769979 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-dw9hb"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.802244 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-dw9hb"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95vxd\" (UniqueName: \"kubernetes.io/projected/8376e997-f7b0-4432-997e-dbc14561fe5c-kube-api-access-95vxd\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-config\") pod 
\"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824379 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdb-rundir\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-config\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-combined-ca-bundle\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824547 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-metrics-certs-tls-certs\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-metrics-certs-tls-certs\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-combined-ca-bundle\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdb-rundir\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-scripts\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: 
\"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-scripts\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-879x2\" (UniqueName: \"kubernetes.io/projected/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-kube-api-access-879x2\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdbserver-nb-tls-certs\") pod \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\" (UID: \"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.824890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdbserver-sb-tls-certs\") pod \"8376e997-f7b0-4432-997e-dbc14561fe5c\" (UID: \"8376e997-f7b0-4432-997e-dbc14561fe5c\") " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.825521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.825649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.825825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqbnz\" (UniqueName: \"kubernetes.io/projected/8238358f-4625-4642-84db-0834ff86d43b-kube-api-access-sqbnz\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.837179 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.837611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). 
InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.837937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.840210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-scripts" (OuterVolumeSpecName: "scripts") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.842893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-config" (OuterVolumeSpecName: "config") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.876209 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8376e997-f7b0-4432-997e-dbc14561fe5c-kube-api-access-95vxd" (OuterVolumeSpecName: "kube-api-access-95vxd") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "kube-api-access-95vxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.876267 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-tnspr"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.876473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-scripts" (OuterVolumeSpecName: "scripts") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.876731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-config" (OuterVolumeSpecName: "config") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.880760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.890434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-kube-api-access-879x2" (OuterVolumeSpecName: "kube-api-access-879x2") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "kube-api-access-879x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.899168 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-tnspr"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.915655 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.915928 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-decision-engine-0" podUID="4904f4a7-c8e7-4d41-9254-32ccb097df02" containerName="watcher-decision-engine" containerID="cri-o://dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e" gracePeriod=30 Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.928746 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-qcqbl"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.938208 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-qcqbl"] Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.944749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.944957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqbnz\" (UniqueName: \"kubernetes.io/projected/8238358f-4625-4642-84db-0834ff86d43b-kube-api-access-sqbnz\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945310 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945323 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-879x2\" (UniqueName: \"kubernetes.io/projected/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-kube-api-access-879x2\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945334 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95vxd\" (UniqueName: 
\"kubernetes.io/projected/8376e997-f7b0-4432-997e-dbc14561fe5c-kube-api-access-95vxd\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945343 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945350 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945359 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8376e997-f7b0-4432-997e-dbc14561fe5c-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945376 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945387 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945394 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.945402 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.948058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.948541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.965056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqbnz\" (UniqueName: \"kubernetes.io/projected/8238358f-4625-4642-84db-0834ff86d43b-kube-api-access-sqbnz\") pod \"dnsmasq-dnsmasq-84b9f45d47-9l6r6\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.980021 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:32 crc kubenswrapper[4707]: I0121 15:34:32.984828 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.011123 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.022128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.025102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.026946 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "placement" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.027084 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.029686 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" podUID="c1478145-b0f0-47c0-a3d1-d9bb50238e3b" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.029742 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" podUID="9f46d10f-704e-4815-8636-7d8dcb666e0c" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.029876 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.030180 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-log" containerID="cri-o://694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.030529 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="952e1893-2e97-4caa-91e1-5c6621395737" 
containerName="nova-metadata-metadata" containerID="cri-o://ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.064492 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.064538 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.064548 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.064557 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.091633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.101259 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.110680 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.111985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" event={"ID":"9f46d10f-704e-4815-8636-7d8dcb666e0c","Type":"ContainerStarted","Data":"549986bc094cc8610df33182d898677b6394064b175c654646e1daa1f778e056"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.112136 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-log" containerID="cri-o://e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.117897 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-api" containerID="cri-o://95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.125071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.140088 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5q9tj"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.148303 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5q9tj"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.151865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8376e997-f7b0-4432-997e-dbc14561fe5c" (UID: "8376e997-f7b0-4432-997e-dbc14561fe5c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.152748 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "placement" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.155425 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" podUID="9f46d10f-704e-4815-8636-7d8dcb666e0c" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.162614 4707 generic.go:334] "Generic (PLEG): container finished" podID="27a7cc38-e5af-494e-880c-45e5278e927a" containerID="88c64ecf8afd910bd3c88e6e7d95ee4f70a40752b48f92f051811e5a100e6877" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.162699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" event={"ID":"27a7cc38-e5af-494e-880c-45e5278e927a","Type":"ContainerDied","Data":"88c64ecf8afd910bd3c88e6e7d95ee4f70a40752b48f92f051811e5a100e6877"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.166074 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.166090 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.166099 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8376e997-f7b0-4432-997e-dbc14561fe5c-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.171565 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.171740 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" containerID="cri-o://6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.177515 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-8xfcb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.190976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" (UID: "b2bf03b0-3f33-46ca-8d6a-59d30612d2a5"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.235914 4707 generic.go:334] "Generic (PLEG): container finished" podID="598e50f3-fe68-4276-9dab-a81d954b1505" containerID="4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a" exitCode=2 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.245970 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10033319-b8c1-4443-9604-9e94be4e4159" path="/var/lib/kubelet/pods/10033319-b8c1-4443-9604-9e94be4e4159/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.247912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d59352-da6e-45fa-a721-5f7dc6361200" path="/var/lib/kubelet/pods/11d59352-da6e-45fa-a721-5f7dc6361200/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.248736 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15438c80-41a4-474c-8953-1cb1146c4dab" path="/var/lib/kubelet/pods/15438c80-41a4-474c-8953-1cb1146c4dab/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.249614 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2288c539-d799-4de2-8f62-020faad333ec" path="/var/lib/kubelet/pods/2288c539-d799-4de2-8f62-020faad333ec/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.255512 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3602a2ea-5d10-4d05-9963-99da2382de51" path="/var/lib/kubelet/pods/3602a2ea-5d10-4d05-9963-99da2382de51/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.259146 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.261638 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37cd4c00-bb9a-406e-af6c-427a36af5e74" path="/var/lib/kubelet/pods/37cd4c00-bb9a-406e-af6c-427a36af5e74/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.262290 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4329f46d-8b4d-45e8-a6a1-59ae09d3bdde" path="/var/lib/kubelet/pods/4329f46d-8b4d-45e8-a6a1-59ae09d3bdde/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.262837 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab2b0b7-4f0b-49d9-b217-893b083a786c" path="/var/lib/kubelet/pods/5ab2b0b7-4f0b-49d9-b217-893b083a786c/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.263106 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" podUID="c1478145-b0f0-47c0-a3d1-d9bb50238e3b" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.265412 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c499192-7911-4564-b833-1a941ba515a9" path="/var/lib/kubelet/pods/5c499192-7911-4564-b833-1a941ba515a9/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.267978 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.276168 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605f5170-dbeb-4c0a-a4e0-ca8ba1b93453" path="/var/lib/kubelet/pods/605f5170-dbeb-4c0a-a4e0-ca8ba1b93453/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.276797 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6113492f-9c05-4511-956e-8f19e045260d" path="/var/lib/kubelet/pods/6113492f-9c05-4511-956e-8f19e045260d/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.277377 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c8603f-b572-4c68-a083-e68ebd2faa67" path="/var/lib/kubelet/pods/73c8603f-b572-4c68-a083-e68ebd2faa67/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.287784 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77cbdfe0-44a8-4d32-8cf9-f050e71e5299" path="/var/lib/kubelet/pods/77cbdfe0-44a8-4d32-8cf9-f050e71e5299/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.288444 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5397d9-c1f9-472c-b182-06dbbc26381a" path="/var/lib/kubelet/pods/ae5397d9-c1f9-472c-b182-06dbbc26381a/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.288952 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c829483b-7aed-46a5-892c-1c6983e8aaf2" path="/var/lib/kubelet/pods/c829483b-7aed-46a5-892c-1c6983e8aaf2/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.289916 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eeff42ad-81d5-4dfc-a4d9-c68d7f37f984" path="/var/lib/kubelet/pods/eeff42ad-81d5-4dfc-a4d9-c68d7f37f984/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.290414 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb71e17c-c141-4845-a86e-c7d8643681e0" path="/var/lib/kubelet/pods/fb71e17c-c141-4845-a86e-c7d8643681e0/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.291072 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcddd37c-c543-435e-94d7-cf9c9fd75f0a" path="/var/lib/kubelet/pods/fcddd37c-c543-435e-94d7-cf9c9fd75f0a/volumes" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292166 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292208 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerDied","Data":"4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" event={"ID":"c1478145-b0f0-47c0-a3d1-d9bb50238e3b","Type":"ContainerStarted","Data":"7ad8a28e3ea9a57157023982ff196974a08ed5a3649cfb86565c44931148ead1"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292257 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292271 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6ccd55b46b-7mqqb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292284 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-cr6fw"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292298 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-cr6fw"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.292313 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-78f7-account-create-update-hsj98"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-h7lxj"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293743 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293770 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-h7lxj"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293782 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lmmvb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293802 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-5b6c-account-create-update-84qjg"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293832 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-lmmvb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293843 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-2mnwh"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293853 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-2mnwh"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.293866 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-8xfcb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.294098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-log" containerID="cri-o://5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.296876 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-api" containerID="cri-o://62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.301976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.312654 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="618755bc-3ad6-4081-9de4-d0f1f7c14020" containerName="kube-state-metrics" containerID="cri-o://3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.330739 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.334892 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_b2bf03b0-3f33-46ca-8d6a-59d30612d2a5/ovsdbserver-nb/0.log" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.335300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b2bf03b0-3f33-46ca-8d6a-59d30612d2a5","Type":"ContainerDied","Data":"42ac5c94e6abcfd6351ccd163c49c14943f7eb6b1e65192b2c6f324be55b3d61"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.335653 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_781be74c-c8d5-4cfa-8bb1-5a0434bb3823/ovn-northd/0.log" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.335802 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.336086 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.336316 4707 scope.go:117] "RemoveContainer" containerID="b87047431e2fca4a31a430cebbb8e750f9e99cbfe863d8680da80f4d138bf6ce" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.346793 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.347022 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-httpd" containerID="cri-o://d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.347291 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-server" containerID="cri-o://67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.356394 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerID="2f3a47f92a10db39e366cd34b4513029b880fd9122a25ff4919afeff80052643" exitCode=143 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.356456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7b5a0b6a-5b7d-4431-961c-984efc37166a","Type":"ContainerDied","Data":"2f3a47f92a10db39e366cd34b4513029b880fd9122a25ff4919afeff80052643"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.369486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" event={"ID":"1a712880-3b94-4d6a-b137-bbe10fbf4ed4","Type":"ContainerStarted","Data":"6bea8be2f791b2c951699f302a0b9f4da77842238474b87214dbf4a1809ae10b"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-config\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-rundir\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-combined-ca-bundle\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config\") pod \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 
15:34:33.371454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config-secret\") pod \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-metrics-certs-tls-certs\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwt5b\" (UniqueName: \"kubernetes.io/projected/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-kube-api-access-xwt5b\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-scripts\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-northd-tls-certs\") pod \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\" (UID: \"781be74c-c8d5-4cfa-8bb1-5a0434bb3823\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtfpx\" (UniqueName: \"kubernetes.io/projected/11f832c1-2aa2-45a2-b766-f0483cfda9e0-kube-api-access-wtfpx\") pod \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.371644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-combined-ca-bundle\") pod \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\" (UID: \"11f832c1-2aa2-45a2-b766-f0483cfda9e0\") " Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.372162 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 
15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.372519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-config" (OuterVolumeSpecName: "config") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.372927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-scripts" (OuterVolumeSpecName: "scripts") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.374133 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" podUID="1a712880-3b94-4d6a-b137-bbe10fbf4ed4" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.375368 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.375392 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.376251 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.376795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-kube-api-access-xwt5b" (OuterVolumeSpecName: "kube-api-access-xwt5b") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "kube-api-access-xwt5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.378482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk"] Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.379998 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="ovn-northd" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.380018 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="ovn-northd" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.380050 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="ovsdbserver-nb" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.380060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="ovsdbserver-nb" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.380084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="ovsdbserver-sb" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.380090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="ovsdbserver-sb" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.380237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.380370 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.380395 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.380443 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.380538 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.380545 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.381192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f832c1-2aa2-45a2-b766-f0483cfda9e0-kube-api-access-wtfpx" (OuterVolumeSpecName: "kube-api-access-wtfpx") pod "11f832c1-2aa2-45a2-b766-f0483cfda9e0" (UID: "11f832c1-2aa2-45a2-b766-f0483cfda9e0"). InnerVolumeSpecName "kube-api-access-wtfpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.381759 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.381799 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="ovn-northd" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.383074 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" containerName="ovsdbserver-sb" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.383099 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.383116 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerName="openstack-network-exporter" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.383141 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" containerName="ovsdbserver-nb" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.384217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.386457 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="rabbitmq" containerID="cri-o://23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce" gracePeriod=604800 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.388484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.391050 4707 generic.go:334] "Generic (PLEG): container finished" podID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" containerID="9998f408842af3a6508e5983cb44f2ecb94e754e96c36fd322e289d1600aab84" exitCode=137 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.391223 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.402081 4707 generic.go:334] "Generic (PLEG): container finished" podID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerID="6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e" exitCode=1 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.402348 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.402391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" event={"ID":"e46f052f-cdf5-442f-9628-b1aacddca32d","Type":"ContainerDied","Data":"6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.402424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" event={"ID":"e46f052f-cdf5-442f-9628-b1aacddca32d","Type":"ContainerStarted","Data":"0d86fcd0f00ebc9364f7780e6c946773788543a71a07d32f2fd906ea287f365f"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.404592 4707 scope.go:117] "RemoveContainer" containerID="6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.407715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" event={"ID":"84510089-d853-4404-9d37-9130c56ad8aa","Type":"ContainerStarted","Data":"73a71acd78062f06c8b964037089777bc5495b64f2bb66bf5371a908de0bf27d"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.415149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.462778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_781be74c-c8d5-4cfa-8bb1-5a0434bb3823/ovn-northd/0.log" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464677 4707 generic.go:334] "Generic (PLEG): container finished" podID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerID="0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9" exitCode=2 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464732 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464746 4707 generic.go:334] "Generic (PLEG): container finished" podID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" containerID="a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678" exitCode=143 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"781be74c-c8d5-4cfa-8bb1-5a0434bb3823","Type":"ContainerDied","Data":"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"781be74c-c8d5-4cfa-8bb1-5a0434bb3823","Type":"ContainerDied","Data":"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.464890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"781be74c-c8d5-4cfa-8bb1-5a0434bb3823","Type":"ContainerDied","Data":"77fb2fc44af5392306954d6db181bef200ccdb5ef65ec7a483ce8ae026d82ffa"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.471792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.478227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz645\" (UniqueName: \"kubernetes.io/projected/512bcb97-a919-42d4-8a18-67f62cbeecb6-kube-api-access-kz645\") pod \"keystone-cd48-account-create-update-dc7bk\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.478319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/512bcb97-a919-42d4-8a18-67f62cbeecb6-operator-scripts\") pod \"keystone-cd48-account-create-update-dc7bk\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.478452 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtfpx\" (UniqueName: \"kubernetes.io/projected/11f832c1-2aa2-45a2-b766-f0483cfda9e0-kube-api-access-wtfpx\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.478464 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.478476 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.478484 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwt5b\" (UniqueName: \"kubernetes.io/projected/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-kube-api-access-xwt5b\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.481775 4707 kuberuntime_manager.go:1274] "Unhandled Error" 
err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.482958 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" podUID="84510089-d853-4404-9d37-9130c56ad8aa" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.485719 4707 generic.go:334] "Generic (PLEG): container finished" podID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerID="b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d" exitCode=143 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.485781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"22fbbd00-3dc5-42f1-a015-24e2aac9be07","Type":"ContainerDied","Data":"b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.503722 4707 scope.go:117] "RemoveContainer" containerID="6a850fa2cfb883b2195860230b913c828dcca686cf6166a675bf299e792cd8c5" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.504060 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-7bdns"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.504433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11f832c1-2aa2-45a2-b766-f0483cfda9e0" (UID: "11f832c1-2aa2-45a2-b766-f0483cfda9e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.527406 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_8376e997-f7b0-4432-997e-dbc14561fe5c/ovsdbserver-sb/0.log" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.527526 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.528420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8376e997-f7b0-4432-997e-dbc14561fe5c","Type":"ContainerDied","Data":"fa02a16dc3c529c5d340f129dbaee472fc66fdc1f3dccc2e1783fc42f2092f59"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.536158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "11f832c1-2aa2-45a2-b766-f0483cfda9e0" (UID: "11f832c1-2aa2-45a2-b766-f0483cfda9e0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.538083 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerID="a33bcbf9b6be5d2e1482cb5138bc02de1e5c5fd932db478a845ca93ad2949f3b" exitCode=143 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.538142 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360","Type":"ContainerDied","Data":"a33bcbf9b6be5d2e1482cb5138bc02de1e5c5fd932db478a845ca93ad2949f3b"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.541070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "11f832c1-2aa2-45a2-b766-f0483cfda9e0" (UID: "11f832c1-2aa2-45a2-b766-f0483cfda9e0"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.568226 4707 generic.go:334] "Generic (PLEG): container finished" podID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerID="035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57" exitCode=143 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.568336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"1fd72c11-1e49-4359-811c-f7a96288df7b","Type":"ContainerDied","Data":"035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.593674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz645\" (UniqueName: \"kubernetes.io/projected/512bcb97-a919-42d4-8a18-67f62cbeecb6-kube-api-access-kz645\") pod \"keystone-cd48-account-create-update-dc7bk\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.593848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/512bcb97-a919-42d4-8a18-67f62cbeecb6-operator-scripts\") pod \"keystone-cd48-account-create-update-dc7bk\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.593969 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.593980 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f832c1-2aa2-45a2-b766-f0483cfda9e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.593990 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11f832c1-2aa2-45a2-b766-f0483cfda9e0-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.594548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/512bcb97-a919-42d4-8a18-67f62cbeecb6-operator-scripts\") pod \"keystone-cd48-account-create-update-dc7bk\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.607476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.620418 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.620632 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" containerID="cri-o://1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.622380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz645\" (UniqueName: \"kubernetes.io/projected/512bcb97-a919-42d4-8a18-67f62cbeecb6-kube-api-access-kz645\") pod \"keystone-cd48-account-create-update-dc7bk\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.635895 4707 scope.go:117] "RemoveContainer" containerID="9998f408842af3a6508e5983cb44f2ecb94e754e96c36fd322e289d1600aab84" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646836 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646863 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646872 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646879 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646885 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646891 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646897 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646903 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646909 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa" exitCode=0 Jan 21 15:34:33 crc 
kubenswrapper[4707]: I0121 15:34:33.646915 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646921 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.646926 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe" exitCode=0 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647380 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa"} Jan 21 
15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.647422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe"} Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.648906 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9nsxq"] Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.651166 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:33 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:34:33 crc kubenswrapper[4707]: else Jan 21 15:34:33 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:33 crc kubenswrapper[4707]: fi Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:33 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:33 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:33 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:33 crc kubenswrapper[4707]: # support updates Jan 21 15:34:33 crc kubenswrapper[4707]: Jan 21 15:34:33 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:33 crc kubenswrapper[4707]: E0121 15:34:33.652283 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" podUID="e3b620a6-bbfd-429f-bcbc-16235afb8fc5" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.658778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.658965 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="317f9a2c-fa7d-417b-9899-9bce822ffd74" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3936a2d7a7a92681b283ae06c8af878facd4800d2852311442ea58c67b2c19cc" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.687110 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.687454 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api-log" containerID="cri-o://2f689f4a8b3655a649e060bcd6e63b5d3f7054c1d19adef9d2866c4790ef6ad2" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.687854 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api" containerID="cri-o://0eabbf082b84f4e174b9daa40c33805625e90ebc29b3dfef4ea2017e10d6f5c0" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.690120 4707 scope.go:117] "RemoveContainer" containerID="0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.694933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "781be74c-c8d5-4cfa-8bb1-5a0434bb3823" (UID: "781be74c-c8d5-4cfa-8bb1-5a0434bb3823"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.695260 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.695472 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener-log" containerID="cri-o://d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.695544 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener" containerID="cri-o://ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.702740 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.702784 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/781be74c-c8d5-4cfa-8bb1-5a0434bb3823-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.713951 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.714138 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker-log" containerID="cri-o://266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.714244 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker" containerID="cri-o://220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.738963 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.739295 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="prometheus" containerID="cri-o://2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" gracePeriod=600 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.739503 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="thanos-sidecar" containerID="cri-o://93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" gracePeriod=600 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.739639 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="config-reloader" containerID="cri-o://06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" gracePeriod=600 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.777099 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.777343 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="alertmanager" containerID="cri-o://fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1" gracePeriod=120 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.777632 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="config-reloader" containerID="cri-o://8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013" gracePeriod=120 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.782450 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.806703 4707 scope.go:117] "RemoveContainer" containerID="a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678" Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.850298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.855302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.860317 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-cnjfw"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.864331 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.868212 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.872356 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.883939 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.884000 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.884210 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="6b6c2ba2-268a-46f5-9ee1-93ff501d79de" containerName="memcached" containerID="cri-o://beb5043a095378415d9c0c0cc410632d5c83cf638d9ebdc6c46b7366a9b1df05" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.884989 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-x8kr2"] Jan 21 15:34:33 crc 
kubenswrapper[4707]: I0121 15:34:33.888858 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-x8kr2"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.892466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-d6kbk"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.896242 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-d6kbk"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.904324 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.907736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.911936 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-qc6lv"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.915672 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.921530 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.925181 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-qc6lv"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.932908 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6885874467-qknsk"] Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.933084 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" podUID="32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" containerName="keystone-api" containerID="cri-o://8a845675965250cb9cb44e3ab713dc4f43ebdac5af17938f0005b1ca1e29b6e7" gracePeriod=30 Jan 21 15:34:33 crc kubenswrapper[4707]: I0121 15:34:33.936742 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-2tzhb"] Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.019398 4707 scope.go:117] "RemoveContainer" containerID="0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9" Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.019885 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9\": container with ID starting with 0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9 not found: ID does not exist" containerID="0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.019932 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9"} err="failed to get container status \"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9\": rpc error: code = NotFound desc = could not find container \"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9\": container with ID starting with 0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9 not found: ID does not exist" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.019953 4707 scope.go:117] 
"RemoveContainer" containerID="a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678" Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.020899 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678\": container with ID starting with a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678 not found: ID does not exist" containerID="a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.020935 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678"} err="failed to get container status \"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678\": rpc error: code = NotFound desc = could not find container \"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678\": container with ID starting with a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678 not found: ID does not exist" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.020953 4707 scope.go:117] "RemoveContainer" containerID="0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.021304 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9"} err="failed to get container status \"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9\": rpc error: code = NotFound desc = could not find container \"0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9\": container with ID starting with 0181408bef13f95550cc9142a5facabeb3c3d5e3a355e6a2274c193755de3cf9 not found: ID does not exist" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.021326 4707 scope.go:117] "RemoveContainer" containerID="a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.021940 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678"} err="failed to get container status \"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678\": rpc error: code = NotFound desc = could not find container \"a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678\": container with ID starting with a8f13e75e09a10ba9c6684872c48e1cf0b7ab8f7b9e6bb06d13089d152cda678 not found: ID does not exist" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.021960 4707 scope.go:117] "RemoveContainer" containerID="e55d327aaedda39102f9591ab335da29319ef38cfe196bb243d1ce6b2b26dfed" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.049854 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.1.193:9090/-/ready\": dial tcp 10.217.1.193:9090: connect: connection refused" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.155720 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerName="galera" 
containerID="cri-o://c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517" gracePeriod=30 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.186360 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.186585 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.219858 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.219926 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data podName:c1a47423-6eba-4cf7-ab2d-db338f4f28a6 nodeName:}" failed. No retries permitted until 2026-01-21 15:34:38.21990539 +0000 UTC m=+1975.401421611 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data") pod "rabbitmq-server-0" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6") : configmap "rabbitmq-config-data" not found Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.511152 4707 scope.go:117] "RemoveContainer" containerID="8846341a10e36d0ac832f88ab6dd92585a6e11983d8927e565d6d370ee499686" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.571866 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk"] Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.640738 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:34 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:34 crc kubenswrapper[4707]: Jan 21 15:34:34 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:34 crc kubenswrapper[4707]: Jan 21 15:34:34 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:34 crc kubenswrapper[4707]: Jan 21 15:34:34 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:34 crc kubenswrapper[4707]: Jan 21 15:34:34 crc kubenswrapper[4707]: if [ -n "keystone" ]; then Jan 21 15:34:34 crc kubenswrapper[4707]: GRANT_DATABASE="keystone" Jan 21 15:34:34 crc kubenswrapper[4707]: else Jan 21 15:34:34 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:34 crc kubenswrapper[4707]: fi Jan 21 15:34:34 crc kubenswrapper[4707]: Jan 21 15:34:34 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:34 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:34 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:34 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:34 crc kubenswrapper[4707]: # support updates Jan 21 15:34:34 crc kubenswrapper[4707]: Jan 21 15:34:34 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.641964 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"keystone-db-secret\\\" not found\"" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" podUID="512bcb97-a919-42d4-8a18-67f62cbeecb6" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.643709 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.654155 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.674932 4707 generic.go:334] "Generic (PLEG): container finished" podID="952e1893-2e97-4caa-91e1-5c6621395737" containerID="694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39" exitCode=143 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.674967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"952e1893-2e97-4caa-91e1-5c6621395737","Type":"ContainerDied","Data":"694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.677159 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.679978 4707 generic.go:334] "Generic (PLEG): container finished" podID="12bfd31d-1d19-4286-87f6-83397fa62338" containerID="d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072" exitCode=143 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.680026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" event={"ID":"12bfd31d-1d19-4286-87f6-83397fa62338","Type":"ContainerDied","Data":"d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.701839 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.703487 4707 generic.go:334] "Generic (PLEG): container finished" podID="317f9a2c-fa7d-417b-9899-9bce822ffd74" containerID="3936a2d7a7a92681b283ae06c8af878facd4800d2852311442ea58c67b2c19cc" exitCode=0 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.703581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"317f9a2c-fa7d-417b-9899-9bce822ffd74","Type":"ContainerDied","Data":"3936a2d7a7a92681b283ae06c8af878facd4800d2852311442ea58c67b2c19cc"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.717312 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.724826 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-internal-tls-certs\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-log-httpd\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-config\") pod \"618755bc-3ad6-4081-9de4-d0f1f7c14020\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-public-tls-certs\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5tm7\" (UniqueName: \"kubernetes.io/projected/4904f4a7-c8e7-4d41-9254-32ccb097df02-kube-api-access-w5tm7\") pod \"4904f4a7-c8e7-4d41-9254-32ccb097df02\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-config-data\") pod \"4904f4a7-c8e7-4d41-9254-32ccb097df02\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-combined-ca-bundle\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-certs\") pod \"618755bc-3ad6-4081-9de4-d0f1f7c14020\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-config-data\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727718 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-etc-swift\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4904f4a7-c8e7-4d41-9254-32ccb097df02-logs\") pod \"4904f4a7-c8e7-4d41-9254-32ccb097df02\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-combined-ca-bundle\") pod \"4904f4a7-c8e7-4d41-9254-32ccb097df02\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74pq\" (UniqueName: \"kubernetes.io/projected/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-api-access-c74pq\") pod \"618755bc-3ad6-4081-9de4-d0f1f7c14020\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727816 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-custom-prometheus-ca\") pod \"4904f4a7-c8e7-4d41-9254-32ccb097df02\" (UID: \"4904f4a7-c8e7-4d41-9254-32ccb097df02\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-run-httpd\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-combined-ca-bundle\") pod \"618755bc-3ad6-4081-9de4-d0f1f7c14020\" (UID: \"618755bc-3ad6-4081-9de4-d0f1f7c14020\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.727874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xv96\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-kube-api-access-5xv96\") pod \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\" (UID: \"c2af1e8b-8059-4390-9ab4-a4ed516f509d\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.729519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.730221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4904f4a7-c8e7-4d41-9254-32ccb097df02-logs" (OuterVolumeSpecName: "logs") pod "4904f4a7-c8e7-4d41-9254-32ccb097df02" (UID: "4904f4a7-c8e7-4d41-9254-32ccb097df02"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.730659 4707 generic.go:334] "Generic (PLEG): container finished" podID="598e50f3-fe68-4276-9dab-a81d954b1505" containerID="6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339" exitCode=0 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.730695 4707 generic.go:334] "Generic (PLEG): container finished" podID="598e50f3-fe68-4276-9dab-a81d954b1505" containerID="e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da" exitCode=0 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.730902 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.730963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerDied","Data":"6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.730996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerDied","Data":"e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.732434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.735360 4707 generic.go:334] "Generic (PLEG): container finished" podID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerID="2f689f4a8b3655a649e060bcd6e63b5d3f7054c1d19adef9d2866c4790ef6ad2" exitCode=143 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.735417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" event={"ID":"de3b367f-2f6e-4ab7-af9b-5390359d3140","Type":"ContainerDied","Data":"2f689f4a8b3655a649e060bcd6e63b5d3f7054c1d19adef9d2866c4790ef6ad2"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.767105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4904f4a7-c8e7-4d41-9254-32ccb097df02-kube-api-access-w5tm7" (OuterVolumeSpecName: "kube-api-access-w5tm7") pod "4904f4a7-c8e7-4d41-9254-32ccb097df02" (UID: "4904f4a7-c8e7-4d41-9254-32ccb097df02"). InnerVolumeSpecName "kube-api-access-w5tm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.780998 4707 generic.go:334] "Generic (PLEG): container finished" podID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerID="5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46" exitCode=143 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.781084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" event={"ID":"a85cef69-0d30-4ed6-9c73-c169b3636f3a","Type":"ContainerDied","Data":"5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.796351 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerID="e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd" exitCode=143 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.796421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b4d7dc0f-9472-43f7-8f37-adb51c82f84e","Type":"ContainerDied","Data":"e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.804642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.810649 4707 generic.go:334] "Generic (PLEG): container finished" podID="618755bc-3ad6-4081-9de4-d0f1f7c14020" containerID="3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e" exitCode=2 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.810712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"618755bc-3ad6-4081-9de4-d0f1f7c14020","Type":"ContainerDied","Data":"3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.810736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"618755bc-3ad6-4081-9de4-d0f1f7c14020","Type":"ContainerDied","Data":"22d482e15c3e317cca75a50234e35f28c5f7da65fc761d014e05518c8f48768f"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.810752 4707 scope.go:117] "RemoveContainer" containerID="3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.810874 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.810989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-kube-api-access-5xv96" (OuterVolumeSpecName: "kube-api-access-5xv96") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "kube-api-access-5xv96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-out\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-config-data\") pod \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-combined-ca-bundle\") pod \"317f9a2c-fa7d-417b-9899-9bce822ffd74\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-memcached-tls-certs\") pod \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-tls-assets\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-thanos-prometheus-http-client-file\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-alertmanager-metric-storage-db\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830728 4707 generic.go:334] "Generic (PLEG): container finished" podID="c893a809-f898-4540-9624-c092e6ea4e8b" 
containerID="630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a" exitCode=0 Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlnsz\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-kube-api-access-hlnsz\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-2\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.830989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-cluster-tls-config\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-nova-novncproxy-tls-certs\") pod \"317f9a2c-fa7d-417b-9899-9bce822ffd74\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-secret-combined-ca-bundle\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27bcd\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-kube-api-access-27bcd\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-0\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc 
kubenswrapper[4707]: I0121 15:34:34.831089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-volume\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-vencrypt-tls-certs\") pod \"317f9a2c-fa7d-417b-9899-9bce822ffd74\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-web-config\") pod \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\" (UID: \"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv4z4\" (UniqueName: \"kubernetes.io/projected/317f9a2c-fa7d-417b-9899-9bce822ffd74-kube-api-access-wv4z4\") pod \"317f9a2c-fa7d-417b-9899-9bce822ffd74\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config-out\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kolla-config\") pod \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-1\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-combined-ca-bundle\") pod \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\" (UID: \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhbjs\" (UniqueName: \"kubernetes.io/projected/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kube-api-access-zhbjs\") pod \"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\" (UID: 
\"6b6c2ba2-268a-46f5-9ee1-93ff501d79de\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-config-data\") pod \"317f9a2c-fa7d-417b-9899-9bce822ffd74\" (UID: \"317f9a2c-fa7d-417b-9899-9bce822ffd74\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.831404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-tls-assets\") pod \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\" (UID: \"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2\") " Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.832045 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.832064 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4904f4a7-c8e7-4d41-9254-32ccb097df02-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.832073 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.832082 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xv96\" (UniqueName: \"kubernetes.io/projected/c2af1e8b-8059-4390-9ab4-a4ed516f509d-kube-api-access-5xv96\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.832091 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2af1e8b-8059-4390-9ab4-a4ed516f509d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.832099 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5tm7\" (UniqueName: \"kubernetes.io/projected/4904f4a7-c8e7-4d41-9254-32ccb097df02-kube-api-access-w5tm7\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.844633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6b6c2ba2-268a-46f5-9ee1-93ff501d79de" (UID: "6b6c2ba2-268a-46f5-9ee1-93ff501d79de"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.844723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-alertmanager-metric-storage-db" (OuterVolumeSpecName: "alertmanager-metric-storage-db") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "alertmanager-metric-storage-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.851220 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.856086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-api-access-c74pq" (OuterVolumeSpecName: "kube-api-access-c74pq") pod "618755bc-3ad6-4081-9de4-d0f1f7c14020" (UID: "618755bc-3ad6-4081-9de4-d0f1f7c14020"). InnerVolumeSpecName "kube-api-access-c74pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.856785 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-config-data" (OuterVolumeSpecName: "config-data") pod "6b6c2ba2-268a-46f5-9ee1-93ff501d79de" (UID: "6b6c2ba2-268a-46f5-9ee1-93ff501d79de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.856872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c893a809-f898-4540-9624-c092e6ea4e8b","Type":"ContainerDied","Data":"630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a"} Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.864307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config" (OuterVolumeSpecName: "config") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.864975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.875084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.875256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "618755bc-3ad6-4081-9de4-d0f1f7c14020" (UID: "618755bc-3ad6-4081-9de4-d0f1f7c14020"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.890675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.890773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.890842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.890897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.928469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.928552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940313 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940335 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940347 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940356 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940364 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940373 4707 reconciler_common.go:293] "Volume detached for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-alertmanager-metric-storage-db\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940384 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940392 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c74pq\" (UniqueName: \"kubernetes.io/projected/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-api-access-c74pq\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940400 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940409 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940418 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940426 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940436 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940443 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.940451 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.946384 4707 scope.go:117] "RemoveContainer" containerID="3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.948611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-out" (OuterVolumeSpecName: "config-out") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: E0121 15:34:34.967450 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e\": container with ID starting with 3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e not found: ID does not exist" containerID="3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.967479 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e"} err="failed to get container status \"3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e\": rpc error: code = NotFound desc = could not find container \"3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e\": container with ID starting with 3783f3e1bf2f6745d4f5869fefbac9bc3983a6eda34d3d7295725900772a073e not found: ID does not exist" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.967587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317f9a2c-fa7d-417b-9899-9bce822ffd74-kube-api-access-wv4z4" (OuterVolumeSpecName: "kube-api-access-wv4z4") pod "317f9a2c-fa7d-417b-9899-9bce822ffd74" (UID: "317f9a2c-fa7d-417b-9899-9bce822ffd74"). InnerVolumeSpecName "kube-api-access-wv4z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.967652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-kube-api-access-hlnsz" (OuterVolumeSpecName: "kube-api-access-hlnsz") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "kube-api-access-hlnsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.967712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config-out" (OuterVolumeSpecName: "config-out") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.967757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.967825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-kube-api-access-27bcd" (OuterVolumeSpecName: "kube-api-access-27bcd") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "kube-api-access-27bcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:34 crc kubenswrapper[4707]: I0121 15:34:34.984478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kube-api-access-zhbjs" (OuterVolumeSpecName: "kube-api-access-zhbjs") pod "6b6c2ba2-268a-46f5-9ee1-93ff501d79de" (UID: "6b6c2ba2-268a-46f5-9ee1-93ff501d79de"). InnerVolumeSpecName "kube-api-access-zhbjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000084 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerID="67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000109 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerID="d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" event={"ID":"c2af1e8b-8059-4390-9ab4-a4ed516f509d","Type":"ContainerDied","Data":"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" event={"ID":"c2af1e8b-8059-4390-9ab4-a4ed516f509d","Type":"ContainerDied","Data":"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" event={"ID":"c2af1e8b-8059-4390-9ab4-a4ed516f509d","Type":"ContainerDied","Data":"bad4d798cab896195e02b8d2b431ac1f2c3c484485a32284f142a084fc84f9c9"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000223 4707 scope.go:117] "RemoveContainer" containerID="67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.000328 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.011868 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b6c2ba2-268a-46f5-9ee1-93ff501d79de" containerID="beb5043a095378415d9c0c0cc410632d5c83cf638d9ebdc6c46b7366a9b1df05" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.011967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6b6c2ba2-268a-46f5-9ee1-93ff501d79de","Type":"ContainerDied","Data":"beb5043a095378415d9c0c0cc410632d5c83cf638d9ebdc6c46b7366a9b1df05"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.012026 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.013296 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.024344 4707 generic.go:334] "Generic (PLEG): container finished" podID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerID="266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e" exitCode=143 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.024382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" event={"ID":"cbc34703-f7b2-455d-b4a6-4865a4c90f45","Type":"ContainerDied","Data":"266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.026005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" event={"ID":"512bcb97-a919-42d4-8a18-67f62cbeecb6","Type":"ContainerStarted","Data":"ab2b2989d719b49da29c21b644e617c426393cc9a7147474f7b15e9f7212d944"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.036043 4707 generic.go:334] "Generic (PLEG): container finished" podID="4904f4a7-c8e7-4d41-9254-32ccb097df02" containerID="dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.036096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4904f4a7-c8e7-4d41-9254-32ccb097df02","Type":"ContainerDied","Data":"dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.036113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"4904f4a7-c8e7-4d41-9254-32ccb097df02","Type":"ContainerDied","Data":"4e0fc1adb7f245844cd128ff624b3ca945acd7dcda58b9658732106372498717"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.036161 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041759 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv4z4\" (UniqueName: \"kubernetes.io/projected/317f9a2c-fa7d-417b-9899-9bce822ffd74-kube-api-access-wv4z4\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041785 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-config-out\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041794 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhbjs\" (UniqueName: \"kubernetes.io/projected/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-kube-api-access-zhbjs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041817 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-config-out\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041841 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041851 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlnsz\" (UniqueName: \"kubernetes.io/projected/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-kube-api-access-hlnsz\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041859 4707 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.041867 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27bcd\" (UniqueName: \"kubernetes.io/projected/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-kube-api-access-27bcd\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046693 4707 generic.go:334] "Generic (PLEG): container finished" podID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerID="93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046719 4707 generic.go:334] "Generic (PLEG): container finished" podID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerID="06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046727 4707 generic.go:334] "Generic (PLEG): container finished" podID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerID="2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerDied","Data":"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" 
event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerDied","Data":"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerDied","Data":"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"75c78782-36f5-4ea6-9b94-0c38ad8ac8b2","Type":"ContainerDied","Data":"b9ba6ef9736e4e3423de4537a070ef7c9438832cc3b04653d27954688e69a445"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.046891 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.052349 4707 generic.go:334] "Generic (PLEG): container finished" podID="933e570a-28bd-42d8-891f-0734578bc040" containerID="a5cfda8d9c91d4e73ddfaa7ddcab6f6a580e4abf30eecf6674e01a38f42e5830" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.052394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"933e570a-28bd-42d8-891f-0734578bc040","Type":"ContainerDied","Data":"a5cfda8d9c91d4e73ddfaa7ddcab6f6a580e4abf30eecf6674e01a38f42e5830"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.055090 4707 generic.go:334] "Generic (PLEG): container finished" podID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerID="d150068cda003e5778b5b784c9314de19be0c8b0bff88efea482c9b70f8f5d2e" exitCode=1 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.055145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" event={"ID":"e46f052f-cdf5-442f-9628-b1aacddca32d","Type":"ContainerDied","Data":"d150068cda003e5778b5b784c9314de19be0c8b0bff88efea482c9b70f8f5d2e"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.063360 4707 generic.go:334] "Generic (PLEG): container finished" podID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerID="8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.063382 4707 generic.go:334] "Generic (PLEG): container finished" podID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerID="fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.063437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerDied","Data":"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.063455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerDied","Data":"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.063465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" 
event={"ID":"f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b","Type":"ContainerDied","Data":"57e6012eff8c9cc8be5b8be5fe6af8ea9ffa9100011c6c1cede087099cce1640"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.063531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.066142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4904f4a7-c8e7-4d41-9254-32ccb097df02" (UID: "4904f4a7-c8e7-4d41-9254-32ccb097df02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.087272 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.087299 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276" exitCode=0 Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.087340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.087359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.088958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" event={"ID":"8238358f-4625-4642-84db-0834ff86d43b","Type":"ContainerStarted","Data":"e662f7d8b0f422cd8379b91df72a3d5203e1ed6e7ae27fac775dd1e8df730699"} Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.089307 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" secret="" err="secret \"galera-openstack-dockercfg-8p8k7\" not found" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.104318 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:34:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:34:35 crc kubenswrapper[4707]: Jan 21 15:34:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:34:35 crc kubenswrapper[4707]: Jan 21 15:34:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:34:35 crc kubenswrapper[4707]: Jan 21 15:34:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:34:35 crc kubenswrapper[4707]: Jan 21 15:34:35 crc kubenswrapper[4707]: if [ -n "placement" ]; then Jan 21 15:34:35 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 15:34:35 crc kubenswrapper[4707]: else Jan 21 15:34:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:34:35 crc kubenswrapper[4707]: fi Jan 21 15:34:35 crc kubenswrapper[4707]: Jan 21 15:34:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:34:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:34:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:34:35 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:34:35 crc kubenswrapper[4707]: # support updates Jan 21 15:34:35 crc kubenswrapper[4707]: Jan 21 15:34:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.105427 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" podUID="9f46d10f-704e-4815-8636-7d8dcb666e0c" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.143859 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.144275 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.144392 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts podName:9f46d10f-704e-4815-8636-7d8dcb666e0c nodeName:}" failed. No retries permitted until 2026-01-21 15:34:35.644378494 +0000 UTC m=+1972.825894715 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts") pod "placement-9339-account-create-update-8xfcb" (UID: "9f46d10f-704e-4815-8636-7d8dcb666e0c") : configmap "openstack-scripts" not found Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.162850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "317f9a2c-fa7d-417b-9899-9bce822ffd74" (UID: "317f9a2c-fa7d-417b-9899-9bce822ffd74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.189160 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.207969 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f832c1-2aa2-45a2-b766-f0483cfda9e0" path="/var/lib/kubelet/pods/11f832c1-2aa2-45a2-b766-f0483cfda9e0/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.212826 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f92f6b-e051-483b-91ae-9af16316bb7c" path="/var/lib/kubelet/pods/18f92f6b-e051-483b-91ae-9af16316bb7c/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.214048 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24548ac0-7326-443b-ad9c-81ace3a3dd45" path="/var/lib/kubelet/pods/24548ac0-7326-443b-ad9c-81ace3a3dd45/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.214662 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5681f814-1ae0-4737-bc71-5e487d92c291" path="/var/lib/kubelet/pods/5681f814-1ae0-4737-bc71-5e487d92c291/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.215188 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630582cc-d708-415d-88da-06be4d720728" path="/var/lib/kubelet/pods/630582cc-d708-415d-88da-06be4d720728/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.217968 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d991e26-3d69-43bb-a18b-67aa7819a210" path="/var/lib/kubelet/pods/6d991e26-3d69-43bb-a18b-67aa7819a210/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.218511 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781be74c-c8d5-4cfa-8bb1-5a0434bb3823" path="/var/lib/kubelet/pods/781be74c-c8d5-4cfa-8bb1-5a0434bb3823/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.219026 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79071a6b-873b-4668-9af8-45aafd0facb3" path="/var/lib/kubelet/pods/79071a6b-873b-4668-9af8-45aafd0facb3/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.220875 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8376e997-f7b0-4432-997e-dbc14561fe5c" path="/var/lib/kubelet/pods/8376e997-f7b0-4432-997e-dbc14561fe5c/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.222777 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916916dd-fb61-4e93-8aff-948350956ec8" path="/var/lib/kubelet/pods/916916dd-fb61-4e93-8aff-948350956ec8/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.223632 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7eb849-a2d7-4ac3-88cc-977992b13490" path="/var/lib/kubelet/pods/9e7eb849-a2d7-4ac3-88cc-977992b13490/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.224967 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2678839-9b6d-4d68-aa48-c2b42793ea0b" path="/var/lib/kubelet/pods/b2678839-9b6d-4d68-aa48-c2b42793ea0b/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.226766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b6c2ba2-268a-46f5-9ee1-93ff501d79de" (UID: "6b6c2ba2-268a-46f5-9ee1-93ff501d79de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.226948 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bf03b0-3f33-46ca-8d6a-59d30612d2a5" path="/var/lib/kubelet/pods/b2bf03b0-3f33-46ca-8d6a-59d30612d2a5/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.230352 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e972c93b-19a1-455e-a134-ba9bd7f56854" path="/var/lib/kubelet/pods/e972c93b-19a1-455e-a134-ba9bd7f56854/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.233456 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7ce142-4555-4604-a31b-155176988d1a" path="/var/lib/kubelet/pods/ec7ce142-4555-4604-a31b-155176988d1a/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.234074 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26df112-b756-487f-ad9b-5d39ed9f1754" path="/var/lib/kubelet/pods/f26df112-b756-487f-ad9b-5d39ed9f1754/volumes" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.241988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "618755bc-3ad6-4081-9de4-d0f1f7c14020" (UID: "618755bc-3ad6-4081-9de4-d0f1f7c14020"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.247450 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.247578 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.247860 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.247882 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.282912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-config-data" (OuterVolumeSpecName: "config-data") pod "317f9a2c-fa7d-417b-9899-9bce822ffd74" (UID: "317f9a2c-fa7d-417b-9899-9bce822ffd74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.286168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "4904f4a7-c8e7-4d41-9254-32ccb097df02" (UID: "4904f4a7-c8e7-4d41-9254-32ccb097df02"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.304512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.329922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.345185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.359142 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.359166 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.359175 4707 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.359184 4707 reconciler_common.go:293] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-cluster-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.359192 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.361111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.374889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-config-data" (OuterVolumeSpecName: "config-data") pod "4904f4a7-c8e7-4d41-9254-32ccb097df02" (UID: "4904f4a7-c8e7-4d41-9254-32ccb097df02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.377890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "317f9a2c-fa7d-417b-9899-9bce822ffd74" (UID: "317f9a2c-fa7d-417b-9899-9bce822ffd74"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.382856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-config-data" (OuterVolumeSpecName: "config-data") pod "c2af1e8b-8059-4390-9ab4-a4ed516f509d" (UID: "c2af1e8b-8059-4390-9ab4-a4ed516f509d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.383910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "618755bc-3ad6-4081-9de4-d0f1f7c14020" (UID: "618755bc-3ad6-4081-9de4-d0f1f7c14020"). 
InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.399599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "6b6c2ba2-268a-46f5-9ee1-93ff501d79de" (UID: "6b6c2ba2-268a-46f5-9ee1-93ff501d79de"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.401873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-web-config" (OuterVolumeSpecName: "web-config") pod "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" (UID: "f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.402293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "317f9a2c-fa7d-417b-9899-9bce822ffd74" (UID: "317f9a2c-fa7d-417b-9899-9bce822ffd74"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.421865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config" (OuterVolumeSpecName: "web-config") pod "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" (UID: "75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.468937 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4904f4a7-c8e7-4d41-9254-32ccb097df02-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.468961 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/618755bc-3ad6-4081-9de4-d0f1f7c14020-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.468971 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.468980 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6c2ba2-268a-46f5-9ee1-93ff501d79de-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.468988 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.468996 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/317f9a2c-fa7d-417b-9899-9bce822ffd74-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.469004 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b-web-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.469012 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2-web-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.469020 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2af1e8b-8059-4390-9ab4-a4ed516f509d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.612273 4707 scope.go:117] "RemoveContainer" containerID="d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.673075 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.673127 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts podName:9f46d10f-704e-4815-8636-7d8dcb666e0c nodeName:}" failed. No retries permitted until 2026-01-21 15:34:36.673113871 +0000 UTC m=+1973.854630093 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts") pod "placement-9339-account-create-update-8xfcb" (UID: "9f46d10f-704e-4815-8636-7d8dcb666e0c") : configmap "openstack-scripts" not found Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.688749 4707 scope.go:117] "RemoveContainer" containerID="67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689171 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689301 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.689372 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a\": container with ID starting with 67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a not found: ID does not exist" containerID="67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689393 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a"} err="failed to get container status \"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a\": rpc error: code = NotFound desc = could not find container \"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a\": container with ID starting with 67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a not found: ID does not exist" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689412 4707 scope.go:117] "RemoveContainer" containerID="d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.689678 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2\": container with ID starting with d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2 not found: ID does not exist" containerID="d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689699 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2"} err="failed to get container status \"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2\": rpc error: code = NotFound desc = could not find container \"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2\": container with ID starting with d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2 not found: ID does not exist" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689725 4707 scope.go:117] "RemoveContainer" containerID="67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.689994 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a"} err="failed to get container status \"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a\": rpc error: code = NotFound desc = could not find container \"67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a\": container with ID starting with 67f3b58ff0799982532dfb3def98ac88d5dc268b04a396193a9ef2c3fc08af4a not found: ID does not exist" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.690013 4707 scope.go:117] "RemoveContainer" containerID="d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.690304 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2"} err="failed to get container status \"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2\": rpc error: code = NotFound desc = could not find container \"d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2\": container with ID starting with d5cece904451bdc02cbc08e855e0e1bfcebd001a1578a3ae3d6e78be7b0d86f2 not found: ID does not exist" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.690362 4707 scope.go:117] "RemoveContainer" containerID="beb5043a095378415d9c0c0cc410632d5c83cf638d9ebdc6c46b7366a9b1df05" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.717863 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.725784 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.730985 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.732986 4707 scope.go:117] "RemoveContainer" containerID="dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.743396 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.747961 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.759735 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.766131 4707 scope.go:117] "RemoveContainer" containerID="dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.766390 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e\": container with ID starting with dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e not found: ID does not exist" containerID="dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.766556 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e"} err="failed to get container status \"dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e\": rpc error: code = NotFound desc = could not find container \"dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e\": container with ID starting with dab3ecbe21558eb185538d94e3161026177b05d77975f69a52633237fe8ac80e not found: ID does not exist" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.766601 4707 scope.go:117] "RemoveContainer" containerID="93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.768271 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.769091 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.771184 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.775057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz645\" (UniqueName: \"kubernetes.io/projected/512bcb97-a919-42d4-8a18-67f62cbeecb6-kube-api-access-kz645\") pod \"512bcb97-a919-42d4-8a18-67f62cbeecb6\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933e570a-28bd-42d8-891f-0734578bc040-logs\") pod \"933e570a-28bd-42d8-891f-0734578bc040\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/512bcb97-a919-42d4-8a18-67f62cbeecb6-operator-scripts\") pod \"512bcb97-a919-42d4-8a18-67f62cbeecb6\" (UID: \"512bcb97-a919-42d4-8a18-67f62cbeecb6\") " Jan 21 15:34:35 crc 
kubenswrapper[4707]: I0121 15:34:35.776288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5297m\" (UniqueName: \"kubernetes.io/projected/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-kube-api-access-5297m\") pod \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-config-data\") pod \"933e570a-28bd-42d8-891f-0734578bc040\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-combined-ca-bundle\") pod \"933e570a-28bd-42d8-891f-0734578bc040\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw4rq\" (UniqueName: \"kubernetes.io/projected/933e570a-28bd-42d8-891f-0734578bc040-kube-api-access-fw4rq\") pod \"933e570a-28bd-42d8-891f-0734578bc040\" (UID: \"933e570a-28bd-42d8-891f-0734578bc040\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-operator-scripts\") pod \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\" (UID: \"e3b620a6-bbfd-429f-bcbc-16235afb8fc5\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933e570a-28bd-42d8-891f-0734578bc040-logs" (OuterVolumeSpecName: "logs") pod "933e570a-28bd-42d8-891f-0734578bc040" (UID: "933e570a-28bd-42d8-891f-0734578bc040"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512bcb97-a919-42d4-8a18-67f62cbeecb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "512bcb97-a919-42d4-8a18-67f62cbeecb6" (UID: "512bcb97-a919-42d4-8a18-67f62cbeecb6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.776992 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/933e570a-28bd-42d8-891f-0734578bc040-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.777009 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/512bcb97-a919-42d4-8a18-67f62cbeecb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.777406 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:35 crc kubenswrapper[4707]: E0121 15:34:35.777440 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.777883 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3b620a6-bbfd-429f-bcbc-16235afb8fc5" (UID: "e3b620a6-bbfd-429f-bcbc-16235afb8fc5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.778059 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.780548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-kube-api-access-5297m" (OuterVolumeSpecName: "kube-api-access-5297m") pod "e3b620a6-bbfd-429f-bcbc-16235afb8fc5" (UID: "e3b620a6-bbfd-429f-bcbc-16235afb8fc5"). InnerVolumeSpecName "kube-api-access-5297m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.785069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512bcb97-a919-42d4-8a18-67f62cbeecb6-kube-api-access-kz645" (OuterVolumeSpecName: "kube-api-access-kz645") pod "512bcb97-a919-42d4-8a18-67f62cbeecb6" (UID: "512bcb97-a919-42d4-8a18-67f62cbeecb6"). InnerVolumeSpecName "kube-api-access-kz645". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.791058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933e570a-28bd-42d8-891f-0734578bc040-kube-api-access-fw4rq" (OuterVolumeSpecName: "kube-api-access-fw4rq") pod "933e570a-28bd-42d8-891f-0734578bc040" (UID: "933e570a-28bd-42d8-891f-0734578bc040"). InnerVolumeSpecName "kube-api-access-fw4rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.791238 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.799078 4707 scope.go:117] "RemoveContainer" containerID="06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.803582 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.810106 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.811420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "933e570a-28bd-42d8-891f-0734578bc040" (UID: "933e570a-28bd-42d8-891f-0734578bc040"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.824731 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.824784 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.832182 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.835368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-config-data" (OuterVolumeSpecName: "config-data") pod "933e570a-28bd-42d8-891f-0734578bc040" (UID: "933e570a-28bd-42d8-891f-0734578bc040"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.840976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.845878 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.846064 4707 scope.go:117] "RemoveContainer" containerID="2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.871737 4707 scope.go:117] "RemoveContainer" containerID="659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.878792 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-combined-ca-bundle\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.878862 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6659\" (UniqueName: \"kubernetes.io/projected/e46f052f-cdf5-442f-9628-b1aacddca32d-kube-api-access-f6659\") pod \"e46f052f-cdf5-442f-9628-b1aacddca32d\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.878910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-operator-scripts\") pod \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.878942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqksn\" (UniqueName: \"kubernetes.io/projected/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-kube-api-access-hqksn\") pod \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.878989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk64m\" (UniqueName: \"kubernetes.io/projected/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kube-api-access-bk64m\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k7dc\" (UniqueName: \"kubernetes.io/projected/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-kube-api-access-7k7dc\") pod \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\" (UID: \"1a712880-3b94-4d6a-b137-bbe10fbf4ed4\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-default\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-galera-tls-certs\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kolla-config\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46f052f-cdf5-442f-9628-b1aacddca32d-operator-scripts\") pod \"e46f052f-cdf5-442f-9628-b1aacddca32d\" (UID: \"e46f052f-cdf5-442f-9628-b1aacddca32d\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-generated\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84510089-d853-4404-9d37-9130c56ad8aa-operator-scripts\") pod \"84510089-d853-4404-9d37-9130c56ad8aa\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljxxn\" (UniqueName: \"kubernetes.io/projected/84510089-d853-4404-9d37-9130c56ad8aa-kube-api-access-ljxxn\") pod \"84510089-d853-4404-9d37-9130c56ad8aa\" (UID: \"84510089-d853-4404-9d37-9130c56ad8aa\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-operator-scripts\") pod \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\" (UID: \"27f0dcee-d9f1-4bcd-9084-402a6f132ddc\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.879439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-operator-scripts\") pod \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\" (UID: \"c1478145-b0f0-47c0-a3d1-d9bb50238e3b\") " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880117 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw4rq\" (UniqueName: \"kubernetes.io/projected/933e570a-28bd-42d8-891f-0734578bc040-kube-api-access-fw4rq\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880154 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880163 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz645\" (UniqueName: \"kubernetes.io/projected/512bcb97-a919-42d4-8a18-67f62cbeecb6-kube-api-access-kz645\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880171 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5297m\" (UniqueName: \"kubernetes.io/projected/e3b620a6-bbfd-429f-bcbc-16235afb8fc5-kube-api-access-5297m\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880179 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880187 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933e570a-28bd-42d8-891f-0734578bc040-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.880707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1478145-b0f0-47c0-a3d1-d9bb50238e3b" (UID: "c1478145-b0f0-47c0-a3d1-d9bb50238e3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.881751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.881945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a712880-3b94-4d6a-b137-bbe10fbf4ed4" (UID: "1a712880-3b94-4d6a-b137-bbe10fbf4ed4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.881960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.882221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e46f052f-cdf5-442f-9628-b1aacddca32d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e46f052f-cdf5-442f-9628-b1aacddca32d" (UID: "e46f052f-cdf5-442f-9628-b1aacddca32d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.882547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.882660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84510089-d853-4404-9d37-9130c56ad8aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84510089-d853-4404-9d37-9130c56ad8aa" (UID: "84510089-d853-4404-9d37-9130c56ad8aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.882873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.884098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-kube-api-access-hqksn" (OuterVolumeSpecName: "kube-api-access-hqksn") pod "c1478145-b0f0-47c0-a3d1-d9bb50238e3b" (UID: "c1478145-b0f0-47c0-a3d1-d9bb50238e3b"). InnerVolumeSpecName "kube-api-access-hqksn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.886961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-kube-api-access-7k7dc" (OuterVolumeSpecName: "kube-api-access-7k7dc") pod "1a712880-3b94-4d6a-b137-bbe10fbf4ed4" (UID: "1a712880-3b94-4d6a-b137-bbe10fbf4ed4"). InnerVolumeSpecName "kube-api-access-7k7dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.888856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46f052f-cdf5-442f-9628-b1aacddca32d-kube-api-access-f6659" (OuterVolumeSpecName: "kube-api-access-f6659") pod "e46f052f-cdf5-442f-9628-b1aacddca32d" (UID: "e46f052f-cdf5-442f-9628-b1aacddca32d"). InnerVolumeSpecName "kube-api-access-f6659". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.888962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kube-api-access-bk64m" (OuterVolumeSpecName: "kube-api-access-bk64m") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "kube-api-access-bk64m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.890085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84510089-d853-4404-9d37-9130c56ad8aa-kube-api-access-ljxxn" (OuterVolumeSpecName: "kube-api-access-ljxxn") pod "84510089-d853-4404-9d37-9130c56ad8aa" (UID: "84510089-d853-4404-9d37-9130c56ad8aa"). InnerVolumeSpecName "kube-api-access-ljxxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.895010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.905876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.921058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "27f0dcee-d9f1-4bcd-9084-402a6f132ddc" (UID: "27f0dcee-d9f1-4bcd-9084-402a6f132ddc"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983365 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983403 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983424 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983437 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46f052f-cdf5-442f-9628-b1aacddca32d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983452 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983490 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983502 4707 reconciler_common.go:293] "Volume detached 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84510089-d853-4404-9d37-9130c56ad8aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljxxn\" (UniqueName: \"kubernetes.io/projected/84510089-d853-4404-9d37-9130c56ad8aa-kube-api-access-ljxxn\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983525 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983537 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983549 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983565 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6659\" (UniqueName: \"kubernetes.io/projected/e46f052f-cdf5-442f-9628-b1aacddca32d-kube-api-access-f6659\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983575 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983586 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqksn\" (UniqueName: \"kubernetes.io/projected/c1478145-b0f0-47c0-a3d1-d9bb50238e3b-kube-api-access-hqksn\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983595 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk64m\" (UniqueName: \"kubernetes.io/projected/27f0dcee-d9f1-4bcd-9084-402a6f132ddc-kube-api-access-bk64m\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:35 crc kubenswrapper[4707]: I0121 15:34:35.983607 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k7dc\" (UniqueName: \"kubernetes.io/projected/1a712880-3b94-4d6a-b137-bbe10fbf4ed4-kube-api-access-7k7dc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.003174 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.005454 4707 scope.go:117] "RemoveContainer" containerID="93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.005914 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": container with ID starting with 93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97 not found: ID does not exist" containerID="93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" Jan 21 15:34:36 crc 
kubenswrapper[4707]: I0121 15:34:36.005972 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97"} err="failed to get container status \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": rpc error: code = NotFound desc = could not find container \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": container with ID starting with 93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006006 4707 scope.go:117] "RemoveContainer" containerID="06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.006282 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": container with ID starting with 06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f not found: ID does not exist" containerID="06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006304 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f"} err="failed to get container status \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": rpc error: code = NotFound desc = could not find container \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": container with ID starting with 06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006324 4707 scope.go:117] "RemoveContainer" containerID="2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.006535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": container with ID starting with 2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24 not found: ID does not exist" containerID="2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006553 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24"} err="failed to get container status \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": rpc error: code = NotFound desc = could not find container \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": container with ID starting with 2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006566 4707 scope.go:117] "RemoveContainer" containerID="659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.006728 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": container with ID starting with 
659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23 not found: ID does not exist" containerID="659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006748 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23"} err="failed to get container status \"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": rpc error: code = NotFound desc = could not find container \"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": container with ID starting with 659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006762 4707 scope.go:117] "RemoveContainer" containerID="93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006966 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97"} err="failed to get container status \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": rpc error: code = NotFound desc = could not find container \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": container with ID starting with 93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.006989 4707 scope.go:117] "RemoveContainer" containerID="06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007213 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f"} err="failed to get container status \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": rpc error: code = NotFound desc = could not find container \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": container with ID starting with 06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007231 4707 scope.go:117] "RemoveContainer" containerID="2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007394 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24"} err="failed to get container status \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": rpc error: code = NotFound desc = could not find container \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": container with ID starting with 2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007410 4707 scope.go:117] "RemoveContainer" containerID="659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007603 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23"} err="failed to get container status 
\"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": rpc error: code = NotFound desc = could not find container \"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": container with ID starting with 659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007617 4707 scope.go:117] "RemoveContainer" containerID="93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007782 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97"} err="failed to get container status \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": rpc error: code = NotFound desc = could not find container \"93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97\": container with ID starting with 93c5d73bf4e1cdc6be7d978f6e3d89cc51264da1cad2154dd97b842e161dbd97 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.007798 4707 scope.go:117] "RemoveContainer" containerID="06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.008028 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f"} err="failed to get container status \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": rpc error: code = NotFound desc = could not find container \"06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f\": container with ID starting with 06483f347c9cc806ec1a4ffbae1a96f02a2d548a50707402a4b1b94fb278cf5f not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.008051 4707 scope.go:117] "RemoveContainer" containerID="2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.008209 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24"} err="failed to get container status \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": rpc error: code = NotFound desc = could not find container \"2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24\": container with ID starting with 2cb5edb8f4fa21fce11cc7eef97e9b7bad17cc5dc6aa1e9176f68f5aa8b9ef24 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.008225 4707 scope.go:117] "RemoveContainer" containerID="659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.008360 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23"} err="failed to get container status \"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": rpc error: code = NotFound desc = could not find container \"659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23\": container with ID starting with 659b18393f7304ad4b34a04c3d0b66989353ccec808792f30d90975b0d485a23 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.008378 4707 scope.go:117] "RemoveContainer" 
containerID="6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.066613 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.225:8776/healthcheck\": read tcp 10.217.0.2:42952->10.217.1.225:8776: read: connection reset by peer" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.085307 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.096627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" event={"ID":"1a712880-3b94-4d6a-b137-bbe10fbf4ed4","Type":"ContainerDied","Data":"6bea8be2f791b2c951699f302a0b9f4da77842238474b87214dbf4a1809ae10b"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.096662 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.099918 4707 generic.go:334] "Generic (PLEG): container finished" podID="8238358f-4625-4642-84db-0834ff86d43b" containerID="74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e" exitCode=0 Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.099962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" event={"ID":"8238358f-4625-4642-84db-0834ff86d43b","Type":"ContainerStarted","Data":"9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.099979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" event={"ID":"8238358f-4625-4642-84db-0834ff86d43b","Type":"ContainerDied","Data":"74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.100794 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.108143 4707 generic.go:334] "Generic (PLEG): container finished" podID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerID="c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517" exitCode=0 Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.108192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"27f0dcee-d9f1-4bcd-9084-402a6f132ddc","Type":"ContainerDied","Data":"c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.108215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"27f0dcee-d9f1-4bcd-9084-402a6f132ddc","Type":"ContainerDied","Data":"b529af99fddb0566ae94ef7081ab9434ade2f06a08d41401683903cbf948d077"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.108236 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.109457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"933e570a-28bd-42d8-891f-0734578bc040","Type":"ContainerDied","Data":"7cc53e13d68ae65e774f9d51ed557b0b0a16e61aa3e9983335212210cbc1815e"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.109505 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.115530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" event={"ID":"512bcb97-a919-42d4-8a18-67f62cbeecb6","Type":"ContainerDied","Data":"ab2b2989d719b49da29c21b644e617c426393cc9a7147474f7b15e9f7212d944"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.115601 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.123676 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" podStartSLOduration=4.123663047 podStartE2EDuration="4.123663047s" podCreationTimestamp="2026-01-21 15:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:34:36.11409638 +0000 UTC m=+1973.295612602" watchObservedRunningTime="2026-01-21 15:34:36.123663047 +0000 UTC m=+1973.305179269" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.125927 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.126282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-2tzhb" event={"ID":"e46f052f-cdf5-442f-9628-b1aacddca32d","Type":"ContainerDied","Data":"0d86fcd0f00ebc9364f7780e6c946773788543a71a07d32f2fd906ea287f365f"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.127586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"317f9a2c-fa7d-417b-9899-9bce822ffd74","Type":"ContainerDied","Data":"4e2e40b027020e927ccc3a4b83fa3b9cf999e97161998ace26065b2a0b11a5f4"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.127656 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.129561 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.129701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4d50-account-create-update-vpps7" event={"ID":"e3b620a6-bbfd-429f-bcbc-16235afb8fc5","Type":"ContainerDied","Data":"8d67078be2917cdb59b82e62b39f46cab4989883cf2002516ec0453d21833d66"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.130703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" event={"ID":"c1478145-b0f0-47c0-a3d1-d9bb50238e3b","Type":"ContainerDied","Data":"7ad8a28e3ea9a57157023982ff196974a08ed5a3649cfb86565c44931148ead1"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.130749 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.134530 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerID="b02f577e02d6cede6688dd7b75593a1e9627c1ae835ea04f9d891f4fea9be490" exitCode=0 Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.134632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360","Type":"ContainerDied","Data":"b02f577e02d6cede6688dd7b75593a1e9627c1ae835ea04f9d891f4fea9be490"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.138497 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerID="73da13975b9a0fea75976713ffc06ad0487d928c65d1dd4b7b9f66279d719295" exitCode=0 Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.138568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7b5a0b6a-5b7d-4431-961c-984efc37166a","Type":"ContainerDied","Data":"73da13975b9a0fea75976713ffc06ad0487d928c65d1dd4b7b9f66279d719295"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.140182 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.140192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4" event={"ID":"84510089-d853-4404-9d37-9130c56ad8aa","Type":"ContainerDied","Data":"73a71acd78062f06c8b964037089777bc5495b64f2bb66bf5371a908de0bf27d"} Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.173326 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.1.229:9322/\": read tcp 10.217.0.2:33896->10.217.1.229:9322: read: connection reset by peer" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.173594 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.1.229:9322/\": read tcp 10.217.0.2:33882->10.217.1.229:9322: read: connection reset by peer" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.325759 4707 scope.go:117] "RemoveContainer" containerID="8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.330522 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.331323 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.353729 4707 scope.go:117] "RemoveContainer" containerID="fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-config-data\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-public-tls-certs\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-httpd-run\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-logs\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-combined-ca-bundle\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-combined-ca-bundle\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msp5\" (UniqueName: \"kubernetes.io/projected/7b5a0b6a-5b7d-4431-961c-984efc37166a-kube-api-access-5msp5\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-internal-tls-certs\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-scripts\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-httpd-run\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393946 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-logs\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-scripts\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.393993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4wxs\" (UniqueName: \"kubernetes.io/projected/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-kube-api-access-t4wxs\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.394016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\" (UID: \"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.394032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-config-data\") pod 
\"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.394050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"7b5a0b6a-5b7d-4431-961c-984efc37166a\" (UID: \"7b5a0b6a-5b7d-4431-961c-984efc37166a\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.394702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.395095 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.401336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.401714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-logs" (OuterVolumeSpecName: "logs") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.402980 4707 scope.go:117] "RemoveContainer" containerID="2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.403100 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.403727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.403989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-logs" (OuterVolumeSpecName: "logs") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.404605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-scripts" (OuterVolumeSpecName: "scripts") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.412369 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-2a48-account-create-update-qfct4"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.418947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5a0b6a-5b7d-4431-961c-984efc37166a-kube-api-access-5msp5" (OuterVolumeSpecName: "kube-api-access-5msp5") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "kube-api-access-5msp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.419503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.421934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.428113 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.428712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-kube-api-access-t4wxs" (OuterVolumeSpecName: "kube-api-access-t4wxs") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "kube-api-access-t4wxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.429685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.436025 4707 scope.go:117] "RemoveContainer" containerID="8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.436836 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-scripts" (OuterVolumeSpecName: "scripts") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.437263 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013\": container with ID starting with 8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013 not found: ID does not exist" containerID="8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.437293 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013"} err="failed to get container status \"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013\": rpc error: code = NotFound desc = could not find container \"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013\": container with ID starting with 8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.437317 4707 scope.go:117] "RemoveContainer" containerID="fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.440917 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1\": container with ID starting with fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1 not found: ID does not exist" containerID="fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.440958 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1"} err="failed to get container status \"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1\": rpc error: code = NotFound desc = could not find container \"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1\": container with ID starting with fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.440984 4707 scope.go:117] "RemoveContainer" containerID="2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.441402 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892\": container with ID starting with 2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892 not found: ID does not exist" containerID="2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.441484 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892"} err="failed to get container status \"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892\": rpc error: code = NotFound desc = could not find container \"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892\": container with ID starting with 2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892 
not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.441511 4707 scope.go:117] "RemoveContainer" containerID="8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.441921 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013"} err="failed to get container status \"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013\": rpc error: code = NotFound desc = could not find container \"8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013\": container with ID starting with 8468c7348aadc06975333a90894bb65ae5677f4fc54a5442ddbe58984e3eb013 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.441944 4707 scope.go:117] "RemoveContainer" containerID="fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.442276 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1"} err="failed to get container status \"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1\": rpc error: code = NotFound desc = could not find container \"fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1\": container with ID starting with fe41e0d65146cd4d0e29521639ad254ec1e2dac73168ae340f9f09d81e74b7f1 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.442294 4707 scope.go:117] "RemoveContainer" containerID="2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.442678 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892"} err="failed to get container status \"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892\": rpc error: code = NotFound desc = could not find container \"2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892\": container with ID starting with 2dcefbf1d09a56135cba734321cffd5db757fa348b1b3b63800f879d37dbc892 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.442696 4707 scope.go:117] "RemoveContainer" containerID="c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.451836 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.465882 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-6cf9-account-create-update-kbxc6"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.476106 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.489960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.493390 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496141 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496412 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5a0b6a-5b7d-4431-961c-984efc37166a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496421 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496431 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496439 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msp5\" (UniqueName: \"kubernetes.io/projected/7b5a0b6a-5b7d-4431-961c-984efc37166a-kube-api-access-5msp5\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496447 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496453 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496461 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496468 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4wxs\" (UniqueName: \"kubernetes.io/projected/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-kube-api-access-t4wxs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496493 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.496509 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.507998 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.512772 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-vpps7"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.513953 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.518235 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-4d50-account-create-update-vpps7"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.522019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.526066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-config-data" (OuterVolumeSpecName: "config-data") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.526321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.529437 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.531992 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-2tzhb"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.534500 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-2tzhb"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.553294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-config-data" (OuterVolumeSpecName: "config-data") pod "7b5a0b6a-5b7d-4431-961c-984efc37166a" (UID: "7b5a0b6a-5b7d-4431-961c-984efc37166a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.558966 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.559027 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.563084 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-cd48-account-create-update-dc7bk"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.575607 4707 scope.go:117] "RemoveContainer" containerID="84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.582195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" (UID: "0a1a2521-fdd5-4e7d-ad4f-8c4773c97360"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.597427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9t4l\" (UniqueName: \"kubernetes.io/projected/9f46d10f-704e-4815-8636-7d8dcb666e0c-kube-api-access-c9t4l\") pod \"9f46d10f-704e-4815-8636-7d8dcb666e0c\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.597696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts\") pod \"9f46d10f-704e-4815-8636-7d8dcb666e0c\" (UID: \"9f46d10f-704e-4815-8636-7d8dcb666e0c\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598486 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598509 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598523 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598534 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598543 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5a0b6a-5b7d-4431-961c-984efc37166a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598553 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.598979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f46d10f-704e-4815-8636-7d8dcb666e0c" (UID: "9f46d10f-704e-4815-8636-7d8dcb666e0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.601148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f46d10f-704e-4815-8636-7d8dcb666e0c-kube-api-access-c9t4l" (OuterVolumeSpecName: "kube-api-access-c9t4l") pod "9f46d10f-704e-4815-8636-7d8dcb666e0c" (UID: "9f46d10f-704e-4815-8636-7d8dcb666e0c"). InnerVolumeSpecName "kube-api-access-c9t4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.602313 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.613546 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-8e80-account-create-update-sdxm6"] Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.660151 4707 scope.go:117] "RemoveContainer" containerID="c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.661443 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517\": container with ID starting with c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517 not found: ID does not exist" containerID="c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.661480 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517"} err="failed to get container status \"c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517\": rpc error: code = NotFound desc = could not find container \"c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517\": container with ID starting with c63651a3aace6192a2e989c3a866bf507c0abe54c119655f2b09533e2c6b9517 not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.661505 4707 scope.go:117] "RemoveContainer" containerID="84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.661676 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.661844 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a\": container with ID starting with 84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a not found: ID does not exist" containerID="84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.661863 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a"} err="failed to get container status \"84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a\": rpc error: code = NotFound desc = could not find container \"84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a\": container with ID starting with 84abc4400973a539f8919cb1aabb84a49482ae49659c994aca5e4694bf15ac2a not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.661876 4707 scope.go:117] "RemoveContainer" containerID="a5cfda8d9c91d4e73ddfaa7ddcab6f6a580e4abf30eecf6674e01a38f42e5830" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.665451 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6sfx\" (UniqueName: \"kubernetes.io/projected/22fbbd00-3dc5-42f1-a015-24e2aac9be07-kube-api-access-t6sfx\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhzbp\" (UniqueName: \"kubernetes.io/projected/1fd72c11-1e49-4359-811c-f7a96288df7b-kube-api-access-zhzbp\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-combined-ca-bundle\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-internal-tls-certs\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-public-tls-certs\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-scripts\") pod 
\"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-internal-tls-certs\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22fbbd00-3dc5-42f1-a015-24e2aac9be07-logs\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-public-tls-certs\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data-custom\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-combined-ca-bundle\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fd72c11-1e49-4359-811c-f7a96288df7b-etc-machine-id\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-custom-prometheus-ca\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.700995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-config-data\") pod \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\" (UID: \"22fbbd00-3dc5-42f1-a015-24e2aac9be07\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.701435 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9t4l\" (UniqueName: \"kubernetes.io/projected/9f46d10f-704e-4815-8636-7d8dcb666e0c-kube-api-access-c9t4l\") on node \"crc\" DevicePath \"\"" Jan 21 
15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.701452 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f46d10f-704e-4815-8636-7d8dcb666e0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.705183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fbbd00-3dc5-42f1-a015-24e2aac9be07-logs" (OuterVolumeSpecName: "logs") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.706526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fd72c11-1e49-4359-811c-f7a96288df7b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.706866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22fbbd00-3dc5-42f1-a015-24e2aac9be07-kube-api-access-t6sfx" (OuterVolumeSpecName: "kube-api-access-t6sfx") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "kube-api-access-t6sfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.708661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd72c11-1e49-4359-811c-f7a96288df7b-kube-api-access-zhzbp" (OuterVolumeSpecName: "kube-api-access-zhzbp") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "kube-api-access-zhzbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.708686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-scripts" (OuterVolumeSpecName: "scripts") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.714339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.726663 4707 scope.go:117] "RemoveContainer" containerID="d150068cda003e5778b5b784c9314de19be0c8b0bff88efea482c9b70f8f5d2e" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.741977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.763094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.763152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.765059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.771530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data" (OuterVolumeSpecName: "config-data") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.774878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.776092 4707 scope.go:117] "RemoveContainer" containerID="6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e" Jan 21 15:34:36 crc kubenswrapper[4707]: E0121 15:34:36.776536 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e\": container with ID starting with 6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e not found: ID does not exist" containerID="6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.776567 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e"} err="failed to get container status \"6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e\": rpc error: code = NotFound desc = could not find container \"6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e\": container with ID starting with 6b3dd6b9fecffa6e18d652c5595ce94aaef291bcfec6a7f231c7dab9d400835e not found: ID does not exist" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.776584 4707 scope.go:117] "RemoveContainer" containerID="3936a2d7a7a92681b283ae06c8af878facd4800d2852311442ea58c67b2c19cc" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.781224 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.226:9696/\": dial tcp 10.217.1.226:9696: connect: connection refused" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.781400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-config-data" (OuterVolumeSpecName: "config-data") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.786441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "22fbbd00-3dc5-42f1-a015-24e2aac9be07" (UID: "22fbbd00-3dc5-42f1-a015-24e2aac9be07"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.795335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.796556 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.802666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd72c11-1e49-4359-811c-f7a96288df7b-logs\") pod \"1fd72c11-1e49-4359-811c-f7a96288df7b\" (UID: \"1fd72c11-1e49-4359-811c-f7a96288df7b\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.802978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd72c11-1e49-4359-811c-f7a96288df7b-logs" (OuterVolumeSpecName: "logs") pod "1fd72c11-1e49-4359-811c-f7a96288df7b" (UID: "1fd72c11-1e49-4359-811c-f7a96288df7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803272 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd72c11-1e49-4359-811c-f7a96288df7b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803296 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6sfx\" (UniqueName: \"kubernetes.io/projected/22fbbd00-3dc5-42f1-a015-24e2aac9be07-kube-api-access-t6sfx\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803308 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhzbp\" (UniqueName: \"kubernetes.io/projected/1fd72c11-1e49-4359-811c-f7a96288df7b-kube-api-access-zhzbp\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803319 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803327 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803335 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803343 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803351 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803358 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22fbbd00-3dc5-42f1-a015-24e2aac9be07-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803366 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803373 4707 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803383 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803391 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd72c11-1e49-4359-811c-f7a96288df7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803399 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fd72c11-1e49-4359-811c-f7a96288df7b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803408 4707 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.803415 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fbbd00-3dc5-42f1-a015-24e2aac9be07-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.904644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-config-data\") pod \"952e1893-2e97-4caa-91e1-5c6621395737\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.904725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-combined-ca-bundle\") pod \"952e1893-2e97-4caa-91e1-5c6621395737\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.904828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-nova-metadata-tls-certs\") pod \"952e1893-2e97-4caa-91e1-5c6621395737\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.904874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2rtv\" (UniqueName: \"kubernetes.io/projected/952e1893-2e97-4caa-91e1-5c6621395737-kube-api-access-x2rtv\") pod \"952e1893-2e97-4caa-91e1-5c6621395737\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.904917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e1893-2e97-4caa-91e1-5c6621395737-logs\") pod \"952e1893-2e97-4caa-91e1-5c6621395737\" (UID: \"952e1893-2e97-4caa-91e1-5c6621395737\") " Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.905502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/952e1893-2e97-4caa-91e1-5c6621395737-logs" (OuterVolumeSpecName: "logs") 
pod "952e1893-2e97-4caa-91e1-5c6621395737" (UID: "952e1893-2e97-4caa-91e1-5c6621395737"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.908842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952e1893-2e97-4caa-91e1-5c6621395737-kube-api-access-x2rtv" (OuterVolumeSpecName: "kube-api-access-x2rtv") pod "952e1893-2e97-4caa-91e1-5c6621395737" (UID: "952e1893-2e97-4caa-91e1-5c6621395737"). InnerVolumeSpecName "kube-api-access-x2rtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.926689 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "952e1893-2e97-4caa-91e1-5c6621395737" (UID: "952e1893-2e97-4caa-91e1-5c6621395737"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.937277 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-config-data" (OuterVolumeSpecName: "config-data") pod "952e1893-2e97-4caa-91e1-5c6621395737" (UID: "952e1893-2e97-4caa-91e1-5c6621395737"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.943442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "952e1893-2e97-4caa-91e1-5c6621395737" (UID: "952e1893-2e97-4caa-91e1-5c6621395737"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.951599 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:34:36 crc kubenswrapper[4707]: I0121 15:34:36.973056 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.006861 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-scripts\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.006913 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-internal-tls-certs\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.006994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-combined-ca-bundle\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a85cef69-0d30-4ed6-9c73-c169b3636f3a-logs\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-public-tls-certs\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-config-data\") pod \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2scc\" (UniqueName: \"kubernetes.io/projected/a85cef69-0d30-4ed6-9c73-c169b3636f3a-kube-api-access-h2scc\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-config-data\") pod \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\" (UID: \"a85cef69-0d30-4ed6-9c73-c169b3636f3a\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007619 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007634 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007644 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/952e1893-2e97-4caa-91e1-5c6621395737-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007654 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2rtv\" (UniqueName: \"kubernetes.io/projected/952e1893-2e97-4caa-91e1-5c6621395737-kube-api-access-x2rtv\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.007664 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e1893-2e97-4caa-91e1-5c6621395737-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.009265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-scripts" (OuterVolumeSpecName: "scripts") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.009856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85cef69-0d30-4ed6-9c73-c169b3636f3a-logs" (OuterVolumeSpecName: "logs") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.013698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85cef69-0d30-4ed6-9c73-c169b3636f3a-kube-api-access-h2scc" (OuterVolumeSpecName: "kube-api-access-h2scc") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "kube-api-access-h2scc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.042484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-config-data" (OuterVolumeSpecName: "config-data") pod "b4d7dc0f-9472-43f7-8f37-adb51c82f84e" (UID: "b4d7dc0f-9472-43f7-8f37-adb51c82f84e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.052148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-config-data" (OuterVolumeSpecName: "config-data") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.059894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.112753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.113619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvx5l\" (UniqueName: \"kubernetes.io/projected/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-kube-api-access-qvx5l\") pod \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.113680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-logs\") pod \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.113727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-public-tls-certs\") pod \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.113923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-internal-tls-certs\") pod \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.114008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-combined-ca-bundle\") pod \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\" (UID: \"b4d7dc0f-9472-43f7-8f37-adb51c82f84e\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-logs" (OuterVolumeSpecName: "logs") pod "b4d7dc0f-9472-43f7-8f37-adb51c82f84e" (UID: "b4d7dc0f-9472-43f7-8f37-adb51c82f84e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2scc\" (UniqueName: \"kubernetes.io/projected/a85cef69-0d30-4ed6-9c73-c169b3636f3a-kube-api-access-h2scc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115744 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115763 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115772 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115782 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115792 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a85cef69-0d30-4ed6-9c73-c169b3636f3a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115800 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.115825 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.116301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a85cef69-0d30-4ed6-9c73-c169b3636f3a" (UID: "a85cef69-0d30-4ed6-9c73-c169b3636f3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.118673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-kube-api-access-qvx5l" (OuterVolumeSpecName: "kube-api-access-qvx5l") pod "b4d7dc0f-9472-43f7-8f37-adb51c82f84e" (UID: "b4d7dc0f-9472-43f7-8f37-adb51c82f84e"). InnerVolumeSpecName "kube-api-access-qvx5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.141168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4d7dc0f-9472-43f7-8f37-adb51c82f84e" (UID: "b4d7dc0f-9472-43f7-8f37-adb51c82f84e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.179716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" event={"ID":"9f46d10f-704e-4815-8636-7d8dcb666e0c","Type":"ContainerDied","Data":"549986bc094cc8610df33182d898677b6394064b175c654646e1daa1f778e056"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.179859 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9339-account-create-update-8xfcb" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.184871 4707 generic.go:334] "Generic (PLEG): container finished" podID="952e1893-2e97-4caa-91e1-5c6621395737" containerID="ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.184909 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.186242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b4d7dc0f-9472-43f7-8f37-adb51c82f84e" (UID: "b4d7dc0f-9472-43f7-8f37-adb51c82f84e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.200067 4707 generic.go:334] "Generic (PLEG): container finished" podID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerID="9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.200191 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.202782 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a712880-3b94-4d6a-b137-bbe10fbf4ed4" path="/var/lib/kubelet/pods/1a712880-3b94-4d6a-b137-bbe10fbf4ed4/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.203476 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" path="/var/lib/kubelet/pods/27f0dcee-d9f1-4bcd-9084-402a6f132ddc/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.204106 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317f9a2c-fa7d-417b-9899-9bce822ffd74" path="/var/lib/kubelet/pods/317f9a2c-fa7d-417b-9899-9bce822ffd74/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.204654 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4904f4a7-c8e7-4d41-9254-32ccb097df02" path="/var/lib/kubelet/pods/4904f4a7-c8e7-4d41-9254-32ccb097df02/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.205578 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512bcb97-a919-42d4-8a18-67f62cbeecb6" path="/var/lib/kubelet/pods/512bcb97-a919-42d4-8a18-67f62cbeecb6/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.205971 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618755bc-3ad6-4081-9de4-d0f1f7c14020" path="/var/lib/kubelet/pods/618755bc-3ad6-4081-9de4-d0f1f7c14020/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.206507 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6c2ba2-268a-46f5-9ee1-93ff501d79de" path="/var/lib/kubelet/pods/6b6c2ba2-268a-46f5-9ee1-93ff501d79de/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.208542 4707 generic.go:334] "Generic (PLEG): container finished" podID="32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" containerID="8a845675965250cb9cb44e3ab713dc4f43ebdac5af17938f0005b1ca1e29b6e7" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.209585 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" path="/var/lib/kubelet/pods/75c78782-36f5-4ea6-9b94-0c38ad8ac8b2/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.210215 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84510089-d853-4404-9d37-9130c56ad8aa" path="/var/lib/kubelet/pods/84510089-d853-4404-9d37-9130c56ad8aa/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.210925 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933e570a-28bd-42d8-891f-0734578bc040" path="/var/lib/kubelet/pods/933e570a-28bd-42d8-891f-0734578bc040/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.211367 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1478145-b0f0-47c0-a3d1-d9bb50238e3b" path="/var/lib/kubelet/pods/c1478145-b0f0-47c0-a3d1-d9bb50238e3b/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.211684 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b620a6-bbfd-429f-bcbc-16235afb8fc5" path="/var/lib/kubelet/pods/e3b620a6-bbfd-429f-bcbc-16235afb8fc5/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.212744 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" 
path="/var/lib/kubelet/pods/e46f052f-cdf5-442f-9628-b1aacddca32d/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.214945 4707 generic.go:334] "Generic (PLEG): container finished" podID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerID="62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.215011 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.216752 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" path="/var/lib/kubelet/pods/f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b/volumes" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.219680 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85cef69-0d30-4ed6-9c73-c169b3636f3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.219705 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.219717 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvx5l\" (UniqueName: \"kubernetes.io/projected/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-kube-api-access-qvx5l\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.219727 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.224053 4707 generic.go:334] "Generic (PLEG): container finished" podID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerID="9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.224150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.238086 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerID="95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.238167 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.243659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b4d7dc0f-9472-43f7-8f37-adb51c82f84e" (UID: "b4d7dc0f-9472-43f7-8f37-adb51c82f84e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.248787 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.252498 4707 generic.go:334] "Generic (PLEG): container finished" podID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerID="0eabbf082b84f4e174b9daa40c33805625e90ebc29b3dfef4ea2017e10d6f5c0" exitCode=0 Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.255682 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"952e1893-2e97-4caa-91e1-5c6621395737","Type":"ContainerDied","Data":"ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"952e1893-2e97-4caa-91e1-5c6621395737","Type":"ContainerDied","Data":"9f63b6c580a02d7a47db45c4b560fb26239699dbcab6e104dacbc4511e91128e"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"1fd72c11-1e49-4359-811c-f7a96288df7b","Type":"ContainerDied","Data":"9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"1fd72c11-1e49-4359-811c-f7a96288df7b","Type":"ContainerDied","Data":"dc10bff20660ffdc16c6a79b7c00c6d62566b78e994dfe809bcac732dea7fa82"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" event={"ID":"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed","Type":"ContainerDied","Data":"8a845675965250cb9cb44e3ab713dc4f43ebdac5af17938f0005b1ca1e29b6e7"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" event={"ID":"a85cef69-0d30-4ed6-9c73-c169b3636f3a","Type":"ContainerDied","Data":"62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6ccd55b46b-7mqqb" event={"ID":"a85cef69-0d30-4ed6-9c73-c169b3636f3a","Type":"ContainerDied","Data":"5428a1a45696df7bc48859a81108b69da0b13364f64b946c410377962db7a57e"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"22fbbd00-3dc5-42f1-a015-24e2aac9be07","Type":"ContainerDied","Data":"9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"22fbbd00-3dc5-42f1-a015-24e2aac9be07","Type":"ContainerDied","Data":"7f3ebdec4e685d493529eb54ba646508de7eb1d1971cd20244c84cf17f9c0f4d"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"b4d7dc0f-9472-43f7-8f37-adb51c82f84e","Type":"ContainerDied","Data":"95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b4d7dc0f-9472-43f7-8f37-adb51c82f84e","Type":"ContainerDied","Data":"bbf921ac27aaf3a7f162f8de1af4bf9ce73a971377aac2ab5423750b82f0183d"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.305995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0a1a2521-fdd5-4e7d-ad4f-8c4773c97360","Type":"ContainerDied","Data":"47ac127f3b5cc10e7c294ce42c9f26d5ecf141b91f6b7b707cd93e19f2a24533"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.306008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" event={"ID":"de3b367f-2f6e-4ab7-af9b-5390359d3140","Type":"ContainerDied","Data":"0eabbf082b84f4e174b9daa40c33805625e90ebc29b3dfef4ea2017e10d6f5c0"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.306021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7b5a0b6a-5b7d-4431-961c-984efc37166a","Type":"ContainerDied","Data":"067e532ebd8f036c470c5d6529c752eaf43c6d10357df88f3560830645434aa2"} Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.306347 4707 scope.go:117] "RemoveContainer" containerID="ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.321938 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d7dc0f-9472-43f7-8f37-adb51c82f84e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.329694 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.347904 4707 scope.go:117] "RemoveContainer" containerID="694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.366277 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.379438 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.381684 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.404720 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-8xfcb"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.410406 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-9339-account-create-update-8xfcb"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.416025 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-scripts\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-combined-ca-bundle\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-public-tls-certs\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t767x\" (UniqueName: \"kubernetes.io/projected/de3b367f-2f6e-4ab7-af9b-5390359d3140-kube-api-access-t767x\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-internal-tls-certs\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de3b367f-2f6e-4ab7-af9b-5390359d3140-logs\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-config-data\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data-custom\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-public-tls-certs\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-credential-keys\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm7vg\" (UniqueName: \"kubernetes.io/projected/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-kube-api-access-jm7vg\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-internal-tls-certs\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.425945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-fernet-keys\") pod \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\" (UID: \"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.426029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-combined-ca-bundle\") pod \"de3b367f-2f6e-4ab7-af9b-5390359d3140\" (UID: \"de3b367f-2f6e-4ab7-af9b-5390359d3140\") " Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.428911 4707 scope.go:117] "RemoveContainer" containerID="ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.433007 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3\": container with ID starting with ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3 not found: ID does not exist" containerID="ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.433046 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3"} err="failed to get container status \"ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3\": rpc error: code = NotFound desc = could not find container \"ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3\": container with ID starting with 
ca5372ab2f2ec1473e7806ad9cc0235785ae8474e141b26c077991eab6897fe3 not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.433072 4707 scope.go:117] "RemoveContainer" containerID="694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.433802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-scripts" (OuterVolumeSpecName: "scripts") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.433949 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39\": container with ID starting with 694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39 not found: ID does not exist" containerID="694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.433975 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39"} err="failed to get container status \"694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39\": rpc error: code = NotFound desc = could not find container \"694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39\": container with ID starting with 694058e7aa38ece34191d1f0704e41226f5c828a2e471b743ac064e2165a5d39 not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.433995 4707 scope.go:117] "RemoveContainer" containerID="9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.435569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.437374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.437976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3b367f-2f6e-4ab7-af9b-5390359d3140-logs" (OuterVolumeSpecName: "logs") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.439409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.439856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3b367f-2f6e-4ab7-af9b-5390359d3140-kube-api-access-t767x" (OuterVolumeSpecName: "kube-api-access-t767x") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "kube-api-access-t767x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.441441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-kube-api-access-jm7vg" (OuterVolumeSpecName: "kube-api-access-jm7vg") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "kube-api-access-jm7vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.459274 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.467955 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6ccd55b46b-7mqqb"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.479887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.480915 4707 scope.go:117] "RemoveContainer" containerID="035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.484597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-config-data" (OuterVolumeSpecName: "config-data") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.490965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.492050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.501983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" (UID: "32bcbfab-0b0d-49c1-84ab-6a773c2a88ed"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.505879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.510539 4707 scope.go:117] "RemoveContainer" containerID="9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.511840 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6ccd55b46b-7mqqb"] Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.512707 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab\": container with ID starting with 9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab not found: ID does not exist" containerID="9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.512738 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab"} err="failed to get container status \"9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab\": rpc error: code = NotFound desc = could not find container \"9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab\": container with ID starting with 9c54c5b02add93db4adf7feb4e7b982a59b5e8ee6708eae33d12a8842093dfab not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.512757 4707 scope.go:117] "RemoveContainer" containerID="035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.512902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.513305 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57\": container with ID starting with 035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57 not found: ID does not exist" containerID="035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.513335 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57"} err="failed to get container status \"035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57\": rpc error: code = NotFound desc = could not find container \"035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57\": container with ID starting with 035408a4ca718bcbacc19a3cae224e7594563abf90ec28c16560317b072abc57 not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.513349 4707 scope.go:117] "RemoveContainer" containerID="62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.519887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.524865 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.527300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528209 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de3b367f-2f6e-4ab7-af9b-5390359d3140-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528232 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528243 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528251 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528261 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528271 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm7vg\" (UniqueName: \"kubernetes.io/projected/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-kube-api-access-jm7vg\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528279 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528286 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528294 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528301 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528308 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528316 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528323 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t767x\" (UniqueName: \"kubernetes.io/projected/de3b367f-2f6e-4ab7-af9b-5390359d3140-kube-api-access-t767x\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.528332 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.530920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.533393 4707 scope.go:117] "RemoveContainer" containerID="5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.534730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data" (OuterVolumeSpecName: "config-data") pod "de3b367f-2f6e-4ab7-af9b-5390359d3140" (UID: "de3b367f-2f6e-4ab7-af9b-5390359d3140"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.535227 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.538675 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.554451 4707 scope.go:117] "RemoveContainer" containerID="62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.554761 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c\": container with ID starting with 62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c not found: ID does not exist" containerID="62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.554801 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c"} err="failed to get container status \"62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c\": rpc error: code = NotFound desc = could not find container \"62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c\": container with ID starting with 62a306c3ecda3c4b612fccac8f9276d2b3b2d1c5a3e65752cdfbf1f8a7dfb15c not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.554847 4707 scope.go:117] "RemoveContainer" containerID="5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.555107 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46\": container with ID starting with 5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46 not found: ID does not exist" containerID="5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.555132 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46"} err="failed to get container status \"5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46\": rpc error: code = NotFound desc = could not find container \"5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46\": container with ID starting with 5c6fd5d13591d0f6840a9793f73e32cd26c5d483ea26eb31eefab370607eac46 not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.555144 4707 scope.go:117] "RemoveContainer" containerID="9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.573645 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.577604 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.591620 4707 scope.go:117] "RemoveContainer" containerID="b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d" Jan 21 15:34:37 crc 
kubenswrapper[4707]: I0121 15:34:37.606448 4707 scope.go:117] "RemoveContainer" containerID="9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.606752 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb\": container with ID starting with 9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb not found: ID does not exist" containerID="9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.606784 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb"} err="failed to get container status \"9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb\": rpc error: code = NotFound desc = could not find container \"9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb\": container with ID starting with 9545fefb272354d72f646206fd68d0561381cfa22ca27ea8f27fb076095456bb not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.606805 4707 scope.go:117] "RemoveContainer" containerID="b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.607076 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d\": container with ID starting with b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d not found: ID does not exist" containerID="b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.607111 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d"} err="failed to get container status \"b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d\": rpc error: code = NotFound desc = could not find container \"b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d\": container with ID starting with b4876a0cea92e46f894d6b7c8d3b86ad4e7ed2bcf316f2f5a5a9b0228029da7d not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.607136 4707 scope.go:117] "RemoveContainer" containerID="95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.626907 4707 scope.go:117] "RemoveContainer" containerID="e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.630599 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b367f-2f6e-4ab7-af9b-5390359d3140-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.640916 4707 scope.go:117] "RemoveContainer" containerID="95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.641357 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e\": container with ID starting with 
95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e not found: ID does not exist" containerID="95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.641402 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e"} err="failed to get container status \"95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e\": rpc error: code = NotFound desc = could not find container \"95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e\": container with ID starting with 95be451de58fd012fc8c8f8b73b9b724141056c33bc62e5637ece521baebb16e not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.641421 4707 scope.go:117] "RemoveContainer" containerID="e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.641659 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd\": container with ID starting with e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd not found: ID does not exist" containerID="e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.641688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd"} err="failed to get container status \"e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd\": rpc error: code = NotFound desc = could not find container \"e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd\": container with ID starting with e9beb1143894b9034877ec01e39544525017778aa737f5e495ec46d49338aafd not found: ID does not exist" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.641705 4707 scope.go:117] "RemoveContainer" containerID="b02f577e02d6cede6688dd7b75593a1e9627c1ae835ea04f9d891f4fea9be490" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.656354 4707 scope.go:117] "RemoveContainer" containerID="a33bcbf9b6be5d2e1482cb5138bc02de1e5c5fd932db478a845ca93ad2949f3b" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.669083 4707 scope.go:117] "RemoveContainer" containerID="73da13975b9a0fea75976713ffc06ad0487d928c65d1dd4b7b9f66279d719295" Jan 21 15:34:37 crc kubenswrapper[4707]: I0121 15:34:37.684507 4707 scope.go:117] "RemoveContainer" containerID="2f3a47f92a10db39e366cd34b4513029b880fd9122a25ff4919afeff80052643" Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.754710 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.755928 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 
15:34:37.757216 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:37 crc kubenswrapper[4707]: E0121 15:34:37.757274 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:34:38 crc kubenswrapper[4707]: E0121 15:34:38.237983 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:34:38 crc kubenswrapper[4707]: E0121 15:34:38.238052 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data podName:c1a47423-6eba-4cf7-ab2d-db338f4f28a6 nodeName:}" failed. No retries permitted until 2026-01-21 15:34:46.238034578 +0000 UTC m=+1983.419550799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data") pod "rabbitmq-server-0" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6") : configmap "rabbitmq-config-data" not found Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.265103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" event={"ID":"de3b367f-2f6e-4ab7-af9b-5390359d3140","Type":"ContainerDied","Data":"c87c0a6abff6f08bad44ecfdc1aecf62544eeeb17c94e85c8cc600145d952048"} Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.265123 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.265157 4707 scope.go:117] "RemoveContainer" containerID="0eabbf082b84f4e174b9daa40c33805625e90ebc29b3dfef4ea2017e10d6f5c0" Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.270782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" event={"ID":"32bcbfab-0b0d-49c1-84ab-6a773c2a88ed","Type":"ContainerDied","Data":"42be8515c1468a13f580b0b6a2ff8764b5cd99e2e34240e7dc95492be18ffdb6"} Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.270865 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6885874467-qknsk" Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.289803 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j"] Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.293699 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j"] Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.296105 4707 scope.go:117] "RemoveContainer" containerID="2f689f4a8b3655a649e060bcd6e63b5d3f7054c1d19adef9d2866c4790ef6ad2" Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.307147 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6885874467-qknsk"] Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.309093 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6885874467-qknsk"] Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.312972 4707 scope.go:117] "RemoveContainer" containerID="8a845675965250cb9cb44e3ab713dc4f43ebdac5af17938f0005b1ca1e29b6e7" Jan 21 15:34:38 crc kubenswrapper[4707]: I0121 15:34:38.476401 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.169:5671: connect: connection refused" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.084529 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.158253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcg9h\" (UniqueName: \"kubernetes.io/projected/12bfd31d-1d19-4286-87f6-83397fa62338-kube-api-access-bcg9h\") pod \"12bfd31d-1d19-4286-87f6-83397fa62338\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.158374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data\") pod \"12bfd31d-1d19-4286-87f6-83397fa62338\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.158518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-combined-ca-bundle\") pod \"12bfd31d-1d19-4286-87f6-83397fa62338\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.158568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data-custom\") pod \"12bfd31d-1d19-4286-87f6-83397fa62338\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.158627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bfd31d-1d19-4286-87f6-83397fa62338-logs\") pod \"12bfd31d-1d19-4286-87f6-83397fa62338\" (UID: \"12bfd31d-1d19-4286-87f6-83397fa62338\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.159130 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12bfd31d-1d19-4286-87f6-83397fa62338-logs" (OuterVolumeSpecName: "logs") pod "12bfd31d-1d19-4286-87f6-83397fa62338" (UID: "12bfd31d-1d19-4286-87f6-83397fa62338"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.163027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "12bfd31d-1d19-4286-87f6-83397fa62338" (UID: "12bfd31d-1d19-4286-87f6-83397fa62338"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.163035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bfd31d-1d19-4286-87f6-83397fa62338-kube-api-access-bcg9h" (OuterVolumeSpecName: "kube-api-access-bcg9h") pod "12bfd31d-1d19-4286-87f6-83397fa62338" (UID: "12bfd31d-1d19-4286-87f6-83397fa62338"). InnerVolumeSpecName "kube-api-access-bcg9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.187957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12bfd31d-1d19-4286-87f6-83397fa62338" (UID: "12bfd31d-1d19-4286-87f6-83397fa62338"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.188864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data" (OuterVolumeSpecName: "config-data") pod "12bfd31d-1d19-4286-87f6-83397fa62338" (UID: "12bfd31d-1d19-4286-87f6-83397fa62338"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.189728 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" path="/var/lib/kubelet/pods/0a1a2521-fdd5-4e7d-ad4f-8c4773c97360/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.190428 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" path="/var/lib/kubelet/pods/1fd72c11-1e49-4359-811c-f7a96288df7b/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.190992 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" path="/var/lib/kubelet/pods/22fbbd00-3dc5-42f1-a015-24e2aac9be07/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.194561 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" path="/var/lib/kubelet/pods/32bcbfab-0b0d-49c1-84ab-6a773c2a88ed/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.195073 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" path="/var/lib/kubelet/pods/7b5a0b6a-5b7d-4431-961c-984efc37166a/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.196921 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952e1893-2e97-4caa-91e1-5c6621395737" path="/var/lib/kubelet/pods/952e1893-2e97-4caa-91e1-5c6621395737/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.197788 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f46d10f-704e-4815-8636-7d8dcb666e0c" path="/var/lib/kubelet/pods/9f46d10f-704e-4815-8636-7d8dcb666e0c/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.198300 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" path="/var/lib/kubelet/pods/a85cef69-0d30-4ed6-9c73-c169b3636f3a/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.198863 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" path="/var/lib/kubelet/pods/b4d7dc0f-9472-43f7-8f37-adb51c82f84e/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.200043 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" path="/var/lib/kubelet/pods/de3b367f-2f6e-4ab7-af9b-5390359d3140/volumes" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.221006 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.260150 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.260178 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.260188 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bfd31d-1d19-4286-87f6-83397fa62338-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.260202 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcg9h\" (UniqueName: \"kubernetes.io/projected/12bfd31d-1d19-4286-87f6-83397fa62338-kube-api-access-bcg9h\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.260213 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bfd31d-1d19-4286-87f6-83397fa62338-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.280967 4707 generic.go:334] "Generic (PLEG): container finished" podID="12bfd31d-1d19-4286-87f6-83397fa62338" containerID="ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7" exitCode=0 Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.281027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" event={"ID":"12bfd31d-1d19-4286-87f6-83397fa62338","Type":"ContainerDied","Data":"ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7"} Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.281037 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.281133 4707 scope.go:117] "RemoveContainer" containerID="ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.281289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb" event={"ID":"12bfd31d-1d19-4286-87f6-83397fa62338","Type":"ContainerDied","Data":"49a2054eafba79b9a86349076a67bc4a52c0675f0324064f5d58e27ba10275a0"} Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.284688 4707 generic.go:334] "Generic (PLEG): container finished" podID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerID="220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96" exitCode=0 Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.284724 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.284748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" event={"ID":"cbc34703-f7b2-455d-b4a6-4865a4c90f45","Type":"ContainerDied","Data":"220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96"} Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.284771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt" event={"ID":"cbc34703-f7b2-455d-b4a6-4865a4c90f45","Type":"ContainerDied","Data":"2efb102eaf13375814dfd29d9c25bfcf44141dcbf6141b4b5cf3ad5313b9f2e2"} Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.301268 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb"] Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.303500 4707 scope.go:117] "RemoveContainer" containerID="d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.305229 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7d44c8db94-rqgvb"] Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.319778 4707 scope.go:117] "RemoveContainer" containerID="ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7" Jan 21 15:34:39 crc kubenswrapper[4707]: E0121 15:34:39.320093 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7\": container with ID starting with ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7 not found: ID does not exist" containerID="ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.320135 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7"} err="failed to get container status \"ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7\": rpc error: code = NotFound desc = could not find container \"ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7\": container with ID starting with ca6d05980db7eec4bd101722e9c7756f1d4001d2279dfaf59e09e8738cc7abe7 not found: ID does not exist" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.320151 4707 scope.go:117] "RemoveContainer" containerID="d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072" Jan 21 15:34:39 crc kubenswrapper[4707]: E0121 15:34:39.320480 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072\": container with ID starting with d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072 not found: ID does not exist" containerID="d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.320498 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072"} err="failed to get container status \"d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072\": rpc error: code = NotFound desc = 
could not find container \"d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072\": container with ID starting with d6a4bd97d57f43d0049f7412a5628aa436c4ccc74eabe8aea1569fa5077d9072 not found: ID does not exist" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.320529 4707 scope.go:117] "RemoveContainer" containerID="220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.353574 4707 scope.go:117] "RemoveContainer" containerID="266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.361669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-combined-ca-bundle\") pod \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.361732 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hnhh\" (UniqueName: \"kubernetes.io/projected/cbc34703-f7b2-455d-b4a6-4865a4c90f45-kube-api-access-7hnhh\") pod \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.361888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc34703-f7b2-455d-b4a6-4865a4c90f45-logs\") pod \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.361938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data\") pod \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.361976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data-custom\") pod \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\" (UID: \"cbc34703-f7b2-455d-b4a6-4865a4c90f45\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.365092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc34703-f7b2-455d-b4a6-4865a4c90f45-logs" (OuterVolumeSpecName: "logs") pod "cbc34703-f7b2-455d-b4a6-4865a4c90f45" (UID: "cbc34703-f7b2-455d-b4a6-4865a4c90f45"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.373926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cbc34703-f7b2-455d-b4a6-4865a4c90f45" (UID: "cbc34703-f7b2-455d-b4a6-4865a4c90f45"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.377981 4707 scope.go:117] "RemoveContainer" containerID="220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.388951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc34703-f7b2-455d-b4a6-4865a4c90f45-kube-api-access-7hnhh" (OuterVolumeSpecName: "kube-api-access-7hnhh") pod "cbc34703-f7b2-455d-b4a6-4865a4c90f45" (UID: "cbc34703-f7b2-455d-b4a6-4865a4c90f45"). InnerVolumeSpecName "kube-api-access-7hnhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: E0121 15:34:39.388979 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96\": container with ID starting with 220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96 not found: ID does not exist" containerID="220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.389019 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96"} err="failed to get container status \"220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96\": rpc error: code = NotFound desc = could not find container \"220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96\": container with ID starting with 220af5305c12eba16746788e21f20388976f870243747f3ff08d67a120c9cc96 not found: ID does not exist" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.389043 4707 scope.go:117] "RemoveContainer" containerID="266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e" Jan 21 15:34:39 crc kubenswrapper[4707]: E0121 15:34:39.390930 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e\": container with ID starting with 266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e not found: ID does not exist" containerID="266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.390959 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e"} err="failed to get container status \"266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e\": rpc error: code = NotFound desc = could not find container \"266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e\": container with ID starting with 266e1e95969914fbfacbc6eecbd4687a66599ab7220840787f075f7860280f8e not found: ID does not exist" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.395986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc34703-f7b2-455d-b4a6-4865a4c90f45" (UID: "cbc34703-f7b2-455d-b4a6-4865a4c90f45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.406000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data" (OuterVolumeSpecName: "config-data") pod "cbc34703-f7b2-455d-b4a6-4865a4c90f45" (UID: "cbc34703-f7b2-455d-b4a6-4865a4c90f45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.463582 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc34703-f7b2-455d-b4a6-4865a4c90f45-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.463766 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.463843 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.463923 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc34703-f7b2-455d-b4a6-4865a4c90f45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.463972 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hnhh\" (UniqueName: \"kubernetes.io/projected/cbc34703-f7b2-455d-b4a6-4865a4c90f45-kube-api-access-7hnhh\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.621126 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt"] Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.630258 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7f55fcd6b9-77npt"] Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.810042 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmz8r\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-kube-api-access-wmz8r\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-pod-info\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-erlang-cookie-secret\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-tls\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-confd\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-server-conf\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870277 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-erlang-cookie\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-plugins\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: 
\"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.870389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-plugins-conf\") pod \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\" (UID: \"c1a47423-6eba-4cf7-ab2d-db338f4f28a6\") " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.871020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.871288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.871607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.872880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.873266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-pod-info" (OuterVolumeSpecName: "pod-info") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.873686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "persistence") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.874248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-kube-api-access-wmz8r" (OuterVolumeSpecName: "kube-api-access-wmz8r") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "kube-api-access-wmz8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.875038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.888546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data" (OuterVolumeSpecName: "config-data") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.902966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-server-conf" (OuterVolumeSpecName: "server-conf") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.941513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c1a47423-6eba-4cf7-ab2d-db338f4f28a6" (UID: "c1a47423-6eba-4cf7-ab2d-db338f4f28a6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972227 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972258 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972267 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972298 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972308 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972315 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972323 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972330 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmz8r\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-kube-api-access-wmz8r\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972339 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972346 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.972354 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1a47423-6eba-4cf7-ab2d-db338f4f28a6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:39 crc kubenswrapper[4707]: I0121 15:34:39.985547 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.073160 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.304990 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerID="23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce" exitCode=0 Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.305075 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.305103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"c1a47423-6eba-4cf7-ab2d-db338f4f28a6","Type":"ContainerDied","Data":"23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce"} Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.305169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"c1a47423-6eba-4cf7-ab2d-db338f4f28a6","Type":"ContainerDied","Data":"395074a19a86cfe868b78ac61c2facf68e32e71f3a65253b963a8750379a1c39"} Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.305195 4707 scope.go:117] "RemoveContainer" containerID="23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.329574 4707 scope.go:117] "RemoveContainer" containerID="a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.332418 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.339006 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.359008 4707 scope.go:117] "RemoveContainer" containerID="23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce" Jan 21 15:34:40 crc kubenswrapper[4707]: E0121 15:34:40.359421 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce\": container with ID starting with 23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce not found: ID does not exist" containerID="23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.359531 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce"} err="failed to get container status \"23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce\": rpc error: code = NotFound desc = could not find container \"23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce\": container with ID starting with 23c2323aa2913ad8f8bf8074d85207fb90051b821cc3767d2e085ef0ac8f84ce not found: ID does not exist" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.359957 4707 scope.go:117] "RemoveContainer" containerID="a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d" Jan 21 15:34:40 crc kubenswrapper[4707]: E0121 15:34:40.361717 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d\": container with ID starting with a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d not found: ID does not exist" containerID="a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d" Jan 21 15:34:40 crc kubenswrapper[4707]: I0121 15:34:40.361775 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d"} err="failed to get container status 
\"a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d\": rpc error: code = NotFound desc = could not find container \"a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d\": container with ID starting with a544913242b59a9d5a591a21dddd2202bd87aa137d70b09b114cf22f7e41eb9d not found: ID does not exist" Jan 21 15:34:40 crc kubenswrapper[4707]: E0121 15:34:40.768595 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:40 crc kubenswrapper[4707]: E0121 15:34:40.770641 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:40 crc kubenswrapper[4707]: E0121 15:34:40.777896 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:40 crc kubenswrapper[4707]: E0121 15:34:40.777964 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:34:41 crc kubenswrapper[4707]: I0121 15:34:41.190121 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" path="/var/lib/kubelet/pods/12bfd31d-1d19-4286-87f6-83397fa62338/volumes" Jan 21 15:34:41 crc kubenswrapper[4707]: I0121 15:34:41.191047 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" path="/var/lib/kubelet/pods/c1a47423-6eba-4cf7-ab2d-db338f4f28a6/volumes" Jan 21 15:34:41 crc kubenswrapper[4707]: I0121 15:34:41.191662 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" path="/var/lib/kubelet/pods/cbc34703-f7b2-455d-b4a6-4865a4c90f45/volumes" Jan 21 15:34:42 crc kubenswrapper[4707]: I0121 15:34:42.242735 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.227:9311/healthcheck\": dial tcp 10.217.1.227:9311: i/o timeout" Jan 21 15:34:42 crc kubenswrapper[4707]: I0121 15:34:42.242833 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-696bc54df8-zwm2j" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.227:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:34:42 crc kubenswrapper[4707]: E0121 15:34:42.755728 4707 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:42 crc kubenswrapper[4707]: E0121 15:34:42.757028 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:42 crc kubenswrapper[4707]: E0121 15:34:42.758171 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:42 crc kubenswrapper[4707]: E0121 15:34:42.758233 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:34:42 crc kubenswrapper[4707]: I0121 15:34:42.981974 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.027483 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn"] Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.027690 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerName="dnsmasq-dns" containerID="cri-o://9cb3ce555c3c18849f88cb950839ddcad6eacdd2264d87d1e1ee4ea3ccfb67eb" gracePeriod=10 Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.332487 4707 generic.go:334] "Generic (PLEG): container finished" podID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerID="9cb3ce555c3c18849f88cb950839ddcad6eacdd2264d87d1e1ee4ea3ccfb67eb" exitCode=0 Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.332548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" event={"ID":"91bd3c13-ed9b-4929-9c5d-3d650badf3e9","Type":"ContainerDied","Data":"9cb3ce555c3c18849f88cb950839ddcad6eacdd2264d87d1e1ee4ea3ccfb67eb"} Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.388833 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.519942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dns-swift-storage-0\") pod \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.520026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dnsmasq-svc\") pod \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.520090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-config\") pod \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.520113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjbg8\" (UniqueName: \"kubernetes.io/projected/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-kube-api-access-wjbg8\") pod \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\" (UID: \"91bd3c13-ed9b-4929-9c5d-3d650badf3e9\") " Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.534420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-kube-api-access-wjbg8" (OuterVolumeSpecName: "kube-api-access-wjbg8") pod "91bd3c13-ed9b-4929-9c5d-3d650badf3e9" (UID: "91bd3c13-ed9b-4929-9c5d-3d650badf3e9"). InnerVolumeSpecName "kube-api-access-wjbg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.547055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "91bd3c13-ed9b-4929-9c5d-3d650badf3e9" (UID: "91bd3c13-ed9b-4929-9c5d-3d650badf3e9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.552127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "91bd3c13-ed9b-4929-9c5d-3d650badf3e9" (UID: "91bd3c13-ed9b-4929-9c5d-3d650badf3e9"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.554221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-config" (OuterVolumeSpecName: "config") pod "91bd3c13-ed9b-4929-9c5d-3d650badf3e9" (UID: "91bd3c13-ed9b-4929-9c5d-3d650badf3e9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.621723 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.621749 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.621758 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:43 crc kubenswrapper[4707]: I0121 15:34:43.621766 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjbg8\" (UniqueName: \"kubernetes.io/projected/91bd3c13-ed9b-4929-9c5d-3d650badf3e9-kube-api-access-wjbg8\") on node \"crc\" DevicePath \"\"" Jan 21 15:34:44 crc kubenswrapper[4707]: I0121 15:34:44.340061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" event={"ID":"91bd3c13-ed9b-4929-9c5d-3d650badf3e9","Type":"ContainerDied","Data":"e079037a2fecb2ddda6a612ee1b3f2b5acb044b51b0df30dde88676cc5c0c267"} Jan 21 15:34:44 crc kubenswrapper[4707]: I0121 15:34:44.340102 4707 scope.go:117] "RemoveContainer" containerID="9cb3ce555c3c18849f88cb950839ddcad6eacdd2264d87d1e1ee4ea3ccfb67eb" Jan 21 15:34:44 crc kubenswrapper[4707]: I0121 15:34:44.340209 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn" Jan 21 15:34:44 crc kubenswrapper[4707]: I0121 15:34:44.364066 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn"] Jan 21 15:34:44 crc kubenswrapper[4707]: I0121 15:34:44.366055 4707 scope.go:117] "RemoveContainer" containerID="b19a11ff1e744d04142c11ec5127bbafe457bae5474e682a369e5bab0699a6ac" Jan 21 15:34:44 crc kubenswrapper[4707]: I0121 15:34:44.369720 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859b49f69-4jtmn"] Jan 21 15:34:45 crc kubenswrapper[4707]: I0121 15:34:45.189292 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" path="/var/lib/kubelet/pods/91bd3c13-ed9b-4929-9c5d-3d650badf3e9/volumes" Jan 21 15:34:45 crc kubenswrapper[4707]: E0121 15:34:45.768100 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:45 crc kubenswrapper[4707]: E0121 15:34:45.769152 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:45 crc kubenswrapper[4707]: E0121 15:34:45.770028 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code 
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:45 crc kubenswrapper[4707]: E0121 15:34:45.770055 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:34:46 crc kubenswrapper[4707]: I0121 15:34:46.182865 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:34:46 crc kubenswrapper[4707]: I0121 15:34:46.375772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"3d5e87bd6abf7a6178fdaecd77230a20522c0d8576400f05ca4f9dfd50c50ef1"} Jan 21 15:34:47 crc kubenswrapper[4707]: E0121 15:34:47.754501 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:47 crc kubenswrapper[4707]: E0121 15:34:47.756378 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:47 crc kubenswrapper[4707]: E0121 15:34:47.757399 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:47 crc kubenswrapper[4707]: E0121 15:34:47.757436 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:34:50 crc kubenswrapper[4707]: E0121 15:34:50.768537 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:50 crc kubenswrapper[4707]: E0121 15:34:50.769913 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:50 crc 
kubenswrapper[4707]: E0121 15:34:50.770925 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:50 crc kubenswrapper[4707]: E0121 15:34:50.770955 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:34:52 crc kubenswrapper[4707]: E0121 15:34:52.754425 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:52 crc kubenswrapper[4707]: E0121 15:34:52.756059 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:52 crc kubenswrapper[4707]: E0121 15:34:52.756928 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:52 crc kubenswrapper[4707]: E0121 15:34:52.756960 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:34:54 crc kubenswrapper[4707]: I0121 15:34:54.290161 4707 scope.go:117] "RemoveContainer" containerID="9301f08dba6fe0925acba1d6c85e32b4cc1ed0d37ee9a2c6412884e1b955f53d" Jan 21 15:34:54 crc kubenswrapper[4707]: I0121 15:34:54.317559 4707 scope.go:117] "RemoveContainer" containerID="4b415b2394f6a6b1a0194164dcb9f6acb2393c4d7e63cbac8c3202ab06d0df36" Jan 21 15:34:54 crc kubenswrapper[4707]: I0121 15:34:54.347393 4707 scope.go:117] "RemoveContainer" containerID="3db19e9379751f044b218011cb0a5dd0bb55284281358d73c960c33d0fdb73c9" Jan 21 15:34:55 crc kubenswrapper[4707]: E0121 15:34:55.768118 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:55 crc kubenswrapper[4707]: E0121 15:34:55.769313 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:55 crc kubenswrapper[4707]: E0121 15:34:55.770401 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:34:55 crc kubenswrapper[4707]: E0121 15:34:55.770436 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:34:57 crc kubenswrapper[4707]: I0121 15:34:57.291138 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.94:3000/\": dial tcp 10.217.0.94:3000: connect: connection refused" Jan 21 15:34:57 crc kubenswrapper[4707]: E0121 15:34:57.754488 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:57 crc kubenswrapper[4707]: E0121 15:34:57.755594 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:57 crc kubenswrapper[4707]: E0121 15:34:57.756617 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:34:57 crc kubenswrapper[4707]: E0121 15:34:57.756661 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:35:00 crc kubenswrapper[4707]: E0121 15:35:00.768551 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:35:00 crc kubenswrapper[4707]: E0121 15:35:00.769609 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:35:00 crc kubenswrapper[4707]: E0121 15:35:00.770382 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:35:00 crc kubenswrapper[4707]: E0121 15:35:00.770409 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.474756 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-78749c84c4-t8xcs_27a7cc38-e5af-494e-880c-45e5278e927a/neutron-api/0.log" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.474921 4707 generic.go:334] "Generic (PLEG): container finished" podID="27a7cc38-e5af-494e-880c-45e5278e927a" containerID="f13d26d52d87a08e099fcd0268946a2ae4f7db7d6a421b118ad9da2d50e25f98" exitCode=137 Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.474946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" event={"ID":"27a7cc38-e5af-494e-880c-45e5278e927a","Type":"ContainerDied","Data":"f13d26d52d87a08e099fcd0268946a2ae4f7db7d6a421b118ad9da2d50e25f98"} Jan 21 15:35:02 crc kubenswrapper[4707]: E0121 15:35:02.634892 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d1f2b5b_f47a_496d_ba02_5299848b53a5.slice/crio-abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d1f2b5b_f47a_496d_ba02_5299848b53a5.slice/crio-conmon-abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.692586 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-78749c84c4-t8xcs_27a7cc38-e5af-494e-880c-45e5278e927a/neutron-api/0.log" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.692648 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:35:02 crc kubenswrapper[4707]: E0121 15:35:02.754929 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:35:02 crc kubenswrapper[4707]: E0121 15:35:02.756086 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:35:02 crc kubenswrapper[4707]: E0121 15:35:02.757141 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:35:02 crc kubenswrapper[4707]: E0121 15:35:02.757175 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-ovndb-tls-certs\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-public-tls-certs\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-config\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffsr2\" (UniqueName: \"kubernetes.io/projected/27a7cc38-e5af-494e-880c-45e5278e927a-kube-api-access-ffsr2\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767602 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-httpd-config\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-internal-tls-certs\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.767674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-combined-ca-bundle\") pod \"27a7cc38-e5af-494e-880c-45e5278e927a\" (UID: \"27a7cc38-e5af-494e-880c-45e5278e927a\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.772167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a7cc38-e5af-494e-880c-45e5278e927a-kube-api-access-ffsr2" (OuterVolumeSpecName: "kube-api-access-ffsr2") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "kube-api-access-ffsr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.773825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.799823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-config" (OuterVolumeSpecName: "config") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.806041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.808145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.811109 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.811907 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.813351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.818876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "27a7cc38-e5af-494e-880c-45e5278e927a" (UID: "27a7cc38-e5af-494e-880c-45e5278e927a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") pod \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869393 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9hgv\" (UniqueName: \"kubernetes.io/projected/c893a809-f898-4540-9624-c092e6ea4e8b-kube-api-access-p9hgv\") pod \"c893a809-f898-4540-9624-c092e6ea4e8b\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x568p\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-kube-api-access-x568p\") pod \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data-custom\") pod \"c893a809-f898-4540-9624-c092e6ea4e8b\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-lock\") pod \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c893a809-f898-4540-9624-c092e6ea4e8b-etc-machine-id\") pod \"c893a809-f898-4540-9624-c092e6ea4e8b\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869597 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-scripts\") pod \"c893a809-f898-4540-9624-c092e6ea4e8b\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-cache\") pod 
\"4d1f2b5b-f47a-496d-ba02-5299848b53a5\" (UID: \"4d1f2b5b-f47a-496d-ba02-5299848b53a5\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869671 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data\") pod \"c893a809-f898-4540-9624-c092e6ea4e8b\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-combined-ca-bundle\") pod \"c893a809-f898-4540-9624-c092e6ea4e8b\" (UID: \"c893a809-f898-4540-9624-c092e6ea4e8b\") " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869951 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869962 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869971 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869978 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869987 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.869995 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffsr2\" (UniqueName: \"kubernetes.io/projected/27a7cc38-e5af-494e-880c-45e5278e927a-kube-api-access-ffsr2\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.870003 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27a7cc38-e5af-494e-880c-45e5278e927a-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.870546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-lock" (OuterVolumeSpecName: "lock") pod "4d1f2b5b-f47a-496d-ba02-5299848b53a5" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.873361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c893a809-f898-4540-9624-c092e6ea4e8b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c893a809-f898-4540-9624-c092e6ea4e8b" (UID: "c893a809-f898-4540-9624-c092e6ea4e8b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.873611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-cache" (OuterVolumeSpecName: "cache") pod "4d1f2b5b-f47a-496d-ba02-5299848b53a5" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.874956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4d1f2b5b-f47a-496d-ba02-5299848b53a5" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.875195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "4d1f2b5b-f47a-496d-ba02-5299848b53a5" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.875941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c893a809-f898-4540-9624-c092e6ea4e8b-kube-api-access-p9hgv" (OuterVolumeSpecName: "kube-api-access-p9hgv") pod "c893a809-f898-4540-9624-c092e6ea4e8b" (UID: "c893a809-f898-4540-9624-c092e6ea4e8b"). InnerVolumeSpecName "kube-api-access-p9hgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.876926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-scripts" (OuterVolumeSpecName: "scripts") pod "c893a809-f898-4540-9624-c092e6ea4e8b" (UID: "c893a809-f898-4540-9624-c092e6ea4e8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.877283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c893a809-f898-4540-9624-c092e6ea4e8b" (UID: "c893a809-f898-4540-9624-c092e6ea4e8b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.877501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-kube-api-access-x568p" (OuterVolumeSpecName: "kube-api-access-x568p") pod "4d1f2b5b-f47a-496d-ba02-5299848b53a5" (UID: "4d1f2b5b-f47a-496d-ba02-5299848b53a5"). InnerVolumeSpecName "kube-api-access-x568p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.901575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c893a809-f898-4540-9624-c092e6ea4e8b" (UID: "c893a809-f898-4540-9624-c092e6ea4e8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.926531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data" (OuterVolumeSpecName: "config-data") pod "c893a809-f898-4540-9624-c092e6ea4e8b" (UID: "c893a809-f898-4540-9624-c092e6ea4e8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.945368 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971281 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971312 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9hgv\" (UniqueName: \"kubernetes.io/projected/c893a809-f898-4540-9624-c092e6ea4e8b-kube-api-access-p9hgv\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971324 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x568p\" (UniqueName: \"kubernetes.io/projected/4d1f2b5b-f47a-496d-ba02-5299848b53a5-kube-api-access-x568p\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971333 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971341 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971351 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c893a809-f898-4540-9624-c092e6ea4e8b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971379 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971387 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971395 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d1f2b5b-f47a-496d-ba02-5299848b53a5-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971402 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.971410 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c893a809-f898-4540-9624-c092e6ea4e8b-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:35:02 crc kubenswrapper[4707]: I0121 15:35:02.983557 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.072744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-ceilometer-tls-certs\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.072844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-config-data\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.072878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-sg-core-conf-yaml\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.072933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4t9\" (UniqueName: \"kubernetes.io/projected/598e50f3-fe68-4276-9dab-a81d954b1505-kube-api-access-mm4t9\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.072955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-log-httpd\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.072998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-scripts\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.073021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-combined-ca-bundle\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.073040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-run-httpd\") pod \"598e50f3-fe68-4276-9dab-a81d954b1505\" (UID: \"598e50f3-fe68-4276-9dab-a81d954b1505\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.073278 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.073554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-run-httpd" (OuterVolumeSpecName: 
"run-httpd") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.073631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.081980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598e50f3-fe68-4276-9dab-a81d954b1505-kube-api-access-mm4t9" (OuterVolumeSpecName: "kube-api-access-mm4t9") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "kube-api-access-mm4t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.088571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-scripts" (OuterVolumeSpecName: "scripts") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.114920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.167966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.169308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174189 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174224 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174235 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174245 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598e50f3-fe68-4276-9dab-a81d954b1505-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174253 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174263 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.174271 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4t9\" (UniqueName: \"kubernetes.io/projected/598e50f3-fe68-4276-9dab-a81d954b1505-kube-api-access-mm4t9\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.182347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-config-data" (OuterVolumeSpecName: "config-data") pod "598e50f3-fe68-4276-9dab-a81d954b1505" (UID: "598e50f3-fe68-4276-9dab-a81d954b1505"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.275612 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598e50f3-fe68-4276-9dab-a81d954b1505-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.488228 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerID="abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266" exitCode=137 Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.488296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.488322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"4d1f2b5b-f47a-496d-ba02-5299848b53a5","Type":"ContainerDied","Data":"51bf0791aef6a119d2d47241c693b87fee6f1f2b506c7de05b60681dba464880"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.488333 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.488343 4707 scope.go:117] "RemoveContainer" containerID="abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.490381 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-78749c84c4-t8xcs_27a7cc38-e5af-494e-880c-45e5278e927a/neutron-api/0.log" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.490481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" event={"ID":"27a7cc38-e5af-494e-880c-45e5278e927a","Type":"ContainerDied","Data":"cd8d6acf3eb7cca90e881c498961d6928c15748039fc080c5c11e30d2c93bbb2"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.490529 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78749c84c4-t8xcs" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.493858 4707 generic.go:334] "Generic (PLEG): container finished" podID="598e50f3-fe68-4276-9dab-a81d954b1505" containerID="3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19" exitCode=137 Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.493891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerDied","Data":"3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.493932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"598e50f3-fe68-4276-9dab-a81d954b1505","Type":"ContainerDied","Data":"47b0f8325f2eb1dbe5f9d291dbed57bab8ef080a8f802363739d9209d8176fab"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.493904 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.496235 4707 generic.go:334] "Generic (PLEG): container finished" podID="c893a809-f898-4540-9624-c092e6ea4e8b" containerID="ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a" exitCode=137 Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.496265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c893a809-f898-4540-9624-c092e6ea4e8b","Type":"ContainerDied","Data":"ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.496286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c893a809-f898-4540-9624-c092e6ea4e8b","Type":"ContainerDied","Data":"e7e88f35fdc4832f8b3e4ad1f62c9ce429c46af0a5419ca8b699df2dbf68678d"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.496389 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.497600 4707 generic.go:334] "Generic (PLEG): container finished" podID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" exitCode=137 Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.497630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"849d228c-4673-486b-9e38-e7567fd1ddb8","Type":"ContainerDied","Data":"6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.497665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"849d228c-4673-486b-9e38-e7567fd1ddb8","Type":"ContainerDied","Data":"186014d69fbbb3a8b2ce9b8010a7ed7fde308ee8968888f97d9bcbfdd6212ed6"} Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.497677 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186014d69fbbb3a8b2ce9b8010a7ed7fde308ee8968888f97d9bcbfdd6212ed6" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.504423 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.511402 4707 scope.go:117] "RemoveContainer" containerID="0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.531445 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78749c84c4-t8xcs"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.533102 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-78749c84c4-t8xcs"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.536911 4707 scope.go:117] "RemoveContainer" containerID="857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.537822 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.542978 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.547068 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.552251 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.558297 4707 scope.go:117] "RemoveContainer" containerID="fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.558905 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.562626 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.573332 4707 scope.go:117] "RemoveContainer" containerID="4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.580176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-combined-ca-bundle\") pod \"849d228c-4673-486b-9e38-e7567fd1ddb8\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.580235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxll6\" (UniqueName: \"kubernetes.io/projected/849d228c-4673-486b-9e38-e7567fd1ddb8-kube-api-access-sxll6\") pod \"849d228c-4673-486b-9e38-e7567fd1ddb8\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.580327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-config-data\") pod \"849d228c-4673-486b-9e38-e7567fd1ddb8\" (UID: \"849d228c-4673-486b-9e38-e7567fd1ddb8\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.584170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849d228c-4673-486b-9e38-e7567fd1ddb8-kube-api-access-sxll6" (OuterVolumeSpecName: "kube-api-access-sxll6") pod "849d228c-4673-486b-9e38-e7567fd1ddb8" (UID: "849d228c-4673-486b-9e38-e7567fd1ddb8"). 
InnerVolumeSpecName "kube-api-access-sxll6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.596916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-config-data" (OuterVolumeSpecName: "config-data") pod "849d228c-4673-486b-9e38-e7567fd1ddb8" (UID: "849d228c-4673-486b-9e38-e7567fd1ddb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.599022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "849d228c-4673-486b-9e38-e7567fd1ddb8" (UID: "849d228c-4673-486b-9e38-e7567fd1ddb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.653224 4707 scope.go:117] "RemoveContainer" containerID="69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.681697 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.681729 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849d228c-4673-486b-9e38-e7567fd1ddb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.681740 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxll6\" (UniqueName: \"kubernetes.io/projected/849d228c-4673-486b-9e38-e7567fd1ddb8-kube-api-access-sxll6\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.737350 4707 scope.go:117] "RemoveContainer" containerID="befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.763526 4707 scope.go:117] "RemoveContainer" containerID="f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.784555 4707 scope.go:117] "RemoveContainer" containerID="936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.801183 4707 scope.go:117] "RemoveContainer" containerID="1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.816231 4707 scope.go:117] "RemoveContainer" containerID="105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.832485 4707 scope.go:117] "RemoveContainer" containerID="df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.846400 4707 scope.go:117] "RemoveContainer" containerID="56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.858452 4707 scope.go:117] "RemoveContainer" containerID="d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.874253 4707 scope.go:117] "RemoveContainer" 
containerID="bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.912706 4707 scope.go:117] "RemoveContainer" containerID="abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.915495 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266\": container with ID starting with abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266 not found: ID does not exist" containerID="abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.915528 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266"} err="failed to get container status \"abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266\": rpc error: code = NotFound desc = could not find container \"abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266\": container with ID starting with abb5c4532ff72bd760a991e73edb1fc40a44be4b400138531659a1b59abf3266 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.915553 4707 scope.go:117] "RemoveContainer" containerID="0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.915872 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7\": container with ID starting with 0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7 not found: ID does not exist" containerID="0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.915890 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7"} err="failed to get container status \"0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7\": rpc error: code = NotFound desc = could not find container \"0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7\": container with ID starting with 0c4e9a6976779a53381cde50cf8663c1a8b0ed324c827517ee4706c9aad9eea7 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.915903 4707 scope.go:117] "RemoveContainer" containerID="857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.916110 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d\": container with ID starting with 857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d not found: ID does not exist" containerID="857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916157 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d"} err="failed to get container status \"857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d\": rpc error: code = 
NotFound desc = could not find container \"857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d\": container with ID starting with 857bf901be1425fef9390f5e8848c58c496e9fb8a102ae77bc4bd8e44edc679d not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916179 4707 scope.go:117] "RemoveContainer" containerID="fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.916403 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97\": container with ID starting with fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97 not found: ID does not exist" containerID="fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916426 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97"} err="failed to get container status \"fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97\": rpc error: code = NotFound desc = could not find container \"fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97\": container with ID starting with fa63fad741729249de771212d1c7a840aaa68db0d631f3367d48202c5775bb97 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916440 4707 scope.go:117] "RemoveContainer" containerID="4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.916614 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6\": container with ID starting with 4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6 not found: ID does not exist" containerID="4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916638 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6"} err="failed to get container status \"4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6\": rpc error: code = NotFound desc = could not find container \"4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6\": container with ID starting with 4f0e6468275ac40cedc60759c983797667ad6fe4fd8e76c767acaf925afbb0d6 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916652 4707 scope.go:117] "RemoveContainer" containerID="69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.916845 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525\": container with ID starting with 69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525 not found: ID does not exist" containerID="69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916865 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525"} err="failed to get container status \"69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525\": rpc error: code = NotFound desc = could not find container \"69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525\": container with ID starting with 69d8b2832fc658eb4ac7af8afbe27bfb8e8f59480014e328f62db2d17215a525 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.916879 4707 scope.go:117] "RemoveContainer" containerID="befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.917070 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd\": container with ID starting with befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd not found: ID does not exist" containerID="befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.917089 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd"} err="failed to get container status \"befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd\": rpc error: code = NotFound desc = could not find container \"befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd\": container with ID starting with befa2280f8adbeb23e0d4472fb0c67b0ef3012d31650e4ac5b6ba07459e6d5dd not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.917101 4707 scope.go:117] "RemoveContainer" containerID="f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.917296 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507\": container with ID starting with f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507 not found: ID does not exist" containerID="f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.917316 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507"} err="failed to get container status \"f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507\": rpc error: code = NotFound desc = could not find container \"f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507\": container with ID starting with f37c3cb2ea5aea1f3995d11dd2ea0f952f81053bfe870161d1c159664141c507 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.917328 4707 scope.go:117] "RemoveContainer" containerID="936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.917538 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146\": container with ID starting with 936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146 not found: ID does not exist" 
containerID="936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.917568 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146"} err="failed to get container status \"936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146\": rpc error: code = NotFound desc = could not find container \"936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146\": container with ID starting with 936bddc400212bf75c51fb6f9ccfc2cf2863955ecf2514a3e60d5feef05e0146 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.917591 4707 scope.go:117] "RemoveContainer" containerID="1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.917955 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa\": container with ID starting with 1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa not found: ID does not exist" containerID="1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.918047 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa"} err="failed to get container status \"1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa\": rpc error: code = NotFound desc = could not find container \"1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa\": container with ID starting with 1f4b6aa2fbd01cbfd6e5c43b7bea022088c4b209cb929e3ceaa1d6db430de5fa not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.918063 4707 scope.go:117] "RemoveContainer" containerID="105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.918299 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3\": container with ID starting with 105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3 not found: ID does not exist" containerID="105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.918316 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3"} err="failed to get container status \"105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3\": rpc error: code = NotFound desc = could not find container \"105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3\": container with ID starting with 105c4dd63b94e632c398937011b33985b2d3c020c11bc9c738c1081d698189c3 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.918327 4707 scope.go:117] "RemoveContainer" containerID="df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.918611 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034\": container with ID starting with df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034 not found: ID does not exist" containerID="df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.918630 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034"} err="failed to get container status \"df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034\": rpc error: code = NotFound desc = could not find container \"df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034\": container with ID starting with df8fa45ac7d5e38a80dfde81fbe3bc108b202d6c376dc6a654fa0db81e709034 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.918642 4707 scope.go:117] "RemoveContainer" containerID="56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.919011 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed\": container with ID starting with 56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed not found: ID does not exist" containerID="56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.919033 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed"} err="failed to get container status \"56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed\": rpc error: code = NotFound desc = could not find container \"56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed\": container with ID starting with 56c6648f667b832ba4c8770b42a43d21464aa9bfc0d79d2f448eb224f95ceaed not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.919046 4707 scope.go:117] "RemoveContainer" containerID="d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe" Jan 21 15:35:03 crc kubenswrapper[4707]: E0121 15:35:03.919290 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe\": container with ID starting with d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe not found: ID does not exist" containerID="d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.919310 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe"} err="failed to get container status \"d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe\": rpc error: code = NotFound desc = could not find container \"d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe\": container with ID starting with d4f54b5ab5633d57f4add1a9f5ee6a0aad559df94ae6398bbf48dc09cae1a1fe not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.919322 4707 scope.go:117] "RemoveContainer" containerID="bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276" Jan 21 15:35:03 crc 
kubenswrapper[4707]: E0121 15:35:03.919475 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276\": container with ID starting with bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276 not found: ID does not exist" containerID="bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.919497 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276"} err="failed to get container status \"bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276\": rpc error: code = NotFound desc = could not find container \"bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276\": container with ID starting with bdc3fbdd6d42190f71b601eb50372d6c404b3402f6b7cada7d98ff4db49b7276 not found: ID does not exist" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.919511 4707 scope.go:117] "RemoveContainer" containerID="88c64ecf8afd910bd3c88e6e7d95ee4f70a40752b48f92f051811e5a100e6877" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.921588 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.938388 4707 scope.go:117] "RemoveContainer" containerID="f13d26d52d87a08e099fcd0268946a2ae4f7db7d6a421b118ad9da2d50e25f98" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.956781 4707 scope.go:117] "RemoveContainer" containerID="6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.970979 4707 scope.go:117] "RemoveContainer" containerID="4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.984105 4707 scope.go:117] "RemoveContainer" containerID="3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19" Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.988256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-config-data\") pod \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.988290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-combined-ca-bundle\") pod \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.988457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbbt\" (UniqueName: \"kubernetes.io/projected/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-kube-api-access-fbbbt\") pod \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\" (UID: \"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87\") " Jan 21 15:35:03 crc kubenswrapper[4707]: I0121 15:35:03.991423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-kube-api-access-fbbbt" (OuterVolumeSpecName: "kube-api-access-fbbbt") pod "fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" (UID: 
"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87"). InnerVolumeSpecName "kube-api-access-fbbbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.001799 4707 scope.go:117] "RemoveContainer" containerID="e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.002903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" (UID: "fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.003080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-config-data" (OuterVolumeSpecName: "config-data") pod "fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" (UID: "fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.021138 4707 scope.go:117] "RemoveContainer" containerID="6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.022072 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339\": container with ID starting with 6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339 not found: ID does not exist" containerID="6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.022104 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339"} err="failed to get container status \"6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339\": rpc error: code = NotFound desc = could not find container \"6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339\": container with ID starting with 6ac3c5108d54447cff4b43a6d651f539a8f0084419ea85c763157ff69252e339 not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.022123 4707 scope.go:117] "RemoveContainer" containerID="4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.022431 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a\": container with ID starting with 4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a not found: ID does not exist" containerID="4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.022454 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a"} err="failed to get container status \"4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a\": rpc error: code = NotFound desc = could not find container \"4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a\": container 
with ID starting with 4ae087dc0b0d8c3e271ac54715824fb75901cca7048e6f3a8c13aef8893d025a not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.022466 4707 scope.go:117] "RemoveContainer" containerID="3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.022834 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19\": container with ID starting with 3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19 not found: ID does not exist" containerID="3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.022855 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19"} err="failed to get container status \"3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19\": rpc error: code = NotFound desc = could not find container \"3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19\": container with ID starting with 3a483dd317ff80922d0e3ce6c67b4cbf4a74b216d53df26748834a2fcf440d19 not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.022867 4707 scope.go:117] "RemoveContainer" containerID="e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.023215 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da\": container with ID starting with e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da not found: ID does not exist" containerID="e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.023235 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da"} err="failed to get container status \"e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da\": rpc error: code = NotFound desc = could not find container \"e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da\": container with ID starting with e91d253d92bb352098293b96fd1e909f40afcd6b4c9d1175ce0aa8191cb378da not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.023247 4707 scope.go:117] "RemoveContainer" containerID="630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.037146 4707 scope.go:117] "RemoveContainer" containerID="ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.051904 4707 scope.go:117] "RemoveContainer" containerID="630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.052292 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a\": container with ID starting with 630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a not found: ID does not exist" 
containerID="630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.052318 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a"} err="failed to get container status \"630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a\": rpc error: code = NotFound desc = could not find container \"630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a\": container with ID starting with 630bc402dad33ae3aaf868b65a8f531d43350fa68afa02cb6eae0aca7c5f0f8a not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.052335 4707 scope.go:117] "RemoveContainer" containerID="ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.052623 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a\": container with ID starting with ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a not found: ID does not exist" containerID="ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.052643 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a"} err="failed to get container status \"ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a\": rpc error: code = NotFound desc = could not find container \"ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a\": container with ID starting with ab2ed7c96c754c55c56e29a1d937e8d801d24694da0ec617100f761dab576a6a not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.089529 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbbt\" (UniqueName: \"kubernetes.io/projected/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-kube-api-access-fbbbt\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.089556 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.089566 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.508459 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" exitCode=137 Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.508552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87","Type":"ContainerDied","Data":"1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84"} Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.508564 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.508603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87","Type":"ContainerDied","Data":"6c9e2c0a4c181ddaf67e3f47f776476fd05edd48320343c19a9ec8cdb852b4f3"} Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.508625 4707 scope.go:117] "RemoveContainer" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.509665 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.524302 4707 scope.go:117] "RemoveContainer" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" Jan 21 15:35:04 crc kubenswrapper[4707]: E0121 15:35:04.524590 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84\": container with ID starting with 1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84 not found: ID does not exist" containerID="1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.524629 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84"} err="failed to get container status \"1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84\": rpc error: code = NotFound desc = could not find container \"1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84\": container with ID starting with 1a55155c085f5bf64ed56791dd379b7032a19420ea693ec039731a74af5d9e84 not found: ID does not exist" Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.533595 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.539137 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.543302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:35:04 crc kubenswrapper[4707]: I0121 15:35:04.547007 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.188467 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" path="/var/lib/kubelet/pods/27a7cc38-e5af-494e-880c-45e5278e927a/volumes" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.189074 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" path="/var/lib/kubelet/pods/4d1f2b5b-f47a-496d-ba02-5299848b53a5/volumes" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.190399 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" path="/var/lib/kubelet/pods/598e50f3-fe68-4276-9dab-a81d954b1505/volumes" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.191014 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" path="/var/lib/kubelet/pods/849d228c-4673-486b-9e38-e7567fd1ddb8/volumes" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.191439 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" path="/var/lib/kubelet/pods/c893a809-f898-4540-9624-c092e6ea4e8b/volumes" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.192066 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" path="/var/lib/kubelet/pods/fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87/volumes" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.678632 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc2af1e8b-8059-4390-9ab4-a4ed516f509d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc2af1e8b-8059-4390-9ab4-a4ed516f509d] : Timed out while waiting for systemd to remove kubepods-besteffort-podc2af1e8b_8059_4390_9ab4_a4ed516f509d.slice" Jan 21 15:35:05 crc kubenswrapper[4707]: E0121 15:35:05.678697 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podc2af1e8b-8059-4390-9ab4-a4ed516f509d] : unable to destroy cgroup paths for cgroup [kubepods besteffort podc2af1e8b-8059-4390-9ab4-a4ed516f509d] : Timed out while waiting for systemd to remove kubepods-besteffort-podc2af1e8b_8059_4390_9ab4_a4ed516f509d.slice" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" Jan 21 15:35:05 crc kubenswrapper[4707]: I0121 15:35:05.701556 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod75c78782-36f5-4ea6-9b94-0c38ad8ac8b2"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod75c78782-36f5-4ea6-9b94-0c38ad8ac8b2] : Timed out while waiting for systemd to remove kubepods-burstable-pod75c78782_36f5_4ea6_9b94_0c38ad8ac8b2.slice" Jan 21 15:35:06 crc kubenswrapper[4707]: I0121 15:35:06.521420 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb" Jan 21 15:35:06 crc kubenswrapper[4707]: I0121 15:35:06.544533 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb"] Jan 21 15:35:06 crc kubenswrapper[4707]: I0121 15:35:06.548617 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-84c8d5c4db-f9znb"] Jan 21 15:35:07 crc kubenswrapper[4707]: I0121 15:35:07.189964 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" path="/var/lib/kubelet/pods/c2af1e8b-8059-4390-9ab4-a4ed516f509d/volumes" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.055758 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qpnsx"] Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.060603 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qpnsx"] Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.164974 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p6phr"] Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165322 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="init-config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165343 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="init-config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165361 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="prometheus" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165366 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="prometheus" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165377 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165382 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165388 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165394 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165403 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="init-config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165408 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="init-config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165418 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-reaper" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165422 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-reaper" Jan 21 
15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165433 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165441 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="proxy-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165446 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="proxy-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165456 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-updater" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165462 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-updater" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165483 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerName="galera" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerName="galera" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165496 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="setup-container" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165502 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="setup-container" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165509 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165514 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-server" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165523 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904f4a7-c8e7-4d41-9254-32ccb097df02" containerName="watcher-decision-engine" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165528 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904f4a7-c8e7-4d41-9254-32ccb097df02" containerName="watcher-decision-engine" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165535 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-metadata" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165540 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="952e1893-2e97-4caa-91e1-5c6621395737" 
containerName="nova-metadata-metadata" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165551 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165556 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165565 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165571 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="rabbitmq" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165583 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="rabbitmq" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165591 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165605 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165629 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317f9a2c-fa7d-417b-9899-9bce822ffd74" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165634 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="317f9a2c-fa7d-417b-9899-9bce822ffd74" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165640 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165646 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165655 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="alertmanager" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165660 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="alertmanager" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerName="mysql-bootstrap" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerName="mysql-bootstrap" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165678 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165692 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165697 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165707 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165715 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165721 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165732 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="sg-core" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165737 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="sg-core" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165742 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="cinder-scheduler" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165747 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="cinder-scheduler" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165757 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerName="mariadb-account-create-update" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165762 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerName="mariadb-account-create-update" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165770 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165776 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165782 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165788 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165794 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165799 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-server" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165822 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165827 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165838 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933e570a-28bd-42d8-891f-0734578bc040" containerName="watcher-applier" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165843 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="933e570a-28bd-42d8-891f-0734578bc040" containerName="watcher-applier" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165851 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-central-agent" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165857 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-central-agent" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165874 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165880 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165889 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-notification-agent" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165895 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-notification-agent" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165903 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-updater" Jan 
21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165909 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-updater" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165916 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6c2ba2-268a-46f5-9ee1-93ff501d79de" containerName="memcached" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165921 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6c2ba2-268a-46f5-9ee1-93ff501d79de" containerName="memcached" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165927 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165933 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165945 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165951 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165957 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165964 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165969 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165976 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" containerName="keystone-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165984 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" containerName="keystone-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.165992 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.165997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-server" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166003 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-expirer" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166007 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-expirer" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" 
containerName="nova-cell1-conductor-conductor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166019 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166027 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="probe" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166032 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="probe" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166038 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="swift-recon-cron" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166043 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="swift-recon-cron" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166053 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166059 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166070 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerName="dnsmasq-dns" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166076 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerName="dnsmasq-dns" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-server" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166096 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="rsync" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166101 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="rsync" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166107 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618755bc-3ad6-4081-9de4-d0f1f7c14020" containerName="kube-state-metrics" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="618755bc-3ad6-4081-9de4-d0f1f7c14020" containerName="kube-state-metrics" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166121 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166126 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166135 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" 
containerName="barbican-keystone-listener-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166141 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener-log" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166153 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166162 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="thanos-sidecar" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166166 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="thanos-sidecar" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166187 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerName="init" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166193 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerName="init" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166230 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api" Jan 21 15:35:13 crc kubenswrapper[4707]: E0121 15:35:13.166239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166244 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166414 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166427 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="618755bc-3ad6-4081-9de4-d0f1f7c14020" containerName="kube-state-metrics" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166434 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166441 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166446 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="849d228c-4673-486b-9e38-e7567fd1ddb8" containerName="nova-scheduler-scheduler" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166453 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-metadata" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166461 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166467 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166475 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a7cc38-e5af-494e-880c-45e5278e927a" containerName="neutron-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166480 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-central-agent" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166489 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-updater" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166498 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166506 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166513 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="proxy-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166520 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6c2ba2-268a-46f5-9ee1-93ff501d79de" containerName="memcached" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166529 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166536 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b367f-2f6e-4ab7-af9b-5390359d3140" containerName="barbican-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166544 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="sg-core" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166551 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166557 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-updater" Jan 21 
15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166565 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="317f9a2c-fa7d-417b-9899-9bce822ffd74" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166572 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91bd3c13-ed9b-4929-9c5d-3d650badf3e9" containerName="dnsmasq-dns" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166588 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="config-reloader" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166594 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerName="mariadb-account-create-update" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166603 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerName="mariadb-account-create-update" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166615 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166624 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f0dcee-d9f1-4bcd-9084-402a6f132ddc" containerName="galera" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166631 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166638 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="probe" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166647 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a47423-6eba-4cf7-ab2d-db338f4f28a6" containerName="rabbitmq" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166655 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fbbd00-3dc5-42f1-a015-24e2aac9be07" containerName="watcher-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166661 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e50f3-fe68-4276-9dab-a81d954b1505" containerName="ceilometer-notification-agent" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c893a809-f898-4540-9624-c092e6ea4e8b" containerName="cinder-scheduler" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166675 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="prometheus" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166680 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="933e570a-28bd-42d8-891f-0734578bc040" containerName="watcher-applier" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166687 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166693 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc34703-f7b2-455d-b4a6-4865a4c90f45" containerName="barbican-worker" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166700 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-replicator" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166706 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1a2521-fdd5-4e7d-ad4f-8c4773c97360" containerName="glance-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166712 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-reaper" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166719 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166726 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3cd8ce-d377-43dc-a92e-7d63c8ab5a87" containerName="nova-cell1-conductor-conductor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166733 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d7dc0f-9472-43f7-8f37-adb51c82f84e" containerName="nova-api-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166740 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-expirer" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166754 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bfd31d-1d19-4286-87f6-83397fa62338" containerName="barbican-keystone-listener" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166762 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="swift-recon-cron" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166769 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85cef69-0d30-4ed6-9c73-c169b3636f3a" containerName="placement-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166776 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="952e1893-2e97-4caa-91e1-5c6621395737" containerName="nova-metadata-log" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166785 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904f4a7-c8e7-4d41-9254-32ccb097df02" containerName="watcher-decision-engine" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="object-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166798 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c78782-36f5-4ea6-9b94-0c38ad8ac8b2" containerName="thanos-sidecar" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166821 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" 
containerName="rsync" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166828 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5a0b6a-5b7d-4431-961c-984efc37166a" containerName="glance-httpd" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166836 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bcbfab-0b0d-49c1-84ab-6a773c2a88ed" containerName="keystone-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166843 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166852 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd72c11-1e49-4359-811c-f7a96288df7b" containerName="cinder-api" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166859 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="account-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1f2b5b-f47a-496d-ba02-5299848b53a5" containerName="container-auditor" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2af1e8b-8059-4390-9ab4-a4ed516f509d" containerName="proxy-server" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.166878 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e034ee-7d6e-45f4-a7a5-d3ddc3a9bc7b" containerName="alertmanager" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.167482 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.171911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.172016 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.172135 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.171911 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.173566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p6phr"] Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.189495 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb" path="/var/lib/kubelet/pods/a6b807ce-f0fb-42cb-9cc8-9f000a9b37cb/volumes" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.302471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/456eae50-a3c0-4ddc-a71a-581d101f8561-crc-storage\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.302516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mnjm\" (UniqueName: 
\"kubernetes.io/projected/456eae50-a3c0-4ddc-a71a-581d101f8561-kube-api-access-2mnjm\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.302558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/456eae50-a3c0-4ddc-a71a-581d101f8561-node-mnt\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.403518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/456eae50-a3c0-4ddc-a71a-581d101f8561-node-mnt\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.403623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/456eae50-a3c0-4ddc-a71a-581d101f8561-crc-storage\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.403653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mnjm\" (UniqueName: \"kubernetes.io/projected/456eae50-a3c0-4ddc-a71a-581d101f8561-kube-api-access-2mnjm\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.403865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/456eae50-a3c0-4ddc-a71a-581d101f8561-node-mnt\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.404372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/456eae50-a3c0-4ddc-a71a-581d101f8561-crc-storage\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.421693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mnjm\" (UniqueName: \"kubernetes.io/projected/456eae50-a3c0-4ddc-a71a-581d101f8561-kube-api-access-2mnjm\") pod \"crc-storage-crc-p6phr\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.489296 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:13 crc kubenswrapper[4707]: I0121 15:35:13.878573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p6phr"] Jan 21 15:35:13 crc kubenswrapper[4707]: W0121 15:35:13.881359 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod456eae50_a3c0_4ddc_a71a_581d101f8561.slice/crio-2c83117b34f9f23949e0deb155826f40731f5eac6e846dde5510c63623375b8c WatchSource:0}: Error finding container 2c83117b34f9f23949e0deb155826f40731f5eac6e846dde5510c63623375b8c: Status 404 returned error can't find the container with id 2c83117b34f9f23949e0deb155826f40731f5eac6e846dde5510c63623375b8c Jan 21 15:35:14 crc kubenswrapper[4707]: I0121 15:35:14.572313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p6phr" event={"ID":"456eae50-a3c0-4ddc-a71a-581d101f8561","Type":"ContainerStarted","Data":"2c83117b34f9f23949e0deb155826f40731f5eac6e846dde5510c63623375b8c"} Jan 21 15:35:15 crc kubenswrapper[4707]: I0121 15:35:15.578351 4707 generic.go:334] "Generic (PLEG): container finished" podID="456eae50-a3c0-4ddc-a71a-581d101f8561" containerID="54c3c3aa93da18a1865d4c3838da11a384cb1fb6df94004df8c93addbd488257" exitCode=0 Jan 21 15:35:15 crc kubenswrapper[4707]: I0121 15:35:15.578382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p6phr" event={"ID":"456eae50-a3c0-4ddc-a71a-581d101f8561","Type":"ContainerDied","Data":"54c3c3aa93da18a1865d4c3838da11a384cb1fb6df94004df8c93addbd488257"} Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.798167 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.945508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mnjm\" (UniqueName: \"kubernetes.io/projected/456eae50-a3c0-4ddc-a71a-581d101f8561-kube-api-access-2mnjm\") pod \"456eae50-a3c0-4ddc-a71a-581d101f8561\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.945548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/456eae50-a3c0-4ddc-a71a-581d101f8561-crc-storage\") pod \"456eae50-a3c0-4ddc-a71a-581d101f8561\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.945601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/456eae50-a3c0-4ddc-a71a-581d101f8561-node-mnt\") pod \"456eae50-a3c0-4ddc-a71a-581d101f8561\" (UID: \"456eae50-a3c0-4ddc-a71a-581d101f8561\") " Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.945834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/456eae50-a3c0-4ddc-a71a-581d101f8561-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "456eae50-a3c0-4ddc-a71a-581d101f8561" (UID: "456eae50-a3c0-4ddc-a71a-581d101f8561"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.950028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456eae50-a3c0-4ddc-a71a-581d101f8561-kube-api-access-2mnjm" (OuterVolumeSpecName: "kube-api-access-2mnjm") pod "456eae50-a3c0-4ddc-a71a-581d101f8561" (UID: "456eae50-a3c0-4ddc-a71a-581d101f8561"). InnerVolumeSpecName "kube-api-access-2mnjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:16 crc kubenswrapper[4707]: I0121 15:35:16.960085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456eae50-a3c0-4ddc-a71a-581d101f8561-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "456eae50-a3c0-4ddc-a71a-581d101f8561" (UID: "456eae50-a3c0-4ddc-a71a-581d101f8561"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:35:17 crc kubenswrapper[4707]: I0121 15:35:17.046979 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mnjm\" (UniqueName: \"kubernetes.io/projected/456eae50-a3c0-4ddc-a71a-581d101f8561-kube-api-access-2mnjm\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:17 crc kubenswrapper[4707]: I0121 15:35:17.047010 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/456eae50-a3c0-4ddc-a71a-581d101f8561-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:17 crc kubenswrapper[4707]: I0121 15:35:17.047022 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/456eae50-a3c0-4ddc-a71a-581d101f8561-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:17 crc kubenswrapper[4707]: I0121 15:35:17.590499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p6phr" event={"ID":"456eae50-a3c0-4ddc-a71a-581d101f8561","Type":"ContainerDied","Data":"2c83117b34f9f23949e0deb155826f40731f5eac6e846dde5510c63623375b8c"} Jan 21 15:35:17 crc kubenswrapper[4707]: I0121 15:35:17.590540 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c83117b34f9f23949e0deb155826f40731f5eac6e846dde5510c63623375b8c" Jan 21 15:35:17 crc kubenswrapper[4707]: I0121 15:35:17.590538 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p6phr" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.012766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-p6phr"] Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.016464 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-p6phr"] Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.129484 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-j9stt"] Jan 21 15:35:20 crc kubenswrapper[4707]: E0121 15:35:20.129722 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerName="mariadb-account-create-update" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.129740 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46f052f-cdf5-442f-9628-b1aacddca32d" containerName="mariadb-account-create-update" Jan 21 15:35:20 crc kubenswrapper[4707]: E0121 15:35:20.129763 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456eae50-a3c0-4ddc-a71a-581d101f8561" containerName="storage" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.129770 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="456eae50-a3c0-4ddc-a71a-581d101f8561" containerName="storage" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.129900 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="456eae50-a3c0-4ddc-a71a-581d101f8561" containerName="storage" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.130318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.131884 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.133181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.133270 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.133330 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.137533 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-j9stt"] Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.285774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-node-mnt\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.285860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-crc-storage\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.286026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwmk\" (UniqueName: 
\"kubernetes.io/projected/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-kube-api-access-jqwmk\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.387415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-node-mnt\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.387498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-crc-storage\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.387533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwmk\" (UniqueName: \"kubernetes.io/projected/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-kube-api-access-jqwmk\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.387652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-node-mnt\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.388242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-crc-storage\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.401999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwmk\" (UniqueName: \"kubernetes.io/projected/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-kube-api-access-jqwmk\") pod \"crc-storage-crc-j9stt\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.443540 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:20 crc kubenswrapper[4707]: I0121 15:35:20.795827 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-j9stt"] Jan 21 15:35:21 crc kubenswrapper[4707]: I0121 15:35:21.190345 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456eae50-a3c0-4ddc-a71a-581d101f8561" path="/var/lib/kubelet/pods/456eae50-a3c0-4ddc-a71a-581d101f8561/volumes" Jan 21 15:35:21 crc kubenswrapper[4707]: I0121 15:35:21.617322 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" containerID="fa1515803c68c5a23c843f4e49519efb946c9d84db010699f0e712149f6220d1" exitCode=0 Jan 21 15:35:21 crc kubenswrapper[4707]: I0121 15:35:21.617376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j9stt" event={"ID":"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10","Type":"ContainerDied","Data":"fa1515803c68c5a23c843f4e49519efb946c9d84db010699f0e712149f6220d1"} Jan 21 15:35:21 crc kubenswrapper[4707]: I0121 15:35:21.617684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j9stt" event={"ID":"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10","Type":"ContainerStarted","Data":"92611e9461e83076ed5cba88283f86b69423538d3fb33489e8fafe2c88c7013c"} Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.827689 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.920473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqwmk\" (UniqueName: \"kubernetes.io/projected/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-kube-api-access-jqwmk\") pod \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.920593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-node-mnt\") pod \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.920707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-crc-storage\") pod \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\" (UID: \"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10\") " Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.920780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" (UID: "2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.921062 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.925594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-kube-api-access-jqwmk" (OuterVolumeSpecName: "kube-api-access-jqwmk") pod "2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" (UID: "2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10"). InnerVolumeSpecName "kube-api-access-jqwmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:22 crc kubenswrapper[4707]: I0121 15:35:22.936367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" (UID: "2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:35:23 crc kubenswrapper[4707]: I0121 15:35:23.022735 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:23 crc kubenswrapper[4707]: I0121 15:35:23.022771 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqwmk\" (UniqueName: \"kubernetes.io/projected/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10-kube-api-access-jqwmk\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:23 crc kubenswrapper[4707]: I0121 15:35:23.632311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j9stt" event={"ID":"2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10","Type":"ContainerDied","Data":"92611e9461e83076ed5cba88283f86b69423538d3fb33489e8fafe2c88c7013c"} Jan 21 15:35:23 crc kubenswrapper[4707]: I0121 15:35:23.632350 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92611e9461e83076ed5cba88283f86b69423538d3fb33489e8fafe2c88c7013c" Jan 21 15:35:23 crc kubenswrapper[4707]: I0121 15:35:23.632389 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j9stt" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.931080 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:35:36 crc kubenswrapper[4707]: E0121 15:35:36.931848 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" containerName="storage" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.931862 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" containerName="storage" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.931997 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" containerName="storage" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.932726 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.935127 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-8xwqj" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.935766 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.935826 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.935779 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.935922 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.938186 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.939135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.939221 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.939329 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.943122 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.944190 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.947491 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.951571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:35:36 crc kubenswrapper[4707]: I0121 15:35:36.959859 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.114933 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c346ce8-cb7b-46a0-abda-84bc7a42f009-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.114999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c346ce8-cb7b-46a0-abda-84bc7a42f009-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-server-conf\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c641d259-efc1-4897-94d0-bf9d429f3c5d-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15b560a7-6184-4897-af4f-7975ebd19fcb-pod-info\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c641d259-efc1-4897-94d0-bf9d429f3c5d-pod-info\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-server-conf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w869l\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-kube-api-access-w869l\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvwf\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-kube-api-access-4wvwf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115853 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4bls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-kube-api-access-h4bls\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.115974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.116035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.116057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.116085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15b560a7-6184-4897-af4f-7975ebd19fcb-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.137553 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.139956 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.141783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-mbmwm" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.142116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.142326 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.142484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.142693 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.143554 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.144332 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.145575 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.150702 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.152047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.173250 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.174317 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.192042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.203297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.217949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.217986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/702acb12-2ee0-4008-bf50-0fda9fc79030-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15b560a7-6184-4897-af4f-7975ebd19fcb-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c346ce8-cb7b-46a0-abda-84bc7a42f009-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc 
kubenswrapper[4707]: I0121 15:35:37.218133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c346ce8-cb7b-46a0-abda-84bc7a42f009-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/702acb12-2ee0-4008-bf50-0fda9fc79030-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-server-conf\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218308 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kxz\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-kube-api-access-82kxz\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218475 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c641d259-efc1-4897-94d0-bf9d429f3c5d-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15b560a7-6184-4897-af4f-7975ebd19fcb-pod-info\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c641d259-efc1-4897-94d0-bf9d429f3c5d-pod-info\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-server-conf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w869l\" (UniqueName: 
\"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-kube-api-access-w869l\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218721 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvwf\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-kube-api-access-4wvwf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4bls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-kube-api-access-h4bls\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.218850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.219495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.219539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.219597 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.220616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-server-conf\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.221920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-server-conf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.222439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.223369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.223504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224243 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224381 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c641d259-efc1-4897-94d0-bf9d429f3c5d-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c346ce8-cb7b-46a0-abda-84bc7a42f009-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.224690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.225238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15b560a7-6184-4897-af4f-7975ebd19fcb-pod-info\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.225340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.225432 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" 
(UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.225461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.225777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.226135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.226754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.227625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.228588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15b560a7-6184-4897-af4f-7975ebd19fcb-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.228610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.230178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c346ce8-cb7b-46a0-abda-84bc7a42f009-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.231801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c641d259-efc1-4897-94d0-bf9d429f3c5d-pod-info\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.234655 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.234754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w869l\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-kube-api-access-w869l\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.234782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvwf\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-kube-api-access-4wvwf\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.236506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4bls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-kube-api-access-h4bls\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.240420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-2\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.242712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-1\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.248416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.251624 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.269138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.276640 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/702acb12-2ee0-4008-bf50-0fda9fc79030-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65vml\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-kube-api-access-65vml\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data\") pod \"rabbitmq-cell1-server-2\" (UID: 
\"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.322999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/702acb12-2ee0-4008-bf50-0fda9fc79030-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-server-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e2084ed-217a-462c-8219-6cfa022751d9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.323261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324432 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnxrn\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-kube-api-access-dnxrn\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-plugins-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324574 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82kxz\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-kube-api-access-82kxz\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.324709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.325670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e2084ed-217a-462c-8219-6cfa022751d9-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.325709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-pod-info\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.326229 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.326323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.326358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.327387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.329346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/702acb12-2ee0-4008-bf50-0fda9fc79030-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.329457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.329680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/702acb12-2ee0-4008-bf50-0fda9fc79030-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.331350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.340373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kxz\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-kube-api-access-82kxz\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.356718 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65vml\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-kube-api-access-65vml\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-server-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e2084ed-217a-462c-8219-6cfa022751d9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxrn\" (UniqueName: 
\"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-kube-api-access-dnxrn\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-plugins-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-pod-info\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e2084ed-217a-462c-8219-6cfa022751d9-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.427690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.428891 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.429022 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.429034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-plugins-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.429661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.430355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.430403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.430598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.431139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-server-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.431274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.431329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.431673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.432185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.432882 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.433381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.433938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.435925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.435966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.437607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e2084ed-217a-462c-8219-6cfa022751d9-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.438881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-pod-info\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.443620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e2084ed-217a-462c-8219-6cfa022751d9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.443963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65vml\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-kube-api-access-65vml\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.446273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnxrn\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-kube-api-access-dnxrn\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.455618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.457265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.472376 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.478911 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.489848 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.653440 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.718266 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.724112 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:35:37 crc kubenswrapper[4707]: W0121 15:35:37.730518 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc641d259_efc1_4897_94d0_bf9d429f3c5d.slice/crio-652ed6964b05c15a5f17f224272bf185cf72f7e5e1bd7307ed379302f6328ff7 WatchSource:0}: Error finding container 652ed6964b05c15a5f17f224272bf185cf72f7e5e1bd7307ed379302f6328ff7: Status 404 returned error can't find the container with id 652ed6964b05c15a5f17f224272bf185cf72f7e5e1bd7307ed379302f6328ff7 Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.741275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4c346ce8-cb7b-46a0-abda-84bc7a42f009","Type":"ContainerStarted","Data":"999ddce239732138ecc6be83b4fb9f61d057a201cfddf5940f97fde33a1d4f80"} Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.905156 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:35:37 crc kubenswrapper[4707]: W0121 15:35:37.911412 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1977b33b_45bb_4b7d_b7f7_ebfd42572edb.slice/crio-65112d66bcb9da79b16ec5a960f301f80edce5306110f1700e939f092238ad92 WatchSource:0}: Error finding container 65112d66bcb9da79b16ec5a960f301f80edce5306110f1700e939f092238ad92: Status 404 returned error can't find the container with id 65112d66bcb9da79b16ec5a960f301f80edce5306110f1700e939f092238ad92 Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.972458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:35:37 crc kubenswrapper[4707]: W0121 15:35:37.982590 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2084ed_217a_462c_8219_6cfa022751d9.slice/crio-99586717160da9478207f34b93f55cce606fee2e927195867542e138aa220d52 WatchSource:0}: Error finding 
container 99586717160da9478207f34b93f55cce606fee2e927195867542e138aa220d52: Status 404 returned error can't find the container with id 99586717160da9478207f34b93f55cce606fee2e927195867542e138aa220d52 Jan 21 15:35:37 crc kubenswrapper[4707]: W0121 15:35:37.984444 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod702acb12_2ee0_4008_bf50_0fda9fc79030.slice/crio-060754cc187f1818440fa24bac095975295b72d6a153f918438259376c29ef38 WatchSource:0}: Error finding container 060754cc187f1818440fa24bac095975295b72d6a153f918438259376c29ef38: Status 404 returned error can't find the container with id 060754cc187f1818440fa24bac095975295b72d6a153f918438259376c29ef38 Jan 21 15:35:37 crc kubenswrapper[4707]: I0121 15:35:37.987092 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.749447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"15b560a7-6184-4897-af4f-7975ebd19fcb","Type":"ContainerStarted","Data":"84f9a01fc933d02b31761e13aad8dce7376a5048b526d2374384b83a02ff8564"} Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.752252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"1977b33b-45bb-4b7d-b7f7-ebfd42572edb","Type":"ContainerStarted","Data":"65112d66bcb9da79b16ec5a960f301f80edce5306110f1700e939f092238ad92"} Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.755643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"702acb12-2ee0-4008-bf50-0fda9fc79030","Type":"ContainerStarted","Data":"060754cc187f1818440fa24bac095975295b72d6a153f918438259376c29ef38"} Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.757594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"8e2084ed-217a-462c-8219-6cfa022751d9","Type":"ContainerStarted","Data":"99586717160da9478207f34b93f55cce606fee2e927195867542e138aa220d52"} Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.758779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"c641d259-efc1-4897-94d0-bf9d429f3c5d","Type":"ContainerStarted","Data":"652ed6964b05c15a5f17f224272bf185cf72f7e5e1bd7307ed379302f6328ff7"} Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.758865 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.761608 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.764421 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.764588 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-mb7qz" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.764914 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.766255 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.767500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.768804 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.771726 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.774883 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.779172 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.782855 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.783845 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.794827 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.850776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-generated\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.850884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-combined-ca-bundle\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.850906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-default\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.850947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqc5h\" (UniqueName: \"kubernetes.io/projected/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kube-api-access-rqc5h\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-operator-scripts\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-operator-scripts\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 
15:35:38.851071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2dc\" (UniqueName: \"kubernetes.io/projected/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kube-api-access-dz2dc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kolla-config\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gbb\" (UniqueName: \"kubernetes.io/projected/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kube-api-access-25gbb\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-default\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-default\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" 
Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kolla-config\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-galera-tls-certs\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kolla-config\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-generated\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.851381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-galera-tls-certs\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kolla-config\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-generated\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-generated\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-default\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-combined-ca-bundle\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqc5h\" (UniqueName: \"kubernetes.io/projected/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kube-api-access-rqc5h\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954953 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-operator-scripts\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.954972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-operator-scripts\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2dc\" (UniqueName: \"kubernetes.io/projected/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kube-api-access-dz2dc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kolla-config\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25gbb\" (UniqueName: \"kubernetes.io/projected/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kube-api-access-25gbb\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-generated\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-default\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-default\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kolla-config\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.955530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kolla-config\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.956017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kolla-config\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.956284 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.956843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-default\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.957171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.957299 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.957410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.959137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-generated\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.959532 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.959975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-default\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.960941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-default\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.961016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kolla-config\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.961352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-galera-tls-certs\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.963068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-operator-scripts\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 
15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.963550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-combined-ca-bundle\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.969006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-operator-scripts\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.970374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.970692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.974602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.974671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2dc\" (UniqueName: \"kubernetes.io/projected/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kube-api-access-dz2dc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.979434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gbb\" (UniqueName: \"kubernetes.io/projected/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kube-api-access-25gbb\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.980115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqc5h\" (UniqueName: \"kubernetes.io/projected/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kube-api-access-rqc5h\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.980626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:38 crc kubenswrapper[4707]: I0121 15:35:38.987885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.001884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.007994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.207365 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.214273 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.222698 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.590571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.651015 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 21 15:35:39 crc kubenswrapper[4707]: W0121 15:35:39.653507 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0cfe067_0df4_46ce_b674_eed01fb4ca91.slice/crio-a0a0b664893c15c33eb4e777a68f91a2a445b49ce070e19c7182c15bdb422ca2 WatchSource:0}: Error finding container a0a0b664893c15c33eb4e777a68f91a2a445b49ce070e19c7182c15bdb422ca2: Status 404 returned error can't find the container with id a0a0b664893c15c33eb4e777a68f91a2a445b49ce070e19c7182c15bdb422ca2 Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.726766 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 21 15:35:39 crc kubenswrapper[4707]: W0121 15:35:39.735454 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee55a6b_2348_4ab1_9bc1_0be2a8aafe4d.slice/crio-ef00fd3b5fb732833d6c613184fa0126d0718d7d48e307c90e36595ea5e901dc WatchSource:0}: Error finding container ef00fd3b5fb732833d6c613184fa0126d0718d7d48e307c90e36595ea5e901dc: Status 404 returned error can't find the container with id ef00fd3b5fb732833d6c613184fa0126d0718d7d48e307c90e36595ea5e901dc Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.768870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d","Type":"ContainerStarted","Data":"ef00fd3b5fb732833d6c613184fa0126d0718d7d48e307c90e36595ea5e901dc"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.770415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" 
event={"ID":"702acb12-2ee0-4008-bf50-0fda9fc79030","Type":"ContainerStarted","Data":"26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.771727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4c346ce8-cb7b-46a0-abda-84bc7a42f009","Type":"ContainerStarted","Data":"4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.774769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"b0cfe067-0df4-46ce-b674-eed01fb4ca91","Type":"ContainerStarted","Data":"dbb3b37d291b0ba6c3e5d38b8c71359fbf200dfd228ed3c32b7182d8d9d73e43"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.774816 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"b0cfe067-0df4-46ce-b674-eed01fb4ca91","Type":"ContainerStarted","Data":"a0a0b664893c15c33eb4e777a68f91a2a445b49ce070e19c7182c15bdb422ca2"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.776828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"c641d259-efc1-4897-94d0-bf9d429f3c5d","Type":"ContainerStarted","Data":"6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.778363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"15b560a7-6184-4897-af4f-7975ebd19fcb","Type":"ContainerStarted","Data":"4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.781440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"1977b33b-45bb-4b7d-b7f7-ebfd42572edb","Type":"ContainerStarted","Data":"158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.782885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"8e2084ed-217a-462c-8219-6cfa022751d9","Type":"ContainerStarted","Data":"6a798f965e2ede79a50452c9b9cf47838b4c424ce11b6965be9499ba22782a08"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.784108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b","Type":"ContainerStarted","Data":"4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678"} Jan 21 15:35:39 crc kubenswrapper[4707]: I0121 15:35:39.784148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b","Type":"ContainerStarted","Data":"cb2ececd620c05cd639d8ec3aefb8ccbf2ed59c0d90b6381ad7583044af99531"} Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.177278 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.180030 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.183008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rn5rm" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.183160 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.183297 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.183344 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.186188 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.187726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.202537 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.215216 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.216883 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.247611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.259696 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.293718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.293958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.294980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s55v\" (UniqueName: \"kubernetes.io/projected/2a77adde-7080-44bf-bbba-0d8b0f874e98-kube-api-access-9s55v\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.295023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnl6\" (UniqueName: \"kubernetes.io/projected/741daec5-08a7-4769-8b0b-4248c3c02d69-kube-api-access-rnnl6\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.396680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.396963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-operator-scripts\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-kolla-config\") pod 
\"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kolla-config\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-default\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s55v\" (UniqueName: \"kubernetes.io/projected/2a77adde-7080-44bf-bbba-0d8b0f874e98-kube-api-access-9s55v\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnl6\" (UniqueName: \"kubernetes.io/projected/741daec5-08a7-4769-8b0b-4248c3c02d69-kube-api-access-rnnl6\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.397994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjgck\" (UniqueName: \"kubernetes.io/projected/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kube-api-access-xjgck\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398261 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-generated\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-combined-ca-bundle\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-galera-tls-certs\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.398977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.399056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.399257 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.399269 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.400033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.400131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.402786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.403720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.417787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s55v\" (UniqueName: \"kubernetes.io/projected/2a77adde-7080-44bf-bbba-0d8b0f874e98-kube-api-access-9s55v\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.418357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnl6\" (UniqueName: \"kubernetes.io/projected/741daec5-08a7-4769-8b0b-4248c3c02d69-kube-api-access-rnnl6\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.419769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.420697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.424050 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-2\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.426523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.499882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-galera-tls-certs\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-operator-scripts\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kolla-config\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-default\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjgck\" (UniqueName: \"kubernetes.io/projected/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kube-api-access-xjgck\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-generated\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.500780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-combined-ca-bundle\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.501331 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.501377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kolla-config\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.501461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-generated\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.501616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-default\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.501920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-operator-scripts\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.504672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-combined-ca-bundle\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.508497 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.516164 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.518534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-galera-tls-certs\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.520297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjgck\" (UniqueName: \"kubernetes.io/projected/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kube-api-access-xjgck\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.535599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-1\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.555500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.559440 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.566736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-kjrcj" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.566959 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.567100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.571395 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.705869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-config-data\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.706178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxdxr\" (UniqueName: \"kubernetes.io/projected/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kube-api-access-nxdxr\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.706285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.706336 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.706443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kolla-config\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.796764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d","Type":"ContainerStarted","Data":"c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7"} Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.808576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kolla-config\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.808622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kolla-config\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.808651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-config-data\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.808670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxdxr\" (UniqueName: \"kubernetes.io/projected/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kube-api-access-nxdxr\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.808715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.808745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.809795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-config-data\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " 
pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.814545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.815202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.825795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxdxr\" (UniqueName: \"kubernetes.io/projected/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kube-api-access-nxdxr\") pod \"memcached-0\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.829805 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.884515 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.966108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:35:40 crc kubenswrapper[4707]: I0121 15:35:40.977337 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 21 15:35:40 crc kubenswrapper[4707]: W0121 15:35:40.977595 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a77adde_7080_44bf_bbba_0d8b0f874e98.slice/crio-83e6b723e40a76c6e1ac48c1a8c6f33d260f6beb1081b8777747f123246e9621 WatchSource:0}: Error finding container 83e6b723e40a76c6e1ac48c1a8c6f33d260f6beb1081b8777747f123246e9621: Status 404 returned error can't find the container with id 83e6b723e40a76c6e1ac48c1a8c6f33d260f6beb1081b8777747f123246e9621 Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.217666 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 21 15:35:41 crc kubenswrapper[4707]: W0121 15:35:41.219129 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3597cfd3_7c8f_41b7_9265_cfffa2bab5e9.slice/crio-8d8f7f728f687b371c12b8c70f435ffe4a2bff3180de1274b1661aa55440023f WatchSource:0}: Error finding container 8d8f7f728f687b371c12b8c70f435ffe4a2bff3180de1274b1661aa55440023f: Status 404 returned error can't find the container with id 8d8f7f728f687b371c12b8c70f435ffe4a2bff3180de1274b1661aa55440023f Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.347450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:35:41 crc kubenswrapper[4707]: W0121 15:35:41.350916 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e12f07_e051_4381_8419_5f3a1a09d6c5.slice/crio-4335b58d1c1e7187db5559e67e8b78629dd9337ce9d4fa2e4d68c6c8f435e90f WatchSource:0}: Error finding container 4335b58d1c1e7187db5559e67e8b78629dd9337ce9d4fa2e4d68c6c8f435e90f: Status 404 returned error can't find the container with id 4335b58d1c1e7187db5559e67e8b78629dd9337ce9d4fa2e4d68c6c8f435e90f Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.804104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"2a77adde-7080-44bf-bbba-0d8b0f874e98","Type":"ContainerStarted","Data":"c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.804162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"2a77adde-7080-44bf-bbba-0d8b0f874e98","Type":"ContainerStarted","Data":"83e6b723e40a76c6e1ac48c1a8c6f33d260f6beb1081b8777747f123246e9621"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.806359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d1e12f07-e051-4381-8419-5f3a1a09d6c5","Type":"ContainerStarted","Data":"aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.806389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d1e12f07-e051-4381-8419-5f3a1a09d6c5","Type":"ContainerStarted","Data":"4335b58d1c1e7187db5559e67e8b78629dd9337ce9d4fa2e4d68c6c8f435e90f"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.806872 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.808245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9","Type":"ContainerStarted","Data":"fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.808302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9","Type":"ContainerStarted","Data":"8d8f7f728f687b371c12b8c70f435ffe4a2bff3180de1274b1661aa55440023f"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.809674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"741daec5-08a7-4769-8b0b-4248c3c02d69","Type":"ContainerStarted","Data":"6312bc7575fea47a596d88868e21654648feff1c9d8fe5d956eeb115a2036499"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.809721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"741daec5-08a7-4769-8b0b-4248c3c02d69","Type":"ContainerStarted","Data":"1c8f86be8625ae982d6dfb74ae0e8d2e5e0d1a39f6066ee75a12984ba71fbe62"} Jan 21 15:35:41 crc kubenswrapper[4707]: I0121 15:35:41.864643 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.86462583 podStartE2EDuration="1.86462583s" podCreationTimestamp="2026-01-21 15:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 15:35:41.857241455 +0000 UTC m=+2039.038757677" watchObservedRunningTime="2026-01-21 15:35:41.86462583 +0000 UTC m=+2039.046142052" Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.797338 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.798374 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.800714 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-mksqz" Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.804070 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.817628 4707 generic.go:334] "Generic (PLEG): container finished" podID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerID="4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678" exitCode=0 Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.818411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b","Type":"ContainerDied","Data":"4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678"} Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.842845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lsj9\" (UniqueName: \"kubernetes.io/projected/624a8ae2-0096-40d9-afc7-efa6b5f4f947-kube-api-access-4lsj9\") pod \"kube-state-metrics-0\" (UID: \"624a8ae2-0096-40d9-afc7-efa6b5f4f947\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.945025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lsj9\" (UniqueName: \"kubernetes.io/projected/624a8ae2-0096-40d9-afc7-efa6b5f4f947-kube-api-access-4lsj9\") pod \"kube-state-metrics-0\" (UID: \"624a8ae2-0096-40d9-afc7-efa6b5f4f947\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:42 crc kubenswrapper[4707]: I0121 15:35:42.960204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lsj9\" (UniqueName: \"kubernetes.io/projected/624a8ae2-0096-40d9-afc7-efa6b5f4f947-kube-api-access-4lsj9\") pod \"kube-state-metrics-0\" (UID: \"624a8ae2-0096-40d9-afc7-efa6b5f4f947\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.044290 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:43 crc kubenswrapper[4707]: W0121 15:35:43.414609 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod624a8ae2_0096_40d9_afc7_efa6b5f4f947.slice/crio-e5044283263e865126b8f868efbfb765da9c9adad6de39076539e005a411484e WatchSource:0}: Error finding container e5044283263e865126b8f868efbfb765da9c9adad6de39076539e005a411484e: Status 404 returned error can't find the container with id e5044283263e865126b8f868efbfb765da9c9adad6de39076539e005a411484e Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.415061 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.464372 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rshmd"] Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.465734 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.480621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rshmd"] Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.655790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfzx\" (UniqueName: \"kubernetes.io/projected/9b2b206d-6a55-4206-8353-a7f791c96091-kube-api-access-ltfzx\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.656002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-utilities\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.656071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-catalog-content\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.757796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-utilities\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.757899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-catalog-content\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.757938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfzx\" (UniqueName: 
\"kubernetes.io/projected/9b2b206d-6a55-4206-8353-a7f791c96091-kube-api-access-ltfzx\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.758605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-utilities\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.758826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-catalog-content\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.777310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfzx\" (UniqueName: \"kubernetes.io/projected/9b2b206d-6a55-4206-8353-a7f791c96091-kube-api-access-ltfzx\") pod \"certified-operators-rshmd\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.780286 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.830362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b","Type":"ContainerStarted","Data":"8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443"} Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.839687 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerID="dbb3b37d291b0ba6c3e5d38b8c71359fbf200dfd228ed3c32b7182d8d9d73e43" exitCode=0 Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.839783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"b0cfe067-0df4-46ce-b674-eed01fb4ca91","Type":"ContainerDied","Data":"dbb3b37d291b0ba6c3e5d38b8c71359fbf200dfd228ed3c32b7182d8d9d73e43"} Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.853538 4707 generic.go:334] "Generic (PLEG): container finished" podID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerID="c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7" exitCode=0 Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.853597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d","Type":"ContainerDied","Data":"c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7"} Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.857594 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.85758214 podStartE2EDuration="6.85758214s" podCreationTimestamp="2026-01-21 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:43.857051651 +0000 UTC m=+2041.038567873" 
watchObservedRunningTime="2026-01-21 15:35:43.85758214 +0000 UTC m=+2041.039098361" Jan 21 15:35:43 crc kubenswrapper[4707]: I0121 15:35:43.857966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"624a8ae2-0096-40d9-afc7-efa6b5f4f947","Type":"ContainerStarted","Data":"e5044283263e865126b8f868efbfb765da9c9adad6de39076539e005a411484e"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.219623 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rshmd"] Jan 21 15:35:44 crc kubenswrapper[4707]: W0121 15:35:44.220507 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b2b206d_6a55_4206_8353_a7f791c96091.slice/crio-7aad15d047e10f97d0a68c6b49d6be2b6a2a1949496d5ad7c89bc619da052807 WatchSource:0}: Error finding container 7aad15d047e10f97d0a68c6b49d6be2b6a2a1949496d5ad7c89bc619da052807: Status 404 returned error can't find the container with id 7aad15d047e10f97d0a68c6b49d6be2b6a2a1949496d5ad7c89bc619da052807 Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.865681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d","Type":"ContainerStarted","Data":"0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.867429 4707 generic.go:334] "Generic (PLEG): container finished" podID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerID="6312bc7575fea47a596d88868e21654648feff1c9d8fe5d956eeb115a2036499" exitCode=0 Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.867490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"741daec5-08a7-4769-8b0b-4248c3c02d69","Type":"ContainerDied","Data":"6312bc7575fea47a596d88868e21654648feff1c9d8fe5d956eeb115a2036499"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.868710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"624a8ae2-0096-40d9-afc7-efa6b5f4f947","Type":"ContainerStarted","Data":"10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.868842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.869970 4707 generic.go:334] "Generic (PLEG): container finished" podID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerID="c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d" exitCode=0 Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.870024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"2a77adde-7080-44bf-bbba-0d8b0f874e98","Type":"ContainerDied","Data":"c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.871551 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b2b206d-6a55-4206-8353-a7f791c96091" containerID="6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d" exitCode=0 Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.871610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" 
event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerDied","Data":"6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.871651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerStarted","Data":"7aad15d047e10f97d0a68c6b49d6be2b6a2a1949496d5ad7c89bc619da052807"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.872965 4707 generic.go:334] "Generic (PLEG): container finished" podID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerID="fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee" exitCode=0 Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.873018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9","Type":"ContainerDied","Data":"fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.875667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"b0cfe067-0df4-46ce-b674-eed01fb4ca91","Type":"ContainerStarted","Data":"cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91"} Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.900640 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-2" podStartSLOduration=7.900627307 podStartE2EDuration="7.900627307s" podCreationTimestamp="2026-01-21 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:44.892327371 +0000 UTC m=+2042.073843593" watchObservedRunningTime="2026-01-21 15:35:44.900627307 +0000 UTC m=+2042.082143529" Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.926526 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.635817034 podStartE2EDuration="2.926510865s" podCreationTimestamp="2026-01-21 15:35:42 +0000 UTC" firstStartedPulling="2026-01-21 15:35:43.416839968 +0000 UTC m=+2040.598356190" lastFinishedPulling="2026-01-21 15:35:43.707533798 +0000 UTC m=+2040.889050021" observedRunningTime="2026-01-21 15:35:44.920321877 +0000 UTC m=+2042.101838120" watchObservedRunningTime="2026-01-21 15:35:44.926510865 +0000 UTC m=+2042.108027087" Jan 21 15:35:44 crc kubenswrapper[4707]: I0121 15:35:44.983144 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-1" podStartSLOduration=7.983127358 podStartE2EDuration="7.983127358s" podCreationTimestamp="2026-01-21 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:44.975582522 +0000 UTC m=+2042.157098744" watchObservedRunningTime="2026-01-21 15:35:44.983127358 +0000 UTC m=+2042.164643580" Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.883934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"741daec5-08a7-4769-8b0b-4248c3c02d69","Type":"ContainerStarted","Data":"0acfdb10ed4b8a5d4173af5a014e847071dd24247841328d306d9a0bc63b44ec"} Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.885856 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"2a77adde-7080-44bf-bbba-0d8b0f874e98","Type":"ContainerStarted","Data":"8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f"} Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.887701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerStarted","Data":"d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f"} Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.889476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9","Type":"ContainerStarted","Data":"67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f"} Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.911443 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=6.911429419 podStartE2EDuration="6.911429419s" podCreationTimestamp="2026-01-21 15:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:45.909529106 +0000 UTC m=+2043.091045328" watchObservedRunningTime="2026-01-21 15:35:45.911429419 +0000 UTC m=+2043.092945641" Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.947507 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-1" podStartSLOduration=6.94749442 podStartE2EDuration="6.94749442s" podCreationTimestamp="2026-01-21 15:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:45.941592394 +0000 UTC m=+2043.123108605" watchObservedRunningTime="2026-01-21 15:35:45.94749442 +0000 UTC m=+2043.129010633" Jan 21 15:35:45 crc kubenswrapper[4707]: I0121 15:35:45.961175 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-2" podStartSLOduration=6.96116238 podStartE2EDuration="6.96116238s" podCreationTimestamp="2026-01-21 15:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:45.957380911 +0000 UTC m=+2043.138897132" watchObservedRunningTime="2026-01-21 15:35:45.96116238 +0000 UTC m=+2043.142678602" Jan 21 15:35:46 crc kubenswrapper[4707]: I0121 15:35:46.897687 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b2b206d-6a55-4206-8353-a7f791c96091" containerID="d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f" exitCode=0 Jan 21 15:35:46 crc kubenswrapper[4707]: I0121 15:35:46.898402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerDied","Data":"d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f"} Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.055365 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.056462 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.058011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.058369 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.058391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.058462 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.058534 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-9vhns" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.068995 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.110778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.110873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.110907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.111019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw8l7\" (UniqueName: \"kubernetes.io/projected/0131070b-7f14-4797-aa35-8ed780306565-kube-api-access-dw8l7\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.111102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0131070b-7f14-4797-aa35-8ed780306565-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.111242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc 
kubenswrapper[4707]: I0121 15:35:47.111287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.111407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-config\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-config\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8l7\" (UniqueName: \"kubernetes.io/projected/0131070b-7f14-4797-aa35-8ed780306565-kube-api-access-dw8l7\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0131070b-7f14-4797-aa35-8ed780306565-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.213743 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.214314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-config\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.215365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0131070b-7f14-4797-aa35-8ed780306565-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.215798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.218202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.218250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.219155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.228131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8l7\" (UniqueName: \"kubernetes.io/projected/0131070b-7f14-4797-aa35-8ed780306565-kube-api-access-dw8l7\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.235740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.367993 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.735565 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:35:47 crc kubenswrapper[4707]: W0121 15:35:47.737966 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0131070b_7f14_4797_aa35_8ed780306565.slice/crio-4d13023c30521c8ed1296133912c757acc390d68d804fd845bf45b97c9c67995 WatchSource:0}: Error finding container 4d13023c30521c8ed1296133912c757acc390d68d804fd845bf45b97c9c67995: Status 404 returned error can't find the container with id 4d13023c30521c8ed1296133912c757acc390d68d804fd845bf45b97c9c67995 Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.906509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerStarted","Data":"32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea"} Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.907784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0131070b-7f14-4797-aa35-8ed780306565","Type":"ContainerStarted","Data":"86e3d70b4ce7602fd0e3af1064f4bf80df7351fe1430fbcc7626e11c9a2e2b89"} Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.907830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0131070b-7f14-4797-aa35-8ed780306565","Type":"ContainerStarted","Data":"4d13023c30521c8ed1296133912c757acc390d68d804fd845bf45b97c9c67995"} Jan 21 15:35:47 crc kubenswrapper[4707]: I0121 15:35:47.921172 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rshmd" podStartSLOduration=2.387250596 podStartE2EDuration="4.92116097s" podCreationTimestamp="2026-01-21 15:35:43 +0000 UTC" firstStartedPulling="2026-01-21 15:35:44.872631507 +0000 UTC m=+2042.054147730" lastFinishedPulling="2026-01-21 15:35:47.406541882 +0000 UTC m=+2044.588058104" observedRunningTime="2026-01-21 15:35:47.919580238 +0000 UTC m=+2045.101096460" watchObservedRunningTime="2026-01-21 15:35:47.92116097 +0000 UTC m=+2045.102677182" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.377910 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.378971 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.384780 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.385118 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-26xwl" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.385287 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.385425 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.399306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-config\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxpv\" (UniqueName: \"kubernetes.io/projected/a84e26cb-9beb-4f23-9101-f4e981d8c394-kube-api-access-jsxpv\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.433776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-config\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxpv\" (UniqueName: \"kubernetes.io/projected/a84e26cb-9beb-4f23-9101-f4e981d8c394-kube-api-access-jsxpv\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.535973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.536019 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.536271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.536783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-config\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.537083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.539830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.540145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.541075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.550638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxpv\" (UniqueName: \"kubernetes.io/projected/a84e26cb-9beb-4f23-9101-f4e981d8c394-kube-api-access-jsxpv\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.553538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.701516 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.915068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0131070b-7f14-4797-aa35-8ed780306565","Type":"ContainerStarted","Data":"1ad0818035f81047bb782a695932bdec1acc1f17765cf97e5139e9764439d548"} Jan 21 15:35:48 crc kubenswrapper[4707]: I0121 15:35:48.929786 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.929767955 podStartE2EDuration="2.929767955s" podCreationTimestamp="2026-01-21 15:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:48.927440549 +0000 UTC m=+2046.108956771" watchObservedRunningTime="2026-01-21 15:35:48.929767955 +0000 UTC m=+2046.111284177" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.056366 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:35:49 crc kubenswrapper[4707]: W0121 15:35:49.059451 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda84e26cb_9beb_4f23_9101_f4e981d8c394.slice/crio-53095e8fd063ee2f94852ed4d31ef326bb37a0a23bb199b2f0f528ff5208f7f1 WatchSource:0}: Error finding container 53095e8fd063ee2f94852ed4d31ef326bb37a0a23bb199b2f0f528ff5208f7f1: Status 404 returned error can't find the container with id 53095e8fd063ee2f94852ed4d31ef326bb37a0a23bb199b2f0f528ff5208f7f1 Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.208049 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.208256 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.214749 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.214863 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.222890 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.223053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.922521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a84e26cb-9beb-4f23-9101-f4e981d8c394","Type":"ContainerStarted","Data":"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197"} Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.923229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a84e26cb-9beb-4f23-9101-f4e981d8c394","Type":"ContainerStarted","Data":"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311"} Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.923256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"a84e26cb-9beb-4f23-9101-f4e981d8c394","Type":"ContainerStarted","Data":"53095e8fd063ee2f94852ed4d31ef326bb37a0a23bb199b2f0f528ff5208f7f1"} Jan 21 15:35:49 crc kubenswrapper[4707]: I0121 15:35:49.941303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.941287837 podStartE2EDuration="2.941287837s" podCreationTimestamp="2026-01-21 15:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:49.935386481 +0000 UTC m=+2047.116902702" watchObservedRunningTime="2026-01-21 15:35:49.941287837 +0000 UTC m=+2047.122804059" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.369076 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.394596 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.509695 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.509750 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.517084 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.517118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.830742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.830780 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.885847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:35:50 crc kubenswrapper[4707]: I0121 15:35:50.928982 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:51 crc kubenswrapper[4707]: I0121 15:35:51.278529 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:51 crc kubenswrapper[4707]: I0121 15:35:51.332702 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:35:51 crc kubenswrapper[4707]: I0121 15:35:51.701826 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:52 crc kubenswrapper[4707]: I0121 15:35:52.395055 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:35:53 crc kubenswrapper[4707]: I0121 15:35:53.051507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:35:53 crc kubenswrapper[4707]: I0121 
15:35:53.701693 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:53 crc kubenswrapper[4707]: I0121 15:35:53.781010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:53 crc kubenswrapper[4707]: I0121 15:35:53.781049 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:53 crc kubenswrapper[4707]: I0121 15:35:53.815419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:53 crc kubenswrapper[4707]: I0121 15:35:53.974563 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.163493 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.167674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.170678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-7t492" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.170698 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.171373 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.172964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.175646 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.318607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-lock\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.318663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjp6\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-kube-api-access-qqjp6\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.318705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.318776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"swift-storage-0\" 
(UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.318803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-cache\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-cache\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-lock\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjp6\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-kube-api-access-qqjp6\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-cache\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-lock\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.420718 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.420746 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.420888 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift podName:70e67f96-41b8-43d4-b048-546873358118 nodeName:}" failed. No retries permitted until 2026-01-21 15:35:54.92087333 +0000 UTC m=+2052.102389551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift") pod "swift-storage-0" (UID: "70e67f96-41b8-43d4-b048-546873358118") : configmap "swift-ring-files" not found Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.420966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.421192 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.435019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjp6\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-kube-api-access-qqjp6\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.437678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.547541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-v97xz"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.548724 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.550482 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.550775 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.555198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.555602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-v97xz"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.575050 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-v97xz"] Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.575904 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-88ml8 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-88ml8 ring-data-devices scripts swiftconf]: context canceled" pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" podUID="8b45c7a9-17c2-436e-891f-f7c6a05a8129" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.578045 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6jqxb"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.578901 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.589422 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6jqxb"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-swiftconf\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b45c7a9-17c2-436e-891f-f7c6a05a8129-etc-swift\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-scripts\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-combined-ca-bundle\") pod \"swift-ring-rebalance-v97xz\" 
(UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ml8\" (UniqueName: \"kubernetes.io/projected/8b45c7a9-17c2-436e-891f-f7c6a05a8129-kube-api-access-88ml8\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-ring-data-devices\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.624606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-dispersionconf\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-scripts\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-combined-ca-bundle\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-ring-data-devices\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ml8\" (UniqueName: \"kubernetes.io/projected/8b45c7a9-17c2-436e-891f-f7c6a05a8129-kube-api-access-88ml8\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-ring-data-devices\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-dispersionconf\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-etc-swift\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-combined-ca-bundle\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-swiftconf\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b45c7a9-17c2-436e-891f-f7c6a05a8129-etc-swift\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-swiftconf\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9qs\" (UniqueName: \"kubernetes.io/projected/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-kube-api-access-mn9qs\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-dispersionconf\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726539 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-scripts\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.726953 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-scripts\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.727175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-ring-data-devices\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.727203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b45c7a9-17c2-436e-891f-f7c6a05a8129-etc-swift\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.729335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.730428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-combined-ca-bundle\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.730510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-dispersionconf\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.740869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-swiftconf\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.751515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ml8\" (UniqueName: \"kubernetes.io/projected/8b45c7a9-17c2-436e-891f-f7c6a05a8129-kube-api-access-88ml8\") pod \"swift-ring-rebalance-v97xz\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.761477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.804355 4707 scope.go:117] "RemoveContainer" containerID="11a1605bcc319d9ed814d9d67aa7a9e3e51111b55b8c53c91e8414081bb3326b" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-ring-data-devices\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-etc-swift\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-combined-ca-bundle\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-swiftconf\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9qs\" (UniqueName: \"kubernetes.io/projected/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-kube-api-access-mn9qs\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-dispersionconf\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-scripts\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.828828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-etc-swift\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.829144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-ring-data-devices\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.829489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-scripts\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.832654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-dispersionconf\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.833274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-combined-ca-bundle\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.833517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-swiftconf\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.854370 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.855674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9qs\" (UniqueName: \"kubernetes.io/projected/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-kube-api-access-mn9qs\") pod \"swift-ring-rebalance-6jqxb\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.855785 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.860509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-rkz9q" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.860666 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.860785 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.864901 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.877368 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.889773 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxkq\" (UniqueName: \"kubernetes.io/projected/7e183977-7c22-4b6d-a465-132c8f85dbb0-kube-api-access-8hxkq\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930659 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-scripts\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.930668 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.930691 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:35:54 crc kubenswrapper[4707]: E0121 15:35:54.930731 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift podName:70e67f96-41b8-43d4-b048-546873358118 nodeName:}" failed. No retries permitted until 2026-01-21 15:35:55.93071747 +0000 UTC m=+2053.112233692 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift") pod "swift-storage-0" (UID: "70e67f96-41b8-43d4-b048-546873358118") : configmap "swift-ring-files" not found Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-config\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.930862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.952306 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:54 crc kubenswrapper[4707]: I0121 15:35:54.959986 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b45c7a9-17c2-436e-891f-f7c6a05a8129-etc-swift\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-dispersionconf\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-combined-ca-bundle\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88ml8\" (UniqueName: \"kubernetes.io/projected/8b45c7a9-17c2-436e-891f-f7c6a05a8129-kube-api-access-88ml8\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-ring-data-devices\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: 
\"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-swiftconf\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.031749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-scripts\") pod \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\" (UID: \"8b45c7a9-17c2-436e-891f-f7c6a05a8129\") " Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxkq\" (UniqueName: \"kubernetes.io/projected/7e183977-7c22-4b6d-a465-132c8f85dbb0-kube-api-access-8hxkq\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-scripts\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-config\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.032620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-scripts" (OuterVolumeSpecName: 
"scripts") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.036001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.036513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-scripts\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.036732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.036951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-config\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.037492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.038926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.040894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.041540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b45c7a9-17c2-436e-891f-f7c6a05a8129-kube-api-access-88ml8" (OuterVolumeSpecName: "kube-api-access-88ml8") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "kube-api-access-88ml8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.048157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.056868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b45c7a9-17c2-436e-891f-f7c6a05a8129-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.057391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b45c7a9-17c2-436e-891f-f7c6a05a8129" (UID: "8b45c7a9-17c2-436e-891f-f7c6a05a8129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.059244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxkq\" (UniqueName: \"kubernetes.io/projected/7e183977-7c22-4b6d-a465-132c8f85dbb0-kube-api-access-8hxkq\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.059934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.133999 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.134032 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88ml8\" (UniqueName: \"kubernetes.io/projected/8b45c7a9-17c2-436e-891f-f7c6a05a8129-kube-api-access-88ml8\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.134045 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.134054 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.134062 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b45c7a9-17c2-436e-891f-f7c6a05a8129-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.134070 4707 reconciler_common.go:293] "Volume 
detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b45c7a9-17c2-436e-891f-f7c6a05a8129-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.134078 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b45c7a9-17c2-436e-891f-f7c6a05a8129-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.170889 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.293541 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6jqxb"] Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.531458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.948013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:55 crc kubenswrapper[4707]: E0121 15:35:55.948172 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:35:55 crc kubenswrapper[4707]: E0121 15:35:55.949086 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:35:55 crc kubenswrapper[4707]: E0121 15:35:55.949139 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift podName:70e67f96-41b8-43d4-b048-546873358118 nodeName:}" failed. No retries permitted until 2026-01-21 15:35:57.949123199 +0000 UTC m=+2055.130639420 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift") pod "swift-storage-0" (UID: "70e67f96-41b8-43d4-b048-546873358118") : configmap "swift-ring-files" not found Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.961280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" event={"ID":"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b","Type":"ContainerStarted","Data":"17e70d7489d29ca57860b2d236ba90a62d858bc89f801452c8a3eba7098c9352"} Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.961321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" event={"ID":"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b","Type":"ContainerStarted","Data":"f08bf1ef24e21baf248c0a3432c206bb5b619e7a6b6a96c14b42a6d26f665c49"} Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.962979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7e183977-7c22-4b6d-a465-132c8f85dbb0","Type":"ContainerStarted","Data":"a31decf41a1b02620ab2be257c60c8227184c0d7af43f0210eae219e937afaba"} Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.963141 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.963229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7e183977-7c22-4b6d-a465-132c8f85dbb0","Type":"ContainerStarted","Data":"8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61"} Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.962992 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-v97xz" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.963305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7e183977-7c22-4b6d-a465-132c8f85dbb0","Type":"ContainerStarted","Data":"9c3a6d7076a0d105db144a6333b85c631bc6ff5efa3e4cd298a1f6de22f877a7"} Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.975345 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" podStartSLOduration=1.9753316079999999 podStartE2EDuration="1.975331608s" podCreationTimestamp="2026-01-21 15:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:55.972616281 +0000 UTC m=+2053.154132503" watchObservedRunningTime="2026-01-21 15:35:55.975331608 +0000 UTC m=+2053.156847830" Jan 21 15:35:55 crc kubenswrapper[4707]: I0121 15:35:55.991010 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.990996863 podStartE2EDuration="1.990996863s" podCreationTimestamp="2026-01-21 15:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:55.986934162 +0000 UTC m=+2053.168450385" watchObservedRunningTime="2026-01-21 15:35:55.990996863 +0000 UTC m=+2053.172513085" Jan 21 15:35:56 crc kubenswrapper[4707]: I0121 15:35:56.016696 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-v97xz"] Jan 21 15:35:56 crc kubenswrapper[4707]: I0121 15:35:56.022460 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-v97xz"] Jan 21 15:35:56 crc kubenswrapper[4707]: I0121 15:35:56.599870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:56 crc kubenswrapper[4707]: I0121 15:35:56.652906 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.058559 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rshmd"] Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.058959 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rshmd" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="registry-server" containerID="cri-o://32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea" gracePeriod=2 Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.191074 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b45c7a9-17c2-436e-891f-f7c6a05a8129" path="/var/lib/kubelet/pods/8b45c7a9-17c2-436e-891f-f7c6a05a8129/volumes" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.492267 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.575174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-catalog-content\") pod \"9b2b206d-6a55-4206-8353-a7f791c96091\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.575292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltfzx\" (UniqueName: \"kubernetes.io/projected/9b2b206d-6a55-4206-8353-a7f791c96091-kube-api-access-ltfzx\") pod \"9b2b206d-6a55-4206-8353-a7f791c96091\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.575341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-utilities\") pod \"9b2b206d-6a55-4206-8353-a7f791c96091\" (UID: \"9b2b206d-6a55-4206-8353-a7f791c96091\") " Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.576236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-utilities" (OuterVolumeSpecName: "utilities") pod "9b2b206d-6a55-4206-8353-a7f791c96091" (UID: "9b2b206d-6a55-4206-8353-a7f791c96091"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.583701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b2b206d-6a55-4206-8353-a7f791c96091-kube-api-access-ltfzx" (OuterVolumeSpecName: "kube-api-access-ltfzx") pod "9b2b206d-6a55-4206-8353-a7f791c96091" (UID: "9b2b206d-6a55-4206-8353-a7f791c96091"). InnerVolumeSpecName "kube-api-access-ltfzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.616790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b2b206d-6a55-4206-8353-a7f791c96091" (UID: "9b2b206d-6a55-4206-8353-a7f791c96091"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.679191 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltfzx\" (UniqueName: \"kubernetes.io/projected/9b2b206d-6a55-4206-8353-a7f791c96091-kube-api-access-ltfzx\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.679237 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.679249 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2b206d-6a55-4206-8353-a7f791c96091-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.728104 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4hf7c"] Jan 21 15:35:57 crc kubenswrapper[4707]: E0121 15:35:57.728540 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="extract-utilities" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.728555 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="extract-utilities" Jan 21 15:35:57 crc kubenswrapper[4707]: E0121 15:35:57.728581 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="extract-content" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.728587 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="extract-content" Jan 21 15:35:57 crc kubenswrapper[4707]: E0121 15:35:57.728608 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="registry-server" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.728613 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="registry-server" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.728781 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" containerName="registry-server" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.729521 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.731804 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.753718 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4hf7c"] Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.780526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c80d2f2-2f59-4b71-96fe-e57d442c2789-operator-scripts\") pod \"root-account-create-update-4hf7c\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.780606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqsth\" (UniqueName: \"kubernetes.io/projected/2c80d2f2-2f59-4b71-96fe-e57d442c2789-kube-api-access-tqsth\") pod \"root-account-create-update-4hf7c\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.882277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqsth\" (UniqueName: \"kubernetes.io/projected/2c80d2f2-2f59-4b71-96fe-e57d442c2789-kube-api-access-tqsth\") pod \"root-account-create-update-4hf7c\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.882456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c80d2f2-2f59-4b71-96fe-e57d442c2789-operator-scripts\") pod \"root-account-create-update-4hf7c\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.883157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c80d2f2-2f59-4b71-96fe-e57d442c2789-operator-scripts\") pod \"root-account-create-update-4hf7c\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.896410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqsth\" (UniqueName: \"kubernetes.io/projected/2c80d2f2-2f59-4b71-96fe-e57d442c2789-kube-api-access-tqsth\") pod \"root-account-create-update-4hf7c\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.977168 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b2b206d-6a55-4206-8353-a7f791c96091" containerID="32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea" exitCode=0 Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.977220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" 
event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerDied","Data":"32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea"} Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.977239 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rshmd" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.977260 4707 scope.go:117] "RemoveContainer" containerID="32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea" Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.977248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rshmd" event={"ID":"9b2b206d-6a55-4206-8353-a7f791c96091","Type":"ContainerDied","Data":"7aad15d047e10f97d0a68c6b49d6be2b6a2a1949496d5ad7c89bc619da052807"} Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.983771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:35:57 crc kubenswrapper[4707]: E0121 15:35:57.983917 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:35:57 crc kubenswrapper[4707]: E0121 15:35:57.983934 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:35:57 crc kubenswrapper[4707]: E0121 15:35:57.983978 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift podName:70e67f96-41b8-43d4-b048-546873358118 nodeName:}" failed. No retries permitted until 2026-01-21 15:36:01.983964903 +0000 UTC m=+2059.165481125 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift") pod "swift-storage-0" (UID: "70e67f96-41b8-43d4-b048-546873358118") : configmap "swift-ring-files" not found Jan 21 15:35:57 crc kubenswrapper[4707]: I0121 15:35:57.992563 4707 scope.go:117] "RemoveContainer" containerID="d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.008901 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rshmd"] Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.012162 4707 scope.go:117] "RemoveContainer" containerID="6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.015400 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rshmd"] Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.035243 4707 scope.go:117] "RemoveContainer" containerID="32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea" Jan 21 15:35:58 crc kubenswrapper[4707]: E0121 15:35:58.035528 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea\": container with ID starting with 32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea not found: ID does not exist" containerID="32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.035564 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea"} err="failed to get container status \"32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea\": rpc error: code = NotFound desc = could not find container \"32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea\": container with ID starting with 32b7d91e2d50b7f9e38137d9bb6c5f4d46799d2436fe60de1e8c79124f898eea not found: ID does not exist" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.035588 4707 scope.go:117] "RemoveContainer" containerID="d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f" Jan 21 15:35:58 crc kubenswrapper[4707]: E0121 15:35:58.036653 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f\": container with ID starting with d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f not found: ID does not exist" containerID="d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.036681 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f"} err="failed to get container status \"d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f\": rpc error: code = NotFound desc = could not find container \"d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f\": container with ID starting with d907831dcadeb146710763968a9f518ff15393a10b62f39e7ff48738c1653a2f not found: ID does not exist" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.036697 4707 scope.go:117] "RemoveContainer" 
containerID="6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d" Jan 21 15:35:58 crc kubenswrapper[4707]: E0121 15:35:58.038374 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d\": container with ID starting with 6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d not found: ID does not exist" containerID="6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.038400 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d"} err="failed to get container status \"6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d\": rpc error: code = NotFound desc = could not find container \"6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d\": container with ID starting with 6f12a6cd4fd84e0fe696585ed9749d14989da2c3109f8755db98eeb9c6ec8d1d not found: ID does not exist" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.055295 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.440621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4hf7c"] Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.984696 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c80d2f2-2f59-4b71-96fe-e57d442c2789" containerID="b1188d36dc2bdebaced9f39f950f8b3b1eca82f9ad0ca7aef76acdda21e53bf1" exitCode=0 Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.984743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" event={"ID":"2c80d2f2-2f59-4b71-96fe-e57d442c2789","Type":"ContainerDied","Data":"b1188d36dc2bdebaced9f39f950f8b3b1eca82f9ad0ca7aef76acdda21e53bf1"} Jan 21 15:35:58 crc kubenswrapper[4707]: I0121 15:35:58.985151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" event={"ID":"2c80d2f2-2f59-4b71-96fe-e57d442c2789","Type":"ContainerStarted","Data":"930370acdf3230346a2f6aeb735e39eead79edafc34c582b7aae6ce34ea21c64"} Jan 21 15:35:59 crc kubenswrapper[4707]: I0121 15:35:59.190858 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b2b206d-6a55-4206-8353-a7f791c96091" path="/var/lib/kubelet/pods/9b2b206d-6a55-4206-8353-a7f791c96091/volumes" Jan 21 15:35:59 crc kubenswrapper[4707]: I0121 15:35:59.294683 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-2" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="galera" probeResult="failure" output=< Jan 21 15:35:59 crc kubenswrapper[4707]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 21 15:35:59 crc kubenswrapper[4707]: > Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.146353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.227353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.307479 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.425665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqsth\" (UniqueName: \"kubernetes.io/projected/2c80d2f2-2f59-4b71-96fe-e57d442c2789-kube-api-access-tqsth\") pod \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.425847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c80d2f2-2f59-4b71-96fe-e57d442c2789-operator-scripts\") pod \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\" (UID: \"2c80d2f2-2f59-4b71-96fe-e57d442c2789\") " Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.426241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c80d2f2-2f59-4b71-96fe-e57d442c2789-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c80d2f2-2f59-4b71-96fe-e57d442c2789" (UID: "2c80d2f2-2f59-4b71-96fe-e57d442c2789"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.426472 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c80d2f2-2f59-4b71-96fe-e57d442c2789-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.439946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c80d2f2-2f59-4b71-96fe-e57d442c2789-kube-api-access-tqsth" (OuterVolumeSpecName: "kube-api-access-tqsth") pod "2c80d2f2-2f59-4b71-96fe-e57d442c2789" (UID: "2c80d2f2-2f59-4b71-96fe-e57d442c2789"). InnerVolumeSpecName "kube-api-access-tqsth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.452415 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8s54s"] Jan 21 15:36:00 crc kubenswrapper[4707]: E0121 15:36:00.452948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c80d2f2-2f59-4b71-96fe-e57d442c2789" containerName="mariadb-account-create-update" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.452965 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c80d2f2-2f59-4b71-96fe-e57d442c2789" containerName="mariadb-account-create-update" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.453119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c80d2f2-2f59-4b71-96fe-e57d442c2789" containerName="mariadb-account-create-update" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.453617 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.476400 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8s54s"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.529030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca523b2c-039d-42f0-8602-10b7fe0d4134-operator-scripts\") pod \"keystone-db-create-8s54s\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.529161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fcx7\" (UniqueName: \"kubernetes.io/projected/ca523b2c-039d-42f0-8602-10b7fe0d4134-kube-api-access-5fcx7\") pod \"keystone-db-create-8s54s\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.529524 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqsth\" (UniqueName: \"kubernetes.io/projected/2c80d2f2-2f59-4b71-96fe-e57d442c2789-kube-api-access-tqsth\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.568621 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.569638 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.571251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.576582 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.605579 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/openstack-cell1-galera-2" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="galera" probeResult="failure" output=< Jan 21 15:36:00 crc kubenswrapper[4707]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 21 15:36:00 crc kubenswrapper[4707]: > Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.630781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2362c581-1317-4f6d-8967-0bf4dedd8637-operator-scripts\") pod \"keystone-ab63-account-create-update-mtmbc\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.631006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca523b2c-039d-42f0-8602-10b7fe0d4134-operator-scripts\") pod \"keystone-db-create-8s54s\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.631093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gqd7z\" (UniqueName: \"kubernetes.io/projected/2362c581-1317-4f6d-8967-0bf4dedd8637-kube-api-access-gqd7z\") pod \"keystone-ab63-account-create-update-mtmbc\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.631160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fcx7\" (UniqueName: \"kubernetes.io/projected/ca523b2c-039d-42f0-8602-10b7fe0d4134-kube-api-access-5fcx7\") pod \"keystone-db-create-8s54s\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.631690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca523b2c-039d-42f0-8602-10b7fe0d4134-operator-scripts\") pod \"keystone-db-create-8s54s\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.646627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fcx7\" (UniqueName: \"kubernetes.io/projected/ca523b2c-039d-42f0-8602-10b7fe0d4134-kube-api-access-5fcx7\") pod \"keystone-db-create-8s54s\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.732698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2362c581-1317-4f6d-8967-0bf4dedd8637-operator-scripts\") pod \"keystone-ab63-account-create-update-mtmbc\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.732860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqd7z\" (UniqueName: \"kubernetes.io/projected/2362c581-1317-4f6d-8967-0bf4dedd8637-kube-api-access-gqd7z\") pod \"keystone-ab63-account-create-update-mtmbc\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.733471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2362c581-1317-4f6d-8967-0bf4dedd8637-operator-scripts\") pod \"keystone-ab63-account-create-update-mtmbc\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.749209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqd7z\" (UniqueName: \"kubernetes.io/projected/2362c581-1317-4f6d-8967-0bf4dedd8637-kube-api-access-gqd7z\") pod \"keystone-ab63-account-create-update-mtmbc\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.760579 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-zjvdt"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.761539 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.769131 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zjvdt"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.772798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.834938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kfq\" (UniqueName: \"kubernetes.io/projected/a916eb4b-5b4d-47ff-b5db-01802c7d1484-kube-api-access-59kfq\") pod \"placement-db-create-zjvdt\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.834977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a916eb4b-5b4d-47ff-b5db-01802c7d1484-operator-scripts\") pod \"placement-db-create-zjvdt\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.876465 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.877558 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.879427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.883300 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.888117 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.936653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b6803b-8c90-49cb-860c-f08d6a472f9e-operator-scripts\") pod \"placement-5e9c-account-create-update-hvxbc\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.936967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jczg\" (UniqueName: \"kubernetes.io/projected/27b6803b-8c90-49cb-860c-f08d6a472f9e-kube-api-access-5jczg\") pod \"placement-5e9c-account-create-update-hvxbc\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.937038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kfq\" (UniqueName: \"kubernetes.io/projected/a916eb4b-5b4d-47ff-b5db-01802c7d1484-kube-api-access-59kfq\") pod \"placement-db-create-zjvdt\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.937061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a916eb4b-5b4d-47ff-b5db-01802c7d1484-operator-scripts\") pod \"placement-db-create-zjvdt\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.937616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a916eb4b-5b4d-47ff-b5db-01802c7d1484-operator-scripts\") pod \"placement-db-create-zjvdt\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.958527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kfq\" (UniqueName: \"kubernetes.io/projected/a916eb4b-5b4d-47ff-b5db-01802c7d1484-kube-api-access-59kfq\") pod \"placement-db-create-zjvdt\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.994254 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-svdgj"] Jan 21 15:36:00 crc kubenswrapper[4707]: I0121 15:36:00.995261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.001854 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-svdgj"] Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.013863 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.013866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4hf7c" event={"ID":"2c80d2f2-2f59-4b71-96fe-e57d442c2789","Type":"ContainerDied","Data":"930370acdf3230346a2f6aeb735e39eead79edafc34c582b7aae6ce34ea21c64"} Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.014092 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930370acdf3230346a2f6aeb735e39eead79edafc34c582b7aae6ce34ea21c64" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.038733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b6803b-8c90-49cb-860c-f08d6a472f9e-operator-scripts\") pod \"placement-5e9c-account-create-update-hvxbc\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.038798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41231705-a331-4eec-8c6f-3bfb45b6db70-operator-scripts\") pod \"glance-db-create-svdgj\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.038858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jczg\" (UniqueName: \"kubernetes.io/projected/27b6803b-8c90-49cb-860c-f08d6a472f9e-kube-api-access-5jczg\") pod \"placement-5e9c-account-create-update-hvxbc\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.038923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58hs\" (UniqueName: \"kubernetes.io/projected/41231705-a331-4eec-8c6f-3bfb45b6db70-kube-api-access-x58hs\") pod \"glance-db-create-svdgj\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.039364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b6803b-8c90-49cb-860c-f08d6a472f9e-operator-scripts\") pod \"placement-5e9c-account-create-update-hvxbc\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.056902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jczg\" (UniqueName: \"kubernetes.io/projected/27b6803b-8c90-49cb-860c-f08d6a472f9e-kube-api-access-5jczg\") pod \"placement-5e9c-account-create-update-hvxbc\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.082953 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-4c4c-account-create-update-gn426"] Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.084154 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.088192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.089077 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.090053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4c4c-account-create-update-gn426"] Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.140132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58hs\" (UniqueName: \"kubernetes.io/projected/41231705-a331-4eec-8c6f-3bfb45b6db70-kube-api-access-x58hs\") pod \"glance-db-create-svdgj\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.140193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76dg\" (UniqueName: \"kubernetes.io/projected/bc880d2b-0a62-4e59-8063-810f798bb321-kube-api-access-w76dg\") pod \"glance-4c4c-account-create-update-gn426\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.140415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41231705-a331-4eec-8c6f-3bfb45b6db70-operator-scripts\") pod \"glance-db-create-svdgj\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.140488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc880d2b-0a62-4e59-8063-810f798bb321-operator-scripts\") pod \"glance-4c4c-account-create-update-gn426\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.141768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41231705-a331-4eec-8c6f-3bfb45b6db70-operator-scripts\") pod \"glance-db-create-svdgj\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.164040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58hs\" (UniqueName: \"kubernetes.io/projected/41231705-a331-4eec-8c6f-3bfb45b6db70-kube-api-access-x58hs\") pod \"glance-db-create-svdgj\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.171658 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8s54s"] Jan 21 15:36:01 crc kubenswrapper[4707]: W0121 15:36:01.181615 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca523b2c_039d_42f0_8602_10b7fe0d4134.slice/crio-6a7bccb9dab17eb7e99ff5e42a4c36aa84500b9cc94388a108957da408aa5cee WatchSource:0}: Error finding container 6a7bccb9dab17eb7e99ff5e42a4c36aa84500b9cc94388a108957da408aa5cee: Status 404 returned error can't find the container with id 6a7bccb9dab17eb7e99ff5e42a4c36aa84500b9cc94388a108957da408aa5cee Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.191463 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.241775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc880d2b-0a62-4e59-8063-810f798bb321-operator-scripts\") pod \"glance-4c4c-account-create-update-gn426\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.241912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w76dg\" (UniqueName: \"kubernetes.io/projected/bc880d2b-0a62-4e59-8063-810f798bb321-kube-api-access-w76dg\") pod \"glance-4c4c-account-create-update-gn426\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.242419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc880d2b-0a62-4e59-8063-810f798bb321-operator-scripts\") pod \"glance-4c4c-account-create-update-gn426\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.265404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76dg\" (UniqueName: \"kubernetes.io/projected/bc880d2b-0a62-4e59-8063-810f798bb321-kube-api-access-w76dg\") pod \"glance-4c4c-account-create-update-gn426\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.309079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.321067 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc"] Jan 21 15:36:01 crc kubenswrapper[4707]: W0121 15:36:01.350942 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2362c581_1317_4f6d_8967_0bf4dedd8637.slice/crio-664a15cf5554015b1e4c82bfddb6e6da6e53f7de5f10cc4f34199a04d50689be WatchSource:0}: Error finding container 664a15cf5554015b1e4c82bfddb6e6da6e53f7de5f10cc4f34199a04d50689be: Status 404 returned error can't find the container with id 664a15cf5554015b1e4c82bfddb6e6da6e53f7de5f10cc4f34199a04d50689be Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.395551 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.536290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zjvdt"] Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.673823 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc"] Jan 21 15:36:01 crc kubenswrapper[4707]: W0121 15:36:01.725874 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b6803b_8c90_49cb_860c_f08d6a472f9e.slice/crio-a32515873dacaa9808ad476bd80a35cb5c3c7f15454839e6bb3f523695e85060 WatchSource:0}: Error finding container a32515873dacaa9808ad476bd80a35cb5c3c7f15454839e6bb3f523695e85060: Status 404 returned error can't find the container with id a32515873dacaa9808ad476bd80a35cb5c3c7f15454839e6bb3f523695e85060 Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.886955 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-svdgj"] Jan 21 15:36:01 crc kubenswrapper[4707]: W0121 15:36:01.888536 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41231705_a331_4eec_8c6f_3bfb45b6db70.slice/crio-2e2f52ec534703a06cd7327a4cac5353869b27a0944a331eaa45efac0e587985 WatchSource:0}: Error finding container 2e2f52ec534703a06cd7327a4cac5353869b27a0944a331eaa45efac0e587985: Status 404 returned error can't find the container with id 2e2f52ec534703a06cd7327a4cac5353869b27a0944a331eaa45efac0e587985 Jan 21 15:36:01 crc kubenswrapper[4707]: I0121 15:36:01.974363 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4c4c-account-create-update-gn426"] Jan 21 15:36:02 crc kubenswrapper[4707]: W0121 15:36:02.013413 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc880d2b_0a62_4e59_8063_810f798bb321.slice/crio-2ab4ffbd0b422d601a521bbb9b0c4334cf69630f4ca789440e1a1edcb3f90645 WatchSource:0}: Error finding container 2ab4ffbd0b422d601a521bbb9b0c4334cf69630f4ca789440e1a1edcb3f90645: Status 404 returned error can't find the container with id 2ab4ffbd0b422d601a521bbb9b0c4334cf69630f4ca789440e1a1edcb3f90645 Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.022765 4707 generic.go:334] "Generic (PLEG): container finished" podID="a916eb4b-5b4d-47ff-b5db-01802c7d1484" containerID="9d5b0b7f5a582f6c4354f5489be09ac9274c0e007f0788be84a147857f5bd690" exitCode=0 Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.022817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-zjvdt" event={"ID":"a916eb4b-5b4d-47ff-b5db-01802c7d1484","Type":"ContainerDied","Data":"9d5b0b7f5a582f6c4354f5489be09ac9274c0e007f0788be84a147857f5bd690"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.022859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-zjvdt" event={"ID":"a916eb4b-5b4d-47ff-b5db-01802c7d1484","Type":"ContainerStarted","Data":"1594d5652e340f96b064ec26d648f2e621a785b493f5bd11a18bca7de1ac46fc"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.025024 4707 generic.go:334] "Generic (PLEG): container finished" podID="dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" 
containerID="17e70d7489d29ca57860b2d236ba90a62d858bc89f801452c8a3eba7098c9352" exitCode=0 Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.025066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" event={"ID":"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b","Type":"ContainerDied","Data":"17e70d7489d29ca57860b2d236ba90a62d858bc89f801452c8a3eba7098c9352"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.026902 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca523b2c-039d-42f0-8602-10b7fe0d4134" containerID="57c89da5fb9005ec6dcd82ec6ad8a265e4dd85410a9e6a5f1e56c83f0296636c" exitCode=0 Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.026939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-8s54s" event={"ID":"ca523b2c-039d-42f0-8602-10b7fe0d4134","Type":"ContainerDied","Data":"57c89da5fb9005ec6dcd82ec6ad8a265e4dd85410a9e6a5f1e56c83f0296636c"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.026971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-8s54s" event={"ID":"ca523b2c-039d-42f0-8602-10b7fe0d4134","Type":"ContainerStarted","Data":"6a7bccb9dab17eb7e99ff5e42a4c36aa84500b9cc94388a108957da408aa5cee"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.028647 4707 generic.go:334] "Generic (PLEG): container finished" podID="27b6803b-8c90-49cb-860c-f08d6a472f9e" containerID="48ccd1e724ea566ac4ef6f265b3e10eddd03645be766819c72a99208070bd75e" exitCode=0 Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.028719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" event={"ID":"27b6803b-8c90-49cb-860c-f08d6a472f9e","Type":"ContainerDied","Data":"48ccd1e724ea566ac4ef6f265b3e10eddd03645be766819c72a99208070bd75e"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.028749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" event={"ID":"27b6803b-8c90-49cb-860c-f08d6a472f9e","Type":"ContainerStarted","Data":"a32515873dacaa9808ad476bd80a35cb5c3c7f15454839e6bb3f523695e85060"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.030290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-svdgj" event={"ID":"41231705-a331-4eec-8c6f-3bfb45b6db70","Type":"ContainerStarted","Data":"e9c5e95311c111be35df30d16c6a5e7813e650f8ab4dabdc6a40169ffdf21a57"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.030315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-svdgj" event={"ID":"41231705-a331-4eec-8c6f-3bfb45b6db70","Type":"ContainerStarted","Data":"2e2f52ec534703a06cd7327a4cac5353869b27a0944a331eaa45efac0e587985"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.031507 4707 generic.go:334] "Generic (PLEG): container finished" podID="2362c581-1317-4f6d-8967-0bf4dedd8637" containerID="141300207b12cdbdf06baed6cb9e51962fd7f970e7d8fd78eb7b854f6ba5d5a6" exitCode=0 Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.031532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" event={"ID":"2362c581-1317-4f6d-8967-0bf4dedd8637","Type":"ContainerDied","Data":"141300207b12cdbdf06baed6cb9e51962fd7f970e7d8fd78eb7b854f6ba5d5a6"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.031551 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" event={"ID":"2362c581-1317-4f6d-8967-0bf4dedd8637","Type":"ContainerStarted","Data":"664a15cf5554015b1e4c82bfddb6e6da6e53f7de5f10cc4f34199a04d50689be"} Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.050834 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-create-svdgj" podStartSLOduration=2.050820956 podStartE2EDuration="2.050820956s" podCreationTimestamp="2026-01-21 15:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:02.04523833 +0000 UTC m=+2059.226754551" watchObservedRunningTime="2026-01-21 15:36:02.050820956 +0000 UTC m=+2059.232337168" Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.057499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.064759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"swift-storage-0\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.280306 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:36:02 crc kubenswrapper[4707]: I0121 15:36:02.672092 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:36:02 crc kubenswrapper[4707]: E0121 15:36:02.951205 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:56640->192.168.25.165:36655: write tcp 192.168.25.165:56640->192.168.25.165:36655: write: broken pipe Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.040578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.040622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.040631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.040640 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"f21cb72f66e3f8efcba46eb76e5cc5b4b986967d6e0d4b8566a52ef0b0c38a92"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.042180 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="bc880d2b-0a62-4e59-8063-810f798bb321" containerID="90983c11e74f2f1c02a3d7133c07441ce683da95b9d00877cfefa6d7ec35f648" exitCode=0 Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.042261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" event={"ID":"bc880d2b-0a62-4e59-8063-810f798bb321","Type":"ContainerDied","Data":"90983c11e74f2f1c02a3d7133c07441ce683da95b9d00877cfefa6d7ec35f648"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.042322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" event={"ID":"bc880d2b-0a62-4e59-8063-810f798bb321","Type":"ContainerStarted","Data":"2ab4ffbd0b422d601a521bbb9b0c4334cf69630f4ca789440e1a1edcb3f90645"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.044560 4707 generic.go:334] "Generic (PLEG): container finished" podID="41231705-a331-4eec-8c6f-3bfb45b6db70" containerID="e9c5e95311c111be35df30d16c6a5e7813e650f8ab4dabdc6a40169ffdf21a57" exitCode=0 Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.044653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-svdgj" event={"ID":"41231705-a331-4eec-8c6f-3bfb45b6db70","Type":"ContainerDied","Data":"e9c5e95311c111be35df30d16c6a5e7813e650f8ab4dabdc6a40169ffdf21a57"} Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.433383 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.488559 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kfq\" (UniqueName: \"kubernetes.io/projected/a916eb4b-5b4d-47ff-b5db-01802c7d1484-kube-api-access-59kfq\") pod \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.488705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a916eb4b-5b4d-47ff-b5db-01802c7d1484-operator-scripts\") pod \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\" (UID: \"a916eb4b-5b4d-47ff-b5db-01802c7d1484\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.489722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a916eb4b-5b4d-47ff-b5db-01802c7d1484-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a916eb4b-5b4d-47ff-b5db-01802c7d1484" (UID: "a916eb4b-5b4d-47ff-b5db-01802c7d1484"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.500087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a916eb4b-5b4d-47ff-b5db-01802c7d1484-kube-api-access-59kfq" (OuterVolumeSpecName: "kube-api-access-59kfq") pod "a916eb4b-5b4d-47ff-b5db-01802c7d1484" (UID: "a916eb4b-5b4d-47ff-b5db-01802c7d1484"). InnerVolumeSpecName "kube-api-access-59kfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.590439 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kfq\" (UniqueName: \"kubernetes.io/projected/a916eb4b-5b4d-47ff-b5db-01802c7d1484-kube-api-access-59kfq\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.590473 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a916eb4b-5b4d-47ff-b5db-01802c7d1484-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.599873 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.604161 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.609274 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.622189 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.690791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2362c581-1317-4f6d-8967-0bf4dedd8637-operator-scripts\") pod \"2362c581-1317-4f6d-8967-0bf4dedd8637\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn9qs\" (UniqueName: \"kubernetes.io/projected/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-kube-api-access-mn9qs\") pod \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqd7z\" (UniqueName: \"kubernetes.io/projected/2362c581-1317-4f6d-8967-0bf4dedd8637-kube-api-access-gqd7z\") pod \"2362c581-1317-4f6d-8967-0bf4dedd8637\" (UID: \"2362c581-1317-4f6d-8967-0bf4dedd8637\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fcx7\" (UniqueName: \"kubernetes.io/projected/ca523b2c-039d-42f0-8602-10b7fe0d4134-kube-api-access-5fcx7\") pod \"ca523b2c-039d-42f0-8602-10b7fe0d4134\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jczg\" (UniqueName: \"kubernetes.io/projected/27b6803b-8c90-49cb-860c-f08d6a472f9e-kube-api-access-5jczg\") pod \"27b6803b-8c90-49cb-860c-f08d6a472f9e\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-etc-swift\") pod 
\"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-swiftconf\") pod \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-ring-data-devices\") pod \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b6803b-8c90-49cb-860c-f08d6a472f9e-operator-scripts\") pod \"27b6803b-8c90-49cb-860c-f08d6a472f9e\" (UID: \"27b6803b-8c90-49cb-860c-f08d6a472f9e\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-dispersionconf\") pod \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-scripts\") pod \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-combined-ca-bundle\") pod \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\" (UID: \"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.691391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca523b2c-039d-42f0-8602-10b7fe0d4134-operator-scripts\") pod \"ca523b2c-039d-42f0-8602-10b7fe0d4134\" (UID: \"ca523b2c-039d-42f0-8602-10b7fe0d4134\") " Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.692548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2362c581-1317-4f6d-8967-0bf4dedd8637-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2362c581-1317-4f6d-8967-0bf4dedd8637" (UID: "2362c581-1317-4f6d-8967-0bf4dedd8637"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.692829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca523b2c-039d-42f0-8602-10b7fe0d4134-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca523b2c-039d-42f0-8602-10b7fe0d4134" (UID: "ca523b2c-039d-42f0-8602-10b7fe0d4134"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.693209 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.693253 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b6803b-8c90-49cb-860c-f08d6a472f9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27b6803b-8c90-49cb-860c-f08d6a472f9e" (UID: "27b6803b-8c90-49cb-860c-f08d6a472f9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.695276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-kube-api-access-mn9qs" (OuterVolumeSpecName: "kube-api-access-mn9qs") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "kube-api-access-mn9qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.696488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b6803b-8c90-49cb-860c-f08d6a472f9e-kube-api-access-5jczg" (OuterVolumeSpecName: "kube-api-access-5jczg") pod "27b6803b-8c90-49cb-860c-f08d6a472f9e" (UID: "27b6803b-8c90-49cb-860c-f08d6a472f9e"). InnerVolumeSpecName "kube-api-access-5jczg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.696554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.699713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca523b2c-039d-42f0-8602-10b7fe0d4134-kube-api-access-5fcx7" (OuterVolumeSpecName: "kube-api-access-5fcx7") pod "ca523b2c-039d-42f0-8602-10b7fe0d4134" (UID: "ca523b2c-039d-42f0-8602-10b7fe0d4134"). InnerVolumeSpecName "kube-api-access-5fcx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.708852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.710513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2362c581-1317-4f6d-8967-0bf4dedd8637-kube-api-access-gqd7z" (OuterVolumeSpecName: "kube-api-access-gqd7z") pod "2362c581-1317-4f6d-8967-0bf4dedd8637" (UID: "2362c581-1317-4f6d-8967-0bf4dedd8637"). InnerVolumeSpecName "kube-api-access-gqd7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.717945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-scripts" (OuterVolumeSpecName: "scripts") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.718139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.719706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" (UID: "dbdd5bf5-6e0f-4577-ac4f-960a74e6679b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793777 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jczg\" (UniqueName: \"kubernetes.io/projected/27b6803b-8c90-49cb-860c-f08d6a472f9e-kube-api-access-5jczg\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793844 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793856 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793865 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793873 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b6803b-8c90-49cb-860c-f08d6a472f9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793880 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793887 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793896 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793904 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca523b2c-039d-42f0-8602-10b7fe0d4134-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793928 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2362c581-1317-4f6d-8967-0bf4dedd8637-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793936 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn9qs\" (UniqueName: \"kubernetes.io/projected/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b-kube-api-access-mn9qs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793946 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqd7z\" (UniqueName: \"kubernetes.io/projected/2362c581-1317-4f6d-8967-0bf4dedd8637-kube-api-access-gqd7z\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:03 crc kubenswrapper[4707]: I0121 15:36:03.793954 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fcx7\" (UniqueName: \"kubernetes.io/projected/ca523b2c-039d-42f0-8602-10b7fe0d4134-kube-api-access-5fcx7\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.053511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" event={"ID":"27b6803b-8c90-49cb-860c-f08d6a472f9e","Type":"ContainerDied","Data":"a32515873dacaa9808ad476bd80a35cb5c3c7f15454839e6bb3f523695e85060"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.053542 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.053554 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a32515873dacaa9808ad476bd80a35cb5c3c7f15454839e6bb3f523695e85060" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.055899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" event={"ID":"2362c581-1317-4f6d-8967-0bf4dedd8637","Type":"ContainerDied","Data":"664a15cf5554015b1e4c82bfddb6e6da6e53f7de5f10cc4f34199a04d50689be"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.055935 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664a15cf5554015b1e4c82bfddb6e6da6e53f7de5f10cc4f34199a04d50689be" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.056000 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.060065 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zjvdt" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.060057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-zjvdt" event={"ID":"a916eb4b-5b4d-47ff-b5db-01802c7d1484","Type":"ContainerDied","Data":"1594d5652e340f96b064ec26d648f2e621a785b493f5bd11a18bca7de1ac46fc"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.060108 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1594d5652e340f96b064ec26d648f2e621a785b493f5bd11a18bca7de1ac46fc" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.064831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.066120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" event={"ID":"dbdd5bf5-6e0f-4577-ac4f-960a74e6679b","Type":"ContainerDied","Data":"f08bf1ef24e21baf248c0a3432c206bb5b619e7a6b6a96c14b42a6d26f665c49"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.066154 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f08bf1ef24e21baf248c0a3432c206bb5b619e7a6b6a96c14b42a6d26f665c49" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.066135 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6jqxb" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.067798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-8s54s" event={"ID":"ca523b2c-039d-42f0-8602-10b7fe0d4134","Type":"ContainerDied","Data":"6a7bccb9dab17eb7e99ff5e42a4c36aa84500b9cc94388a108957da408aa5cee"} Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.067926 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7bccb9dab17eb7e99ff5e42a4c36aa84500b9cc94388a108957da408aa5cee" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.067874 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8s54s" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.168102 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4hf7c"] Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.174269 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4hf7c"] Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.488194 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.512267 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.616655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41231705-a331-4eec-8c6f-3bfb45b6db70-operator-scripts\") pod \"41231705-a331-4eec-8c6f-3bfb45b6db70\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.616708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc880d2b-0a62-4e59-8063-810f798bb321-operator-scripts\") pod \"bc880d2b-0a62-4e59-8063-810f798bb321\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.616863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x58hs\" (UniqueName: \"kubernetes.io/projected/41231705-a331-4eec-8c6f-3bfb45b6db70-kube-api-access-x58hs\") pod \"41231705-a331-4eec-8c6f-3bfb45b6db70\" (UID: \"41231705-a331-4eec-8c6f-3bfb45b6db70\") " Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.616995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w76dg\" (UniqueName: \"kubernetes.io/projected/bc880d2b-0a62-4e59-8063-810f798bb321-kube-api-access-w76dg\") pod \"bc880d2b-0a62-4e59-8063-810f798bb321\" (UID: \"bc880d2b-0a62-4e59-8063-810f798bb321\") " Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.618334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc880d2b-0a62-4e59-8063-810f798bb321-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc880d2b-0a62-4e59-8063-810f798bb321" (UID: "bc880d2b-0a62-4e59-8063-810f798bb321"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.618364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41231705-a331-4eec-8c6f-3bfb45b6db70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41231705-a331-4eec-8c6f-3bfb45b6db70" (UID: "41231705-a331-4eec-8c6f-3bfb45b6db70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.623082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc880d2b-0a62-4e59-8063-810f798bb321-kube-api-access-w76dg" (OuterVolumeSpecName: "kube-api-access-w76dg") pod "bc880d2b-0a62-4e59-8063-810f798bb321" (UID: "bc880d2b-0a62-4e59-8063-810f798bb321"). InnerVolumeSpecName "kube-api-access-w76dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.626996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41231705-a331-4eec-8c6f-3bfb45b6db70-kube-api-access-x58hs" (OuterVolumeSpecName: "kube-api-access-x58hs") pod "41231705-a331-4eec-8c6f-3bfb45b6db70" (UID: "41231705-a331-4eec-8c6f-3bfb45b6db70"). InnerVolumeSpecName "kube-api-access-x58hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.718276 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58hs\" (UniqueName: \"kubernetes.io/projected/41231705-a331-4eec-8c6f-3bfb45b6db70-kube-api-access-x58hs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.718583 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w76dg\" (UniqueName: \"kubernetes.io/projected/bc880d2b-0a62-4e59-8063-810f798bb321-kube-api-access-w76dg\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.718596 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41231705-a331-4eec-8c6f-3bfb45b6db70-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:04 crc kubenswrapper[4707]: I0121 15:36:04.718609 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc880d2b-0a62-4e59-8063-810f798bb321-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.078245 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-svdgj" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.078281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-svdgj" event={"ID":"41231705-a331-4eec-8c6f-3bfb45b6db70","Type":"ContainerDied","Data":"2e2f52ec534703a06cd7327a4cac5353869b27a0944a331eaa45efac0e587985"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.078316 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2f52ec534703a06cd7327a4cac5353869b27a0944a331eaa45efac0e587985" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.083078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.083123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.083135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.083144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.083153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerStarted","Data":"ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.084671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" event={"ID":"bc880d2b-0a62-4e59-8063-810f798bb321","Type":"ContainerDied","Data":"2ab4ffbd0b422d601a521bbb9b0c4334cf69630f4ca789440e1a1edcb3f90645"} Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.084718 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab4ffbd0b422d601a521bbb9b0c4334cf69630f4ca789440e1a1edcb3f90645" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.084737 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4c4c-account-create-update-gn426" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.119069 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=12.119051664 podStartE2EDuration="12.119051664s" podCreationTimestamp="2026-01-21 15:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:05.116207395 +0000 UTC m=+2062.297723617" watchObservedRunningTime="2026-01-21 15:36:05.119051664 +0000 UTC m=+2062.300567915" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.195488 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c80d2f2-2f59-4b71-96fe-e57d442c2789" path="/var/lib/kubelet/pods/2c80d2f2-2f59-4b71-96fe-e57d442c2789/volumes" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220180 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf"] Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2362c581-1317-4f6d-8967-0bf4dedd8637" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220525 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2362c581-1317-4f6d-8967-0bf4dedd8637" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220542 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca523b2c-039d-42f0-8602-10b7fe0d4134" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220548 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca523b2c-039d-42f0-8602-10b7fe0d4134" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220563 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc880d2b-0a62-4e59-8063-810f798bb321" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220571 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc880d2b-0a62-4e59-8063-810f798bb321" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41231705-a331-4eec-8c6f-3bfb45b6db70" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220583 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="41231705-a331-4eec-8c6f-3bfb45b6db70" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" containerName="swift-ring-rebalance" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220600 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" containerName="swift-ring-rebalance" Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220613 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a916eb4b-5b4d-47ff-b5db-01802c7d1484" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220618 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a916eb4b-5b4d-47ff-b5db-01802c7d1484" 
containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: E0121 15:36:05.220627 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b6803b-8c90-49cb-860c-f08d6a472f9e" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220632 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b6803b-8c90-49cb-860c-f08d6a472f9e" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220774 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="41231705-a331-4eec-8c6f-3bfb45b6db70" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc880d2b-0a62-4e59-8063-810f798bb321" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2362c581-1317-4f6d-8967-0bf4dedd8637" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220835 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a916eb4b-5b4d-47ff-b5db-01802c7d1484" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220850 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" containerName="swift-ring-rebalance" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220863 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b6803b-8c90-49cb-860c-f08d6a472f9e" containerName="mariadb-account-create-update" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.220872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca523b2c-039d-42f0-8602-10b7fe0d4134" containerName="mariadb-database-create" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.221655 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.223759 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.234870 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf"] Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.250326 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.305048 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.329633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-config\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.329696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.329874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448pb\" (UniqueName: \"kubernetes.io/projected/e007cb2f-fee4-4c18-9baf-9e61af91254c-kube-api-access-448pb\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.330025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.364652 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.432194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-config\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.433076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-config\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 
15:36:05.433241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.433291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448pb\" (UniqueName: \"kubernetes.io/projected/e007cb2f-fee4-4c18-9baf-9e61af91254c-kube-api-access-448pb\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.433369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.434005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.434550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.447134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448pb\" (UniqueName: \"kubernetes.io/projected/e007cb2f-fee4-4c18-9baf-9e61af91254c-kube-api-access-448pb\") pod \"dnsmasq-dnsmasq-64ddcf65d5-pc4rf\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.534276 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.745757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.806098 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:36:05 crc kubenswrapper[4707]: I0121 15:36:05.899141 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf"] Jan 21 15:36:05 crc kubenswrapper[4707]: W0121 15:36:05.902037 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode007cb2f_fee4_4c18_9baf_9e61af91254c.slice/crio-c2621c8143f9a172c2a3d23695790adfd92c6bc5da1849ad78cd00b550697a26 WatchSource:0}: Error finding container c2621c8143f9a172c2a3d23695790adfd92c6bc5da1849ad78cd00b550697a26: Status 404 returned error can't find the container with id c2621c8143f9a172c2a3d23695790adfd92c6bc5da1849ad78cd00b550697a26 Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.092865 4707 generic.go:334] "Generic (PLEG): container finished" podID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerID="aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522" exitCode=0 Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.092915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" event={"ID":"e007cb2f-fee4-4c18-9baf-9e61af91254c","Type":"ContainerDied","Data":"aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522"} Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.093102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" event={"ID":"e007cb2f-fee4-4c18-9baf-9e61af91254c","Type":"ContainerStarted","Data":"c2621c8143f9a172c2a3d23695790adfd92c6bc5da1849ad78cd00b550697a26"} Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.249734 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-pcvgx"] Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.250652 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.252096 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-7x9kc" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.252335 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.261444 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-pcvgx"] Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.453719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-222h9\" (UniqueName: \"kubernetes.io/projected/4b53d780-b8f6-4755-a954-db9895f87427-kube-api-access-222h9\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.454133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-combined-ca-bundle\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.454306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-config-data\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.454410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-db-sync-config-data\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.556112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-db-sync-config-data\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.556167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-222h9\" (UniqueName: \"kubernetes.io/projected/4b53d780-b8f6-4755-a954-db9895f87427-kube-api-access-222h9\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.556242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-combined-ca-bundle\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.556295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-config-data\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.560651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-config-data\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.560785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-db-sync-config-data\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.561573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-combined-ca-bundle\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.569689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-222h9\" (UniqueName: \"kubernetes.io/projected/4b53d780-b8f6-4755-a954-db9895f87427-kube-api-access-222h9\") pod \"glance-db-sync-pcvgx\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.604201 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:06 crc kubenswrapper[4707]: I0121 15:36:06.967974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-pcvgx"] Jan 21 15:36:06 crc kubenswrapper[4707]: W0121 15:36:06.972339 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b53d780_b8f6_4755_a954_db9895f87427.slice/crio-0613527a5cf004d9509da9bbe84c94039dc9e4cc256d6b1067aaadea9eff7af3 WatchSource:0}: Error finding container 0613527a5cf004d9509da9bbe84c94039dc9e4cc256d6b1067aaadea9eff7af3: Status 404 returned error can't find the container with id 0613527a5cf004d9509da9bbe84c94039dc9e4cc256d6b1067aaadea9eff7af3 Jan 21 15:36:07 crc kubenswrapper[4707]: I0121 15:36:07.102880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" event={"ID":"4b53d780-b8f6-4755-a954-db9895f87427","Type":"ContainerStarted","Data":"0613527a5cf004d9509da9bbe84c94039dc9e4cc256d6b1067aaadea9eff7af3"} Jan 21 15:36:07 crc kubenswrapper[4707]: I0121 15:36:07.105156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" event={"ID":"e007cb2f-fee4-4c18-9baf-9e61af91254c","Type":"ContainerStarted","Data":"b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd"} Jan 21 15:36:07 crc kubenswrapper[4707]: I0121 15:36:07.105282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:07 crc kubenswrapper[4707]: I0121 15:36:07.123003 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" podStartSLOduration=2.122988819 podStartE2EDuration="2.122988819s" podCreationTimestamp="2026-01-21 15:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:07.116919737 +0000 UTC m=+2064.298435959" watchObservedRunningTime="2026-01-21 15:36:07.122988819 +0000 UTC m=+2064.304505041" Jan 21 15:36:08 crc kubenswrapper[4707]: I0121 15:36:08.115711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" event={"ID":"4b53d780-b8f6-4755-a954-db9895f87427","Type":"ContainerStarted","Data":"9ab835caae749435aeadf6ada86120577bb2f30d5a8addc17705da18c542955c"} Jan 21 15:36:08 crc kubenswrapper[4707]: I0121 15:36:08.128155 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" podStartSLOduration=2.128137552 podStartE2EDuration="2.128137552s" podCreationTimestamp="2026-01-21 15:36:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:08.126981839 +0000 UTC m=+2065.308498060" watchObservedRunningTime="2026-01-21 15:36:08.128137552 +0000 UTC m=+2065.309653774" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.173109 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8tr5g"] Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.174306 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.177405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.182228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8tr5g"] Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.202317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8463ac2d-d1cb-4505-adc1-e5019f545c85-operator-scripts\") pod \"root-account-create-update-8tr5g\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.202355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdrh\" (UniqueName: \"kubernetes.io/projected/8463ac2d-d1cb-4505-adc1-e5019f545c85-kube-api-access-wrdrh\") pod \"root-account-create-update-8tr5g\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.303941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8463ac2d-d1cb-4505-adc1-e5019f545c85-operator-scripts\") pod \"root-account-create-update-8tr5g\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.303983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdrh\" (UniqueName: \"kubernetes.io/projected/8463ac2d-d1cb-4505-adc1-e5019f545c85-kube-api-access-wrdrh\") pod \"root-account-create-update-8tr5g\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.304579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8463ac2d-d1cb-4505-adc1-e5019f545c85-operator-scripts\") pod \"root-account-create-update-8tr5g\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.320364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdrh\" (UniqueName: \"kubernetes.io/projected/8463ac2d-d1cb-4505-adc1-e5019f545c85-kube-api-access-wrdrh\") pod \"root-account-create-update-8tr5g\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.502754 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:09 crc kubenswrapper[4707]: I0121 15:36:09.896095 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8tr5g"] Jan 21 15:36:09 crc kubenswrapper[4707]: W0121 15:36:09.899676 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8463ac2d_d1cb_4505_adc1_e5019f545c85.slice/crio-de6be950719d7274322f71fb7e961670a51e0c6fae9595551a4dc6471d300012 WatchSource:0}: Error finding container de6be950719d7274322f71fb7e961670a51e0c6fae9595551a4dc6471d300012: Status 404 returned error can't find the container with id de6be950719d7274322f71fb7e961670a51e0c6fae9595551a4dc6471d300012 Jan 21 15:36:10 crc kubenswrapper[4707]: I0121 15:36:10.132119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" event={"ID":"8463ac2d-d1cb-4505-adc1-e5019f545c85","Type":"ContainerStarted","Data":"30c8e6768c731bc44e17cb82c6d9a3a43f549efa140290faacd619bcb41a9383"} Jan 21 15:36:10 crc kubenswrapper[4707]: I0121 15:36:10.132164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" event={"ID":"8463ac2d-d1cb-4505-adc1-e5019f545c85","Type":"ContainerStarted","Data":"de6be950719d7274322f71fb7e961670a51e0c6fae9595551a4dc6471d300012"} Jan 21 15:36:10 crc kubenswrapper[4707]: I0121 15:36:10.154326 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" podStartSLOduration=1.15431266 podStartE2EDuration="1.15431266s" podCreationTimestamp="2026-01-21 15:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:10.145075342 +0000 UTC m=+2067.326591564" watchObservedRunningTime="2026-01-21 15:36:10.15431266 +0000 UTC m=+2067.335828883" Jan 21 15:36:10 crc kubenswrapper[4707]: I0121 15:36:10.427547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:36:10 crc kubenswrapper[4707]: I0121 15:36:10.675332 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.140123 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e2084ed-217a-462c-8219-6cfa022751d9" containerID="6a798f965e2ede79a50452c9b9cf47838b4c424ce11b6965be9499ba22782a08" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.140191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"8e2084ed-217a-462c-8219-6cfa022751d9","Type":"ContainerDied","Data":"6a798f965e2ede79a50452c9b9cf47838b4c424ce11b6965be9499ba22782a08"} Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.142903 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerID="4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.142947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" 
event={"ID":"4c346ce8-cb7b-46a0-abda-84bc7a42f009","Type":"ContainerDied","Data":"4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3"} Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.144971 4707 generic.go:334] "Generic (PLEG): container finished" podID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerID="6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.145087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"c641d259-efc1-4897-94d0-bf9d429f3c5d","Type":"ContainerDied","Data":"6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e"} Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.147630 4707 generic.go:334] "Generic (PLEG): container finished" podID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerID="4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.147900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"15b560a7-6184-4897-af4f-7975ebd19fcb","Type":"ContainerDied","Data":"4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069"} Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.149797 4707 generic.go:334] "Generic (PLEG): container finished" podID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerID="158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.149863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"1977b33b-45bb-4b7d-b7f7-ebfd42572edb","Type":"ContainerDied","Data":"158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816"} Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.152275 4707 generic.go:334] "Generic (PLEG): container finished" podID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerID="26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.152425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"702acb12-2ee0-4008-bf50-0fda9fc79030","Type":"ContainerDied","Data":"26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45"} Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.156170 4707 generic.go:334] "Generic (PLEG): container finished" podID="8463ac2d-d1cb-4505-adc1-e5019f545c85" containerID="30c8e6768c731bc44e17cb82c6d9a3a43f549efa140290faacd619bcb41a9383" exitCode=0 Jan 21 15:36:11 crc kubenswrapper[4707]: I0121 15:36:11.156256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" event={"ID":"8463ac2d-d1cb-4505-adc1-e5019f545c85","Type":"ContainerDied","Data":"30c8e6768c731bc44e17cb82c6d9a3a43f549efa140290faacd619bcb41a9383"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.163713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"702acb12-2ee0-4008-bf50-0fda9fc79030","Type":"ContainerStarted","Data":"e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.164859 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:36:12 crc 
kubenswrapper[4707]: I0121 15:36:12.166327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"8e2084ed-217a-462c-8219-6cfa022751d9","Type":"ContainerStarted","Data":"77bb156bdfd2d77e45072aa690e4f0b81d8bcd71d4f1156e3520806d9729c421"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.166767 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.168055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4c346ce8-cb7b-46a0-abda-84bc7a42f009","Type":"ContainerStarted","Data":"da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.168467 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.171671 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b53d780-b8f6-4755-a954-db9895f87427" containerID="9ab835caae749435aeadf6ada86120577bb2f30d5a8addc17705da18c542955c" exitCode=0 Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.171721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" event={"ID":"4b53d780-b8f6-4755-a954-db9895f87427","Type":"ContainerDied","Data":"9ab835caae749435aeadf6ada86120577bb2f30d5a8addc17705da18c542955c"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.177374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"c641d259-efc1-4897-94d0-bf9d429f3c5d","Type":"ContainerStarted","Data":"6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.177747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.186726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"15b560a7-6184-4897-af4f-7975ebd19fcb","Type":"ContainerStarted","Data":"c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.186865 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.190054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"1977b33b-45bb-4b7d-b7f7-ebfd42572edb","Type":"ContainerStarted","Data":"22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311"} Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.190205 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.198717 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.198706415 podStartE2EDuration="36.198706415s" podCreationTimestamp="2026-01-21 15:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:12.189410086 +0000 UTC m=+2069.370926308" 
watchObservedRunningTime="2026-01-21 15:36:12.198706415 +0000 UTC m=+2069.380222637" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.236866 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-2" podStartSLOduration=37.236852651 podStartE2EDuration="37.236852651s" podCreationTimestamp="2026-01-21 15:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:12.231307855 +0000 UTC m=+2069.412824077" watchObservedRunningTime="2026-01-21 15:36:12.236852651 +0000 UTC m=+2069.418368873" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.256152 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-1" podStartSLOduration=37.256135757 podStartE2EDuration="37.256135757s" podCreationTimestamp="2026-01-21 15:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:12.252198264 +0000 UTC m=+2069.433714487" watchObservedRunningTime="2026-01-21 15:36:12.256135757 +0000 UTC m=+2069.437651980" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.295192 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.295177366 podStartE2EDuration="37.295177366s" podCreationTimestamp="2026-01-21 15:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:12.290836655 +0000 UTC m=+2069.472352877" watchObservedRunningTime="2026-01-21 15:36:12.295177366 +0000 UTC m=+2069.476693588" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.310086 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" podStartSLOduration=36.310062745 podStartE2EDuration="36.310062745s" podCreationTimestamp="2026-01-21 15:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:12.30739065 +0000 UTC m=+2069.488906872" watchObservedRunningTime="2026-01-21 15:36:12.310062745 +0000 UTC m=+2069.491578967" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.328084 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podStartSLOduration=36.328070314 podStartE2EDuration="36.328070314s" podCreationTimestamp="2026-01-21 15:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:12.324423326 +0000 UTC m=+2069.505939548" watchObservedRunningTime="2026-01-21 15:36:12.328070314 +0000 UTC m=+2069.509586536" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.564765 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.669315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8463ac2d-d1cb-4505-adc1-e5019f545c85-operator-scripts\") pod \"8463ac2d-d1cb-4505-adc1-e5019f545c85\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.669461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrdrh\" (UniqueName: \"kubernetes.io/projected/8463ac2d-d1cb-4505-adc1-e5019f545c85-kube-api-access-wrdrh\") pod \"8463ac2d-d1cb-4505-adc1-e5019f545c85\" (UID: \"8463ac2d-d1cb-4505-adc1-e5019f545c85\") " Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.670843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8463ac2d-d1cb-4505-adc1-e5019f545c85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8463ac2d-d1cb-4505-adc1-e5019f545c85" (UID: "8463ac2d-d1cb-4505-adc1-e5019f545c85"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.686949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8463ac2d-d1cb-4505-adc1-e5019f545c85-kube-api-access-wrdrh" (OuterVolumeSpecName: "kube-api-access-wrdrh") pod "8463ac2d-d1cb-4505-adc1-e5019f545c85" (UID: "8463ac2d-d1cb-4505-adc1-e5019f545c85"). InnerVolumeSpecName "kube-api-access-wrdrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.771250 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrdrh\" (UniqueName: \"kubernetes.io/projected/8463ac2d-d1cb-4505-adc1-e5019f545c85-kube-api-access-wrdrh\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:12 crc kubenswrapper[4707]: I0121 15:36:12.771467 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8463ac2d-d1cb-4505-adc1-e5019f545c85-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.196419 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.196449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8tr5g" event={"ID":"8463ac2d-d1cb-4505-adc1-e5019f545c85","Type":"ContainerDied","Data":"de6be950719d7274322f71fb7e961670a51e0c6fae9595551a4dc6471d300012"} Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.196483 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6be950719d7274322f71fb7e961670a51e0c6fae9595551a4dc6471d300012" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.498384 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.583255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-config-data\") pod \"4b53d780-b8f6-4755-a954-db9895f87427\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.583293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-db-sync-config-data\") pod \"4b53d780-b8f6-4755-a954-db9895f87427\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.583375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-222h9\" (UniqueName: \"kubernetes.io/projected/4b53d780-b8f6-4755-a954-db9895f87427-kube-api-access-222h9\") pod \"4b53d780-b8f6-4755-a954-db9895f87427\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.583406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-combined-ca-bundle\") pod \"4b53d780-b8f6-4755-a954-db9895f87427\" (UID: \"4b53d780-b8f6-4755-a954-db9895f87427\") " Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.589070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b53d780-b8f6-4755-a954-db9895f87427-kube-api-access-222h9" (OuterVolumeSpecName: "kube-api-access-222h9") pod "4b53d780-b8f6-4755-a954-db9895f87427" (UID: "4b53d780-b8f6-4755-a954-db9895f87427"). InnerVolumeSpecName "kube-api-access-222h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.602971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4b53d780-b8f6-4755-a954-db9895f87427" (UID: "4b53d780-b8f6-4755-a954-db9895f87427"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.621015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-config-data" (OuterVolumeSpecName: "config-data") pod "4b53d780-b8f6-4755-a954-db9895f87427" (UID: "4b53d780-b8f6-4755-a954-db9895f87427"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.621433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b53d780-b8f6-4755-a954-db9895f87427" (UID: "4b53d780-b8f6-4755-a954-db9895f87427"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.685009 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-222h9\" (UniqueName: \"kubernetes.io/projected/4b53d780-b8f6-4755-a954-db9895f87427-kube-api-access-222h9\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.685044 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.685053 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:13 crc kubenswrapper[4707]: I0121 15:36:13.685064 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b53d780-b8f6-4755-a954-db9895f87427-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:14 crc kubenswrapper[4707]: I0121 15:36:14.204928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" event={"ID":"4b53d780-b8f6-4755-a954-db9895f87427","Type":"ContainerDied","Data":"0613527a5cf004d9509da9bbe84c94039dc9e4cc256d6b1067aaadea9eff7af3"} Jan 21 15:36:14 crc kubenswrapper[4707]: I0121 15:36:14.205241 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0613527a5cf004d9509da9bbe84c94039dc9e4cc256d6b1067aaadea9eff7af3" Jan 21 15:36:14 crc kubenswrapper[4707]: I0121 15:36:14.205306 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-pcvgx" Jan 21 15:36:15 crc kubenswrapper[4707]: I0121 15:36:15.535968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:36:15 crc kubenswrapper[4707]: I0121 15:36:15.574912 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6"] Jan 21 15:36:15 crc kubenswrapper[4707]: I0121 15:36:15.575111 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" podUID="8238358f-4625-4642-84db-0834ff86d43b" containerName="dnsmasq-dns" containerID="cri-o://9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d" gracePeriod=10 Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.014253 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.018405 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4bxtj"] Jan 21 15:36:16 crc kubenswrapper[4707]: E0121 15:36:16.018734 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b53d780-b8f6-4755-a954-db9895f87427" containerName="glance-db-sync" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.018753 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b53d780-b8f6-4755-a954-db9895f87427" containerName="glance-db-sync" Jan 21 15:36:16 crc kubenswrapper[4707]: E0121 15:36:16.018778 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8238358f-4625-4642-84db-0834ff86d43b" containerName="dnsmasq-dns" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.018784 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8238358f-4625-4642-84db-0834ff86d43b" containerName="dnsmasq-dns" Jan 21 15:36:16 crc kubenswrapper[4707]: E0121 15:36:16.018796 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8463ac2d-d1cb-4505-adc1-e5019f545c85" containerName="mariadb-account-create-update" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.018801 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8463ac2d-d1cb-4505-adc1-e5019f545c85" containerName="mariadb-account-create-update" Jan 21 15:36:16 crc kubenswrapper[4707]: E0121 15:36:16.018828 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8238358f-4625-4642-84db-0834ff86d43b" containerName="init" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.018835 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8238358f-4625-4642-84db-0834ff86d43b" containerName="init" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.018994 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8463ac2d-d1cb-4505-adc1-e5019f545c85" containerName="mariadb-account-create-update" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.019009 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b53d780-b8f6-4755-a954-db9895f87427" containerName="glance-db-sync" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.019022 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8238358f-4625-4642-84db-0834ff86d43b" containerName="dnsmasq-dns" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.020075 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.033056 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4bxtj"] Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.127163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-dnsmasq-svc\") pod \"8238358f-4625-4642-84db-0834ff86d43b\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.127308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqbnz\" (UniqueName: \"kubernetes.io/projected/8238358f-4625-4642-84db-0834ff86d43b-kube-api-access-sqbnz\") pod \"8238358f-4625-4642-84db-0834ff86d43b\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.127334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-config\") pod \"8238358f-4625-4642-84db-0834ff86d43b\" (UID: \"8238358f-4625-4642-84db-0834ff86d43b\") " Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.128197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-catalog-content\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.128266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-utilities\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.128302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d29l5\" (UniqueName: \"kubernetes.io/projected/a02f7172-7384-42cd-9214-8b002cf145ed-kube-api-access-d29l5\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.132005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8238358f-4625-4642-84db-0834ff86d43b-kube-api-access-sqbnz" (OuterVolumeSpecName: "kube-api-access-sqbnz") pod "8238358f-4625-4642-84db-0834ff86d43b" (UID: "8238358f-4625-4642-84db-0834ff86d43b"). InnerVolumeSpecName "kube-api-access-sqbnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.165851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "8238358f-4625-4642-84db-0834ff86d43b" (UID: "8238358f-4625-4642-84db-0834ff86d43b"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.173243 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-config" (OuterVolumeSpecName: "config") pod "8238358f-4625-4642-84db-0834ff86d43b" (UID: "8238358f-4625-4642-84db-0834ff86d43b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.219791 4707 generic.go:334] "Generic (PLEG): container finished" podID="8238358f-4625-4642-84db-0834ff86d43b" containerID="9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d" exitCode=0 Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.219842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" event={"ID":"8238358f-4625-4642-84db-0834ff86d43b","Type":"ContainerDied","Data":"9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d"} Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.219888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" event={"ID":"8238358f-4625-4642-84db-0834ff86d43b","Type":"ContainerDied","Data":"e662f7d8b0f422cd8379b91df72a3d5203e1ed6e7ae27fac775dd1e8df730699"} Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.219907 4707 scope.go:117] "RemoveContainer" containerID="9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.220044 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.229329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-catalog-content\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.229400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-utilities\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.229427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d29l5\" (UniqueName: \"kubernetes.io/projected/a02f7172-7384-42cd-9214-8b002cf145ed-kube-api-access-d29l5\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.229676 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqbnz\" (UniqueName: \"kubernetes.io/projected/8238358f-4625-4642-84db-0834ff86d43b-kube-api-access-sqbnz\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.229688 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 
15:36:16.229697 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8238358f-4625-4642-84db-0834ff86d43b-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.230256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-catalog-content\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.230654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-utilities\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.247759 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6"] Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.248046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d29l5\" (UniqueName: \"kubernetes.io/projected/a02f7172-7384-42cd-9214-8b002cf145ed-kube-api-access-d29l5\") pod \"community-operators-4bxtj\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.250041 4707 scope.go:117] "RemoveContainer" containerID="74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.257243 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9l6r6"] Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.269947 4707 scope.go:117] "RemoveContainer" containerID="9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d" Jan 21 15:36:16 crc kubenswrapper[4707]: E0121 15:36:16.270281 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d\": container with ID starting with 9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d not found: ID does not exist" containerID="9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.270318 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d"} err="failed to get container status \"9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d\": rpc error: code = NotFound desc = could not find container \"9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d\": container with ID starting with 9547ab56c5d9f39562dd9517065a7ceb10771ffa5bfa3c1ce958ee1fee07a12d not found: ID does not exist" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.270343 4707 scope.go:117] "RemoveContainer" containerID="74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e" Jan 21 15:36:16 crc kubenswrapper[4707]: E0121 15:36:16.270599 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e\": container with ID starting with 74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e not found: ID does not exist" containerID="74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.270619 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e"} err="failed to get container status \"74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e\": rpc error: code = NotFound desc = could not find container \"74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e\": container with ID starting with 74646454393da5492a33acc7d4b57bc83bb678e0106271e632163c4389c6eb5e not found: ID does not exist" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.332148 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:16 crc kubenswrapper[4707]: I0121 15:36:16.842260 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4bxtj"] Jan 21 15:36:16 crc kubenswrapper[4707]: W0121 15:36:16.845249 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda02f7172_7384_42cd_9214_8b002cf145ed.slice/crio-c9a0b4c4b2d328acfc8dbf98085726b3d08fcc43606a37757f2904fc601dae4a WatchSource:0}: Error finding container c9a0b4c4b2d328acfc8dbf98085726b3d08fcc43606a37757f2904fc601dae4a: Status 404 returned error can't find the container with id c9a0b4c4b2d328acfc8dbf98085726b3d08fcc43606a37757f2904fc601dae4a Jan 21 15:36:17 crc kubenswrapper[4707]: I0121 15:36:17.190171 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8238358f-4625-4642-84db-0834ff86d43b" path="/var/lib/kubelet/pods/8238358f-4625-4642-84db-0834ff86d43b/volumes" Jan 21 15:36:17 crc kubenswrapper[4707]: I0121 15:36:17.227329 4707 generic.go:334] "Generic (PLEG): container finished" podID="a02f7172-7384-42cd-9214-8b002cf145ed" containerID="bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c" exitCode=0 Jan 21 15:36:17 crc kubenswrapper[4707]: I0121 15:36:17.227395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerDied","Data":"bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c"} Jan 21 15:36:17 crc kubenswrapper[4707]: I0121 15:36:17.227422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerStarted","Data":"c9a0b4c4b2d328acfc8dbf98085726b3d08fcc43606a37757f2904fc601dae4a"} Jan 21 15:36:18 crc kubenswrapper[4707]: I0121 15:36:18.236233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerStarted","Data":"f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f"} Jan 21 15:36:19 crc kubenswrapper[4707]: I0121 15:36:19.254070 4707 generic.go:334] "Generic (PLEG): container finished" podID="a02f7172-7384-42cd-9214-8b002cf145ed" containerID="f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f" exitCode=0 Jan 21 15:36:19 crc kubenswrapper[4707]: I0121 
15:36:19.254108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerDied","Data":"f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f"} Jan 21 15:36:20 crc kubenswrapper[4707]: I0121 15:36:20.262898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerStarted","Data":"88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd"} Jan 21 15:36:20 crc kubenswrapper[4707]: I0121 15:36:20.282371 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4bxtj" podStartSLOduration=2.787049141 podStartE2EDuration="5.282357313s" podCreationTimestamp="2026-01-21 15:36:15 +0000 UTC" firstStartedPulling="2026-01-21 15:36:17.228349681 +0000 UTC m=+2074.409865893" lastFinishedPulling="2026-01-21 15:36:19.723657843 +0000 UTC m=+2076.905174065" observedRunningTime="2026-01-21 15:36:20.275788803 +0000 UTC m=+2077.457305025" watchObservedRunningTime="2026-01-21 15:36:20.282357313 +0000 UTC m=+2077.463873536" Jan 21 15:36:26 crc kubenswrapper[4707]: I0121 15:36:26.332319 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:26 crc kubenswrapper[4707]: I0121 15:36:26.332678 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:26 crc kubenswrapper[4707]: I0121 15:36:26.364411 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.253064 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.270290 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-2" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.277438 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-1" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.404231 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.439167 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4bxtj"] Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.474616 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.481945 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:36:27 crc kubenswrapper[4707]: I0121 15:36:27.492927 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.336822 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4bxtj" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="registry-server" containerID="cri-o://88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd" gracePeriod=2 Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.730871 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.838801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-utilities\") pod \"a02f7172-7384-42cd-9214-8b002cf145ed\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.838881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d29l5\" (UniqueName: \"kubernetes.io/projected/a02f7172-7384-42cd-9214-8b002cf145ed-kube-api-access-d29l5\") pod \"a02f7172-7384-42cd-9214-8b002cf145ed\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.838967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-catalog-content\") pod \"a02f7172-7384-42cd-9214-8b002cf145ed\" (UID: \"a02f7172-7384-42cd-9214-8b002cf145ed\") " Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.839472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-utilities" (OuterVolumeSpecName: "utilities") pod "a02f7172-7384-42cd-9214-8b002cf145ed" (UID: "a02f7172-7384-42cd-9214-8b002cf145ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.850482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02f7172-7384-42cd-9214-8b002cf145ed-kube-api-access-d29l5" (OuterVolumeSpecName: "kube-api-access-d29l5") pod "a02f7172-7384-42cd-9214-8b002cf145ed" (UID: "a02f7172-7384-42cd-9214-8b002cf145ed"). InnerVolumeSpecName "kube-api-access-d29l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.874584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a02f7172-7384-42cd-9214-8b002cf145ed" (UID: "a02f7172-7384-42cd-9214-8b002cf145ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.940485 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d29l5\" (UniqueName: \"kubernetes.io/projected/a02f7172-7384-42cd-9214-8b002cf145ed-kube-api-access-d29l5\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.940515 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:29 crc kubenswrapper[4707]: I0121 15:36:29.940525 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a02f7172-7384-42cd-9214-8b002cf145ed-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.345240 4707 generic.go:334] "Generic (PLEG): container finished" podID="a02f7172-7384-42cd-9214-8b002cf145ed" containerID="88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd" exitCode=0 Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.345280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerDied","Data":"88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd"} Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.345310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4bxtj" event={"ID":"a02f7172-7384-42cd-9214-8b002cf145ed","Type":"ContainerDied","Data":"c9a0b4c4b2d328acfc8dbf98085726b3d08fcc43606a37757f2904fc601dae4a"} Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.345322 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4bxtj" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.345329 4707 scope.go:117] "RemoveContainer" containerID="88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.364450 4707 scope.go:117] "RemoveContainer" containerID="f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.367241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4bxtj"] Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.373005 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4bxtj"] Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.400339 4707 scope.go:117] "RemoveContainer" containerID="bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.417020 4707 scope.go:117] "RemoveContainer" containerID="88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd" Jan 21 15:36:30 crc kubenswrapper[4707]: E0121 15:36:30.417403 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd\": container with ID starting with 88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd not found: ID does not exist" containerID="88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.417434 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd"} err="failed to get container status \"88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd\": rpc error: code = NotFound desc = could not find container \"88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd\": container with ID starting with 88bf045b6301d633dde5330d773d22d775356bf33f2f27b20e19782c96365bfd not found: ID does not exist" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.417452 4707 scope.go:117] "RemoveContainer" containerID="f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f" Jan 21 15:36:30 crc kubenswrapper[4707]: E0121 15:36:30.417682 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f\": container with ID starting with f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f not found: ID does not exist" containerID="f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.417706 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f"} err="failed to get container status \"f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f\": rpc error: code = NotFound desc = could not find container \"f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f\": container with ID starting with f614b0ccd19600aa709c3fbe7a704c05c0d4fff7ab815a2b5a8b686df4d26d4f not found: ID does not exist" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.417719 4707 scope.go:117] "RemoveContainer" 
containerID="bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c" Jan 21 15:36:30 crc kubenswrapper[4707]: E0121 15:36:30.417934 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c\": container with ID starting with bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c not found: ID does not exist" containerID="bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c" Jan 21 15:36:30 crc kubenswrapper[4707]: I0121 15:36:30.417956 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c"} err="failed to get container status \"bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c\": rpc error: code = NotFound desc = could not find container \"bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c\": container with ID starting with bdc72e8d1f288f10cd41f216086db7a5085726be2fd09c0346a3bb3429617f6c not found: ID does not exist" Jan 21 15:36:31 crc kubenswrapper[4707]: I0121 15:36:31.190545 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" path="/var/lib/kubelet/pods/a02f7172-7384-42cd-9214-8b002cf145ed/volumes" Jan 21 15:36:37 crc kubenswrapper[4707]: I0121 15:36:37.252990 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:36:37 crc kubenswrapper[4707]: I0121 15:36:37.271964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:36:37 crc kubenswrapper[4707]: I0121 15:36:37.278291 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:36:37 crc kubenswrapper[4707]: I0121 15:36:37.474964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.230569 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-mptqk"] Jan 21 15:36:40 crc kubenswrapper[4707]: E0121 15:36:40.231588 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="extract-content" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.231692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="extract-content" Jan 21 15:36:40 crc kubenswrapper[4707]: E0121 15:36:40.231768 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="extract-utilities" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.231848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="extract-utilities" Jan 21 15:36:40 crc kubenswrapper[4707]: E0121 15:36:40.231917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="registry-server" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.231970 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="registry-server" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.232195 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a02f7172-7384-42cd-9214-8b002cf145ed" containerName="registry-server" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.232846 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.236325 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-c140-account-create-update-prxqq"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.237331 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.241938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.246640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-c140-account-create-update-prxqq"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.251320 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-mptqk"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.301372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed552cf6-1879-4c6e-877e-ff70fd80c485-operator-scripts\") pod \"cinder-c140-account-create-update-prxqq\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.301584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7dw\" (UniqueName: \"kubernetes.io/projected/d1b890b6-faa3-4e93-aebe-6d8277291a22-kube-api-access-vb7dw\") pod \"cinder-db-create-mptqk\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.301715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b890b6-faa3-4e93-aebe-6d8277291a22-operator-scripts\") pod \"cinder-db-create-mptqk\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.301834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn2jb\" (UniqueName: \"kubernetes.io/projected/ed552cf6-1879-4c6e-877e-ff70fd80c485-kube-api-access-gn2jb\") pod \"cinder-c140-account-create-update-prxqq\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.330590 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.331849 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.336360 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-f8zkr"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.337284 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.340376 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.345893 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-f8zkr"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.350618 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8px8w\" (UniqueName: \"kubernetes.io/projected/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-kube-api-access-8px8w\") pod \"barbican-f889-account-create-update-9hjdd\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pdc\" (UniqueName: \"kubernetes.io/projected/902a08ab-3caf-4942-bf59-986f57145ec4-kube-api-access-p4pdc\") pod \"barbican-db-create-f8zkr\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-operator-scripts\") pod \"barbican-f889-account-create-update-9hjdd\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed552cf6-1879-4c6e-877e-ff70fd80c485-operator-scripts\") pod \"cinder-c140-account-create-update-prxqq\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7dw\" (UniqueName: \"kubernetes.io/projected/d1b890b6-faa3-4e93-aebe-6d8277291a22-kube-api-access-vb7dw\") pod \"cinder-db-create-mptqk\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b890b6-faa3-4e93-aebe-6d8277291a22-operator-scripts\") pod \"cinder-db-create-mptqk\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 
21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/902a08ab-3caf-4942-bf59-986f57145ec4-operator-scripts\") pod \"barbican-db-create-f8zkr\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.404658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn2jb\" (UniqueName: \"kubernetes.io/projected/ed552cf6-1879-4c6e-877e-ff70fd80c485-kube-api-access-gn2jb\") pod \"cinder-c140-account-create-update-prxqq\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.405331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed552cf6-1879-4c6e-877e-ff70fd80c485-operator-scripts\") pod \"cinder-c140-account-create-update-prxqq\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.405703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b890b6-faa3-4e93-aebe-6d8277291a22-operator-scripts\") pod \"cinder-db-create-mptqk\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.422782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn2jb\" (UniqueName: \"kubernetes.io/projected/ed552cf6-1879-4c6e-877e-ff70fd80c485-kube-api-access-gn2jb\") pod \"cinder-c140-account-create-update-prxqq\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.427702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7dw\" (UniqueName: \"kubernetes.io/projected/d1b890b6-faa3-4e93-aebe-6d8277291a22-kube-api-access-vb7dw\") pod \"cinder-db-create-mptqk\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.433852 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lzkn7"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.434798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.439208 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lzkn7"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.480631 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-bm4lc"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.481662 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.483637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-m5gqr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.483793 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.484827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.485001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.489413 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-bm4lc"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.505479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581b5e31-76d5-4900-91bd-8b2f82e9bf45-operator-scripts\") pod \"neutron-db-create-lzkn7\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.505536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8px8w\" (UniqueName: \"kubernetes.io/projected/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-kube-api-access-8px8w\") pod \"barbican-f889-account-create-update-9hjdd\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.505700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pdc\" (UniqueName: \"kubernetes.io/projected/902a08ab-3caf-4942-bf59-986f57145ec4-kube-api-access-p4pdc\") pod \"barbican-db-create-f8zkr\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.505737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgql6\" (UniqueName: \"kubernetes.io/projected/581b5e31-76d5-4900-91bd-8b2f82e9bf45-kube-api-access-kgql6\") pod \"neutron-db-create-lzkn7\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.505822 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-operator-scripts\") pod \"barbican-f889-account-create-update-9hjdd\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.505903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/902a08ab-3caf-4942-bf59-986f57145ec4-operator-scripts\") pod \"barbican-db-create-f8zkr\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.508564 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/902a08ab-3caf-4942-bf59-986f57145ec4-operator-scripts\") pod \"barbican-db-create-f8zkr\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.508562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-operator-scripts\") pod \"barbican-f889-account-create-update-9hjdd\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.527712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pdc\" (UniqueName: \"kubernetes.io/projected/902a08ab-3caf-4942-bf59-986f57145ec4-kube-api-access-p4pdc\") pod \"barbican-db-create-f8zkr\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.530154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8px8w\" (UniqueName: \"kubernetes.io/projected/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-kube-api-access-8px8w\") pod \"barbican-f889-account-create-update-9hjdd\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.536851 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.537852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.539761 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.543682 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5"] Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.549726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.556089 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.607442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-combined-ca-bundle\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.607543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3169b921-59ad-46d3-9c46-1ad852c93a6e-operator-scripts\") pod \"neutron-e1d7-account-create-update-mr6l5\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.607623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581b5e31-76d5-4900-91bd-8b2f82e9bf45-operator-scripts\") pod \"neutron-db-create-lzkn7\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.607729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-config-data\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.607774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fm9\" (UniqueName: \"kubernetes.io/projected/3169b921-59ad-46d3-9c46-1ad852c93a6e-kube-api-access-p6fm9\") pod \"neutron-e1d7-account-create-update-mr6l5\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.608006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgql6\" (UniqueName: \"kubernetes.io/projected/581b5e31-76d5-4900-91bd-8b2f82e9bf45-kube-api-access-kgql6\") pod \"neutron-db-create-lzkn7\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.608035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb7gm\" (UniqueName: \"kubernetes.io/projected/77947c2b-5588-40bc-9038-45844e74e767-kube-api-access-nb7gm\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.608352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581b5e31-76d5-4900-91bd-8b2f82e9bf45-operator-scripts\") pod \"neutron-db-create-lzkn7\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.627327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-kgql6\" (UniqueName: \"kubernetes.io/projected/581b5e31-76d5-4900-91bd-8b2f82e9bf45-kube-api-access-kgql6\") pod \"neutron-db-create-lzkn7\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.670655 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.679116 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.709869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-combined-ca-bundle\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.709964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3169b921-59ad-46d3-9c46-1ad852c93a6e-operator-scripts\") pod \"neutron-e1d7-account-create-update-mr6l5\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.710093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-config-data\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.710138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fm9\" (UniqueName: \"kubernetes.io/projected/3169b921-59ad-46d3-9c46-1ad852c93a6e-kube-api-access-p6fm9\") pod \"neutron-e1d7-account-create-update-mr6l5\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.710189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb7gm\" (UniqueName: \"kubernetes.io/projected/77947c2b-5588-40bc-9038-45844e74e767-kube-api-access-nb7gm\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.711737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3169b921-59ad-46d3-9c46-1ad852c93a6e-operator-scripts\") pod \"neutron-e1d7-account-create-update-mr6l5\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.715853 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-config-data\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 
15:36:40.716757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-combined-ca-bundle\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.726978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fm9\" (UniqueName: \"kubernetes.io/projected/3169b921-59ad-46d3-9c46-1ad852c93a6e-kube-api-access-p6fm9\") pod \"neutron-e1d7-account-create-update-mr6l5\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.731245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb7gm\" (UniqueName: \"kubernetes.io/projected/77947c2b-5588-40bc-9038-45844e74e767-kube-api-access-nb7gm\") pod \"keystone-db-sync-bm4lc\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.805888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.822540 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.956324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:40 crc kubenswrapper[4707]: I0121 15:36:40.969119 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-mptqk"] Jan 21 15:36:40 crc kubenswrapper[4707]: W0121 15:36:40.973160 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b890b6_faa3_4e93_aebe_6d8277291a22.slice/crio-eb07065a121ea5e0c433f60df05ea67b8c76ae59b11df2ee8ffb8d81a7a64bdf WatchSource:0}: Error finding container eb07065a121ea5e0c433f60df05ea67b8c76ae59b11df2ee8ffb8d81a7a64bdf: Status 404 returned error can't find the container with id eb07065a121ea5e0c433f60df05ea67b8c76ae59b11df2ee8ffb8d81a7a64bdf Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.060307 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-c140-account-create-update-prxqq"] Jan 21 15:36:41 crc kubenswrapper[4707]: W0121 15:36:41.067509 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded552cf6_1879_4c6e_877e_ff70fd80c485.slice/crio-c0cef32c9e6e437c1722309a0846f192b2ca3dbe87ae5733701f98fabf3fa2ff WatchSource:0}: Error finding container c0cef32c9e6e437c1722309a0846f192b2ca3dbe87ae5733701f98fabf3fa2ff: Status 404 returned error can't find the container with id c0cef32c9e6e437c1722309a0846f192b2ca3dbe87ae5733701f98fabf3fa2ff Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.160126 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd"] Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.168099 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-f8zkr"] 
Jan 21 15:36:41 crc kubenswrapper[4707]: W0121 15:36:41.185139 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e6c1e3_a518_4ff6_8449_be149ab65bb1.slice/crio-084687294ce9600974226a76ad002dd0dd1c84270fe174995768d00cb205b529 WatchSource:0}: Error finding container 084687294ce9600974226a76ad002dd0dd1c84270fe174995768d00cb205b529: Status 404 returned error can't find the container with id 084687294ce9600974226a76ad002dd0dd1c84270fe174995768d00cb205b529 Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.300304 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-bm4lc"] Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.306525 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lzkn7"] Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.430230 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5"] Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.448418 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed552cf6-1879-4c6e-877e-ff70fd80c485" containerID="f123bdd1dba97854e5b6dbe5bcbbf1f01fb964c4f1fe9a7a1965cf38a8e919f5" exitCode=0 Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.448493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" event={"ID":"ed552cf6-1879-4c6e-877e-ff70fd80c485","Type":"ContainerDied","Data":"f123bdd1dba97854e5b6dbe5bcbbf1f01fb964c4f1fe9a7a1965cf38a8e919f5"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.448519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" event={"ID":"ed552cf6-1879-4c6e-877e-ff70fd80c485","Type":"ContainerStarted","Data":"c0cef32c9e6e437c1722309a0846f192b2ca3dbe87ae5733701f98fabf3fa2ff"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.451889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" event={"ID":"902a08ab-3caf-4942-bf59-986f57145ec4","Type":"ContainerStarted","Data":"cbcc2961b6a327d16171eb31d8ea54ff8a2c6f5303297d8511993e6e6fbb1e86"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.451933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" event={"ID":"902a08ab-3caf-4942-bf59-986f57145ec4","Type":"ContainerStarted","Data":"846e869c3a9bd60db5c73602129b2c2aaea99bd2eac3db7f85807bffccc6b54e"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.453635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" event={"ID":"e6e6c1e3-a518-4ff6-8449-be149ab65bb1","Type":"ContainerStarted","Data":"6f7a4f03f1265c02518cb79a00df2bf9a2f6b20ef54c2f013a587fc6042298c4"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.453679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" event={"ID":"e6e6c1e3-a518-4ff6-8449-be149ab65bb1","Type":"ContainerStarted","Data":"084687294ce9600974226a76ad002dd0dd1c84270fe174995768d00cb205b529"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.455880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" 
event={"ID":"77947c2b-5588-40bc-9038-45844e74e767","Type":"ContainerStarted","Data":"d878e1f61ddfa33924b77f9590ee0c8e7795ba4d23535fd388af1276f8852ff3"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.459569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" event={"ID":"581b5e31-76d5-4900-91bd-8b2f82e9bf45","Type":"ContainerStarted","Data":"bf4b3ec86d1911132d55278bdb9ecf1493b7b574ae3c2fe9ba78524372115b57"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.461333 4707 generic.go:334] "Generic (PLEG): container finished" podID="d1b890b6-faa3-4e93-aebe-6d8277291a22" containerID="2ee450230b14b4a5c2a9d71be025be369c10b66e0be91322654b7f1dcc35d9a2" exitCode=0 Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.461378 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-mptqk" event={"ID":"d1b890b6-faa3-4e93-aebe-6d8277291a22","Type":"ContainerDied","Data":"2ee450230b14b4a5c2a9d71be025be369c10b66e0be91322654b7f1dcc35d9a2"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.461417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-mptqk" event={"ID":"d1b890b6-faa3-4e93-aebe-6d8277291a22","Type":"ContainerStarted","Data":"eb07065a121ea5e0c433f60df05ea67b8c76ae59b11df2ee8ffb8d81a7a64bdf"} Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.515449 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" podStartSLOduration=1.515433421 podStartE2EDuration="1.515433421s" podCreationTimestamp="2026-01-21 15:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:41.49993563 +0000 UTC m=+2098.681451853" watchObservedRunningTime="2026-01-21 15:36:41.515433421 +0000 UTC m=+2098.696949642" Jan 21 15:36:41 crc kubenswrapper[4707]: I0121 15:36:41.527305 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" podStartSLOduration=1.527289373 podStartE2EDuration="1.527289373s" podCreationTimestamp="2026-01-21 15:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:41.526016891 +0000 UTC m=+2098.707533113" watchObservedRunningTime="2026-01-21 15:36:41.527289373 +0000 UTC m=+2098.708805596" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.487667 4707 generic.go:334] "Generic (PLEG): container finished" podID="3169b921-59ad-46d3-9c46-1ad852c93a6e" containerID="b26e99f385c2260f1e53ef3f8fa69105e75942496a20c6800252912d20edc6b9" exitCode=0 Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.487773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" event={"ID":"3169b921-59ad-46d3-9c46-1ad852c93a6e","Type":"ContainerDied","Data":"b26e99f385c2260f1e53ef3f8fa69105e75942496a20c6800252912d20edc6b9"} Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.488078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" event={"ID":"3169b921-59ad-46d3-9c46-1ad852c93a6e","Type":"ContainerStarted","Data":"e5c6fb0c234f6f3d53af9c7c740cb3176f0bc15bcdcf951d68e888fab85173e2"} Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.489965 4707 
generic.go:334] "Generic (PLEG): container finished" podID="902a08ab-3caf-4942-bf59-986f57145ec4" containerID="cbcc2961b6a327d16171eb31d8ea54ff8a2c6f5303297d8511993e6e6fbb1e86" exitCode=0 Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.490100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" event={"ID":"902a08ab-3caf-4942-bf59-986f57145ec4","Type":"ContainerDied","Data":"cbcc2961b6a327d16171eb31d8ea54ff8a2c6f5303297d8511993e6e6fbb1e86"} Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.491676 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6e6c1e3-a518-4ff6-8449-be149ab65bb1" containerID="6f7a4f03f1265c02518cb79a00df2bf9a2f6b20ef54c2f013a587fc6042298c4" exitCode=0 Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.491705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" event={"ID":"e6e6c1e3-a518-4ff6-8449-be149ab65bb1","Type":"ContainerDied","Data":"6f7a4f03f1265c02518cb79a00df2bf9a2f6b20ef54c2f013a587fc6042298c4"} Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.493068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" event={"ID":"77947c2b-5588-40bc-9038-45844e74e767","Type":"ContainerStarted","Data":"17314e5ab8670c6d632a90347ca49b8287c481196323c08cd4371587301e2862"} Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.494582 4707 generic.go:334] "Generic (PLEG): container finished" podID="581b5e31-76d5-4900-91bd-8b2f82e9bf45" containerID="4e8db9206e60207d764bf1cf861c46a318af07cd5da8f84caf3d1dceec27103a" exitCode=0 Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.494637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" event={"ID":"581b5e31-76d5-4900-91bd-8b2f82e9bf45","Type":"ContainerDied","Data":"4e8db9206e60207d764bf1cf861c46a318af07cd5da8f84caf3d1dceec27103a"} Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.540345 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" podStartSLOduration=2.54032738 podStartE2EDuration="2.54032738s" podCreationTimestamp="2026-01-21 15:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:42.537671296 +0000 UTC m=+2099.719187518" watchObservedRunningTime="2026-01-21 15:36:42.54032738 +0000 UTC m=+2099.721843603" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.881283 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.899961 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.958510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed552cf6-1879-4c6e-877e-ff70fd80c485-operator-scripts\") pod \"ed552cf6-1879-4c6e-877e-ff70fd80c485\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.958711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7dw\" (UniqueName: \"kubernetes.io/projected/d1b890b6-faa3-4e93-aebe-6d8277291a22-kube-api-access-vb7dw\") pod \"d1b890b6-faa3-4e93-aebe-6d8277291a22\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.958788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn2jb\" (UniqueName: \"kubernetes.io/projected/ed552cf6-1879-4c6e-877e-ff70fd80c485-kube-api-access-gn2jb\") pod \"ed552cf6-1879-4c6e-877e-ff70fd80c485\" (UID: \"ed552cf6-1879-4c6e-877e-ff70fd80c485\") " Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.958825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b890b6-faa3-4e93-aebe-6d8277291a22-operator-scripts\") pod \"d1b890b6-faa3-4e93-aebe-6d8277291a22\" (UID: \"d1b890b6-faa3-4e93-aebe-6d8277291a22\") " Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.958999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed552cf6-1879-4c6e-877e-ff70fd80c485-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed552cf6-1879-4c6e-877e-ff70fd80c485" (UID: "ed552cf6-1879-4c6e-877e-ff70fd80c485"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.959201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b890b6-faa3-4e93-aebe-6d8277291a22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1b890b6-faa3-4e93-aebe-6d8277291a22" (UID: "d1b890b6-faa3-4e93-aebe-6d8277291a22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.959352 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b890b6-faa3-4e93-aebe-6d8277291a22-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.959368 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed552cf6-1879-4c6e-877e-ff70fd80c485-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.963142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b890b6-faa3-4e93-aebe-6d8277291a22-kube-api-access-vb7dw" (OuterVolumeSpecName: "kube-api-access-vb7dw") pod "d1b890b6-faa3-4e93-aebe-6d8277291a22" (UID: "d1b890b6-faa3-4e93-aebe-6d8277291a22"). InnerVolumeSpecName "kube-api-access-vb7dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:42 crc kubenswrapper[4707]: I0121 15:36:42.965441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed552cf6-1879-4c6e-877e-ff70fd80c485-kube-api-access-gn2jb" (OuterVolumeSpecName: "kube-api-access-gn2jb") pod "ed552cf6-1879-4c6e-877e-ff70fd80c485" (UID: "ed552cf6-1879-4c6e-877e-ff70fd80c485"). InnerVolumeSpecName "kube-api-access-gn2jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.061099 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7dw\" (UniqueName: \"kubernetes.io/projected/d1b890b6-faa3-4e93-aebe-6d8277291a22-kube-api-access-vb7dw\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.061128 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn2jb\" (UniqueName: \"kubernetes.io/projected/ed552cf6-1879-4c6e-877e-ff70fd80c485-kube-api-access-gn2jb\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.502205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-mptqk" event={"ID":"d1b890b6-faa3-4e93-aebe-6d8277291a22","Type":"ContainerDied","Data":"eb07065a121ea5e0c433f60df05ea67b8c76ae59b11df2ee8ffb8d81a7a64bdf"} Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.502257 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb07065a121ea5e0c433f60df05ea67b8c76ae59b11df2ee8ffb8d81a7a64bdf" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.502317 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-mptqk" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.504565 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.504915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c140-account-create-update-prxqq" event={"ID":"ed552cf6-1879-4c6e-877e-ff70fd80c485","Type":"ContainerDied","Data":"c0cef32c9e6e437c1722309a0846f192b2ca3dbe87ae5733701f98fabf3fa2ff"} Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.504939 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0cef32c9e6e437c1722309a0846f192b2ca3dbe87ae5733701f98fabf3fa2ff" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.746477 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.873275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3169b921-59ad-46d3-9c46-1ad852c93a6e-operator-scripts\") pod \"3169b921-59ad-46d3-9c46-1ad852c93a6e\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.873326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6fm9\" (UniqueName: \"kubernetes.io/projected/3169b921-59ad-46d3-9c46-1ad852c93a6e-kube-api-access-p6fm9\") pod \"3169b921-59ad-46d3-9c46-1ad852c93a6e\" (UID: \"3169b921-59ad-46d3-9c46-1ad852c93a6e\") " Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.873958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3169b921-59ad-46d3-9c46-1ad852c93a6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3169b921-59ad-46d3-9c46-1ad852c93a6e" (UID: "3169b921-59ad-46d3-9c46-1ad852c93a6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.877591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3169b921-59ad-46d3-9c46-1ad852c93a6e-kube-api-access-p6fm9" (OuterVolumeSpecName: "kube-api-access-p6fm9") pod "3169b921-59ad-46d3-9c46-1ad852c93a6e" (UID: "3169b921-59ad-46d3-9c46-1ad852c93a6e"). InnerVolumeSpecName "kube-api-access-p6fm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.976181 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3169b921-59ad-46d3-9c46-1ad852c93a6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:43 crc kubenswrapper[4707]: I0121 15:36:43.976216 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6fm9\" (UniqueName: \"kubernetes.io/projected/3169b921-59ad-46d3-9c46-1ad852c93a6e-kube-api-access-p6fm9\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.005312 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.010886 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.019702 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.077537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgql6\" (UniqueName: \"kubernetes.io/projected/581b5e31-76d5-4900-91bd-8b2f82e9bf45-kube-api-access-kgql6\") pod \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.077680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4pdc\" (UniqueName: \"kubernetes.io/projected/902a08ab-3caf-4942-bf59-986f57145ec4-kube-api-access-p4pdc\") pod \"902a08ab-3caf-4942-bf59-986f57145ec4\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.077719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/902a08ab-3caf-4942-bf59-986f57145ec4-operator-scripts\") pod \"902a08ab-3caf-4942-bf59-986f57145ec4\" (UID: \"902a08ab-3caf-4942-bf59-986f57145ec4\") " Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.077754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-operator-scripts\") pod \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.077825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8px8w\" (UniqueName: \"kubernetes.io/projected/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-kube-api-access-8px8w\") pod \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\" (UID: \"e6e6c1e3-a518-4ff6-8449-be149ab65bb1\") " Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.077876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581b5e31-76d5-4900-91bd-8b2f82e9bf45-operator-scripts\") pod \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\" (UID: \"581b5e31-76d5-4900-91bd-8b2f82e9bf45\") " Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.079406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902a08ab-3caf-4942-bf59-986f57145ec4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "902a08ab-3caf-4942-bf59-986f57145ec4" (UID: "902a08ab-3caf-4942-bf59-986f57145ec4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.079514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6e6c1e3-a518-4ff6-8449-be149ab65bb1" (UID: "e6e6c1e3-a518-4ff6-8449-be149ab65bb1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.079640 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581b5e31-76d5-4900-91bd-8b2f82e9bf45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "581b5e31-76d5-4900-91bd-8b2f82e9bf45" (UID: "581b5e31-76d5-4900-91bd-8b2f82e9bf45"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.081491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581b5e31-76d5-4900-91bd-8b2f82e9bf45-kube-api-access-kgql6" (OuterVolumeSpecName: "kube-api-access-kgql6") pod "581b5e31-76d5-4900-91bd-8b2f82e9bf45" (UID: "581b5e31-76d5-4900-91bd-8b2f82e9bf45"). InnerVolumeSpecName "kube-api-access-kgql6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.082006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902a08ab-3caf-4942-bf59-986f57145ec4-kube-api-access-p4pdc" (OuterVolumeSpecName: "kube-api-access-p4pdc") pod "902a08ab-3caf-4942-bf59-986f57145ec4" (UID: "902a08ab-3caf-4942-bf59-986f57145ec4"). InnerVolumeSpecName "kube-api-access-p4pdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.082361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-kube-api-access-8px8w" (OuterVolumeSpecName: "kube-api-access-8px8w") pod "e6e6c1e3-a518-4ff6-8449-be149ab65bb1" (UID: "e6e6c1e3-a518-4ff6-8449-be149ab65bb1"). InnerVolumeSpecName "kube-api-access-8px8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.182847 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581b5e31-76d5-4900-91bd-8b2f82e9bf45-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.183122 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgql6\" (UniqueName: \"kubernetes.io/projected/581b5e31-76d5-4900-91bd-8b2f82e9bf45-kube-api-access-kgql6\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.183195 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4pdc\" (UniqueName: \"kubernetes.io/projected/902a08ab-3caf-4942-bf59-986f57145ec4-kube-api-access-p4pdc\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.183274 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/902a08ab-3caf-4942-bf59-986f57145ec4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.183337 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.183411 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8px8w\" (UniqueName: \"kubernetes.io/projected/e6e6c1e3-a518-4ff6-8449-be149ab65bb1-kube-api-access-8px8w\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.511056 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.511051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-f8zkr" event={"ID":"902a08ab-3caf-4942-bf59-986f57145ec4","Type":"ContainerDied","Data":"846e869c3a9bd60db5c73602129b2c2aaea99bd2eac3db7f85807bffccc6b54e"} Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.514079 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846e869c3a9bd60db5c73602129b2c2aaea99bd2eac3db7f85807bffccc6b54e" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.515483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" event={"ID":"e6e6c1e3-a518-4ff6-8449-be149ab65bb1","Type":"ContainerDied","Data":"084687294ce9600974226a76ad002dd0dd1c84270fe174995768d00cb205b529"} Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.515535 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="084687294ce9600974226a76ad002dd0dd1c84270fe174995768d00cb205b529" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.515559 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.518410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" event={"ID":"3169b921-59ad-46d3-9c46-1ad852c93a6e","Type":"ContainerDied","Data":"e5c6fb0c234f6f3d53af9c7c740cb3176f0bc15bcdcf951d68e888fab85173e2"} Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.518490 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c6fb0c234f6f3d53af9c7c740cb3176f0bc15bcdcf951d68e888fab85173e2" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.518644 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.519774 4707 generic.go:334] "Generic (PLEG): container finished" podID="77947c2b-5588-40bc-9038-45844e74e767" containerID="17314e5ab8670c6d632a90347ca49b8287c481196323c08cd4371587301e2862" exitCode=0 Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.519862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" event={"ID":"77947c2b-5588-40bc-9038-45844e74e767","Type":"ContainerDied","Data":"17314e5ab8670c6d632a90347ca49b8287c481196323c08cd4371587301e2862"} Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.521295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" event={"ID":"581b5e31-76d5-4900-91bd-8b2f82e9bf45","Type":"ContainerDied","Data":"bf4b3ec86d1911132d55278bdb9ecf1493b7b574ae3c2fe9ba78524372115b57"} Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.521324 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4b3ec86d1911132d55278bdb9ecf1493b7b574ae3c2fe9ba78524372115b57" Jan 21 15:36:44 crc kubenswrapper[4707]: I0121 15:36:44.521337 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lzkn7" Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.796623 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.909286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-combined-ca-bundle\") pod \"77947c2b-5588-40bc-9038-45844e74e767\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.909349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb7gm\" (UniqueName: \"kubernetes.io/projected/77947c2b-5588-40bc-9038-45844e74e767-kube-api-access-nb7gm\") pod \"77947c2b-5588-40bc-9038-45844e74e767\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.909441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-config-data\") pod \"77947c2b-5588-40bc-9038-45844e74e767\" (UID: \"77947c2b-5588-40bc-9038-45844e74e767\") " Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.915186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77947c2b-5588-40bc-9038-45844e74e767-kube-api-access-nb7gm" (OuterVolumeSpecName: "kube-api-access-nb7gm") pod "77947c2b-5588-40bc-9038-45844e74e767" (UID: "77947c2b-5588-40bc-9038-45844e74e767"). InnerVolumeSpecName "kube-api-access-nb7gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.933592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77947c2b-5588-40bc-9038-45844e74e767" (UID: "77947c2b-5588-40bc-9038-45844e74e767"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:45 crc kubenswrapper[4707]: I0121 15:36:45.949475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-config-data" (OuterVolumeSpecName: "config-data") pod "77947c2b-5588-40bc-9038-45844e74e767" (UID: "77947c2b-5588-40bc-9038-45844e74e767"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.011721 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.011758 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77947c2b-5588-40bc-9038-45844e74e767-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.011769 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb7gm\" (UniqueName: \"kubernetes.io/projected/77947c2b-5588-40bc-9038-45844e74e767-kube-api-access-nb7gm\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.536063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" event={"ID":"77947c2b-5588-40bc-9038-45844e74e767","Type":"ContainerDied","Data":"d878e1f61ddfa33924b77f9590ee0c8e7795ba4d23535fd388af1276f8852ff3"} Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.536090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-bm4lc" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.536102 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d878e1f61ddfa33924b77f9590ee0c8e7795ba4d23535fd388af1276f8852ff3" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968336 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mj6vd"] Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968624 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed552cf6-1879-4c6e-877e-ff70fd80c485" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968635 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed552cf6-1879-4c6e-877e-ff70fd80c485" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968660 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902a08ab-3caf-4942-bf59-986f57145ec4" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968666 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="902a08ab-3caf-4942-bf59-986f57145ec4" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968677 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3169b921-59ad-46d3-9c46-1ad852c93a6e" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3169b921-59ad-46d3-9c46-1ad852c93a6e" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968695 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581b5e31-76d5-4900-91bd-8b2f82e9bf45" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968700 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="581b5e31-76d5-4900-91bd-8b2f82e9bf45" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968712 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="77947c2b-5588-40bc-9038-45844e74e767" containerName="keystone-db-sync" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968716 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77947c2b-5588-40bc-9038-45844e74e767" containerName="keystone-db-sync" Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b890b6-faa3-4e93-aebe-6d8277291a22" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968739 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b890b6-faa3-4e93-aebe-6d8277291a22" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: E0121 15:36:46.968749 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e6c1e3-a518-4ff6-8449-be149ab65bb1" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968754 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e6c1e3-a518-4ff6-8449-be149ab65bb1" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968891 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed552cf6-1879-4c6e-877e-ff70fd80c485" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968902 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b890b6-faa3-4e93-aebe-6d8277291a22" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968912 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="902a08ab-3caf-4942-bf59-986f57145ec4" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e6c1e3-a518-4ff6-8449-be149ab65bb1" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968932 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77947c2b-5588-40bc-9038-45844e74e767" containerName="keystone-db-sync" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968941 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="581b5e31-76d5-4900-91bd-8b2f82e9bf45" containerName="mariadb-database-create" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.968950 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3169b921-59ad-46d3-9c46-1ad852c93a6e" containerName="mariadb-account-create-update" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.969459 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.972632 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.974128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.977234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-m5gqr" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.977263 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.978761 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:36:46 crc kubenswrapper[4707]: I0121 15:36:46.984601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mj6vd"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.027507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-credential-keys\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.027714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-scripts\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.027784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-fernet-keys\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.027844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-combined-ca-bundle\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.027993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-config-data\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.028064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp5dc\" (UniqueName: \"kubernetes.io/projected/88eddb1d-4df9-4afb-bae3-a55b93a6252a-kube-api-access-vp5dc\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.101533 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.103329 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.105712 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.105981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.115067 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.129388 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-scripts\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.129451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-fernet-keys\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.129478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-combined-ca-bundle\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.129549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-config-data\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.129587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp5dc\" (UniqueName: \"kubernetes.io/projected/88eddb1d-4df9-4afb-bae3-a55b93a6252a-kube-api-access-vp5dc\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.129624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-credential-keys\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.134262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-scripts\") pod \"keystone-bootstrap-mj6vd\" (UID: 
\"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.135382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-fernet-keys\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.156555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-config-data\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.156909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-credential-keys\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.160259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-combined-ca-bundle\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.191757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp5dc\" (UniqueName: \"kubernetes.io/projected/88eddb1d-4df9-4afb-bae3-a55b93a6252a-kube-api-access-vp5dc\") pod \"keystone-bootstrap-mj6vd\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.220767 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-q7kt6"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.232120 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.233666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.233750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-scripts\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.233768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.233787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-log-httpd\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.234006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlj7v\" (UniqueName: \"kubernetes.io/projected/eaea7418-f6a3-4005-a8bf-d92b199d03b2-kube-api-access-xlj7v\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.234043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-run-httpd\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.234133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-config-data\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.235314 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.235468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rtwk4" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.239427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.254558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-q7kt6"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.285485 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-config-data\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-scripts\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6q6\" (UniqueName: \"kubernetes.io/projected/56293b49-445a-404c-8eb7-443b7d9d0170-kube-api-access-rn6q6\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-db-sync-config-data\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56293b49-445a-404c-8eb7-443b7d9d0170-etc-machine-id\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-scripts\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-log-httpd\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.335970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.336002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-config-data\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.336070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-combined-ca-bundle\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.336095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlj7v\" (UniqueName: \"kubernetes.io/projected/eaea7418-f6a3-4005-a8bf-d92b199d03b2-kube-api-access-xlj7v\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.336126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-run-httpd\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.336564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-run-httpd\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.337784 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9m9rb"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.338643 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xwzrg"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.344001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.344409 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.344803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-log-httpd\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.345918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-scripts\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.356893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.357391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.361648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-config-data\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.363720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9m9rb"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.367236 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.367488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-6fz9m" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.367617 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.367724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-r5ql8" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.368126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.368360 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.399539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlj7v\" (UniqueName: \"kubernetes.io/projected/eaea7418-f6a3-4005-a8bf-d92b199d03b2-kube-api-access-xlj7v\") pod \"ceilometer-0\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.403095 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xwzrg"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.434619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-scripts\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmmz\" (UniqueName: \"kubernetes.io/projected/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-kube-api-access-tsmmz\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-db-sync-config-data\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56293b49-445a-404c-8eb7-443b7d9d0170-etc-machine-id\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-config-data\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-config-data\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-combined-ca-bundle\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99c92c6-cf0f-406a-ad28-7134cace4dd7-logs\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437759 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-combined-ca-bundle\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klj8k\" (UniqueName: \"kubernetes.io/projected/f99c92c6-cf0f-406a-ad28-7134cace4dd7-kube-api-access-klj8k\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.437910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-combined-ca-bundle\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.438140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-config\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.438184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-scripts\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.438200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6q6\" (UniqueName: \"kubernetes.io/projected/56293b49-445a-404c-8eb7-443b7d9d0170-kube-api-access-rn6q6\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.443353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56293b49-445a-404c-8eb7-443b7d9d0170-etc-machine-id\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.444299 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-lwmqg"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.444530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-db-sync-config-data\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.449544 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.463025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-config-data\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.467718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-scripts\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.468234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-2xl4n" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.468716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.468796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-combined-ca-bundle\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.469207 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-lwmqg"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.487531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6q6\" (UniqueName: \"kubernetes.io/projected/56293b49-445a-404c-8eb7-443b7d9d0170-kube-api-access-rn6q6\") pod \"cinder-db-sync-q7kt6\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-combined-ca-bundle\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-config-data\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99c92c6-cf0f-406a-ad28-7134cace4dd7-logs\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-combined-ca-bundle\") pod 
\"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mps\" (UniqueName: \"kubernetes.io/projected/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-kube-api-access-r7mps\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klj8k\" (UniqueName: \"kubernetes.io/projected/f99c92c6-cf0f-406a-ad28-7134cace4dd7-kube-api-access-klj8k\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-combined-ca-bundle\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-db-sync-config-data\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-config\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-scripts\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.543756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmmz\" (UniqueName: \"kubernetes.io/projected/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-kube-api-access-tsmmz\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.547035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99c92c6-cf0f-406a-ad28-7134cace4dd7-logs\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.554540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-config\") pod \"neutron-db-sync-9m9rb\" (UID: 
\"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.554916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-combined-ca-bundle\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.555322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-config-data\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.566001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.584854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmmz\" (UniqueName: \"kubernetes.io/projected/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-kube-api-access-tsmmz\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.585449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-combined-ca-bundle\") pod \"neutron-db-sync-9m9rb\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.586921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-scripts\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.587043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klj8k\" (UniqueName: \"kubernetes.io/projected/f99c92c6-cf0f-406a-ad28-7134cace4dd7-kube-api-access-klj8k\") pod \"placement-db-sync-xwzrg\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.645256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mps\" (UniqueName: \"kubernetes.io/projected/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-kube-api-access-r7mps\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.645334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-db-sync-config-data\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.645418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-combined-ca-bundle\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.649638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-combined-ca-bundle\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.650475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-db-sync-config-data\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.662029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mps\" (UniqueName: \"kubernetes.io/projected/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-kube-api-access-r7mps\") pod \"barbican-db-sync-lwmqg\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.778953 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.823199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.880924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mj6vd"] Jan 21 15:36:47 crc kubenswrapper[4707]: I0121 15:36:47.888479 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:48 crc kubenswrapper[4707]: W0121 15:36:48.007943 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaea7418_f6a3_4005_a8bf_d92b199d03b2.slice/crio-9087c74239db23aed6b37b4eb8795218c2d7117cb60e96bd2d95303913a5278c WatchSource:0}: Error finding container 9087c74239db23aed6b37b4eb8795218c2d7117cb60e96bd2d95303913a5278c: Status 404 returned error can't find the container with id 9087c74239db23aed6b37b4eb8795218c2d7117cb60e96bd2d95303913a5278c Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.008050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.064450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.069115 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.073707 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.074667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.079628 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-7x9kc" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.081566 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.099426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:48 crc kubenswrapper[4707]: W0121 15:36:48.117982 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56293b49_445a_404c_8eb7_443b7d9d0170.slice/crio-2eea8c3f81f780c9113094b65c01e2d3826790e7f9cda7b122c74136f584f81e WatchSource:0}: Error finding container 2eea8c3f81f780c9113094b65c01e2d3826790e7f9cda7b122c74136f584f81e: Status 404 returned error can't find the container with id 2eea8c3f81f780c9113094b65c01e2d3826790e7f9cda7b122c74136f584f81e Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.118090 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-q7kt6"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.145203 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.146702 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.149698 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.149763 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.154757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.157976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.158115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkcj\" (UniqueName: \"kubernetes.io/projected/55c9c667-afb4-4ace-8bee-173cf43d5a91-kube-api-access-ckkcj\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.158192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-logs\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.158332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-scripts\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.158409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-config-data\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.158484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.158561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc 
kubenswrapper[4707]: I0121 15:36:48.158632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: W0121 15:36:48.251074 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf99c92c6_cf0f_406a_ad28_7134cace4dd7.slice/crio-021faba23c683890758004fd1cf02a022429c08a3c0691104a7a7bce27b27b00 WatchSource:0}: Error finding container 021faba23c683890758004fd1cf02a022429c08a3c0691104a7a7bce27b27b00: Status 404 returned error can't find the container with id 021faba23c683890758004fd1cf02a022429c08a3c0691104a7a7bce27b27b00 Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.251365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xwzrg"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.259668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.259848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-scripts\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.259957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbhxn\" (UniqueName: \"kubernetes.io/projected/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-kube-api-access-pbhxn\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-config-data\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.260567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.261069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.261526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.261706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.262076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkcj\" (UniqueName: \"kubernetes.io/projected/55c9c667-afb4-4ace-8bee-173cf43d5a91-kube-api-access-ckkcj\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.262438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-logs\") pod 
\"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.262829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.262939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.262040 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.262758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-logs\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.264294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-scripts\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.265355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-config-data\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.270775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.270927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.276165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkcj\" (UniqueName: 
\"kubernetes.io/projected/55c9c667-afb4-4ace-8bee-173cf43d5a91-kube-api-access-ckkcj\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.299527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.338877 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9m9rb"] Jan 21 15:36:48 crc kubenswrapper[4707]: W0121 15:36:48.341352 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8157a3d5_13f8_4842_9fcd_6bd3d5f4a020.slice/crio-b5ff2cd59756900eca26d7cdb1a41e8c9b570c3fe333ac5cf279f6b593828c5d WatchSource:0}: Error finding container b5ff2cd59756900eca26d7cdb1a41e8c9b570c3fe333ac5cf279f6b593828c5d: Status 404 returned error can't find the container with id b5ff2cd59756900eca26d7cdb1a41e8c9b570c3fe333ac5cf279f6b593828c5d Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbhxn\" (UniqueName: \"kubernetes.io/projected/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-kube-api-access-pbhxn\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.363971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.364023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.364523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.364599 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.364891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.367671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.368332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.369840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.379914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.385449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbhxn\" (UniqueName: \"kubernetes.io/projected/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-kube-api-access-pbhxn\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.393021 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.402242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.429664 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-lwmqg"] Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.616623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" event={"ID":"88eddb1d-4df9-4afb-bae3-a55b93a6252a","Type":"ContainerStarted","Data":"73b71146f952853bdd6e59449df71c4b5a1958e47e4c4588e0b1ac00e9bd595a"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.616929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" event={"ID":"88eddb1d-4df9-4afb-bae3-a55b93a6252a","Type":"ContainerStarted","Data":"0bcac72fa58ff64e99a340c63b37b7529493e35576bc41ab32ddf189fce08b57"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.622178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" event={"ID":"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020","Type":"ContainerStarted","Data":"cee8fc65d92a4c449380c20ddd4feeb07fa1354e54366a6b28e01b8d807032e6"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.622206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" event={"ID":"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020","Type":"ContainerStarted","Data":"b5ff2cd59756900eca26d7cdb1a41e8c9b570c3fe333ac5cf279f6b593828c5d"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.627584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerStarted","Data":"9087c74239db23aed6b37b4eb8795218c2d7117cb60e96bd2d95303913a5278c"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.628745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" event={"ID":"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8","Type":"ContainerStarted","Data":"d329165a54265c1914902c5f66712fa26a01137aa64da14eb54cfb3059404372"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.633633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" 
event={"ID":"56293b49-445a-404c-8eb7-443b7d9d0170","Type":"ContainerStarted","Data":"2eea8c3f81f780c9113094b65c01e2d3826790e7f9cda7b122c74136f584f81e"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.635299 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" podStartSLOduration=2.635285795 podStartE2EDuration="2.635285795s" podCreationTimestamp="2026-01-21 15:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:48.632105916 +0000 UTC m=+2105.813622139" watchObservedRunningTime="2026-01-21 15:36:48.635285795 +0000 UTC m=+2105.816802018" Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.641085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" event={"ID":"f99c92c6-cf0f-406a-ad28-7134cace4dd7","Type":"ContainerStarted","Data":"915ec7465a1a102fa76e8d82b871add4c78fcca54bbabfcfdaf7adb84ced2a0d"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.641514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" event={"ID":"f99c92c6-cf0f-406a-ad28-7134cace4dd7","Type":"ContainerStarted","Data":"021faba23c683890758004fd1cf02a022429c08a3c0691104a7a7bce27b27b00"} Jan 21 15:36:48 crc kubenswrapper[4707]: I0121 15:36:48.655158 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" podStartSLOduration=1.6551410180000001 podStartE2EDuration="1.655141018s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:48.644077586 +0000 UTC m=+2105.825593809" watchObservedRunningTime="2026-01-21 15:36:48.655141018 +0000 UTC m=+2105.836657240" Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:48.664460 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" podStartSLOduration=1.664446946 podStartE2EDuration="1.664446946s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:48.659911228 +0000 UTC m=+2105.841427449" watchObservedRunningTime="2026-01-21 15:36:48.664446946 +0000 UTC m=+2105.845963169" Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:48.691330 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:48.888857 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.241545 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.278673 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.553387 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.653900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerStarted","Data":"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011"} Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.654089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerStarted","Data":"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9"} Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.657218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" event={"ID":"56293b49-445a-404c-8eb7-443b7d9d0170","Type":"ContainerStarted","Data":"f431730553af7e4d97ef4b5ba81a40ace44ae601e09bfd4227bed22314d88fa7"} Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.659706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" event={"ID":"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8","Type":"ContainerStarted","Data":"2bad8cee5f896a528e089f0e95328272fc4d1d17326f5b82657dc7c6854aebea"} Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.662329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"55c9c667-afb4-4ace-8bee-173cf43d5a91","Type":"ContainerStarted","Data":"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e"} Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.662364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"55c9c667-afb4-4ace-8bee-173cf43d5a91","Type":"ContainerStarted","Data":"32288c6fb19adab0068168417ca6ae969474a4a0fc4253fcf9145ec13bfc2973"} Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.691865 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" podStartSLOduration=2.691837881 podStartE2EDuration="2.691837881s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:49.674666654 +0000 UTC m=+2106.856182886" watchObservedRunningTime="2026-01-21 15:36:49.691837881 +0000 UTC m=+2106.873354102" Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.706454 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" podStartSLOduration=2.706436289 podStartE2EDuration="2.706436289s" podCreationTimestamp="2026-01-21 
15:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:49.68589191 +0000 UTC m=+2106.867408132" watchObservedRunningTime="2026-01-21 15:36:49.706436289 +0000 UTC m=+2106.887952510" Jan 21 15:36:49 crc kubenswrapper[4707]: I0121 15:36:49.790115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.672478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"55c9c667-afb4-4ace-8bee-173cf43d5a91","Type":"ContainerStarted","Data":"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8"} Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.672565 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-log" containerID="cri-o://0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e" gracePeriod=30 Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.672891 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-httpd" containerID="cri-o://f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8" gracePeriod=30 Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.675862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerStarted","Data":"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae"} Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.688840 4707 generic.go:334] "Generic (PLEG): container finished" podID="f99c92c6-cf0f-406a-ad28-7134cace4dd7" containerID="915ec7465a1a102fa76e8d82b871add4c78fcca54bbabfcfdaf7adb84ced2a0d" exitCode=0 Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.688846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" event={"ID":"f99c92c6-cf0f-406a-ad28-7134cace4dd7","Type":"ContainerDied","Data":"915ec7465a1a102fa76e8d82b871add4c78fcca54bbabfcfdaf7adb84ced2a0d"} Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.697113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"927b7d73-3969-49ed-86e4-df5d5ccfa0f0","Type":"ContainerStarted","Data":"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef"} Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.697160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"927b7d73-3969-49ed-86e4-df5d5ccfa0f0","Type":"ContainerStarted","Data":"c094a123aeb6a8927f5bdb044ac7fbf33917da59f1c72516761d648b21ed0a27"} Jan 21 15:36:50 crc kubenswrapper[4707]: I0121 15:36:50.697649 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.697631245 podStartE2EDuration="3.697631245s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:50.689930765 +0000 
UTC m=+2107.871446988" watchObservedRunningTime="2026-01-21 15:36:50.697631245 +0000 UTC m=+2107.879147467" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.316596 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-config-data\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkcj\" (UniqueName: \"kubernetes.io/projected/55c9c667-afb4-4ace-8bee-173cf43d5a91-kube-api-access-ckkcj\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-combined-ca-bundle\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-httpd-run\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-public-tls-certs\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356883 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-scripts\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-logs\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.356984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"55c9c667-afb4-4ace-8bee-173cf43d5a91\" (UID: \"55c9c667-afb4-4ace-8bee-173cf43d5a91\") " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.357235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.357596 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.359139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-logs" (OuterVolumeSpecName: "logs") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.372485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.377903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-scripts" (OuterVolumeSpecName: "scripts") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.378013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c9c667-afb4-4ace-8bee-173cf43d5a91-kube-api-access-ckkcj" (OuterVolumeSpecName: "kube-api-access-ckkcj") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "kube-api-access-ckkcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.392949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.410598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.418554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-config-data" (OuterVolumeSpecName: "config-data") pod "55c9c667-afb4-4ace-8bee-173cf43d5a91" (UID: "55c9c667-afb4-4ace-8bee-173cf43d5a91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459689 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459719 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459729 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c9c667-afb4-4ace-8bee-173cf43d5a91-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459761 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459770 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459781 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkcj\" (UniqueName: \"kubernetes.io/projected/55c9c667-afb4-4ace-8bee-173cf43d5a91-kube-api-access-ckkcj\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.459791 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c9c667-afb4-4ace-8bee-173cf43d5a91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.475678 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.561606 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.709436 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" containerID="2bad8cee5f896a528e089f0e95328272fc4d1d17326f5b82657dc7c6854aebea" exitCode=0 Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.709534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" event={"ID":"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8","Type":"ContainerDied","Data":"2bad8cee5f896a528e089f0e95328272fc4d1d17326f5b82657dc7c6854aebea"} Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.713979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"927b7d73-3969-49ed-86e4-df5d5ccfa0f0","Type":"ContainerStarted","Data":"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357"} Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.714075 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-log" 
containerID="cri-o://8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef" gracePeriod=30 Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.714205 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-httpd" containerID="cri-o://74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357" gracePeriod=30 Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.719601 4707 generic.go:334] "Generic (PLEG): container finished" podID="88eddb1d-4df9-4afb-bae3-a55b93a6252a" containerID="73b71146f952853bdd6e59449df71c4b5a1958e47e4c4588e0b1ac00e9bd595a" exitCode=0 Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.719676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" event={"ID":"88eddb1d-4df9-4afb-bae3-a55b93a6252a","Type":"ContainerDied","Data":"73b71146f952853bdd6e59449df71c4b5a1958e47e4c4588e0b1ac00e9bd595a"} Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.727841 4707 generic.go:334] "Generic (PLEG): container finished" podID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerID="f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8" exitCode=0 Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.727872 4707 generic.go:334] "Generic (PLEG): container finished" podID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerID="0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e" exitCode=143 Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.727922 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.727959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"55c9c667-afb4-4ace-8bee-173cf43d5a91","Type":"ContainerDied","Data":"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8"} Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.728007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"55c9c667-afb4-4ace-8bee-173cf43d5a91","Type":"ContainerDied","Data":"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e"} Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.728020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"55c9c667-afb4-4ace-8bee-173cf43d5a91","Type":"ContainerDied","Data":"32288c6fb19adab0068168417ca6ae969474a4a0fc4253fcf9145ec13bfc2973"} Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.728037 4707 scope.go:117] "RemoveContainer" containerID="f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.765016 4707 scope.go:117] "RemoveContainer" containerID="0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.792040 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.792025491 podStartE2EDuration="4.792025491s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
15:36:51.78807788 +0000 UTC m=+2108.969594101" watchObservedRunningTime="2026-01-21 15:36:51.792025491 +0000 UTC m=+2108.973541713" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.807872 4707 scope.go:117] "RemoveContainer" containerID="f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8" Jan 21 15:36:51 crc kubenswrapper[4707]: E0121 15:36:51.808317 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8\": container with ID starting with f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8 not found: ID does not exist" containerID="f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.808348 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8"} err="failed to get container status \"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8\": rpc error: code = NotFound desc = could not find container \"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8\": container with ID starting with f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8 not found: ID does not exist" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.808367 4707 scope.go:117] "RemoveContainer" containerID="0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e" Jan 21 15:36:51 crc kubenswrapper[4707]: E0121 15:36:51.808566 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e\": container with ID starting with 0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e not found: ID does not exist" containerID="0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.808623 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e"} err="failed to get container status \"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e\": rpc error: code = NotFound desc = could not find container \"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e\": container with ID starting with 0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e not found: ID does not exist" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.808638 4707 scope.go:117] "RemoveContainer" containerID="f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.808918 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8"} err="failed to get container status \"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8\": rpc error: code = NotFound desc = could not find container \"f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8\": container with ID starting with f184dd5c32ac1893fa4d5a33b204ae5c678c4f5e104a6182afce54b32099b2b8 not found: ID does not exist" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.808931 4707 scope.go:117] "RemoveContainer" 
containerID="0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.814866 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e"} err="failed to get container status \"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e\": rpc error: code = NotFound desc = could not find container \"0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e\": container with ID starting with 0ad53acd853b16a714a3ee2b200a88a284e2b304d8307209bebb72b601948d1e not found: ID does not exist" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.821688 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.863462 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.868773 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:51 crc kubenswrapper[4707]: E0121 15:36:51.869120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-httpd" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.869137 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-httpd" Jan 21 15:36:51 crc kubenswrapper[4707]: E0121 15:36:51.869148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-log" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.869154 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-log" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.869342 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-log" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.869361 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" containerName="glance-httpd" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.870177 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.871747 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.873343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.881359 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-logs\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.967211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc 
kubenswrapper[4707]: I0121 15:36:51.967269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24789\" (UniqueName: \"kubernetes.io/projected/9c93e649-01b5-4e89-8453-106a67bc11ba-kube-api-access-24789\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:51 crc kubenswrapper[4707]: I0121 15:36:51.989839 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.068593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99c92c6-cf0f-406a-ad28-7134cace4dd7-logs\") pod \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.068862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f99c92c6-cf0f-406a-ad28-7134cace4dd7-logs" (OuterVolumeSpecName: "logs") pod "f99c92c6-cf0f-406a-ad28-7134cace4dd7" (UID: "f99c92c6-cf0f-406a-ad28-7134cace4dd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.068940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-combined-ca-bundle\") pod \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.069057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-config-data\") pod \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.069108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-scripts\") pod \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.069583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klj8k\" (UniqueName: \"kubernetes.io/projected/f99c92c6-cf0f-406a-ad28-7134cace4dd7-kube-api-access-klj8k\") pod \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\" (UID: \"f99c92c6-cf0f-406a-ad28-7134cace4dd7\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24789\" (UniqueName: \"kubernetes.io/projected/9c93e649-01b5-4e89-8453-106a67bc11ba-kube-api-access-24789\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-logs\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.070492 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99c92c6-cf0f-406a-ad28-7134cace4dd7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.071241 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.072093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-logs\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.072557 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99c92c6-cf0f-406a-ad28-7134cace4dd7-kube-api-access-klj8k" (OuterVolumeSpecName: "kube-api-access-klj8k") pod "f99c92c6-cf0f-406a-ad28-7134cace4dd7" (UID: "f99c92c6-cf0f-406a-ad28-7134cace4dd7"). InnerVolumeSpecName "kube-api-access-klj8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.076839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.085774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.086709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.132957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.136248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.146349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-scripts" (OuterVolumeSpecName: "scripts") pod "f99c92c6-cf0f-406a-ad28-7134cace4dd7" (UID: "f99c92c6-cf0f-406a-ad28-7134cace4dd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.146394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24789\" (UniqueName: \"kubernetes.io/projected/9c93e649-01b5-4e89-8453-106a67bc11ba-kube-api-access-24789\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.153219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-config-data" (OuterVolumeSpecName: "config-data") pod "f99c92c6-cf0f-406a-ad28-7134cace4dd7" (UID: "f99c92c6-cf0f-406a-ad28-7134cace4dd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.154669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.161692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f99c92c6-cf0f-406a-ad28-7134cace4dd7" (UID: "f99c92c6-cf0f-406a-ad28-7134cace4dd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.172846 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.173510 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.173530 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klj8k\" (UniqueName: \"kubernetes.io/projected/f99c92c6-cf0f-406a-ad28-7134cace4dd7-kube-api-access-klj8k\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.173543 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c92c6-cf0f-406a-ad28-7134cace4dd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.195530 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.587987 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-httpd-run\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-logs\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-internal-tls-certs\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbhxn\" (UniqueName: \"kubernetes.io/projected/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-kube-api-access-pbhxn\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-scripts\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-combined-ca-bundle\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.683436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-config-data\") pod \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\" (UID: \"927b7d73-3969-49ed-86e4-df5d5ccfa0f0\") " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.684795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-logs" (OuterVolumeSpecName: "logs") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.684922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.688887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.690011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-kube-api-access-pbhxn" (OuterVolumeSpecName: "kube-api-access-pbhxn") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "kube-api-access-pbhxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.691913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-scripts" (OuterVolumeSpecName: "scripts") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.715827 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.735501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-config-data" (OuterVolumeSpecName: "config-data") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.755304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.759110 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-central-agent" containerID="cri-o://cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" gracePeriod=30 Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.759145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerStarted","Data":"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c"} Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.759204 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.759244 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="proxy-httpd" containerID="cri-o://1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" gracePeriod=30 Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.759289 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="sg-core" containerID="cri-o://73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" gracePeriod=30 Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.759320 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-notification-agent" containerID="cri-o://7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" gracePeriod=30 Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.764136 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.764254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xwzrg" event={"ID":"f99c92c6-cf0f-406a-ad28-7134cace4dd7","Type":"ContainerDied","Data":"021faba23c683890758004fd1cf02a022429c08a3c0691104a7a7bce27b27b00"} Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.764434 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="021faba23c683890758004fd1cf02a022429c08a3c0691104a7a7bce27b27b00" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779056 4707 generic.go:334] "Generic (PLEG): container finished" podID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerID="74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357" exitCode=0 Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779097 4707 generic.go:334] "Generic (PLEG): container finished" podID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerID="8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef" exitCode=143 Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"927b7d73-3969-49ed-86e4-df5d5ccfa0f0","Type":"ContainerDied","Data":"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357"} Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"927b7d73-3969-49ed-86e4-df5d5ccfa0f0","Type":"ContainerDied","Data":"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef"} Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"927b7d73-3969-49ed-86e4-df5d5ccfa0f0","Type":"ContainerDied","Data":"c094a123aeb6a8927f5bdb044ac7fbf33917da59f1c72516761d648b21ed0a27"} Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779204 4707 scope.go:117] "RemoveContainer" containerID="74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.779362 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785505 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785523 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785533 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbhxn\" (UniqueName: \"kubernetes.io/projected/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-kube-api-access-pbhxn\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785552 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785561 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785569 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.785576 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.789956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "927b7d73-3969-49ed-86e4-df5d5ccfa0f0" (UID: "927b7d73-3969-49ed-86e4-df5d5ccfa0f0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.832974 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.866893788 podStartE2EDuration="5.832957115s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="2026-01-21 15:36:48.012601396 +0000 UTC m=+2105.194117618" lastFinishedPulling="2026-01-21 15:36:51.978664722 +0000 UTC m=+2109.160180945" observedRunningTime="2026-01-21 15:36:52.780373004 +0000 UTC m=+2109.961889226" watchObservedRunningTime="2026-01-21 15:36:52.832957115 +0000 UTC m=+2110.014473337" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.838189 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.843375 4707 scope.go:117] "RemoveContainer" containerID="8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.862486 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6cf4db6586-6qsm2"] Jan 21 15:36:52 crc kubenswrapper[4707]: E0121 15:36:52.862802 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-log" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.862831 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-log" Jan 21 15:36:52 crc kubenswrapper[4707]: E0121 15:36:52.862849 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99c92c6-cf0f-406a-ad28-7134cace4dd7" containerName="placement-db-sync" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.862856 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99c92c6-cf0f-406a-ad28-7134cace4dd7" containerName="placement-db-sync" Jan 21 15:36:52 crc kubenswrapper[4707]: E0121 15:36:52.862870 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-httpd" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.862875 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-httpd" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.863026 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-httpd" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.863042 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" containerName="glance-log" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.863052 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99c92c6-cf0f-406a-ad28-7134cace4dd7" containerName="placement-db-sync" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.865975 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.868734 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-r5ql8" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.868983 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.869115 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.875141 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6cf4db6586-6qsm2"] Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.889534 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b7d73-3969-49ed-86e4-df5d5ccfa0f0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.901086 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.903181 4707 scope.go:117] "RemoveContainer" containerID="74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357" Jan 21 15:36:52 crc kubenswrapper[4707]: E0121 15:36:52.903798 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357\": container with ID starting with 74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357 not found: ID does not exist" containerID="74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.903862 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357"} err="failed to get container status \"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357\": rpc error: code = NotFound desc = could not find container \"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357\": container with ID starting with 74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357 not found: ID does not exist" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.903888 4707 scope.go:117] "RemoveContainer" containerID="8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef" Jan 21 15:36:52 crc kubenswrapper[4707]: E0121 15:36:52.904273 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef\": container with ID starting with 8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef not found: ID does not exist" containerID="8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.904296 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef"} err="failed to get container status \"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef\": rpc error: code = 
NotFound desc = could not find container \"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef\": container with ID starting with 8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef not found: ID does not exist" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.904309 4707 scope.go:117] "RemoveContainer" containerID="74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.904519 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357"} err="failed to get container status \"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357\": rpc error: code = NotFound desc = could not find container \"74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357\": container with ID starting with 74daa8b4870085bd8e6b07cc6e54350a8234a1e6762c79edacd638ca65cd6357 not found: ID does not exist" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.904534 4707 scope.go:117] "RemoveContainer" containerID="8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef" Jan 21 15:36:52 crc kubenswrapper[4707]: I0121 15:36:52.904933 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef"} err="failed to get container status \"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef\": rpc error: code = NotFound desc = could not find container \"8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef\": container with ID starting with 8097b377ff05305d83e60b49706c3453542b81fd35a423d04d3023f9ce6bf5ef not found: ID does not exist" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.002824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-config-data\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.002905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-scripts\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.002931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370b1b6f-b257-4b0d-8d49-d79e90f2577d-logs\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.003220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-combined-ca-bundle\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.003444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hmkvv\" (UniqueName: \"kubernetes.io/projected/370b1b6f-b257-4b0d-8d49-d79e90f2577d-kube-api-access-hmkvv\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.105089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmkvv\" (UniqueName: \"kubernetes.io/projected/370b1b6f-b257-4b0d-8d49-d79e90f2577d-kube-api-access-hmkvv\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.107710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-config-data\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.117540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-scripts\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.117612 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370b1b6f-b257-4b0d-8d49-d79e90f2577d-logs\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.117832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-combined-ca-bundle\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.120691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370b1b6f-b257-4b0d-8d49-d79e90f2577d-logs\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.121626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-config-data\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.122182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmkvv\" (UniqueName: \"kubernetes.io/projected/370b1b6f-b257-4b0d-8d49-d79e90f2577d-kube-api-access-hmkvv\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.125088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-scripts\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.129037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-combined-ca-bundle\") pod \"placement-6cf4db6586-6qsm2\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.132796 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.144407 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.151342 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.153824 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.156931 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.157088 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.165720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.200553 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c9c667-afb4-4ace-8bee-173cf43d5a91" path="/var/lib/kubelet/pods/55c9c667-afb4-4ace-8bee-173cf43d5a91/volumes" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.201678 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927b7d73-3969-49ed-86e4-df5d5ccfa0f0" path="/var/lib/kubelet/pods/927b7d73-3969-49ed-86e4-df5d5ccfa0f0/volumes" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.213974 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.224420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.224581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-logs\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.224998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.225686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.226437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.226511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.226598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.226651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-776p9\" (UniqueName: \"kubernetes.io/projected/bf3d161c-7bfd-42f7-90c1-63d92df34424-kube-api-access-776p9\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-776p9\" (UniqueName: \"kubernetes.io/projected/bf3d161c-7bfd-42f7-90c1-63d92df34424-kube-api-access-776p9\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-logs\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.330852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.331720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.333031 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.335069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-logs\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.340409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.340959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.340963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.345306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.349761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-776p9\" (UniqueName: \"kubernetes.io/projected/bf3d161c-7bfd-42f7-90c1-63d92df34424-kube-api-access-776p9\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.356338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.448381 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.453043 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.455920 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.530019 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7b4f4d8c74-82f2d"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-combined-ca-bundle\") pod \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-fernet-keys\") pod \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-db-sync-config-data\") pod \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7mps\" (UniqueName: \"kubernetes.io/projected/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-kube-api-access-r7mps\") pod \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-scripts\") pod \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-combined-ca-bundle\") pod \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\" (UID: \"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp5dc\" (UniqueName: \"kubernetes.io/projected/88eddb1d-4df9-4afb-bae3-a55b93a6252a-kube-api-access-vp5dc\") pod \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-config-data\") pod \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.539973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-credential-keys\") pod \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\" (UID: \"88eddb1d-4df9-4afb-bae3-a55b93a6252a\") " Jan 21 15:36:53 crc kubenswrapper[4707]: E0121 15:36:53.543286 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="88eddb1d-4df9-4afb-bae3-a55b93a6252a" containerName="keystone-bootstrap" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.543331 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88eddb1d-4df9-4afb-bae3-a55b93a6252a" containerName="keystone-bootstrap" Jan 21 15:36:53 crc kubenswrapper[4707]: E0121 15:36:53.543350 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" containerName="barbican-db-sync" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.543356 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" containerName="barbican-db-sync" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.543733 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" containerName="barbican-db-sync" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.543763 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88eddb1d-4df9-4afb-bae3-a55b93a6252a" containerName="keystone-bootstrap" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.544613 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7b4f4d8c74-82f2d"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.544697 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.546327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88eddb1d-4df9-4afb-bae3-a55b93a6252a-kube-api-access-vp5dc" (OuterVolumeSpecName: "kube-api-access-vp5dc") pod "88eddb1d-4df9-4afb-bae3-a55b93a6252a" (UID: "88eddb1d-4df9-4afb-bae3-a55b93a6252a"). InnerVolumeSpecName "kube-api-access-vp5dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.547658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.547859 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.548302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "88eddb1d-4df9-4afb-bae3-a55b93a6252a" (UID: "88eddb1d-4df9-4afb-bae3-a55b93a6252a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.557992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-kube-api-access-r7mps" (OuterVolumeSpecName: "kube-api-access-r7mps") pod "ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" (UID: "ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8"). InnerVolumeSpecName "kube-api-access-r7mps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.558002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "88eddb1d-4df9-4afb-bae3-a55b93a6252a" (UID: "88eddb1d-4df9-4afb-bae3-a55b93a6252a"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.561010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" (UID: "ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.561658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-scripts" (OuterVolumeSpecName: "scripts") pod "88eddb1d-4df9-4afb-bae3-a55b93a6252a" (UID: "88eddb1d-4df9-4afb-bae3-a55b93a6252a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.587446 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.623337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" (UID: "ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.632051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88eddb1d-4df9-4afb-bae3-a55b93a6252a" (UID: "88eddb1d-4df9-4afb-bae3-a55b93a6252a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.643938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-config-data\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-run-httpd\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-sg-core-conf-yaml\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-scripts\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlj7v\" (UniqueName: \"kubernetes.io/projected/eaea7418-f6a3-4005-a8bf-d92b199d03b2-kube-api-access-xlj7v\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-combined-ca-bundle\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-log-httpd\") pod \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\" (UID: \"eaea7418-f6a3-4005-a8bf-d92b199d03b2\") " Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rgpw\" (UniqueName: \"kubernetes.io/projected/886a48bf-ec30-418c-86d9-9d96b725aa59-kube-api-access-6rgpw\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-config-data\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644839 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-scripts\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-public-tls-certs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-internal-tls-certs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-combined-ca-bundle\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.644975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a48bf-ec30-418c-86d9-9d96b725aa59-logs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645067 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645080 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645092 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp5dc\" (UniqueName: \"kubernetes.io/projected/88eddb1d-4df9-4afb-bae3-a55b93a6252a-kube-api-access-vp5dc\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645102 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645112 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645120 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645129 4707 
reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.645137 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7mps\" (UniqueName: \"kubernetes.io/projected/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8-kube-api-access-r7mps\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.646026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.646305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.651554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaea7418-f6a3-4005-a8bf-d92b199d03b2-kube-api-access-xlj7v" (OuterVolumeSpecName: "kube-api-access-xlj7v") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "kube-api-access-xlj7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.653940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-scripts" (OuterVolumeSpecName: "scripts") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.663437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-config-data" (OuterVolumeSpecName: "config-data") pod "88eddb1d-4df9-4afb-bae3-a55b93a6252a" (UID: "88eddb1d-4df9-4afb-bae3-a55b93a6252a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.690128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.732631 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6cf4db6586-6qsm2"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.739563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.745926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rgpw\" (UniqueName: \"kubernetes.io/projected/886a48bf-ec30-418c-86d9-9d96b725aa59-kube-api-access-6rgpw\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.745963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-config-data\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.745986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-scripts\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-public-tls-certs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-internal-tls-certs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-combined-ca-bundle\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a48bf-ec30-418c-86d9-9d96b725aa59-logs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746176 4707 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746186 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746195 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaea7418-f6a3-4005-a8bf-d92b199d03b2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746203 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746211 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746220 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88eddb1d-4df9-4afb-bae3-a55b93a6252a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746237 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlj7v\" (UniqueName: \"kubernetes.io/projected/eaea7418-f6a3-4005-a8bf-d92b199d03b2-kube-api-access-xlj7v\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.746513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a48bf-ec30-418c-86d9-9d96b725aa59-logs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.750782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-internal-tls-certs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.751164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-public-tls-certs\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.759652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-scripts\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.759799 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-combined-ca-bundle\") pod \"placement-7b4f4d8c74-82f2d\" (UID: 
\"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.763997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rgpw\" (UniqueName: \"kubernetes.io/projected/886a48bf-ec30-418c-86d9-9d96b725aa59-kube-api-access-6rgpw\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.781191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-config-data\") pod \"placement-7b4f4d8c74-82f2d\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.786305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-config-data" (OuterVolumeSpecName: "config-data") pod "eaea7418-f6a3-4005-a8bf-d92b199d03b2" (UID: "eaea7418-f6a3-4005-a8bf-d92b199d03b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.848388 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaea7418-f6a3-4005-a8bf-d92b199d03b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.865564 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.872383 4707 generic.go:334] "Generic (PLEG): container finished" podID="56293b49-445a-404c-8eb7-443b7d9d0170" containerID="f431730553af7e4d97ef4b5ba81a40ace44ae601e09bfd4227bed22314d88fa7" exitCode=0 Jan 21 15:36:53 crc kubenswrapper[4707]: E0121 15:36:53.874091 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-central-agent" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874117 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-central-agent" Jan 21 15:36:53 crc kubenswrapper[4707]: E0121 15:36:53.874146 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="proxy-httpd" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874152 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="proxy-httpd" Jan 21 15:36:53 crc kubenswrapper[4707]: E0121 15:36:53.874169 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-notification-agent" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874176 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-notification-agent" Jan 21 15:36:53 crc kubenswrapper[4707]: E0121 15:36:53.874183 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="sg-core" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874189 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="sg-core" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874450 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="proxy-httpd" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874469 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-notification-agent" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874480 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="ceilometer-central-agent" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerName="sg-core" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.874677 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.875308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" event={"ID":"56293b49-445a-404c-8eb7-443b7d9d0170","Type":"ContainerDied","Data":"f431730553af7e4d97ef4b5ba81a40ace44ae601e09bfd4227bed22314d88fa7"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.875678 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.880259 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.900747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" event={"ID":"88eddb1d-4df9-4afb-bae3-a55b93a6252a","Type":"ContainerDied","Data":"0bcac72fa58ff64e99a340c63b37b7529493e35576bc41ab32ddf189fce08b57"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.900786 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bcac72fa58ff64e99a340c63b37b7529493e35576bc41ab32ddf189fce08b57" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.900871 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mj6vd" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.926565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" event={"ID":"370b1b6f-b257-4b0d-8d49-d79e90f2577d","Type":"ContainerStarted","Data":"dd90c850c1cefaaa38e22049833efff4181cc45cb0764e71a05099c47f1c7f24"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.930878 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.932258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.947597 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.951154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40927271-bdaa-4872-a579-c100648860dd-logs\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.951269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data-custom\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953667 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" exitCode=0 Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953700 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" exitCode=2 Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953709 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" exitCode=0 Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953718 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" exitCode=0 Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerDied","Data":"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerDied","Data":"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerDied","Data":"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerDied","Data":"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"eaea7418-f6a3-4005-a8bf-d92b199d03b2","Type":"ContainerDied","Data":"9087c74239db23aed6b37b4eb8795218c2d7117cb60e96bd2d95303913a5278c"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953855 4707 scope.go:117] "RemoveContainer" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.953975 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.954631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.954721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.954746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.954884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-combined-ca-bundle\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.954985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q44vg\" (UniqueName: \"kubernetes.io/projected/40927271-bdaa-4872-a579-c100648860dd-kube-api-access-q44vg\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.955033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data-custom\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.955070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdqp\" (UniqueName: \"kubernetes.io/projected/767701a0-4f58-4ea4-a625-abc7f426d70b-kube-api-access-btdqp\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.955089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/767701a0-4f58-4ea4-a625-abc7f426d70b-logs\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.976704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9c93e649-01b5-4e89-8453-106a67bc11ba","Type":"ContainerStarted","Data":"07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.976882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9c93e649-01b5-4e89-8453-106a67bc11ba","Type":"ContainerStarted","Data":"c60ba3771db87f16ea472576fcc37d1a1da884134d536b89457156ca088661a3"} Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.985054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm"] Jan 21 15:36:53 crc kubenswrapper[4707]: I0121 15:36:53.999394 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.002176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" event={"ID":"ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8","Type":"ContainerDied","Data":"d329165a54265c1914902c5f66712fa26a01137aa64da14eb54cfb3059404372"} Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.002204 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d329165a54265c1914902c5f66712fa26a01137aa64da14eb54cfb3059404372" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.002269 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-lwmqg" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.011080 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mj6vd"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.020375 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mj6vd"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.026113 4707 scope.go:117] "RemoveContainer" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.047298 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.048752 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.051501 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dpbkq"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.055314 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.055415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.056305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q44vg\" (UniqueName: \"kubernetes.io/projected/40927271-bdaa-4872-a579-c100648860dd-kube-api-access-q44vg\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.063377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data-custom\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.063519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/767701a0-4f58-4ea4-a625-abc7f426d70b-logs\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.063599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdqp\" (UniqueName: \"kubernetes.io/projected/767701a0-4f58-4ea4-a625-abc7f426d70b-kube-api-access-btdqp\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.063827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40927271-bdaa-4872-a579-c100648860dd-logs\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.063931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data-custom\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.064012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.064093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.064155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.064242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/767701a0-4f58-4ea4-a625-abc7f426d70b-logs\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.057693 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.058070 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.064441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-combined-ca-bundle\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.059898 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.068868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40927271-bdaa-4872-a579-c100648860dd-logs\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.059946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.059971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-m5gqr" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.078688 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.081259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data-custom\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.082321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" 
Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.083994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data-custom\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.087774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.092637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q44vg\" (UniqueName: \"kubernetes.io/projected/40927271-bdaa-4872-a579-c100648860dd-kube-api-access-q44vg\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.109798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdqp\" (UniqueName: \"kubernetes.io/projected/767701a0-4f58-4ea4-a625-abc7f426d70b-kube-api-access-btdqp\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.110313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcf9fd7cd-jh5hx\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.110860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-combined-ca-bundle\") pod \"barbican-worker-77c8f47597-7cmwm\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.126239 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.137651 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dpbkq"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.160098 4707 scope.go:117] "RemoveContainer" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.161891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.165965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-combined-ca-bundle\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: 
\"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-scripts\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166059 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-config-data\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-combined-ca-bundle\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjww\" (UniqueName: \"kubernetes.io/projected/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-kube-api-access-rqjww\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-logs\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-fernet-keys\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data-custom\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-credential-keys\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.166621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkt7g\" (UniqueName: \"kubernetes.io/projected/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-kube-api-access-nkt7g\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.168925 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.174782 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.177102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.183370 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.183433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.190923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.240704 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.268301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-logs\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-logs\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-scripts\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-fernet-keys\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data-custom\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-run-httpd\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-credential-keys\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkt7g\" (UniqueName: \"kubernetes.io/projected/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-kube-api-access-nkt7g\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-config-data\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.269979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-log-httpd\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-combined-ca-bundle\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-scripts\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-config-data\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-combined-ca-bundle\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrcr\" (UniqueName: \"kubernetes.io/projected/3401bc45-ba40-4893-b70c-debe5e5bf534-kube-api-access-kfrcr\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjww\" (UniqueName: \"kubernetes.io/projected/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-kube-api-access-rqjww\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " 
pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.270453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.273146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-fernet-keys\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.273454 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data-custom\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.273548 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.273702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-combined-ca-bundle\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.274109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-credential-keys\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.274720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-config-data\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.282282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.283028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkt7g\" (UniqueName: \"kubernetes.io/projected/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-kube-api-access-nkt7g\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.284716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-scripts\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.285413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-combined-ca-bundle\") pod \"keystone-bootstrap-dpbkq\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.285450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjww\" (UniqueName: \"kubernetes.io/projected/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-kube-api-access-rqjww\") pod \"barbican-api-6495d54cb4-mg28h\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.372310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-run-httpd\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.373333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-run-httpd\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.373377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-config-data\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.373497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-log-httpd\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.373601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.373654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrcr\" (UniqueName: \"kubernetes.io/projected/3401bc45-ba40-4893-b70c-debe5e5bf534-kube-api-access-kfrcr\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.373676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.374291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-log-httpd\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.374839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-scripts\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.378902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.379475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-config-data\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.379627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-scripts\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.389513 4707 scope.go:117] "RemoveContainer" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.390241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.397442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrcr\" (UniqueName: \"kubernetes.io/projected/3401bc45-ba40-4893-b70c-debe5e5bf534-kube-api-access-kfrcr\") pod \"ceilometer-0\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.413393 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.430028 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.552743 4707 scope.go:117] "RemoveContainer" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" Jan 21 15:36:54 crc kubenswrapper[4707]: E0121 15:36:54.554375 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": container with ID starting with 1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c not found: ID does not exist" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.554868 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c"} err="failed to get container status \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": rpc error: code = NotFound desc = could not find container \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": container with ID starting with 1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.558200 4707 scope.go:117] "RemoveContainer" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" Jan 21 15:36:54 crc kubenswrapper[4707]: E0121 15:36:54.558874 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": container with ID starting with 73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae not found: ID does not exist" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.558907 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae"} err="failed to get container status \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": rpc error: code = NotFound desc = could not find container \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": container with ID starting with 73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.558935 4707 scope.go:117] "RemoveContainer" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" Jan 21 15:36:54 crc kubenswrapper[4707]: E0121 15:36:54.559288 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": container with ID starting with 7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011 not found: ID does not exist" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.559302 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011"} err="failed to get container status \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": rpc error: code = 
NotFound desc = could not find container \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": container with ID starting with 7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.559317 4707 scope.go:117] "RemoveContainer" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" Jan 21 15:36:54 crc kubenswrapper[4707]: E0121 15:36:54.559649 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": container with ID starting with cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9 not found: ID does not exist" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.559665 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9"} err="failed to get container status \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": rpc error: code = NotFound desc = could not find container \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": container with ID starting with cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.559676 4707 scope.go:117] "RemoveContainer" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.560022 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c"} err="failed to get container status \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": rpc error: code = NotFound desc = could not find container \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": container with ID starting with 1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.560034 4707 scope.go:117] "RemoveContainer" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.560449 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae"} err="failed to get container status \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": rpc error: code = NotFound desc = could not find container \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": container with ID starting with 73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.560484 4707 scope.go:117] "RemoveContainer" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.560963 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011"} err="failed to get container status \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": rpc error: code = 
NotFound desc = could not find container \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": container with ID starting with 7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.560984 4707 scope.go:117] "RemoveContainer" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.561317 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9"} err="failed to get container status \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": rpc error: code = NotFound desc = could not find container \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": container with ID starting with cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.561361 4707 scope.go:117] "RemoveContainer" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.561689 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c"} err="failed to get container status \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": rpc error: code = NotFound desc = could not find container \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": container with ID starting with 1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.561703 4707 scope.go:117] "RemoveContainer" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.565977 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7b4f4d8c74-82f2d"] Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.571045 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae"} err="failed to get container status \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": rpc error: code = NotFound desc = could not find container \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": container with ID starting with 73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.571083 4707 scope.go:117] "RemoveContainer" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.574213 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011"} err="failed to get container status \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": rpc error: code = NotFound desc = could not find container \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": container with ID starting with 7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 
15:36:54.574291 4707 scope.go:117] "RemoveContainer" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.575707 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9"} err="failed to get container status \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": rpc error: code = NotFound desc = could not find container \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": container with ID starting with cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.575755 4707 scope.go:117] "RemoveContainer" containerID="1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.576258 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c"} err="failed to get container status \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": rpc error: code = NotFound desc = could not find container \"1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c\": container with ID starting with 1a7c23337c248d3e5bc248f4150ba1ba2f374b2d596e489fc2ab84d0340e678c not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.576288 4707 scope.go:117] "RemoveContainer" containerID="73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.580055 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae"} err="failed to get container status \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": rpc error: code = NotFound desc = could not find container \"73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae\": container with ID starting with 73da5798267faa59aa5e5f341aa4f78c7a7ad8a40895e5d370f63996035624ae not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.580126 4707 scope.go:117] "RemoveContainer" containerID="7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.580650 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011"} err="failed to get container status \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": rpc error: code = NotFound desc = could not find container \"7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011\": container with ID starting with 7a119546e4533f82ccd6857bb1aa41af6c3c0b0108cebd8116be5b8b9a4b3011 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.580682 4707 scope.go:117] "RemoveContainer" containerID="cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.582703 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9"} err="failed to get container status 
\"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": rpc error: code = NotFound desc = could not find container \"cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9\": container with ID starting with cfdca0eeacb50e8cf07868e573539eadd9bf2a09a82496c2cbb9a66591e32aa9 not found: ID does not exist" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.691824 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:36:54 crc kubenswrapper[4707]: I0121 15:36:54.972836 4707 scope.go:117] "RemoveContainer" containerID="2959c05294ef22b7eddaf9fa7c698a04d0ddd1042daf82642dac605658641a3b" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.055738 4707 scope.go:117] "RemoveContainer" containerID="1a70f6bf277e47355538d924e7369f28b3980ffde1ce60e65e4ba86178bb6cc7" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.065091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" event={"ID":"370b1b6f-b257-4b0d-8d49-d79e90f2577d","Type":"ContainerStarted","Data":"69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f"} Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.065144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" event={"ID":"370b1b6f-b257-4b0d-8d49-d79e90f2577d","Type":"ContainerStarted","Data":"2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474"} Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.066625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.066664 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.073739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm"] Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.089465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx"] Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.099148 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" podStartSLOduration=3.099130214 podStartE2EDuration="3.099130214s" podCreationTimestamp="2026-01-21 15:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:55.097620274 +0000 UTC m=+2112.279136496" watchObservedRunningTime="2026-01-21 15:36:55.099130214 +0000 UTC m=+2112.280646436" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.126270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9c93e649-01b5-4e89-8453-106a67bc11ba","Type":"ContainerStarted","Data":"88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c"} Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.131750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"bf3d161c-7bfd-42f7-90c1-63d92df34424","Type":"ContainerStarted","Data":"d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428"} Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 
15:36:55.131788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"bf3d161c-7bfd-42f7-90c1-63d92df34424","Type":"ContainerStarted","Data":"1f8a968c4580174171f2b3afbda30e29b044bb570b2e623982f61eecce36eb2a"} Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.150488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" event={"ID":"886a48bf-ec30-418c-86d9-9d96b725aa59","Type":"ContainerStarted","Data":"7fcfea5a91aa187e427c2bdc47ba7952faee7441564ea963143811ec3cd2307c"} Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.215794 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.21577692 podStartE2EDuration="4.21577692s" podCreationTimestamp="2026-01-21 15:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:55.197340695 +0000 UTC m=+2112.378856907" watchObservedRunningTime="2026-01-21 15:36:55.21577692 +0000 UTC m=+2112.397293142" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.242505 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88eddb1d-4df9-4afb-bae3-a55b93a6252a" path="/var/lib/kubelet/pods/88eddb1d-4df9-4afb-bae3-a55b93a6252a/volumes" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.244066 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaea7418-f6a3-4005-a8bf-d92b199d03b2" path="/var/lib/kubelet/pods/eaea7418-f6a3-4005-a8bf-d92b199d03b2/volumes" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.366912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dpbkq"] Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.530407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h"] Jan 21 15:36:55 crc kubenswrapper[4707]: W0121 15:36:55.539277 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd38bf7c9_fa00_4dc5_abaa_446877bcbe60.slice/crio-3b7ba521f4de330a4d368c27142ba028fddf46712d0bd71b953049b50ca3f6c8 WatchSource:0}: Error finding container 3b7ba521f4de330a4d368c27142ba028fddf46712d0bd71b953049b50ca3f6c8: Status 404 returned error can't find the container with id 3b7ba521f4de330a4d368c27142ba028fddf46712d0bd71b953049b50ca3f6c8 Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.661966 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.665475 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.687019 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56293b49-445a-404c-8eb7-443b7d9d0170-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "56293b49-445a-404c-8eb7-443b7d9d0170" (UID: "56293b49-445a-404c-8eb7-443b7d9d0170"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56293b49-445a-404c-8eb7-443b7d9d0170-etc-machine-id\") pod \"56293b49-445a-404c-8eb7-443b7d9d0170\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-scripts\") pod \"56293b49-445a-404c-8eb7-443b7d9d0170\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6q6\" (UniqueName: \"kubernetes.io/projected/56293b49-445a-404c-8eb7-443b7d9d0170-kube-api-access-rn6q6\") pod \"56293b49-445a-404c-8eb7-443b7d9d0170\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-combined-ca-bundle\") pod \"56293b49-445a-404c-8eb7-443b7d9d0170\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732922 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-config-data\") pod \"56293b49-445a-404c-8eb7-443b7d9d0170\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.732963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-db-sync-config-data\") pod \"56293b49-445a-404c-8eb7-443b7d9d0170\" (UID: \"56293b49-445a-404c-8eb7-443b7d9d0170\") " Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.738445 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "56293b49-445a-404c-8eb7-443b7d9d0170" (UID: "56293b49-445a-404c-8eb7-443b7d9d0170"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.745891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56293b49-445a-404c-8eb7-443b7d9d0170-kube-api-access-rn6q6" (OuterVolumeSpecName: "kube-api-access-rn6q6") pod "56293b49-445a-404c-8eb7-443b7d9d0170" (UID: "56293b49-445a-404c-8eb7-443b7d9d0170"). InnerVolumeSpecName "kube-api-access-rn6q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.781296 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-scripts" (OuterVolumeSpecName: "scripts") pod "56293b49-445a-404c-8eb7-443b7d9d0170" (UID: "56293b49-445a-404c-8eb7-443b7d9d0170"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.797014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56293b49-445a-404c-8eb7-443b7d9d0170" (UID: "56293b49-445a-404c-8eb7-443b7d9d0170"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.817880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-config-data" (OuterVolumeSpecName: "config-data") pod "56293b49-445a-404c-8eb7-443b7d9d0170" (UID: "56293b49-445a-404c-8eb7-443b7d9d0170"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.835026 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.835055 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.835066 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.835075 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56293b49-445a-404c-8eb7-443b7d9d0170-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.835083 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56293b49-445a-404c-8eb7-443b7d9d0170-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:55 crc kubenswrapper[4707]: I0121 15:36:55.835091 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6q6\" (UniqueName: \"kubernetes.io/projected/56293b49-445a-404c-8eb7-443b7d9d0170-kube-api-access-rn6q6\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.118075 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:36:56 crc kubenswrapper[4707]: E0121 15:36:56.118393 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56293b49-445a-404c-8eb7-443b7d9d0170" containerName="cinder-db-sync" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.118405 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="56293b49-445a-404c-8eb7-443b7d9d0170" containerName="cinder-db-sync" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.118609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="56293b49-445a-404c-8eb7-443b7d9d0170" containerName="cinder-db-sync" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.119436 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.127919 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.144126 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.160747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.161120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr75n\" (UniqueName: \"kubernetes.io/projected/984386a6-8ce2-4d73-9d17-0654599439aa-kube-api-access-cr75n\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.161179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.161254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/984386a6-8ce2-4d73-9d17-0654599439aa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.161274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-scripts\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.161447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.175000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" event={"ID":"767701a0-4f58-4ea4-a625-abc7f426d70b","Type":"ContainerStarted","Data":"85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.175046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" event={"ID":"767701a0-4f58-4ea4-a625-abc7f426d70b","Type":"ContainerStarted","Data":"5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 
15:36:56.175058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" event={"ID":"767701a0-4f58-4ea4-a625-abc7f426d70b","Type":"ContainerStarted","Data":"09709386cbaf9ea6ae35f27c8dcdb19eda38edc8854a3017806f36d74010b0dc"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.180965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" event={"ID":"40927271-bdaa-4872-a579-c100648860dd","Type":"ContainerStarted","Data":"4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.180992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" event={"ID":"40927271-bdaa-4872-a579-c100648860dd","Type":"ContainerStarted","Data":"2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.181003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" event={"ID":"40927271-bdaa-4872-a579-c100648860dd","Type":"ContainerStarted","Data":"47b186c7ba354be7dc9056b266f41d519315c491a5b4a1ff37edf9002cf8e69f"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.182791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" event={"ID":"886a48bf-ec30-418c-86d9-9d96b725aa59","Type":"ContainerStarted","Data":"43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.182835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" event={"ID":"886a48bf-ec30-418c-86d9-9d96b725aa59","Type":"ContainerStarted","Data":"292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.183651 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.200318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.200344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.201058 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.201857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" event={"ID":"0e9eed66-7b3c-4e13-8083-7cfb9a92c178","Type":"ContainerStarted","Data":"9855407ae307baba14bb2d5bf299fe60a15470f4bae1dd8811b5a4cd3662c164"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.201884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" event={"ID":"0e9eed66-7b3c-4e13-8083-7cfb9a92c178","Type":"ContainerStarted","Data":"53d39996e2d1a423ae4037ccc2ad28d76b1980c252edd05020c1599671b71e61"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.229165 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.236754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" event={"ID":"d38bf7c9-fa00-4dc5-abaa-446877bcbe60","Type":"ContainerStarted","Data":"2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.236857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" event={"ID":"d38bf7c9-fa00-4dc5-abaa-446877bcbe60","Type":"ContainerStarted","Data":"39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.236881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" event={"ID":"d38bf7c9-fa00-4dc5-abaa-446877bcbe60","Type":"ContainerStarted","Data":"3b7ba521f4de330a4d368c27142ba028fddf46712d0bd71b953049b50ca3f6c8"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.237681 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.237897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.243257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerStarted","Data":"92cedbd193becec37b552edf91342663f674f348c474127919863971595574bb"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.248907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" event={"ID":"56293b49-445a-404c-8eb7-443b7d9d0170","Type":"ContainerDied","Data":"2eea8c3f81f780c9113094b65c01e2d3826790e7f9cda7b122c74136f584f81e"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.249034 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eea8c3f81f780c9113094b65c01e2d3826790e7f9cda7b122c74136f584f81e" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.249124 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-q7kt6" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.258034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"bf3d161c-7bfd-42f7-90c1-63d92df34424","Type":"ContainerStarted","Data":"dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97"} Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-scripts\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr75n\" (UniqueName: \"kubernetes.io/projected/984386a6-8ce2-4d73-9d17-0654599439aa-kube-api-access-cr75n\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m2mc\" (UniqueName: \"kubernetes.io/projected/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-kube-api-access-8m2mc\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 
15:36:56.264761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/984386a6-8ce2-4d73-9d17-0654599439aa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.264832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-scripts\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.265022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-logs\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.265085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.265189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.266733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/984386a6-8ce2-4d73-9d17-0654599439aa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.344435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.350525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.353504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr75n\" (UniqueName: \"kubernetes.io/projected/984386a6-8ce2-4d73-9d17-0654599439aa-kube-api-access-cr75n\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.354049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.354390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-scripts\") pod \"cinder-scheduler-0\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.354491 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.368109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-logs\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.368162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.368309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.368429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.369668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-scripts\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.369825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m2mc\" (UniqueName: \"kubernetes.io/projected/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-kube-api-access-8m2mc\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.369872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.369934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data-custom\") pod \"cinder-api-0\" 
(UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.368545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-logs\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.374194 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" podStartSLOduration=3.374181714 podStartE2EDuration="3.374181714s" podCreationTimestamp="2026-01-21 15:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:56.202041115 +0000 UTC m=+2113.383557337" watchObservedRunningTime="2026-01-21 15:36:56.374181714 +0000 UTC m=+2113.555697935" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.387436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-scripts\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.389883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.390033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.398328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m2mc\" (UniqueName: \"kubernetes.io/projected/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-kube-api-access-8m2mc\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.408054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.408396 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" podStartSLOduration=3.408369316 podStartE2EDuration="3.408369316s" podCreationTimestamp="2026-01-21 15:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:56.267175183 +0000 UTC m=+2113.448691405" watchObservedRunningTime="2026-01-21 15:36:56.408369316 +0000 UTC m=+2113.589885537" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.412878 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" podStartSLOduration=3.412868505 podStartE2EDuration="3.412868505s" podCreationTimestamp="2026-01-21 15:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:56.341254051 +0000 UTC m=+2113.522770274" watchObservedRunningTime="2026-01-21 15:36:56.412868505 +0000 UTC m=+2113.594384727" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.420295 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" podStartSLOduration=3.420278438 podStartE2EDuration="3.420278438s" podCreationTimestamp="2026-01-21 15:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:56.385067062 +0000 UTC m=+2113.566583283" watchObservedRunningTime="2026-01-21 15:36:56.420278438 +0000 UTC m=+2113.601794661" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.437661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.438939 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" podStartSLOduration=3.43892054 podStartE2EDuration="3.43892054s" podCreationTimestamp="2026-01-21 15:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:56.407476016 +0000 UTC m=+2113.588992239" watchObservedRunningTime="2026-01-21 15:36:56.43892054 +0000 UTC m=+2113.620436762" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.443084 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.443077526 podStartE2EDuration="3.443077526s" podCreationTimestamp="2026-01-21 15:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:56.432115936 +0000 UTC m=+2113.613632158" watchObservedRunningTime="2026-01-21 15:36:56.443077526 +0000 UTC m=+2113.624593739" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.560490 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:36:56 crc kubenswrapper[4707]: I0121 15:36:56.852678 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:36:56 crc kubenswrapper[4707]: W0121 15:36:56.860885 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c66fb33_df23_49e6_b3b4_a42bdbf0b1ce.slice/crio-95b1dd42d17166408b4a5b1a7d5b6b37a601c70ddaea8e46eae3071f9d7c5a16 WatchSource:0}: Error finding container 95b1dd42d17166408b4a5b1a7d5b6b37a601c70ddaea8e46eae3071f9d7c5a16: Status 404 returned error can't find the container with id 95b1dd42d17166408b4a5b1a7d5b6b37a601c70ddaea8e46eae3071f9d7c5a16 Jan 21 15:36:57 crc kubenswrapper[4707]: I0121 15:36:57.013738 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:36:57 crc kubenswrapper[4707]: W0121 15:36:57.017069 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod984386a6_8ce2_4d73_9d17_0654599439aa.slice/crio-04c4d15a7c00c3d6a2167302c6c8d29d4b96452909390e7ba4710f560a1d077c WatchSource:0}: Error finding container 04c4d15a7c00c3d6a2167302c6c8d29d4b96452909390e7ba4710f560a1d077c: Status 404 returned error can't find the container with id 04c4d15a7c00c3d6a2167302c6c8d29d4b96452909390e7ba4710f560a1d077c Jan 21 15:36:57 crc kubenswrapper[4707]: I0121 15:36:57.283107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce","Type":"ContainerStarted","Data":"95b1dd42d17166408b4a5b1a7d5b6b37a601c70ddaea8e46eae3071f9d7c5a16"} Jan 21 15:36:57 crc kubenswrapper[4707]: I0121 15:36:57.285406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerStarted","Data":"4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c"} Jan 21 15:36:57 crc kubenswrapper[4707]: I0121 15:36:57.287744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"984386a6-8ce2-4d73-9d17-0654599439aa","Type":"ContainerStarted","Data":"04c4d15a7c00c3d6a2167302c6c8d29d4b96452909390e7ba4710f560a1d077c"} Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.297754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerStarted","Data":"2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6"} Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.298249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerStarted","Data":"be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143"} Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.301484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce","Type":"ContainerStarted","Data":"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7"} Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.301535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" 
event={"ID":"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce","Type":"ContainerStarted","Data":"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7"} Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.301567 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.307062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"984386a6-8ce2-4d73-9d17-0654599439aa","Type":"ContainerStarted","Data":"86f11fcd5fba98de8bdba84643f8037cbe04cfb15e4057fa1e54c13ebce243b6"} Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.322259 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.322243149 podStartE2EDuration="2.322243149s" podCreationTimestamp="2026-01-21 15:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:58.317379905 +0000 UTC m=+2115.498896127" watchObservedRunningTime="2026-01-21 15:36:58.322243149 +0000 UTC m=+2115.503759372" Jan 21 15:36:58 crc kubenswrapper[4707]: I0121 15:36:58.341532 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.341514374 podStartE2EDuration="2.341514374s" podCreationTimestamp="2026-01-21 15:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:36:58.332477412 +0000 UTC m=+2115.513993634" watchObservedRunningTime="2026-01-21 15:36:58.341514374 +0000 UTC m=+2115.523030595" Jan 21 15:36:59 crc kubenswrapper[4707]: I0121 15:36:59.317332 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e9eed66-7b3c-4e13-8083-7cfb9a92c178" containerID="9855407ae307baba14bb2d5bf299fe60a15470f4bae1dd8811b5a4cd3662c164" exitCode=0 Jan 21 15:36:59 crc kubenswrapper[4707]: I0121 15:36:59.317430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" event={"ID":"0e9eed66-7b3c-4e13-8083-7cfb9a92c178","Type":"ContainerDied","Data":"9855407ae307baba14bb2d5bf299fe60a15470f4bae1dd8811b5a4cd3662c164"} Jan 21 15:36:59 crc kubenswrapper[4707]: I0121 15:36:59.320392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"984386a6-8ce2-4d73-9d17-0654599439aa","Type":"ContainerStarted","Data":"35e5a074ef992f0ac99a7b67323fef094c2c86061197f54a16fb0be2a696fcee"} Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.329019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerStarted","Data":"0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0"} Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.329333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.330141 4707 generic.go:334] "Generic (PLEG): container finished" podID="8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" containerID="cee8fc65d92a4c449380c20ddd4feeb07fa1354e54366a6b28e01b8d807032e6" exitCode=0 Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.330212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" event={"ID":"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020","Type":"ContainerDied","Data":"cee8fc65d92a4c449380c20ddd4feeb07fa1354e54366a6b28e01b8d807032e6"} Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.350823 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.929848152 podStartE2EDuration="6.350792983s" podCreationTimestamp="2026-01-21 15:36:54 +0000 UTC" firstStartedPulling="2026-01-21 15:36:55.686794897 +0000 UTC m=+2112.868311119" lastFinishedPulling="2026-01-21 15:36:59.107739728 +0000 UTC m=+2116.289255950" observedRunningTime="2026-01-21 15:37:00.346564773 +0000 UTC m=+2117.528080995" watchObservedRunningTime="2026-01-21 15:37:00.350792983 +0000 UTC m=+2117.532309205" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.651574 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.761741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-credential-keys\") pod \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.761776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-combined-ca-bundle\") pod \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.761843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-scripts\") pod \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.762501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkt7g\" (UniqueName: \"kubernetes.io/projected/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-kube-api-access-nkt7g\") pod \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.762705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-fernet-keys\") pod \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.762737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-config-data\") pod \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\" (UID: \"0e9eed66-7b3c-4e13-8083-7cfb9a92c178\") " Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.767340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-scripts" (OuterVolumeSpecName: "scripts") pod "0e9eed66-7b3c-4e13-8083-7cfb9a92c178" (UID: "0e9eed66-7b3c-4e13-8083-7cfb9a92c178"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.767751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0e9eed66-7b3c-4e13-8083-7cfb9a92c178" (UID: "0e9eed66-7b3c-4e13-8083-7cfb9a92c178"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.767901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0e9eed66-7b3c-4e13-8083-7cfb9a92c178" (UID: "0e9eed66-7b3c-4e13-8083-7cfb9a92c178"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.779373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-kube-api-access-nkt7g" (OuterVolumeSpecName: "kube-api-access-nkt7g") pod "0e9eed66-7b3c-4e13-8083-7cfb9a92c178" (UID: "0e9eed66-7b3c-4e13-8083-7cfb9a92c178"). InnerVolumeSpecName "kube-api-access-nkt7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.786565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e9eed66-7b3c-4e13-8083-7cfb9a92c178" (UID: "0e9eed66-7b3c-4e13-8083-7cfb9a92c178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.788538 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-config-data" (OuterVolumeSpecName: "config-data") pod "0e9eed66-7b3c-4e13-8083-7cfb9a92c178" (UID: "0e9eed66-7b3c-4e13-8083-7cfb9a92c178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.864836 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.865044 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.865054 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.865065 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.865075 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:00 crc kubenswrapper[4707]: I0121 15:37:00.865084 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkt7g\" (UniqueName: \"kubernetes.io/projected/0e9eed66-7b3c-4e13-8083-7cfb9a92c178-kube-api-access-nkt7g\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.338853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" event={"ID":"0e9eed66-7b3c-4e13-8083-7cfb9a92c178","Type":"ContainerDied","Data":"53d39996e2d1a423ae4037ccc2ad28d76b1980c252edd05020c1599671b71e61"} Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.338905 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d39996e2d1a423ae4037ccc2ad28d76b1980c252edd05020c1599671b71e61" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.338906 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-dpbkq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.470634 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6cfc89d475-9v9xq"] Jan 21 15:37:01 crc kubenswrapper[4707]: E0121 15:37:01.470974 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9eed66-7b3c-4e13-8083-7cfb9a92c178" containerName="keystone-bootstrap" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.470987 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9eed66-7b3c-4e13-8083-7cfb9a92c178" containerName="keystone-bootstrap" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.471167 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9eed66-7b3c-4e13-8083-7cfb9a92c178" containerName="keystone-bootstrap" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.471771 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.474583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.474590 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.474626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.474753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.474867 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-m5gqr" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.474983 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.487290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6cfc89d475-9v9xq"] Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.533953 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.534145 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api-log" containerID="cri-o://721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7" gracePeriod=30 Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.534498 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api" containerID="cri-o://fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7" gracePeriod=30 Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.567925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.576552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-config-data\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.576595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-credential-keys\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.576774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-internal-tls-certs\") pod \"keystone-6cfc89d475-9v9xq\" (UID: 
\"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.576878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-fernet-keys\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.576906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-public-tls-certs\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.576958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snnh7\" (UniqueName: \"kubernetes.io/projected/324409f5-db7f-41c2-9004-d9c4d10e1e0e-kube-api-access-snnh7\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.577106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-combined-ca-bundle\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.577186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-scripts\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-internal-tls-certs\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-fernet-keys\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-public-tls-certs\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snnh7\" (UniqueName: 
\"kubernetes.io/projected/324409f5-db7f-41c2-9004-d9c4d10e1e0e-kube-api-access-snnh7\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-combined-ca-bundle\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-scripts\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-config-data\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.679938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-credential-keys\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.685379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-combined-ca-bundle\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.685841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-scripts\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.685890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-public-tls-certs\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.686297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-internal-tls-certs\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.689597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-config-data\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.689776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-fernet-keys\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.690894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-credential-keys\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.698946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snnh7\" (UniqueName: \"kubernetes.io/projected/324409f5-db7f-41c2-9004-d9c4d10e1e0e-kube-api-access-snnh7\") pod \"keystone-6cfc89d475-9v9xq\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.789032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:01 crc kubenswrapper[4707]: I0121 15:37:01.901977 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.004628 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.087553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-combined-ca-bundle\") pod \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.087698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-config\") pod \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.087780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsmmz\" (UniqueName: \"kubernetes.io/projected/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-kube-api-access-tsmmz\") pod \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\" (UID: \"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.092308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-kube-api-access-tsmmz" (OuterVolumeSpecName: "kube-api-access-tsmmz") pod "8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" (UID: "8157a3d5-13f8-4842-9fcd-6bd3d5f4a020"). InnerVolumeSpecName "kube-api-access-tsmmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.134520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-config" (OuterVolumeSpecName: "config") pod "8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" (UID: "8157a3d5-13f8-4842-9fcd-6bd3d5f4a020"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.141141 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-867d686c44-7f4dj"] Jan 21 15:37:02 crc kubenswrapper[4707]: E0121 15:37:02.141915 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.141937 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api" Jan 21 15:37:02 crc kubenswrapper[4707]: E0121 15:37:02.141987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" containerName="neutron-db-sync" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.141993 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" containerName="neutron-db-sync" Jan 21 15:37:02 crc kubenswrapper[4707]: E0121 15:37:02.142026 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api-log" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.142034 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api-log" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.142618 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.142648 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerName="cinder-api-log" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.142670 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" containerName="neutron-db-sync" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.143968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" (UID: "8157a3d5-13f8-4842-9fcd-6bd3d5f4a020"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.144054 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.145922 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.149700 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.153557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-867d686c44-7f4dj"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.190890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.190996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-combined-ca-bundle\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.191282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m2mc\" (UniqueName: \"kubernetes.io/projected/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-kube-api-access-8m2mc\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.191337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-scripts\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.191353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data-custom\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.191659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-logs\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.191819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-etc-machine-id\") pod \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\" (UID: \"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce\") " Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.193427 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.193444 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.193510 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsmmz\" (UniqueName: \"kubernetes.io/projected/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020-kube-api-access-tsmmz\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.193551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.199165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-logs" (OuterVolumeSpecName: "logs") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.199350 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.199371 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.204416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-scripts" (OuterVolumeSpecName: "scripts") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.209020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-kube-api-access-8m2mc" (OuterVolumeSpecName: "kube-api-access-8m2mc") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "kube-api-access-8m2mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.211107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.220661 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6cfc89d475-9v9xq"] Jan 21 15:37:02 crc kubenswrapper[4707]: W0121 15:37:02.230442 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod324409f5_db7f_41c2_9004_d9c4d10e1e0e.slice/crio-5e582727291c318d43862ce2372edd257d145e7bdddbd0587bc647696d9cf276 WatchSource:0}: Error finding container 5e582727291c318d43862ce2372edd257d145e7bdddbd0587bc647696d9cf276: Status 404 returned error can't find the container with id 5e582727291c318d43862ce2372edd257d145e7bdddbd0587bc647696d9cf276 Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.233384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.240153 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.240537 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.279842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data" (OuterVolumeSpecName: "config-data") pod "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" (UID: "5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.297643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0633364-ad88-4b44-bc23-3e2548ffb3d3-logs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.298173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-public-tls-certs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.298317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data-custom\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.298392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-combined-ca-bundle\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.298577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhrb\" (UniqueName: \"kubernetes.io/projected/c0633364-ad88-4b44-bc23-3e2548ffb3d3-kube-api-access-dnhrb\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.298615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-internal-tls-certs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.298675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.299055 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.299076 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc 
kubenswrapper[4707]: I0121 15:37:02.299086 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.299095 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.299103 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.299112 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.299121 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m2mc\" (UniqueName: \"kubernetes.io/projected/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce-kube-api-access-8m2mc\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.351237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" event={"ID":"324409f5-db7f-41c2-9004-d9c4d10e1e0e","Type":"ContainerStarted","Data":"5e582727291c318d43862ce2372edd257d145e7bdddbd0587bc647696d9cf276"} Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.355912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" event={"ID":"8157a3d5-13f8-4842-9fcd-6bd3d5f4a020","Type":"ContainerDied","Data":"b5ff2cd59756900eca26d7cdb1a41e8c9b570c3fe333ac5cf279f6b593828c5d"} Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.355945 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5ff2cd59756900eca26d7cdb1a41e8c9b570c3fe333ac5cf279f6b593828c5d" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.356027 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9m9rb" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.357915 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerID="fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7" exitCode=0 Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.357939 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" containerID="721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7" exitCode=143 Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.357975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce","Type":"ContainerDied","Data":"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7"} Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.358002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce","Type":"ContainerDied","Data":"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7"} Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.358013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce","Type":"ContainerDied","Data":"95b1dd42d17166408b4a5b1a7d5b6b37a601c70ddaea8e46eae3071f9d7c5a16"} Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.358028 4707 scope.go:117] "RemoveContainer" containerID="fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.357981 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.358484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.358516 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.392739 4707 scope.go:117] "RemoveContainer" containerID="721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.400668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-public-tls-certs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.400738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data-custom\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.400789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-combined-ca-bundle\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.401500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhrb\" (UniqueName: \"kubernetes.io/projected/c0633364-ad88-4b44-bc23-3e2548ffb3d3-kube-api-access-dnhrb\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.401523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-internal-tls-certs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.401545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.401589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0633364-ad88-4b44-bc23-3e2548ffb3d3-logs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.401996 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0633364-ad88-4b44-bc23-3e2548ffb3d3-logs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.406751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-combined-ca-bundle\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.407169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data-custom\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.409010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-public-tls-certs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.414361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-internal-tls-certs\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.415380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.415625 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.431864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnhrb\" (UniqueName: \"kubernetes.io/projected/c0633364-ad88-4b44-bc23-3e2548ffb3d3-kube-api-access-dnhrb\") pod \"barbican-api-867d686c44-7f4dj\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.434976 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.444660 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.449112 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.453769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.459574 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.459743 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.459788 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.464988 4707 scope.go:117] "RemoveContainer" containerID="fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7" Jan 21 15:37:02 crc kubenswrapper[4707]: E0121 15:37:02.468143 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7\": container with ID starting with fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7 not found: ID does not exist" containerID="fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.468172 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7"} err="failed to get container status \"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7\": rpc error: code = NotFound desc = could not find container \"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7\": container with ID starting with fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7 not found: ID does not exist" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.468190 4707 scope.go:117] "RemoveContainer" containerID="721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7" Jan 21 15:37:02 crc kubenswrapper[4707]: E0121 15:37:02.472158 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7\": container with ID starting with 721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7 not found: ID does not exist" containerID="721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.472183 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7"} err="failed to get container status \"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7\": rpc error: code = NotFound desc = could not find container \"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7\": container with ID starting with 721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7 not found: ID does not exist" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.472198 4707 scope.go:117] "RemoveContainer" containerID="fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.472473 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7"} err="failed to get container status \"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7\": rpc error: code = NotFound desc = could not find container \"fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7\": container with ID starting with fc6fb35bbaca8205cb3ac6b44d20563d0b7fd2a0c5c2cd9343f9f8455408d1d7 not found: ID does not exist" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.472495 4707 scope.go:117] "RemoveContainer" containerID="721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.473521 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7"} err="failed to get container status \"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7\": rpc error: code = NotFound desc = could not find container \"721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7\": container with ID starting with 721d6d42d39e70282fe43ae76dda7d37a07643bfc6d5c78aa6578997149b50f7 not found: ID does not exist" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.474048 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.505624 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-57d48858d5-rdc5w"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.507034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.509299 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-6fz9m" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.509480 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.513368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.513484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.544671 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-57d48858d5-rdc5w"] Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-config\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwxq\" (UniqueName: \"kubernetes.io/projected/487d2c84-f1f6-4b88-9188-49e704aeb43a-kube-api-access-wgwxq\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605876 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-scripts\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-httpd-config\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-ovndb-tls-certs\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487d2c84-f1f6-4b88-9188-49e704aeb43a-logs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.605992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.606012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-combined-ca-bundle\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.606055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.606068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.606089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmccn\" (UniqueName: \"kubernetes.io/projected/0799c4c9-1d73-46a2-9b0b-0f225aeba225-kube-api-access-bmccn\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 
15:37:02.606146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data-custom\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.606214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487d2c84-f1f6-4b88-9188-49e704aeb43a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.606242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-config\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwxq\" (UniqueName: \"kubernetes.io/projected/487d2c84-f1f6-4b88-9188-49e704aeb43a-kube-api-access-wgwxq\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-scripts\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-httpd-config\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-ovndb-tls-certs\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/487d2c84-f1f6-4b88-9188-49e704aeb43a-logs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-combined-ca-bundle\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmccn\" (UniqueName: \"kubernetes.io/projected/0799c4c9-1d73-46a2-9b0b-0f225aeba225-kube-api-access-bmccn\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data-custom\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487d2c84-f1f6-4b88-9188-49e704aeb43a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.707839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487d2c84-f1f6-4b88-9188-49e704aeb43a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.708196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487d2c84-f1f6-4b88-9188-49e704aeb43a-logs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 
15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.719518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-ovndb-tls-certs\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.726693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.726693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data-custom\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.727220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-combined-ca-bundle\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.727388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-config\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.737800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.738588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.739061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-httpd-config\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.744596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-scripts\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.747002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-internal-tls-certs\") 
pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.749428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwxq\" (UniqueName: \"kubernetes.io/projected/487d2c84-f1f6-4b88-9188-49e704aeb43a-kube-api-access-wgwxq\") pod \"cinder-api-0\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.752012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmccn\" (UniqueName: \"kubernetes.io/projected/0799c4c9-1d73-46a2-9b0b-0f225aeba225-kube-api-access-bmccn\") pod \"neutron-57d48858d5-rdc5w\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.772824 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:02 crc kubenswrapper[4707]: I0121 15:37:02.835905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.111456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-867d686c44-7f4dj"] Jan 21 15:37:03 crc kubenswrapper[4707]: W0121 15:37:03.120999 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0633364_ad88_4b44_bc23_3e2548ffb3d3.slice/crio-694266690a95a71432e331bfbb399389f375307a96d01c7cf73b02872aeae289 WatchSource:0}: Error finding container 694266690a95a71432e331bfbb399389f375307a96d01c7cf73b02872aeae289: Status 404 returned error can't find the container with id 694266690a95a71432e331bfbb399389f375307a96d01c7cf73b02872aeae289 Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.190238 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce" path="/var/lib/kubelet/pods/5c66fb33-df23-49e6-b3b4-a42bdbf0b1ce/volumes" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.208610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:37:03 crc kubenswrapper[4707]: W0121 15:37:03.209506 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod487d2c84_f1f6_4b88_9188_49e704aeb43a.slice/crio-9e5810e83405bceb5ec184f8064af3ac8b78021832b246e60622ea38317a7ed2 WatchSource:0}: Error finding container 9e5810e83405bceb5ec184f8064af3ac8b78021832b246e60622ea38317a7ed2: Status 404 returned error can't find the container with id 9e5810e83405bceb5ec184f8064af3ac8b78021832b246e60622ea38317a7ed2 Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.269875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-57d48858d5-rdc5w"] Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.403245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"487d2c84-f1f6-4b88-9188-49e704aeb43a","Type":"ContainerStarted","Data":"9e5810e83405bceb5ec184f8064af3ac8b78021832b246e60622ea38317a7ed2"} Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.408670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" event={"ID":"c0633364-ad88-4b44-bc23-3e2548ffb3d3","Type":"ContainerStarted","Data":"9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827"} Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.408715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" event={"ID":"c0633364-ad88-4b44-bc23-3e2548ffb3d3","Type":"ContainerStarted","Data":"694266690a95a71432e331bfbb399389f375307a96d01c7cf73b02872aeae289"} Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.413329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" event={"ID":"0799c4c9-1d73-46a2-9b0b-0f225aeba225","Type":"ContainerStarted","Data":"bb8cf85ce118e38c4ded0436c517dd84bdb339bed9afec250b5bf7ba011a5d13"} Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.418899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" event={"ID":"324409f5-db7f-41c2-9004-d9c4d10e1e0e","Type":"ContainerStarted","Data":"9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577"} Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.419959 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.448641 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.448680 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.490371 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.493471 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:03 crc kubenswrapper[4707]: I0121 15:37:03.521459 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" podStartSLOduration=2.521445219 podStartE2EDuration="2.521445219s" podCreationTimestamp="2026-01-21 15:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:03.446437285 +0000 UTC m=+2120.627953506" watchObservedRunningTime="2026-01-21 15:37:03.521445219 +0000 UTC m=+2120.702961441" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.433454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" event={"ID":"c0633364-ad88-4b44-bc23-3e2548ffb3d3","Type":"ContainerStarted","Data":"514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d"} Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.433838 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.433852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.435889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" event={"ID":"0799c4c9-1d73-46a2-9b0b-0f225aeba225","Type":"ContainerStarted","Data":"f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402"} Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.435926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" event={"ID":"0799c4c9-1d73-46a2-9b0b-0f225aeba225","Type":"ContainerStarted","Data":"7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82"} Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.436038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.440689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"487d2c84-f1f6-4b88-9188-49e704aeb43a","Type":"ContainerStarted","Data":"793158d30625ea44f14e3cf6927d960d4fb70b77cc2bde9279de8570448d24cb"} Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.440718 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.440729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.440737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"487d2c84-f1f6-4b88-9188-49e704aeb43a","Type":"ContainerStarted","Data":"bda04896b44c77f2f7e84995fda7c5d626e8dec1e42dd18df28b089230861206"} Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.441433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.453217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" podStartSLOduration=2.453204871 podStartE2EDuration="2.453204871s" podCreationTimestamp="2026-01-21 15:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:04.448658964 +0000 UTC m=+2121.630175185" watchObservedRunningTime="2026-01-21 15:37:04.453204871 +0000 UTC m=+2121.634721092" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.470586 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.470573138 podStartE2EDuration="2.470573138s" podCreationTimestamp="2026-01-21 15:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:04.468688924 +0000 UTC m=+2121.650205147" watchObservedRunningTime="2026-01-21 15:37:04.470573138 +0000 UTC m=+2121.652089360" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.506965 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" podStartSLOduration=2.5069498550000002 podStartE2EDuration="2.506949855s" podCreationTimestamp="2026-01-21 15:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:04.494199651 +0000 UTC m=+2121.675715874" 
watchObservedRunningTime="2026-01-21 15:37:04.506949855 +0000 UTC m=+2121.688466078" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.961623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:04 crc kubenswrapper[4707]: I0121 15:37:04.962054 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.215875 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.560207 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5f454b744-bh8qb"] Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.561755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.564195 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.564481 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.575839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5f454b744-bh8qb"] Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.672373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-combined-ca-bundle\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.672419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-public-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.672512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-httpd-config\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.672568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-internal-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.672762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-ovndb-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " 
pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.672904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwvrk\" (UniqueName: \"kubernetes.io/projected/6452a729-a782-4679-be7b-1c5240b66caa-kube-api-access-fwvrk\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.673074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-config\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.774886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwvrk\" (UniqueName: \"kubernetes.io/projected/6452a729-a782-4679-be7b-1c5240b66caa-kube-api-access-fwvrk\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.775041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-config\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.775138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-combined-ca-bundle\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.775171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-public-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.775201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-httpd-config\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.775246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-internal-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.775329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-ovndb-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " 
pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.783269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-internal-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.783607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-combined-ca-bundle\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.786721 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-ovndb-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.787411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-public-tls-certs\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.788439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-config\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.790738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-httpd-config\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.795385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwvrk\" (UniqueName: \"kubernetes.io/projected/6452a729-a782-4679-be7b-1c5240b66caa-kube-api-access-fwvrk\") pod \"neutron-5f454b744-bh8qb\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.862857 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:37:05 crc kubenswrapper[4707]: I0121 15:37:05.877827 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.238335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:37:06 crc kubenswrapper[4707]: W0121 15:37:06.373359 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6452a729_a782_4679_be7b_1c5240b66caa.slice/crio-119071083d3504a83fd61b8490add36b41daeacbc3e7fa44c785a9272d2a77be WatchSource:0}: Error finding container 119071083d3504a83fd61b8490add36b41daeacbc3e7fa44c785a9272d2a77be: Status 404 returned error can't find the container with id 119071083d3504a83fd61b8490add36b41daeacbc3e7fa44c785a9272d2a77be Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.382466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5f454b744-bh8qb"] Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.440593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.441522 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.486574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" event={"ID":"6452a729-a782-4679-be7b-1c5240b66caa","Type":"ContainerStarted","Data":"119071083d3504a83fd61b8490add36b41daeacbc3e7fa44c785a9272d2a77be"} Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.806143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:06 crc kubenswrapper[4707]: I0121 15:37:06.834115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:37:07 crc kubenswrapper[4707]: I0121 15:37:07.501179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" event={"ID":"6452a729-a782-4679-be7b-1c5240b66caa","Type":"ContainerStarted","Data":"2121408667c0e3ee8a9ef8da490db5564b60ad19d737feab45bf9a9843fd4d1c"} Jan 21 15:37:07 crc kubenswrapper[4707]: I0121 15:37:07.501475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" event={"ID":"6452a729-a782-4679-be7b-1c5240b66caa","Type":"ContainerStarted","Data":"46086c7e5a9656d5495acf8f78897d5393fb2095b5808e57b661a658aa9f8aec"} Jan 21 15:37:07 crc kubenswrapper[4707]: I0121 15:37:07.501505 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:07 crc kubenswrapper[4707]: I0121 15:37:07.501875 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="probe" containerID="cri-o://35e5a074ef992f0ac99a7b67323fef094c2c86061197f54a16fb0be2a696fcee" gracePeriod=30 Jan 21 15:37:07 crc kubenswrapper[4707]: I0121 15:37:07.501875 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="cinder-scheduler" 
containerID="cri-o://86f11fcd5fba98de8bdba84643f8037cbe04cfb15e4057fa1e54c13ebce243b6" gracePeriod=30 Jan 21 15:37:07 crc kubenswrapper[4707]: I0121 15:37:07.520515 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" podStartSLOduration=2.520498426 podStartE2EDuration="2.520498426s" podCreationTimestamp="2026-01-21 15:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:07.513829215 +0000 UTC m=+2124.695345437" watchObservedRunningTime="2026-01-21 15:37:07.520498426 +0000 UTC m=+2124.702014647" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.522995 4707 generic.go:334] "Generic (PLEG): container finished" podID="984386a6-8ce2-4d73-9d17-0654599439aa" containerID="35e5a074ef992f0ac99a7b67323fef094c2c86061197f54a16fb0be2a696fcee" exitCode=0 Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.523285 4707 generic.go:334] "Generic (PLEG): container finished" podID="984386a6-8ce2-4d73-9d17-0654599439aa" containerID="86f11fcd5fba98de8bdba84643f8037cbe04cfb15e4057fa1e54c13ebce243b6" exitCode=0 Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.524124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"984386a6-8ce2-4d73-9d17-0654599439aa","Type":"ContainerDied","Data":"35e5a074ef992f0ac99a7b67323fef094c2c86061197f54a16fb0be2a696fcee"} Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.524155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"984386a6-8ce2-4d73-9d17-0654599439aa","Type":"ContainerDied","Data":"86f11fcd5fba98de8bdba84643f8037cbe04cfb15e4057fa1e54c13ebce243b6"} Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.804060 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.950333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr75n\" (UniqueName: \"kubernetes.io/projected/984386a6-8ce2-4d73-9d17-0654599439aa-kube-api-access-cr75n\") pod \"984386a6-8ce2-4d73-9d17-0654599439aa\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.950576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data\") pod \"984386a6-8ce2-4d73-9d17-0654599439aa\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.950647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-scripts\") pod \"984386a6-8ce2-4d73-9d17-0654599439aa\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.950766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-combined-ca-bundle\") pod \"984386a6-8ce2-4d73-9d17-0654599439aa\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.950889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/984386a6-8ce2-4d73-9d17-0654599439aa-etc-machine-id\") pod \"984386a6-8ce2-4d73-9d17-0654599439aa\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.951013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data-custom\") pod \"984386a6-8ce2-4d73-9d17-0654599439aa\" (UID: \"984386a6-8ce2-4d73-9d17-0654599439aa\") " Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.951140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/984386a6-8ce2-4d73-9d17-0654599439aa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "984386a6-8ce2-4d73-9d17-0654599439aa" (UID: "984386a6-8ce2-4d73-9d17-0654599439aa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.952121 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/984386a6-8ce2-4d73-9d17-0654599439aa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.959484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-scripts" (OuterVolumeSpecName: "scripts") pod "984386a6-8ce2-4d73-9d17-0654599439aa" (UID: "984386a6-8ce2-4d73-9d17-0654599439aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.961041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "984386a6-8ce2-4d73-9d17-0654599439aa" (UID: "984386a6-8ce2-4d73-9d17-0654599439aa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.961337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984386a6-8ce2-4d73-9d17-0654599439aa-kube-api-access-cr75n" (OuterVolumeSpecName: "kube-api-access-cr75n") pod "984386a6-8ce2-4d73-9d17-0654599439aa" (UID: "984386a6-8ce2-4d73-9d17-0654599439aa"). InnerVolumeSpecName "kube-api-access-cr75n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:08 crc kubenswrapper[4707]: I0121 15:37:08.993917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "984386a6-8ce2-4d73-9d17-0654599439aa" (UID: "984386a6-8ce2-4d73-9d17-0654599439aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.039104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data" (OuterVolumeSpecName: "config-data") pod "984386a6-8ce2-4d73-9d17-0654599439aa" (UID: "984386a6-8ce2-4d73-9d17-0654599439aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.057714 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.057759 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.057771 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.057781 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/984386a6-8ce2-4d73-9d17-0654599439aa-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.057790 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr75n\" (UniqueName: \"kubernetes.io/projected/984386a6-8ce2-4d73-9d17-0654599439aa-kube-api-access-cr75n\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.532583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"984386a6-8ce2-4d73-9d17-0654599439aa","Type":"ContainerDied","Data":"04c4d15a7c00c3d6a2167302c6c8d29d4b96452909390e7ba4710f560a1d077c"} Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.532627 4707 scope.go:117] "RemoveContainer" containerID="35e5a074ef992f0ac99a7b67323fef094c2c86061197f54a16fb0be2a696fcee" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.532735 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.550130 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.556711 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.557153 4707 scope.go:117] "RemoveContainer" containerID="86f11fcd5fba98de8bdba84643f8037cbe04cfb15e4057fa1e54c13ebce243b6" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.583116 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:37:09 crc kubenswrapper[4707]: E0121 15:37:09.583481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="cinder-scheduler" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.583500 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="cinder-scheduler" Jan 21 15:37:09 crc kubenswrapper[4707]: E0121 15:37:09.583515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="probe" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.583522 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="probe" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.583710 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="probe" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.583731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" containerName="cinder-scheduler" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.584626 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.589066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.599801 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.672700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.672860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-scripts\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.672915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lh9\" (UniqueName: \"kubernetes.io/projected/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-kube-api-access-66lh9\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.672963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.673006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.673049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.774727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-scripts\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.775039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lh9\" (UniqueName: \"kubernetes.io/projected/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-kube-api-access-66lh9\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.775077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.775126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.775153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.775285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.775723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.780163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.780403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.780544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.780669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-scripts\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.796747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-66lh9\" (UniqueName: \"kubernetes.io/projected/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-kube-api-access-66lh9\") pod \"cinder-scheduler-0\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.920021 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.945727 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:37:09 crc kubenswrapper[4707]: I0121 15:37:09.945787 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:37:10 crc kubenswrapper[4707]: I0121 15:37:10.314489 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:37:10 crc kubenswrapper[4707]: W0121 15:37:10.317639 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87a91a2_ba47_4dc0_9b58_0d2f00f5f829.slice/crio-0ff4026afe627ace74a3cabe6ea6a25aabd52c5a58ef0944ac48699f06dd87d8 WatchSource:0}: Error finding container 0ff4026afe627ace74a3cabe6ea6a25aabd52c5a58ef0944ac48699f06dd87d8: Status 404 returned error can't find the container with id 0ff4026afe627ace74a3cabe6ea6a25aabd52c5a58ef0944ac48699f06dd87d8 Jan 21 15:37:10 crc kubenswrapper[4707]: I0121 15:37:10.547802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829","Type":"ContainerStarted","Data":"0ff4026afe627ace74a3cabe6ea6a25aabd52c5a58ef0944ac48699f06dd87d8"} Jan 21 15:37:11 crc kubenswrapper[4707]: I0121 15:37:11.192861 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984386a6-8ce2-4d73-9d17-0654599439aa" path="/var/lib/kubelet/pods/984386a6-8ce2-4d73-9d17-0654599439aa/volumes" Jan 21 15:37:11 crc kubenswrapper[4707]: I0121 15:37:11.559748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829","Type":"ContainerStarted","Data":"6276366157c1c72f36b10c6a0708b755cc3d4fe703dde761888194d9a7e0a9e3"} Jan 21 15:37:11 crc kubenswrapper[4707]: I0121 15:37:11.559791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829","Type":"ContainerStarted","Data":"86eaf4108dcfe70d29829d6618776e1e19fc4cf3438e7d0357073b14327e2707"} Jan 21 15:37:11 crc kubenswrapper[4707]: I0121 15:37:11.589060 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.589039577 podStartE2EDuration="2.589039577s" podCreationTimestamp="2026-01-21 15:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
15:37:11.583081665 +0000 UTC m=+2128.764597886" watchObservedRunningTime="2026-01-21 15:37:11.589039577 +0000 UTC m=+2128.770555819" Jan 21 15:37:13 crc kubenswrapper[4707]: I0121 15:37:13.674350 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:13 crc kubenswrapper[4707]: I0121 15:37:13.709828 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:37:13 crc kubenswrapper[4707]: I0121 15:37:13.771509 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h"] Jan 21 15:37:13 crc kubenswrapper[4707]: I0121 15:37:13.771975 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api-log" containerID="cri-o://39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2" gracePeriod=30 Jan 21 15:37:13 crc kubenswrapper[4707]: I0121 15:37:13.772644 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api" containerID="cri-o://2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81" gracePeriod=30 Jan 21 15:37:14 crc kubenswrapper[4707]: I0121 15:37:14.519743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:37:14 crc kubenswrapper[4707]: I0121 15:37:14.588787 4707 generic.go:334] "Generic (PLEG): container finished" podID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerID="39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2" exitCode=143 Jan 21 15:37:14 crc kubenswrapper[4707]: I0121 15:37:14.589761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" event={"ID":"d38bf7c9-fa00-4dc5-abaa-446877bcbe60","Type":"ContainerDied","Data":"39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2"} Jan 21 15:37:14 crc kubenswrapper[4707]: I0121 15:37:14.921523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:16 crc kubenswrapper[4707]: I0121 15:37:16.959664 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:49722->10.217.0.160:9311: read: connection reset by peer" Jan 21 15:37:16 crc kubenswrapper[4707]: I0121 15:37:16.959675 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:49720->10.217.0.160:9311: read: connection reset by peer" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.291100 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.406872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data\") pod \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.407052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-combined-ca-bundle\") pod \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.407117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-logs\") pod \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.407141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjww\" (UniqueName: \"kubernetes.io/projected/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-kube-api-access-rqjww\") pod \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.407159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data-custom\") pod \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\" (UID: \"d38bf7c9-fa00-4dc5-abaa-446877bcbe60\") " Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.407660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-logs" (OuterVolumeSpecName: "logs") pod "d38bf7c9-fa00-4dc5-abaa-446877bcbe60" (UID: "d38bf7c9-fa00-4dc5-abaa-446877bcbe60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.411447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d38bf7c9-fa00-4dc5-abaa-446877bcbe60" (UID: "d38bf7c9-fa00-4dc5-abaa-446877bcbe60"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.412994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-kube-api-access-rqjww" (OuterVolumeSpecName: "kube-api-access-rqjww") pod "d38bf7c9-fa00-4dc5-abaa-446877bcbe60" (UID: "d38bf7c9-fa00-4dc5-abaa-446877bcbe60"). InnerVolumeSpecName "kube-api-access-rqjww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.428781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d38bf7c9-fa00-4dc5-abaa-446877bcbe60" (UID: "d38bf7c9-fa00-4dc5-abaa-446877bcbe60"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.442910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data" (OuterVolumeSpecName: "config-data") pod "d38bf7c9-fa00-4dc5-abaa-446877bcbe60" (UID: "d38bf7c9-fa00-4dc5-abaa-446877bcbe60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.509591 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.509689 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.509744 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjww\" (UniqueName: \"kubernetes.io/projected/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-kube-api-access-rqjww\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.509898 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.509954 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38bf7c9-fa00-4dc5-abaa-446877bcbe60-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.612496 4707 generic.go:334] "Generic (PLEG): container finished" podID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerID="2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81" exitCode=0 Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.612535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" event={"ID":"d38bf7c9-fa00-4dc5-abaa-446877bcbe60","Type":"ContainerDied","Data":"2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81"} Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.612546 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.612562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h" event={"ID":"d38bf7c9-fa00-4dc5-abaa-446877bcbe60","Type":"ContainerDied","Data":"3b7ba521f4de330a4d368c27142ba028fddf46712d0bd71b953049b50ca3f6c8"} Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.612577 4707 scope.go:117] "RemoveContainer" containerID="2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.636342 4707 scope.go:117] "RemoveContainer" containerID="39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.637295 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h"] Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.642980 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6495d54cb4-mg28h"] Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.657256 4707 scope.go:117] "RemoveContainer" containerID="2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81" Jan 21 15:37:17 crc kubenswrapper[4707]: E0121 15:37:17.657710 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81\": container with ID starting with 2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81 not found: ID does not exist" containerID="2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.657743 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81"} err="failed to get container status \"2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81\": rpc error: code = NotFound desc = could not find container \"2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81\": container with ID starting with 2e5e858fd2eab4f06064abb6cc565b4f575c6d930af69d369a6a684e62615a81 not found: ID does not exist" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.657767 4707 scope.go:117] "RemoveContainer" containerID="39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2" Jan 21 15:37:17 crc kubenswrapper[4707]: E0121 15:37:17.658070 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2\": container with ID starting with 39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2 not found: ID does not exist" containerID="39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2" Jan 21 15:37:17 crc kubenswrapper[4707]: I0121 15:37:17.658093 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2"} err="failed to get container status \"39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2\": rpc error: code = NotFound desc = could not find container \"39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2\": container with ID starting with 
39f02a285e8d9b867905fdfb70d5a0efee9215c75e6d03ded4aecf7586ef78f2 not found: ID does not exist" Jan 21 15:37:19 crc kubenswrapper[4707]: I0121 15:37:19.191001 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" path="/var/lib/kubelet/pods/d38bf7c9-fa00-4dc5-abaa-446877bcbe60/volumes" Jan 21 15:37:20 crc kubenswrapper[4707]: I0121 15:37:20.076583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.872643 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phbs6"] Jan 21 15:37:21 crc kubenswrapper[4707]: E0121 15:37:21.873198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api-log" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.873212 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api-log" Jan 21 15:37:21 crc kubenswrapper[4707]: E0121 15:37:21.873246 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.873252 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.873445 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.873457 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38bf7c9-fa00-4dc5-abaa-446877bcbe60" containerName="barbican-api-log" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.874659 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.880617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phbs6"] Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.983476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctb8\" (UniqueName: \"kubernetes.io/projected/c6abdb91-1633-452b-ab64-95100bb38512-kube-api-access-lctb8\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.983527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-utilities\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:21 crc kubenswrapper[4707]: I0121 15:37:21.983562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-catalog-content\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.085480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lctb8\" (UniqueName: \"kubernetes.io/projected/c6abdb91-1633-452b-ab64-95100bb38512-kube-api-access-lctb8\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.085550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-utilities\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.085599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-catalog-content\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.086084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-catalog-content\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.086393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-utilities\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.101863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lctb8\" (UniqueName: \"kubernetes.io/projected/c6abdb91-1633-452b-ab64-95100bb38512-kube-api-access-lctb8\") pod \"redhat-operators-phbs6\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.192909 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.625904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phbs6"] Jan 21 15:37:22 crc kubenswrapper[4707]: I0121 15:37:22.653725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerStarted","Data":"00e066e0e0d8b82f1bdd025e45708b329e78b59259f9b6142cc9c67afa891bb5"} Jan 21 15:37:23 crc kubenswrapper[4707]: I0121 15:37:23.662214 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6abdb91-1633-452b-ab64-95100bb38512" containerID="b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89" exitCode=0 Jan 21 15:37:23 crc kubenswrapper[4707]: I0121 15:37:23.662324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerDied","Data":"b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89"} Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.093430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.105975 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.674119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerStarted","Data":"f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d"} Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.700194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.776131 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.784690 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:37:24 crc kubenswrapper[4707]: I0121 15:37:24.838766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6cf4db6586-6qsm2"] Jan 21 15:37:25 crc kubenswrapper[4707]: I0121 15:37:25.681067 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-log" containerID="cri-o://2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474" gracePeriod=30 Jan 21 15:37:25 crc kubenswrapper[4707]: I0121 15:37:25.681557 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" 
podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-api" containerID="cri-o://69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f" gracePeriod=30 Jan 21 15:37:26 crc kubenswrapper[4707]: I0121 15:37:26.690492 4707 generic.go:334] "Generic (PLEG): container finished" podID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerID="2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474" exitCode=143 Jan 21 15:37:26 crc kubenswrapper[4707]: I0121 15:37:26.690562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" event={"ID":"370b1b6f-b257-4b0d-8d49-d79e90f2577d","Type":"ContainerDied","Data":"2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474"} Jan 21 15:37:26 crc kubenswrapper[4707]: I0121 15:37:26.693092 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6abdb91-1633-452b-ab64-95100bb38512" containerID="f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d" exitCode=0 Jan 21 15:37:26 crc kubenswrapper[4707]: I0121 15:37:26.693115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerDied","Data":"f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d"} Jan 21 15:37:27 crc kubenswrapper[4707]: I0121 15:37:27.703262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerStarted","Data":"dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4"} Jan 21 15:37:27 crc kubenswrapper[4707]: I0121 15:37:27.724818 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phbs6" podStartSLOduration=3.202498729 podStartE2EDuration="6.7247907s" podCreationTimestamp="2026-01-21 15:37:21 +0000 UTC" firstStartedPulling="2026-01-21 15:37:23.663865178 +0000 UTC m=+2140.845381399" lastFinishedPulling="2026-01-21 15:37:27.186157147 +0000 UTC m=+2144.367673370" observedRunningTime="2026-01-21 15:37:27.720799775 +0000 UTC m=+2144.902315998" watchObservedRunningTime="2026-01-21 15:37:27.7247907 +0000 UTC m=+2144.906306922" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.129271 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.238471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-combined-ca-bundle\") pod \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.238621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmkvv\" (UniqueName: \"kubernetes.io/projected/370b1b6f-b257-4b0d-8d49-d79e90f2577d-kube-api-access-hmkvv\") pod \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.238643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-config-data\") pod \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.238736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370b1b6f-b257-4b0d-8d49-d79e90f2577d-logs\") pod \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.238766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-scripts\") pod \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\" (UID: \"370b1b6f-b257-4b0d-8d49-d79e90f2577d\") " Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.239349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370b1b6f-b257-4b0d-8d49-d79e90f2577d-logs" (OuterVolumeSpecName: "logs") pod "370b1b6f-b257-4b0d-8d49-d79e90f2577d" (UID: "370b1b6f-b257-4b0d-8d49-d79e90f2577d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.244539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-scripts" (OuterVolumeSpecName: "scripts") pod "370b1b6f-b257-4b0d-8d49-d79e90f2577d" (UID: "370b1b6f-b257-4b0d-8d49-d79e90f2577d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.244551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370b1b6f-b257-4b0d-8d49-d79e90f2577d-kube-api-access-hmkvv" (OuterVolumeSpecName: "kube-api-access-hmkvv") pod "370b1b6f-b257-4b0d-8d49-d79e90f2577d" (UID: "370b1b6f-b257-4b0d-8d49-d79e90f2577d"). InnerVolumeSpecName "kube-api-access-hmkvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.281878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "370b1b6f-b257-4b0d-8d49-d79e90f2577d" (UID: "370b1b6f-b257-4b0d-8d49-d79e90f2577d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.283909 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-config-data" (OuterVolumeSpecName: "config-data") pod "370b1b6f-b257-4b0d-8d49-d79e90f2577d" (UID: "370b1b6f-b257-4b0d-8d49-d79e90f2577d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.341304 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.341330 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmkvv\" (UniqueName: \"kubernetes.io/projected/370b1b6f-b257-4b0d-8d49-d79e90f2577d-kube-api-access-hmkvv\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.341341 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.341349 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370b1b6f-b257-4b0d-8d49-d79e90f2577d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.341357 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370b1b6f-b257-4b0d-8d49-d79e90f2577d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.722942 4707 generic.go:334] "Generic (PLEG): container finished" podID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerID="69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f" exitCode=0 Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.723013 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.723018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" event={"ID":"370b1b6f-b257-4b0d-8d49-d79e90f2577d","Type":"ContainerDied","Data":"69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f"} Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.723699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6cf4db6586-6qsm2" event={"ID":"370b1b6f-b257-4b0d-8d49-d79e90f2577d","Type":"ContainerDied","Data":"dd90c850c1cefaaa38e22049833efff4181cc45cb0764e71a05099c47f1c7f24"} Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.723743 4707 scope.go:117] "RemoveContainer" containerID="69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.743989 4707 scope.go:117] "RemoveContainer" containerID="2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.747517 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6cf4db6586-6qsm2"] Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.754327 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6cf4db6586-6qsm2"] Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.761210 4707 scope.go:117] "RemoveContainer" containerID="69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f" Jan 21 15:37:29 crc kubenswrapper[4707]: E0121 15:37:29.761587 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f\": container with ID starting with 69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f not found: ID does not exist" containerID="69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.761624 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f"} err="failed to get container status \"69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f\": rpc error: code = NotFound desc = could not find container \"69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f\": container with ID starting with 69eb1233af4ac56fb7588aa7c5cd8f699ee933a9b6e638252a5059f6424b086f not found: ID does not exist" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.761647 4707 scope.go:117] "RemoveContainer" containerID="2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474" Jan 21 15:37:29 crc kubenswrapper[4707]: E0121 15:37:29.761956 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474\": container with ID starting with 2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474 not found: ID does not exist" containerID="2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474" Jan 21 15:37:29 crc kubenswrapper[4707]: I0121 15:37:29.761989 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474"} err="failed 
to get container status \"2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474\": rpc error: code = NotFound desc = could not find container \"2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474\": container with ID starting with 2a11e7dba3dd31354fcb31b9e93230570fc461506a37160155dc00b0b8403474 not found: ID does not exist" Jan 21 15:37:31 crc kubenswrapper[4707]: I0121 15:37:31.191146 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" path="/var/lib/kubelet/pods/370b1b6f-b257-4b0d-8d49-d79e90f2577d/volumes" Jan 21 15:37:32 crc kubenswrapper[4707]: I0121 15:37:32.193025 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:32 crc kubenswrapper[4707]: I0121 15:37:32.193748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:32 crc kubenswrapper[4707]: I0121 15:37:32.227433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:32 crc kubenswrapper[4707]: I0121 15:37:32.782119 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:32 crc kubenswrapper[4707]: I0121 15:37:32.823600 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phbs6"] Jan 21 15:37:32 crc kubenswrapper[4707]: I0121 15:37:32.848458 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:33 crc kubenswrapper[4707]: I0121 15:37:33.099282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:37:34 crc kubenswrapper[4707]: I0121 15:37:34.760190 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phbs6" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="registry-server" containerID="cri-o://dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4" gracePeriod=2 Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.144827 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.241163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-utilities\") pod \"c6abdb91-1633-452b-ab64-95100bb38512\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.241214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-catalog-content\") pod \"c6abdb91-1633-452b-ab64-95100bb38512\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.241279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lctb8\" (UniqueName: \"kubernetes.io/projected/c6abdb91-1633-452b-ab64-95100bb38512-kube-api-access-lctb8\") pod \"c6abdb91-1633-452b-ab64-95100bb38512\" (UID: \"c6abdb91-1633-452b-ab64-95100bb38512\") " Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.242384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-utilities" (OuterVolumeSpecName: "utilities") pod "c6abdb91-1633-452b-ab64-95100bb38512" (UID: "c6abdb91-1633-452b-ab64-95100bb38512"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.245848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6abdb91-1633-452b-ab64-95100bb38512-kube-api-access-lctb8" (OuterVolumeSpecName: "kube-api-access-lctb8") pod "c6abdb91-1633-452b-ab64-95100bb38512" (UID: "c6abdb91-1633-452b-ab64-95100bb38512"). InnerVolumeSpecName "kube-api-access-lctb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.331260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6abdb91-1633-452b-ab64-95100bb38512" (UID: "c6abdb91-1633-452b-ab64-95100bb38512"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.342940 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lctb8\" (UniqueName: \"kubernetes.io/projected/c6abdb91-1633-452b-ab64-95100bb38512-kube-api-access-lctb8\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.343041 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.343115 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6abdb91-1633-452b-ab64-95100bb38512-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.771363 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6abdb91-1633-452b-ab64-95100bb38512" containerID="dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4" exitCode=0 Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.771370 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phbs6" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.771387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerDied","Data":"dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4"} Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.771431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phbs6" event={"ID":"c6abdb91-1633-452b-ab64-95100bb38512","Type":"ContainerDied","Data":"00e066e0e0d8b82f1bdd025e45708b329e78b59259f9b6142cc9c67afa891bb5"} Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.771450 4707 scope.go:117] "RemoveContainer" containerID="dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.804988 4707 scope.go:117] "RemoveContainer" containerID="f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.810715 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phbs6"] Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.819199 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phbs6"] Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.829963 4707 scope.go:117] "RemoveContainer" containerID="b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.862829 4707 scope.go:117] "RemoveContainer" containerID="dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4" Jan 21 15:37:35 crc kubenswrapper[4707]: E0121 15:37:35.863097 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4\": container with ID starting with dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4 not found: ID does not exist" containerID="dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.863132 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4"} err="failed to get container status \"dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4\": rpc error: code = NotFound desc = could not find container \"dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4\": container with ID starting with dbd12a714a367a79a0b3d80e0414f2eb83b94eb1fa6eaedbf3fd82d0a02abab4 not found: ID does not exist" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.863152 4707 scope.go:117] "RemoveContainer" containerID="f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d" Jan 21 15:37:35 crc kubenswrapper[4707]: E0121 15:37:35.863470 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d\": container with ID starting with f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d not found: ID does not exist" containerID="f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.863499 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d"} err="failed to get container status \"f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d\": rpc error: code = NotFound desc = could not find container \"f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d\": container with ID starting with f7f2be9710abcf0fb935760117beede707aaf608352943fc6eed6a0bd679043d not found: ID does not exist" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.863515 4707 scope.go:117] "RemoveContainer" containerID="b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89" Jan 21 15:37:35 crc kubenswrapper[4707]: E0121 15:37:35.863696 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89\": container with ID starting with b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89 not found: ID does not exist" containerID="b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.863714 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89"} err="failed to get container status \"b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89\": rpc error: code = NotFound desc = could not find container \"b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89\": container with ID starting with b2c7b1b0b94f41743259ee3c93f92662fccafcb6b7e0a37d7c28a465407c0e89 not found: ID does not exist" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.887038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.952208 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-57d48858d5-rdc5w"] Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.952416 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" 
podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-api" containerID="cri-o://7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82" gracePeriod=30 Jan 21 15:37:35 crc kubenswrapper[4707]: I0121 15:37:35.952543 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-httpd" containerID="cri-o://f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402" gracePeriod=30 Jan 21 15:37:36 crc kubenswrapper[4707]: I0121 15:37:36.794838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" event={"ID":"0799c4c9-1d73-46a2-9b0b-0f225aeba225","Type":"ContainerDied","Data":"f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402"} Jan 21 15:37:36 crc kubenswrapper[4707]: I0121 15:37:36.794828 4707 generic.go:334] "Generic (PLEG): container finished" podID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerID="f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402" exitCode=0 Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.110175 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.110777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-api" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.110794 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-api" Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.110831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="extract-content" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.110840 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="extract-content" Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.110853 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="extract-utilities" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.110860 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="extract-utilities" Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.110878 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="registry-server" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.110883 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="registry-server" Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.110892 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-log" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.110897 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-log" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.111052 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-api" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.111063 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="370b1b6f-b257-4b0d-8d49-d79e90f2577d" containerName="placement-log" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.111070 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6abdb91-1633-452b-ab64-95100bb38512" containerName="registry-server" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.111658 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.114433 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.114449 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.117774 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-42flq" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.136097 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.178075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config-secret\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.178125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4zrh\" (UniqueName: \"kubernetes.io/projected/37972b5f-69ec-4558-b68a-c8108b5b5fea-kube-api-access-g4zrh\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.178178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.178593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.198959 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6abdb91-1633-452b-ab64-95100bb38512" path="/var/lib/kubelet/pods/c6abdb91-1633-452b-ab64-95100bb38512/volumes" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.280398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config-secret\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.280435 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4zrh\" (UniqueName: \"kubernetes.io/projected/37972b5f-69ec-4558-b68a-c8108b5b5fea-kube-api-access-g4zrh\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.280489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.280556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.281352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.287379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config-secret\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.287495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.298492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4zrh\" (UniqueName: \"kubernetes.io/projected/37972b5f-69ec-4558-b68a-c8108b5b5fea-kube-api-access-g4zrh\") pod \"openstackclient\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.354751 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.386091 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-combined-ca-bundle\") pod \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.386161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-httpd-config\") pod \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.386282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-config\") pod \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.386358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-ovndb-tls-certs\") pod \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.387038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmccn\" (UniqueName: \"kubernetes.io/projected/0799c4c9-1d73-46a2-9b0b-0f225aeba225-kube-api-access-bmccn\") pod \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\" (UID: \"0799c4c9-1d73-46a2-9b0b-0f225aeba225\") " Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.393905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0799c4c9-1d73-46a2-9b0b-0f225aeba225" (UID: "0799c4c9-1d73-46a2-9b0b-0f225aeba225"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.399986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0799c4c9-1d73-46a2-9b0b-0f225aeba225-kube-api-access-bmccn" (OuterVolumeSpecName: "kube-api-access-bmccn") pod "0799c4c9-1d73-46a2-9b0b-0f225aeba225" (UID: "0799c4c9-1d73-46a2-9b0b-0f225aeba225"). InnerVolumeSpecName "kube-api-access-bmccn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.425349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0799c4c9-1d73-46a2-9b0b-0f225aeba225" (UID: "0799c4c9-1d73-46a2-9b0b-0f225aeba225"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.434012 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.434694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-config" (OuterVolumeSpecName: "config") pod "0799c4c9-1d73-46a2-9b0b-0f225aeba225" (UID: "0799c4c9-1d73-46a2-9b0b-0f225aeba225"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.448929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0799c4c9-1d73-46a2-9b0b-0f225aeba225" (UID: "0799c4c9-1d73-46a2-9b0b-0f225aeba225"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.489866 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.489909 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.489923 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmccn\" (UniqueName: \"kubernetes.io/projected/0799c4c9-1d73-46a2-9b0b-0f225aeba225-kube-api-access-bmccn\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.489932 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.489942 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0799c4c9-1d73-46a2-9b0b-0f225aeba225-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.805212 4707 generic.go:334] "Generic (PLEG): container finished" podID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerID="7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82" exitCode=0 Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.805266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" event={"ID":"0799c4c9-1d73-46a2-9b0b-0f225aeba225","Type":"ContainerDied","Data":"7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82"} Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.805294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" event={"ID":"0799c4c9-1d73-46a2-9b0b-0f225aeba225","Type":"ContainerDied","Data":"bb8cf85ce118e38c4ded0436c517dd84bdb339bed9afec250b5bf7ba011a5d13"} Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.805316 4707 scope.go:117] "RemoveContainer" containerID="f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.805358 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-57d48858d5-rdc5w" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.823570 4707 scope.go:117] "RemoveContainer" containerID="7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.840454 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-57d48858d5-rdc5w"] Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.842326 4707 scope.go:117] "RemoveContainer" containerID="f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402" Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.842729 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402\": container with ID starting with f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402 not found: ID does not exist" containerID="f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.842756 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402"} err="failed to get container status \"f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402\": rpc error: code = NotFound desc = could not find container \"f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402\": container with ID starting with f03897eb6e33a3f37ddc4fbfaa90ba0d03551a7d52187fb0cd5cecf5a949c402 not found: ID does not exist" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.842772 4707 scope.go:117] "RemoveContainer" containerID="7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82" Jan 21 15:37:37 crc kubenswrapper[4707]: E0121 15:37:37.843049 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82\": container with ID starting with 7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82 not found: ID does not exist" containerID="7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.843076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82"} err="failed to get container status \"7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82\": rpc error: code = NotFound desc = could not find container \"7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82\": container with ID starting with 7e1827dfc800600179ca0e15ca8d0b52b0fa3552d6a1f260bae19ec2ecf0ae82 not found: ID does not exist" Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.850904 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-57d48858d5-rdc5w"] Jan 21 15:37:37 crc kubenswrapper[4707]: I0121 15:37:37.857619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.814422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"37972b5f-69ec-4558-b68a-c8108b5b5fea","Type":"ContainerStarted","Data":"2d5245eb454b9ce88454d22d980989055f69b8ce07955601437667dbc9834df3"} 
Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.814758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"37972b5f-69ec-4558-b68a-c8108b5b5fea","Type":"ContainerStarted","Data":"0f9bb9909aae3584cde11c732676f14c5b2f21ac657ec8dab2994e71333396dc"} Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.828509 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.828493887 podStartE2EDuration="1.828493887s" podCreationTimestamp="2026-01-21 15:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:38.826787247 +0000 UTC m=+2156.008303470" watchObservedRunningTime="2026-01-21 15:37:38.828493887 +0000 UTC m=+2156.010010108" Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.900517 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.900875 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="proxy-httpd" containerID="cri-o://0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0" gracePeriod=30 Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.900934 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="sg-core" containerID="cri-o://2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6" gracePeriod=30 Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.900969 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-notification-agent" containerID="cri-o://be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143" gracePeriod=30 Jan 21 15:37:38 crc kubenswrapper[4707]: I0121 15:37:38.901053 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-central-agent" containerID="cri-o://4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c" gracePeriod=30 Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.190982 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" path="/var/lib/kubelet/pods/0799c4c9-1d73-46a2-9b0b-0f225aeba225/volumes" Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.827491 4707 generic.go:334] "Generic (PLEG): container finished" podID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerID="0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0" exitCode=0 Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.827530 4707 generic.go:334] "Generic (PLEG): container finished" podID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerID="2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6" exitCode=2 Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.827538 4707 generic.go:334] "Generic (PLEG): container finished" podID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerID="4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c" exitCode=0 Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 
15:37:39.827565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerDied","Data":"0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0"} Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.827609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerDied","Data":"2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6"} Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.827622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerDied","Data":"4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c"} Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.946145 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:37:39 crc kubenswrapper[4707]: I0121 15:37:39.946196 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.320414 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.443797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrcr\" (UniqueName: \"kubernetes.io/projected/3401bc45-ba40-4893-b70c-debe5e5bf534-kube-api-access-kfrcr\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.443865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-sg-core-conf-yaml\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.443887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-config-data\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.443935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-scripts\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.444001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-combined-ca-bundle\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: 
\"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.444044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-run-httpd\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.444090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-log-httpd\") pod \"3401bc45-ba40-4893-b70c-debe5e5bf534\" (UID: \"3401bc45-ba40-4893-b70c-debe5e5bf534\") " Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.444945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.446225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.449676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3401bc45-ba40-4893-b70c-debe5e5bf534-kube-api-access-kfrcr" (OuterVolumeSpecName: "kube-api-access-kfrcr") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "kube-api-access-kfrcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.453141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-scripts" (OuterVolumeSpecName: "scripts") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.469210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.499074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.519448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-config-data" (OuterVolumeSpecName: "config-data") pod "3401bc45-ba40-4893-b70c-debe5e5bf534" (UID: "3401bc45-ba40-4893-b70c-debe5e5bf534"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546483 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrcr\" (UniqueName: \"kubernetes.io/projected/3401bc45-ba40-4893-b70c-debe5e5bf534-kube-api-access-kfrcr\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546511 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546521 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546529 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546539 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3401bc45-ba40-4893-b70c-debe5e5bf534-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546548 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.546556 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3401bc45-ba40-4893-b70c-debe5e5bf534-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.836617 4707 generic.go:334] "Generic (PLEG): container finished" podID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerID="be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143" exitCode=0 Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.836657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerDied","Data":"be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143"} Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.836660 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.836683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3401bc45-ba40-4893-b70c-debe5e5bf534","Type":"ContainerDied","Data":"92cedbd193becec37b552edf91342663f674f348c474127919863971595574bb"} Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.836700 4707 scope.go:117] "RemoveContainer" containerID="0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.854476 4707 scope.go:117] "RemoveContainer" containerID="2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.864302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.876488 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.877986 4707 scope.go:117] "RemoveContainer" containerID="be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895124 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.895508 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="proxy-httpd" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895528 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="proxy-httpd" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.895543 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-api" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895548 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-api" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.895568 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-httpd" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895574 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-httpd" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.895585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-central-agent" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895592 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-central-agent" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.895600 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="sg-core" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895605 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="sg-core" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.895616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" 
containerName="ceilometer-notification-agent" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-notification-agent" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-central-agent" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895820 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-httpd" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895831 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="proxy-httpd" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895849 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0799c4c9-1d73-46a2-9b0b-0f225aeba225" containerName="neutron-api" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895858 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="sg-core" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.895872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" containerName="ceilometer-notification-agent" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.897367 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.897928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.899425 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.899485 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.913051 4707 scope.go:117] "RemoveContainer" containerID="4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.932586 4707 scope.go:117] "RemoveContainer" containerID="0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.932949 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0\": container with ID starting with 0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0 not found: ID does not exist" containerID="0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.932994 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0"} err="failed to get container status \"0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0\": rpc error: code = NotFound desc = could not find container \"0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0\": container with ID starting with 0b4efe7f2e7cb32867cc85ba30b7133035a213baaf7cdd87a7b962114a7612a0 
not found: ID does not exist" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.933013 4707 scope.go:117] "RemoveContainer" containerID="2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.933357 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6\": container with ID starting with 2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6 not found: ID does not exist" containerID="2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.933383 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6"} err="failed to get container status \"2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6\": rpc error: code = NotFound desc = could not find container \"2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6\": container with ID starting with 2c8f23cec889b5b8916bcacf44f1d58b8d1741ff5c574d0e89039e8fc0659cc6 not found: ID does not exist" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.933395 4707 scope.go:117] "RemoveContainer" containerID="be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.933647 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143\": container with ID starting with be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143 not found: ID does not exist" containerID="be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.933665 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143"} err="failed to get container status \"be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143\": rpc error: code = NotFound desc = could not find container \"be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143\": container with ID starting with be7ae6df80800c2a4dbe917e181b4e27cdfff59ccb3ee1742cb3b55fa5b67143 not found: ID does not exist" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.933693 4707 scope.go:117] "RemoveContainer" containerID="4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c" Jan 21 15:37:40 crc kubenswrapper[4707]: E0121 15:37:40.933951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c\": container with ID starting with 4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c not found: ID does not exist" containerID="4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c" Jan 21 15:37:40 crc kubenswrapper[4707]: I0121 15:37:40.933966 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c"} err="failed to get container status \"4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c\": rpc error: code = NotFound desc = could not find container 
\"4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c\": container with ID starting with 4b84fbffe42a23f7a26eb83524db05677bec7865dbbc741d7134daa0697fd32c not found: ID does not exist" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk8r\" (UniqueName: \"kubernetes.io/projected/12508707-9551-4635-b05c-4cd647137b44-kube-api-access-spk8r\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-config-data\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-scripts\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-run-httpd\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.058630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-log-httpd\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.159633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.159904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-scripts\") pod \"ceilometer-0\" (UID: 
\"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.160047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-run-httpd\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.160590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-log-httpd\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.160731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spk8r\" (UniqueName: \"kubernetes.io/projected/12508707-9551-4635-b05c-4cd647137b44-kube-api-access-spk8r\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.160460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-run-httpd\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.161015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-log-httpd\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.161147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.161334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-config-data\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.167448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.167464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-scripts\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.167553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.167709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-config-data\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.173855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk8r\" (UniqueName: \"kubernetes.io/projected/12508707-9551-4635-b05c-4cd647137b44-kube-api-access-spk8r\") pod \"ceilometer-0\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.190101 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3401bc45-ba40-4893-b70c-debe5e5bf534" path="/var/lib/kubelet/pods/3401bc45-ba40-4893-b70c-debe5e5bf534/volumes" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.217061 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.619703 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.837644 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf"] Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.845160 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.847961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.848093 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.848154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.854734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerStarted","Data":"ef0cf3d9fd8b50b71b349cc7ea33193792d80b90750be0ebdd88a4ddc87ada39"} Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.867591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf"] Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.876796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-run-httpd\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.876894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-config-data\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.876974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgdlj\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-kube-api-access-zgdlj\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.877023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-log-httpd\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.877101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-public-tls-certs\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.877129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-combined-ca-bundle\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: 
\"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.877147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-etc-swift\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.877306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-internal-tls-certs\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-public-tls-certs\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-combined-ca-bundle\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-etc-swift\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-internal-tls-certs\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-run-httpd\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-config-data\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgdlj\" (UniqueName: 
\"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-kube-api-access-zgdlj\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.979549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-log-httpd\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.980161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-run-httpd\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.980173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-log-httpd\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.985575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-public-tls-certs\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.985580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-config-data\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.985923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-internal-tls-certs\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.988582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-etc-swift\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.996438 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-combined-ca-bundle\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:41 crc kubenswrapper[4707]: I0121 15:37:41.999644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgdlj\" (UniqueName: 
\"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-kube-api-access-zgdlj\") pod \"swift-proxy-6478545c6d-rrgqf\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.168497 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.219738 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.593586 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf"] Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.877746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" event={"ID":"4c133e4d-7259-43f9-90c7-d8fee2af8588","Type":"ContainerStarted","Data":"1df6460ba3866cec27e159d8bc64f86136c0600f74e7a9842781454a2ef8142c"} Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.878028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" event={"ID":"4c133e4d-7259-43f9-90c7-d8fee2af8588","Type":"ContainerStarted","Data":"ff9e7bd1f2a0d3427ab31dc36cbefa73b193fcc1c2688ecbc02b7b18ba10fa93"} Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.878039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" event={"ID":"4c133e4d-7259-43f9-90c7-d8fee2af8588","Type":"ContainerStarted","Data":"f6a9a2822431c1b0446f9af883787fd441d1f3f25e2b7f011e6ce9ce21b5e1f7"} Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.878052 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.879883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerStarted","Data":"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c"} Jan 21 15:37:42 crc kubenswrapper[4707]: I0121 15:37:42.898512 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" podStartSLOduration=1.898502007 podStartE2EDuration="1.898502007s" podCreationTimestamp="2026-01-21 15:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:42.890397478 +0000 UTC m=+2160.071913700" watchObservedRunningTime="2026-01-21 15:37:42.898502007 +0000 UTC m=+2160.080018228" Jan 21 15:37:43 crc kubenswrapper[4707]: I0121 15:37:43.892256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerStarted","Data":"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465"} Jan 21 15:37:43 crc kubenswrapper[4707]: I0121 15:37:43.892491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerStarted","Data":"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f"} Jan 21 15:37:43 crc kubenswrapper[4707]: I0121 15:37:43.892514 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:44 crc kubenswrapper[4707]: I0121 15:37:44.999154 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rffn8"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.000149 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.015082 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rffn8"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.037563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kt76\" (UniqueName: \"kubernetes.io/projected/175e861a-ef37-43bf-a299-41690c536532-kube-api-access-8kt76\") pod \"nova-api-db-create-rffn8\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.037643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/175e861a-ef37-43bf-a299-41690c536532-operator-scripts\") pod \"nova-api-db-create-rffn8\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.096857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.098289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.100455 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.108523 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.142658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtxt\" (UniqueName: \"kubernetes.io/projected/488b22fe-c39f-42fb-8c0e-75917ac55fff-kube-api-access-cdtxt\") pod \"nova-api-40b4-account-create-update-lktds\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.142710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/488b22fe-c39f-42fb-8c0e-75917ac55fff-operator-scripts\") pod \"nova-api-40b4-account-create-update-lktds\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.142798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kt76\" (UniqueName: \"kubernetes.io/projected/175e861a-ef37-43bf-a299-41690c536532-kube-api-access-8kt76\") pod \"nova-api-db-create-rffn8\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 
crc kubenswrapper[4707]: I0121 15:37:45.142828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/175e861a-ef37-43bf-a299-41690c536532-operator-scripts\") pod \"nova-api-db-create-rffn8\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.143392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/175e861a-ef37-43bf-a299-41690c536532-operator-scripts\") pod \"nova-api-db-create-rffn8\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.159169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kt76\" (UniqueName: \"kubernetes.io/projected/175e861a-ef37-43bf-a299-41690c536532-kube-api-access-8kt76\") pod \"nova-api-db-create-rffn8\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.216818 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9df45"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.218088 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.245057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtxt\" (UniqueName: \"kubernetes.io/projected/488b22fe-c39f-42fb-8c0e-75917ac55fff-kube-api-access-cdtxt\") pod \"nova-api-40b4-account-create-update-lktds\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.245119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/488b22fe-c39f-42fb-8c0e-75917ac55fff-operator-scripts\") pod \"nova-api-40b4-account-create-update-lktds\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.245204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlgss\" (UniqueName: \"kubernetes.io/projected/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-kube-api-access-qlgss\") pod \"nova-cell0-db-create-9df45\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.245280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-operator-scripts\") pod \"nova-cell0-db-create-9df45\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.248416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/488b22fe-c39f-42fb-8c0e-75917ac55fff-operator-scripts\") pod 
\"nova-api-40b4-account-create-update-lktds\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.248754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9df45"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.263055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdtxt\" (UniqueName: \"kubernetes.io/projected/488b22fe-c39f-42fb-8c0e-75917ac55fff-kube-api-access-cdtxt\") pod \"nova-api-40b4-account-create-update-lktds\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.299076 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-c94hb"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.300552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.322040 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-c94hb"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.332956 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.334614 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.338149 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.347178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.348515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlgss\" (UniqueName: \"kubernetes.io/projected/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-kube-api-access-qlgss\") pod \"nova-cell0-db-create-9df45\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.349950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-operator-scripts\") pod \"nova-cell0-db-create-9df45\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.350593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-operator-scripts\") pod \"nova-cell0-db-create-9df45\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.364513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlgss\" (UniqueName: \"kubernetes.io/projected/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-kube-api-access-qlgss\") pod 
\"nova-cell0-db-create-9df45\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.427269 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.447175 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.451595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnm5z\" (UniqueName: \"kubernetes.io/projected/22546c4b-0348-445b-a781-c3a15ca2b4cd-kube-api-access-wnm5z\") pod \"nova-cell0-6d10-account-create-update-pwrg2\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.451644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/618a68ea-1d06-4355-9d36-b0a833165b08-operator-scripts\") pod \"nova-cell1-db-create-c94hb\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.451674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22546c4b-0348-445b-a781-c3a15ca2b4cd-operator-scripts\") pod \"nova-cell0-6d10-account-create-update-pwrg2\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.451818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpl98\" (UniqueName: \"kubernetes.io/projected/618a68ea-1d06-4355-9d36-b0a833165b08-kube-api-access-vpl98\") pod \"nova-cell1-db-create-c94hb\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.494835 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.496283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.498997 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.501778 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f"] Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.542460 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.557245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99428af2-9c90-49ba-909e-486ce037a1ee-operator-scripts\") pod \"nova-cell1-ac8b-account-create-update-fkz9f\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.557313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9fz\" (UniqueName: \"kubernetes.io/projected/99428af2-9c90-49ba-909e-486ce037a1ee-kube-api-access-6z9fz\") pod \"nova-cell1-ac8b-account-create-update-fkz9f\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.557343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnm5z\" (UniqueName: \"kubernetes.io/projected/22546c4b-0348-445b-a781-c3a15ca2b4cd-kube-api-access-wnm5z\") pod \"nova-cell0-6d10-account-create-update-pwrg2\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.557399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/618a68ea-1d06-4355-9d36-b0a833165b08-operator-scripts\") pod \"nova-cell1-db-create-c94hb\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.557422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22546c4b-0348-445b-a781-c3a15ca2b4cd-operator-scripts\") pod \"nova-cell0-6d10-account-create-update-pwrg2\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.557444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpl98\" (UniqueName: \"kubernetes.io/projected/618a68ea-1d06-4355-9d36-b0a833165b08-kube-api-access-vpl98\") pod \"nova-cell1-db-create-c94hb\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.558499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/618a68ea-1d06-4355-9d36-b0a833165b08-operator-scripts\") pod \"nova-cell1-db-create-c94hb\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.558956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22546c4b-0348-445b-a781-c3a15ca2b4cd-operator-scripts\") pod \"nova-cell0-6d10-account-create-update-pwrg2\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: 
I0121 15:37:45.575342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnm5z\" (UniqueName: \"kubernetes.io/projected/22546c4b-0348-445b-a781-c3a15ca2b4cd-kube-api-access-wnm5z\") pod \"nova-cell0-6d10-account-create-update-pwrg2\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.578507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpl98\" (UniqueName: \"kubernetes.io/projected/618a68ea-1d06-4355-9d36-b0a833165b08-kube-api-access-vpl98\") pod \"nova-cell1-db-create-c94hb\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.616295 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.650607 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.658930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99428af2-9c90-49ba-909e-486ce037a1ee-operator-scripts\") pod \"nova-cell1-ac8b-account-create-update-fkz9f\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.659006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z9fz\" (UniqueName: \"kubernetes.io/projected/99428af2-9c90-49ba-909e-486ce037a1ee-kube-api-access-6z9fz\") pod \"nova-cell1-ac8b-account-create-update-fkz9f\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.660004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99428af2-9c90-49ba-909e-486ce037a1ee-operator-scripts\") pod \"nova-cell1-ac8b-account-create-update-fkz9f\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.678322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9fz\" (UniqueName: \"kubernetes.io/projected/99428af2-9c90-49ba-909e-486ce037a1ee-kube-api-access-6z9fz\") pod \"nova-cell1-ac8b-account-create-update-fkz9f\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.831940 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.872151 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rffn8"] Jan 21 15:37:45 crc kubenswrapper[4707]: W0121 15:37:45.875959 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod175e861a_ef37_43bf_a299_41690c536532.slice/crio-319e9d3f2c2a9fa962cdf26b5e7d6f2427369e04051d76a53a433b4b8091c201 WatchSource:0}: Error finding container 319e9d3f2c2a9fa962cdf26b5e7d6f2427369e04051d76a53a433b4b8091c201: Status 404 returned error can't find the container with id 319e9d3f2c2a9fa962cdf26b5e7d6f2427369e04051d76a53a433b4b8091c201 Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.925127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" event={"ID":"175e861a-ef37-43bf-a299-41690c536532","Type":"ContainerStarted","Data":"319e9d3f2c2a9fa962cdf26b5e7d6f2427369e04051d76a53a433b4b8091c201"} Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.927141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerStarted","Data":"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2"} Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.927276 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-central-agent" containerID="cri-o://5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" gracePeriod=30 Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.927547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.927779 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="proxy-httpd" containerID="cri-o://4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" gracePeriod=30 Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.927867 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="sg-core" containerID="cri-o://8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" gracePeriod=30 Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.927902 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-notification-agent" containerID="cri-o://d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" gracePeriod=30 Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.983785 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.633952674 podStartE2EDuration="5.983757924s" podCreationTimestamp="2026-01-21 15:37:40 +0000 UTC" firstStartedPulling="2026-01-21 15:37:41.612063144 +0000 UTC m=+2158.793579366" lastFinishedPulling="2026-01-21 15:37:44.961868395 +0000 UTC m=+2162.143384616" observedRunningTime="2026-01-21 
15:37:45.945769205 +0000 UTC m=+2163.127285427" watchObservedRunningTime="2026-01-21 15:37:45.983757924 +0000 UTC m=+2163.165274147" Jan 21 15:37:45 crc kubenswrapper[4707]: I0121 15:37:45.995998 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds"] Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.054151 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9df45"] Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.117115 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-c94hb"] Jan 21 15:37:46 crc kubenswrapper[4707]: W0121 15:37:46.124916 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod618a68ea_1d06_4355_9d36_b0a833165b08.slice/crio-8e3b04c5dba7cef54834de2f676a9a4d54f3e7e23f26f5165dd478c62141d646 WatchSource:0}: Error finding container 8e3b04c5dba7cef54834de2f676a9a4d54f3e7e23f26f5165dd478c62141d646: Status 404 returned error can't find the container with id 8e3b04c5dba7cef54834de2f676a9a4d54f3e7e23f26f5165dd478c62141d646 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.175940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2"] Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.309730 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f"] Jan 21 15:37:46 crc kubenswrapper[4707]: W0121 15:37:46.333896 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99428af2_9c90_49ba_909e_486ce037a1ee.slice/crio-4ce4c9e5ab51bca31b748d8f76aeac9498af0805699598fb0913638a7dccb89b WatchSource:0}: Error finding container 4ce4c9e5ab51bca31b748d8f76aeac9498af0805699598fb0913638a7dccb89b: Status 404 returned error can't find the container with id 4ce4c9e5ab51bca31b748d8f76aeac9498af0805699598fb0913638a7dccb89b Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.824543 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.953049 4707 generic.go:334] "Generic (PLEG): container finished" podID="488b22fe-c39f-42fb-8c0e-75917ac55fff" containerID="ac0ab72e2adc66130459decd935a239cd9d2d7fa50b371dc00565ad083fd50a9" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.953159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" event={"ID":"488b22fe-c39f-42fb-8c0e-75917ac55fff","Type":"ContainerDied","Data":"ac0ab72e2adc66130459decd935a239cd9d2d7fa50b371dc00565ad083fd50a9"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.953188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" event={"ID":"488b22fe-c39f-42fb-8c0e-75917ac55fff","Type":"ContainerStarted","Data":"0c8aeda8d734ee0692abec34c69599543cc07ba394f89f53ee046dea15740976"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956618 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956603 4707 generic.go:334] "Generic (PLEG): container finished" podID="12508707-9551-4635-b05c-4cd647137b44" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerDied","Data":"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerDied","Data":"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956780 4707 scope.go:117] "RemoveContainer" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956700 4707 generic.go:334] "Generic (PLEG): container finished" podID="12508707-9551-4635-b05c-4cd647137b44" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" exitCode=2 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956832 4707 generic.go:334] "Generic (PLEG): container finished" podID="12508707-9551-4635-b05c-4cd647137b44" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956842 4707 generic.go:334] "Generic (PLEG): container finished" podID="12508707-9551-4635-b05c-4cd647137b44" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.956886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerDied","Data":"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.957079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerDied","Data":"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.957419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"12508707-9551-4635-b05c-4cd647137b44","Type":"ContainerDied","Data":"ef0cf3d9fd8b50b71b349cc7ea33193792d80b90750be0ebdd88a4ddc87ada39"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.958772 4707 generic.go:334] "Generic (PLEG): container finished" podID="22546c4b-0348-445b-a781-c3a15ca2b4cd" containerID="07df91c2736baee7ae0dcb92b0586ed687e019bc1d2d0d55f1d53c627d3b7bb8" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.958862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" event={"ID":"22546c4b-0348-445b-a781-c3a15ca2b4cd","Type":"ContainerDied","Data":"07df91c2736baee7ae0dcb92b0586ed687e019bc1d2d0d55f1d53c627d3b7bb8"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.959306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" event={"ID":"22546c4b-0348-445b-a781-c3a15ca2b4cd","Type":"ContainerStarted","Data":"40b207d3d68af82d15fae20194e20f7f8956c349ccb733878869a6e959ce743c"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.960911 4707 generic.go:334] "Generic (PLEG): container finished" podID="618a68ea-1d06-4355-9d36-b0a833165b08" containerID="b73c003ff0dc595122eb4fcd8f2b3c3b46a88371d6bdfe451fb8a53df4e771d5" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.961001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" event={"ID":"618a68ea-1d06-4355-9d36-b0a833165b08","Type":"ContainerDied","Data":"b73c003ff0dc595122eb4fcd8f2b3c3b46a88371d6bdfe451fb8a53df4e771d5"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.961032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" event={"ID":"618a68ea-1d06-4355-9d36-b0a833165b08","Type":"ContainerStarted","Data":"8e3b04c5dba7cef54834de2f676a9a4d54f3e7e23f26f5165dd478c62141d646"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.963502 4707 generic.go:334] "Generic (PLEG): container finished" podID="175e861a-ef37-43bf-a299-41690c536532" containerID="38209e49c03af2916763782b9ab88941d2a2fce3d3f8e0a0e6eb99881b4f82b3" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.963543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" event={"ID":"175e861a-ef37-43bf-a299-41690c536532","Type":"ContainerDied","Data":"38209e49c03af2916763782b9ab88941d2a2fce3d3f8e0a0e6eb99881b4f82b3"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.968770 4707 generic.go:334] "Generic (PLEG): container finished" podID="99428af2-9c90-49ba-909e-486ce037a1ee" containerID="7ae822343165321e1d28599fa0b72e4f84a0e15193bcb3d2070a4af6ab58b85d" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.968803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" event={"ID":"99428af2-9c90-49ba-909e-486ce037a1ee","Type":"ContainerDied","Data":"7ae822343165321e1d28599fa0b72e4f84a0e15193bcb3d2070a4af6ab58b85d"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.968979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" event={"ID":"99428af2-9c90-49ba-909e-486ce037a1ee","Type":"ContainerStarted","Data":"4ce4c9e5ab51bca31b748d8f76aeac9498af0805699598fb0913638a7dccb89b"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.972696 4707 generic.go:334] "Generic (PLEG): container finished" podID="fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" containerID="c0e20ad0b54e06f9f898937e4cd6c15bb4efb7247d7665943402e873ebf065eb" exitCode=0 Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.972734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" event={"ID":"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f","Type":"ContainerDied","Data":"c0e20ad0b54e06f9f898937e4cd6c15bb4efb7247d7665943402e873ebf065eb"} Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.972754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" event={"ID":"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f","Type":"ContainerStarted","Data":"fa8fd628594cbbb3a46a55c2be7d29071355b689294e03d561ea739c5509a08a"} Jan 21 
15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.984996 4707 scope.go:117] "RemoveContainer" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spk8r\" (UniqueName: \"kubernetes.io/projected/12508707-9551-4635-b05c-4cd647137b44-kube-api-access-spk8r\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-sg-core-conf-yaml\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-log-httpd\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-run-httpd\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-scripts\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-combined-ca-bundle\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.988799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-config-data\") pod \"12508707-9551-4635-b05c-4cd647137b44\" (UID: \"12508707-9551-4635-b05c-4cd647137b44\") " Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.989603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:46 crc kubenswrapper[4707]: I0121 15:37:46.991072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.001489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-scripts" (OuterVolumeSpecName: "scripts") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.002029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12508707-9551-4635-b05c-4cd647137b44-kube-api-access-spk8r" (OuterVolumeSpecName: "kube-api-access-spk8r") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "kube-api-access-spk8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.048689 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.074262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.074532 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-log" containerID="cri-o://07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025" gracePeriod=30 Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.075011 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-httpd" containerID="cri-o://88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c" gracePeriod=30 Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.092313 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.092340 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.092351 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12508707-9551-4635-b05c-4cd647137b44-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.092359 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.092369 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spk8r\" (UniqueName: 
\"kubernetes.io/projected/12508707-9551-4635-b05c-4cd647137b44-kube-api-access-spk8r\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.107963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.125435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-config-data" (OuterVolumeSpecName: "config-data") pod "12508707-9551-4635-b05c-4cd647137b44" (UID: "12508707-9551-4635-b05c-4cd647137b44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.179967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.183561 4707 scope.go:117] "RemoveContainer" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.193464 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.193489 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12508707-9551-4635-b05c-4cd647137b44-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.203507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.245020 4707 scope.go:117] "RemoveContainer" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.284602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.290846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.296028 4707 scope.go:117] "RemoveContainer" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.296694 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": container with ID starting with 4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2 not found: ID does not exist" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.296731 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2"} err="failed to get container status 
\"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": rpc error: code = NotFound desc = could not find container \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": container with ID starting with 4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.296755 4707 scope.go:117] "RemoveContainer" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.297150 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": container with ID starting with 8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465 not found: ID does not exist" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.297252 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465"} err="failed to get container status \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": rpc error: code = NotFound desc = could not find container \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": container with ID starting with 8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.297325 4707 scope.go:117] "RemoveContainer" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.297767 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": container with ID starting with d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f not found: ID does not exist" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.297797 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f"} err="failed to get container status \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": rpc error: code = NotFound desc = could not find container \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": container with ID starting with d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.297842 4707 scope.go:117] "RemoveContainer" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.298065 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": container with ID starting with 5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c not found: ID does not exist" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.298090 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c"} err="failed to get container status \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": rpc error: code = NotFound desc = could not find container \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": container with ID starting with 5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.298111 4707 scope.go:117] "RemoveContainer" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.299007 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2"} err="failed to get container status \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": rpc error: code = NotFound desc = could not find container \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": container with ID starting with 4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.299030 4707 scope.go:117] "RemoveContainer" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.300093 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465"} err="failed to get container status \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": rpc error: code = NotFound desc = could not find container \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": container with ID starting with 8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.300115 4707 scope.go:117] "RemoveContainer" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.301618 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f"} err="failed to get container status \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": rpc error: code = NotFound desc = could not find container \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": container with ID starting with d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.301635 4707 scope.go:117] "RemoveContainer" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.303227 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c"} err="failed to get container status \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": rpc error: code = NotFound desc = could not find container \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": container with ID starting with 5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c 
not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.303266 4707 scope.go:117] "RemoveContainer" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.306619 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2"} err="failed to get container status \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": rpc error: code = NotFound desc = could not find container \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": container with ID starting with 4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.306660 4707 scope.go:117] "RemoveContainer" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.307051 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465"} err="failed to get container status \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": rpc error: code = NotFound desc = could not find container \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": container with ID starting with 8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.307073 4707 scope.go:117] "RemoveContainer" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.307316 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f"} err="failed to get container status \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": rpc error: code = NotFound desc = could not find container \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": container with ID starting with d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.307399 4707 scope.go:117] "RemoveContainer" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.307678 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c"} err="failed to get container status \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": rpc error: code = NotFound desc = could not find container \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": container with ID starting with 5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.307748 4707 scope.go:117] "RemoveContainer" containerID="4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308055 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2"} err="failed to get 
container status \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": rpc error: code = NotFound desc = could not find container \"4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2\": container with ID starting with 4b50b193afd5c5b2f44f459b5b276b330dea6272febc7fcdb95bad873aa7c8e2 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308079 4707 scope.go:117] "RemoveContainer" containerID="8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308344 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465"} err="failed to get container status \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": rpc error: code = NotFound desc = could not find container \"8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465\": container with ID starting with 8c42a033ec9a2cdcf4a6256d589adc003b7ac521552bdee0b7c5d344fff4c465 not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308392 4707 scope.go:117] "RemoveContainer" containerID="d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308578 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f"} err="failed to get container status \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": rpc error: code = NotFound desc = could not find container \"d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f\": container with ID starting with d88694fead5aef7a8486256931034791d3b5764e158099292077cb0db88e0f9f not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308597 4707 scope.go:117] "RemoveContainer" containerID="5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.308770 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c"} err="failed to get container status \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": rpc error: code = NotFound desc = could not find container \"5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c\": container with ID starting with 5f1f6b81140352bceba073bc6f0f07b4732fe9edacf154123bd2e6438b9f5a7c not found: ID does not exist" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313190 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.313576 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="sg-core" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313589 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="sg-core" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.313603 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-central-agent" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313609 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-central-agent" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.313618 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-notification-agent" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313623 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-notification-agent" Jan 21 15:37:47 crc kubenswrapper[4707]: E0121 15:37:47.313636 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="proxy-httpd" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313641 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="proxy-httpd" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313824 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="proxy-httpd" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313849 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="sg-core" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313864 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-central-agent" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.313874 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12508707-9551-4635-b05c-4cd647137b44" containerName="ceilometer-notification-agent" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.315395 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.321466 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.322522 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.325773 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.502505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-log-httpd\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.502595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-scripts\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.502618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.502785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.502963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-run-httpd\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.502995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.503087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhzv5\" (UniqueName: \"kubernetes.io/projected/5a08a1a1-52a9-4347-b062-154c4f8ca297-kube-api-access-rhzv5\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.605260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.605650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-log-httpd\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.605731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-scripts\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.606540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.606614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.606671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-run-httpd\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.606695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.606772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhzv5\" (UniqueName: \"kubernetes.io/projected/5a08a1a1-52a9-4347-b062-154c4f8ca297-kube-api-access-rhzv5\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.607412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-run-httpd\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.609870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.610202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-scripts\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.611669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.611687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.622014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhzv5\" (UniqueName: \"kubernetes.io/projected/5a08a1a1-52a9-4347-b062-154c4f8ca297-kube-api-access-rhzv5\") pod \"ceilometer-0\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.636276 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.899061 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.899634 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-log" containerID="cri-o://d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428" gracePeriod=30 Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.899710 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-httpd" containerID="cri-o://dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97" gracePeriod=30 Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.985201 4707 generic.go:334] "Generic (PLEG): container finished" podID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerID="07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025" exitCode=143 Jan 21 15:37:47 crc kubenswrapper[4707]: I0121 15:37:47.985527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9c93e649-01b5-4e89-8453-106a67bc11ba","Type":"ContainerDied","Data":"07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025"} Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.044689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.349648 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.486502 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.493723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.525392 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.527424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-operator-scripts\") pod \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.527923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlgss\" (UniqueName: \"kubernetes.io/projected/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-kube-api-access-qlgss\") pod \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\" (UID: \"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.527977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" (UID: "fefeb0e1-4aeb-4505-98e5-c7df15a5e06f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.530393 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.534298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-kube-api-access-qlgss" (OuterVolumeSpecName: "kube-api-access-qlgss") pod "fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" (UID: "fefeb0e1-4aeb-4505-98e5-c7df15a5e06f"). InnerVolumeSpecName "kube-api-access-qlgss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.540777 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.542053 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.617842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.636672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kt76\" (UniqueName: \"kubernetes.io/projected/175e861a-ef37-43bf-a299-41690c536532-kube-api-access-8kt76\") pod \"175e861a-ef37-43bf-a299-41690c536532\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.636780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnm5z\" (UniqueName: \"kubernetes.io/projected/22546c4b-0348-445b-a781-c3a15ca2b4cd-kube-api-access-wnm5z\") pod \"22546c4b-0348-445b-a781-c3a15ca2b4cd\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.636960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpl98\" (UniqueName: \"kubernetes.io/projected/618a68ea-1d06-4355-9d36-b0a833165b08-kube-api-access-vpl98\") pod \"618a68ea-1d06-4355-9d36-b0a833165b08\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.637021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/618a68ea-1d06-4355-9d36-b0a833165b08-operator-scripts\") pod \"618a68ea-1d06-4355-9d36-b0a833165b08\" (UID: \"618a68ea-1d06-4355-9d36-b0a833165b08\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.637070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/175e861a-ef37-43bf-a299-41690c536532-operator-scripts\") pod \"175e861a-ef37-43bf-a299-41690c536532\" (UID: \"175e861a-ef37-43bf-a299-41690c536532\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.637171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22546c4b-0348-445b-a781-c3a15ca2b4cd-operator-scripts\") pod \"22546c4b-0348-445b-a781-c3a15ca2b4cd\" (UID: \"22546c4b-0348-445b-a781-c3a15ca2b4cd\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.638161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618a68ea-1d06-4355-9d36-b0a833165b08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "618a68ea-1d06-4355-9d36-b0a833165b08" (UID: "618a68ea-1d06-4355-9d36-b0a833165b08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.638706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22546c4b-0348-445b-a781-c3a15ca2b4cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22546c4b-0348-445b-a781-c3a15ca2b4cd" (UID: "22546c4b-0348-445b-a781-c3a15ca2b4cd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.639206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/175e861a-ef37-43bf-a299-41690c536532-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "175e861a-ef37-43bf-a299-41690c536532" (UID: "175e861a-ef37-43bf-a299-41690c536532"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.641354 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/618a68ea-1d06-4355-9d36-b0a833165b08-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.641383 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/175e861a-ef37-43bf-a299-41690c536532-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.641394 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22546c4b-0348-445b-a781-c3a15ca2b4cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.641406 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlgss\" (UniqueName: \"kubernetes.io/projected/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f-kube-api-access-qlgss\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.642055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618a68ea-1d06-4355-9d36-b0a833165b08-kube-api-access-vpl98" (OuterVolumeSpecName: "kube-api-access-vpl98") pod "618a68ea-1d06-4355-9d36-b0a833165b08" (UID: "618a68ea-1d06-4355-9d36-b0a833165b08"). InnerVolumeSpecName "kube-api-access-vpl98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.644013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22546c4b-0348-445b-a781-c3a15ca2b4cd-kube-api-access-wnm5z" (OuterVolumeSpecName: "kube-api-access-wnm5z") pod "22546c4b-0348-445b-a781-c3a15ca2b4cd" (UID: "22546c4b-0348-445b-a781-c3a15ca2b4cd"). InnerVolumeSpecName "kube-api-access-wnm5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.644831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175e861a-ef37-43bf-a299-41690c536532-kube-api-access-8kt76" (OuterVolumeSpecName: "kube-api-access-8kt76") pod "175e861a-ef37-43bf-a299-41690c536532" (UID: "175e861a-ef37-43bf-a299-41690c536532"). InnerVolumeSpecName "kube-api-access-8kt76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.742794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdtxt\" (UniqueName: \"kubernetes.io/projected/488b22fe-c39f-42fb-8c0e-75917ac55fff-kube-api-access-cdtxt\") pod \"488b22fe-c39f-42fb-8c0e-75917ac55fff\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.742947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/488b22fe-c39f-42fb-8c0e-75917ac55fff-operator-scripts\") pod \"488b22fe-c39f-42fb-8c0e-75917ac55fff\" (UID: \"488b22fe-c39f-42fb-8c0e-75917ac55fff\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.743001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99428af2-9c90-49ba-909e-486ce037a1ee-operator-scripts\") pod \"99428af2-9c90-49ba-909e-486ce037a1ee\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.743024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z9fz\" (UniqueName: \"kubernetes.io/projected/99428af2-9c90-49ba-909e-486ce037a1ee-kube-api-access-6z9fz\") pod \"99428af2-9c90-49ba-909e-486ce037a1ee\" (UID: \"99428af2-9c90-49ba-909e-486ce037a1ee\") " Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.743412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/488b22fe-c39f-42fb-8c0e-75917ac55fff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "488b22fe-c39f-42fb-8c0e-75917ac55fff" (UID: "488b22fe-c39f-42fb-8c0e-75917ac55fff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.743660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99428af2-9c90-49ba-909e-486ce037a1ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99428af2-9c90-49ba-909e-486ce037a1ee" (UID: "99428af2-9c90-49ba-909e-486ce037a1ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.745114 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/488b22fe-c39f-42fb-8c0e-75917ac55fff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.745155 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99428af2-9c90-49ba-909e-486ce037a1ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.745169 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kt76\" (UniqueName: \"kubernetes.io/projected/175e861a-ef37-43bf-a299-41690c536532-kube-api-access-8kt76\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.745181 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnm5z\" (UniqueName: \"kubernetes.io/projected/22546c4b-0348-445b-a781-c3a15ca2b4cd-kube-api-access-wnm5z\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.745191 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpl98\" (UniqueName: \"kubernetes.io/projected/618a68ea-1d06-4355-9d36-b0a833165b08-kube-api-access-vpl98\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.750143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99428af2-9c90-49ba-909e-486ce037a1ee-kube-api-access-6z9fz" (OuterVolumeSpecName: "kube-api-access-6z9fz") pod "99428af2-9c90-49ba-909e-486ce037a1ee" (UID: "99428af2-9c90-49ba-909e-486ce037a1ee"). InnerVolumeSpecName "kube-api-access-6z9fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.757929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488b22fe-c39f-42fb-8c0e-75917ac55fff-kube-api-access-cdtxt" (OuterVolumeSpecName: "kube-api-access-cdtxt") pod "488b22fe-c39f-42fb-8c0e-75917ac55fff" (UID: "488b22fe-c39f-42fb-8c0e-75917ac55fff"). InnerVolumeSpecName "kube-api-access-cdtxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.847485 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdtxt\" (UniqueName: \"kubernetes.io/projected/488b22fe-c39f-42fb-8c0e-75917ac55fff-kube-api-access-cdtxt\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.847798 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z9fz\" (UniqueName: \"kubernetes.io/projected/99428af2-9c90-49ba-909e-486ce037a1ee-kube-api-access-6z9fz\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.996520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerStarted","Data":"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e"} Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.996573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerStarted","Data":"1b1f5e707d4f67ecf4a74d93d25344e96494ee3b000f81eaafbeff7d28d1d6cd"} Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.998362 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerID="d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428" exitCode=143 Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.998412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"bf3d161c-7bfd-42f7-90c1-63d92df34424","Type":"ContainerDied","Data":"d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428"} Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.999911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" event={"ID":"175e861a-ef37-43bf-a299-41690c536532","Type":"ContainerDied","Data":"319e9d3f2c2a9fa962cdf26b5e7d6f2427369e04051d76a53a433b4b8091c201"} Jan 21 15:37:48 crc kubenswrapper[4707]: I0121 15:37:48.999937 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="319e9d3f2c2a9fa962cdf26b5e7d6f2427369e04051d76a53a433b4b8091c201" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.000010 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rffn8" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.005756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" event={"ID":"99428af2-9c90-49ba-909e-486ce037a1ee","Type":"ContainerDied","Data":"4ce4c9e5ab51bca31b748d8f76aeac9498af0805699598fb0913638a7dccb89b"} Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.005796 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce4c9e5ab51bca31b748d8f76aeac9498af0805699598fb0913638a7dccb89b" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.005887 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.009278 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.009280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-9df45" event={"ID":"fefeb0e1-4aeb-4505-98e5-c7df15a5e06f","Type":"ContainerDied","Data":"fa8fd628594cbbb3a46a55c2be7d29071355b689294e03d561ea739c5509a08a"} Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.009327 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa8fd628594cbbb3a46a55c2be7d29071355b689294e03d561ea739c5509a08a" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.010879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" event={"ID":"488b22fe-c39f-42fb-8c0e-75917ac55fff","Type":"ContainerDied","Data":"0c8aeda8d734ee0692abec34c69599543cc07ba394f89f53ee046dea15740976"} Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.010908 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c8aeda8d734ee0692abec34c69599543cc07ba394f89f53ee046dea15740976" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.010914 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.012539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" event={"ID":"22546c4b-0348-445b-a781-c3a15ca2b4cd","Type":"ContainerDied","Data":"40b207d3d68af82d15fae20194e20f7f8956c349ccb733878869a6e959ce743c"} Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.012565 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40b207d3d68af82d15fae20194e20f7f8956c349ccb733878869a6e959ce743c" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.012586 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.013846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" event={"ID":"618a68ea-1d06-4355-9d36-b0a833165b08","Type":"ContainerDied","Data":"8e3b04c5dba7cef54834de2f676a9a4d54f3e7e23f26f5165dd478c62141d646"} Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.013865 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3b04c5dba7cef54834de2f676a9a4d54f3e7e23f26f5165dd478c62141d646" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.013906 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-c94hb" Jan 21 15:37:49 crc kubenswrapper[4707]: I0121 15:37:49.195126 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12508707-9551-4635-b05c-4cd647137b44" path="/var/lib/kubelet/pods/12508707-9551-4635-b05c-4cd647137b44/volumes" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.023514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerStarted","Data":"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3"} Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.578513 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk"] Jan 21 15:37:50 crc kubenswrapper[4707]: E0121 15:37:50.579600 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99428af2-9c90-49ba-909e-486ce037a1ee" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579624 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99428af2-9c90-49ba-909e-486ce037a1ee" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: E0121 15:37:50.579651 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175e861a-ef37-43bf-a299-41690c536532" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579657 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="175e861a-ef37-43bf-a299-41690c536532" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: E0121 15:37:50.579672 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488b22fe-c39f-42fb-8c0e-75917ac55fff" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579679 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="488b22fe-c39f-42fb-8c0e-75917ac55fff" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: E0121 15:37:50.579693 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579698 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: E0121 15:37:50.579724 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618a68ea-1d06-4355-9d36-b0a833165b08" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579729 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="618a68ea-1d06-4355-9d36-b0a833165b08" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: E0121 15:37:50.579743 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22546c4b-0348-445b-a781-c3a15ca2b4cd" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579750 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22546c4b-0348-445b-a781-c3a15ca2b4cd" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579979 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" containerName="mariadb-database-create" Jan 21 
15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.579991 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="175e861a-ef37-43bf-a299-41690c536532" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.580005 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="618a68ea-1d06-4355-9d36-b0a833165b08" containerName="mariadb-database-create" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.580022 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22546c4b-0348-445b-a781-c3a15ca2b4cd" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.580038 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="99428af2-9c90-49ba-909e-486ce037a1ee" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.580044 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="488b22fe-c39f-42fb-8c0e-75917ac55fff" containerName="mariadb-account-create-update" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.589904 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.592592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rhjn8" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.592890 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.593053 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.610167 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk"] Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.663630 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.696711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.697083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-config-data\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.697211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-scripts\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.697329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vnhx\" (UniqueName: \"kubernetes.io/projected/97e5f6a3-0b0f-4d27-b60c-54585edca724-kube-api-access-8vnhx\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24789\" (UniqueName: \"kubernetes.io/projected/9c93e649-01b5-4e89-8453-106a67bc11ba-kube-api-access-24789\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-config-data\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-public-tls-certs\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-httpd-run\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc 
kubenswrapper[4707]: I0121 15:37:50.799576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-logs\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-combined-ca-bundle\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.799713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-scripts\") pod \"9c93e649-01b5-4e89-8453-106a67bc11ba\" (UID: \"9c93e649-01b5-4e89-8453-106a67bc11ba\") " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.800175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-logs" (OuterVolumeSpecName: "logs") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.800329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.800621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-config-data\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.800629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.800831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-scripts\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.800940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vnhx\" (UniqueName: \"kubernetes.io/projected/97e5f6a3-0b0f-4d27-b60c-54585edca724-kube-api-access-8vnhx\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.801227 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.801253 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c93e649-01b5-4e89-8453-106a67bc11ba-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.805899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-scripts" (OuterVolumeSpecName: "scripts") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.813129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.816656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-config-data\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.855040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c93e649-01b5-4e89-8453-106a67bc11ba-kube-api-access-24789" (OuterVolumeSpecName: "kube-api-access-24789") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "kube-api-access-24789". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.872691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-scripts\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.873285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.880584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vnhx\" (UniqueName: \"kubernetes.io/projected/97e5f6a3-0b0f-4d27-b60c-54585edca724-kube-api-access-8vnhx\") pod \"nova-cell0-conductor-db-sync-wv2tk\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.912496 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24789\" (UniqueName: \"kubernetes.io/projected/9c93e649-01b5-4e89-8453-106a67bc11ba-kube-api-access-24789\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.912553 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.912566 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.964911 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.965591 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:37:50 crc kubenswrapper[4707]: I0121 15:37:50.970110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.014188 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.014297 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.024967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-config-data" (OuterVolumeSpecName: "config-data") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.026173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c93e649-01b5-4e89-8453-106a67bc11ba" (UID: "9c93e649-01b5-4e89-8453-106a67bc11ba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.088994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerStarted","Data":"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d"} Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.095177 4707 generic.go:334] "Generic (PLEG): container finished" podID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerID="88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c" exitCode=0 Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.095254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9c93e649-01b5-4e89-8453-106a67bc11ba","Type":"ContainerDied","Data":"88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c"} Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.095291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9c93e649-01b5-4e89-8453-106a67bc11ba","Type":"ContainerDied","Data":"c60ba3771db87f16ea472576fcc37d1a1da884134d536b89457156ca088661a3"} Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.095313 4707 scope.go:117] "RemoveContainer" containerID="88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.095257 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.116457 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.116493 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c93e649-01b5-4e89-8453-106a67bc11ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.137184 4707 scope.go:117] "RemoveContainer" containerID="07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.153847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.186164 4707 scope.go:117] "RemoveContainer" containerID="88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c" Jan 21 15:37:51 crc kubenswrapper[4707]: E0121 15:37:51.187518 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c\": container with ID starting with 88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c not found: ID does not exist" containerID="88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.187571 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c"} err="failed to get container status \"88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c\": rpc error: code = NotFound desc = could not find container \"88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c\": container with ID starting with 88631e6b0342445ae61712c371a76539423b29b19940d23044fddfd3d230fc8c not found: ID does not exist" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.187602 4707 scope.go:117] "RemoveContainer" containerID="07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025" Jan 21 15:37:51 crc kubenswrapper[4707]: E0121 15:37:51.188092 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025\": container with ID starting with 07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025 not found: ID does not exist" containerID="07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.188139 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025"} err="failed to get container status \"07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025\": rpc error: code = NotFound desc = could not find container \"07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025\": container with ID starting with 07db5fa261612dad0bd445c8ac2301ddca6a80a324b3ae640ae6835394003025 not found: ID does not exist" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.213395 4707 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.236600 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:37:51 crc kubenswrapper[4707]: E0121 15:37:51.239340 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-httpd" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.239364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-httpd" Jan 21 15:37:51 crc kubenswrapper[4707]: E0121 15:37:51.239387 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-log" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.239394 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-log" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.241200 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-log" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.241223 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" containerName="glance-httpd" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.245426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.246642 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.249615 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.250133 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-logs\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdr8\" (UniqueName: \"kubernetes.io/projected/3eec9e69-0f85-4cc7-8fad-020a027dedcc-kube-api-access-2pdr8\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-config-data\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.448535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-scripts\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.539784 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk"] Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdr8\" (UniqueName: \"kubernetes.io/projected/3eec9e69-0f85-4cc7-8fad-020a027dedcc-kube-api-access-2pdr8\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-config-data\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551699 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-scripts\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.551968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.552116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-logs\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.552699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.553014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-logs\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.557346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-scripts\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.557622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-config-data\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.560425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.562113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.572092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdr8\" (UniqueName: \"kubernetes.io/projected/3eec9e69-0f85-4cc7-8fad-020a027dedcc-kube-api-access-2pdr8\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.586942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.656445 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-config-data\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-776p9\" (UniqueName: \"kubernetes.io/projected/bf3d161c-7bfd-42f7-90c1-63d92df34424-kube-api-access-776p9\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-httpd-run\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-scripts\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-internal-tls-certs\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-combined-ca-bundle\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.755611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-logs\") pod \"bf3d161c-7bfd-42f7-90c1-63d92df34424\" (UID: \"bf3d161c-7bfd-42f7-90c1-63d92df34424\") " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.756340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-logs" (OuterVolumeSpecName: "logs") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.757427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.760479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3d161c-7bfd-42f7-90c1-63d92df34424-kube-api-access-776p9" (OuterVolumeSpecName: "kube-api-access-776p9") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "kube-api-access-776p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.760973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-scripts" (OuterVolumeSpecName: "scripts") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.760973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.791393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.808083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-config-data" (OuterVolumeSpecName: "config-data") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.813995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bf3d161c-7bfd-42f7-90c1-63d92df34424" (UID: "bf3d161c-7bfd-42f7-90c1-63d92df34424"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859648 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859684 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-776p9\" (UniqueName: \"kubernetes.io/projected/bf3d161c-7bfd-42f7-90c1-63d92df34424-kube-api-access-776p9\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859698 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859708 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859719 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859755 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859764 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3d161c-7bfd-42f7-90c1-63d92df34424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.859773 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3d161c-7bfd-42f7-90c1-63d92df34424-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.881780 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.888288 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:37:51 crc kubenswrapper[4707]: I0121 15:37:51.962157 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.118651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" event={"ID":"97e5f6a3-0b0f-4d27-b60c-54585edca724","Type":"ContainerStarted","Data":"1a038bb09ac026e0ff6f452079a322deba5bbb2f6d988725aead5c0eee7ae3dd"} Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.118713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" event={"ID":"97e5f6a3-0b0f-4d27-b60c-54585edca724","Type":"ContainerStarted","Data":"54fd55d89b3589741330b3e8d84440028521fa71c60696c80e1207e0a7f85397"} Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.124134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerStarted","Data":"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c"} Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.124311 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-central-agent" containerID="cri-o://4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" gracePeriod=30 Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.124443 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.124500 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="proxy-httpd" containerID="cri-o://fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" gracePeriod=30 Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.124549 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="sg-core" containerID="cri-o://ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" gracePeriod=30 Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.124579 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-notification-agent" containerID="cri-o://16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" gracePeriod=30 Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.141620 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" podStartSLOduration=2.141588427 podStartE2EDuration="2.141588427s" podCreationTimestamp="2026-01-21 15:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:52.133523051 +0000 UTC m=+2169.315039273" watchObservedRunningTime="2026-01-21 15:37:52.141588427 +0000 UTC m=+2169.323104648" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.142153 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerID="dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97" exitCode=0 Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.142205 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.142218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"bf3d161c-7bfd-42f7-90c1-63d92df34424","Type":"ContainerDied","Data":"dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97"} Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.142269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"bf3d161c-7bfd-42f7-90c1-63d92df34424","Type":"ContainerDied","Data":"1f8a968c4580174171f2b3afbda30e29b044bb570b2e623982f61eecce36eb2a"} Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.142288 4707 scope.go:117] "RemoveContainer" containerID="dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.171566 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.511928196 podStartE2EDuration="5.17154922s" podCreationTimestamp="2026-01-21 15:37:47 +0000 UTC" firstStartedPulling="2026-01-21 15:37:48.086931591 +0000 UTC m=+2165.268447813" lastFinishedPulling="2026-01-21 15:37:51.746552616 +0000 UTC m=+2168.928068837" observedRunningTime="2026-01-21 15:37:52.154526403 +0000 UTC m=+2169.336042625" watchObservedRunningTime="2026-01-21 15:37:52.17154922 +0000 UTC m=+2169.353065443" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.179502 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.194120 4707 scope.go:117] "RemoveContainer" containerID="d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.219698 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.221943 4707 scope.go:117] "RemoveContainer" containerID="dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97" Jan 21 15:37:52 crc kubenswrapper[4707]: E0121 15:37:52.222294 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97\": container with ID starting with dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97 not found: ID does not exist" containerID="dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.222360 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97"} err="failed to get 
container status \"dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97\": rpc error: code = NotFound desc = could not find container \"dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97\": container with ID starting with dd414b4a113295d5fe63e8f898d76ebd76f9cb381928440e88867f268485ea97 not found: ID does not exist" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.222379 4707 scope.go:117] "RemoveContainer" containerID="d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428" Jan 21 15:37:52 crc kubenswrapper[4707]: E0121 15:37:52.222718 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428\": container with ID starting with d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428 not found: ID does not exist" containerID="d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.222785 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428"} err="failed to get container status \"d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428\": rpc error: code = NotFound desc = could not find container \"d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428\": container with ID starting with d66af8db7fc9ef9a6d33a4a7cc8de4f56b14f621631ca9eb570dfd1fef787428 not found: ID does not exist" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.228281 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:37:52 crc kubenswrapper[4707]: E0121 15:37:52.228736 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-httpd" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.228757 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-httpd" Jan 21 15:37:52 crc kubenswrapper[4707]: E0121 15:37:52.228774 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-log" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.228780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-log" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.229050 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-log" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.229086 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" containerName="glance-httpd" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.230114 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.235627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.235777 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.270758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.318245 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.370065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrzd\" (UniqueName: \"kubernetes.io/projected/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-kube-api-access-fxrzd\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.370110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.370143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.370218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.370264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.370315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.371108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.371454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrzd\" (UniqueName: \"kubernetes.io/projected/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-kube-api-access-fxrzd\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.473761 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.474088 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.474223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.474356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.480437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.482474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.483021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.486490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.490046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrzd\" (UniqueName: \"kubernetes.io/projected/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-kube-api-access-fxrzd\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.504100 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.598998 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.823803 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhzv5\" (UniqueName: \"kubernetes.io/projected/5a08a1a1-52a9-4347-b062-154c4f8ca297-kube-api-access-rhzv5\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-run-httpd\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-sg-core-conf-yaml\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-combined-ca-bundle\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-scripts\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.982667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-log-httpd\") pod \"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.984095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:52 crc kubenswrapper[4707]: I0121 15:37:52.984616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.005358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-scripts" (OuterVolumeSpecName: "scripts") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.006226 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a08a1a1-52a9-4347-b062-154c4f8ca297-kube-api-access-rhzv5" (OuterVolumeSpecName: "kube-api-access-rhzv5") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "kube-api-access-rhzv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.014786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.055471 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:37:53 crc kubenswrapper[4707]: W0121 15:37:53.059961 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf120fd0_7d2d_4f6e_80d9_523c4432c9db.slice/crio-6b89b54991d146634bc6a60926a26545aaf622fdd133dbd7721b113c06b3b0f7 WatchSource:0}: Error finding container 6b89b54991d146634bc6a60926a26545aaf622fdd133dbd7721b113c06b3b0f7: Status 404 returned error can't find the container with id 6b89b54991d146634bc6a60926a26545aaf622fdd133dbd7721b113c06b3b0f7 Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.090762 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.090802 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.090830 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.090839 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhzv5\" (UniqueName: \"kubernetes.io/projected/5a08a1a1-52a9-4347-b062-154c4f8ca297-kube-api-access-rhzv5\") 
on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.090851 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a08a1a1-52a9-4347-b062-154c4f8ca297-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.096774 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data podName:5a08a1a1-52a9-4347-b062-154c4f8ca297 nodeName:}" failed. No retries permitted until 2026-01-21 15:37:53.596750191 +0000 UTC m=+2170.778266413 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297") : error deleting /var/lib/kubelet/pods/5a08a1a1-52a9-4347-b062-154c4f8ca297/volume-subpaths: remove /var/lib/kubelet/pods/5a08a1a1-52a9-4347-b062-154c4f8ca297/volume-subpaths: no such file or directory Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.099401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.155165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3eec9e69-0f85-4cc7-8fad-020a027dedcc","Type":"ContainerStarted","Data":"cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.156080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3eec9e69-0f85-4cc7-8fad-020a027dedcc","Type":"ContainerStarted","Data":"5fbde92505ad39b5b2fadf98a0505dce037831c5392ac079487c41f121cc2c83"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.157350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"cf120fd0-7d2d-4f6e-80d9-523c4432c9db","Type":"ContainerStarted","Data":"6b89b54991d146634bc6a60926a26545aaf622fdd133dbd7721b113c06b3b0f7"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160201 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" exitCode=0 Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerDied","Data":"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerDied","Data":"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160347 4707 scope.go:117] "RemoveContainer" 
containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160349 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160288 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" exitCode=2 Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160428 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" exitCode=0 Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160442 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" exitCode=0 Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerDied","Data":"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerDied","Data":"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.160485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5a08a1a1-52a9-4347-b062-154c4f8ca297","Type":"ContainerDied","Data":"1b1f5e707d4f67ecf4a74d93d25344e96494ee3b000f81eaafbeff7d28d1d6cd"} Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.193832 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.216674 4707 scope.go:117] "RemoveContainer" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.219451 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c93e649-01b5-4e89-8453-106a67bc11ba" path="/var/lib/kubelet/pods/9c93e649-01b5-4e89-8453-106a67bc11ba/volumes" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.222010 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3d161c-7bfd-42f7-90c1-63d92df34424" path="/var/lib/kubelet/pods/bf3d161c-7bfd-42f7-90c1-63d92df34424/volumes" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.255890 4707 scope.go:117] "RemoveContainer" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.277860 4707 scope.go:117] "RemoveContainer" containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.318179 4707 scope.go:117] "RemoveContainer" containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.318639 4707 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": container with ID starting with fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c not found: ID does not exist" containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.318684 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c"} err="failed to get container status \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": rpc error: code = NotFound desc = could not find container \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": container with ID starting with fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.318709 4707 scope.go:117] "RemoveContainer" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.319436 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": container with ID starting with ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d not found: ID does not exist" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.319467 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d"} err="failed to get container status \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": rpc error: code = NotFound desc = could not find container \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": container with ID starting with ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.319487 4707 scope.go:117] "RemoveContainer" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.319792 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": container with ID starting with 16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3 not found: ID does not exist" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.319838 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3"} err="failed to get container status \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": rpc error: code = NotFound desc = could not find container \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": container with ID starting with 16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3 not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.319861 4707 scope.go:117] "RemoveContainer" 
containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.320086 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": container with ID starting with 4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e not found: ID does not exist" containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.320120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e"} err="failed to get container status \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": rpc error: code = NotFound desc = could not find container \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": container with ID starting with 4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.320134 4707 scope.go:117] "RemoveContainer" containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.320344 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c"} err="failed to get container status \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": rpc error: code = NotFound desc = could not find container \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": container with ID starting with fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.320371 4707 scope.go:117] "RemoveContainer" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.321053 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d"} err="failed to get container status \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": rpc error: code = NotFound desc = could not find container \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": container with ID starting with ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.321077 4707 scope.go:117] "RemoveContainer" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.321388 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3"} err="failed to get container status \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": rpc error: code = NotFound desc = could not find container \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": container with ID starting with 16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3 not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.321407 4707 scope.go:117] "RemoveContainer" 
containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.321671 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e"} err="failed to get container status \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": rpc error: code = NotFound desc = could not find container \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": container with ID starting with 4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.321691 4707 scope.go:117] "RemoveContainer" containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.322123 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c"} err="failed to get container status \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": rpc error: code = NotFound desc = could not find container \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": container with ID starting with fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.322170 4707 scope.go:117] "RemoveContainer" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.322613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d"} err="failed to get container status \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": rpc error: code = NotFound desc = could not find container \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": container with ID starting with ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.322639 4707 scope.go:117] "RemoveContainer" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.322917 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3"} err="failed to get container status \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": rpc error: code = NotFound desc = could not find container \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": container with ID starting with 16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3 not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.322937 4707 scope.go:117] "RemoveContainer" containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.323339 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e"} err="failed to get container status \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": rpc error: code = NotFound desc = could not find 
container \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": container with ID starting with 4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.323366 4707 scope.go:117] "RemoveContainer" containerID="fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.323748 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c"} err="failed to get container status \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": rpc error: code = NotFound desc = could not find container \"fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c\": container with ID starting with fbce5c9ce58e28dc3fd515e6eecf59b7b7e934f8cff095ef5a447b0dd0868d8c not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.323769 4707 scope.go:117] "RemoveContainer" containerID="ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.324110 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d"} err="failed to get container status \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": rpc error: code = NotFound desc = could not find container \"ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d\": container with ID starting with ac42ef4ff0ef8ad548b3e126b8164a9eed244e6c20d80dbdc777efb9015f8e7d not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.324136 4707 scope.go:117] "RemoveContainer" containerID="16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.324414 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3"} err="failed to get container status \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": rpc error: code = NotFound desc = could not find container \"16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3\": container with ID starting with 16598730a5dc2606ee2f8c0e22018764f8d62b46773fdf4dd8945397d4280cb3 not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.324435 4707 scope.go:117] "RemoveContainer" containerID="4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.324708 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e"} err="failed to get container status \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": rpc error: code = NotFound desc = could not find container \"4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e\": container with ID starting with 4a782fe1c48f3a69e3ac71ff90982ddcdcc210cd7e09eec37b0d3ad78b805e8e not found: ID does not exist" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.600910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data\") pod 
\"5a08a1a1-52a9-4347-b062-154c4f8ca297\" (UID: \"5a08a1a1-52a9-4347-b062-154c4f8ca297\") " Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.604083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data" (OuterVolumeSpecName: "config-data") pod "5a08a1a1-52a9-4347-b062-154c4f8ca297" (UID: "5a08a1a1-52a9-4347-b062-154c4f8ca297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.703687 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a08a1a1-52a9-4347-b062-154c4f8ca297-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.798788 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.811058 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.819385 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.819988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="sg-core" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820011 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="sg-core" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.820024 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-notification-agent" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820031 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-notification-agent" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.820046 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="proxy-httpd" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="proxy-httpd" Jan 21 15:37:53 crc kubenswrapper[4707]: E0121 15:37:53.820062 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-central-agent" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-central-agent" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820321 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-central-agent" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820342 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="sg-core" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820354 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="proxy-httpd" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.820372 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" containerName="ceilometer-notification-agent" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.822455 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.824052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.827017 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.827279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.911563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-log-httpd\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.912077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-config-data\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.912250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.912345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-run-httpd\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.912414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmd68\" (UniqueName: \"kubernetes.io/projected/3b52c25b-eb0f-4532-afe4-996f8341aa54-kube-api-access-kmd68\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.912517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:53 crc kubenswrapper[4707]: I0121 15:37:53.912604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-scripts\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc 
kubenswrapper[4707]: I0121 15:37:54.014185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-log-httpd\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-config-data\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-run-httpd\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmd68\" (UniqueName: \"kubernetes.io/projected/3b52c25b-eb0f-4532-afe4-996f8341aa54-kube-api-access-kmd68\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-log-httpd\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.014735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-scripts\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.015197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-run-httpd\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.020395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.021462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.022772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-scripts\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.022966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-config-data\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.030526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmd68\" (UniqueName: \"kubernetes.io/projected/3b52c25b-eb0f-4532-afe4-996f8341aa54-kube-api-access-kmd68\") pod \"ceilometer-0\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.149354 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.174599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"cf120fd0-7d2d-4f6e-80d9-523c4432c9db","Type":"ContainerStarted","Data":"872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8"} Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.175396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"cf120fd0-7d2d-4f6e-80d9-523c4432c9db","Type":"ContainerStarted","Data":"93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a"} Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.184005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3eec9e69-0f85-4cc7-8fad-020a027dedcc","Type":"ContainerStarted","Data":"0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d"} Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.199106 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.199071292 podStartE2EDuration="2.199071292s" podCreationTimestamp="2026-01-21 15:37:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:54.192012771 +0000 UTC m=+2171.373528993" watchObservedRunningTime="2026-01-21 15:37:54.199071292 +0000 UTC m=+2171.380587514" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.225663 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.225648294 podStartE2EDuration="3.225648294s" 
podCreationTimestamp="2026-01-21 15:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:37:54.219451973 +0000 UTC m=+2171.400968205" watchObservedRunningTime="2026-01-21 15:37:54.225648294 +0000 UTC m=+2171.407164516" Jan 21 15:37:54 crc kubenswrapper[4707]: I0121 15:37:54.587152 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.192467 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a08a1a1-52a9-4347-b062-154c4f8ca297" path="/var/lib/kubelet/pods/5a08a1a1-52a9-4347-b062-154c4f8ca297/volumes" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.197639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerStarted","Data":"518e01edb777cb23c060d5cf8939db320de9dd98692d6cd73b972be784b65eb4"} Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.614246 4707 scope.go:117] "RemoveContainer" containerID="af3f3dd74a9c8d48aa97c107bfa8b058c3e1d0de4496d824674dc535fe8ed785" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.647481 4707 scope.go:117] "RemoveContainer" containerID="28fc022c66807df1e793d0893ff1d6a282fe23d532bc5ee33036ce44ac8e5248" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.677615 4707 scope.go:117] "RemoveContainer" containerID="853a930f49c1f2edf2a101608783efd9deb9725c99a7dea6d7832faaaf26cc05" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.717583 4707 scope.go:117] "RemoveContainer" containerID="37412d7e5fd2bcf3a43d5fa7b581590cb3a673423d13c7149b4e26b6f815c70f" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.745844 4707 scope.go:117] "RemoveContainer" containerID="e13b9cba489637d49d8262a8ecc2efb194e4e6f29440a9c7831758ea2e38f05e" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.764619 4707 scope.go:117] "RemoveContainer" containerID="4c572740964e4167e045c95d8fd1045f5bdbeb75371d01f2192b08d69680f85b" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.785885 4707 scope.go:117] "RemoveContainer" containerID="d769b7a269a2696c705c404c162be86ce7e930410be6def7eb2ac8306f38854a" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.820972 4707 scope.go:117] "RemoveContainer" containerID="806e8dc586e5d2329e8f32eec444b83190a7950e3e3032a6311dac6fef009c29" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.865397 4707 scope.go:117] "RemoveContainer" containerID="5bedc67db4a2e84d7e5a7540beeaab9c861fcf915bd616f51818ad66fc63504e" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.893095 4707 scope.go:117] "RemoveContainer" containerID="9314272c08eb3f5ed1dc4224ebb999ae74acb5f87849d8be469e0d3cd8c9eb28" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.921266 4707 scope.go:117] "RemoveContainer" containerID="97d6d731114606325ecfbbec11eeb91fce426d49196a794390a7e9a4c8b3edd5" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.946195 4707 scope.go:117] "RemoveContainer" containerID="1fe533f65e70741554fde5962a4585ba7bcb52018aab459c4513cf5fe868f95b" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.975968 4707 scope.go:117] "RemoveContainer" containerID="c0fe7f58e8940e18c6eb0ada86b5033201da9ecf5cc7b720a5e7ba73cfd6bb9f" Jan 21 15:37:55 crc kubenswrapper[4707]: I0121 15:37:55.995492 4707 scope.go:117] "RemoveContainer" containerID="aa0e78e27cfa36ab1db2cdd44a88c11e0e7e739c2624edfaec77ac05d1ae9737" Jan 
21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.039483 4707 scope.go:117] "RemoveContainer" containerID="d554013233633c782741923e73a8d8723a0219cb7dc1c3c0326bf28aba43877a" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.061857 4707 scope.go:117] "RemoveContainer" containerID="0a6802cdbd1648fbd84c2134028f81c3fbdce8d1f1ef02e2484c6860d637820d" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.083563 4707 scope.go:117] "RemoveContainer" containerID="9f4c9093ac2066cbb5d4e1f9ec419fe00db495ed354586dc8985b8d701102d20" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.168038 4707 scope.go:117] "RemoveContainer" containerID="ff17dfb054823684830a7337448f62fc48f01c6fe76d2357e00911976e1b3f4e" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.219773 4707 scope.go:117] "RemoveContainer" containerID="f0d68dbc7cf9b3d8e58f408cefe224a119cce64c438f26782bacec119e7d78d0" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.237017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerStarted","Data":"354a1a4aabec21a6c966713249fb0bd25edfbc6a68a8c530d22229c21bb66a16"} Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.237070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerStarted","Data":"2ae953b2f228aae90d87d90141a6d0db816651ea62dfff399dc2fe9f8628851d"} Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.276613 4707 scope.go:117] "RemoveContainer" containerID="3b5c736d9b656e958f76a45f5a50a7625d90c3cbb7ffb20133329c0ca0bb0e57" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.336541 4707 scope.go:117] "RemoveContainer" containerID="d63fd0beea1df3c3498f461cec49c9f5f0eccbac73e86aa32c9f464f4767b393" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.385310 4707 scope.go:117] "RemoveContainer" containerID="7338268f8d3bd6f4dbade36699445b162a93a58535d079d9e6e3ae94779deb2b" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.411431 4707 scope.go:117] "RemoveContainer" containerID="c11a387cd3a3b4d0292ec4d1ea2b8b17f7cb347c49dd9f6cb4898b7d8632656c" Jan 21 15:37:56 crc kubenswrapper[4707]: I0121 15:37:56.432513 4707 scope.go:117] "RemoveContainer" containerID="eddf6b664ed9079fcbf3b5395ca851df02ea61755652f68ba2a34e4c789441bb" Jan 21 15:37:57 crc kubenswrapper[4707]: I0121 15:37:57.251886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerStarted","Data":"c80dffc8cb8c2f107dd0445adf2f6f55c11b2383a68fc5d023ffe5c79c6d493f"} Jan 21 15:37:58 crc kubenswrapper[4707]: I0121 15:37:58.160262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.275876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerStarted","Data":"29b20907a38e28a19ed1d5f7925534c910f9cfa434299bef94b072ec07ac8375"} Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.276009 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-central-agent" containerID="cri-o://2ae953b2f228aae90d87d90141a6d0db816651ea62dfff399dc2fe9f8628851d" 
gracePeriod=30 Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.276274 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="proxy-httpd" containerID="cri-o://29b20907a38e28a19ed1d5f7925534c910f9cfa434299bef94b072ec07ac8375" gracePeriod=30 Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.276274 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="sg-core" containerID="cri-o://c80dffc8cb8c2f107dd0445adf2f6f55c11b2383a68fc5d023ffe5c79c6d493f" gracePeriod=30 Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.276344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.276279 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-notification-agent" containerID="cri-o://354a1a4aabec21a6c966713249fb0bd25edfbc6a68a8c530d22229c21bb66a16" gracePeriod=30 Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.280508 4707 generic.go:334] "Generic (PLEG): container finished" podID="97e5f6a3-0b0f-4d27-b60c-54585edca724" containerID="1a038bb09ac026e0ff6f452079a322deba5bbb2f6d988725aead5c0eee7ae3dd" exitCode=0 Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.280544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" event={"ID":"97e5f6a3-0b0f-4d27-b60c-54585edca724","Type":"ContainerDied","Data":"1a038bb09ac026e0ff6f452079a322deba5bbb2f6d988725aead5c0eee7ae3dd"} Jan 21 15:37:59 crc kubenswrapper[4707]: I0121 15:37:59.301785 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.52947496 podStartE2EDuration="6.301761795s" podCreationTimestamp="2026-01-21 15:37:53 +0000 UTC" firstStartedPulling="2026-01-21 15:37:54.594022885 +0000 UTC m=+2171.775539107" lastFinishedPulling="2026-01-21 15:37:58.36630972 +0000 UTC m=+2175.547825942" observedRunningTime="2026-01-21 15:37:59.29183462 +0000 UTC m=+2176.473350842" watchObservedRunningTime="2026-01-21 15:37:59.301761795 +0000 UTC m=+2176.483278017" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.292172 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerID="29b20907a38e28a19ed1d5f7925534c910f9cfa434299bef94b072ec07ac8375" exitCode=0 Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.292479 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerID="c80dffc8cb8c2f107dd0445adf2f6f55c11b2383a68fc5d023ffe5c79c6d493f" exitCode=2 Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.292489 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerID="354a1a4aabec21a6c966713249fb0bd25edfbc6a68a8c530d22229c21bb66a16" exitCode=0 Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.292264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerDied","Data":"29b20907a38e28a19ed1d5f7925534c910f9cfa434299bef94b072ec07ac8375"} Jan 21 
15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.292530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerDied","Data":"c80dffc8cb8c2f107dd0445adf2f6f55c11b2383a68fc5d023ffe5c79c6d493f"} Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.292546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerDied","Data":"354a1a4aabec21a6c966713249fb0bd25edfbc6a68a8c530d22229c21bb66a16"} Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.618730 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.769515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-combined-ca-bundle\") pod \"97e5f6a3-0b0f-4d27-b60c-54585edca724\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.769600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-config-data\") pod \"97e5f6a3-0b0f-4d27-b60c-54585edca724\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.769736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-scripts\") pod \"97e5f6a3-0b0f-4d27-b60c-54585edca724\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.769793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vnhx\" (UniqueName: \"kubernetes.io/projected/97e5f6a3-0b0f-4d27-b60c-54585edca724-kube-api-access-8vnhx\") pod \"97e5f6a3-0b0f-4d27-b60c-54585edca724\" (UID: \"97e5f6a3-0b0f-4d27-b60c-54585edca724\") " Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.776776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-scripts" (OuterVolumeSpecName: "scripts") pod "97e5f6a3-0b0f-4d27-b60c-54585edca724" (UID: "97e5f6a3-0b0f-4d27-b60c-54585edca724"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.791051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e5f6a3-0b0f-4d27-b60c-54585edca724-kube-api-access-8vnhx" (OuterVolumeSpecName: "kube-api-access-8vnhx") pod "97e5f6a3-0b0f-4d27-b60c-54585edca724" (UID: "97e5f6a3-0b0f-4d27-b60c-54585edca724"). InnerVolumeSpecName "kube-api-access-8vnhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.796826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-config-data" (OuterVolumeSpecName: "config-data") pod "97e5f6a3-0b0f-4d27-b60c-54585edca724" (UID: "97e5f6a3-0b0f-4d27-b60c-54585edca724"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.800930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97e5f6a3-0b0f-4d27-b60c-54585edca724" (UID: "97e5f6a3-0b0f-4d27-b60c-54585edca724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.873707 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.873744 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vnhx\" (UniqueName: \"kubernetes.io/projected/97e5f6a3-0b0f-4d27-b60c-54585edca724-kube-api-access-8vnhx\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.873757 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:00 crc kubenswrapper[4707]: I0121 15:38:00.873768 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e5f6a3-0b0f-4d27-b60c-54585edca724-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.303368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" event={"ID":"97e5f6a3-0b0f-4d27-b60c-54585edca724","Type":"ContainerDied","Data":"54fd55d89b3589741330b3e8d84440028521fa71c60696c80e1207e0a7f85397"} Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.303435 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54fd55d89b3589741330b3e8d84440028521fa71c60696c80e1207e0a7f85397" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.303474 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.377469 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:38:01 crc kubenswrapper[4707]: E0121 15:38:01.377847 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e5f6a3-0b0f-4d27-b60c-54585edca724" containerName="nova-cell0-conductor-db-sync" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.377868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e5f6a3-0b0f-4d27-b60c-54585edca724" containerName="nova-cell0-conductor-db-sync" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.378084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e5f6a3-0b0f-4d27-b60c-54585edca724" containerName="nova-cell0-conductor-db-sync" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.378660 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.380462 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rhjn8" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.380868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.391454 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.487350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.487425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2cdt\" (UniqueName: \"kubernetes.io/projected/91365236-99fe-4ee7-9aa1-8a414557cf97-kube-api-access-m2cdt\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.488189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.591754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.591947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.592005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2cdt\" (UniqueName: \"kubernetes.io/projected/91365236-99fe-4ee7-9aa1-8a414557cf97-kube-api-access-m2cdt\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.597417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.597497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.609335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2cdt\" (UniqueName: \"kubernetes.io/projected/91365236-99fe-4ee7-9aa1-8a414557cf97-kube-api-access-m2cdt\") pod \"nova-cell0-conductor-0\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.696015 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.883003 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.883340 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.915120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:01 crc kubenswrapper[4707]: I0121 15:38:01.920852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.145830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:38:02 crc kubenswrapper[4707]: W0121 15:38:02.151556 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91365236_99fe_4ee7_9aa1_8a414557cf97.slice/crio-33d9408d8368000eedbb230fdeef2f4602ab97036c442f30f15765be35ebdb33 WatchSource:0}: Error finding container 33d9408d8368000eedbb230fdeef2f4602ab97036c442f30f15765be35ebdb33: Status 404 returned error can't find the container with id 33d9408d8368000eedbb230fdeef2f4602ab97036c442f30f15765be35ebdb33 Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.329789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"91365236-99fe-4ee7-9aa1-8a414557cf97","Type":"ContainerStarted","Data":"77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc"} Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.329911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"91365236-99fe-4ee7-9aa1-8a414557cf97","Type":"ContainerStarted","Data":"33d9408d8368000eedbb230fdeef2f4602ab97036c442f30f15765be35ebdb33"} Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.330048 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.352757 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerID="2ae953b2f228aae90d87d90141a6d0db816651ea62dfff399dc2fe9f8628851d" exitCode=0 Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.353995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerDied","Data":"2ae953b2f228aae90d87d90141a6d0db816651ea62dfff399dc2fe9f8628851d"} Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.354430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.354746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.366910 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.366887303 podStartE2EDuration="1.366887303s" podCreationTimestamp="2026-01-21 15:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:02.350650062 +0000 UTC m=+2179.532166285" watchObservedRunningTime="2026-01-21 15:38:02.366887303 +0000 UTC m=+2179.548403524" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.369568 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.519563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-log-httpd\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.519658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmd68\" (UniqueName: \"kubernetes.io/projected/3b52c25b-eb0f-4532-afe4-996f8341aa54-kube-api-access-kmd68\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.519873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-scripts\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.519936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-run-httpd\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.519956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-sg-core-conf-yaml\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.519984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-config-data\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.520045 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.520164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-combined-ca-bundle\") pod \"3b52c25b-eb0f-4532-afe4-996f8341aa54\" (UID: \"3b52c25b-eb0f-4532-afe4-996f8341aa54\") " Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.520729 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.520861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.526076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b52c25b-eb0f-4532-afe4-996f8341aa54-kube-api-access-kmd68" (OuterVolumeSpecName: "kube-api-access-kmd68") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "kube-api-access-kmd68". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.526109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-scripts" (OuterVolumeSpecName: "scripts") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.552274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.584636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.599945 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.601084 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.614982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-config-data" (OuterVolumeSpecName: "config-data") pod "3b52c25b-eb0f-4532-afe4-996f8341aa54" (UID: "3b52c25b-eb0f-4532-afe4-996f8341aa54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.623758 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.623792 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmd68\" (UniqueName: \"kubernetes.io/projected/3b52c25b-eb0f-4532-afe4-996f8341aa54-kube-api-access-kmd68\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.623804 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.623826 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b52c25b-eb0f-4532-afe4-996f8341aa54-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.623837 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.623845 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b52c25b-eb0f-4532-afe4-996f8341aa54-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.633594 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:02 crc kubenswrapper[4707]: I0121 15:38:02.636431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.362884 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.363607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"3b52c25b-eb0f-4532-afe4-996f8341aa54","Type":"ContainerDied","Data":"518e01edb777cb23c060d5cf8939db320de9dd98692d6cd73b972be784b65eb4"} Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.363636 4707 scope.go:117] "RemoveContainer" containerID="29b20907a38e28a19ed1d5f7925534c910f9cfa434299bef94b072ec07ac8375" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.365054 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.365078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.391063 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.401856 4707 scope.go:117] "RemoveContainer" containerID="c80dffc8cb8c2f107dd0445adf2f6f55c11b2383a68fc5d023ffe5c79c6d493f" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.422744 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.429920 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:03 crc kubenswrapper[4707]: E0121 15:38:03.430313 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-notification-agent" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430333 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-notification-agent" Jan 21 15:38:03 crc kubenswrapper[4707]: E0121 15:38:03.430344 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-central-agent" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430351 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-central-agent" Jan 21 15:38:03 crc kubenswrapper[4707]: E0121 15:38:03.430361 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="sg-core" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430368 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="sg-core" Jan 21 15:38:03 crc kubenswrapper[4707]: E0121 15:38:03.430383 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="proxy-httpd" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="proxy-httpd" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430538 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="proxy-httpd" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430549 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-notification-agent" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="sg-core" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.430584 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" containerName="ceilometer-central-agent" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.432091 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.432802 4707 scope.go:117] "RemoveContainer" containerID="354a1a4aabec21a6c966713249fb0bd25edfbc6a68a8c530d22229c21bb66a16" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.434265 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.434436 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.447944 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.463385 4707 scope.go:117] "RemoveContainer" containerID="2ae953b2f228aae90d87d90141a6d0db816651ea62dfff399dc2fe9f8628851d" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.540503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc8zt\" (UniqueName: \"kubernetes.io/projected/e3197c8d-2f22-4564-aba0-7118d04fea17-kube-api-access-nc8zt\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.540608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-log-httpd\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.540666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-config-data\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.540720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.540742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-run-httpd\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.540885 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-scripts\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.541028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.643169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-scripts\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.643297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.643524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc8zt\" (UniqueName: \"kubernetes.io/projected/e3197c8d-2f22-4564-aba0-7118d04fea17-kube-api-access-nc8zt\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.645859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-log-httpd\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.645923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-log-httpd\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.645970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-config-data\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.646002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.646021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-run-httpd\") pod \"ceilometer-0\" (UID: 
\"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.646995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-run-httpd\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.656011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.656166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.656754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-scripts\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.658918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc8zt\" (UniqueName: \"kubernetes.io/projected/e3197c8d-2f22-4564-aba0-7118d04fea17-kube-api-access-nc8zt\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.668979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-config-data\") pod \"ceilometer-0\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:03 crc kubenswrapper[4707]: I0121 15:38:03.747322 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:04 crc kubenswrapper[4707]: I0121 15:38:04.084339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:04 crc kubenswrapper[4707]: I0121 15:38:04.086486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:38:04 crc kubenswrapper[4707]: I0121 15:38:04.169789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:04 crc kubenswrapper[4707]: W0121 15:38:04.174468 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3197c8d_2f22_4564_aba0_7118d04fea17.slice/crio-b2989bf22d4106f45777318f85ac1262fc145b7a1212df1ba13272b95f57103e WatchSource:0}: Error finding container b2989bf22d4106f45777318f85ac1262fc145b7a1212df1ba13272b95f57103e: Status 404 returned error can't find the container with id b2989bf22d4106f45777318f85ac1262fc145b7a1212df1ba13272b95f57103e Jan 21 15:38:04 crc kubenswrapper[4707]: I0121 15:38:04.373412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerStarted","Data":"b2989bf22d4106f45777318f85ac1262fc145b7a1212df1ba13272b95f57103e"} Jan 21 15:38:04 crc kubenswrapper[4707]: I0121 15:38:04.793319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:05 crc kubenswrapper[4707]: I0121 15:38:05.044972 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:05 crc kubenswrapper[4707]: I0121 15:38:05.046581 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:38:05 crc kubenswrapper[4707]: I0121 15:38:05.197729 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b52c25b-eb0f-4532-afe4-996f8341aa54" path="/var/lib/kubelet/pods/3b52c25b-eb0f-4532-afe4-996f8341aa54/volumes" Jan 21 15:38:05 crc kubenswrapper[4707]: I0121 15:38:05.381919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerStarted","Data":"f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c"} Jan 21 15:38:06 crc kubenswrapper[4707]: I0121 15:38:06.392477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerStarted","Data":"f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640"} Jan 21 15:38:07 crc kubenswrapper[4707]: I0121 15:38:07.409509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerStarted","Data":"c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2"} Jan 21 15:38:08 crc kubenswrapper[4707]: I0121 15:38:08.422574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerStarted","Data":"01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616"} Jan 21 15:38:08 crc kubenswrapper[4707]: 
I0121 15:38:08.422763 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-central-agent" containerID="cri-o://f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c" gracePeriod=30 Jan 21 15:38:08 crc kubenswrapper[4707]: I0121 15:38:08.422930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:08 crc kubenswrapper[4707]: I0121 15:38:08.423256 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="sg-core" containerID="cri-o://c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2" gracePeriod=30 Jan 21 15:38:08 crc kubenswrapper[4707]: I0121 15:38:08.423310 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="proxy-httpd" containerID="cri-o://01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616" gracePeriod=30 Jan 21 15:38:08 crc kubenswrapper[4707]: I0121 15:38:08.423354 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-notification-agent" containerID="cri-o://f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640" gracePeriod=30 Jan 21 15:38:08 crc kubenswrapper[4707]: I0121 15:38:08.439852 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.887161656 podStartE2EDuration="5.439838189s" podCreationTimestamp="2026-01-21 15:38:03 +0000 UTC" firstStartedPulling="2026-01-21 15:38:04.17650606 +0000 UTC m=+2181.358022281" lastFinishedPulling="2026-01-21 15:38:07.729182592 +0000 UTC m=+2184.910698814" observedRunningTime="2026-01-21 15:38:08.437680182 +0000 UTC m=+2185.619196403" watchObservedRunningTime="2026-01-21 15:38:08.439838189 +0000 UTC m=+2185.621354411" Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.433044 4707 generic.go:334] "Generic (PLEG): container finished" podID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerID="01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616" exitCode=0 Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.433327 4707 generic.go:334] "Generic (PLEG): container finished" podID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerID="c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2" exitCode=2 Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.433336 4707 generic.go:334] "Generic (PLEG): container finished" podID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerID="f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640" exitCode=0 Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.433087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerDied","Data":"01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616"} Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.433387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerDied","Data":"c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2"} Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.433403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerDied","Data":"f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640"} Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.945835 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.945929 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.946011 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.947371 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d5e87bd6abf7a6178fdaecd77230a20522c0d8576400f05ca4f9dfd50c50ef1"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:38:09 crc kubenswrapper[4707]: I0121 15:38:09.947455 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://3d5e87bd6abf7a6178fdaecd77230a20522c0d8576400f05ca4f9dfd50c50ef1" gracePeriod=600 Jan 21 15:38:10 crc kubenswrapper[4707]: I0121 15:38:10.450575 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="3d5e87bd6abf7a6178fdaecd77230a20522c0d8576400f05ca4f9dfd50c50ef1" exitCode=0 Jan 21 15:38:10 crc kubenswrapper[4707]: I0121 15:38:10.450684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"3d5e87bd6abf7a6178fdaecd77230a20522c0d8576400f05ca4f9dfd50c50ef1"} Jan 21 15:38:10 crc kubenswrapper[4707]: I0121 15:38:10.451020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e"} Jan 21 15:38:10 crc kubenswrapper[4707]: I0121 15:38:10.451052 4707 scope.go:117] "RemoveContainer" containerID="55470f5cd4e14a75b6bae4f9bb42147d4f88394da174e867c8e9fa2cb33b008a" Jan 21 15:38:11 crc kubenswrapper[4707]: I0121 15:38:11.720025 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:38:12 crc 
kubenswrapper[4707]: I0121 15:38:12.226132 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bn475"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.228684 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.231918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvjv\" (UniqueName: \"kubernetes.io/projected/31884041-4815-4438-a2cd-7bd2756f9f21-kube-api-access-ssvjv\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.231978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-config-data\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.232028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-scripts\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.232083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.250404 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.250622 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.279873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bn475"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.331063 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.332152 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvjv\" (UniqueName: \"kubernetes.io/projected/31884041-4815-4438-a2cd-7bd2756f9f21-kube-api-access-ssvjv\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-config-data\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-config-data\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-scripts\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfxm\" (UniqueName: \"kubernetes.io/projected/d7dabca5-e35d-4932-b272-df52109fcd37-kube-api-access-2xfxm\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.334549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.335891 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.337224 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.342747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bn475\" (UID: 
\"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.353747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-scripts\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.360686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-config-data\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.368833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.370199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.372996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.373527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvjv\" (UniqueName: \"kubernetes.io/projected/31884041-4815-4438-a2cd-7bd2756f9f21-kube-api-access-ssvjv\") pod \"nova-cell0-cell-mapping-bn475\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.435506 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.439974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-config-data\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.440147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfxm\" (UniqueName: \"kubernetes.io/projected/d7dabca5-e35d-4932-b272-df52109fcd37-kube-api-access-2xfxm\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.440170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.444907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-config-data\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.454658 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.456956 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.459295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.459660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.467176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfxm\" (UniqueName: \"kubernetes.io/projected/d7dabca5-e35d-4932-b272-df52109fcd37-kube-api-access-2xfxm\") pod \"nova-scheduler-0\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.478616 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.545376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a69b4-b6a3-4533-a682-07c7c1b299d1-logs\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.545586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cc4w\" (UniqueName: \"kubernetes.io/projected/058a69b4-b6a3-4533-a682-07c7c1b299d1-kube-api-access-8cc4w\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.546573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.546724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-config-data\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.549081 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.550428 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.552523 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.559202 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.578254 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.652510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtb5\" (UniqueName: \"kubernetes.io/projected/2092919a-a2ce-4693-98ad-9654ed55ba0a-kube-api-access-gjtb5\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqsz\" (UniqueName: \"kubernetes.io/projected/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-kube-api-access-xrqsz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-config-data\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a69b4-b6a3-4533-a682-07c7c1b299d1-logs\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cc4w\" (UniqueName: \"kubernetes.io/projected/058a69b4-b6a3-4533-a682-07c7c1b299d1-kube-api-access-8cc4w\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2092919a-a2ce-4693-98ad-9654ed55ba0a-logs\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.653638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-config-data\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.655131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a69b4-b6a3-4533-a682-07c7c1b299d1-logs\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.659147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.659848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-config-data\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.675108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cc4w\" (UniqueName: \"kubernetes.io/projected/058a69b4-b6a3-4533-a682-07c7c1b299d1-kube-api-access-8cc4w\") pod \"nova-api-0\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.735794 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.744081 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtb5\" (UniqueName: \"kubernetes.io/projected/2092919a-a2ce-4693-98ad-9654ed55ba0a-kube-api-access-gjtb5\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqsz\" (UniqueName: \"kubernetes.io/projected/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-kube-api-access-xrqsz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-config-data\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.755828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2092919a-a2ce-4693-98ad-9654ed55ba0a-logs\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.756183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2092919a-a2ce-4693-98ad-9654ed55ba0a-logs\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.760317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.760355 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-config-data\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.761316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.761639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.771892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtb5\" (UniqueName: \"kubernetes.io/projected/2092919a-a2ce-4693-98ad-9654ed55ba0a-kube-api-access-gjtb5\") pod \"nova-metadata-0\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.774434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqsz\" (UniqueName: \"kubernetes.io/projected/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-kube-api-access-xrqsz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.824305 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:12 crc kubenswrapper[4707]: I0121 15:38:12.875423 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.021217 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bn475"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.100423 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.265565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-combined-ca-bundle\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.265923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-run-httpd\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.265951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-log-httpd\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.265993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc8zt\" (UniqueName: \"kubernetes.io/projected/e3197c8d-2f22-4564-aba0-7118d04fea17-kube-api-access-nc8zt\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.266009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-scripts\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.266091 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-sg-core-conf-yaml\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.266183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-config-data\") pod \"e3197c8d-2f22-4564-aba0-7118d04fea17\" (UID: \"e3197c8d-2f22-4564-aba0-7118d04fea17\") " Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.269524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.270084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.272966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-scripts" (OuterVolumeSpecName: "scripts") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.275965 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk"] Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.276672 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-central-agent" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.276749 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-central-agent" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.276831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="proxy-httpd" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.276902 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="proxy-httpd" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.276966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-notification-agent" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.277029 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-notification-agent" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.277120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="sg-core" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.277172 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="sg-core" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.277508 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-notification-agent" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.277584 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="ceilometer-central-agent" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.277647 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="sg-core" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.277697 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerName="proxy-httpd" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.280864 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.285947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3197c8d-2f22-4564-aba0-7118d04fea17-kube-api-access-nc8zt" (OuterVolumeSpecName: "kube-api-access-nc8zt") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "kube-api-access-nc8zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.290231 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.295185 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.303594 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.321123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: W0121 15:38:13.323554 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod058a69b4_b6a3_4533_a682_07c7c1b299d1.slice/crio-e86c8e9e77e4a6e1ea0b8624b9f802d72cd0f4bac0b2d4516ac6da8de1fd9640 WatchSource:0}: Error finding container e86c8e9e77e4a6e1ea0b8624b9f802d72cd0f4bac0b2d4516ac6da8de1fd9640: Status 404 returned error can't find the container with id e86c8e9e77e4a6e1ea0b8624b9f802d72cd0f4bac0b2d4516ac6da8de1fd9640 Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.334262 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: W0121 15:38:13.337331 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7dabca5_e35d_4932_b272_df52109fcd37.slice/crio-ab9e3c445bfb4a4d5dfc8d142c59d056c7a72110a838574f9df3e4326e55fe2f WatchSource:0}: Error finding container ab9e3c445bfb4a4d5dfc8d142c59d056c7a72110a838574f9df3e4326e55fe2f: Status 404 returned error can't find the container with id ab9e3c445bfb4a4d5dfc8d142c59d056c7a72110a838574f9df3e4326e55fe2f Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.345089 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.360359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.373584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-scripts\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.373675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrt7z\" (UniqueName: \"kubernetes.io/projected/369fb473-5832-4299-80a7-81eecc4ab6d4-kube-api-access-xrt7z\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.373761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.374921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-config-data\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.375068 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.375085 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.375093 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3197c8d-2f22-4564-aba0-7118d04fea17-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.375102 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc8zt\" (UniqueName: \"kubernetes.io/projected/e3197c8d-2f22-4564-aba0-7118d04fea17-kube-api-access-nc8zt\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.375112 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.375120 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.408948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-config-data" (OuterVolumeSpecName: "config-data") pod "e3197c8d-2f22-4564-aba0-7118d04fea17" (UID: "e3197c8d-2f22-4564-aba0-7118d04fea17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.457748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.476615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.477034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-config-data\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.477083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-scripts\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.477114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrt7z\" (UniqueName: \"kubernetes.io/projected/369fb473-5832-4299-80a7-81eecc4ab6d4-kube-api-access-xrt7z\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.477186 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3197c8d-2f22-4564-aba0-7118d04fea17-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.478699 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.482832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-config-data\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.482846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.484528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-scripts\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.501517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrt7z\" (UniqueName: \"kubernetes.io/projected/369fb473-5832-4299-80a7-81eecc4ab6d4-kube-api-access-xrt7z\") pod \"nova-cell1-conductor-db-sync-8wfnk\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.535320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" event={"ID":"31884041-4815-4438-a2cd-7bd2756f9f21","Type":"ContainerStarted","Data":"c66dd17d9b60df10f57e9cc85f7bf0536cdcaea704e2db1ea2f6186abbb9b630"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.535372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" event={"ID":"31884041-4815-4438-a2cd-7bd2756f9f21","Type":"ContainerStarted","Data":"ba6a6534b0b637484b4e19d4f4e71f92f5aaa9fe908f8f3dd8c10cbbbd06315c"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.538257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d7dabca5-e35d-4932-b272-df52109fcd37","Type":"ContainerStarted","Data":"ab9e3c445bfb4a4d5dfc8d142c59d056c7a72110a838574f9df3e4326e55fe2f"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.540617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c26d0b21-9b78-4cdb-b01f-7745d7f0345a","Type":"ContainerStarted","Data":"72e85d045289b9c0a450fb22a152c4883137912160558c9cfd769ee476aba5e4"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.543585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2092919a-a2ce-4693-98ad-9654ed55ba0a","Type":"ContainerStarted","Data":"befcf3e5ce30019c53015c326636b3f280d7d7939d0a2e3a0de5f0b2f159dfad"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.548408 4707 generic.go:334] "Generic (PLEG): container finished" podID="e3197c8d-2f22-4564-aba0-7118d04fea17" containerID="f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c" exitCode=0 Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.548462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerDied","Data":"f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.548539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e3197c8d-2f22-4564-aba0-7118d04fea17","Type":"ContainerDied","Data":"b2989bf22d4106f45777318f85ac1262fc145b7a1212df1ba13272b95f57103e"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.548885 4707 scope.go:117] "RemoveContainer" containerID="01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.550130 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.554645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"058a69b4-b6a3-4533-a682-07c7c1b299d1","Type":"ContainerStarted","Data":"e86c8e9e77e4a6e1ea0b8624b9f802d72cd0f4bac0b2d4516ac6da8de1fd9640"} Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.557794 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" podStartSLOduration=1.557783777 podStartE2EDuration="1.557783777s" podCreationTimestamp="2026-01-21 15:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:13.554385426 +0000 UTC m=+2190.735901649" watchObservedRunningTime="2026-01-21 15:38:13.557783777 +0000 UTC m=+2190.739299999" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.597486 4707 scope.go:117] "RemoveContainer" containerID="c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.604595 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.637608 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.650133 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.659873 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.662262 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.664419 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.668329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.668764 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.691750 4707 scope.go:117] "RemoveContainer" containerID="f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.724150 4707 scope.go:117] "RemoveContainer" containerID="f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.757774 4707 scope.go:117] "RemoveContainer" containerID="01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.768940 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616\": container with ID starting with 01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616 not found: ID does not exist" containerID="01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.769164 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616"} err="failed to get container status \"01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616\": rpc error: code = NotFound desc = could not find container \"01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616\": container with ID starting with 01de01ded135296046833988387bb4f1da71c14cd7e1f31f32d069c6dc118616 not found: ID does not exist" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.769192 4707 scope.go:117] "RemoveContainer" containerID="c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.772266 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2\": container with ID starting with c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2 not found: ID does not exist" containerID="c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.772323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2"} err="failed to get container status \"c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2\": rpc error: code = NotFound desc = could not find container \"c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2\": container with ID starting with c27272c1a8d9b258cbd547f74be28473a6c046ed869de7e5fe4cc6a5edfd49b2 not found: ID does not exist" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.772366 4707 scope.go:117] "RemoveContainer" 
containerID="f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.773516 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640\": container with ID starting with f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640 not found: ID does not exist" containerID="f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.773538 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640"} err="failed to get container status \"f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640\": rpc error: code = NotFound desc = could not find container \"f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640\": container with ID starting with f806bb4beced0fd9493e54ea91152e1e2394e289fd9e62d431316e8831111640 not found: ID does not exist" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.773572 4707 scope.go:117] "RemoveContainer" containerID="f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c" Jan 21 15:38:13 crc kubenswrapper[4707]: E0121 15:38:13.773951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c\": container with ID starting with f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c not found: ID does not exist" containerID="f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.773993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c"} err="failed to get container status \"f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c\": rpc error: code = NotFound desc = could not find container \"f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c\": container with ID starting with f233d14c0808f724747ee8459c074a3c9d447c25fbeec27627809c3d26533c4c not found: ID does not exist" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.792257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-run-httpd\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.792316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.792356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-scripts\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: 
I0121 15:38:13.792431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzth6\" (UniqueName: \"kubernetes.io/projected/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-kube-api-access-pzth6\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.792494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.792732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-config-data\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.792924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-log-httpd\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzth6\" (UniqueName: \"kubernetes.io/projected/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-kube-api-access-pzth6\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-config-data\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-log-httpd\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-run-httpd\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.894776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-scripts\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.895539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-log-httpd\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.896099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-run-httpd\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.898871 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-scripts\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.905177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.906192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-config-data\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.908217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:13 crc kubenswrapper[4707]: I0121 15:38:13.915756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzth6\" (UniqueName: \"kubernetes.io/projected/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-kube-api-access-pzth6\") pod \"ceilometer-0\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.006142 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.105019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk"] Jan 21 15:38:14 crc kubenswrapper[4707]: W0121 15:38:14.522598 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb99c4eaa_dc5a_4d7e_916a_b4d5418d09fc.slice/crio-6b1487380fb4340340dd0d634453d35b3c026fa347792f87340d8625ec14b7e8 WatchSource:0}: Error finding container 6b1487380fb4340340dd0d634453d35b3c026fa347792f87340d8625ec14b7e8: Status 404 returned error can't find the container with id 6b1487380fb4340340dd0d634453d35b3c026fa347792f87340d8625ec14b7e8 Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.523308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.566185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"058a69b4-b6a3-4533-a682-07c7c1b299d1","Type":"ContainerStarted","Data":"bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.566231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"058a69b4-b6a3-4533-a682-07c7c1b299d1","Type":"ContainerStarted","Data":"529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.567363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerStarted","Data":"6b1487380fb4340340dd0d634453d35b3c026fa347792f87340d8625ec14b7e8"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.570353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d7dabca5-e35d-4932-b272-df52109fcd37","Type":"ContainerStarted","Data":"d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.574733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c26d0b21-9b78-4cdb-b01f-7745d7f0345a","Type":"ContainerStarted","Data":"ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.578158 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.578143781 podStartE2EDuration="2.578143781s" podCreationTimestamp="2026-01-21 15:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:14.577909078 +0000 UTC m=+2191.759425301" watchObservedRunningTime="2026-01-21 15:38:14.578143781 +0000 UTC m=+2191.759660002" Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.579677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2092919a-a2ce-4693-98ad-9654ed55ba0a","Type":"ContainerStarted","Data":"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.579708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"2092919a-a2ce-4693-98ad-9654ed55ba0a","Type":"ContainerStarted","Data":"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.586753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" event={"ID":"369fb473-5832-4299-80a7-81eecc4ab6d4","Type":"ContainerStarted","Data":"b99afad247ed569f22be4570733e3e86a8756295925cb7a94ae2e835a8f1fd12"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.586784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" event={"ID":"369fb473-5832-4299-80a7-81eecc4ab6d4","Type":"ContainerStarted","Data":"578c709e365dfb7cca53fcbfe847032e2599877bd747fec63dec9379c23f7660"} Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.592259 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.592233222 podStartE2EDuration="2.592233222s" podCreationTimestamp="2026-01-21 15:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:14.589986338 +0000 UTC m=+2191.771502560" watchObservedRunningTime="2026-01-21 15:38:14.592233222 +0000 UTC m=+2191.773749445" Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.617273 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.617261653 podStartE2EDuration="2.617261653s" podCreationTimestamp="2026-01-21 15:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:14.604324186 +0000 UTC m=+2191.785840408" watchObservedRunningTime="2026-01-21 15:38:14.617261653 +0000 UTC m=+2191.798777874" Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.640907 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" podStartSLOduration=1.640889619 podStartE2EDuration="1.640889619s" podCreationTimestamp="2026-01-21 15:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:14.63690188 +0000 UTC m=+2191.818418103" watchObservedRunningTime="2026-01-21 15:38:14.640889619 +0000 UTC m=+2191.822405841" Jan 21 15:38:14 crc kubenswrapper[4707]: I0121 15:38:14.645264 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.645253926 podStartE2EDuration="2.645253926s" podCreationTimestamp="2026-01-21 15:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:14.621449537 +0000 UTC m=+2191.802965759" watchObservedRunningTime="2026-01-21 15:38:14.645253926 +0000 UTC m=+2191.826770147" Jan 21 15:38:15 crc kubenswrapper[4707]: I0121 15:38:15.141616 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:15 crc kubenswrapper[4707]: I0121 15:38:15.154436 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:15 crc kubenswrapper[4707]: I0121 15:38:15.202009 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e3197c8d-2f22-4564-aba0-7118d04fea17" path="/var/lib/kubelet/pods/e3197c8d-2f22-4564-aba0-7118d04fea17/volumes" Jan 21 15:38:15 crc kubenswrapper[4707]: I0121 15:38:15.616278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerStarted","Data":"6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e"} Jan 21 15:38:16 crc kubenswrapper[4707]: I0121 15:38:16.629100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerStarted","Data":"7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc"} Jan 21 15:38:16 crc kubenswrapper[4707]: I0121 15:38:16.629200 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c26d0b21-9b78-4cdb-b01f-7745d7f0345a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed" gracePeriod=30 Jan 21 15:38:16 crc kubenswrapper[4707]: I0121 15:38:16.629324 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-log" containerID="cri-o://74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9" gracePeriod=30 Jan 21 15:38:16 crc kubenswrapper[4707]: I0121 15:38:16.629408 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-metadata" containerID="cri-o://f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701" gracePeriod=30 Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.144577 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.164669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-combined-ca-bundle\") pod \"2092919a-a2ce-4693-98ad-9654ed55ba0a\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.164783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2092919a-a2ce-4693-98ad-9654ed55ba0a-logs\") pod \"2092919a-a2ce-4693-98ad-9654ed55ba0a\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.164832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtb5\" (UniqueName: \"kubernetes.io/projected/2092919a-a2ce-4693-98ad-9654ed55ba0a-kube-api-access-gjtb5\") pod \"2092919a-a2ce-4693-98ad-9654ed55ba0a\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.164850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-config-data\") pod \"2092919a-a2ce-4693-98ad-9654ed55ba0a\" (UID: \"2092919a-a2ce-4693-98ad-9654ed55ba0a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.166161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2092919a-a2ce-4693-98ad-9654ed55ba0a-logs" (OuterVolumeSpecName: "logs") pod "2092919a-a2ce-4693-98ad-9654ed55ba0a" (UID: "2092919a-a2ce-4693-98ad-9654ed55ba0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.173017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2092919a-a2ce-4693-98ad-9654ed55ba0a-kube-api-access-gjtb5" (OuterVolumeSpecName: "kube-api-access-gjtb5") pod "2092919a-a2ce-4693-98ad-9654ed55ba0a" (UID: "2092919a-a2ce-4693-98ad-9654ed55ba0a"). InnerVolumeSpecName "kube-api-access-gjtb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.204936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2092919a-a2ce-4693-98ad-9654ed55ba0a" (UID: "2092919a-a2ce-4693-98ad-9654ed55ba0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.208067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-config-data" (OuterVolumeSpecName: "config-data") pod "2092919a-a2ce-4693-98ad-9654ed55ba0a" (UID: "2092919a-a2ce-4693-98ad-9654ed55ba0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.267458 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.267487 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2092919a-a2ce-4693-98ad-9654ed55ba0a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.267497 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtb5\" (UniqueName: \"kubernetes.io/projected/2092919a-a2ce-4693-98ad-9654ed55ba0a-kube-api-access-gjtb5\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.267507 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2092919a-a2ce-4693-98ad-9654ed55ba0a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.321847 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.368750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-combined-ca-bundle\") pod \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.368866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-config-data\") pod \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.369012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrqsz\" (UniqueName: \"kubernetes.io/projected/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-kube-api-access-xrqsz\") pod \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\" (UID: \"c26d0b21-9b78-4cdb-b01f-7745d7f0345a\") " Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.379008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-kube-api-access-xrqsz" (OuterVolumeSpecName: "kube-api-access-xrqsz") pod "c26d0b21-9b78-4cdb-b01f-7745d7f0345a" (UID: "c26d0b21-9b78-4cdb-b01f-7745d7f0345a"). InnerVolumeSpecName "kube-api-access-xrqsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.390207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-config-data" (OuterVolumeSpecName: "config-data") pod "c26d0b21-9b78-4cdb-b01f-7745d7f0345a" (UID: "c26d0b21-9b78-4cdb-b01f-7745d7f0345a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.399882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c26d0b21-9b78-4cdb-b01f-7745d7f0345a" (UID: "c26d0b21-9b78-4cdb-b01f-7745d7f0345a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.470647 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.470675 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.470684 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrqsz\" (UniqueName: \"kubernetes.io/projected/c26d0b21-9b78-4cdb-b01f-7745d7f0345a-kube-api-access-xrqsz\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.637705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerStarted","Data":"a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8"} Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.638883 4707 generic.go:334] "Generic (PLEG): container finished" podID="c26d0b21-9b78-4cdb-b01f-7745d7f0345a" containerID="ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed" exitCode=0 Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.638935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c26d0b21-9b78-4cdb-b01f-7745d7f0345a","Type":"ContainerDied","Data":"ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed"} Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.638953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c26d0b21-9b78-4cdb-b01f-7745d7f0345a","Type":"ContainerDied","Data":"72e85d045289b9c0a450fb22a152c4883137912160558c9cfd769ee476aba5e4"} Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.638968 4707 scope.go:117] "RemoveContainer" containerID="ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.639072 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.641993 4707 generic.go:334] "Generic (PLEG): container finished" podID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerID="f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701" exitCode=0 Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.642022 4707 generic.go:334] "Generic (PLEG): container finished" podID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerID="74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9" exitCode=143 Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.642043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2092919a-a2ce-4693-98ad-9654ed55ba0a","Type":"ContainerDied","Data":"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701"} Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.642067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2092919a-a2ce-4693-98ad-9654ed55ba0a","Type":"ContainerDied","Data":"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9"} Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.642076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2092919a-a2ce-4693-98ad-9654ed55ba0a","Type":"ContainerDied","Data":"befcf3e5ce30019c53015c326636b3f280d7d7939d0a2e3a0de5f0b2f159dfad"} Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.642141 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.669291 4707 scope.go:117] "RemoveContainer" containerID="ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed" Jan 21 15:38:17 crc kubenswrapper[4707]: E0121 15:38:17.669609 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed\": container with ID starting with ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed not found: ID does not exist" containerID="ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.669652 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed"} err="failed to get container status \"ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed\": rpc error: code = NotFound desc = could not find container \"ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed\": container with ID starting with ca785e8fc6dd8fc0f4786ad1f52fcacdb7a4ce53dc5db0589d5798753eb1c7ed not found: ID does not exist" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.669671 4707 scope.go:117] "RemoveContainer" containerID="f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.670696 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.679738 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.692943 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.700978 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: E0121 15:38:17.712354 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-log" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712366 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-log" Jan 21 15:38:17 crc kubenswrapper[4707]: E0121 15:38:17.712384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c26d0b21-9b78-4cdb-b01f-7745d7f0345a" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712390 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26d0b21-9b78-4cdb-b01f-7745d7f0345a" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:38:17 crc kubenswrapper[4707]: E0121 15:38:17.712399 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-metadata" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712404 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-metadata" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712575 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-log" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712597 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" containerName="nova-metadata-metadata" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.712608 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c26d0b21-9b78-4cdb-b01f-7745d7f0345a" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.713157 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.715685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.715951 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.716081 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.722039 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.728319 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.729636 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.732497 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.732570 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.734071 4707 scope.go:117] "RemoveContainer" containerID="74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.735979 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.743833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b6n2\" (UniqueName: \"kubernetes.io/projected/d4146629-1224-48a1-86c9-da68e93dbbca-kube-api-access-8b6n2\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e4b725-a13d-4249-9708-5b736ed9baae-logs\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.777914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.778578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-config-data\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.778756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt48s\" (UniqueName: \"kubernetes.io/projected/05e4b725-a13d-4249-9708-5b736ed9baae-kube-api-access-pt48s\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.778798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.819840 4707 scope.go:117] "RemoveContainer" containerID="f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701" Jan 21 15:38:17 crc kubenswrapper[4707]: E0121 15:38:17.820190 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701\": container with ID starting with f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701 not found: ID does not exist" containerID="f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.820214 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701"} err="failed to get container status \"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701\": rpc error: code = NotFound desc = could not find container \"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701\": container with ID starting with f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701 not found: ID does not exist" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.820233 4707 scope.go:117] "RemoveContainer" containerID="74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9" Jan 21 15:38:17 crc kubenswrapper[4707]: E0121 15:38:17.820537 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9\": container with ID starting with 74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9 not found: ID does not exist" containerID="74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.820554 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9"} err="failed to get container status \"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9\": rpc error: code = NotFound desc = could not find container \"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9\": container with ID starting with 74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9 not found: ID does not exist" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.820567 4707 scope.go:117] "RemoveContainer" containerID="f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.823655 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701"} err="failed to get container status \"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701\": rpc error: code = NotFound desc = could not find container \"f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701\": container with ID starting with f4866e6fe3a064f9f9366219890641cb846ee1456610179d57f30f418f007701 not found: ID does not exist" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.823685 4707 scope.go:117] "RemoveContainer" containerID="74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.824581 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9"} err="failed to get container status \"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9\": rpc error: code = NotFound desc = could not find container \"74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9\": container with ID starting with 74a2f84aa53d32b37db36970bf30e0380609cd9ebeac2d0ff3354a17564331b9 not found: ID does not exist" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b6n2\" (UniqueName: \"kubernetes.io/projected/d4146629-1224-48a1-86c9-da68e93dbbca-kube-api-access-8b6n2\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e4b725-a13d-4249-9708-5b736ed9baae-logs\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-config-data\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt48s\" (UniqueName: \"kubernetes.io/projected/05e4b725-a13d-4249-9708-5b736ed9baae-kube-api-access-pt48s\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.880618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.881341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e4b725-a13d-4249-9708-5b736ed9baae-logs\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.885580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.885642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 
15:38:17.885885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.886151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.886407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-config-data\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.886765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.893991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.895514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b6n2\" (UniqueName: \"kubernetes.io/projected/d4146629-1224-48a1-86c9-da68e93dbbca-kube-api-access-8b6n2\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:17 crc kubenswrapper[4707]: I0121 15:38:17.896435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt48s\" (UniqueName: \"kubernetes.io/projected/05e4b725-a13d-4249-9708-5b736ed9baae-kube-api-access-pt48s\") pod \"nova-metadata-0\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.103048 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.109371 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:18 crc kubenswrapper[4707]: W0121 15:38:18.531075 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4146629_1224_48a1_86c9_da68e93dbbca.slice/crio-7b6a811db370f624819f05bef72f525a106a2c7655e93b3a367187ef0affd76f WatchSource:0}: Error finding container 7b6a811db370f624819f05bef72f525a106a2c7655e93b3a367187ef0affd76f: Status 404 returned error can't find the container with id 7b6a811db370f624819f05bef72f525a106a2c7655e93b3a367187ef0affd76f Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.531847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:38:18 crc kubenswrapper[4707]: W0121 15:38:18.538491 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e4b725_a13d_4249_9708_5b736ed9baae.slice/crio-2ec698cf4fb5be0583ddce15d6df57b29cd42cf8465dbc134c0a4c77fbec398a WatchSource:0}: Error finding container 2ec698cf4fb5be0583ddce15d6df57b29cd42cf8465dbc134c0a4c77fbec398a: Status 404 returned error can't find the container with id 2ec698cf4fb5be0583ddce15d6df57b29cd42cf8465dbc134c0a4c77fbec398a Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.540069 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.649341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"d4146629-1224-48a1-86c9-da68e93dbbca","Type":"ContainerStarted","Data":"7b6a811db370f624819f05bef72f525a106a2c7655e93b3a367187ef0affd76f"} Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.652474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"05e4b725-a13d-4249-9708-5b736ed9baae","Type":"ContainerStarted","Data":"2ec698cf4fb5be0583ddce15d6df57b29cd42cf8465dbc134c0a4c77fbec398a"} Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.656554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerStarted","Data":"a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754"} Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.656683 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:18 crc kubenswrapper[4707]: I0121 15:38:18.671069 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.925658418 podStartE2EDuration="5.671056517s" podCreationTimestamp="2026-01-21 15:38:13 +0000 UTC" firstStartedPulling="2026-01-21 15:38:14.52514554 +0000 UTC m=+2191.706661762" lastFinishedPulling="2026-01-21 15:38:18.270543638 +0000 UTC m=+2195.452059861" observedRunningTime="2026-01-21 15:38:18.670681502 +0000 UTC m=+2195.852197724" watchObservedRunningTime="2026-01-21 15:38:18.671056517 +0000 UTC m=+2195.852572739" Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.190823 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2092919a-a2ce-4693-98ad-9654ed55ba0a" path="/var/lib/kubelet/pods/2092919a-a2ce-4693-98ad-9654ed55ba0a/volumes" Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.192015 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c26d0b21-9b78-4cdb-b01f-7745d7f0345a" path="/var/lib/kubelet/pods/c26d0b21-9b78-4cdb-b01f-7745d7f0345a/volumes" Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.672543 4707 generic.go:334] "Generic (PLEG): container finished" podID="369fb473-5832-4299-80a7-81eecc4ab6d4" containerID="b99afad247ed569f22be4570733e3e86a8756295925cb7a94ae2e835a8f1fd12" exitCode=0 Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.672597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" event={"ID":"369fb473-5832-4299-80a7-81eecc4ab6d4","Type":"ContainerDied","Data":"b99afad247ed569f22be4570733e3e86a8756295925cb7a94ae2e835a8f1fd12"} Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.674926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"d4146629-1224-48a1-86c9-da68e93dbbca","Type":"ContainerStarted","Data":"d8ae9b7d96594f3db274ccac95d0b77c71288664e3163100455a9c6c900575d6"} Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.676388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"05e4b725-a13d-4249-9708-5b736ed9baae","Type":"ContainerStarted","Data":"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c"} Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.676415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"05e4b725-a13d-4249-9708-5b736ed9baae","Type":"ContainerStarted","Data":"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3"} Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.717879 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.7178659229999997 podStartE2EDuration="2.717865923s" podCreationTimestamp="2026-01-21 15:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:19.717358909 +0000 UTC m=+2196.898875132" watchObservedRunningTime="2026-01-21 15:38:19.717865923 +0000 UTC m=+2196.899382144" Jan 21 15:38:19 crc kubenswrapper[4707]: I0121 15:38:19.734585 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.734571714 podStartE2EDuration="2.734571714s" podCreationTimestamp="2026-01-21 15:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:19.72830942 +0000 UTC m=+2196.909825642" watchObservedRunningTime="2026-01-21 15:38:19.734571714 +0000 UTC m=+2196.916087936" Jan 21 15:38:20 crc kubenswrapper[4707]: I0121 15:38:20.685693 4707 generic.go:334] "Generic (PLEG): container finished" podID="31884041-4815-4438-a2cd-7bd2756f9f21" containerID="c66dd17d9b60df10f57e9cc85f7bf0536cdcaea704e2db1ea2f6186abbb9b630" exitCode=0 Jan 21 15:38:20 crc kubenswrapper[4707]: I0121 15:38:20.685779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" event={"ID":"31884041-4815-4438-a2cd-7bd2756f9f21","Type":"ContainerDied","Data":"c66dd17d9b60df10f57e9cc85f7bf0536cdcaea704e2db1ea2f6186abbb9b630"} Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.007376 4707 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.146037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-scripts\") pod \"369fb473-5832-4299-80a7-81eecc4ab6d4\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.146219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-combined-ca-bundle\") pod \"369fb473-5832-4299-80a7-81eecc4ab6d4\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.146312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrt7z\" (UniqueName: \"kubernetes.io/projected/369fb473-5832-4299-80a7-81eecc4ab6d4-kube-api-access-xrt7z\") pod \"369fb473-5832-4299-80a7-81eecc4ab6d4\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.146376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-config-data\") pod \"369fb473-5832-4299-80a7-81eecc4ab6d4\" (UID: \"369fb473-5832-4299-80a7-81eecc4ab6d4\") " Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.162938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-scripts" (OuterVolumeSpecName: "scripts") pod "369fb473-5832-4299-80a7-81eecc4ab6d4" (UID: "369fb473-5832-4299-80a7-81eecc4ab6d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.162971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369fb473-5832-4299-80a7-81eecc4ab6d4-kube-api-access-xrt7z" (OuterVolumeSpecName: "kube-api-access-xrt7z") pod "369fb473-5832-4299-80a7-81eecc4ab6d4" (UID: "369fb473-5832-4299-80a7-81eecc4ab6d4"). InnerVolumeSpecName "kube-api-access-xrt7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.169684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-config-data" (OuterVolumeSpecName: "config-data") pod "369fb473-5832-4299-80a7-81eecc4ab6d4" (UID: "369fb473-5832-4299-80a7-81eecc4ab6d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.171019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "369fb473-5832-4299-80a7-81eecc4ab6d4" (UID: "369fb473-5832-4299-80a7-81eecc4ab6d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.248592 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.248615 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrt7z\" (UniqueName: \"kubernetes.io/projected/369fb473-5832-4299-80a7-81eecc4ab6d4-kube-api-access-xrt7z\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.248626 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.248635 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369fb473-5832-4299-80a7-81eecc4ab6d4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.694732 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.695388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk" event={"ID":"369fb473-5832-4299-80a7-81eecc4ab6d4","Type":"ContainerDied","Data":"578c709e365dfb7cca53fcbfe847032e2599877bd747fec63dec9379c23f7660"} Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.695413 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="578c709e365dfb7cca53fcbfe847032e2599877bd747fec63dec9379c23f7660" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.750209 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:38:21 crc kubenswrapper[4707]: E0121 15:38:21.750797 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369fb473-5832-4299-80a7-81eecc4ab6d4" containerName="nova-cell1-conductor-db-sync" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.750823 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="369fb473-5832-4299-80a7-81eecc4ab6d4" containerName="nova-cell1-conductor-db-sync" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.751017 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="369fb473-5832-4299-80a7-81eecc4ab6d4" containerName="nova-cell1-conductor-db-sync" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.751617 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.753032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.757464 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.864358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.864425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25bqm\" (UniqueName: \"kubernetes.io/projected/40c68303-00df-4442-9bfc-11c7c74a30c6-kube-api-access-25bqm\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.864518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.965759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.965876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25bqm\" (UniqueName: \"kubernetes.io/projected/40c68303-00df-4442-9bfc-11c7c74a30c6-kube-api-access-25bqm\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.966035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.971400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.971504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:21 crc kubenswrapper[4707]: I0121 15:38:21.981343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25bqm\" (UniqueName: \"kubernetes.io/projected/40c68303-00df-4442-9bfc-11c7c74a30c6-kube-api-access-25bqm\") pod \"nova-cell1-conductor-0\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.045882 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.066322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-config-data\") pod \"31884041-4815-4438-a2cd-7bd2756f9f21\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.066363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-scripts\") pod \"31884041-4815-4438-a2cd-7bd2756f9f21\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.066423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssvjv\" (UniqueName: \"kubernetes.io/projected/31884041-4815-4438-a2cd-7bd2756f9f21-kube-api-access-ssvjv\") pod \"31884041-4815-4438-a2cd-7bd2756f9f21\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.066482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-combined-ca-bundle\") pod \"31884041-4815-4438-a2cd-7bd2756f9f21\" (UID: \"31884041-4815-4438-a2cd-7bd2756f9f21\") " Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.067570 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.070235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31884041-4815-4438-a2cd-7bd2756f9f21-kube-api-access-ssvjv" (OuterVolumeSpecName: "kube-api-access-ssvjv") pod "31884041-4815-4438-a2cd-7bd2756f9f21" (UID: "31884041-4815-4438-a2cd-7bd2756f9f21"). InnerVolumeSpecName "kube-api-access-ssvjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.072881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-scripts" (OuterVolumeSpecName: "scripts") pod "31884041-4815-4438-a2cd-7bd2756f9f21" (UID: "31884041-4815-4438-a2cd-7bd2756f9f21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.099077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-config-data" (OuterVolumeSpecName: "config-data") pod "31884041-4815-4438-a2cd-7bd2756f9f21" (UID: "31884041-4815-4438-a2cd-7bd2756f9f21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.099308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31884041-4815-4438-a2cd-7bd2756f9f21" (UID: "31884041-4815-4438-a2cd-7bd2756f9f21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.167861 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.167891 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.167902 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssvjv\" (UniqueName: \"kubernetes.io/projected/31884041-4815-4438-a2cd-7bd2756f9f21-kube-api-access-ssvjv\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.167912 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31884041-4815-4438-a2cd-7bd2756f9f21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.446637 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.704329 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.704326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bn475" event={"ID":"31884041-4815-4438-a2cd-7bd2756f9f21","Type":"ContainerDied","Data":"ba6a6534b0b637484b4e19d4f4e71f92f5aaa9fe908f8f3dd8c10cbbbd06315c"} Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.704604 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6a6534b0b637484b4e19d4f4e71f92f5aaa9fe908f8f3dd8c10cbbbd06315c" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.706430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"40c68303-00df-4442-9bfc-11c7c74a30c6","Type":"ContainerStarted","Data":"b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf"} Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.706459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"40c68303-00df-4442-9bfc-11c7c74a30c6","Type":"ContainerStarted","Data":"cb28fbbb085a56f4117ac92c67aaa47cab12c1e9ecaca8cdb7fdcd0703fa7e70"} Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.706914 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.733395 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.733379306 podStartE2EDuration="1.733379306s" podCreationTimestamp="2026-01-21 15:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:22.725763758 +0000 UTC m=+2199.907279980" watchObservedRunningTime="2026-01-21 15:38:22.733379306 +0000 UTC m=+2199.914895529" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.735976 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.744664 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.744696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.765512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.964485 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.976736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.990736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.990935 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-log" 
containerID="cri-o://7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3" gracePeriod=30 Jan 21 15:38:22 crc kubenswrapper[4707]: I0121 15:38:22.991070 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-metadata" containerID="cri-o://0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c" gracePeriod=30 Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.104077 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.109856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.109900 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.541645 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.594667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-config-data\") pod \"05e4b725-a13d-4249-9708-5b736ed9baae\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.594747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-nova-metadata-tls-certs\") pod \"05e4b725-a13d-4249-9708-5b736ed9baae\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.594836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-combined-ca-bundle\") pod \"05e4b725-a13d-4249-9708-5b736ed9baae\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.594871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e4b725-a13d-4249-9708-5b736ed9baae-logs\") pod \"05e4b725-a13d-4249-9708-5b736ed9baae\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.594901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt48s\" (UniqueName: \"kubernetes.io/projected/05e4b725-a13d-4249-9708-5b736ed9baae-kube-api-access-pt48s\") pod \"05e4b725-a13d-4249-9708-5b736ed9baae\" (UID: \"05e4b725-a13d-4249-9708-5b736ed9baae\") " Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.600063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e4b725-a13d-4249-9708-5b736ed9baae-kube-api-access-pt48s" (OuterVolumeSpecName: "kube-api-access-pt48s") pod "05e4b725-a13d-4249-9708-5b736ed9baae" (UID: "05e4b725-a13d-4249-9708-5b736ed9baae"). InnerVolumeSpecName "kube-api-access-pt48s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.614557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e4b725-a13d-4249-9708-5b736ed9baae-logs" (OuterVolumeSpecName: "logs") pod "05e4b725-a13d-4249-9708-5b736ed9baae" (UID: "05e4b725-a13d-4249-9708-5b736ed9baae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.632739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-config-data" (OuterVolumeSpecName: "config-data") pod "05e4b725-a13d-4249-9708-5b736ed9baae" (UID: "05e4b725-a13d-4249-9708-5b736ed9baae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.635529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05e4b725-a13d-4249-9708-5b736ed9baae" (UID: "05e4b725-a13d-4249-9708-5b736ed9baae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.646962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "05e4b725-a13d-4249-9708-5b736ed9baae" (UID: "05e4b725-a13d-4249-9708-5b736ed9baae"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.697422 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.697458 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e4b725-a13d-4249-9708-5b736ed9baae-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.697468 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt48s\" (UniqueName: \"kubernetes.io/projected/05e4b725-a13d-4249-9708-5b736ed9baae-kube-api-access-pt48s\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.697481 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.697489 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e4b725-a13d-4249-9708-5b736ed9baae-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.732723 4707 generic.go:334] "Generic (PLEG): container finished" podID="05e4b725-a13d-4249-9708-5b736ed9baae" containerID="0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c" exitCode=0 Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.732756 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="05e4b725-a13d-4249-9708-5b736ed9baae" containerID="7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3" exitCode=143 Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.733579 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.734256 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-log" containerID="cri-o://529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a" gracePeriod=30 Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.734898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"05e4b725-a13d-4249-9708-5b736ed9baae","Type":"ContainerDied","Data":"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c"} Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.734968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"05e4b725-a13d-4249-9708-5b736ed9baae","Type":"ContainerDied","Data":"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3"} Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.734983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"05e4b725-a13d-4249-9708-5b736ed9baae","Type":"ContainerDied","Data":"2ec698cf4fb5be0583ddce15d6df57b29cd42cf8465dbc134c0a4c77fbec398a"} Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.735001 4707 scope.go:117] "RemoveContainer" containerID="0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.735350 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-api" containerID="cri-o://bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825" gracePeriod=30 Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.747942 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.748019 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.768847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.789329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.793002 4707 scope.go:117] "RemoveContainer" containerID="7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.797201 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807090 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:23 crc kubenswrapper[4707]: E0121 15:38:23.807553 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-metadata" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807572 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-metadata" Jan 21 15:38:23 crc kubenswrapper[4707]: E0121 15:38:23.807606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-log" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807613 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-log" Jan 21 15:38:23 crc kubenswrapper[4707]: E0121 15:38:23.807625 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31884041-4815-4438-a2cd-7bd2756f9f21" containerName="nova-manage" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807630 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="31884041-4815-4438-a2cd-7bd2756f9f21" containerName="nova-manage" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807780 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-metadata" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="31884041-4815-4438-a2cd-7bd2756f9f21" containerName="nova-manage" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.807829 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" containerName="nova-metadata-log" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.808685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.820398 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.820527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.821764 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.827618 4707 scope.go:117] "RemoveContainer" containerID="0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c" Jan 21 15:38:23 crc kubenswrapper[4707]: E0121 15:38:23.830766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c\": container with ID starting with 0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c not found: ID does not exist" containerID="0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.830792 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c"} err="failed to get container status \"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c\": rpc error: code = NotFound desc = could not find container \"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c\": container with ID starting with 0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c not found: ID does not exist" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.830822 4707 scope.go:117] "RemoveContainer" containerID="7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3" Jan 21 15:38:23 crc kubenswrapper[4707]: E0121 15:38:23.833053 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3\": container with ID starting with 7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3 not found: ID does not exist" containerID="7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.833076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3"} err="failed to get container status \"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3\": rpc error: code = NotFound desc = could not find container \"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3\": container with ID starting with 7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3 not found: ID does not exist" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.833090 4707 scope.go:117] "RemoveContainer" containerID="0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.833418 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c"} err="failed to get container status 
\"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c\": rpc error: code = NotFound desc = could not find container \"0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c\": container with ID starting with 0656fe4b4706cb57bf45080c309f011d8dab51b74e5eb45876303e2b8fd60f3c not found: ID does not exist" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.833441 4707 scope.go:117] "RemoveContainer" containerID="7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3" Jan 21 15:38:23 crc kubenswrapper[4707]: I0121 15:38:23.833676 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3"} err="failed to get container status \"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3\": rpc error: code = NotFound desc = could not find container \"7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3\": container with ID starting with 7017597d1b04c0630ed2473e4db93867db3d7bdc47a92f07580b5274cf7822f3 not found: ID does not exist" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.004659 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-logs\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.004726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.004792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.004846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfm2b\" (UniqueName: \"kubernetes.io/projected/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-kube-api-access-gfm2b\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.005059 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-config-data\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.107311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-config-data\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.107364 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-logs\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.107434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.107501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.107563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfm2b\" (UniqueName: \"kubernetes.io/projected/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-kube-api-access-gfm2b\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.107851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-logs\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.111665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-config-data\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.112411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.119624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.141681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfm2b\" (UniqueName: \"kubernetes.io/projected/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-kube-api-access-gfm2b\") pod \"nova-metadata-0\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.433787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.743417 4707 generic.go:334] "Generic (PLEG): container finished" podID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerID="529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a" exitCode=143 Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.743500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"058a69b4-b6a3-4533-a682-07c7c1b299d1","Type":"ContainerDied","Data":"529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a"} Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.745316 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d7dabca5-e35d-4932-b272-df52109fcd37" containerName="nova-scheduler-scheduler" containerID="cri-o://d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8" gracePeriod=30 Jan 21 15:38:24 crc kubenswrapper[4707]: I0121 15:38:24.920974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.202852 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e4b725-a13d-4249-9708-5b736ed9baae" path="/var/lib/kubelet/pods/05e4b725-a13d-4249-9708-5b736ed9baae/volumes" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.502557 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.640638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-combined-ca-bundle\") pod \"d7dabca5-e35d-4932-b272-df52109fcd37\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.641150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-config-data\") pod \"d7dabca5-e35d-4932-b272-df52109fcd37\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.641173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfxm\" (UniqueName: \"kubernetes.io/projected/d7dabca5-e35d-4932-b272-df52109fcd37-kube-api-access-2xfxm\") pod \"d7dabca5-e35d-4932-b272-df52109fcd37\" (UID: \"d7dabca5-e35d-4932-b272-df52109fcd37\") " Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.644773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dabca5-e35d-4932-b272-df52109fcd37-kube-api-access-2xfxm" (OuterVolumeSpecName: "kube-api-access-2xfxm") pod "d7dabca5-e35d-4932-b272-df52109fcd37" (UID: "d7dabca5-e35d-4932-b272-df52109fcd37"). InnerVolumeSpecName "kube-api-access-2xfxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.665475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7dabca5-e35d-4932-b272-df52109fcd37" (UID: "d7dabca5-e35d-4932-b272-df52109fcd37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.667924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-config-data" (OuterVolumeSpecName: "config-data") pod "d7dabca5-e35d-4932-b272-df52109fcd37" (UID: "d7dabca5-e35d-4932-b272-df52109fcd37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.743587 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.743613 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfxm\" (UniqueName: \"kubernetes.io/projected/d7dabca5-e35d-4932-b272-df52109fcd37-kube-api-access-2xfxm\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.743623 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dabca5-e35d-4932-b272-df52109fcd37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.753428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5dfb4638-344f-48b5-a3bd-0ce06e4b9409","Type":"ContainerStarted","Data":"7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a"} Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.753473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5dfb4638-344f-48b5-a3bd-0ce06e4b9409","Type":"ContainerStarted","Data":"e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00"} Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.753484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5dfb4638-344f-48b5-a3bd-0ce06e4b9409","Type":"ContainerStarted","Data":"242d5732cf75e9519ffec42fc78bb729f16da4d72bd1397acd647622ab90594f"} Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.754498 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7dabca5-e35d-4932-b272-df52109fcd37" containerID="d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8" exitCode=0 Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.754531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d7dabca5-e35d-4932-b272-df52109fcd37","Type":"ContainerDied","Data":"d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8"} Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.754546 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.754562 4707 scope.go:117] "RemoveContainer" containerID="d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.754552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d7dabca5-e35d-4932-b272-df52109fcd37","Type":"ContainerDied","Data":"ab9e3c445bfb4a4d5dfc8d142c59d056c7a72110a838574f9df3e4326e55fe2f"} Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.768429 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.76841047 podStartE2EDuration="2.76841047s" podCreationTimestamp="2026-01-21 15:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:25.768001951 +0000 UTC m=+2202.949518174" watchObservedRunningTime="2026-01-21 15:38:25.76841047 +0000 UTC m=+2202.949926682" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.775528 4707 scope.go:117] "RemoveContainer" containerID="d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8" Jan 21 15:38:25 crc kubenswrapper[4707]: E0121 15:38:25.775939 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8\": container with ID starting with d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8 not found: ID does not exist" containerID="d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.775968 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8"} err="failed to get container status \"d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8\": rpc error: code = NotFound desc = could not find container \"d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8\": container with ID starting with d5b65606792dd5dd6b5c7fa904926b2f70214004f2df38987912a3fe6daf92d8 not found: ID does not exist" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.789909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.795919 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.805584 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:25 crc kubenswrapper[4707]: E0121 15:38:25.806063 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dabca5-e35d-4932-b272-df52109fcd37" containerName="nova-scheduler-scheduler" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.806084 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dabca5-e35d-4932-b272-df52109fcd37" containerName="nova-scheduler-scheduler" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.806279 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dabca5-e35d-4932-b272-df52109fcd37" containerName="nova-scheduler-scheduler" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.806954 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.809276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.815707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.948110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4hgl\" (UniqueName: \"kubernetes.io/projected/f98fe6a0-ff43-444b-8b87-9172e629b79d-kube-api-access-w4hgl\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.948405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-config-data\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:25 crc kubenswrapper[4707]: I0121 15:38:25.948470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.052186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4hgl\" (UniqueName: \"kubernetes.io/projected/f98fe6a0-ff43-444b-8b87-9172e629b79d-kube-api-access-w4hgl\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.052441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-config-data\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.052511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.057488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.057488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-config-data\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 
crc kubenswrapper[4707]: I0121 15:38:26.066474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4hgl\" (UniqueName: \"kubernetes.io/projected/f98fe6a0-ff43-444b-8b87-9172e629b79d-kube-api-access-w4hgl\") pod \"nova-scheduler-0\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.128616 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.523861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.763158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f98fe6a0-ff43-444b-8b87-9172e629b79d","Type":"ContainerStarted","Data":"2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a"} Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.763495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f98fe6a0-ff43-444b-8b87-9172e629b79d","Type":"ContainerStarted","Data":"430489298d2b6f38f6b561287c0afdbdb0f1efdb0e58041e978c8f3a9aa8cc5a"} Jan 21 15:38:26 crc kubenswrapper[4707]: I0121 15:38:26.782578 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.782560638 podStartE2EDuration="1.782560638s" podCreationTimestamp="2026-01-21 15:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:26.774438115 +0000 UTC m=+2203.955954337" watchObservedRunningTime="2026-01-21 15:38:26.782560638 +0000 UTC m=+2203.964076859" Jan 21 15:38:27 crc kubenswrapper[4707]: I0121 15:38:27.090926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:38:27 crc kubenswrapper[4707]: I0121 15:38:27.191911 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7dabca5-e35d-4932-b272-df52109fcd37" path="/var/lib/kubelet/pods/d7dabca5-e35d-4932-b272-df52109fcd37/volumes" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.103931 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.121446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.791523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.941520 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw"] Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.942916 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.945837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.948079 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:38:28 crc kubenswrapper[4707]: I0121 15:38:28.959899 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw"] Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.134663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-scripts\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.134863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.134885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-config-data\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.135006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr8xt\" (UniqueName: \"kubernetes.io/projected/f853ee64-61b1-4bbb-afa4-87fa50a0935a-kube-api-access-hr8xt\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.237332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.237367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-config-data\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.237410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr8xt\" (UniqueName: \"kubernetes.io/projected/f853ee64-61b1-4bbb-afa4-87fa50a0935a-kube-api-access-hr8xt\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc 
kubenswrapper[4707]: I0121 15:38:29.237606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-scripts\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.243931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-config-data\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.246095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-scripts\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.253093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.253978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr8xt\" (UniqueName: \"kubernetes.io/projected/f853ee64-61b1-4bbb-afa4-87fa50a0935a-kube-api-access-hr8xt\") pod \"nova-cell1-cell-mapping-hplxw\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.263997 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.435471 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.435538 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.463388 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.644067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cc4w\" (UniqueName: \"kubernetes.io/projected/058a69b4-b6a3-4533-a682-07c7c1b299d1-kube-api-access-8cc4w\") pod \"058a69b4-b6a3-4533-a682-07c7c1b299d1\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.644166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle\") pod \"058a69b4-b6a3-4533-a682-07c7c1b299d1\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.644255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a69b4-b6a3-4533-a682-07c7c1b299d1-logs\") pod \"058a69b4-b6a3-4533-a682-07c7c1b299d1\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.644285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-config-data\") pod \"058a69b4-b6a3-4533-a682-07c7c1b299d1\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.645094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058a69b4-b6a3-4533-a682-07c7c1b299d1-logs" (OuterVolumeSpecName: "logs") pod "058a69b4-b6a3-4533-a682-07c7c1b299d1" (UID: "058a69b4-b6a3-4533-a682-07c7c1b299d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.650328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058a69b4-b6a3-4533-a682-07c7c1b299d1-kube-api-access-8cc4w" (OuterVolumeSpecName: "kube-api-access-8cc4w") pod "058a69b4-b6a3-4533-a682-07c7c1b299d1" (UID: "058a69b4-b6a3-4533-a682-07c7c1b299d1"). InnerVolumeSpecName "kube-api-access-8cc4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:29 crc kubenswrapper[4707]: E0121 15:38:29.663910 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle podName:058a69b4-b6a3-4533-a682-07c7c1b299d1 nodeName:}" failed. No retries permitted until 2026-01-21 15:38:30.163888779 +0000 UTC m=+2207.345405001 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle") pod "058a69b4-b6a3-4533-a682-07c7c1b299d1" (UID: "058a69b4-b6a3-4533-a682-07c7c1b299d1") : error deleting /var/lib/kubelet/pods/058a69b4-b6a3-4533-a682-07c7c1b299d1/volume-subpaths: remove /var/lib/kubelet/pods/058a69b4-b6a3-4533-a682-07c7c1b299d1/volume-subpaths: no such file or directory Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.666120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-config-data" (OuterVolumeSpecName: "config-data") pod "058a69b4-b6a3-4533-a682-07c7c1b299d1" (UID: "058a69b4-b6a3-4533-a682-07c7c1b299d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.717577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw"] Jan 21 15:38:29 crc kubenswrapper[4707]: W0121 15:38:29.720574 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf853ee64_61b1_4bbb_afa4_87fa50a0935a.slice/crio-20447f1fc0ca690d47c47f2340858dc68673a77b83d74b37066b2f1ee0d5b867 WatchSource:0}: Error finding container 20447f1fc0ca690d47c47f2340858dc68673a77b83d74b37066b2f1ee0d5b867: Status 404 returned error can't find the container with id 20447f1fc0ca690d47c47f2340858dc68673a77b83d74b37066b2f1ee0d5b867 Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.746840 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.746868 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cc4w\" (UniqueName: \"kubernetes.io/projected/058a69b4-b6a3-4533-a682-07c7c1b299d1-kube-api-access-8cc4w\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.746880 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a69b4-b6a3-4533-a682-07c7c1b299d1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.786421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" event={"ID":"f853ee64-61b1-4bbb-afa4-87fa50a0935a","Type":"ContainerStarted","Data":"20447f1fc0ca690d47c47f2340858dc68673a77b83d74b37066b2f1ee0d5b867"} Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.788827 4707 generic.go:334] "Generic (PLEG): container finished" podID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerID="bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825" exitCode=0 Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.789010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"058a69b4-b6a3-4533-a682-07c7c1b299d1","Type":"ContainerDied","Data":"bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825"} Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.789054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"058a69b4-b6a3-4533-a682-07c7c1b299d1","Type":"ContainerDied","Data":"e86c8e9e77e4a6e1ea0b8624b9f802d72cd0f4bac0b2d4516ac6da8de1fd9640"} Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.789018 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.789082 4707 scope.go:117] "RemoveContainer" containerID="bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.805578 4707 scope.go:117] "RemoveContainer" containerID="529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.823121 4707 scope.go:117] "RemoveContainer" containerID="bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825" Jan 21 15:38:29 crc kubenswrapper[4707]: E0121 15:38:29.823416 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825\": container with ID starting with bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825 not found: ID does not exist" containerID="bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.823456 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825"} err="failed to get container status \"bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825\": rpc error: code = NotFound desc = could not find container \"bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825\": container with ID starting with bd5e57c2c1a1b3afb9b9bebf00f7fb9ae2bf6f933fce9126d48c1dd07fad3825 not found: ID does not exist" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.823539 4707 scope.go:117] "RemoveContainer" containerID="529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a" Jan 21 15:38:29 crc kubenswrapper[4707]: E0121 15:38:29.823958 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a\": container with ID starting with 529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a not found: ID does not exist" containerID="529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a" Jan 21 15:38:29 crc kubenswrapper[4707]: I0121 15:38:29.823981 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a"} err="failed to get container status \"529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a\": rpc error: code = NotFound desc = could not find container \"529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a\": container with ID starting with 529b69b8122722ee5e55dc81c1259e1e1bed594c4b4a86acfeef6a2bd2a00a4a not found: ID does not exist" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.260106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle\") pod \"058a69b4-b6a3-4533-a682-07c7c1b299d1\" (UID: \"058a69b4-b6a3-4533-a682-07c7c1b299d1\") " Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.270904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"058a69b4-b6a3-4533-a682-07c7c1b299d1" (UID: "058a69b4-b6a3-4533-a682-07c7c1b299d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.363088 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a69b4-b6a3-4533-a682-07c7c1b299d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.421994 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.442271 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.455576 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:30 crc kubenswrapper[4707]: E0121 15:38:30.456434 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-api" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.456455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-api" Jan 21 15:38:30 crc kubenswrapper[4707]: E0121 15:38:30.456502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-log" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.456509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-log" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.456766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-api" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.456797 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" containerName="nova-api-log" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.458122 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.460080 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.463890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.566825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b785b25-10dc-4ca7-a080-2b8f469d5a61-logs\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.566884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.566964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77dzj\" (UniqueName: \"kubernetes.io/projected/9b785b25-10dc-4ca7-a080-2b8f469d5a61-kube-api-access-77dzj\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.566981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-config-data\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.668179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.668430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77dzj\" (UniqueName: \"kubernetes.io/projected/9b785b25-10dc-4ca7-a080-2b8f469d5a61-kube-api-access-77dzj\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.668463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-config-data\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.669005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b785b25-10dc-4ca7-a080-2b8f469d5a61-logs\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.669329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9b785b25-10dc-4ca7-a080-2b8f469d5a61-logs\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.679376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-config-data\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.679413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.686280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77dzj\" (UniqueName: \"kubernetes.io/projected/9b785b25-10dc-4ca7-a080-2b8f469d5a61-kube-api-access-77dzj\") pod \"nova-api-0\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.774685 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.800982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" event={"ID":"f853ee64-61b1-4bbb-afa4-87fa50a0935a","Type":"ContainerStarted","Data":"0f01c350b247b2f192966e13748e0e762a3105665c2a60f722f20a0211dfeec9"} Jan 21 15:38:30 crc kubenswrapper[4707]: I0121 15:38:30.831085 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" podStartSLOduration=2.8310592039999998 podStartE2EDuration="2.831059204s" podCreationTimestamp="2026-01-21 15:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:30.818733278 +0000 UTC m=+2208.000249500" watchObservedRunningTime="2026-01-21 15:38:30.831059204 +0000 UTC m=+2208.012575426" Jan 21 15:38:31 crc kubenswrapper[4707]: I0121 15:38:31.128819 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:31 crc kubenswrapper[4707]: I0121 15:38:31.168204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:31 crc kubenswrapper[4707]: W0121 15:38:31.175466 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b785b25_10dc_4ca7_a080_2b8f469d5a61.slice/crio-7238c2c59ed7257513dc6265f731ece3a805c27061b697b31c3af3250098549a WatchSource:0}: Error finding container 7238c2c59ed7257513dc6265f731ece3a805c27061b697b31c3af3250098549a: Status 404 returned error can't find the container with id 7238c2c59ed7257513dc6265f731ece3a805c27061b697b31c3af3250098549a Jan 21 15:38:31 crc kubenswrapper[4707]: I0121 15:38:31.190159 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058a69b4-b6a3-4533-a682-07c7c1b299d1" path="/var/lib/kubelet/pods/058a69b4-b6a3-4533-a682-07c7c1b299d1/volumes" Jan 21 15:38:31 crc kubenswrapper[4707]: I0121 15:38:31.822827 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9b785b25-10dc-4ca7-a080-2b8f469d5a61","Type":"ContainerStarted","Data":"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564"} Jan 21 15:38:31 crc kubenswrapper[4707]: I0121 15:38:31.823133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9b785b25-10dc-4ca7-a080-2b8f469d5a61","Type":"ContainerStarted","Data":"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a"} Jan 21 15:38:31 crc kubenswrapper[4707]: I0121 15:38:31.823148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9b785b25-10dc-4ca7-a080-2b8f469d5a61","Type":"ContainerStarted","Data":"7238c2c59ed7257513dc6265f731ece3a805c27061b697b31c3af3250098549a"} Jan 21 15:38:33 crc kubenswrapper[4707]: I0121 15:38:33.836765 4707 generic.go:334] "Generic (PLEG): container finished" podID="f853ee64-61b1-4bbb-afa4-87fa50a0935a" containerID="0f01c350b247b2f192966e13748e0e762a3105665c2a60f722f20a0211dfeec9" exitCode=0 Jan 21 15:38:33 crc kubenswrapper[4707]: I0121 15:38:33.836846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" event={"ID":"f853ee64-61b1-4bbb-afa4-87fa50a0935a","Type":"ContainerDied","Data":"0f01c350b247b2f192966e13748e0e762a3105665c2a60f722f20a0211dfeec9"} Jan 21 15:38:33 crc kubenswrapper[4707]: I0121 15:38:33.853371 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.853356834 podStartE2EDuration="3.853356834s" podCreationTimestamp="2026-01-21 15:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:31.836886853 +0000 UTC m=+2209.018403075" watchObservedRunningTime="2026-01-21 15:38:33.853356834 +0000 UTC m=+2211.034873056" Jan 21 15:38:34 crc kubenswrapper[4707]: I0121 15:38:34.434487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:34 crc kubenswrapper[4707]: I0121 15:38:34.434654 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.113756 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.161059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-config-data\") pod \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.161098 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr8xt\" (UniqueName: \"kubernetes.io/projected/f853ee64-61b1-4bbb-afa4-87fa50a0935a-kube-api-access-hr8xt\") pod \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.161159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-scripts\") pod \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.161174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-combined-ca-bundle\") pod \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\" (UID: \"f853ee64-61b1-4bbb-afa4-87fa50a0935a\") " Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.166476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-scripts" (OuterVolumeSpecName: "scripts") pod "f853ee64-61b1-4bbb-afa4-87fa50a0935a" (UID: "f853ee64-61b1-4bbb-afa4-87fa50a0935a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.167919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f853ee64-61b1-4bbb-afa4-87fa50a0935a-kube-api-access-hr8xt" (OuterVolumeSpecName: "kube-api-access-hr8xt") pod "f853ee64-61b1-4bbb-afa4-87fa50a0935a" (UID: "f853ee64-61b1-4bbb-afa4-87fa50a0935a"). InnerVolumeSpecName "kube-api-access-hr8xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.184482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-config-data" (OuterVolumeSpecName: "config-data") pod "f853ee64-61b1-4bbb-afa4-87fa50a0935a" (UID: "f853ee64-61b1-4bbb-afa4-87fa50a0935a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.186441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f853ee64-61b1-4bbb-afa4-87fa50a0935a" (UID: "f853ee64-61b1-4bbb-afa4-87fa50a0935a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.263325 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.263350 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.263358 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f853ee64-61b1-4bbb-afa4-87fa50a0935a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.263367 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr8xt\" (UniqueName: \"kubernetes.io/projected/f853ee64-61b1-4bbb-afa4-87fa50a0935a-kube-api-access-hr8xt\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.444913 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.444971 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.874886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" event={"ID":"f853ee64-61b1-4bbb-afa4-87fa50a0935a","Type":"ContainerDied","Data":"20447f1fc0ca690d47c47f2340858dc68673a77b83d74b37066b2f1ee0d5b867"} Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.875619 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20447f1fc0ca690d47c47f2340858dc68673a77b83d74b37066b2f1ee0d5b867" Jan 21 15:38:35 crc kubenswrapper[4707]: I0121 15:38:35.874999 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.035685 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.035916 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-log" containerID="cri-o://5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a" gracePeriod=30 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.035987 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-api" containerID="cri-o://632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564" gracePeriod=30 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.044737 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.044924 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="f98fe6a0-ff43-444b-8b87-9172e629b79d" containerName="nova-scheduler-scheduler" containerID="cri-o://2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a" gracePeriod=30 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.057463 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.064883 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-metadata" containerID="cri-o://7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a" gracePeriod=30 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.065037 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-log" containerID="cri-o://e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00" gracePeriod=30 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.519206 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.688635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-config-data\") pod \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.688692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-combined-ca-bundle\") pod \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.688731 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b785b25-10dc-4ca7-a080-2b8f469d5a61-logs\") pod \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.689141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b785b25-10dc-4ca7-a080-2b8f469d5a61-logs" (OuterVolumeSpecName: "logs") pod "9b785b25-10dc-4ca7-a080-2b8f469d5a61" (UID: "9b785b25-10dc-4ca7-a080-2b8f469d5a61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.689398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77dzj\" (UniqueName: \"kubernetes.io/projected/9b785b25-10dc-4ca7-a080-2b8f469d5a61-kube-api-access-77dzj\") pod \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\" (UID: \"9b785b25-10dc-4ca7-a080-2b8f469d5a61\") " Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.690382 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b785b25-10dc-4ca7-a080-2b8f469d5a61-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.692824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b785b25-10dc-4ca7-a080-2b8f469d5a61-kube-api-access-77dzj" (OuterVolumeSpecName: "kube-api-access-77dzj") pod "9b785b25-10dc-4ca7-a080-2b8f469d5a61" (UID: "9b785b25-10dc-4ca7-a080-2b8f469d5a61"). InnerVolumeSpecName "kube-api-access-77dzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.711600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-config-data" (OuterVolumeSpecName: "config-data") pod "9b785b25-10dc-4ca7-a080-2b8f469d5a61" (UID: "9b785b25-10dc-4ca7-a080-2b8f469d5a61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.712903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b785b25-10dc-4ca7-a080-2b8f469d5a61" (UID: "9b785b25-10dc-4ca7-a080-2b8f469d5a61"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.791971 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77dzj\" (UniqueName: \"kubernetes.io/projected/9b785b25-10dc-4ca7-a080-2b8f469d5a61-kube-api-access-77dzj\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.792003 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.792014 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b785b25-10dc-4ca7-a080-2b8f469d5a61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883494 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerID="632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564" exitCode=0 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883521 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerID="5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a" exitCode=143 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883550 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9b785b25-10dc-4ca7-a080-2b8f469d5a61","Type":"ContainerDied","Data":"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564"} Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9b785b25-10dc-4ca7-a080-2b8f469d5a61","Type":"ContainerDied","Data":"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a"} Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9b785b25-10dc-4ca7-a080-2b8f469d5a61","Type":"ContainerDied","Data":"7238c2c59ed7257513dc6265f731ece3a805c27061b697b31c3af3250098549a"} Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.883645 4707 scope.go:117] "RemoveContainer" containerID="632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.885048 4707 generic.go:334] "Generic (PLEG): container finished" podID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerID="e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00" exitCode=143 Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.885084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5dfb4638-344f-48b5-a3bd-0ce06e4b9409","Type":"ContainerDied","Data":"e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00"} Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.903265 4707 scope.go:117] "RemoveContainer" containerID="5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.909832 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.925475 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.928066 4707 scope.go:117] "RemoveContainer" containerID="632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564" Jan 21 15:38:36 crc kubenswrapper[4707]: E0121 15:38:36.931220 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564\": container with ID starting with 632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564 not found: ID does not exist" containerID="632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.931269 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564"} err="failed to get container status \"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564\": rpc error: code = NotFound desc = could not find container \"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564\": container with ID starting with 632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564 not found: ID does not exist" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.931289 4707 scope.go:117] "RemoveContainer" containerID="5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a" Jan 21 15:38:36 crc kubenswrapper[4707]: E0121 15:38:36.931582 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a\": container with ID starting with 5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a not found: ID does not exist" containerID="5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.931601 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a"} err="failed to get container status \"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a\": rpc error: code = NotFound desc = could not find container \"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a\": container with ID starting with 5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a not found: ID does not exist" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.931614 4707 scope.go:117] "RemoveContainer" containerID="632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.931846 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564"} err="failed to get container status \"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564\": rpc error: code = NotFound desc = could not find container \"632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564\": container with ID starting with 632e2df0644020060ae0a22159c46c95db990d2b4936d8ca0ada6e2c80dbd564 not found: ID does not exist" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.931928 4707 scope.go:117] "RemoveContainer" 
containerID="5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.932270 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a"} err="failed to get container status \"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a\": rpc error: code = NotFound desc = could not find container \"5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a\": container with ID starting with 5d9147e465346d9e9fe036e04d9f800f25f12294284b5986c56c04f42fb92e4a not found: ID does not exist" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935069 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: E0121 15:38:36.935441 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853ee64-61b1-4bbb-afa4-87fa50a0935a" containerName="nova-manage" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935459 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853ee64-61b1-4bbb-afa4-87fa50a0935a" containerName="nova-manage" Jan 21 15:38:36 crc kubenswrapper[4707]: E0121 15:38:36.935485 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-api" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935491 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-api" Jan 21 15:38:36 crc kubenswrapper[4707]: E0121 15:38:36.935513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-log" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-log" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935707 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-log" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853ee64-61b1-4bbb-afa4-87fa50a0935a" containerName="nova-manage" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.935747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" containerName="nova-api-api" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.936694 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.943195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.944729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.994470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6879bdf9-63ff-4ef8-bf96-6f21380be697-logs\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.994606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7prj\" (UniqueName: \"kubernetes.io/projected/6879bdf9-63ff-4ef8-bf96-6f21380be697-kube-api-access-z7prj\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.994710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:36 crc kubenswrapper[4707]: I0121 15:38:36.994737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-config-data\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.095884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7prj\" (UniqueName: \"kubernetes.io/projected/6879bdf9-63ff-4ef8-bf96-6f21380be697-kube-api-access-z7prj\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.095956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.095975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-config-data\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.096035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6879bdf9-63ff-4ef8-bf96-6f21380be697-logs\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.096366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6879bdf9-63ff-4ef8-bf96-6f21380be697-logs\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.099878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-config-data\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.100109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.109398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7prj\" (UniqueName: \"kubernetes.io/projected/6879bdf9-63ff-4ef8-bf96-6f21380be697-kube-api-access-z7prj\") pod \"nova-api-0\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.191759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b785b25-10dc-4ca7-a080-2b8f469d5a61" path="/var/lib/kubelet/pods/9b785b25-10dc-4ca7-a080-2b8f469d5a61/volumes" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.249391 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.635666 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:37 crc kubenswrapper[4707]: W0121 15:38:37.647532 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6879bdf9_63ff_4ef8_bf96_6f21380be697.slice/crio-c14afe53c80956ec0ff72d9eb41c58d0bcb3ab04a981a668256940ecd3eb984c WatchSource:0}: Error finding container c14afe53c80956ec0ff72d9eb41c58d0bcb3ab04a981a668256940ecd3eb984c: Status 404 returned error can't find the container with id c14afe53c80956ec0ff72d9eb41c58d0bcb3ab04a981a668256940ecd3eb984c Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.744028 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.895032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6879bdf9-63ff-4ef8-bf96-6f21380be697","Type":"ContainerStarted","Data":"86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7"} Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.895652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6879bdf9-63ff-4ef8-bf96-6f21380be697","Type":"ContainerStarted","Data":"c14afe53c80956ec0ff72d9eb41c58d0bcb3ab04a981a668256940ecd3eb984c"} Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.896848 4707 generic.go:334] "Generic (PLEG): container finished" podID="f98fe6a0-ff43-444b-8b87-9172e629b79d" containerID="2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a" exitCode=0 Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.896878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f98fe6a0-ff43-444b-8b87-9172e629b79d","Type":"ContainerDied","Data":"2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a"} Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.896895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f98fe6a0-ff43-444b-8b87-9172e629b79d","Type":"ContainerDied","Data":"430489298d2b6f38f6b561287c0afdbdb0f1efdb0e58041e978c8f3a9aa8cc5a"} Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.896914 4707 scope.go:117] "RemoveContainer" containerID="2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.897054 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.908205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4hgl\" (UniqueName: \"kubernetes.io/projected/f98fe6a0-ff43-444b-8b87-9172e629b79d-kube-api-access-w4hgl\") pod \"f98fe6a0-ff43-444b-8b87-9172e629b79d\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.908280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-combined-ca-bundle\") pod \"f98fe6a0-ff43-444b-8b87-9172e629b79d\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.908471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-config-data\") pod \"f98fe6a0-ff43-444b-8b87-9172e629b79d\" (UID: \"f98fe6a0-ff43-444b-8b87-9172e629b79d\") " Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.913236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f98fe6a0-ff43-444b-8b87-9172e629b79d-kube-api-access-w4hgl" (OuterVolumeSpecName: "kube-api-access-w4hgl") pod "f98fe6a0-ff43-444b-8b87-9172e629b79d" (UID: "f98fe6a0-ff43-444b-8b87-9172e629b79d"). InnerVolumeSpecName "kube-api-access-w4hgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.915928 4707 scope.go:117] "RemoveContainer" containerID="2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a" Jan 21 15:38:37 crc kubenswrapper[4707]: E0121 15:38:37.916355 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a\": container with ID starting with 2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a not found: ID does not exist" containerID="2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.916448 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a"} err="failed to get container status \"2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a\": rpc error: code = NotFound desc = could not find container \"2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a\": container with ID starting with 2c0e9e1bae2e5a216f672b043f295958c376e83931df5f7d6ba1e82f7902c61a not found: ID does not exist" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.940987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f98fe6a0-ff43-444b-8b87-9172e629b79d" (UID: "f98fe6a0-ff43-444b-8b87-9172e629b79d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:37 crc kubenswrapper[4707]: I0121 15:38:37.945616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-config-data" (OuterVolumeSpecName: "config-data") pod "f98fe6a0-ff43-444b-8b87-9172e629b79d" (UID: "f98fe6a0-ff43-444b-8b87-9172e629b79d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.010938 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4hgl\" (UniqueName: \"kubernetes.io/projected/f98fe6a0-ff43-444b-8b87-9172e629b79d-kube-api-access-w4hgl\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.010967 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.010978 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98fe6a0-ff43-444b-8b87-9172e629b79d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.230289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.255179 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.268290 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:38 crc kubenswrapper[4707]: E0121 15:38:38.268773 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98fe6a0-ff43-444b-8b87-9172e629b79d" containerName="nova-scheduler-scheduler" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.268799 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98fe6a0-ff43-444b-8b87-9172e629b79d" containerName="nova-scheduler-scheduler" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.269054 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f98fe6a0-ff43-444b-8b87-9172e629b79d" containerName="nova-scheduler-scheduler" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.269733 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.271413 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.278128 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.418751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8drr\" (UniqueName: \"kubernetes.io/projected/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-kube-api-access-f8drr\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.419037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-config-data\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.419180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.520406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.520621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8drr\" (UniqueName: \"kubernetes.io/projected/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-kube-api-access-f8drr\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.520720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-config-data\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.524202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.524947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-config-data\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.536069 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8drr\" (UniqueName: \"kubernetes.io/projected/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-kube-api-access-f8drr\") pod \"nova-scheduler-0\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.586769 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.906782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6879bdf9-63ff-4ef8-bf96-6f21380be697","Type":"ContainerStarted","Data":"78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57"} Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.923909 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.923894381 podStartE2EDuration="2.923894381s" podCreationTimestamp="2026-01-21 15:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:38.918853453 +0000 UTC m=+2216.100369675" watchObservedRunningTime="2026-01-21 15:38:38.923894381 +0000 UTC m=+2216.105410603" Jan 21 15:38:38 crc kubenswrapper[4707]: I0121 15:38:38.962711 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:38:38 crc kubenswrapper[4707]: W0121 15:38:38.964955 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f7c9e9_a4dc_4d9e_80e4_b9564520e790.slice/crio-6cbbb25f79d2789525e5635ee65c57a8f5baca55ec4777b6364a7d60dbfd495e WatchSource:0}: Error finding container 6cbbb25f79d2789525e5635ee65c57a8f5baca55ec4777b6364a7d60dbfd495e: Status 404 returned error can't find the container with id 6cbbb25f79d2789525e5635ee65c57a8f5baca55ec4777b6364a7d60dbfd495e Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.192821 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f98fe6a0-ff43-444b-8b87-9172e629b79d" path="/var/lib/kubelet/pods/f98fe6a0-ff43-444b-8b87-9172e629b79d/volumes" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.792270 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.939359 4707 generic.go:334] "Generic (PLEG): container finished" podID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerID="7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a" exitCode=0 Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.939484 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.939529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5dfb4638-344f-48b5-a3bd-0ce06e4b9409","Type":"ContainerDied","Data":"7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a"} Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.939569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5dfb4638-344f-48b5-a3bd-0ce06e4b9409","Type":"ContainerDied","Data":"242d5732cf75e9519ffec42fc78bb729f16da4d72bd1397acd647622ab90594f"} Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.939588 4707 scope.go:117] "RemoveContainer" containerID="7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.943049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790","Type":"ContainerStarted","Data":"e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b"} Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.943089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790","Type":"ContainerStarted","Data":"6cbbb25f79d2789525e5635ee65c57a8f5baca55ec4777b6364a7d60dbfd495e"} Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.949391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-combined-ca-bundle\") pod \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.949488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfm2b\" (UniqueName: \"kubernetes.io/projected/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-kube-api-access-gfm2b\") pod \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.949603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-logs\") pod \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.949677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-nova-metadata-tls-certs\") pod \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.949720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-config-data\") pod \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\" (UID: \"5dfb4638-344f-48b5-a3bd-0ce06e4b9409\") " Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.951466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-logs" (OuterVolumeSpecName: "logs") pod 
"5dfb4638-344f-48b5-a3bd-0ce06e4b9409" (UID: "5dfb4638-344f-48b5-a3bd-0ce06e4b9409"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.955740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-kube-api-access-gfm2b" (OuterVolumeSpecName: "kube-api-access-gfm2b") pod "5dfb4638-344f-48b5-a3bd-0ce06e4b9409" (UID: "5dfb4638-344f-48b5-a3bd-0ce06e4b9409"). InnerVolumeSpecName "kube-api-access-gfm2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.959786 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.9597704390000001 podStartE2EDuration="1.959770439s" podCreationTimestamp="2026-01-21 15:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:39.957423526 +0000 UTC m=+2217.138939749" watchObservedRunningTime="2026-01-21 15:38:39.959770439 +0000 UTC m=+2217.141286661" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.963069 4707 scope.go:117] "RemoveContainer" containerID="e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.976828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dfb4638-344f-48b5-a3bd-0ce06e4b9409" (UID: "5dfb4638-344f-48b5-a3bd-0ce06e4b9409"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.978331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-config-data" (OuterVolumeSpecName: "config-data") pod "5dfb4638-344f-48b5-a3bd-0ce06e4b9409" (UID: "5dfb4638-344f-48b5-a3bd-0ce06e4b9409"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:39 crc kubenswrapper[4707]: I0121 15:38:39.995996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5dfb4638-344f-48b5-a3bd-0ce06e4b9409" (UID: "5dfb4638-344f-48b5-a3bd-0ce06e4b9409"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.008574 4707 scope.go:117] "RemoveContainer" containerID="7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a" Jan 21 15:38:40 crc kubenswrapper[4707]: E0121 15:38:40.008891 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a\": container with ID starting with 7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a not found: ID does not exist" containerID="7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.008993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a"} err="failed to get container status \"7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a\": rpc error: code = NotFound desc = could not find container \"7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a\": container with ID starting with 7f3874f071aa368082af141333382fc0299a3fa442715db36c1064e7339b6c5a not found: ID does not exist" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.009068 4707 scope.go:117] "RemoveContainer" containerID="e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00" Jan 21 15:38:40 crc kubenswrapper[4707]: E0121 15:38:40.009450 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00\": container with ID starting with e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00 not found: ID does not exist" containerID="e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.009485 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00"} err="failed to get container status \"e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00\": rpc error: code = NotFound desc = could not find container \"e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00\": container with ID starting with e3663c213c9789039b8d16e98c207828e3c71dd1b3f2bd6778e2d8b49f0fed00 not found: ID does not exist" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.052840 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.052879 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.052889 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.052898 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.052907 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfm2b\" (UniqueName: \"kubernetes.io/projected/5dfb4638-344f-48b5-a3bd-0ce06e4b9409-kube-api-access-gfm2b\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.266938 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.271478 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.296495 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:40 crc kubenswrapper[4707]: E0121 15:38:40.296893 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-metadata" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.296912 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-metadata" Jan 21 15:38:40 crc kubenswrapper[4707]: E0121 15:38:40.296935 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-log" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.296943 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-log" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.297101 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-metadata" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.297123 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" containerName="nova-metadata-log" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.298001 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.302558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.304213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.304365 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.464383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.464694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cee2089-6404-41ee-b6ae-82d788ced5fc-logs\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.464772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.464799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jtb\" (UniqueName: \"kubernetes.io/projected/9cee2089-6404-41ee-b6ae-82d788ced5fc-kube-api-access-m7jtb\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.464872 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-config-data\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.566683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.566731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jtb\" (UniqueName: \"kubernetes.io/projected/9cee2089-6404-41ee-b6ae-82d788ced5fc-kube-api-access-m7jtb\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.566791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-config-data\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.566915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.566977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cee2089-6404-41ee-b6ae-82d788ced5fc-logs\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.567448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cee2089-6404-41ee-b6ae-82d788ced5fc-logs\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.571793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.572282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.573205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-config-data\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.581236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jtb\" (UniqueName: \"kubernetes.io/projected/9cee2089-6404-41ee-b6ae-82d788ced5fc-kube-api-access-m7jtb\") pod \"nova-metadata-0\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:40 crc kubenswrapper[4707]: I0121 15:38:40.618904 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:41 crc kubenswrapper[4707]: I0121 15:38:41.006679 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:38:41 crc kubenswrapper[4707]: W0121 15:38:41.007387 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cee2089_6404_41ee_b6ae_82d788ced5fc.slice/crio-6dcb9a5746b9dbf80af99667cbbce8da259bc8f27b2b23a88b48e63db383f69c WatchSource:0}: Error finding container 6dcb9a5746b9dbf80af99667cbbce8da259bc8f27b2b23a88b48e63db383f69c: Status 404 returned error can't find the container with id 6dcb9a5746b9dbf80af99667cbbce8da259bc8f27b2b23a88b48e63db383f69c Jan 21 15:38:41 crc kubenswrapper[4707]: I0121 15:38:41.193313 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dfb4638-344f-48b5-a3bd-0ce06e4b9409" path="/var/lib/kubelet/pods/5dfb4638-344f-48b5-a3bd-0ce06e4b9409/volumes" Jan 21 15:38:41 crc kubenswrapper[4707]: I0121 15:38:41.960912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9cee2089-6404-41ee-b6ae-82d788ced5fc","Type":"ContainerStarted","Data":"1c8e77bfa68e38980554d4bde9e72ce205b90327a6373937e410e37c84c7a7f8"} Jan 21 15:38:41 crc kubenswrapper[4707]: I0121 15:38:41.961212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9cee2089-6404-41ee-b6ae-82d788ced5fc","Type":"ContainerStarted","Data":"1337d8606bac38af9b79c2ec732c96f137ac330857aa0b08877428b22036abc5"} Jan 21 15:38:41 crc kubenswrapper[4707]: I0121 15:38:41.961223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9cee2089-6404-41ee-b6ae-82d788ced5fc","Type":"ContainerStarted","Data":"6dcb9a5746b9dbf80af99667cbbce8da259bc8f27b2b23a88b48e63db383f69c"} Jan 21 15:38:41 crc kubenswrapper[4707]: I0121 15:38:41.978664 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.978642506 podStartE2EDuration="1.978642506s" podCreationTimestamp="2026-01-21 15:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:41.973417101 +0000 UTC m=+2219.154933323" watchObservedRunningTime="2026-01-21 15:38:41.978642506 +0000 UTC m=+2219.160158728" Jan 21 15:38:43 crc kubenswrapper[4707]: I0121 15:38:43.587239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:44 crc kubenswrapper[4707]: I0121 15:38:44.013209 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:45 crc kubenswrapper[4707]: I0121 15:38:45.619853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:45 crc kubenswrapper[4707]: I0121 15:38:45.620409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.080545 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.081007 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="624a8ae2-0096-40d9-afc7-efa6b5f4f947" containerName="kube-state-metrics" containerID="cri-o://10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275" gracePeriod=30 Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.249581 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.249649 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:47 crc kubenswrapper[4707]: E0121 15:38:47.277985 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod624a8ae2_0096_40d9_afc7_efa6b5f4f947.slice/crio-conmon-10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.508687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.611121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lsj9\" (UniqueName: \"kubernetes.io/projected/624a8ae2-0096-40d9-afc7-efa6b5f4f947-kube-api-access-4lsj9\") pod \"624a8ae2-0096-40d9-afc7-efa6b5f4f947\" (UID: \"624a8ae2-0096-40d9-afc7-efa6b5f4f947\") " Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.618436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624a8ae2-0096-40d9-afc7-efa6b5f4f947-kube-api-access-4lsj9" (OuterVolumeSpecName: "kube-api-access-4lsj9") pod "624a8ae2-0096-40d9-afc7-efa6b5f4f947" (UID: "624a8ae2-0096-40d9-afc7-efa6b5f4f947"). InnerVolumeSpecName "kube-api-access-4lsj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4707]: I0121 15:38:47.716076 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lsj9\" (UniqueName: \"kubernetes.io/projected/624a8ae2-0096-40d9-afc7-efa6b5f4f947-kube-api-access-4lsj9\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.012126 4707 generic.go:334] "Generic (PLEG): container finished" podID="624a8ae2-0096-40d9-afc7-efa6b5f4f947" containerID="10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275" exitCode=2 Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.012212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"624a8ae2-0096-40d9-afc7-efa6b5f4f947","Type":"ContainerDied","Data":"10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275"} Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.012466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"624a8ae2-0096-40d9-afc7-efa6b5f4f947","Type":"ContainerDied","Data":"e5044283263e865126b8f868efbfb765da9c9adad6de39076539e005a411484e"} Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.012497 4707 scope.go:117] "RemoveContainer" containerID="10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.012278 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.033430 4707 scope.go:117] "RemoveContainer" containerID="10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275" Jan 21 15:38:48 crc kubenswrapper[4707]: E0121 15:38:48.034117 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275\": container with ID starting with 10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275 not found: ID does not exist" containerID="10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.034166 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275"} err="failed to get container status \"10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275\": rpc error: code = NotFound desc = could not find container \"10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275\": container with ID starting with 10c309903a605f5d0f26eead5ce07e3acd0cb76cc43b5dab1ae5315c99f51275 not found: ID does not exist" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.043300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.081113 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.101392 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:38:48 crc kubenswrapper[4707]: E0121 15:38:48.102134 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624a8ae2-0096-40d9-afc7-efa6b5f4f947" containerName="kube-state-metrics" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.102160 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="624a8ae2-0096-40d9-afc7-efa6b5f4f947" containerName="kube-state-metrics" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.102430 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="624a8ae2-0096-40d9-afc7-efa6b5f4f947" containerName="kube-state-metrics" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.103450 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.106678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.106699 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.112674 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.124638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.124720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.124894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.125013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skhst\" (UniqueName: \"kubernetes.io/projected/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-api-access-skhst\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.228335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.228667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skhst\" (UniqueName: \"kubernetes.io/projected/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-api-access-skhst\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.229885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 
15:38:48.230000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.242608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.242625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.242828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.259683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skhst\" (UniqueName: \"kubernetes.io/projected/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-api-access-skhst\") pod \"kube-state-metrics-0\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.338064 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.338074 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.430073 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.587357 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.616546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.708663 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.708990 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-central-agent" containerID="cri-o://6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e" gracePeriod=30 Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.709534 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="proxy-httpd" containerID="cri-o://a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754" gracePeriod=30 Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.709628 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="sg-core" containerID="cri-o://a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8" gracePeriod=30 Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.709677 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-notification-agent" containerID="cri-o://7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc" gracePeriod=30 Jan 21 15:38:48 crc kubenswrapper[4707]: I0121 15:38:48.858893 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.026994 4707 generic.go:334] "Generic (PLEG): container finished" podID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerID="a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754" exitCode=0 Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.027022 4707 generic.go:334] "Generic (PLEG): container finished" podID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerID="a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8" exitCode=2 Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.027062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerDied","Data":"a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754"} Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.027089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerDied","Data":"a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8"} Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.029239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" 
event={"ID":"7a514056-a57f-4bc0-89b5-60ade40aadfc","Type":"ContainerStarted","Data":"df56b24817d3684719c8d776134fd3a87475dee4503b4ef449c32e500cd81c45"} Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.055206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:38:49 crc kubenswrapper[4707]: I0121 15:38:49.193839 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624a8ae2-0096-40d9-afc7-efa6b5f4f947" path="/var/lib/kubelet/pods/624a8ae2-0096-40d9-afc7-efa6b5f4f947/volumes" Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.041525 4707 generic.go:334] "Generic (PLEG): container finished" podID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerID="6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e" exitCode=0 Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.041594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerDied","Data":"6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e"} Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.043908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7a514056-a57f-4bc0-89b5-60ade40aadfc","Type":"ContainerStarted","Data":"5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3"} Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.044252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.059678 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.7747179480000002 podStartE2EDuration="2.05966348s" podCreationTimestamp="2026-01-21 15:38:48 +0000 UTC" firstStartedPulling="2026-01-21 15:38:48.863025379 +0000 UTC m=+2226.044541590" lastFinishedPulling="2026-01-21 15:38:49.147970901 +0000 UTC m=+2226.329487122" observedRunningTime="2026-01-21 15:38:50.057662599 +0000 UTC m=+2227.239178820" watchObservedRunningTime="2026-01-21 15:38:50.05966348 +0000 UTC m=+2227.241179702" Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.619759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:50 crc kubenswrapper[4707]: I0121 15:38:50.619838 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.585088 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-run-httpd\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzth6\" (UniqueName: \"kubernetes.io/projected/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-kube-api-access-pzth6\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-sg-core-conf-yaml\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630668 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-combined-ca-bundle\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-config-data\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630880 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-log-httpd\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.630907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-scripts\") pod \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\" (UID: \"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc\") " Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.632600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.633703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.634697 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.643950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-scripts" (OuterVolumeSpecName: "scripts") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.653237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-kube-api-access-pzth6" (OuterVolumeSpecName: "kube-api-access-pzth6") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "kube-api-access-pzth6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.666374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.710523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.733602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-config-data" (OuterVolumeSpecName: "config-data") pod "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" (UID: "b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.733686 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzth6\" (UniqueName: \"kubernetes.io/projected/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-kube-api-access-pzth6\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.734142 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.734156 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.734169 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.734181 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.734192 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:51 crc kubenswrapper[4707]: I0121 15:38:51.837954 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.061010 4707 generic.go:334] "Generic (PLEG): container finished" podID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerID="7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc" exitCode=0 Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.061075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerDied","Data":"7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc"} Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.061127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc","Type":"ContainerDied","Data":"6b1487380fb4340340dd0d634453d35b3c026fa347792f87340d8625ec14b7e8"} Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.061144 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.061160 4707 scope.go:117] "RemoveContainer" containerID="a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.092977 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.093950 4707 scope.go:117] "RemoveContainer" containerID="a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.112954 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.122181 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.123409 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-notification-agent" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.123432 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-notification-agent" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.123447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="sg-core" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124395 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="sg-core" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.124428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="proxy-httpd" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="proxy-httpd" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.124480 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-central-agent" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-central-agent" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124645 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-central-agent" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="sg-core" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124676 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="proxy-httpd" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.124686 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" containerName="ceilometer-notification-agent" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.126736 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.130134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.130492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.130665 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.146127 4707 scope.go:117] "RemoveContainer" containerID="7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.154421 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.173850 4707 scope.go:117] "RemoveContainer" containerID="6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.194065 4707 scope.go:117] "RemoveContainer" containerID="a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.194737 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754\": container with ID starting with a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754 not found: ID does not exist" containerID="a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.194780 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754"} err="failed to get container status \"a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754\": rpc error: code = NotFound desc = could not find container \"a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754\": container with ID starting with a3146e49ce51b7e1298cb9f76ec97ff02aab0b6621c70dea851ce2821ef87754 not found: ID does not exist" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.194817 4707 scope.go:117] "RemoveContainer" containerID="a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.195193 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8\": container with ID starting with a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8 not found: ID does not exist" containerID="a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.195225 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8"} err="failed to get container status \"a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8\": rpc error: code = NotFound desc = could not find container \"a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8\": container with ID starting with 
a56488bd862ed4440925c646b2b827fd40b7f298b4a837c9431771f3691261e8 not found: ID does not exist" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.195264 4707 scope.go:117] "RemoveContainer" containerID="7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.195549 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc\": container with ID starting with 7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc not found: ID does not exist" containerID="7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.195576 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc"} err="failed to get container status \"7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc\": rpc error: code = NotFound desc = could not find container \"7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc\": container with ID starting with 7234b6aad6aa54f9b1b1abd8f82b35a3ad63dbabcc8003677b989430573d59dc not found: ID does not exist" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.195591 4707 scope.go:117] "RemoveContainer" containerID="6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e" Jan 21 15:38:52 crc kubenswrapper[4707]: E0121 15:38:52.195886 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e\": container with ID starting with 6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e not found: ID does not exist" containerID="6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.195919 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e"} err="failed to get container status \"6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e\": rpc error: code = NotFound desc = could not find container \"6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e\": container with ID starting with 6e465bb1e045c76a8b1f312af0071b6c01c68a02bb2fe074b24e71f043ca4f3e not found: ID does not exist" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-run-httpd\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-scripts\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-config-data\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrrz\" (UniqueName: \"kubernetes.io/projected/44715b3c-44ea-4cb5-898c-7e06ce3000dc-kube-api-access-cxrrz\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.249667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-log-httpd\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-scripts\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-config-data\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxrrz\" (UniqueName: \"kubernetes.io/projected/44715b3c-44ea-4cb5-898c-7e06ce3000dc-kube-api-access-cxrrz\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.352931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-log-httpd\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.353068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-run-httpd\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.353604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-run-httpd\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.354224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-log-httpd\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.358338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.358738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.365271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.372399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-config-data\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.376483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-scripts\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.381418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrrz\" (UniqueName: \"kubernetes.io/projected/44715b3c-44ea-4cb5-898c-7e06ce3000dc-kube-api-access-cxrrz\") pod \"ceilometer-0\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.453211 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:52 crc kubenswrapper[4707]: I0121 15:38:52.861469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:53 crc kubenswrapper[4707]: I0121 15:38:53.073026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerStarted","Data":"3aa101bc564847b702933376452e4a21db0657a87890ca7a5145464cf7d1a8fb"} Jan 21 15:38:53 crc kubenswrapper[4707]: I0121 15:38:53.191662 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc" path="/var/lib/kubelet/pods/b99c4eaa-dc5a-4d7e-916a-b4d5418d09fc/volumes" Jan 21 15:38:54 crc kubenswrapper[4707]: I0121 15:38:54.081471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerStarted","Data":"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f"} Jan 21 15:38:54 crc kubenswrapper[4707]: E0121 15:38:54.339341 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = determining manifest MIME type for docker://quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67: Manifest does not match provided manifest digest sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67" Jan 21 15:38:54 crc kubenswrapper[4707]: E0121 15:38:54.339612 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n646h5dbh658h66dh5c6h56fh696h578h5cfh57dh5b5hd6h545h569h5d5hbfh666h577h696hf6h5ffhc7h56dh54chffh645h5f8h547h545h56fh54ch5bdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxrrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack-kuttl-tests(44715b3c-44ea-4cb5-898c-7e06ce3000dc): ErrImagePull: determining manifest MIME type for docker://quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67: Manifest does not match provided manifest digest sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67" logger="UnhandledError" Jan 21 15:38:55 crc kubenswrapper[4707]: I0121 15:38:55.089187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerStarted","Data":"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd"} Jan 21 15:38:56 crc kubenswrapper[4707]: E0121 15:38:56.335973 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"determining manifest MIME type for docker://quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67: Manifest does not match provided manifest digest sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67\"" pod="openstack-kuttl-tests/ceilometer-0" 
podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.080066 4707 scope.go:117] "RemoveContainer" containerID="9e9209849ce6017488f4669fc17def86f04557718a632745fb949e3858aed61d" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.099613 4707 scope.go:117] "RemoveContainer" containerID="a24de2a1a0d2d1105d4f042d523feff72032cf29f32b7f88df585a754e253a44" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.105155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerStarted","Data":"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc"} Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.105337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:57 crc kubenswrapper[4707]: E0121 15:38:57.106917 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67\\\"\"" pod="openstack-kuttl-tests/ceilometer-0" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.136969 4707 scope.go:117] "RemoveContainer" containerID="85176d83288ee8e0bcd7b88b5097eff0c9c6bf00f9758386dd0f8ace36c01fce" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.174129 4707 scope.go:117] "RemoveContainer" containerID="883f6570fece7c67c8b79e14447abbdbb70fd24d3a0f4805e5af9f115a6620cd" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.208978 4707 scope.go:117] "RemoveContainer" containerID="2d423c2230ae9d1a2174a39d585283c85105561e536de899f464d70c0a04a0eb" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.244432 4707 scope.go:117] "RemoveContainer" containerID="f36d8c9033a891382c33bba12614c135ec6171dacb14457377832a840b45c6be" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.264636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.264695 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.265179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.265535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.267316 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.274508 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.305076 4707 scope.go:117] "RemoveContainer" containerID="d7d98a1f6b0a50dff9ed5bf7d3edb969798ff4f76e9a3eb5436c368c2529ba04" Jan 21 15:38:57 crc kubenswrapper[4707]: I0121 15:38:57.361457 4707 scope.go:117] "RemoveContainer" containerID="3ab85ee94bca50a4accee416f902c8f5749d6ac2a33fa89d6c447eb295d21860" Jan 21 15:38:58 crc kubenswrapper[4707]: E0121 15:38:58.114212 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:df14f6de785b8aefc38ceb5b47088405224cfa914977c9ab811514cc77b08a67\\\"\"" pod="openstack-kuttl-tests/ceilometer-0" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" Jan 21 15:38:58 crc kubenswrapper[4707]: I0121 15:38:58.445431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:38:58 crc kubenswrapper[4707]: I0121 15:38:58.966980 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.118947 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="ceilometer-central-agent" containerID="cri-o://c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" gracePeriod=30 Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.119003 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="proxy-httpd" containerID="cri-o://7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" gracePeriod=30 Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.119042 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="sg-core" containerID="cri-o://41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" gracePeriod=30 Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.343337 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.786232 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-ceilometer-tls-certs\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrrz\" (UniqueName: \"kubernetes.io/projected/44715b3c-44ea-4cb5-898c-7e06ce3000dc-kube-api-access-cxrrz\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-config-data\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-sg-core-conf-yaml\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-run-httpd\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-scripts\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-combined-ca-bundle\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-log-httpd\") pod \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\" (UID: \"44715b3c-44ea-4cb5-898c-7e06ce3000dc\") " Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.901974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.902378 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.902396 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44715b3c-44ea-4cb5-898c-7e06ce3000dc-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.920065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44715b3c-44ea-4cb5-898c-7e06ce3000dc-kube-api-access-cxrrz" (OuterVolumeSpecName: "kube-api-access-cxrrz") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "kube-api-access-cxrrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.921912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-scripts" (OuterVolumeSpecName: "scripts") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.979138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.979184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:59 crc kubenswrapper[4707]: I0121 15:38:59.989643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.001824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-config-data" (OuterVolumeSpecName: "config-data") pod "44715b3c-44ea-4cb5-898c-7e06ce3000dc" (UID: "44715b3c-44ea-4cb5-898c-7e06ce3000dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.004155 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.004179 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrrz\" (UniqueName: \"kubernetes.io/projected/44715b3c-44ea-4cb5-898c-7e06ce3000dc-kube-api-access-cxrrz\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.004190 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.004199 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.004207 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.004215 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44715b3c-44ea-4cb5-898c-7e06ce3000dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.131916 4707 generic.go:334] "Generic (PLEG): container finished" podID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerID="7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" exitCode=0 Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.131945 4707 generic.go:334] "Generic (PLEG): container finished" podID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerID="41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" exitCode=2 Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.131955 4707 generic.go:334] "Generic (PLEG): container finished" podID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerID="c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" exitCode=0 Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.132088 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-log" containerID="cri-o://86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7" gracePeriod=30 Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.132346 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.135873 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-api" containerID="cri-o://78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57" gracePeriod=30 Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.135940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerDied","Data":"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc"} Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.135976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerDied","Data":"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd"} Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.135985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerDied","Data":"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f"} Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.135995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"44715b3c-44ea-4cb5-898c-7e06ce3000dc","Type":"ContainerDied","Data":"3aa101bc564847b702933376452e4a21db0657a87890ca7a5145464cf7d1a8fb"} Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.136008 4707 scope.go:117] "RemoveContainer" containerID="7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.169122 4707 scope.go:117] "RemoveContainer" containerID="41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.195844 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.197621 4707 scope.go:117] "RemoveContainer" containerID="c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.206625 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.217604 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:00 crc kubenswrapper[4707]: E0121 15:39:00.218027 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="ceilometer-central-agent" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.218048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="ceilometer-central-agent" Jan 21 15:39:00 crc kubenswrapper[4707]: E0121 15:39:00.218075 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="sg-core" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.218081 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="sg-core" Jan 21 15:39:00 crc kubenswrapper[4707]: E0121 15:39:00.218101 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="proxy-httpd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.218106 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="proxy-httpd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.218321 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="proxy-httpd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.218356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="sg-core" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.218366 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" containerName="ceilometer-central-agent" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.219987 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.222270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.222515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.222631 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.230371 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.235117 4707 scope.go:117] "RemoveContainer" containerID="7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" Jan 21 15:39:00 crc kubenswrapper[4707]: E0121 15:39:00.235512 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": container with ID starting with 7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc not found: ID does not exist" containerID="7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.235538 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc"} err="failed to get container status \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": rpc error: code = NotFound desc = could not find container \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": container with ID starting with 7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.235555 4707 scope.go:117] "RemoveContainer" containerID="41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" Jan 21 15:39:00 crc kubenswrapper[4707]: E0121 15:39:00.235749 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": container with ID starting with 
41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd not found: ID does not exist" containerID="41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.235773 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd"} err="failed to get container status \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": rpc error: code = NotFound desc = could not find container \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": container with ID starting with 41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.235784 4707 scope.go:117] "RemoveContainer" containerID="c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" Jan 21 15:39:00 crc kubenswrapper[4707]: E0121 15:39:00.236170 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": container with ID starting with c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f not found: ID does not exist" containerID="c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.236192 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f"} err="failed to get container status \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": rpc error: code = NotFound desc = could not find container \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": container with ID starting with c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.236206 4707 scope.go:117] "RemoveContainer" containerID="7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.240928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc"} err="failed to get container status \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": rpc error: code = NotFound desc = could not find container \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": container with ID starting with 7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.240954 4707 scope.go:117] "RemoveContainer" containerID="41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.246698 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd"} err="failed to get container status \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": rpc error: code = NotFound desc = could not find container \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": container with ID starting with 41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd not found: ID does not exist" Jan 21 
15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.246827 4707 scope.go:117] "RemoveContainer" containerID="c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.247844 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f"} err="failed to get container status \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": rpc error: code = NotFound desc = could not find container \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": container with ID starting with c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.247939 4707 scope.go:117] "RemoveContainer" containerID="7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.251490 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc"} err="failed to get container status \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": rpc error: code = NotFound desc = could not find container \"7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc\": container with ID starting with 7afce53e45b70673c35b4decbb28a9bbc0e4b0732a80d86e840c4fe966b3f3bc not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.251569 4707 scope.go:117] "RemoveContainer" containerID="41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.252167 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd"} err="failed to get container status \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": rpc error: code = NotFound desc = could not find container \"41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd\": container with ID starting with 41ef80c342141f9f1358a60bbd13e9bb16eeaf7380d333c239cfb7f9a14bd2cd not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.252202 4707 scope.go:117] "RemoveContainer" containerID="c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.252845 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f"} err="failed to get container status \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": rpc error: code = NotFound desc = could not find container \"c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f\": container with ID starting with c42d7c8f8f578158998dec5c0c437da1f41a0d1b10e660501c98efff409fca3f not found: ID does not exist" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-log-httpd\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309241 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zgk\" (UniqueName: \"kubernetes.io/projected/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-kube-api-access-67zgk\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-scripts\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.309455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-run-httpd\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.310334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-config-data\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-config-data\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-log-httpd\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zgk\" (UniqueName: \"kubernetes.io/projected/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-kube-api-access-67zgk\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-scripts\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.412708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-run-httpd\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.413072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-run-httpd\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.413868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-log-httpd\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.417288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-config-data\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.418079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.418818 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-scripts\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.420122 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.422622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.429764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zgk\" (UniqueName: \"kubernetes.io/projected/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-kube-api-access-67zgk\") pod \"ceilometer-0\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.626170 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.629226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.630378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.637684 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:39:00 crc kubenswrapper[4707]: I0121 15:39:00.941254 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:01 crc kubenswrapper[4707]: I0121 15:39:01.025152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:01 crc kubenswrapper[4707]: W0121 15:39:01.027757 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29016f3a_bc42_4f50_a3d0_d31a4a931e5b.slice/crio-b5e6aacb8e805234be93a9b39a5fa6391497d9320261a95c4a440c761adc5356 WatchSource:0}: Error finding container b5e6aacb8e805234be93a9b39a5fa6391497d9320261a95c4a440c761adc5356: Status 404 returned error can't find the container with id b5e6aacb8e805234be93a9b39a5fa6391497d9320261a95c4a440c761adc5356 Jan 21 15:39:01 crc kubenswrapper[4707]: I0121 15:39:01.141043 4707 generic.go:334] "Generic (PLEG): container finished" podID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerID="86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7" exitCode=143 Jan 21 15:39:01 crc kubenswrapper[4707]: I0121 15:39:01.141118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6879bdf9-63ff-4ef8-bf96-6f21380be697","Type":"ContainerDied","Data":"86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7"} Jan 21 15:39:01 crc 
kubenswrapper[4707]: I0121 15:39:01.142287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerStarted","Data":"b5e6aacb8e805234be93a9b39a5fa6391497d9320261a95c4a440c761adc5356"} Jan 21 15:39:01 crc kubenswrapper[4707]: I0121 15:39:01.148192 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:39:01 crc kubenswrapper[4707]: I0121 15:39:01.191840 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44715b3c-44ea-4cb5-898c-7e06ce3000dc" path="/var/lib/kubelet/pods/44715b3c-44ea-4cb5-898c-7e06ce3000dc/volumes" Jan 21 15:39:02 crc kubenswrapper[4707]: I0121 15:39:02.151945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerStarted","Data":"3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc"} Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.164171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerStarted","Data":"139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa"} Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.663185 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.775319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-combined-ca-bundle\") pod \"6879bdf9-63ff-4ef8-bf96-6f21380be697\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.775520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-config-data\") pod \"6879bdf9-63ff-4ef8-bf96-6f21380be697\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.775578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6879bdf9-63ff-4ef8-bf96-6f21380be697-logs\") pod \"6879bdf9-63ff-4ef8-bf96-6f21380be697\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.775604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7prj\" (UniqueName: \"kubernetes.io/projected/6879bdf9-63ff-4ef8-bf96-6f21380be697-kube-api-access-z7prj\") pod \"6879bdf9-63ff-4ef8-bf96-6f21380be697\" (UID: \"6879bdf9-63ff-4ef8-bf96-6f21380be697\") " Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.775965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6879bdf9-63ff-4ef8-bf96-6f21380be697-logs" (OuterVolumeSpecName: "logs") pod "6879bdf9-63ff-4ef8-bf96-6f21380be697" (UID: "6879bdf9-63ff-4ef8-bf96-6f21380be697"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.776431 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6879bdf9-63ff-4ef8-bf96-6f21380be697-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.779948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6879bdf9-63ff-4ef8-bf96-6f21380be697-kube-api-access-z7prj" (OuterVolumeSpecName: "kube-api-access-z7prj") pod "6879bdf9-63ff-4ef8-bf96-6f21380be697" (UID: "6879bdf9-63ff-4ef8-bf96-6f21380be697"). InnerVolumeSpecName "kube-api-access-z7prj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.797387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6879bdf9-63ff-4ef8-bf96-6f21380be697" (UID: "6879bdf9-63ff-4ef8-bf96-6f21380be697"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.798657 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-config-data" (OuterVolumeSpecName: "config-data") pod "6879bdf9-63ff-4ef8-bf96-6f21380be697" (UID: "6879bdf9-63ff-4ef8-bf96-6f21380be697"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.878293 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.878547 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7prj\" (UniqueName: \"kubernetes.io/projected/6879bdf9-63ff-4ef8-bf96-6f21380be697-kube-api-access-z7prj\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:03 crc kubenswrapper[4707]: I0121 15:39:03.878558 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6879bdf9-63ff-4ef8-bf96-6f21380be697-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.172216 4707 generic.go:334] "Generic (PLEG): container finished" podID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerID="78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57" exitCode=0 Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.172397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6879bdf9-63ff-4ef8-bf96-6f21380be697","Type":"ContainerDied","Data":"78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57"} Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.173143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6879bdf9-63ff-4ef8-bf96-6f21380be697","Type":"ContainerDied","Data":"c14afe53c80956ec0ff72d9eb41c58d0bcb3ab04a981a668256940ecd3eb984c"} Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.173213 4707 scope.go:117] "RemoveContainer" containerID="78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 
15:39:04.172487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.184468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerStarted","Data":"6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5"} Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.199692 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.205108 4707 scope.go:117] "RemoveContainer" containerID="86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.217049 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.224911 4707 scope.go:117] "RemoveContainer" containerID="78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57" Jan 21 15:39:04 crc kubenswrapper[4707]: E0121 15:39:04.225296 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57\": container with ID starting with 78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57 not found: ID does not exist" containerID="78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.225325 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57"} err="failed to get container status \"78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57\": rpc error: code = NotFound desc = could not find container \"78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57\": container with ID starting with 78aa5994c1dcef07670b754864b5bd93922c9bfe19fd2ea0875d572148ab6f57 not found: ID does not exist" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.225342 4707 scope.go:117] "RemoveContainer" containerID="86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7" Jan 21 15:39:04 crc kubenswrapper[4707]: E0121 15:39:04.225551 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7\": container with ID starting with 86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7 not found: ID does not exist" containerID="86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.225571 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7"} err="failed to get container status \"86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7\": rpc error: code = NotFound desc = could not find container \"86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7\": container with ID starting with 86d0caf728c25b1cc441a6bd17b5e9da3e8b33fed843aefdb5190b5420cff4b7 not found: ID does not exist" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.225596 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:04 crc kubenswrapper[4707]: E0121 15:39:04.225985 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-log" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.226002 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-log" Jan 21 15:39:04 crc kubenswrapper[4707]: E0121 15:39:04.226015 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-api" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.226021 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-api" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.226208 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-api" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.226239 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" containerName="nova-api-log" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.227139 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.228996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.229153 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.229278 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.232218 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.390831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64087307-6402-47cd-8a4b-37c3d90eb61e-logs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.390875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-config-data\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.390914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.390974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.391048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-public-tls-certs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.391063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gt9\" (UniqueName: \"kubernetes.io/projected/64087307-6402-47cd-8a4b-37c3d90eb61e-kube-api-access-v2gt9\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.492720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.492791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.492878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-public-tls-certs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.492896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gt9\" (UniqueName: \"kubernetes.io/projected/64087307-6402-47cd-8a4b-37c3d90eb61e-kube-api-access-v2gt9\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.492974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64087307-6402-47cd-8a4b-37c3d90eb61e-logs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.492998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-config-data\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.493727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64087307-6402-47cd-8a4b-37c3d90eb61e-logs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.496605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-public-tls-certs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.496720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-config-data\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.497798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.498269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.508613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gt9\" (UniqueName: \"kubernetes.io/projected/64087307-6402-47cd-8a4b-37c3d90eb61e-kube-api-access-v2gt9\") pod \"nova-api-0\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.546545 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:04 crc kubenswrapper[4707]: I0121 15:39:04.963350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.192772 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6879bdf9-63ff-4ef8-bf96-6f21380be697" path="/var/lib/kubelet/pods/6879bdf9-63ff-4ef8-bf96-6f21380be697/volumes" Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.195491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"64087307-6402-47cd-8a4b-37c3d90eb61e","Type":"ContainerStarted","Data":"fc38731eba6859dd3321273b90548f6258defbfac5f235ee1a8b652aac078066"} Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.195530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"64087307-6402-47cd-8a4b-37c3d90eb61e","Type":"ContainerStarted","Data":"85d413e1d6ad31b912ee3d5aae2d1f1571993fb658f165ae67457a2c81e2059a"} Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.197235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerStarted","Data":"29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7"} Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.197364 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="proxy-httpd" containerID="cri-o://29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7" gracePeriod=30 Jan 21 15:39:05 crc kubenswrapper[4707]: 
I0121 15:39:05.197346 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-central-agent" containerID="cri-o://3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc" gracePeriod=30 Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.197418 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="sg-core" containerID="cri-o://6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5" gracePeriod=30 Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.197376 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.197577 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-notification-agent" containerID="cri-o://139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa" gracePeriod=30 Jan 21 15:39:05 crc kubenswrapper[4707]: I0121 15:39:05.216859 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9595366859999999 podStartE2EDuration="5.21684652s" podCreationTimestamp="2026-01-21 15:39:00 +0000 UTC" firstStartedPulling="2026-01-21 15:39:01.029933879 +0000 UTC m=+2238.211450101" lastFinishedPulling="2026-01-21 15:39:04.287243713 +0000 UTC m=+2241.468759935" observedRunningTime="2026-01-21 15:39:05.216114493 +0000 UTC m=+2242.397630715" watchObservedRunningTime="2026-01-21 15:39:05.21684652 +0000 UTC m=+2242.398362742" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.207906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"64087307-6402-47cd-8a4b-37c3d90eb61e","Type":"ContainerStarted","Data":"228afae2defd2f7cd01706c71f8b84017837c2d44aa5f97f405fbfa3b2e25d2f"} Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.210960 4707 generic.go:334] "Generic (PLEG): container finished" podID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerID="29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7" exitCode=0 Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.210986 4707 generic.go:334] "Generic (PLEG): container finished" podID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerID="6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5" exitCode=2 Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.210998 4707 generic.go:334] "Generic (PLEG): container finished" podID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerID="139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa" exitCode=0 Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.210994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerDied","Data":"29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7"} Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.211034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerDied","Data":"6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5"} Jan 21 
15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.211045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerDied","Data":"139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa"} Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.237580 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.237562122 podStartE2EDuration="2.237562122s" podCreationTimestamp="2026-01-21 15:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:39:06.224774387 +0000 UTC m=+2243.406290609" watchObservedRunningTime="2026-01-21 15:39:06.237562122 +0000 UTC m=+2243.419078343" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.507868 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-combined-ca-bundle\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-sg-core-conf-yaml\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-run-httpd\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-scripts\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-ceilometer-tls-certs\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67zgk\" (UniqueName: \"kubernetes.io/projected/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-kube-api-access-67zgk\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-log-httpd\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc 
kubenswrapper[4707]: I0121 15:39:06.538679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-config-data\") pod \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\" (UID: \"29016f3a-bc42-4f50-a3d0-d31a4a931e5b\") " Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.538933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.539094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.539597 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.539618 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.543922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-scripts" (OuterVolumeSpecName: "scripts") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.544290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-kube-api-access-67zgk" (OuterVolumeSpecName: "kube-api-access-67zgk") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "kube-api-access-67zgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.561125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.578015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.601511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.610332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-config-data" (OuterVolumeSpecName: "config-data") pod "29016f3a-bc42-4f50-a3d0-d31a4a931e5b" (UID: "29016f3a-bc42-4f50-a3d0-d31a4a931e5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.641130 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.641161 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.641171 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.641181 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.641191 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67zgk\" (UniqueName: \"kubernetes.io/projected/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-kube-api-access-67zgk\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:06 crc kubenswrapper[4707]: I0121 15:39:06.641199 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29016f3a-bc42-4f50-a3d0-d31a4a931e5b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.219282 4707 generic.go:334] "Generic (PLEG): container finished" podID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerID="3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc" exitCode=0 Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.219325 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.219377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerDied","Data":"3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc"} Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.219403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"29016f3a-bc42-4f50-a3d0-d31a4a931e5b","Type":"ContainerDied","Data":"b5e6aacb8e805234be93a9b39a5fa6391497d9320261a95c4a440c761adc5356"} Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.219420 4707 scope.go:117] "RemoveContainer" containerID="29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.237827 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.238432 4707 scope.go:117] "RemoveContainer" containerID="6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.247177 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.256676 4707 scope.go:117] "RemoveContainer" containerID="139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.258383 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.259001 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="sg-core" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259021 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="sg-core" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.259051 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-central-agent" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259065 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-central-agent" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.259083 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-notification-agent" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-notification-agent" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.259099 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="proxy-httpd" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259105 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="proxy-httpd" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259315 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-notification-agent" Jan 21 15:39:07 crc 
kubenswrapper[4707]: I0121 15:39:07.259374 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="proxy-httpd" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259392 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="sg-core" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.259400 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" containerName="ceilometer-central-agent" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.263039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.267129 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.267264 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.267344 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.274201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.293373 4707 scope.go:117] "RemoveContainer" containerID="3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.312481 4707 scope.go:117] "RemoveContainer" containerID="29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.312787 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7\": container with ID starting with 29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7 not found: ID does not exist" containerID="29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.312842 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7"} err="failed to get container status \"29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7\": rpc error: code = NotFound desc = could not find container \"29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7\": container with ID starting with 29af4090284e4637790f25d0500f841c58b6e12780e9a238845076d8306abdc7 not found: ID does not exist" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.312859 4707 scope.go:117] "RemoveContainer" containerID="6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.313966 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5\": container with ID starting with 6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5 not found: ID does not exist" containerID="6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5" Jan 21 15:39:07 crc 
kubenswrapper[4707]: I0121 15:39:07.313994 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5"} err="failed to get container status \"6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5\": rpc error: code = NotFound desc = could not find container \"6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5\": container with ID starting with 6a24f87bb3242c8e764d9766d88b722fc43a97d0ad687dc09b54396860b4f1b5 not found: ID does not exist" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.314009 4707 scope.go:117] "RemoveContainer" containerID="139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.314201 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa\": container with ID starting with 139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa not found: ID does not exist" containerID="139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.314359 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa"} err="failed to get container status \"139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa\": rpc error: code = NotFound desc = could not find container \"139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa\": container with ID starting with 139bdda05956638958d2dcfd6faea359fe8087128d5b534fdbd71907ad916efa not found: ID does not exist" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.314382 4707 scope.go:117] "RemoveContainer" containerID="3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc" Jan 21 15:39:07 crc kubenswrapper[4707]: E0121 15:39:07.314713 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc\": container with ID starting with 3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc not found: ID does not exist" containerID="3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.314833 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc"} err="failed to get container status \"3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc\": rpc error: code = NotFound desc = could not find container \"3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc\": container with ID starting with 3d61d693533628a4f2b8b503198e4155f1ada1397cd3cd22aa46136c06d369bc not found: ID does not exist" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d45j\" (UniqueName: \"kubernetes.io/projected/4083be08-daf7-4cdc-a1ea-4409372710cd-kube-api-access-2d45j\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353431 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-run-httpd\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-scripts\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-config-data\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-log-httpd\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.353734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.455988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-scripts\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-config-data\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-log-httpd\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456230 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d45j\" (UniqueName: \"kubernetes.io/projected/4083be08-daf7-4cdc-a1ea-4409372710cd-kube-api-access-2d45j\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-run-httpd\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.456592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-run-httpd\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.457052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-log-httpd\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.460514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.460519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-scripts\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.461162 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.461614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-config-data\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.469675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d45j\" (UniqueName: \"kubernetes.io/projected/4083be08-daf7-4cdc-a1ea-4409372710cd-kube-api-access-2d45j\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.470016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.583022 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:07 crc kubenswrapper[4707]: I0121 15:39:07.995664 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:08 crc kubenswrapper[4707]: I0121 15:39:08.228505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerStarted","Data":"5892e969bc045f83934258bec74407ce36a69cb7226f77656fcca277edcf5b75"} Jan 21 15:39:09 crc kubenswrapper[4707]: I0121 15:39:09.191345 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29016f3a-bc42-4f50-a3d0-d31a4a931e5b" path="/var/lib/kubelet/pods/29016f3a-bc42-4f50-a3d0-d31a4a931e5b/volumes" Jan 21 15:39:09 crc kubenswrapper[4707]: I0121 15:39:09.237519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerStarted","Data":"703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a"} Jan 21 15:39:10 crc kubenswrapper[4707]: I0121 15:39:10.245572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerStarted","Data":"d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7"} Jan 21 15:39:10 crc kubenswrapper[4707]: I0121 15:39:10.246070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerStarted","Data":"c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74"} Jan 21 15:39:12 crc kubenswrapper[4707]: I0121 15:39:12.263013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerStarted","Data":"2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc"} Jan 21 15:39:12 crc kubenswrapper[4707]: 
I0121 15:39:12.263586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:12 crc kubenswrapper[4707]: I0121 15:39:12.296198 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.926871598 podStartE2EDuration="5.296184747s" podCreationTimestamp="2026-01-21 15:39:07 +0000 UTC" firstStartedPulling="2026-01-21 15:39:07.994473255 +0000 UTC m=+2245.175989477" lastFinishedPulling="2026-01-21 15:39:11.363786405 +0000 UTC m=+2248.545302626" observedRunningTime="2026-01-21 15:39:12.286537798 +0000 UTC m=+2249.468054020" watchObservedRunningTime="2026-01-21 15:39:12.296184747 +0000 UTC m=+2249.477700969" Jan 21 15:39:14 crc kubenswrapper[4707]: I0121 15:39:14.547450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:14 crc kubenswrapper[4707]: I0121 15:39:14.548454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:15 crc kubenswrapper[4707]: I0121 15:39:15.559971 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:39:15 crc kubenswrapper[4707]: I0121 15:39:15.559984 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:39:24 crc kubenswrapper[4707]: I0121 15:39:24.554209 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:24 crc kubenswrapper[4707]: I0121 15:39:24.555006 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:24 crc kubenswrapper[4707]: I0121 15:39:24.555435 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:24 crc kubenswrapper[4707]: I0121 15:39:24.559321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:25 crc kubenswrapper[4707]: I0121 15:39:25.362287 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:25 crc kubenswrapper[4707]: I0121 15:39:25.367344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:37 crc kubenswrapper[4707]: I0121 15:39:37.592154 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.230857 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.231516 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="37972b5f-69ec-4558-b68a-c8108b5b5fea" containerName="openstackclient" 
containerID="cri-o://2d5245eb454b9ce88454d22d980989055f69b8ce07955601437667dbc9834df3" gracePeriod=2 Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.284032 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.303744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.361345 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.437411 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.437475 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data podName:8e2084ed-217a-462c-8219-6cfa022751d9 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:40.937458558 +0000 UTC m=+2278.118974781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data") pod "rabbitmq-cell1-server-1" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.447636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.540268 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.540331 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:41.04031566 +0000 UTC m=+2278.221831882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.540365 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.540384 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:41.040377736 +0000 UTC m=+2278.221893959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.645157 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.645728 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="openstack-network-exporter" containerID="cri-o://1ad0818035f81047bb782a695932bdec1acc1f17765cf97e5139e9764439d548" gracePeriod=300 Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.668803 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8tr5g"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.690412 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.700871 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.719230 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8tr5g"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.735619 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-f889-account-create-update-9hjdd"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.756874 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-pwrg2"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.767485 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-brbmp"] Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.767905 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37972b5f-69ec-4558-b68a-c8108b5b5fea" containerName="openstackclient" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.767917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37972b5f-69ec-4558-b68a-c8108b5b5fea" containerName="openstackclient" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.768096 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37972b5f-69ec-4558-b68a-c8108b5b5fea" containerName="openstackclient" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.768734 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.774306 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.781916 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.782286 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="openstack-network-exporter" containerID="cri-o://e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197" gracePeriod=300 Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.806394 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-brbmp"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.828880 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.830052 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.838176 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.846991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjxg\" (UniqueName: \"kubernetes.io/projected/0c0fc739-8c30-46ab-aea8-1db8f2864340-kube-api-access-vrjxg\") pod \"root-account-create-update-brbmp\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.847291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts\") pod \"root-account-create-update-brbmp\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.863769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.887980 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="ovsdbserver-sb" containerID="cri-o://359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311" gracePeriod=300 Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.935282 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="ovsdbserver-nb" containerID="cri-o://86e3d70b4ce7602fd0e3af1064f4bf80df7351fe1430fbcc7626e11c9a2e2b89" gracePeriod=300 Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.951237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts\") pod \"root-account-create-update-brbmp\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.951530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclfd\" (UniqueName: \"kubernetes.io/projected/ce8faea1-a6d2-4be7-88ca-e92d10481925-kube-api-access-dclfd\") pod \"nova-cell0-6d10-account-create-update-wnm88\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.951752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrjxg\" (UniqueName: \"kubernetes.io/projected/0c0fc739-8c30-46ab-aea8-1db8f2864340-kube-api-access-vrjxg\") pod \"root-account-create-update-brbmp\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.951882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8faea1-a6d2-4be7-88ca-e92d10481925-operator-scripts\") pod \"nova-cell0-6d10-account-create-update-wnm88\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.952534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts\") pod \"root-account-create-update-brbmp\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.952666 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: E0121 15:39:40.952767 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data podName:8e2084ed-217a-462c-8219-6cfa022751d9 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:41.952753144 +0000 UTC m=+2279.134269366 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data") pod "rabbitmq-cell1-server-1" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.968578 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xwzrg"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.983973 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xwzrg"] Jan 21 15:39:40 crc kubenswrapper[4707]: I0121 15:39:40.989639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrjxg\" (UniqueName: \"kubernetes.io/projected/0c0fc739-8c30-46ab-aea8-1db8f2864340-kube-api-access-vrjxg\") pod \"root-account-create-update-brbmp\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.001937 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.018750 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5e9c-account-create-update-hvxbc"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.030860 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.050861 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ac8b-account-create-update-fkz9f"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.053666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8faea1-a6d2-4be7-88ca-e92d10481925-operator-scripts\") pod \"nova-cell0-6d10-account-create-update-wnm88\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.053852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dclfd\" (UniqueName: \"kubernetes.io/projected/ce8faea1-a6d2-4be7-88ca-e92d10481925-kube-api-access-dclfd\") pod \"nova-cell0-6d10-account-create-update-wnm88\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.054200 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.054244 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:42.054231684 +0000 UTC m=+2279.235747906 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.054693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8faea1-a6d2-4be7-88ca-e92d10481925-operator-scripts\") pod \"nova-cell0-6d10-account-create-update-wnm88\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.063883 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.063944 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:42.063931501 +0000 UTC m=+2279.245447723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.067742 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.068049 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="ovn-northd" containerID="cri-o://8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.068473 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="openstack-network-exporter" containerID="cri-o://a31decf41a1b02620ab2be257c60c8227184c0d7af43f0210eae219e937afaba" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.088683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclfd\" (UniqueName: \"kubernetes.io/projected/ce8faea1-a6d2-4be7-88ca-e92d10481925-kube-api-access-dclfd\") pod \"nova-cell0-6d10-account-create-update-wnm88\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.089770 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-pcvgx"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.103430 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.114190 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-pcvgx"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.146591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-lwmqg"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.157479 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-lwmqg"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.160502 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.170965 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.177854 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-hplxw"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.219001 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22546c4b-0348-445b-a781-c3a15ca2b4cd" path="/var/lib/kubelet/pods/22546c4b-0348-445b-a781-c3a15ca2b4cd/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.219765 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b6803b-8c90-49cb-860c-f08d6a472f9e" path="/var/lib/kubelet/pods/27b6803b-8c90-49cb-860c-f08d6a472f9e/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.220595 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b53d780-b8f6-4755-a954-db9895f87427" path="/var/lib/kubelet/pods/4b53d780-b8f6-4755-a954-db9895f87427/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.221941 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8463ac2d-d1cb-4505-adc1-e5019f545c85" path="/var/lib/kubelet/pods/8463ac2d-d1cb-4505-adc1-e5019f545c85/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.223209 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99428af2-9c90-49ba-909e-486ce037a1ee" path="/var/lib/kubelet/pods/99428af2-9c90-49ba-909e-486ce037a1ee/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.224534 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e6c1e3-a518-4ff6-8449-be149ab65bb1" path="/var/lib/kubelet/pods/e6e6c1e3-a518-4ff6-8449-be149ab65bb1/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.225167 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8" path="/var/lib/kubelet/pods/ee0ee8b6-e1c2-4d1d-b807-cad4d3e5c7f8/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.226173 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f853ee64-61b1-4bbb-afa4-87fa50a0935a" path="/var/lib/kubelet/pods/f853ee64-61b1-4bbb-afa4-87fa50a0935a/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.227197 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99c92c6-cf0f-406a-ad28-7134cace4dd7" path="/var/lib/kubelet/pods/f99c92c6-cf0f-406a-ad28-7134cace4dd7/volumes" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.256654 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.284690 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-e1d7-account-create-update-mr6l5"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.310403 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bn475"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.333693 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bn475"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.385618 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-c140-account-create-update-prxqq"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.396718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-c140-account-create-update-prxqq"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.408399 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9m9rb"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.418039 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9m9rb"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.477020 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_a84e26cb-9beb-4f23-9101-f4e981d8c394/ovsdbserver-sb/0.log" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.477103 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.478581 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-q7kt6"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.490093 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-q7kt6"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.492600 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_0131070b-7f14-4797-aa35-8ed780306565/ovsdbserver-nb/0.log" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.492646 4707 generic.go:334] "Generic (PLEG): container finished" podID="0131070b-7f14-4797-aa35-8ed780306565" containerID="1ad0818035f81047bb782a695932bdec1acc1f17765cf97e5139e9764439d548" exitCode=2 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.492660 4707 generic.go:334] "Generic (PLEG): container finished" podID="0131070b-7f14-4797-aa35-8ed780306565" containerID="86e3d70b4ce7602fd0e3af1064f4bf80df7351fe1430fbcc7626e11c9a2e2b89" exitCode=143 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.492696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0131070b-7f14-4797-aa35-8ed780306565","Type":"ContainerDied","Data":"1ad0818035f81047bb782a695932bdec1acc1f17765cf97e5139e9764439d548"} Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.492728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0131070b-7f14-4797-aa35-8ed780306565","Type":"ContainerDied","Data":"86e3d70b4ce7602fd0e3af1064f4bf80df7351fe1430fbcc7626e11c9a2e2b89"} Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.501447 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerID="a31decf41a1b02620ab2be257c60c8227184c0d7af43f0210eae219e937afaba" exitCode=2 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.501502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7e183977-7c22-4b6d-a465-132c8f85dbb0","Type":"ContainerDied","Data":"a31decf41a1b02620ab2be257c60c8227184c0d7af43f0210eae219e937afaba"} Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507245 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_a84e26cb-9beb-4f23-9101-f4e981d8c394/ovsdbserver-sb/0.log" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507286 4707 generic.go:334] "Generic (PLEG): container finished" podID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerID="e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197" exitCode=2 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507300 4707 generic.go:334] "Generic (PLEG): container finished" podID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerID="359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311" exitCode=143 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a84e26cb-9beb-4f23-9101-f4e981d8c394","Type":"ContainerDied","Data":"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197"} Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a84e26cb-9beb-4f23-9101-f4e981d8c394","Type":"ContainerDied","Data":"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311"} Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a84e26cb-9beb-4f23-9101-f4e981d8c394","Type":"ContainerDied","Data":"53095e8fd063ee2f94852ed4d31ef326bb37a0a23bb199b2f0f528ff5208f7f1"} Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507359 4707 scope.go:117] "RemoveContainer" containerID="e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.507465 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.543795 4707 scope.go:117] "RemoveContainer" containerID="359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.591319 4707 scope.go:117] "RemoveContainer" containerID="e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.593389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-config\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.593678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-scripts\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.593696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdb-rundir\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.593751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-combined-ca-bundle\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.593766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.594391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-config" (OuterVolumeSpecName: "config") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.594502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-scripts" (OuterVolumeSpecName: "scripts") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.593825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdbserver-sb-tls-certs\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsxpv\" (UniqueName: \"kubernetes.io/projected/a84e26cb-9beb-4f23-9101-f4e981d8c394-kube-api-access-jsxpv\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-metrics-certs-tls-certs\") pod \"a84e26cb-9beb-4f23-9101-f4e981d8c394\" (UID: \"a84e26cb-9beb-4f23-9101-f4e981d8c394\") " Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.595224 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197\": container with ID starting with e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197 not found: ID does not exist" containerID="e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595271 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197"} err="failed to get container status \"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197\": rpc error: code = NotFound desc = could not find container \"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197\": container with ID starting with e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197 not found: ID does not exist" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595293 4707 scope.go:117] "RemoveContainer" containerID="359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.595577 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311\": container with ID starting with 359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311 not found: ID does not exist" containerID="359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595594 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311"} err="failed to get container status \"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311\": rpc error: code = NotFound desc = could not find container \"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311\": container with ID starting with 359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311 not found: ID does not exist" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595607 4707 scope.go:117] "RemoveContainer" containerID="e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595703 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595717 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84e26cb-9beb-4f23-9101-f4e981d8c394-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595725 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595794 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197"} err="failed to get container status \"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197\": rpc error: code = NotFound desc = could not find container \"e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197\": container with ID starting with e9999ca539a32706fb0611ca2cacd898fc633a6cf01368411a27acb6b13e0197 not found: ID does not exist" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.595828 4707 scope.go:117] "RemoveContainer" containerID="359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.596017 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311"} err="failed to get container status \"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311\": rpc error: code = NotFound desc = could not find container \"359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311\": container with ID starting with 359d805059c0bb76b955503e4a616bb6bb0dc57c8a2a832cccba914a7cf09311 not found: ID does not exist" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.605347 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.610335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84e26cb-9beb-4f23-9101-f4e981d8c394-kube-api-access-jsxpv" (OuterVolumeSpecName: "kube-api-access-jsxpv") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "kube-api-access-jsxpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.651213 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_0131070b-7f14-4797-aa35-8ed780306565/ovsdbserver-nb/0.log" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.651286 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.669003 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7b4f4d8c74-82f2d"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.669295 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-log" containerID="cri-o://292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.669625 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-api" containerID="cri-o://43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.677654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.680387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.700397 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsxpv\" (UniqueName: \"kubernetes.io/projected/a84e26cb-9beb-4f23-9101-f4e981d8c394-kube-api-access-jsxpv\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.700416 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.700426 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.700459 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.715311 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6jqxb"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.727553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "a84e26cb-9beb-4f23-9101-f4e981d8c394" (UID: "a84e26cb-9beb-4f23-9101-f4e981d8c394"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.728596 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6jqxb"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.750165 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.750424 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-log" containerID="cri-o://cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.750554 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-httpd" containerID="cri-o://0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.758921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-brbmp"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.764226 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.765910 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.766128 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-log" containerID="cri-o://93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.766261 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-httpd" containerID="cri-o://872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.801768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.801989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-metrics-certs-tls-certs\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-combined-ca-bundle\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0131070b-7f14-4797-aa35-8ed780306565-ovsdb-rundir\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-scripts\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-ovsdbserver-nb-tls-certs\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw8l7\" (UniqueName: \"kubernetes.io/projected/0131070b-7f14-4797-aa35-8ed780306565-kube-api-access-dw8l7\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-config\") pod \"0131070b-7f14-4797-aa35-8ed780306565\" (UID: \"0131070b-7f14-4797-aa35-8ed780306565\") " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 
15:39:41.802731 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.802742 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84e26cb-9beb-4f23-9101-f4e981d8c394-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.803153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-scripts" (OuterVolumeSpecName: "scripts") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.803281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-config" (OuterVolumeSpecName: "config") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.804461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0131070b-7f14-4797-aa35-8ed780306565-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.808000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.812736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813184 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-server" containerID="cri-o://9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813582 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="swift-recon-cron" containerID="cri-o://faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813651 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="rsync" containerID="cri-o://3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813708 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-expirer" containerID="cri-o://39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813765 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-updater" containerID="cri-o://38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813797 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-auditor" containerID="cri-o://ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813882 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-replicator" containerID="cri-o://d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813934 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-server" containerID="cri-o://0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.813965 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-updater" containerID="cri-o://8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.814039 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-auditor" containerID="cri-o://202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.814103 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-replicator" containerID="cri-o://a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.814135 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-server" containerID="cri-o://80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.814180 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-reaper" containerID="cri-o://5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.814213 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-auditor" containerID="cri-o://af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.814239 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-replicator" containerID="cri-o://0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.818774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0131070b-7f14-4797-aa35-8ed780306565-kube-api-access-dw8l7" (OuterVolumeSpecName: "kube-api-access-dw8l7") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "kube-api-access-dw8l7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.872024 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.872340 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-log" containerID="cri-o://fc38731eba6859dd3321273b90548f6258defbfac5f235ee1a8b652aac078066" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.872518 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-api" containerID="cri-o://228afae2defd2f7cd01706c71f8b84017837c2d44aa5f97f405fbfa3b2e25d2f" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.894165 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq"] Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.894502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="openstack-network-exporter" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.894513 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="openstack-network-exporter" Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.894524 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="openstack-network-exporter" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.894529 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="openstack-network-exporter" Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.894539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="ovsdbserver-sb" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.894545 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="ovsdbserver-sb" Jan 21 15:39:41 crc kubenswrapper[4707]: E0121 15:39:41.894557 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="ovsdbserver-nb" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.894563 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="ovsdbserver-nb" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.896441 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="ovsdbserver-nb" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.896455 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0131070b-7f14-4797-aa35-8ed780306565" containerName="openstack-network-exporter" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.896473 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" containerName="openstack-network-exporter" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.896483 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" 
containerName="ovsdbserver-sb" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.897410 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.908546 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.908574 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0131070b-7f14-4797-aa35-8ed780306565-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.908585 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.908594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw8l7\" (UniqueName: \"kubernetes.io/projected/0131070b-7f14-4797-aa35-8ed780306565-kube-api-access-dw8l7\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.908602 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131070b-7f14-4797-aa35-8ed780306565-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.916615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.944367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.964786 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.996855 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5f454b744-bh8qb"] Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.997066 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-api" containerID="cri-o://46086c7e5a9656d5495acf8f78897d5393fb2095b5808e57b661a658aa9f8aec" gracePeriod=30 Jan 21 15:39:41 crc kubenswrapper[4707]: I0121 15:39:41.997312 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-httpd" containerID="cri-o://2121408667c0e3ee8a9ef8da490db5564b60ad19d737feab45bf9a9843fd4d1c" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.026731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh4vp\" (UniqueName: \"kubernetes.io/projected/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-kube-api-access-sh4vp\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.027057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.027100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.027323 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.027387 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.027426 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data podName:8e2084ed-217a-462c-8219-6cfa022751d9 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:44.027412994 +0000 UTC m=+2281.208929216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data") pod "rabbitmq-cell1-server-1" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.028851 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.040941 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.071591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-f8zkr"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.095169 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88"] Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.110569 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:39:42 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:39:42 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:39:42 crc kubenswrapper[4707]: else Jan 21 15:39:42 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:39:42 crc kubenswrapper[4707]: fi Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:39:42 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:39:42 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:39:42 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:39:42 crc kubenswrapper[4707]: # support updates Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.111721 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" podUID="ce8faea1-a6d2-4be7-88ca-e92d10481925" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.120869 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-f8zkr"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.128647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.129802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.130091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh4vp\" (UniqueName: \"kubernetes.io/projected/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-kube-api-access-sh4vp\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.130541 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.130770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.130788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.130964 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.143238 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:39:44.143219021 +0000 UTC m=+2281.324735244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.131027 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.144646 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:44.144635014 +0000 UTC m=+2281.326151236 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.161856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.164120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh4vp\" (UniqueName: \"kubernetes.io/projected/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-kube-api-access-sh4vp\") pod \"dnsmasq-dnsmasq-84b9f45d47-wkpqq\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.171213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.178081 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.183827 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" secret="" err="secret \"nova-nova-dockercfg-rhjn8\" not found" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.190736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.190954 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-log" containerID="cri-o://1337d8606bac38af9b79c2ec732c96f137ac330857aa0b08877428b22036abc5" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.191079 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-metadata" containerID="cri-o://1c8e77bfa68e38980554d4bde9e72ce205b90327a6373937e410e37c84c7a7f8" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.211820 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zjvdt"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.226272 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zjvdt"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.244020 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.244273 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="cinder-scheduler" containerID="cri-o://86eaf4108dcfe70d29829d6618776e1e19fc4cf3438e7d0357073b14327e2707" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.244387 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="probe" containerID="cri-o://6276366157c1c72f36b10c6a0708b755cc3d4fe703dde761888194d9a7e0a9e3" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.259973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.278868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.280641 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-svdgj"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.290898 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-svdgj"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.296558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.296762 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api-log" containerID="cri-o://bda04896b44c77f2f7e84995fda7c5d626e8dec1e42dd18df28b089230861206" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.297105 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api" containerID="cri-o://793158d30625ea44f14e3cf6927d960d4fb70b77cc2bde9279de8570448d24cb" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.304013 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4c4c-account-create-update-gn426"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.312161 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-4c4c-account-create-update-gn426"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.313905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "0131070b-7f14-4797-aa35-8ed780306565" (UID: "0131070b-7f14-4797-aa35-8ed780306565"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.318561 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-c94hb"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.326287 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9df45"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.340270 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-c94hb"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.353894 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.354435 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-cell0-conductor-config-data: secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.354483 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0131070b-7f14-4797-aa35-8ed780306565-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.354485 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data podName:91365236-99fe-4ee7-9aa1-8a414557cf97 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:42.854471648 +0000 UTC m=+2280.035987870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data") pod "nova-cell0-conductor-0" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97") : secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.357987 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9df45"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.364509 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.370702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rffn8"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.374404 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rffn8"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.381483 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.411627 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-40b4-account-create-update-lktds"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.421610 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lzkn7"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.447066 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-mptqk"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.453997 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/neutron-db-create-lzkn7"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.467450 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-mptqk"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.476131 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.476303 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="d4146629-1224-48a1-86c9-da68e93dbbca" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d8ae9b7d96594f3db274ccac95d0b77c71288664e3163100455a9c6c900575d6" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.483208 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.483426 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker-log" containerID="cri-o://2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.483555 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker" containerID="cri-o://4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.497216 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-867d686c44-7f4dj"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.497458 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api-log" containerID="cri-o://9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.497868 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api" containerID="cri-o://514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.513570 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.513765 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener-log" containerID="cri-o://5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.513885 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener" 
containerID="cri-o://85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.528753 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.529191 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" containerID="cri-o://e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.539482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" event={"ID":"ce8faea1-a6d2-4be7-88ca-e92d10481925","Type":"ContainerStarted","Data":"9c76be54f43963b7d8c4823536eefe4f94011a03ee3b9cce26b9bfad435f4e8d"} Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.540923 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:39:42 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:39:42 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:39:42 crc kubenswrapper[4707]: else Jan 21 15:39:42 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:39:42 crc kubenswrapper[4707]: fi Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:39:42 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:39:42 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:39:42 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:39:42 crc kubenswrapper[4707]: # support updates Jan 21 15:39:42 crc kubenswrapper[4707]: Jan 21 15:39:42 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.541967 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" podUID="ce8faea1-a6d2-4be7-88ca-e92d10481925" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.547399 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.547726 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" containerID="cri-o://b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.553573 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerID="93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a" exitCode=143 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.553635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"cf120fd0-7d2d-4f6e-80d9-523c4432c9db","Type":"ContainerDied","Data":"93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.559458 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.563275 4707 generic.go:334] "Generic (PLEG): container finished" podID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerID="fc38731eba6859dd3321273b90548f6258defbfac5f235ee1a8b652aac078066" exitCode=143 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.563339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"64087307-6402-47cd-8a4b-37c3d90eb61e","Type":"ContainerDied","Data":"fc38731eba6859dd3321273b90548f6258defbfac5f235ee1a8b652aac078066"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.565401 4707 generic.go:334] "Generic (PLEG): container finished" podID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerID="292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2" exitCode=143 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.565438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" event={"ID":"886a48bf-ec30-418c-86d9-9d96b725aa59","Type":"ContainerDied","Data":"292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.566175 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-8wfnk"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.566640 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerID="1337d8606bac38af9b79c2ec732c96f137ac330857aa0b08877428b22036abc5" exitCode=143 Jan 21 
15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.566684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9cee2089-6404-41ee-b6ae-82d788ced5fc","Type":"ContainerDied","Data":"1337d8606bac38af9b79c2ec732c96f137ac330857aa0b08877428b22036abc5"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.578466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.585020 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591172 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591197 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591244 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591260 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591268 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591274 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591280 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591288 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591295 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591302 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591308 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591313 4707 
generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591320 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591326 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 
15:39:42.591474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.591508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.595389 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-2" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="galera" containerID="cri-o://8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.597291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-brbmp" event={"ID":"0c0fc739-8c30-46ab-aea8-1db8f2864340","Type":"ContainerStarted","Data":"1434e8695f88889b7b13c9714203b638e18650a221ecf26e91992f23d796ee06"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.597332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-brbmp" event={"ID":"0c0fc739-8c30-46ab-aea8-1db8f2864340","Type":"ContainerStarted","Data":"51790c6cee21ae0e7376f4efd2bec64a752906230d7019ea10daf2d727e80557"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.604794 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wv2tk"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.620231 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/root-account-create-update-brbmp" podStartSLOduration=2.620217777 podStartE2EDuration="2.620217777s" podCreationTimestamp="2026-01-21 15:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:39:42.611570921 +0000 UTC m=+2279.793087143" watchObservedRunningTime="2026-01-21 15:39:42.620217777 +0000 UTC m=+2279.801734000" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.628767 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_0131070b-7f14-4797-aa35-8ed780306565/ovsdbserver-nb/0.log" Jan 21 15:39:42 crc 
kubenswrapper[4707]: I0121 15:39:42.628848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0131070b-7f14-4797-aa35-8ed780306565","Type":"ContainerDied","Data":"4d13023c30521c8ed1296133912c757acc390d68d804fd845bf45b97c9c67995"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.628877 4707 scope.go:117] "RemoveContainer" containerID="1ad0818035f81047bb782a695932bdec1acc1f17765cf97e5139e9764439d548" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.628961 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.638753 4707 generic.go:334] "Generic (PLEG): container finished" podID="6452a729-a782-4679-be7b-1c5240b66caa" containerID="2121408667c0e3ee8a9ef8da490db5564b60ad19d737feab45bf9a9843fd4d1c" exitCode=0 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.638839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" event={"ID":"6452a729-a782-4679-be7b-1c5240b66caa","Type":"ContainerDied","Data":"2121408667c0e3ee8a9ef8da490db5564b60ad19d737feab45bf9a9843fd4d1c"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.643000 4707 generic.go:334] "Generic (PLEG): container finished" podID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerID="bda04896b44c77f2f7e84995fda7c5d626e8dec1e42dd18df28b089230861206" exitCode=143 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.643061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"487d2c84-f1f6-4b88-9188-49e704aeb43a","Type":"ContainerDied","Data":"bda04896b44c77f2f7e84995fda7c5d626e8dec1e42dd18df28b089230861206"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.651263 4707 generic.go:334] "Generic (PLEG): container finished" podID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerID="cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27" exitCode=143 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.651316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3eec9e69-0f85-4cc7-8fad-020a027dedcc","Type":"ContainerDied","Data":"cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27"} Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.653265 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.654172 4707 generic.go:334] "Generic (PLEG): container finished" podID="37972b5f-69ec-4558-b68a-c8108b5b5fea" containerID="2d5245eb454b9ce88454d22d980989055f69b8ce07955601437667dbc9834df3" exitCode=137 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.654306 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" containerID="cri-o://77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.664187 4707 scope.go:117] "RemoveContainer" containerID="86e3d70b4ce7602fd0e3af1064f4bf80df7351fe1430fbcc7626e11c9a2e2b89" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.668861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.680096 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.698903 4707 scope.go:117] "RemoveContainer" containerID="2d5245eb454b9ce88454d22d980989055f69b8ce07955601437667dbc9834df3" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.735095 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.735270 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="7a514056-a57f-4bc0-89b5-60ade40aadfc" containerName="kube-state-metrics" containerID="cri-o://5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.750864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.751044 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-httpd" containerID="cri-o://ff9e7bd1f2a0d3427ab31dc36cbefa73b193fcc1c2688ecbc02b7b18ba10fa93" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.751371 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-server" containerID="cri-o://1df6460ba3866cec27e159d8bc64f86136c0600f74e7a9842781454a2ef8142c" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.767987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config-secret\") pod \"37972b5f-69ec-4558-b68a-c8108b5b5fea\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.768033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-combined-ca-bundle\") pod \"37972b5f-69ec-4558-b68a-c8108b5b5fea\" (UID: 
\"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.768384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4zrh\" (UniqueName: \"kubernetes.io/projected/37972b5f-69ec-4558-b68a-c8108b5b5fea-kube-api-access-g4zrh\") pod \"37972b5f-69ec-4558-b68a-c8108b5b5fea\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.768417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config\") pod \"37972b5f-69ec-4558-b68a-c8108b5b5fea\" (UID: \"37972b5f-69ec-4558-b68a-c8108b5b5fea\") " Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.793125 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.793377 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-central-agent" containerID="cri-o://703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.793729 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="proxy-httpd" containerID="cri-o://2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.793876 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="sg-core" containerID="cri-o://d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.793914 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-notification-agent" containerID="cri-o://c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.795544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37972b5f-69ec-4558-b68a-c8108b5b5fea-kube-api-access-g4zrh" (OuterVolumeSpecName: "kube-api-access-g4zrh") pod "37972b5f-69ec-4558-b68a-c8108b5b5fea" (UID: "37972b5f-69ec-4558-b68a-c8108b5b5fea"). InnerVolumeSpecName "kube-api-access-g4zrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.830005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "37972b5f-69ec-4558-b68a-c8108b5b5fea" (UID: "37972b5f-69ec-4558-b68a-c8108b5b5fea"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.852323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37972b5f-69ec-4558-b68a-c8108b5b5fea" (UID: "37972b5f-69ec-4558-b68a-c8108b5b5fea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.854055 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-1" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerName="galera" containerID="cri-o://67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f" gracePeriod=30 Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.873361 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq"] Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.873746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "37972b5f-69ec-4558-b68a-c8108b5b5fea" (UID: "37972b5f-69ec-4558-b68a-c8108b5b5fea"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.913980 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4zrh\" (UniqueName: \"kubernetes.io/projected/37972b5f-69ec-4558-b68a-c8108b5b5fea-kube-api-access-g4zrh\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.920463 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.920489 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: I0121 15:39:42.920501 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37972b5f-69ec-4558-b68a-c8108b5b5fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.914134 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-cell0-conductor-config-data: secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:42 crc kubenswrapper[4707]: E0121 15:39:42.920566 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data podName:91365236-99fe-4ee7-9aa1-8a414557cf97 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:43.920547036 +0000 UTC m=+2281.102063259 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data") pod "nova-cell0-conductor-0" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97") : secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.103979 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="d4146629-1224-48a1-86c9-da68e93dbbca" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.195:6080/vnc_lite.html\": dial tcp 10.217.0.195:6080: connect: connection refused" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.244363 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0131070b-7f14-4797-aa35-8ed780306565" path="/var/lib/kubelet/pods/0131070b-7f14-4797-aa35-8ed780306565/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.246262 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175e861a-ef37-43bf-a299-41690c536532" path="/var/lib/kubelet/pods/175e861a-ef37-43bf-a299-41690c536532/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.247449 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3169b921-59ad-46d3-9c46-1ad852c93a6e" path="/var/lib/kubelet/pods/3169b921-59ad-46d3-9c46-1ad852c93a6e/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.249232 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31884041-4815-4438-a2cd-7bd2756f9f21" path="/var/lib/kubelet/pods/31884041-4815-4438-a2cd-7bd2756f9f21/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.252261 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369fb473-5832-4299-80a7-81eecc4ab6d4" path="/var/lib/kubelet/pods/369fb473-5832-4299-80a7-81eecc4ab6d4/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.252880 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37972b5f-69ec-4558-b68a-c8108b5b5fea" path="/var/lib/kubelet/pods/37972b5f-69ec-4558-b68a-c8108b5b5fea/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.254887 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41231705-a331-4eec-8c6f-3bfb45b6db70" path="/var/lib/kubelet/pods/41231705-a331-4eec-8c6f-3bfb45b6db70/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.255548 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488b22fe-c39f-42fb-8c0e-75917ac55fff" path="/var/lib/kubelet/pods/488b22fe-c39f-42fb-8c0e-75917ac55fff/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.256407 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56293b49-445a-404c-8eb7-443b7d9d0170" path="/var/lib/kubelet/pods/56293b49-445a-404c-8eb7-443b7d9d0170/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.257290 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581b5e31-76d5-4900-91bd-8b2f82e9bf45" path="/var/lib/kubelet/pods/581b5e31-76d5-4900-91bd-8b2f82e9bf45/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.258708 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618a68ea-1d06-4355-9d36-b0a833165b08" path="/var/lib/kubelet/pods/618a68ea-1d06-4355-9d36-b0a833165b08/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.259842 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8157a3d5-13f8-4842-9fcd-6bd3d5f4a020" path="/var/lib/kubelet/pods/8157a3d5-13f8-4842-9fcd-6bd3d5f4a020/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.260764 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902a08ab-3caf-4942-bf59-986f57145ec4" path="/var/lib/kubelet/pods/902a08ab-3caf-4942-bf59-986f57145ec4/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.262884 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e5f6a3-0b0f-4d27-b60c-54585edca724" path="/var/lib/kubelet/pods/97e5f6a3-0b0f-4d27-b60c-54585edca724/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.264081 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84e26cb-9beb-4f23-9101-f4e981d8c394" path="/var/lib/kubelet/pods/a84e26cb-9beb-4f23-9101-f4e981d8c394/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.265470 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a916eb4b-5b4d-47ff-b5db-01802c7d1484" path="/var/lib/kubelet/pods/a916eb4b-5b4d-47ff-b5db-01802c7d1484/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.266315 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc880d2b-0a62-4e59-8063-810f798bb321" path="/var/lib/kubelet/pods/bc880d2b-0a62-4e59-8063-810f798bb321/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.267591 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b890b6-faa3-4e93-aebe-6d8277291a22" path="/var/lib/kubelet/pods/d1b890b6-faa3-4e93-aebe-6d8277291a22/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.268341 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdd5bf5-6e0f-4577-ac4f-960a74e6679b" path="/var/lib/kubelet/pods/dbdd5bf5-6e0f-4577-ac4f-960a74e6679b/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.268926 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed552cf6-1879-4c6e-877e-ff70fd80c485" path="/var/lib/kubelet/pods/ed552cf6-1879-4c6e-877e-ff70fd80c485/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.269945 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fefeb0e1-4aeb-4505-98e5-c7df15a5e06f" path="/var/lib/kubelet/pods/fefeb0e1-4aeb-4505-98e5-c7df15a5e06f/volumes" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.283160 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.339484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-combined-ca-bundle\") pod \"7a514056-a57f-4bc0-89b5-60ade40aadfc\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.339563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-config\") pod \"7a514056-a57f-4bc0-89b5-60ade40aadfc\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.339617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-certs\") pod \"7a514056-a57f-4bc0-89b5-60ade40aadfc\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.339635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skhst\" (UniqueName: \"kubernetes.io/projected/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-api-access-skhst\") pod \"7a514056-a57f-4bc0-89b5-60ade40aadfc\" (UID: \"7a514056-a57f-4bc0-89b5-60ade40aadfc\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.359992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-api-access-skhst" (OuterVolumeSpecName: "kube-api-access-skhst") pod "7a514056-a57f-4bc0-89b5-60ade40aadfc" (UID: "7a514056-a57f-4bc0-89b5-60ade40aadfc"). InnerVolumeSpecName "kube-api-access-skhst". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.380420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "7a514056-a57f-4bc0-89b5-60ade40aadfc" (UID: "7a514056-a57f-4bc0-89b5-60ade40aadfc"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.398312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "7a514056-a57f-4bc0-89b5-60ade40aadfc" (UID: "7a514056-a57f-4bc0-89b5-60ade40aadfc"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.400450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a514056-a57f-4bc0-89b5-60ade40aadfc" (UID: "7a514056-a57f-4bc0-89b5-60ade40aadfc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.443173 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.443280 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.443294 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.443304 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skhst\" (UniqueName: \"kubernetes.io/projected/7a514056-a57f-4bc0-89b5-60ade40aadfc-kube-api-access-skhst\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.621790 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.686952 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.707854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.707922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.739671 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.739737 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.741212 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.751057 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4146629-1224-48a1-86c9-da68e93dbbca" containerID="d8ae9b7d96594f3db274ccac95d0b77c71288664e3163100455a9c6c900575d6" exitCode=0 Jan 21 15:39:43 crc 
kubenswrapper[4707]: I0121 15:39:43.751271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"d4146629-1224-48a1-86c9-da68e93dbbca","Type":"ContainerDied","Data":"d8ae9b7d96594f3db274ccac95d0b77c71288664e3163100455a9c6c900575d6"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.754711 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.764307 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.767387 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.770799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.772074 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerID="1df6460ba3866cec27e159d8bc64f86136c0600f74e7a9842781454a2ef8142c" exitCode=0 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.772099 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerID="ff9e7bd1f2a0d3427ab31dc36cbefa73b193fcc1c2688ecbc02b7b18ba10fa93" exitCode=0 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.772136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" event={"ID":"4c133e4d-7259-43f9-90c7-d8fee2af8588","Type":"ContainerDied","Data":"1df6460ba3866cec27e159d8bc64f86136c0600f74e7a9842781454a2ef8142c"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.772155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" event={"ID":"4c133e4d-7259-43f9-90c7-d8fee2af8588","Type":"ContainerDied","Data":"ff9e7bd1f2a0d3427ab31dc36cbefa73b193fcc1c2688ecbc02b7b18ba10fa93"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.774001 4707 generic.go:334] "Generic (PLEG): container finished" podID="40927271-bdaa-4872-a579-c100648860dd" containerID="2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf" exitCode=143 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.774044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" event={"ID":"40927271-bdaa-4872-a579-c100648860dd","Type":"ContainerDied","Data":"2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.777267 4707 generic.go:334] "Generic (PLEG): container finished" podID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerID="2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc" exitCode=0 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.777286 4707 generic.go:334] "Generic (PLEG): container finished" podID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerID="d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7" exitCode=2 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.777293 4707 generic.go:334] "Generic (PLEG): container finished" podID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerID="703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a" exitCode=0 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 
15:39:43.777323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerDied","Data":"2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.777338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerDied","Data":"d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.777348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerDied","Data":"703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.778633 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a514056-a57f-4bc0-89b5-60ade40aadfc" containerID="5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3" exitCode=2 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.778671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7a514056-a57f-4bc0-89b5-60ade40aadfc","Type":"ContainerDied","Data":"5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.778690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7a514056-a57f-4bc0-89b5-60ade40aadfc","Type":"ContainerDied","Data":"df56b24817d3684719c8d776134fd3a87475dee4503b4ef449c32e500cd81c45"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.778705 4707 scope.go:117] "RemoveContainer" containerID="5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.778791 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.785891 4707 generic.go:334] "Generic (PLEG): container finished" podID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerID="e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615" exitCode=0 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.786193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" event={"ID":"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97","Type":"ContainerDied","Data":"e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.786210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" event={"ID":"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97","Type":"ContainerStarted","Data":"005c3830bbcc49ebbc04330bf3fbbe8b92dadd75627cc14deb4d96c5057c014c"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.804088 4707 generic.go:334] "Generic (PLEG): container finished" podID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerID="5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f" exitCode=143 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.804141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" event={"ID":"767701a0-4f58-4ea4-a625-abc7f426d70b","Type":"ContainerDied","Data":"5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.822698 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.826689 4707 generic.go:334] "Generic (PLEG): container finished" podID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerID="9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827" exitCode=143 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.826747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" event={"ID":"c0633364-ad88-4b44-bc23-3e2548ffb3d3","Type":"ContainerDied","Data":"9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.835317 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.847293 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="rabbitmq" containerID="cri-o://77bb156bdfd2d77e45072aa690e4f0b81d8bcd71d4f1156e3520806d9729c421" gracePeriod=604800 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.852039 4707 generic.go:334] "Generic (PLEG): container finished" podID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerID="6276366157c1c72f36b10c6a0708b755cc3d4fe703dde761888194d9a7e0a9e3" exitCode=0 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.852276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829","Type":"ContainerDied","Data":"6276366157c1c72f36b10c6a0708b755cc3d4fe703dde761888194d9a7e0a9e3"} Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.854693 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.854734 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data podName:4c346ce8-cb7b-46a0-abda-84bc7a42f009 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:44.354723314 +0000 UTC m=+2281.536239535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data") pod "rabbitmq-server-0" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009") : configmap "rabbitmq-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.861009 4707 generic.go:334] "Generic (PLEG): container finished" podID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerID="1434e8695f88889b7b13c9714203b638e18650a221ecf26e91992f23d796ee06" exitCode=1 Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.861494 4707 scope.go:117] "RemoveContainer" containerID="1434e8695f88889b7b13c9714203b638e18650a221ecf26e91992f23d796ee06" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.861835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-brbmp" event={"ID":"0c0fc739-8c30-46ab-aea8-1db8f2864340","Type":"ContainerDied","Data":"1434e8695f88889b7b13c9714203b638e18650a221ecf26e91992f23d796ee06"} Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.872150 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.884115 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.924264 4707 scope.go:117] "RemoveContainer" containerID="5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-config-data\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954721 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-combined-ca-bundle\") pod \"d4146629-1224-48a1-86c9-da68e93dbbca\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-log-httpd\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-nova-novncproxy-tls-certs\") pod \"d4146629-1224-48a1-86c9-da68e93dbbca\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-etc-swift\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-internal-tls-certs\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-run-httpd\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.954980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b6n2\" (UniqueName: \"kubernetes.io/projected/d4146629-1224-48a1-86c9-da68e93dbbca-kube-api-access-8b6n2\") pod \"d4146629-1224-48a1-86c9-da68e93dbbca\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.955008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdlj\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-kube-api-access-zgdlj\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.955029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-vencrypt-tls-certs\") pod \"d4146629-1224-48a1-86c9-da68e93dbbca\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.955047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-combined-ca-bundle\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: 
I0121 15:39:43.955088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-config-data\") pod \"d4146629-1224-48a1-86c9-da68e93dbbca\" (UID: \"d4146629-1224-48a1-86c9-da68e93dbbca\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.955103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-public-tls-certs\") pod \"4c133e4d-7259-43f9-90c7-d8fee2af8588\" (UID: \"4c133e4d-7259-43f9-90c7-d8fee2af8588\") " Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.955354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.956110 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.957517 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-cell0-conductor-config-data: secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.957586 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data podName:91365236-99fe-4ee7-9aa1-8a414557cf97 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.957567941 +0000 UTC m=+2283.139084164 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data") pod "nova-cell0-conductor-0" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97") : secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.958516 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3\": container with ID starting with 5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3 not found: ID does not exist" containerID="5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.958542 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3"} err="failed to get container status \"5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3\": rpc error: code = NotFound desc = could not find container \"5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3\": container with ID starting with 5b5f87af575a15ba800da9c114b1d813086b5a1151c88c92508e7edec487baa3 not found: ID does not exist" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.969772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.969911 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.970000 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data podName:c641d259-efc1-4897-94d0-bf9d429f3c5d nodeName:}" failed. No retries permitted until 2026-01-21 15:39:44.469985197 +0000 UTC m=+2281.651501419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data") pod "rabbitmq-server-2" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d") : configmap "rabbitmq-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.970762 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: E0121 15:39:43.970905 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data podName:15b560a7-6184-4897-af4f-7975ebd19fcb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:44.470892343 +0000 UTC m=+2281.652408565 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data") pod "rabbitmq-server-1" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb") : configmap "rabbitmq-config-data" not found Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.974495 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.978868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4146629-1224-48a1-86c9-da68e93dbbca-kube-api-access-8b6n2" (OuterVolumeSpecName: "kube-api-access-8b6n2") pod "d4146629-1224-48a1-86c9-da68e93dbbca" (UID: "d4146629-1224-48a1-86c9-da68e93dbbca"). InnerVolumeSpecName "kube-api-access-8b6n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:43 crc kubenswrapper[4707]: I0121 15:39:43.992066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-kube-api-access-zgdlj" (OuterVolumeSpecName: "kube-api-access-zgdlj") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "kube-api-access-zgdlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.057648 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.058045 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c133e4d-7259-43f9-90c7-d8fee2af8588-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.058116 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b6n2\" (UniqueName: \"kubernetes.io/projected/d4146629-1224-48a1-86c9-da68e93dbbca-kube-api-access-8b6n2\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.058178 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdlj\" (UniqueName: \"kubernetes.io/projected/4c133e4d-7259-43f9-90c7-d8fee2af8588-kube-api-access-zgdlj\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.058278 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.058344 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data podName:8e2084ed-217a-462c-8219-6cfa022751d9 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:48.058325817 +0000 UTC m=+2285.239842039 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data") pod "rabbitmq-cell1-server-1" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.089012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4146629-1224-48a1-86c9-da68e93dbbca" (UID: "d4146629-1224-48a1-86c9-da68e93dbbca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.091129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-config-data" (OuterVolumeSpecName: "config-data") pod "d4146629-1224-48a1-86c9-da68e93dbbca" (UID: "d4146629-1224-48a1-86c9-da68e93dbbca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.091519 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.096493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.103235 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.153462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.163740 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.163852 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.163935 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.164070 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.164184 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:48.164170275 +0000 UTC m=+2285.345686498 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.164343 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.164402 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:48.164387574 +0000 UTC m=+2285.345903796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.178284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "d4146629-1224-48a1-86c9-da68e93dbbca" (UID: "d4146629-1224-48a1-86c9-da68e93dbbca"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.197425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.208280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "d4146629-1224-48a1-86c9-da68e93dbbca" (UID: "d4146629-1224-48a1-86c9-da68e93dbbca"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.219459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-config-data" (OuterVolumeSpecName: "config-data") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.219618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4c133e4d-7259-43f9-90c7-d8fee2af8588" (UID: "4c133e4d-7259-43f9-90c7-d8fee2af8588"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.265684 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.265716 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4146629-1224-48a1-86c9-da68e93dbbca-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.265726 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.265734 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.265743 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c133e4d-7259-43f9-90c7-d8fee2af8588-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.367494 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.367551 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data podName:4c346ce8-cb7b-46a0-abda-84bc7a42f009 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.367539961 +0000 UTC m=+2282.549056183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data") pod "rabbitmq-server-0" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009") : configmap "rabbitmq-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.386966 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.475041 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.475105 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data podName:15b560a7-6184-4897-af4f-7975ebd19fcb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.475090477 +0000 UTC m=+2282.656606700 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data") pod "rabbitmq-server-1" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb") : configmap "rabbitmq-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.475408 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.475531 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data podName:c641d259-efc1-4897-94d0-bf9d429f3c5d nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.47551728 +0000 UTC m=+2282.657033502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data") pod "rabbitmq-server-2" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d") : configmap "rabbitmq-config-data" not found Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.482364 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.537333 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/767701a0-4f58-4ea4-a625-abc7f426d70b-logs\") pod \"767701a0-4f58-4ea4-a625-abc7f426d70b\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8faea1-a6d2-4be7-88ca-e92d10481925-operator-scripts\") pod \"ce8faea1-a6d2-4be7-88ca-e92d10481925\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data\") pod \"767701a0-4f58-4ea4-a625-abc7f426d70b\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-combined-ca-bundle\") pod \"767701a0-4f58-4ea4-a625-abc7f426d70b\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btdqp\" (UniqueName: \"kubernetes.io/projected/767701a0-4f58-4ea4-a625-abc7f426d70b-kube-api-access-btdqp\") pod \"767701a0-4f58-4ea4-a625-abc7f426d70b\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dclfd\" (UniqueName: \"kubernetes.io/projected/ce8faea1-a6d2-4be7-88ca-e92d10481925-kube-api-access-dclfd\") pod 
\"ce8faea1-a6d2-4be7-88ca-e92d10481925\" (UID: \"ce8faea1-a6d2-4be7-88ca-e92d10481925\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.579717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data-custom\") pod \"767701a0-4f58-4ea4-a625-abc7f426d70b\" (UID: \"767701a0-4f58-4ea4-a625-abc7f426d70b\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.583734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767701a0-4f58-4ea4-a625-abc7f426d70b-logs" (OuterVolumeSpecName: "logs") pod "767701a0-4f58-4ea4-a625-abc7f426d70b" (UID: "767701a0-4f58-4ea4-a625-abc7f426d70b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.583798 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8faea1-a6d2-4be7-88ca-e92d10481925-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce8faea1-a6d2-4be7-88ca-e92d10481925" (UID: "ce8faea1-a6d2-4be7-88ca-e92d10481925"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.595573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "767701a0-4f58-4ea4-a625-abc7f426d70b" (UID: "767701a0-4f58-4ea4-a625-abc7f426d70b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.604876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767701a0-4f58-4ea4-a625-abc7f426d70b-kube-api-access-btdqp" (OuterVolumeSpecName: "kube-api-access-btdqp") pod "767701a0-4f58-4ea4-a625-abc7f426d70b" (UID: "767701a0-4f58-4ea4-a625-abc7f426d70b"). InnerVolumeSpecName "kube-api-access-btdqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.615285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8faea1-a6d2-4be7-88ca-e92d10481925-kube-api-access-dclfd" (OuterVolumeSpecName: "kube-api-access-dclfd") pod "ce8faea1-a6d2-4be7-88ca-e92d10481925" (UID: "ce8faea1-a6d2-4be7-88ca-e92d10481925"). InnerVolumeSpecName "kube-api-access-dclfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.616202 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerName="galera" containerID="cri-o://0acfdb10ed4b8a5d4173af5a014e847071dd24247841328d306d9a0bc63b44ec" gracePeriod=28 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.643961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "767701a0-4f58-4ea4-a625-abc7f426d70b" (UID: "767701a0-4f58-4ea4-a625-abc7f426d70b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-default\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-galera-tls-certs\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-generated\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-kolla-config\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-combined-ca-bundle\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s55v\" (UniqueName: \"kubernetes.io/projected/2a77adde-7080-44bf-bbba-0d8b0f874e98-kube-api-access-9s55v\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.681626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-operator-scripts\") pod \"2a77adde-7080-44bf-bbba-0d8b0f874e98\" (UID: \"2a77adde-7080-44bf-bbba-0d8b0f874e98\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682239 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/767701a0-4f58-4ea4-a625-abc7f426d70b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682260 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8faea1-a6d2-4be7-88ca-e92d10481925-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682270 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682278 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btdqp\" (UniqueName: \"kubernetes.io/projected/767701a0-4f58-4ea4-a625-abc7f426d70b-kube-api-access-btdqp\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682285 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dclfd\" (UniqueName: \"kubernetes.io/projected/ce8faea1-a6d2-4be7-88ca-e92d10481925-kube-api-access-dclfd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682293 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.682615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.683182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.693721 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.694453 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="d1e12f07-e051-4381-8419-5f3a1a09d6c5" containerName="memcached" containerID="cri-o://aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d" gracePeriod=30 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.697028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a77adde-7080-44bf-bbba-0d8b0f874e98-kube-api-access-9s55v" (OuterVolumeSpecName: "kube-api-access-9s55v") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "kube-api-access-9s55v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.697127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data" (OuterVolumeSpecName: "config-data") pod "767701a0-4f58-4ea4-a625-abc7f426d70b" (UID: "767701a0-4f58-4ea4-a625-abc7f426d70b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.714928 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.738207 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.740384 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-mtmbc"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.749114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.812424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.812647 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767701a0-4f58-4ea4-a625-abc7f426d70b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.813624 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.814477 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.814512 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2a77adde-7080-44bf-bbba-0d8b0f874e98-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.814536 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.814571 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.814587 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s55v\" (UniqueName: \"kubernetes.io/projected/2a77adde-7080-44bf-bbba-0d8b0f874e98-kube-api-access-9s55v\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.814610 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a77adde-7080-44bf-bbba-0d8b0f874e98-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.828930 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25"] Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.835882 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-notification-agent" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.835922 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-notification-agent" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.835935 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener-log" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.835941 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener-log" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.835957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="sg-core" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.835982 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="sg-core" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.835993 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.835998 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836010 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-httpd" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836015 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-httpd" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836022 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="galera" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836027 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="galera" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836042 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-central-agent" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836063 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-central-agent" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836078 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="proxy-httpd" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="proxy-httpd" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836092 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4146629-1224-48a1-86c9-da68e93dbbca" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836098 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4146629-1224-48a1-86c9-da68e93dbbca" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836107 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="mysql-bootstrap" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836113 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="mysql-bootstrap" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836139 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a514056-a57f-4bc0-89b5-60ade40aadfc" containerName="kube-state-metrics" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a514056-a57f-4bc0-89b5-60ade40aadfc" containerName="kube-state-metrics" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.836154 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-server" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 
15:39:44.836160 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-server" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836412 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-notification-agent" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836430 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="ceilometer-central-agent" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836458 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-server" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836465 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a514056-a57f-4bc0-89b5-60ade40aadfc" containerName="kube-state-metrics" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836475 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4146629-1224-48a1-86c9-da68e93dbbca" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836484 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" containerName="proxy-httpd" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836497 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="proxy-httpd" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836505 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener-log" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836528 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerName="barbican-keystone-listener" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836540 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerName="galera" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.836549 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerName="sg-core" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.837344 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.841602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.848724 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.864439 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-bm4lc"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.891495 4707 generic.go:334] "Generic (PLEG): container finished" podID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerID="ed0d14d079952dc244381a0fdc4b45a78474f1f50fb22736556ea2cccbd997cc" exitCode=1 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.891560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-brbmp" event={"ID":"0c0fc739-8c30-46ab-aea8-1db8f2864340","Type":"ContainerDied","Data":"ed0d14d079952dc244381a0fdc4b45a78474f1f50fb22736556ea2cccbd997cc"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.891588 4707 scope.go:117] "RemoveContainer" containerID="1434e8695f88889b7b13c9714203b638e18650a221ecf26e91992f23d796ee06" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.891901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2a77adde-7080-44bf-bbba-0d8b0f874e98" (UID: "2a77adde-7080-44bf-bbba-0d8b0f874e98"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.893083 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-brbmp" secret="" err="secret \"galera-openstack-dockercfg-mb7qz\" not found" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.893124 4707 scope.go:117] "RemoveContainer" containerID="ed0d14d079952dc244381a0fdc4b45a78474f1f50fb22736556ea2cccbd997cc" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.894003 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-brbmp_openstack-kuttl-tests(0c0fc739-8c30-46ab-aea8-1db8f2864340)\"" pod="openstack-kuttl-tests/root-account-create-update-brbmp" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.899338 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.917575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-ceilometer-tls-certs\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.917664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-sg-core-conf-yaml\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.917745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-log-httpd\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.917800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-combined-ca-bundle\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.917954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-run-httpd\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.917999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-config-data\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.918036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d45j\" (UniqueName: \"kubernetes.io/projected/4083be08-daf7-4cdc-a1ea-4409372710cd-kube-api-access-2d45j\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: 
I0121 15:39:44.918139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-scripts\") pod \"4083be08-daf7-4cdc-a1ea-4409372710cd\" (UID: \"4083be08-daf7-4cdc-a1ea-4409372710cd\") " Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.918588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.918678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6n9v\" (UniqueName: \"kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.918816 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a77adde-7080-44bf-bbba-0d8b0f874e98-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.918827 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.918910 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:39:44 crc kubenswrapper[4707]: E0121 15:39:44.918948 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts podName:0c0fc739-8c30-46ab-aea8-1db8f2864340 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.41893507 +0000 UTC m=+2282.600451292 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts") pod "root-account-create-update-brbmp" (UID: "0c0fc739-8c30-46ab-aea8-1db8f2864340") : configmap "openstack-scripts" not found Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.921985 4707 generic.go:334] "Generic (PLEG): container finished" podID="4083be08-daf7-4cdc-a1ea-4409372710cd" containerID="c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74" exitCode=0 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.922043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerDied","Data":"c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.922066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4083be08-daf7-4cdc-a1ea-4409372710cd","Type":"ContainerDied","Data":"5892e969bc045f83934258bec74407ce36a69cb7226f77656fcca277edcf5b75"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.922128 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.923441 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6cfc89d475-9v9xq"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.923595 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" podUID="324409f5-db7f-41c2-9004-d9c4d10e1e0e" containerName="keystone-api" containerID="cri-o://9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577" gracePeriod=30 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.924188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.925775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.928707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4083be08-daf7-4cdc-a1ea-4409372710cd-kube-api-access-2d45j" (OuterVolumeSpecName: "kube-api-access-2d45j") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "kube-api-access-2d45j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.933124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-scripts" (OuterVolumeSpecName: "scripts") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.935533 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-bm4lc"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.946010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" event={"ID":"4c133e4d-7259-43f9-90c7-d8fee2af8588","Type":"ContainerDied","Data":"f6a9a2822431c1b0446f9af883787fd441d1f3f25e2b7f011e6ce9ce21b5e1f7"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.946197 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.955367 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dpbkq"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.962512 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-dpbkq"] Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.969041 4707 generic.go:334] "Generic (PLEG): container finished" podID="2a77adde-7080-44bf-bbba-0d8b0f874e98" containerID="8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f" exitCode=0 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.969149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"2a77adde-7080-44bf-bbba-0d8b0f874e98","Type":"ContainerDied","Data":"8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.969228 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.969235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"2a77adde-7080-44bf-bbba-0d8b0f874e98","Type":"ContainerDied","Data":"83e6b723e40a76c6e1ac48c1a8c6f33d260f6beb1081b8777747f123246e9621"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.974649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.978855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" event={"ID":"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97","Type":"ContainerStarted","Data":"ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.979711 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.988392 4707 generic.go:334] "Generic (PLEG): container finished" podID="767701a0-4f58-4ea4-a625-abc7f426d70b" containerID="85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d" exitCode=0 Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.988510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" event={"ID":"767701a0-4f58-4ea4-a625-abc7f426d70b","Type":"ContainerDied","Data":"85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.988596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" event={"ID":"767701a0-4f58-4ea4-a625-abc7f426d70b","Type":"ContainerDied","Data":"09709386cbaf9ea6ae35f27c8dcdb19eda38edc8854a3017806f36d74010b0dc"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.988708 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx" Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.997364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" event={"ID":"ce8faea1-a6d2-4be7-88ca-e92d10481925","Type":"ContainerDied","Data":"9c76be54f43963b7d8c4823536eefe4f94011a03ee3b9cce26b9bfad435f4e8d"} Jan 21 15:39:44 crc kubenswrapper[4707]: I0121 15:39:44.997418 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.005368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"d4146629-1224-48a1-86c9-da68e93dbbca","Type":"ContainerDied","Data":"7b6a811db370f624819f05bef72f525a106a2c7655e93b3a367187ef0affd76f"} Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.005561 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.015276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6n9v\" (UniqueName: \"kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021733 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021744 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021752 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021760 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083be08-daf7-4cdc-a1ea-4409372710cd-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021768 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d45j\" (UniqueName: \"kubernetes.io/projected/4083be08-daf7-4cdc-a1ea-4409372710cd-kube-api-access-2d45j\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.021777 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.021866 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.021900 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts podName:baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.521888373 +0000 UTC m=+2282.703404595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts") pod "keystone-ab63-account-create-update-r2g25" (UID: "baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1") : configmap "openstack-scripts" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.030050 4707 projected.go:194] Error preparing data for projected volume kube-api-access-h6n9v for pod openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.030103 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v podName:baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:45.530086997 +0000 UTC m=+2282.711603219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-h6n9v" (UniqueName: "kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v") pod "keystone-ab63-account-create-update-r2g25" (UID: "baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.036686 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.049048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.062324 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-2" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="rabbitmq" containerID="cri-o://6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62" gracePeriod=604800 Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.062442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-1" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="rabbitmq" containerID="cri-o://c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc" gracePeriod=604800 Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.073833 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.074465 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="rabbitmq" containerID="cri-o://da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442" gracePeriod=604800 Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.074931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-config-data" (OuterVolumeSpecName: "config-data") pod "4083be08-daf7-4cdc-a1ea-4409372710cd" (UID: "4083be08-daf7-4cdc-a1ea-4409372710cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.095846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.124378 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.124404 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4083be08-daf7-4cdc-a1ea-4409372710cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.164107 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-brbmp"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.169402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8s54s"] Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.173855 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.174000 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8s54s"] Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.177070 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.185801 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.185862 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="ovn-northd" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.194643 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" podStartSLOduration=4.19462773 podStartE2EDuration="4.19462773s" podCreationTimestamp="2026-01-21 15:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:39:45.00411643 +0000 UTC m=+2282.185632652" watchObservedRunningTime="2026-01-21 15:39:45.19462773 +0000 UTC m=+2282.376143952" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.195536 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="0e9eed66-7b3c-4e13-8083-7cfb9a92c178" path="/var/lib/kubelet/pods/0e9eed66-7b3c-4e13-8083-7cfb9a92c178/volumes" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.196082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2362c581-1317-4f6d-8967-0bf4dedd8637" path="/var/lib/kubelet/pods/2362c581-1317-4f6d-8967-0bf4dedd8637/volumes" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.196576 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77947c2b-5588-40bc-9038-45844e74e767" path="/var/lib/kubelet/pods/77947c2b-5588-40bc-9038-45844e74e767/volumes" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.197048 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a514056-a57f-4bc0-89b5-60ade40aadfc" path="/var/lib/kubelet/pods/7a514056-a57f-4bc0-89b5-60ade40aadfc/volumes" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.198104 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca523b2c-039d-42f0-8602-10b7fe0d4134" path="/var/lib/kubelet/pods/ca523b2c-039d-42f0-8602-10b7fe0d4134/volumes" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.199410 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25"] Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.200044 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-h6n9v operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" podUID="baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.280090 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-2" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="galera" containerID="cri-o://0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5" gracePeriod=30 Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.432421 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.432471 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data podName:4c346ce8-cb7b-46a0-abda-84bc7a42f009 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:47.432458987 +0000 UTC m=+2284.613975209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data") pod "rabbitmq-server-0" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009") : configmap "rabbitmq-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.432501 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.432521 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts podName:0c0fc739-8c30-46ab-aea8-1db8f2864340 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:46.432514742 +0000 UTC m=+2283.614030964 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts") pod "root-account-create-update-brbmp" (UID: "0c0fc739-8c30-46ab-aea8-1db8f2864340") : configmap "openstack-scripts" not found Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.529381 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.167:8776/healthcheck\": read tcp 10.217.0.2:42120->10.217.0.167:8776: read: connection reset by peer" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.533788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.533866 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.533930 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data podName:c641d259-efc1-4897-94d0-bf9d429f3c5d nodeName:}" failed. No retries permitted until 2026-01-21 15:39:47.533914554 +0000 UTC m=+2284.715430776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data") pod "rabbitmq-server-2" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d") : configmap "rabbitmq-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.533993 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.534029 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts podName:baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:46.534016906 +0000 UTC m=+2283.715533128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts") pod "keystone-ab63-account-create-update-r2g25" (UID: "baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1") : configmap "openstack-scripts" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.534085 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.534107 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data podName:15b560a7-6184-4897-af4f-7975ebd19fcb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:47.534100233 +0000 UTC m=+2284.715616454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data") pod "rabbitmq-server-1" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb") : configmap "rabbitmq-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.534081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6n9v\" (UniqueName: \"kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.537120 4707 projected.go:194] Error preparing data for projected volume kube-api-access-h6n9v for pod openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.537168 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v podName:baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:46.537159495 +0000 UTC m=+2283.718675717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6n9v" (UniqueName: "kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v") pod "keystone-ab63-account-create-update-r2g25" (UID: "baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.551368 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.552669 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.575171 4707 scope.go:117] "RemoveContainer" containerID="2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.582910 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.587769 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.621759 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": dial tcp 10.217.0.204:8775: connect: connection refused" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.621772 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": dial tcp 10.217.0.204:8775: connect: connection refused" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.630465 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.644335 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.652173 4707 scope.go:117] "RemoveContainer" containerID="d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.662564 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.677317 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.682209 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.682214 4707 scope.go:117] "RemoveContainer" containerID="c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.686682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.692959 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcf9fd7cd-jh5hx"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.702103 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.703491 4707 scope.go:117] "RemoveContainer" containerID="703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.707200 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6478545c6d-rrgqf"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.711190 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.715454 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.724339 4707 scope.go:117] "RemoveContainer" containerID="2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.724936 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc\": container with ID starting with 2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc not found: ID does not exist" containerID="2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.724968 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc"} err="failed to get container status \"2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc\": rpc error: code = NotFound desc = could not find container \"2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc\": container with ID starting with 
2b58f29eeced92da0b2f592f4cccc1e36a5ff7d6fc765c85e07d3fd5a918d4bc not found: ID does not exist" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.724989 4707 scope.go:117] "RemoveContainer" containerID="d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.725290 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7\": container with ID starting with d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7 not found: ID does not exist" containerID="d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.725310 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7"} err="failed to get container status \"d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7\": rpc error: code = NotFound desc = could not find container \"d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7\": container with ID starting with d802129876fd203a52dcc3106c28c8bfe51384016c62fb4b30954774d7ab79f7 not found: ID does not exist" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.725323 4707 scope.go:117] "RemoveContainer" containerID="c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.725535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74\": container with ID starting with c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74 not found: ID does not exist" containerID="c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.725552 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74"} err="failed to get container status \"c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74\": rpc error: code = NotFound desc = could not find container \"c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74\": container with ID starting with c993f7e27379e970eab1385933e59128f4ab337e15d0cf3436f77cf9fdc3eb74 not found: ID does not exist" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.725563 4707 scope.go:117] "RemoveContainer" containerID="703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.725785 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a\": container with ID starting with 703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a not found: ID does not exist" containerID="703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.725816 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a"} err="failed to get container status \"703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a\": rpc 
error: code = NotFound desc = could not find container \"703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a\": container with ID starting with 703abfb8d999ad13a96f7ff5ceb566caafa291557938ba36298d4f7e85f3942a not found: ID does not exist" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.725829 4707 scope.go:117] "RemoveContainer" containerID="1df6460ba3866cec27e159d8bc64f86136c0600f74e7a9842781454a2ef8142c" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-operator-scripts\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kolla-config\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-logs\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-logs\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-galera-tls-certs\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-generated\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjgck\" (UniqueName: \"kubernetes.io/projected/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kube-api-access-xjgck\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-public-tls-certs\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: 
\"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-combined-ca-bundle\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-httpd-run\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-combined-ca-bundle\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-combined-ca-bundle\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.751999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-internal-tls-certs\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-config-data\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrzd\" (UniqueName: \"kubernetes.io/projected/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-kube-api-access-fxrzd\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-scripts\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-scripts\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc 
kubenswrapper[4707]: I0121 15:39:45.752124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-config-data\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-config-data\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdr8\" (UniqueName: \"kubernetes.io/projected/3eec9e69-0f85-4cc7-8fad-020a027dedcc-kube-api-access-2pdr8\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rgpw\" (UniqueName: \"kubernetes.io/projected/886a48bf-ec30-418c-86d9-9d96b725aa59-kube-api-access-6rgpw\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-internal-tls-certs\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a48bf-ec30-418c-86d9-9d96b725aa59-logs\") pod \"886a48bf-ec30-418c-86d9-9d96b725aa59\" (UID: \"886a48bf-ec30-418c-86d9-9d96b725aa59\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-httpd-run\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-scripts\") pod \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\" (UID: \"cf120fd0-7d2d-4f6e-80d9-523c4432c9db\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-public-tls-certs\") pod \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\" (UID: \"3eec9e69-0f85-4cc7-8fad-020a027dedcc\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752346 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-logs" (OuterVolumeSpecName: "logs") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-combined-ca-bundle\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.752426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-default\") pod \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\" (UID: \"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.753212 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.753905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.754677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.754936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.755018 4707 scope.go:117] "RemoveContainer" containerID="ff9e7bd1f2a0d3427ab31dc36cbefa73b193fcc1c2688ecbc02b7b18ba10fa93" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.755232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.755974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886a48bf-ec30-418c-86d9-9d96b725aa59-logs" (OuterVolumeSpecName: "logs") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.756195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.758388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.758695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-logs" (OuterVolumeSpecName: "logs") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.759764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.765548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886a48bf-ec30-418c-86d9-9d96b725aa59-kube-api-access-6rgpw" (OuterVolumeSpecName: "kube-api-access-6rgpw") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "kube-api-access-6rgpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.766384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-kube-api-access-fxrzd" (OuterVolumeSpecName: "kube-api-access-fxrzd") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "kube-api-access-fxrzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.766956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-scripts" (OuterVolumeSpecName: "scripts") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.767199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-scripts" (OuterVolumeSpecName: "scripts") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.767679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eec9e69-0f85-4cc7-8fad-020a027dedcc-kube-api-access-2pdr8" (OuterVolumeSpecName: "kube-api-access-2pdr8") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "kube-api-access-2pdr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.769143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.780056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-scripts" (OuterVolumeSpecName: "scripts") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.781013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kube-api-access-xjgck" (OuterVolumeSpecName: "kube-api-access-xjgck") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "kube-api-access-xjgck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.792351 4707 scope.go:117] "RemoveContainer" containerID="8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.822908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.835401 4707 scope.go:117] "RemoveContainer" containerID="c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.836114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.851503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.854678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-config-data\") pod \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.854719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-combined-ca-bundle\") pod \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.854758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kolla-config\") pod \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.854948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxdxr\" (UniqueName: \"kubernetes.io/projected/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kube-api-access-nxdxr\") pod \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.855067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-config-data" (OuterVolumeSpecName: "config-data") pod "d1e12f07-e051-4381-8419-5f3a1a09d6c5" (UID: "d1e12f07-e051-4381-8419-5f3a1a09d6c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.855573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-memcached-tls-certs\") pod \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\" (UID: \"d1e12f07-e051-4381-8419-5f3a1a09d6c5\") " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.856169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d1e12f07-e051-4381-8419-5f3a1a09d6c5" (UID: "d1e12f07-e051-4381-8419-5f3a1a09d6c5"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.856950 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.856975 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.856986 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.856997 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857007 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857030 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857038 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857047 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857055 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjgck\" (UniqueName: \"kubernetes.io/projected/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-kube-api-access-xjgck\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857063 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3eec9e69-0f85-4cc7-8fad-020a027dedcc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857071 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857080 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrzd\" (UniqueName: \"kubernetes.io/projected/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-kube-api-access-fxrzd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857089 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc 
kubenswrapper[4707]: I0121 15:39:45.857103 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857111 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857120 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdr8\" (UniqueName: \"kubernetes.io/projected/3eec9e69-0f85-4cc7-8fad-020a027dedcc-kube-api-access-2pdr8\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857129 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rgpw\" (UniqueName: \"kubernetes.io/projected/886a48bf-ec30-418c-86d9-9d96b725aa59-kube-api-access-6rgpw\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857139 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857147 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a48bf-ec30-418c-86d9-9d96b725aa59-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857157 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.857168 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.881002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kube-api-access-nxdxr" (OuterVolumeSpecName: "kube-api-access-nxdxr") pod "d1e12f07-e051-4381-8419-5f3a1a09d6c5" (UID: "d1e12f07-e051-4381-8419-5f3a1a09d6c5"). InnerVolumeSpecName "kube-api-access-nxdxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.885279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.916639 4707 scope.go:117] "RemoveContainer" containerID="8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.918216 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f\": container with ID starting with 8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f not found: ID does not exist" containerID="8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.918250 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f"} err="failed to get container status \"8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f\": rpc error: code = NotFound desc = could not find container \"8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f\": container with ID starting with 8b021ac6439b3192cafd83ee596471e125704f536c99a98640bff6ed8017672f not found: ID does not exist" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.918287 4707 scope.go:117] "RemoveContainer" containerID="c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.922364 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d\": container with ID starting with c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d not found: ID does not exist" containerID="c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.922400 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d"} err="failed to get container status \"c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d\": rpc error: code = NotFound desc = could not find container \"c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d\": container with ID starting with c3ddc5db85d72f2958c779e2a21c53daf0ae5c8256016e114a50ba0e63087f2d not found: ID does not exist" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.922416 4707 scope.go:117] "RemoveContainer" containerID="85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.937934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-config-data" (OuterVolumeSpecName: "config-data") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.956190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-config-data" (OuterVolumeSpecName: "config-data") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.958576 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-cell0-conductor-config-data: secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: E0121 15:39:45.958643 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data podName:91365236-99fe-4ee7-9aa1-8a414557cf97 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:49.958628209 +0000 UTC m=+2287.140144431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data") pod "nova-cell0-conductor-0" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97") : secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.959102 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.959325 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxdxr\" (UniqueName: \"kubernetes.io/projected/d1e12f07-e051-4381-8419-5f3a1a09d6c5-kube-api-access-nxdxr\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.959340 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.959349 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.959358 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.971839 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.972839 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.973282 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.978803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" (UID: "3597cfd3-7c8f-41b7-9265-cfffa2bab5e9"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.993537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3eec9e69-0f85-4cc7-8fad-020a027dedcc" (UID: "3eec9e69-0f85-4cc7-8fad-020a027dedcc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.995936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1e12f07-e051-4381-8419-5f3a1a09d6c5" (UID: "d1e12f07-e051-4381-8419-5f3a1a09d6c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:45 crc kubenswrapper[4707]: I0121 15:39:45.998626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.008999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.015758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-config-data" (OuterVolumeSpecName: "config-data") pod "cf120fd0-7d2d-4f6e-80d9-523c4432c9db" (UID: "cf120fd0-7d2d-4f6e-80d9-523c4432c9db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.028100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.042864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "886a48bf-ec30-418c-86d9-9d96b725aa59" (UID: "886a48bf-ec30-418c-86d9-9d96b725aa59"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.043498 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerID="872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.043585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"cf120fd0-7d2d-4f6e-80d9-523c4432c9db","Type":"ContainerDied","Data":"872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.043611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"cf120fd0-7d2d-4f6e-80d9-523c4432c9db","Type":"ContainerDied","Data":"6b89b54991d146634bc6a60926a26545aaf622fdd133dbd7721b113c06b3b0f7"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.043678 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.047005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "d1e12f07-e051-4381-8419-5f3a1a09d6c5" (UID: "d1e12f07-e051-4381-8419-5f3a1a09d6c5"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.048957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"487d2c84-f1f6-4b88-9188-49e704aeb43a","Type":"ContainerDied","Data":"793158d30625ea44f14e3cf6927d960d4fb70b77cc2bde9279de8570448d24cb"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.049070 4707 generic.go:334] "Generic (PLEG): container finished" podID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerID="793158d30625ea44f14e3cf6927d960d4fb70b77cc2bde9279de8570448d24cb" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.049139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"487d2c84-f1f6-4b88-9188-49e704aeb43a","Type":"ContainerDied","Data":"9e5810e83405bceb5ec184f8064af3ac8b78021832b246e60622ea38317a7ed2"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.049155 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e5810e83405bceb5ec184f8064af3ac8b78021832b246e60622ea38317a7ed2" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.051329 4707 generic.go:334] "Generic (PLEG): container finished" podID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerID="228afae2defd2f7cd01706c71f8b84017837c2d44aa5f97f405fbfa3b2e25d2f" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.051373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"64087307-6402-47cd-8a4b-37c3d90eb61e","Type":"ContainerDied","Data":"228afae2defd2f7cd01706c71f8b84017837c2d44aa5f97f405fbfa3b2e25d2f"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.051395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"64087307-6402-47cd-8a4b-37c3d90eb61e","Type":"ContainerDied","Data":"85d413e1d6ad31b912ee3d5aae2d1f1571993fb658f165ae67457a2c81e2059a"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.051404 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85d413e1d6ad31b912ee3d5aae2d1f1571993fb658f165ae67457a2c81e2059a" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.053307 4707 generic.go:334] "Generic (PLEG): container finished" podID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerID="0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.053390 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.053499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3eec9e69-0f85-4cc7-8fad-020a027dedcc","Type":"ContainerDied","Data":"0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.053597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3eec9e69-0f85-4cc7-8fad-020a027dedcc","Type":"ContainerDied","Data":"5fbde92505ad39b5b2fadf98a0505dce037831c5392ac079487c41f121cc2c83"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.056727 4707 generic.go:334] "Generic (PLEG): container finished" podID="d1e12f07-e051-4381-8419-5f3a1a09d6c5" containerID="aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.056767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d1e12f07-e051-4381-8419-5f3a1a09d6c5","Type":"ContainerDied","Data":"aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.056785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d1e12f07-e051-4381-8419-5f3a1a09d6c5","Type":"ContainerDied","Data":"4335b58d1c1e7187db5559e67e8b78629dd9337ce9d4fa2e4d68c6c8f435e90f"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.056837 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.059392 4707 generic.go:334] "Generic (PLEG): container finished" podID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerID="67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.059501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9","Type":"ContainerDied","Data":"67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.059522 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.059531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"3597cfd3-7c8f-41b7-9265-cfffa2bab5e9","Type":"ContainerDied","Data":"8d8f7f728f687b371c12b8c70f435ffe4a2bff3180de1274b1661aa55440023f"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060712 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060731 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060742 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060751 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060759 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf120fd0-7d2d-4f6e-80d9-523c4432c9db-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060767 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e12f07-e051-4381-8419-5f3a1a09d6c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060774 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060782 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eec9e69-0f85-4cc7-8fad-020a027dedcc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060789 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060796 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060815 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.060823 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a48bf-ec30-418c-86d9-9d96b725aa59-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.063211 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-brbmp" secret="" err="secret \"galera-openstack-dockercfg-mb7qz\" not found" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.063272 4707 scope.go:117] "RemoveContainer" containerID="ed0d14d079952dc244381a0fdc4b45a78474f1f50fb22736556ea2cccbd997cc" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.063617 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-brbmp_openstack-kuttl-tests(0c0fc739-8c30-46ab-aea8-1db8f2864340)\"" pod="openstack-kuttl-tests/root-account-create-update-brbmp" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.066020 4707 generic.go:334] "Generic (PLEG): container finished" podID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerID="0acfdb10ed4b8a5d4173af5a014e847071dd24247841328d306d9a0bc63b44ec" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.066126 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"741daec5-08a7-4769-8b0b-4248c3c02d69","Type":"ContainerDied","Data":"0acfdb10ed4b8a5d4173af5a014e847071dd24247841328d306d9a0bc63b44ec"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.066320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"741daec5-08a7-4769-8b0b-4248c3c02d69","Type":"ContainerDied","Data":"1c8f86be8625ae982d6dfb74ae0e8d2e5e0d1a39f6066ee75a12984ba71fbe62"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.066334 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c8f86be8625ae982d6dfb74ae0e8d2e5e0d1a39f6066ee75a12984ba71fbe62" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.079702 4707 generic.go:334] "Generic (PLEG): container finished" podID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerID="43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.079773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" event={"ID":"886a48bf-ec30-418c-86d9-9d96b725aa59","Type":"ContainerDied","Data":"43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.079798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" event={"ID":"886a48bf-ec30-418c-86d9-9d96b725aa59","Type":"ContainerDied","Data":"7fcfea5a91aa187e427c2bdc47ba7952faee7441564ea963143811ec3cd2307c"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.079920 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7b4f4d8c74-82f2d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.083626 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_7e183977-7c22-4b6d-a465-132c8f85dbb0/ovn-northd/0.log" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.083650 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerID="8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61" exitCode=139 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.083683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7e183977-7c22-4b6d-a465-132c8f85dbb0","Type":"ContainerDied","Data":"8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.085509 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerID="1c8e77bfa68e38980554d4bde9e72ce205b90327a6373937e410e37c84c7a7f8" exitCode=0 Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.085549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9cee2089-6404-41ee-b6ae-82d788ced5fc","Type":"ContainerDied","Data":"1c8e77bfa68e38980554d4bde9e72ce205b90327a6373937e410e37c84c7a7f8"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.085585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9cee2089-6404-41ee-b6ae-82d788ced5fc","Type":"ContainerDied","Data":"6dcb9a5746b9dbf80af99667cbbce8da259bc8f27b2b23a88b48e63db383f69c"} Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.085596 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dcb9a5746b9dbf80af99667cbbce8da259bc8f27b2b23a88b48e63db383f69c" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.085844 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.271612 4707 scope.go:117] "RemoveContainer" containerID="5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.272393 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.379190 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.388007 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.391324 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.391419 4707 scope.go:117] "RemoveContainer" containerID="85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.391708 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d\": container with ID starting with 85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d not found: ID does not exist" containerID="85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.391740 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d"} err="failed to get container status \"85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d\": rpc error: code = NotFound desc = could not find container \"85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d\": container with ID starting with 85ae80fe6379396e9fef6c5a02209da08649a0ca22fd683b2dcaf8196e83c04d not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.391770 4707 scope.go:117] "RemoveContainer" containerID="5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.392530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f\": container with ID starting with 5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f not found: ID does not exist" containerID="5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.392558 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f"} err="failed to get container status \"5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f\": rpc error: code = NotFound desc = could not find container \"5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f\": container with ID starting with 5e3d050551d2565622cf388f6315cd7fc1619d7a5d9d9fbcdbda7943da03fd3f not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.392576 4707 scope.go:117] "RemoveContainer" containerID="d8ae9b7d96594f3db274ccac95d0b77c71288664e3163100455a9c6c900575d6" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.396565 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.404930 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.406590 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.407136 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.412965 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.422193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.422773 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_7e183977-7c22-4b6d-a465-132c8f85dbb0/ovn-northd/0.log" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.422847 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.433084 4707 scope.go:117] "RemoveContainer" containerID="872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.435968 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.448110 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.460369 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.467797 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.468679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-config-data\") pod \"9cee2089-6404-41ee-b6ae-82d788ced5fc\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.468760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-nova-metadata-tls-certs\") pod \"9cee2089-6404-41ee-b6ae-82d788ced5fc\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.468864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-combined-ca-bundle\") pod \"9cee2089-6404-41ee-b6ae-82d788ced5fc\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.468958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jtb\" (UniqueName: \"kubernetes.io/projected/9cee2089-6404-41ee-b6ae-82d788ced5fc-kube-api-access-m7jtb\") pod \"9cee2089-6404-41ee-b6ae-82d788ced5fc\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.468981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cee2089-6404-41ee-b6ae-82d788ced5fc-logs\") pod \"9cee2089-6404-41ee-b6ae-82d788ced5fc\" (UID: \"9cee2089-6404-41ee-b6ae-82d788ced5fc\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 
15:39:46.469063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-public-tls-certs\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.473532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cee2089-6404-41ee-b6ae-82d788ced5fc-kube-api-access-m7jtb" (OuterVolumeSpecName: "kube-api-access-m7jtb") pod "9cee2089-6404-41ee-b6ae-82d788ced5fc" (UID: "9cee2089-6404-41ee-b6ae-82d788ced5fc"). InnerVolumeSpecName "kube-api-access-m7jtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.473629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cee2089-6404-41ee-b6ae-82d788ced5fc-logs" (OuterVolumeSpecName: "logs") pod "9cee2089-6404-41ee-b6ae-82d788ced5fc" (UID: "9cee2089-6404-41ee-b6ae-82d788ced5fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.474179 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jtb\" (UniqueName: \"kubernetes.io/projected/9cee2089-6404-41ee-b6ae-82d788ced5fc-kube-api-access-m7jtb\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.474200 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cee2089-6404-41ee-b6ae-82d788ced5fc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.474281 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.474317 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts podName:0c0fc739-8c30-46ab-aea8-1db8f2864340 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:48.474304532 +0000 UTC m=+2285.655820755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts") pod "root-account-create-update-brbmp" (UID: "0c0fc739-8c30-46ab-aea8-1db8f2864340") : configmap "openstack-scripts" not found Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.482352 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7b4f4d8c74-82f2d"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.487442 4707 scope.go:117] "RemoveContainer" containerID="93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.498373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-config-data" (OuterVolumeSpecName: "config-data") pod "9cee2089-6404-41ee-b6ae-82d788ced5fc" (UID: "9cee2089-6404-41ee-b6ae-82d788ced5fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.498651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cee2089-6404-41ee-b6ae-82d788ced5fc" (UID: "9cee2089-6404-41ee-b6ae-82d788ced5fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.499166 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7b4f4d8c74-82f2d"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.506773 4707 scope.go:117] "RemoveContainer" containerID="872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.508432 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8\": container with ID starting with 872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8 not found: ID does not exist" containerID="872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.508491 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8"} err="failed to get container status \"872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8\": rpc error: code = NotFound desc = could not find container \"872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8\": container with ID starting with 872205f9ee4f12f0707e406888f7871a81a33c8bebb2baa98efb744f0a25efe8 not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.508517 4707 scope.go:117] "RemoveContainer" containerID="93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.509418 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a\": container with ID starting with 93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a not found: ID does not exist" containerID="93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.509452 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a"} err="failed to get container status \"93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a\": rpc error: code = NotFound desc = could not find container \"93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a\": container with ID starting with 93cf76558b735c3d688b18027a32c9e349ae50601aa53b4476a430c9a4b5421a not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.509473 4707 scope.go:117] "RemoveContainer" containerID="0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.522029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-nova-metadata-tls-certs" 
(OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9cee2089-6404-41ee-b6ae-82d788ced5fc" (UID: "9cee2089-6404-41ee-b6ae-82d788ced5fc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.554887 4707 scope.go:117] "RemoveContainer" containerID="cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.569385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.573570 4707 scope.go:117] "RemoveContainer" containerID="0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-combined-ca-bundle\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-public-tls-certs\") pod \"64087307-6402-47cd-8a4b-37c3d90eb61e\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-rundir\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data-custom\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-metrics-certs-tls-certs\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-combined-ca-bundle\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-internal-tls-certs\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 
15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-internal-tls-certs\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0633364-ad88-4b44-bc23-3e2548ffb3d3-logs\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-combined-ca-bundle\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487d2c84-f1f6-4b88-9188-49e704aeb43a-logs\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-combined-ca-bundle\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-config\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-kolla-config\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-operator-scripts\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487d2c84-f1f6-4b88-9188-49e704aeb43a-etc-machine-id\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575564 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgwxq\" (UniqueName: \"kubernetes.io/projected/487d2c84-f1f6-4b88-9188-49e704aeb43a-kube-api-access-wgwxq\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnhrb\" (UniqueName: \"kubernetes.io/projected/c0633364-ad88-4b44-bc23-3e2548ffb3d3-kube-api-access-dnhrb\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-combined-ca-bundle\") pod \"64087307-6402-47cd-8a4b-37c3d90eb61e\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data-custom\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-galera-tls-certs\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-default\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-scripts\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64087307-6402-47cd-8a4b-37c3d90eb61e-logs\") pod \"64087307-6402-47cd-8a4b-37c3d90eb61e\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2gt9\" (UniqueName: \"kubernetes.io/projected/64087307-6402-47cd-8a4b-37c3d90eb61e-kube-api-access-v2gt9\") pod \"64087307-6402-47cd-8a4b-37c3d90eb61e\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-internal-tls-certs\") pod \"64087307-6402-47cd-8a4b-37c3d90eb61e\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 
15:39:46.575746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-scripts\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data\") pod \"487d2c84-f1f6-4b88-9188-49e704aeb43a\" (UID: \"487d2c84-f1f6-4b88-9188-49e704aeb43a\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnnl6\" (UniqueName: \"kubernetes.io/projected/741daec5-08a7-4769-8b0b-4248c3c02d69-kube-api-access-rnnl6\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-config-data\") pod \"64087307-6402-47cd-8a4b-37c3d90eb61e\" (UID: \"64087307-6402-47cd-8a4b-37c3d90eb61e\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-northd-tls-certs\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-public-tls-certs\") pod \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\" (UID: \"c0633364-ad88-4b44-bc23-3e2548ffb3d3\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hxkq\" (UniqueName: \"kubernetes.io/projected/7e183977-7c22-4b6d-a465-132c8f85dbb0-kube-api-access-8hxkq\") pod \"7e183977-7c22-4b6d-a465-132c8f85dbb0\" (UID: \"7e183977-7c22-4b6d-a465-132c8f85dbb0\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.575937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-generated\") pod \"741daec5-08a7-4769-8b0b-4248c3c02d69\" (UID: \"741daec5-08a7-4769-8b0b-4248c3c02d69\") " Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.576398 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d\": container with ID starting with 0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d not found: ID does not exist" 
containerID="0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.576426 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d"} err="failed to get container status \"0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d\": rpc error: code = NotFound desc = could not find container \"0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d\": container with ID starting with 0bf72d3713737fb029e20c1fbeca6a8d2cc6ed570211f0a33e26bc038bc36f3d not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.576446 4707 scope.go:117] "RemoveContainer" containerID="cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.576916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6n9v\" (UniqueName: \"kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v\") pod \"keystone-ab63-account-create-update-r2g25\" (UID: \"baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1\") " pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577124 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577138 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577148 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577155 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cee2089-6404-41ee-b6ae-82d788ced5fc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.577932 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27\": container with ID starting with cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27 not found: ID does not exist" containerID="cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577972 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27"} err="failed to get container status \"cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27\": rpc error: code = NotFound desc = could not find container \"cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27\": container with ID starting with cfca601e59ca9f65d9d89da1a5d4644a149141ea62860fe13298b122c5d14b27 not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.577996 4707 scope.go:117] "RemoveContainer" containerID="aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/487d2c84-f1f6-4b88-9188-49e704aeb43a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487d2c84-f1f6-4b88-9188-49e704aeb43a-logs" (OuterVolumeSpecName: "logs") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64087307-6402-47cd-8a4b-37c3d90eb61e-logs" (OuterVolumeSpecName: "logs") pod "64087307-6402-47cd-8a4b-37c3d90eb61e" (UID: "64087307-6402-47cd-8a4b-37c3d90eb61e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.578855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-config" (OuterVolumeSpecName: "config") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.578878 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.578923 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts podName:baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:48.57890914 +0000 UTC m=+2285.760425362 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts") pod "keystone-ab63-account-create-update-r2g25" (UID: "baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1") : configmap "openstack-scripts" not found Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.581148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741daec5-08a7-4769-8b0b-4248c3c02d69-kube-api-access-rnnl6" (OuterVolumeSpecName: "kube-api-access-rnnl6") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "kube-api-access-rnnl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.581860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.583617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-scripts" (OuterVolumeSpecName: "scripts") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.583888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-scripts" (OuterVolumeSpecName: "scripts") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.584068 4707 projected.go:194] Error preparing data for projected volume kube-api-access-h6n9v for pod openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.584110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0633364-ad88-4b44-bc23-3e2548ffb3d3-logs" (OuterVolumeSpecName: "logs") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.584124 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v podName:baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:48.584109096 +0000 UTC m=+2285.765625318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6n9v" (UniqueName: "kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v") pod "keystone-ab63-account-create-update-r2g25" (UID: "baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.587109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.587881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e183977-7c22-4b6d-a465-132c8f85dbb0-kube-api-access-8hxkq" (OuterVolumeSpecName: "kube-api-access-8hxkq") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "kube-api-access-8hxkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.596967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.600451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64087307-6402-47cd-8a4b-37c3d90eb61e-kube-api-access-v2gt9" (OuterVolumeSpecName: "kube-api-access-v2gt9") pod "64087307-6402-47cd-8a4b-37c3d90eb61e" (UID: "64087307-6402-47cd-8a4b-37c3d90eb61e"). InnerVolumeSpecName "kube-api-access-v2gt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.601595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487d2c84-f1f6-4b88-9188-49e704aeb43a-kube-api-access-wgwxq" (OuterVolumeSpecName: "kube-api-access-wgwxq") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "kube-api-access-wgwxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.610266 4707 scope.go:117] "RemoveContainer" containerID="aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.610761 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d\": container with ID starting with aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d not found: ID does not exist" containerID="aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.610786 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d"} err="failed to get container status \"aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d\": rpc error: code = NotFound desc = could not find container \"aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d\": container with ID starting with aa827c347343ba0622e5ea556644fa7005bf452b997251f0273820ad9646e75d not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.610819 4707 scope.go:117] "RemoveContainer" containerID="67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.611842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0633364-ad88-4b44-bc23-3e2548ffb3d3-kube-api-access-dnhrb" (OuterVolumeSpecName: "kube-api-access-dnhrb") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "kube-api-access-dnhrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.632296 4707 scope.go:117] "RemoveContainer" containerID="fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.640601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.642049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-config-data" (OuterVolumeSpecName: "config-data") pod "64087307-6402-47cd-8a4b-37c3d90eb61e" (UID: "64087307-6402-47cd-8a4b-37c3d90eb61e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.645673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.654197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64087307-6402-47cd-8a4b-37c3d90eb61e" (UID: "64087307-6402-47cd-8a4b-37c3d90eb61e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.656513 4707 scope.go:117] "RemoveContainer" containerID="67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.656822 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f\": container with ID starting with 67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f not found: ID does not exist" containerID="67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.656849 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f"} err="failed to get container status \"67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f\": rpc error: code = NotFound desc = could not find container \"67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f\": container with ID starting with 67af7daf30d6e1ef9be096edf478f1628db69fc2f177a3b3cebb2438e862ac8f not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.656866 4707 scope.go:117] "RemoveContainer" containerID="fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.657125 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee\": container with ID starting with fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee not found: ID does not exist" containerID="fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.657167 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee"} err="failed to get container status \"fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee\": rpc error: code = NotFound desc = could not find container \"fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee\": container with ID starting with fec6e0561571e96c2b7fe438a7130a4bbe2106b39f773bda34c965d9b57e20ee not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.657190 4707 scope.go:117] "RemoveContainer" containerID="43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.657245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.659402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64087307-6402-47cd-8a4b-37c3d90eb61e" (UID: "64087307-6402-47cd-8a4b-37c3d90eb61e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.659405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.683426 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685292 4707 scope.go:117] "RemoveContainer" containerID="292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.683456 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685893 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685908 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0633364-ad88-4b44-bc23-3e2548ffb3d3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685917 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685926 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487d2c84-f1f6-4b88-9188-49e704aeb43a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685934 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685943 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685952 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685960 4707 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487d2c84-f1f6-4b88-9188-49e704aeb43a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685968 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgwxq\" (UniqueName: \"kubernetes.io/projected/487d2c84-f1f6-4b88-9188-49e704aeb43a-kube-api-access-wgwxq\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685978 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnhrb\" (UniqueName: \"kubernetes.io/projected/c0633364-ad88-4b44-bc23-3e2548ffb3d3-kube-api-access-dnhrb\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685986 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.685994 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686002 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686009 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686016 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64087307-6402-47cd-8a4b-37c3d90eb61e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686025 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2gt9\" (UniqueName: \"kubernetes.io/projected/64087307-6402-47cd-8a4b-37c3d90eb61e-kube-api-access-v2gt9\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686034 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686041 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e183977-7c22-4b6d-a465-132c8f85dbb0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686049 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686056 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnnl6\" (UniqueName: \"kubernetes.io/projected/741daec5-08a7-4769-8b0b-4248c3c02d69-kube-api-access-rnnl6\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686081 4707 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686090 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686099 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hxkq\" (UniqueName: \"kubernetes.io/projected/7e183977-7c22-4b6d-a465-132c8f85dbb0-kube-api-access-8hxkq\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.686107 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/741daec5-08a7-4769-8b0b-4248c3c02d69-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.689153 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.689245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.689890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.696511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "741daec5-08a7-4769-8b0b-4248c3c02d69" (UID: "741daec5-08a7-4769-8b0b-4248c3c02d69"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.697084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data" (OuterVolumeSpecName: "config-data") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.697606 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.698054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data" (OuterVolumeSpecName: "config-data") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.698653 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.701061 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.701103 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.705362 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.711060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64087307-6402-47cd-8a4b-37c3d90eb61e" (UID: "64087307-6402-47cd-8a4b-37c3d90eb61e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.711065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c0633364-ad88-4b44-bc23-3e2548ffb3d3" (UID: "c0633364-ad88-4b44-bc23-3e2548ffb3d3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.712533 4707 scope.go:117] "RemoveContainer" containerID="43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.712877 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f\": container with ID starting with 43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f not found: ID does not exist" containerID="43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.712906 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f"} err="failed to get container status \"43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f\": rpc error: code = NotFound desc = could not find container \"43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f\": container with ID starting with 43a13af5c35645d9a0375e00e9f844c2f9875b82c6a06eab7b67c0db1a18af3f not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.712925 4707 scope.go:117] "RemoveContainer" containerID="292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2" Jan 21 15:39:46 crc kubenswrapper[4707]: E0121 15:39:46.713352 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2\": container with ID starting with 292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2 not found: ID does not exist" containerID="292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.713376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2"} err="failed to get container status \"292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2\": rpc error: code = NotFound desc = could not find container \"292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2\": container with ID starting with 292a953d44d4e5f686f7c16008a06a90d408faafbad751299fe4267bc093d8f2 not found: ID does not exist" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.713969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "487d2c84-f1f6-4b88-9188-49e704aeb43a" (UID: "487d2c84-f1f6-4b88-9188-49e704aeb43a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.720611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.728840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "7e183977-7c22-4b6d-a465-132c8f85dbb0" (UID: "7e183977-7c22-4b6d-a465-132c8f85dbb0"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-combined-ca-bundle\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqc5h\" (UniqueName: \"kubernetes.io/projected/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kube-api-access-rqc5h\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kolla-config\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-default\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-generated\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.787979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-galera-tls-certs\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-operator-scripts\") pod \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\" (UID: \"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d\") " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788489 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788506 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/741daec5-08a7-4769-8b0b-4248c3c02d69-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788515 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788524 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788532 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788540 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788548 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64087307-6402-47cd-8a4b-37c3d90eb61e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788556 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e183977-7c22-4b6d-a465-132c8f85dbb0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788563 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788570 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788578 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/487d2c84-f1f6-4b88-9188-49e704aeb43a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788586 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0633364-ad88-4b44-bc23-3e2548ffb3d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: 
I0121 15:39:46.788585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.788750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.789144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.790330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kube-api-access-rqc5h" (OuterVolumeSpecName: "kube-api-access-rqc5h") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "kube-api-access-rqc5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.794549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "mysql-db") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.806289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.820227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" (UID: "aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890530 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890558 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890569 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890577 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqc5h\" (UniqueName: \"kubernetes.io/projected/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-kube-api-access-rqc5h\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890605 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890615 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.890623 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.903406 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 15:39:46 crc kubenswrapper[4707]: I0121 15:39:46.992003 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.070088 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.071726 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.072739 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.072853 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.098631 4707 generic.go:334] "Generic (PLEG): container finished" podID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerID="514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d" exitCode=0 Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.098691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" event={"ID":"c0633364-ad88-4b44-bc23-3e2548ffb3d3","Type":"ContainerDied","Data":"514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d"} Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.098698 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.098716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-867d686c44-7f4dj" event={"ID":"c0633364-ad88-4b44-bc23-3e2548ffb3d3","Type":"ContainerDied","Data":"694266690a95a71432e331bfbb399389f375307a96d01c7cf73b02872aeae289"} Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.098732 4707 scope.go:117] "RemoveContainer" containerID="514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.101050 4707 generic.go:334] "Generic (PLEG): container finished" podID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerID="0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5" exitCode=0 Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.101098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d","Type":"ContainerDied","Data":"0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5"} Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.101115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d","Type":"ContainerDied","Data":"ef00fd3b5fb732833d6c613184fa0126d0718d7d48e307c90e36595ea5e901dc"} Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.101150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.104951 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_7e183977-7c22-4b6d-a465-132c8f85dbb0/ovn-northd/0.log" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.105589 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.106843 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.106949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7e183977-7c22-4b6d-a465-132c8f85dbb0","Type":"ContainerDied","Data":"9c3a6d7076a0d105db144a6333b85c631bc6ff5efa3e4cd298a1f6de22f877a7"} Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.107015 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.107429 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.107475 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.109595 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.119334 4707 scope.go:117] "RemoveContainer" containerID="9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.164530 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-867d686c44-7f4dj"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.174283 4707 scope.go:117] "RemoveContainer" containerID="514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d" Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.175170 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d\": container with ID starting with 514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d not found: ID does not exist" containerID="514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.175215 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d"} err="failed to get container status \"514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d\": rpc error: code = NotFound desc = could not find container \"514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d\": container with ID starting with 514824fa107fd8f8a12fac46f04d344f46006c8db14123190022167150d14a1d not found: ID does not exist" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.175239 4707 scope.go:117] "RemoveContainer" containerID="9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827" Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.175634 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827\": container with ID starting with 9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827 not found: ID does not exist" containerID="9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.175666 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827"} err="failed to get container status \"9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827\": rpc error: code = NotFound desc = could not find container \"9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827\": container with ID starting with 9ae1a817dbc7770627ef75d8d90fea645f2a3270ed28df644eedcebda672f827 not found: ID does not exist" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.175703 4707 scope.go:117] "RemoveContainer" containerID="0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.178722 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-867d686c44-7f4dj"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.208350 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a77adde-7080-44bf-bbba-0d8b0f874e98" path="/var/lib/kubelet/pods/2a77adde-7080-44bf-bbba-0d8b0f874e98/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.209180 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" path="/var/lib/kubelet/pods/3597cfd3-7c8f-41b7-9265-cfffa2bab5e9/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.210276 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" path="/var/lib/kubelet/pods/3eec9e69-0f85-4cc7-8fad-020a027dedcc/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.210894 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4083be08-daf7-4cdc-a1ea-4409372710cd" path="/var/lib/kubelet/pods/4083be08-daf7-4cdc-a1ea-4409372710cd/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.211566 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c133e4d-7259-43f9-90c7-d8fee2af8588" path="/var/lib/kubelet/pods/4c133e4d-7259-43f9-90c7-d8fee2af8588/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.212553 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767701a0-4f58-4ea4-a625-abc7f426d70b" path="/var/lib/kubelet/pods/767701a0-4f58-4ea4-a625-abc7f426d70b/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.214760 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" path="/var/lib/kubelet/pods/886a48bf-ec30-418c-86d9-9d96b725aa59/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.219191 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" path="/var/lib/kubelet/pods/c0633364-ad88-4b44-bc23-3e2548ffb3d3/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.221328 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" path="/var/lib/kubelet/pods/cf120fd0-7d2d-4f6e-80d9-523c4432c9db/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.222137 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e12f07-e051-4381-8419-5f3a1a09d6c5" path="/var/lib/kubelet/pods/d1e12f07-e051-4381-8419-5f3a1a09d6c5/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.222619 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4146629-1224-48a1-86c9-da68e93dbbca" 
path="/var/lib/kubelet/pods/d4146629-1224-48a1-86c9-da68e93dbbca/volumes" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.225077 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.225104 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.232911 4707 scope.go:117] "RemoveContainer" containerID="c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.241131 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.246461 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-ab63-account-create-update-r2g25"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.252172 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.256114 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.262374 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.265887 4707 scope.go:117] "RemoveContainer" containerID="0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.266714 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.270441 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-2" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.270754 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.272340 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5\": container with ID starting with 0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5 not found: ID does not exist" containerID="0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.272372 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5"} err="failed to get container status \"0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5\": rpc error: code = NotFound desc = could not find container \"0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5\": container with ID starting with 0b36ea3176c04c3f7de7086453e145022cf8ee3a9e11aae9dc88beaa7f9df7c5 not found: ID does not exist" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.272390 
4707 scope.go:117] "RemoveContainer" containerID="c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7" Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.273829 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7\": container with ID starting with c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7 not found: ID does not exist" containerID="c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.273869 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7"} err="failed to get container status \"c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7\": rpc error: code = NotFound desc = could not find container \"c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7\": container with ID starting with c101d7f312d42ba8cf008e59e1774d96466d6ac1a1e65dce3a71fceac9fdaff7 not found: ID does not exist" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.273892 4707 scope.go:117] "RemoveContainer" containerID="a31decf41a1b02620ab2be257c60c8227184c0d7af43f0210eae219e937afaba" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.275906 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.279196 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-1" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.281324 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.292158 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.295244 4707 scope.go:117] "RemoveContainer" containerID="8a2c4e0ae63668aed4274707ac9589b229bd930700f2c6e2c7706b1b5642df61" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.296292 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.296302 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6n9v\" (UniqueName: \"kubernetes.io/projected/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-kube-api-access-h6n9v\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.296446 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.297523 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-1" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="galera" containerID="cri-o://cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91" gracePeriod=28 Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.299878 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.303445 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.393324 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.479691 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.111:5671: connect: connection refused" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.499907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts\") pod \"0c0fc739-8c30-46ab-aea8-1db8f2864340\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.500059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrjxg\" (UniqueName: \"kubernetes.io/projected/0c0fc739-8c30-46ab-aea8-1db8f2864340-kube-api-access-vrjxg\") pod \"0c0fc739-8c30-46ab-aea8-1db8f2864340\" (UID: \"0c0fc739-8c30-46ab-aea8-1db8f2864340\") " Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.500595 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.500705 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data podName:4c346ce8-cb7b-46a0-abda-84bc7a42f009 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:51.500693048 +0000 UTC m=+2288.682209270 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data") pod "rabbitmq-server-0" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009") : configmap "rabbitmq-config-data" not found Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.501271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c0fc739-8c30-46ab-aea8-1db8f2864340" (UID: "0c0fc739-8c30-46ab-aea8-1db8f2864340"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.505920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0fc739-8c30-46ab-aea8-1db8f2864340-kube-api-access-vrjxg" (OuterVolumeSpecName: "kube-api-access-vrjxg") pod "0c0fc739-8c30-46ab-aea8-1db8f2864340" (UID: "0c0fc739-8c30-46ab-aea8-1db8f2864340"). InnerVolumeSpecName "kube-api-access-vrjxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.602557 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0fc739-8c30-46ab-aea8-1db8f2864340-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.602578 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.602591 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrjxg\" (UniqueName: \"kubernetes.io/projected/0c0fc739-8c30-46ab-aea8-1db8f2864340-kube-api-access-vrjxg\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.602630 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data podName:c641d259-efc1-4897-94d0-bf9d429f3c5d nodeName:}" failed. No retries permitted until 2026-01-21 15:39:51.6026147 +0000 UTC m=+2288.784130923 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data") pod "rabbitmq-server-2" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d") : configmap "rabbitmq-config-data" not found Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.602648 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:47 crc kubenswrapper[4707]: E0121 15:39:47.602690 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data podName:15b560a7-6184-4897-af4f-7975ebd19fcb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:51.602677359 +0000 UTC m=+2288.784193581 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data") pod "rabbitmq-server-1" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb") : configmap "rabbitmq-config-data" not found Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.778329 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.807349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40927271-bdaa-4872-a579-c100648860dd-logs\") pod \"40927271-bdaa-4872-a579-c100648860dd\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.807450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-combined-ca-bundle\") pod \"40927271-bdaa-4872-a579-c100648860dd\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.807501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q44vg\" (UniqueName: \"kubernetes.io/projected/40927271-bdaa-4872-a579-c100648860dd-kube-api-access-q44vg\") pod \"40927271-bdaa-4872-a579-c100648860dd\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.807530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data-custom\") pod \"40927271-bdaa-4872-a579-c100648860dd\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.807680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data\") pod \"40927271-bdaa-4872-a579-c100648860dd\" (UID: \"40927271-bdaa-4872-a579-c100648860dd\") " Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.810200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40927271-bdaa-4872-a579-c100648860dd-logs" (OuterVolumeSpecName: "logs") pod "40927271-bdaa-4872-a579-c100648860dd" (UID: "40927271-bdaa-4872-a579-c100648860dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.851872 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40927271-bdaa-4872-a579-c100648860dd-kube-api-access-q44vg" (OuterVolumeSpecName: "kube-api-access-q44vg") pod "40927271-bdaa-4872-a579-c100648860dd" (UID: "40927271-bdaa-4872-a579-c100648860dd"). InnerVolumeSpecName "kube-api-access-q44vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.851881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40927271-bdaa-4872-a579-c100648860dd" (UID: "40927271-bdaa-4872-a579-c100648860dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.864697 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40927271-bdaa-4872-a579-c100648860dd" (UID: "40927271-bdaa-4872-a579-c100648860dd"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.867169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data" (OuterVolumeSpecName: "config-data") pod "40927271-bdaa-4872-a579-c100648860dd" (UID: "40927271-bdaa-4872-a579-c100648860dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.910652 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40927271-bdaa-4872-a579-c100648860dd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.910683 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.910708 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q44vg\" (UniqueName: \"kubernetes.io/projected/40927271-bdaa-4872-a579-c100648860dd-kube-api-access-q44vg\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.910717 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:47 crc kubenswrapper[4707]: I0121 15:39:47.910725 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40927271-bdaa-4872-a579-c100648860dd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.112321 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.112372 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data podName:8e2084ed-217a-462c-8219-6cfa022751d9 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:56.112359954 +0000 UTC m=+2293.293876176 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data") pod "rabbitmq-cell1-server-1" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.114568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-brbmp" event={"ID":"0c0fc739-8c30-46ab-aea8-1db8f2864340","Type":"ContainerDied","Data":"51790c6cee21ae0e7376f4efd2bec64a752906230d7019ea10daf2d727e80557"} Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.114605 4707 scope.go:117] "RemoveContainer" containerID="ed0d14d079952dc244381a0fdc4b45a78474f1f50fb22736556ea2cccbd997cc" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.114688 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-brbmp" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.120575 4707 generic.go:334] "Generic (PLEG): container finished" podID="40927271-bdaa-4872-a579-c100648860dd" containerID="4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8" exitCode=0 Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.120960 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.121087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" event={"ID":"40927271-bdaa-4872-a579-c100648860dd","Type":"ContainerDied","Data":"4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8"} Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.121161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm" event={"ID":"40927271-bdaa-4872-a579-c100648860dd","Type":"ContainerDied","Data":"47b186c7ba354be7dc9056b266f41d519315c491a5b4a1ff37edf9002cf8e69f"} Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.140477 4707 scope.go:117] "RemoveContainer" containerID="4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.210611 4707 scope.go:117] "RemoveContainer" containerID="2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf" Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.213513 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.213566 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:39:56.213555462 +0000 UTC m=+2293.395071684 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.213595 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.213657 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:56.213640791 +0000 UTC m=+2293.395157014 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.214716 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-brbmp"] Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.221643 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-brbmp"] Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.226513 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm"] Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.231458 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77c8f47597-7cmwm"] Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.235469 4707 scope.go:117] "RemoveContainer" containerID="4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8" Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.235964 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8\": container with ID starting with 4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8 not found: ID does not exist" containerID="4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.236013 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8"} err="failed to get container status \"4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8\": rpc error: code = NotFound desc = could not find container \"4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8\": container with ID starting with 4e53bea5ccb95231c93fc249d13f35fb063367c3e793f48f47ddc37af52483a8 not found: ID does not exist" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.236036 4707 scope.go:117] "RemoveContainer" containerID="2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf" Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.236278 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf\": container with ID starting with 2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf not found: ID does not exist" containerID="2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.236308 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf"} err="failed to get container status \"2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf\": rpc error: code = NotFound desc = could not find container \"2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf\": container with ID starting with 2bba0bcc8aea0e15688d1d86222b91b84736d93c963f6bab6bba7ebddf36b4cf not found: ID does not exist" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.446865 4707 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.517966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-config-data\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-credential-keys\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-fernet-keys\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-public-tls-certs\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-combined-ca-bundle\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-internal-tls-certs\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-scripts\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.519645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snnh7\" (UniqueName: \"kubernetes.io/projected/324409f5-db7f-41c2-9004-d9c4d10e1e0e-kube-api-access-snnh7\") pod \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\" (UID: \"324409f5-db7f-41c2-9004-d9c4d10e1e0e\") " Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.522917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.523116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.523199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324409f5-db7f-41c2-9004-d9c4d10e1e0e-kube-api-access-snnh7" (OuterVolumeSpecName: "kube-api-access-snnh7") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "kube-api-access-snnh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.523873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-scripts" (OuterVolumeSpecName: "scripts") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.537946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-config-data" (OuterVolumeSpecName: "config-data") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.538846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.549926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.550291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "324409f5-db7f-41c2-9004-d9c4d10e1e0e" (UID: "324409f5-db7f-41c2-9004-d9c4d10e1e0e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.589677 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.591113 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.592240 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:48 crc kubenswrapper[4707]: E0121 15:39:48.592291 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621266 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621292 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621302 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snnh7\" (UniqueName: \"kubernetes.io/projected/324409f5-db7f-41c2-9004-d9c4d10e1e0e-kube-api-access-snnh7\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621311 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621319 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621327 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621335 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:48 crc kubenswrapper[4707]: I0121 15:39:48.621344 4707 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324409f5-db7f-41c2-9004-d9c4d10e1e0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.129009 4707 generic.go:334] "Generic (PLEG): container finished" podID="324409f5-db7f-41c2-9004-d9c4d10e1e0e" containerID="9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577" exitCode=0 Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.129062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" event={"ID":"324409f5-db7f-41c2-9004-d9c4d10e1e0e","Type":"ContainerDied","Data":"9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577"} Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.129085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" event={"ID":"324409f5-db7f-41c2-9004-d9c4d10e1e0e","Type":"ContainerDied","Data":"5e582727291c318d43862ce2372edd257d145e7bdddbd0587bc647696d9cf276"} Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.129101 4707 scope.go:117] "RemoveContainer" containerID="9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.129174 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6cfc89d475-9v9xq" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.135581 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerID="cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91" exitCode=0 Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.135653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"b0cfe067-0df4-46ce-b674-eed01fb4ca91","Type":"ContainerDied","Data":"cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91"} Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.158225 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6cfc89d475-9v9xq"] Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.158395 4707 scope.go:117] "RemoveContainer" containerID="9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577" Jan 21 15:39:49 crc kubenswrapper[4707]: E0121 15:39:49.159042 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577\": container with ID starting with 9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577 not found: ID does not exist" containerID="9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.159100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577"} err="failed to get container status \"9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577\": rpc error: code = NotFound desc = could not find container \"9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577\": container with ID starting with 9cef4029f0d5a9a2031797c8424f8ca387f4378dcae4378bfe57715ee862e577 not found: ID does not exist" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.163665 4707 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6cfc89d475-9v9xq"] Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.198288 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" path="/var/lib/kubelet/pods/0c0fc739-8c30-46ab-aea8-1db8f2864340/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.198760 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="324409f5-db7f-41c2-9004-d9c4d10e1e0e" path="/var/lib/kubelet/pods/324409f5-db7f-41c2-9004-d9c4d10e1e0e/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.199270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40927271-bdaa-4872-a579-c100648860dd" path="/var/lib/kubelet/pods/40927271-bdaa-4872-a579-c100648860dd/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.200213 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" path="/var/lib/kubelet/pods/487d2c84-f1f6-4b88-9188-49e704aeb43a/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.200877 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" path="/var/lib/kubelet/pods/64087307-6402-47cd-8a4b-37c3d90eb61e/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.201455 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" path="/var/lib/kubelet/pods/741daec5-08a7-4769-8b0b-4248c3c02d69/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.202435 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" path="/var/lib/kubelet/pods/7e183977-7c22-4b6d-a465-132c8f85dbb0/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.203293 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" path="/var/lib/kubelet/pods/9cee2089-6404-41ee-b6ae-82d788ced5fc/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.204280 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" path="/var/lib/kubelet/pods/aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.204670 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1" path="/var/lib/kubelet/pods/baa49cdf-1b1a-44a1-9fb6-b3bc438bd0a1/volumes" Jan 21 15:39:49 crc kubenswrapper[4707]: E0121 15:39:49.215975 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91 is running failed: container process not found" containerID="cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 15:39:49 crc kubenswrapper[4707]: E0121 15:39:49.216205 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91 is running failed: container process not found" containerID="cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] 
Jan 21 15:39:49 crc kubenswrapper[4707]: E0121 15:39:49.216391 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91 is running failed: container process not found" containerID="cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 15:39:49 crc kubenswrapper[4707]: E0121 15:39:49.216440 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-1" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="galera" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.303032 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.303442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerName="galera" containerID="cri-o://8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443" gracePeriod=26 Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz2dc\" (UniqueName: \"kubernetes.io/projected/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kube-api-access-dz2dc\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-operator-scripts\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-galera-tls-certs\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-default\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kolla-config\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: 
\"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-generated\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.337761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-combined-ca-bundle\") pod \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\" (UID: \"b0cfe067-0df4-46ce-b674-eed01fb4ca91\") " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.342547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.342590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.343070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.343402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.345605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kube-api-access-dz2dc" (OuterVolumeSpecName: "kube-api-access-dz2dc") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "kube-api-access-dz2dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.348907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "mysql-db") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.357767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.372193 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b0cfe067-0df4-46ce-b674-eed01fb4ca91" (UID: "b0cfe067-0df4-46ce-b674-eed01fb4ca91"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439296 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439336 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439348 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439357 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439367 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0cfe067-0df4-46ce-b674-eed01fb4ca91-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439376 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cfe067-0df4-46ce-b674-eed01fb4ca91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439384 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz2dc\" (UniqueName: \"kubernetes.io/projected/b0cfe067-0df4-46ce-b674-eed01fb4ca91-kube-api-access-dz2dc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.439391 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cfe067-0df4-46ce-b674-eed01fb4ca91-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.452155 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 21 15:39:49 crc kubenswrapper[4707]: I0121 15:39:49.540441 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: E0121 15:39:50.046577 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-cell0-conductor-config-data: secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:50 crc kubenswrapper[4707]: E0121 15:39:50.046633 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data podName:91365236-99fe-4ee7-9aa1-8a414557cf97 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:58.04662057 +0000 UTC m=+2295.228136792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data") pod "nova-cell0-conductor-0" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97") : secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.198444 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.200676 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e2084ed-217a-462c-8219-6cfa022751d9" containerID="77bb156bdfd2d77e45072aa690e4f0b81d8bcd71d4f1156e3520806d9729c421" exitCode=0 Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.200722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"8e2084ed-217a-462c-8219-6cfa022751d9","Type":"ContainerDied","Data":"77bb156bdfd2d77e45072aa690e4f0b81d8bcd71d4f1156e3520806d9729c421"} Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.204212 4707 generic.go:334] "Generic (PLEG): container finished" podID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerID="8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443" exitCode=0 Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.204268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b","Type":"ContainerDied","Data":"8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443"} Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.204284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b","Type":"ContainerDied","Data":"cb2ececd620c05cd639d8ec3aefb8ccbf2ed59c0d90b6381ad7583044af99531"} Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.204300 4707 scope.go:117] "RemoveContainer" containerID="8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.204411 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.209036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"b0cfe067-0df4-46ce-b674-eed01fb4ca91","Type":"ContainerDied","Data":"a0a0b664893c15c33eb4e777a68f91a2a445b49ce070e19c7182c15bdb422ca2"} Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.209156 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.230038 4707 scope.go:117] "RemoveContainer" containerID="4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-default\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25gbb\" (UniqueName: \"kubernetes.io/projected/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kube-api-access-25gbb\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-combined-ca-bundle\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-operator-scripts\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-galera-tls-certs\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kolla-config\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.251798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-generated\") pod \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\" (UID: \"224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.255681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kube-api-access-25gbb" (OuterVolumeSpecName: "kube-api-access-25gbb") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "kube-api-access-25gbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.256036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.256394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.256706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.256927 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.257779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.259557 4707 scope.go:117] "RemoveContainer" containerID="8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443" Jan 21 15:39:50 crc kubenswrapper[4707]: E0121 15:39:50.259984 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443\": container with ID starting with 8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443 not found: ID does not exist" containerID="8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.260016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443"} err="failed to get container status \"8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443\": rpc error: code = NotFound desc = could not find container \"8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443\": container with ID starting with 8891dc25b7fb62438e2fb4d08cc117485a4cd92827ece461b23699cdb404a443 not found: ID does not exist" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.260037 4707 scope.go:117] "RemoveContainer" containerID="4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678" Jan 21 15:39:50 crc kubenswrapper[4707]: E0121 15:39:50.260372 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678\": container with ID starting with 4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678 not found: ID does not exist" containerID="4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.260398 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678"} err="failed to get container status \"4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678\": rpc error: code = NotFound desc = could not find container \"4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678\": container with ID starting with 4430b1dd83cc4ece58f53cd2eebda807a33ec9cbd55029d681da336e81915678 not found: ID does not exist" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.260419 4707 scope.go:117] "RemoveContainer" containerID="cd29b6ff7775ca99ce7ebb2ba87d0b77498c0a0c7b683524a54507f1a44acf91" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.261764 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.276157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.279407 4707 scope.go:117] "RemoveContainer" containerID="dbb3b37d291b0ba6c3e5d38b8c71359fbf200dfd228ed3c32b7182d8d9d73e43" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.280018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "mysql-db") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.294007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" (UID: "224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.337179 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.355881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-plugins-conf\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.355937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.355994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-tls\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65vml\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-kube-api-access-65vml\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-plugins\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-confd\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e2084ed-217a-462c-8219-6cfa022751d9-erlang-cookie-secret\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-server-conf\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-erlang-cookie\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e2084ed-217a-462c-8219-6cfa022751d9-pod-info\") pod \"8e2084ed-217a-462c-8219-6cfa022751d9\" (UID: \"8e2084ed-217a-462c-8219-6cfa022751d9\") " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357091 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357110 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357119 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357134 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357143 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.356581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357151 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357186 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357227 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25gbb\" (UniqueName: \"kubernetes.io/projected/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b-kube-api-access-25gbb\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.357610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.358967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.358990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.359403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-kube-api-access-65vml" (OuterVolumeSpecName: "kube-api-access-65vml") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "kube-api-access-65vml". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.361370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2084ed-217a-462c-8219-6cfa022751d9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.372539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data" (OuterVolumeSpecName: "config-data") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.375853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8e2084ed-217a-462c-8219-6cfa022751d9-pod-info" (OuterVolumeSpecName: "pod-info") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.380291 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.388042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-server-conf" (OuterVolumeSpecName: "server-conf") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.422802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8e2084ed-217a-462c-8219-6cfa022751d9" (UID: "8e2084ed-217a-462c-8219-6cfa022751d9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.458431 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.459005 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460437 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460510 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460578 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65vml\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-kube-api-access-65vml\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460634 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460684 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460745 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460795 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e2084ed-217a-462c-8219-6cfa022751d9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460871 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e2084ed-217a-462c-8219-6cfa022751d9-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460929 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e2084ed-217a-462c-8219-6cfa022751d9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.460979 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e2084ed-217a-462c-8219-6cfa022751d9-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.473153 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.540621 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.546494 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:39:50 crc kubenswrapper[4707]: I0121 15:39:50.561719 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.188912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" path="/var/lib/kubelet/pods/224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b/volumes" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.189567 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" path="/var/lib/kubelet/pods/b0cfe067-0df4-46ce-b674-eed01fb4ca91/volumes" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.219265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"8e2084ed-217a-462c-8219-6cfa022751d9","Type":"ContainerDied","Data":"99586717160da9478207f34b93f55cce606fee2e927195867542e138aa220d52"} Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.219286 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.219306 4707 scope.go:117] "RemoveContainer" containerID="77bb156bdfd2d77e45072aa690e4f0b81d8bcd71d4f1156e3520806d9729c421" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.402981 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.404371 4707 scope.go:117] "RemoveContainer" containerID="6a798f965e2ede79a50452c9b9cf47838b4c424ce11b6965be9499ba22782a08" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.408968 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 21 15:39:51 crc kubenswrapper[4707]: E0121 15:39:51.583164 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:39:51 crc kubenswrapper[4707]: E0121 15:39:51.583215 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data podName:4c346ce8-cb7b-46a0-abda-84bc7a42f009 nodeName:}" failed. No retries permitted until 2026-01-21 15:39:59.583200968 +0000 UTC m=+2296.764717190 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data") pod "rabbitmq-server-0" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009") : configmap "rabbitmq-config-data" not found Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.651141 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.656467 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.659878 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.684884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-server-conf\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.684925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-tls\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.684945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.684999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c346ce8-cb7b-46a0-abda-84bc7a42f009-pod-info\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-plugins-conf\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c641d259-efc1-4897-94d0-bf9d429f3c5d-pod-info\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-server-conf\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wvwf\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-kube-api-access-4wvwf\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15b560a7-6184-4897-af4f-7975ebd19fcb-erlang-cookie-secret\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c346ce8-cb7b-46a0-abda-84bc7a42f009-erlang-cookie-secret\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-plugins\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15b560a7-6184-4897-af4f-7975ebd19fcb-pod-info\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4bls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-kube-api-access-h4bls\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-server-conf\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-tls\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w869l\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-kube-api-access-w869l\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-erlang-cookie\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-confd\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-plugins-conf\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c641d259-efc1-4897-94d0-bf9d429f3c5d-erlang-cookie-secret\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-plugins-conf\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-erlang-cookie\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-erlang-cookie\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-plugins\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-plugins\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-confd\") pod \"c641d259-efc1-4897-94d0-bf9d429f3c5d\" (UID: \"c641d259-efc1-4897-94d0-bf9d429f3c5d\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.685610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-tls\") pod \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\" (UID: \"4c346ce8-cb7b-46a0-abda-84bc7a42f009\") " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.690827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.691067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.692718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-kube-api-access-w869l" (OuterVolumeSpecName: "kube-api-access-w869l") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "kube-api-access-w869l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.693358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/15b560a7-6184-4897-af4f-7975ebd19fcb-pod-info" (OuterVolumeSpecName: "pod-info") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.694365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.695438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.696149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.696714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c346ce8-cb7b-46a0-abda-84bc7a42f009-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.699010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b560a7-6184-4897-af4f-7975ebd19fcb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.699576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.700707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.701035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.702272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.702733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.703080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.705274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.705357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c641d259-efc1-4897-94d0-bf9d429f3c5d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.705423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.705422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.705483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: E0121 15:39:51.707423 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.717089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4c346ce8-cb7b-46a0-abda-84bc7a42f009-pod-info" (OuterVolumeSpecName: "pod-info") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.717128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-kube-api-access-h4bls" (OuterVolumeSpecName: "kube-api-access-h4bls") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "kube-api-access-h4bls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: E0121 15:39:51.717053 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:51 crc kubenswrapper[4707]: E0121 15:39:51.723575 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:51 crc kubenswrapper[4707]: E0121 15:39:51.723618 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.726515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c641d259-efc1-4897-94d0-bf9d429f3c5d-pod-info" (OuterVolumeSpecName: "pod-info") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.731335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data" (OuterVolumeSpecName: "config-data") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.739383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data" (OuterVolumeSpecName: "config-data") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.739530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-kube-api-access-4wvwf" (OuterVolumeSpecName: "kube-api-access-4wvwf") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "kube-api-access-4wvwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.747123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data" (OuterVolumeSpecName: "config-data") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.760738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-server-conf" (OuterVolumeSpecName: "server-conf") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.760975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-server-conf" (OuterVolumeSpecName: "server-conf") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.762220 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-server-conf" (OuterVolumeSpecName: "server-conf") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.782154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4c346ce8-cb7b-46a0-abda-84bc7a42f009" (UID: "4c346ce8-cb7b-46a0-abda-84bc7a42f009"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.787857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c641d259-efc1-4897-94d0-bf9d429f3c5d" (UID: "c641d259-efc1-4897-94d0-bf9d429f3c5d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.788625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd\") pod \"15b560a7-6184-4897-af4f-7975ebd19fcb\" (UID: \"15b560a7-6184-4897-af4f-7975ebd19fcb\") " Jan 21 15:39:51 crc kubenswrapper[4707]: W0121 15:39:51.789240 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/15b560a7-6184-4897-af4f-7975ebd19fcb/volumes/kubernetes.io~projected/rabbitmq-confd Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "15b560a7-6184-4897-af4f-7975ebd19fcb" (UID: "15b560a7-6184-4897-af4f-7975ebd19fcb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789542 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789563 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wvwf\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-kube-api-access-4wvwf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789573 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789582 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15b560a7-6184-4897-af4f-7975ebd19fcb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789605 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789613 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c346ce8-cb7b-46a0-abda-84bc7a42f009-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789626 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789635 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789643 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789652 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15b560a7-6184-4897-af4f-7975ebd19fcb-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789660 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4bls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-kube-api-access-h4bls\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789668 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789677 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789685 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w869l\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-kube-api-access-w869l\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789693 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789702 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789737 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789746 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c641d259-efc1-4897-94d0-bf9d429f3c5d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789754 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789761 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789769 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789777 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789784 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789794 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789883 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c641d259-efc1-4897-94d0-bf9d429f3c5d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789892 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c346ce8-cb7b-46a0-abda-84bc7a42f009-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789900 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c641d259-efc1-4897-94d0-bf9d429f3c5d-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789908 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15b560a7-6184-4897-af4f-7975ebd19fcb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789920 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789927 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c346ce8-cb7b-46a0-abda-84bc7a42f009-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789945 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c346ce8-cb7b-46a0-abda-84bc7a42f009-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789953 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15b560a7-6184-4897-af4f-7975ebd19fcb-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.789961 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c641d259-efc1-4897-94d0-bf9d429f3c5d-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.800135 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 15:39:51 crc 
kubenswrapper[4707]: I0121 15:39:51.801705 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.802309 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.890932 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.890956 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:51 crc kubenswrapper[4707]: I0121 15:39:51.890965 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.070083 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.071284 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.073570 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.073602 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.228693 4707 generic.go:334] "Generic (PLEG): container finished" podID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerID="c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc" exitCode=0 Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.228740 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.228760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"15b560a7-6184-4897-af4f-7975ebd19fcb","Type":"ContainerDied","Data":"c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc"} Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.228794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"15b560a7-6184-4897-af4f-7975ebd19fcb","Type":"ContainerDied","Data":"84f9a01fc933d02b31761e13aad8dce7376a5048b526d2374384b83a02ff8564"} Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.228827 4707 scope.go:117] "RemoveContainer" containerID="c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.234881 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerID="da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442" exitCode=0 Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.234949 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.234954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4c346ce8-cb7b-46a0-abda-84bc7a42f009","Type":"ContainerDied","Data":"da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442"} Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.234984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4c346ce8-cb7b-46a0-abda-84bc7a42f009","Type":"ContainerDied","Data":"999ddce239732138ecc6be83b4fb9f61d057a201cfddf5940f97fde33a1d4f80"} Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.238138 4707 generic.go:334] "Generic (PLEG): container finished" podID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerID="6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62" exitCode=0 Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.238175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"c641d259-efc1-4897-94d0-bf9d429f3c5d","Type":"ContainerDied","Data":"6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62"} Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.238219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"c641d259-efc1-4897-94d0-bf9d429f3c5d","Type":"ContainerDied","Data":"652ed6964b05c15a5f17f224272bf185cf72f7e5e1bd7307ed379302f6328ff7"} Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.238188 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.252980 4707 scope.go:117] "RemoveContainer" containerID="4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.265684 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.266639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.282965 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.295417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.297586 4707 scope.go:117] "RemoveContainer" containerID="c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc" Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.299441 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc\": container with ID starting with c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc not found: ID does not exist" containerID="c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.299480 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc"} err="failed to get container status \"c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc\": rpc error: code = NotFound desc = could not find container \"c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc\": container with ID starting with c95ca32c14090eecd8bce50d33849988897ef9c864da02b8fb66b350996c6bdc not found: ID does not exist" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.299501 4707 scope.go:117] "RemoveContainer" containerID="4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069" Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.299956 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069\": container with ID starting with 4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069 not found: ID does not exist" containerID="4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.299979 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069"} err="failed to get container status \"4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069\": rpc error: code = NotFound desc = could not find container \"4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069\": container with ID starting with 4ec1756f0c6a2cd33dc8861b64aefc3702f79341fbb59aa643a34b77066ff069 not found: ID does not exist" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.299992 4707 scope.go:117] "RemoveContainer" 
containerID="da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.307963 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.318908 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.324530 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.324837 4707 scope.go:117] "RemoveContainer" containerID="4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.368118 4707 scope.go:117] "RemoveContainer" containerID="da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442" Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.370118 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442\": container with ID starting with da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442 not found: ID does not exist" containerID="da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.370146 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442"} err="failed to get container status \"da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442\": rpc error: code = NotFound desc = could not find container \"da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442\": container with ID starting with da375c8bb49585824783ab7ca710dbb67f2f581edbccc13360586faf663f9442 not found: ID does not exist" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.370162 4707 scope.go:117] "RemoveContainer" containerID="4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.371460 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf"] Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.371662 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerName="dnsmasq-dns" containerID="cri-o://b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd" gracePeriod=10 Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.372568 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3\": container with ID starting with 4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3 not found: ID does not exist" containerID="4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.372604 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3"} err="failed to get container status \"4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3\": rpc error: code = NotFound 
desc = could not find container \"4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3\": container with ID starting with 4c006cacf22b81fa9870858dc556552a3df1507f94871f7767824d02d26804e3 not found: ID does not exist" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.372626 4707 scope.go:117] "RemoveContainer" containerID="6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.408598 4707 scope.go:117] "RemoveContainer" containerID="6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.435331 4707 scope.go:117] "RemoveContainer" containerID="6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62" Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.435763 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62\": container with ID starting with 6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62 not found: ID does not exist" containerID="6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.435799 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62"} err="failed to get container status \"6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62\": rpc error: code = NotFound desc = could not find container \"6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62\": container with ID starting with 6d447671adc6b057d6cdaa31f862f5850d244a83a6bf2d606dc3402b6ef6af62 not found: ID does not exist" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.435831 4707 scope.go:117] "RemoveContainer" containerID="6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e" Jan 21 15:39:52 crc kubenswrapper[4707]: E0121 15:39:52.436114 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e\": container with ID starting with 6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e not found: ID does not exist" containerID="6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e" Jan 21 15:39:52 crc kubenswrapper[4707]: I0121 15:39:52.436133 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e"} err="failed to get container status \"6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e\": rpc error: code = NotFound desc = could not find container \"6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e\": container with ID starting with 6b35a902d1a57527c127ff7a571f8f5fc6344d1523ff0fde317912584c61a30e not found: ID does not exist" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.764682 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.804879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dns-swift-storage-0\") pod \"e007cb2f-fee4-4c18-9baf-9e61af91254c\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.805014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448pb\" (UniqueName: \"kubernetes.io/projected/e007cb2f-fee4-4c18-9baf-9e61af91254c-kube-api-access-448pb\") pod \"e007cb2f-fee4-4c18-9baf-9e61af91254c\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.805036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-config\") pod \"e007cb2f-fee4-4c18-9baf-9e61af91254c\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.805097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dnsmasq-svc\") pod \"e007cb2f-fee4-4c18-9baf-9e61af91254c\" (UID: \"e007cb2f-fee4-4c18-9baf-9e61af91254c\") " Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.810632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e007cb2f-fee4-4c18-9baf-9e61af91254c-kube-api-access-448pb" (OuterVolumeSpecName: "kube-api-access-448pb") pod "e007cb2f-fee4-4c18-9baf-9e61af91254c" (UID: "e007cb2f-fee4-4c18-9baf-9e61af91254c"). InnerVolumeSpecName "kube-api-access-448pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.834707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-config" (OuterVolumeSpecName: "config") pod "e007cb2f-fee4-4c18-9baf-9e61af91254c" (UID: "e007cb2f-fee4-4c18-9baf-9e61af91254c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.835080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e007cb2f-fee4-4c18-9baf-9e61af91254c" (UID: "e007cb2f-fee4-4c18-9baf-9e61af91254c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.843399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "e007cb2f-fee4-4c18-9baf-9e61af91254c" (UID: "e007cb2f-fee4-4c18-9baf-9e61af91254c"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.906572 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448pb\" (UniqueName: \"kubernetes.io/projected/e007cb2f-fee4-4c18-9baf-9e61af91254c-kube-api-access-448pb\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.906590 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.906599 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:52.906607 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e007cb2f-fee4-4c18-9baf-9e61af91254c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.190998 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" path="/var/lib/kubelet/pods/15b560a7-6184-4897-af4f-7975ebd19fcb/volumes" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.191668 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" path="/var/lib/kubelet/pods/4c346ce8-cb7b-46a0-abda-84bc7a42f009/volumes" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.192708 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" path="/var/lib/kubelet/pods/8e2084ed-217a-462c-8219-6cfa022751d9/volumes" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.193379 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" path="/var/lib/kubelet/pods/c641d259-efc1-4897-94d0-bf9d429f3c5d/volumes" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.249341 4707 generic.go:334] "Generic (PLEG): container finished" podID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerID="b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd" exitCode=0 Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.249399 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.249403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" event={"ID":"e007cb2f-fee4-4c18-9baf-9e61af91254c","Type":"ContainerDied","Data":"b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd"} Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.249480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf" event={"ID":"e007cb2f-fee4-4c18-9baf-9e61af91254c","Type":"ContainerDied","Data":"c2621c8143f9a172c2a3d23695790adfd92c6bc5da1849ad78cd00b550697a26"} Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.249501 4707 scope.go:117] "RemoveContainer" containerID="b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.270863 4707 scope.go:117] "RemoveContainer" containerID="aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.273543 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf"] Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.277665 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-pc4rf"] Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.308286 4707 scope.go:117] "RemoveContainer" containerID="b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd" Jan 21 15:39:53 crc kubenswrapper[4707]: E0121 15:39:53.308633 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd\": container with ID starting with b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd not found: ID does not exist" containerID="b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.308662 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd"} err="failed to get container status \"b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd\": rpc error: code = NotFound desc = could not find container \"b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd\": container with ID starting with b2dc5986c5466733f7a489ab4ad58bdadc6d817686682b0c78e996e91c7b6cfd not found: ID does not exist" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.308685 4707 scope.go:117] "RemoveContainer" containerID="aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522" Jan 21 15:39:53 crc kubenswrapper[4707]: E0121 15:39:53.309075 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522\": container with ID starting with aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522 not found: ID does not exist" containerID="aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522" Jan 21 15:39:53 crc kubenswrapper[4707]: I0121 15:39:53.309095 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522"} err="failed to get container status \"aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522\": rpc error: code = NotFound desc = could not find container \"aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522\": container with ID starting with aae85694a91b6e2c04463b349b2728872d3913dd042d2a6c0e60c0eba7db2522 not found: ID does not exist" Jan 21 15:39:53 crc kubenswrapper[4707]: E0121 15:39:53.589302 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:53 crc kubenswrapper[4707]: E0121 15:39:53.590503 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:53 crc kubenswrapper[4707]: E0121 15:39:53.591725 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:53 crc kubenswrapper[4707]: E0121 15:39:53.591765 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:39:55 crc kubenswrapper[4707]: I0121 15:39:55.192507 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" path="/var/lib/kubelet/pods/e007cb2f-fee4-4c18-9baf-9e61af91254c/volumes" Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.248858 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.248912 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:40:12.248899596 +0000 UTC m=+2309.430415818 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.249750 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.249858 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:40:12.249843571 +0000 UTC m=+2309.431359793 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.698933 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.700031 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.701188 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:56 crc kubenswrapper[4707]: E0121 15:39:56.701221 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:39:57 crc kubenswrapper[4707]: E0121 15:39:57.070819 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:57 crc kubenswrapper[4707]: E0121 15:39:57.087463 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:57 crc kubenswrapper[4707]: E0121 
15:39:57.095046 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:39:57 crc kubenswrapper[4707]: E0121 15:39:57.095106 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:39:57 crc kubenswrapper[4707]: I0121 15:39:57.784345 4707 scope.go:117] "RemoveContainer" containerID="6b8c13446ffb3eb7877ff3e82290ac2247f8250d0dd79db6ca3526a7c844fd83" Jan 21 15:39:57 crc kubenswrapper[4707]: I0121 15:39:57.801065 4707 scope.go:117] "RemoveContainer" containerID="034eae89f5e612ade9f163f60388849a5f0b555f0c70a7dd88537a0410cf853b" Jan 21 15:39:57 crc kubenswrapper[4707]: I0121 15:39:57.832186 4707 scope.go:117] "RemoveContainer" containerID="40c226bf4e51d0f7d6f1a51b8c7febdbedfe2ce1d5184582b150377ae5d63e42" Jan 21 15:39:57 crc kubenswrapper[4707]: I0121 15:39:57.858417 4707 scope.go:117] "RemoveContainer" containerID="f3d23c22831b8b61a1371a54cbf9cc055881f895c54df9153e45ee9a3ca7ebde" Jan 21 15:39:58 crc kubenswrapper[4707]: E0121 15:39:58.070004 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-cell0-conductor-config-data: secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:58 crc kubenswrapper[4707]: E0121 15:39:58.070057 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data podName:91365236-99fe-4ee7-9aa1-8a414557cf97 nodeName:}" failed. No retries permitted until 2026-01-21 15:40:14.070044193 +0000 UTC m=+2311.251560415 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data") pod "nova-cell0-conductor-0" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97") : secret "nova-cell0-conductor-config-data" not found Jan 21 15:39:58 crc kubenswrapper[4707]: E0121 15:39:58.588699 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:58 crc kubenswrapper[4707]: E0121 15:39:58.590915 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:58 crc kubenswrapper[4707]: E0121 15:39:58.591980 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:39:58 crc kubenswrapper[4707]: E0121 15:39:58.592016 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:40:01 crc kubenswrapper[4707]: E0121 15:40:01.698156 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:01 crc kubenswrapper[4707]: E0121 15:40:01.707127 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:01 crc kubenswrapper[4707]: E0121 15:40:01.716052 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:01 crc kubenswrapper[4707]: E0121 15:40:01.716113 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:40:02 crc kubenswrapper[4707]: E0121 15:40:02.069623 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:02 crc kubenswrapper[4707]: E0121 15:40:02.070906 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:02 crc kubenswrapper[4707]: E0121 15:40:02.071945 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:02 crc kubenswrapper[4707]: E0121 15:40:02.072053 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:40:03 crc kubenswrapper[4707]: E0121 15:40:03.588567 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:40:03 crc kubenswrapper[4707]: E0121 15:40:03.589623 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:40:03 crc kubenswrapper[4707]: E0121 15:40:03.590625 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:40:03 crc kubenswrapper[4707]: E0121 15:40:03.590652 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:40:05 crc kubenswrapper[4707]: I0121 15:40:05.879129 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.169:9696/\": dial tcp 10.217.0.169:9696: connect: connection refused" Jan 21 15:40:06 crc kubenswrapper[4707]: E0121 15:40:06.697716 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:06 crc kubenswrapper[4707]: E0121 15:40:06.698624 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:06 crc kubenswrapper[4707]: E0121 15:40:06.699576 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:06 crc kubenswrapper[4707]: E0121 15:40:06.699605 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:40:07 crc kubenswrapper[4707]: E0121 15:40:07.069474 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:07 crc kubenswrapper[4707]: E0121 15:40:07.070698 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:07 crc kubenswrapper[4707]: E0121 15:40:07.071716 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:07 crc kubenswrapper[4707]: E0121 15:40:07.071824 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:40:08 crc kubenswrapper[4707]: E0121 15:40:08.588843 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:40:08 crc kubenswrapper[4707]: E0121 15:40:08.589852 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:40:08 crc kubenswrapper[4707]: E0121 15:40:08.590887 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:40:08 crc kubenswrapper[4707]: E0121 15:40:08.590912 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:40:11 crc kubenswrapper[4707]: E0121 15:40:11.698672 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:11 crc kubenswrapper[4707]: E0121 15:40:11.701723 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:11 crc kubenswrapper[4707]: E0121 15:40:11.702833 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:11 crc kubenswrapper[4707]: E0121 15:40:11.702859 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.072306 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.073766 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.074899 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.074946 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.161422 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.254279 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.254339 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:40:44.254323912 +0000 UTC m=+2341.435840134 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.254289 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.254504 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:40:44.25439692 +0000 UTC m=+2341.435913131 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.355115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") pod \"70e67f96-41b8-43d4-b048-546873358118\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.355418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjp6\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-kube-api-access-qqjp6\") pod \"70e67f96-41b8-43d4-b048-546873358118\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.355481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-lock\") pod \"70e67f96-41b8-43d4-b048-546873358118\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.355521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"70e67f96-41b8-43d4-b048-546873358118\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.355554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-cache\") pod \"70e67f96-41b8-43d4-b048-546873358118\" (UID: \"70e67f96-41b8-43d4-b048-546873358118\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.355859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-lock" (OuterVolumeSpecName: "lock") pod "70e67f96-41b8-43d4-b048-546873358118" (UID: "70e67f96-41b8-43d4-b048-546873358118"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.356048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-cache" (OuterVolumeSpecName: "cache") pod "70e67f96-41b8-43d4-b048-546873358118" (UID: "70e67f96-41b8-43d4-b048-546873358118"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.359955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-kube-api-access-qqjp6" (OuterVolumeSpecName: "kube-api-access-qqjp6") pod "70e67f96-41b8-43d4-b048-546873358118" (UID: "70e67f96-41b8-43d4-b048-546873358118"). InnerVolumeSpecName "kube-api-access-qqjp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.359960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "swift") pod "70e67f96-41b8-43d4-b048-546873358118" (UID: "70e67f96-41b8-43d4-b048-546873358118"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.360311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "70e67f96-41b8-43d4-b048-546873358118" (UID: "70e67f96-41b8-43d4-b048-546873358118"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374206 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5f454b744-bh8qb_6452a729-a782-4679-be7b-1c5240b66caa/neutron-api/0.log" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374247 4707 generic.go:334] "Generic (PLEG): container finished" podID="6452a729-a782-4679-be7b-1c5240b66caa" containerID="46086c7e5a9656d5495acf8f78897d5393fb2095b5808e57b661a658aa9f8aec" exitCode=137 Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" event={"ID":"6452a729-a782-4679-be7b-1c5240b66caa","Type":"ContainerDied","Data":"46086c7e5a9656d5495acf8f78897d5393fb2095b5808e57b661a658aa9f8aec"} Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" event={"ID":"6452a729-a782-4679-be7b-1c5240b66caa","Type":"ContainerDied","Data":"119071083d3504a83fd61b8490add36b41daeacbc3e7fa44c785a9272d2a77be"} Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374338 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119071083d3504a83fd61b8490add36b41daeacbc3e7fa44c785a9272d2a77be" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374412 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5f454b744-bh8qb_6452a729-a782-4679-be7b-1c5240b66caa/neutron-api/0.log" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.374480 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.381093 4707 generic.go:334] "Generic (PLEG): container finished" podID="70e67f96-41b8-43d4-b048-546873358118" containerID="faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9" exitCode=137 Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.381131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9"} Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.381148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70e67f96-41b8-43d4-b048-546873358118","Type":"ContainerDied","Data":"f21cb72f66e3f8efcba46eb76e5cc5b4b986967d6e0d4b8566a52ef0b0c38a92"} Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.381164 4707 scope.go:117] "RemoveContainer" containerID="faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.381420 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.384672 4707 generic.go:334] "Generic (PLEG): container finished" podID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerID="86eaf4108dcfe70d29829d6618776e1e19fc4cf3438e7d0357073b14327e2707" exitCode=137 Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.384698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829","Type":"ContainerDied","Data":"86eaf4108dcfe70d29829d6618776e1e19fc4cf3438e7d0357073b14327e2707"} Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.406114 4707 scope.go:117] "RemoveContainer" containerID="3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.445320 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.446961 4707 scope.go:117] "RemoveContainer" containerID="39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.450505 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.457334 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.457361 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.457372 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.457382 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjp6\" (UniqueName: 
\"kubernetes.io/projected/70e67f96-41b8-43d4-b048-546873358118-kube-api-access-qqjp6\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.457391 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70e67f96-41b8-43d4-b048-546873358118-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.462330 4707 scope.go:117] "RemoveContainer" containerID="38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.469039 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.477984 4707 scope.go:117] "RemoveContainer" containerID="ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.494216 4707 scope.go:117] "RemoveContainer" containerID="d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.508480 4707 scope.go:117] "RemoveContainer" containerID="0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.517947 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.524960 4707 scope.go:117] "RemoveContainer" containerID="8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.551796 4707 scope.go:117] "RemoveContainer" containerID="202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-config\") pod \"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-internal-tls-certs\") pod \"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-ovndb-tls-certs\") pod \"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwvrk\" (UniqueName: \"kubernetes.io/projected/6452a729-a782-4679-be7b-1c5240b66caa-kube-api-access-fwvrk\") pod \"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-public-tls-certs\") pod 
\"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-combined-ca-bundle\") pod \"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.558662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-httpd-config\") pod \"6452a729-a782-4679-be7b-1c5240b66caa\" (UID: \"6452a729-a782-4679-be7b-1c5240b66caa\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.559862 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.562130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6452a729-a782-4679-be7b-1c5240b66caa-kube-api-access-fwvrk" (OuterVolumeSpecName: "kube-api-access-fwvrk") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "kube-api-access-fwvrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.562548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.567673 4707 scope.go:117] "RemoveContainer" containerID="a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.586863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-config" (OuterVolumeSpecName: "config") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.590588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.590601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.596417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.602758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6452a729-a782-4679-be7b-1c5240b66caa" (UID: "6452a729-a782-4679-be7b-1c5240b66caa"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data-custom\") pod \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66lh9\" (UniqueName: \"kubernetes.io/projected/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-kube-api-access-66lh9\") pod \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-etc-machine-id\") pod \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-combined-ca-bundle\") pod \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660289 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-scripts\") pod \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data\") pod \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\" (UID: \"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" (UID: "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660728 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660749 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660760 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660769 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660777 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwvrk\" (UniqueName: \"kubernetes.io/projected/6452a729-a782-4679-be7b-1c5240b66caa-kube-api-access-fwvrk\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660787 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660795 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.660802 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6452a729-a782-4679-be7b-1c5240b66caa-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.663428 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" (UID: "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.664084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-scripts" (OuterVolumeSpecName: "scripts") pod "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" (UID: "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.664121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-kube-api-access-66lh9" (OuterVolumeSpecName: "kube-api-access-66lh9") pod "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" (UID: "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829"). InnerVolumeSpecName "kube-api-access-66lh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.669134 4707 scope.go:117] "RemoveContainer" containerID="80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.688752 4707 scope.go:117] "RemoveContainer" containerID="5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.707921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" (UID: "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.723440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data" (OuterVolumeSpecName: "config-data") pod "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" (UID: "b87a91a2-ba47-4dc0-9b58-0d2f00f5f829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.757632 4707 scope.go:117] "RemoveContainer" containerID="af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.763972 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.763997 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.764006 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.764015 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.764023 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66lh9\" (UniqueName: \"kubernetes.io/projected/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829-kube-api-access-66lh9\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.774840 4707 scope.go:117] "RemoveContainer" containerID="0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.789284 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.801667 4707 scope.go:117] "RemoveContainer" containerID="9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.819528 4707 scope.go:117] "RemoveContainer" containerID="faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.819819 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9\": container with ID starting with faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9 not found: ID does not exist" containerID="faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.819847 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9"} err="failed to get container status \"faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9\": rpc error: code = NotFound desc = could not find container \"faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9\": container with ID starting with faba77eb23e2c83ec6b2fc83ac4e0e2b0d6b06244161e4ee6f77997d4b2fd1f9 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.819865 4707 scope.go:117] "RemoveContainer" containerID="3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.824729 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12\": container with ID starting with 3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12 not found: ID does not exist" containerID="3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.824756 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12"} err="failed to get container status \"3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12\": rpc error: code = NotFound desc = could not find container \"3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12\": container with ID starting with 3b0e538c0e0ee1c96afb0b7ab7e98a975108373d424a26d08320c33d8657ee12 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.824772 4707 scope.go:117] "RemoveContainer" containerID="39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.825015 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d\": container with ID starting with 39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d not found: ID does not exist" containerID="39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825030 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d"} err="failed to get container status \"39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d\": rpc error: code = NotFound desc = could not find container \"39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d\": container with ID starting with 39b97985bc85a7b6e963bdfb832f15ba030f9d2435f74ba3323b42bec452cc6d not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825041 4707 scope.go:117] "RemoveContainer" containerID="38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.825219 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651\": container with ID starting with 38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651 not found: ID does not exist" containerID="38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825234 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651"} err="failed to get container status \"38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651\": rpc error: code = NotFound desc = could not find container \"38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651\": container with ID starting with 38ea3aec6bf1f38431cb7c1afe9871493b69487e17cd602c1c501885b4279651 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825244 4707 scope.go:117] "RemoveContainer" containerID="ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.825427 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf\": container with ID starting with ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf not found: ID does not exist" containerID="ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825440 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf"} err="failed to get container status \"ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf\": rpc error: code = NotFound desc = could not find container \"ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf\": container with ID starting with ca7d4bd126f1721fb9f3c42409d386fef2690c35a837e1bac09f4e807d44ecdf not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825452 4707 scope.go:117] "RemoveContainer" containerID="d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.825616 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525\": container with ID starting with d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525 not found: ID does not exist" 
containerID="d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825630 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525"} err="failed to get container status \"d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525\": rpc error: code = NotFound desc = could not find container \"d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525\": container with ID starting with d890922a56b5836a3cdb89a44a85a476aeae104c2a640116dd72be464c33a525 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825640 4707 scope.go:117] "RemoveContainer" containerID="0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.825782 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b\": container with ID starting with 0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b not found: ID does not exist" containerID="0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825795 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b"} err="failed to get container status \"0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b\": rpc error: code = NotFound desc = could not find container \"0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b\": container with ID starting with 0f728bfa59652d6f4615e4af9e1a8b6b99eb84b346432758942ba14b7b180d5b not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825820 4707 scope.go:117] "RemoveContainer" containerID="8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.825980 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9\": container with ID starting with 8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9 not found: ID does not exist" containerID="8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.825993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9"} err="failed to get container status \"8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9\": rpc error: code = NotFound desc = could not find container \"8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9\": container with ID starting with 8cdcb85e231565c45b2d663086c89aeae374d00b52694ea1cea6d5a1f8cd73c9 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826005 4707 scope.go:117] "RemoveContainer" containerID="202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.826152 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015\": container with ID starting with 202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015 not found: ID does not exist" containerID="202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826166 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015"} err="failed to get container status \"202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015\": rpc error: code = NotFound desc = could not find container \"202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015\": container with ID starting with 202a35c604b6febf0e25818705090d76815837dbe4eb52548b372e95efe1b015 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826187 4707 scope.go:117] "RemoveContainer" containerID="a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.826555 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36\": container with ID starting with a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36 not found: ID does not exist" containerID="a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826574 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36"} err="failed to get container status \"a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36\": rpc error: code = NotFound desc = could not find container \"a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36\": container with ID starting with a35edc75ae2ec8486815725f0bfd916db32dacf51157446d72fc76bf65b31b36 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826587 4707 scope.go:117] "RemoveContainer" containerID="80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.826762 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709\": container with ID starting with 80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709 not found: ID does not exist" containerID="80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826775 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709"} err="failed to get container status \"80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709\": rpc error: code = NotFound desc = could not find container \"80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709\": container with ID starting with 80c68a7ad1ec52c4cd75c9aeb3dc180b9fc1e4bc7316b4efc90d8b7e0a8f5709 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826786 4707 scope.go:117] "RemoveContainer" containerID="5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672" Jan 21 15:40:12 crc 
kubenswrapper[4707]: E0121 15:40:12.826960 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672\": container with ID starting with 5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672 not found: ID does not exist" containerID="5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826974 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672"} err="failed to get container status \"5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672\": rpc error: code = NotFound desc = could not find container \"5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672\": container with ID starting with 5bc821d0b7404925f02b74e1660e129f977b50e2b330e1d2d19c864e879d2672 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.826984 4707 scope.go:117] "RemoveContainer" containerID="af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.827128 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1\": container with ID starting with af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1 not found: ID does not exist" containerID="af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.827141 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1"} err="failed to get container status \"af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1\": rpc error: code = NotFound desc = could not find container \"af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1\": container with ID starting with af1327a17f90fbbb09d38de94bbbf5cfd2a1bf11468b214335e6dd0e167f4af1 not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.827151 4707 scope.go:117] "RemoveContainer" containerID="0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.827313 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a\": container with ID starting with 0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a not found: ID does not exist" containerID="0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.827327 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a"} err="failed to get container status \"0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a\": rpc error: code = NotFound desc = could not find container \"0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a\": container with ID starting with 0a34b991fd36f5cb891644892391948ae8ab61b9dba8a593bf37f3d73fe9504a not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: 
I0121 15:40:12.827338 4707 scope.go:117] "RemoveContainer" containerID="9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.827966 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c\": container with ID starting with 9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c not found: ID does not exist" containerID="9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.827983 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c"} err="failed to get container status \"9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c\": rpc error: code = NotFound desc = could not find container \"9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c\": container with ID starting with 9325ca016b45421b87ffac4927130e8f72f6cbfffb90a972967651cebbc4b16c not found: ID does not exist" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.864832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8drr\" (UniqueName: \"kubernetes.io/projected/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-kube-api-access-f8drr\") pod \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.864929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-config-data\") pod \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.864969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-combined-ca-bundle\") pod \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\" (UID: \"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.868317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-kube-api-access-f8drr" (OuterVolumeSpecName: "kube-api-access-f8drr") pod "d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" (UID: "d6f7c9e9-a4dc-4d9e-80e4-b9564520e790"). InnerVolumeSpecName "kube-api-access-f8drr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.870747 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.882955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" (UID: "d6f7c9e9-a4dc-4d9e-80e4-b9564520e790"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.885916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-config-data" (OuterVolumeSpecName: "config-data") pod "d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" (UID: "d6f7c9e9-a4dc-4d9e-80e4-b9564520e790"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.966444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-combined-ca-bundle\") pod \"40c68303-00df-4442-9bfc-11c7c74a30c6\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.966485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25bqm\" (UniqueName: \"kubernetes.io/projected/40c68303-00df-4442-9bfc-11c7c74a30c6-kube-api-access-25bqm\") pod \"40c68303-00df-4442-9bfc-11c7c74a30c6\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.966567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data\") pod \"40c68303-00df-4442-9bfc-11c7c74a30c6\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.966933 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.966952 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8drr\" (UniqueName: \"kubernetes.io/projected/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-kube-api-access-f8drr\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.966961 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.968944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c68303-00df-4442-9bfc-11c7c74a30c6-kube-api-access-25bqm" (OuterVolumeSpecName: "kube-api-access-25bqm") pod "40c68303-00df-4442-9bfc-11c7c74a30c6" (UID: "40c68303-00df-4442-9bfc-11c7c74a30c6"). InnerVolumeSpecName "kube-api-access-25bqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:12 crc kubenswrapper[4707]: E0121 15:40:12.978766 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data podName:40c68303-00df-4442-9bfc-11c7c74a30c6 nodeName:}" failed. No retries permitted until 2026-01-21 15:40:13.478747382 +0000 UTC m=+2310.660263604 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data") pod "40c68303-00df-4442-9bfc-11c7c74a30c6" (UID: "40c68303-00df-4442-9bfc-11c7c74a30c6") : error deleting /var/lib/kubelet/pods/40c68303-00df-4442-9bfc-11c7c74a30c6/volume-subpaths: remove /var/lib/kubelet/pods/40c68303-00df-4442-9bfc-11c7c74a30c6/volume-subpaths: no such file or directory Jan 21 15:40:12 crc kubenswrapper[4707]: I0121 15:40:12.980697 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40c68303-00df-4442-9bfc-11c7c74a30c6" (UID: "40c68303-00df-4442-9bfc-11c7c74a30c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.011748 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.068487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2cdt\" (UniqueName: \"kubernetes.io/projected/91365236-99fe-4ee7-9aa1-8a414557cf97-kube-api-access-m2cdt\") pod \"91365236-99fe-4ee7-9aa1-8a414557cf97\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.068530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-combined-ca-bundle\") pod \"91365236-99fe-4ee7-9aa1-8a414557cf97\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.068663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data\") pod \"91365236-99fe-4ee7-9aa1-8a414557cf97\" (UID: \"91365236-99fe-4ee7-9aa1-8a414557cf97\") " Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.069189 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.069208 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25bqm\" (UniqueName: \"kubernetes.io/projected/40c68303-00df-4442-9bfc-11c7c74a30c6-kube-api-access-25bqm\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.070833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91365236-99fe-4ee7-9aa1-8a414557cf97-kube-api-access-m2cdt" (OuterVolumeSpecName: "kube-api-access-m2cdt") pod "91365236-99fe-4ee7-9aa1-8a414557cf97" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97"). InnerVolumeSpecName "kube-api-access-m2cdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.084074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91365236-99fe-4ee7-9aa1-8a414557cf97" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.085033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data" (OuterVolumeSpecName: "config-data") pod "91365236-99fe-4ee7-9aa1-8a414557cf97" (UID: "91365236-99fe-4ee7-9aa1-8a414557cf97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.170884 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.170916 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2cdt\" (UniqueName: \"kubernetes.io/projected/91365236-99fe-4ee7-9aa1-8a414557cf97-kube-api-access-m2cdt\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.170929 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91365236-99fe-4ee7-9aa1-8a414557cf97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.190483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e67f96-41b8-43d4-b048-546873358118" path="/var/lib/kubelet/pods/70e67f96-41b8-43d4-b048-546873358118/volumes" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.392373 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" exitCode=137 Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.392408 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.392438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790","Type":"ContainerDied","Data":"e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.392463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6f7c9e9-a4dc-4d9e-80e4-b9564520e790","Type":"ContainerDied","Data":"6cbbb25f79d2789525e5635ee65c57a8f5baca55ec4777b6364a7d60dbfd495e"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.392479 4707 scope.go:117] "RemoveContainer" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.395023 4707 generic.go:334] "Generic (PLEG): container finished" podID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" exitCode=137 Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.395109 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.395108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"40c68303-00df-4442-9bfc-11c7c74a30c6","Type":"ContainerDied","Data":"b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.395152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"40c68303-00df-4442-9bfc-11c7c74a30c6","Type":"ContainerDied","Data":"cb28fbbb085a56f4117ac92c67aaa47cab12c1e9ecaca8cdb7fdcd0703fa7e70"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.397555 4707 generic.go:334] "Generic (PLEG): container finished" podID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" exitCode=137 Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.397599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"91365236-99fe-4ee7-9aa1-8a414557cf97","Type":"ContainerDied","Data":"77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.397610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.397617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"91365236-99fe-4ee7-9aa1-8a414557cf97","Type":"ContainerDied","Data":"33d9408d8368000eedbb230fdeef2f4602ab97036c442f30f15765be35ebdb33"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.400360 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5f454b744-bh8qb" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.400394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b87a91a2-ba47-4dc0-9b58-0d2f00f5f829","Type":"ContainerDied","Data":"0ff4026afe627ace74a3cabe6ea6a25aabd52c5a58ef0944ac48699f06dd87d8"} Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.400429 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.411184 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.415760 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.419931 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.420371 4707 scope.go:117] "RemoveContainer" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" Jan 21 15:40:13 crc kubenswrapper[4707]: E0121 15:40:13.420849 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b\": container with ID starting with e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b not found: ID does not exist" containerID="e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.420883 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b"} err="failed to get container status \"e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b\": rpc error: code = NotFound desc = could not find container \"e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b\": container with ID starting with e693c0ad99becbf6ff2364e76a26bc2c76350ddb38436a3931f2dccabcb4254b not found: ID does not exist" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.420906 4707 scope.go:117] "RemoveContainer" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.424202 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.434270 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5f454b744-bh8qb"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.439388 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5f454b744-bh8qb"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.440449 4707 scope.go:117] "RemoveContainer" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" Jan 21 15:40:13 crc kubenswrapper[4707]: E0121 15:40:13.440730 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf\": container with ID starting with b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf not found: ID does not exist" containerID="b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.440764 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf"} err="failed to get container status \"b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf\": rpc error: code = NotFound desc = could not find container 
\"b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf\": container with ID starting with b3a1c197159dafbb65f3fc9ea2f42fb23e9a6191c5496fa373a7a0eb8cba35bf not found: ID does not exist" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.440784 4707 scope.go:117] "RemoveContainer" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.443669 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.447679 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.453885 4707 scope.go:117] "RemoveContainer" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" Jan 21 15:40:13 crc kubenswrapper[4707]: E0121 15:40:13.454185 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc\": container with ID starting with 77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc not found: ID does not exist" containerID="77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.454215 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc"} err="failed to get container status \"77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc\": rpc error: code = NotFound desc = could not find container \"77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc\": container with ID starting with 77e248c9a06916c27e8d2e9b4be2d10ee874fba5518a04a8b7258b11e867f3dc not found: ID does not exist" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.454238 4707 scope.go:117] "RemoveContainer" containerID="6276366157c1c72f36b10c6a0708b755cc3d4fe703dde761888194d9a7e0a9e3" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.467567 4707 scope.go:117] "RemoveContainer" containerID="86eaf4108dcfe70d29829d6618776e1e19fc4cf3438e7d0357073b14327e2707" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.576233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data\") pod \"40c68303-00df-4442-9bfc-11c7c74a30c6\" (UID: \"40c68303-00df-4442-9bfc-11c7c74a30c6\") " Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.578974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data" (OuterVolumeSpecName: "config-data") pod "40c68303-00df-4442-9bfc-11c7c74a30c6" (UID: "40c68303-00df-4442-9bfc-11c7c74a30c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.678305 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c68303-00df-4442-9bfc-11c7c74a30c6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.715344 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:40:13 crc kubenswrapper[4707]: I0121 15:40:13.720779 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:40:15 crc kubenswrapper[4707]: I0121 15:40:15.189683 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" path="/var/lib/kubelet/pods/40c68303-00df-4442-9bfc-11c7c74a30c6/volumes" Jan 21 15:40:15 crc kubenswrapper[4707]: I0121 15:40:15.190427 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6452a729-a782-4679-be7b-1c5240b66caa" path="/var/lib/kubelet/pods/6452a729-a782-4679-be7b-1c5240b66caa/volumes" Jan 21 15:40:15 crc kubenswrapper[4707]: I0121 15:40:15.190957 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" path="/var/lib/kubelet/pods/91365236-99fe-4ee7-9aa1-8a414557cf97/volumes" Jan 21 15:40:15 crc kubenswrapper[4707]: I0121 15:40:15.191824 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" path="/var/lib/kubelet/pods/b87a91a2-ba47-4dc0-9b58-0d2f00f5f829/volumes" Jan 21 15:40:15 crc kubenswrapper[4707]: I0121 15:40:15.192458 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" path="/var/lib/kubelet/pods/d6f7c9e9-a4dc-4d9e-80e4-b9564520e790/volumes" Jan 21 15:40:15 crc kubenswrapper[4707]: I0121 15:40:15.511887 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podce8faea1-a6d2-4be7-88ca-e92d10481925"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podce8faea1-a6d2-4be7-88ca-e92d10481925] : Timed out while waiting for systemd to remove kubepods-besteffort-podce8faea1_a6d2_4be7_88ca_e92d10481925.slice" Jan 21 15:40:15 crc kubenswrapper[4707]: E0121 15:40:15.511951 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podce8faea1-a6d2-4be7-88ca-e92d10481925] : unable to destroy cgroup paths for cgroup [kubepods besteffort podce8faea1-a6d2-4be7-88ca-e92d10481925] : Timed out while waiting for systemd to remove kubepods-besteffort-podce8faea1_a6d2_4be7_88ca_e92d10481925.slice" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" podUID="ce8faea1-a6d2-4be7-88ca-e92d10481925" Jan 21 15:40:16 crc kubenswrapper[4707]: I0121 15:40:16.421560 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88" Jan 21 15:40:16 crc kubenswrapper[4707]: I0121 15:40:16.449056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88"] Jan 21 15:40:16 crc kubenswrapper[4707]: I0121 15:40:16.454379 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-6d10-account-create-update-wnm88"] Jan 21 15:40:17 crc kubenswrapper[4707]: I0121 15:40:17.190832 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8faea1-a6d2-4be7-88ca-e92d10481925" path="/var/lib/kubelet/pods/ce8faea1-a6d2-4be7-88ca-e92d10481925/volumes" Jan 21 15:40:39 crc kubenswrapper[4707]: I0121 15:40:39.945308 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:40:39 crc kubenswrapper[4707]: I0121 15:40:39.945799 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:40:44 crc kubenswrapper[4707]: E0121 15:40:44.279655 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:44 crc kubenswrapper[4707]: E0121 15:40:44.279715 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:44 crc kubenswrapper[4707]: E0121 15:40:44.280561 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data podName:702acb12-2ee0-4008-bf50-0fda9fc79030 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:48.280534847 +0000 UTC m=+2405.462051070 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data") pod "rabbitmq-cell1-server-0" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:44 crc kubenswrapper[4707]: E0121 15:40:44.280610 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data podName:1977b33b-45bb-4b7d-b7f7-ebfd42572edb nodeName:}" failed. No retries permitted until 2026-01-21 15:41:48.280596393 +0000 UTC m=+2405.462112605 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data") pod "rabbitmq-cell1-server-2" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:40:45 crc kubenswrapper[4707]: E0121 15:40:45.553108 4707 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 21 15:40:45 crc kubenswrapper[4707]: command '/bin/bash -c if [ ! 
-z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-1011-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-629-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: > execCommand=["/bin/bash","-c","if [ ! -z \"$(cat /etc/pod-info/skipPreStopChecks)\" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 \u0026\u0026 rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true \u0026\u0026 rabbitmq-upgrade drain -t 604800"] containerName="rabbitmq" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" message=< Jan 21 15:40:45 crc kubenswrapper[4707]: Will put node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests into maintenance mode. The node will no longer serve any client traffic! Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-1011-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-629-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: > Jan 21 15:40:45 crc kubenswrapper[4707]: E0121 15:40:45.553397 4707 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 21 15:40:45 crc kubenswrapper[4707]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-1011-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-629-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: > pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" containerID="cri-o://e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e" Jan 21 15:40:45 crc kubenswrapper[4707]: I0121 15:40:45.553442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" containerID="cri-o://e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e" gracePeriod=604739 Jan 21 15:40:45 crc kubenswrapper[4707]: E0121 15:40:45.646897 4707 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 21 15:40:45 crc kubenswrapper[4707]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. 
due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-436-rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-413-rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: > execCommand=["/bin/bash","-c","if [ ! -z \"$(cat /etc/pod-info/skipPreStopChecks)\" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 \u0026\u0026 rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true \u0026\u0026 rabbitmq-upgrade drain -t 604800"] containerName="rabbitmq" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" message=< Jan 21 15:40:45 crc kubenswrapper[4707]: Will put node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests into maintenance mode. The node will no longer serve any client traffic! Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-436-rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-413-rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: > Jan 21 15:40:45 crc kubenswrapper[4707]: E0121 15:40:45.646955 4707 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 21 15:40:45 crc kubenswrapper[4707]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-436-rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'. Please see diagnostics information and suggestions below. Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Most common reasons for this are: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Jan 21 15:40:45 crc kubenswrapper[4707]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Jan 21 15:40:45 crc kubenswrapper[4707]: * Target node is not running Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: In addition to the diagnostics info below: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Jan 21 15:40:45 crc kubenswrapper[4707]: * Consult server logs on node rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests Jan 21 15:40:45 crc kubenswrapper[4707]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: DIAGNOSTICS Jan 21 15:40:45 crc kubenswrapper[4707]: =========== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: attempted to contact: ['rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests'] Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: Jan 21 15:40:45 crc kubenswrapper[4707]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests: nxdomain (non-existing domain) Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: Current node details: Jan 21 15:40:45 crc kubenswrapper[4707]: * node name: 'rabbitmqcli-413-rabbit@rabbitmq-cell1-server-2.rabbitmq-cell1-nodes.openstack-kuttl-tests' Jan 21 15:40:45 crc kubenswrapper[4707]: * effective user's home directory: /var/lib/rabbitmq Jan 21 15:40:45 crc kubenswrapper[4707]: * Erlang cookie hash: Aa8CAkdfQdwtWjCU6ywxtQ== Jan 21 15:40:45 crc kubenswrapper[4707]: Jan 21 15:40:45 crc kubenswrapper[4707]: > pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="rabbitmq" containerID="cri-o://22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311" Jan 21 15:40:45 crc kubenswrapper[4707]: I0121 15:40:45.647002 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="rabbitmq" containerID="cri-o://22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311" gracePeriod=604739 Jan 21 15:40:47 crc kubenswrapper[4707]: I0121 15:40:47.472896 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Jan 21 15:40:47 crc kubenswrapper[4707]: I0121 15:40:47.490761 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.112:5671: connect: connection refused" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.924051 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-plugins-conf\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82kxz\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-kube-api-access-82kxz\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/702acb12-2ee0-4008-bf50-0fda9fc79030-erlang-cookie-secret\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-plugins\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-confd\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981270 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-server-conf\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/702acb12-2ee0-4008-bf50-0fda9fc79030-pod-info\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-erlang-cookie\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" 
(UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.981387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-tls\") pod \"702acb12-2ee0-4008-bf50-0fda9fc79030\" (UID: \"702acb12-2ee0-4008-bf50-0fda9fc79030\") " Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.982006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.982026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.982082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.982188 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.982201 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.982210 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.986857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-kube-api-access-82kxz" (OuterVolumeSpecName: "kube-api-access-82kxz") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "kube-api-access-82kxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.986879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.987103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702acb12-2ee0-4008-bf50-0fda9fc79030-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.987626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "persistence") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.988086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/702acb12-2ee0-4008-bf50-0fda9fc79030-pod-info" (OuterVolumeSpecName: "pod-info") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.988190 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:40:51 crc kubenswrapper[4707]: I0121 15:40:51.998477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data" (OuterVolumeSpecName: "config-data") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.013017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-server-conf" (OuterVolumeSpecName: "server-conf") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.037129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "702acb12-2ee0-4008-bf50-0fda9fc79030" (UID: "702acb12-2ee0-4008-bf50-0fda9fc79030"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-tls\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnxrn\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-kube-api-access-dnxrn\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-plugins\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-plugins-conf\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-server-conf\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-confd\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-erlang-cookie\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-pod-info\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: 
\"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.083719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-erlang-cookie-secret\") pod \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\" (UID: \"1977b33b-45bb-4b7d-b7f7-ebfd42572edb\") " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084233 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084247 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82kxz\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-kube-api-access-82kxz\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084270 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/702acb12-2ee0-4008-bf50-0fda9fc79030-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084279 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084287 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/702acb12-2ee0-4008-bf50-0fda9fc79030-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084295 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/702acb12-2ee0-4008-bf50-0fda9fc79030-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084303 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/702acb12-2ee0-4008-bf50-0fda9fc79030-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084324 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.084439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.085871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-kube-api-access-dnxrn" (OuterVolumeSpecName: "kube-api-access-dnxrn") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "kube-api-access-dnxrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.085930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.086571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.086620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.087314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-pod-info" (OuterVolumeSpecName: "pod-info") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.097185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data" (OuterVolumeSpecName: "config-data") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.097191 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.108407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-server-conf" (OuterVolumeSpecName: "server-conf") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.130377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1977b33b-45bb-4b7d-b7f7-ebfd42572edb" (UID: "1977b33b-45bb-4b7d-b7f7-ebfd42572edb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185469 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185497 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185508 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185518 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185526 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185534 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185544 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185559 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185567 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 
15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185574 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185581 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.185589 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnxrn\" (UniqueName: \"kubernetes.io/projected/1977b33b-45bb-4b7d-b7f7-ebfd42572edb-kube-api-access-dnxrn\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.199280 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.287408 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.627838 4707 generic.go:334] "Generic (PLEG): container finished" podID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerID="e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e" exitCode=0 Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.627884 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.627893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"702acb12-2ee0-4008-bf50-0fda9fc79030","Type":"ContainerDied","Data":"e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e"} Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.627916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"702acb12-2ee0-4008-bf50-0fda9fc79030","Type":"ContainerDied","Data":"060754cc187f1818440fa24bac095975295b72d6a153f918438259376c29ef38"} Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.627932 4707 scope.go:117] "RemoveContainer" containerID="e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.629725 4707 generic.go:334] "Generic (PLEG): container finished" podID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerID="22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311" exitCode=0 Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.629767 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.629765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"1977b33b-45bb-4b7d-b7f7-ebfd42572edb","Type":"ContainerDied","Data":"22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311"} Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.629852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"1977b33b-45bb-4b7d-b7f7-ebfd42572edb","Type":"ContainerDied","Data":"65112d66bcb9da79b16ec5a960f301f80edce5306110f1700e939f092238ad92"} Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.644352 4707 scope.go:117] "RemoveContainer" containerID="26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.655700 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.660832 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.666290 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.670927 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.678670 4707 scope.go:117] "RemoveContainer" containerID="e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e" Jan 21 15:40:52 crc kubenswrapper[4707]: E0121 15:40:52.679066 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e\": container with ID starting with e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e not found: ID does not exist" containerID="e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.679100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e"} err="failed to get container status \"e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e\": rpc error: code = NotFound desc = could not find container \"e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e\": container with ID starting with e382e9e946d0eca8423a5b341b55bada404b55391d6cd84faff21ebb6c73a93e not found: ID does not exist" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.679121 4707 scope.go:117] "RemoveContainer" containerID="26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45" Jan 21 15:40:52 crc kubenswrapper[4707]: E0121 15:40:52.679556 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45\": container with ID starting with 26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45 not found: ID does not exist" containerID="26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.679580 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45"} err="failed to get container status \"26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45\": rpc error: code = NotFound desc = could not find container \"26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45\": container with ID starting with 26c22319e89b8cf2699d9d4384d65aa662385581aa4c5a6bcef84b60ce1ebb45 not found: ID does not exist" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.679596 4707 scope.go:117] "RemoveContainer" containerID="22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.692478 4707 scope.go:117] "RemoveContainer" containerID="158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.705864 4707 scope.go:117] "RemoveContainer" containerID="22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311" Jan 21 15:40:52 crc kubenswrapper[4707]: E0121 15:40:52.706210 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311\": container with ID starting with 22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311 not found: ID does not exist" containerID="22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.706316 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311"} err="failed to get container status \"22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311\": rpc error: code = NotFound desc = could not find container \"22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311\": container with ID starting with 22d4b7037f62e1f85fc85d7899fd0e4c6709a84dd97285ef32e18df7e5793311 not found: ID does not exist" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.706383 4707 scope.go:117] "RemoveContainer" containerID="158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816" Jan 21 15:40:52 crc kubenswrapper[4707]: E0121 15:40:52.706768 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816\": container with ID starting with 158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816 not found: ID does not exist" containerID="158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816" Jan 21 15:40:52 crc kubenswrapper[4707]: I0121 15:40:52.706831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816"} err="failed to get container status \"158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816\": rpc error: code = NotFound desc = could not find container \"158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816\": container with ID starting with 158046d1bdaab768f2b3f39d4bda6815c9ab0c6b2e4083a4ed0b6940c6f8b816 not found: ID does not exist" Jan 21 15:40:53 crc kubenswrapper[4707]: I0121 15:40:53.189582 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" 
path="/var/lib/kubelet/pods/1977b33b-45bb-4b7d-b7f7-ebfd42572edb/volumes" Jan 21 15:40:53 crc kubenswrapper[4707]: I0121 15:40:53.190762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" path="/var/lib/kubelet/pods/702acb12-2ee0-4008-bf50-0fda9fc79030/volumes" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.679592 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-j9stt"] Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.683882 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-j9stt"] Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-rgq8q"] Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779776 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779792 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779822 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779828 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779835 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779841 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779847 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerName="init" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779852 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerName="init" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779862 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-updater" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-updater" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779878 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="swift-recon-cron" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779883 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="swift-recon-cron" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779890 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779894 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker" 
Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779904 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerName="mariadb-account-create-update" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779923 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerName="mariadb-account-create-update" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779929 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779934 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779943 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779949 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-api" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779955 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779960 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779971 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779977 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779982 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.779987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.779993 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780001 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780007 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780016 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-server" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780021 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-server" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780029 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780034 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780041 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="cinder-scheduler" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780046 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="cinder-scheduler" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780054 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780066 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-metadata" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780071 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-metadata" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780077 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780081 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780087 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780092 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780101 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324409f5-db7f-41c2-9004-d9c4d10e1e0e" containerName="keystone-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780106 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="324409f5-db7f-41c2-9004-d9c4d10e1e0e" containerName="keystone-api" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780115 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780119 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780127 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780132 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780143 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780151 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780156 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-api" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780163 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780185 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780204 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780214 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="rsync" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780230 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="rsync" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780236 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780241 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780250 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="ovn-northd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780263 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="ovn-northd" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780271 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780276 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780282 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-server" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780287 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-server" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780296 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780302 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerName="mariadb-account-create-update" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780315 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerName="mariadb-account-create-update" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780322 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780335 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780340 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780353 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780359 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780364 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780376 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780383 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerName="mysql-bootstrap" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780396 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780401 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="setup-container" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780409 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780414 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780421 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780462 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780467 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780475 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="openstack-network-exporter" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780480 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="openstack-network-exporter" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780486 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780491 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-api" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780500 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e12f07-e051-4381-8419-5f3a1a09d6c5" containerName="memcached" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780504 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e12f07-e051-4381-8419-5f3a1a09d6c5" containerName="memcached" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780511 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780515 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780521 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-server" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780525 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-server" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780532 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="probe" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780537 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="probe" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780544 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-expirer" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780549 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-expirer" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-reaper" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780560 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-reaper" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780569 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780573 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerName="dnsmasq-dns" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780600 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerName="dnsmasq-dns" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780611 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780617 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780622 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker-log" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780629 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780633 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780641 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780645 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780653 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-updater" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780657 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-updater" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780664 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780669 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:40:58 crc kubenswrapper[4707]: E0121 15:40:58.780675 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780680 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780853 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780863 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerName="mariadb-account-create-update" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780870 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780883 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780889 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780894 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf120fd0-7d2d-4f6e-80d9-523c4432c9db" containerName="glance-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-server" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780906 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780915 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="324409f5-db7f-41c2-9004-d9c4d10e1e0e" containerName="keystone-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c68303-00df-4442-9bfc-11c7c74a30c6" containerName="nova-cell1-conductor-conductor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780928 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b560a7-6184-4897-af4f-7975ebd19fcb" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780934 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="702acb12-2ee0-4008-bf50-0fda9fc79030" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780942 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780947 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-auditor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780955 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780965 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="741daec5-08a7-4769-8b0b-4248c3c02d69" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c641d259-efc1-4897-94d0-bf9d429f3c5d" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780979 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c346ce8-cb7b-46a0-abda-84bc7a42f009" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780987 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6452a729-a782-4679-be7b-1c5240b66caa" containerName="neutron-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780993 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eec9e69-0f85-4cc7-8fad-020a027dedcc" containerName="glance-httpd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.780998 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="rsync" Jan 21 15:40:58 
crc kubenswrapper[4707]: I0121 15:40:58.781004 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781011 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2084ed-217a-462c-8219-6cfa022751d9" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781018 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781026 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3597cfd3-7c8f-41b7-9265-cfffa2bab5e9" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781032 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="224bc6d8-9ffa-473c-a8c9-a7fcd9a3380b" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781039 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-expirer" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781045 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91365236-99fe-4ee7-9aa1-8a414557cf97" containerName="nova-cell0-conductor-conductor" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781051 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-updater" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781057 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee55a6b-2348-4ab1-9bc1-0be2a8aafe4d" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="ovn-northd" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781069 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="40927271-bdaa-4872-a579-c100648860dd" containerName="barbican-worker-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781076 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cfe067-0df4-46ce-b674-eed01fb4ca91" containerName="galera" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781083 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-server" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781090 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="object-server" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781096 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-replicator" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781108 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781113 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781121 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e12f07-e051-4381-8419-5f3a1a09d6c5" containerName="memcached" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781128 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0633364-ad88-4b44-bc23-3e2548ffb3d3" containerName="barbican-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781133 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="probe" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781138 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="487d2c84-f1f6-4b88-9188-49e704aeb43a" containerName="cinder-api-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781144 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="swift-recon-cron" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781150 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1977b33b-45bb-4b7d-b7f7-ebfd42572edb" containerName="rabbitmq" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781157 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="account-reaper" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781165 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87a91a2-ba47-4dc0-9b58-0d2f00f5f829" containerName="cinder-scheduler" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781171 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e67f96-41b8-43d4-b048-546873358118" containerName="container-updater" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781177 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0fc739-8c30-46ab-aea8-1db8f2864340" containerName="mariadb-account-create-update" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781182 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e007cb2f-fee4-4c18-9baf-9e61af91254c" containerName="dnsmasq-dns" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781190 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-metadata" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781198 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="64087307-6402-47cd-8a4b-37c3d90eb61e" containerName="nova-api-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781204 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="886a48bf-ec30-418c-86d9-9d96b725aa59" containerName="placement-api" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781211 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e183977-7c22-4b6d-a465-132c8f85dbb0" containerName="openstack-network-exporter" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781217 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee2089-6404-41ee-b6ae-82d788ced5fc" containerName="nova-metadata-log" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781227 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f7c9e9-a4dc-4d9e-80e4-b9564520e790" containerName="nova-scheduler-scheduler" Jan 21 
15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.781701 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.783025 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.783033 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.783166 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.783393 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.785414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rgq8q"] Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.870638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16b8caec-ac4d-4d56-8585-059330c09fdd-crc-storage\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.870714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxp6k\" (UniqueName: \"kubernetes.io/projected/16b8caec-ac4d-4d56-8585-059330c09fdd-kube-api-access-pxp6k\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.870748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16b8caec-ac4d-4d56-8585-059330c09fdd-node-mnt\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.971960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxp6k\" (UniqueName: \"kubernetes.io/projected/16b8caec-ac4d-4d56-8585-059330c09fdd-kube-api-access-pxp6k\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.971996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16b8caec-ac4d-4d56-8585-059330c09fdd-node-mnt\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.972066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16b8caec-ac4d-4d56-8585-059330c09fdd-crc-storage\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.972370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16b8caec-ac4d-4d56-8585-059330c09fdd-node-mnt\") 
pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.972712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16b8caec-ac4d-4d56-8585-059330c09fdd-crc-storage\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:58 crc kubenswrapper[4707]: I0121 15:40:58.986998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxp6k\" (UniqueName: \"kubernetes.io/projected/16b8caec-ac4d-4d56-8585-059330c09fdd-kube-api-access-pxp6k\") pod \"crc-storage-crc-rgq8q\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:59 crc kubenswrapper[4707]: I0121 15:40:59.096697 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:40:59 crc kubenswrapper[4707]: I0121 15:40:59.190778 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10" path="/var/lib/kubelet/pods/2b3e3f92-aaf3-4e38-a737-9f7d1abd5c10/volumes" Jan 21 15:40:59 crc kubenswrapper[4707]: I0121 15:40:59.441379 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rgq8q"] Jan 21 15:40:59 crc kubenswrapper[4707]: I0121 15:40:59.671552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rgq8q" event={"ID":"16b8caec-ac4d-4d56-8585-059330c09fdd","Type":"ContainerStarted","Data":"0601f780c2c30c62363a649d44c056e583a141e78ab9bc74d61715da459f9fa5"} Jan 21 15:41:00 crc kubenswrapper[4707]: I0121 15:41:00.686438 4707 generic.go:334] "Generic (PLEG): container finished" podID="16b8caec-ac4d-4d56-8585-059330c09fdd" containerID="67426142fb520cd6c7a41ab524ba5e9fdc22b2e4ccf3ef543c4400ce59fc6e49" exitCode=0 Jan 21 15:41:00 crc kubenswrapper[4707]: I0121 15:41:00.686484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rgq8q" event={"ID":"16b8caec-ac4d-4d56-8585-059330c09fdd","Type":"ContainerDied","Data":"67426142fb520cd6c7a41ab524ba5e9fdc22b2e4ccf3ef543c4400ce59fc6e49"} Jan 21 15:41:01 crc kubenswrapper[4707]: I0121 15:41:01.890960 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.011106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16b8caec-ac4d-4d56-8585-059330c09fdd-crc-storage\") pod \"16b8caec-ac4d-4d56-8585-059330c09fdd\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.011185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxp6k\" (UniqueName: \"kubernetes.io/projected/16b8caec-ac4d-4d56-8585-059330c09fdd-kube-api-access-pxp6k\") pod \"16b8caec-ac4d-4d56-8585-059330c09fdd\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.011274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16b8caec-ac4d-4d56-8585-059330c09fdd-node-mnt\") pod \"16b8caec-ac4d-4d56-8585-059330c09fdd\" (UID: \"16b8caec-ac4d-4d56-8585-059330c09fdd\") " Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.011439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16b8caec-ac4d-4d56-8585-059330c09fdd-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "16b8caec-ac4d-4d56-8585-059330c09fdd" (UID: "16b8caec-ac4d-4d56-8585-059330c09fdd"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.011699 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/16b8caec-ac4d-4d56-8585-059330c09fdd-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.015900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b8caec-ac4d-4d56-8585-059330c09fdd-kube-api-access-pxp6k" (OuterVolumeSpecName: "kube-api-access-pxp6k") pod "16b8caec-ac4d-4d56-8585-059330c09fdd" (UID: "16b8caec-ac4d-4d56-8585-059330c09fdd"). InnerVolumeSpecName "kube-api-access-pxp6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.025966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b8caec-ac4d-4d56-8585-059330c09fdd-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "16b8caec-ac4d-4d56-8585-059330c09fdd" (UID: "16b8caec-ac4d-4d56-8585-059330c09fdd"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.113328 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/16b8caec-ac4d-4d56-8585-059330c09fdd-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.113358 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxp6k\" (UniqueName: \"kubernetes.io/projected/16b8caec-ac4d-4d56-8585-059330c09fdd-kube-api-access-pxp6k\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.699212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rgq8q" event={"ID":"16b8caec-ac4d-4d56-8585-059330c09fdd","Type":"ContainerDied","Data":"0601f780c2c30c62363a649d44c056e583a141e78ab9bc74d61715da459f9fa5"} Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.699243 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0601f780c2c30c62363a649d44c056e583a141e78ab9bc74d61715da459f9fa5" Jan 21 15:41:02 crc kubenswrapper[4707]: I0121 15:41:02.699262 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rgq8q" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.767627 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-rgq8q"] Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.771820 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-rgq8q"] Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.871951 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tm9xs"] Jan 21 15:41:04 crc kubenswrapper[4707]: E0121 15:41:04.872205 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b8caec-ac4d-4d56-8585-059330c09fdd" containerName="storage" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.872220 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b8caec-ac4d-4d56-8585-059330c09fdd" containerName="storage" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.872372 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b8caec-ac4d-4d56-8585-059330c09fdd" containerName="storage" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.872775 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.875624 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.875938 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.879390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tm9xs"] Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.881119 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.882953 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.946180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6f049514-c7cc-473c-a107-3a823dc670ba-crc-storage\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.946216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6f049514-c7cc-473c-a107-3a823dc670ba-node-mnt\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:04 crc kubenswrapper[4707]: I0121 15:41:04.946246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fp9s\" (UniqueName: \"kubernetes.io/projected/6f049514-c7cc-473c-a107-3a823dc670ba-kube-api-access-7fp9s\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.046791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6f049514-c7cc-473c-a107-3a823dc670ba-crc-storage\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.047183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6f049514-c7cc-473c-a107-3a823dc670ba-node-mnt\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.047213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fp9s\" (UniqueName: \"kubernetes.io/projected/6f049514-c7cc-473c-a107-3a823dc670ba-kube-api-access-7fp9s\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.047398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6f049514-c7cc-473c-a107-3a823dc670ba-crc-storage\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " 
pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.047454 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6f049514-c7cc-473c-a107-3a823dc670ba-node-mnt\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.062328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fp9s\" (UniqueName: \"kubernetes.io/projected/6f049514-c7cc-473c-a107-3a823dc670ba-kube-api-access-7fp9s\") pod \"crc-storage-crc-tm9xs\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.187691 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.190974 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b8caec-ac4d-4d56-8585-059330c09fdd" path="/var/lib/kubelet/pods/16b8caec-ac4d-4d56-8585-059330c09fdd/volumes" Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.529058 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tm9xs"] Jan 21 15:41:05 crc kubenswrapper[4707]: I0121 15:41:05.716320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tm9xs" event={"ID":"6f049514-c7cc-473c-a107-3a823dc670ba","Type":"ContainerStarted","Data":"75c5f657b22881519ea5b1ccaedb948537ca89dc716a3cfaf0eed894de67e5b4"} Jan 21 15:41:06 crc kubenswrapper[4707]: I0121 15:41:06.723843 4707 generic.go:334] "Generic (PLEG): container finished" podID="6f049514-c7cc-473c-a107-3a823dc670ba" containerID="a256d0c99dcf11f5da546ffd93042d18773ef99926f52b461c40448890ad7505" exitCode=0 Jan 21 15:41:06 crc kubenswrapper[4707]: I0121 15:41:06.723930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tm9xs" event={"ID":"6f049514-c7cc-473c-a107-3a823dc670ba","Type":"ContainerDied","Data":"a256d0c99dcf11f5da546ffd93042d18773ef99926f52b461c40448890ad7505"} Jan 21 15:41:07 crc kubenswrapper[4707]: I0121 15:41:07.938544 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.085562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6f049514-c7cc-473c-a107-3a823dc670ba-crc-storage\") pod \"6f049514-c7cc-473c-a107-3a823dc670ba\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.085633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fp9s\" (UniqueName: \"kubernetes.io/projected/6f049514-c7cc-473c-a107-3a823dc670ba-kube-api-access-7fp9s\") pod \"6f049514-c7cc-473c-a107-3a823dc670ba\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.085663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6f049514-c7cc-473c-a107-3a823dc670ba-node-mnt\") pod \"6f049514-c7cc-473c-a107-3a823dc670ba\" (UID: \"6f049514-c7cc-473c-a107-3a823dc670ba\") " Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.086035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f049514-c7cc-473c-a107-3a823dc670ba-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6f049514-c7cc-473c-a107-3a823dc670ba" (UID: "6f049514-c7cc-473c-a107-3a823dc670ba"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.090332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f049514-c7cc-473c-a107-3a823dc670ba-kube-api-access-7fp9s" (OuterVolumeSpecName: "kube-api-access-7fp9s") pod "6f049514-c7cc-473c-a107-3a823dc670ba" (UID: "6f049514-c7cc-473c-a107-3a823dc670ba"). InnerVolumeSpecName "kube-api-access-7fp9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.100332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f049514-c7cc-473c-a107-3a823dc670ba-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6f049514-c7cc-473c-a107-3a823dc670ba" (UID: "6f049514-c7cc-473c-a107-3a823dc670ba"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.186768 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6f049514-c7cc-473c-a107-3a823dc670ba-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.186789 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6f049514-c7cc-473c-a107-3a823dc670ba-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.186798 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fp9s\" (UniqueName: \"kubernetes.io/projected/6f049514-c7cc-473c-a107-3a823dc670ba-kube-api-access-7fp9s\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.735123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tm9xs" event={"ID":"6f049514-c7cc-473c-a107-3a823dc670ba","Type":"ContainerDied","Data":"75c5f657b22881519ea5b1ccaedb948537ca89dc716a3cfaf0eed894de67e5b4"} Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.735366 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75c5f657b22881519ea5b1ccaedb948537ca89dc716a3cfaf0eed894de67e5b4" Jan 21 15:41:08 crc kubenswrapper[4707]: I0121 15:41:08.735178 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tm9xs" Jan 21 15:41:09 crc kubenswrapper[4707]: I0121 15:41:09.945701 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:41:09 crc kubenswrapper[4707]: I0121 15:41:09.946087 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.603125 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:41:19 crc kubenswrapper[4707]: E0121 15:41:19.603905 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f049514-c7cc-473c-a107-3a823dc670ba" containerName="storage" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.603917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f049514-c7cc-473c-a107-3a823dc670ba" containerName="storage" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.604062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f049514-c7cc-473c-a107-3a823dc670ba" containerName="storage" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.604740 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.606639 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.608040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.608109 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.609363 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.609376 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-cs298" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.609393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.609410 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.612978 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6f92106-4e6f-4fdd-b646-322ee996dc80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725940 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6f92106-4e6f-4fdd-b646-322ee996dc80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-config-data\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.725995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.726034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.726051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2nr4\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-kube-api-access-r2nr4\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.726073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.820327 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.821388 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.824048 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.825567 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.825842 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.825879 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.825842 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.826062 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.826121 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-rq4c9" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.826909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.826939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6f92106-4e6f-4fdd-b646-322ee996dc80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.826979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6f92106-4e6f-4fdd-b646-322ee996dc80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-config-data\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2nr4\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-kube-api-access-r2nr4\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.827216 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.828162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.828680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-config-data\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.828696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 
15:41:19.832449 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.833780 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6f92106-4e6f-4fdd-b646-322ee996dc80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.834377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6f92106-4e6f-4fdd-b646-322ee996dc80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.834947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.835657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.840877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.848252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.848950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2nr4\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-kube-api-access-r2nr4\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.866514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.918840 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de4cffce-959d-47e8-9217-f842b2ab8f65-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jm9\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-kube-api-access-j7jm9\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de4cffce-959d-47e8-9217-f842b2ab8f65-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:19 crc kubenswrapper[4707]: I0121 15:41:19.928473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.029774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de4cffce-959d-47e8-9217-f842b2ab8f65-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de4cffce-959d-47e8-9217-f842b2ab8f65-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jm9\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-kube-api-access-j7jm9\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.030858 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.031223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.031384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.032556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.034414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.035072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de4cffce-959d-47e8-9217-f842b2ab8f65-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.035523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.041725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de4cffce-959d-47e8-9217-f842b2ab8f65-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.043127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jm9\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-kube-api-access-j7jm9\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.045331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.195762 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.271703 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.565204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.806091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"de4cffce-959d-47e8-9217-f842b2ab8f65","Type":"ContainerStarted","Data":"ca6d53cc93afc70dd89f52af1dc376e5b807b3336d8d9081ba5ffce7e65c90fd"} Jan 21 15:41:20 crc kubenswrapper[4707]: I0121 15:41:20.807546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e6f92106-4e6f-4fdd-b646-322ee996dc80","Type":"ContainerStarted","Data":"ba41cf0494580131641542900654bdffb24a6379473ee3a827e893dd0f1fb1a9"} Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.608230 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.609927 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.611519 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-w7frc" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.611956 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.612823 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.613012 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.618703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.619364 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vch9b\" (UniqueName: \"kubernetes.io/projected/89b706e2-54e8-4a85-b263-cbccec9d93bd-kube-api-access-vch9b\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.649770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vch9b\" (UniqueName: \"kubernetes.io/projected/89b706e2-54e8-4a85-b263-cbccec9d93bd-kube-api-access-vch9b\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.751626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.752020 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.752411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.752679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.752980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.753065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.759693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.759741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.765220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vch9b\" (UniqueName: \"kubernetes.io/projected/89b706e2-54e8-4a85-b263-cbccec9d93bd-kube-api-access-vch9b\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.767448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.815475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e6f92106-4e6f-4fdd-b646-322ee996dc80","Type":"ContainerStarted","Data":"0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0"} Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.816980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"de4cffce-959d-47e8-9217-f842b2ab8f65","Type":"ContainerStarted","Data":"91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30"} Jan 21 15:41:21 crc kubenswrapper[4707]: I0121 15:41:21.928336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:22 crc kubenswrapper[4707]: I0121 15:41:22.306222 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:41:22 crc kubenswrapper[4707]: I0121 15:41:22.823501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"89b706e2-54e8-4a85-b263-cbccec9d93bd","Type":"ContainerStarted","Data":"95551b395d0de6386810a37afb11f5ca53b7a8c1ccf730edcd06de3010c53ce6"} Jan 21 15:41:22 crc kubenswrapper[4707]: I0121 15:41:22.823748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"89b706e2-54e8-4a85-b263-cbccec9d93bd","Type":"ContainerStarted","Data":"aa98719951bf2b6bf29297053f29bfa5df0272461f17db406e43b3c2d0522dc7"} Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.038962 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.040147 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.043057 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.043546 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.043972 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.044559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-pbbx2" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.046834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2452s\" (UniqueName: \"kubernetes.io/projected/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kube-api-access-2452s\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068700 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.068719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2452s\" (UniqueName: \"kubernetes.io/projected/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kube-api-access-2452s\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 
15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.169598 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.170307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.170439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.170752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.171247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.174322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.174705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.183361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2452s\" (UniqueName: \"kubernetes.io/projected/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kube-api-access-2452s\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc 
kubenswrapper[4707]: I0121 15:41:23.184063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.354510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.369480 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.370276 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.371924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-mmw7f" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.371949 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.371924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.384034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.474495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jdp4\" (UniqueName: \"kubernetes.io/projected/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kube-api-access-8jdp4\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.474772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kolla-config\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.474928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.474947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.474964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-config-data\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc 
kubenswrapper[4707]: I0121 15:41:23.576388 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kolla-config\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.576506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.576528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.576546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-config-data\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.576644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jdp4\" (UniqueName: \"kubernetes.io/projected/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kube-api-access-8jdp4\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.577185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kolla-config\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.577335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-config-data\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.581107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.581170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.588791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jdp4\" (UniqueName: \"kubernetes.io/projected/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kube-api-access-8jdp4\") pod \"memcached-0\" 
(UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.735465 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.748597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:41:23 crc kubenswrapper[4707]: W0121 15:41:23.753022 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7280671_6e1f_48c9_a9c8_ca1f7453f07c.slice/crio-85fa3a261dfa556fa3e788b3224aba8d6953f21c5112386ef2616abd7a8edbec WatchSource:0}: Error finding container 85fa3a261dfa556fa3e788b3224aba8d6953f21c5112386ef2616abd7a8edbec: Status 404 returned error can't find the container with id 85fa3a261dfa556fa3e788b3224aba8d6953f21c5112386ef2616abd7a8edbec Jan 21 15:41:23 crc kubenswrapper[4707]: I0121 15:41:23.846000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a7280671-6e1f-48c9-a9c8-ca1f7453f07c","Type":"ContainerStarted","Data":"85fa3a261dfa556fa3e788b3224aba8d6953f21c5112386ef2616abd7a8edbec"} Jan 21 15:41:24 crc kubenswrapper[4707]: I0121 15:41:24.099949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:41:24 crc kubenswrapper[4707]: I0121 15:41:24.853705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a7280671-6e1f-48c9-a9c8-ca1f7453f07c","Type":"ContainerStarted","Data":"958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad"} Jan 21 15:41:24 crc kubenswrapper[4707]: I0121 15:41:24.854962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2","Type":"ContainerStarted","Data":"0845927e911c37dcf9655a14e1f3c10d1152022715b30329ae47d09eb7ba4612"} Jan 21 15:41:24 crc kubenswrapper[4707]: I0121 15:41:24.854994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2","Type":"ContainerStarted","Data":"a5f183bebe0f7da4d0ddf4fa40803191f17d6f310bfb6a29a2703ac6f2d8bbd3"} Jan 21 15:41:24 crc kubenswrapper[4707]: I0121 15:41:24.855164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:24 crc kubenswrapper[4707]: I0121 15:41:24.905339 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.905325457 podStartE2EDuration="1.905325457s" podCreationTimestamp="2026-01-21 15:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:24.899357998 +0000 UTC m=+2382.080874220" watchObservedRunningTime="2026-01-21 15:41:24.905325457 +0000 UTC m=+2382.086841679" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.141833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.142648 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.148059 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-lclkb" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.155363 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.300096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k6sh\" (UniqueName: \"kubernetes.io/projected/d8988513-d916-4f32-aa5e-51ed5a40d697-kube-api-access-8k6sh\") pod \"kube-state-metrics-0\" (UID: \"d8988513-d916-4f32-aa5e-51ed5a40d697\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.401572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k6sh\" (UniqueName: \"kubernetes.io/projected/d8988513-d916-4f32-aa5e-51ed5a40d697-kube-api-access-8k6sh\") pod \"kube-state-metrics-0\" (UID: \"d8988513-d916-4f32-aa5e-51ed5a40d697\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.416950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k6sh\" (UniqueName: \"kubernetes.io/projected/d8988513-d916-4f32-aa5e-51ed5a40d697-kube-api-access-8k6sh\") pod \"kube-state-metrics-0\" (UID: \"d8988513-d916-4f32-aa5e-51ed5a40d697\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.462488 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.815941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:41:25 crc kubenswrapper[4707]: W0121 15:41:25.816876 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8988513_d916_4f32_aa5e_51ed5a40d697.slice/crio-14d565ab8e4424fe4f8e7e75ce55eceb9b6bc6310430705cb4ece547a6936fd6 WatchSource:0}: Error finding container 14d565ab8e4424fe4f8e7e75ce55eceb9b6bc6310430705cb4ece547a6936fd6: Status 404 returned error can't find the container with id 14d565ab8e4424fe4f8e7e75ce55eceb9b6bc6310430705cb4ece547a6936fd6 Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.862105 4707 generic.go:334] "Generic (PLEG): container finished" podID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerID="95551b395d0de6386810a37afb11f5ca53b7a8c1ccf730edcd06de3010c53ce6" exitCode=0 Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.862175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"89b706e2-54e8-4a85-b263-cbccec9d93bd","Type":"ContainerDied","Data":"95551b395d0de6386810a37afb11f5ca53b7a8c1ccf730edcd06de3010c53ce6"} Jan 21 15:41:25 crc kubenswrapper[4707]: I0121 15:41:25.865535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d8988513-d916-4f32-aa5e-51ed5a40d697","Type":"ContainerStarted","Data":"14d565ab8e4424fe4f8e7e75ce55eceb9b6bc6310430705cb4ece547a6936fd6"} Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.872345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"89b706e2-54e8-4a85-b263-cbccec9d93bd","Type":"ContainerStarted","Data":"6ab4155a7c3881ee300c8f40a21c7af6d838cd85e5003eb8d0024abf5e2ad845"} Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.874519 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerID="958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad" exitCode=0 Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.874583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a7280671-6e1f-48c9-a9c8-ca1f7453f07c","Type":"ContainerDied","Data":"958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad"} Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.876280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d8988513-d916-4f32-aa5e-51ed5a40d697","Type":"ContainerStarted","Data":"63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427"} Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.876341 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.900175 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.900158997 podStartE2EDuration="6.900158997s" podCreationTimestamp="2026-01-21 15:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:26.890984789 +0000 UTC m=+2384.072501011" watchObservedRunningTime="2026-01-21 15:41:26.900158997 +0000 UTC m=+2384.081675210" Jan 21 15:41:26 crc kubenswrapper[4707]: I0121 15:41:26.909884 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.637261684 podStartE2EDuration="1.909871698s" podCreationTimestamp="2026-01-21 15:41:25 +0000 UTC" firstStartedPulling="2026-01-21 15:41:25.81868143 +0000 UTC m=+2383.000197652" lastFinishedPulling="2026-01-21 15:41:26.091291444 +0000 UTC m=+2383.272807666" observedRunningTime="2026-01-21 15:41:26.9059952 +0000 UTC m=+2384.087511422" watchObservedRunningTime="2026-01-21 15:41:26.909871698 +0000 UTC m=+2384.091387921" Jan 21 15:41:27 crc kubenswrapper[4707]: I0121 15:41:27.884113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a7280671-6e1f-48c9-a9c8-ca1f7453f07c","Type":"ContainerStarted","Data":"fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448"} Jan 21 15:41:27 crc kubenswrapper[4707]: I0121 15:41:27.903360 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=5.9033472830000004 podStartE2EDuration="5.903347283s" podCreationTimestamp="2026-01-21 15:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:27.900878041 +0000 UTC m=+2385.082394253" watchObservedRunningTime="2026-01-21 15:41:27.903347283 +0000 UTC m=+2385.084863505" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.863357 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 
15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.866027 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.868284 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.868478 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.872343 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.872985 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-5mnrs" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.873185 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.873360 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.965779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.965904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.965947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ncsr\" (UniqueName: \"kubernetes.io/projected/a16b399e-f6ad-455c-b786-fc08d088f7dc-kube-api-access-5ncsr\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.965988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-config\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.966160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.966233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.966301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:29 crc kubenswrapper[4707]: I0121 15:41:29.966330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ncsr\" (UniqueName: \"kubernetes.io/projected/a16b399e-f6ad-455c-b786-fc08d088f7dc-kube-api-access-5ncsr\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-config\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc 
kubenswrapper[4707]: I0121 15:41:30.067658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.067930 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.068610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.069005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-config\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.072835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.072879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.072947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.080026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ncsr\" (UniqueName: \"kubernetes.io/projected/a16b399e-f6ad-455c-b786-fc08d088f7dc-kube-api-access-5ncsr\") pod \"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.083100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.190185 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:30 crc kubenswrapper[4707]: W0121 15:41:30.545964 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda16b399e_f6ad_455c_b786_fc08d088f7dc.slice/crio-d8544757a9b2d8e3ca9fff73bef4830f94514ff29fb9b253f5bf7bc9ef70c4ed WatchSource:0}: Error finding container d8544757a9b2d8e3ca9fff73bef4830f94514ff29fb9b253f5bf7bc9ef70c4ed: Status 404 returned error can't find the container with id d8544757a9b2d8e3ca9fff73bef4830f94514ff29fb9b253f5bf7bc9ef70c4ed Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.546651 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.900875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a16b399e-f6ad-455c-b786-fc08d088f7dc","Type":"ContainerStarted","Data":"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a"} Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.900932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a16b399e-f6ad-455c-b786-fc08d088f7dc","Type":"ContainerStarted","Data":"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439"} Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.900944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a16b399e-f6ad-455c-b786-fc08d088f7dc","Type":"ContainerStarted","Data":"d8544757a9b2d8e3ca9fff73bef4830f94514ff29fb9b253f5bf7bc9ef70c4ed"} Jan 21 15:41:30 crc kubenswrapper[4707]: I0121 15:41:30.921690 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.921677631 podStartE2EDuration="2.921677631s" podCreationTimestamp="2026-01-21 15:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:30.912456505 +0000 UTC m=+2388.093972727" watchObservedRunningTime="2026-01-21 15:41:30.921677631 +0000 UTC m=+2388.103193853" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.669762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.670823 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.672955 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-wl7lk" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.673058 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.675095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.675098 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.677212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.791533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnwcv\" (UniqueName: \"kubernetes.io/projected/7d1ed8c7-69eb-461f-9984-719171cafcb2-kube-api-access-dnwcv\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnwcv\" (UniqueName: \"kubernetes.io/projected/7d1ed8c7-69eb-461f-9984-719171cafcb2-kube-api-access-dnwcv\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.892978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.893236 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.894069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.894212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.895510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.897249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.897419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.899142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.908002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnwcv\" (UniqueName: \"kubernetes.io/projected/7d1ed8c7-69eb-461f-9984-719171cafcb2-kube-api-access-dnwcv\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.909484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.928843 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.929146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:31 crc kubenswrapper[4707]: I0121 15:41:31.983234 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:32 crc kubenswrapper[4707]: I0121 15:41:32.334652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:41:32 crc kubenswrapper[4707]: W0121 15:41:32.337799 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1ed8c7_69eb_461f_9984_719171cafcb2.slice/crio-d2e8aa83b3a291c1d03d1eee0517b885b88329c584f84cb80a2f282b4aae999f WatchSource:0}: Error finding container d2e8aa83b3a291c1d03d1eee0517b885b88329c584f84cb80a2f282b4aae999f: Status 404 returned error can't find the container with id d2e8aa83b3a291c1d03d1eee0517b885b88329c584f84cb80a2f282b4aae999f Jan 21 15:41:32 crc kubenswrapper[4707]: I0121 15:41:32.914624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"7d1ed8c7-69eb-461f-9984-719171cafcb2","Type":"ContainerStarted","Data":"80c16aa36e45c30a214eb7a6a5d35d4783766903f11910f232a3d249cc3e22cd"} Jan 21 15:41:32 crc kubenswrapper[4707]: I0121 15:41:32.914923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"7d1ed8c7-69eb-461f-9984-719171cafcb2","Type":"ContainerStarted","Data":"945a84fae9f8753ed68a8b62e2772b440ee13e8fb93bc4182ad87216030f7bba"} Jan 21 15:41:32 crc kubenswrapper[4707]: I0121 15:41:32.914934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"7d1ed8c7-69eb-461f-9984-719171cafcb2","Type":"ContainerStarted","Data":"d2e8aa83b3a291c1d03d1eee0517b885b88329c584f84cb80a2f282b4aae999f"} Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.191244 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.218243 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.231448 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.231433968 podStartE2EDuration="3.231433968s" podCreationTimestamp="2026-01-21 15:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:32.929869016 +0000 UTC m=+2390.111385238" watchObservedRunningTime="2026-01-21 15:41:33.231433968 +0000 UTC m=+2390.412950190" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.354765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.354957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.404750 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.736168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.921241 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:33 crc kubenswrapper[4707]: I0121 15:41:33.967722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:41:34 crc kubenswrapper[4707]: I0121 15:41:34.147721 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:34 crc kubenswrapper[4707]: I0121 15:41:34.200640 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:41:34 crc kubenswrapper[4707]: I0121 15:41:34.984147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:35 crc kubenswrapper[4707]: I0121 15:41:35.216361 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:41:35 crc kubenswrapper[4707]: I0121 15:41:35.465825 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.512373 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.516304 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.518108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-sdg2x" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.518136 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.518147 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.518269 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.527491 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.659649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-cache\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.659879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.659919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.660013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-lock\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.660119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nww2v\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-kube-api-access-nww2v\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.761991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " 
pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-lock\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nww2v\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-kube-api-access-nww2v\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-cache\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: E0121 15:41:36.762245 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:41:36 crc kubenswrapper[4707]: E0121 15:41:36.762283 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762317 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: E0121 15:41:36.762340 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift podName:fb2783e9-b8b1-4899-9a36-af57a12d56ba nodeName:}" failed. No retries permitted until 2026-01-21 15:41:37.262324222 +0000 UTC m=+2394.443840444 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift") pod "swift-storage-0" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba") : configmap "swift-ring-files" not found Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-lock\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.762617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-cache\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.779168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nww2v\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-kube-api-access-nww2v\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.780276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:36 crc kubenswrapper[4707]: I0121 15:41:36.983789 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:37 crc kubenswrapper[4707]: I0121 15:41:37.270150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:37 crc kubenswrapper[4707]: E0121 15:41:37.270320 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:41:37 crc kubenswrapper[4707]: E0121 15:41:37.270337 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:41:37 crc kubenswrapper[4707]: E0121 15:41:37.270384 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift podName:fb2783e9-b8b1-4899-9a36-af57a12d56ba nodeName:}" failed. No retries permitted until 2026-01-21 15:41:38.270370851 +0000 UTC m=+2395.451887073 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift") pod "swift-storage-0" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba") : configmap "swift-ring-files" not found Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.010440 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.036926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.167523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.168659 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.172996 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.173011 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.173227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.173308 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-zm2wj" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.175096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.285430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.285799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-config\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.285858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.285894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.285936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgzk\" 
(UniqueName: \"kubernetes.io/projected/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-kube-api-access-5sgzk\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: E0121 15:41:38.286047 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:41:38 crc kubenswrapper[4707]: E0121 15:41:38.286066 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:41:38 crc kubenswrapper[4707]: E0121 15:41:38.286108 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift podName:fb2783e9-b8b1-4899-9a36-af57a12d56ba nodeName:}" failed. No retries permitted until 2026-01-21 15:41:40.28609288 +0000 UTC m=+2397.467609103 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift") pod "swift-storage-0" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba") : configmap "swift-ring-files" not found Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.286129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-scripts\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.286171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.286318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.387584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.387656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgzk\" (UniqueName: \"kubernetes.io/projected/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-kube-api-access-5sgzk\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.387707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-scripts\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc 
kubenswrapper[4707]: I0121 15:41:38.387728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.387772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.387796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.387826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-config\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.388510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.388529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-config\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.388532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-scripts\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.392473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.392973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.393388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.401473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgzk\" (UniqueName: \"kubernetes.io/projected/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-kube-api-access-5sgzk\") pod \"ovn-northd-0\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.490534 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.848548 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:41:38 crc kubenswrapper[4707]: W0121 15:41:38.850844 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b1fde7b_722e_48c8_aefa_7baa7c4ebf81.slice/crio-6cf3f1ee49384e42b5fcc6e4eaa6fbac183ea7886485676916b84f1d1e3639c4 WatchSource:0}: Error finding container 6cf3f1ee49384e42b5fcc6e4eaa6fbac183ea7886485676916b84f1d1e3639c4: Status 404 returned error can't find the container with id 6cf3f1ee49384e42b5fcc6e4eaa6fbac183ea7886485676916b84f1d1e3639c4 Jan 21 15:41:38 crc kubenswrapper[4707]: I0121 15:41:38.952342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81","Type":"ContainerStarted","Data":"6cf3f1ee49384e42b5fcc6e4eaa6fbac183ea7886485676916b84f1d1e3639c4"} Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.945797 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.946850 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.946972 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.947555 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.947679 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" gracePeriod=600 Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.959847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81","Type":"ContainerStarted","Data":"63a8221ce80bea60ab8157bbbd92a930360d931306dcb9d132b2e61859dc44ae"} Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.959886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81","Type":"ContainerStarted","Data":"19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce"} Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.960100 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:39 crc kubenswrapper[4707]: I0121 15:41:39.975391 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.975378701 podStartE2EDuration="1.975378701s" podCreationTimestamp="2026-01-21 15:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:39.971439996 +0000 UTC m=+2397.152956217" watchObservedRunningTime="2026-01-21 15:41:39.975378701 +0000 UTC m=+2397.156894923" Jan 21 15:41:40 crc kubenswrapper[4707]: E0121 15:41:40.063012 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.317380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:40 crc kubenswrapper[4707]: E0121 15:41:40.317527 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:41:40 crc kubenswrapper[4707]: E0121 15:41:40.317738 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:41:40 crc kubenswrapper[4707]: E0121 15:41:40.317853 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift podName:fb2783e9-b8b1-4899-9a36-af57a12d56ba nodeName:}" failed. No retries permitted until 2026-01-21 15:41:44.31783889 +0000 UTC m=+2401.499355112 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift") pod "swift-storage-0" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba") : configmap "swift-ring-files" not found Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.387854 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2lfqh"] Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.389065 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.390505 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.390639 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.390990 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.394600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2lfqh"] Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.418740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhpn4\" (UniqueName: \"kubernetes.io/projected/9055239a-b902-41a9-a86b-0a09f7b78d4e-kube-api-access-dhpn4\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.418957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-scripts\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.419060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-dispersionconf\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.419174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9055239a-b902-41a9-a86b-0a09f7b78d4e-etc-swift\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.419275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-ring-data-devices\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.419372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-combined-ca-bundle\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.419451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-swiftconf\") pod 
\"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhpn4\" (UniqueName: \"kubernetes.io/projected/9055239a-b902-41a9-a86b-0a09f7b78d4e-kube-api-access-dhpn4\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-scripts\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-dispersionconf\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9055239a-b902-41a9-a86b-0a09f7b78d4e-etc-swift\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-ring-data-devices\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-combined-ca-bundle\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-swiftconf\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.520701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9055239a-b902-41a9-a86b-0a09f7b78d4e-etc-swift\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.521013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-ring-data-devices\") pod 
\"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.521221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-scripts\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.524717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-swiftconf\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.524741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-combined-ca-bundle\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.524722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-dispersionconf\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.532726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhpn4\" (UniqueName: \"kubernetes.io/projected/9055239a-b902-41a9-a86b-0a09f7b78d4e-kube-api-access-dhpn4\") pod \"swift-ring-rebalance-2lfqh\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.577671 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b2pn5"] Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.578476 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.580519 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.587831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b2pn5"] Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.621449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b35662-c4ac-4928-a1ef-f7a495f9e624-operator-scripts\") pod \"root-account-create-update-b2pn5\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.621499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcgzx\" (UniqueName: \"kubernetes.io/projected/a6b35662-c4ac-4928-a1ef-f7a495f9e624-kube-api-access-bcgzx\") pod \"root-account-create-update-b2pn5\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.702311 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.722518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcgzx\" (UniqueName: \"kubernetes.io/projected/a6b35662-c4ac-4928-a1ef-f7a495f9e624-kube-api-access-bcgzx\") pod \"root-account-create-update-b2pn5\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.722744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b35662-c4ac-4928-a1ef-f7a495f9e624-operator-scripts\") pod \"root-account-create-update-b2pn5\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.723366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b35662-c4ac-4928-a1ef-f7a495f9e624-operator-scripts\") pod \"root-account-create-update-b2pn5\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.736851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcgzx\" (UniqueName: \"kubernetes.io/projected/a6b35662-c4ac-4928-a1ef-f7a495f9e624-kube-api-access-bcgzx\") pod \"root-account-create-update-b2pn5\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.897610 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.968957 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" exitCode=0 Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.969688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e"} Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.969716 4707 scope.go:117] "RemoveContainer" containerID="3d5e87bd6abf7a6178fdaecd77230a20522c0d8576400f05ca4f9dfd50c50ef1" Jan 21 15:41:40 crc kubenswrapper[4707]: I0121 15:41:40.970018 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:41:40 crc kubenswrapper[4707]: E0121 15:41:40.970191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.075402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2lfqh"] Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.270728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b2pn5"] Jan 21 15:41:41 crc kubenswrapper[4707]: W0121 15:41:41.274913 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6b35662_c4ac_4928_a1ef_f7a495f9e624.slice/crio-3b9b563b424b6e41f0065d7a010ee7be941720661574ef0fdb29b6d9676684ab WatchSource:0}: Error finding container 3b9b563b424b6e41f0065d7a010ee7be941720661574ef0fdb29b6d9676684ab: Status 404 returned error can't find the container with id 3b9b563b424b6e41f0065d7a010ee7be941720661574ef0fdb29b6d9676684ab Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.977387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" event={"ID":"9055239a-b902-41a9-a86b-0a09f7b78d4e","Type":"ContainerStarted","Data":"c718221e74ac014b08a72ecde4ddd13e68c2658920e05635ff5b511a09b10c4d"} Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.977645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" event={"ID":"9055239a-b902-41a9-a86b-0a09f7b78d4e","Type":"ContainerStarted","Data":"95e2b4f15f6717e4cc8c7b56670e502b9211296d6fafbc5a0bf09b13866e74f1"} Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.979802 4707 generic.go:334] "Generic (PLEG): container finished" podID="a6b35662-c4ac-4928-a1ef-f7a495f9e624" containerID="03fc0ec367a820fb2eceeeb390d7bafb68a0ee8bb4d220520f537111c9def27d" exitCode=0 Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.979895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" 
event={"ID":"a6b35662-c4ac-4928-a1ef-f7a495f9e624","Type":"ContainerDied","Data":"03fc0ec367a820fb2eceeeb390d7bafb68a0ee8bb4d220520f537111c9def27d"} Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.979921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" event={"ID":"a6b35662-c4ac-4928-a1ef-f7a495f9e624","Type":"ContainerStarted","Data":"3b9b563b424b6e41f0065d7a010ee7be941720661574ef0fdb29b6d9676684ab"} Jan 21 15:41:41 crc kubenswrapper[4707]: I0121 15:41:41.991361 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" podStartSLOduration=1.991350312 podStartE2EDuration="1.991350312s" podCreationTimestamp="2026-01-21 15:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:41.988828351 +0000 UTC m=+2399.170344573" watchObservedRunningTime="2026-01-21 15:41:41.991350312 +0000 UTC m=+2399.172866534" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.280501 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.297225 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-9nhsf"] Jan 21 15:41:43 crc kubenswrapper[4707]: E0121 15:41:43.297564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b35662-c4ac-4928-a1ef-f7a495f9e624" containerName="mariadb-account-create-update" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.297582 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b35662-c4ac-4928-a1ef-f7a495f9e624" containerName="mariadb-account-create-update" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.297739 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b35662-c4ac-4928-a1ef-f7a495f9e624" containerName="mariadb-account-create-update" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.298243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.305542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-9nhsf"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.343988 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.344831 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.347767 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.352425 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.365974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82t4c\" (UniqueName: \"kubernetes.io/projected/93950f85-ecce-471f-bf1e-ecd888e9bc4a-kube-api-access-82t4c\") pod \"keystone-d4e5-account-create-update-7glvt\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.366033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5868034-d516-46e9-be86-98f34cea1875-operator-scripts\") pod \"keystone-db-create-9nhsf\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.366113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93950f85-ecce-471f-bf1e-ecd888e9bc4a-operator-scripts\") pod \"keystone-d4e5-account-create-update-7glvt\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.366179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmlz\" (UniqueName: \"kubernetes.io/projected/f5868034-d516-46e9-be86-98f34cea1875-kube-api-access-bzmlz\") pod \"keystone-db-create-9nhsf\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.467963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b35662-c4ac-4928-a1ef-f7a495f9e624-operator-scripts\") pod \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.468304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcgzx\" (UniqueName: \"kubernetes.io/projected/a6b35662-c4ac-4928-a1ef-f7a495f9e624-kube-api-access-bcgzx\") pod \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\" (UID: \"a6b35662-c4ac-4928-a1ef-f7a495f9e624\") " Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.468548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82t4c\" (UniqueName: \"kubernetes.io/projected/93950f85-ecce-471f-bf1e-ecd888e9bc4a-kube-api-access-82t4c\") pod \"keystone-d4e5-account-create-update-7glvt\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.468659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a6b35662-c4ac-4928-a1ef-f7a495f9e624-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6b35662-c4ac-4928-a1ef-f7a495f9e624" (UID: "a6b35662-c4ac-4928-a1ef-f7a495f9e624"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.468823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5868034-d516-46e9-be86-98f34cea1875-operator-scripts\") pod \"keystone-db-create-9nhsf\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.468965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93950f85-ecce-471f-bf1e-ecd888e9bc4a-operator-scripts\") pod \"keystone-d4e5-account-create-update-7glvt\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.469097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmlz\" (UniqueName: \"kubernetes.io/projected/f5868034-d516-46e9-be86-98f34cea1875-kube-api-access-bzmlz\") pod \"keystone-db-create-9nhsf\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.469360 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b35662-c4ac-4928-a1ef-f7a495f9e624-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.469418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5868034-d516-46e9-be86-98f34cea1875-operator-scripts\") pod \"keystone-db-create-9nhsf\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.469558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93950f85-ecce-471f-bf1e-ecd888e9bc4a-operator-scripts\") pod \"keystone-d4e5-account-create-update-7glvt\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.478237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b35662-c4ac-4928-a1ef-f7a495f9e624-kube-api-access-bcgzx" (OuterVolumeSpecName: "kube-api-access-bcgzx") pod "a6b35662-c4ac-4928-a1ef-f7a495f9e624" (UID: "a6b35662-c4ac-4928-a1ef-f7a495f9e624"). InnerVolumeSpecName "kube-api-access-bcgzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.484698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82t4c\" (UniqueName: \"kubernetes.io/projected/93950f85-ecce-471f-bf1e-ecd888e9bc4a-kube-api-access-82t4c\") pod \"keystone-d4e5-account-create-update-7glvt\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.486085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmlz\" (UniqueName: \"kubernetes.io/projected/f5868034-d516-46e9-be86-98f34cea1875-kube-api-access-bzmlz\") pod \"keystone-db-create-9nhsf\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.571284 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcgzx\" (UniqueName: \"kubernetes.io/projected/a6b35662-c4ac-4928-a1ef-f7a495f9e624-kube-api-access-bcgzx\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.585371 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-c5ksp"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.586203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.592430 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-c5ksp"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.613273 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.657989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.672792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3854a85-dfb4-4d28-bba8-be23fbc85bca-operator-scripts\") pod \"placement-db-create-c5ksp\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.672890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjl29\" (UniqueName: \"kubernetes.io/projected/a3854a85-dfb4-4d28-bba8-be23fbc85bca-kube-api-access-kjl29\") pod \"placement-db-create-c5ksp\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.697026 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.698248 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.700205 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.702677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.774440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjl29\" (UniqueName: \"kubernetes.io/projected/a3854a85-dfb4-4d28-bba8-be23fbc85bca-kube-api-access-kjl29\") pod \"placement-db-create-c5ksp\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.774550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcffr\" (UniqueName: \"kubernetes.io/projected/09f114bd-85ab-457d-aaad-2cc0a70d6e94-kube-api-access-dcffr\") pod \"placement-7e8f-account-create-update-2zf9k\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.774724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3854a85-dfb4-4d28-bba8-be23fbc85bca-operator-scripts\") pod \"placement-db-create-c5ksp\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.774753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f114bd-85ab-457d-aaad-2cc0a70d6e94-operator-scripts\") pod \"placement-7e8f-account-create-update-2zf9k\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.775583 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3854a85-dfb4-4d28-bba8-be23fbc85bca-operator-scripts\") pod \"placement-db-create-c5ksp\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.788303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjl29\" (UniqueName: \"kubernetes.io/projected/a3854a85-dfb4-4d28-bba8-be23fbc85bca-kube-api-access-kjl29\") pod \"placement-db-create-c5ksp\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.828497 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-f2pf7"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.829539 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.833988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-f2pf7"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.875673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/a75f8ec8-450e-4fa8-a854-71449667b528-kube-api-access-4qcsj\") pod \"glance-db-create-f2pf7\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.875724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcffr\" (UniqueName: \"kubernetes.io/projected/09f114bd-85ab-457d-aaad-2cc0a70d6e94-kube-api-access-dcffr\") pod \"placement-7e8f-account-create-update-2zf9k\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.875924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75f8ec8-450e-4fa8-a854-71449667b528-operator-scripts\") pod \"glance-db-create-f2pf7\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.875990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f114bd-85ab-457d-aaad-2cc0a70d6e94-operator-scripts\") pod \"placement-7e8f-account-create-update-2zf9k\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.876924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f114bd-85ab-457d-aaad-2cc0a70d6e94-operator-scripts\") pod \"placement-7e8f-account-create-update-2zf9k\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.892106 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcffr\" (UniqueName: \"kubernetes.io/projected/09f114bd-85ab-457d-aaad-2cc0a70d6e94-kube-api-access-dcffr\") pod \"placement-7e8f-account-create-update-2zf9k\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.902050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.931379 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-2gz5j"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.932379 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.934016 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.937220 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-2gz5j"] Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.977193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/a75f8ec8-450e-4fa8-a854-71449667b528-kube-api-access-4qcsj\") pod \"glance-db-create-f2pf7\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.977316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75f8ec8-450e-4fa8-a854-71449667b528-operator-scripts\") pod \"glance-db-create-f2pf7\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.978345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75f8ec8-450e-4fa8-a854-71449667b528-operator-scripts\") pod \"glance-db-create-f2pf7\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:43 crc kubenswrapper[4707]: I0121 15:41:43.993595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/a75f8ec8-450e-4fa8-a854-71449667b528-kube-api-access-4qcsj\") pod \"glance-db-create-f2pf7\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.008385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" event={"ID":"a6b35662-c4ac-4928-a1ef-f7a495f9e624","Type":"ContainerDied","Data":"3b9b563b424b6e41f0065d7a010ee7be941720661574ef0fdb29b6d9676684ab"} Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.008427 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9b563b424b6e41f0065d7a010ee7be941720661574ef0fdb29b6d9676684ab" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.008472 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b2pn5" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.015021 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.026055 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-9nhsf"] Jan 21 15:41:44 crc kubenswrapper[4707]: W0121 15:41:44.037183 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5868034_d516_46e9_be86_98f34cea1875.slice/crio-71da69aecc9bd7da9f3c3fd85fc832df059c1200c97d5d6ef47db5cd7f7c44aa WatchSource:0}: Error finding container 71da69aecc9bd7da9f3c3fd85fc832df059c1200c97d5d6ef47db5cd7f7c44aa: Status 404 returned error can't find the container with id 71da69aecc9bd7da9f3c3fd85fc832df059c1200c97d5d6ef47db5cd7f7c44aa Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.079178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp5hv\" (UniqueName: \"kubernetes.io/projected/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-kube-api-access-cp5hv\") pod \"glance-4189-account-create-update-2gz5j\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.079329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-operator-scripts\") pod \"glance-4189-account-create-update-2gz5j\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.095324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt"] Jan 21 15:41:44 crc kubenswrapper[4707]: W0121 15:41:44.105243 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93950f85_ecce_471f_bf1e_ecd888e9bc4a.slice/crio-f7bf8a4cae40444e8876f04d4a6223301fed32e203d0050678d805487bbc9a31 WatchSource:0}: Error finding container f7bf8a4cae40444e8876f04d4a6223301fed32e203d0050678d805487bbc9a31: Status 404 returned error can't find the container with id f7bf8a4cae40444e8876f04d4a6223301fed32e203d0050678d805487bbc9a31 Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.146509 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.180724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp5hv\" (UniqueName: \"kubernetes.io/projected/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-kube-api-access-cp5hv\") pod \"glance-4189-account-create-update-2gz5j\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.180823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-operator-scripts\") pod \"glance-4189-account-create-update-2gz5j\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.181503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-operator-scripts\") pod \"glance-4189-account-create-update-2gz5j\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.199964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp5hv\" (UniqueName: \"kubernetes.io/projected/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-kube-api-access-cp5hv\") pod \"glance-4189-account-create-update-2gz5j\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.251547 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.311065 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-c5ksp"] Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.386209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:44 crc kubenswrapper[4707]: E0121 15:41:44.386382 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:41:44 crc kubenswrapper[4707]: E0121 15:41:44.386407 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:41:44 crc kubenswrapper[4707]: E0121 15:41:44.386455 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift podName:fb2783e9-b8b1-4899-9a36-af57a12d56ba nodeName:}" failed. No retries permitted until 2026-01-21 15:41:52.386439624 +0000 UTC m=+2409.567955846 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift") pod "swift-storage-0" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba") : configmap "swift-ring-files" not found Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.396688 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k"] Jan 21 15:41:44 crc kubenswrapper[4707]: W0121 15:41:44.459336 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09f114bd_85ab_457d_aaad_2cc0a70d6e94.slice/crio-99403da11cacfb4cf998f42e6e587b7dc0ecd69736d853cd2f89cd52c9299568 WatchSource:0}: Error finding container 99403da11cacfb4cf998f42e6e587b7dc0ecd69736d853cd2f89cd52c9299568: Status 404 returned error can't find the container with id 99403da11cacfb4cf998f42e6e587b7dc0ecd69736d853cd2f89cd52c9299568 Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.528590 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-f2pf7"] Jan 21 15:41:44 crc kubenswrapper[4707]: W0121 15:41:44.529223 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75f8ec8_450e_4fa8_a854_71449667b528.slice/crio-b4ecc4b1bbbf828e75c035d20f001c7d746a52234c81f986699ce5c27de2db5b WatchSource:0}: Error finding container b4ecc4b1bbbf828e75c035d20f001c7d746a52234c81f986699ce5c27de2db5b: Status 404 returned error can't find the container with id b4ecc4b1bbbf828e75c035d20f001c7d746a52234c81f986699ce5c27de2db5b Jan 21 15:41:44 crc kubenswrapper[4707]: W0121 15:41:44.652919 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eeb0fcb_cc4b_477d_8cb6_d3915da905a5.slice/crio-eda63471803162ffa37af4cf2f50fdf33032452556822ab6050309c67c328602 WatchSource:0}: Error finding container eda63471803162ffa37af4cf2f50fdf33032452556822ab6050309c67c328602: Status 404 returned error can't find the container with id eda63471803162ffa37af4cf2f50fdf33032452556822ab6050309c67c328602 Jan 21 15:41:44 crc kubenswrapper[4707]: I0121 15:41:44.656040 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-2gz5j"] Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.017133 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5868034-d516-46e9-be86-98f34cea1875" containerID="1348e4bd7ab33693f0b9a16d311fc059dd63a789a138bf8a94a3d344e3499b49" exitCode=0 Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.017216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" event={"ID":"f5868034-d516-46e9-be86-98f34cea1875","Type":"ContainerDied","Data":"1348e4bd7ab33693f0b9a16d311fc059dd63a789a138bf8a94a3d344e3499b49"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.017248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" event={"ID":"f5868034-d516-46e9-be86-98f34cea1875","Type":"ContainerStarted","Data":"71da69aecc9bd7da9f3c3fd85fc832df059c1200c97d5d6ef47db5cd7f7c44aa"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.019191 4707 generic.go:334] "Generic (PLEG): container finished" podID="09f114bd-85ab-457d-aaad-2cc0a70d6e94" 
containerID="a4efbc75ea46ed9e0d43ad15bc77a3b68cc728b78fe26f47fc3a8fc0add65d1e" exitCode=0 Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.019275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" event={"ID":"09f114bd-85ab-457d-aaad-2cc0a70d6e94","Type":"ContainerDied","Data":"a4efbc75ea46ed9e0d43ad15bc77a3b68cc728b78fe26f47fc3a8fc0add65d1e"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.019300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" event={"ID":"09f114bd-85ab-457d-aaad-2cc0a70d6e94","Type":"ContainerStarted","Data":"99403da11cacfb4cf998f42e6e587b7dc0ecd69736d853cd2f89cd52c9299568"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.020900 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3854a85-dfb4-4d28-bba8-be23fbc85bca" containerID="d73ccd21bbb8e5c2fa996b7147ba920c18edaeb40f40be041465c71d75eda7fd" exitCode=0 Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.020981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-c5ksp" event={"ID":"a3854a85-dfb4-4d28-bba8-be23fbc85bca","Type":"ContainerDied","Data":"d73ccd21bbb8e5c2fa996b7147ba920c18edaeb40f40be041465c71d75eda7fd"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.021010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-c5ksp" event={"ID":"a3854a85-dfb4-4d28-bba8-be23fbc85bca","Type":"ContainerStarted","Data":"1db884284e8309097e12c456c3a7cf66293ec51444a65d5bc3e1224528273b48"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.022112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" event={"ID":"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5","Type":"ContainerStarted","Data":"e923daea0011570420f39bbf8ca3fd730d44e5caefd85bd6ad72c2334634c294"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.022140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" event={"ID":"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5","Type":"ContainerStarted","Data":"eda63471803162ffa37af4cf2f50fdf33032452556822ab6050309c67c328602"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.023762 4707 generic.go:334] "Generic (PLEG): container finished" podID="a75f8ec8-450e-4fa8-a854-71449667b528" containerID="527995a08fe95e2e4ed2f6dff00c892b5a1fd3d330c658c01e07a7831093049b" exitCode=0 Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.023835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-f2pf7" event={"ID":"a75f8ec8-450e-4fa8-a854-71449667b528","Type":"ContainerDied","Data":"527995a08fe95e2e4ed2f6dff00c892b5a1fd3d330c658c01e07a7831093049b"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.023872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-f2pf7" event={"ID":"a75f8ec8-450e-4fa8-a854-71449667b528","Type":"ContainerStarted","Data":"b4ecc4b1bbbf828e75c035d20f001c7d746a52234c81f986699ce5c27de2db5b"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.025052 4707 generic.go:334] "Generic (PLEG): container finished" podID="93950f85-ecce-471f-bf1e-ecd888e9bc4a" containerID="1611c703919e81570b553ebb49d79f83a2d4bd78494b93ec3ff68e128f0b8d6f" exitCode=0 Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 
15:41:45.025085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" event={"ID":"93950f85-ecce-471f-bf1e-ecd888e9bc4a","Type":"ContainerDied","Data":"1611c703919e81570b553ebb49d79f83a2d4bd78494b93ec3ff68e128f0b8d6f"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.025102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" event={"ID":"93950f85-ecce-471f-bf1e-ecd888e9bc4a","Type":"ContainerStarted","Data":"f7bf8a4cae40444e8876f04d4a6223301fed32e203d0050678d805487bbc9a31"} Jan 21 15:41:45 crc kubenswrapper[4707]: I0121 15:41:45.058436 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" podStartSLOduration=2.058418513 podStartE2EDuration="2.058418513s" podCreationTimestamp="2026-01-21 15:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:45.057085527 +0000 UTC m=+2402.238601749" watchObservedRunningTime="2026-01-21 15:41:45.058418513 +0000 UTC m=+2402.239934734" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.033965 4707 generic.go:334] "Generic (PLEG): container finished" podID="6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" containerID="e923daea0011570420f39bbf8ca3fd730d44e5caefd85bd6ad72c2334634c294" exitCode=0 Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.034017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" event={"ID":"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5","Type":"ContainerDied","Data":"e923daea0011570420f39bbf8ca3fd730d44e5caefd85bd6ad72c2334634c294"} Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.386308 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.431921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3854a85-dfb4-4d28-bba8-be23fbc85bca-operator-scripts\") pod \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.432038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjl29\" (UniqueName: \"kubernetes.io/projected/a3854a85-dfb4-4d28-bba8-be23fbc85bca-kube-api-access-kjl29\") pod \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\" (UID: \"a3854a85-dfb4-4d28-bba8-be23fbc85bca\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.433269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3854a85-dfb4-4d28-bba8-be23fbc85bca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3854a85-dfb4-4d28-bba8-be23fbc85bca" (UID: "a3854a85-dfb4-4d28-bba8-be23fbc85bca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.441127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3854a85-dfb4-4d28-bba8-be23fbc85bca-kube-api-access-kjl29" (OuterVolumeSpecName: "kube-api-access-kjl29") pod "a3854a85-dfb4-4d28-bba8-be23fbc85bca" (UID: "a3854a85-dfb4-4d28-bba8-be23fbc85bca"). 
InnerVolumeSpecName "kube-api-access-kjl29". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.534516 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3854a85-dfb4-4d28-bba8-be23fbc85bca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.534550 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjl29\" (UniqueName: \"kubernetes.io/projected/a3854a85-dfb4-4d28-bba8-be23fbc85bca-kube-api-access-kjl29\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.683888 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.688621 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.693221 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.703074 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5868034-d516-46e9-be86-98f34cea1875-operator-scripts\") pod \"f5868034-d516-46e9-be86-98f34cea1875\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82t4c\" (UniqueName: \"kubernetes.io/projected/93950f85-ecce-471f-bf1e-ecd888e9bc4a-kube-api-access-82t4c\") pod \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75f8ec8-450e-4fa8-a854-71449667b528-operator-scripts\") pod \"a75f8ec8-450e-4fa8-a854-71449667b528\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzmlz\" (UniqueName: \"kubernetes.io/projected/f5868034-d516-46e9-be86-98f34cea1875-kube-api-access-bzmlz\") pod \"f5868034-d516-46e9-be86-98f34cea1875\" (UID: \"f5868034-d516-46e9-be86-98f34cea1875\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93950f85-ecce-471f-bf1e-ecd888e9bc4a-operator-scripts\") pod \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\" (UID: \"93950f85-ecce-471f-bf1e-ecd888e9bc4a\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcffr\" (UniqueName: 
\"kubernetes.io/projected/09f114bd-85ab-457d-aaad-2cc0a70d6e94-kube-api-access-dcffr\") pod \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f114bd-85ab-457d-aaad-2cc0a70d6e94-operator-scripts\") pod \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\" (UID: \"09f114bd-85ab-457d-aaad-2cc0a70d6e94\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.737903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/a75f8ec8-450e-4fa8-a854-71449667b528-kube-api-access-4qcsj\") pod \"a75f8ec8-450e-4fa8-a854-71449667b528\" (UID: \"a75f8ec8-450e-4fa8-a854-71449667b528\") " Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.741433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75f8ec8-450e-4fa8-a854-71449667b528-kube-api-access-4qcsj" (OuterVolumeSpecName: "kube-api-access-4qcsj") pod "a75f8ec8-450e-4fa8-a854-71449667b528" (UID: "a75f8ec8-450e-4fa8-a854-71449667b528"). InnerVolumeSpecName "kube-api-access-4qcsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.741705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5868034-d516-46e9-be86-98f34cea1875-kube-api-access-bzmlz" (OuterVolumeSpecName: "kube-api-access-bzmlz") pod "f5868034-d516-46e9-be86-98f34cea1875" (UID: "f5868034-d516-46e9-be86-98f34cea1875"). InnerVolumeSpecName "kube-api-access-bzmlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.741989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75f8ec8-450e-4fa8-a854-71449667b528-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a75f8ec8-450e-4fa8-a854-71449667b528" (UID: "a75f8ec8-450e-4fa8-a854-71449667b528"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.742353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93950f85-ecce-471f-bf1e-ecd888e9bc4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93950f85-ecce-471f-bf1e-ecd888e9bc4a" (UID: "93950f85-ecce-471f-bf1e-ecd888e9bc4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.742612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f114bd-85ab-457d-aaad-2cc0a70d6e94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09f114bd-85ab-457d-aaad-2cc0a70d6e94" (UID: "09f114bd-85ab-457d-aaad-2cc0a70d6e94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.742973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5868034-d516-46e9-be86-98f34cea1875-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5868034-d516-46e9-be86-98f34cea1875" (UID: "f5868034-d516-46e9-be86-98f34cea1875"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.744072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f114bd-85ab-457d-aaad-2cc0a70d6e94-kube-api-access-dcffr" (OuterVolumeSpecName: "kube-api-access-dcffr") pod "09f114bd-85ab-457d-aaad-2cc0a70d6e94" (UID: "09f114bd-85ab-457d-aaad-2cc0a70d6e94"). InnerVolumeSpecName "kube-api-access-dcffr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.744140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93950f85-ecce-471f-bf1e-ecd888e9bc4a-kube-api-access-82t4c" (OuterVolumeSpecName: "kube-api-access-82t4c") pod "93950f85-ecce-471f-bf1e-ecd888e9bc4a" (UID: "93950f85-ecce-471f-bf1e-ecd888e9bc4a"). InnerVolumeSpecName "kube-api-access-82t4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.839941 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcffr\" (UniqueName: \"kubernetes.io/projected/09f114bd-85ab-457d-aaad-2cc0a70d6e94-kube-api-access-dcffr\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.839980 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09f114bd-85ab-457d-aaad-2cc0a70d6e94-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.839991 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/a75f8ec8-450e-4fa8-a854-71449667b528-kube-api-access-4qcsj\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.839999 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5868034-d516-46e9-be86-98f34cea1875-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.840008 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82t4c\" (UniqueName: \"kubernetes.io/projected/93950f85-ecce-471f-bf1e-ecd888e9bc4a-kube-api-access-82t4c\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.840017 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75f8ec8-450e-4fa8-a854-71449667b528-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.840025 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzmlz\" (UniqueName: \"kubernetes.io/projected/f5868034-d516-46e9-be86-98f34cea1875-kube-api-access-bzmlz\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:46 crc kubenswrapper[4707]: I0121 15:41:46.840036 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93950f85-ecce-471f-bf1e-ecd888e9bc4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.004904 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b2pn5"] Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.009836 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/root-account-create-update-b2pn5"] Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.044427 4707 generic.go:334] "Generic (PLEG): container finished" podID="9055239a-b902-41a9-a86b-0a09f7b78d4e" containerID="c718221e74ac014b08a72ecde4ddd13e68c2658920e05635ff5b511a09b10c4d" exitCode=0 Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.044727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" event={"ID":"9055239a-b902-41a9-a86b-0a09f7b78d4e","Type":"ContainerDied","Data":"c718221e74ac014b08a72ecde4ddd13e68c2658920e05635ff5b511a09b10c4d"} Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.047840 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.048028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-9nhsf" event={"ID":"f5868034-d516-46e9-be86-98f34cea1875","Type":"ContainerDied","Data":"71da69aecc9bd7da9f3c3fd85fc832df059c1200c97d5d6ef47db5cd7f7c44aa"} Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.048079 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71da69aecc9bd7da9f3c3fd85fc832df059c1200c97d5d6ef47db5cd7f7c44aa" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.052620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" event={"ID":"09f114bd-85ab-457d-aaad-2cc0a70d6e94","Type":"ContainerDied","Data":"99403da11cacfb4cf998f42e6e587b7dc0ecd69736d853cd2f89cd52c9299568"} Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.052649 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99403da11cacfb4cf998f42e6e587b7dc0ecd69736d853cd2f89cd52c9299568" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.052705 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.055853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-c5ksp" event={"ID":"a3854a85-dfb4-4d28-bba8-be23fbc85bca","Type":"ContainerDied","Data":"1db884284e8309097e12c456c3a7cf66293ec51444a65d5bc3e1224528273b48"} Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.055908 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1db884284e8309097e12c456c3a7cf66293ec51444a65d5bc3e1224528273b48" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.055837 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-c5ksp" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.057945 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-f2pf7" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.058134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-f2pf7" event={"ID":"a75f8ec8-450e-4fa8-a854-71449667b528","Type":"ContainerDied","Data":"b4ecc4b1bbbf828e75c035d20f001c7d746a52234c81f986699ce5c27de2db5b"} Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.058706 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ecc4b1bbbf828e75c035d20f001c7d746a52234c81f986699ce5c27de2db5b" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.060918 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.060936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt" event={"ID":"93950f85-ecce-471f-bf1e-ecd888e9bc4a","Type":"ContainerDied","Data":"f7bf8a4cae40444e8876f04d4a6223301fed32e203d0050678d805487bbc9a31"} Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.060963 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7bf8a4cae40444e8876f04d4a6223301fed32e203d0050678d805487bbc9a31" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.189956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b35662-c4ac-4928-a1ef-f7a495f9e624" path="/var/lib/kubelet/pods/a6b35662-c4ac-4928-a1ef-f7a495f9e624/volumes" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.372723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.450700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-operator-scripts\") pod \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.450885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp5hv\" (UniqueName: \"kubernetes.io/projected/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-kube-api-access-cp5hv\") pod \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\" (UID: \"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5\") " Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.451601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" (UID: "6eeb0fcb-cc4b-477d-8cb6-d3915da905a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.455275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-kube-api-access-cp5hv" (OuterVolumeSpecName: "kube-api-access-cp5hv") pod "6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" (UID: "6eeb0fcb-cc4b-477d-8cb6-d3915da905a5"). InnerVolumeSpecName "kube-api-access-cp5hv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.552592 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp5hv\" (UniqueName: \"kubernetes.io/projected/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-kube-api-access-cp5hv\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:47 crc kubenswrapper[4707]: I0121 15:41:47.552955 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.070128 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.070117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4189-account-create-update-2gz5j" event={"ID":"6eeb0fcb-cc4b-477d-8cb6-d3915da905a5","Type":"ContainerDied","Data":"eda63471803162ffa37af4cf2f50fdf33032452556822ab6050309c67c328602"} Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.070278 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eda63471803162ffa37af4cf2f50fdf33032452556822ab6050309c67c328602" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.345939 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-scripts\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-ring-data-devices\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-combined-ca-bundle\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9055239a-b902-41a9-a86b-0a09f7b78d4e-etc-swift\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-dispersionconf\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhpn4\" (UniqueName: 
\"kubernetes.io/projected/9055239a-b902-41a9-a86b-0a09f7b78d4e-kube-api-access-dhpn4\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-swiftconf\") pod \"9055239a-b902-41a9-a86b-0a09f7b78d4e\" (UID: \"9055239a-b902-41a9-a86b-0a09f7b78d4e\") " Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.364987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.365357 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.365714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9055239a-b902-41a9-a86b-0a09f7b78d4e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.374918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9055239a-b902-41a9-a86b-0a09f7b78d4e-kube-api-access-dhpn4" (OuterVolumeSpecName: "kube-api-access-dhpn4") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "kube-api-access-dhpn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.381196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.384164 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.387934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.394639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-scripts" (OuterVolumeSpecName: "scripts") pod "9055239a-b902-41a9-a86b-0a09f7b78d4e" (UID: "9055239a-b902-41a9-a86b-0a09f7b78d4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.466418 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.466451 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9055239a-b902-41a9-a86b-0a09f7b78d4e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.466461 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.466472 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9055239a-b902-41a9-a86b-0a09f7b78d4e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.466481 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9055239a-b902-41a9-a86b-0a09f7b78d4e-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.466491 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhpn4\" (UniqueName: \"kubernetes.io/projected/9055239a-b902-41a9-a86b-0a09f7b78d4e-kube-api-access-dhpn4\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:48 crc kubenswrapper[4707]: I0121 15:41:48.535853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.077618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" event={"ID":"9055239a-b902-41a9-a86b-0a09f7b78d4e","Type":"ContainerDied","Data":"95e2b4f15f6717e4cc8c7b56670e502b9211296d6fafbc5a0bf09b13866e74f1"} Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.077658 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95e2b4f15f6717e4cc8c7b56670e502b9211296d6fafbc5a0bf09b13866e74f1" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.077678 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2lfqh" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.210959 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rvckb"] Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.211463 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75f8ec8-450e-4fa8-a854-71449667b528" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.211583 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75f8ec8-450e-4fa8-a854-71449667b528" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.211657 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9055239a-b902-41a9-a86b-0a09f7b78d4e" containerName="swift-ring-rebalance" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.211712 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9055239a-b902-41a9-a86b-0a09f7b78d4e" containerName="swift-ring-rebalance" Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.211770 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.211832 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.211888 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5868034-d516-46e9-be86-98f34cea1875" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.211943 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5868034-d516-46e9-be86-98f34cea1875" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.211995 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3854a85-dfb4-4d28-bba8-be23fbc85bca" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212037 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3854a85-dfb4-4d28-bba8-be23fbc85bca" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.212095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93950f85-ecce-471f-bf1e-ecd888e9bc4a" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="93950f85-ecce-471f-bf1e-ecd888e9bc4a" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: E0121 15:41:49.212208 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f114bd-85ab-457d-aaad-2cc0a70d6e94" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212251 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f114bd-85ab-457d-aaad-2cc0a70d6e94" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f114bd-85ab-457d-aaad-2cc0a70d6e94" 
containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212612 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75f8ec8-450e-4fa8-a854-71449667b528" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212664 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9055239a-b902-41a9-a86b-0a09f7b78d4e" containerName="swift-ring-rebalance" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212710 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="93950f85-ecce-471f-bf1e-ecd888e9bc4a" containerName="mariadb-account-create-update" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212757 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3854a85-dfb4-4d28-bba8-be23fbc85bca" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.212831 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5868034-d516-46e9-be86-98f34cea1875" containerName="mariadb-database-create" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.213490 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.216938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.218553 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rvckb"] Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.219474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-lbfbl" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.276782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-db-sync-config-data\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.276888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-combined-ca-bundle\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.276967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qmxm\" (UniqueName: \"kubernetes.io/projected/3f22a02a-72f1-45cd-876b-99167c754c9c-kube-api-access-6qmxm\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.276992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-config-data\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.377770 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-combined-ca-bundle\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.377866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qmxm\" (UniqueName: \"kubernetes.io/projected/3f22a02a-72f1-45cd-876b-99167c754c9c-kube-api-access-6qmxm\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.377895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-config-data\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.377941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-db-sync-config-data\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.381251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-db-sync-config-data\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.381612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-combined-ca-bundle\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.382401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-config-data\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.391092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qmxm\" (UniqueName: \"kubernetes.io/projected/3f22a02a-72f1-45cd-876b-99167c754c9c-kube-api-access-6qmxm\") pod \"glance-db-sync-rvckb\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.526306 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:49 crc kubenswrapper[4707]: I0121 15:41:49.882214 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rvckb"] Jan 21 15:41:50 crc kubenswrapper[4707]: I0121 15:41:50.087719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rvckb" event={"ID":"3f22a02a-72f1-45cd-876b-99167c754c9c","Type":"ContainerStarted","Data":"ca326bb4998cb5892499469ee091e32c280f066657f0b3a4a194aeb632795678"} Jan 21 15:41:51 crc kubenswrapper[4707]: I0121 15:41:51.094290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rvckb" event={"ID":"3f22a02a-72f1-45cd-876b-99167c754c9c","Type":"ContainerStarted","Data":"5af2da621a03f8cef0956e53ed72a3d8539092dc2471f1f3cb2edebedcaaafe7"} Jan 21 15:41:51 crc kubenswrapper[4707]: I0121 15:41:51.110361 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-rvckb" podStartSLOduration=2.110348166 podStartE2EDuration="2.110348166s" podCreationTimestamp="2026-01-21 15:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:51.105832917 +0000 UTC m=+2408.287349140" watchObservedRunningTime="2026-01-21 15:41:51.110348166 +0000 UTC m=+2408.291864388" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.007796 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-47dxg"] Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.008639 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.010386 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.017627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66742\" (UniqueName: \"kubernetes.io/projected/3b0806ad-f9f2-44d7-b32f-4ad177974092-kube-api-access-66742\") pod \"root-account-create-update-47dxg\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.018125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0806ad-f9f2-44d7-b32f-4ad177974092-operator-scripts\") pod \"root-account-create-update-47dxg\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.018504 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-47dxg"] Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.118943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0806ad-f9f2-44d7-b32f-4ad177974092-operator-scripts\") pod \"root-account-create-update-47dxg\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 
15:41:52.119084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66742\" (UniqueName: \"kubernetes.io/projected/3b0806ad-f9f2-44d7-b32f-4ad177974092-kube-api-access-66742\") pod \"root-account-create-update-47dxg\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.120141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0806ad-f9f2-44d7-b32f-4ad177974092-operator-scripts\") pod \"root-account-create-update-47dxg\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.150417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66742\" (UniqueName: \"kubernetes.io/projected/3b0806ad-f9f2-44d7-b32f-4ad177974092-kube-api-access-66742\") pod \"root-account-create-update-47dxg\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.330275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.423744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.429547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"swift-storage-0\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.430836 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.720012 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-47dxg"] Jan 21 15:41:52 crc kubenswrapper[4707]: I0121 15:41:52.828832 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:41:52 crc kubenswrapper[4707]: W0121 15:41:52.836900 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb2783e9_b8b1_4899_9a36_af57a12d56ba.slice/crio-0361816401c49ef16e8e70e869d82aea6654f338d15f76c9482638c5dd05b70f WatchSource:0}: Error finding container 0361816401c49ef16e8e70e869d82aea6654f338d15f76c9482638c5dd05b70f: Status 404 returned error can't find the container with id 0361816401c49ef16e8e70e869d82aea6654f338d15f76c9482638c5dd05b70f Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.113080 4707 generic.go:334] "Generic (PLEG): container finished" podID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerID="91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30" exitCode=0 Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.113188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"de4cffce-959d-47e8-9217-f842b2ab8f65","Type":"ContainerDied","Data":"91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30"} Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.114560 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b0806ad-f9f2-44d7-b32f-4ad177974092" containerID="e1469ba4322e58bc36a1c3104e1125846b364821981487a01957611fe56feef3" exitCode=0 Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.114636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-47dxg" event={"ID":"3b0806ad-f9f2-44d7-b32f-4ad177974092","Type":"ContainerDied","Data":"e1469ba4322e58bc36a1c3104e1125846b364821981487a01957611fe56feef3"} Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.114668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-47dxg" event={"ID":"3b0806ad-f9f2-44d7-b32f-4ad177974092","Type":"ContainerStarted","Data":"bb004b8c3c88c3b5ea4d5a8ac88b87a28c8d5775df40c9641553c59633bf420a"} Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.116830 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerID="0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0" exitCode=0 Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.116897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e6f92106-4e6f-4fdd-b646-322ee996dc80","Type":"ContainerDied","Data":"0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0"} Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.118710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58"} Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.118743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa"} Jan 21 15:41:53 crc kubenswrapper[4707]: I0121 15:41:53.118755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"0361816401c49ef16e8e70e869d82aea6654f338d15f76c9482638c5dd05b70f"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.135614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"de4cffce-959d-47e8-9217-f842b2ab8f65","Type":"ContainerStarted","Data":"614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.135917 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.138139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e6f92106-4e6f-4fdd-b646-322ee996dc80","Type":"ContainerStarted","Data":"794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.138594 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.159206 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.159192043 podStartE2EDuration="36.159192043s" podCreationTimestamp="2026-01-21 15:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:54.151323339 +0000 UTC m=+2411.332839560" watchObservedRunningTime="2026-01-21 15:41:54.159192043 +0000 UTC m=+2411.340708265" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.168770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51"} Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.175864 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.175848107 podStartE2EDuration="36.175848107s" podCreationTimestamp="2026-01-21 15:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:54.170097786 +0000 UTC m=+2411.351614008" watchObservedRunningTime="2026-01-21 15:41:54.175848107 +0000 UTC m=+2411.357364330" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.183375 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:41:54 crc kubenswrapper[4707]: E0121 15:41:54.183617 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.480169 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.558308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0806ad-f9f2-44d7-b32f-4ad177974092-operator-scripts\") pod \"3b0806ad-f9f2-44d7-b32f-4ad177974092\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.558572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66742\" (UniqueName: \"kubernetes.io/projected/3b0806ad-f9f2-44d7-b32f-4ad177974092-kube-api-access-66742\") pod \"3b0806ad-f9f2-44d7-b32f-4ad177974092\" (UID: \"3b0806ad-f9f2-44d7-b32f-4ad177974092\") " Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.558784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0806ad-f9f2-44d7-b32f-4ad177974092-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b0806ad-f9f2-44d7-b32f-4ad177974092" (UID: "3b0806ad-f9f2-44d7-b32f-4ad177974092"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.559295 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0806ad-f9f2-44d7-b32f-4ad177974092-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.566590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b0806ad-f9f2-44d7-b32f-4ad177974092-kube-api-access-66742" (OuterVolumeSpecName: "kube-api-access-66742") pod "3b0806ad-f9f2-44d7-b32f-4ad177974092" (UID: "3b0806ad-f9f2-44d7-b32f-4ad177974092"). InnerVolumeSpecName "kube-api-access-66742". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:54 crc kubenswrapper[4707]: I0121 15:41:54.660601 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66742\" (UniqueName: \"kubernetes.io/projected/3b0806ad-f9f2-44d7-b32f-4ad177974092-kube-api-access-66742\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.192067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107"} Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.192944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25"} Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.192991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838"} Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.193002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerStarted","Data":"2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6"} Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.194052 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-47dxg" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.194281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-47dxg" event={"ID":"3b0806ad-f9f2-44d7-b32f-4ad177974092","Type":"ContainerDied","Data":"bb004b8c3c88c3b5ea4d5a8ac88b87a28c8d5775df40c9641553c59633bf420a"} Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.194319 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb004b8c3c88c3b5ea4d5a8ac88b87a28c8d5775df40c9641553c59633bf420a" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.198425 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f22a02a-72f1-45cd-876b-99167c754c9c" containerID="5af2da621a03f8cef0956e53ed72a3d8539092dc2471f1f3cb2edebedcaaafe7" exitCode=0 Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.198504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rvckb" event={"ID":"3f22a02a-72f1-45cd-876b-99167c754c9c","Type":"ContainerDied","Data":"5af2da621a03f8cef0956e53ed72a3d8539092dc2471f1f3cb2edebedcaaafe7"} Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.223033 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=20.22302118 podStartE2EDuration="20.22302118s" podCreationTimestamp="2026-01-21 15:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:55.21831999 +0000 UTC m=+2412.399836213" watchObservedRunningTime="2026-01-21 15:41:55.22302118 +0000 UTC m=+2412.404537402" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.333216 4707 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr"] Jan 21 15:41:55 crc kubenswrapper[4707]: E0121 15:41:55.333524 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b0806ad-f9f2-44d7-b32f-4ad177974092" containerName="mariadb-account-create-update" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.333543 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0806ad-f9f2-44d7-b32f-4ad177974092" containerName="mariadb-account-create-update" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.333722 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b0806ad-f9f2-44d7-b32f-4ad177974092" containerName="mariadb-account-create-update" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.334470 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.336390 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.353991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr"] Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.471676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.471739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2v4\" (UniqueName: \"kubernetes.io/projected/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-kube-api-access-cm2v4\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.471831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-config\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.472009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.573715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.573801 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cm2v4\" (UniqueName: \"kubernetes.io/projected/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-kube-api-access-cm2v4\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.573858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-config\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.573943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.574752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.574755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.574789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-config\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.589957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2v4\" (UniqueName: \"kubernetes.io/projected/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-kube-api-access-cm2v4\") pod \"dnsmasq-dnsmasq-5967bdfff7-tnxtr\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:55 crc kubenswrapper[4707]: I0121 15:41:55.647571 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.021930 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr"] Jan 21 15:41:56 crc kubenswrapper[4707]: W0121 15:41:56.035370 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76873cf_7ae7_48cf_b1f2_b3b3d6b4e725.slice/crio-48ccdff9968b9c1640f3ddf1f437967bdade6b28d7c7257855a27b69414596c9 WatchSource:0}: Error finding container 48ccdff9968b9c1640f3ddf1f437967bdade6b28d7c7257855a27b69414596c9: Status 404 returned error can't find the container with id 48ccdff9968b9c1640f3ddf1f437967bdade6b28d7c7257855a27b69414596c9 Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.211535 4707 generic.go:334] "Generic (PLEG): container finished" podID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerID="5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c" exitCode=0 Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.211714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" event={"ID":"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725","Type":"ContainerDied","Data":"5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c"} Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.212860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" event={"ID":"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725","Type":"ContainerStarted","Data":"48ccdff9968b9c1640f3ddf1f437967bdade6b28d7c7257855a27b69414596c9"} Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.469868 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.590773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-db-sync-config-data\") pod \"3f22a02a-72f1-45cd-876b-99167c754c9c\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.591013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-config-data\") pod \"3f22a02a-72f1-45cd-876b-99167c754c9c\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.591284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qmxm\" (UniqueName: \"kubernetes.io/projected/3f22a02a-72f1-45cd-876b-99167c754c9c-kube-api-access-6qmxm\") pod \"3f22a02a-72f1-45cd-876b-99167c754c9c\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.591399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-combined-ca-bundle\") pod \"3f22a02a-72f1-45cd-876b-99167c754c9c\" (UID: \"3f22a02a-72f1-45cd-876b-99167c754c9c\") " Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.594356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f22a02a-72f1-45cd-876b-99167c754c9c-kube-api-access-6qmxm" (OuterVolumeSpecName: "kube-api-access-6qmxm") pod "3f22a02a-72f1-45cd-876b-99167c754c9c" (UID: "3f22a02a-72f1-45cd-876b-99167c754c9c"). InnerVolumeSpecName "kube-api-access-6qmxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.594573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3f22a02a-72f1-45cd-876b-99167c754c9c" (UID: "3f22a02a-72f1-45cd-876b-99167c754c9c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.609874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f22a02a-72f1-45cd-876b-99167c754c9c" (UID: "3f22a02a-72f1-45cd-876b-99167c754c9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.622343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-config-data" (OuterVolumeSpecName: "config-data") pod "3f22a02a-72f1-45cd-876b-99167c754c9c" (UID: "3f22a02a-72f1-45cd-876b-99167c754c9c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.694880 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qmxm\" (UniqueName: \"kubernetes.io/projected/3f22a02a-72f1-45cd-876b-99167c754c9c-kube-api-access-6qmxm\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.694904 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.694915 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:56 crc kubenswrapper[4707]: I0121 15:41:56.694927 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f22a02a-72f1-45cd-876b-99167c754c9c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:41:57 crc kubenswrapper[4707]: I0121 15:41:57.222492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" event={"ID":"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725","Type":"ContainerStarted","Data":"119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f"} Jan 21 15:41:57 crc kubenswrapper[4707]: I0121 15:41:57.222928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:41:57 crc kubenswrapper[4707]: I0121 15:41:57.225171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rvckb" event={"ID":"3f22a02a-72f1-45cd-876b-99167c754c9c","Type":"ContainerDied","Data":"ca326bb4998cb5892499469ee091e32c280f066657f0b3a4a194aeb632795678"} Jan 21 15:41:57 crc kubenswrapper[4707]: I0121 15:41:57.225203 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca326bb4998cb5892499469ee091e32c280f066657f0b3a4a194aeb632795678" Jan 21 15:41:57 crc kubenswrapper[4707]: I0121 15:41:57.225222 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rvckb" Jan 21 15:41:57 crc kubenswrapper[4707]: I0121 15:41:57.238104 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" podStartSLOduration=2.238091507 podStartE2EDuration="2.238091507s" podCreationTimestamp="2026-01-21 15:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:41:57.235086267 +0000 UTC m=+2414.416602489" watchObservedRunningTime="2026-01-21 15:41:57.238091507 +0000 UTC m=+2414.419607729" Jan 21 15:41:58 crc kubenswrapper[4707]: I0121 15:41:58.372727 4707 scope.go:117] "RemoveContainer" containerID="6312bc7575fea47a596d88868e21654648feff1c9d8fe5d956eeb115a2036499" Jan 21 15:41:58 crc kubenswrapper[4707]: I0121 15:41:58.392198 4707 scope.go:117] "RemoveContainer" containerID="0acfdb10ed4b8a5d4173af5a014e847071dd24247841328d306d9a0bc63b44ec" Jan 21 15:41:58 crc kubenswrapper[4707]: I0121 15:41:58.426495 4707 scope.go:117] "RemoveContainer" containerID="54c3c3aa93da18a1865d4c3838da11a384cb1fb6df94004df8c93addbd488257" Jan 21 15:41:58 crc kubenswrapper[4707]: I0121 15:41:58.445746 4707 scope.go:117] "RemoveContainer" containerID="fa1515803c68c5a23c843f4e49519efb946c9d84db010699f0e712149f6220d1" Jan 21 15:41:58 crc kubenswrapper[4707]: I0121 15:41:58.467303 4707 scope.go:117] "RemoveContainer" containerID="17e70d7489d29ca57860b2d236ba90a62d858bc89f801452c8a3eba7098c9352" Jan 21 15:42:05 crc kubenswrapper[4707]: I0121 15:42:05.649573 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:42:05 crc kubenswrapper[4707]: I0121 15:42:05.685133 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq"] Jan 21 15:42:05 crc kubenswrapper[4707]: I0121 15:42:05.685358 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerName="dnsmasq-dns" containerID="cri-o://ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384" gracePeriod=10 Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.066061 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.140374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-dnsmasq-svc\") pod \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.140427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh4vp\" (UniqueName: \"kubernetes.io/projected/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-kube-api-access-sh4vp\") pod \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.140519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-config\") pod \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\" (UID: \"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97\") " Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.146445 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-kube-api-access-sh4vp" (OuterVolumeSpecName: "kube-api-access-sh4vp") pod "4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" (UID: "4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97"). InnerVolumeSpecName "kube-api-access-sh4vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.173844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-config" (OuterVolumeSpecName: "config") pod "4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" (UID: "4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.174327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" (UID: "4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.243394 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.243425 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh4vp\" (UniqueName: \"kubernetes.io/projected/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-kube-api-access-sh4vp\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.243436 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.306061 4707 generic.go:334] "Generic (PLEG): container finished" podID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerID="ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384" exitCode=0 Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.306107 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.306112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" event={"ID":"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97","Type":"ContainerDied","Data":"ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384"} Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.306209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq" event={"ID":"4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97","Type":"ContainerDied","Data":"005c3830bbcc49ebbc04330bf3fbbe8b92dadd75627cc14deb4d96c5057c014c"} Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.306226 4707 scope.go:117] "RemoveContainer" containerID="ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.324660 4707 scope.go:117] "RemoveContainer" containerID="e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.335982 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq"] Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.341484 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-wkpqq"] Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.349286 4707 scope.go:117] "RemoveContainer" containerID="ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384" Jan 21 15:42:06 crc kubenswrapper[4707]: E0121 15:42:06.349664 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384\": container with ID starting with ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384 not found: ID does not exist" containerID="ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.349693 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384"} err="failed to get container status \"ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384\": rpc error: code = NotFound desc = could not find container \"ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384\": container with ID starting with ff1cad39861c5bcc1e137546e9464da2498c66698aefbc6a1b3dd428da787384 not found: ID does not exist" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.349714 4707 scope.go:117] "RemoveContainer" containerID="e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615" Jan 21 15:42:06 crc kubenswrapper[4707]: E0121 15:42:06.350046 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615\": container with ID starting with e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615 not found: ID does not exist" containerID="e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615" Jan 21 15:42:06 crc kubenswrapper[4707]: I0121 15:42:06.350076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615"} err="failed to get container status \"e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615\": rpc error: code = NotFound desc = could not find container \"e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615\": container with ID starting with e82380bd2e78cb3940ea6657631ded0e2c483ef2608c0884c4a9fd379f309615 not found: ID does not exist" Jan 21 15:42:07 crc kubenswrapper[4707]: I0121 15:42:07.183538 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:42:07 crc kubenswrapper[4707]: E0121 15:42:07.184284 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:42:07 crc kubenswrapper[4707]: I0121 15:42:07.195209 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" path="/var/lib/kubelet/pods/4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97/volumes" Jan 21 15:42:09 crc kubenswrapper[4707]: I0121 15:42:09.922937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.143579 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-49p4w"] Jan 21 15:42:10 crc kubenswrapper[4707]: E0121 15:42:10.143910 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerName="dnsmasq-dns" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.143927 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerName="dnsmasq-dns" Jan 21 15:42:10 crc kubenswrapper[4707]: E0121 15:42:10.143940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" 
containerName="init" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.143946 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerName="init" Jan 21 15:42:10 crc kubenswrapper[4707]: E0121 15:42:10.143963 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f22a02a-72f1-45cd-876b-99167c754c9c" containerName="glance-db-sync" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.143968 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f22a02a-72f1-45cd-876b-99167c754c9c" containerName="glance-db-sync" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.144113 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f22a02a-72f1-45cd-876b-99167c754c9c" containerName="glance-db-sync" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.144128 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa82ae2-2af5-4cad-9ffc-ca96fbe78e97" containerName="dnsmasq-dns" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.144616 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.153882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-49p4w"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.199944 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.202574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03df5afa-682c-4caa-a0ab-091fb6c7fa11-operator-scripts\") pod \"barbican-db-create-49p4w\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.202630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpnrx\" (UniqueName: \"kubernetes.io/projected/03df5afa-682c-4caa-a0ab-091fb6c7fa11-kube-api-access-cpnrx\") pod \"barbican-db-create-49p4w\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.260046 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.261129 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.265996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.300953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.307511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03df5afa-682c-4caa-a0ab-091fb6c7fa11-operator-scripts\") pod \"barbican-db-create-49p4w\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.307604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpnrx\" (UniqueName: \"kubernetes.io/projected/03df5afa-682c-4caa-a0ab-091fb6c7fa11-kube-api-access-cpnrx\") pod \"barbican-db-create-49p4w\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.310094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03df5afa-682c-4caa-a0ab-091fb6c7fa11-operator-scripts\") pod \"barbican-db-create-49p4w\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.368171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpnrx\" (UniqueName: \"kubernetes.io/projected/03df5afa-682c-4caa-a0ab-091fb6c7fa11-kube-api-access-cpnrx\") pod \"barbican-db-create-49p4w\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.409447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ba6f6d-b3da-4f93-8af6-22537c56e223-operator-scripts\") pod \"barbican-e6a9-account-create-update-mgfrs\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.409531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shv8q\" (UniqueName: \"kubernetes.io/projected/67ba6f6d-b3da-4f93-8af6-22537c56e223-kube-api-access-shv8q\") pod \"barbican-e6a9-account-create-update-mgfrs\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.437016 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lblpk"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.438093 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.443280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lblpk"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.454231 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.455621 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.458390 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.458739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.467542 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.510975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ba6f6d-b3da-4f93-8af6-22537c56e223-operator-scripts\") pod \"barbican-e6a9-account-create-update-mgfrs\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.511038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fkt\" (UniqueName: \"kubernetes.io/projected/596ebb84-1e59-4793-804c-225d9a9d1794-kube-api-access-74fkt\") pod \"cinder-a608-account-create-update-tvcgs\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.511071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shv8q\" (UniqueName: \"kubernetes.io/projected/67ba6f6d-b3da-4f93-8af6-22537c56e223-kube-api-access-shv8q\") pod \"barbican-e6a9-account-create-update-mgfrs\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.511102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/596ebb84-1e59-4793-804c-225d9a9d1794-operator-scripts\") pod \"cinder-a608-account-create-update-tvcgs\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.511144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptvl\" (UniqueName: \"kubernetes.io/projected/61a9072a-1e29-4655-aec9-a7bf54ba24f9-kube-api-access-8ptvl\") pod \"neutron-db-create-lblpk\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.511182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a9072a-1e29-4655-aec9-a7bf54ba24f9-operator-scripts\") pod \"neutron-db-create-lblpk\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.512211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ba6f6d-b3da-4f93-8af6-22537c56e223-operator-scripts\") pod \"barbican-e6a9-account-create-update-mgfrs\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.519406 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vgsx9"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.520729 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.527511 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.527703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.528230 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-xxph7" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.528357 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.533445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shv8q\" (UniqueName: \"kubernetes.io/projected/67ba6f6d-b3da-4f93-8af6-22537c56e223-kube-api-access-shv8q\") pod \"barbican-e6a9-account-create-update-mgfrs\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.560799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vgsx9"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.567338 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-s78zw"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.568583 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.577502 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.578418 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.581120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.581446 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.595276 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-s78zw"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.613757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a9072a-1e29-4655-aec9-a7bf54ba24f9-operator-scripts\") pod \"neutron-db-create-lblpk\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-combined-ca-bundle\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpkw\" (UniqueName: \"kubernetes.io/projected/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-kube-api-access-8fpkw\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74fkt\" (UniqueName: \"kubernetes.io/projected/596ebb84-1e59-4793-804c-225d9a9d1794-kube-api-access-74fkt\") pod \"cinder-a608-account-create-update-tvcgs\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/596ebb84-1e59-4793-804c-225d9a9d1794-operator-scripts\") pod \"cinder-a608-account-create-update-tvcgs\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptvl\" (UniqueName: \"kubernetes.io/projected/61a9072a-1e29-4655-aec9-a7bf54ba24f9-kube-api-access-8ptvl\") pod \"neutron-db-create-lblpk\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-config-data\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.614549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a9072a-1e29-4655-aec9-a7bf54ba24f9-operator-scripts\") pod \"neutron-db-create-lblpk\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 
21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.615024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/596ebb84-1e59-4793-804c-225d9a9d1794-operator-scripts\") pod \"cinder-a608-account-create-update-tvcgs\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.618112 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s"] Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.647481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fkt\" (UniqueName: \"kubernetes.io/projected/596ebb84-1e59-4793-804c-225d9a9d1794-kube-api-access-74fkt\") pod \"cinder-a608-account-create-update-tvcgs\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.651022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptvl\" (UniqueName: \"kubernetes.io/projected/61a9072a-1e29-4655-aec9-a7bf54ba24f9-kube-api-access-8ptvl\") pod \"neutron-db-create-lblpk\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.716157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgnw\" (UniqueName: \"kubernetes.io/projected/0219fe68-3dcf-4e10-88e8-6de2bd228897-kube-api-access-skgnw\") pod \"cinder-db-create-s78zw\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.716243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2c49ae-8633-4865-9d20-03b627f5bf54-operator-scripts\") pod \"neutron-6d40-account-create-update-f9l4s\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.716406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-config-data\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.716495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0219fe68-3dcf-4e10-88e8-6de2bd228897-operator-scripts\") pod \"cinder-db-create-s78zw\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.716575 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-combined-ca-bundle\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.717143 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpkw\" (UniqueName: \"kubernetes.io/projected/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-kube-api-access-8fpkw\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.717234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmx9\" (UniqueName: \"kubernetes.io/projected/2b2c49ae-8633-4865-9d20-03b627f5bf54-kube-api-access-ngmx9\") pod \"neutron-6d40-account-create-update-f9l4s\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.720022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-config-data\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.720114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-combined-ca-bundle\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.731698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpkw\" (UniqueName: \"kubernetes.io/projected/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-kube-api-access-8fpkw\") pod \"keystone-db-sync-vgsx9\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.759101 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.776124 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.818914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgnw\" (UniqueName: \"kubernetes.io/projected/0219fe68-3dcf-4e10-88e8-6de2bd228897-kube-api-access-skgnw\") pod \"cinder-db-create-s78zw\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.819133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2c49ae-8633-4865-9d20-03b627f5bf54-operator-scripts\") pod \"neutron-6d40-account-create-update-f9l4s\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.819195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0219fe68-3dcf-4e10-88e8-6de2bd228897-operator-scripts\") pod \"cinder-db-create-s78zw\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.819263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmx9\" (UniqueName: \"kubernetes.io/projected/2b2c49ae-8633-4865-9d20-03b627f5bf54-kube-api-access-ngmx9\") pod \"neutron-6d40-account-create-update-f9l4s\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.819866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2c49ae-8633-4865-9d20-03b627f5bf54-operator-scripts\") pod \"neutron-6d40-account-create-update-f9l4s\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.819981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0219fe68-3dcf-4e10-88e8-6de2bd228897-operator-scripts\") pod \"cinder-db-create-s78zw\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.836360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgnw\" (UniqueName: \"kubernetes.io/projected/0219fe68-3dcf-4e10-88e8-6de2bd228897-kube-api-access-skgnw\") pod \"cinder-db-create-s78zw\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.837301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmx9\" (UniqueName: \"kubernetes.io/projected/2b2c49ae-8633-4865-9d20-03b627f5bf54-kube-api-access-ngmx9\") pod \"neutron-6d40-account-create-update-f9l4s\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.895914 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.918493 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.960745 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:10 crc kubenswrapper[4707]: I0121 15:42:10.969549 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-49p4w"] Jan 21 15:42:10 crc kubenswrapper[4707]: W0121 15:42:10.988348 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03df5afa_682c_4caa_a0ab_091fb6c7fa11.slice/crio-20b14746d611a16589344a8490b060a18ea960806f56b8ca1f2e7c031ec1e7da WatchSource:0}: Error finding container 20b14746d611a16589344a8490b060a18ea960806f56b8ca1f2e7c031ec1e7da: Status 404 returned error can't find the container with id 20b14746d611a16589344a8490b060a18ea960806f56b8ca1f2e7c031ec1e7da Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.061050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs"] Jan 21 15:42:11 crc kubenswrapper[4707]: W0121 15:42:11.080875 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ba6f6d_b3da_4f93_8af6_22537c56e223.slice/crio-f3c6084f8d66d3d46a1dbe7160a0ac80f1800ffb00f678655891e4ce6d4fa9b4 WatchSource:0}: Error finding container f3c6084f8d66d3d46a1dbe7160a0ac80f1800ffb00f678655891e4ce6d4fa9b4: Status 404 returned error can't find the container with id f3c6084f8d66d3d46a1dbe7160a0ac80f1800ffb00f678655891e4ce6d4fa9b4 Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.208365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs"] Jan 21 15:42:11 crc kubenswrapper[4707]: W0121 15:42:11.212858 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod596ebb84_1e59_4793_804c_225d9a9d1794.slice/crio-2581aa3bcc3913bff32e2673d662924427802469ebf8b8051854dfdb4fbe5a9f WatchSource:0}: Error finding container 2581aa3bcc3913bff32e2673d662924427802469ebf8b8051854dfdb4fbe5a9f: Status 404 returned error can't find the container with id 2581aa3bcc3913bff32e2673d662924427802469ebf8b8051854dfdb4fbe5a9f Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.315083 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lblpk"] Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.349154 4707 generic.go:334] "Generic (PLEG): container finished" podID="03df5afa-682c-4caa-a0ab-091fb6c7fa11" containerID="85f0ac1944fb7f7018245b57f8ae281a1b377d7c02aeb6ea7c4032590cd3ceda" exitCode=0 Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.349224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-49p4w" event={"ID":"03df5afa-682c-4caa-a0ab-091fb6c7fa11","Type":"ContainerDied","Data":"85f0ac1944fb7f7018245b57f8ae281a1b377d7c02aeb6ea7c4032590cd3ceda"} Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.349268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-db-create-49p4w" event={"ID":"03df5afa-682c-4caa-a0ab-091fb6c7fa11","Type":"ContainerStarted","Data":"20b14746d611a16589344a8490b060a18ea960806f56b8ca1f2e7c031ec1e7da"} Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.351735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" event={"ID":"67ba6f6d-b3da-4f93-8af6-22537c56e223","Type":"ContainerStarted","Data":"aaaee1f315be8fc5264779dd0208672135508c8732b50d410fe3dee604e532f5"} Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.351760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" event={"ID":"67ba6f6d-b3da-4f93-8af6-22537c56e223","Type":"ContainerStarted","Data":"f3c6084f8d66d3d46a1dbe7160a0ac80f1800ffb00f678655891e4ce6d4fa9b4"} Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.353862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" event={"ID":"596ebb84-1e59-4793-804c-225d9a9d1794","Type":"ContainerStarted","Data":"2581aa3bcc3913bff32e2673d662924427802469ebf8b8051854dfdb4fbe5a9f"} Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.385963 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" podStartSLOduration=1.385946203 podStartE2EDuration="1.385946203s" podCreationTimestamp="2026-01-21 15:42:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:11.378846545 +0000 UTC m=+2428.560362768" watchObservedRunningTime="2026-01-21 15:42:11.385946203 +0000 UTC m=+2428.567462425" Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.395725 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vgsx9"] Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.454300 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s"] Jan 21 15:42:11 crc kubenswrapper[4707]: I0121 15:42:11.472012 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-s78zw"] Jan 21 15:42:11 crc kubenswrapper[4707]: E0121 15:42:11.554304 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:54380->192.168.25.165:36655: read tcp 192.168.25.165:54380->192.168.25.165:36655: read: connection reset by peer Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.366715 4707 generic.go:334] "Generic (PLEG): container finished" podID="61a9072a-1e29-4655-aec9-a7bf54ba24f9" containerID="b6b60f79abe8f46a93b6c2e5dfd92e9104cd8b3b93411bbe1c60ed456a3637bc" exitCode=0 Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.366766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-lblpk" event={"ID":"61a9072a-1e29-4655-aec9-a7bf54ba24f9","Type":"ContainerDied","Data":"b6b60f79abe8f46a93b6c2e5dfd92e9104cd8b3b93411bbe1c60ed456a3637bc"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.367132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-lblpk" event={"ID":"61a9072a-1e29-4655-aec9-a7bf54ba24f9","Type":"ContainerStarted","Data":"d57092669d7b470706648ec1c938ca672634cf8e9d48abb5573653c06c550de3"} Jan 21 15:42:12 crc 
kubenswrapper[4707]: I0121 15:42:12.369445 4707 generic.go:334] "Generic (PLEG): container finished" podID="596ebb84-1e59-4793-804c-225d9a9d1794" containerID="4fef014c4eff487e10d655344bee4cd79f65ebb0fd139656b3c18924175d3fa6" exitCode=0 Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.369618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" event={"ID":"596ebb84-1e59-4793-804c-225d9a9d1794","Type":"ContainerDied","Data":"4fef014c4eff487e10d655344bee4cd79f65ebb0fd139656b3c18924175d3fa6"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.371218 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b2c49ae-8633-4865-9d20-03b627f5bf54" containerID="276b7c05bf5308331c9bd2cda80036ad4969a70291f51c772dd84fff84a8ac15" exitCode=0 Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.371324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" event={"ID":"2b2c49ae-8633-4865-9d20-03b627f5bf54","Type":"ContainerDied","Data":"276b7c05bf5308331c9bd2cda80036ad4969a70291f51c772dd84fff84a8ac15"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.371525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" event={"ID":"2b2c49ae-8633-4865-9d20-03b627f5bf54","Type":"ContainerStarted","Data":"2688417d34c73ee641e3a6c1ed303870a38da9b8dd331d3c1aab9d4c5f1aefbe"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.372712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" event={"ID":"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d","Type":"ContainerStarted","Data":"ff9c1f50093af815c6e84a6e86938a5529512e8a26ae5c083aa8c51428af7125"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.372747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" event={"ID":"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d","Type":"ContainerStarted","Data":"3c22b50d03135a723117a24f1d5940884c1710606f5e5d812654a7fc4c3d5aff"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.374922 4707 generic.go:334] "Generic (PLEG): container finished" podID="0219fe68-3dcf-4e10-88e8-6de2bd228897" containerID="0921cbb4487de695795c92b6b4be615fabb9b47e191394d00f6a928e940d4022" exitCode=0 Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.375038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-s78zw" event={"ID":"0219fe68-3dcf-4e10-88e8-6de2bd228897","Type":"ContainerDied","Data":"0921cbb4487de695795c92b6b4be615fabb9b47e191394d00f6a928e940d4022"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.375120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-s78zw" event={"ID":"0219fe68-3dcf-4e10-88e8-6de2bd228897","Type":"ContainerStarted","Data":"decd4de16ce61d26f1cb8dec09dd4895d4d0782cbdb2be70148e7b773364db30"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.376802 4707 generic.go:334] "Generic (PLEG): container finished" podID="67ba6f6d-b3da-4f93-8af6-22537c56e223" containerID="aaaee1f315be8fc5264779dd0208672135508c8732b50d410fe3dee604e532f5" exitCode=0 Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.376873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" 
event={"ID":"67ba6f6d-b3da-4f93-8af6-22537c56e223","Type":"ContainerDied","Data":"aaaee1f315be8fc5264779dd0208672135508c8732b50d410fe3dee604e532f5"} Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.433146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" podStartSLOduration=2.433128934 podStartE2EDuration="2.433128934s" podCreationTimestamp="2026-01-21 15:42:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:12.425284074 +0000 UTC m=+2429.606800296" watchObservedRunningTime="2026-01-21 15:42:12.433128934 +0000 UTC m=+2429.614645156" Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.656724 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.774429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpnrx\" (UniqueName: \"kubernetes.io/projected/03df5afa-682c-4caa-a0ab-091fb6c7fa11-kube-api-access-cpnrx\") pod \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.774712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03df5afa-682c-4caa-a0ab-091fb6c7fa11-operator-scripts\") pod \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\" (UID: \"03df5afa-682c-4caa-a0ab-091fb6c7fa11\") " Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.775103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03df5afa-682c-4caa-a0ab-091fb6c7fa11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03df5afa-682c-4caa-a0ab-091fb6c7fa11" (UID: "03df5afa-682c-4caa-a0ab-091fb6c7fa11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.775463 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03df5afa-682c-4caa-a0ab-091fb6c7fa11-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.780879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03df5afa-682c-4caa-a0ab-091fb6c7fa11-kube-api-access-cpnrx" (OuterVolumeSpecName: "kube-api-access-cpnrx") pod "03df5afa-682c-4caa-a0ab-091fb6c7fa11" (UID: "03df5afa-682c-4caa-a0ab-091fb6c7fa11"). InnerVolumeSpecName "kube-api-access-cpnrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:12 crc kubenswrapper[4707]: I0121 15:42:12.877453 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpnrx\" (UniqueName: \"kubernetes.io/projected/03df5afa-682c-4caa-a0ab-091fb6c7fa11-kube-api-access-cpnrx\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.390211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-49p4w" event={"ID":"03df5afa-682c-4caa-a0ab-091fb6c7fa11","Type":"ContainerDied","Data":"20b14746d611a16589344a8490b060a18ea960806f56b8ca1f2e7c031ec1e7da"} Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.390576 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20b14746d611a16589344a8490b060a18ea960806f56b8ca1f2e7c031ec1e7da" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.390308 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-49p4w" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.394751 4707 generic.go:334] "Generic (PLEG): container finished" podID="4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" containerID="ff9c1f50093af815c6e84a6e86938a5529512e8a26ae5c083aa8c51428af7125" exitCode=0 Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.394780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" event={"ID":"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d","Type":"ContainerDied","Data":"ff9c1f50093af815c6e84a6e86938a5529512e8a26ae5c083aa8c51428af7125"} Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.694742 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.791367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74fkt\" (UniqueName: \"kubernetes.io/projected/596ebb84-1e59-4793-804c-225d9a9d1794-kube-api-access-74fkt\") pod \"596ebb84-1e59-4793-804c-225d9a9d1794\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.791453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/596ebb84-1e59-4793-804c-225d9a9d1794-operator-scripts\") pod \"596ebb84-1e59-4793-804c-225d9a9d1794\" (UID: \"596ebb84-1e59-4793-804c-225d9a9d1794\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.792250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596ebb84-1e59-4793-804c-225d9a9d1794-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "596ebb84-1e59-4793-804c-225d9a9d1794" (UID: "596ebb84-1e59-4793-804c-225d9a9d1794"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.796832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596ebb84-1e59-4793-804c-225d9a9d1794-kube-api-access-74fkt" (OuterVolumeSpecName: "kube-api-access-74fkt") pod "596ebb84-1e59-4793-804c-225d9a9d1794" (UID: "596ebb84-1e59-4793-804c-225d9a9d1794"). InnerVolumeSpecName "kube-api-access-74fkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.826040 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.832050 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.836495 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.845509 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.893777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ptvl\" (UniqueName: \"kubernetes.io/projected/61a9072a-1e29-4655-aec9-a7bf54ba24f9-kube-api-access-8ptvl\") pod \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.893921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shv8q\" (UniqueName: \"kubernetes.io/projected/67ba6f6d-b3da-4f93-8af6-22537c56e223-kube-api-access-shv8q\") pod \"67ba6f6d-b3da-4f93-8af6-22537c56e223\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.893954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a9072a-1e29-4655-aec9-a7bf54ba24f9-operator-scripts\") pod \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\" (UID: \"61a9072a-1e29-4655-aec9-a7bf54ba24f9\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.894030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgnw\" (UniqueName: \"kubernetes.io/projected/0219fe68-3dcf-4e10-88e8-6de2bd228897-kube-api-access-skgnw\") pod \"0219fe68-3dcf-4e10-88e8-6de2bd228897\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.894063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0219fe68-3dcf-4e10-88e8-6de2bd228897-operator-scripts\") pod \"0219fe68-3dcf-4e10-88e8-6de2bd228897\" (UID: \"0219fe68-3dcf-4e10-88e8-6de2bd228897\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.894431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ba6f6d-b3da-4f93-8af6-22537c56e223-operator-scripts\") pod \"67ba6f6d-b3da-4f93-8af6-22537c56e223\" (UID: \"67ba6f6d-b3da-4f93-8af6-22537c56e223\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.895308 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74fkt\" (UniqueName: \"kubernetes.io/projected/596ebb84-1e59-4793-804c-225d9a9d1794-kube-api-access-74fkt\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.895336 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/596ebb84-1e59-4793-804c-225d9a9d1794-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.897128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ba6f6d-b3da-4f93-8af6-22537c56e223-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67ba6f6d-b3da-4f93-8af6-22537c56e223" (UID: "67ba6f6d-b3da-4f93-8af6-22537c56e223"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.898175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a9072a-1e29-4655-aec9-a7bf54ba24f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61a9072a-1e29-4655-aec9-a7bf54ba24f9" (UID: "61a9072a-1e29-4655-aec9-a7bf54ba24f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.898304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0219fe68-3dcf-4e10-88e8-6de2bd228897-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0219fe68-3dcf-4e10-88e8-6de2bd228897" (UID: "0219fe68-3dcf-4e10-88e8-6de2bd228897"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.899790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a9072a-1e29-4655-aec9-a7bf54ba24f9-kube-api-access-8ptvl" (OuterVolumeSpecName: "kube-api-access-8ptvl") pod "61a9072a-1e29-4655-aec9-a7bf54ba24f9" (UID: "61a9072a-1e29-4655-aec9-a7bf54ba24f9"). InnerVolumeSpecName "kube-api-access-8ptvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.900801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ba6f6d-b3da-4f93-8af6-22537c56e223-kube-api-access-shv8q" (OuterVolumeSpecName: "kube-api-access-shv8q") pod "67ba6f6d-b3da-4f93-8af6-22537c56e223" (UID: "67ba6f6d-b3da-4f93-8af6-22537c56e223"). InnerVolumeSpecName "kube-api-access-shv8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.901238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0219fe68-3dcf-4e10-88e8-6de2bd228897-kube-api-access-skgnw" (OuterVolumeSpecName: "kube-api-access-skgnw") pod "0219fe68-3dcf-4e10-88e8-6de2bd228897" (UID: "0219fe68-3dcf-4e10-88e8-6de2bd228897"). InnerVolumeSpecName "kube-api-access-skgnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.996151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2c49ae-8633-4865-9d20-03b627f5bf54-operator-scripts\") pod \"2b2c49ae-8633-4865-9d20-03b627f5bf54\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.996279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngmx9\" (UniqueName: \"kubernetes.io/projected/2b2c49ae-8633-4865-9d20-03b627f5bf54-kube-api-access-ngmx9\") pod \"2b2c49ae-8633-4865-9d20-03b627f5bf54\" (UID: \"2b2c49ae-8633-4865-9d20-03b627f5bf54\") " Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.996549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2c49ae-8633-4865-9d20-03b627f5bf54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b2c49ae-8633-4865-9d20-03b627f5bf54" (UID: "2b2c49ae-8633-4865-9d20-03b627f5bf54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.996957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgnw\" (UniqueName: \"kubernetes.io/projected/0219fe68-3dcf-4e10-88e8-6de2bd228897-kube-api-access-skgnw\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.996978 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0219fe68-3dcf-4e10-88e8-6de2bd228897-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.996991 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ba6f6d-b3da-4f93-8af6-22537c56e223-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.997004 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ptvl\" (UniqueName: \"kubernetes.io/projected/61a9072a-1e29-4655-aec9-a7bf54ba24f9-kube-api-access-8ptvl\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.997012 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2c49ae-8633-4865-9d20-03b627f5bf54-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.997022 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shv8q\" (UniqueName: \"kubernetes.io/projected/67ba6f6d-b3da-4f93-8af6-22537c56e223-kube-api-access-shv8q\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.997031 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a9072a-1e29-4655-aec9-a7bf54ba24f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:13 crc kubenswrapper[4707]: I0121 15:42:13.999007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2c49ae-8633-4865-9d20-03b627f5bf54-kube-api-access-ngmx9" (OuterVolumeSpecName: "kube-api-access-ngmx9") pod "2b2c49ae-8633-4865-9d20-03b627f5bf54" (UID: "2b2c49ae-8633-4865-9d20-03b627f5bf54"). 
InnerVolumeSpecName "kube-api-access-ngmx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.099743 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngmx9\" (UniqueName: \"kubernetes.io/projected/2b2c49ae-8633-4865-9d20-03b627f5bf54-kube-api-access-ngmx9\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.406401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-s78zw" event={"ID":"0219fe68-3dcf-4e10-88e8-6de2bd228897","Type":"ContainerDied","Data":"decd4de16ce61d26f1cb8dec09dd4895d4d0782cbdb2be70148e7b773364db30"} Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.406445 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="decd4de16ce61d26f1cb8dec09dd4895d4d0782cbdb2be70148e7b773364db30" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.406420 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-s78zw" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.407997 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.407999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs" event={"ID":"67ba6f6d-b3da-4f93-8af6-22537c56e223","Type":"ContainerDied","Data":"f3c6084f8d66d3d46a1dbe7160a0ac80f1800ffb00f678655891e4ce6d4fa9b4"} Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.408113 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3c6084f8d66d3d46a1dbe7160a0ac80f1800ffb00f678655891e4ce6d4fa9b4" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.409955 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-lblpk" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.409966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-lblpk" event={"ID":"61a9072a-1e29-4655-aec9-a7bf54ba24f9","Type":"ContainerDied","Data":"d57092669d7b470706648ec1c938ca672634cf8e9d48abb5573653c06c550de3"} Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.409986 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57092669d7b470706648ec1c938ca672634cf8e9d48abb5573653c06c550de3" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.412516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" event={"ID":"596ebb84-1e59-4793-804c-225d9a9d1794","Type":"ContainerDied","Data":"2581aa3bcc3913bff32e2673d662924427802469ebf8b8051854dfdb4fbe5a9f"} Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.412531 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.412555 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2581aa3bcc3913bff32e2673d662924427802469ebf8b8051854dfdb4fbe5a9f" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.414095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" event={"ID":"2b2c49ae-8633-4865-9d20-03b627f5bf54","Type":"ContainerDied","Data":"2688417d34c73ee641e3a6c1ed303870a38da9b8dd331d3c1aab9d4c5f1aefbe"} Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.414126 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2688417d34c73ee641e3a6c1ed303870a38da9b8dd331d3c1aab9d4c5f1aefbe" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.414230 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.651324 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.711571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-combined-ca-bundle\") pod \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.711857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-config-data\") pod \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.711908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fpkw\" (UniqueName: \"kubernetes.io/projected/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-kube-api-access-8fpkw\") pod \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\" (UID: \"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d\") " Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.716089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-kube-api-access-8fpkw" (OuterVolumeSpecName: "kube-api-access-8fpkw") pod "4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" (UID: "4f1e87e4-8e97-4b83-94c8-91505cd5dd0d"). InnerVolumeSpecName "kube-api-access-8fpkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.733799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" (UID: "4f1e87e4-8e97-4b83-94c8-91505cd5dd0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.748370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-config-data" (OuterVolumeSpecName: "config-data") pod "4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" (UID: "4f1e87e4-8e97-4b83-94c8-91505cd5dd0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.814353 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.814387 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fpkw\" (UniqueName: \"kubernetes.io/projected/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-kube-api-access-8fpkw\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:14 crc kubenswrapper[4707]: I0121 15:42:14.814399 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.425182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" event={"ID":"4f1e87e4-8e97-4b83-94c8-91505cd5dd0d","Type":"ContainerDied","Data":"3c22b50d03135a723117a24f1d5940884c1710606f5e5d812654a7fc4c3d5aff"} Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.425226 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c22b50d03135a723117a24f1d5940884c1710606f5e5d812654a7fc4c3d5aff" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.426178 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vgsx9" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763263 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-hffzr"] Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763538 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0219fe68-3dcf-4e10-88e8-6de2bd228897" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763550 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0219fe68-3dcf-4e10-88e8-6de2bd228897" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2c49ae-8633-4865-9d20-03b627f5bf54" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763570 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2c49ae-8633-4865-9d20-03b627f5bf54" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763577 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a9072a-1e29-4655-aec9-a7bf54ba24f9" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763582 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a9072a-1e29-4655-aec9-a7bf54ba24f9" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763589 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" containerName="keystone-db-sync" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763594 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" containerName="keystone-db-sync" Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763610 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ebb84-1e59-4793-804c-225d9a9d1794" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763616 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ebb84-1e59-4793-804c-225d9a9d1794" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763625 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03df5afa-682c-4caa-a0ab-091fb6c7fa11" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763630 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03df5afa-682c-4caa-a0ab-091fb6c7fa11" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: E0121 15:42:15.763640 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ba6f6d-b3da-4f93-8af6-22537c56e223" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763644 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ba6f6d-b3da-4f93-8af6-22537c56e223" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763789 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ba6f6d-b3da-4f93-8af6-22537c56e223" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763799 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2c49ae-8633-4865-9d20-03b627f5bf54" 
containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763824 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ebb84-1e59-4793-804c-225d9a9d1794" containerName="mariadb-account-create-update" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763834 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0219fe68-3dcf-4e10-88e8-6de2bd228897" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763842 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03df5afa-682c-4caa-a0ab-091fb6c7fa11" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763850 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a9072a-1e29-4655-aec9-a7bf54ba24f9" containerName="mariadb-database-create" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.763865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" containerName="keystone-db-sync" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.764441 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.768166 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.769438 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.769680 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-xxph7" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.769875 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.773775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-hffzr"] Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.778200 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.831873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-combined-ca-bundle\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.831915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-config-data\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.831952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqmtx\" (UniqueName: \"kubernetes.io/projected/22c21363-b8c6-45cb-842a-122bb6f64da5-kube-api-access-xqmtx\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.832188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-credential-keys\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.832413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-fernet-keys\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.832490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-scripts\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.876738 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.878573 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.880240 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.880313 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.892235 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.934491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-credential-keys\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.934566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-fernet-keys\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.934606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-scripts\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.934649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-combined-ca-bundle\") pod 
\"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.934669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-config-data\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.934697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqmtx\" (UniqueName: \"kubernetes.io/projected/22c21363-b8c6-45cb-842a-122bb6f64da5-kube-api-access-xqmtx\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.945071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-credential-keys\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.945189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-combined-ca-bundle\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.947323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-fernet-keys\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.949020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-config-data\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.956191 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-v88c2"] Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.957299 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.961855 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.962169 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rrwvr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.962302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.971018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-scripts\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.972191 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-v88c2"] Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.982848 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-2qlj6"] Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.984108 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.986366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqmtx\" (UniqueName: \"kubernetes.io/projected/22c21363-b8c6-45cb-842a-122bb6f64da5-kube-api-access-xqmtx\") pod \"keystone-bootstrap-hffzr\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.987196 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.987387 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:42:15 crc kubenswrapper[4707]: I0121 15:42:15.987501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-r9bl4" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.004122 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hj9wf"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.005210 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.006467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.007161 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-nddnq" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.007351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.021840 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-2qlj6"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.027632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hj9wf"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.036850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-scripts\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.036913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-scripts\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-db-sync-config-data\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-config-data\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-log-httpd\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-combined-ca-bundle\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzspt\" (UniqueName: \"kubernetes.io/projected/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-kube-api-access-wzspt\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-config-data\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-run-httpd\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xmv\" (UniqueName: \"kubernetes.io/projected/d37f54bf-831c-4586-b825-7a4e57bc1878-kube-api-access-l5xmv\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.037891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d37f54bf-831c-4586-b825-7a4e57bc1878-etc-machine-id\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.081336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.098410 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qzljp"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.099797 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.104300 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.104506 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-h2gkl" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.107423 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qzljp"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-log-httpd\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-scripts\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-combined-ca-bundle\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzspt\" (UniqueName: \"kubernetes.io/projected/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-kube-api-access-wzspt\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrrzv\" (UniqueName: \"kubernetes.io/projected/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-kube-api-access-qrrzv\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-config-data\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-run-httpd\") pod 
\"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c697467-33ed-4d7c-9496-3e6961ddf8f4-logs\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-combined-ca-bundle\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xmv\" (UniqueName: \"kubernetes.io/projected/d37f54bf-831c-4586-b825-7a4e57bc1878-kube-api-access-l5xmv\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d37f54bf-831c-4586-b825-7a4e57bc1878-etc-machine-id\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpd6j\" (UniqueName: \"kubernetes.io/projected/3c697467-33ed-4d7c-9496-3e6961ddf8f4-kube-api-access-vpd6j\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-scripts\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-scripts\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-config-data\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-db-sync-config-data\") pod \"cinder-db-sync-v88c2\" (UID: 
\"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-combined-ca-bundle\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-config-data\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.141692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-config\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.142115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-log-httpd\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.142564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d37f54bf-831c-4586-b825-7a4e57bc1878-etc-machine-id\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.146047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-scripts\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.148274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-run-httpd\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.150510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-scripts\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.151654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-config-data\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.152290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-db-sync-config-data\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.152664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.153711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-combined-ca-bundle\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.155577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-config-data\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.156167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.164450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzspt\" (UniqueName: \"kubernetes.io/projected/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-kube-api-access-wzspt\") pod \"ceilometer-0\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.166533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5xmv\" (UniqueName: \"kubernetes.io/projected/d37f54bf-831c-4586-b825-7a4e57bc1878-kube-api-access-l5xmv\") pod \"cinder-db-sync-v88c2\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.199576 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrrzv\" (UniqueName: \"kubernetes.io/projected/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-kube-api-access-qrrzv\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-combined-ca-bundle\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c697467-33ed-4d7c-9496-3e6961ddf8f4-logs\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-combined-ca-bundle\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczz8\" (UniqueName: \"kubernetes.io/projected/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-kube-api-access-zczz8\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpd6j\" (UniqueName: \"kubernetes.io/projected/3c697467-33ed-4d7c-9496-3e6961ddf8f4-kube-api-access-vpd6j\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-config-data\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244681 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-db-sync-config-data\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244721 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-combined-ca-bundle\") pod 
\"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-config\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.244795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-scripts\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.245701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c697467-33ed-4d7c-9496-3e6961ddf8f4-logs\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.252185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-config\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.252393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-combined-ca-bundle\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.253859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-scripts\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.256757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-config-data\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.262529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-combined-ca-bundle\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.264943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpd6j\" (UniqueName: \"kubernetes.io/projected/3c697467-33ed-4d7c-9496-3e6961ddf8f4-kube-api-access-vpd6j\") pod \"placement-db-sync-hj9wf\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 
15:42:16.268446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrrzv\" (UniqueName: \"kubernetes.io/projected/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-kube-api-access-qrrzv\") pod \"neutron-db-sync-2qlj6\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.341763 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.347182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-combined-ca-bundle\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.347243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczz8\" (UniqueName: \"kubernetes.io/projected/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-kube-api-access-zczz8\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.347363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-db-sync-config-data\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.354248 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.357741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-db-sync-config-data\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.357928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-combined-ca-bundle\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.363032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczz8\" (UniqueName: \"kubernetes.io/projected/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-kube-api-access-zczz8\") pod \"barbican-db-sync-qzljp\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.369484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.521412 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.535899 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-hffzr"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.627918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:16 crc kubenswrapper[4707]: W0121 15:42:16.641567 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2c6971_ade0_4913_ad6a_4125f1c5b0ba.slice/crio-9d69f3363bacd103cdaa01477cbf799f8a004de30c8863e98fa714f3d0c3103e WatchSource:0}: Error finding container 9d69f3363bacd103cdaa01477cbf799f8a004de30c8863e98fa714f3d0c3103e: Status 404 returned error can't find the container with id 9d69f3363bacd103cdaa01477cbf799f8a004de30c8863e98fa714f3d0c3103e Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.647533 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.780624 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-2qlj6"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.829923 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.834140 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.838240 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.839340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.839583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.840982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.841119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-lbfbl" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.880846 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-v88c2"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.887039 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hj9wf"] Jan 21 15:42:16 crc kubenswrapper[4707]: W0121 15:42:16.887336 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd37f54bf_831c_4586_b825_7a4e57bc1878.slice/crio-fe8712870ec53838007ccbf2394d257d6878921faa73d22f33c349c2e3168a2c WatchSource:0}: Error finding container fe8712870ec53838007ccbf2394d257d6878921faa73d22f33c349c2e3168a2c: Status 404 returned error can't find the container with id fe8712870ec53838007ccbf2394d257d6878921faa73d22f33c349c2e3168a2c Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.892520 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.894233 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.896894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.896989 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.897329 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-config-data\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-logs\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.961997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-logs\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-scripts\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvp6\" (UniqueName: \"kubernetes.io/projected/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-kube-api-access-fdvp6\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkw66\" (UniqueName: \"kubernetes.io/projected/c25a44c1-0278-48f2-9096-ad13159c4e22-kube-api-access-wkw66\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 
15:42:16 crc kubenswrapper[4707]: I0121 15:42:16.962246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.034101 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qzljp"] Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-logs\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-scripts\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvp6\" (UniqueName: \"kubernetes.io/projected/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-kube-api-access-fdvp6\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkw66\" (UniqueName: 
\"kubernetes.io/projected/c25a44c1-0278-48f2-9096-ad13159c4e22-kube-api-access-wkw66\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-config-data\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-logs\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.064996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.065134 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.065221 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.065494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.066121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-logs\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.067380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-logs\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.072483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-config-data\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.075359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-scripts\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.075474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.075486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.075540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.075844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.079431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.079684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.080929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvp6\" (UniqueName: \"kubernetes.io/projected/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-kube-api-access-fdvp6\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.085464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkw66\" (UniqueName: \"kubernetes.io/projected/c25a44c1-0278-48f2-9096-ad13159c4e22-kube-api-access-wkw66\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.092366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.095342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.163784 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.320638 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.458666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerStarted","Data":"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.458940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerStarted","Data":"9d69f3363bacd103cdaa01477cbf799f8a004de30c8863e98fa714f3d0c3103e"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.459866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" event={"ID":"3c697467-33ed-4d7c-9496-3e6961ddf8f4","Type":"ContainerStarted","Data":"03daf8802e87e335374564e2a9872eb53275865b06e92ef26781c039f459beb8"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.459892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" event={"ID":"3c697467-33ed-4d7c-9496-3e6961ddf8f4","Type":"ContainerStarted","Data":"0c48bcb41df55322deb8e7b4c7c4221953d7185807dc46631b0643f3af43f2b2"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.464392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" event={"ID":"d37f54bf-831c-4586-b825-7a4e57bc1878","Type":"ContainerStarted","Data":"fe8712870ec53838007ccbf2394d257d6878921faa73d22f33c349c2e3168a2c"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.475385 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" podStartSLOduration=2.475369308 podStartE2EDuration="2.475369308s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:17.475140318 +0000 UTC m=+2434.656656540" watchObservedRunningTime="2026-01-21 15:42:17.475369308 +0000 UTC m=+2434.656885530" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.476507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" event={"ID":"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe","Type":"ContainerStarted","Data":"d28821461c495d577ce7377449af1a5dcc20ad597a38321ba6e299be5f826bdd"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.476540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" event={"ID":"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe","Type":"ContainerStarted","Data":"ad14e0c6af61c27eb75d57a195373dce9bf0aac46b2df1381a8609e2296117f7"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.490052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" event={"ID":"22c21363-b8c6-45cb-842a-122bb6f64da5","Type":"ContainerStarted","Data":"4d471870a8d9c652e026de3e74df2f68fd04789c57f3460a8efb5a5831b71ff5"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.490081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" event={"ID":"22c21363-b8c6-45cb-842a-122bb6f64da5","Type":"ContainerStarted","Data":"73b45b6f796e87fc781823e1ee177cfa513b8a5d761c2c1cabd8afbffa863d9f"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.494053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" event={"ID":"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b","Type":"ContainerStarted","Data":"9e4452792bdd8e7c7134ffe07b84894dcaa2be63135d758bc283c8b2eb863402"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.494110 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" event={"ID":"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b","Type":"ContainerStarted","Data":"a7b914d581b6ebee597b8f5d13c114bb7ebf9025517d608337a97150c2196fe2"} Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.496265 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" podStartSLOduration=2.496247342 podStartE2EDuration="2.496247342s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:17.489342039 +0000 UTC m=+2434.670858261" watchObservedRunningTime="2026-01-21 15:42:17.496247342 +0000 UTC m=+2434.677763564" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.525469 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" podStartSLOduration=1.525450857 podStartE2EDuration="1.525450857s" podCreationTimestamp="2026-01-21 15:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:17.524086541 +0000 UTC m=+2434.705602764" watchObservedRunningTime="2026-01-21 15:42:17.525450857 +0000 UTC m=+2434.706967079" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.526350 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" podStartSLOduration=2.526329669 podStartE2EDuration="2.526329669s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:17.508047638 +0000 UTC m=+2434.689563859" watchObservedRunningTime="2026-01-21 15:42:17.526329669 +0000 UTC m=+2434.707845891" Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.603097 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:17 crc kubenswrapper[4707]: I0121 15:42:17.755622 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:17 crc kubenswrapper[4707]: W0121 15:42:17.760246 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25a44c1_0278_48f2_9096_ad13159c4e22.slice/crio-d97b61f4337fea860677a852278984d1b33ccc548ff4f000e8707e0a17610de1 WatchSource:0}: Error finding container d97b61f4337fea860677a852278984d1b33ccc548ff4f000e8707e0a17610de1: Status 404 returned error can't find the container with id d97b61f4337fea860677a852278984d1b33ccc548ff4f000e8707e0a17610de1 Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.279189 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.318371 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.385210 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.514873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerStarted","Data":"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07"} Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.518742 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c25a44c1-0278-48f2-9096-ad13159c4e22","Type":"ContainerStarted","Data":"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5"} Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.518794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c25a44c1-0278-48f2-9096-ad13159c4e22","Type":"ContainerStarted","Data":"d97b61f4337fea860677a852278984d1b33ccc548ff4f000e8707e0a17610de1"} Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.522131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"959593d7-b9c1-4d7a-90c8-51cdd8b05b08","Type":"ContainerStarted","Data":"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee"} Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.522189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"959593d7-b9c1-4d7a-90c8-51cdd8b05b08","Type":"ContainerStarted","Data":"da03fbb46888a158457469d606e82c22db25c9e2a78d247fcc8d49baf8d19674"} Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.526452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" event={"ID":"d37f54bf-831c-4586-b825-7a4e57bc1878","Type":"ContainerStarted","Data":"792ced9ce31de625df8513afd5d8450974231374cbbc8b4faa4e06145c17e003"} Jan 21 15:42:18 crc kubenswrapper[4707]: I0121 15:42:18.543493 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" podStartSLOduration=3.543476809 podStartE2EDuration="3.543476809s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:18.541713484 +0000 UTC m=+2435.723229707" watchObservedRunningTime="2026-01-21 15:42:18.543476809 +0000 UTC m=+2435.724993031" Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.536364 4707 generic.go:334] "Generic (PLEG): container finished" podID="22c21363-b8c6-45cb-842a-122bb6f64da5" containerID="4d471870a8d9c652e026de3e74df2f68fd04789c57f3460a8efb5a5831b71ff5" exitCode=0 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.536451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" event={"ID":"22c21363-b8c6-45cb-842a-122bb6f64da5","Type":"ContainerDied","Data":"4d471870a8d9c652e026de3e74df2f68fd04789c57f3460a8efb5a5831b71ff5"} Jan 21 15:42:19 
crc kubenswrapper[4707]: I0121 15:42:19.538440 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" containerID="9e4452792bdd8e7c7134ffe07b84894dcaa2be63135d758bc283c8b2eb863402" exitCode=0 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.538525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" event={"ID":"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b","Type":"ContainerDied","Data":"9e4452792bdd8e7c7134ffe07b84894dcaa2be63135d758bc283c8b2eb863402"} Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.544671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerStarted","Data":"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9"} Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.549725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c25a44c1-0278-48f2-9096-ad13159c4e22","Type":"ContainerStarted","Data":"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b"} Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.549804 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-log" containerID="cri-o://917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5" gracePeriod=30 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.549908 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-httpd" containerID="cri-o://3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b" gracePeriod=30 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.560555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"959593d7-b9c1-4d7a-90c8-51cdd8b05b08","Type":"ContainerStarted","Data":"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca"} Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.560682 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-log" containerID="cri-o://626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee" gracePeriod=30 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.560891 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-httpd" containerID="cri-o://f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca" gracePeriod=30 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.565993 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c697467-33ed-4d7c-9496-3e6961ddf8f4" containerID="03daf8802e87e335374564e2a9872eb53275865b06e92ef26781c039f459beb8" exitCode=0 Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.566060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" 
event={"ID":"3c697467-33ed-4d7c-9496-3e6961ddf8f4","Type":"ContainerDied","Data":"03daf8802e87e335374564e2a9872eb53275865b06e92ef26781c039f459beb8"} Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.588330 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.588321123 podStartE2EDuration="4.588321123s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:19.584155572 +0000 UTC m=+2436.765671793" watchObservedRunningTime="2026-01-21 15:42:19.588321123 +0000 UTC m=+2436.769837344" Jan 21 15:42:19 crc kubenswrapper[4707]: I0121 15:42:19.622516 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.622496817 podStartE2EDuration="4.622496817s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:19.61505241 +0000 UTC m=+2436.796568642" watchObservedRunningTime="2026-01-21 15:42:19.622496817 +0000 UTC m=+2436.804013029" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.131763 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.136375 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-httpd-run\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-combined-ca-bundle\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-scripts\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-public-tls-certs\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238752 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-scripts\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-logs\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-httpd-run\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkw66\" (UniqueName: \"kubernetes.io/projected/c25a44c1-0278-48f2-9096-ad13159c4e22-kube-api-access-wkw66\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-internal-tls-certs\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-logs\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-config-data\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-combined-ca-bundle\") pod \"c25a44c1-0278-48f2-9096-ad13159c4e22\" (UID: \"c25a44c1-0278-48f2-9096-ad13159c4e22\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-config-data\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.238984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvp6\" (UniqueName: 
\"kubernetes.io/projected/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-kube-api-access-fdvp6\") pod \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\" (UID: \"959593d7-b9c1-4d7a-90c8-51cdd8b05b08\") " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.239939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.241150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.241318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-logs" (OuterVolumeSpecName: "logs") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.243180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-logs" (OuterVolumeSpecName: "logs") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.258373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25a44c1-0278-48f2-9096-ad13159c4e22-kube-api-access-wkw66" (OuterVolumeSpecName: "kube-api-access-wkw66") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "kube-api-access-wkw66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.263430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-scripts" (OuterVolumeSpecName: "scripts") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.263695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.268607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.271881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.272207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-kube-api-access-fdvp6" (OuterVolumeSpecName: "kube-api-access-fdvp6") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "kube-api-access-fdvp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.277197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.277210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-scripts" (OuterVolumeSpecName: "scripts") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.281525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-config-data" (OuterVolumeSpecName: "config-data") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.287617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-config-data" (OuterVolumeSpecName: "config-data") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.294641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "959593d7-b9c1-4d7a-90c8-51cdd8b05b08" (UID: "959593d7-b9c1-4d7a-90c8-51cdd8b05b08"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.296038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c25a44c1-0278-48f2-9096-ad13159c4e22" (UID: "c25a44c1-0278-48f2-9096-ad13159c4e22"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343036 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343077 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343106 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343117 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkw66\" (UniqueName: \"kubernetes.io/projected/c25a44c1-0278-48f2-9096-ad13159c4e22-kube-api-access-wkw66\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343129 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343137 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343147 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343181 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343192 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343201 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343211 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvp6\" (UniqueName: \"kubernetes.io/projected/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-kube-api-access-fdvp6\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343220 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c25a44c1-0278-48f2-9096-ad13159c4e22-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.343236 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.344544 4707 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.344581 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c25a44c1-0278-48f2-9096-ad13159c4e22-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.344597 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/959593d7-b9c1-4d7a-90c8-51cdd8b05b08-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.367762 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.375712 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.447193 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.447227 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577702 4707 generic.go:334] "Generic (PLEG): container finished" podID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerID="3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b" exitCode=0 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577750 4707 generic.go:334] "Generic (PLEG): container finished" podID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerID="917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5" exitCode=143 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c25a44c1-0278-48f2-9096-ad13159c4e22","Type":"ContainerDied","Data":"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c25a44c1-0278-48f2-9096-ad13159c4e22","Type":"ContainerDied","Data":"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577850 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577881 4707 scope.go:117] "RemoveContainer" containerID="3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.577866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c25a44c1-0278-48f2-9096-ad13159c4e22","Type":"ContainerDied","Data":"d97b61f4337fea860677a852278984d1b33ccc548ff4f000e8707e0a17610de1"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.580053 4707 generic.go:334] "Generic (PLEG): container finished" podID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerID="f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca" exitCode=0 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.580087 4707 generic.go:334] "Generic (PLEG): container finished" podID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerID="626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee" exitCode=143 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.580109 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.580157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"959593d7-b9c1-4d7a-90c8-51cdd8b05b08","Type":"ContainerDied","Data":"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.580188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"959593d7-b9c1-4d7a-90c8-51cdd8b05b08","Type":"ContainerDied","Data":"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.580200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"959593d7-b9c1-4d7a-90c8-51cdd8b05b08","Type":"ContainerDied","Data":"da03fbb46888a158457469d606e82c22db25c9e2a78d247fcc8d49baf8d19674"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.583570 4707 generic.go:334] "Generic (PLEG): container finished" podID="d37f54bf-831c-4586-b825-7a4e57bc1878" containerID="792ced9ce31de625df8513afd5d8450974231374cbbc8b4faa4e06145c17e003" exitCode=0 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.583637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" event={"ID":"d37f54bf-831c-4586-b825-7a4e57bc1878","Type":"ContainerDied","Data":"792ced9ce31de625df8513afd5d8450974231374cbbc8b4faa4e06145c17e003"} Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.586589 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-central-agent" containerID="cri-o://6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" gracePeriod=30 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.586757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerStarted","Data":"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea"} Jan 21 15:42:20 crc 
kubenswrapper[4707]: I0121 15:42:20.586832 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="sg-core" containerID="cri-o://b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" gracePeriod=30 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.586864 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-notification-agent" containerID="cri-o://efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" gracePeriod=30 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.586910 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="proxy-httpd" containerID="cri-o://5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" gracePeriod=30 Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.586939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.613038 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9171902950000002 podStartE2EDuration="5.613020462s" podCreationTimestamp="2026-01-21 15:42:15 +0000 UTC" firstStartedPulling="2026-01-21 15:42:16.647240933 +0000 UTC m=+2433.828757155" lastFinishedPulling="2026-01-21 15:42:20.3430711 +0000 UTC m=+2437.524587322" observedRunningTime="2026-01-21 15:42:20.609893824 +0000 UTC m=+2437.791410045" watchObservedRunningTime="2026-01-21 15:42:20.613020462 +0000 UTC m=+2437.794536683" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.626749 4707 scope.go:117] "RemoveContainer" containerID="917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.651867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.663515 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.677386 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.681370 4707 scope.go:117] "RemoveContainer" containerID="3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.681482 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.684851 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b\": container with ID starting with 3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b not found: ID does not exist" containerID="3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.684893 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b"} err="failed to get container status \"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b\": rpc error: code = NotFound desc = could not find container \"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b\": container with ID starting with 3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.684916 4707 scope.go:117] "RemoveContainer" containerID="917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.685187 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5\": container with ID starting with 917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5 not found: ID does not exist" containerID="917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.685209 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5"} err="failed to get container status \"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5\": rpc error: code = NotFound desc = could not find container \"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5\": container with ID starting with 917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5 not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.685223 4707 scope.go:117] "RemoveContainer" containerID="3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.687001 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b"} err="failed to get container status \"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b\": rpc error: code = NotFound desc = could not find container \"3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b\": container with ID starting with 3633e1e9de16436beecdc557536263b167b417ccc21a7338734c1ba7b15f549b not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.687026 4707 scope.go:117] "RemoveContainer" containerID="917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.691839 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5"} err="failed to get container status \"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5\": rpc error: code = NotFound desc = could not find container \"917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5\": container with ID starting with 917b0d89a67391327ff8ae296a8cfc2efde720d9832d6f432c7687cef0a0f0b5 not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.691858 4707 scope.go:117] "RemoveContainer" containerID="f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.707788 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.708184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-httpd" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708206 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-httpd" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.708229 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-log" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708238 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-log" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.708251 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-httpd" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-httpd" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.708283 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-log" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708289 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-log" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708450 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-httpd" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708463 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-httpd" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" containerName="glance-log" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.708494 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" containerName="glance-log" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.709402 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.717476 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.717759 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-lbfbl" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.718055 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.718186 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.718324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.718851 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.723390 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.723769 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.728353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.737190 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.749984 4707 scope.go:117] "RemoveContainer" containerID="626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.752378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.752419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgphk\" (UniqueName: \"kubernetes.io/projected/b423a914-b340-4f8f-ae6a-f55acfccd5da-kube-api-access-vgphk\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.752472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.752516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.752531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-scripts\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.752550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.754049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-logs\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.754090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-config-data\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.793037 4707 scope.go:117] "RemoveContainer" containerID="f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.793705 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca\": container with ID starting with f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca not found: ID does not exist" containerID="f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.793750 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca"} err="failed to get container status \"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca\": rpc error: code = NotFound desc = could not find container \"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca\": container with ID starting with f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.793771 4707 scope.go:117] "RemoveContainer" containerID="626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.794218 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee\": 
container with ID starting with 626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee not found: ID does not exist" containerID="626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.794247 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee"} err="failed to get container status \"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee\": rpc error: code = NotFound desc = could not find container \"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee\": container with ID starting with 626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.794279 4707 scope.go:117] "RemoveContainer" containerID="f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.794497 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca"} err="failed to get container status \"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca\": rpc error: code = NotFound desc = could not find container \"f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca\": container with ID starting with f05e26326166520906abd6bf3c745714f65d6144a893b1760bff4da95888c6ca not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.794520 4707 scope.go:117] "RemoveContainer" containerID="626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.794768 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee"} err="failed to get container status \"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee\": rpc error: code = NotFound desc = could not find container \"626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee\": container with ID starting with 626f549f1e469061aa30a13aff48c9062d3f5598c66f107ceebe0d58f5c91bee not found: ID does not exist" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.856106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.856275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.856361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgphk\" (UniqueName: \"kubernetes.io/projected/b423a914-b340-4f8f-ae6a-f55acfccd5da-kube-api-access-vgphk\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcnjk\" (UniqueName: \"kubernetes.io/projected/f762786d-4f31-412e-a685-b196f3cf3fea-kube-api-access-xcnjk\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-logs\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-scripts\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-logs\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857781 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.863728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.866244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-config-data\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.869480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.870232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.871942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-logs\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.857789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-config-data\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.874412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgphk\" (UniqueName: \"kubernetes.io/projected/b423a914-b340-4f8f-ae6a-f55acfccd5da-kube-api-access-vgphk\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.883337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-scripts\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.897652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: E0121 15:42:20.922183 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2c6971_ade0_4913_ad6a_4125f1c5b0ba.slice/crio-6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2c6971_ade0_4913_ad6a_4125f1c5b0ba.slice/crio-conmon-6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.964911 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.978711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.980847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcnjk\" (UniqueName: \"kubernetes.io/projected/f762786d-4f31-412e-a685-b196f3cf3fea-kube-api-access-xcnjk\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.980883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.980913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.980958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-logs\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.981040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.981070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.981134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.981421 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"f762786d-4f31-412e-a685-b196f3cf3fea\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.981472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-logs\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.981921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.985101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.987628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:20 crc kubenswrapper[4707]: I0121 15:42:20.988170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.000980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.009071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcnjk\" (UniqueName: \"kubernetes.io/projected/f762786d-4f31-412e-a685-b196f3cf3fea-kube-api-access-xcnjk\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.022041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.048487 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.061325 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.082654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-db-sync-config-data\") pod \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.082748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zczz8\" (UniqueName: \"kubernetes.io/projected/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-kube-api-access-zczz8\") pod \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.082791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-combined-ca-bundle\") pod \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\" (UID: \"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.085727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" (UID: "f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.086075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-kube-api-access-zczz8" (OuterVolumeSpecName: "kube-api-access-zczz8") pod "f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" (UID: "f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b"). InnerVolumeSpecName "kube-api-access-zczz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.102177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" (UID: "f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.194828 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.194872 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zczz8\" (UniqueName: \"kubernetes.io/projected/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-kube-api-access-zczz8\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.194884 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.200965 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.201129 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.261738 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959593d7-b9c1-4d7a-90c8-51cdd8b05b08" path="/var/lib/kubelet/pods/959593d7-b9c1-4d7a-90c8-51cdd8b05b08/volumes" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.262360 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25a44c1-0278-48f2-9096-ad13159c4e22" path="/var/lib/kubelet/pods/c25a44c1-0278-48f2-9096-ad13159c4e22/volumes" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.269766 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.270209 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-config-data\") pod \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-scripts\") pod \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-config-data\") pod \"22c21363-b8c6-45cb-842a-122bb6f64da5\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-credential-keys\") pod \"22c21363-b8c6-45cb-842a-122bb6f64da5\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-scripts\") pod \"22c21363-b8c6-45cb-842a-122bb6f64da5\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpd6j\" (UniqueName: \"kubernetes.io/projected/3c697467-33ed-4d7c-9496-3e6961ddf8f4-kube-api-access-vpd6j\") pod \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c697467-33ed-4d7c-9496-3e6961ddf8f4-logs\") pod \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-fernet-keys\") pod \"22c21363-b8c6-45cb-842a-122bb6f64da5\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqmtx\" (UniqueName: \"kubernetes.io/projected/22c21363-b8c6-45cb-842a-122bb6f64da5-kube-api-access-xqmtx\") pod \"22c21363-b8c6-45cb-842a-122bb6f64da5\" (UID: \"22c21363-b8c6-45cb-842a-122bb6f64da5\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-combined-ca-bundle\") pod \"22c21363-b8c6-45cb-842a-122bb6f64da5\" (UID: 
\"22c21363-b8c6-45cb-842a-122bb6f64da5\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.405753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-combined-ca-bundle\") pod \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\" (UID: \"3c697467-33ed-4d7c-9496-3e6961ddf8f4\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.408214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c697467-33ed-4d7c-9496-3e6961ddf8f4-logs" (OuterVolumeSpecName: "logs") pod "3c697467-33ed-4d7c-9496-3e6961ddf8f4" (UID: "3c697467-33ed-4d7c-9496-3e6961ddf8f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.411089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-scripts" (OuterVolumeSpecName: "scripts") pod "3c697467-33ed-4d7c-9496-3e6961ddf8f4" (UID: "3c697467-33ed-4d7c-9496-3e6961ddf8f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.413930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-scripts" (OuterVolumeSpecName: "scripts") pod "22c21363-b8c6-45cb-842a-122bb6f64da5" (UID: "22c21363-b8c6-45cb-842a-122bb6f64da5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.414016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "22c21363-b8c6-45cb-842a-122bb6f64da5" (UID: "22c21363-b8c6-45cb-842a-122bb6f64da5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.416636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "22c21363-b8c6-45cb-842a-122bb6f64da5" (UID: "22c21363-b8c6-45cb-842a-122bb6f64da5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.417234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c697467-33ed-4d7c-9496-3e6961ddf8f4-kube-api-access-vpd6j" (OuterVolumeSpecName: "kube-api-access-vpd6j") pod "3c697467-33ed-4d7c-9496-3e6961ddf8f4" (UID: "3c697467-33ed-4d7c-9496-3e6961ddf8f4"). InnerVolumeSpecName "kube-api-access-vpd6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.418237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c21363-b8c6-45cb-842a-122bb6f64da5-kube-api-access-xqmtx" (OuterVolumeSpecName: "kube-api-access-xqmtx") pod "22c21363-b8c6-45cb-842a-122bb6f64da5" (UID: "22c21363-b8c6-45cb-842a-122bb6f64da5"). InnerVolumeSpecName "kube-api-access-xqmtx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.419032 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.431424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22c21363-b8c6-45cb-842a-122bb6f64da5" (UID: "22c21363-b8c6-45cb-842a-122bb6f64da5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.436292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c697467-33ed-4d7c-9496-3e6961ddf8f4" (UID: "3c697467-33ed-4d7c-9496-3e6961ddf8f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.446049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-config-data" (OuterVolumeSpecName: "config-data") pod "3c697467-33ed-4d7c-9496-3e6961ddf8f4" (UID: "3c697467-33ed-4d7c-9496-3e6961ddf8f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.454968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-config-data" (OuterVolumeSpecName: "config-data") pod "22c21363-b8c6-45cb-842a-122bb6f64da5" (UID: "22c21363-b8c6-45cb-842a-122bb6f64da5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-sg-core-conf-yaml\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-log-httpd\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-combined-ca-bundle\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-config-data\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-run-httpd\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-scripts\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzspt\" (UniqueName: \"kubernetes.io/projected/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-kube-api-access-wzspt\") pod \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\" (UID: \"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba\") " Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.508757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509010 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509033 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509043 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509057 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509066 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpd6j\" (UniqueName: \"kubernetes.io/projected/3c697467-33ed-4d7c-9496-3e6961ddf8f4-kube-api-access-vpd6j\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509076 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c697467-33ed-4d7c-9496-3e6961ddf8f4-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509087 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509096 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqmtx\" (UniqueName: \"kubernetes.io/projected/22c21363-b8c6-45cb-842a-122bb6f64da5-kube-api-access-xqmtx\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509104 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c21363-b8c6-45cb-842a-122bb6f64da5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509113 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509123 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509131 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.509140 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c697467-33ed-4d7c-9496-3e6961ddf8f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.510976 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-scripts" (OuterVolumeSpecName: "scripts") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.511584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-kube-api-access-wzspt" (OuterVolumeSpecName: "kube-api-access-wzspt") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "kube-api-access-wzspt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.526749 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.572002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.586551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-config-data" (OuterVolumeSpecName: "config-data") pod "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" (UID: "5f2c6971-ade0-4913-ad6a-4125f1c5b0ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.598110 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.598801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hj9wf" event={"ID":"3c697467-33ed-4d7c-9496-3e6961ddf8f4","Type":"ContainerDied","Data":"0c48bcb41df55322deb8e7b4c7c4221953d7185807dc46631b0643f3af43f2b2"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.598887 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c48bcb41df55322deb8e7b4c7c4221953d7185807dc46631b0643f3af43f2b2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.600959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" event={"ID":"22c21363-b8c6-45cb-842a-122bb6f64da5","Type":"ContainerDied","Data":"73b45b6f796e87fc781823e1ee177cfa513b8a5d761c2c1cabd8afbffa863d9f"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.600993 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73b45b6f796e87fc781823e1ee177cfa513b8a5d761c2c1cabd8afbffa863d9f" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.601059 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-hffzr" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.602744 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.602714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qzljp" event={"ID":"f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b","Type":"ContainerDied","Data":"a7b914d581b6ebee597b8f5d13c114bb7ebf9025517d608337a97150c2196fe2"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.602976 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7b914d581b6ebee597b8f5d13c114bb7ebf9025517d608337a97150c2196fe2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604753 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" exitCode=0 Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604772 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" exitCode=2 Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604782 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" exitCode=0 Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604790 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" exitCode=0 Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerDied","Data":"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerDied","Data":"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerDied","Data":"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerDied","Data":"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5f2c6971-ade0-4913-ad6a-4125f1c5b0ba","Type":"ContainerDied","Data":"9d69f3363bacd103cdaa01477cbf799f8a004de30c8863e98fa714f3d0c3103e"} Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604902 4707 scope.go:117] "RemoveContainer" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" 
Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.604997 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.613116 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.613145 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzspt\" (UniqueName: \"kubernetes.io/projected/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-kube-api-access-wzspt\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.613158 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.613167 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.613175 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.641639 4707 scope.go:117] "RemoveContainer" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.642384 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-hffzr"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.651741 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-hffzr"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.664088 4707 scope.go:117] "RemoveContainer" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.675439 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.689612 4707 scope.go:117] "RemoveContainer" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.689739 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.698782 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699162 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="sg-core" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699174 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="sg-core" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699186 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="proxy-httpd" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699193 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="proxy-httpd" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699202 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-notification-agent" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699208 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-notification-agent" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c697467-33ed-4d7c-9496-3e6961ddf8f4" containerName="placement-db-sync" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699232 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c697467-33ed-4d7c-9496-3e6961ddf8f4" containerName="placement-db-sync" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699244 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-central-agent" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699249 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-central-agent" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699273 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c21363-b8c6-45cb-842a-122bb6f64da5" containerName="keystone-bootstrap" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c21363-b8c6-45cb-842a-122bb6f64da5" containerName="keystone-bootstrap" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.699287 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" containerName="barbican-db-sync" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699293 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" containerName="barbican-db-sync" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699417 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c697467-33ed-4d7c-9496-3e6961ddf8f4" containerName="placement-db-sync" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699427 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="sg-core" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-notification-agent" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699448 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="proxy-httpd" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699457 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" containerName="barbican-db-sync" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699465 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" containerName="ceilometer-central-agent" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.699479 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c21363-b8c6-45cb-842a-122bb6f64da5" 
containerName="keystone-bootstrap" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.702322 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.714400 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.714677 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.722851 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.732428 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7xxw2"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.735970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.739177 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.739410 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.739798 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-xxph7" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.739955 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.741158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.741549 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-85b7676df9-ncnhq"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.742769 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.744137 4707 scope.go:117] "RemoveContainer" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.745457 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7xxw2"] Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.745590 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": container with ID starting with 5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea not found: ID does not exist" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.745620 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea"} err="failed to get container status \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": rpc error: code = NotFound desc = could not find container \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": container with ID starting with 5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.745645 4707 scope.go:117] "RemoveContainer" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.750656 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": container with ID starting with b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9 not found: ID does not exist" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.750708 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9"} err="failed to get container status \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": rpc error: code = NotFound desc = could not find container \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": container with ID starting with b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.750750 4707 scope.go:117] "RemoveContainer" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.751200 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.751401 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.751558 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-nddnq" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.751559 4707 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": container with ID starting with efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07 not found: ID does not exist" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.751875 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07"} err="failed to get container status \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": rpc error: code = NotFound desc = could not find container \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": container with ID starting with efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.751901 4707 scope.go:117] "RemoveContainer" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" Jan 21 15:42:21 crc kubenswrapper[4707]: E0121 15:42:21.754529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": container with ID starting with 6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e not found: ID does not exist" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.754570 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e"} err="failed to get container status \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": rpc error: code = NotFound desc = could not find container \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": container with ID starting with 6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.754595 4707 scope.go:117] "RemoveContainer" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.758316 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea"} err="failed to get container status \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": rpc error: code = NotFound desc = could not find container \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": container with ID starting with 5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.758353 4707 scope.go:117] "RemoveContainer" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.760406 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9"} err="failed to get container status \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": rpc error: code = NotFound desc = could not find container 
\"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": container with ID starting with b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.760436 4707 scope.go:117] "RemoveContainer" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.764077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07"} err="failed to get container status \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": rpc error: code = NotFound desc = could not find container \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": container with ID starting with efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.764107 4707 scope.go:117] "RemoveContainer" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.765873 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e"} err="failed to get container status \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": rpc error: code = NotFound desc = could not find container \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": container with ID starting with 6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.765906 4707 scope.go:117] "RemoveContainer" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.769075 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea"} err="failed to get container status \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": rpc error: code = NotFound desc = could not find container \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": container with ID starting with 5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.769105 4707 scope.go:117] "RemoveContainer" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.770354 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-85b7676df9-ncnhq"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.772988 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9"} err="failed to get container status \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": rpc error: code = NotFound desc = could not find container \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": container with ID starting with b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.773011 4707 scope.go:117] 
"RemoveContainer" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.786137 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07"} err="failed to get container status \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": rpc error: code = NotFound desc = could not find container \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": container with ID starting with efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.786163 4707 scope.go:117] "RemoveContainer" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.803500 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e"} err="failed to get container status \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": rpc error: code = NotFound desc = could not find container \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": container with ID starting with 6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.803532 4707 scope.go:117] "RemoveContainer" containerID="5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.805010 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea"} err="failed to get container status \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": rpc error: code = NotFound desc = could not find container \"5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea\": container with ID starting with 5a7d4bfea250c39ad16bdd648219634fbf629f27f57e97064cd22574062428ea not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.805031 4707 scope.go:117] "RemoveContainer" containerID="b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.805422 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9"} err="failed to get container status \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": rpc error: code = NotFound desc = could not find container \"b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9\": container with ID starting with b6f0c1e976b4ec4b95a3042873ee2b8b03c2a3dc2bc4c8843304bcccf804d7f9 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.805477 4707 scope.go:117] "RemoveContainer" containerID="efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.806746 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07"} err="failed to get container status \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": rpc error: code = NotFound desc 
= could not find container \"efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07\": container with ID starting with efb40ee4d15df092c7b9c2b16d007e29a2679fd3de53408018f3743ca78aaa07 not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.806784 4707 scope.go:117] "RemoveContainer" containerID="6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.812095 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e"} err="failed to get container status \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": rpc error: code = NotFound desc = could not find container \"6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e\": container with ID starting with 6c8ebda8fc1431d5202bd3497f5d05779f5ab5a9bfe7312eb05741bd9a12501e not found: ID does not exist" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrq7\" (UniqueName: \"kubernetes.io/projected/b86f97a2-7524-46a8-8e52-73a094fc7242-kube-api-access-2mrq7\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-combined-ca-bundle\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-run-httpd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-scripts\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-config-data\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-combined-ca-bundle\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822506 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-fernet-keys\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-log-httpd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-scripts\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-config-data\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-config-data\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-scripts\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vqd\" (UniqueName: \"kubernetes.io/projected/ac5d1294-8046-439a-a08b-5beea0684969-kube-api-access-k4vqd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86f97a2-7524-46a8-8e52-73a094fc7242-logs\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822678 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-credential-keys\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hmr\" (UniqueName: \"kubernetes.io/projected/28effb5a-773a-4084-8b1f-719eb7e4ea6f-kube-api-access-k6hmr\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.822709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.827047 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.856135 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.857206 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.857286 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.868853 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.870112 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.875139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.875375 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-h2gkl" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.889396 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.890749 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.896880 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.901234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-combined-ca-bundle\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-fernet-keys\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-log-httpd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdh98\" (UniqueName: \"kubernetes.io/projected/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-kube-api-access-kdh98\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data-custom\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-scripts\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-config-data\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-config-data\") pod \"keystone-bootstrap-7xxw2\" (UID: 
\"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-scripts\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vqd\" (UniqueName: \"kubernetes.io/projected/ac5d1294-8046-439a-a08b-5beea0684969-kube-api-access-k4vqd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86f97a2-7524-46a8-8e52-73a094fc7242-logs\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-credential-keys\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hmr\" (UniqueName: \"kubernetes.io/projected/28effb5a-773a-4084-8b1f-719eb7e4ea6f-kube-api-access-k6hmr\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.931992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-logs\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " 
pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.932020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrq7\" (UniqueName: \"kubernetes.io/projected/b86f97a2-7524-46a8-8e52-73a094fc7242-kube-api-access-2mrq7\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.932087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-combined-ca-bundle\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.932115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-run-httpd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.932158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-scripts\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.932177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-combined-ca-bundle\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.932212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-config-data\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.934683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86f97a2-7524-46a8-8e52-73a094fc7242-logs\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.940661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-log-httpd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.941305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-run-httpd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 
15:42:21.944288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-combined-ca-bundle\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.946986 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll"] Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.950107 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.954868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-config-data\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.956465 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.956752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-combined-ca-bundle\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.963553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-credential-keys\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.963680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.963837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-fernet-keys\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.964111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vqd\" (UniqueName: \"kubernetes.io/projected/ac5d1294-8046-439a-a08b-5beea0684969-kube-api-access-k4vqd\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.964134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hmr\" (UniqueName: \"kubernetes.io/projected/28effb5a-773a-4084-8b1f-719eb7e4ea6f-kube-api-access-k6hmr\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.965557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrq7\" (UniqueName: \"kubernetes.io/projected/b86f97a2-7524-46a8-8e52-73a094fc7242-kube-api-access-2mrq7\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.965895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-config-data\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.968961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-scripts\") pod \"keystone-bootstrap-7xxw2\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.969950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-scripts\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.970772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-scripts\") pod \"placement-85b7676df9-ncnhq\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.971195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.972724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-config-data\") pod \"ceilometer-0\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:21 crc kubenswrapper[4707]: I0121 15:42:21.976953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.019803 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.034291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-logs\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.038070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-logs\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.040205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.040302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data-custom\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp4dx\" (UniqueName: \"kubernetes.io/projected/10fd4722-3015-418c-8930-474edb0bdbc6-kube-api-access-kp4dx\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30955948-0578-447c-9c9d-73e489e54962-logs\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-combined-ca-bundle\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-combined-ca-bundle\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042589 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data-custom\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdh98\" (UniqueName: \"kubernetes.io/projected/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-kube-api-access-kdh98\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data-custom\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10fd4722-3015-418c-8930-474edb0bdbc6-logs\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-combined-ca-bundle\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw49n\" (UniqueName: \"kubernetes.io/projected/30955948-0578-447c-9c9d-73e489e54962-kube-api-access-rw49n\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.042983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.046174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-combined-ca-bundle\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.048165 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.048339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data-custom\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.053785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.062858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdh98\" (UniqueName: \"kubernetes.io/projected/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-kube-api-access-kdh98\") pod \"barbican-worker-86b7f55fc4-rft88\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.073198 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.101241 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.144537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-scripts\") pod \"d37f54bf-831c-4586-b825-7a4e57bc1878\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.144915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-config-data\") pod \"d37f54bf-831c-4586-b825-7a4e57bc1878\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.144961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-db-sync-config-data\") pod \"d37f54bf-831c-4586-b825-7a4e57bc1878\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-combined-ca-bundle\") pod \"d37f54bf-831c-4586-b825-7a4e57bc1878\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d37f54bf-831c-4586-b825-7a4e57bc1878-etc-machine-id\") pod \"d37f54bf-831c-4586-b825-7a4e57bc1878\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5xmv\" (UniqueName: \"kubernetes.io/projected/d37f54bf-831c-4586-b825-7a4e57bc1878-kube-api-access-l5xmv\") pod \"d37f54bf-831c-4586-b825-7a4e57bc1878\" (UID: \"d37f54bf-831c-4586-b825-7a4e57bc1878\") " Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-combined-ca-bundle\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data-custom\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10fd4722-3015-418c-8930-474edb0bdbc6-logs\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145626 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-combined-ca-bundle\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw49n\" (UniqueName: \"kubernetes.io/projected/30955948-0578-447c-9c9d-73e489e54962-kube-api-access-rw49n\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.145965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.146081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.146107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data-custom\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.146192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4dx\" (UniqueName: \"kubernetes.io/projected/10fd4722-3015-418c-8930-474edb0bdbc6-kube-api-access-kp4dx\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.146253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30955948-0578-447c-9c9d-73e489e54962-logs\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.146639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30955948-0578-447c-9c9d-73e489e54962-logs\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.147851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d37f54bf-831c-4586-b825-7a4e57bc1878-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d37f54bf-831c-4586-b825-7a4e57bc1878" (UID: 
"d37f54bf-831c-4586-b825-7a4e57bc1878"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.148608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10fd4722-3015-418c-8930-474edb0bdbc6-logs\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.152492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-scripts" (OuterVolumeSpecName: "scripts") pod "d37f54bf-831c-4586-b825-7a4e57bc1878" (UID: "d37f54bf-831c-4586-b825-7a4e57bc1878"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.161129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.161475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37f54bf-831c-4586-b825-7a4e57bc1878-kube-api-access-l5xmv" (OuterVolumeSpecName: "kube-api-access-l5xmv") pod "d37f54bf-831c-4586-b825-7a4e57bc1878" (UID: "d37f54bf-831c-4586-b825-7a4e57bc1878"). InnerVolumeSpecName "kube-api-access-l5xmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.162350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp4dx\" (UniqueName: \"kubernetes.io/projected/10fd4722-3015-418c-8930-474edb0bdbc6-kube-api-access-kp4dx\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.162408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.162908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d37f54bf-831c-4586-b825-7a4e57bc1878" (UID: "d37f54bf-831c-4586-b825-7a4e57bc1878"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.164267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw49n\" (UniqueName: \"kubernetes.io/projected/30955948-0578-447c-9c9d-73e489e54962-kube-api-access-rw49n\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.166518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-combined-ca-bundle\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.168186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data-custom\") pod \"barbican-api-5fbcc445ff-s28ll\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.193776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-combined-ca-bundle\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.197737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.200960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data-custom\") pod \"barbican-keystone-listener-57c4bc7d6c-72rt7\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.234902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-config-data" (OuterVolumeSpecName: "config-data") pod "d37f54bf-831c-4586-b825-7a4e57bc1878" (UID: "d37f54bf-831c-4586-b825-7a4e57bc1878"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.242735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d37f54bf-831c-4586-b825-7a4e57bc1878" (UID: "d37f54bf-831c-4586-b825-7a4e57bc1878"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.251499 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.251550 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.251576 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.251586 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d37f54bf-831c-4586-b825-7a4e57bc1878-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.251598 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5xmv\" (UniqueName: \"kubernetes.io/projected/d37f54bf-831c-4586-b825-7a4e57bc1878-kube-api-access-l5xmv\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.251609 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37f54bf-831c-4586-b825-7a4e57bc1878-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.285048 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.285326 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.454511 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-c48bc69f9-zs4s8"] Jan 21 15:42:22 crc kubenswrapper[4707]: E0121 15:42:22.455203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37f54bf-831c-4586-b825-7a4e57bc1878" containerName="cinder-db-sync" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.455218 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37f54bf-831c-4586-b825-7a4e57bc1878" containerName="cinder-db-sync" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.455388 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37f54bf-831c-4586-b825-7a4e57bc1878" containerName="cinder-db-sync" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.456281 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.458661 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.464645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.466760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-c48bc69f9-zs4s8"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.495208 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.559858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-combined-ca-bundle\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.560177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-scripts\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.560206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-logs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.560234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch8f7\" (UniqueName: \"kubernetes.io/projected/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-kube-api-access-ch8f7\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.560266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-public-tls-certs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.560282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-internal-tls-certs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.560324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-config-data\") pod \"placement-c48bc69f9-zs4s8\" 
(UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.570762 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7xxw2"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.642535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b423a914-b340-4f8f-ae6a-f55acfccd5da","Type":"ContainerStarted","Data":"45c09eae1c3695863d1a8d7ec5493347c3d956d73b562f253ee2927315face69"} Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.647870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" event={"ID":"28effb5a-773a-4084-8b1f-719eb7e4ea6f","Type":"ContainerStarted","Data":"0c2196eb1e3d395e6f36d371eb797167768a9bd1d5d645cd6a71f11ec822d38a"} Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.649992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f762786d-4f31-412e-a685-b196f3cf3fea","Type":"ContainerStarted","Data":"f38b9dcb38e276bd9c92539d9f6c09c61aeb4c0e20d24a04f0aecd1dde1da008"} Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.650034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f762786d-4f31-412e-a685-b196f3cf3fea","Type":"ContainerStarted","Data":"b0c28c4764bb23b7c7d068a39fe864b187aa8b420bdcfc3940d7d314cc2c4769"} Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.651566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" event={"ID":"d37f54bf-831c-4586-b825-7a4e57bc1878","Type":"ContainerDied","Data":"fe8712870ec53838007ccbf2394d257d6878921faa73d22f33c349c2e3168a2c"} Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.651584 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-v88c2" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.651592 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe8712870ec53838007ccbf2394d257d6878921faa73d22f33c349c2e3168a2c" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.658837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerStarted","Data":"6281d97debb8c5946cd57bc9e3768a8d106ecbab0211588f5808231951a73001"} Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-scripts\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-logs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8f7\" (UniqueName: \"kubernetes.io/projected/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-kube-api-access-ch8f7\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-public-tls-certs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-internal-tls-certs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-config-data\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.661905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-combined-ca-bundle\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.663272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-logs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.668823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-public-tls-certs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.671355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-scripts\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.672542 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-internal-tls-certs\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.674001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-config-data\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.675909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-combined-ca-bundle\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.683959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8f7\" (UniqueName: \"kubernetes.io/projected/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-kube-api-access-ch8f7\") pod \"placement-c48bc69f9-zs4s8\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.718264 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-85b7676df9-ncnhq"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.784528 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.811752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.904903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.906162 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.916124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.916343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.916490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rrwvr" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.916600 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.934910 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.964643 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7"] Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.969643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.969695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.969743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.969852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-scripts\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.969876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fe581ee-af6a-4e08-a466-cf10aa7ef468-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:22 crc kubenswrapper[4707]: I0121 15:42:22.969896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmlvg\" (UniqueName: \"kubernetes.io/projected/2fe581ee-af6a-4e08-a466-cf10aa7ef468-kube-api-access-pmlvg\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 
15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.020113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll"] Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.063590 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.065384 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.067171 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.071236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.071280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.071317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.071406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-scripts\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.071430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fe581ee-af6a-4e08-a466-cf10aa7ef468-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.071451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmlvg\" (UniqueName: \"kubernetes.io/projected/2fe581ee-af6a-4e08-a466-cf10aa7ef468-kube-api-access-pmlvg\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.074580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fe581ee-af6a-4e08-a466-cf10aa7ef468-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.079513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-scripts\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.079961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.080919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.088129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmlvg\" (UniqueName: \"kubernetes.io/projected/2fe581ee-af6a-4e08-a466-cf10aa7ef468-kube-api-access-pmlvg\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.090194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.090455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2462620d-e137-447e-9b02-83e9488be645-logs\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-scripts\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/2462620d-e137-447e-9b02-83e9488be645-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173363 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sdks\" (UniqueName: \"kubernetes.io/projected/2462620d-e137-447e-9b02-83e9488be645-kube-api-access-7sdks\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.173398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data-custom\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.223734 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c21363-b8c6-45cb-842a-122bb6f64da5" path="/var/lib/kubelet/pods/22c21363-b8c6-45cb-842a-122bb6f64da5/volumes" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.224382 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f2c6971-ade0-4913-ad6a-4125f1c5b0ba" path="/var/lib/kubelet/pods/5f2c6971-ade0-4913-ad6a-4125f1c5b0ba/volumes" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.252102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.275522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.275627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2462620d-e137-447e-9b02-83e9488be645-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.275656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sdks\" (UniqueName: \"kubernetes.io/projected/2462620d-e137-447e-9b02-83e9488be645-kube-api-access-7sdks\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.275698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data-custom\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.275733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2462620d-e137-447e-9b02-83e9488be645-logs\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc 
kubenswrapper[4707]: I0121 15:42:23.275756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.275778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-scripts\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.276804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2462620d-e137-447e-9b02-83e9488be645-logs\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.278360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2462620d-e137-447e-9b02-83e9488be645-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.282834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-scripts\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.284790 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.285186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.287470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data-custom\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.292589 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sdks\" (UniqueName: \"kubernetes.io/projected/2462620d-e137-447e-9b02-83e9488be645-kube-api-access-7sdks\") pod \"cinder-api-0\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.328250 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.409527 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-c48bc69f9-zs4s8"] Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.670420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" event={"ID":"10fd4722-3015-418c-8930-474edb0bdbc6","Type":"ContainerStarted","Data":"c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.670665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" event={"ID":"10fd4722-3015-418c-8930-474edb0bdbc6","Type":"ContainerStarted","Data":"52becd58796c84a02aa8c76ace90b695ee23c3b6146c0d09399850f36a89a396"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.674359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" event={"ID":"30955948-0578-447c-9c9d-73e489e54962","Type":"ContainerStarted","Data":"892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.674402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" event={"ID":"30955948-0578-447c-9c9d-73e489e54962","Type":"ContainerStarted","Data":"3eb1d5dcc534e6062a661e8ba642c7c69e4c56a6d4f5d3a2f5e5560251aba1fa"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.676138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerStarted","Data":"49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.677420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b423a914-b340-4f8f-ae6a-f55acfccd5da","Type":"ContainerStarted","Data":"ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.681481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" event={"ID":"28effb5a-773a-4084-8b1f-719eb7e4ea6f","Type":"ContainerStarted","Data":"2372349b3eb16ac55b94f5e5b0a8ad44958cecbdb48be7d2072deaff4af5c7d9"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.687699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f762786d-4f31-412e-a685-b196f3cf3fea","Type":"ContainerStarted","Data":"42a79abf921569981336ed28abfb2303337cb7d5d4419864703a223193f9e1a4"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.692028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" event={"ID":"e0ffe639-1a1a-45e1-9152-ab2aa7af5301","Type":"ContainerStarted","Data":"2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.692076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" event={"ID":"e0ffe639-1a1a-45e1-9152-ab2aa7af5301","Type":"ContainerStarted","Data":"1b0effde157bd6dd6caa66c7277d95c8ed98bb53974f482cb8994505655cd4f8"} Jan 21 15:42:23 crc 
kubenswrapper[4707]: I0121 15:42:23.708177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" event={"ID":"b86f97a2-7524-46a8-8e52-73a094fc7242","Type":"ContainerStarted","Data":"5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.708227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" event={"ID":"b86f97a2-7524-46a8-8e52-73a094fc7242","Type":"ContainerStarted","Data":"05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.708239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" event={"ID":"b86f97a2-7524-46a8-8e52-73a094fc7242","Type":"ContainerStarted","Data":"94927bab8a001b56d04e3bf6edb99f4161922bc6351a2029b57ac76c426842b2"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.708897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.708962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.711910 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" podStartSLOduration=2.71179325 podStartE2EDuration="2.71179325s" podCreationTimestamp="2026-01-21 15:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:23.700945556 +0000 UTC m=+2440.882461767" watchObservedRunningTime="2026-01-21 15:42:23.71179325 +0000 UTC m=+2440.893309472" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.745472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" event={"ID":"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf","Type":"ContainerStarted","Data":"171bb4a7379bdb4157e358de0b450c2d8cfb1260db9c5dc027c3c1080510e98f"} Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.757017 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.757425 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.757412699 podStartE2EDuration="3.757412699s" podCreationTimestamp="2026-01-21 15:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:23.725322486 +0000 UTC m=+2440.906838709" watchObservedRunningTime="2026-01-21 15:42:23.757412699 +0000 UTC m=+2440.938928912" Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.779418 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" podStartSLOduration=2.779392083 podStartE2EDuration="2.779392083s" podCreationTimestamp="2026-01-21 15:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:23.744251936 +0000 UTC m=+2440.925768158" watchObservedRunningTime="2026-01-21 15:42:23.779392083 +0000 UTC 
m=+2440.960908305" Jan 21 15:42:23 crc kubenswrapper[4707]: W0121 15:42:23.872802 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2462620d_e137_447e_9b02_83e9488be645.slice/crio-5913e84b927d0839fa1e74968241909ae88ae9b060a505e539afcc9d06983ab3 WatchSource:0}: Error finding container 5913e84b927d0839fa1e74968241909ae88ae9b060a505e539afcc9d06983ab3: Status 404 returned error can't find the container with id 5913e84b927d0839fa1e74968241909ae88ae9b060a505e539afcc9d06983ab3 Jan 21 15:42:23 crc kubenswrapper[4707]: I0121 15:42:23.873504 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.773720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" event={"ID":"e0ffe639-1a1a-45e1-9152-ab2aa7af5301","Type":"ContainerStarted","Data":"c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.779300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" event={"ID":"30955948-0578-447c-9c9d-73e489e54962","Type":"ContainerStarted","Data":"dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.782324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" event={"ID":"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf","Type":"ContainerStarted","Data":"8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.782361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" event={"ID":"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf","Type":"ContainerStarted","Data":"06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.783081 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.783105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.785694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerStarted","Data":"59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.793253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"2462620d-e137-447e-9b02-83e9488be645","Type":"ContainerStarted","Data":"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.793293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"2462620d-e137-447e-9b02-83e9488be645","Type":"ContainerStarted","Data":"5913e84b927d0839fa1e74968241909ae88ae9b060a505e539afcc9d06983ab3"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.794314 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" 
podStartSLOduration=3.794304912 podStartE2EDuration="3.794304912s" podCreationTimestamp="2026-01-21 15:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:24.792223991 +0000 UTC m=+2441.973740212" watchObservedRunningTime="2026-01-21 15:42:24.794304912 +0000 UTC m=+2441.975821135" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.799721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" event={"ID":"10fd4722-3015-418c-8930-474edb0bdbc6","Type":"ContainerStarted","Data":"77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.800761 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.800832 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.822309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b423a914-b340-4f8f-ae6a-f55acfccd5da","Type":"ContainerStarted","Data":"66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.826457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2fe581ee-af6a-4e08-a466-cf10aa7ef468","Type":"ContainerStarted","Data":"bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.826478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2fe581ee-af6a-4e08-a466-cf10aa7ef468","Type":"ContainerStarted","Data":"7f9f5a24569dbb99943914ba628c26ea325bef20bc32847d1229d4bc05109d20"} Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.845542 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" podStartSLOduration=3.84552952 podStartE2EDuration="3.84552952s" podCreationTimestamp="2026-01-21 15:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:24.818913138 +0000 UTC m=+2442.000429360" watchObservedRunningTime="2026-01-21 15:42:24.84552952 +0000 UTC m=+2442.027045742" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.864334 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" podStartSLOduration=2.864322282 podStartE2EDuration="2.864322282s" podCreationTimestamp="2026-01-21 15:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:24.846886991 +0000 UTC m=+2442.028403213" watchObservedRunningTime="2026-01-21 15:42:24.864322282 +0000 UTC m=+2442.045838504" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.865696 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" podStartSLOduration=3.865689182 podStartE2EDuration="3.865689182s" podCreationTimestamp="2026-01-21 15:42:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:24.861081459 +0000 UTC m=+2442.042597681" watchObservedRunningTime="2026-01-21 15:42:24.865689182 +0000 UTC m=+2442.047205404" Jan 21 15:42:24 crc kubenswrapper[4707]: I0121 15:42:24.894459 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.894445578 podStartE2EDuration="4.894445578s" podCreationTimestamp="2026-01-21 15:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:24.879138097 +0000 UTC m=+2442.060654319" watchObservedRunningTime="2026-01-21 15:42:24.894445578 +0000 UTC m=+2442.075961799" Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.847782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerStarted","Data":"85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3"} Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.849991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"2462620d-e137-447e-9b02-83e9488be645","Type":"ContainerStarted","Data":"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760"} Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.851848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.853921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2fe581ee-af6a-4e08-a466-cf10aa7ef468","Type":"ContainerStarted","Data":"5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6"} Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.856970 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" containerID="d28821461c495d577ce7377449af1a5dcc20ad597a38321ba6e299be5f826bdd" exitCode=0 Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.857059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" event={"ID":"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe","Type":"ContainerDied","Data":"d28821461c495d577ce7377449af1a5dcc20ad597a38321ba6e299be5f826bdd"} Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.872068 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.872056435 podStartE2EDuration="2.872056435s" podCreationTimestamp="2026-01-21 15:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:25.870119954 +0000 UTC m=+2443.051636176" watchObservedRunningTime="2026-01-21 15:42:25.872056435 +0000 UTC m=+2443.053572657" Jan 21 15:42:25 crc kubenswrapper[4707]: I0121 15:42:25.905719 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.905701751 podStartE2EDuration="3.905701751s" podCreationTimestamp="2026-01-21 15:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
15:42:25.899680571 +0000 UTC m=+2443.081196793" watchObservedRunningTime="2026-01-21 15:42:25.905701751 +0000 UTC m=+2443.087217963" Jan 21 15:42:26 crc kubenswrapper[4707]: I0121 15:42:26.867779 4707 generic.go:334] "Generic (PLEG): container finished" podID="28effb5a-773a-4084-8b1f-719eb7e4ea6f" containerID="2372349b3eb16ac55b94f5e5b0a8ad44958cecbdb48be7d2072deaff4af5c7d9" exitCode=0 Jan 21 15:42:26 crc kubenswrapper[4707]: I0121 15:42:26.867851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" event={"ID":"28effb5a-773a-4084-8b1f-719eb7e4ea6f","Type":"ContainerDied","Data":"2372349b3eb16ac55b94f5e5b0a8ad44958cecbdb48be7d2072deaff4af5c7d9"} Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.308612 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.371833 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-combined-ca-bundle\") pod \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.371870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrrzv\" (UniqueName: \"kubernetes.io/projected/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-kube-api-access-qrrzv\") pod \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.372092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-config\") pod \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\" (UID: \"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe\") " Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.376520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-kube-api-access-qrrzv" (OuterVolumeSpecName: "kube-api-access-qrrzv") pod "c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" (UID: "c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe"). InnerVolumeSpecName "kube-api-access-qrrzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.393673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" (UID: "c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.395544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-config" (OuterVolumeSpecName: "config") pod "c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" (UID: "c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.474176 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.474209 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.474229 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrrzv\" (UniqueName: \"kubernetes.io/projected/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe-kube-api-access-qrrzv\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.788571 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.879238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" event={"ID":"c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe","Type":"ContainerDied","Data":"ad14e0c6af61c27eb75d57a195373dce9bf0aac46b2df1381a8609e2296117f7"} Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.879327 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad14e0c6af61c27eb75d57a195373dce9bf0aac46b2df1381a8609e2296117f7" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.879252 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-2qlj6" Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.881514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerStarted","Data":"7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8"} Jan 21 15:42:27 crc kubenswrapper[4707]: I0121 15:42:27.902017 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.218701207 podStartE2EDuration="6.902005044s" podCreationTimestamp="2026-01-21 15:42:21 +0000 UTC" firstStartedPulling="2026-01-21 15:42:22.517995218 +0000 UTC m=+2439.699511441" lastFinishedPulling="2026-01-21 15:42:27.201299056 +0000 UTC m=+2444.382815278" observedRunningTime="2026-01-21 15:42:27.894250626 +0000 UTC m=+2445.075766848" watchObservedRunningTime="2026-01-21 15:42:27.902005044 +0000 UTC m=+2445.083521266" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.076671 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-566fb846f6-xwtm2"] Jan 21 15:42:28 crc kubenswrapper[4707]: E0121 15:42:28.077012 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" containerName="neutron-db-sync" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.077030 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" containerName="neutron-db-sync" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.077181 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" containerName="neutron-db-sync" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.078076 4707 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.081341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.081546 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.081652 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.081851 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-r9bl4" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.098201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-566fb846f6-xwtm2"] Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.187249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-combined-ca-bundle\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.187306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-config\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.187345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz4zl\" (UniqueName: \"kubernetes.io/projected/91706eeb-4f05-4bf7-b2b2-53618d51bda5-kube-api-access-kz4zl\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.187415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-ovndb-tls-certs\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.187434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-httpd-config\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.252337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.252560 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.288447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6hmr\" (UniqueName: \"kubernetes.io/projected/28effb5a-773a-4084-8b1f-719eb7e4ea6f-kube-api-access-k6hmr\") pod \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.288497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-fernet-keys\") pod \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.288582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-scripts\") pod \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.288627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-combined-ca-bundle\") pod \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.288692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-credential-keys\") pod \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.288728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-config-data\") pod \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\" (UID: \"28effb5a-773a-4084-8b1f-719eb7e4ea6f\") " Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.289047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz4zl\" (UniqueName: \"kubernetes.io/projected/91706eeb-4f05-4bf7-b2b2-53618d51bda5-kube-api-access-kz4zl\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.289123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-ovndb-tls-certs\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.289144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-httpd-config\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.289319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-combined-ca-bundle\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.289351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-config\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.296479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28effb5a-773a-4084-8b1f-719eb7e4ea6f-kube-api-access-k6hmr" (OuterVolumeSpecName: "kube-api-access-k6hmr") pod "28effb5a-773a-4084-8b1f-719eb7e4ea6f" (UID: "28effb5a-773a-4084-8b1f-719eb7e4ea6f"). InnerVolumeSpecName "kube-api-access-k6hmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.297790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "28effb5a-773a-4084-8b1f-719eb7e4ea6f" (UID: "28effb5a-773a-4084-8b1f-719eb7e4ea6f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.301886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-ovndb-tls-certs\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.302414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "28effb5a-773a-4084-8b1f-719eb7e4ea6f" (UID: "28effb5a-773a-4084-8b1f-719eb7e4ea6f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.310337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz4zl\" (UniqueName: \"kubernetes.io/projected/91706eeb-4f05-4bf7-b2b2-53618d51bda5-kube-api-access-kz4zl\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.321273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-scripts" (OuterVolumeSpecName: "scripts") pod "28effb5a-773a-4084-8b1f-719eb7e4ea6f" (UID: "28effb5a-773a-4084-8b1f-719eb7e4ea6f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.321420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-combined-ca-bundle\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.321862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-config\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.326124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-httpd-config\") pod \"neutron-566fb846f6-xwtm2\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.333698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-config-data" (OuterVolumeSpecName: "config-data") pod "28effb5a-773a-4084-8b1f-719eb7e4ea6f" (UID: "28effb5a-773a-4084-8b1f-719eb7e4ea6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.343059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28effb5a-773a-4084-8b1f-719eb7e4ea6f" (UID: "28effb5a-773a-4084-8b1f-719eb7e4ea6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.391463 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.391663 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.391727 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.391876 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.391947 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6hmr\" (UniqueName: \"kubernetes.io/projected/28effb5a-773a-4084-8b1f-719eb7e4ea6f-kube-api-access-k6hmr\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.392000 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28effb5a-773a-4084-8b1f-719eb7e4ea6f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.407212 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.819689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-566fb846f6-xwtm2"] Jan 21 15:42:28 crc kubenswrapper[4707]: W0121 15:42:28.822006 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91706eeb_4f05_4bf7_b2b2_53618d51bda5.slice/crio-9c6f37e742788cd2f8282fc461783dfb6dff2f836c4dc4069ba189a8cfa1e561 WatchSource:0}: Error finding container 9c6f37e742788cd2f8282fc461783dfb6dff2f836c4dc4069ba189a8cfa1e561: Status 404 returned error can't find the container with id 9c6f37e742788cd2f8282fc461783dfb6dff2f836c4dc4069ba189a8cfa1e561 Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.893664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" event={"ID":"28effb5a-773a-4084-8b1f-719eb7e4ea6f","Type":"ContainerDied","Data":"0c2196eb1e3d395e6f36d371eb797167768a9bd1d5d645cd6a71f11ec822d38a"} Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.894364 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2196eb1e3d395e6f36d371eb797167768a9bd1d5d645cd6a71f11ec822d38a" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.893961 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7xxw2" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.894864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" event={"ID":"91706eeb-4f05-4bf7-b2b2-53618d51bda5","Type":"ContainerStarted","Data":"9c6f37e742788cd2f8282fc461783dfb6dff2f836c4dc4069ba189a8cfa1e561"} Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.895003 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.895120 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api-log" containerID="cri-o://6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18" gracePeriod=30 Jan 21 15:42:28 crc kubenswrapper[4707]: I0121 15:42:28.895413 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api" containerID="cri-o://3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760" gracePeriod=30 Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.039722 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-56cb4c686f-qqcjc"] Jan 21 15:42:29 crc kubenswrapper[4707]: E0121 15:42:29.040116 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28effb5a-773a-4084-8b1f-719eb7e4ea6f" containerName="keystone-bootstrap" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.040132 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28effb5a-773a-4084-8b1f-719eb7e4ea6f" containerName="keystone-bootstrap" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.040343 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28effb5a-773a-4084-8b1f-719eb7e4ea6f" containerName="keystone-bootstrap" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.040941 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.044482 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.047623 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.047986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-xxph7" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.048022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.048151 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.048180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.058903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-56cb4c686f-qqcjc"] Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-internal-tls-certs\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrtz\" (UniqueName: \"kubernetes.io/projected/7fc127ff-9162-4e68-af5b-b4085cce670c-kube-api-access-hjrtz\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-credential-keys\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-config-data\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-combined-ca-bundle\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-fernet-keys\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-scripts\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.110707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-public-tls-certs\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-config-data\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-combined-ca-bundle\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-fernet-keys\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-scripts\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-public-tls-certs\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-internal-tls-certs\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.212961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hjrtz\" (UniqueName: \"kubernetes.io/projected/7fc127ff-9162-4e68-af5b-b4085cce670c-kube-api-access-hjrtz\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.213015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-credential-keys\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.216816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-config-data\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.220828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-fernet-keys\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.224361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-public-tls-certs\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.226139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-internal-tls-certs\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.233455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-scripts\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.233565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-credential-keys\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.233748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-combined-ca-bundle\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.236628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrtz\" (UniqueName: 
\"kubernetes.io/projected/7fc127ff-9162-4e68-af5b-b4085cce670c-kube-api-access-hjrtz\") pod \"keystone-56cb4c686f-qqcjc\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.359789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.408990 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.520510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2462620d-e137-447e-9b02-83e9488be645-logs\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.520770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2462620d-e137-447e-9b02-83e9488be645-logs" (OuterVolumeSpecName: "logs") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.520786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.520837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-scripts\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.520855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-combined-ca-bundle\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.521008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data-custom\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.521111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sdks\" (UniqueName: \"kubernetes.io/projected/2462620d-e137-447e-9b02-83e9488be645-kube-api-access-7sdks\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.521165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2462620d-e137-447e-9b02-83e9488be645-etc-machine-id\") pod \"2462620d-e137-447e-9b02-83e9488be645\" (UID: \"2462620d-e137-447e-9b02-83e9488be645\") " Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.521278 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2462620d-e137-447e-9b02-83e9488be645-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.521625 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2462620d-e137-447e-9b02-83e9488be645-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.521640 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2462620d-e137-447e-9b02-83e9488be645-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.525539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.528224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-scripts" (OuterVolumeSpecName: "scripts") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.533971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2462620d-e137-447e-9b02-83e9488be645-kube-api-access-7sdks" (OuterVolumeSpecName: "kube-api-access-7sdks") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "kube-api-access-7sdks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.543706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.561312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data" (OuterVolumeSpecName: "config-data") pod "2462620d-e137-447e-9b02-83e9488be645" (UID: "2462620d-e137-447e-9b02-83e9488be645"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.623019 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.623052 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sdks\" (UniqueName: \"kubernetes.io/projected/2462620d-e137-447e-9b02-83e9488be645-kube-api-access-7sdks\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.623065 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.623073 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.623081 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2462620d-e137-447e-9b02-83e9488be645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.787727 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-56cb4c686f-qqcjc"] Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.905967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" event={"ID":"91706eeb-4f05-4bf7-b2b2-53618d51bda5","Type":"ContainerStarted","Data":"730d51fbb75b72d819fa14edb27a01f7434fbd019813a7974ed132c35e5ce667"} Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.906218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.906230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" event={"ID":"91706eeb-4f05-4bf7-b2b2-53618d51bda5","Type":"ContainerStarted","Data":"470d8baec8d8a85adfa5eb17592dc43ee2ed9dbd3ce5dcc778e85b9df92189a7"} Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907308 4707 generic.go:334] "Generic (PLEG): container finished" podID="2462620d-e137-447e-9b02-83e9488be645" containerID="3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760" exitCode=0 Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907330 4707 generic.go:334] "Generic (PLEG): container finished" podID="2462620d-e137-447e-9b02-83e9488be645" containerID="6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18" exitCode=143 Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907393 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"2462620d-e137-447e-9b02-83e9488be645","Type":"ContainerDied","Data":"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760"} Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"2462620d-e137-447e-9b02-83e9488be645","Type":"ContainerDied","Data":"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18"} Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"2462620d-e137-447e-9b02-83e9488be645","Type":"ContainerDied","Data":"5913e84b927d0839fa1e74968241909ae88ae9b060a505e539afcc9d06983ab3"} Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.907739 4707 scope.go:117] "RemoveContainer" containerID="3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.917039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" event={"ID":"7fc127ff-9162-4e68-af5b-b4085cce670c","Type":"ContainerStarted","Data":"6126481056f8f5356ff3d3043762f274a0bfbb51e2d8181c2a95e5c936ec8d06"} Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.924485 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" podStartSLOduration=1.924475786 podStartE2EDuration="1.924475786s" podCreationTimestamp="2026-01-21 15:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:29.92106899 +0000 UTC m=+2447.102585212" watchObservedRunningTime="2026-01-21 15:42:29.924475786 +0000 UTC m=+2447.105992008" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.938658 4707 scope.go:117] "RemoveContainer" containerID="6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.948482 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.959956 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.964102 4707 scope.go:117] "RemoveContainer" containerID="3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760" Jan 21 15:42:29 crc kubenswrapper[4707]: E0121 15:42:29.964394 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760\": container with ID starting with 3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760 not found: ID does not exist" containerID="3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.964424 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760"} err="failed to get container status \"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760\": rpc 
error: code = NotFound desc = could not find container \"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760\": container with ID starting with 3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760 not found: ID does not exist" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.964444 4707 scope.go:117] "RemoveContainer" containerID="6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18" Jan 21 15:42:29 crc kubenswrapper[4707]: E0121 15:42:29.964696 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18\": container with ID starting with 6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18 not found: ID does not exist" containerID="6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.964714 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18"} err="failed to get container status \"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18\": rpc error: code = NotFound desc = could not find container \"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18\": container with ID starting with 6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18 not found: ID does not exist" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.964725 4707 scope.go:117] "RemoveContainer" containerID="3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.965156 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760"} err="failed to get container status \"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760\": rpc error: code = NotFound desc = could not find container \"3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760\": container with ID starting with 3cb5c0dfb359d9485ca885d98fefb455440962253fd27ce2af9de7567a854760 not found: ID does not exist" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.965177 4707 scope.go:117] "RemoveContainer" containerID="6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.965381 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18"} err="failed to get container status \"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18\": rpc error: code = NotFound desc = could not find container \"6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18\": container with ID starting with 6beb3950d3612e93f5d616033ee60df4731a7a871da72ba779a96dadfd444a18 not found: ID does not exist" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.970105 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:29 crc kubenswrapper[4707]: E0121 15:42:29.970495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api-log" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.970511 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2462620d-e137-447e-9b02-83e9488be645" 
containerName="cinder-api-log" Jan 21 15:42:29 crc kubenswrapper[4707]: E0121 15:42:29.970541 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.970547 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.970707 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api-log" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.970728 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2462620d-e137-447e-9b02-83e9488be645" containerName="cinder-api" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.971595 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.975926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.975940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.976219 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:42:29 crc kubenswrapper[4707]: I0121 15:42:29.976469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.029354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-scripts\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-logs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgz97\" (UniqueName: \"kubernetes.io/projected/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-kube-api-access-wgz97\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.030525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.131861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.131903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.131935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.131967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-logs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132003 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132063 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wgz97\" (UniqueName: \"kubernetes.io/projected/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-kube-api-access-wgz97\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-scripts\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.132740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-logs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.137402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.137413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.137426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-scripts\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.137832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: 
I0121 15:42:30.141171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.141617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.146177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgz97\" (UniqueName: \"kubernetes.io/projected/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-kube-api-access-wgz97\") pod \"cinder-api-0\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.289524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.675252 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:42:30 crc kubenswrapper[4707]: W0121 15:42:30.696983 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4aa782a_8c59_486c_8f6f_a1b38b9e3aa8.slice/crio-bd7fde705b15976eb6a5259e82e4aebef0a228963a90d3660a3ec3f830917201 WatchSource:0}: Error finding container bd7fde705b15976eb6a5259e82e4aebef0a228963a90d3660a3ec3f830917201: Status 404 returned error can't find the container with id bd7fde705b15976eb6a5259e82e4aebef0a228963a90d3660a3ec3f830917201 Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.929309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" event={"ID":"7fc127ff-9162-4e68-af5b-b4085cce670c","Type":"ContainerStarted","Data":"15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d"} Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.929431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.931276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8","Type":"ContainerStarted","Data":"bd7fde705b15976eb6a5259e82e4aebef0a228963a90d3660a3ec3f830917201"} Jan 21 15:42:30 crc kubenswrapper[4707]: I0121 15:42:30.943711 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" podStartSLOduration=1.943696093 podStartE2EDuration="1.943696093s" podCreationTimestamp="2026-01-21 15:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:30.940866323 +0000 UTC m=+2448.122382545" watchObservedRunningTime="2026-01-21 15:42:30.943696093 +0000 UTC m=+2448.125212315" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.048594 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 
15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.048676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.061850 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.061895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.097317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.102638 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.102830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.111271 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.196628 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2462620d-e137-447e-9b02-83e9488be645" path="/var/lib/kubelet/pods/2462620d-e137-447e-9b02-83e9488be645/volumes" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.939372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8","Type":"ContainerStarted","Data":"032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe"} Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.939734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8","Type":"ContainerStarted","Data":"82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083"} Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.940903 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.940922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.940931 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.940939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:31 crc kubenswrapper[4707]: I0121 15:42:31.962281 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.962266841 podStartE2EDuration="2.962266841s" podCreationTimestamp="2026-01-21 15:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:31.953684746 +0000 UTC m=+2449.135200967" watchObservedRunningTime="2026-01-21 15:42:31.962266841 +0000 UTC 
m=+2449.143783062" Jan 21 15:42:32 crc kubenswrapper[4707]: I0121 15:42:32.947080 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.345247 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-65bd996df5-pwnvm"] Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.346496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.354600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-65bd996df5-pwnvm"] Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.355085 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.355240 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.484779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-httpd-config\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.484856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-internal-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.484960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-combined-ca-bundle\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.485029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-config\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.485053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-ovndb-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.485124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrhj\" (UniqueName: \"kubernetes.io/projected/b241d4d6-31f1-4523-89df-1e1eab731f48-kube-api-access-dlrhj\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " 
pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.485234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-public-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.587400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-internal-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.587467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-combined-ca-bundle\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.587519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-config\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.587537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-ovndb-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.587576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrhj\" (UniqueName: \"kubernetes.io/projected/b241d4d6-31f1-4523-89df-1e1eab731f48-kube-api-access-dlrhj\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.587687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-public-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.595318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-httpd-config\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.604089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-combined-ca-bundle\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " 
pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.607315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-internal-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.617397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-public-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.626493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-httpd-config\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.626605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-config\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.626858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-ovndb-tls-certs\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.630481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.647697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrhj\" (UniqueName: \"kubernetes.io/projected/b241d4d6-31f1-4523-89df-1e1eab731f48-kube-api-access-dlrhj\") pod \"neutron-65bd996df5-pwnvm\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.674194 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.714414 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.748218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.845765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.857148 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.901911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.913750 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.937053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.959911 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="cinder-scheduler" containerID="cri-o://bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34" gracePeriod=30 Jan 21 15:42:33 crc kubenswrapper[4707]: I0121 15:42:33.960294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="probe" containerID="cri-o://5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6" gracePeriod=30 Jan 21 15:42:34 crc kubenswrapper[4707]: W0121 15:42:34.147036 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb241d4d6_31f1_4523_89df_1e1eab731f48.slice/crio-79c655abc744879935b5d43a70d6e7fcf219d83647a374e61a9dea29a37ff3b1 WatchSource:0}: Error finding container 79c655abc744879935b5d43a70d6e7fcf219d83647a374e61a9dea29a37ff3b1: Status 404 returned error can't find the container with id 79c655abc744879935b5d43a70d6e7fcf219d83647a374e61a9dea29a37ff3b1 Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.152072 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-65bd996df5-pwnvm"] Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.907451 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-d7847c667-bblg7"] Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.909015 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.913380 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.920451 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.922398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-d7847c667-bblg7"] Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.976760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" event={"ID":"b241d4d6-31f1-4523-89df-1e1eab731f48","Type":"ContainerStarted","Data":"5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400"} Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.976821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" event={"ID":"b241d4d6-31f1-4523-89df-1e1eab731f48","Type":"ContainerStarted","Data":"865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70"} Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.976836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" event={"ID":"b241d4d6-31f1-4523-89df-1e1eab731f48","Type":"ContainerStarted","Data":"79c655abc744879935b5d43a70d6e7fcf219d83647a374e61a9dea29a37ff3b1"} Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.976927 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.978929 4707 generic.go:334] "Generic (PLEG): container finished" podID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerID="5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6" exitCode=0 Jan 21 15:42:34 crc kubenswrapper[4707]: I0121 15:42:34.979017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2fe581ee-af6a-4e08-a466-cf10aa7ef468","Type":"ContainerDied","Data":"5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6"} Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.003841 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" podStartSLOduration=2.003823937 podStartE2EDuration="2.003823937s" podCreationTimestamp="2026-01-21 15:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:34.996919165 +0000 UTC m=+2452.178435387" watchObservedRunningTime="2026-01-21 15:42:35.003823937 +0000 UTC m=+2452.185340159" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.017955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3952bfb4-0398-438f-a0c1-7289b672313e-logs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.018061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-public-tls-certs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.018112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phdgf\" (UniqueName: \"kubernetes.io/projected/3952bfb4-0398-438f-a0c1-7289b672313e-kube-api-access-phdgf\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.018139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-internal-tls-certs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.018161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.018408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data-custom\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.018572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-combined-ca-bundle\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.120553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data-custom\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.120659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-combined-ca-bundle\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.121711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3952bfb4-0398-438f-a0c1-7289b672313e-logs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 
crc kubenswrapper[4707]: I0121 15:42:35.121936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-public-tls-certs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.121968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3952bfb4-0398-438f-a0c1-7289b672313e-logs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.122005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phdgf\" (UniqueName: \"kubernetes.io/projected/3952bfb4-0398-438f-a0c1-7289b672313e-kube-api-access-phdgf\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.122041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-internal-tls-certs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.122078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.125315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data-custom\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.128025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-public-tls-certs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.130690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.136910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-internal-tls-certs\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 
15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.136920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-combined-ca-bundle\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.137188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phdgf\" (UniqueName: \"kubernetes.io/projected/3952bfb4-0398-438f-a0c1-7289b672313e-kube-api-access-phdgf\") pod \"barbican-api-d7847c667-bblg7\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.183160 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:42:35 crc kubenswrapper[4707]: E0121 15:42:35.183505 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.223972 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.641566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-d7847c667-bblg7"] Jan 21 15:42:35 crc kubenswrapper[4707]: W0121 15:42:35.649583 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3952bfb4_0398_438f_a0c1_7289b672313e.slice/crio-ec91e1122c7d2378896e6a9e12e8dc5f62ebbd926e026cd3e856358b238a267e WatchSource:0}: Error finding container ec91e1122c7d2378896e6a9e12e8dc5f62ebbd926e026cd3e856358b238a267e: Status 404 returned error can't find the container with id ec91e1122c7d2378896e6a9e12e8dc5f62ebbd926e026cd3e856358b238a267e Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.741494 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fe581ee-af6a-4e08-a466-cf10aa7ef468-etc-machine-id\") pod \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmlvg\" (UniqueName: \"kubernetes.io/projected/2fe581ee-af6a-4e08-a466-cf10aa7ef468-kube-api-access-pmlvg\") pod \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data\") pod \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-combined-ca-bundle\") pod \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-scripts\") pod \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data-custom\") pod \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\" (UID: \"2fe581ee-af6a-4e08-a466-cf10aa7ef468\") " Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.832885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fe581ee-af6a-4e08-a466-cf10aa7ef468-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2fe581ee-af6a-4e08-a466-cf10aa7ef468" (UID: "2fe581ee-af6a-4e08-a466-cf10aa7ef468"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.833188 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fe581ee-af6a-4e08-a466-cf10aa7ef468-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.836267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-scripts" (OuterVolumeSpecName: "scripts") pod "2fe581ee-af6a-4e08-a466-cf10aa7ef468" (UID: "2fe581ee-af6a-4e08-a466-cf10aa7ef468"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.836501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2fe581ee-af6a-4e08-a466-cf10aa7ef468" (UID: "2fe581ee-af6a-4e08-a466-cf10aa7ef468"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.836580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe581ee-af6a-4e08-a466-cf10aa7ef468-kube-api-access-pmlvg" (OuterVolumeSpecName: "kube-api-access-pmlvg") pod "2fe581ee-af6a-4e08-a466-cf10aa7ef468" (UID: "2fe581ee-af6a-4e08-a466-cf10aa7ef468"). InnerVolumeSpecName "kube-api-access-pmlvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.867285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fe581ee-af6a-4e08-a466-cf10aa7ef468" (UID: "2fe581ee-af6a-4e08-a466-cf10aa7ef468"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.901009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data" (OuterVolumeSpecName: "config-data") pod "2fe581ee-af6a-4e08-a466-cf10aa7ef468" (UID: "2fe581ee-af6a-4e08-a466-cf10aa7ef468"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.935079 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.935393 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmlvg\" (UniqueName: \"kubernetes.io/projected/2fe581ee-af6a-4e08-a466-cf10aa7ef468-kube-api-access-pmlvg\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.935405 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.935414 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.935421 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fe581ee-af6a-4e08-a466-cf10aa7ef468-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.993632 4707 generic.go:334] "Generic (PLEG): container finished" podID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerID="bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34" exitCode=0 Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.993711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2fe581ee-af6a-4e08-a466-cf10aa7ef468","Type":"ContainerDied","Data":"bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34"} Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.993739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2fe581ee-af6a-4e08-a466-cf10aa7ef468","Type":"ContainerDied","Data":"7f9f5a24569dbb99943914ba628c26ea325bef20bc32847d1229d4bc05109d20"} Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.993763 4707 scope.go:117] "RemoveContainer" containerID="5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.993948 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:35 crc kubenswrapper[4707]: I0121 15:42:35.997697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" event={"ID":"3952bfb4-0398-438f-a0c1-7289b672313e","Type":"ContainerStarted","Data":"8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec"} Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.001785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" event={"ID":"3952bfb4-0398-438f-a0c1-7289b672313e","Type":"ContainerStarted","Data":"4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e"} Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.001821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" event={"ID":"3952bfb4-0398-438f-a0c1-7289b672313e","Type":"ContainerStarted","Data":"ec91e1122c7d2378896e6a9e12e8dc5f62ebbd926e026cd3e856358b238a267e"} Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.019581 4707 scope.go:117] "RemoveContainer" containerID="bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.020016 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" podStartSLOduration=2.020000292 podStartE2EDuration="2.020000292s" podCreationTimestamp="2026-01-21 15:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:36.01261697 +0000 UTC m=+2453.194133192" watchObservedRunningTime="2026-01-21 15:42:36.020000292 +0000 UTC m=+2453.201516514" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.037411 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.048628 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.049493 4707 scope.go:117] "RemoveContainer" containerID="5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6" Jan 21 15:42:36 crc kubenswrapper[4707]: E0121 15:42:36.049883 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6\": container with ID starting with 5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6 not found: ID does not exist" containerID="5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.049918 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6"} err="failed to get container status \"5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6\": rpc error: code = NotFound desc = could not find container \"5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6\": container with ID starting with 5ce8d3570e88c6480903923b4747f73dd8732a3870df015073e5884d82addbd6 not found: ID does not exist" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.049941 4707 scope.go:117] "RemoveContainer" 
containerID="bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34" Jan 21 15:42:36 crc kubenswrapper[4707]: E0121 15:42:36.051732 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34\": container with ID starting with bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34 not found: ID does not exist" containerID="bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.051769 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34"} err="failed to get container status \"bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34\": rpc error: code = NotFound desc = could not find container \"bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34\": container with ID starting with bfecb3c2880926db477221cfa397bd9e2e4b8edcd14410fb9abeb45b224a5f34 not found: ID does not exist" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.055557 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:36 crc kubenswrapper[4707]: E0121 15:42:36.056454 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="cinder-scheduler" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.056476 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="cinder-scheduler" Jan 21 15:42:36 crc kubenswrapper[4707]: E0121 15:42:36.056488 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="probe" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.056495 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="probe" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.056675 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="probe" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.056692 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" containerName="cinder-scheduler" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.057909 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.059768 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.059928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.139350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznpg\" (UniqueName: \"kubernetes.io/projected/8f191858-2d71-42bb-8227-069401c26dd4-kube-api-access-zznpg\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.139413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f191858-2d71-42bb-8227-069401c26dd4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.139623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.139742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-scripts\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.139798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.139976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.241907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznpg\" (UniqueName: \"kubernetes.io/projected/8f191858-2d71-42bb-8227-069401c26dd4-kube-api-access-zznpg\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.242092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f191858-2d71-42bb-8227-069401c26dd4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.242206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.242331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-scripts\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.242417 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.242577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.242231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f191858-2d71-42bb-8227-069401c26dd4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.246468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.246492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-scripts\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.246504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.247090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.254756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zznpg\" (UniqueName: \"kubernetes.io/projected/8f191858-2d71-42bb-8227-069401c26dd4-kube-api-access-zznpg\") pod \"cinder-scheduler-0\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.376435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:36 crc kubenswrapper[4707]: I0121 15:42:36.741002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:42:37 crc kubenswrapper[4707]: I0121 15:42:37.017185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerStarted","Data":"1e6514e593995c55157d2f14d95b44ee79e25c44d2dfbd7ca01cb319d10037b5"} Jan 21 15:42:37 crc kubenswrapper[4707]: I0121 15:42:37.018877 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:37 crc kubenswrapper[4707]: I0121 15:42:37.018911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:37 crc kubenswrapper[4707]: I0121 15:42:37.193958 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe581ee-af6a-4e08-a466-cf10aa7ef468" path="/var/lib/kubelet/pods/2fe581ee-af6a-4e08-a466-cf10aa7ef468/volumes" Jan 21 15:42:38 crc kubenswrapper[4707]: I0121 15:42:38.028456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerStarted","Data":"b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238"} Jan 21 15:42:38 crc kubenswrapper[4707]: I0121 15:42:38.028759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerStarted","Data":"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f"} Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.376766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.469695 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.475014 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.484104 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=5.484093002 podStartE2EDuration="5.484093002s" podCreationTimestamp="2026-01-21 15:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:38.046477415 +0000 UTC m=+2455.227993636" watchObservedRunningTime="2026-01-21 15:42:41.484093002 +0000 UTC m=+2458.665609223" Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.536207 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll"] Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 
15:42:41.536417 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api-log" containerID="cri-o://c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363" gracePeriod=30 Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.536761 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api" containerID="cri-o://77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa" gracePeriod=30 Jan 21 15:42:41 crc kubenswrapper[4707]: I0121 15:42:41.941047 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:42:42 crc kubenswrapper[4707]: I0121 15:42:42.067954 4707 generic.go:334] "Generic (PLEG): container finished" podID="10fd4722-3015-418c-8930-474edb0bdbc6" containerID="c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363" exitCode=143 Jan 21 15:42:42 crc kubenswrapper[4707]: I0121 15:42:42.068042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" event={"ID":"10fd4722-3015-418c-8930-474edb0bdbc6","Type":"ContainerDied","Data":"c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363"} Jan 21 15:42:44 crc kubenswrapper[4707]: I0121 15:42:44.675244 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.3:9311/healthcheck\": read tcp 10.217.0.2:49188->10.217.1.3:9311: read: connection reset by peer" Jan 21 15:42:44 crc kubenswrapper[4707]: I0121 15:42:44.675287 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.3:9311/healthcheck\": read tcp 10.217.0.2:49178->10.217.1.3:9311: read: connection reset by peer" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.023029 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.089459 4707 generic.go:334] "Generic (PLEG): container finished" podID="10fd4722-3015-418c-8930-474edb0bdbc6" containerID="77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa" exitCode=0 Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.089505 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.089528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" event={"ID":"10fd4722-3015-418c-8930-474edb0bdbc6","Type":"ContainerDied","Data":"77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa"} Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.089561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll" event={"ID":"10fd4722-3015-418c-8930-474edb0bdbc6","Type":"ContainerDied","Data":"52becd58796c84a02aa8c76ace90b695ee23c3b6146c0d09399850f36a89a396"} Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.089636 4707 scope.go:117] "RemoveContainer" containerID="77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.110342 4707 scope.go:117] "RemoveContainer" containerID="c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.127527 4707 scope.go:117] "RemoveContainer" containerID="77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa" Jan 21 15:42:45 crc kubenswrapper[4707]: E0121 15:42:45.127831 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa\": container with ID starting with 77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa not found: ID does not exist" containerID="77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.127868 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa"} err="failed to get container status \"77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa\": rpc error: code = NotFound desc = could not find container \"77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa\": container with ID starting with 77f9c02801b6542373fbbf98ff8602531559155528530a10c56424d2e7944afa not found: ID does not exist" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.127888 4707 scope.go:117] "RemoveContainer" containerID="c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363" Jan 21 15:42:45 crc kubenswrapper[4707]: E0121 15:42:45.128147 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363\": container with ID starting with c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363 not found: ID does not exist" containerID="c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.128181 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363"} err="failed to get container status \"c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363\": rpc error: code = NotFound desc = could not find container \"c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363\": container with ID starting with c62ef92172a4ac987866baa2eca37dac479dfc88b253a08fd22875323b0be363 not found: ID does 
not exist" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.200114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data-custom\") pod \"10fd4722-3015-418c-8930-474edb0bdbc6\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.200304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-combined-ca-bundle\") pod \"10fd4722-3015-418c-8930-474edb0bdbc6\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.200346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10fd4722-3015-418c-8930-474edb0bdbc6-logs\") pod \"10fd4722-3015-418c-8930-474edb0bdbc6\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.200417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data\") pod \"10fd4722-3015-418c-8930-474edb0bdbc6\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.200558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp4dx\" (UniqueName: \"kubernetes.io/projected/10fd4722-3015-418c-8930-474edb0bdbc6-kube-api-access-kp4dx\") pod \"10fd4722-3015-418c-8930-474edb0bdbc6\" (UID: \"10fd4722-3015-418c-8930-474edb0bdbc6\") " Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.200770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fd4722-3015-418c-8930-474edb0bdbc6-logs" (OuterVolumeSpecName: "logs") pod "10fd4722-3015-418c-8930-474edb0bdbc6" (UID: "10fd4722-3015-418c-8930-474edb0bdbc6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.201232 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10fd4722-3015-418c-8930-474edb0bdbc6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.204390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10fd4722-3015-418c-8930-474edb0bdbc6" (UID: "10fd4722-3015-418c-8930-474edb0bdbc6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.204521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10fd4722-3015-418c-8930-474edb0bdbc6-kube-api-access-kp4dx" (OuterVolumeSpecName: "kube-api-access-kp4dx") pod "10fd4722-3015-418c-8930-474edb0bdbc6" (UID: "10fd4722-3015-418c-8930-474edb0bdbc6"). InnerVolumeSpecName "kube-api-access-kp4dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.220553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10fd4722-3015-418c-8930-474edb0bdbc6" (UID: "10fd4722-3015-418c-8930-474edb0bdbc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.237250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data" (OuterVolumeSpecName: "config-data") pod "10fd4722-3015-418c-8930-474edb0bdbc6" (UID: "10fd4722-3015-418c-8930-474edb0bdbc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.303857 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.303945 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.303961 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp4dx\" (UniqueName: \"kubernetes.io/projected/10fd4722-3015-418c-8930-474edb0bdbc6-kube-api-access-kp4dx\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.303993 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10fd4722-3015-418c-8930-474edb0bdbc6-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.424285 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll"] Jan 21 15:42:45 crc kubenswrapper[4707]: I0121 15:42:45.431914 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5fbcc445ff-s28ll"] Jan 21 15:42:46 crc kubenswrapper[4707]: I0121 15:42:46.539093 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:42:47 crc kubenswrapper[4707]: I0121 15:42:47.182406 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:42:47 crc kubenswrapper[4707]: E0121 15:42:47.182641 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:42:47 crc kubenswrapper[4707]: I0121 15:42:47.190587 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" path="/var/lib/kubelet/pods/10fd4722-3015-418c-8930-474edb0bdbc6/volumes" Jan 21 15:42:52 crc kubenswrapper[4707]: I0121 15:42:52.051270 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:42:52 crc kubenswrapper[4707]: I0121 15:42:52.988474 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:52 crc kubenswrapper[4707]: I0121 15:42:52.992161 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:53 crc kubenswrapper[4707]: I0121 15:42:53.629116 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:53 crc kubenswrapper[4707]: I0121 15:42:53.636858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:42:53 crc kubenswrapper[4707]: I0121 15:42:53.683618 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-85b7676df9-ncnhq"] Jan 21 15:42:54 crc kubenswrapper[4707]: I0121 15:42:54.219840 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-log" containerID="cri-o://05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203" gracePeriod=30 Jan 21 15:42:54 crc kubenswrapper[4707]: I0121 15:42:54.219937 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-api" containerID="cri-o://5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0" gracePeriod=30 Jan 21 15:42:55 crc kubenswrapper[4707]: I0121 15:42:55.229056 4707 generic.go:334] "Generic (PLEG): container finished" podID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerID="05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203" exitCode=143 Jan 21 15:42:55 crc kubenswrapper[4707]: I0121 15:42:55.229147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" event={"ID":"b86f97a2-7524-46a8-8e52-73a094fc7242","Type":"ContainerDied","Data":"05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203"} Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.682842 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.788126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-config-data\") pod \"b86f97a2-7524-46a8-8e52-73a094fc7242\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.788193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrq7\" (UniqueName: \"kubernetes.io/projected/b86f97a2-7524-46a8-8e52-73a094fc7242-kube-api-access-2mrq7\") pod \"b86f97a2-7524-46a8-8e52-73a094fc7242\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.788248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-scripts\") pod \"b86f97a2-7524-46a8-8e52-73a094fc7242\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.788291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-combined-ca-bundle\") pod \"b86f97a2-7524-46a8-8e52-73a094fc7242\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.788361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86f97a2-7524-46a8-8e52-73a094fc7242-logs\") pod \"b86f97a2-7524-46a8-8e52-73a094fc7242\" (UID: \"b86f97a2-7524-46a8-8e52-73a094fc7242\") " Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.788867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86f97a2-7524-46a8-8e52-73a094fc7242-logs" (OuterVolumeSpecName: "logs") pod "b86f97a2-7524-46a8-8e52-73a094fc7242" (UID: "b86f97a2-7524-46a8-8e52-73a094fc7242"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.792536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-scripts" (OuterVolumeSpecName: "scripts") pod "b86f97a2-7524-46a8-8e52-73a094fc7242" (UID: "b86f97a2-7524-46a8-8e52-73a094fc7242"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.792616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86f97a2-7524-46a8-8e52-73a094fc7242-kube-api-access-2mrq7" (OuterVolumeSpecName: "kube-api-access-2mrq7") pod "b86f97a2-7524-46a8-8e52-73a094fc7242" (UID: "b86f97a2-7524-46a8-8e52-73a094fc7242"). InnerVolumeSpecName "kube-api-access-2mrq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.823297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b86f97a2-7524-46a8-8e52-73a094fc7242" (UID: "b86f97a2-7524-46a8-8e52-73a094fc7242"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.825409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-config-data" (OuterVolumeSpecName: "config-data") pod "b86f97a2-7524-46a8-8e52-73a094fc7242" (UID: "b86f97a2-7524-46a8-8e52-73a094fc7242"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.890660 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b86f97a2-7524-46a8-8e52-73a094fc7242-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.890688 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.890700 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrq7\" (UniqueName: \"kubernetes.io/projected/b86f97a2-7524-46a8-8e52-73a094fc7242-kube-api-access-2mrq7\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.890708 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:57 crc kubenswrapper[4707]: I0121 15:42:57.890717 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86f97a2-7524-46a8-8e52-73a094fc7242-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.250746 4707 generic.go:334] "Generic (PLEG): container finished" podID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerID="5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0" exitCode=0 Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.250779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.250791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" event={"ID":"b86f97a2-7524-46a8-8e52-73a094fc7242","Type":"ContainerDied","Data":"5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0"} Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.250838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85b7676df9-ncnhq" event={"ID":"b86f97a2-7524-46a8-8e52-73a094fc7242","Type":"ContainerDied","Data":"94927bab8a001b56d04e3bf6edb99f4161922bc6351a2029b57ac76c426842b2"} Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.250857 4707 scope.go:117] "RemoveContainer" containerID="5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.269096 4707 scope.go:117] "RemoveContainer" containerID="05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.275373 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-85b7676df9-ncnhq"] Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.280338 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-85b7676df9-ncnhq"] Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.284047 4707 scope.go:117] "RemoveContainer" containerID="5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0" Jan 21 15:42:58 crc kubenswrapper[4707]: E0121 15:42:58.284341 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0\": container with ID starting with 5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0 not found: ID does not exist" containerID="5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.284366 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0"} err="failed to get container status \"5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0\": rpc error: code = NotFound desc = could not find container \"5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0\": container with ID starting with 5c62628b174d36096bf280cfb0b404ebf99e02de39999c256b83f82b56e021b0 not found: ID does not exist" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.284382 4707 scope.go:117] "RemoveContainer" containerID="05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203" Jan 21 15:42:58 crc kubenswrapper[4707]: E0121 15:42:58.284623 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203\": container with ID starting with 05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203 not found: ID does not exist" containerID="05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.284654 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203"} err="failed 
to get container status \"05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203\": rpc error: code = NotFound desc = could not find container \"05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203\": container with ID starting with 05228f289f269d8b70618995f8f989f81068f29c0f35facc4caf36a6aa0a6203 not found: ID does not exist" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.414797 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.558785 4707 scope.go:117] "RemoveContainer" containerID="2ee450230b14b4a5c2a9d71be025be369c10b66e0be91322654b7f1dcc35d9a2" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.576960 4707 scope.go:117] "RemoveContainer" containerID="48ccd1e724ea566ac4ef6f265b3e10eddd03645be766819c72a99208070bd75e" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.609290 4707 scope.go:117] "RemoveContainer" containerID="cbcc2961b6a327d16171eb31d8ea54ff8a2c6f5303297d8511993e6e6fbb1e86" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.636041 4707 scope.go:117] "RemoveContainer" containerID="4e8db9206e60207d764bf1cf861c46a318af07cd5da8f84caf3d1dceec27103a" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.659399 4707 scope.go:117] "RemoveContainer" containerID="6f7a4f03f1265c02518cb79a00df2bf9a2f6b20ef54c2f013a587fc6042298c4" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.686894 4707 scope.go:117] "RemoveContainer" containerID="e9c5e95311c111be35df30d16c6a5e7813e650f8ab4dabdc6a40169ffdf21a57" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.702757 4707 scope.go:117] "RemoveContainer" containerID="f431730553af7e4d97ef4b5ba81a40ace44ae601e09bfd4227bed22314d88fa7" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.732844 4707 scope.go:117] "RemoveContainer" containerID="f123bdd1dba97854e5b6dbe5bcbbf1f01fb964c4f1fe9a7a1965cf38a8e919f5" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.748448 4707 scope.go:117] "RemoveContainer" containerID="57c89da5fb9005ec6dcd82ec6ad8a265e4dd85410a9e6a5f1e56c83f0296636c" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.763759 4707 scope.go:117] "RemoveContainer" containerID="cee8fc65d92a4c449380c20ddd4feeb07fa1354e54366a6b28e01b8d807032e6" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.793418 4707 scope.go:117] "RemoveContainer" containerID="b1188d36dc2bdebaced9f39f950f8b3b1eca82f9ad0ca7aef76acdda21e53bf1" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.807562 4707 scope.go:117] "RemoveContainer" containerID="9ab835caae749435aeadf6ada86120577bb2f30d5a8addc17705da18c542955c" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.831602 4707 scope.go:117] "RemoveContainer" containerID="90983c11e74f2f1c02a3d7133c07441ce683da95b9d00877cfefa6d7ec35f648" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.850208 4707 scope.go:117] "RemoveContainer" containerID="915ec7465a1a102fa76e8d82b871add4c78fcca54bbabfcfdaf7adb84ced2a0d" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.875487 4707 scope.go:117] "RemoveContainer" containerID="141300207b12cdbdf06baed6cb9e51962fd7f970e7d8fd78eb7b854f6ba5d5a6" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.892274 4707 scope.go:117] "RemoveContainer" containerID="2bad8cee5f896a528e089f0e95328272fc4d1d17326f5b82657dc7c6854aebea" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.911037 4707 scope.go:117] "RemoveContainer" 
containerID="9855407ae307baba14bb2d5bf299fe60a15470f4bae1dd8811b5a4cd3662c164" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.940319 4707 scope.go:117] "RemoveContainer" containerID="b26e99f385c2260f1e53ef3f8fa69105e75942496a20c6800252912d20edc6b9" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.956918 4707 scope.go:117] "RemoveContainer" containerID="30c8e6768c731bc44e17cb82c6d9a3a43f549efa140290faacd619bcb41a9383" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.974137 4707 scope.go:117] "RemoveContainer" containerID="9d5b0b7f5a582f6c4354f5489be09ac9274c0e007f0788be84a147857f5bd690" Jan 21 15:42:58 crc kubenswrapper[4707]: I0121 15:42:58.988858 4707 scope.go:117] "RemoveContainer" containerID="73b71146f952853bdd6e59449df71c4b5a1958e47e4c4588e0b1ac00e9bd595a" Jan 21 15:42:59 crc kubenswrapper[4707]: I0121 15:42:59.017235 4707 scope.go:117] "RemoveContainer" containerID="17314e5ab8670c6d632a90347ca49b8287c481196323c08cd4371587301e2862" Jan 21 15:42:59 crc kubenswrapper[4707]: I0121 15:42:59.193163 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" path="/var/lib/kubelet/pods/b86f97a2-7524-46a8-8e52-73a094fc7242/volumes" Jan 21 15:43:00 crc kubenswrapper[4707]: I0121 15:43:00.811362 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:43:01 crc kubenswrapper[4707]: I0121 15:43:01.183066 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:43:01 crc kubenswrapper[4707]: E0121 15:43:01.183451 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:43:03 crc kubenswrapper[4707]: I0121 15:43:03.686643 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:43:03 crc kubenswrapper[4707]: I0121 15:43:03.739123 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-566fb846f6-xwtm2"] Jan 21 15:43:03 crc kubenswrapper[4707]: I0121 15:43:03.739413 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-api" containerID="cri-o://470d8baec8d8a85adfa5eb17592dc43ee2ed9dbd3ce5dcc778e85b9df92189a7" gracePeriod=30 Jan 21 15:43:03 crc kubenswrapper[4707]: I0121 15:43:03.739909 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-httpd" containerID="cri-o://730d51fbb75b72d819fa14edb27a01f7434fbd019813a7974ed132c35e5ce667" gracePeriod=30 Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.311041 4707 generic.go:334] "Generic (PLEG): container finished" podID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerID="730d51fbb75b72d819fa14edb27a01f7434fbd019813a7974ed132c35e5ce667" exitCode=0 Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.311213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" event={"ID":"91706eeb-4f05-4bf7-b2b2-53618d51bda5","Type":"ContainerDied","Data":"730d51fbb75b72d819fa14edb27a01f7434fbd019813a7974ed132c35e5ce667"} Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:04 crc kubenswrapper[4707]: E0121 15:43:04.884374 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api-log" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884387 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api-log" Jan 21 15:43:04 crc kubenswrapper[4707]: E0121 15:43:04.884402 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-log" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884408 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-log" Jan 21 15:43:04 crc kubenswrapper[4707]: E0121 15:43:04.884422 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-api" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884427 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-api" Jan 21 15:43:04 crc kubenswrapper[4707]: E0121 15:43:04.884435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-api" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884597 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api-log" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884607 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86f97a2-7524-46a8-8e52-73a094fc7242" containerName="placement-log" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.884619 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10fd4722-3015-418c-8930-474edb0bdbc6" containerName="barbican-api" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.885127 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.887075 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-9bq6v" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.892519 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.893192 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.893762 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.901781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:04 crc kubenswrapper[4707]: E0121 15:43:04.903443 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-wj7f5 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-wj7f5]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="d48d9a0c-7b80-4f08-8f72-ed48b77875db" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.913569 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.970665 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.971635 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:04 crc kubenswrapper[4707]: I0121 15:43:04.984851 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config-secret\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgrfg\" (UniqueName: \"kubernetes.io/projected/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-kube-api-access-fgrfg\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config-secret\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-combined-ca-bundle\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7f5\" (UniqueName: \"kubernetes.io/projected/d48d9a0c-7b80-4f08-8f72-ed48b77875db-kube-api-access-wj7f5\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.005991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.106856 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wj7f5\" (UniqueName: \"kubernetes.io/projected/d48d9a0c-7b80-4f08-8f72-ed48b77875db-kube-api-access-wj7f5\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.106900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.106921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.106953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config-secret\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.106972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgrfg\" (UniqueName: \"kubernetes.io/projected/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-kube-api-access-fgrfg\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.107020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config-secret\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.107045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.107066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-combined-ca-bundle\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.108358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.109279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: E0121 15:43:05.109629 4707 projected.go:194] Error preparing data for projected volume kube-api-access-wj7f5 for pod openstack-kuttl-tests/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (d48d9a0c-7b80-4f08-8f72-ed48b77875db) does not match the UID in record. The object might have been deleted and then recreated Jan 21 15:43:05 crc kubenswrapper[4707]: E0121 15:43:05.109674 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d48d9a0c-7b80-4f08-8f72-ed48b77875db-kube-api-access-wj7f5 podName:d48d9a0c-7b80-4f08-8f72-ed48b77875db nodeName:}" failed. No retries permitted until 2026-01-21 15:43:05.609660442 +0000 UTC m=+2482.791176665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wj7f5" (UniqueName: "kubernetes.io/projected/d48d9a0c-7b80-4f08-8f72-ed48b77875db-kube-api-access-wj7f5") pod "openstackclient" (UID: "d48d9a0c-7b80-4f08-8f72-ed48b77875db") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (d48d9a0c-7b80-4f08-8f72-ed48b77875db) does not match the UID in record. The object might have been deleted and then recreated Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.112072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config-secret\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.112096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.112347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-combined-ca-bundle\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.112866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config-secret\") pod \"openstackclient\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.121925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgrfg\" (UniqueName: \"kubernetes.io/projected/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-kube-api-access-fgrfg\") pod \"openstackclient\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.284334 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.318774 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.363047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.370843 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="d48d9a0c-7b80-4f08-8f72-ed48b77875db" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.417070 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj7f5\" (UniqueName: \"kubernetes.io/projected/d48d9a0c-7b80-4f08-8f72-ed48b77875db-kube-api-access-wj7f5\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.518049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-combined-ca-bundle\") pod \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.518104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config\") pod \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.518369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config-secret\") pod \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\" (UID: \"d48d9a0c-7b80-4f08-8f72-ed48b77875db\") " Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.520786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d48d9a0c-7b80-4f08-8f72-ed48b77875db" (UID: "d48d9a0c-7b80-4f08-8f72-ed48b77875db"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.536528 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d48d9a0c-7b80-4f08-8f72-ed48b77875db" (UID: "d48d9a0c-7b80-4f08-8f72-ed48b77875db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.536531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d48d9a0c-7b80-4f08-8f72-ed48b77875db" (UID: "d48d9a0c-7b80-4f08-8f72-ed48b77875db"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.620438 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.620603 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.620668 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48d9a0c-7b80-4f08-8f72-ed48b77875db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:05 crc kubenswrapper[4707]: I0121 15:43:05.692683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:43:06 crc kubenswrapper[4707]: I0121 15:43:06.328114 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:43:06 crc kubenswrapper[4707]: I0121 15:43:06.328161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63","Type":"ContainerStarted","Data":"bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9"} Jan 21 15:43:06 crc kubenswrapper[4707]: I0121 15:43:06.328495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63","Type":"ContainerStarted","Data":"571109b7c41015118abe78ac2d06bdd9d867c5cd1d5e7e0818aca89c290c521f"} Jan 21 15:43:06 crc kubenswrapper[4707]: I0121 15:43:06.341736 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.341722414 podStartE2EDuration="2.341722414s" podCreationTimestamp="2026-01-21 15:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:06.338152222 +0000 UTC m=+2483.519668444" watchObservedRunningTime="2026-01-21 15:43:06.341722414 +0000 UTC m=+2483.523238637" Jan 21 15:43:06 crc kubenswrapper[4707]: I0121 15:43:06.342676 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="d48d9a0c-7b80-4f08-8f72-ed48b77875db" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.190927 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48d9a0c-7b80-4f08-8f72-ed48b77875db" path="/var/lib/kubelet/pods/d48d9a0c-7b80-4f08-8f72-ed48b77875db/volumes" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.316583 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct"] Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.318120 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.319413 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.319836 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.321546 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.337071 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct"] Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-public-tls-certs\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-etc-swift\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-internal-tls-certs\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-log-httpd\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-run-httpd\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-config-data\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdvjw\" (UniqueName: 
\"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-kube-api-access-tdvjw\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.350333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-combined-ca-bundle\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-config-data\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdvjw\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-kube-api-access-tdvjw\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-combined-ca-bundle\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-public-tls-certs\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-etc-swift\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-internal-tls-certs\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-log-httpd\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.451634 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-run-httpd\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.452139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-run-httpd\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.452451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-log-httpd\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.458141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-public-tls-certs\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.458150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-internal-tls-certs\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.458881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-etc-swift\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.459679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-config-data\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.466135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdvjw\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-kube-api-access-tdvjw\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.472071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-combined-ca-bundle\") pod \"swift-proxy-6c95d7d87f-4p9ct\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.633506 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.634753 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.634960 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d8988513-d916-4f32-aa5e-51ed5a40d697" containerName="kube-state-metrics" containerID="cri-o://63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427" gracePeriod=30 Jan 21 15:43:07 crc kubenswrapper[4707]: I0121 15:43:07.993671 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.076494 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct"] Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.162616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k6sh\" (UniqueName: \"kubernetes.io/projected/d8988513-d916-4f32-aa5e-51ed5a40d697-kube-api-access-8k6sh\") pod \"d8988513-d916-4f32-aa5e-51ed5a40d697\" (UID: \"d8988513-d916-4f32-aa5e-51ed5a40d697\") " Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.165331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8988513-d916-4f32-aa5e-51ed5a40d697-kube-api-access-8k6sh" (OuterVolumeSpecName: "kube-api-access-8k6sh") pod "d8988513-d916-4f32-aa5e-51ed5a40d697" (UID: "d8988513-d916-4f32-aa5e-51ed5a40d697"). InnerVolumeSpecName "kube-api-access-8k6sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.265609 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k6sh\" (UniqueName: \"kubernetes.io/projected/d8988513-d916-4f32-aa5e-51ed5a40d697-kube-api-access-8k6sh\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.341957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" event={"ID":"ca53a031-35d7-4861-8ccd-691ec790ce9c","Type":"ContainerStarted","Data":"8597f56a988d7dd7e3a62e7f1cbbe0a94a68d83003c9ba88a0cb7a507db48eae"} Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.342011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" event={"ID":"ca53a031-35d7-4861-8ccd-691ec790ce9c","Type":"ContainerStarted","Data":"91a464dd2b1b7a093f17f2c45826faedf6137701707d73e4c79aa44fc94f0ade"} Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.343352 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8988513-d916-4f32-aa5e-51ed5a40d697" containerID="63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427" exitCode=2 Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.343395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d8988513-d916-4f32-aa5e-51ed5a40d697","Type":"ContainerDied","Data":"63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427"} Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.343401 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.343418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d8988513-d916-4f32-aa5e-51ed5a40d697","Type":"ContainerDied","Data":"14d565ab8e4424fe4f8e7e75ce55eceb9b6bc6310430705cb4ece547a6936fd6"} Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.343434 4707 scope.go:117] "RemoveContainer" containerID="63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.362138 4707 scope.go:117] "RemoveContainer" containerID="63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427" Jan 21 15:43:08 crc kubenswrapper[4707]: E0121 15:43:08.362426 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427\": container with ID starting with 63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427 not found: ID does not exist" containerID="63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.362449 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427"} err="failed to get container status \"63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427\": rpc error: code = NotFound desc = could not find container \"63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427\": container with ID starting with 63d0734d2dd7970888b56ea8946ee786aec5741fd631eea1abac09c1d8e38427 not found: ID does not exist" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.372017 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.382152 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.388373 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:43:08 crc kubenswrapper[4707]: E0121 15:43:08.388691 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8988513-d916-4f32-aa5e-51ed5a40d697" containerName="kube-state-metrics" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.388709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8988513-d916-4f32-aa5e-51ed5a40d697" containerName="kube-state-metrics" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.388876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8988513-d916-4f32-aa5e-51ed5a40d697" containerName="kube-state-metrics" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.393612 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.394382 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.394881 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.395296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.572610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.573152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfg52\" (UniqueName: \"kubernetes.io/projected/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-api-access-rfg52\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.573205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.573732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.675212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.675301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.675325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfg52\" (UniqueName: \"kubernetes.io/projected/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-api-access-rfg52\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 
15:43:08.675347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.678776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.678912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.678932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.689046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfg52\" (UniqueName: \"kubernetes.io/projected/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-api-access-rfg52\") pod \"kube-state-metrics-0\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:08 crc kubenswrapper[4707]: I0121 15:43:08.707252 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.112888 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.167240 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.167488 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-central-agent" containerID="cri-o://49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33" gracePeriod=30 Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.167939 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="proxy-httpd" containerID="cri-o://7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8" gracePeriod=30 Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.167996 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="sg-core" containerID="cri-o://85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3" gracePeriod=30 Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.168036 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-notification-agent" containerID="cri-o://59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836" gracePeriod=30 Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.190607 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8988513-d916-4f32-aa5e-51ed5a40d697" path="/var/lib/kubelet/pods/d8988513-d916-4f32-aa5e-51ed5a40d697/volumes" Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.354279 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac5d1294-8046-439a-a08b-5beea0684969" containerID="7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8" exitCode=0 Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.354315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerDied","Data":"7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8"} Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.354335 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac5d1294-8046-439a-a08b-5beea0684969" containerID="85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3" exitCode=2 Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.354345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerDied","Data":"85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3"} Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.355570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"e5aeb6d2-f710-4490-9929-ac42ed945b2a","Type":"ContainerStarted","Data":"d6f01eaa5215e6cb6ee1567a17f32e68f211cf83f30f80dca7c3ecf26f893905"} Jan 21 
15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.358476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" event={"ID":"ca53a031-35d7-4861-8ccd-691ec790ce9c","Type":"ContainerStarted","Data":"8f7662857b1d2fd154bd23c6d539f25deaead9f07840cdb3f519d53c088180ad"} Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.358676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.358826 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:09 crc kubenswrapper[4707]: I0121 15:43:09.377019 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" podStartSLOduration=2.377005215 podStartE2EDuration="2.377005215s" podCreationTimestamp="2026-01-21 15:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:09.373431887 +0000 UTC m=+2486.554948109" watchObservedRunningTime="2026-01-21 15:43:09.377005215 +0000 UTC m=+2486.558521437" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.373081 4707 generic.go:334] "Generic (PLEG): container finished" podID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerID="470d8baec8d8a85adfa5eb17592dc43ee2ed9dbd3ce5dcc778e85b9df92189a7" exitCode=0 Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.373147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" event={"ID":"91706eeb-4f05-4bf7-b2b2-53618d51bda5","Type":"ContainerDied","Data":"470d8baec8d8a85adfa5eb17592dc43ee2ed9dbd3ce5dcc778e85b9df92189a7"} Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.376760 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac5d1294-8046-439a-a08b-5beea0684969" containerID="49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33" exitCode=0 Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.376839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerDied","Data":"49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33"} Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.378755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"e5aeb6d2-f710-4490-9929-ac42ed945b2a","Type":"ContainerStarted","Data":"7fd0a096985f6e5334dfd12d0e94f5b59badf344ad4373aecceb6bac8d6b3dd4"} Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.378933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.397646 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.104247504 podStartE2EDuration="2.397628951s" podCreationTimestamp="2026-01-21 15:43:08 +0000 UTC" firstStartedPulling="2026-01-21 15:43:09.107674467 +0000 UTC m=+2486.289190688" lastFinishedPulling="2026-01-21 15:43:09.401055912 +0000 UTC m=+2486.582572135" observedRunningTime="2026-01-21 15:43:10.393316183 +0000 UTC m=+2487.574832405" watchObservedRunningTime="2026-01-21 15:43:10.397628951 +0000 UTC m=+2487.579145173" Jan 
21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.651509 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.821575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz4zl\" (UniqueName: \"kubernetes.io/projected/91706eeb-4f05-4bf7-b2b2-53618d51bda5-kube-api-access-kz4zl\") pod \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.821691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-ovndb-tls-certs\") pod \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.821741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-config\") pod \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.821785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-combined-ca-bundle\") pod \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.821859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-httpd-config\") pod \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\" (UID: \"91706eeb-4f05-4bf7-b2b2-53618d51bda5\") " Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.827941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "91706eeb-4f05-4bf7-b2b2-53618d51bda5" (UID: "91706eeb-4f05-4bf7-b2b2-53618d51bda5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.828978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91706eeb-4f05-4bf7-b2b2-53618d51bda5-kube-api-access-kz4zl" (OuterVolumeSpecName: "kube-api-access-kz4zl") pod "91706eeb-4f05-4bf7-b2b2-53618d51bda5" (UID: "91706eeb-4f05-4bf7-b2b2-53618d51bda5"). InnerVolumeSpecName "kube-api-access-kz4zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.860157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91706eeb-4f05-4bf7-b2b2-53618d51bda5" (UID: "91706eeb-4f05-4bf7-b2b2-53618d51bda5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.863188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-config" (OuterVolumeSpecName: "config") pod "91706eeb-4f05-4bf7-b2b2-53618d51bda5" (UID: "91706eeb-4f05-4bf7-b2b2-53618d51bda5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.883661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "91706eeb-4f05-4bf7-b2b2-53618d51bda5" (UID: "91706eeb-4f05-4bf7-b2b2-53618d51bda5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.905328 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.923720 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.923744 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.923755 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.923763 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91706eeb-4f05-4bf7-b2b2-53618d51bda5-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:10 crc kubenswrapper[4707]: I0121 15:43:10.923772 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz4zl\" (UniqueName: \"kubernetes.io/projected/91706eeb-4f05-4bf7-b2b2-53618d51bda5-kube-api-access-kz4zl\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-run-httpd\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-sg-core-conf-yaml\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-scripts\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025449 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4vqd\" (UniqueName: \"kubernetes.io/projected/ac5d1294-8046-439a-a08b-5beea0684969-kube-api-access-k4vqd\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-log-httpd\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.025779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-config-data\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.026187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-combined-ca-bundle\") pod \"ac5d1294-8046-439a-a08b-5beea0684969\" (UID: \"ac5d1294-8046-439a-a08b-5beea0684969\") " Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.026245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.026748 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.026916 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5d1294-8046-439a-a08b-5beea0684969-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.027929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-scripts" (OuterVolumeSpecName: "scripts") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.029353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5d1294-8046-439a-a08b-5beea0684969-kube-api-access-k4vqd" (OuterVolumeSpecName: "kube-api-access-k4vqd") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "kube-api-access-k4vqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.046150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.079112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.095045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-config-data" (OuterVolumeSpecName: "config-data") pod "ac5d1294-8046-439a-a08b-5beea0684969" (UID: "ac5d1294-8046-439a-a08b-5beea0684969"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.128490 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.128529 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.128541 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4vqd\" (UniqueName: \"kubernetes.io/projected/ac5d1294-8046-439a-a08b-5beea0684969-kube-api-access-k4vqd\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.128552 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.128562 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5d1294-8046-439a-a08b-5beea0684969-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.386349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" event={"ID":"91706eeb-4f05-4bf7-b2b2-53618d51bda5","Type":"ContainerDied","Data":"9c6f37e742788cd2f8282fc461783dfb6dff2f836c4dc4069ba189a8cfa1e561"} Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.386414 4707 scope.go:117] "RemoveContainer" containerID="730d51fbb75b72d819fa14edb27a01f7434fbd019813a7974ed132c35e5ce667" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.386365 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-566fb846f6-xwtm2" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.389131 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac5d1294-8046-439a-a08b-5beea0684969" containerID="59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836" exitCode=0 Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.389189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerDied","Data":"59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836"} Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.389224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ac5d1294-8046-439a-a08b-5beea0684969","Type":"ContainerDied","Data":"6281d97debb8c5946cd57bc9e3768a8d106ecbab0211588f5808231951a73001"} Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.389197 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.407458 4707 scope.go:117] "RemoveContainer" containerID="470d8baec8d8a85adfa5eb17592dc43ee2ed9dbd3ce5dcc778e85b9df92189a7" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.411318 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-566fb846f6-xwtm2"] Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.421894 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-566fb846f6-xwtm2"] Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.428447 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.434844 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.439875 4707 scope.go:117] "RemoveContainer" containerID="7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.439988 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.440323 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="sg-core" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440342 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="sg-core" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.440356 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-api" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-api" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.440380 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-httpd" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440386 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-httpd" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.440397 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="proxy-httpd" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440401 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="proxy-httpd" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.440417 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-central-agent" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440422 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-central-agent" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.440436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-notification-agent" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-notification-agent" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440600 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-notification-agent" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440615 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="proxy-httpd" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440622 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-api" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440632 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="ceilometer-central-agent" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" containerName="neutron-httpd" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.440652 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5d1294-8046-439a-a08b-5beea0684969" containerName="sg-core" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.442393 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.445496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.445741 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.446361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.454397 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.458415 4707 scope.go:117] "RemoveContainer" containerID="85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.485411 4707 scope.go:117] "RemoveContainer" containerID="59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.503149 4707 scope.go:117] "RemoveContainer" containerID="49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544048 4707 scope.go:117] "RemoveContainer" containerID="7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.544412 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8\": container with ID starting with 7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8 not found: ID does not exist" containerID="7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544441 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8"} err="failed to get container status \"7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8\": rpc error: code = NotFound desc = could not find container \"7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8\": container with ID starting with 7a282eeb1c4c7c8c7a8abd21c2d353656d2e6233c74e279f230f92f58a74d4b8 not found: ID does not exist" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544458 4707 scope.go:117] "RemoveContainer" containerID="85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.544637 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3\": container with ID starting with 85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3 not found: ID does not exist" containerID="85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544661 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3"} err="failed to get container status \"85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3\": rpc error: code = NotFound desc = 
could not find container \"85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3\": container with ID starting with 85ad101c9bbfe029dc89f95bcf1b97cc7317cb661b2836b5b2348e0007d80ab3 not found: ID does not exist" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544675 4707 scope.go:117] "RemoveContainer" containerID="59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.544864 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836\": container with ID starting with 59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836 not found: ID does not exist" containerID="59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544880 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836"} err="failed to get container status \"59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836\": rpc error: code = NotFound desc = could not find container \"59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836\": container with ID starting with 59a8f873af10b2c709d9c3ce80f1444d17f1871be9d3f9e2016935ce0854b836 not found: ID does not exist" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.544896 4707 scope.go:117] "RemoveContainer" containerID="49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33" Jan 21 15:43:11 crc kubenswrapper[4707]: E0121 15:43:11.545065 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33\": container with ID starting with 49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33 not found: ID does not exist" containerID="49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.545085 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33"} err="failed to get container status \"49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33\": rpc error: code = NotFound desc = could not find container \"49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33\": container with ID starting with 49f5efdd4e48e8d2816497b9c162e4f379f8c60ec8ca016c1912e87345679a33 not found: ID does not exist" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvjgn\" (UniqueName: \"kubernetes.io/projected/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-kube-api-access-tvjgn\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-scripts\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635545 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-log-httpd\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-run-httpd\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.635666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-config-data\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-run-httpd\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-config-data\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvjgn\" (UniqueName: 
\"kubernetes.io/projected/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-kube-api-access-tvjgn\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-scripts\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-log-httpd\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-run-httpd\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.736981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-log-httpd\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.742210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-scripts\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.742333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.743072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-config-data\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.743425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.748776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.750586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvjgn\" (UniqueName: \"kubernetes.io/projected/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-kube-api-access-tvjgn\") pod \"ceilometer-0\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:11 crc kubenswrapper[4707]: I0121 15:43:11.764595 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:12 crc kubenswrapper[4707]: I0121 15:43:12.166880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:12 crc kubenswrapper[4707]: W0121 15:43:12.167046 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf1f5e4d_8b60_4d29_bf84_5b5aa7ea0e47.slice/crio-c80fbe91a6f12848293d6945985d07cd2b1d6e16b8ec7065797c671eab0151f8 WatchSource:0}: Error finding container c80fbe91a6f12848293d6945985d07cd2b1d6e16b8ec7065797c671eab0151f8: Status 404 returned error can't find the container with id c80fbe91a6f12848293d6945985d07cd2b1d6e16b8ec7065797c671eab0151f8 Jan 21 15:43:12 crc kubenswrapper[4707]: I0121 15:43:12.405292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerStarted","Data":"c80fbe91a6f12848293d6945985d07cd2b1d6e16b8ec7065797c671eab0151f8"} Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.190941 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91706eeb-4f05-4bf7-b2b2-53618d51bda5" path="/var/lib/kubelet/pods/91706eeb-4f05-4bf7-b2b2-53618d51bda5/volumes" Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.191900 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5d1294-8046-439a-a08b-5beea0684969" path="/var/lib/kubelet/pods/ac5d1294-8046-439a-a08b-5beea0684969/volumes" Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.413297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerStarted","Data":"1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191"} Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.701917 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.936557 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.936980 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" 
podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-log" containerID="cri-o://ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096" gracePeriod=30 Jan 21 15:43:13 crc kubenswrapper[4707]: I0121 15:43:13.937064 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-httpd" containerID="cri-o://66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db" gracePeriod=30 Jan 21 15:43:14 crc kubenswrapper[4707]: I0121 15:43:14.420979 4707 generic.go:334] "Generic (PLEG): container finished" podID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerID="ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096" exitCode=143 Jan 21 15:43:14 crc kubenswrapper[4707]: I0121 15:43:14.421017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b423a914-b340-4f8f-ae6a-f55acfccd5da","Type":"ContainerDied","Data":"ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096"} Jan 21 15:43:14 crc kubenswrapper[4707]: I0121 15:43:14.422979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerStarted","Data":"85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39"} Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.137916 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.138375 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-log" containerID="cri-o://f38b9dcb38e276bd9c92539d9f6c09c61aeb4c0e20d24a04f0aecd1dde1da008" gracePeriod=30 Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.138648 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-httpd" containerID="cri-o://42a79abf921569981336ed28abfb2303337cb7d5d4419864703a223193f9e1a4" gracePeriod=30 Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.182695 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:43:15 crc kubenswrapper[4707]: E0121 15:43:15.182980 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.432950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerStarted","Data":"cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4"} Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.435490 4707 generic.go:334] "Generic (PLEG): container finished" podID="f762786d-4f31-412e-a685-b196f3cf3fea" 
containerID="f38b9dcb38e276bd9c92539d9f6c09c61aeb4c0e20d24a04f0aecd1dde1da008" exitCode=143 Jan 21 15:43:15 crc kubenswrapper[4707]: I0121 15:43:15.435528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f762786d-4f31-412e-a685-b196f3cf3fea","Type":"ContainerDied","Data":"f38b9dcb38e276bd9c92539d9f6c09c61aeb4c0e20d24a04f0aecd1dde1da008"} Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.444047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerStarted","Data":"10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55"} Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.444419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.444247 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="sg-core" containerID="cri-o://cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4" gracePeriod=30 Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.444176 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-central-agent" containerID="cri-o://1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191" gracePeriod=30 Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.444251 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-notification-agent" containerID="cri-o://85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39" gracePeriod=30 Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.444343 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="proxy-httpd" containerID="cri-o://10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55" gracePeriod=30 Jan 21 15:43:16 crc kubenswrapper[4707]: I0121 15:43:16.473795 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.133943623 podStartE2EDuration="5.473780696s" podCreationTimestamp="2026-01-21 15:43:11 +0000 UTC" firstStartedPulling="2026-01-21 15:43:12.169370947 +0000 UTC m=+2489.350887169" lastFinishedPulling="2026-01-21 15:43:15.50920802 +0000 UTC m=+2492.690724242" observedRunningTime="2026-01-21 15:43:16.466210292 +0000 UTC m=+2493.647726514" watchObservedRunningTime="2026-01-21 15:43:16.473780696 +0000 UTC m=+2493.655296917" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.412871 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.429388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-public-tls-certs\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.429438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-httpd-run\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.429473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-logs\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.429946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.429958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-logs" (OuterVolumeSpecName: "logs") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.429992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.430089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-scripts\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.430134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgphk\" (UniqueName: \"kubernetes.io/projected/b423a914-b340-4f8f-ae6a-f55acfccd5da-kube-api-access-vgphk\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.430149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-config-data\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.430188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-combined-ca-bundle\") pod \"b423a914-b340-4f8f-ae6a-f55acfccd5da\" (UID: \"b423a914-b340-4f8f-ae6a-f55acfccd5da\") " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.430939 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.430956 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b423a914-b340-4f8f-ae6a-f55acfccd5da-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.437244 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-scripts" (OuterVolumeSpecName: "scripts") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.451549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.451600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b423a914-b340-4f8f-ae6a-f55acfccd5da-kube-api-access-vgphk" (OuterVolumeSpecName: "kube-api-access-vgphk") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "kube-api-access-vgphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.465039 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerID="10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55" exitCode=0 Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.465068 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerID="cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4" exitCode=2 Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.465077 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerID="85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39" exitCode=0 Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.465118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerDied","Data":"10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55"} Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.465191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerDied","Data":"cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4"} Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.465210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerDied","Data":"85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39"} Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.469657 4707 generic.go:334] "Generic (PLEG): container finished" podID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerID="66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db" exitCode=0 Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.469690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b423a914-b340-4f8f-ae6a-f55acfccd5da","Type":"ContainerDied","Data":"66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db"} Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.469739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b423a914-b340-4f8f-ae6a-f55acfccd5da","Type":"ContainerDied","Data":"45c09eae1c3695863d1a8d7ec5493347c3d956d73b562f253ee2927315face69"} Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.469757 4707 scope.go:117] "RemoveContainer" containerID="66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.469957 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.484684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.499354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-config-data" (OuterVolumeSpecName: "config-data") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.507297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b423a914-b340-4f8f-ae6a-f55acfccd5da" (UID: "b423a914-b340-4f8f-ae6a-f55acfccd5da"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.510241 4707 scope.go:117] "RemoveContainer" containerID="ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.528938 4707 scope.go:117] "RemoveContainer" containerID="66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db" Jan 21 15:43:17 crc kubenswrapper[4707]: E0121 15:43:17.529488 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db\": container with ID starting with 66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db not found: ID does not exist" containerID="66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.529552 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db"} err="failed to get container status \"66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db\": rpc error: code = NotFound desc = could not find container \"66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db\": container with ID starting with 66c4037d4024a06a33ab18b575af18658a51a67e5c1cab00ab2ddbae9eeeb5db not found: ID does not exist" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.529583 4707 scope.go:117] "RemoveContainer" containerID="ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096" Jan 21 15:43:17 crc kubenswrapper[4707]: E0121 15:43:17.530033 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096\": container with ID starting with ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096 not found: ID does not exist" containerID="ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.530078 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096"} err="failed to get container status \"ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096\": rpc error: code = NotFound desc = could not find container \"ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096\": container with ID starting with ce4ebb70a010bf7a5ec40d6e6429e92b2a51c5e8fc51cc84b265f4d1b1b91096 not 
found: ID does not exist" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.531446 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.531494 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.531505 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.531518 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgphk\" (UniqueName: \"kubernetes.io/projected/b423a914-b340-4f8f-ae6a-f55acfccd5da-kube-api-access-vgphk\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.531529 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.531537 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423a914-b340-4f8f-ae6a-f55acfccd5da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.545946 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.632604 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.640088 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.640854 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.794843 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.808018 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.821236 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:43:17 crc kubenswrapper[4707]: E0121 15:43:17.821677 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-httpd" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.821696 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-httpd" Jan 21 15:43:17 crc kubenswrapper[4707]: E0121 15:43:17.821777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-log" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.821790 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-log" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.822023 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-log" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.822213 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" containerName="glance-httpd" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.823901 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.825587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.825760 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.827407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-logs\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfmb\" (UniqueName: \"kubernetes.io/projected/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-kube-api-access-fmfmb\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938888 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.938987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:17 crc kubenswrapper[4707]: I0121 15:43:17.939043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-logs\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmfmb\" (UniqueName: \"kubernetes.io/projected/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-kube-api-access-fmfmb\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.040683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.041317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-logs\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.041361 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.041399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.045131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.045197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.045303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.046444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.061138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfmb\" (UniqueName: \"kubernetes.io/projected/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-kube-api-access-fmfmb\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.067312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.143692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.484056 4707 generic.go:334] "Generic (PLEG): container finished" podID="f762786d-4f31-412e-a685-b196f3cf3fea" containerID="42a79abf921569981336ed28abfb2303337cb7d5d4419864703a223193f9e1a4" exitCode=0 Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.484121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f762786d-4f31-412e-a685-b196f3cf3fea","Type":"ContainerDied","Data":"42a79abf921569981336ed28abfb2303337cb7d5d4419864703a223193f9e1a4"} Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.525185 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:43:18 crc kubenswrapper[4707]: W0121 15:43:18.539555 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ab5cb3_56d6_45f8_a8ae_0cda9871960f.slice/crio-266de8cab4ca60a56fbc833343b6e594c928e02fe4e35b8aafcd90f5a05497d0 WatchSource:0}: Error finding container 266de8cab4ca60a56fbc833343b6e594c928e02fe4e35b8aafcd90f5a05497d0: Status 404 returned error can't find the container with id 266de8cab4ca60a56fbc833343b6e594c928e02fe4e35b8aafcd90f5a05497d0 Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.731755 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.783837 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-internal-tls-certs\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-config-data\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-logs\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-scripts\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-combined-ca-bundle\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-httpd-run\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.955875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcnjk\" (UniqueName: \"kubernetes.io/projected/f762786d-4f31-412e-a685-b196f3cf3fea-kube-api-access-xcnjk\") pod \"f762786d-4f31-412e-a685-b196f3cf3fea\" (UID: \"f762786d-4f31-412e-a685-b196f3cf3fea\") " Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.957073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-logs" (OuterVolumeSpecName: "logs") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.957103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.960531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f762786d-4f31-412e-a685-b196f3cf3fea-kube-api-access-xcnjk" (OuterVolumeSpecName: "kube-api-access-xcnjk") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "kube-api-access-xcnjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.961359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-scripts" (OuterVolumeSpecName: "scripts") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.961578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.977535 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:18 crc kubenswrapper[4707]: I0121 15:43:18.995249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.001707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-config-data" (OuterVolumeSpecName: "config-data") pod "f762786d-4f31-412e-a685-b196f3cf3fea" (UID: "f762786d-4f31-412e-a685-b196f3cf3fea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058279 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058322 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058333 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058342 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058351 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f762786d-4f31-412e-a685-b196f3cf3fea-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058359 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcnjk\" (UniqueName: \"kubernetes.io/projected/f762786d-4f31-412e-a685-b196f3cf3fea-kube-api-access-xcnjk\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058367 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.058374 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762786d-4f31-412e-a685-b196f3cf3fea-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.074584 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.159565 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.190489 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b423a914-b340-4f8f-ae6a-f55acfccd5da" path="/var/lib/kubelet/pods/b423a914-b340-4f8f-ae6a-f55acfccd5da/volumes" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.495124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f762786d-4f31-412e-a685-b196f3cf3fea","Type":"ContainerDied","Data":"b0c28c4764bb23b7c7d068a39fe864b187aa8b420bdcfc3940d7d314cc2c4769"} Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.495167 4707 scope.go:117] "RemoveContainer" containerID="42a79abf921569981336ed28abfb2303337cb7d5d4419864703a223193f9e1a4" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.495274 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.498557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f","Type":"ContainerStarted","Data":"f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd"} Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.498581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f","Type":"ContainerStarted","Data":"c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2"} Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.498591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f","Type":"ContainerStarted","Data":"266de8cab4ca60a56fbc833343b6e594c928e02fe4e35b8aafcd90f5a05497d0"} Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.515889 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.515867047 podStartE2EDuration="2.515867047s" podCreationTimestamp="2026-01-21 15:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:19.511760206 +0000 UTC m=+2496.693276427" watchObservedRunningTime="2026-01-21 15:43:19.515867047 +0000 UTC m=+2496.697383268" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.527146 4707 scope.go:117] "RemoveContainer" containerID="f38b9dcb38e276bd9c92539d9f6c09c61aeb4c0e20d24a04f0aecd1dde1da008" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.529892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.545734 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.561960 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:43:19 crc kubenswrapper[4707]: E0121 15:43:19.562294 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-httpd" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.562508 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-httpd" Jan 21 15:43:19 crc kubenswrapper[4707]: E0121 15:43:19.562528 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-log" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.562535 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-log" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.562714 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-httpd" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.562735 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" containerName="glance-log" Jan 21 15:43:19 crc 
kubenswrapper[4707]: I0121 15:43:19.563559 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.565514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.565697 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.569058 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-logs\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx7tz\" (UniqueName: \"kubernetes.io/projected/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-kube-api-access-dx7tz\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.668794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.770557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.770606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.770632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.770656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.770723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-logs\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.770955 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.771099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.771323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.773075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx7tz\" (UniqueName: \"kubernetes.io/projected/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-kube-api-access-dx7tz\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.773127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.773165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.775748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.781471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.782310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.783721 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.784351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx7tz\" (UniqueName: \"kubernetes.io/projected/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-kube-api-access-dx7tz\") pod \"glance-default-internal-api-0\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.791329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:19 crc kubenswrapper[4707]: I0121 15:43:19.901101 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.292455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:43:20 crc kubenswrapper[4707]: W0121 15:43:20.296409 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b894eb_fa59_4d67_aeb6_46c945fe6dd5.slice/crio-309684819cc803f636817c554c8fd1fd529045c53dc1f1f1a47fa500d987d3a7 WatchSource:0}: Error finding container 309684819cc803f636817c554c8fd1fd529045c53dc1f1f1a47fa500d987d3a7: Status 404 returned error can't find the container with id 309684819cc803f636817c554c8fd1fd529045c53dc1f1f1a47fa500d987d3a7 Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.509173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"44b894eb-fa59-4d67-aeb6-46c945fe6dd5","Type":"ContainerStarted","Data":"309684819cc803f636817c554c8fd1fd529045c53dc1f1f1a47fa500d987d3a7"} Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.807787 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvjgn\" (UniqueName: \"kubernetes.io/projected/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-kube-api-access-tvjgn\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-scripts\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-run-httpd\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-log-httpd\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-ceilometer-tls-certs\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-config-data\") pod 
\"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-combined-ca-bundle\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-sg-core-conf-yaml\") pod \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\" (UID: \"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47\") " Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.895892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.898980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-scripts" (OuterVolumeSpecName: "scripts") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.899426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.900490 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-kube-api-access-tvjgn" (OuterVolumeSpecName: "kube-api-access-tvjgn") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "kube-api-access-tvjgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.918389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.943410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.967349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:20 crc kubenswrapper[4707]: I0121 15:43:20.991914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-config-data" (OuterVolumeSpecName: "config-data") pod "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" (UID: "bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003368 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvjgn\" (UniqueName: \"kubernetes.io/projected/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-kube-api-access-tvjgn\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003400 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003411 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003421 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003428 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003438 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003449 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.003457 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.191977 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f762786d-4f31-412e-a685-b196f3cf3fea" path="/var/lib/kubelet/pods/f762786d-4f31-412e-a685-b196f3cf3fea/volumes" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.519916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"44b894eb-fa59-4d67-aeb6-46c945fe6dd5","Type":"ContainerStarted","Data":"91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b"} Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.519953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"44b894eb-fa59-4d67-aeb6-46c945fe6dd5","Type":"ContainerStarted","Data":"bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54"} Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.522837 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerID="1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191" exitCode=0 Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.522901 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.522898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerDied","Data":"1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191"} Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.523115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47","Type":"ContainerDied","Data":"c80fbe91a6f12848293d6945985d07cd2b1d6e16b8ec7065797c671eab0151f8"} Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.523155 4707 scope.go:117] "RemoveContainer" containerID="10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.540221 4707 scope.go:117] "RemoveContainer" containerID="cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.542987 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.542977771 podStartE2EDuration="2.542977771s" podCreationTimestamp="2026-01-21 15:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:21.536087847 +0000 UTC m=+2498.717604069" watchObservedRunningTime="2026-01-21 15:43:21.542977771 +0000 UTC m=+2498.724493993" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.557611 4707 scope.go:117] "RemoveContainer" containerID="85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.563598 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.569033 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.582373 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.584131 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-notification-agent" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.584634 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" 
containerName="ceilometer-notification-agent" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.584718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="sg-core" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.584726 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="sg-core" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.584786 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-central-agent" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.584793 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-central-agent" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.584845 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="proxy-httpd" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.584851 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="proxy-httpd" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.585094 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="sg-core" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.585110 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-central-agent" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.585120 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="proxy-httpd" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.585129 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" containerName="ceilometer-notification-agent" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.587325 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.590239 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.590366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.590402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.590416 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.620126 4707 scope.go:117] "RemoveContainer" containerID="1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.637084 4707 scope.go:117] "RemoveContainer" containerID="10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.637478 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55\": container with ID starting with 10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55 not found: ID does not exist" containerID="10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.637601 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55"} err="failed to get container status \"10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55\": rpc error: code = NotFound desc = could not find container \"10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55\": container with ID starting with 10a2e9bf33a815ebf23428bd9c1a5763f8fadc1af2d39c165727b7c750c83c55 not found: ID does not exist" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.637696 4707 scope.go:117] "RemoveContainer" containerID="cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.638188 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4\": container with ID starting with cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4 not found: ID does not exist" containerID="cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.638233 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4"} err="failed to get container status \"cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4\": rpc error: code = NotFound desc = could not find container \"cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4\": container with ID starting with cf6dd1c6698f624b27d077d7ab321b9ec71249a07940bd7613242e886a767bd4 not found: ID does not exist" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.638266 4707 scope.go:117] "RemoveContainer" 
containerID="85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.638574 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39\": container with ID starting with 85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39 not found: ID does not exist" containerID="85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.638597 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39"} err="failed to get container status \"85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39\": rpc error: code = NotFound desc = could not find container \"85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39\": container with ID starting with 85b7465cfd025188b43f3e20f25f7a9639be55ed3b809fce90dcb4116e9fcc39 not found: ID does not exist" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.638609 4707 scope.go:117] "RemoveContainer" containerID="1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191" Jan 21 15:43:21 crc kubenswrapper[4707]: E0121 15:43:21.639035 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191\": container with ID starting with 1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191 not found: ID does not exist" containerID="1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.639174 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191"} err="failed to get container status \"1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191\": rpc error: code = NotFound desc = could not find container \"1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191\": container with ID starting with 1bce877f2a1c57db69f7ed54a088788ec274d7fe88fcbd9a460e8a6747b01191 not found: ID does not exist" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-run-httpd\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-config-data\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-scripts\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-log-httpd\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb8gh\" (UniqueName: \"kubernetes.io/projected/9aa155f7-28bd-4346-8d17-9e176f05f924-kube-api-access-gb8gh\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.714600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.815949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-run-httpd\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-config-data\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-scripts\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-log-httpd\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.816276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb8gh\" (UniqueName: \"kubernetes.io/projected/9aa155f7-28bd-4346-8d17-9e176f05f924-kube-api-access-gb8gh\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.817757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-log-httpd\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.818697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-run-httpd\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.819972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.820008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.820101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-scripts\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.820463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.820871 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-config-data\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.831040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb8gh\" (UniqueName: \"kubernetes.io/projected/9aa155f7-28bd-4346-8d17-9e176f05f924-kube-api-access-gb8gh\") pod \"ceilometer-0\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:21 crc kubenswrapper[4707]: I0121 15:43:21.922793 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:22 crc kubenswrapper[4707]: I0121 15:43:22.317138 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:43:22 crc kubenswrapper[4707]: W0121 15:43:22.319921 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa155f7_28bd_4346_8d17_9e176f05f924.slice/crio-e71e18176ab1d4c34179455a844fe6a6b3fe017ac550daa9b34c15c80803120e WatchSource:0}: Error finding container e71e18176ab1d4c34179455a844fe6a6b3fe017ac550daa9b34c15c80803120e: Status 404 returned error can't find the container with id e71e18176ab1d4c34179455a844fe6a6b3fe017ac550daa9b34c15c80803120e Jan 21 15:43:22 crc kubenswrapper[4707]: I0121 15:43:22.533008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerStarted","Data":"e71e18176ab1d4c34179455a844fe6a6b3fe017ac550daa9b34c15c80803120e"} Jan 21 15:43:23 crc kubenswrapper[4707]: I0121 15:43:23.189953 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47" path="/var/lib/kubelet/pods/bf1f5e4d-8b60-4d29-bf84-5b5aa7ea0e47/volumes" Jan 21 15:43:23 crc kubenswrapper[4707]: I0121 15:43:23.541436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerStarted","Data":"3ca43973eeb47657c74fbd63e26851952964f1788418d1c9a8e7b0423389eab1"} Jan 21 15:43:24 crc kubenswrapper[4707]: I0121 15:43:24.549525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerStarted","Data":"af7fa61ee4bb60ee7836162a149bd8b4c83a18b47857fed88839b335236ad6c7"} Jan 21 15:43:24 crc kubenswrapper[4707]: I0121 15:43:24.549760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerStarted","Data":"e1017ca648f0834dc9b8e2d10be743e2a03e7d85e0a914cdfee158561b1c6f3d"} Jan 21 15:43:24 crc kubenswrapper[4707]: I0121 15:43:24.956072 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-ktf95"] Jan 21 15:43:24 crc kubenswrapper[4707]: I0121 15:43:24.957284 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:24 crc kubenswrapper[4707]: I0121 15:43:24.999246 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-ktf95"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.058553 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-2n6bd"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.059684 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.064800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-2n6bd"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.066651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a54c41-56b7-4424-8b05-a919cb7ce85b-operator-scripts\") pod \"nova-api-db-create-ktf95\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.066803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd5qj\" (UniqueName: \"kubernetes.io/projected/68a54c41-56b7-4424-8b05-a919cb7ce85b-kube-api-access-gd5qj\") pod \"nova-api-db-create-ktf95\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.162607 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.163428 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.165255 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.168334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtg9\" (UniqueName: \"kubernetes.io/projected/c8711b1e-6689-4b59-a84b-3149a910c125-kube-api-access-rrtg9\") pod \"nova-cell0-db-create-2n6bd\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.168535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8711b1e-6689-4b59-a84b-3149a910c125-operator-scripts\") pod \"nova-cell0-db-create-2n6bd\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.168640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a54c41-56b7-4424-8b05-a919cb7ce85b-operator-scripts\") pod \"nova-api-db-create-ktf95\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.168759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd5qj\" (UniqueName: \"kubernetes.io/projected/68a54c41-56b7-4424-8b05-a919cb7ce85b-kube-api-access-gd5qj\") pod \"nova-api-db-create-ktf95\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.169493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a54c41-56b7-4424-8b05-a919cb7ce85b-operator-scripts\") pod \"nova-api-db-create-ktf95\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.172458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.186278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd5qj\" (UniqueName: \"kubernetes.io/projected/68a54c41-56b7-4424-8b05-a919cb7ce85b-kube-api-access-gd5qj\") pod \"nova-api-db-create-ktf95\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.259922 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-hbd6h"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.261013 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.269490 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-hbd6h"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.269979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8711b1e-6689-4b59-a84b-3149a910c125-operator-scripts\") pod \"nova-cell0-db-create-2n6bd\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.270020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/9cc81354-6161-40a7-b62c-1ba1fbec6b89-kube-api-access-j2b7z\") pod \"nova-api-b79e-account-create-update-v8mn6\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.270396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtg9\" (UniqueName: \"kubernetes.io/projected/c8711b1e-6689-4b59-a84b-3149a910c125-kube-api-access-rrtg9\") pod \"nova-cell0-db-create-2n6bd\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.270452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cc81354-6161-40a7-b62c-1ba1fbec6b89-operator-scripts\") pod \"nova-api-b79e-account-create-update-v8mn6\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.270606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8711b1e-6689-4b59-a84b-3149a910c125-operator-scripts\") pod \"nova-cell0-db-create-2n6bd\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.272270 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.287142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtg9\" (UniqueName: \"kubernetes.io/projected/c8711b1e-6689-4b59-a84b-3149a910c125-kube-api-access-rrtg9\") pod \"nova-cell0-db-create-2n6bd\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.371858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7981604f-86e8-40dc-82c5-22f4e786cd72-operator-scripts\") pod \"nova-cell1-db-create-hbd6h\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.372227 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.372978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cc81354-6161-40a7-b62c-1ba1fbec6b89-operator-scripts\") pod \"nova-api-b79e-account-create-update-v8mn6\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.373053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlc2d\" (UniqueName: \"kubernetes.io/projected/7981604f-86e8-40dc-82c5-22f4e786cd72-kube-api-access-zlc2d\") pod \"nova-cell1-db-create-hbd6h\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.373088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/9cc81354-6161-40a7-b62c-1ba1fbec6b89-kube-api-access-j2b7z\") pod \"nova-api-b79e-account-create-update-v8mn6\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.374187 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cc81354-6161-40a7-b62c-1ba1fbec6b89-operator-scripts\") pod \"nova-api-b79e-account-create-update-v8mn6\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.389026 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.390195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.392494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/9cc81354-6161-40a7-b62c-1ba1fbec6b89-kube-api-access-j2b7z\") pod \"nova-api-b79e-account-create-update-v8mn6\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.393647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.395715 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.476594 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.477390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee595bc-dc44-4195-bbed-7e30c29ddabd-operator-scripts\") pod \"nova-cell0-dd06-account-create-update-ztjcn\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.477443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlc2d\" (UniqueName: \"kubernetes.io/projected/7981604f-86e8-40dc-82c5-22f4e786cd72-kube-api-access-zlc2d\") pod \"nova-cell1-db-create-hbd6h\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.477501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r24f\" (UniqueName: \"kubernetes.io/projected/6ee595bc-dc44-4195-bbed-7e30c29ddabd-kube-api-access-6r24f\") pod \"nova-cell0-dd06-account-create-update-ztjcn\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.477578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7981604f-86e8-40dc-82c5-22f4e786cd72-operator-scripts\") pod \"nova-cell1-db-create-hbd6h\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.478210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7981604f-86e8-40dc-82c5-22f4e786cd72-operator-scripts\") pod \"nova-cell1-db-create-hbd6h\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.497340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlc2d\" (UniqueName: \"kubernetes.io/projected/7981604f-86e8-40dc-82c5-22f4e786cd72-kube-api-access-zlc2d\") pod \"nova-cell1-db-create-hbd6h\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.579661 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.580133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee595bc-dc44-4195-bbed-7e30c29ddabd-operator-scripts\") pod \"nova-cell0-dd06-account-create-update-ztjcn\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.580246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r24f\" (UniqueName: \"kubernetes.io/projected/6ee595bc-dc44-4195-bbed-7e30c29ddabd-kube-api-access-6r24f\") pod \"nova-cell0-dd06-account-create-update-ztjcn\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.581283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee595bc-dc44-4195-bbed-7e30c29ddabd-operator-scripts\") pod \"nova-cell0-dd06-account-create-update-ztjcn\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.592369 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.593562 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.601028 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.602173 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.606557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r24f\" (UniqueName: \"kubernetes.io/projected/6ee595bc-dc44-4195-bbed-7e30c29ddabd-kube-api-access-6r24f\") pod \"nova-cell0-dd06-account-create-update-ztjcn\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.684045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rmh\" (UniqueName: \"kubernetes.io/projected/be2bb610-3b7b-4815-aefc-fed4415542e1-kube-api-access-28rmh\") pod \"nova-cell1-d92f-account-create-update-vw9xc\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.684294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be2bb610-3b7b-4815-aefc-fed4415542e1-operator-scripts\") pod \"nova-cell1-d92f-account-create-update-vw9xc\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: 
I0121 15:43:25.694213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-ktf95"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.780363 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.788226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rmh\" (UniqueName: \"kubernetes.io/projected/be2bb610-3b7b-4815-aefc-fed4415542e1-kube-api-access-28rmh\") pod \"nova-cell1-d92f-account-create-update-vw9xc\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.788775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be2bb610-3b7b-4815-aefc-fed4415542e1-operator-scripts\") pod \"nova-cell1-d92f-account-create-update-vw9xc\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.789686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be2bb610-3b7b-4815-aefc-fed4415542e1-operator-scripts\") pod \"nova-cell1-d92f-account-create-update-vw9xc\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.810194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rmh\" (UniqueName: \"kubernetes.io/projected/be2bb610-3b7b-4815-aefc-fed4415542e1-kube-api-access-28rmh\") pod \"nova-cell1-d92f-account-create-update-vw9xc\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.824602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-2n6bd"] Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.916685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:25 crc kubenswrapper[4707]: I0121 15:43:25.932403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6"] Jan 21 15:43:25 crc kubenswrapper[4707]: W0121 15:43:25.932838 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cc81354_6161_40a7_b62c_1ba1fbec6b89.slice/crio-19582491b57dd162c953899e6928fecfedc2f547c20386aedae74a305f989c64 WatchSource:0}: Error finding container 19582491b57dd162c953899e6928fecfedc2f547c20386aedae74a305f989c64: Status 404 returned error can't find the container with id 19582491b57dd162c953899e6928fecfedc2f547c20386aedae74a305f989c64 Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.046710 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-hbd6h"] Jan 21 15:43:26 crc kubenswrapper[4707]: W0121 15:43:26.084852 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7981604f_86e8_40dc_82c5_22f4e786cd72.slice/crio-87363ae5b4417de928b46e51ab9dbb4f2b3dce3934cf3fda0e98750b4f31911b WatchSource:0}: Error finding container 87363ae5b4417de928b46e51ab9dbb4f2b3dce3934cf3fda0e98750b4f31911b: Status 404 returned error can't find the container with id 87363ae5b4417de928b46e51ab9dbb4f2b3dce3934cf3fda0e98750b4f31911b Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.219756 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn"] Jan 21 15:43:26 crc kubenswrapper[4707]: W0121 15:43:26.223535 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ee595bc_dc44_4195_bbed_7e30c29ddabd.slice/crio-1075948ca7d105e60c09d051555b66d033322afd8b23c677104f83cae2809d52 WatchSource:0}: Error finding container 1075948ca7d105e60c09d051555b66d033322afd8b23c677104f83cae2809d52: Status 404 returned error can't find the container with id 1075948ca7d105e60c09d051555b66d033322afd8b23c677104f83cae2809d52 Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.332736 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc"] Jan 21 15:43:26 crc kubenswrapper[4707]: W0121 15:43:26.374323 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe2bb610_3b7b_4815_aefc_fed4415542e1.slice/crio-4c4ecaf3e203a69923bd52db8977ae6a3dfe76feb6f7542cad496c66351a2c8f WatchSource:0}: Error finding container 4c4ecaf3e203a69923bd52db8977ae6a3dfe76feb6f7542cad496c66351a2c8f: Status 404 returned error can't find the container with id 4c4ecaf3e203a69923bd52db8977ae6a3dfe76feb6f7542cad496c66351a2c8f Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.576440 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cc81354-6161-40a7-b62c-1ba1fbec6b89" containerID="c803576a99768313bb7b6d53e8da4250a1defe738422442774fb42f73b7b74f3" exitCode=0 Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.576511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" 
event={"ID":"9cc81354-6161-40a7-b62c-1ba1fbec6b89","Type":"ContainerDied","Data":"c803576a99768313bb7b6d53e8da4250a1defe738422442774fb42f73b7b74f3"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.576574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" event={"ID":"9cc81354-6161-40a7-b62c-1ba1fbec6b89","Type":"ContainerStarted","Data":"19582491b57dd162c953899e6928fecfedc2f547c20386aedae74a305f989c64"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.577669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" event={"ID":"be2bb610-3b7b-4815-aefc-fed4415542e1","Type":"ContainerStarted","Data":"07b5614e155cdb2282d53bbf3fcbd961a77825fed72469a3a578360d725938b4"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.577712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" event={"ID":"be2bb610-3b7b-4815-aefc-fed4415542e1","Type":"ContainerStarted","Data":"4c4ecaf3e203a69923bd52db8977ae6a3dfe76feb6f7542cad496c66351a2c8f"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.579558 4707 generic.go:334] "Generic (PLEG): container finished" podID="7981604f-86e8-40dc-82c5-22f4e786cd72" containerID="3104fc64e504a3e2c5c25aace72bc0bd76abcd55d1b7d82da6696a3fb2429700" exitCode=0 Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.579607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" event={"ID":"7981604f-86e8-40dc-82c5-22f4e786cd72","Type":"ContainerDied","Data":"3104fc64e504a3e2c5c25aace72bc0bd76abcd55d1b7d82da6696a3fb2429700"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.579625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" event={"ID":"7981604f-86e8-40dc-82c5-22f4e786cd72","Type":"ContainerStarted","Data":"87363ae5b4417de928b46e51ab9dbb4f2b3dce3934cf3fda0e98750b4f31911b"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.580878 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8711b1e-6689-4b59-a84b-3149a910c125" containerID="a2f607d72bb9c2553f08bd9c64844ccce10a089366ffc54a7691ffae13e19329" exitCode=0 Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.580923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" event={"ID":"c8711b1e-6689-4b59-a84b-3149a910c125","Type":"ContainerDied","Data":"a2f607d72bb9c2553f08bd9c64844ccce10a089366ffc54a7691ffae13e19329"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.580941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" event={"ID":"c8711b1e-6689-4b59-a84b-3149a910c125","Type":"ContainerStarted","Data":"4b5248228fe3ba35a3d74f834956e36b625117fcb2698e984eae2abf895db0fe"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.582676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerStarted","Data":"1f73b761d24b8bd5767620d0a64fc846049e6a99669fc0ba4aa19f78bbf5fa01"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.583290 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.584378 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" event={"ID":"6ee595bc-dc44-4195-bbed-7e30c29ddabd","Type":"ContainerStarted","Data":"11bf510afd6ac0fadc0815a9e26b92b15d26b3a24564ea8219a3f2dfb42f73fb"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.584403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" event={"ID":"6ee595bc-dc44-4195-bbed-7e30c29ddabd","Type":"ContainerStarted","Data":"1075948ca7d105e60c09d051555b66d033322afd8b23c677104f83cae2809d52"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.587071 4707 generic.go:334] "Generic (PLEG): container finished" podID="68a54c41-56b7-4424-8b05-a919cb7ce85b" containerID="8babc8e11e4d634f0f3b422641f30104a532577c7a3b1a584bb2b1242ecb0aa7" exitCode=0 Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.587099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" event={"ID":"68a54c41-56b7-4424-8b05-a919cb7ce85b","Type":"ContainerDied","Data":"8babc8e11e4d634f0f3b422641f30104a532577c7a3b1a584bb2b1242ecb0aa7"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.587113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" event={"ID":"68a54c41-56b7-4424-8b05-a919cb7ce85b","Type":"ContainerStarted","Data":"e239f3feeeb54d5765a9134fc012a951a9319c54a36b073ac6e4feb455445185"} Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.621998 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.488480954 podStartE2EDuration="5.62197904s" podCreationTimestamp="2026-01-21 15:43:21 +0000 UTC" firstStartedPulling="2026-01-21 15:43:22.322208775 +0000 UTC m=+2499.503724986" lastFinishedPulling="2026-01-21 15:43:25.45570685 +0000 UTC m=+2502.637223072" observedRunningTime="2026-01-21 15:43:26.61218669 +0000 UTC m=+2503.793702911" watchObservedRunningTime="2026-01-21 15:43:26.62197904 +0000 UTC m=+2503.803495262" Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.649794 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" podStartSLOduration=1.6497739280000001 podStartE2EDuration="1.649773928s" podCreationTimestamp="2026-01-21 15:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:26.646429019 +0000 UTC m=+2503.827945241" watchObservedRunningTime="2026-01-21 15:43:26.649773928 +0000 UTC m=+2503.831290151" Jan 21 15:43:26 crc kubenswrapper[4707]: I0121 15:43:26.668132 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" podStartSLOduration=1.668110873 podStartE2EDuration="1.668110873s" podCreationTimestamp="2026-01-21 15:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:26.665736678 +0000 UTC m=+2503.847252891" watchObservedRunningTime="2026-01-21 15:43:26.668110873 +0000 UTC m=+2503.849627094" Jan 21 15:43:27 crc kubenswrapper[4707]: I0121 15:43:27.595074 4707 generic.go:334] "Generic (PLEG): container finished" podID="6ee595bc-dc44-4195-bbed-7e30c29ddabd" 
containerID="11bf510afd6ac0fadc0815a9e26b92b15d26b3a24564ea8219a3f2dfb42f73fb" exitCode=0 Jan 21 15:43:27 crc kubenswrapper[4707]: I0121 15:43:27.595119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" event={"ID":"6ee595bc-dc44-4195-bbed-7e30c29ddabd","Type":"ContainerDied","Data":"11bf510afd6ac0fadc0815a9e26b92b15d26b3a24564ea8219a3f2dfb42f73fb"} Jan 21 15:43:27 crc kubenswrapper[4707]: I0121 15:43:27.596395 4707 generic.go:334] "Generic (PLEG): container finished" podID="be2bb610-3b7b-4815-aefc-fed4415542e1" containerID="07b5614e155cdb2282d53bbf3fcbd961a77825fed72469a3a578360d725938b4" exitCode=0 Jan 21 15:43:27 crc kubenswrapper[4707]: I0121 15:43:27.596455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" event={"ID":"be2bb610-3b7b-4815-aefc-fed4415542e1","Type":"ContainerDied","Data":"07b5614e155cdb2282d53bbf3fcbd961a77825fed72469a3a578360d725938b4"} Jan 21 15:43:27 crc kubenswrapper[4707]: I0121 15:43:27.985948 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.030358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8711b1e-6689-4b59-a84b-3149a910c125-operator-scripts\") pod \"c8711b1e-6689-4b59-a84b-3149a910c125\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.030547 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrtg9\" (UniqueName: \"kubernetes.io/projected/c8711b1e-6689-4b59-a84b-3149a910c125-kube-api-access-rrtg9\") pod \"c8711b1e-6689-4b59-a84b-3149a910c125\" (UID: \"c8711b1e-6689-4b59-a84b-3149a910c125\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.031246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8711b1e-6689-4b59-a84b-3149a910c125-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8711b1e-6689-4b59-a84b-3149a910c125" (UID: "c8711b1e-6689-4b59-a84b-3149a910c125"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.031444 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8711b1e-6689-4b59-a84b-3149a910c125-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.036937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8711b1e-6689-4b59-a84b-3149a910c125-kube-api-access-rrtg9" (OuterVolumeSpecName: "kube-api-access-rrtg9") pod "c8711b1e-6689-4b59-a84b-3149a910c125" (UID: "c8711b1e-6689-4b59-a84b-3149a910c125"). InnerVolumeSpecName "kube-api-access-rrtg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.134272 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrtg9\" (UniqueName: \"kubernetes.io/projected/c8711b1e-6689-4b59-a84b-3149a910c125-kube-api-access-rrtg9\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.143897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.143950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.162659 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.167953 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.172656 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.172849 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.181468 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.236377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd5qj\" (UniqueName: \"kubernetes.io/projected/68a54c41-56b7-4424-8b05-a919cb7ce85b-kube-api-access-gd5qj\") pod \"68a54c41-56b7-4424-8b05-a919cb7ce85b\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.236424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/9cc81354-6161-40a7-b62c-1ba1fbec6b89-kube-api-access-j2b7z\") pod \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.236486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7981604f-86e8-40dc-82c5-22f4e786cd72-operator-scripts\") pod \"7981604f-86e8-40dc-82c5-22f4e786cd72\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.236544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlc2d\" (UniqueName: \"kubernetes.io/projected/7981604f-86e8-40dc-82c5-22f4e786cd72-kube-api-access-zlc2d\") pod \"7981604f-86e8-40dc-82c5-22f4e786cd72\" (UID: \"7981604f-86e8-40dc-82c5-22f4e786cd72\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.236657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cc81354-6161-40a7-b62c-1ba1fbec6b89-operator-scripts\") pod \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\" (UID: \"9cc81354-6161-40a7-b62c-1ba1fbec6b89\") " 
Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.236693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a54c41-56b7-4424-8b05-a919cb7ce85b-operator-scripts\") pod \"68a54c41-56b7-4424-8b05-a919cb7ce85b\" (UID: \"68a54c41-56b7-4424-8b05-a919cb7ce85b\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.237196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7981604f-86e8-40dc-82c5-22f4e786cd72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7981604f-86e8-40dc-82c5-22f4e786cd72" (UID: "7981604f-86e8-40dc-82c5-22f4e786cd72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.237308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a54c41-56b7-4424-8b05-a919cb7ce85b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68a54c41-56b7-4424-8b05-a919cb7ce85b" (UID: "68a54c41-56b7-4424-8b05-a919cb7ce85b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.237653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc81354-6161-40a7-b62c-1ba1fbec6b89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cc81354-6161-40a7-b62c-1ba1fbec6b89" (UID: "9cc81354-6161-40a7-b62c-1ba1fbec6b89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.239952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc81354-6161-40a7-b62c-1ba1fbec6b89-kube-api-access-j2b7z" (OuterVolumeSpecName: "kube-api-access-j2b7z") pod "9cc81354-6161-40a7-b62c-1ba1fbec6b89" (UID: "9cc81354-6161-40a7-b62c-1ba1fbec6b89"). InnerVolumeSpecName "kube-api-access-j2b7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.240367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a54c41-56b7-4424-8b05-a919cb7ce85b-kube-api-access-gd5qj" (OuterVolumeSpecName: "kube-api-access-gd5qj") pod "68a54c41-56b7-4424-8b05-a919cb7ce85b" (UID: "68a54c41-56b7-4424-8b05-a919cb7ce85b"). InnerVolumeSpecName "kube-api-access-gd5qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.241622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7981604f-86e8-40dc-82c5-22f4e786cd72-kube-api-access-zlc2d" (OuterVolumeSpecName: "kube-api-access-zlc2d") pod "7981604f-86e8-40dc-82c5-22f4e786cd72" (UID: "7981604f-86e8-40dc-82c5-22f4e786cd72"). InnerVolumeSpecName "kube-api-access-zlc2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.354451 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cc81354-6161-40a7-b62c-1ba1fbec6b89-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.354488 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a54c41-56b7-4424-8b05-a919cb7ce85b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.354507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd5qj\" (UniqueName: \"kubernetes.io/projected/68a54c41-56b7-4424-8b05-a919cb7ce85b-kube-api-access-gd5qj\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.354518 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/9cc81354-6161-40a7-b62c-1ba1fbec6b89-kube-api-access-j2b7z\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.354527 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7981604f-86e8-40dc-82c5-22f4e786cd72-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.354535 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlc2d\" (UniqueName: \"kubernetes.io/projected/7981604f-86e8-40dc-82c5-22f4e786cd72-kube-api-access-zlc2d\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.603739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" event={"ID":"68a54c41-56b7-4424-8b05-a919cb7ce85b","Type":"ContainerDied","Data":"e239f3feeeb54d5765a9134fc012a951a9319c54a36b073ac6e4feb455445185"} Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.604058 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e239f3feeeb54d5765a9134fc012a951a9319c54a36b073ac6e4feb455445185" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.604127 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-ktf95" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.609840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" event={"ID":"9cc81354-6161-40a7-b62c-1ba1fbec6b89","Type":"ContainerDied","Data":"19582491b57dd162c953899e6928fecfedc2f547c20386aedae74a305f989c64"} Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.609868 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.609873 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19582491b57dd162c953899e6928fecfedc2f547c20386aedae74a305f989c64" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.611008 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.611006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-hbd6h" event={"ID":"7981604f-86e8-40dc-82c5-22f4e786cd72","Type":"ContainerDied","Data":"87363ae5b4417de928b46e51ab9dbb4f2b3dce3934cf3fda0e98750b4f31911b"} Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.611129 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87363ae5b4417de928b46e51ab9dbb4f2b3dce3934cf3fda0e98750b4f31911b" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.613150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.613153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-2n6bd" event={"ID":"c8711b1e-6689-4b59-a84b-3149a910c125","Type":"ContainerDied","Data":"4b5248228fe3ba35a3d74f834956e36b625117fcb2698e984eae2abf895db0fe"} Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.613183 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b5248228fe3ba35a3d74f834956e36b625117fcb2698e984eae2abf895db0fe" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.614209 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.614228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.867882 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.902449 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.980106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r24f\" (UniqueName: \"kubernetes.io/projected/6ee595bc-dc44-4195-bbed-7e30c29ddabd-kube-api-access-6r24f\") pod \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.980208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be2bb610-3b7b-4815-aefc-fed4415542e1-operator-scripts\") pod \"be2bb610-3b7b-4815-aefc-fed4415542e1\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.980453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28rmh\" (UniqueName: \"kubernetes.io/projected/be2bb610-3b7b-4815-aefc-fed4415542e1-kube-api-access-28rmh\") pod \"be2bb610-3b7b-4815-aefc-fed4415542e1\" (UID: \"be2bb610-3b7b-4815-aefc-fed4415542e1\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.980510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee595bc-dc44-4195-bbed-7e30c29ddabd-operator-scripts\") pod \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\" (UID: \"6ee595bc-dc44-4195-bbed-7e30c29ddabd\") " Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.980715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be2bb610-3b7b-4815-aefc-fed4415542e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be2bb610-3b7b-4815-aefc-fed4415542e1" (UID: "be2bb610-3b7b-4815-aefc-fed4415542e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.980987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee595bc-dc44-4195-bbed-7e30c29ddabd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ee595bc-dc44-4195-bbed-7e30c29ddabd" (UID: "6ee595bc-dc44-4195-bbed-7e30c29ddabd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.981723 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be2bb610-3b7b-4815-aefc-fed4415542e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.981748 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee595bc-dc44-4195-bbed-7e30c29ddabd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.984284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2bb610-3b7b-4815-aefc-fed4415542e1-kube-api-access-28rmh" (OuterVolumeSpecName: "kube-api-access-28rmh") pod "be2bb610-3b7b-4815-aefc-fed4415542e1" (UID: "be2bb610-3b7b-4815-aefc-fed4415542e1"). InnerVolumeSpecName "kube-api-access-28rmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:28 crc kubenswrapper[4707]: I0121 15:43:28.985715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee595bc-dc44-4195-bbed-7e30c29ddabd-kube-api-access-6r24f" (OuterVolumeSpecName: "kube-api-access-6r24f") pod "6ee595bc-dc44-4195-bbed-7e30c29ddabd" (UID: "6ee595bc-dc44-4195-bbed-7e30c29ddabd"). InnerVolumeSpecName "kube-api-access-6r24f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.083187 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r24f\" (UniqueName: \"kubernetes.io/projected/6ee595bc-dc44-4195-bbed-7e30c29ddabd-kube-api-access-6r24f\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.083216 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28rmh\" (UniqueName: \"kubernetes.io/projected/be2bb610-3b7b-4815-aefc-fed4415542e1-kube-api-access-28rmh\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.183709 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:43:29 crc kubenswrapper[4707]: E0121 15:43:29.184152 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.622534 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.622550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn" event={"ID":"6ee595bc-dc44-4195-bbed-7e30c29ddabd","Type":"ContainerDied","Data":"1075948ca7d105e60c09d051555b66d033322afd8b23c677104f83cae2809d52"} Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.623047 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1075948ca7d105e60c09d051555b66d033322afd8b23c677104f83cae2809d52" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.624655 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.625350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc" event={"ID":"be2bb610-3b7b-4815-aefc-fed4415542e1","Type":"ContainerDied","Data":"4c4ecaf3e203a69923bd52db8977ae6a3dfe76feb6f7542cad496c66351a2c8f"} Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.625375 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c4ecaf3e203a69923bd52db8977ae6a3dfe76feb6f7542cad496c66351a2c8f" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.901499 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.901544 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.927524 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:29 crc kubenswrapper[4707]: I0121 15:43:29.937371 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.225170 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.225853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547060 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s"] Jan 21 15:43:30 crc kubenswrapper[4707]: E0121 15:43:30.547572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7981604f-86e8-40dc-82c5-22f4e786cd72" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7981604f-86e8-40dc-82c5-22f4e786cd72" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: E0121 15:43:30.547615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a54c41-56b7-4424-8b05-a919cb7ce85b" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a54c41-56b7-4424-8b05-a919cb7ce85b" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: E0121 15:43:30.547632 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc81354-6161-40a7-b62c-1ba1fbec6b89" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547637 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc81354-6161-40a7-b62c-1ba1fbec6b89" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: E0121 15:43:30.547648 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2bb610-3b7b-4815-aefc-fed4415542e1" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547654 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be2bb610-3b7b-4815-aefc-fed4415542e1" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: E0121 15:43:30.547667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8711b1e-6689-4b59-a84b-3149a910c125" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547673 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8711b1e-6689-4b59-a84b-3149a910c125" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: E0121 15:43:30.547682 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee595bc-dc44-4195-bbed-7e30c29ddabd" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547688 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee595bc-dc44-4195-bbed-7e30c29ddabd" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a54c41-56b7-4424-8b05-a919cb7ce85b" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547891 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc81354-6161-40a7-b62c-1ba1fbec6b89" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547900 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8711b1e-6689-4b59-a84b-3149a910c125" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547910 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2bb610-3b7b-4815-aefc-fed4415542e1" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547918 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee595bc-dc44-4195-bbed-7e30c29ddabd" containerName="mariadb-account-create-update" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.547924 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7981604f-86e8-40dc-82c5-22f4e786cd72" containerName="mariadb-database-create" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.548479 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.549687 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-j66wd" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.551276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.556175 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s"] Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.558721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.610911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4pz\" (UniqueName: \"kubernetes.io/projected/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-kube-api-access-kv4pz\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.610947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.610998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-scripts\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.611213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-config-data\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.631737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.631781 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.713524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-config-data\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.713735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4pz\" (UniqueName: 
\"kubernetes.io/projected/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-kube-api-access-kv4pz\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.713855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.713978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-scripts\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.718280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-scripts\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.718332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-config-data\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.718556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.725904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4pz\" (UniqueName: \"kubernetes.io/projected/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-kube-api-access-kv4pz\") pod \"nova-cell0-conductor-db-sync-l2d4s\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:30 crc kubenswrapper[4707]: I0121 15:43:30.863029 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:31 crc kubenswrapper[4707]: I0121 15:43:31.266228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s"] Jan 21 15:43:31 crc kubenswrapper[4707]: I0121 15:43:31.641768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" event={"ID":"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49","Type":"ContainerStarted","Data":"2c5a6ecb173c9be1760e62fc79bbb95a3cb76bfe14e017cc957f577ef64e5518"} Jan 21 15:43:31 crc kubenswrapper[4707]: I0121 15:43:31.642028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" event={"ID":"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49","Type":"ContainerStarted","Data":"fbd500c8c6c2ebcb8dc697a3e1d4312d81b627b481a13af13694abcc6476cf85"} Jan 21 15:43:31 crc kubenswrapper[4707]: I0121 15:43:31.656683 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" podStartSLOduration=1.656671705 podStartE2EDuration="1.656671705s" podCreationTimestamp="2026-01-21 15:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:31.649839519 +0000 UTC m=+2508.831355731" watchObservedRunningTime="2026-01-21 15:43:31.656671705 +0000 UTC m=+2508.838187927" Jan 21 15:43:32 crc kubenswrapper[4707]: I0121 15:43:32.265465 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:32 crc kubenswrapper[4707]: I0121 15:43:32.267908 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:43:36 crc kubenswrapper[4707]: I0121 15:43:36.678450 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" containerID="2c5a6ecb173c9be1760e62fc79bbb95a3cb76bfe14e017cc957f577ef64e5518" exitCode=0 Jan 21 15:43:36 crc kubenswrapper[4707]: I0121 15:43:36.678670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" event={"ID":"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49","Type":"ContainerDied","Data":"2c5a6ecb173c9be1760e62fc79bbb95a3cb76bfe14e017cc957f577ef64e5518"} Jan 21 15:43:37 crc kubenswrapper[4707]: I0121 15:43:37.952847 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.136745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-scripts\") pod \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.137399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv4pz\" (UniqueName: \"kubernetes.io/projected/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-kube-api-access-kv4pz\") pod \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.137472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-config-data\") pod \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.137506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-combined-ca-bundle\") pod \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\" (UID: \"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49\") " Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.141229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-scripts" (OuterVolumeSpecName: "scripts") pod "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" (UID: "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.141396 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-kube-api-access-kv4pz" (OuterVolumeSpecName: "kube-api-access-kv4pz") pod "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" (UID: "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49"). InnerVolumeSpecName "kube-api-access-kv4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.156851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-config-data" (OuterVolumeSpecName: "config-data") pod "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" (UID: "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.157291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" (UID: "fc87e83f-52c0-49a9-9f2d-0ae38afd1d49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.239218 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.239243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv4pz\" (UniqueName: \"kubernetes.io/projected/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-kube-api-access-kv4pz\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.239252 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.239272 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.694737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" event={"ID":"fc87e83f-52c0-49a9-9f2d-0ae38afd1d49","Type":"ContainerDied","Data":"fbd500c8c6c2ebcb8dc697a3e1d4312d81b627b481a13af13694abcc6476cf85"} Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.694975 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd500c8c6c2ebcb8dc697a3e1d4312d81b627b481a13af13694abcc6476cf85" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.694785 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.739756 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:43:38 crc kubenswrapper[4707]: E0121 15:43:38.740080 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" containerName="nova-cell0-conductor-db-sync" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.740096 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" containerName="nova-cell0-conductor-db-sync" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.740275 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" containerName="nova-cell0-conductor-db-sync" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.740760 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.745920 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-j66wd" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.746466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.746521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.746558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfhqm\" (UniqueName: \"kubernetes.io/projected/f9aba8f8-91db-426a-aa87-96064fe8974c-kube-api-access-mfhqm\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.746762 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.752612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.848034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfhqm\" (UniqueName: \"kubernetes.io/projected/f9aba8f8-91db-426a-aa87-96064fe8974c-kube-api-access-mfhqm\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.848352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.848470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.852144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.852338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:38 crc kubenswrapper[4707]: I0121 15:43:38.860881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfhqm\" (UniqueName: \"kubernetes.io/projected/f9aba8f8-91db-426a-aa87-96064fe8974c-kube-api-access-mfhqm\") pod \"nova-cell0-conductor-0\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:39 crc kubenswrapper[4707]: I0121 15:43:39.052298 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:39 crc kubenswrapper[4707]: I0121 15:43:39.603187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:43:39 crc kubenswrapper[4707]: W0121 15:43:39.604552 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9aba8f8_91db_426a_aa87_96064fe8974c.slice/crio-ebdbd211c858e4eb3888352c667f305ee2f70eec3c12055dbcbbc8efebd65a7b WatchSource:0}: Error finding container ebdbd211c858e4eb3888352c667f305ee2f70eec3c12055dbcbbc8efebd65a7b: Status 404 returned error can't find the container with id ebdbd211c858e4eb3888352c667f305ee2f70eec3c12055dbcbbc8efebd65a7b Jan 21 15:43:39 crc kubenswrapper[4707]: I0121 15:43:39.704009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f9aba8f8-91db-426a-aa87-96064fe8974c","Type":"ContainerStarted","Data":"ebdbd211c858e4eb3888352c667f305ee2f70eec3c12055dbcbbc8efebd65a7b"} Jan 21 15:43:40 crc kubenswrapper[4707]: I0121 15:43:40.182843 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:43:40 crc kubenswrapper[4707]: E0121 15:43:40.183066 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:43:40 crc kubenswrapper[4707]: I0121 15:43:40.711676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f9aba8f8-91db-426a-aa87-96064fe8974c","Type":"ContainerStarted","Data":"ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d"} Jan 21 15:43:40 crc kubenswrapper[4707]: I0121 15:43:40.712986 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:40 crc kubenswrapper[4707]: I0121 15:43:40.722978 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.722968215 podStartE2EDuration="2.722968215s" podCreationTimestamp="2026-01-21 15:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:40.722499764 +0000 UTC m=+2517.904015986" 
watchObservedRunningTime="2026-01-21 15:43:40.722968215 +0000 UTC m=+2517.904484437" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.072036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.463891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.464800 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.466134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.472913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.473991 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.554132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.554191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjk67\" (UniqueName: \"kubernetes.io/projected/d41685f3-ac56-4369-9528-04ded1edb06b-kube-api-access-zjk67\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.554290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-config-data\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.554388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-scripts\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.563181 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.564514 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.568041 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.574066 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.610834 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.612101 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.617723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.641921 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.643113 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.644921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.647339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.653863 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66d5\" (UniqueName: \"kubernetes.io/projected/ee8c7832-9011-4d40-852c-6085193b6a28-kube-api-access-t66d5\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-config-data\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8c7832-9011-4d40-852c-6085193b6a28-logs\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-logs\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjk67\" (UniqueName: \"kubernetes.io/projected/d41685f3-ac56-4369-9528-04ded1edb06b-kube-api-access-zjk67\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnb76\" (UniqueName: \"kubernetes.io/projected/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-kube-api-access-vnb76\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-config-data\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx97r\" (UniqueName: \"kubernetes.io/projected/67062e56-a299-4d72-a077-0cbb6b8feada-kube-api-access-tx97r\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-config-data\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.657826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-scripts\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.664043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-scripts\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.666322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.671322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-config-data\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.702053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjk67\" (UniqueName: \"kubernetes.io/projected/d41685f3-ac56-4369-9528-04ded1edb06b-kube-api-access-zjk67\") pod \"nova-cell0-cell-mapping-dsnpf\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.730631 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.731856 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.734408 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.737459 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxll\" (UniqueName: \"kubernetes.io/projected/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-kube-api-access-xlxll\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnb76\" (UniqueName: \"kubernetes.io/projected/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-kube-api-access-vnb76\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx97r\" (UniqueName: \"kubernetes.io/projected/67062e56-a299-4d72-a077-0cbb6b8feada-kube-api-access-tx97r\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-config-data\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-config-data\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc 
kubenswrapper[4707]: I0121 15:43:44.761660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66d5\" (UniqueName: \"kubernetes.io/projected/ee8c7832-9011-4d40-852c-6085193b6a28-kube-api-access-t66d5\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-config-data\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8c7832-9011-4d40-852c-6085193b6a28-logs\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-logs\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.761782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.766386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-config-data\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.766507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.768337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-config-data\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.768587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-logs\") pod \"nova-api-0\" (UID: 
\"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.768947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.769220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8c7832-9011-4d40-852c-6085193b6a28-logs\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.771176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.773546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.778886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnb76\" (UniqueName: \"kubernetes.io/projected/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-kube-api-access-vnb76\") pod \"nova-api-0\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.778894 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.791354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx97r\" (UniqueName: \"kubernetes.io/projected/67062e56-a299-4d72-a077-0cbb6b8feada-kube-api-access-tx97r\") pod \"nova-cell1-novncproxy-0\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.792427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66d5\" (UniqueName: \"kubernetes.io/projected/ee8c7832-9011-4d40-852c-6085193b6a28-kube-api-access-t66d5\") pod \"nova-metadata-0\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.863319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxll\" (UniqueName: \"kubernetes.io/projected/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-kube-api-access-xlxll\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.863388 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-config-data\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.863438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.866768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.870543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-config-data\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.877386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxll\" (UniqueName: \"kubernetes.io/projected/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-kube-api-access-xlxll\") pod \"nova-scheduler-0\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:44 crc kubenswrapper[4707]: I0121 15:43:44.880381 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:44.927645 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:44.958022 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.149607 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.330902 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm"] Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.332068 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.333505 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.333999 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.338572 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm"] Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.372835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mcsl\" (UniqueName: \"kubernetes.io/projected/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-kube-api-access-2mcsl\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.372897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.372930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-scripts\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.372995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-config-data\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.475095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-config-data\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 
15:43:45.475286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mcsl\" (UniqueName: \"kubernetes.io/projected/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-kube-api-access-2mcsl\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.475351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.475391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-scripts\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.483022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.483066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-config-data\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.483506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-scripts\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.497325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mcsl\" (UniqueName: \"kubernetes.io/projected/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-kube-api-access-2mcsl\") pod \"nova-cell1-conductor-db-sync-mj2nm\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.645391 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.772806 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:43:45 crc kubenswrapper[4707]: W0121 15:43:45.787326 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee8c7832_9011_4d40_852c_6085193b6a28.slice/crio-84945223e03f9463bf586d4a8b4e8997378f3d1a9bb3d947c7f86bd3f4e66e7a WatchSource:0}: Error finding container 84945223e03f9463bf586d4a8b4e8997378f3d1a9bb3d947c7f86bd3f4e66e7a: Status 404 returned error can't find the container with id 84945223e03f9463bf586d4a8b4e8997378f3d1a9bb3d947c7f86bd3f4e66e7a Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.793910 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf"] Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.808448 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.815176 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:45 crc kubenswrapper[4707]: I0121 15:43:45.825939 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.112117 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm"] Jan 21 15:43:46 crc kubenswrapper[4707]: W0121 15:43:46.123172 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb10a74d4_cf8e_4ccd_b07d_999cc270d6b8.slice/crio-2d6b99a939a6f37daf9b0a5aa0f6decff0b8321028a4a85ee0c46ff00b1f50ac WatchSource:0}: Error finding container 2d6b99a939a6f37daf9b0a5aa0f6decff0b8321028a4a85ee0c46ff00b1f50ac: Status 404 returned error can't find the container with id 2d6b99a939a6f37daf9b0a5aa0f6decff0b8321028a4a85ee0c46ff00b1f50ac Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.767134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" event={"ID":"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8","Type":"ContainerStarted","Data":"44aac26d47f17ab219b49fd4ef538fee4c84653e8d7ef18b9e7e52af53a0eb0d"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.768126 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" event={"ID":"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8","Type":"ContainerStarted","Data":"2d6b99a939a6f37daf9b0a5aa0f6decff0b8321028a4a85ee0c46ff00b1f50ac"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.768563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ee8c7832-9011-4d40-852c-6085193b6a28","Type":"ContainerStarted","Data":"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.768604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ee8c7832-9011-4d40-852c-6085193b6a28","Type":"ContainerStarted","Data":"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.768619 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ee8c7832-9011-4d40-852c-6085193b6a28","Type":"ContainerStarted","Data":"84945223e03f9463bf586d4a8b4e8997378f3d1a9bb3d947c7f86bd3f4e66e7a"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.769674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9237a5ab-d21a-428d-8fd9-a18dde1b9a35","Type":"ContainerStarted","Data":"573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.769700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9237a5ab-d21a-428d-8fd9-a18dde1b9a35","Type":"ContainerStarted","Data":"6387675d4e6ddc229f7ac3135a9fa8901ed853851c03e3a9926f4a743f5f705c"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.771282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"67062e56-a299-4d72-a077-0cbb6b8feada","Type":"ContainerStarted","Data":"9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.771305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"67062e56-a299-4d72-a077-0cbb6b8feada","Type":"ContainerStarted","Data":"40a3758761081897d0bca19fbfc42b48f3bcce5a9e024d83ce32f066316f88c4"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.772856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f","Type":"ContainerStarted","Data":"9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.772891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f","Type":"ContainerStarted","Data":"5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.772902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f","Type":"ContainerStarted","Data":"b872210ac0e5a469c8949f5d78f180e83b2ee57ea730df81f0b7fa7aae1578d4"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.774351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" event={"ID":"d41685f3-ac56-4369-9528-04ded1edb06b","Type":"ContainerStarted","Data":"b7c709a6cb6bf6144998677dd56db745001a849fd302b2bacb3b10742068e3cd"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.774374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" event={"ID":"d41685f3-ac56-4369-9528-04ded1edb06b","Type":"ContainerStarted","Data":"99e4d21a2d74106a3293953035f0f615625fe163f971ff6e83ed60f48d90d4ff"} Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.782625 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" podStartSLOduration=1.782612683 podStartE2EDuration="1.782612683s" podCreationTimestamp="2026-01-21 15:43:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 15:43:46.776388972 +0000 UTC m=+2523.957905194" watchObservedRunningTime="2026-01-21 15:43:46.782612683 +0000 UTC m=+2523.964128905" Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.792127 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.792104399 podStartE2EDuration="2.792104399s" podCreationTimestamp="2026-01-21 15:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:46.789936943 +0000 UTC m=+2523.971453165" watchObservedRunningTime="2026-01-21 15:43:46.792104399 +0000 UTC m=+2523.973620621" Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.807217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.807204047 podStartE2EDuration="2.807204047s" podCreationTimestamp="2026-01-21 15:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:46.805035992 +0000 UTC m=+2523.986552214" watchObservedRunningTime="2026-01-21 15:43:46.807204047 +0000 UTC m=+2523.988720269" Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.820047 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" podStartSLOduration=2.8200343 podStartE2EDuration="2.8200343s" podCreationTimestamp="2026-01-21 15:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:46.815955441 +0000 UTC m=+2523.997471663" watchObservedRunningTime="2026-01-21 15:43:46.8200343 +0000 UTC m=+2524.001550522" Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.847426 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.847406823 podStartE2EDuration="2.847406823s" podCreationTimestamp="2026-01-21 15:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:46.834116996 +0000 UTC m=+2524.015633218" watchObservedRunningTime="2026-01-21 15:43:46.847406823 +0000 UTC m=+2524.028923045" Jan 21 15:43:46 crc kubenswrapper[4707]: I0121 15:43:46.854663 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.854647466 podStartE2EDuration="2.854647466s" podCreationTimestamp="2026-01-21 15:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:46.852102621 +0000 UTC m=+2524.033618843" watchObservedRunningTime="2026-01-21 15:43:46.854647466 +0000 UTC m=+2524.036163688" Jan 21 15:43:47 crc kubenswrapper[4707]: I0121 15:43:47.131309 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:47 crc kubenswrapper[4707]: I0121 15:43:47.170111 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:48 crc kubenswrapper[4707]: I0121 15:43:48.790426 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" 
podUID="67062e56-a299-4d72-a077-0cbb6b8feada" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8" gracePeriod=30 Jan 21 15:43:48 crc kubenswrapper[4707]: I0121 15:43:48.790553 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-metadata" containerID="cri-o://0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed" gracePeriod=30 Jan 21 15:43:48 crc kubenswrapper[4707]: I0121 15:43:48.790544 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-log" containerID="cri-o://981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8" gracePeriod=30 Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.391498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.407690 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-combined-ca-bundle\") pod \"ee8c7832-9011-4d40-852c-6085193b6a28\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8c7832-9011-4d40-852c-6085193b6a28-logs\") pod \"ee8c7832-9011-4d40-852c-6085193b6a28\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-config-data\") pod \"67062e56-a299-4d72-a077-0cbb6b8feada\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-config-data\") pod \"ee8c7832-9011-4d40-852c-6085193b6a28\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-combined-ca-bundle\") pod \"67062e56-a299-4d72-a077-0cbb6b8feada\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx97r\" (UniqueName: \"kubernetes.io/projected/67062e56-a299-4d72-a077-0cbb6b8feada-kube-api-access-tx97r\") pod \"67062e56-a299-4d72-a077-0cbb6b8feada\" (UID: \"67062e56-a299-4d72-a077-0cbb6b8feada\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-t66d5\" (UniqueName: \"kubernetes.io/projected/ee8c7832-9011-4d40-852c-6085193b6a28-kube-api-access-t66d5\") pod \"ee8c7832-9011-4d40-852c-6085193b6a28\" (UID: \"ee8c7832-9011-4d40-852c-6085193b6a28\") " Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee8c7832-9011-4d40-852c-6085193b6a28-logs" (OuterVolumeSpecName: "logs") pod "ee8c7832-9011-4d40-852c-6085193b6a28" (UID: "ee8c7832-9011-4d40-852c-6085193b6a28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.449881 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8c7832-9011-4d40-852c-6085193b6a28-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.460140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8c7832-9011-4d40-852c-6085193b6a28-kube-api-access-t66d5" (OuterVolumeSpecName: "kube-api-access-t66d5") pod "ee8c7832-9011-4d40-852c-6085193b6a28" (UID: "ee8c7832-9011-4d40-852c-6085193b6a28"). InnerVolumeSpecName "kube-api-access-t66d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.462153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67062e56-a299-4d72-a077-0cbb6b8feada-kube-api-access-tx97r" (OuterVolumeSpecName: "kube-api-access-tx97r") pod "67062e56-a299-4d72-a077-0cbb6b8feada" (UID: "67062e56-a299-4d72-a077-0cbb6b8feada"). InnerVolumeSpecName "kube-api-access-tx97r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.473105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee8c7832-9011-4d40-852c-6085193b6a28" (UID: "ee8c7832-9011-4d40-852c-6085193b6a28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.473654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67062e56-a299-4d72-a077-0cbb6b8feada" (UID: "67062e56-a299-4d72-a077-0cbb6b8feada"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.478292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-config-data" (OuterVolumeSpecName: "config-data") pod "ee8c7832-9011-4d40-852c-6085193b6a28" (UID: "ee8c7832-9011-4d40-852c-6085193b6a28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.482084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-config-data" (OuterVolumeSpecName: "config-data") pod "67062e56-a299-4d72-a077-0cbb6b8feada" (UID: "67062e56-a299-4d72-a077-0cbb6b8feada"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.550751 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.550779 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.550787 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c7832-9011-4d40-852c-6085193b6a28-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.550798 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67062e56-a299-4d72-a077-0cbb6b8feada-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.550824 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx97r\" (UniqueName: \"kubernetes.io/projected/67062e56-a299-4d72-a077-0cbb6b8feada-kube-api-access-tx97r\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.550835 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66d5\" (UniqueName: \"kubernetes.io/projected/ee8c7832-9011-4d40-852c-6085193b6a28-kube-api-access-t66d5\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.797863 4707 generic.go:334] "Generic (PLEG): container finished" podID="67062e56-a299-4d72-a077-0cbb6b8feada" containerID="9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8" exitCode=0 Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.797927 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"67062e56-a299-4d72-a077-0cbb6b8feada","Type":"ContainerDied","Data":"9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8"} Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.797966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"67062e56-a299-4d72-a077-0cbb6b8feada","Type":"ContainerDied","Data":"40a3758761081897d0bca19fbfc42b48f3bcce5a9e024d83ce32f066316f88c4"} Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.797986 4707 scope.go:117] "RemoveContainer" containerID="9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.798086 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.802752 4707 generic.go:334] "Generic (PLEG): container finished" podID="b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" containerID="44aac26d47f17ab219b49fd4ef538fee4c84653e8d7ef18b9e7e52af53a0eb0d" exitCode=0 Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.802848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" event={"ID":"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8","Type":"ContainerDied","Data":"44aac26d47f17ab219b49fd4ef538fee4c84653e8d7ef18b9e7e52af53a0eb0d"} Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.805061 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee8c7832-9011-4d40-852c-6085193b6a28" containerID="0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed" exitCode=0 Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.805090 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee8c7832-9011-4d40-852c-6085193b6a28" containerID="981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8" exitCode=143 Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.805118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ee8c7832-9011-4d40-852c-6085193b6a28","Type":"ContainerDied","Data":"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed"} Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.805144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ee8c7832-9011-4d40-852c-6085193b6a28","Type":"ContainerDied","Data":"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8"} Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.805154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ee8c7832-9011-4d40-852c-6085193b6a28","Type":"ContainerDied","Data":"84945223e03f9463bf586d4a8b4e8997378f3d1a9bb3d947c7f86bd3f4e66e7a"} Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.805204 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.818246 4707 scope.go:117] "RemoveContainer" containerID="9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8" Jan 21 15:43:49 crc kubenswrapper[4707]: E0121 15:43:49.818720 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8\": container with ID starting with 9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8 not found: ID does not exist" containerID="9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.818789 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8"} err="failed to get container status \"9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8\": rpc error: code = NotFound desc = could not find container \"9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8\": container with ID starting with 9c6ec061319521a5730e30fa51938017f05b5b5b881bf268188ce06162c958e8 not found: ID does not exist" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.818851 4707 scope.go:117] "RemoveContainer" containerID="0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.842888 4707 scope.go:117] "RemoveContainer" containerID="981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.859463 4707 scope.go:117] "RemoveContainer" containerID="0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed" Jan 21 15:43:49 crc kubenswrapper[4707]: E0121 15:43:49.860100 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed\": container with ID starting with 0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed not found: ID does not exist" containerID="0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.860141 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed"} err="failed to get container status \"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed\": rpc error: code = NotFound desc = could not find container \"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed\": container with ID starting with 0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed not found: ID does not exist" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.860163 4707 scope.go:117] "RemoveContainer" containerID="981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8" Jan 21 15:43:49 crc kubenswrapper[4707]: E0121 15:43:49.860435 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8\": container with ID starting with 981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8 not found: ID does not exist" containerID="981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8" Jan 21 
15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.860477 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8"} err="failed to get container status \"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8\": rpc error: code = NotFound desc = could not find container \"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8\": container with ID starting with 981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8 not found: ID does not exist" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.860504 4707 scope.go:117] "RemoveContainer" containerID="0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.860779 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed"} err="failed to get container status \"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed\": rpc error: code = NotFound desc = could not find container \"0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed\": container with ID starting with 0c35155074391adcd9da4a3d18572a5d56773ce508a0893793537d93ee6fc5ed not found: ID does not exist" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.860800 4707 scope.go:117] "RemoveContainer" containerID="981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.861375 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.861485 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8"} err="failed to get container status \"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8\": rpc error: code = NotFound desc = could not find container \"981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8\": container with ID starting with 981e4b8096ce5056c3562e7f528bb8a4fd2cca439e053235c5c61c995abe91c8 not found: ID does not exist" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.870782 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.887716 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.900947 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.913275 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: E0121 15:43:49.913687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67062e56-a299-4d72-a077-0cbb6b8feada" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.913704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67062e56-a299-4d72-a077-0cbb6b8feada" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:43:49 crc kubenswrapper[4707]: E0121 15:43:49.913719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-metadata" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.913726 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-metadata" Jan 21 15:43:49 crc kubenswrapper[4707]: E0121 15:43:49.913771 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-log" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.913778 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-log" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.913978 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-metadata" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.913999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="67062e56-a299-4d72-a077-0cbb6b8feada" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.914006 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" containerName="nova-metadata-log" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.915057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.918738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.918863 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.919900 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.921345 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.927516 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.927671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.927716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.934367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.939861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-config-data\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j8pj\" (UniqueName: \"kubernetes.io/projected/a9ba2c35-3284-4234-b661-254a5b038cf8-kube-api-access-4j8pj\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqjp\" (UniqueName: \"kubernetes.io/projected/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-kube-api-access-jxqjp\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956858 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:49 crc kubenswrapper[4707]: I0121 15:43:49.956941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-logs\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-logs\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058969 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-config-data\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.058995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j8pj\" (UniqueName: \"kubernetes.io/projected/a9ba2c35-3284-4234-b661-254a5b038cf8-kube-api-access-4j8pj\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.059041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.059061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqjp\" (UniqueName: \"kubernetes.io/projected/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-kube-api-access-jxqjp\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.059088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.059431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-logs\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.063982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.064737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-config-data\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.064820 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.065046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.065143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.066191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.070166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.073573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j8pj\" (UniqueName: \"kubernetes.io/projected/a9ba2c35-3284-4234-b661-254a5b038cf8-kube-api-access-4j8pj\") pod \"nova-cell1-novncproxy-0\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.075447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqjp\" (UniqueName: \"kubernetes.io/projected/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-kube-api-access-jxqjp\") pod \"nova-metadata-0\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.150215 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.260907 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.265996 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.655137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:43:50 crc kubenswrapper[4707]: W0121 15:43:50.663416 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ba2c35_3284_4234_b661_254a5b038cf8.slice/crio-6e42bbb2f3a6a93c7af095075600bf2be2bb88d8eec1a49d9739a2268df06c97 WatchSource:0}: Error finding container 6e42bbb2f3a6a93c7af095075600bf2be2bb88d8eec1a49d9739a2268df06c97: Status 404 returned error can't find the container with id 6e42bbb2f3a6a93c7af095075600bf2be2bb88d8eec1a49d9739a2268df06c97 Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.719092 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:50 crc kubenswrapper[4707]: W0121 15:43:50.720238 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded1b336e_0975_46c6_af3e_9ed4fbecbbd9.slice/crio-8bf80bd710451d983c3eac1d9ad23077506c4218357fbd681f41de27eb48f96d WatchSource:0}: Error finding container 8bf80bd710451d983c3eac1d9ad23077506c4218357fbd681f41de27eb48f96d: Status 404 returned error can't find the container with id 8bf80bd710451d983c3eac1d9ad23077506c4218357fbd681f41de27eb48f96d Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.813933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9","Type":"ContainerStarted","Data":"8bf80bd710451d983c3eac1d9ad23077506c4218357fbd681f41de27eb48f96d"} Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.817967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a9ba2c35-3284-4234-b661-254a5b038cf8","Type":"ContainerStarted","Data":"c93cb1efa43ca7efe450cd63451ac3e5290a1c195b5b66d42c58c98b1bbb06e4"} Jan 21 15:43:50 crc kubenswrapper[4707]: I0121 15:43:50.818021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a9ba2c35-3284-4234-b661-254a5b038cf8","Type":"ContainerStarted","Data":"6e42bbb2f3a6a93c7af095075600bf2be2bb88d8eec1a49d9739a2268df06c97"} Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.130593 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.150836 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.150803992 podStartE2EDuration="2.150803992s" podCreationTimestamp="2026-01-21 15:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:50.842057113 +0000 UTC m=+2528.023573334" watchObservedRunningTime="2026-01-21 15:43:51.150803992 +0000 UTC m=+2528.332320213" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.181024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-combined-ca-bundle\") pod \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.181138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mcsl\" (UniqueName: \"kubernetes.io/projected/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-kube-api-access-2mcsl\") pod \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.181783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-config-data\") pod \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.182193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-scripts\") pod \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\" (UID: \"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8\") " Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.186577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-kube-api-access-2mcsl" (OuterVolumeSpecName: "kube-api-access-2mcsl") pod "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" (UID: "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8"). InnerVolumeSpecName "kube-api-access-2mcsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.188934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-scripts" (OuterVolumeSpecName: "scripts") pod "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" (UID: "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.198572 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67062e56-a299-4d72-a077-0cbb6b8feada" path="/var/lib/kubelet/pods/67062e56-a299-4d72-a077-0cbb6b8feada/volumes" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.199540 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8c7832-9011-4d40-852c-6085193b6a28" path="/var/lib/kubelet/pods/ee8c7832-9011-4d40-852c-6085193b6a28/volumes" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.218902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" (UID: "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.240031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-config-data" (OuterVolumeSpecName: "config-data") pod "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" (UID: "b10a74d4-cf8e-4ccd-b07d-999cc270d6b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.285046 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.285080 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mcsl\" (UniqueName: \"kubernetes.io/projected/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-kube-api-access-2mcsl\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.285094 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.285103 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.830529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9","Type":"ContainerStarted","Data":"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b"} Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.830600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9","Type":"ContainerStarted","Data":"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50"} Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.835874 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.837173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm" event={"ID":"b10a74d4-cf8e-4ccd-b07d-999cc270d6b8","Type":"ContainerDied","Data":"2d6b99a939a6f37daf9b0a5aa0f6decff0b8321028a4a85ee0c46ff00b1f50ac"} Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.837203 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6b99a939a6f37daf9b0a5aa0f6decff0b8321028a4a85ee0c46ff00b1f50ac" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.884658 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.884637536 podStartE2EDuration="2.884637536s" podCreationTimestamp="2026-01-21 15:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:51.864215638 +0000 UTC m=+2529.045731860" watchObservedRunningTime="2026-01-21 15:43:51.884637536 +0000 UTC m=+2529.066153758" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.900232 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:43:51 crc kubenswrapper[4707]: E0121 15:43:51.900693 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" containerName="nova-cell1-conductor-db-sync" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.900711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" containerName="nova-cell1-conductor-db-sync" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.900899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" containerName="nova-cell1-conductor-db-sync" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.901559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.903392 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.923889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:43:51 crc kubenswrapper[4707]: I0121 15:43:51.932221 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.000219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.000481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txlwg\" (UniqueName: \"kubernetes.io/projected/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-kube-api-access-txlwg\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.000647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.104187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.104359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txlwg\" (UniqueName: \"kubernetes.io/projected/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-kube-api-access-txlwg\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.104457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.111831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.112178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.124804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txlwg\" (UniqueName: \"kubernetes.io/projected/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-kube-api-access-txlwg\") pod \"nova-cell1-conductor-0\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.258985 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.658984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.845498 4707 generic.go:334] "Generic (PLEG): container finished" podID="d41685f3-ac56-4369-9528-04ded1edb06b" containerID="b7c709a6cb6bf6144998677dd56db745001a849fd302b2bacb3b10742068e3cd" exitCode=0 Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.845568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" event={"ID":"d41685f3-ac56-4369-9528-04ded1edb06b","Type":"ContainerDied","Data":"b7c709a6cb6bf6144998677dd56db745001a849fd302b2bacb3b10742068e3cd"} Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.848118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe","Type":"ContainerStarted","Data":"ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2"} Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.848228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.848244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe","Type":"ContainerStarted","Data":"cedba6f29dee9154f6242a4c5dc406d7fc5ccef59b4225c9c33b899e4c1375ac"} Jan 21 15:43:52 crc kubenswrapper[4707]: I0121 15:43:52.877172 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.877146834 podStartE2EDuration="1.877146834s" podCreationTimestamp="2026-01-21 15:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:52.868270655 +0000 UTC m=+2530.049786877" watchObservedRunningTime="2026-01-21 15:43:52.877146834 +0000 UTC m=+2530.058663057" Jan 21 15:43:53 crc kubenswrapper[4707]: I0121 15:43:53.189659 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:43:53 crc kubenswrapper[4707]: E0121 15:43:53.190320 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.142515 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.244142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjk67\" (UniqueName: \"kubernetes.io/projected/d41685f3-ac56-4369-9528-04ded1edb06b-kube-api-access-zjk67\") pod \"d41685f3-ac56-4369-9528-04ded1edb06b\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.244199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-config-data\") pod \"d41685f3-ac56-4369-9528-04ded1edb06b\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.244271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-combined-ca-bundle\") pod \"d41685f3-ac56-4369-9528-04ded1edb06b\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.244293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-scripts\") pod \"d41685f3-ac56-4369-9528-04ded1edb06b\" (UID: \"d41685f3-ac56-4369-9528-04ded1edb06b\") " Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.250739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41685f3-ac56-4369-9528-04ded1edb06b-kube-api-access-zjk67" (OuterVolumeSpecName: "kube-api-access-zjk67") pod "d41685f3-ac56-4369-9528-04ded1edb06b" (UID: "d41685f3-ac56-4369-9528-04ded1edb06b"). InnerVolumeSpecName "kube-api-access-zjk67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.262130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-scripts" (OuterVolumeSpecName: "scripts") pod "d41685f3-ac56-4369-9528-04ded1edb06b" (UID: "d41685f3-ac56-4369-9528-04ded1edb06b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.265005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-config-data" (OuterVolumeSpecName: "config-data") pod "d41685f3-ac56-4369-9528-04ded1edb06b" (UID: "d41685f3-ac56-4369-9528-04ded1edb06b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.266267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d41685f3-ac56-4369-9528-04ded1edb06b" (UID: "d41685f3-ac56-4369-9528-04ded1edb06b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.348099 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjk67\" (UniqueName: \"kubernetes.io/projected/d41685f3-ac56-4369-9528-04ded1edb06b-kube-api-access-zjk67\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.348136 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.348147 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.348155 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41685f3-ac56-4369-9528-04ded1edb06b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.864579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" event={"ID":"d41685f3-ac56-4369-9528-04ded1edb06b","Type":"ContainerDied","Data":"99e4d21a2d74106a3293953035f0f615625fe163f971ff6e83ed60f48d90d4ff"} Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.864766 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e4d21a2d74106a3293953035f0f615625fe163f971ff6e83ed60f48d90d4ff" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.864635 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.880567 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:54 crc kubenswrapper[4707]: I0121 15:43:54.880850 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.025798 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.032712 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.032929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="9237a5ab-d21a-428d-8fd9-a18dde1b9a35" containerName="nova-scheduler-scheduler" containerID="cri-o://573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa" gracePeriod=30 Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.042657 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.042872 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-log" containerID="cri-o://1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50" gracePeriod=30 Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.042938 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-metadata" containerID="cri-o://3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b" gracePeriod=30 Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.265029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.266228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.266294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.541399 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.571479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-nova-metadata-tls-certs\") pod \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.571677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-combined-ca-bundle\") pod \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.571724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqjp\" (UniqueName: \"kubernetes.io/projected/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-kube-api-access-jxqjp\") pod \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.571748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-logs\") pod \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.571782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-config-data\") pod \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\" (UID: \"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9\") " Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.574024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-logs" (OuterVolumeSpecName: "logs") pod "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" (UID: "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.576030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-kube-api-access-jxqjp" (OuterVolumeSpecName: "kube-api-access-jxqjp") pod "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" (UID: "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9"). InnerVolumeSpecName "kube-api-access-jxqjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.593003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-config-data" (OuterVolumeSpecName: "config-data") pod "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" (UID: "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.593467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" (UID: "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.605923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" (UID: "ed1b336e-0975-46c6-af3e-9ed4fbecbbd9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.673037 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.673084 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.673098 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.673109 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqjp\" (UniqueName: \"kubernetes.io/projected/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-kube-api-access-jxqjp\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.673117 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.873489 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerID="3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b" exitCode=0 Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.873521 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerID="1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50" exitCode=143 Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.874164 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.874195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9","Type":"ContainerDied","Data":"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b"} Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.874289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9","Type":"ContainerDied","Data":"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50"} Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.874313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed1b336e-0975-46c6-af3e-9ed4fbecbbd9","Type":"ContainerDied","Data":"8bf80bd710451d983c3eac1d9ad23077506c4218357fbd681f41de27eb48f96d"} Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.874329 4707 scope.go:117] "RemoveContainer" containerID="3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.898143 4707 scope.go:117] "RemoveContainer" containerID="1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.923425 4707 scope.go:117] "RemoveContainer" containerID="3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b" Jan 21 15:43:55 crc kubenswrapper[4707]: E0121 15:43:55.923823 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b\": container with ID starting with 3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b not found: ID does not exist" containerID="3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.923853 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b"} err="failed to get container status \"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b\": rpc error: code = NotFound desc = could not find container \"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b\": container with ID starting with 3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b not found: ID does not exist" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.923871 4707 scope.go:117] "RemoveContainer" containerID="1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50" Jan 21 15:43:55 crc kubenswrapper[4707]: E0121 15:43:55.924162 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50\": container with ID starting with 1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50 not found: ID does not exist" containerID="1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.924202 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50"} err="failed to get container status 
\"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50\": rpc error: code = NotFound desc = could not find container \"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50\": container with ID starting with 1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50 not found: ID does not exist" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.924225 4707 scope.go:117] "RemoveContainer" containerID="3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.924542 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b"} err="failed to get container status \"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b\": rpc error: code = NotFound desc = could not find container \"3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b\": container with ID starting with 3ab762c0b89b1d017ee5803bed0c9e00267cd843d560417445f0cb2b6d863e7b not found: ID does not exist" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.924565 4707 scope.go:117] "RemoveContainer" containerID="1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.924827 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50"} err="failed to get container status \"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50\": rpc error: code = NotFound desc = could not find container \"1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50\": container with ID starting with 1ccb46bafa1c0ef5b2af314293b3323cafeaf7e460dd196114efcc18e9f61c50 not found: ID does not exist" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.925231 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.933120 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950040 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: E0121 15:43:55.950426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-log" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950439 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-log" Jan 21 15:43:55 crc kubenswrapper[4707]: E0121 15:43:55.950465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41685f3-ac56-4369-9528-04ded1edb06b" containerName="nova-manage" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41685f3-ac56-4369-9528-04ded1edb06b" containerName="nova-manage" Jan 21 15:43:55 crc kubenswrapper[4707]: E0121 15:43:55.950484 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-metadata" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950490 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" 
containerName="nova-metadata-metadata" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950650 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-metadata" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" containerName="nova-metadata-log" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.950680 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41685f3-ac56-4369-9528-04ded1edb06b" containerName="nova-manage" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.951596 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.956836 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.957141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.957183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.969512 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.30:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.969798 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.30:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.977332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.977487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a27a0-15ac-4ee9-88db-c78a16619d74-logs\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.977569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-config-data\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.977664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdnqt\" (UniqueName: 
\"kubernetes.io/projected/9d2a27a0-15ac-4ee9-88db-c78a16619d74-kube-api-access-gdnqt\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:55 crc kubenswrapper[4707]: I0121 15:43:55.977728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.078789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-config-data\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.079036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdnqt\" (UniqueName: \"kubernetes.io/projected/9d2a27a0-15ac-4ee9-88db-c78a16619d74-kube-api-access-gdnqt\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.079112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.079216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.079363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a27a0-15ac-4ee9-88db-c78a16619d74-logs\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.079675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a27a0-15ac-4ee9-88db-c78a16619d74-logs\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.083360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.083797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-config-data\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.083968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.092003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdnqt\" (UniqueName: \"kubernetes.io/projected/9d2a27a0-15ac-4ee9-88db-c78a16619d74-kube-api-access-gdnqt\") pod \"nova-metadata-0\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.263933 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.653575 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.885203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d2a27a0-15ac-4ee9-88db-c78a16619d74","Type":"ContainerStarted","Data":"a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553"} Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.885381 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d2a27a0-15ac-4ee9-88db-c78a16619d74","Type":"ContainerStarted","Data":"9114473b475dda35917d749778ac6605a8d805f591efc1ca9166ca130e1ee6ac"} Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.887060 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-log" containerID="cri-o://5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3" gracePeriod=30 Jan 21 15:43:56 crc kubenswrapper[4707]: I0121 15:43:56.887340 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-api" containerID="cri-o://9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7" gracePeriod=30 Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.191213 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1b336e-0975-46c6-af3e-9ed4fbecbbd9" path="/var/lib/kubelet/pods/ed1b336e-0975-46c6-af3e-9ed4fbecbbd9/volumes" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.285584 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.412395 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.522327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-config-data\") pod \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.522395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-combined-ca-bundle\") pod \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.522452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlxll\" (UniqueName: \"kubernetes.io/projected/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-kube-api-access-xlxll\") pod \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\" (UID: \"9237a5ab-d21a-428d-8fd9-a18dde1b9a35\") " Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.535913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-kube-api-access-xlxll" (OuterVolumeSpecName: "kube-api-access-xlxll") pod "9237a5ab-d21a-428d-8fd9-a18dde1b9a35" (UID: "9237a5ab-d21a-428d-8fd9-a18dde1b9a35"). InnerVolumeSpecName "kube-api-access-xlxll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.544931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-config-data" (OuterVolumeSpecName: "config-data") pod "9237a5ab-d21a-428d-8fd9-a18dde1b9a35" (UID: "9237a5ab-d21a-428d-8fd9-a18dde1b9a35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.555006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9237a5ab-d21a-428d-8fd9-a18dde1b9a35" (UID: "9237a5ab-d21a-428d-8fd9-a18dde1b9a35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.625740 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.625766 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.625780 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlxll\" (UniqueName: \"kubernetes.io/projected/9237a5ab-d21a-428d-8fd9-a18dde1b9a35-kube-api-access-xlxll\") on node \"crc\" DevicePath \"\"" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.893646 4707 generic.go:334] "Generic (PLEG): container finished" podID="9237a5ab-d21a-428d-8fd9-a18dde1b9a35" containerID="573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa" exitCode=0 Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.893706 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.893695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9237a5ab-d21a-428d-8fd9-a18dde1b9a35","Type":"ContainerDied","Data":"573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa"} Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.894160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9237a5ab-d21a-428d-8fd9-a18dde1b9a35","Type":"ContainerDied","Data":"6387675d4e6ddc229f7ac3135a9fa8901ed853851c03e3a9926f4a743f5f705c"} Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.894180 4707 scope.go:117] "RemoveContainer" containerID="573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.896762 4707 generic.go:334] "Generic (PLEG): container finished" podID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerID="5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3" exitCode=143 Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.896831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f","Type":"ContainerDied","Data":"5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3"} Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.899803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d2a27a0-15ac-4ee9-88db-c78a16619d74","Type":"ContainerStarted","Data":"22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02"} Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.920092 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.920077445 podStartE2EDuration="2.920077445s" podCreationTimestamp="2026-01-21 15:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:57.917538702 +0000 UTC m=+2535.099054924" watchObservedRunningTime="2026-01-21 15:43:57.920077445 +0000 UTC 
m=+2535.101593667" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.927525 4707 scope.go:117] "RemoveContainer" containerID="573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa" Jan 21 15:43:57 crc kubenswrapper[4707]: E0121 15:43:57.927935 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa\": container with ID starting with 573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa not found: ID does not exist" containerID="573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.927983 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa"} err="failed to get container status \"573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa\": rpc error: code = NotFound desc = could not find container \"573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa\": container with ID starting with 573555d53765c8ff9b8c8fd0823a075a35ac70da5b3fbc030e9422362ba735aa not found: ID does not exist" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.933296 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.940926 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.946341 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:57 crc kubenswrapper[4707]: E0121 15:43:57.946759 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9237a5ab-d21a-428d-8fd9-a18dde1b9a35" containerName="nova-scheduler-scheduler" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.946778 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9237a5ab-d21a-428d-8fd9-a18dde1b9a35" containerName="nova-scheduler-scheduler" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.946973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9237a5ab-d21a-428d-8fd9-a18dde1b9a35" containerName="nova-scheduler-scheduler" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.947587 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.949508 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:43:57 crc kubenswrapper[4707]: I0121 15:43:57.953889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.033843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4h5\" (UniqueName: \"kubernetes.io/projected/d915544d-9e5f-4947-abc4-82afa8aab777-kube-api-access-cp4h5\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.034282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-config-data\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.034362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.135229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-config-data\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.135277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.135324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4h5\" (UniqueName: \"kubernetes.io/projected/d915544d-9e5f-4947-abc4-82afa8aab777-kube-api-access-cp4h5\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.139881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-config-data\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.140357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.148476 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4h5\" (UniqueName: \"kubernetes.io/projected/d915544d-9e5f-4947-abc4-82afa8aab777-kube-api-access-cp4h5\") pod \"nova-scheduler-0\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.263405 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.780732 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:43:58 crc kubenswrapper[4707]: I0121 15:43:58.914800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d915544d-9e5f-4947-abc4-82afa8aab777","Type":"ContainerStarted","Data":"e0d44ef4379af7b0f3dcd1caa3ff12707e4823d2d898a9df6b0e5188052dda4f"} Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.192885 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9237a5ab-d21a-428d-8fd9-a18dde1b9a35" path="/var/lib/kubelet/pods/9237a5ab-d21a-428d-8fd9-a18dde1b9a35/volumes" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.366706 4707 scope.go:117] "RemoveContainer" containerID="1a038bb09ac026e0ff6f452079a322deba5bbb2f6d988725aead5c0eee7ae3dd" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.400540 4707 scope.go:117] "RemoveContainer" containerID="07df91c2736baee7ae0dcb92b0586ed687e019bc1d2d0d55f1d53c627d3b7bb8" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.419138 4707 scope.go:117] "RemoveContainer" containerID="c0e20ad0b54e06f9f898937e4cd6c15bb4efb7247d7665943402e873ebf065eb" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.447166 4707 scope.go:117] "RemoveContainer" containerID="b73c003ff0dc595122eb4fcd8f2b3c3b46a88371d6bdfe451fb8a53df4e771d5" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.475663 4707 scope.go:117] "RemoveContainer" containerID="ac0ab72e2adc66130459decd935a239cd9d2d7fa50b371dc00565ad083fd50a9" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.512092 4707 scope.go:117] "RemoveContainer" containerID="bda04896b44c77f2f7e84995fda7c5d626e8dec1e42dd18df28b089230861206" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.529593 4707 scope.go:117] "RemoveContainer" containerID="793158d30625ea44f14e3cf6927d960d4fb70b77cc2bde9279de8570448d24cb" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.544837 4707 scope.go:117] "RemoveContainer" containerID="46086c7e5a9656d5495acf8f78897d5393fb2095b5808e57b661a658aa9f8aec" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.575238 4707 scope.go:117] "RemoveContainer" containerID="7ae822343165321e1d28599fa0b72e4f84a0e15193bcb3d2070a4af6ab58b85d" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.600364 4707 scope.go:117] "RemoveContainer" containerID="38209e49c03af2916763782b9ab88941d2a2fce3d3f8e0a0e6eb99881b4f82b3" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.616589 4707 scope.go:117] "RemoveContainer" containerID="2121408667c0e3ee8a9ef8da490db5564b60ad19d737feab45bf9a9843fd4d1c" Jan 21 15:43:59 crc kubenswrapper[4707]: I0121 15:43:59.924648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d915544d-9e5f-4947-abc4-82afa8aab777","Type":"ContainerStarted","Data":"955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1"} Jan 21 15:43:59 crc 
kubenswrapper[4707]: I0121 15:43:59.942552 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.942531469 podStartE2EDuration="2.942531469s" podCreationTimestamp="2026-01-21 15:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:43:59.938074599 +0000 UTC m=+2537.119590821" watchObservedRunningTime="2026-01-21 15:43:59.942531469 +0000 UTC m=+2537.124047691" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.266926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.283871 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.604952 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.699781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-logs\") pod \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.700069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-config-data\") pod \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.700159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-combined-ca-bundle\") pod \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.700226 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-logs" (OuterVolumeSpecName: "logs") pod "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" (UID: "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.700378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnb76\" (UniqueName: \"kubernetes.io/projected/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-kube-api-access-vnb76\") pod \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\" (UID: \"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f\") " Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.700718 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.704590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-kube-api-access-vnb76" (OuterVolumeSpecName: "kube-api-access-vnb76") pod "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" (UID: "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f"). 
InnerVolumeSpecName "kube-api-access-vnb76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.721023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" (UID: "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.721722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-config-data" (OuterVolumeSpecName: "config-data") pod "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" (UID: "aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.802027 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.802297 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.802310 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnb76\" (UniqueName: \"kubernetes.io/projected/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f-kube-api-access-vnb76\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.937971 4707 generic.go:334] "Generic (PLEG): container finished" podID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerID="9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7" exitCode=0 Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.938040 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.938188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f","Type":"ContainerDied","Data":"9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7"} Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.938234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f","Type":"ContainerDied","Data":"b872210ac0e5a469c8949f5d78f180e83b2ee57ea730df81f0b7fa7aae1578d4"} Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.938252 4707 scope.go:117] "RemoveContainer" containerID="9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.954053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.960528 4707 scope.go:117] "RemoveContainer" containerID="5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.983442 4707 scope.go:117] "RemoveContainer" containerID="9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7" Jan 21 15:44:00 crc kubenswrapper[4707]: E0121 15:44:00.984692 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7\": container with ID starting with 9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7 not found: ID does not exist" containerID="9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.984784 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7"} err="failed to get container status \"9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7\": rpc error: code = NotFound desc = could not find container \"9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7\": container with ID starting with 9dad56c5d06510ab25cc77d65175a5d6a91f825fbe945ca2f8218b1d7dae56c7 not found: ID does not exist" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.984882 4707 scope.go:117] "RemoveContainer" containerID="5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3" Jan 21 15:44:00 crc kubenswrapper[4707]: E0121 15:44:00.990909 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3\": container with ID starting with 5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3 not found: ID does not exist" containerID="5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.990947 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3"} err="failed to get container status \"5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3\": rpc error: code = NotFound desc = could not find container 
\"5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3\": container with ID starting with 5b09f62f887ca42fcdb2342ab321ad43d2ecd95dda91ea4bcff89164b871d7e3 not found: ID does not exist" Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.995998 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:00 crc kubenswrapper[4707]: I0121 15:44:00.999295 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.016668 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:01 crc kubenswrapper[4707]: E0121 15:44:01.017063 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-log" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.017083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-log" Jan 21 15:44:01 crc kubenswrapper[4707]: E0121 15:44:01.017109 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-api" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.017116 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-api" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.017272 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-log" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.017286 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" containerName="nova-api-api" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.018161 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.019798 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.024927 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.101322 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-tz454"] Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.102450 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.104108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.104492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.106939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.107030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576e9498-fc59-4189-82a8-8804d2d7e475-logs\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.107095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqgl\" (UniqueName: \"kubernetes.io/projected/576e9498-fc59-4189-82a8-8804d2d7e475-kube-api-access-dhqgl\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.107157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-config-data\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.110286 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-tz454"] Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.190456 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f" path="/var/lib/kubelet/pods/aae7dd1d-8d01-4418-b07a-ef3bad9e2b9f/volumes" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576e9498-fc59-4189-82a8-8804d2d7e475-logs\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-config-data\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqgl\" (UniqueName: \"kubernetes.io/projected/576e9498-fc59-4189-82a8-8804d2d7e475-kube-api-access-dhqgl\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 
15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-scripts\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-config-data\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-kube-api-access-dvrfr\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.208890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576e9498-fc59-4189-82a8-8804d2d7e475-logs\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.212329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.212585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-config-data\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.229656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhqgl\" (UniqueName: \"kubernetes.io/projected/576e9498-fc59-4189-82a8-8804d2d7e475-kube-api-access-dhqgl\") pod \"nova-api-0\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.264570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.264694 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.310934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-kube-api-access-dvrfr\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.310992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.311039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-config-data\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.311087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-scripts\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.313905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.314481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-config-data\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.314492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-scripts\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.323791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-kube-api-access-dvrfr\") pod \"nova-cell1-cell-mapping-tz454\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.337546 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.422876 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:01 crc kubenswrapper[4707]: W0121 15:44:01.708569 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod576e9498_fc59_4189_82a8_8804d2d7e475.slice/crio-260a0301e2bf32d2b56f4089124e79db536442470ce2cb75d5f3011882ab1772 WatchSource:0}: Error finding container 260a0301e2bf32d2b56f4089124e79db536442470ce2cb75d5f3011882ab1772: Status 404 returned error can't find the container with id 260a0301e2bf32d2b56f4089124e79db536442470ce2cb75d5f3011882ab1772 Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.709662 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.796503 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-tz454"] Jan 21 15:44:01 crc kubenswrapper[4707]: W0121 15:44:01.797960 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4146388d_3f42_42fd_ac0e_9ec7ab1bf062.slice/crio-088da09e1632e89af353e5b4b3e1ebf9c8b10fc6c5a3cbce1d94f89c8c0f6731 WatchSource:0}: Error finding container 088da09e1632e89af353e5b4b3e1ebf9c8b10fc6c5a3cbce1d94f89c8c0f6731: Status 404 returned error can't find the container with id 088da09e1632e89af353e5b4b3e1ebf9c8b10fc6c5a3cbce1d94f89c8c0f6731 Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.948003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"576e9498-fc59-4189-82a8-8804d2d7e475","Type":"ContainerStarted","Data":"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21"} Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.948241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"576e9498-fc59-4189-82a8-8804d2d7e475","Type":"ContainerStarted","Data":"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5"} Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.948252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"576e9498-fc59-4189-82a8-8804d2d7e475","Type":"ContainerStarted","Data":"260a0301e2bf32d2b56f4089124e79db536442470ce2cb75d5f3011882ab1772"} Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.949959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" event={"ID":"4146388d-3f42-42fd-ac0e-9ec7ab1bf062","Type":"ContainerStarted","Data":"66324ff4f010fdef26b65bc78b114ffa94a1489c0fd9e227005cc8c80d9ace09"} Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.949996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" event={"ID":"4146388d-3f42-42fd-ac0e-9ec7ab1bf062","Type":"ContainerStarted","Data":"088da09e1632e89af353e5b4b3e1ebf9c8b10fc6c5a3cbce1d94f89c8c0f6731"} Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.965019 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.96499998 podStartE2EDuration="1.96499998s" podCreationTimestamp="2026-01-21 15:44:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:01.963658557 +0000 UTC m=+2539.145174780" watchObservedRunningTime="2026-01-21 15:44:01.96499998 +0000 UTC m=+2539.146516201" Jan 21 15:44:01 crc kubenswrapper[4707]: I0121 15:44:01.978664 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" podStartSLOduration=0.978651168 podStartE2EDuration="978.651168ms" podCreationTimestamp="2026-01-21 15:44:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:01.978419391 +0000 UTC m=+2539.159935614" watchObservedRunningTime="2026-01-21 15:44:01.978651168 +0000 UTC m=+2539.160167390" Jan 21 15:44:03 crc kubenswrapper[4707]: I0121 15:44:03.263980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:05 crc kubenswrapper[4707]: I0121 15:44:05.182488 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:44:05 crc kubenswrapper[4707]: E0121 15:44:05.182957 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:44:05 crc kubenswrapper[4707]: I0121 15:44:05.983840 4707 generic.go:334] "Generic (PLEG): container finished" podID="4146388d-3f42-42fd-ac0e-9ec7ab1bf062" containerID="66324ff4f010fdef26b65bc78b114ffa94a1489c0fd9e227005cc8c80d9ace09" exitCode=0 Jan 21 15:44:05 crc kubenswrapper[4707]: I0121 15:44:05.983880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" event={"ID":"4146388d-3f42-42fd-ac0e-9ec7ab1bf062","Type":"ContainerDied","Data":"66324ff4f010fdef26b65bc78b114ffa94a1489c0fd9e227005cc8c80d9ace09"} Jan 21 15:44:06 crc kubenswrapper[4707]: I0121 15:44:06.264636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:06 crc kubenswrapper[4707]: I0121 15:44:06.264692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.270922 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.280855 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.38:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.280888 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.38:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.414517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-combined-ca-bundle\") pod \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.414661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-kube-api-access-dvrfr\") pod \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.414752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-scripts\") pod \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.414847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-config-data\") pod \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\" (UID: \"4146388d-3f42-42fd-ac0e-9ec7ab1bf062\") " Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.418886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-kube-api-access-dvrfr" (OuterVolumeSpecName: "kube-api-access-dvrfr") pod "4146388d-3f42-42fd-ac0e-9ec7ab1bf062" (UID: "4146388d-3f42-42fd-ac0e-9ec7ab1bf062"). InnerVolumeSpecName "kube-api-access-dvrfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.418961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-scripts" (OuterVolumeSpecName: "scripts") pod "4146388d-3f42-42fd-ac0e-9ec7ab1bf062" (UID: "4146388d-3f42-42fd-ac0e-9ec7ab1bf062"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.436517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-config-data" (OuterVolumeSpecName: "config-data") pod "4146388d-3f42-42fd-ac0e-9ec7ab1bf062" (UID: "4146388d-3f42-42fd-ac0e-9ec7ab1bf062"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.440994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4146388d-3f42-42fd-ac0e-9ec7ab1bf062" (UID: "4146388d-3f42-42fd-ac0e-9ec7ab1bf062"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.517036 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-kube-api-access-dvrfr\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.517070 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.517080 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.517089 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4146388d-3f42-42fd-ac0e-9ec7ab1bf062-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.996402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" event={"ID":"4146388d-3f42-42fd-ac0e-9ec7ab1bf062","Type":"ContainerDied","Data":"088da09e1632e89af353e5b4b3e1ebf9c8b10fc6c5a3cbce1d94f89c8c0f6731"} Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.996440 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="088da09e1632e89af353e5b4b3e1ebf9c8b10fc6c5a3cbce1d94f89c8c0f6731" Jan 21 15:44:07 crc kubenswrapper[4707]: I0121 15:44:07.996492 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-tz454" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.145281 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.145480 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-log" containerID="cri-o://93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5" gracePeriod=30 Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.145579 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-api" containerID="cri-o://4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21" gracePeriod=30 Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.170914 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.171123 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d915544d-9e5f-4947-abc4-82afa8aab777" containerName="nova-scheduler-scheduler" containerID="cri-o://955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1" gracePeriod=30 Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.189697 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.189873 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-log" containerID="cri-o://a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553" gracePeriod=30 Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.190209 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-metadata" containerID="cri-o://22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02" gracePeriod=30 Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.625912 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.734446 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576e9498-fc59-4189-82a8-8804d2d7e475-logs\") pod \"576e9498-fc59-4189-82a8-8804d2d7e475\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.734535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-config-data\") pod \"576e9498-fc59-4189-82a8-8804d2d7e475\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.734609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-combined-ca-bundle\") pod \"576e9498-fc59-4189-82a8-8804d2d7e475\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.734645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqgl\" (UniqueName: \"kubernetes.io/projected/576e9498-fc59-4189-82a8-8804d2d7e475-kube-api-access-dhqgl\") pod \"576e9498-fc59-4189-82a8-8804d2d7e475\" (UID: \"576e9498-fc59-4189-82a8-8804d2d7e475\") " Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.734704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/576e9498-fc59-4189-82a8-8804d2d7e475-logs" (OuterVolumeSpecName: "logs") pod "576e9498-fc59-4189-82a8-8804d2d7e475" (UID: "576e9498-fc59-4189-82a8-8804d2d7e475"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.735000 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576e9498-fc59-4189-82a8-8804d2d7e475-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.738590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576e9498-fc59-4189-82a8-8804d2d7e475-kube-api-access-dhqgl" (OuterVolumeSpecName: "kube-api-access-dhqgl") pod "576e9498-fc59-4189-82a8-8804d2d7e475" (UID: "576e9498-fc59-4189-82a8-8804d2d7e475"). InnerVolumeSpecName "kube-api-access-dhqgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.755069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "576e9498-fc59-4189-82a8-8804d2d7e475" (UID: "576e9498-fc59-4189-82a8-8804d2d7e475"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.755274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-config-data" (OuterVolumeSpecName: "config-data") pod "576e9498-fc59-4189-82a8-8804d2d7e475" (UID: "576e9498-fc59-4189-82a8-8804d2d7e475"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.836608 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.836635 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576e9498-fc59-4189-82a8-8804d2d7e475-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:08 crc kubenswrapper[4707]: I0121 15:44:08.836646 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhqgl\" (UniqueName: \"kubernetes.io/projected/576e9498-fc59-4189-82a8-8804d2d7e475-kube-api-access-dhqgl\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004193 4707 generic.go:334] "Generic (PLEG): container finished" podID="576e9498-fc59-4189-82a8-8804d2d7e475" containerID="4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21" exitCode=0 Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004219 4707 generic.go:334] "Generic (PLEG): container finished" podID="576e9498-fc59-4189-82a8-8804d2d7e475" containerID="93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5" exitCode=143 Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004245 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"576e9498-fc59-4189-82a8-8804d2d7e475","Type":"ContainerDied","Data":"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21"} Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"576e9498-fc59-4189-82a8-8804d2d7e475","Type":"ContainerDied","Data":"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5"} Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"576e9498-fc59-4189-82a8-8804d2d7e475","Type":"ContainerDied","Data":"260a0301e2bf32d2b56f4089124e79db536442470ce2cb75d5f3011882ab1772"} Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.004324 4707 scope.go:117] "RemoveContainer" containerID="4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.005700 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerID="a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553" exitCode=143 Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.005733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d2a27a0-15ac-4ee9-88db-c78a16619d74","Type":"ContainerDied","Data":"a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553"} Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.029802 4707 scope.go:117] "RemoveContainer" containerID="93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.034461 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.046638 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.046951 4707 scope.go:117] "RemoveContainer" containerID="4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21" Jan 21 15:44:09 crc kubenswrapper[4707]: E0121 15:44:09.047318 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21\": container with ID starting with 4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21 not found: ID does not exist" containerID="4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.047350 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21"} err="failed to get container status \"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21\": rpc error: code = NotFound desc = could not find container \"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21\": container with ID starting with 4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21 not found: ID does not exist" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.047371 4707 scope.go:117] "RemoveContainer" containerID="93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5" Jan 21 15:44:09 crc kubenswrapper[4707]: E0121 15:44:09.047632 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5\": container with ID starting with 93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5 not found: ID does not exist" containerID="93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.047660 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5"} err="failed to get container status \"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5\": rpc error: code = NotFound desc = could not find container \"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5\": container with ID starting with 93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5 not found: ID does not exist" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.047680 4707 scope.go:117] "RemoveContainer" containerID="4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.047921 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21"} err="failed to get container status \"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21\": rpc error: code = NotFound desc = could not find container \"4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21\": container with ID starting with 4621cf873f8e485f5fecff510eff84c45958e0659e25f31236e0762e64451f21 not found: ID does not exist" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.047942 4707 scope.go:117] "RemoveContainer" 
containerID="93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.048234 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5"} err="failed to get container status \"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5\": rpc error: code = NotFound desc = could not find container \"93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5\": container with ID starting with 93df91bb395d417982488a5b2d210af86cb994d5c5febcd7d3ce988c94dd76a5 not found: ID does not exist" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.055631 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:09 crc kubenswrapper[4707]: E0121 15:44:09.056029 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-api" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.056048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-api" Jan 21 15:44:09 crc kubenswrapper[4707]: E0121 15:44:09.056076 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-log" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.056083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-log" Jan 21 15:44:09 crc kubenswrapper[4707]: E0121 15:44:09.056090 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4146388d-3f42-42fd-ac0e-9ec7ab1bf062" containerName="nova-manage" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.056095 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4146388d-3f42-42fd-ac0e-9ec7ab1bf062" containerName="nova-manage" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.056248 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-api" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.056283 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" containerName="nova-api-log" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.056291 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4146388d-3f42-42fd-ac0e-9ec7ab1bf062" containerName="nova-manage" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.057171 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.059084 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.062709 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.141994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52v2x\" (UniqueName: \"kubernetes.io/projected/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-kube-api-access-52v2x\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.142074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-config-data\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.142108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-logs\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.142144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.190096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576e9498-fc59-4189-82a8-8804d2d7e475" path="/var/lib/kubelet/pods/576e9498-fc59-4189-82a8-8804d2d7e475/volumes" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.243365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52v2x\" (UniqueName: \"kubernetes.io/projected/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-kube-api-access-52v2x\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.243427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-config-data\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.243456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-logs\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.243483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.243952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-logs\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.246380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-config-data\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.246558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.255507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52v2x\" (UniqueName: \"kubernetes.io/projected/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-kube-api-access-52v2x\") pod \"nova-api-0\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.374861 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.741004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.858040 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.959553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-config-data\") pod \"d915544d-9e5f-4947-abc4-82afa8aab777\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.959748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-combined-ca-bundle\") pod \"d915544d-9e5f-4947-abc4-82afa8aab777\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.959787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp4h5\" (UniqueName: \"kubernetes.io/projected/d915544d-9e5f-4947-abc4-82afa8aab777-kube-api-access-cp4h5\") pod \"d915544d-9e5f-4947-abc4-82afa8aab777\" (UID: \"d915544d-9e5f-4947-abc4-82afa8aab777\") " Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.965790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d915544d-9e5f-4947-abc4-82afa8aab777-kube-api-access-cp4h5" (OuterVolumeSpecName: "kube-api-access-cp4h5") pod "d915544d-9e5f-4947-abc4-82afa8aab777" (UID: "d915544d-9e5f-4947-abc4-82afa8aab777"). InnerVolumeSpecName "kube-api-access-cp4h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.984547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d915544d-9e5f-4947-abc4-82afa8aab777" (UID: "d915544d-9e5f-4947-abc4-82afa8aab777"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:09 crc kubenswrapper[4707]: I0121 15:44:09.984956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-config-data" (OuterVolumeSpecName: "config-data") pod "d915544d-9e5f-4947-abc4-82afa8aab777" (UID: "d915544d-9e5f-4947-abc4-82afa8aab777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.014253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8","Type":"ContainerStarted","Data":"989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175"} Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.014305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8","Type":"ContainerStarted","Data":"49ec25f3e90aa803b89b01341fa73e5c751cae663057cbecf0e51632bfe86a87"} Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.016436 4707 generic.go:334] "Generic (PLEG): container finished" podID="d915544d-9e5f-4947-abc4-82afa8aab777" containerID="955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1" exitCode=0 Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.016468 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.016474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d915544d-9e5f-4947-abc4-82afa8aab777","Type":"ContainerDied","Data":"955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1"} Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.016499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d915544d-9e5f-4947-abc4-82afa8aab777","Type":"ContainerDied","Data":"e0d44ef4379af7b0f3dcd1caa3ff12707e4823d2d898a9df6b0e5188052dda4f"} Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.016515 4707 scope.go:117] "RemoveContainer" containerID="955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.036040 4707 scope.go:117] "RemoveContainer" containerID="955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1" Jan 21 15:44:10 crc kubenswrapper[4707]: E0121 15:44:10.036401 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1\": container with ID starting with 955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1 not found: ID does not exist" containerID="955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.036428 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1"} err="failed to get container status \"955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1\": rpc error: code = NotFound desc = could not find container \"955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1\": container with ID starting with 955367905cd3d12b3d28959c00a72c41f95fb1d0fbd8e13a086e86a55b20abb1 not found: ID does not exist" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.062023 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.062048 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d915544d-9e5f-4947-abc4-82afa8aab777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.062059 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp4h5\" (UniqueName: \"kubernetes.io/projected/d915544d-9e5f-4947-abc4-82afa8aab777-kube-api-access-cp4h5\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.065382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.072950 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.082196 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:44:10 crc kubenswrapper[4707]: E0121 15:44:10.082719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d915544d-9e5f-4947-abc4-82afa8aab777" containerName="nova-scheduler-scheduler" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.082780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d915544d-9e5f-4947-abc4-82afa8aab777" containerName="nova-scheduler-scheduler" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.083015 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d915544d-9e5f-4947-abc4-82afa8aab777" containerName="nova-scheduler-scheduler" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.083625 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.085591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.089595 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.163586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-config-data\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.163674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.163849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9tk\" (UniqueName: \"kubernetes.io/projected/7cfb7b09-0ff5-4134-90e4-2b04813ca151-kube-api-access-lp9tk\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.265761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9tk\" (UniqueName: \"kubernetes.io/projected/7cfb7b09-0ff5-4134-90e4-2b04813ca151-kube-api-access-lp9tk\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.265981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-config-data\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.266079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.270731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.271209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-config-data\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.279687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9tk\" (UniqueName: \"kubernetes.io/projected/7cfb7b09-0ff5-4134-90e4-2b04813ca151-kube-api-access-lp9tk\") pod \"nova-scheduler-0\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.396182 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:10 crc kubenswrapper[4707]: I0121 15:44:10.763403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:44:10 crc kubenswrapper[4707]: W0121 15:44:10.765661 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cfb7b09_0ff5_4134_90e4_2b04813ca151.slice/crio-fc50202a4d0fab39abe73fbc0cdcbfccca1803dad1e6818435550ddd7852f377 WatchSource:0}: Error finding container fc50202a4d0fab39abe73fbc0cdcbfccca1803dad1e6818435550ddd7852f377: Status 404 returned error can't find the container with id fc50202a4d0fab39abe73fbc0cdcbfccca1803dad1e6818435550ddd7852f377 Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.025866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7cfb7b09-0ff5-4134-90e4-2b04813ca151","Type":"ContainerStarted","Data":"2a68ebdabb09d5063a608b567a0d520fdc9896d30daf8aed958ebfbd4a7454b6"} Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.026070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7cfb7b09-0ff5-4134-90e4-2b04813ca151","Type":"ContainerStarted","Data":"fc50202a4d0fab39abe73fbc0cdcbfccca1803dad1e6818435550ddd7852f377"} Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.027770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8","Type":"ContainerStarted","Data":"4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3"} Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.038664 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.038651513 podStartE2EDuration="1.038651513s" podCreationTimestamp="2026-01-21 15:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:11.036395281 +0000 UTC m=+2548.217911503" watchObservedRunningTime="2026-01-21 15:44:11.038651513 +0000 UTC m=+2548.220167736" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.052142 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.052112984 
podStartE2EDuration="2.052112984s" podCreationTimestamp="2026-01-21 15:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:11.048475414 +0000 UTC m=+2548.229991637" watchObservedRunningTime="2026-01-21 15:44:11.052112984 +0000 UTC m=+2548.233629206" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.196674 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d915544d-9e5f-4947-abc4-82afa8aab777" path="/var/lib/kubelet/pods/d915544d-9e5f-4947-abc4-82afa8aab777/volumes" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.708927 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.792604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a27a0-15ac-4ee9-88db-c78a16619d74-logs\") pod \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.792728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdnqt\" (UniqueName: \"kubernetes.io/projected/9d2a27a0-15ac-4ee9-88db-c78a16619d74-kube-api-access-gdnqt\") pod \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.792756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-nova-metadata-tls-certs\") pod \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.792876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-config-data\") pod \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.792917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-combined-ca-bundle\") pod \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\" (UID: \"9d2a27a0-15ac-4ee9-88db-c78a16619d74\") " Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.794300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2a27a0-15ac-4ee9-88db-c78a16619d74-logs" (OuterVolumeSpecName: "logs") pod "9d2a27a0-15ac-4ee9-88db-c78a16619d74" (UID: "9d2a27a0-15ac-4ee9-88db-c78a16619d74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.797782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2a27a0-15ac-4ee9-88db-c78a16619d74-kube-api-access-gdnqt" (OuterVolumeSpecName: "kube-api-access-gdnqt") pod "9d2a27a0-15ac-4ee9-88db-c78a16619d74" (UID: "9d2a27a0-15ac-4ee9-88db-c78a16619d74"). InnerVolumeSpecName "kube-api-access-gdnqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.814016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d2a27a0-15ac-4ee9-88db-c78a16619d74" (UID: "9d2a27a0-15ac-4ee9-88db-c78a16619d74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.815431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-config-data" (OuterVolumeSpecName: "config-data") pod "9d2a27a0-15ac-4ee9-88db-c78a16619d74" (UID: "9d2a27a0-15ac-4ee9-88db-c78a16619d74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.829369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9d2a27a0-15ac-4ee9-88db-c78a16619d74" (UID: "9d2a27a0-15ac-4ee9-88db-c78a16619d74"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.894876 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a27a0-15ac-4ee9-88db-c78a16619d74-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.894905 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdnqt\" (UniqueName: \"kubernetes.io/projected/9d2a27a0-15ac-4ee9-88db-c78a16619d74-kube-api-access-gdnqt\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.894918 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.894927 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:11 crc kubenswrapper[4707]: I0121 15:44:11.894936 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a27a0-15ac-4ee9-88db-c78a16619d74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.035946 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerID="22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02" exitCode=0 Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.036150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d2a27a0-15ac-4ee9-88db-c78a16619d74","Type":"ContainerDied","Data":"22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02"} Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.036189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"9d2a27a0-15ac-4ee9-88db-c78a16619d74","Type":"ContainerDied","Data":"9114473b475dda35917d749778ac6605a8d805f591efc1ca9166ca130e1ee6ac"} Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.036208 4707 scope.go:117] "RemoveContainer" containerID="22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.036324 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.057712 4707 scope.go:117] "RemoveContainer" containerID="a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.064708 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.072761 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.087760 4707 scope.go:117] "RemoveContainer" containerID="22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02" Jan 21 15:44:12 crc kubenswrapper[4707]: E0121 15:44:12.088078 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02\": container with ID starting with 22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02 not found: ID does not exist" containerID="22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.088107 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02"} err="failed to get container status \"22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02\": rpc error: code = NotFound desc = could not find container \"22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02\": container with ID starting with 22924fb27e5af9adbeceb89f4a24d5dfb179033d4cfe5e9ab0d3e59ddae82c02 not found: ID does not exist" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.088124 4707 scope.go:117] "RemoveContainer" containerID="a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553" Jan 21 15:44:12 crc kubenswrapper[4707]: E0121 15:44:12.088335 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553\": container with ID starting with a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553 not found: ID does not exist" containerID="a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.088354 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553"} err="failed to get container status \"a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553\": rpc error: code = NotFound desc = could not find container \"a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553\": container with ID starting with a049d52c504e64e09fa2051fd65af4333a5e63fe1874b69815227b9464a32553 not found: ID does not exist" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 
15:44:12.091041 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:44:12 crc kubenswrapper[4707]: E0121 15:44:12.091447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-log" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.091461 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-log" Jan 21 15:44:12 crc kubenswrapper[4707]: E0121 15:44:12.091472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-metadata" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.091478 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-metadata" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.091670 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-log" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.091694 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" containerName="nova-metadata-metadata" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.093035 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.095348 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.095351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.098560 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.204409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-logs\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.204458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgdv\" (UniqueName: \"kubernetes.io/projected/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-kube-api-access-jqgdv\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.204497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.204660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.204800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-config-data\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.311307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-config-data\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.311431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-logs\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.311475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgdv\" (UniqueName: \"kubernetes.io/projected/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-kube-api-access-jqgdv\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.311528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.311644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.312136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-logs\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.314698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-config-data\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.315198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.316145 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.323919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgdv\" (UniqueName: \"kubernetes.io/projected/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-kube-api-access-jqgdv\") pod \"nova-metadata-0\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.414617 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:12 crc kubenswrapper[4707]: I0121 15:44:12.804293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:44:12 crc kubenswrapper[4707]: W0121 15:44:12.811088 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2846d28b_24cc_4f6a_bd5b_dc099af2bc3e.slice/crio-ff17493926a97d3e459915a28b84cbdb2de941005a3f8c06897af0f4ad08180a WatchSource:0}: Error finding container ff17493926a97d3e459915a28b84cbdb2de941005a3f8c06897af0f4ad08180a: Status 404 returned error can't find the container with id ff17493926a97d3e459915a28b84cbdb2de941005a3f8c06897af0f4ad08180a Jan 21 15:44:13 crc kubenswrapper[4707]: I0121 15:44:13.044944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e","Type":"ContainerStarted","Data":"9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1"} Jan 21 15:44:13 crc kubenswrapper[4707]: I0121 15:44:13.045147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e","Type":"ContainerStarted","Data":"954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f"} Jan 21 15:44:13 crc kubenswrapper[4707]: I0121 15:44:13.045157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e","Type":"ContainerStarted","Data":"ff17493926a97d3e459915a28b84cbdb2de941005a3f8c06897af0f4ad08180a"} Jan 21 15:44:13 crc kubenswrapper[4707]: I0121 15:44:13.066363 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.066344798 podStartE2EDuration="1.066344798s" podCreationTimestamp="2026-01-21 15:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:13.057700965 +0000 UTC m=+2550.239217188" watchObservedRunningTime="2026-01-21 15:44:13.066344798 +0000 UTC m=+2550.247861019" Jan 21 15:44:13 crc kubenswrapper[4707]: I0121 15:44:13.190602 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2a27a0-15ac-4ee9-88db-c78a16619d74" path="/var/lib/kubelet/pods/9d2a27a0-15ac-4ee9-88db-c78a16619d74/volumes" Jan 21 15:44:15 crc kubenswrapper[4707]: I0121 15:44:15.396566 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:17 crc kubenswrapper[4707]: I0121 15:44:17.415383 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:17 crc kubenswrapper[4707]: I0121 15:44:17.415432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:19 crc kubenswrapper[4707]: I0121 15:44:19.375561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:19 crc kubenswrapper[4707]: I0121 15:44:19.375770 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:20 crc kubenswrapper[4707]: I0121 15:44:20.182693 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:44:20 crc kubenswrapper[4707]: E0121 15:44:20.183111 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:44:20 crc kubenswrapper[4707]: I0121 15:44:20.396798 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:20 crc kubenswrapper[4707]: I0121 15:44:20.416282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:20 crc kubenswrapper[4707]: I0121 15:44:20.457990 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.42:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:20 crc kubenswrapper[4707]: I0121 15:44:20.458019 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.42:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:21 crc kubenswrapper[4707]: I0121 15:44:21.115207 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:44:22 crc kubenswrapper[4707]: I0121 15:44:22.415322 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:22 crc kubenswrapper[4707]: I0121 15:44:22.415558 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:23 crc kubenswrapper[4707]: I0121 15:44:23.424931 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.44:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:23 crc kubenswrapper[4707]: I0121 15:44:23.424955 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" 
podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.44:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:29 crc kubenswrapper[4707]: I0121 15:44:29.378600 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:29 crc kubenswrapper[4707]: I0121 15:44:29.378864 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:29 crc kubenswrapper[4707]: I0121 15:44:29.379132 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:29 crc kubenswrapper[4707]: I0121 15:44:29.379171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:29 crc kubenswrapper[4707]: I0121 15:44:29.381169 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:29 crc kubenswrapper[4707]: I0121 15:44:29.381481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:30 crc kubenswrapper[4707]: I0121 15:44:30.914980 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:30 crc kubenswrapper[4707]: I0121 15:44:30.916098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="proxy-httpd" containerID="cri-o://1f73b761d24b8bd5767620d0a64fc846049e6a99669fc0ba4aa19f78bbf5fa01" gracePeriod=30 Jan 21 15:44:30 crc kubenswrapper[4707]: I0121 15:44:30.916193 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-notification-agent" containerID="cri-o://e1017ca648f0834dc9b8e2d10be743e2a03e7d85e0a914cdfee158561b1c6f3d" gracePeriod=30 Jan 21 15:44:30 crc kubenswrapper[4707]: I0121 15:44:30.916180 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="sg-core" containerID="cri-o://af7fa61ee4bb60ee7836162a149bd8b4c83a18b47857fed88839b335236ad6c7" gracePeriod=30 Jan 21 15:44:30 crc kubenswrapper[4707]: I0121 15:44:30.916533 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-central-agent" containerID="cri-o://3ca43973eeb47657c74fbd63e26851952964f1788418d1c9a8e7b0423389eab1" gracePeriod=30 Jan 21 15:44:31 crc kubenswrapper[4707]: I0121 15:44:31.162206 4707 generic.go:334] "Generic (PLEG): container finished" podID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerID="1f73b761d24b8bd5767620d0a64fc846049e6a99669fc0ba4aa19f78bbf5fa01" exitCode=0 Jan 21 15:44:31 crc kubenswrapper[4707]: I0121 15:44:31.162235 4707 generic.go:334] "Generic (PLEG): container finished" podID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerID="af7fa61ee4bb60ee7836162a149bd8b4c83a18b47857fed88839b335236ad6c7" exitCode=2 Jan 21 15:44:31 crc kubenswrapper[4707]: I0121 15:44:31.162279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerDied","Data":"1f73b761d24b8bd5767620d0a64fc846049e6a99669fc0ba4aa19f78bbf5fa01"} Jan 21 15:44:31 crc kubenswrapper[4707]: I0121 15:44:31.162325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerDied","Data":"af7fa61ee4bb60ee7836162a149bd8b4c83a18b47857fed88839b335236ad6c7"} Jan 21 15:44:31 crc kubenswrapper[4707]: I0121 15:44:31.547219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.173263 4707 generic.go:334] "Generic (PLEG): container finished" podID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerID="3ca43973eeb47657c74fbd63e26851952964f1788418d1c9a8e7b0423389eab1" exitCode=0 Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.173310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerDied","Data":"3ca43973eeb47657c74fbd63e26851952964f1788418d1c9a8e7b0423389eab1"} Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.173476 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-log" containerID="cri-o://989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175" gracePeriod=30 Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.173541 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-api" containerID="cri-o://4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3" gracePeriod=30 Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.419912 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.420327 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:32 crc kubenswrapper[4707]: I0121 15:44:32.427136 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:33 crc kubenswrapper[4707]: I0121 15:44:33.181739 4707 generic.go:334] "Generic (PLEG): container finished" podID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerID="989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175" exitCode=143 Jan 21 15:44:33 crc kubenswrapper[4707]: I0121 15:44:33.188382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8","Type":"ContainerDied","Data":"989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175"} Jan 21 15:44:33 crc kubenswrapper[4707]: I0121 15:44:33.189768 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:44:34 crc kubenswrapper[4707]: I0121 15:44:34.182144 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:44:34 crc kubenswrapper[4707]: E0121 15:44:34.182389 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.209601 4707 generic.go:334] "Generic (PLEG): container finished" podID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerID="e1017ca648f0834dc9b8e2d10be743e2a03e7d85e0a914cdfee158561b1c6f3d" exitCode=0 Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.209670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerDied","Data":"e1017ca648f0834dc9b8e2d10be743e2a03e7d85e0a914cdfee158561b1c6f3d"} Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.403312 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.479778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-config-data\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.479833 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb8gh\" (UniqueName: \"kubernetes.io/projected/9aa155f7-28bd-4346-8d17-9e176f05f924-kube-api-access-gb8gh\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.479856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-sg-core-conf-yaml\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.480085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-run-httpd\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.480112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-ceilometer-tls-certs\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.480161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-log-httpd\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.480181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-combined-ca-bundle\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc 
kubenswrapper[4707]: I0121 15:44:35.480212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-scripts\") pod \"9aa155f7-28bd-4346-8d17-9e176f05f924\" (UID: \"9aa155f7-28bd-4346-8d17-9e176f05f924\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.480948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.481108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.485353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-scripts" (OuterVolumeSpecName: "scripts") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.485479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa155f7-28bd-4346-8d17-9e176f05f924-kube-api-access-gb8gh" (OuterVolumeSpecName: "kube-api-access-gb8gh") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "kube-api-access-gb8gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.503947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.524103 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.526733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.554712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.573374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-config-data" (OuterVolumeSpecName: "config-data") pod "9aa155f7-28bd-4346-8d17-9e176f05f924" (UID: "9aa155f7-28bd-4346-8d17-9e176f05f924"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.581579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-logs\") pod \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.581654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52v2x\" (UniqueName: \"kubernetes.io/projected/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-kube-api-access-52v2x\") pod \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.581682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-combined-ca-bundle\") pod \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.581739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-config-data\") pod \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\" (UID: \"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8\") " Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.582545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-logs" (OuterVolumeSpecName: "logs") pod "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" (UID: "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583375 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583397 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583408 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583417 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aa155f7-28bd-4346-8d17-9e176f05f924-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583425 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583434 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583442 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583450 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb8gh\" (UniqueName: \"kubernetes.io/projected/9aa155f7-28bd-4346-8d17-9e176f05f924-kube-api-access-gb8gh\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.583457 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aa155f7-28bd-4346-8d17-9e176f05f924-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.584216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-kube-api-access-52v2x" (OuterVolumeSpecName: "kube-api-access-52v2x") pod "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" (UID: "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8"). InnerVolumeSpecName "kube-api-access-52v2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.601291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-config-data" (OuterVolumeSpecName: "config-data") pod "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" (UID: "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.601964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" (UID: "57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.685603 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52v2x\" (UniqueName: \"kubernetes.io/projected/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-kube-api-access-52v2x\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.685628 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:35 crc kubenswrapper[4707]: I0121 15:44:35.685638 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.218467 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.218474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9aa155f7-28bd-4346-8d17-9e176f05f924","Type":"ContainerDied","Data":"e71e18176ab1d4c34179455a844fe6a6b3fe017ac550daa9b34c15c80803120e"} Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.219236 4707 scope.go:117] "RemoveContainer" containerID="1f73b761d24b8bd5767620d0a64fc846049e6a99669fc0ba4aa19f78bbf5fa01" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.219940 4707 generic.go:334] "Generic (PLEG): container finished" podID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerID="4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3" exitCode=0 Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.219967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8","Type":"ContainerDied","Data":"4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3"} Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.219977 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.219985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8","Type":"ContainerDied","Data":"49ec25f3e90aa803b89b01341fa73e5c751cae663057cbecf0e51632bfe86a87"} Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.246056 4707 scope.go:117] "RemoveContainer" containerID="af7fa61ee4bb60ee7836162a149bd8b4c83a18b47857fed88839b335236ad6c7" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.253561 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.268238 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.268721 4707 scope.go:117] "RemoveContainer" containerID="e1017ca648f0834dc9b8e2d10be743e2a03e7d85e0a914cdfee158561b1c6f3d" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.275882 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.276309 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-notification-agent" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-notification-agent" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.276341 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="sg-core" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="sg-core" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.276360 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-central-agent" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276366 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-central-agent" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.276381 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-log" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276385 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-log" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.276405 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-api" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276410 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-api" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.276419 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="proxy-httpd" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276424 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="proxy-httpd" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276588 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="sg-core" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276599 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-notification-agent" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276618 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-api" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="ceilometer-central-agent" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" containerName="proxy-httpd" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.276649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" containerName="nova-api-log" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.278176 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.282144 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.282243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.282391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.282453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.289601 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.294754 4707 scope.go:117] "RemoveContainer" containerID="3ca43973eeb47657c74fbd63e26851952964f1788418d1c9a8e7b0423389eab1" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.295308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.309686 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.311150 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.311232 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.319832 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.319965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.320023 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.339911 4707 scope.go:117] "RemoveContainer" containerID="4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.367293 4707 scope.go:117] "RemoveContainer" containerID="989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.379240 4707 scope.go:117] "RemoveContainer" containerID="4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.379487 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3\": container with ID starting with 4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3 not found: ID does not exist" containerID="4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.379521 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3"} err="failed to get container status \"4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3\": rpc error: code = NotFound desc = could not find container \"4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3\": container with ID starting with 4861b0f98a44e067f2b70f6a6430901a9231c13762614f121be6b93e1c6340d3 not found: ID does not exist" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.379539 4707 scope.go:117] "RemoveContainer" containerID="989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175" Jan 21 15:44:36 crc kubenswrapper[4707]: E0121 15:44:36.379759 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175\": container with ID starting with 989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175 not found: ID does not exist" containerID="989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.379793 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175"} err="failed to get container status \"989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175\": rpc error: code = NotFound desc = could not find container \"989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175\": container with ID starting with 989a4dab465202fa9ca54945d42670bd440725ea754ed9d64040fa7d92b03175 not found: ID does not exist" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398052 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-run-httpd\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qmf\" (UniqueName: \"kubernetes.io/projected/f92113cd-544a-4c31-9ca9-e56ec4250773-kube-api-access-85qmf\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-config-data\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92113cd-544a-4c31-9ca9-e56ec4250773-logs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-config-data\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-log-httpd\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-scripts\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5pfn\" (UniqueName: \"kubernetes.io/projected/7456f87d-fbee-4a79-8e96-9af0ea500ea6-kube-api-access-b5pfn\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.398658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-public-tls-certs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.499781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.499834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.499857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-log-httpd\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.499891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-scripts\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.499948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.499965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5pfn\" (UniqueName: \"kubernetes.io/projected/7456f87d-fbee-4a79-8e96-9af0ea500ea6-kube-api-access-b5pfn\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-log-httpd\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-public-tls-certs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-run-httpd\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qmf\" (UniqueName: \"kubernetes.io/projected/f92113cd-544a-4c31-9ca9-e56ec4250773-kube-api-access-85qmf\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-config-data\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92113cd-544a-4c31-9ca9-e56ec4250773-logs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-config-data\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.500825 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-run-httpd\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.501063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92113cd-544a-4c31-9ca9-e56ec4250773-logs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.503299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-public-tls-certs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.503656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.505565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-scripts\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.505737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.506456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.506782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.506859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-config-data\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.507775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.508651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-config-data\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.514060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5pfn\" (UniqueName: \"kubernetes.io/projected/7456f87d-fbee-4a79-8e96-9af0ea500ea6-kube-api-access-b5pfn\") pod \"ceilometer-0\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.515775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qmf\" (UniqueName: \"kubernetes.io/projected/f92113cd-544a-4c31-9ca9-e56ec4250773-kube-api-access-85qmf\") pod \"nova-api-0\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.601459 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:36 crc kubenswrapper[4707]: I0121 15:44:36.631551 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.000132 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.060255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:44:37 crc kubenswrapper[4707]: W0121 15:44:37.062106 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf92113cd_544a_4c31_9ca9_e56ec4250773.slice/crio-c97009d6035ce5f1df4e31d6af5ed4249e065135446a5927ebc0e8105fdeb663 WatchSource:0}: Error finding container c97009d6035ce5f1df4e31d6af5ed4249e065135446a5927ebc0e8105fdeb663: Status 404 returned error can't find the container with id c97009d6035ce5f1df4e31d6af5ed4249e065135446a5927ebc0e8105fdeb663 Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.205184 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8" path="/var/lib/kubelet/pods/57fa8c9c-c7a2-40ce-8cf7-629218d8d3c8/volumes" Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.206035 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa155f7-28bd-4346-8d17-9e176f05f924" path="/var/lib/kubelet/pods/9aa155f7-28bd-4346-8d17-9e176f05f924/volumes" Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.267839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f92113cd-544a-4c31-9ca9-e56ec4250773","Type":"ContainerStarted","Data":"6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f"} Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.267887 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"f92113cd-544a-4c31-9ca9-e56ec4250773","Type":"ContainerStarted","Data":"c97009d6035ce5f1df4e31d6af5ed4249e065135446a5927ebc0e8105fdeb663"} Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.267911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerStarted","Data":"e214d15197898ba636f0d9dacf4142998241d84774549d4e807e5bdac6593c83"} Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.904549 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.932428 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.946637 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.946860 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" containerName="memcached" containerID="cri-o://0845927e911c37dcf9655a14e1f3c10d1152022715b30329ae47d09eb7ba4612" gracePeriod=30 Jan 21 15:44:37 crc kubenswrapper[4707]: I0121 15:44:37.977613 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.009442 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.134864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vgsx9"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.140462 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hj9wf"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.153305 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vgsx9"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.164161 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hj9wf"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.176804 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerName="galera" containerID="cri-o://6ab4155a7c3881ee300c8f40a21c7af6d838cd85e5003eb8d0024abf5e2ad845" gracePeriod=30 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.180146 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerName="galera" containerID="cri-o://fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448" gracePeriod=30 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.217543 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rvckb"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.231237 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rvckb"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.272307 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-92nzp"] Jan 
21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.274207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.285212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-92nzp"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.286339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerStarted","Data":"4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080"} Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.298834 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-x6jxn"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.300033 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.300325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f92113cd-544a-4c31-9ca9-e56ec4250773","Type":"ContainerStarted","Data":"0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9"} Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.306384 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-x6jxn"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.328471 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.328845 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="openstack-network-exporter" containerID="cri-o://df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a" gracePeriod=300 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.351127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvt5w\" (UniqueName: \"kubernetes.io/projected/1c800162-c9be-4c26-848e-548c5ade1645-kube-api-access-mvt5w\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.351196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-config-data\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.351317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-combined-ca-bundle\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.354034 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-v88c2"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.372798 4707 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/cinder-db-sync-v88c2"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.386739 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.38672104 podStartE2EDuration="2.38672104s" podCreationTimestamp="2026-01-21 15:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:38.348979375 +0000 UTC m=+2575.530495598" watchObservedRunningTime="2026-01-21 15:44:38.38672104 +0000 UTC m=+2575.568237261" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.401470 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-mbvl2"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.403344 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.405794 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.416714 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="ovsdbserver-nb" containerID="cri-o://1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439" gracePeriod=300 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.422875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-mbvl2"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.453043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-scripts\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.453124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvt5w\" (UniqueName: \"kubernetes.io/projected/1c800162-c9be-4c26-848e-548c5ade1645-kube-api-access-mvt5w\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.456789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrh6p\" (UniqueName: \"kubernetes.io/projected/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-kube-api-access-rrh6p\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.456869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-combined-ca-bundle\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.456902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-logs\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.456929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-config-data\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.457030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-config-data\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.457072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-combined-ca-bundle\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.458060 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.458333 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="ovn-northd" containerID="cri-o://19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" gracePeriod=30 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.458784 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="openstack-network-exporter" containerID="cri-o://63a8221ce80bea60ab8157bbbd92a930360d931306dcb9d132b2e61859dc44ae" gracePeriod=30 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.474403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-config-data\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.474898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-combined-ca-bundle\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.480299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvt5w\" (UniqueName: \"kubernetes.io/projected/1c800162-c9be-4c26-848e-548c5ade1645-kube-api-access-mvt5w\") pod \"keystone-db-sync-92nzp\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.490836 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/cinder-db-sync-r94jp"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.492243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.494862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-r94jp"] Jan 21 15:44:38 crc kubenswrapper[4707]: E0121 15:44:38.508258 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:44:38 crc kubenswrapper[4707]: E0121 15:44:38.514993 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:44:38 crc kubenswrapper[4707]: E0121 15:44:38.521460 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:44:38 crc kubenswrapper[4707]: E0121 15:44:38.521514 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="ovn-northd" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.536703 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-2qlj6"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-scripts\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrh6p\" (UniqueName: \"kubernetes.io/projected/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-kube-api-access-rrh6p\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-combined-ca-bundle\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-logs\") 
pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-config-data\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-db-sync-config-data\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.560776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-config-data\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.564930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89m7\" (UniqueName: \"kubernetes.io/projected/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-kube-api-access-w89m7\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.564958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-combined-ca-bundle\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.565440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-logs\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.571945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-2qlj6"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.572549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-combined-ca-bundle\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.576354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-config-data\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.578888 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rrh6p\" (UniqueName: \"kubernetes.io/projected/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-kube-api-access-rrh6p\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.601070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-scripts\") pod \"placement-db-sync-x6jxn\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.630585 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.631287 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="openstack-network-exporter" containerID="cri-o://80c16aa36e45c30a214eb7a6a5d35d4783766903f11910f232a3d249cc3e22cd" gracePeriod=300 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.656353 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-zcs8f"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.657633 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgmq2\" (UniqueName: \"kubernetes.io/projected/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-kube-api-access-kgmq2\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89m7\" (UniqueName: \"kubernetes.io/projected/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-kube-api-access-w89m7\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-combined-ca-bundle\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-db-sync-config-data\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-scripts\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " 
pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-etc-machine-id\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-config-data\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-db-sync-config-data\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-combined-ca-bundle\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.667501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-config-data\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.673865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-db-sync-config-data\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.674275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-config-data\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.679523 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-zcs8f"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.679796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-combined-ca-bundle\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.687069 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.687362 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="ovsdbserver-sb" containerID="cri-o://945a84fae9f8753ed68a8b62e2772b440ee13e8fb93bc4182ad87216030f7bba" gracePeriod=300 Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.703631 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.719834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89m7\" (UniqueName: \"kubernetes.io/projected/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-kube-api-access-w89m7\") pod \"glance-db-sync-mbvl2\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.733045 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.737395 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.220:11211: connect: connection refused" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-combined-ca-bundle\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-config-data\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwg99\" (UniqueName: \"kubernetes.io/projected/02253407-57dc-4a23-8c81-e9464a8faa92-kube-api-access-jwg99\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgmq2\" (UniqueName: \"kubernetes.io/projected/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-kube-api-access-kgmq2\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-combined-ca-bundle\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc 
kubenswrapper[4707]: I0121 15:44:38.768788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-config\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-db-sync-config-data\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-scripts\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.768897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-etc-machine-id\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.769006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-etc-machine-id\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.772878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-combined-ca-bundle\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.777375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-db-sync-config-data\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.783655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-config-data\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.783887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-scripts\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.791232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgmq2\" 
(UniqueName: \"kubernetes.io/projected/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-kube-api-access-kgmq2\") pod \"cinder-db-sync-r94jp\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.871236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwg99\" (UniqueName: \"kubernetes.io/projected/02253407-57dc-4a23-8c81-e9464a8faa92-kube-api-access-jwg99\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.871516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-combined-ca-bundle\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.871536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-config\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.876536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-config\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.878376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-combined-ca-bundle\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.892227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwg99\" (UniqueName: \"kubernetes.io/projected/02253407-57dc-4a23-8c81-e9464a8faa92-kube-api-access-jwg99\") pod \"neutron-db-sync-zcs8f\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.900900 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.918987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.927770 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-l2d4s"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.977782 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:44:38 crc kubenswrapper[4707]: I0121 15:44:38.978251 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="a9ba2c35-3284-4234-b661-254a5b038cf8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c93cb1efa43ca7efe450cd63451ac3e5290a1c195b5b66d42c58c98b1bbb06e4" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.003924 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.004398 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.016425 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mj2nm"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.065928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.067062 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.072387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.072982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.089999 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.091362 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.094657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.109728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-config-data\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-scripts\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-scripts\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxmqx\" (UniqueName: \"kubernetes.io/projected/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-kube-api-access-cxmqx\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9x4\" (UniqueName: \"kubernetes.io/projected/708db58a-2cd1-4735-8c38-16987a96a22a-kube-api-access-nw9x4\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-config-data\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.186420 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.247217 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c697467-33ed-4d7c-9496-3e6961ddf8f4" path="/var/lib/kubelet/pods/3c697467-33ed-4d7c-9496-3e6961ddf8f4/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.247989 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f22a02a-72f1-45cd-876b-99167c754c9c" path="/var/lib/kubelet/pods/3f22a02a-72f1-45cd-876b-99167c754c9c/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.258613 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1e87e4-8e97-4b83-94c8-91505cd5dd0d" path="/var/lib/kubelet/pods/4f1e87e4-8e97-4b83-94c8-91505cd5dd0d/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.259202 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b10a74d4-cf8e-4ccd-b07d-999cc270d6b8" path="/var/lib/kubelet/pods/b10a74d4-cf8e-4ccd-b07d-999cc270d6b8/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.260108 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe" path="/var/lib/kubelet/pods/c2a6f9e2-ff0d-4bf8-b511-28387e55a4fe/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.287708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.287779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-config-data\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.287848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-scripts\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.287866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-scripts\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.287929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxmqx\" (UniqueName: \"kubernetes.io/projected/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-kube-api-access-cxmqx\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: 
\"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.287947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9x4\" (UniqueName: \"kubernetes.io/projected/708db58a-2cd1-4735-8c38-16987a96a22a-kube-api-access-nw9x4\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.288012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-config-data\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.288058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.304375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.307199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.308145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-scripts\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.309567 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_a16b399e-f6ad-455c-b786-fc08d088f7dc/ovsdbserver-nb/0.log" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.309868 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.310706 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d37f54bf-831c-4586-b825-7a4e57bc1878" path="/var/lib/kubelet/pods/d37f54bf-831c-4586-b825-7a4e57bc1878/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.311373 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc87e83f-52c0-49a9-9f2d-0ae38afd1d49" path="/var/lib/kubelet/pods/fc87e83f-52c0-49a9-9f2d-0ae38afd1d49/volumes" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.311975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.316159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxmqx\" (UniqueName: \"kubernetes.io/projected/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-kube-api-access-cxmqx\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.346119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9x4\" (UniqueName: \"kubernetes.io/projected/708db58a-2cd1-4735-8c38-16987a96a22a-kube-api-access-nw9x4\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.359592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-config-data\") pod \"nova-cell0-conductor-db-sync-tfldg\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.367388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-config-data\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.381420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-scripts\") pod \"nova-cell1-conductor-db-sync-lnwv8\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.382880 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerID="63a8221ce80bea60ab8157bbbd92a930360d931306dcb9d132b2e61859dc44ae" exitCode=2 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.383000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81","Type":"ContainerDied","Data":"63a8221ce80bea60ab8157bbbd92a930360d931306dcb9d132b2e61859dc44ae"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.388867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-combined-ca-bundle\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.388935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdbserver-nb-tls-certs\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.388978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdb-rundir\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.389074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-metrics-certs-tls-certs\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.389099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.389114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-config\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.389209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-scripts\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.389232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ncsr\" (UniqueName: \"kubernetes.io/projected/a16b399e-f6ad-455c-b786-fc08d088f7dc-kube-api-access-5ncsr\") pod \"a16b399e-f6ad-455c-b786-fc08d088f7dc\" (UID: \"a16b399e-f6ad-455c-b786-fc08d088f7dc\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.392106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.393155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-config" (OuterVolumeSpecName: "config") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.396945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.397418 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-scripts" (OuterVolumeSpecName: "scripts") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.399844 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.399895 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.399906 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.399916 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a16b399e-f6ad-455c-b786-fc08d088f7dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.406454 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_7d1ed8c7-69eb-461f-9984-719171cafcb2/ovsdbserver-sb/0.log" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.406545 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerID="80c16aa36e45c30a214eb7a6a5d35d4783766903f11910f232a3d249cc3e22cd" exitCode=2 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.406625 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerID="945a84fae9f8753ed68a8b62e2772b440ee13e8fb93bc4182ad87216030f7bba" exitCode=143 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.406731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"7d1ed8c7-69eb-461f-9984-719171cafcb2","Type":"ContainerDied","Data":"80c16aa36e45c30a214eb7a6a5d35d4783766903f11910f232a3d249cc3e22cd"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.406802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"7d1ed8c7-69eb-461f-9984-719171cafcb2","Type":"ContainerDied","Data":"945a84fae9f8753ed68a8b62e2772b440ee13e8fb93bc4182ad87216030f7bba"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.409782 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.428919 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.458877 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-92nzp"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.460320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16b399e-f6ad-455c-b786-fc08d088f7dc-kube-api-access-5ncsr" (OuterVolumeSpecName: "kube-api-access-5ncsr") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "kube-api-access-5ncsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.467960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.468021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerStarted","Data":"39829273d81c5e6b6f5743fe52316bf63d995fb9c51456d0c3691bf5a73f5941"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487024 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_a16b399e-f6ad-455c-b786-fc08d088f7dc/ovsdbserver-nb/0.log" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487080 4707 generic.go:334] "Generic (PLEG): container finished" podID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerID="df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a" exitCode=2 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487104 4707 generic.go:334] "Generic (PLEG): container finished" podID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerID="1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439" exitCode=143 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487198 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a16b399e-f6ad-455c-b786-fc08d088f7dc","Type":"ContainerDied","Data":"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a16b399e-f6ad-455c-b786-fc08d088f7dc","Type":"ContainerDied","Data":"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a16b399e-f6ad-455c-b786-fc08d088f7dc","Type":"ContainerDied","Data":"d8544757a9b2d8e3ca9fff73bef4830f94514ff29fb9b253f5bf7bc9ef70c4ed"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.487661 4707 scope.go:117] "RemoveContainer" containerID="df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.490943 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.496569 4707 generic.go:334] "Generic (PLEG): container finished" podID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" containerID="0845927e911c37dcf9655a14e1f3c10d1152022715b30329ae47d09eb7ba4612" exitCode=0 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.496695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2","Type":"ContainerDied","Data":"0845927e911c37dcf9655a14e1f3c10d1152022715b30329ae47d09eb7ba4612"} Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.502858 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.502895 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.502907 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ncsr\" (UniqueName: \"kubernetes.io/projected/a16b399e-f6ad-455c-b786-fc08d088f7dc-kube-api-access-5ncsr\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.540474 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qzljp"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.551276 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qzljp"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.568349 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.583629 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.631700 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2lfqh"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.640440 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-dpcvc"] Jan 21 15:44:39 crc kubenswrapper[4707]: E0121 15:44:39.640844 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" containerName="memcached" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.640862 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" containerName="memcached" Jan 21 15:44:39 crc kubenswrapper[4707]: E0121 15:44:39.640870 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="ovsdbserver-nb" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.640875 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="ovsdbserver-nb" Jan 21 15:44:39 crc kubenswrapper[4707]: E0121 15:44:39.640900 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="openstack-network-exporter" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.640906 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="openstack-network-exporter" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.641069 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="ovsdbserver-nb" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.641084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" containerName="openstack-network-exporter" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.641104 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" containerName="memcached" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.641699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.654275 4707 scope.go:117] "RemoveContainer" containerID="1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.671578 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2lfqh"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.684856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-dpcvc"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.688944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.715165 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.715527 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-server" containerID="cri-o://e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.715897 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="swift-recon-cron" containerID="cri-o://51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.715956 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="rsync" containerID="cri-o://3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.715993 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-expirer" containerID="cri-o://ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716023 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-updater" containerID="cri-o://2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716051 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-auditor" containerID="cri-o://fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716078 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-replicator" containerID="cri-o://764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716104 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-server" containerID="cri-o://9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716162 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-updater" containerID="cri-o://87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716195 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-auditor" containerID="cri-o://0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716222 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-replicator" containerID="cri-o://f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716249 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-server" containerID="cri-o://5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716288 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-reaper" containerID="cri-o://0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716318 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-auditor" containerID="cri-o://96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.716359 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-replicator" containerID="cri-o://cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58" gracePeriod=30 Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.717356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-memcached-tls-certs\") pod \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.717446 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-config-data\") pod \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.717508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kolla-config\") pod \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.717589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jdp4\" (UniqueName: \"kubernetes.io/projected/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kube-api-access-8jdp4\") pod \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " Jan 21 
15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.717707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-combined-ca-bundle\") pod \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\" (UID: \"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2\") " Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.718217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-combined-ca-bundle\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.718400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-db-sync-config-data\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.718440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5l47\" (UniqueName: \"kubernetes.io/projected/ef17ce01-e6d2-4028-8e44-e794559c1608-kube-api-access-b5l47\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.718511 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.718772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" (UID: "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.719039 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-config-data" (OuterVolumeSpecName: "config-data") pod "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" (UID: "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.723928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "a16b399e-f6ad-455c-b786-fc08d088f7dc" (UID: "a16b399e-f6ad-455c-b786-fc08d088f7dc"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.741029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kube-api-access-8jdp4" (OuterVolumeSpecName: "kube-api-access-8jdp4") pod "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" (UID: "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2"). InnerVolumeSpecName "kube-api-access-8jdp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.798614 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kdbn6"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.830149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5l47\" (UniqueName: \"kubernetes.io/projected/ef17ce01-e6d2-4028-8e44-e794559c1608-kube-api-access-b5l47\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.830677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-combined-ca-bundle\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.831175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-db-sync-config-data\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.831359 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.831605 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.831692 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jdp4\" (UniqueName: \"kubernetes.io/projected/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-kube-api-access-8jdp4\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.831789 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16b399e-f6ad-455c-b786-fc08d088f7dc-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.831904 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.838148 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.838459 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.838629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-db-sync-config-data\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.840274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-combined-ca-bundle\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.852753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5l47\" (UniqueName: \"kubernetes.io/projected/ef17ce01-e6d2-4028-8e44-e794559c1608-kube-api-access-b5l47\") pod \"barbican-db-sync-dpcvc\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.860248 4707 scope.go:117] "RemoveContainer" containerID="df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a" Jan 21 15:44:39 crc kubenswrapper[4707]: E0121 15:44:39.863054 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a\": container with ID starting with df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a not found: ID does not exist" containerID="df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.863107 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a"} err="failed to get container status \"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a\": rpc error: code = NotFound desc = could not find container \"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a\": container with ID starting with df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a not found: ID does not exist" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.863135 4707 scope.go:117] "RemoveContainer" containerID="1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439" Jan 21 15:44:39 crc kubenswrapper[4707]: E0121 15:44:39.866992 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439\": container with ID starting with 1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439 not found: ID does not exist" containerID="1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.867032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439"} err="failed to get container status \"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439\": rpc error: code = NotFound desc = could not find container \"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439\": container with ID starting with 1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439 not found: ID does not exist" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.867066 4707 scope.go:117] "RemoveContainer" containerID="df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.870060 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a"} err="failed to get container status \"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a\": rpc error: code = NotFound desc = could not find container \"df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a\": container with ID starting with df089364d7a1e83994a21ec2de07b79fe85cd8463b19e482ec7a69ab59c9471a not found: ID does not exist" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.870082 4707 scope.go:117] "RemoveContainer" containerID="1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.870388 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439"} err="failed to get container status \"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439\": rpc error: code = NotFound desc = could not find container \"1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439\": container with ID starting with 1b967c38c0b60338b15cbce8cbf8d67fbe6f362a46f08e15d571b902e8eaf439 not found: ID does not exist" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.934004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" 
(UID: "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.946446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kdbn6"] Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.954439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-swiftconf\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.954541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-combined-ca-bundle\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.965243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-ring-data-devices\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.965466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqh8r\" (UniqueName: \"kubernetes.io/projected/a15a41db-259f-4646-b02e-18cb3c2beab7-kube-api-access-dqh8r\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.965577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-dispersionconf\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.965827 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a15a41db-259f-4646-b02e-18cb3c2beab7-etc-swift\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.965869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-scripts\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.965963 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:39 crc kubenswrapper[4707]: I0121 15:44:39.977073 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.019401 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.029017 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.053848 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.056177 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.058417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.059428 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" (UID: "a11b049c-b2f5-4be3-8a8b-99b141d2b7f2"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.068992 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-5mnrs" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.069200 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.069340 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.069452 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a15a41db-259f-4646-b02e-18cb3c2beab7-etc-swift\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-scripts\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-swiftconf\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-combined-ca-bundle\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-ring-data-devices\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqh8r\" (UniqueName: \"kubernetes.io/projected/a15a41db-259f-4646-b02e-18cb3c2beab7-kube-api-access-dqh8r\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-dispersionconf\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.071476 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.073858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a15a41db-259f-4646-b02e-18cb3c2beab7-etc-swift\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.074424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-scripts\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.074801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-ring-data-devices\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.075089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-dispersionconf\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.079544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-swiftconf\") pod \"swift-ring-rebalance-kdbn6\" (UID: 
\"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.080309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-combined-ca-bundle\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.121602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqh8r\" (UniqueName: \"kubernetes.io/projected/a15a41db-259f-4646-b02e-18cb3c2beab7-kube-api-access-dqh8r\") pod \"swift-ring-rebalance-kdbn6\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-config\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgcqc\" (UniqueName: \"kubernetes.io/projected/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-kube-api-access-sgcqc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.173420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.266764 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="a9ba2c35-3284-4234-b661-254a5b038cf8" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.36:6080/vnc_lite.html\": dial tcp 10.217.1.36:6080: connect: connection refused" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.275410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-config\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.275482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgcqc\" (UniqueName: \"kubernetes.io/projected/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-kube-api-access-sgcqc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.275532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.275663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.275682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.275766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.277844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.277871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.279075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-config\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.283641 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.283949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.284772 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.284789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.288539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.290397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.298305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.308667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgcqc\" (UniqueName: \"kubernetes.io/projected/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-kube-api-access-sgcqc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.327109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.405504 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.432619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-x6jxn"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.443075 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-mbvl2"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.446873 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_7d1ed8c7-69eb-461f-9984-719171cafcb2/ovsdbserver-sb/0.log" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.446928 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.446993 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.485833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-r94jp"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.530487 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-zcs8f"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.542496 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_7d1ed8c7-69eb-461f-9984-719171cafcb2/ovsdbserver-sb/0.log" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.542631 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.542818 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"7d1ed8c7-69eb-461f-9984-719171cafcb2","Type":"ContainerDied","Data":"d2e8aa83b3a291c1d03d1eee0517b885b88329c584f84cb80a2f282b4aae999f"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.542871 4707 scope.go:117] "RemoveContainer" containerID="80c16aa36e45c30a214eb7a6a5d35d4783766903f11910f232a3d249cc3e22cd" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.575863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a11b049c-b2f5-4be3-8a8b-99b141d2b7f2","Type":"ContainerDied","Data":"a5f183bebe0f7da4d0ddf4fa40803191f17d6f310bfb6a29a2703ac6f2d8bbd3"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.576201 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.582006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" event={"ID":"1c800162-c9be-4c26-848e-548c5ade1645","Type":"ContainerStarted","Data":"b3819f15bfa92dacfc9efa832f6bba289ab15504086d0c4da1ef4db7b3dfdb57"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-config\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdbserver-sb-tls-certs\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-galera-tls-certs\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-generated\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-default\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-combined-ca-bundle\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2452s\" (UniqueName: \"kubernetes.io/projected/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kube-api-access-2452s\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-scripts\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: 
\"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kolla-config\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-metrics-certs-tls-certs\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-combined-ca-bundle\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-operator-scripts\") pod \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\" (UID: \"a7280671-6e1f-48c9-a9c8-ca1f7453f07c\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnwcv\" (UniqueName: \"kubernetes.io/projected/7d1ed8c7-69eb-461f-9984-719171cafcb2-kube-api-access-dnwcv\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.587889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdb-rundir\") pod \"7d1ed8c7-69eb-461f-9984-719171cafcb2\" (UID: \"7d1ed8c7-69eb-461f-9984-719171cafcb2\") " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.588463 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.588899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.589224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.589776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-config" (OuterVolumeSpecName: "config") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.590072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.590494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-scripts" (OuterVolumeSpecName: "scripts") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.592070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.599949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1ed8c7-69eb-461f-9984-719171cafcb2-kube-api-access-dnwcv" (OuterVolumeSpecName: "kube-api-access-dnwcv") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "kube-api-access-dnwcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.604955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.607026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerStarted","Data":"80c0c3690090f96b43c9dd8217c42cf20d45c7e81cdd539d31f8dfdd07cb3ec4"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.628470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.631968 4707 scope.go:117] "RemoveContainer" containerID="945a84fae9f8753ed68a8b62e2772b440ee13e8fb93bc4182ad87216030f7bba" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.641342 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9ba2c35-3284-4234-b661-254a5b038cf8" containerID="c93cb1efa43ca7efe450cd63451ac3e5290a1c195b5b66d42c58c98b1bbb06e4" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.641412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a9ba2c35-3284-4234-b661-254a5b038cf8","Type":"ContainerDied","Data":"c93cb1efa43ca7efe450cd63451ac3e5290a1c195b5b66d42c58c98b1bbb06e4"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.658608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.659649 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.660989 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerID="fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.661050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a7280671-6e1f-48c9-a9c8-ca1f7453f07c","Type":"ContainerDied","Data":"fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.661079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a7280671-6e1f-48c9-a9c8-ca1f7453f07c","Type":"ContainerDied","Data":"85fa3a261dfa556fa3e788b3224aba8d6953f21c5112386ef2616abd7a8edbec"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.661141 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.665723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kube-api-access-2452s" (OuterVolumeSpecName: "kube-api-access-2452s") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "kube-api-access-2452s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.688929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "mysql-db") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690493 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690521 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690547 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690557 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690568 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690576 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2452s\" (UniqueName: \"kubernetes.io/projected/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kube-api-access-2452s\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690587 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d1ed8c7-69eb-461f-9984-719171cafcb2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690595 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690607 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690616 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690625 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.690633 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnwcv\" 
(UniqueName: \"kubernetes.io/projected/7d1ed8c7-69eb-461f-9984-719171cafcb2-kube-api-access-dnwcv\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.699975 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: E0121 15:44:40.700349 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="ovsdbserver-sb" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700361 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="ovsdbserver-sb" Jan 21 15:44:40 crc kubenswrapper[4707]: E0121 15:44:40.700376 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerName="galera" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerName="galera" Jan 21 15:44:40 crc kubenswrapper[4707]: E0121 15:44:40.700398 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="openstack-network-exporter" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700404 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="openstack-network-exporter" Jan 21 15:44:40 crc kubenswrapper[4707]: E0121 15:44:40.700428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerName="mysql-bootstrap" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700433 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerName="mysql-bootstrap" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700594 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="ovsdbserver-sb" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700605 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" containerName="openstack-network-exporter" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.700615 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" containerName="galera" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.702508 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.713701 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-mmw7f" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.713886 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.714002 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.714943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.724437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.737930 4707 generic.go:334] "Generic (PLEG): container finished" podID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerID="6ab4155a7c3881ee300c8f40a21c7af6d838cd85e5003eb8d0024abf5e2ad845" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.737983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"89b706e2-54e8-4a85-b263-cbccec9d93bd","Type":"ContainerDied","Data":"6ab4155a7c3881ee300c8f40a21c7af6d838cd85e5003eb8d0024abf5e2ad845"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.792043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-kolla-config\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.792106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4lqk\" (UniqueName: \"kubernetes.io/projected/57114736-e4da-48ba-8f64-f382267ede2e-kube-api-access-w4lqk\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.792131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.792185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.792217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-config-data\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.792286 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.806565 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.893705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-kolla-config\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.893980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4lqk\" (UniqueName: \"kubernetes.io/projected/57114736-e4da-48ba-8f64-f382267ede2e-kube-api-access-w4lqk\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.894002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.894036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.894060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-config-data\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.894147 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906005 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906042 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906052 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6" exitCode=0 
Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906061 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906070 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906076 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906084 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906090 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906098 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906104 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906110 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906117 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906123 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906130 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa" exitCode=0 Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.906309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa"} Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.909372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-config-data\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.910768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-kolla-config\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.928264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.931205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:40 crc kubenswrapper[4707]: I0121 15:44:40.963999 4707 scope.go:117] "RemoveContainer" containerID="0845927e911c37dcf9655a14e1f3c10d1152022715b30329ae47d09eb7ba4612" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.020937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.029326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4lqk\" (UniqueName: \"kubernetes.io/projected/57114736-e4da-48ba-8f64-f382267ede2e-kube-api-access-w4lqk\") pod \"memcached-0\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.070155 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.097384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "a7280671-6e1f-48c9-a9c8-ca1f7453f07c" (UID: "a7280671-6e1f-48c9-a9c8-ca1f7453f07c"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.097899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7d1ed8c7-69eb-461f-9984-719171cafcb2" (UID: "7d1ed8c7-69eb-461f-9984-719171cafcb2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.103564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vch9b\" (UniqueName: \"kubernetes.io/projected/89b706e2-54e8-4a85-b263-cbccec9d93bd-kube-api-access-vch9b\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.103757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-operator-scripts\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.104720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-generated\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.104951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-default\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.105073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-combined-ca-bundle\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.105174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-kolla-config\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.105279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-galera-tls-certs\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.105515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.107660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.107747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.104069 4707 scope.go:117] "RemoveContainer" containerID="fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.108478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.108912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b706e2-54e8-4a85-b263-cbccec9d93bd-kube-api-access-vch9b" (OuterVolumeSpecName: "kube-api-access-vch9b") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "kube-api-access-vch9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.113704 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.114409 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.117338 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7280671-6e1f-48c9-a9c8-ca1f7453f07c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.117355 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1ed8c7-69eb-461f-9984-719171cafcb2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.108746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.124198 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.200997 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9055239a-b902-41a9-a86b-0a09f7b78d4e" path="/var/lib/kubelet/pods/9055239a-b902-41a9-a86b-0a09f7b78d4e/volumes" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.201567 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11b049c-b2f5-4be3-8a8b-99b141d2b7f2" path="/var/lib/kubelet/pods/a11b049c-b2f5-4be3-8a8b-99b141d2b7f2/volumes" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.202161 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16b399e-f6ad-455c-b786-fc08d088f7dc" path="/var/lib/kubelet/pods/a16b399e-f6ad-455c-b786-fc08d088f7dc/volumes" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.205975 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b" path="/var/lib/kubelet/pods/f2dc5bc7-8ece-4d91-a0b6-922abbf5af7b/volumes" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.220516 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vch9b\" (UniqueName: \"kubernetes.io/projected/89b706e2-54e8-4a85-b263-cbccec9d93bd-kube-api-access-vch9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.220548 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.220558 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.220568 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.220578 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.220588 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89b706e2-54e8-4a85-b263-cbccec9d93bd-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.321602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.333977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"89b706e2-54e8-4a85-b263-cbccec9d93bd\" (UID: \"89b706e2-54e8-4a85-b263-cbccec9d93bd\") " Jan 21 15:44:41 crc kubenswrapper[4707]: W0121 15:44:41.336798 4707 mount_helper_common.go:34] Warning: mount cleanup skipped because path does not exist: /var/lib/kubelet/pods/89b706e2-54e8-4a85-b263-cbccec9d93bd/volumes/kubernetes.io~local-volume/local-storage04-crc Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.336853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: W0121 15:44:41.373105 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod708db58a_2cd1_4735_8c38_16987a96a22a.slice/crio-e478f869783c247cc1b797cbe5a214c97fb7abf9d51bbfbf94c86ca0d70dcd18 WatchSource:0}: Error finding container e478f869783c247cc1b797cbe5a214c97fb7abf9d51bbfbf94c86ca0d70dcd18: Status 404 returned error can't find the container with id e478f869783c247cc1b797cbe5a214c97fb7abf9d51bbfbf94c86ca0d70dcd18 Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.438294 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.441946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.549000 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.570182 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.612047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "89b706e2-54e8-4a85-b263-cbccec9d93bd" (UID: "89b706e2-54e8-4a85-b263-cbccec9d93bd"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.651491 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89b706e2-54e8-4a85-b263-cbccec9d93bd-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.662053 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.762786 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg"] Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.762843 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8"] Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.762856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kdbn6"] Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.762866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-dpcvc"] Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.833754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.875432 4707 scope.go:117] "RemoveContainer" containerID="958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.927450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" event={"ID":"72c825b0-9f75-45ef-ab7c-5e38af70a8d2","Type":"ContainerStarted","Data":"38b41abf3aac941b3174f1c8a5eeb7b89e9817d932cce60bb06f025c8037461f"} Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.927492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" event={"ID":"72c825b0-9f75-45ef-ab7c-5e38af70a8d2","Type":"ContainerStarted","Data":"82da53a0720e2dac9961ac96ab3f6e9a14c9521536d8eea43fedb6de3eb006e7"} Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.934515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a9ba2c35-3284-4234-b661-254a5b038cf8","Type":"ContainerDied","Data":"6e42bbb2f3a6a93c7af095075600bf2be2bb88d8eec1a49d9739a2268df06c97"} Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.934557 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e42bbb2f3a6a93c7af095075600bf2be2bb88d8eec1a49d9739a2268df06c97" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.951230 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" podStartSLOduration=3.951210614 podStartE2EDuration="3.951210614s" podCreationTimestamp="2026-01-21 15:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:41.939991881 +0000 UTC m=+2579.121508103" watchObservedRunningTime="2026-01-21 15:44:41.951210614 +0000 UTC m=+2579.132726836" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.954259 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.959498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.961313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"89b706e2-54e8-4a85-b263-cbccec9d93bd","Type":"ContainerDied","Data":"aa98719951bf2b6bf29297053f29bfa5df0272461f17db406e43b3c2d0522dc7"} Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.974961 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:44:41 crc kubenswrapper[4707]: I0121 15:44:41.992289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" event={"ID":"708db58a-2cd1-4735-8c38-16987a96a22a","Type":"ContainerStarted","Data":"e478f869783c247cc1b797cbe5a214c97fb7abf9d51bbfbf94c86ca0d70dcd18"} Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.009857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" event={"ID":"02253407-57dc-4a23-8c81-e9464a8faa92","Type":"ContainerStarted","Data":"abcb9446e24e3d2a5fd5954f7a378f3fa3066ada3058e6735bdfc6457ab43ed0"} Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.022413 4707 scope.go:117] "RemoveContainer" containerID="fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.023061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" event={"ID":"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2","Type":"ContainerStarted","Data":"dd0194ca21d3dc66f95ac783627a007729a3c9254b80d396c98825574b296084"} Jan 21 15:44:42 crc kubenswrapper[4707]: E0121 15:44:42.023120 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448\": container with ID starting with fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448 not found: ID does not exist" containerID="fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.023145 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448"} err="failed to get container status \"fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448\": rpc error: code = NotFound desc = could not find container \"fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448\": container with ID starting with fa9d245b1a76dd108b620663a2a1dfa941b59fad89c2ae4a6d7a56169b34d448 not found: ID does not exist" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.023160 4707 scope.go:117] "RemoveContainer" containerID="958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad" Jan 21 15:44:42 crc kubenswrapper[4707]: E0121 15:44:42.023636 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad\": container with ID starting with 958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad not found: ID does not exist" 
containerID="958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.023654 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad"} err="failed to get container status \"958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad\": rpc error: code = NotFound desc = could not find container \"958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad\": container with ID starting with 958d40f27b15736048f71bda9f79c726eebb869701d521331628a8743245a7ad not found: ID does not exist" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.023666 4707 scope.go:117] "RemoveContainer" containerID="6ab4155a7c3881ee300c8f40a21c7af6d838cd85e5003eb8d0024abf5e2ad845" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.027796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" event={"ID":"a15a41db-259f-4646-b02e-18cb3c2beab7","Type":"ContainerStarted","Data":"800233753d63665862f1355f8b7902c90e9c0fa2edad3f58ea406607c2220d13"} Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.029787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" event={"ID":"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c","Type":"ContainerStarted","Data":"5920e730b25bc38ec106cc97108743826c23555b3d6fc30e3296027f9b23b9da"} Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.033257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" event={"ID":"1c800162-c9be-4c26-848e-548c5ade1645","Type":"ContainerStarted","Data":"af2df74e23c68b533ce2ad604e55e3b6a44522d5ac5962c11fede1b902ed12e8"} Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.041105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" event={"ID":"14b133c7-53cb-4cb8-9b04-66973d5ee9a0","Type":"ContainerStarted","Data":"ecbc41a1e39a7c8f5268d6f817fdb37138f92349b7057ab16b853b1bb91ffbd2"} Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.060348 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" podStartSLOduration=4.06033795 podStartE2EDuration="4.06033795s" podCreationTimestamp="2026-01-21 15:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:42.056857156 +0000 UTC m=+2579.238373379" watchObservedRunningTime="2026-01-21 15:44:42.06033795 +0000 UTC m=+2579.241854172" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.078329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-combined-ca-bundle\") pod \"a9ba2c35-3284-4234-b661-254a5b038cf8\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.078459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-config-data\") pod \"a9ba2c35-3284-4234-b661-254a5b038cf8\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.078514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-4j8pj\" (UniqueName: \"kubernetes.io/projected/a9ba2c35-3284-4234-b661-254a5b038cf8-kube-api-access-4j8pj\") pod \"a9ba2c35-3284-4234-b661-254a5b038cf8\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.078569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-vencrypt-tls-certs\") pod \"a9ba2c35-3284-4234-b661-254a5b038cf8\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.078594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-nova-novncproxy-tls-certs\") pod \"a9ba2c35-3284-4234-b661-254a5b038cf8\" (UID: \"a9ba2c35-3284-4234-b661-254a5b038cf8\") " Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.093311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ba2c35-3284-4234-b661-254a5b038cf8-kube-api-access-4j8pj" (OuterVolumeSpecName: "kube-api-access-4j8pj") pod "a9ba2c35-3284-4234-b661-254a5b038cf8" (UID: "a9ba2c35-3284-4234-b661-254a5b038cf8"). InnerVolumeSpecName "kube-api-access-4j8pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:42 crc kubenswrapper[4707]: W0121 15:44:42.101176 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57114736_e4da_48ba_8f64_f382267ede2e.slice/crio-0529d7b6cc5864226f3fb3e7dcfc4a31355bf0c22f544c8ba5fec2c6f1c19652 WatchSource:0}: Error finding container 0529d7b6cc5864226f3fb3e7dcfc4a31355bf0c22f544c8ba5fec2c6f1c19652: Status 404 returned error can't find the container with id 0529d7b6cc5864226f3fb3e7dcfc4a31355bf0c22f544c8ba5fec2c6f1c19652 Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.129142 4707 scope.go:117] "RemoveContainer" containerID="95551b395d0de6386810a37afb11f5ca53b7a8c1ccf730edcd06de3010c53ce6" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.130058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9ba2c35-3284-4234-b661-254a5b038cf8" (UID: "a9ba2c35-3284-4234-b661-254a5b038cf8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.141254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "a9ba2c35-3284-4234-b661-254a5b038cf8" (UID: "a9ba2c35-3284-4234-b661-254a5b038cf8"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.156994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-config-data" (OuterVolumeSpecName: "config-data") pod "a9ba2c35-3284-4234-b661-254a5b038cf8" (UID: "a9ba2c35-3284-4234-b661-254a5b038cf8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.186114 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.186137 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j8pj\" (UniqueName: \"kubernetes.io/projected/a9ba2c35-3284-4234-b661-254a5b038cf8-kube-api-access-4j8pj\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.186147 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.186156 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.196964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "a9ba2c35-3284-4234-b661-254a5b038cf8" (UID: "a9ba2c35-3284-4234-b661-254a5b038cf8"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.278486 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.288521 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ba2c35-3284-4234-b661-254a5b038cf8-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.298286 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.352766 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:44:42 crc kubenswrapper[4707]: E0121 15:44:42.353202 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerName="galera" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.353215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerName="galera" Jan 21 15:44:42 crc kubenswrapper[4707]: E0121 15:44:42.353233 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerName="mysql-bootstrap" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.353239 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerName="mysql-bootstrap" Jan 21 15:44:42 crc kubenswrapper[4707]: E0121 15:44:42.353265 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ba2c35-3284-4234-b661-254a5b038cf8" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.353278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ba2c35-3284-4234-b661-254a5b038cf8" 
containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.353447 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ba2c35-3284-4234-b661-254a5b038cf8" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.353461 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" containerName="galera" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.354449 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.359385 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.359671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.359937 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.360006 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.360341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-w7frc" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.404615 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9kff\" (UniqueName: \"kubernetes.io/projected/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kube-api-access-s9kff\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492567 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.492674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.529564 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.529823 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="e5aeb6d2-f710-4490-9929-ac42ed945b2a" containerName="kube-state-metrics" containerID="cri-o://7fd0a096985f6e5334dfd12d0e94f5b59badf344ad4373aecceb6bac8d6b3dd4" gracePeriod=30 Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9kff\" (UniqueName: \"kubernetes.io/projected/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kube-api-access-s9kff\") pod \"openstack-galera-0\" (UID: 
\"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594328 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.594395 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.595455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.595452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.595720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.595733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 
crc kubenswrapper[4707]: I0121 15:44:42.599242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.599486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.611324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9kff\" (UniqueName: \"kubernetes.io/projected/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kube-api-access-s9kff\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.773930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:42 crc kubenswrapper[4707]: I0121 15:44:42.787783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.089942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" event={"ID":"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2","Type":"ContainerStarted","Data":"8df853fe17904e91c15bf9f58ea86cb238d2bb1b6b761b5fe7751f6725ae781f"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.117164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" event={"ID":"708db58a-2cd1-4735-8c38-16987a96a22a","Type":"ContainerStarted","Data":"74275cce571896970dcd652c2e437e0b5e956b6bdecd5677a5c8242f96d9d70a"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.129725 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" podStartSLOduration=4.129709969 podStartE2EDuration="4.129709969s" podCreationTimestamp="2026-01-21 15:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.127897301 +0000 UTC m=+2580.309413523" watchObservedRunningTime="2026-01-21 15:44:43.129709969 +0000 UTC m=+2580.311226191" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.141071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d","Type":"ContainerStarted","Data":"65770a8fa3c56a256295d751a70837c7e5a69727abf0f918cb18f1afc358384c"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.141107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d","Type":"ContainerStarted","Data":"855fa048eb548158b696c447ecc83d5902e298c6056a0f55d034166e85b2ba72"} Jan 21 
15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.164988 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" podStartSLOduration=4.164975096 podStartE2EDuration="4.164975096s" podCreationTimestamp="2026-01-21 15:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.158046008 +0000 UTC m=+2580.339562230" watchObservedRunningTime="2026-01-21 15:44:43.164975096 +0000 UTC m=+2580.346491319" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.175157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerStarted","Data":"d6475b79105f9cfd74ca83db95f5e1d47d343e091c6d39cd10b2a266308a771f"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.175392 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-central-agent" containerID="cri-o://4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080" gracePeriod=30 Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.175692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.176080 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="proxy-httpd" containerID="cri-o://d6475b79105f9cfd74ca83db95f5e1d47d343e091c6d39cd10b2a266308a771f" gracePeriod=30 Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.176125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-notification-agent" containerID="cri-o://39829273d81c5e6b6f5743fe52316bf63d995fb9c51456d0c3691bf5a73f5941" gracePeriod=30 Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.176180 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="sg-core" containerID="cri-o://80c0c3690090f96b43c9dd8217c42cf20d45c7e81cdd539d31f8dfdd07cb3ec4" gracePeriod=30 Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.197327 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5aeb6d2-f710-4490-9929-ac42ed945b2a" containerID="7fd0a096985f6e5334dfd12d0e94f5b59badf344ad4373aecceb6bac8d6b3dd4" exitCode=2 Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.211409 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.747634426 podStartE2EDuration="7.211389637s" podCreationTimestamp="2026-01-21 15:44:36 +0000 UTC" firstStartedPulling="2026-01-21 15:44:37.005560804 +0000 UTC m=+2574.187077026" lastFinishedPulling="2026-01-21 15:44:41.469316015 +0000 UTC m=+2578.650832237" observedRunningTime="2026-01-21 15:44:43.201597455 +0000 UTC m=+2580.383113676" watchObservedRunningTime="2026-01-21 15:44:43.211389637 +0000 UTC m=+2580.392905859" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.237961 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="89b706e2-54e8-4a85-b263-cbccec9d93bd" path="/var/lib/kubelet/pods/89b706e2-54e8-4a85-b263-cbccec9d93bd/volumes" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.298320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"e5aeb6d2-f710-4490-9929-ac42ed945b2a","Type":"ContainerDied","Data":"7fd0a096985f6e5334dfd12d0e94f5b59badf344ad4373aecceb6bac8d6b3dd4"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.298477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" event={"ID":"a15a41db-259f-4646-b02e-18cb3c2beab7","Type":"ContainerStarted","Data":"566d2c79913ac7586f200da2fc0d89fa102371ace8458c1a066117901148fa60"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.298544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" event={"ID":"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c","Type":"ContainerStarted","Data":"6d5389c6099f34042afa120959ddb534a1021ab43120ae8f9998af0e26ea03c0"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.298601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" event={"ID":"14b133c7-53cb-4cb8-9b04-66973d5ee9a0","Type":"ContainerStarted","Data":"c10e79abafe959b74a9a32a523621e9d6baab7114207d3edd4ec7a890bd33ad2"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.298654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" event={"ID":"ef17ce01-e6d2-4028-8e44-e794559c1608","Type":"ContainerStarted","Data":"ec9bc512b0547b6776ea1f5e8282c90dc55ec62ca4731917e733976641cf9ab8"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.298710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" event={"ID":"ef17ce01-e6d2-4028-8e44-e794559c1608","Type":"ContainerStarted","Data":"ea9f4d8ca49b11025809d9bdaaa21033f7d17baa7c3d9c78208d4b0874d27d36"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.336440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"57114736-e4da-48ba-8f64-f382267ede2e","Type":"ContainerStarted","Data":"27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.336483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.336495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"57114736-e4da-48ba-8f64-f382267ede2e","Type":"ContainerStarted","Data":"0529d7b6cc5864226f3fb3e7dcfc4a31355bf0c22f544c8ba5fec2c6f1c19652"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.355696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" event={"ID":"02253407-57dc-4a23-8c81-e9464a8faa92","Type":"ContainerStarted","Data":"be1b9dd35378efeb49e3e0750d3c9b65ebd4b0e6153a53a3a5ccf28db9d738e2"} Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.356391 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.442438 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:43 crc kubenswrapper[4707]: E0121 15:44:43.492530 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:44:43 crc kubenswrapper[4707]: E0121 15:44:43.493504 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:44:43 crc kubenswrapper[4707]: E0121 15:44:43.504238 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:44:43 crc kubenswrapper[4707]: E0121 15:44:43.504309 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="ovn-northd" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.517973 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" podStartSLOduration=5.517957629 podStartE2EDuration="5.517957629s" podCreationTimestamp="2026-01-21 15:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.497578021 +0000 UTC m=+2580.679094243" watchObservedRunningTime="2026-01-21 15:44:43.517957629 +0000 UTC m=+2580.699473851" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.540201 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=3.5382175719999998 podStartE2EDuration="3.538217572s" podCreationTimestamp="2026-01-21 15:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.515073286 +0000 UTC m=+2580.696589508" watchObservedRunningTime="2026-01-21 15:44:43.538217572 +0000 UTC m=+2580.719733794" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.547688 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" podStartSLOduration=4.547680586 podStartE2EDuration="4.547680586s" podCreationTimestamp="2026-01-21 15:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.538095393 +0000 UTC m=+2580.719611614" watchObservedRunningTime="2026-01-21 15:44:43.547680586 +0000 UTC m=+2580.729196808" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.563570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-combined-ca-bundle\") pod \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.563635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-certs\") pod \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.563794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfg52\" (UniqueName: \"kubernetes.io/projected/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-api-access-rfg52\") pod \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.563892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-config\") pod \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\" (UID: \"e5aeb6d2-f710-4490-9929-ac42ed945b2a\") " Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.569214 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.571175 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" podStartSLOduration=5.571155324 podStartE2EDuration="5.571155324s" podCreationTimestamp="2026-01-21 15:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.556164687 +0000 UTC m=+2580.737680909" watchObservedRunningTime="2026-01-21 15:44:43.571155324 +0000 UTC m=+2580.752671546" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.575986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-api-access-rfg52" (OuterVolumeSpecName: "kube-api-access-rfg52") pod "e5aeb6d2-f710-4490-9929-ac42ed945b2a" (UID: "e5aeb6d2-f710-4490-9929-ac42ed945b2a"). InnerVolumeSpecName "kube-api-access-rfg52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.626894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5aeb6d2-f710-4490-9929-ac42ed945b2a" (UID: "e5aeb6d2-f710-4490-9929-ac42ed945b2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.639979 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="rabbitmq" containerID="cri-o://614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3" gracePeriod=604795 Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.672612 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfg52\" (UniqueName: \"kubernetes.io/projected/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-api-access-rfg52\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.672660 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.673646 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" podStartSLOduration=5.673627485 podStartE2EDuration="5.673627485s" podCreationTimestamp="2026-01-21 15:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.673146742 +0000 UTC m=+2580.854662964" watchObservedRunningTime="2026-01-21 15:44:43.673627485 +0000 UTC m=+2580.855143708" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.675413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "e5aeb6d2-f710-4490-9929-ac42ed945b2a" (UID: "e5aeb6d2-f710-4490-9929-ac42ed945b2a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.697903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "e5aeb6d2-f710-4490-9929-ac42ed945b2a" (UID: "e5aeb6d2-f710-4490-9929-ac42ed945b2a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.760703 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" podStartSLOduration=4.760683443 podStartE2EDuration="4.760683443s" podCreationTimestamp="2026-01-21 15:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:43.697059694 +0000 UTC m=+2580.878575916" watchObservedRunningTime="2026-01-21 15:44:43.760683443 +0000 UTC m=+2580.942199665" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.775346 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.775377 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e5aeb6d2-f710-4490-9929-ac42ed945b2a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:43 crc kubenswrapper[4707]: E0121 15:44:43.839176 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7456f87d_fbee_4a79_8e96_9af0ea500ea6.slice/crio-conmon-d6475b79105f9cfd74ca83db95f5e1d47d343e091c6d39cd10b2a266308a771f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7456f87d_fbee_4a79_8e96_9af0ea500ea6.slice/crio-conmon-4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7456f87d_fbee_4a79_8e96_9af0ea500ea6.slice/crio-4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.844842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.849859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.857873 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:44:43 crc kubenswrapper[4707]: E0121 15:44:43.858231 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5aeb6d2-f710-4490-9929-ac42ed945b2a" containerName="kube-state-metrics" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.858243 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5aeb6d2-f710-4490-9929-ac42ed945b2a" containerName="kube-state-metrics" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.858437 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5aeb6d2-f710-4490-9929-ac42ed945b2a" containerName="kube-state-metrics" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.858992 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.860381 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.866189 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.866303 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.866504 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.980919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mv9\" (UniqueName: \"kubernetes.io/projected/62555ab6-9268-4248-977e-98269dc9483d-kube-api-access-d6mv9\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.980973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.981036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.981085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:43 crc kubenswrapper[4707]: I0121 15:44:43.981101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.082713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mv9\" (UniqueName: \"kubernetes.io/projected/62555ab6-9268-4248-977e-98269dc9483d-kube-api-access-d6mv9\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.082776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.082849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.082908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.082928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.092433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.093346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.094280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.101699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.106242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mv9\" (UniqueName: \"kubernetes.io/projected/62555ab6-9268-4248-977e-98269dc9483d-kube-api-access-d6mv9\") pod \"nova-cell1-novncproxy-0\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.184679 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409596 4707 generic.go:334] "Generic (PLEG): container finished" podID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerID="d6475b79105f9cfd74ca83db95f5e1d47d343e091c6d39cd10b2a266308a771f" exitCode=0 Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409628 4707 generic.go:334] "Generic (PLEG): container finished" podID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerID="80c0c3690090f96b43c9dd8217c42cf20d45c7e81cdd539d31f8dfdd07cb3ec4" exitCode=2 Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409636 4707 generic.go:334] "Generic (PLEG): container finished" podID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerID="39829273d81c5e6b6f5743fe52316bf63d995fb9c51456d0c3691bf5a73f5941" exitCode=0 Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409642 4707 generic.go:334] "Generic (PLEG): container finished" podID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerID="4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080" exitCode=0 Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerDied","Data":"d6475b79105f9cfd74ca83db95f5e1d47d343e091c6d39cd10b2a266308a771f"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerDied","Data":"80c0c3690090f96b43c9dd8217c42cf20d45c7e81cdd539d31f8dfdd07cb3ec4"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerDied","Data":"39829273d81c5e6b6f5743fe52316bf63d995fb9c51456d0c3691bf5a73f5941"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.409726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerDied","Data":"4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.424404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"e5aeb6d2-f710-4490-9929-ac42ed945b2a","Type":"ContainerDied","Data":"d6f01eaa5215e6cb6ee1567a17f32e68f211cf83f30f80dca7c3ecf26f893905"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.424451 4707 scope.go:117] "RemoveContainer" containerID="7fd0a096985f6e5334dfd12d0e94f5b59badf344ad4373aecceb6bac8d6b3dd4" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.424580 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.439330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8","Type":"ContainerStarted","Data":"5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.439366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8","Type":"ContainerStarted","Data":"f738b791d93b86b1f9feb0f63d6e2b3318cac1f956f23c39bd01c4e1622269cb"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.454308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d","Type":"ContainerStarted","Data":"0cccd4727b8e47536048d4c6788228ae784cd8a9a57fa878409e98baade244f8"} Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.524567 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=5.524552721 podStartE2EDuration="5.524552721s" podCreationTimestamp="2026-01-21 15:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:44.51812673 +0000 UTC m=+2581.699642951" watchObservedRunningTime="2026-01-21 15:44:44.524552721 +0000 UTC m=+2581.706068944" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.564373 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.567633 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.573638 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-sg-core-conf-yaml\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5pfn\" (UniqueName: \"kubernetes.io/projected/7456f87d-fbee-4a79-8e96-9af0ea500ea6-kube-api-access-b5pfn\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-log-httpd\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-config-data\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-ceilometer-tls-certs\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-scripts\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-combined-ca-bundle\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.599911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-run-httpd\") pod \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\" (UID: \"7456f87d-fbee-4a79-8e96-9af0ea500ea6\") " Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.603916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.609776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.616870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:44:44 crc kubenswrapper[4707]: E0121 15:44:44.617300 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-central-agent" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617315 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-central-agent" Jan 21 15:44:44 crc kubenswrapper[4707]: E0121 15:44:44.617339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="proxy-httpd" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617345 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="proxy-httpd" Jan 21 15:44:44 crc kubenswrapper[4707]: E0121 15:44:44.617361 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="sg-core" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617367 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="sg-core" Jan 21 15:44:44 crc kubenswrapper[4707]: E0121 15:44:44.617383 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-notification-agent" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617390 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-notification-agent" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-central-agent" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617576 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="ceilometer-notification-agent" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617587 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="proxy-httpd" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617599 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" containerName="sg-core" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.617867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-scripts" (OuterVolumeSpecName: "scripts") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.618226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.622727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.622897 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.636801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7456f87d-fbee-4a79-8e96-9af0ea500ea6-kube-api-access-b5pfn" (OuterVolumeSpecName: "kube-api-access-b5pfn") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "kube-api-access-b5pfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.651027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.668175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.702334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.702400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.702452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.702581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfzv2\" (UniqueName: \"kubernetes.io/projected/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-api-access-zfzv2\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.703412 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5pfn\" (UniqueName: 
\"kubernetes.io/projected/7456f87d-fbee-4a79-8e96-9af0ea500ea6-kube-api-access-b5pfn\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.703427 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.703438 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.703448 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7456f87d-fbee-4a79-8e96-9af0ea500ea6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.703457 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.716623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.730359 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="rabbitmq" containerID="cri-o://794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514" gracePeriod=604794 Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.730581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.763473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-config-data" (OuterVolumeSpecName: "config-data") pod "7456f87d-fbee-4a79-8e96-9af0ea500ea6" (UID: "7456f87d-fbee-4a79-8e96-9af0ea500ea6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfzv2\" (UniqueName: \"kubernetes.io/projected/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-api-access-zfzv2\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805797 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805822 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.805833 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7456f87d-fbee-4a79-8e96-9af0ea500ea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.811944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.812224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.813800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-combined-ca-bundle\") pod 
\"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.824472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfzv2\" (UniqueName: \"kubernetes.io/projected/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-api-access-zfzv2\") pod \"kube-state-metrics-0\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.835945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:44:44 crc kubenswrapper[4707]: I0121 15:44:44.947458 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.193178 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:44:45 crc kubenswrapper[4707]: E0121 15:44:45.193627 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.233533 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ba2c35-3284-4234-b661-254a5b038cf8" path="/var/lib/kubelet/pods/a9ba2c35-3284-4234-b661-254a5b038cf8/volumes" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.234614 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5aeb6d2-f710-4490-9929-ac42ed945b2a" path="/var/lib/kubelet/pods/e5aeb6d2-f710-4490-9929-ac42ed945b2a/volumes" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.406429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.445582 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.445671 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.446632 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f"} pod="openstack-kuttl-tests/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.446683 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" containerID="cri-o://02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f" gracePeriod=30 Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 
15:44:45.476153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7456f87d-fbee-4a79-8e96-9af0ea500ea6","Type":"ContainerDied","Data":"e214d15197898ba636f0d9dacf4142998241d84774549d4e807e5bdac6593c83"} Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.476176 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.476461 4707 scope.go:117] "RemoveContainer" containerID="d6475b79105f9cfd74ca83db95f5e1d47d343e091c6d39cd10b2a266308a771f" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.485047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62555ab6-9268-4248-977e-98269dc9483d","Type":"ContainerStarted","Data":"a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea"} Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.485092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62555ab6-9268-4248-977e-98269dc9483d","Type":"ContainerStarted","Data":"43c7a1cb62672dabed09c98f9e772b07aa1e16158852aa339d07f565b1f0f33b"} Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.491474 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_2b1fde7b-722e-48c8-aefa-7baa7c4ebf81/ovn-northd/0.log" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.491519 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" exitCode=139 Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.491761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81","Type":"ContainerDied","Data":"19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce"} Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.502050 4707 scope.go:117] "RemoveContainer" containerID="80c0c3690090f96b43c9dd8217c42cf20d45c7e81cdd539d31f8dfdd07cb3ec4" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.513190 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.524907 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.535285 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.535494 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.535478376 podStartE2EDuration="2.535478376s" podCreationTimestamp="2026-01-21 15:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:45.519900977 +0000 UTC m=+2582.701417199" watchObservedRunningTime="2026-01-21 15:44:45.535478376 +0000 UTC m=+2582.716994599" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.551106 4707 scope.go:117] "RemoveContainer" containerID="39829273d81c5e6b6f5743fe52316bf63d995fb9c51456d0c3691bf5a73f5941" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.554206 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.558345 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.558491 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.559160 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.573771 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.586962 4707 scope.go:117] "RemoveContainer" containerID="4846703d6c3135187ab80dfe5f24312d122ad935f663be43c9d4f6a2f0be4080" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.602503 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:44:45 crc kubenswrapper[4707]: W0121 15:44:45.624678 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0837d9e7_fb96_43b0_a5fb_5eca08e411a3.slice/crio-fcaeeee77530d0ed58ce1b12a27c42487ac062906587b999e64c35d5fba378ee WatchSource:0}: Error finding container fcaeeee77530d0ed58ce1b12a27c42487ac062906587b999e64c35d5fba378ee: Status 404 returned error can't find the container with id fcaeeee77530d0ed58ce1b12a27c42487ac062906587b999e64c35d5fba378ee Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-scripts\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-run-httpd\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbtt\" (UniqueName: \"kubernetes.io/projected/138d6af4-a6f1-406d-a87d-0c21ff9a844a-kube-api-access-wpbtt\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.637339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-config-data\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.688377 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_2b1fde7b-722e-48c8-aefa-7baa7c4ebf81/ovn-northd/0.log" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.688512 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-rundir\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-config\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-combined-ca-bundle\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-northd-tls-certs\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-metrics-certs-tls-certs\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738797 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5sgzk\" (UniqueName: \"kubernetes.io/projected/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-kube-api-access-5sgzk\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.738840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-scripts\") pod \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\" (UID: \"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81\") " Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-scripts\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-run-httpd\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-log-httpd\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbtt\" (UniqueName: \"kubernetes.io/projected/138d6af4-a6f1-406d-a87d-0c21ff9a844a-kube-api-access-wpbtt\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.739249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-config-data\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 
21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.740237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-run-httpd\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.740235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-scripts" (OuterVolumeSpecName: "scripts") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.740475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.740739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-config" (OuterVolumeSpecName: "config") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.740983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-log-httpd\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.750237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.757458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.762418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-scripts\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.762678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-config-data\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.763295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-kube-api-access-5sgzk" (OuterVolumeSpecName: "kube-api-access-5sgzk") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "kube-api-access-5sgzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.765801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbtt\" (UniqueName: \"kubernetes.io/projected/138d6af4-a6f1-406d-a87d-0c21ff9a844a-kube-api-access-wpbtt\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.776473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.787432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.828286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.832312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" (UID: "2b1fde7b-722e-48c8-aefa-7baa7c4ebf81"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841604 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841635 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841645 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841654 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841664 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841673 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sgzk\" (UniqueName: \"kubernetes.io/projected/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-kube-api-access-5sgzk\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.841681 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:45 crc kubenswrapper[4707]: I0121 15:44:45.869130 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.294567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:46 crc kubenswrapper[4707]: W0121 15:44:46.321446 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod138d6af4_a6f1_406d_a87d_0c21ff9a844a.slice/crio-1de3935c5f0923ae84652be8e234eec3ed5b8ba3117dd53a63c802808486e01f WatchSource:0}: Error finding container 1de3935c5f0923ae84652be8e234eec3ed5b8ba3117dd53a63c802808486e01f: Status 404 returned error can't find the container with id 1de3935c5f0923ae84652be8e234eec3ed5b8ba3117dd53a63c802808486e01f Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.406620 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.439256 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.506181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerStarted","Data":"1de3935c5f0923ae84652be8e234eec3ed5b8ba3117dd53a63c802808486e01f"} Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.509353 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerID="5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6" exitCode=0 Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.509444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8","Type":"ContainerDied","Data":"5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6"} Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.515619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0837d9e7-fb96-43b0-a5fb-5eca08e411a3","Type":"ContainerStarted","Data":"c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0"} Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.515661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0837d9e7-fb96-43b0-a5fb-5eca08e411a3","Type":"ContainerStarted","Data":"fcaeeee77530d0ed58ce1b12a27c42487ac062906587b999e64c35d5fba378ee"} Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.515743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.518153 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_2b1fde7b-722e-48c8-aefa-7baa7c4ebf81/ovn-northd/0.log" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.518220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"2b1fde7b-722e-48c8-aefa-7baa7c4ebf81","Type":"ContainerDied","Data":"6cf3f1ee49384e42b5fcc6e4eaa6fbac183ea7886485676916b84f1d1e3639c4"} Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.518256 4707 scope.go:117] "RemoveContainer" containerID="63a8221ce80bea60ab8157bbbd92a930360d931306dcb9d132b2e61859dc44ae" Jan 21 15:44:46 
crc kubenswrapper[4707]: I0121 15:44:46.518309 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.568852 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.2600674290000002 podStartE2EDuration="2.568833334s" podCreationTimestamp="2026-01-21 15:44:44 +0000 UTC" firstStartedPulling="2026-01-21 15:44:45.628003717 +0000 UTC m=+2582.809519939" lastFinishedPulling="2026-01-21 15:44:45.936769622 +0000 UTC m=+2583.118285844" observedRunningTime="2026-01-21 15:44:46.56039048 +0000 UTC m=+2583.741906701" watchObservedRunningTime="2026-01-21 15:44:46.568833334 +0000 UTC m=+2583.750349557" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.593819 4707 scope.go:117] "RemoveContainer" containerID="19fc86521679d0a59f7cab3d1fa5049e66453c6f9946ce99b1bc740362e025ce" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.616313 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.621142 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.631931 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:44:46 crc kubenswrapper[4707]: E0121 15:44:46.632374 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="ovn-northd" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.632395 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="ovn-northd" Jan 21 15:44:46 crc kubenswrapper[4707]: E0121 15:44:46.632412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="openstack-network-exporter" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.632418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="openstack-network-exporter" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.632899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="ovn-northd" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.632941 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" containerName="openstack-network-exporter" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.633820 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.633838 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.634420 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.636580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.637739 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-zm2wj" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.637848 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.637955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.639019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.654786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.654901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.654931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mszz\" (UniqueName: \"kubernetes.io/projected/67eda4bf-2ebc-4616-bd19-55afc23963d2-kube-api-access-6mszz\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.655062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-scripts\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.655127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.655184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-config\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.655285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mszz\" (UniqueName: \"kubernetes.io/projected/67eda4bf-2ebc-4616-bd19-55afc23963d2-kube-api-access-6mszz\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-scripts\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-config\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.758764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.759370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.759786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-config\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.759799 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-scripts\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.763599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.764226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.764632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.783784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mszz\" (UniqueName: \"kubernetes.io/projected/67eda4bf-2ebc-4616-bd19-55afc23963d2-kube-api-access-6mszz\") pod \"ovn-northd-0\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:46 crc kubenswrapper[4707]: I0121 15:44:46.962300 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.194825 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1fde7b-722e-48c8-aefa-7baa7c4ebf81" path="/var/lib/kubelet/pods/2b1fde7b-722e-48c8-aefa-7baa7c4ebf81/volumes" Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.196308 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7456f87d-fbee-4a79-8e96-9af0ea500ea6" path="/var/lib/kubelet/pods/7456f87d-fbee-4a79-8e96-9af0ea500ea6/volumes" Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.354570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:44:47 crc kubenswrapper[4707]: W0121 15:44:47.356854 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67eda4bf_2ebc_4616_bd19_55afc23963d2.slice/crio-7e0b984353eafbee1e07343a922efd0a4c840c906f3cf126666789d66e034b21 WatchSource:0}: Error finding container 7e0b984353eafbee1e07343a922efd0a4c840c906f3cf126666789d66e034b21: Status 404 returned error can't find the container with id 7e0b984353eafbee1e07343a922efd0a4c840c906f3cf126666789d66e034b21 Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.529892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerStarted","Data":"6e8c18ac54722bf62f459ec4d8f2a549343b964d81d3ff4c14b5d530d9b51599"} Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.532979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8","Type":"ContainerStarted","Data":"71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0"} Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.538252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"67eda4bf-2ebc-4616-bd19-55afc23963d2","Type":"ContainerStarted","Data":"7e0b984353eafbee1e07343a922efd0a4c840c906f3cf126666789d66e034b21"} Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.560789 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.560780025 podStartE2EDuration="5.560780025s" podCreationTimestamp="2026-01-21 15:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:47.554460533 +0000 UTC m=+2584.735976754" watchObservedRunningTime="2026-01-21 15:44:47.560780025 +0000 UTC m=+2584.742296247" Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.591683 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.664256 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:47 crc kubenswrapper[4707]: I0121 15:44:47.664340 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" 
containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.547409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"67eda4bf-2ebc-4616-bd19-55afc23963d2","Type":"ContainerStarted","Data":"2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1"} Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.547926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"67eda4bf-2ebc-4616-bd19-55afc23963d2","Type":"ContainerStarted","Data":"907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c"} Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.547943 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.550419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerStarted","Data":"a470de6cfb38a8ead232404cfdbf26b10b5fa25e8cca4459264a1a27873ad473"} Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.550480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerStarted","Data":"d6e6e39f9397f4e8110c7a9e51297146f66b294a241e04080bf2b57e5ac33106"} Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.551656 4707 generic.go:334] "Generic (PLEG): container finished" podID="a15a41db-259f-4646-b02e-18cb3c2beab7" containerID="566d2c79913ac7586f200da2fc0d89fa102371ace8458c1a066117901148fa60" exitCode=0 Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.551734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" event={"ID":"a15a41db-259f-4646-b02e-18cb3c2beab7","Type":"ContainerDied","Data":"566d2c79913ac7586f200da2fc0d89fa102371ace8458c1a066117901148fa60"} Jan 21 15:44:48 crc kubenswrapper[4707]: I0121 15:44:48.568553 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.56853619 podStartE2EDuration="2.56853619s" podCreationTimestamp="2026-01-21 15:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:48.566222941 +0000 UTC m=+2585.747739163" watchObservedRunningTime="2026-01-21 15:44:48.56853619 +0000 UTC m=+2585.750052412" Jan 21 15:44:49 crc kubenswrapper[4707]: I0121 15:44:49.202592 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:49 crc kubenswrapper[4707]: I0121 15:44:49.919265 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.216:5671: connect: connection refused" Jan 21 15:44:49 crc kubenswrapper[4707]: I0121 15:44:49.927057 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jtlgf"] Jan 21 15:44:49 crc kubenswrapper[4707]: I0121 15:44:49.928686 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:49 crc kubenswrapper[4707]: I0121 15:44:49.961074 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtlgf"] Jan 21 15:44:49 crc kubenswrapper[4707]: I0121 15:44:49.996711 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.030735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-combined-ca-bundle\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.030967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqh8r\" (UniqueName: \"kubernetes.io/projected/a15a41db-259f-4646-b02e-18cb3c2beab7-kube-api-access-dqh8r\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.031061 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-swiftconf\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.031359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a15a41db-259f-4646-b02e-18cb3c2beab7-etc-swift\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.031435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-ring-data-devices\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.031549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-scripts\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.031615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-dispersionconf\") pod \"a15a41db-259f-4646-b02e-18cb3c2beab7\" (UID: \"a15a41db-259f-4646-b02e-18cb3c2beab7\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.032194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15a41db-259f-4646-b02e-18cb3c2beab7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.032763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-catalog-content\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.032804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.032831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprhk\" (UniqueName: \"kubernetes.io/projected/448446bb-e187-45a1-a751-63dbb87dffa2-kube-api-access-dprhk\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.032940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-utilities\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.033107 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a15a41db-259f-4646-b02e-18cb3c2beab7-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.033124 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.046967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15a41db-259f-4646-b02e-18cb3c2beab7-kube-api-access-dqh8r" (OuterVolumeSpecName: "kube-api-access-dqh8r") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). InnerVolumeSpecName "kube-api-access-dqh8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.062167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.069844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.069933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-scripts" (OuterVolumeSpecName: "scripts") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.070420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a15a41db-259f-4646-b02e-18cb3c2beab7" (UID: "a15a41db-259f-4646-b02e-18cb3c2beab7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.134983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-utilities\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-catalog-content\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dprhk\" (UniqueName: \"kubernetes.io/projected/448446bb-e187-45a1-a751-63dbb87dffa2-kube-api-access-dprhk\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135190 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135202 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a15a41db-259f-4646-b02e-18cb3c2beab7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135211 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135220 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15a41db-259f-4646-b02e-18cb3c2beab7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135229 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqh8r\" (UniqueName: \"kubernetes.io/projected/a15a41db-259f-4646-b02e-18cb3c2beab7-kube-api-access-dqh8r\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.135909 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-catalog-content\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.136415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-utilities\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.151562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprhk\" (UniqueName: \"kubernetes.io/projected/448446bb-e187-45a1-a751-63dbb87dffa2-kube-api-access-dprhk\") pod \"redhat-marketplace-jtlgf\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.210282 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-config-data\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de4cffce-959d-47e8-9217-f842b2ab8f65-erlang-cookie-secret\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-server-conf\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jm9\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-kube-api-access-j7jm9\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-tls\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.236974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/de4cffce-959d-47e8-9217-f842b2ab8f65-pod-info\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.237029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-confd\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.237101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-plugins-conf\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.237140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-erlang-cookie\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.237159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-plugins\") pod \"de4cffce-959d-47e8-9217-f842b2ab8f65\" (UID: \"de4cffce-959d-47e8-9217-f842b2ab8f65\") " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.243161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.243344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.243727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.251911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.252104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/de4cffce-959d-47e8-9217-f842b2ab8f65-pod-info" (OuterVolumeSpecName: "pod-info") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.252183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-kube-api-access-j7jm9" (OuterVolumeSpecName: "kube-api-access-j7jm9") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "kube-api-access-j7jm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.254800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4cffce-959d-47e8-9217-f842b2ab8f65-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.255415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "persistence") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.262585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-config-data" (OuterVolumeSpecName: "config-data") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.294259 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.319927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-server-conf" (OuterVolumeSpecName: "server-conf") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339717 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339739 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de4cffce-959d-47e8-9217-f842b2ab8f65-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339755 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339765 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jm9\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-kube-api-access-j7jm9\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339783 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339791 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339799 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de4cffce-959d-47e8-9217-f842b2ab8f65-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339826 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de4cffce-959d-47e8-9217-f842b2ab8f65-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339834 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.339842 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.360312 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.365760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "de4cffce-959d-47e8-9217-f842b2ab8f65" (UID: "de4cffce-959d-47e8-9217-f842b2ab8f65"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.444084 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.444428 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de4cffce-959d-47e8-9217-f842b2ab8f65-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.571152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerStarted","Data":"d7dcd8260bbe061d453670f8c78bc97ecce48dac7c741791036f5dda2b5640d0"} Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.571332 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.574174 4707 generic.go:334] "Generic (PLEG): container finished" podID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerID="614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3" exitCode=0 Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.574279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"de4cffce-959d-47e8-9217-f842b2ab8f65","Type":"ContainerDied","Data":"614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3"} Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.574301 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.574313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"de4cffce-959d-47e8-9217-f842b2ab8f65","Type":"ContainerDied","Data":"ca6d53cc93afc70dd89f52af1dc376e5b807b3336d8d9081ba5ffce7e65c90fd"} Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.574348 4707 scope.go:117] "RemoveContainer" containerID="614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.576018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" event={"ID":"a15a41db-259f-4646-b02e-18cb3c2beab7","Type":"ContainerDied","Data":"800233753d63665862f1355f8b7902c90e9c0fa2edad3f58ea406607c2220d13"} Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.576041 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800233753d63665862f1355f8b7902c90e9c0fa2edad3f58ea406607c2220d13" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.576084 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kdbn6" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.595452 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.519860656 podStartE2EDuration="5.595431154s" podCreationTimestamp="2026-01-21 15:44:45 +0000 UTC" firstStartedPulling="2026-01-21 15:44:46.325758745 +0000 UTC m=+2583.507274967" lastFinishedPulling="2026-01-21 15:44:49.401329243 +0000 UTC m=+2586.582845465" observedRunningTime="2026-01-21 15:44:50.586218001 +0000 UTC m=+2587.767734223" watchObservedRunningTime="2026-01-21 15:44:50.595431154 +0000 UTC m=+2587.776947377" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.606153 4707 scope.go:117] "RemoveContainer" containerID="91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.623863 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.632100 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.652707 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:44:50 crc kubenswrapper[4707]: E0121 15:44:50.653343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="setup-container" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.653370 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="setup-container" Jan 21 15:44:50 crc kubenswrapper[4707]: E0121 15:44:50.653412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="rabbitmq" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.653420 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="rabbitmq" Jan 21 15:44:50 crc kubenswrapper[4707]: E0121 15:44:50.653443 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15a41db-259f-4646-b02e-18cb3c2beab7" containerName="swift-ring-rebalance" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.653450 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15a41db-259f-4646-b02e-18cb3c2beab7" containerName="swift-ring-rebalance" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.653711 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="rabbitmq" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.653740 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15a41db-259f-4646-b02e-18cb3c2beab7" containerName="swift-ring-rebalance" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.655379 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.660350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.661070 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-rq4c9" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.661285 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.661456 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.661612 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.661740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.661909 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.665467 4707 scope.go:117] "RemoveContainer" containerID="614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.665901 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:44:50 crc kubenswrapper[4707]: E0121 15:44:50.667186 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3\": container with ID starting with 614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3 not found: ID does not exist" containerID="614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.667217 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3"} err="failed to get container status \"614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3\": rpc error: code = NotFound desc = could not find container \"614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3\": container with ID starting with 614279c041bc560ae1992f71cf5cbe0781ab9586e7e776df49dd625862906fb3 not found: ID does not exist" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.667241 4707 scope.go:117] "RemoveContainer" containerID="91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30" Jan 21 15:44:50 crc kubenswrapper[4707]: E0121 15:44:50.667631 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30\": container with ID starting with 91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30 not found: ID does not exist" containerID="91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.667659 4707 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30"} err="failed to get container status \"91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30\": rpc error: code = NotFound desc = could not find container \"91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30\": container with ID starting with 91dd8b38242967949df4a41463e058a1d59c991a291679a9bbc5675c3e064c30 not found: ID does not exist" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.751726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f341e188-43e6-4019-b4dd-b8ea727d2d3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.751767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nksf\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-kube-api-access-8nksf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.751818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.751876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.751899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.751916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.752008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.752032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.752054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.752077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.752266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f341e188-43e6-4019-b4dd-b8ea727d2d3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.783234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtlgf"] Jan 21 15:44:50 crc kubenswrapper[4707]: W0121 15:44:50.786216 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448446bb_e187_45a1_a751_63dbb87dffa2.slice/crio-e9db4b346dfa9e61545a500d1deac4a5e36544871d3964bd39a24168961a1d1d WatchSource:0}: Error finding container e9db4b346dfa9e61545a500d1deac4a5e36544871d3964bd39a24168961a1d1d: Status 404 returned error can't find the container with id e9db4b346dfa9e61545a500d1deac4a5e36544871d3964bd39a24168961a1d1d Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.854928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.855075 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f341e188-43e6-4019-b4dd-b8ea727d2d3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f341e188-43e6-4019-b4dd-b8ea727d2d3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nksf\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-kube-api-access-8nksf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.856630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.857030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.860958 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.860985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f341e188-43e6-4019-b4dd-b8ea727d2d3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.861020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.861702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f341e188-43e6-4019-b4dd-b8ea727d2d3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.875171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nksf\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-kube-api-access-8nksf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") 
" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:50 crc kubenswrapper[4707]: I0121 15:44:50.888046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.002577 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.115161 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.245852 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" path="/var/lib/kubelet/pods/de4cffce-959d-47e8-9217-f842b2ab8f65/volumes" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.277947 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.368697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-confd\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.368745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-tls\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.368857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-plugins\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.368945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-erlang-cookie\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2nr4\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-kube-api-access-r2nr4\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6f92106-4e6f-4fdd-b646-322ee996dc80-erlang-cookie-secret\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-plugins-conf\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6f92106-4e6f-4fdd-b646-322ee996dc80-pod-info\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-config-data\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-server-conf\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.369262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e6f92106-4e6f-4fdd-b646-322ee996dc80\" (UID: \"e6f92106-4e6f-4fdd-b646-322ee996dc80\") " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.372020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.374314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.376379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.377083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-kube-api-access-r2nr4" (OuterVolumeSpecName: "kube-api-access-r2nr4") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "kube-api-access-r2nr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.377867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.379995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.387402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e6f92106-4e6f-4fdd-b646-322ee996dc80-pod-info" (OuterVolumeSpecName: "pod-info") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.387989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f92106-4e6f-4fdd-b646-322ee996dc80-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.407752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-config-data" (OuterVolumeSpecName: "config-data") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.423352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-server-conf" (OuterVolumeSpecName: "server-conf") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.470989 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2nr4\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-kube-api-access-r2nr4\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471417 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6f92106-4e6f-4fdd-b646-322ee996dc80-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471495 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471569 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471621 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6f92106-4e6f-4fdd-b646-322ee996dc80-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471692 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6f92106-4e6f-4fdd-b646-322ee996dc80-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471793 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471884 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.471965 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.472048 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.499957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e6f92106-4e6f-4fdd-b646-322ee996dc80" (UID: "e6f92106-4e6f-4fdd-b646-322ee996dc80"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.508420 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.517314 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.576122 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.576163 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6f92106-4e6f-4fdd-b646-322ee996dc80-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.597056 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerID="794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514" exitCode=0 Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.597150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.597144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e6f92106-4e6f-4fdd-b646-322ee996dc80","Type":"ContainerDied","Data":"794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514"} Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.597389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e6f92106-4e6f-4fdd-b646-322ee996dc80","Type":"ContainerDied","Data":"ba41cf0494580131641542900654bdffb24a6379473ee3a827e893dd0f1fb1a9"} Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.597424 4707 scope.go:117] "RemoveContainer" containerID="794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.598617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f341e188-43e6-4019-b4dd-b8ea727d2d3f","Type":"ContainerStarted","Data":"99d78c44149d942c63b32abb0f52b18b72665aa6bf9087c9d15c4961ea333793"} Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.601971 4707 generic.go:334] "Generic (PLEG): container finished" podID="448446bb-e187-45a1-a751-63dbb87dffa2" containerID="8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd" exitCode=0 Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.602038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtlgf" event={"ID":"448446bb-e187-45a1-a751-63dbb87dffa2","Type":"ContainerDied","Data":"8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd"} Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.602795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtlgf" event={"ID":"448446bb-e187-45a1-a751-63dbb87dffa2","Type":"ContainerStarted","Data":"e9db4b346dfa9e61545a500d1deac4a5e36544871d3964bd39a24168961a1d1d"} Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.620460 4707 scope.go:117] "RemoveContainer" 
containerID="0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.638544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.643930 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.664026 4707 scope.go:117] "RemoveContainer" containerID="794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514" Jan 21 15:44:51 crc kubenswrapper[4707]: E0121 15:44:51.664541 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514\": container with ID starting with 794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514 not found: ID does not exist" containerID="794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.664655 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514"} err="failed to get container status \"794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514\": rpc error: code = NotFound desc = could not find container \"794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514\": container with ID starting with 794ac942c0acbb2c348ca8af846921d6020b547e881a7cd517d8021198f81514 not found: ID does not exist" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.664731 4707 scope.go:117] "RemoveContainer" containerID="0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0" Jan 21 15:44:51 crc kubenswrapper[4707]: E0121 15:44:51.665872 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0\": container with ID starting with 0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0 not found: ID does not exist" containerID="0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.665963 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0"} err="failed to get container status \"0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0\": rpc error: code = NotFound desc = could not find container \"0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0\": container with ID starting with 0b82432776e7ba6a338a87569c26feb8ba191418bf96c2c6cf13c017e36c85b0 not found: ID does not exist" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.700953 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:44:51 crc kubenswrapper[4707]: E0121 15:44:51.701507 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="rabbitmq" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.701531 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="rabbitmq" Jan 21 15:44:51 crc kubenswrapper[4707]: E0121 15:44:51.701597 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="setup-container" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.701616 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="setup-container" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.701915 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" containerName="rabbitmq" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.703099 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.704537 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.704671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.704945 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.705019 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-cs298" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.705096 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.705135 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.705547 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.708275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.781593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.781830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.781910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.781991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvw5j\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-kube-api-access-xvw5j\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.782898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.884782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.884845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: 
\"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.884870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.884889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.884938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.884976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvw5j\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-kube-api-access-xvw5j\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.885017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.885046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.885065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.885142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.885178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc 
kubenswrapper[4707]: I0121 15:44:51.885500 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.885775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.886009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.886034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.886259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.887105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.890090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.890239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.890266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.892625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.899202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvw5j\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-kube-api-access-xvw5j\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:51 crc kubenswrapper[4707]: I0121 15:44:51.917428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.024764 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.441408 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:44:52 crc kubenswrapper[4707]: W0121 15:44:52.455578 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a048e9d_4ff0_479e_bc75_8ccfa4ec7fab.slice/crio-0f3770314d0a5332a4046629981f2c4c2f54bf0d6f0987f4ef6dfb516694cbed WatchSource:0}: Error finding container 0f3770314d0a5332a4046629981f2c4c2f54bf0d6f0987f4ef6dfb516694cbed: Status 404 returned error can't find the container with id 0f3770314d0a5332a4046629981f2c4c2f54bf0d6f0987f4ef6dfb516694cbed Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.618529 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f191858-2d71-42bb-8227-069401c26dd4" containerID="02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f" exitCode=0 Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.618617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerDied","Data":"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f"} Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.619978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab","Type":"ContainerStarted","Data":"0f3770314d0a5332a4046629981f2c4c2f54bf0d6f0987f4ef6dfb516694cbed"} Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.788768 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.789057 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:52 crc kubenswrapper[4707]: I0121 15:44:52.849883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 15:44:53.196314 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f92106-4e6f-4fdd-b646-322ee996dc80" path="/var/lib/kubelet/pods/e6f92106-4e6f-4fdd-b646-322ee996dc80/volumes" Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 
15:44:53.630202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerStarted","Data":"80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63"} Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 15:44:53.632356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f341e188-43e6-4019-b4dd-b8ea727d2d3f","Type":"ContainerStarted","Data":"2d3a5c43e34f6f46f6d3cb5d8c70467adeed72ea7c6b5d91eb9047045028dd92"} Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 15:44:53.634617 4707 generic.go:334] "Generic (PLEG): container finished" podID="448446bb-e187-45a1-a751-63dbb87dffa2" containerID="e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6" exitCode=0 Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 15:44:53.634663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtlgf" event={"ID":"448446bb-e187-45a1-a751-63dbb87dffa2","Type":"ContainerDied","Data":"e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6"} Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 15:44:53.637355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab","Type":"ContainerStarted","Data":"e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f"} Jan 21 15:44:53 crc kubenswrapper[4707]: I0121 15:44:53.979787 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.185977 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.205746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.648139 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef17ce01-e6d2-4028-8e44-e794559c1608" containerID="ec9bc512b0547b6776ea1f5e8282c90dc55ec62ca4731917e733976641cf9ab8" exitCode=0 Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.648177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" event={"ID":"ef17ce01-e6d2-4028-8e44-e794559c1608","Type":"ContainerDied","Data":"ec9bc512b0547b6776ea1f5e8282c90dc55ec62ca4731917e733976641cf9ab8"} Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.651608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtlgf" event={"ID":"448446bb-e187-45a1-a751-63dbb87dffa2","Type":"ContainerStarted","Data":"3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1"} Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.676875 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jtlgf" podStartSLOduration=3.177776356 podStartE2EDuration="5.676861693s" podCreationTimestamp="2026-01-21 15:44:49 +0000 UTC" firstStartedPulling="2026-01-21 15:44:51.606361809 +0000 UTC m=+2588.787878030" lastFinishedPulling="2026-01-21 15:44:54.105447145 +0000 UTC m=+2591.286963367" observedRunningTime="2026-01-21 15:44:54.675416545 +0000 UTC m=+2591.856932767" watchObservedRunningTime="2026-01-21 15:44:54.676861693 
+0000 UTC m=+2591.858377915" Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.679367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:44:54 crc kubenswrapper[4707]: I0121 15:44:54.956879 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:44:55 crc kubenswrapper[4707]: I0121 15:44:55.198418 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="de4cffce-959d-47e8-9217-f842b2ab8f65" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.217:5671: i/o timeout" Jan 21 15:44:55 crc kubenswrapper[4707]: I0121 15:44:55.661709 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" containerID="6d5389c6099f34042afa120959ddb534a1021ab43120ae8f9998af0e26ea03c0" exitCode=0 Jan 21 15:44:55 crc kubenswrapper[4707]: I0121 15:44:55.661860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" event={"ID":"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c","Type":"ContainerDied","Data":"6d5389c6099f34042afa120959ddb534a1021ab43120ae8f9998af0e26ea03c0"} Jan 21 15:44:55 crc kubenswrapper[4707]: I0121 15:44:55.980610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.075980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-db-sync-config-data\") pod \"ef17ce01-e6d2-4028-8e44-e794559c1608\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.076045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5l47\" (UniqueName: \"kubernetes.io/projected/ef17ce01-e6d2-4028-8e44-e794559c1608-kube-api-access-b5l47\") pod \"ef17ce01-e6d2-4028-8e44-e794559c1608\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.076234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-combined-ca-bundle\") pod \"ef17ce01-e6d2-4028-8e44-e794559c1608\" (UID: \"ef17ce01-e6d2-4028-8e44-e794559c1608\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.080958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef17ce01-e6d2-4028-8e44-e794559c1608-kube-api-access-b5l47" (OuterVolumeSpecName: "kube-api-access-b5l47") pod "ef17ce01-e6d2-4028-8e44-e794559c1608" (UID: "ef17ce01-e6d2-4028-8e44-e794559c1608"). InnerVolumeSpecName "kube-api-access-b5l47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.081236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ef17ce01-e6d2-4028-8e44-e794559c1608" (UID: "ef17ce01-e6d2-4028-8e44-e794559c1608"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.106058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef17ce01-e6d2-4028-8e44-e794559c1608" (UID: "ef17ce01-e6d2-4028-8e44-e794559c1608"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.178376 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.178410 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5l47\" (UniqueName: \"kubernetes.io/projected/ef17ce01-e6d2-4028-8e44-e794559c1608-kube-api-access-b5l47\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.178424 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17ce01-e6d2-4028-8e44-e794559c1608-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.258239 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.258655 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-central-agent" containerID="cri-o://6e8c18ac54722bf62f459ec4d8f2a549343b964d81d3ff4c14b5d530d9b51599" gracePeriod=30 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.258722 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-notification-agent" containerID="cri-o://d6e6e39f9397f4e8110c7a9e51297146f66b294a241e04080bf2b57e5ac33106" gracePeriod=30 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.258715 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="sg-core" containerID="cri-o://a470de6cfb38a8ead232404cfdbf26b10b5fa25e8cca4459264a1a27873ad473" gracePeriod=30 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.258751 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="proxy-httpd" containerID="cri-o://d7dcd8260bbe061d453670f8c78bc97ecce48dac7c741791036f5dda2b5640d0" gracePeriod=30 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.381951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.673923 4707 generic.go:334] "Generic (PLEG): container finished" podID="14b133c7-53cb-4cb8-9b04-66973d5ee9a0" containerID="c10e79abafe959b74a9a32a523621e9d6baab7114207d3edd4ec7a890bd33ad2" exitCode=0 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.673979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" 
event={"ID":"14b133c7-53cb-4cb8-9b04-66973d5ee9a0","Type":"ContainerDied","Data":"c10e79abafe959b74a9a32a523621e9d6baab7114207d3edd4ec7a890bd33ad2"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.676382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" event={"ID":"ef17ce01-e6d2-4028-8e44-e794559c1608","Type":"ContainerDied","Data":"ea9f4d8ca49b11025809d9bdaaa21033f7d17baa7c3d9c78208d4b0874d27d36"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.676409 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea9f4d8ca49b11025809d9bdaaa21033f7d17baa7c3d9c78208d4b0874d27d36" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.676449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-dpcvc" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696096 4707 generic.go:334] "Generic (PLEG): container finished" podID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerID="d7dcd8260bbe061d453670f8c78bc97ecce48dac7c741791036f5dda2b5640d0" exitCode=0 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696120 4707 generic.go:334] "Generic (PLEG): container finished" podID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerID="a470de6cfb38a8ead232404cfdbf26b10b5fa25e8cca4459264a1a27873ad473" exitCode=2 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696129 4707 generic.go:334] "Generic (PLEG): container finished" podID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerID="d6e6e39f9397f4e8110c7a9e51297146f66b294a241e04080bf2b57e5ac33106" exitCode=0 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696136 4707 generic.go:334] "Generic (PLEG): container finished" podID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerID="6e8c18ac54722bf62f459ec4d8f2a549343b964d81d3ff4c14b5d530d9b51599" exitCode=0 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerDied","Data":"d7dcd8260bbe061d453670f8c78bc97ecce48dac7c741791036f5dda2b5640d0"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerDied","Data":"a470de6cfb38a8ead232404cfdbf26b10b5fa25e8cca4459264a1a27873ad473"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerDied","Data":"d6e6e39f9397f4e8110c7a9e51297146f66b294a241e04080bf2b57e5ac33106"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.696206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerDied","Data":"6e8c18ac54722bf62f459ec4d8f2a549343b964d81d3ff4c14b5d530d9b51599"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.698984 4707 generic.go:334] "Generic (PLEG): container finished" podID="02253407-57dc-4a23-8c81-e9464a8faa92" containerID="be1b9dd35378efeb49e3e0750d3c9b65ebd4b0e6153a53a3a5ccf28db9d738e2" exitCode=0 Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.699113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" event={"ID":"02253407-57dc-4a23-8c81-e9464a8faa92","Type":"ContainerDied","Data":"be1b9dd35378efeb49e3e0750d3c9b65ebd4b0e6153a53a3a5ccf28db9d738e2"} Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.771794 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82"] Jan 21 15:44:56 crc kubenswrapper[4707]: E0121 15:44:56.772348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef17ce01-e6d2-4028-8e44-e794559c1608" containerName="barbican-db-sync" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.772359 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef17ce01-e6d2-4028-8e44-e794559c1608" containerName="barbican-db-sync" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.772555 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef17ce01-e6d2-4028-8e44-e794559c1608" containerName="barbican-db-sync" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.773415 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.795310 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-cb66854d6-grx69"] Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.796885 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.812863 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82"] Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.829482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6"] Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.830958 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.838972 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb66854d6-grx69"] Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.858044 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6"] Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.891938 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.892712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gjt\" (UniqueName: \"kubernetes.io/projected/8e72ad47-d005-44ce-a494-5ed25884c762-kube-api-access-26gjt\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.892765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data-custom\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.892782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-combined-ca-bundle\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.892841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-combined-ca-bundle\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.892871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7506b316-cf85-48fa-a833-fafc5e2cb8ce-logs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.892922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-public-tls-certs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.893002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data-custom\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.893032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.893072 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-internal-tls-certs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.893137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e72ad47-d005-44ce-a494-5ed25884c762-logs\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.893169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.893194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtcht\" (UniqueName: \"kubernetes.io/projected/7506b316-cf85-48fa-a833-fafc5e2cb8ce-kube-api-access-mtcht\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-config-data\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-sg-core-conf-yaml\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-log-httpd\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-combined-ca-bundle\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpbtt\" (UniqueName: \"kubernetes.io/projected/138d6af4-a6f1-406d-a87d-0c21ff9a844a-kube-api-access-wpbtt\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-ceilometer-tls-certs\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-run-httpd\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.995656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-scripts\") pod \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\" (UID: \"138d6af4-a6f1-406d-a87d-0c21ff9a844a\") " Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data-custom\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-internal-tls-certs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6k5\" (UniqueName: \"kubernetes.io/projected/6ba6473c-6566-4d18-88ff-e329a248c151-kube-api-access-5x6k5\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e72ad47-d005-44ce-a494-5ed25884c762-logs\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6ba6473c-6566-4d18-88ff-e329a248c151-logs\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtcht\" (UniqueName: \"kubernetes.io/projected/7506b316-cf85-48fa-a833-fafc5e2cb8ce-kube-api-access-mtcht\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-combined-ca-bundle\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gjt\" (UniqueName: \"kubernetes.io/projected/8e72ad47-d005-44ce-a494-5ed25884c762-kube-api-access-26gjt\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data-custom\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data-custom\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-combined-ca-bundle\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-combined-ca-bundle\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: 
\"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7506b316-cf85-48fa-a833-fafc5e2cb8ce-logs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:56 crc kubenswrapper[4707]: I0121 15:44:56.996945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-public-tls-certs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:56.997116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7506b316-cf85-48fa-a833-fafc5e2cb8ce-logs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:56.999180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:56.999372 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:56.999407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e72ad47-d005-44ce-a494-5ed25884c762-logs\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.001785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data-custom\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.003696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-internal-tls-certs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.003839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138d6af4-a6f1-406d-a87d-0c21ff9a844a-kube-api-access-wpbtt" (OuterVolumeSpecName: "kube-api-access-wpbtt") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "kube-api-access-wpbtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.005404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.005730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.006728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data-custom\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.007826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-combined-ca-bundle\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.011077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-scripts" (OuterVolumeSpecName: "scripts") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.014234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtcht\" (UniqueName: \"kubernetes.io/projected/7506b316-cf85-48fa-a833-fafc5e2cb8ce-kube-api-access-mtcht\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.027843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-combined-ca-bundle\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.029499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-public-tls-certs\") pod \"barbican-api-cb66854d6-grx69\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.037383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gjt\" (UniqueName: \"kubernetes.io/projected/8e72ad47-d005-44ce-a494-5ed25884c762-kube-api-access-26gjt\") pod \"barbican-worker-5b9c7d766c-wml82\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.044964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.054456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.086664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6k5\" (UniqueName: \"kubernetes.io/projected/6ba6473c-6566-4d18-88ff-e329a248c151-kube-api-access-5x6k5\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba6473c-6566-4d18-88ff-e329a248c151-logs\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-combined-ca-bundle\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data-custom\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101630 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101643 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/138d6af4-a6f1-406d-a87d-0c21ff9a844a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101652 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101661 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.101668 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc 
kubenswrapper[4707]: I0121 15:44:57.101676 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpbtt\" (UniqueName: \"kubernetes.io/projected/138d6af4-a6f1-406d-a87d-0c21ff9a844a-kube-api-access-wpbtt\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.102064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba6473c-6566-4d18-88ff-e329a248c151-logs\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.104311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-combined-ca-bundle\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.105245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.105630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data-custom\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.107617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-config-data" (OuterVolumeSpecName: "config-data") pod "138d6af4-a6f1-406d-a87d-0c21ff9a844a" (UID: "138d6af4-a6f1-406d-a87d-0c21ff9a844a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.118950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6k5\" (UniqueName: \"kubernetes.io/projected/6ba6473c-6566-4d18-88ff-e329a248c151-kube-api-access-5x6k5\") pod \"barbican-keystone-listener-6b7c5477b-n8gj6\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.119382 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.145977 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.152894 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.160618 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.182534 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:44:57 crc kubenswrapper[4707]: E0121 15:44:57.182731 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.203213 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138d6af4-a6f1-406d-a87d-0c21ff9a844a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.304917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-db-sync-config-data\") pod \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.305211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-etc-machine-id\") pod \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.305238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-config-data\") pod \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.305309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" (UID: "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.305704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-combined-ca-bundle\") pod \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.305735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgmq2\" (UniqueName: \"kubernetes.io/projected/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-kube-api-access-kgmq2\") pod \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.305792 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-scripts\") pod \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\" (UID: \"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c\") " Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.306384 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.309390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" (UID: "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.311712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-scripts" (OuterVolumeSpecName: "scripts") pod "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" (UID: "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.314332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-kube-api-access-kgmq2" (OuterVolumeSpecName: "kube-api-access-kgmq2") pod "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" (UID: "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c"). InnerVolumeSpecName "kube-api-access-kgmq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.345664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" (UID: "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.369873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-config-data" (OuterVolumeSpecName: "config-data") pod "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" (UID: "fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.407942 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.407972 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.407983 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.407992 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgmq2\" (UniqueName: \"kubernetes.io/projected/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-kube-api-access-kgmq2\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.408002 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.604946 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82"] Jan 21 15:44:57 crc kubenswrapper[4707]: W0121 15:44:57.616235 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7506b316_cf85_48fa_a833_fafc5e2cb8ce.slice/crio-5ed7e978b0758976ed6e29f96df267bc97775ef10e47fddedf13c203c2f46b13 WatchSource:0}: Error finding container 5ed7e978b0758976ed6e29f96df267bc97775ef10e47fddedf13c203c2f46b13: Status 404 returned error can't find the container with id 5ed7e978b0758976ed6e29f96df267bc97775ef10e47fddedf13c203c2f46b13 Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.626416 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb66854d6-grx69"] Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.648060 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.648266 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.721509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"138d6af4-a6f1-406d-a87d-0c21ff9a844a","Type":"ContainerDied","Data":"1de3935c5f0923ae84652be8e234eec3ed5b8ba3117dd53a63c802808486e01f"} Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.721566 4707 scope.go:117] "RemoveContainer" containerID="d7dcd8260bbe061d453670f8c78bc97ecce48dac7c741791036f5dda2b5640d0" Jan 21 15:44:57 crc 
kubenswrapper[4707]: I0121 15:44:57.721724 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.725853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" event={"ID":"8e72ad47-d005-44ce-a494-5ed25884c762","Type":"ContainerStarted","Data":"d3f1f3e0627d237a1299b06d79f43c93b264288df51c9b1bd87bdc219ab950e8"} Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.730238 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6"] Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.733791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" event={"ID":"fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c","Type":"ContainerDied","Data":"5920e730b25bc38ec106cc97108743826c23555b3d6fc30e3296027f9b23b9da"} Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.733843 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5920e730b25bc38ec106cc97108743826c23555b3d6fc30e3296027f9b23b9da" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.733884 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-r94jp" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.767599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" event={"ID":"7506b316-cf85-48fa-a833-fafc5e2cb8ce","Type":"ContainerStarted","Data":"5ed7e978b0758976ed6e29f96df267bc97775ef10e47fddedf13c203c2f46b13"} Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.767786 4707 scope.go:117] "RemoveContainer" containerID="a470de6cfb38a8ead232404cfdbf26b10b5fa25e8cca4459264a1a27873ad473" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.779733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.797846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.825910 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:57 crc kubenswrapper[4707]: E0121 15:44:57.826343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-central-agent" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826360 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-central-agent" Jan 21 15:44:57 crc kubenswrapper[4707]: E0121 15:44:57.826379 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-notification-agent" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826385 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-notification-agent" Jan 21 15:44:57 crc kubenswrapper[4707]: E0121 15:44:57.826407 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" containerName="cinder-db-sync" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826419 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" containerName="cinder-db-sync" Jan 21 15:44:57 crc kubenswrapper[4707]: E0121 15:44:57.826435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="sg-core" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="sg-core" Jan 21 15:44:57 crc kubenswrapper[4707]: E0121 15:44:57.826455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="proxy-httpd" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826460 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="proxy-httpd" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826608 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="proxy-httpd" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826621 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-central-agent" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826632 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" containerName="cinder-db-sync" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="ceilometer-notification-agent" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.826659 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" containerName="sg-core" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.828487 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.831215 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.831241 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.831413 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.832515 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.840980 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.841198 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api-log" containerID="cri-o://82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083" gracePeriod=30 Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.841348 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api" containerID="cri-o://032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe" gracePeriod=30 Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.854122 4707 scope.go:117] "RemoveContainer" containerID="d6e6e39f9397f4e8110c7a9e51297146f66b294a241e04080bf2b57e5ac33106" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.915660 4707 scope.go:117] "RemoveContainer" containerID="6e8c18ac54722bf62f459ec4d8f2a549343b964d81d3ff4c14b5d530d9b51599" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-run-httpd\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-config-data\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 
crc kubenswrapper[4707]: I0121 15:44:57.918321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-log-httpd\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cr5d\" (UniqueName: \"kubernetes.io/projected/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-kube-api-access-7cr5d\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:57 crc kubenswrapper[4707]: I0121 15:44:57.918389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-scripts\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-run-httpd\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-config-data\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-log-httpd\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cr5d\" (UniqueName: 
\"kubernetes.io/projected/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-kube-api-access-7cr5d\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-scripts\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.024587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.026220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-log-httpd\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.026453 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-run-httpd\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.033686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-config-data\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.034066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.034486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.047405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-scripts\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.052685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cr5d\" (UniqueName: \"kubernetes.io/projected/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-kube-api-access-7cr5d\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.054411 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.191692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.339479 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.382443 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.438708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwg99\" (UniqueName: \"kubernetes.io/projected/02253407-57dc-4a23-8c81-e9464a8faa92-kube-api-access-jwg99\") pod \"02253407-57dc-4a23-8c81-e9464a8faa92\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.438890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-combined-ca-bundle\") pod \"02253407-57dc-4a23-8c81-e9464a8faa92\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.438999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-config\") pod \"02253407-57dc-4a23-8c81-e9464a8faa92\" (UID: \"02253407-57dc-4a23-8c81-e9464a8faa92\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.448988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02253407-57dc-4a23-8c81-e9464a8faa92-kube-api-access-jwg99" (OuterVolumeSpecName: "kube-api-access-jwg99") pod "02253407-57dc-4a23-8c81-e9464a8faa92" (UID: "02253407-57dc-4a23-8c81-e9464a8faa92"). InnerVolumeSpecName "kube-api-access-jwg99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.471681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02253407-57dc-4a23-8c81-e9464a8faa92" (UID: "02253407-57dc-4a23-8c81-e9464a8faa92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.483914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-config" (OuterVolumeSpecName: "config") pod "02253407-57dc-4a23-8c81-e9464a8faa92" (UID: "02253407-57dc-4a23-8c81-e9464a8faa92"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.540678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-config-data\") pod \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.540837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w89m7\" (UniqueName: \"kubernetes.io/projected/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-kube-api-access-w89m7\") pod \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.540885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-combined-ca-bundle\") pod \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.540906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-db-sync-config-data\") pod \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\" (UID: \"14b133c7-53cb-4cb8-9b04-66973d5ee9a0\") " Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.541324 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.541340 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwg99\" (UniqueName: \"kubernetes.io/projected/02253407-57dc-4a23-8c81-e9464a8faa92-kube-api-access-jwg99\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.541350 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02253407-57dc-4a23-8c81-e9464a8faa92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.546738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14b133c7-53cb-4cb8-9b04-66973d5ee9a0" (UID: "14b133c7-53cb-4cb8-9b04-66973d5ee9a0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.548066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-kube-api-access-w89m7" (OuterVolumeSpecName: "kube-api-access-w89m7") pod "14b133c7-53cb-4cb8-9b04-66973d5ee9a0" (UID: "14b133c7-53cb-4cb8-9b04-66973d5ee9a0"). InnerVolumeSpecName "kube-api-access-w89m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.570728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14b133c7-53cb-4cb8-9b04-66973d5ee9a0" (UID: "14b133c7-53cb-4cb8-9b04-66973d5ee9a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.585670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-config-data" (OuterVolumeSpecName: "config-data") pod "14b133c7-53cb-4cb8-9b04-66973d5ee9a0" (UID: "14b133c7-53cb-4cb8-9b04-66973d5ee9a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.642750 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w89m7\" (UniqueName: \"kubernetes.io/projected/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-kube-api-access-w89m7\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.642775 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.642784 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.642796 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b133c7-53cb-4cb8-9b04-66973d5ee9a0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.694183 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:44:58 crc kubenswrapper[4707]: W0121 15:44:58.755893 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fbc14e1_6cf0_45a0_a87f_7f64b2d122f2.slice/crio-ec879e1a4890e4ba8696f3cfadad5a026adb34a24b14313a7091e3adeb407502 WatchSource:0}: Error finding container ec879e1a4890e4ba8696f3cfadad5a026adb34a24b14313a7091e3adeb407502: Status 404 returned error can't find the container with id ec879e1a4890e4ba8696f3cfadad5a026adb34a24b14313a7091e3adeb407502 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.776348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" event={"ID":"7506b316-cf85-48fa-a833-fafc5e2cb8ce","Type":"ContainerStarted","Data":"d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.776390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" event={"ID":"7506b316-cf85-48fa-a833-fafc5e2cb8ce","Type":"ContainerStarted","Data":"e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.776476 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.776648 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.777793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerStarted","Data":"ec879e1a4890e4ba8696f3cfadad5a026adb34a24b14313a7091e3adeb407502"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.779609 4707 generic.go:334] "Generic (PLEG): container finished" podID="708db58a-2cd1-4735-8c38-16987a96a22a" containerID="74275cce571896970dcd652c2e437e0b5e956b6bdecd5677a5c8242f96d9d70a" exitCode=0 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.779670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" event={"ID":"708db58a-2cd1-4735-8c38-16987a96a22a","Type":"ContainerDied","Data":"74275cce571896970dcd652c2e437e0b5e956b6bdecd5677a5c8242f96d9d70a"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.781152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" event={"ID":"02253407-57dc-4a23-8c81-e9464a8faa92","Type":"ContainerDied","Data":"abcb9446e24e3d2a5fd5954f7a378f3fa3066ada3058e6735bdfc6457ab43ed0"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.781178 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abcb9446e24e3d2a5fd5954f7a378f3fa3066ada3058e6735bdfc6457ab43ed0" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.781157 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-zcs8f" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.782862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" event={"ID":"6ba6473c-6566-4d18-88ff-e329a248c151","Type":"ContainerStarted","Data":"1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.782888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" event={"ID":"6ba6473c-6566-4d18-88ff-e329a248c151","Type":"ContainerStarted","Data":"ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.782899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" event={"ID":"6ba6473c-6566-4d18-88ff-e329a248c151","Type":"ContainerStarted","Data":"b058eb973732d484fbf5e6da1e1c8f73f9e45604fe4050958c4c3094c7f12040"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.784757 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.784755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-mbvl2" event={"ID":"14b133c7-53cb-4cb8-9b04-66973d5ee9a0","Type":"ContainerDied","Data":"ecbc41a1e39a7c8f5268d6f817fdb37138f92349b7057ab16b853b1bb91ffbd2"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.784845 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecbc41a1e39a7c8f5268d6f817fdb37138f92349b7057ab16b853b1bb91ffbd2" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.786300 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerID="82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083" exitCode=143 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.786362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8","Type":"ContainerDied","Data":"82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.788285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" event={"ID":"8e72ad47-d005-44ce-a494-5ed25884c762","Type":"ContainerStarted","Data":"6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.788310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" event={"ID":"8e72ad47-d005-44ce-a494-5ed25884c762","Type":"ContainerStarted","Data":"2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e"} Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.797593 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" podStartSLOduration=2.797583598 podStartE2EDuration="2.797583598s" podCreationTimestamp="2026-01-21 15:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:58.791358654 +0000 UTC m=+2595.972874876" watchObservedRunningTime="2026-01-21 15:44:58.797583598 +0000 UTC m=+2595.979099820" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.821087 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" podStartSLOduration=2.821075058 podStartE2EDuration="2.821075058s" podCreationTimestamp="2026-01-21 15:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:58.819365884 +0000 UTC m=+2596.000882106" watchObservedRunningTime="2026-01-21 15:44:58.821075058 +0000 UTC m=+2596.002591280" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.840933 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" podStartSLOduration=2.840918758 podStartE2EDuration="2.840918758s" podCreationTimestamp="2026-01-21 15:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:44:58.835586132 +0000 UTC m=+2596.017102375" watchObservedRunningTime="2026-01-21 
15:44:58.840918758 +0000 UTC m=+2596.022434981" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.850737 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88"] Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.859305 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-68556f8ccd-sx4vz"] Jan 21 15:44:58 crc kubenswrapper[4707]: E0121 15:44:58.859650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02253407-57dc-4a23-8c81-e9464a8faa92" containerName="neutron-db-sync" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.859667 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02253407-57dc-4a23-8c81-e9464a8faa92" containerName="neutron-db-sync" Jan 21 15:44:58 crc kubenswrapper[4707]: E0121 15:44:58.859677 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b133c7-53cb-4cb8-9b04-66973d5ee9a0" containerName="glance-db-sync" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.859684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b133c7-53cb-4cb8-9b04-66973d5ee9a0" containerName="glance-db-sync" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.864069 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker-log" containerID="cri-o://2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.864685 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b133c7-53cb-4cb8-9b04-66973d5ee9a0" containerName="glance-db-sync" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.864704 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="02253407-57dc-4a23-8c81-e9464a8faa92" containerName="neutron-db-sync" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.865193 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker" containerID="cri-o://c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.865643 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.874873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-68556f8ccd-sx4vz"] Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.895622 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7"] Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.895930 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener-log" containerID="cri-o://892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.896086 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener" containerID="cri-o://dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.898189 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.898377 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-log" containerID="cri-o://c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.898472 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-httpd" containerID="cri-o://f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-internal-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-config\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-combined-ca-bundle\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-httpd-config\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-public-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-ovndb-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.948849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4kkh\" (UniqueName: \"kubernetes.io/projected/5f587396-a3bb-4add-bb87-d1bacd7e763e-kube-api-access-z4kkh\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.965335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.965528 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-log" containerID="cri-o://bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54" gracePeriod=30 Jan 21 15:44:58 crc kubenswrapper[4707]: I0121 15:44:58.965619 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-httpd" containerID="cri-o://91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b" gracePeriod=30 Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.050704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-ovndb-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.050773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4kkh\" (UniqueName: \"kubernetes.io/projected/5f587396-a3bb-4add-bb87-d1bacd7e763e-kube-api-access-z4kkh\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.050860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-internal-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: 
\"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.050903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-config\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.050938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-combined-ca-bundle\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.051102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-httpd-config\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.051175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-public-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.056900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-internal-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.057041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-combined-ca-bundle\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.057165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-ovndb-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.058483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-config\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.060025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-public-tls-certs\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " 
pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.060282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-httpd-config\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.066336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4kkh\" (UniqueName: \"kubernetes.io/projected/5f587396-a3bb-4add-bb87-d1bacd7e763e-kube-api-access-z4kkh\") pod \"neutron-68556f8ccd-sx4vz\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.190031 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.215243 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138d6af4-a6f1-406d-a87d-0c21ff9a844a" path="/var/lib/kubelet/pods/138d6af4-a6f1-406d-a87d-0c21ff9a844a/volumes" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.594573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-68556f8ccd-sx4vz"] Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.797765 4707 generic.go:334] "Generic (PLEG): container finished" podID="30955948-0578-447c-9c9d-73e489e54962" containerID="892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419" exitCode=143 Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.797837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" event={"ID":"30955948-0578-447c-9c9d-73e489e54962","Type":"ContainerDied","Data":"892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.799455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerStarted","Data":"ead8b354d29ca6da319ae2b1b30a2002a1bbe77cdf13011c3fb2f37808e1c514"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.799495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerStarted","Data":"0f5ca1c05a8ba258481aceb5bf81924b527b4a365e2428a4dcd4b4a519e233ef"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.801485 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerID="c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2" exitCode=143 Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.801531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f","Type":"ContainerDied","Data":"c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.803139 4707 generic.go:334] "Generic (PLEG): container finished" podID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerID="bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54" exitCode=143 Jan 21 15:44:59 crc 
kubenswrapper[4707]: I0121 15:44:59.803213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"44b894eb-fa59-4d67-aeb6-46c945fe6dd5","Type":"ContainerDied","Data":"bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.804356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerStarted","Data":"bfb9505879b3bc1531075e521f5c432e0486e1366b2751cb167564231e34887f"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.805734 4707 generic.go:334] "Generic (PLEG): container finished" podID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerID="2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74" exitCode=143 Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.806346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" event={"ID":"e0ffe639-1a1a-45e1-9152-ab2aa7af5301","Type":"ContainerDied","Data":"2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74"} Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.926957 4707 scope.go:117] "RemoveContainer" containerID="1c8e77bfa68e38980554d4bde9e72ce205b90327a6373937e410e37c84c7a7f8" Jan 21 15:44:59 crc kubenswrapper[4707]: I0121 15:44:59.995217 4707 scope.go:117] "RemoveContainer" containerID="0f01c350b247b2f192966e13748e0e762a3105665c2a60f722f20a0211dfeec9" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.137724 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.139517 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.141472 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.145573 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.151617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.184150 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.204295 4707 scope.go:117] "RemoveContainer" containerID="b99afad247ed569f22be4570733e3e86a8756295925cb7a94ae2e835a8f1fd12" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.268859 4707 scope.go:117] "RemoveContainer" containerID="c66dd17d9b60df10f57e9cc85f7bf0536cdcaea704e2db1ea2f6186abbb9b630" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.279090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-combined-ca-bundle\") pod \"708db58a-2cd1-4735-8c38-16987a96a22a\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.279171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-scripts\") pod \"708db58a-2cd1-4735-8c38-16987a96a22a\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.279309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-config-data\") pod \"708db58a-2cd1-4735-8c38-16987a96a22a\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.279598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9x4\" (UniqueName: \"kubernetes.io/projected/708db58a-2cd1-4735-8c38-16987a96a22a-kube-api-access-nw9x4\") pod \"708db58a-2cd1-4735-8c38-16987a96a22a\" (UID: \"708db58a-2cd1-4735-8c38-16987a96a22a\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.280128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-config-volume\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.280262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-secret-volume\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.280738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxsz\" (UniqueName: \"kubernetes.io/projected/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-kube-api-access-lrxsz\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.294570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.295011 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.297251 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-scripts" (OuterVolumeSpecName: "scripts") pod "708db58a-2cd1-4735-8c38-16987a96a22a" (UID: "708db58a-2cd1-4735-8c38-16987a96a22a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.308918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708db58a-2cd1-4735-8c38-16987a96a22a-kube-api-access-nw9x4" (OuterVolumeSpecName: "kube-api-access-nw9x4") pod "708db58a-2cd1-4735-8c38-16987a96a22a" (UID: "708db58a-2cd1-4735-8c38-16987a96a22a"). InnerVolumeSpecName "kube-api-access-nw9x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.322725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-config-data" (OuterVolumeSpecName: "config-data") pod "708db58a-2cd1-4735-8c38-16987a96a22a" (UID: "708db58a-2cd1-4735-8c38-16987a96a22a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.330840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "708db58a-2cd1-4735-8c38-16987a96a22a" (UID: "708db58a-2cd1-4735-8c38-16987a96a22a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.335502 4707 scope.go:117] "RemoveContainer" containerID="1337d8606bac38af9b79c2ec732c96f137ac330857aa0b08877428b22036abc5" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.362971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxsz\" (UniqueName: \"kubernetes.io/projected/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-kube-api-access-lrxsz\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-config-volume\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-secret-volume\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391514 4707 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391531 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391541 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708db58a-2cd1-4735-8c38-16987a96a22a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.391549 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9x4\" (UniqueName: \"kubernetes.io/projected/708db58a-2cd1-4735-8c38-16987a96a22a-kube-api-access-nw9x4\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.393517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-config-volume\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.401568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-secret-volume\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.413004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxsz\" (UniqueName: \"kubernetes.io/projected/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-kube-api-access-lrxsz\") pod \"collect-profiles-29483505-qb259\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.461289 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.506481 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.609467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data-custom\") pod \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.609546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-combined-ca-bundle\") pod \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.610223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-logs\") pod \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.610265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdh98\" (UniqueName: \"kubernetes.io/projected/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-kube-api-access-kdh98\") pod \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.610297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data\") pod \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\" (UID: \"e0ffe639-1a1a-45e1-9152-ab2aa7af5301\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.611285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-logs" (OuterVolumeSpecName: "logs") pod "e0ffe639-1a1a-45e1-9152-ab2aa7af5301" (UID: "e0ffe639-1a1a-45e1-9152-ab2aa7af5301"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.617957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0ffe639-1a1a-45e1-9152-ab2aa7af5301" (UID: "e0ffe639-1a1a-45e1-9152-ab2aa7af5301"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.620014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-kube-api-access-kdh98" (OuterVolumeSpecName: "kube-api-access-kdh98") pod "e0ffe639-1a1a-45e1-9152-ab2aa7af5301" (UID: "e0ffe639-1a1a-45e1-9152-ab2aa7af5301"). InnerVolumeSpecName "kube-api-access-kdh98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.631200 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.643034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0ffe639-1a1a-45e1-9152-ab2aa7af5301" (UID: "e0ffe639-1a1a-45e1-9152-ab2aa7af5301"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.670914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data" (OuterVolumeSpecName: "config-data") pod "e0ffe639-1a1a-45e1-9152-ab2aa7af5301" (UID: "e0ffe639-1a1a-45e1-9152-ab2aa7af5301"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.711836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data\") pod \"30955948-0578-447c-9c9d-73e489e54962\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.712015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30955948-0578-447c-9c9d-73e489e54962-logs\") pod \"30955948-0578-447c-9c9d-73e489e54962\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.712044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-combined-ca-bundle\") pod \"30955948-0578-447c-9c9d-73e489e54962\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.712070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw49n\" (UniqueName: \"kubernetes.io/projected/30955948-0578-447c-9c9d-73e489e54962-kube-api-access-rw49n\") pod \"30955948-0578-447c-9c9d-73e489e54962\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.712120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data-custom\") pod \"30955948-0578-447c-9c9d-73e489e54962\" (UID: \"30955948-0578-447c-9c9d-73e489e54962\") " Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.712674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30955948-0578-447c-9c9d-73e489e54962-logs" (OuterVolumeSpecName: "logs") pod "30955948-0578-447c-9c9d-73e489e54962" (UID: "30955948-0578-447c-9c9d-73e489e54962"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.713236 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.713256 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdh98\" (UniqueName: \"kubernetes.io/projected/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-kube-api-access-kdh98\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.713266 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.713302 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.713312 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ffe639-1a1a-45e1-9152-ab2aa7af5301-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.713320 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30955948-0578-447c-9c9d-73e489e54962-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.714672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30955948-0578-447c-9c9d-73e489e54962" (UID: "30955948-0578-447c-9c9d-73e489e54962"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.715505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30955948-0578-447c-9c9d-73e489e54962-kube-api-access-rw49n" (OuterVolumeSpecName: "kube-api-access-rw49n") pod "30955948-0578-447c-9c9d-73e489e54962" (UID: "30955948-0578-447c-9c9d-73e489e54962"). InnerVolumeSpecName "kube-api-access-rw49n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.735584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30955948-0578-447c-9c9d-73e489e54962" (UID: "30955948-0578-447c-9c9d-73e489e54962"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.753951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data" (OuterVolumeSpecName: "config-data") pod "30955948-0578-447c-9c9d-73e489e54962" (UID: "30955948-0578-447c-9c9d-73e489e54962"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.819261 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.819314 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw49n\" (UniqueName: \"kubernetes.io/projected/30955948-0578-447c-9c9d-73e489e54962-kube-api-access-rw49n\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.819326 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.819336 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30955948-0578-447c-9c9d-73e489e54962-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.825564 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.825566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg" event={"ID":"708db58a-2cd1-4735-8c38-16987a96a22a","Type":"ContainerDied","Data":"e478f869783c247cc1b797cbe5a214c97fb7abf9d51bbfbf94c86ca0d70dcd18"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.825701 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e478f869783c247cc1b797cbe5a214c97fb7abf9d51bbfbf94c86ca0d70dcd18" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.830890 4707 generic.go:334] "Generic (PLEG): container finished" podID="30955948-0578-447c-9c9d-73e489e54962" containerID="dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8" exitCode=0 Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.830959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" event={"ID":"30955948-0578-447c-9c9d-73e489e54962","Type":"ContainerDied","Data":"dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.830987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" event={"ID":"30955948-0578-447c-9c9d-73e489e54962","Type":"ContainerDied","Data":"3eb1d5dcc534e6062a661e8ba642c7c69e4c56a6d4f5d3a2f5e5560251aba1fa"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.831008 4707 scope.go:117] "RemoveContainer" containerID="dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.831141 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.850155 4707 generic.go:334] "Generic (PLEG): container finished" podID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerID="c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020" exitCode=0 Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.850255 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.850366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" event={"ID":"e0ffe639-1a1a-45e1-9152-ab2aa7af5301","Type":"ContainerDied","Data":"c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.850409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88" event={"ID":"e0ffe639-1a1a-45e1-9152-ab2aa7af5301","Type":"ContainerDied","Data":"1b0effde157bd6dd6caa66c7277d95c8ed98bb53974f482cb8994505655cd4f8"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.860219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerStarted","Data":"93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.860605 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.865340 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.865502 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f9aba8f8-91db-426a-aa87-96064fe8974c" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" gracePeriod=30 Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.871028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerStarted","Data":"f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf"} Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.877847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.880182 4707 scope.go:117] "RemoveContainer" containerID="892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.886064 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-57c4bc7d6c-72rt7"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.889302 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podStartSLOduration=2.889288691 podStartE2EDuration="2.889288691s" podCreationTimestamp="2026-01-21 15:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-21 15:45:00.885561383 +0000 UTC m=+2598.067077605" watchObservedRunningTime="2026-01-21 15:45:00.889288691 +0000 UTC m=+2598.070804913" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.908604 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.914361 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86b7f55fc4-rft88"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.921042 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.922241 4707 scope.go:117] "RemoveContainer" containerID="dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8" Jan 21 15:45:00 crc kubenswrapper[4707]: E0121 15:45:00.923826 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8\": container with ID starting with dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8 not found: ID does not exist" containerID="dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.923864 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8"} err="failed to get container status \"dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8\": rpc error: code = NotFound desc = could not find container \"dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8\": container with ID starting with dd3f06062615487f4ac0277e306bca82ac50f7268ddc24f2beeb99a68caa4bb8 not found: ID does not exist" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.923886 4707 scope.go:117] "RemoveContainer" containerID="892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419" Jan 21 15:45:00 crc kubenswrapper[4707]: E0121 15:45:00.929597 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419\": container with ID starting with 892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419 not found: ID does not exist" containerID="892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.929626 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419"} err="failed to get container status \"892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419\": rpc error: code = NotFound desc = could not find container \"892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419\": container with ID starting with 892edac9ec51c730fda4c5d93d71c6764cf59e5402512b89b3415972187d9419 not found: ID does not exist" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.929641 4707 scope.go:117] "RemoveContainer" containerID="c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020" Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.962976 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259"] Jan 21 15:45:00 crc kubenswrapper[4707]: I0121 15:45:00.970595 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtlgf"] Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.033844 4707 scope.go:117] "RemoveContainer" containerID="2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.054763 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.9:8776/healthcheck\": read tcp 10.217.0.2:35774->10.217.1.9:8776: read: connection reset by peer" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.146628 4707 scope.go:117] "RemoveContainer" containerID="c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.147051 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020\": container with ID starting with c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020 not found: ID does not exist" containerID="c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.147078 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020"} err="failed to get container status \"c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020\": rpc error: code = NotFound desc = could not find container \"c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020\": container with ID starting with c73b213c94e8fc3c517d32e428782b2dc202c6eb380db40ad515831d88a0e020 not found: ID does not exist" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.147097 4707 scope.go:117] "RemoveContainer" containerID="2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.147490 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74\": container with ID starting with 2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74 not found: ID does not exist" containerID="2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.147509 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74"} err="failed to get container status \"2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74\": rpc error: code = NotFound desc = could not find container \"2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74\": container with ID starting with 2e7ca5af55ceed787e00f26bb1a7048d3264fcde1c9b19cd7b4ee707add9fd74 not found: ID does not exist" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.199521 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30955948-0578-447c-9c9d-73e489e54962" path="/var/lib/kubelet/pods/30955948-0578-447c-9c9d-73e489e54962/volumes" Jan 21 15:45:01 
crc kubenswrapper[4707]: I0121 15:45:01.200242 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" path="/var/lib/kubelet/pods/e0ffe639-1a1a-45e1-9152-ab2aa7af5301/volumes" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.403041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.461086 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.472642 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgz97\" (UniqueName: \"kubernetes.io/projected/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-kube-api-access-wgz97\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data-custom\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-etc-machine-id\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-public-tls-certs\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-internal-tls-certs\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-scripts\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-logs\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: 
\"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-combined-ca-bundle\") pod \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\" (UID: \"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8\") " Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.634820 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.635193 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-logs" (OuterVolumeSpecName: "logs") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.641990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.642470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-scripts" (OuterVolumeSpecName: "scripts") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.647102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-kube-api-access-wgz97" (OuterVolumeSpecName: "kube-api-access-wgz97") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "kube-api-access-wgz97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.657092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.677945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.681407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.683837 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data" (OuterVolumeSpecName: "config-data") pod "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" (UID: "e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737119 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737473 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737585 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgz97\" (UniqueName: \"kubernetes.io/projected/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-kube-api-access-wgz97\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737661 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737721 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737778 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737857 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.737921 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 
15:45:01.884161 4707 generic.go:334] "Generic (PLEG): container finished" podID="2bb5a6c6-fba7-4e90-9184-2e18785a63e7" containerID="da03af4fe60e288843f036c08c74724d08c4be0a5bb6736c5b2d003559fb7b1a" exitCode=0 Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.884364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" event={"ID":"2bb5a6c6-fba7-4e90-9184-2e18785a63e7","Type":"ContainerDied","Data":"da03af4fe60e288843f036c08c74724d08c4be0a5bb6736c5b2d003559fb7b1a"} Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.884470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" event={"ID":"2bb5a6c6-fba7-4e90-9184-2e18785a63e7","Type":"ContainerStarted","Data":"c4d30d8f7918eba08073d5f0929c20d85be441a0badd374a94c9bbcb1190a41e"} Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.887226 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerID="032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe" exitCode=0 Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.887248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8","Type":"ContainerDied","Data":"032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe"} Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.887291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8","Type":"ContainerDied","Data":"bd7fde705b15976eb6a5259e82e4aebef0a228963a90d3660a3ec3f830917201"} Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.887311 4707 scope.go:117] "RemoveContainer" containerID="032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.887310 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.894452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerStarted","Data":"e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d"} Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.894522 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="probe" containerID="cri-o://b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238" gracePeriod=30 Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.894662 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" containerID="cri-o://80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63" gracePeriod=30 Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.909562 4707 scope.go:117] "RemoveContainer" containerID="82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.937026 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.943908 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.956605 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957150 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957167 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957181 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957187 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957208 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957219 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957224 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api-log" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" 
containerName="barbican-worker-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957243 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker-log" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957254 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708db58a-2cd1-4735-8c38-16987a96a22a" containerName="nova-cell0-conductor-db-sync" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957259 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="708db58a-2cd1-4735-8c38-16987a96a22a" containerName="nova-cell0-conductor-db-sync" Jan 21 15:45:01 crc kubenswrapper[4707]: E0121 15:45:01.957276 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957282 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957452 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957465 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30955948-0578-447c-9c9d-73e489e54962" containerName="barbican-keystone-listener-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957477 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957484 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ffe639-1a1a-45e1-9152-ab2aa7af5301" containerName="barbican-worker-log" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957493 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="708db58a-2cd1-4735-8c38-16987a96a22a" containerName="nova-cell0-conductor-db-sync" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.957501 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" containerName="cinder-api" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.960003 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.962251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.962553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:45:01 crc kubenswrapper[4707]: I0121 15:45:01.962663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:01.998756 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.007425 4707 scope.go:117] "RemoveContainer" containerID="032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe" Jan 21 15:45:02 crc kubenswrapper[4707]: E0121 15:45:02.008057 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe\": container with ID starting with 032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe not found: ID does not exist" containerID="032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.008099 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe"} err="failed to get container status \"032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe\": rpc error: code = NotFound desc = could not find container \"032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe\": container with ID starting with 032b3d7d6503b04b4ce2e3614ed57f3b0f5c3b36c149699af880243cc96c0cfe not found: ID does not exist" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.008121 4707 scope.go:117] "RemoveContainer" containerID="82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083" Jan 21 15:45:02 crc kubenswrapper[4707]: E0121 15:45:02.008530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083\": container with ID starting with 82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083 not found: ID does not exist" containerID="82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.008561 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083"} err="failed to get container status \"82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083\": rpc error: code = NotFound desc = could not find container \"82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083\": container with ID starting with 82d3c2b02addc0e01448e88993c633e01a29639f57232f452d8cc4dea5db8083 not found: ID does not exist" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.037251 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.043710 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxhvl\" (UniqueName: \"kubernetes.io/projected/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-kube-api-access-sxhvl\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.043763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-logs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.043835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.043858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.043916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-scripts\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.043943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data-custom\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.044049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.044130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.044236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.146887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxhvl\" 
(UniqueName: \"kubernetes.io/projected/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-kube-api-access-sxhvl\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.146995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-logs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.147046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.147120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.147418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-scripts\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.147863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data-custom\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.147952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.148035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.148141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.147440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-logs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.149039 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.152228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-scripts\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.152519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.153298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data-custom\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.153655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.153950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.153972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.163469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxhvl\" (UniqueName: \"kubernetes.io/projected/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-kube-api-access-sxhvl\") pod \"cinder-api-0\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.299518 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.414430 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.518589 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-scripts\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-config-data\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-combined-ca-bundle\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmfmb\" (UniqueName: \"kubernetes.io/projected/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-kube-api-access-fmfmb\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-public-tls-certs\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-httpd-run\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.561596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-logs\") pod \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\" (UID: \"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.562484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-logs" (OuterVolumeSpecName: "logs") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.563197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.566907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-scripts" (OuterVolumeSpecName: "scripts") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.568932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-kube-api-access-fmfmb" (OuterVolumeSpecName: "kube-api-access-fmfmb") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "kube-api-access-fmfmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.576010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.601068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.615413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.619058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-config-data" (OuterVolumeSpecName: "config-data") pod "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" (UID: "a5ab5cb3-56d6-45f8-a8ae-0cda9871960f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.663245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-logs\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.663738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-config-data\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.663824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-logs" (OuterVolumeSpecName: "logs") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.663918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-httpd-run\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.664004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.664145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-internal-tls-certs\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.664187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.664403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx7tz\" (UniqueName: \"kubernetes.io/projected/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-kube-api-access-dx7tz\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.664500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-scripts\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.664613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-combined-ca-bundle\") pod \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\" (UID: \"44b894eb-fa59-4d67-aeb6-46c945fe6dd5\") " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665298 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665392 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665487 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665567 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665721 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmfmb\" (UniqueName: \"kubernetes.io/projected/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-kube-api-access-fmfmb\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665832 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665910 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.665992 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.666045 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.666122 4707 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.670204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-kube-api-access-dx7tz" (OuterVolumeSpecName: "kube-api-access-dx7tz") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "kube-api-access-dx7tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.673501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-scripts" (OuterVolumeSpecName: "scripts") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.674961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.713593 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.716207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.728795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.759728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-config-data" (OuterVolumeSpecName: "config-data") pod "44b894eb-fa59-4d67-aeb6-46c945fe6dd5" (UID: "44b894eb-fa59-4d67-aeb6-46c945fe6dd5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768085 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768121 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx7tz\" (UniqueName: \"kubernetes.io/projected/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-kube-api-access-dx7tz\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768133 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768142 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768151 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768159 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b894eb-fa59-4d67-aeb6-46c945fe6dd5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.768182 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.787257 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.799244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:45:02 crc kubenswrapper[4707]: W0121 15:45:02.826572 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98f30a9f_5d85_4dce_8bc8_03e26c9967c8.slice/crio-5bfe0a98c728382335c1754fa37281a046cad5fd2d125f214ba559ecbab12a29 WatchSource:0}: Error finding container 5bfe0a98c728382335c1754fa37281a046cad5fd2d125f214ba559ecbab12a29: Status 404 returned error can't find the container with id 5bfe0a98c728382335c1754fa37281a046cad5fd2d125f214ba559ecbab12a29 Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.869666 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.915594 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c800162-c9be-4c26-848e-548c5ade1645" containerID="af2df74e23c68b533ce2ad604e55e3b6a44522d5ac5962c11fede1b902ed12e8" exitCode=0 Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.915653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" 
event={"ID":"1c800162-c9be-4c26-848e-548c5ade1645","Type":"ContainerDied","Data":"af2df74e23c68b533ce2ad604e55e3b6a44522d5ac5962c11fede1b902ed12e8"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.922030 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerID="f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd" exitCode=0 Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.922093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f","Type":"ContainerDied","Data":"f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.922117 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.922128 4707 scope.go:117] "RemoveContainer" containerID="f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.922116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ab5cb3-56d6-45f8-a8ae-0cda9871960f","Type":"ContainerDied","Data":"266de8cab4ca60a56fbc833343b6e594c928e02fe4e35b8aafcd90f5a05497d0"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.930176 4707 generic.go:334] "Generic (PLEG): container finished" podID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerID="91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b" exitCode=0 Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.930224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"44b894eb-fa59-4d67-aeb6-46c945fe6dd5","Type":"ContainerDied","Data":"91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.930243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"44b894eb-fa59-4d67-aeb6-46c945fe6dd5","Type":"ContainerDied","Data":"309684819cc803f636817c554c8fd1fd529045c53dc1f1f1a47fa500d987d3a7"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.930301 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.932849 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f191858-2d71-42bb-8227-069401c26dd4" containerID="b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238" exitCode=0 Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.932895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerDied","Data":"b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.933630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"98f30a9f-5d85-4dce-8bc8-03e26c9967c8","Type":"ContainerStarted","Data":"5bfe0a98c728382335c1754fa37281a046cad5fd2d125f214ba559ecbab12a29"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.949500 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jtlgf" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="registry-server" containerID="cri-o://3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1" gracePeriod=2 Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.949894 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerStarted","Data":"002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5"} Jan 21 15:45:02 crc kubenswrapper[4707]: I0121 15:45:02.950009 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.063279 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.269303496 podStartE2EDuration="6.063253712s" podCreationTimestamp="2026-01-21 15:44:57 +0000 UTC" firstStartedPulling="2026-01-21 15:44:58.757173438 +0000 UTC m=+2595.938689660" lastFinishedPulling="2026-01-21 15:45:02.551123654 +0000 UTC m=+2599.732639876" observedRunningTime="2026-01-21 15:45:02.979586397 +0000 UTC m=+2600.161102619" watchObservedRunningTime="2026-01-21 15:45:03.063253712 +0000 UTC m=+2600.244769934" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.075429 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.140996 4707 scope.go:117] "RemoveContainer" containerID="c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.159265 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.170766 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.175132 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-log" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175155 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-log" Jan 21 15:45:03 crc 
kubenswrapper[4707]: E0121 15:45:03.175174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-log" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-log" Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.175190 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-httpd" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175195 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-httpd" Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.175209 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-httpd" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175214 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-httpd" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175390 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-log" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175407 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-log" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175418 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" containerName="glance-httpd" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.175432 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" containerName="glance-httpd" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.176243 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.181531 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.187554 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-lbfbl" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.187582 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.188636 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.188876 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.197774 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b894eb-fa59-4d67-aeb6-46c945fe6dd5" path="/var/lib/kubelet/pods/44b894eb-fa59-4d67-aeb6-46c945fe6dd5/volumes" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.198566 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8" path="/var/lib/kubelet/pods/e4aa782a-8c59-486c-8f6f-a1b38b9e3aa8/volumes" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.199349 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.199446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.204201 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.205665 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.213858 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.221623 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.222203 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.242647 4707 scope.go:117] "RemoveContainer" containerID="f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd" Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.243174 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd\": container with ID starting with f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd not found: ID does not exist" containerID="f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.243366 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd"} err="failed to get container status \"f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd\": rpc error: code = NotFound desc = could not find container \"f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd\": container with ID starting with f986333e76ce23c406291209283a6e16394f62a9a08e0557d820d183c867f0dd not found: ID does not exist" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.243491 4707 scope.go:117] "RemoveContainer" containerID="c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2" Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.245402 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2\": container with ID starting with c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2 not found: ID does not exist" containerID="c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.245544 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2"} err="failed to get container status \"c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2\": rpc error: code = NotFound desc = could not find container \"c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2\": container with ID starting with c99e6d607a9721f8d37c86504ed81b72380e121104633f2f7d37fca9b3d384b2 not found: ID does not exist" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.245633 4707 scope.go:117] "RemoveContainer" containerID="91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.294980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-config-data\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-logs\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwp9\" (UniqueName: \"kubernetes.io/projected/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-kube-api-access-dqwp9\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-scripts\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-logs\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 
15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djn2s\" (UniqueName: \"kubernetes.io/projected/20a1576d-856f-45fb-b210-66fddef6adaf-kube-api-access-djn2s\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.295718 4707 scope.go:117] "RemoveContainer" containerID="bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.326287 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.331045 4707 scope.go:117] "RemoveContainer" containerID="91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b" Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.331278 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b\": container with ID starting with 91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b not found: ID does not exist" containerID="91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.331302 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b"} err="failed to get container status \"91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b\": rpc error: code = NotFound desc = could not find container \"91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b\": container with ID starting with 91a5ad2491adf3281236c8700575439630f3f1ad9487f0c43e98ec93d36aaf2b not found: ID does not exist" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.331318 4707 scope.go:117] "RemoveContainer" containerID="bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54" Jan 21 15:45:03 crc kubenswrapper[4707]: E0121 15:45:03.331737 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54\": container with ID starting with bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54 not found: ID does not exist" containerID="bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.331756 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54"} err="failed to get container status \"bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54\": rpc error: code = NotFound desc = could not find container \"bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54\": container with ID starting with bf19e9358d40eca35302def7508ce847efea309aa25c1584d1246d478b0b4c54 not found: ID does not exist" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.396522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-secret-volume\") pod \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.396600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrxsz\" (UniqueName: \"kubernetes.io/projected/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-kube-api-access-lrxsz\") pod \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.396664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-config-volume\") pod 
\"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\" (UID: \"2bb5a6c6-fba7-4e90-9184-2e18785a63e7\") " Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.396994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-logs\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djn2s\" (UniqueName: \"kubernetes.io/projected/20a1576d-856f-45fb-b210-66fddef6adaf-kube-api-access-djn2s\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397230 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397290 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397440 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-config-volume" (OuterVolumeSpecName: "config-volume") pod "2bb5a6c6-fba7-4e90-9184-2e18785a63e7" (UID: "2bb5a6c6-fba7-4e90-9184-2e18785a63e7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.399097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.399646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.397301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-config-data\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-logs\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwp9\" (UniqueName: \"kubernetes.io/projected/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-kube-api-access-dqwp9\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-scripts\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402336 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.402571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-logs\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.403019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-logs\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.407298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.408388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.408468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.411318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.413382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2bb5a6c6-fba7-4e90-9184-2e18785a63e7" (UID: "2bb5a6c6-fba7-4e90-9184-2e18785a63e7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.414954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-kube-api-access-lrxsz" (OuterVolumeSpecName: "kube-api-access-lrxsz") pod "2bb5a6c6-fba7-4e90-9184-2e18785a63e7" (UID: "2bb5a6c6-fba7-4e90-9184-2e18785a63e7"). InnerVolumeSpecName "kube-api-access-lrxsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.415265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-config-data\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.415506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.415849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.418230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-scripts\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.420649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djn2s\" (UniqueName: \"kubernetes.io/projected/20a1576d-856f-45fb-b210-66fddef6adaf-kube-api-access-djn2s\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.424345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwp9\" (UniqueName: 
\"kubernetes.io/projected/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-kube-api-access-dqwp9\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.446183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.450177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.499738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.503497 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.503531 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrxsz\" (UniqueName: \"kubernetes.io/projected/2bb5a6c6-fba7-4e90-9184-2e18785a63e7-kube-api-access-lrxsz\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.504640 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.544604 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.605073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dprhk\" (UniqueName: \"kubernetes.io/projected/448446bb-e187-45a1-a751-63dbb87dffa2-kube-api-access-dprhk\") pod \"448446bb-e187-45a1-a751-63dbb87dffa2\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.605179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-utilities\") pod \"448446bb-e187-45a1-a751-63dbb87dffa2\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.605222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-catalog-content\") pod \"448446bb-e187-45a1-a751-63dbb87dffa2\" (UID: \"448446bb-e187-45a1-a751-63dbb87dffa2\") " Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.611423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-utilities" (OuterVolumeSpecName: "utilities") pod "448446bb-e187-45a1-a751-63dbb87dffa2" (UID: "448446bb-e187-45a1-a751-63dbb87dffa2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.616132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/448446bb-e187-45a1-a751-63dbb87dffa2-kube-api-access-dprhk" (OuterVolumeSpecName: "kube-api-access-dprhk") pod "448446bb-e187-45a1-a751-63dbb87dffa2" (UID: "448446bb-e187-45a1-a751-63dbb87dffa2"). InnerVolumeSpecName "kube-api-access-dprhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.623635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "448446bb-e187-45a1-a751-63dbb87dffa2" (UID: "448446bb-e187-45a1-a751-63dbb87dffa2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.706983 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dprhk\" (UniqueName: \"kubernetes.io/projected/448446bb-e187-45a1-a751-63dbb87dffa2-kube-api-access-dprhk\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.707229 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.707240 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/448446bb-e187-45a1-a751-63dbb87dffa2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.820554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.963098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" event={"ID":"2bb5a6c6-fba7-4e90-9184-2e18785a63e7","Type":"ContainerDied","Data":"c4d30d8f7918eba08073d5f0929c20d85be441a0badd374a94c9bbcb1190a41e"} Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.963134 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d30d8f7918eba08073d5f0929c20d85be441a0badd374a94c9bbcb1190a41e" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.963113 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.965875 4707 generic.go:334] "Generic (PLEG): container finished" podID="72c825b0-9f75-45ef-ab7c-5e38af70a8d2" containerID="38b41abf3aac941b3174f1c8a5eeb7b89e9817d932cce60bb06f025c8037461f" exitCode=0 Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.965938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" event={"ID":"72c825b0-9f75-45ef-ab7c-5e38af70a8d2","Type":"ContainerDied","Data":"38b41abf3aac941b3174f1c8a5eeb7b89e9817d932cce60bb06f025c8037461f"} Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.974584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"98f30a9f-5d85-4dce-8bc8-03e26c9967c8","Type":"ContainerStarted","Data":"3f05e66f94e563650c321336e37a8530429e8dfd49776a8178131469087deaaf"} Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.995618 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.997494 4707 generic.go:334] "Generic (PLEG): container finished" podID="448446bb-e187-45a1-a751-63dbb87dffa2" containerID="3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1" exitCode=0 Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.998113 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtlgf" Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.998492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtlgf" event={"ID":"448446bb-e187-45a1-a751-63dbb87dffa2","Type":"ContainerDied","Data":"3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1"} Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.998526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtlgf" event={"ID":"448446bb-e187-45a1-a751-63dbb87dffa2","Type":"ContainerDied","Data":"e9db4b346dfa9e61545a500d1deac4a5e36544871d3964bd39a24168961a1d1d"} Jan 21 15:45:03 crc kubenswrapper[4707]: I0121 15:45:03.998546 4707 scope.go:117] "RemoveContainer" containerID="3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.039728 4707 scope.go:117] "RemoveContainer" containerID="e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6" Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.065401 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.073895 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.074318 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:45:04 crc kubenswrapper[4707]: W0121 15:45:04.080087 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a1576d_856f_45fb_b210_66fddef6adaf.slice/crio-5695272d4968e4a17ccd13580218505fa359b47e03a18fb5753386526886aad7 WatchSource:0}: Error finding container 5695272d4968e4a17ccd13580218505fa359b47e03a18fb5753386526886aad7: Status 404 returned error can't find the container with id 5695272d4968e4a17ccd13580218505fa359b47e03a18fb5753386526886aad7 Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.080843 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.080887 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f9aba8f8-91db-426a-aa87-96064fe8974c" containerName="nova-cell0-conductor-conductor" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.094865 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtlgf"] Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.101767 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtlgf"] Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.104551 4707 scope.go:117] "RemoveContainer" containerID="8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.191938 4707 scope.go:117] "RemoveContainer" containerID="3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1" Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.202038 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1\": container with ID starting with 3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1 not found: ID does not exist" containerID="3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.202082 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1"} err="failed to get container status \"3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1\": rpc error: code = NotFound desc = could not find container \"3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1\": container with ID starting with 3fc8e20f68c42f9a1ccc3912bcdfe55a4e4a33dba8cc6038457bb732db21a0c1 not found: ID does not exist" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.202104 4707 scope.go:117] "RemoveContainer" containerID="e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6" Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.203891 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6\": container with ID starting with e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6 not found: ID does not exist" containerID="e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.203919 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6"} err="failed to get container status \"e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6\": rpc error: code = NotFound desc = could not find container \"e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6\": container with ID starting with e7dc29af4ccfaa97fe4926bea0ae7c1778f978b4f3909e52f79dd8f91d2fc8b6 not found: ID does not exist" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.203934 4707 scope.go:117] "RemoveContainer" containerID="8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd" Jan 21 15:45:04 crc kubenswrapper[4707]: E0121 15:45:04.206209 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd\": container with ID starting with 8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd not found: ID does not exist" containerID="8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.206290 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd"} err="failed to get container status \"8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd\": rpc error: code = NotFound desc = could not find container \"8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd\": container with ID starting with 8527e0b119b476ff266fb54d2c821261dc2881cdf0a401e05eb1d410b32a17bd not found: ID does not exist" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.374604 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.398664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm"] Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.410200 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-rgbtm"] Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.536077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvt5w\" (UniqueName: \"kubernetes.io/projected/1c800162-c9be-4c26-848e-548c5ade1645-kube-api-access-mvt5w\") pod \"1c800162-c9be-4c26-848e-548c5ade1645\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.536133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-combined-ca-bundle\") pod \"1c800162-c9be-4c26-848e-548c5ade1645\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.536265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-config-data\") pod \"1c800162-c9be-4c26-848e-548c5ade1645\" (UID: \"1c800162-c9be-4c26-848e-548c5ade1645\") " Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.545819 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c800162-c9be-4c26-848e-548c5ade1645-kube-api-access-mvt5w" (OuterVolumeSpecName: "kube-api-access-mvt5w") pod "1c800162-c9be-4c26-848e-548c5ade1645" (UID: "1c800162-c9be-4c26-848e-548c5ade1645"). InnerVolumeSpecName "kube-api-access-mvt5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.562758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c800162-c9be-4c26-848e-548c5ade1645" (UID: "1c800162-c9be-4c26-848e-548c5ade1645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.584149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-config-data" (OuterVolumeSpecName: "config-data") pod "1c800162-c9be-4c26-848e-548c5ade1645" (UID: "1c800162-c9be-4c26-848e-548c5ade1645"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.639742 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvt5w\" (UniqueName: \"kubernetes.io/projected/1c800162-c9be-4c26-848e-548c5ade1645-kube-api-access-mvt5w\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.639772 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:04 crc kubenswrapper[4707]: I0121 15:45:04.639782 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c800162-c9be-4c26-848e-548c5ade1645-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.013414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20a1576d-856f-45fb-b210-66fddef6adaf","Type":"ContainerStarted","Data":"5ca8cf0f460517bb6eb2f9678b1116274be8a382c22293f7320425480d1c7e94"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.013757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20a1576d-856f-45fb-b210-66fddef6adaf","Type":"ContainerStarted","Data":"5695272d4968e4a17ccd13580218505fa359b47e03a18fb5753386526886aad7"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.015921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c","Type":"ContainerStarted","Data":"7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.015947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c","Type":"ContainerStarted","Data":"a9175c7f34373daee7f6c69a72abd543b4168a6c57c2682ae2953e531f68b4f8"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.018085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"98f30a9f-5d85-4dce-8bc8-03e26c9967c8","Type":"ContainerStarted","Data":"3f5a0d130fa4b31023beed8d473c235ed286cc7b040259cf77ab6da89f38e456"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.019496 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.024501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" event={"ID":"1c800162-c9be-4c26-848e-548c5ade1645","Type":"ContainerDied","Data":"b3819f15bfa92dacfc9efa832f6bba289ab15504086d0c4da1ef4db7b3dfdb57"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.024525 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3819f15bfa92dacfc9efa832f6bba289ab15504086d0c4da1ef4db7b3dfdb57" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.024506 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-92nzp" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.026443 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/0.log" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.026489 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerID="ead8b354d29ca6da319ae2b1b30a2002a1bbe77cdf13011c3fb2f37808e1c514" exitCode=1 Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.026566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerDied","Data":"ead8b354d29ca6da319ae2b1b30a2002a1bbe77cdf13011c3fb2f37808e1c514"} Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.027111 4707 scope.go:117] "RemoveContainer" containerID="ead8b354d29ca6da319ae2b1b30a2002a1bbe77cdf13011c3fb2f37808e1c514" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.040690 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=4.040677608 podStartE2EDuration="4.040677608s" podCreationTimestamp="2026-01-21 15:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:05.040090324 +0000 UTC m=+2602.221606546" watchObservedRunningTime="2026-01-21 15:45:05.040677608 +0000 UTC m=+2602.222193840" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.044736 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.089388 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7xxw2"] Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.099646 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7xxw2"] Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.199402 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28effb5a-773a-4084-8b1f-719eb7e4ea6f" path="/var/lib/kubelet/pods/28effb5a-773a-4084-8b1f-719eb7e4ea6f/volumes" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.200135 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" path="/var/lib/kubelet/pods/448446bb-e187-45a1-a751-63dbb87dffa2/volumes" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.202393 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5537fff0-7562-4e93-8db8-d13d18966e3d" path="/var/lib/kubelet/pods/5537fff0-7562-4e93-8db8-d13d18966e3d/volumes" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.206386 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ab5cb3-56d6-45f8-a8ae-0cda9871960f" path="/var/lib/kubelet/pods/a5ab5cb3-56d6-45f8-a8ae-0cda9871960f/volumes" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.207994 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-c5n5h"] Jan 21 15:45:05 crc kubenswrapper[4707]: E0121 15:45:05.208316 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="registry-server" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208337 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="registry-server" Jan 21 15:45:05 crc kubenswrapper[4707]: E0121 15:45:05.208348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c800162-c9be-4c26-848e-548c5ade1645" containerName="keystone-db-sync" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208355 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c800162-c9be-4c26-848e-548c5ade1645" containerName="keystone-db-sync" Jan 21 15:45:05 crc kubenswrapper[4707]: E0121 15:45:05.208371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="extract-content" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208377 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="extract-content" Jan 21 15:45:05 crc kubenswrapper[4707]: E0121 15:45:05.208386 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb5a6c6-fba7-4e90-9184-2e18785a63e7" containerName="collect-profiles" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb5a6c6-fba7-4e90-9184-2e18785a63e7" containerName="collect-profiles" Jan 21 15:45:05 crc kubenswrapper[4707]: E0121 15:45:05.208408 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="extract-utilities" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208415 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="extract-utilities" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208665 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="448446bb-e187-45a1-a751-63dbb87dffa2" containerName="registry-server" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208689 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c800162-c9be-4c26-848e-548c5ade1645" containerName="keystone-db-sync" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.208698 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb5a6c6-fba7-4e90-9184-2e18785a63e7" containerName="collect-profiles" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.209298 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-c5n5h"] Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.209348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.209406 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.212023 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.280872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-d7847c667-bblg7"] Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.281093 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api-log" containerID="cri-o://4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e" gracePeriod=30 Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.281482 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api" containerID="cri-o://8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec" gracePeriod=30 Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.357535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-scripts\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.357640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-combined-ca-bundle\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.357688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-config-data\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.357709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-credential-keys\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.357733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-fernet-keys\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.357763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmzq\" (UniqueName: \"kubernetes.io/projected/df6c2927-0704-455f-be1a-1c421df8dfc6-kube-api-access-5qmzq\") pod \"keystone-bootstrap-c5n5h\" (UID: 
\"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.441925 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.458829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-scripts\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.459159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-combined-ca-bundle\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.459202 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-config-data\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.459220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-credential-keys\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.459242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-fernet-keys\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.459267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qmzq\" (UniqueName: \"kubernetes.io/projected/df6c2927-0704-455f-be1a-1c421df8dfc6-kube-api-access-5qmzq\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.464070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-fernet-keys\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.464093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-combined-ca-bundle\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.464180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-credential-keys\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.464570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-config-data\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.464772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-scripts\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.473304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmzq\" (UniqueName: \"kubernetes.io/projected/df6c2927-0704-455f-be1a-1c421df8dfc6-kube-api-access-5qmzq\") pod \"keystone-bootstrap-c5n5h\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.535113 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.560799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-combined-ca-bundle\") pod \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.560873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrh6p\" (UniqueName: \"kubernetes.io/projected/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-kube-api-access-rrh6p\") pod \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.560925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-scripts\") pod \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.560957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-config-data\") pod \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.560992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-logs\") pod \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\" (UID: \"72c825b0-9f75-45ef-ab7c-5e38af70a8d2\") " Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.561889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-logs" 
(OuterVolumeSpecName: "logs") pod "72c825b0-9f75-45ef-ab7c-5e38af70a8d2" (UID: "72c825b0-9f75-45ef-ab7c-5e38af70a8d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.564467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-kube-api-access-rrh6p" (OuterVolumeSpecName: "kube-api-access-rrh6p") pod "72c825b0-9f75-45ef-ab7c-5e38af70a8d2" (UID: "72c825b0-9f75-45ef-ab7c-5e38af70a8d2"). InnerVolumeSpecName "kube-api-access-rrh6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.564766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-scripts" (OuterVolumeSpecName: "scripts") pod "72c825b0-9f75-45ef-ab7c-5e38af70a8d2" (UID: "72c825b0-9f75-45ef-ab7c-5e38af70a8d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.584117 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-config-data" (OuterVolumeSpecName: "config-data") pod "72c825b0-9f75-45ef-ab7c-5e38af70a8d2" (UID: "72c825b0-9f75-45ef-ab7c-5e38af70a8d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.587580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72c825b0-9f75-45ef-ab7c-5e38af70a8d2" (UID: "72c825b0-9f75-45ef-ab7c-5e38af70a8d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.663095 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.663126 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.663140 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.663148 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.663162 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrh6p\" (UniqueName: \"kubernetes.io/projected/72c825b0-9f75-45ef-ab7c-5e38af70a8d2-kube-api-access-rrh6p\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4707]: I0121 15:45:05.927519 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-c5n5h"] Jan 21 15:45:05 crc kubenswrapper[4707]: W0121 15:45:05.934226 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6c2927_0704_455f_be1a_1c421df8dfc6.slice/crio-cdb017dd7709e37d06d7046b9f7253ad25407735601e60d0a62039f510422818 WatchSource:0}: Error finding container cdb017dd7709e37d06d7046b9f7253ad25407735601e60d0a62039f510422818: Status 404 returned error can't find the container with id cdb017dd7709e37d06d7046b9f7253ad25407735601e60d0a62039f510422818 Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.036242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" event={"ID":"df6c2927-0704-455f-be1a-1c421df8dfc6","Type":"ContainerStarted","Data":"cdb017dd7709e37d06d7046b9f7253ad25407735601e60d0a62039f510422818"} Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.038766 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/0.log" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.038919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerStarted","Data":"564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8"} Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.043597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20a1576d-856f-45fb-b210-66fddef6adaf","Type":"ContainerStarted","Data":"fb76672b8b108d21f1c4f38e5e76864b3f3b4e2030e02b6e06d10dafbc199a9b"} Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.044558 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.047189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c","Type":"ContainerStarted","Data":"4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b"} Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.048750 4707 generic.go:334] "Generic (PLEG): container finished" podID="3952bfb4-0398-438f-a0c1-7289b672313e" containerID="4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e" exitCode=143 Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.048842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" event={"ID":"3952bfb4-0398-438f-a0c1-7289b672313e","Type":"ContainerDied","Data":"4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e"} Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.050420 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.051874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-x6jxn" event={"ID":"72c825b0-9f75-45ef-ab7c-5e38af70a8d2","Type":"ContainerDied","Data":"82da53a0720e2dac9961ac96ab3f6e9a14c9521536d8eea43fedb6de3eb006e7"} Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.052614 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82da53a0720e2dac9961ac96ab3f6e9a14c9521536d8eea43fedb6de3eb006e7" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.069484 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.069466672 podStartE2EDuration="3.069466672s" podCreationTimestamp="2026-01-21 15:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:06.06747169 +0000 UTC m=+2603.248987912" watchObservedRunningTime="2026-01-21 15:45:06.069466672 +0000 UTC m=+2603.250982893" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.122146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.122124601 podStartE2EDuration="3.122124601s" podCreationTimestamp="2026-01-21 15:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:06.092421201 +0000 UTC m=+2603.273937423" watchObservedRunningTime="2026-01-21 15:45:06.122124601 +0000 UTC m=+2603.303640822" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.139857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6fc86fc9f6-cd66l"] Jan 21 15:45:06 crc kubenswrapper[4707]: E0121 15:45:06.142777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c825b0-9f75-45ef-ab7c-5e38af70a8d2" containerName="placement-db-sync" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.142800 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c825b0-9f75-45ef-ab7c-5e38af70a8d2" containerName="placement-db-sync" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.142989 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="72c825b0-9f75-45ef-ab7c-5e38af70a8d2" containerName="placement-db-sync" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.143944 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.153447 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6fc86fc9f6-cd66l"] Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.282155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-scripts\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.282826 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-public-tls-certs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.282934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37486029-cbe4-4198-afb9-962c7d0728e5-logs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.283029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-combined-ca-bundle\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.283151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-config-data\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.283254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-internal-tls-certs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.283451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgkmx\" (UniqueName: \"kubernetes.io/projected/37486029-cbe4-4198-afb9-962c7d0728e5-kube-api-access-tgkmx\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.385637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgkmx\" (UniqueName: 
\"kubernetes.io/projected/37486029-cbe4-4198-afb9-962c7d0728e5-kube-api-access-tgkmx\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.385901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-scripts\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.386075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-public-tls-certs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.386180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37486029-cbe4-4198-afb9-962c7d0728e5-logs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.386282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-combined-ca-bundle\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.386473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-config-data\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.386564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-internal-tls-certs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.387333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37486029-cbe4-4198-afb9-962c7d0728e5-logs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.389470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-combined-ca-bundle\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.397263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-public-tls-certs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.399162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-internal-tls-certs\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.407498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-scripts\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.408642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-config-data\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.423309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgkmx\" (UniqueName: \"kubernetes.io/projected/37486029-cbe4-4198-afb9-962c7d0728e5-kube-api-access-tgkmx\") pod \"placement-6fc86fc9f6-cd66l\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.503442 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.632421 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.632967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.640587 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.695641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfhqm\" (UniqueName: \"kubernetes.io/projected/f9aba8f8-91db-426a-aa87-96064fe8974c-kube-api-access-mfhqm\") pod \"f9aba8f8-91db-426a-aa87-96064fe8974c\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.695908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-combined-ca-bundle\") pod \"f9aba8f8-91db-426a-aa87-96064fe8974c\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.695956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-config-data\") pod \"f9aba8f8-91db-426a-aa87-96064fe8974c\" (UID: \"f9aba8f8-91db-426a-aa87-96064fe8974c\") " Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.702036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9aba8f8-91db-426a-aa87-96064fe8974c-kube-api-access-mfhqm" (OuterVolumeSpecName: "kube-api-access-mfhqm") pod "f9aba8f8-91db-426a-aa87-96064fe8974c" (UID: "f9aba8f8-91db-426a-aa87-96064fe8974c"). InnerVolumeSpecName "kube-api-access-mfhqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.725355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9aba8f8-91db-426a-aa87-96064fe8974c" (UID: "f9aba8f8-91db-426a-aa87-96064fe8974c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.729988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-config-data" (OuterVolumeSpecName: "config-data") pod "f9aba8f8-91db-426a-aa87-96064fe8974c" (UID: "f9aba8f8-91db-426a-aa87-96064fe8974c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.799238 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.799283 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9aba8f8-91db-426a-aa87-96064fe8974c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.799296 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfhqm\" (UniqueName: \"kubernetes.io/projected/f9aba8f8-91db-426a-aa87-96064fe8974c-kube-api-access-mfhqm\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:06 crc kubenswrapper[4707]: I0121 15:45:06.980907 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6fc86fc9f6-cd66l"] Jan 21 15:45:06 crc kubenswrapper[4707]: W0121 15:45:06.983196 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37486029_cbe4_4198_afb9_962c7d0728e5.slice/crio-b079661c01037998a40176799988d931e7d77014dbee9b07236796655f600cf9 WatchSource:0}: Error finding container b079661c01037998a40176799988d931e7d77014dbee9b07236796655f600cf9: Status 404 returned error can't find the container with id b079661c01037998a40176799988d931e7d77014dbee9b07236796655f600cf9 Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.057396 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9aba8f8-91db-426a-aa87-96064fe8974c" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" exitCode=0 Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.057451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f9aba8f8-91db-426a-aa87-96064fe8974c","Type":"ContainerDied","Data":"ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d"} Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.057474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f9aba8f8-91db-426a-aa87-96064fe8974c","Type":"ContainerDied","Data":"ebdbd211c858e4eb3888352c667f305ee2f70eec3c12055dbcbbc8efebd65a7b"} Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.057491 4707 scope.go:117] "RemoveContainer" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.057579 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.066749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" event={"ID":"37486029-cbe4-4198-afb9-962c7d0728e5","Type":"ContainerStarted","Data":"b079661c01037998a40176799988d931e7d77014dbee9b07236796655f600cf9"} Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.069563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" event={"ID":"df6c2927-0704-455f-be1a-1c421df8dfc6","Type":"ContainerStarted","Data":"c20bf1554c7dabcb98893e2a58c0be1968387065a014f5ffa800a39e6210d8a9"} Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.129702 4707 scope.go:117] "RemoveContainer" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" Jan 21 15:45:07 crc kubenswrapper[4707]: E0121 15:45:07.130094 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d\": container with ID starting with ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d not found: ID does not exist" containerID="ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.130120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d"} err="failed to get container status \"ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d\": rpc error: code = NotFound desc = could not find container \"ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d\": container with ID starting with ff23097e7f411f19a393011602ff2483ff6bae711a5e7cfb0c82dcbb4f7a336d not found: ID does not exist" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.155757 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" podStartSLOduration=2.155745688 podStartE2EDuration="2.155745688s" podCreationTimestamp="2026-01-21 15:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:07.086597253 +0000 UTC m=+2604.268113475" watchObservedRunningTime="2026-01-21 15:45:07.155745688 +0000 UTC m=+2604.337261910" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.158227 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.166153 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.182336 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:45:07 crc kubenswrapper[4707]: E0121 15:45:07.182723 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9aba8f8-91db-426a-aa87-96064fe8974c" containerName="nova-cell0-conductor-conductor" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.182735 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9aba8f8-91db-426a-aa87-96064fe8974c" containerName="nova-cell0-conductor-conductor" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.182954 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f9aba8f8-91db-426a-aa87-96064fe8974c" containerName="nova-cell0-conductor-conductor" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.183689 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.184957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.200501 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9aba8f8-91db-426a-aa87-96064fe8974c" path="/var/lib/kubelet/pods/f9aba8f8-91db-426a-aa87-96064fe8974c/volumes" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.200969 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.323868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.323961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.323996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzch\" (UniqueName: \"kubernetes.io/projected/981e49ca-880c-4002-8c7d-d7bdfb867b38-kube-api-access-5xzch\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.427420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.427495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.427529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzch\" (UniqueName: \"kubernetes.io/projected/981e49ca-880c-4002-8c7d-d7bdfb867b38-kube-api-access-5xzch\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.430927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.430980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.441396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzch\" (UniqueName: \"kubernetes.io/projected/981e49ca-880c-4002-8c7d-d7bdfb867b38-kube-api-access-5xzch\") pod \"nova-cell0-conductor-0\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.513629 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.640936 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.640954 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:45:07 crc kubenswrapper[4707]: I0121 15:45:07.904240 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.078478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" event={"ID":"37486029-cbe4-4198-afb9-962c7d0728e5","Type":"ContainerStarted","Data":"a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a"} Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.078745 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.078761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" event={"ID":"37486029-cbe4-4198-afb9-962c7d0728e5","Type":"ContainerStarted","Data":"6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730"} Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.083720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"981e49ca-880c-4002-8c7d-d7bdfb867b38","Type":"ContainerStarted","Data":"dfcd00e11c4d0433fa251d2578cc79fae120b067d19aa27a2670c158694d6f47"} Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.108321 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" podStartSLOduration=2.10830459 podStartE2EDuration="2.10830459s" podCreationTimestamp="2026-01-21 15:45:06 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:08.097408313 +0000 UTC m=+2605.278924525" watchObservedRunningTime="2026-01-21 15:45:08.10830459 +0000 UTC m=+2605.289820813" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.772772 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-internal-tls-certs\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-public-tls-certs\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3952bfb4-0398-438f-a0c1-7289b672313e-logs\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phdgf\" (UniqueName: \"kubernetes.io/projected/3952bfb4-0398-438f-a0c1-7289b672313e-kube-api-access-phdgf\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data-custom\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.851935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-combined-ca-bundle\") pod \"3952bfb4-0398-438f-a0c1-7289b672313e\" (UID: \"3952bfb4-0398-438f-a0c1-7289b672313e\") " Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.852745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3952bfb4-0398-438f-a0c1-7289b672313e-logs" (OuterVolumeSpecName: "logs") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.857914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3952bfb4-0398-438f-a0c1-7289b672313e-kube-api-access-phdgf" (OuterVolumeSpecName: "kube-api-access-phdgf") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "kube-api-access-phdgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.858017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.872520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.895426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.904075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data" (OuterVolumeSpecName: "config-data") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.905741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3952bfb4-0398-438f-a0c1-7289b672313e" (UID: "3952bfb4-0398-438f-a0c1-7289b672313e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954237 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954263 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954282 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3952bfb4-0398-438f-a0c1-7289b672313e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954293 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phdgf\" (UniqueName: \"kubernetes.io/projected/3952bfb4-0398-438f-a0c1-7289b672313e-kube-api-access-phdgf\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954304 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954313 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:08 crc kubenswrapper[4707]: I0121 15:45:08.954322 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3952bfb4-0398-438f-a0c1-7289b672313e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.096360 4707 generic.go:334] "Generic (PLEG): container finished" podID="3952bfb4-0398-438f-a0c1-7289b672313e" containerID="8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec" exitCode=0 Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.096410 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.096448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" event={"ID":"3952bfb4-0398-438f-a0c1-7289b672313e","Type":"ContainerDied","Data":"8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec"} Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.096495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d7847c667-bblg7" event={"ID":"3952bfb4-0398-438f-a0c1-7289b672313e","Type":"ContainerDied","Data":"ec91e1122c7d2378896e6a9e12e8dc5f62ebbd926e026cd3e856358b238a267e"} Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.096512 4707 scope.go:117] "RemoveContainer" containerID="8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.099752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"981e49ca-880c-4002-8c7d-d7bdfb867b38","Type":"ContainerStarted","Data":"42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff"} Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.100705 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.102867 4707 generic.go:334] "Generic (PLEG): container finished" podID="df6c2927-0704-455f-be1a-1c421df8dfc6" containerID="c20bf1554c7dabcb98893e2a58c0be1968387065a014f5ffa800a39e6210d8a9" exitCode=0 Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.103357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" event={"ID":"df6c2927-0704-455f-be1a-1c421df8dfc6","Type":"ContainerDied","Data":"c20bf1554c7dabcb98893e2a58c0be1968387065a014f5ffa800a39e6210d8a9"} Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.103422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.117334 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.11732274 podStartE2EDuration="2.11732274s" podCreationTimestamp="2026-01-21 15:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:09.110106812 +0000 UTC m=+2606.291623033" watchObservedRunningTime="2026-01-21 15:45:09.11732274 +0000 UTC m=+2606.298838961" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.125437 4707 scope.go:117] "RemoveContainer" containerID="4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.143219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-d7847c667-bblg7"] Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.145588 4707 scope.go:117] "RemoveContainer" containerID="8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec" Jan 21 15:45:09 crc kubenswrapper[4707]: E0121 15:45:09.146022 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec\": container 
with ID starting with 8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec not found: ID does not exist" containerID="8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.146048 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec"} err="failed to get container status \"8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec\": rpc error: code = NotFound desc = could not find container \"8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec\": container with ID starting with 8b68154efea67d1e3cc4c3a747e3f4b364bbf33a7b8b310fe433af89223cd7ec not found: ID does not exist" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.146062 4707 scope.go:117] "RemoveContainer" containerID="4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e" Jan 21 15:45:09 crc kubenswrapper[4707]: E0121 15:45:09.146553 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e\": container with ID starting with 4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e not found: ID does not exist" containerID="4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.146574 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e"} err="failed to get container status \"4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e\": rpc error: code = NotFound desc = could not find container \"4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e\": container with ID starting with 4b82e8ccd26317de0e1f6265e70a896a00dd1ed982ba1f4f5fc0be5f7e2c770e not found: ID does not exist" Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.152340 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-d7847c667-bblg7"] Jan 21 15:45:09 crc kubenswrapper[4707]: I0121 15:45:09.190992 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" path="/var/lib/kubelet/pods/3952bfb4-0398-438f-a0c1-7289b672313e/volumes" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.071434 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.118099 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerID="51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107" exitCode=137 Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.118158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107"} Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.118176 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.118183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"fb2783e9-b8b1-4899-9a36-af57a12d56ba","Type":"ContainerDied","Data":"0361816401c49ef16e8e70e869d82aea6654f338d15f76c9482638c5dd05b70f"} Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.118198 4707 scope.go:117] "RemoveContainer" containerID="51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.120119 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/1.log" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.120742 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/0.log" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.120773 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerID="564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8" exitCode=1 Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.120909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerDied","Data":"564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8"} Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.121543 4707 scope.go:117] "RemoveContainer" containerID="564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.121878 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-api pod=neutron-68556f8ccd-sx4vz_openstack-kuttl-tests(5f587396-a3bb-4add-bb87-d1bacd7e763e)\"" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.125340 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.156004 4707 scope.go:117] "RemoveContainer" containerID="3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.175203 4707 scope.go:117] "RemoveContainer" containerID="ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.181255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-lock\") pod \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.181445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-cache\") pod \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\" (UID: 
\"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.181475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.181516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") pod \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.181536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nww2v\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-kube-api-access-nww2v\") pod \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\" (UID: \"fb2783e9-b8b1-4899-9a36-af57a12d56ba\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.181763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-lock" (OuterVolumeSpecName: "lock") pod "fb2783e9-b8b1-4899-9a36-af57a12d56ba" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.182396 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.182715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-cache" (OuterVolumeSpecName: "cache") pod "fb2783e9-b8b1-4899-9a36-af57a12d56ba" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.185737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-kube-api-access-nww2v" (OuterVolumeSpecName: "kube-api-access-nww2v") pod "fb2783e9-b8b1-4899-9a36-af57a12d56ba" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba"). InnerVolumeSpecName "kube-api-access-nww2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.185964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "fb2783e9-b8b1-4899-9a36-af57a12d56ba" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.195197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fb2783e9-b8b1-4899-9a36-af57a12d56ba" (UID: "fb2783e9-b8b1-4899-9a36-af57a12d56ba"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.195440 4707 scope.go:117] "RemoveContainer" containerID="2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.287259 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb2783e9-b8b1-4899-9a36-af57a12d56ba-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.287303 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.287314 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.287323 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nww2v\" (UniqueName: \"kubernetes.io/projected/fb2783e9-b8b1-4899-9a36-af57a12d56ba-kube-api-access-nww2v\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.287395 4707 scope.go:117] "RemoveContainer" containerID="fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.301920 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.304356 4707 scope.go:117] "RemoveContainer" containerID="764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.320410 4707 scope.go:117] "RemoveContainer" containerID="9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.336131 4707 scope.go:117] "RemoveContainer" containerID="87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.373599 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.380755 4707 scope.go:117] "RemoveContainer" containerID="0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.388878 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.425465 4707 scope.go:117] "RemoveContainer" containerID="f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.448322 4707 scope.go:117] "RemoveContainer" containerID="5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.460476 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.468102 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484062 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484414 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484431 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484444 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484450 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-updater" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484467 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-updater" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484476 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-server" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484481 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-server" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484492 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484498 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484510 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-replicator" Jan 21 
15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484516 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484523 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484528 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484543 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api-log" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484549 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api-log" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484560 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484567 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-expirer" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484581 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-expirer" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484591 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-server" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-server" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484602 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-server" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484607 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-server" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-reaper" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-reaper" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484630 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-updater" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484635 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-updater" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6c2927-0704-455f-be1a-1c421df8dfc6" 
containerName="keystone-bootstrap" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484647 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6c2927-0704-455f-be1a-1c421df8dfc6" containerName="keystone-bootstrap" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484660 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="swift-recon-cron" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484666 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="swift-recon-cron" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484674 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484679 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.484687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="rsync" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="rsync" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484861 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api-log" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484875 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-server" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484883 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3952bfb4-0398-438f-a0c1-7289b672313e" containerName="barbican-api" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484890 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="rsync" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="swift-recon-cron" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484908 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-server" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484918 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484926 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484932 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6c2927-0704-455f-be1a-1c421df8dfc6" containerName="keystone-bootstrap" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484939 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-updater" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484947 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="container-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484957 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484965 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-reaper" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484972 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-auditor" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484983 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-expirer" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.484990 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="account-replicator" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.485000 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-server" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.485006 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" containerName="object-updater" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.485179 4707 scope.go:117] "RemoveContainer" containerID="0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.490759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-config-data\") pod \"df6c2927-0704-455f-be1a-1c421df8dfc6\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.490799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-fernet-keys\") pod \"df6c2927-0704-455f-be1a-1c421df8dfc6\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.490856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-scripts\") pod \"df6c2927-0704-455f-be1a-1c421df8dfc6\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.490928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-combined-ca-bundle\") pod \"df6c2927-0704-455f-be1a-1c421df8dfc6\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.490963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qmzq\" (UniqueName: \"kubernetes.io/projected/df6c2927-0704-455f-be1a-1c421df8dfc6-kube-api-access-5qmzq\") pod \"df6c2927-0704-455f-be1a-1c421df8dfc6\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.491010 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-credential-keys\") pod \"df6c2927-0704-455f-be1a-1c421df8dfc6\" (UID: \"df6c2927-0704-455f-be1a-1c421df8dfc6\") " Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.495540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "df6c2927-0704-455f-be1a-1c421df8dfc6" (UID: "df6c2927-0704-455f-be1a-1c421df8dfc6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.496558 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "df6c2927-0704-455f-be1a-1c421df8dfc6" (UID: "df6c2927-0704-455f-be1a-1c421df8dfc6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.497849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6c2927-0704-455f-be1a-1c421df8dfc6-kube-api-access-5qmzq" (OuterVolumeSpecName: "kube-api-access-5qmzq") pod "df6c2927-0704-455f-be1a-1c421df8dfc6" (UID: "df6c2927-0704-455f-be1a-1c421df8dfc6"). InnerVolumeSpecName "kube-api-access-5qmzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.505486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-scripts" (OuterVolumeSpecName: "scripts") pod "df6c2927-0704-455f-be1a-1c421df8dfc6" (UID: "df6c2927-0704-455f-be1a-1c421df8dfc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.513040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df6c2927-0704-455f-be1a-1c421df8dfc6" (UID: "df6c2927-0704-455f-be1a-1c421df8dfc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.515299 4707 scope.go:117] "RemoveContainer" containerID="96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.519282 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.519467 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.521721 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.525120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-config-data" (OuterVolumeSpecName: "config-data") pod "df6c2927-0704-455f-be1a-1c421df8dfc6" (UID: "df6c2927-0704-455f-be1a-1c421df8dfc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.534228 4707 scope.go:117] "RemoveContainer" containerID="cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.552336 4707 scope.go:117] "RemoveContainer" containerID="e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.567215 4707 scope.go:117] "RemoveContainer" containerID="51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.567549 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107\": container with ID starting with 51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107 not found: ID does not exist" containerID="51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.567597 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107"} err="failed to get container status \"51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107\": rpc error: code = NotFound desc = could not find container \"51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107\": container with ID starting with 51ac197802a5c0922295e6abcd2515ad85937156c5a25cf1fa3d83be0acc4107 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.567617 4707 scope.go:117] "RemoveContainer" containerID="3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.567932 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25\": container with ID starting with 3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25 not found: ID does not exist" containerID="3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.567971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25"} err="failed to get container status \"3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25\": rpc error: code = NotFound desc = could not find container \"3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25\": container with ID starting with 3e9e54f5d4d1ae146074e3c24ddb9e4e464136f464f4a60af52930c060d36b25 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.568012 4707 scope.go:117] "RemoveContainer" containerID="ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.568356 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838\": container with ID starting with ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838 not found: ID does not exist" containerID="ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838" Jan 21 15:45:10 crc 
kubenswrapper[4707]: I0121 15:45:10.568382 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838"} err="failed to get container status \"ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838\": rpc error: code = NotFound desc = could not find container \"ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838\": container with ID starting with ed0f2d3ba333db873de8c161fef85929a174d0eff8bd5901b5a4e8f0963e1838 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.568416 4707 scope.go:117] "RemoveContainer" containerID="2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.568684 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6\": container with ID starting with 2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6 not found: ID does not exist" containerID="2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.568711 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6"} err="failed to get container status \"2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6\": rpc error: code = NotFound desc = could not find container \"2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6\": container with ID starting with 2a8284f7d3deb89aa91c2598a24f58435f26fbcb0985bf4e38deeec5daae9ef6 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.568726 4707 scope.go:117] "RemoveContainer" containerID="fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.568937 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad\": container with ID starting with fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad not found: ID does not exist" containerID="fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.568979 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad"} err="failed to get container status \"fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad\": rpc error: code = NotFound desc = could not find container \"fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad\": container with ID starting with fd21d71030ee84e7aef7b6ea669ae230fa6cb1692ce924529ae685ef9c36e0ad not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.568993 4707 scope.go:117] "RemoveContainer" containerID="764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.569265 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2\": container with ID starting with 
764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2 not found: ID does not exist" containerID="764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.569297 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2"} err="failed to get container status \"764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2\": rpc error: code = NotFound desc = could not find container \"764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2\": container with ID starting with 764e2bd0267dfac77ae8008e029140d61e4a00058da0ed4d4e255b3f893336e2 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.569310 4707 scope.go:117] "RemoveContainer" containerID="9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.569632 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a\": container with ID starting with 9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a not found: ID does not exist" containerID="9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.569655 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a"} err="failed to get container status \"9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a\": rpc error: code = NotFound desc = could not find container \"9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a\": container with ID starting with 9bc5394e792c7128324da5ababf3330024ecb2d5b14facd5be3da83f2b0d983a not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.569670 4707 scope.go:117] "RemoveContainer" containerID="87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.570005 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9\": container with ID starting with 87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9 not found: ID does not exist" containerID="87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.570043 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9"} err="failed to get container status \"87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9\": rpc error: code = NotFound desc = could not find container \"87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9\": container with ID starting with 87c97b0cf700376aaac59dfdec34080ec53b3fa0eabf14daef85ef0d64e4c8b9 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.570065 4707 scope.go:117] "RemoveContainer" containerID="0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.570314 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54\": container with ID starting with 0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54 not found: ID does not exist" containerID="0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.570339 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54"} err="failed to get container status \"0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54\": rpc error: code = NotFound desc = could not find container \"0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54\": container with ID starting with 0106450cda53f46dc905cfcba74e10efadee1a7bf4d91b25cebe8c2f38e50a54 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.570352 4707 scope.go:117] "RemoveContainer" containerID="f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.571202 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304\": container with ID starting with f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304 not found: ID does not exist" containerID="f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.571227 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304"} err="failed to get container status \"f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304\": rpc error: code = NotFound desc = could not find container \"f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304\": container with ID starting with f8ba501e6cdeecdaf5fa83d1c363d24ddced3398c6a24d62e8af412100200304 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.571241 4707 scope.go:117] "RemoveContainer" containerID="5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.571461 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853\": container with ID starting with 5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853 not found: ID does not exist" containerID="5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.571484 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853"} err="failed to get container status \"5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853\": rpc error: code = NotFound desc = could not find container \"5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853\": container with ID starting with 5e77a9559364c9203e98cb4a9fad9aff23a7d656c0ef6e6776a605fb8af3b853 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.571500 4707 scope.go:117] "RemoveContainer" 
containerID="0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.571738 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07\": container with ID starting with 0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07 not found: ID does not exist" containerID="0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.571769 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07"} err="failed to get container status \"0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07\": rpc error: code = NotFound desc = could not find container \"0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07\": container with ID starting with 0da54948619d17f36c40dd67d7dac4b59df9c0e4a08d1a92f202f73200ef4e07 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.571790 4707 scope.go:117] "RemoveContainer" containerID="96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.572045 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51\": container with ID starting with 96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51 not found: ID does not exist" containerID="96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.572065 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51"} err="failed to get container status \"96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51\": rpc error: code = NotFound desc = could not find container \"96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51\": container with ID starting with 96e35040c619508c1177452790963a2a3021f4061140feca143f66b43fb5ae51 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.572078 4707 scope.go:117] "RemoveContainer" containerID="cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.572332 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58\": container with ID starting with cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58 not found: ID does not exist" containerID="cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.572357 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58"} err="failed to get container status \"cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58\": rpc error: code = NotFound desc = could not find container \"cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58\": container with ID starting with 
cb2961e5d916f4da77aa49858eed38e68f79a65e58894e534f69054707f31b58 not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.572371 4707 scope.go:117] "RemoveContainer" containerID="e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa" Jan 21 15:45:10 crc kubenswrapper[4707]: E0121 15:45:10.572622 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa\": container with ID starting with e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa not found: ID does not exist" containerID="e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.572643 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa"} err="failed to get container status \"e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa\": rpc error: code = NotFound desc = could not find container \"e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa\": container with ID starting with e8b68e2c0dbf4b7a19452deaff655664931f5a0b0e4f2a29090a0d5673b6fbaa not found: ID does not exist" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.572656 4707 scope.go:117] "RemoveContainer" containerID="ead8b354d29ca6da319ae2b1b30a2002a1bbe77cdf13011c3fb2f37808e1c514" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-cache\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-lock\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7xmj\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-kube-api-access-r7xmj\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-etc-swift\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594534 4707 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594546 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qmzq\" (UniqueName: \"kubernetes.io/projected/df6c2927-0704-455f-be1a-1c421df8dfc6-kube-api-access-5qmzq\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594556 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594564 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594572 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.594579 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6c2927-0704-455f-be1a-1c421df8dfc6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.696860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7xmj\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-kube-api-access-r7xmj\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.697822 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-etc-swift\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.698205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-cache\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.698395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-lock\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.698427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.698623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-cache\") pod \"swift-storage-0\" (UID: 
\"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.698664 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.699207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-lock\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.701751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-etc-swift\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.710834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7xmj\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-kube-api-access-r7xmj\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.716672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:10 crc kubenswrapper[4707]: I0121 15:45:10.834266 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.132124 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/1.log" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.134169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" event={"ID":"df6c2927-0704-455f-be1a-1c421df8dfc6","Type":"ContainerDied","Data":"cdb017dd7709e37d06d7046b9f7253ad25407735601e60d0a62039f510422818"} Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.134202 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-c5n5h" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.134212 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb017dd7709e37d06d7046b9f7253ad25407735601e60d0a62039f510422818" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.183338 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:45:11 crc kubenswrapper[4707]: E0121 15:45:11.183589 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.191561 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2783e9-b8b1-4899-9a36-af57a12d56ba" path="/var/lib/kubelet/pods/fb2783e9-b8b1-4899-9a36-af57a12d56ba/volumes" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.193341 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-667579cbdb-pbr2f"] Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.194412 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.197267 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-667579cbdb-pbr2f"] Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.233506 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:45:11 crc kubenswrapper[4707]: W0121 15:45:11.233820 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b5fdc09_bd73_4c16_96f8_8c083e314aff.slice/crio-93fbf45ca6d181a3320fa0201335a4bb2677bda3f7956f8e4b4b1f55c4531397 WatchSource:0}: Error finding container 93fbf45ca6d181a3320fa0201335a4bb2677bda3f7956f8e4b4b1f55c4531397: Status 404 returned error can't find the container with id 93fbf45ca6d181a3320fa0201335a4bb2677bda3f7956f8e4b4b1f55c4531397 Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.318841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-internal-tls-certs\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-credential-keys\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjqd5\" (UniqueName: 
\"kubernetes.io/projected/26893235-a33a-4211-b02e-b4237c0586c5-kube-api-access-cjqd5\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-scripts\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-public-tls-certs\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-fernet-keys\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-config-data\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.319712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-combined-ca-bundle\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-fernet-keys\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-config-data\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-combined-ca-bundle\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421295 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-internal-tls-certs\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-credential-keys\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjqd5\" (UniqueName: \"kubernetes.io/projected/26893235-a33a-4211-b02e-b4237c0586c5-kube-api-access-cjqd5\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-scripts\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.421432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-public-tls-certs\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.431598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-scripts\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.431775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-credential-keys\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.431781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-internal-tls-certs\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.432050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-combined-ca-bundle\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.432169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-public-tls-certs\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.432189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-config-data\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.432358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-fernet-keys\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.434003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjqd5\" (UniqueName: \"kubernetes.io/projected/26893235-a33a-4211-b02e-b4237c0586c5-kube-api-access-cjqd5\") pod \"keystone-667579cbdb-pbr2f\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.507021 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.755099 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7d1ed8c7-69eb-461f-9984-719171cafcb2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d1ed8c7-69eb-461f-9984-719171cafcb2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d1ed8c7_69eb_461f_9984_719171cafcb2.slice" Jan 21 15:45:11 crc kubenswrapper[4707]: E0121 15:45:11.755330 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod7d1ed8c7-69eb-461f-9984-719171cafcb2] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d1ed8c7-69eb-461f-9984-719171cafcb2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d1ed8c7_69eb_461f_9984_719171cafcb2.slice" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.786610 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda7280671-6e1f-48c9-a9c8-ca1f7453f07c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda7280671-6e1f-48c9-a9c8-ca1f7453f07c] : Timed out while waiting for systemd to remove kubepods-besteffort-poda7280671_6e1f_48c9_a9c8_ca1f7453f07c.slice" Jan 21 15:45:11 crc kubenswrapper[4707]: E0121 15:45:11.786648 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poda7280671-6e1f-48c9-a9c8-ca1f7453f07c] : unable to destroy cgroup paths for cgroup [kubepods besteffort poda7280671-6e1f-48c9-a9c8-ca1f7453f07c] : Timed out while waiting for systemd to remove kubepods-besteffort-poda7280671_6e1f_48c9_a9c8_ca1f7453f07c.slice" pod="openstack-kuttl-tests/openstack-cell1-galera-0" 
podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" Jan 21 15:45:11 crc kubenswrapper[4707]: I0121 15:45:11.894247 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-667579cbdb-pbr2f"] Jan 21 15:45:11 crc kubenswrapper[4707]: W0121 15:45:11.896747 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26893235_a33a_4211_b02e_b4237c0586c5.slice/crio-e9d9573fbce842118db8236ca137fa1a56ffecc6e81c8e4544ae5511ef653af7 WatchSource:0}: Error finding container e9d9573fbce842118db8236ca137fa1a56ffecc6e81c8e4544ae5511ef653af7: Status 404 returned error can't find the container with id e9d9573fbce842118db8236ca137fa1a56ffecc6e81c8e4544ae5511ef653af7 Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.145874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"93fbf45ca6d181a3320fa0201335a4bb2677bda3f7956f8e4b4b1f55c4531397"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.148035 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.148685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" event={"ID":"26893235-a33a-4211-b02e-b4237c0586c5","Type":"ContainerStarted","Data":"4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.148730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" event={"ID":"26893235-a33a-4211-b02e-b4237c0586c5","Type":"ContainerStarted","Data":"e9d9573fbce842118db8236ca137fa1a56ffecc6e81c8e4544ae5511ef653af7"} Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.148744 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.187487 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" podStartSLOduration=1.187469629 podStartE2EDuration="1.187469629s" podCreationTimestamp="2026-01-21 15:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:12.18065669 +0000 UTC m=+2609.362172912" watchObservedRunningTime="2026-01-21 15:45:12.187469629 +0000 UTC m=+2609.368985852" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.217574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.225706 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.234974 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.241963 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.248333 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.250065 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.252669 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.252919 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-pbbx2" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.253055 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.253208 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.262115 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.280543 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.282443 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-wl7lk" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.282529 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.284015 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.284241 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.334034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.336373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.336415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.336621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.336702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.336771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttwft\" (UniqueName: \"kubernetes.io/projected/c14f4f47-2fe8-47fd-892f-832266276730-kube-api-access-ttwft\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.336945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.337048 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c14f4f47-2fe8-47fd-892f-832266276730-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.337114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.340106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c14f4f47-2fe8-47fd-892f-832266276730-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnww8\" (UniqueName: \"kubernetes.io/projected/79747506-45fc-47c5-b7ee-38e40227b1bb-kube-api-access-dnww8\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttwft\" (UniqueName: \"kubernetes.io/projected/c14f4f47-2fe8-47fd-892f-832266276730-kube-api-access-ttwft\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-config\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.440737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.441469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c14f4f47-2fe8-47fd-892f-832266276730-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.441844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.441954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.442139 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.442214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.452266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.458972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttwft\" (UniqueName: \"kubernetes.io/projected/c14f4f47-2fe8-47fd-892f-832266276730-kube-api-access-ttwft\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.459094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.475498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.541970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnww8\" (UniqueName: \"kubernetes.io/projected/79747506-45fc-47c5-b7ee-38e40227b1bb-kube-api-access-dnww8\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-config\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542399 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") device mount path \"/mnt/openstack/pv12\"" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.542786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.543756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-config\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.544395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.548230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.548338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.549262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.558097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnww8\" (UniqueName: \"kubernetes.io/projected/79747506-45fc-47c5-b7ee-38e40227b1bb-kube-api-access-dnww8\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.562588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.581331 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.635319 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:12 crc kubenswrapper[4707]: I0121 15:45:12.968573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:45:12 crc kubenswrapper[4707]: W0121 15:45:12.973279 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc14f4f47_2fe8_47fd_892f_832266276730.slice/crio-87a3b17bf0c7e0c8a4701be87cf3429e578bd8b8965cd247b5f6148fe884559b WatchSource:0}: Error finding container 87a3b17bf0c7e0c8a4701be87cf3429e578bd8b8965cd247b5f6148fe884559b: Status 404 returned error can't find the container with id 87a3b17bf0c7e0c8a4701be87cf3429e578bd8b8965cd247b5f6148fe884559b Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.050973 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.158097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"79747506-45fc-47c5-b7ee-38e40227b1bb","Type":"ContainerStarted","Data":"fa7800c817719e72eb58fd0d2dd563347447e42c0aae8c147976ac58ca81314d"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.158987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"c14f4f47-2fe8-47fd-892f-832266276730","Type":"ContainerStarted","Data":"87a3b17bf0c7e0c8a4701be87cf3429e578bd8b8965cd247b5f6148fe884559b"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163591 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724"} Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.163691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.195513 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1ed8c7-69eb-461f-9984-719171cafcb2" path="/var/lib/kubelet/pods/7d1ed8c7-69eb-461f-9984-719171cafcb2/volumes" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.196829 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7280671-6e1f-48c9-a9c8-ca1f7453f07c" path="/var/lib/kubelet/pods/a7280671-6e1f-48c9-a9c8-ca1f7453f07c/volumes" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.500656 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.500947 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.525583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.531305 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.544723 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.544769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.572537 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.579988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:13 crc kubenswrapper[4707]: I0121 15:45:13.914436 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.184761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841"} Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.184797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerStarted","Data":"1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333"} Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.187610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"79747506-45fc-47c5-b7ee-38e40227b1bb","Type":"ContainerStarted","Data":"85117f9d04aa0f7e6d587b250d45813072e69df9bf3aba08c43cd1843a375512"} Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.187645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"79747506-45fc-47c5-b7ee-38e40227b1bb","Type":"ContainerStarted","Data":"423b9cfbccfd3275dd905b5666c4c6442b89dad40b2b0658353b1b608bf27823"} Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.189353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"c14f4f47-2fe8-47fd-892f-832266276730","Type":"ContainerStarted","Data":"feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936"} Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.190560 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.190581 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.190592 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.190600 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.218626 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=4.218612139 podStartE2EDuration="4.218612139s" podCreationTimestamp="2026-01-21 15:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:14.208345897 +0000 UTC m=+2611.389862119" watchObservedRunningTime="2026-01-21 15:45:14.218612139 +0000 UTC m=+2611.400128361" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.255916 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.255898467 podStartE2EDuration="2.255898467s" podCreationTimestamp="2026-01-21 15:45:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:14.253449723 +0000 UTC m=+2611.434965945" watchObservedRunningTime="2026-01-21 15:45:14.255898467 +0000 UTC m=+2611.437414688" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.325062 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft"] Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.326563 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.334868 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft"] Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.384661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lv6x\" (UniqueName: \"kubernetes.io/projected/f96930d8-596d-433f-bd93-ae767abae88c-kube-api-access-9lv6x\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.384729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.384751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-config\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.384784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.486923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lv6x\" (UniqueName: \"kubernetes.io/projected/f96930d8-596d-433f-bd93-ae767abae88c-kube-api-access-9lv6x\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.486985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.487008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-config\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.487533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.487771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-config\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.487796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.488118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.508952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lv6x\" (UniqueName: \"kubernetes.io/projected/f96930d8-596d-433f-bd93-ae767abae88c-kube-api-access-9lv6x\") pod \"dnsmasq-dnsmasq-66d9b4485-g2wft\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:14 crc kubenswrapper[4707]: I0121 15:45:14.654915 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.059259 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft"] Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.199583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" event={"ID":"f96930d8-596d-433f-bd93-ae767abae88c","Type":"ContainerStarted","Data":"06c0d3fb7865624d130253e3ec19b2be699c313504fd718cd1f3a408e16db445"} Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.636102 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.792830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.794096 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.843366 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:15 crc kubenswrapper[4707]: I0121 15:45:15.845469 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:45:16 crc kubenswrapper[4707]: I0121 15:45:16.207684 4707 generic.go:334] "Generic (PLEG): container finished" podID="c14f4f47-2fe8-47fd-892f-832266276730" 
containerID="feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936" exitCode=0 Jan 21 15:45:16 crc kubenswrapper[4707]: I0121 15:45:16.207744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"c14f4f47-2fe8-47fd-892f-832266276730","Type":"ContainerDied","Data":"feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936"} Jan 21 15:45:16 crc kubenswrapper[4707]: I0121 15:45:16.209170 4707 generic.go:334] "Generic (PLEG): container finished" podID="f96930d8-596d-433f-bd93-ae767abae88c" containerID="01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175" exitCode=0 Jan 21 15:45:16 crc kubenswrapper[4707]: I0121 15:45:16.209238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" event={"ID":"f96930d8-596d-433f-bd93-ae767abae88c","Type":"ContainerDied","Data":"01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175"} Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.220852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" event={"ID":"f96930d8-596d-433f-bd93-ae767abae88c","Type":"ContainerStarted","Data":"711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5"} Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.222046 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.223251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"c14f4f47-2fe8-47fd-892f-832266276730","Type":"ContainerStarted","Data":"5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92"} Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.235927 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" podStartSLOduration=3.235916152 podStartE2EDuration="3.235916152s" podCreationTimestamp="2026-01-21 15:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:17.233779535 +0000 UTC m=+2614.415295757" watchObservedRunningTime="2026-01-21 15:45:17.235916152 +0000 UTC m=+2614.417432375" Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.251785 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=5.251773078 podStartE2EDuration="5.251773078s" podCreationTimestamp="2026-01-21 15:45:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:17.248781502 +0000 UTC m=+2614.430297724" watchObservedRunningTime="2026-01-21 15:45:17.251773078 +0000 UTC m=+2614.433289299" Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.535675 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.636678 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.642982 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:45:17 crc kubenswrapper[4707]: I0121 15:45:17.643162 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:45:18 crc kubenswrapper[4707]: I0121 15:45:18.666618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:18 crc kubenswrapper[4707]: I0121 15:45:18.697230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:45:22 crc kubenswrapper[4707]: I0121 15:45:22.581851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:22 crc kubenswrapper[4707]: I0121 15:45:22.582221 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:22 crc kubenswrapper[4707]: I0121 15:45:22.636618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:23 crc kubenswrapper[4707]: I0121 15:45:23.187432 4707 scope.go:117] "RemoveContainer" containerID="564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8" Jan 21 15:45:23 crc kubenswrapper[4707]: I0121 15:45:23.194316 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:45:23 crc kubenswrapper[4707]: I0121 15:45:23.328218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.182674 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:45:24 crc kubenswrapper[4707]: E0121 15:45:24.182882 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.275026 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/1.log" Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.275591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerStarted","Data":"752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186"} Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.279577 4707 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.656058 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.704683 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr"] Jan 21 15:45:24 crc kubenswrapper[4707]: I0121 15:45:24.704911 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerName="dnsmasq-dns" containerID="cri-o://119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f" gracePeriod=10 Jan 21 15:45:24 crc kubenswrapper[4707]: E0121 15:45:24.764255 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76873cf_7ae7_48cf_b1f2_b3b3d6b4e725.slice/crio-conmon-119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76873cf_7ae7_48cf_b1f2_b3b3d6b4e725.slice/crio-119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.085995 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.161426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-config\") pod \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.161503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dns-swift-storage-0\") pod \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.161630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm2v4\" (UniqueName: \"kubernetes.io/projected/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-kube-api-access-cm2v4\") pod \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.161708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dnsmasq-svc\") pod \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\" (UID: \"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725\") " Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.186546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-kube-api-access-cm2v4" (OuterVolumeSpecName: "kube-api-access-cm2v4") pod "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" 
(UID: "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725"). InnerVolumeSpecName "kube-api-access-cm2v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.200393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" (UID: "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.202327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-config" (OuterVolumeSpecName: "config") pod "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" (UID: "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.203083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" (UID: "b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.264156 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.264187 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm2v4\" (UniqueName: \"kubernetes.io/projected/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-kube-api-access-cm2v4\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.264199 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.264207 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.284287 4707 generic.go:334] "Generic (PLEG): container finished" podID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerID="119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f" exitCode=0 Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.284363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" event={"ID":"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725","Type":"ContainerDied","Data":"119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f"} Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.284387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" event={"ID":"b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725","Type":"ContainerDied","Data":"48ccdff9968b9c1640f3ddf1f437967bdade6b28d7c7257855a27b69414596c9"} Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.284432 4707 scope.go:117] "RemoveContainer" 
containerID="119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.284563 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.292516 4707 generic.go:334] "Generic (PLEG): container finished" podID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerID="2d3a5c43e34f6f46f6d3cb5d8c70467adeed72ea7c6b5d91eb9047045028dd92" exitCode=0 Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.292582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f341e188-43e6-4019-b4dd-b8ea727d2d3f","Type":"ContainerDied","Data":"2d3a5c43e34f6f46f6d3cb5d8c70467adeed72ea7c6b5d91eb9047045028dd92"} Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.297118 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerID="e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f" exitCode=0 Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.297144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab","Type":"ContainerDied","Data":"e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f"} Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.321011 4707 scope.go:117] "RemoveContainer" containerID="5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.328990 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr"] Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.340514 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5967bdfff7-tnxtr"] Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.342581 4707 scope.go:117] "RemoveContainer" containerID="119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f" Jan 21 15:45:25 crc kubenswrapper[4707]: E0121 15:45:25.342862 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f\": container with ID starting with 119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f not found: ID does not exist" containerID="119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.342887 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f"} err="failed to get container status \"119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f\": rpc error: code = NotFound desc = could not find container \"119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f\": container with ID starting with 119776d5bbc6b07124c3dff328b14e5c12500d8aca43fb03b2f0b59c58ef104f not found: ID does not exist" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.342906 4707 scope.go:117] "RemoveContainer" containerID="5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c" Jan 21 15:45:25 crc kubenswrapper[4707]: E0121 15:45:25.343464 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c\": container with ID starting with 5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c not found: ID does not exist" containerID="5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c" Jan 21 15:45:25 crc kubenswrapper[4707]: I0121 15:45:25.343482 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c"} err="failed to get container status \"5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c\": rpc error: code = NotFound desc = could not find container \"5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c\": container with ID starting with 5ae22facd4cf3aa25e6850b8f624fcb82cde445502ba6e76fc4ce82b970fe46c not found: ID does not exist" Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.311965 4707 generic.go:334] "Generic (PLEG): container finished" podID="78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" containerID="8df853fe17904e91c15bf9f58ea86cb238d2bb1b6b761b5fe7751f6725ae781f" exitCode=0 Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.312043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" event={"ID":"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2","Type":"ContainerDied","Data":"8df853fe17904e91c15bf9f58ea86cb238d2bb1b6b761b5fe7751f6725ae781f"} Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.315778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f341e188-43e6-4019-b4dd-b8ea727d2d3f","Type":"ContainerStarted","Data":"7de59c225a5ef6ebd73df1f9d5fa7286a46034a887cc003580cb1dcd45f6ba34"} Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.316038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.317683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab","Type":"ContainerStarted","Data":"41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161"} Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.318339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.338255 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=35.338245397 podStartE2EDuration="35.338245397s" podCreationTimestamp="2026-01-21 15:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:26.335955182 +0000 UTC m=+2623.517471404" watchObservedRunningTime="2026-01-21 15:45:26.338245397 +0000 UTC m=+2623.519761619" Jan 21 15:45:26 crc kubenswrapper[4707]: I0121 15:45:26.359390 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.359373594 podStartE2EDuration="36.359373594s" podCreationTimestamp="2026-01-21 15:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:26.352332996 +0000 UTC m=+2623.533849218" 
watchObservedRunningTime="2026-01-21 15:45:26.359373594 +0000 UTC m=+2623.540889816" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.190309 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" path="/var/lib/kubelet/pods/b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725/volumes" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.585605 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.597727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-combined-ca-bundle\") pod \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.597845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-config-data\") pod \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.597897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxmqx\" (UniqueName: \"kubernetes.io/projected/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-kube-api-access-cxmqx\") pod \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.597953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-scripts\") pod \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\" (UID: \"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2\") " Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.603176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-scripts" (OuterVolumeSpecName: "scripts") pod "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" (UID: "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.607772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-kube-api-access-cxmqx" (OuterVolumeSpecName: "kube-api-access-cxmqx") pod "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" (UID: "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2"). InnerVolumeSpecName "kube-api-access-cxmqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.626991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-config-data" (OuterVolumeSpecName: "config-data") pod "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" (UID: "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.627132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" (UID: "78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.641011 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.640964 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.46:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.699512 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.699541 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxmqx\" (UniqueName: \"kubernetes.io/projected/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-kube-api-access-cxmqx\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.699552 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:27 crc kubenswrapper[4707]: I0121 15:45:27.699562 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4707]: I0121 15:45:28.198243 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:45:28 crc kubenswrapper[4707]: I0121 15:45:28.332301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" event={"ID":"78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2","Type":"ContainerDied","Data":"dd0194ca21d3dc66f95ac783627a007729a3c9254b80d396c98825574b296084"} Jan 21 15:45:28 crc kubenswrapper[4707]: I0121 15:45:28.332339 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd0194ca21d3dc66f95ac783627a007729a3c9254b80d396c98825574b296084" Jan 21 15:45:28 crc kubenswrapper[4707]: I0121 15:45:28.332402 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8" Jan 21 15:45:28 crc kubenswrapper[4707]: I0121 15:45:28.386761 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:45:28 crc kubenswrapper[4707]: I0121 15:45:28.386951 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2" gracePeriod=30 Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.197346 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.252352 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-65bd996df5-pwnvm"] Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.252561 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-api" containerID="cri-o://865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70" gracePeriod=30 Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.252684 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-httpd" containerID="cri-o://5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400" gracePeriod=30 Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.848045 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.934430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-config-data\") pod \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.934500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-combined-ca-bundle\") pod \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.934567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txlwg\" (UniqueName: \"kubernetes.io/projected/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-kube-api-access-txlwg\") pod \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\" (UID: \"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe\") " Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.948876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-kube-api-access-txlwg" (OuterVolumeSpecName: "kube-api-access-txlwg") pod "cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" (UID: "cd4ae0ce-5751-45ed-83d7-f721e4ccdabe"). InnerVolumeSpecName "kube-api-access-txlwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.955499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" (UID: "cd4ae0ce-5751-45ed-83d7-f721e4ccdabe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:29 crc kubenswrapper[4707]: I0121 15:45:29.961706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-config-data" (OuterVolumeSpecName: "config-data") pod "cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" (UID: "cd4ae0ce-5751-45ed-83d7-f721e4ccdabe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.036901 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txlwg\" (UniqueName: \"kubernetes.io/projected/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-kube-api-access-txlwg\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.036927 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.036938 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.347213 4707 generic.go:334] "Generic (PLEG): container finished" podID="cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" containerID="ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2" exitCode=0 Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.347287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe","Type":"ContainerDied","Data":"ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2"} Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.347312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cd4ae0ce-5751-45ed-83d7-f721e4ccdabe","Type":"ContainerDied","Data":"cedba6f29dee9154f6242a4c5dc406d7fc5ccef59b4225c9c33b899e4c1375ac"} Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.347332 4707 scope.go:117] "RemoveContainer" containerID="ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.347982 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.349499 4707 generic.go:334] "Generic (PLEG): container finished" podID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerID="5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400" exitCode=0 Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.349524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" event={"ID":"b241d4d6-31f1-4523-89df-1e1eab731f48","Type":"ContainerDied","Data":"5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400"} Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.364841 4707 scope.go:117] "RemoveContainer" containerID="ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2" Jan 21 15:45:30 crc kubenswrapper[4707]: E0121 15:45:30.365198 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2\": container with ID starting with ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2 not found: ID does not exist" containerID="ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.365241 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2"} err="failed to get container status \"ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2\": rpc error: code = NotFound desc = could not find container \"ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2\": container with ID starting with ad91d1ac018ca81e7be3b1a5e6fed76673bb206c9e0eb39ea7508ea25caef2a2 not found: ID does not exist" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.372825 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.379557 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388009 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:45:30 crc kubenswrapper[4707]: E0121 15:45:30.388347 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerName="init" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388366 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerName="init" Jan 21 15:45:30 crc kubenswrapper[4707]: E0121 15:45:30.388392 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" containerName="nova-cell1-conductor-conductor" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" containerName="nova-cell1-conductor-conductor" Jan 21 15:45:30 crc kubenswrapper[4707]: E0121 15:45:30.388415 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerName="dnsmasq-dns" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388420 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerName="dnsmasq-dns" Jan 21 15:45:30 crc kubenswrapper[4707]: E0121 15:45:30.388434 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" containerName="nova-cell1-conductor-db-sync" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" containerName="nova-cell1-conductor-db-sync" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388599 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" containerName="nova-cell1-conductor-db-sync" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" containerName="nova-cell1-conductor-conductor" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.388628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76873cf-7ae7-48cf-b1f2-b3b3d6b4e725" containerName="dnsmasq-dns" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.389167 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.391001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.398865 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.443373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.443414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdv2\" (UniqueName: \"kubernetes.io/projected/4505129f-9175-43df-9d92-cbb8fba50ea3-kube-api-access-8sdv2\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.443453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.544096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.544297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdv2\" (UniqueName: 
\"kubernetes.io/projected/4505129f-9175-43df-9d92-cbb8fba50ea3-kube-api-access-8sdv2\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.544418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.548436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.556787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.556846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdv2\" (UniqueName: \"kubernetes.io/projected/4505129f-9175-43df-9d92-cbb8fba50ea3-kube-api-access-8sdv2\") pod \"nova-cell1-conductor-0\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:30 crc kubenswrapper[4707]: I0121 15:45:30.705660 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:31 crc kubenswrapper[4707]: W0121 15:45:31.083936 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4505129f_9175_43df_9d92_cbb8fba50ea3.slice/crio-0c9fc690a9be2b1dbecac900a76a7bc0d01e9dddaa7c25360b90f767712378b6 WatchSource:0}: Error finding container 0c9fc690a9be2b1dbecac900a76a7bc0d01e9dddaa7c25360b90f767712378b6: Status 404 returned error can't find the container with id 0c9fc690a9be2b1dbecac900a76a7bc0d01e9dddaa7c25360b90f767712378b6 Jan 21 15:45:31 crc kubenswrapper[4707]: I0121 15:45:31.083987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:45:31 crc kubenswrapper[4707]: I0121 15:45:31.190387 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4ae0ce-5751-45ed-83d7-f721e4ccdabe" path="/var/lib/kubelet/pods/cd4ae0ce-5751-45ed-83d7-f721e4ccdabe/volumes" Jan 21 15:45:31 crc kubenswrapper[4707]: I0121 15:45:31.359854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4505129f-9175-43df-9d92-cbb8fba50ea3","Type":"ContainerStarted","Data":"dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f"} Jan 21 15:45:31 crc kubenswrapper[4707]: I0121 15:45:31.359896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4505129f-9175-43df-9d92-cbb8fba50ea3","Type":"ContainerStarted","Data":"0c9fc690a9be2b1dbecac900a76a7bc0d01e9dddaa7c25360b90f767712378b6"} Jan 21 15:45:31 crc kubenswrapper[4707]: I0121 15:45:31.375651 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.37563441 podStartE2EDuration="1.37563441s" podCreationTimestamp="2026-01-21 15:45:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:31.374530685 +0000 UTC m=+2628.556046906" watchObservedRunningTime="2026-01-21 15:45:31.37563441 +0000 UTC m=+2628.557150633" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.219465 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data\") pod \"8f191858-2d71-42bb-8227-069401c26dd4\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-combined-ca-bundle\") pod \"8f191858-2d71-42bb-8227-069401c26dd4\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data-custom\") pod \"8f191858-2d71-42bb-8227-069401c26dd4\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f191858-2d71-42bb-8227-069401c26dd4-etc-machine-id\") pod \"8f191858-2d71-42bb-8227-069401c26dd4\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273509 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-scripts\") pod \"8f191858-2d71-42bb-8227-069401c26dd4\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznpg\" (UniqueName: \"kubernetes.io/projected/8f191858-2d71-42bb-8227-069401c26dd4-kube-api-access-zznpg\") pod \"8f191858-2d71-42bb-8227-069401c26dd4\" (UID: \"8f191858-2d71-42bb-8227-069401c26dd4\") " Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f191858-2d71-42bb-8227-069401c26dd4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8f191858-2d71-42bb-8227-069401c26dd4" (UID: "8f191858-2d71-42bb-8227-069401c26dd4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.273878 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f191858-2d71-42bb-8227-069401c26dd4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.277495 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8f191858-2d71-42bb-8227-069401c26dd4" (UID: "8f191858-2d71-42bb-8227-069401c26dd4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.277804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f191858-2d71-42bb-8227-069401c26dd4-kube-api-access-zznpg" (OuterVolumeSpecName: "kube-api-access-zznpg") pod "8f191858-2d71-42bb-8227-069401c26dd4" (UID: "8f191858-2d71-42bb-8227-069401c26dd4"). InnerVolumeSpecName "kube-api-access-zznpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.290097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-scripts" (OuterVolumeSpecName: "scripts") pod "8f191858-2d71-42bb-8227-069401c26dd4" (UID: "8f191858-2d71-42bb-8227-069401c26dd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.318643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f191858-2d71-42bb-8227-069401c26dd4" (UID: "8f191858-2d71-42bb-8227-069401c26dd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.343563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data" (OuterVolumeSpecName: "config-data") pod "8f191858-2d71-42bb-8227-069401c26dd4" (UID: "8f191858-2d71-42bb-8227-069401c26dd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.376915 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f191858-2d71-42bb-8227-069401c26dd4" containerID="80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63" exitCode=137 Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.376985 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.377023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerDied","Data":"80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63"} Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.377054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"8f191858-2d71-42bb-8227-069401c26dd4","Type":"ContainerDied","Data":"1e6514e593995c55157d2f14d95b44ee79e25c44d2dfbd7ca01cb319d10037b5"} Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.377073 4707 scope.go:117] "RemoveContainer" containerID="80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.377329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.378656 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.378777 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.379185 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.379212 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f191858-2d71-42bb-8227-069401c26dd4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.379223 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznpg\" (UniqueName: \"kubernetes.io/projected/8f191858-2d71-42bb-8227-069401c26dd4-kube-api-access-zznpg\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.399793 4707 scope.go:117] "RemoveContainer" containerID="b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.408484 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.420621 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.445430 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:45:32 crc kubenswrapper[4707]: E0121 15:45:32.446129 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.446202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" Jan 21 15:45:32 crc kubenswrapper[4707]: E0121 15:45:32.446260 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="probe" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.446335 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="probe" Jan 21 15:45:32 crc kubenswrapper[4707]: E0121 15:45:32.446397 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.446455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.446693 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="probe" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.446752 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.446830 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f191858-2d71-42bb-8227-069401c26dd4" containerName="cinder-scheduler" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.445705 4707 scope.go:117] "RemoveContainer" containerID="02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.447994 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.449983 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.453531 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.475563 4707 scope.go:117] "RemoveContainer" containerID="80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63" Jan 21 15:45:32 crc kubenswrapper[4707]: E0121 15:45:32.476133 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63\": container with ID starting with 80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63 not found: ID does not exist" containerID="80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.476171 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63"} err="failed to get container status \"80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63\": rpc error: code = NotFound desc = could not find container \"80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63\": container with ID starting with 80346e6c42c7cbc8b62939b85f73eeccb7cf158126f68cb49bac09bda94c5a63 not found: ID does not exist" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.476194 4707 scope.go:117] "RemoveContainer" containerID="b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238" Jan 21 15:45:32 crc kubenswrapper[4707]: E0121 15:45:32.476996 4707 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238\": container with ID starting with b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238 not found: ID does not exist" containerID="b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.477017 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238"} err="failed to get container status \"b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238\": rpc error: code = NotFound desc = could not find container \"b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238\": container with ID starting with b31c2c9ef3f03fc80152155a74b7d50767905b852ab2ca5b28f462c792ccc238 not found: ID does not exist" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.477030 4707 scope.go:117] "RemoveContainer" containerID="02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f" Jan 21 15:45:32 crc kubenswrapper[4707]: E0121 15:45:32.481172 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f\": container with ID starting with 02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f not found: ID does not exist" containerID="02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.481222 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f"} err="failed to get container status \"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f\": rpc error: code = NotFound desc = could not find container \"02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f\": container with ID starting with 02b0558b260ff73144ceb6568fe0a9d4dd3af153b8c8f9862647416444dbb25f not found: ID does not exist" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.481424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.481498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.481607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-scripts\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.482494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.482754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.482954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72ps\" (UniqueName: \"kubernetes.io/projected/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-kube-api-access-n72ps\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.584745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-scripts\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.585558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.585693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.585791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72ps\" (UniqueName: \"kubernetes.io/projected/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-kube-api-access-n72ps\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.585939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.586014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.585786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.587936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-scripts\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.588515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.588595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.589186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.602980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72ps\" (UniqueName: \"kubernetes.io/projected/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-kube-api-access-n72ps\") pod \"cinder-scheduler-0\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:32 crc kubenswrapper[4707]: I0121 15:45:32.798002 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:33 crc kubenswrapper[4707]: I0121 15:45:33.170427 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:45:33 crc kubenswrapper[4707]: I0121 15:45:33.199516 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f191858-2d71-42bb-8227-069401c26dd4" path="/var/lib/kubelet/pods/8f191858-2d71-42bb-8227-069401c26dd4/volumes" Jan 21 15:45:33 crc kubenswrapper[4707]: I0121 15:45:33.389723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760","Type":"ContainerStarted","Data":"03a6a327db3fed2dc170c765af84ea362ed4277a832f0c862453c31dac6c0441"} Jan 21 15:45:33 crc kubenswrapper[4707]: I0121 15:45:33.676103 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.10:9696/\": dial tcp 10.217.1.10:9696: connect: connection refused" Jan 21 15:45:34 crc kubenswrapper[4707]: I0121 15:45:34.398687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760","Type":"ContainerStarted","Data":"ce810f5016c6b84539b5638a579a70bdd684769502c29f783c133ce65a806987"} Jan 21 15:45:34 crc kubenswrapper[4707]: I0121 15:45:34.398909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760","Type":"ContainerStarted","Data":"ad1135a550f9a5d2b3f5160fd4aa31b58313409af1bfcb366c84f0d82a424f8b"} Jan 21 15:45:34 crc kubenswrapper[4707]: I0121 15:45:34.415045 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.415031533 podStartE2EDuration="2.415031533s" podCreationTimestamp="2026-01-21 15:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:34.411677797 +0000 UTC m=+2631.593194020" watchObservedRunningTime="2026-01-21 15:45:34.415031533 +0000 UTC m=+2631.596547755" Jan 21 15:45:36 crc kubenswrapper[4707]: I0121 15:45:36.637617 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:36 crc kubenswrapper[4707]: I0121 15:45:36.639135 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:36 crc kubenswrapper[4707]: I0121 15:45:36.647598 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.354725 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.355611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.420242 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-c48bc69f9-zs4s8"] Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.420456 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-log" containerID="cri-o://06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b" gracePeriod=30 Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.420562 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-api" containerID="cri-o://8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231" gracePeriod=30 Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.456737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:37 crc kubenswrapper[4707]: I0121 15:45:37.798867 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:38 crc kubenswrapper[4707]: I0121 15:45:38.182745 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:45:38 crc kubenswrapper[4707]: E0121 15:45:38.183138 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:45:38 crc kubenswrapper[4707]: I0121 15:45:38.447698 4707 generic.go:334] "Generic (PLEG): container finished" podID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerID="06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b" exitCode=143 Jan 21 15:45:38 crc kubenswrapper[4707]: I0121 15:45:38.448967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" event={"ID":"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf","Type":"ContainerDied","Data":"06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b"} Jan 21 15:45:40 crc kubenswrapper[4707]: I0121 15:45:40.726668 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:45:40 crc kubenswrapper[4707]: I0121 15:45:40.978776 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.005960 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.117598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-combined-ca-bundle\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.117859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-config-data\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.118048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-logs\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.118144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch8f7\" (UniqueName: \"kubernetes.io/projected/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-kube-api-access-ch8f7\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.118291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-public-tls-certs\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.118381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-scripts\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.118454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-internal-tls-certs\") pod \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\" (UID: \"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.120205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-logs" (OuterVolumeSpecName: "logs") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.123121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-kube-api-access-ch8f7" (OuterVolumeSpecName: "kube-api-access-ch8f7") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "kube-api-access-ch8f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.129021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-scripts" (OuterVolumeSpecName: "scripts") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.175945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-config-data" (OuterVolumeSpecName: "config-data") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.179600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.220661 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.220832 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.220849 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.220862 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.220871 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch8f7\" (UniqueName: \"kubernetes.io/projected/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-kube-api-access-ch8f7\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.223210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.252493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" (UID: "be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.322784 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.322819 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.383025 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.473872 4707 generic.go:334] "Generic (PLEG): container finished" podID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerID="8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231" exitCode=0 Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.473943 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.473946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" event={"ID":"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf","Type":"ContainerDied","Data":"8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231"} Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.474001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c48bc69f9-zs4s8" event={"ID":"be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf","Type":"ContainerDied","Data":"171bb4a7379bdb4157e358de0b450c2d8cfb1260db9c5dc027c3c1080510e98f"} Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.474021 4707 scope.go:117] "RemoveContainer" containerID="8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.475647 4707 generic.go:334] "Generic (PLEG): container finished" podID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerID="865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70" exitCode=0 Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.475694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" event={"ID":"b241d4d6-31f1-4523-89df-1e1eab731f48","Type":"ContainerDied","Data":"865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70"} Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.475705 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.475719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65bd996df5-pwnvm" event={"ID":"b241d4d6-31f1-4523-89df-1e1eab731f48","Type":"ContainerDied","Data":"79c655abc744879935b5d43a70d6e7fcf219d83647a374e61a9dea29a37ff3b1"} Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.498114 4707 scope.go:117] "RemoveContainer" containerID="06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.498780 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-c48bc69f9-zs4s8"] Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.505929 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-c48bc69f9-zs4s8"] Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.515797 4707 scope.go:117] "RemoveContainer" containerID="8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231" Jan 21 15:45:41 crc kubenswrapper[4707]: E0121 15:45:41.516183 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231\": container with ID starting with 8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231 not found: ID does not exist" containerID="8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.516218 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231"} err="failed to get container status \"8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231\": rpc error: code = NotFound desc = could not find container \"8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231\": container with ID starting with 8f4fa5a5df2c1881e3cc4172cc5b98cce8cb37ea989858cfd50b9e56728e5231 not found: ID does not exist" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.516239 4707 scope.go:117] "RemoveContainer" containerID="06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b" Jan 21 15:45:41 crc kubenswrapper[4707]: E0121 15:45:41.516609 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b\": container with ID starting with 06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b not found: ID does not exist" containerID="06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.516647 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b"} err="failed to get container status \"06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b\": rpc error: code = NotFound desc = could not find container \"06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b\": container with ID starting with 06b76e477a27962bf029b3fd0ef94af45fdf4c50f7f6aee2ef236e44ce3b178b not found: ID does not exist" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.516670 4707 scope.go:117] "RemoveContainer" 
containerID="5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlrhj\" (UniqueName: \"kubernetes.io/projected/b241d4d6-31f1-4523-89df-1e1eab731f48-kube-api-access-dlrhj\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-ovndb-tls-certs\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-public-tls-certs\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-internal-tls-certs\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-config\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-combined-ca-bundle\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.525584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-httpd-config\") pod \"b241d4d6-31f1-4523-89df-1e1eab731f48\" (UID: \"b241d4d6-31f1-4523-89df-1e1eab731f48\") " Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.532216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b241d4d6-31f1-4523-89df-1e1eab731f48-kube-api-access-dlrhj" (OuterVolumeSpecName: "kube-api-access-dlrhj") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "kube-api-access-dlrhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.532468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.537243 4707 scope.go:117] "RemoveContainer" containerID="865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.558386 4707 scope.go:117] "RemoveContainer" containerID="5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400" Jan 21 15:45:41 crc kubenswrapper[4707]: E0121 15:45:41.558671 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400\": container with ID starting with 5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400 not found: ID does not exist" containerID="5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.558699 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400"} err="failed to get container status \"5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400\": rpc error: code = NotFound desc = could not find container \"5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400\": container with ID starting with 5d4cf3f201cd27ea4c067be8b76866c29b6d9a3ee905f0fccfec0c1df8008400 not found: ID does not exist" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.558715 4707 scope.go:117] "RemoveContainer" containerID="865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70" Jan 21 15:45:41 crc kubenswrapper[4707]: E0121 15:45:41.559063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70\": container with ID starting with 865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70 not found: ID does not exist" containerID="865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.559101 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70"} err="failed to get container status \"865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70\": rpc error: code = NotFound desc = could not find container \"865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70\": container with ID starting with 865990d1d3e34748a223db8fa15a8b99467e90dcf103dd6c154f5f24b4c45e70 not found: ID does not exist" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.564657 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-config" (OuterVolumeSpecName: "config") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.572046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.574848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.579204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.590944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b241d4d6-31f1-4523-89df-1e1eab731f48" (UID: "b241d4d6-31f1-4523-89df-1e1eab731f48"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628152 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlrhj\" (UniqueName: \"kubernetes.io/projected/b241d4d6-31f1-4523-89df-1e1eab731f48-kube-api-access-dlrhj\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628181 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628190 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628199 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628207 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628216 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.628224 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b241d4d6-31f1-4523-89df-1e1eab731f48-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.823390 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-65bd996df5-pwnvm"] Jan 21 15:45:41 crc kubenswrapper[4707]: I0121 15:45:41.829007 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/neutron-65bd996df5-pwnvm"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.028447 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.228882 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.235014 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-tz454"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.287601 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-dsnpf"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.297990 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-tz454"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.307859 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.308084 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" containerID="cri-o://6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f" gracePeriod=30 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.308180 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" containerID="cri-o://0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9" gracePeriod=30 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.319801 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.319996 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7cfb7b09-0ff5-4134-90e4-2b04813ca151" containerName="nova-scheduler-scheduler" containerID="cri-o://2a68ebdabb09d5063a608b567a0d520fdc9896d30daf8aed958ebfbd4a7454b6" gracePeriod=30 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.327044 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.327286 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-log" containerID="cri-o://954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f" gracePeriod=30 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.327346 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-metadata" containerID="cri-o://9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1" gracePeriod=30 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432176 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm"] Jan 21 15:45:42 crc kubenswrapper[4707]: E0121 15:45:42.432579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-log" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432595 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-log" Jan 21 15:45:42 crc kubenswrapper[4707]: E0121 15:45:42.432612 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-api" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432618 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-api" Jan 21 15:45:42 crc kubenswrapper[4707]: E0121 15:45:42.432637 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-api" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432642 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-api" Jan 21 15:45:42 crc kubenswrapper[4707]: E0121 15:45:42.432648 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-httpd" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432653 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-httpd" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-httpd" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-api" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432832 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" containerName="placement-log" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.432843 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" containerName="neutron-api" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.433940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.437213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.437424 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.455152 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.456306 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.458570 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.458788 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.465611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.478406 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.486087 4707 generic.go:334] "Generic (PLEG): container finished" podID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerID="6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f" exitCode=143 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.486157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f92113cd-544a-4c31-9ca9-e56ec4250773","Type":"ContainerDied","Data":"6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f"} Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.487979 4707 generic.go:334] "Generic (PLEG): container finished" podID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerID="954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f" exitCode=143 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.488039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e","Type":"ContainerDied","Data":"954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f"} Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlbwq\" (UniqueName: \"kubernetes.io/projected/875e9554-48a4-49ca-8a6f-2408eb57723f-kube-api-access-xlbwq\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-scripts\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565801 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-scripts\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-config-data\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-config-data\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnghf\" (UniqueName: \"kubernetes.io/projected/8cd0abca-b95e-45bf-8325-0c12088ce0d1-kube-api-access-bnghf\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.565983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667259 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-config-data\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-config-data\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnghf\" (UniqueName: \"kubernetes.io/projected/8cd0abca-b95e-45bf-8325-0c12088ce0d1-kube-api-access-bnghf\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlbwq\" (UniqueName: \"kubernetes.io/projected/875e9554-48a4-49ca-8a6f-2408eb57723f-kube-api-access-xlbwq\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-scripts\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.667450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-scripts\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.674561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-scripts\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.674604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-config-data\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.682132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.682237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-scripts\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.682313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-config-data\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.682348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.686498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnghf\" (UniqueName: \"kubernetes.io/projected/8cd0abca-b95e-45bf-8325-0c12088ce0d1-kube-api-access-bnghf\") pod \"nova-cell0-cell-mapping-kcgbm\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.686668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlbwq\" (UniqueName: \"kubernetes.io/projected/875e9554-48a4-49ca-8a6f-2408eb57723f-kube-api-access-xlbwq\") pod \"nova-cell1-cell-mapping-59wmr\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.798349 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.807480 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.902545 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.958581 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-56cb4c686f-qqcjc"] Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.958770 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" podUID="7fc127ff-9162-4e68-af5b-b4085cce670c" containerName="keystone-api" containerID="cri-o://15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d" gracePeriod=30 Jan 21 15:45:42 crc kubenswrapper[4707]: I0121 15:45:42.997100 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.191690 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4146388d-3f42-42fd-ac0e-9ec7ab1bf062" path="/var/lib/kubelet/pods/4146388d-3f42-42fd-ac0e-9ec7ab1bf062/volumes" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.192489 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b241d4d6-31f1-4523-89df-1e1eab731f48" path="/var/lib/kubelet/pods/b241d4d6-31f1-4523-89df-1e1eab731f48/volumes" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.193109 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf" path="/var/lib/kubelet/pods/be0ef32a-c74c-44a5-914a-e3a5e8d2aeaf/volumes" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.194118 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41685f3-ac56-4369-9528-04ded1edb06b" path="/var/lib/kubelet/pods/d41685f3-ac56-4369-9528-04ded1edb06b/volumes" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.227791 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm"] Jan 21 15:45:43 crc 
kubenswrapper[4707]: I0121 15:45:43.319266 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr"] Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.498799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" event={"ID":"8cd0abca-b95e-45bf-8325-0c12088ce0d1","Type":"ContainerStarted","Data":"b06e320fecab1c2dbd21b7002caf646cd135faaf82ae1ebd489ce9105a961412"} Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.499042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" event={"ID":"8cd0abca-b95e-45bf-8325-0c12088ce0d1","Type":"ContainerStarted","Data":"e05ccdfa23005a1db801d7f4c48823c43613a762fed518ede7eca4b7b19a3c16"} Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.500404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" event={"ID":"875e9554-48a4-49ca-8a6f-2408eb57723f","Type":"ContainerStarted","Data":"20101a8907237b77b8724d313d47000e9615ffb81952a3f5d753c750223d766c"} Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.500444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" event={"ID":"875e9554-48a4-49ca-8a6f-2408eb57723f","Type":"ContainerStarted","Data":"9b9923bec270c135ad949e412ef9981d70613b01b66d00f5f95481785bc46736"} Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.513251 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" podStartSLOduration=1.513238326 podStartE2EDuration="1.513238326s" podCreationTimestamp="2026-01-21 15:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:43.509540383 +0000 UTC m=+2640.691056605" watchObservedRunningTime="2026-01-21 15:45:43.513238326 +0000 UTC m=+2640.694754549" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.528729 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" podStartSLOduration=1.5287186529999999 podStartE2EDuration="1.528718653s" podCreationTimestamp="2026-01-21 15:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:43.525437023 +0000 UTC m=+2640.706953235" watchObservedRunningTime="2026-01-21 15:45:43.528718653 +0000 UTC m=+2640.710234875" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.786518 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.786729 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" containerName="openstackclient" containerID="cri-o://bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9" gracePeriod=2 Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.795444 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.826762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:45:43 crc kubenswrapper[4707]: E0121 15:45:43.827660 
4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" containerName="openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.827742 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" containerName="openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.828249 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" containerName="openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.829219 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.834294 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.838128 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.991281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jps8s\" (UniqueName: \"kubernetes.io/projected/45e9f148-78c8-4e0e-adf4-ae14e936e166-kube-api-access-jps8s\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.991465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.991537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-combined-ca-bundle\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:43 crc kubenswrapper[4707]: I0121 15:45:43.991679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config-secret\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.093934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config-secret\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.094039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jps8s\" (UniqueName: \"kubernetes.io/projected/45e9f148-78c8-4e0e-adf4-ae14e936e166-kube-api-access-jps8s\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.094096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.094123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-combined-ca-bundle\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.094837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.097575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-combined-ca-bundle\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.104993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config-secret\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.108217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jps8s\" (UniqueName: \"kubernetes.io/projected/45e9f148-78c8-4e0e-adf4-ae14e936e166-kube-api-access-jps8s\") pod \"openstackclient\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.165364 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.509428 4707 generic.go:334] "Generic (PLEG): container finished" podID="7cfb7b09-0ff5-4134-90e4-2b04813ca151" containerID="2a68ebdabb09d5063a608b567a0d520fdc9896d30daf8aed958ebfbd4a7454b6" exitCode=0 Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.509592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7cfb7b09-0ff5-4134-90e4-2b04813ca151","Type":"ContainerDied","Data":"2a68ebdabb09d5063a608b567a0d520fdc9896d30daf8aed958ebfbd4a7454b6"} Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.603187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.646375 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.807252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-config-data\") pod \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.807353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp9tk\" (UniqueName: \"kubernetes.io/projected/7cfb7b09-0ff5-4134-90e4-2b04813ca151-kube-api-access-lp9tk\") pod \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.807403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-combined-ca-bundle\") pod \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\" (UID: \"7cfb7b09-0ff5-4134-90e4-2b04813ca151\") " Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.811630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfb7b09-0ff5-4134-90e4-2b04813ca151-kube-api-access-lp9tk" (OuterVolumeSpecName: "kube-api-access-lp9tk") pod "7cfb7b09-0ff5-4134-90e4-2b04813ca151" (UID: "7cfb7b09-0ff5-4134-90e4-2b04813ca151"). InnerVolumeSpecName "kube-api-access-lp9tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.828821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cfb7b09-0ff5-4134-90e4-2b04813ca151" (UID: "7cfb7b09-0ff5-4134-90e4-2b04813ca151"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.829002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-config-data" (OuterVolumeSpecName: "config-data") pod "7cfb7b09-0ff5-4134-90e4-2b04813ca151" (UID: "7cfb7b09-0ff5-4134-90e4-2b04813ca151"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.910062 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp9tk\" (UniqueName: \"kubernetes.io/projected/7cfb7b09-0ff5-4134-90e4-2b04813ca151-kube-api-access-lp9tk\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.910090 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:44 crc kubenswrapper[4707]: I0121 15:45:44.910099 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb7b09-0ff5-4134-90e4-2b04813ca151-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.455044 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.44:8775/\": read tcp 10.217.0.2:44594->10.217.1.44:8775: read: connection reset by peer" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.455257 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.44:8775/\": read tcp 10.217.0.2:44604->10.217.1.44:8775: read: connection reset by peer" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.518956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7cfb7b09-0ff5-4134-90e4-2b04813ca151","Type":"ContainerDied","Data":"fc50202a4d0fab39abe73fbc0cdcbfccca1803dad1e6818435550ddd7852f377"} Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.518974 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.519023 4707 scope.go:117] "RemoveContainer" containerID="2a68ebdabb09d5063a608b567a0d520fdc9896d30daf8aed958ebfbd4a7454b6" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.520696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"45e9f148-78c8-4e0e-adf4-ae14e936e166","Type":"ContainerStarted","Data":"e41f28d469e56bcfb24ec87ca701c01fb43455c941908db97110f75eeb045abb"} Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.520724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"45e9f148-78c8-4e0e-adf4-ae14e936e166","Type":"ContainerStarted","Data":"2c6a797dd408ce0849137ef60cd0e11fe8f4e25380db6af102d9aee0920d7cde"} Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.540508 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.54049512 podStartE2EDuration="2.54049512s" podCreationTimestamp="2026-01-21 15:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:45.533211165 +0000 UTC m=+2642.714727387" watchObservedRunningTime="2026-01-21 15:45:45.54049512 +0000 UTC m=+2642.722011343" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.556760 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.568473 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.581182 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:45 crc kubenswrapper[4707]: E0121 15:45:45.581631 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfb7b09-0ff5-4134-90e4-2b04813ca151" containerName="nova-scheduler-scheduler" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.581642 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfb7b09-0ff5-4134-90e4-2b04813ca151" containerName="nova-scheduler-scheduler" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.581841 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfb7b09-0ff5-4134-90e4-2b04813ca151" containerName="nova-scheduler-scheduler" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.582442 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.585168 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.587568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.724219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-config-data\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.724423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsn2\" (UniqueName: \"kubernetes.io/projected/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-kube-api-access-clsn2\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.724471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.821648 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgdv\" (UniqueName: \"kubernetes.io/projected/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-kube-api-access-jqgdv\") pod \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-combined-ca-bundle\") pod \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-nova-metadata-tls-certs\") pod \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-logs\") pod \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\" (UID: \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-config-data\") pod \"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\" (UID: 
\"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e\") " Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clsn2\" (UniqueName: \"kubernetes.io/projected/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-kube-api-access-clsn2\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.825574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-config-data\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.826511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-logs" (OuterVolumeSpecName: "logs") pod "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" (UID: "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.829673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-config-data\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.830170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-kube-api-access-jqgdv" (OuterVolumeSpecName: "kube-api-access-jqgdv") pod "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" (UID: "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e"). InnerVolumeSpecName "kube-api-access-jqgdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.832264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.849652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsn2\" (UniqueName: \"kubernetes.io/projected/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-kube-api-access-clsn2\") pod \"nova-scheduler-0\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.862526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-config-data" (OuterVolumeSpecName: "config-data") pod "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" (UID: "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.867987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" (UID: "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.896246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" (UID: "2846d28b-24cc-4f6a-bd5b-dc099af2bc3e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.901669 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.927295 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqgdv\" (UniqueName: \"kubernetes.io/projected/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-kube-api-access-jqgdv\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.927337 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.927349 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.927371 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4707]: I0121 15:45:45.927381 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.016506 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.028265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-combined-ca-bundle\") pod \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.028342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgrfg\" (UniqueName: \"kubernetes.io/projected/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-kube-api-access-fgrfg\") pod \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.028422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config\") pod \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.028518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config-secret\") pod \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\" (UID: \"83d92527-f8b9-40f5-8a6d-4abb2bb1ef63\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.032055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-kube-api-access-fgrfg" (OuterVolumeSpecName: "kube-api-access-fgrfg") pod "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" (UID: "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63"). InnerVolumeSpecName "kube-api-access-fgrfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.048011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" (UID: "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.054950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" (UID: "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.064504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" (UID: "83d92527-f8b9-40f5-8a6d-4abb2bb1ef63"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.133899 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.134027 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.134094 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.134149 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgrfg\" (UniqueName: \"kubernetes.io/projected/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63-kube-api-access-fgrfg\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: W0121 15:45:46.313475 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7840e0f2_352b_4bc3_9a47_5bb1aa88d1dd.slice/crio-10f98f4abc2c4eaf26dd9b149affd779be69c5cd1190d22bd7d02aa0c5bc5022 WatchSource:0}: Error finding container 10f98f4abc2c4eaf26dd9b149affd779be69c5cd1190d22bd7d02aa0c5bc5022: Status 404 returned error can't find the container with id 10f98f4abc2c4eaf26dd9b149affd779be69c5cd1190d22bd7d02aa0c5bc5022 Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.314283 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.365035 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-scripts\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-config-data\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-combined-ca-bundle\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjrtz\" (UniqueName: \"kubernetes.io/projected/7fc127ff-9162-4e68-af5b-b4085cce670c-kube-api-access-hjrtz\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-internal-tls-certs\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-fernet-keys\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-public-tls-certs\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.438978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-credential-keys\") pod \"7fc127ff-9162-4e68-af5b-b4085cce670c\" (UID: \"7fc127ff-9162-4e68-af5b-b4085cce670c\") " Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.441229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-scripts" (OuterVolumeSpecName: "scripts") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.442739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.443031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc127ff-9162-4e68-af5b-b4085cce670c-kube-api-access-hjrtz" (OuterVolumeSpecName: "kube-api-access-hjrtz") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "kube-api-access-hjrtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.443731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.459770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.462018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-config-data" (OuterVolumeSpecName: "config-data") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.475571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.475591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7fc127ff-9162-4e68-af5b-b4085cce670c" (UID: "7fc127ff-9162-4e68-af5b-b4085cce670c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.530972 4707 generic.go:334] "Generic (PLEG): container finished" podID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerID="9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1" exitCode=0 Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.531020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e","Type":"ContainerDied","Data":"9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1"} Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.531070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2846d28b-24cc-4f6a-bd5b-dc099af2bc3e","Type":"ContainerDied","Data":"ff17493926a97d3e459915a28b84cbdb2de941005a3f8c06897af0f4ad08180a"} Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.531063 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.531084 4707 scope.go:117] "RemoveContainer" containerID="9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.533514 4707 generic.go:334] "Generic (PLEG): container finished" podID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" containerID="bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9" exitCode=137 Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.533543 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.536974 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fc127ff-9162-4e68-af5b-b4085cce670c" containerID="15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d" exitCode=0 Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.537013 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.537007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" event={"ID":"7fc127ff-9162-4e68-af5b-b4085cce670c","Type":"ContainerDied","Data":"15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d"} Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.537134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-56cb4c686f-qqcjc" event={"ID":"7fc127ff-9162-4e68-af5b-b4085cce670c","Type":"ContainerDied","Data":"6126481056f8f5356ff3d3043762f274a0bfbb51e2d8181c2a95e5c936ec8d06"} Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.539237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd","Type":"ContainerStarted","Data":"54d6c16d4e52b40f86e960b3984187cbdfd603f81abadb13eb6865cac9df33cb"} Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.539268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd","Type":"ContainerStarted","Data":"10f98f4abc2c4eaf26dd9b149affd779be69c5cd1190d22bd7d02aa0c5bc5022"} Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540538 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540554 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540564 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540572 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540581 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjrtz\" (UniqueName: \"kubernetes.io/projected/7fc127ff-9162-4e68-af5b-b4085cce670c-kube-api-access-hjrtz\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540591 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540601 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.540609 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fc127ff-9162-4e68-af5b-b4085cce670c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.554350 4707 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.557733 4707 scope.go:117] "RemoveContainer" containerID="954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.562834 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.562796235 podStartE2EDuration="1.562796235s" podCreationTimestamp="2026-01-21 15:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:46.552671096 +0000 UTC m=+2643.734187318" watchObservedRunningTime="2026-01-21 15:45:46.562796235 +0000 UTC m=+2643.744312457" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.577232 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.587687 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.588063 4707 scope.go:117] "RemoveContainer" containerID="9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1" Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.588426 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1\": container with ID starting with 9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1 not found: ID does not exist" containerID="9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.588462 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1"} err="failed to get container status \"9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1\": rpc error: code = NotFound desc = could not find container \"9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1\": container with ID starting with 9e028458d999310d5b416ec521f0297cf9f73999febc9f55321a1f5307adc4e1 not found: ID does not exist" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.588481 4707 scope.go:117] "RemoveContainer" containerID="954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f" Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.588752 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f\": container with ID starting with 954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f not found: ID does not exist" containerID="954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.589195 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f"} err="failed to get container status \"954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f\": rpc error: code = NotFound desc = 
could not find container \"954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f\": container with ID starting with 954aa18da7b7bc7ec213732ed63669c973b990f3b3d7d411bb8254874746377f not found: ID does not exist" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.589220 4707 scope.go:117] "RemoveContainer" containerID="bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598190 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.598571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-metadata" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598586 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-metadata" Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.598622 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc127ff-9162-4e68-af5b-b4085cce670c" containerName="keystone-api" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598628 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc127ff-9162-4e68-af5b-b4085cce670c" containerName="keystone-api" Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.598646 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-log" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598652 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-log" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598845 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-log" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" containerName="nova-metadata-metadata" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.598884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc127ff-9162-4e68-af5b-b4085cce670c" containerName="keystone-api" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.599787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.603639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.604762 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.613513 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-56cb4c686f-qqcjc"] Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.614399 4707 scope.go:117] "RemoveContainer" containerID="bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9" Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.615055 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9\": container with ID starting with bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9 not found: ID does not exist" containerID="bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.615084 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9"} err="failed to get container status \"bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9\": rpc error: code = NotFound desc = could not find container \"bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9\": container with ID starting with bc04fcbc8547907181d9b450951403d09f952bcf977a976367fae96360f582e9 not found: ID does not exist" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.615103 4707 scope.go:117] "RemoveContainer" containerID="15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.621193 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-56cb4c686f-qqcjc"] Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.627822 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.642235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.642496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-config-data\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.642828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.642858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9wbl\" (UniqueName: \"kubernetes.io/projected/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-kube-api-access-z9wbl\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.643109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-logs\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.659924 4707 scope.go:117] "RemoveContainer" containerID="15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d" Jan 21 15:45:46 crc kubenswrapper[4707]: E0121 15:45:46.660380 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d\": container with ID starting with 15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d not found: ID does not exist" containerID="15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.660410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d"} err="failed to get container status \"15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d\": rpc error: code = NotFound desc = could not find container \"15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d\": container with ID starting with 15b695d49e404956f3581cb8ab80e257b87a45a621bab7aa171c10d6534e5b3d not found: ID does not exist" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.745288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-logs\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.745355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.745420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-config-data\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.745448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" 
Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.745466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9wbl\" (UniqueName: \"kubernetes.io/projected/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-kube-api-access-z9wbl\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.745691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-logs\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.749056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.749114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-config-data\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.749580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.758110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wbl\" (UniqueName: \"kubernetes.io/projected/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-kube-api-access-z9wbl\") pod \"nova-metadata-0\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:46 crc kubenswrapper[4707]: I0121 15:45:46.921987 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.014594 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.053980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92113cd-544a-4c31-9ca9-e56ec4250773-logs\") pod \"f92113cd-544a-4c31-9ca9-e56ec4250773\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.054017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-config-data\") pod \"f92113cd-544a-4c31-9ca9-e56ec4250773\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.054042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-internal-tls-certs\") pod \"f92113cd-544a-4c31-9ca9-e56ec4250773\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.054092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-public-tls-certs\") pod \"f92113cd-544a-4c31-9ca9-e56ec4250773\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.054145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qmf\" (UniqueName: \"kubernetes.io/projected/f92113cd-544a-4c31-9ca9-e56ec4250773-kube-api-access-85qmf\") pod \"f92113cd-544a-4c31-9ca9-e56ec4250773\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.054180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-combined-ca-bundle\") pod \"f92113cd-544a-4c31-9ca9-e56ec4250773\" (UID: \"f92113cd-544a-4c31-9ca9-e56ec4250773\") " Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.055226 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92113cd-544a-4c31-9ca9-e56ec4250773-logs" (OuterVolumeSpecName: "logs") pod "f92113cd-544a-4c31-9ca9-e56ec4250773" (UID: "f92113cd-544a-4c31-9ca9-e56ec4250773"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.063104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92113cd-544a-4c31-9ca9-e56ec4250773-kube-api-access-85qmf" (OuterVolumeSpecName: "kube-api-access-85qmf") pod "f92113cd-544a-4c31-9ca9-e56ec4250773" (UID: "f92113cd-544a-4c31-9ca9-e56ec4250773"). InnerVolumeSpecName "kube-api-access-85qmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.093137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f92113cd-544a-4c31-9ca9-e56ec4250773" (UID: "f92113cd-544a-4c31-9ca9-e56ec4250773"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.098755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-config-data" (OuterVolumeSpecName: "config-data") pod "f92113cd-544a-4c31-9ca9-e56ec4250773" (UID: "f92113cd-544a-4c31-9ca9-e56ec4250773"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.106285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f92113cd-544a-4c31-9ca9-e56ec4250773" (UID: "f92113cd-544a-4c31-9ca9-e56ec4250773"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.120692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f92113cd-544a-4c31-9ca9-e56ec4250773" (UID: "f92113cd-544a-4c31-9ca9-e56ec4250773"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.156245 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.156269 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qmf\" (UniqueName: \"kubernetes.io/projected/f92113cd-544a-4c31-9ca9-e56ec4250773-kube-api-access-85qmf\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.156288 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.156296 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92113cd-544a-4c31-9ca9-e56ec4250773-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.156306 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.156314 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92113cd-544a-4c31-9ca9-e56ec4250773-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.190661 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2846d28b-24cc-4f6a-bd5b-dc099af2bc3e" path="/var/lib/kubelet/pods/2846d28b-24cc-4f6a-bd5b-dc099af2bc3e/volumes" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.191212 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cfb7b09-0ff5-4134-90e4-2b04813ca151" path="/var/lib/kubelet/pods/7cfb7b09-0ff5-4134-90e4-2b04813ca151/volumes" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.191673 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc127ff-9162-4e68-af5b-b4085cce670c" path="/var/lib/kubelet/pods/7fc127ff-9162-4e68-af5b-b4085cce670c/volumes" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.192598 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d92527-f8b9-40f5-8a6d-4abb2bb1ef63" path="/var/lib/kubelet/pods/83d92527-f8b9-40f5-8a6d-4abb2bb1ef63/volumes" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.314663 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.551788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a9ec36a7-0e2d-4e93-be37-4a9247bf3426","Type":"ContainerStarted","Data":"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.551985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a9ec36a7-0e2d-4e93-be37-4a9247bf3426","Type":"ContainerStarted","Data":"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.551997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a9ec36a7-0e2d-4e93-be37-4a9247bf3426","Type":"ContainerStarted","Data":"c1aaf7ad9ee93479b686136b02e2a8f2d38a6e4bad77917fafce181d4aa8e25d"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.554751 4707 generic.go:334] "Generic (PLEG): container finished" podID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerID="0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9" exitCode=0 Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.554824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f92113cd-544a-4c31-9ca9-e56ec4250773","Type":"ContainerDied","Data":"0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.554851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f92113cd-544a-4c31-9ca9-e56ec4250773","Type":"ContainerDied","Data":"c97009d6035ce5f1df4e31d6af5ed4249e065135446a5927ebc0e8105fdeb663"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.554867 4707 scope.go:117] "RemoveContainer" containerID="0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.554948 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.559832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" event={"ID":"8cd0abca-b95e-45bf-8325-0c12088ce0d1","Type":"ContainerDied","Data":"b06e320fecab1c2dbd21b7002caf646cd135faaf82ae1ebd489ce9105a961412"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.559795 4707 generic.go:334] "Generic (PLEG): container finished" podID="8cd0abca-b95e-45bf-8325-0c12088ce0d1" containerID="b06e320fecab1c2dbd21b7002caf646cd135faaf82ae1ebd489ce9105a961412" exitCode=0 Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.568257 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.5682441599999999 podStartE2EDuration="1.56824416s" podCreationTimestamp="2026-01-21 15:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:47.563498256 +0000 UTC m=+2644.745014478" watchObservedRunningTime="2026-01-21 15:45:47.56824416 +0000 UTC m=+2644.749760382" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.571789 4707 generic.go:334] "Generic (PLEG): container finished" podID="875e9554-48a4-49ca-8a6f-2408eb57723f" containerID="20101a8907237b77b8724d313d47000e9615ffb81952a3f5d753c750223d766c" exitCode=0 Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.571856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" event={"ID":"875e9554-48a4-49ca-8a6f-2408eb57723f","Type":"ContainerDied","Data":"20101a8907237b77b8724d313d47000e9615ffb81952a3f5d753c750223d766c"} Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.594353 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.595510 4707 scope.go:117] "RemoveContainer" containerID="6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.608219 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.626544 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:47 crc kubenswrapper[4707]: E0121 15:45:47.626917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.626934 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" Jan 21 15:45:47 crc kubenswrapper[4707]: E0121 15:45:47.626969 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.626975 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.627127 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-log" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.627143 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" containerName="nova-api-api" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.627968 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.629650 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.629861 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.630107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.642969 4707 scope.go:117] "RemoveContainer" containerID="0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9" Jan 21 15:45:47 crc kubenswrapper[4707]: E0121 15:45:47.643608 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9\": container with ID starting with 0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9 not found: ID does not exist" containerID="0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.643644 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9"} err="failed to get container status \"0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9\": rpc error: code = NotFound desc = could not find container \"0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9\": container with ID starting with 0356039d8f03cdeaecd520d0c61912c41e04b9710ea965a261bcf1484426fde9 not found: ID does not exist" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.643666 4707 scope.go:117] "RemoveContainer" containerID="6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f" Jan 21 15:45:47 crc kubenswrapper[4707]: E0121 15:45:47.644044 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f\": container with ID starting with 6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f not found: ID does not exist" containerID="6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.644146 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f"} err="failed to get container status \"6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f\": rpc error: code = NotFound desc = could not find container \"6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f\": container with ID starting with 6574f750051d12fef242ca50f4379587ffa4805f9fee6513b0de899e55007f5f not found: ID does not exist" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.651726 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.765978 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7dd\" (UniqueName: \"kubernetes.io/projected/be0fd042-23db-4714-8700-3ec1a0a09516-kube-api-access-nj7dd\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.766031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-internal-tls-certs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.766052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0fd042-23db-4714-8700-3ec1a0a09516-logs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.766192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-public-tls-certs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.766224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.766241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-config-data\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.867493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-internal-tls-certs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.867536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0fd042-23db-4714-8700-3ec1a0a09516-logs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.867602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-public-tls-certs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.867629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.867648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-config-data\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.867716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7dd\" (UniqueName: \"kubernetes.io/projected/be0fd042-23db-4714-8700-3ec1a0a09516-kube-api-access-nj7dd\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.869870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0fd042-23db-4714-8700-3ec1a0a09516-logs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.873087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-config-data\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.875248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-internal-tls-certs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.875330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-public-tls-certs\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.875345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.888582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7dd\" (UniqueName: \"kubernetes.io/projected/be0fd042-23db-4714-8700-3ec1a0a09516-kube-api-access-nj7dd\") pod \"nova-api-0\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:47 crc kubenswrapper[4707]: I0121 15:45:47.958363 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.339417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.584001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"be0fd042-23db-4714-8700-3ec1a0a09516","Type":"ContainerStarted","Data":"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a"} Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.584059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"be0fd042-23db-4714-8700-3ec1a0a09516","Type":"ContainerStarted","Data":"c00c707a9f815988e694e63662e6cb3fe8a70ea66f4623bb3e2e3cd990cd160c"} Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.809909 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.888555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-scripts\") pod \"875e9554-48a4-49ca-8a6f-2408eb57723f\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.888683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-combined-ca-bundle\") pod \"875e9554-48a4-49ca-8a6f-2408eb57723f\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.888750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlbwq\" (UniqueName: \"kubernetes.io/projected/875e9554-48a4-49ca-8a6f-2408eb57723f-kube-api-access-xlbwq\") pod \"875e9554-48a4-49ca-8a6f-2408eb57723f\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.888777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-config-data\") pod \"875e9554-48a4-49ca-8a6f-2408eb57723f\" (UID: \"875e9554-48a4-49ca-8a6f-2408eb57723f\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.892894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-scripts" (OuterVolumeSpecName: "scripts") pod "875e9554-48a4-49ca-8a6f-2408eb57723f" (UID: "875e9554-48a4-49ca-8a6f-2408eb57723f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.893062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875e9554-48a4-49ca-8a6f-2408eb57723f-kube-api-access-xlbwq" (OuterVolumeSpecName: "kube-api-access-xlbwq") pod "875e9554-48a4-49ca-8a6f-2408eb57723f" (UID: "875e9554-48a4-49ca-8a6f-2408eb57723f"). InnerVolumeSpecName "kube-api-access-xlbwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.910932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "875e9554-48a4-49ca-8a6f-2408eb57723f" (UID: "875e9554-48a4-49ca-8a6f-2408eb57723f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.911956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-config-data" (OuterVolumeSpecName: "config-data") pod "875e9554-48a4-49ca-8a6f-2408eb57723f" (UID: "875e9554-48a4-49ca-8a6f-2408eb57723f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.966265 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-scripts\") pod \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-combined-ca-bundle\") pod \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnghf\" (UniqueName: \"kubernetes.io/projected/8cd0abca-b95e-45bf-8325-0c12088ce0d1-kube-api-access-bnghf\") pod \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-config-data\") pod \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\" (UID: \"8cd0abca-b95e-45bf-8325-0c12088ce0d1\") " Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990742 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990761 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlbwq\" (UniqueName: \"kubernetes.io/projected/875e9554-48a4-49ca-8a6f-2408eb57723f-kube-api-access-xlbwq\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990771 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.990780 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/875e9554-48a4-49ca-8a6f-2408eb57723f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.999502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-scripts" (OuterVolumeSpecName: "scripts") pod "8cd0abca-b95e-45bf-8325-0c12088ce0d1" (UID: "8cd0abca-b95e-45bf-8325-0c12088ce0d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:48 crc kubenswrapper[4707]: I0121 15:45:48.999522 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd0abca-b95e-45bf-8325-0c12088ce0d1-kube-api-access-bnghf" (OuterVolumeSpecName: "kube-api-access-bnghf") pod "8cd0abca-b95e-45bf-8325-0c12088ce0d1" (UID: "8cd0abca-b95e-45bf-8325-0c12088ce0d1"). InnerVolumeSpecName "kube-api-access-bnghf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.012344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-config-data" (OuterVolumeSpecName: "config-data") pod "8cd0abca-b95e-45bf-8325-0c12088ce0d1" (UID: "8cd0abca-b95e-45bf-8325-0c12088ce0d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.015254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cd0abca-b95e-45bf-8325-0c12088ce0d1" (UID: "8cd0abca-b95e-45bf-8325-0c12088ce0d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.091710 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.091735 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.091747 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd0abca-b95e-45bf-8325-0c12088ce0d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.091758 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnghf\" (UniqueName: \"kubernetes.io/projected/8cd0abca-b95e-45bf-8325-0c12088ce0d1-kube-api-access-bnghf\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.193096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92113cd-544a-4c31-9ca9-e56ec4250773" path="/var/lib/kubelet/pods/f92113cd-544a-4c31-9ca9-e56ec4250773/volumes" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.592817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"be0fd042-23db-4714-8700-3ec1a0a09516","Type":"ContainerStarted","Data":"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f"} Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.594134 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.594132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm" event={"ID":"8cd0abca-b95e-45bf-8325-0c12088ce0d1","Type":"ContainerDied","Data":"e05ccdfa23005a1db801d7f4c48823c43613a762fed518ede7eca4b7b19a3c16"} Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.594557 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05ccdfa23005a1db801d7f4c48823c43613a762fed518ede7eca4b7b19a3c16" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.595942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" event={"ID":"875e9554-48a4-49ca-8a6f-2408eb57723f","Type":"ContainerDied","Data":"9b9923bec270c135ad949e412ef9981d70613b01b66d00f5f95481785bc46736"} Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.595963 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b9923bec270c135ad949e412ef9981d70613b01b66d00f5f95481785bc46736" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.595969 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.611731 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.611717004 podStartE2EDuration="2.611717004s" podCreationTimestamp="2026-01-21 15:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:49.605717583 +0000 UTC m=+2646.787233806" watchObservedRunningTime="2026-01-21 15:45:49.611717004 +0000 UTC m=+2646.793233226" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.713824 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.727558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.727743 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" containerName="nova-scheduler-scheduler" containerID="cri-o://54d6c16d4e52b40f86e960b3984187cbdfd603f81abadb13eb6865cac9df33cb" gracePeriod=30 Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.743963 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.744148 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-log" containerID="cri-o://5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30" gracePeriod=30 Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.744233 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-metadata" containerID="cri-o://829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72" gracePeriod=30 Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.881384 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw"] Jan 21 15:45:49 crc kubenswrapper[4707]: E0121 15:45:49.881749 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd0abca-b95e-45bf-8325-0c12088ce0d1" containerName="nova-manage" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.881767 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd0abca-b95e-45bf-8325-0c12088ce0d1" containerName="nova-manage" Jan 21 15:45:49 crc kubenswrapper[4707]: E0121 15:45:49.881781 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875e9554-48a4-49ca-8a6f-2408eb57723f" containerName="nova-manage" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.881788 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="875e9554-48a4-49ca-8a6f-2408eb57723f" containerName="nova-manage" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.881968 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd0abca-b95e-45bf-8325-0c12088ce0d1" containerName="nova-manage" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.881992 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="875e9554-48a4-49ca-8a6f-2408eb57723f" containerName="nova-manage" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.882838 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.891248 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw"] Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-public-tls-certs\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rhz\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-kube-api-access-88rhz\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-combined-ca-bundle\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-log-httpd\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-etc-swift\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-config-data\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-internal-tls-certs\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:49 crc kubenswrapper[4707]: I0121 15:45:49.906669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-run-httpd\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.007789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-public-tls-certs\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.007949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rhz\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-kube-api-access-88rhz\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.007974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-combined-ca-bundle\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-log-httpd\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-etc-swift\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-config-data\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-internal-tls-certs\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-run-httpd\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008612 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-log-httpd\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.008710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-run-httpd\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.012230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-public-tls-certs\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.012959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-internal-tls-certs\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.013445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-config-data\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.013941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-combined-ca-bundle\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.017081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-etc-swift\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.021532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rhz\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-kube-api-access-88rhz\") pod \"swift-proxy-6b78579b5b-5nmdw\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.183124 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:45:50 crc kubenswrapper[4707]: E0121 15:45:50.183372 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.190531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.211592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-config-data\") pod \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.212295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9wbl\" (UniqueName: \"kubernetes.io/projected/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-kube-api-access-z9wbl\") pod \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.212322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-combined-ca-bundle\") pod \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.212376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-nova-metadata-tls-certs\") pod \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.212394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-logs\") pod \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\" (UID: \"a9ec36a7-0e2d-4e93-be37-4a9247bf3426\") " Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.212790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-logs" (OuterVolumeSpecName: "logs") pod "a9ec36a7-0e2d-4e93-be37-4a9247bf3426" (UID: "a9ec36a7-0e2d-4e93-be37-4a9247bf3426"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.213717 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.224933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-kube-api-access-z9wbl" (OuterVolumeSpecName: "kube-api-access-z9wbl") pod "a9ec36a7-0e2d-4e93-be37-4a9247bf3426" (UID: "a9ec36a7-0e2d-4e93-be37-4a9247bf3426"). InnerVolumeSpecName "kube-api-access-z9wbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.238744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9ec36a7-0e2d-4e93-be37-4a9247bf3426" (UID: "a9ec36a7-0e2d-4e93-be37-4a9247bf3426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.241771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-config-data" (OuterVolumeSpecName: "config-data") pod "a9ec36a7-0e2d-4e93-be37-4a9247bf3426" (UID: "a9ec36a7-0e2d-4e93-be37-4a9247bf3426"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.248989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a9ec36a7-0e2d-4e93-be37-4a9247bf3426" (UID: "a9ec36a7-0e2d-4e93-be37-4a9247bf3426"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.283260 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.315507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9wbl\" (UniqueName: \"kubernetes.io/projected/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-kube-api-access-z9wbl\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.315716 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.315725 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.315733 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec36a7-0e2d-4e93-be37-4a9247bf3426-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.604481 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerID="829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72" exitCode=0 Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.604513 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerID="5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30" exitCode=143 Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.604888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a9ec36a7-0e2d-4e93-be37-4a9247bf3426","Type":"ContainerDied","Data":"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72"} Jan 21 15:45:50 crc kubenswrapper[4707]: 
I0121 15:45:50.604922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a9ec36a7-0e2d-4e93-be37-4a9247bf3426","Type":"ContainerDied","Data":"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30"} Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.604925 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.604943 4707 scope.go:117] "RemoveContainer" containerID="829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.604932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a9ec36a7-0e2d-4e93-be37-4a9247bf3426","Type":"ContainerDied","Data":"c1aaf7ad9ee93479b686136b02e2a8f2d38a6e4bad77917fafce181d4aa8e25d"} Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.623244 4707 scope.go:117] "RemoveContainer" containerID="5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.639611 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.645184 4707 scope.go:117] "RemoveContainer" containerID="829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.646371 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:50 crc kubenswrapper[4707]: E0121 15:45:50.647268 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72\": container with ID starting with 829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72 not found: ID does not exist" containerID="829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.647313 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72"} err="failed to get container status \"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72\": rpc error: code = NotFound desc = could not find container \"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72\": container with ID starting with 829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72 not found: ID does not exist" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.647335 4707 scope.go:117] "RemoveContainer" containerID="5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30" Jan 21 15:45:50 crc kubenswrapper[4707]: E0121 15:45:50.647708 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30\": container with ID starting with 5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30 not found: ID does not exist" containerID="5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.647729 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30"} err="failed to get container status \"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30\": rpc error: code = NotFound desc = could not find container \"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30\": container with ID starting with 5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30 not found: ID does not exist" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.647742 4707 scope.go:117] "RemoveContainer" containerID="829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.648093 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72"} err="failed to get container status \"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72\": rpc error: code = NotFound desc = could not find container \"829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72\": container with ID starting with 829086606c4e47d12511d1a1140741dade23894e87c038dff79ac0bf851b6c72 not found: ID does not exist" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.648133 4707 scope.go:117] "RemoveContainer" containerID="5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.648444 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30"} err="failed to get container status \"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30\": rpc error: code = NotFound desc = could not find container \"5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30\": container with ID starting with 5bc4bb5fd56c1d2fc3a45f5782c7959b5cb964d152032647bca33daf89816e30 not found: ID does not exist" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.653541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:50 crc kubenswrapper[4707]: E0121 15:45:50.655683 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-metadata" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.655705 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-metadata" Jan 21 15:45:50 crc kubenswrapper[4707]: E0121 15:45:50.655723 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-log" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.655729 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-log" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.655911 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-metadata" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.655930 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" containerName="nova-metadata-log" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.657168 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.659198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.659401 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.663140 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.687122 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw"] Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.721605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-config-data\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.721640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlcc9\" (UniqueName: \"kubernetes.io/projected/6f277223-7e29-49e4-a1e5-f02c4d883ba7-kube-api-access-nlcc9\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.721734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.721779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f277223-7e29-49e4-a1e5-f02c4d883ba7-logs\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.721803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.822918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-config-data\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.822955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlcc9\" (UniqueName: \"kubernetes.io/projected/6f277223-7e29-49e4-a1e5-f02c4d883ba7-kube-api-access-nlcc9\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 
15:45:50.823072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.823126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f277223-7e29-49e4-a1e5-f02c4d883ba7-logs\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.823155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.823435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f277223-7e29-49e4-a1e5-f02c4d883ba7-logs\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.825579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-config-data\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.826021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.826987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.836092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlcc9\" (UniqueName: \"kubernetes.io/projected/6f277223-7e29-49e4-a1e5-f02c4d883ba7-kube-api-access-nlcc9\") pod \"nova-metadata-0\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.902512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:45:50 crc kubenswrapper[4707]: I0121 15:45:50.971551 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.191372 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ec36a7-0e2d-4e93-be37-4a9247bf3426" path="/var/lib/kubelet/pods/a9ec36a7-0e2d-4e93-be37-4a9247bf3426/volumes" Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.344350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:45:51 crc kubenswrapper[4707]: W0121 15:45:51.344364 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f277223_7e29_49e4_a1e5_f02c4d883ba7.slice/crio-ad2e37f4585e6a550e6367b0cf18458e80f18c8aa807d3c719f5ef541788e6bc WatchSource:0}: Error finding container ad2e37f4585e6a550e6367b0cf18458e80f18c8aa807d3c719f5ef541788e6bc: Status 404 returned error can't find the container with id ad2e37f4585e6a550e6367b0cf18458e80f18c8aa807d3c719f5ef541788e6bc Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.612850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" event={"ID":"13204bd0-d001-41b6-b3e3-606d0886ee77","Type":"ContainerStarted","Data":"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4"} Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.613036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" event={"ID":"13204bd0-d001-41b6-b3e3-606d0886ee77","Type":"ContainerStarted","Data":"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748"} Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.613049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" event={"ID":"13204bd0-d001-41b6-b3e3-606d0886ee77","Type":"ContainerStarted","Data":"37cf0b0c1386369086dc0b8ef7d5a1dccc8a920f94d3623675241e4d53c5f19d"} Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.613091 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.613115 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.614689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"6f277223-7e29-49e4-a1e5-f02c4d883ba7","Type":"ContainerStarted","Data":"7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686"} Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.614724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"6f277223-7e29-49e4-a1e5-f02c4d883ba7","Type":"ContainerStarted","Data":"ad2e37f4585e6a550e6367b0cf18458e80f18c8aa807d3c719f5ef541788e6bc"} Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.615916 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-log" containerID="cri-o://bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a" gracePeriod=30 Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.615945 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" 
podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-api" containerID="cri-o://c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f" gracePeriod=30 Jan 21 15:45:51 crc kubenswrapper[4707]: I0121 15:45:51.632104 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" podStartSLOduration=2.632088502 podStartE2EDuration="2.632088502s" podCreationTimestamp="2026-01-21 15:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:51.630350073 +0000 UTC m=+2648.811866295" watchObservedRunningTime="2026-01-21 15:45:51.632088502 +0000 UTC m=+2648.813604724" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.101466 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.142698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-combined-ca-bundle\") pod \"be0fd042-23db-4714-8700-3ec1a0a09516\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.142783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-public-tls-certs\") pod \"be0fd042-23db-4714-8700-3ec1a0a09516\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.142848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-config-data\") pod \"be0fd042-23db-4714-8700-3ec1a0a09516\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.143007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-internal-tls-certs\") pod \"be0fd042-23db-4714-8700-3ec1a0a09516\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.143070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0fd042-23db-4714-8700-3ec1a0a09516-logs\") pod \"be0fd042-23db-4714-8700-3ec1a0a09516\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.143089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj7dd\" (UniqueName: \"kubernetes.io/projected/be0fd042-23db-4714-8700-3ec1a0a09516-kube-api-access-nj7dd\") pod \"be0fd042-23db-4714-8700-3ec1a0a09516\" (UID: \"be0fd042-23db-4714-8700-3ec1a0a09516\") " Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.143580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0fd042-23db-4714-8700-3ec1a0a09516-logs" (OuterVolumeSpecName: "logs") pod "be0fd042-23db-4714-8700-3ec1a0a09516" (UID: "be0fd042-23db-4714-8700-3ec1a0a09516"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.147061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0fd042-23db-4714-8700-3ec1a0a09516-kube-api-access-nj7dd" (OuterVolumeSpecName: "kube-api-access-nj7dd") pod "be0fd042-23db-4714-8700-3ec1a0a09516" (UID: "be0fd042-23db-4714-8700-3ec1a0a09516"). InnerVolumeSpecName "kube-api-access-nj7dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.163068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be0fd042-23db-4714-8700-3ec1a0a09516" (UID: "be0fd042-23db-4714-8700-3ec1a0a09516"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.164146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-config-data" (OuterVolumeSpecName: "config-data") pod "be0fd042-23db-4714-8700-3ec1a0a09516" (UID: "be0fd042-23db-4714-8700-3ec1a0a09516"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.180745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be0fd042-23db-4714-8700-3ec1a0a09516" (UID: "be0fd042-23db-4714-8700-3ec1a0a09516"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.183317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be0fd042-23db-4714-8700-3ec1a0a09516" (UID: "be0fd042-23db-4714-8700-3ec1a0a09516"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.245224 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0fd042-23db-4714-8700-3ec1a0a09516-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.245253 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj7dd\" (UniqueName: \"kubernetes.io/projected/be0fd042-23db-4714-8700-3ec1a0a09516-kube-api-access-nj7dd\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.245265 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.245282 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.245290 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.245297 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0fd042-23db-4714-8700-3ec1a0a09516-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.624834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"6f277223-7e29-49e4-a1e5-f02c4d883ba7","Type":"ContainerStarted","Data":"a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915"} Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626361 4707 generic.go:334] "Generic (PLEG): container finished" podID="be0fd042-23db-4714-8700-3ec1a0a09516" containerID="c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f" exitCode=0 Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626389 4707 generic.go:334] "Generic (PLEG): container finished" podID="be0fd042-23db-4714-8700-3ec1a0a09516" containerID="bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a" exitCode=143 Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"be0fd042-23db-4714-8700-3ec1a0a09516","Type":"ContainerDied","Data":"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f"} Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"be0fd042-23db-4714-8700-3ec1a0a09516","Type":"ContainerDied","Data":"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a"} Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"be0fd042-23db-4714-8700-3ec1a0a09516","Type":"ContainerDied","Data":"c00c707a9f815988e694e63662e6cb3fe8a70ea66f4623bb3e2e3cd990cd160c"} Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626497 4707 scope.go:117] "RemoveContainer" 
containerID="c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.626390 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.640434 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.640416132 podStartE2EDuration="2.640416132s" podCreationTimestamp="2026-01-21 15:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:52.637936589 +0000 UTC m=+2649.819452811" watchObservedRunningTime="2026-01-21 15:45:52.640416132 +0000 UTC m=+2649.821932344" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.649878 4707 scope.go:117] "RemoveContainer" containerID="bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.656999 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.668101 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.671880 4707 scope.go:117] "RemoveContainer" containerID="c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f" Jan 21 15:45:52 crc kubenswrapper[4707]: E0121 15:45:52.672378 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f\": container with ID starting with c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f not found: ID does not exist" containerID="c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.672410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f"} err="failed to get container status \"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f\": rpc error: code = NotFound desc = could not find container \"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f\": container with ID starting with c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f not found: ID does not exist" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.672430 4707 scope.go:117] "RemoveContainer" containerID="bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a" Jan 21 15:45:52 crc kubenswrapper[4707]: E0121 15:45:52.673058 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a\": container with ID starting with bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a not found: ID does not exist" containerID="bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.673101 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a"} err="failed to get container status \"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a\": rpc 
error: code = NotFound desc = could not find container \"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a\": container with ID starting with bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a not found: ID does not exist" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.673114 4707 scope.go:117] "RemoveContainer" containerID="c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.673376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f"} err="failed to get container status \"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f\": rpc error: code = NotFound desc = could not find container \"c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f\": container with ID starting with c45076141307c6c61434fb45e59dc94cb88309c212a286fafa700c838f1aea6f not found: ID does not exist" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.673400 4707 scope.go:117] "RemoveContainer" containerID="bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.673600 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a"} err="failed to get container status \"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a\": rpc error: code = NotFound desc = could not find container \"bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a\": container with ID starting with bd24863e1d87f0143babff8239810b2186821dc7d3971decf9d0c9ce8a9a156a not found: ID does not exist" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.679547 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:52 crc kubenswrapper[4707]: E0121 15:45:52.679940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-log" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.679957 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-log" Jan 21 15:45:52 crc kubenswrapper[4707]: E0121 15:45:52.679974 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-api" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.679980 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-api" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.680157 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-log" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.680186 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" containerName="nova-api-api" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.681095 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.683113 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.683318 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.683448 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.691175 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.753234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.753286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.753325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-public-tls-certs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.753346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svwh\" (UniqueName: \"kubernetes.io/projected/a16e7975-1407-4544-8c46-b0445ac9c311-kube-api-access-7svwh\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.753492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-config-data\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.753551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a16e7975-1407-4544-8c46-b0445ac9c311-logs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.854631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.854673 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.854712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-public-tls-certs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.854737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svwh\" (UniqueName: \"kubernetes.io/projected/a16e7975-1407-4544-8c46-b0445ac9c311-kube-api-access-7svwh\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.854837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-config-data\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.854867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a16e7975-1407-4544-8c46-b0445ac9c311-logs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.855285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a16e7975-1407-4544-8c46-b0445ac9c311-logs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.857865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.858115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-config-data\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.857940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-public-tls-certs\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.858394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:52 crc kubenswrapper[4707]: I0121 15:45:52.868529 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7svwh\" (UniqueName: \"kubernetes.io/projected/a16e7975-1407-4544-8c46-b0445ac9c311-kube-api-access-7svwh\") pod \"nova-api-0\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.003718 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.190635 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0fd042-23db-4714-8700-3ec1a0a09516" path="/var/lib/kubelet/pods/be0fd042-23db-4714-8700-3ec1a0a09516/volumes" Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.367489 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:45:53 crc kubenswrapper[4707]: W0121 15:45:53.367561 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda16e7975_1407_4544_8c46_b0445ac9c311.slice/crio-5869066c2446507cc781b11d5cd9c0700133b0aab3b4ae707b2752d2e09540e0 WatchSource:0}: Error finding container 5869066c2446507cc781b11d5cd9c0700133b0aab3b4ae707b2752d2e09540e0: Status 404 returned error can't find the container with id 5869066c2446507cc781b11d5cd9c0700133b0aab3b4ae707b2752d2e09540e0 Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.635694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a16e7975-1407-4544-8c46-b0445ac9c311","Type":"ContainerStarted","Data":"165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd"} Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.635745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a16e7975-1407-4544-8c46-b0445ac9c311","Type":"ContainerStarted","Data":"2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633"} Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.635759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a16e7975-1407-4544-8c46-b0445ac9c311","Type":"ContainerStarted","Data":"5869066c2446507cc781b11d5cd9c0700133b0aab3b4ae707b2752d2e09540e0"} Jan 21 15:45:53 crc kubenswrapper[4707]: I0121 15:45:53.653881 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.653864631 podStartE2EDuration="1.653864631s" podCreationTimestamp="2026-01-21 15:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:53.648757348 +0000 UTC m=+2650.830273569" watchObservedRunningTime="2026-01-21 15:45:53.653864631 +0000 UTC m=+2650.835380852" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.291078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.291293 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.350483 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct"] Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.350962 4707 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-httpd" containerID="cri-o://8597f56a988d7dd7e3a62e7f1cbbe0a94a68d83003c9ba88a0cb7a507db48eae" gracePeriod=30 Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.351487 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-server" containerID="cri-o://8f7662857b1d2fd154bd23c6d539f25deaead9f07840cdb3f519d53c088180ad" gracePeriod=30 Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.650988 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerID="8f7662857b1d2fd154bd23c6d539f25deaead9f07840cdb3f519d53c088180ad" exitCode=0 Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.651192 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerID="8597f56a988d7dd7e3a62e7f1cbbe0a94a68d83003c9ba88a0cb7a507db48eae" exitCode=0 Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.651255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" event={"ID":"ca53a031-35d7-4861-8ccd-691ec790ce9c","Type":"ContainerDied","Data":"8f7662857b1d2fd154bd23c6d539f25deaead9f07840cdb3f519d53c088180ad"} Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.651290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" event={"ID":"ca53a031-35d7-4861-8ccd-691ec790ce9c","Type":"ContainerDied","Data":"8597f56a988d7dd7e3a62e7f1cbbe0a94a68d83003c9ba88a0cb7a507db48eae"} Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.818642 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.900834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-log-httpd\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.900954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-combined-ca-bundle\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.900984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-run-httpd\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-internal-tls-certs\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-etc-swift\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-config-data\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdvjw\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-kube-api-access-tdvjw\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-public-tls-certs\") pod \"ca53a031-35d7-4861-8ccd-691ec790ce9c\" (UID: \"ca53a031-35d7-4861-8ccd-691ec790ce9c\") " Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901556 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.901962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.905761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-kube-api-access-tdvjw" (OuterVolumeSpecName: "kube-api-access-tdvjw") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "kube-api-access-tdvjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.905990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.938825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.940388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.940539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-config-data" (OuterVolumeSpecName: "config-data") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.944996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca53a031-35d7-4861-8ccd-691ec790ce9c" (UID: "ca53a031-35d7-4861-8ccd-691ec790ce9c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.972176 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:55 crc kubenswrapper[4707]: I0121 15:45:55.972218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003711 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003737 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003747 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdvjw\" (UniqueName: \"kubernetes.io/projected/ca53a031-35d7-4861-8ccd-691ec790ce9c-kube-api-access-tdvjw\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003757 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003766 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca53a031-35d7-4861-8ccd-691ec790ce9c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003775 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.003783 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca53a031-35d7-4861-8ccd-691ec790ce9c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.659699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" event={"ID":"ca53a031-35d7-4861-8ccd-691ec790ce9c","Type":"ContainerDied","Data":"91a464dd2b1b7a093f17f2c45826faedf6137701707d73e4c79aa44fc94f0ade"} Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.659742 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.659754 4707 scope.go:117] "RemoveContainer" containerID="8f7662857b1d2fd154bd23c6d539f25deaead9f07840cdb3f519d53c088180ad" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.689095 4707 scope.go:117] "RemoveContainer" containerID="8597f56a988d7dd7e3a62e7f1cbbe0a94a68d83003c9ba88a0cb7a507db48eae" Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.691579 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct"] Jan 21 15:45:56 crc kubenswrapper[4707]: I0121 15:45:56.700132 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c95d7d87f-4p9ct"] Jan 21 15:45:57 crc kubenswrapper[4707]: I0121 15:45:57.190703 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" path="/var/lib/kubelet/pods/ca53a031-35d7-4861-8ccd-691ec790ce9c/volumes" Jan 21 15:46:00 crc kubenswrapper[4707]: I0121 15:46:00.808778 4707 scope.go:117] "RemoveContainer" containerID="228afae2defd2f7cd01706c71f8b84017837c2d44aa5f97f405fbfa3b2e25d2f" Jan 21 15:46:00 crc kubenswrapper[4707]: I0121 15:46:00.825355 4707 scope.go:117] "RemoveContainer" containerID="3366326ca952000f6610fe4a617379c49e1cb863f2bc2229d35cc5564f663ccd" Jan 21 15:46:00 crc kubenswrapper[4707]: I0121 15:46:00.844899 4707 scope.go:117] "RemoveContainer" containerID="fc38731eba6859dd3321273b90548f6258defbfac5f235ee1a8b652aac078066" Jan 21 15:46:00 crc kubenswrapper[4707]: I0121 15:46:00.972621 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:00 crc kubenswrapper[4707]: I0121 15:46:00.972671 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:01 crc kubenswrapper[4707]: I0121 15:46:01.183228 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:46:01 crc kubenswrapper[4707]: E0121 15:46:01.183418 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:46:01 crc kubenswrapper[4707]: I0121 15:46:01.984913 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.93:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:46:01 crc kubenswrapper[4707]: I0121 15:46:01.984996 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.93:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:46:03 crc kubenswrapper[4707]: I0121 15:46:03.004073 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:03 crc kubenswrapper[4707]: I0121 15:46:03.004127 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:04 crc kubenswrapper[4707]: I0121 15:46:04.015939 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.94:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:46:04 crc kubenswrapper[4707]: I0121 15:46:04.015950 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.94:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:46:10 crc kubenswrapper[4707]: I0121 15:46:10.978083 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:10 crc kubenswrapper[4707]: I0121 15:46:10.979122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:10 crc kubenswrapper[4707]: I0121 15:46:10.981636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:11 crc kubenswrapper[4707]: I0121 15:46:11.761086 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:12 crc kubenswrapper[4707]: I0121 15:46:12.182010 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:46:12 crc kubenswrapper[4707]: E0121 15:46:12.182406 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:46:13 crc kubenswrapper[4707]: I0121 15:46:13.009881 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:13 crc kubenswrapper[4707]: I0121 15:46:13.009964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:13 crc kubenswrapper[4707]: I0121 15:46:13.010326 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:13 crc kubenswrapper[4707]: I0121 15:46:13.010368 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:13 crc kubenswrapper[4707]: I0121 15:46:13.015836 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:13 crc kubenswrapper[4707]: I0121 15:46:13.016041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:19 crc kubenswrapper[4707]: I0121 15:46:19.825324 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" containerID="54d6c16d4e52b40f86e960b3984187cbdfd603f81abadb13eb6865cac9df33cb" exitCode=137 Jan 21 15:46:19 crc kubenswrapper[4707]: I0121 15:46:19.825393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd","Type":"ContainerDied","Data":"54d6c16d4e52b40f86e960b3984187cbdfd603f81abadb13eb6865cac9df33cb"} Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.014531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.062588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-combined-ca-bundle\") pod \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.062688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clsn2\" (UniqueName: \"kubernetes.io/projected/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-kube-api-access-clsn2\") pod \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.062751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-config-data\") pod \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\" (UID: \"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd\") " Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.066965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-kube-api-access-clsn2" (OuterVolumeSpecName: "kube-api-access-clsn2") pod "7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" (UID: "7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd"). InnerVolumeSpecName "kube-api-access-clsn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.082786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" (UID: "7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.083074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-config-data" (OuterVolumeSpecName: "config-data") pod "7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" (UID: "7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.164588 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.164783 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clsn2\" (UniqueName: \"kubernetes.io/projected/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-kube-api-access-clsn2\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.164979 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.833617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd","Type":"ContainerDied","Data":"10f98f4abc2c4eaf26dd9b149affd779be69c5cd1190d22bd7d02aa0c5bc5022"} Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.834214 4707 scope.go:117] "RemoveContainer" containerID="54d6c16d4e52b40f86e960b3984187cbdfd603f81abadb13eb6865cac9df33cb" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.833662 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.860444 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.867442 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.874687 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:46:20 crc kubenswrapper[4707]: E0121 15:46:20.875021 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-server" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875036 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-server" Jan 21 15:46:20 crc kubenswrapper[4707]: E0121 15:46:20.875060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" containerName="nova-scheduler-scheduler" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875067 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" containerName="nova-scheduler-scheduler" Jan 21 15:46:20 crc kubenswrapper[4707]: E0121 15:46:20.875080 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-httpd" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875086 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-httpd" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875218 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-server" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875236 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ca53a031-35d7-4861-8ccd-691ec790ce9c" containerName="proxy-httpd" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875245 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" containerName="nova-scheduler-scheduler" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.875775 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.877976 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.883250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.977353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvqb\" (UniqueName: \"kubernetes.io/projected/74717719-f653-4098-a6be-105f39bc2f0c-kube-api-access-qfvqb\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.977475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-config-data\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:20 crc kubenswrapper[4707]: I0121 15:46:20.977573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.079215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvqb\" (UniqueName: \"kubernetes.io/projected/74717719-f653-4098-a6be-105f39bc2f0c-kube-api-access-qfvqb\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.079316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-config-data\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.079387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.082509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-config-data\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc 
kubenswrapper[4707]: I0121 15:46:21.082634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.092071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvqb\" (UniqueName: \"kubernetes.io/projected/74717719-f653-4098-a6be-105f39bc2f0c-kube-api-access-qfvqb\") pod \"nova-scheduler-0\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.190484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.190652 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd" path="/var/lib/kubelet/pods/7840e0f2-352b-4bc3-9a47-5bb1aa88d1dd/volumes" Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.558679 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.841278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"74717719-f653-4098-a6be-105f39bc2f0c","Type":"ContainerStarted","Data":"1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8"} Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.841315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"74717719-f653-4098-a6be-105f39bc2f0c","Type":"ContainerStarted","Data":"4203385eb6fc2d532ddb55a1b7d03a78c797579e62951ddd604fdd33190b150b"} Jan 21 15:46:21 crc kubenswrapper[4707]: I0121 15:46:21.859303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.859288613 podStartE2EDuration="1.859288613s" podCreationTimestamp="2026-01-21 15:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:21.85142597 +0000 UTC m=+2679.032942192" watchObservedRunningTime="2026-01-21 15:46:21.859288613 +0000 UTC m=+2679.040804835" Jan 21 15:46:25 crc kubenswrapper[4707]: I0121 15:46:25.184023 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:46:25 crc kubenswrapper[4707]: E0121 15:46:25.184511 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:46:26 crc kubenswrapper[4707]: I0121 15:46:26.190942 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:31 crc kubenswrapper[4707]: I0121 15:46:31.191447 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:31 crc kubenswrapper[4707]: I0121 15:46:31.212756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:31 crc kubenswrapper[4707]: I0121 15:46:31.924699 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.072033 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.072394 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" containerName="openstackclient" containerID="cri-o://e41f28d469e56bcfb24ec87ca701c01fb43455c941908db97110f75eeb045abb" gracePeriod=2 Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.085609 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.138576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.147716 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-49p4w"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.169765 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.175306 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-49p4w"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.189483 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-mgfrs"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.204323 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6"] Jan 21 15:46:34 crc kubenswrapper[4707]: E0121 15:46:34.204873 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" containerName="openstackclient" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.204968 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" containerName="openstackclient" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.205235 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" containerName="openstackclient" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.205870 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.207675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.218632 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.219632 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.226286 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.232512 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.246329 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.262849 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-jppl5"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.263854 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: E0121 15:46:34.287534 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:34 crc kubenswrapper[4707]: E0121 15:46:34.287623 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data podName:f341e188-43e6-4019-b4dd-b8ea727d2d3f nodeName:}" failed. No retries permitted until 2026-01-21 15:46:34.787605802 +0000 UTC m=+2691.969122024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data") pod "rabbitmq-cell1-server-0" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.313664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.394572 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b6rjt"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.395682 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.397066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0ccf6-1685-4165-9980-b2a65109a1a0-operator-scripts\") pod \"barbican-db-create-jppl5\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.397172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gch4\" (UniqueName: \"kubernetes.io/projected/df2290ff-a278-4834-be6d-3377f82d1548-kube-api-access-6gch4\") pod \"barbican-e6a9-account-create-update-625q6\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.397199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36879807-6691-412e-b126-9bb6813b7e65-operator-scripts\") pod \"cinder-a608-account-create-update-w7mtg\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.397216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhqv\" (UniqueName: \"kubernetes.io/projected/bad0ccf6-1685-4165-9980-b2a65109a1a0-kube-api-access-ljhqv\") pod \"barbican-db-create-jppl5\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.397257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48xr\" (UniqueName: \"kubernetes.io/projected/36879807-6691-412e-b126-9bb6813b7e65-kube-api-access-v48xr\") pod \"cinder-a608-account-create-update-w7mtg\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.397324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df2290ff-a278-4834-be6d-3377f82d1548-operator-scripts\") pod \"barbican-e6a9-account-create-update-625q6\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.403719 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.414084 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-tvcgs"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.431548 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-jppl5"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.470982 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b6rjt"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.490991 4707 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-8rj4f"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.492121 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.493979 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gch4\" (UniqueName: \"kubernetes.io/projected/df2290ff-a278-4834-be6d-3377f82d1548-kube-api-access-6gch4\") pod \"barbican-e6a9-account-create-update-625q6\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts\") pod \"root-account-create-update-b6rjt\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36879807-6691-412e-b126-9bb6813b7e65-operator-scripts\") pod \"cinder-a608-account-create-update-w7mtg\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhqv\" (UniqueName: \"kubernetes.io/projected/bad0ccf6-1685-4165-9980-b2a65109a1a0-kube-api-access-ljhqv\") pod \"barbican-db-create-jppl5\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48xr\" (UniqueName: \"kubernetes.io/projected/36879807-6691-412e-b126-9bb6813b7e65-kube-api-access-v48xr\") pod \"cinder-a608-account-create-update-w7mtg\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df2290ff-a278-4834-be6d-3377f82d1548-operator-scripts\") pod \"barbican-e6a9-account-create-update-625q6\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.498500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j8hq\" (UniqueName: \"kubernetes.io/projected/208f36cc-d67f-450a-a68f-1ace0702dfef-kube-api-access-2j8hq\") pod \"root-account-create-update-b6rjt\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: 
I0121 15:46:34.498526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0ccf6-1685-4165-9980-b2a65109a1a0-operator-scripts\") pod \"barbican-db-create-jppl5\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.499101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36879807-6691-412e-b126-9bb6813b7e65-operator-scripts\") pod \"cinder-a608-account-create-update-w7mtg\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.499179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0ccf6-1685-4165-9980-b2a65109a1a0-operator-scripts\") pod \"barbican-db-create-jppl5\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.499733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df2290ff-a278-4834-be6d-3377f82d1548-operator-scripts\") pod \"barbican-e6a9-account-create-update-625q6\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.521429 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-8rj4f"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.535392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gch4\" (UniqueName: \"kubernetes.io/projected/df2290ff-a278-4834-be6d-3377f82d1548-kube-api-access-6gch4\") pod \"barbican-e6a9-account-create-update-625q6\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.542291 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.556388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhqv\" (UniqueName: \"kubernetes.io/projected/bad0ccf6-1685-4165-9980-b2a65109a1a0-kube-api-access-ljhqv\") pod \"barbican-db-create-jppl5\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.559463 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-f2pf7"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.572767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48xr\" (UniqueName: \"kubernetes.io/projected/36879807-6691-412e-b126-9bb6813b7e65-kube-api-access-v48xr\") pod \"cinder-a608-account-create-update-w7mtg\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.601820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j8hq\" (UniqueName: \"kubernetes.io/projected/208f36cc-d67f-450a-a68f-1ace0702dfef-kube-api-access-2j8hq\") pod \"root-account-create-update-b6rjt\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.601941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts\") pod \"root-account-create-update-b6rjt\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.601981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f152ce-e53d-4745-974c-bbc06a4f5bd6-operator-scripts\") pod \"glance-4189-account-create-update-8rj4f\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.602023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl46b\" (UniqueName: \"kubernetes.io/projected/02f152ce-e53d-4745-974c-bbc06a4f5bd6-kube-api-access-hl46b\") pod \"glance-4189-account-create-update-8rj4f\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.602983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts\") pod \"root-account-create-update-b6rjt\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.614751 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-f2pf7"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.619984 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.635228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j8hq\" (UniqueName: \"kubernetes.io/projected/208f36cc-d67f-450a-a68f-1ace0702dfef-kube-api-access-2j8hq\") pod \"root-account-create-update-b6rjt\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.679971 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-2gz5j"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.686881 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-2gz5j"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.706335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f152ce-e53d-4745-974c-bbc06a4f5bd6-operator-scripts\") pod \"glance-4189-account-create-update-8rj4f\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.706400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl46b\" (UniqueName: \"kubernetes.io/projected/02f152ce-e53d-4745-974c-bbc06a4f5bd6-kube-api-access-hl46b\") pod \"glance-4189-account-create-update-8rj4f\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.707489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f152ce-e53d-4745-974c-bbc06a4f5bd6-operator-scripts\") pod \"glance-4189-account-create-update-8rj4f\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.715369 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.716493 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.717538 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.723695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.757227 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.783525 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-47dxg"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.784631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl46b\" (UniqueName: \"kubernetes.io/projected/02f152ce-e53d-4745-974c-bbc06a4f5bd6-kube-api-access-hl46b\") pod \"glance-4189-account-create-update-8rj4f\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.809787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097b8c3a-4002-4270-9f9b-e56f1507f748-operator-scripts\") pod \"placement-7e8f-account-create-update-wd54k\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.809899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6fp\" (UniqueName: \"kubernetes.io/projected/097b8c3a-4002-4270-9f9b-e56f1507f748-kube-api-access-th6fp\") pod \"placement-7e8f-account-create-update-wd54k\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: E0121 15:46:34.810020 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:34 crc kubenswrapper[4707]: E0121 15:46:34.810054 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data podName:f341e188-43e6-4019-b4dd-b8ea727d2d3f nodeName:}" failed. No retries permitted until 2026-01-21 15:46:35.81004247 +0000 UTC m=+2692.991558692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data") pod "rabbitmq-cell1-server-0" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.811260 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-47dxg"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.821218 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.825526 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.826050 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="openstack-network-exporter" containerID="cri-o://85117f9d04aa0f7e6d587b250d45813072e69df9bf3aba08c43cd1843a375512" gracePeriod=300 Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.846108 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.862973 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.888100 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-s78zw"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.888196 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.893099 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.903855 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-s78zw"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.911267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th6fp\" (UniqueName: \"kubernetes.io/projected/097b8c3a-4002-4270-9f9b-e56f1507f748-kube-api-access-th6fp\") pod \"placement-7e8f-account-create-update-wd54k\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.911426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097b8c3a-4002-4270-9f9b-e56f1507f748-operator-scripts\") pod \"placement-7e8f-account-create-update-wd54k\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.912055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097b8c3a-4002-4270-9f9b-e56f1507f748-operator-scripts\") pod \"placement-7e8f-account-create-update-wd54k\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.923108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.936233 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.950479 4707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="ovsdbserver-sb" containerID="cri-o://423b9cfbccfd3275dd905b5666c4c6442b89dad40b2b0658353b1b608bf27823" gracePeriod=300 Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.955188 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-2zf9k"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.957608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6fp\" (UniqueName: \"kubernetes.io/projected/097b8c3a-4002-4270-9f9b-e56f1507f748-kube-api-access-th6fp\") pod \"placement-7e8f-account-create-update-wd54k\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.981199 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj"] Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.986923 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:34 crc kubenswrapper[4707]: I0121 15:46:34.995184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.014732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259fc011-3b4a-41a2-a064-d7d2282e9763-operator-scripts\") pod \"neutron-6d40-account-create-update-nsm5s\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.014784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqbn\" (UniqueName: \"kubernetes.io/projected/259fc011-3b4a-41a2-a064-d7d2282e9763-kube-api-access-lhqbn\") pod \"neutron-6d40-account-create-update-nsm5s\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.019391 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.022063 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.027237 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.031582 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.039342 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.046232 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.048331 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.049786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.051627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.055292 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-mbvl2"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.070669 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-mbvl2"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.095295 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.120651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e629cb-1223-4f95-a117-0d72b4a06bce-operator-scripts\") pod \"nova-cell0-dd06-account-create-update-ms8c6\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.120712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-operator-scripts\") pod \"nova-api-b79e-account-create-update-dv8pj\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.120838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259fc011-3b4a-41a2-a064-d7d2282e9763-operator-scripts\") pod \"neutron-6d40-account-create-update-nsm5s\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.120864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzpjn\" (UniqueName: \"kubernetes.io/projected/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-kube-api-access-qzpjn\") pod \"nova-api-b79e-account-create-update-dv8pj\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.120905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqbn\" (UniqueName: \"kubernetes.io/projected/259fc011-3b4a-41a2-a064-d7d2282e9763-kube-api-access-lhqbn\") pod \"neutron-6d40-account-create-update-nsm5s\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.120938 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qvz2\" (UniqueName: \"kubernetes.io/projected/43e629cb-1223-4f95-a117-0d72b4a06bce-kube-api-access-7qvz2\") pod \"nova-cell0-dd06-account-create-update-ms8c6\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.121595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259fc011-3b4a-41a2-a064-d7d2282e9763-operator-scripts\") pod \"neutron-6d40-account-create-update-nsm5s\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.176384 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-r94jp"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.221864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-operator-scripts\") pod \"nova-cell1-d92f-account-create-update-pw6mk\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.222306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqbn\" (UniqueName: \"kubernetes.io/projected/259fc011-3b4a-41a2-a064-d7d2282e9763-kube-api-access-lhqbn\") pod \"neutron-6d40-account-create-update-nsm5s\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.223710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzpjn\" (UniqueName: \"kubernetes.io/projected/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-kube-api-access-qzpjn\") pod \"nova-api-b79e-account-create-update-dv8pj\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.223769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qvz2\" (UniqueName: \"kubernetes.io/projected/43e629cb-1223-4f95-a117-0d72b4a06bce-kube-api-access-7qvz2\") pod \"nova-cell0-dd06-account-create-update-ms8c6\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.223859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e629cb-1223-4f95-a117-0d72b4a06bce-operator-scripts\") pod \"nova-cell0-dd06-account-create-update-ms8c6\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.223900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-operator-scripts\") pod \"nova-api-b79e-account-create-update-dv8pj\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " 
pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.223958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7th\" (UniqueName: \"kubernetes.io/projected/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-kube-api-access-qc7th\") pod \"nova-cell1-d92f-account-create-update-pw6mk\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.224749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-operator-scripts\") pod \"nova-api-b79e-account-create-update-dv8pj\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.227729 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e629cb-1223-4f95-a117-0d72b4a06bce-operator-scripts\") pod \"nova-cell0-dd06-account-create-update-ms8c6\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.244922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qvz2\" (UniqueName: \"kubernetes.io/projected/43e629cb-1223-4f95-a117-0d72b4a06bce-kube-api-access-7qvz2\") pod \"nova-cell0-dd06-account-create-update-ms8c6\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.249561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzpjn\" (UniqueName: \"kubernetes.io/projected/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-kube-api-access-qzpjn\") pod \"nova-api-b79e-account-create-update-dv8pj\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.311434 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0219fe68-3dcf-4e10-88e8-6de2bd228897" path="/var/lib/kubelet/pods/0219fe68-3dcf-4e10-88e8-6de2bd228897/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.312184 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03df5afa-682c-4caa-a0ab-091fb6c7fa11" path="/var/lib/kubelet/pods/03df5afa-682c-4caa-a0ab-091fb6c7fa11/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.312683 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f114bd-85ab-457d-aaad-2cc0a70d6e94" path="/var/lib/kubelet/pods/09f114bd-85ab-457d-aaad-2cc0a70d6e94/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.315565 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b133c7-53cb-4cb8-9b04-66973d5ee9a0" path="/var/lib/kubelet/pods/14b133c7-53cb-4cb8-9b04-66973d5ee9a0/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.316532 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b0806ad-f9f2-44d7-b32f-4ad177974092" path="/var/lib/kubelet/pods/3b0806ad-f9f2-44d7-b32f-4ad177974092/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 
15:46:35.317009 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596ebb84-1e59-4793-804c-225d9a9d1794" path="/var/lib/kubelet/pods/596ebb84-1e59-4793-804c-225d9a9d1794/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.318202 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ba6f6d-b3da-4f93-8af6-22537c56e223" path="/var/lib/kubelet/pods/67ba6f6d-b3da-4f93-8af6-22537c56e223/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.318664 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eeb0fcb-cc4b-477d-8cb6-d3915da905a5" path="/var/lib/kubelet/pods/6eeb0fcb-cc4b-477d-8cb6-d3915da905a5/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.319147 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75f8ec8-450e-4fa8-a854-71449667b528" path="/var/lib/kubelet/pods/a75f8ec8-450e-4fa8-a854-71449667b528/volumes" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.319946 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-c5ksp"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.319969 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-r94jp"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.319984 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-c5ksp"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.319994 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-f9l4s"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.320005 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.320026 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.320036 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.320044 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.320189 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="ovn-northd" containerID="cri-o://907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" gracePeriod=30 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.321386 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="openstack-network-exporter" containerID="cri-o://2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1" gracePeriod=30 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.326594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7th\" (UniqueName: \"kubernetes.io/projected/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-kube-api-access-qc7th\") pod \"nova-cell1-d92f-account-create-update-pw6mk\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc 
kubenswrapper[4707]: I0121 15:46:35.326680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-operator-scripts\") pod \"nova-cell1-d92f-account-create-update-pw6mk\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.327303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-operator-scripts\") pod \"nova-cell1-d92f-account-create-update-pw6mk\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.328130 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.328935 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-v8mn6"] Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.331763 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:35 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:35 crc kubenswrapper[4707]: Jan 21 15:46:35 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:35 crc kubenswrapper[4707]: Jan 21 15:46:35 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:35 crc kubenswrapper[4707]: Jan 21 15:46:35 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:35 crc kubenswrapper[4707]: Jan 21 15:46:35 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:46:35 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:46:35 crc kubenswrapper[4707]: else Jan 21 15:46:35 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:35 crc kubenswrapper[4707]: fi Jan 21 15:46:35 crc kubenswrapper[4707]: Jan 21 15:46:35 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:35 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:35 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:35 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:35 crc kubenswrapper[4707]: # support updates Jan 21 15:46:35 crc kubenswrapper[4707]: Jan 21 15:46:35 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.333766 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" podUID="df2290ff-a278-4834-be6d-3377f82d1548" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.344596 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-sppjh"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.350050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.356494 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.365218 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-sppjh"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.387660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7th\" (UniqueName: \"kubernetes.io/projected/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-kube-api-access-qc7th\") pod \"nova-cell1-d92f-account-create-update-pw6mk\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.394170 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.399842 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-b2979"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.404367 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.412308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-b2979"] Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.429491 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.429551 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data podName:8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab nodeName:}" failed. No retries permitted until 2026-01-21 15:46:35.9295372 +0000 UTC m=+2693.111053422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data") pod "rabbitmq-server-0" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab") : configmap "rabbitmq-config-data" not found Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.437972 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.449167 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ztjcn"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.452742 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gwvgj"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.454151 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.463885 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.464158 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="openstack-network-exporter" containerID="cri-o://0cccd4727b8e47536048d4c6788228ae784cd8a9a57fa878409e98baade244f8" gracePeriod=300 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.479666 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gwvgj"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.516754 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.535561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrfq\" (UniqueName: \"kubernetes.io/projected/be4aab59-e488-40d8-883e-0e09a44aae4b-kube-api-access-mcrfq\") pod \"nova-cell1-db-create-gwvgj\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.535719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-operator-scripts\") pod \"nova-api-db-create-sppjh\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.535749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcq74\" (UniqueName: \"kubernetes.io/projected/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-kube-api-access-pcq74\") pod \"nova-api-db-create-sppjh\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.537984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwvm\" (UniqueName: \"kubernetes.io/projected/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-kube-api-access-fzwvm\") pod \"nova-cell0-db-create-b2979\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.538098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4aab59-e488-40d8-883e-0e09a44aae4b-operator-scripts\") pod \"nova-cell1-db-create-gwvgj\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.538161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-operator-scripts\") pod \"nova-cell0-db-create-b2979\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.555960 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="ovsdbserver-nb" containerID="cri-o://65770a8fa3c56a256295d751a70837c7e5a69727abf0f918cb18f1afc358384c" gracePeriod=300 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.556210 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.575490 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-vw9xc"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.588372 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-dpcvc"] Jan 
21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.600976 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-dpcvc"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.610159 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lblpk"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.624945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-lblpk"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.635387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.641940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-operator-scripts\") pod \"nova-api-db-create-sppjh\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.642052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcq74\" (UniqueName: \"kubernetes.io/projected/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-kube-api-access-pcq74\") pod \"nova-api-db-create-sppjh\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.642171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwvm\" (UniqueName: \"kubernetes.io/projected/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-kube-api-access-fzwvm\") pod \"nova-cell0-db-create-b2979\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.642294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4aab59-e488-40d8-883e-0e09a44aae4b-operator-scripts\") pod \"nova-cell1-db-create-gwvgj\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.642389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-operator-scripts\") pod \"nova-cell0-db-create-b2979\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.642480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrfq\" (UniqueName: \"kubernetes.io/projected/be4aab59-e488-40d8-883e-0e09a44aae4b-kube-api-access-mcrfq\") pod \"nova-cell1-db-create-gwvgj\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.643581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-operator-scripts\") pod \"nova-api-db-create-sppjh\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: 
I0121 15:46:35.644399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4aab59-e488-40d8-883e-0e09a44aae4b-operator-scripts\") pod \"nova-cell1-db-create-gwvgj\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.644917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-operator-scripts\") pod \"nova-cell0-db-create-b2979\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.656587 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-x6jxn"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.667062 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-x6jxn"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.686975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-ktf95"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.694295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwvm\" (UniqueName: \"kubernetes.io/projected/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-kube-api-access-fzwvm\") pod \"nova-cell0-db-create-b2979\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.699662 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-ktf95"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.700093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcq74\" (UniqueName: \"kubernetes.io/projected/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-kube-api-access-pcq74\") pod \"nova-api-db-create-sppjh\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.711555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-2n6bd"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.717214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrfq\" (UniqueName: \"kubernetes.io/projected/be4aab59-e488-40d8-883e-0e09a44aae4b-kube-api-access-mcrfq\") pod \"nova-cell1-db-create-gwvgj\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.724230 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-2n6bd"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.731220 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-hbd6h"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.736981 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.754988 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-hbd6h"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.790667 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.856215 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.856358 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data podName:f341e188-43e6-4019-b4dd-b8ea727d2d3f nodeName:}" failed. No retries permitted until 2026-01-21 15:46:37.856292403 +0000 UTC m=+2695.037808625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data") pod "rabbitmq-cell1-server-0" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.887853 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-zcs8f"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.897856 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-zcs8f"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.925839 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.926076 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="cinder-scheduler" containerID="cri-o://ad1135a550f9a5d2b3f5160fd4aa31b58313409af1bfcb366c84f0d82a424f8b" gracePeriod=30 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.926314 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="probe" containerID="cri-o://ce810f5016c6b84539b5638a579a70bdd684769502c29f783c133ce65a806987" gracePeriod=30 Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.958068 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:46:35 crc kubenswrapper[4707]: E0121 15:46:35.958445 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data podName:8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab nodeName:}" failed. No retries permitted until 2026-01-21 15:46:36.958430187 +0000 UTC m=+2694.139946409 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data") pod "rabbitmq-server-0" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab") : configmap "rabbitmq-config-data" not found Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.964949 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.965221 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-log" containerID="cri-o://7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae" gracePeriod=30 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.965633 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-httpd" containerID="cri-o://4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b" gracePeriod=30 Jan 21 15:46:35 crc kubenswrapper[4707]: W0121 15:46:35.991246 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbad0ccf6_1685_4165_9980_b2a65109a1a0.slice/crio-86c20ec2423ccc6675438a96568d733315ec2b5ece8143cda1693be010762423 WatchSource:0}: Error finding container 86c20ec2423ccc6675438a96568d733315ec2b5ece8143cda1693be010762423: Status 404 returned error can't find the container with id 86c20ec2423ccc6675438a96568d733315ec2b5ece8143cda1693be010762423 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.991988 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.997478 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm"] Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.998114 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_79747506-45fc-47c5-b7ee-38e40227b1bb/ovsdbserver-sb/0.log" Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.998149 4707 generic.go:334] "Generic (PLEG): container finished" podID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerID="85117f9d04aa0f7e6d587b250d45813072e69df9bf3aba08c43cd1843a375512" exitCode=2 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.998165 4707 generic.go:334] "Generic (PLEG): container finished" podID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerID="423b9cfbccfd3275dd905b5666c4c6442b89dad40b2b0658353b1b608bf27823" exitCode=143 Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.998209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"79747506-45fc-47c5-b7ee-38e40227b1bb","Type":"ContainerDied","Data":"85117f9d04aa0f7e6d587b250d45813072e69df9bf3aba08c43cd1843a375512"} Jan 21 15:46:35 crc kubenswrapper[4707]: I0121 15:46:35.998231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"79747506-45fc-47c5-b7ee-38e40227b1bb","Type":"ContainerDied","Data":"423b9cfbccfd3275dd905b5666c4c6442b89dad40b2b0658353b1b608bf27823"} Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.005003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" event={"ID":"df2290ff-a278-4834-be6d-3377f82d1548","Type":"ContainerStarted","Data":"4b6ab9c1938d4ed1052ade94e8b7b556313ba840f307417051ecbbf0e48b912b"} Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.008874 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kcgbm"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.022408 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_6de81e07-b3f2-4f89-ad02-c5f3b3ea355d/ovsdbserver-nb/0.log" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.022452 4707 generic.go:334] "Generic (PLEG): container finished" podID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerID="0cccd4727b8e47536048d4c6788228ae784cd8a9a57fa878409e98baade244f8" exitCode=2 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.022469 4707 generic.go:334] "Generic (PLEG): container finished" podID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerID="65770a8fa3c56a256295d751a70837c7e5a69727abf0f918cb18f1afc358384c" exitCode=143 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.022532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d","Type":"ContainerDied","Data":"0cccd4727b8e47536048d4c6788228ae784cd8a9a57fa878409e98baade244f8"} Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.022555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d","Type":"ContainerDied","Data":"65770a8fa3c56a256295d751a70837c7e5a69727abf0f918cb18f1afc358384c"} Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 
15:46:36.027649 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.033150 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-59wmr"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.034478 4707 generic.go:334] "Generic (PLEG): container finished" podID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerID="2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1" exitCode=2 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.034566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"67eda4bf-2ebc-4616-bd19-55afc23963d2","Type":"ContainerDied","Data":"2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1"} Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.037280 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:36 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:46:36 crc kubenswrapper[4707]: else Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:36 crc kubenswrapper[4707]: fi Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:36 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:36 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:36 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:36 crc kubenswrapper[4707]: # support updates Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.039563 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" podUID="df2290ff-a278-4834-be6d-3377f82d1548" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.073126 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.073536 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api-log" containerID="cri-o://3f05e66f94e563650c321336e37a8530429e8dfd49776a8178131469087deaaf" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.074017 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api" containerID="cri-o://3f5a0d130fa4b31023beed8d473c235ed286cc7b040259cf77ab6da89f38e456" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.125947 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kdbn6"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.182580 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.182972 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.187518 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kdbn6"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.187960 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_79747506-45fc-47c5-b7ee-38e40227b1bb/ovsdbserver-sb/0.log" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.188004 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.218583 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-jppl5"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.237196 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.237423 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-log" containerID="cri-o://5ca8cf0f460517bb6eb2f9678b1116274be8a382c22293f7320425480d1c7e94" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.237785 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-httpd" containerID="cri-o://fb76672b8b108d21f1c4f38e5e76864b3f3b4e2030e02b6e06d10dafbc199a9b" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.264296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-combined-ca-bundle\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.264359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-config\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.264448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-scripts\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.264469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnww8\" (UniqueName: \"kubernetes.io/projected/79747506-45fc-47c5-b7ee-38e40227b1bb-kube-api-access-dnww8\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.264508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.266051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-config" (OuterVolumeSpecName: "config") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.266577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-scripts" (OuterVolumeSpecName: "scripts") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.284963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.285133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79747506-45fc-47c5-b7ee-38e40227b1bb-kube-api-access-dnww8" (OuterVolumeSpecName: "kube-api-access-dnww8") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "kube-api-access-dnww8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.303389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-8rj4f"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314168 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314548 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-server" containerID="cri-o://75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314641 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-updater" containerID="cri-o://97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314685 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-auditor" containerID="cri-o://9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314723 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-replicator" containerID="cri-o://40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314751 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-server" containerID="cri-o://d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454" gracePeriod=30 Jan 21 15:46:36 
crc kubenswrapper[4707]: I0121 15:46:36.314750 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-expirer" containerID="cri-o://ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314782 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-reaper" containerID="cri-o://6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314828 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-auditor" containerID="cri-o://83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314859 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-replicator" containerID="cri-o://c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.314982 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="swift-recon-cron" containerID="cri-o://3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.315122 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="rsync" containerID="cri-o://1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.315166 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-replicator" containerID="cri-o://a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.315168 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-updater" containerID="cri-o://01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.316767 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-auditor" containerID="cri-o://42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.316975 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-server" 
containerID="cri-o://ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.332544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6fc86fc9f6-cd66l"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.332736 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-log" containerID="cri-o://6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.333111 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-api" containerID="cri-o://a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.339508 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.345535 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-jppl5"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.354083 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-68556f8ccd-sx4vz"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.354349 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" containerID="cri-o://93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.354941 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" containerID="cri-o://752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.365250 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.365939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdbserver-sb-tls-certs\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.365973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-metrics-certs-tls-certs\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: \"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.366080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdb-rundir\") pod \"79747506-45fc-47c5-b7ee-38e40227b1bb\" (UID: 
\"79747506-45fc-47c5-b7ee-38e40227b1bb\") " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.366753 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.366765 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79747506-45fc-47c5-b7ee-38e40227b1bb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.366774 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnww8\" (UniqueName: \"kubernetes.io/projected/79747506-45fc-47c5-b7ee-38e40227b1bb-kube-api-access-dnww8\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.366791 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.368523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.375883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.386329 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg"] Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.386659 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="openstack-network-exporter" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.386670 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="openstack-network-exporter" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.386686 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="ovsdbserver-sb" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.386691 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="ovsdbserver-sb" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.386899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="ovsdbserver-sb" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.386908 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" containerName="openstack-network-exporter" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.387705 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.450152 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.469095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwbfk\" (UniqueName: \"kubernetes.io/projected/68cc87bb-db38-44df-a921-630ecf45a30b-kube-api-access-gwbfk\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.469208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.471046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.471362 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.493502 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.524861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.541094 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.546369 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.551307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.555085 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.570984 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.572669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.572746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwbfk\" (UniqueName: \"kubernetes.io/projected/68cc87bb-db38-44df-a921-630ecf45a30b-kube-api-access-gwbfk\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.572796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.572858 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.572869 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.573573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.573714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.581674 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-sppjh"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.616440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwbfk\" (UniqueName: \"kubernetes.io/projected/68cc87bb-db38-44df-a921-630ecf45a30b-kube-api-access-gwbfk\") pod \"dnsmasq-dnsmasq-84b9f45d47-88qrg\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:36 crc 
kubenswrapper[4707]: I0121 15:46:36.648387 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gwvgj"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.669010 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-b2979"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.678938 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="rabbitmq" containerID="cri-o://7de59c225a5ef6ebd73df1f9d5fa7286a46034a887cc003580cb1dcd45f6ba34" gracePeriod=604800 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.739985 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.753287 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.753457 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-log" containerID="cri-o://2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.753795 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-api" containerID="cri-o://165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: W0121 15:46:36.769981 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02f152ce_e53d_4745_974c_bbc06a4f5bd6.slice/crio-851dfeaf1bcc748de769715db84b7c3a1ea8498e1fcd9de0aadbc3af0d9b390c WatchSource:0}: Error finding container 851dfeaf1bcc748de769715db84b7c3a1ea8498e1fcd9de0aadbc3af0d9b390c: Status 404 returned error can't find the container with id 851dfeaf1bcc748de769715db84b7c3a1ea8498e1fcd9de0aadbc3af0d9b390c Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.785563 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e9f148_78c8_4e0e_adf4_ae14e936e166.slice/crio-conmon-e41f28d469e56bcfb24ec87ca701c01fb43455c941908db97110f75eeb045abb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98f30a9f_5d85_4dce_8bc8_03e26c9967c8.slice/crio-conmon-3f05e66f94e563650c321336e37a8530429e8dfd49776a8178131469087deaaf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b5fdc09_bd73_4c16_96f8_8c083e314aff.slice/crio-40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e9f148_78c8_4e0e_adf4_ae14e936e166.slice/crio-e41f28d469e56bcfb24ec87ca701c01fb43455c941908db97110f75eeb045abb.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a1576d_856f_45fb_b210_66fddef6adaf.slice/crio-5ca8cf0f460517bb6eb2f9678b1116274be8a382c22293f7320425480d1c7e94.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b5fdc09_bd73_4c16_96f8_8c083e314aff.slice/crio-9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.796577 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.796947 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-central-agent" containerID="cri-o://bfb9505879b3bc1531075e521f5c432e0486e1366b2751cb167564231e34887f" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.797262 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="sg-core" containerID="cri-o://e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.797327 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-notification-agent" containerID="cri-o://f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.797836 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="proxy-httpd" containerID="cri-o://002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.813412 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.813621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-log" containerID="cri-o://7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.813954 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-metadata" containerID="cri-o://a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.814006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.838542 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.838710 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener-log" containerID="cri-o://ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.839195 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener" containerID="cri-o://1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.864149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "79747506-45fc-47c5-b7ee-38e40227b1bb" (UID: "79747506-45fc-47c5-b7ee-38e40227b1bb"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.889115 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.889145 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79747506-45fc-47c5-b7ee-38e40227b1bb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.896206 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:36 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 15:46:36 crc kubenswrapper[4707]: else Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:36 crc kubenswrapper[4707]: fi Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:36 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:36 crc kubenswrapper[4707]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:36 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:36 crc kubenswrapper[4707]: # support updates Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.897706 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" podUID="208f36cc-d67f-450a-a68f-1ace0702dfef" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.898138 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:36 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 15:46:36 crc kubenswrapper[4707]: else Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:36 crc kubenswrapper[4707]: fi Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:36 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:36 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:36 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:36 crc kubenswrapper[4707]: # support updates Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.901128 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" podUID="02f152ce-e53d-4745-974c-bbc06a4f5bd6" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.913530 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b6rjt"] Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.913865 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:36 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: if [ -n "placement" ]; then Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 15:46:36 crc kubenswrapper[4707]: else Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:36 crc kubenswrapper[4707]: fi Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:36 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:36 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:36 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:36 crc kubenswrapper[4707]: # support updates Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.915018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" podUID="097b8c3a-4002-4270-9f9b-e56f1507f748" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.918037 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:36 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:46:36 crc kubenswrapper[4707]: else Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:36 crc kubenswrapper[4707]: fi Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:36 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:36 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:36 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:36 crc kubenswrapper[4707]: # support updates Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.924251 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" podUID="43e629cb-1223-4f95-a117-0d72b4a06bce" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.934055 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb66854d6-grx69"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.934313 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api-log" containerID="cri-o://e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.934663 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api" containerID="cri-o://d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.942063 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.942287 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker-log" containerID="cri-o://2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.942546 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker" containerID="cri-o://6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.955684 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.956770 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:36 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 
15:46:36 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:46:36 crc kubenswrapper[4707]: else Jan 21 15:46:36 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:36 crc kubenswrapper[4707]: fi Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:36 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:36 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:36 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:36 crc kubenswrapper[4707]: # support updates Jan 21 15:46:36 crc kubenswrapper[4707]: Jan 21 15:46:36 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.958996 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" podUID="36879807-6691-412e-b126-9bb6813b7e65" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.961825 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-8rj4f"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.967234 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b6rjt"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.973030 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.973202 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" containerID="cri-o://1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.979155 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.980102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="62555ab6-9268-4248-977e-98269dc9483d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.980585 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.982797 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 
15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.989161 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.989158 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.989212 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="ovn-northd" Jan 21 15:46:36 crc kubenswrapper[4707]: I0121 15:46:36.989327 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="0837d9e7-fb96-43b0-a5fb-5eca08e411a3" containerName="kube-state-metrics" containerID="cri-o://c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0" gracePeriod=30 Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.993650 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:46:36 crc kubenswrapper[4707]: E0121 15:46:36.993706 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data podName:8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab nodeName:}" failed. No retries permitted until 2026-01-21 15:46:38.993692 +0000 UTC m=+2696.175208212 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data") pod "rabbitmq-server-0" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab") : configmap "rabbitmq-config-data" not found Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.001832 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.006929 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.017590 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="c14f4f47-2fe8-47fd-892f-832266276730" containerName="galera" containerID="cri-o://5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92" gracePeriod=30 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.024848 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_6de81e07-b3f2-4f89-ad02-c5f3b3ea355d/ovsdbserver-nb/0.log" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.024928 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.025984 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="rabbitmq" containerID="cri-o://41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161" gracePeriod=604800 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.028881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.029138 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-httpd" containerID="cri-o://ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748" gracePeriod=30 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.029242 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-server" containerID="cri-o://95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4" gracePeriod=30 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.031460 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.034707 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.056553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" event={"ID":"02f152ce-e53d-4745-974c-bbc06a4f5bd6","Type":"ContainerStarted","Data":"851dfeaf1bcc748de769715db84b7c3a1ea8498e1fcd9de0aadbc3af0d9b390c"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.085881 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_6de81e07-b3f2-4f89-ad02-c5f3b3ea355d/ovsdbserver-nb/0.log" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.085953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d","Type":"ContainerDied","Data":"855fa048eb548158b696c447ecc83d5902e298c6056a0f55d034166e85b2ba72"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.085983 4707 scope.go:117] "RemoveContainer" containerID="0cccd4727b8e47536048d4c6788228ae784cd8a9a57fa878409e98baade244f8" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.086105 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.090576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" event={"ID":"208f36cc-d67f-450a-a68f-1ace0702dfef","Type":"ContainerStarted","Data":"2d418493944578a14725d5064907da8eca4eebec8488df581d359507fb7c15b3"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.090850 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-b6rjt" secret="" err="secret \"galera-openstack-cell1-dockercfg-pbbx2\" not found" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.097421 4707 generic.go:334] "Generic (PLEG): container finished" podID="6ba6473c-6566-4d18-88ff-e329a248c151" containerID="ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.097474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" event={"ID":"6ba6473c-6566-4d18-88ff-e329a248c151","Type":"ContainerDied","Data":"ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55"} Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.098845 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:37 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 15:46:37 crc kubenswrapper[4707]: else Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:37 crc kubenswrapper[4707]: fi Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:37 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:37 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:37 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:37 crc kubenswrapper[4707]: # support updates Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.099911 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" podUID="208f36cc-d67f-450a-a68f-1ace0702dfef" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.115819 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerID="ce810f5016c6b84539b5638a579a70bdd684769502c29f783c133ce65a806987" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.115893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760","Type":"ContainerDied","Data":"ce810f5016c6b84539b5638a579a70bdd684769502c29f783c133ce65a806987"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.122971 4707 generic.go:334] "Generic (PLEG): container finished" podID="20a1576d-856f-45fb-b210-66fddef6adaf" containerID="5ca8cf0f460517bb6eb2f9678b1116274be8a382c22293f7320425480d1c7e94" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.123034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20a1576d-856f-45fb-b210-66fddef6adaf","Type":"ContainerDied","Data":"5ca8cf0f460517bb6eb2f9678b1116274be8a382c22293f7320425480d1c7e94"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.124214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" event={"ID":"36879807-6691-412e-b126-9bb6813b7e65","Type":"ContainerStarted","Data":"6bf4ae6e3969196d9e9b3efcfca6f9c83c599ad1c482fbd5f9b1431c61312251"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.159761 4707 generic.go:334] "Generic (PLEG): container finished" podID="45e9f148-78c8-4e0e-adf4-ae14e936e166" containerID="e41f28d469e56bcfb24ec87ca701c01fb43455c941908db97110f75eeb045abb" exitCode=137 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.159866 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c6a797dd408ce0849137ef60cd0e11fe8f4e25380db6af102d9aee0920d7cde" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.177392 4707 scope.go:117] "RemoveContainer" containerID="65770a8fa3c56a256295d751a70837c7e5a69727abf0f918cb18f1afc358384c" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.199427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-metrics-certs-tls-certs\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.199923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-scripts\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 
15:46:37.199968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdb-rundir\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.199990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdbserver-nb-tls-certs\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.200101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgcqc\" (UniqueName: \"kubernetes.io/projected/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-kube-api-access-sgcqc\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.200176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.200248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-combined-ca-bundle\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.200297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-config\") pod \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\" (UID: \"6de81e07-b3f2-4f89-ad02-c5f3b3ea355d\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.200402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.200559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-scripts" (OuterVolumeSpecName: "scripts") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.201048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-config" (OuterVolumeSpecName: "config") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.201061 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.201078 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.201097 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.201129 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts podName:208f36cc-d67f-450a-a68f-1ace0702dfef nodeName:}" failed. No retries permitted until 2026-01-21 15:46:37.70111753 +0000 UTC m=+2694.882633752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts") pod "root-account-create-update-b6rjt" (UID: "208f36cc-d67f-450a-a68f-1ace0702dfef") : configmap "openstack-cell1-scripts" not found Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.205027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.210063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-kube-api-access-sgcqc" (OuterVolumeSpecName: "kube-api-access-sgcqc") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "kube-api-access-sgcqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.210173 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.235518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.269526 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02253407-57dc-4a23-8c81-e9464a8faa92" path="/var/lib/kubelet/pods/02253407-57dc-4a23-8c81-e9464a8faa92/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.270090 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2c49ae-8633-4865-9d20-03b627f5bf54" path="/var/lib/kubelet/pods/2b2c49ae-8633-4865-9d20-03b627f5bf54/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.270751 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a9072a-1e29-4655-aec9-a7bf54ba24f9" path="/var/lib/kubelet/pods/61a9072a-1e29-4655-aec9-a7bf54ba24f9/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279850 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279867 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279875 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279881 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279887 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279893 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279900 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279906 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279913 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279918 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279926 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" 
containerID="6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279932 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279940 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.279950 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.282430 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a54c41-56b7-4424-8b05-a919cb7ce85b" path="/var/lib/kubelet/pods/68a54c41-56b7-4424-8b05-a919cb7ce85b/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.282976 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee595bc-dc44-4195-bbed-7e30c29ddabd" path="/var/lib/kubelet/pods/6ee595bc-dc44-4195-bbed-7e30c29ddabd/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.287667 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c825b0-9f75-45ef-ab7c-5e38af70a8d2" path="/var/lib/kubelet/pods/72c825b0-9f75-45ef-ab7c-5e38af70a8d2/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.288237 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7981604f-86e8-40dc-82c5-22f4e786cd72" path="/var/lib/kubelet/pods/7981604f-86e8-40dc-82c5-22f4e786cd72/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.288777 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875e9554-48a4-49ca-8a6f-2408eb57723f" path="/var/lib/kubelet/pods/875e9554-48a4-49ca-8a6f-2408eb57723f/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.290225 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cd0abca-b95e-45bf-8325-0c12088ce0d1" path="/var/lib/kubelet/pods/8cd0abca-b95e-45bf-8325-0c12088ce0d1/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.290790 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc81354-6161-40a7-b62c-1ba1fbec6b89" path="/var/lib/kubelet/pods/9cc81354-6161-40a7-b62c-1ba1fbec6b89/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.291430 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15a41db-259f-4646-b02e-18cb3c2beab7" path="/var/lib/kubelet/pods/a15a41db-259f-4646-b02e-18cb3c2beab7/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.291963 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3854a85-dfb4-4d28-bba8-be23fbc85bca" path="/var/lib/kubelet/pods/a3854a85-dfb4-4d28-bba8-be23fbc85bca/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.294996 4707 generic.go:334] "Generic (PLEG): container finished" podID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerID="7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.297868 4707 generic.go:334] "Generic (PLEG): container finished" podID="37486029-cbe4-4198-afb9-962c7d0728e5" 
containerID="6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.300208 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be2bb610-3b7b-4815-aefc-fed4415542e1" path="/var/lib/kubelet/pods/be2bb610-3b7b-4815-aefc-fed4415542e1/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.300677 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8711b1e-6689-4b59-a84b-3149a910c125" path="/var/lib/kubelet/pods/c8711b1e-6689-4b59-a84b-3149a910c125/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.302356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config\") pod \"45e9f148-78c8-4e0e-adf4-ae14e936e166\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.302521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config-secret\") pod \"45e9f148-78c8-4e0e-adf4-ae14e936e166\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.302562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jps8s\" (UniqueName: \"kubernetes.io/projected/45e9f148-78c8-4e0e-adf4-ae14e936e166-kube-api-access-jps8s\") pod \"45e9f148-78c8-4e0e-adf4-ae14e936e166\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.302610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-combined-ca-bundle\") pod \"45e9f148-78c8-4e0e-adf4-ae14e936e166\" (UID: \"45e9f148-78c8-4e0e-adf4-ae14e936e166\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.303303 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgcqc\" (UniqueName: \"kubernetes.io/projected/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-kube-api-access-sgcqc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.303324 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.303335 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.303344 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.307029 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef17ce01-e6d2-4028-8e44-e794559c1608" path="/var/lib/kubelet/pods/ef17ce01-e6d2-4028-8e44-e794559c1608/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.307302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/45e9f148-78c8-4e0e-adf4-ae14e936e166-kube-api-access-jps8s" (OuterVolumeSpecName: "kube-api-access-jps8s") pod "45e9f148-78c8-4e0e-adf4-ae14e936e166" (UID: "45e9f148-78c8-4e0e-adf4-ae14e936e166"). InnerVolumeSpecName "kube-api-access-jps8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.320261 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.320371 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c" path="/var/lib/kubelet/pods/fb0c967f-7a49-4e54-b563-2f6f5ba5eb7c/volumes" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.321530 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.328239 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_79747506-45fc-47c5-b7ee-38e40227b1bb/ovsdbserver-sb/0.log" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.328382 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.335679 4707 generic.go:334] "Generic (PLEG): container finished" podID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerID="7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.340028 4707 generic.go:334] "Generic (PLEG): container finished" podID="a16e7975-1407-4544-8c46-b0445ac9c311" containerID="2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.343014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" (UID: "6de81e07-b3f2-4f89-ad02-c5f3b3ea355d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344235 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344252 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344297 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-b2979"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344318 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gwvgj"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "45e9f148-78c8-4e0e-adf4-ae14e936e166" (UID: "45e9f148-78c8-4e0e-adf4-ae14e936e166"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344336 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-sppjh"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"6f277223-7e29-49e4-a1e5-f02c4d883ba7","Type":"ContainerDied","Data":"7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" event={"ID":"37486029-cbe4-4198-afb9-962c7d0728e5","Type":"ContainerDied","Data":"6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" event={"ID":"43e629cb-1223-4f95-a117-0d72b4a06bce","Type":"ContainerStarted","Data":"d8d4ce0a854ae9bddf8e3b10d73742ca5be6457520080bd78367538e9d72ad70"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-jppl5" event={"ID":"bad0ccf6-1685-4165-9980-b2a65109a1a0","Type":"ContainerStarted","Data":"fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-jppl5" event={"ID":"bad0ccf6-1685-4165-9980-b2a65109a1a0","Type":"ContainerStarted","Data":"86c20ec2423ccc6675438a96568d733315ec2b5ece8143cda1693be010762423"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"79747506-45fc-47c5-b7ee-38e40227b1bb","Type":"ContainerDied","Data":"fa7800c817719e72eb58fd0d2dd563347447e42c0aae8c147976ac58ca81314d"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c","Type":"ContainerDied","Data":"7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a16e7975-1407-4544-8c46-b0445ac9c311","Type":"ContainerDied","Data":"2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.344572 4707 scope.go:117] "RemoveContainer" containerID="85117f9d04aa0f7e6d587b250d45813072e69df9bf3aba08c43cd1843a375512" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.345406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45e9f148-78c8-4e0e-adf4-ae14e936e166" (UID: "45e9f148-78c8-4e0e-adf4-ae14e936e166"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.347729 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/1.log" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.350366 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerID="93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de" exitCode=0 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.350457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerDied","Data":"93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de"} Jan 21 15:46:37 crc kubenswrapper[4707]: W0121 15:46:37.355709 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cc4f223_9c2d_442f_aa10_16fcaee9e8c5.slice/crio-90776de41212a4557600956b90a42e0d19ff5e3a7e4941a65d3f52249f688375 WatchSource:0}: Error finding container 90776de41212a4557600956b90a42e0d19ff5e3a7e4941a65d3f52249f688375: Status 404 returned error can't find the container with id 90776de41212a4557600956b90a42e0d19ff5e3a7e4941a65d3f52249f688375 Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.376343 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:37 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 15:46:37 crc kubenswrapper[4707]: else Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:37 crc kubenswrapper[4707]: fi Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:37 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:37 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:37 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:37 crc kubenswrapper[4707]: # support updates Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.377971 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" podUID="ebb0298d-8a63-4ea2-8473-e5e86a134fd8" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.377960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "45e9f148-78c8-4e0e-adf4-ae14e936e166" (UID: "45e9f148-78c8-4e0e-adf4-ae14e936e166"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.380665 4707 generic.go:334] "Generic (PLEG): container finished" podID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerID="3f05e66f94e563650c321336e37a8530429e8dfd49776a8178131469087deaaf" exitCode=143 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.380725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"98f30a9f-5d85-4dce-8bc8-03e26c9967c8","Type":"ContainerDied","Data":"3f05e66f94e563650c321336e37a8530429e8dfd49776a8178131469087deaaf"} Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.394195 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:37 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:46:37 crc kubenswrapper[4707]: else Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:37 crc kubenswrapper[4707]: fi Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:37 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:37 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:37 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:37 crc kubenswrapper[4707]: # support updates Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.396017 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" podUID="a185d0f4-2937-4afb-ad5c-8ebc5e87a241" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.396858 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:37 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 15:46:37 crc kubenswrapper[4707]: else Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:37 crc kubenswrapper[4707]: fi Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:37 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:37 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:37 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:37 crc kubenswrapper[4707]: # support updates Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.397744 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerID="e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d" exitCode=2 Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.397796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerDied","Data":"e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.399914 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.398929 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" podUID="259fc011-3b4a-41a2-a064-d7d2282e9763" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.400443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" event={"ID":"097b8c3a-4002-4270-9f9b-e56f1507f748","Type":"ContainerStarted","Data":"c919b259fa9526ea0ac792de2dfd5635ddd24a725071a392afbbd3791ab4d306"} Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404374 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404791 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404825 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404836 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404845 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jps8s\" (UniqueName: \"kubernetes.io/projected/45e9f148-78c8-4e0e-adf4-ae14e936e166-kube-api-access-jps8s\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404853 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e9f148-78c8-4e0e-adf4-ae14e936e166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.404862 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc 
kubenswrapper[4707]: I0121 15:46:37.404870 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/45e9f148-78c8-4e0e-adf4-ae14e936e166-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.430582 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:46:37 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:46:37 crc kubenswrapper[4707]: else Jan 21 15:46:37 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:46:37 crc kubenswrapper[4707]: fi Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:46:37 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:46:37 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:46:37 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:46:37 crc kubenswrapper[4707]: # support updates Jan 21 15:46:37 crc kubenswrapper[4707]: Jan 21 15:46:37 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.432018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" podUID="df2290ff-a278-4834-be6d-3377f82d1548" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.434756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.440892 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.475123 4707 scope.go:117] "RemoveContainer" containerID="423b9cfbccfd3275dd905b5666c4c6442b89dad40b2b0658353b1b608bf27823" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.539603 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.628125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v48xr\" (UniqueName: \"kubernetes.io/projected/36879807-6691-412e-b126-9bb6813b7e65-kube-api-access-v48xr\") pod \"36879807-6691-412e-b126-9bb6813b7e65\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.628164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36879807-6691-412e-b126-9bb6813b7e65-operator-scripts\") pod \"36879807-6691-412e-b126-9bb6813b7e65\" (UID: \"36879807-6691-412e-b126-9bb6813b7e65\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.633169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36879807-6691-412e-b126-9bb6813b7e65-kube-api-access-v48xr" (OuterVolumeSpecName: "kube-api-access-v48xr") pod "36879807-6691-412e-b126-9bb6813b7e65" (UID: "36879807-6691-412e-b126-9bb6813b7e65"). InnerVolumeSpecName "kube-api-access-v48xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.635080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36879807-6691-412e-b126-9bb6813b7e65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36879807-6691-412e-b126-9bb6813b7e65" (UID: "36879807-6691-412e-b126-9bb6813b7e65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.731105 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v48xr\" (UniqueName: \"kubernetes.io/projected/36879807-6691-412e-b126-9bb6813b7e65-kube-api-access-v48xr\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.731136 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36879807-6691-412e-b126-9bb6813b7e65-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.731119 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.731627 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts podName:208f36cc-d67f-450a-a68f-1ace0702dfef nodeName:}" failed. No retries permitted until 2026-01-21 15:46:38.731605936 +0000 UTC m=+2695.913122159 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts") pod "root-account-create-update-b6rjt" (UID: "208f36cc-d67f-450a-a68f-1ace0702dfef") : configmap "openstack-cell1-scripts" not found Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.790791 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.790884 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.794440 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.797720 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0ccf6-1685-4165-9980-b2a65109a1a0-operator-scripts\") pod \"bad0ccf6-1685-4165-9980-b2a65109a1a0\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e629cb-1223-4f95-a117-0d72b4a06bce-operator-scripts\") pod \"43e629cb-1223-4f95-a117-0d72b4a06bce\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhqv\" (UniqueName: \"kubernetes.io/projected/bad0ccf6-1685-4165-9980-b2a65109a1a0-kube-api-access-ljhqv\") pod \"bad0ccf6-1685-4165-9980-b2a65109a1a0\" (UID: \"bad0ccf6-1685-4165-9980-b2a65109a1a0\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-combined-ca-bundle\") pod \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qvz2\" (UniqueName: \"kubernetes.io/projected/43e629cb-1223-4f95-a117-0d72b4a06bce-kube-api-access-7qvz2\") pod \"43e629cb-1223-4f95-a117-0d72b4a06bce\" (UID: \"43e629cb-1223-4f95-a117-0d72b4a06bce\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfzv2\" (UniqueName: \"kubernetes.io/projected/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-api-access-zfzv2\") pod \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f152ce-e53d-4745-974c-bbc06a4f5bd6-operator-scripts\") pod \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-config\") pod \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-certs\") pod \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\" (UID: \"0837d9e7-fb96-43b0-a5fb-5eca08e411a3\") " Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.937485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl46b\" (UniqueName: \"kubernetes.io/projected/02f152ce-e53d-4745-974c-bbc06a4f5bd6-kube-api-access-hl46b\") pod \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\" (UID: \"02f152ce-e53d-4745-974c-bbc06a4f5bd6\") " Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.937951 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:37 crc kubenswrapper[4707]: E0121 15:46:37.937996 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data podName:f341e188-43e6-4019-b4dd-b8ea727d2d3f nodeName:}" failed. No retries permitted until 2026-01-21 15:46:41.937982365 +0000 UTC m=+2699.119498587 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data") pod "rabbitmq-cell1-server-0" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.938123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad0ccf6-1685-4165-9980-b2a65109a1a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bad0ccf6-1685-4165-9980-b2a65109a1a0" (UID: "bad0ccf6-1685-4165-9980-b2a65109a1a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.938613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43e629cb-1223-4f95-a117-0d72b4a06bce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43e629cb-1223-4f95-a117-0d72b4a06bce" (UID: "43e629cb-1223-4f95-a117-0d72b4a06bce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.940162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02f152ce-e53d-4745-974c-bbc06a4f5bd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02f152ce-e53d-4745-974c-bbc06a4f5bd6" (UID: "02f152ce-e53d-4745-974c-bbc06a4f5bd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.944174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02f152ce-e53d-4745-974c-bbc06a4f5bd6-kube-api-access-hl46b" (OuterVolumeSpecName: "kube-api-access-hl46b") pod "02f152ce-e53d-4745-974c-bbc06a4f5bd6" (UID: "02f152ce-e53d-4745-974c-bbc06a4f5bd6"). InnerVolumeSpecName "kube-api-access-hl46b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.945731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad0ccf6-1685-4165-9980-b2a65109a1a0-kube-api-access-ljhqv" (OuterVolumeSpecName: "kube-api-access-ljhqv") pod "bad0ccf6-1685-4165-9980-b2a65109a1a0" (UID: "bad0ccf6-1685-4165-9980-b2a65109a1a0"). InnerVolumeSpecName "kube-api-access-ljhqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.949944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e629cb-1223-4f95-a117-0d72b4a06bce-kube-api-access-7qvz2" (OuterVolumeSpecName: "kube-api-access-7qvz2") pod "43e629cb-1223-4f95-a117-0d72b4a06bce" (UID: "43e629cb-1223-4f95-a117-0d72b4a06bce"). InnerVolumeSpecName "kube-api-access-7qvz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.971991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-api-access-zfzv2" (OuterVolumeSpecName: "kube-api-access-zfzv2") pod "0837d9e7-fb96-43b0-a5fb-5eca08e411a3" (UID: "0837d9e7-fb96-43b0-a5fb-5eca08e411a3"). InnerVolumeSpecName "kube-api-access-zfzv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.983928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0837d9e7-fb96-43b0-a5fb-5eca08e411a3" (UID: "0837d9e7-fb96-43b0-a5fb-5eca08e411a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:37 crc kubenswrapper[4707]: I0121 15:46:37.992162 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.010098 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.014192 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.016120 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-7glvt"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040930 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e629cb-1223-4f95-a117-0d72b4a06bce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040953 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljhqv\" (UniqueName: \"kubernetes.io/projected/bad0ccf6-1685-4165-9980-b2a65109a1a0-kube-api-access-ljhqv\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040962 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040970 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qvz2\" (UniqueName: \"kubernetes.io/projected/43e629cb-1223-4f95-a117-0d72b4a06bce-kube-api-access-7qvz2\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040979 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfzv2\" (UniqueName: \"kubernetes.io/projected/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-api-access-zfzv2\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040987 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f152ce-e53d-4745-974c-bbc06a4f5bd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.040994 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl46b\" (UniqueName: \"kubernetes.io/projected/02f152ce-e53d-4745-974c-bbc06a4f5bd6-kube-api-access-hl46b\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.041002 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0ccf6-1685-4165-9980-b2a65109a1a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.053877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-9nhsf"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.072901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-9nhsf"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.084859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "0837d9e7-fb96-43b0-a5fb-5eca08e411a3" (UID: "0837d9e7-fb96-43b0-a5fb-5eca08e411a3"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088303 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p"] Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.088668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-httpd" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088679 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-httpd" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.088698 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0837d9e7-fb96-43b0-a5fb-5eca08e411a3" containerName="kube-state-metrics" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0837d9e7-fb96-43b0-a5fb-5eca08e411a3" containerName="kube-state-metrics" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.088719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="ovsdbserver-nb" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088725 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="ovsdbserver-nb" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.088745 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad0ccf6-1685-4165-9980-b2a65109a1a0" containerName="mariadb-database-create" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088751 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad0ccf6-1685-4165-9980-b2a65109a1a0" containerName="mariadb-database-create" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.088765 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-server" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088770 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-server" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.088784 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="openstack-network-exporter" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088789 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="openstack-network-exporter" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088948 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad0ccf6-1685-4165-9980-b2a65109a1a0" containerName="mariadb-database-create" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088962 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-server" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="ovsdbserver-nb" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088986 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerName="proxy-httpd" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.088993 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" containerName="openstack-network-exporter" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.089003 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0837d9e7-fb96-43b0-a5fb-5eca08e411a3" containerName="kube-state-metrics" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.096869 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.104664 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.107699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "0837d9e7-fb96-43b0-a5fb-5eca08e411a3" (UID: "0837d9e7-fb96-43b0-a5fb-5eca08e411a3"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.116145 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-92nzp"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.129034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.130953 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-c5n5h"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.135920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-92nzp"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.141377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-fh744"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.142529 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-etc-swift\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th6fp\" (UniqueName: \"kubernetes.io/projected/097b8c3a-4002-4270-9f9b-e56f1507f748-kube-api-access-th6fp\") pod \"097b8c3a-4002-4270-9f9b-e56f1507f748\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-combined-ca-bundle\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-public-tls-certs\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-internal-tls-certs\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097b8c3a-4002-4270-9f9b-e56f1507f748-operator-scripts\") pod \"097b8c3a-4002-4270-9f9b-e56f1507f748\" (UID: \"097b8c3a-4002-4270-9f9b-e56f1507f748\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88rhz\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-kube-api-access-88rhz\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-config-data\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-log-httpd\") pod \"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.143986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-run-httpd\") pod 
\"13204bd0-d001-41b6-b3e3-606d0886ee77\" (UID: \"13204bd0-d001-41b6-b3e3-606d0886ee77\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.144411 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-c5n5h"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.144543 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.144555 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837d9e7-fb96-43b0-a5fb-5eca08e411a3-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.145125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.148666 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-667579cbdb-pbr2f"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.148840 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" podUID="26893235-a33a-4211-b02e-b4237c0586c5" containerName="keystone-api" containerID="cri-o://4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74" gracePeriod=30 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.151693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.152585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.153773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/097b8c3a-4002-4270-9f9b-e56f1507f748-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "097b8c3a-4002-4270-9f9b-e56f1507f748" (UID: "097b8c3a-4002-4270-9f9b-e56f1507f748"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.163037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097b8c3a-4002-4270-9f9b-e56f1507f748-kube-api-access-th6fp" (OuterVolumeSpecName: "kube-api-access-th6fp") pod "097b8c3a-4002-4270-9f9b-e56f1507f748" (UID: "097b8c3a-4002-4270-9f9b-e56f1507f748"). InnerVolumeSpecName "kube-api-access-th6fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.163086 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-fh744"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.171907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-kube-api-access-88rhz" (OuterVolumeSpecName: "kube-api-access-88rhz") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "kube-api-access-88rhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.191660 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.223569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-fh744"] Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.224771 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-pv9jn operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-db-create-fh744" podUID="e4af1c90-a8e5-4608-9ff7-495057250aba" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.236842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p"] Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.237513 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-l6bt8 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" podUID="6f619bdd-bf9c-40af-90f3-6c3bfe463d74" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.247170 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.247343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6bt8\" (UniqueName: \"kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.247397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9jn\" (UniqueName: \"kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts\") pod 
\"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248559 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th6fp\" (UniqueName: \"kubernetes.io/projected/097b8c3a-4002-4270-9f9b-e56f1507f748-kube-api-access-th6fp\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248572 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097b8c3a-4002-4270-9f9b-e56f1507f748-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248581 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88rhz\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-kube-api-access-88rhz\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248590 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248618 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13204bd0-d001-41b6-b3e3-606d0886ee77-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.248626 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13204bd0-d001-41b6-b3e3-606d0886ee77-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.251364 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.268237 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.285944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.314744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.323757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-config-data" (OuterVolumeSpecName: "config-data") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.332383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "13204bd0-d001-41b6-b3e3-606d0886ee77" (UID: "13204bd0-d001-41b6-b3e3-606d0886ee77"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.349146 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.350995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-combined-ca-bundle\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-kolla-config\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-galera-tls-certs\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-combined-ca-bundle\") pod \"62555ab6-9268-4248-977e-98269dc9483d\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-config-data-default\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-operator-scripts\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c14f4f47-2fe8-47fd-892f-832266276730-config-data-generated\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: 
I0121 15:46:38.351595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-vencrypt-tls-certs\") pod \"62555ab6-9268-4248-977e-98269dc9483d\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttwft\" (UniqueName: \"kubernetes.io/projected/c14f4f47-2fe8-47fd-892f-832266276730-kube-api-access-ttwft\") pod \"c14f4f47-2fe8-47fd-892f-832266276730\" (UID: \"c14f4f47-2fe8-47fd-892f-832266276730\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6mv9\" (UniqueName: \"kubernetes.io/projected/62555ab6-9268-4248-977e-98269dc9483d-kube-api-access-d6mv9\") pod \"62555ab6-9268-4248-977e-98269dc9483d\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.351980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-nova-novncproxy-tls-certs\") pod \"62555ab6-9268-4248-977e-98269dc9483d\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.358610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-config-data\") pod \"62555ab6-9268-4248-977e-98269dc9483d\" (UID: \"62555ab6-9268-4248-977e-98269dc9483d\") " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.359295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bt8\" (UniqueName: \"kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.359412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9jn\" (UniqueName: \"kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.359747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.360067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.360220 4707 configmap.go:193] Couldn't get 
configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.360552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.360627 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts podName:e4af1c90-a8e5-4608-9ff7-495057250aba nodeName:}" failed. No retries permitted until 2026-01-21 15:46:38.860589889 +0000 UTC m=+2696.042106111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts") pod "keystone-db-create-fh744" (UID: "e4af1c90-a8e5-4608-9ff7-495057250aba") : configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361170 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361263 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361340 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361391 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361981 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13204bd0-d001-41b6-b3e3-606d0886ee77-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14f4f47-2fe8-47fd-892f-832266276730-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.361484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.362262 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.362367 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts podName:6f619bdd-bf9c-40af-90f3-6c3bfe463d74 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:38.862348335 +0000 UTC m=+2696.043864557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts") pod "keystone-d4e5-account-create-update-2c75p" (UID: "6f619bdd-bf9c-40af-90f3-6c3bfe463d74") : configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.364123 4707 projected.go:194] Error preparing data for projected volume kube-api-access-l6bt8 for pod openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.364179 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8 podName:6f619bdd-bf9c-40af-90f3-6c3bfe463d74 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:38.864159771 +0000 UTC m=+2696.045675993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l6bt8" (UniqueName: "kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8") pod "keystone-d4e5-account-create-update-2c75p" (UID: "6f619bdd-bf9c-40af-90f3-6c3bfe463d74") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.364240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.365119 4707 projected.go:194] Error preparing data for projected volume kube-api-access-pv9jn for pod openstack-kuttl-tests/keystone-db-create-fh744: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.365208 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn podName:e4af1c90-a8e5-4608-9ff7-495057250aba nodeName:}" failed. No retries permitted until 2026-01-21 15:46:38.865165181 +0000 UTC m=+2696.046681404 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pv9jn" (UniqueName: "kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn") pod "keystone-db-create-fh744" (UID: "e4af1c90-a8e5-4608-9ff7-495057250aba") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.380680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62555ab6-9268-4248-977e-98269dc9483d-kube-api-access-d6mv9" (OuterVolumeSpecName: "kube-api-access-d6mv9") pod "62555ab6-9268-4248-977e-98269dc9483d" (UID: "62555ab6-9268-4248-977e-98269dc9483d"). InnerVolumeSpecName "kube-api-access-d6mv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.380984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14f4f47-2fe8-47fd-892f-832266276730-kube-api-access-ttwft" (OuterVolumeSpecName: "kube-api-access-ttwft") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "kube-api-access-ttwft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.385929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.395930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "mysql-db") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.401200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-config-data" (OuterVolumeSpecName: "config-data") pod "62555ab6-9268-4248-977e-98269dc9483d" (UID: "62555ab6-9268-4248-977e-98269dc9483d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.410052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "c14f4f47-2fe8-47fd-892f-832266276730" (UID: "c14f4f47-2fe8-47fd-892f-832266276730"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.413741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" event={"ID":"097b8c3a-4002-4270-9f9b-e56f1507f748","Type":"ContainerDied","Data":"c919b259fa9526ea0ac792de2dfd5635ddd24a725071a392afbbd3791ab4d306"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.413838 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.417029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62555ab6-9268-4248-977e-98269dc9483d" (UID: "62555ab6-9268-4248-977e-98269dc9483d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.419236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" event={"ID":"a185d0f4-2937-4afb-ad5c-8ebc5e87a241","Type":"ContainerStarted","Data":"12149c1e0c89cee3ca41f1f3975300484340de2a1b13f81b49a0dc655183cfd3"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.424177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" event={"ID":"be4aab59-e488-40d8-883e-0e09a44aae4b","Type":"ContainerStarted","Data":"9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.424208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" event={"ID":"be4aab59-e488-40d8-883e-0e09a44aae4b","Type":"ContainerStarted","Data":"0e0599d2c6956203d378e4735b032ab01af08232ab17eff3e5ce09671e32f9f5"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.424208 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" podUID="be4aab59-e488-40d8-883e-0e09a44aae4b" containerName="mariadb-database-create" containerID="cri-o://9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df" gracePeriod=30 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.426572 4707 generic.go:334] "Generic (PLEG): container finished" podID="3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" containerID="b662cf8f751d4133a78fceed594f185c60eda5f174396f0f49dabb10a96c0cd9" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.426613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" event={"ID":"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5","Type":"ContainerDied","Data":"b662cf8f751d4133a78fceed594f185c60eda5f174396f0f49dabb10a96c0cd9"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.426628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" event={"ID":"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5","Type":"ContainerStarted","Data":"90776de41212a4557600956b90a42e0d19ff5e3a7e4941a65d3f52249f688375"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.430952 4707 generic.go:334] "Generic (PLEG): container finished" podID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerID="e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34" exitCode=143 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.431000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" event={"ID":"7506b316-cf85-48fa-a833-fafc5e2cb8ce","Type":"ContainerDied","Data":"e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.432153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" event={"ID":"68cc87bb-db38-44df-a921-630ecf45a30b","Type":"ContainerStarted","Data":"a7f72094a6203e918e4dbc21bf4112ebefeb6a3af22b9b548d390db7d7196592"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.433700 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e72ad47-d005-44ce-a494-5ed25884c762" containerID="2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e" exitCode=143 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.433752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" event={"ID":"8e72ad47-d005-44ce-a494-5ed25884c762","Type":"ContainerDied","Data":"2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.435782 4707 generic.go:334] "Generic (PLEG): container finished" podID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerID="95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.435800 4707 generic.go:334] "Generic (PLEG): container finished" podID="13204bd0-d001-41b6-b3e3-606d0886ee77" containerID="ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.435979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" event={"ID":"13204bd0-d001-41b6-b3e3-606d0886ee77","Type":"ContainerDied","Data":"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.436007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" event={"ID":"13204bd0-d001-41b6-b3e3-606d0886ee77","Type":"ContainerDied","Data":"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.436018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" event={"ID":"13204bd0-d001-41b6-b3e3-606d0886ee77","Type":"ContainerDied","Data":"37cf0b0c1386369086dc0b8ef7d5a1dccc8a920f94d3623675241e4d53c5f19d"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.436034 4707 scope.go:117] "RemoveContainer" containerID="95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.436137 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.447626 4707 generic.go:334] "Generic (PLEG): container finished" podID="bad0ccf6-1685-4165-9980-b2a65109a1a0" containerID="fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.447673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-jppl5" event={"ID":"bad0ccf6-1685-4165-9980-b2a65109a1a0","Type":"ContainerDied","Data":"fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.447693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-jppl5" event={"ID":"bad0ccf6-1685-4165-9980-b2a65109a1a0","Type":"ContainerDied","Data":"86c20ec2423ccc6675438a96568d733315ec2b5ece8143cda1693be010762423"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.447972 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-jppl5" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.452688 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerName="galera" containerID="cri-o://71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0" gracePeriod=30 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.460105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" event={"ID":"43e629cb-1223-4f95-a117-0d72b4a06bce","Type":"ContainerDied","Data":"d8d4ce0a854ae9bddf8e3b10d73742ca5be6457520080bd78367538e9d72ad70"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.460171 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.462372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" event={"ID":"ebb0298d-8a63-4ea2-8473-e5e86a134fd8","Type":"ContainerStarted","Data":"117b929f04c5aa4bbe38ebabca8be43e1c58f947ea50cd09317531a5fefeeb03"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463129 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c14f4f47-2fe8-47fd-892f-832266276730-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463156 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463166 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttwft\" (UniqueName: \"kubernetes.io/projected/c14f4f47-2fe8-47fd-892f-832266276730-kube-api-access-ttwft\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463175 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6mv9\" (UniqueName: \"kubernetes.io/projected/62555ab6-9268-4248-977e-98269dc9483d-kube-api-access-d6mv9\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463185 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463193 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463201 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463209 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f4f47-2fe8-47fd-892f-832266276730-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463216 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.463224 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c14f4f47-2fe8-47fd-892f-832266276730-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.465994 4707 generic.go:334] "Generic (PLEG): container finished" podID="c14f4f47-2fe8-47fd-892f-832266276730" containerID="5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.466075 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.466086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "62555ab6-9268-4248-977e-98269dc9483d" (UID: "62555ab6-9268-4248-977e-98269dc9483d"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.466162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"c14f4f47-2fe8-47fd-892f-832266276730","Type":"ContainerDied","Data":"5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.466181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"c14f4f47-2fe8-47fd-892f-832266276730","Type":"ContainerDied","Data":"87a3b17bf0c7e0c8a4701be87cf3429e578bd8b8965cd247b5f6148fe884559b"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.466304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "62555ab6-9268-4248-977e-98269dc9483d" (UID: "62555ab6-9268-4248-977e-98269dc9483d"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.466643 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" podStartSLOduration=4.4666277260000005 podStartE2EDuration="4.466627726s" podCreationTimestamp="2026-01-21 15:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:38.46597572 +0000 UTC m=+2695.647491942" watchObservedRunningTime="2026-01-21 15:46:38.466627726 +0000 UTC m=+2695.648143947" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.478185 4707 generic.go:334] "Generic (PLEG): container finished" podID="0837d9e7-fb96-43b0-a5fb-5eca08e411a3" containerID="c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0" exitCode=2 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.478708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0837d9e7-fb96-43b0-a5fb-5eca08e411a3","Type":"ContainerDied","Data":"c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.478740 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0837d9e7-fb96-43b0-a5fb-5eca08e411a3","Type":"ContainerDied","Data":"fcaeeee77530d0ed58ce1b12a27c42487ac062906587b999e64c35d5fba378ee"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.478794 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.483140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" event={"ID":"36879807-6691-412e-b126-9bb6813b7e65","Type":"ContainerDied","Data":"6bf4ae6e3969196d9e9b3efcfca6f9c83c599ad1c482fbd5f9b1431c61312251"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.483189 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.491543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" event={"ID":"02f152ce-e53d-4745-974c-bbc06a4f5bd6","Type":"ContainerDied","Data":"851dfeaf1bcc748de769715db84b7c3a1ea8498e1fcd9de0aadbc3af0d9b390c"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.491606 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-4189-account-create-update-8rj4f" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.494389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" event={"ID":"259fc011-3b4a-41a2-a064-d7d2282e9763","Type":"ContainerStarted","Data":"91665dcc343908542c8bac545227808b89dbcbce2c0a1951b05752762c5277a2"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.494524 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.524858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerDied","Data":"002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.524920 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerID="002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.524940 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerID="bfb9505879b3bc1531075e521f5c432e0486e1366b2751cb167564231e34887f" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.525347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerDied","Data":"bfb9505879b3bc1531075e521f5c432e0486e1366b2751cb167564231e34887f"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.527456 4707 generic.go:334] "Generic (PLEG): container finished" podID="62555ab6-9268-4248-977e-98269dc9483d" containerID="a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea" exitCode=0 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.527498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62555ab6-9268-4248-977e-98269dc9483d","Type":"ContainerDied","Data":"a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.527514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62555ab6-9268-4248-977e-98269dc9483d","Type":"ContainerDied","Data":"43c7a1cb62672dabed09c98f9e772b07aa1e16158852aa339d07f565b1f0f33b"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.528034 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.541145 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" podUID="d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" containerName="mariadb-database-create" containerID="cri-o://ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771" gracePeriod=30 Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.541345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" event={"ID":"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41","Type":"ContainerStarted","Data":"ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.541370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" event={"ID":"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41","Type":"ContainerStarted","Data":"dc088fa739d7ce5615481ee552f800e908e469a51345f767c2cc3947e35b42c3"} Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.541484 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.545593 4707 scope.go:117] "RemoveContainer" containerID="ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.545964 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.546435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.564471 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.564496 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.564505 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62555ab6-9268-4248-977e-98269dc9483d-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.762744 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.769656 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.769666 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" podStartSLOduration=4.769652956 podStartE2EDuration="4.769652956s" podCreationTimestamp="2026-01-21 15:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:38.608968029 +0000 UTC m=+2695.790484252" watchObservedRunningTime="2026-01-21 15:46:38.769652956 +0000 UTC m=+2695.951169179" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.769780 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts podName:208f36cc-d67f-450a-a68f-1ace0702dfef nodeName:}" failed. No retries permitted until 2026-01-21 15:46:40.769764557 +0000 UTC m=+2697.951280779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts") pod "root-account-create-update-b6rjt" (UID: "208f36cc-d67f-450a-a68f-1ace0702dfef") : configmap "openstack-cell1-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.771480 4707 scope.go:117] "RemoveContainer" containerID="95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.775733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw"] Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.780718 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4\": container with ID starting with 95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4 not found: ID does not exist" containerID="95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.780774 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4"} err="failed to get container status \"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4\": rpc error: code = NotFound desc = could not find container \"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4\": container with ID starting with 95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4 not found: ID does not exist" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.780795 4707 scope.go:117] "RemoveContainer" containerID="ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.781140 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.781866 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748\": container with ID starting with ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748 not found: ID does not exist" containerID="ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.781906 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748"} err="failed to get container status \"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748\": rpc error: code = NotFound desc = could not find container \"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748\": container with ID starting with ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748 not found: ID does not exist" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.781921 4707 scope.go:117] "RemoveContainer" containerID="95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.782417 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4"} err="failed to get container status \"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4\": rpc error: code = NotFound desc = could not find container \"95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4\": container with ID starting with 95d85f96037c14bc1dc249a9886639e82ce25520d15864ce49233ebbc58f3ff4 not found: ID does not exist" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.782455 4707 scope.go:117] "RemoveContainer" containerID="ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.782737 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748"} err="failed to get container status \"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748\": rpc error: code = NotFound desc = could not find container \"ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748\": container with ID starting with ae53b13ffcd20dfc4820027f6a0f1944fa69c6c49613adbc80717aec5ecf0748 not found: ID does not exist" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.782750 4707 scope.go:117] "RemoveContainer" containerID="fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.797137 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6b78579b5b-5nmdw"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.813029 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.822518 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-a608-account-create-update-w7mtg"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.839205 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/glance-4189-account-create-update-8rj4f"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.844628 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-4189-account-create-update-8rj4f"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.865301 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.869363 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7e8f-account-create-update-wd54k"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.870994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.871086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bt8\" (UniqueName: \"kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.871109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9jn\" (UniqueName: \"kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.871212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.871328 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.871362 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts podName:6f619bdd-bf9c-40af-90f3-6c3bfe463d74 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:39.871350002 +0000 UTC m=+2697.052866224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts") pod "keystone-d4e5-account-create-update-2c75p" (UID: "6f619bdd-bf9c-40af-90f3-6c3bfe463d74") : configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.871541 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.871565 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts podName:e4af1c90-a8e5-4608-9ff7-495057250aba nodeName:}" failed. No retries permitted until 2026-01-21 15:46:39.871558033 +0000 UTC m=+2697.053074245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts") pod "keystone-db-create-fh744" (UID: "e4af1c90-a8e5-4608-9ff7-495057250aba") : configmap "openstack-scripts" not found Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.878560 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.878904 4707 scope.go:117] "RemoveContainer" containerID="fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb" Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.878979 4707 projected.go:194] Error preparing data for projected volume kube-api-access-pv9jn for pod openstack-kuttl-tests/keystone-db-create-fh744: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.879025 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn podName:e4af1c90-a8e5-4608-9ff7-495057250aba nodeName:}" failed. No retries permitted until 2026-01-21 15:46:39.879013941 +0000 UTC m=+2697.060530154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pv9jn" (UniqueName: "kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn") pod "keystone-db-create-fh744" (UID: "e4af1c90-a8e5-4608-9ff7-495057250aba") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.879081 4707 projected.go:194] Error preparing data for projected volume kube-api-access-l6bt8 for pod openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.879132 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8 podName:6f619bdd-bf9c-40af-90f3-6c3bfe463d74 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:39.879116395 +0000 UTC m=+2697.060632617 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l6bt8" (UniqueName: "kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8") pod "keystone-d4e5-account-create-update-2c75p" (UID: "6f619bdd-bf9c-40af-90f3-6c3bfe463d74") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:38 crc kubenswrapper[4707]: E0121 15:46:38.879203 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb\": container with ID starting with fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb not found: ID does not exist" containerID="fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.879226 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb"} err="failed to get container status \"fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb\": rpc error: code = NotFound desc = could not find container \"fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb\": container with ID starting with fee74db5a6276e4e061a3f5e71d65862c215136077bd2169e5fafd3f55027edb not found: ID does not exist" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.879245 4707 scope.go:117] "RemoveContainer" containerID="5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.888570 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.906879 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.909052 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.913881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-jppl5"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.925441 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-jppl5"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.936175 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.936208 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-dd06-account-create-update-ms8c6"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.939054 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.944134 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.959040 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:38 crc kubenswrapper[4707]: I0121 15:46:38.966946 4707 scope.go:117] "RemoveContainer" containerID="feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.006215 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.014744 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.021859 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.031519 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.038787 4707 scope.go:117] "RemoveContainer" containerID="5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.047629 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92\": container with ID starting with 5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92 not found: ID does not exist" containerID="5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.047679 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92"} err="failed to get container status \"5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92\": rpc error: code = NotFound desc = could not find container \"5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92\": container with ID starting with 5fc38f3cb12cfac1f44c49f83815c7f65b24a037a5bfbcd7bbf82ec7a48f3c92 not found: ID does not exist" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.047703 4707 scope.go:117] "RemoveContainer" containerID="feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.048396 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936\": container with ID starting with feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936 not found: ID does not exist" containerID="feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.048414 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936"} err="failed to get container status \"feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936\": rpc error: code = NotFound desc = could not find container \"feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936\": container with ID starting with feb94eb8dd4b304608e9903f93947bb54699ccd796c22aa562d357e644d55936 
not found: ID does not exist" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.048427 4707 scope.go:117] "RemoveContainer" containerID="c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.076128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzwvm\" (UniqueName: \"kubernetes.io/projected/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-kube-api-access-fzwvm\") pod \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.077691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-operator-scripts\") pod \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\" (UID: \"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5\") " Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.080351 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.080398 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data podName:8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab nodeName:}" failed. No retries permitted until 2026-01-21 15:46:43.080382032 +0000 UTC m=+2700.261898254 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data") pod "rabbitmq-server-0" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab") : configmap "rabbitmq-config-data" not found Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.080904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" (UID: "3cc4f223-9c2d-442f-aa10-16fcaee9e8c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.101174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-kube-api-access-fzwvm" (OuterVolumeSpecName: "kube-api-access-fzwvm") pod "3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" (UID: "3cc4f223-9c2d-442f-aa10-16fcaee9e8c5"). InnerVolumeSpecName "kube-api-access-fzwvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.121642 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.132444 4707 scope.go:117] "RemoveContainer" containerID="c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.133292 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0\": container with ID starting with c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0 not found: ID does not exist" containerID="c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.133332 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0"} err="failed to get container status \"c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0\": rpc error: code = NotFound desc = could not find container \"c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0\": container with ID starting with c354aa03acbb37da68849503f74cf11b3d99a595a4b34eeed52c0aba845e70d0 not found: ID does not exist" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.133348 4707 scope.go:117] "RemoveContainer" containerID="a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzpjn\" (UniqueName: \"kubernetes.io/projected/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-kube-api-access-qzpjn\") pod \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc7th\" (UniqueName: \"kubernetes.io/projected/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-kube-api-access-qc7th\") pod \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j8hq\" (UniqueName: \"kubernetes.io/projected/208f36cc-d67f-450a-a68f-1ace0702dfef-kube-api-access-2j8hq\") pod \"208f36cc-d67f-450a-a68f-1ace0702dfef\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259fc011-3b4a-41a2-a064-d7d2282e9763-operator-scripts\") pod \"259fc011-3b4a-41a2-a064-d7d2282e9763\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-operator-scripts\") pod \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\" (UID: \"a185d0f4-2937-4afb-ad5c-8ebc5e87a241\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-operator-scripts\") pod \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\" (UID: \"ebb0298d-8a63-4ea2-8473-e5e86a134fd8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhqbn\" (UniqueName: \"kubernetes.io/projected/259fc011-3b4a-41a2-a064-d7d2282e9763-kube-api-access-lhqbn\") pod \"259fc011-3b4a-41a2-a064-d7d2282e9763\" (UID: \"259fc011-3b4a-41a2-a064-d7d2282e9763\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts\") pod \"208f36cc-d67f-450a-a68f-1ace0702dfef\" (UID: \"208f36cc-d67f-450a-a68f-1ace0702dfef\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.181989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a185d0f4-2937-4afb-ad5c-8ebc5e87a241" (UID: "a185d0f4-2937-4afb-ad5c-8ebc5e87a241"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.182009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebb0298d-8a63-4ea2-8473-e5e86a134fd8" (UID: "ebb0298d-8a63-4ea2-8473-e5e86a134fd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.182386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "208f36cc-d67f-450a-a68f-1ace0702dfef" (UID: "208f36cc-d67f-450a-a68f-1ace0702dfef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.182634 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259fc011-3b4a-41a2-a064-d7d2282e9763-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "259fc011-3b4a-41a2-a064-d7d2282e9763" (UID: "259fc011-3b4a-41a2-a064-d7d2282e9763"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.182709 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.182741 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzwvm\" (UniqueName: \"kubernetes.io/projected/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5-kube-api-access-fzwvm\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.197304 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02f152ce-e53d-4745-974c-bbc06a4f5bd6" path="/var/lib/kubelet/pods/02f152ce-e53d-4745-974c-bbc06a4f5bd6/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.197892 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0837d9e7-fb96-43b0-a5fb-5eca08e411a3" path="/var/lib/kubelet/pods/0837d9e7-fb96-43b0-a5fb-5eca08e411a3/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.198589 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097b8c3a-4002-4270-9f9b-e56f1507f748" path="/var/lib/kubelet/pods/097b8c3a-4002-4270-9f9b-e56f1507f748/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.199087 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13204bd0-d001-41b6-b3e3-606d0886ee77" path="/var/lib/kubelet/pods/13204bd0-d001-41b6-b3e3-606d0886ee77/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.199930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-kube-api-access-qzpjn" (OuterVolumeSpecName: "kube-api-access-qzpjn") pod "ebb0298d-8a63-4ea2-8473-e5e86a134fd8" (UID: "ebb0298d-8a63-4ea2-8473-e5e86a134fd8"). InnerVolumeSpecName "kube-api-access-qzpjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.200170 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c800162-c9be-4c26-848e-548c5ade1645" path="/var/lib/kubelet/pods/1c800162-c9be-4c26-848e-548c5ade1645/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.200645 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36879807-6691-412e-b126-9bb6813b7e65" path="/var/lib/kubelet/pods/36879807-6691-412e-b126-9bb6813b7e65/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.200966 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e629cb-1223-4f95-a117-0d72b4a06bce" path="/var/lib/kubelet/pods/43e629cb-1223-4f95-a117-0d72b4a06bce/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.201313 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e9f148-78c8-4e0e-adf4-ae14e936e166" path="/var/lib/kubelet/pods/45e9f148-78c8-4e0e-adf4-ae14e936e166/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.202212 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62555ab6-9268-4248-977e-98269dc9483d" path="/var/lib/kubelet/pods/62555ab6-9268-4248-977e-98269dc9483d/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.202748 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de81e07-b3f2-4f89-ad02-c5f3b3ea355d" path="/var/lib/kubelet/pods/6de81e07-b3f2-4f89-ad02-c5f3b3ea355d/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.203671 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79747506-45fc-47c5-b7ee-38e40227b1bb" path="/var/lib/kubelet/pods/79747506-45fc-47c5-b7ee-38e40227b1bb/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.204749 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93950f85-ecce-471f-bf1e-ecd888e9bc4a" path="/var/lib/kubelet/pods/93950f85-ecce-471f-bf1e-ecd888e9bc4a/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.205196 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad0ccf6-1685-4165-9980-b2a65109a1a0" path="/var/lib/kubelet/pods/bad0ccf6-1685-4165-9980-b2a65109a1a0/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.212069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259fc011-3b4a-41a2-a064-d7d2282e9763-kube-api-access-lhqbn" (OuterVolumeSpecName: "kube-api-access-lhqbn") pod "259fc011-3b4a-41a2-a064-d7d2282e9763" (UID: "259fc011-3b4a-41a2-a064-d7d2282e9763"). InnerVolumeSpecName "kube-api-access-lhqbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.212255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-kube-api-access-qc7th" (OuterVolumeSpecName: "kube-api-access-qc7th") pod "a185d0f4-2937-4afb-ad5c-8ebc5e87a241" (UID: "a185d0f4-2937-4afb-ad5c-8ebc5e87a241"). InnerVolumeSpecName "kube-api-access-qc7th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.212731 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14f4f47-2fe8-47fd-892f-832266276730" path="/var/lib/kubelet/pods/c14f4f47-2fe8-47fd-892f-832266276730/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.214355 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6c2927-0704-455f-be1a-1c421df8dfc6" path="/var/lib/kubelet/pods/df6c2927-0704-455f-be1a-1c421df8dfc6/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.216155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208f36cc-d67f-450a-a68f-1ace0702dfef-kube-api-access-2j8hq" (OuterVolumeSpecName: "kube-api-access-2j8hq") pod "208f36cc-d67f-450a-a68f-1ace0702dfef" (UID: "208f36cc-d67f-450a-a68f-1ace0702dfef"). InnerVolumeSpecName "kube-api-access-2j8hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.216849 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5868034-d516-46e9-be86-98f34cea1875" path="/var/lib/kubelet/pods/f5868034-d516-46e9-be86-98f34cea1875/volumes" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.235543 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.246245 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.73:8776/healthcheck\": read tcp 10.217.0.2:56930->10.217.1.73:8776: read: connection reset by peer" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.255488 4707 scope.go:117] "RemoveContainer" containerID="a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.257159 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea\": container with ID starting with a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea not found: ID does not exist" containerID="a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.257204 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea"} err="failed to get container status \"a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea\": rpc error: code = NotFound desc = could not find container \"a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea\": container with ID starting with a25c933c7d3b365c7361e89fd44cef046169d07964c6b36e6230e53951f1afea not found: ID does not exist" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.283773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gch4\" (UniqueName: \"kubernetes.io/projected/df2290ff-a278-4834-be6d-3377f82d1548-kube-api-access-6gch4\") pod \"df2290ff-a278-4834-be6d-3377f82d1548\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.283883 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df2290ff-a278-4834-be6d-3377f82d1548-operator-scripts\") pod \"df2290ff-a278-4834-be6d-3377f82d1548\" (UID: \"df2290ff-a278-4834-be6d-3377f82d1548\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284346 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzpjn\" (UniqueName: \"kubernetes.io/projected/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-kube-api-access-qzpjn\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284359 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc7th\" (UniqueName: \"kubernetes.io/projected/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-kube-api-access-qc7th\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284368 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j8hq\" (UniqueName: \"kubernetes.io/projected/208f36cc-d67f-450a-a68f-1ace0702dfef-kube-api-access-2j8hq\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284378 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259fc011-3b4a-41a2-a064-d7d2282e9763-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284388 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a185d0f4-2937-4afb-ad5c-8ebc5e87a241-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284396 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebb0298d-8a63-4ea2-8473-e5e86a134fd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284404 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhqbn\" (UniqueName: \"kubernetes.io/projected/259fc011-3b4a-41a2-a064-d7d2282e9763-kube-api-access-lhqbn\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284412 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208f36cc-d67f-450a-a68f-1ace0702dfef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.284337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2290ff-a278-4834-be6d-3377f82d1548-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df2290ff-a278-4834-be6d-3377f82d1548" (UID: "df2290ff-a278-4834-be6d-3377f82d1548"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.286982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2290ff-a278-4834-be6d-3377f82d1548-kube-api-access-6gch4" (OuterVolumeSpecName: "kube-api-access-6gch4") pod "df2290ff-a278-4834-be6d-3377f82d1548" (UID: "df2290ff-a278-4834-be6d-3377f82d1548"). InnerVolumeSpecName "kube-api-access-6gch4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.385028 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4aab59-e488-40d8-883e-0e09a44aae4b-operator-scripts\") pod \"be4aab59-e488-40d8-883e-0e09a44aae4b\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.385199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcrfq\" (UniqueName: \"kubernetes.io/projected/be4aab59-e488-40d8-883e-0e09a44aae4b-kube-api-access-mcrfq\") pod \"be4aab59-e488-40d8-883e-0e09a44aae4b\" (UID: \"be4aab59-e488-40d8-883e-0e09a44aae4b\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.385449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4aab59-e488-40d8-883e-0e09a44aae4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be4aab59-e488-40d8-883e-0e09a44aae4b" (UID: "be4aab59-e488-40d8-883e-0e09a44aae4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.385786 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4aab59-e488-40d8-883e-0e09a44aae4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.385800 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gch4\" (UniqueName: \"kubernetes.io/projected/df2290ff-a278-4834-be6d-3377f82d1548-kube-api-access-6gch4\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.385831 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df2290ff-a278-4834-be6d-3377f82d1548-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.389967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4aab59-e488-40d8-883e-0e09a44aae4b-kube-api-access-mcrfq" (OuterVolumeSpecName: "kube-api-access-mcrfq") pod "be4aab59-e488-40d8-883e-0e09a44aae4b" (UID: "be4aab59-e488-40d8-883e-0e09a44aae4b"). InnerVolumeSpecName "kube-api-access-mcrfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.428387 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.428749 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.488013 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcrfq\" (UniqueName: \"kubernetes.io/projected/be4aab59-e488-40d8-883e-0e09a44aae4b-kube-api-access-mcrfq\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.493783 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.77:8778/\": read tcp 10.217.0.2:60696->10.217.1.77:8778: read: connection reset by peer" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.494152 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.77:8778/\": read tcp 10.217.0.2:60698->10.217.1.77:8778: read: connection reset by peer" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.502077 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.545115 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kolla-config\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data-custom\") pod \"6ba6473c-6566-4d18-88ff-e329a248c151\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26gjt\" (UniqueName: \"kubernetes.io/projected/8e72ad47-d005-44ce-a494-5ed25884c762-kube-api-access-26gjt\") pod \"8e72ad47-d005-44ce-a494-5ed25884c762\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e72ad47-d005-44ce-a494-5ed25884c762-logs\") pod \"8e72ad47-d005-44ce-a494-5ed25884c762\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data\") pod 
\"8e72ad47-d005-44ce-a494-5ed25884c762\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data-custom\") pod \"8e72ad47-d005-44ce-a494-5ed25884c762\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-operator-scripts\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-combined-ca-bundle\") pod \"8e72ad47-d005-44ce-a494-5ed25884c762\" (UID: \"8e72ad47-d005-44ce-a494-5ed25884c762\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-combined-ca-bundle\") pod \"6ba6473c-6566-4d18-88ff-e329a248c151\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-generated\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9kff\" (UniqueName: \"kubernetes.io/projected/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kube-api-access-s9kff\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-default\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-combined-ca-bundle\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data\") pod \"6ba6473c-6566-4d18-88ff-e329a248c151\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x6k5\" (UniqueName: 
\"kubernetes.io/projected/6ba6473c-6566-4d18-88ff-e329a248c151-kube-api-access-5x6k5\") pod \"6ba6473c-6566-4d18-88ff-e329a248c151\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba6473c-6566-4d18-88ff-e329a248c151-logs\") pod \"6ba6473c-6566-4d18-88ff-e329a248c151\" (UID: \"6ba6473c-6566-4d18-88ff-e329a248c151\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.589751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-galera-tls-certs\") pod \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\" (UID: \"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.612067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.616180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e72ad47-d005-44ce-a494-5ed25884c762-logs" (OuterVolumeSpecName: "logs") pod "8e72ad47-d005-44ce-a494-5ed25884c762" (UID: "8e72ad47-d005-44ce-a494-5ed25884c762"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.624508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba6473c-6566-4d18-88ff-e329a248c151-logs" (OuterVolumeSpecName: "logs") pod "6ba6473c-6566-4d18-88ff-e329a248c151" (UID: "6ba6473c-6566-4d18-88ff-e329a248c151"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.626913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.630061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.632075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba6473c-6566-4d18-88ff-e329a248c151-kube-api-access-5x6k5" (OuterVolumeSpecName: "kube-api-access-5x6k5") pod "6ba6473c-6566-4d18-88ff-e329a248c151" (UID: "6ba6473c-6566-4d18-88ff-e329a248c151"). InnerVolumeSpecName "kube-api-access-5x6k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.635349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.635970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kube-api-access-s9kff" (OuterVolumeSpecName: "kube-api-access-s9kff") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "kube-api-access-s9kff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.643919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8e72ad47-d005-44ce-a494-5ed25884c762" (UID: "8e72ad47-d005-44ce-a494-5ed25884c762"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.645239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ba6473c-6566-4d18-88ff-e329a248c151" (UID: "6ba6473c-6566-4d18-88ff-e329a248c151"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.645242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e72ad47-d005-44ce-a494-5ed25884c762-kube-api-access-26gjt" (OuterVolumeSpecName: "kube-api-access-26gjt") pod "8e72ad47-d005-44ce-a494-5ed25884c762" (UID: "8e72ad47-d005-44ce-a494-5ed25884c762"). InnerVolumeSpecName "kube-api-access-26gjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.654015 4707 generic.go:334] "Generic (PLEG): container finished" podID="20a1576d-856f-45fb-b210-66fddef6adaf" containerID="fb76672b8b108d21f1c4f38e5e76864b3f3b4e2030e02b6e06d10dafbc199a9b" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.654081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20a1576d-856f-45fb-b210-66fddef6adaf","Type":"ContainerDied","Data":"fb76672b8b108d21f1c4f38e5e76864b3f3b4e2030e02b6e06d10dafbc199a9b"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.690764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-scripts\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.701778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-logs\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.701822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-httpd-run\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.701914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-combined-ca-bundle\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.701950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwp9\" (UniqueName: \"kubernetes.io/projected/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-kube-api-access-dqwp9\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.701971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-config-data\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.701992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-internal-tls-certs\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.702014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\" (UID: \"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c\") " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.706876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-logs" (OuterVolumeSpecName: "logs") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708461 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x6k5\" (UniqueName: \"kubernetes.io/projected/6ba6473c-6566-4d18-88ff-e329a248c151-kube-api-access-5x6k5\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708474 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708483 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708492 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba6473c-6566-4d18-88ff-e329a248c151-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708500 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708508 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708516 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26gjt\" (UniqueName: \"kubernetes.io/projected/8e72ad47-d005-44ce-a494-5ed25884c762-kube-api-access-26gjt\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708523 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e72ad47-d005-44ce-a494-5ed25884c762-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708532 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708540 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708547 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708555 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9kff\" (UniqueName: \"kubernetes.io/projected/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-kube-api-access-s9kff\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.708563 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.726947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-scripts" (OuterVolumeSpecName: "scripts") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.741169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e72ad47-d005-44ce-a494-5ed25884c762" (UID: "8e72ad47-d005-44ce-a494-5ed25884c762"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.746867 4707 generic.go:334] "Generic (PLEG): container finished" podID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerID="3f5a0d130fa4b31023beed8d473c235ed286cc7b040259cf77ab6da89f38e456" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.746921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"98f30a9f-5d85-4dce-8bc8-03e26c9967c8","Type":"ContainerDied","Data":"3f5a0d130fa4b31023beed8d473c235ed286cc7b040259cf77ab6da89f38e456"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.751176 4707 generic.go:334] "Generic (PLEG): container finished" podID="68cc87bb-db38-44df-a921-630ecf45a30b" containerID="5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.751294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" event={"ID":"68cc87bb-db38-44df-a921-630ecf45a30b","Type":"ContainerDied","Data":"5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.757829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.757968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba6473c-6566-4d18-88ff-e329a248c151" (UID: "6ba6473c-6566-4d18-88ff-e329a248c151"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.763237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-kube-api-access-dqwp9" (OuterVolumeSpecName: "kube-api-access-dqwp9") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "kube-api-access-dqwp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.767968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.771001 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerID="71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.771910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8","Type":"ContainerDied","Data":"71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.771944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"b4e1e59b-cbb0-4d57-9a35-988ee16e74c8","Type":"ContainerDied","Data":"f738b791d93b86b1f9feb0f63d6e2b3318cac1f956f23c39bd01c4e1622269cb"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.771948 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.771959 4707 scope.go:117] "RemoveContainer" containerID="71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.776589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" event={"ID":"3cc4f223-9c2d-442f-aa10-16fcaee9e8c5","Type":"ContainerDied","Data":"90776de41212a4557600956b90a42e0d19ff5e3a7e4941a65d3f52249f688375"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.776618 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-b2979" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.780649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" event={"ID":"df2290ff-a278-4834-be6d-3377f82d1548","Type":"ContainerDied","Data":"4b6ab9c1938d4ed1052ade94e8b7b556313ba840f307417051ecbbf0e48b912b"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.780734 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.793336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" event={"ID":"a185d0f4-2937-4afb-ad5c-8ebc5e87a241","Type":"ContainerDied","Data":"12149c1e0c89cee3ca41f1f3975300484340de2a1b13f81b49a0dc655183cfd3"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.793421 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.794126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.798400 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e72ad47-d005-44ce-a494-5ed25884c762" containerID="6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.798448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" event={"ID":"8e72ad47-d005-44ce-a494-5ed25884c762","Type":"ContainerDied","Data":"6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.798470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" event={"ID":"8e72ad47-d005-44ce-a494-5ed25884c762","Type":"ContainerDied","Data":"d3f1f3e0627d237a1299b06d79f43c93b264288df51c9b1bd87bdc219ab950e8"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.798517 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.806366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" event={"ID":"ebb0298d-8a63-4ea2-8473-e5e86a134fd8","Type":"ContainerDied","Data":"117b929f04c5aa4bbe38ebabca8be43e1c58f947ea50cd09317531a5fefeeb03"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.806458 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.810661 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.810717 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.810726 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.810735 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.810761 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.811038 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwp9\" (UniqueName: \"kubernetes.io/projected/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-kube-api-access-dqwp9\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.811083 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.813074 4707 generic.go:334] "Generic (PLEG): container finished" podID="6ba6473c-6566-4d18-88ff-e329a248c151" containerID="1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.813155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" event={"ID":"6ba6473c-6566-4d18-88ff-e329a248c151","Type":"ContainerDied","Data":"1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.813174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" event={"ID":"6ba6473c-6566-4d18-88ff-e329a248c151","Type":"ContainerDied","Data":"b058eb973732d484fbf5e6da1e1c8f73f9e45604fe4050958c4c3094c7f12040"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.813222 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.815647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" event={"ID":"208f36cc-d67f-450a-a68f-1ace0702dfef","Type":"ContainerDied","Data":"2d418493944578a14725d5064907da8eca4eebec8488df581d359507fb7c15b3"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.815697 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b6rjt" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.817661 4707 generic.go:334] "Generic (PLEG): container finished" podID="be4aab59-e488-40d8-883e-0e09a44aae4b" containerID="9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df" exitCode=1 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.817694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" event={"ID":"be4aab59-e488-40d8-883e-0e09a44aae4b","Type":"ContainerDied","Data":"9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.817708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" event={"ID":"be4aab59-e488-40d8-883e-0e09a44aae4b","Type":"ContainerDied","Data":"0e0599d2c6956203d378e4735b032ab01af08232ab17eff3e5ce09671e32f9f5"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.817743 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-gwvgj" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.822654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.836728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data" (OuterVolumeSpecName: "config-data") pod "6ba6473c-6566-4d18-88ff-e329a248c151" (UID: "6ba6473c-6566-4d18-88ff-e329a248c151"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.837352 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.837485 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.840551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-config-data" (OuterVolumeSpecName: "config-data") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.841194 4707 generic.go:334] "Generic (PLEG): container finished" podID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerID="4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b" exitCode=0 Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.841245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c","Type":"ContainerDied","Data":"4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.841265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"53a9b1e8-b1f8-4bf4-8d99-06b60581d63c","Type":"ContainerDied","Data":"a9175c7f34373daee7f6c69a72abd543b4168a6c57c2682ae2953e531f68b4f8"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.841368 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.849198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data" (OuterVolumeSpecName: "config-data") pod "8e72ad47-d005-44ce-a494-5ed25884c762" (UID: "8e72ad47-d005-44ce-a494-5ed25884c762"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.851980 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.852603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" event={"ID":"259fc011-3b4a-41a2-a064-d7d2282e9763","Type":"ContainerDied","Data":"91665dcc343908542c8bac545227808b89dbcbce2c0a1951b05752762c5277a2"} Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.852639 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.852663 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.866540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" (UID: "b4e1e59b-cbb0-4d57-9a35-988ee16e74c8"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.869045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" (UID: "53a9b1e8-b1f8-4bf4-8d99-06b60581d63c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.914399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bt8\" (UniqueName: \"kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.914639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9jn\" (UniqueName: \"kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.914959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts\") pod \"keystone-d4e5-account-create-update-2c75p\" (UID: \"6f619bdd-bf9c-40af-90f3-6c3bfe463d74\") " pod="openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts\") pod \"keystone-db-create-fh744\" (UID: \"e4af1c90-a8e5-4608-9ff7-495057250aba\") " pod="openstack-kuttl-tests/keystone-db-create-fh744" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.915117 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.915155 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts podName:e4af1c90-a8e5-4608-9ff7-495057250aba nodeName:}" failed. No retries permitted until 2026-01-21 15:46:41.915143246 +0000 UTC m=+2699.096659457 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts") pod "keystone-db-create-fh744" (UID: "e4af1c90-a8e5-4608-9ff7-495057250aba") : configmap "openstack-scripts" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.915161 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.915224 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts podName:6f619bdd-bf9c-40af-90f3-6c3bfe463d74 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:41.91520897 +0000 UTC m=+2699.096725191 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts") pod "keystone-d4e5-account-create-update-2c75p" (UID: "6f619bdd-bf9c-40af-90f3-6c3bfe463d74") : configmap "openstack-scripts" not found Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915331 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915349 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915358 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915366 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915374 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e72ad47-d005-44ce-a494-5ed25884c762-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915382 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915389 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.915397 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba6473c-6566-4d18-88ff-e329a248c151-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.919736 4707 projected.go:194] Error preparing data for projected volume kube-api-access-l6bt8 for pod openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.919736 4707 projected.go:194] Error preparing data for projected volume kube-api-access-pv9jn for pod openstack-kuttl-tests/keystone-db-create-fh744: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.919784 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8 podName:6f619bdd-bf9c-40af-90f3-6c3bfe463d74 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:41.919769814 +0000 UTC m=+2699.101286037 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l6bt8" (UniqueName: "kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8") pod "keystone-d4e5-account-create-update-2c75p" (UID: "6f619bdd-bf9c-40af-90f3-6c3bfe463d74") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.919798 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn podName:e4af1c90-a8e5-4608-9ff7-495057250aba nodeName:}" failed. No retries permitted until 2026-01-21 15:46:41.919792387 +0000 UTC m=+2699.101308609 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pv9jn" (UniqueName: "kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn") pod "keystone-db-create-fh744" (UID: "e4af1c90-a8e5-4608-9ff7-495057250aba") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.939592 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.944670 4707 scope.go:117] "RemoveContainer" containerID="5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.966256 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.973331 4707 scope.go:117] "RemoveContainer" containerID="71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.973680 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0\": container with ID starting with 71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0 not found: ID does not exist" containerID="71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.973704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0"} err="failed to get container status \"71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0\": rpc error: code = NotFound desc = could not find container \"71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0\": container with ID starting with 71625318102aa25d6f92b1b5b7041c1e19bf3fb5dc568ca6da68acdd0a508ad0 not found: ID does not exist" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.973719 4707 scope.go:117] "RemoveContainer" containerID="5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6" Jan 21 15:46:39 crc kubenswrapper[4707]: E0121 15:46:39.974030 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6\": container with ID starting with 5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6 not found: ID does not exist" containerID="5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.974064 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6"} err="failed to get container status \"5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6\": rpc error: code = NotFound desc = could not find container \"5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6\": container with ID starting with 5dd0e8ac8172662850dd1ee2229671effe346f7c0b0402aac90b2c79013714d6 not found: ID does not exist" Jan 21 15:46:39 crc kubenswrapper[4707]: I0121 15:46:39.974084 4707 scope.go:117] "RemoveContainer" containerID="b662cf8f751d4133a78fceed594f185c60eda5f174396f0f49dabb10a96c0cd9" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.021067 4707 scope.go:117] "RemoveContainer" containerID="6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.030538 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.049111 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-e6a9-account-create-update-625q6"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.049868 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.054233 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-b2979"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.068990 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-b2979"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.072428 4707 scope.go:117] "RemoveContainer" containerID="2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.084921 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.093285 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-d92f-account-create-update-pw6mk"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.093706 4707 scope.go:117] "RemoveContainer" containerID="6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.094033 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa\": container with ID starting with 6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa not found: ID does not exist" containerID="6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.094060 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa"} err="failed to get container status \"6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa\": rpc error: code = NotFound desc = could not find container \"6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa\": container with ID starting with 
6fe4fa95d3a2bfed6b005b20be67fc929f46cc3113193272ced6b212f2970faa not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.094076 4707 scope.go:117] "RemoveContainer" containerID="2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.094265 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e\": container with ID starting with 2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e not found: ID does not exist" containerID="2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.094298 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e"} err="failed to get container status \"2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e\": rpc error: code = NotFound desc = could not find container \"2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e\": container with ID starting with 2c78376eb66ad29f629982d0c2b87b0336de77be46bd95808b61a700e7845a9e not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.094311 4707 scope.go:117] "RemoveContainer" containerID="1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.118388 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-logs\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-combined-ca-bundle\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-combined-ca-bundle\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123927 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-public-tls-certs\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123958 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-httpd-run\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.123984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-config-data\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxhvl\" (UniqueName: \"kubernetes.io/projected/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-kube-api-access-sxhvl\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-scripts\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djn2s\" (UniqueName: \"kubernetes.io/projected/20a1576d-856f-45fb-b210-66fddef6adaf-kube-api-access-djn2s\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-scripts\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-etc-machine-id\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-internal-tls-certs\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-public-tls-certs\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-logs\") pod \"20a1576d-856f-45fb-b210-66fddef6adaf\" (UID: \"20a1576d-856f-45fb-b210-66fddef6adaf\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.124326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data-custom\") pod \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\" (UID: \"98f30a9f-5d85-4dce-8bc8-03e26c9967c8\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.130924 4707 scope.go:117] "RemoveContainer" containerID="ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.132649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-logs" (OuterVolumeSpecName: "logs") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.139227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.139496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.142000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-logs" (OuterVolumeSpecName: "logs") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.142055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.151676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-scripts" (OuterVolumeSpecName: "scripts") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.154411 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-b79e-account-create-update-dv8pj"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.159768 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.68:9311/healthcheck\": read tcp 10.217.0.2:33356->10.217.1.68:9311: read: connection reset by peer" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.159673 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.68:9311/healthcheck\": read tcp 10.217.0.2:33366->10.217.1.68:9311: read: connection reset by peer" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.163368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-kube-api-access-sxhvl" (OuterVolumeSpecName: "kube-api-access-sxhvl") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "kube-api-access-sxhvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.169400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a1576d-856f-45fb-b210-66fddef6adaf-kube-api-access-djn2s" (OuterVolumeSpecName: "kube-api-access-djn2s") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "kube-api-access-djn2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.173870 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.174993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-scripts" (OuterVolumeSpecName: "scripts") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.176949 4707 scope.go:117] "RemoveContainer" containerID="1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.178364 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020\": container with ID starting with 1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020 not found: ID does not exist" containerID="1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.178456 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020"} err="failed to get container status \"1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020\": rpc error: code = NotFound desc = could not find container \"1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020\": container with ID starting with 1965d7d5e8094b1fb007358e276e27a2d81ae86098a8e8c53b418e1245007020 not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.178540 4707 scope.go:117] "RemoveContainer" containerID="ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.179063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55\": container with ID starting with ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55 not found: ID does not exist" containerID="ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.179140 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55"} err="failed to get container status \"ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55\": rpc error: code = NotFound desc = could not find container \"ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55\": container with ID starting with ca30343f7cc29880327261b719e07e7f428416194c8d128e7b60e0d60ca0fe55 not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.179203 4707 scope.go:117] "RemoveContainer" containerID="9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.189491 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gwvgj"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.201146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data" (OuterVolumeSpecName: "config-data") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.202525 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-gwvgj"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.204279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.205010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.223453 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-fh744"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.227726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-combined-ca-bundle\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.227934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37486029-cbe4-4198-afb9-962c7d0728e5-logs\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.227975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgkmx\" (UniqueName: \"kubernetes.io/projected/37486029-cbe4-4198-afb9-962c7d0728e5-kube-api-access-tgkmx\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-internal-tls-certs\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-config-data\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-public-tls-certs\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-scripts\") pod \"37486029-cbe4-4198-afb9-962c7d0728e5\" (UID: \"37486029-cbe4-4198-afb9-962c7d0728e5\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37486029-cbe4-4198-afb9-962c7d0728e5-logs" (OuterVolumeSpecName: "logs") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228828 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228848 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228859 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228867 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228875 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228883 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228891 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a1576d-856f-45fb-b210-66fddef6adaf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228899 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxhvl\" (UniqueName: \"kubernetes.io/projected/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-kube-api-access-sxhvl\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228907 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228926 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228934 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37486029-cbe4-4198-afb9-962c7d0728e5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228944 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djn2s\" (UniqueName: \"kubernetes.io/projected/20a1576d-856f-45fb-b210-66fddef6adaf-kube-api-access-djn2s\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228952 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.228960 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.241607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37486029-cbe4-4198-afb9-962c7d0728e5-kube-api-access-tgkmx" (OuterVolumeSpecName: "kube-api-access-tgkmx") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "kube-api-access-tgkmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.246133 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-fh744"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.246914 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.256634 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.259344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.260433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-config-data" (OuterVolumeSpecName: "config-data") pod "20a1576d-856f-45fb-b210-66fddef6adaf" (UID: "20a1576d-856f-45fb-b210-66fddef6adaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.265392 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.269459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "98f30a9f-5d85-4dce-8bc8-03e26c9967c8" (UID: "98f30a9f-5d85-4dce-8bc8-03e26c9967c8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.269999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-scripts" (OuterVolumeSpecName: "scripts") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.271232 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-d4e5-account-create-update-2c75p"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.273994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.282055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-config-data" (OuterVolumeSpecName: "config-data") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.283068 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.287897 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6d40-account-create-update-nsm5s"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.296679 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b6rjt"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.301352 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b6rjt"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.306045 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.310650 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.312613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.315127 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.319609 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5b9c7d766c-wml82"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.320313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37486029-cbe4-4198-afb9-962c7d0728e5" (UID: "37486029-cbe4-4198-afb9-962c7d0728e5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.325109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.329664 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b7c5477b-n8gj6"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330749 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330764 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgkmx\" (UniqueName: \"kubernetes.io/projected/37486029-cbe4-4198-afb9-962c7d0728e5-kube-api-access-tgkmx\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330773 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330781 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330791 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330799 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330820 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330829 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330836 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/98f30a9f-5d85-4dce-8bc8-03e26c9967c8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330844 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37486029-cbe4-4198-afb9-962c7d0728e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.330853 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a1576d-856f-45fb-b210-66fddef6adaf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.334925 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.339707 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.346471 4707 scope.go:117] "RemoveContainer" containerID="9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.346843 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df\": container with ID starting with 9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df not found: ID does not exist" containerID="9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.346869 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df"} err="failed to get container status \"9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df\": rpc error: code = NotFound desc = could not find container \"9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df\": container with ID starting with 9bc1e643906e945a41eb173fe6c5a775450a0e9342d5d99a76d7d0b70f6f02df not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.346888 4707 scope.go:117] "RemoveContainer" containerID="4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.350589 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_67eda4bf-2ebc-4616-bd19-55afc23963d2/ovn-northd/0.log" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.350635 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.367537 4707 scope.go:117] "RemoveContainer" containerID="7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.386780 4707 scope.go:117] "RemoveContainer" containerID="4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.387595 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b\": container with ID starting with 4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b not found: ID does not exist" containerID="4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.387639 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b"} err="failed to get container status \"4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b\": rpc error: code = NotFound desc = could not find container \"4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b\": container with ID starting with 4529467ce4e004da4ac8a0f97b65266deebd612ac0905ebb817b0f9ea189415b not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.387661 4707 scope.go:117] "RemoveContainer" containerID="7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae" Jan 21 15:46:40 crc kubenswrapper[4707]: E0121 15:46:40.389022 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae\": container with ID starting with 7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae not found: ID does not exist" containerID="7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.389079 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae"} err="failed to get container status \"7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae\": rpc error: code = NotFound desc = could not find container \"7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae\": container with ID starting with 7cdae05a1c8d6c4fa0140969c4059845e7e5a8e14153d7bf4cf91029b3b088ae not found: ID does not exist" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.432647 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.432800 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6bt8\" (UniqueName: \"kubernetes.io/projected/6f619bdd-bf9c-40af-90f3-6c3bfe463d74-kube-api-access-l6bt8\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.432840 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4af1c90-a8e5-4608-9ff7-495057250aba-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 
15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.432850 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv9jn\" (UniqueName: \"kubernetes.io/projected/e4af1c90-a8e5-4608-9ff7-495057250aba-kube-api-access-pv9jn\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.533673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mszz\" (UniqueName: \"kubernetes.io/projected/67eda4bf-2ebc-4616-bd19-55afc23963d2-kube-api-access-6mszz\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.533735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-metrics-certs-tls-certs\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.533760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-scripts\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.533789 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-combined-ca-bundle\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.533847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-config\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.533870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-northd-tls-certs\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.534535 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-scripts" (OuterVolumeSpecName: "scripts") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.535088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-config" (OuterVolumeSpecName: "config") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.535142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-rundir\") pod \"67eda4bf-2ebc-4616-bd19-55afc23963d2\" (UID: \"67eda4bf-2ebc-4616-bd19-55afc23963d2\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.535586 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.535603 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eda4bf-2ebc-4616-bd19-55afc23963d2-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.535926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.537085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67eda4bf-2ebc-4616-bd19-55afc23963d2-kube-api-access-6mszz" (OuterVolumeSpecName: "kube-api-access-6mszz") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "kube-api-access-6mszz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.556137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.565593 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.589304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.592119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "67eda4bf-2ebc-4616-bd19-55afc23963d2" (UID: "67eda4bf-2ebc-4616-bd19-55afc23963d2"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.637465 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.637489 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.637498 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67eda4bf-2ebc-4616-bd19-55afc23963d2-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.637507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mszz\" (UniqueName: \"kubernetes.io/projected/67eda4bf-2ebc-4616-bd19-55afc23963d2-kube-api-access-6mszz\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.637514 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67eda4bf-2ebc-4616-bd19-55afc23963d2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.738561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-internal-tls-certs\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.738635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.738670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-combined-ca-bundle\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.738697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7506b316-cf85-48fa-a833-fafc5e2cb8ce-logs\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.738730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data-custom\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.738768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-public-tls-certs\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 
crc kubenswrapper[4707]: I0121 15:46:40.738845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtcht\" (UniqueName: \"kubernetes.io/projected/7506b316-cf85-48fa-a833-fafc5e2cb8ce-kube-api-access-mtcht\") pod \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\" (UID: \"7506b316-cf85-48fa-a833-fafc5e2cb8ce\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.739329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7506b316-cf85-48fa-a833-fafc5e2cb8ce-logs" (OuterVolumeSpecName: "logs") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.743098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.743105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7506b316-cf85-48fa-a833-fafc5e2cb8ce-kube-api-access-mtcht" (OuterVolumeSpecName: "kube-api-access-mtcht") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "kube-api-access-mtcht". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.754950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.766323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.769046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data" (OuterVolumeSpecName: "config-data") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.773993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7506b316-cf85-48fa-a833-fafc5e2cb8ce" (UID: "7506b316-cf85-48fa-a833-fafc5e2cb8ce"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.829687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.835738 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841004 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841126 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtcht\" (UniqueName: \"kubernetes.io/projected/7506b316-cf85-48fa-a833-fafc5e2cb8ce-kube-api-access-mtcht\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841184 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841234 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841295 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841351 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7506b316-cf85-48fa-a833-fafc5e2cb8ce-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.841399 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7506b316-cf85-48fa-a833-fafc5e2cb8ce-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.871299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20a1576d-856f-45fb-b210-66fddef6adaf","Type":"ContainerDied","Data":"5695272d4968e4a17ccd13580218505fa359b47e03a18fb5753386526886aad7"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.871328 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.871348 4707 scope.go:117] "RemoveContainer" containerID="fb76672b8b108d21f1c4f38e5e76864b3f3b4e2030e02b6e06d10dafbc199a9b" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.877002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"98f30a9f-5d85-4dce-8bc8-03e26c9967c8","Type":"ContainerDied","Data":"5bfe0a98c728382335c1754fa37281a046cad5fd2d125f214ba559ecbab12a29"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.877165 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.882182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" event={"ID":"68cc87bb-db38-44df-a921-630ecf45a30b","Type":"ContainerStarted","Data":"9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.882248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.886630 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_67eda4bf-2ebc-4616-bd19-55afc23963d2/ovn-northd/0.log" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.886667 4707 generic.go:334] "Generic (PLEG): container finished" podID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" exitCode=139 Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.886711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"67eda4bf-2ebc-4616-bd19-55afc23963d2","Type":"ContainerDied","Data":"907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.886719 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.886730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"67eda4bf-2ebc-4616-bd19-55afc23963d2","Type":"ContainerDied","Data":"7e0b984353eafbee1e07343a922efd0a4c840c906f3cf126666789d66e034b21"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.888875 4707 generic.go:334] "Generic (PLEG): container finished" podID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerID="a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915" exitCode=0 Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.888919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"6f277223-7e29-49e4-a1e5-f02c4d883ba7","Type":"ContainerDied","Data":"a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.888933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"6f277223-7e29-49e4-a1e5-f02c4d883ba7","Type":"ContainerDied","Data":"ad2e37f4585e6a550e6367b0cf18458e80f18c8aa807d3c719f5ef541788e6bc"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.888970 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.897438 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" podStartSLOduration=4.897425484 podStartE2EDuration="4.897425484s" podCreationTimestamp="2026-01-21 15:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:40.892214537 +0000 UTC m=+2698.073730758" watchObservedRunningTime="2026-01-21 15:46:40.897425484 +0000 UTC m=+2698.078941706" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.902092 4707 generic.go:334] "Generic (PLEG): container finished" podID="a16e7975-1407-4544-8c46-b0445ac9c311" containerID="165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd" exitCode=0 Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.902134 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.902241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a16e7975-1407-4544-8c46-b0445ac9c311","Type":"ContainerDied","Data":"165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.902332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a16e7975-1407-4544-8c46-b0445ac9c311","Type":"ContainerDied","Data":"5869066c2446507cc781b11d5cd9c0700133b0aab3b4ae707b2752d2e09540e0"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.905744 4707 scope.go:117] "RemoveContainer" containerID="5ca8cf0f460517bb6eb2f9678b1116274be8a382c22293f7320425480d1c7e94" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.906889 4707 generic.go:334] "Generic (PLEG): container finished" podID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerID="d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707" exitCode=0 Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.906933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" event={"ID":"7506b316-cf85-48fa-a833-fafc5e2cb8ce","Type":"ContainerDied","Data":"d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.906950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" event={"ID":"7506b316-cf85-48fa-a833-fafc5e2cb8ce","Type":"ContainerDied","Data":"5ed7e978b0758976ed6e29f96df267bc97775ef10e47fddedf13c203c2f46b13"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.906961 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.906979 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb66854d6-grx69" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.911000 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.920003 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.920523 4707 generic.go:334] "Generic (PLEG): container finished" podID="37486029-cbe4-4198-afb9-962c7d0728e5" containerID="a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a" exitCode=0 Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.920603 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.920636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" event={"ID":"37486029-cbe4-4198-afb9-962c7d0728e5","Type":"ContainerDied","Data":"a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.920657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6fc86fc9f6-cd66l" event={"ID":"37486029-cbe4-4198-afb9-962c7d0728e5","Type":"ContainerDied","Data":"b079661c01037998a40176799988d931e7d77014dbee9b07236796655f600cf9"} Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.925155 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.931827 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.934753 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-combined-ca-bundle\") pod \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlcc9\" (UniqueName: \"kubernetes.io/projected/6f277223-7e29-49e4-a1e5-f02c4d883ba7-kube-api-access-nlcc9\") pod \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-config-data\") pod \"a16e7975-1407-4544-8c46-b0445ac9c311\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f277223-7e29-49e4-a1e5-f02c4d883ba7-logs\") pod \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-combined-ca-bundle\") pod \"a16e7975-1407-4544-8c46-b0445ac9c311\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-internal-tls-certs\") pod \"a16e7975-1407-4544-8c46-b0445ac9c311\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-nova-metadata-tls-certs\") pod \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-public-tls-certs\") pod \"a16e7975-1407-4544-8c46-b0445ac9c311\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7svwh\" (UniqueName: \"kubernetes.io/projected/a16e7975-1407-4544-8c46-b0445ac9c311-kube-api-access-7svwh\") pod \"a16e7975-1407-4544-8c46-b0445ac9c311\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a16e7975-1407-4544-8c46-b0445ac9c311-logs\") pod \"a16e7975-1407-4544-8c46-b0445ac9c311\" (UID: \"a16e7975-1407-4544-8c46-b0445ac9c311\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-config-data\") pod \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\" (UID: \"6f277223-7e29-49e4-a1e5-f02c4d883ba7\") " Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.942984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f277223-7e29-49e4-a1e5-f02c4d883ba7-logs" (OuterVolumeSpecName: "logs") pod "6f277223-7e29-49e4-a1e5-f02c4d883ba7" (UID: "6f277223-7e29-49e4-a1e5-f02c4d883ba7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.943614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a16e7975-1407-4544-8c46-b0445ac9c311-logs" (OuterVolumeSpecName: "logs") pod "a16e7975-1407-4544-8c46-b0445ac9c311" (UID: "a16e7975-1407-4544-8c46-b0445ac9c311"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.948685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16e7975-1407-4544-8c46-b0445ac9c311-kube-api-access-7svwh" (OuterVolumeSpecName: "kube-api-access-7svwh") pod "a16e7975-1407-4544-8c46-b0445ac9c311" (UID: "a16e7975-1407-4544-8c46-b0445ac9c311"). 
InnerVolumeSpecName "kube-api-access-7svwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.949324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f277223-7e29-49e4-a1e5-f02c4d883ba7-kube-api-access-nlcc9" (OuterVolumeSpecName: "kube-api-access-nlcc9") pod "6f277223-7e29-49e4-a1e5-f02c4d883ba7" (UID: "6f277223-7e29-49e4-a1e5-f02c4d883ba7"). InnerVolumeSpecName "kube-api-access-nlcc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.952104 4707 scope.go:117] "RemoveContainer" containerID="3f5a0d130fa4b31023beed8d473c235ed286cc7b040259cf77ab6da89f38e456" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.954302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb66854d6-grx69"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.958730 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb66854d6-grx69"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.962868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-config-data" (OuterVolumeSpecName: "config-data") pod "a16e7975-1407-4544-8c46-b0445ac9c311" (UID: "a16e7975-1407-4544-8c46-b0445ac9c311"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.967776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f277223-7e29-49e4-a1e5-f02c4d883ba7" (UID: "6f277223-7e29-49e4-a1e5-f02c4d883ba7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.978139 4707 scope.go:117] "RemoveContainer" containerID="3f05e66f94e563650c321336e37a8530429e8dfd49776a8178131469087deaaf" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.979374 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6fc86fc9f6-cd66l"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.983493 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6fc86fc9f6-cd66l"] Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.985224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-config-data" (OuterVolumeSpecName: "config-data") pod "6f277223-7e29-49e4-a1e5-f02c4d883ba7" (UID: "6f277223-7e29-49e4-a1e5-f02c4d883ba7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.988553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6f277223-7e29-49e4-a1e5-f02c4d883ba7" (UID: "6f277223-7e29-49e4-a1e5-f02c4d883ba7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.993477 4707 scope.go:117] "RemoveContainer" containerID="2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.997217 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a16e7975-1407-4544-8c46-b0445ac9c311" (UID: "a16e7975-1407-4544-8c46-b0445ac9c311"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.998127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a16e7975-1407-4544-8c46-b0445ac9c311" (UID: "a16e7975-1407-4544-8c46-b0445ac9c311"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:40 crc kubenswrapper[4707]: I0121 15:46:40.998246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a16e7975-1407-4544-8c46-b0445ac9c311" (UID: "a16e7975-1407-4544-8c46-b0445ac9c311"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.003512 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.65:5671: connect: connection refused" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.008825 4707 scope.go:117] "RemoveContainer" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.021563 4707 scope.go:117] "RemoveContainer" containerID="2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.021786 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1\": container with ID starting with 2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1 not found: ID does not exist" containerID="2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.021824 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1"} err="failed to get container status \"2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1\": rpc error: code = NotFound desc = could not find container \"2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1\": container with ID starting with 2e4dcee8a728ef9c88a26e0d2535b00a56f62ca9be53f738a7abcc0b3311aba1 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.021841 4707 scope.go:117] "RemoveContainer" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 
15:46:41.022100 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c\": container with ID starting with 907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c not found: ID does not exist" containerID="907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.022127 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c"} err="failed to get container status \"907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c\": rpc error: code = NotFound desc = could not find container \"907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c\": container with ID starting with 907d0064166b6ffbfbdecb55391a914c726a05b9d1f2acedbbe1c62b1633783c not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.022143 4707 scope.go:117] "RemoveContainer" containerID="a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.037456 4707 scope.go:117] "RemoveContainer" containerID="7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044718 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044738 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlcc9\" (UniqueName: \"kubernetes.io/projected/6f277223-7e29-49e4-a1e5-f02c4d883ba7-kube-api-access-nlcc9\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044748 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044756 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f277223-7e29-49e4-a1e5-f02c4d883ba7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044766 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044773 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044781 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044788 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a16e7975-1407-4544-8c46-b0445ac9c311-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc 
kubenswrapper[4707]: I0121 15:46:41.044798 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7svwh\" (UniqueName: \"kubernetes.io/projected/a16e7975-1407-4544-8c46-b0445ac9c311-kube-api-access-7svwh\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044818 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a16e7975-1407-4544-8c46-b0445ac9c311-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.044826 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f277223-7e29-49e4-a1e5-f02c4d883ba7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.050982 4707 scope.go:117] "RemoveContainer" containerID="a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.051250 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915\": container with ID starting with a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915 not found: ID does not exist" containerID="a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.051307 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915"} err="failed to get container status \"a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915\": rpc error: code = NotFound desc = could not find container \"a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915\": container with ID starting with a7a4370733f442346fbab723c3ea485453f67f94f3c9f5f503c023c4491ae915 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.051324 4707 scope.go:117] "RemoveContainer" containerID="7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.051568 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686\": container with ID starting with 7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686 not found: ID does not exist" containerID="7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.051613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686"} err="failed to get container status \"7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686\": rpc error: code = NotFound desc = could not find container \"7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686\": container with ID starting with 7fd40ab89f5a626bd03171506be141649c72fe253fce1772eeb06e25d33c9686 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.051630 4707 scope.go:117] "RemoveContainer" containerID="165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.067262 4707 scope.go:117] "RemoveContainer" 
containerID="2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.080848 4707 scope.go:117] "RemoveContainer" containerID="165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.081103 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd\": container with ID starting with 165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd not found: ID does not exist" containerID="165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.081129 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd"} err="failed to get container status \"165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd\": rpc error: code = NotFound desc = could not find container \"165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd\": container with ID starting with 165959aff5229d8fcf9fc4ccf0229524f0f8c2f45d3e3ca4c04c4fb182b0e1fd not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.081143 4707 scope.go:117] "RemoveContainer" containerID="2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.081395 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633\": container with ID starting with 2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633 not found: ID does not exist" containerID="2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.081416 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633"} err="failed to get container status \"2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633\": rpc error: code = NotFound desc = could not find container \"2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633\": container with ID starting with 2dd150da1dec937f4fc89ce1b200fb19826df0e15900614f74c15617c9978633 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.081427 4707 scope.go:117] "RemoveContainer" containerID="d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.094856 4707 scope.go:117] "RemoveContainer" containerID="e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.106421 4707 scope.go:117] "RemoveContainer" containerID="d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.106693 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707\": container with ID starting with d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707 not found: ID does not exist" containerID="d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707" 
Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.106736 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707"} err="failed to get container status \"d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707\": rpc error: code = NotFound desc = could not find container \"d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707\": container with ID starting with d94bbc9258b21583cf6ff848ec453ba902867d6dea37570cf2386736d5172707 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.106751 4707 scope.go:117] "RemoveContainer" containerID="e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.106992 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34\": container with ID starting with e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34 not found: ID does not exist" containerID="e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.107012 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34"} err="failed to get container status \"e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34\": rpc error: code = NotFound desc = could not find container \"e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34\": container with ID starting with e5f4f3d1042a0f232055325dd5fef568d321d5d0d917d9a8cbb0130bf95f6f34 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.107026 4707 scope.go:117] "RemoveContainer" containerID="a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.120565 4707 scope.go:117] "RemoveContainer" containerID="6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.134949 4707 scope.go:117] "RemoveContainer" containerID="a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.135175 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a\": container with ID starting with a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a not found: ID does not exist" containerID="a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.135200 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a"} err="failed to get container status \"a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a\": rpc error: code = NotFound desc = could not find container \"a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a\": container with ID starting with a626a92669cfb19452a2c56cba27907298bdf149436a890ff8657458c5aa670a not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.135215 4707 scope.go:117] "RemoveContainer" 
containerID="6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.135510 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730\": container with ID starting with 6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730 not found: ID does not exist" containerID="6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.135529 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730"} err="failed to get container status \"6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730\": rpc error: code = NotFound desc = could not find container \"6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730\": container with ID starting with 6bd98a517b7f457b3083a20483a22dfe04ec5ed0f1954b4d6e17d4cd1bd9a730 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.197453 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208f36cc-d67f-450a-a68f-1ace0702dfef" path="/var/lib/kubelet/pods/208f36cc-d67f-450a-a68f-1ace0702dfef/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.197893 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" path="/var/lib/kubelet/pods/20a1576d-856f-45fb-b210-66fddef6adaf/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.198392 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259fc011-3b4a-41a2-a064-d7d2282e9763" path="/var/lib/kubelet/pods/259fc011-3b4a-41a2-a064-d7d2282e9763/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.198514 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.198740 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" path="/var/lib/kubelet/pods/37486029-cbe4-4198-afb9-962c7d0728e5/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.199650 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" path="/var/lib/kubelet/pods/3cc4f223-9c2d-442f-aa10-16fcaee9e8c5/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.200465 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.200477 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" path="/var/lib/kubelet/pods/53a9b1e8-b1f8-4bf4-8d99-06b60581d63c/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.201308 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" path="/var/lib/kubelet/pods/67eda4bf-2ebc-4616-bd19-55afc23963d2/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.201599 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.201645 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.201853 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" path="/var/lib/kubelet/pods/6ba6473c-6566-4d18-88ff-e329a248c151/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.202692 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f619bdd-bf9c-40af-90f3-6c3bfe463d74" path="/var/lib/kubelet/pods/6f619bdd-bf9c-40af-90f3-6c3bfe463d74/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.203157 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" path="/var/lib/kubelet/pods/7506b316-cf85-48fa-a833-fafc5e2cb8ce/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.203689 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" path="/var/lib/kubelet/pods/8e72ad47-d005-44ce-a494-5ed25884c762/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.204591 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" path="/var/lib/kubelet/pods/98f30a9f-5d85-4dce-8bc8-03e26c9967c8/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.205188 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a185d0f4-2937-4afb-ad5c-8ebc5e87a241" path="/var/lib/kubelet/pods/a185d0f4-2937-4afb-ad5c-8ebc5e87a241/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.205607 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" path="/var/lib/kubelet/pods/b4e1e59b-cbb0-4d57-9a35-988ee16e74c8/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.206460 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4aab59-e488-40d8-883e-0e09a44aae4b" path="/var/lib/kubelet/pods/be4aab59-e488-40d8-883e-0e09a44aae4b/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.206908 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2290ff-a278-4834-be6d-3377f82d1548" path="/var/lib/kubelet/pods/df2290ff-a278-4834-be6d-3377f82d1548/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.207116 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4af1c90-a8e5-4608-9ff7-495057250aba" path="/var/lib/kubelet/pods/e4af1c90-a8e5-4608-9ff7-495057250aba/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.207380 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ebb0298d-8a63-4ea2-8473-e5e86a134fd8" path="/var/lib/kubelet/pods/ebb0298d-8a63-4ea2-8473-e5e86a134fd8/volumes" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.218564 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.223648 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.231935 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.236593 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.567501 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-config-data\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755509 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-fernet-keys\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-internal-tls-certs\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjqd5\" (UniqueName: \"kubernetes.io/projected/26893235-a33a-4211-b02e-b4237c0586c5-kube-api-access-cjqd5\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-public-tls-certs\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-combined-ca-bundle\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-credential-keys\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.755712 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-scripts\") pod \"26893235-a33a-4211-b02e-b4237c0586c5\" (UID: \"26893235-a33a-4211-b02e-b4237c0586c5\") " Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.759566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26893235-a33a-4211-b02e-b4237c0586c5-kube-api-access-cjqd5" (OuterVolumeSpecName: "kube-api-access-cjqd5") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "kube-api-access-cjqd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.759737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.759974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.760120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-scripts" (OuterVolumeSpecName: "scripts") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.773972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-config-data" (OuterVolumeSpecName: "config-data") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.774238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.786242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.791077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "26893235-a33a-4211-b02e-b4237c0586c5" (UID: "26893235-a33a-4211-b02e-b4237c0586c5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857196 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857222 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857231 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857242 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjqd5\" (UniqueName: \"kubernetes.io/projected/26893235-a33a-4211-b02e-b4237c0586c5-kube-api-access-cjqd5\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857250 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857258 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857265 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.857280 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26893235-a33a-4211-b02e-b4237c0586c5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.931196 4707 generic.go:334] "Generic (PLEG): container finished" podID="26893235-a33a-4211-b02e-b4237c0586c5" containerID="4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74" exitCode=0 Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.931236 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.931261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" event={"ID":"26893235-a33a-4211-b02e-b4237c0586c5","Type":"ContainerDied","Data":"4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74"} Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.931312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" event={"ID":"26893235-a33a-4211-b02e-b4237c0586c5","Type":"ContainerDied","Data":"e9d9573fbce842118db8236ca137fa1a56ffecc6e81c8e4544ae5511ef653af7"} Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.931330 4707 scope.go:117] "RemoveContainer" containerID="4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.951765 4707 scope.go:117] "RemoveContainer" containerID="4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74" Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.952111 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74\": container with ID starting with 4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74 not found: ID does not exist" containerID="4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.952137 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74"} err="failed to get container status \"4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74\": rpc error: code = NotFound desc = could not find container \"4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74\": container with ID starting with 4a621bb3e5dc1eb06d6d96d32bd84891551676527bfd4df62ba80c66908c2d74 not found: ID does not exist" Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.954222 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-667579cbdb-pbr2f"] Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.958408 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:41 crc kubenswrapper[4707]: E0121 15:46:41.958456 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data podName:f341e188-43e6-4019-b4dd-b8ea727d2d3f nodeName:}" failed. No retries permitted until 2026-01-21 15:46:49.958441923 +0000 UTC m=+2707.139958135 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data") pod "rabbitmq-cell1-server-0" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:46:41 crc kubenswrapper[4707]: I0121 15:46:41.959265 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-667579cbdb-pbr2f"] Jan 21 15:46:42 crc kubenswrapper[4707]: I0121 15:46:42.025645 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.66:5671: connect: connection refused" Jan 21 15:46:42 crc kubenswrapper[4707]: I0121 15:46:42.944832 4707 generic.go:334] "Generic (PLEG): container finished" podID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerID="7de59c225a5ef6ebd73df1f9d5fa7286a46034a887cc003580cb1dcd45f6ba34" exitCode=0 Jan 21 15:46:42 crc kubenswrapper[4707]: I0121 15:46:42.944889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f341e188-43e6-4019-b4dd-b8ea727d2d3f","Type":"ContainerDied","Data":"7de59c225a5ef6ebd73df1f9d5fa7286a46034a887cc003580cb1dcd45f6ba34"} Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.092619 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:46:43 crc kubenswrapper[4707]: E0121 15:46:43.178064 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:46:43 crc kubenswrapper[4707]: E0121 15:46:43.178121 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data podName:8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab nodeName:}" failed. No retries permitted until 2026-01-21 15:46:51.178109433 +0000 UTC m=+2708.359625655 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data") pod "rabbitmq-server-0" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab") : configmap "rabbitmq-config-data" not found Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.189607 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26893235-a33a-4211-b02e-b4237c0586c5" path="/var/lib/kubelet/pods/26893235-a33a-4211-b02e-b4237c0586c5/volumes" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.190218 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" path="/var/lib/kubelet/pods/6f277223-7e29-49e4-a1e5-f02c4d883ba7/volumes" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.190800 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" path="/var/lib/kubelet/pods/a16e7975-1407-4544-8c46-b0445ac9c311/volumes" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-erlang-cookie\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-tls\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-confd\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nksf\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-kube-api-access-8nksf\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-plugins\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f341e188-43e6-4019-b4dd-b8ea727d2d3f-erlang-cookie-secret\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc 
kubenswrapper[4707]: I0121 15:46:43.278770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-plugins-conf\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f341e188-43e6-4019-b4dd-b8ea727d2d3f-pod-info\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-server-conf\") pod \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\" (UID: \"f341e188-43e6-4019-b4dd-b8ea727d2d3f\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.278938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.279252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.279411 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.279433 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.279511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.283470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f341e188-43e6-4019-b4dd-b8ea727d2d3f-pod-info" (OuterVolumeSpecName: "pod-info") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.284285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f341e188-43e6-4019-b4dd-b8ea727d2d3f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.285035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "persistence") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.291987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.295069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-kube-api-access-8nksf" (OuterVolumeSpecName: "kube-api-access-8nksf") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "kube-api-access-8nksf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.298905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data" (OuterVolumeSpecName: "config-data") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.328746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-server-conf" (OuterVolumeSpecName: "server-conf") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.352837 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f341e188-43e6-4019-b4dd-b8ea727d2d3f" (UID: "f341e188-43e6-4019-b4dd-b8ea727d2d3f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381097 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381266 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381285 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381294 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381302 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nksf\" (UniqueName: \"kubernetes.io/projected/f341e188-43e6-4019-b4dd-b8ea727d2d3f-kube-api-access-8nksf\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381311 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f341e188-43e6-4019-b4dd-b8ea727d2d3f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381328 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381335 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f341e188-43e6-4019-b4dd-b8ea727d2d3f-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.381343 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f341e188-43e6-4019-b4dd-b8ea727d2d3f-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.383258 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.394264 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-pod-info\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvw5j\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-kube-api-access-xvw5j\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-erlang-cookie\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-erlang-cookie-secret\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-confd\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-tls\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-server-conf\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482849 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-plugins\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.482887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-plugins-conf\") pod \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\" (UID: \"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab\") " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.483170 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.483988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.484267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.484309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.486597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.486844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.487041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.487115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-pod-info" (OuterVolumeSpecName: "pod-info") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.497761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data" (OuterVolumeSpecName: "config-data") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.498207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-kube-api-access-xvw5j" (OuterVolumeSpecName: "kube-api-access-xvw5j") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "kube-api-access-xvw5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.510429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-server-conf" (OuterVolumeSpecName: "server-conf") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.537106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" (UID: "8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584776 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvw5j\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-kube-api-access-xvw5j\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584800 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584820 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584829 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584850 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584859 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584867 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584874 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584883 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584890 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.584898 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.611256 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.686188 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.959547 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"f341e188-43e6-4019-b4dd-b8ea727d2d3f","Type":"ContainerDied","Data":"99d78c44149d942c63b32abb0f52b18b72665aa6bf9087c9d15c4961ea333793"} Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.959568 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.959647 4707 scope.go:117] "RemoveContainer" containerID="7de59c225a5ef6ebd73df1f9d5fa7286a46034a887cc003580cb1dcd45f6ba34" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.965227 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerID="41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161" exitCode=0 Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.965412 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.965399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab","Type":"ContainerDied","Data":"41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161"} Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.965551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab","Type":"ContainerDied","Data":"0f3770314d0a5332a4046629981f2c4c2f54bf0d6f0987f4ef6dfb516694cbed"} Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.985841 4707 scope.go:117] "RemoveContainer" containerID="2d3a5c43e34f6f46f6d3cb5d8c70467adeed72ea7c6b5d91eb9047045028dd92" Jan 21 15:46:43 crc kubenswrapper[4707]: I0121 15:46:43.990779 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.001437 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.009695 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.013851 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.018423 4707 scope.go:117] "RemoveContainer" containerID="41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161" Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.040249 4707 scope.go:117] "RemoveContainer" containerID="e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f" Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.058467 4707 scope.go:117] "RemoveContainer" containerID="41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161" Jan 21 15:46:44 crc kubenswrapper[4707]: E0121 15:46:44.058739 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161\": container with ID starting with 41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161 not found: ID does not exist" containerID="41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161" 
Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.058769 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161"} err="failed to get container status \"41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161\": rpc error: code = NotFound desc = could not find container \"41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161\": container with ID starting with 41c780477200f69cee24a99bc3ebca66e738a2aa276640e716d1710778082161 not found: ID does not exist" Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.058788 4707 scope.go:117] "RemoveContainer" containerID="e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f" Jan 21 15:46:44 crc kubenswrapper[4707]: E0121 15:46:44.059034 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f\": container with ID starting with e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f not found: ID does not exist" containerID="e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f" Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.059053 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f"} err="failed to get container status \"e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f\": rpc error: code = NotFound desc = could not find container \"e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f\": container with ID starting with e3c350dfcaf1e33aff5883cc810f2dc0db49cf8f82f693de382e2bb5696a9e6f not found: ID does not exist" Jan 21 15:46:44 crc kubenswrapper[4707]: I0121 15:46:44.183615 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-j66wd\" not found" Jan 21 15:46:44 crc kubenswrapper[4707]: E0121 15:46:44.296170 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:46:44 crc kubenswrapper[4707]: E0121 15:46:44.296236 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle podName:4505129f-9175-43df-9d92-cbb8fba50ea3 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:44.796220413 +0000 UTC m=+2701.977736635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3") : secret "combined-ca-bundle" not found Jan 21 15:46:44 crc kubenswrapper[4707]: E0121 15:46:44.805021 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:46:44 crc kubenswrapper[4707]: E0121 15:46:44.805335 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle podName:4505129f-9175-43df-9d92-cbb8fba50ea3 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:45.805322168 +0000 UTC m=+2702.986838389 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3") : secret "combined-ca-bundle" not found Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.190765 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" path="/var/lib/kubelet/pods/8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab/volumes" Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.191632 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" path="/var/lib/kubelet/pods/f341e188-43e6-4019-b4dd-b8ea727d2d3f/volumes" Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.467928 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.468150 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="57114736-e4da-48ba-8f64-f382267ede2e" containerName="memcached" containerID="cri-o://27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf" gracePeriod=30 Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.472548 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.472713 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" gracePeriod=30 Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.480014 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8"] Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.483861 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-lnwv8"] Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.494126 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.494456 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" containerID="cri-o://42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" gracePeriod=30 Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.498053 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg"] Jan 21 15:46:45 crc kubenswrapper[4707]: I0121 15:46:45.501580 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-tfldg"] Jan 21 15:46:45 crc kubenswrapper[4707]: E0121 15:46:45.708500 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:45 crc kubenswrapper[4707]: 
E0121 15:46:45.709870 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:45 crc kubenswrapper[4707]: E0121 15:46:45.711280 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:45 crc kubenswrapper[4707]: E0121 15:46:45.711342 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:46:45 crc kubenswrapper[4707]: E0121 15:46:45.826450 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:46:45 crc kubenswrapper[4707]: E0121 15:46:45.826506 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle podName:4505129f-9175-43df-9d92-cbb8fba50ea3 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:47.826491774 +0000 UTC m=+2705.008007996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3") : secret "combined-ca-bundle" not found Jan 21 15:46:46 crc kubenswrapper[4707]: E0121 15:46:46.193053 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:46 crc kubenswrapper[4707]: E0121 15:46:46.194445 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:46 crc kubenswrapper[4707]: E0121 15:46:46.195853 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:46 crc kubenswrapper[4707]: E0121 15:46:46.195894 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 
21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.926918 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.942870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-config-data\") pod \"57114736-e4da-48ba-8f64-f382267ede2e\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-kolla-config\") pod \"57114736-e4da-48ba-8f64-f382267ede2e\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-combined-ca-bundle\") pod \"57114736-e4da-48ba-8f64-f382267ede2e\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-memcached-tls-certs\") pod \"57114736-e4da-48ba-8f64-f382267ede2e\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4lqk\" (UniqueName: \"kubernetes.io/projected/57114736-e4da-48ba-8f64-f382267ede2e-kube-api-access-w4lqk\") pod \"57114736-e4da-48ba-8f64-f382267ede2e\" (UID: \"57114736-e4da-48ba-8f64-f382267ede2e\") " Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943522 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "57114736-e4da-48ba-8f64-f382267ede2e" (UID: "57114736-e4da-48ba-8f64-f382267ede2e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-config-data" (OuterVolumeSpecName: "config-data") pod "57114736-e4da-48ba-8f64-f382267ede2e" (UID: "57114736-e4da-48ba-8f64-f382267ede2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.943948 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.944020 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57114736-e4da-48ba-8f64-f382267ede2e-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.951842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57114736-e4da-48ba-8f64-f382267ede2e-kube-api-access-w4lqk" (OuterVolumeSpecName: "kube-api-access-w4lqk") pod "57114736-e4da-48ba-8f64-f382267ede2e" (UID: "57114736-e4da-48ba-8f64-f382267ede2e"). InnerVolumeSpecName "kube-api-access-w4lqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.960676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57114736-e4da-48ba-8f64-f382267ede2e" (UID: "57114736-e4da-48ba-8f64-f382267ede2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.975309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "57114736-e4da-48ba-8f64-f382267ede2e" (UID: "57114736-e4da-48ba-8f64-f382267ede2e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.992842 4707 generic.go:334] "Generic (PLEG): container finished" podID="57114736-e4da-48ba-8f64-f382267ede2e" containerID="27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf" exitCode=0 Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.992910 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.992931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"57114736-e4da-48ba-8f64-f382267ede2e","Type":"ContainerDied","Data":"27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf"} Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.992974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"57114736-e4da-48ba-8f64-f382267ede2e","Type":"ContainerDied","Data":"0529d7b6cc5864226f3fb3e7dcfc4a31355bf0c22f544c8ba5fec2c6f1c19652"} Jan 21 15:46:46 crc kubenswrapper[4707]: I0121 15:46:46.993000 4707 scope.go:117] "RemoveContainer" containerID="27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.012457 4707 scope.go:117] "RemoveContainer" containerID="27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf" Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.013062 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf\": container with ID starting with 27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf not found: ID does not exist" containerID="27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.013110 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf"} err="failed to get container status \"27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf\": rpc error: code = NotFound desc = could not find container \"27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf\": container with ID starting with 27c5898ea500b70d1bbd34f791bea972d7921c89a1ef7894fb5fb8ff6775dadf not found: ID does not exist" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.030014 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.033290 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.036044 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.045695 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.045719 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/57114736-e4da-48ba-8f64-f382267ede2e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.045729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4lqk\" (UniqueName: \"kubernetes.io/projected/57114736-e4da-48ba-8f64-f382267ede2e-kube-api-access-w4lqk\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.072648 4707 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft"] Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.072946 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" podUID="f96930d8-596d-433f-bd93-ae767abae88c" containerName="dnsmasq-dns" containerID="cri-o://711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5" gracePeriod=10 Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.110253 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57114736_e4da_48ba_8f64_f382267ede2e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57114736_e4da_48ba_8f64_f382267ede2e.slice/crio-0529d7b6cc5864226f3fb3e7dcfc4a31355bf0c22f544c8ba5fec2c6f1c19652\": RecentStats: unable to find data in memory cache]" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.189930 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57114736-e4da-48ba-8f64-f382267ede2e" path="/var/lib/kubelet/pods/57114736-e4da-48ba-8f64-f382267ede2e/volumes" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.190601 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708db58a-2cd1-4735-8c38-16987a96a22a" path="/var/lib/kubelet/pods/708db58a-2cd1-4735-8c38-16987a96a22a/volumes" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.191047 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2" path="/var/lib/kubelet/pods/78f6dbf8-b129-4f6d-a2cb-f1a1d42ff7f2/volumes" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.401402 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.449794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lv6x\" (UniqueName: \"kubernetes.io/projected/f96930d8-596d-433f-bd93-ae767abae88c-kube-api-access-9lv6x\") pod \"f96930d8-596d-433f-bd93-ae767abae88c\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.449898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dns-swift-storage-0\") pod \"f96930d8-596d-433f-bd93-ae767abae88c\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.449923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dnsmasq-svc\") pod \"f96930d8-596d-433f-bd93-ae767abae88c\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.450002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-config\") pod \"f96930d8-596d-433f-bd93-ae767abae88c\" (UID: \"f96930d8-596d-433f-bd93-ae767abae88c\") " Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.454828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96930d8-596d-433f-bd93-ae767abae88c-kube-api-access-9lv6x" (OuterVolumeSpecName: "kube-api-access-9lv6x") pod "f96930d8-596d-433f-bd93-ae767abae88c" (UID: "f96930d8-596d-433f-bd93-ae767abae88c"). InnerVolumeSpecName "kube-api-access-9lv6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.478069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-config" (OuterVolumeSpecName: "config") pod "f96930d8-596d-433f-bd93-ae767abae88c" (UID: "f96930d8-596d-433f-bd93-ae767abae88c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.478233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f96930d8-596d-433f-bd93-ae767abae88c" (UID: "f96930d8-596d-433f-bd93-ae767abae88c"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.478328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f96930d8-596d-433f-bd93-ae767abae88c" (UID: "f96930d8-596d-433f-bd93-ae767abae88c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.516724 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.518159 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.519235 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.519267 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.551976 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.552009 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lv6x\" (UniqueName: \"kubernetes.io/projected/f96930d8-596d-433f-bd93-ae767abae88c-kube-api-access-9lv6x\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.552020 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: I0121 15:46:47.552029 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f96930d8-596d-433f-bd93-ae767abae88c-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.855727 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:46:47 crc kubenswrapper[4707]: E0121 15:46:47.855789 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle podName:4505129f-9175-43df-9d92-cbb8fba50ea3 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:51.855775819 +0000 UTC m=+2709.037292042 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3") : secret "combined-ca-bundle" not found Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.001730 4707 generic.go:334] "Generic (PLEG): container finished" podID="f96930d8-596d-433f-bd93-ae767abae88c" containerID="711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5" exitCode=0 Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.001780 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.001851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" event={"ID":"f96930d8-596d-433f-bd93-ae767abae88c","Type":"ContainerDied","Data":"711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5"} Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.001887 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft" event={"ID":"f96930d8-596d-433f-bd93-ae767abae88c","Type":"ContainerDied","Data":"06c0d3fb7865624d130253e3ec19b2be699c313504fd718cd1f3a408e16db445"} Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.001903 4707 scope.go:117] "RemoveContainer" containerID="711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5" Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.023434 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft"] Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.026328 4707 scope.go:117] "RemoveContainer" containerID="01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175" Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.027962 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-66d9b4485-g2wft"] Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.059106 4707 scope.go:117] "RemoveContainer" containerID="711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5" Jan 21 15:46:48 crc kubenswrapper[4707]: E0121 15:46:48.059472 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5\": container with ID starting with 711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5 not found: ID does not exist" containerID="711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5" Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.059501 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5"} err="failed to get container status \"711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5\": rpc error: code = NotFound desc = could not find container \"711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5\": container with ID starting with 711334b6705b3f7928cf4973dfc5710914b4ec426cd881d302e467bd55a82fd5 not found: ID does not exist" Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.059518 4707 scope.go:117] "RemoveContainer" containerID="01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175" Jan 21 15:46:48 crc 
kubenswrapper[4707]: E0121 15:46:48.059725 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175\": container with ID starting with 01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175 not found: ID does not exist" containerID="01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175" Jan 21 15:46:48 crc kubenswrapper[4707]: I0121 15:46:48.059746 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175"} err="failed to get container status \"01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175\": rpc error: code = NotFound desc = could not find container \"01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175\": container with ID starting with 01dc0b7b8e827ca1321020a9394c9c1248bd012e38113ef32c638674547d5175 not found: ID does not exist" Jan 21 15:46:49 crc kubenswrapper[4707]: I0121 15:46:49.189603 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96930d8-596d-433f-bd93-ae767abae88c" path="/var/lib/kubelet/pods/f96930d8-596d-433f-bd93-ae767abae88c/volumes" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.182404 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.467444 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4kjk5"] Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.468744 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="ovn-northd" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.468852 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="ovn-northd" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.468908 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="rabbitmq" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.468972 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="rabbitmq" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469036 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469165 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" containerName="mariadb-database-create" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469226 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" containerName="mariadb-database-create" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerName="galera" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469372 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerName="galera" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469422 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469464 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469536 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="setup-container" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469602 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="setup-container" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469664 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-api" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469762 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="openstack-network-exporter" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469826 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="openstack-network-exporter" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469883 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.469926 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.469971 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470015 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470063 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470356 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-api" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470404 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96930d8-596d-433f-bd93-ae767abae88c" containerName="dnsmasq-dns" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470445 
4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96930d8-596d-433f-bd93-ae767abae88c" containerName="dnsmasq-dns" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470487 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470534 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470586 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26893235-a33a-4211-b02e-b4237c0586c5" containerName="keystone-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470629 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26893235-a33a-4211-b02e-b4237c0586c5" containerName="keystone-api" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470671 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470760 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.470859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.470945 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="setup-container" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="setup-container" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472132 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62555ab6-9268-4248-977e-98269dc9483d" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472193 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62555ab6-9268-4248-977e-98269dc9483d" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472241 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14f4f47-2fe8-47fd-892f-832266276730" containerName="mysql-bootstrap" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472406 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14f4f47-2fe8-47fd-892f-832266276730" containerName="mysql-bootstrap" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-httpd" Jan 21 15:46:50 crc kubenswrapper[4707]: 
I0121 15:46:50.472489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-httpd" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472532 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472574 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14f4f47-2fe8-47fd-892f-832266276730" containerName="galera" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14f4f47-2fe8-47fd-892f-832266276730" containerName="galera" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472703 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="rabbitmq" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472742 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="rabbitmq" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472784 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-httpd" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472845 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-httpd" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472889 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.472928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.472991 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4aab59-e488-40d8-883e-0e09a44aae4b" containerName="mariadb-database-create" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473033 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4aab59-e488-40d8-883e-0e09a44aae4b" containerName="mariadb-database-create" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.473076 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57114736-e4da-48ba-8f64-f382267ede2e" containerName="memcached" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473115 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57114736-e4da-48ba-8f64-f382267ede2e" containerName="memcached" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.473159 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473205 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.473252 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473307 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-log" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.473348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96930d8-596d-433f-bd93-ae767abae88c" containerName="init" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473393 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96930d8-596d-433f-bd93-ae767abae88c" containerName="init" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.473471 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerName="mysql-bootstrap" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473540 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerName="mysql-bootstrap" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.473618 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-metadata" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.473689 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-metadata" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474008 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474093 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cc4f223-9c2d-442f-aa10-16fcaee9e8c5" containerName="mariadb-database-create" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474166 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474240 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474330 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" containerName="ovn-northd" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474405 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474453 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-httpd" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474523 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474593 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba6473c-6566-4d18-88ff-e329a248c151" containerName="barbican-keystone-listener-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474661 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f96930d8-596d-433f-bd93-ae767abae88c" containerName="dnsmasq-dns" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474729 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="67eda4bf-2ebc-4616-bd19-55afc23963d2" 
containerName="openstack-network-exporter" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474782 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4aab59-e488-40d8-883e-0e09a44aae4b" containerName="mariadb-database-create" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474920 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.474992 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e72ad47-d005-44ce-a494-5ed25884c762" containerName="barbican-worker" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475065 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62555ab6-9268-4248-977e-98269dc9483d" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475138 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f277223-7e29-49e4-a1e5-f02c4d883ba7" containerName="nova-metadata-metadata" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475188 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57114736-e4da-48ba-8f64-f382267ede2e" containerName="memcached" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475304 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14f4f47-2fe8-47fd-892f-832266276730" containerName="galera" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="26893235-a33a-4211-b02e-b4237c0586c5" containerName="keystone-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475429 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f341e188-43e6-4019-b4dd-b8ea727d2d3f" containerName="rabbitmq" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475535 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7506b316-cf85-48fa-a833-fafc5e2cb8ce" containerName="barbican-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475657 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-httpd" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f30a9f-5d85-4dce-8bc8-03e26c9967c8" containerName="cinder-api" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e1e59b-cbb0-4d57-9a35-988ee16e74c8" containerName="galera" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475879 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37486029-cbe4-4198-afb9-962c7d0728e5" containerName="placement-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475935 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a1576d-856f-45fb-b210-66fddef6adaf" containerName="glance-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.475978 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16e7975-1407-4544-8c46-b0445ac9c311" containerName="nova-api-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.476020 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8a048e9d-4ff0-479e-bc75-8ccfa4ec7fab" containerName="rabbitmq" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.476074 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a9b1e8-b1f8-4bf4-8d99-06b60581d63c" containerName="glance-log" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.477171 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4kjk5"] Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.477324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.489381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-catalog-content\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.489461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47n7t\" (UniqueName: \"kubernetes.io/projected/0d31e627-8f6e-478e-99ac-eccb0608a7bd-kube-api-access-47n7t\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.489523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-utilities\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.590681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-utilities\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.591131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-catalog-content\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.591346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47n7t\" (UniqueName: \"kubernetes.io/projected/0d31e627-8f6e-478e-99ac-eccb0608a7bd-kube-api-access-47n7t\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.591458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-catalog-content\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc 
kubenswrapper[4707]: I0121 15:46:50.591170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-utilities\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.607616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47n7t\" (UniqueName: \"kubernetes.io/projected/0d31e627-8f6e-478e-99ac-eccb0608a7bd-kube-api-access-47n7t\") pod \"community-operators-4kjk5\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.710236 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.711671 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.713045 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:50 crc kubenswrapper[4707]: E0121 15:46:50.713156 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:46:50 crc kubenswrapper[4707]: I0121 15:46:50.788959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:46:51 crc kubenswrapper[4707]: I0121 15:46:51.025858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"34a707de92d001c9de59dbef499d523bf460e9895d2f693f866b85aad1e846c3"} Jan 21 15:46:51 crc kubenswrapper[4707]: E0121 15:46:51.193254 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:51 crc kubenswrapper[4707]: E0121 15:46:51.194579 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:51 crc kubenswrapper[4707]: E0121 15:46:51.195714 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:51 crc kubenswrapper[4707]: E0121 15:46:51.195749 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:46:51 crc kubenswrapper[4707]: I0121 15:46:51.226160 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4kjk5"] Jan 21 15:46:51 crc kubenswrapper[4707]: W0121 15:46:51.230251 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d31e627_8f6e_478e_99ac_eccb0608a7bd.slice/crio-9748a95fdcf976b51ab70ea52633d6539247600833533375a3f34764be70d0a4 WatchSource:0}: Error finding container 9748a95fdcf976b51ab70ea52633d6539247600833533375a3f34764be70d0a4: Status 404 returned error can't find the container with id 9748a95fdcf976b51ab70ea52633d6539247600833533375a3f34764be70d0a4 Jan 21 15:46:51 crc kubenswrapper[4707]: E0121 15:46:51.908395 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:46:51 crc kubenswrapper[4707]: E0121 15:46:51.908844 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle podName:4505129f-9175-43df-9d92-cbb8fba50ea3 nodeName:}" failed. No retries permitted until 2026-01-21 15:46:59.908829403 +0000 UTC m=+2717.090345624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3") : secret "combined-ca-bundle" not found Jan 21 15:46:52 crc kubenswrapper[4707]: I0121 15:46:52.036931 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerID="1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7" exitCode=0 Jan 21 15:46:52 crc kubenswrapper[4707]: I0121 15:46:52.036972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerDied","Data":"1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7"} Jan 21 15:46:52 crc kubenswrapper[4707]: I0121 15:46:52.036995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerStarted","Data":"9748a95fdcf976b51ab70ea52633d6539247600833533375a3f34764be70d0a4"} Jan 21 15:46:52 crc kubenswrapper[4707]: E0121 15:46:52.516694 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:52 crc kubenswrapper[4707]: E0121 15:46:52.518032 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:52 crc kubenswrapper[4707]: E0121 15:46:52.519134 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:52 crc kubenswrapper[4707]: E0121 15:46:52.519158 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.044928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerStarted","Data":"46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685"} Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.066624 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bcjlb"] Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.068177 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.078041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcjlb"] Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.124113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7pvz\" (UniqueName: \"kubernetes.io/projected/36ac193f-5744-444f-81d6-8b785dd52ed2-kube-api-access-n7pvz\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.124181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-catalog-content\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.124229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-utilities\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.225739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7pvz\" (UniqueName: \"kubernetes.io/projected/36ac193f-5744-444f-81d6-8b785dd52ed2-kube-api-access-n7pvz\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.225789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-catalog-content\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.225851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-utilities\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.226203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-utilities\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.226294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-catalog-content\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.242207 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n7pvz\" (UniqueName: \"kubernetes.io/projected/36ac193f-5744-444f-81d6-8b785dd52ed2-kube-api-access-n7pvz\") pod \"certified-operators-bcjlb\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.388906 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:46:53 crc kubenswrapper[4707]: I0121 15:46:53.823246 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcjlb"] Jan 21 15:46:54 crc kubenswrapper[4707]: I0121 15:46:54.081594 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerID="46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685" exitCode=0 Jan 21 15:46:54 crc kubenswrapper[4707]: I0121 15:46:54.081769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerDied","Data":"46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685"} Jan 21 15:46:54 crc kubenswrapper[4707]: I0121 15:46:54.085101 4707 generic.go:334] "Generic (PLEG): container finished" podID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerID="2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f" exitCode=0 Jan 21 15:46:54 crc kubenswrapper[4707]: I0121 15:46:54.085131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerDied","Data":"2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f"} Jan 21 15:46:54 crc kubenswrapper[4707]: I0121 15:46:54.085149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerStarted","Data":"666eacecce05a7c03c5249e38cbef285bdc2d0ffe04ec1085fec5480db751553"} Jan 21 15:46:55 crc kubenswrapper[4707]: I0121 15:46:55.093020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerStarted","Data":"55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0"} Jan 21 15:46:55 crc kubenswrapper[4707]: I0121 15:46:55.095312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerStarted","Data":"854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1"} Jan 21 15:46:55 crc kubenswrapper[4707]: I0121 15:46:55.123656 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4kjk5" podStartSLOduration=2.592934955 podStartE2EDuration="5.12363234s" podCreationTimestamp="2026-01-21 15:46:50 +0000 UTC" firstStartedPulling="2026-01-21 15:46:52.038896239 +0000 UTC m=+2709.220412461" lastFinishedPulling="2026-01-21 15:46:54.569593624 +0000 UTC m=+2711.751109846" observedRunningTime="2026-01-21 15:46:55.12200432 +0000 UTC m=+2712.303520542" watchObservedRunningTime="2026-01-21 15:46:55.12363234 +0000 UTC m=+2712.305148563" Jan 21 15:46:55 crc kubenswrapper[4707]: E0121 15:46:55.708323 4707 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:55 crc kubenswrapper[4707]: E0121 15:46:55.709287 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:55 crc kubenswrapper[4707]: E0121 15:46:55.710246 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:55 crc kubenswrapper[4707]: E0121 15:46:55.710302 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:46:56 crc kubenswrapper[4707]: I0121 15:46:56.103579 4707 generic.go:334] "Generic (PLEG): container finished" podID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerID="55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0" exitCode=0 Jan 21 15:46:56 crc kubenswrapper[4707]: I0121 15:46:56.103650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerDied","Data":"55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0"} Jan 21 15:46:56 crc kubenswrapper[4707]: E0121 15:46:56.192767 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:56 crc kubenswrapper[4707]: E0121 15:46:56.194049 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:56 crc kubenswrapper[4707]: E0121 15:46:56.195120 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:46:56 crc kubenswrapper[4707]: E0121 15:46:56.195146 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" 
podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:46:57 crc kubenswrapper[4707]: I0121 15:46:57.112052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerStarted","Data":"927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f"} Jan 21 15:46:57 crc kubenswrapper[4707]: I0121 15:46:57.131848 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bcjlb" podStartSLOduration=1.6182249039999999 podStartE2EDuration="4.131833967s" podCreationTimestamp="2026-01-21 15:46:53 +0000 UTC" firstStartedPulling="2026-01-21 15:46:54.086300696 +0000 UTC m=+2711.267816918" lastFinishedPulling="2026-01-21 15:46:56.599909759 +0000 UTC m=+2713.781425981" observedRunningTime="2026-01-21 15:46:57.127744838 +0000 UTC m=+2714.309261060" watchObservedRunningTime="2026-01-21 15:46:57.131833967 +0000 UTC m=+2714.313350189" Jan 21 15:46:57 crc kubenswrapper[4707]: E0121 15:46:57.516874 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:57 crc kubenswrapper[4707]: E0121 15:46:57.517834 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:57 crc kubenswrapper[4707]: E0121 15:46:57.518857 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:46:57 crc kubenswrapper[4707]: E0121 15:46:57.518914 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:46:58 crc kubenswrapper[4707]: I0121 15:46:58.194695 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.70:3000/\": dial tcp 10.217.1.70:3000: connect: connection refused" Jan 21 15:46:59 crc kubenswrapper[4707]: I0121 15:46:59.192409 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.71:9696/\": dial tcp 10.217.1.71:9696: connect: connection refused" Jan 21 15:46:59 crc kubenswrapper[4707]: E0121 15:46:59.931697 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 
15:46:59 crc kubenswrapper[4707]: E0121 15:46:59.931832 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle podName:4505129f-9175-43df-9d92-cbb8fba50ea3 nodeName:}" failed. No retries permitted until 2026-01-21 15:47:15.931787598 +0000 UTC m=+2733.113303820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3") : secret "combined-ca-bundle" not found Jan 21 15:47:00 crc kubenswrapper[4707]: E0121 15:47:00.707477 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:00 crc kubenswrapper[4707]: E0121 15:47:00.708578 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:00 crc kubenswrapper[4707]: E0121 15:47:00.709568 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:00 crc kubenswrapper[4707]: E0121 15:47:00.709606 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:00 crc kubenswrapper[4707]: I0121 15:47:00.789947 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:47:00 crc kubenswrapper[4707]: I0121 15:47:00.789989 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:47:00 crc kubenswrapper[4707]: I0121 15:47:00.822133 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:47:01 crc kubenswrapper[4707]: I0121 15:47:01.143150 4707 scope.go:117] "RemoveContainer" containerID="67426142fb520cd6c7a41ab524ba5e9fdc22b2e4ccf3ef543c4400ce59fc6e49" Jan 21 15:47:01 crc kubenswrapper[4707]: E0121 15:47:01.201503 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:47:01 crc kubenswrapper[4707]: E0121 15:47:01.202791 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot 
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:47:01 crc kubenswrapper[4707]: I0121 15:47:01.203408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:47:01 crc kubenswrapper[4707]: E0121 15:47:01.204356 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:47:01 crc kubenswrapper[4707]: E0121 15:47:01.204409 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:47:02 crc kubenswrapper[4707]: I0121 15:47:02.059507 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4kjk5"] Jan 21 15:47:02 crc kubenswrapper[4707]: E0121 15:47:02.517043 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:02 crc kubenswrapper[4707]: E0121 15:47:02.518997 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:02 crc kubenswrapper[4707]: E0121 15:47:02.520045 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:02 crc kubenswrapper[4707]: E0121 15:47:02.520140 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.160391 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4kjk5" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="registry-server" containerID="cri-o://854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1" gracePeriod=2 Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.389290 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 
15:47:03.389352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.437211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.551290 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.579976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-utilities\") pod \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.580093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47n7t\" (UniqueName: \"kubernetes.io/projected/0d31e627-8f6e-478e-99ac-eccb0608a7bd-kube-api-access-47n7t\") pod \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.580121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-catalog-content\") pod \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\" (UID: \"0d31e627-8f6e-478e-99ac-eccb0608a7bd\") " Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.580780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-utilities" (OuterVolumeSpecName: "utilities") pod "0d31e627-8f6e-478e-99ac-eccb0608a7bd" (UID: "0d31e627-8f6e-478e-99ac-eccb0608a7bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.586279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d31e627-8f6e-478e-99ac-eccb0608a7bd-kube-api-access-47n7t" (OuterVolumeSpecName: "kube-api-access-47n7t") pod "0d31e627-8f6e-478e-99ac-eccb0608a7bd" (UID: "0d31e627-8f6e-478e-99ac-eccb0608a7bd"). InnerVolumeSpecName "kube-api-access-47n7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.622726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d31e627-8f6e-478e-99ac-eccb0608a7bd" (UID: "0d31e627-8f6e-478e-99ac-eccb0608a7bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.681835 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.681862 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47n7t\" (UniqueName: \"kubernetes.io/projected/0d31e627-8f6e-478e-99ac-eccb0608a7bd-kube-api-access-47n7t\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:03 crc kubenswrapper[4707]: I0121 15:47:03.681874 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d31e627-8f6e-478e-99ac-eccb0608a7bd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.167911 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerID="854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1" exitCode=0 Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.167974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerDied","Data":"854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1"} Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.168009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kjk5" event={"ID":"0d31e627-8f6e-478e-99ac-eccb0608a7bd","Type":"ContainerDied","Data":"9748a95fdcf976b51ab70ea52633d6539247600833533375a3f34764be70d0a4"} Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.168037 4707 scope.go:117] "RemoveContainer" containerID="854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.167980 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4kjk5" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.191004 4707 scope.go:117] "RemoveContainer" containerID="46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.201887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4kjk5"] Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.207091 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4kjk5"] Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.216552 4707 scope.go:117] "RemoveContainer" containerID="1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.225055 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.241875 4707 scope.go:117] "RemoveContainer" containerID="854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1" Jan 21 15:47:04 crc kubenswrapper[4707]: E0121 15:47:04.242686 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1\": container with ID starting with 854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1 not found: ID does not exist" containerID="854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.242736 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1"} err="failed to get container status \"854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1\": rpc error: code = NotFound desc = could not find container \"854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1\": container with ID starting with 854a4ada7a8786acef0b8535c2dd7c588b4b4a9a38b7f305ac718b7a3251d6a1 not found: ID does not exist" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.242764 4707 scope.go:117] "RemoveContainer" containerID="46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685" Jan 21 15:47:04 crc kubenswrapper[4707]: E0121 15:47:04.243179 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685\": container with ID starting with 46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685 not found: ID does not exist" containerID="46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.243205 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685"} err="failed to get container status \"46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685\": rpc error: code = NotFound desc = could not find container \"46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685\": container with ID starting with 46dc24961f98b3ac18082b068229e84b59b3e20fd4d6b62c68c7e8ae28b39685 not found: ID does not exist" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.243222 4707 scope.go:117] "RemoveContainer" 
containerID="1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7" Jan 21 15:47:04 crc kubenswrapper[4707]: E0121 15:47:04.243444 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7\": container with ID starting with 1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7 not found: ID does not exist" containerID="1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7" Jan 21 15:47:04 crc kubenswrapper[4707]: I0121 15:47:04.243474 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7"} err="failed to get container status \"1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7\": rpc error: code = NotFound desc = could not find container \"1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7\": container with ID starting with 1911c69861e7546ecfa7970bd5259cba6169013fd37ecd3f1b12f5fc996697a7 not found: ID does not exist" Jan 21 15:47:05 crc kubenswrapper[4707]: I0121 15:47:05.189609 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" path="/var/lib/kubelet/pods/0d31e627-8f6e-478e-99ac-eccb0608a7bd/volumes" Jan 21 15:47:05 crc kubenswrapper[4707]: I0121 15:47:05.260584 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bcjlb"] Jan 21 15:47:05 crc kubenswrapper[4707]: E0121 15:47:05.707823 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:05 crc kubenswrapper[4707]: E0121 15:47:05.709125 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:05 crc kubenswrapper[4707]: E0121 15:47:05.710228 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:05 crc kubenswrapper[4707]: E0121 15:47:05.710302 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:06 crc kubenswrapper[4707]: E0121 15:47:06.198068 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:47:06 crc 
kubenswrapper[4707]: I0121 15:47:06.198311 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerID="ad1135a550f9a5d2b3f5160fd4aa31b58313409af1bfcb366c84f0d82a424f8b" exitCode=137 Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.198347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760","Type":"ContainerDied","Data":"ad1135a550f9a5d2b3f5160fd4aa31b58313409af1bfcb366c84f0d82a424f8b"} Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.198506 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bcjlb" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="registry-server" containerID="cri-o://927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f" gracePeriod=2 Jan 21 15:47:06 crc kubenswrapper[4707]: E0121 15:47:06.199118 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:47:06 crc kubenswrapper[4707]: E0121 15:47:06.200041 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:47:06 crc kubenswrapper[4707]: E0121 15:47:06.200083 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.301665 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.312192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-scripts\") pod \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.312229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-combined-ca-bundle\") pod \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.312256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data-custom\") pod \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.312290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data\") pod \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.312320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-etc-machine-id\") pod \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.312403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72ps\" (UniqueName: \"kubernetes.io/projected/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-kube-api-access-n72ps\") pod \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\" (UID: \"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.313150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" (UID: "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.316527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" (UID: "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.316905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-kube-api-access-n72ps" (OuterVolumeSpecName: "kube-api-access-n72ps") pod "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" (UID: "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760"). InnerVolumeSpecName "kube-api-access-n72ps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.319443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-scripts" (OuterVolumeSpecName: "scripts") pod "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" (UID: "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.365069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" (UID: "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.367861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data" (OuterVolumeSpecName: "config-data") pod "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" (UID: "cc574d2a-944c-4ac5-8ad9-3d6c51f6c760"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.414544 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72ps\" (UniqueName: \"kubernetes.io/projected/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-kube-api-access-n72ps\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.414574 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.414584 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.414593 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.414601 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.414609 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.638643 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.681884 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/2.log" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.682322 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/1.log" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.682649 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.709842 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-ovndb-tls-certs\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-combined-ca-bundle\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-utilities\") pod \"36ac193f-5744-444f-81d6-8b785dd52ed2\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7xmj\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-kube-api-access-r7xmj\") pod \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-cache\") pod \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719880 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-httpd-config\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4kkh\" (UniqueName: 
\"kubernetes.io/projected/5f587396-a3bb-4add-bb87-d1bacd7e763e-kube-api-access-z4kkh\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-etc-swift\") pod \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-config\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.719974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-public-tls-certs\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.720021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-internal-tls-certs\") pod \"5f587396-a3bb-4add-bb87-d1bacd7e763e\" (UID: \"5f587396-a3bb-4add-bb87-d1bacd7e763e\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.720054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-catalog-content\") pod \"36ac193f-5744-444f-81d6-8b785dd52ed2\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.720097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7pvz\" (UniqueName: \"kubernetes.io/projected/36ac193f-5744-444f-81d6-8b785dd52ed2-kube-api-access-n7pvz\") pod \"36ac193f-5744-444f-81d6-8b785dd52ed2\" (UID: \"36ac193f-5744-444f-81d6-8b785dd52ed2\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.720131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-lock\") pod \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\" (UID: \"3b5fdc09-bd73-4c16-96f8-8c083e314aff\") " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.721131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-lock" (OuterVolumeSpecName: "lock") pod "3b5fdc09-bd73-4c16-96f8-8c083e314aff" (UID: "3b5fdc09-bd73-4c16-96f8-8c083e314aff"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.721356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-cache" (OuterVolumeSpecName: "cache") pod "3b5fdc09-bd73-4c16-96f8-8c083e314aff" (UID: "3b5fdc09-bd73-4c16-96f8-8c083e314aff"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.722759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.723379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-utilities" (OuterVolumeSpecName: "utilities") pod "36ac193f-5744-444f-81d6-8b785dd52ed2" (UID: "36ac193f-5744-444f-81d6-8b785dd52ed2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.723569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f587396-a3bb-4add-bb87-d1bacd7e763e-kube-api-access-z4kkh" (OuterVolumeSpecName: "kube-api-access-z4kkh") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "kube-api-access-z4kkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.724683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3b5fdc09-bd73-4c16-96f8-8c083e314aff" (UID: "3b5fdc09-bd73-4c16-96f8-8c083e314aff"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.724993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-kube-api-access-r7xmj" (OuterVolumeSpecName: "kube-api-access-r7xmj") pod "3b5fdc09-bd73-4c16-96f8-8c083e314aff" (UID: "3b5fdc09-bd73-4c16-96f8-8c083e314aff"). InnerVolumeSpecName "kube-api-access-r7xmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.725691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "3b5fdc09-bd73-4c16-96f8-8c083e314aff" (UID: "3b5fdc09-bd73-4c16-96f8-8c083e314aff"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.729316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ac193f-5744-444f-81d6-8b785dd52ed2-kube-api-access-n7pvz" (OuterVolumeSpecName: "kube-api-access-n7pvz") pod "36ac193f-5744-444f-81d6-8b785dd52ed2" (UID: "36ac193f-5744-444f-81d6-8b785dd52ed2"). InnerVolumeSpecName "kube-api-access-n7pvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.752783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.754790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-config" (OuterVolumeSpecName: "config") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.755330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.761692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36ac193f-5744-444f-81d6-8b785dd52ed2" (UID: "36ac193f-5744-444f-81d6-8b785dd52ed2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.763892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.775712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5f587396-a3bb-4add-bb87-d1bacd7e763e" (UID: "5f587396-a3bb-4add-bb87-d1bacd7e763e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.821944 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7pvz\" (UniqueName: \"kubernetes.io/projected/36ac193f-5744-444f-81d6-8b785dd52ed2-kube-api-access-n7pvz\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.821973 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.821982 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.821992 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822001 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822010 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7xmj\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-kube-api-access-r7xmj\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822018 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b5fdc09-bd73-4c16-96f8-8c083e314aff-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822046 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822054 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822063 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4kkh\" (UniqueName: \"kubernetes.io/projected/5f587396-a3bb-4add-bb87-d1bacd7e763e-kube-api-access-z4kkh\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822071 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b5fdc09-bd73-4c16-96f8-8c083e314aff-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822079 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822086 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822095 4707 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f587396-a3bb-4add-bb87-d1bacd7e763e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.822102 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36ac193f-5744-444f-81d6-8b785dd52ed2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.832756 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:47:06 crc kubenswrapper[4707]: I0121 15:47:06.923506 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.163225 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.217908 4707 generic.go:334] "Generic (PLEG): container finished" podID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerID="927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f" exitCode=0 Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.217997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerDied","Data":"927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.218627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcjlb" event={"ID":"36ac193f-5744-444f-81d6-8b785dd52ed2","Type":"ContainerDied","Data":"666eacecce05a7c03c5249e38cbef285bdc2d0ffe04ec1085fec5480db751553"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.218045 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bcjlb" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.218651 4707 scope.go:117] "RemoveContainer" containerID="927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.220008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"74717719-f653-4098-a6be-105f39bc2f0c","Type":"ContainerDied","Data":"1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.219911 4707 generic.go:334] "Generic (PLEG): container finished" podID="74717719-f653-4098-a6be-105f39bc2f0c" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" exitCode=137 Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.222444 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/2.log" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.222874 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68556f8ccd-sx4vz_5f587396-a3bb-4add-bb87-d1bacd7e763e/neutron-api/1.log" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.223122 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerID="752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186" exitCode=137 Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.223165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerDied","Data":"752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.223182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" event={"ID":"5f587396-a3bb-4add-bb87-d1bacd7e763e","Type":"ContainerDied","Data":"0f5ca1c05a8ba258481aceb5bf81924b527b4a365e2428a4dcd4b4a519e233ef"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.223294 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-68556f8ccd-sx4vz" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.226694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-log-httpd\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.226729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-config-data\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.226747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-ceilometer-tls-certs\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.226837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-scripts\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.227027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-sg-core-conf-yaml\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.227073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cr5d\" (UniqueName: \"kubernetes.io/projected/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-kube-api-access-7cr5d\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.227093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-combined-ca-bundle\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.227110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-run-httpd\") pod \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\" (UID: \"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.227570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.228521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.232841 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerID="3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841" exitCode=137 Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.232893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.232918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3b5fdc09-bd73-4c16-96f8-8c083e314aff","Type":"ContainerDied","Data":"93fbf45ca6d181a3320fa0201335a4bb2677bda3f7956f8e4b4b1f55c4531397"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.233007 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.237573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"cc574d2a-944c-4ac5-8ad9-3d6c51f6c760","Type":"ContainerDied","Data":"03a6a327db3fed2dc170c765af84ea362ed4277a832f0c862453c31dac6c0441"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.237665 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.240862 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerID="f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf" exitCode=137 Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.240885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerDied","Data":"f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.240901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2","Type":"ContainerDied","Data":"ec879e1a4890e4ba8696f3cfadad5a026adb34a24b14313a7091e3adeb407502"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.240911 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfb9505879b3bc1531075e521f5c432e0486e1366b2751cb167564231e34887f"} Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.240960 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.242948 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bcjlb"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.243737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-scripts" (OuterVolumeSpecName: "scripts") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.247699 4707 scope.go:117] "RemoveContainer" containerID="55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.247777 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bcjlb"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.251308 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-68556f8ccd-sx4vz"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.253448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-kube-api-access-7cr5d" (OuterVolumeSpecName: "kube-api-access-7cr5d") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "kube-api-access-7cr5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.255171 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-68556f8ccd-sx4vz"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.265963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.280726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.292541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.293858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-config-data" (OuterVolumeSpecName: "config-data") pod "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" (UID: "7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.312566 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.313824 4707 scope.go:117] "RemoveContainer" containerID="2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.323967 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.328411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-config-data\") pod \"74717719-f653-4098-a6be-105f39bc2f0c\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.328504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvqb\" (UniqueName: \"kubernetes.io/projected/74717719-f653-4098-a6be-105f39bc2f0c-kube-api-access-qfvqb\") pod \"74717719-f653-4098-a6be-105f39bc2f0c\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.328544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-combined-ca-bundle\") pod \"74717719-f653-4098-a6be-105f39bc2f0c\" (UID: \"74717719-f653-4098-a6be-105f39bc2f0c\") " Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.328963 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.328983 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.328992 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cr5d\" (UniqueName: \"kubernetes.io/projected/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-kube-api-access-7cr5d\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.329002 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.329010 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.329017 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.329025 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: 
I0121 15:47:07.329034 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.334260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74717719-f653-4098-a6be-105f39bc2f0c-kube-api-access-qfvqb" (OuterVolumeSpecName: "kube-api-access-qfvqb") pod "74717719-f653-4098-a6be-105f39bc2f0c" (UID: "74717719-f653-4098-a6be-105f39bc2f0c"). InnerVolumeSpecName "kube-api-access-qfvqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.338422 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.341664 4707 scope.go:117] "RemoveContainer" containerID="927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.342017 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f\": container with ID starting with 927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f not found: ID does not exist" containerID="927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.342126 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f"} err="failed to get container status \"927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f\": rpc error: code = NotFound desc = could not find container \"927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f\": container with ID starting with 927f87f1bc9c379d2a35ef415ad928f0adf2e58ef50406d6809d8fdb4bccd15f not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.342147 4707 scope.go:117] "RemoveContainer" containerID="55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.343371 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0\": container with ID starting with 55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0 not found: ID does not exist" containerID="55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.343461 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0"} err="failed to get container status \"55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0\": rpc error: code = NotFound desc = could not find container \"55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0\": container with ID starting with 55ec0ad92c27c33ecf48fd60c2b6da26df12549cc9a689d16a625686bd0c94f0 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.343526 4707 scope.go:117] "RemoveContainer" containerID="2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f" Jan 21 15:47:07 crc 
kubenswrapper[4707]: E0121 15:47:07.343831 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f\": container with ID starting with 2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f not found: ID does not exist" containerID="2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.343863 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f"} err="failed to get container status \"2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f\": rpc error: code = NotFound desc = could not find container \"2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f\": container with ID starting with 2f958b28806ef8db2944892525fbd3f409733e5328ba713bb9cb676303d5f53f not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.343882 4707 scope.go:117] "RemoveContainer" containerID="752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.344261 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.349940 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.359359 4707 scope.go:117] "RemoveContainer" containerID="564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.382261 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74717719-f653-4098-a6be-105f39bc2f0c" (UID: "74717719-f653-4098-a6be-105f39bc2f0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.384427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-config-data" (OuterVolumeSpecName: "config-data") pod "74717719-f653-4098-a6be-105f39bc2f0c" (UID: "74717719-f653-4098-a6be-105f39bc2f0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.421698 4707 scope.go:117] "RemoveContainer" containerID="93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.429876 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.429904 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfvqb\" (UniqueName: \"kubernetes.io/projected/74717719-f653-4098-a6be-105f39bc2f0c-kube-api-access-qfvqb\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.429915 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74717719-f653-4098-a6be-105f39bc2f0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.436519 4707 scope.go:117] "RemoveContainer" containerID="752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.436824 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186\": container with ID starting with 752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186 not found: ID does not exist" containerID="752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.436862 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186"} err="failed to get container status \"752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186\": rpc error: code = NotFound desc = could not find container \"752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186\": container with ID starting with 752dfa6b59cb7a6c1739723588b4873042dc585d9818f18434db5ae4cd1f2186 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.436884 4707 scope.go:117] "RemoveContainer" containerID="564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.437126 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8\": container with ID starting with 564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8 not found: ID does not exist" containerID="564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.437156 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8"} err="failed to get container status \"564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8\": rpc error: code = NotFound desc = could not find container \"564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8\": container with ID starting with 564a7337d6036ab674f9f9e6fa5ad0a0a2dfad8676f27b89d4983b30136998c8 not found: ID does not exist" Jan 21 15:47:07 crc 
kubenswrapper[4707]: I0121 15:47:07.437179 4707 scope.go:117] "RemoveContainer" containerID="93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.437391 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de\": container with ID starting with 93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de not found: ID does not exist" containerID="93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.437414 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de"} err="failed to get container status \"93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de\": rpc error: code = NotFound desc = could not find container \"93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de\": container with ID starting with 93a8719968c75e11e558eb8ebb1c990f726db86047db557906dff623675fd6de not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.437426 4707 scope.go:117] "RemoveContainer" containerID="3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.450321 4707 scope.go:117] "RemoveContainer" containerID="1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.461805 4707 scope.go:117] "RemoveContainer" containerID="ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.474760 4707 scope.go:117] "RemoveContainer" containerID="01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.487407 4707 scope.go:117] "RemoveContainer" containerID="42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.500127 4707 scope.go:117] "RemoveContainer" containerID="a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.510931 4707 scope.go:117] "RemoveContainer" containerID="ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.515607 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.516720 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.518734 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.518779 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.522517 4707 scope.go:117] "RemoveContainer" containerID="97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.537138 4707 scope.go:117] "RemoveContainer" containerID="9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.552364 4707 scope.go:117] "RemoveContainer" containerID="40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.575077 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.578308 4707 scope.go:117] "RemoveContainer" containerID="d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.579427 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.592768 4707 scope.go:117] "RemoveContainer" containerID="6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.612289 4707 scope.go:117] "RemoveContainer" containerID="83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.627234 4707 scope.go:117] "RemoveContainer" containerID="c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.640850 4707 scope.go:117] "RemoveContainer" containerID="75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.653867 4707 scope.go:117] "RemoveContainer" containerID="3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.654162 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841\": container with ID starting with 3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841 not found: ID does not exist" containerID="3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.654201 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841"} err="failed to get container status \"3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841\": rpc error: code = NotFound desc = could not find container \"3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841\": container with ID starting with 3c7f636552876621f1f0e115becbef24300be9d9d36fb1dc2d21811ef5a2e841 not found: ID does not exist" Jan 21 15:47:07 crc 
kubenswrapper[4707]: I0121 15:47:07.654262 4707 scope.go:117] "RemoveContainer" containerID="1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.654605 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333\": container with ID starting with 1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333 not found: ID does not exist" containerID="1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.654630 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333"} err="failed to get container status \"1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333\": rpc error: code = NotFound desc = could not find container \"1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333\": container with ID starting with 1d8b9ace6a38b759c04a1ea52c4f4a6560278e288e28c985a63f701b636a6333 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.654648 4707 scope.go:117] "RemoveContainer" containerID="ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.654867 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92\": container with ID starting with ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92 not found: ID does not exist" containerID="ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.654887 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92"} err="failed to get container status \"ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92\": rpc error: code = NotFound desc = could not find container \"ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92\": container with ID starting with ce73b5b94d33a80093648a5a5adce504b37a6d05f1c7e0e30116b33fe3c7de92 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.654900 4707 scope.go:117] "RemoveContainer" containerID="01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.655166 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22\": container with ID starting with 01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22 not found: ID does not exist" containerID="01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.655194 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22"} err="failed to get container status \"01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22\": rpc error: code = NotFound desc = could not find container 
\"01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22\": container with ID starting with 01b41fefd8967cf6f54e1270d6c993f5114b5c48bbb7756ab60e2a9cad54df22 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.655214 4707 scope.go:117] "RemoveContainer" containerID="42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.655475 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf\": container with ID starting with 42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf not found: ID does not exist" containerID="42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.655495 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf"} err="failed to get container status \"42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf\": rpc error: code = NotFound desc = could not find container \"42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf\": container with ID starting with 42f46ac6655bd8e24d541e0a5c19a4ac3a291436d7491e6aac20fcbd637c1baf not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.655509 4707 scope.go:117] "RemoveContainer" containerID="a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.655725 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f\": container with ID starting with a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f not found: ID does not exist" containerID="a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.655746 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f"} err="failed to get container status \"a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f\": rpc error: code = NotFound desc = could not find container \"a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f\": container with ID starting with a2e37919a0bedeb30df1b5534458029999d18314a9d41f18236739245b7fde4f not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.655758 4707 scope.go:117] "RemoveContainer" containerID="ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.655999 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722\": container with ID starting with ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722 not found: ID does not exist" containerID="ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656024 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722"} 
err="failed to get container status \"ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722\": rpc error: code = NotFound desc = could not find container \"ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722\": container with ID starting with ed03d87bcb942a8c94b405ff5ee805b3ca2abc5e8869055cc032bf20b3f5a722 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656040 4707 scope.go:117] "RemoveContainer" containerID="97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.656259 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa\": container with ID starting with 97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa not found: ID does not exist" containerID="97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa"} err="failed to get container status \"97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa\": rpc error: code = NotFound desc = could not find container \"97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa\": container with ID starting with 97a3b0434531cb47ec2732767c3a093e8f8c002458a5e8b1051b0d25417392fa not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656340 4707 scope.go:117] "RemoveContainer" containerID="9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.656526 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724\": container with ID starting with 9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724 not found: ID does not exist" containerID="9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656570 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724"} err="failed to get container status \"9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724\": rpc error: code = NotFound desc = could not find container \"9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724\": container with ID starting with 9c075075fab98a82d9a1b5a13ee2011a4de1d21266e0d712ef58555ace289724 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656583 4707 scope.go:117] "RemoveContainer" containerID="40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.656829 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4\": container with ID starting with 40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4 not found: ID does not exist" containerID="40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656858 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4"} err="failed to get container status \"40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4\": rpc error: code = NotFound desc = could not find container \"40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4\": container with ID starting with 40e8822157eb946d970a1c959925cf0d846464c1f6192cf55c1a4fd4c7e796e4 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.656880 4707 scope.go:117] "RemoveContainer" containerID="d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.657083 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454\": container with ID starting with d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454 not found: ID does not exist" containerID="d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657101 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454"} err="failed to get container status \"d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454\": rpc error: code = NotFound desc = could not find container \"d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454\": container with ID starting with d06fa8394ebe0efa6897c2fe3d010939fa893b052721bd236094670c70c07454 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657112 4707 scope.go:117] "RemoveContainer" containerID="6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.657296 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242\": container with ID starting with 6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242 not found: ID does not exist" containerID="6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657315 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242"} err="failed to get container status \"6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242\": rpc error: code = NotFound desc = could not find container \"6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242\": container with ID starting with 6d5b647254b3727b4d9408d88a1e264c09198222d22daa451651811319635242 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657327 4707 scope.go:117] "RemoveContainer" containerID="83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.657498 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84\": container with ID starting with 83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84 not found: ID does 
not exist" containerID="83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657538 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84"} err="failed to get container status \"83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84\": rpc error: code = NotFound desc = could not find container \"83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84\": container with ID starting with 83283d0b4e0283981d33c6fffc3accc47fb6d4ba1b133481cba57c7d86068c84 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657551 4707 scope.go:117] "RemoveContainer" containerID="c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.657772 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea\": container with ID starting with c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea not found: ID does not exist" containerID="c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657793 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea"} err="failed to get container status \"c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea\": rpc error: code = NotFound desc = could not find container \"c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea\": container with ID starting with c6a5b2bbf93ab5c7fbdf483d2d0eb751ac5e48e659af09c7594b8d9c9f339cea not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.657825 4707 scope.go:117] "RemoveContainer" containerID="75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.658015 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04\": container with ID starting with 75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04 not found: ID does not exist" containerID="75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.658034 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04"} err="failed to get container status \"75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04\": rpc error: code = NotFound desc = could not find container \"75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04\": container with ID starting with 75efadb7d88c6e2e385d6687770f8fe99e6a9f72e86131ee8ddeb56505da4e04 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.658049 4707 scope.go:117] "RemoveContainer" containerID="ce810f5016c6b84539b5638a579a70bdd684769502c29f783c133ce65a806987" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.671523 4707 scope.go:117] "RemoveContainer" containerID="ad1135a550f9a5d2b3f5160fd4aa31b58313409af1bfcb366c84f0d82a424f8b" Jan 21 15:47:07 crc 
kubenswrapper[4707]: I0121 15:47:07.685737 4707 scope.go:117] "RemoveContainer" containerID="002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.704915 4707 scope.go:117] "RemoveContainer" containerID="e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.717728 4707 scope.go:117] "RemoveContainer" containerID="f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.732526 4707 scope.go:117] "RemoveContainer" containerID="bfb9505879b3bc1531075e521f5c432e0486e1366b2751cb167564231e34887f" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.747071 4707 scope.go:117] "RemoveContainer" containerID="002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.747411 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5\": container with ID starting with 002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5 not found: ID does not exist" containerID="002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.747443 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5"} err="failed to get container status \"002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5\": rpc error: code = NotFound desc = could not find container \"002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5\": container with ID starting with 002fdcbda91840b6cae7752f36129afe7509d22e4ed6511e502224b87727afd5 not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.747461 4707 scope.go:117] "RemoveContainer" containerID="e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.747895 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d\": container with ID starting with e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d not found: ID does not exist" containerID="e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.747942 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d"} err="failed to get container status \"e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d\": rpc error: code = NotFound desc = could not find container \"e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d\": container with ID starting with e3074b1b9ad0b857cee8648f1213757c9a25160d8c21fe1c278f4e614420a41d not found: ID does not exist" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.747961 4707 scope.go:117] "RemoveContainer" containerID="f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf" Jan 21 15:47:07 crc kubenswrapper[4707]: E0121 15:47:07.748196 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf\": container with ID starting with f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf not found: ID does not exist" containerID="f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf" Jan 21 15:47:07 crc kubenswrapper[4707]: I0121 15:47:07.748239 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf"} err="failed to get container status \"f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf\": rpc error: code = NotFound desc = could not find container \"f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf\": container with ID starting with f335343ac6068569490c87ad8e50dea4de3ea803d663686d949a04b355ebbccf not found: ID does not exist" Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.258221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"74717719-f653-4098-a6be-105f39bc2f0c","Type":"ContainerDied","Data":"4203385eb6fc2d532ddb55a1b7d03a78c797579e62951ddd604fdd33190b150b"} Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.258258 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.258281 4707 scope.go:117] "RemoveContainer" containerID="1d19076e1312eb4a5df67a037bdfa8aa300a81a4e0c883310be5be63e6f1d8f8" Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.282726 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.290892 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.837793 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.948958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-operator-scripts\") pod \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.949007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcq74\" (UniqueName: \"kubernetes.io/projected/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-kube-api-access-pcq74\") pod \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\" (UID: \"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41\") " Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.949609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" (UID: "d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.950243 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:08 crc kubenswrapper[4707]: I0121 15:47:08.953400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-kube-api-access-pcq74" (OuterVolumeSpecName: "kube-api-access-pcq74") pod "d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" (UID: "d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41"). InnerVolumeSpecName "kube-api-access-pcq74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.051596 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcq74\" (UniqueName: \"kubernetes.io/projected/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41-kube-api-access-pcq74\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.189255 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" path="/var/lib/kubelet/pods/36ac193f-5744-444f-81d6-8b785dd52ed2/volumes" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.190215 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" path="/var/lib/kubelet/pods/3b5fdc09-bd73-4c16-96f8-8c083e314aff/volumes" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.191885 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" path="/var/lib/kubelet/pods/5f587396-a3bb-4add-bb87-d1bacd7e763e/volumes" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.192762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74717719-f653-4098-a6be-105f39bc2f0c" path="/var/lib/kubelet/pods/74717719-f653-4098-a6be-105f39bc2f0c/volumes" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.193234 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" path="/var/lib/kubelet/pods/7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2/volumes" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.193863 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" path="/var/lib/kubelet/pods/cc574d2a-944c-4ac5-8ad9-3d6c51f6c760/volumes" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.265760 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" containerID="ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771" exitCode=137 Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.265794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" event={"ID":"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41","Type":"ContainerDied","Data":"ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771"} Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.265825 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.265833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-sppjh" event={"ID":"d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41","Type":"ContainerDied","Data":"dc088fa739d7ce5615481ee552f800e908e469a51345f767c2cc3947e35b42c3"} Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.265850 4707 scope.go:117] "RemoveContainer" containerID="ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.278375 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-sppjh"] Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.283723 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-sppjh"] Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.283981 4707 scope.go:117] "RemoveContainer" containerID="ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771" Jan 21 15:47:09 crc kubenswrapper[4707]: E0121 15:47:09.284257 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771\": container with ID starting with ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771 not found: ID does not exist" containerID="ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771" Jan 21 15:47:09 crc kubenswrapper[4707]: I0121 15:47:09.284321 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771"} err="failed to get container status \"ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771\": rpc error: code = NotFound desc = could not find container \"ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771\": container with ID starting with ca1b1aa2d612210b9299de2f5f2f17d670eedf709eba8cd08753808358b41771 not found: ID does not exist" Jan 21 15:47:10 crc kubenswrapper[4707]: E0121 15:47:10.707126 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:10 crc kubenswrapper[4707]: E0121 15:47:10.710039 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:10 crc kubenswrapper[4707]: E0121 15:47:10.711173 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:10 crc kubenswrapper[4707]: E0121 15:47:10.711228 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:11 crc kubenswrapper[4707]: I0121 15:47:11.188749 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" path="/var/lib/kubelet/pods/d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41/volumes" Jan 21 15:47:11 crc kubenswrapper[4707]: I0121 15:47:11.508449 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-667579cbdb-pbr2f" podUID="26893235-a33a-4211-b02e-b4237c0586c5" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.80:5000/v3\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:47:12 crc kubenswrapper[4707]: E0121 15:47:12.516082 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:12 crc kubenswrapper[4707]: E0121 15:47:12.517013 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:12 crc kubenswrapper[4707]: E0121 15:47:12.517903 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:12 crc kubenswrapper[4707]: E0121 15:47:12.517937 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:47:15 crc kubenswrapper[4707]: E0121 15:47:15.707148 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f is running failed: container process not found" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:15 crc kubenswrapper[4707]: E0121 15:47:15.707626 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f is running failed: container process not found" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:15 crc kubenswrapper[4707]: E0121 15:47:15.707793 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f is running failed: container process not found" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:47:15 crc kubenswrapper[4707]: E0121 15:47:15.707847 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.785553 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.849128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-config-data\") pod \"4505129f-9175-43df-9d92-cbb8fba50ea3\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.849216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle\") pod \"4505129f-9175-43df-9d92-cbb8fba50ea3\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.849248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sdv2\" (UniqueName: \"kubernetes.io/projected/4505129f-9175-43df-9d92-cbb8fba50ea3-kube-api-access-8sdv2\") pod \"4505129f-9175-43df-9d92-cbb8fba50ea3\" (UID: \"4505129f-9175-43df-9d92-cbb8fba50ea3\") " Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.853204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4505129f-9175-43df-9d92-cbb8fba50ea3-kube-api-access-8sdv2" (OuterVolumeSpecName: "kube-api-access-8sdv2") pod "4505129f-9175-43df-9d92-cbb8fba50ea3" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3"). InnerVolumeSpecName "kube-api-access-8sdv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.853604 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.867316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4505129f-9175-43df-9d92-cbb8fba50ea3" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.868193 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-config-data" (OuterVolumeSpecName: "config-data") pod "4505129f-9175-43df-9d92-cbb8fba50ea3" (UID: "4505129f-9175-43df-9d92-cbb8fba50ea3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.950289 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-config-data\") pod \"981e49ca-880c-4002-8c7d-d7bdfb867b38\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.950351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzch\" (UniqueName: \"kubernetes.io/projected/981e49ca-880c-4002-8c7d-d7bdfb867b38-kube-api-access-5xzch\") pod \"981e49ca-880c-4002-8c7d-d7bdfb867b38\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.950390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-combined-ca-bundle\") pod \"981e49ca-880c-4002-8c7d-d7bdfb867b38\" (UID: \"981e49ca-880c-4002-8c7d-d7bdfb867b38\") " Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.950647 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.950665 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4505129f-9175-43df-9d92-cbb8fba50ea3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.950675 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sdv2\" (UniqueName: \"kubernetes.io/projected/4505129f-9175-43df-9d92-cbb8fba50ea3-kube-api-access-8sdv2\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.952892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981e49ca-880c-4002-8c7d-d7bdfb867b38-kube-api-access-5xzch" (OuterVolumeSpecName: "kube-api-access-5xzch") pod "981e49ca-880c-4002-8c7d-d7bdfb867b38" (UID: "981e49ca-880c-4002-8c7d-d7bdfb867b38"). InnerVolumeSpecName "kube-api-access-5xzch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.963995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-config-data" (OuterVolumeSpecName: "config-data") pod "981e49ca-880c-4002-8c7d-d7bdfb867b38" (UID: "981e49ca-880c-4002-8c7d-d7bdfb867b38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.964651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "981e49ca-880c-4002-8c7d-d7bdfb867b38" (UID: "981e49ca-880c-4002-8c7d-d7bdfb867b38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.984825 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tm9xs"] Jan 21 15:47:15 crc kubenswrapper[4707]: I0121 15:47:15.989124 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tm9xs"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.052200 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.052410 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzch\" (UniqueName: \"kubernetes.io/projected/981e49ca-880c-4002-8c7d-d7bdfb867b38-kube-api-access-5xzch\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.052421 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e49ca-880c-4002-8c7d-d7bdfb867b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.083858 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ww8jl"] Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084065 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="rsync" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084080 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="rsync" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084090 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084096 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084106 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-central-agent" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084111 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-central-agent" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084119 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-updater" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084124 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-updater" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084135 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084140 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084146 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74717719-f653-4098-a6be-105f39bc2f0c" 
containerName="nova-scheduler-scheduler" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084151 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084158 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084163 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084173 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-updater" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084178 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-updater" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084187 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084193 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084199 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084205 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084213 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="swift-recon-cron" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="swift-recon-cron" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084228 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" containerName="mariadb-database-create" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084233 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" containerName="mariadb-database-create" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084242 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084247 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-server" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084254 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-expirer" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084259 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-expirer" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084265 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="sg-core" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084282 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="sg-core" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084292 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084298 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084305 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084311 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084316 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="cinder-scheduler" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084321 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="cinder-scheduler" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084328 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="probe" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084332 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="probe" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084340 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="registry-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084345 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="registry-server" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084352 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="extract-content" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084358 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="extract-content" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084364 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="extract-utilities" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084370 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="extract-utilities" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084377 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084383 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-server" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084390 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084395 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-server" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084401 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="extract-utilities" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084407 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="extract-utilities" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084411 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="extract-content" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="extract-content" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084425 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084430 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084440 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="proxy-httpd" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084445 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="proxy-httpd" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084453 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084458 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084467 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084477 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="registry-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084494 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="registry-server" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084502 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-reaper" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-reaper" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.084515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-notification-agent" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084520 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-notification-agent" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084620 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d31e627-8f6e-478e-99ac-eccb0608a7bd" containerName="registry-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084634 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerName="nova-cell0-conductor-conductor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084641 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-central-agent" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084654 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-updater" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084662 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="sg-core" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084669 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ac193f-5744-444f-81d6-8b785dd52ed2" containerName="registry-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084677 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="74717719-f653-4098-a6be-105f39bc2f0c" containerName="nova-scheduler-scheduler" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084683 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="ceilometer-notification-agent" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6dc2806-fe21-4ea9-8c2e-b2bf3e7b1c41" containerName="mariadb-database-create" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084699 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084705 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084712 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="rsync" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084721 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-auditor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084727 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084733 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="probe" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084742 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="swift-recon-cron" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbc14e1-6cf0-45a0-a87f-7f64b2d122f2" containerName="proxy-httpd" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-httpd" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084772 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-expirer" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084785 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-updater" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084793 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc574d2a-944c-4ac5-8ad9-3d6c51f6c760" containerName="cinder-scheduler" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084800 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="account-reaper" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084826 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="container-replicator" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084833 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerName="nova-cell1-conductor-conductor" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.084842 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5fdc09-bd73-4c16-96f8-8c083e314aff" containerName="object-server" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.085207 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.086793 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.087053 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.087229 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.089764 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.092048 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ww8jl"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.153016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-crc-storage\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.153049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-node-mnt\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.153072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2vg\" (UniqueName: \"kubernetes.io/projected/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-kube-api-access-hx2vg\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.254159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-crc-storage\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.254188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-node-mnt\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.254227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2vg\" (UniqueName: \"kubernetes.io/projected/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-kube-api-access-hx2vg\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.254637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-node-mnt\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " 
pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.254740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-crc-storage\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.267206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2vg\" (UniqueName: \"kubernetes.io/projected/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-kube-api-access-hx2vg\") pod \"crc-storage-crc-ww8jl\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.307042 4707 generic.go:334] "Generic (PLEG): container finished" podID="4505129f-9175-43df-9d92-cbb8fba50ea3" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" exitCode=137 Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.307094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4505129f-9175-43df-9d92-cbb8fba50ea3","Type":"ContainerDied","Data":"dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f"} Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.307118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4505129f-9175-43df-9d92-cbb8fba50ea3","Type":"ContainerDied","Data":"0c9fc690a9be2b1dbecac900a76a7bc0d01e9dddaa7c25360b90f767712378b6"} Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.307131 4707 scope.go:117] "RemoveContainer" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.307217 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.309191 4707 generic.go:334] "Generic (PLEG): container finished" podID="981e49ca-880c-4002-8c7d-d7bdfb867b38" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" exitCode=137 Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.309225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"981e49ca-880c-4002-8c7d-d7bdfb867b38","Type":"ContainerDied","Data":"42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff"} Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.309248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"981e49ca-880c-4002-8c7d-d7bdfb867b38","Type":"ContainerDied","Data":"dfcd00e11c4d0433fa251d2578cc79fae120b067d19aa27a2670c158694d6f47"} Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.309299 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.323549 4707 scope.go:117] "RemoveContainer" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.323963 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f\": container with ID starting with dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f not found: ID does not exist" containerID="dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.324094 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f"} err="failed to get container status \"dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f\": rpc error: code = NotFound desc = could not find container \"dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f\": container with ID starting with dbad78c5e187cb5a1afcd070f32ca0aa35881e8b987fcd601b1571c4d77a625f not found: ID does not exist" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.324165 4707 scope.go:117] "RemoveContainer" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.333798 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.338949 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.343008 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.345307 4707 scope.go:117] "RemoveContainer" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" Jan 21 15:47:16 crc kubenswrapper[4707]: E0121 15:47:16.345598 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff\": container with ID starting with 42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff not found: ID does not exist" containerID="42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.345624 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff"} err="failed to get container status \"42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff\": rpc error: code = NotFound desc = could not find container \"42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff\": container with ID starting with 42509bf2bf54f357931b3ed98076c8afddbbcc27a57dd329ae48858d24ec70ff not found: ID does not exist" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.347825 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.399390 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.770562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ww8jl"] Jan 21 15:47:16 crc kubenswrapper[4707]: I0121 15:47:16.776931 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:47:17 crc kubenswrapper[4707]: I0121 15:47:17.189348 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4505129f-9175-43df-9d92-cbb8fba50ea3" path="/var/lib/kubelet/pods/4505129f-9175-43df-9d92-cbb8fba50ea3/volumes" Jan 21 15:47:17 crc kubenswrapper[4707]: I0121 15:47:17.190144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f049514-c7cc-473c-a107-3a823dc670ba" path="/var/lib/kubelet/pods/6f049514-c7cc-473c-a107-3a823dc670ba/volumes" Jan 21 15:47:17 crc kubenswrapper[4707]: I0121 15:47:17.190796 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="981e49ca-880c-4002-8c7d-d7bdfb867b38" path="/var/lib/kubelet/pods/981e49ca-880c-4002-8c7d-d7bdfb867b38/volumes" Jan 21 15:47:17 crc kubenswrapper[4707]: I0121 15:47:17.318308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ww8jl" event={"ID":"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c","Type":"ContainerStarted","Data":"dfab706950bd5845152b8c87dc9a920a9772f4cd8c94fad66b90d11491b55574"} Jan 21 15:47:18 crc kubenswrapper[4707]: I0121 15:47:18.325889 4707 generic.go:334] "Generic (PLEG): container finished" podID="8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" containerID="8de232c69091a8ebfbbd5841be04e33e23e6bb682ea20486d7e76dbf7a6b5887" exitCode=0 Jan 21 15:47:18 crc kubenswrapper[4707]: I0121 15:47:18.325939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ww8jl" event={"ID":"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c","Type":"ContainerDied","Data":"8de232c69091a8ebfbbd5841be04e33e23e6bb682ea20486d7e76dbf7a6b5887"} Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.563155 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.696079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-crc-storage\") pod \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.696246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-node-mnt\") pod \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.696333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx2vg\" (UniqueName: \"kubernetes.io/projected/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-kube-api-access-hx2vg\") pod \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\" (UID: \"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c\") " Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.696370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" (UID: "8b4de435-6d8b-4a36-b65d-a2b7a09dea4c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.696627 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.700276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-kube-api-access-hx2vg" (OuterVolumeSpecName: "kube-api-access-hx2vg") pod "8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" (UID: "8b4de435-6d8b-4a36-b65d-a2b7a09dea4c"). InnerVolumeSpecName "kube-api-access-hx2vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.710202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" (UID: "8b4de435-6d8b-4a36-b65d-a2b7a09dea4c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.797184 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:19 crc kubenswrapper[4707]: I0121 15:47:19.797208 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx2vg\" (UniqueName: \"kubernetes.io/projected/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c-kube-api-access-hx2vg\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:20 crc kubenswrapper[4707]: I0121 15:47:20.338538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ww8jl" event={"ID":"8b4de435-6d8b-4a36-b65d-a2b7a09dea4c","Type":"ContainerDied","Data":"dfab706950bd5845152b8c87dc9a920a9772f4cd8c94fad66b90d11491b55574"} Jan 21 15:47:20 crc kubenswrapper[4707]: I0121 15:47:20.338575 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfab706950bd5845152b8c87dc9a920a9772f4cd8c94fad66b90d11491b55574" Jan 21 15:47:20 crc kubenswrapper[4707]: I0121 15:47:20.338581 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ww8jl" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.522320 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ww8jl"] Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.525853 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ww8jl"] Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.636783 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-fsnlz"] Jan 21 15:47:22 crc kubenswrapper[4707]: E0121 15:47:22.637111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.637128 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:22 crc kubenswrapper[4707]: E0121 15:47:22.637157 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" containerName="storage" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.637163 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" containerName="storage" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.637280 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" containerName="storage" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.637302 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f587396-a3bb-4add-bb87-d1bacd7e763e" containerName="neutron-api" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.637752 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.639570 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.639821 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.640104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.640378 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.640957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fsnlz"] Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.735415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/78e09300-dc96-4d0c-a1ec-9d93064430dc-node-mnt\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.735458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxl7r\" (UniqueName: \"kubernetes.io/projected/78e09300-dc96-4d0c-a1ec-9d93064430dc-kube-api-access-mxl7r\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.735539 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/78e09300-dc96-4d0c-a1ec-9d93064430dc-crc-storage\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.836132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/78e09300-dc96-4d0c-a1ec-9d93064430dc-crc-storage\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.836206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/78e09300-dc96-4d0c-a1ec-9d93064430dc-node-mnt\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.836234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl7r\" (UniqueName: \"kubernetes.io/projected/78e09300-dc96-4d0c-a1ec-9d93064430dc-kube-api-access-mxl7r\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.836650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/78e09300-dc96-4d0c-a1ec-9d93064430dc-node-mnt\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " 
pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.836766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/78e09300-dc96-4d0c-a1ec-9d93064430dc-crc-storage\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.850894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl7r\" (UniqueName: \"kubernetes.io/projected/78e09300-dc96-4d0c-a1ec-9d93064430dc-kube-api-access-mxl7r\") pod \"crc-storage-crc-fsnlz\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:22 crc kubenswrapper[4707]: I0121 15:47:22.951499 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:23 crc kubenswrapper[4707]: I0121 15:47:23.190497 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b4de435-6d8b-4a36-b65d-a2b7a09dea4c" path="/var/lib/kubelet/pods/8b4de435-6d8b-4a36-b65d-a2b7a09dea4c/volumes" Jan 21 15:47:23 crc kubenswrapper[4707]: I0121 15:47:23.304431 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fsnlz"] Jan 21 15:47:23 crc kubenswrapper[4707]: I0121 15:47:23.356177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fsnlz" event={"ID":"78e09300-dc96-4d0c-a1ec-9d93064430dc","Type":"ContainerStarted","Data":"fb0bb24592de165a237462191caeb6b5fc12bcaded451d9cfd85497ef447b7c3"} Jan 21 15:47:24 crc kubenswrapper[4707]: I0121 15:47:24.362586 4707 generic.go:334] "Generic (PLEG): container finished" podID="78e09300-dc96-4d0c-a1ec-9d93064430dc" containerID="ada1d30de6c406ca7003129e9e09083afedc3ca70ab25f4fd5dd2ddc1aac0321" exitCode=0 Jan 21 15:47:24 crc kubenswrapper[4707]: I0121 15:47:24.362633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fsnlz" event={"ID":"78e09300-dc96-4d0c-a1ec-9d93064430dc","Type":"ContainerDied","Data":"ada1d30de6c406ca7003129e9e09083afedc3ca70ab25f4fd5dd2ddc1aac0321"} Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.598220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.767893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/78e09300-dc96-4d0c-a1ec-9d93064430dc-node-mnt\") pod \"78e09300-dc96-4d0c-a1ec-9d93064430dc\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.767955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e09300-dc96-4d0c-a1ec-9d93064430dc-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "78e09300-dc96-4d0c-a1ec-9d93064430dc" (UID: "78e09300-dc96-4d0c-a1ec-9d93064430dc"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.768020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxl7r\" (UniqueName: \"kubernetes.io/projected/78e09300-dc96-4d0c-a1ec-9d93064430dc-kube-api-access-mxl7r\") pod \"78e09300-dc96-4d0c-a1ec-9d93064430dc\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.768122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/78e09300-dc96-4d0c-a1ec-9d93064430dc-crc-storage\") pod \"78e09300-dc96-4d0c-a1ec-9d93064430dc\" (UID: \"78e09300-dc96-4d0c-a1ec-9d93064430dc\") " Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.768401 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/78e09300-dc96-4d0c-a1ec-9d93064430dc-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.772248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e09300-dc96-4d0c-a1ec-9d93064430dc-kube-api-access-mxl7r" (OuterVolumeSpecName: "kube-api-access-mxl7r") pod "78e09300-dc96-4d0c-a1ec-9d93064430dc" (UID: "78e09300-dc96-4d0c-a1ec-9d93064430dc"). InnerVolumeSpecName "kube-api-access-mxl7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.782484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e09300-dc96-4d0c-a1ec-9d93064430dc-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "78e09300-dc96-4d0c-a1ec-9d93064430dc" (UID: "78e09300-dc96-4d0c-a1ec-9d93064430dc"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.869236 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/78e09300-dc96-4d0c-a1ec-9d93064430dc-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:25 crc kubenswrapper[4707]: I0121 15:47:25.869260 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxl7r\" (UniqueName: \"kubernetes.io/projected/78e09300-dc96-4d0c-a1ec-9d93064430dc-kube-api-access-mxl7r\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:26 crc kubenswrapper[4707]: I0121 15:47:26.376733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fsnlz" event={"ID":"78e09300-dc96-4d0c-a1ec-9d93064430dc","Type":"ContainerDied","Data":"fb0bb24592de165a237462191caeb6b5fc12bcaded451d9cfd85497ef447b7c3"} Jan 21 15:47:26 crc kubenswrapper[4707]: I0121 15:47:26.376793 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fsnlz" Jan 21 15:47:26 crc kubenswrapper[4707]: I0121 15:47:26.376794 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb0bb24592de165a237462191caeb6b5fc12bcaded451d9cfd85497ef447b7c3" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.013446 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:47:37 crc kubenswrapper[4707]: E0121 15:47:37.014042 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e09300-dc96-4d0c-a1ec-9d93064430dc" containerName="storage" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.014054 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e09300-dc96-4d0c-a1ec-9d93064430dc" containerName="storage" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.014174 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e09300-dc96-4d0c-a1ec-9d93064430dc" containerName="storage" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.014784 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.015796 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.016103 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.016158 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.016498 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.016672 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-lz57z" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.023460 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bm4k\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-kube-api-access-2bm4k\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106491 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bde702b1-7406-4751-aaf5-026fb8e348c4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bde702b1-7406-4751-aaf5-026fb8e348c4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.106607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.207714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.207944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2bm4k\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-kube-api-access-2bm4k\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.209186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.209875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.209969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bde702b1-7406-4751-aaf5-026fb8e348c4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.210047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bde702b1-7406-4751-aaf5-026fb8e348c4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.210137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208383 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208382 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.208757 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.209138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.211308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.213749 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.214665 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.214871 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-skkb6" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.214981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.215461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bde702b1-7406-4751-aaf5-026fb8e348c4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.215542 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.214048 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.215922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bde702b1-7406-4751-aaf5-026fb8e348c4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.226675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bm4k\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-kube-api-access-2bm4k\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.231856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.231985 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.311707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.311773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.311816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.311872 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e89388a-81da-49c6-a6fe-796c924f2822-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.311904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.311942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.312086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.312151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e89388a-81da-49c6-a6fe-796c924f2822-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 
15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.312176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjz2b\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-kube-api-access-xjz2b\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.328842 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.413742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.414779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.414936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e89388a-81da-49c6-a6fe-796c924f2822-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.414994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjz2b\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-kube-api-access-xjz2b\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.415084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.415147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.415187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.415370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3e89388a-81da-49c6-a6fe-796c924f2822-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.415432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.416446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.416839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.417121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.417576 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.419037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.419574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.423415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e89388a-81da-49c6-a6fe-796c924f2822-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.423501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3e89388a-81da-49c6-a6fe-796c924f2822-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.436998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjz2b\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-kube-api-access-xjz2b\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.463193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.569169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.760837 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:47:37 crc kubenswrapper[4707]: I0121 15:47:37.927452 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:47:37 crc kubenswrapper[4707]: W0121 15:47:37.929842 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e89388a_81da_49c6_a6fe_796c924f2822.slice/crio-da99c640f6e593c8350fca1de94ea9f407db99529e30967850932ab9de53d442 WatchSource:0}: Error finding container da99c640f6e593c8350fca1de94ea9f407db99529e30967850932ab9de53d442: Status 404 returned error can't find the container with id da99c640f6e593c8350fca1de94ea9f407db99529e30967850932ab9de53d442 Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.465443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"bde702b1-7406-4751-aaf5-026fb8e348c4","Type":"ContainerStarted","Data":"43f7db9043a85994b4e96b1be2333a30db9afca105a651569a43385dfd162291"} Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.465662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"bde702b1-7406-4751-aaf5-026fb8e348c4","Type":"ContainerStarted","Data":"4f09c68857037ce3681852e51339608ccf1b2abe72aebfbf73e53925276cc9cb"} Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.466414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"3e89388a-81da-49c6-a6fe-796c924f2822","Type":"ContainerStarted","Data":"da99c640f6e593c8350fca1de94ea9f407db99529e30967850932ab9de53d442"} Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.506921 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.507859 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.509828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-m7p4k" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.510118 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.510148 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.510312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.516126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.517437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.630910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-default\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.630957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbgq\" (UniqueName: \"kubernetes.io/projected/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kube-api-access-rnbgq\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.631034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kolla-config\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.631119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.631185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.631205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.631373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-operator-scripts\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.631440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-generated\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-default\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnbgq\" (UniqueName: \"kubernetes.io/projected/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kube-api-access-rnbgq\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kolla-config\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-operator-scripts\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732303 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-generated\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.732741 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.733084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-default\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.733116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kolla-config\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.733217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-generated\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.733883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-operator-scripts\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.735727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.736314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.745218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbgq\" (UniqueName: \"kubernetes.io/projected/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kube-api-access-rnbgq\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.757147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.821435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.970166 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.971110 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.972895 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-2x7h6" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.972895 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:47:38 crc kubenswrapper[4707]: I0121 15:47:38.976921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.036707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kolla-config\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.036751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-config-data\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.036778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxhn\" (UniqueName: \"kubernetes.io/projected/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kube-api-access-mrxhn\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.137668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kolla-config\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.137725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-config-data\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.137761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxhn\" (UniqueName: \"kubernetes.io/projected/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kube-api-access-mrxhn\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 
15:47:39.138442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kolla-config\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.138459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-config-data\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.156281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxhn\" (UniqueName: \"kubernetes.io/projected/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kube-api-access-mrxhn\") pod \"memcached-0\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.205723 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:47:39 crc kubenswrapper[4707]: W0121 15:47:39.209663 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67bae98e_ed83_41c8_be5a_15eae8c9aa48.slice/crio-fc3f686d8b24ab0d62278743523c1bd2017f0bb758567c6c1dcd13c6cde48512 WatchSource:0}: Error finding container fc3f686d8b24ab0d62278743523c1bd2017f0bb758567c6c1dcd13c6cde48512: Status 404 returned error can't find the container with id fc3f686d8b24ab0d62278743523c1bd2017f0bb758567c6c1dcd13c6cde48512 Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.284842 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.474655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"3e89388a-81da-49c6-a6fe-796c924f2822","Type":"ContainerStarted","Data":"df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e"} Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.476904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"67bae98e-ed83-41c8-be5a-15eae8c9aa48","Type":"ContainerStarted","Data":"3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749"} Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.476939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"67bae98e-ed83-41c8-be5a-15eae8c9aa48","Type":"ContainerStarted","Data":"fc3f686d8b24ab0d62278743523c1bd2017f0bb758567c6c1dcd13c6cde48512"} Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.568015 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.568860 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.570113 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-29tmn" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.574757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.726426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:47:39 crc kubenswrapper[4707]: W0121 15:47:39.729449 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod373d78b4_4328_42ce_8d54_6fb1c0ecefdb.slice/crio-f3e95752ba008674a53dfa7bff12177e77f4db0814839b702392ea7f471a69f8 WatchSource:0}: Error finding container f3e95752ba008674a53dfa7bff12177e77f4db0814839b702392ea7f471a69f8: Status 404 returned error can't find the container with id f3e95752ba008674a53dfa7bff12177e77f4db0814839b702392ea7f471a69f8 Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.745968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlw26\" (UniqueName: \"kubernetes.io/projected/165b85c7-02ce-4c3f-9433-60c84b3307fd-kube-api-access-mlw26\") pod \"kube-state-metrics-0\" (UID: \"165b85c7-02ce-4c3f-9433-60c84b3307fd\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.847369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlw26\" (UniqueName: \"kubernetes.io/projected/165b85c7-02ce-4c3f-9433-60c84b3307fd-kube-api-access-mlw26\") pod \"kube-state-metrics-0\" (UID: \"165b85c7-02ce-4c3f-9433-60c84b3307fd\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.861534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlw26\" (UniqueName: \"kubernetes.io/projected/165b85c7-02ce-4c3f-9433-60c84b3307fd-kube-api-access-mlw26\") pod \"kube-state-metrics-0\" (UID: \"165b85c7-02ce-4c3f-9433-60c84b3307fd\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:39 crc kubenswrapper[4707]: I0121 15:47:39.881705 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.141017 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.142420 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.144253 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-dr5v9" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.145510 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.145563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.147931 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.147983 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.232667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: W0121 15:47:40.234231 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165b85c7_02ce_4c3f_9433_60c84b3307fd.slice/crio-1f3e9d0b061890d387e576c08d067ada58f59b62898573f4e3c9a1d53dbbad45 WatchSource:0}: Error finding container 1f3e9d0b061890d387e576c08d067ada58f59b62898573f4e3c9a1d53dbbad45: Status 404 returned error can't find the container with id 1f3e9d0b061890d387e576c08d067ada58f59b62898573f4e3c9a1d53dbbad45 Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjsl\" (UniqueName: 
\"kubernetes.io/projected/9ab52595-1681-409f-b9fd-f806a0e07117-kube-api-access-7bjsl\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.252571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.353780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.353849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.353911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.353966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.353986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.354023 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.354058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.354088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjsl\" (UniqueName: \"kubernetes.io/projected/9ab52595-1681-409f-b9fd-f806a0e07117-kube-api-access-7bjsl\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.355036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.355173 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.355203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.355396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.355978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.358151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.358589 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.367116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjsl\" (UniqueName: \"kubernetes.io/projected/9ab52595-1681-409f-b9fd-f806a0e07117-kube-api-access-7bjsl\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.370059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.382574 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.383632 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.385394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.385471 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.386189 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.393084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.454588 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.482837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"165b85c7-02ce-4c3f-9433-60c84b3307fd","Type":"ContainerStarted","Data":"1f3e9d0b061890d387e576c08d067ada58f59b62898573f4e3c9a1d53dbbad45"} Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.484598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"373d78b4-4328-42ce-8d54-6fb1c0ecefdb","Type":"ContainerStarted","Data":"2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56"} Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.484633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"373d78b4-4328-42ce-8d54-6fb1c0ecefdb","Type":"ContainerStarted","Data":"f3e95752ba008674a53dfa7bff12177e77f4db0814839b702392ea7f471a69f8"} Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.556525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3325-f770-40c6-a608-a44b32a2642b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.556567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmth\" (UniqueName: \"kubernetes.io/projected/0d2a3325-f770-40c6-a608-a44b32a2642b-kube-api-access-jhmth\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.556603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.556631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-config\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.556653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3325-f770-40c6-a608-a44b32a2642b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.556675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.658311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d2a3325-f770-40c6-a608-a44b32a2642b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.658354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.658522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3325-f770-40c6-a608-a44b32a2642b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.658573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmth\" (UniqueName: \"kubernetes.io/projected/0d2a3325-f770-40c6-a608-a44b32a2642b-kube-api-access-jhmth\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.658601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.658621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-config\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.659245 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.659325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-config\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.659731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3325-f770-40c6-a608-a44b32a2642b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.660057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 
15:47:40.669663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3325-f770-40c6-a608-a44b32a2642b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.672670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmth\" (UniqueName: \"kubernetes.io/projected/0d2a3325-f770-40c6-a608-a44b32a2642b-kube-api-access-jhmth\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.674836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.685844 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.685827519 podStartE2EDuration="2.685827519s" podCreationTimestamp="2026-01-21 15:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:40.498125473 +0000 UTC m=+2757.679641695" watchObservedRunningTime="2026-01-21 15:47:40.685827519 +0000 UTC m=+2757.867343741" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.686168 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.687192 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.691124 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.691361 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.691900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-6hxvs" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.709931 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.715624 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.760101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92hsw\" (UniqueName: \"kubernetes.io/projected/590ed346-b9bd-4ecb-9a71-69af254c5a18-kube-api-access-92hsw\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.760147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590ed346-b9bd-4ecb-9a71-69af254c5a18-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.760197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.760227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590ed346-b9bd-4ecb-9a71-69af254c5a18-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.760253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-config\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.760280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.833927 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:47:40 crc kubenswrapper[4707]: W0121 15:47:40.845500 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab52595_1681_409f_b9fd_f806a0e07117.slice/crio-1adeca4e1fa81267399fa4951f19def7d1ef571a01ca9858ecbea726f81dcb6e WatchSource:0}: Error finding container 1adeca4e1fa81267399fa4951f19def7d1ef571a01ca9858ecbea726f81dcb6e: Status 404 returned error can't find the container with id 1adeca4e1fa81267399fa4951f19def7d1ef571a01ca9858ecbea726f81dcb6e Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.862258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-config\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.862329 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.862438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92hsw\" (UniqueName: \"kubernetes.io/projected/590ed346-b9bd-4ecb-9a71-69af254c5a18-kube-api-access-92hsw\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.862487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590ed346-b9bd-4ecb-9a71-69af254c5a18-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.862631 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.864875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.864915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590ed346-b9bd-4ecb-9a71-69af254c5a18-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.865394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590ed346-b9bd-4ecb-9a71-69af254c5a18-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.865629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590ed346-b9bd-4ecb-9a71-69af254c5a18-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.867084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-config\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.874304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.876512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92hsw\" (UniqueName: \"kubernetes.io/projected/590ed346-b9bd-4ecb-9a71-69af254c5a18-kube-api-access-92hsw\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:40 crc kubenswrapper[4707]: I0121 15:47:40.880941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.008845 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.091650 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.358915 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:47:41 crc kubenswrapper[4707]: W0121 15:47:41.360447 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590ed346_b9bd_4ecb_9a71_69af254c5a18.slice/crio-af2080cb8f77d5faab5a56952c10677388a4e8f7329b27f0ca8c17058dc2bc5c WatchSource:0}: Error finding container af2080cb8f77d5faab5a56952c10677388a4e8f7329b27f0ca8c17058dc2bc5c: Status 404 returned error can't find the container with id af2080cb8f77d5faab5a56952c10677388a4e8f7329b27f0ca8c17058dc2bc5c Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.492225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0d2a3325-f770-40c6-a608-a44b32a2642b","Type":"ContainerStarted","Data":"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.492263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0d2a3325-f770-40c6-a608-a44b32a2642b","Type":"ContainerStarted","Data":"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.492284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0d2a3325-f770-40c6-a608-a44b32a2642b","Type":"ContainerStarted","Data":"a45e0a40ff54869fd1772df320da94a48e92959d3376922062d9b9c49ba19938"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.494158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"165b85c7-02ce-4c3f-9433-60c84b3307fd","Type":"ContainerStarted","Data":"e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.494313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.495980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"590ed346-b9bd-4ecb-9a71-69af254c5a18","Type":"ContainerStarted","Data":"6d22edf7bc985ced8e3f70291090f7925ae17bc065ed02f2366a92c1421c1772"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.496004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"590ed346-b9bd-4ecb-9a71-69af254c5a18","Type":"ContainerStarted","Data":"af2080cb8f77d5faab5a56952c10677388a4e8f7329b27f0ca8c17058dc2bc5c"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.497725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9ab52595-1681-409f-b9fd-f806a0e07117","Type":"ContainerStarted","Data":"2727ec4f05c11c3ea5fe4042502dccc2c46d18348b1b9ac1444104a1d4ce950f"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.497751 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.497761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9ab52595-1681-409f-b9fd-f806a0e07117","Type":"ContainerStarted","Data":"1adeca4e1fa81267399fa4951f19def7d1ef571a01ca9858ecbea726f81dcb6e"} Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.506440 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.506430591 podStartE2EDuration="2.506430591s" podCreationTimestamp="2026-01-21 15:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:41.503513597 +0000 UTC m=+2758.685029819" watchObservedRunningTime="2026-01-21 15:47:41.506430591 +0000 UTC m=+2758.687946813" Jan 21 15:47:41 crc kubenswrapper[4707]: I0121 15:47:41.544916 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.202304351 podStartE2EDuration="2.544902488s" podCreationTimestamp="2026-01-21 15:47:39 +0000 UTC" firstStartedPulling="2026-01-21 15:47:40.236045178 +0000 UTC m=+2757.417561401" lastFinishedPulling="2026-01-21 15:47:40.578643316 +0000 UTC m=+2757.760159538" observedRunningTime="2026-01-21 15:47:41.539756833 +0000 UTC m=+2758.721273055" watchObservedRunningTime="2026-01-21 15:47:41.544902488 +0000 UTC m=+2758.726418709" Jan 21 15:47:42 crc kubenswrapper[4707]: I0121 15:47:42.505087 4707 generic.go:334] "Generic (PLEG): container finished" podID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerID="3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749" exitCode=0 Jan 21 15:47:42 crc kubenswrapper[4707]: I0121 15:47:42.505176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"67bae98e-ed83-41c8-be5a-15eae8c9aa48","Type":"ContainerDied","Data":"3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749"} Jan 21 15:47:42 crc kubenswrapper[4707]: I0121 15:47:42.507228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"590ed346-b9bd-4ecb-9a71-69af254c5a18","Type":"ContainerStarted","Data":"42845822ae5a24294591a58818e9e32984de7b04c2b29b2c2c6f590020e457f8"} Jan 21 15:47:42 crc kubenswrapper[4707]: I0121 15:47:42.535701 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
podStartSLOduration=3.53568015 podStartE2EDuration="3.53568015s" podCreationTimestamp="2026-01-21 15:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:42.530362953 +0000 UTC m=+2759.711879174" watchObservedRunningTime="2026-01-21 15:47:42.53568015 +0000 UTC m=+2759.717196372" Jan 21 15:47:43 crc kubenswrapper[4707]: I0121 15:47:43.515036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"67bae98e-ed83-41c8-be5a-15eae8c9aa48","Type":"ContainerStarted","Data":"e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3"} Jan 21 15:47:43 crc kubenswrapper[4707]: I0121 15:47:43.531149 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.531134656 podStartE2EDuration="6.531134656s" podCreationTimestamp="2026-01-21 15:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:43.52690871 +0000 UTC m=+2760.708424932" watchObservedRunningTime="2026-01-21 15:47:43.531134656 +0000 UTC m=+2760.712650878" Jan 21 15:47:43 crc kubenswrapper[4707]: I0121 15:47:43.716516 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:44 crc kubenswrapper[4707]: I0121 15:47:44.009052 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:44 crc kubenswrapper[4707]: I0121 15:47:44.034200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:44 crc kubenswrapper[4707]: I0121 15:47:44.285645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:47:44 crc kubenswrapper[4707]: I0121 15:47:44.522634 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ab52595-1681-409f-b9fd-f806a0e07117" containerID="2727ec4f05c11c3ea5fe4042502dccc2c46d18348b1b9ac1444104a1d4ce950f" exitCode=0 Jan 21 15:47:44 crc kubenswrapper[4707]: I0121 15:47:44.522700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9ab52595-1681-409f-b9fd-f806a0e07117","Type":"ContainerDied","Data":"2727ec4f05c11c3ea5fe4042502dccc2c46d18348b1b9ac1444104a1d4ce950f"} Jan 21 15:47:44 crc kubenswrapper[4707]: I0121 15:47:44.523007 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:45 crc kubenswrapper[4707]: I0121 15:47:45.530528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9ab52595-1681-409f-b9fd-f806a0e07117","Type":"ContainerStarted","Data":"e46c9f70ccacdad3402cf6ba32a77979ddf8b59c6dda0c74faca0a9001f5b6b6"} Jan 21 15:47:45 crc kubenswrapper[4707]: I0121 15:47:45.544695 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=6.544683676 podStartE2EDuration="6.544683676s" podCreationTimestamp="2026-01-21 15:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:45.542653538 +0000 UTC 
m=+2762.724169760" watchObservedRunningTime="2026-01-21 15:47:45.544683676 +0000 UTC m=+2762.726199898" Jan 21 15:47:45 crc kubenswrapper[4707]: I0121 15:47:45.716230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.037827 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.741327 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.767946 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.873680 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.874718 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.876238 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.876935 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-q9bf5" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.878716 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.890712 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.939895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-config\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.939931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.939966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdw6\" (UniqueName: \"kubernetes.io/projected/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-kube-api-access-hbdw6\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.940062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-scripts\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:46 crc kubenswrapper[4707]: I0121 15:47:46.940095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.040861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-config\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.040896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.040924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdw6\" (UniqueName: \"kubernetes.io/projected/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-kube-api-access-hbdw6\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.040967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-scripts\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.040988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.041465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.041789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-config\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.041914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-scripts\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.045167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.054167 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdw6\" (UniqueName: \"kubernetes.io/projected/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-kube-api-access-hbdw6\") pod \"ovn-northd-0\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.187973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:47 crc kubenswrapper[4707]: I0121 15:47:47.567740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:47:47 crc kubenswrapper[4707]: W0121 15:47:47.576036 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a31ae7_4ab2_4b98_bd9d_66decb56f00a.slice/crio-21865c72b9ecd808248f45672315160f1c804520877818708a020bd1feacdf08 WatchSource:0}: Error finding container 21865c72b9ecd808248f45672315160f1c804520877818708a020bd1feacdf08: Status 404 returned error can't find the container with id 21865c72b9ecd808248f45672315160f1c804520877818708a020bd1feacdf08 Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.551117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"35a31ae7-4ab2-4b98-bd9d-66decb56f00a","Type":"ContainerStarted","Data":"68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991"} Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.551156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"35a31ae7-4ab2-4b98-bd9d-66decb56f00a","Type":"ContainerStarted","Data":"c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293"} Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.551166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"35a31ae7-4ab2-4b98-bd9d-66decb56f00a","Type":"ContainerStarted","Data":"21865c72b9ecd808248f45672315160f1c804520877818708a020bd1feacdf08"} Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.551921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.564637 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.564628173 podStartE2EDuration="2.564628173s" podCreationTimestamp="2026-01-21 15:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:48.561600239 +0000 UTC m=+2765.743116462" watchObservedRunningTime="2026-01-21 15:47:48.564628173 +0000 UTC m=+2765.746144395" Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.822276 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.822476 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:48 crc kubenswrapper[4707]: I0121 15:47:48.875031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:49 crc kubenswrapper[4707]: I0121 15:47:49.608237 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:47:49 crc kubenswrapper[4707]: I0121 15:47:49.887065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.455338 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.455374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.936057 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.958613 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.959916 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.960133 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.960368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-mz6gm" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.960729 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:47:50 crc kubenswrapper[4707]: I0121 15:47:50.960925 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.098040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.098089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhbz\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-kube-api-access-nrhbz\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.098181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.098200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-cache\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.098220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-lock\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhbz\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-kube-api-access-nrhbz\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-cache\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-lock\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: E0121 15:47:51.199548 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:47:51 crc kubenswrapper[4707]: E0121 15:47:51.199569 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:47:51 crc kubenswrapper[4707]: E0121 15:47:51.199613 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift podName:3e484616-acc3-47ab-a3ac-01ba20347f12 nodeName:}" failed. No retries permitted until 2026-01-21 15:47:51.699599198 +0000 UTC m=+2768.881115420 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift") pod "swift-storage-0" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12") : configmap "swift-ring-files" not found Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199657 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-lock\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.199798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-cache\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.214911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhbz\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-kube-api-access-nrhbz\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.215703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.309008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x2skx"] Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.309787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.310914 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.311170 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.312084 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.322330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x2skx"] Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.402708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-combined-ca-bundle\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.402800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/328632aa-3bca-4497-b6c5-ea71cceadd18-etc-swift\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.402854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-scripts\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.402916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-ring-data-devices\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.402946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-dispersionconf\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.402984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8w27\" (UniqueName: \"kubernetes.io/projected/328632aa-3bca-4497-b6c5-ea71cceadd18-kube-api-access-v8w27\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.403006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-swiftconf\") pod 
\"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-ring-data-devices\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-dispersionconf\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8w27\" (UniqueName: \"kubernetes.io/projected/328632aa-3bca-4497-b6c5-ea71cceadd18-kube-api-access-v8w27\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-swiftconf\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-combined-ca-bundle\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/328632aa-3bca-4497-b6c5-ea71cceadd18-etc-swift\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.504881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-scripts\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.505137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-ring-data-devices\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.505216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/328632aa-3bca-4497-b6c5-ea71cceadd18-etc-swift\") pod 
\"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.505410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-scripts\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.507488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-dispersionconf\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.508127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-swiftconf\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.508621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-combined-ca-bundle\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.516719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8w27\" (UniqueName: \"kubernetes.io/projected/328632aa-3bca-4497-b6c5-ea71cceadd18-kube-api-access-v8w27\") pod \"swift-ring-rebalance-x2skx\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.623363 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.707223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:51 crc kubenswrapper[4707]: E0121 15:47:51.707456 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:47:51 crc kubenswrapper[4707]: E0121 15:47:51.707478 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:47:51 crc kubenswrapper[4707]: E0121 15:47:51.707525 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift podName:3e484616-acc3-47ab-a3ac-01ba20347f12 nodeName:}" failed. No retries permitted until 2026-01-21 15:47:52.707511204 +0000 UTC m=+2769.889027426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift") pod "swift-storage-0" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12") : configmap "swift-ring-files" not found Jan 21 15:47:51 crc kubenswrapper[4707]: I0121 15:47:51.972886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x2skx"] Jan 21 15:47:51 crc kubenswrapper[4707]: W0121 15:47:51.976179 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod328632aa_3bca_4497_b6c5_ea71cceadd18.slice/crio-e950f572a13a35a5f6b16215732f7900bf66aa69d0964098cfd9067ca49242f3 WatchSource:0}: Error finding container e950f572a13a35a5f6b16215732f7900bf66aa69d0964098cfd9067ca49242f3: Status 404 returned error can't find the container with id e950f572a13a35a5f6b16215732f7900bf66aa69d0964098cfd9067ca49242f3 Jan 21 15:47:52 crc kubenswrapper[4707]: I0121 15:47:52.584261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" event={"ID":"328632aa-3bca-4497-b6c5-ea71cceadd18","Type":"ContainerStarted","Data":"dc33abbace9177890b6f8417f5c628a1d2b93884c9226ae48996c1a08c975c09"} Jan 21 15:47:52 crc kubenswrapper[4707]: I0121 15:47:52.584493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" event={"ID":"328632aa-3bca-4497-b6c5-ea71cceadd18","Type":"ContainerStarted","Data":"e950f572a13a35a5f6b16215732f7900bf66aa69d0964098cfd9067ca49242f3"} Jan 21 15:47:52 crc kubenswrapper[4707]: I0121 15:47:52.597884 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" podStartSLOduration=1.597869494 podStartE2EDuration="1.597869494s" podCreationTimestamp="2026-01-21 15:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:52.594276198 +0000 UTC m=+2769.775792420" watchObservedRunningTime="2026-01-21 15:47:52.597869494 +0000 UTC m=+2769.779385716" Jan 21 15:47:52 crc kubenswrapper[4707]: I0121 15:47:52.670887 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:52 crc kubenswrapper[4707]: I0121 15:47:52.721120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:52 crc kubenswrapper[4707]: I0121 15:47:52.721434 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:47:52 crc kubenswrapper[4707]: E0121 15:47:52.722101 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:47:52 crc kubenswrapper[4707]: E0121 15:47:52.722120 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:47:52 crc kubenswrapper[4707]: E0121 15:47:52.722152 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift 
podName:3e484616-acc3-47ab-a3ac-01ba20347f12 nodeName:}" failed. No retries permitted until 2026-01-21 15:47:54.722139492 +0000 UTC m=+2771.903655714 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift") pod "swift-storage-0" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12") : configmap "swift-ring-files" not found Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.379405 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-nj7qw"] Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.380341 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.389143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-nj7qw"] Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.481232 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l"] Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.482248 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.484327 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.489706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l"] Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.542956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9384f581-d886-4c29-88d9-fe10aa949617-operator-scripts\") pod \"glance-db-create-nj7qw\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.543078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swkd5\" (UniqueName: \"kubernetes.io/projected/9384f581-d886-4c29-88d9-fe10aa949617-kube-api-access-swkd5\") pod \"glance-db-create-nj7qw\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.644757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9384f581-d886-4c29-88d9-fe10aa949617-operator-scripts\") pod \"glance-db-create-nj7qw\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.644795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swkd5\" (UniqueName: \"kubernetes.io/projected/9384f581-d886-4c29-88d9-fe10aa949617-kube-api-access-swkd5\") pod \"glance-db-create-nj7qw\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.644835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gzx\" (UniqueName: 
\"kubernetes.io/projected/0f8e495d-42cb-401a-ab41-854f544abfff-kube-api-access-25gzx\") pod \"glance-6bb8-account-create-update-mrd6l\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.644891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f8e495d-42cb-401a-ab41-854f544abfff-operator-scripts\") pod \"glance-6bb8-account-create-update-mrd6l\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.645497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9384f581-d886-4c29-88d9-fe10aa949617-operator-scripts\") pod \"glance-db-create-nj7qw\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.664342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swkd5\" (UniqueName: \"kubernetes.io/projected/9384f581-d886-4c29-88d9-fe10aa949617-kube-api-access-swkd5\") pod \"glance-db-create-nj7qw\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.697740 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.746449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.746496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25gzx\" (UniqueName: \"kubernetes.io/projected/0f8e495d-42cb-401a-ab41-854f544abfff-kube-api-access-25gzx\") pod \"glance-6bb8-account-create-update-mrd6l\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.746545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f8e495d-42cb-401a-ab41-854f544abfff-operator-scripts\") pod \"glance-6bb8-account-create-update-mrd6l\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: E0121 15:47:54.746762 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:47:54 crc kubenswrapper[4707]: E0121 15:47:54.746910 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:47:54 crc kubenswrapper[4707]: E0121 15:47:54.747003 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift podName:3e484616-acc3-47ab-a3ac-01ba20347f12 
nodeName:}" failed. No retries permitted until 2026-01-21 15:47:58.746991504 +0000 UTC m=+2775.928507727 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift") pod "swift-storage-0" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12") : configmap "swift-ring-files" not found Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.747099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f8e495d-42cb-401a-ab41-854f544abfff-operator-scripts\") pod \"glance-6bb8-account-create-update-mrd6l\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.765761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gzx\" (UniqueName: \"kubernetes.io/projected/0f8e495d-42cb-401a-ab41-854f544abfff-kube-api-access-25gzx\") pod \"glance-6bb8-account-create-update-mrd6l\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:54 crc kubenswrapper[4707]: I0121 15:47:54.793866 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.060836 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-nj7qw"] Jan 21 15:47:55 crc kubenswrapper[4707]: W0121 15:47:55.063368 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9384f581_d886_4c29_88d9_fe10aa949617.slice/crio-bad35a31769a9f771cad164a89336d7ebe419232fe32b27f9d9dd6f0cceebee1 WatchSource:0}: Error finding container bad35a31769a9f771cad164a89336d7ebe419232fe32b27f9d9dd6f0cceebee1: Status 404 returned error can't find the container with id bad35a31769a9f771cad164a89336d7ebe419232fe32b27f9d9dd6f0cceebee1 Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.146604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l"] Jan 21 15:47:55 crc kubenswrapper[4707]: W0121 15:47:55.147921 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f8e495d_42cb_401a_ab41_854f544abfff.slice/crio-36cd5408cbdac13b55c7a84065b436c83a9654cabb8f79bc22f3e256ff3d196c WatchSource:0}: Error finding container 36cd5408cbdac13b55c7a84065b436c83a9654cabb8f79bc22f3e256ff3d196c: Status 404 returned error can't find the container with id 36cd5408cbdac13b55c7a84065b436c83a9654cabb8f79bc22f3e256ff3d196c Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.603891 4707 generic.go:334] "Generic (PLEG): container finished" podID="9384f581-d886-4c29-88d9-fe10aa949617" containerID="14f2c43916f1c0eb479f1b960a8fbf2446166014a7f757766f24c9135adbac15" exitCode=0 Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.603936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-nj7qw" event={"ID":"9384f581-d886-4c29-88d9-fe10aa949617","Type":"ContainerDied","Data":"14f2c43916f1c0eb479f1b960a8fbf2446166014a7f757766f24c9135adbac15"} Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.604132 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/glance-db-create-nj7qw" event={"ID":"9384f581-d886-4c29-88d9-fe10aa949617","Type":"ContainerStarted","Data":"bad35a31769a9f771cad164a89336d7ebe419232fe32b27f9d9dd6f0cceebee1"} Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.605432 4707 generic.go:334] "Generic (PLEG): container finished" podID="0f8e495d-42cb-401a-ab41-854f544abfff" containerID="71b02c968c6f75c52a86bc5fe092ddddf094ad5abc9fe1f5c5ffb5d235679855" exitCode=0 Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.605472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" event={"ID":"0f8e495d-42cb-401a-ab41-854f544abfff","Type":"ContainerDied","Data":"71b02c968c6f75c52a86bc5fe092ddddf094ad5abc9fe1f5c5ffb5d235679855"} Jan 21 15:47:55 crc kubenswrapper[4707]: I0121 15:47:55.605496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" event={"ID":"0f8e495d-42cb-401a-ab41-854f544abfff","Type":"ContainerStarted","Data":"36cd5408cbdac13b55c7a84065b436c83a9654cabb8f79bc22f3e256ff3d196c"} Jan 21 15:47:56 crc kubenswrapper[4707]: I0121 15:47:56.927300 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:56 crc kubenswrapper[4707]: I0121 15:47:56.931476 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.080493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swkd5\" (UniqueName: \"kubernetes.io/projected/9384f581-d886-4c29-88d9-fe10aa949617-kube-api-access-swkd5\") pod \"9384f581-d886-4c29-88d9-fe10aa949617\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.080533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f8e495d-42cb-401a-ab41-854f544abfff-operator-scripts\") pod \"0f8e495d-42cb-401a-ab41-854f544abfff\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.080551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25gzx\" (UniqueName: \"kubernetes.io/projected/0f8e495d-42cb-401a-ab41-854f544abfff-kube-api-access-25gzx\") pod \"0f8e495d-42cb-401a-ab41-854f544abfff\" (UID: \"0f8e495d-42cb-401a-ab41-854f544abfff\") " Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.080574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9384f581-d886-4c29-88d9-fe10aa949617-operator-scripts\") pod \"9384f581-d886-4c29-88d9-fe10aa949617\" (UID: \"9384f581-d886-4c29-88d9-fe10aa949617\") " Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.081222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f8e495d-42cb-401a-ab41-854f544abfff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f8e495d-42cb-401a-ab41-854f544abfff" (UID: "0f8e495d-42cb-401a-ab41-854f544abfff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.081233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9384f581-d886-4c29-88d9-fe10aa949617-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9384f581-d886-4c29-88d9-fe10aa949617" (UID: "9384f581-d886-4c29-88d9-fe10aa949617"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.084954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8e495d-42cb-401a-ab41-854f544abfff-kube-api-access-25gzx" (OuterVolumeSpecName: "kube-api-access-25gzx") pod "0f8e495d-42cb-401a-ab41-854f544abfff" (UID: "0f8e495d-42cb-401a-ab41-854f544abfff"). InnerVolumeSpecName "kube-api-access-25gzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.084965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9384f581-d886-4c29-88d9-fe10aa949617-kube-api-access-swkd5" (OuterVolumeSpecName: "kube-api-access-swkd5") pod "9384f581-d886-4c29-88d9-fe10aa949617" (UID: "9384f581-d886-4c29-88d9-fe10aa949617"). InnerVolumeSpecName "kube-api-access-swkd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.186374 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swkd5\" (UniqueName: \"kubernetes.io/projected/9384f581-d886-4c29-88d9-fe10aa949617-kube-api-access-swkd5\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.186399 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f8e495d-42cb-401a-ab41-854f544abfff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.186409 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25gzx\" (UniqueName: \"kubernetes.io/projected/0f8e495d-42cb-401a-ab41-854f544abfff-kube-api-access-25gzx\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.186418 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9384f581-d886-4c29-88d9-fe10aa949617-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.231195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.499019 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-j8nwd"] Jan 21 15:47:57 crc kubenswrapper[4707]: E0121 15:47:57.499925 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9384f581-d886-4c29-88d9-fe10aa949617" containerName="mariadb-database-create" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.499993 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9384f581-d886-4c29-88d9-fe10aa949617" containerName="mariadb-database-create" Jan 21 15:47:57 crc kubenswrapper[4707]: E0121 15:47:57.500050 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8e495d-42cb-401a-ab41-854f544abfff" containerName="mariadb-account-create-update" Jan 21 15:47:57 crc 
kubenswrapper[4707]: I0121 15:47:57.500108 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8e495d-42cb-401a-ab41-854f544abfff" containerName="mariadb-account-create-update" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.500455 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9384f581-d886-4c29-88d9-fe10aa949617" containerName="mariadb-database-create" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.500538 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8e495d-42cb-401a-ab41-854f544abfff" containerName="mariadb-account-create-update" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.504640 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.507386 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.514598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-j8nwd"] Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.592293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc5c4d0a-2661-425e-a185-3f1ca33df35f-operator-scripts\") pod \"root-account-create-update-j8nwd\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.592556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw894\" (UniqueName: \"kubernetes.io/projected/dc5c4d0a-2661-425e-a185-3f1ca33df35f-kube-api-access-lw894\") pod \"root-account-create-update-j8nwd\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.621722 4707 generic.go:334] "Generic (PLEG): container finished" podID="328632aa-3bca-4497-b6c5-ea71cceadd18" containerID="dc33abbace9177890b6f8417f5c628a1d2b93884c9226ae48996c1a08c975c09" exitCode=0 Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.621786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" event={"ID":"328632aa-3bca-4497-b6c5-ea71cceadd18","Type":"ContainerDied","Data":"dc33abbace9177890b6f8417f5c628a1d2b93884c9226ae48996c1a08c975c09"} Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.623550 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-nj7qw" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.623554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-nj7qw" event={"ID":"9384f581-d886-4c29-88d9-fe10aa949617","Type":"ContainerDied","Data":"bad35a31769a9f771cad164a89336d7ebe419232fe32b27f9d9dd6f0cceebee1"} Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.623824 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad35a31769a9f771cad164a89336d7ebe419232fe32b27f9d9dd6f0cceebee1" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.624758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" event={"ID":"0f8e495d-42cb-401a-ab41-854f544abfff","Type":"ContainerDied","Data":"36cd5408cbdac13b55c7a84065b436c83a9654cabb8f79bc22f3e256ff3d196c"} Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.624786 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36cd5408cbdac13b55c7a84065b436c83a9654cabb8f79bc22f3e256ff3d196c" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.624787 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.693724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc5c4d0a-2661-425e-a185-3f1ca33df35f-operator-scripts\") pod \"root-account-create-update-j8nwd\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.693823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw894\" (UniqueName: \"kubernetes.io/projected/dc5c4d0a-2661-425e-a185-3f1ca33df35f-kube-api-access-lw894\") pod \"root-account-create-update-j8nwd\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.694409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc5c4d0a-2661-425e-a185-3f1ca33df35f-operator-scripts\") pod \"root-account-create-update-j8nwd\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.707308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw894\" (UniqueName: \"kubernetes.io/projected/dc5c4d0a-2661-425e-a185-3f1ca33df35f-kube-api-access-lw894\") pod \"root-account-create-update-j8nwd\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:57 crc kubenswrapper[4707]: I0121 15:47:57.824786 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.174255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-j8nwd"] Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.632309 4707 generic.go:334] "Generic (PLEG): container finished" podID="dc5c4d0a-2661-425e-a185-3f1ca33df35f" containerID="89c29a3d42aafec3fce31b73ed24346f2ab4e8ae49fae6ac3a1cbc1d68a952c0" exitCode=0 Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.632395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" event={"ID":"dc5c4d0a-2661-425e-a185-3f1ca33df35f","Type":"ContainerDied","Data":"89c29a3d42aafec3fce31b73ed24346f2ab4e8ae49fae6ac3a1cbc1d68a952c0"} Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.632663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" event={"ID":"dc5c4d0a-2661-425e-a185-3f1ca33df35f","Type":"ContainerStarted","Data":"6997548686b1add9b0f0bd9c42cd2863db90ba371482a638e83d1bc3c684a7ec"} Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.809235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.815995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"swift-storage-0\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.872090 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-pfmbn"] Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.873025 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.879901 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-pfmbn"] Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.908308 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.970364 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc"] Jan 21 15:47:58 crc kubenswrapper[4707]: E0121 15:47:58.970684 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328632aa-3bca-4497-b6c5-ea71cceadd18" containerName="swift-ring-rebalance" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.970702 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="328632aa-3bca-4497-b6c5-ea71cceadd18" containerName="swift-ring-rebalance" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.970869 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="328632aa-3bca-4497-b6c5-ea71cceadd18" containerName="swift-ring-rebalance" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.971346 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.979462 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:47:58 crc kubenswrapper[4707]: I0121 15:47:58.984983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-swiftconf\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-dispersionconf\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/328632aa-3bca-4497-b6c5-ea71cceadd18-etc-swift\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-ring-data-devices\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-scripts\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8w27\" (UniqueName: \"kubernetes.io/projected/328632aa-3bca-4497-b6c5-ea71cceadd18-kube-api-access-v8w27\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-combined-ca-bundle\") pod \"328632aa-3bca-4497-b6c5-ea71cceadd18\" (UID: \"328632aa-3bca-4497-b6c5-ea71cceadd18\") " Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lfc4\" (UniqueName: \"kubernetes.io/projected/7be5de23-516b-4f0d-b987-a7adaac7d32f-kube-api-access-6lfc4\") pod \"keystone-db-create-pfmbn\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.012977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7be5de23-516b-4f0d-b987-a7adaac7d32f-operator-scripts\") pod \"keystone-db-create-pfmbn\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.013329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328632aa-3bca-4497-b6c5-ea71cceadd18-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.013655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.016098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328632aa-3bca-4497-b6c5-ea71cceadd18-kube-api-access-v8w27" (OuterVolumeSpecName: "kube-api-access-v8w27") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "kube-api-access-v8w27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.018967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.031580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.032916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.035645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-scripts" (OuterVolumeSpecName: "scripts") pod "328632aa-3bca-4497-b6c5-ea71cceadd18" (UID: "328632aa-3bca-4497-b6c5-ea71cceadd18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.075678 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be5de23-516b-4f0d-b987-a7adaac7d32f-operator-scripts\") pod \"keystone-db-create-pfmbn\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtnm\" (UniqueName: \"kubernetes.io/projected/7f6b93c5-8639-4f92-9141-d3b904db7bbb-kube-api-access-gxtnm\") pod \"keystone-8d41-account-create-update-sdrlc\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lfc4\" (UniqueName: \"kubernetes.io/projected/7be5de23-516b-4f0d-b987-a7adaac7d32f-kube-api-access-6lfc4\") pod \"keystone-db-create-pfmbn\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b93c5-8639-4f92-9141-d3b904db7bbb-operator-scripts\") pod \"keystone-8d41-account-create-update-sdrlc\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114716 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114727 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8w27\" (UniqueName: \"kubernetes.io/projected/328632aa-3bca-4497-b6c5-ea71cceadd18-kube-api-access-v8w27\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114768 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114777 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114785 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/328632aa-3bca-4497-b6c5-ea71cceadd18-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114792 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/328632aa-3bca-4497-b6c5-ea71cceadd18-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.114799 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/328632aa-3bca-4497-b6c5-ea71cceadd18-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.115357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be5de23-516b-4f0d-b987-a7adaac7d32f-operator-scripts\") pod \"keystone-db-create-pfmbn\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.128991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lfc4\" (UniqueName: \"kubernetes.io/projected/7be5de23-516b-4f0d-b987-a7adaac7d32f-kube-api-access-6lfc4\") pod \"keystone-db-create-pfmbn\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.198785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-vjhlx"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.199603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vjhlx"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.199661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.216105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b93c5-8639-4f92-9141-d3b904db7bbb-operator-scripts\") pod \"keystone-8d41-account-create-update-sdrlc\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.216233 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.216251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxtnm\" (UniqueName: \"kubernetes.io/projected/7f6b93c5-8639-4f92-9141-d3b904db7bbb-kube-api-access-gxtnm\") pod \"keystone-8d41-account-create-update-sdrlc\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.216713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b93c5-8639-4f92-9141-d3b904db7bbb-operator-scripts\") pod \"keystone-8d41-account-create-update-sdrlc\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.231491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtnm\" (UniqueName: \"kubernetes.io/projected/7f6b93c5-8639-4f92-9141-d3b904db7bbb-kube-api-access-gxtnm\") pod \"keystone-8d41-account-create-update-sdrlc\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.274563 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-78d2-account-create-update-8drdj"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.275525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.277064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.278984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-78d2-account-create-update-8drdj"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.290637 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.317786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dfc499-69dd-414a-bc01-6eb524b099e1-operator-scripts\") pod \"placement-db-create-vjhlx\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.317923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsp4d\" (UniqueName: \"kubernetes.io/projected/d4dfc499-69dd-414a-bc01-6eb524b099e1-kube-api-access-vsp4d\") pod \"placement-db-create-vjhlx\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.421229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-operator-scripts\") pod \"placement-78d2-account-create-update-8drdj\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.421296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsp4d\" (UniqueName: \"kubernetes.io/projected/d4dfc499-69dd-414a-bc01-6eb524b099e1-kube-api-access-vsp4d\") pod \"placement-db-create-vjhlx\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.421415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dfc499-69dd-414a-bc01-6eb524b099e1-operator-scripts\") pod \"placement-db-create-vjhlx\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.421433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m769l\" (UniqueName: \"kubernetes.io/projected/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-kube-api-access-m769l\") pod \"placement-78d2-account-create-update-8drdj\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.422287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dfc499-69dd-414a-bc01-6eb524b099e1-operator-scripts\") pod \"placement-db-create-vjhlx\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.438134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsp4d\" (UniqueName: \"kubernetes.io/projected/d4dfc499-69dd-414a-bc01-6eb524b099e1-kube-api-access-vsp4d\") pod \"placement-db-create-vjhlx\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.461371 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:47:59 crc kubenswrapper[4707]: W0121 15:47:59.464252 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e484616_acc3_47ab_a3ac_01ba20347f12.slice/crio-5e582c69f547ce754027adc81ad5f8f7c6e2521b6b2f278398a6f52ee2786e29 WatchSource:0}: Error finding container 5e582c69f547ce754027adc81ad5f8f7c6e2521b6b2f278398a6f52ee2786e29: Status 404 returned error can't find the container with id 5e582c69f547ce754027adc81ad5f8f7c6e2521b6b2f278398a6f52ee2786e29 Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.515962 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.522466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-operator-scripts\") pod \"placement-78d2-account-create-update-8drdj\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.522558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m769l\" (UniqueName: \"kubernetes.io/projected/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-kube-api-access-m769l\") pod \"placement-78d2-account-create-update-8drdj\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.523484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-operator-scripts\") pod \"placement-78d2-account-create-update-8drdj\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.536434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m769l\" (UniqueName: \"kubernetes.io/projected/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-kube-api-access-m769l\") pod \"placement-78d2-account-create-update-8drdj\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.593888 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.604875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-pfmbn"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.640996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" event={"ID":"328632aa-3bca-4497-b6c5-ea71cceadd18","Type":"ContainerDied","Data":"e950f572a13a35a5f6b16215732f7900bf66aa69d0964098cfd9067ca49242f3"} Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.641034 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e950f572a13a35a5f6b16215732f7900bf66aa69d0964098cfd9067ca49242f3" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.641054 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x2skx" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.643323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d"} Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.643347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"5e582c69f547ce754027adc81ad5f8f7c6e2521b6b2f278398a6f52ee2786e29"} Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.647039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" event={"ID":"7be5de23-516b-4f0d-b987-a7adaac7d32f","Type":"ContainerStarted","Data":"838f3cf738f0048cb830b40c0e77e5d790267ffb725afd4b5a4886f2c36930fe"} Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.696894 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.701932 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dbsz5"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.703114 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.706982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.707079 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.710100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dbsz5"] Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.825981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-combined-ca-bundle\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.826043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprg4\" (UniqueName: \"kubernetes.io/projected/87f26e02-9064-49cf-a33b-d1aef107c16b-kube-api-access-nprg4\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.826064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-config-data\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.826229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-db-sync-config-data\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.873758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vjhlx"] Jan 21 15:47:59 crc kubenswrapper[4707]: W0121 15:47:59.899910 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4dfc499_69dd_414a_bc01_6eb524b099e1.slice/crio-51467440ade7f6c0323c90acfa09add553585f6ced26fe0e5da64d396fe414d2 WatchSource:0}: Error finding container 51467440ade7f6c0323c90acfa09add553585f6ced26fe0e5da64d396fe414d2: Status 404 returned error can't find the container with id 51467440ade7f6c0323c90acfa09add553585f6ced26fe0e5da64d396fe414d2 Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.927329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-combined-ca-bundle\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.927371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nprg4\" (UniqueName: 
\"kubernetes.io/projected/87f26e02-9064-49cf-a33b-d1aef107c16b-kube-api-access-nprg4\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.927393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-config-data\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.927451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-db-sync-config-data\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.931829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-db-sync-config-data\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.938487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-config-data\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.938665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-combined-ca-bundle\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:47:59 crc kubenswrapper[4707]: I0121 15:47:59.945330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprg4\" (UniqueName: \"kubernetes.io/projected/87f26e02-9064-49cf-a33b-d1aef107c16b-kube-api-access-nprg4\") pod \"glance-db-sync-dbsz5\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.009983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-78d2-account-create-update-8drdj"] Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.034009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:48:00 crc kubenswrapper[4707]: W0121 15:48:00.040366 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb454a8_70a4_4e80_be1c_bd3b9f6814ce.slice/crio-fa3ed5a4173a0c5792a802e25d3c67f94943258814417999f2ee6230fdc51598 WatchSource:0}: Error finding container fa3ed5a4173a0c5792a802e25d3c67f94943258814417999f2ee6230fdc51598: Status 404 returned error can't find the container with id fa3ed5a4173a0c5792a802e25d3c67f94943258814417999f2ee6230fdc51598 Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.264053 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.393540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dbsz5"] Jan 21 15:48:00 crc kubenswrapper[4707]: W0121 15:48:00.400944 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87f26e02_9064_49cf_a33b_d1aef107c16b.slice/crio-63f174ed212025e35220c677f8b4ee7ae6b3139c363a21460c874d666033498f WatchSource:0}: Error finding container 63f174ed212025e35220c677f8b4ee7ae6b3139c363a21460c874d666033498f: Status 404 returned error can't find the container with id 63f174ed212025e35220c677f8b4ee7ae6b3139c363a21460c874d666033498f Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.434091 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw894\" (UniqueName: \"kubernetes.io/projected/dc5c4d0a-2661-425e-a185-3f1ca33df35f-kube-api-access-lw894\") pod \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.434345 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc5c4d0a-2661-425e-a185-3f1ca33df35f-operator-scripts\") pod \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\" (UID: \"dc5c4d0a-2661-425e-a185-3f1ca33df35f\") " Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.435098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc5c4d0a-2661-425e-a185-3f1ca33df35f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc5c4d0a-2661-425e-a185-3f1ca33df35f" (UID: "dc5c4d0a-2661-425e-a185-3f1ca33df35f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.436671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5c4d0a-2661-425e-a185-3f1ca33df35f-kube-api-access-lw894" (OuterVolumeSpecName: "kube-api-access-lw894") pod "dc5c4d0a-2661-425e-a185-3f1ca33df35f" (UID: "dc5c4d0a-2661-425e-a185-3f1ca33df35f"). InnerVolumeSpecName "kube-api-access-lw894". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.537339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw894\" (UniqueName: \"kubernetes.io/projected/dc5c4d0a-2661-425e-a185-3f1ca33df35f-kube-api-access-lw894\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.537365 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc5c4d0a-2661-425e-a185-3f1ca33df35f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.663303 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f6b93c5-8639-4f92-9141-d3b904db7bbb" containerID="755d9d5adcd064ca0330aa73fc3e6cc0f540f9c4fa34be8fff019ef4103996e6" exitCode=0 Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.663341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" event={"ID":"7f6b93c5-8639-4f92-9141-d3b904db7bbb","Type":"ContainerDied","Data":"755d9d5adcd064ca0330aa73fc3e6cc0f540f9c4fa34be8fff019ef4103996e6"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.663379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" event={"ID":"7f6b93c5-8639-4f92-9141-d3b904db7bbb","Type":"ContainerStarted","Data":"5b7b1f9f12cdcb7b2a4358e8a19cb12a1691de6d12c3114aa03b97b27de48746"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.673937 4707 generic.go:334] "Generic (PLEG): container finished" podID="7be5de23-516b-4f0d-b987-a7adaac7d32f" containerID="83e8027d614ff783de9b84f03d3856d99934e6591281a3e9bc31ec8d751eef10" exitCode=0 Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.674019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" event={"ID":"7be5de23-516b-4f0d-b987-a7adaac7d32f","Type":"ContainerDied","Data":"83e8027d614ff783de9b84f03d3856d99934e6591281a3e9bc31ec8d751eef10"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.677006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" event={"ID":"87f26e02-9064-49cf-a33b-d1aef107c16b","Type":"ContainerStarted","Data":"63f174ed212025e35220c677f8b4ee7ae6b3139c363a21460c874d666033498f"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.682507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.682543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.682555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.682564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.685100 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4dfc499-69dd-414a-bc01-6eb524b099e1" containerID="47158a7390e25365b78677d77f11c3992cf6c8641b534d23dcc5098499a9c974" exitCode=0 Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.685167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-vjhlx" event={"ID":"d4dfc499-69dd-414a-bc01-6eb524b099e1","Type":"ContainerDied","Data":"47158a7390e25365b78677d77f11c3992cf6c8641b534d23dcc5098499a9c974"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.685190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-vjhlx" event={"ID":"d4dfc499-69dd-414a-bc01-6eb524b099e1","Type":"ContainerStarted","Data":"51467440ade7f6c0323c90acfa09add553585f6ced26fe0e5da64d396fe414d2"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.687071 4707 generic.go:334] "Generic (PLEG): container finished" podID="deb454a8-70a4-4e80-be1c-bd3b9f6814ce" containerID="0d6f9b4edffb5fceeb25511e4ca129919e0cfdbf7d8f30f04c99b5c9e6761ba5" exitCode=0 Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.687132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" event={"ID":"deb454a8-70a4-4e80-be1c-bd3b9f6814ce","Type":"ContainerDied","Data":"0d6f9b4edffb5fceeb25511e4ca129919e0cfdbf7d8f30f04c99b5c9e6761ba5"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.687152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" event={"ID":"deb454a8-70a4-4e80-be1c-bd3b9f6814ce","Type":"ContainerStarted","Data":"fa3ed5a4173a0c5792a802e25d3c67f94943258814417999f2ee6230fdc51598"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.688822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" event={"ID":"dc5c4d0a-2661-425e-a185-3f1ca33df35f","Type":"ContainerDied","Data":"6997548686b1add9b0f0bd9c42cd2863db90ba371482a638e83d1bc3c684a7ec"} Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.688840 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6997548686b1add9b0f0bd9c42cd2863db90ba371482a638e83d1bc3c684a7ec" Jan 21 15:48:00 crc kubenswrapper[4707]: I0121 15:48:00.688875 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-j8nwd" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.696861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" event={"ID":"87f26e02-9064-49cf-a33b-d1aef107c16b","Type":"ContainerStarted","Data":"d3e1d5dbdd13bf53fcf6b6196650b7cbe7095451b0e1b8e6af9f3c9730be9398"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.701596 4707 scope.go:117] "RemoveContainer" containerID="d73ccd21bbb8e5c2fa996b7147ba920c18edaeb40f40be041465c71d75eda7fd" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.703119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63"} Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.722938 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/glance-db-sync-dbsz5" podStartSLOduration=2.722924178 podStartE2EDuration="2.722924178s" podCreationTimestamp="2026-01-21 15:47:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:01.719629763 +0000 UTC m=+2778.901145986" watchObservedRunningTime="2026-01-21 15:48:01.722924178 +0000 UTC m=+2778.904440400" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.723940 4707 scope.go:117] "RemoveContainer" containerID="c718221e74ac014b08a72ecde4ddd13e68c2658920e05635ff5b511a09b10c4d" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.749953 4707 scope.go:117] "RemoveContainer" containerID="a4efbc75ea46ed9e0d43ad15bc77a3b68cc728b78fe26f47fc3a8fc0add65d1e" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.802272 4707 scope.go:117] "RemoveContainer" containerID="a256d0c99dcf11f5da546ffd93042d18773ef99926f52b461c40448890ad7505" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.819945 4707 scope.go:117] "RemoveContainer" containerID="1611c703919e81570b553ebb49d79f83a2d4bd78494b93ec3ff68e128f0b8d6f" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.840742 4707 scope.go:117] "RemoveContainer" containerID="1348e4bd7ab33693f0b9a16d311fc059dd63a789a138bf8a94a3d344e3499b49" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.856484 4707 scope.go:117] "RemoveContainer" containerID="03fc0ec367a820fb2eceeeb390d7bafb68a0ee8bb4d220520f537111c9def27d" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.876580 4707 scope.go:117] "RemoveContainer" containerID="5af2da621a03f8cef0956e53ed72a3d8539092dc2471f1f3cb2edebedcaaafe7" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.947479 4707 scope.go:117] "RemoveContainer" containerID="527995a08fe95e2e4ed2f6dff00c892b5a1fd3d330c658c01e07a7831093049b" Jan 21 15:48:01 crc kubenswrapper[4707]: I0121 15:48:01.967897 4707 scope.go:117] "RemoveContainer" containerID="e1469ba4322e58bc36a1c3104e1125846b364821981487a01957611fe56feef3" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.002394 4707 scope.go:117] "RemoveContainer" containerID="e923daea0011570420f39bbf8ca3fd730d44e5caefd85bd6ad72c2334634c294" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.033745 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.134709 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.136467 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.160700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lfc4\" (UniqueName: \"kubernetes.io/projected/7be5de23-516b-4f0d-b987-a7adaac7d32f-kube-api-access-6lfc4\") pod \"7be5de23-516b-4f0d-b987-a7adaac7d32f\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.160774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be5de23-516b-4f0d-b987-a7adaac7d32f-operator-scripts\") pod \"7be5de23-516b-4f0d-b987-a7adaac7d32f\" (UID: \"7be5de23-516b-4f0d-b987-a7adaac7d32f\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.161364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be5de23-516b-4f0d-b987-a7adaac7d32f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7be5de23-516b-4f0d-b987-a7adaac7d32f" (UID: "7be5de23-516b-4f0d-b987-a7adaac7d32f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.162686 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.167027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be5de23-516b-4f0d-b987-a7adaac7d32f-kube-api-access-6lfc4" (OuterVolumeSpecName: "kube-api-access-6lfc4") pod "7be5de23-516b-4f0d-b987-a7adaac7d32f" (UID: "7be5de23-516b-4f0d-b987-a7adaac7d32f"). InnerVolumeSpecName "kube-api-access-6lfc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtnm\" (UniqueName: \"kubernetes.io/projected/7f6b93c5-8639-4f92-9141-d3b904db7bbb-kube-api-access-gxtnm\") pod \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m769l\" (UniqueName: \"kubernetes.io/projected/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-kube-api-access-m769l\") pod \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-operator-scripts\") pod \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\" (UID: \"deb454a8-70a4-4e80-be1c-bd3b9f6814ce\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b93c5-8639-4f92-9141-d3b904db7bbb-operator-scripts\") pod \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\" (UID: \"7f6b93c5-8639-4f92-9141-d3b904db7bbb\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dfc499-69dd-414a-bc01-6eb524b099e1-operator-scripts\") pod \"d4dfc499-69dd-414a-bc01-6eb524b099e1\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsp4d\" (UniqueName: \"kubernetes.io/projected/d4dfc499-69dd-414a-bc01-6eb524b099e1-kube-api-access-vsp4d\") pod \"d4dfc499-69dd-414a-bc01-6eb524b099e1\" (UID: \"d4dfc499-69dd-414a-bc01-6eb524b099e1\") " Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6b93c5-8639-4f92-9141-d3b904db7bbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f6b93c5-8639-4f92-9141-d3b904db7bbb" (UID: "7f6b93c5-8639-4f92-9141-d3b904db7bbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.262987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfc499-69dd-414a-bc01-6eb524b099e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4dfc499-69dd-414a-bc01-6eb524b099e1" (UID: "d4dfc499-69dd-414a-bc01-6eb524b099e1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.263047 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lfc4\" (UniqueName: \"kubernetes.io/projected/7be5de23-516b-4f0d-b987-a7adaac7d32f-kube-api-access-6lfc4\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.263062 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be5de23-516b-4f0d-b987-a7adaac7d32f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.263344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "deb454a8-70a4-4e80-be1c-bd3b9f6814ce" (UID: "deb454a8-70a4-4e80-be1c-bd3b9f6814ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.264907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6b93c5-8639-4f92-9141-d3b904db7bbb-kube-api-access-gxtnm" (OuterVolumeSpecName: "kube-api-access-gxtnm") pod "7f6b93c5-8639-4f92-9141-d3b904db7bbb" (UID: "7f6b93c5-8639-4f92-9141-d3b904db7bbb"). InnerVolumeSpecName "kube-api-access-gxtnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.265161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-kube-api-access-m769l" (OuterVolumeSpecName: "kube-api-access-m769l") pod "deb454a8-70a4-4e80-be1c-bd3b9f6814ce" (UID: "deb454a8-70a4-4e80-be1c-bd3b9f6814ce"). InnerVolumeSpecName "kube-api-access-m769l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.265184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfc499-69dd-414a-bc01-6eb524b099e1-kube-api-access-vsp4d" (OuterVolumeSpecName: "kube-api-access-vsp4d") pod "d4dfc499-69dd-414a-bc01-6eb524b099e1" (UID: "d4dfc499-69dd-414a-bc01-6eb524b099e1"). InnerVolumeSpecName "kube-api-access-vsp4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.365311 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtnm\" (UniqueName: \"kubernetes.io/projected/7f6b93c5-8639-4f92-9141-d3b904db7bbb-kube-api-access-gxtnm\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.365339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m769l\" (UniqueName: \"kubernetes.io/projected/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-kube-api-access-m769l\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.365351 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb454a8-70a4-4e80-be1c-bd3b9f6814ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.365360 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b93c5-8639-4f92-9141-d3b904db7bbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.365368 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dfc499-69dd-414a-bc01-6eb524b099e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.365375 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsp4d\" (UniqueName: \"kubernetes.io/projected/d4dfc499-69dd-414a-bc01-6eb524b099e1-kube-api-access-vsp4d\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.712438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" event={"ID":"7f6b93c5-8639-4f92-9141-d3b904db7bbb","Type":"ContainerDied","Data":"5b7b1f9f12cdcb7b2a4358e8a19cb12a1691de6d12c3114aa03b97b27de48746"} Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.713203 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7b1f9f12cdcb7b2a4358e8a19cb12a1691de6d12c3114aa03b97b27de48746" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.713332 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.726579 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.726574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-78d2-account-create-update-8drdj" event={"ID":"deb454a8-70a4-4e80-be1c-bd3b9f6814ce","Type":"ContainerDied","Data":"fa3ed5a4173a0c5792a802e25d3c67f94943258814417999f2ee6230fdc51598"} Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.726680 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa3ed5a4173a0c5792a802e25d3c67f94943258814417999f2ee6230fdc51598" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.731673 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-vjhlx" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.731700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-vjhlx" event={"ID":"d4dfc499-69dd-414a-bc01-6eb524b099e1","Type":"ContainerDied","Data":"51467440ade7f6c0323c90acfa09add553585f6ced26fe0e5da64d396fe414d2"} Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.731733 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51467440ade7f6c0323c90acfa09add553585f6ced26fe0e5da64d396fe414d2" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.737994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerStarted","Data":"a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e"} Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.739830 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.739829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-pfmbn" event={"ID":"7be5de23-516b-4f0d-b987-a7adaac7d32f","Type":"ContainerDied","Data":"838f3cf738f0048cb830b40c0e77e5d790267ffb725afd4b5a4886f2c36930fe"} Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.739935 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="838f3cf738f0048cb830b40c0e77e5d790267ffb725afd4b5a4886f2c36930fe" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.765652 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=13.76563665 podStartE2EDuration="13.76563665s" podCreationTimestamp="2026-01-21 15:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:02.758352114 +0000 UTC m=+2779.939868335" watchObservedRunningTime="2026-01-21 15:48:02.76563665 +0000 UTC m=+2779.947152872" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849034 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs"] Jan 21 15:48:02 crc kubenswrapper[4707]: E0121 15:48:02.849328 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6b93c5-8639-4f92-9141-d3b904db7bbb" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849345 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6b93c5-8639-4f92-9141-d3b904db7bbb" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: E0121 15:48:02.849369 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be5de23-516b-4f0d-b987-a7adaac7d32f" containerName="mariadb-database-create" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849375 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be5de23-516b-4f0d-b987-a7adaac7d32f" containerName="mariadb-database-create" Jan 21 15:48:02 crc kubenswrapper[4707]: E0121 15:48:02.849385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dfc499-69dd-414a-bc01-6eb524b099e1" containerName="mariadb-database-create" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849390 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d4dfc499-69dd-414a-bc01-6eb524b099e1" containerName="mariadb-database-create" Jan 21 15:48:02 crc kubenswrapper[4707]: E0121 15:48:02.849398 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5c4d0a-2661-425e-a185-3f1ca33df35f" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5c4d0a-2661-425e-a185-3f1ca33df35f" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: E0121 15:48:02.849412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb454a8-70a4-4e80-be1c-bd3b9f6814ce" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb454a8-70a4-4e80-be1c-bd3b9f6814ce" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849545 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5c4d0a-2661-425e-a185-3f1ca33df35f" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849559 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6b93c5-8639-4f92-9141-d3b904db7bbb" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849569 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be5de23-516b-4f0d-b987-a7adaac7d32f" containerName="mariadb-database-create" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849583 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dfc499-69dd-414a-bc01-6eb524b099e1" containerName="mariadb-database-create" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.849591 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb454a8-70a4-4e80-be1c-bd3b9f6814ce" containerName="mariadb-account-create-update" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.850246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.851909 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.860275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs"] Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.977379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsrf6\" (UniqueName: \"kubernetes.io/projected/3282f63b-58f5-4f45-a684-55a09adb04d6-kube-api-access-zsrf6\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.977427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-config\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.977470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:02 crc kubenswrapper[4707]: I0121 15:48:02.977681 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.079475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsrf6\" (UniqueName: \"kubernetes.io/projected/3282f63b-58f5-4f45-a684-55a09adb04d6-kube-api-access-zsrf6\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.079521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-config\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.079556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.079615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.080303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-config\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.080366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.080901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.096428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsrf6\" (UniqueName: \"kubernetes.io/projected/3282f63b-58f5-4f45-a684-55a09adb04d6-kube-api-access-zsrf6\") pod \"dnsmasq-dnsmasq-677f7d4c8c-n77qs\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.161519 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.543826 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs"] Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.752388 4707 generic.go:334] "Generic (PLEG): container finished" podID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerID="c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97" exitCode=0 Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.754293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" event={"ID":"3282f63b-58f5-4f45-a684-55a09adb04d6","Type":"ContainerDied","Data":"c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97"} Jan 21 15:48:03 crc kubenswrapper[4707]: I0121 15:48:03.754399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" event={"ID":"3282f63b-58f5-4f45-a684-55a09adb04d6","Type":"ContainerStarted","Data":"07fc92e5a685c723b68596f66af05e9e7d7a11eb521ce8734f160da1ae771251"} Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.172645 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-j8nwd"] Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.177067 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-j8nwd"] Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.764513 4707 generic.go:334] "Generic (PLEG): container finished" podID="87f26e02-9064-49cf-a33b-d1aef107c16b" containerID="d3e1d5dbdd13bf53fcf6b6196650b7cbe7095451b0e1b8e6af9f3c9730be9398" exitCode=0 Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.764596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" event={"ID":"87f26e02-9064-49cf-a33b-d1aef107c16b","Type":"ContainerDied","Data":"d3e1d5dbdd13bf53fcf6b6196650b7cbe7095451b0e1b8e6af9f3c9730be9398"} Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.766364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" event={"ID":"3282f63b-58f5-4f45-a684-55a09adb04d6","Type":"ContainerStarted","Data":"a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda"} Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.767372 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:04 crc kubenswrapper[4707]: I0121 15:48:04.816080 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" podStartSLOduration=2.816065553 podStartE2EDuration="2.816065553s" podCreationTimestamp="2026-01-21 15:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:04.812887468 +0000 UTC m=+2781.994403690" watchObservedRunningTime="2026-01-21 15:48:04.816065553 +0000 UTC m=+2781.997581775" Jan 21 15:48:05 crc kubenswrapper[4707]: I0121 15:48:05.190733 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc5c4d0a-2661-425e-a185-3f1ca33df35f" path="/var/lib/kubelet/pods/dc5c4d0a-2661-425e-a185-3f1ca33df35f/volumes" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.022234 4707 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.117977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nprg4\" (UniqueName: \"kubernetes.io/projected/87f26e02-9064-49cf-a33b-d1aef107c16b-kube-api-access-nprg4\") pod \"87f26e02-9064-49cf-a33b-d1aef107c16b\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.118103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-config-data\") pod \"87f26e02-9064-49cf-a33b-d1aef107c16b\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.118159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-combined-ca-bundle\") pod \"87f26e02-9064-49cf-a33b-d1aef107c16b\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.118180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-db-sync-config-data\") pod \"87f26e02-9064-49cf-a33b-d1aef107c16b\" (UID: \"87f26e02-9064-49cf-a33b-d1aef107c16b\") " Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.122076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "87f26e02-9064-49cf-a33b-d1aef107c16b" (UID: "87f26e02-9064-49cf-a33b-d1aef107c16b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.122104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f26e02-9064-49cf-a33b-d1aef107c16b-kube-api-access-nprg4" (OuterVolumeSpecName: "kube-api-access-nprg4") pod "87f26e02-9064-49cf-a33b-d1aef107c16b" (UID: "87f26e02-9064-49cf-a33b-d1aef107c16b"). InnerVolumeSpecName "kube-api-access-nprg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.136695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87f26e02-9064-49cf-a33b-d1aef107c16b" (UID: "87f26e02-9064-49cf-a33b-d1aef107c16b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.148564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-config-data" (OuterVolumeSpecName: "config-data") pod "87f26e02-9064-49cf-a33b-d1aef107c16b" (UID: "87f26e02-9064-49cf-a33b-d1aef107c16b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.219653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nprg4\" (UniqueName: \"kubernetes.io/projected/87f26e02-9064-49cf-a33b-d1aef107c16b-kube-api-access-nprg4\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.219678 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.219687 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.219696 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87f26e02-9064-49cf-a33b-d1aef107c16b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.779896 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.779884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dbsz5" event={"ID":"87f26e02-9064-49cf-a33b-d1aef107c16b","Type":"ContainerDied","Data":"63f174ed212025e35220c677f8b4ee7ae6b3139c363a21460c874d666033498f"} Jan 21 15:48:06 crc kubenswrapper[4707]: I0121 15:48:06.779937 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63f174ed212025e35220c677f8b4ee7ae6b3139c363a21460c874d666033498f" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.162934 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.201376 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg"] Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.201579 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" containerName="dnsmasq-dns" containerID="cri-o://9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a" gracePeriod=10 Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.586059 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.752090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwbfk\" (UniqueName: \"kubernetes.io/projected/68cc87bb-db38-44df-a921-630ecf45a30b-kube-api-access-gwbfk\") pod \"68cc87bb-db38-44df-a921-630ecf45a30b\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.752142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-dnsmasq-svc\") pod \"68cc87bb-db38-44df-a921-630ecf45a30b\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.752182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-config\") pod \"68cc87bb-db38-44df-a921-630ecf45a30b\" (UID: \"68cc87bb-db38-44df-a921-630ecf45a30b\") " Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.757398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cc87bb-db38-44df-a921-630ecf45a30b-kube-api-access-gwbfk" (OuterVolumeSpecName: "kube-api-access-gwbfk") pod "68cc87bb-db38-44df-a921-630ecf45a30b" (UID: "68cc87bb-db38-44df-a921-630ecf45a30b"). InnerVolumeSpecName "kube-api-access-gwbfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.782284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-config" (OuterVolumeSpecName: "config") pod "68cc87bb-db38-44df-a921-630ecf45a30b" (UID: "68cc87bb-db38-44df-a921-630ecf45a30b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.784349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "68cc87bb-db38-44df-a921-630ecf45a30b" (UID: "68cc87bb-db38-44df-a921-630ecf45a30b"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.796298 4707 generic.go:334] "Generic (PLEG): container finished" podID="68cc87bb-db38-44df-a921-630ecf45a30b" containerID="9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a" exitCode=0 Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.796339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" event={"ID":"68cc87bb-db38-44df-a921-630ecf45a30b","Type":"ContainerDied","Data":"9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a"} Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.796363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" event={"ID":"68cc87bb-db38-44df-a921-630ecf45a30b","Type":"ContainerDied","Data":"a7f72094a6203e918e4dbc21bf4112ebefeb6a3af22b9b548d390db7d7196592"} Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.796372 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.796377 4707 scope.go:117] "RemoveContainer" containerID="9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.822136 4707 scope.go:117] "RemoveContainer" containerID="5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.822520 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg"] Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.829624 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-88qrg"] Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.839916 4707 scope.go:117] "RemoveContainer" containerID="9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a" Jan 21 15:48:08 crc kubenswrapper[4707]: E0121 15:48:08.840278 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a\": container with ID starting with 9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a not found: ID does not exist" containerID="9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.840313 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a"} err="failed to get container status \"9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a\": rpc error: code = NotFound desc = could not find container \"9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a\": container with ID starting with 9366fc0f2f0e8c58683bd6e3be200229ef7ac2e974c849f957556c6774f6d53a not found: ID does not exist" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.840332 4707 scope.go:117] "RemoveContainer" containerID="5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766" Jan 21 15:48:08 crc kubenswrapper[4707]: E0121 15:48:08.840591 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766\": container with ID starting with 5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766 not found: ID does not exist" containerID="5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.840667 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766"} err="failed to get container status \"5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766\": rpc error: code = NotFound desc = could not find container \"5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766\": container with ID starting with 5797481b9b94f9254a169630c51b7c48258050344155fa5d10907998e9f7c766 not found: ID does not exist" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.853606 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 
15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.853629 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68cc87bb-db38-44df-a921-630ecf45a30b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:08 crc kubenswrapper[4707]: I0121 15:48:08.853639 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwbfk\" (UniqueName: \"kubernetes.io/projected/68cc87bb-db38-44df-a921-630ecf45a30b-kube-api-access-gwbfk\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.178053 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kjkdz"] Jan 21 15:48:09 crc kubenswrapper[4707]: E0121 15:48:09.178939 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" containerName="dnsmasq-dns" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.179018 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" containerName="dnsmasq-dns" Jan 21 15:48:09 crc kubenswrapper[4707]: E0121 15:48:09.179092 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f26e02-9064-49cf-a33b-d1aef107c16b" containerName="glance-db-sync" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.179140 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f26e02-9064-49cf-a33b-d1aef107c16b" containerName="glance-db-sync" Jan 21 15:48:09 crc kubenswrapper[4707]: E0121 15:48:09.179196 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" containerName="init" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.179250 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" containerName="init" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.179453 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f26e02-9064-49cf-a33b-d1aef107c16b" containerName="glance-db-sync" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.179508 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" containerName="dnsmasq-dns" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.180029 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.188478 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cc87bb-db38-44df-a921-630ecf45a30b" path="/var/lib/kubelet/pods/68cc87bb-db38-44df-a921-630ecf45a30b/volumes" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.190338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.190406 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kjkdz"] Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.360693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5403a81d-469a-4fa3-bf5a-77d515c25015-operator-scripts\") pod \"root-account-create-update-kjkdz\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.360784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsfpk\" (UniqueName: \"kubernetes.io/projected/5403a81d-469a-4fa3-bf5a-77d515c25015-kube-api-access-gsfpk\") pod \"root-account-create-update-kjkdz\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.461783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5403a81d-469a-4fa3-bf5a-77d515c25015-operator-scripts\") pod \"root-account-create-update-kjkdz\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.461860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsfpk\" (UniqueName: \"kubernetes.io/projected/5403a81d-469a-4fa3-bf5a-77d515c25015-kube-api-access-gsfpk\") pod \"root-account-create-update-kjkdz\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.462373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5403a81d-469a-4fa3-bf5a-77d515c25015-operator-scripts\") pod \"root-account-create-update-kjkdz\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.475513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsfpk\" (UniqueName: \"kubernetes.io/projected/5403a81d-469a-4fa3-bf5a-77d515c25015-kube-api-access-gsfpk\") pod \"root-account-create-update-kjkdz\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.497395 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:09 crc kubenswrapper[4707]: I0121 15:48:09.839377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kjkdz"] Jan 21 15:48:09 crc kubenswrapper[4707]: W0121 15:48:09.843346 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5403a81d_469a_4fa3_bf5a_77d515c25015.slice/crio-3e95f85b6ca0743f014de6fff677a9f7be881dda78d8c1464194babe151130e0 WatchSource:0}: Error finding container 3e95f85b6ca0743f014de6fff677a9f7be881dda78d8c1464194babe151130e0: Status 404 returned error can't find the container with id 3e95f85b6ca0743f014de6fff677a9f7be881dda78d8c1464194babe151130e0 Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.810606 4707 generic.go:334] "Generic (PLEG): container finished" podID="5403a81d-469a-4fa3-bf5a-77d515c25015" containerID="e71378c878762f4533c4855c6ebb5b163b8d3bb51fcbd291452e5c8d6f498513" exitCode=0 Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.810665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" event={"ID":"5403a81d-469a-4fa3-bf5a-77d515c25015","Type":"ContainerDied","Data":"e71378c878762f4533c4855c6ebb5b163b8d3bb51fcbd291452e5c8d6f498513"} Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.810687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" event={"ID":"5403a81d-469a-4fa3-bf5a-77d515c25015","Type":"ContainerStarted","Data":"3e95f85b6ca0743f014de6fff677a9f7be881dda78d8c1464194babe151130e0"} Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.812135 4707 generic.go:334] "Generic (PLEG): container finished" podID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerID="43f7db9043a85994b4e96b1be2333a30db9afca105a651569a43385dfd162291" exitCode=0 Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.812185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"bde702b1-7406-4751-aaf5-026fb8e348c4","Type":"ContainerDied","Data":"43f7db9043a85994b4e96b1be2333a30db9afca105a651569a43385dfd162291"} Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.813785 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e89388a-81da-49c6-a6fe-796c924f2822" containerID="df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e" exitCode=0 Jan 21 15:48:10 crc kubenswrapper[4707]: I0121 15:48:10.813830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"3e89388a-81da-49c6-a6fe-796c924f2822","Type":"ContainerDied","Data":"df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e"} Jan 21 15:48:11 crc kubenswrapper[4707]: I0121 15:48:11.821633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"3e89388a-81da-49c6-a6fe-796c924f2822","Type":"ContainerStarted","Data":"e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd"} Jan 21 15:48:11 crc kubenswrapper[4707]: I0121 15:48:11.822125 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:48:11 crc kubenswrapper[4707]: I0121 15:48:11.823480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" 
event={"ID":"bde702b1-7406-4751-aaf5-026fb8e348c4","Type":"ContainerStarted","Data":"32edadedef2c315e6d4653ad9f20b50bc87635b5eb5275fb1e900ad9ba493403"} Jan 21 15:48:11 crc kubenswrapper[4707]: I0121 15:48:11.823651 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:48:11 crc kubenswrapper[4707]: I0121 15:48:11.840298 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=35.840285509 podStartE2EDuration="35.840285509s" podCreationTimestamp="2026-01-21 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:11.837064433 +0000 UTC m=+2789.018580655" watchObservedRunningTime="2026-01-21 15:48:11.840285509 +0000 UTC m=+2789.021801731" Jan 21 15:48:11 crc kubenswrapper[4707]: I0121 15:48:11.857960 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.857946666 podStartE2EDuration="36.857946666s" podCreationTimestamp="2026-01-21 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:11.855723577 +0000 UTC m=+2789.037239798" watchObservedRunningTime="2026-01-21 15:48:11.857946666 +0000 UTC m=+2789.039462888" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.107048 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.198008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsfpk\" (UniqueName: \"kubernetes.io/projected/5403a81d-469a-4fa3-bf5a-77d515c25015-kube-api-access-gsfpk\") pod \"5403a81d-469a-4fa3-bf5a-77d515c25015\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.198296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5403a81d-469a-4fa3-bf5a-77d515c25015-operator-scripts\") pod \"5403a81d-469a-4fa3-bf5a-77d515c25015\" (UID: \"5403a81d-469a-4fa3-bf5a-77d515c25015\") " Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.198718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5403a81d-469a-4fa3-bf5a-77d515c25015-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5403a81d-469a-4fa3-bf5a-77d515c25015" (UID: "5403a81d-469a-4fa3-bf5a-77d515c25015"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.202881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5403a81d-469a-4fa3-bf5a-77d515c25015-kube-api-access-gsfpk" (OuterVolumeSpecName: "kube-api-access-gsfpk") pod "5403a81d-469a-4fa3-bf5a-77d515c25015" (UID: "5403a81d-469a-4fa3-bf5a-77d515c25015"). InnerVolumeSpecName "kube-api-access-gsfpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.299702 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsfpk\" (UniqueName: \"kubernetes.io/projected/5403a81d-469a-4fa3-bf5a-77d515c25015-kube-api-access-gsfpk\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.299858 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5403a81d-469a-4fa3-bf5a-77d515c25015-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.831636 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.839485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kjkdz" event={"ID":"5403a81d-469a-4fa3-bf5a-77d515c25015","Type":"ContainerDied","Data":"3e95f85b6ca0743f014de6fff677a9f7be881dda78d8c1464194babe151130e0"} Jan 21 15:48:12 crc kubenswrapper[4707]: I0121 15:48:12.840100 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e95f85b6ca0743f014de6fff677a9f7be881dda78d8c1464194babe151130e0" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.331593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.572969 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.581434 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-82fkc"] Jan 21 15:48:27 crc kubenswrapper[4707]: E0121 15:48:27.581725 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5403a81d-469a-4fa3-bf5a-77d515c25015" containerName="mariadb-account-create-update" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.581742 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5403a81d-469a-4fa3-bf5a-77d515c25015" containerName="mariadb-account-create-update" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.581901 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5403a81d-469a-4fa3-bf5a-77d515c25015" containerName="mariadb-account-create-update" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.582365 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.602384 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-82fkc"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.696045 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2cksb"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.696922 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.698626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-operator-scripts\") pod \"cinder-db-create-82fkc\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.698767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5tb7\" (UniqueName: \"kubernetes.io/projected/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-kube-api-access-f5tb7\") pod \"cinder-db-create-82fkc\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.749839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2cksb"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.785870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-7fbgq"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.786880 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.799659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5tb7\" (UniqueName: \"kubernetes.io/projected/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-kube-api-access-f5tb7\") pod \"cinder-db-create-82fkc\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.799725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf18429d-676f-4bef-b5c5-d06a69cc7b76-operator-scripts\") pod \"barbican-db-create-2cksb\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.799824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t786n\" (UniqueName: \"kubernetes.io/projected/bf18429d-676f-4bef-b5c5-d06a69cc7b76-kube-api-access-t786n\") pod \"barbican-db-create-2cksb\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.799877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-operator-scripts\") pod \"cinder-db-create-82fkc\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.800516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-operator-scripts\") pod \"cinder-db-create-82fkc\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.802398 4707 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.814407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-7fbgq"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.814485 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.828283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5tb7\" (UniqueName: \"kubernetes.io/projected/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-kube-api-access-f5tb7\") pod \"cinder-db-create-82fkc\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.834030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.876007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.895556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.901754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c26cf56-fc84-4283-af2d-edfba663233d-operator-scripts\") pod \"cinder-fe8e-account-create-update-6ncbl\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.909891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjb98\" (UniqueName: \"kubernetes.io/projected/7bbffd0b-e235-483f-8487-ad5955d58dd7-kube-api-access-vjb98\") pod \"neutron-db-create-7fbgq\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.910024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brtss\" (UniqueName: \"kubernetes.io/projected/7c26cf56-fc84-4283-af2d-edfba663233d-kube-api-access-brtss\") pod \"cinder-fe8e-account-create-update-6ncbl\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.910131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf18429d-676f-4bef-b5c5-d06a69cc7b76-operator-scripts\") pod \"barbican-db-create-2cksb\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.910300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t786n\" (UniqueName: \"kubernetes.io/projected/bf18429d-676f-4bef-b5c5-d06a69cc7b76-kube-api-access-t786n\") pod \"barbican-db-create-2cksb\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " pod="openstack-kuttl-tests/barbican-db-create-2cksb" 
Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.910386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbffd0b-e235-483f-8487-ad5955d58dd7-operator-scripts\") pod \"neutron-db-create-7fbgq\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.902701 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.914723 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.912432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf18429d-676f-4bef-b5c5-d06a69cc7b76-operator-scripts\") pod \"barbican-db-create-2cksb\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.916961 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.956867 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.964382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t786n\" (UniqueName: \"kubernetes.io/projected/bf18429d-676f-4bef-b5c5-d06a69cc7b76-kube-api-access-t786n\") pod \"barbican-db-create-2cksb\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.965291 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p"] Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.969504 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.971587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:48:27 crc kubenswrapper[4707]: I0121 15:48:27.977656 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.007548 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.013568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2kvp\" (UniqueName: \"kubernetes.io/projected/058de5ec-317c-4ba6-8497-3abaa84e49db-kube-api-access-w2kvp\") pod \"barbican-5988-account-create-update-6jxtw\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.013601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058de5ec-317c-4ba6-8497-3abaa84e49db-operator-scripts\") pod \"barbican-5988-account-create-update-6jxtw\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.013635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c26cf56-fc84-4283-af2d-edfba663233d-operator-scripts\") pod \"cinder-fe8e-account-create-update-6ncbl\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.013662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjb98\" (UniqueName: \"kubernetes.io/projected/7bbffd0b-e235-483f-8487-ad5955d58dd7-kube-api-access-vjb98\") pod \"neutron-db-create-7fbgq\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.013687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brtss\" (UniqueName: \"kubernetes.io/projected/7c26cf56-fc84-4283-af2d-edfba663233d-kube-api-access-brtss\") pod \"cinder-fe8e-account-create-update-6ncbl\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.013774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbffd0b-e235-483f-8487-ad5955d58dd7-operator-scripts\") pod \"neutron-db-create-7fbgq\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.014485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbffd0b-e235-483f-8487-ad5955d58dd7-operator-scripts\") pod \"neutron-db-create-7fbgq\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.014503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c26cf56-fc84-4283-af2d-edfba663233d-operator-scripts\") pod \"cinder-fe8e-account-create-update-6ncbl\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.031560 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-brtss\" (UniqueName: \"kubernetes.io/projected/7c26cf56-fc84-4283-af2d-edfba663233d-kube-api-access-brtss\") pod \"cinder-fe8e-account-create-update-6ncbl\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.033644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjb98\" (UniqueName: \"kubernetes.io/projected/7bbffd0b-e235-483f-8487-ad5955d58dd7-kube-api-access-vjb98\") pod \"neutron-db-create-7fbgq\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.103673 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.114557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggdn\" (UniqueName: \"kubernetes.io/projected/3c937de4-4dbe-4167-9706-78d225b7de95-kube-api-access-5ggdn\") pod \"neutron-b634-account-create-update-pcn5p\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.114626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kvp\" (UniqueName: \"kubernetes.io/projected/058de5ec-317c-4ba6-8497-3abaa84e49db-kube-api-access-w2kvp\") pod \"barbican-5988-account-create-update-6jxtw\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.114646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058de5ec-317c-4ba6-8497-3abaa84e49db-operator-scripts\") pod \"barbican-5988-account-create-update-6jxtw\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.114981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c937de4-4dbe-4167-9706-78d225b7de95-operator-scripts\") pod \"neutron-b634-account-create-update-pcn5p\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.115230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058de5ec-317c-4ba6-8497-3abaa84e49db-operator-scripts\") pod \"barbican-5988-account-create-update-6jxtw\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.129158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2kvp\" (UniqueName: \"kubernetes.io/projected/058de5ec-317c-4ba6-8497-3abaa84e49db-kube-api-access-w2kvp\") pod \"barbican-5988-account-create-update-6jxtw\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" 
Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.191417 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.207003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fcvt8"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.208012 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.212036 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.213022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.213048 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x8w4j" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.213181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.216131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c937de4-4dbe-4167-9706-78d225b7de95-operator-scripts\") pod \"neutron-b634-account-create-update-pcn5p\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.216250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggdn\" (UniqueName: \"kubernetes.io/projected/3c937de4-4dbe-4167-9706-78d225b7de95-kube-api-access-5ggdn\") pod \"neutron-b634-account-create-update-pcn5p\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.217036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c937de4-4dbe-4167-9706-78d225b7de95-operator-scripts\") pod \"neutron-b634-account-create-update-pcn5p\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.225290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fcvt8"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.233080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggdn\" (UniqueName: \"kubernetes.io/projected/3c937de4-4dbe-4167-9706-78d225b7de95-kube-api-access-5ggdn\") pod \"neutron-b634-account-create-update-pcn5p\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.317786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9gzd\" (UniqueName: \"kubernetes.io/projected/8a32805d-3d95-44ab-8bd1-808d7391c49d-kube-api-access-c9gzd\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " 
pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.318014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-config-data\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.318046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-combined-ca-bundle\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.321205 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.350840 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-82fkc"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.358918 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:28 crc kubenswrapper[4707]: W0121 15:48:28.370154 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2098ba33_9f63_4eb2_95d1_c1b8219a3db7.slice/crio-a7226bf6747a7cab53522f1c77ac600260994d9a2ea3653011258eca5bf69fb5 WatchSource:0}: Error finding container a7226bf6747a7cab53522f1c77ac600260994d9a2ea3653011258eca5bf69fb5: Status 404 returned error can't find the container with id a7226bf6747a7cab53522f1c77ac600260994d9a2ea3653011258eca5bf69fb5 Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.420656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-config-data\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.420696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-combined-ca-bundle\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.420749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9gzd\" (UniqueName: \"kubernetes.io/projected/8a32805d-3d95-44ab-8bd1-808d7391c49d-kube-api-access-c9gzd\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.427682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-combined-ca-bundle\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" 
Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.436562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-config-data\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.436604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2cksb"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.441849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9gzd\" (UniqueName: \"kubernetes.io/projected/8a32805d-3d95-44ab-8bd1-808d7391c49d-kube-api-access-c9gzd\") pod \"keystone-db-sync-fcvt8\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: W0121 15:48:28.469945 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf18429d_676f_4bef_b5c5_d06a69cc7b76.slice/crio-b872796470e5ac395861911106225de085a4adce3b45a6c09ac766675b12723d WatchSource:0}: Error finding container b872796470e5ac395861911106225de085a4adce3b45a6c09ac766675b12723d: Status 404 returned error can't find the container with id b872796470e5ac395861911106225de085a4adce3b45a6c09ac766675b12723d Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.522753 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.525168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-7fbgq"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.620631 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.793799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw"] Jan 21 15:48:28 crc kubenswrapper[4707]: W0121 15:48:28.811418 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod058de5ec_317c_4ba6_8497_3abaa84e49db.slice/crio-c0118b4e41b936c24eda7e41db041a7c7b23c77027963a91bd330644a524c205 WatchSource:0}: Error finding container c0118b4e41b936c24eda7e41db041a7c7b23c77027963a91bd330644a524c205: Status 404 returned error can't find the container with id c0118b4e41b936c24eda7e41db041a7c7b23c77027963a91bd330644a524c205 Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.875221 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.957407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fcvt8"] Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.971527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" event={"ID":"3c937de4-4dbe-4167-9706-78d225b7de95","Type":"ContainerStarted","Data":"d78fe18ee4ae49f5a377f50fb1344d259b86478400a251767b671c67f9cdc423"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.972921 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="7c26cf56-fc84-4283-af2d-edfba663233d" containerID="e7eeebd1988f725464039fcc2c92c42910a2ec711071f2574ef7fd86fdba055a" exitCode=0 Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.973010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" event={"ID":"7c26cf56-fc84-4283-af2d-edfba663233d","Type":"ContainerDied","Data":"e7eeebd1988f725464039fcc2c92c42910a2ec711071f2574ef7fd86fdba055a"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.973040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" event={"ID":"7c26cf56-fc84-4283-af2d-edfba663233d","Type":"ContainerStarted","Data":"88931086be8e3cf2e3a27c911b52b0366c5aec25140002647559debc225e808d"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.975107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" event={"ID":"058de5ec-317c-4ba6-8497-3abaa84e49db","Type":"ContainerStarted","Data":"c0118b4e41b936c24eda7e41db041a7c7b23c77027963a91bd330644a524c205"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.976126 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf18429d-676f-4bef-b5c5-d06a69cc7b76" containerID="f34721b7e3114cc6a1e9ef72c684808875796bac51d867e8523d5d153e69cb52" exitCode=0 Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.976165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-2cksb" event={"ID":"bf18429d-676f-4bef-b5c5-d06a69cc7b76","Type":"ContainerDied","Data":"f34721b7e3114cc6a1e9ef72c684808875796bac51d867e8523d5d153e69cb52"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.976179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-2cksb" event={"ID":"bf18429d-676f-4bef-b5c5-d06a69cc7b76","Type":"ContainerStarted","Data":"b872796470e5ac395861911106225de085a4adce3b45a6c09ac766675b12723d"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.977911 4707 generic.go:334] "Generic (PLEG): container finished" podID="7bbffd0b-e235-483f-8487-ad5955d58dd7" containerID="c230b11acb9618eb5c88ca18c142e1ad367210447000f32c35a942db7cc5585f" exitCode=0 Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.977949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" event={"ID":"7bbffd0b-e235-483f-8487-ad5955d58dd7","Type":"ContainerDied","Data":"c230b11acb9618eb5c88ca18c142e1ad367210447000f32c35a942db7cc5585f"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.977963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" event={"ID":"7bbffd0b-e235-483f-8487-ad5955d58dd7","Type":"ContainerStarted","Data":"be26a6f955dfd82ea82d48577dd5f5126483dfee577e71cb8f3eefaf720e984a"} Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.979075 4707 generic.go:334] "Generic (PLEG): container finished" podID="2098ba33-9f63-4eb2-95d1-c1b8219a3db7" containerID="ea797f16aa01ac7b3ee4f0754c5c7ce729ef2b755429c228fa55a073bdf52450" exitCode=0 Jan 21 15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.979099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-82fkc" event={"ID":"2098ba33-9f63-4eb2-95d1-c1b8219a3db7","Type":"ContainerDied","Data":"ea797f16aa01ac7b3ee4f0754c5c7ce729ef2b755429c228fa55a073bdf52450"} Jan 21 
15:48:28 crc kubenswrapper[4707]: I0121 15:48:28.979112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-82fkc" event={"ID":"2098ba33-9f63-4eb2-95d1-c1b8219a3db7","Type":"ContainerStarted","Data":"a7226bf6747a7cab53522f1c77ac600260994d9a2ea3653011258eca5bf69fb5"} Jan 21 15:48:29 crc kubenswrapper[4707]: I0121 15:48:29.987084 4707 generic.go:334] "Generic (PLEG): container finished" podID="058de5ec-317c-4ba6-8497-3abaa84e49db" containerID="415f4336b9f102676f316ec6eeca0a530fd2e6b550c3557faee9b4a9087b5cfc" exitCode=0 Jan 21 15:48:29 crc kubenswrapper[4707]: I0121 15:48:29.987341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" event={"ID":"058de5ec-317c-4ba6-8497-3abaa84e49db","Type":"ContainerDied","Data":"415f4336b9f102676f316ec6eeca0a530fd2e6b550c3557faee9b4a9087b5cfc"} Jan 21 15:48:29 crc kubenswrapper[4707]: I0121 15:48:29.989672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" event={"ID":"8a32805d-3d95-44ab-8bd1-808d7391c49d","Type":"ContainerStarted","Data":"1768b7b2effe8e34bac8d411159997a5bafe09b860da85c42876134ed5257e4e"} Jan 21 15:48:29 crc kubenswrapper[4707]: I0121 15:48:29.989705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" event={"ID":"8a32805d-3d95-44ab-8bd1-808d7391c49d","Type":"ContainerStarted","Data":"3f40640919b297dd43312938a55766ebb3ad8fc219ae2ba1a3e4fdf395c0eb54"} Jan 21 15:48:29 crc kubenswrapper[4707]: I0121 15:48:29.993259 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c937de4-4dbe-4167-9706-78d225b7de95" containerID="8e67c2ffb0682c00c6f22286ee55c8e92e062463d0d967c5528351a1e34c7ada" exitCode=0 Jan 21 15:48:29 crc kubenswrapper[4707]: I0121 15:48:29.993427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" event={"ID":"3c937de4-4dbe-4167-9706-78d225b7de95","Type":"ContainerDied","Data":"8e67c2ffb0682c00c6f22286ee55c8e92e062463d0d967c5528351a1e34c7ada"} Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.033248 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" podStartSLOduration=2.033218513 podStartE2EDuration="2.033218513s" podCreationTimestamp="2026-01-21 15:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:30.029694196 +0000 UTC m=+2807.211210419" watchObservedRunningTime="2026-01-21 15:48:30.033218513 +0000 UTC m=+2807.214734735" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.292879 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.354200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbffd0b-e235-483f-8487-ad5955d58dd7-operator-scripts\") pod \"7bbffd0b-e235-483f-8487-ad5955d58dd7\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.354291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjb98\" (UniqueName: \"kubernetes.io/projected/7bbffd0b-e235-483f-8487-ad5955d58dd7-kube-api-access-vjb98\") pod \"7bbffd0b-e235-483f-8487-ad5955d58dd7\" (UID: \"7bbffd0b-e235-483f-8487-ad5955d58dd7\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.355533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bbffd0b-e235-483f-8487-ad5955d58dd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bbffd0b-e235-483f-8487-ad5955d58dd7" (UID: "7bbffd0b-e235-483f-8487-ad5955d58dd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.360088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbffd0b-e235-483f-8487-ad5955d58dd7-kube-api-access-vjb98" (OuterVolumeSpecName: "kube-api-access-vjb98") pod "7bbffd0b-e235-483f-8487-ad5955d58dd7" (UID: "7bbffd0b-e235-483f-8487-ad5955d58dd7"). InnerVolumeSpecName "kube-api-access-vjb98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.411323 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.415076 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.418785 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.455970 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbffd0b-e235-483f-8487-ad5955d58dd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.456002 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjb98\" (UniqueName: \"kubernetes.io/projected/7bbffd0b-e235-483f-8487-ad5955d58dd7-kube-api-access-vjb98\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.556691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c26cf56-fc84-4283-af2d-edfba663233d-operator-scripts\") pod \"7c26cf56-fc84-4283-af2d-edfba663233d\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.556728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5tb7\" (UniqueName: \"kubernetes.io/projected/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-kube-api-access-f5tb7\") pod \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.556764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-operator-scripts\") pod \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\" (UID: \"2098ba33-9f63-4eb2-95d1-c1b8219a3db7\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.556805 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf18429d-676f-4bef-b5c5-d06a69cc7b76-operator-scripts\") pod \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.556916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brtss\" (UniqueName: \"kubernetes.io/projected/7c26cf56-fc84-4283-af2d-edfba663233d-kube-api-access-brtss\") pod \"7c26cf56-fc84-4283-af2d-edfba663233d\" (UID: \"7c26cf56-fc84-4283-af2d-edfba663233d\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.556964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t786n\" (UniqueName: \"kubernetes.io/projected/bf18429d-676f-4bef-b5c5-d06a69cc7b76-kube-api-access-t786n\") pod \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\" (UID: \"bf18429d-676f-4bef-b5c5-d06a69cc7b76\") " Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.557347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf18429d-676f-4bef-b5c5-d06a69cc7b76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf18429d-676f-4bef-b5c5-d06a69cc7b76" (UID: "bf18429d-676f-4bef-b5c5-d06a69cc7b76"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.557437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c26cf56-fc84-4283-af2d-edfba663233d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c26cf56-fc84-4283-af2d-edfba663233d" (UID: "7c26cf56-fc84-4283-af2d-edfba663233d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.557484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2098ba33-9f63-4eb2-95d1-c1b8219a3db7" (UID: "2098ba33-9f63-4eb2-95d1-c1b8219a3db7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.560177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c26cf56-fc84-4283-af2d-edfba663233d-kube-api-access-brtss" (OuterVolumeSpecName: "kube-api-access-brtss") pod "7c26cf56-fc84-4283-af2d-edfba663233d" (UID: "7c26cf56-fc84-4283-af2d-edfba663233d"). InnerVolumeSpecName "kube-api-access-brtss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.560659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf18429d-676f-4bef-b5c5-d06a69cc7b76-kube-api-access-t786n" (OuterVolumeSpecName: "kube-api-access-t786n") pod "bf18429d-676f-4bef-b5c5-d06a69cc7b76" (UID: "bf18429d-676f-4bef-b5c5-d06a69cc7b76"). InnerVolumeSpecName "kube-api-access-t786n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.560709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-kube-api-access-f5tb7" (OuterVolumeSpecName: "kube-api-access-f5tb7") pod "2098ba33-9f63-4eb2-95d1-c1b8219a3db7" (UID: "2098ba33-9f63-4eb2-95d1-c1b8219a3db7"). InnerVolumeSpecName "kube-api-access-f5tb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.658461 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brtss\" (UniqueName: \"kubernetes.io/projected/7c26cf56-fc84-4283-af2d-edfba663233d-kube-api-access-brtss\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.658495 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t786n\" (UniqueName: \"kubernetes.io/projected/bf18429d-676f-4bef-b5c5-d06a69cc7b76-kube-api-access-t786n\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.658507 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c26cf56-fc84-4283-af2d-edfba663233d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.658515 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5tb7\" (UniqueName: \"kubernetes.io/projected/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-kube-api-access-f5tb7\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.658536 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2098ba33-9f63-4eb2-95d1-c1b8219a3db7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4707]: I0121 15:48:30.658546 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf18429d-676f-4bef-b5c5-d06a69cc7b76-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.000849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-82fkc" event={"ID":"2098ba33-9f63-4eb2-95d1-c1b8219a3db7","Type":"ContainerDied","Data":"a7226bf6747a7cab53522f1c77ac600260994d9a2ea3653011258eca5bf69fb5"} Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.000863 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-82fkc" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.000988 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7226bf6747a7cab53522f1c77ac600260994d9a2ea3653011258eca5bf69fb5" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.002186 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.002218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl" event={"ID":"7c26cf56-fc84-4283-af2d-edfba663233d","Type":"ContainerDied","Data":"88931086be8e3cf2e3a27c911b52b0366c5aec25140002647559debc225e808d"} Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.002240 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88931086be8e3cf2e3a27c911b52b0366c5aec25140002647559debc225e808d" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.003666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-2cksb" event={"ID":"bf18429d-676f-4bef-b5c5-d06a69cc7b76","Type":"ContainerDied","Data":"b872796470e5ac395861911106225de085a4adce3b45a6c09ac766675b12723d"} Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.003691 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b872796470e5ac395861911106225de085a4adce3b45a6c09ac766675b12723d" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.003745 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-2cksb" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.005152 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a32805d-3d95-44ab-8bd1-808d7391c49d" containerID="1768b7b2effe8e34bac8d411159997a5bafe09b860da85c42876134ed5257e4e" exitCode=0 Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.005212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" event={"ID":"8a32805d-3d95-44ab-8bd1-808d7391c49d","Type":"ContainerDied","Data":"1768b7b2effe8e34bac8d411159997a5bafe09b860da85c42876134ed5257e4e"} Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.006685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" event={"ID":"7bbffd0b-e235-483f-8487-ad5955d58dd7","Type":"ContainerDied","Data":"be26a6f955dfd82ea82d48577dd5f5126483dfee577e71cb8f3eefaf720e984a"} Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.006715 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be26a6f955dfd82ea82d48577dd5f5126483dfee577e71cb8f3eefaf720e984a" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.006798 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-7fbgq" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.311659 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.317007 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.483482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058de5ec-317c-4ba6-8497-3abaa84e49db-operator-scripts\") pod \"058de5ec-317c-4ba6-8497-3abaa84e49db\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.483559 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2kvp\" (UniqueName: \"kubernetes.io/projected/058de5ec-317c-4ba6-8497-3abaa84e49db-kube-api-access-w2kvp\") pod \"058de5ec-317c-4ba6-8497-3abaa84e49db\" (UID: \"058de5ec-317c-4ba6-8497-3abaa84e49db\") " Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.483744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c937de4-4dbe-4167-9706-78d225b7de95-operator-scripts\") pod \"3c937de4-4dbe-4167-9706-78d225b7de95\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.483764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ggdn\" (UniqueName: \"kubernetes.io/projected/3c937de4-4dbe-4167-9706-78d225b7de95-kube-api-access-5ggdn\") pod \"3c937de4-4dbe-4167-9706-78d225b7de95\" (UID: \"3c937de4-4dbe-4167-9706-78d225b7de95\") " Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.484020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058de5ec-317c-4ba6-8497-3abaa84e49db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "058de5ec-317c-4ba6-8497-3abaa84e49db" (UID: "058de5ec-317c-4ba6-8497-3abaa84e49db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.484152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c937de4-4dbe-4167-9706-78d225b7de95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c937de4-4dbe-4167-9706-78d225b7de95" (UID: "3c937de4-4dbe-4167-9706-78d225b7de95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.484401 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c937de4-4dbe-4167-9706-78d225b7de95-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.484437 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/058de5ec-317c-4ba6-8497-3abaa84e49db-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.499953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058de5ec-317c-4ba6-8497-3abaa84e49db-kube-api-access-w2kvp" (OuterVolumeSpecName: "kube-api-access-w2kvp") pod "058de5ec-317c-4ba6-8497-3abaa84e49db" (UID: "058de5ec-317c-4ba6-8497-3abaa84e49db"). InnerVolumeSpecName "kube-api-access-w2kvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.503926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c937de4-4dbe-4167-9706-78d225b7de95-kube-api-access-5ggdn" (OuterVolumeSpecName: "kube-api-access-5ggdn") pod "3c937de4-4dbe-4167-9706-78d225b7de95" (UID: "3c937de4-4dbe-4167-9706-78d225b7de95"). InnerVolumeSpecName "kube-api-access-5ggdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.586210 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ggdn\" (UniqueName: \"kubernetes.io/projected/3c937de4-4dbe-4167-9706-78d225b7de95-kube-api-access-5ggdn\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:31 crc kubenswrapper[4707]: I0121 15:48:31.586242 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2kvp\" (UniqueName: \"kubernetes.io/projected/058de5ec-317c-4ba6-8497-3abaa84e49db-kube-api-access-w2kvp\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.015850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" event={"ID":"3c937de4-4dbe-4167-9706-78d225b7de95","Type":"ContainerDied","Data":"d78fe18ee4ae49f5a377f50fb1344d259b86478400a251767b671c67f9cdc423"} Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.015874 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.015889 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d78fe18ee4ae49f5a377f50fb1344d259b86478400a251767b671c67f9cdc423" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.017662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" event={"ID":"058de5ec-317c-4ba6-8497-3abaa84e49db","Type":"ContainerDied","Data":"c0118b4e41b936c24eda7e41db041a7c7b23c77027963a91bd330644a524c205"} Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.017684 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0118b4e41b936c24eda7e41db041a7c7b23c77027963a91bd330644a524c205" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.017686 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.261256 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.404290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-combined-ca-bundle\") pod \"8a32805d-3d95-44ab-8bd1-808d7391c49d\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.404568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-config-data\") pod \"8a32805d-3d95-44ab-8bd1-808d7391c49d\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.404630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9gzd\" (UniqueName: \"kubernetes.io/projected/8a32805d-3d95-44ab-8bd1-808d7391c49d-kube-api-access-c9gzd\") pod \"8a32805d-3d95-44ab-8bd1-808d7391c49d\" (UID: \"8a32805d-3d95-44ab-8bd1-808d7391c49d\") " Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.408403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a32805d-3d95-44ab-8bd1-808d7391c49d-kube-api-access-c9gzd" (OuterVolumeSpecName: "kube-api-access-c9gzd") pod "8a32805d-3d95-44ab-8bd1-808d7391c49d" (UID: "8a32805d-3d95-44ab-8bd1-808d7391c49d"). InnerVolumeSpecName "kube-api-access-c9gzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.421470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a32805d-3d95-44ab-8bd1-808d7391c49d" (UID: "8a32805d-3d95-44ab-8bd1-808d7391c49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.433857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-config-data" (OuterVolumeSpecName: "config-data") pod "8a32805d-3d95-44ab-8bd1-808d7391c49d" (UID: "8a32805d-3d95-44ab-8bd1-808d7391c49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.506552 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.506581 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9gzd\" (UniqueName: \"kubernetes.io/projected/8a32805d-3d95-44ab-8bd1-808d7391c49d-kube-api-access-c9gzd\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:32 crc kubenswrapper[4707]: I0121 15:48:32.506593 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32805d-3d95-44ab-8bd1-808d7391c49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.025241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" event={"ID":"8a32805d-3d95-44ab-8bd1-808d7391c49d","Type":"ContainerDied","Data":"3f40640919b297dd43312938a55766ebb3ad8fc219ae2ba1a3e4fdf395c0eb54"} Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.025289 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f40640919b297dd43312938a55766ebb3ad8fc219ae2ba1a3e4fdf395c0eb54" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.025292 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-fcvt8" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.392975 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mxx7g"] Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393273 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058de5ec-317c-4ba6-8497-3abaa84e49db" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393290 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="058de5ec-317c-4ba6-8497-3abaa84e49db" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393303 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c26cf56-fc84-4283-af2d-edfba663233d" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393309 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c26cf56-fc84-4283-af2d-edfba663233d" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393322 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbffd0b-e235-483f-8487-ad5955d58dd7" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbffd0b-e235-483f-8487-ad5955d58dd7" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf18429d-676f-4bef-b5c5-d06a69cc7b76" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393343 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf18429d-676f-4bef-b5c5-d06a69cc7b76" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393353 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8a32805d-3d95-44ab-8bd1-808d7391c49d" containerName="keystone-db-sync" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393359 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a32805d-3d95-44ab-8bd1-808d7391c49d" containerName="keystone-db-sync" Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393368 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2098ba33-9f63-4eb2-95d1-c1b8219a3db7" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393375 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2098ba33-9f63-4eb2-95d1-c1b8219a3db7" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: E0121 15:48:33.393393 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c937de4-4dbe-4167-9706-78d225b7de95" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c937de4-4dbe-4167-9706-78d225b7de95" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393545 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c26cf56-fc84-4283-af2d-edfba663233d" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393557 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2098ba33-9f63-4eb2-95d1-c1b8219a3db7" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a32805d-3d95-44ab-8bd1-808d7391c49d" containerName="keystone-db-sync" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393575 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c937de4-4dbe-4167-9706-78d225b7de95" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbffd0b-e235-483f-8487-ad5955d58dd7" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393592 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf18429d-676f-4bef-b5c5-d06a69cc7b76" containerName="mariadb-database-create" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.393605 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="058de5ec-317c-4ba6-8497-3abaa84e49db" containerName="mariadb-account-create-update" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.394087 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.395710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.395900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.396323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.396499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x8w4j" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.396631 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.411124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mxx7g"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.516270 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.517873 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.519447 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.519608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.521583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-config-data\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.521617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-fernet-keys\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.521764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-combined-ca-bundle\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.521821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-credential-keys\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.521903 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bng27\" (UniqueName: \"kubernetes.io/projected/e1917a27-98a1-43cc-8833-839bc0cf67dc-kube-api-access-bng27\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.521938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-scripts\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.534053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.580323 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mczjr"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.581115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.587319 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tg5cr"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.589674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.594680 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-kd86g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.594681 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.595092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.603608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.613241 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.613331 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-j7drz" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.619224 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tg5cr"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bng27\" (UniqueName: \"kubernetes.io/projected/e1917a27-98a1-43cc-8833-839bc0cf67dc-kube-api-access-bng27\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-scripts\") pod \"keystone-bootstrap-mxx7g\" (UID: 
\"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-scripts\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-run-httpd\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-log-httpd\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-config-data\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-fernet-keys\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbtz\" (UniqueName: \"kubernetes.io/projected/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-kube-api-access-sxbtz\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-config-data\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-combined-ca-bundle\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-credential-keys\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.623979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.631325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-fernet-keys\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.634285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-config-data\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.634918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-credential-keys\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.639638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mczjr"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.640168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-scripts\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.640632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-combined-ca-bundle\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.651756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bng27\" (UniqueName: \"kubernetes.io/projected/e1917a27-98a1-43cc-8833-839bc0cf67dc-kube-api-access-bng27\") pod \"keystone-bootstrap-mxx7g\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.654313 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-tt9db"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.655299 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.666302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-4l28d" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.666496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.666627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.679903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-tt9db"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.707761 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.723127 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-45blv"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.724253 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxbtz\" (UniqueName: \"kubernetes.io/projected/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-kube-api-access-sxbtz\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6338d565-911b-47d4-ba71-a9cdd9d6bf76-etc-machine-id\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-config-data\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-scripts\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-combined-ca-bundle\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-config\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-combined-ca-bundle\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-scripts\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7ph\" (UniqueName: \"kubernetes.io/projected/1cc8577c-8dd6-4ae9-9057-7091c1a10853-kube-api-access-4b7ph\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-run-httpd\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-log-httpd\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-config-data\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6l92\" (UniqueName: \"kubernetes.io/projected/6338d565-911b-47d4-ba71-a9cdd9d6bf76-kube-api-access-n6l92\") pod \"cinder-db-sync-mczjr\" (UID: 
\"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.725785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-db-sync-config-data\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.728598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-run-httpd\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.728679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-66qv6" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.728917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.728938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-log-httpd\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.732170 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-45blv"] Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.733941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.734749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.742124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-config-data\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.743867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbtz\" (UniqueName: \"kubernetes.io/projected/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-kube-api-access-sxbtz\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.749393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-scripts\") pod \"ceilometer-0\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-combined-ca-bundle\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-scripts\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-config-data\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-db-sync-config-data\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6l92\" (UniqueName: \"kubernetes.io/projected/6338d565-911b-47d4-ba71-a9cdd9d6bf76-kube-api-access-n6l92\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85d4c\" (UniqueName: \"kubernetes.io/projected/025be631-d9c0-4382-a42f-1979af2fddd3-kube-api-access-85d4c\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-db-sync-config-data\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6338d565-911b-47d4-ba71-a9cdd9d6bf76-etc-machine-id\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-scripts\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " 
pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-combined-ca-bundle\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbd47\" (UniqueName: \"kubernetes.io/projected/a57de07a-627b-4881-bf68-124934dbe603-kube-api-access-pbd47\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-config\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-config-data\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-combined-ca-bundle\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-combined-ca-bundle\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025be631-d9c0-4382-a42f-1979af2fddd3-logs\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7ph\" (UniqueName: \"kubernetes.io/projected/1cc8577c-8dd6-4ae9-9057-7091c1a10853-kube-api-access-4b7ph\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.827761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6338d565-911b-47d4-ba71-a9cdd9d6bf76-etc-machine-id\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") 
" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.830727 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.831716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-db-sync-config-data\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.832530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-config\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.832924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-combined-ca-bundle\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.833156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-combined-ca-bundle\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.834778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-config-data\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.838145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-scripts\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.842122 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7ph\" (UniqueName: \"kubernetes.io/projected/1cc8577c-8dd6-4ae9-9057-7091c1a10853-kube-api-access-4b7ph\") pod \"neutron-db-sync-tg5cr\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.847764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6l92\" (UniqueName: \"kubernetes.io/projected/6338d565-911b-47d4-ba71-a9cdd9d6bf76-kube-api-access-n6l92\") pod \"cinder-db-sync-mczjr\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.893879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.915650 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbd47\" (UniqueName: \"kubernetes.io/projected/a57de07a-627b-4881-bf68-124934dbe603-kube-api-access-pbd47\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-config-data\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-combined-ca-bundle\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025be631-d9c0-4382-a42f-1979af2fddd3-logs\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-combined-ca-bundle\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-scripts\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-db-sync-config-data\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.929831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85d4c\" (UniqueName: \"kubernetes.io/projected/025be631-d9c0-4382-a42f-1979af2fddd3-kube-api-access-85d4c\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.930277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025be631-d9c0-4382-a42f-1979af2fddd3-logs\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " 
pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.933870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-config-data\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.934323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-combined-ca-bundle\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.934490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-scripts\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.939834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-combined-ca-bundle\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.945229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbd47\" (UniqueName: \"kubernetes.io/projected/a57de07a-627b-4881-bf68-124934dbe603-kube-api-access-pbd47\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.945734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-db-sync-config-data\") pod \"barbican-db-sync-45blv\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:33 crc kubenswrapper[4707]: I0121 15:48:33.946039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85d4c\" (UniqueName: \"kubernetes.io/projected/025be631-d9c0-4382-a42f-1979af2fddd3-kube-api-access-85d4c\") pod \"placement-db-sync-tt9db\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.003424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.087883 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.104991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mxx7g"] Jan 21 15:48:34 crc kubenswrapper[4707]: W0121 15:48:34.115601 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1917a27_98a1_43cc_8833_839bc0cf67dc.slice/crio-8f67368ed5e7353ab1977101d18e09df606680b8b79b78d956c7963361729c68 WatchSource:0}: Error finding container 8f67368ed5e7353ab1977101d18e09df606680b8b79b78d956c7963361729c68: Status 404 returned error can't find the container with id 8f67368ed5e7353ab1977101d18e09df606680b8b79b78d956c7963361729c68 Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.239641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.338291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mczjr"] Jan 21 15:48:34 crc kubenswrapper[4707]: W0121 15:48:34.340932 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6338d565_911b_47d4_ba71_a9cdd9d6bf76.slice/crio-96d1971249939bfbcda476fbb940e43641c410409c244111758ceab6d28b4502 WatchSource:0}: Error finding container 96d1971249939bfbcda476fbb940e43641c410409c244111758ceab6d28b4502: Status 404 returned error can't find the container with id 96d1971249939bfbcda476fbb940e43641c410409c244111758ceab6d28b4502 Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.434405 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tg5cr"] Jan 21 15:48:34 crc kubenswrapper[4707]: W0121 15:48:34.440298 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc8577c_8dd6_4ae9_9057_7091c1a10853.slice/crio-cd4cf8f36c2c15c58b4a086575baeeec4c73c0765efe8fc145cce71006583dbd WatchSource:0}: Error finding container cd4cf8f36c2c15c58b4a086575baeeec4c73c0765efe8fc145cce71006583dbd: Status 404 returned error can't find the container with id cd4cf8f36c2c15c58b4a086575baeeec4c73c0765efe8fc145cce71006583dbd Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.477637 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.480469 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.485553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.485592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.485967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.494094 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.527111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-tt9db"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.546921 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.548193 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.550909 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.572866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.581098 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: E0121 15:48:34.581615 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-54sw9 logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="96b6524e-2d86-44c8-a423-fa506f0ee983" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.601062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-45blv"] Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-logs\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-scripts\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpkrg\" (UniqueName: \"kubernetes.io/projected/200fc443-52df-4f54-988b-9c7e100ab0fa-kube-api-access-wpkrg\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643385 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-config-data\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-logs\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.643437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54sw9\" (UniqueName: \"kubernetes.io/projected/96b6524e-2d86-44c8-a423-fa506f0ee983-kube-api-access-54sw9\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.661580 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:34 crc kubenswrapper[4707]: E0121 15:48:34.670411 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-wpkrg logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="200fc443-52df-4f54-988b-9c7e100ab0fa" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-config-data\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-logs\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54sw9\" (UniqueName: \"kubernetes.io/projected/96b6524e-2d86-44c8-a423-fa506f0ee983-kube-api-access-54sw9\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-logs\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.744987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.745004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-scripts\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.745042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpkrg\" (UniqueName: 
\"kubernetes.io/projected/200fc443-52df-4f54-988b-9c7e100ab0fa-kube-api-access-wpkrg\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.750104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-logs\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.761647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.762249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-logs\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.763144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.765085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.765340 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.766462 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.766978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.781642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-config-data\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.799241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.801722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54sw9\" (UniqueName: \"kubernetes.io/projected/96b6524e-2d86-44c8-a423-fa506f0ee983-kube-api-access-54sw9\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.804459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpkrg\" (UniqueName: \"kubernetes.io/projected/200fc443-52df-4f54-988b-9c7e100ab0fa-kube-api-access-wpkrg\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.806654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.899126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-scripts\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.950284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:34 crc kubenswrapper[4707]: I0121 15:48:34.964970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.048119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" event={"ID":"6338d565-911b-47d4-ba71-a9cdd9d6bf76","Type":"ContainerStarted","Data":"96d1971249939bfbcda476fbb940e43641c410409c244111758ceab6d28b4502"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.052101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-tt9db" 
event={"ID":"025be631-d9c0-4382-a42f-1979af2fddd3","Type":"ContainerStarted","Data":"017b62d5c64e8c72010733bce258aabed46e954ed91e7af4e64b2cff4d2c6e42"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.052135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-tt9db" event={"ID":"025be631-d9c0-4382-a42f-1979af2fddd3","Type":"ContainerStarted","Data":"c8f59fba561cc8157a26c42a58d8516146a2bfd10e7ec0caab9ac50aafd37d55"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.053668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" event={"ID":"1cc8577c-8dd6-4ae9-9057-7091c1a10853","Type":"ContainerStarted","Data":"02e8a2611ee7d85c7fd5a2ff060701f99988408f3267b95159cbe36a24e03ce3"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.053690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" event={"ID":"1cc8577c-8dd6-4ae9-9057-7091c1a10853","Type":"ContainerStarted","Data":"cd4cf8f36c2c15c58b4a086575baeeec4c73c0765efe8fc145cce71006583dbd"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.059273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" event={"ID":"e1917a27-98a1-43cc-8833-839bc0cf67dc","Type":"ContainerStarted","Data":"3390dfaa1745c314422218fd6221fa92f2714b8258f227fa76da5bce0cf4e8f9"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.059362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" event={"ID":"e1917a27-98a1-43cc-8833-839bc0cf67dc","Type":"ContainerStarted","Data":"8f67368ed5e7353ab1977101d18e09df606680b8b79b78d956c7963361729c68"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.061964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerStarted","Data":"70746b54176ba74a095ddbf59e0c541b67e692a1e00d80c9ae722b295c769d7a"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.063390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.063468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-45blv" event={"ID":"a57de07a-627b-4881-bf68-124934dbe603","Type":"ContainerStarted","Data":"bcb18c1bb6722be674fe474b5d1015d238f642b678756b08acc09dd74b6f854a"} Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.063983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.071966 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-tt9db" podStartSLOduration=2.071956251 podStartE2EDuration="2.071956251s" podCreationTimestamp="2026-01-21 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:35.067763918 +0000 UTC m=+2812.249280140" watchObservedRunningTime="2026-01-21 15:48:35.071956251 +0000 UTC m=+2812.253472473" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.079419 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.086338 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.087898 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-45blv" podStartSLOduration=2.08788478 podStartE2EDuration="2.08788478s" podCreationTimestamp="2026-01-21 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:35.082564337 +0000 UTC m=+2812.264080559" watchObservedRunningTime="2026-01-21 15:48:35.08788478 +0000 UTC m=+2812.269401002" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.105388 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" podStartSLOduration=2.105369967 podStartE2EDuration="2.105369967s" podCreationTimestamp="2026-01-21 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:35.102105859 +0000 UTC m=+2812.283622081" watchObservedRunningTime="2026-01-21 15:48:35.105369967 +0000 UTC m=+2812.286886189" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.117788 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" podStartSLOduration=2.117773909 podStartE2EDuration="2.117773909s" podCreationTimestamp="2026-01-21 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:35.115039909 +0000 UTC m=+2812.296556131" watchObservedRunningTime="2026-01-21 15:48:35.117773909 +0000 UTC m=+2812.299290131" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-httpd-run\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-scripts\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-scripts\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-combined-ca-bundle\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-logs\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-httpd-run\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-logs\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-config-data\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54sw9\" (UniqueName: \"kubernetes.io/projected/96b6524e-2d86-44c8-a423-fa506f0ee983-kube-api-access-54sw9\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-combined-ca-bundle\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.157986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-config-data\") pod \"96b6524e-2d86-44c8-a423-fa506f0ee983\" (UID: \"96b6524e-2d86-44c8-a423-fa506f0ee983\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.158006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpkrg\" (UniqueName: \"kubernetes.io/projected/200fc443-52df-4f54-988b-9c7e100ab0fa-kube-api-access-wpkrg\") pod \"200fc443-52df-4f54-988b-9c7e100ab0fa\" (UID: \"200fc443-52df-4f54-988b-9c7e100ab0fa\") " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.159430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.160871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-logs" (OuterVolumeSpecName: "logs") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.163429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-scripts" (OuterVolumeSpecName: "scripts") pod "200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.163888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-logs" (OuterVolumeSpecName: "logs") pod "200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.164073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.164187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.165029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.166310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200fc443-52df-4f54-988b-9c7e100ab0fa-kube-api-access-wpkrg" (OuterVolumeSpecName: "kube-api-access-wpkrg") pod "200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "kube-api-access-wpkrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.166608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-scripts" (OuterVolumeSpecName: "scripts") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.167372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.169656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b6524e-2d86-44c8-a423-fa506f0ee983-kube-api-access-54sw9" (OuterVolumeSpecName: "kube-api-access-54sw9") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "kube-api-access-54sw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.169667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.169740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-config-data" (OuterVolumeSpecName: "config-data") pod "96b6524e-2d86-44c8-a423-fa506f0ee983" (UID: "96b6524e-2d86-44c8-a423-fa506f0ee983"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.169889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-config-data" (OuterVolumeSpecName: "config-data") pod "200fc443-52df-4f54-988b-9c7e100ab0fa" (UID: "200fc443-52df-4f54-988b-9c7e100ab0fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.259732 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.259771 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.259782 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.259791 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.266907 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.266978 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.266994 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/200fc443-52df-4f54-988b-9c7e100ab0fa-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267004 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267014 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b6524e-2d86-44c8-a423-fa506f0ee983-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267023 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200fc443-52df-4f54-988b-9c7e100ab0fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267034 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54sw9\" (UniqueName: \"kubernetes.io/projected/96b6524e-2d86-44c8-a423-fa506f0ee983-kube-api-access-54sw9\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267043 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267052 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6524e-2d86-44c8-a423-fa506f0ee983-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.267061 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wpkrg\" (UniqueName: \"kubernetes.io/projected/200fc443-52df-4f54-988b-9c7e100ab0fa-kube-api-access-wpkrg\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.273761 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.278801 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.369022 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:35 crc kubenswrapper[4707]: I0121 15:48:35.369051 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.072712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerStarted","Data":"756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757"} Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.072768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerStarted","Data":"35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093"} Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.074396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-45blv" event={"ID":"a57de07a-627b-4881-bf68-124934dbe603","Type":"ContainerStarted","Data":"2f142df25ab095ac8ab812304764acf0c07b5fb5d97186a9832bc8e8eceb7abf"} Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.075866 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.075865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" event={"ID":"6338d565-911b-47d4-ba71-a9cdd9d6bf76","Type":"ContainerStarted","Data":"e002ca41987b39d839d15e073a0294cd8fac52b5b32405ad039756d1762a6460"} Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.075893 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.096594 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" podStartSLOduration=3.096578509 podStartE2EDuration="3.096578509s" podCreationTimestamp="2026-01-21 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:36.090031979 +0000 UTC m=+2813.271548202" watchObservedRunningTime="2026-01-21 15:48:36.096578509 +0000 UTC m=+2813.278094731" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.135617 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.144344 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.152030 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.153259 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.163916 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.165013 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.165161 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.165254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.182881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.183642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.183704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-logs\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.183762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 
15:48:36.183790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.183853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.183919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctc2z\" (UniqueName: \"kubernetes.io/projected/f256c211-e440-42ff-80ac-3b9084007cf7-kube-api-access-ctc2z\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.183971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.189847 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.200008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.201153 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.205219 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.211132 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.284911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-logs\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.284950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-logs\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.284990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqjld\" (UniqueName: \"kubernetes.io/projected/d08432a6-65ad-4843-924c-2b1f96613356-kube-api-access-gqjld\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-scripts\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctc2z\" (UniqueName: \"kubernetes.io/projected/f256c211-e440-42ff-80ac-3b9084007cf7-kube-api-access-ctc2z\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.285253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.286388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-logs\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.286599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.286734 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.291699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.300276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.300445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.304590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctc2z\" (UniqueName: \"kubernetes.io/projected/f256c211-e440-42ff-80ac-3b9084007cf7-kube-api-access-ctc2z\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.306902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.386827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-config-data\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.386881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-scripts\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.386972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-logs\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqjld\" (UniqueName: \"kubernetes.io/projected/d08432a6-65ad-4843-924c-2b1f96613356-kube-api-access-gqjld\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387458 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.387595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-logs\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.392480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-config-data\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.392907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-scripts\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.397969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.404020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqjld\" (UniqueName: \"kubernetes.io/projected/d08432a6-65ad-4843-924c-2b1f96613356-kube-api-access-gqjld\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.413974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.485552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.527276 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.933534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:48:36 crc kubenswrapper[4707]: I0121 15:48:36.975604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.086205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d08432a6-65ad-4843-924c-2b1f96613356","Type":"ContainerStarted","Data":"60a2f497ebf06abc01430dea1bb4554116d32887e5f1d0aad6e883ba4a2d7c47"} Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.089331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerStarted","Data":"e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d"} Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.094941 4707 generic.go:334] "Generic (PLEG): container finished" podID="a57de07a-627b-4881-bf68-124934dbe603" containerID="2f142df25ab095ac8ab812304764acf0c07b5fb5d97186a9832bc8e8eceb7abf" exitCode=0 Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.094989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-45blv" event={"ID":"a57de07a-627b-4881-bf68-124934dbe603","Type":"ContainerDied","Data":"2f142df25ab095ac8ab812304764acf0c07b5fb5d97186a9832bc8e8eceb7abf"} Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.096567 4707 generic.go:334] "Generic (PLEG): container finished" podID="025be631-d9c0-4382-a42f-1979af2fddd3" containerID="017b62d5c64e8c72010733bce258aabed46e954ed91e7af4e64b2cff4d2c6e42" exitCode=0 Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.096631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-tt9db" event={"ID":"025be631-d9c0-4382-a42f-1979af2fddd3","Type":"ContainerDied","Data":"017b62d5c64e8c72010733bce258aabed46e954ed91e7af4e64b2cff4d2c6e42"} Jan 21 
15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.101419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f256c211-e440-42ff-80ac-3b9084007cf7","Type":"ContainerStarted","Data":"81eeadfd836334be8a28aedd21ac0aaaafb33c397ec11a33e1cb2bef65754285"} Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.193012 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200fc443-52df-4f54-988b-9c7e100ab0fa" path="/var/lib/kubelet/pods/200fc443-52df-4f54-988b-9c7e100ab0fa/volumes" Jan 21 15:48:37 crc kubenswrapper[4707]: I0121 15:48:37.193574 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b6524e-2d86-44c8-a423-fa506f0ee983" path="/var/lib/kubelet/pods/96b6524e-2d86-44c8-a423-fa506f0ee983/volumes" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.111191 4707 generic.go:334] "Generic (PLEG): container finished" podID="6338d565-911b-47d4-ba71-a9cdd9d6bf76" containerID="e002ca41987b39d839d15e073a0294cd8fac52b5b32405ad039756d1762a6460" exitCode=0 Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.111300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" event={"ID":"6338d565-911b-47d4-ba71-a9cdd9d6bf76","Type":"ContainerDied","Data":"e002ca41987b39d839d15e073a0294cd8fac52b5b32405ad039756d1762a6460"} Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.117059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f256c211-e440-42ff-80ac-3b9084007cf7","Type":"ContainerStarted","Data":"d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0"} Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.117113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f256c211-e440-42ff-80ac-3b9084007cf7","Type":"ContainerStarted","Data":"02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf"} Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.118748 4707 generic.go:334] "Generic (PLEG): container finished" podID="e1917a27-98a1-43cc-8833-839bc0cf67dc" containerID="3390dfaa1745c314422218fd6221fa92f2714b8258f227fa76da5bce0cf4e8f9" exitCode=0 Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.118824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" event={"ID":"e1917a27-98a1-43cc-8833-839bc0cf67dc","Type":"ContainerDied","Data":"3390dfaa1745c314422218fd6221fa92f2714b8258f227fa76da5bce0cf4e8f9"} Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.121198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d08432a6-65ad-4843-924c-2b1f96613356","Type":"ContainerStarted","Data":"4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3"} Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.121239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d08432a6-65ad-4843-924c-2b1f96613356","Type":"ContainerStarted","Data":"0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d"} Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.183461 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.18344804 podStartE2EDuration="2.18344804s" 
podCreationTimestamp="2026-01-21 15:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:38.167388324 +0000 UTC m=+2815.348904545" watchObservedRunningTime="2026-01-21 15:48:38.18344804 +0000 UTC m=+2815.364964261" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.184561 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.184556353 podStartE2EDuration="2.184556353s" podCreationTimestamp="2026-01-21 15:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:38.181323104 +0000 UTC m=+2815.362839326" watchObservedRunningTime="2026-01-21 15:48:38.184556353 +0000 UTC m=+2815.366072575" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.433890 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.438896 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.620880 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025be631-d9c0-4382-a42f-1979af2fddd3-logs\") pod \"025be631-d9c0-4382-a42f-1979af2fddd3\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.620973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbd47\" (UniqueName: \"kubernetes.io/projected/a57de07a-627b-4881-bf68-124934dbe603-kube-api-access-pbd47\") pod \"a57de07a-627b-4881-bf68-124934dbe603\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.621001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85d4c\" (UniqueName: \"kubernetes.io/projected/025be631-d9c0-4382-a42f-1979af2fddd3-kube-api-access-85d4c\") pod \"025be631-d9c0-4382-a42f-1979af2fddd3\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.621069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-combined-ca-bundle\") pod \"a57de07a-627b-4881-bf68-124934dbe603\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.621104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-combined-ca-bundle\") pod \"025be631-d9c0-4382-a42f-1979af2fddd3\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.621136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-config-data\") pod \"025be631-d9c0-4382-a42f-1979af2fddd3\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.621205 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-scripts\") pod \"025be631-d9c0-4382-a42f-1979af2fddd3\" (UID: \"025be631-d9c0-4382-a42f-1979af2fddd3\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.621226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-db-sync-config-data\") pod \"a57de07a-627b-4881-bf68-124934dbe603\" (UID: \"a57de07a-627b-4881-bf68-124934dbe603\") " Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.622524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025be631-d9c0-4382-a42f-1979af2fddd3-logs" (OuterVolumeSpecName: "logs") pod "025be631-d9c0-4382-a42f-1979af2fddd3" (UID: "025be631-d9c0-4382-a42f-1979af2fddd3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.636320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025be631-d9c0-4382-a42f-1979af2fddd3-kube-api-access-85d4c" (OuterVolumeSpecName: "kube-api-access-85d4c") pod "025be631-d9c0-4382-a42f-1979af2fddd3" (UID: "025be631-d9c0-4382-a42f-1979af2fddd3"). InnerVolumeSpecName "kube-api-access-85d4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.638931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a57de07a-627b-4881-bf68-124934dbe603" (UID: "a57de07a-627b-4881-bf68-124934dbe603"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.650960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-scripts" (OuterVolumeSpecName: "scripts") pod "025be631-d9c0-4382-a42f-1979af2fddd3" (UID: "025be631-d9c0-4382-a42f-1979af2fddd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.650973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57de07a-627b-4881-bf68-124934dbe603-kube-api-access-pbd47" (OuterVolumeSpecName: "kube-api-access-pbd47") pod "a57de07a-627b-4881-bf68-124934dbe603" (UID: "a57de07a-627b-4881-bf68-124934dbe603"). InnerVolumeSpecName "kube-api-access-pbd47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.656965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "025be631-d9c0-4382-a42f-1979af2fddd3" (UID: "025be631-d9c0-4382-a42f-1979af2fddd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.670974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a57de07a-627b-4881-bf68-124934dbe603" (UID: "a57de07a-627b-4881-bf68-124934dbe603"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.681458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-config-data" (OuterVolumeSpecName: "config-data") pod "025be631-d9c0-4382-a42f-1979af2fddd3" (UID: "025be631-d9c0-4382-a42f-1979af2fddd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723650 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723679 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723689 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025be631-d9c0-4382-a42f-1979af2fddd3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723700 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbd47\" (UniqueName: \"kubernetes.io/projected/a57de07a-627b-4881-bf68-124934dbe603-kube-api-access-pbd47\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723709 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85d4c\" (UniqueName: \"kubernetes.io/projected/025be631-d9c0-4382-a42f-1979af2fddd3-kube-api-access-85d4c\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723717 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57de07a-627b-4881-bf68-124934dbe603-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723725 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:38 crc kubenswrapper[4707]: I0121 15:48:38.723733 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025be631-d9c0-4382-a42f-1979af2fddd3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.129407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-45blv" event={"ID":"a57de07a-627b-4881-bf68-124934dbe603","Type":"ContainerDied","Data":"bcb18c1bb6722be674fe474b5d1015d238f642b678756b08acc09dd74b6f854a"} Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.129654 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-45blv" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.129668 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb18c1bb6722be674fe474b5d1015d238f642b678756b08acc09dd74b6f854a" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.130955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-tt9db" event={"ID":"025be631-d9c0-4382-a42f-1979af2fddd3","Type":"ContainerDied","Data":"c8f59fba561cc8157a26c42a58d8516146a2bfd10e7ec0caab9ac50aafd37d55"} Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.130980 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-tt9db" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.130990 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f59fba561cc8157a26c42a58d8516146a2bfd10e7ec0caab9ac50aafd37d55" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.134489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerStarted","Data":"03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f"} Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.168384 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.407536803 podStartE2EDuration="6.168366244s" podCreationTimestamp="2026-01-21 15:48:33 +0000 UTC" firstStartedPulling="2026-01-21 15:48:34.253209478 +0000 UTC m=+2811.434725700" lastFinishedPulling="2026-01-21 15:48:38.014038919 +0000 UTC m=+2815.195555141" observedRunningTime="2026-01-21 15:48:39.164968496 +0000 UTC m=+2816.346484718" watchObservedRunningTime="2026-01-21 15:48:39.168366244 +0000 UTC m=+2816.349882456" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.320697 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-58456fd9fd-9nb94"] Jan 21 15:48:39 crc kubenswrapper[4707]: E0121 15:48:39.321507 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57de07a-627b-4881-bf68-124934dbe603" containerName="barbican-db-sync" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.321621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57de07a-627b-4881-bf68-124934dbe603" containerName="barbican-db-sync" Jan 21 15:48:39 crc kubenswrapper[4707]: E0121 15:48:39.321701 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025be631-d9c0-4382-a42f-1979af2fddd3" containerName="placement-db-sync" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.321752 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="025be631-d9c0-4382-a42f-1979af2fddd3" containerName="placement-db-sync" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.322022 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="025be631-d9c0-4382-a42f-1979af2fddd3" containerName="placement-db-sync" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.322100 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57de07a-627b-4881-bf68-124934dbe603" containerName="barbican-db-sync" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.323539 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.330208 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-4l28d" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.331421 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.331563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.337045 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-58456fd9fd-9nb94"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.393724 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.395143 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.399749 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.399828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.399977 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-66qv6" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.409794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.422037 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.423467 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.425875 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.439862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.453094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qzw\" (UniqueName: \"kubernetes.io/projected/c29af5f9-41ed-4f5b-a689-f02942a2366c-kube-api-access-w8qzw\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.453172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-config-data\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.453230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-combined-ca-bundle\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.453282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-scripts\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.453303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29af5f9-41ed-4f5b-a689-f02942a2366c-logs\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.500700 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.502365 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.506063 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.510886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p"] Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.554895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qzw\" (UniqueName: \"kubernetes.io/projected/c29af5f9-41ed-4f5b-a689-f02942a2366c-kube-api-access-w8qzw\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.554958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s8jh\" (UniqueName: \"kubernetes.io/projected/62974f15-cc2c-4796-9e97-30b70cfb78b0-kube-api-access-8s8jh\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-config-data\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data-custom\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-combined-ca-bundle\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-combined-ca-bundle\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qv7c\" (UniqueName: \"kubernetes.io/projected/11d963f6-7258-432b-8206-5a1893ed2ff1-kube-api-access-9qv7c\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555180 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-scripts\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29af5f9-41ed-4f5b-a689-f02942a2366c-logs\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-combined-ca-bundle\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62974f15-cc2c-4796-9e97-30b70cfb78b0-logs\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d963f6-7258-432b-8206-5a1893ed2ff1-logs\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data-custom\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.555432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.556181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29af5f9-41ed-4f5b-a689-f02942a2366c-logs\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " 
pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.561445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-combined-ca-bundle\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.561961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-scripts\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.562302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-config-data\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.569830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qzw\" (UniqueName: \"kubernetes.io/projected/c29af5f9-41ed-4f5b-a689-f02942a2366c-kube-api-access-w8qzw\") pod \"placement-58456fd9fd-9nb94\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.603662 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.607932 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.648515 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.656910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-combined-ca-bundle\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.656965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62974f15-cc2c-4796-9e97-30b70cfb78b0-logs\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d963f6-7258-432b-8206-5a1893ed2ff1-logs\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data-custom\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-combined-ca-bundle\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s8jh\" (UniqueName: \"kubernetes.io/projected/62974f15-cc2c-4796-9e97-30b70cfb78b0-kube-api-access-8s8jh\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657311 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-logs\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data-custom\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-combined-ca-bundle\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz596\" (UniqueName: \"kubernetes.io/projected/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-kube-api-access-vz596\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qv7c\" (UniqueName: \"kubernetes.io/projected/11d963f6-7258-432b-8206-5a1893ed2ff1-kube-api-access-9qv7c\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.657492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data-custom\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.660877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-combined-ca-bundle\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.661439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62974f15-cc2c-4796-9e97-30b70cfb78b0-logs\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" 
(UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.661737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d963f6-7258-432b-8206-5a1893ed2ff1-logs\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.667768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-combined-ca-bundle\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.669337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data-custom\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.670800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.673034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data-custom\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.674141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s8jh\" (UniqueName: \"kubernetes.io/projected/62974f15-cc2c-4796-9e97-30b70cfb78b0-kube-api-access-8s8jh\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.675219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qv7c\" (UniqueName: \"kubernetes.io/projected/11d963f6-7258-432b-8206-5a1893ed2ff1-kube-api-access-9qv7c\") pod \"barbican-worker-679bdf99cc-8dwtr\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.678498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data\") pod \"barbican-keystone-listener-5499c5c6dd-s7krf\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.725668 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.754870 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-fernet-keys\") pod \"e1917a27-98a1-43cc-8833-839bc0cf67dc\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-config-data\") pod \"e1917a27-98a1-43cc-8833-839bc0cf67dc\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-config-data\") pod \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-db-sync-config-data\") pod \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-credential-keys\") pod \"e1917a27-98a1-43cc-8833-839bc0cf67dc\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-combined-ca-bundle\") pod \"e1917a27-98a1-43cc-8833-839bc0cf67dc\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-combined-ca-bundle\") pod \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6l92\" (UniqueName: \"kubernetes.io/projected/6338d565-911b-47d4-ba71-a9cdd9d6bf76-kube-api-access-n6l92\") pod \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758494 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bng27\" (UniqueName: \"kubernetes.io/projected/e1917a27-98a1-43cc-8833-839bc0cf67dc-kube-api-access-bng27\") pod \"e1917a27-98a1-43cc-8833-839bc0cf67dc\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 
15:48:39.758542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6338d565-911b-47d4-ba71-a9cdd9d6bf76-etc-machine-id\") pod \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-scripts\") pod \"e1917a27-98a1-43cc-8833-839bc0cf67dc\" (UID: \"e1917a27-98a1-43cc-8833-839bc0cf67dc\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-scripts\") pod \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\" (UID: \"6338d565-911b-47d4-ba71-a9cdd9d6bf76\") " Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz596\" (UniqueName: \"kubernetes.io/projected/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-kube-api-access-vz596\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data-custom\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.758983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-combined-ca-bundle\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.759014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.759049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-logs\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.759350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-logs\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.759909 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/6338d565-911b-47d4-ba71-a9cdd9d6bf76-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6338d565-911b-47d4-ba71-a9cdd9d6bf76" (UID: "6338d565-911b-47d4-ba71-a9cdd9d6bf76"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.763834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-combined-ca-bundle\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.765306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6338d565-911b-47d4-ba71-a9cdd9d6bf76-kube-api-access-n6l92" (OuterVolumeSpecName: "kube-api-access-n6l92") pod "6338d565-911b-47d4-ba71-a9cdd9d6bf76" (UID: "6338d565-911b-47d4-ba71-a9cdd9d6bf76"). InnerVolumeSpecName "kube-api-access-n6l92". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.765536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1917a27-98a1-43cc-8833-839bc0cf67dc-kube-api-access-bng27" (OuterVolumeSpecName: "kube-api-access-bng27") pod "e1917a27-98a1-43cc-8833-839bc0cf67dc" (UID: "e1917a27-98a1-43cc-8833-839bc0cf67dc"). InnerVolumeSpecName "kube-api-access-bng27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.768341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e1917a27-98a1-43cc-8833-839bc0cf67dc" (UID: "e1917a27-98a1-43cc-8833-839bc0cf67dc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.768417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-scripts" (OuterVolumeSpecName: "scripts") pod "6338d565-911b-47d4-ba71-a9cdd9d6bf76" (UID: "6338d565-911b-47d4-ba71-a9cdd9d6bf76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.770017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-scripts" (OuterVolumeSpecName: "scripts") pod "e1917a27-98a1-43cc-8833-839bc0cf67dc" (UID: "e1917a27-98a1-43cc-8833-839bc0cf67dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.771119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.771528 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6338d565-911b-47d4-ba71-a9cdd9d6bf76" (UID: "6338d565-911b-47d4-ba71-a9cdd9d6bf76"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.778236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e1917a27-98a1-43cc-8833-839bc0cf67dc" (UID: "e1917a27-98a1-43cc-8833-839bc0cf67dc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.779370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data-custom\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.781203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz596\" (UniqueName: \"kubernetes.io/projected/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-kube-api-access-vz596\") pod \"barbican-api-7b784745f6-b8h2p\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.814906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6338d565-911b-47d4-ba71-a9cdd9d6bf76" (UID: "6338d565-911b-47d4-ba71-a9cdd9d6bf76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.820785 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-config-data" (OuterVolumeSpecName: "config-data") pod "e1917a27-98a1-43cc-8833-839bc0cf67dc" (UID: "e1917a27-98a1-43cc-8833-839bc0cf67dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.820833 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861147 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6338d565-911b-47d4-ba71-a9cdd9d6bf76-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861187 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861196 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861204 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861213 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861221 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861229 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861238 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861246 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6l92\" (UniqueName: \"kubernetes.io/projected/6338d565-911b-47d4-ba71-a9cdd9d6bf76-kube-api-access-n6l92\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.861254 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bng27\" (UniqueName: \"kubernetes.io/projected/e1917a27-98a1-43cc-8833-839bc0cf67dc-kube-api-access-bng27\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.864611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1917a27-98a1-43cc-8833-839bc0cf67dc" (UID: "e1917a27-98a1-43cc-8833-839bc0cf67dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.889556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-config-data" (OuterVolumeSpecName: "config-data") pod "6338d565-911b-47d4-ba71-a9cdd9d6bf76" (UID: "6338d565-911b-47d4-ba71-a9cdd9d6bf76"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.963883 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6338d565-911b-47d4-ba71-a9cdd9d6bf76-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:39 crc kubenswrapper[4707]: I0121 15:48:39.963914 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1917a27-98a1-43cc-8833-839bc0cf67dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.075973 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-58456fd9fd-9nb94"] Jan 21 15:48:40 crc kubenswrapper[4707]: W0121 15:48:40.076545 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc29af5f9_41ed_4f5b_a689_f02942a2366c.slice/crio-1a4be0d60b41d1b7aaec854fbeed19a16ca7e4e6d40f68e9ec041867320b9543 WatchSource:0}: Error finding container 1a4be0d60b41d1b7aaec854fbeed19a16ca7e4e6d40f68e9ec041867320b9543: Status 404 returned error can't find the container with id 1a4be0d60b41d1b7aaec854fbeed19a16ca7e4e6d40f68e9ec041867320b9543 Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.174169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" event={"ID":"e1917a27-98a1-43cc-8833-839bc0cf67dc","Type":"ContainerDied","Data":"8f67368ed5e7353ab1977101d18e09df606680b8b79b78d956c7963361729c68"} Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.174204 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f67368ed5e7353ab1977101d18e09df606680b8b79b78d956c7963361729c68" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.174253 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-mxx7g" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.176575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" event={"ID":"c29af5f9-41ed-4f5b-a689-f02942a2366c","Type":"ContainerStarted","Data":"1a4be0d60b41d1b7aaec854fbeed19a16ca7e4e6d40f68e9ec041867320b9543"} Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.178878 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.179250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mczjr" event={"ID":"6338d565-911b-47d4-ba71-a9cdd9d6bf76","Type":"ContainerDied","Data":"96d1971249939bfbcda476fbb940e43641c410409c244111758ceab6d28b4502"} Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.179297 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d1971249939bfbcda476fbb940e43641c410409c244111758ceab6d28b4502" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.179314 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.202121 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.308692 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.341921 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mxx7g"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.362572 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-mxx7g"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.374969 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:48:40 crc kubenswrapper[4707]: E0121 15:48:40.375545 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1917a27-98a1-43cc-8833-839bc0cf67dc" containerName="keystone-bootstrap" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.375656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1917a27-98a1-43cc-8833-839bc0cf67dc" containerName="keystone-bootstrap" Jan 21 15:48:40 crc kubenswrapper[4707]: E0121 15:48:40.375722 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6338d565-911b-47d4-ba71-a9cdd9d6bf76" containerName="cinder-db-sync" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.375785 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6338d565-911b-47d4-ba71-a9cdd9d6bf76" containerName="cinder-db-sync" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.376041 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1917a27-98a1-43cc-8833-839bc0cf67dc" containerName="keystone-bootstrap" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.376111 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6338d565-911b-47d4-ba71-a9cdd9d6bf76" containerName="cinder-db-sync" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.377349 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.380340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-kd86g" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.380846 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.384100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.384548 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.384629 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.397438 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kxkv"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.398648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.401627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.402016 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.402132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.409010 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x8w4j" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.409291 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.425619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kxkv"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.440954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz27c\" (UniqueName: \"kubernetes.io/projected/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-kube-api-access-kz27c\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-scripts\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmn6\" (UniqueName: \"kubernetes.io/projected/2552e9a1-fb35-4150-b499-a7a1d314fd0d-kube-api-access-zcmn6\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-config-data\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-credential-keys\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-combined-ca-bundle\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488801 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-scripts\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2552e9a1-fb35-4150-b499-a7a1d314fd0d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.488984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.489022 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-fernet-keys\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.528659 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.531129 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.533220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.554574 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-fernet-keys\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz27c\" (UniqueName: \"kubernetes.io/projected/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-kube-api-access-kz27c\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5be95f6e-99a6-4607-a29e-87a1c87afd68-logs\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-scripts\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmn6\" (UniqueName: \"kubernetes.io/projected/2552e9a1-fb35-4150-b499-a7a1d314fd0d-kube-api-access-zcmn6\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-config-data\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc79x\" (UniqueName: \"kubernetes.io/projected/5be95f6e-99a6-4607-a29e-87a1c87afd68-kube-api-access-tc79x\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-credential-keys\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-combined-ca-bundle\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5be95f6e-99a6-4607-a29e-87a1c87afd68-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-scripts\") pod \"keystone-bootstrap-6kxkv\" 
(UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2552e9a1-fb35-4150-b499-a7a1d314fd0d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-scripts\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.590570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data-custom\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.592340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2552e9a1-fb35-4150-b499-a7a1d314fd0d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.594858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-fernet-keys\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.595682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-combined-ca-bundle\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.596556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.597964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-scripts\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.601511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.602421 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-config-data\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.602514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-credential-keys\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.603788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-scripts\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.604344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.608896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz27c\" (UniqueName: \"kubernetes.io/projected/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-kube-api-access-kz27c\") pod \"keystone-bootstrap-6kxkv\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.615256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmn6\" (UniqueName: \"kubernetes.io/projected/2552e9a1-fb35-4150-b499-a7a1d314fd0d-kube-api-access-zcmn6\") pod \"cinder-scheduler-0\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.692443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-scripts\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.692493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data-custom\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.692532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.692572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.692627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5be95f6e-99a6-4607-a29e-87a1c87afd68-logs\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.693082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc79x\" (UniqueName: \"kubernetes.io/projected/5be95f6e-99a6-4607-a29e-87a1c87afd68-kube-api-access-tc79x\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.693151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5be95f6e-99a6-4607-a29e-87a1c87afd68-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.693224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5be95f6e-99a6-4607-a29e-87a1c87afd68-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.693508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5be95f6e-99a6-4607-a29e-87a1c87afd68-logs\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.696356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data-custom\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.697333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-scripts\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.697489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.705142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.713311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tc79x\" (UniqueName: \"kubernetes.io/projected/5be95f6e-99a6-4607-a29e-87a1c87afd68-kube-api-access-tc79x\") pod \"cinder-api-0\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.730892 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.752657 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:40 crc kubenswrapper[4707]: I0121 15:48:40.875887 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.195856 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1917a27-98a1-43cc-8833-839bc0cf67dc" path="/var/lib/kubelet/pods/e1917a27-98a1-43cc-8833-839bc0cf67dc/volumes" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.211084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" event={"ID":"c29af5f9-41ed-4f5b-a689-f02942a2366c","Type":"ContainerStarted","Data":"3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.211132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" event={"ID":"c29af5f9-41ed-4f5b-a689-f02942a2366c","Type":"ContainerStarted","Data":"19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.212638 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.212666 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.213009 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.214697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" event={"ID":"11d963f6-7258-432b-8206-5a1893ed2ff1","Type":"ContainerStarted","Data":"436bcbab1f25e15f46246ecc8ede16a55da48c1cd2e3c46c116436928b337639"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.214717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" event={"ID":"11d963f6-7258-432b-8206-5a1893ed2ff1","Type":"ContainerStarted","Data":"d60b3a2a18c48b42bdf714d802ca2dd8c97fac6dfc1e9c9a4de48c3542d38b4c"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.214727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" event={"ID":"11d963f6-7258-432b-8206-5a1893ed2ff1","Type":"ContainerStarted","Data":"783cfe365678214c684fda16135fc7d2fe4163f670ac5f21235b6d3a7e62f885"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.220219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" 
event={"ID":"62974f15-cc2c-4796-9e97-30b70cfb78b0","Type":"ContainerStarted","Data":"1e13dc0dd446435fd65bf4d3e47bff260f5012e7a18801585e384b29e9417f69"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.220252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" event={"ID":"62974f15-cc2c-4796-9e97-30b70cfb78b0","Type":"ContainerStarted","Data":"478c24f9df975e5f07c1890409bc439d4190136c2d1f1024e43e81cf80e40146"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.220273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" event={"ID":"62974f15-cc2c-4796-9e97-30b70cfb78b0","Type":"ContainerStarted","Data":"a40e3bbf81f3d41577b644074e5b563f0a0264f8aeda7b2bc6491bf7a56f0708"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.224242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" event={"ID":"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d","Type":"ContainerStarted","Data":"8423dad4b94be8eef5c393ef27d1d8f14ad3815ffd34f41da3bfa63e5acc69b9"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.224281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" event={"ID":"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d","Type":"ContainerStarted","Data":"c6104ba8a299d951e7e68b00af9056b68f0e5d2bab75be0997ecd34395ba528d"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.224293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" event={"ID":"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d","Type":"ContainerStarted","Data":"a53608825e98751b6326bba558c63c6ffb45cba19a8fec397c0091d090929dab"} Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.224305 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.224326 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:41 crc kubenswrapper[4707]: W0121 15:48:41.224474 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2552e9a1_fb35_4150_b499_a7a1d314fd0d.slice/crio-9d28ec71e7dc46d3cd74cafcf734ff9c49a94939630d50bb896c4f68dd1fe149 WatchSource:0}: Error finding container 9d28ec71e7dc46d3cd74cafcf734ff9c49a94939630d50bb896c4f68dd1fe149: Status 404 returned error can't find the container with id 9d28ec71e7dc46d3cd74cafcf734ff9c49a94939630d50bb896c4f68dd1fe149 Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.236888 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" podStartSLOduration=2.236867878 podStartE2EDuration="2.236867878s" podCreationTimestamp="2026-01-21 15:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:41.229241258 +0000 UTC m=+2818.410757481" watchObservedRunningTime="2026-01-21 15:48:41.236867878 +0000 UTC m=+2818.418384090" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.252738 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" 
podStartSLOduration=2.252720926 podStartE2EDuration="2.252720926s" podCreationTimestamp="2026-01-21 15:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:41.24233655 +0000 UTC m=+2818.423852773" watchObservedRunningTime="2026-01-21 15:48:41.252720926 +0000 UTC m=+2818.434237148" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.273925 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" podStartSLOduration=2.273908294 podStartE2EDuration="2.273908294s" podCreationTimestamp="2026-01-21 15:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:41.271897613 +0000 UTC m=+2818.453413835" watchObservedRunningTime="2026-01-21 15:48:41.273908294 +0000 UTC m=+2818.455424515" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.322803 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" podStartSLOduration=2.322785985 podStartE2EDuration="2.322785985s" podCreationTimestamp="2026-01-21 15:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:41.302318632 +0000 UTC m=+2818.483834853" watchObservedRunningTime="2026-01-21 15:48:41.322785985 +0000 UTC m=+2818.504302208" Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.330608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kxkv"] Jan 21 15:48:41 crc kubenswrapper[4707]: I0121 15:48:41.413328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:48:41 crc kubenswrapper[4707]: W0121 15:48:41.439464 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5be95f6e_99a6_4607_a29e_87a1c87afd68.slice/crio-65844f215e5db59219a2657c62b7637e5952d18b8f0740c2e4c4da00f1c99502 WatchSource:0}: Error finding container 65844f215e5db59219a2657c62b7637e5952d18b8f0740c2e4c4da00f1c99502: Status 404 returned error can't find the container with id 65844f215e5db59219a2657c62b7637e5952d18b8f0740c2e4c4da00f1c99502 Jan 21 15:48:42 crc kubenswrapper[4707]: I0121 15:48:42.247687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2552e9a1-fb35-4150-b499-a7a1d314fd0d","Type":"ContainerStarted","Data":"fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c"} Jan 21 15:48:42 crc kubenswrapper[4707]: I0121 15:48:42.248318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2552e9a1-fb35-4150-b499-a7a1d314fd0d","Type":"ContainerStarted","Data":"9d28ec71e7dc46d3cd74cafcf734ff9c49a94939630d50bb896c4f68dd1fe149"} Jan 21 15:48:42 crc kubenswrapper[4707]: I0121 15:48:42.249154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5be95f6e-99a6-4607-a29e-87a1c87afd68","Type":"ContainerStarted","Data":"65844f215e5db59219a2657c62b7637e5952d18b8f0740c2e4c4da00f1c99502"} Jan 21 15:48:42 crc kubenswrapper[4707]: I0121 15:48:42.255613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" 
event={"ID":"47b8c9a7-67a3-471d-a356-3e016bf3a3e4","Type":"ContainerStarted","Data":"bd309fbc3fb22feff248a50774d5c25fb157e048f9cc47f6f6dd0f30e2a730b7"} Jan 21 15:48:42 crc kubenswrapper[4707]: I0121 15:48:42.255660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" event={"ID":"47b8c9a7-67a3-471d-a356-3e016bf3a3e4","Type":"ContainerStarted","Data":"fc330fe796ffac0e6a20b4b0a35908a9f62854a414d35484ae484ea839f89eef"} Jan 21 15:48:42 crc kubenswrapper[4707]: I0121 15:48:42.275105 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" podStartSLOduration=2.275092833 podStartE2EDuration="2.275092833s" podCreationTimestamp="2026-01-21 15:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:42.274916932 +0000 UTC m=+2819.456433154" watchObservedRunningTime="2026-01-21 15:48:42.275092833 +0000 UTC m=+2819.456609055" Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.269828 4707 generic.go:334] "Generic (PLEG): container finished" podID="1cc8577c-8dd6-4ae9-9057-7091c1a10853" containerID="02e8a2611ee7d85c7fd5a2ff060701f99988408f3267b95159cbe36a24e03ce3" exitCode=0 Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.269858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" event={"ID":"1cc8577c-8dd6-4ae9-9057-7091c1a10853","Type":"ContainerDied","Data":"02e8a2611ee7d85c7fd5a2ff060701f99988408f3267b95159cbe36a24e03ce3"} Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.273409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2552e9a1-fb35-4150-b499-a7a1d314fd0d","Type":"ContainerStarted","Data":"dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1"} Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.275071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5be95f6e-99a6-4607-a29e-87a1c87afd68","Type":"ContainerStarted","Data":"ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e"} Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.275115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5be95f6e-99a6-4607-a29e-87a1c87afd68","Type":"ContainerStarted","Data":"7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676"} Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.311628 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.311607161 podStartE2EDuration="3.311607161s" podCreationTimestamp="2026-01-21 15:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:43.296130101 +0000 UTC m=+2820.477646323" watchObservedRunningTime="2026-01-21 15:48:43.311607161 +0000 UTC m=+2820.493123383" Jan 21 15:48:43 crc kubenswrapper[4707]: I0121 15:48:43.318629 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.318618846 podStartE2EDuration="3.318618846s" podCreationTimestamp="2026-01-21 15:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 15:48:43.309354736 +0000 UTC m=+2820.490870958" watchObservedRunningTime="2026-01-21 15:48:43.318618846 +0000 UTC m=+2820.500135067" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.283390 4707 generic.go:334] "Generic (PLEG): container finished" podID="47b8c9a7-67a3-471d-a356-3e016bf3a3e4" containerID="bd309fbc3fb22feff248a50774d5c25fb157e048f9cc47f6f6dd0f30e2a730b7" exitCode=0 Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.283425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" event={"ID":"47b8c9a7-67a3-471d-a356-3e016bf3a3e4","Type":"ContainerDied","Data":"bd309fbc3fb22feff248a50774d5c25fb157e048f9cc47f6f6dd0f30e2a730b7"} Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.284294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.541862 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.666188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-config\") pod \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.666384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-combined-ca-bundle\") pod \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.666477 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b7ph\" (UniqueName: \"kubernetes.io/projected/1cc8577c-8dd6-4ae9-9057-7091c1a10853-kube-api-access-4b7ph\") pod \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\" (UID: \"1cc8577c-8dd6-4ae9-9057-7091c1a10853\") " Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.671857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc8577c-8dd6-4ae9-9057-7091c1a10853-kube-api-access-4b7ph" (OuterVolumeSpecName: "kube-api-access-4b7ph") pod "1cc8577c-8dd6-4ae9-9057-7091c1a10853" (UID: "1cc8577c-8dd6-4ae9-9057-7091c1a10853"). InnerVolumeSpecName "kube-api-access-4b7ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.688507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cc8577c-8dd6-4ae9-9057-7091c1a10853" (UID: "1cc8577c-8dd6-4ae9-9057-7091c1a10853"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.689932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-config" (OuterVolumeSpecName: "config") pod "1cc8577c-8dd6-4ae9-9057-7091c1a10853" (UID: "1cc8577c-8dd6-4ae9-9057-7091c1a10853"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.769363 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.769394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b7ph\" (UniqueName: \"kubernetes.io/projected/1cc8577c-8dd6-4ae9-9057-7091c1a10853-kube-api-access-4b7ph\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:44 crc kubenswrapper[4707]: I0121 15:48:44.769407 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cc8577c-8dd6-4ae9-9057-7091c1a10853-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.292677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" event={"ID":"1cc8577c-8dd6-4ae9-9057-7091c1a10853","Type":"ContainerDied","Data":"cd4cf8f36c2c15c58b4a086575baeeec4c73c0765efe8fc145cce71006583dbd"} Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.292728 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd4cf8f36c2c15c58b4a086575baeeec4c73c0765efe8fc145cce71006583dbd" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.292690 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tg5cr" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.447568 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7c97698578-4r4js"] Jan 21 15:48:45 crc kubenswrapper[4707]: E0121 15:48:45.448138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8577c-8dd6-4ae9-9057-7091c1a10853" containerName="neutron-db-sync" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.448155 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8577c-8dd6-4ae9-9057-7091c1a10853" containerName="neutron-db-sync" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.448318 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc8577c-8dd6-4ae9-9057-7091c1a10853" containerName="neutron-db-sync" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.449104 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.452436 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.452591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.452702 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-j7drz" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.456862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7c97698578-4r4js"] Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.482360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7rc\" (UniqueName: \"kubernetes.io/projected/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-kube-api-access-rp7rc\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.482455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-combined-ca-bundle\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.482485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-httpd-config\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.482751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-config\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.585241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-config\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.585397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7rc\" (UniqueName: \"kubernetes.io/projected/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-kube-api-access-rp7rc\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.585509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-httpd-config\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " 
pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.585584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-combined-ca-bundle\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.588612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-httpd-config\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.588657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-config\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.589207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-combined-ca-bundle\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.598508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7rc\" (UniqueName: \"kubernetes.io/projected/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-kube-api-access-rp7rc\") pod \"neutron-7c97698578-4r4js\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.653885 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.687064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz27c\" (UniqueName: \"kubernetes.io/projected/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-kube-api-access-kz27c\") pod \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.687119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-fernet-keys\") pod \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.687157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-scripts\") pod \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.687184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-combined-ca-bundle\") pod \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.687319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-config-data\") pod \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.687414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-credential-keys\") pod \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\" (UID: \"47b8c9a7-67a3-471d-a356-3e016bf3a3e4\") " Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.700497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-scripts" (OuterVolumeSpecName: "scripts") pod "47b8c9a7-67a3-471d-a356-3e016bf3a3e4" (UID: "47b8c9a7-67a3-471d-a356-3e016bf3a3e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.700533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "47b8c9a7-67a3-471d-a356-3e016bf3a3e4" (UID: "47b8c9a7-67a3-471d-a356-3e016bf3a3e4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.706971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-kube-api-access-kz27c" (OuterVolumeSpecName: "kube-api-access-kz27c") pod "47b8c9a7-67a3-471d-a356-3e016bf3a3e4" (UID: "47b8c9a7-67a3-471d-a356-3e016bf3a3e4"). InnerVolumeSpecName "kube-api-access-kz27c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.713943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "47b8c9a7-67a3-471d-a356-3e016bf3a3e4" (UID: "47b8c9a7-67a3-471d-a356-3e016bf3a3e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.736911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.776006 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.785938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47b8c9a7-67a3-471d-a356-3e016bf3a3e4" (UID: "47b8c9a7-67a3-471d-a356-3e016bf3a3e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.788704 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.788734 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz27c\" (UniqueName: \"kubernetes.io/projected/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-kube-api-access-kz27c\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.788746 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.788754 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.788763 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.811502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-config-data" (OuterVolumeSpecName: "config-data") pod "47b8c9a7-67a3-471d-a356-3e016bf3a3e4" (UID: "47b8c9a7-67a3-471d-a356-3e016bf3a3e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4707]: I0121 15:48:45.890853 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b8c9a7-67a3-471d-a356-3e016bf3a3e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.228046 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7c97698578-4r4js"] Jan 21 15:48:46 crc kubenswrapper[4707]: W0121 15:48:46.233225 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd10d56d4_0d08_4ea0_acd7_8e3b7bf6fb4d.slice/crio-d04db9fa6e111d01fa474b6b342d32b276d64b909d2a34c4f2ab352ec7008083 WatchSource:0}: Error finding container d04db9fa6e111d01fa474b6b342d32b276d64b909d2a34c4f2ab352ec7008083: Status 404 returned error can't find the container with id d04db9fa6e111d01fa474b6b342d32b276d64b909d2a34c4f2ab352ec7008083 Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.312419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" event={"ID":"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d","Type":"ContainerStarted","Data":"d04db9fa6e111d01fa474b6b342d32b276d64b909d2a34c4f2ab352ec7008083"} Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.314382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" event={"ID":"47b8c9a7-67a3-471d-a356-3e016bf3a3e4","Type":"ContainerDied","Data":"fc330fe796ffac0e6a20b4b0a35908a9f62854a414d35484ae484ea839f89eef"} Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.314420 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc330fe796ffac0e6a20b4b0a35908a9f62854a414d35484ae484ea839f89eef" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.314562 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kxkv" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.431291 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj"] Jan 21 15:48:46 crc kubenswrapper[4707]: E0121 15:48:46.431882 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b8c9a7-67a3-471d-a356-3e016bf3a3e4" containerName="keystone-bootstrap" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.431901 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b8c9a7-67a3-471d-a356-3e016bf3a3e4" containerName="keystone-bootstrap" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.432080 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b8c9a7-67a3-471d-a356-3e016bf3a3e4" containerName="keystone-bootstrap" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.432635 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.443323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.443494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.443613 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.443781 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x8w4j" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.449712 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj"] Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.486483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.486523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.500206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4qjr\" (UniqueName: \"kubernetes.io/projected/4d0c2183-3194-4f16-819c-2e2473083292-kube-api-access-k4qjr\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.500241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-combined-ca-bundle\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.500274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-credential-keys\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.500296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-config-data\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.500346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-fernet-keys\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.500409 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-scripts\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.515748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.519251 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.527845 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.527872 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.554226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.564753 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.602042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-credential-keys\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.602100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-config-data\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.602188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-fernet-keys\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.602301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-scripts\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.602357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4qjr\" (UniqueName: \"kubernetes.io/projected/4d0c2183-3194-4f16-819c-2e2473083292-kube-api-access-k4qjr\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.602379 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-combined-ca-bundle\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.608842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-combined-ca-bundle\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.613479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-scripts\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.613498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-credential-keys\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.613889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-fernet-keys\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.615470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-config-data\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.617649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4qjr\" (UniqueName: \"kubernetes.io/projected/4d0c2183-3194-4f16-819c-2e2473083292-kube-api-access-k4qjr\") pod \"keystone-5c66cc94c7-xfnmj\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:46 crc kubenswrapper[4707]: I0121 15:48:46.767747 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.193048 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj"] Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.325572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" event={"ID":"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d","Type":"ContainerStarted","Data":"84b7c48321255d05f9f4ae3bebca5603a4d0fb4d5ce26d0a6812c180a204cc28"} Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.325629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" event={"ID":"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d","Type":"ContainerStarted","Data":"971937516b3b19a806e3c97a43385c931b217beac81e6d595bdb7f3762b9b713"} Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.325662 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" event={"ID":"4d0c2183-3194-4f16-819c-2e2473083292","Type":"ContainerStarted","Data":"2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb"} Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" event={"ID":"4d0c2183-3194-4f16-819c-2e2473083292","Type":"ContainerStarted","Data":"217c7055a6fea589075372f01f42fcc0ab20ca221329ff30f0ef3119a2e1c21b"} Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329336 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329474 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.329528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.343681 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" podStartSLOduration=2.34367164 podStartE2EDuration="2.34367164s" podCreationTimestamp="2026-01-21 15:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:47.336661719 +0000 UTC m=+2824.518177941" watchObservedRunningTime="2026-01-21 15:48:47.34367164 +0000 UTC m=+2824.525187861" Jan 21 15:48:47 crc kubenswrapper[4707]: I0121 15:48:47.361141 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" podStartSLOduration=1.36113237 podStartE2EDuration="1.36113237s" podCreationTimestamp="2026-01-21 15:48:46 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:47.353563147 +0000 UTC m=+2824.535079370" watchObservedRunningTime="2026-01-21 15:48:47.36113237 +0000 UTC m=+2824.542648592" Jan 21 15:48:48 crc kubenswrapper[4707]: I0121 15:48:48.895860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:48 crc kubenswrapper[4707]: I0121 15:48:48.899235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:48:48 crc kubenswrapper[4707]: I0121 15:48:48.916071 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:48 crc kubenswrapper[4707]: I0121 15:48:48.917106 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4707]: I0121 15:48:50.894439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:48:51 crc kubenswrapper[4707]: I0121 15:48:51.036038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:51 crc kubenswrapper[4707]: I0121 15:48:51.088487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:48:52 crc kubenswrapper[4707]: I0121 15:48:52.404038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.328368 4707 scope.go:117] "RemoveContainer" containerID="4d471870a8d9c652e026de3e74df2f68fd04789c57f3460a8efb5a5831b71ff5" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.357582 4707 scope.go:117] "RemoveContainer" containerID="03daf8802e87e335374564e2a9872eb53275865b06e92ef26781c039f459beb8" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.376864 4707 scope.go:117] "RemoveContainer" containerID="b6b60f79abe8f46a93b6c2e5dfd92e9104cd8b3b93411bbe1c60ed456a3637bc" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.399532 4707 scope.go:117] "RemoveContainer" containerID="9e4452792bdd8e7c7134ffe07b84894dcaa2be63135d758bc283c8b2eb863402" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.430152 4707 scope.go:117] "RemoveContainer" containerID="0921cbb4487de695795c92b6b4be615fabb9b47e191394d00f6a928e940d4022" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.459920 4707 scope.go:117] "RemoveContainer" containerID="2372349b3eb16ac55b94f5e5b0a8ad44958cecbdb48be7d2072deaff4af5c7d9" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.492424 4707 scope.go:117] "RemoveContainer" containerID="4fef014c4eff487e10d655344bee4cd79f65ebb0fd139656b3c18924175d3fa6" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.508467 4707 scope.go:117] "RemoveContainer" containerID="85f0ac1944fb7f7018245b57f8ae281a1b377d7c02aeb6ea7c4032590cd3ceda" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.522668 4707 scope.go:117] "RemoveContainer" containerID="aaaee1f315be8fc5264779dd0208672135508c8732b50d410fe3dee604e532f5" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.540098 4707 scope.go:117] "RemoveContainer" 
containerID="792ced9ce31de625df8513afd5d8450974231374cbbc8b4faa4e06145c17e003" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.578295 4707 scope.go:117] "RemoveContainer" containerID="d28821461c495d577ce7377449af1a5dcc20ad597a38321ba6e299be5f826bdd" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.605056 4707 scope.go:117] "RemoveContainer" containerID="276b7c05bf5308331c9bd2cda80036ad4969a70291f51c772dd84fff84a8ac15" Jan 21 15:49:02 crc kubenswrapper[4707]: I0121 15:49:02.619780 4707 scope.go:117] "RemoveContainer" containerID="ff9c1f50093af815c6e84a6e86938a5529512e8a26ae5c083aa8c51428af7125" Jan 21 15:49:03 crc kubenswrapper[4707]: I0121 15:49:03.835426 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:09 crc kubenswrapper[4707]: I0121 15:49:09.945477 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:49:09 crc kubenswrapper[4707]: I0121 15:49:09.945900 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:49:10 crc kubenswrapper[4707]: I0121 15:49:10.495452 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:49:10 crc kubenswrapper[4707]: I0121 15:49:10.498721 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:49:15 crc kubenswrapper[4707]: I0121 15:49:15.784647 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:49:18 crc kubenswrapper[4707]: I0121 15:49:18.001235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.293351 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.294599 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.296330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-9ddpw" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.296841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.299205 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.310145 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:19 crc kubenswrapper[4707]: E0121 15:49:19.310399 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"openstack-config\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.319435 4707 status_manager.go:875] "Failed to update status for pod" pod="openstack-kuttl-tests/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:49:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:49:19Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:49:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjtm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:49:19Z\\\"}}\" for pod \"openstack-kuttl-tests\"/\"openstackclient\": pods \"openstackclient\" not found" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.319577 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:19 crc kubenswrapper[4707]: E0121 15:49:19.320067 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-cjtm4 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.323513 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.330024 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.333826 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.335253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.336282 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.409574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.409643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config-secret\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.409665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjtm4\" (UniqueName: \"kubernetes.io/projected/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-kube-api-access-cjtm4\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.409683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.511962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config-secret\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjtm4\" (UniqueName: \"kubernetes.io/projected/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-kube-api-access-cjtm4\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwrf\" (UniqueName: \"kubernetes.io/projected/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-kube-api-access-sbwrf\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config-secret\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.512887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: E0121 15:49:19.515292 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cjtm4 for pod openstack-kuttl-tests/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (37f86feb-8353-4ff5-b78b-b9dd4f315ea7) does not match the UID in record. The object might have been deleted and then recreated Jan 21 15:49:19 crc kubenswrapper[4707]: E0121 15:49:19.515353 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-kube-api-access-cjtm4 podName:37f86feb-8353-4ff5-b78b-b9dd4f315ea7 nodeName:}" failed. No retries permitted until 2026-01-21 15:49:20.015338941 +0000 UTC m=+2857.196855163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cjtm4" (UniqueName: "kubernetes.io/projected/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-kube-api-access-cjtm4") pod "openstackclient" (UID: "37f86feb-8353-4ff5-b78b-b9dd4f315ea7") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (37f86feb-8353-4ff5-b78b-b9dd4f315ea7) does not match the UID in record. 
The object might have been deleted and then recreated Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.517051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config-secret\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.517140 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.560614 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.563433 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.570067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.572005 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config-secret\") pod \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-combined-ca-bundle\") pod \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config\") pod \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\" (UID: \"37f86feb-8353-4ff5-b78b-b9dd4f315ea7\") " Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwrf\" (UniqueName: \"kubernetes.io/projected/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-kube-api-access-sbwrf\") pod \"openstackclient\" (UID: 
\"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config-secret\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.613822 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjtm4\" (UniqueName: \"kubernetes.io/projected/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-kube-api-access-cjtm4\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.614075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "37f86feb-8353-4ff5-b78b-b9dd4f315ea7" (UID: "37f86feb-8353-4ff5-b78b-b9dd4f315ea7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.614443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.616610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "37f86feb-8353-4ff5-b78b-b9dd4f315ea7" (UID: "37f86feb-8353-4ff5-b78b-b9dd4f315ea7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.616728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37f86feb-8353-4ff5-b78b-b9dd4f315ea7" (UID: "37f86feb-8353-4ff5-b78b-b9dd4f315ea7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.617066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config-secret\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.617165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.627025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbwrf\" (UniqueName: \"kubernetes.io/projected/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-kube-api-access-sbwrf\") pod \"openstackclient\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.648064 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.714690 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.714921 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:19 crc kubenswrapper[4707]: I0121 15:49:19.714931 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37f86feb-8353-4ff5-b78b-b9dd4f315ea7-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.051158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.567779 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.567950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"2e404344-5beb-4f6f-ba7b-ccbb59c4e318","Type":"ContainerStarted","Data":"66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401"} Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.567993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"2e404344-5beb-4f6f-ba7b-ccbb59c4e318","Type":"ContainerStarted","Data":"7a3d4910beeb993e9334ba6f6a0df930cc23870ede2e05b03da0e85243624aac"} Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.570486 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.583782 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.583898 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.5838825330000001 podStartE2EDuration="1.583882533s" podCreationTimestamp="2026-01-21 15:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:20.58088765 +0000 UTC m=+2857.762403873" watchObservedRunningTime="2026-01-21 15:49:20.583882533 +0000 UTC m=+2857.765398754" Jan 21 15:49:20 crc kubenswrapper[4707]: I0121 15:49:20.636574 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:49:21 crc kubenswrapper[4707]: I0121 15:49:21.189892 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f86feb-8353-4ff5-b78b-b9dd4f315ea7" path="/var/lib/kubelet/pods/37f86feb-8353-4ff5-b78b-b9dd4f315ea7/volumes" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.435352 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j"] Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.436763 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.440154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.453710 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j"] Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.465591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-combined-ca-bundle\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.465631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-run-httpd\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.465667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-etc-swift\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.465684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-log-httpd\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.465739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-config-data\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.465796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlxn\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-kube-api-access-5rlxn\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-run-httpd\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-etc-swift\") pod 
\"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-log-httpd\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-config-data\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rlxn\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-kube-api-access-5rlxn\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-combined-ca-bundle\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.566765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-run-httpd\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.567325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-log-httpd\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.571646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-config-data\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.578844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-etc-swift\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.580547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-combined-ca-bundle\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: 
\"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.582644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlxn\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-kube-api-access-5rlxn\") pod \"swift-proxy-6674bb6-hwr2j\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:22 crc kubenswrapper[4707]: I0121 15:49:22.752211 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.130446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j"] Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.589769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" event={"ID":"66616542-abed-45a9-b0e2-e440f49f3ab8","Type":"ContainerStarted","Data":"3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43"} Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.589839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" event={"ID":"66616542-abed-45a9-b0e2-e440f49f3ab8","Type":"ContainerStarted","Data":"bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87"} Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.589852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" event={"ID":"66616542-abed-45a9-b0e2-e440f49f3ab8","Type":"ContainerStarted","Data":"1033eb311ace7f2a51b10b0c584c6127c2ad10bb2a73242133fd4ac297020250"} Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.590670 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.609784 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" podStartSLOduration=1.609767298 podStartE2EDuration="1.609767298s" podCreationTimestamp="2026-01-21 15:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:23.608647713 +0000 UTC m=+2860.790163935" watchObservedRunningTime="2026-01-21 15:49:23.609767298 +0000 UTC m=+2860.791283521" Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.756338 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.756771 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-central-agent" containerID="cri-o://35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093" gracePeriod=30 Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.756802 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="proxy-httpd" containerID="cri-o://03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f" gracePeriod=30 Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.756836 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="sg-core" containerID="cri-o://e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d" gracePeriod=30 Jan 21 15:49:23 crc kubenswrapper[4707]: I0121 15:49:23.756841 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-notification-agent" containerID="cri-o://756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757" gracePeriod=30 Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.597891 4707 generic.go:334] "Generic (PLEG): container finished" podID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerID="03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f" exitCode=0 Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.597920 4707 generic.go:334] "Generic (PLEG): container finished" podID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerID="e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d" exitCode=2 Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.597927 4707 generic.go:334] "Generic (PLEG): container finished" podID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerID="35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093" exitCode=0 Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.597972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerDied","Data":"03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f"} Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.598012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerDied","Data":"e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d"} Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.598023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerDied","Data":"35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093"} Jan 21 15:49:24 crc kubenswrapper[4707]: I0121 15:49:24.598126 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.796291 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tfrs"] Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.797471 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.802665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tfrs"] Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.809695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztjxg\" (UniqueName: \"kubernetes.io/projected/33110471-654e-4379-8027-e105b51e6c35-kube-api-access-ztjxg\") pod \"nova-api-db-create-7tfrs\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.809752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33110471-654e-4379-8027-e105b51e6c35-operator-scripts\") pod \"nova-api-db-create-7tfrs\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.891694 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-w54q5"] Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.892798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.899297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-w54q5"] Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.910466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztjxg\" (UniqueName: \"kubernetes.io/projected/33110471-654e-4379-8027-e105b51e6c35-kube-api-access-ztjxg\") pod \"nova-api-db-create-7tfrs\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.910527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33110471-654e-4379-8027-e105b51e6c35-operator-scripts\") pod \"nova-api-db-create-7tfrs\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.910591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5f0055-734f-4cf4-977a-cf51a551157d-operator-scripts\") pod \"nova-cell0-db-create-w54q5\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.910628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8xm\" (UniqueName: \"kubernetes.io/projected/cf5f0055-734f-4cf4-977a-cf51a551157d-kube-api-access-rr8xm\") pod \"nova-cell0-db-create-w54q5\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.911286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33110471-654e-4379-8027-e105b51e6c35-operator-scripts\") pod 
\"nova-api-db-create-7tfrs\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.926248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztjxg\" (UniqueName: \"kubernetes.io/projected/33110471-654e-4379-8027-e105b51e6c35-kube-api-access-ztjxg\") pod \"nova-api-db-create-7tfrs\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.996434 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99"] Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.997512 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:25 crc kubenswrapper[4707]: I0121 15:49:25.999632 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.004803 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-9qfh6"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.005921 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.011285 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.012207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnc66\" (UniqueName: \"kubernetes.io/projected/fe436520-0000-4a6c-8e4a-8a8b1c394b94-kube-api-access-rnc66\") pod \"nova-api-b3eb-account-create-update-pvj99\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.012431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7l6\" (UniqueName: \"kubernetes.io/projected/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-kube-api-access-ss7l6\") pod \"nova-cell1-db-create-9qfh6\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.012544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5f0055-734f-4cf4-977a-cf51a551157d-operator-scripts\") pod \"nova-cell0-db-create-w54q5\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.012664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr8xm\" (UniqueName: \"kubernetes.io/projected/cf5f0055-734f-4cf4-977a-cf51a551157d-kube-api-access-rr8xm\") pod \"nova-cell0-db-create-w54q5\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.012755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-operator-scripts\") pod \"nova-cell1-db-create-9qfh6\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.012894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe436520-0000-4a6c-8e4a-8a8b1c394b94-operator-scripts\") pod \"nova-api-b3eb-account-create-update-pvj99\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.013336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5f0055-734f-4cf4-977a-cf51a551157d-operator-scripts\") pod \"nova-cell0-db-create-w54q5\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.022949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-9qfh6"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.029533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr8xm\" (UniqueName: \"kubernetes.io/projected/cf5f0055-734f-4cf4-977a-cf51a551157d-kube-api-access-rr8xm\") pod \"nova-cell0-db-create-w54q5\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.111588 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.113580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnc66\" (UniqueName: \"kubernetes.io/projected/fe436520-0000-4a6c-8e4a-8a8b1c394b94-kube-api-access-rnc66\") pod \"nova-api-b3eb-account-create-update-pvj99\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.113645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7l6\" (UniqueName: \"kubernetes.io/projected/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-kube-api-access-ss7l6\") pod \"nova-cell1-db-create-9qfh6\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.113699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-operator-scripts\") pod \"nova-cell1-db-create-9qfh6\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.113732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe436520-0000-4a6c-8e4a-8a8b1c394b94-operator-scripts\") pod \"nova-api-b3eb-account-create-update-pvj99\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc 
kubenswrapper[4707]: I0121 15:49:26.114282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe436520-0000-4a6c-8e4a-8a8b1c394b94-operator-scripts\") pod \"nova-api-b3eb-account-create-update-pvj99\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.115073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-operator-scripts\") pod \"nova-cell1-db-create-9qfh6\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.132281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnc66\" (UniqueName: \"kubernetes.io/projected/fe436520-0000-4a6c-8e4a-8a8b1c394b94-kube-api-access-rnc66\") pod \"nova-api-b3eb-account-create-update-pvj99\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.135609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7l6\" (UniqueName: \"kubernetes.io/projected/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-kube-api-access-ss7l6\") pod \"nova-cell1-db-create-9qfh6\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.210154 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.212760 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.213689 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.215295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.231542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.312621 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.317032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-operator-scripts\") pod \"nova-cell0-a56d-account-create-update-lsx2c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.317096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5xc\" (UniqueName: \"kubernetes.io/projected/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-kube-api-access-xw5xc\") pod \"nova-cell0-a56d-account-create-update-lsx2c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.320458 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.417134 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.418425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5xc\" (UniqueName: \"kubernetes.io/projected/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-kube-api-access-xw5xc\") pod \"nova-cell0-a56d-account-create-update-lsx2c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.418567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-operator-scripts\") pod \"nova-cell0-a56d-account-create-update-lsx2c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.419279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-operator-scripts\") pod \"nova-cell0-a56d-account-create-update-lsx2c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.420375 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.422615 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.426245 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.437955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5xc\" (UniqueName: \"kubernetes.io/projected/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-kube-api-access-xw5xc\") pod \"nova-cell0-a56d-account-create-update-lsx2c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.510403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tfrs"] Jan 21 15:49:26 crc kubenswrapper[4707]: W0121 15:49:26.514318 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33110471_654e_4379_8027_e105b51e6c35.slice/crio-7488720ad92fa731ee0998cb90d637d343357e419e67c364104daab72937758a WatchSource:0}: Error finding container 7488720ad92fa731ee0998cb90d637d343357e419e67c364104daab72937758a: Status 404 returned error can't find the container with id 7488720ad92fa731ee0998cb90d637d343357e419e67c364104daab72937758a Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.519434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwt9x\" (UniqueName: \"kubernetes.io/projected/d6728371-3b90-4e51-bc07-8b3f08dc2420-kube-api-access-wwt9x\") pod \"nova-cell1-fd9c-account-create-update-mjr9v\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.519521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6728371-3b90-4e51-bc07-8b3f08dc2420-operator-scripts\") pod \"nova-cell1-fd9c-account-create-update-mjr9v\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.565479 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.584166 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-w54q5"] Jan 21 15:49:26 crc kubenswrapper[4707]: W0121 15:49:26.594325 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5f0055_734f_4cf4_977a_cf51a551157d.slice/crio-aed02d172e080ad849dba8d7ed9a547425b61693e6ac631001523810a739f8b8 WatchSource:0}: Error finding container aed02d172e080ad849dba8d7ed9a547425b61693e6ac631001523810a739f8b8: Status 404 returned error can't find the container with id aed02d172e080ad849dba8d7ed9a547425b61693e6ac631001523810a739f8b8 Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.614031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" event={"ID":"cf5f0055-734f-4cf4-977a-cf51a551157d","Type":"ContainerStarted","Data":"aed02d172e080ad849dba8d7ed9a547425b61693e6ac631001523810a739f8b8"} Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.615215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" event={"ID":"33110471-654e-4379-8027-e105b51e6c35","Type":"ContainerStarted","Data":"7488720ad92fa731ee0998cb90d637d343357e419e67c364104daab72937758a"} Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.621450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6728371-3b90-4e51-bc07-8b3f08dc2420-operator-scripts\") pod \"nova-cell1-fd9c-account-create-update-mjr9v\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.621623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwt9x\" (UniqueName: \"kubernetes.io/projected/d6728371-3b90-4e51-bc07-8b3f08dc2420-kube-api-access-wwt9x\") pod \"nova-cell1-fd9c-account-create-update-mjr9v\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.622182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6728371-3b90-4e51-bc07-8b3f08dc2420-operator-scripts\") pod \"nova-cell1-fd9c-account-create-update-mjr9v\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.636131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwt9x\" (UniqueName: \"kubernetes.io/projected/d6728371-3b90-4e51-bc07-8b3f08dc2420-kube-api-access-wwt9x\") pod \"nova-cell1-fd9c-account-create-update-mjr9v\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.740424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.741209 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.751444 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-9qfh6"] Jan 21 15:49:26 crc kubenswrapper[4707]: I0121 15:49:26.966487 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c"] Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.110633 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v"] Jan 21 15:49:27 crc kubenswrapper[4707]: W0121 15:49:27.113065 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6728371_3b90_4e51_bc07_8b3f08dc2420.slice/crio-c6eb85a0a493796cacc2bbcb4be32b7ad240eb36bbf165404b62c5048a463c97 WatchSource:0}: Error finding container c6eb85a0a493796cacc2bbcb4be32b7ad240eb36bbf165404b62c5048a463c97: Status 404 returned error can't find the container with id c6eb85a0a493796cacc2bbcb4be32b7ad240eb36bbf165404b62c5048a463c97 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.623937 4707 generic.go:334] "Generic (PLEG): container finished" podID="a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" containerID="227fa53f6dbdedb9034190a4104e155e6b956798c322115d49360fef6dc181f3" exitCode=0 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.624181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" event={"ID":"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3","Type":"ContainerDied","Data":"227fa53f6dbdedb9034190a4104e155e6b956798c322115d49360fef6dc181f3"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.624206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" event={"ID":"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3","Type":"ContainerStarted","Data":"436c10f79b1cfe17cbc618a86b28a162f7202030921e577d23b4a925b92faa27"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.625797 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6728371-3b90-4e51-bc07-8b3f08dc2420" containerID="89f7c57c91a7c289c44fe4c129ace200e5af39011c8ced5441224fe0f0e92e44" exitCode=0 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.625882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" event={"ID":"d6728371-3b90-4e51-bc07-8b3f08dc2420","Type":"ContainerDied","Data":"89f7c57c91a7c289c44fe4c129ace200e5af39011c8ced5441224fe0f0e92e44"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.625911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" event={"ID":"d6728371-3b90-4e51-bc07-8b3f08dc2420","Type":"ContainerStarted","Data":"c6eb85a0a493796cacc2bbcb4be32b7ad240eb36bbf165404b62c5048a463c97"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.627256 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf5f0055-734f-4cf4-977a-cf51a551157d" containerID="d7d890aefc9c07d04c6f11b91dc8272424a8c811d55a0e0d7a7880ee6dcb0bf9" exitCode=0 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.627312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" 
event={"ID":"cf5f0055-734f-4cf4-977a-cf51a551157d","Type":"ContainerDied","Data":"d7d890aefc9c07d04c6f11b91dc8272424a8c811d55a0e0d7a7880ee6dcb0bf9"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.628776 4707 generic.go:334] "Generic (PLEG): container finished" podID="6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" containerID="d31551146b8ebbbc731382dfb6cd2dabb404edd7c9171497db9313117e68d055" exitCode=0 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.628828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" event={"ID":"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c","Type":"ContainerDied","Data":"d31551146b8ebbbc731382dfb6cd2dabb404edd7c9171497db9313117e68d055"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.628843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" event={"ID":"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c","Type":"ContainerStarted","Data":"a09d7a1f4fb15a3ac5adabc4172e9176d7fa75c4bffa139686a7dcbce448b5b3"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.631805 4707 generic.go:334] "Generic (PLEG): container finished" podID="33110471-654e-4379-8027-e105b51e6c35" containerID="c09faffd0bbe36299f2dd172904831c8c7693d38c1181a1172b98f971042e45c" exitCode=0 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.631860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" event={"ID":"33110471-654e-4379-8027-e105b51e6c35","Type":"ContainerDied","Data":"c09faffd0bbe36299f2dd172904831c8c7693d38c1181a1172b98f971042e45c"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.633779 4707 generic.go:334] "Generic (PLEG): container finished" podID="fe436520-0000-4a6c-8e4a-8a8b1c394b94" containerID="2b76cfe96a31c3a5c537442539103885ef5840bd5265479eefa63122173d5b66" exitCode=0 Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.633803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" event={"ID":"fe436520-0000-4a6c-8e4a-8a8b1c394b94","Type":"ContainerDied","Data":"2b76cfe96a31c3a5c537442539103885ef5840bd5265479eefa63122173d5b66"} Jan 21 15:49:27 crc kubenswrapper[4707]: I0121 15:49:27.633845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" event={"ID":"fe436520-0000-4a6c-8e4a-8a8b1c394b94","Type":"ContainerStarted","Data":"51cc425fefa2954bb2562011d5cb146a1584a6bb92bac789c1d31b0e2b8772a5"} Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.332349 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.353775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-scripts\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.353823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-sg-core-conf-yaml\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.353904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-run-httpd\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.353927 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-config-data\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.353948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-log-httpd\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.354116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxbtz\" (UniqueName: \"kubernetes.io/projected/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-kube-api-access-sxbtz\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.354147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-combined-ca-bundle\") pod \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\" (UID: \"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.354671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.355169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.361164 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-kube-api-access-sxbtz" (OuterVolumeSpecName: "kube-api-access-sxbtz") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "kube-api-access-sxbtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.367951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-scripts" (OuterVolumeSpecName: "scripts") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.379689 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.420189 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.433081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-config-data" (OuterVolumeSpecName: "config-data") pod "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" (UID: "798774bb-daf9-4cd7-8a4d-e7d0cc26fd86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456110 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxbtz\" (UniqueName: \"kubernetes.io/projected/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-kube-api-access-sxbtz\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456138 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456147 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456156 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456163 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456171 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.456179 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.643476 4707 generic.go:334] "Generic (PLEG): container finished" podID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerID="756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757" exitCode=0 Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.643679 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.646944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerDied","Data":"756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757"} Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.646977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"798774bb-daf9-4cd7-8a4d-e7d0cc26fd86","Type":"ContainerDied","Data":"70746b54176ba74a095ddbf59e0c541b67e692a1e00d80c9ae722b295c769d7a"} Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.646995 4707 scope.go:117] "RemoveContainer" containerID="03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.684981 4707 scope.go:117] "RemoveContainer" containerID="e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.689755 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.700595 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707110 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.707488 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="sg-core" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="sg-core" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.707521 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-central-agent" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-central-agent" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.707540 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="proxy-httpd" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707546 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="proxy-httpd" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.707559 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-notification-agent" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707565 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-notification-agent" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707739 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="proxy-httpd" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="sg-core" Jan 21 15:49:28 crc 
kubenswrapper[4707]: I0121 15:49:28.707766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-notification-agent" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.707778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" containerName="ceilometer-central-agent" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.709232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.712326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.712516 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.718071 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.733292 4707 scope.go:117] "RemoveContainer" containerID="756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-config-data\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtmzj\" (UniqueName: \"kubernetes.io/projected/639b49a9-a884-4fff-be7d-b423966429ae-kube-api-access-mtmzj\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-scripts\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-run-httpd\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.759802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-log-httpd\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.777461 4707 scope.go:117] "RemoveContainer" containerID="35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.796917 4707 scope.go:117] "RemoveContainer" containerID="03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.797668 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f\": container with ID starting with 03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f not found: ID does not exist" containerID="03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.797705 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f"} err="failed to get container status \"03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f\": rpc error: code = NotFound desc = could not find container \"03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f\": container with ID starting with 03f914e044a20a33e4cbb16eecded5b1a7f145af6642b2684cd6dc82160bcc0f not found: ID does not exist" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.797748 4707 scope.go:117] "RemoveContainer" containerID="e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.798247 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d\": container with ID starting with e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d not found: ID does not exist" containerID="e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.798294 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d"} err="failed to get container status \"e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d\": rpc error: code = NotFound desc = could not find container \"e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d\": container with ID starting with e6f0be50c26b2aaa50490bd21c6d531ed3cb64689f8f1950fd3e0887618ffd4d not found: ID does not exist" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.798319 4707 scope.go:117] "RemoveContainer" containerID="756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.798649 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757\": container with ID starting with 
756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757 not found: ID does not exist" containerID="756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.798677 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757"} err="failed to get container status \"756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757\": rpc error: code = NotFound desc = could not find container \"756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757\": container with ID starting with 756b2055529a592c7898de1768cdb418f7fc894458a27cc7df6900e8ae498757 not found: ID does not exist" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.798697 4707 scope.go:117] "RemoveContainer" containerID="35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093" Jan 21 15:49:28 crc kubenswrapper[4707]: E0121 15:49:28.799004 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093\": container with ID starting with 35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093 not found: ID does not exist" containerID="35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.799035 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093"} err="failed to get container status \"35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093\": rpc error: code = NotFound desc = could not find container \"35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093\": container with ID starting with 35866263b17982c2303364047d828d3971a7f9864451a54bfeb2dc5104654093 not found: ID does not exist" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.865692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtmzj\" (UniqueName: \"kubernetes.io/projected/639b49a9-a884-4fff-be7d-b423966429ae-kube-api-access-mtmzj\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.865757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-scripts\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.866376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-run-httpd\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.866406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-log-httpd\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.866491 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.866539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-config-data\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.866573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.866935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-log-httpd\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.867103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-run-httpd\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.868940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-scripts\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.869800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-config-data\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.869951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.870723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.879895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtmzj\" (UniqueName: \"kubernetes.io/projected/639b49a9-a884-4fff-be7d-b423966429ae-kube-api-access-mtmzj\") pod \"ceilometer-0\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:28 crc 
kubenswrapper[4707]: I0121 15:49:28.928208 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.967584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwt9x\" (UniqueName: \"kubernetes.io/projected/d6728371-3b90-4e51-bc07-8b3f08dc2420-kube-api-access-wwt9x\") pod \"d6728371-3b90-4e51-bc07-8b3f08dc2420\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.967710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6728371-3b90-4e51-bc07-8b3f08dc2420-operator-scripts\") pod \"d6728371-3b90-4e51-bc07-8b3f08dc2420\" (UID: \"d6728371-3b90-4e51-bc07-8b3f08dc2420\") " Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.968246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6728371-3b90-4e51-bc07-8b3f08dc2420-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6728371-3b90-4e51-bc07-8b3f08dc2420" (UID: "d6728371-3b90-4e51-bc07-8b3f08dc2420"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:28 crc kubenswrapper[4707]: I0121 15:49:28.970196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6728371-3b90-4e51-bc07-8b3f08dc2420-kube-api-access-wwt9x" (OuterVolumeSpecName: "kube-api-access-wwt9x") pod "d6728371-3b90-4e51-bc07-8b3f08dc2420" (UID: "d6728371-3b90-4e51-bc07-8b3f08dc2420"). InnerVolumeSpecName "kube-api-access-wwt9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.031728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.073858 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwt9x\" (UniqueName: \"kubernetes.io/projected/d6728371-3b90-4e51-bc07-8b3f08dc2420-kube-api-access-wwt9x\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.073887 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6728371-3b90-4e51-bc07-8b3f08dc2420-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.192146 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.192579 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798774bb-daf9-4cd7-8a4d-e7d0cc26fd86" path="/var/lib/kubelet/pods/798774bb-daf9-4cd7-8a4d-e7d0cc26fd86/volumes" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.206113 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.218711 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.230034 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.241110 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnc66\" (UniqueName: \"kubernetes.io/projected/fe436520-0000-4a6c-8e4a-8a8b1c394b94-kube-api-access-rnc66\") pod \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe436520-0000-4a6c-8e4a-8a8b1c394b94-operator-scripts\") pod \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\" (UID: \"fe436520-0000-4a6c-8e4a-8a8b1c394b94\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztjxg\" (UniqueName: \"kubernetes.io/projected/33110471-654e-4379-8027-e105b51e6c35-kube-api-access-ztjxg\") pod \"33110471-654e-4379-8027-e105b51e6c35\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr8xm\" (UniqueName: \"kubernetes.io/projected/cf5f0055-734f-4cf4-977a-cf51a551157d-kube-api-access-rr8xm\") pod \"cf5f0055-734f-4cf4-977a-cf51a551157d\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7l6\" (UniqueName: \"kubernetes.io/projected/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-kube-api-access-ss7l6\") pod \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw5xc\" (UniqueName: \"kubernetes.io/projected/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-kube-api-access-xw5xc\") pod \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5f0055-734f-4cf4-977a-cf51a551157d-operator-scripts\") pod \"cf5f0055-734f-4cf4-977a-cf51a551157d\" (UID: \"cf5f0055-734f-4cf4-977a-cf51a551157d\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-operator-scripts\") pod \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\" (UID: \"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33110471-654e-4379-8027-e105b51e6c35-operator-scripts\") pod \"33110471-654e-4379-8027-e105b51e6c35\" (UID: \"33110471-654e-4379-8027-e105b51e6c35\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.277776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-operator-scripts\") pod \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\" (UID: \"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c\") " Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.278409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33110471-654e-4379-8027-e105b51e6c35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33110471-654e-4379-8027-e105b51e6c35" (UID: "33110471-654e-4379-8027-e105b51e6c35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.278429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf5f0055-734f-4cf4-977a-cf51a551157d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf5f0055-734f-4cf4-977a-cf51a551157d" (UID: "cf5f0055-734f-4cf4-977a-cf51a551157d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.278436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" (UID: "a33d5677-3f18-49ea-9098-7b8d0c9d6cd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.278720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe436520-0000-4a6c-8e4a-8a8b1c394b94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe436520-0000-4a6c-8e4a-8a8b1c394b94" (UID: "fe436520-0000-4a6c-8e4a-8a8b1c394b94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.278867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" (UID: "6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.282747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-kube-api-access-xw5xc" (OuterVolumeSpecName: "kube-api-access-xw5xc") pod "6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" (UID: "6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c"). InnerVolumeSpecName "kube-api-access-xw5xc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.282834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5f0055-734f-4cf4-977a-cf51a551157d-kube-api-access-rr8xm" (OuterVolumeSpecName: "kube-api-access-rr8xm") pod "cf5f0055-734f-4cf4-977a-cf51a551157d" (UID: "cf5f0055-734f-4cf4-977a-cf51a551157d"). InnerVolumeSpecName "kube-api-access-rr8xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.283892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-kube-api-access-ss7l6" (OuterVolumeSpecName: "kube-api-access-ss7l6") pod "a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" (UID: "a33d5677-3f18-49ea-9098-7b8d0c9d6cd3"). InnerVolumeSpecName "kube-api-access-ss7l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.287580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe436520-0000-4a6c-8e4a-8a8b1c394b94-kube-api-access-rnc66" (OuterVolumeSpecName: "kube-api-access-rnc66") pod "fe436520-0000-4a6c-8e4a-8a8b1c394b94" (UID: "fe436520-0000-4a6c-8e4a-8a8b1c394b94"). InnerVolumeSpecName "kube-api-access-rnc66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.287647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33110471-654e-4379-8027-e105b51e6c35-kube-api-access-ztjxg" (OuterVolumeSpecName: "kube-api-access-ztjxg") pod "33110471-654e-4379-8027-e105b51e6c35" (UID: "33110471-654e-4379-8027-e105b51e6c35"). InnerVolumeSpecName "kube-api-access-ztjxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.381925 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnc66\" (UniqueName: \"kubernetes.io/projected/fe436520-0000-4a6c-8e4a-8a8b1c394b94-kube-api-access-rnc66\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.381961 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe436520-0000-4a6c-8e4a-8a8b1c394b94-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.381989 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztjxg\" (UniqueName: \"kubernetes.io/projected/33110471-654e-4379-8027-e105b51e6c35-kube-api-access-ztjxg\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382000 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr8xm\" (UniqueName: \"kubernetes.io/projected/cf5f0055-734f-4cf4-977a-cf51a551157d-kube-api-access-rr8xm\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382009 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss7l6\" (UniqueName: \"kubernetes.io/projected/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-kube-api-access-ss7l6\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382017 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw5xc\" (UniqueName: \"kubernetes.io/projected/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-kube-api-access-xw5xc\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382025 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5f0055-734f-4cf4-977a-cf51a551157d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382032 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382041 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33110471-654e-4379-8027-e105b51e6c35-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.382050 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.464871 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.652619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" event={"ID":"d6728371-3b90-4e51-bc07-8b3f08dc2420","Type":"ContainerDied","Data":"c6eb85a0a493796cacc2bbcb4be32b7ad240eb36bbf165404b62c5048a463c97"} Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.652661 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6eb85a0a493796cacc2bbcb4be32b7ad240eb36bbf165404b62c5048a463c97" Jan 21 
15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.652866 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.653918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" event={"ID":"cf5f0055-734f-4cf4-977a-cf51a551157d","Type":"ContainerDied","Data":"aed02d172e080ad849dba8d7ed9a547425b61693e6ac631001523810a739f8b8"} Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.653941 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed02d172e080ad849dba8d7ed9a547425b61693e6ac631001523810a739f8b8" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.653931 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-w54q5" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.654915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" event={"ID":"6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c","Type":"ContainerDied","Data":"a09d7a1f4fb15a3ac5adabc4172e9176d7fa75c4bffa139686a7dcbce448b5b3"} Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.654936 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09d7a1f4fb15a3ac5adabc4172e9176d7fa75c4bffa139686a7dcbce448b5b3" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.654936 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.656681 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.656680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-7tfrs" event={"ID":"33110471-654e-4379-8027-e105b51e6c35","Type":"ContainerDied","Data":"7488720ad92fa731ee0998cb90d637d343357e419e67c364104daab72937758a"} Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.656847 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7488720ad92fa731ee0998cb90d637d343357e419e67c364104daab72937758a" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.657984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" event={"ID":"fe436520-0000-4a6c-8e4a-8a8b1c394b94","Type":"ContainerDied","Data":"51cc425fefa2954bb2562011d5cb146a1584a6bb92bac789c1d31b0e2b8772a5"} Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.658008 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51cc425fefa2954bb2562011d5cb146a1584a6bb92bac789c1d31b0e2b8772a5" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.658048 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.659760 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.659787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-9qfh6" event={"ID":"a33d5677-3f18-49ea-9098-7b8d0c9d6cd3","Type":"ContainerDied","Data":"436c10f79b1cfe17cbc618a86b28a162f7202030921e577d23b4a925b92faa27"} Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.659842 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="436c10f79b1cfe17cbc618a86b28a162f7202030921e577d23b4a925b92faa27" Jan 21 15:49:29 crc kubenswrapper[4707]: I0121 15:49:29.662950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerStarted","Data":"9a9c943dfd2153d182e4bb630230f87cc6291cccd4b28535f0de7dd222862b5c"} Jan 21 15:49:30 crc kubenswrapper[4707]: I0121 15:49:30.673384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerStarted","Data":"f0d4e3543147c0c5a951f98861d7f7c7f347407de9c57238a7f2cca1509f281f"} Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406058 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9"] Jan 21 15:49:31 crc kubenswrapper[4707]: E0121 15:49:31.406540 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33110471-654e-4379-8027-e105b51e6c35" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406557 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33110471-654e-4379-8027-e105b51e6c35" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: E0121 15:49:31.406571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5f0055-734f-4cf4-977a-cf51a551157d" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406577 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5f0055-734f-4cf4-977a-cf51a551157d" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: E0121 15:49:31.406587 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406592 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: E0121 15:49:31.406612 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406617 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: E0121 15:49:31.406624 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6728371-3b90-4e51-bc07-8b3f08dc2420" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406630 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6728371-3b90-4e51-bc07-8b3f08dc2420" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: 
E0121 15:49:31.406640 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe436520-0000-4a6c-8e4a-8a8b1c394b94" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406646 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe436520-0000-4a6c-8e4a-8a8b1c394b94" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33110471-654e-4379-8027-e105b51e6c35" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406829 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6728371-3b90-4e51-bc07-8b3f08dc2420" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406840 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5f0055-734f-4cf4-977a-cf51a551157d" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe436520-0000-4a6c-8e4a-8a8b1c394b94" containerName="mariadb-account-create-update" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.406865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" containerName="mariadb-database-create" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.407344 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.409891 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.409938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rplq6" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.412527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.426040 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9"] Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.525899 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.531177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qpf\" (UniqueName: \"kubernetes.io/projected/82693866-aeda-4300-b19e-819e2f09922f-kube-api-access-98qpf\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.531314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.531542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-config-data\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.531605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-scripts\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.633606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qpf\" (UniqueName: \"kubernetes.io/projected/82693866-aeda-4300-b19e-819e2f09922f-kube-api-access-98qpf\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.633677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" 
Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.633711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-config-data\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.633734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-scripts\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.637847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-scripts\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.638236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-config-data\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.640520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.649851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qpf\" (UniqueName: \"kubernetes.io/projected/82693866-aeda-4300-b19e-819e2f09922f-kube-api-access-98qpf\") pod \"nova-cell0-conductor-db-sync-rrqb9\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.683768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerStarted","Data":"606376f270c134c8160eb54573c247bef2eab50c6b2d1a77a0f8bf5549ca4357"} Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.684340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerStarted","Data":"5aa0df8f46bc4f4a82678099c6ad049ef92d18225cbc84570cd7c2bbbca3e42f"} Jan 21 15:49:31 crc kubenswrapper[4707]: I0121 15:49:31.720543 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.080273 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9"] Jan 21 15:49:32 crc kubenswrapper[4707]: W0121 15:49:32.088384 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82693866_aeda_4300_b19e_819e2f09922f.slice/crio-9a1eaeb7c20cfb7c76c45742a3cb373c559f27a97c8f0f772cd570ab4568e5e8 WatchSource:0}: Error finding container 9a1eaeb7c20cfb7c76c45742a3cb373c559f27a97c8f0f772cd570ab4568e5e8: Status 404 returned error can't find the container with id 9a1eaeb7c20cfb7c76c45742a3cb373c559f27a97c8f0f772cd570ab4568e5e8 Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.699391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerStarted","Data":"28fd30364aacf5eddf23a1ae8d7c503c7ec7a81848462d48c5e25dd34169270d"} Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.699793 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-central-agent" containerID="cri-o://f0d4e3543147c0c5a951f98861d7f7c7f347407de9c57238a7f2cca1509f281f" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.700045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.700280 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="proxy-httpd" containerID="cri-o://28fd30364aacf5eddf23a1ae8d7c503c7ec7a81848462d48c5e25dd34169270d" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.700321 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="sg-core" containerID="cri-o://606376f270c134c8160eb54573c247bef2eab50c6b2d1a77a0f8bf5549ca4357" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.700367 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-notification-agent" containerID="cri-o://5aa0df8f46bc4f4a82678099c6ad049ef92d18225cbc84570cd7c2bbbca3e42f" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.703691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" event={"ID":"82693866-aeda-4300-b19e-819e2f09922f","Type":"ContainerStarted","Data":"564facd7467241d001d1ede0891aa84310b54a6635db128f420242316c948ab5"} Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.703739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" event={"ID":"82693866-aeda-4300-b19e-819e2f09922f","Type":"ContainerStarted","Data":"9a1eaeb7c20cfb7c76c45742a3cb373c559f27a97c8f0f772cd570ab4568e5e8"} Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.716515 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.779280999 podStartE2EDuration="4.716499361s" podCreationTimestamp="2026-01-21 15:49:28 +0000 UTC" firstStartedPulling="2026-01-21 15:49:29.468355083 +0000 UTC m=+2866.649871305" lastFinishedPulling="2026-01-21 15:49:32.405573445 +0000 UTC m=+2869.587089667" observedRunningTime="2026-01-21 15:49:32.712369376 +0000 UTC m=+2869.893885598" watchObservedRunningTime="2026-01-21 15:49:32.716499361 +0000 UTC m=+2869.898015584" Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.725917 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" podStartSLOduration=1.725904345 podStartE2EDuration="1.725904345s" podCreationTimestamp="2026-01-21 15:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:32.724755876 +0000 UTC m=+2869.906272098" watchObservedRunningTime="2026-01-21 15:49:32.725904345 +0000 UTC m=+2869.907420567" Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.756301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:32 crc kubenswrapper[4707]: I0121 15:49:32.760101 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.714657 4707 generic.go:334] "Generic (PLEG): container finished" podID="639b49a9-a884-4fff-be7d-b423966429ae" containerID="28fd30364aacf5eddf23a1ae8d7c503c7ec7a81848462d48c5e25dd34169270d" exitCode=0 Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.714908 4707 generic.go:334] "Generic (PLEG): container finished" podID="639b49a9-a884-4fff-be7d-b423966429ae" containerID="606376f270c134c8160eb54573c247bef2eab50c6b2d1a77a0f8bf5549ca4357" exitCode=2 Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.714918 4707 generic.go:334] "Generic (PLEG): container finished" podID="639b49a9-a884-4fff-be7d-b423966429ae" containerID="5aa0df8f46bc4f4a82678099c6ad049ef92d18225cbc84570cd7c2bbbca3e42f" exitCode=0 Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.714927 4707 generic.go:334] "Generic (PLEG): container finished" podID="639b49a9-a884-4fff-be7d-b423966429ae" containerID="f0d4e3543147c0c5a951f98861d7f7c7f347407de9c57238a7f2cca1509f281f" exitCode=0 Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.714707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerDied","Data":"28fd30364aacf5eddf23a1ae8d7c503c7ec7a81848462d48c5e25dd34169270d"} Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.715166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerDied","Data":"606376f270c134c8160eb54573c247bef2eab50c6b2d1a77a0f8bf5549ca4357"} Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.715178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerDied","Data":"5aa0df8f46bc4f4a82678099c6ad049ef92d18225cbc84570cd7c2bbbca3e42f"} Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.715187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerDied","Data":"f0d4e3543147c0c5a951f98861d7f7c7f347407de9c57238a7f2cca1509f281f"} Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.869780 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-combined-ca-bundle\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-sg-core-conf-yaml\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtmzj\" (UniqueName: \"kubernetes.io/projected/639b49a9-a884-4fff-be7d-b423966429ae-kube-api-access-mtmzj\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-scripts\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-config-data\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-log-httpd\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.968978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-run-httpd\") pod \"639b49a9-a884-4fff-be7d-b423966429ae\" (UID: \"639b49a9-a884-4fff-be7d-b423966429ae\") " Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.969327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.969359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.973662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-scripts" (OuterVolumeSpecName: "scripts") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.975949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639b49a9-a884-4fff-be7d-b423966429ae-kube-api-access-mtmzj" (OuterVolumeSpecName: "kube-api-access-mtmzj") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "kube-api-access-mtmzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:33 crc kubenswrapper[4707]: I0121 15:49:33.995953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.023884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.039963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-config-data" (OuterVolumeSpecName: "config-data") pod "639b49a9-a884-4fff-be7d-b423966429ae" (UID: "639b49a9-a884-4fff-be7d-b423966429ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070798 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070844 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtmzj\" (UniqueName: \"kubernetes.io/projected/639b49a9-a884-4fff-be7d-b423966429ae-kube-api-access-mtmzj\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070857 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070867 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070875 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070882 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/639b49a9-a884-4fff-be7d-b423966429ae-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.070890 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/639b49a9-a884-4fff-be7d-b423966429ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.726314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"639b49a9-a884-4fff-be7d-b423966429ae","Type":"ContainerDied","Data":"9a9c943dfd2153d182e4bb630230f87cc6291cccd4b28535f0de7dd222862b5c"} Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.726524 4707 scope.go:117] "RemoveContainer" containerID="28fd30364aacf5eddf23a1ae8d7c503c7ec7a81848462d48c5e25dd34169270d" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.726657 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.748390 4707 scope.go:117] "RemoveContainer" containerID="606376f270c134c8160eb54573c247bef2eab50c6b2d1a77a0f8bf5549ca4357" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.758319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.766042 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.768697 4707 scope.go:117] "RemoveContainer" containerID="5aa0df8f46bc4f4a82678099c6ad049ef92d18225cbc84570cd7c2bbbca3e42f" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.780726 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:34 crc kubenswrapper[4707]: E0121 15:49:34.781638 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="sg-core" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.781661 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="sg-core" Jan 21 15:49:34 crc kubenswrapper[4707]: E0121 15:49:34.781669 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-central-agent" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.781676 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-central-agent" Jan 21 15:49:34 crc kubenswrapper[4707]: E0121 15:49:34.781709 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="proxy-httpd" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.781714 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="proxy-httpd" Jan 21 15:49:34 crc kubenswrapper[4707]: E0121 15:49:34.781733 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-notification-agent" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.781739 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-notification-agent" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.782146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-central-agent" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.782169 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="ceilometer-notification-agent" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.782184 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="sg-core" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.782215 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="639b49a9-a884-4fff-be7d-b423966429ae" containerName="proxy-httpd" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.787394 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.802735 4707 scope.go:117] "RemoveContainer" containerID="f0d4e3543147c0c5a951f98861d7f7c7f347407de9c57238a7f2cca1509f281f" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.803374 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.808362 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.812426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.888350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.889543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-scripts\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.889708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-config-data\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.889739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-run-httpd\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.889773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-log-httpd\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.889799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.889953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572wn\" (UniqueName: \"kubernetes.io/projected/7c006942-d17c-4b1f-b102-164d32c423b8-kube-api-access-572wn\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991294 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-config-data\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-run-httpd\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-log-httpd\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572wn\" (UniqueName: \"kubernetes.io/projected/7c006942-d17c-4b1f-b102-164d32c423b8-kube-api-access-572wn\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-scripts\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.991764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-run-httpd\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.992250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-log-httpd\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.995645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-config-data\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc 
kubenswrapper[4707]: I0121 15:49:34.995681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:34 crc kubenswrapper[4707]: I0121 15:49:34.996195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-scripts\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.004004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.018207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572wn\" (UniqueName: \"kubernetes.io/projected/7c006942-d17c-4b1f-b102-164d32c423b8-kube-api-access-572wn\") pod \"ceilometer-0\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.120726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.190924 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639b49a9-a884-4fff-be7d-b423966429ae" path="/var/lib/kubelet/pods/639b49a9-a884-4fff-be7d-b423966429ae/volumes" Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.302884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.508488 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:35 crc kubenswrapper[4707]: I0121 15:49:35.733558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerStarted","Data":"2604c08a862c4546eadeeca51e9fd4ea654e51451b4ed4302c61eacd85de2160"} Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.209571 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.210021 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-log" containerID="cri-o://0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d" gracePeriod=30 Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.210083 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-httpd" containerID="cri-o://4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3" gracePeriod=30 Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.742654 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="d08432a6-65ad-4843-924c-2b1f96613356" containerID="0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d" exitCode=143 Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.742870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d08432a6-65ad-4843-924c-2b1f96613356","Type":"ContainerDied","Data":"0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d"} Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.744396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerStarted","Data":"d015562c04f2f5a72f21bebe0f422fe61036f15721af23162c79aeac222b8c2a"} Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.869383 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.869641 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-log" containerID="cri-o://02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf" gracePeriod=30 Jan 21 15:49:36 crc kubenswrapper[4707]: I0121 15:49:36.869870 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-httpd" containerID="cri-o://d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0" gracePeriod=30 Jan 21 15:49:37 crc kubenswrapper[4707]: I0121 15:49:37.755163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerStarted","Data":"45efb98ad79ad40913c3c78b22cda7501766642e135709e181e309672ceae49d"} Jan 21 15:49:37 crc kubenswrapper[4707]: I0121 15:49:37.755224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerStarted","Data":"c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca"} Jan 21 15:49:37 crc kubenswrapper[4707]: I0121 15:49:37.756918 4707 generic.go:334] "Generic (PLEG): container finished" podID="f256c211-e440-42ff-80ac-3b9084007cf7" containerID="02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf" exitCode=143 Jan 21 15:49:37 crc kubenswrapper[4707]: I0121 15:49:37.756943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f256c211-e440-42ff-80ac-3b9084007cf7","Type":"ContainerDied","Data":"02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf"} Jan 21 15:49:38 crc kubenswrapper[4707]: I0121 15:49:38.765928 4707 generic.go:334] "Generic (PLEG): container finished" podID="82693866-aeda-4300-b19e-819e2f09922f" containerID="564facd7467241d001d1ede0891aa84310b54a6635db128f420242316c948ab5" exitCode=0 Jan 21 15:49:38 crc kubenswrapper[4707]: I0121 15:49:38.765997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" event={"ID":"82693866-aeda-4300-b19e-819e2f09922f","Type":"ContainerDied","Data":"564facd7467241d001d1ede0891aa84310b54a6635db128f420242316c948ab5"} Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.285857 4707 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.153:9292/healthcheck\": read tcp 10.217.0.2:53408->10.217.1.153:9292: read: connection reset by peer" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.285884 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.153:9292/healthcheck\": read tcp 10.217.0.2:53416->10.217.1.153:9292: read: connection reset by peer" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.569822 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.664769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-httpd-run\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqjld\" (UniqueName: \"kubernetes.io/projected/d08432a6-65ad-4843-924c-2b1f96613356-kube-api-access-gqjld\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665061 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-scripts\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-combined-ca-bundle\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-config-data\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-logs\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d08432a6-65ad-4843-924c-2b1f96613356\" (UID: \"d08432a6-65ad-4843-924c-2b1f96613356\") " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.665564 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.666685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-logs" (OuterVolumeSpecName: "logs") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.669187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08432a6-65ad-4843-924c-2b1f96613356-kube-api-access-gqjld" (OuterVolumeSpecName: "kube-api-access-gqjld") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "kube-api-access-gqjld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.669438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.670018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-scripts" (OuterVolumeSpecName: "scripts") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.684331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.699566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-config-data" (OuterVolumeSpecName: "config-data") pod "d08432a6-65ad-4843-924c-2b1f96613356" (UID: "d08432a6-65ad-4843-924c-2b1f96613356"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.767026 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08432a6-65ad-4843-924c-2b1f96613356-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.767081 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.767094 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqjld\" (UniqueName: \"kubernetes.io/projected/d08432a6-65ad-4843-924c-2b1f96613356-kube-api-access-gqjld\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.767103 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.767111 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.767120 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08432a6-65ad-4843-924c-2b1f96613356-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.774876 4707 generic.go:334] "Generic (PLEG): container finished" podID="d08432a6-65ad-4843-924c-2b1f96613356" containerID="4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3" exitCode=0 Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.774964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d08432a6-65ad-4843-924c-2b1f96613356","Type":"ContainerDied","Data":"4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3"} Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.775022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d08432a6-65ad-4843-924c-2b1f96613356","Type":"ContainerDied","Data":"60a2f497ebf06abc01430dea1bb4554116d32887e5f1d0aad6e883ba4a2d7c47"} Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.775042 4707 scope.go:117] "RemoveContainer" containerID="4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.774989 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.778119 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-central-agent" containerID="cri-o://d015562c04f2f5a72f21bebe0f422fe61036f15721af23162c79aeac222b8c2a" gracePeriod=30 Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.778404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerStarted","Data":"7faa597b726a176eee8f61b373547446ac9a3ff6e22de055bc920745aa5113da"} Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.778526 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.778773 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="proxy-httpd" containerID="cri-o://7faa597b726a176eee8f61b373547446ac9a3ff6e22de055bc920745aa5113da" gracePeriod=30 Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.778978 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="sg-core" containerID="cri-o://45efb98ad79ad40913c3c78b22cda7501766642e135709e181e309672ceae49d" gracePeriod=30 Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.779073 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-notification-agent" containerID="cri-o://c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca" gracePeriod=30 Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.794909 4707 scope.go:117] "RemoveContainer" containerID="0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.804845 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.476685963 podStartE2EDuration="5.804829159s" podCreationTimestamp="2026-01-21 15:49:34 +0000 UTC" firstStartedPulling="2026-01-21 15:49:35.501059619 +0000 UTC m=+2872.682575841" lastFinishedPulling="2026-01-21 15:49:38.829202814 +0000 UTC m=+2876.010719037" observedRunningTime="2026-01-21 15:49:39.797035385 +0000 UTC m=+2876.978551607" watchObservedRunningTime="2026-01-21 15:49:39.804829159 +0000 UTC m=+2876.986345381" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.809475 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.818940 4707 scope.go:117] "RemoveContainer" containerID="4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3" Jan 21 15:49:39 crc kubenswrapper[4707]: E0121 15:49:39.820292 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3\": container with ID starting with 
4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3 not found: ID does not exist" containerID="4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.820323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3"} err="failed to get container status \"4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3\": rpc error: code = NotFound desc = could not find container \"4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3\": container with ID starting with 4e4cfc625eaee2c57ab5dc4382e4d1bcca50017bda0f7f7df0ad73321cbddaf3 not found: ID does not exist" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.820344 4707 scope.go:117] "RemoveContainer" containerID="0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d" Jan 21 15:49:39 crc kubenswrapper[4707]: E0121 15:49:39.820650 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d\": container with ID starting with 0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d not found: ID does not exist" containerID="0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.820672 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d"} err="failed to get container status \"0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d\": rpc error: code = NotFound desc = could not find container \"0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d\": container with ID starting with 0cc4eef964fa309fa0c6be487293b1b2e22b0b4e5cd4f5e437cc7c7ab3cecb9d not found: ID does not exist" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.837964 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.843309 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.861151 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:49:39 crc kubenswrapper[4707]: E0121 15:49:39.861522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-log" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.861542 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-log" Jan 21 15:49:39 crc kubenswrapper[4707]: E0121 15:49:39.861576 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-httpd" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.861583 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-httpd" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.861723 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-httpd" Jan 21 15:49:39 crc kubenswrapper[4707]: 
I0121 15:49:39.861744 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08432a6-65ad-4843-924c-2b1f96613356" containerName="glance-log" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.862647 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.865347 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.868354 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.869524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.945696 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.945747 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54ht\" (UniqueName: \"kubernetes.io/projected/f02894e7-9ae7-40df-a2d1-c9074494a383-kube-api-access-q54ht\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-logs\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-scripts\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-config-data\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:39 crc kubenswrapper[4707]: I0121 15:49:39.969670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.059147 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-scripts\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-config-data\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q54ht\" (UniqueName: \"kubernetes.io/projected/f02894e7-9ae7-40df-a2d1-c9074494a383-kube-api-access-q54ht\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-logs\") pod \"glance-default-external-api-0\" (UID: 
\"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087780 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.087856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.088329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-logs\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.096877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-config-data\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.097331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.099373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-scripts\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.102425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54ht\" (UniqueName: \"kubernetes.io/projected/f02894e7-9ae7-40df-a2d1-c9074494a383-kube-api-access-q54ht\") pod \"glance-default-external-api-0\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.111093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"f02894e7-9ae7-40df-a2d1-c9074494a383\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: E0121 15:49:40.145365 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c006942_d17c_4b1f_b102_164d32c423b8.slice/crio-conmon-c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c006942_d17c_4b1f_b102_164d32c423b8.slice/crio-c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.187902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-scripts\") pod \"82693866-aeda-4300-b19e-819e2f09922f\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.188217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-combined-ca-bundle\") pod \"82693866-aeda-4300-b19e-819e2f09922f\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.188276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98qpf\" (UniqueName: \"kubernetes.io/projected/82693866-aeda-4300-b19e-819e2f09922f-kube-api-access-98qpf\") pod \"82693866-aeda-4300-b19e-819e2f09922f\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.188321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-config-data\") pod \"82693866-aeda-4300-b19e-819e2f09922f\" (UID: \"82693866-aeda-4300-b19e-819e2f09922f\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.191060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-scripts" (OuterVolumeSpecName: "scripts") pod "82693866-aeda-4300-b19e-819e2f09922f" (UID: "82693866-aeda-4300-b19e-819e2f09922f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.191274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82693866-aeda-4300-b19e-819e2f09922f-kube-api-access-98qpf" (OuterVolumeSpecName: "kube-api-access-98qpf") pod "82693866-aeda-4300-b19e-819e2f09922f" (UID: "82693866-aeda-4300-b19e-819e2f09922f"). InnerVolumeSpecName "kube-api-access-98qpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.210315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-config-data" (OuterVolumeSpecName: "config-data") pod "82693866-aeda-4300-b19e-819e2f09922f" (UID: "82693866-aeda-4300-b19e-819e2f09922f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.215226 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.216778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82693866-aeda-4300-b19e-819e2f09922f" (UID: "82693866-aeda-4300-b19e-819e2f09922f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.259882 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctc2z\" (UniqueName: \"kubernetes.io/projected/f256c211-e440-42ff-80ac-3b9084007cf7-kube-api-access-ctc2z\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-httpd-run\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-logs\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-scripts\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-combined-ca-bundle\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-config-data\") pod \"f256c211-e440-42ff-80ac-3b9084007cf7\" (UID: \"f256c211-e440-42ff-80ac-3b9084007cf7\") " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.289862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-logs" (OuterVolumeSpecName: "logs") pod 
"f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290292 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98qpf\" (UniqueName: \"kubernetes.io/projected/82693866-aeda-4300-b19e-819e2f09922f-kube-api-access-98qpf\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290308 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290318 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290327 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290335 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82693866-aeda-4300-b19e-819e2f09922f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.290343 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f256c211-e440-42ff-80ac-3b9084007cf7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.292036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f256c211-e440-42ff-80ac-3b9084007cf7-kube-api-access-ctc2z" (OuterVolumeSpecName: "kube-api-access-ctc2z") pod "f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "kube-api-access-ctc2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.292563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.297185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-scripts" (OuterVolumeSpecName: "scripts") pod "f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.311135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.332784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-config-data" (OuterVolumeSpecName: "config-data") pod "f256c211-e440-42ff-80ac-3b9084007cf7" (UID: "f256c211-e440-42ff-80ac-3b9084007cf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.392063 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.392216 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctc2z\" (UniqueName: \"kubernetes.io/projected/f256c211-e440-42ff-80ac-3b9084007cf7-kube-api-access-ctc2z\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.392227 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.392236 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256c211-e440-42ff-80ac-3b9084007cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.392272 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.409773 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.494194 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.787348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" event={"ID":"82693866-aeda-4300-b19e-819e2f09922f","Type":"ContainerDied","Data":"9a1eaeb7c20cfb7c76c45742a3cb373c559f27a97c8f0f772cd570ab4568e5e8"} Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.787388 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.787409 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1eaeb7c20cfb7c76c45742a3cb373c559f27a97c8f0f772cd570ab4568e5e8" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.789347 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c006942-d17c-4b1f-b102-164d32c423b8" containerID="7faa597b726a176eee8f61b373547446ac9a3ff6e22de055bc920745aa5113da" exitCode=0 Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.789379 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c006942-d17c-4b1f-b102-164d32c423b8" containerID="45efb98ad79ad40913c3c78b22cda7501766642e135709e181e309672ceae49d" exitCode=2 Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.789382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerDied","Data":"7faa597b726a176eee8f61b373547446ac9a3ff6e22de055bc920745aa5113da"} Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.789405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerDied","Data":"45efb98ad79ad40913c3c78b22cda7501766642e135709e181e309672ceae49d"} Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.789417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerDied","Data":"c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca"} Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.789389 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c006942-d17c-4b1f-b102-164d32c423b8" containerID="c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca" exitCode=0 Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.795883 4707 generic.go:334] "Generic (PLEG): container finished" podID="f256c211-e440-42ff-80ac-3b9084007cf7" containerID="d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0" exitCode=0 Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.795911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f256c211-e440-42ff-80ac-3b9084007cf7","Type":"ContainerDied","Data":"d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0"} Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.795947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f256c211-e440-42ff-80ac-3b9084007cf7","Type":"ContainerDied","Data":"81eeadfd836334be8a28aedd21ac0aaaafb33c397ec11a33e1cb2bef65754285"} Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.795969 4707 scope.go:117] "RemoveContainer" containerID="d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.796002 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.816922 4707 scope.go:117] "RemoveContainer" containerID="02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.825731 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.831211 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.845741 4707 scope.go:117] "RemoveContainer" containerID="d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0" Jan 21 15:49:40 crc kubenswrapper[4707]: E0121 15:49:40.846123 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0\": container with ID starting with d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0 not found: ID does not exist" containerID="d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846151 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0"} err="failed to get container status \"d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0\": rpc error: code = NotFound desc = could not find container \"d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0\": container with ID starting with d47a4340ae291b3b41f0c2314ef39506cef82dcecb1a97f3896ef8658a63b7e0 not found: ID does not exist" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846171 4707 scope.go:117] "RemoveContainer" containerID="02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf" Jan 21 15:49:40 crc kubenswrapper[4707]: E0121 15:49:40.846385 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf\": container with ID starting with 02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf not found: ID does not exist" containerID="02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846427 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf"} err="failed to get container status \"02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf\": rpc error: code = NotFound desc = could not find container \"02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf\": container with ID starting with 02468bbd50600544ca84891d278dfa4e110dc90236044c40123df7454bb0bfbf not found: ID does not exist" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846455 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:49:40 crc kubenswrapper[4707]: E0121 15:49:40.846790 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-httpd" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846820 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-httpd" Jan 21 15:49:40 crc kubenswrapper[4707]: E0121 15:49:40.846829 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82693866-aeda-4300-b19e-819e2f09922f" containerName="nova-cell0-conductor-db-sync" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846836 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="82693866-aeda-4300-b19e-819e2f09922f" containerName="nova-cell0-conductor-db-sync" Jan 21 15:49:40 crc kubenswrapper[4707]: E0121 15:49:40.846846 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-log" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846851 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-log" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.846988 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-httpd" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.847003 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="82693866-aeda-4300-b19e-819e2f09922f" containerName="nova-cell0-conductor-db-sync" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.847021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" containerName="glance-log" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.848463 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.850334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.858445 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.859374 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.863568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.863778 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rplq6" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.875797 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.899593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.899637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.899661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssl48\" (UniqueName: \"kubernetes.io/projected/2a7a7bf7-097d-4995-a426-90bda67d7490-kube-api-access-ssl48\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-logs\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901393 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp5mb\" (UniqueName: \"kubernetes.io/projected/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-kube-api-access-hp5mb\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.901435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.906714 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:49:40 crc kubenswrapper[4707]: W0121 15:49:40.935008 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf02894e7_9ae7_40df_a2d1_c9074494a383.slice/crio-a3650837a2001d12774627e50c73d718a1688b24b0568b4f1d9e45caa320c2bc WatchSource:0}: Error finding container a3650837a2001d12774627e50c73d718a1688b24b0568b4f1d9e45caa320c2bc: Status 404 returned error can't find the container with id a3650837a2001d12774627e50c73d718a1688b24b0568b4f1d9e45caa320c2bc Jan 21 15:49:40 crc kubenswrapper[4707]: I0121 15:49:40.935514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-logs\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp5mb\" (UniqueName: \"kubernetes.io/projected/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-kube-api-access-hp5mb\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssl48\" (UniqueName: \"kubernetes.io/projected/2a7a7bf7-097d-4995-a426-90bda67d7490-kube-api-access-ssl48\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003834 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-logs\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.004420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.003844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-config-data\") pod 
\"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.008608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.009405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.011317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.014992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.015140 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.017387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp5mb\" (UniqueName: \"kubernetes.io/projected/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-kube-api-access-hp5mb\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.018644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssl48\" (UniqueName: \"kubernetes.io/projected/2a7a7bf7-097d-4995-a426-90bda67d7490-kube-api-access-ssl48\") pod \"nova-cell0-conductor-0\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.035222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.190677 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08432a6-65ad-4843-924c-2b1f96613356" path="/var/lib/kubelet/pods/d08432a6-65ad-4843-924c-2b1f96613356/volumes" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.191352 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f256c211-e440-42ff-80ac-3b9084007cf7" path="/var/lib/kubelet/pods/f256c211-e440-42ff-80ac-3b9084007cf7/volumes" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.205397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.209698 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.598129 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:49:41 crc kubenswrapper[4707]: W0121 15:49:41.602724 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0687c9e1_7ccd_454c_8ffd_8507cb8af46c.slice/crio-74412d0b57753166d229bd381eff8c0987a5399eb4f4032d1479c0fe828f3a12 WatchSource:0}: Error finding container 74412d0b57753166d229bd381eff8c0987a5399eb4f4032d1479c0fe828f3a12: Status 404 returned error can't find the container with id 74412d0b57753166d229bd381eff8c0987a5399eb4f4032d1479c0fe828f3a12 Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.654175 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:41 crc kubenswrapper[4707]: W0121 15:49:41.659202 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7a7bf7_097d_4995_a426_90bda67d7490.slice/crio-6f53f07be15fdb547fde47138a3a5dbfe0f6ff0194fd49ea97325e35879b52c0 WatchSource:0}: Error finding container 6f53f07be15fdb547fde47138a3a5dbfe0f6ff0194fd49ea97325e35879b52c0: Status 404 returned error can't find the container with id 6f53f07be15fdb547fde47138a3a5dbfe0f6ff0194fd49ea97325e35879b52c0 Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.806108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2a7a7bf7-097d-4995-a426-90bda67d7490","Type":"ContainerStarted","Data":"a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857"} Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.806307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2a7a7bf7-097d-4995-a426-90bda67d7490","Type":"ContainerStarted","Data":"6f53f07be15fdb547fde47138a3a5dbfe0f6ff0194fd49ea97325e35879b52c0"} Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.807183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.809994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f02894e7-9ae7-40df-a2d1-c9074494a383","Type":"ContainerStarted","Data":"d92766c25e2a6639816ebec6fda095e760a14a87d4377f309301e1ed97c1a7c9"} Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.810040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f02894e7-9ae7-40df-a2d1-c9074494a383","Type":"ContainerStarted","Data":"a61916f8e83ea502f8075b4d59765ed8f01db90f73b4f184cfeafe8b58ce0fda"} Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.810052 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f02894e7-9ae7-40df-a2d1-c9074494a383","Type":"ContainerStarted","Data":"a3650837a2001d12774627e50c73d718a1688b24b0568b4f1d9e45caa320c2bc"} Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.811654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0687c9e1-7ccd-454c-8ffd-8507cb8af46c","Type":"ContainerStarted","Data":"74412d0b57753166d229bd381eff8c0987a5399eb4f4032d1479c0fe828f3a12"} Jan 21 15:49:41 crc kubenswrapper[4707]: I0121 15:49:41.854313 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.854283801 podStartE2EDuration="1.854283801s" podCreationTimestamp="2026-01-21 15:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:41.823638019 +0000 UTC m=+2879.005154240" watchObservedRunningTime="2026-01-21 15:49:41.854283801 +0000 UTC m=+2879.035800022" Jan 21 15:49:42 crc kubenswrapper[4707]: I0121 15:49:42.837625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0687c9e1-7ccd-454c-8ffd-8507cb8af46c","Type":"ContainerStarted","Data":"dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e"} Jan 21 15:49:42 crc kubenswrapper[4707]: I0121 15:49:42.838229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0687c9e1-7ccd-454c-8ffd-8507cb8af46c","Type":"ContainerStarted","Data":"4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8"} Jan 21 15:49:42 crc kubenswrapper[4707]: I0121 15:49:42.850998 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.850980913 podStartE2EDuration="3.850980913s" podCreationTimestamp="2026-01-21 15:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:41.839488802 +0000 UTC m=+2879.021005024" watchObservedRunningTime="2026-01-21 15:49:42.850980913 +0000 UTC m=+2880.032497134" Jan 21 15:49:42 crc kubenswrapper[4707]: I0121 15:49:42.852846 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.85283573 podStartE2EDuration="2.85283573s" podCreationTimestamp="2026-01-21 15:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:42.850140933 +0000 UTC m=+2880.031657155" watchObservedRunningTime="2026-01-21 15:49:42.85283573 +0000 UTC m=+2880.034351953" Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.851188 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c006942-d17c-4b1f-b102-164d32c423b8" containerID="d015562c04f2f5a72f21bebe0f422fe61036f15721af23162c79aeac222b8c2a" exitCode=0 Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.851274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerDied","Data":"d015562c04f2f5a72f21bebe0f422fe61036f15721af23162c79aeac222b8c2a"} Jan 21 
15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.957979 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-log-httpd\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-sg-core-conf-yaml\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-config-data\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-scripts\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572wn\" (UniqueName: \"kubernetes.io/projected/7c006942-d17c-4b1f-b102-164d32c423b8-kube-api-access-572wn\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-combined-ca-bundle\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.986921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-run-httpd\") pod \"7c006942-d17c-4b1f-b102-164d32c423b8\" (UID: \"7c006942-d17c-4b1f-b102-164d32c423b8\") " Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.987621 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.987695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.992240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c006942-d17c-4b1f-b102-164d32c423b8-kube-api-access-572wn" (OuterVolumeSpecName: "kube-api-access-572wn") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "kube-api-access-572wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:44 crc kubenswrapper[4707]: I0121 15:49:44.992342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-scripts" (OuterVolumeSpecName: "scripts") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.007310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.033712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.045227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-config-data" (OuterVolumeSpecName: "config-data") pod "7c006942-d17c-4b1f-b102-164d32c423b8" (UID: "7c006942-d17c-4b1f-b102-164d32c423b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.089278 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.089305 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.089316 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572wn\" (UniqueName: \"kubernetes.io/projected/7c006942-d17c-4b1f-b102-164d32c423b8-kube-api-access-572wn\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.089333 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.089342 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c006942-d17c-4b1f-b102-164d32c423b8-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.089350 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c006942-d17c-4b1f-b102-164d32c423b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.865700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7c006942-d17c-4b1f-b102-164d32c423b8","Type":"ContainerDied","Data":"2604c08a862c4546eadeeca51e9fd4ea654e51451b4ed4302c61eacd85de2160"} Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.865751 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.866478 4707 scope.go:117] "RemoveContainer" containerID="7faa597b726a176eee8f61b373547446ac9a3ff6e22de055bc920745aa5113da" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.906710 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.908851 4707 scope.go:117] "RemoveContainer" containerID="45efb98ad79ad40913c3c78b22cda7501766642e135709e181e309672ceae49d" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.918965 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925167 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:45 crc kubenswrapper[4707]: E0121 15:49:45.925599 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="proxy-httpd" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925619 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="proxy-httpd" Jan 21 15:49:45 crc kubenswrapper[4707]: E0121 15:49:45.925639 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-notification-agent" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925644 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-notification-agent" Jan 21 15:49:45 crc kubenswrapper[4707]: E0121 15:49:45.925657 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="sg-core" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="sg-core" Jan 21 15:49:45 crc kubenswrapper[4707]: E0121 15:49:45.925672 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-central-agent" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925678 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-central-agent" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925850 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="proxy-httpd" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925866 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-notification-agent" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="sg-core" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.925893 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" containerName="ceilometer-central-agent" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.927597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.929794 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.930678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.931158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.936936 4707 scope.go:117] "RemoveContainer" containerID="c9615e19b7c6188f13c8f3a92046ca9385b3874241dc6df73bf45530dd0e7dca" Jan 21 15:49:45 crc kubenswrapper[4707]: I0121 15:49:45.963114 4707 scope.go:117] "RemoveContainer" containerID="d015562c04f2f5a72f21bebe0f422fe61036f15721af23162c79aeac222b8c2a" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-log-httpd\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-run-httpd\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-scripts\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-config-data\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.006786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmn6\" (UniqueName: \"kubernetes.io/projected/20366c22-d65c-416c-a985-728cc72025ca-kube-api-access-fgmn6\") pod \"ceilometer-0\" (UID: 
\"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.066692 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.067055 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" gracePeriod=30 Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.068466 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.071163 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.072190 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.072221 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerName="nova-cell0-conductor-conductor" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-scripts\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-config-data\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmn6\" (UniqueName: \"kubernetes.io/projected/20366c22-d65c-416c-a985-728cc72025ca-kube-api-access-fgmn6\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-log-httpd\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.108826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-run-httpd\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.109194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-run-httpd\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.110087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-log-httpd\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.112532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.112775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-scripts\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.113451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-config-data\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.114035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.123981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-fgmn6\" (UniqueName: \"kubernetes.io/projected/20366c22-d65c-416c-a985-728cc72025ca-kube-api-access-fgmn6\") pod \"ceilometer-0\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.212024 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.216304 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.219791 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:49:46 crc kubenswrapper[4707]: E0121 15:49:46.219850 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerName="nova-cell0-conductor-conductor" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.248189 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.644680 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:46 crc kubenswrapper[4707]: W0121 15:49:46.645610 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20366c22_d65c_416c_a985_728cc72025ca.slice/crio-34e26ee0fd971909a356877d38a6e27375ad0fc47ccd2de82c42b4bb01cba5e8 WatchSource:0}: Error finding container 34e26ee0fd971909a356877d38a6e27375ad0fc47ccd2de82c42b4bb01cba5e8: Status 404 returned error can't find the container with id 34e26ee0fd971909a356877d38a6e27375ad0fc47ccd2de82c42b4bb01cba5e8 Jan 21 15:49:46 crc kubenswrapper[4707]: I0121 15:49:46.873245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerStarted","Data":"34e26ee0fd971909a356877d38a6e27375ad0fc47ccd2de82c42b4bb01cba5e8"} Jan 21 15:49:47 crc kubenswrapper[4707]: I0121 15:49:47.191887 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c006942-d17c-4b1f-b102-164d32c423b8" path="/var/lib/kubelet/pods/7c006942-d17c-4b1f-b102-164d32c423b8/volumes" Jan 21 15:49:47 crc kubenswrapper[4707]: I0121 15:49:47.292743 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:49:47 crc kubenswrapper[4707]: I0121 15:49:47.885438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerStarted","Data":"76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831"} Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.693121 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.748969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-config-data\") pod \"2a7a7bf7-097d-4995-a426-90bda67d7490\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.749034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssl48\" (UniqueName: \"kubernetes.io/projected/2a7a7bf7-097d-4995-a426-90bda67d7490-kube-api-access-ssl48\") pod \"2a7a7bf7-097d-4995-a426-90bda67d7490\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.749064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-combined-ca-bundle\") pod \"2a7a7bf7-097d-4995-a426-90bda67d7490\" (UID: \"2a7a7bf7-097d-4995-a426-90bda67d7490\") " Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.753351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7a7bf7-097d-4995-a426-90bda67d7490-kube-api-access-ssl48" (OuterVolumeSpecName: "kube-api-access-ssl48") pod "2a7a7bf7-097d-4995-a426-90bda67d7490" (UID: "2a7a7bf7-097d-4995-a426-90bda67d7490"). InnerVolumeSpecName "kube-api-access-ssl48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.768591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-config-data" (OuterVolumeSpecName: "config-data") pod "2a7a7bf7-097d-4995-a426-90bda67d7490" (UID: "2a7a7bf7-097d-4995-a426-90bda67d7490"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.769504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a7a7bf7-097d-4995-a426-90bda67d7490" (UID: "2a7a7bf7-097d-4995-a426-90bda67d7490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.851596 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.851628 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssl48\" (UniqueName: \"kubernetes.io/projected/2a7a7bf7-097d-4995-a426-90bda67d7490-kube-api-access-ssl48\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.851639 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a7bf7-097d-4995-a426-90bda67d7490-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.893383 4707 generic.go:334] "Generic (PLEG): container finished" podID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" exitCode=0 Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.893448 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.893458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2a7a7bf7-097d-4995-a426-90bda67d7490","Type":"ContainerDied","Data":"a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857"} Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.893486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2a7a7bf7-097d-4995-a426-90bda67d7490","Type":"ContainerDied","Data":"6f53f07be15fdb547fde47138a3a5dbfe0f6ff0194fd49ea97325e35879b52c0"} Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.893503 4707 scope.go:117] "RemoveContainer" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.896109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerStarted","Data":"d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f"} Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.896148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerStarted","Data":"fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643"} Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.915952 4707 scope.go:117] "RemoveContainer" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" Jan 21 15:49:48 crc kubenswrapper[4707]: E0121 15:49:48.916338 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857\": container with ID starting with a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857 not found: ID does not exist" containerID="a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.916372 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857"} err="failed to get container status \"a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857\": rpc error: code = NotFound desc = could not find container \"a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857\": container with ID starting with a2d8dbb88c2b8cf2ef8bfc6492ecba489121728ada2301497297a621108d7857 not found: ID does not exist" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.922729 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.930854 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.939539 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:48 crc kubenswrapper[4707]: E0121 15:49:48.940096 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerName="nova-cell0-conductor-conductor" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.940118 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerName="nova-cell0-conductor-conductor" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.940419 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" containerName="nova-cell0-conductor-conductor" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.941180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.942969 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.943162 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-rplq6" Jan 21 15:49:48 crc kubenswrapper[4707]: I0121 15:49:48.947653 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.095096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.095462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.095500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzjsh\" (UniqueName: \"kubernetes.io/projected/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-kube-api-access-zzjsh\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.192859 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7a7bf7-097d-4995-a426-90bda67d7490" path="/var/lib/kubelet/pods/2a7a7bf7-097d-4995-a426-90bda67d7490/volumes" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.196497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.196545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.196585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzjsh\" (UniqueName: 
\"kubernetes.io/projected/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-kube-api-access-zzjsh\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.215479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.215513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.220570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzjsh\" (UniqueName: \"kubernetes.io/projected/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-kube-api-access-zzjsh\") pod \"nova-cell0-conductor-0\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.278310 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: W0121 15:49:49.646021 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e0f6b31_9752_42cb_90d2_6fa1c2f58fcb.slice/crio-c4e432540931c8225bb3f60778328467daf79bc620c912477b3167314de7423d WatchSource:0}: Error finding container c4e432540931c8225bb3f60778328467daf79bc620c912477b3167314de7423d: Status 404 returned error can't find the container with id c4e432540931c8225bb3f60778328467daf79bc620c912477b3167314de7423d Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.650469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.906343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerStarted","Data":"bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf"} Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.906470 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-central-agent" containerID="cri-o://76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831" gracePeriod=30 Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.906488 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.906508 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="proxy-httpd" containerID="cri-o://bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf" gracePeriod=30 Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.906555 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-notification-agent" containerID="cri-o://fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643" gracePeriod=30 Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.906555 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="sg-core" containerID="cri-o://d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f" gracePeriod=30 Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.918410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb","Type":"ContainerStarted","Data":"abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c"} Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.918488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb","Type":"ContainerStarted","Data":"c4e432540931c8225bb3f60778328467daf79bc620c912477b3167314de7423d"} Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.918625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.926944 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.861795064 podStartE2EDuration="4.926928077s" podCreationTimestamp="2026-01-21 15:49:45 +0000 UTC" firstStartedPulling="2026-01-21 15:49:46.64799037 +0000 UTC m=+2883.829506592" lastFinishedPulling="2026-01-21 15:49:49.713123383 +0000 UTC m=+2886.894639605" observedRunningTime="2026-01-21 15:49:49.924389875 +0000 UTC m=+2887.105906097" watchObservedRunningTime="2026-01-21 15:49:49.926928077 +0000 UTC m=+2887.108444289" Jan 21 15:49:49 crc kubenswrapper[4707]: I0121 15:49:49.949136 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.949120935 podStartE2EDuration="1.949120935s" podCreationTimestamp="2026-01-21 15:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:49.943625884 +0000 UTC m=+2887.125142105" watchObservedRunningTime="2026-01-21 15:49:49.949120935 +0000 UTC m=+2887.130637157" Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.260229 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.260279 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.290170 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.291435 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.928910 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="20366c22-d65c-416c-a985-728cc72025ca" containerID="d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f" exitCode=2 Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.929408 4707 generic.go:334] "Generic (PLEG): container finished" podID="20366c22-d65c-416c-a985-728cc72025ca" containerID="fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643" exitCode=0 Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.929025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerDied","Data":"d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f"} Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.929489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerDied","Data":"fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643"} Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.930218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:50 crc kubenswrapper[4707]: I0121 15:49:50.930348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:51 crc kubenswrapper[4707]: I0121 15:49:51.206975 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:51 crc kubenswrapper[4707]: I0121 15:49:51.207013 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:51 crc kubenswrapper[4707]: I0121 15:49:51.238191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:51 crc kubenswrapper[4707]: I0121 15:49:51.238274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:51 crc kubenswrapper[4707]: I0121 15:49:51.936939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:51 crc kubenswrapper[4707]: I0121 15:49:51.936978 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:52 crc kubenswrapper[4707]: I0121 15:49:52.593708 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:52 crc kubenswrapper[4707]: I0121 15:49:52.647092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:49:52 crc kubenswrapper[4707]: I0121 15:49:52.945942 4707 generic.go:334] "Generic (PLEG): container finished" podID="20366c22-d65c-416c-a985-728cc72025ca" containerID="76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831" exitCode=0 Jan 21 15:49:52 crc kubenswrapper[4707]: I0121 15:49:52.946009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerDied","Data":"76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831"} Jan 21 15:49:53 crc 
kubenswrapper[4707]: I0121 15:49:53.587318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:53 crc kubenswrapper[4707]: I0121 15:49:53.587883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.299333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.679489 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-tp798"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.680620 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.682639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.682700 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.697250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-tp798"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.771784 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.773183 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.776732 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.787421 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-scripts\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-logs\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhw5x\" (UniqueName: 
\"kubernetes.io/projected/564cacdc-a141-427f-ba32-593fe665a29b-kube-api-access-rhw5x\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-config-data\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjvrb\" (UniqueName: \"kubernetes.io/projected/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-kube-api-access-qjvrb\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-config-data\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.807978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.823497 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.824726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.845563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.846596 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.847560 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.849423 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.869865 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.874707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.912900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.912968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r2gg\" (UniqueName: \"kubernetes.io/projected/d24258d7-92fc-4e5d-a418-9263e8711f16-kube-api-access-4r2gg\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.912997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d24258d7-92fc-4e5d-a418-9263e8711f16-logs\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-logs\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhw5x\" (UniqueName: \"kubernetes.io/projected/564cacdc-a141-427f-ba32-593fe665a29b-kube-api-access-rhw5x\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-config-data\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjvrb\" (UniqueName: \"kubernetes.io/projected/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-kube-api-access-qjvrb\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-config-data\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-config-data\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llb7l\" (UniqueName: \"kubernetes.io/projected/17a6ace0-ecbc-47bd-acf8-9a7e76defede-kube-api-access-llb7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.913405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-scripts\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.922075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-logs\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.948567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhw5x\" (UniqueName: \"kubernetes.io/projected/564cacdc-a141-427f-ba32-593fe665a29b-kube-api-access-rhw5x\") pod \"nova-cell0-cell-mapping-tp798\" (UID: 
\"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.962664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjvrb\" (UniqueName: \"kubernetes.io/projected/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-kube-api-access-qjvrb\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.975867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-config-data\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.991785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-config-data\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.992132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-scripts\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.993417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.993871 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tp798\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.994694 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.996378 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:49:54 crc kubenswrapper[4707]: I0121 15:49:54.997473 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.001684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.003894 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r2gg\" (UniqueName: \"kubernetes.io/projected/d24258d7-92fc-4e5d-a418-9263e8711f16-kube-api-access-4r2gg\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d24258d7-92fc-4e5d-a418-9263e8711f16-logs\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-config-data\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llb7l\" (UniqueName: \"kubernetes.io/projected/17a6ace0-ecbc-47bd-acf8-9a7e76defede-kube-api-access-llb7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.015637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.016504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d24258d7-92fc-4e5d-a418-9263e8711f16-logs\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.020472 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.020927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.021437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-config-data\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.023826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.029358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r2gg\" (UniqueName: \"kubernetes.io/projected/d24258d7-92fc-4e5d-a418-9263e8711f16-kube-api-access-4r2gg\") pod \"nova-metadata-0\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.029626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llb7l\" (UniqueName: \"kubernetes.io/projected/17a6ace0-ecbc-47bd-acf8-9a7e76defede-kube-api-access-llb7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.089350 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.117771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.117906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-config-data\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.117936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkt68\" (UniqueName: \"kubernetes.io/projected/beb2956b-c7e5-4d23-8d31-6eab66daeb84-kube-api-access-fkt68\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.141023 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.167574 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.219658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-config-data\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.219705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkt68\" (UniqueName: \"kubernetes.io/projected/beb2956b-c7e5-4d23-8d31-6eab66daeb84-kube-api-access-fkt68\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.219792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.225492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-config-data\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.228014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: 
I0121 15:49:55.233888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkt68\" (UniqueName: \"kubernetes.io/projected/beb2956b-c7e5-4d23-8d31-6eab66daeb84-kube-api-access-fkt68\") pod \"nova-scheduler-0\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.373332 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.409450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-tp798"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.526064 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.703693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.713044 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.728845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.886523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.887536 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.891330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.891346 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.900526 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf"] Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.937413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnv8r\" (UniqueName: \"kubernetes.io/projected/55a11152-db37-403a-a567-131288939b37-kube-api-access-tnv8r\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.937498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.937680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-config-data\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.937848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-scripts\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.988749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"17a6ace0-ecbc-47bd-acf8-9a7e76defede","Type":"ContainerStarted","Data":"00d66efb976b7b6c6e56697dbb50806443c2d9d8a72bfe79e2b3691c666023f1"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.990002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" event={"ID":"564cacdc-a141-427f-ba32-593fe665a29b","Type":"ContainerStarted","Data":"6eb520f1f4eea9ecf2295d30d78fe4d6d732051d1f8bbded341746250154bc1d"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.990025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" event={"ID":"564cacdc-a141-427f-ba32-593fe665a29b","Type":"ContainerStarted","Data":"6f980775a7b3c59b3d963ad50aa91a2c7892aa8c99383d8e12c5e59fad635a0c"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.992714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"997323b5-c8e3-4f9b-bd80-13cf125ddb7d","Type":"ContainerStarted","Data":"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.992740 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"997323b5-c8e3-4f9b-bd80-13cf125ddb7d","Type":"ContainerStarted","Data":"5e051dd807c5f42234fa9726f2b851ca51f7d61fb7726adb852a727ed798dce8"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.993831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d24258d7-92fc-4e5d-a418-9263e8711f16","Type":"ContainerStarted","Data":"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.993854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d24258d7-92fc-4e5d-a418-9263e8711f16","Type":"ContainerStarted","Data":"e1931fd3dbc233ff110420f4c09ab637c78c1bf9450b7bb37b79c0e753212ddb"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.997999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"beb2956b-c7e5-4d23-8d31-6eab66daeb84","Type":"ContainerStarted","Data":"4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118"} Jan 21 15:49:55 crc kubenswrapper[4707]: I0121 15:49:55.998038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"beb2956b-c7e5-4d23-8d31-6eab66daeb84","Type":"ContainerStarted","Data":"24fb2a4c8e16551c2cabc2ee89c40304288dd68d9de89328ab447fe5baf9d6d8"} Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.007972 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" 
podStartSLOduration=2.00795955 podStartE2EDuration="2.00795955s" podCreationTimestamp="2026-01-21 15:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:56.001575325 +0000 UTC m=+2893.183091548" watchObservedRunningTime="2026-01-21 15:49:56.00795955 +0000 UTC m=+2893.189475772" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.013506 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.013489728 podStartE2EDuration="2.013489728s" podCreationTimestamp="2026-01-21 15:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:56.013248935 +0000 UTC m=+2893.194765157" watchObservedRunningTime="2026-01-21 15:49:56.013489728 +0000 UTC m=+2893.195005950" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.040083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-config-data\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.040204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-scripts\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.040241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnv8r\" (UniqueName: \"kubernetes.io/projected/55a11152-db37-403a-a567-131288939b37-kube-api-access-tnv8r\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.040310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.046099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-scripts\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.046284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.046298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-config-data\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.055925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnv8r\" (UniqueName: \"kubernetes.io/projected/55a11152-db37-403a-a567-131288939b37-kube-api-access-tnv8r\") pod \"nova-cell1-conductor-db-sync-vp5jf\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.215384 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:49:56 crc kubenswrapper[4707]: I0121 15:49:56.615942 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf"] Jan 21 15:49:56 crc kubenswrapper[4707]: W0121 15:49:56.620450 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55a11152_db37_403a_a567_131288939b37.slice/crio-db54f831df1a62517fe24c17a04049995a9302e1487d3db84659a49c575eec14 WatchSource:0}: Error finding container db54f831df1a62517fe24c17a04049995a9302e1487d3db84659a49c575eec14: Status 404 returned error can't find the container with id db54f831df1a62517fe24c17a04049995a9302e1487d3db84659a49c575eec14 Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.008879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"17a6ace0-ecbc-47bd-acf8-9a7e76defede","Type":"ContainerStarted","Data":"49ab48348e2c3da66f3916afef13af7e4060b40cf8411906a27a3ada9541b590"} Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.011753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"997323b5-c8e3-4f9b-bd80-13cf125ddb7d","Type":"ContainerStarted","Data":"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b"} Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.018156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d24258d7-92fc-4e5d-a418-9263e8711f16","Type":"ContainerStarted","Data":"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da"} Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.025711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" event={"ID":"55a11152-db37-403a-a567-131288939b37","Type":"ContainerStarted","Data":"324443ed55b5500f3191c63cafe8edcca7c7ff9370470a884b4139fe7922effa"} Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.025769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" event={"ID":"55a11152-db37-403a-a567-131288939b37","Type":"ContainerStarted","Data":"db54f831df1a62517fe24c17a04049995a9302e1487d3db84659a49c575eec14"} Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.028339 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.028327659 podStartE2EDuration="3.028327659s" podCreationTimestamp="2026-01-21 15:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:57.026019158 +0000 UTC m=+2894.207535381" watchObservedRunningTime="2026-01-21 15:49:57.028327659 +0000 UTC m=+2894.209843881" Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.041016 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.040996389 podStartE2EDuration="3.040996389s" podCreationTimestamp="2026-01-21 15:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:57.040150729 +0000 UTC m=+2894.221666952" watchObservedRunningTime="2026-01-21 15:49:57.040996389 +0000 UTC m=+2894.222512611" Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.057233 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" podStartSLOduration=2.05721799 podStartE2EDuration="2.05721799s" podCreationTimestamp="2026-01-21 15:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:57.050748977 +0000 UTC m=+2894.232265199" watchObservedRunningTime="2026-01-21 15:49:57.05721799 +0000 UTC m=+2894.238734212" Jan 21 15:49:57 crc kubenswrapper[4707]: I0121 15:49:57.062594 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.062587526 podStartE2EDuration="3.062587526s" podCreationTimestamp="2026-01-21 15:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:57.06198383 +0000 UTC m=+2894.243500052" watchObservedRunningTime="2026-01-21 15:49:57.062587526 +0000 UTC m=+2894.244103748" Jan 21 15:50:00 crc kubenswrapper[4707]: I0121 15:50:00.055526 4707 generic.go:334] "Generic (PLEG): container finished" podID="55a11152-db37-403a-a567-131288939b37" containerID="324443ed55b5500f3191c63cafe8edcca7c7ff9370470a884b4139fe7922effa" exitCode=0 Jan 21 15:50:00 crc kubenswrapper[4707]: I0121 15:50:00.055688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" event={"ID":"55a11152-db37-403a-a567-131288939b37","Type":"ContainerDied","Data":"324443ed55b5500f3191c63cafe8edcca7c7ff9370470a884b4139fe7922effa"} Jan 21 15:50:00 crc kubenswrapper[4707]: I0121 15:50:00.142092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4707]: I0121 15:50:00.142129 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4707]: I0121 15:50:00.168242 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:50:00 crc kubenswrapper[4707]: I0121 15:50:00.373456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.332099 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.422453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-config-data\") pod \"55a11152-db37-403a-a567-131288939b37\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.422544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-combined-ca-bundle\") pod \"55a11152-db37-403a-a567-131288939b37\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.422612 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-scripts\") pod \"55a11152-db37-403a-a567-131288939b37\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.422711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnv8r\" (UniqueName: \"kubernetes.io/projected/55a11152-db37-403a-a567-131288939b37-kube-api-access-tnv8r\") pod \"55a11152-db37-403a-a567-131288939b37\" (UID: \"55a11152-db37-403a-a567-131288939b37\") " Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.428306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-scripts" (OuterVolumeSpecName: "scripts") pod "55a11152-db37-403a-a567-131288939b37" (UID: "55a11152-db37-403a-a567-131288939b37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.428549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a11152-db37-403a-a567-131288939b37-kube-api-access-tnv8r" (OuterVolumeSpecName: "kube-api-access-tnv8r") pod "55a11152-db37-403a-a567-131288939b37" (UID: "55a11152-db37-403a-a567-131288939b37"). InnerVolumeSpecName "kube-api-access-tnv8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.445622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55a11152-db37-403a-a567-131288939b37" (UID: "55a11152-db37-403a-a567-131288939b37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.448643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-config-data" (OuterVolumeSpecName: "config-data") pod "55a11152-db37-403a-a567-131288939b37" (UID: "55a11152-db37-403a-a567-131288939b37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.523872 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.523902 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.523915 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnv8r\" (UniqueName: \"kubernetes.io/projected/55a11152-db37-403a-a567-131288939b37-kube-api-access-tnv8r\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:01 crc kubenswrapper[4707]: I0121 15:50:01.523927 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a11152-db37-403a-a567-131288939b37-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.070528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" event={"ID":"55a11152-db37-403a-a567-131288939b37","Type":"ContainerDied","Data":"db54f831df1a62517fe24c17a04049995a9302e1487d3db84659a49c575eec14"} Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.070772 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db54f831df1a62517fe24c17a04049995a9302e1487d3db84659a49c575eec14" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.070551 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.072069 4707 generic.go:334] "Generic (PLEG): container finished" podID="564cacdc-a141-427f-ba32-593fe665a29b" containerID="6eb520f1f4eea9ecf2295d30d78fe4d6d732051d1f8bbded341746250154bc1d" exitCode=0 Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.072187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" event={"ID":"564cacdc-a141-427f-ba32-593fe665a29b","Type":"ContainerDied","Data":"6eb520f1f4eea9ecf2295d30d78fe4d6d732051d1f8bbded341746250154bc1d"} Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.139638 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:50:02 crc kubenswrapper[4707]: E0121 15:50:02.140215 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a11152-db37-403a-a567-131288939b37" containerName="nova-cell1-conductor-db-sync" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.140320 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a11152-db37-403a-a567-131288939b37" containerName="nova-cell1-conductor-db-sync" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.140527 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a11152-db37-403a-a567-131288939b37" containerName="nova-cell1-conductor-db-sync" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.141192 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.142723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.148219 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.236393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.236437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lklrn\" (UniqueName: \"kubernetes.io/projected/d22ebb6c-70ce-42f9-8893-78d76edab2ce-kube-api-access-lklrn\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.236479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.337899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.337947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lklrn\" (UniqueName: \"kubernetes.io/projected/d22ebb6c-70ce-42f9-8893-78d76edab2ce-kube-api-access-lklrn\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.338001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.341447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.341499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.350734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lklrn\" (UniqueName: \"kubernetes.io/projected/d22ebb6c-70ce-42f9-8893-78d76edab2ce-kube-api-access-lklrn\") pod \"nova-cell1-conductor-0\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.454112 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.800302 4707 scope.go:117] "RemoveContainer" containerID="66324ff4f010fdef26b65bc78b114ffa94a1489c0fd9e227005cc8c80d9ace09" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.809054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:50:02 crc kubenswrapper[4707]: W0121 15:50:02.810869 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd22ebb6c_70ce_42f9_8893_78d76edab2ce.slice/crio-621752e70dcb2b44726587d923120e98c76eb889ee4247a8c99b26f517983720 WatchSource:0}: Error finding container 621752e70dcb2b44726587d923120e98c76eb889ee4247a8c99b26f517983720: Status 404 returned error can't find the container with id 621752e70dcb2b44726587d923120e98c76eb889ee4247a8c99b26f517983720 Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.835578 4707 scope.go:117] "RemoveContainer" containerID="8babc8e11e4d634f0f3b422641f30104a532577c7a3b1a584bb2b1242ecb0aa7" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.857690 4707 scope.go:117] "RemoveContainer" containerID="2c5a6ecb173c9be1760e62fc79bbb95a3cb76bfe14e017cc957f577ef64e5518" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.883277 4707 scope.go:117] "RemoveContainer" containerID="c803576a99768313bb7b6d53e8da4250a1defe738422442774fb42f73b7b74f3" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.899210 4707 scope.go:117] "RemoveContainer" containerID="c93cb1efa43ca7efe450cd63451ac3e5290a1c195b5b66d42c58c98b1bbb06e4" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.919549 4707 scope.go:117] "RemoveContainer" containerID="b7c709a6cb6bf6144998677dd56db745001a849fd302b2bacb3b10742068e3cd" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.951070 4707 scope.go:117] "RemoveContainer" containerID="11bf510afd6ac0fadc0815a9e26b92b15d26b3a24564ea8219a3f2dfb42f73fb" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.966234 4707 scope.go:117] "RemoveContainer" containerID="44aac26d47f17ab219b49fd4ef538fee4c84653e8d7ef18b9e7e52af53a0eb0d" Jan 21 15:50:02 crc kubenswrapper[4707]: I0121 15:50:02.997388 4707 scope.go:117] "RemoveContainer" containerID="3104fc64e504a3e2c5c25aace72bc0bd76abcd55d1b7d82da6696a3fb2429700" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.012560 4707 scope.go:117] "RemoveContainer" containerID="a2f607d72bb9c2553f08bd9c64844ccce10a089366ffc54a7691ffae13e19329" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.027987 4707 scope.go:117] "RemoveContainer" containerID="07b5614e155cdb2282d53bbf3fcbd961a77825fed72469a3a578360d725938b4" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.081742 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" 
event={"ID":"d22ebb6c-70ce-42f9-8893-78d76edab2ce","Type":"ContainerStarted","Data":"4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad"} Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.081778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d22ebb6c-70ce-42f9-8893-78d76edab2ce","Type":"ContainerStarted","Data":"621752e70dcb2b44726587d923120e98c76eb889ee4247a8c99b26f517983720"} Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.082608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.098690 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.098678269 podStartE2EDuration="1.098678269s" podCreationTimestamp="2026-01-21 15:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:03.093122713 +0000 UTC m=+2900.274638936" watchObservedRunningTime="2026-01-21 15:50:03.098678269 +0000 UTC m=+2900.280194491" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.313730 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.355226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-combined-ca-bundle\") pod \"564cacdc-a141-427f-ba32-593fe665a29b\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.355405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-config-data\") pod \"564cacdc-a141-427f-ba32-593fe665a29b\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.355472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhw5x\" (UniqueName: \"kubernetes.io/projected/564cacdc-a141-427f-ba32-593fe665a29b-kube-api-access-rhw5x\") pod \"564cacdc-a141-427f-ba32-593fe665a29b\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.355521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-scripts\") pod \"564cacdc-a141-427f-ba32-593fe665a29b\" (UID: \"564cacdc-a141-427f-ba32-593fe665a29b\") " Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.358753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564cacdc-a141-427f-ba32-593fe665a29b-kube-api-access-rhw5x" (OuterVolumeSpecName: "kube-api-access-rhw5x") pod "564cacdc-a141-427f-ba32-593fe665a29b" (UID: "564cacdc-a141-427f-ba32-593fe665a29b"). InnerVolumeSpecName "kube-api-access-rhw5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.359116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-scripts" (OuterVolumeSpecName: "scripts") pod "564cacdc-a141-427f-ba32-593fe665a29b" (UID: "564cacdc-a141-427f-ba32-593fe665a29b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.380833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "564cacdc-a141-427f-ba32-593fe665a29b" (UID: "564cacdc-a141-427f-ba32-593fe665a29b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.382678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-config-data" (OuterVolumeSpecName: "config-data") pod "564cacdc-a141-427f-ba32-593fe665a29b" (UID: "564cacdc-a141-427f-ba32-593fe665a29b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.457140 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.457169 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhw5x\" (UniqueName: \"kubernetes.io/projected/564cacdc-a141-427f-ba32-593fe665a29b-kube-api-access-rhw5x\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.457178 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4707]: I0121 15:50:03.457187 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564cacdc-a141-427f-ba32-593fe665a29b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.104557 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.105070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-tp798" event={"ID":"564cacdc-a141-427f-ba32-593fe665a29b","Type":"ContainerDied","Data":"6f980775a7b3c59b3d963ad50aa91a2c7892aa8c99383d8e12c5e59fad635a0c"} Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.105100 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f980775a7b3c59b3d963ad50aa91a2c7892aa8c99383d8e12c5e59fad635a0c" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.269043 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.269328 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-log" containerID="cri-o://7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee" gracePeriod=30 Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.269699 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-api" containerID="cri-o://72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b" gracePeriod=30 Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.280709 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.280949 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="beb2956b-c7e5-4d23-8d31-6eab66daeb84" containerName="nova-scheduler-scheduler" containerID="cri-o://4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118" gracePeriod=30 Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.289361 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.289604 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-log" containerID="cri-o://680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49" gracePeriod=30 Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.289746 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-metadata" containerID="cri-o://e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da" gracePeriod=30 Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.781574 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.786368 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-config-data\") pod \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-combined-ca-bundle\") pod \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d24258d7-92fc-4e5d-a418-9263e8711f16-logs\") pod \"d24258d7-92fc-4e5d-a418-9263e8711f16\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r2gg\" (UniqueName: \"kubernetes.io/projected/d24258d7-92fc-4e5d-a418-9263e8711f16-kube-api-access-4r2gg\") pod \"d24258d7-92fc-4e5d-a418-9263e8711f16\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881379 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-config-data\") pod \"d24258d7-92fc-4e5d-a418-9263e8711f16\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjvrb\" (UniqueName: \"kubernetes.io/projected/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-kube-api-access-qjvrb\") pod \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-combined-ca-bundle\") pod \"d24258d7-92fc-4e5d-a418-9263e8711f16\" (UID: \"d24258d7-92fc-4e5d-a418-9263e8711f16\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-logs\") pod \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\" (UID: \"997323b5-c8e3-4f9b-bd80-13cf125ddb7d\") " Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d24258d7-92fc-4e5d-a418-9263e8711f16-logs" (OuterVolumeSpecName: "logs") pod "d24258d7-92fc-4e5d-a418-9263e8711f16" (UID: "d24258d7-92fc-4e5d-a418-9263e8711f16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881882 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d24258d7-92fc-4e5d-a418-9263e8711f16-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.881969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-logs" (OuterVolumeSpecName: "logs") pod "997323b5-c8e3-4f9b-bd80-13cf125ddb7d" (UID: "997323b5-c8e3-4f9b-bd80-13cf125ddb7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.885309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24258d7-92fc-4e5d-a418-9263e8711f16-kube-api-access-4r2gg" (OuterVolumeSpecName: "kube-api-access-4r2gg") pod "d24258d7-92fc-4e5d-a418-9263e8711f16" (UID: "d24258d7-92fc-4e5d-a418-9263e8711f16"). InnerVolumeSpecName "kube-api-access-4r2gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.885452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-kube-api-access-qjvrb" (OuterVolumeSpecName: "kube-api-access-qjvrb") pod "997323b5-c8e3-4f9b-bd80-13cf125ddb7d" (UID: "997323b5-c8e3-4f9b-bd80-13cf125ddb7d"). InnerVolumeSpecName "kube-api-access-qjvrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.901503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "997323b5-c8e3-4f9b-bd80-13cf125ddb7d" (UID: "997323b5-c8e3-4f9b-bd80-13cf125ddb7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.901839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-config-data" (OuterVolumeSpecName: "config-data") pod "d24258d7-92fc-4e5d-a418-9263e8711f16" (UID: "d24258d7-92fc-4e5d-a418-9263e8711f16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.902167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-config-data" (OuterVolumeSpecName: "config-data") pod "997323b5-c8e3-4f9b-bd80-13cf125ddb7d" (UID: "997323b5-c8e3-4f9b-bd80-13cf125ddb7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.902570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d24258d7-92fc-4e5d-a418-9263e8711f16" (UID: "d24258d7-92fc-4e5d-a418-9263e8711f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983415 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983440 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjvrb\" (UniqueName: \"kubernetes.io/projected/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-kube-api-access-qjvrb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983451 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24258d7-92fc-4e5d-a418-9263e8711f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983462 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983469 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983477 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997323b5-c8e3-4f9b-bd80-13cf125ddb7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4707]: I0121 15:50:04.983485 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r2gg\" (UniqueName: \"kubernetes.io/projected/d24258d7-92fc-4e5d-a418-9263e8711f16-kube-api-access-4r2gg\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112613 4707 generic.go:334] "Generic (PLEG): container finished" podID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerID="e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da" exitCode=0 Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112803 4707 generic.go:334] "Generic (PLEG): container finished" podID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerID="680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49" exitCode=143 Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d24258d7-92fc-4e5d-a418-9263e8711f16","Type":"ContainerDied","Data":"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da"} Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112870 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d24258d7-92fc-4e5d-a418-9263e8711f16","Type":"ContainerDied","Data":"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49"} Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d24258d7-92fc-4e5d-a418-9263e8711f16","Type":"ContainerDied","Data":"e1931fd3dbc233ff110420f4c09ab637c78c1bf9450b7bb37b79c0e753212ddb"} Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.112922 4707 scope.go:117] "RemoveContainer" containerID="e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.115367 4707 generic.go:334] "Generic (PLEG): container finished" podID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerID="72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b" exitCode=0 Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.115387 4707 generic.go:334] "Generic (PLEG): container finished" podID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerID="7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee" exitCode=143 Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.115984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"997323b5-c8e3-4f9b-bd80-13cf125ddb7d","Type":"ContainerDied","Data":"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b"} Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.116036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"997323b5-c8e3-4f9b-bd80-13cf125ddb7d","Type":"ContainerDied","Data":"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee"} Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.116072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"997323b5-c8e3-4f9b-bd80-13cf125ddb7d","Type":"ContainerDied","Data":"5e051dd807c5f42234fa9726f2b851ca51f7d61fb7726adb852a727ed798dce8"} Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.116045 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.129355 4707 scope.go:117] "RemoveContainer" containerID="680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.137882 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.153382 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.157084 4707 scope.go:117] "RemoveContainer" containerID="e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.157483 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da\": container with ID starting with e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da not found: ID does not exist" containerID="e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.157512 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da"} err="failed to get container status \"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da\": rpc error: code = NotFound desc = could not find container \"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da\": container with ID starting with e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.157536 4707 scope.go:117] "RemoveContainer" containerID="680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.157912 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49\": container with ID starting with 680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49 not found: ID does not exist" containerID="680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.157934 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49"} err="failed to get container status \"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49\": rpc error: code = NotFound desc = could not find container \"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49\": container with ID starting with 680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49 not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.157956 4707 scope.go:117] "RemoveContainer" containerID="e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.158162 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da"} err="failed to get container status 
\"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da\": rpc error: code = NotFound desc = could not find container \"e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da\": container with ID starting with e9b9ed6bb98c255f42a20a1366ede45fc0a7c34ca239048c29476c47a5c766da not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.158178 4707 scope.go:117] "RemoveContainer" containerID="680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.158360 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49"} err="failed to get container status \"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49\": rpc error: code = NotFound desc = could not find container \"680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49\": container with ID starting with 680835e755e3c939a4d01e5420b10cff78de75ebcdc2a0e0e6fb386d6b3bcd49 not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.158373 4707 scope.go:117] "RemoveContainer" containerID="72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.162120 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.168398 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.172198 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.176339 4707 scope.go:117] "RemoveContainer" containerID="7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180088 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.180517 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-metadata" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180536 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-metadata" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.180553 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-api" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180559 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-api" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.180577 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564cacdc-a141-427f-ba32-593fe665a29b" containerName="nova-manage" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180584 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="564cacdc-a141-427f-ba32-593fe665a29b" containerName="nova-manage" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.180599 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-log" Jan 21 15:50:05 crc 
kubenswrapper[4707]: I0121 15:50:05.180606 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-log" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.180617 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-log" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180623 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-log" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-metadata" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180773 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-api" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180789 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="564cacdc-a141-427f-ba32-593fe665a29b" containerName="nova-manage" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" containerName="nova-metadata-log" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.180835 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" containerName="nova-api-log" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.181710 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.188120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.195064 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="997323b5-c8e3-4f9b-bd80-13cf125ddb7d" path="/var/lib/kubelet/pods/997323b5-c8e3-4f9b-bd80-13cf125ddb7d/volumes" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.203856 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24258d7-92fc-4e5d-a418-9263e8711f16" path="/var/lib/kubelet/pods/d24258d7-92fc-4e5d-a418-9263e8711f16/volumes" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.204438 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.206768 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.207046 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.207067 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.207322 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.207514 4707 scope.go:117] "RemoveContainer" containerID="72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.207912 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b\": container with ID starting with 72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b not found: ID does not exist" containerID="72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.207957 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b"} err="failed to get container status \"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b\": rpc error: code = NotFound desc = could not find container \"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b\": container with ID starting with 72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.207985 4707 scope.go:117] "RemoveContainer" containerID="7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee" Jan 21 15:50:05 crc kubenswrapper[4707]: E0121 15:50:05.208500 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee\": container with ID starting with 7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee not found: ID does not exist" containerID="7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.208520 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee"} err="failed to get container status \"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee\": rpc error: code = NotFound desc = could not find container \"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee\": container with ID starting with 7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.208532 4707 scope.go:117] "RemoveContainer" containerID="72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.208770 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b"} err="failed to get container status \"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b\": rpc error: code = NotFound desc = could not find container \"72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b\": container with ID starting with 72370f20930631c5e7c2378071729416692289cc59bb0b959c9674989aa1138b not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.208893 4707 scope.go:117] "RemoveContainer" containerID="7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.209196 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee"} err="failed to get container status \"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee\": rpc error: code = NotFound desc = could not find container \"7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee\": container with ID starting with 7f9cdf2e49126d8bbf36e07e4562e921aaf4b31ea2639d8e863ce0e568ca29ee not found: ID does not exist" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.209994 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934ecfe-574b-494b-9d6e-d99d5060d368-logs\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-config-data\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-config-data\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-logs\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxctb\" (UniqueName: \"kubernetes.io/projected/f934ecfe-574b-494b-9d6e-d99d5060d368-kube-api-access-jxctb\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.288902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-bxzdm\" (UniqueName: \"kubernetes.io/projected/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-kube-api-access-bxzdm\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-logs\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxctb\" (UniqueName: \"kubernetes.io/projected/f934ecfe-574b-494b-9d6e-d99d5060d368-kube-api-access-jxctb\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxzdm\" (UniqueName: \"kubernetes.io/projected/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-kube-api-access-bxzdm\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934ecfe-574b-494b-9d6e-d99d5060d368-logs\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-config-data\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-config-data\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.390870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-logs\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc 
kubenswrapper[4707]: I0121 15:50:05.391448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934ecfe-574b-494b-9d6e-d99d5060d368-logs\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.394249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.394298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-config-data\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.394385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.394545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-config-data\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.402725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxctb\" (UniqueName: \"kubernetes.io/projected/f934ecfe-574b-494b-9d6e-d99d5060d368-kube-api-access-jxctb\") pod \"nova-metadata-0\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.403018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxzdm\" (UniqueName: \"kubernetes.io/projected/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-kube-api-access-bxzdm\") pod \"nova-api-0\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.496799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.522159 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.873867 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: I0121 15:50:05.937458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:05 crc kubenswrapper[4707]: W0121 15:50:05.937871 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5c06214_785b_4d8f_b59d_9da1aee2b2b6.slice/crio-57bd6a35ecdad470e8f03a6a015e546ad3641e4f857988418401f73c79210bf5 WatchSource:0}: Error finding container 57bd6a35ecdad470e8f03a6a015e546ad3641e4f857988418401f73c79210bf5: Status 404 returned error can't find the container with id 57bd6a35ecdad470e8f03a6a015e546ad3641e4f857988418401f73c79210bf5 Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.127025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5c06214-785b-4d8f-b59d-9da1aee2b2b6","Type":"ContainerStarted","Data":"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341"} Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.127062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5c06214-785b-4d8f-b59d-9da1aee2b2b6","Type":"ContainerStarted","Data":"57bd6a35ecdad470e8f03a6a015e546ad3641e4f857988418401f73c79210bf5"} Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.130367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f934ecfe-574b-494b-9d6e-d99d5060d368","Type":"ContainerStarted","Data":"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa"} Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.130393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f934ecfe-574b-494b-9d6e-d99d5060d368","Type":"ContainerStarted","Data":"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db"} Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.130404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f934ecfe-574b-494b-9d6e-d99d5060d368","Type":"ContainerStarted","Data":"ebbbb015d567ad48d27c841ccdc7bbe04017a8d4394680ed57c8643408e1709d"} Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.140493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.149158 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.149140447 podStartE2EDuration="1.149140447s" podCreationTimestamp="2026-01-21 15:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:06.143449447 +0000 UTC m=+2903.324965669" watchObservedRunningTime="2026-01-21 15:50:06.149140447 +0000 UTC m=+2903.330656670" Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.855470 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.914225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-combined-ca-bundle\") pod \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.914293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-config-data\") pod \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.914366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkt68\" (UniqueName: \"kubernetes.io/projected/beb2956b-c7e5-4d23-8d31-6eab66daeb84-kube-api-access-fkt68\") pod \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\" (UID: \"beb2956b-c7e5-4d23-8d31-6eab66daeb84\") " Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.918676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb2956b-c7e5-4d23-8d31-6eab66daeb84-kube-api-access-fkt68" (OuterVolumeSpecName: "kube-api-access-fkt68") pod "beb2956b-c7e5-4d23-8d31-6eab66daeb84" (UID: "beb2956b-c7e5-4d23-8d31-6eab66daeb84"). InnerVolumeSpecName "kube-api-access-fkt68". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.933950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "beb2956b-c7e5-4d23-8d31-6eab66daeb84" (UID: "beb2956b-c7e5-4d23-8d31-6eab66daeb84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:06 crc kubenswrapper[4707]: I0121 15:50:06.934571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-config-data" (OuterVolumeSpecName: "config-data") pod "beb2956b-c7e5-4d23-8d31-6eab66daeb84" (UID: "beb2956b-c7e5-4d23-8d31-6eab66daeb84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.016000 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkt68\" (UniqueName: \"kubernetes.io/projected/beb2956b-c7e5-4d23-8d31-6eab66daeb84-kube-api-access-fkt68\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.016029 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.016038 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beb2956b-c7e5-4d23-8d31-6eab66daeb84-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.140429 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5c06214-785b-4d8f-b59d-9da1aee2b2b6","Type":"ContainerStarted","Data":"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef"} Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.141534 4707 generic.go:334] "Generic (PLEG): container finished" podID="beb2956b-c7e5-4d23-8d31-6eab66daeb84" containerID="4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118" exitCode=0 Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.141601 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.141629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"beb2956b-c7e5-4d23-8d31-6eab66daeb84","Type":"ContainerDied","Data":"4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118"} Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.141675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"beb2956b-c7e5-4d23-8d31-6eab66daeb84","Type":"ContainerDied","Data":"24fb2a4c8e16551c2cabc2ee89c40304288dd68d9de89328ab447fe5baf9d6d8"} Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.141694 4707 scope.go:117] "RemoveContainer" containerID="4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.155336 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.155321362 podStartE2EDuration="2.155321362s" podCreationTimestamp="2026-01-21 15:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:07.153988656 +0000 UTC m=+2904.335504877" watchObservedRunningTime="2026-01-21 15:50:07.155321362 +0000 UTC m=+2904.336837583" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.157218 4707 scope.go:117] "RemoveContainer" containerID="4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118" Jan 21 15:50:07 crc kubenswrapper[4707]: E0121 15:50:07.157551 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118\": container with ID starting with 4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118 not found: ID does not exist" 
containerID="4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.157584 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118"} err="failed to get container status \"4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118\": rpc error: code = NotFound desc = could not find container \"4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118\": container with ID starting with 4552e753aa09e07d0628fb8b42721d751d6135bac3d5ae235ed7dec69a8cb118 not found: ID does not exist" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.169340 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.192608 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.195036 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:07 crc kubenswrapper[4707]: E0121 15:50:07.195356 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb2956b-c7e5-4d23-8d31-6eab66daeb84" containerName="nova-scheduler-scheduler" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.195372 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb2956b-c7e5-4d23-8d31-6eab66daeb84" containerName="nova-scheduler-scheduler" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.195547 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb2956b-c7e5-4d23-8d31-6eab66daeb84" containerName="nova-scheduler-scheduler" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.196085 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.201303 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.201529 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.318896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.318977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-config-data\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.319033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx9fd\" (UniqueName: \"kubernetes.io/projected/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-kube-api-access-lx9fd\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.420413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.420871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-config-data\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.420993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx9fd\" (UniqueName: \"kubernetes.io/projected/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-kube-api-access-lx9fd\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.424598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.425155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-config-data\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.433840 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx9fd\" (UniqueName: \"kubernetes.io/projected/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-kube-api-access-lx9fd\") pod \"nova-scheduler-0\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.474952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.515923 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.839931 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6"] Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.841192 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.842982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.843599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6"] Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.844176 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.879199 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:07 crc kubenswrapper[4707]: W0121 15:50:07.881163 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e1f6d2_5135_46b9_aedd_4f74080c9f24.slice/crio-86f20f72e0164b2dadb7fd2cedef9fd8624d66ec46c1d264f90c7a3b0a2dc16b WatchSource:0}: Error finding container 86f20f72e0164b2dadb7fd2cedef9fd8624d66ec46c1d264f90c7a3b0a2dc16b: Status 404 returned error can't find the container with id 86f20f72e0164b2dadb7fd2cedef9fd8624d66ec46c1d264f90c7a3b0a2dc16b Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.928112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpsbb\" (UniqueName: \"kubernetes.io/projected/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-kube-api-access-xpsbb\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.928161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.928420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-config-data\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " 
pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:07 crc kubenswrapper[4707]: I0121 15:50:07.928538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-scripts\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.031121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-scripts\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.031171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsbb\" (UniqueName: \"kubernetes.io/projected/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-kube-api-access-xpsbb\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.031197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.031795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-config-data\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.034440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-config-data\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.036642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-scripts\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.038468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.044114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpsbb\" (UniqueName: \"kubernetes.io/projected/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-kube-api-access-xpsbb\") pod \"nova-cell1-cell-mapping-zqpd6\" (UID: 
\"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.149662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d5e1f6d2-5135-46b9-aedd-4f74080c9f24","Type":"ContainerStarted","Data":"8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef"} Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.149915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d5e1f6d2-5135-46b9-aedd-4f74080c9f24","Type":"ContainerStarted","Data":"86f20f72e0164b2dadb7fd2cedef9fd8624d66ec46c1d264f90c7a3b0a2dc16b"} Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.156975 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.162645 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.162629995 podStartE2EDuration="1.162629995s" podCreationTimestamp="2026-01-21 15:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:08.160476757 +0000 UTC m=+2905.341992979" watchObservedRunningTime="2026-01-21 15:50:08.162629995 +0000 UTC m=+2905.344146207" Jan 21 15:50:08 crc kubenswrapper[4707]: I0121 15:50:08.513712 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6"] Jan 21 15:50:08 crc kubenswrapper[4707]: W0121 15:50:08.514348 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e076ba2_f848_4fa2_b6f7_ee0e7f5fd8d5.slice/crio-22b72aa8bcfb77609d6252e4500a3578a1fc0a7c246d103d8d3a2981e58122b9 WatchSource:0}: Error finding container 22b72aa8bcfb77609d6252e4500a3578a1fc0a7c246d103d8d3a2981e58122b9: Status 404 returned error can't find the container with id 22b72aa8bcfb77609d6252e4500a3578a1fc0a7c246d103d8d3a2981e58122b9 Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.160192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" event={"ID":"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5","Type":"ContainerStarted","Data":"5cffaed2737dd9a63cbf504e7def30c36c381a1b91168e8042ac65b329773df6"} Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.160413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" event={"ID":"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5","Type":"ContainerStarted","Data":"22b72aa8bcfb77609d6252e4500a3578a1fc0a7c246d103d8d3a2981e58122b9"} Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.172939 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" podStartSLOduration=2.172922261 podStartE2EDuration="2.172922261s" podCreationTimestamp="2026-01-21 15:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:09.169917731 +0000 UTC m=+2906.351433952" watchObservedRunningTime="2026-01-21 15:50:09.172922261 +0000 UTC m=+2906.354438482" Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.193344 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="beb2956b-c7e5-4d23-8d31-6eab66daeb84" path="/var/lib/kubelet/pods/beb2956b-c7e5-4d23-8d31-6eab66daeb84/volumes" Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.945363 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.945601 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.945643 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.946362 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34a707de92d001c9de59dbef499d523bf460e9895d2f693f866b85aad1e846c3"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:50:09 crc kubenswrapper[4707]: I0121 15:50:09.946410 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://34a707de92d001c9de59dbef499d523bf460e9895d2f693f866b85aad1e846c3" gracePeriod=600 Jan 21 15:50:10 crc kubenswrapper[4707]: I0121 15:50:10.167577 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="34a707de92d001c9de59dbef499d523bf460e9895d2f693f866b85aad1e846c3" exitCode=0 Jan 21 15:50:10 crc kubenswrapper[4707]: I0121 15:50:10.168631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"34a707de92d001c9de59dbef499d523bf460e9895d2f693f866b85aad1e846c3"} Jan 21 15:50:10 crc kubenswrapper[4707]: I0121 15:50:10.168668 4707 scope.go:117] "RemoveContainer" containerID="dfd4a90b7d9c7591cc24ac396890622bfa442e623f5a555ef38b66746dbf660e" Jan 21 15:50:10 crc kubenswrapper[4707]: I0121 15:50:10.497955 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:10 crc kubenswrapper[4707]: I0121 15:50:10.498281 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:11 crc kubenswrapper[4707]: I0121 15:50:11.176857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d"} Jan 21 15:50:12 crc kubenswrapper[4707]: I0121 15:50:12.517029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4707]: I0121 15:50:13.197094 4707 generic.go:334] "Generic (PLEG): container finished" podID="1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" containerID="5cffaed2737dd9a63cbf504e7def30c36c381a1b91168e8042ac65b329773df6" exitCode=0 Jan 21 15:50:13 crc kubenswrapper[4707]: I0121 15:50:13.198075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" event={"ID":"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5","Type":"ContainerDied","Data":"5cffaed2737dd9a63cbf504e7def30c36c381a1b91168e8042ac65b329773df6"} Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.486995 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.527709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpsbb\" (UniqueName: \"kubernetes.io/projected/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-kube-api-access-xpsbb\") pod \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.527828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-combined-ca-bundle\") pod \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.527926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-scripts\") pod \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.527950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-config-data\") pod \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\" (UID: \"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5\") " Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.532215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-scripts" (OuterVolumeSpecName: "scripts") pod "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" (UID: "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.532485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-kube-api-access-xpsbb" (OuterVolumeSpecName: "kube-api-access-xpsbb") pod "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" (UID: "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5"). InnerVolumeSpecName "kube-api-access-xpsbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.547443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" (UID: "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.547533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-config-data" (OuterVolumeSpecName: "config-data") pod "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" (UID: "1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.615037 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzdlz"] Jan 21 15:50:14 crc kubenswrapper[4707]: E0121 15:50:14.615567 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" containerName="nova-manage" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.615581 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" containerName="nova-manage" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.615778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" containerName="nova-manage" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.622460 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzdlz"] Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.622644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.629887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htz6g\" (UniqueName: \"kubernetes.io/projected/7ac885cd-7813-4c48-8039-0eb22685ce23-kube-api-access-htz6g\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.629971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-catalog-content\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.630039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-utilities\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.630098 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpsbb\" (UniqueName: \"kubernetes.io/projected/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-kube-api-access-xpsbb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.630108 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.630117 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.630126 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.731446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htz6g\" (UniqueName: \"kubernetes.io/projected/7ac885cd-7813-4c48-8039-0eb22685ce23-kube-api-access-htz6g\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.731501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-catalog-content\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.731581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-utilities\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.731982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-utilities\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.732207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-catalog-content\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.746961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htz6g\" (UniqueName: \"kubernetes.io/projected/7ac885cd-7813-4c48-8039-0eb22685ce23-kube-api-access-htz6g\") pod \"redhat-operators-bzdlz\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:14 crc kubenswrapper[4707]: I0121 15:50:14.936432 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.210081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" event={"ID":"1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5","Type":"ContainerDied","Data":"22b72aa8bcfb77609d6252e4500a3578a1fc0a7c246d103d8d3a2981e58122b9"} Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.210319 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22b72aa8bcfb77609d6252e4500a3578a1fc0a7c246d103d8d3a2981e58122b9" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.210139 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.319514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzdlz"] Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.383723 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.384118 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d5e1f6d2-5135-46b9-aedd-4f74080c9f24" containerName="nova-scheduler-scheduler" containerID="cri-o://8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef" gracePeriod=30 Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.390142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.390320 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-log" containerID="cri-o://93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341" gracePeriod=30 Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.390380 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-api" containerID="cri-o://82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef" gracePeriod=30 Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.403854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.404071 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-log" containerID="cri-o://142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db" gracePeriod=30 Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.404133 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-metadata" containerID="cri-o://b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa" gracePeriod=30 Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.913574 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.949229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-combined-ca-bundle\") pod \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.949304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxzdm\" (UniqueName: \"kubernetes.io/projected/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-kube-api-access-bxzdm\") pod \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.949365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-config-data\") pod \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.949394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-logs\") pod \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\" (UID: \"d5c06214-785b-4d8f-b59d-9da1aee2b2b6\") " Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.949753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-logs" (OuterVolumeSpecName: "logs") pod "d5c06214-785b-4d8f-b59d-9da1aee2b2b6" (UID: "d5c06214-785b-4d8f-b59d-9da1aee2b2b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.949856 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.953781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-kube-api-access-bxzdm" (OuterVolumeSpecName: "kube-api-access-bxzdm") pod "d5c06214-785b-4d8f-b59d-9da1aee2b2b6" (UID: "d5c06214-785b-4d8f-b59d-9da1aee2b2b6"). InnerVolumeSpecName "kube-api-access-bxzdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.972181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5c06214-785b-4d8f-b59d-9da1aee2b2b6" (UID: "d5c06214-785b-4d8f-b59d-9da1aee2b2b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:15 crc kubenswrapper[4707]: I0121 15:50:15.973103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-config-data" (OuterVolumeSpecName: "config-data") pod "d5c06214-785b-4d8f-b59d-9da1aee2b2b6" (UID: "d5c06214-785b-4d8f-b59d-9da1aee2b2b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.051231 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.051270 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.051283 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxzdm\" (UniqueName: \"kubernetes.io/projected/d5c06214-785b-4d8f-b59d-9da1aee2b2b6-kube-api-access-bxzdm\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.154365 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218426 4707 generic.go:334] "Generic (PLEG): container finished" podID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerID="b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa" exitCode=0 Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218453 4707 generic.go:334] "Generic (PLEG): container finished" podID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerID="142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db" exitCode=143 Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218467 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f934ecfe-574b-494b-9d6e-d99d5060d368","Type":"ContainerDied","Data":"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f934ecfe-574b-494b-9d6e-d99d5060d368","Type":"ContainerDied","Data":"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f934ecfe-574b-494b-9d6e-d99d5060d368","Type":"ContainerDied","Data":"ebbbb015d567ad48d27c841ccdc7bbe04017a8d4394680ed57c8643408e1709d"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.218581 4707 scope.go:117] "RemoveContainer" containerID="b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.220268 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerID="2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40" exitCode=0 Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.220330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerDied","Data":"2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.220358 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerStarted","Data":"f6ab57d33b942fed170c1f59067023344c553d096dfba84bb973c6524123afbf"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.222317 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerID="82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef" exitCode=0 Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.222356 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerID="93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341" exitCode=143 Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.222361 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.222376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5c06214-785b-4d8f-b59d-9da1aee2b2b6","Type":"ContainerDied","Data":"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.222394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5c06214-785b-4d8f-b59d-9da1aee2b2b6","Type":"ContainerDied","Data":"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.222404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d5c06214-785b-4d8f-b59d-9da1aee2b2b6","Type":"ContainerDied","Data":"57bd6a35ecdad470e8f03a6a015e546ad3641e4f857988418401f73c79210bf5"} Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.237971 4707 scope.go:117] "RemoveContainer" containerID="142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.251642 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.253276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-combined-ca-bundle\") pod \"f934ecfe-574b-494b-9d6e-d99d5060d368\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.253364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-config-data\") pod \"f934ecfe-574b-494b-9d6e-d99d5060d368\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.253434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934ecfe-574b-494b-9d6e-d99d5060d368-logs\") pod \"f934ecfe-574b-494b-9d6e-d99d5060d368\" (UID: \"f934ecfe-574b-494b-9d6e-d99d5060d368\") " Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.253496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxctb\" (UniqueName: \"kubernetes.io/projected/f934ecfe-574b-494b-9d6e-d99d5060d368-kube-api-access-jxctb\") pod \"f934ecfe-574b-494b-9d6e-d99d5060d368\" (UID: 
\"f934ecfe-574b-494b-9d6e-d99d5060d368\") " Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.254581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f934ecfe-574b-494b-9d6e-d99d5060d368-logs" (OuterVolumeSpecName: "logs") pod "f934ecfe-574b-494b-9d6e-d99d5060d368" (UID: "f934ecfe-574b-494b-9d6e-d99d5060d368"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.257144 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.257297 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934ecfe-574b-494b-9d6e-d99d5060d368-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.260370 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.261988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f934ecfe-574b-494b-9d6e-d99d5060d368-kube-api-access-jxctb" (OuterVolumeSpecName: "kube-api-access-jxctb") pod "f934ecfe-574b-494b-9d6e-d99d5060d368" (UID: "f934ecfe-574b-494b-9d6e-d99d5060d368"). InnerVolumeSpecName "kube-api-access-jxctb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.267031 4707 scope.go:117] "RemoveContainer" containerID="b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.267414 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa\": container with ID starting with b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa not found: ID does not exist" containerID="b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.267443 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa"} err="failed to get container status \"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa\": rpc error: code = NotFound desc = could not find container \"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa\": container with ID starting with b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.267462 4707 scope.go:117] "RemoveContainer" containerID="142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.267763 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db\": container with ID starting with 142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db not found: ID does not exist" containerID="142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db" Jan 21 15:50:16 crc kubenswrapper[4707]: 
I0121 15:50:16.267783 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db"} err="failed to get container status \"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db\": rpc error: code = NotFound desc = could not find container \"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db\": container with ID starting with 142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.267793 4707 scope.go:117] "RemoveContainer" containerID="b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.268074 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa"} err="failed to get container status \"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa\": rpc error: code = NotFound desc = could not find container \"b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa\": container with ID starting with b8058f7e94642b460989d8cf95b906ba48bd935151a5e6976c2fa6700a7fe0aa not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.268092 4707 scope.go:117] "RemoveContainer" containerID="142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.268278 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db"} err="failed to get container status \"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db\": rpc error: code = NotFound desc = could not find container \"142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db\": container with ID starting with 142abcc5db744a8181b124e34bb7894c3ecb542cf12c849a3355fa6611a2e4db not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.268294 4707 scope.go:117] "RemoveContainer" containerID="82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.274068 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.274479 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-metadata" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.274496 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-metadata" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.274519 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-api" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.274524 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-api" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.274534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-log" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.274539 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-log" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.274549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-log" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.274554 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-log" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.275640 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-log" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.275669 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-api" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.275677 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" containerName="nova-metadata-metadata" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.275690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" containerName="nova-api-log" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.276599 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.279122 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.282607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f934ecfe-574b-494b-9d6e-d99d5060d368" (UID: "f934ecfe-574b-494b-9d6e-d99d5060d368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.284804 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.285509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-config-data" (OuterVolumeSpecName: "config-data") pod "f934ecfe-574b-494b-9d6e-d99d5060d368" (UID: "f934ecfe-574b-494b-9d6e-d99d5060d368"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.292502 4707 scope.go:117] "RemoveContainer" containerID="93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.308008 4707 scope.go:117] "RemoveContainer" containerID="82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.308485 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef\": container with ID starting with 82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef not found: ID does not exist" containerID="82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.308552 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef"} err="failed to get container status \"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef\": rpc error: code = NotFound desc = could not find container \"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef\": container with ID starting with 82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.308580 4707 scope.go:117] "RemoveContainer" containerID="93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341" Jan 21 15:50:16 crc kubenswrapper[4707]: E0121 15:50:16.308920 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341\": container with ID starting with 93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341 not found: ID does not exist" containerID="93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.308946 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341"} err="failed to get container status \"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341\": rpc error: code = NotFound desc = could not find container \"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341\": container with ID starting with 93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341 not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.308961 4707 scope.go:117] "RemoveContainer" containerID="82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.309223 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef"} err="failed to get container status \"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef\": rpc error: code = NotFound desc = could not find container \"82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef\": container with ID starting with 82e26a51a959e85f49d75cb9c88d384708527771574a5ed564fe80837de0acef not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.309308 4707 
scope.go:117] "RemoveContainer" containerID="93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.309545 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341"} err="failed to get container status \"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341\": rpc error: code = NotFound desc = could not find container \"93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341\": container with ID starting with 93f831a5c56c6bae450f1fe600ddf9889ea225bd8b9378e7ba5ce32f12f63341 not found: ID does not exist" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.358879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-config-data\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.359014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1376014d-80c7-43b5-b295-72d7b639df19-logs\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.359058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.359118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnxzg\" (UniqueName: \"kubernetes.io/projected/1376014d-80c7-43b5-b295-72d7b639df19-kube-api-access-jnxzg\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.359246 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.359278 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxctb\" (UniqueName: \"kubernetes.io/projected/f934ecfe-574b-494b-9d6e-d99d5060d368-kube-api-access-jxctb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.359311 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934ecfe-574b-494b-9d6e-d99d5060d368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.460153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-config-data\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.460203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/1376014d-80c7-43b5-b295-72d7b639df19-logs\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.460228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.460266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnxzg\" (UniqueName: \"kubernetes.io/projected/1376014d-80c7-43b5-b295-72d7b639df19-kube-api-access-jnxzg\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.460957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1376014d-80c7-43b5-b295-72d7b639df19-logs\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.464102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-config-data\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.464437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.473291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnxzg\" (UniqueName: \"kubernetes.io/projected/1376014d-80c7-43b5-b295-72d7b639df19-kube-api-access-jnxzg\") pod \"nova-api-0\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.542459 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.548548 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.555523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.556846 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.559158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.575074 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.597371 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.662110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.662185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2391af6c-7b43-479d-9401-3212aa083d20-logs\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.662382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkgv\" (UniqueName: \"kubernetes.io/projected/2391af6c-7b43-479d-9401-3212aa083d20-kube-api-access-5pkgv\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.662416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-config-data\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.763791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2391af6c-7b43-479d-9401-3212aa083d20-logs\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.763929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkgv\" (UniqueName: \"kubernetes.io/projected/2391af6c-7b43-479d-9401-3212aa083d20-kube-api-access-5pkgv\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.763980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-config-data\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.764041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.764639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2391af6c-7b43-479d-9401-3212aa083d20-logs\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.767312 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.767545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-config-data\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.779402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkgv\" (UniqueName: \"kubernetes.io/projected/2391af6c-7b43-479d-9401-3212aa083d20-kube-api-access-5pkgv\") pod \"nova-metadata-0\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.870361 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:16 crc kubenswrapper[4707]: I0121 15:50:16.974606 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:50:16 crc kubenswrapper[4707]: W0121 15:50:16.988915 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1376014d_80c7_43b5_b295_72d7b639df19.slice/crio-927b683ff7020116a9e640c74651853bbdebbc9d515b90ad211754ca81d66a24 WatchSource:0}: Error finding container 927b683ff7020116a9e640c74651853bbdebbc9d515b90ad211754ca81d66a24: Status 404 returned error can't find the container with id 927b683ff7020116a9e640c74651853bbdebbc9d515b90ad211754ca81d66a24 Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.190105 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c06214-785b-4d8f-b59d-9da1aee2b2b6" path="/var/lib/kubelet/pods/d5c06214-785b-4d8f-b59d-9da1aee2b2b6/volumes" Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.190691 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f934ecfe-574b-494b-9d6e-d99d5060d368" path="/var/lib/kubelet/pods/f934ecfe-574b-494b-9d6e-d99d5060d368/volumes" Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.231895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerStarted","Data":"f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873"} Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.233382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1376014d-80c7-43b5-b295-72d7b639df19","Type":"ContainerStarted","Data":"b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e"} Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.233406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1376014d-80c7-43b5-b295-72d7b639df19","Type":"ContainerStarted","Data":"b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93"} Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.233417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"1376014d-80c7-43b5-b295-72d7b639df19","Type":"ContainerStarted","Data":"927b683ff7020116a9e640c74651853bbdebbc9d515b90ad211754ca81d66a24"} Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.243344 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:50:17 crc kubenswrapper[4707]: W0121 15:50:17.245470 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2391af6c_7b43_479d_9401_3212aa083d20.slice/crio-752588f815a0684e7838e017eb74d9739ae359f0c9b874103a4ae3d2bf07f06d WatchSource:0}: Error finding container 752588f815a0684e7838e017eb74d9739ae359f0c9b874103a4ae3d2bf07f06d: Status 404 returned error can't find the container with id 752588f815a0684e7838e017eb74d9739ae359f0c9b874103a4ae3d2bf07f06d Jan 21 15:50:17 crc kubenswrapper[4707]: I0121 15:50:17.271484 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.271461015 podStartE2EDuration="1.271461015s" podCreationTimestamp="2026-01-21 15:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:17.266701296 +0000 UTC m=+2914.448217518" watchObservedRunningTime="2026-01-21 15:50:17.271461015 +0000 UTC m=+2914.452977237" Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.243187 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerID="f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873" exitCode=0 Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.243294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerDied","Data":"f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873"} Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.245073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2391af6c-7b43-479d-9401-3212aa083d20","Type":"ContainerStarted","Data":"15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05"} Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.245105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2391af6c-7b43-479d-9401-3212aa083d20","Type":"ContainerStarted","Data":"adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1"} Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.245117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2391af6c-7b43-479d-9401-3212aa083d20","Type":"ContainerStarted","Data":"752588f815a0684e7838e017eb74d9739ae359f0c9b874103a4ae3d2bf07f06d"} Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.275839 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.275821697 podStartE2EDuration="2.275821697s" podCreationTimestamp="2026-01-21 15:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:18.267323678 +0000 UTC m=+2915.448839900" watchObservedRunningTime="2026-01-21 15:50:18.275821697 +0000 UTC m=+2915.457337919" Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 
15:50:18.879484 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.900550 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx9fd\" (UniqueName: \"kubernetes.io/projected/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-kube-api-access-lx9fd\") pod \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.900615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-config-data\") pod \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.900716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-combined-ca-bundle\") pod \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\" (UID: \"d5e1f6d2-5135-46b9-aedd-4f74080c9f24\") " Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.905686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-kube-api-access-lx9fd" (OuterVolumeSpecName: "kube-api-access-lx9fd") pod "d5e1f6d2-5135-46b9-aedd-4f74080c9f24" (UID: "d5e1f6d2-5135-46b9-aedd-4f74080c9f24"). InnerVolumeSpecName "kube-api-access-lx9fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.922593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e1f6d2-5135-46b9-aedd-4f74080c9f24" (UID: "d5e1f6d2-5135-46b9-aedd-4f74080c9f24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:18 crc kubenswrapper[4707]: I0121 15:50:18.923074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-config-data" (OuterVolumeSpecName: "config-data") pod "d5e1f6d2-5135-46b9-aedd-4f74080c9f24" (UID: "d5e1f6d2-5135-46b9-aedd-4f74080c9f24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.002418 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.002451 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx9fd\" (UniqueName: \"kubernetes.io/projected/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-kube-api-access-lx9fd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.002462 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e1f6d2-5135-46b9-aedd-4f74080c9f24-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.253436 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5e1f6d2-5135-46b9-aedd-4f74080c9f24" containerID="8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef" exitCode=0 Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.253487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.253505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d5e1f6d2-5135-46b9-aedd-4f74080c9f24","Type":"ContainerDied","Data":"8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef"} Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.253821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d5e1f6d2-5135-46b9-aedd-4f74080c9f24","Type":"ContainerDied","Data":"86f20f72e0164b2dadb7fd2cedef9fd8624d66ec46c1d264f90c7a3b0a2dc16b"} Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.253845 4707 scope.go:117] "RemoveContainer" containerID="8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.256245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerStarted","Data":"665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c"} Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.273902 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzdlz" podStartSLOduration=2.751761703 podStartE2EDuration="5.273884216s" podCreationTimestamp="2026-01-21 15:50:14 +0000 UTC" firstStartedPulling="2026-01-21 15:50:16.221608458 +0000 UTC m=+2913.403124679" lastFinishedPulling="2026-01-21 15:50:18.743730969 +0000 UTC m=+2915.925247192" observedRunningTime="2026-01-21 15:50:19.268662267 +0000 UTC m=+2916.450178490" watchObservedRunningTime="2026-01-21 15:50:19.273884216 +0000 UTC m=+2916.455400438" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.278388 4707 scope.go:117] "RemoveContainer" containerID="8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef" Jan 21 15:50:19 crc kubenswrapper[4707]: E0121 15:50:19.278682 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef\": container with ID starting with 
8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef not found: ID does not exist" containerID="8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.278713 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef"} err="failed to get container status \"8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef\": rpc error: code = NotFound desc = could not find container \"8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef\": container with ID starting with 8fd38a3ed12d3ae0f4b927cfa787228d1cbb72e2931aaf8364c2d011ea95a6ef not found: ID does not exist" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.290277 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.296783 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.304482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:19 crc kubenswrapper[4707]: E0121 15:50:19.304803 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e1f6d2-5135-46b9-aedd-4f74080c9f24" containerName="nova-scheduler-scheduler" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.304836 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e1f6d2-5135-46b9-aedd-4f74080c9f24" containerName="nova-scheduler-scheduler" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.304973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e1f6d2-5135-46b9-aedd-4f74080c9f24" containerName="nova-scheduler-scheduler" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.305510 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.307120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.314648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.408044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-config-data\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.408191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm58j\" (UniqueName: \"kubernetes.io/projected/aab48af8-8468-44fb-a7c2-1db176749c05-kube-api-access-mm58j\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.408322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.510185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm58j\" (UniqueName: \"kubernetes.io/projected/aab48af8-8468-44fb-a7c2-1db176749c05-kube-api-access-mm58j\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.510295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.510324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-config-data\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.514341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-config-data\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.514787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.523248 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm58j\" (UniqueName: \"kubernetes.io/projected/aab48af8-8468-44fb-a7c2-1db176749c05-kube-api-access-mm58j\") pod \"nova-scheduler-0\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.624028 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4707]: I0121 15:50:19.990552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.174249 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.219948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-config-data\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.219990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-run-httpd\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-combined-ca-bundle\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgmn6\" (UniqueName: \"kubernetes.io/projected/20366c22-d65c-416c-a985-728cc72025ca-kube-api-access-fgmn6\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-scripts\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-sg-core-conf-yaml\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-log-httpd\") pod \"20366c22-d65c-416c-a985-728cc72025ca\" (UID: \"20366c22-d65c-416c-a985-728cc72025ca\") " Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-run-httpd" (OuterVolumeSpecName: 
"run-httpd") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.220954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.223550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-scripts" (OuterVolumeSpecName: "scripts") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.223577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20366c22-d65c-416c-a985-728cc72025ca-kube-api-access-fgmn6" (OuterVolumeSpecName: "kube-api-access-fgmn6") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "kube-api-access-fgmn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.239800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.265656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aab48af8-8468-44fb-a7c2-1db176749c05","Type":"ContainerStarted","Data":"0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f"} Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.265709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aab48af8-8468-44fb-a7c2-1db176749c05","Type":"ContainerStarted","Data":"a76e64c4a69104cb639fd2d9f8d85deb4b4ba8d50e3aa36841f377eaf6937906"} Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.270238 4707 generic.go:334] "Generic (PLEG): container finished" podID="20366c22-d65c-416c-a985-728cc72025ca" containerID="bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf" exitCode=137 Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.271999 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.272413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerDied","Data":"bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf"} Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.272507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"20366c22-d65c-416c-a985-728cc72025ca","Type":"ContainerDied","Data":"34e26ee0fd971909a356877d38a6e27375ad0fc47ccd2de82c42b4bb01cba5e8"} Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.272574 4707 scope.go:117] "RemoveContainer" containerID="bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.296607 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.29658937 podStartE2EDuration="1.29658937s" podCreationTimestamp="2026-01-21 15:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:20.277713198 +0000 UTC m=+2917.459229420" watchObservedRunningTime="2026-01-21 15:50:20.29658937 +0000 UTC m=+2917.478105591" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.298907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.311796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-config-data" (OuterVolumeSpecName: "config-data") pod "20366c22-d65c-416c-a985-728cc72025ca" (UID: "20366c22-d65c-416c-a985-728cc72025ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.316844 4707 scope.go:117] "RemoveContainer" containerID="d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321498 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgmn6\" (UniqueName: \"kubernetes.io/projected/20366c22-d65c-416c-a985-728cc72025ca-kube-api-access-fgmn6\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321522 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321532 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321541 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321549 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321557 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20366c22-d65c-416c-a985-728cc72025ca-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.321567 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20366c22-d65c-416c-a985-728cc72025ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.331330 4707 scope.go:117] "RemoveContainer" containerID="fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.347620 4707 scope.go:117] "RemoveContainer" containerID="76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.362916 4707 scope.go:117] "RemoveContainer" containerID="bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.363292 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf\": container with ID starting with bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf not found: ID does not exist" containerID="bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.363349 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf"} err="failed to get container status \"bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf\": rpc error: code = NotFound desc = could not find container \"bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf\": 
container with ID starting with bb56765f72064b45a11f4c157a0e991b7e97f9dab052c93f61f4824042420daf not found: ID does not exist" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.363376 4707 scope.go:117] "RemoveContainer" containerID="d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.363662 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f\": container with ID starting with d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f not found: ID does not exist" containerID="d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.363696 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f"} err="failed to get container status \"d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f\": rpc error: code = NotFound desc = could not find container \"d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f\": container with ID starting with d5ad1dc0c977e5cb57e744b54481595d3ff5904076d9710445c3b0e29ec01a7f not found: ID does not exist" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.363722 4707 scope.go:117] "RemoveContainer" containerID="fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.364297 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643\": container with ID starting with fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643 not found: ID does not exist" containerID="fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.364324 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643"} err="failed to get container status \"fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643\": rpc error: code = NotFound desc = could not find container \"fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643\": container with ID starting with fab9ffecc3344d6f875cbfbed4305c3f69a82e6a0e9657f83ebfe47bc1d73643 not found: ID does not exist" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.364338 4707 scope.go:117] "RemoveContainer" containerID="76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.364600 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831\": container with ID starting with 76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831 not found: ID does not exist" containerID="76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.364625 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831"} err="failed to get container status 
\"76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831\": rpc error: code = NotFound desc = could not find container \"76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831\": container with ID starting with 76f3987c84af876ce8af137d198c426377c56cb6b0f291cbdc120e7c96c15831 not found: ID does not exist" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.602910 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.608644 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620204 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.620554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-central-agent" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620569 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-central-agent" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.620591 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="sg-core" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620597 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="sg-core" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.620606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="proxy-httpd" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620612 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="proxy-httpd" Jan 21 15:50:20 crc kubenswrapper[4707]: E0121 15:50:20.620624 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-notification-agent" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620629 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-notification-agent" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620779 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-central-agent" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620793 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="sg-core" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620799 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="ceilometer-notification-agent" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.620824 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20366c22-d65c-416c-a985-728cc72025ca" containerName="proxy-httpd" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.624168 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.626725 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.627429 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.632073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.727640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-run-httpd\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.727708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-log-httpd\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.727728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.727753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.727784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-scripts\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.728046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-config-data\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.728176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbr6\" (UniqueName: \"kubernetes.io/projected/f3de174a-37fa-467b-8757-b3b6a77af447-kube-api-access-5cbr6\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-log-httpd\") pod \"ceilometer-0\" (UID: 
\"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-scripts\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-config-data\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbr6\" (UniqueName: \"kubernetes.io/projected/f3de174a-37fa-467b-8757-b3b6a77af447-kube-api-access-5cbr6\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-run-httpd\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-log-httpd\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.829997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-run-httpd\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.833348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-scripts\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.834066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-config-data\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.841361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.841403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.845656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbr6\" (UniqueName: \"kubernetes.io/projected/f3de174a-37fa-467b-8757-b3b6a77af447-kube-api-access-5cbr6\") pod \"ceilometer-0\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:20 crc kubenswrapper[4707]: I0121 15:50:20.954324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:21 crc kubenswrapper[4707]: I0121 15:50:21.190439 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20366c22-d65c-416c-a985-728cc72025ca" path="/var/lib/kubelet/pods/20366c22-d65c-416c-a985-728cc72025ca/volumes" Jan 21 15:50:21 crc kubenswrapper[4707]: I0121 15:50:21.191422 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e1f6d2-5135-46b9-aedd-4f74080c9f24" path="/var/lib/kubelet/pods/d5e1f6d2-5135-46b9-aedd-4f74080c9f24/volumes" Jan 21 15:50:21 crc kubenswrapper[4707]: I0121 15:50:21.328889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:21 crc kubenswrapper[4707]: W0121 15:50:21.333576 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3de174a_37fa_467b_8757_b3b6a77af447.slice/crio-4bb58168208410429a783d90f5542d33dc873b6816910f3e67bbe8b5e9cca79e WatchSource:0}: Error finding container 4bb58168208410429a783d90f5542d33dc873b6816910f3e67bbe8b5e9cca79e: Status 404 returned error can't find the container with id 4bb58168208410429a783d90f5542d33dc873b6816910f3e67bbe8b5e9cca79e Jan 21 15:50:21 crc kubenswrapper[4707]: I0121 15:50:21.870709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:21 crc kubenswrapper[4707]: I0121 15:50:21.870952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:22 crc kubenswrapper[4707]: I0121 15:50:22.286315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerStarted","Data":"225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549"} Jan 21 15:50:22 crc kubenswrapper[4707]: I0121 15:50:22.286565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerStarted","Data":"4bb58168208410429a783d90f5542d33dc873b6816910f3e67bbe8b5e9cca79e"} Jan 21 15:50:23 crc kubenswrapper[4707]: I0121 15:50:23.295508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerStarted","Data":"c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256"} Jan 21 15:50:23 crc kubenswrapper[4707]: I0121 15:50:23.295782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerStarted","Data":"54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4"} Jan 21 15:50:24 crc kubenswrapper[4707]: I0121 15:50:24.625169 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:24 crc kubenswrapper[4707]: I0121 15:50:24.937399 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:24 crc kubenswrapper[4707]: I0121 15:50:24.937630 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:24 crc kubenswrapper[4707]: I0121 15:50:24.971028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:25 crc kubenswrapper[4707]: I0121 15:50:25.312463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerStarted","Data":"b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340"} Jan 21 15:50:25 crc kubenswrapper[4707]: I0121 15:50:25.312686 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:25 crc kubenswrapper[4707]: I0121 15:50:25.332443 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.043062723 podStartE2EDuration="5.332426163s" podCreationTimestamp="2026-01-21 15:50:20 +0000 UTC" firstStartedPulling="2026-01-21 15:50:21.334828019 +0000 UTC m=+2918.516344242" lastFinishedPulling="2026-01-21 15:50:24.62419146 +0000 UTC m=+2921.805707682" observedRunningTime="2026-01-21 15:50:25.32765901 +0000 UTC m=+2922.509175252" watchObservedRunningTime="2026-01-21 15:50:25.332426163 +0000 UTC m=+2922.513942384" Jan 21 15:50:25 crc kubenswrapper[4707]: I0121 15:50:25.349279 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:26 crc kubenswrapper[4707]: I0121 15:50:26.002915 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzdlz"] Jan 21 15:50:26 crc kubenswrapper[4707]: I0121 15:50:26.598990 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:26 crc kubenswrapper[4707]: I0121 15:50:26.599036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:26 crc kubenswrapper[4707]: I0121 15:50:26.870940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:26 crc kubenswrapper[4707]: 
I0121 15:50:26.871168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:27 crc kubenswrapper[4707]: I0121 15:50:27.324109 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzdlz" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="registry-server" containerID="cri-o://665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c" gracePeriod=2 Jan 21 15:50:27 crc kubenswrapper[4707]: I0121 15:50:27.680937 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:27 crc kubenswrapper[4707]: I0121 15:50:27.680972 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:27 crc kubenswrapper[4707]: I0121 15:50:27.952958 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.198:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:27 crc kubenswrapper[4707]: I0121 15:50:27.952946 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.198:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.796213 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.884158 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htz6g\" (UniqueName: \"kubernetes.io/projected/7ac885cd-7813-4c48-8039-0eb22685ce23-kube-api-access-htz6g\") pod \"7ac885cd-7813-4c48-8039-0eb22685ce23\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.884352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-utilities\") pod \"7ac885cd-7813-4c48-8039-0eb22685ce23\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.884452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-catalog-content\") pod \"7ac885cd-7813-4c48-8039-0eb22685ce23\" (UID: \"7ac885cd-7813-4c48-8039-0eb22685ce23\") " Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.884881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-utilities" (OuterVolumeSpecName: "utilities") pod "7ac885cd-7813-4c48-8039-0eb22685ce23" (UID: "7ac885cd-7813-4c48-8039-0eb22685ce23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.888630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac885cd-7813-4c48-8039-0eb22685ce23-kube-api-access-htz6g" (OuterVolumeSpecName: "kube-api-access-htz6g") pod "7ac885cd-7813-4c48-8039-0eb22685ce23" (UID: "7ac885cd-7813-4c48-8039-0eb22685ce23"). InnerVolumeSpecName "kube-api-access-htz6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.964315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ac885cd-7813-4c48-8039-0eb22685ce23" (UID: "7ac885cd-7813-4c48-8039-0eb22685ce23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.986560 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.986582 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac885cd-7813-4c48-8039-0eb22685ce23-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:28 crc kubenswrapper[4707]: I0121 15:50:28.986594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htz6g\" (UniqueName: \"kubernetes.io/projected/7ac885cd-7813-4c48-8039-0eb22685ce23-kube-api-access-htz6g\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.339288 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerID="665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c" exitCode=0 Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.339471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerDied","Data":"665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c"} Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.339549 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdlz" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.339611 4707 scope.go:117] "RemoveContainer" containerID="665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.339593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdlz" event={"ID":"7ac885cd-7813-4c48-8039-0eb22685ce23","Type":"ContainerDied","Data":"f6ab57d33b942fed170c1f59067023344c553d096dfba84bb973c6524123afbf"} Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.357137 4707 scope.go:117] "RemoveContainer" containerID="f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.359497 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzdlz"] Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.366261 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzdlz"] Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.373631 4707 scope.go:117] "RemoveContainer" containerID="2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.401175 4707 scope.go:117] "RemoveContainer" containerID="665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c" Jan 21 15:50:29 crc kubenswrapper[4707]: E0121 15:50:29.401438 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c\": container with ID starting with 665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c not found: ID does not exist" containerID="665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.401477 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c"} err="failed to get container status \"665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c\": rpc error: code = NotFound desc = could not find container \"665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c\": container with ID starting with 665bdc465d6b81add02612d3bf856d3499337194854f11c02e34f14840bca27c not found: ID does not exist" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.401506 4707 scope.go:117] "RemoveContainer" containerID="f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873" Jan 21 15:50:29 crc kubenswrapper[4707]: E0121 15:50:29.401770 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873\": container with ID starting with f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873 not found: ID does not exist" containerID="f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.401820 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873"} err="failed to get container status \"f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873\": rpc error: code = NotFound desc = could not find container \"f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873\": container with ID starting with f9aa3945e0b5851d66fb465f9fe479846cda35186ce352f6c701b37b143c0873 not found: ID does not exist" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.401849 4707 scope.go:117] "RemoveContainer" containerID="2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40" Jan 21 15:50:29 crc kubenswrapper[4707]: E0121 15:50:29.402348 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40\": container with ID starting with 2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40 not found: ID does not exist" containerID="2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.402376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40"} err="failed to get container status \"2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40\": rpc error: code = NotFound desc = could not find container \"2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40\": container with ID starting with 2e5382e55f753082d899021a46c05f44b454d73bc1020fc9512be31fd0b1de40 not found: ID does not exist" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.624689 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:29 crc kubenswrapper[4707]: I0121 15:50:29.646549 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:50:30 crc kubenswrapper[4707]: I0121 15:50:30.367677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 
15:50:31 crc kubenswrapper[4707]: I0121 15:50:31.192610 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" path="/var/lib/kubelet/pods/7ac885cd-7813-4c48-8039-0eb22685ce23/volumes" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.601082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.601475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.601698 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.601730 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.603512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.604303 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.872565 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.873019 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:36 crc kubenswrapper[4707]: I0121 15:50:36.876634 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:37 crc kubenswrapper[4707]: I0121 15:50:37.396499 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.187093 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.187569 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="proxy-httpd" containerID="cri-o://b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340" gracePeriod=30 Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.187643 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-central-agent" containerID="cri-o://225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549" gracePeriod=30 Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.187688 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="sg-core" containerID="cri-o://c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256" gracePeriod=30 Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.187722 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-notification-agent" 
containerID="cri-o://54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4" gracePeriod=30 Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.193399 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.200:3000/\": EOF" Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.408172 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3de174a-37fa-467b-8757-b3b6a77af447" containerID="b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340" exitCode=0 Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.408209 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3de174a-37fa-467b-8757-b3b6a77af447" containerID="c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256" exitCode=2 Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.410318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerDied","Data":"b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340"} Jan 21 15:50:38 crc kubenswrapper[4707]: I0121 15:50:38.410357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerDied","Data":"c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256"} Jan 21 15:50:39 crc kubenswrapper[4707]: I0121 15:50:39.417651 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3de174a-37fa-467b-8757-b3b6a77af447" containerID="225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549" exitCode=0 Jan 21 15:50:39 crc kubenswrapper[4707]: I0121 15:50:39.417722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerDied","Data":"225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549"} Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.164099 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.277953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cbr6\" (UniqueName: \"kubernetes.io/projected/f3de174a-37fa-467b-8757-b3b6a77af447-kube-api-access-5cbr6\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.277993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-scripts\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.278017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-combined-ca-bundle\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.278064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-sg-core-conf-yaml\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.278082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-log-httpd\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.278203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-run-httpd\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.278234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-config-data\") pod \"f3de174a-37fa-467b-8757-b3b6a77af447\" (UID: \"f3de174a-37fa-467b-8757-b3b6a77af447\") " Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.278923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.279242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.282136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3de174a-37fa-467b-8757-b3b6a77af447-kube-api-access-5cbr6" (OuterVolumeSpecName: "kube-api-access-5cbr6") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "kube-api-access-5cbr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.284626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-scripts" (OuterVolumeSpecName: "scripts") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.298787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.323182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.341748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-config-data" (OuterVolumeSpecName: "config-data") pod "f3de174a-37fa-467b-8757-b3b6a77af447" (UID: "f3de174a-37fa-467b-8757-b3b6a77af447"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379771 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cbr6\" (UniqueName: \"kubernetes.io/projected/f3de174a-37fa-467b-8757-b3b6a77af447-kube-api-access-5cbr6\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379794 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379804 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379825 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379835 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379843 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3de174a-37fa-467b-8757-b3b6a77af447-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.379850 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3de174a-37fa-467b-8757-b3b6a77af447-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.440026 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3de174a-37fa-467b-8757-b3b6a77af447" containerID="54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4" exitCode=0 Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.440066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerDied","Data":"54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4"} Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.440093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3de174a-37fa-467b-8757-b3b6a77af447","Type":"ContainerDied","Data":"4bb58168208410429a783d90f5542d33dc873b6816910f3e67bbe8b5e9cca79e"} Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.440098 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.440118 4707 scope.go:117] "RemoveContainer" containerID="b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.454908 4707 scope.go:117] "RemoveContainer" containerID="c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.467802 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.475006 4707 scope.go:117] "RemoveContainer" containerID="54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.491404 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.500991 4707 scope.go:117] "RemoveContainer" containerID="225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505024 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505354 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="sg-core" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505371 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="sg-core" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505382 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-notification-agent" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-notification-agent" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505404 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="registry-server" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505409 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="registry-server" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505420 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="extract-utilities" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505426 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="extract-utilities" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505433 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-central-agent" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505438 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-central-agent" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505448 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="proxy-httpd" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505453 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="proxy-httpd" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.505461 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="extract-content" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="extract-content" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505619 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="sg-core" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505631 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-notification-agent" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="ceilometer-central-agent" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505653 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" containerName="proxy-httpd" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.505662 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac885cd-7813-4c48-8039-0eb22685ce23" containerName="registry-server" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.507167 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.509372 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.509727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.518279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.529782 4707 scope.go:117] "RemoveContainer" containerID="b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.530211 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340\": container with ID starting with b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340 not found: ID does not exist" containerID="b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.530245 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340"} err="failed to get container status \"b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340\": rpc error: code = NotFound desc = could not find container \"b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340\": container with ID starting with b265b7df8afbd2e115532c82de5847a1b666051d43fe78059ac8c4d213e38340 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.530279 4707 
scope.go:117] "RemoveContainer" containerID="c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.530593 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256\": container with ID starting with c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256 not found: ID does not exist" containerID="c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.530611 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256"} err="failed to get container status \"c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256\": rpc error: code = NotFound desc = could not find container \"c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256\": container with ID starting with c01e6b260c822f716990dc496c3a5d4614cba7f61783334957dbe3c127a9a256 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.530621 4707 scope.go:117] "RemoveContainer" containerID="54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.530865 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4\": container with ID starting with 54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4 not found: ID does not exist" containerID="54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.530884 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4"} err="failed to get container status \"54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4\": rpc error: code = NotFound desc = could not find container \"54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4\": container with ID starting with 54b0fa4dd66a84899843d1790c55dbd019b96d125b7b514ac8065767f86473c4 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.530894 4707 scope.go:117] "RemoveContainer" containerID="225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549" Jan 21 15:50:42 crc kubenswrapper[4707]: E0121 15:50:42.531094 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549\": container with ID starting with 225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549 not found: ID does not exist" containerID="225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.531113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549"} err="failed to get container status \"225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549\": rpc error: code = NotFound desc = could not find container \"225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549\": container with ID starting with 
225c3db812a9be58b158a18bfd2d8b02e6ae71b18c2633f4b3c7d292be5ac549 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.597790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.597911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-run-httpd\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.598006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-scripts\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.598115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-log-httpd\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.598235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glck5\" (UniqueName: \"kubernetes.io/projected/466deb02-8b2c-4c82-a0ef-fc3f719f321a-kube-api-access-glck5\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.598337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.598406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-config-data\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.699818 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-log-httpd\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glck5\" (UniqueName: \"kubernetes.io/projected/466deb02-8b2c-4c82-a0ef-fc3f719f321a-kube-api-access-glck5\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc 
kubenswrapper[4707]: I0121 15:50:42.700153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-config-data\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-run-httpd\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-scripts\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-run-httpd\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.700304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-log-httpd\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.703480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.703557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.703928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-config-data\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.704019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-scripts\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.713249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glck5\" (UniqueName: \"kubernetes.io/projected/466deb02-8b2c-4c82-a0ef-fc3f719f321a-kube-api-access-glck5\") pod \"ceilometer-0\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4707]: I0121 15:50:42.821864 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4707]: I0121 15:50:43.192766 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3de174a-37fa-467b-8757-b3b6a77af447" path="/var/lib/kubelet/pods/f3de174a-37fa-467b-8757-b3b6a77af447/volumes" Jan 21 15:50:43 crc kubenswrapper[4707]: I0121 15:50:43.194552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:50:43 crc kubenswrapper[4707]: I0121 15:50:43.449174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerStarted","Data":"899050c5c889dd02d82e024080ced6c7075805301dd7071790c4ace68f56d937"} Jan 21 15:50:44 crc kubenswrapper[4707]: I0121 15:50:44.457777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerStarted","Data":"e98a6f97c0c59f1457887797ba63a8cd20695ce6c8be90bf06b9bd633a27d4b5"} Jan 21 15:50:45 crc kubenswrapper[4707]: I0121 15:50:45.466260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerStarted","Data":"fea7125a871d1415c93116c48d75a9581bfa690f1b86118e2d270e7df411388f"} Jan 21 15:50:45 crc kubenswrapper[4707]: I0121 15:50:45.466684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerStarted","Data":"57135fba45aec0028e942e949147ddc31decf31eea83c9d70afc2ddec4bea450"} Jan 21 15:50:47 crc kubenswrapper[4707]: I0121 15:50:47.483868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerStarted","Data":"e6bf0d0ee323fd65232e57a4a6ac7b388eb8e7c5bca6ca6b043f495165ea7d8b"} Jan 21 15:50:47 crc kubenswrapper[4707]: I0121 15:50:47.485274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:50:47 crc kubenswrapper[4707]: I0121 15:50:47.502790 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.00148135 podStartE2EDuration="5.502773928s" podCreationTimestamp="2026-01-21 15:50:42 +0000 UTC" firstStartedPulling="2026-01-21 15:50:43.198869746 +0000 UTC m=+2940.380385968" lastFinishedPulling="2026-01-21 15:50:46.700162325 +0000 UTC m=+2943.881678546" 
observedRunningTime="2026-01-21 15:50:47.499440761 +0000 UTC m=+2944.680956983" watchObservedRunningTime="2026-01-21 15:50:47.502773928 +0000 UTC m=+2944.684290150" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.265468 4707 scope.go:117] "RemoveContainer" containerID="6d5389c6099f34042afa120959ddb534a1021ab43120ae8f9998af0e26ea03c0" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.308328 4707 scope.go:117] "RemoveContainer" containerID="566d2c79913ac7586f200da2fc0d89fa102371ace8458c1a066117901148fa60" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.327296 4707 scope.go:117] "RemoveContainer" containerID="af2df74e23c68b533ce2ad604e55e3b6a44522d5ac5962c11fede1b902ed12e8" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.357291 4707 scope.go:117] "RemoveContainer" containerID="c10e79abafe959b74a9a32a523621e9d6baab7114207d3edd4ec7a890bd33ad2" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.393736 4707 scope.go:117] "RemoveContainer" containerID="38b41abf3aac941b3174f1c8a5eeb7b89e9817d932cce60bb06f025c8037461f" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.412017 4707 scope.go:117] "RemoveContainer" containerID="ec9bc512b0547b6776ea1f5e8282c90dc55ec62ca4731917e733976641cf9ab8" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.436856 4707 scope.go:117] "RemoveContainer" containerID="8df853fe17904e91c15bf9f58ea86cb238d2bb1b6b761b5fe7751f6725ae781f" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.467168 4707 scope.go:117] "RemoveContainer" containerID="74275cce571896970dcd652c2e437e0b5e956b6bdecd5677a5c8242f96d9d70a" Jan 21 15:51:03 crc kubenswrapper[4707]: I0121 15:51:03.495379 4707 scope.go:117] "RemoveContainer" containerID="be1b9dd35378efeb49e3e0750d3c9b65ebd4b0e6153a53a3a5ccf28db9d738e2" Jan 21 15:51:12 crc kubenswrapper[4707]: I0121 15:51:12.825720 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.649056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.661343 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.661583 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="cinder-scheduler" containerID="cri-o://fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c" gracePeriod=30 Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.662033 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="probe" containerID="cri-o://dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1" gracePeriod=30 Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.704624 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.704920 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api-log" containerID="cri-o://7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676" gracePeriod=30 Jan 21 15:51:21 crc 
kubenswrapper[4707]: I0121 15:51:21.705000 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api" containerID="cri-o://ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e" gracePeriod=30 Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.746565 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.747877 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.764058 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-568886d945-qqxjr"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.765694 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.776591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.782305 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-568886d945-qqxjr"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.800824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.800896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c6c283-dad9-4fe2-91a7-184df0427be8-logs\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.800926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.800967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw869\" (UniqueName: \"kubernetes.io/projected/a9c6c283-dad9-4fe2-91a7-184df0427be8-kube-api-access-lw869\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.801024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-logs\") pod \"barbican-worker-568886d945-qqxjr\" (UID: 
\"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.801069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-combined-ca-bundle\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.801132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data-custom\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.801183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data-custom\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.801241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-combined-ca-bundle\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.801305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tkw\" (UniqueName: \"kubernetes.io/projected/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-kube-api-access-24tkw\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24tkw\" (UniqueName: \"kubernetes.io/projected/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-kube-api-access-24tkw\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c6c283-dad9-4fe2-91a7-184df0427be8-logs\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw869\" (UniqueName: \"kubernetes.io/projected/a9c6c283-dad9-4fe2-91a7-184df0427be8-kube-api-access-lw869\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-logs\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-combined-ca-bundle\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data-custom\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data-custom\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.903401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-combined-ca-bundle\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.904374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-logs\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.904737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a9c6c283-dad9-4fe2-91a7-184df0427be8-logs\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.915517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data-custom\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.915609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-combined-ca-bundle\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.918652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data-custom\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.935329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw869\" (UniqueName: \"kubernetes.io/projected/a9c6c283-dad9-4fe2-91a7-184df0427be8-kube-api-access-lw869\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.937266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.938361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-combined-ca-bundle\") pod \"barbican-keystone-listener-75d95b7f46-dlmg9\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.955842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.959498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tkw\" (UniqueName: \"kubernetes.io/projected/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-kube-api-access-24tkw\") pod \"barbican-worker-568886d945-qqxjr\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " 
pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.979867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.980141 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" containerName="nova-cell0-conductor-conductor" containerID="cri-o://abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c" gracePeriod=30 Jan 21 15:51:21 crc kubenswrapper[4707]: I0121 15:51:21.999603 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-f6b9744b8-g24th"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.012089 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.018895 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-75cf9974f9-bpkfv"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.020683 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.045605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-f6b9744b8-g24th"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.067348 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.069204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-75cf9974f9-bpkfv"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.081315 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-credential-keys\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-config-data\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-combined-ca-bundle\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmtx\" (UniqueName: \"kubernetes.io/projected/43c11e62-fdd4-4840-b282-7f467f9e4a88-kube-api-access-9jmtx\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-combined-ca-bundle\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5206c260-c7a5-4e61-bd58-4ed5772520e6-logs\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106822 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-fernet-keys\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.106928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-scripts\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.107031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.107100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5mbl\" (UniqueName: \"kubernetes.io/projected/5206c260-c7a5-4e61-bd58-4ed5772520e6-kube-api-access-j5mbl\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.107166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data-custom\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.140877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.141175 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aab48af8-8468-44fb-a7c2-1db176749c05" containerName="nova-scheduler-scheduler" containerID="cri-o://0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f" gracePeriod=30 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.145055 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.159027 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.159361 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-log" containerID="cri-o://adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1" gracePeriod=30 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.159424 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-metadata" containerID="cri-o://15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05" gracePeriod=30 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.165511 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.165738 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-log" containerID="cri-o://b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93" gracePeriod=30 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.165904 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-api" 
containerID="cri-o://b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e" gracePeriod=30 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.248364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.277736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5mbl\" (UniqueName: \"kubernetes.io/projected/5206c260-c7a5-4e61-bd58-4ed5772520e6-kube-api-access-j5mbl\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.277798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data-custom\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.277988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-credential-keys\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.278013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-config-data\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.278188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-combined-ca-bundle\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.278285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmtx\" (UniqueName: \"kubernetes.io/projected/43c11e62-fdd4-4840-b282-7f467f9e4a88-kube-api-access-9jmtx\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.278322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-combined-ca-bundle\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.311363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5206c260-c7a5-4e61-bd58-4ed5772520e6-logs\") pod 
\"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.311670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-fernet-keys\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.311770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-scripts\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.315204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5206c260-c7a5-4e61-bd58-4ed5772520e6-logs\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.318031 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.319720 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.329376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-combined-ca-bundle\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.334591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data-custom\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.343536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.377145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-credential-keys\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.377491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-config-data\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " 
pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.389837 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.406430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-combined-ca-bundle\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.407411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmtx\" (UniqueName: \"kubernetes.io/projected/43c11e62-fdd4-4840-b282-7f467f9e4a88-kube-api-access-9jmtx\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.407640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-fernet-keys\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.411285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-scripts\") pod \"keystone-75cf9974f9-bpkfv\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.412304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5mbl\" (UniqueName: \"kubernetes.io/projected/5206c260-c7a5-4e61-bd58-4ed5772520e6-kube-api-access-j5mbl\") pod \"barbican-api-f6b9744b8-g24th\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.413370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-combined-ca-bundle\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.413601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvf9\" (UniqueName: \"kubernetes.io/projected/9caaf11c-be1e-48d5-976d-d6960e5288df-kube-api-access-vqvf9\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.413633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-httpd-config\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.413661 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-config\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: E0121 15:51:22.497762 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1376014d_80c7_43b5_b295_72d7b639df19.slice/crio-b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1376014d_80c7_43b5_b295_72d7b639df19.slice/crio-conmon-b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2391af6c_7b43_479d_9401_3212aa083d20.slice/crio-conmon-adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.514728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvf9\" (UniqueName: \"kubernetes.io/projected/9caaf11c-be1e-48d5-976d-d6960e5288df-kube-api-access-vqvf9\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.514772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-httpd-config\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.514798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-config\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.514860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-combined-ca-bundle\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.524130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-config\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.526222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-combined-ca-bundle\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc 
kubenswrapper[4707]: I0121 15:51:22.527443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-httpd-config\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.536102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvf9\" (UniqueName: \"kubernetes.io/projected/9caaf11c-be1e-48d5-976d-d6960e5288df-kube-api-access-vqvf9\") pod \"neutron-6cc6d9b6db-9lrzm\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.644326 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.658288 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.678748 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.754364 4707 generic.go:334] "Generic (PLEG): container finished" podID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerID="dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1" exitCode=0 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.754518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2552e9a1-fb35-4150-b499-a7a1d314fd0d","Type":"ContainerDied","Data":"dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1"} Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.793200 4707 generic.go:334] "Generic (PLEG): container finished" podID="1376014d-80c7-43b5-b295-72d7b639df19" containerID="b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93" exitCode=143 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.793338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1376014d-80c7-43b5-b295-72d7b639df19","Type":"ContainerDied","Data":"b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93"} Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.794441 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.834981 4707 generic.go:334] "Generic (PLEG): container finished" podID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerID="7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676" exitCode=143 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.835066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5be95f6e-99a6-4607-a29e-87a1c87afd68","Type":"ContainerDied","Data":"7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676"} Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.852622 4707 generic.go:334] "Generic (PLEG): container finished" podID="2391af6c-7b43-479d-9401-3212aa083d20" containerID="adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1" exitCode=143 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.852662 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2391af6c-7b43-479d-9401-3212aa083d20","Type":"ContainerDied","Data":"adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1"} Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.952795 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.966279 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" containerName="memcached" containerID="cri-o://2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56" gracePeriod=30 Jan 21 15:51:22 crc kubenswrapper[4707]: I0121 15:51:22.988877 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-568886d945-qqxjr"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.244586 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-75cf9974f9-bpkfv"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.266019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.276946 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-log" containerID="cri-o://4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.277454 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-httpd" containerID="cri-o://dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.321875 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.322375 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://49ab48348e2c3da66f3916afef13af7e4060b40cf8411906a27a3ada9541b590" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.399855 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400287 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-server" containerID="cri-o://823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400409 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="swift-recon-cron" containerID="cri-o://a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400454 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="rsync" containerID="cri-o://3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400489 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-expirer" containerID="cri-o://1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400518 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-updater" containerID="cri-o://3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400544 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-auditor" containerID="cri-o://5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400570 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-replicator" containerID="cri-o://f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400595 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-server" containerID="cri-o://012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-updater" containerID="cri-o://f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400647 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-auditor" containerID="cri-o://fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400698 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-replicator" containerID="cri-o://f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400734 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-server" containerID="cri-o://b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0" gracePeriod=30 Jan 
21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400779 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-reaper" containerID="cri-o://fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400829 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-auditor" containerID="cri-o://3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.400858 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-replicator" containerID="cri-o://ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.493886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.502732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-75cf9974f9-bpkfv"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.534595 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.534827 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-log" containerID="cri-o://a61916f8e83ea502f8075b4d59765ed8f01db90f73b4f184cfeafe8b58ce0fda" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.535217 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-httpd" containerID="cri-o://d92766c25e2a6639816ebec6fda095e760a14a87d4377f309301e1ed97c1a7c9" gracePeriod=30 Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.609071 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-877bd7b8b-p9zhz"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.610403 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.637880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-877bd7b8b-p9zhz"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.648639 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.693526 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-64d4c897d6-z4d82"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.716418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-64d4c897d6-z4d82"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.716503 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.865715 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-f6b9744b8-g24th"] Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.868967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96598\" (UniqueName: \"kubernetes.io/projected/b0ff6c1a-8a03-430f-a291-4e4b94123939-kube-api-access-96598\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.878641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-scripts\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.878761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-config-data\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.887404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-combined-ca-bundle\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.887563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-credential-keys\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.887622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-fernet-keys\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-scripts\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-httpd-config\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991349 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-combined-ca-bundle\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-config-data\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-combined-ca-bundle\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bksjn\" (UniqueName: \"kubernetes.io/projected/7df46a58-0e69-4140-b4a6-5357eae382ef-kube-api-access-bksjn\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-credential-keys\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-config\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-fernet-keys\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:23 crc kubenswrapper[4707]: I0121 15:51:23.991735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96598\" (UniqueName: \"kubernetes.io/projected/b0ff6c1a-8a03-430f-a291-4e4b94123939-kube-api-access-96598\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.012571 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.012719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.012911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.012780 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013037 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013096 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013159 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013171 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013179 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7" exitCode=0 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.013264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.015833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" event={"ID":"a9c6c283-dad9-4fe2-91a7-184df0427be8","Type":"ContainerStarted","Data":"ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.015883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" event={"ID":"a9c6c283-dad9-4fe2-91a7-184df0427be8","Type":"ContainerStarted","Data":"5763c506bcb491757450aef3bb9ba416785bc89b8eeaf1f08a3091941fded5a3"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.017073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" event={"ID":"5206c260-c7a5-4e61-bd58-4ed5772520e6","Type":"ContainerStarted","Data":"b3648349800419d4f2b03de9add01633483615336d5efe41b825a57326ba8a2e"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.024298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-scripts\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.024591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-combined-ca-bundle\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.025597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-credential-keys\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.026311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-fernet-keys\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.026394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-config-data\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.027687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96598\" (UniqueName: \"kubernetes.io/projected/b0ff6c1a-8a03-430f-a291-4e4b94123939-kube-api-access-96598\") pod \"keystone-877bd7b8b-p9zhz\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.030239 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerID="4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8" exitCode=143 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.030344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0687c9e1-7ccd-454c-8ffd-8507cb8af46c","Type":"ContainerDied","Data":"4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.041381 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" event={"ID":"65b2a5f8-e722-43a5-8765-1b6f281e9b8d","Type":"ContainerStarted","Data":"814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.041421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" event={"ID":"65b2a5f8-e722-43a5-8765-1b6f281e9b8d","Type":"ContainerStarted","Data":"53c9d1533f473c8420f1ae18d35c47189ddac03a313992c676247a71ed2ee1b8"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.051082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" event={"ID":"43c11e62-fdd4-4840-b282-7f467f9e4a88","Type":"ContainerStarted","Data":"a5766d5bf168913788538335dbb1615ed79f7befcea3074dcb7e8bac5a9a3946"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.051129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" event={"ID":"43c11e62-fdd4-4840-b282-7f467f9e4a88","Type":"ContainerStarted","Data":"00021ad3af7661cdf46d6752886ed6f4f2027587cd8b939e4d6291f3199e69a0"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.051272 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" podUID="43c11e62-fdd4-4840-b282-7f467f9e4a88" containerName="keystone-api" containerID="cri-o://a5766d5bf168913788538335dbb1615ed79f7befcea3074dcb7e8bac5a9a3946" gracePeriod=30 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.051356 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.058242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" event={"ID":"9caaf11c-be1e-48d5-976d-d6960e5288df","Type":"ContainerStarted","Data":"9276cd0bf8da95c5d76f1098d0fed09fe687bc5f107b989663548fc9031901aa"} Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.077446 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" podStartSLOduration=3.077425962 podStartE2EDuration="3.077425962s" podCreationTimestamp="2026-01-21 15:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:24.076716429 +0000 UTC m=+2981.258232650" watchObservedRunningTime="2026-01-21 15:51:24.077425962 +0000 UTC m=+2981.258942183" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.093960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bksjn\" (UniqueName: \"kubernetes.io/projected/7df46a58-0e69-4140-b4a6-5357eae382ef-kube-api-access-bksjn\") pod \"neutron-64d4c897d6-z4d82\" 
(UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.094059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-config\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.094134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-httpd-config\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.094155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-combined-ca-bundle\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.101229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-httpd-config\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.103610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-combined-ca-bundle\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.106328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-config\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.114697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bksjn\" (UniqueName: \"kubernetes.io/projected/7df46a58-0e69-4140-b4a6-5357eae382ef-kube-api-access-bksjn\") pod \"neutron-64d4c897d6-z4d82\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.276431 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.282688 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.284572 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.285609 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.285641 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.289093 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.119:11211: connect: connection refused" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.296342 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="rabbitmq" containerID="cri-o://32edadedef2c315e6d4653ad9f20b50bc87635b5eb5275fb1e900ad9ba493403" gracePeriod=604798 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.395443 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.625570 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.626877 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.632148 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:51:24 crc kubenswrapper[4707]: E0121 15:51:24.632184 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aab48af8-8468-44fb-a7c2-1db176749c05" containerName="nova-scheduler-scheduler" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.635506 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.712175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kolla-config\") pod \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.712631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrxhn\" (UniqueName: \"kubernetes.io/projected/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kube-api-access-mrxhn\") pod \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.712656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-config-data\") pod \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\" (UID: \"373d78b4-4328-42ce-8d54-6fb1c0ecefdb\") " Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.713480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "373d78b4-4328-42ce-8d54-6fb1c0ecefdb" (UID: "373d78b4-4328-42ce-8d54-6fb1c0ecefdb"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.718994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-config-data" (OuterVolumeSpecName: "config-data") pod "373d78b4-4328-42ce-8d54-6fb1c0ecefdb" (UID: "373d78b4-4328-42ce-8d54-6fb1c0ecefdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.732339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kube-api-access-mrxhn" (OuterVolumeSpecName: "kube-api-access-mrxhn") pod "373d78b4-4328-42ce-8d54-6fb1c0ecefdb" (UID: "373d78b4-4328-42ce-8d54-6fb1c0ecefdb"). InnerVolumeSpecName "kube-api-access-mrxhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.743609 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" containerName="rabbitmq" containerID="cri-o://e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd" gracePeriod=604798 Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.815487 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.815520 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrxhn\" (UniqueName: \"kubernetes.io/projected/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-kube-api-access-mrxhn\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:24 crc kubenswrapper[4707]: I0121 15:51:24.815533 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373d78b4-4328-42ce-8d54-6fb1c0ecefdb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.054040 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-877bd7b8b-p9zhz"] Jan 21 15:51:25 crc kubenswrapper[4707]: W0121 15:51:25.063429 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ff6c1a_8a03_430f_a291_4e4b94123939.slice/crio-1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b WatchSource:0}: Error finding container 1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b: Status 404 returned error can't find the container with id 1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.080676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" event={"ID":"9caaf11c-be1e-48d5-976d-d6960e5288df","Type":"ContainerStarted","Data":"839050d1503bd45e328be8023d300c3011c332e41ef788d6aee897ee48f7883e"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.080723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" event={"ID":"9caaf11c-be1e-48d5-976d-d6960e5288df","Type":"ContainerStarted","Data":"6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.080875 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-api" containerID="cri-o://6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419" gracePeriod=30 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.081064 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.081376 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-httpd" containerID="cri-o://839050d1503bd45e328be8023d300c3011c332e41ef788d6aee897ee48f7883e" gracePeriod=30 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.105145 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" podStartSLOduration=3.105125466 podStartE2EDuration="3.105125466s" podCreationTimestamp="2026-01-21 15:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:25.103412866 +0000 UTC m=+2982.284929087" watchObservedRunningTime="2026-01-21 15:51:25.105125466 +0000 UTC m=+2982.286641688" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108801 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108842 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108850 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108858 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108864 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108871 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108876 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b"} Jan 21 15:51:25 crc 
kubenswrapper[4707]: I0121 15:51:25.108947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.108992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.127199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" event={"ID":"a9c6c283-dad9-4fe2-91a7-184df0427be8","Type":"ContainerStarted","Data":"409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.133263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" event={"ID":"65b2a5f8-e722-43a5-8765-1b6f281e9b8d","Type":"ContainerStarted","Data":"9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.137363 4707 generic.go:334] "Generic (PLEG): container finished" podID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" containerID="49ab48348e2c3da66f3916afef13af7e4060b40cf8411906a27a3ada9541b590" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.137515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"17a6ace0-ecbc-47bd-acf8-9a7e76defede","Type":"ContainerDied","Data":"49ab48348e2c3da66f3916afef13af7e4060b40cf8411906a27a3ada9541b590"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.142983 4707 generic.go:334] "Generic (PLEG): container finished" podID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" containerID="abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.143008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" 
event={"ID":"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb","Type":"ContainerDied","Data":"abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.143087 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.152197 4707 generic.go:334] "Generic (PLEG): container finished" podID="aab48af8-8468-44fb-a7c2-1db176749c05" containerID="0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.152301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aab48af8-8468-44fb-a7c2-1db176749c05","Type":"ContainerDied","Data":"0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.154552 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" podStartSLOduration=4.154540168 podStartE2EDuration="4.154540168s" podCreationTimestamp="2026-01-21 15:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:25.146922544 +0000 UTC m=+2982.328438766" watchObservedRunningTime="2026-01-21 15:51:25.154540168 +0000 UTC m=+2982.336056390" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.171604 4707 generic.go:334] "Generic (PLEG): container finished" podID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerID="fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.171667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2552e9a1-fb35-4150-b499-a7a1d314fd0d","Type":"ContainerDied","Data":"fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.171692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2552e9a1-fb35-4150-b499-a7a1d314fd0d","Type":"ContainerDied","Data":"9d28ec71e7dc46d3cd74cafcf734ff9c49a94939630d50bb896c4f68dd1fe149"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.171708 4707 scope.go:117] "RemoveContainer" containerID="dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.171854 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.182197 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.182410 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener-log" containerID="cri-o://478c24f9df975e5f07c1890409bc439d4190136c2d1f1024e43e81cf80e40146" gracePeriod=30 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.182506 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener" containerID="cri-o://1e13dc0dd446435fd65bf4d3e47bff260f5012e7a18801585e384b29e9417f69" gracePeriod=30 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.192545 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-scripts\") pod \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225550 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data-custom\") pod \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llb7l\" (UniqueName: \"kubernetes.io/projected/17a6ace0-ecbc-47bd-acf8-9a7e76defede-kube-api-access-llb7l\") pod \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-combined-ca-bundle\") pod \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-config-data\") pod \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data\") pod \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-combined-ca-bundle\") pod \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\" (UID: \"17a6ace0-ecbc-47bd-acf8-9a7e76defede\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcmn6\" (UniqueName: \"kubernetes.io/projected/2552e9a1-fb35-4150-b499-a7a1d314fd0d-kube-api-access-zcmn6\") pod \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.225772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2552e9a1-fb35-4150-b499-a7a1d314fd0d-etc-machine-id\") pod \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\" (UID: \"2552e9a1-fb35-4150-b499-a7a1d314fd0d\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.234480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2552e9a1-fb35-4150-b499-a7a1d314fd0d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2552e9a1-fb35-4150-b499-a7a1d314fd0d" (UID: "2552e9a1-fb35-4150-b499-a7a1d314fd0d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.249190 4707 generic.go:334] "Generic (PLEG): container finished" podID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerID="a61916f8e83ea502f8075b4d59765ed8f01db90f73b4f184cfeafe8b58ce0fda" exitCode=143 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.266984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2552e9a1-fb35-4150-b499-a7a1d314fd0d-kube-api-access-zcmn6" (OuterVolumeSpecName: "kube-api-access-zcmn6") pod "2552e9a1-fb35-4150-b499-a7a1d314fd0d" (UID: "2552e9a1-fb35-4150-b499-a7a1d314fd0d"). InnerVolumeSpecName "kube-api-access-zcmn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.271153 4707 scope.go:117] "RemoveContainer" containerID="fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.274410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" event={"ID":"5206c260-c7a5-4e61-bd58-4ed5772520e6","Type":"ContainerStarted","Data":"38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.275928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.275946 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.275954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" event={"ID":"5206c260-c7a5-4e61-bd58-4ed5772520e6","Type":"ContainerStarted","Data":"0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.275964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f02894e7-9ae7-40df-a2d1-c9074494a383","Type":"ContainerDied","Data":"a61916f8e83ea502f8075b4d59765ed8f01db90f73b4f184cfeafe8b58ce0fda"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.280693 4707 generic.go:334] "Generic (PLEG): container finished" podID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" containerID="2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56" exitCode=0 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.280731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"373d78b4-4328-42ce-8d54-6fb1c0ecefdb","Type":"ContainerDied","Data":"2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.280746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"373d78b4-4328-42ce-8d54-6fb1c0ecefdb","Type":"ContainerDied","Data":"f3e95752ba008674a53dfa7bff12177e77f4db0814839b702392ea7f471a69f8"} Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.280794 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.283265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a6ace0-ecbc-47bd-acf8-9a7e76defede-kube-api-access-llb7l" (OuterVolumeSpecName: "kube-api-access-llb7l") pod "17a6ace0-ecbc-47bd-acf8-9a7e76defede" (UID: "17a6ace0-ecbc-47bd-acf8-9a7e76defede"). InnerVolumeSpecName "kube-api-access-llb7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.300012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-scripts" (OuterVolumeSpecName: "scripts") pod "2552e9a1-fb35-4150-b499-a7a1d314fd0d" (UID: "2552e9a1-fb35-4150-b499-a7a1d314fd0d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.308892 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" podStartSLOduration=4.30886704 podStartE2EDuration="4.30886704s" podCreationTimestamp="2026-01-21 15:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:25.211441446 +0000 UTC m=+2982.392957668" watchObservedRunningTime="2026-01-21 15:51:25.30886704 +0000 UTC m=+2982.490383262" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.311643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2552e9a1-fb35-4150-b499-a7a1d314fd0d" (UID: "2552e9a1-fb35-4150-b499-a7a1d314fd0d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.328896 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.328919 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llb7l\" (UniqueName: \"kubernetes.io/projected/17a6ace0-ecbc-47bd-acf8-9a7e76defede-kube-api-access-llb7l\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.328933 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcmn6\" (UniqueName: \"kubernetes.io/projected/2552e9a1-fb35-4150-b499-a7a1d314fd0d-kube-api-access-zcmn6\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.328943 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2552e9a1-fb35-4150-b499-a7a1d314fd0d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.328969 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.341756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.353599 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker-log" containerID="cri-o://d60b3a2a18c48b42bdf714d802ca2dd8c97fac6dfc1e9c9a4de48c3542d38b4c" gracePeriod=30 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.354559 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker" containerID="cri-o://436bcbab1f25e15f46246ecc8ede16a55da48c1cd2e3c46c116436928b337639" gracePeriod=30 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.386100 4707 scope.go:117] "RemoveContainer" 
containerID="dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.387145 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1\": container with ID starting with dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1 not found: ID does not exist" containerID="dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.387177 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1"} err="failed to get container status \"dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1\": rpc error: code = NotFound desc = could not find container \"dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1\": container with ID starting with dc5a02028ba53efa871e99ed66dc4d3c161d7f82faf5dca13982bbf95836caa1 not found: ID does not exist" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.387196 4707 scope.go:117] "RemoveContainer" containerID="fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.387724 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c\": container with ID starting with fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c not found: ID does not exist" containerID="fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.387740 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c"} err="failed to get container status \"fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c\": rpc error: code = NotFound desc = could not find container \"fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c\": container with ID starting with fc8f5abe82a14c2d4da1c780095a82d73f3dee14c52cc37f2b0624838218727c not found: ID does not exist" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.387753 4707 scope.go:117] "RemoveContainer" containerID="2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.405475 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" podStartSLOduration=4.40546096 podStartE2EDuration="4.40546096s" podCreationTimestamp="2026-01-21 15:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:25.283425192 +0000 UTC m=+2982.464941415" watchObservedRunningTime="2026-01-21 15:51:25.40546096 +0000 UTC m=+2982.586977182" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.428103 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.432718 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-64d4c897d6-z4d82"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.445946 4707 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.450512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17a6ace0-ecbc-47bd-acf8-9a7e76defede" (UID: "17a6ace0-ecbc-47bd-acf8-9a7e76defede"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.452855 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.453214 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" containerName="memcached" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453228 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" containerName="memcached" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.453242 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453248 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.453268 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="probe" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453274 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="probe" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.453293 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="cinder-scheduler" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453299 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="cinder-scheduler" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453463 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="probe" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453489 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" containerName="cinder-scheduler" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" containerName="memcached" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.453508 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.454057 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.455361 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.456404 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.457237 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-2x7h6" Jan 21 15:51:25 crc kubenswrapper[4707]: W0121 15:51:25.474231 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df46a58_0e69_4140_b4a6_5357eae382ef.slice/crio-f9481f8d2491fbee7e125a51196988eb96fe475497f75bbe51b05c1c817909c5 WatchSource:0}: Error finding container f9481f8d2491fbee7e125a51196988eb96fe475497f75bbe51b05c1c817909c5: Status 404 returned error can't find the container with id f9481f8d2491fbee7e125a51196988eb96fe475497f75bbe51b05c1c817909c5 Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.484579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-config-data" (OuterVolumeSpecName: "config-data") pod "17a6ace0-ecbc-47bd-acf8-9a7e76defede" (UID: "17a6ace0-ecbc-47bd-acf8-9a7e76defede"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.488866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.496404 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.512694 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm58j\" (UniqueName: \"kubernetes.io/projected/aab48af8-8468-44fb-a7c2-1db176749c05-kube-api-access-mm58j\") pod \"aab48af8-8468-44fb-a7c2-1db176749c05\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzjsh\" (UniqueName: \"kubernetes.io/projected/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-kube-api-access-zzjsh\") pod \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536393 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-config-data\") pod \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-config-data\") pod \"aab48af8-8468-44fb-a7c2-1db176749c05\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-combined-ca-bundle\") pod \"aab48af8-8468-44fb-a7c2-1db176749c05\" (UID: \"aab48af8-8468-44fb-a7c2-1db176749c05\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-combined-ca-bundle\") pod \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\" (UID: \"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-kolla-config\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gcrsw\" (UniqueName: \"kubernetes.io/projected/42924886-d9f3-48df-83a3-f7e27fcf368d-kube-api-access-gcrsw\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.536937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-config-data\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.537032 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.537048 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a6ace0-ecbc-47bd-acf8-9a7e76defede-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.543944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab48af8-8468-44fb-a7c2-1db176749c05-kube-api-access-mm58j" (OuterVolumeSpecName: "kube-api-access-mm58j") pod "aab48af8-8468-44fb-a7c2-1db176749c05" (UID: "aab48af8-8468-44fb-a7c2-1db176749c05"). InnerVolumeSpecName "kube-api-access-mm58j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.559719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-kube-api-access-zzjsh" (OuterVolumeSpecName: "kube-api-access-zzjsh") pod "4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" (UID: "4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb"). InnerVolumeSpecName "kube-api-access-zzjsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.566564 4707 scope.go:117] "RemoveContainer" containerID="2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.568698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-config-data" (OuterVolumeSpecName: "config-data") pod "aab48af8-8468-44fb-a7c2-1db176749c05" (UID: "aab48af8-8468-44fb-a7c2-1db176749c05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.573708 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56\": container with ID starting with 2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56 not found: ID does not exist" containerID="2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.573748 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56"} err="failed to get container status \"2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56\": rpc error: code = NotFound desc = could not find container \"2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56\": container with ID starting with 2967ead0ebe6f49d2259ad3e0573221b7c040d21fde0c17f8354033c7fc65f56 not found: ID does not exist" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.593952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aab48af8-8468-44fb-a7c2-1db176749c05" (UID: "aab48af8-8468-44fb-a7c2-1db176749c05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.601889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2552e9a1-fb35-4150-b499-a7a1d314fd0d" (UID: "2552e9a1-fb35-4150-b499-a7a1d314fd0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.607719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" (UID: "4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.613026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-config-data" (OuterVolumeSpecName: "config-data") pod "4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" (UID: "4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.637754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-kolla-config\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.637835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.637861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.637902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrsw\" (UniqueName: \"kubernetes.io/projected/42924886-d9f3-48df-83a3-f7e27fcf368d-kube-api-access-gcrsw\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.637942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-config-data\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638009 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638020 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638028 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638036 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm58j\" (UniqueName: \"kubernetes.io/projected/aab48af8-8468-44fb-a7c2-1db176749c05-kube-api-access-mm58j\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638046 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzjsh\" (UniqueName: \"kubernetes.io/projected/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-kube-api-access-zzjsh\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638054 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 
15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.638062 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab48af8-8468-44fb-a7c2-1db176749c05-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.640342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-kolla-config\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.640383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-config-data\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.644295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.644411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.658294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrsw\" (UniqueName: \"kubernetes.io/projected/42924886-d9f3-48df-83a3-f7e27fcf368d-kube-api-access-gcrsw\") pod \"memcached-0\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.663408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data" (OuterVolumeSpecName: "config-data") pod "2552e9a1-fb35-4150-b499-a7a1d314fd0d" (UID: "2552e9a1-fb35-4150-b499-a7a1d314fd0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.740193 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552e9a1-fb35-4150-b499-a7a1d314fd0d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.838773 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.842643 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.848862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.903525 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.938927 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.939414 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api-log" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939433 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api-log" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.939455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939461 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.939470 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab48af8-8468-44fb-a7c2-1db176749c05" containerName="nova-scheduler-scheduler" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939475 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab48af8-8468-44fb-a7c2-1db176749c05" containerName="nova-scheduler-scheduler" Jan 21 15:51:25 crc kubenswrapper[4707]: E0121 15:51:25.939506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939512 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939692 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab48af8-8468-44fb-a7c2-1db176749c05" containerName="nova-scheduler-scheduler" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939703 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939718 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.939726 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerName="cinder-api-log" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.942962 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.945610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data-custom\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5be95f6e-99a6-4607-a29e-87a1c87afd68-logs\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc79x\" (UniqueName: \"kubernetes.io/projected/5be95f6e-99a6-4607-a29e-87a1c87afd68-kube-api-access-tc79x\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-scripts\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5be95f6e-99a6-4607-a29e-87a1c87afd68-etc-machine-id\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.946574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-combined-ca-bundle\") pod \"5be95f6e-99a6-4607-a29e-87a1c87afd68\" (UID: \"5be95f6e-99a6-4607-a29e-87a1c87afd68\") " Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.947901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be95f6e-99a6-4607-a29e-87a1c87afd68-logs" (OuterVolumeSpecName: "logs") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.948932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5be95f6e-99a6-4607-a29e-87a1c87afd68-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.950539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be95f6e-99a6-4607-a29e-87a1c87afd68-kube-api-access-tc79x" (OuterVolumeSpecName: "kube-api-access-tc79x") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "kube-api-access-tc79x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.960695 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.963854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:25 crc kubenswrapper[4707]: I0121 15:51:25.963924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-scripts" (OuterVolumeSpecName: "scripts") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.005347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.028998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data" (OuterVolumeSpecName: "config-data") pod "5be95f6e-99a6-4607-a29e-87a1c87afd68" (UID: "5be95f6e-99a6-4607-a29e-87a1c87afd68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.048908 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.048997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b67658-af5d-467a-8952-22db7d822791-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-scripts\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxff\" (UniqueName: \"kubernetes.io/projected/50b67658-af5d-467a-8952-22db7d822791-kube-api-access-9pxff\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049209 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049222 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc79x\" (UniqueName: \"kubernetes.io/projected/5be95f6e-99a6-4607-a29e-87a1c87afd68-kube-api-access-tc79x\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049232 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049242 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5be95f6e-99a6-4607-a29e-87a1c87afd68-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: 
I0121 15:51:26.049261 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049270 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5be95f6e-99a6-4607-a29e-87a1c87afd68-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.049292 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5be95f6e-99a6-4607-a29e-87a1c87afd68-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.142433 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.168855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxff\" (UniqueName: \"kubernetes.io/projected/50b67658-af5d-467a-8952-22db7d822791-kube-api-access-9pxff\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.176147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.177989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.178171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b67658-af5d-467a-8952-22db7d822791-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.178315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.178385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-scripts\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.178412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") 
" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.179202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b67658-af5d-467a-8952-22db7d822791-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.183781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-scripts\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.184642 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.189376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxff\" (UniqueName: \"kubernetes.io/projected/50b67658-af5d-467a-8952-22db7d822791-kube-api-access-9pxff\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.190157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.196516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.283949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-config-data\") pod \"1376014d-80c7-43b5-b295-72d7b639df19\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.284031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-combined-ca-bundle\") pod \"1376014d-80c7-43b5-b295-72d7b639df19\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.284103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1376014d-80c7-43b5-b295-72d7b639df19-logs\") pod \"1376014d-80c7-43b5-b295-72d7b639df19\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.284148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnxzg\" (UniqueName: \"kubernetes.io/projected/1376014d-80c7-43b5-b295-72d7b639df19-kube-api-access-jnxzg\") pod \"1376014d-80c7-43b5-b295-72d7b639df19\" (UID: \"1376014d-80c7-43b5-b295-72d7b639df19\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 
15:51:26.284165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pkgv\" (UniqueName: \"kubernetes.io/projected/2391af6c-7b43-479d-9401-3212aa083d20-kube-api-access-5pkgv\") pod \"2391af6c-7b43-479d-9401-3212aa083d20\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.284205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-combined-ca-bundle\") pod \"2391af6c-7b43-479d-9401-3212aa083d20\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.284239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2391af6c-7b43-479d-9401-3212aa083d20-logs\") pod \"2391af6c-7b43-479d-9401-3212aa083d20\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.284285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-config-data\") pod \"2391af6c-7b43-479d-9401-3212aa083d20\" (UID: \"2391af6c-7b43-479d-9401-3212aa083d20\") " Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.288328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2391af6c-7b43-479d-9401-3212aa083d20-logs" (OuterVolumeSpecName: "logs") pod "2391af6c-7b43-479d-9401-3212aa083d20" (UID: "2391af6c-7b43-479d-9401-3212aa083d20"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.288716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1376014d-80c7-43b5-b295-72d7b639df19-kube-api-access-jnxzg" (OuterVolumeSpecName: "kube-api-access-jnxzg") pod "1376014d-80c7-43b5-b295-72d7b639df19" (UID: "1376014d-80c7-43b5-b295-72d7b639df19"). InnerVolumeSpecName "kube-api-access-jnxzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.288880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1376014d-80c7-43b5-b295-72d7b639df19-logs" (OuterVolumeSpecName: "logs") pod "1376014d-80c7-43b5-b295-72d7b639df19" (UID: "1376014d-80c7-43b5-b295-72d7b639df19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.292781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2391af6c-7b43-479d-9401-3212aa083d20-kube-api-access-5pkgv" (OuterVolumeSpecName: "kube-api-access-5pkgv") pod "2391af6c-7b43-479d-9401-3212aa083d20" (UID: "2391af6c-7b43-479d-9401-3212aa083d20"). InnerVolumeSpecName "kube-api-access-5pkgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.295614 4707 generic.go:334] "Generic (PLEG): container finished" podID="5be95f6e-99a6-4607-a29e-87a1c87afd68" containerID="ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e" exitCode=0 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.295668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5be95f6e-99a6-4607-a29e-87a1c87afd68","Type":"ContainerDied","Data":"ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.295694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5be95f6e-99a6-4607-a29e-87a1c87afd68","Type":"ContainerDied","Data":"65844f215e5db59219a2657c62b7637e5952d18b8f0740c2e4c4da00f1c99502"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.295710 4707 scope.go:117] "RemoveContainer" containerID="ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.295828 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.303083 4707 generic.go:334] "Generic (PLEG): container finished" podID="2391af6c-7b43-479d-9401-3212aa083d20" containerID="15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05" exitCode=0 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.303126 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2391af6c-7b43-479d-9401-3212aa083d20","Type":"ContainerDied","Data":"15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.303155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2391af6c-7b43-479d-9401-3212aa083d20","Type":"ContainerDied","Data":"752588f815a0684e7838e017eb74d9739ae359f0c9b874103a4ae3d2bf07f06d"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.303205 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.318990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" event={"ID":"7df46a58-0e69-4140-b4a6-5357eae382ef","Type":"ContainerStarted","Data":"36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.319035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" event={"ID":"7df46a58-0e69-4140-b4a6-5357eae382ef","Type":"ContainerStarted","Data":"d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.319047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" event={"ID":"7df46a58-0e69-4140-b4a6-5357eae382ef","Type":"ContainerStarted","Data":"f9481f8d2491fbee7e125a51196988eb96fe475497f75bbe51b05c1c817909c5"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.319103 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.320236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1376014d-80c7-43b5-b295-72d7b639df19" (UID: "1376014d-80c7-43b5-b295-72d7b639df19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.327315 4707 scope.go:117] "RemoveContainer" containerID="7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.330125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-config-data" (OuterVolumeSpecName: "config-data") pod "2391af6c-7b43-479d-9401-3212aa083d20" (UID: "2391af6c-7b43-479d-9401-3212aa083d20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.332418 4707 generic.go:334] "Generic (PLEG): container finished" podID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerID="478c24f9df975e5f07c1890409bc439d4190136c2d1f1024e43e81cf80e40146" exitCode=143 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.332462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" event={"ID":"62974f15-cc2c-4796-9e97-30b70cfb78b0","Type":"ContainerDied","Data":"478c24f9df975e5f07c1890409bc439d4190136c2d1f1024e43e81cf80e40146"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.333756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"17a6ace0-ecbc-47bd-acf8-9a7e76defede","Type":"ContainerDied","Data":"00d66efb976b7b6c6e56697dbb50806443c2d9d8a72bfe79e2b3691c666023f1"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.333839 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.349093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aab48af8-8468-44fb-a7c2-1db176749c05","Type":"ContainerDied","Data":"a76e64c4a69104cb639fd2d9f8d85deb4b4ba8d50e3aa36841f377eaf6937906"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.349215 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.355256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-config-data" (OuterVolumeSpecName: "config-data") pod "1376014d-80c7-43b5-b295-72d7b639df19" (UID: "1376014d-80c7-43b5-b295-72d7b639df19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.358058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2391af6c-7b43-479d-9401-3212aa083d20" (UID: "2391af6c-7b43-479d-9401-3212aa083d20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.359412 4707 generic.go:334] "Generic (PLEG): container finished" podID="1376014d-80c7-43b5-b295-72d7b639df19" containerID="b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e" exitCode=0 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.359504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1376014d-80c7-43b5-b295-72d7b639df19","Type":"ContainerDied","Data":"b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.359535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1376014d-80c7-43b5-b295-72d7b639df19","Type":"ContainerDied","Data":"927b683ff7020116a9e640c74651853bbdebbc9d515b90ad211754ca81d66a24"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.359569 4707 scope.go:117] "RemoveContainer" containerID="ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.359661 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.360149 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e\": container with ID starting with ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e not found: ID does not exist" containerID="ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.360178 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e"} err="failed to get container status \"ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e\": rpc error: code = NotFound desc = could not find container \"ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e\": container with ID starting with ca650d3f738c91502d18ad5523aa09df0c5a7a0f60e83a5024b2c90e3d596b2e not found: ID does not exist" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.360196 4707 scope.go:117] "RemoveContainer" containerID="7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.360260 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" podStartSLOduration=3.360230758 podStartE2EDuration="3.360230758s" podCreationTimestamp="2026-01-21 15:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:26.341769687 +0000 UTC m=+2983.523285909" watchObservedRunningTime="2026-01-21 15:51:26.360230758 +0000 UTC m=+2983.541746980" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.368076 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676\": container with ID starting with 7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676 not found: ID does not exist" containerID="7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.368113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676"} err="failed to get container status \"7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676\": rpc error: code = NotFound desc = could not find container \"7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676\": container with ID starting with 7b3cdf33a8ab1c93aba48407df6164a1ed989df936634a2311bf5d952bc0f676 not found: ID does not exist" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.368166 4707 scope.go:117] "RemoveContainer" containerID="15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394439 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394463 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2391af6c-7b43-479d-9401-3212aa083d20-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394472 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2391af6c-7b43-479d-9401-3212aa083d20-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394480 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394490 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1376014d-80c7-43b5-b295-72d7b639df19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394497 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1376014d-80c7-43b5-b295-72d7b639df19-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394504 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnxzg\" (UniqueName: \"kubernetes.io/projected/1376014d-80c7-43b5-b295-72d7b639df19-kube-api-access-jnxzg\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.394514 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pkgv\" (UniqueName: \"kubernetes.io/projected/2391af6c-7b43-479d-9401-3212aa083d20-kube-api-access-5pkgv\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.406867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.426316 4707 generic.go:334] "Generic (PLEG): container finished" podID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerID="d60b3a2a18c48b42bdf714d802ca2dd8c97fac6dfc1e9c9a4de48c3542d38b4c" exitCode=143 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.426474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" event={"ID":"11d963f6-7258-432b-8206-5a1893ed2ff1","Type":"ContainerDied","Data":"d60b3a2a18c48b42bdf714d802ca2dd8c97fac6dfc1e9c9a4de48c3542d38b4c"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.434572 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.443798 4707 scope.go:117] "RemoveContainer" containerID="adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.446620 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.447728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" event={"ID":"b0ff6c1a-8a03-430f-a291-4e4b94123939","Type":"ContainerStarted","Data":"c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.447762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" event={"ID":"b0ff6c1a-8a03-430f-a291-4e4b94123939","Type":"ContainerStarted","Data":"1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.448783 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.461902 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.462328 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-api" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462346 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-api" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.462370 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-metadata" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462376 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-metadata" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.462396 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-log" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-log" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.462415 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-log" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462421 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-log" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462600 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-log" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-log" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462633 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1376014d-80c7-43b5-b295-72d7b639df19" containerName="nova-api-api" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462644 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2391af6c-7b43-479d-9401-3212aa083d20" containerName="nova-metadata-metadata" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.462998 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.463468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb","Type":"ContainerDied","Data":"c4e432540931c8225bb3f60778328467daf79bc620c912477b3167314de7423d"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.463570 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.471426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.480686 4707 generic.go:334] "Generic (PLEG): container finished" podID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerID="839050d1503bd45e328be8023d300c3011c332e41ef788d6aee897ee48f7883e" exitCode=0 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.481515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" event={"ID":"9caaf11c-be1e-48d5-976d-d6960e5288df","Type":"ContainerDied","Data":"839050d1503bd45e328be8023d300c3011c332e41ef788d6aee897ee48f7883e"} Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.484850 4707 scope.go:117] "RemoveContainer" containerID="15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.486104 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05\": container with ID starting with 15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05 not found: ID does not exist" containerID="15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.486130 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05"} err="failed to get container status \"15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05\": rpc error: code = NotFound desc = could not find container \"15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05\": container with ID starting with 15a7a247e9999b0ecb0c1fccbe710ec51a88466dde19527bad3f7cad94db0e05 not found: ID does not exist" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.486143 4707 scope.go:117] "RemoveContainer" containerID="adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.486991 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1\": container with ID starting with adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1 not found: ID does not exist" 
containerID="adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.487031 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1"} err="failed to get container status \"adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1\": rpc error: code = NotFound desc = could not find container \"adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1\": container with ID starting with adb5d350f62b4d7ad2e9a0f639a07cf67c7ea3d603a29ca41e1735e3384e53a1 not found: ID does not exist" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.487055 4707 scope.go:117] "RemoveContainer" containerID="49ab48348e2c3da66f3916afef13af7e4060b40cf8411906a27a3ada9541b590" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.488287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f732f500-3538-4a30-8502-fbd2451b7bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496117 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xfc\" (UniqueName: \"kubernetes.io/projected/f732f500-3538-4a30-8502-fbd2451b7bdd-kube-api-access-p8xfc\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f732f500-3538-4a30-8502-fbd2451b7bdd-logs\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.496464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 
15:51:26.496534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-scripts\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.520791 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.541942 4707 scope.go:117] "RemoveContainer" containerID="0f5aa18758d1418e6aca5a3d0f63afa17f1fe0cc53e46470f383109d0907bb6f" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.542079 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.547963 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.571971 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.573367 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.575158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.578922 4707 scope.go:117] "RemoveContainer" containerID="b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.595573 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598469 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6jf\" (UniqueName: \"kubernetes.io/projected/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-kube-api-access-qs6jf\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f732f500-3538-4a30-8502-fbd2451b7bdd-logs\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-config-data\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-scripts\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f732f500-3538-4a30-8502-fbd2451b7bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.598955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xfc\" (UniqueName: \"kubernetes.io/projected/f732f500-3538-4a30-8502-fbd2451b7bdd-kube-api-access-p8xfc\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.601101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f732f500-3538-4a30-8502-fbd2451b7bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.601614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f732f500-3538-4a30-8502-fbd2451b7bdd-logs\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.603182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.603338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 
15:51:26.604080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.605467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-scripts\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.607123 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.613386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xfc\" (UniqueName: \"kubernetes.io/projected/f732f500-3538-4a30-8502-fbd2451b7bdd-kube-api-access-p8xfc\") pod \"cinder-api-0\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.614015 4707 scope.go:117] "RemoveContainer" containerID="b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.615733 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.622198 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.625790 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.630566 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.641963 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.654659 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.657569 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.662055 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.670639 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" podStartSLOduration=3.670617521 podStartE2EDuration="3.670617521s" podCreationTimestamp="2026-01-21 15:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:26.4716857 +0000 UTC m=+2983.653201922" watchObservedRunningTime="2026-01-21 15:51:26.670617521 +0000 UTC m=+2983.852133733" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.683936 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.694535 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.699790 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.712996 4707 scope.go:117] "RemoveContainer" containerID="b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-config-data\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brklh\" (UniqueName: \"kubernetes.io/projected/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-kube-api-access-brklh\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tx9b\" (UniqueName: \"kubernetes.io/projected/56f64f3c-5f52-42fa-a99d-1312c645ba3a-kube-api-access-4tx9b\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f64f3c-5f52-42fa-a99d-1312c645ba3a-logs\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 
15:51:26.713438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6jf\" (UniqueName: \"kubernetes.io/projected/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-kube-api-access-qs6jf\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-config-data\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.713678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.718419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: E0121 15:51:26.720583 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e\": container with ID starting with b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e not found: ID does not exist" containerID="b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.720648 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e"} err="failed to get container status \"b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e\": rpc error: code = NotFound desc = could not find container \"b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e\": container with ID starting with b00f53a8284be6e133ce7577949ea25155490bf750b5f9c5951ac59af263369e not found: ID does not exist" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.720679 4707 scope.go:117] "RemoveContainer" containerID="b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93" Jan 21 15:51:26 crc kubenswrapper[4707]: 
E0121 15:51:26.726966 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93\": container with ID starting with b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93 not found: ID does not exist" containerID="b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.727016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93"} err="failed to get container status \"b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93\": rpc error: code = NotFound desc = could not find container \"b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93\": container with ID starting with b1adfb6dfd5099ba628c3b8375ae1d91f09098017e58be4f25809df040477c93 not found: ID does not exist" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.727046 4707 scope.go:117] "RemoveContainer" containerID="abd81aaba2b15dad392c224106e08baa80499773a6b162882b82de584546de2c" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.741275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6jf\" (UniqueName: \"kubernetes.io/projected/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-kube-api-access-qs6jf\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.752347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-config-data\") pod \"nova-scheduler-0\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.755223 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.762687 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.766736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.775312 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.791229 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.794474 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.812219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tx9b\" (UniqueName: \"kubernetes.io/projected/56f64f3c-5f52-42fa-a99d-1312c645ba3a-kube-api-access-4tx9b\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68597\" (UniqueName: \"kubernetes.io/projected/180a2ca9-2ef5-4504-a517-e95c9b914238-kube-api-access-68597\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f64f3c-5f52-42fa-a99d-1312c645ba3a-logs\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 
15:51:26.815907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-config-data\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.815928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brklh\" (UniqueName: \"kubernetes.io/projected/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-kube-api-access-brklh\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.816326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f64f3c-5f52-42fa-a99d-1312c645ba3a-logs\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.821308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-config-data\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.821366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.821774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.828472 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.839699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.843628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tx9b\" (UniqueName: \"kubernetes.io/projected/56f64f3c-5f52-42fa-a99d-1312c645ba3a-kube-api-access-4tx9b\") pod \"nova-api-0\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.849274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brklh\" (UniqueName: \"kubernetes.io/projected/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-kube-api-access-brklh\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.849320 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.850988 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.852621 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.860915 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.878867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.879172 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-central-agent" containerID="cri-o://e98a6f97c0c59f1457887797ba63a8cd20695ce6c8be90bf06b9bd633a27d4b5" gracePeriod=30 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.881937 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="proxy-httpd" containerID="cri-o://e6bf0d0ee323fd65232e57a4a6ac7b388eb8e7c5bca6ca6b043f495165ea7d8b" gracePeriod=30 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.882078 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-notification-agent" containerID="cri-o://57135fba45aec0028e942e949147ddc31decf31eea83c9d70afc2ddec4bea450" gracePeriod=30 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.882132 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="sg-core" containerID="cri-o://fea7125a871d1415c93116c48d75a9581bfa690f1b86118e2d270e7df411388f" gracePeriod=30 Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.898574 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68597\" (UniqueName: \"kubernetes.io/projected/180a2ca9-2ef5-4504-a517-e95c9b914238-kube-api-access-68597\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r649\" (UniqueName: \"kubernetes.io/projected/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-kube-api-access-8r649\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-config-data\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-logs\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.918383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.924506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.928262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.952486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68597\" (UniqueName: \"kubernetes.io/projected/180a2ca9-2ef5-4504-a517-e95c9b914238-kube-api-access-68597\") pod \"nova-cell0-conductor-0\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.954790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:26 crc kubenswrapper[4707]: I0121 15:51:26.975629 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.002048 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.019696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-config-data\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.019737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-logs\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.019757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.020354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r649\" (UniqueName: \"kubernetes.io/projected/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-kube-api-access-8r649\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.021540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-logs\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.033037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-config-data\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.034385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 
15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.037401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r649\" (UniqueName: \"kubernetes.io/projected/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-kube-api-access-8r649\") pod \"nova-metadata-0\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.055700 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.100771 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.204433 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1376014d-80c7-43b5-b295-72d7b639df19" path="/var/lib/kubelet/pods/1376014d-80c7-43b5-b295-72d7b639df19/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.205575 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" path="/var/lib/kubelet/pods/17a6ace0-ecbc-47bd-acf8-9a7e76defede/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.206132 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2391af6c-7b43-479d-9401-3212aa083d20" path="/var/lib/kubelet/pods/2391af6c-7b43-479d-9401-3212aa083d20/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.207148 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2552e9a1-fb35-4150-b499-a7a1d314fd0d" path="/var/lib/kubelet/pods/2552e9a1-fb35-4150-b499-a7a1d314fd0d/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.207745 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373d78b4-4328-42ce-8d54-6fb1c0ecefdb" path="/var/lib/kubelet/pods/373d78b4-4328-42ce-8d54-6fb1c0ecefdb/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.208190 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb" path="/var/lib/kubelet/pods/4e0f6b31-9752-42cb-90d2-6fa1c2f58fcb/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.209190 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be95f6e-99a6-4607-a29e-87a1c87afd68" path="/var/lib/kubelet/pods/5be95f6e-99a6-4607-a29e-87a1c87afd68/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.209751 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab48af8-8468-44fb-a7c2-1db176749c05" path="/var/lib/kubelet/pods/aab48af8-8468-44fb-a7c2-1db176749c05/volumes" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.331335 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.116:5672: connect: connection refused" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.335734 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:27 crc kubenswrapper[4707]: W0121 15:51:27.351485 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf732f500_3538_4a30_8502_fbd2451b7bdd.slice/crio-dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b 
WatchSource:0}: Error finding container dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b: Status 404 returned error can't find the container with id dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.429924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:27 crc kubenswrapper[4707]: W0121 15:51:27.439641 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4fc3f0d_81ac_4f0b_ac7e_f2c4e65c17cf.slice/crio-07fddb2db53c72120d44d7da47def74b309b175026182a2b3e72ba43ca61c998 WatchSource:0}: Error finding container 07fddb2db53c72120d44d7da47def74b309b175026182a2b3e72ba43ca61c998: Status 404 returned error can't find the container with id 07fddb2db53c72120d44d7da47def74b309b175026182a2b3e72ba43ca61c998 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.504601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf","Type":"ContainerStarted","Data":"07fddb2db53c72120d44d7da47def74b309b175026182a2b3e72ba43ca61c998"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.510244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"50b67658-af5d-467a-8952-22db7d822791","Type":"ContainerStarted","Data":"1c5625e0ae45c6b65a76118c05d45f9e9772b7f470907a731317f746340325e9"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.538731 4707 generic.go:334] "Generic (PLEG): container finished" podID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerID="e6bf0d0ee323fd65232e57a4a6ac7b388eb8e7c5bca6ca6b043f495165ea7d8b" exitCode=0 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.538763 4707 generic.go:334] "Generic (PLEG): container finished" podID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerID="fea7125a871d1415c93116c48d75a9581bfa690f1b86118e2d270e7df411388f" exitCode=2 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.538771 4707 generic.go:334] "Generic (PLEG): container finished" podID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerID="e98a6f97c0c59f1457887797ba63a8cd20695ce6c8be90bf06b9bd633a27d4b5" exitCode=0 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.538831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerDied","Data":"e6bf0d0ee323fd65232e57a4a6ac7b388eb8e7c5bca6ca6b043f495165ea7d8b"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.538857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerDied","Data":"fea7125a871d1415c93116c48d75a9581bfa690f1b86118e2d270e7df411388f"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.538866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerDied","Data":"e98a6f97c0c59f1457887797ba63a8cd20695ce6c8be90bf06b9bd633a27d4b5"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.542015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" 
event={"ID":"f732f500-3538-4a30-8502-fbd2451b7bdd","Type":"ContainerStarted","Data":"dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.544317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"42924886-d9f3-48df-83a3-f7e27fcf368d","Type":"ContainerStarted","Data":"7bc461ffcee7909ae3ce1f7eee5798564341473d88bb5e5dc4e8f44fc909fd32"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.544345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"42924886-d9f3-48df-83a3-f7e27fcf368d","Type":"ContainerStarted","Data":"87499cfea8bda08a718a940fd3a7bf071af4b38a1b2f3699e8ea828a649bf96f"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.544452 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.546817 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.549645 4707 generic.go:334] "Generic (PLEG): container finished" podID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerID="d92766c25e2a6639816ebec6fda095e760a14a87d4377f309301e1ed97c1a7c9" exitCode=0 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.550565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f02894e7-9ae7-40df-a2d1-c9074494a383","Type":"ContainerDied","Data":"d92766c25e2a6639816ebec6fda095e760a14a87d4377f309301e1ed97c1a7c9"} Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.560310 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.56029509 podStartE2EDuration="2.56029509s" podCreationTimestamp="2026-01-21 15:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:27.559738354 +0000 UTC m=+2984.741254575" watchObservedRunningTime="2026-01-21 15:51:27.56029509 +0000 UTC m=+2984.741811313" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.571900 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.117:5672: connect: connection refused" Jan 21 15:51:27 crc kubenswrapper[4707]: W0121 15:51:27.598181 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56f64f3c_5f52_42fa_a99d_1312c645ba3a.slice/crio-30c32ec6ea4be9c189b6ca697dec69a66e60a1449227c88bc04321c26726af4a WatchSource:0}: Error finding container 30c32ec6ea4be9c189b6ca697dec69a66e60a1449227c88bc04321c26726af4a: Status 404 returned error can't find the container with id 30c32ec6ea4be9c189b6ca697dec69a66e60a1449227c88bc04321c26726af4a Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.743287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:51:27 crc kubenswrapper[4707]: W0121 15:51:27.745248 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd22d4ba8_ba68_4fe0_9350_d5eec13eb1e3.slice/crio-7a1ba7165cab0d41cd16c92671069652c862d29e1abefcd71c79ace8d56c0e79 WatchSource:0}: Error finding container 7a1ba7165cab0d41cd16c92671069652c862d29e1abefcd71c79ace8d56c0e79: Status 404 returned error can't find the container with id 7a1ba7165cab0d41cd16c92671069652c862d29e1abefcd71c79ace8d56c0e79 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.749645 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:51:27 crc kubenswrapper[4707]: W0121 15:51:27.750289 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c7f376c_03ea_47f2_9f24_e7bf986f83aa.slice/crio-b556db42b77c3fd21fae8ff161f19f49c10fa830b335d6ae17f4c38473d90571 WatchSource:0}: Error finding container b556db42b77c3fd21fae8ff161f19f49c10fa830b335d6ae17f4c38473d90571: Status 404 returned error can't find the container with id b556db42b77c3fd21fae8ff161f19f49c10fa830b335d6ae17f4c38473d90571 Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.750523 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.866414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-config-data\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q54ht\" (UniqueName: \"kubernetes.io/projected/f02894e7-9ae7-40df-a2d1-c9074494a383-kube-api-access-q54ht\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-logs\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-httpd-run\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-scripts\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 
15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.952239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-combined-ca-bundle\") pod \"f02894e7-9ae7-40df-a2d1-c9074494a383\" (UID: \"f02894e7-9ae7-40df-a2d1-c9074494a383\") " Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.953219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.953427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-logs" (OuterVolumeSpecName: "logs") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.959987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02894e7-9ae7-40df-a2d1-c9074494a383-kube-api-access-q54ht" (OuterVolumeSpecName: "kube-api-access-q54ht") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "kube-api-access-q54ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.960515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-scripts" (OuterVolumeSpecName: "scripts") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.960952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:27 crc kubenswrapper[4707]: I0121 15:51:27.987639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.008017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-config-data" (OuterVolumeSpecName: "config-data") pod "f02894e7-9ae7-40df-a2d1-c9074494a383" (UID: "f02894e7-9ae7-40df-a2d1-c9074494a383"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053536 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053564 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053592 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053602 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f02894e7-9ae7-40df-a2d1-c9074494a383-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053611 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q54ht\" (UniqueName: \"kubernetes.io/projected/f02894e7-9ae7-40df-a2d1-c9074494a383-kube-api-access-q54ht\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053619 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.053626 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f02894e7-9ae7-40df-a2d1-c9074494a383-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.075442 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.155751 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.264991 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.408846 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-675d9b6fc4-hjdm8"] Jan 21 15:51:28 crc kubenswrapper[4707]: E0121 15:51:28.409269 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-log" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409287 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-log" Jan 21 15:51:28 crc kubenswrapper[4707]: E0121 15:51:28.409310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-log" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409316 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-log" Jan 21 15:51:28 crc kubenswrapper[4707]: E0121 15:51:28.409334 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-httpd" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409341 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-httpd" Jan 21 15:51:28 crc kubenswrapper[4707]: E0121 15:51:28.409358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-httpd" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-httpd" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409572 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-httpd" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409596 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-log" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409607 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" containerName="glance-httpd" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.409624 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerName="glance-log" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.410712 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.416499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.417418 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.446935 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-675d9b6fc4-hjdm8"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp5mb\" (UniqueName: \"kubernetes.io/projected/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-kube-api-access-hp5mb\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-combined-ca-bundle\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-config-data\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-scripts\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-httpd-run\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.480672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-logs\") pod \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\" (UID: \"0687c9e1-7ccd-454c-8ffd-8507cb8af46c\") " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.481348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-logs" (OuterVolumeSpecName: "logs") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.483992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-kube-api-access-hp5mb" (OuterVolumeSpecName: "kube-api-access-hp5mb") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "kube-api-access-hp5mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.484231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.494234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.494912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-scripts" (OuterVolumeSpecName: "scripts") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.533996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.549952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-config-data" (OuterVolumeSpecName: "config-data") pod "0687c9e1-7ccd-454c-8ffd-8507cb8af46c" (UID: "0687c9e1-7ccd-454c-8ffd-8507cb8af46c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.565160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"180a2ca9-2ef5-4504-a517-e95c9b914238","Type":"ContainerStarted","Data":"3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.565222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"180a2ca9-2ef5-4504-a517-e95c9b914238","Type":"ContainerStarted","Data":"112949b9c43e255583e54af6daa6ba9bff3b55b2278589d922735d3fc69f00c0"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.566860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.577362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3","Type":"ContainerStarted","Data":"dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.577400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3","Type":"ContainerStarted","Data":"7a1ba7165cab0d41cd16c92671069652c862d29e1abefcd71c79ace8d56c0e79"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.580866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"5c7f376c-03ea-47f2-9f24-e7bf986f83aa","Type":"ContainerStarted","Data":"dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.580896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"5c7f376c-03ea-47f2-9f24-e7bf986f83aa","Type":"ContainerStarted","Data":"b556db42b77c3fd21fae8ff161f19f49c10fa830b335d6ae17f4c38473d90571"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.582794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-public-tls-certs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.582838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-internal-tls-certs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.582865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-combined-ca-bundle\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.582899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-config-data\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.582973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmhw7\" (UniqueName: \"kubernetes.io/projected/f597d137-18da-4335-a15b-26e45a8c7b7d-kube-api-access-rmhw7\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.582997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-scripts\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f597d137-18da-4335-a15b-26e45a8c7b7d-logs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583054 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583064 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583072 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583079 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp5mb\" (UniqueName: \"kubernetes.io/projected/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-kube-api-access-hp5mb\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583088 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583095 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0687c9e1-7ccd-454c-8ffd-8507cb8af46c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583112 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.583650 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.5836402769999998 
podStartE2EDuration="2.583640277s" podCreationTimestamp="2026-01-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:28.57907345 +0000 UTC m=+2985.760589673" watchObservedRunningTime="2026-01-21 15:51:28.583640277 +0000 UTC m=+2985.765156499" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.588561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f02894e7-9ae7-40df-a2d1-c9074494a383","Type":"ContainerDied","Data":"a3650837a2001d12774627e50c73d718a1688b24b0568b4f1d9e45caa320c2bc"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.588596 4707 scope.go:117] "RemoveContainer" containerID="d92766c25e2a6639816ebec6fda095e760a14a87d4377f309301e1ed97c1a7c9" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.588701 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.601161 4707 generic.go:334] "Generic (PLEG): container finished" podID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" containerID="dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e" exitCode=0 Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.601210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0687c9e1-7ccd-454c-8ffd-8507cb8af46c","Type":"ContainerDied","Data":"dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.601226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0687c9e1-7ccd-454c-8ffd-8507cb8af46c","Type":"ContainerDied","Data":"74412d0b57753166d229bd381eff8c0987a5399eb4f4032d1479c0fe828f3a12"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.601284 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.611131 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.611116108 podStartE2EDuration="2.611116108s" podCreationTimestamp="2026-01-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:28.607212569 +0000 UTC m=+2985.788728791" watchObservedRunningTime="2026-01-21 15:51:28.611116108 +0000 UTC m=+2985.792632330" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.617900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"50b67658-af5d-467a-8952-22db7d822791","Type":"ContainerStarted","Data":"cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.631019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.640216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"56f64f3c-5f52-42fa-a99d-1312c645ba3a","Type":"ContainerStarted","Data":"63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.640273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"56f64f3c-5f52-42fa-a99d-1312c645ba3a","Type":"ContainerStarted","Data":"7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.640285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"56f64f3c-5f52-42fa-a99d-1312c645ba3a","Type":"ContainerStarted","Data":"30c32ec6ea4be9c189b6ca697dec69a66e60a1449227c88bc04321c26726af4a"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.643944 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.653684 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.660613 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.663459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f732f500-3538-4a30-8502-fbd2451b7bdd","Type":"ContainerStarted","Data":"b5259bb32f8e7a63c7f6470d92529c9c2b2d3d96ec7f8b2439aab3601cdc6538"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.663563 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.665276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.665619 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.666001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.673565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf","Type":"ContainerStarted","Data":"a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c"} Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.677107 4707 scope.go:117] "RemoveContainer" containerID="a61916f8e83ea502f8075b4d59765ed8f01db90f73b4f184cfeafe8b58ce0fda" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.679845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-config-data\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmhw7\" (UniqueName: \"kubernetes.io/projected/f597d137-18da-4335-a15b-26e45a8c7b7d-kube-api-access-rmhw7\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-scripts\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f597d137-18da-4335-a15b-26e45a8c7b7d-logs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-public-tls-certs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-internal-tls-certs\") pod \"placement-675d9b6fc4-hjdm8\" 
(UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-combined-ca-bundle\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.685699 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.688049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f597d137-18da-4335-a15b-26e45a8c7b7d-logs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.697705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-scripts\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.704370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-combined-ca-bundle\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.712390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-config-data\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.721284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-public-tls-certs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.721336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmhw7\" (UniqueName: \"kubernetes.io/projected/f597d137-18da-4335-a15b-26e45a8c7b7d-kube-api-access-rmhw7\") pod \"placement-675d9b6fc4-hjdm8\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.722143 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.726214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-internal-tls-certs\") pod \"placement-675d9b6fc4-hjdm8\" (UID: 
\"f597d137-18da-4335-a15b-26e45a8c7b7d\") " pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.747267 4707 scope.go:117] "RemoveContainer" containerID="dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.751505 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.771541 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.786938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-scripts\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.787056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.787217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-config-data\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.787264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-logs\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.787281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.787298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hg99\" (UniqueName: \"kubernetes.io/projected/32b3543f-a454-4d1e-a820-031c05bc1aad-kube-api-access-8hg99\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.787338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 
15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.789008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.790566 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.792961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.797552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.800860 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.800843973 podStartE2EDuration="2.800843973s" podCreationTimestamp="2026-01-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:28.737247425 +0000 UTC m=+2985.918763647" watchObservedRunningTime="2026-01-21 15:51:28.800843973 +0000 UTC m=+2985.982360195" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.804647 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.804639318 podStartE2EDuration="2.804639318s" podCreationTimestamp="2026-01-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:28.768968359 +0000 UTC m=+2985.950484581" watchObservedRunningTime="2026-01-21 15:51:28.804639318 +0000 UTC m=+2985.986155541" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.822211 4707 scope.go:117] "RemoveContainer" containerID="4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-logs\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hg99\" (UniqueName: \"kubernetes.io/projected/32b3543f-a454-4d1e-a820-031c05bc1aad-kube-api-access-8hg99\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hgf\" (UniqueName: \"kubernetes.io/projected/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-kube-api-access-c4hgf\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-scripts\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.892874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.893283 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.899872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-logs\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.900192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.917023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.928734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-config-data\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.948286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-scripts\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.952359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hg99\" (UniqueName: \"kubernetes.io/projected/32b3543f-a454-4d1e-a820-031c05bc1aad-kube-api-access-8hg99\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.975416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:28 crc kubenswrapper[4707]: I0121 15:51:28.990730 4707 scope.go:117] "RemoveContainer" containerID="dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hgf\" (UniqueName: \"kubernetes.io/projected/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-kube-api-access-c4hgf\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.995736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.996160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc 
kubenswrapper[4707]: I0121 15:51:28.996302 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: E0121 15:51:28.996905 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e\": container with ID starting with dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e not found: ID does not exist" containerID="dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.996937 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e"} err="failed to get container status \"dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e\": rpc error: code = NotFound desc = could not find container \"dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e\": container with ID starting with dedc6870736ac34a79239ee4318568b77ff85c922752894e98fee58d07500d3e not found: ID does not exist" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.996958 4707 scope.go:117] "RemoveContainer" containerID="4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:28.997308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.001691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.011874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.014924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hgf\" (UniqueName: \"kubernetes.io/projected/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-kube-api-access-c4hgf\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: E0121 15:51:29.017041 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8\": container with ID starting with 4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8 not found: ID 
does not exist" containerID="4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.017076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8"} err="failed to get container status \"4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8\": rpc error: code = NotFound desc = could not find container \"4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8\": container with ID starting with 4c1bd33876f9839f3f4eae12c7f122913635a810e3cd725de6589a64553421b8 not found: ID does not exist" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.025049 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.034539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.076298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.117471 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.203842 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0687c9e1-7ccd-454c-8ffd-8507cb8af46c" path="/var/lib/kubelet/pods/0687c9e1-7ccd-454c-8ffd-8507cb8af46c/volumes" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.204919 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f02894e7-9ae7-40df-a2d1-c9074494a383" path="/var/lib/kubelet/pods/f02894e7-9ae7-40df-a2d1-c9074494a383/volumes" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.356767 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-675d9b6fc4-hjdm8"] Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.578934 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.705868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3","Type":"ContainerStarted","Data":"e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753"} Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.717690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" event={"ID":"f597d137-18da-4335-a15b-26e45a8c7b7d","Type":"ContainerStarted","Data":"bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b"} Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.717723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" 
event={"ID":"f597d137-18da-4335-a15b-26e45a8c7b7d","Type":"ContainerStarted","Data":"6883ae3e94d4689d3f66d1675edc4e0ff53fe7704471f99e94fd22a6d7df861f"} Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.728841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"32b3543f-a454-4d1e-a820-031c05bc1aad","Type":"ContainerStarted","Data":"7a6723b6c83abadbb1ed073b496be307022ffbe048e442d06a3d97ae1e346d51"} Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.732997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.739723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f732f500-3538-4a30-8502-fbd2451b7bdd","Type":"ContainerStarted","Data":"8155b26de07fb92fa6d4dac952758954f8d13061ae5932a9172aed2e4c3b2d3e"} Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.739966 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.740068 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.740059927 podStartE2EDuration="3.740059927s" podCreationTimestamp="2026-01-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:29.722284153 +0000 UTC m=+2986.903800375" watchObservedRunningTime="2026-01-21 15:51:29.740059927 +0000 UTC m=+2986.921576148" Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.746319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"50b67658-af5d-467a-8952-22db7d822791","Type":"ContainerStarted","Data":"be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032"} Jan 21 15:51:29 crc kubenswrapper[4707]: I0121 15:51:29.754777 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.754763052 podStartE2EDuration="3.754763052s" podCreationTimestamp="2026-01-21 15:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:29.751785883 +0000 UTC m=+2986.933302105" watchObservedRunningTime="2026-01-21 15:51:29.754763052 +0000 UTC m=+2986.936279274" Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.171701 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="17a6ace0-ecbc-47bd-acf8-9a7e76defede" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.188:6080/vnc_lite.html\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.778246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"383aecaf-d0c4-4d44-9df7-04e2ea1035b5","Type":"ContainerStarted","Data":"4de6361043002e66d2412937576703f188e896ab813880f6a1980a4ba39cabb6"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.778773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"383aecaf-d0c4-4d44-9df7-04e2ea1035b5","Type":"ContainerStarted","Data":"3ab91a15f3e1fefd6ddebc2a4a8e5b5b0f3bbfa8cd395a8ba823c2b7880b4ee3"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.796740 4707 generic.go:334] "Generic (PLEG): container finished" podID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerID="57135fba45aec0028e942e949147ddc31decf31eea83c9d70afc2ddec4bea450" exitCode=0 Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.796838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerDied","Data":"57135fba45aec0028e942e949147ddc31decf31eea83c9d70afc2ddec4bea450"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.801291 4707 generic.go:334] "Generic (PLEG): container finished" podID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerID="436bcbab1f25e15f46246ecc8ede16a55da48c1cd2e3c46c116436928b337639" exitCode=0 Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.801368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" event={"ID":"11d963f6-7258-432b-8206-5a1893ed2ff1","Type":"ContainerDied","Data":"436bcbab1f25e15f46246ecc8ede16a55da48c1cd2e3c46c116436928b337639"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.804702 4707 generic.go:334] "Generic (PLEG): container finished" podID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerID="32edadedef2c315e6d4653ad9f20b50bc87635b5eb5275fb1e900ad9ba493403" exitCode=0 Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.805592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"bde702b1-7406-4751-aaf5-026fb8e348c4","Type":"ContainerDied","Data":"32edadedef2c315e6d4653ad9f20b50bc87635b5eb5275fb1e900ad9ba493403"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.811287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" event={"ID":"f597d137-18da-4335-a15b-26e45a8c7b7d","Type":"ContainerStarted","Data":"bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.811446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.816231 4707 generic.go:334] "Generic (PLEG): container finished" podID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerID="1e13dc0dd446435fd65bf4d3e47bff260f5012e7a18801585e384b29e9417f69" exitCode=0 Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.816293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" event={"ID":"62974f15-cc2c-4796-9e97-30b70cfb78b0","Type":"ContainerDied","Data":"1e13dc0dd446435fd65bf4d3e47bff260f5012e7a18801585e384b29e9417f69"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.820989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"32b3543f-a454-4d1e-a820-031c05bc1aad","Type":"ContainerStarted","Data":"843277c982352a8f067187b925a5f54dbba4844c3c191d8a0ca6015d0706fc05"} Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.839109 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=5.839091722 podStartE2EDuration="5.839091722s" 
podCreationTimestamp="2026-01-21 15:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:29.78333294 +0000 UTC m=+2986.964849162" watchObservedRunningTime="2026-01-21 15:51:30.839091722 +0000 UTC m=+2988.020607944" Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.842638 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" podStartSLOduration=2.842622961 podStartE2EDuration="2.842622961s" podCreationTimestamp="2026-01-21 15:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:30.825753783 +0000 UTC m=+2988.007270006" watchObservedRunningTime="2026-01-21 15:51:30.842622961 +0000 UTC m=+2988.024139184" Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.884060 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.884043423 podStartE2EDuration="2.884043423s" podCreationTimestamp="2026-01-21 15:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:30.85317334 +0000 UTC m=+2988.034689562" watchObservedRunningTime="2026-01-21 15:51:30.884043423 +0000 UTC m=+2988.065559645" Jan 21 15:51:30 crc kubenswrapper[4707]: I0121 15:51:30.954139 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.078034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d963f6-7258-432b-8206-5a1893ed2ff1-logs\") pod \"11d963f6-7258-432b-8206-5a1893ed2ff1\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.078140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data\") pod \"11d963f6-7258-432b-8206-5a1893ed2ff1\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.078245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data-custom\") pod \"11d963f6-7258-432b-8206-5a1893ed2ff1\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.078442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qv7c\" (UniqueName: \"kubernetes.io/projected/11d963f6-7258-432b-8206-5a1893ed2ff1-kube-api-access-9qv7c\") pod \"11d963f6-7258-432b-8206-5a1893ed2ff1\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.078465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-combined-ca-bundle\") pod \"11d963f6-7258-432b-8206-5a1893ed2ff1\" (UID: \"11d963f6-7258-432b-8206-5a1893ed2ff1\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.084345 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d963f6-7258-432b-8206-5a1893ed2ff1-logs" (OuterVolumeSpecName: "logs") pod "11d963f6-7258-432b-8206-5a1893ed2ff1" (UID: "11d963f6-7258-432b-8206-5a1893ed2ff1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.096904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "11d963f6-7258-432b-8206-5a1893ed2ff1" (UID: "11d963f6-7258-432b-8206-5a1893ed2ff1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.101131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d963f6-7258-432b-8206-5a1893ed2ff1-kube-api-access-9qv7c" (OuterVolumeSpecName: "kube-api-access-9qv7c") pod "11d963f6-7258-432b-8206-5a1893ed2ff1" (UID: "11d963f6-7258-432b-8206-5a1893ed2ff1"). InnerVolumeSpecName "kube-api-access-9qv7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.133471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11d963f6-7258-432b-8206-5a1893ed2ff1" (UID: "11d963f6-7258-432b-8206-5a1893ed2ff1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.177261 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data" (OuterVolumeSpecName: "config-data") pod "11d963f6-7258-432b-8206-5a1893ed2ff1" (UID: "11d963f6-7258-432b-8206-5a1893ed2ff1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.180467 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qv7c\" (UniqueName: \"kubernetes.io/projected/11d963f6-7258-432b-8206-5a1893ed2ff1-kube-api-access-9qv7c\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.180488 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.180499 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d963f6-7258-432b-8206-5a1893ed2ff1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.180511 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.180518 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d963f6-7258-432b-8206-5a1893ed2ff1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.210515 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.215367 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.224291 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.397877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62974f15-cc2c-4796-9e97-30b70cfb78b0-logs\") pod \"62974f15-cc2c-4796-9e97-30b70cfb78b0\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glck5\" (UniqueName: \"kubernetes.io/projected/466deb02-8b2c-4c82-a0ef-fc3f719f321a-kube-api-access-glck5\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-plugins-conf\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data\") pod \"62974f15-cc2c-4796-9e97-30b70cfb78b0\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data-custom\") pod \"62974f15-cc2c-4796-9e97-30b70cfb78b0\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-erlang-cookie\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-scripts\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-server-conf\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-confd\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: 
\"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-combined-ca-bundle\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-plugins\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-log-httpd\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-sg-core-conf-yaml\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bde702b1-7406-4751-aaf5-026fb8e348c4-erlang-cookie-secret\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bde702b1-7406-4751-aaf5-026fb8e348c4-pod-info\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-combined-ca-bundle\") pod \"62974f15-cc2c-4796-9e97-30b70cfb78b0\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-run-httpd\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62974f15-cc2c-4796-9e97-30b70cfb78b0-logs" (OuterVolumeSpecName: "logs") pod "62974f15-cc2c-4796-9e97-30b70cfb78b0" (UID: "62974f15-cc2c-4796-9e97-30b70cfb78b0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s8jh\" (UniqueName: \"kubernetes.io/projected/62974f15-cc2c-4796-9e97-30b70cfb78b0-kube-api-access-8s8jh\") pod \"62974f15-cc2c-4796-9e97-30b70cfb78b0\" (UID: \"62974f15-cc2c-4796-9e97-30b70cfb78b0\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-config-data\") pod \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\" (UID: \"466deb02-8b2c-4c82-a0ef-fc3f719f321a\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bm4k\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-kube-api-access-2bm4k\") pod \"bde702b1-7406-4751-aaf5-026fb8e348c4\" (UID: \"bde702b1-7406-4751-aaf5-026fb8e348c4\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.398943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.399239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.399550 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62974f15-cc2c-4796-9e97-30b70cfb78b0-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.399563 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.399577 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.400421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.400696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.401067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.406907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bde702b1-7406-4751-aaf5-026fb8e348c4-pod-info" (OuterVolumeSpecName: "pod-info") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.407017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466deb02-8b2c-4c82-a0ef-fc3f719f321a-kube-api-access-glck5" (OuterVolumeSpecName: "kube-api-access-glck5") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "kube-api-access-glck5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.413921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62974f15-cc2c-4796-9e97-30b70cfb78b0" (UID: "62974f15-cc2c-4796-9e97-30b70cfb78b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.414031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-scripts" (OuterVolumeSpecName: "scripts") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.415015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62974f15-cc2c-4796-9e97-30b70cfb78b0-kube-api-access-8s8jh" (OuterVolumeSpecName: "kube-api-access-8s8jh") pod "62974f15-cc2c-4796-9e97-30b70cfb78b0" (UID: "62974f15-cc2c-4796-9e97-30b70cfb78b0"). InnerVolumeSpecName "kube-api-access-8s8jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.416594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "persistence") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.420933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde702b1-7406-4751-aaf5-026fb8e348c4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.420956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-kube-api-access-2bm4k" (OuterVolumeSpecName: "kube-api-access-2bm4k") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "kube-api-access-2bm4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.435548 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.440055 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.450396 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62974f15-cc2c-4796-9e97-30b70cfb78b0" (UID: "62974f15-cc2c-4796-9e97-30b70cfb78b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.461684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.465701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-server-conf" (OuterVolumeSpecName: "server-conf") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508315 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508345 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508354 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508363 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508372 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508380 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bde702b1-7406-4751-aaf5-026fb8e348c4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.508390 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bde702b1-7406-4751-aaf5-026fb8e348c4-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.509293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data" (OuterVolumeSpecName: "config-data") pod "62974f15-cc2c-4796-9e97-30b70cfb78b0" (UID: "62974f15-cc2c-4796-9e97-30b70cfb78b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513878 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513905 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/466deb02-8b2c-4c82-a0ef-fc3f719f321a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513916 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s8jh\" (UniqueName: \"kubernetes.io/projected/62974f15-cc2c-4796-9e97-30b70cfb78b0-kube-api-access-8s8jh\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513927 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bm4k\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-kube-api-access-2bm4k\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513936 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glck5\" (UniqueName: \"kubernetes.io/projected/466deb02-8b2c-4c82-a0ef-fc3f719f321a-kube-api-access-glck5\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513944 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bde702b1-7406-4751-aaf5-026fb8e348c4-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.513975 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.532971 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.548933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.552946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bde702b1-7406-4751-aaf5-026fb8e348c4" (UID: "bde702b1-7406-4751-aaf5-026fb8e348c4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.566211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-config-data" (OuterVolumeSpecName: "config-data") pod "466deb02-8b2c-4c82-a0ef-fc3f719f321a" (UID: "466deb02-8b2c-4c82-a0ef-fc3f719f321a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-erlang-cookie\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-plugins\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjz2b\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-kube-api-access-xjz2b\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e89388a-81da-49c6-a6fe-796c924f2822-erlang-cookie-secret\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-server-conf\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e89388a-81da-49c6-a6fe-796c924f2822-pod-info\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-plugins-conf\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-confd\") pod \"3e89388a-81da-49c6-a6fe-796c924f2822\" (UID: \"3e89388a-81da-49c6-a6fe-796c924f2822\") " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.615924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: 
"3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616672 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616696 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616709 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616721 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62974f15-cc2c-4796-9e97-30b70cfb78b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616730 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bde702b1-7406-4751-aaf5-026fb8e348c4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616740 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466deb02-8b2c-4c82-a0ef-fc3f719f321a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616753 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.616762 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.618986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-kube-api-access-xjz2b" (OuterVolumeSpecName: "kube-api-access-xjz2b") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). 
InnerVolumeSpecName "kube-api-access-xjz2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.619571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e89388a-81da-49c6-a6fe-796c924f2822-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.620727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "persistence") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.623417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3e89388a-81da-49c6-a6fe-796c924f2822-pod-info" (OuterVolumeSpecName: "pod-info") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.635333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-server-conf" (OuterVolumeSpecName: "server-conf") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.689000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3e89388a-81da-49c6-a6fe-796c924f2822" (UID: "3e89388a-81da-49c6-a6fe-796c924f2822"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.719336 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.719379 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjz2b\" (UniqueName: \"kubernetes.io/projected/3e89388a-81da-49c6-a6fe-796c924f2822-kube-api-access-xjz2b\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.719394 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e89388a-81da-49c6-a6fe-796c924f2822-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.719434 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.719444 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e89388a-81da-49c6-a6fe-796c924f2822-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.719453 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e89388a-81da-49c6-a6fe-796c924f2822-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.745209 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.822330 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.850553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"466deb02-8b2c-4c82-a0ef-fc3f719f321a","Type":"ContainerDied","Data":"899050c5c889dd02d82e024080ced6c7075805301dd7071790c4ace68f56d937"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.850572 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.850922 4707 scope.go:117] "RemoveContainer" containerID="e6bf0d0ee323fd65232e57a4a6ac7b388eb8e7c5bca6ca6b043f495165ea7d8b" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.857196 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.857460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr" event={"ID":"11d963f6-7258-432b-8206-5a1893ed2ff1","Type":"ContainerDied","Data":"783cfe365678214c684fda16135fc7d2fe4163f670ac5f21235b6d3a7e62f885"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.861761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"bde702b1-7406-4751-aaf5-026fb8e348c4","Type":"ContainerDied","Data":"4f09c68857037ce3681852e51339608ccf1b2abe72aebfbf73e53925276cc9cb"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.861903 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.866644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" event={"ID":"62974f15-cc2c-4796-9e97-30b70cfb78b0","Type":"ContainerDied","Data":"a40e3bbf81f3d41577b644074e5b563f0a0264f8aeda7b2bc6491bf7a56f0708"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.866705 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.871018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"32b3543f-a454-4d1e-a820-031c05bc1aad","Type":"ContainerStarted","Data":"76da2ef4aab0590f619622ba0623aaf82d210931df4ee175dcd4eaa9262365dd"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.877658 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e89388a-81da-49c6-a6fe-796c924f2822" containerID="e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd" exitCode=0 Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.877712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"3e89388a-81da-49c6-a6fe-796c924f2822","Type":"ContainerDied","Data":"e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.877729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"3e89388a-81da-49c6-a6fe-796c924f2822","Type":"ContainerDied","Data":"da99c640f6e593c8350fca1de94ea9f407db99529e30967850932ab9de53d442"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.877800 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.882669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"383aecaf-d0c4-4d44-9df7-04e2ea1035b5","Type":"ContainerStarted","Data":"86e9864dc77ac9f7e004c9d97a05fdc7ea6282109a354cde2ed52926d6d3d33f"} Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.883941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.900959 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.908676 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.908664529 podStartE2EDuration="3.908664529s" podCreationTimestamp="2026-01-21 15:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:31.896828494 +0000 UTC m=+2989.078344716" watchObservedRunningTime="2026-01-21 15:51:31.908664529 +0000 UTC m=+2989.090180751" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.949358 4707 scope.go:117] "RemoveContainer" containerID="fea7125a871d1415c93116c48d75a9581bfa690f1b86118e2d270e7df411388f" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.956041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr"] Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.968983 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-679bdf99cc-8dwtr"] Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.980172 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf"] Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.985932 4707 scope.go:117] "RemoveContainer" containerID="57135fba45aec0028e942e949147ddc31decf31eea83c9d70afc2ddec4bea450" Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.990288 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5499c5c6dd-s7krf"] Jan 21 15:51:31 crc kubenswrapper[4707]: I0121 15:51:31.996862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.005145 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.009435 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.023752 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.030491 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.032976 4707 scope.go:117] "RemoveContainer" containerID="e98a6f97c0c59f1457887797ba63a8cd20695ce6c8be90bf06b9bd633a27d4b5" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037135 4707 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037598 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener-log" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037616 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener-log" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037629 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-notification-agent" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037636 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-notification-agent" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="proxy-httpd" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="proxy-httpd" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037671 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker-log" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037677 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker-log" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037690 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037697 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037707 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-central-agent" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037713 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-central-agent" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037723 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="rabbitmq" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037729 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="rabbitmq" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037735 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037741 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" 
containerName="setup-container" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037756 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" containerName="setup-container" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037764 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="setup-container" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037771 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="setup-container" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037781 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="sg-core" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037786 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="sg-core" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.037819 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" containerName="rabbitmq" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037826 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" containerName="rabbitmq" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037980 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-notification-agent" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.037989 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038002 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="proxy-httpd" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038013 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="sg-core" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038020 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" containerName="ceilometer-central-agent" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038030 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" containerName="barbican-keystone-listener-log" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038039 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038050 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" containerName="rabbitmq" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038058 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" containerName="rabbitmq" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.038069 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" containerName="barbican-worker-log" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.040339 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.044618 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.047069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.047290 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-lz57z" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.047394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.050494 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.050729 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.050765 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.050896 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.050964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.056005 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.061488 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.061520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.061959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.067343 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.067680 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.067836 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.068089 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.068107 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.068208 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-skkb6" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.068326 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.079137 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.086388 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.086954 4707 scope.go:117] "RemoveContainer" containerID="436bcbab1f25e15f46246ecc8ede16a55da48c1cd2e3c46c116436928b337639" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.099930 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.102231 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.104852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.105065 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.107551 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.122418 4707 scope.go:117] "RemoveContainer" containerID="d60b3a2a18c48b42bdf714d802ca2dd8c97fac6dfc1e9c9a4de48c3542d38b4c" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.148961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5p7c\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-kube-api-access-w5p7c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjshv\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-kube-api-access-tjshv\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fb17d2c-7e86-4451-b56e-4e89fc494542-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fb17d2c-7e86-4451-b56e-4e89fc494542-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.149948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.150304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.169864 4707 scope.go:117] "RemoveContainer" containerID="32edadedef2c315e6d4653ad9f20b50bc87635b5eb5275fb1e900ad9ba493403" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.186804 4707 scope.go:117] "RemoveContainer" containerID="43f7db9043a85994b4e96b1be2333a30db9afca105a651569a43385dfd162291" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.214795 4707 scope.go:117] "RemoveContainer" containerID="1e13dc0dd446435fd65bf4d3e47bff260f5012e7a18801585e384b29e9417f69" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.249074 4707 scope.go:117] "RemoveContainer" containerID="478c24f9df975e5f07c1890409bc439d4190136c2d1f1024e43e81cf80e40146" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-confd\") pod \"rabbitmq-server-0\" 
(UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5p7c\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-kube-api-access-w5p7c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-run-httpd\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjshv\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-kube-api-access-tjshv\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-log-httpd\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkqx\" (UniqueName: \"kubernetes.io/projected/340dc9fb-5dbb-4be3-a139-f31e03c43cad-kube-api-access-4pkqx\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fb17d2c-7e86-4451-b56e-4e89fc494542-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.251799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.252957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fb17d2c-7e86-4451-b56e-4e89fc494542-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-config-data\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253186 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.253867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.254148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.255266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.255428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.255714 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256012 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256368 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.256783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.257210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.258303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-scripts\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.258357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.258456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.258478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.259430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.259647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.261227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.261544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fb17d2c-7e86-4451-b56e-4e89fc494542-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.261777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fb17d2c-7e86-4451-b56e-4e89fc494542-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.261803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.266676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.268953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjshv\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-kube-api-access-tjshv\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.273157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5p7c\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-kube-api-access-w5p7c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.275509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.278346 4707 scope.go:117] "RemoveContainer" containerID="e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.296356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"rabbitmq-server-0\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.298373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.361796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.361962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.362050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-scripts\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.362172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-run-httpd\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.362279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-log-httpd\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.362349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkqx\" (UniqueName: \"kubernetes.io/projected/340dc9fb-5dbb-4be3-a139-f31e03c43cad-kube-api-access-4pkqx\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.362457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-config-data\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.363128 4707 scope.go:117] "RemoveContainer" containerID="df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.363709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-run-httpd\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.366011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-log-httpd\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.367303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-config-data\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.368284 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.369619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.369912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-scripts\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.369976 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.388427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkqx\" (UniqueName: \"kubernetes.io/projected/340dc9fb-5dbb-4be3-a139-f31e03c43cad-kube-api-access-4pkqx\") pod \"ceilometer-0\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.388928 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.409483 4707 scope.go:117] "RemoveContainer" containerID="e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.409876 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd\": container with ID starting with e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd not found: ID does not exist" containerID="e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.409910 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd"} err="failed to get container status \"e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd\": rpc error: code = NotFound desc = could not find container \"e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd\": container with ID starting with e33dd729e6bfa6b6b7f396b581bfd4d75e03f830d8b65c0ae3856332eeb86ebd not found: ID does not exist" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.409933 4707 scope.go:117] "RemoveContainer" containerID="df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e" Jan 21 15:51:32 crc kubenswrapper[4707]: E0121 15:51:32.410325 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e\": container with ID starting with df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e not found: ID does not exist" 
containerID="df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.410388 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e"} err="failed to get container status \"df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e\": rpc error: code = NotFound desc = could not find container \"df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e\": container with ID starting with df92f77e77154c386f0dcda1e0972e218aaf3e3310a43d1545ed0ee3442d529e not found: ID does not exist" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.434096 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.796044 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.912171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1fb17d2c-7e86-4451-b56e-4e89fc494542","Type":"ContainerStarted","Data":"b30872d48350038c1ef31ec76544f116a2f2e79bdeabe2733d3094e9fc29ca57"} Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.919745 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:51:32 crc kubenswrapper[4707]: I0121 15:51:32.977768 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.191951 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d963f6-7258-432b-8206-5a1893ed2ff1" path="/var/lib/kubelet/pods/11d963f6-7258-432b-8206-5a1893ed2ff1/volumes" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.192825 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e89388a-81da-49c6-a6fe-796c924f2822" path="/var/lib/kubelet/pods/3e89388a-81da-49c6-a6fe-796c924f2822/volumes" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.193445 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466deb02-8b2c-4c82-a0ef-fc3f719f321a" path="/var/lib/kubelet/pods/466deb02-8b2c-4c82-a0ef-fc3f719f321a/volumes" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.194645 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62974f15-cc2c-4796-9e97-30b70cfb78b0" path="/var/lib/kubelet/pods/62974f15-cc2c-4796-9e97-30b70cfb78b0/volumes" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.195400 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde702b1-7406-4751-aaf5-026fb8e348c4" path="/var/lib/kubelet/pods/bde702b1-7406-4751-aaf5-026fb8e348c4/volumes" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.857866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.932950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.935165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerStarted","Data":"73132fdef03dff860634d10a418475b500cf8d9c2efcefd957cf29acb8dd23d5"} Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.937862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1eaefcd6-8b64-47a3-8e3a-280d7130b63e","Type":"ContainerStarted","Data":"a7fb614e0e11f27b97c72334e1130103e002ccc22d5af5fc4409cf3da5fc830c"} Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.985644 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p"] Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.985873 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api-log" containerID="cri-o://c6104ba8a299d951e7e68b00af9056b68f0e5d2bab75be0997ecd34395ba528d" gracePeriod=30 Jan 21 15:51:33 crc kubenswrapper[4707]: I0121 15:51:33.986267 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api" containerID="cri-o://8423dad4b94be8eef5c393ef27d1d8f14ad3815ffd34f41da3bfa63e5acc69b9" gracePeriod=30 Jan 21 15:51:34 crc kubenswrapper[4707]: I0121 15:51:34.944575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1fb17d2c-7e86-4451-b56e-4e89fc494542","Type":"ContainerStarted","Data":"ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c"} Jan 21 15:51:34 crc kubenswrapper[4707]: I0121 15:51:34.946994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1eaefcd6-8b64-47a3-8e3a-280d7130b63e","Type":"ContainerStarted","Data":"14470d78188ddcd00f2db0dea2c3bc584052343e21da79ba6654c2da9d0b26e0"} Jan 21 15:51:34 crc kubenswrapper[4707]: I0121 15:51:34.948591 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerID="c6104ba8a299d951e7e68b00af9056b68f0e5d2bab75be0997ecd34395ba528d" exitCode=143 Jan 21 15:51:34 crc kubenswrapper[4707]: I0121 15:51:34.948678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" event={"ID":"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d","Type":"ContainerDied","Data":"c6104ba8a299d951e7e68b00af9056b68f0e5d2bab75be0997ecd34395ba528d"} Jan 21 15:51:34 crc kubenswrapper[4707]: I0121 15:51:34.950118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerStarted","Data":"302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a"} Jan 21 15:51:34 crc kubenswrapper[4707]: I0121 15:51:34.950156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerStarted","Data":"4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320"} Jan 21 15:51:35 crc kubenswrapper[4707]: I0121 15:51:35.845952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:51:35 crc kubenswrapper[4707]: I0121 15:51:35.957782 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:51:35 crc kubenswrapper[4707]: I0121 15:51:35.958077 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" containerName="nova-scheduler-scheduler" containerID="cri-o://a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c" gracePeriod=30 Jan 21 15:51:35 crc kubenswrapper[4707]: I0121 15:51:35.984830 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:51:35 crc kubenswrapper[4707]: I0121 15:51:35.985400 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" gracePeriod=30 Jan 21 15:51:35 crc kubenswrapper[4707]: I0121 15:51:35.988792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerStarted","Data":"dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc"} Jan 21 15:51:35 crc kubenswrapper[4707]: E0121 15:51:35.989289 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:36 crc kubenswrapper[4707]: E0121 15:51:36.010962 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:36 crc kubenswrapper[4707]: E0121 15:51:36.022829 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:36 crc kubenswrapper[4707]: E0121 15:51:36.022895 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.025754 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.030917 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api-log" containerID="cri-o://b5259bb32f8e7a63c7f6470d92529c9c2b2d3d96ec7f8b2439aab3601cdc6538" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.031541 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/cinder-api-0" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" containerID="cri-o://8155b26de07fb92fa6d4dac952758954f8d13061ae5932a9172aed2e4c3b2d3e" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.042671 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.211:8776/healthcheck\": EOF" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.043869 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.211:8776/healthcheck\": EOF" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.086395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.086587 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.109891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.110146 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-log" containerID="cri-o://4de6361043002e66d2412937576703f188e896ab813880f6a1980a4ba39cabb6" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.110347 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-httpd" containerID="cri-o://86e9864dc77ac9f7e004c9d97a05fdc7ea6282109a354cde2ed52926d6d3d33f" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.171436 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kxkv"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.177715 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kxkv"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.186078 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.186306 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-log" containerID="cri-o://843277c982352a8f067187b925a5f54dbba4844c3c191d8a0ca6015d0706fc05" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.186672 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-httpd" 
containerID="cri-o://76da2ef4aab0590f619622ba0623aaf82d210931df4ee175dcd4eaa9262365dd" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.328937 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.330401 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.345140 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.383887 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.386312 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.417502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.425220 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-khvzf"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.426846 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.436612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-khvzf"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.449192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.450553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5b5s\" (UniqueName: \"kubernetes.io/projected/e57ace7a-26b6-433e-99aa-891427aaf04c-kube-api-access-h5b5s\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.450596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data-custom\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.450952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ace7a-26b6-433e-99aa-891427aaf04c-logs\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.450992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: 
\"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.451008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-combined-ca-bundle\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.459646 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.461454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.468354 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.487198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-675d9b6fc4-hjdm8"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.493464 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-log" containerID="cri-o://bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.494120 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-api" containerID="cri-o://bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.549857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.551563 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-combined-ca-bundle\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ace7a-26b6-433e-99aa-891427aaf04c-logs\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-credential-keys\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-combined-ca-bundle\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data-custom\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b885\" (UniqueName: \"kubernetes.io/projected/25ea2916-144f-4962-92ac-8d9de133ea36-kube-api-access-5b885\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data-custom\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 
15:51:36.554466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-fernet-keys\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554505 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5b5s\" (UniqueName: \"kubernetes.io/projected/e57ace7a-26b6-433e-99aa-891427aaf04c-kube-api-access-h5b5s\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-combined-ca-bundle\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data-custom\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5cn\" (UniqueName: \"kubernetes.io/projected/ada6986b-923f-4ca5-a85b-d0578223ae13-kube-api-access-zd5cn\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ea2916-144f-4962-92ac-8d9de133ea36-logs\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-combined-ca-bundle\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmdg\" (UniqueName: 
\"kubernetes.io/projected/964ff6ca-837f-4d18-90c9-c78ca0d14340-kube-api-access-wpmdg\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-config-data\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964ff6ca-837f-4d18-90c9-c78ca0d14340-logs\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-scripts\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.554952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.555413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ace7a-26b6-433e-99aa-891427aaf04c-logs\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.591757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.594364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5b5s\" (UniqueName: \"kubernetes.io/projected/e57ace7a-26b6-433e-99aa-891427aaf04c-kube-api-access-h5b5s\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.594629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-combined-ca-bundle\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.600430 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data-custom\") pod \"barbican-worker-5cc88854d5-cfgq8\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.618932 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.635900 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-64d4c897d6-z4d82"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.636168 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-api" containerID="cri-o://d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.636932 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-httpd" containerID="cri-o://36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b" gracePeriod=30 Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.652488 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv8lc\" (UniqueName: \"kubernetes.io/projected/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-kube-api-access-mv8lc\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-public-tls-certs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-combined-ca-bundle\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-credential-keys\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data-custom\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b885\" (UniqueName: \"kubernetes.io/projected/25ea2916-144f-4962-92ac-8d9de133ea36-kube-api-access-5b885\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.667798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data-custom\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.670783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-internal-tls-certs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.670889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-fernet-keys\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.670962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-combined-ca-bundle\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd5cn\" (UniqueName: \"kubernetes.io/projected/ada6986b-923f-4ca5-a85b-d0578223ae13-kube-api-access-zd5cn\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc 
kubenswrapper[4707]: I0121 15:51:36.671127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ea2916-144f-4962-92ac-8d9de133ea36-logs\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-combined-ca-bundle\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-config-data\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-logs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmdg\" (UniqueName: \"kubernetes.io/projected/964ff6ca-837f-4d18-90c9-c78ca0d14340-kube-api-access-wpmdg\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-combined-ca-bundle\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-config-data\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964ff6ca-837f-4d18-90c9-c78ca0d14340-logs\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-scripts\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " 
pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.671608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-scripts\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.673861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.679967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ea2916-144f-4962-92ac-8d9de133ea36-logs\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.683464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-combined-ca-bundle\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.684132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964ff6ca-837f-4d18-90c9-c78ca0d14340-logs\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.685647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-config-data\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.686371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-fernet-keys\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.686511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.686635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-combined-ca-bundle\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 
21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.691732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data-custom\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.694192 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-95c949c86-zbgkh"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.697554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-scripts\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.697747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-credential-keys\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.702028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b885\" (UniqueName: \"kubernetes.io/projected/25ea2916-144f-4962-92ac-8d9de133ea36-kube-api-access-5b885\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.726522 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.731657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd5cn\" (UniqueName: \"kubernetes.io/projected/ada6986b-923f-4ca5-a85b-d0578223ae13-kube-api-access-zd5cn\") pod \"keystone-bootstrap-khvzf\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.730697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmdg\" (UniqueName: \"kubernetes.io/projected/964ff6ca-837f-4d18-90c9-c78ca0d14340-kube-api-access-wpmdg\") pod \"barbican-api-6944fd56d4-74bxx\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.735170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data-custom\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.735277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-combined-ca-bundle\") pod \"barbican-keystone-listener-84c5cd8f8d-9qdxd\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.746323 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.1.208:9696/\": read tcp 10.217.0.2:33136->10.217.1.208:9696: read: connection reset by peer" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.755600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-95c949c86-zbgkh"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.767089 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.772906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-internal-tls-certs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.773051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-config-data\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.773193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-logs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.773298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-combined-ca-bundle\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.773453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-scripts\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.773630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv8lc\" (UniqueName: \"kubernetes.io/projected/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-kube-api-access-mv8lc\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.773705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-public-tls-certs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.778073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-public-tls-certs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.783396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-logs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " 
pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.783444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-internal-tls-certs\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.783459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-scripts\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.789421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-config-data\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.789529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-combined-ca-bundle\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.792136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv8lc\" (UniqueName: \"kubernetes.io/projected/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-kube-api-access-mv8lc\") pod \"placement-cdbcc8f6b-xn7nh\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.808802 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.825941 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.884311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-combined-ca-bundle\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.885881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx8mf\" (UniqueName: \"kubernetes.io/projected/f97eddbb-0f7f-4c89-bb38-27636955cfeb-kube-api-access-kx8mf\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.886369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-config\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.886401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-httpd-config\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.901491 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.947954 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.217:8778/\": read tcp 10.217.0.2:35378->10.217.1.217:8778: read: connection reset by peer" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.947995 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.217:8778/\": read tcp 10.217.0.2:35364->10.217.1.217:8778: read: connection reset by peer" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.967056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.981230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.981281 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.989705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx8mf\" (UniqueName: \"kubernetes.io/projected/f97eddbb-0f7f-4c89-bb38-27636955cfeb-kube-api-access-kx8mf\") pod \"neutron-95c949c86-zbgkh\" (UID: 
\"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.989862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-config\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.989881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-httpd-config\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.989934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-combined-ca-bundle\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:36 crc kubenswrapper[4707]: I0121 15:51:36.999182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-httpd-config\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.001174 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.002227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-combined-ca-bundle\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.005272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-config\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.011603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx8mf\" (UniqueName: \"kubernetes.io/projected/f97eddbb-0f7f-4c89-bb38-27636955cfeb-kube-api-access-kx8mf\") pod \"neutron-95c949c86-zbgkh\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.014481 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.027956 4707 generic.go:334] "Generic (PLEG): container finished" podID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerID="bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b" exitCode=143 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.028064 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" event={"ID":"f597d137-18da-4335-a15b-26e45a8c7b7d","Type":"ContainerDied","Data":"bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b"} Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.037156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.038517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" event={"ID":"7df46a58-0e69-4140-b4a6-5357eae382ef","Type":"ContainerDied","Data":"36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b"} Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.038460 4707 generic.go:334] "Generic (PLEG): container finished" podID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerID="36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b" exitCode=0 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.053980 4707 generic.go:334] "Generic (PLEG): container finished" podID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerID="76da2ef4aab0590f619622ba0623aaf82d210931df4ee175dcd4eaa9262365dd" exitCode=0 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.054000 4707 generic.go:334] "Generic (PLEG): container finished" podID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerID="843277c982352a8f067187b925a5f54dbba4844c3c191d8a0ca6015d0706fc05" exitCode=143 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.054044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"32b3543f-a454-4d1e-a820-031c05bc1aad","Type":"ContainerDied","Data":"76da2ef4aab0590f619622ba0623aaf82d210931df4ee175dcd4eaa9262365dd"} Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.054062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"32b3543f-a454-4d1e-a820-031c05bc1aad","Type":"ContainerDied","Data":"843277c982352a8f067187b925a5f54dbba4844c3c191d8a0ca6015d0706fc05"} Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.058904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.058939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.068615 4707 generic.go:334] "Generic (PLEG): container finished" podID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerID="b5259bb32f8e7a63c7f6470d92529c9c2b2d3d96ec7f8b2439aab3601cdc6538" exitCode=143 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.068673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f732f500-3538-4a30-8502-fbd2451b7bdd","Type":"ContainerDied","Data":"b5259bb32f8e7a63c7f6470d92529c9c2b2d3d96ec7f8b2439aab3601cdc6538"} Jan 21 15:51:37 
crc kubenswrapper[4707]: E0121 15:51:37.110550 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.112863 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.113065 4707 generic.go:334] "Generic (PLEG): container finished" podID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerID="86e9864dc77ac9f7e004c9d97a05fdc7ea6282109a354cde2ed52926d6d3d33f" exitCode=0 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.113081 4707 generic.go:334] "Generic (PLEG): container finished" podID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerID="4de6361043002e66d2412937576703f188e896ab813880f6a1980a4ba39cabb6" exitCode=143 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.113278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="cinder-scheduler" containerID="cri-o://cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658" gracePeriod=30 Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.113348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"383aecaf-d0c4-4d44-9df7-04e2ea1035b5","Type":"ContainerDied","Data":"86e9864dc77ac9f7e004c9d97a05fdc7ea6282109a354cde2ed52926d6d3d33f"} Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.113372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"383aecaf-d0c4-4d44-9df7-04e2ea1035b5","Type":"ContainerDied","Data":"4de6361043002e66d2412937576703f188e896ab813880f6a1980a4ba39cabb6"} Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.113643 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="probe" containerID="cri-o://be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032" gracePeriod=30 Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.128960 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.128997 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.137082 4707 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.168879 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.189094 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.157:9311/healthcheck\": read tcp 10.217.0.2:35592->10.217.1.157:9311: read: connection reset by peer" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.189342 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.157:9311/healthcheck\": read tcp 10.217.0.2:35608->10.217.1.157:9311: read: connection reset by peer" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.216306 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b8c9a7-67a3-471d-a356-3e016bf3a3e4" path="/var/lib/kubelet/pods/47b8c9a7-67a3-471d-a356-3e016bf3a3e4/volumes" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-logs\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-combined-ca-bundle\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hg99\" (UniqueName: \"kubernetes.io/projected/32b3543f-a454-4d1e-a820-031c05bc1aad-kube-api-access-8hg99\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-scripts\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-config-data\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.296557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-httpd-run\") pod \"32b3543f-a454-4d1e-a820-031c05bc1aad\" (UID: \"32b3543f-a454-4d1e-a820-031c05bc1aad\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.297821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-logs" (OuterVolumeSpecName: "logs") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.308890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.311498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-scripts" (OuterVolumeSpecName: "scripts") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.313365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b3543f-a454-4d1e-a820-031c05bc1aad-kube-api-access-8hg99" (OuterVolumeSpecName: "kube-api-access-8hg99") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "kube-api-access-8hg99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.323027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.335585 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.385167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.401053 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.401162 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hg99\" (UniqueName: \"kubernetes.io/projected/32b3543f-a454-4d1e-a820-031c05bc1aad-kube-api-access-8hg99\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.401231 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.401313 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.401366 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.401415 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b3543f-a454-4d1e-a820-031c05bc1aad-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.442074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-config-data" (OuterVolumeSpecName: "config-data") pod "32b3543f-a454-4d1e-a820-031c05bc1aad" (UID: "32b3543f-a454-4d1e-a820-031c05bc1aad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.472331 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.472627 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.476877 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.487866 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:37 crc kubenswrapper[4707]: E0121 15:51:37.487910 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.512769 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3543f-a454-4d1e-a820-031c05bc1aad-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.512796 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.554354 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hgf\" (UniqueName: \"kubernetes.io/projected/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-kube-api-access-c4hgf\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-httpd-run\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-scripts\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-config-data\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-combined-ca-bundle\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.638777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-logs\") pod \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\" (UID: \"383aecaf-d0c4-4d44-9df7-04e2ea1035b5\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.639384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-logs" (OuterVolumeSpecName: "logs") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.640662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.669192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-scripts" (OuterVolumeSpecName: "scripts") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.669240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.669391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-kube-api-access-c4hgf" (OuterVolumeSpecName: "kube-api-access-c4hgf") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "kube-api-access-c4hgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.718903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-config-data" (OuterVolumeSpecName: "config-data") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.722960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "383aecaf-d0c4-4d44-9df7-04e2ea1035b5" (UID: "383aecaf-d0c4-4d44-9df7-04e2ea1035b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742837 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742864 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hgf\" (UniqueName: \"kubernetes.io/projected/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-kube-api-access-c4hgf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742875 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742884 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742893 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742916 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.742926 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383aecaf-d0c4-4d44-9df7-04e2ea1035b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.791875 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.811304 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.843649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-scripts\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.843692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-public-tls-certs\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.843724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmhw7\" (UniqueName: \"kubernetes.io/projected/f597d137-18da-4335-a15b-26e45a8c7b7d-kube-api-access-rmhw7\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.843792 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-combined-ca-bundle\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.843872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f597d137-18da-4335-a15b-26e45a8c7b7d-logs\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.843903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-internal-tls-certs\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.844003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-config-data\") pod \"f597d137-18da-4335-a15b-26e45a8c7b7d\" (UID: \"f597d137-18da-4335-a15b-26e45a8c7b7d\") " Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.844419 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.856088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f597d137-18da-4335-a15b-26e45a8c7b7d-logs" (OuterVolumeSpecName: "logs") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.865995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f597d137-18da-4335-a15b-26e45a8c7b7d-kube-api-access-rmhw7" (OuterVolumeSpecName: "kube-api-access-rmhw7") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). 
InnerVolumeSpecName "kube-api-access-rmhw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.866153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-scripts" (OuterVolumeSpecName: "scripts") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.946215 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.946245 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmhw7\" (UniqueName: \"kubernetes.io/projected/f597d137-18da-4335-a15b-26e45a8c7b7d-kube-api-access-rmhw7\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.946267 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f597d137-18da-4335-a15b-26e45a8c7b7d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:37 crc kubenswrapper[4707]: I0121 15:51:37.973870 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.010025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.035013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-khvzf"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.047142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-config-data" (OuterVolumeSpecName: "config-data") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.048064 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.063302 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.063390 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.127420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.141176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.146353 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.216:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.146429 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.216:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.150277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"32b3543f-a454-4d1e-a820-031c05bc1aad","Type":"ContainerDied","Data":"7a6723b6c83abadbb1ed073b496be307022ffbe048e442d06a3d97ae1e346d51"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.150351 4707 scope.go:117] "RemoveContainer" containerID="76da2ef4aab0590f619622ba0623aaf82d210931df4ee175dcd4eaa9262365dd" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.150559 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.152379 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.152404 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.158023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f597d137-18da-4335-a15b-26e45a8c7b7d" (UID: "f597d137-18da-4335-a15b-26e45a8c7b7d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.159105 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerID="8423dad4b94be8eef5c393ef27d1d8f14ad3815ffd34f41da3bfa63e5acc69b9" exitCode=0 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.159188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" event={"ID":"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d","Type":"ContainerDied","Data":"8423dad4b94be8eef5c393ef27d1d8f14ad3815ffd34f41da3bfa63e5acc69b9"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.173752 4707 generic.go:334] "Generic (PLEG): container finished" podID="50b67658-af5d-467a-8952-22db7d822791" containerID="be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032" exitCode=0 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.173972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"50b67658-af5d-467a-8952-22db7d822791","Type":"ContainerDied","Data":"be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.190491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerStarted","Data":"009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.190659 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-central-agent" containerID="cri-o://4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320" gracePeriod=30 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.190916 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.191113 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-notification-agent" containerID="cri-o://302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a" gracePeriod=30 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.191124 4707 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="proxy-httpd" containerID="cri-o://009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3" gracePeriod=30 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.191148 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="sg-core" containerID="cri-o://dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc" gracePeriod=30 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.219740 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.255816002 podStartE2EDuration="6.219722516s" podCreationTimestamp="2026-01-21 15:51:32 +0000 UTC" firstStartedPulling="2026-01-21 15:51:32.989986726 +0000 UTC m=+2990.171502939" lastFinishedPulling="2026-01-21 15:51:36.95389323 +0000 UTC m=+2994.135409453" observedRunningTime="2026-01-21 15:51:38.213388667 +0000 UTC m=+2995.394904890" watchObservedRunningTime="2026-01-21 15:51:38.219722516 +0000 UTC m=+2995.401238739" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.226165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" event={"ID":"ada6986b-923f-4ca5-a85b-d0578223ae13","Type":"ContainerStarted","Data":"3d7cbed09de0050d7563879ab667ea6edc3c8a0d862e3fc69264df0be5a5f75e"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.245651 4707 generic.go:334] "Generic (PLEG): container finished" podID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerID="bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c" exitCode=0 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.245763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" event={"ID":"f597d137-18da-4335-a15b-26e45a8c7b7d","Type":"ContainerDied","Data":"bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.245787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" event={"ID":"f597d137-18da-4335-a15b-26e45a8c7b7d","Type":"ContainerDied","Data":"6883ae3e94d4689d3f66d1675edc4e0ff53fe7704471f99e94fd22a6d7df861f"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.245875 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-675d9b6fc4-hjdm8" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.252944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" event={"ID":"964ff6ca-837f-4d18-90c9-c78ca0d14340","Type":"ContainerStarted","Data":"bdd42295d139bf8eb5d65c7e1315459ff8e6efa45102bced4efd8b3df6895c90"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.254144 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f597d137-18da-4335-a15b-26e45a8c7b7d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.255115 4707 generic.go:334] "Generic (PLEG): container finished" podID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" containerID="4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad" exitCode=0 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.255162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d22ebb6c-70ce-42f9-8893-78d76edab2ce","Type":"ContainerDied","Data":"4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.279643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"383aecaf-d0c4-4d44-9df7-04e2ea1035b5","Type":"ContainerDied","Data":"3ab91a15f3e1fefd6ddebc2a4a8e5b5b0f3bbfa8cd395a8ba823c2b7880b4ee3"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.279775 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.281581 4707 scope.go:117] "RemoveContainer" containerID="843277c982352a8f067187b925a5f54dbba4844c3c191d8a0ca6015d0706fc05" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.289915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" event={"ID":"e57ace7a-26b6-433e-99aa-891427aaf04c","Type":"ContainerStarted","Data":"77db2c9a60e3bff5990a162414a8616cd6cb058f1c8a92db5832d810c00f9b7f"} Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.299845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.301748 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.301871 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.306094 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.356182 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-675d9b6fc4-hjdm8"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.356752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz596\" (UniqueName: \"kubernetes.io/projected/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-kube-api-access-vz596\") pod \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.356786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-combined-ca-bundle\") pod \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.359007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-logs\") pod \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.359056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data-custom\") pod \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.359186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data\") pod \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\" (UID: \"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.362098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-kube-api-access-vz596" (OuterVolumeSpecName: "kube-api-access-vz596") pod "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" (UID: "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d"). InnerVolumeSpecName "kube-api-access-vz596". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.362797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-logs" (OuterVolumeSpecName: "logs") pod "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" (UID: "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.376917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" (UID: "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.401908 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403121 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-httpd" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403142 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-httpd" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403151 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403157 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-log" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403175 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403182 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-log" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-httpd" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403205 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-httpd" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403232 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403260 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-log" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403283 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403289 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api-log" Jan 21 15:51:38 crc kubenswrapper[4707]: E0121 15:51:38.403312 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-api" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403318 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-api" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-httpd" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403828 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" containerName="glance-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403850 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403863 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" containerName="placement-api" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403873 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403890 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" containerName="glance-httpd" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.403911 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" containerName="barbican-api-log" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.477942 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz596\" (UniqueName: \"kubernetes.io/projected/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-kube-api-access-vz596\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.478046 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.478103 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.489336 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-675d9b6fc4-hjdm8"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.489509 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.511917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.526705 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.527314 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.527467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.527684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.551016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data" (OuterVolumeSpecName: "config-data") pod "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" (UID: "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.560030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" (UID: "3c2d4c95-366c-4f9b-b0d8-0c2bd153408d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.564494 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh"] Jan 21 15:51:38 crc kubenswrapper[4707]: W0121 15:51:38.569117 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ea2916_144f_4962_92ac_8d9de133ea36.slice/crio-fb61cad2da4593b08840a1613b011375b83e97eea9b88325f2ae612da95d5502 WatchSource:0}: Error finding container fb61cad2da4593b08840a1613b011375b83e97eea9b88325f2ae612da95d5502: Status 404 returned error can't find the container with id fb61cad2da4593b08840a1613b011375b83e97eea9b88325f2ae612da95d5502 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-config-data\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bg2\" (UniqueName: \"kubernetes.io/projected/60e0b49a-6e14-42af-9480-877a354cec42-kube-api-access-b8bg2\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.580797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-logs\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.581573 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.581595 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.589574 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.601265 4707 scope.go:117] "RemoveContainer" containerID="bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.634542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-95c949c86-zbgkh"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.683796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.683875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.684523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-config-data\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.684912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bg2\" (UniqueName: \"kubernetes.io/projected/60e0b49a-6e14-42af-9480-877a354cec42-kube-api-access-b8bg2\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.684936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.684955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-scripts\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.684970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-logs\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.685004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.685564 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.696512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.702494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-logs\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.716411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.717177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-config-data\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.721370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-scripts\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.731201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bg2\" (UniqueName: \"kubernetes.io/projected/60e0b49a-6e14-42af-9480-877a354cec42-kube-api-access-b8bg2\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.743603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.769916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.854191 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.889415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-config-data\") pod \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.889457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lklrn\" (UniqueName: \"kubernetes.io/projected/d22ebb6c-70ce-42f9-8893-78d76edab2ce-kube-api-access-lklrn\") pod \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.889510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-combined-ca-bundle\") pod \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\" (UID: \"d22ebb6c-70ce-42f9-8893-78d76edab2ce\") " Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.894134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.918045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22ebb6c-70ce-42f9-8893-78d76edab2ce-kube-api-access-lklrn" (OuterVolumeSpecName: "kube-api-access-lklrn") pod "d22ebb6c-70ce-42f9-8893-78d76edab2ce" (UID: "d22ebb6c-70ce-42f9-8893-78d76edab2ce"). InnerVolumeSpecName "kube-api-access-lklrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.918112 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.918351 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="ovn-northd" containerID="cri-o://c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" gracePeriod=30 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.918732 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="openstack-network-exporter" containerID="cri-o://68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991" gracePeriod=30 Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.929519 4707 scope.go:117] "RemoveContainer" containerID="bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.938328 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.959311 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:38 crc kubenswrapper[4707]: I0121 15:51:38.991622 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lklrn\" (UniqueName: \"kubernetes.io/projected/d22ebb6c-70ce-42f9-8893-78d76edab2ce-kube-api-access-lklrn\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.002955 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.012869 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.013279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-httpd" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.013292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-httpd" Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.013320 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-api" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.013326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-api" Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.013339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.013347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.013535 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" containerName="nova-cell1-conductor-conductor" 
Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.013550 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-api" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.013568 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerName="neutron-httpd" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.014487 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.038219 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.047997 4707 scope.go:117] "RemoveContainer" containerID="bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.048210 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.048237 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.054413 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c\": container with ID starting with bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c not found: ID does not exist" containerID="bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.054795 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c"} err="failed to get container status \"bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c\": rpc error: code = NotFound desc = could not find container \"bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c\": container with ID starting with bc060018128c99f6370030c279089fe828bf349e117f79f109aaf832295b9e2c not found: ID does not exist" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.054914 4707 scope.go:117] "RemoveContainer" containerID="bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b" Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.062365 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b\": container with ID starting with bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b not found: ID does not exist" containerID="bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.062407 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b"} err="failed to get container status \"bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b\": rpc error: code = NotFound desc = could not find container \"bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b\": container with ID starting with 
bc96a3cf18bea2babe3f61fba0a2e31bfc6da72d5a6c640348c474cd830fa27b not found: ID does not exist" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.062432 4707 scope.go:117] "RemoveContainer" containerID="86e9864dc77ac9f7e004c9d97a05fdc7ea6282109a354cde2ed52926d6d3d33f" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.105375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-combined-ca-bundle\") pod \"7df46a58-0e69-4140-b4a6-5357eae382ef\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.105692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-httpd-config\") pod \"7df46a58-0e69-4140-b4a6-5357eae382ef\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.105923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bksjn\" (UniqueName: \"kubernetes.io/projected/7df46a58-0e69-4140-b4a6-5357eae382ef-kube-api-access-bksjn\") pod \"7df46a58-0e69-4140-b4a6-5357eae382ef\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.105964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-config\") pod \"7df46a58-0e69-4140-b4a6-5357eae382ef\" (UID: \"7df46a58-0e69-4140-b4a6-5357eae382ef\") " Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.139475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df46a58-0e69-4140-b4a6-5357eae382ef-kube-api-access-bksjn" (OuterVolumeSpecName: "kube-api-access-bksjn") pod "7df46a58-0e69-4140-b4a6-5357eae382ef" (UID: "7df46a58-0e69-4140-b4a6-5357eae382ef"). InnerVolumeSpecName "kube-api-access-bksjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.139905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7df46a58-0e69-4140-b4a6-5357eae382ef" (UID: "7df46a58-0e69-4140-b4a6-5357eae382ef"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.143417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22ebb6c-70ce-42f9-8893-78d76edab2ce" (UID: "d22ebb6c-70ce-42f9-8893-78d76edab2ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.157394 4707 scope.go:117] "RemoveContainer" containerID="4de6361043002e66d2412937576703f188e896ab813880f6a1980a4ba39cabb6" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-config-data\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-logs\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-scripts\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208339 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5tb4\" (UniqueName: \"kubernetes.io/projected/11f0696a-c173-41a3-926d-29ef8dcef753-kube-api-access-c5tb4\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" 
Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208564 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bksjn\" (UniqueName: \"kubernetes.io/projected/7df46a58-0e69-4140-b4a6-5357eae382ef-kube-api-access-bksjn\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208578 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.208586 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.218366 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b3543f-a454-4d1e-a820-031c05bc1aad" path="/var/lib/kubelet/pods/32b3543f-a454-4d1e-a820-031c05bc1aad/volumes" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.224099 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383aecaf-d0c4-4d44-9df7-04e2ea1035b5" path="/var/lib/kubelet/pods/383aecaf-d0c4-4d44-9df7-04e2ea1035b5/volumes" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.227958 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f597d137-18da-4335-a15b-26e45a8c7b7d" path="/var/lib/kubelet/pods/f597d137-18da-4335-a15b-26e45a8c7b7d/volumes" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.299301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" event={"ID":"35251dbe-ab48-4cf5-b556-941dc2cfe6cc","Type":"ContainerStarted","Data":"2f348aa12604f359438109c5914df2a157cd4d6585e01eedf61fd0c92542551b"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5tb4\" (UniqueName: \"kubernetes.io/projected/11f0696a-c173-41a3-926d-29ef8dcef753-kube-api-access-c5tb4\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-config-data\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 
15:51:39.309886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-logs\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-scripts\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.309979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.310691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.312401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-logs\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.318396 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.318481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" event={"ID":"f97eddbb-0f7f-4c89-bb38-27636955cfeb","Type":"ContainerStarted","Data":"8e115506bddafddea6ccc0304ef16d7840d849ae4d01c365b3992e4647182add"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.321082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" event={"ID":"e57ace7a-26b6-433e-99aa-891427aaf04c","Type":"ContainerStarted","Data":"a9a94e2300c8f0cebf8271a5715c7814c9ee668a8c0685fdaf947f3e5b501fa5"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.336664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5tb4\" (UniqueName: 
\"kubernetes.io/projected/11f0696a-c173-41a3-926d-29ef8dcef753-kube-api-access-c5tb4\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.345515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.346044 4707 generic.go:334] "Generic (PLEG): container finished" podID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerID="009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3" exitCode=0 Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.346065 4707 generic.go:334] "Generic (PLEG): container finished" podID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerID="dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc" exitCode=2 Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.346073 4707 generic.go:334] "Generic (PLEG): container finished" podID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerID="302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a" exitCode=0 Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.346114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerDied","Data":"009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.346140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerDied","Data":"dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.346148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerDied","Data":"302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.348325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.360166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-scripts\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.361971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" event={"ID":"25ea2916-144f-4962-92ac-8d9de133ea36","Type":"ContainerStarted","Data":"fb61cad2da4593b08840a1613b011375b83e97eea9b88325f2ae612da95d5502"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.376704 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-config-data\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.377006 4707 generic.go:334] "Generic (PLEG): container finished" podID="7df46a58-0e69-4140-b4a6-5357eae382ef" containerID="d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4" exitCode=0 Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.377141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" event={"ID":"7df46a58-0e69-4140-b4a6-5357eae382ef","Type":"ContainerDied","Data":"d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.377216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" event={"ID":"7df46a58-0e69-4140-b4a6-5357eae382ef","Type":"ContainerDied","Data":"f9481f8d2491fbee7e125a51196988eb96fe475497f75bbe51b05c1c817909c5"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.377305 4707 scope.go:117] "RemoveContainer" containerID="36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.377444 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-64d4c897d6-z4d82" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.414058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" event={"ID":"964ff6ca-837f-4d18-90c9-c78ca0d14340","Type":"ContainerStarted","Data":"16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.430933 4707 generic.go:334] "Generic (PLEG): container finished" podID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerID="68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991" exitCode=2 Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.431070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"35a31ae7-4ab2-4b98-bd9d-66decb56f00a","Type":"ContainerDied","Data":"68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.464281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-config-data" (OuterVolumeSpecName: "config-data") pod "d22ebb6c-70ce-42f9-8893-78d76edab2ce" (UID: "d22ebb6c-70ce-42f9-8893-78d76edab2ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.490909 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7df46a58-0e69-4140-b4a6-5357eae382ef" (UID: "7df46a58-0e69-4140-b4a6-5357eae382ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.491349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d22ebb6c-70ce-42f9-8893-78d76edab2ce","Type":"ContainerDied","Data":"621752e70dcb2b44726587d923120e98c76eb889ee4247a8c99b26f517983720"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.491514 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.498762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.510552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" event={"ID":"3c2d4c95-366c-4f9b-b0d8-0c2bd153408d","Type":"ContainerDied","Data":"a53608825e98751b6326bba558c63c6ffb45cba19a8fec397c0091d090929dab"} Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.510587 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.513865 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ebb6c-70ce-42f9-8893-78d76edab2ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.513889 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.577272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-config" (OuterVolumeSpecName: "config") pod "7df46a58-0e69-4140-b4a6-5357eae382ef" (UID: "7df46a58-0e69-4140-b4a6-5357eae382ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.615755 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7df46a58-0e69-4140-b4a6-5357eae382ef-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.645646 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.697660 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.768032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.783429 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.793998 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-64d4c897d6-z4d82"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.806592 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-64d4c897d6-z4d82"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.826531 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.828393 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.834590 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.886099 4707 scope.go:117] "RemoveContainer" containerID="d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.886308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.906872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.923720 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-7b784745f6-b8h2p"] Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.926218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.926283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.926426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglf2\" (UniqueName: \"kubernetes.io/projected/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-kube-api-access-pglf2\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.958293 4707 scope.go:117] "RemoveContainer" containerID="36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b" Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.962949 4707 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b\": container with ID starting with 36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b not found: ID does not exist" containerID="36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.962997 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b"} err="failed to get container status \"36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b\": rpc error: code = NotFound desc = could not find container \"36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b\": container with ID starting with 36f74b4adbc14f8ad6a182c0ab45f0fd5afdd0ac8d92061a8f4d27623ae51d8b not found: ID does not exist" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.963022 4707 scope.go:117] "RemoveContainer" containerID="d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4" Jan 21 15:51:39 crc kubenswrapper[4707]: E0121 15:51:39.967063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4\": container with ID starting with d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4 not found: ID does not exist" containerID="d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.967102 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4"} err="failed to get container status \"d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4\": rpc error: code = NotFound desc = could not find container \"d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4\": container with ID starting with d96b447bd1f4623ab921a2c3f7791014e3c559a79e8d567801e74a71c547cab4 not found: ID does not exist" Jan 21 15:51:39 crc kubenswrapper[4707]: I0121 15:51:39.967125 4707 scope.go:117] "RemoveContainer" containerID="4c8647510d9ba6e9eb322f9de79f9017a1a8fc720b685abc0f311735d05338ad" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.013015 4707 scope.go:117] "RemoveContainer" containerID="8423dad4b94be8eef5c393ef27d1d8f14ad3815ffd34f41da3bfa63e5acc69b9" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.028268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pglf2\" (UniqueName: \"kubernetes.io/projected/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-kube-api-access-pglf2\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.028403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.028473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.067478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglf2\" (UniqueName: \"kubernetes.io/projected/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-kube-api-access-pglf2\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.068023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.075712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.100004 4707 scope.go:117] "RemoveContainer" containerID="c6104ba8a299d951e7e68b00af9056b68f0e5d2bab75be0997ecd34395ba528d" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.199479 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.529952 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.545581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" event={"ID":"ada6986b-923f-4ca5-a85b-d0578223ae13","Type":"ContainerStarted","Data":"741b7a02e8ff9cfd2005241421caecf0b4bf770c7ec62312f29cd16e26f0ef89"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.575452 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" podStartSLOduration=4.575434556 podStartE2EDuration="4.575434556s" podCreationTimestamp="2026-01-21 15:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:40.573955716 +0000 UTC m=+2997.755471937" watchObservedRunningTime="2026-01-21 15:51:40.575434556 +0000 UTC m=+2997.756950779" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.587240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"60e0b49a-6e14-42af-9480-877a354cec42","Type":"ContainerStarted","Data":"640f799cd26a8368e83fe7b7e4531f382bdc1e4b56c140eeee2fde59510990c6"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.594277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" event={"ID":"35251dbe-ab48-4cf5-b556-941dc2cfe6cc","Type":"ContainerStarted","Data":"d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.594325 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" event={"ID":"35251dbe-ab48-4cf5-b556-941dc2cfe6cc","Type":"ContainerStarted","Data":"4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.595525 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.595556 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.617175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" event={"ID":"25ea2916-144f-4962-92ac-8d9de133ea36","Type":"ContainerStarted","Data":"f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.617224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" event={"ID":"25ea2916-144f-4962-92ac-8d9de133ea36","Type":"ContainerStarted","Data":"eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.619559 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" podStartSLOduration=4.61954268 podStartE2EDuration="4.61954268s" podCreationTimestamp="2026-01-21 15:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:40.613162043 +0000 UTC m=+2997.794678265" watchObservedRunningTime="2026-01-21 15:51:40.61954268 +0000 UTC m=+2997.801058903" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.643673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" event={"ID":"964ff6ca-837f-4d18-90c9-c78ca0d14340","Type":"ContainerStarted","Data":"70b8b775962c4e1bdd55d20416c186cff48d9907089ad48cb2bd7ce99aa3d814"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.643739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.645394 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.652424 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" podStartSLOduration=4.652406953 podStartE2EDuration="4.652406953s" podCreationTimestamp="2026-01-21 15:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:40.643309497 +0000 UTC m=+2997.824825719" watchObservedRunningTime="2026-01-21 15:51:40.652406953 +0000 UTC m=+2997.833923175" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.694342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" event={"ID":"f97eddbb-0f7f-4c89-bb38-27636955cfeb","Type":"ContainerStarted","Data":"d970cb3fbc1e75bc36ac63cd33ec506b0f116a6de732638699497cdff07ac747"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 
15:51:40.694385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" event={"ID":"f97eddbb-0f7f-4c89-bb38-27636955cfeb","Type":"ContainerStarted","Data":"0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.694420 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.700567 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" podStartSLOduration=4.700550524 podStartE2EDuration="4.700550524s" podCreationTimestamp="2026-01-21 15:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:40.661060775 +0000 UTC m=+2997.842576996" watchObservedRunningTime="2026-01-21 15:51:40.700550524 +0000 UTC m=+2997.882066737" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.715614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" event={"ID":"e57ace7a-26b6-433e-99aa-891427aaf04c","Type":"ContainerStarted","Data":"acc31783e633d011092b6f8764551146c9afb0f71a7bb5976536eae6e9de6e07"} Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.734718 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" podStartSLOduration=4.734699523 podStartE2EDuration="4.734699523s" podCreationTimestamp="2026-01-21 15:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:40.716325354 +0000 UTC m=+2997.897841577" watchObservedRunningTime="2026-01-21 15:51:40.734699523 +0000 UTC m=+2997.916215745" Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.735195 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9"] Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.735438 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener-log" containerID="cri-o://ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b" gracePeriod=30 Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.735584 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener" containerID="cri-o://409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5" gracePeriod=30 Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.794134 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" podStartSLOduration=4.794102184 podStartE2EDuration="4.794102184s" podCreationTimestamp="2026-01-21 15:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:40.736003345 +0000 UTC m=+2997.917519567" watchObservedRunningTime="2026-01-21 15:51:40.794102184 +0000 UTC m=+2997.975618406" Jan 21 15:51:40 crc 
kubenswrapper[4707]: I0121 15:51:40.842192 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-568886d945-qqxjr"] Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.842488 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker-log" containerID="cri-o://814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f" gracePeriod=30 Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.842982 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker" containerID="cri-o://9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736" gracePeriod=30 Jan 21 15:51:40 crc kubenswrapper[4707]: I0121 15:51:40.866435 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.176971 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.179872 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="openstack-network-exporter" containerID="cri-o://42845822ae5a24294591a58818e9e32984de7b04c2b29b2c2c6f590020e457f8" gracePeriod=300 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.301929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="ovsdbserver-sb" containerID="cri-o://6d22edf7bc985ced8e3f70291090f7925ae17bc065ed02f2366a92c1421c1772" gracePeriod=300 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.359941 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2d4c95-366c-4f9b-b0d8-0c2bd153408d" path="/var/lib/kubelet/pods/3c2d4c95-366c-4f9b-b0d8-0c2bd153408d/volumes" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.360483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df46a58-0e69-4140-b4a6-5357eae382ef" path="/var/lib/kubelet/pods/7df46a58-0e69-4140-b4a6-5357eae382ef/volumes" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.360995 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22ebb6c-70ce-42f9-8893-78d76edab2ce" path="/var/lib/kubelet/pods/d22ebb6c-70ce-42f9-8893-78d76edab2ce/volumes" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.586894 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-95c949c86-zbgkh"] Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.637130 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-697885d478-92pg7"] Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.652589 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.694341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wksdj\" (UniqueName: \"kubernetes.io/projected/855ed6cf-d010-49b4-a048-37d8568bdf60-kube-api-access-wksdj\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.694406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-combined-ca-bundle\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.694437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-httpd-config\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.694501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-config\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.701790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-697885d478-92pg7"] Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.779325 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.779797 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="openstack-network-exporter" containerID="cri-o://708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c" gracePeriod=300 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.790388 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerID="ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b" exitCode=143 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.790458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" event={"ID":"a9c6c283-dad9-4fe2-91a7-184df0427be8","Type":"ContainerDied","Data":"ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.810136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-config\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.810456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wksdj\" (UniqueName: \"kubernetes.io/projected/855ed6cf-d010-49b4-a048-37d8568bdf60-kube-api-access-wksdj\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.810568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-combined-ca-bundle\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.810613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-httpd-config\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.822012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"60e0b49a-6e14-42af-9480-877a354cec42","Type":"ContainerStarted","Data":"fedf8ec2c1a7c0866b937ba14cc1081426eb67eaeb9d60f31b28228f8b04b03f"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.829694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"11f0696a-c173-41a3-926d-29ef8dcef753","Type":"ContainerStarted","Data":"4cb85f2595f3e431e8aedccad0ccbdc0ca961ea0c8df515503936cbc52f487cb"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.855513 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_590ed346-b9bd-4ecb-9a71-69af254c5a18/ovsdbserver-sb/0.log" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.855567 4707 generic.go:334] "Generic (PLEG): container finished" podID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerID="42845822ae5a24294591a58818e9e32984de7b04c2b29b2c2c6f590020e457f8" exitCode=2 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.855585 4707 generic.go:334] "Generic (PLEG): container finished" podID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerID="6d22edf7bc985ced8e3f70291090f7925ae17bc065ed02f2366a92c1421c1772" exitCode=143 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.855646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"590ed346-b9bd-4ecb-9a71-69af254c5a18","Type":"ContainerDied","Data":"42845822ae5a24294591a58818e9e32984de7b04c2b29b2c2c6f590020e457f8"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.855678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"590ed346-b9bd-4ecb-9a71-69af254c5a18","Type":"ContainerDied","Data":"6d22edf7bc985ced8e3f70291090f7925ae17bc065ed02f2366a92c1421c1772"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.857542 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.211:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.858980 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerID="814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f" exitCode=143 Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.859152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" event={"ID":"65b2a5f8-e722-43a5-8765-1b6f281e9b8d","Type":"ContainerDied","Data":"814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.862396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-httpd-config\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.865237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-combined-ca-bundle\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.870506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-config\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.874057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wksdj\" (UniqueName: \"kubernetes.io/projected/855ed6cf-d010-49b4-a048-37d8568bdf60-kube-api-access-wksdj\") pod \"neutron-697885d478-92pg7\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.876914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7","Type":"ContainerStarted","Data":"47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.876951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.876962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7","Type":"ContainerStarted","Data":"bf2c885dbf38f097c85e5736fed268197e96dd565646be83b3b289caa174cc68"} Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.931150 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.931124301 podStartE2EDuration="2.931124301s" podCreationTimestamp="2026-01-21 15:51:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:41.908235936 +0000 UTC m=+2999.089752158" watchObservedRunningTime="2026-01-21 15:51:41.931124301 +0000 UTC m=+2999.112640524" Jan 21 15:51:41 crc kubenswrapper[4707]: I0121 15:51:41.948020 4707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="ovsdbserver-nb" containerID="cri-o://b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8" gracePeriod=300 Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.029256 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.125506 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.129485 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_590ed346-b9bd-4ecb-9a71-69af254c5a18/ovsdbserver-sb/0.log" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.129569 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.134758 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.138954 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.139014 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.198926 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.224178 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.226289 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.226351 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="ovn-northd" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.228217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-config\") pod \"590ed346-b9bd-4ecb-9a71-69af254c5a18\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.228292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92hsw\" (UniqueName: \"kubernetes.io/projected/590ed346-b9bd-4ecb-9a71-69af254c5a18-kube-api-access-92hsw\") pod \"590ed346-b9bd-4ecb-9a71-69af254c5a18\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.228334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590ed346-b9bd-4ecb-9a71-69af254c5a18-ovsdb-rundir\") pod \"590ed346-b9bd-4ecb-9a71-69af254c5a18\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.228366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590ed346-b9bd-4ecb-9a71-69af254c5a18-combined-ca-bundle\") pod \"590ed346-b9bd-4ecb-9a71-69af254c5a18\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.228429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-scripts\") pod \"590ed346-b9bd-4ecb-9a71-69af254c5a18\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.228571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"590ed346-b9bd-4ecb-9a71-69af254c5a18\" (UID: \"590ed346-b9bd-4ecb-9a71-69af254c5a18\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.230803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-config" (OuterVolumeSpecName: "config") pod "590ed346-b9bd-4ecb-9a71-69af254c5a18" (UID: "590ed346-b9bd-4ecb-9a71-69af254c5a18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.231709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-scripts" (OuterVolumeSpecName: "scripts") pod "590ed346-b9bd-4ecb-9a71-69af254c5a18" (UID: "590ed346-b9bd-4ecb-9a71-69af254c5a18"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.232101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590ed346-b9bd-4ecb-9a71-69af254c5a18-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "590ed346-b9bd-4ecb-9a71-69af254c5a18" (UID: "590ed346-b9bd-4ecb-9a71-69af254c5a18"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.242518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "590ed346-b9bd-4ecb-9a71-69af254c5a18" (UID: "590ed346-b9bd-4ecb-9a71-69af254c5a18"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.242559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590ed346-b9bd-4ecb-9a71-69af254c5a18-kube-api-access-92hsw" (OuterVolumeSpecName: "kube-api-access-92hsw") pod "590ed346-b9bd-4ecb-9a71-69af254c5a18" (UID: "590ed346-b9bd-4ecb-9a71-69af254c5a18"). InnerVolumeSpecName "kube-api-access-92hsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.299113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590ed346-b9bd-4ecb-9a71-69af254c5a18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "590ed346-b9bd-4ecb-9a71-69af254c5a18" (UID: "590ed346-b9bd-4ecb-9a71-69af254c5a18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.328656 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-697885d478-92pg7"] Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.331817 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.331857 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.331867 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590ed346-b9bd-4ecb-9a71-69af254c5a18-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.331877 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92hsw\" (UniqueName: \"kubernetes.io/projected/590ed346-b9bd-4ecb-9a71-69af254c5a18-kube-api-access-92hsw\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.331885 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590ed346-b9bd-4ecb-9a71-69af254c5a18-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.332003 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590ed346-b9bd-4ecb-9a71-69af254c5a18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.370868 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5594499fcd-tsx9p"] Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.371581 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="openstack-network-exporter" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.371596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="openstack-network-exporter" Jan 21 15:51:42 crc kubenswrapper[4707]: E0121 15:51:42.371612 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="ovsdbserver-sb" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.371618 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="ovsdbserver-sb" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.371844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="openstack-network-exporter" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.371857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" containerName="ovsdbserver-sb" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.372817 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.382762 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5594499fcd-tsx9p"] Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.401033 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.436886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-combined-ca-bundle\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.436980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-httpd-config\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.437015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl99m\" (UniqueName: \"kubernetes.io/projected/784a4657-f448-46b4-a167-7ba1d8490acf-kube-api-access-rl99m\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.437051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-config\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.437103 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.539655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-httpd-config\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.539749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl99m\" (UniqueName: \"kubernetes.io/projected/784a4657-f448-46b4-a167-7ba1d8490acf-kube-api-access-rl99m\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.539830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-config\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc 
kubenswrapper[4707]: I0121 15:51:42.540171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-combined-ca-bundle\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.544178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-combined-ca-bundle\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.544625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-httpd-config\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.552716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-config\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.564824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl99m\" (UniqueName: \"kubernetes.io/projected/784a4657-f448-46b4-a167-7ba1d8490acf-kube-api-access-rl99m\") pod \"neutron-5594499fcd-tsx9p\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.622310 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-697885d478-92pg7"] Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.713487 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_0d2a3325-f770-40c6-a608-a44b32a2642b/ovsdbserver-nb/0.log" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.713712 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.719175 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.845081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0d2a3325-f770-40c6-a608-a44b32a2642b\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.845157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhmth\" (UniqueName: \"kubernetes.io/projected/0d2a3325-f770-40c6-a608-a44b32a2642b-kube-api-access-jhmth\") pod \"0d2a3325-f770-40c6-a608-a44b32a2642b\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.845202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-config\") pod \"0d2a3325-f770-40c6-a608-a44b32a2642b\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.845272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3325-f770-40c6-a608-a44b32a2642b-combined-ca-bundle\") pod \"0d2a3325-f770-40c6-a608-a44b32a2642b\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.845322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3325-f770-40c6-a608-a44b32a2642b-ovsdb-rundir\") pod \"0d2a3325-f770-40c6-a608-a44b32a2642b\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.845457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-scripts\") pod \"0d2a3325-f770-40c6-a608-a44b32a2642b\" (UID: \"0d2a3325-f770-40c6-a608-a44b32a2642b\") " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.846281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-config" (OuterVolumeSpecName: "config") pod "0d2a3325-f770-40c6-a608-a44b32a2642b" (UID: "0d2a3325-f770-40c6-a608-a44b32a2642b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.850083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-scripts" (OuterVolumeSpecName: "scripts") pod "0d2a3325-f770-40c6-a608-a44b32a2642b" (UID: "0d2a3325-f770-40c6-a608-a44b32a2642b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.850423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "0d2a3325-f770-40c6-a608-a44b32a2642b" (UID: "0d2a3325-f770-40c6-a608-a44b32a2642b"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.850906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2a3325-f770-40c6-a608-a44b32a2642b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "0d2a3325-f770-40c6-a608-a44b32a2642b" (UID: "0d2a3325-f770-40c6-a608-a44b32a2642b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.851494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2a3325-f770-40c6-a608-a44b32a2642b-kube-api-access-jhmth" (OuterVolumeSpecName: "kube-api-access-jhmth") pod "0d2a3325-f770-40c6-a608-a44b32a2642b" (UID: "0d2a3325-f770-40c6-a608-a44b32a2642b"). InnerVolumeSpecName "kube-api-access-jhmth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.888294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2a3325-f770-40c6-a608-a44b32a2642b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d2a3325-f770-40c6-a608-a44b32a2642b" (UID: "0d2a3325-f770-40c6-a608-a44b32a2642b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.902390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"60e0b49a-6e14-42af-9480-877a354cec42","Type":"ContainerStarted","Data":"2d20d49e7b7da58ecb8376a79713efa7bdcae5fb6a7a699d672be0eea1161321"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.911522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" event={"ID":"855ed6cf-d010-49b4-a048-37d8568bdf60","Type":"ContainerStarted","Data":"480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.911550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" event={"ID":"855ed6cf-d010-49b4-a048-37d8568bdf60","Type":"ContainerStarted","Data":"94c9dd39c6939ad6e60fbab7000e6633ff6e92b0fab17ed250225f872c612710"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.918173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"11f0696a-c173-41a3-926d-29ef8dcef753","Type":"ContainerStarted","Data":"8f3d568ef8fe02166321f2465bb001d67912afe92b6e85e4165f4ee77c91e825"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.918333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"11f0696a-c173-41a3-926d-29ef8dcef753","Type":"ContainerStarted","Data":"117bd9103ad25aebe304fbfcc64acbc37d4aab9c45ff0d48066ffe05295752c8"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.930719 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.930701389 podStartE2EDuration="4.930701389s" podCreationTimestamp="2026-01-21 15:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:42.926765819 +0000 UTC m=+3000.108282042" 
watchObservedRunningTime="2026-01-21 15:51:42.930701389 +0000 UTC m=+3000.112217611" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.934480 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_590ed346-b9bd-4ecb-9a71-69af254c5a18/ovsdbserver-sb/0.log" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.934640 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"590ed346-b9bd-4ecb-9a71-69af254c5a18","Type":"ContainerDied","Data":"af2080cb8f77d5faab5a56952c10677388a4e8f7329b27f0ca8c17058dc2bc5c"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.934760 4707 scope.go:117] "RemoveContainer" containerID="42845822ae5a24294591a58818e9e32984de7b04c2b29b2c2c6f590020e457f8" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.934641 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.936716 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_0d2a3325-f770-40c6-a608-a44b32a2642b/ovsdbserver-nb/0.log" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.936802 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerID="708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c" exitCode=2 Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.936963 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerID="b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8" exitCode=143 Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.937131 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-api" containerID="cri-o://0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246" gracePeriod=30 Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.937391 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.938840 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-httpd" containerID="cri-o://d970cb3fbc1e75bc36ac63cd33ec506b0f116a6de732638699497cdff07ac747" gracePeriod=30 Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.938969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0d2a3325-f770-40c6-a608-a44b32a2642b","Type":"ContainerDied","Data":"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.939144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0d2a3325-f770-40c6-a608-a44b32a2642b","Type":"ContainerDied","Data":"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.939164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"0d2a3325-f770-40c6-a608-a44b32a2642b","Type":"ContainerDied","Data":"a45e0a40ff54869fd1772df320da94a48e92959d3376922062d9b9c49ba19938"} Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.954369 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.954412 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.954425 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhmth\" (UniqueName: \"kubernetes.io/projected/0d2a3325-f770-40c6-a608-a44b32a2642b-kube-api-access-jhmth\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.954437 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d2a3325-f770-40c6-a608-a44b32a2642b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.954447 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3325-f770-40c6-a608-a44b32a2642b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.954456 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3325-f770-40c6-a608-a44b32a2642b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:42 crc kubenswrapper[4707]: I0121 15:51:42.975356 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.975334199 podStartE2EDuration="4.975334199s" podCreationTimestamp="2026-01-21 15:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:42.9421952 +0000 UTC m=+3000.123711422" watchObservedRunningTime="2026-01-21 15:51:42.975334199 +0000 UTC m=+3000.156850421" Jan 21 15:51:42 crc 
kubenswrapper[4707]: I0121 15:51:42.999421 4707 scope.go:117] "RemoveContainer" containerID="6d22edf7bc985ced8e3f70291090f7925ae17bc065ed02f2366a92c1421c1772" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.045701 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.064085 4707 scope.go:117] "RemoveContainer" containerID="708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.064623 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.103222 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.106903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5594499fcd-tsx9p"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.112277 4707 scope.go:117] "RemoveContainer" containerID="b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.123116 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: E0121 15:51:43.123927 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="openstack-network-exporter" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.123951 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="openstack-network-exporter" Jan 21 15:51:43 crc kubenswrapper[4707]: E0121 15:51:43.123979 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="ovsdbserver-nb" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.123985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="ovsdbserver-nb" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.124386 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="ovsdbserver-nb" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.124415 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" containerName="openstack-network-exporter" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.126276 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.128826 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.129913 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.131753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.132366 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.132563 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.147924 4707 scope.go:117] "RemoveContainer" containerID="708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c" Jan 21 15:51:43 crc kubenswrapper[4707]: E0121 15:51:43.156980 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c\": container with ID starting with 708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c not found: ID does not exist" containerID="708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.157269 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c"} err="failed to get container status \"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c\": rpc error: code = NotFound desc = could not find container \"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c\": container with ID starting with 708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c not found: ID does not exist" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.157296 4707 scope.go:117] "RemoveContainer" containerID="b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8" Jan 21 15:51:43 crc kubenswrapper[4707]: E0121 15:51:43.164258 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8\": container with ID starting with b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8 not found: ID does not exist" containerID="b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.164333 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8"} err="failed to get container status \"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8\": rpc error: code = NotFound desc = could not find container \"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8\": container with ID starting with b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8 not found: ID does not exist" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.164358 4707 scope.go:117] 
"RemoveContainer" containerID="708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.164539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.165668 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c"} err="failed to get container status \"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c\": rpc error: code = NotFound desc = could not find container \"708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c\": container with ID starting with 708964f5419ee3ff92809e343ee220a2d1ffae993010edd9d26debd17a3fba5c not found: ID does not exist" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.165690 4707 scope.go:117] "RemoveContainer" containerID="b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.166009 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8"} err="failed to get container status \"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8\": rpc error: code = NotFound desc = could not find container \"b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8\": container with ID starting with b515642ac01ca422b5cd0b92e53093465943890d1f77ebc22210ffef104f19b8 not found: ID does not exist" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.168441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.168621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.168650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.168938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.168973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-config\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 
15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.168988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msg7z\" (UniqueName: \"kubernetes.io/projected/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-kube-api-access-msg7z\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.169057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.169086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.169283 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: E0121 15:51:43.204278 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d2a3325_f770_40c6_a608_a44b32a2642b.slice/crio-a45e0a40ff54869fd1772df320da94a48e92959d3376922062d9b9c49ba19938\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590ed346_b9bd_4ecb_9a71_69af254c5a18.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590ed346_b9bd_4ecb_9a71_69af254c5a18.slice/crio-af2080cb8f77d5faab5a56952c10677388a4e8f7329b27f0ca8c17058dc2bc5c\": RecentStats: unable to find data in memory cache]" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.224661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.230951 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2a3325-f770-40c6-a608-a44b32a2642b" path="/var/lib/kubelet/pods/0d2a3325-f770-40c6-a608-a44b32a2642b/volumes" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.231573 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.242270 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.258094 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 
15:51:43.259492 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.264182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.265225 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.265385 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-6hxvs" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.265497 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.265599 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272488 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-config\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msg7z\" (UniqueName: \"kubernetes.io/projected/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-kube-api-access-msg7z\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.272585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.278760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.280498 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.281024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.281107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.281221 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.289378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.291907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-config\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.294614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.295215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.295803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msg7z\" (UniqueName: \"kubernetes.io/projected/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-kube-api-access-msg7z\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.297038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc 
kubenswrapper[4707]: I0121 15:51:43.383718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.383993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-config\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.384050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.384080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.384151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.384183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.384201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.384219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xggwr\" (UniqueName: \"kubernetes.io/projected/20cc2e06-6e76-4cca-8954-6d362c62e31a-kube-api-access-xggwr\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.451900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.460386 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xggwr\" (UniqueName: \"kubernetes.io/projected/20cc2e06-6e76-4cca-8954-6d362c62e31a-kube-api-access-xggwr\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-config\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.485429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.486545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-config\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.486632 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.486726 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.487324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.489053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.494087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.497061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.497129 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.211:8776/healthcheck\": read tcp 10.217.0.2:39932->10.217.1.211:8776: read: connection reset by peer" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.500455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xggwr\" (UniqueName: \"kubernetes.io/projected/20cc2e06-6e76-4cca-8954-6d362c62e31a-kube-api-access-xggwr\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.522014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.588939 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.960429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.960549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" event={"ID":"784a4657-f448-46b4-a167-7ba1d8490acf","Type":"ContainerStarted","Data":"5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff"} Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.960937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.960950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" event={"ID":"784a4657-f448-46b4-a167-7ba1d8490acf","Type":"ContainerStarted","Data":"a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d"} Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.960968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" event={"ID":"784a4657-f448-46b4-a167-7ba1d8490acf","Type":"ContainerStarted","Data":"cb84606a35d694b4e57c9a3cff9bacdfb68eb3b20f7931ce08395d6b82a3754d"} Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.971843 4707 generic.go:334] "Generic (PLEG): container finished" podID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerID="8155b26de07fb92fa6d4dac952758954f8d13061ae5932a9172aed2e4c3b2d3e" exitCode=0 Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.971911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f732f500-3538-4a30-8502-fbd2451b7bdd","Type":"ContainerDied","Data":"8155b26de07fb92fa6d4dac952758954f8d13061ae5932a9172aed2e4c3b2d3e"} Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.971955 4707 scope.go:117] "RemoveContainer" containerID="8155b26de07fb92fa6d4dac952758954f8d13061ae5932a9172aed2e4c3b2d3e" Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.993702 4707 generic.go:334] "Generic (PLEG): container finished" podID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerID="d970cb3fbc1e75bc36ac63cd33ec506b0f116a6de732638699497cdff07ac747" exitCode=0 Jan 21 15:51:43 crc kubenswrapper[4707]: I0121 15:51:43.993770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" event={"ID":"f97eddbb-0f7f-4c89-bb38-27636955cfeb","Type":"ContainerDied","Data":"d970cb3fbc1e75bc36ac63cd33ec506b0f116a6de732638699497cdff07ac747"} Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.005000 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" podStartSLOduration=2.004985343 podStartE2EDuration="2.004985343s" podCreationTimestamp="2026-01-21 15:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:43.99352761 +0000 UTC m=+3001.175043832" watchObservedRunningTime="2026-01-21 15:51:44.004985343 +0000 UTC m=+3001.186501565" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.005513 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.014287 4707 generic.go:334] 
"Generic (PLEG): container finished" podID="ada6986b-923f-4ca5-a85b-d0578223ae13" containerID="741b7a02e8ff9cfd2005241421caecf0b4bf770c7ec62312f29cd16e26f0ef89" exitCode=0 Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.014351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" event={"ID":"ada6986b-923f-4ca5-a85b-d0578223ae13","Type":"ContainerDied","Data":"741b7a02e8ff9cfd2005241421caecf0b4bf770c7ec62312f29cd16e26f0ef89"} Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.024119 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-api" containerID="cri-o://480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8" gracePeriod=30 Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.024325 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-httpd" containerID="cri-o://21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72" gracePeriod=30 Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.024454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" event={"ID":"855ed6cf-d010-49b4-a048-37d8568bdf60","Type":"ContainerStarted","Data":"21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72"} Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.024484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.054441 4707 scope.go:117] "RemoveContainer" containerID="b5259bb32f8e7a63c7f6470d92529c9c2b2d3d96ec7f8b2439aab3601cdc6538" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.105567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data-custom\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.105641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f732f500-3538-4a30-8502-fbd2451b7bdd-etc-machine-id\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.105675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-combined-ca-bundle\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.105725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xfc\" (UniqueName: \"kubernetes.io/projected/f732f500-3538-4a30-8502-fbd2451b7bdd-kube-api-access-p8xfc\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.107408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f732f500-3538-4a30-8502-fbd2451b7bdd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.113428 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.113933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f732f500-3538-4a30-8502-fbd2451b7bdd-logs\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.114079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.114125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-scripts\") pod \"f732f500-3538-4a30-8502-fbd2451b7bdd\" (UID: \"f732f500-3538-4a30-8502-fbd2451b7bdd\") " Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.114314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f732f500-3538-4a30-8502-fbd2451b7bdd-logs" (OuterVolumeSpecName: "logs") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.115428 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.115453 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f732f500-3538-4a30-8502-fbd2451b7bdd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.115465 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f732f500-3538-4a30-8502-fbd2451b7bdd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.116290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f732f500-3538-4a30-8502-fbd2451b7bdd-kube-api-access-p8xfc" (OuterVolumeSpecName: "kube-api-access-p8xfc") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "kube-api-access-p8xfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.117384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-scripts" (OuterVolumeSpecName: "scripts") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.126286 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" podStartSLOduration=3.126262132 podStartE2EDuration="3.126262132s" podCreationTimestamp="2026-01-21 15:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:44.051264188 +0000 UTC m=+3001.232780411" watchObservedRunningTime="2026-01-21 15:51:44.126262132 +0000 UTC m=+3001.307778355" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.134207 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.139246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.163655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data" (OuterVolumeSpecName: "config-data") pod "f732f500-3538-4a30-8502-fbd2451b7bdd" (UID: "f732f500-3538-4a30-8502-fbd2451b7bdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.217318 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.217348 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.217358 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f732f500-3538-4a30-8502-fbd2451b7bdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:44 crc kubenswrapper[4707]: I0121 15:51:44.217368 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8xfc\" (UniqueName: \"kubernetes.io/projected/f732f500-3538-4a30-8502-fbd2451b7bdd-kube-api-access-p8xfc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.049192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"e2d7e8f4-c39e-431c-acc5-9dede63b65a8","Type":"ContainerStarted","Data":"d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.049610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"e2d7e8f4-c39e-431c-acc5-9dede63b65a8","Type":"ContainerStarted","Data":"1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.049627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"e2d7e8f4-c39e-431c-acc5-9dede63b65a8","Type":"ContainerStarted","Data":"cdcbc5fbdff27eab0f274d3a39b7e7ac34685194acab823fa2c27e039e4a1317"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.052518 4707 generic.go:334] "Generic (PLEG): container finished" podID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerID="21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72" exitCode=0 Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.052585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" event={"ID":"855ed6cf-d010-49b4-a048-37d8568bdf60","Type":"ContainerDied","Data":"21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.058995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"20cc2e06-6e76-4cca-8954-6d362c62e31a","Type":"ContainerStarted","Data":"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.059080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"20cc2e06-6e76-4cca-8954-6d362c62e31a","Type":"ContainerStarted","Data":"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.059096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"20cc2e06-6e76-4cca-8954-6d362c62e31a","Type":"ContainerStarted","Data":"3ec29c63c8b1bd3a54b3d8606ca2e9a84f27dd721039886ca3596c80b8400f91"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067134 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_35a31ae7-4ab2-4b98-bd9d-66decb56f00a/ovn-northd/0.log" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067464 4707 generic.go:334] "Generic (PLEG): container finished" podID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" exitCode=139 Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067223 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_35a31ae7-4ab2-4b98-bd9d-66decb56f00a/ovn-northd/0.log" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067600 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"35a31ae7-4ab2-4b98-bd9d-66decb56f00a","Type":"ContainerDied","Data":"c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"35a31ae7-4ab2-4b98-bd9d-66decb56f00a","Type":"ContainerDied","Data":"21865c72b9ecd808248f45672315160f1c804520877818708a020bd1feacdf08"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.067723 4707 scope.go:117] "RemoveContainer" containerID="68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.070152 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.070136004 podStartE2EDuration="3.070136004s" podCreationTimestamp="2026-01-21 15:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:45.06429454 +0000 UTC m=+3002.245810763" watchObservedRunningTime="2026-01-21 15:51:45.070136004 +0000 UTC m=+3002.251652236" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.072443 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.074517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f732f500-3538-4a30-8502-fbd2451b7bdd","Type":"ContainerDied","Data":"dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b"} Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.085993 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.085973543 podStartE2EDuration="2.085973543s" podCreationTimestamp="2026-01-21 15:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:45.085826967 +0000 UTC m=+3002.267343189" watchObservedRunningTime="2026-01-21 15:51:45.085973543 +0000 UTC m=+3002.267489765" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.090384 4707 scope.go:117] "RemoveContainer" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.121994 4707 scope.go:117] "RemoveContainer" containerID="68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991" Jan 21 15:51:45 crc kubenswrapper[4707]: E0121 15:51:45.128843 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991\": container with ID starting with 68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991 not found: ID does not exist" containerID="68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.128889 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991"} err="failed to get container status \"68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991\": rpc error: code = NotFound desc = could not find container \"68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991\": container with ID starting with 68ff74d5fd18140224310a757cc80d315ee768ab9221278a17908344525dd991 not found: ID does not exist" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.128914 4707 scope.go:117] "RemoveContainer" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" Jan 21 15:51:45 crc kubenswrapper[4707]: E0121 15:51:45.131727 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293\": container with ID starting with c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293 not found: ID does not exist" containerID="c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.131761 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293"} err="failed to get container status \"c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293\": rpc error: code = NotFound desc = could not find container \"c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293\": container with ID starting with 
c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293 not found: ID does not exist" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.166166 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.179403 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.208612 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590ed346-b9bd-4ecb-9a71-69af254c5a18" path="/var/lib/kubelet/pods/590ed346-b9bd-4ecb-9a71-69af254c5a18/volumes" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.209411 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" path="/var/lib/kubelet/pods/f732f500-3538-4a30-8502-fbd2451b7bdd/volumes" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231150 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:45 crc kubenswrapper[4707]: E0121 15:51:45.231549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api-log" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231563 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api-log" Jan 21 15:51:45 crc kubenswrapper[4707]: E0121 15:51:45.231583 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="openstack-network-exporter" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="openstack-network-exporter" Jan 21 15:51:45 crc kubenswrapper[4707]: E0121 15:51:45.231608 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231614 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" Jan 21 15:51:45 crc kubenswrapper[4707]: E0121 15:51:45.231622 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="ovn-northd" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231628 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="ovn-northd" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231845 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="openstack-network-exporter" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231856 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api-log" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" containerName="ovn-northd" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.231893 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f732f500-3538-4a30-8502-fbd2451b7bdd" containerName="cinder-api" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.232781 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.232884 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.235343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.238947 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.240397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-combined-ca-bundle\") pod \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.240583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-config\") pod \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.240740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-scripts\") pod \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.240787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-ovn-rundir\") pod \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.241071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbdw6\" (UniqueName: \"kubernetes.io/projected/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-kube-api-access-hbdw6\") pod \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\" (UID: \"35a31ae7-4ab2-4b98-bd9d-66decb56f00a\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.242537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-config" (OuterVolumeSpecName: "config") pod "35a31ae7-4ab2-4b98-bd9d-66decb56f00a" (UID: "35a31ae7-4ab2-4b98-bd9d-66decb56f00a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.242894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-scripts" (OuterVolumeSpecName: "scripts") pod "35a31ae7-4ab2-4b98-bd9d-66decb56f00a" (UID: "35a31ae7-4ab2-4b98-bd9d-66decb56f00a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.243120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "35a31ae7-4ab2-4b98-bd9d-66decb56f00a" (UID: "35a31ae7-4ab2-4b98-bd9d-66decb56f00a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.248165 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.250162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-kube-api-access-hbdw6" (OuterVolumeSpecName: "kube-api-access-hbdw6") pod "35a31ae7-4ab2-4b98-bd9d-66decb56f00a" (UID: "35a31ae7-4ab2-4b98-bd9d-66decb56f00a"). InnerVolumeSpecName "kube-api-access-hbdw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.286621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35a31ae7-4ab2-4b98-bd9d-66decb56f00a" (UID: "35a31ae7-4ab2-4b98-bd9d-66decb56f00a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.344977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data-custom\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-scripts\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c535f926-bfe0-4448-8090-1b942fe85c89-logs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345228 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv2hb\" (UniqueName: \"kubernetes.io/projected/c535f926-bfe0-4448-8090-1b942fe85c89-kube-api-access-fv2hb\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c535f926-bfe0-4448-8090-1b942fe85c89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345593 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345605 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345612 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345623 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbdw6\" (UniqueName: \"kubernetes.io/projected/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-kube-api-access-hbdw6\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.345633 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a31ae7-4ab2-4b98-bd9d-66decb56f00a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.366783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5594499fcd-tsx9p"] Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.425868 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-cbb456574-nld82"] Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.427429 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.439292 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.439434 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.439533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.439856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-cbb456574-nld82"] Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.450428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.450456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data-custom\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.451926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-scripts\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.452081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c535f926-bfe0-4448-8090-1b942fe85c89-logs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.452110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv2hb\" (UniqueName: \"kubernetes.io/projected/c535f926-bfe0-4448-8090-1b942fe85c89-kube-api-access-fv2hb\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.452130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.452222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c535f926-bfe0-4448-8090-1b942fe85c89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.452291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.452452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.455145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c535f926-bfe0-4448-8090-1b942fe85c89-logs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.455273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c535f926-bfe0-4448-8090-1b942fe85c89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.502827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.511451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.533517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-scripts\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.554874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-ovndb-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.555045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-httpd-config\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.555222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-combined-ca-bundle\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " 
pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.555561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-config\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.555647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-internal-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.555774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6tnw\" (UniqueName: \"kubernetes.io/projected/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-kube-api-access-c6tnw\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.555926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-public-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.556717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv2hb\" (UniqueName: \"kubernetes.io/projected/c535f926-bfe0-4448-8090-1b942fe85c89-kube-api-access-fv2hb\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.567372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.593912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.608385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data-custom\") pod \"cinder-api-0\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.658757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-config\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc 
kubenswrapper[4707]: I0121 15:51:45.658833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-internal-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.658873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6tnw\" (UniqueName: \"kubernetes.io/projected/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-kube-api-access-c6tnw\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.658920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-public-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.658949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-ovndb-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.658986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-httpd-config\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.659024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-combined-ca-bundle\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.663979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-config\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.664653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-ovndb-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.666845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-combined-ca-bundle\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.672325 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-public-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.672334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-internal-tls-certs\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.672405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-httpd-config\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.683532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6tnw\" (UniqueName: \"kubernetes.io/projected/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-kube-api-access-c6tnw\") pod \"neutron-cbb456574-nld82\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.719067 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.760274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd5cn\" (UniqueName: \"kubernetes.io/projected/ada6986b-923f-4ca5-a85b-d0578223ae13-kube-api-access-zd5cn\") pod \"ada6986b-923f-4ca5-a85b-d0578223ae13\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.760369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-config-data\") pod \"ada6986b-923f-4ca5-a85b-d0578223ae13\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.760598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-combined-ca-bundle\") pod \"ada6986b-923f-4ca5-a85b-d0578223ae13\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.760626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-credential-keys\") pod \"ada6986b-923f-4ca5-a85b-d0578223ae13\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.760642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-fernet-keys\") pod \"ada6986b-923f-4ca5-a85b-d0578223ae13\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.760685 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-scripts\") pod \"ada6986b-923f-4ca5-a85b-d0578223ae13\" (UID: \"ada6986b-923f-4ca5-a85b-d0578223ae13\") " Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.764055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada6986b-923f-4ca5-a85b-d0578223ae13-kube-api-access-zd5cn" (OuterVolumeSpecName: "kube-api-access-zd5cn") pod "ada6986b-923f-4ca5-a85b-d0578223ae13" (UID: "ada6986b-923f-4ca5-a85b-d0578223ae13"). InnerVolumeSpecName "kube-api-access-zd5cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.766323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ada6986b-923f-4ca5-a85b-d0578223ae13" (UID: "ada6986b-923f-4ca5-a85b-d0578223ae13"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.766922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ada6986b-923f-4ca5-a85b-d0578223ae13" (UID: "ada6986b-923f-4ca5-a85b-d0578223ae13"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.767511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-scripts" (OuterVolumeSpecName: "scripts") pod "ada6986b-923f-4ca5-a85b-d0578223ae13" (UID: "ada6986b-923f-4ca5-a85b-d0578223ae13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.786821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada6986b-923f-4ca5-a85b-d0578223ae13" (UID: "ada6986b-923f-4ca5-a85b-d0578223ae13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.788503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-config-data" (OuterVolumeSpecName: "config-data") pod "ada6986b-923f-4ca5-a85b-d0578223ae13" (UID: "ada6986b-923f-4ca5-a85b-d0578223ae13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.862863 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.862891 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.862901 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.862910 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.862921 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd5cn\" (UniqueName: \"kubernetes.io/projected/ada6986b-923f-4ca5-a85b-d0578223ae13-kube-api-access-zd5cn\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.862931 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada6986b-923f-4ca5-a85b-d0578223ae13-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.898052 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:45 crc kubenswrapper[4707]: I0121 15:51:45.944825 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.084590 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.087234 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-api" containerID="cri-o://a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d" gracePeriod=30 Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.087510 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.087970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-khvzf" event={"ID":"ada6986b-923f-4ca5-a85b-d0578223ae13","Type":"ContainerDied","Data":"3d7cbed09de0050d7563879ab667ea6edc3c8a0d862e3fc69264df0be5a5f75e"} Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.088018 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d7cbed09de0050d7563879ab667ea6edc3c8a0d862e3fc69264df0be5a5f75e" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.088216 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-httpd" containerID="cri-o://5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff" gracePeriod=30 Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.134680 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.134897 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.143027 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:51:46 crc kubenswrapper[4707]: E0121 15:51:46.143414 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada6986b-923f-4ca5-a85b-d0578223ae13" containerName="keystone-bootstrap" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.143427 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada6986b-923f-4ca5-a85b-d0578223ae13" containerName="keystone-bootstrap" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.143634 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada6986b-923f-4ca5-a85b-d0578223ae13" containerName="keystone-bootstrap" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.144939 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.146787 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-q9bf5" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.147595 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.148367 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.148596 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.169483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.169981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.170143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.170316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-scripts\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.170407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v44fm\" (UniqueName: \"kubernetes.io/projected/303e46a4-d2f9-4512-ae87-c50a52d62019-kube-api-access-v44fm\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.170570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-config\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.170677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: 
I0121 15:51:46.175409 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.193688 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-877bd7b8b-p9zhz"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.193919 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" podUID="b0ff6c1a-8a03-430f-a291-4e4b94123939" containerName="keystone-api" containerID="cri-o://c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d" gracePeriod=30 Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.263433 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7fd958d6cb-2thfc"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.264995 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.269002 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.269174 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.269478 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7fd958d6cb-2thfc"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.276770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-scripts\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.276855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v44fm\" (UniqueName: \"kubernetes.io/projected/303e46a4-d2f9-4512-ae87-c50a52d62019-kube-api-access-v44fm\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.276899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58fxm\" (UniqueName: \"kubernetes.io/projected/239fec41-be3f-42d6-b0f2-a17c6c13efc8-kube-api-access-58fxm\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.276944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-config-data\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-config\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277231 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-credential-keys\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-public-tls-certs\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-combined-ca-bundle\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-fernet-keys\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-scripts\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.277798 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-internal-tls-certs\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.278858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.279671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-config\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.283222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-scripts\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.294107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v44fm\" (UniqueName: \"kubernetes.io/projected/303e46a4-d2f9-4512-ae87-c50a52d62019-kube-api-access-v44fm\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.295051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.295109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.295708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.346968 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.379795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-combined-ca-bundle\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.379967 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-fernet-keys\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.380063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-scripts\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.380138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-internal-tls-certs\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.380316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58fxm\" (UniqueName: \"kubernetes.io/projected/239fec41-be3f-42d6-b0f2-a17c6c13efc8-kube-api-access-58fxm\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.380406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-config-data\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.380531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-credential-keys\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.380616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-public-tls-certs\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.387490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-public-tls-certs\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.390338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-fernet-keys\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.391059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-scripts\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.391295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-credential-keys\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.393275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-internal-tls-certs\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.396239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-combined-ca-bundle\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.399590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-config-data\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.411290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58fxm\" (UniqueName: \"kubernetes.io/projected/239fec41-be3f-42d6-b0f2-a17c6c13efc8-kube-api-access-58fxm\") pod \"keystone-7fd958d6cb-2thfc\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.449944 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-cbb456574-nld82"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.471459 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.495207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.590968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.600353 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.914113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.988406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.990324 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.991593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:46 crc kubenswrapper[4707]: I0121 15:51:46.996400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.067220 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.074236 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.076521 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:47 crc kubenswrapper[4707]: E0121 15:51:47.132986 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:47 crc kubenswrapper[4707]: E0121 15:51:47.138713 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.144851 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-95c949c86-zbgkh_f97eddbb-0f7f-4c89-bb38-27636955cfeb/neutron-api/0.log" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.144898 4707 generic.go:334] "Generic (PLEG): container finished" podID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerID="0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246" exitCode=1 Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.144968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" event={"ID":"f97eddbb-0f7f-4c89-bb38-27636955cfeb","Type":"ContainerDied","Data":"0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246"} Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.147933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c535f926-bfe0-4448-8090-1b942fe85c89","Type":"ContainerStarted","Data":"55bf97847d806798394ead48d7c3b5a0d95018aea305415b900a7b555b0ad638"} Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.151485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" 
event={"ID":"303e46a4-d2f9-4512-ae87-c50a52d62019","Type":"ContainerStarted","Data":"a2242f50b778f4e468ac99e97508a558454c0b2fa045be20e9a4ae34e86ebfe7"} Jan 21 15:51:47 crc kubenswrapper[4707]: E0121 15:51:47.169473 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:47 crc kubenswrapper[4707]: E0121 15:51:47.169541 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.187058 4707 generic.go:334] "Generic (PLEG): container finished" podID="784a4657-f448-46b4-a167-7ba1d8490acf" containerID="5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff" exitCode=0 Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.187532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" event={"ID":"784a4657-f448-46b4-a167-7ba1d8490acf","Type":"ContainerDied","Data":"5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff"} Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.187632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7fd958d6cb-2thfc"] Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.208695 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a31ae7-4ab2-4b98-bd9d-66decb56f00a" path="/var/lib/kubelet/pods/35a31ae7-4ab2-4b98-bd9d-66decb56f00a/volumes" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.209403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" event={"ID":"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1","Type":"ContainerStarted","Data":"9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6"} Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.209441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.209471 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.209480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" event={"ID":"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1","Type":"ContainerStarted","Data":"59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1"} Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.209489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" event={"ID":"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1","Type":"ContainerStarted","Data":"365f9f6f1807fddd92ab88221bda5d20a734a8e6bbe893232c6bec357b599dd2"} Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.209511 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.231614 4707 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" podStartSLOduration=2.231593859 podStartE2EDuration="2.231593859s" podCreationTimestamp="2026-01-21 15:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:47.212524684 +0000 UTC m=+3004.394040906" watchObservedRunningTime="2026-01-21 15:51:47.231593859 +0000 UTC m=+3004.413110081" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.620832 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-95c949c86-zbgkh_f97eddbb-0f7f-4c89-bb38-27636955cfeb/neutron-api/0.log" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.620979 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.635150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx8mf\" (UniqueName: \"kubernetes.io/projected/f97eddbb-0f7f-4c89-bb38-27636955cfeb-kube-api-access-kx8mf\") pod \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.635200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-config\") pod \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.635246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-httpd-config\") pod \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.635310 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-combined-ca-bundle\") pod \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\" (UID: \"f97eddbb-0f7f-4c89-bb38-27636955cfeb\") " Jan 21 15:51:47 crc kubenswrapper[4707]: E0121 15:51:47.650917 4707 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/301f75676cc81364763c0e1d7d995d8ff24c00f26fc138571bc276ab7a3eb8ce/diff" to get inode usage: stat /var/lib/containers/storage/overlay/301f75676cc81364763c0e1d7d995d8ff24c00f26fc138571bc276ab7a3eb8ce/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_ovn-northd-0_35a31ae7-4ab2-4b98-bd9d-66decb56f00a/ovn-northd/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_ovn-northd-0_35a31ae7-4ab2-4b98-bd9d-66decb56f00a/ovn-northd/0.log: no such file or directory Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.651871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f97eddbb-0f7f-4c89-bb38-27636955cfeb-kube-api-access-kx8mf" (OuterVolumeSpecName: "kube-api-access-kx8mf") pod "f97eddbb-0f7f-4c89-bb38-27636955cfeb" (UID: "f97eddbb-0f7f-4c89-bb38-27636955cfeb"). InnerVolumeSpecName "kube-api-access-kx8mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.654925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f97eddbb-0f7f-4c89-bb38-27636955cfeb" (UID: "f97eddbb-0f7f-4c89-bb38-27636955cfeb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.737406 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx8mf\" (UniqueName: \"kubernetes.io/projected/f97eddbb-0f7f-4c89-bb38-27636955cfeb-kube-api-access-kx8mf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.737445 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.777791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f97eddbb-0f7f-4c89-bb38-27636955cfeb" (UID: "f97eddbb-0f7f-4c89-bb38-27636955cfeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.786102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-config" (OuterVolumeSpecName: "config") pod "f97eddbb-0f7f-4c89-bb38-27636955cfeb" (UID: "f97eddbb-0f7f-4c89-bb38-27636955cfeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.841113 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.841146 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97eddbb-0f7f-4c89-bb38-27636955cfeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:47 crc kubenswrapper[4707]: I0121 15:51:47.961102 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.205535 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-95c949c86-zbgkh_f97eddbb-0f7f-4c89-bb38-27636955cfeb/neutron-api/0.log" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.205641 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.205642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-95c949c86-zbgkh" event={"ID":"f97eddbb-0f7f-4c89-bb38-27636955cfeb","Type":"ContainerDied","Data":"8e115506bddafddea6ccc0304ef16d7840d849ae4d01c365b3992e4647182add"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.205786 4707 scope.go:117] "RemoveContainer" containerID="d970cb3fbc1e75bc36ac63cd33ec506b0f116a6de732638699497cdff07ac747" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.207481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" event={"ID":"239fec41-be3f-42d6-b0f2-a17c6c13efc8","Type":"ContainerStarted","Data":"7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.207530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" event={"ID":"239fec41-be3f-42d6-b0f2-a17c6c13efc8","Type":"ContainerStarted","Data":"0ace1425f2ca8e06ded67088cecc0119f964b3992e922c6374e0fee32c40fab2"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.207577 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.209439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c535f926-bfe0-4448-8090-1b942fe85c89","Type":"ContainerStarted","Data":"6861b5894c2e268dd615bf72e68c53f494b1ac91f918b0c29b772a58817145a6"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.209467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c535f926-bfe0-4448-8090-1b942fe85c89","Type":"ContainerStarted","Data":"3c571ef8d58ac48cc178a7b7fbb07a0155fb018f81b61692d264dbdd939885a1"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.209634 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.213575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"303e46a4-d2f9-4512-ae87-c50a52d62019","Type":"ContainerStarted","Data":"9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.213604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"303e46a4-d2f9-4512-ae87-c50a52d62019","Type":"ContainerStarted","Data":"307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6"} Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.214180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.214553 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.226481 4707 scope.go:117] "RemoveContainer" containerID="0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.229475 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" 
podStartSLOduration=2.22946083 podStartE2EDuration="2.22946083s" podCreationTimestamp="2026-01-21 15:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:48.222487178 +0000 UTC m=+3005.404003400" watchObservedRunningTime="2026-01-21 15:51:48.22946083 +0000 UTC m=+3005.410977052" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.244260 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-95c949c86-zbgkh"] Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.250245 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-95c949c86-zbgkh"] Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.258404 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.258389283 podStartE2EDuration="3.258389283s" podCreationTimestamp="2026-01-21 15:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:48.253626738 +0000 UTC m=+3005.435142961" watchObservedRunningTime="2026-01-21 15:51:48.258389283 +0000 UTC m=+3005.439905505" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.275223 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.275200653 podStartE2EDuration="2.275200653s" podCreationTimestamp="2026-01-21 15:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:48.272831958 +0000 UTC m=+3005.454348180" watchObservedRunningTime="2026-01-21 15:51:48.275200653 +0000 UTC m=+3005.456716875" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.388189 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.461868 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.519064 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.565596 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-f6b9744b8-g24th"] Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.565834 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api-log" containerID="cri-o://0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17" gracePeriod=30 Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.566222 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api" containerID="cri-o://38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763" gracePeriod=30 Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.591061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:48 crc 
kubenswrapper[4707]: I0121 15:51:48.896582 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.896633 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.922278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:48 crc kubenswrapper[4707]: I0121 15:51:48.947083 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.192178 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" path="/var/lib/kubelet/pods/f97eddbb-0f7f-4c89-bb38-27636955cfeb/volumes" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.239539 4707 generic.go:334] "Generic (PLEG): container finished" podID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerID="0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17" exitCode=143 Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.239697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" event={"ID":"5206c260-c7a5-4e61-bd58-4ed5772520e6","Type":"ContainerDied","Data":"0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17"} Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.242480 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.242504 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.512966 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.562880 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.692288 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.698366 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.698395 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.729226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.739081 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:51:49 crc kubenswrapper[4707]: I0121 15:51:49.757143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.191131 
4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-697885d478-92pg7_855ed6cf-d010-49b4-a048-37d8568bdf60/neutron-api/0.log" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.191473 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.243608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.263686 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5594499fcd-tsx9p_784a4657-f448-46b4-a167-7ba1d8490acf/neutron-api/0.log" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.263738 4707 generic.go:334] "Generic (PLEG): container finished" podID="784a4657-f448-46b4-a167-7ba1d8490acf" containerID="a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d" exitCode=1 Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.263832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" event={"ID":"784a4657-f448-46b4-a167-7ba1d8490acf","Type":"ContainerDied","Data":"a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d"} Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.270741 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-697885d478-92pg7_855ed6cf-d010-49b4-a048-37d8568bdf60/neutron-api/0.log" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.270791 4707 generic.go:334] "Generic (PLEG): container finished" podID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerID="480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8" exitCode=1 Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.271575 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.272048 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" event={"ID":"855ed6cf-d010-49b4-a048-37d8568bdf60","Type":"ContainerDied","Data":"480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8"} Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.272080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-697885d478-92pg7" event={"ID":"855ed6cf-d010-49b4-a048-37d8568bdf60","Type":"ContainerDied","Data":"94c9dd39c6939ad6e60fbab7000e6633ff6e92b0fab17ed250225f872c612710"} Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.272097 4707 scope.go:117] "RemoveContainer" containerID="21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.274970 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.275187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.295494 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-config\") pod \"855ed6cf-d010-49b4-a048-37d8568bdf60\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.295666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wksdj\" (UniqueName: \"kubernetes.io/projected/855ed6cf-d010-49b4-a048-37d8568bdf60-kube-api-access-wksdj\") pod \"855ed6cf-d010-49b4-a048-37d8568bdf60\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.295912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-combined-ca-bundle\") pod \"855ed6cf-d010-49b4-a048-37d8568bdf60\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.295981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-httpd-config\") pod \"855ed6cf-d010-49b4-a048-37d8568bdf60\" (UID: \"855ed6cf-d010-49b4-a048-37d8568bdf60\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.306500 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855ed6cf-d010-49b4-a048-37d8568bdf60-kube-api-access-wksdj" (OuterVolumeSpecName: "kube-api-access-wksdj") pod "855ed6cf-d010-49b4-a048-37d8568bdf60" (UID: "855ed6cf-d010-49b4-a048-37d8568bdf60"). InnerVolumeSpecName "kube-api-access-wksdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.307947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "855ed6cf-d010-49b4-a048-37d8568bdf60" (UID: "855ed6cf-d010-49b4-a048-37d8568bdf60"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.341845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "855ed6cf-d010-49b4-a048-37d8568bdf60" (UID: "855ed6cf-d010-49b4-a048-37d8568bdf60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.349109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-config" (OuterVolumeSpecName: "config") pod "855ed6cf-d010-49b4-a048-37d8568bdf60" (UID: "855ed6cf-d010-49b4-a048-37d8568bdf60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.349265 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5594499fcd-tsx9p_784a4657-f448-46b4-a167-7ba1d8490acf/neutron-api/0.log" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.349377 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.358196 4707 scope.go:117] "RemoveContainer" containerID="480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.397767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-combined-ca-bundle\") pod \"784a4657-f448-46b4-a167-7ba1d8490acf\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.397905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-httpd-config\") pod \"784a4657-f448-46b4-a167-7ba1d8490acf\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.397940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-config\") pod \"784a4657-f448-46b4-a167-7ba1d8490acf\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.398932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl99m\" (UniqueName: \"kubernetes.io/projected/784a4657-f448-46b4-a167-7ba1d8490acf-kube-api-access-rl99m\") pod \"784a4657-f448-46b4-a167-7ba1d8490acf\" (UID: \"784a4657-f448-46b4-a167-7ba1d8490acf\") " Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.400053 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.400077 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wksdj\" (UniqueName: \"kubernetes.io/projected/855ed6cf-d010-49b4-a048-37d8568bdf60-kube-api-access-wksdj\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.400092 4707 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.400103 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/855ed6cf-d010-49b4-a048-37d8568bdf60-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.403992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "784a4657-f448-46b4-a167-7ba1d8490acf" (UID: "784a4657-f448-46b4-a167-7ba1d8490acf"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.404031 4707 scope.go:117] "RemoveContainer" containerID="21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72" Jan 21 15:51:50 crc kubenswrapper[4707]: E0121 15:51:50.406049 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72\": container with ID starting with 21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72 not found: ID does not exist" containerID="21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.406081 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72"} err="failed to get container status \"21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72\": rpc error: code = NotFound desc = could not find container \"21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72\": container with ID starting with 21e86d28c3e919b8fa75debc40265bf7066b29ef5aed9edd35c18357a6a58f72 not found: ID does not exist" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.406100 4707 scope.go:117] "RemoveContainer" containerID="480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8" Jan 21 15:51:50 crc kubenswrapper[4707]: E0121 15:51:50.407093 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8\": container with ID starting with 480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8 not found: ID does not exist" containerID="480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.407134 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8"} err="failed to get container status \"480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8\": rpc error: code = NotFound desc = could not find container \"480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8\": container with ID starting with 480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8 not found: ID does not exist" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.407995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/784a4657-f448-46b4-a167-7ba1d8490acf-kube-api-access-rl99m" (OuterVolumeSpecName: "kube-api-access-rl99m") pod "784a4657-f448-46b4-a167-7ba1d8490acf" (UID: "784a4657-f448-46b4-a167-7ba1d8490acf"). InnerVolumeSpecName "kube-api-access-rl99m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.448889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-config" (OuterVolumeSpecName: "config") pod "784a4657-f448-46b4-a167-7ba1d8490acf" (UID: "784a4657-f448-46b4-a167-7ba1d8490acf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.498791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "784a4657-f448-46b4-a167-7ba1d8490acf" (UID: "784a4657-f448-46b4-a167-7ba1d8490acf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.502241 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.502297 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.502310 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/784a4657-f448-46b4-a167-7ba1d8490acf-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.502320 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl99m\" (UniqueName: \"kubernetes.io/projected/784a4657-f448-46b4-a167-7ba1d8490acf-kube-api-access-rl99m\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.600004 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-697885d478-92pg7"] Jan 21 15:51:50 crc kubenswrapper[4707]: I0121 15:51:50.607188 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-697885d478-92pg7"] Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.191300 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" path="/var/lib/kubelet/pods/855ed6cf-d010-49b4-a048-37d8568bdf60/volumes" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.281785 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5594499fcd-tsx9p_784a4657-f448-46b4-a167-7ba1d8490acf/neutron-api/0.log" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.281947 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.281964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5594499fcd-tsx9p" event={"ID":"784a4657-f448-46b4-a167-7ba1d8490acf","Type":"ContainerDied","Data":"cb84606a35d694b4e57c9a3cff9bacdfb68eb3b20f7931ce08395d6b82a3754d"} Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.282007 4707 scope.go:117] "RemoveContainer" containerID="5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.300476 4707 scope.go:117] "RemoveContainer" containerID="a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.304940 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5594499fcd-tsx9p"] Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.312830 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5594499fcd-tsx9p"] Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.345296 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.345413 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:51:51 crc kubenswrapper[4707]: I0121 15:51:51.346742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.036636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.037731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.105081 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.107548 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.117903 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.125434 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.125478 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.234586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-combined-ca-bundle\") pod \"b0ff6c1a-8a03-430f-a291-4e4b94123939\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.234659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96598\" (UniqueName: \"kubernetes.io/projected/b0ff6c1a-8a03-430f-a291-4e4b94123939-kube-api-access-96598\") pod \"b0ff6c1a-8a03-430f-a291-4e4b94123939\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.234720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-scripts\") pod \"b0ff6c1a-8a03-430f-a291-4e4b94123939\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.234785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-fernet-keys\") pod \"b0ff6c1a-8a03-430f-a291-4e4b94123939\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.234866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-credential-keys\") pod \"b0ff6c1a-8a03-430f-a291-4e4b94123939\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.234999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-config-data\") pod \"b0ff6c1a-8a03-430f-a291-4e4b94123939\" (UID: \"b0ff6c1a-8a03-430f-a291-4e4b94123939\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.239749 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-scripts" (OuterVolumeSpecName: "scripts") pod "b0ff6c1a-8a03-430f-a291-4e4b94123939" (UID: "b0ff6c1a-8a03-430f-a291-4e4b94123939"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.240237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0ff6c1a-8a03-430f-a291-4e4b94123939" (UID: "b0ff6c1a-8a03-430f-a291-4e4b94123939"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.240362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ff6c1a-8a03-430f-a291-4e4b94123939-kube-api-access-96598" (OuterVolumeSpecName: "kube-api-access-96598") pod "b0ff6c1a-8a03-430f-a291-4e4b94123939" (UID: "b0ff6c1a-8a03-430f-a291-4e4b94123939"). InnerVolumeSpecName "kube-api-access-96598". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.240691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0ff6c1a-8a03-430f-a291-4e4b94123939" (UID: "b0ff6c1a-8a03-430f-a291-4e4b94123939"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.258357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0ff6c1a-8a03-430f-a291-4e4b94123939" (UID: "b0ff6c1a-8a03-430f-a291-4e4b94123939"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.260011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-config-data" (OuterVolumeSpecName: "config-data") pod "b0ff6c1a-8a03-430f-a291-4e4b94123939" (UID: "b0ff6c1a-8a03-430f-a291-4e4b94123939"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.276950 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.322205 4707 generic.go:334] "Generic (PLEG): container finished" podID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerID="38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763" exitCode=0 Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.323430 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.324030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" event={"ID":"5206c260-c7a5-4e61-bd58-4ed5772520e6","Type":"ContainerDied","Data":"38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763"} Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.324083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f6b9744b8-g24th" event={"ID":"5206c260-c7a5-4e61-bd58-4ed5772520e6","Type":"ContainerDied","Data":"b3648349800419d4f2b03de9add01633483615336d5efe41b825a57326ba8a2e"} Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.324103 4707 scope.go:117] "RemoveContainer" containerID="38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.335335 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0ff6c1a-8a03-430f-a291-4e4b94123939" containerID="c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d" exitCode=0 Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337163 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337186 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96598\" (UniqueName: \"kubernetes.io/projected/b0ff6c1a-8a03-430f-a291-4e4b94123939-kube-api-access-96598\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337197 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337207 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337216 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337224 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ff6c1a-8a03-430f-a291-4e4b94123939-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.337701 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.338303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" event={"ID":"b0ff6c1a-8a03-430f-a291-4e4b94123939","Type":"ContainerDied","Data":"c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d"} Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.338356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-877bd7b8b-p9zhz" event={"ID":"b0ff6c1a-8a03-430f-a291-4e4b94123939","Type":"ContainerDied","Data":"1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b"} Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.359286 4707 scope.go:117] "RemoveContainer" containerID="0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.384962 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-877bd7b8b-p9zhz"] Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.397075 4707 scope.go:117] "RemoveContainer" containerID="38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.397531 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763\": container with ID starting with 38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763 not found: ID does not exist" containerID="38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.397658 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763"} err="failed to get container status \"38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763\": rpc error: code = NotFound desc = could not find container \"38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763\": container with ID starting with 38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763 not found: ID does not exist" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.397752 4707 scope.go:117] "RemoveContainer" containerID="0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.398207 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17\": container with ID starting with 0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17 not found: ID does not exist" containerID="0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.398328 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17"} err="failed to get container status \"0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17\": rpc error: code = NotFound desc = could not find container \"0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17\": container with ID starting with 0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17 not found: ID does not exist" Jan 21 
15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.398409 4707 scope.go:117] "RemoveContainer" containerID="c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.399604 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-877bd7b8b-p9zhz"] Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.419685 4707 scope.go:117] "RemoveContainer" containerID="c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.425398 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d\": container with ID starting with c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d not found: ID does not exist" containerID="c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.425451 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d"} err="failed to get container status \"c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d\": rpc error: code = NotFound desc = could not find container \"c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d\": container with ID starting with c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d not found: ID does not exist" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.438672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-combined-ca-bundle\") pod \"5206c260-c7a5-4e61-bd58-4ed5772520e6\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.438740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data\") pod \"5206c260-c7a5-4e61-bd58-4ed5772520e6\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.439020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5206c260-c7a5-4e61-bd58-4ed5772520e6-logs\") pod \"5206c260-c7a5-4e61-bd58-4ed5772520e6\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.439069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5mbl\" (UniqueName: \"kubernetes.io/projected/5206c260-c7a5-4e61-bd58-4ed5772520e6-kube-api-access-j5mbl\") pod \"5206c260-c7a5-4e61-bd58-4ed5772520e6\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.439128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data-custom\") pod \"5206c260-c7a5-4e61-bd58-4ed5772520e6\" (UID: \"5206c260-c7a5-4e61-bd58-4ed5772520e6\") " Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.440942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5206c260-c7a5-4e61-bd58-4ed5772520e6-logs" 
(OuterVolumeSpecName: "logs") pod "5206c260-c7a5-4e61-bd58-4ed5772520e6" (UID: "5206c260-c7a5-4e61-bd58-4ed5772520e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.446635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5206c260-c7a5-4e61-bd58-4ed5772520e6" (UID: "5206c260-c7a5-4e61-bd58-4ed5772520e6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.447975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5206c260-c7a5-4e61-bd58-4ed5772520e6-kube-api-access-j5mbl" (OuterVolumeSpecName: "kube-api-access-j5mbl") pod "5206c260-c7a5-4e61-bd58-4ed5772520e6" (UID: "5206c260-c7a5-4e61-bd58-4ed5772520e6"). InnerVolumeSpecName "kube-api-access-j5mbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.468700 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5206c260-c7a5-4e61-bd58-4ed5772520e6" (UID: "5206c260-c7a5-4e61-bd58-4ed5772520e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.483469 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-565595d96-hb7l7"] Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484102 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484123 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484145 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484152 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484165 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484171 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484180 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api-log" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484187 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api-log" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484197 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: 
I0121 15:51:52.484202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484212 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ff6c1a-8a03-430f-a291-4e4b94123939" containerName="keystone-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ff6c1a-8a03-430f-a291-4e4b94123939" containerName="keystone-api" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484231 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484237 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484260 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484265 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: E0121 15:51:52.484278 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484285 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484459 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484472 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484483 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ff6c1a-8a03-430f-a291-4e4b94123939" containerName="keystone-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484498 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="855ed6cf-d010-49b4-a048-37d8568bdf60" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484505 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484514 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-api" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484521 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f97eddbb-0f7f-4c89-bb38-27636955cfeb" containerName="neutron-httpd" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484528 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" containerName="barbican-api-log" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.484539 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" containerName="neutron-httpd" Jan 21 15:51:52 crc 
kubenswrapper[4707]: I0121 15:51:52.488020 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.489667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.490043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.494027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-565595d96-hb7l7"] Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.523379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data" (OuterVolumeSpecName: "config-data") pod "5206c260-c7a5-4e61-bd58-4ed5772520e6" (UID: "5206c260-c7a5-4e61-bd58-4ed5772520e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.541657 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.541692 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.541703 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5206c260-c7a5-4e61-bd58-4ed5772520e6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.541713 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5mbl\" (UniqueName: \"kubernetes.io/projected/5206c260-c7a5-4e61-bd58-4ed5772520e6-kube-api-access-j5mbl\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.541724 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5206c260-c7a5-4e61-bd58-4ed5772520e6-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.643852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-internal-tls-certs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.643903 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-public-tls-certs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.644151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.644208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7pc\" (UniqueName: \"kubernetes.io/projected/dd1e5821-44a4-4219-aec0-56618e7dba22-kube-api-access-7z7pc\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.644243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-combined-ca-bundle\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.644380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data-custom\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.644451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e5821-44a4-4219-aec0-56618e7dba22-logs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.655501 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-f6b9744b8-g24th"] Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.668314 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-f6b9744b8-g24th"] Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.680772 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.1.206:9696/\": dial tcp 10.217.1.206:9696: connect: connection refused" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-internal-tls-certs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-public-tls-certs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746468 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7pc\" (UniqueName: \"kubernetes.io/projected/dd1e5821-44a4-4219-aec0-56618e7dba22-kube-api-access-7z7pc\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-combined-ca-bundle\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data-custom\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e5821-44a4-4219-aec0-56618e7dba22-logs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.746948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e5821-44a4-4219-aec0-56618e7dba22-logs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.750214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-combined-ca-bundle\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.750659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-public-tls-certs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.751179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-internal-tls-certs\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.751335 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.753527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data-custom\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.761612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7pc\" (UniqueName: \"kubernetes.io/projected/dd1e5821-44a4-4219-aec0-56618e7dba22-kube-api-access-7z7pc\") pod \"barbican-api-565595d96-hb7l7\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:52 crc kubenswrapper[4707]: I0121 15:51:52.818857 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.193827 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5206c260-c7a5-4e61-bd58-4ed5772520e6" path="/var/lib/kubelet/pods/5206c260-c7a5-4e61-bd58-4ed5772520e6/volumes" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.194900 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784a4657-f448-46b4-a167-7ba1d8490acf" path="/var/lib/kubelet/pods/784a4657-f448-46b4-a167-7ba1d8490acf/volumes" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.195478 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ff6c1a-8a03-430f-a291-4e4b94123939" path="/var/lib/kubelet/pods/b0ff6c1a-8a03-430f-a291-4e4b94123939/volumes" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.238955 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-565595d96-hb7l7"] Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.360702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" event={"ID":"dd1e5821-44a4-4219-aec0-56618e7dba22","Type":"ContainerStarted","Data":"4b53100cb868d85f8e050d1bc4829884fd74507bd51615aebc27917f46e11161"} Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.793398 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.871392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") pod \"3e484616-acc3-47ab-a3ac-01ba20347f12\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.871459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrhbz\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-kube-api-access-nrhbz\") pod \"3e484616-acc3-47ab-a3ac-01ba20347f12\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.871580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-cache\") pod \"3e484616-acc3-47ab-a3ac-01ba20347f12\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.871651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3e484616-acc3-47ab-a3ac-01ba20347f12\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.871689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-lock\") pod \"3e484616-acc3-47ab-a3ac-01ba20347f12\" (UID: \"3e484616-acc3-47ab-a3ac-01ba20347f12\") " Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.872650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-lock" (OuterVolumeSpecName: "lock") pod "3e484616-acc3-47ab-a3ac-01ba20347f12" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.877145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-cache" (OuterVolumeSpecName: "cache") pod "3e484616-acc3-47ab-a3ac-01ba20347f12" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.881210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-kube-api-access-nrhbz" (OuterVolumeSpecName: "kube-api-access-nrhbz") pod "3e484616-acc3-47ab-a3ac-01ba20347f12" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12"). InnerVolumeSpecName "kube-api-access-nrhbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.882440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3e484616-acc3-47ab-a3ac-01ba20347f12" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.885829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "swift") pod "3e484616-acc3-47ab-a3ac-01ba20347f12" (UID: "3e484616-acc3-47ab-a3ac-01ba20347f12"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.973469 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.973525 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.973536 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3e484616-acc3-47ab-a3ac-01ba20347f12-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.973545 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.973556 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrhbz\" (UniqueName: \"kubernetes.io/projected/3e484616-acc3-47ab-a3ac-01ba20347f12-kube-api-access-nrhbz\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:53 crc kubenswrapper[4707]: I0121 15:51:53.989992 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.073466 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" podUID="43c11e62-fdd4-4840-b282-7f467f9e4a88" containerName="keystone-api" probeResult="failure" output="Get \"http://10.217.1.205:5000/v3\": EOF" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.075116 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.374970 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerID="a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e" exitCode=137 Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.375045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e"} Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.375076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3e484616-acc3-47ab-a3ac-01ba20347f12","Type":"ContainerDied","Data":"5e582c69f547ce754027adc81ad5f8f7c6e2521b6b2f278398a6f52ee2786e29"} Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.375096 4707 scope.go:117] "RemoveContainer" 
containerID="a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.375164 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.378072 4707 generic.go:334] "Generic (PLEG): container finished" podID="43c11e62-fdd4-4840-b282-7f467f9e4a88" containerID="a5766d5bf168913788538335dbb1615ed79f7befcea3074dcb7e8bac5a9a3946" exitCode=137 Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.378127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" event={"ID":"43c11e62-fdd4-4840-b282-7f467f9e4a88","Type":"ContainerDied","Data":"a5766d5bf168913788538335dbb1615ed79f7befcea3074dcb7e8bac5a9a3946"} Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.378145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" event={"ID":"43c11e62-fdd4-4840-b282-7f467f9e4a88","Type":"ContainerDied","Data":"00021ad3af7661cdf46d6752886ed6f4f2027587cd8b939e4d6291f3199e69a0"} Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.378156 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00021ad3af7661cdf46d6752886ed6f4f2027587cd8b939e4d6291f3199e69a0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.380918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" event={"ID":"dd1e5821-44a4-4219-aec0-56618e7dba22","Type":"ContainerStarted","Data":"26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704"} Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.380968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" event={"ID":"dd1e5821-44a4-4219-aec0-56618e7dba22","Type":"ContainerStarted","Data":"2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20"} Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.381105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.393962 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.403698 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" podStartSLOduration=2.403683094 podStartE2EDuration="2.403683094s" podCreationTimestamp="2026-01-21 15:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:54.396154097 +0000 UTC m=+3011.577670320" watchObservedRunningTime="2026-01-21 15:51:54.403683094 +0000 UTC m=+3011.585199316" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.408513 4707 scope.go:117] "RemoveContainer" containerID="3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.416799 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.430770 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.444630 4707 scope.go:117] "RemoveContainer" containerID="1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455323 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="swift-recon-cron" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="swift-recon-cron" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455714 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-server" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455720 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-server" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455734 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455740 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455755 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-reaper" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455760 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-reaper" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455769 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c11e62-fdd4-4840-b282-7f467f9e4a88" containerName="keystone-api" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455774 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c11e62-fdd4-4840-b282-7f467f9e4a88" containerName="keystone-api" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455783 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-server" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455788 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-server" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455798 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455817 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455824 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-server" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455830 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-server" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455837 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-expirer" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455843 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-expirer" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455854 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455874 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455883 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-updater" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455888 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-updater" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455895 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455901 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455914 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="rsync" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455921 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="rsync" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 
15:51:54.455933 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455938 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.455949 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-updater" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.455954 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-updater" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-expirer" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456132 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="rsync" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456144 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-server" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456151 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-server" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456158 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-updater" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456168 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="swift-recon-cron" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456176 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456184 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c11e62-fdd4-4840-b282-7f467f9e4a88" containerName="keystone-api" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456191 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-updater" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456199 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-reaper" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456208 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456214 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456221 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456228 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="object-auditor" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456238 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="container-server" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.456256 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" containerName="account-replicator" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.465039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.467488 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.472930 4707 scope.go:117] "RemoveContainer" containerID="3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479529 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-combined-ca-bundle\") pod \"43c11e62-fdd4-4840-b282-7f467f9e4a88\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-fernet-keys\") pod \"43c11e62-fdd4-4840-b282-7f467f9e4a88\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jmtx\" (UniqueName: \"kubernetes.io/projected/43c11e62-fdd4-4840-b282-7f467f9e4a88-kube-api-access-9jmtx\") pod \"43c11e62-fdd4-4840-b282-7f467f9e4a88\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-scripts\") pod \"43c11e62-fdd4-4840-b282-7f467f9e4a88\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-config-data\") pod \"43c11e62-fdd4-4840-b282-7f467f9e4a88\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.479997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-credential-keys\") pod \"43c11e62-fdd4-4840-b282-7f467f9e4a88\" (UID: \"43c11e62-fdd4-4840-b282-7f467f9e4a88\") " Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.485130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c11e62-fdd4-4840-b282-7f467f9e4a88-kube-api-access-9jmtx" (OuterVolumeSpecName: 
"kube-api-access-9jmtx") pod "43c11e62-fdd4-4840-b282-7f467f9e4a88" (UID: "43c11e62-fdd4-4840-b282-7f467f9e4a88"). InnerVolumeSpecName "kube-api-access-9jmtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.485575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-scripts" (OuterVolumeSpecName: "scripts") pod "43c11e62-fdd4-4840-b282-7f467f9e4a88" (UID: "43c11e62-fdd4-4840-b282-7f467f9e4a88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.488285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "43c11e62-fdd4-4840-b282-7f467f9e4a88" (UID: "43c11e62-fdd4-4840-b282-7f467f9e4a88"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.510634 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "43c11e62-fdd4-4840-b282-7f467f9e4a88" (UID: "43c11e62-fdd4-4840-b282-7f467f9e4a88"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.514523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43c11e62-fdd4-4840-b282-7f467f9e4a88" (UID: "43c11e62-fdd4-4840-b282-7f467f9e4a88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.520085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-config-data" (OuterVolumeSpecName: "config-data") pod "43c11e62-fdd4-4840-b282-7f467f9e4a88" (UID: "43c11e62-fdd4-4840-b282-7f467f9e4a88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.522199 4707 scope.go:117] "RemoveContainer" containerID="5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.545106 4707 scope.go:117] "RemoveContainer" containerID="f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.567677 4707 scope.go:117] "RemoveContainer" containerID="012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.582571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-lock\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.582858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-etc-swift\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.582983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7tkx\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-kube-api-access-l7tkx\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.583176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.583676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-cache\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.583906 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.583983 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jmtx\" (UniqueName: \"kubernetes.io/projected/43c11e62-fdd4-4840-b282-7f467f9e4a88-kube-api-access-9jmtx\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.584039 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.584093 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-config-data\") on node \"crc\" DevicePath \"\"" 
Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.584150 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.584200 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c11e62-fdd4-4840-b282-7f467f9e4a88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.584994 4707 scope.go:117] "RemoveContainer" containerID="f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.603520 4707 scope.go:117] "RemoveContainer" containerID="fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.619535 4707 scope.go:117] "RemoveContainer" containerID="f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.639720 4707 scope.go:117] "RemoveContainer" containerID="b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.658929 4707 scope.go:117] "RemoveContainer" containerID="fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.680525 4707 scope.go:117] "RemoveContainer" containerID="3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.686619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-lock\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.686680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-etc-swift\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.686722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7tkx\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-kube-api-access-l7tkx\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.686784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.687040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-cache\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.687069 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-lock\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.687281 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.687512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-cache\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.692700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-etc-swift\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.700942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7tkx\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-kube-api-access-l7tkx\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.702112 4707 scope.go:117] "RemoveContainer" containerID="ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.709866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.722803 4707 scope.go:117] "RemoveContainer" containerID="823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.741074 4707 scope.go:117] "RemoveContainer" containerID="a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.741531 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e\": container with ID starting with a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e not found: ID does not exist" containerID="a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.741579 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e"} err="failed to get container status \"a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e\": rpc error: code = NotFound desc = could not find container \"a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e\": container with ID starting with 
a9ad8d43352ce35e63b9db46441e7a04669ce0ad0e9eb92a92209958047cc26e not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.741606 4707 scope.go:117] "RemoveContainer" containerID="3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.741974 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b\": container with ID starting with 3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b not found: ID does not exist" containerID="3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.742007 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b"} err="failed to get container status \"3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b\": rpc error: code = NotFound desc = could not find container \"3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b\": container with ID starting with 3f84945e9f496dd14c8616ed9840c5931c86ad01aa2f46a12652006e5021312b not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.742025 4707 scope.go:117] "RemoveContainer" containerID="1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.742354 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20\": container with ID starting with 1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20 not found: ID does not exist" containerID="1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.742475 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20"} err="failed to get container status \"1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20\": rpc error: code = NotFound desc = could not find container \"1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20\": container with ID starting with 1b6c17d1363ad09260594db1744b0d9b2f234e4aefab04ecd89fb62612533f20 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.742591 4707 scope.go:117] "RemoveContainer" containerID="3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.743075 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89\": container with ID starting with 3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89 not found: ID does not exist" containerID="3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.743099 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89"} err="failed to get container status \"3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89\": rpc 
error: code = NotFound desc = could not find container \"3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89\": container with ID starting with 3cf1edee7fc258bc43295cd6a77fe6f73e14f7a21eb42003f322b8251437ad89 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.743113 4707 scope.go:117] "RemoveContainer" containerID="5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.743339 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f\": container with ID starting with 5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f not found: ID does not exist" containerID="5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.743456 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f"} err="failed to get container status \"5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f\": rpc error: code = NotFound desc = could not find container \"5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f\": container with ID starting with 5047f2fe56e5bb995938d587abb2c2fb3b9086ace0808248efc137ebd559e99f not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.743541 4707 scope.go:117] "RemoveContainer" containerID="f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.743887 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea\": container with ID starting with f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea not found: ID does not exist" containerID="f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.743994 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea"} err="failed to get container status \"f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea\": rpc error: code = NotFound desc = could not find container \"f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea\": container with ID starting with f31233924a0997b4c74b5ef1c050e8f89cafba9ff8bdf8b09b1ce010d02f2eea not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.744076 4707 scope.go:117] "RemoveContainer" containerID="012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.744387 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88\": container with ID starting with 012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88 not found: ID does not exist" containerID="012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.744413 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88"} err="failed to get container status \"012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88\": rpc error: code = NotFound desc = could not find container \"012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88\": container with ID starting with 012206a40fb0ee2f94067f8035d2d4bc054c46fa2d6ec70f247ad5a23d67fd88 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.744426 4707 scope.go:117] "RemoveContainer" containerID="f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.744634 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b\": container with ID starting with f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b not found: ID does not exist" containerID="f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.744755 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b"} err="failed to get container status \"f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b\": rpc error: code = NotFound desc = could not find container \"f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b\": container with ID starting with f9173e49bc9fe4b0dc08eb69a36cc9876034a385d52e881b9c774ac464c73f1b not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.744850 4707 scope.go:117] "RemoveContainer" containerID="fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.745160 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9\": container with ID starting with fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9 not found: ID does not exist" containerID="fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.745183 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9"} err="failed to get container status \"fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9\": rpc error: code = NotFound desc = could not find container \"fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9\": container with ID starting with fa88ffd0a8f7ee4a4498a46345287e346161d1f66b1619fa31ed2a1a9789f0b9 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.745196 4707 scope.go:117] "RemoveContainer" containerID="f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.745426 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63\": container with ID starting with f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63 not found: ID does not exist" 
containerID="f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.745513 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63"} err="failed to get container status \"f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63\": rpc error: code = NotFound desc = could not find container \"f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63\": container with ID starting with f57f8a547745f9c9fea0db93c43b8f8f1746b298de4ee0be23fc5fd4968afe63 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.745590 4707 scope.go:117] "RemoveContainer" containerID="b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.745887 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0\": container with ID starting with b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0 not found: ID does not exist" containerID="b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.745908 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0"} err="failed to get container status \"b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0\": rpc error: code = NotFound desc = could not find container \"b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0\": container with ID starting with b70d205dca12a8a35bd841eb651040800388eaeb31d9d7af12217cecb47c56b0 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.745921 4707 scope.go:117] "RemoveContainer" containerID="fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.746244 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35\": container with ID starting with fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35 not found: ID does not exist" containerID="fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.746373 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35"} err="failed to get container status \"fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35\": rpc error: code = NotFound desc = could not find container \"fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35\": container with ID starting with fea6c6e1b573b1e6bb8224226d25eeb72f322dc7e5ac2bf5fcbbcfed295f3f35 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.746454 4707 scope.go:117] "RemoveContainer" containerID="3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.747028 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085\": container with ID starting with 3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085 not found: ID does not exist" containerID="3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.747130 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085"} err="failed to get container status \"3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085\": rpc error: code = NotFound desc = could not find container \"3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085\": container with ID starting with 3305a78c11b30d9ef58718e0c29ea8f6778e6e0c02d54b2e9e214d1363515085 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.747207 4707 scope.go:117] "RemoveContainer" containerID="ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.747522 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7\": container with ID starting with ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7 not found: ID does not exist" containerID="ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.747633 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7"} err="failed to get container status \"ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7\": rpc error: code = NotFound desc = could not find container \"ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7\": container with ID starting with ae2f1059daad9b106dff1646df66d22bc33e326341ce7b22f8061c544c64acd7 not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.747719 4707 scope.go:117] "RemoveContainer" containerID="823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d" Jan 21 15:51:54 crc kubenswrapper[4707]: E0121 15:51:54.748063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d\": container with ID starting with 823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d not found: ID does not exist" containerID="823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.748089 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d"} err="failed to get container status \"823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d\": rpc error: code = NotFound desc = could not find container \"823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d\": container with ID starting with 823da80f0fba9f8b59e0af37700d574f6763cef9a1f8decb583743a6a497479d not found: ID does not exist" Jan 21 15:51:54 crc kubenswrapper[4707]: I0121 15:51:54.779853 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.191657 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e484616-acc3-47ab-a3ac-01ba20347f12" path="/var/lib/kubelet/pods/3e484616-acc3-47ab-a3ac-01ba20347f12/volumes" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.193969 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.393133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"985d42d961e1bb057efe961cf872b5cd441aeef5a46fee354795a3ca8a954fe2"} Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.397390 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6cc6d9b6db-9lrzm_9caaf11c-be1e-48d5-976d-d6960e5288df/neutron-api/0.log" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.397448 4707 generic.go:334] "Generic (PLEG): container finished" podID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerID="6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419" exitCode=137 Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.397529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" event={"ID":"9caaf11c-be1e-48d5-976d-d6960e5288df","Type":"ContainerDied","Data":"6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419"} Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.397546 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-75cf9974f9-bpkfv" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.397955 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.419834 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-75cf9974f9-bpkfv"] Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.433203 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-75cf9974f9-bpkfv"] Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.529168 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6cc6d9b6db-9lrzm_9caaf11c-be1e-48d5-976d-d6960e5288df/neutron-api/0.log" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.529259 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.603611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-combined-ca-bundle\") pod \"9caaf11c-be1e-48d5-976d-d6960e5288df\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.603769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvf9\" (UniqueName: \"kubernetes.io/projected/9caaf11c-be1e-48d5-976d-d6960e5288df-kube-api-access-vqvf9\") pod \"9caaf11c-be1e-48d5-976d-d6960e5288df\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.603863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-httpd-config\") pod \"9caaf11c-be1e-48d5-976d-d6960e5288df\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.603935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-config\") pod \"9caaf11c-be1e-48d5-976d-d6960e5288df\" (UID: \"9caaf11c-be1e-48d5-976d-d6960e5288df\") " Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.608952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9caaf11c-be1e-48d5-976d-d6960e5288df" (UID: "9caaf11c-be1e-48d5-976d-d6960e5288df"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.610940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9caaf11c-be1e-48d5-976d-d6960e5288df-kube-api-access-vqvf9" (OuterVolumeSpecName: "kube-api-access-vqvf9") pod "9caaf11c-be1e-48d5-976d-d6960e5288df" (UID: "9caaf11c-be1e-48d5-976d-d6960e5288df"). InnerVolumeSpecName "kube-api-access-vqvf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.642963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9caaf11c-be1e-48d5-976d-d6960e5288df" (UID: "9caaf11c-be1e-48d5-976d-d6960e5288df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.646158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-config" (OuterVolumeSpecName: "config") pod "9caaf11c-be1e-48d5-976d-d6960e5288df" (UID: "9caaf11c-be1e-48d5-976d-d6960e5288df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.706901 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.706929 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvf9\" (UniqueName: \"kubernetes.io/projected/9caaf11c-be1e-48d5-976d-d6960e5288df-kube-api-access-vqvf9\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.706942 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4707]: I0121 15:51:55.706951 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9caaf11c-be1e-48d5-976d-d6960e5288df-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.408639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.408998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.409013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.409024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.409034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.409043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.410957 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6cc6d9b6db-9lrzm_9caaf11c-be1e-48d5-976d-d6960e5288df/neutron-api/0.log" Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.411078 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.411065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm" event={"ID":"9caaf11c-be1e-48d5-976d-d6960e5288df","Type":"ContainerDied","Data":"9276cd0bf8da95c5d76f1098d0fed09fe687bc5f107b989663548fc9031901aa"} Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.411174 4707 scope.go:117] "RemoveContainer" containerID="839050d1503bd45e328be8023d300c3011c332e41ef788d6aee897ee48f7883e" Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.449856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm"] Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.474439 4707 scope.go:117] "RemoveContainer" containerID="6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419" Jan 21 15:51:56 crc kubenswrapper[4707]: I0121 15:51:56.478431 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6cc6d9b6db-9lrzm"] Jan 21 15:51:57 crc kubenswrapper[4707]: E0121 15:51:57.103981 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:57 crc kubenswrapper[4707]: E0121 15:51:57.105886 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:57 crc kubenswrapper[4707]: E0121 15:51:57.107163 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:51:57 crc kubenswrapper[4707]: E0121 15:51:57.107196 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.192862 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c11e62-fdd4-4840-b282-7f467f9e4a88" path="/var/lib/kubelet/pods/43c11e62-fdd4-4840-b282-7f467f9e4a88/volumes" Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.193479 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" path="/var/lib/kubelet/pods/9caaf11c-be1e-48d5-976d-d6960e5288df/volumes" Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.433437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f"} Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 
15:51:57.433976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f"} Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.433997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff"} Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.434007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d"} Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.434016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9"} Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.434026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a"} Jan 21 15:51:57 crc kubenswrapper[4707]: I0121 15:51:57.676539 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.447216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f"} Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.447562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc"} Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.447575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerStarted","Data":"5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241"} Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.477316 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=4.477301544 podStartE2EDuration="4.477301544s" podCreationTimestamp="2026-01-21 15:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:58.47614017 +0000 UTC m=+3015.657656402" watchObservedRunningTime="2026-01-21 15:51:58.477301544 +0000 UTC m=+3015.658817766" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.566668 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j"] Jan 21 15:51:58 crc kubenswrapper[4707]: E0121 15:51:58.567049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-api" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.567067 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-api" Jan 21 15:51:58 crc kubenswrapper[4707]: E0121 15:51:58.567084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-httpd" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.567090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-httpd" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.567258 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-httpd" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.567278 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9caaf11c-be1e-48d5-976d-d6960e5288df" containerName="neutron-api" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.568141 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.577511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j"] Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.662317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.662405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.662449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5mn2\" (UniqueName: \"kubernetes.io/projected/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-kube-api-access-t5mn2\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.662637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-config\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.763833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5mn2\" (UniqueName: \"kubernetes.io/projected/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-kube-api-access-t5mn2\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.763913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-config\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.763945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.763998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.764707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.765437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-config\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.765921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.789107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5mn2\" (UniqueName: \"kubernetes.io/projected/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-kube-api-access-t5mn2\") pod \"dnsmasq-dnsmasq-859975766f-whq4j\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:58 crc kubenswrapper[4707]: I0121 15:51:58.884609 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:51:59 crc kubenswrapper[4707]: I0121 15:51:59.266924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j"] Jan 21 15:51:59 crc kubenswrapper[4707]: I0121 15:51:59.461718 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerID="71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d" exitCode=0 Jan 21 15:51:59 crc kubenswrapper[4707]: I0121 15:51:59.461837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" event={"ID":"7d4d9381-62e0-4779-9664-b1f9a3ced4ba","Type":"ContainerDied","Data":"71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d"} Jan 21 15:51:59 crc kubenswrapper[4707]: I0121 15:51:59.462520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" event={"ID":"7d4d9381-62e0-4779-9664-b1f9a3ced4ba","Type":"ContainerStarted","Data":"03248ed147576477673165d7a84bf20ab85caec1c459f8bbbe5c284d888e1d92"} Jan 21 15:52:00 crc kubenswrapper[4707]: I0121 15:52:00.475715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" event={"ID":"7d4d9381-62e0-4779-9664-b1f9a3ced4ba","Type":"ContainerStarted","Data":"9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd"} Jan 21 15:52:00 crc kubenswrapper[4707]: I0121 15:52:00.476273 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:52:00 crc kubenswrapper[4707]: I0121 15:52:00.497381 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" podStartSLOduration=2.497359361 podStartE2EDuration="2.497359361s" podCreationTimestamp="2026-01-21 15:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:00.487296712 +0000 UTC m=+3017.668812935" watchObservedRunningTime="2026-01-21 15:52:00.497359361 +0000 UTC m=+3017.678875584" Jan 21 15:52:01 crc kubenswrapper[4707]: I0121 15:52:01.538065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:52:01 crc kubenswrapper[4707]: E0121 15:52:01.678500 4707 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/51200da19483171db3067ddb4b1f9f53f1460fcceef0cd94c3495e0a78835d09/diff" to get inode usage: stat /var/lib/containers/storage/overlay/51200da19483171db3067ddb4b1f9f53f1460fcceef0cd94c3495e0a78835d09/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_swift-storage-0_3e484616-acc3-47ab-a3ac-01ba20347f12/swift-recon-cron/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_swift-storage-0_3e484616-acc3-47ab-a3ac-01ba20347f12/swift-recon-cron/0.log: no such file or directory Jan 21 15:52:02 crc kubenswrapper[4707]: E0121 15:52:02.103833 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:02 crc kubenswrapper[4707]: E0121 15:52:02.105118 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:02 crc kubenswrapper[4707]: E0121 15:52:02.106687 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:02 crc kubenswrapper[4707]: E0121 15:52:02.106717 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:02 crc kubenswrapper[4707]: I0121 15:52:02.437019 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.222:3000/\": dial tcp 10.217.1.222:3000: connect: connection refused" Jan 21 15:52:03 crc kubenswrapper[4707]: I0121 15:52:03.739342 4707 scope.go:117] "RemoveContainer" containerID="e41f28d469e56bcfb24ec87ca701c01fb43455c941908db97110f75eeb045abb" Jan 21 15:52:03 crc kubenswrapper[4707]: I0121 15:52:03.771564 4707 scope.go:117] "RemoveContainer" containerID="c20bf1554c7dabcb98893e2a58c0be1968387065a014f5ffa800a39e6210d8a9" Jan 21 15:52:03 crc kubenswrapper[4707]: I0121 15:52:03.800524 4707 scope.go:117] "RemoveContainer" containerID="20101a8907237b77b8724d313d47000e9615ffb81952a3f5d753c750223d766c" Jan 21 15:52:03 crc kubenswrapper[4707]: I0121 15:52:03.852355 4707 scope.go:117] "RemoveContainer" containerID="b06e320fecab1c2dbd21b7002caf646cd135faaf82ae1ebd489ce9105a961412" Jan 21 15:52:03 crc kubenswrapper[4707]: I0121 15:52:03.972941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:52:04 crc kubenswrapper[4707]: I0121 15:52:04.022841 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:52:04 crc kubenswrapper[4707]: I0121 15:52:04.075300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx"] Jan 21 15:52:04 crc kubenswrapper[4707]: I0121 15:52:04.075606 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api-log" containerID="cri-o://16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f" gracePeriod=30 Jan 21 15:52:04 crc kubenswrapper[4707]: I0121 15:52:04.075766 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api" 
containerID="cri-o://70b8b775962c4e1bdd55d20416c186cff48d9907089ad48cb2bd7ce99aa3d814" gracePeriod=30 Jan 21 15:52:04 crc kubenswrapper[4707]: I0121 15:52:04.508375 4707 generic.go:334] "Generic (PLEG): container finished" podID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerID="16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f" exitCode=143 Jan 21 15:52:04 crc kubenswrapper[4707]: I0121 15:52:04.508458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" event={"ID":"964ff6ca-837f-4d18-90c9-c78ca0d14340","Type":"ContainerDied","Data":"16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f"} Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.025409 4707 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ff6c1a_8a03_430f_a291_4e4b94123939.slice/crio-1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b: Error finding container 1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b: Status 404 returned error can't find the container with id 1f3bb33b3191c68812c158fb0eb79146207a3313284b95c962e047b8464e859b Jan 21 15:52:06 crc kubenswrapper[4707]: W0121 15:52:06.025648 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a4657_f448_46b4_a167_7ba1d8490acf.slice/crio-a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d.scope WatchSource:0}: Error finding container a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d: Status 404 returned error can't find the container with id a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d Jan 21 15:52:06 crc kubenswrapper[4707]: W0121 15:52:06.026568 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a4657_f448_46b4_a167_7ba1d8490acf.slice/crio-5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff.scope WatchSource:0}: Error finding container 5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff: Status 404 returned error can't find the container with id 5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.029474 4707 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf732f500_3538_4a30_8502_fbd2451b7bdd.slice/crio-dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b: Error finding container dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b: Status 404 returned error can't find the container with id dc57923b0a2f675447c3007b9aa3acd012c50ec9f99c0725f0f16fd0d12c0f6b Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.231045 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a4657_f448_46b4_a167_7ba1d8490acf.slice/crio-conmon-a920ca1c6854346c260bf4750f69c54b8c03ab00f64477da208dff6203ffa10d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855ed6cf_d010_49b4_a048_37d8568bdf60.slice/crio-conmon-480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ff6c1a_8a03_430f_a291_4e4b94123939.slice/crio-conmon-c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5206c260_c7a5_4e61_bd58_4ed5772520e6.slice/crio-38be13d24f8ad0cf6aed26c9cb3900d56d55e9a18d44839c622f1c0e847af763.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e484616_acc3_47ab_a3ac_01ba20347f12.slice/crio-5e582c69f547ce754027adc81ad5f8f7c6e2521b6b2f278398a6f52ee2786e29\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9caaf11c_be1e_48d5_976d_d6960e5288df.slice/crio-6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a4657_f448_46b4_a167_7ba1d8490acf.slice/crio-conmon-5ea9c64c0184bd41a698034ddf9a014b325531196680a34e5fb329ba45fb8bff.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf97eddbb_0f7f_4c89_bb38_27636955cfeb.slice/crio-conmon-0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada6986b_923f_4ca5_a85b_d0578223ae13.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ff6c1a_8a03_430f_a291_4e4b94123939.slice/crio-c39d7c417e1ee8d9336a4cc769a69280c52c18e73bc5f33796c230b7f0ec467d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43c11e62_fdd4_4840_b282_7f467f9e4a88.slice/crio-a5766d5bf168913788538335dbb1615ed79f7befcea3074dcb7e8bac5a9a3946.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4fc3f0d_81ac_4f0b_ac7e_f2c4e65c17cf.slice/crio-conmon-a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855ed6cf_d010_49b4_a048_37d8568bdf60.slice/crio-94c9dd39c6939ad6e60fbab7000e6633ff6e92b0fab17ed250225f872c612710\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4fc3f0d_81ac_4f0b_ac7e_f2c4e65c17cf.slice/crio-a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada6986b_923f_4ca5_a85b_d0578223ae13.slice/crio-3d7cbed09de0050d7563879ab667ea6edc3c8a0d862e3fc69264df0be5a5f75e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e484616_acc3_47ab_a3ac_01ba20347f12.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf97eddbb_0f7f_4c89_bb38_27636955cfeb.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod180a2ca9_2ef5_4504_a517_e95c9b914238.slice/crio-conmon-3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf97eddbb_0f7f_4c89_bb38_27636955cfeb.slice/crio-0dd9c8beaa68d7f956b0822aa72df875c92883f45c8251c11ff6bb1522b49246.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9caaf11c_be1e_48d5_976d_d6960e5288df.slice/crio-conmon-6bc5e820544a98e6b0a7d8b60e9bd28b50abe9ed04ca1b915ac66f80d5047419.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5206c260_c7a5_4e61_bd58_4ed5772520e6.slice/crio-b3648349800419d4f2b03de9add01633483615336d5efe41b825a57326ba8a2e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5206c260_c7a5_4e61_bd58_4ed5772520e6.slice/crio-0a797890777eb2f51c212922fb4069734118870864dfe6e374d8541d03cc2b17.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9caaf11c_be1e_48d5_976d_d6960e5288df.slice/crio-9276cd0bf8da95c5d76f1098d0fed09fe687bc5f107b989663548fc9031901aa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43c11e62_fdd4_4840_b282_7f467f9e4a88.slice/crio-00021ad3af7661cdf46d6752886ed6f4f2027587cd8b939e4d6291f3199e69a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a31ae7_4ab2_4b98_bd9d_66decb56f00a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf97eddbb_0f7f_4c89_bb38_27636955cfeb.slice/crio-8e115506bddafddea6ccc0304ef16d7840d849ae4d01c365b3992e4647182add\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eaefcd6_8b64_47a3_8e3a_280d7130b63e.slice/crio-conmon-14470d78188ddcd00f2db0dea2c3bc584052343e21da79ba6654c2da9d0b26e0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a4657_f448_46b4_a167_7ba1d8490acf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a4657_f448_46b4_a167_7ba1d8490acf.slice/crio-cb84606a35d694b4e57c9a3cff9bacdfb68eb3b20f7931ce08395d6b82a3754d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a31ae7_4ab2_4b98_bd9d_66decb56f00a.slice/crio-conmon-c6f775aee616caccaeeccedc4cf3f0575bda5ac33ada637fd9c492e224da8293.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod964ff6ca_837f_4d18_90c9_c78ca0d14340.slice/crio-conmon-16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb17d2c_7e86_4451_b56e_4e89fc494542.slice/crio-conmon-ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9caaf11c_be1e_48d5_976d_d6960e5288df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5206c260_c7a5_4e61_bd58_4ed5772520e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf732f500_3538_4a30_8502_fbd2451b7bdd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855ed6cf_d010_49b4_a048_37d8568bdf60.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod964ff6ca_837f_4d18_90c9_c78ca0d14340.slice/crio-16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb17d2c_7e86_4451_b56e_4e89fc494542.slice/crio-ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855ed6cf_d010_49b4_a048_37d8568bdf60.slice/crio-480535d453e3ad97913cba4f684a548ec2b8c09c7a7983388165e34ad01ecfc8.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.334618 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.411841 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.414676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs6jf\" (UniqueName: \"kubernetes.io/projected/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-kube-api-access-qs6jf\") pod \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.414723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-config-data\") pod \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.414879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-combined-ca-bundle\") pod \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\" (UID: \"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf\") " Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.421260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-kube-api-access-qs6jf" (OuterVolumeSpecName: "kube-api-access-qs6jf") pod "d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" (UID: "d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf"). InnerVolumeSpecName "kube-api-access-qs6jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.442600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" (UID: "d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.445141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-config-data" (OuterVolumeSpecName: "config-data") pod "d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" (UID: "d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.516204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-combined-ca-bundle\") pod \"180a2ca9-2ef5-4504-a517-e95c9b914238\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.516258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-config-data\") pod \"180a2ca9-2ef5-4504-a517-e95c9b914238\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.516296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68597\" (UniqueName: \"kubernetes.io/projected/180a2ca9-2ef5-4504-a517-e95c9b914238-kube-api-access-68597\") pod \"180a2ca9-2ef5-4504-a517-e95c9b914238\" (UID: \"180a2ca9-2ef5-4504-a517-e95c9b914238\") " Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.516728 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.516747 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs6jf\" (UniqueName: \"kubernetes.io/projected/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-kube-api-access-qs6jf\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.516757 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.519466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180a2ca9-2ef5-4504-a517-e95c9b914238-kube-api-access-68597" (OuterVolumeSpecName: "kube-api-access-68597") pod "180a2ca9-2ef5-4504-a517-e95c9b914238" (UID: "180a2ca9-2ef5-4504-a517-e95c9b914238"). InnerVolumeSpecName "kube-api-access-68597". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.525057 4707 generic.go:334] "Generic (PLEG): container finished" podID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerID="14470d78188ddcd00f2db0dea2c3bc584052343e21da79ba6654c2da9d0b26e0" exitCode=0 Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.525107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1eaefcd6-8b64-47a3-8e3a-280d7130b63e","Type":"ContainerDied","Data":"14470d78188ddcd00f2db0dea2c3bc584052343e21da79ba6654c2da9d0b26e0"} Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.527456 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" containerID="a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c" exitCode=137 Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.527519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf","Type":"ContainerDied","Data":"a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c"} Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.527551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf","Type":"ContainerDied","Data":"07fddb2db53c72120d44d7da47def74b309b175026182a2b3e72ba43ca61c998"} Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.527570 4707 scope.go:117] "RemoveContainer" containerID="a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.527713 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.531667 4707 generic.go:334] "Generic (PLEG): container finished" podID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" exitCode=137 Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.531764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"180a2ca9-2ef5-4504-a517-e95c9b914238","Type":"ContainerDied","Data":"3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044"} Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.531799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"180a2ca9-2ef5-4504-a517-e95c9b914238","Type":"ContainerDied","Data":"112949b9c43e255583e54af6daa6ba9bff3b55b2278589d922735d3fc69f00c0"} Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.531877 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.537263 4707 generic.go:334] "Generic (PLEG): container finished" podID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerID="ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c" exitCode=0 Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.537395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1fb17d2c-7e86-4451-b56e-4e89fc494542","Type":"ContainerDied","Data":"ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c"} Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.545458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "180a2ca9-2ef5-4504-a517-e95c9b914238" (UID: "180a2ca9-2ef5-4504-a517-e95c9b914238"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.545510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-config-data" (OuterVolumeSpecName: "config-data") pod "180a2ca9-2ef5-4504-a517-e95c9b914238" (UID: "180a2ca9-2ef5-4504-a517-e95c9b914238"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.555525 4707 scope.go:117] "RemoveContainer" containerID="a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c" Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.556099 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c\": container with ID starting with a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c not found: ID does not exist" containerID="a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.556136 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c"} err="failed to get container status \"a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c\": rpc error: code = NotFound desc = could not find container \"a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c\": container with ID starting with a9e59cc0647fb8f39bbf3b7368b6bf4ad4f9cdfffaa5f088b508e8a6f75aa59c not found: ID does not exist" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.556157 4707 scope.go:117] "RemoveContainer" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.578319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.584771 4707 scope.go:117] "RemoveContainer" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.585668 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044\": container with 
ID starting with 3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044 not found: ID does not exist" containerID="3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.585689 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044"} err="failed to get container status \"3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044\": rpc error: code = NotFound desc = could not find container \"3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044\": container with ID starting with 3793fc36d4bc4a4445118bee6c9a49de2d621353e401c8ca4ada79fcfd10d044 not found: ID does not exist" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.596864 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.603396 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.603841 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" containerName="nova-scheduler-scheduler" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.603862 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" containerName="nova-scheduler-scheduler" Jan 21 15:52:06 crc kubenswrapper[4707]: E0121 15:52:06.603890 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.603897 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.604082 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.604110 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" containerName="nova-scheduler-scheduler" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.604841 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.607057 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.613996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.617601 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.617627 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68597\" (UniqueName: \"kubernetes.io/projected/180a2ca9-2ef5-4504-a517-e95c9b914238-kube-api-access-68597\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.617637 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a2ca9-2ef5-4504-a517-e95c9b914238-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.718665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-config-data\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.718721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkjj\" (UniqueName: \"kubernetes.io/projected/abd22cf0-492c-48c5-b675-0db84da277d2-kube-api-access-djkjj\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.719126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.821028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkjj\" (UniqueName: \"kubernetes.io/projected/abd22cf0-492c-48c5-b675-0db84da277d2-kube-api-access-djkjj\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.821154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.821235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-config-data\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.824134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-config-data\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.824231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.838729 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkjj\" (UniqueName: \"kubernetes.io/projected/abd22cf0-492c-48c5-b675-0db84da277d2-kube-api-access-djkjj\") pod \"nova-scheduler-0\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.859676 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.865798 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.880451 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.881713 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.884771 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.898181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.922050 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.923770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpr2m\" (UniqueName: \"kubernetes.io/projected/12285e17-dc8c-4a74-863a-401ef36721aa-kube-api-access-qpr2m\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.923833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:06 crc kubenswrapper[4707]: I0121 15:52:06.923879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.026098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.026313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpr2m\" (UniqueName: \"kubernetes.io/projected/12285e17-dc8c-4a74-863a-401ef36721aa-kube-api-access-qpr2m\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.026358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.029859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.029961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.043030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpr2m\" (UniqueName: \"kubernetes.io/projected/12285e17-dc8c-4a74-863a-401ef36721aa-kube-api-access-qpr2m\") pod \"nova-cell0-conductor-0\" (UID: 
\"12285e17-dc8c-4a74-863a-401ef36721aa\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.191521 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180a2ca9-2ef5-4504-a517-e95c9b914238" path="/var/lib/kubelet/pods/180a2ca9-2ef5-4504-a517-e95c9b914238/volumes" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.192329 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf" path="/var/lib/kubelet/pods/d4fc3f0d-81ac-4f0b-ac7e-f2c4e65c17cf/volumes" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.196038 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.246303 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.226:9311/healthcheck\": read tcp 10.217.0.2:46866->10.217.1.226:9311: read: connection reset by peer" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.246357 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.226:9311/healthcheck\": read tcp 10.217.0.2:46858->10.217.1.226:9311: read: connection reset by peer" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.377991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.467139 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.537794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-scripts\") pod \"50b67658-af5d-467a-8952-22db7d822791\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pxff\" (UniqueName: \"kubernetes.io/projected/50b67658-af5d-467a-8952-22db7d822791-kube-api-access-9pxff\") pod \"50b67658-af5d-467a-8952-22db7d822791\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b67658-af5d-467a-8952-22db7d822791-etc-machine-id\") pod \"50b67658-af5d-467a-8952-22db7d822791\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-combined-ca-bundle\") pod \"50b67658-af5d-467a-8952-22db7d822791\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data-custom\") pod \"50b67658-af5d-467a-8952-22db7d822791\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data\") pod \"50b67658-af5d-467a-8952-22db7d822791\" (UID: \"50b67658-af5d-467a-8952-22db7d822791\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50b67658-af5d-467a-8952-22db7d822791-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "50b67658-af5d-467a-8952-22db7d822791" (UID: "50b67658-af5d-467a-8952-22db7d822791"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.538925 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b67658-af5d-467a-8952-22db7d822791-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.543938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50b67658-af5d-467a-8952-22db7d822791" (UID: "50b67658-af5d-467a-8952-22db7d822791"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.544916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-scripts" (OuterVolumeSpecName: "scripts") pod "50b67658-af5d-467a-8952-22db7d822791" (UID: "50b67658-af5d-467a-8952-22db7d822791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.547497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b67658-af5d-467a-8952-22db7d822791-kube-api-access-9pxff" (OuterVolumeSpecName: "kube-api-access-9pxff") pod "50b67658-af5d-467a-8952-22db7d822791" (UID: "50b67658-af5d-467a-8952-22db7d822791"). InnerVolumeSpecName "kube-api-access-9pxff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.554223 4707 generic.go:334] "Generic (PLEG): container finished" podID="50b67658-af5d-467a-8952-22db7d822791" containerID="cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658" exitCode=137 Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.554304 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.554307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"50b67658-af5d-467a-8952-22db7d822791","Type":"ContainerDied","Data":"cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658"} Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.554423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"50b67658-af5d-467a-8952-22db7d822791","Type":"ContainerDied","Data":"1c5625e0ae45c6b65a76118c05d45f9e9772b7f470907a731317f746340325e9"} Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.554443 4707 scope.go:117] "RemoveContainer" containerID="be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.566364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1fb17d2c-7e86-4451-b56e-4e89fc494542","Type":"ContainerStarted","Data":"af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a"} Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.567472 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.572933 4707 generic.go:334] "Generic (PLEG): container finished" podID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerID="70b8b775962c4e1bdd55d20416c186cff48d9907089ad48cb2bd7ce99aa3d814" exitCode=0 Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.572985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" event={"ID":"964ff6ca-837f-4d18-90c9-c78ca0d14340","Type":"ContainerDied","Data":"70b8b775962c4e1bdd55d20416c186cff48d9907089ad48cb2bd7ce99aa3d814"} Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.583332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" 
event={"ID":"1eaefcd6-8b64-47a3-8e3a-280d7130b63e","Type":"ContainerStarted","Data":"c684b390b448b0e7f019fce643a06fcefabe494a19baeedc6a8e4cf355706140"} Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.584162 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.610018 4707 scope.go:117] "RemoveContainer" containerID="cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.610245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"abd22cf0-492c-48c5-b675-0db84da277d2","Type":"ContainerStarted","Data":"78825fc894a928c385a4cef6e0ed5cd7737d29654a919c993af269355da086ba"} Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.629769 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.629753671 podStartE2EDuration="36.629753671s" podCreationTimestamp="2026-01-21 15:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:07.584112756 +0000 UTC m=+3024.765628977" watchObservedRunningTime="2026-01-21 15:52:07.629753671 +0000 UTC m=+3024.811269964" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.641387 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pxff\" (UniqueName: \"kubernetes.io/projected/50b67658-af5d-467a-8952-22db7d822791-kube-api-access-9pxff\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.641413 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.641426 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.665844 4707 scope.go:117] "RemoveContainer" containerID="be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032" Jan 21 15:52:07 crc kubenswrapper[4707]: E0121 15:52:07.666935 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032\": container with ID starting with be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032 not found: ID does not exist" containerID="be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.666978 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032"} err="failed to get container status \"be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032\": rpc error: code = NotFound desc = could not find container \"be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032\": container with ID starting with be73b5ba66a90173b8f856c6c4e7e8b7f4e60b7f5848d98ec2120299db7bf032 not found: ID does not exist" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.667004 4707 scope.go:117] "RemoveContainer" 
containerID="cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658" Jan 21 15:52:07 crc kubenswrapper[4707]: E0121 15:52:07.667388 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658\": container with ID starting with cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658 not found: ID does not exist" containerID="cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.667420 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658"} err="failed to get container status \"cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658\": rpc error: code = NotFound desc = could not find container \"cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658\": container with ID starting with cb47308dc5039f141d46cdb64988d1d990f57d566d74073c63246b2e87e36658 not found: ID does not exist" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.669301 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.669280872 podStartE2EDuration="36.669280872s" podCreationTimestamp="2026-01-21 15:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:07.628152241 +0000 UTC m=+3024.809668462" watchObservedRunningTime="2026-01-21 15:52:07.669280872 +0000 UTC m=+3024.850797085" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.682883 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.705361 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.713962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50b67658-af5d-467a-8952-22db7d822791" (UID: "50b67658-af5d-467a-8952-22db7d822791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.737105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data" (OuterVolumeSpecName: "config-data") pod "50b67658-af5d-467a-8952-22db7d822791" (UID: "50b67658-af5d-467a-8952-22db7d822791"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpmdg\" (UniqueName: \"kubernetes.io/projected/964ff6ca-837f-4d18-90c9-c78ca0d14340-kube-api-access-wpmdg\") pod \"964ff6ca-837f-4d18-90c9-c78ca0d14340\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data\") pod \"964ff6ca-837f-4d18-90c9-c78ca0d14340\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964ff6ca-837f-4d18-90c9-c78ca0d14340-logs\") pod \"964ff6ca-837f-4d18-90c9-c78ca0d14340\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data-custom\") pod \"964ff6ca-837f-4d18-90c9-c78ca0d14340\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-combined-ca-bundle\") pod \"964ff6ca-837f-4d18-90c9-c78ca0d14340\" (UID: \"964ff6ca-837f-4d18-90c9-c78ca0d14340\") " Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743876 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.743892 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b67658-af5d-467a-8952-22db7d822791-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.748215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964ff6ca-837f-4d18-90c9-c78ca0d14340-logs" (OuterVolumeSpecName: "logs") pod "964ff6ca-837f-4d18-90c9-c78ca0d14340" (UID: "964ff6ca-837f-4d18-90c9-c78ca0d14340"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.752414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "964ff6ca-837f-4d18-90c9-c78ca0d14340" (UID: "964ff6ca-837f-4d18-90c9-c78ca0d14340"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.754822 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964ff6ca-837f-4d18-90c9-c78ca0d14340-kube-api-access-wpmdg" (OuterVolumeSpecName: "kube-api-access-wpmdg") pod "964ff6ca-837f-4d18-90c9-c78ca0d14340" (UID: "964ff6ca-837f-4d18-90c9-c78ca0d14340"). InnerVolumeSpecName "kube-api-access-wpmdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.788318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data" (OuterVolumeSpecName: "config-data") pod "964ff6ca-837f-4d18-90c9-c78ca0d14340" (UID: "964ff6ca-837f-4d18-90c9-c78ca0d14340"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.791708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "964ff6ca-837f-4d18-90c9-c78ca0d14340" (UID: "964ff6ca-837f-4d18-90c9-c78ca0d14340"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.844662 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.844694 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpmdg\" (UniqueName: \"kubernetes.io/projected/964ff6ca-837f-4d18-90c9-c78ca0d14340-kube-api-access-wpmdg\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.844706 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.844719 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964ff6ca-837f-4d18-90c9-c78ca0d14340-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.844727 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964ff6ca-837f-4d18-90c9-c78ca0d14340-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.881697 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.891296 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901243 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:52:07 crc kubenswrapper[4707]: E0121 15:52:07.901673 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="probe" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901687 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="probe" Jan 21 15:52:07 crc kubenswrapper[4707]: E0121 15:52:07.901704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901710 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api" Jan 21 15:52:07 crc kubenswrapper[4707]: E0121 15:52:07.901732 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api-log" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901738 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api-log" Jan 21 15:52:07 crc kubenswrapper[4707]: E0121 15:52:07.901760 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="cinder-scheduler" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901767 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="cinder-scheduler" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901946 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901959 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" containerName="barbican-api-log" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901978 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="cinder-scheduler" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.901988 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b67658-af5d-467a-8952-22db7d822791" containerName="probe" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.910381 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.911373 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.925126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.946638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.946675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdnl\" (UniqueName: \"kubernetes.io/projected/ece56fe7-15d7-4701-b4c7-e102b077c8ea-kube-api-access-qvdnl\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.946724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ece56fe7-15d7-4701-b4c7-e102b077c8ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.946754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.946777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.946821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:07 crc kubenswrapper[4707]: I0121 15:52:07.962306 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.011216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.047915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.048316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdnl\" (UniqueName: \"kubernetes.io/projected/ece56fe7-15d7-4701-b4c7-e102b077c8ea-kube-api-access-qvdnl\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.048408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ece56fe7-15d7-4701-b4c7-e102b077c8ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.048445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.048489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.048509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.049426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ece56fe7-15d7-4701-b4c7-e102b077c8ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.054532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.055702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.055914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.056373 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.071456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdnl\" (UniqueName: \"kubernetes.io/projected/ece56fe7-15d7-4701-b4c7-e102b077c8ea-kube-api-access-qvdnl\") pod \"cinder-scheduler-0\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.082852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-58456fd9fd-9nb94"] Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.083064 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-log" containerID="cri-o://19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a" gracePeriod=30 Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.084124 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-api" containerID="cri-o://3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0" gracePeriod=30 Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.245863 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.608504 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.608949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:52:08 crc kubenswrapper[4707]: W0121 15:52:08.616899 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece56fe7_15d7_4701_b4c7_e102b077c8ea.slice/crio-5fd7e6f4f05b2fe31544430845b31fdd0f11aefe60765d95d1234e5e9d494675 WatchSource:0}: Error finding container 5fd7e6f4f05b2fe31544430845b31fdd0f11aefe60765d95d1234e5e9d494675: Status 404 returned error can't find the container with id 5fd7e6f4f05b2fe31544430845b31fdd0f11aefe60765d95d1234e5e9d494675 Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.644587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" event={"ID":"964ff6ca-837f-4d18-90c9-c78ca0d14340","Type":"ContainerDied","Data":"bdd42295d139bf8eb5d65c7e1315459ff8e6efa45102bced4efd8b3df6895c90"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.644707 4707 scope.go:117] "RemoveContainer" containerID="70b8b775962c4e1bdd55d20416c186cff48d9907089ad48cb2bd7ce99aa3d814" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.644845 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.651749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"abd22cf0-492c-48c5-b675-0db84da277d2","Type":"ContainerStarted","Data":"65eec0adec4ea3d5220c8e7656b5f3fd34f99bee425fa8317472937e8f59f30a"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.657094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"12285e17-dc8c-4a74-863a-401ef36721aa","Type":"ContainerStarted","Data":"d21d2249928786712ab3ed642b595392127fdedfe23bedfc92fcb6241faa2efb"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.657150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"12285e17-dc8c-4a74-863a-401ef36721aa","Type":"ContainerStarted","Data":"757c5a0f153201d89b8d9d3932fdfdcf9f7e0059e50722f695d0a99f155ec170"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.657711 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-log-httpd\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-combined-ca-bundle\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-config-data\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkqx\" (UniqueName: \"kubernetes.io/projected/340dc9fb-5dbb-4be3-a139-f31e03c43cad-kube-api-access-4pkqx\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-scripts\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-run-httpd\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-sg-core-conf-yaml\") pod \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\" (UID: \"340dc9fb-5dbb-4be3-a139-f31e03c43cad\") " Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.658994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.659356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.659501 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.659514 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/340dc9fb-5dbb-4be3-a139-f31e03c43cad-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.663072 4707 generic.go:334] "Generic (PLEG): container finished" podID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerID="4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320" exitCode=137 Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.663136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerDied","Data":"4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.663162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"340dc9fb-5dbb-4be3-a139-f31e03c43cad","Type":"ContainerDied","Data":"73132fdef03dff860634d10a418475b500cf8d9c2efcefd957cf29acb8dd23d5"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.663232 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.667167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340dc9fb-5dbb-4be3-a139-f31e03c43cad-kube-api-access-4pkqx" (OuterVolumeSpecName: "kube-api-access-4pkqx") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "kube-api-access-4pkqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.668085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-scripts" (OuterVolumeSpecName: "scripts") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.669933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" event={"ID":"c29af5f9-41ed-4f5b-a689-f02942a2366c","Type":"ContainerDied","Data":"19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a"} Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.670291 4707 generic.go:334] "Generic (PLEG): container finished" podID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerID="19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a" exitCode=143 Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.675146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.6751324370000003 podStartE2EDuration="2.675132437s" podCreationTimestamp="2026-01-21 15:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:08.667628828 +0000 UTC m=+3025.849145050" watchObservedRunningTime="2026-01-21 15:52:08.675132437 +0000 UTC m=+3025.856648659" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.682976 4707 scope.go:117] "RemoveContainer" containerID="16f2fa7274651efe218854309098a344d64c182e3e05144df53e9fa1fa123e4f" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.688376 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.688363304 podStartE2EDuration="2.688363304s" podCreationTimestamp="2026-01-21 15:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:08.684011322 +0000 UTC m=+3025.865527544" watchObservedRunningTime="2026-01-21 15:52:08.688363304 +0000 UTC m=+3025.869879526" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.702966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.707595 4707 scope.go:117] "RemoveContainer" containerID="009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.720891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx"] Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.727306 4707 scope.go:117] "RemoveContainer" containerID="dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.733758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.735221 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6944fd56d4-74bxx"] Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.749568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-config-data" (OuterVolumeSpecName: "config-data") pod "340dc9fb-5dbb-4be3-a139-f31e03c43cad" (UID: "340dc9fb-5dbb-4be3-a139-f31e03c43cad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.750520 4707 scope.go:117] "RemoveContainer" containerID="302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.769367 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.769390 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.769401 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.769411 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340dc9fb-5dbb-4be3-a139-f31e03c43cad-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.769419 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkqx\" (UniqueName: \"kubernetes.io/projected/340dc9fb-5dbb-4be3-a139-f31e03c43cad-kube-api-access-4pkqx\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.776225 4707 scope.go:117] "RemoveContainer" containerID="4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.802726 4707 scope.go:117] "RemoveContainer" containerID="009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3" Jan 21 15:52:08 crc kubenswrapper[4707]: E0121 15:52:08.803176 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3\": container with ID starting with 009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3 not found: ID does not exist" containerID="009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.803299 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3"} err="failed to get container status \"009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3\": rpc error: code = NotFound desc = could not find container \"009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3\": container with ID starting with 
009e1e873cec23a52ef7b6a4620b54b52fc461a4e6d9c6b69e536205ba4c8ee3 not found: ID does not exist" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.803380 4707 scope.go:117] "RemoveContainer" containerID="dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc" Jan 21 15:52:08 crc kubenswrapper[4707]: E0121 15:52:08.803804 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc\": container with ID starting with dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc not found: ID does not exist" containerID="dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.803851 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc"} err="failed to get container status \"dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc\": rpc error: code = NotFound desc = could not find container \"dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc\": container with ID starting with dd84f38c31ce372b9f9d69ea996eb0537ee2f405219c3171d2be86712cc94ddc not found: ID does not exist" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.803878 4707 scope.go:117] "RemoveContainer" containerID="302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a" Jan 21 15:52:08 crc kubenswrapper[4707]: E0121 15:52:08.804391 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a\": container with ID starting with 302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a not found: ID does not exist" containerID="302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.804414 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a"} err="failed to get container status \"302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a\": rpc error: code = NotFound desc = could not find container \"302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a\": container with ID starting with 302c9e791dda99301ad5a303b5223a611ecd7e31baacefe15187f317415f5f2a not found: ID does not exist" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.804429 4707 scope.go:117] "RemoveContainer" containerID="4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320" Jan 21 15:52:08 crc kubenswrapper[4707]: E0121 15:52:08.804685 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320\": container with ID starting with 4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320 not found: ID does not exist" containerID="4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.806564 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320"} err="failed to get container status \"4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320\": rpc 
error: code = NotFound desc = could not find container \"4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320\": container with ID starting with 4a46fa976a2da1289b0e1bf13f3848677cc9575d72351c206632abc2bcca5320 not found: ID does not exist" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.887061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.960341 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs"] Jan 21 15:52:08 crc kubenswrapper[4707]: I0121 15:52:08.960872 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerName="dnsmasq-dns" containerID="cri-o://a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda" gracePeriod=10 Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.019708 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.037986 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.052149 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:09 crc kubenswrapper[4707]: E0121 15:52:09.053414 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-notification-agent" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.053518 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-notification-agent" Jan 21 15:52:09 crc kubenswrapper[4707]: E0121 15:52:09.053606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="sg-core" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.053658 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="sg-core" Jan 21 15:52:09 crc kubenswrapper[4707]: E0121 15:52:09.053716 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="proxy-httpd" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.053761 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="proxy-httpd" Jan 21 15:52:09 crc kubenswrapper[4707]: E0121 15:52:09.059345 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-central-agent" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.059379 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-central-agent" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.059788 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="sg-core" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.059833 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="proxy-httpd" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.059847 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-central-agent" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.059861 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" containerName="ceilometer-notification-agent" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.061671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.063886 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.072286 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.135790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.178879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-log-httpd\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.179114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdqr\" (UniqueName: \"kubernetes.io/projected/132ee358-5559-4c6e-b6d9-742bc3a89735-kube-api-access-vjdqr\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.179223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-scripts\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.179550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.179800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.179947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-config-data\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.180063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-run-httpd\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.191673 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340dc9fb-5dbb-4be3-a139-f31e03c43cad" path="/var/lib/kubelet/pods/340dc9fb-5dbb-4be3-a139-f31e03c43cad/volumes" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.192675 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b67658-af5d-467a-8952-22db7d822791" path="/var/lib/kubelet/pods/50b67658-af5d-467a-8952-22db7d822791/volumes" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.193653 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="964ff6ca-837f-4d18-90c9-c78ca0d14340" path="/var/lib/kubelet/pods/964ff6ca-837f-4d18-90c9-c78ca0d14340/volumes" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-config-data\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-run-httpd\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-log-httpd\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdqr\" (UniqueName: \"kubernetes.io/projected/132ee358-5559-4c6e-b6d9-742bc3a89735-kube-api-access-vjdqr\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.282796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-scripts\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: 
I0121 15:52:09.283821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-run-httpd\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.284918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-log-httpd\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.288175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.311696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-scripts\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.317479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdqr\" (UniqueName: \"kubernetes.io/projected/132ee358-5559-4c6e-b6d9-742bc3a89735-kube-api-access-vjdqr\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.326309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.326887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-config-data\") pod \"ceilometer-0\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.405089 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.478689 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.588053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsrf6\" (UniqueName: \"kubernetes.io/projected/3282f63b-58f5-4f45-a684-55a09adb04d6-kube-api-access-zsrf6\") pod \"3282f63b-58f5-4f45-a684-55a09adb04d6\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.588237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dns-swift-storage-0\") pod \"3282f63b-58f5-4f45-a684-55a09adb04d6\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.588364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-config\") pod \"3282f63b-58f5-4f45-a684-55a09adb04d6\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.588421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dnsmasq-svc\") pod \"3282f63b-58f5-4f45-a684-55a09adb04d6\" (UID: \"3282f63b-58f5-4f45-a684-55a09adb04d6\") " Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.595226 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3282f63b-58f5-4f45-a684-55a09adb04d6-kube-api-access-zsrf6" (OuterVolumeSpecName: "kube-api-access-zsrf6") pod "3282f63b-58f5-4f45-a684-55a09adb04d6" (UID: "3282f63b-58f5-4f45-a684-55a09adb04d6"). InnerVolumeSpecName "kube-api-access-zsrf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.639689 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3282f63b-58f5-4f45-a684-55a09adb04d6" (UID: "3282f63b-58f5-4f45-a684-55a09adb04d6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.651359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-config" (OuterVolumeSpecName: "config") pod "3282f63b-58f5-4f45-a684-55a09adb04d6" (UID: "3282f63b-58f5-4f45-a684-55a09adb04d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.651965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "3282f63b-58f5-4f45-a684-55a09adb04d6" (UID: "3282f63b-58f5-4f45-a684-55a09adb04d6"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.686121 4707 generic.go:334] "Generic (PLEG): container finished" podID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerID="a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda" exitCode=0 Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.686173 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.686188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" event={"ID":"3282f63b-58f5-4f45-a684-55a09adb04d6","Type":"ContainerDied","Data":"a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda"} Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.686218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs" event={"ID":"3282f63b-58f5-4f45-a684-55a09adb04d6","Type":"ContainerDied","Data":"07fc92e5a685c723b68596f66af05e9e7d7a11eb521ce8734f160da1ae771251"} Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.686234 4707 scope.go:117] "RemoveContainer" containerID="a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.690138 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.690154 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.690167 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsrf6\" (UniqueName: \"kubernetes.io/projected/3282f63b-58f5-4f45-a684-55a09adb04d6-kube-api-access-zsrf6\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.690175 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3282f63b-58f5-4f45-a684-55a09adb04d6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.700220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ece56fe7-15d7-4701-b4c7-e102b077c8ea","Type":"ContainerStarted","Data":"fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9"} Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.700275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ece56fe7-15d7-4701-b4c7-e102b077c8ea","Type":"ContainerStarted","Data":"5fd7e6f4f05b2fe31544430845b31fdd0f11aefe60765d95d1234e5e9d494675"} Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.741169 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs"] Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.741687 4707 scope.go:117] "RemoveContainer" containerID="c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.763128 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-677f7d4c8c-n77qs"] Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.763636 4707 scope.go:117] "RemoveContainer" containerID="a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda" Jan 21 15:52:09 crc kubenswrapper[4707]: E0121 15:52:09.764500 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda\": container with ID starting with a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda not found: ID does not exist" containerID="a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.764530 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda"} err="failed to get container status \"a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda\": rpc error: code = NotFound desc = could not find container \"a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda\": container with ID starting with a5b38ba613e252314f1901b43f08b94ebe22ea4b26a34e84ce503253d3a70dda not found: ID does not exist" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.764548 4707 scope.go:117] "RemoveContainer" containerID="c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97" Jan 21 15:52:09 crc kubenswrapper[4707]: E0121 15:52:09.765545 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97\": container with ID starting with c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97 not found: ID does not exist" containerID="c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.765571 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97"} err="failed to get container status \"c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97\": rpc error: code = NotFound desc = could not find container \"c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97\": container with ID starting with c470105838d3c6710997ac13383ea198faa4b4c5fb970200c39de496e5e80c97 not found: ID does not exist" Jan 21 15:52:09 crc kubenswrapper[4707]: I0121 15:52:09.976398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:09 crc kubenswrapper[4707]: W0121 15:52:09.982637 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod132ee358_5559_4c6e_b6d9_742bc3a89735.slice/crio-7e85bb6e95f0c1fbe624c2e7fa539e32d0f0b8a3f16b0c06ec940c42e5a910fb WatchSource:0}: Error finding container 7e85bb6e95f0c1fbe624c2e7fa539e32d0f0b8a3f16b0c06ec940c42e5a910fb: Status 404 returned error can't find the container with id 7e85bb6e95f0c1fbe624c2e7fa539e32d0f0b8a3f16b0c06ec940c42e5a910fb Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.710647 4707 generic.go:334] "Generic (PLEG): container finished" podID="12285e17-dc8c-4a74-863a-401ef36721aa" containerID="d21d2249928786712ab3ed642b595392127fdedfe23bedfc92fcb6241faa2efb" exitCode=1 Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 
15:52:10.710741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"12285e17-dc8c-4a74-863a-401ef36721aa","Type":"ContainerDied","Data":"d21d2249928786712ab3ed642b595392127fdedfe23bedfc92fcb6241faa2efb"} Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.711227 4707 scope.go:117] "RemoveContainer" containerID="d21d2249928786712ab3ed642b595392127fdedfe23bedfc92fcb6241faa2efb" Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.713983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ece56fe7-15d7-4701-b4c7-e102b077c8ea","Type":"ContainerStarted","Data":"dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724"} Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.715168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerStarted","Data":"7e85bb6e95f0c1fbe624c2e7fa539e32d0f0b8a3f16b0c06ec940c42e5a910fb"} Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.718712 4707 generic.go:334] "Generic (PLEG): container finished" podID="abd22cf0-492c-48c5-b675-0db84da277d2" containerID="65eec0adec4ea3d5220c8e7656b5f3fd34f99bee425fa8317472937e8f59f30a" exitCode=1 Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.718749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"abd22cf0-492c-48c5-b675-0db84da277d2","Type":"ContainerDied","Data":"65eec0adec4ea3d5220c8e7656b5f3fd34f99bee425fa8317472937e8f59f30a"} Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.719407 4707 scope.go:117] "RemoveContainer" containerID="65eec0adec4ea3d5220c8e7656b5f3fd34f99bee425fa8317472937e8f59f30a" Jan 21 15:52:10 crc kubenswrapper[4707]: I0121 15:52:10.768023 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.768007329 podStartE2EDuration="3.768007329s" podCreationTimestamp="2026-01-21 15:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:10.764936304 +0000 UTC m=+3027.946452527" watchObservedRunningTime="2026-01-21 15:52:10.768007329 +0000 UTC m=+3027.949523551" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.134178 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.200848 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" path="/var/lib/kubelet/pods/3282f63b-58f5-4f45-a684-55a09adb04d6/volumes" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.214361 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-api" probeResult="failure" output="Get \"http://10.217.1.154:8778/\": read tcp 10.217.0.2:41412->10.217.1.154:8778: read: connection reset by peer" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.214433 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-log" probeResult="failure" output="Get \"http://10.217.1.154:8778/\": read tcp 10.217.0.2:41398->10.217.1.154:8778: read: connection reset by peer" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.319645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data\") pod \"a9c6c283-dad9-4fe2-91a7-184df0427be8\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.319964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data-custom\") pod \"a9c6c283-dad9-4fe2-91a7-184df0427be8\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.320010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw869\" (UniqueName: \"kubernetes.io/projected/a9c6c283-dad9-4fe2-91a7-184df0427be8-kube-api-access-lw869\") pod \"a9c6c283-dad9-4fe2-91a7-184df0427be8\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.320062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-combined-ca-bundle\") pod \"a9c6c283-dad9-4fe2-91a7-184df0427be8\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.320119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c6c283-dad9-4fe2-91a7-184df0427be8-logs\") pod \"a9c6c283-dad9-4fe2-91a7-184df0427be8\" (UID: \"a9c6c283-dad9-4fe2-91a7-184df0427be8\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.322757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9c6c283-dad9-4fe2-91a7-184df0427be8-logs" (OuterVolumeSpecName: "logs") pod "a9c6c283-dad9-4fe2-91a7-184df0427be8" (UID: "a9c6c283-dad9-4fe2-91a7-184df0427be8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.323197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a9c6c283-dad9-4fe2-91a7-184df0427be8" (UID: "a9c6c283-dad9-4fe2-91a7-184df0427be8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.327732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c6c283-dad9-4fe2-91a7-184df0427be8-kube-api-access-lw869" (OuterVolumeSpecName: "kube-api-access-lw869") pod "a9c6c283-dad9-4fe2-91a7-184df0427be8" (UID: "a9c6c283-dad9-4fe2-91a7-184df0427be8"). InnerVolumeSpecName "kube-api-access-lw869". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.356000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9c6c283-dad9-4fe2-91a7-184df0427be8" (UID: "a9c6c283-dad9-4fe2-91a7-184df0427be8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.368894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data" (OuterVolumeSpecName: "config-data") pod "a9c6c283-dad9-4fe2-91a7-184df0427be8" (UID: "a9c6c283-dad9-4fe2-91a7-184df0427be8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.374934 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.422396 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.422583 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.422680 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw869\" (UniqueName: \"kubernetes.io/projected/a9c6c283-dad9-4fe2-91a7-184df0427be8-kube-api-access-lw869\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.422744 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c6c283-dad9-4fe2-91a7-184df0427be8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.422803 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c6c283-dad9-4fe2-91a7-184df0427be8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.524348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data-custom\") pod \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.524457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data\") pod \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.524480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-combined-ca-bundle\") pod \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.524517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-logs\") pod \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.524679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24tkw\" (UniqueName: \"kubernetes.io/projected/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-kube-api-access-24tkw\") pod \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\" (UID: \"65b2a5f8-e722-43a5-8765-1b6f281e9b8d\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.526177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-logs" (OuterVolumeSpecName: "logs") pod "65b2a5f8-e722-43a5-8765-1b6f281e9b8d" (UID: "65b2a5f8-e722-43a5-8765-1b6f281e9b8d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.532878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-kube-api-access-24tkw" (OuterVolumeSpecName: "kube-api-access-24tkw") pod "65b2a5f8-e722-43a5-8765-1b6f281e9b8d" (UID: "65b2a5f8-e722-43a5-8765-1b6f281e9b8d"). InnerVolumeSpecName "kube-api-access-24tkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.535900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "65b2a5f8-e722-43a5-8765-1b6f281e9b8d" (UID: "65b2a5f8-e722-43a5-8765-1b6f281e9b8d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.568623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data" (OuterVolumeSpecName: "config-data") pod "65b2a5f8-e722-43a5-8765-1b6f281e9b8d" (UID: "65b2a5f8-e722-43a5-8765-1b6f281e9b8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.571576 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.572210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65b2a5f8-e722-43a5-8765-1b6f281e9b8d" (UID: "65b2a5f8-e722-43a5-8765-1b6f281e9b8d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.626696 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.626917 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.627000 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.627057 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.627110 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24tkw\" (UniqueName: \"kubernetes.io/projected/65b2a5f8-e722-43a5-8765-1b6f281e9b8d-kube-api-access-24tkw\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.727859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-scripts\") pod \"c29af5f9-41ed-4f5b-a689-f02942a2366c\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.727923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-combined-ca-bundle\") pod \"c29af5f9-41ed-4f5b-a689-f02942a2366c\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.727989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8qzw\" (UniqueName: \"kubernetes.io/projected/c29af5f9-41ed-4f5b-a689-f02942a2366c-kube-api-access-w8qzw\") pod \"c29af5f9-41ed-4f5b-a689-f02942a2366c\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.728051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29af5f9-41ed-4f5b-a689-f02942a2366c-logs\") pod \"c29af5f9-41ed-4f5b-a689-f02942a2366c\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.728104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-config-data\") pod \"c29af5f9-41ed-4f5b-a689-f02942a2366c\" (UID: \"c29af5f9-41ed-4f5b-a689-f02942a2366c\") " Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.728451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29af5f9-41ed-4f5b-a689-f02942a2366c-logs" (OuterVolumeSpecName: "logs") pod "c29af5f9-41ed-4f5b-a689-f02942a2366c" (UID: "c29af5f9-41ed-4f5b-a689-f02942a2366c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.730074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"abd22cf0-492c-48c5-b675-0db84da277d2","Type":"ContainerStarted","Data":"672f5dec2a5ec86891dcdf9db026fc7510b70c1bd2772145774fb3771d502e2a"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.733471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"12285e17-dc8c-4a74-863a-401ef36721aa","Type":"ContainerStarted","Data":"ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.735409 4707 generic.go:334] "Generic (PLEG): container finished" podID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerID="3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0" exitCode=0 Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.735534 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.735631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" event={"ID":"c29af5f9-41ed-4f5b-a689-f02942a2366c","Type":"ContainerDied","Data":"3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.735699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29af5f9-41ed-4f5b-a689-f02942a2366c-kube-api-access-w8qzw" (OuterVolumeSpecName: "kube-api-access-w8qzw") pod "c29af5f9-41ed-4f5b-a689-f02942a2366c" (UID: "c29af5f9-41ed-4f5b-a689-f02942a2366c"). InnerVolumeSpecName "kube-api-access-w8qzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.735788 4707 scope.go:117] "RemoveContainer" containerID="3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.735857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-scripts" (OuterVolumeSpecName: "scripts") pod "c29af5f9-41ed-4f5b-a689-f02942a2366c" (UID: "c29af5f9-41ed-4f5b-a689-f02942a2366c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.736955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-58456fd9fd-9nb94" event={"ID":"c29af5f9-41ed-4f5b-a689-f02942a2366c","Type":"ContainerDied","Data":"1a4be0d60b41d1b7aaec854fbeed19a16ca7e4e6d40f68e9ec041867320b9543"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.738397 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerID="409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5" exitCode=137 Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.738505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" event={"ID":"a9c6c283-dad9-4fe2-91a7-184df0427be8","Type":"ContainerDied","Data":"409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.738593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" event={"ID":"a9c6c283-dad9-4fe2-91a7-184df0427be8","Type":"ContainerDied","Data":"5763c506bcb491757450aef3bb9ba416785bc89b8eeaf1f08a3091941fded5a3"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.738681 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.751309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerStarted","Data":"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.751347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerStarted","Data":"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.761406 4707 generic.go:334] "Generic (PLEG): container finished" podID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerID="9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736" exitCode=137 Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.763081 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.776915 4707 scope.go:117] "RemoveContainer" containerID="19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.777060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" event={"ID":"65b2a5f8-e722-43a5-8765-1b6f281e9b8d","Type":"ContainerDied","Data":"9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.777092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-568886d945-qqxjr" event={"ID":"65b2a5f8-e722-43a5-8765-1b6f281e9b8d","Type":"ContainerDied","Data":"53c9d1533f473c8420f1ae18d35c47189ddac03a313992c676247a71ed2ee1b8"} Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.793353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c29af5f9-41ed-4f5b-a689-f02942a2366c" (UID: "c29af5f9-41ed-4f5b-a689-f02942a2366c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.810757 4707 scope.go:117] "RemoveContainer" containerID="3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0" Jan 21 15:52:11 crc kubenswrapper[4707]: E0121 15:52:11.811125 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0\": container with ID starting with 3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0 not found: ID does not exist" containerID="3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.811162 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0"} err="failed to get container status \"3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0\": rpc error: code = NotFound desc = could not find container \"3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0\": container with ID starting with 3b7d560dc260b189ab3db5fcde527daa9a94f19a70c55383e1245d51a1fb9cd0 not found: ID does not exist" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.811188 4707 scope.go:117] "RemoveContainer" containerID="19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a" Jan 21 15:52:11 crc kubenswrapper[4707]: E0121 15:52:11.811462 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a\": container with ID starting with 19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a not found: ID does not exist" containerID="19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.811490 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a"} err="failed to get container status 
\"19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a\": rpc error: code = NotFound desc = could not find container \"19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a\": container with ID starting with 19db9216e1421242518daa87b5fe809284eea25c1f72be54681d5f8f0d30797a not found: ID does not exist" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.811506 4707 scope.go:117] "RemoveContainer" containerID="409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.816642 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9"] Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.819900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-config-data" (OuterVolumeSpecName: "config-data") pod "c29af5f9-41ed-4f5b-a689-f02942a2366c" (UID: "c29af5f9-41ed-4f5b-a689-f02942a2366c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.825862 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-75d95b7f46-dlmg9"] Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.826502 4707 scope.go:117] "RemoveContainer" containerID="ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.839468 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-568886d945-qqxjr"] Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.846790 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-568886d945-qqxjr"] Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.849310 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.849357 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.849459 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8qzw\" (UniqueName: \"kubernetes.io/projected/c29af5f9-41ed-4f5b-a689-f02942a2366c-kube-api-access-w8qzw\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.849495 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29af5f9-41ed-4f5b-a689-f02942a2366c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.849521 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29af5f9-41ed-4f5b-a689-f02942a2366c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.863516 4707 scope.go:117] "RemoveContainer" containerID="409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5" Jan 21 15:52:11 crc kubenswrapper[4707]: E0121 15:52:11.863980 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5\": container with ID starting with 409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5 not found: ID does not exist" containerID="409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.864018 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5"} err="failed to get container status \"409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5\": rpc error: code = NotFound desc = could not find container \"409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5\": container with ID starting with 409d0db097703c3dcb1300f3b62f50d52c16ad162d278551cdf5213ab2d4f1f5 not found: ID does not exist" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.864043 4707 scope.go:117] "RemoveContainer" containerID="ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b" Jan 21 15:52:11 crc kubenswrapper[4707]: E0121 15:52:11.864301 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b\": container with ID starting with ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b not found: ID does not exist" containerID="ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.864320 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b"} err="failed to get container status \"ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b\": rpc error: code = NotFound desc = could not find container \"ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b\": container with ID starting with ed88317d7eec62ca494b7a6a763608c1b7a795e463ab8ddda4865693c1f93a9b not found: ID does not exist" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.864333 4707 scope.go:117] "RemoveContainer" containerID="9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.885192 4707 scope.go:117] "RemoveContainer" containerID="814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.900145 4707 scope.go:117] "RemoveContainer" containerID="9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736" Jan 21 15:52:11 crc kubenswrapper[4707]: E0121 15:52:11.900442 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736\": container with ID starting with 9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736 not found: ID does not exist" containerID="9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.900479 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736"} err="failed to get container status \"9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736\": rpc error: code = NotFound desc = could not find container 
\"9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736\": container with ID starting with 9775e3bbdd9e6eff41fcbe6c84e50f8d003642c24f6157ca62ca298d8f16a736 not found: ID does not exist" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.900503 4707 scope.go:117] "RemoveContainer" containerID="814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f" Jan 21 15:52:11 crc kubenswrapper[4707]: E0121 15:52:11.900737 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f\": container with ID starting with 814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f not found: ID does not exist" containerID="814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.900763 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f"} err="failed to get container status \"814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f\": rpc error: code = NotFound desc = could not find container \"814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f\": container with ID starting with 814eee958ce50161a27fc968242f6038c9868ec0d352dc18dd9b2daa0f32af5f not found: ID does not exist" Jan 21 15:52:11 crc kubenswrapper[4707]: I0121 15:52:11.923127 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:12 crc kubenswrapper[4707]: I0121 15:52:12.069720 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-58456fd9fd-9nb94"] Jan 21 15:52:12 crc kubenswrapper[4707]: I0121 15:52:12.075152 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-58456fd9fd-9nb94"] Jan 21 15:52:12 crc kubenswrapper[4707]: I0121 15:52:12.196772 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:12 crc kubenswrapper[4707]: I0121 15:52:12.774183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerStarted","Data":"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8"} Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.191624 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" path="/var/lib/kubelet/pods/65b2a5f8-e722-43a5-8765-1b6f281e9b8d/volumes" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.192571 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" path="/var/lib/kubelet/pods/a9c6c283-dad9-4fe2-91a7-184df0427be8/volumes" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.193130 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" path="/var/lib/kubelet/pods/c29af5f9-41ed-4f5b-a689-f02942a2366c/volumes" Jan 21 15:52:13 crc kubenswrapper[4707]: E0121 15:52:13.197625 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c is running failed: container process not found" 
containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:13 crc kubenswrapper[4707]: E0121 15:52:13.197887 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c is running failed: container process not found" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:13 crc kubenswrapper[4707]: E0121 15:52:13.198266 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c is running failed: container process not found" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:13 crc kubenswrapper[4707]: E0121 15:52:13.198317 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.246672 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.783536 4707 generic.go:334] "Generic (PLEG): container finished" podID="abd22cf0-492c-48c5-b675-0db84da277d2" containerID="672f5dec2a5ec86891dcdf9db026fc7510b70c1bd2772145774fb3771d502e2a" exitCode=1 Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.783620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"abd22cf0-492c-48c5-b675-0db84da277d2","Type":"ContainerDied","Data":"672f5dec2a5ec86891dcdf9db026fc7510b70c1bd2772145774fb3771d502e2a"} Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.783915 4707 scope.go:117] "RemoveContainer" containerID="65eec0adec4ea3d5220c8e7656b5f3fd34f99bee425fa8317472937e8f59f30a" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.784421 4707 scope.go:117] "RemoveContainer" containerID="672f5dec2a5ec86891dcdf9db026fc7510b70c1bd2772145774fb3771d502e2a" Jan 21 15:52:13 crc kubenswrapper[4707]: E0121 15:52:13.784694 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(abd22cf0-492c-48c5-b675-0db84da277d2)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.785702 4707 generic.go:334] "Generic (PLEG): container finished" podID="12285e17-dc8c-4a74-863a-401ef36721aa" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" exitCode=1 Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.785766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" 
event={"ID":"12285e17-dc8c-4a74-863a-401ef36721aa","Type":"ContainerDied","Data":"ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c"} Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.786005 4707 scope.go:117] "RemoveContainer" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" Jan 21 15:52:13 crc kubenswrapper[4707]: E0121 15:52:13.786202 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(12285e17-dc8c-4a74-863a-401ef36721aa)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.787906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerStarted","Data":"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762"} Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.788110 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.831658 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.437995486 podStartE2EDuration="4.8316394s" podCreationTimestamp="2026-01-21 15:52:09 +0000 UTC" firstStartedPulling="2026-01-21 15:52:09.985123307 +0000 UTC m=+3027.166639529" lastFinishedPulling="2026-01-21 15:52:13.37876722 +0000 UTC m=+3030.560283443" observedRunningTime="2026-01-21 15:52:13.823205511 +0000 UTC m=+3031.004721723" watchObservedRunningTime="2026-01-21 15:52:13.8316394 +0000 UTC m=+3031.013155622" Jan 21 15:52:13 crc kubenswrapper[4707]: I0121 15:52:13.835649 4707 scope.go:117] "RemoveContainer" containerID="d21d2249928786712ab3ed642b595392127fdedfe23bedfc92fcb6241faa2efb" Jan 21 15:52:14 crc kubenswrapper[4707]: I0121 15:52:14.798152 4707 scope.go:117] "RemoveContainer" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" Jan 21 15:52:14 crc kubenswrapper[4707]: E0121 15:52:14.798799 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(12285e17-dc8c-4a74-863a-401ef36721aa)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" Jan 21 15:52:15 crc kubenswrapper[4707]: I0121 15:52:15.953773 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:52:15 crc kubenswrapper[4707]: I0121 15:52:15.997519 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7c97698578-4r4js"] Jan 21 15:52:15 crc kubenswrapper[4707]: I0121 15:52:15.997776 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-api" containerID="cri-o://971937516b3b19a806e3c97a43385c931b217beac81e6d595bdb7f3762b9b713" gracePeriod=30 Jan 21 15:52:15 crc kubenswrapper[4707]: I0121 15:52:15.997951 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-httpd" containerID="cri-o://84b7c48321255d05f9f4ae3bebca5603a4d0fb4d5ce26d0a6812c180a204cc28" gracePeriod=30 Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.196595 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.197323 4707 scope.go:117] "RemoveContainer" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" Jan 21 15:52:16 crc kubenswrapper[4707]: E0121 15:52:16.197573 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(12285e17-dc8c-4a74-863a-401ef36721aa)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.815089 4707 generic.go:334] "Generic (PLEG): container finished" podID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerID="84b7c48321255d05f9f4ae3bebca5603a4d0fb4d5ce26d0a6812c180a204cc28" exitCode=0 Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.815214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" event={"ID":"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d","Type":"ContainerDied","Data":"84b7c48321255d05f9f4ae3bebca5603a4d0fb4d5ce26d0a6812c180a204cc28"} Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.923858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.923903 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.923915 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:16 crc kubenswrapper[4707]: I0121 15:52:16.924599 4707 scope.go:117] "RemoveContainer" containerID="672f5dec2a5ec86891dcdf9db026fc7510b70c1bd2772145774fb3771d502e2a" Jan 21 15:52:16 crc kubenswrapper[4707]: E0121 15:52:16.924891 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(abd22cf0-492c-48c5-b675-0db84da277d2)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" Jan 21 15:52:17 crc kubenswrapper[4707]: I0121 15:52:17.929147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:52:17 crc kubenswrapper[4707]: I0121 15:52:17.967857 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj"] Jan 21 15:52:17 crc kubenswrapper[4707]: I0121 15:52:17.968115 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" podUID="4d0c2183-3194-4f16-819c-2e2473083292" containerName="keystone-api" 
containerID="cri-o://2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb" gracePeriod=30 Jan 21 15:52:18 crc kubenswrapper[4707]: I0121 15:52:18.432220 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.778366 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.779537 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-log" containerID="cri-o://7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87" gracePeriod=30 Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.779638 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-api" containerID="cri-o://63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37" gracePeriod=30 Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.796743 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.797062 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-log" containerID="cri-o://dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca" gracePeriod=30 Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.797135 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-metadata" containerID="cri-o://e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753" gracePeriod=30 Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.804409 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.812544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.812753 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" containerID="cri-o://47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" gracePeriod=30 Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.829029 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.829294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="5c7f376c-03ea-47f2-9f24-e7bf986f83aa" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2" gracePeriod=30 Jan 21 15:52:20 crc kubenswrapper[4707]: I0121 15:52:20.905386 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 
15:52:21.172731 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.178196 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.234758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-combined-ca-bundle\") pod \"abd22cf0-492c-48c5-b675-0db84da277d2\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.234955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-config-data\") pod \"abd22cf0-492c-48c5-b675-0db84da277d2\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.234974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-combined-ca-bundle\") pod \"12285e17-dc8c-4a74-863a-401ef36721aa\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.235009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-config-data\") pod \"12285e17-dc8c-4a74-863a-401ef36721aa\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.235035 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpr2m\" (UniqueName: \"kubernetes.io/projected/12285e17-dc8c-4a74-863a-401ef36721aa-kube-api-access-qpr2m\") pod \"12285e17-dc8c-4a74-863a-401ef36721aa\" (UID: \"12285e17-dc8c-4a74-863a-401ef36721aa\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.235063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkjj\" (UniqueName: \"kubernetes.io/projected/abd22cf0-492c-48c5-b675-0db84da277d2-kube-api-access-djkjj\") pod \"abd22cf0-492c-48c5-b675-0db84da277d2\" (UID: \"abd22cf0-492c-48c5-b675-0db84da277d2\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.242385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd22cf0-492c-48c5-b675-0db84da277d2-kube-api-access-djkjj" (OuterVolumeSpecName: "kube-api-access-djkjj") pod "abd22cf0-492c-48c5-b675-0db84da277d2" (UID: "abd22cf0-492c-48c5-b675-0db84da277d2"). InnerVolumeSpecName "kube-api-access-djkjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.244955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12285e17-dc8c-4a74-863a-401ef36721aa-kube-api-access-qpr2m" (OuterVolumeSpecName: "kube-api-access-qpr2m") pod "12285e17-dc8c-4a74-863a-401ef36721aa" (UID: "12285e17-dc8c-4a74-863a-401ef36721aa"). InnerVolumeSpecName "kube-api-access-qpr2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.273267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-config-data" (OuterVolumeSpecName: "config-data") pod "12285e17-dc8c-4a74-863a-401ef36721aa" (UID: "12285e17-dc8c-4a74-863a-401ef36721aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.273276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12285e17-dc8c-4a74-863a-401ef36721aa" (UID: "12285e17-dc8c-4a74-863a-401ef36721aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.275364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-config-data" (OuterVolumeSpecName: "config-data") pod "abd22cf0-492c-48c5-b675-0db84da277d2" (UID: "abd22cf0-492c-48c5-b675-0db84da277d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.276057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abd22cf0-492c-48c5-b675-0db84da277d2" (UID: "abd22cf0-492c-48c5-b675-0db84da277d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.336354 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.336375 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.336384 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12285e17-dc8c-4a74-863a-401ef36721aa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.336393 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpr2m\" (UniqueName: \"kubernetes.io/projected/12285e17-dc8c-4a74-863a-401ef36721aa-kube-api-access-qpr2m\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.336401 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkjj\" (UniqueName: \"kubernetes.io/projected/abd22cf0-492c-48c5-b675-0db84da277d2-kube-api-access-djkjj\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.336410 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd22cf0-492c-48c5-b675-0db84da277d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.346741 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.437004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-combined-ca-bundle\") pod \"4d0c2183-3194-4f16-819c-2e2473083292\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.437068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4qjr\" (UniqueName: \"kubernetes.io/projected/4d0c2183-3194-4f16-819c-2e2473083292-kube-api-access-k4qjr\") pod \"4d0c2183-3194-4f16-819c-2e2473083292\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.437175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-config-data\") pod \"4d0c2183-3194-4f16-819c-2e2473083292\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.437233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-scripts\") pod \"4d0c2183-3194-4f16-819c-2e2473083292\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.437272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-credential-keys\") pod \"4d0c2183-3194-4f16-819c-2e2473083292\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.437360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-fernet-keys\") pod \"4d0c2183-3194-4f16-819c-2e2473083292\" (UID: \"4d0c2183-3194-4f16-819c-2e2473083292\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.441553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0c2183-3194-4f16-819c-2e2473083292-kube-api-access-k4qjr" (OuterVolumeSpecName: "kube-api-access-k4qjr") pod "4d0c2183-3194-4f16-819c-2e2473083292" (UID: "4d0c2183-3194-4f16-819c-2e2473083292"). InnerVolumeSpecName "kube-api-access-k4qjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.447983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4d0c2183-3194-4f16-819c-2e2473083292" (UID: "4d0c2183-3194-4f16-819c-2e2473083292"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.448035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4d0c2183-3194-4f16-819c-2e2473083292" (UID: "4d0c2183-3194-4f16-819c-2e2473083292"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.448056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-scripts" (OuterVolumeSpecName: "scripts") pod "4d0c2183-3194-4f16-819c-2e2473083292" (UID: "4d0c2183-3194-4f16-819c-2e2473083292"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.464075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d0c2183-3194-4f16-819c-2e2473083292" (UID: "4d0c2183-3194-4f16-819c-2e2473083292"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.471023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-config-data" (OuterVolumeSpecName: "config-data") pod "4d0c2183-3194-4f16-819c-2e2473083292" (UID: "4d0c2183-3194-4f16-819c-2e2473083292"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.494439 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.538482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-config-data\") pod \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.538848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-combined-ca-bundle\") pod \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.538914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brklh\" (UniqueName: \"kubernetes.io/projected/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-kube-api-access-brklh\") pod \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\" (UID: \"5c7f376c-03ea-47f2-9f24-e7bf986f83aa\") " Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.539193 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.539221 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.539231 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.539239 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-k4qjr\" (UniqueName: \"kubernetes.io/projected/4d0c2183-3194-4f16-819c-2e2473083292-kube-api-access-k4qjr\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.539257 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.539264 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d0c2183-3194-4f16-819c-2e2473083292-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.541834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-kube-api-access-brklh" (OuterVolumeSpecName: "kube-api-access-brklh") pod "5c7f376c-03ea-47f2-9f24-e7bf986f83aa" (UID: "5c7f376c-03ea-47f2-9f24-e7bf986f83aa"). InnerVolumeSpecName "kube-api-access-brklh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.563952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-config-data" (OuterVolumeSpecName: "config-data") pod "5c7f376c-03ea-47f2-9f24-e7bf986f83aa" (UID: "5c7f376c-03ea-47f2-9f24-e7bf986f83aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.565931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c7f376c-03ea-47f2-9f24-e7bf986f83aa" (UID: "5c7f376c-03ea-47f2-9f24-e7bf986f83aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.640965 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brklh\" (UniqueName: \"kubernetes.io/projected/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-kube-api-access-brklh\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.641002 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.641012 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7f376c-03ea-47f2-9f24-e7bf986f83aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.866440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"12285e17-dc8c-4a74-863a-401ef36721aa","Type":"ContainerDied","Data":"757c5a0f153201d89b8d9d3932fdfdcf9f7e0059e50722f695d0a99f155ec170"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.866494 4707 scope.go:117] "RemoveContainer" containerID="ef12ec40d3e1bb7576075d3be96884fd1167b800c8556110ee346b98ed21c63c" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.866451 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.868914 4707 generic.go:334] "Generic (PLEG): container finished" podID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerID="7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87" exitCode=143 Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.868958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"56f64f3c-5f52-42fa-a99d-1312c645ba3a","Type":"ContainerDied","Data":"7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.870051 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c7f376c-03ea-47f2-9f24-e7bf986f83aa" containerID="dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2" exitCode=0 Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.870089 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.870116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"5c7f376c-03ea-47f2-9f24-e7bf986f83aa","Type":"ContainerDied","Data":"dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.870146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"5c7f376c-03ea-47f2-9f24-e7bf986f83aa","Type":"ContainerDied","Data":"b556db42b77c3fd21fae8ff161f19f49c10fa830b335d6ae17f4c38473d90571"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.871731 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d0c2183-3194-4f16-819c-2e2473083292" containerID="2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb" exitCode=0 Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.871776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" event={"ID":"4d0c2183-3194-4f16-819c-2e2473083292","Type":"ContainerDied","Data":"2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.871793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" event={"ID":"4d0c2183-3194-4f16-819c-2e2473083292","Type":"ContainerDied","Data":"217c7055a6fea589075372f01f42fcc0ab20ca221329ff30f0ef3119a2e1c21b"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.871863 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.892571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"abd22cf0-492c-48c5-b675-0db84da277d2","Type":"ContainerDied","Data":"78825fc894a928c385a4cef6e0ed5cd7737d29654a919c993af269355da086ba"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.892641 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.908418 4707 scope.go:117] "RemoveContainer" containerID="dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2" Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.914693 4707 generic.go:334] "Generic (PLEG): container finished" podID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerID="dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca" exitCode=143 Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.914729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3","Type":"ContainerDied","Data":"dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca"} Jan 21 15:52:21 crc kubenswrapper[4707]: I0121 15:52:21.958076 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.003079 4707 scope.go:117] "RemoveContainer" containerID="dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.003614 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2\": container with ID starting with dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2 not found: ID does not exist" containerID="dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.003644 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2"} err="failed to get container status \"dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2\": rpc error: code = NotFound desc = could not find container \"dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2\": container with ID starting with dde1c76d226b0bdd12400ae207125a202f18a099e7778b3236b2f93fb415cee2 not found: ID does not exist" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.003666 4707 scope.go:117] "RemoveContainer" containerID="2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.004888 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.030568 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032355 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerName="init" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032380 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerName="init" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0c2183-3194-4f16-819c-2e2473083292" containerName="keystone-api" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032421 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0c2183-3194-4f16-819c-2e2473083292" containerName="keystone-api" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 
15:52:22.032438 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032445 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032462 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker-log" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker-log" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032483 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener-log" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032491 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener-log" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032512 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032518 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-log" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032545 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-log" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-api" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032582 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-api" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7f376c-03ea-47f2-9f24-e7bf986f83aa" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032614 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7f376c-03ea-47f2-9f24-e7bf986f83aa" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" containerName="nova-scheduler-scheduler" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032648 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" containerName="nova-scheduler-scheduler" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032677 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" containerName="nova-scheduler-scheduler" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032682 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" containerName="nova-scheduler-scheduler" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerName="dnsmasq-dns" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerName="dnsmasq-dns" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.032722 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.032728 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener-log" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033284 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" containerName="nova-scheduler-scheduler" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033294 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3282f63b-58f5-4f45-a684-55a09adb04d6" containerName="dnsmasq-dns" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c6c283-dad9-4fe2-91a7-184df0427be8" containerName="barbican-keystone-listener" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033344 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-log" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033362 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" containerName="nova-scheduler-scheduler" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033375 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0c2183-3194-4f16-819c-2e2473083292" containerName="keystone-api" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033390 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033404 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7f376c-03ea-47f2-9f24-e7bf986f83aa" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033423 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c29af5f9-41ed-4f5b-a689-f02942a2366c" containerName="placement-api" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.033439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b2a5f8-e722-43a5-8765-1b6f281e9b8d" containerName="barbican-worker-log" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.035054 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.049577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8np7m\" (UniqueName: \"kubernetes.io/projected/2ebce296-11df-4e07-8a09-855cf9ee3417-kube-api-access-8np7m\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.049743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.049927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.050871 4707 scope.go:117] "RemoveContainer" containerID="2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.051195 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:52:22 crc kubenswrapper[4707]: E0121 15:52:22.051660 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb\": container with ID starting with 2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb not found: ID does not exist" containerID="2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.051706 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb"} err="failed to get container status \"2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb\": rpc error: code = NotFound desc = could not find container \"2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb\": container with ID starting with 2d50aa4e77b66130e3c5b2fe69625516f87b84c5cdf60df7c9071c92be4959fb not found: ID does not exist" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.051747 4707 scope.go:117] "RemoveContainer" containerID="672f5dec2a5ec86891dcdf9db026fc7510b70c1bd2772145774fb3771d502e2a" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.057952 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.109736 
4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.115518 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.122753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.123402 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.124069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.126706 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.133123 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.142498 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.148493 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5c66cc94c7-xfnmj"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.151786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8np7m\" (UniqueName: \"kubernetes.io/projected/2ebce296-11df-4e07-8a09-855cf9ee3417-kube-api-access-8np7m\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.151895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.151974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5md\" (UniqueName: \"kubernetes.io/projected/011702e0-dd00-4236-8d20-ad2dfa6c84fa-kube-api-access-dp5md\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.152042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.152069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.152270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.153439 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.155712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.155751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.158729 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.164297 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.165643 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.167872 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.169054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.169144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8np7m\" (UniqueName: \"kubernetes.io/projected/2ebce296-11df-4e07-8a09-855cf9ee3417-kube-api-access-8np7m\") pod \"nova-cell0-conductor-0\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.254074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.254605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-config-data\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.254915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nnk\" (UniqueName: \"kubernetes.io/projected/53f0901c-cc75-43c9-8c41-59f342e2df52-kube-api-access-v9nnk\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.255036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.255513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5md\" (UniqueName: \"kubernetes.io/projected/011702e0-dd00-4236-8d20-ad2dfa6c84fa-kube-api-access-dp5md\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.255914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.257903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.258902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.270000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5md\" (UniqueName: \"kubernetes.io/projected/011702e0-dd00-4236-8d20-ad2dfa6c84fa-kube-api-access-dp5md\") pod \"nova-cell1-novncproxy-0\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.357737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.357823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-config-data\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.357853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nnk\" (UniqueName: \"kubernetes.io/projected/53f0901c-cc75-43c9-8c41-59f342e2df52-kube-api-access-v9nnk\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.361420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.363941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-config-data\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.372975 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.391974 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.392887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nnk\" (UniqueName: \"kubernetes.io/projected/53f0901c-cc75-43c9-8c41-59f342e2df52-kube-api-access-v9nnk\") pod \"nova-scheduler-0\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.401331 4707 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.408322 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.409516 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.421391 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.462183 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.499846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.984567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.985161 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-central-agent" containerID="cri-o://90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" gracePeriod=30 Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.985325 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="proxy-httpd" containerID="cri-o://d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" gracePeriod=30 Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.985358 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="sg-core" containerID="cri-o://cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" gracePeriod=30 Jan 21 15:52:22 crc kubenswrapper[4707]: I0121 15:52:22.985387 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-notification-agent" containerID="cri-o://0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" gracePeriod=30 Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.045389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.151602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.202050 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12285e17-dc8c-4a74-863a-401ef36721aa" path="/var/lib/kubelet/pods/12285e17-dc8c-4a74-863a-401ef36721aa/volumes" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.202697 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0c2183-3194-4f16-819c-2e2473083292" path="/var/lib/kubelet/pods/4d0c2183-3194-4f16-819c-2e2473083292/volumes" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.203605 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5c7f376c-03ea-47f2-9f24-e7bf986f83aa" path="/var/lib/kubelet/pods/5c7f376c-03ea-47f2-9f24-e7bf986f83aa/volumes" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.204089 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abd22cf0-492c-48c5-b675-0db84da277d2" path="/var/lib/kubelet/pods/abd22cf0-492c-48c5-b675-0db84da277d2/volumes" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.216596 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.448925 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-7784df594-cj4dv"] Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.450390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.453126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.453554 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.490881 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7784df594-cj4dv"] Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4pz\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-kube-api-access-nl4pz\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-internal-tls-certs\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-config-data\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-combined-ca-bundle\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-public-tls-certs\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 
15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-run-httpd\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-log-httpd\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.511448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-etc-swift\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.598070 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-log-httpd\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-etc-swift\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4pz\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-kube-api-access-nl4pz\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-internal-tls-certs\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-config-data\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-combined-ca-bundle\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-public-tls-certs\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-run-httpd\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.613924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-run-httpd\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.614144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-log-httpd\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.619655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-etc-swift\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.624929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-config-data\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.626264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-combined-ca-bundle\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.629291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-internal-tls-certs\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.634720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-public-tls-certs\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.639398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4pz\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-kube-api-access-nl4pz\") pod \"swift-proxy-7784df594-cj4dv\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.715349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-config-data\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.715418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjdqr\" (UniqueName: \"kubernetes.io/projected/132ee358-5559-4c6e-b6d9-742bc3a89735-kube-api-access-vjdqr\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.715476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-log-httpd\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.715730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-scripts\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.715888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-combined-ca-bundle\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.715923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-sg-core-conf-yaml\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.716069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-run-httpd\") pod \"132ee358-5559-4c6e-b6d9-742bc3a89735\" (UID: \"132ee358-5559-4c6e-b6d9-742bc3a89735\") " Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.716594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.716684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.716896 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.716915 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/132ee358-5559-4c6e-b6d9-742bc3a89735-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.719260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-scripts" (OuterVolumeSpecName: "scripts") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.719424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132ee358-5559-4c6e-b6d9-742bc3a89735-kube-api-access-vjdqr" (OuterVolumeSpecName: "kube-api-access-vjdqr") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "kube-api-access-vjdqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.745631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.765098 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.781268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.815682 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-config-data" (OuterVolumeSpecName: "config-data") pod "132ee358-5559-4c6e-b6d9-742bc3a89735" (UID: "132ee358-5559-4c6e-b6d9-742bc3a89735"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.819347 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.819370 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjdqr\" (UniqueName: \"kubernetes.io/projected/132ee358-5559-4c6e-b6d9-742bc3a89735-kube-api-access-vjdqr\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.819381 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.819390 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.819399 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/132ee358-5559-4c6e-b6d9-742bc3a89735-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.948637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2ebce296-11df-4e07-8a09-855cf9ee3417","Type":"ContainerStarted","Data":"192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.948970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2ebce296-11df-4e07-8a09-855cf9ee3417","Type":"ContainerStarted","Data":"f2289a893a7fdf7645509da321da842815be67cba65365577b3b28ee88cfdd60"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.949102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" containerID="cri-o://192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" gracePeriod=30 Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.949144 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.977233 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.977217145 podStartE2EDuration="2.977217145s" podCreationTimestamp="2026-01-21 15:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:23.972705321 +0000 UTC m=+3041.154221543" watchObservedRunningTime="2026-01-21 15:52:23.977217145 +0000 UTC m=+3041.158733367" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983282 4707 generic.go:334] "Generic (PLEG): container finished" podID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" exitCode=0 Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983318 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" exitCode=2 Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983325 4707 generic.go:334] "Generic (PLEG): container finished" podID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" exitCode=0 Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983332 4707 generic.go:334] "Generic (PLEG): container finished" podID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" exitCode=0 Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerDied","Data":"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerDied","Data":"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerDied","Data":"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerDied","Data":"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"132ee358-5559-4c6e-b6d9-742bc3a89735","Type":"ContainerDied","Data":"7e85bb6e95f0c1fbe624c2e7fa539e32d0f0b8a3f16b0c06ec940c42e5a910fb"} Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983473 4707 scope.go:117] "RemoveContainer" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" Jan 21 15:52:23 crc kubenswrapper[4707]: I0121 15:52:23.983614 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:23.997134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"011702e0-dd00-4236-8d20-ad2dfa6c84fa","Type":"ContainerStarted","Data":"3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57"} Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:23.997183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"011702e0-dd00-4236-8d20-ad2dfa6c84fa","Type":"ContainerStarted","Data":"9eccda1f6b4b5e590bfe854a3b161bd19c111a8cedb6b7c720ece65a480da30b"} Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:23.997336 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="011702e0-dd00-4236-8d20-ad2dfa6c84fa" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57" gracePeriod=30 Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.026460 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.216:8775/\": read tcp 10.217.0.2:57044->10.217.1.216:8775: read: connection reset by peer" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.026785 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.216:8775/\": read tcp 10.217.0.2:57046->10.217.1.216:8775: read: connection reset by peer" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.029853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53f0901c-cc75-43c9-8c41-59f342e2df52","Type":"ContainerStarted","Data":"11447b17e394ecec52657a05eb6a3db7bb21c1da6a294ca9fa610b29dad1060c"} Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.029904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53f0901c-cc75-43c9-8c41-59f342e2df52","Type":"ContainerStarted","Data":"312c9632217410d1c8f2ac9893decfcc76d27cb74e9be417f95b1bffc6c0da00"} Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.030034 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="53f0901c-cc75-43c9-8c41-59f342e2df52" containerName="nova-scheduler-scheduler" containerID="cri-o://11447b17e394ecec52657a05eb6a3db7bb21c1da6a294ca9fa610b29dad1060c" gracePeriod=30 Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.056105 4707 scope.go:117] "RemoveContainer" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.070885 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.070866614 podStartE2EDuration="3.070866614s" podCreationTimestamp="2026-01-21 15:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:24.027152243 +0000 UTC m=+3041.208668465" 
watchObservedRunningTime="2026-01-21 15:52:24.070866614 +0000 UTC m=+3041.252382836" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.094862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.120396 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.131044 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.131589 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="sg-core" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.131602 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="sg-core" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.131645 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="proxy-httpd" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.131653 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="proxy-httpd" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.131666 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-notification-agent" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.131671 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-notification-agent" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.132501 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.13248905 podStartE2EDuration="2.13248905s" podCreationTimestamp="2026-01-21 15:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:24.10019856 +0000 UTC m=+3041.281714783" watchObservedRunningTime="2026-01-21 15:52:24.13248905 +0000 UTC m=+3041.314005272" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.133449 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-central-agent" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.133462 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-central-agent" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.133675 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-central-agent" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.133695 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="sg-core" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.133711 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" containerName="ceilometer-notification-agent" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.133719 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" 
containerName="proxy-httpd" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.135732 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.141115 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.141335 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.158952 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.180336 4707 scope.go:117] "RemoveContainer" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.206644 4707 scope.go:117] "RemoveContainer" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.248411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-run-httpd\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.248652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-log-httpd\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.248829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.249001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.249087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-scripts\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.249289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkprm\" (UniqueName: \"kubernetes.io/projected/1f3eec98-b34f-4743-a58f-78eb4463a661-kube-api-access-jkprm\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.249628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-config-data\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.254789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7784df594-cj4dv"] Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.271719 4707 scope.go:117] "RemoveContainer" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.274141 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": container with ID starting with d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762 not found: ID does not exist" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.274191 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762"} err="failed to get container status \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": rpc error: code = NotFound desc = could not find container \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": container with ID starting with d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.274214 4707 scope.go:117] "RemoveContainer" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.278436 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": container with ID starting with cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8 not found: ID does not exist" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.278531 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8"} err="failed to get container status \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": rpc error: code = NotFound desc = could not find container \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": container with ID starting with cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.278596 4707 scope.go:117] "RemoveContainer" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.283427 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": container with ID starting with 0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d not found: ID does not exist" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" Jan 21 15:52:24 crc 
kubenswrapper[4707]: I0121 15:52:24.283450 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d"} err="failed to get container status \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": rpc error: code = NotFound desc = could not find container \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": container with ID starting with 0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.283462 4707 scope.go:117] "RemoveContainer" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" Jan 21 15:52:24 crc kubenswrapper[4707]: E0121 15:52:24.289923 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": container with ID starting with 90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb not found: ID does not exist" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.289971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb"} err="failed to get container status \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": rpc error: code = NotFound desc = could not find container \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": container with ID starting with 90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.290001 4707 scope.go:117] "RemoveContainer" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.295040 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762"} err="failed to get container status \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": rpc error: code = NotFound desc = could not find container \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": container with ID starting with d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.295086 4707 scope.go:117] "RemoveContainer" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.295486 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8"} err="failed to get container status \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": rpc error: code = NotFound desc = could not find container \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": container with ID starting with cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.295502 4707 scope.go:117] "RemoveContainer" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" Jan 21 15:52:24 crc 
kubenswrapper[4707]: I0121 15:52:24.295688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d"} err="failed to get container status \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": rpc error: code = NotFound desc = could not find container \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": container with ID starting with 0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.295702 4707 scope.go:117] "RemoveContainer" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.300888 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb"} err="failed to get container status \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": rpc error: code = NotFound desc = could not find container \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": container with ID starting with 90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.300911 4707 scope.go:117] "RemoveContainer" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.301592 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762"} err="failed to get container status \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": rpc error: code = NotFound desc = could not find container \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": container with ID starting with d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.301635 4707 scope.go:117] "RemoveContainer" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.301938 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8"} err="failed to get container status \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": rpc error: code = NotFound desc = could not find container \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": container with ID starting with cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.301953 4707 scope.go:117] "RemoveContainer" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.302437 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d"} err="failed to get container status \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": rpc error: code = NotFound desc = could not find container \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": container with ID 
starting with 0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.302460 4707 scope.go:117] "RemoveContainer" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.302686 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb"} err="failed to get container status \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": rpc error: code = NotFound desc = could not find container \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": container with ID starting with 90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.302707 4707 scope.go:117] "RemoveContainer" containerID="d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.302963 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762"} err="failed to get container status \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": rpc error: code = NotFound desc = could not find container \"d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762\": container with ID starting with d21569ec0f2267cd057ae4ee9746f22cf8adbc3b6bf2e0dbde996735aede1762 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.302990 4707 scope.go:117] "RemoveContainer" containerID="cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.304205 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8"} err="failed to get container status \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": rpc error: code = NotFound desc = could not find container \"cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8\": container with ID starting with cea7b8a3efc42bbf82aa932d3fe97a8c13c7dc0f04785f0f2c2107038bf6e7b8 not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.304226 4707 scope.go:117] "RemoveContainer" containerID="0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.304720 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d"} err="failed to get container status \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": rpc error: code = NotFound desc = could not find container \"0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d\": container with ID starting with 0866ca197ce5296522c7358a7fc77f5129604b2a897c319c889be6783d50694d not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.304734 4707 scope.go:117] "RemoveContainer" containerID="90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.308542 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb"} err="failed to get container status \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": rpc error: code = NotFound desc = could not find container \"90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb\": container with ID starting with 90eba07684f1baa9470c8ab0d26fe42e8ca232b280c3a668bc4e6ca8bb85bdeb not found: ID does not exist" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkprm\" (UniqueName: \"kubernetes.io/projected/1f3eec98-b34f-4743-a58f-78eb4463a661-kube-api-access-jkprm\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-config-data\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-run-httpd\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-log-httpd\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.354733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-scripts\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.359020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-log-httpd\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.359860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-run-httpd\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.361474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.362621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-config-data\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.362759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.369610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-scripts\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.372329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkprm\" (UniqueName: \"kubernetes.io/projected/1f3eec98-b34f-4743-a58f-78eb4463a661-kube-api-access-jkprm\") pod \"ceilometer-0\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.496057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.546970 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.661558 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.666796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r649\" (UniqueName: \"kubernetes.io/projected/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-kube-api-access-8r649\") pod \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.666943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-logs\") pod \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.666986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-combined-ca-bundle\") pod \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.667037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-config-data\") pod \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\" (UID: \"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.668088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-logs" (OuterVolumeSpecName: "logs") pod "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" (UID: "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.670913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-kube-api-access-8r649" (OuterVolumeSpecName: "kube-api-access-8r649") pod "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" (UID: "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3"). InnerVolumeSpecName "kube-api-access-8r649". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.707310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-config-data" (OuterVolumeSpecName: "config-data") pod "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" (UID: "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.731347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" (UID: "d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.769219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f64f3c-5f52-42fa-a99d-1312c645ba3a-logs\") pod \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.769332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-combined-ca-bundle\") pod \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.769488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tx9b\" (UniqueName: \"kubernetes.io/projected/56f64f3c-5f52-42fa-a99d-1312c645ba3a-kube-api-access-4tx9b\") pod \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.769610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-config-data\") pod \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\" (UID: \"56f64f3c-5f52-42fa-a99d-1312c645ba3a\") " Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.769873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f64f3c-5f52-42fa-a99d-1312c645ba3a-logs" (OuterVolumeSpecName: "logs") pod "56f64f3c-5f52-42fa-a99d-1312c645ba3a" (UID: "56f64f3c-5f52-42fa-a99d-1312c645ba3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.770614 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r649\" (UniqueName: \"kubernetes.io/projected/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-kube-api-access-8r649\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.770641 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.770655 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.770665 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.770674 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f64f3c-5f52-42fa-a99d-1312c645ba3a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.775576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f64f3c-5f52-42fa-a99d-1312c645ba3a-kube-api-access-4tx9b" (OuterVolumeSpecName: "kube-api-access-4tx9b") pod "56f64f3c-5f52-42fa-a99d-1312c645ba3a" (UID: "56f64f3c-5f52-42fa-a99d-1312c645ba3a"). 
InnerVolumeSpecName "kube-api-access-4tx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.803626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-config-data" (OuterVolumeSpecName: "config-data") pod "56f64f3c-5f52-42fa-a99d-1312c645ba3a" (UID: "56f64f3c-5f52-42fa-a99d-1312c645ba3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.809943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56f64f3c-5f52-42fa-a99d-1312c645ba3a" (UID: "56f64f3c-5f52-42fa-a99d-1312c645ba3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.872768 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.872803 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tx9b\" (UniqueName: \"kubernetes.io/projected/56f64f3c-5f52-42fa-a99d-1312c645ba3a-kube-api-access-4tx9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:24 crc kubenswrapper[4707]: I0121 15:52:24.872831 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f64f3c-5f52-42fa-a99d-1312c645ba3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.013873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: W0121 15:52:25.014848 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f3eec98_b34f_4743_a58f_78eb4463a661.slice/crio-1dc1863dbbf749d9bf8e7903f45fe587a18e06ac77616b33e0951f0f64abda78 WatchSource:0}: Error finding container 1dc1863dbbf749d9bf8e7903f45fe587a18e06ac77616b33e0951f0f64abda78: Status 404 returned error can't find the container with id 1dc1863dbbf749d9bf8e7903f45fe587a18e06ac77616b33e0951f0f64abda78 Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.018062 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.049703 4707 generic.go:334] "Generic (PLEG): container finished" podID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerID="e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753" exitCode=0 Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.049777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3","Type":"ContainerDied","Data":"e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.049820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3","Type":"ContainerDied","Data":"7a1ba7165cab0d41cd16c92671069652c862d29e1abefcd71c79ace8d56c0e79"} Jan 21 15:52:25 crc 
kubenswrapper[4707]: I0121 15:52:25.049842 4707 scope.go:117] "RemoveContainer" containerID="e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.049905 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.059730 4707 generic.go:334] "Generic (PLEG): container finished" podID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerID="63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37" exitCode=0 Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.059792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"56f64f3c-5f52-42fa-a99d-1312c645ba3a","Type":"ContainerDied","Data":"63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.059836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"56f64f3c-5f52-42fa-a99d-1312c645ba3a","Type":"ContainerDied","Data":"30c32ec6ea4be9c189b6ca697dec69a66e60a1449227c88bc04321c26726af4a"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.059905 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.064231 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.068317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerStarted","Data":"1dc1863dbbf749d9bf8e7903f45fe587a18e06ac77616b33e0951f0f64abda78"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.092773 4707 scope.go:117] "RemoveContainer" containerID="dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.099133 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.100055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" event={"ID":"a11afd40-c28b-4140-8546-fd2a270a2931","Type":"ContainerStarted","Data":"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.100102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" event={"ID":"a11afd40-c28b-4140-8546-fd2a270a2931","Type":"ContainerStarted","Data":"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.100113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" event={"ID":"a11afd40-c28b-4140-8546-fd2a270a2931","Type":"ContainerStarted","Data":"5d26c8dfb940791e8964868cf2182628dc4e34d4e0ab3ff6643c0317a64d5e81"} Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.100312 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.100363 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.110411 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.129924 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.130366 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-api" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130382 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-api" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.130409 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-metadata" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130415 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-metadata" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.130435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-log" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130441 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-log" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.130452 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-log" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130457 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-log" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130632 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-metadata" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130646 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" containerName="nova-metadata-log" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130660 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-api" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.130667 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" containerName="nova-api-log" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.131587 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.134132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.138887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.160871 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.208883 4707 scope.go:117] "RemoveContainer" containerID="e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.257000 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753\": container with ID starting with e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753 not found: ID does not exist" containerID="e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.257065 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753"} err="failed to get container status \"e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753\": rpc error: code = NotFound desc = could not find container \"e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753\": container with ID starting with e55b28b44de780718a88ed1784c746a1c57bf485de29f2042610fbfaf43fc753 not found: ID does not exist" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.257165 4707 scope.go:117] "RemoveContainer" containerID="dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.258105 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca\": container with ID starting with dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca not found: ID does not exist" containerID="dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.258170 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca"} err="failed to get container status \"dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca\": rpc error: code = NotFound desc = could not find container \"dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca\": container with ID starting with dcd1833648bc1457388135de0815c77ed32d3b4fd15e900d72b505a758f552ca not found: ID does not exist" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.258211 4707 scope.go:117] "RemoveContainer" containerID="63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.272719 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132ee358-5559-4c6e-b6d9-742bc3a89735" path="/var/lib/kubelet/pods/132ee358-5559-4c6e-b6d9-742bc3a89735/volumes" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.274642 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f64f3c-5f52-42fa-a99d-1312c645ba3a" path="/var/lib/kubelet/pods/56f64f3c-5f52-42fa-a99d-1312c645ba3a/volumes" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.276087 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3" path="/var/lib/kubelet/pods/d22d4ba8-ba68-4fe0-9350-d5eec13eb1e3/volumes" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.276594 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.277034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.277079 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.279045 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.281142 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.281180 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.284257 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.287104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.288762 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" podStartSLOduration=2.288750908 podStartE2EDuration="2.288750908s" podCreationTimestamp="2026-01-21 15:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:25.145743135 +0000 UTC m=+3042.327259357" watchObservedRunningTime="2026-01-21 15:52:25.288750908 +0000 UTC m=+3042.470267129" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.290287 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.290557 4707 scope.go:117] "RemoveContainer" containerID="7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.307994 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.308619 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-6zhcz logs], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data kube-api-access-6zhcz logs]: context canceled" pod="openstack-kuttl-tests/nova-api-0" podUID="813697ea-2aac-48f0-9888-6675f4ae50fe" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.311579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-config-data\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.311660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-logs\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.311692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.311729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6shhr\" (UniqueName: \"kubernetes.io/projected/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-kube-api-access-6shhr\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.315838 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.316227 4707 pod_workers.go:1301] 
"Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-6shhr logs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48040f4e-cefe-45a7-b4e5-6fe9e296c1c1" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.332119 4707 scope.go:117] "RemoveContainer" containerID="63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.332696 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37\": container with ID starting with 63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37 not found: ID does not exist" containerID="63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.332736 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37"} err="failed to get container status \"63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37\": rpc error: code = NotFound desc = could not find container \"63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37\": container with ID starting with 63a90cfa062823902dfbd1f2cf728c0966a8470f4029f96f33d02894487b7d37 not found: ID does not exist" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.332758 4707 scope.go:117] "RemoveContainer" containerID="7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87" Jan 21 15:52:25 crc kubenswrapper[4707]: E0121 15:52:25.333504 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87\": container with ID starting with 7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87 not found: ID does not exist" containerID="7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.333619 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87"} err="failed to get container status \"7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87\": rpc error: code = NotFound desc = could not find container \"7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87\": container with ID starting with 7a6934e160b90344c4f9f0dbbff7233c32a4e262ec3b1a233445ed6008e38e87 not found: ID does not exist" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.414071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-config-data\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.414712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-logs\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.414744 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-config-data\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.414775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.414855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6shhr\" (UniqueName: \"kubernetes.io/projected/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-kube-api-access-6shhr\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.415026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813697ea-2aac-48f0-9888-6675f4ae50fe-logs\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.415160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-logs\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.415163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.415231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhcz\" (UniqueName: \"kubernetes.io/projected/813697ea-2aac-48f0-9888-6675f4ae50fe-kube-api-access-6zhcz\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.420213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.425214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-config-data\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.430205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6shhr\" (UniqueName: 
\"kubernetes.io/projected/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-kube-api-access-6shhr\") pod \"nova-metadata-0\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.517316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-config-data\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.517421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813697ea-2aac-48f0-9888-6675f4ae50fe-logs\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.517482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.517503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zhcz\" (UniqueName: \"kubernetes.io/projected/813697ea-2aac-48f0-9888-6675f4ae50fe-kube-api-access-6zhcz\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.518139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813697ea-2aac-48f0-9888-6675f4ae50fe-logs\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.521051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.523648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-config-data\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:25 crc kubenswrapper[4707]: I0121 15:52:25.535224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zhcz\" (UniqueName: \"kubernetes.io/projected/813697ea-2aac-48f0-9888-6675f4ae50fe-kube-api-access-6zhcz\") pod \"nova-api-0\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.112388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerStarted","Data":"8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a"} Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.113932 4707 generic.go:334] "Generic (PLEG): container finished" podID="53f0901c-cc75-43c9-8c41-59f342e2df52" 
containerID="11447b17e394ecec52657a05eb6a3db7bb21c1da6a294ca9fa610b29dad1060c" exitCode=1 Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.113990 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.114493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53f0901c-cc75-43c9-8c41-59f342e2df52","Type":"ContainerDied","Data":"11447b17e394ecec52657a05eb6a3db7bb21c1da6a294ca9fa610b29dad1060c"} Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.115276 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.124373 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.127639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-logs\") pod \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.127700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6shhr\" (UniqueName: \"kubernetes.io/projected/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-kube-api-access-6shhr\") pod \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.127752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-combined-ca-bundle\") pod \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.129243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.133319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-logs" (OuterVolumeSpecName: "logs") pod "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1" (UID: "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.133846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-kube-api-access-6shhr" (OuterVolumeSpecName: "kube-api-access-6shhr") pod "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1" (UID: "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1"). InnerVolumeSpecName "kube-api-access-6shhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.135088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1" (UID: "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.207793 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.229341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-config-data\") pod \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\" (UID: \"48040f4e-cefe-45a7-b4e5-6fe9e296c1c1\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.230042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9nnk\" (UniqueName: \"kubernetes.io/projected/53f0901c-cc75-43c9-8c41-59f342e2df52-kube-api-access-v9nnk\") pod \"53f0901c-cc75-43c9-8c41-59f342e2df52\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.230672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-combined-ca-bundle\") pod \"53f0901c-cc75-43c9-8c41-59f342e2df52\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.230754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-combined-ca-bundle\") pod \"813697ea-2aac-48f0-9888-6675f4ae50fe\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.230785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zhcz\" (UniqueName: \"kubernetes.io/projected/813697ea-2aac-48f0-9888-6675f4ae50fe-kube-api-access-6zhcz\") pod \"813697ea-2aac-48f0-9888-6675f4ae50fe\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.230875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813697ea-2aac-48f0-9888-6675f4ae50fe-logs\") pod \"813697ea-2aac-48f0-9888-6675f4ae50fe\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.231606 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.231619 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6shhr\" (UniqueName: \"kubernetes.io/projected/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-kube-api-access-6shhr\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.231629 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.231628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813697ea-2aac-48f0-9888-6675f4ae50fe-logs" (OuterVolumeSpecName: "logs") pod "813697ea-2aac-48f0-9888-6675f4ae50fe" (UID: "813697ea-2aac-48f0-9888-6675f4ae50fe"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.236984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "813697ea-2aac-48f0-9888-6675f4ae50fe" (UID: "813697ea-2aac-48f0-9888-6675f4ae50fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.237003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-config-data" (OuterVolumeSpecName: "config-data") pod "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1" (UID: "48040f4e-cefe-45a7-b4e5-6fe9e296c1c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.244190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813697ea-2aac-48f0-9888-6675f4ae50fe-kube-api-access-6zhcz" (OuterVolumeSpecName: "kube-api-access-6zhcz") pod "813697ea-2aac-48f0-9888-6675f4ae50fe" (UID: "813697ea-2aac-48f0-9888-6675f4ae50fe"). InnerVolumeSpecName "kube-api-access-6zhcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.247631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f0901c-cc75-43c9-8c41-59f342e2df52-kube-api-access-v9nnk" (OuterVolumeSpecName: "kube-api-access-v9nnk") pod "53f0901c-cc75-43c9-8c41-59f342e2df52" (UID: "53f0901c-cc75-43c9-8c41-59f342e2df52"). InnerVolumeSpecName "kube-api-access-v9nnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.256296 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53f0901c-cc75-43c9-8c41-59f342e2df52" (UID: "53f0901c-cc75-43c9-8c41-59f342e2df52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.332898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-config-data\") pod \"813697ea-2aac-48f0-9888-6675f4ae50fe\" (UID: \"813697ea-2aac-48f0-9888-6675f4ae50fe\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.332967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-config-data\") pod \"53f0901c-cc75-43c9-8c41-59f342e2df52\" (UID: \"53f0901c-cc75-43c9-8c41-59f342e2df52\") " Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.333348 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.333367 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zhcz\" (UniqueName: \"kubernetes.io/projected/813697ea-2aac-48f0-9888-6675f4ae50fe-kube-api-access-6zhcz\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.333378 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813697ea-2aac-48f0-9888-6675f4ae50fe-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.333389 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.333397 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9nnk\" (UniqueName: \"kubernetes.io/projected/53f0901c-cc75-43c9-8c41-59f342e2df52-kube-api-access-v9nnk\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.333407 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.335260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-config-data" (OuterVolumeSpecName: "config-data") pod "813697ea-2aac-48f0-9888-6675f4ae50fe" (UID: "813697ea-2aac-48f0-9888-6675f4ae50fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.354191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-config-data" (OuterVolumeSpecName: "config-data") pod "53f0901c-cc75-43c9-8c41-59f342e2df52" (UID: "53f0901c-cc75-43c9-8c41-59f342e2df52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.436967 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813697ea-2aac-48f0-9888-6675f4ae50fe-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:26 crc kubenswrapper[4707]: I0121 15:52:26.437196 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f0901c-cc75-43c9-8c41-59f342e2df52-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.122738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerStarted","Data":"13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e"} Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.124335 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.125262 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.127069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.127130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53f0901c-cc75-43c9-8c41-59f342e2df52","Type":"ContainerDied","Data":"312c9632217410d1c8f2ac9893decfcc76d27cb74e9be417f95b1bffc6c0da00"} Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.127283 4707 scope.go:117] "RemoveContainer" containerID="11447b17e394ecec52657a05eb6a3db7bb21c1da6a294ca9fa610b29dad1060c" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.167869 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.170370 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.200380 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48040f4e-cefe-45a7-b4e5-6fe9e296c1c1" path="/var/lib/kubelet/pods/48040f4e-cefe-45a7-b4e5-6fe9e296c1c1/volumes" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.200755 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.224861 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: E0121 15:52:27.225263 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f0901c-cc75-43c9-8c41-59f342e2df52" containerName="nova-scheduler-scheduler" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.225277 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f0901c-cc75-43c9-8c41-59f342e2df52" containerName="nova-scheduler-scheduler" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.225474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f0901c-cc75-43c9-8c41-59f342e2df52" containerName="nova-scheduler-scheduler" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.226417 4707 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.231988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.234604 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.285690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.319227 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.320694 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.324302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.341881 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.352665 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.360943 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.370347 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.372083 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.373925 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.379688 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-config-data\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba711d4-865f-4d09-892f-30c9b4f06a16-logs\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdgl\" (UniqueName: \"kubernetes.io/projected/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-kube-api-access-6vdgl\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2hb\" (UniqueName: \"kubernetes.io/projected/8ba711d4-865f-4d09-892f-30c9b4f06a16-kube-api-access-ss2hb\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-logs\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.394579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-config-data\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.462656 
4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdgl\" (UniqueName: \"kubernetes.io/projected/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-kube-api-access-6vdgl\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2hb\" (UniqueName: \"kubernetes.io/projected/8ba711d4-865f-4d09-892f-30c9b4f06a16-kube-api-access-ss2hb\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-logs\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-config-data\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-config-data\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-config-data\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 
15:52:27.496300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba711d4-865f-4d09-892f-30c9b4f06a16-logs\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjmn\" (UniqueName: \"kubernetes.io/projected/198741f2-417e-4efd-b05e-82299bc804dc-kube-api-access-wvjmn\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.496512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-logs\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.497048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba711d4-865f-4d09-892f-30c9b4f06a16-logs\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.500575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.501188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.501346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-config-data\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.510310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-config-data\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.513266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdgl\" (UniqueName: \"kubernetes.io/projected/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-kube-api-access-6vdgl\") pod \"nova-metadata-0\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.513749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2hb\" (UniqueName: \"kubernetes.io/projected/8ba711d4-865f-4d09-892f-30c9b4f06a16-kube-api-access-ss2hb\") pod \"nova-api-0\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.561308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.598728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.598782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-config-data\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.598873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjmn\" (UniqueName: \"kubernetes.io/projected/198741f2-417e-4efd-b05e-82299bc804dc-kube-api-access-wvjmn\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.602988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.605399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-config-data\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.614200 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjmn\" (UniqueName: \"kubernetes.io/projected/198741f2-417e-4efd-b05e-82299bc804dc-kube-api-access-wvjmn\") pod \"nova-scheduler-0\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.635349 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.683659 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:27 crc kubenswrapper[4707]: I0121 15:52:27.983016 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:52:27 crc kubenswrapper[4707]: W0121 15:52:27.986299 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab6df059_8aa3_40c8_a65f_17e2fa8de97b.slice/crio-b8a9c0566791d1fc14346b1268283488770f5790c8a50fbe3d8d1f5c7926331f WatchSource:0}: Error finding container b8a9c0566791d1fc14346b1268283488770f5790c8a50fbe3d8d1f5c7926331f: Status 404 returned error can't find the container with id b8a9c0566791d1fc14346b1268283488770f5790c8a50fbe3d8d1f5c7926331f Jan 21 15:52:28 crc kubenswrapper[4707]: I0121 15:52:28.097358 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:52:28 crc kubenswrapper[4707]: W0121 15:52:28.100788 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod198741f2_417e_4efd_b05e_82299bc804dc.slice/crio-a7decfbbf3415b255d5a36d00d6c1dd9c22e376d2a19c0e453bf09a9a73276a3 WatchSource:0}: Error finding container a7decfbbf3415b255d5a36d00d6c1dd9c22e376d2a19c0e453bf09a9a73276a3: Status 404 returned error can't find the container with id a7decfbbf3415b255d5a36d00d6c1dd9c22e376d2a19c0e453bf09a9a73276a3 Jan 21 15:52:28 crc kubenswrapper[4707]: I0121 15:52:28.140540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerStarted","Data":"113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7"} Jan 21 15:52:28 crc kubenswrapper[4707]: I0121 15:52:28.142611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ab6df059-8aa3-40c8-a65f-17e2fa8de97b","Type":"ContainerStarted","Data":"b8a9c0566791d1fc14346b1268283488770f5790c8a50fbe3d8d1f5c7926331f"} Jan 21 15:52:28 crc kubenswrapper[4707]: I0121 15:52:28.144857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerStarted","Data":"a7decfbbf3415b255d5a36d00d6c1dd9c22e376d2a19c0e453bf09a9a73276a3"} Jan 21 15:52:28 crc kubenswrapper[4707]: I0121 15:52:28.219009 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.156008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerStarted","Data":"92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.157065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.156229 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="proxy-httpd" containerID="cri-o://92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7" gracePeriod=30 Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.156285 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="sg-core" containerID="cri-o://113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7" gracePeriod=30 Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.156298 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-notification-agent" containerID="cri-o://13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e" gracePeriod=30 Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.156155 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-central-agent" containerID="cri-o://8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a" gracePeriod=30 Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.158956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ab6df059-8aa3-40c8-a65f-17e2fa8de97b","Type":"ContainerStarted","Data":"9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.159045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ab6df059-8aa3-40c8-a65f-17e2fa8de97b","Type":"ContainerStarted","Data":"85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.163566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ba711d4-865f-4d09-892f-30c9b4f06a16","Type":"ContainerStarted","Data":"106e3556f3ec42865d3ccced817a33758f210fb47212fc3f57bf6a9732818a91"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.163670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ba711d4-865f-4d09-892f-30c9b4f06a16","Type":"ContainerStarted","Data":"df55e9ac1dc497a2a744b36c04038da3b4b71589b0800551c208fa606a3051f7"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.163746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ba711d4-865f-4d09-892f-30c9b4f06a16","Type":"ContainerStarted","Data":"1f8b7bff56582677ba605c071663bc7a49b6e5db322096a602b3e289760132dc"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.167081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerStarted","Data":"a817a37d6e4454d2bb899e07ff20fa1b040186d67ab023df347c93ddfb12e17d"} Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.181545 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.6145092760000002 podStartE2EDuration="5.181532468s" podCreationTimestamp="2026-01-21 15:52:24 +0000 UTC" firstStartedPulling="2026-01-21 15:52:25.017837829 +0000 UTC m=+3042.199354051" lastFinishedPulling="2026-01-21 15:52:28.58486102 +0000 UTC m=+3045.766377243" observedRunningTime="2026-01-21 15:52:29.174825316 +0000 UTC m=+3046.356341538" watchObservedRunningTime="2026-01-21 15:52:29.181532468 +0000 UTC m=+3046.363048691" Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.192025 4707 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="53f0901c-cc75-43c9-8c41-59f342e2df52" path="/var/lib/kubelet/pods/53f0901c-cc75-43c9-8c41-59f342e2df52/volumes" Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.192619 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813697ea-2aac-48f0-9888-6675f4ae50fe" path="/var/lib/kubelet/pods/813697ea-2aac-48f0-9888-6675f4ae50fe/volumes" Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.198385 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.198355191 podStartE2EDuration="2.198355191s" podCreationTimestamp="2026-01-21 15:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:29.194976548 +0000 UTC m=+3046.376492771" watchObservedRunningTime="2026-01-21 15:52:29.198355191 +0000 UTC m=+3046.379871414" Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.220934 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.220910815 podStartE2EDuration="2.220910815s" podCreationTimestamp="2026-01-21 15:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:29.210784744 +0000 UTC m=+3046.392300966" watchObservedRunningTime="2026-01-21 15:52:29.220910815 +0000 UTC m=+3046.402427037" Jan 21 15:52:29 crc kubenswrapper[4707]: I0121 15:52:29.231293 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.231266749 podStartE2EDuration="2.231266749s" podCreationTimestamp="2026-01-21 15:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:29.223571228 +0000 UTC m=+3046.405087450" watchObservedRunningTime="2026-01-21 15:52:29.231266749 +0000 UTC m=+3046.412782971" Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.175459 4707 generic.go:334] "Generic (PLEG): container finished" podID="198741f2-417e-4efd-b05e-82299bc804dc" containerID="a817a37d6e4454d2bb899e07ff20fa1b040186d67ab023df347c93ddfb12e17d" exitCode=1 Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.176259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerDied","Data":"a817a37d6e4454d2bb899e07ff20fa1b040186d67ab023df347c93ddfb12e17d"} Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.176842 4707 scope.go:117] "RemoveContainer" containerID="a817a37d6e4454d2bb899e07ff20fa1b040186d67ab023df347c93ddfb12e17d" Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.180105 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerID="92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7" exitCode=0 Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.180180 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerID="113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7" exitCode=2 Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.180233 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f3eec98-b34f-4743-a58f-78eb4463a661" 
containerID="13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e" exitCode=0 Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.180197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerDied","Data":"92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7"} Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.180339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerDied","Data":"113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7"} Jan 21 15:52:30 crc kubenswrapper[4707]: I0121 15:52:30.180353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerDied","Data":"13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e"} Jan 21 15:52:30 crc kubenswrapper[4707]: E0121 15:52:30.202638 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:30 crc kubenswrapper[4707]: E0121 15:52:30.204134 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:30 crc kubenswrapper[4707]: E0121 15:52:30.205364 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:30 crc kubenswrapper[4707]: E0121 15:52:30.205440 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.190892 4707 generic.go:334] "Generic (PLEG): container finished" podID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerID="971937516b3b19a806e3c97a43385c931b217beac81e6d595bdb7f3762b9b713" exitCode=0 Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.190958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" event={"ID":"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d","Type":"ContainerDied","Data":"971937516b3b19a806e3c97a43385c931b217beac81e6d595bdb7f3762b9b713"} Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.191233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" event={"ID":"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d","Type":"ContainerDied","Data":"d04db9fa6e111d01fa474b6b342d32b276d64b909d2a34c4f2ab352ec7008083"} Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.191258 
4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d04db9fa6e111d01fa474b6b342d32b276d64b909d2a34c4f2ab352ec7008083" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.193335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerStarted","Data":"582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed"} Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.193514 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.377187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-combined-ca-bundle\") pod \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.377233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp7rc\" (UniqueName: \"kubernetes.io/projected/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-kube-api-access-rp7rc\") pod \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.377291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-config\") pod \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.377334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-httpd-config\") pod \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\" (UID: \"d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d\") " Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.395723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" (UID: "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.399942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-kube-api-access-rp7rc" (OuterVolumeSpecName: "kube-api-access-rp7rc") pod "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" (UID: "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d"). InnerVolumeSpecName "kube-api-access-rp7rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.416963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-config" (OuterVolumeSpecName: "config") pod "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" (UID: "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.423555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" (UID: "d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.478697 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.478734 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp7rc\" (UniqueName: \"kubernetes.io/projected/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-kube-api-access-rp7rc\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.478746 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:31 crc kubenswrapper[4707]: I0121 15:52:31.478755 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.202593 4707 generic.go:334] "Generic (PLEG): container finished" podID="198741f2-417e-4efd-b05e-82299bc804dc" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" exitCode=1 Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.202641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerDied","Data":"582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed"} Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.202900 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7c97698578-4r4js" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.202899 4707 scope.go:117] "RemoveContainer" containerID="a817a37d6e4454d2bb899e07ff20fa1b040186d67ab023df347c93ddfb12e17d" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.203214 4707 scope.go:117] "RemoveContainer" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" Jan 21 15:52:32 crc kubenswrapper[4707]: E0121 15:52:32.203460 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(198741f2-417e-4efd-b05e-82299bc804dc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="198741f2-417e-4efd-b05e-82299bc804dc" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.351517 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7c97698578-4r4js"] Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.358448 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7c97698578-4r4js"] Jan 21 15:52:32 crc kubenswrapper[4707]: E0121 15:52:32.404480 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:32 crc kubenswrapper[4707]: E0121 15:52:32.406989 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:32 crc kubenswrapper[4707]: E0121 15:52:32.408086 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:32 crc kubenswrapper[4707]: E0121 15:52:32.408115 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.535919 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.562176 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.562224 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.635460 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.695688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-sg-core-conf-yaml\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.695746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-log-httpd\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.695837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-run-httpd\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.695851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-scripts\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.695927 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-config-data\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.695995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-combined-ca-bundle\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.696017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkprm\" (UniqueName: \"kubernetes.io/projected/1f3eec98-b34f-4743-a58f-78eb4463a661-kube-api-access-jkprm\") pod \"1f3eec98-b34f-4743-a58f-78eb4463a661\" (UID: \"1f3eec98-b34f-4743-a58f-78eb4463a661\") " Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.696722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.696844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.700509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-scripts" (OuterVolumeSpecName: "scripts") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.700692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3eec98-b34f-4743-a58f-78eb4463a661-kube-api-access-jkprm" (OuterVolumeSpecName: "kube-api-access-jkprm") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "kube-api-access-jkprm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.717292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.745675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.769494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-config-data" (OuterVolumeSpecName: "config-data") pod "1f3eec98-b34f-4743-a58f-78eb4463a661" (UID: "1f3eec98-b34f-4743-a58f-78eb4463a661"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798294 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798322 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798330 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f3eec98-b34f-4743-a58f-78eb4463a661-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798340 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798349 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798358 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3eec98-b34f-4743-a58f-78eb4463a661-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:32 crc kubenswrapper[4707]: I0121 15:52:32.798369 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkprm\" (UniqueName: \"kubernetes.io/projected/1f3eec98-b34f-4743-a58f-78eb4463a661-kube-api-access-jkprm\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.190675 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" path="/var/lib/kubelet/pods/d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d/volumes" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.210987 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerID="8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a" exitCode=0 Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.211066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerDied","Data":"8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a"} Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.211097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1f3eec98-b34f-4743-a58f-78eb4463a661","Type":"ContainerDied","Data":"1dc1863dbbf749d9bf8e7903f45fe587a18e06ac77616b33e0951f0f64abda78"} Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.211113 4707 scope.go:117] "RemoveContainer" containerID="92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.211647 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.212586 4707 scope.go:117] "RemoveContainer" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.212796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(198741f2-417e-4efd-b05e-82299bc804dc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="198741f2-417e-4efd-b05e-82299bc804dc" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.236090 4707 scope.go:117] "RemoveContainer" containerID="113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.241912 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.250221 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.251859 4707 scope.go:117] "RemoveContainer" containerID="13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.269911 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.271381 4707 scope.go:117] "RemoveContainer" containerID="8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.274732 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-api" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.274765 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-api" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.274802 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-notification-agent" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.274827 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-notification-agent" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.274841 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="sg-core" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.274849 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="sg-core" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.274863 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-central-agent" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.274869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-central-agent" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.274893 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-httpd" Jan 21 15:52:33 crc 
kubenswrapper[4707]: I0121 15:52:33.274902 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-httpd" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.274920 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="proxy-httpd" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.274929 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="proxy-httpd" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.275436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-httpd" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.275479 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10d56d4-0d08-4ea0-acd7-8e3b7bf6fb4d" containerName="neutron-api" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.275495 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="proxy-httpd" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.275508 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="sg-core" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.275523 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-notification-agent" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.275536 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" containerName="ceilometer-central-agent" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.283134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.283265 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.285205 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.285363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.288856 4707 scope.go:117] "RemoveContainer" containerID="92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.292084 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7\": container with ID starting with 92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7 not found: ID does not exist" containerID="92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.292175 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7"} err="failed to get container status \"92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7\": rpc error: code = NotFound desc = could not find container \"92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7\": container with ID starting with 92196b6391bd1564fc9bfcefc1725c80b31b54b57f7811ebb08f48efe0b3a0b7 not found: ID does not exist" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.292241 4707 scope.go:117] "RemoveContainer" containerID="113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.292572 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7\": container with ID starting with 113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7 not found: ID does not exist" containerID="113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.292593 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7"} err="failed to get container status \"113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7\": rpc error: code = NotFound desc = could not find container \"113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7\": container with ID starting with 113738fd85fbd7844594cee3d74d6b9c09cfac5daa4051777e1d5bc5992817a7 not found: ID does not exist" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.292606 4707 scope.go:117] "RemoveContainer" containerID="13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.292922 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e\": container with ID starting with 13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e not found: ID does not exist" containerID="13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e" Jan 21 15:52:33 
crc kubenswrapper[4707]: I0121 15:52:33.292971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e"} err="failed to get container status \"13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e\": rpc error: code = NotFound desc = could not find container \"13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e\": container with ID starting with 13afc82fd6125a5ebf737661eab8b27e964fdb2e621fc11b66f7660b9d0e540e not found: ID does not exist" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.292985 4707 scope.go:117] "RemoveContainer" containerID="8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a" Jan 21 15:52:33 crc kubenswrapper[4707]: E0121 15:52:33.293795 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a\": container with ID starting with 8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a not found: ID does not exist" containerID="8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.293850 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a"} err="failed to get container status \"8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a\": rpc error: code = NotFound desc = could not find container \"8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a\": container with ID starting with 8c134f958dd908732a514a63b343aaac01e84e708dca3e39b19f13755e91209a not found: ID does not exist" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.409523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-config-data\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.409718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.409760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.409799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-scripts\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.409877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6zj\" (UniqueName: 
\"kubernetes.io/projected/653100f7-1a4f-46bd-8438-c1415bc357a0-kube-api-access-rn6zj\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.410031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-run-httpd\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.410262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-log-httpd\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.511876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-run-httpd\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.511953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-log-httpd\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.511994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-config-data\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.512041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.512057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.512077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-scripts\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.512091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6zj\" (UniqueName: \"kubernetes.io/projected/653100f7-1a4f-46bd-8438-c1415bc357a0-kube-api-access-rn6zj\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc 
kubenswrapper[4707]: I0121 15:52:33.512320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-run-httpd\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.512845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-log-httpd\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.515198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.515475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.515907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-scripts\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.516258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-config-data\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.525139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6zj\" (UniqueName: \"kubernetes.io/projected/653100f7-1a4f-46bd-8438-c1415bc357a0-kube-api-access-rn6zj\") pod \"ceilometer-0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.595381 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.769837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.770697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.822833 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j"] Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.823104 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-httpd" containerID="cri-o://bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87" gracePeriod=30 Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.823271 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-server" containerID="cri-o://3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43" gracePeriod=30 Jan 21 15:52:33 crc kubenswrapper[4707]: I0121 15:52:33.969600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:52:33 crc kubenswrapper[4707]: W0121 15:52:33.971265 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod653100f7_1a4f_46bd_8438_c1415bc357a0.slice/crio-233cf3f5787af714cace1bcc125deea23a82d0e9e2559dcd32151877d130ff1e WatchSource:0}: Error finding container 233cf3f5787af714cace1bcc125deea23a82d0e9e2559dcd32151877d130ff1e: Status 404 returned error can't find the container with id 233cf3f5787af714cace1bcc125deea23a82d0e9e2559dcd32151877d130ff1e Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.219551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerStarted","Data":"233cf3f5787af714cace1bcc125deea23a82d0e9e2559dcd32151877d130ff1e"} Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.224730 4707 generic.go:334] "Generic (PLEG): container finished" podID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerID="bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87" exitCode=0 Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.224767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" event={"ID":"66616542-abed-45a9-b0e2-e440f49f3ab8","Type":"ContainerDied","Data":"bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87"} Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.230953 4707 scope.go:117] "RemoveContainer" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" Jan 21 15:52:34 crc kubenswrapper[4707]: E0121 15:52:34.231179 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(198741f2-417e-4efd-b05e-82299bc804dc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" 
podUID="198741f2-417e-4efd-b05e-82299bc804dc" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.704223 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.835645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rlxn\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-kube-api-access-5rlxn\") pod \"66616542-abed-45a9-b0e2-e440f49f3ab8\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.835692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-run-httpd\") pod \"66616542-abed-45a9-b0e2-e440f49f3ab8\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.835758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-combined-ca-bundle\") pod \"66616542-abed-45a9-b0e2-e440f49f3ab8\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.835825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-etc-swift\") pod \"66616542-abed-45a9-b0e2-e440f49f3ab8\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.835845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-config-data\") pod \"66616542-abed-45a9-b0e2-e440f49f3ab8\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.835891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-log-httpd\") pod \"66616542-abed-45a9-b0e2-e440f49f3ab8\" (UID: \"66616542-abed-45a9-b0e2-e440f49f3ab8\") " Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.836163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "66616542-abed-45a9-b0e2-e440f49f3ab8" (UID: "66616542-abed-45a9-b0e2-e440f49f3ab8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.836416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "66616542-abed-45a9-b0e2-e440f49f3ab8" (UID: "66616542-abed-45a9-b0e2-e440f49f3ab8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.836514 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.842935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-kube-api-access-5rlxn" (OuterVolumeSpecName: "kube-api-access-5rlxn") pod "66616542-abed-45a9-b0e2-e440f49f3ab8" (UID: "66616542-abed-45a9-b0e2-e440f49f3ab8"). InnerVolumeSpecName "kube-api-access-5rlxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.842968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "66616542-abed-45a9-b0e2-e440f49f3ab8" (UID: "66616542-abed-45a9-b0e2-e440f49f3ab8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.872940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-config-data" (OuterVolumeSpecName: "config-data") pod "66616542-abed-45a9-b0e2-e440f49f3ab8" (UID: "66616542-abed-45a9-b0e2-e440f49f3ab8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.873291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66616542-abed-45a9-b0e2-e440f49f3ab8" (UID: "66616542-abed-45a9-b0e2-e440f49f3ab8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.937642 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66616542-abed-45a9-b0e2-e440f49f3ab8-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.937674 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rlxn\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-kube-api-access-5rlxn\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.937686 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.937694 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66616542-abed-45a9-b0e2-e440f49f3ab8-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:34 crc kubenswrapper[4707]: I0121 15:52:34.937703 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66616542-abed-45a9-b0e2-e440f49f3ab8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.191629 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3eec98-b34f-4743-a58f-78eb4463a661" path="/var/lib/kubelet/pods/1f3eec98-b34f-4743-a58f-78eb4463a661/volumes" Jan 21 15:52:35 crc kubenswrapper[4707]: E0121 15:52:35.202924 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:35 crc kubenswrapper[4707]: E0121 15:52:35.204112 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:35 crc kubenswrapper[4707]: E0121 15:52:35.205370 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:35 crc kubenswrapper[4707]: E0121 15:52:35.205424 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.241127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerStarted","Data":"d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6"} Jan 21 15:52:35 crc 
kubenswrapper[4707]: I0121 15:52:35.242740 4707 generic.go:334] "Generic (PLEG): container finished" podID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerID="3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43" exitCode=0 Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.242774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" event={"ID":"66616542-abed-45a9-b0e2-e440f49f3ab8","Type":"ContainerDied","Data":"3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43"} Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.242790 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.242799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j" event={"ID":"66616542-abed-45a9-b0e2-e440f49f3ab8","Type":"ContainerDied","Data":"1033eb311ace7f2a51b10b0c584c6127c2ad10bb2a73242133fd4ac297020250"} Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.242830 4707 scope.go:117] "RemoveContainer" containerID="3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.265683 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j"] Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.267440 4707 scope.go:117] "RemoveContainer" containerID="bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.272274 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6674bb6-hwr2j"] Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.282492 4707 scope.go:117] "RemoveContainer" containerID="3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43" Jan 21 15:52:35 crc kubenswrapper[4707]: E0121 15:52:35.282858 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43\": container with ID starting with 3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43 not found: ID does not exist" containerID="3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.282884 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43"} err="failed to get container status \"3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43\": rpc error: code = NotFound desc = could not find container \"3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43\": container with ID starting with 3b9e56a5492a15dafeab716b5ccb3fe9a74446c7904a400f72eb73e2db202e43 not found: ID does not exist" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.282898 4707 scope.go:117] "RemoveContainer" containerID="bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87" Jan 21 15:52:35 crc kubenswrapper[4707]: E0121 15:52:35.283107 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87\": container with ID starting with bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87 
not found: ID does not exist" containerID="bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87" Jan 21 15:52:35 crc kubenswrapper[4707]: I0121 15:52:35.283130 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87"} err="failed to get container status \"bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87\": rpc error: code = NotFound desc = could not find container \"bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87\": container with ID starting with bc4bb78f36395696c2d82ed5dd56f45bca4f6020e51c1d0060300f78ba4e1a87 not found: ID does not exist" Jan 21 15:52:36 crc kubenswrapper[4707]: I0121 15:52:36.251170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerStarted","Data":"a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58"} Jan 21 15:52:36 crc kubenswrapper[4707]: I0121 15:52:36.251398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerStarted","Data":"8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360"} Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.190615 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" path="/var/lib/kubelet/pods/66616542-abed-45a9-b0e2-e440f49f3ab8/volumes" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.262999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerStarted","Data":"00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b"} Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.263127 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.282537 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.248285777 podStartE2EDuration="4.2825222s" podCreationTimestamp="2026-01-21 15:52:33 +0000 UTC" firstStartedPulling="2026-01-21 15:52:33.973307526 +0000 UTC m=+3051.154823748" lastFinishedPulling="2026-01-21 15:52:37.007543949 +0000 UTC m=+3054.189060171" observedRunningTime="2026-01-21 15:52:37.275582842 +0000 UTC m=+3054.457099063" watchObservedRunningTime="2026-01-21 15:52:37.2825222 +0000 UTC m=+3054.464038422" Jan 21 15:52:37 crc kubenswrapper[4707]: E0121 15:52:37.403386 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:37 crc kubenswrapper[4707]: E0121 15:52:37.404885 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:37 crc kubenswrapper[4707]: E0121 15:52:37.406218 4707 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:37 crc kubenswrapper[4707]: E0121 15:52:37.406281 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.562233 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.562290 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.636259 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.636302 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.636315 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.636708 4707 scope.go:117] "RemoveContainer" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" Jan 21 15:52:37 crc kubenswrapper[4707]: E0121 15:52:37.636949 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(198741f2-417e-4efd-b05e-82299bc804dc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="198741f2-417e-4efd-b05e-82299bc804dc" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.684233 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:37 crc kubenswrapper[4707]: I0121 15:52:37.684318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:38 crc kubenswrapper[4707]: I0121 15:52:38.646067 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.254:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:52:38 crc kubenswrapper[4707]: I0121 15:52:38.646090 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.254:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:52:38 crc kubenswrapper[4707]: I0121 15:52:38.767920 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.13:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:52:38 crc kubenswrapper[4707]: I0121 15:52:38.767958 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.13:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:52:39 crc kubenswrapper[4707]: I0121 15:52:39.945276 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:52:39 crc kubenswrapper[4707]: I0121 15:52:39.945330 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:52:40 crc kubenswrapper[4707]: E0121 15:52:40.201340 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:40 crc kubenswrapper[4707]: E0121 15:52:40.203093 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:40 crc kubenswrapper[4707]: E0121 15:52:40.205535 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:40 crc kubenswrapper[4707]: E0121 15:52:40.205596 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:42 crc kubenswrapper[4707]: E0121 15:52:42.403174 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:42 crc kubenswrapper[4707]: E0121 15:52:42.404477 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:42 crc kubenswrapper[4707]: E0121 15:52:42.405455 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:42 crc kubenswrapper[4707]: E0121 15:52:42.405546 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:45 crc kubenswrapper[4707]: E0121 15:52:45.201752 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:45 crc kubenswrapper[4707]: E0121 15:52:45.203270 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:45 crc kubenswrapper[4707]: E0121 15:52:45.204554 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:45 crc kubenswrapper[4707]: E0121 15:52:45.204607 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:47 crc kubenswrapper[4707]: E0121 15:52:47.403360 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:47 crc kubenswrapper[4707]: E0121 15:52:47.405363 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:47 crc kubenswrapper[4707]: E0121 15:52:47.406468 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:47 crc kubenswrapper[4707]: E0121 15:52:47.406558 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.563896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.563962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.565470 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.565844 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.688825 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.689389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.689635 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:47 crc kubenswrapper[4707]: I0121 15:52:47.692312 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:48 crc kubenswrapper[4707]: I0121 15:52:48.343108 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:48 crc kubenswrapper[4707]: I0121 15:52:48.345675 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:52:50 crc kubenswrapper[4707]: E0121 15:52:50.201753 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:50 crc kubenswrapper[4707]: E0121 15:52:50.203168 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:50 crc kubenswrapper[4707]: E0121 15:52:50.204304 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:50 crc kubenswrapper[4707]: E0121 
15:52:50.204336 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.125129 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.297994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pglf2\" (UniqueName: \"kubernetes.io/projected/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-kube-api-access-pglf2\") pod \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.298133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-combined-ca-bundle\") pod \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.298176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-config-data\") pod \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\" (UID: \"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7\") " Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.302503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-kube-api-access-pglf2" (OuterVolumeSpecName: "kube-api-access-pglf2") pod "1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" (UID: "1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7"). InnerVolumeSpecName "kube-api-access-pglf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.318713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-config-data" (OuterVolumeSpecName: "config-data") pod "1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" (UID: "1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.319046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" (UID: "1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.367269 4707 generic.go:334] "Generic (PLEG): container finished" podID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" exitCode=137 Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.367321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7","Type":"ContainerDied","Data":"47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3"} Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.367329 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.367356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7","Type":"ContainerDied","Data":"bf2c885dbf38f097c85e5736fed268197e96dd565646be83b3b289caa174cc68"} Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.367375 4707 scope.go:117] "RemoveContainer" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.388289 4707 scope.go:117] "RemoveContainer" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" Jan 21 15:52:51 crc kubenswrapper[4707]: E0121 15:52:51.390180 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3\": container with ID starting with 47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3 not found: ID does not exist" containerID="47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.390220 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3"} err="failed to get container status \"47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3\": rpc error: code = NotFound desc = could not find container \"47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3\": container with ID starting with 47cffea4fed32cce5aa662b138a34737233ce3be4eb03a2701c56535a1c51dc3 not found: ID does not exist" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.392315 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.400627 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pglf2\" (UniqueName: \"kubernetes.io/projected/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-kube-api-access-pglf2\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.400651 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.400661 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 
15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.400626 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.408995 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:52:51 crc kubenswrapper[4707]: E0121 15:52:51.409360 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-server" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.409372 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-server" Jan 21 15:52:51 crc kubenswrapper[4707]: E0121 15:52:51.409384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.409391 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:51 crc kubenswrapper[4707]: E0121 15:52:51.409456 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-httpd" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.409463 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-httpd" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.409665 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-server" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.409681 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="66616542-abed-45a9-b0e2-e440f49f3ab8" containerName="proxy-httpd" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.409703 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" containerName="nova-cell1-conductor-conductor" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.410310 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.413964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.420491 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.604499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdbm\" (UniqueName: \"kubernetes.io/projected/735f4b3b-efbd-440b-b617-27b89d61a044-kube-api-access-lwdbm\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.604563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.604583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.706468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdbm\" (UniqueName: \"kubernetes.io/projected/735f4b3b-efbd-440b-b617-27b89d61a044-kube-api-access-lwdbm\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.706533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.706556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.709791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.710081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.718797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdbm\" (UniqueName: \"kubernetes.io/projected/735f4b3b-efbd-440b-b617-27b89d61a044-kube-api-access-lwdbm\") pod \"nova-cell1-conductor-0\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:51 crc kubenswrapper[4707]: I0121 15:52:51.737725 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:52 crc kubenswrapper[4707]: I0121 15:52:52.104432 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:52:52 crc kubenswrapper[4707]: W0121 15:52:52.105352 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod735f4b3b_efbd_440b_b617_27b89d61a044.slice/crio-c195e4c9c91c37dc4629ae9598b7d352a14c383813fb2a61ee64b268762ee2aa WatchSource:0}: Error finding container c195e4c9c91c37dc4629ae9598b7d352a14c383813fb2a61ee64b268762ee2aa: Status 404 returned error can't find the container with id c195e4c9c91c37dc4629ae9598b7d352a14c383813fb2a61ee64b268762ee2aa Jan 21 15:52:52 crc kubenswrapper[4707]: I0121 15:52:52.376428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"735f4b3b-efbd-440b-b617-27b89d61a044","Type":"ContainerStarted","Data":"239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274"} Jan 21 15:52:52 crc kubenswrapper[4707]: I0121 15:52:52.376775 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:52:52 crc kubenswrapper[4707]: I0121 15:52:52.376793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"735f4b3b-efbd-440b-b617-27b89d61a044","Type":"ContainerStarted","Data":"c195e4c9c91c37dc4629ae9598b7d352a14c383813fb2a61ee64b268762ee2aa"} Jan 21 15:52:52 crc kubenswrapper[4707]: I0121 15:52:52.392352 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.392336016 podStartE2EDuration="1.392336016s" podCreationTimestamp="2026-01-21 15:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:52.386044755 +0000 UTC m=+3069.567560977" watchObservedRunningTime="2026-01-21 15:52:52.392336016 +0000 UTC m=+3069.573852238" Jan 21 15:52:52 crc kubenswrapper[4707]: E0121 15:52:52.403081 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:52 crc kubenswrapper[4707]: E0121 15:52:52.404259 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:52 crc 
kubenswrapper[4707]: E0121 15:52:52.405245 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:52:52 crc kubenswrapper[4707]: E0121 15:52:52.405298 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:53 crc kubenswrapper[4707]: I0121 15:52:53.188013 4707 scope.go:117] "RemoveContainer" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" Jan 21 15:52:53 crc kubenswrapper[4707]: I0121 15:52:53.190833 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7" path="/var/lib/kubelet/pods/1e69371d-2f7c-4ef0-b34a-7c1f342d6fb7/volumes" Jan 21 15:52:53 crc kubenswrapper[4707]: I0121 15:52:53.386269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerStarted","Data":"82e4262b5854221af674263805459eaa7c3b2334cab2493fd852b6185a2ec55a"} Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.337544 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.342802 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.361113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-combined-ca-bundle\") pod \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.361174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp5md\" (UniqueName: \"kubernetes.io/projected/011702e0-dd00-4236-8d20-ad2dfa6c84fa-kube-api-access-dp5md\") pod \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.361209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8np7m\" (UniqueName: \"kubernetes.io/projected/2ebce296-11df-4e07-8a09-855cf9ee3417-kube-api-access-8np7m\") pod \"2ebce296-11df-4e07-8a09-855cf9ee3417\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.361291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-combined-ca-bundle\") pod \"2ebce296-11df-4e07-8a09-855cf9ee3417\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.361351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-config-data\") pod \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\" (UID: \"011702e0-dd00-4236-8d20-ad2dfa6c84fa\") " Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.361427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-config-data\") pod \"2ebce296-11df-4e07-8a09-855cf9ee3417\" (UID: \"2ebce296-11df-4e07-8a09-855cf9ee3417\") " Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.372947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011702e0-dd00-4236-8d20-ad2dfa6c84fa-kube-api-access-dp5md" (OuterVolumeSpecName: "kube-api-access-dp5md") pod "011702e0-dd00-4236-8d20-ad2dfa6c84fa" (UID: "011702e0-dd00-4236-8d20-ad2dfa6c84fa"). InnerVolumeSpecName "kube-api-access-dp5md". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.375308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ebce296-11df-4e07-8a09-855cf9ee3417-kube-api-access-8np7m" (OuterVolumeSpecName: "kube-api-access-8np7m") pod "2ebce296-11df-4e07-8a09-855cf9ee3417" (UID: "2ebce296-11df-4e07-8a09-855cf9ee3417"). InnerVolumeSpecName "kube-api-access-8np7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.385142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-config-data" (OuterVolumeSpecName: "config-data") pod "011702e0-dd00-4236-8d20-ad2dfa6c84fa" (UID: "011702e0-dd00-4236-8d20-ad2dfa6c84fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.387229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "011702e0-dd00-4236-8d20-ad2dfa6c84fa" (UID: "011702e0-dd00-4236-8d20-ad2dfa6c84fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.388290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-config-data" (OuterVolumeSpecName: "config-data") pod "2ebce296-11df-4e07-8a09-855cf9ee3417" (UID: "2ebce296-11df-4e07-8a09-855cf9ee3417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.390900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ebce296-11df-4e07-8a09-855cf9ee3417" (UID: "2ebce296-11df-4e07-8a09-855cf9ee3417"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.393223 4707 generic.go:334] "Generic (PLEG): container finished" podID="011702e0-dd00-4236-8d20-ad2dfa6c84fa" containerID="3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57" exitCode=137 Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.393285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"011702e0-dd00-4236-8d20-ad2dfa6c84fa","Type":"ContainerDied","Data":"3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57"} Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.393310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"011702e0-dd00-4236-8d20-ad2dfa6c84fa","Type":"ContainerDied","Data":"9eccda1f6b4b5e590bfe854a3b161bd19c111a8cedb6b7c720ece65a480da30b"} Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.393327 4707 scope.go:117] "RemoveContainer" containerID="3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.393406 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.405532 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" exitCode=137 Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.405559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2ebce296-11df-4e07-8a09-855cf9ee3417","Type":"ContainerDied","Data":"192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e"} Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.405574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"2ebce296-11df-4e07-8a09-855cf9ee3417","Type":"ContainerDied","Data":"f2289a893a7fdf7645509da321da842815be67cba65365577b3b28ee88cfdd60"} Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.405605 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.457296 4707 scope.go:117] "RemoveContainer" containerID="3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57" Jan 21 15:52:54 crc kubenswrapper[4707]: E0121 15:52:54.457858 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57\": container with ID starting with 3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57 not found: ID does not exist" containerID="3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.457893 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57"} err="failed to get container status \"3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57\": rpc error: code = NotFound desc = could not find container \"3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57\": container with ID starting with 3d43caf7035980fc553c81c53dae9bf4d69f7c378b9d4f8abe4b2cc003e89d57 not found: ID does not exist" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.457918 4707 scope.go:117] "RemoveContainer" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.467247 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.467283 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.467294 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp5md\" (UniqueName: \"kubernetes.io/projected/011702e0-dd00-4236-8d20-ad2dfa6c84fa-kube-api-access-dp5md\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.467303 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8np7m\" 
(UniqueName: \"kubernetes.io/projected/2ebce296-11df-4e07-8a09-855cf9ee3417-kube-api-access-8np7m\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.467311 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebce296-11df-4e07-8a09-855cf9ee3417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.467319 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011702e0-dd00-4236-8d20-ad2dfa6c84fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.481289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.499933 4707 scope.go:117] "RemoveContainer" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" Jan 21 15:52:54 crc kubenswrapper[4707]: E0121 15:52:54.500330 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e\": container with ID starting with 192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e not found: ID does not exist" containerID="192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.500377 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e"} err="failed to get container status \"192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e\": rpc error: code = NotFound desc = could not find container \"192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e\": container with ID starting with 192beb9b6001a287ef40c713081da18e7b4f942f367f5c55cab06a3b25d0f17e not found: ID does not exist" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.520149 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.526687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.532200 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.538721 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: E0121 15:52:54.539090 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011702e0-dd00-4236-8d20-ad2dfa6c84fa" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.539108 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="011702e0-dd00-4236-8d20-ad2dfa6c84fa" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:52:54 crc kubenswrapper[4707]: E0121 15:52:54.539139 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.539147 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.539308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" containerName="nova-cell0-conductor-conductor" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.539324 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="011702e0-dd00-4236-8d20-ad2dfa6c84fa" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.539869 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.541373 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.544418 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.545150 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.546631 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.549888 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.559575 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.568734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhpw\" (UniqueName: \"kubernetes.io/projected/6b9de3b0-5497-4666-800f-6e5db9e68f8e-kube-api-access-nmhpw\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.568888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.569039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.569158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz92z\" (UniqueName: \"kubernetes.io/projected/602a278f-4c40-434d-987a-30494dfb3f15-kube-api-access-xz92z\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.569232 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.569271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.669961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.670289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.670389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz92z\" (UniqueName: \"kubernetes.io/projected/602a278f-4c40-434d-987a-30494dfb3f15-kube-api-access-xz92z\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.670452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.670853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.671021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhpw\" (UniqueName: \"kubernetes.io/projected/6b9de3b0-5497-4666-800f-6e5db9e68f8e-kube-api-access-nmhpw\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.673300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.673364 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.673934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.674051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.684294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhpw\" (UniqueName: \"kubernetes.io/projected/6b9de3b0-5497-4666-800f-6e5db9e68f8e-kube-api-access-nmhpw\") pod \"nova-cell0-conductor-0\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.684793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz92z\" (UniqueName: \"kubernetes.io/projected/602a278f-4c40-434d-987a-30494dfb3f15-kube-api-access-xz92z\") pod \"nova-cell1-novncproxy-0\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.855969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:54 crc kubenswrapper[4707]: I0121 15:52:54.866990 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.190907 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011702e0-dd00-4236-8d20-ad2dfa6c84fa" path="/var/lib/kubelet/pods/011702e0-dd00-4236-8d20-ad2dfa6c84fa/volumes" Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.191402 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ebce296-11df-4e07-8a09-855cf9ee3417" path="/var/lib/kubelet/pods/2ebce296-11df-4e07-8a09-855cf9ee3417/volumes" Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.240037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.295625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:52:55 crc kubenswrapper[4707]: W0121 15:52:55.296454 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602a278f_4c40_434d_987a_30494dfb3f15.slice/crio-9fa664a1846224b1c4c90896ff3c714b680fd6ff44ab4fd1c654e2b50bb24d3f WatchSource:0}: Error finding container 9fa664a1846224b1c4c90896ff3c714b680fd6ff44ab4fd1c654e2b50bb24d3f: Status 404 returned error can't find the container with id 9fa664a1846224b1c4c90896ff3c714b680fd6ff44ab4fd1c654e2b50bb24d3f Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.417112 4707 generic.go:334] "Generic (PLEG): container finished" podID="198741f2-417e-4efd-b05e-82299bc804dc" containerID="82e4262b5854221af674263805459eaa7c3b2334cab2493fd852b6185a2ec55a" exitCode=1 Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.417188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerDied","Data":"82e4262b5854221af674263805459eaa7c3b2334cab2493fd852b6185a2ec55a"} Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.417272 4707 scope.go:117] "RemoveContainer" containerID="582ab3db941ef3aa1a04e28fe1957ffa9b075bb497984c92e56ff758f0bea5ed" Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.417940 4707 scope.go:117] "RemoveContainer" containerID="82e4262b5854221af674263805459eaa7c3b2334cab2493fd852b6185a2ec55a" Jan 21 15:52:55 crc kubenswrapper[4707]: E0121 15:52:55.418167 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(198741f2-417e-4efd-b05e-82299bc804dc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="198741f2-417e-4efd-b05e-82299bc804dc" Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.419143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"602a278f-4c40-434d-987a-30494dfb3f15","Type":"ContainerStarted","Data":"9fa664a1846224b1c4c90896ff3c714b680fd6ff44ab4fd1c654e2b50bb24d3f"} Jan 21 15:52:55 crc kubenswrapper[4707]: I0121 15:52:55.420871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"6b9de3b0-5497-4666-800f-6e5db9e68f8e","Type":"ContainerStarted","Data":"26864f10b43e3544c0cb3ccb68e807354afd363c061983a48a6ca15d68d32b5d"} Jan 21 15:52:56 crc kubenswrapper[4707]: I0121 15:52:56.430979 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"6b9de3b0-5497-4666-800f-6e5db9e68f8e","Type":"ContainerStarted","Data":"8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f"} Jan 21 15:52:56 crc kubenswrapper[4707]: I0121 15:52:56.431282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:52:56 crc kubenswrapper[4707]: I0121 15:52:56.434366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"602a278f-4c40-434d-987a-30494dfb3f15","Type":"ContainerStarted","Data":"6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6"} Jan 21 15:52:56 crc kubenswrapper[4707]: I0121 15:52:56.446738 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.446723562 podStartE2EDuration="2.446723562s" podCreationTimestamp="2026-01-21 15:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:56.44214895 +0000 UTC m=+3073.623665171" watchObservedRunningTime="2026-01-21 15:52:56.446723562 +0000 UTC m=+3073.628239784" Jan 21 15:52:56 crc kubenswrapper[4707]: I0121 15:52:56.459351 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.459335438 podStartE2EDuration="2.459335438s" podCreationTimestamp="2026-01-21 15:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:56.455058134 +0000 UTC m=+3073.636574356" watchObservedRunningTime="2026-01-21 15:52:56.459335438 +0000 UTC m=+3073.640851660" Jan 21 15:52:57 crc kubenswrapper[4707]: I0121 15:52:57.635937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:57 crc kubenswrapper[4707]: I0121 15:52:57.636167 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:57 crc kubenswrapper[4707]: I0121 15:52:57.636178 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:57 crc kubenswrapper[4707]: I0121 15:52:57.636187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:52:57 crc kubenswrapper[4707]: I0121 15:52:57.636851 4707 scope.go:117] "RemoveContainer" containerID="82e4262b5854221af674263805459eaa7c3b2334cab2493fd852b6185a2ec55a" Jan 21 15:52:57 crc kubenswrapper[4707]: E0121 15:52:57.637124 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(198741f2-417e-4efd-b05e-82299bc804dc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="198741f2-417e-4efd-b05e-82299bc804dc" Jan 21 15:52:59 crc kubenswrapper[4707]: I0121 15:52:59.867550 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:01 crc kubenswrapper[4707]: I0121 15:53:01.758213 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:03 crc kubenswrapper[4707]: I0121 15:53:03.599569 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:04 crc kubenswrapper[4707]: I0121 15:53:04.867385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:04 crc kubenswrapper[4707]: I0121 15:53:04.876617 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:04 crc kubenswrapper[4707]: I0121 15:53:04.876878 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.297438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-tp798"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.303631 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-tp798"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.335093 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.335387 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-log" containerID="cri-o://df55e9ac1dc497a2a744b36c04038da3b4b71589b0800551c208fa606a3051f7" gracePeriod=30 Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.335464 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-api" containerID="cri-o://106e3556f3ec42865d3ccced817a33758f210fb47212fc3f57bf6a9732818a91" gracePeriod=30 Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.345993 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.389230 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.488354 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.488682 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-log" containerID="cri-o://85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98" gracePeriod=30 Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.488795 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-metadata" containerID="cri-o://9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173" gracePeriod=30 Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.497354 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerID="df55e9ac1dc497a2a744b36c04038da3b4b71589b0800551c208fa606a3051f7" exitCode=143 Jan 21 
15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.498329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ba711d4-865f-4d09-892f-30c9b4f06a16","Type":"ContainerDied","Data":"df55e9ac1dc497a2a744b36c04038da3b4b71589b0800551c208fa606a3051f7"} Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.505572 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.676370 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.677600 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.679965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.680110 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.684709 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f"] Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.722578 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.863380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjmn\" (UniqueName: \"kubernetes.io/projected/198741f2-417e-4efd-b05e-82299bc804dc-kube-api-access-wvjmn\") pod \"198741f2-417e-4efd-b05e-82299bc804dc\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.863561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-combined-ca-bundle\") pod \"198741f2-417e-4efd-b05e-82299bc804dc\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.863653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-config-data\") pod \"198741f2-417e-4efd-b05e-82299bc804dc\" (UID: \"198741f2-417e-4efd-b05e-82299bc804dc\") " Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.863993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-config-data\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.864062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 
15:53:05.864273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nck7f\" (UniqueName: \"kubernetes.io/projected/62d12d33-e904-4139-83d0-23eb2bb74b23-kube-api-access-nck7f\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.864445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-scripts\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.873483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198741f2-417e-4efd-b05e-82299bc804dc-kube-api-access-wvjmn" (OuterVolumeSpecName: "kube-api-access-wvjmn") pod "198741f2-417e-4efd-b05e-82299bc804dc" (UID: "198741f2-417e-4efd-b05e-82299bc804dc"). InnerVolumeSpecName "kube-api-access-wvjmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.885727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "198741f2-417e-4efd-b05e-82299bc804dc" (UID: "198741f2-417e-4efd-b05e-82299bc804dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.895618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-config-data" (OuterVolumeSpecName: "config-data") pod "198741f2-417e-4efd-b05e-82299bc804dc" (UID: "198741f2-417e-4efd-b05e-82299bc804dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.966765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-config-data\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.966873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.966935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nck7f\" (UniqueName: \"kubernetes.io/projected/62d12d33-e904-4139-83d0-23eb2bb74b23-kube-api-access-nck7f\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.966977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-scripts\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.967044 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.967057 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198741f2-417e-4efd-b05e-82299bc804dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.967067 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjmn\" (UniqueName: \"kubernetes.io/projected/198741f2-417e-4efd-b05e-82299bc804dc-kube-api-access-wvjmn\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.970285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-scripts\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.970527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-config-data\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.972317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: 
\"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:05 crc kubenswrapper[4707]: I0121 15:53:05.980511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nck7f\" (UniqueName: \"kubernetes.io/projected/62d12d33-e904-4139-83d0-23eb2bb74b23-kube-api-access-nck7f\") pod \"nova-cell0-cell-mapping-jhb2f\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.061216 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.460014 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f"] Jan 21 15:53:06 crc kubenswrapper[4707]: W0121 15:53:06.464026 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62d12d33_e904_4139_83d0_23eb2bb74b23.slice/crio-caae9bc0546c9e1825d4e9c760493332176a86bfd8d17989e5fe409312df96ec WatchSource:0}: Error finding container caae9bc0546c9e1825d4e9c760493332176a86bfd8d17989e5fe409312df96ec: Status 404 returned error can't find the container with id caae9bc0546c9e1825d4e9c760493332176a86bfd8d17989e5fe409312df96ec Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.525038 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerID="85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98" exitCode=143 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.525097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ab6df059-8aa3-40c8-a65f-17e2fa8de97b","Type":"ContainerDied","Data":"85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98"} Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.541021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"198741f2-417e-4efd-b05e-82299bc804dc","Type":"ContainerDied","Data":"a7decfbbf3415b255d5a36d00d6c1dd9c22e376d2a19c0e453bf09a9a73276a3"} Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.541057 4707 scope.go:117] "RemoveContainer" containerID="82e4262b5854221af674263805459eaa7c3b2334cab2493fd852b6185a2ec55a" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.541163 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.553324 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="602a278f-4c40-434d-987a-30494dfb3f15" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6" gracePeriod=30 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.553522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" event={"ID":"62d12d33-e904-4139-83d0-23eb2bb74b23","Type":"ContainerStarted","Data":"caae9bc0546c9e1825d4e9c760493332176a86bfd8d17989e5fe409312df96ec"} Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.587844 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.599141 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.608608 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:06 crc kubenswrapper[4707]: E0121 15:53:06.610086 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.610110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:06 crc kubenswrapper[4707]: E0121 15:53:06.610138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.610145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.610325 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.610356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.611216 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.619054 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.647853 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.782481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjw5q\" (UniqueName: \"kubernetes.io/projected/e6f557ac-0575-4870-b132-6d3a727f1904-kube-api-access-jjw5q\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.782719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-config-data\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.782952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.884729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.884843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjw5q\" (UniqueName: \"kubernetes.io/projected/e6f557ac-0575-4870-b132-6d3a727f1904-kube-api-access-jjw5q\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.884864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-config-data\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.888006 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.888225 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-central-agent" containerID="cri-o://d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6" gracePeriod=30 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.888602 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="proxy-httpd" 
containerID="cri-o://00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b" gracePeriod=30 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.888619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-config-data\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.888667 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="sg-core" containerID="cri-o://a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58" gracePeriod=30 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.888704 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-notification-agent" containerID="cri-o://8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360" gracePeriod=30 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.896196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.907737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjw5q\" (UniqueName: \"kubernetes.io/projected/e6f557ac-0575-4870-b132-6d3a727f1904-kube-api-access-jjw5q\") pod \"nova-scheduler-0\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.912648 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.913091 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="165b85c7-02ce-4c3f-9433-60c84b3307fd" containerName="kube-state-metrics" containerID="cri-o://e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421" gracePeriod=30 Jan 21 15:53:06 crc kubenswrapper[4707]: I0121 15:53:06.928247 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.191075 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198741f2-417e-4efd-b05e-82299bc804dc" path="/var/lib/kubelet/pods/198741f2-417e-4efd-b05e-82299bc804dc/volumes" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.191919 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564cacdc-a141-427f-ba32-593fe665a29b" path="/var/lib/kubelet/pods/564cacdc-a141-427f-ba32-593fe665a29b/volumes" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.280784 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.351569 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.352950 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.391499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-combined-ca-bundle\") pod \"602a278f-4c40-434d-987a-30494dfb3f15\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.391656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz92z\" (UniqueName: \"kubernetes.io/projected/602a278f-4c40-434d-987a-30494dfb3f15-kube-api-access-xz92z\") pod \"602a278f-4c40-434d-987a-30494dfb3f15\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.391714 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-config-data\") pod \"602a278f-4c40-434d-987a-30494dfb3f15\" (UID: \"602a278f-4c40-434d-987a-30494dfb3f15\") " Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.395874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602a278f-4c40-434d-987a-30494dfb3f15-kube-api-access-xz92z" (OuterVolumeSpecName: "kube-api-access-xz92z") pod "602a278f-4c40-434d-987a-30494dfb3f15" (UID: "602a278f-4c40-434d-987a-30494dfb3f15"). InnerVolumeSpecName "kube-api-access-xz92z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.419923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-config-data" (OuterVolumeSpecName: "config-data") pod "602a278f-4c40-434d-987a-30494dfb3f15" (UID: "602a278f-4c40-434d-987a-30494dfb3f15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.434916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "602a278f-4c40-434d-987a-30494dfb3f15" (UID: "602a278f-4c40-434d-987a-30494dfb3f15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.493220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlw26\" (UniqueName: \"kubernetes.io/projected/165b85c7-02ce-4c3f-9433-60c84b3307fd-kube-api-access-mlw26\") pod \"165b85c7-02ce-4c3f-9433-60c84b3307fd\" (UID: \"165b85c7-02ce-4c3f-9433-60c84b3307fd\") " Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.493951 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz92z\" (UniqueName: \"kubernetes.io/projected/602a278f-4c40-434d-987a-30494dfb3f15-kube-api-access-xz92z\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.494080 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.494092 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602a278f-4c40-434d-987a-30494dfb3f15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.495647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165b85c7-02ce-4c3f-9433-60c84b3307fd-kube-api-access-mlw26" (OuterVolumeSpecName: "kube-api-access-mlw26") pod "165b85c7-02ce-4c3f-9433-60c84b3307fd" (UID: "165b85c7-02ce-4c3f-9433-60c84b3307fd"). InnerVolumeSpecName "kube-api-access-mlw26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.563020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" event={"ID":"62d12d33-e904-4139-83d0-23eb2bb74b23","Type":"ContainerStarted","Data":"7728de97aa2f6f9a13c08cc05716ea056bddb1521fae5153be06fdfe1e028f81"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.567411 4707 generic.go:334] "Generic (PLEG): container finished" podID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerID="00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b" exitCode=0 Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.567436 4707 generic.go:334] "Generic (PLEG): container finished" podID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerID="a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58" exitCode=2 Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.567443 4707 generic.go:334] "Generic (PLEG): container finished" podID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerID="d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6" exitCode=0 Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.567474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerDied","Data":"00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.567491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerDied","Data":"a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.567500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerDied","Data":"d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.568658 4707 generic.go:334] "Generic (PLEG): container finished" podID="602a278f-4c40-434d-987a-30494dfb3f15" containerID="6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6" exitCode=0 Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.568700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"602a278f-4c40-434d-987a-30494dfb3f15","Type":"ContainerDied","Data":"6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.568716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"602a278f-4c40-434d-987a-30494dfb3f15","Type":"ContainerDied","Data":"9fa664a1846224b1c4c90896ff3c714b680fd6ff44ab4fd1c654e2b50bb24d3f"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.568730 4707 scope.go:117] "RemoveContainer" containerID="6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.568802 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.577353 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" podStartSLOduration=2.577337163 podStartE2EDuration="2.577337163s" podCreationTimestamp="2026-01-21 15:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:07.576193493 +0000 UTC m=+3084.757709714" watchObservedRunningTime="2026-01-21 15:53:07.577337163 +0000 UTC m=+3084.758853385" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.579305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e6f557ac-0575-4870-b132-6d3a727f1904","Type":"ContainerStarted","Data":"697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.579478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e6f557ac-0575-4870-b132-6d3a727f1904","Type":"ContainerStarted","Data":"fb85a73ac6884b3f777dc939eb38a311268116b39009994aeebe7c0cfcf00826"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.582479 4707 generic.go:334] "Generic (PLEG): container finished" podID="165b85c7-02ce-4c3f-9433-60c84b3307fd" containerID="e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421" exitCode=2 Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.582585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"165b85c7-02ce-4c3f-9433-60c84b3307fd","Type":"ContainerDied","Data":"e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.582651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" 
event={"ID":"165b85c7-02ce-4c3f-9433-60c84b3307fd","Type":"ContainerDied","Data":"1f3e9d0b061890d387e576c08d067ada58f59b62898573f4e3c9a1d53dbbad45"} Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.582733 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.595729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlw26\" (UniqueName: \"kubernetes.io/projected/165b85c7-02ce-4c3f-9433-60c84b3307fd-kube-api-access-mlw26\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.598955 4707 scope.go:117] "RemoveContainer" containerID="6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6" Jan 21 15:53:07 crc kubenswrapper[4707]: E0121 15:53:07.600871 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6\": container with ID starting with 6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6 not found: ID does not exist" containerID="6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.600901 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6"} err="failed to get container status \"6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6\": rpc error: code = NotFound desc = could not find container \"6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6\": container with ID starting with 6f23118f26877e63eaedf04ca9d42ff582506e9b891f75271fe5d47f1ef08ad6 not found: ID does not exist" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.600917 4707 scope.go:117] "RemoveContainer" containerID="e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.611790 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.611761795 podStartE2EDuration="1.611761795s" podCreationTimestamp="2026-01-21 15:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:07.593913955 +0000 UTC m=+3084.775430197" watchObservedRunningTime="2026-01-21 15:53:07.611761795 +0000 UTC m=+3084.793278017" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.623348 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.637755 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.640320 4707 scope.go:117] "RemoveContainer" containerID="e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421" Jan 21 15:53:07 crc kubenswrapper[4707]: E0121 15:53:07.640724 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421\": container with ID starting with e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421 not found: ID does not exist" 
containerID="e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.640762 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421"} err="failed to get container status \"e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421\": rpc error: code = NotFound desc = could not find container \"e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421\": container with ID starting with e22a242f1d76034597295ec5a6537ae26349376e5663c479a316281f57f7d421 not found: ID does not exist" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.646015 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.653867 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.682056 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: E0121 15:53:07.682702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165b85c7-02ce-4c3f-9433-60c84b3307fd" containerName="kube-state-metrics" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.682727 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="165b85c7-02ce-4c3f-9433-60c84b3307fd" containerName="kube-state-metrics" Jan 21 15:53:07 crc kubenswrapper[4707]: E0121 15:53:07.682746 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.682753 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:07 crc kubenswrapper[4707]: E0121 15:53:07.682766 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602a278f-4c40-434d-987a-30494dfb3f15" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.682772 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="602a278f-4c40-434d-987a-30494dfb3f15" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.684168 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="602a278f-4c40-434d-987a-30494dfb3f15" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.684225 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="165b85c7-02ce-4c3f-9433-60c84b3307fd" containerName="kube-state-metrics" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.684272 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="198741f2-417e-4efd-b05e-82299bc804dc" containerName="nova-scheduler-scheduler" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.685112 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.687784 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.691851 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.692969 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.698719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.700123 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.702135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.702722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.702729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.709244 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.798931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4858v\" (UniqueName: \"kubernetes.io/projected/6dab9056-b15b-4adb-8a83-4b197e2b724c-kube-api-access-4858v\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2l7\" (UniqueName: \"kubernetes.io/projected/874c328a-6498-4a0a-9754-cb2051c03411-kube-api-access-xh2l7\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.799200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.901283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc 
kubenswrapper[4707]: I0121 15:53:07.902231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4858v\" (UniqueName: \"kubernetes.io/projected/6dab9056-b15b-4adb-8a83-4b197e2b724c-kube-api-access-4858v\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2l7\" (UniqueName: \"kubernetes.io/projected/874c328a-6498-4a0a-9754-cb2051c03411-kube-api-access-xh2l7\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.902499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.905451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.907436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.907433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.907827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 
15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.908112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.908383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.914444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.920179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4858v\" (UniqueName: \"kubernetes.io/projected/6dab9056-b15b-4adb-8a83-4b197e2b724c-kube-api-access-4858v\") pod \"nova-cell1-novncproxy-0\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:07 crc kubenswrapper[4707]: I0121 15:53:07.933327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2l7\" (UniqueName: \"kubernetes.io/projected/874c328a-6498-4a0a-9754-cb2051c03411-kube-api-access-xh2l7\") pod \"kube-state-metrics-0\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.011403 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.016425 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.413033 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.466792 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:53:08 crc kubenswrapper[4707]: W0121 15:53:08.475375 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod874c328a_6498_4a0a_9754_cb2051c03411.slice/crio-9a553449177c427b49f07c66bf11e325ecd3d39b745f0103343310d286b949a3 WatchSource:0}: Error finding container 9a553449177c427b49f07c66bf11e325ecd3d39b745f0103343310d286b949a3: Status 404 returned error can't find the container with id 9a553449177c427b49f07c66bf11e325ecd3d39b745f0103343310d286b949a3 Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.523118 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.523348 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" containerID="cri-o://239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" gracePeriod=30 Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.595374 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerID="106e3556f3ec42865d3ccced817a33758f210fb47212fc3f57bf6a9732818a91" exitCode=0 Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.595596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ba711d4-865f-4d09-892f-30c9b4f06a16","Type":"ContainerDied","Data":"106e3556f3ec42865d3ccced817a33758f210fb47212fc3f57bf6a9732818a91"} Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.596765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"874c328a-6498-4a0a-9754-cb2051c03411","Type":"ContainerStarted","Data":"9a553449177c427b49f07c66bf11e325ecd3d39b745f0103343310d286b949a3"} Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.599274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6dab9056-b15b-4adb-8a83-4b197e2b724c","Type":"ContainerStarted","Data":"b2c6ad6adefc8376ea0b8a64281cf6f521049a7cd75a8b2f76706d8faf504ba8"} Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.656033 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.254:8775/\": read tcp 10.217.0.2:52258->10.217.1.254:8775: read: connection reset by peer" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.656037 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.254:8775/\": read tcp 10.217.0.2:52266->10.217.1.254:8775: read: connection reset by peer" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 
15:53:08.794934 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.819842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss2hb\" (UniqueName: \"kubernetes.io/projected/8ba711d4-865f-4d09-892f-30c9b4f06a16-kube-api-access-ss2hb\") pod \"8ba711d4-865f-4d09-892f-30c9b4f06a16\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.819918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-config-data\") pod \"8ba711d4-865f-4d09-892f-30c9b4f06a16\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.819984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-combined-ca-bundle\") pod \"8ba711d4-865f-4d09-892f-30c9b4f06a16\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.820038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba711d4-865f-4d09-892f-30c9b4f06a16-logs\") pod \"8ba711d4-865f-4d09-892f-30c9b4f06a16\" (UID: \"8ba711d4-865f-4d09-892f-30c9b4f06a16\") " Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.820781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba711d4-865f-4d09-892f-30c9b4f06a16-logs" (OuterVolumeSpecName: "logs") pod "8ba711d4-865f-4d09-892f-30c9b4f06a16" (UID: "8ba711d4-865f-4d09-892f-30c9b4f06a16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.830984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba711d4-865f-4d09-892f-30c9b4f06a16-kube-api-access-ss2hb" (OuterVolumeSpecName: "kube-api-access-ss2hb") pod "8ba711d4-865f-4d09-892f-30c9b4f06a16" (UID: "8ba711d4-865f-4d09-892f-30c9b4f06a16"). InnerVolumeSpecName "kube-api-access-ss2hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.859820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-config-data" (OuterVolumeSpecName: "config-data") pod "8ba711d4-865f-4d09-892f-30c9b4f06a16" (UID: "8ba711d4-865f-4d09-892f-30c9b4f06a16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.864902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ba711d4-865f-4d09-892f-30c9b4f06a16" (UID: "8ba711d4-865f-4d09-892f-30c9b4f06a16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.922190 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss2hb\" (UniqueName: \"kubernetes.io/projected/8ba711d4-865f-4d09-892f-30c9b4f06a16-kube-api-access-ss2hb\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.922211 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.922221 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba711d4-865f-4d09-892f-30c9b4f06a16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.922229 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba711d4-865f-4d09-892f-30c9b4f06a16-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:08 crc kubenswrapper[4707]: I0121 15:53:08.995872 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.022805 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-logs\") pod \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.022915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-config-data\") pod \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.022968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vdgl\" (UniqueName: \"kubernetes.io/projected/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-kube-api-access-6vdgl\") pod \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.023031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-combined-ca-bundle\") pod \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\" (UID: \"ab6df059-8aa3-40c8-a65f-17e2fa8de97b\") " Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.024122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-logs" (OuterVolumeSpecName: "logs") pod "ab6df059-8aa3-40c8-a65f-17e2fa8de97b" (UID: "ab6df059-8aa3-40c8-a65f-17e2fa8de97b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.027048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-kube-api-access-6vdgl" (OuterVolumeSpecName: "kube-api-access-6vdgl") pod "ab6df059-8aa3-40c8-a65f-17e2fa8de97b" (UID: "ab6df059-8aa3-40c8-a65f-17e2fa8de97b"). InnerVolumeSpecName "kube-api-access-6vdgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.048999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab6df059-8aa3-40c8-a65f-17e2fa8de97b" (UID: "ab6df059-8aa3-40c8-a65f-17e2fa8de97b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.051189 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-config-data" (OuterVolumeSpecName: "config-data") pod "ab6df059-8aa3-40c8-a65f-17e2fa8de97b" (UID: "ab6df059-8aa3-40c8-a65f-17e2fa8de97b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.124517 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.124547 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.124558 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vdgl\" (UniqueName: \"kubernetes.io/projected/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-kube-api-access-6vdgl\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.124567 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6df059-8aa3-40c8-a65f-17e2fa8de97b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.190232 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165b85c7-02ce-4c3f-9433-60c84b3307fd" path="/var/lib/kubelet/pods/165b85c7-02ce-4c3f-9433-60c84b3307fd/volumes" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.191013 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602a278f-4c40-434d-987a-30494dfb3f15" path="/var/lib/kubelet/pods/602a278f-4c40-434d-987a-30494dfb3f15/volumes" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.606904 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerID="9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173" exitCode=0 Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.606974 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.606989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ab6df059-8aa3-40c8-a65f-17e2fa8de97b","Type":"ContainerDied","Data":"9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173"} Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.607275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ab6df059-8aa3-40c8-a65f-17e2fa8de97b","Type":"ContainerDied","Data":"b8a9c0566791d1fc14346b1268283488770f5790c8a50fbe3d8d1f5c7926331f"} Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.607293 4707 scope.go:117] "RemoveContainer" containerID="9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.608956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ba711d4-865f-4d09-892f-30c9b4f06a16","Type":"ContainerDied","Data":"1f8b7bff56582677ba605c071663bc7a49b6e5db322096a602b3e289760132dc"} Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.609053 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.611376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"874c328a-6498-4a0a-9754-cb2051c03411","Type":"ContainerStarted","Data":"11762a52965a774c3c7ef42b8f5b5562d1e89a9129e880c1f1650e9be9144cda"} Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.611555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.613268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6dab9056-b15b-4adb-8a83-4b197e2b724c","Type":"ContainerStarted","Data":"c0f64e3130d63c159d5622b3618a7db1051e62d9dd644abb3bb1f144f6bab0b3"} Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.624401 4707 scope.go:117] "RemoveContainer" containerID="85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.634369 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.638665 4707 scope.go:117] "RemoveContainer" containerID="9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173" Jan 21 15:53:09 crc kubenswrapper[4707]: E0121 15:53:09.639034 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173\": container with ID starting with 9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173 not found: ID does not exist" containerID="9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.639071 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173"} err="failed to get container status \"9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173\": rpc error: code = NotFound desc = could not find 
container \"9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173\": container with ID starting with 9a771b3de32fd10ae3071d143ebed3469fa9e3c810e9416ce2f17488c1bd9173 not found: ID does not exist" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.639087 4707 scope.go:117] "RemoveContainer" containerID="85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98" Jan 21 15:53:09 crc kubenswrapper[4707]: E0121 15:53:09.640451 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98\": container with ID starting with 85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98 not found: ID does not exist" containerID="85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.640480 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98"} err="failed to get container status \"85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98\": rpc error: code = NotFound desc = could not find container \"85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98\": container with ID starting with 85f7ccf6e293f3ae0e46b908d228dc1f7ba1bad8f8cb0f494ae40b2642732e98 not found: ID does not exist" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.640496 4707 scope.go:117] "RemoveContainer" containerID="106e3556f3ec42865d3ccced817a33758f210fb47212fc3f57bf6a9732818a91" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.646529 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680125 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: E0121 15:53:09.680549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-log" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680568 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-log" Jan 21 15:53:09 crc kubenswrapper[4707]: E0121 15:53:09.680582 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-log" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680589 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-log" Jan 21 15:53:09 crc kubenswrapper[4707]: E0121 15:53:09.680596 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-metadata" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680601 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-metadata" Jan 21 15:53:09 crc kubenswrapper[4707]: E0121 15:53:09.680611 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-api" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680616 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-api" Jan 21 
15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-log" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680823 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-api" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" containerName="nova-metadata-metadata" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.680848 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" containerName="nova-api-log" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.681692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.682953 4707 scope.go:117] "RemoveContainer" containerID="df55e9ac1dc497a2a744b36c04038da3b4b71589b0800551c208fa606a3051f7" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.685660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.685828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.689181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.700759 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.325199754 podStartE2EDuration="2.700741247s" podCreationTimestamp="2026-01-21 15:53:07 +0000 UTC" firstStartedPulling="2026-01-21 15:53:08.477407395 +0000 UTC m=+3085.658923607" lastFinishedPulling="2026-01-21 15:53:08.852948878 +0000 UTC m=+3086.034465100" observedRunningTime="2026-01-21 15:53:09.649199297 +0000 UTC m=+3086.830715519" watchObservedRunningTime="2026-01-21 15:53:09.700741247 +0000 UTC m=+3086.882257469" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.717766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.729776 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.733649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ecb394-e4de-45ab-a12e-710faf510a48-logs\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.733738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.733792 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-config-data\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.733863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqpbb\" (UniqueName: \"kubernetes.io/projected/75ecb394-e4de-45ab-a12e-710faf510a48-kube-api-access-vqpbb\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.733982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.737034 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.738381 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.740800 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.740988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.741347 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.741912 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.741897197 podStartE2EDuration="2.741897197s" podCreationTimestamp="2026-01-21 15:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:09.682709379 +0000 UTC m=+3086.864225602" watchObservedRunningTime="2026-01-21 15:53:09.741897197 +0000 UTC m=+3086.923413419" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.749961 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.835614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpwtr\" (UniqueName: \"kubernetes.io/projected/32c0a029-325a-41e0-91b1-d187b6e036e3-kube-api-access-zpwtr\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.835999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836049 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c0a029-325a-41e0-91b1-d187b6e036e3-logs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-config-data\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ecb394-e4de-45ab-a12e-710faf510a48-logs\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-public-tls-certs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.836401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-config-data\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.837185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ecb394-e4de-45ab-a12e-710faf510a48-logs\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.837432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqpbb\" (UniqueName: 
\"kubernetes.io/projected/75ecb394-e4de-45ab-a12e-710faf510a48-kube-api-access-vqpbb\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.840695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.841610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-config-data\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.851707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.852576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqpbb\" (UniqueName: \"kubernetes.io/projected/75ecb394-e4de-45ab-a12e-710faf510a48-kube-api-access-vqpbb\") pod \"nova-metadata-0\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.938878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpwtr\" (UniqueName: \"kubernetes.io/projected/32c0a029-325a-41e0-91b1-d187b6e036e3-kube-api-access-zpwtr\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.938956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c0a029-325a-41e0-91b1-d187b6e036e3-logs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.938980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-config-data\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.938995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-public-tls-certs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.939028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc 
kubenswrapper[4707]: I0121 15:53:09.939069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.939986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c0a029-325a-41e0-91b1-d187b6e036e3-logs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.942567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-public-tls-certs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.942667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.943782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.944108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-config-data\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.945309 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.945345 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:53:09 crc kubenswrapper[4707]: I0121 15:53:09.953763 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpwtr\" (UniqueName: \"kubernetes.io/projected/32c0a029-325a-41e0-91b1-d187b6e036e3-kube-api-access-zpwtr\") pod \"nova-api-0\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:10 crc kubenswrapper[4707]: I0121 15:53:10.000773 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:10 crc kubenswrapper[4707]: I0121 15:53:10.051198 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:10 crc kubenswrapper[4707]: I0121 15:53:10.392011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:10 crc kubenswrapper[4707]: I0121 15:53:10.488187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:10 crc kubenswrapper[4707]: W0121 15:53:10.503312 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32c0a029_325a_41e0_91b1_d187b6e036e3.slice/crio-aaa70e729a10f3d1c98322778a3f05983951c04a1881a6ccaa09f090bf596fa4 WatchSource:0}: Error finding container aaa70e729a10f3d1c98322778a3f05983951c04a1881a6ccaa09f090bf596fa4: Status 404 returned error can't find the container with id aaa70e729a10f3d1c98322778a3f05983951c04a1881a6ccaa09f090bf596fa4 Jan 21 15:53:10 crc kubenswrapper[4707]: I0121 15:53:10.637178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"75ecb394-e4de-45ab-a12e-710faf510a48","Type":"ContainerStarted","Data":"048320bc1683c4c19d1e5a5a0575d0ac093127e53a60041b1310294c3aa50555"} Jan 21 15:53:10 crc kubenswrapper[4707]: I0121 15:53:10.640235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"32c0a029-325a-41e0-91b1-d187b6e036e3","Type":"ContainerStarted","Data":"aaa70e729a10f3d1c98322778a3f05983951c04a1881a6ccaa09f090bf596fa4"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.182607 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.190475 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba711d4-865f-4d09-892f-30c9b4f06a16" path="/var/lib/kubelet/pods/8ba711d4-865f-4d09-892f-30c9b4f06a16/volumes" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.191066 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab6df059-8aa3-40c8-a65f-17e2fa8de97b" path="/var/lib/kubelet/pods/ab6df059-8aa3-40c8-a65f-17e2fa8de97b/volumes" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.266567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-run-httpd\") pod \"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.266769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-scripts\") pod \"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.266801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-combined-ca-bundle\") pod \"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.266852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-log-httpd\") pod 
\"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.266931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-config-data\") pod \"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.267328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-sg-core-conf-yaml\") pod \"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.267376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6zj\" (UniqueName: \"kubernetes.io/projected/653100f7-1a4f-46bd-8438-c1415bc357a0-kube-api-access-rn6zj\") pod \"653100f7-1a4f-46bd-8438-c1415bc357a0\" (UID: \"653100f7-1a4f-46bd-8438-c1415bc357a0\") " Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.267581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.268146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.269768 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.270141 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/653100f7-1a4f-46bd-8438-c1415bc357a0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.271178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-scripts" (OuterVolumeSpecName: "scripts") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.272423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653100f7-1a4f-46bd-8438-c1415bc357a0-kube-api-access-rn6zj" (OuterVolumeSpecName: "kube-api-access-rn6zj") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "kube-api-access-rn6zj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.287605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.319778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.327289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-config-data" (OuterVolumeSpecName: "config-data") pod "653100f7-1a4f-46bd-8438-c1415bc357a0" (UID: "653100f7-1a4f-46bd-8438-c1415bc357a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.372001 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.372033 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.372050 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6zj\" (UniqueName: \"kubernetes.io/projected/653100f7-1a4f-46bd-8438-c1415bc357a0-kube-api-access-rn6zj\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.372059 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.372068 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653100f7-1a4f-46bd-8438-c1415bc357a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.655517 4707 generic.go:334] "Generic (PLEG): container finished" podID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerID="8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360" exitCode=0 Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.655588 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.655587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerDied","Data":"8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.655686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"653100f7-1a4f-46bd-8438-c1415bc357a0","Type":"ContainerDied","Data":"233cf3f5787af714cace1bcc125deea23a82d0e9e2559dcd32151877d130ff1e"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.655707 4707 scope.go:117] "RemoveContainer" containerID="00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.657880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"32c0a029-325a-41e0-91b1-d187b6e036e3","Type":"ContainerStarted","Data":"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.657906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"32c0a029-325a-41e0-91b1-d187b6e036e3","Type":"ContainerStarted","Data":"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.660079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"75ecb394-e4de-45ab-a12e-710faf510a48","Type":"ContainerStarted","Data":"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.660123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"75ecb394-e4de-45ab-a12e-710faf510a48","Type":"ContainerStarted","Data":"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.662296 4707 generic.go:334] "Generic (PLEG): container finished" podID="62d12d33-e904-4139-83d0-23eb2bb74b23" containerID="7728de97aa2f6f9a13c08cc05716ea056bddb1521fae5153be06fdfe1e028f81" exitCode=0 Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.662334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" event={"ID":"62d12d33-e904-4139-83d0-23eb2bb74b23","Type":"ContainerDied","Data":"7728de97aa2f6f9a13c08cc05716ea056bddb1521fae5153be06fdfe1e028f81"} Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.679930 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.679915228 podStartE2EDuration="2.679915228s" podCreationTimestamp="2026-01-21 15:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:11.674369158 +0000 UTC m=+3088.855885381" watchObservedRunningTime="2026-01-21 15:53:11.679915228 +0000 UTC m=+3088.861431450" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.687066 4707 scope.go:117] "RemoveContainer" containerID="a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.704401 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.711893 4707 scope.go:117] "RemoveContainer" containerID="8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.721128 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.726876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.727303 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="proxy-httpd" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727323 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="proxy-httpd" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.727337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-notification-agent" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-notification-agent" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.727358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="sg-core" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="sg-core" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.727379 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-central-agent" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727385 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-central-agent" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727579 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-central-agent" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727601 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="sg-core" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727615 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="ceilometer-notification-agent" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.727623 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" containerName="proxy-httpd" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.729193 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.730714 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.732206 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.732243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.733025 4707 scope.go:117] "RemoveContainer" containerID="d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.737221 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.739771 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.741782 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.74176485 podStartE2EDuration="2.74176485s" podCreationTimestamp="2026-01-21 15:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:11.715687384 +0000 UTC m=+3088.897203596" watchObservedRunningTime="2026-01-21 15:53:11.74176485 +0000 UTC m=+3088.923281073" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.742324 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.746964 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.747029 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.755687 4707 scope.go:117] "RemoveContainer" containerID="00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.756322 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b\": container with ID starting 
with 00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b not found: ID does not exist" containerID="00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.756358 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b"} err="failed to get container status \"00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b\": rpc error: code = NotFound desc = could not find container \"00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b\": container with ID starting with 00255a19f16367307177af2be484c2435ad7e107cd78e27dcab0fd604ec4527b not found: ID does not exist" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.756380 4707 scope.go:117] "RemoveContainer" containerID="a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.756637 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58\": container with ID starting with a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58 not found: ID does not exist" containerID="a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.756669 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58"} err="failed to get container status \"a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58\": rpc error: code = NotFound desc = could not find container \"a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58\": container with ID starting with a84dbb2f193f13282b4ecd6ca7874eff887266ce905a624173377b967a643f58 not found: ID does not exist" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.756689 4707 scope.go:117] "RemoveContainer" containerID="8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.757091 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360\": container with ID starting with 8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360 not found: ID does not exist" containerID="8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.757376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360"} err="failed to get container status \"8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360\": rpc error: code = NotFound desc = could not find container \"8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360\": container with ID starting with 8cb265a3e9cab420a2a0d13302063924bff86e620f6c6230f0855e3a6ae02360 not found: ID does not exist" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.757390 4707 scope.go:117] "RemoveContainer" containerID="d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6" Jan 21 15:53:11 crc kubenswrapper[4707]: E0121 15:53:11.757772 4707 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6\": container with ID starting with d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6 not found: ID does not exist" containerID="d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.757844 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6"} err="failed to get container status \"d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6\": rpc error: code = NotFound desc = could not find container \"d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6\": container with ID starting with d5d1e40013134e1c64078f2d826cff1cac1da3f5b9f9596577e12e65e93efaa6 not found: ID does not exist" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.779878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.779991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.780023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xpzg\" (UniqueName: \"kubernetes.io/projected/0cc92b42-4a79-43c3-b749-ae1058708648-kube-api-access-7xpzg\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.780061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-run-httpd\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.780094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.780146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-scripts\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.780235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-config-data\") pod \"ceilometer-0\" 
(UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.780316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-log-httpd\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-run-httpd\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-scripts\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-config-data\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-log-httpd\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xpzg\" (UniqueName: \"kubernetes.io/projected/0cc92b42-4a79-43c3-b749-ae1058708648-kube-api-access-7xpzg\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.881926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-run-httpd\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.882405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-log-httpd\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.884951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-scripts\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.885047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.886240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.886821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-config-data\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.888137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.896025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xpzg\" (UniqueName: \"kubernetes.io/projected/0cc92b42-4a79-43c3-b749-ae1058708648-kube-api-access-7xpzg\") pod \"ceilometer-0\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:11 crc kubenswrapper[4707]: I0121 15:53:11.929235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:12 crc kubenswrapper[4707]: I0121 15:53:12.041781 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:12 crc kubenswrapper[4707]: I0121 15:53:12.426712 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:53:12 crc kubenswrapper[4707]: W0121 15:53:12.430182 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc92b42_4a79_43c3_b749_ae1058708648.slice/crio-506b4298a18501f47309f69d137ff4ce875043a88d9c86d70be89ab71815d5d6 WatchSource:0}: Error finding container 506b4298a18501f47309f69d137ff4ce875043a88d9c86d70be89ab71815d5d6: Status 404 returned error can't find the container with id 506b4298a18501f47309f69d137ff4ce875043a88d9c86d70be89ab71815d5d6 Jan 21 15:53:12 crc kubenswrapper[4707]: I0121 15:53:12.669766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerStarted","Data":"506b4298a18501f47309f69d137ff4ce875043a88d9c86d70be89ab71815d5d6"} Jan 21 15:53:12 crc kubenswrapper[4707]: I0121 15:53:12.934600 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.003600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-scripts\") pod \"62d12d33-e904-4139-83d0-23eb2bb74b23\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.003820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-config-data\") pod \"62d12d33-e904-4139-83d0-23eb2bb74b23\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.003848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-combined-ca-bundle\") pod \"62d12d33-e904-4139-83d0-23eb2bb74b23\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.003897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nck7f\" (UniqueName: \"kubernetes.io/projected/62d12d33-e904-4139-83d0-23eb2bb74b23-kube-api-access-nck7f\") pod \"62d12d33-e904-4139-83d0-23eb2bb74b23\" (UID: \"62d12d33-e904-4139-83d0-23eb2bb74b23\") " Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.005740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-scripts" (OuterVolumeSpecName: "scripts") pod "62d12d33-e904-4139-83d0-23eb2bb74b23" (UID: "62d12d33-e904-4139-83d0-23eb2bb74b23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.006388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d12d33-e904-4139-83d0-23eb2bb74b23-kube-api-access-nck7f" (OuterVolumeSpecName: "kube-api-access-nck7f") pod "62d12d33-e904-4139-83d0-23eb2bb74b23" (UID: "62d12d33-e904-4139-83d0-23eb2bb74b23"). InnerVolumeSpecName "kube-api-access-nck7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.019509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.025384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62d12d33-e904-4139-83d0-23eb2bb74b23" (UID: "62d12d33-e904-4139-83d0-23eb2bb74b23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.027134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-config-data" (OuterVolumeSpecName: "config-data") pod "62d12d33-e904-4139-83d0-23eb2bb74b23" (UID: "62d12d33-e904-4139-83d0-23eb2bb74b23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.105128 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.105305 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.105316 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nck7f\" (UniqueName: \"kubernetes.io/projected/62d12d33-e904-4139-83d0-23eb2bb74b23-kube-api-access-nck7f\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.105324 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62d12d33-e904-4139-83d0-23eb2bb74b23-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.191211 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653100f7-1a4f-46bd-8438-c1415bc357a0" path="/var/lib/kubelet/pods/653100f7-1a4f-46bd-8438-c1415bc357a0/volumes" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.678572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerStarted","Data":"fe8a3919099032fd63ab5edd1340fda4118650819c31c39020c309ee4731c4b3"} Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.679693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" event={"ID":"62d12d33-e904-4139-83d0-23eb2bb74b23","Type":"ContainerDied","Data":"caae9bc0546c9e1825d4e9c760493332176a86bfd8d17989e5fe409312df96ec"} Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.679711 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caae9bc0546c9e1825d4e9c760493332176a86bfd8d17989e5fe409312df96ec" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.679769 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f" Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.840745 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.841017 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-log" containerID="cri-o://b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292" gracePeriod=30 Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.841075 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-api" containerID="cri-o://a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74" gracePeriod=30 Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.850897 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.851098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="e6f557ac-0575-4870-b132-6d3a727f1904" containerName="nova-scheduler-scheduler" containerID="cri-o://697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed" gracePeriod=30 Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.879120 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.879396 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-metadata" containerID="cri-o://299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10" gracePeriod=30 Jan 21 15:53:13 crc kubenswrapper[4707]: I0121 15:53:13.879578 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-log" containerID="cri-o://3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5" gracePeriod=30 Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.285313 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.331566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpwtr\" (UniqueName: \"kubernetes.io/projected/32c0a029-325a-41e0-91b1-d187b6e036e3-kube-api-access-zpwtr\") pod \"32c0a029-325a-41e0-91b1-d187b6e036e3\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.331673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c0a029-325a-41e0-91b1-d187b6e036e3-logs\") pod \"32c0a029-325a-41e0-91b1-d187b6e036e3\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.331709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-config-data\") pod \"32c0a029-325a-41e0-91b1-d187b6e036e3\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.331747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-public-tls-certs\") pod \"32c0a029-325a-41e0-91b1-d187b6e036e3\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.331779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-internal-tls-certs\") pod \"32c0a029-325a-41e0-91b1-d187b6e036e3\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.331870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-combined-ca-bundle\") pod \"32c0a029-325a-41e0-91b1-d187b6e036e3\" (UID: \"32c0a029-325a-41e0-91b1-d187b6e036e3\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.332104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c0a029-325a-41e0-91b1-d187b6e036e3-logs" (OuterVolumeSpecName: "logs") pod "32c0a029-325a-41e0-91b1-d187b6e036e3" (UID: "32c0a029-325a-41e0-91b1-d187b6e036e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.332583 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c0a029-325a-41e0-91b1-d187b6e036e3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.333186 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.341982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c0a029-325a-41e0-91b1-d187b6e036e3-kube-api-access-zpwtr" (OuterVolumeSpecName: "kube-api-access-zpwtr") pod "32c0a029-325a-41e0-91b1-d187b6e036e3" (UID: "32c0a029-325a-41e0-91b1-d187b6e036e3"). InnerVolumeSpecName "kube-api-access-zpwtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.357857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-config-data" (OuterVolumeSpecName: "config-data") pod "32c0a029-325a-41e0-91b1-d187b6e036e3" (UID: "32c0a029-325a-41e0-91b1-d187b6e036e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.367948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32c0a029-325a-41e0-91b1-d187b6e036e3" (UID: "32c0a029-325a-41e0-91b1-d187b6e036e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.386324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32c0a029-325a-41e0-91b1-d187b6e036e3" (UID: "32c0a029-325a-41e0-91b1-d187b6e036e3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.388684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "32c0a029-325a-41e0-91b1-d187b6e036e3" (UID: "32c0a029-325a-41e0-91b1-d187b6e036e3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.459528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqpbb\" (UniqueName: \"kubernetes.io/projected/75ecb394-e4de-45ab-a12e-710faf510a48-kube-api-access-vqpbb\") pod \"75ecb394-e4de-45ab-a12e-710faf510a48\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.459601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ecb394-e4de-45ab-a12e-710faf510a48-logs\") pod \"75ecb394-e4de-45ab-a12e-710faf510a48\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.459716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-nova-metadata-tls-certs\") pod \"75ecb394-e4de-45ab-a12e-710faf510a48\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.459737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-combined-ca-bundle\") pod \"75ecb394-e4de-45ab-a12e-710faf510a48\" (UID: \"75ecb394-e4de-45ab-a12e-710faf510a48\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.459786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-config-data\") pod \"75ecb394-e4de-45ab-a12e-710faf510a48\" (UID: 
\"75ecb394-e4de-45ab-a12e-710faf510a48\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75ecb394-e4de-45ab-a12e-710faf510a48-logs" (OuterVolumeSpecName: "logs") pod "75ecb394-e4de-45ab-a12e-710faf510a48" (UID: "75ecb394-e4de-45ab-a12e-710faf510a48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460280 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpwtr\" (UniqueName: \"kubernetes.io/projected/32c0a029-325a-41e0-91b1-d187b6e036e3-kube-api-access-zpwtr\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460297 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ecb394-e4de-45ab-a12e-710faf510a48-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460306 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460315 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460324 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.460332 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c0a029-325a-41e0-91b1-d187b6e036e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.476295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ecb394-e4de-45ab-a12e-710faf510a48-kube-api-access-vqpbb" (OuterVolumeSpecName: "kube-api-access-vqpbb") pod "75ecb394-e4de-45ab-a12e-710faf510a48" (UID: "75ecb394-e4de-45ab-a12e-710faf510a48"). InnerVolumeSpecName "kube-api-access-vqpbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.495939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-config-data" (OuterVolumeSpecName: "config-data") pod "75ecb394-e4de-45ab-a12e-710faf510a48" (UID: "75ecb394-e4de-45ab-a12e-710faf510a48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.497798 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75ecb394-e4de-45ab-a12e-710faf510a48" (UID: "75ecb394-e4de-45ab-a12e-710faf510a48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.523923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "75ecb394-e4de-45ab-a12e-710faf510a48" (UID: "75ecb394-e4de-45ab-a12e-710faf510a48"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.561475 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.561699 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.561710 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ecb394-e4de-45ab-a12e-710faf510a48-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.561719 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqpbb\" (UniqueName: \"kubernetes.io/projected/75ecb394-e4de-45ab-a12e-710faf510a48-kube-api-access-vqpbb\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.596447 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.662392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjw5q\" (UniqueName: \"kubernetes.io/projected/e6f557ac-0575-4870-b132-6d3a727f1904-kube-api-access-jjw5q\") pod \"e6f557ac-0575-4870-b132-6d3a727f1904\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.662519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-combined-ca-bundle\") pod \"e6f557ac-0575-4870-b132-6d3a727f1904\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.662604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-config-data\") pod \"e6f557ac-0575-4870-b132-6d3a727f1904\" (UID: \"e6f557ac-0575-4870-b132-6d3a727f1904\") " Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.665716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f557ac-0575-4870-b132-6d3a727f1904-kube-api-access-jjw5q" (OuterVolumeSpecName: "kube-api-access-jjw5q") pod "e6f557ac-0575-4870-b132-6d3a727f1904" (UID: "e6f557ac-0575-4870-b132-6d3a727f1904"). InnerVolumeSpecName "kube-api-access-jjw5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.683350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-config-data" (OuterVolumeSpecName: "config-data") pod "e6f557ac-0575-4870-b132-6d3a727f1904" (UID: "e6f557ac-0575-4870-b132-6d3a727f1904"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.686041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6f557ac-0575-4870-b132-6d3a727f1904" (UID: "e6f557ac-0575-4870-b132-6d3a727f1904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688094 4707 generic.go:334] "Generic (PLEG): container finished" podID="75ecb394-e4de-45ab-a12e-710faf510a48" containerID="299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10" exitCode=0 Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688116 4707 generic.go:334] "Generic (PLEG): container finished" podID="75ecb394-e4de-45ab-a12e-710faf510a48" containerID="3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5" exitCode=143 Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"75ecb394-e4de-45ab-a12e-710faf510a48","Type":"ContainerDied","Data":"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688174 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688190 4707 scope.go:117] "RemoveContainer" containerID="299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"75ecb394-e4de-45ab-a12e-710faf510a48","Type":"ContainerDied","Data":"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.688368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"75ecb394-e4de-45ab-a12e-710faf510a48","Type":"ContainerDied","Data":"048320bc1683c4c19d1e5a5a0575d0ac093127e53a60041b1310294c3aa50555"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.692946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerStarted","Data":"a2c938ca62002f651985a2fc05725ae21653687d6af33d6dc52a4597ce78155a"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.692971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerStarted","Data":"f9fc1facbf6e09c6961b88dbfa1c24da8c02b646a2a6ff48e5682b747b9e2521"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.694003 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6f557ac-0575-4870-b132-6d3a727f1904" containerID="697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed" exitCode=0 Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.694041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e6f557ac-0575-4870-b132-6d3a727f1904","Type":"ContainerDied","Data":"697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.694055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e6f557ac-0575-4870-b132-6d3a727f1904","Type":"ContainerDied","Data":"fb85a73ac6884b3f777dc939eb38a311268116b39009994aeebe7c0cfcf00826"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.694103 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.696305 4707 generic.go:334] "Generic (PLEG): container finished" podID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerID="a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74" exitCode=0 Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.696324 4707 generic.go:334] "Generic (PLEG): container finished" podID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerID="b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292" exitCode=143 Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.696337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"32c0a029-325a-41e0-91b1-d187b6e036e3","Type":"ContainerDied","Data":"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.696351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"32c0a029-325a-41e0-91b1-d187b6e036e3","Type":"ContainerDied","Data":"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.696360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"32c0a029-325a-41e0-91b1-d187b6e036e3","Type":"ContainerDied","Data":"aaa70e729a10f3d1c98322778a3f05983951c04a1881a6ccaa09f090bf596fa4"} Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.696399 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.737942 4707 scope.go:117] "RemoveContainer" containerID="3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.742308 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.764821 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjw5q\" (UniqueName: \"kubernetes.io/projected/e6f557ac-0575-4870-b132-6d3a727f1904-kube-api-access-jjw5q\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.764849 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.764860 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f557ac-0575-4870-b132-6d3a727f1904-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.771956 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.775160 4707 scope.go:117] "RemoveContainer" containerID="299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.775558 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10\": container with ID starting with 299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10 not found: ID does not 
exist" containerID="299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.775588 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10"} err="failed to get container status \"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10\": rpc error: code = NotFound desc = could not find container \"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10\": container with ID starting with 299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.775607 4707 scope.go:117] "RemoveContainer" containerID="3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.776190 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5\": container with ID starting with 3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5 not found: ID does not exist" containerID="3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.776247 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5"} err="failed to get container status \"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5\": rpc error: code = NotFound desc = could not find container \"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5\": container with ID starting with 3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.776302 4707 scope.go:117] "RemoveContainer" containerID="299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.776571 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10"} err="failed to get container status \"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10\": rpc error: code = NotFound desc = could not find container \"299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10\": container with ID starting with 299f73dc9422c2966b78984e6f183b9d1310f64f0c8d96570ebad97ff8b49d10 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.776613 4707 scope.go:117] "RemoveContainer" containerID="3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.776887 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5"} err="failed to get container status \"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5\": rpc error: code = NotFound desc = could not find container \"3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5\": container with ID starting with 3f2ce33e63fa6b9e5c4ce11ed01d48c43a450e72c7e806f356f7a2c941db1be5 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.776907 4707 scope.go:117] 
"RemoveContainer" containerID="697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.787879 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.798531 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.803856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.808685 4707 scope.go:117] "RemoveContainer" containerID="697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.809045 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.812765 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed\": container with ID starting with 697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed not found: ID does not exist" containerID="697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.812797 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed"} err="failed to get container status \"697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed\": rpc error: code = NotFound desc = could not find container \"697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed\": container with ID starting with 697c08f00f52e9f5218a0e572a8bbb07c24bff1ce6ceb6e2b3ce9658903627ed not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.812835 4707 scope.go:117] "RemoveContainer" containerID="a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814515 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.814797 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-metadata" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814825 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-metadata" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.814842 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-api" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-api" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.814859 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d12d33-e904-4139-83d0-23eb2bb74b23" containerName="nova-manage" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814864 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d12d33-e904-4139-83d0-23eb2bb74b23" containerName="nova-manage" Jan 21 15:53:14 crc kubenswrapper[4707]: 
E0121 15:53:14.814875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-log" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814880 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-log" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.814895 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f557ac-0575-4870-b132-6d3a727f1904" containerName="nova-scheduler-scheduler" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814900 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f557ac-0575-4870-b132-6d3a727f1904" containerName="nova-scheduler-scheduler" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.814915 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-log" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.814920 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-log" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815056 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-log" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815072 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-api" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815081 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d12d33-e904-4139-83d0-23eb2bb74b23" containerName="nova-manage" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815091 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" containerName="nova-metadata-metadata" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f557ac-0575-4870-b132-6d3a727f1904" containerName="nova-scheduler-scheduler" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815112 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" containerName="nova-api-log" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.815856 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.818652 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.818759 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.820270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.822833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.823492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.824683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.833360 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.845228 4707 scope.go:117] "RemoveContainer" containerID="b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.868069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-config-data\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870234 4707 scope.go:117] "RemoveContainer" containerID="a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14715a96-cd73-4b33-8f0b-5adf99e6e503-logs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qjr\" (UniqueName: \"kubernetes.io/projected/14715a96-cd73-4b33-8f0b-5adf99e6e503-kube-api-access-x7qjr\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4l2f\" (UniqueName: \"kubernetes.io/projected/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-kube-api-access-h4l2f\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870659 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-public-tls-certs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870730 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.870751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-config-data\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.873685 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.875785 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74\": container with ID starting with a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74 not found: ID does not exist" containerID="a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.875835 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74"} err="failed to get container status \"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74\": rpc error: code = NotFound desc = could not find container \"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74\": container with ID starting with a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.875857 4707 scope.go:117] "RemoveContainer" containerID="b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292" Jan 21 15:53:14 crc kubenswrapper[4707]: E0121 15:53:14.876736 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292\": container with ID starting with b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292 not found: ID does not exist" containerID="b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.876761 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292"} err="failed to get container status \"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292\": rpc error: code = NotFound desc = could not find container \"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292\": container with ID starting with b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.876779 4707 scope.go:117] "RemoveContainer" containerID="a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.877151 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74"} err="failed to get container status \"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74\": rpc error: code = NotFound desc = could not find container \"a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74\": container with ID starting with a0ceb8c3cba959a5ca1002489b5ce282ceae824652bf4c4cd464d1935173fd74 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.877179 4707 scope.go:117] "RemoveContainer" containerID="b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.877195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.877676 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292"} err="failed to get container status \"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292\": rpc error: code = NotFound desc = could not find container \"b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292\": container with ID starting with b2ac6ec052c60490a32dc58dd7a2c947cabf8c9ab0b0372d429ebd9164d77292 not found: ID does not exist" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.879361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.880121 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.882356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.896793 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-config-data\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14715a96-cd73-4b33-8f0b-5adf99e6e503-logs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qjr\" (UniqueName: \"kubernetes.io/projected/14715a96-cd73-4b33-8f0b-5adf99e6e503-kube-api-access-x7qjr\") pod \"nova-api-0\" (UID: 
\"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-config-data\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.979998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4l2f\" (UniqueName: \"kubernetes.io/projected/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-kube-api-access-h4l2f\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfw9b\" (UniqueName: \"kubernetes.io/projected/a271ec4c-613e-47a6-adce-609c2ea014e8-kube-api-access-xfw9b\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-public-tls-certs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14715a96-cd73-4b33-8f0b-5adf99e6e503-logs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a271ec4c-613e-47a6-adce-609c2ea014e8-logs\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980244 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-config-data\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.980321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.983298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-config-data\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.983653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-public-tls-certs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.984243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-config-data\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.985420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.985789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.987768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.993782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qjr\" (UniqueName: \"kubernetes.io/projected/14715a96-cd73-4b33-8f0b-5adf99e6e503-kube-api-access-x7qjr\") pod \"nova-api-0\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:14 crc kubenswrapper[4707]: I0121 15:53:14.994969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4l2f\" (UniqueName: \"kubernetes.io/projected/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-kube-api-access-h4l2f\") pod \"nova-scheduler-0\" (UID: 
\"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.082316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a271ec4c-613e-47a6-adce-609c2ea014e8-logs\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.082379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.082442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.082508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-config-data\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.082533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfw9b\" (UniqueName: \"kubernetes.io/projected/a271ec4c-613e-47a6-adce-609c2ea014e8-kube-api-access-xfw9b\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.083213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a271ec4c-613e-47a6-adce-609c2ea014e8-logs\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.085235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.085463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-config-data\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.089956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.096073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-xfw9b\" (UniqueName: \"kubernetes.io/projected/a271ec4c-613e-47a6-adce-609c2ea014e8-kube-api-access-xfw9b\") pod \"nova-metadata-0\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.135898 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.145174 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.191026 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c0a029-325a-41e0-91b1-d187b6e036e3" path="/var/lib/kubelet/pods/32c0a029-325a-41e0-91b1-d187b6e036e3/volumes" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.191654 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ecb394-e4de-45ab-a12e-710faf510a48" path="/var/lib/kubelet/pods/75ecb394-e4de-45ab-a12e-710faf510a48/volumes" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.192212 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f557ac-0575-4870-b132-6d3a727f1904" path="/var/lib/kubelet/pods/e6f557ac-0575-4870-b132-6d3a727f1904/volumes" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.194918 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.550669 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:15 crc kubenswrapper[4707]: W0121 15:53:15.553581 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb28a6944_58f4_4e92_af1d_fc84b9b2b4a7.slice/crio-2fdc0e3c66c5865f08f6a435b1fa2144c33fe04ac79bf3b0813346f7743f108d WatchSource:0}: Error finding container 2fdc0e3c66c5865f08f6a435b1fa2144c33fe04ac79bf3b0813346f7743f108d: Status 404 returned error can't find the container with id 2fdc0e3c66c5865f08f6a435b1fa2144c33fe04ac79bf3b0813346f7743f108d Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.622050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:15 crc kubenswrapper[4707]: W0121 15:53:15.629223 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14715a96_cd73_4b33_8f0b_5adf99e6e503.slice/crio-0ac26cd01f7fcce4bdd94f39734af2ae9746789961ac2afc908cbf855df27e5b WatchSource:0}: Error finding container 0ac26cd01f7fcce4bdd94f39734af2ae9746789961ac2afc908cbf855df27e5b: Status 404 returned error can't find the container with id 0ac26cd01f7fcce4bdd94f39734af2ae9746789961ac2afc908cbf855df27e5b Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.683397 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:15 crc kubenswrapper[4707]: W0121 15:53:15.689714 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda271ec4c_613e_47a6_adce_609c2ea014e8.slice/crio-20905c7f05674d83d00159accfd22d95af2d2a7506aa3f3e4b637b96e1887509 WatchSource:0}: Error finding container 20905c7f05674d83d00159accfd22d95af2d2a7506aa3f3e4b637b96e1887509: Status 404 returned 
error can't find the container with id 20905c7f05674d83d00159accfd22d95af2d2a7506aa3f3e4b637b96e1887509 Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.704374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7","Type":"ContainerStarted","Data":"2fdc0e3c66c5865f08f6a435b1fa2144c33fe04ac79bf3b0813346f7743f108d"} Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.708804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a271ec4c-613e-47a6-adce-609c2ea014e8","Type":"ContainerStarted","Data":"20905c7f05674d83d00159accfd22d95af2d2a7506aa3f3e4b637b96e1887509"} Jan 21 15:53:15 crc kubenswrapper[4707]: I0121 15:53:15.710151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"14715a96-cd73-4b33-8f0b-5adf99e6e503","Type":"ContainerStarted","Data":"0ac26cd01f7fcce4bdd94f39734af2ae9746789961ac2afc908cbf855df27e5b"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.719921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7","Type":"ContainerStarted","Data":"478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.721713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a271ec4c-613e-47a6-adce-609c2ea014e8","Type":"ContainerStarted","Data":"4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.721751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a271ec4c-613e-47a6-adce-609c2ea014e8","Type":"ContainerStarted","Data":"dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.725900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"14715a96-cd73-4b33-8f0b-5adf99e6e503","Type":"ContainerStarted","Data":"71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.725929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"14715a96-cd73-4b33-8f0b-5adf99e6e503","Type":"ContainerStarted","Data":"c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.728549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerStarted","Data":"f8ea85e223922cccc7bd056742a95947de0c06e528f4d7e40d1fc53cd2be4a49"} Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.729054 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:16 crc kubenswrapper[4707]: E0121 15:53:16.740503 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:16 crc kubenswrapper[4707]: E0121 15:53:16.741618 4707 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:16 crc kubenswrapper[4707]: E0121 15:53:16.743512 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:16 crc kubenswrapper[4707]: E0121 15:53:16.743539 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.743800 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.743788743 podStartE2EDuration="2.743788743s" podCreationTimestamp="2026-01-21 15:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:16.735593492 +0000 UTC m=+3093.917109713" watchObservedRunningTime="2026-01-21 15:53:16.743788743 +0000 UTC m=+3093.925304964" Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.753149 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.136466192 podStartE2EDuration="5.753132613s" podCreationTimestamp="2026-01-21 15:53:11 +0000 UTC" firstStartedPulling="2026-01-21 15:53:12.432616836 +0000 UTC m=+3089.614133058" lastFinishedPulling="2026-01-21 15:53:16.049283257 +0000 UTC m=+3093.230799479" observedRunningTime="2026-01-21 15:53:16.749279477 +0000 UTC m=+3093.930795700" watchObservedRunningTime="2026-01-21 15:53:16.753132613 +0000 UTC m=+3093.934648835" Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.770976 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.770961709 podStartE2EDuration="2.770961709s" podCreationTimestamp="2026-01-21 15:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:16.765066203 +0000 UTC m=+3093.946582425" watchObservedRunningTime="2026-01-21 15:53:16.770961709 +0000 UTC m=+3093.952477931" Jan 21 15:53:16 crc kubenswrapper[4707]: I0121 15:53:16.787436 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.787412093 podStartE2EDuration="2.787412093s" podCreationTimestamp="2026-01-21 15:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:16.776396628 +0000 UTC m=+3093.957912851" watchObservedRunningTime="2026-01-21 15:53:16.787412093 +0000 UTC m=+3093.968928314" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.016630 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.018447 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.031630 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.755194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.870542 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6"] Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.875797 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zqpd6"] Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.949394 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t"] Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.950461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.953502 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.953801 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 15:53:18 crc kubenswrapper[4707]: I0121 15:53:18.958020 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t"] Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.044663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.045368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t49m8\" (UniqueName: \"kubernetes.io/projected/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-kube-api-access-t49m8\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.045524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-scripts\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.045715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-config-data\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " 
pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.147733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-config-data\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.147869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.147966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t49m8\" (UniqueName: \"kubernetes.io/projected/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-kube-api-access-t49m8\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.148026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-scripts\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.152307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-scripts\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.152733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-config-data\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.153094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.173634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t49m8\" (UniqueName: \"kubernetes.io/projected/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-kube-api-access-t49m8\") pod \"nova-cell1-cell-mapping-t5t9t\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.193118 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5" path="/var/lib/kubelet/pods/1e076ba2-f848-4fa2-b6f7-ee0e7f5fd8d5/volumes" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.268228 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.654530 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t"] Jan 21 15:53:19 crc kubenswrapper[4707]: W0121 15:53:19.659922 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26cb77e1_6ec3_4c79_be62_1e4e07eaa1a4.slice/crio-b0bc8279c8214d7a369301cda1f95497768f9207b2263504c1b771d1537e2080 WatchSource:0}: Error finding container b0bc8279c8214d7a369301cda1f95497768f9207b2263504c1b771d1537e2080: Status 404 returned error can't find the container with id b0bc8279c8214d7a369301cda1f95497768f9207b2263504c1b771d1537e2080 Jan 21 15:53:19 crc kubenswrapper[4707]: I0121 15:53:19.749603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" event={"ID":"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4","Type":"ContainerStarted","Data":"b0bc8279c8214d7a369301cda1f95497768f9207b2263504c1b771d1537e2080"} Jan 21 15:53:20 crc kubenswrapper[4707]: I0121 15:53:20.145940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:20 crc kubenswrapper[4707]: I0121 15:53:20.195958 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:20 crc kubenswrapper[4707]: I0121 15:53:20.196109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:20 crc kubenswrapper[4707]: I0121 15:53:20.757500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" event={"ID":"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4","Type":"ContainerStarted","Data":"d18981c375733008aca93a5d12ff6972ca95d89488d268551481a903f9faee35"} Jan 21 15:53:20 crc kubenswrapper[4707]: I0121 15:53:20.773993 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" podStartSLOduration=2.773976974 podStartE2EDuration="2.773976974s" podCreationTimestamp="2026-01-21 15:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:20.7731771 +0000 UTC m=+3097.954693322" watchObservedRunningTime="2026-01-21 15:53:20.773976974 +0000 UTC m=+3097.955493196" Jan 21 15:53:21 crc kubenswrapper[4707]: E0121 15:53:21.740325 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:21 crc kubenswrapper[4707]: E0121 15:53:21.741759 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:21 crc kubenswrapper[4707]: E0121 15:53:21.742853 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:21 crc kubenswrapper[4707]: E0121 15:53:21.742896 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:23 crc kubenswrapper[4707]: I0121 15:53:23.780005 4707 generic.go:334] "Generic (PLEG): container finished" podID="26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" containerID="d18981c375733008aca93a5d12ff6972ca95d89488d268551481a903f9faee35" exitCode=0 Jan 21 15:53:23 crc kubenswrapper[4707]: I0121 15:53:23.780086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" event={"ID":"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4","Type":"ContainerDied","Data":"d18981c375733008aca93a5d12ff6972ca95d89488d268551481a903f9faee35"} Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.065765 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.136926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.136961 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.145838 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.146018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-scripts\") pod \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.146062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-combined-ca-bundle\") pod \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.146087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t49m8\" (UniqueName: \"kubernetes.io/projected/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-kube-api-access-t49m8\") pod \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.146239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-config-data\") pod \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\" (UID: \"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4\") " Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.150903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-kube-api-access-t49m8" (OuterVolumeSpecName: "kube-api-access-t49m8") pod "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" (UID: "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4"). InnerVolumeSpecName "kube-api-access-t49m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.152892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-scripts" (OuterVolumeSpecName: "scripts") pod "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" (UID: "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.165868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" (UID: "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.166621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-config-data" (OuterVolumeSpecName: "config-data") pod "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" (UID: "26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.167338 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.195528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.195576 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.248213 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.248247 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.248270 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t49m8\" (UniqueName: \"kubernetes.io/projected/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-kube-api-access-t49m8\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.248279 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.796483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" event={"ID":"26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4","Type":"ContainerDied","Data":"b0bc8279c8214d7a369301cda1f95497768f9207b2263504c1b771d1537e2080"} Jan 
21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.796758 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0bc8279c8214d7a369301cda1f95497768f9207b2263504c1b771d1537e2080" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.796786 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.821288 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.957226 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.957515 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-log" containerID="cri-o://c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912" gracePeriod=30 Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.957668 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-api" containerID="cri-o://71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec" gracePeriod=30 Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.962496 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.97:8774/\": EOF" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.962643 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.97:8774/\": EOF" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.979378 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.979591 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-log" containerID="cri-o://dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7" gracePeriod=30 Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.979653 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-metadata" containerID="cri-o://4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468" gracePeriod=30 Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.990269 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.100:8775/\": EOF" Jan 21 15:53:25 crc kubenswrapper[4707]: I0121 15:53:25.997563 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-log" probeResult="failure" 
output="Get \"https://10.217.0.100:8775/\": EOF" Jan 21 15:53:26 crc kubenswrapper[4707]: I0121 15:53:26.214901 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:26 crc kubenswrapper[4707]: E0121 15:53:26.740314 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:26 crc kubenswrapper[4707]: E0121 15:53:26.741511 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:26 crc kubenswrapper[4707]: E0121 15:53:26.746307 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:26 crc kubenswrapper[4707]: E0121 15:53:26.746367 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:26 crc kubenswrapper[4707]: I0121 15:53:26.804463 4707 generic.go:334] "Generic (PLEG): container finished" podID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerID="c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912" exitCode=143 Jan 21 15:53:26 crc kubenswrapper[4707]: I0121 15:53:26.804543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"14715a96-cd73-4b33-8f0b-5adf99e6e503","Type":"ContainerDied","Data":"c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912"} Jan 21 15:53:26 crc kubenswrapper[4707]: I0121 15:53:26.806358 4707 generic.go:334] "Generic (PLEG): container finished" podID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerID="dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7" exitCode=143 Jan 21 15:53:26 crc kubenswrapper[4707]: I0121 15:53:26.806428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a271ec4c-613e-47a6-adce-609c2ea014e8","Type":"ContainerDied","Data":"dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7"} Jan 21 15:53:27 crc kubenswrapper[4707]: I0121 15:53:27.813549 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" containerName="nova-scheduler-scheduler" containerID="cri-o://478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b" gracePeriod=30 Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.148014 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.150468 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.151920 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.151973 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" containerName="nova-scheduler-scheduler" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.676077 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.736616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a271ec4c-613e-47a6-adce-609c2ea014e8-logs\") pod \"a271ec4c-613e-47a6-adce-609c2ea014e8\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.736758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-combined-ca-bundle\") pod \"a271ec4c-613e-47a6-adce-609c2ea014e8\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.737011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfw9b\" (UniqueName: \"kubernetes.io/projected/a271ec4c-613e-47a6-adce-609c2ea014e8-kube-api-access-xfw9b\") pod \"a271ec4c-613e-47a6-adce-609c2ea014e8\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.737046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-config-data\") pod \"a271ec4c-613e-47a6-adce-609c2ea014e8\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.737120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-nova-metadata-tls-certs\") pod \"a271ec4c-613e-47a6-adce-609c2ea014e8\" (UID: \"a271ec4c-613e-47a6-adce-609c2ea014e8\") " Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.737376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a271ec4c-613e-47a6-adce-609c2ea014e8-logs" (OuterVolumeSpecName: "logs") pod 
"a271ec4c-613e-47a6-adce-609c2ea014e8" (UID: "a271ec4c-613e-47a6-adce-609c2ea014e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.738037 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a271ec4c-613e-47a6-adce-609c2ea014e8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.745896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a271ec4c-613e-47a6-adce-609c2ea014e8-kube-api-access-xfw9b" (OuterVolumeSpecName: "kube-api-access-xfw9b") pod "a271ec4c-613e-47a6-adce-609c2ea014e8" (UID: "a271ec4c-613e-47a6-adce-609c2ea014e8"). InnerVolumeSpecName "kube-api-access-xfw9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.761330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a271ec4c-613e-47a6-adce-609c2ea014e8" (UID: "a271ec4c-613e-47a6-adce-609c2ea014e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.762551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-config-data" (OuterVolumeSpecName: "config-data") pod "a271ec4c-613e-47a6-adce-609c2ea014e8" (UID: "a271ec4c-613e-47a6-adce-609c2ea014e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.776492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a271ec4c-613e-47a6-adce-609c2ea014e8" (UID: "a271ec4c-613e-47a6-adce-609c2ea014e8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.835559 4707 generic.go:334] "Generic (PLEG): container finished" podID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerID="4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468" exitCode=0 Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.835604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a271ec4c-613e-47a6-adce-609c2ea014e8","Type":"ContainerDied","Data":"4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468"} Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.835656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a271ec4c-613e-47a6-adce-609c2ea014e8","Type":"ContainerDied","Data":"20905c7f05674d83d00159accfd22d95af2d2a7506aa3f3e4b637b96e1887509"} Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.835676 4707 scope.go:117] "RemoveContainer" containerID="4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.835966 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.840141 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.840172 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.840184 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfw9b\" (UniqueName: \"kubernetes.io/projected/a271ec4c-613e-47a6-adce-609c2ea014e8-kube-api-access-xfw9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.840194 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a271ec4c-613e-47a6-adce-609c2ea014e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.854700 4707 scope.go:117] "RemoveContainer" containerID="dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.867648 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.875930 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.885786 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.886340 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-metadata" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.886360 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-metadata" Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.886385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-log" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.886391 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-log" Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.886415 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" containerName="nova-manage" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.886421 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" containerName="nova-manage" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.886676 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-metadata" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.886697 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" containerName="nova-metadata-log" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.886939 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" containerName="nova-manage" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.891867 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.893619 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.893687 4707 scope.go:117] "RemoveContainer" containerID="4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.893846 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.894119 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468\": container with ID starting with 4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468 not found: ID does not exist" containerID="4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.894144 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468"} err="failed to get container status \"4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468\": rpc error: code = NotFound desc = could not find container \"4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468\": container with ID starting with 4d5a55db94797a830c5a6c698519683b8a29f0e8601b1316bd2ea93ecc0ce468 not found: ID does not exist" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.894173 4707 scope.go:117] "RemoveContainer" containerID="dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7" Jan 21 15:53:30 crc kubenswrapper[4707]: E0121 15:53:30.894369 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7\": container with ID starting with dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7 not found: ID does not exist" containerID="dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.894386 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7"} err="failed to get container status \"dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7\": rpc error: code = NotFound desc = could not find container \"dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7\": container with ID starting with dd124f7c06ab07f1ee7c2294feedc68f96c8a878b9f27c193e5a072caab786f7 not found: ID does not exist" Jan 21 15:53:30 crc kubenswrapper[4707]: I0121 15:53:30.896662 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.045166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m5wv\" (UniqueName: 
\"kubernetes.io/projected/ed60a6eb-8348-4608-8905-143106b4d545-kube-api-access-7m5wv\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.045238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed60a6eb-8348-4608-8905-143106b4d545-logs\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.045282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.045351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.045724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-config-data\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.147510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.147651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-config-data\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.147722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m5wv\" (UniqueName: \"kubernetes.io/projected/ed60a6eb-8348-4608-8905-143106b4d545-kube-api-access-7m5wv\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.147768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed60a6eb-8348-4608-8905-143106b4d545-logs\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.147789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.148182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed60a6eb-8348-4608-8905-143106b4d545-logs\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.151166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.151205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-config-data\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.151268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.160993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m5wv\" (UniqueName: \"kubernetes.io/projected/ed60a6eb-8348-4608-8905-143106b4d545-kube-api-access-7m5wv\") pod \"nova-metadata-0\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.195238 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a271ec4c-613e-47a6-adce-609c2ea014e8" path="/var/lib/kubelet/pods/a271ec4c-613e-47a6-adce-609c2ea014e8/volumes" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.214278 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.633551 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:53:31 crc kubenswrapper[4707]: W0121 15:53:31.637140 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded60a6eb_8348_4608_8905_143106b4d545.slice/crio-9c5af38e95f7f727a60c29c38dd1566c6d27bfc300d172a47a803fb2f51d7e6a WatchSource:0}: Error finding container 9c5af38e95f7f727a60c29c38dd1566c6d27bfc300d172a47a803fb2f51d7e6a: Status 404 returned error can't find the container with id 9c5af38e95f7f727a60c29c38dd1566c6d27bfc300d172a47a803fb2f51d7e6a Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.687887 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.739697 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.740763 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.742163 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.742198 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.768504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-combined-ca-bundle\") pod \"14715a96-cd73-4b33-8f0b-5adf99e6e503\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.768691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14715a96-cd73-4b33-8f0b-5adf99e6e503-logs\") pod \"14715a96-cd73-4b33-8f0b-5adf99e6e503\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.768749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-internal-tls-certs\") pod \"14715a96-cd73-4b33-8f0b-5adf99e6e503\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.769087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14715a96-cd73-4b33-8f0b-5adf99e6e503-logs" (OuterVolumeSpecName: "logs") pod "14715a96-cd73-4b33-8f0b-5adf99e6e503" (UID: "14715a96-cd73-4b33-8f0b-5adf99e6e503"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.769134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-config-data\") pod \"14715a96-cd73-4b33-8f0b-5adf99e6e503\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.769168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-public-tls-certs\") pod \"14715a96-cd73-4b33-8f0b-5adf99e6e503\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.769219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7qjr\" (UniqueName: \"kubernetes.io/projected/14715a96-cd73-4b33-8f0b-5adf99e6e503-kube-api-access-x7qjr\") pod \"14715a96-cd73-4b33-8f0b-5adf99e6e503\" (UID: \"14715a96-cd73-4b33-8f0b-5adf99e6e503\") " Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.769981 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14715a96-cd73-4b33-8f0b-5adf99e6e503-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.790184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14715a96-cd73-4b33-8f0b-5adf99e6e503-kube-api-access-x7qjr" (OuterVolumeSpecName: "kube-api-access-x7qjr") pod "14715a96-cd73-4b33-8f0b-5adf99e6e503" (UID: "14715a96-cd73-4b33-8f0b-5adf99e6e503"). InnerVolumeSpecName "kube-api-access-x7qjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.790443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-config-data" (OuterVolumeSpecName: "config-data") pod "14715a96-cd73-4b33-8f0b-5adf99e6e503" (UID: "14715a96-cd73-4b33-8f0b-5adf99e6e503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.792025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14715a96-cd73-4b33-8f0b-5adf99e6e503" (UID: "14715a96-cd73-4b33-8f0b-5adf99e6e503"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.808118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "14715a96-cd73-4b33-8f0b-5adf99e6e503" (UID: "14715a96-cd73-4b33-8f0b-5adf99e6e503"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.810532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "14715a96-cd73-4b33-8f0b-5adf99e6e503" (UID: "14715a96-cd73-4b33-8f0b-5adf99e6e503"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.846566 4707 generic.go:334] "Generic (PLEG): container finished" podID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerID="71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec" exitCode=0 Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.846617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"14715a96-cd73-4b33-8f0b-5adf99e6e503","Type":"ContainerDied","Data":"71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec"} Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.846641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"14715a96-cd73-4b33-8f0b-5adf99e6e503","Type":"ContainerDied","Data":"0ac26cd01f7fcce4bdd94f39734af2ae9746789961ac2afc908cbf855df27e5b"} Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.846656 4707 scope.go:117] "RemoveContainer" containerID="71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.846754 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.849580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed60a6eb-8348-4608-8905-143106b4d545","Type":"ContainerStarted","Data":"9c5af38e95f7f727a60c29c38dd1566c6d27bfc300d172a47a803fb2f51d7e6a"} Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.851087 4707 generic.go:334] "Generic (PLEG): container finished" podID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" containerID="478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b" exitCode=0 Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.851123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7","Type":"ContainerDied","Data":"478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b"} Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.869220 4707 scope.go:117] "RemoveContainer" containerID="c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.890671 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7qjr\" (UniqueName: \"kubernetes.io/projected/14715a96-cd73-4b33-8f0b-5adf99e6e503-kube-api-access-x7qjr\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.890704 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.890715 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.890729 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.890740 4707 reconciler_common.go:293] "Volume detached for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14715a96-cd73-4b33-8f0b-5adf99e6e503-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.909233 4707 scope.go:117] "RemoveContainer" containerID="71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.910497 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.914430 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec\": container with ID starting with 71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec not found: ID does not exist" containerID="71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.914466 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec"} err="failed to get container status \"71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec\": rpc error: code = NotFound desc = could not find container \"71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec\": container with ID starting with 71295eeadd55c5e3c29e770a242c455f84f734429009d008cefc72ebb1ad21ec not found: ID does not exist" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.914489 4707 scope.go:117] "RemoveContainer" containerID="c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912" Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.914741 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912\": container with ID starting with c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912 not found: ID does not exist" containerID="c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.914769 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912"} err="failed to get container status \"c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912\": rpc error: code = NotFound desc = could not find container \"c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912\": container with ID starting with c01beb19e704fadd1284d766d49031ad7d4773bf4958db6151a945af52b9a912 not found: ID does not exist" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.925580 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.934525 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.934878 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-log" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.934892 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-log" Jan 21 15:53:31 crc kubenswrapper[4707]: E0121 15:53:31.934905 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-api" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.934911 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-api" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.935081 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-log" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.935094 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" containerName="nova-api-api" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.935979 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.937841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.937942 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.938890 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:53:31 crc kubenswrapper[4707]: I0121 15:53:31.944018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.067574 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.093940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef0e4e0-0bae-4886-894a-863f13095f5b-logs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.093988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.094023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.094159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-config-data\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.094230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.094405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhk45\" (UniqueName: \"kubernetes.io/projected/8ef0e4e0-0bae-4886-894a-863f13095f5b-kube-api-access-xhk45\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.195820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4l2f\" (UniqueName: \"kubernetes.io/projected/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-kube-api-access-h4l2f\") pod \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-config-data\") pod \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-combined-ca-bundle\") pod \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\" (UID: \"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7\") " Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef0e4e0-0bae-4886-894a-863f13095f5b-logs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-config-data\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.196794 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhk45\" (UniqueName: \"kubernetes.io/projected/8ef0e4e0-0bae-4886-894a-863f13095f5b-kube-api-access-xhk45\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.197415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef0e4e0-0bae-4886-894a-863f13095f5b-logs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.200281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-kube-api-access-h4l2f" (OuterVolumeSpecName: "kube-api-access-h4l2f") pod "b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" (UID: "b28a6944-58f4-4e92-af1d-fc84b9b2b4a7"). InnerVolumeSpecName "kube-api-access-h4l2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.201105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.204244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.204408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-config-data\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.206156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.214251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhk45\" (UniqueName: \"kubernetes.io/projected/8ef0e4e0-0bae-4886-894a-863f13095f5b-kube-api-access-xhk45\") pod \"nova-api-0\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.231148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" (UID: "b28a6944-58f4-4e92-af1d-fc84b9b2b4a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.233951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-config-data" (OuterVolumeSpecName: "config-data") pod "b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" (UID: "b28a6944-58f4-4e92-af1d-fc84b9b2b4a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.255360 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.298822 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.298846 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.298857 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4l2f\" (UniqueName: \"kubernetes.io/projected/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7-kube-api-access-h4l2f\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.629693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.859925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed60a6eb-8348-4608-8905-143106b4d545","Type":"ContainerStarted","Data":"58e6b3acf0942b299010a6d66e7bf7396f26347a33cd0c773d801e2540add159"} Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.860035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed60a6eb-8348-4608-8905-143106b4d545","Type":"ContainerStarted","Data":"9504c925f64890f6adfbe11ade457b63a59f8d88820cd51e5d1948828eaf36da"} Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.862413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b28a6944-58f4-4e92-af1d-fc84b9b2b4a7","Type":"ContainerDied","Data":"2fdc0e3c66c5865f08f6a435b1fa2144c33fe04ac79bf3b0813346f7743f108d"} Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.862445 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.862457 4707 scope.go:117] "RemoveContainer" containerID="478287f1da7f0a91068fd7dd2e3b198dfceb73e1b298201f352980e7a596321b" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.865976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ef0e4e0-0bae-4886-894a-863f13095f5b","Type":"ContainerStarted","Data":"483d69c30647d452519d9267bee5354dc667b86da55acfe143a2b6d4b7ae08a0"} Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.866071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ef0e4e0-0bae-4886-894a-863f13095f5b","Type":"ContainerStarted","Data":"b8ff0b54881d042be7dd153588eca3fa77a017ec8153dd24b3ee8c576132410b"} Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.901389 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.901374759 podStartE2EDuration="2.901374759s" podCreationTimestamp="2026-01-21 15:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:32.899524109 +0000 UTC m=+3110.081040331" watchObservedRunningTime="2026-01-21 15:53:32.901374759 +0000 UTC m=+3110.082890981" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.923679 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.933491 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.950530 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:32 crc kubenswrapper[4707]: E0121 15:53:32.950859 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" containerName="nova-scheduler-scheduler" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.950877 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" containerName="nova-scheduler-scheduler" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.951045 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" containerName="nova-scheduler-scheduler" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.951550 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.957168 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:53:32 crc kubenswrapper[4707]: I0121 15:53:32.963383 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.128712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-config-data\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.128972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.129105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4ftw\" (UniqueName: \"kubernetes.io/projected/1f72a259-adb4-405e-b6e3-6a39fbe668b8-kube-api-access-x4ftw\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.191542 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14715a96-cd73-4b33-8f0b-5adf99e6e503" path="/var/lib/kubelet/pods/14715a96-cd73-4b33-8f0b-5adf99e6e503/volumes" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.192091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28a6944-58f4-4e92-af1d-fc84b9b2b4a7" path="/var/lib/kubelet/pods/b28a6944-58f4-4e92-af1d-fc84b9b2b4a7/volumes" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.230460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-config-data\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.230496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.230597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4ftw\" (UniqueName: \"kubernetes.io/projected/1f72a259-adb4-405e-b6e3-6a39fbe668b8-kube-api-access-x4ftw\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.233554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-config-data\") pod \"nova-scheduler-0\" (UID: 
\"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.233772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.243030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4ftw\" (UniqueName: \"kubernetes.io/projected/1f72a259-adb4-405e-b6e3-6a39fbe668b8-kube-api-access-x4ftw\") pod \"nova-scheduler-0\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.274781 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.654036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:53:33 crc kubenswrapper[4707]: W0121 15:53:33.659243 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f72a259_adb4_405e_b6e3_6a39fbe668b8.slice/crio-d2e3b2e844dcf0a21b40eaeb1bebe3bdaab48ad234534d1f810110e215ef7a81 WatchSource:0}: Error finding container d2e3b2e844dcf0a21b40eaeb1bebe3bdaab48ad234534d1f810110e215ef7a81: Status 404 returned error can't find the container with id d2e3b2e844dcf0a21b40eaeb1bebe3bdaab48ad234534d1f810110e215ef7a81 Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.874027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"1f72a259-adb4-405e-b6e3-6a39fbe668b8","Type":"ContainerStarted","Data":"7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870"} Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.874071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"1f72a259-adb4-405e-b6e3-6a39fbe668b8","Type":"ContainerStarted","Data":"d2e3b2e844dcf0a21b40eaeb1bebe3bdaab48ad234534d1f810110e215ef7a81"} Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.876465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ef0e4e0-0bae-4886-894a-863f13095f5b","Type":"ContainerStarted","Data":"304a9698f83732cdf07b1c1beb63c8b9696304d817437a60ab659c572ebfb73c"} Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.891118 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.891103276 podStartE2EDuration="1.891103276s" podCreationTimestamp="2026-01-21 15:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:33.886948214 +0000 UTC m=+3111.068464435" watchObservedRunningTime="2026-01-21 15:53:33.891103276 +0000 UTC m=+3111.072619499" Jan 21 15:53:33 crc kubenswrapper[4707]: I0121 15:53:33.906195 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.906177432 podStartE2EDuration="2.906177432s" podCreationTimestamp="2026-01-21 15:53:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:33.902040713 +0000 UTC m=+3111.083556935" watchObservedRunningTime="2026-01-21 15:53:33.906177432 +0000 UTC m=+3111.087693654" Jan 21 15:53:36 crc kubenswrapper[4707]: I0121 15:53:36.215108 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:36 crc kubenswrapper[4707]: I0121 15:53:36.215453 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:36 crc kubenswrapper[4707]: E0121 15:53:36.739934 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:36 crc kubenswrapper[4707]: E0121 15:53:36.741560 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:36 crc kubenswrapper[4707]: E0121 15:53:36.742565 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:53:36 crc kubenswrapper[4707]: E0121 15:53:36.742603 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.275066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.840000 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.910848 4707 generic.go:334] "Generic (PLEG): container finished" podID="735f4b3b-efbd-440b-b617-27b89d61a044" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" exitCode=137 Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.910924 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.910915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"735f4b3b-efbd-440b-b617-27b89d61a044","Type":"ContainerDied","Data":"239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274"} Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.911010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"735f4b3b-efbd-440b-b617-27b89d61a044","Type":"ContainerDied","Data":"c195e4c9c91c37dc4629ae9598b7d352a14c383813fb2a61ee64b268762ee2aa"} Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.911039 4707 scope.go:117] "RemoveContainer" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.929244 4707 scope.go:117] "RemoveContainer" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" Jan 21 15:53:38 crc kubenswrapper[4707]: E0121 15:53:38.929535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274\": container with ID starting with 239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274 not found: ID does not exist" containerID="239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.929565 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274"} err="failed to get container status \"239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274\": rpc error: code = NotFound desc = could not find container \"239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274\": container with ID starting with 239d782878c99b412f8f5a1dae24c70e20732b5662287cf579189f2de5825274 not found: ID does not exist" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.942867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwdbm\" (UniqueName: \"kubernetes.io/projected/735f4b3b-efbd-440b-b617-27b89d61a044-kube-api-access-lwdbm\") pod \"735f4b3b-efbd-440b-b617-27b89d61a044\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.942966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-combined-ca-bundle\") pod \"735f4b3b-efbd-440b-b617-27b89d61a044\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.943112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-config-data\") pod \"735f4b3b-efbd-440b-b617-27b89d61a044\" (UID: \"735f4b3b-efbd-440b-b617-27b89d61a044\") " Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.948830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735f4b3b-efbd-440b-b617-27b89d61a044-kube-api-access-lwdbm" (OuterVolumeSpecName: "kube-api-access-lwdbm") pod "735f4b3b-efbd-440b-b617-27b89d61a044" (UID: 
"735f4b3b-efbd-440b-b617-27b89d61a044"). InnerVolumeSpecName "kube-api-access-lwdbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.966509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-config-data" (OuterVolumeSpecName: "config-data") pod "735f4b3b-efbd-440b-b617-27b89d61a044" (UID: "735f4b3b-efbd-440b-b617-27b89d61a044"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:38 crc kubenswrapper[4707]: I0121 15:53:38.967469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "735f4b3b-efbd-440b-b617-27b89d61a044" (UID: "735f4b3b-efbd-440b-b617-27b89d61a044"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.045614 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwdbm\" (UniqueName: \"kubernetes.io/projected/735f4b3b-efbd-440b-b617-27b89d61a044-kube-api-access-lwdbm\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.045648 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.045659 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735f4b3b-efbd-440b-b617-27b89d61a044-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.228714 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.235433 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.245361 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:53:39 crc kubenswrapper[4707]: E0121 15:53:39.245668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.245695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.245890 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" containerName="nova-cell1-conductor-conductor" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.246397 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.247742 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.248675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.248731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.248754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmm82\" (UniqueName: \"kubernetes.io/projected/0810465b-2f7e-49f5-8d97-c8416940a463-kube-api-access-zmm82\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.254744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.349602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.349659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.349681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmm82\" (UniqueName: \"kubernetes.io/projected/0810465b-2f7e-49f5-8d97-c8416940a463-kube-api-access-zmm82\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.352702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.352757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.363916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmm82\" (UniqueName: \"kubernetes.io/projected/0810465b-2f7e-49f5-8d97-c8416940a463-kube-api-access-zmm82\") pod \"nova-cell1-conductor-0\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.559984 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.925679 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.945648 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.945804 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.945848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.946237 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:53:39 crc kubenswrapper[4707]: I0121 15:53:39.946291 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" gracePeriod=600 Jan 21 15:53:40 crc kubenswrapper[4707]: E0121 15:53:40.060697 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.928865 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" exitCode=0 Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.928930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" 
event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d"} Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.929135 4707 scope.go:117] "RemoveContainer" containerID="34a707de92d001c9de59dbef499d523bf460e9895d2f693f866b85aad1e846c3" Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.929453 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:53:40 crc kubenswrapper[4707]: E0121 15:53:40.929702 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.931992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0810465b-2f7e-49f5-8d97-c8416940a463","Type":"ContainerStarted","Data":"184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3"} Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.932022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0810465b-2f7e-49f5-8d97-c8416940a463","Type":"ContainerStarted","Data":"3619297069a7733c2ff19a715ea08ee6e929df3f698f47759c367dfd5f3f0567"} Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.932054 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:40 crc kubenswrapper[4707]: I0121 15:53:40.968851 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.9688388510000001 podStartE2EDuration="1.968838851s" podCreationTimestamp="2026-01-21 15:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:53:40.960844848 +0000 UTC m=+3118.142361070" watchObservedRunningTime="2026-01-21 15:53:40.968838851 +0000 UTC m=+3118.150355073" Jan 21 15:53:41 crc kubenswrapper[4707]: I0121 15:53:41.193335 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735f4b3b-efbd-440b-b617-27b89d61a044" path="/var/lib/kubelet/pods/735f4b3b-efbd-440b-b617-27b89d61a044/volumes" Jan 21 15:53:41 crc kubenswrapper[4707]: I0121 15:53:41.214901 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:41 crc kubenswrapper[4707]: I0121 15:53:41.214937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:42 crc kubenswrapper[4707]: I0121 15:53:42.047068 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:53:42 crc kubenswrapper[4707]: I0121 15:53:42.225934 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.102:8775/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Jan 21 15:53:42 crc kubenswrapper[4707]: I0121 15:53:42.225959 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.102:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:53:42 crc kubenswrapper[4707]: I0121 15:53:42.256725 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:42 crc kubenswrapper[4707]: I0121 15:53:42.256769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:43 crc kubenswrapper[4707]: I0121 15:53:43.266944 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.103:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:53:43 crc kubenswrapper[4707]: I0121 15:53:43.267372 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.103:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:53:43 crc kubenswrapper[4707]: I0121 15:53:43.275509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:43 crc kubenswrapper[4707]: I0121 15:53:43.295182 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:43 crc kubenswrapper[4707]: I0121 15:53:43.972132 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:53:49 crc kubenswrapper[4707]: I0121 15:53:49.579711 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:53:51 crc kubenswrapper[4707]: I0121 15:53:51.220444 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:51 crc kubenswrapper[4707]: I0121 15:53:51.220682 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:51 crc kubenswrapper[4707]: I0121 15:53:51.224237 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:51 crc kubenswrapper[4707]: I0121 15:53:51.226161 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:53:52 crc kubenswrapper[4707]: I0121 15:53:52.261855 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:52 crc kubenswrapper[4707]: I0121 15:53:52.261992 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:52 crc kubenswrapper[4707]: I0121 15:53:52.262483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:52 crc kubenswrapper[4707]: 
I0121 15:53:52.262503 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:52 crc kubenswrapper[4707]: I0121 15:53:52.267917 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:52 crc kubenswrapper[4707]: I0121 15:53:52.268400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:53:53 crc kubenswrapper[4707]: I0121 15:53:53.190400 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:53:53 crc kubenswrapper[4707]: E0121 15:53:53.190602 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:53:58 crc kubenswrapper[4707]: E0121 15:53:58.773680 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:40750->192.168.25.165:36655: write tcp 192.168.25.165:40750->192.168.25.165:36655: write: connection reset by peer Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.719925 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.720300 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" containerName="openstackclient" containerID="cri-o://66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401" gracePeriod=2 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.733263 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.739431 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.739696 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="openstack-network-exporter" containerID="cri-o://d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24" gracePeriod=300 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.763871 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb"] Jan 21 15:54:02 crc kubenswrapper[4707]: E0121 15:54:02.764279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" containerName="openstackclient" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.764296 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" containerName="openstackclient" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.765971 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" containerName="openstackclient" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.766818 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.772029 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.774863 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.795803 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.804694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.804916 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="ovn-northd" containerID="cri-o://307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.805026 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="openstack-network-exporter" containerID="cri-o://9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.811272 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.826341 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.827378 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.844954 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.845135 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" containerName="nova-scheduler-scheduler" containerID="cri-o://7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.860501 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.860724 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-log" containerID="cri-o://fedf8ec2c1a7c0866b937ba14cc1081426eb67eaeb9d60f31b28228f8b04b03f" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.860913 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-httpd" containerID="cri-o://2d20d49e7b7da58ecb8376a79713efa7bdcae5fb6a7a699d672be0eea1161321" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.876597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.896944 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.897126 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="42924886-d9f3-48df-83a3-f7e27fcf368d" containerName="memcached" containerID="cri-o://7bc461ffcee7909ae3ce1f7eee5798564341473d88bb5e5dc4e8f44fc909fd32" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.908869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data-custom\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.908916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-combined-ca-bundle\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.908942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data-custom\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:02 
crc kubenswrapper[4707]: I0121 15:54:02.909003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-logs\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.909021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-logs\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.909043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqswk\" (UniqueName: \"kubernetes.io/projected/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-kube-api-access-nqswk\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.909074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.909114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flzvf\" (UniqueName: \"kubernetes.io/projected/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-kube-api-access-flzvf\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.909151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.909175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-combined-ca-bundle\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.929839 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.936853 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.948006 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" 
podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-log" containerID="cri-o://9504c925f64890f6adfbe11ade457b63a59f8d88820cd51e5d1948828eaf36da" gracePeriod=30 Jan 21 15:54:02 crc kubenswrapper[4707]: I0121 15:54:02.948161 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-metadata" containerID="cri-o://58e6b3acf0942b299010a6d66e7bf7396f26347a33cd0c773d801e2540add159" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.006350 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.006689 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flzvf\" (UniqueName: \"kubernetes.io/projected/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-kube-api-access-flzvf\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-combined-ca-bundle\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data-custom\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-combined-ca-bundle\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " 
pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.014958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data-custom\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-logs\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-logs\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqswk\" (UniqueName: \"kubernetes.io/projected/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-kube-api-access-nqswk\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmm2j\" (UniqueName: \"kubernetes.io/projected/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-kube-api-access-tmm2j\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-logs\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.015848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-logs\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.036513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data-custom\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.037624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-combined-ca-bundle\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.039409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data-custom\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.044412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-combined-ca-bundle\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.044856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.045437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.062875 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.063079 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-log" 
containerID="cri-o://483d69c30647d452519d9267bee5354dc667b86da55acfe143a2b6d4b7ae08a0" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.063430 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-api" containerID="cri-o://304a9698f83732cdf07b1c1beb63c8b9696304d817437a60ab659c572ebfb73c" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.093632 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.093903 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="cinder-scheduler" containerID="cri-o://fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.094289 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="probe" containerID="cri-o://dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.101649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqswk\" (UniqueName: \"kubernetes.io/projected/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-kube-api-access-nqswk\") pod \"barbican-keystone-listener-6ffcdd9b58-khwxb\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.110891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.111109 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api-log" containerID="cri-o://3c571ef8d58ac48cc178a7b7fbb07a0155fb018f81b61692d264dbdd939885a1" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.111450 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api" containerID="cri-o://6861b5894c2e268dd615bf72e68c53f494b1ac91f918b0c29b772a58817145a6" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.117056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmm2j\" (UniqueName: \"kubernetes.io/projected/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-kube-api-access-tmm2j\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.117103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.117154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.117221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.119245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.121322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.133392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.133456 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.134765 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.138460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flzvf\" (UniqueName: \"kubernetes.io/projected/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-kube-api-access-flzvf\") pod \"barbican-worker-57d775cd97-mhhlv\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.139600 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed60a6eb-8348-4608-8905-143106b4d545" containerID="9504c925f64890f6adfbe11ade457b63a59f8d88820cd51e5d1948828eaf36da" exitCode=143 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.139673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed60a6eb-8348-4608-8905-143106b4d545","Type":"ContainerDied","Data":"9504c925f64890f6adfbe11ade457b63a59f8d88820cd51e5d1948828eaf36da"} Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.153641 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerID="d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24" exitCode=2 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.153682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"e2d7e8f4-c39e-431c-acc5-9dede63b65a8","Type":"ContainerDied","Data":"d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24"} Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.158114 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="ovsdbserver-nb" containerID="cri-o://1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" gracePeriod=300 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.164976 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5b758d7546-vzkpl"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.166412 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.176337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmm2j\" (UniqueName: \"kubernetes.io/projected/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-kube-api-access-tmm2j\") pod \"openstackclient\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.177490 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.276687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.276722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5b758d7546-vzkpl"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.276731 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.277572 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="openstack-network-exporter" containerID="cri-o://e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01" gracePeriod=300 Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.282622 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.289164 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.289459 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="6dab9056-b15b-4adb-8a83-4b197e2b724c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c0f64e3130d63c159d5622b3618a7db1051e62d9dd644abb3bb1f144f6bab0b3" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.294503 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.297836 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.297985 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0810465b-2f7e-49f5-8d97-c8416940a463" containerName="nova-cell1-conductor-conductor" containerID="cri-o://184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.301943 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.301999 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" containerName="nova-scheduler-scheduler" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.310400 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.310608 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-log" containerID="cri-o://117bd9103ad25aebe304fbfcc64acbc37d4aab9c45ff0d48066ffe05295752c8" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.310953 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-httpd" containerID="cri-o://8f3d568ef8fe02166321f2465bb001d67912afe92b6e85e4165f4ee77c91e825" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.323879 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-combined-ca-bundle\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-public-tls-certs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-scripts\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-combined-ca-bundle\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-logs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769b2e3-4438-441b-867e-65d6dbd6e36c-logs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.343974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-config-data\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-internal-tls-certs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-internal-tls-certs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data-custom\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnpq\" (UniqueName: \"kubernetes.io/projected/3769b2e3-4438-441b-867e-65d6dbd6e36c-kube-api-access-jrnpq\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf5hd\" (UniqueName: \"kubernetes.io/projected/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-kube-api-access-zf5hd\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") 
" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.344134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-public-tls-certs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.345311 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-578d4bcc98-hmqrg"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.347773 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.352983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-578d4bcc98-hmqrg"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.353621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="ovsdbserver-sb" containerID="cri-o://75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" gracePeriod=300 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.363170 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7dd959896b-rsffl"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.364525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.369860 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7dd959896b-rsffl"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.399169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.401725 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6sh\" (UniqueName: \"kubernetes.io/projected/3596326f-8c58-4af1-963e-916d15976926-kube-api-access-tc6sh\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-combined-ca-bundle\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-ovndb-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769b2e3-4438-441b-867e-65d6dbd6e36c-logs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-scripts\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-config\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-httpd-config\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-combined-ca-bundle\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-config-data\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-public-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-internal-tls-certs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-internal-tls-certs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data-custom\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrnpq\" (UniqueName: \"kubernetes.io/projected/3769b2e3-4438-441b-867e-65d6dbd6e36c-kube-api-access-jrnpq\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.447776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-internal-tls-certs\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf5hd\" (UniqueName: \"kubernetes.io/projected/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-kube-api-access-zf5hd\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448593 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbr65\" (UniqueName: \"kubernetes.io/projected/eaa206fe-a991-4947-8e0e-0a8f80879f55-kube-api-access-fbr65\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-credential-keys\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-public-tls-certs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-fernet-keys\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448818 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-combined-ca-bundle\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-public-tls-certs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-scripts\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-config-data\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-combined-ca-bundle\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " 
pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-public-tls-certs\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-internal-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.448984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-logs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.449393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-logs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.449623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769b2e3-4438-441b-867e-65d6dbd6e36c-logs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.455354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.468239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-combined-ca-bundle\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.468653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-internal-tls-certs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.469389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-combined-ca-bundle\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " 
pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.469667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-scripts\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.471383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data-custom\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.472211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-public-tls-certs\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.473347 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910 is running failed: container process not found" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.476737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-config-data\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.476858 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910 is running failed: container process not found" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.477163 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910 is running failed: container process not found" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.477216 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="ovsdbserver-nb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.477268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrnpq\" 
(UniqueName: \"kubernetes.io/projected/3769b2e3-4438-441b-867e-65d6dbd6e36c-kube-api-access-jrnpq\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.479179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-internal-tls-certs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.483535 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.483599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf5hd\" (UniqueName: \"kubernetes.io/projected/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-kube-api-access-zf5hd\") pod \"placement-5b758d7546-vzkpl\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.484521 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" containerName="galera" containerID="cri-o://e46c9f70ccacdad3402cf6ba32a77979ddf8b59c6dda0c74faca0a9001f5b6b6" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.485098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-public-tls-certs\") pod \"barbican-api-68f75c7cb4-2pzs9\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.522924 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.540338 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerName="galera" containerID="cri-o://e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.542492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-internal-tls-certs\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbr65\" (UniqueName: \"kubernetes.io/projected/eaa206fe-a991-4947-8e0e-0a8f80879f55-kube-api-access-fbr65\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-credential-keys\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550505 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-fernet-keys\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-config-data\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-public-tls-certs\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-internal-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6sh\" (UniqueName: \"kubernetes.io/projected/3596326f-8c58-4af1-963e-916d15976926-kube-api-access-tc6sh\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-combined-ca-bundle\") pod \"keystone-7dd959896b-rsffl\" 
(UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-ovndb-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-scripts\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-config\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-httpd-config\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-combined-ca-bundle\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.550763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-public-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.553778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-internal-tls-certs\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.556484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-credential-keys\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.556567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-public-tls-certs\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " 
pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.556973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-public-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.557361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-config-data\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.557595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-ovndb-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.558505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-scripts\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.558743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-fernet-keys\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.565585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-httpd-config\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.566114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-combined-ca-bundle\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.567114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-combined-ca-bundle\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.568544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-internal-tls-certs\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 
15:54:03.569574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-config\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.570515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbr65\" (UniqueName: \"kubernetes.io/projected/eaa206fe-a991-4947-8e0e-0a8f80879f55-kube-api-access-fbr65\") pod \"neutron-578d4bcc98-hmqrg\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.571188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6sh\" (UniqueName: \"kubernetes.io/projected/3596326f-8c58-4af1-963e-916d15976926-kube-api-access-tc6sh\") pod \"keystone-7dd959896b-rsffl\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.577161 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.591648 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 is running failed: container process not found" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.592945 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 is running failed: container process not found" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.593704 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 is running failed: container process not found" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 15:54:03 crc kubenswrapper[4707]: E0121 15:54:03.593729 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="ovsdbserver-sb" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.624547 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_e2d7e8f4-c39e-431c-acc5-9dede63b65a8/ovsdbserver-nb/0.log" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.624614 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdbserver-nb-tls-certs\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-metrics-certs-tls-certs\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msg7z\" (UniqueName: \"kubernetes.io/projected/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-kube-api-access-msg7z\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758277 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-config\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdb-rundir\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-scripts\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.758392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-combined-ca-bundle\") pod \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\" (UID: \"e2d7e8f4-c39e-431c-acc5-9dede63b65a8\") " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.764842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-config" (OuterVolumeSpecName: "config") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.765124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.764869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-scripts" (OuterVolumeSpecName: "scripts") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.771959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-kube-api-access-msg7z" (OuterVolumeSpecName: "kube-api-access-msg7z") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "kube-api-access-msg7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.773082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.850333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.858568 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.860712 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.860739 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.860752 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.860774 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.860783 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.860791 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msg7z\" (UniqueName: \"kubernetes.io/projected/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-kube-api-access-msg7z\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.891235 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.914735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.950156 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.952225 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-central-agent" containerID="cri-o://fe8a3919099032fd63ab5edd1340fda4118650819c31c39020c309ee4731c4b3" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.952474 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="proxy-httpd" containerID="cri-o://f8ea85e223922cccc7bd056742a95947de0c06e528f4d7e40d1fc53cd2be4a49" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.952635 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-notification-agent" containerID="cri-o://f9fc1facbf6e09c6961b88dbfa1c24da8c02b646a2a6ff48e5682b747b9e2521" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.952685 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="sg-core" containerID="cri-o://a2c938ca62002f651985a2fc05725ae21653687d6af33d6dc52a4597ce78155a" gracePeriod=30 Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.963612 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.963631 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:03 crc kubenswrapper[4707]: I0121 15:54:03.985606 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.008308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e2d7e8f4-c39e-431c-acc5-9dede63b65a8" (UID: "e2d7e8f4-c39e-431c-acc5-9dede63b65a8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.071213 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2d7e8f4-c39e-431c-acc5-9dede63b65a8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.079691 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_20cc2e06-6e76-4cca-8954-6d362c62e31a/ovsdbserver-sb/0.log" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.080008 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.170007 4707 generic.go:334] "Generic (PLEG): container finished" podID="6dab9056-b15b-4adb-8a83-4b197e2b724c" containerID="c0f64e3130d63c159d5622b3618a7db1051e62d9dd644abb3bb1f144f6bab0b3" exitCode=0 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.170086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6dab9056-b15b-4adb-8a83-4b197e2b724c","Type":"ContainerDied","Data":"c0f64e3130d63c159d5622b3618a7db1051e62d9dd644abb3bb1f144f6bab0b3"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.177571 4707 generic.go:334] "Generic (PLEG): container finished" podID="c535f926-bfe0-4448-8090-1b942fe85c89" containerID="3c571ef8d58ac48cc178a7b7fbb07a0155fb018f81b61692d264dbdd939885a1" exitCode=143 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.177629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c535f926-bfe0-4448-8090-1b942fe85c89","Type":"ContainerDied","Data":"3c571ef8d58ac48cc178a7b7fbb07a0155fb018f81b61692d264dbdd939885a1"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-scripts\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-config\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xggwr\" (UniqueName: \"kubernetes.io/projected/20cc2e06-6e76-4cca-8954-6d362c62e31a-kube-api-access-xggwr\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdbserver-sb-tls-certs\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdb-rundir\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.178987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-metrics-certs-tls-certs\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.179033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-combined-ca-bundle\") pod \"20cc2e06-6e76-4cca-8954-6d362c62e31a\" (UID: \"20cc2e06-6e76-4cca-8954-6d362c62e31a\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.181994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-config" (OuterVolumeSpecName: "config") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.183125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-scripts" (OuterVolumeSpecName: "scripts") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.185060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189206 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_20cc2e06-6e76-4cca-8954-6d362c62e31a/ovsdbserver-sb/0.log" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189250 4707 generic.go:334] "Generic (PLEG): container finished" podID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerID="e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01" exitCode=2 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189272 4707 generic.go:334] "Generic (PLEG): container finished" podID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" exitCode=143 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"20cc2e06-6e76-4cca-8954-6d362c62e31a","Type":"ContainerDied","Data":"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"20cc2e06-6e76-4cca-8954-6d362c62e31a","Type":"ContainerDied","Data":"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"20cc2e06-6e76-4cca-8954-6d362c62e31a","Type":"ContainerDied","Data":"3ec29c63c8b1bd3a54b3d8606ca2e9a84f27dd721039886ca3596c80b8400f91"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189357 
4707 scope.go:117] "RemoveContainer" containerID="e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.189453 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.190041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.190172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cc2e06-6e76-4cca-8954-6d362c62e31a-kube-api-access-xggwr" (OuterVolumeSpecName: "kube-api-access-xggwr") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "kube-api-access-xggwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.205463 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerID="483d69c30647d452519d9267bee5354dc667b86da55acfe143a2b6d4b7ae08a0" exitCode=143 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.205547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ef0e4e0-0bae-4886-894a-863f13095f5b","Type":"ContainerDied","Data":"483d69c30647d452519d9267bee5354dc667b86da55acfe143a2b6d4b7ae08a0"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.215019 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ab52595-1681-409f-b9fd-f806a0e07117" containerID="e46c9f70ccacdad3402cf6ba32a77979ddf8b59c6dda0c74faca0a9001f5b6b6" exitCode=0 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.215163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9ab52595-1681-409f-b9fd-f806a0e07117","Type":"ContainerDied","Data":"e46c9f70ccacdad3402cf6ba32a77979ddf8b59c6dda0c74faca0a9001f5b6b6"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.220271 4707 generic.go:334] "Generic (PLEG): container finished" podID="42924886-d9f3-48df-83a3-f7e27fcf368d" containerID="7bc461ffcee7909ae3ce1f7eee5798564341473d88bb5e5dc4e8f44fc909fd32" exitCode=0 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.220334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"42924886-d9f3-48df-83a3-f7e27fcf368d","Type":"ContainerDied","Data":"7bc461ffcee7909ae3ce1f7eee5798564341473d88bb5e5dc4e8f44fc909fd32"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.228313 4707 scope.go:117] "RemoveContainer" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.228614 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_e2d7e8f4-c39e-431c-acc5-9dede63b65a8/ovsdbserver-nb/0.log" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.228641 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" 
exitCode=143 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.228687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"e2d7e8f4-c39e-431c-acc5-9dede63b65a8","Type":"ContainerDied","Data":"1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.228710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"e2d7e8f4-c39e-431c-acc5-9dede63b65a8","Type":"ContainerDied","Data":"cdcbc5fbdff27eab0f274d3a39b7e7ac34685194acab823fa2c27e039e4a1317"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.228759 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.243606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"60e0b49a-6e14-42af-9480-877a354cec42","Type":"ContainerDied","Data":"fedf8ec2c1a7c0866b937ba14cc1081426eb67eaeb9d60f31b28228f8b04b03f"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.258382 4707 generic.go:334] "Generic (PLEG): container finished" podID="60e0b49a-6e14-42af-9480-877a354cec42" containerID="fedf8ec2c1a7c0866b937ba14cc1081426eb67eaeb9d60f31b28228f8b04b03f" exitCode=143 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.260886 4707 scope.go:117] "RemoveContainer" containerID="e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.261560 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01\": container with ID starting with e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01 not found: ID does not exist" containerID="e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.261591 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01"} err="failed to get container status \"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01\": rpc error: code = NotFound desc = could not find container \"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01\": container with ID starting with e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01 not found: ID does not exist" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.261613 4707 scope.go:117] "RemoveContainer" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.262247 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" event={"ID":"4ab385ad-11d0-4da8-bf53-70997a2b9fa3","Type":"ContainerStarted","Data":"ccca3340b0447c02009ad6b21494768aa53105116696120210244bb1bc5cc8a2"} Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.262315 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1\": container with ID starting with 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 not found: ID does not exist" 
containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.262334 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1"} err="failed to get container status \"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1\": rpc error: code = NotFound desc = could not find container \"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1\": container with ID starting with 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 not found: ID does not exist" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.262347 4707 scope.go:117] "RemoveContainer" containerID="e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.262791 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01"} err="failed to get container status \"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01\": rpc error: code = NotFound desc = could not find container \"e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01\": container with ID starting with e2edb883544c62135035d6021ba8c38a7ef684a407270281335919bf18c51a01 not found: ID does not exist" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.262824 4707 scope.go:117] "RemoveContainer" containerID="75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.263097 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1"} err="failed to get container status \"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1\": rpc error: code = NotFound desc = could not find container \"75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1\": container with ID starting with 75326a574f690247a996db412c6461621e8101ba95fae8555f243a3b2f8e87f1 not found: ID does not exist" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.263112 4707 scope.go:117] "RemoveContainer" containerID="d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.265423 4707 generic.go:334] "Generic (PLEG): container finished" podID="11f0696a-c173-41a3-926d-29ef8dcef753" containerID="117bd9103ad25aebe304fbfcc64acbc37d4aab9c45ff0d48066ffe05295752c8" exitCode=143 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.265470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"11f0696a-c173-41a3-926d-29ef8dcef753","Type":"ContainerDied","Data":"117bd9103ad25aebe304fbfcc64acbc37d4aab9c45ff0d48066ffe05295752c8"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.267759 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.280518 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.281801 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node 
\"crc\" " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.281915 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.281996 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.282061 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20cc2e06-6e76-4cca-8954-6d362c62e31a-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.282120 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xggwr\" (UniqueName: \"kubernetes.io/projected/20cc2e06-6e76-4cca-8954-6d362c62e31a-kube-api-access-xggwr\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287009 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.287400 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="ovsdbserver-sb" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="ovsdbserver-sb" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.287432 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="openstack-network-exporter" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287438 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="openstack-network-exporter" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.287457 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="ovsdbserver-nb" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287463 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="ovsdbserver-nb" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.287481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="openstack-network-exporter" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287486 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="openstack-network-exporter" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287631 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="openstack-network-exporter" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287651 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" containerName="ovsdbserver-nb" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="ovsdbserver-sb" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.287674 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" containerName="openstack-network-exporter" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.288556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.292955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.296093 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.296216 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.296092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.299373 4707 generic.go:334] "Generic (PLEG): container finished" podID="0cc92b42-4a79-43c3-b749-ae1058708648" containerID="f8ea85e223922cccc7bd056742a95947de0c06e528f4d7e40d1fc53cd2be4a49" exitCode=0 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.299395 4707 generic.go:334] "Generic (PLEG): container finished" podID="0cc92b42-4a79-43c3-b749-ae1058708648" containerID="a2c938ca62002f651985a2fc05725ae21653687d6af33d6dc52a4597ce78155a" exitCode=2 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.299432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerDied","Data":"f8ea85e223922cccc7bd056742a95947de0c06e528f4d7e40d1fc53cd2be4a49"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.299455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerDied","Data":"a2c938ca62002f651985a2fc05725ae21653687d6af33d6dc52a4597ce78155a"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.303748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.306230 4707 generic.go:334] "Generic (PLEG): container finished" podID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerID="dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724" exitCode=0 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.306280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ece56fe7-15d7-4701-b4c7-e102b077c8ea","Type":"ContainerDied","Data":"dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.321482 4707 generic.go:334] "Generic (PLEG): container finished" podID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerID="9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa" exitCode=2 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.321516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"303e46a4-d2f9-4512-ae87-c50a52d62019","Type":"ContainerDied","Data":"9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa"} Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 
15:54:04.329207 4707 scope.go:117] "RemoveContainer" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.346782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.352546 4707 scope.go:117] "RemoveContainer" containerID="d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.354227 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24\": container with ID starting with d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24 not found: ID does not exist" containerID="d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.354267 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24"} err="failed to get container status \"d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24\": rpc error: code = NotFound desc = could not find container \"d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24\": container with ID starting with d4341a91c6cf6fe9939e01e4c82572b6352eb2c1fcdd3a215a6aafb222d52b24 not found: ID does not exist" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.354286 4707 scope.go:117] "RemoveContainer" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.354658 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910\": container with ID starting with 1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910 not found: ID does not exist" containerID="1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.354677 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910"} err="failed to get container status \"1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910\": rpc error: code = NotFound desc = could not find container \"1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910\": container with ID starting with 1fbbfdd79cb64d09f222ea4c613cf12d1fb3af7d8a5d42faad466c32a1d57910 not found: ID does not exist" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.364683 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.373542 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.380546 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9t98\" (UniqueName: \"kubernetes.io/projected/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-kube-api-access-l9t98\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-config\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.385964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.386082 4707 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.386117 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.486671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-combined-ca-bundle\") pod \"6dab9056-b15b-4adb-8a83-4b197e2b724c\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-memcached-tls-certs\") pod \"42924886-d9f3-48df-83a3-f7e27fcf368d\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-config-data\") pod \"42924886-d9f3-48df-83a3-f7e27fcf368d\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4858v\" (UniqueName: \"kubernetes.io/projected/6dab9056-b15b-4adb-8a83-4b197e2b724c-kube-api-access-4858v\") pod \"6dab9056-b15b-4adb-8a83-4b197e2b724c\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-combined-ca-bundle\") pod \"42924886-d9f3-48df-83a3-f7e27fcf368d\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-nova-novncproxy-tls-certs\") pod \"6dab9056-b15b-4adb-8a83-4b197e2b724c\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-vencrypt-tls-certs\") pod \"6dab9056-b15b-4adb-8a83-4b197e2b724c\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-kolla-config\") pod \"42924886-d9f3-48df-83a3-f7e27fcf368d\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcrsw\" (UniqueName: \"kubernetes.io/projected/42924886-d9f3-48df-83a3-f7e27fcf368d-kube-api-access-gcrsw\") pod \"42924886-d9f3-48df-83a3-f7e27fcf368d\" (UID: \"42924886-d9f3-48df-83a3-f7e27fcf368d\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.487767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-config-data\") pod \"6dab9056-b15b-4adb-8a83-4b197e2b724c\" (UID: \"6dab9056-b15b-4adb-8a83-4b197e2b724c\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9t98\" (UniqueName: \"kubernetes.io/projected/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-kube-api-access-l9t98\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-config\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc 
kubenswrapper[4707]: I0121 15:54:04.500733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.500867 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.504590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "42924886-d9f3-48df-83a3-f7e27fcf368d" (UID: "42924886-d9f3-48df-83a3-f7e27fcf368d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.519073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-config-data" (OuterVolumeSpecName: "config-data") pod "42924886-d9f3-48df-83a3-f7e27fcf368d" (UID: "42924886-d9f3-48df-83a3-f7e27fcf368d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.520434 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.524487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.526060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-config\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.526301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.544328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.577943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dab9056-b15b-4adb-8a83-4b197e2b724c" (UID: "6dab9056-b15b-4adb-8a83-4b197e2b724c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.621566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9t98\" (UniqueName: \"kubernetes.io/projected/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-kube-api-access-l9t98\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.622100 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.622608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.630615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42924886-d9f3-48df-83a3-f7e27fcf368d-kube-api-access-gcrsw" (OuterVolumeSpecName: "kube-api-access-gcrsw") pod "42924886-d9f3-48df-83a3-f7e27fcf368d" (UID: "42924886-d9f3-48df-83a3-f7e27fcf368d"). InnerVolumeSpecName "kube-api-access-gcrsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.631316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.633014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dab9056-b15b-4adb-8a83-4b197e2b724c-kube-api-access-4858v" (OuterVolumeSpecName: "kube-api-access-4858v") pod "6dab9056-b15b-4adb-8a83-4b197e2b724c" (UID: "6dab9056-b15b-4adb-8a83-4b197e2b724c"). InnerVolumeSpecName "kube-api-access-4858v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.633111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "20cc2e06-6e76-4cca-8954-6d362c62e31a" (UID: "20cc2e06-6e76-4cca-8954-6d362c62e31a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.634473 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.634495 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcrsw\" (UniqueName: \"kubernetes.io/projected/42924886-d9f3-48df-83a3-f7e27fcf368d-kube-api-access-gcrsw\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.634507 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.634516 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cc2e06-6e76-4cca-8954-6d362c62e31a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.634524 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42924886-d9f3-48df-83a3-f7e27fcf368d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.634531 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4858v\" (UniqueName: \"kubernetes.io/projected/6dab9056-b15b-4adb-8a83-4b197e2b724c-kube-api-access-4858v\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.660025 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.665446 4707 scope.go:117] "RemoveContainer" containerID="89c29a3d42aafec3fce31b73ed24346f2ab4e8ae49fae6ac3a1cbc1d68a952c0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.665555 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5b758d7546-vzkpl"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.690226 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv"] Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.703334 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.703470 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0810465b-2f7e-49f5-8d97-c8416940a463" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:04 crc kubenswrapper[4707]: W0121 15:54:04.719889 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46ef7fb6_d2ea_4c55_ad87_5d4d49a748bb.slice/crio-296309b3efc562a705d8545a8884f5798a01ca9bd3d5f40e7c63975c66f5c748 WatchSource:0}: Error finding container 296309b3efc562a705d8545a8884f5798a01ca9bd3d5f40e7c63975c66f5c748: Status 404 returned error can't find the container with id 296309b3efc562a705d8545a8884f5798a01ca9bd3d5f40e7c63975c66f5c748 Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.742791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.775223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-config-data" (OuterVolumeSpecName: "config-data") pod "6dab9056-b15b-4adb-8a83-4b197e2b724c" (UID: "6dab9056-b15b-4adb-8a83-4b197e2b724c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.775658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42924886-d9f3-48df-83a3-f7e27fcf368d" (UID: "42924886-d9f3-48df-83a3-f7e27fcf368d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.838680 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.838708 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.861841 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.877713 4707 scope.go:117] "RemoveContainer" containerID="8de232c69091a8ebfbbd5841be04e33e23e6bb682ea20486d7e76dbf7a6b5887" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.878343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "42924886-d9f3-48df-83a3-f7e27fcf368d" (UID: "42924886-d9f3-48df-83a3-f7e27fcf368d"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.878729 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.879564 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.886903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6dab9056-b15b-4adb-8a83-4b197e2b724c" (UID: "6dab9056-b15b-4adb-8a83-4b197e2b724c"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.891594 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.903697 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:04 crc kubenswrapper[4707]: E0121 15:54:04.903741 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjsl\" (UniqueName: \"kubernetes.io/projected/9ab52595-1681-409f-b9fd-f806a0e07117-kube-api-access-7bjsl\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-galera-tls-certs\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-operator-scripts\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-generated\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-combined-ca-bundle\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-kolla-config\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.942850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-default\") pod \"9ab52595-1681-409f-b9fd-f806a0e07117\" (UID: \"9ab52595-1681-409f-b9fd-f806a0e07117\") " Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.943231 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42924886-d9f3-48df-83a3-f7e27fcf368d-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.943248 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.943751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.944350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.948955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab52595-1681-409f-b9fd-f806a0e07117-kube-api-access-7bjsl" (OuterVolumeSpecName: "kube-api-access-7bjsl") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "kube-api-access-7bjsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.950162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). 
InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.951416 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.957562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6dab9056-b15b-4adb-8a83-4b197e2b724c" (UID: "6dab9056-b15b-4adb-8a83-4b197e2b724c"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.960033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.967666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.972586 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:04 crc kubenswrapper[4707]: I0121 15:54:04.981109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.004742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9ab52595-1681-409f-b9fd-f806a0e07117" (UID: "9ab52595-1681-409f-b9fd-f806a0e07117"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.014848 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.015357 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" containerName="mysql-bootstrap" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" containerName="mysql-bootstrap" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.015394 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" containerName="galera" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015401 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" containerName="galera" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.015431 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dab9056-b15b-4adb-8a83-4b197e2b724c" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015438 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dab9056-b15b-4adb-8a83-4b197e2b724c" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.015452 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42924886-d9f3-48df-83a3-f7e27fcf368d" containerName="memcached" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015458 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42924886-d9f3-48df-83a3-f7e27fcf368d" containerName="memcached" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015724 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dab9056-b15b-4adb-8a83-4b197e2b724c" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015743 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" containerName="galera" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.015752 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42924886-d9f3-48df-83a3-f7e27fcf368d" containerName="memcached" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.017449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.020896 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.020943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-6hxvs" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.022585 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.022849 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046191 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046221 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046232 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046241 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046249 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dab9056-b15b-4adb-8a83-4b197e2b724c-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046265 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046273 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjsl\" (UniqueName: \"kubernetes.io/projected/9ab52595-1681-409f-b9fd-f806a0e07117-kube-api-access-7bjsl\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046281 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab52595-1681-409f-b9fd-f806a0e07117-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.046289 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ab52595-1681-409f-b9fd-f806a0e07117-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.052274 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.070062 4707 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.084134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7dd959896b-rsffl"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.106880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.119414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-578d4bcc98-hmqrg"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.125534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.147863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.147955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-config\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5gx\" (UniqueName: \"kubernetes.io/projected/db6e605f-fa18-4b63-abb0-398ff39fb010-kube-api-access-mz5gx\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 
21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.148899 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.186344 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.186541 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.197788 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cc2e06-6e76-4cca-8954-6d362c62e31a" path="/var/lib/kubelet/pods/20cc2e06-6e76-4cca-8954-6d362c62e31a/volumes" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.198442 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2d7e8f4-c39e-431c-acc5-9dede63b65a8" path="/var/lib/kubelet/pods/e2d7e8f4-c39e-431c-acc5-9dede63b65a8/volumes" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-config\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.250426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5gx\" (UniqueName: \"kubernetes.io/projected/db6e605f-fa18-4b63-abb0-398ff39fb010-kube-api-access-mz5gx\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.251992 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.252337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.254845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.254978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-config\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.259598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.259710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.261125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.265836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5gx\" (UniqueName: \"kubernetes.io/projected/db6e605f-fa18-4b63-abb0-398ff39fb010-kube-api-access-mz5gx\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.280927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.294866 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.300612 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.319869 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.338488 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.376104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" event={"ID":"4ab385ad-11d0-4da8-bf53-70997a2b9fa3","Type":"ContainerStarted","Data":"4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.380417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" event={"ID":"eaa206fe-a991-4947-8e0e-0a8f80879f55","Type":"ContainerStarted","Data":"e63269a3cac96ab1f9b8f88518da6d80d552868292608720525ace78853503c7"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.386046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" event={"ID":"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb","Type":"ContainerStarted","Data":"296309b3efc562a705d8545a8884f5798a01ca9bd3d5f40e7c63975c66f5c748"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.390462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"42924886-d9f3-48df-83a3-f7e27fcf368d","Type":"ContainerDied","Data":"87499cfea8bda08a718a940fd3a7bf071af4b38a1b2f3699e8ea828a649bf96f"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.390499 4707 scope.go:117] "RemoveContainer" containerID="7bc461ffcee7909ae3ce1f7eee5798564341473d88bb5e5dc4e8f44fc909fd32" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 
15:54:05.390619 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.395764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" event={"ID":"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a","Type":"ContainerStarted","Data":"126b84ccf58fbb664aa1a8939c6fdff7132c8e799df526d0c0189f405ef7ef6e"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.401320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" event={"ID":"3769b2e3-4438-441b-867e-65d6dbd6e36c","Type":"ContainerStarted","Data":"922b718c7c0e1bc6e40ba65052ce022f472ccf8f06c6dbe5534a84e4ffd5f9e1"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.402878 4707 generic.go:334] "Generic (PLEG): container finished" podID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerID="e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3" exitCode=0 Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.402923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"67bae98e-ed83-41c8-be5a-15eae8c9aa48","Type":"ContainerDied","Data":"e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.402940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"67bae98e-ed83-41c8-be5a-15eae8c9aa48","Type":"ContainerDied","Data":"fc3f686d8b24ab0d62278743523c1bd2017f0bb758567c6c1dcd13c6cde48512"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.402990 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.412507 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.416292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"c6c4cdde-65b4-4956-aaf6-e7dba05f869d","Type":"ContainerStarted","Data":"18e4974321bdc76bd26754d0698b4558f912a4314c571322644c2989c20de9f0"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.422702 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.432605 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.433036 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerName="mysql-bootstrap" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.433110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerName="mysql-bootstrap" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.433174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerName="galera" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.433217 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerName="galera" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.433428 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" containerName="galera" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.434015 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.439429 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.439468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-2x7h6" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.440013 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.440292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9ab52595-1681-409f-b9fd-f806a0e07117","Type":"ContainerDied","Data":"1adeca4e1fa81267399fa4951f19def7d1ef571a01ca9858ecbea726f81dcb6e"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.442373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6dab9056-b15b-4adb-8a83-4b197e2b724c","Type":"ContainerDied","Data":"b2c6ad6adefc8376ea0b8a64281cf6f521049a7cd75a8b2f76706d8faf504ba8"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.442656 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.443150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.466551 4707 generic.go:334] "Generic (PLEG): container finished" podID="0cc92b42-4a79-43c3-b749-ae1058708648" containerID="f9fc1facbf6e09c6961b88dbfa1c24da8c02b646a2a6ff48e5682b747b9e2521" exitCode=0 Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.466577 4707 generic.go:334] "Generic (PLEG): container finished" podID="0cc92b42-4a79-43c3-b749-ae1058708648" containerID="fe8a3919099032fd63ab5edd1340fda4118650819c31c39020c309ee4731c4b3" exitCode=0 Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.466668 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.466714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerDied","Data":"f9fc1facbf6e09c6961b88dbfa1c24da8c02b646a2a6ff48e5682b747b9e2521"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.466729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerDied","Data":"fe8a3919099032fd63ab5edd1340fda4118650819c31c39020c309ee4731c4b3"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.467617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-operator-scripts\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.467733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.467844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnbgq\" (UniqueName: \"kubernetes.io/projected/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kube-api-access-rnbgq\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.467928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kolla-config\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.467961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-combined-ca-bundle\") pod \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-default\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config-secret\") pod \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config\") pod \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbwrf\" (UniqueName: \"kubernetes.io/projected/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-kube-api-access-sbwrf\") pod \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\" (UID: \"2e404344-5beb-4f6f-ba7b-ccbb59c4e318\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-galera-tls-certs\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-generated\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.468171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-combined-ca-bundle\") pod \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\" (UID: \"67bae98e-ed83-41c8-be5a-15eae8c9aa48\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.469936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.470493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.472112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.475841 4707 scope.go:117] "RemoveContainer" containerID="e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.477271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.497224 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.497492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-kube-api-access-sbwrf" (OuterVolumeSpecName: "kube-api-access-sbwrf") pod "2e404344-5beb-4f6f-ba7b-ccbb59c4e318" (UID: "2e404344-5beb-4f6f-ba7b-ccbb59c4e318"). InnerVolumeSpecName "kube-api-access-sbwrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.513099 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" exitCode=0 Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.513195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"1f72a259-adb4-405e-b6e3-6a39fbe668b8","Type":"ContainerDied","Data":"7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.521974 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" containerID="66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401" exitCode=137 Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.522096 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.552835 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.553659 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.559284 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.570980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-config-data\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrtft\" (UniqueName: \"kubernetes.io/projected/d2613b63-6d79-4d28-9be6-ca7a8431c724-kube-api-access-jrtft\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-kolla-config\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571309 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbwrf\" (UniqueName: \"kubernetes.io/projected/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-kube-api-access-sbwrf\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571321 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571331 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571339 4707 
reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.571351 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/67bae98e-ed83-41c8-be5a-15eae8c9aa48-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.582191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kube-api-access-rnbgq" (OuterVolumeSpecName: "kube-api-access-rnbgq") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "kube-api-access-rnbgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.604025 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.635320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" event={"ID":"3596326f-8c58-4af1-963e-916d15976926","Type":"ContainerStarted","Data":"4217fa6d49e93744c13a6b4bbf06a5a667420ad031da12a149cf74c3e29d3ed2"} Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.665974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693270 4707 scope.go:117] "RemoveContainer" containerID="3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.693891 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="sg-core" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693903 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="sg-core" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.693915 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="proxy-httpd" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693921 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="proxy-httpd" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.693931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-central-agent" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693936 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-central-agent" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.693953 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" containerName="nova-scheduler-scheduler" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693958 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" containerName="nova-scheduler-scheduler" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.693969 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-notification-agent" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.693974 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-notification-agent" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.694120 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-central-agent" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.694137 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="ceilometer-notification-agent" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.694143 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="sg-core" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.694156 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" containerName="nova-scheduler-scheduler" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.694165 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" containerName="proxy-httpd" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.695004 4707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.699575 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.700633 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.700882 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-dr5v9" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.701002 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.703342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-config-data\") pod \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.703420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4ftw\" (UniqueName: \"kubernetes.io/projected/1f72a259-adb4-405e-b6e3-6a39fbe668b8-kube-api-access-x4ftw\") pod \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.703518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-combined-ca-bundle\") pod \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\" (UID: \"1f72a259-adb4-405e-b6e3-6a39fbe668b8\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.707514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrtft\" (UniqueName: \"kubernetes.io/projected/d2613b63-6d79-4d28-9be6-ca7a8431c724-kube-api-access-jrtft\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.707676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.707757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-kolla-config\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.707861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.707905 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-config-data\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.708136 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.708150 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnbgq\" (UniqueName: \"kubernetes.io/projected/67bae98e-ed83-41c8-be5a-15eae8c9aa48-kube-api-access-rnbgq\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.714845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.715459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-config-data\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.721733 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.733357 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.737227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.749588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-kolla-config\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.752244 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.753307 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.765773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.765924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.766060 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.766291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrtft\" (UniqueName: \"kubernetes.io/projected/d2613b63-6d79-4d28-9be6-ca7a8431c724-kube-api-access-jrtft\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.784166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-ceilometer-tls-certs\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xpzg\" (UniqueName: \"kubernetes.io/projected/0cc92b42-4a79-43c3-b749-ae1058708648-kube-api-access-7xpzg\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-sg-core-conf-yaml\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-run-httpd\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-log-httpd\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-config-data\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: 
I0121 15:54:05.810741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-combined-ca-bundle\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-scripts\") pod \"0cc92b42-4a79-43c3-b749-ae1058708648\" (UID: \"0cc92b42-4a79-43c3-b749-ae1058708648\") " Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.810988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrz9\" (UniqueName: \"kubernetes.io/projected/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kube-api-access-gcrz9\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" 
Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.811392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.831463 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f72a259-adb4-405e-b6e3-6a39fbe668b8-kube-api-access-x4ftw" (OuterVolumeSpecName: "kube-api-access-x4ftw") pod "1f72a259-adb4-405e-b6e3-6a39fbe668b8" (UID: "1f72a259-adb4-405e-b6e3-6a39fbe668b8"). InnerVolumeSpecName "kube-api-access-x4ftw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.832036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.832180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.859083 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.874324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.880920 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.884934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.886247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc92b42-4a79-43c3-b749-ae1058708648-kube-api-access-7xpzg" (OuterVolumeSpecName: "kube-api-access-7xpzg") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "kube-api-access-7xpzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.893076 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.899568 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.912649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.912690 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.912717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xp47\" (UniqueName: \"kubernetes.io/projected/728a3264-18ce-4d47-86f5-868f00cc4558-kube-api-access-6xp47\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.912741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.912758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.913204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.924745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.924800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrz9\" (UniqueName: \"kubernetes.io/projected/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kube-api-access-gcrz9\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.924876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.924918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.924935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.924973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.914178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.925152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.913692 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.918939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.913733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.926098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.926445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.926678 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xpzg\" (UniqueName: \"kubernetes.io/projected/0cc92b42-4a79-43c3-b749-ae1058708648-kube-api-access-7xpzg\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.926691 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4ftw\" (UniqueName: \"kubernetes.io/projected/1f72a259-adb4-405e-b6e3-6a39fbe668b8-kube-api-access-x4ftw\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.926700 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.926710 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc92b42-4a79-43c3-b749-ae1058708648-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.932865 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.941296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.941354 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.942738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.943273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-scripts" (OuterVolumeSpecName: "scripts") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.970611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrz9\" (UniqueName: \"kubernetes.io/projected/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kube-api-access-gcrz9\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.972328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.981905 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq"] Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.988272 4707 scope.go:117] "RemoveContainer" containerID="e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.988694 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3\": container with ID starting with e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3 not found: ID does not exist" containerID="e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.988726 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3"} err="failed to get container status \"e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3\": rpc error: code = NotFound desc = could not find container \"e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3\": container with ID starting with e56b107b4f01f7179fe571262565035e76522ce09759ddea9c7a57cf07738ce3 not found: ID does not exist" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.988744 4707 scope.go:117] "RemoveContainer" containerID="3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749" Jan 21 15:54:05 crc kubenswrapper[4707]: E0121 15:54:05.989529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749\": container with ID starting with 3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749 not found: ID does not exist" containerID="3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 15:54:05.989553 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749"} err="failed to get container status \"3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749\": rpc error: code = NotFound desc = could not find container \"3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749\": container with ID starting with 3b6b4d0c23578e2b56fe3fdfce298ddf31009e3a9d80916a87a5e05f7c7fb749 not found: ID does not exist" Jan 21 15:54:05 crc kubenswrapper[4707]: I0121 
15:54:05.989566 4707 scope.go:117] "RemoveContainer" containerID="e46c9f70ccacdad3402cf6ba32a77979ddf8b59c6dda0c74faca0a9001f5b6b6" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.013054 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.029246 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.029941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e31617e-3380-4e4f-9558-ceb4164fd190-logs\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.030041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.030540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjh8d\" (UniqueName: \"kubernetes.io/projected/7e31617e-3380-4e4f-9558-ceb4164fd190-kube-api-access-zjh8d\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.030800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.031044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.031122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-combined-ca-bundle\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.031200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.031367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xp47\" (UniqueName: 
\"kubernetes.io/projected/728a3264-18ce-4d47-86f5-868f00cc4558-kube-api-access-6xp47\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.031419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data-custom\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.031823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.032027 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.034591 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.035987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.038196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.038562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.039179 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:06 crc kubenswrapper[4707]: E0121 15:54:06.039975 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[mysql-db], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="cc96bd5b-fb78-41b8-a65b-352f905ff8d5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.049397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.052374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xp47\" (UniqueName: \"kubernetes.io/projected/728a3264-18ce-4d47-86f5-868f00cc4558-kube-api-access-6xp47\") pod \"nova-cell1-novncproxy-0\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.054062 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.069408 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.073536 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5b758d7546-vzkpl"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.078879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.081213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.090423 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-75586f844d-9x57h"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.091884 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.091902 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.101919 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-75586f844d-9x57h"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.130930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.133435 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.134767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkf6h\" (UniqueName: \"kubernetes.io/projected/a33f40c3-3064-405e-a420-da052b2b4a66-kube-api-access-xkf6h\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.134821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-logs\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.134849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-combined-ca-bundle\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.134879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data-custom\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a33f40c3-3064-405e-a420-da052b2b4a66-logs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-internal-tls-certs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data-custom\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data-custom\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e31617e-3380-4e4f-9558-ceb4164fd190-logs\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjh8d\" (UniqueName: \"kubernetes.io/projected/7e31617e-3380-4e4f-9558-ceb4164fd190-kube-api-access-zjh8d\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-public-tls-certs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-combined-ca-bundle\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c5wh\" (UniqueName: \"kubernetes.io/projected/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-kube-api-access-5c5wh\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" 
(UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-combined-ca-bundle\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135627 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.135637 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.136452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e31617e-3380-4e4f-9558-ceb4164fd190-logs\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.142463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-combined-ca-bundle\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.148090 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.152082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.153973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data-custom\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.158317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjh8d\" (UniqueName: \"kubernetes.io/projected/7e31617e-3380-4e4f-9558-ceb4164fd190-kube-api-access-zjh8d\") pod \"barbican-worker-7695bf75b5-6fdh5\" (UID: 
\"7e31617e-3380-4e4f-9558-ceb4164fd190\") " pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.184216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.191207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67bae98e-ed83-41c8-be5a-15eae8c9aa48" (UID: "67bae98e-ed83-41c8-be5a-15eae8c9aa48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.238796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-internal-tls-certs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.238866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a33f40c3-3064-405e-a420-da052b2b4a66-logs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.238899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.238928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-internal-tls-certs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.238952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data-custom\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.238971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data-custom\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239002 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-logs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-public-tls-certs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-public-tls-certs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-combined-ca-bundle\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c5wh\" (UniqueName: \"kubernetes.io/projected/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-kube-api-access-5c5wh\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48v2g\" (UniqueName: \"kubernetes.io/projected/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-kube-api-access-48v2g\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-combined-ca-bundle\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-config-data\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: 
\"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-scripts\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-combined-ca-bundle\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkf6h\" (UniqueName: \"kubernetes.io/projected/a33f40c3-3064-405e-a420-da052b2b4a66-kube-api-access-xkf6h\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-logs\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239502 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67bae98e-ed83-41c8-be5a-15eae8c9aa48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239513 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.239950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-logs\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.259085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a33f40c3-3064-405e-a420-da052b2b4a66-logs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.273050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data-custom\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc 
kubenswrapper[4707]: I0121 15:54:06.277417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-combined-ca-bundle\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.277993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.286975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-public-tls-certs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.287316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-internal-tls-certs\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.293799 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c5wh\" (UniqueName: \"kubernetes.io/projected/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-kube-api-access-5c5wh\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.303253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data-custom\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.307143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkf6h\" (UniqueName: \"kubernetes.io/projected/a33f40c3-3064-405e-a420-da052b2b4a66-kube-api-access-xkf6h\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.323788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data\") pod \"barbican-api-75997c46fb-qf2kk\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.326582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-combined-ca-bundle\") pod \"barbican-keystone-listener-5fddcb7cf6-46spq\" (UID: 
\"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.349134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-config-data\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.356190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-config-data" (OuterVolumeSpecName: "config-data") pod "1f72a259-adb4-405e-b6e3-6a39fbe668b8" (UID: "1f72a259-adb4-405e-b6e3-6a39fbe668b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.372869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-scripts\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.373488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-config-data\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.383221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-internal-tls-certs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.383382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-logs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.383424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-public-tls-certs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.383508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48v2g\" (UniqueName: \"kubernetes.io/projected/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-kube-api-access-48v2g\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.383538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-combined-ca-bundle\") pod 
\"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.383653 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.405098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-logs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.406174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e404344-5beb-4f6f-ba7b-ccbb59c4e318" (UID: "2e404344-5beb-4f6f-ba7b-ccbb59c4e318"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.416870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-scripts\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.443693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-internal-tls-certs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.444948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f72a259-adb4-405e-b6e3-6a39fbe668b8" (UID: "1f72a259-adb4-405e-b6e3-6a39fbe668b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.447290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48v2g\" (UniqueName: \"kubernetes.io/projected/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-kube-api-access-48v2g\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.451339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-public-tls-certs\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.464744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-combined-ca-bundle\") pod \"placement-75586f844d-9x57h\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.469981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2e404344-5beb-4f6f-ba7b-ccbb59c4e318" (UID: "2e404344-5beb-4f6f-ba7b-ccbb59c4e318"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.492131 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.492270 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.492423 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f72a259-adb4-405e-b6e3-6a39fbe668b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: E0121 15:54:06.500590 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:06 crc kubenswrapper[4707]: E0121 15:54:06.502116 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:06 crc kubenswrapper[4707]: E0121 15:54:06.513185 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:06 crc kubenswrapper[4707]: E0121 15:54:06.513309 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="ovn-northd" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.532956 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.102:8775/\": read tcp 10.217.0.2:46588->10.217.0.102:8775: read: connection reset by peer" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.533497 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.102:8775/\": read tcp 10.217.0.2:46584->10.217.0.102:8775: read: connection reset by peer" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.566794 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.236:8776/healthcheck\": read tcp 10.217.0.2:47076->10.217.1.236:8776: read: connection reset by peer" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.582940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2e404344-5beb-4f6f-ba7b-ccbb59c4e318" (UID: "2e404344-5beb-4f6f-ba7b-ccbb59c4e318"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.602661 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2e404344-5beb-4f6f-ba7b-ccbb59c4e318-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.613701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.635370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.644655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-config-data" (OuterVolumeSpecName: "config-data") pod "0cc92b42-4a79-43c3-b749-ae1058708648" (UID: "0cc92b42-4a79-43c3-b749-ae1058708648"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.686023 4707 generic.go:334] "Generic (PLEG): container finished" podID="11f0696a-c173-41a3-926d-29ef8dcef753" containerID="8f3d568ef8fe02166321f2465bb001d67912afe92b6e85e4165f4ee77c91e825" exitCode=0 Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.686087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"11f0696a-c173-41a3-926d-29ef8dcef753","Type":"ContainerDied","Data":"8f3d568ef8fe02166321f2465bb001d67912afe92b6e85e4165f4ee77c91e825"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.691041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"db6e605f-fa18-4b63-abb0-398ff39fb010","Type":"ContainerStarted","Data":"e80acd071c30b09fae20ae185142e55376d3f93d45bbf0452e49af6c1fde9685"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.693899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" event={"ID":"3596326f-8c58-4af1-963e-916d15976926","Type":"ContainerStarted","Data":"29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.694760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.704151 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.704177 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.704187 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc92b42-4a79-43c3-b749-ae1058708648-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.713269 4707 generic.go:334] "Generic (PLEG): container finished" podID="c535f926-bfe0-4448-8090-1b942fe85c89" containerID="6861b5894c2e268dd615bf72e68c53f494b1ac91f918b0c29b772a58817145a6" exitCode=0 Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.713391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c535f926-bfe0-4448-8090-1b942fe85c89","Type":"ContainerDied","Data":"6861b5894c2e268dd615bf72e68c53f494b1ac91f918b0c29b772a58817145a6"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.719202 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" podStartSLOduration=3.719189119 podStartE2EDuration="3.719189119s" podCreationTimestamp="2026-01-21 
15:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:06.707187963 +0000 UTC m=+3143.888704184" watchObservedRunningTime="2026-01-21 15:54:06.719189119 +0000 UTC m=+3143.900705342" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.739543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" event={"ID":"eaa206fe-a991-4947-8e0e-0a8f80879f55","Type":"ContainerStarted","Data":"a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.746392 4707 generic.go:334] "Generic (PLEG): container finished" podID="0810465b-2f7e-49f5-8d97-c8416940a463" containerID="184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3" exitCode=0 Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.746445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0810465b-2f7e-49f5-8d97-c8416940a463","Type":"ContainerDied","Data":"184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.746464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0810465b-2f7e-49f5-8d97-c8416940a463","Type":"ContainerDied","Data":"3619297069a7733c2ff19a715ea08ee6e929df3f698f47759c367dfd5f3f0567"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.746474 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3619297069a7733c2ff19a715ea08ee6e929df3f698f47759c367dfd5f3f0567" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.748242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0cc92b42-4a79-43c3-b749-ae1058708648","Type":"ContainerDied","Data":"506b4298a18501f47309f69d137ff4ce875043a88d9c86d70be89ab71815d5d6"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.748354 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.762585 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.780597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"c6c4cdde-65b4-4956-aaf6-e7dba05f869d","Type":"ContainerStarted","Data":"b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.792401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" event={"ID":"4ab385ad-11d0-4da8-bf53-70997a2b9fa3","Type":"ContainerStarted","Data":"298713c226aa15237d7cd72cb0e956083e7716fdc3d3e28167b1f08dfa128879"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.812606 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.812645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"1f72a259-adb4-405e-b6e3-6a39fbe668b8","Type":"ContainerDied","Data":"d2e3b2e844dcf0a21b40eaeb1bebe3bdaab48ad234534d1f810110e215ef7a81"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.813015 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=4.813006214 podStartE2EDuration="4.813006214s" podCreationTimestamp="2026-01-21 15:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:06.800695606 +0000 UTC m=+3143.982211828" watchObservedRunningTime="2026-01-21 15:54:06.813006214 +0000 UTC m=+3143.994522436" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.830679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" event={"ID":"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb","Type":"ContainerStarted","Data":"66872aae878a51a83199443d8ec479ce07708d2a6515a66af88f405708649bdf"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.832207 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" podStartSLOduration=4.832190228 podStartE2EDuration="4.832190228s" podCreationTimestamp="2026-01-21 15:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:06.821694832 +0000 UTC m=+3144.003211054" watchObservedRunningTime="2026-01-21 15:54:06.832190228 +0000 UTC m=+3144.013706450" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.841111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82","Type":"ContainerStarted","Data":"90840df2d9fb055dc339641940e756000dbb48f1c924ffb548bc2caf721e9d46"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.845647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" event={"ID":"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a","Type":"ContainerStarted","Data":"5317381dd2ab17c4e6cf8ac5d6c3209a97820d7d93db33f88df2d938cc564924"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.853620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" event={"ID":"3769b2e3-4438-441b-867e-65d6dbd6e36c","Type":"ContainerStarted","Data":"272acba6d788239712a5071adbac78f9174904c25e843ddb5f470eb2cbaf2e24"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.862791 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerID="304a9698f83732cdf07b1c1beb63c8b9696304d817437a60ab659c572ebfb73c" exitCode=0 Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.862908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ef0e4e0-0bae-4886-894a-863f13095f5b","Type":"ContainerDied","Data":"304a9698f83732cdf07b1c1beb63c8b9696304d817437a60ab659c572ebfb73c"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.866316 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="ed60a6eb-8348-4608-8905-143106b4d545" containerID="58e6b3acf0942b299010a6d66e7bf7396f26347a33cd0c773d801e2540add159" exitCode=0 Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.866405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed60a6eb-8348-4608-8905-143106b4d545","Type":"ContainerDied","Data":"58e6b3acf0942b299010a6d66e7bf7396f26347a33cd0c773d801e2540add159"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.874585 4707 generic.go:334] "Generic (PLEG): container finished" podID="60e0b49a-6e14-42af-9480-877a354cec42" containerID="2d20d49e7b7da58ecb8376a79713efa7bdcae5fb6a7a699d672be0eea1161321" exitCode=0 Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.874664 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.875041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"60e0b49a-6e14-42af-9480-877a354cec42","Type":"ContainerDied","Data":"2d20d49e7b7da58ecb8376a79713efa7bdcae5fb6a7a699d672be0eea1161321"} Jan 21 15:54:06 crc kubenswrapper[4707]: I0121 15:54:06.909776 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:06 crc kubenswrapper[4707]: W0121 15:54:06.925665 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod728a3264_18ce_4d47_86f5_868f00cc4558.slice/crio-52a876ea60f2280980602b262fa95fa606243036aec33822245ef9b90807649d WatchSource:0}: Error finding container 52a876ea60f2280980602b262fa95fa606243036aec33822245ef9b90807649d: Status 404 returned error can't find the container with id 52a876ea60f2280980602b262fa95fa606243036aec33822245ef9b90807649d Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.057547 4707 scope.go:117] "RemoveContainer" containerID="2727ec4f05c11c3ea5fe4042502dccc2c46d18348b1b9ac1444104a1d4ce950f" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.103954 4707 scope.go:117] "RemoveContainer" containerID="c0f64e3130d63c159d5622b3618a7db1051e62d9dd644abb3bb1f144f6bab0b3" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.158088 4707 scope.go:117] "RemoveContainer" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.198820 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" path="/var/lib/kubelet/pods/2e404344-5beb-4f6f-ba7b-ccbb59c4e318/volumes" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.199545 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42924886-d9f3-48df-83a3-f7e27fcf368d" path="/var/lib/kubelet/pods/42924886-d9f3-48df-83a3-f7e27fcf368d/volumes" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.200071 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dab9056-b15b-4adb-8a83-4b197e2b724c" path="/var/lib/kubelet/pods/6dab9056-b15b-4adb-8a83-4b197e2b724c/volumes" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.201159 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab52595-1681-409f-b9fd-f806a0e07117" path="/var/lib/kubelet/pods/9ab52595-1681-409f-b9fd-f806a0e07117/volumes" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.219043 4707 scope.go:117] 
"RemoveContainer" containerID="66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.282529 4707 scope.go:117] "RemoveContainer" containerID="66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401" Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.282970 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401\": container with ID starting with 66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401 not found: ID does not exist" containerID="66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.283001 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401"} err="failed to get container status \"66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401\": rpc error: code = NotFound desc = could not find container \"66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401\": container with ID starting with 66231077f3a16724f7986333e894fe6fa2ab841ebb4567796269a499c2a13401 not found: ID does not exist" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.283023 4707 scope.go:117] "RemoveContainer" containerID="f8ea85e223922cccc7bd056742a95947de0c06e528f4d7e40d1fc53cd2be4a49" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.393995 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.396357 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.410214 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="2e404344-5beb-4f6f-ba7b-ccbb59c4e318" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.418391 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.441610 4707 scope.go:117] "RemoveContainer" containerID="a2c938ca62002f651985a2fc05725ae21653687d6af33d6dc52a4597ce78155a" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.453879 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.478895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.500004 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.511792 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.513681 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.515332 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.519609 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.519791 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.519874 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-m7p4k" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.521675 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.524032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.533221 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.533995 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.535470 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.536119 4707 scope.go:117] "RemoveContainer" containerID="f9fc1facbf6e09c6961b88dbfa1c24da8c02b646a2a6ff48e5682b747b9e2521" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.542961 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.544537 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.546316 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.550674 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.551241 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-httpd" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.551266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-httpd" Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.551279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0810465b-2f7e-49f5-8d97-c8416940a463" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.551286 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0810465b-2f7e-49f5-8d97-c8416940a463" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.551305 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-log" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.551310 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-log" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.551550 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-log" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.551576 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e0b49a-6e14-42af-9480-877a354cec42" containerName="glance-httpd" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.551586 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0810465b-2f7e-49f5-8d97-c8416940a463" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.556164 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.559397 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.559584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.560295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.561856 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.568863 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.569283 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-metadata" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.569295 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-metadata" Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.569320 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-log" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.569326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-log" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.569486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-metadata" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.569500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed60a6eb-8348-4608-8905-143106b4d545" containerName="nova-metadata-log" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.570120 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.571415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.581064 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.590549 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.592284 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.602689 4707 scope.go:117] "RemoveContainer" containerID="fe8a3919099032fd63ab5edd1340fda4118650819c31c39020c309ee4731c4b3" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.613511 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.617342 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.625696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bf2\" (UniqueName: \"kubernetes.io/projected/0db5205c-c226-4ce2-b734-b250d0c4f7f5-kube-api-access-78bf2\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.625939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.625983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-config-data\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.709541 4707 scope.go:117] "RemoveContainer" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" Jan 21 15:54:07 crc kubenswrapper[4707]: E0121 15:54:07.722396 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870\": container with ID starting with 7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870 not found: ID does not exist" containerID="7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.722442 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870"} err="failed to get container status \"7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870\": rpc error: code = NotFound desc = could not find container \"7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870\": container with ID starting with 7ce509e0fd085304c55871435f99590816ee887b61e76e58537a58378856a870 not found: ID does not exist" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-config-data\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-combined-ca-bundle\") pod \"0810465b-2f7e-49f5-8d97-c8416940a463\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-internal-tls-certs\") pod \"8ef0e4e0-0bae-4886-894a-863f13095f5b\" (UID: 
\"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-httpd-run\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kolla-config\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-combined-ca-bundle\") pod \"ed60a6eb-8348-4608-8905-143106b4d545\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-scripts\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-logs\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv2hb\" (UniqueName: \"kubernetes.io/projected/c535f926-bfe0-4448-8090-1b942fe85c89-kube-api-access-fv2hb\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.736977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m5wv\" (UniqueName: \"kubernetes.io/projected/ed60a6eb-8348-4608-8905-143106b4d545-kube-api-access-7m5wv\") pod \"ed60a6eb-8348-4608-8905-143106b4d545\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-scripts\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-combined-ca-bundle\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 
15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-config-data\") pod \"ed60a6eb-8348-4608-8905-143106b4d545\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-combined-ca-bundle\") pod \"8ef0e4e0-0bae-4886-894a-863f13095f5b\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-config-data\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-internal-tls-certs\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmm82\" (UniqueName: \"kubernetes.io/projected/0810465b-2f7e-49f5-8d97-c8416940a463-kube-api-access-zmm82\") pod \"0810465b-2f7e-49f5-8d97-c8416940a463\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5tb4\" (UniqueName: \"kubernetes.io/projected/11f0696a-c173-41a3-926d-29ef8dcef753-kube-api-access-c5tb4\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-nova-metadata-tls-certs\") pod \"ed60a6eb-8348-4608-8905-143106b4d545\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcrz9\" (UniqueName: \"kubernetes.io/projected/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kube-api-access-gcrz9\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 
15:54:07.737384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-httpd-run\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-combined-ca-bundle\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-default\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-config-data\") pod \"0810465b-2f7e-49f5-8d97-c8416940a463\" (UID: \"0810465b-2f7e-49f5-8d97-c8416940a463\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef0e4e0-0bae-4886-894a-863f13095f5b-logs\") pod \"8ef0e4e0-0bae-4886-894a-863f13095f5b\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-scripts\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-operator-scripts\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-public-tls-certs\") pod \"8ef0e4e0-0bae-4886-894a-863f13095f5b\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-combined-ca-bundle\") pod \"11f0696a-c173-41a3-926d-29ef8dcef753\" (UID: \"11f0696a-c173-41a3-926d-29ef8dcef753\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-combined-ca-bundle\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737599 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-generated\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data-custom\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c535f926-bfe0-4448-8090-1b942fe85c89-etc-machine-id\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-galera-tls-certs\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-internal-tls-certs\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-public-tls-certs\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\" (UID: \"cc96bd5b-fb78-41b8-a65b-352f905ff8d5\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhk45\" (UniqueName: \"kubernetes.io/projected/8ef0e4e0-0bae-4886-894a-863f13095f5b-kube-api-access-xhk45\") pod \"8ef0e4e0-0bae-4886-894a-863f13095f5b\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-config-data\") pod \"8ef0e4e0-0bae-4886-894a-863f13095f5b\" (UID: \"8ef0e4e0-0bae-4886-894a-863f13095f5b\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed60a6eb-8348-4608-8905-143106b4d545-logs\") pod \"ed60a6eb-8348-4608-8905-143106b4d545\" (UID: \"ed60a6eb-8348-4608-8905-143106b4d545\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737838 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-b8bg2\" (UniqueName: \"kubernetes.io/projected/60e0b49a-6e14-42af-9480-877a354cec42-kube-api-access-b8bg2\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-logs\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c535f926-bfe0-4448-8090-1b942fe85c89-logs\") pod \"c535f926-bfe0-4448-8090-1b942fe85c89\" (UID: \"c535f926-bfe0-4448-8090-1b942fe85c89\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.737900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-public-tls-certs\") pod \"60e0b49a-6e14-42af-9480-877a354cec42\" (UID: \"60e0b49a-6e14-42af-9480-877a354cec42\") " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.738777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.738871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-scripts\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.738892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.738932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.738981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wl6g\" (UniqueName: \"kubernetes.io/projected/e7b163de-7daf-4893-abcd-9fefd3850234-kube-api-access-7wl6g\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-config-data\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wpg6\" (UniqueName: \"kubernetes.io/projected/47097d1a-6225-4900-85a1-a28f2f698a78-kube-api-access-7wpg6\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-kolla-config\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78bf2\" (UniqueName: \"kubernetes.io/projected/0db5205c-c226-4ce2-b734-b250d0c4f7f5-kube-api-access-78bf2\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-config-data\") pod \"ceilometer-0\" (UID: 
\"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.739402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-default\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.740684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.741197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.742764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c535f926-bfe0-4448-8090-1b942fe85c89-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.742874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.744696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef0e4e0-0bae-4886-894a-863f13095f5b-logs" (OuterVolumeSpecName: "logs") pod "8ef0e4e0-0bae-4886-894a-863f13095f5b" (UID: "8ef0e4e0-0bae-4886-894a-863f13095f5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.745134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.749688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.753547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.758188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-logs" (OuterVolumeSpecName: "logs") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.763571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed60a6eb-8348-4608-8905-143106b4d545-logs" (OuterVolumeSpecName: "logs") pod "ed60a6eb-8348-4608-8905-143106b4d545" (UID: "ed60a6eb-8348-4608-8905-143106b4d545"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.764065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c535f926-bfe0-4448-8090-1b942fe85c89-logs" (OuterVolumeSpecName: "logs") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.767691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-logs" (OuterVolumeSpecName: "logs") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.781548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.784306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kube-api-access-gcrz9" (OuterVolumeSpecName: "kube-api-access-gcrz9") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "kube-api-access-gcrz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.784465 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-scripts" (OuterVolumeSpecName: "scripts") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.784860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.785146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.785964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.787914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.792093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-scripts" (OuterVolumeSpecName: "scripts") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.797394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f0696a-c173-41a3-926d-29ef8dcef753-kube-api-access-c5tb4" (OuterVolumeSpecName: "kube-api-access-c5tb4") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "kube-api-access-c5tb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.798604 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-scripts" (OuterVolumeSpecName: "scripts") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.798934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c535f926-bfe0-4448-8090-1b942fe85c89-kube-api-access-fv2hb" (OuterVolumeSpecName: "kube-api-access-fv2hb") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "kube-api-access-fv2hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.809509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bf2\" (UniqueName: \"kubernetes.io/projected/0db5205c-c226-4ce2-b734-b250d0c4f7f5-kube-api-access-78bf2\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.812048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0810465b-2f7e-49f5-8d97-c8416940a463-kube-api-access-zmm82" (OuterVolumeSpecName: "kube-api-access-zmm82") pod "0810465b-2f7e-49f5-8d97-c8416940a463" (UID: "0810465b-2f7e-49f5-8d97-c8416940a463"). InnerVolumeSpecName "kube-api-access-zmm82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.812319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e0b49a-6e14-42af-9480-877a354cec42-kube-api-access-b8bg2" (OuterVolumeSpecName: "kube-api-access-b8bg2") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "kube-api-access-b8bg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.815616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "cc96bd5b-fb78-41b8-a65b-352f905ff8d5" (UID: "cc96bd5b-fb78-41b8-a65b-352f905ff8d5"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.815748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed60a6eb-8348-4608-8905-143106b4d545-kube-api-access-7m5wv" (OuterVolumeSpecName: "kube-api-access-7m5wv") pod "ed60a6eb-8348-4608-8905-143106b4d545" (UID: "ed60a6eb-8348-4608-8905-143106b4d545"). InnerVolumeSpecName "kube-api-access-7m5wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.822753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.823241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-config-data\") pod \"nova-scheduler-0\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.836140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef0e4e0-0bae-4886-894a-863f13095f5b-kube-api-access-xhk45" (OuterVolumeSpecName: "kube-api-access-xhk45") pod "8ef0e4e0-0bae-4886-894a-863f13095f5b" (UID: "8ef0e4e0-0bae-4886-894a-863f13095f5b"). InnerVolumeSpecName "kube-api-access-xhk45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-scripts\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wl6g\" (UniqueName: \"kubernetes.io/projected/e7b163de-7daf-4893-abcd-9fefd3850234-kube-api-access-7wl6g\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wpg6\" (UniqueName: \"kubernetes.io/projected/47097d1a-6225-4900-85a1-a28f2f698a78-kube-api-access-7wpg6\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc 
kubenswrapper[4707]: I0121 15:54:07.843940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.843987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-kolla-config\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-config-data\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-default\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844192 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844205 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcrz9\" (UniqueName: \"kubernetes.io/projected/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kube-api-access-gcrz9\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844216 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844225 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844234 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef0e4e0-0bae-4886-894a-863f13095f5b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844242 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844250 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844267 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844275 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844284 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844291 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c535f926-bfe0-4448-8090-1b942fe85c89-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844300 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844313 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844321 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhk45\" (UniqueName: \"kubernetes.io/projected/8ef0e4e0-0bae-4886-894a-863f13095f5b-kube-api-access-xhk45\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844329 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed60a6eb-8348-4608-8905-143106b4d545-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844337 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8bg2\" (UniqueName: \"kubernetes.io/projected/60e0b49a-6e14-42af-9480-877a354cec42-kube-api-access-b8bg2\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844346 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e0b49a-6e14-42af-9480-877a354cec42-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844353 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c535f926-bfe0-4448-8090-1b942fe85c89-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844361 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844368 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc96bd5b-fb78-41b8-a65b-352f905ff8d5-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844376 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844383 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11f0696a-c173-41a3-926d-29ef8dcef753-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844391 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv2hb\" (UniqueName: \"kubernetes.io/projected/c535f926-bfe0-4448-8090-1b942fe85c89-kube-api-access-fv2hb\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844399 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m5wv\" (UniqueName: \"kubernetes.io/projected/ed60a6eb-8348-4608-8905-143106b4d545-kube-api-access-7m5wv\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844407 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844420 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:54:07 crc 
kubenswrapper[4707]: I0121 15:54:07.844429 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmm82\" (UniqueName: \"kubernetes.io/projected/0810465b-2f7e-49f5-8d97-c8416940a463-kube-api-access-zmm82\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.844437 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5tb4\" (UniqueName: \"kubernetes.io/projected/11f0696a-c173-41a3-926d-29ef8dcef753-kube-api-access-c5tb4\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.850329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-kolla-config\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.857372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.857588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.859327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.865532 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.867379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.868677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-default\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.872359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wpg6\" (UniqueName: \"kubernetes.io/projected/47097d1a-6225-4900-85a1-a28f2f698a78-kube-api-access-7wpg6\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.872777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wl6g\" (UniqueName: \"kubernetes.io/projected/e7b163de-7daf-4893-abcd-9fefd3850234-kube-api-access-7wl6g\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.880555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.906952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.907215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-config-data\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.908211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.908612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d2613b63-6d79-4d28-9be6-ca7a8431c724","Type":"ContainerStarted","Data":"bc186b096804ce32276dc98c1a294a7075e03b669bc0882cb5417fd3a9aee28d"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.909827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82","Type":"ContainerStarted","Data":"da0a868e4a63c3078ec90d7fb603c14478579582a1b0178a820be0cbec3e8780"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.914895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"728a3264-18ce-4d47-86f5-868f00cc4558","Type":"ContainerStarted","Data":"52a876ea60f2280980602b262fa95fa606243036aec33822245ef9b90807649d"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.915928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-scripts\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.916000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.916414 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" event={"ID":"3769b2e3-4438-441b-867e-65d6dbd6e36c","Type":"ContainerStarted","Data":"b5b35b997c87a59fe1f8a09cb7639122fe8e23066d82a87103fb4be086b8ce2e"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.916796 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" containerID="cri-o://272acba6d788239712a5071adbac78f9174904c25e843ddb5f470eb2cbaf2e24" gracePeriod=30 Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.917066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.917138 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.917169 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.917399 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" containerID="cri-o://b5b35b997c87a59fe1f8a09cb7639122fe8e23066d82a87103fb4be086b8ce2e" gracePeriod=30 Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.932869 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podStartSLOduration=5.932856925 podStartE2EDuration="5.932856925s" podCreationTimestamp="2026-01-21 15:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:07.930422697 +0000 UTC m=+3145.111938919" watchObservedRunningTime="2026-01-21 15:54:07.932856925 +0000 UTC m=+3145.114373146" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.941331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8ef0e4e0-0bae-4886-894a-863f13095f5b","Type":"ContainerDied","Data":"b8ff0b54881d042be7dd153588eca3fa77a017ec8153dd24b3ee8c576132410b"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.941376 4707 scope.go:117] "RemoveContainer" containerID="304a9698f83732cdf07b1c1beb63c8b9696304d817437a60ab659c572ebfb73c" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.941526 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.952193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"db6e605f-fa18-4b63-abb0-398ff39fb010","Type":"ContainerStarted","Data":"c76d285e93b522930c571d4453414c664b3f53903ed74d4779ddec9a94e191a4"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.958695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ed60a6eb-8348-4608-8905-143106b4d545","Type":"ContainerDied","Data":"9c5af38e95f7f727a60c29c38dd1566c6d27bfc300d172a47a803fb2f51d7e6a"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.958727 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.970094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"60e0b49a-6e14-42af-9480-877a354cec42","Type":"ContainerDied","Data":"640f799cd26a8368e83fe7b7e4531f382bdc1e4b56c140eeee2fde59510990c6"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.970182 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.981072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" event={"ID":"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a","Type":"ContainerStarted","Data":"7f6ca570badd360c4e550f469d4e2d6df888ec83adc2f833b5160cc1738f2b45"} Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.981377 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-log" containerID="cri-o://5317381dd2ab17c4e6cf8ac5d6c3209a97820d7d93db33f88df2d938cc564924" gracePeriod=30 Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.981837 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-api" containerID="cri-o://7f6ca570badd360c4e550f469d4e2d6df888ec83adc2f833b5160cc1738f2b45" gracePeriod=30 Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.981956 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:07 crc kubenswrapper[4707]: I0121 15:54:07.982012 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.002747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"11f0696a-c173-41a3-926d-29ef8dcef753","Type":"ContainerDied","Data":"4cb85f2595f3e431e8aedccad0ccbdc0ca961ea0c8df515503936cbc52f487cb"} Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.003081 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.014514 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.032025 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" podStartSLOduration=6.012550761 podStartE2EDuration="6.012550761s" podCreationTimestamp="2026-01-21 15:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:07.99938777 +0000 UTC m=+3145.180903992" watchObservedRunningTime="2026-01-21 15:54:08.012550761 +0000 UTC m=+3145.194066984" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.032298 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.036317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c535f926-bfe0-4448-8090-1b942fe85c89","Type":"ContainerDied","Data":"55bf97847d806798394ead48d7c3b5a0d95018aea305415b900a7b555b0ad638"} Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.036437 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.040232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.040431 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener-log" containerID="cri-o://4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da" gracePeriod=30 Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.041758 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker-log" containerID="cri-o://66872aae878a51a83199443d8ec479ce07708d2a6515a66af88f405708649bdf" gracePeriod=30 Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.041848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" event={"ID":"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb","Type":"ContainerStarted","Data":"fcfc18351f1445f8f4c492a625c615d629a4ab60f1b0501cb64068fcf7a7d318"} Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.042120 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker" containerID="cri-o://fcfc18351f1445f8f4c492a625c615d629a4ab60f1b0501cb64068fcf7a7d318" gracePeriod=30 Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.043626 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener" containerID="cri-o://298713c226aa15237d7cd72cb0e956083e7716fdc3d3e28167b1f08dfa128879" gracePeriod=30 Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.055088 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.067543 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" podStartSLOduration=6.06753278 podStartE2EDuration="6.06753278s" podCreationTimestamp="2026-01-21 15:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:08.065860416 +0000 UTC m=+3145.247376638" watchObservedRunningTime="2026-01-21 15:54:08.06753278 +0000 UTC m=+3145.249049002" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.129577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq"] Jan 21 15:54:08 crc kubenswrapper[4707]: W0121 15:54:08.195524 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a307a5c_9c6e_42d4_80a8_7d6e50650fcd.slice/crio-78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97 WatchSource:0}: Error finding container 78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97: Status 404 returned error can't find the container with id 78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97 Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.240942 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-75586f844d-9x57h"] Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.336344 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk"] Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.356885 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5"] Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.359798 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.369843 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: W0121 15:54:08.428277 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda33f40c3_3064_405e_a420_da052b2b4a66.slice/crio-b24977ab2a042384f059c50e4e011e2e6564a87a0ae23834e79f3a4f5f087280 WatchSource:0}: Error finding container b24977ab2a042384f059c50e4e011e2e6564a87a0ae23834e79f3a4f5f087280: Status 404 returned error can't find the container with id b24977ab2a042384f059c50e4e011e2e6564a87a0ae23834e79f3a4f5f087280 Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.501605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.520234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-config-data" (OuterVolumeSpecName: "config-data") pod "0810465b-2f7e-49f5-8d97-c8416940a463" (UID: "0810465b-2f7e-49f5-8d97-c8416940a463"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.572531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef0e4e0-0bae-4886-894a-863f13095f5b" (UID: "8ef0e4e0-0bae-4886-894a-863f13095f5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.575039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.575854 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.575881 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.575892 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.590433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ef0e4e0-0bae-4886-894a-863f13095f5b" (UID: "8ef0e4e0-0bae-4886-894a-863f13095f5b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.594399 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.595373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-config-data" (OuterVolumeSpecName: "config-data") pod "8ef0e4e0-0bae-4886-894a-863f13095f5b" (UID: "8ef0e4e0-0bae-4886-894a-863f13095f5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.609312 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.618682 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.619743 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.664975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed60a6eb-8348-4608-8905-143106b4d545" (UID: "ed60a6eb-8348-4608-8905-143106b4d545"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.680852 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.680891 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.680903 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.680914 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.680922 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.680930 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.727662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.734176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-config-data" (OuterVolumeSpecName: "config-data") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.764403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-config-data" (OuterVolumeSpecName: "config-data") pod "ed60a6eb-8348-4608-8905-143106b4d545" (UID: "ed60a6eb-8348-4608-8905-143106b4d545"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.783337 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.783362 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.783373 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.818171 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0810465b-2f7e-49f5-8d97-c8416940a463" (UID: "0810465b-2f7e-49f5-8d97-c8416940a463"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.831945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ef0e4e0-0bae-4886-894a-863f13095f5b" (UID: "8ef0e4e0-0bae-4886-894a-863f13095f5b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.853925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.892448 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.892471 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0810465b-2f7e-49f5-8d97-c8416940a463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.892483 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef0e4e0-0bae-4886-894a-863f13095f5b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.898405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.902990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60e0b49a-6e14-42af-9480-877a354cec42" (UID: "60e0b49a-6e14-42af-9480-877a354cec42"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.906994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data" (OuterVolumeSpecName: "config-data") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: E0121 15:54:08.911919 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b9de3b0_5497_4666_800f_6e5db9e68f8e.slice/crio-8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab385ad_11d0_4da8_bf53_70997a2b9fa3.slice/crio-conmon-4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece56fe7_15d7_4701_b4c7_e102b077c8ea.slice/crio-fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab385ad_11d0_4da8_bf53_70997a2b9fa3.slice/crio-4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46ef7fb6_d2ea_4c55_ad87_5d4d49a748bb.slice/crio-conmon-66872aae878a51a83199443d8ec479ce07708d2a6515a66af88f405708649bdf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd85bbe28_55fb_4af7_8dad_8bc6ffdf5d1a.slice/crio-conmon-5317381dd2ab17c4e6cf8ac5d6c3209a97820d7d93db33f88df2d938cc564924.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.913410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c535f926-bfe0-4448-8090-1b942fe85c89" (UID: "c535f926-bfe0-4448-8090-1b942fe85c89"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.939156 4707 scope.go:117] "RemoveContainer" containerID="483d69c30647d452519d9267bee5354dc667b86da55acfe143a2b6d4b7ae08a0" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.942926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-config-data" (OuterVolumeSpecName: "config-data") pod "11f0696a-c173-41a3-926d-29ef8dcef753" (UID: "11f0696a-c173-41a3-926d-29ef8dcef753"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.958456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ed60a6eb-8348-4608-8905-143106b4d545" (UID: "ed60a6eb-8348-4608-8905-143106b4d545"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.997944 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e0b49a-6e14-42af-9480-877a354cec42-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.997972 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.997981 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.997992 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed60a6eb-8348-4608-8905-143106b4d545-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.998002 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f0696a-c173-41a3-926d-29ef8dcef753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:08 crc kubenswrapper[4707]: I0121 15:54:08.998011 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c535f926-bfe0-4448-8090-1b942fe85c89-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.092964 4707 scope.go:117] "RemoveContainer" containerID="58e6b3acf0942b299010a6d66e7bf7396f26347a33cd0c773d801e2540add159" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.094987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.101878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"728a3264-18ce-4d47-86f5-868f00cc4558","Type":"ContainerStarted","Data":"16c71ef64e21aa9d7aef3046c8569e917e4a9710d257ebb19afc5d621cdd72ff"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.102299 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.103926 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerID="4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da" exitCode=143 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.103990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" event={"ID":"4ab385ad-11d0-4da8-bf53-70997a2b9fa3","Type":"ContainerDied","Data":"4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.108073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" event={"ID":"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b","Type":"ContainerStarted","Data":"0cab14f7b1ef62e790e9b6b4c077c4a63e7c0e0452efd922aaa71beb9af5a4d3"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.121101 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.128979 4707 generic.go:334] "Generic (PLEG): container finished" podID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerID="5317381dd2ab17c4e6cf8ac5d6c3209a97820d7d93db33f88df2d938cc564924" exitCode=143 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.129029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" event={"ID":"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a","Type":"ContainerDied","Data":"5317381dd2ab17c4e6cf8ac5d6c3209a97820d7d93db33f88df2d938cc564924"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.135095 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=4.135080529 podStartE2EDuration="4.135080529s" podCreationTimestamp="2026-01-21 15:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:09.116125917 +0000 UTC m=+3146.297642138" watchObservedRunningTime="2026-01-21 15:54:09.135080529 +0000 UTC m=+3146.316596751" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.139976 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" containerID="8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f" exitCode=0 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.140206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"6b9de3b0-5497-4666-800f-6e5db9e68f8e","Type":"ContainerDied","Data":"8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.141007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"6b9de3b0-5497-4666-800f-6e5db9e68f8e","Type":"ContainerDied","Data":"26864f10b43e3544c0cb3ccb68e807354afd363c061983a48a6ca15d68d32b5d"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.141095 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26864f10b43e3544c0cb3ccb68e807354afd363c061983a48a6ca15d68d32b5d" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.155993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" event={"ID":"eaa206fe-a991-4947-8e0e-0a8f80879f55","Type":"ContainerStarted","Data":"5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.156566 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.165551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" event={"ID":"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd","Type":"ContainerStarted","Data":"78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.175886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d2613b63-6d79-4d28-9be6-ca7a8431c724","Type":"ContainerStarted","Data":"c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.178274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.183235 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" podStartSLOduration=6.183222595 podStartE2EDuration="6.183222595s" podCreationTimestamp="2026-01-21 15:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:09.171166776 +0000 UTC m=+3146.352682998" watchObservedRunningTime="2026-01-21 15:54:09.183222595 +0000 UTC m=+3146.364738818" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.194972 4707 generic.go:334] "Generic (PLEG): container finished" podID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerID="fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9" exitCode=0 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.195054 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.195767 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc92b42-4a79-43c3-b749-ae1058708648" path="/var/lib/kubelet/pods/0cc92b42-4a79-43c3-b749-ae1058708648/volumes" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.196684 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f72a259-adb4-405e-b6e3-6a39fbe668b8" path="/var/lib/kubelet/pods/1f72a259-adb4-405e-b6e3-6a39fbe668b8/volumes" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.197355 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67bae98e-ed83-41c8-be5a-15eae8c9aa48" path="/var/lib/kubelet/pods/67bae98e-ed83-41c8-be5a-15eae8c9aa48/volumes" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ece56fe7-15d7-4701-b4c7-e102b077c8ea-etc-machine-id\") pod \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data-custom\") pod \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data\") pod \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-combined-ca-bundle\") pod \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvdnl\" (UniqueName: \"kubernetes.io/projected/ece56fe7-15d7-4701-b4c7-e102b077c8ea-kube-api-access-qvdnl\") pod \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-scripts\") pod \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\" (UID: \"ece56fe7-15d7-4701-b4c7-e102b077c8ea\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.200643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ece56fe7-15d7-4701-b4c7-e102b077c8ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ece56fe7-15d7-4701-b4c7-e102b077c8ea" (UID: "ece56fe7-15d7-4701-b4c7-e102b077c8ea"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.203727 4707 generic.go:334] "Generic (PLEG): container finished" podID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerID="272acba6d788239712a5071adbac78f9174904c25e843ddb5f470eb2cbaf2e24" exitCode=143 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.206200 4707 scope.go:117] "RemoveContainer" containerID="9504c925f64890f6adfbe11ade457b63a59f8d88820cd51e5d1948828eaf36da" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.208909 4707 generic.go:334] "Generic (PLEG): container finished" podID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerID="66872aae878a51a83199443d8ec479ce07708d2a6515a66af88f405708649bdf" exitCode=143 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.212690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ece56fe7-15d7-4701-b4c7-e102b077c8ea" (UID: "ece56fe7-15d7-4701-b4c7-e102b077c8ea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.214190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece56fe7-15d7-4701-b4c7-e102b077c8ea-kube-api-access-qvdnl" (OuterVolumeSpecName: "kube-api-access-qvdnl") pod "ece56fe7-15d7-4701-b4c7-e102b077c8ea" (UID: "ece56fe7-15d7-4701-b4c7-e102b077c8ea"). InnerVolumeSpecName "kube-api-access-qvdnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.215708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-scripts" (OuterVolumeSpecName: "scripts") pod "ece56fe7-15d7-4701-b4c7-e102b077c8ea" (UID: "ece56fe7-15d7-4701-b4c7-e102b077c8ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.245082 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=4.24506225 podStartE2EDuration="4.24506225s" podCreationTimestamp="2026-01-21 15:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:09.194891337 +0000 UTC m=+3146.376407560" watchObservedRunningTime="2026-01-21 15:54:09.24506225 +0000 UTC m=+3146.426578472" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.282134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ece56fe7-15d7-4701-b4c7-e102b077c8ea" (UID: "ece56fe7-15d7-4701-b4c7-e102b077c8ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.303331 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ece56fe7-15d7-4701-b4c7-e102b077c8ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.303361 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.303372 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.303381 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvdnl\" (UniqueName: \"kubernetes.io/projected/ece56fe7-15d7-4701-b4c7-e102b077c8ea-kube-api-access-qvdnl\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.303389 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: W0121 15:54:09.329307 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47097d1a_6225_4900_85a1_a28f2f698a78.slice/crio-f3d67b476bde1c26de4d8efd1ead80410c9e0a11492e3e04bf4389e2fd795596 WatchSource:0}: Error finding container f3d67b476bde1c26de4d8efd1ead80410c9e0a11492e3e04bf4389e2fd795596: Status 404 returned error can't find the container with id f3d67b476bde1c26de4d8efd1ead80410c9e0a11492e3e04bf4389e2fd795596 Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.332396 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ece56fe7-15d7-4701-b4c7-e102b077c8ea","Type":"ContainerDied","Data":"fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ece56fe7-15d7-4701-b4c7-e102b077c8ea","Type":"ContainerDied","Data":"5fd7e6f4f05b2fe31544430845b31fdd0f11aefe60765d95d1234e5e9d494675"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" event={"ID":"7e31617e-3380-4e4f-9558-ceb4164fd190","Type":"ContainerStarted","Data":"702acc4e961c18bafe2ee10a33468a9d5240115ee283e4e31e8cbfd0a20b4a6f"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" event={"ID":"3769b2e3-4438-441b-867e-65d6dbd6e36c","Type":"ContainerDied","Data":"272acba6d788239712a5071adbac78f9174904c25e843ddb5f470eb2cbaf2e24"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351536 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351552 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" event={"ID":"a33f40c3-3064-405e-a420-da052b2b4a66","Type":"ContainerStarted","Data":"b24977ab2a042384f059c50e4e011e2e6564a87a0ae23834e79f3a4f5f087280"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351572 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351586 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.351595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" event={"ID":"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb","Type":"ContainerDied","Data":"66872aae878a51a83199443d8ec479ce07708d2a6515a66af88f405708649bdf"} Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.358640 4707 scope.go:117] "RemoveContainer" containerID="2d20d49e7b7da58ecb8376a79713efa7bdcae5fb6a7a699d672be0eea1161321" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.359730 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371166 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371503 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-log" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371520 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" 
containerName="nova-api-log" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371540 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-api" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371548 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-api" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="cinder-scheduler" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371563 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="cinder-scheduler" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371583 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-httpd" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371588 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-httpd" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371600 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-log" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371606 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-log" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api-log" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371629 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api-log" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371641 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="probe" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371646 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="probe" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371654 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371659 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.371668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371674 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371866 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="probe" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371880 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" 
containerName="nova-api-log" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371891 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-log" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371898 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" containerName="nova-api-api" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371912 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" containerName="cinder-scheduler" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" containerName="glance-httpd" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371930 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371939 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" containerName="cinder-api-log" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.371945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.372706 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.375249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.375538 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.376387 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.377702 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-dr5v9" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.378740 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.382801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data" (OuterVolumeSpecName: "config-data") pod "ece56fe7-15d7-4701-b4c7-e102b077c8ea" (UID: "ece56fe7-15d7-4701-b4c7-e102b077c8ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.386950 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.403903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.409392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-combined-ca-bundle\") pod \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.409450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhpw\" (UniqueName: \"kubernetes.io/projected/6b9de3b0-5497-4666-800f-6e5db9e68f8e-kube-api-access-nmhpw\") pod \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.409718 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-config-data\") pod \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\" (UID: \"6b9de3b0-5497-4666-800f-6e5db9e68f8e\") " Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.410875 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece56fe7-15d7-4701-b4c7-e102b077c8ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.425412 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.428947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9de3b0-5497-4666-800f-6e5db9e68f8e-kube-api-access-nmhpw" (OuterVolumeSpecName: "kube-api-access-nmhpw") pod "6b9de3b0-5497-4666-800f-6e5db9e68f8e" (UID: "6b9de3b0-5497-4666-800f-6e5db9e68f8e"). InnerVolumeSpecName "kube-api-access-nmhpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.431879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.435539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.435799 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.435975 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.438788 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.462139 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.464221 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.467439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.467639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.468008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.473599 4707 scope.go:117] "RemoveContainer" containerID="fedf8ec2c1a7c0866b937ba14cc1081426eb67eaeb9d60f31b28228f8b04b03f" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.485707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.511979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5cd14d-4d69-4f5a-9901-315fe321579c-logs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-public-tls-certs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvbgc\" (UniqueName: \"kubernetes.io/projected/35e0d910-5029-444b-83e7-92d44c8ae55a-kube-api-access-gvbgc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbwq8\" (UniqueName: \"kubernetes.io/projected/da5cd14d-4d69-4f5a-9901-315fe321579c-kube-api-access-gbwq8\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-config-data\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.512982 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhpw\" (UniqueName: \"kubernetes.io/projected/6b9de3b0-5497-4666-800f-6e5db9e68f8e-kube-api-access-nmhpw\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.536651 4707 scope.go:117] "RemoveContainer" containerID="8f3d568ef8fe02166321f2465bb001d67912afe92b6e85e4165f4ee77c91e825" Jan 21 15:54:09 crc 
kubenswrapper[4707]: I0121 15:54:09.586403 4707 scope.go:117] "RemoveContainer" containerID="117bd9103ad25aebe304fbfcc64acbc37d4aab9c45ff0d48066ffe05295752c8" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.591063 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.597238 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.604493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbwq8\" (UniqueName: \"kubernetes.io/projected/da5cd14d-4d69-4f5a-9901-315fe321579c-kube-api-access-gbwq8\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-config-data\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk6db\" (UniqueName: \"kubernetes.io/projected/84ca704f-a732-47ec-a479-dabdeea5b864-kube-api-access-mk6db\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ca704f-a732-47ec-a479-dabdeea5b864-logs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-public-tls-certs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84ca704f-a732-47ec-a479-dabdeea5b864-etc-machine-id\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5cd14d-4d69-4f5a-9901-315fe321579c-logs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.614990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-public-tls-certs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvbgc\" (UniqueName: \"kubernetes.io/projected/35e0d910-5029-444b-83e7-92d44c8ae55a-kube-api-access-gvbgc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.615106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-scripts\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.616271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5cd14d-4d69-4f5a-9901-315fe321579c-logs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.616456 4707 scope.go:117] "RemoveContainer" containerID="6861b5894c2e268dd615bf72e68c53f494b1ac91f918b0c29b772a58817145a6" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.616557 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.616676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.616904 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.617091 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.618763 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.620160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.620666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.622337 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.625413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.627231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.627394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-public-tls-certs\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.628400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.629289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-config-data\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.631249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbwq8\" (UniqueName: \"kubernetes.io/projected/da5cd14d-4d69-4f5a-9901-315fe321579c-kube-api-access-gbwq8\") pod \"nova-api-0\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.636304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.637696 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.639430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvbgc\" (UniqueName: \"kubernetes.io/projected/35e0d910-5029-444b-83e7-92d44c8ae55a-kube-api-access-gvbgc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.648122 4707 scope.go:117] "RemoveContainer" containerID="3c571ef8d58ac48cc178a7b7fbb07a0155fb018f81b61692d264dbdd939885a1" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.650096 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.663665 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.671094 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.678274 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.679880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.680863 4707 scope.go:117] "RemoveContainer" containerID="dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.682127 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.682338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.682351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.684533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.686278 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.692434 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.695678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.697158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.704013 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.717740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.721930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk6db\" (UniqueName: \"kubernetes.io/projected/84ca704f-a732-47ec-a479-dabdeea5b864-kube-api-access-mk6db\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.722588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4427\" (UniqueName: \"kubernetes.io/projected/12052361-f984-4545-a1e1-9a8b80e2cceb-kube-api-access-r4427\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.727902 4707 scope.go:117] "RemoveContainer" containerID="fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.735158 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.740196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.741708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.741892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ca704f-a732-47ec-a479-dabdeea5b864-logs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-public-tls-certs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84ca704f-a732-47ec-a479-dabdeea5b864-etc-machine-id\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data-custom\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742612 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-scripts\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.742899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84ca704f-a732-47ec-a479-dabdeea5b864-etc-machine-id\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.744518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ca704f-a732-47ec-a479-dabdeea5b864-logs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.747598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-scripts\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.764020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.784744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.788553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data-custom\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.789846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.790240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk6db\" (UniqueName: \"kubernetes.io/projected/84ca704f-a732-47ec-a479-dabdeea5b864-kube-api-access-mk6db\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.790477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-public-tls-certs\") pod \"cinder-api-0\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.800898 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.812901 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.825873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4427\" (UniqueName: \"kubernetes.io/projected/12052361-f984-4545-a1e1-9a8b80e2cceb-kube-api-access-r4427\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9sg\" (UniqueName: \"kubernetes.io/projected/28a92eee-d8bb-46d5-bacb-e641b5be138f-kube-api-access-kh9sg\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-config-data\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087bdc72-3300-4e75-9b60-72380c090c95-logs\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-logs\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-config-data\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-scripts\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqww\" (UniqueName: \"kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.845635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.847366 4707 scope.go:117] "RemoveContainer" containerID="dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724" Jan 21 15:54:09 crc 
kubenswrapper[4707]: I0121 15:54:09.856672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.859368 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724\": container with ID starting with dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724 not found: ID does not exist" containerID="dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.859410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724"} err="failed to get container status \"dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724\": rpc error: code = NotFound desc = could not find container \"dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724\": container with ID starting with dcc746d076c142569163bd48e38f8a79d65dff325c94e25c53a2cd6329c66724 not found: ID does not exist" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.859431 4707 scope.go:117] "RemoveContainer" containerID="fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9" Jan 21 15:54:09 crc kubenswrapper[4707]: E0121 15:54:09.864231 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9\": container with ID starting with fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9 not found: ID does not exist" containerID="fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.864279 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9"} err="failed to get container status \"fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9\": rpc error: code = NotFound desc = could not find container \"fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9\": container with ID starting with fe8f354bdc2d024c4e7ffac34372801116b56beba59968034641b535651efba9 not found: ID does not exist" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.867329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.867779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-config-data" (OuterVolumeSpecName: "config-data") pod "6b9de3b0-5497-4666-800f-6e5db9e68f8e" (UID: "6b9de3b0-5497-4666-800f-6e5db9e68f8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.889864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4427\" (UniqueName: \"kubernetes.io/projected/12052361-f984-4545-a1e1-9a8b80e2cceb-kube-api-access-r4427\") pod \"nova-cell1-conductor-0\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.895255 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.898921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b9de3b0-5497-4666-800f-6e5db9e68f8e" (UID: "6b9de3b0-5497-4666-800f-6e5db9e68f8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.899653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.925498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.928436 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.935328 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.938295 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.946793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.946855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9sg\" (UniqueName: \"kubernetes.io/projected/28a92eee-d8bb-46d5-bacb-e641b5be138f-kube-api-access-kh9sg\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.946875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-config-data\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.946906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087bdc72-3300-4e75-9b60-72380c090c95-logs\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.946939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.946989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-logs\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-config-data\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc 
kubenswrapper[4707]: I0121 15:54:09.947065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqww\" (UniqueName: \"kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-scripts\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947166 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947178 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9de3b0-5497-4666-800f-6e5db9e68f8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.947994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087bdc72-3300-4e75-9b60-72380c090c95-logs\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.950148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.951159 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.952516 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.953165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-logs\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.985639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.989328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.989335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:09 crc kubenswrapper[4707]: I0121 15:54:09.998173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.011272 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.012743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-config-data\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.014824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.016172 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.019165 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.037236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.047531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-config-data\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.047955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqww\" (UniqueName: \"kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww\") pod \"nova-metadata-0\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.049783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.056928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9sg\" (UniqueName: \"kubernetes.io/projected/28a92eee-d8bb-46d5-bacb-e641b5be138f-kube-api-access-kh9sg\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.056936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.057075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.062370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqztn\" (UniqueName: \"kubernetes.io/projected/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-kube-api-access-nqztn\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.062994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.063034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.063119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.063234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.066862 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.080943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-scripts\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.108630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169280 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9vrd\" (UniqueName: \"kubernetes.io/projected/3e495769-a033-41f2-8285-3f3274673250-kube-api-access-r9vrd\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e495769-a033-41f2-8285-3f3274673250-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqztn\" (UniqueName: \"kubernetes.io/projected/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-kube-api-access-nqztn\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-scripts\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 
15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.169553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.174308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.174547 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.175687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.175915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.182742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.186526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.186659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 
15:54:10.189626 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.198502 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.199154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqztn\" (UniqueName: \"kubernetes.io/projected/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-kube-api-access-nqztn\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.212789 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5665575645-v8ktc"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.224177 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.229336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.244945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"db6e605f-fa18-4b63-abb0-398ff39fb010","Type":"ContainerStarted","Data":"ecdb17bcb0ef07f2c4492e39172d86abfd80846b3e7790e912da5c2e4616f285"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.245461 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="openstack-network-exporter" containerID="cri-o://ecdb17bcb0ef07f2c4492e39172d86abfd80846b3e7790e912da5c2e4616f285" gracePeriod=300 Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.250013 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.250917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"47097d1a-6225-4900-85a1-a28f2f698a78","Type":"ContainerStarted","Data":"505527e41043faf26c8fb4d1c3789cdbd1f623b88d8c64575cc6591ba70dc015"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.250953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"47097d1a-6225-4900-85a1-a28f2f698a78","Type":"ContainerStarted","Data":"f3d67b476bde1c26de4d8efd1ead80410c9e0a11492e3e04bf4389e2fd795596"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.252610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" event={"ID":"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b","Type":"ContainerStarted","Data":"fc7d797bb4cca4bd6f7ee94c58585edda25fefd4c7c624324ba4fb6dae947345"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.254349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" 
event={"ID":"a33f40c3-3064-405e-a420-da052b2b4a66","Type":"ContainerStarted","Data":"abe6c330b2f7b9eeb298263fb49016da6c385ea40b62ceaa7db2f7a086c14819"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.268439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerStarted","Data":"8027aeac2953b1103c96572f6a7f5fae90c69fe974d0026f8d6c56b1eb92ba61"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.278851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.278901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.278922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-scripts\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.278948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.279132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9vrd\" (UniqueName: \"kubernetes.io/projected/3e495769-a033-41f2-8285-3f3274673250-kube-api-access-r9vrd\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.279154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e495769-a033-41f2-8285-3f3274673250-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.279281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e495769-a033-41f2-8285-3f3274673250-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.283588 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.286583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" 
event={"ID":"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd","Type":"ContainerStarted","Data":"7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.286752 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.295076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.297024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.297430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.297906 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.313296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9vrd\" (UniqueName: \"kubernetes.io/projected/3e495769-a033-41f2-8285-3f3274673250-kube-api-access-r9vrd\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.313365 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.318980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.322550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5665575645-v8ktc"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.327897 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.375376 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.377529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0db5205c-c226-4ce2-b734-b250d0c4f7f5","Type":"ContainerStarted","Data":"921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.377555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0db5205c-c226-4ce2-b734-b250d0c4f7f5","Type":"ContainerStarted","Data":"6a969f47e9705be303cdf57e8935315f4c39ccdbb4dfcf02844b2537d3ff7ab1"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data-custom\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-combined-ca-bundle\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9rwm\" (UniqueName: \"kubernetes.io/projected/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-kube-api-access-g9rwm\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-logs\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lbrk\" (UniqueName: \"kubernetes.io/projected/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-kube-api-access-5lbrk\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc 
kubenswrapper[4707]: I0121 15:54:10.380784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-combined-ca-bundle\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380801 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data-custom\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-logs\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.380890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.398461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-scripts\") pod \"cinder-scheduler-0\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.399893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" event={"ID":"7e31617e-3380-4e4f-9558-ceb4164fd190","Type":"ContainerStarted","Data":"2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.408848 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.412470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82","Type":"ContainerStarted","Data":"7f5314d7340ab017d4c0ab4f41006fb9b9f0510993dcfbbb5e9761ae6e3d8dae"} Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.413713 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.416649 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="openstack-network-exporter" containerID="cri-o://7f5314d7340ab017d4c0ab4f41006fb9b9f0510993dcfbbb5e9761ae6e3d8dae" gracePeriod=300 Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.419053 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.432548 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-548968857b-s96qt"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.434106 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.454997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-548968857b-s96qt"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.484878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lbrk\" (UniqueName: \"kubernetes.io/projected/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-kube-api-access-5lbrk\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-combined-ca-bundle\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data-custom\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-logs\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data-custom\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: 
\"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-combined-ca-bundle\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9rwm\" (UniqueName: \"kubernetes.io/projected/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-kube-api-access-g9rwm\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.485999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-logs\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.486311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-logs\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.490758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-logs\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.533144 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.564863 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-75586f844d-9x57h"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.576023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9rwm\" (UniqueName: \"kubernetes.io/projected/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-kube-api-access-g9rwm\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-internal-tls-certs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbgc\" (UniqueName: \"kubernetes.io/projected/fb142568-b2b7-467a-9b33-fc6c3bf725e3-kube-api-access-9vbgc\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-public-tls-certs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb142568-b2b7-467a-9b33-fc6c3bf725e3-logs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-combined-ca-bundle\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data-custom\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.595951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.597163 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-747cb955f8-j9rwx"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.598535 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.604429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lbrk\" (UniqueName: \"kubernetes.io/projected/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-kube-api-access-5lbrk\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.645306 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.676861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-747cb955f8-j9rwx"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.699241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf84r\" (UniqueName: \"kubernetes.io/projected/943cfab1-eb95-4dcb-842d-9826513006f5-kube-api-access-hf84r\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-config-data\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/943cfab1-eb95-4dcb-842d-9826513006f5-logs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-public-tls-certs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-scripts\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702450 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-internal-tls-certs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-combined-ca-bundle\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbgc\" (UniqueName: \"kubernetes.io/projected/fb142568-b2b7-467a-9b33-fc6c3bf725e3-kube-api-access-9vbgc\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-public-tls-certs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb142568-b2b7-467a-9b33-fc6c3bf725e3-logs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-internal-tls-certs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-combined-ca-bundle\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.702758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data-custom\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.709707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb142568-b2b7-467a-9b33-fc6c3bf725e3-logs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 
21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.710918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data-custom\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.712502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-combined-ca-bundle\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.712745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.713157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data-custom\") pod \"barbican-worker-5665575645-v8ktc\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.722129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-combined-ca-bundle\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.722274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data-custom\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.725694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data\") pod \"barbican-keystone-listener-9d5989c46-nlxmt\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.728142 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=6.728126591 podStartE2EDuration="6.728126591s" podCreationTimestamp="2026-01-21 15:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:10.277802657 +0000 UTC m=+3147.459318879" watchObservedRunningTime="2026-01-21 15:54:10.728126591 +0000 UTC m=+3147.909642813" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.730014 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.733296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-combined-ca-bundle\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.733655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-public-tls-certs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.744242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-internal-tls-certs\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.787181 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="ovsdbserver-sb" containerID="cri-o://c76d285e93b522930c571d4453414c664b3f53903ed74d4779ddec9a94e191a4" gracePeriod=300 Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.809868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf84r\" (UniqueName: \"kubernetes.io/projected/943cfab1-eb95-4dcb-842d-9826513006f5-kube-api-access-hf84r\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.810075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-config-data\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.810135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/943cfab1-eb95-4dcb-842d-9826513006f5-logs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.810170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-public-tls-certs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.810214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-scripts\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.810272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-combined-ca-bundle\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.810352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-internal-tls-certs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.816507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-internal-tls-certs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.824352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-config-data\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.825491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-public-tls-certs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.825582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/943cfab1-eb95-4dcb-842d-9826513006f5-logs\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.827753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-combined-ca-bundle\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.833006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-scripts\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.853943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbgc\" (UniqueName: 
\"kubernetes.io/projected/fb142568-b2b7-467a-9b33-fc6c3bf725e3-kube-api-access-9vbgc\") pod \"barbican-api-548968857b-s96qt\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.856523 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.856506127 podStartE2EDuration="3.856506127s" podCreationTimestamp="2026-01-21 15:54:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:10.398735259 +0000 UTC m=+3147.580251482" watchObservedRunningTime="2026-01-21 15:54:10.856506127 +0000 UTC m=+3148.038022349" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.865894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf84r\" (UniqueName: \"kubernetes.io/projected/943cfab1-eb95-4dcb-842d-9826513006f5-kube-api-access-hf84r\") pod \"placement-747cb955f8-j9rwx\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.867896 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=6.867885004 podStartE2EDuration="6.867885004s" podCreationTimestamp="2026-01-21 15:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:10.434607925 +0000 UTC m=+3147.616124147" watchObservedRunningTime="2026-01-21 15:54:10.867885004 +0000 UTC m=+3148.049401227" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.906842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.912494 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.922738 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.923747 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.925472 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.931838 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.941389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-747cb955f8-j9rwx"] Jan 21 15:54:10 crc kubenswrapper[4707]: W0121 15:54:10.942727 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35e0d910_5029_444b_83e7_92d44c8ae55a.slice/crio-7defbc4a0578aee77e2fc6d52d39cbeb798da5105f669dd556ab3d763c1caf97 WatchSource:0}: Error finding container 7defbc4a0578aee77e2fc6d52d39cbeb798da5105f669dd556ab3d763c1caf97: Status 404 returned error can't find the container with id 7defbc4a0578aee77e2fc6d52d39cbeb798da5105f669dd556ab3d763c1caf97 Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.942932 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.957870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-9d4657458-gkqnm"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.959305 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.970732 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.970775 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.975872 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9d4657458-gkqnm"] Jan 21 15:54:10 crc kubenswrapper[4707]: I0121 15:54:10.991754 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.000213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6nwv\" (UniqueName: \"kubernetes.io/projected/e86ffe25-a844-40ac-ac5f-11835ee59208-kube-api-access-d6nwv\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-config-data\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zlvl\" (UniqueName: \"kubernetes.io/projected/a1cb8df6-9987-4102-9599-2a0a39252432-kube-api-access-2zlvl\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cb8df6-9987-4102-9599-2a0a39252432-logs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-internal-tls-certs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015328 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-public-tls-certs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.015388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-combined-ca-bundle\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.038378 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="ovsdbserver-nb" containerID="cri-o://da0a868e4a63c3078ec90d7fb603c14478579582a1b0178a820be0cbec3e8780" gracePeriod=300 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.093412 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.102027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.111701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zlvl\" (UniqueName: \"kubernetes.io/projected/a1cb8df6-9987-4102-9599-2a0a39252432-kube-api-access-2zlvl\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cb8df6-9987-4102-9599-2a0a39252432-logs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-internal-tls-certs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-public-tls-certs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-combined-ca-bundle\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6nwv\" (UniqueName: \"kubernetes.io/projected/e86ffe25-a844-40ac-ac5f-11835ee59208-kube-api-access-d6nwv\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.120865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-config-data\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.124234 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.125872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cb8df6-9987-4102-9599-2a0a39252432-logs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.128355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-public-tls-certs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.129488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.133447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-config-data\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.135859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.149963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-combined-ca-bundle\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.152609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.159785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-internal-tls-certs\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.165241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6nwv\" (UniqueName: \"kubernetes.io/projected/e86ffe25-a844-40ac-ac5f-11835ee59208-kube-api-access-d6nwv\") pod \"nova-cell0-conductor-0\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc 
kubenswrapper[4707]: I0121 15:54:11.181234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zlvl\" (UniqueName: \"kubernetes.io/projected/a1cb8df6-9987-4102-9599-2a0a39252432-kube-api-access-2zlvl\") pod \"placement-9d4657458-gkqnm\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.207727 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0810465b-2f7e-49f5-8d97-c8416940a463" path="/var/lib/kubelet/pods/0810465b-2f7e-49f5-8d97-c8416940a463/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.210871 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f0696a-c173-41a3-926d-29ef8dcef753" path="/var/lib/kubelet/pods/11f0696a-c173-41a3-926d-29ef8dcef753/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.214938 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e0b49a-6e14-42af-9480-877a354cec42" path="/var/lib/kubelet/pods/60e0b49a-6e14-42af-9480-877a354cec42/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.215793 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9de3b0-5497-4666-800f-6e5db9e68f8e" path="/var/lib/kubelet/pods/6b9de3b0-5497-4666-800f-6e5db9e68f8e/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.216385 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef0e4e0-0bae-4886-894a-863f13095f5b" path="/var/lib/kubelet/pods/8ef0e4e0-0bae-4886-894a-863f13095f5b/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.217382 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c535f926-bfe0-4448-8090-1b942fe85c89" path="/var/lib/kubelet/pods/c535f926-bfe0-4448-8090-1b942fe85c89/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.218531 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc96bd5b-fb78-41b8-a65b-352f905ff8d5" path="/var/lib/kubelet/pods/cc96bd5b-fb78-41b8-a65b-352f905ff8d5/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.218881 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece56fe7-15d7-4701-b4c7-e102b077c8ea" path="/var/lib/kubelet/pods/ece56fe7-15d7-4701-b4c7-e102b077c8ea/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.219915 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed60a6eb-8348-4608-8905-143106b4d545" path="/var/lib/kubelet/pods/ed60a6eb-8348-4608-8905-143106b4d545/volumes" Jan 21 15:54:11 crc kubenswrapper[4707]: W0121 15:54:11.240663 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda5cd14d_4d69_4f5a_9901_315fe321579c.slice/crio-16700dde006dd42b01ae254a129416f3fa8868ef0abf67cf502defe2bbdb21fd WatchSource:0}: Error finding container 16700dde006dd42b01ae254a129416f3fa8868ef0abf67cf502defe2bbdb21fd: Status 404 returned error can't find the container with id 16700dde006dd42b01ae254a129416f3fa8868ef0abf67cf502defe2bbdb21fd Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.398138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.417283 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.434465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"12052361-f984-4545-a1e1-9a8b80e2cceb","Type":"ContainerStarted","Data":"901f8ccae682b151d1499e809475a50dfa0f4837187a71b68b19da373e3405f8"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.445491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" event={"ID":"7e31617e-3380-4e4f-9558-ceb4164fd190","Type":"ContainerStarted","Data":"fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.445615 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker-log" containerID="cri-o://2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00" gracePeriod=30 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.445829 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.445895 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker" containerID="cri-o://fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e" gracePeriod=30 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.463878 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" podStartSLOduration=6.463864821 podStartE2EDuration="6.463864821s" podCreationTimestamp="2026-01-21 15:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:11.460477081 +0000 UTC m=+3148.641993303" watchObservedRunningTime="2026-01-21 15:54:11.463864821 +0000 UTC m=+3148.645381043" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.464208 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_d5f812ac-2896-4f26-b5f4-ec0ab36cbb82/ovsdbserver-nb/0.log" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.464249 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerID="7f5314d7340ab017d4c0ab4f41006fb9b9f0510993dcfbbb5e9761ae6e3d8dae" exitCode=2 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.464276 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerID="da0a868e4a63c3078ec90d7fb603c14478579582a1b0178a820be0cbec3e8780" exitCode=143 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.464342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82","Type":"ContainerDied","Data":"7f5314d7340ab017d4c0ab4f41006fb9b9f0510993dcfbbb5e9761ae6e3d8dae"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.464369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82","Type":"ContainerDied","Data":"da0a868e4a63c3078ec90d7fb603c14478579582a1b0178a820be0cbec3e8780"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.466689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerStarted","Data":"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.472331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"84ca704f-a732-47ec-a479-dabdeea5b864","Type":"ContainerStarted","Data":"a702d1d44ca597a78f8858547601e56e9c70e7cb4efcd0e02b6d1406beda7e20"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.484141 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_db6e605f-fa18-4b63-abb0-398ff39fb010/ovsdbserver-sb/0.log" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.484179 4707 generic.go:334] "Generic (PLEG): container finished" podID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerID="ecdb17bcb0ef07f2c4492e39172d86abfd80846b3e7790e912da5c2e4616f285" exitCode=2 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.484192 4707 generic.go:334] "Generic (PLEG): container finished" podID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerID="c76d285e93b522930c571d4453414c664b3f53903ed74d4779ddec9a94e191a4" exitCode=143 Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.484250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"db6e605f-fa18-4b63-abb0-398ff39fb010","Type":"ContainerDied","Data":"ecdb17bcb0ef07f2c4492e39172d86abfd80846b3e7790e912da5c2e4616f285"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.484286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"db6e605f-fa18-4b63-abb0-398ff39fb010","Type":"ContainerDied","Data":"c76d285e93b522930c571d4453414c664b3f53903ed74d4779ddec9a94e191a4"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.487970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"da5cd14d-4d69-4f5a-9901-315fe321579c","Type":"ContainerStarted","Data":"16700dde006dd42b01ae254a129416f3fa8868ef0abf67cf502defe2bbdb21fd"} Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.500744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"35e0d910-5029-444b-83e7-92d44c8ae55a","Type":"ContainerStarted","Data":"7defbc4a0578aee77e2fc6d52d39cbeb798da5105f669dd556ab3d763c1caf97"} Jan 21 15:54:11 crc kubenswrapper[4707]: E0121 15:54:11.518950 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:11 crc kubenswrapper[4707]: E0121 15:54:11.529553 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:11 crc kubenswrapper[4707]: E0121 15:54:11.532725 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:11 crc kubenswrapper[4707]: E0121 15:54:11.532757 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="ovn-northd" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.608340 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_db6e605f-fa18-4b63-abb0-398ff39fb010/ovsdbserver-sb/0.log" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.608406 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.641980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-scripts\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-config\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdb-rundir\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-combined-ca-bundle\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz5gx\" (UniqueName: \"kubernetes.io/projected/db6e605f-fa18-4b63-abb0-398ff39fb010-kube-api-access-mz5gx\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdbserver-sb-tls-certs\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.642289 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-metrics-certs-tls-certs\") pod \"db6e605f-fa18-4b63-abb0-398ff39fb010\" (UID: \"db6e605f-fa18-4b63-abb0-398ff39fb010\") " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.643198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-config" (OuterVolumeSpecName: "config") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.645102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.645146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-scripts" (OuterVolumeSpecName: "scripts") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.665669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.694859 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-747cb955f8-j9rwx"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.698083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6e605f-fa18-4b63-abb0-398ff39fb010-kube-api-access-mz5gx" (OuterVolumeSpecName: "kube-api-access-mz5gx") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "kube-api-access-mz5gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.702473 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.745423 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.745448 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6e605f-fa18-4b63-abb0-398ff39fb010-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.745458 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.745466 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz5gx\" (UniqueName: \"kubernetes.io/projected/db6e605f-fa18-4b63-abb0-398ff39fb010-kube-api-access-mz5gx\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.745490 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.938732 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:11 crc kubenswrapper[4707]: I0121 15:54:11.948043 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.071104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5665575645-v8ktc"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.098055 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.110047 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-548968857b-s96qt"] Jan 21 15:54:12 crc kubenswrapper[4707]: W0121 15:54:12.112960 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d3be32_a1ae_4a91_bf04_051fd3dde8af.slice/crio-3dfd9e4fed4070cd11fa67fab255b2afe50e8c44e87dc32ad836842420c0f197 WatchSource:0}: Error finding container 3dfd9e4fed4070cd11fa67fab255b2afe50e8c44e87dc32ad836842420c0f197: Status 404 returned error can't find the container with id 3dfd9e4fed4070cd11fa67fab255b2afe50e8c44e87dc32ad836842420c0f197 Jan 21 15:54:12 crc kubenswrapper[4707]: W0121 15:54:12.123424 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa9d3f6_1251_4d45_b274_1bb23f9e4dea.slice/crio-8a3175ee33cf8cb0b2ce91ef546f6d5735dfc0d6a65ce7c80f7b289c294fc9ee WatchSource:0}: Error finding container 8a3175ee33cf8cb0b2ce91ef546f6d5735dfc0d6a65ce7c80f7b289c294fc9ee: Status 404 returned error can't find the container with id 8a3175ee33cf8cb0b2ce91ef546f6d5735dfc0d6a65ce7c80f7b289c294fc9ee Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 
15:54:12.196888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.200863 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.274078 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.274113 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.289178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.313483 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9d4657458-gkqnm"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.459787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.480349 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.482756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9d4657458-gkqnm"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.535180 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-98765b5d4-w967c"] Jan 21 15:54:12 crc kubenswrapper[4707]: E0121 15:54:12.535583 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="ovsdbserver-sb" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.535647 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="ovsdbserver-sb" Jan 21 15:54:12 crc kubenswrapper[4707]: E0121 15:54:12.535676 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="openstack-network-exporter" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.535682 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="openstack-network-exporter" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.535922 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="openstack-network-exporter" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.535934 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" containerName="ovsdbserver-sb" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.537013 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.545842 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-98765b5d4-w967c"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.575569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "db6e605f-fa18-4b63-abb0-398ff39fb010" (UID: "db6e605f-fa18-4b63-abb0-398ff39fb010"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.588799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-public-tls-certs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.588973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-combined-ca-bundle\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.589103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318a94fd-66d7-4cfb-b312-a348e185ae2e-logs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.589123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-scripts\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.589155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-config-data\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.589175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-internal-tls-certs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.589322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcr2c\" (UniqueName: \"kubernetes.io/projected/318a94fd-66d7-4cfb-b312-a348e185ae2e-kube-api-access-wcr2c\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.589674 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6e605f-fa18-4b63-abb0-398ff39fb010-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.605924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3e495769-a033-41f2-8285-3f3274673250","Type":"ContainerStarted","Data":"9d4e07313400d53cb0cb4138c088d5bde34b94479ebb36b4c0f0e080abe3f83f"} Jan 21 15:54:12 crc 
kubenswrapper[4707]: I0121 15:54:12.612858 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerID="2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00" exitCode=143 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.612899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" event={"ID":"7e31617e-3380-4e4f-9558-ceb4164fd190","Type":"ContainerDied","Data":"2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.622357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"087bdc72-3300-4e75-9b60-72380c090c95","Type":"ContainerStarted","Data":"f003fb6700a1377638da83bcae282818afea3ca4f1f8c135f2b2688b4b9a85a1"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.624101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e86ffe25-a844-40ac-ac5f-11835ee59208","Type":"ContainerStarted","Data":"28e7c775b8d6a550d78ce1fe2166c81faa97a3847251fe24584b1ac5052220b5"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.690981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-public-tls-certs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.691032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-combined-ca-bundle\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.691072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318a94fd-66d7-4cfb-b312-a348e185ae2e-logs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.691088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-scripts\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.691101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-config-data\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.691117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-internal-tls-certs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 
crc kubenswrapper[4707]: I0121 15:54:12.691165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcr2c\" (UniqueName: \"kubernetes.io/projected/318a94fd-66d7-4cfb-b312-a348e185ae2e-kube-api-access-wcr2c\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.691785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318a94fd-66d7-4cfb-b312-a348e185ae2e-logs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.697136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-scripts\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.697483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-public-tls-certs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.697641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-internal-tls-certs\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.697686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-config-data\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.701625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-combined-ca-bundle\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.705333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" event={"ID":"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea","Type":"ContainerStarted","Data":"8a3175ee33cf8cb0b2ce91ef546f6d5735dfc0d6a65ce7c80f7b289c294fc9ee"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.709875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcr2c\" (UniqueName: \"kubernetes.io/projected/318a94fd-66d7-4cfb-b312-a348e185ae2e-kube-api-access-wcr2c\") pod \"placement-98765b5d4-w967c\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.713382 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.715349 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_d5f812ac-2896-4f26-b5f4-ec0ab36cbb82/ovsdbserver-nb/0.log" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.715414 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.720691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" event={"ID":"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd","Type":"ContainerStarted","Data":"7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.721126 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener-log" containerID="cri-o://7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.721389 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener" containerID="cri-o://7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.758835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" event={"ID":"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b","Type":"ContainerStarted","Data":"14d0a43f08a2e749aa63184db127afde2233c4d4ac8dfaab8d8f291ada1ef02e"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.759049 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-log" containerID="cri-o://fc7d797bb4cca4bd6f7ee94c58585edda25fefd4c7c624324ba4fb6dae947345" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.759212 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.759308 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.759396 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-api" containerID="cri-o://14d0a43f08a2e749aa63184db127afde2233c4d4ac8dfaab8d8f291ada1ef02e" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.767955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" event={"ID":"a33f40c3-3064-405e-a420-da052b2b4a66","Type":"ContainerStarted","Data":"4785e6cbd1c41004f4ccc7f605931833ce69946a0be235c3ed27a893a8ce09b4"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.768087 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api-log" containerID="cri-o://abe6c330b2f7b9eeb298263fb49016da6c385ea40b62ceaa7db2f7a086c14819" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.768151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.768166 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.768193 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api" containerID="cri-o://4785e6cbd1c41004f4ccc7f605931833ce69946a0be235c3ed27a893a8ce09b4" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.770289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" event={"ID":"b7d3be32-a1ae-4a91-bf04-051fd3dde8af","Type":"ContainerStarted","Data":"3dfd9e4fed4070cd11fa67fab255b2afe50e8c44e87dc32ad836842420c0f197"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.779269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" event={"ID":"fb142568-b2b7-467a-9b33-fc6c3bf725e3","Type":"ContainerStarted","Data":"deaad5d8b622cef5089ba0038af4eeccc61b7dde827d3b035c88608a44bf63f7"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.787694 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_d5f812ac-2896-4f26-b5f4-ec0ab36cbb82/ovsdbserver-nb/0.log" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.787767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82","Type":"ContainerDied","Data":"90840df2d9fb055dc339641940e756000dbb48f1c924ffb548bc2caf721e9d46"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.787798 4707 scope.go:117] "RemoveContainer" containerID="7f5314d7340ab017d4c0ab4f41006fb9b9f0510993dcfbbb5e9761ae6e3d8dae" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.787922 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.789793 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" podStartSLOduration=7.789782371 podStartE2EDuration="7.789782371s" podCreationTimestamp="2026-01-21 15:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:12.771293175 +0000 UTC m=+3149.952809396" watchObservedRunningTime="2026-01-21 15:54:12.789782371 +0000 UTC m=+3149.971298614" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.791666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-config\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.791765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.791944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-metrics-certs-tls-certs\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.791981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9t98\" (UniqueName: \"kubernetes.io/projected/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-kube-api-access-l9t98\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.792004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-scripts\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.792056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-combined-ca-bundle\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.792088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdb-rundir\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.792162 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdbserver-nb-tls-certs\") pod \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\" (UID: \"d5f812ac-2896-4f26-b5f4-ec0ab36cbb82\") " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 
15:54:12.793989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-config" (OuterVolumeSpecName: "config") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.796971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.799063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-scripts" (OuterVolumeSpecName: "scripts") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.818858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"388a5540-d98c-4db0-bb1a-5a387ff8f7f3","Type":"ContainerStarted","Data":"c27b99c8e9dedf56fd90ae88b2072d2818dbe99855377c8baec414e6db09ee85"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.823780 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" podStartSLOduration=7.823749945 podStartE2EDuration="7.823749945s" podCreationTimestamp="2026-01-21 15:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:12.793835423 +0000 UTC m=+3149.975351646" watchObservedRunningTime="2026-01-21 15:54:12.823749945 +0000 UTC m=+3150.005266167" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.826870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"35e0d910-5029-444b-83e7-92d44c8ae55a","Type":"ContainerStarted","Data":"8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.826992 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="35e0d910-5029-444b-83e7-92d44c8ae55a" containerName="mysql-bootstrap" containerID="cri-o://8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.842771 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" podStartSLOduration=7.842761294 podStartE2EDuration="7.842761294s" podCreationTimestamp="2026-01-21 15:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:12.80974541 +0000 UTC m=+3149.991261632" watchObservedRunningTime="2026-01-21 15:54:12.842761294 +0000 UTC m=+3150.024277516" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.858115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" event={"ID":"943cfab1-eb95-4dcb-842d-9826513006f5","Type":"ContainerStarted","Data":"fc3a2dcbd831592615de29479506e92713c3b0c7e17ea131f37d130e661a4e09"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.860029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-kube-api-access-l9t98" (OuterVolumeSpecName: "kube-api-access-l9t98") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "kube-api-access-l9t98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.869322 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_db6e605f-fa18-4b63-abb0-398ff39fb010/ovsdbserver-sb/0.log" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.869414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"db6e605f-fa18-4b63-abb0-398ff39fb010","Type":"ContainerDied","Data":"e80acd071c30b09fae20ae185142e55376d3f93d45bbf0452e49af6c1fde9685"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.869522 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.870460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.873582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"da5cd14d-4d69-4f5a-9901-315fe321579c","Type":"ContainerStarted","Data":"06a9ecf9a704b3cf801825df72825f3564b9b3123770eb8150d305a227bca3ff"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.895962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" event={"ID":"a1cb8df6-9987-4102-9599-2a0a39252432","Type":"ContainerStarted","Data":"65e36c174cc5207360bd0cf22fdea4e618ae32245ea43ee1288e1e6637670159"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.897310 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.897336 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.897356 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.897367 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.897377 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9t98\" (UniqueName: \"kubernetes.io/projected/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-kube-api-access-l9t98\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.916495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"28a92eee-d8bb-46d5-bacb-e641b5be138f","Type":"ContainerStarted","Data":"fdb85155a5d668dd87d07fc89c8d39d5fe58e426936b9aca1fec084e052fbd48"} Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.916518 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="47097d1a-6225-4900-85a1-a28f2f698a78" containerName="mysql-bootstrap" containerID="cri-o://505527e41043faf26c8fb4d1c3789cdbd1f623b88d8c64575cc6591ba70dc015" gracePeriod=30 Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.947892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.966609 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.972788 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:12 crc kubenswrapper[4707]: E0121 15:54:12.973182 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="openstack-network-exporter" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.973203 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="openstack-network-exporter" Jan 21 15:54:12 crc kubenswrapper[4707]: E0121 15:54:12.973242 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="ovsdbserver-nb" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.973248 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="ovsdbserver-nb" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.973432 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="ovsdbserver-nb" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.973456 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" containerName="openstack-network-exporter" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.974387 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.977304 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.977663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-6hxvs" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.979210 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.983343 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:54:12 crc kubenswrapper[4707]: I0121 15:54:12.983557 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.033730 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.101335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.101554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.101599 4707 scope.go:117] "RemoveContainer" containerID="da0a868e4a63c3078ec90d7fb603c14478579582a1b0178a820be0cbec3e8780" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.101785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.101852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.102042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.102065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-config\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.102083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-kube-api-access-6vndk\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.102106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.204246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.204658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.204732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.204762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.204801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.205023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-config\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.205048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-kube-api-access-6vndk\") pod \"ovsdbserver-sb-0\" (UID: 
\"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.205067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.206371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.206952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.207208 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.207462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-config\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.218929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.223230 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6e605f-fa18-4b63-abb0-398ff39fb010" path="/var/lib/kubelet/pods/db6e605f-fa18-4b63-abb0-398ff39fb010/volumes" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.230575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.232173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-kube-api-access-6vndk\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.232482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.333669 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.409701 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.485634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.604522 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.616542 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.658932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.677470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" (UID: "d5f812ac-2896-4f26-b5f4-ec0ab36cbb82"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.718054 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.718088 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:13 crc kubenswrapper[4707]: I0121 15:54:13.991793 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.991776946 podStartE2EDuration="3.991776946s" podCreationTimestamp="2026-01-21 15:54:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:13.988505094 +0000 UTC m=+3151.170021316" watchObservedRunningTime="2026-01-21 15:54:13.991776946 +0000 UTC m=+3151.173293168" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.008089 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerID="7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517" exitCode=143 Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.016021 4707 generic.go:334] "Generic (PLEG): container finished" podID="a33f40c3-3064-405e-a420-da052b2b4a66" containerID="4785e6cbd1c41004f4ccc7f605931833ce69946a0be235c3ed27a893a8ce09b4" exitCode=0 Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.016109 4707 generic.go:334] "Generic (PLEG): container finished" podID="a33f40c3-3064-405e-a420-da052b2b4a66" containerID="abe6c330b2f7b9eeb298263fb49016da6c385ea40b62ceaa7db2f7a086c14819" exitCode=143 Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.033556 4707 generic.go:334] "Generic (PLEG): container finished" podID="47097d1a-6225-4900-85a1-a28f2f698a78" containerID="505527e41043faf26c8fb4d1c3789cdbd1f623b88d8c64575cc6591ba70dc015" exitCode=0 Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.052254 4707 generic.go:334] "Generic (PLEG): container finished" podID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerID="14d0a43f08a2e749aa63184db127afde2233c4d4ac8dfaab8d8f291ada1ef02e" exitCode=0 Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.052302 4707 generic.go:334] "Generic (PLEG): container finished" podID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerID="fc7d797bb4cca4bd6f7ee94c58585edda25fefd4c7c624324ba4fb6dae947345" exitCode=143 Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.056519 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=5.056505072 podStartE2EDuration="5.056505072s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:14.054224142 +0000 UTC m=+3151.235740364" watchObservedRunningTime="2026-01-21 15:54:14.056505072 +0000 UTC m=+3151.238021284" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 
15:54:14.140213 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"087bdc72-3300-4e75-9b60-72380c090c95","Type":"ContainerStarted","Data":"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e86ffe25-a844-40ac-ac5f-11835ee59208","Type":"ContainerStarted","Data":"20b345e63520b4db3415497327c3575148e3e1b43402edfc68843451a441b7df"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" event={"ID":"b7d3be32-a1ae-4a91-bf04-051fd3dde8af","Type":"ContainerStarted","Data":"e4cf394238e37ac5d15301506b18d0ba417d6a80b24ad0f97a43b3434616dd5a"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" event={"ID":"943cfab1-eb95-4dcb-842d-9826513006f5","Type":"ContainerStarted","Data":"cda7c8458976b8fcd3b85e2aeae92be31f32d570eec96376fd70a84ab0e2f2f9"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" event={"ID":"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd","Type":"ContainerDied","Data":"7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" event={"ID":"a33f40c3-3064-405e-a420-da052b2b4a66","Type":"ContainerDied","Data":"4785e6cbd1c41004f4ccc7f605931833ce69946a0be235c3ed27a893a8ce09b4"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" event={"ID":"a33f40c3-3064-405e-a420-da052b2b4a66","Type":"ContainerDied","Data":"abe6c330b2f7b9eeb298263fb49016da6c385ea40b62ceaa7db2f7a086c14819"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" event={"ID":"a33f40c3-3064-405e-a420-da052b2b4a66","Type":"ContainerDied","Data":"b24977ab2a042384f059c50e4e011e2e6564a87a0ae23834e79f3a4f5f087280"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140378 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24977ab2a042384f059c50e4e011e2e6564a87a0ae23834e79f3a4f5f087280" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerStarted","Data":"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" 
event={"ID":"318a94fd-66d7-4cfb-b312-a348e185ae2e","Type":"ContainerStarted","Data":"38e1d35a04c5eafa1fb3e2fa41540ef0e683a29abae5339031ce3c86d2b67789"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"47097d1a-6225-4900-85a1-a28f2f698a78","Type":"ContainerDied","Data":"505527e41043faf26c8fb4d1c3789cdbd1f623b88d8c64575cc6591ba70dc015"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"47097d1a-6225-4900-85a1-a28f2f698a78","Type":"ContainerDied","Data":"f3d67b476bde1c26de4d8efd1ead80410c9e0a11492e3e04bf4389e2fd795596"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140462 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d67b476bde1c26de4d8efd1ead80410c9e0a11492e3e04bf4389e2fd795596" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"12052361-f984-4545-a1e1-9a8b80e2cceb","Type":"ContainerStarted","Data":"4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"84ca704f-a732-47ec-a479-dabdeea5b864","Type":"ContainerStarted","Data":"e3b398a48d728c6bd65d6a353d60864071f543173e91de5b29ab7fb55a485387"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140493 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-98765b5d4-w967c"] Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" event={"ID":"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b","Type":"ContainerDied","Data":"14d0a43f08a2e749aa63184db127afde2233c4d4ac8dfaab8d8f291ada1ef02e"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" event={"ID":"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b","Type":"ContainerDied","Data":"fc7d797bb4cca4bd6f7ee94c58585edda25fefd4c7c624324ba4fb6dae947345"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" event={"ID":"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b","Type":"ContainerDied","Data":"0cab14f7b1ef62e790e9b6b4c077c4a63e7c0e0452efd922aaa71beb9af5a4d3"} Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.140548 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cab14f7b1ef62e790e9b6b4c077c4a63e7c0e0452efd922aaa71beb9af5a4d3" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.157060 4707 scope.go:117] "RemoveContainer" containerID="ecdb17bcb0ef07f2c4492e39172d86abfd80846b3e7790e912da5c2e4616f285" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.273755 4707 scope.go:117] "RemoveContainer" containerID="c76d285e93b522930c571d4453414c664b3f53903ed74d4779ddec9a94e191a4" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.275486 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.292571 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.325643 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.330725 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.333750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-internal-tls-certs\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.333794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-public-tls-certs\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.333906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-combined-ca-bundle\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.333952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-scripts\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkf6h\" (UniqueName: \"kubernetes.io/projected/a33f40c3-3064-405e-a420-da052b2b4a66-kube-api-access-xkf6h\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data-custom\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a33f40c3-3064-405e-a420-da052b2b4a66-logs\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334333 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-combined-ca-bundle\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-logs\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48v2g\" (UniqueName: \"kubernetes.io/projected/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-kube-api-access-48v2g\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-config-data\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-public-tls-certs\") pod \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\" (UID: \"b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.334471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-internal-tls-certs\") pod \"a33f40c3-3064-405e-a420-da052b2b4a66\" (UID: \"a33f40c3-3064-405e-a420-da052b2b4a66\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.339521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a33f40c3-3064-405e-a420-da052b2b4a66-logs" (OuterVolumeSpecName: "logs") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.344952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-logs" (OuterVolumeSpecName: "logs") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.349747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33f40c3-3064-405e-a420-da052b2b4a66-kube-api-access-xkf6h" (OuterVolumeSpecName: "kube-api-access-xkf6h") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "kube-api-access-xkf6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.350901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.362879 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.389166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-scripts" (OuterVolumeSpecName: "scripts") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.405574 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.406452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.407078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-kube-api-access-48v2g" (OuterVolumeSpecName: "kube-api-access-48v2g") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "kube-api-access-48v2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: E0121 15:54:14.412644 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api-log" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.412671 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api-log" Jan 21 15:54:14 crc kubenswrapper[4707]: E0121 15:54:14.412930 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.412947 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api" Jan 21 15:54:14 crc kubenswrapper[4707]: E0121 15:54:14.413095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-log" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.413109 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-log" Jan 21 15:54:14 crc kubenswrapper[4707]: E0121 15:54:14.413134 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47097d1a-6225-4900-85a1-a28f2f698a78" containerName="mysql-bootstrap" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.413141 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47097d1a-6225-4900-85a1-a28f2f698a78" containerName="mysql-bootstrap" Jan 21 15:54:14 crc kubenswrapper[4707]: E0121 15:54:14.413278 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-api" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.413293 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-api" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.414561 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-log" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.414601 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" containerName="placement-api" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.414614 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api-log" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.414630 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47097d1a-6225-4900-85a1-a28f2f698a78" containerName="mysql-bootstrap" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.414648 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" containerName="barbican-api" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.436466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.436583 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.436826 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wpg6\" (UniqueName: \"kubernetes.io/projected/47097d1a-6225-4900-85a1-a28f2f698a78-kube-api-access-7wpg6\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.436887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-default\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.436908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-generated\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.436963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.437057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-operator-scripts\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.437079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-galera-tls-certs\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.437098 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-kolla-config\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.437136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-combined-ca-bundle\") pod \"47097d1a-6225-4900-85a1-a28f2f698a78\" (UID: \"47097d1a-6225-4900-85a1-a28f2f698a78\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438669 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438684 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a33f40c3-3064-405e-a420-da052b2b4a66-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438698 4707 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438708 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48v2g\" (UniqueName: \"kubernetes.io/projected/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-kube-api-access-48v2g\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438719 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438751 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkf6h\" (UniqueName: \"kubernetes.io/projected/a33f40c3-3064-405e-a420-da052b2b4a66-kube-api-access-xkf6h\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.438794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.439136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.439732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.441915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.442254 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.442431 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.442557 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.443652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.489622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.489790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47097d1a-6225-4900-85a1-a28f2f698a78-kube-api-access-7wpg6" (OuterVolumeSpecName: "kube-api-access-7wpg6") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "kube-api-access-7wpg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.490638 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.503286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "47097d1a-6225-4900-85a1-a28f2f698a78" (UID: "47097d1a-6225-4900-85a1-a28f2f698a78"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540482 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540516 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47097d1a-6225-4900-85a1-a28f2f698a78-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540541 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540553 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540561 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47097d1a-6225-4900-85a1-a28f2f698a78-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540569 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540581 4707 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47097d1a-6225-4900-85a1-a28f2f698a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.540589 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wpg6\" (UniqueName: \"kubernetes.io/projected/47097d1a-6225-4900-85a1-a28f2f698a78-kube-api-access-7wpg6\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.632271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-config\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38825357-2812-4711-bc86-7638075f8daa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtc4v\" (UniqueName: 
\"kubernetes.io/projected/38825357-2812-4711-bc86-7638075f8daa-kube-api-access-jtc4v\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.642969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.643029 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-config\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38825357-2812-4711-bc86-7638075f8daa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.745527 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtc4v\" (UniqueName: \"kubernetes.io/projected/38825357-2812-4711-bc86-7638075f8daa-kube-api-access-jtc4v\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.746219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38825357-2812-4711-bc86-7638075f8daa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.746849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.747088 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.747373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-config\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.765931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.766316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.766855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.782393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtc4v\" (UniqueName: \"kubernetes.io/projected/38825357-2812-4711-bc86-7638075f8daa-kube-api-access-jtc4v\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.854964 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_303e46a4-d2f9-4512-ae87-c50a52d62019/ovn-northd/0.log" Jan 21 15:54:14 crc 
kubenswrapper[4707]: I0121 15:54:14.855045 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.955338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-combined-ca-bundle\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.955605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-scripts\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.955651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-rundir\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.955673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-config\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.955690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-northd-tls-certs\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.955984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v44fm\" (UniqueName: \"kubernetes.io/projected/303e46a4-d2f9-4512-ae87-c50a52d62019-kube-api-access-v44fm\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.956006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-metrics-certs-tls-certs\") pod \"303e46a4-d2f9-4512-ae87-c50a52d62019\" (UID: \"303e46a4-d2f9-4512-ae87-c50a52d62019\") " Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.956144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-scripts" (OuterVolumeSpecName: "scripts") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.956406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-config" (OuterVolumeSpecName: "config") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.956530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.957013 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.957025 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.957035 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303e46a4-d2f9-4512-ae87-c50a52d62019-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.968227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303e46a4-d2f9-4512-ae87-c50a52d62019-kube-api-access-v44fm" (OuterVolumeSpecName: "kube-api-access-v44fm") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "kube-api-access-v44fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:14 crc kubenswrapper[4707]: I0121 15:54:14.979250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.021563 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.030287 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.060330 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v44fm\" (UniqueName: \"kubernetes.io/projected/303e46a4-d2f9-4512-ae87-c50a52d62019-kube-api-access-v44fm\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.060580 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.070043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"388a5540-d98c-4db0-bb1a-5a387ff8f7f3","Type":"ContainerStarted","Data":"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.075028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" 
event={"ID":"b7d3be32-a1ae-4a91-bf04-051fd3dde8af","Type":"ContainerStarted","Data":"9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.078425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" event={"ID":"fb142568-b2b7-467a-9b33-fc6c3bf725e3","Type":"ContainerStarted","Data":"99af3d75a96b9e45347ec6f501959f5aaaf4cc00a9200522051fb91dc082665d"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.082338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" event={"ID":"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea","Type":"ContainerStarted","Data":"ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.085048 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" event={"ID":"943cfab1-eb95-4dcb-842d-9826513006f5","Type":"ContainerStarted","Data":"d19a578b16065b74d9943a39179f1fd2594611a7b821a2af35ebb6465bc0e896"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.086002 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-log" containerID="cri-o://cda7c8458976b8fcd3b85e2aeae92be31f32d570eec96376fd70a84ab0e2f2f9" gracePeriod=30 Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.086142 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-api" containerID="cri-o://d19a578b16065b74d9943a39179f1fd2594611a7b821a2af35ebb6465bc0e896" gracePeriod=30 Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.086036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.087384 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.100598 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" podStartSLOduration=6.100587296 podStartE2EDuration="6.100587296s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:15.092415449 +0000 UTC m=+3152.273931671" watchObservedRunningTime="2026-01-21 15:54:15.100587296 +0000 UTC m=+3152.282103518" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.119442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"da5cd14d-4d69-4f5a-9901-315fe321579c","Type":"ContainerStarted","Data":"5efc0d6113609a65c6c8c728c1819ba46a5a6053e62048e66910b566697c74eb"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.125444 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" podStartSLOduration=6.125434329 podStartE2EDuration="6.125434329s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:15.110020734 +0000 UTC m=+3152.291536977" watchObservedRunningTime="2026-01-21 15:54:15.125434329 +0000 UTC m=+3152.306950550" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.125895 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.128612 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_303e46a4-d2f9-4512-ae87-c50a52d62019/ovn-northd/0.log" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.128646 4707 generic.go:334] "Generic (PLEG): container finished" podID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" exitCode=139 Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.128694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"303e46a4-d2f9-4512-ae87-c50a52d62019","Type":"ContainerDied","Data":"307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.128715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"303e46a4-d2f9-4512-ae87-c50a52d62019","Type":"ContainerDied","Data":"a2242f50b778f4e468ac99e97508a558454c0b2fa045be20e9a4ae34e86ebfe7"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.128732 4707 scope.go:117] "RemoveContainer" containerID="9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.128898 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.137076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ddb5dfe8-ec47-4bb6-b6c0-403095118b07","Type":"ContainerStarted","Data":"867ac410fbcc706feb2e5d8260d8f3a1a3fbbfbc66c12963ea695d4d1b924302"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.141438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8"] Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.141698 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker-log" containerID="cri-o://a9a94e2300c8f0cebf8271a5715c7814c9ee668a8c0685fdaf947f3e5b501fa5" gracePeriod=30 Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.141926 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker" containerID="cri-o://acc31783e633d011092b6f8764551146c9afb0f71a7bb5976536eae6e9de6e07" gracePeriod=30 Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.147297 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-75586f844d-9x57h" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.148394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"087bdc72-3300-4e75-9b60-72380c090c95","Type":"ContainerStarted","Data":"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa"} Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.149024 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.150791 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.165477 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=6.165467958 podStartE2EDuration="6.165467958s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:15.143184135 +0000 UTC m=+3152.324700357" watchObservedRunningTime="2026-01-21 15:54:15.165467958 +0000 UTC m=+3152.346984180" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.186673 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=6.186649778 podStartE2EDuration="6.186649778s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:15.178502047 +0000 UTC m=+3152.360018270" watchObservedRunningTime="2026-01-21 15:54:15.186649778 +0000 UTC m=+3152.368166000" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.205197 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f812ac-2896-4f26-b5f4-ec0ab36cbb82" path="/var/lib/kubelet/pods/d5f812ac-2896-4f26-b5f4-ec0ab36cbb82/volumes" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.553897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.595601 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.635902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.638932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-config-data" (OuterVolumeSpecName: "config-data") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.639943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data" (OuterVolumeSpecName: "config-data") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.660707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a33f40c3-3064-405e-a420-da052b2b4a66" (UID: "a33f40c3-3064-405e-a420-da052b2b4a66"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.669843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.688028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.688064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "303e46a4-d2f9-4512-ae87-c50a52d62019" (UID: "303e46a4-d2f9-4512-ae87-c50a52d62019"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698870 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698899 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698910 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698920 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698929 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f40c3-3064-405e-a420-da052b2b4a66-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698940 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/303e46a4-d2f9-4512-ae87-c50a52d62019-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.698948 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.732152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.802242 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.825725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" (UID: "b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.875108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.875156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.875172 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.904327 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:15 crc kubenswrapper[4707]: E0121 15:54:15.944405 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622 is running failed: container process not found" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:15 crc kubenswrapper[4707]: I0121 15:54:15.944528 4707 scope.go:117] "RemoveContainer" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" Jan 21 15:54:15 crc kubenswrapper[4707]: E0121 15:54:15.946297 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622 is running failed: container process not found" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:15 crc kubenswrapper[4707]: E0121 15:54:15.947055 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622 is running failed: container process not found" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:15 crc kubenswrapper[4707]: E0121 15:54:15.947085 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622 is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.046029 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.063421 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.077139 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: E0121 15:54:16.078362 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" 
containerName="openstack-network-exporter" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.078384 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="openstack-network-exporter" Jan 21 15:54:16 crc kubenswrapper[4707]: E0121 15:54:16.078668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="ovn-northd" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.078681 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="ovn-northd" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.080175 4707 scope.go:117] "RemoveContainer" containerID="9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.080237 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="openstack-network-exporter" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.080256 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" containerName="ovn-northd" Jan 21 15:54:16 crc kubenswrapper[4707]: E0121 15:54:16.082514 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa\": container with ID starting with 9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa not found: ID does not exist" containerID="9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.082558 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa"} err="failed to get container status \"9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa\": rpc error: code = NotFound desc = could not find container \"9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa\": container with ID starting with 9c34ee23ad159dc986d62c0e62cc68e793ef91855260a94d1332d8f72fe31faa not found: ID does not exist" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.082583 4707 scope.go:117] "RemoveContainer" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" Jan 21 15:54:16 crc kubenswrapper[4707]: E0121 15:54:16.083369 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6\": container with ID starting with 307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6 not found: ID does not exist" containerID="307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.083403 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6"} err="failed to get container status \"307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6\": rpc error: code = NotFound desc = could not find container \"307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6\": container with ID starting with 307c081eba72ddb2317d10fbb50c5126032d6e769910fdf900e0a2b5c63f3cf6 not found: ID does not exist" Jan 21 15:54:16 crc 
kubenswrapper[4707]: I0121 15:54:16.083560 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.083839 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.087850 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.090911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.091174 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-m7p4k" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.092671 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.092896 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.092915 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.092977 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.102125 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.114672 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.117114 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.119136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.119280 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.119316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5vc2\" (UniqueName: \"kubernetes.io/projected/d70efa8f-44f9-48d8-8beb-531c69225631-kube-api-access-k5vc2\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.120534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.120567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.120591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-kolla-config\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.120695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.120891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.120958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-default\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.121874 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-q9bf5" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.125867 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.127491 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.127533 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.129437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.147630 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.160484 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-kolla-config\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-combined-ca-bundle\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-default\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-galera-tls-certs\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-generated\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvbgc\" (UniqueName: \"kubernetes.io/projected/35e0d910-5029-444b-83e7-92d44c8ae55a-kube-api-access-gvbgc\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.227915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-operator-scripts\") pod \"35e0d910-5029-444b-83e7-92d44c8ae55a\" (UID: \"35e0d910-5029-444b-83e7-92d44c8ae55a\") " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-scripts\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-default\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7hbm\" (UniqueName: \"kubernetes.io/projected/206e3d62-bdaa-4326-8c66-a615e81200d8-kube-api-access-p7hbm\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-config\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5vc2\" (UniqueName: 
\"kubernetes.io/projected/d70efa8f-44f9-48d8-8beb-531c69225631-kube-api-access-k5vc2\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.228838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-kolla-config\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.229685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-kolla-config\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.229856 4707 generic.go:334] "Generic (PLEG): container finished" podID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" exitCode=1 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.229913 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"12052361-f984-4545-a1e1-9a8b80e2cceb","Type":"ContainerDied","Data":"4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.230256 4707 scope.go:117] "RemoveContainer" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.236405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod 
"35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.241056 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.243099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.243976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.244608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-default\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.244867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.245601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.247953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.263673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.274138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.278676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5vc2\" (UniqueName: \"kubernetes.io/projected/d70efa8f-44f9-48d8-8beb-531c69225631-kube-api-access-k5vc2\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.279239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.279725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.301506 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-75586f844d-9x57h"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.304412 4707 generic.go:334] "Generic (PLEG): container finished" podID="943cfab1-eb95-4dcb-842d-9826513006f5" containerID="cda7c8458976b8fcd3b85e2aeae92be31f32d570eec96376fd70a84ab0e2f2f9" exitCode=143 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.304468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" event={"ID":"943cfab1-eb95-4dcb-842d-9826513006f5","Type":"ContainerDied","Data":"cda7c8458976b8fcd3b85e2aeae92be31f32d570eec96376fd70a84ab0e2f2f9"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.307683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e0d910-5029-444b-83e7-92d44c8ae55a-kube-api-access-gvbgc" (OuterVolumeSpecName: "kube-api-access-gvbgc") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "kube-api-access-gvbgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.311649 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-75586f844d-9x57h"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.311679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" event={"ID":"318a94fd-66d7-4cfb-b312-a348e185ae2e","Type":"ContainerStarted","Data":"07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.318592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "35e0d910-5029-444b-83e7-92d44c8ae55a" (UID: "35e0d910-5029-444b-83e7-92d44c8ae55a"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.318835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"38825357-2812-4711-bc86-7638075f8daa","Type":"ContainerStarted","Data":"b33f328b366d5b406b5f6b66e852340992c7f44a32e052c5c36f355d0e2c103a"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.320913 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.321459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" event={"ID":"a1cb8df6-9987-4102-9599-2a0a39252432","Type":"ContainerStarted","Data":"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7hbm\" (UniqueName: \"kubernetes.io/projected/206e3d62-bdaa-4326-8c66-a615e81200d8-kube-api-access-p7hbm\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-config\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-scripts\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.333850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.334074 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.346429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-scripts\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347099 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347125 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347137 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e0d910-5029-444b-83e7-92d44c8ae55a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347150 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/35e0d910-5029-444b-83e7-92d44c8ae55a-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347163 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvbgc\" (UniqueName: \"kubernetes.io/projected/35e0d910-5029-444b-83e7-92d44c8ae55a-kube-api-access-gvbgc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347185 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.347195 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35e0d910-5029-444b-83e7-92d44c8ae55a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.353085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.357148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-config\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.361030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.361191 4707 generic.go:334] "Generic (PLEG): container finished" podID="35e0d910-5029-444b-83e7-92d44c8ae55a" containerID="8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca" exitCode=0 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.361231 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.361252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"35e0d910-5029-444b-83e7-92d44c8ae55a","Type":"ContainerDied","Data":"8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.361294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"35e0d910-5029-444b-83e7-92d44c8ae55a","Type":"ContainerDied","Data":"7defbc4a0578aee77e2fc6d52d39cbeb798da5105f669dd556ab3d763c1caf97"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.361311 4707 scope.go:117] "RemoveContainer" containerID="8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.365089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"28a92eee-d8bb-46d5-bacb-e641b5be138f","Type":"ContainerStarted","Data":"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.371279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.376628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3e495769-a033-41f2-8285-3f3274673250","Type":"ContainerStarted","Data":"1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.389176 4707 generic.go:334] "Generic (PLEG): container finished" podID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerID="a9a94e2300c8f0cebf8271a5715c7814c9ee668a8c0685fdaf947f3e5b501fa5" exitCode=143 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.389410 4707 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.389464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" event={"ID":"e57ace7a-26b6-433e-99aa-891427aaf04c","Type":"ContainerDied","Data":"a9a94e2300c8f0cebf8271a5715c7814c9ee668a8c0685fdaf947f3e5b501fa5"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.399631 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.399861 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="0db5205c-c226-4ce2-b734-b250d0c4f7f5" containerName="nova-scheduler-scheduler" containerID="cri-o://921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.403309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.413533 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.416789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerStarted","Data":"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.432248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7hbm\" (UniqueName: \"kubernetes.io/projected/206e3d62-bdaa-4326-8c66-a615e81200d8-kube-api-access-p7hbm\") pod \"ovn-northd-0\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.443710 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.455653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.470336 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.475205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"84ca704f-a732-47ec-a479-dabdeea5b864","Type":"ContainerStarted","Data":"718935d494e8b85ea1bc82992ceafdac9be0e8ac69c479cdd2bb1b38dadd80df"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.475397 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api-log" containerID="cri-o://e3b398a48d728c6bd65d6a353d60864071f543173e91de5b29ab7fb55a485387" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.475481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:16 crc kubenswrapper[4707]: 
I0121 15:54:16.475730 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api" containerID="cri-o://718935d494e8b85ea1bc82992ceafdac9be0e8ac69c479cdd2bb1b38dadd80df" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.480197 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.532389 4707 generic.go:334] "Generic (PLEG): container finished" podID="e86ffe25-a844-40ac-ac5f-11835ee59208" containerID="20b345e63520b4db3415497327c3575148e3e1b43402edfc68843451a441b7df" exitCode=1 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.534223 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="d2613b63-6d79-4d28-9be6-ca7a8431c724" containerName="memcached" containerID="cri-o://c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.539528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e86ffe25-a844-40ac-ac5f-11835ee59208","Type":"ContainerDied","Data":"20b345e63520b4db3415497327c3575148e3e1b43402edfc68843451a441b7df"} Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.550872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-578d4bcc98-hmqrg"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.551153 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-api" containerID="cri-o://a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.551447 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-httpd" containerID="cri-o://5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.569102 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.111:9696/\": read tcp 10.217.0.2:35036->10.217.0.111:9696: read: connection reset by peer" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.581168 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7dd959896b-rsffl"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.581492 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" podUID="3596326f-8c58-4af1-963e-916d15976926" containerName="keystone-api" containerID="cri-o://29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa" gracePeriod=30 Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.587433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.598699 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.609901 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-77c8997466-bd6bs"] Jan 21 15:54:16 crc kubenswrapper[4707]: E0121 15:54:16.628929 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e0d910-5029-444b-83e7-92d44c8ae55a" containerName="mysql-bootstrap" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.629271 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e0d910-5029-444b-83e7-92d44c8ae55a" containerName="mysql-bootstrap" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.630082 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e0d910-5029-444b-83e7-92d44c8ae55a" containerName="mysql-bootstrap" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.631899 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.631968 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.646428 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-77c8997466-bd6bs"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.666562 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.687063 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.692555 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.712174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj"] Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.743480 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=7.7434609739999996 podStartE2EDuration="7.743460974s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:16.518558304 +0000 UTC m=+3153.700074546" watchObservedRunningTime="2026-01-21 15:54:16.743460974 +0000 UTC m=+3153.924977196" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-public-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-ovndb-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hcr\" (UniqueName: \"kubernetes.io/projected/c197e98a-7956-4c34-a31a-911a28cf4a1a-kube-api-access-29hcr\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-combined-ca-bundle\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-public-tls-certs\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knx4t\" (UniqueName: \"kubernetes.io/projected/16882f47-6637-4e33-82a3-17da2f553dda-kube-api-access-knx4t\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-credential-keys\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-fernet-keys\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.768968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-internal-tls-certs\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.769002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-config-data\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.769046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-httpd-config\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.769065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-combined-ca-bundle\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.769091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-config\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.769107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-internal-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.769136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-scripts\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.824197 4707 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" podUID="3596326f-8c58-4af1-963e-916d15976926" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.112:5000/v3\": read tcp 10.217.0.2:59350->10.217.0.112:5000: read: connection reset by peer" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.846733 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-credential-keys\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-fernet-keys\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-internal-tls-certs\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-config-data\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-httpd-config\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-combined-ca-bundle\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-config\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-internal-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " 
pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-scripts\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-public-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-ovndb-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29hcr\" (UniqueName: \"kubernetes.io/projected/c197e98a-7956-4c34-a31a-911a28cf4a1a-kube-api-access-29hcr\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-combined-ca-bundle\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-public-tls-certs\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.877778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knx4t\" (UniqueName: \"kubernetes.io/projected/16882f47-6637-4e33-82a3-17da2f553dda-kube-api-access-knx4t\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.883547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-internal-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.894381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-combined-ca-bundle\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " 
pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.894427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-public-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.894728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-credential-keys\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.895021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-config-data\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.897095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-scripts\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.898316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-ovndb-tls-certs\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.898796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knx4t\" (UniqueName: \"kubernetes.io/projected/16882f47-6637-4e33-82a3-17da2f553dda-kube-api-access-knx4t\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.901145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-internal-tls-certs\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.901379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hcr\" (UniqueName: \"kubernetes.io/projected/c197e98a-7956-4c34-a31a-911a28cf4a1a-kube-api-access-29hcr\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.901699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-public-tls-certs\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc 
kubenswrapper[4707]: I0121 15:54:16.902021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-combined-ca-bundle\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.902086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-fernet-keys\") pod \"keystone-6fc7ff9f9-jtsmj\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.905297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-httpd-config\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:16 crc kubenswrapper[4707]: I0121 15:54:16.908389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-config\") pod \"neutron-77c8997466-bd6bs\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.023468 4707 scope.go:117] "RemoveContainer" containerID="8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca" Jan 21 15:54:17 crc kubenswrapper[4707]: E0121 15:54:17.027845 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca\": container with ID starting with 8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca not found: ID does not exist" containerID="8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.027875 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca"} err="failed to get container status \"8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca\": rpc error: code = NotFound desc = could not find container \"8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca\": container with ID starting with 8bba3947d2313fd7aaaedb865a30ba4e3d39c0f33edee6988e46859994f3ddca not found: ID does not exist" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.184528 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:54:17 crc kubenswrapper[4707]: E0121 15:54:17.184895 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.198991 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="303e46a4-d2f9-4512-ae87-c50a52d62019" path="/var/lib/kubelet/pods/303e46a4-d2f9-4512-ae87-c50a52d62019/volumes" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.199708 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47097d1a-6225-4900-85a1-a28f2f698a78" path="/var/lib/kubelet/pods/47097d1a-6225-4900-85a1-a28f2f698a78/volumes" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.200282 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b" path="/var/lib/kubelet/pods/b3bd9fcf-a0cd-4ab5-89d5-9a26599ae84b/volumes" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.499612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.508237 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.536846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.551942 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.560361 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.574242 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.576616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ddb5dfe8-ec47-4bb6-b6c0-403095118b07","Type":"ContainerStarted","Data":"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.579086 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:17 crc kubenswrapper[4707]: E0121 15:54:17.579497 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86ffe25-a844-40ac-ac5f-11835ee59208" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.579511 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86ffe25-a844-40ac-ac5f-11835ee59208" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.579699 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86ffe25-a844-40ac-ac5f-11835ee59208" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.580131 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.580967 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.586738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-dr5v9" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.586936 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.587057 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.587278 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.593594 4707 generic.go:334] "Generic (PLEG): container finished" podID="84ca704f-a732-47ec-a479-dabdeea5b864" containerID="718935d494e8b85ea1bc82992ceafdac9be0e8ac69c479cdd2bb1b38dadd80df" exitCode=0 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.593619 4707 generic.go:334] "Generic (PLEG): container finished" podID="84ca704f-a732-47ec-a479-dabdeea5b864" containerID="e3b398a48d728c6bd65d6a353d60864071f543173e91de5b29ab7fb55a485387" exitCode=143 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.593660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"84ca704f-a732-47ec-a479-dabdeea5b864","Type":"ContainerDied","Data":"718935d494e8b85ea1bc82992ceafdac9be0e8ac69c479cdd2bb1b38dadd80df"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.593680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"84ca704f-a732-47ec-a479-dabdeea5b864","Type":"ContainerDied","Data":"e3b398a48d728c6bd65d6a353d60864071f543173e91de5b29ab7fb55a485387"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.593689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"84ca704f-a732-47ec-a479-dabdeea5b864","Type":"ContainerDied","Data":"a702d1d44ca597a78f8858547601e56e9c70e7cb4efcd0e02b6d1406beda7e20"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.593697 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a702d1d44ca597a78f8858547601e56e9c70e7cb4efcd0e02b6d1406beda7e20" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.598900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" event={"ID":"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea","Type":"ContainerStarted","Data":"b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.599694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.600831 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.603392 4707 generic.go:334] "Generic (PLEG): container finished" podID="3596326f-8c58-4af1-963e-916d15976926" containerID="29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa" exitCode=0 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.603439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" event={"ID":"3596326f-8c58-4af1-963e-916d15976926","Type":"ContainerDied","Data":"29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.603457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" event={"ID":"3596326f-8c58-4af1-963e-916d15976926","Type":"ContainerDied","Data":"4217fa6d49e93744c13a6b4bbf06a5a667420ad031da12a149cf74c3e29d3ed2"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.603472 4707 scope.go:117] "RemoveContainer" containerID="29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.603555 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7dd959896b-rsffl" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.611125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" event={"ID":"318a94fd-66d7-4cfb-b312-a348e185ae2e","Type":"ContainerStarted","Data":"ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.611693 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.611715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.631503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"38825357-2812-4711-bc86-7638075f8daa","Type":"ContainerStarted","Data":"5609ff7c7f27e1828f46696621a20e1e765e3c6ae795b3240b11334dc334d6c4"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.677115 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.682326 4707 generic.go:334] "Generic (PLEG): container finished" podID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerID="acc31783e633d011092b6f8764551146c9afb0f71a7bb5976536eae6e9de6e07" exitCode=0 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.682935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" event={"ID":"e57ace7a-26b6-433e-99aa-891427aaf04c","Type":"ContainerDied","Data":"acc31783e633d011092b6f8764551146c9afb0f71a7bb5976536eae6e9de6e07"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.691478 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerID="5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263" exitCode=0 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.691858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" 
event={"ID":"eaa206fe-a991-4947-8e0e-0a8f80879f55","Type":"ContainerDied","Data":"5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.702135 4707 scope.go:117] "RemoveContainer" containerID="29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa" Jan 21 15:54:17 crc kubenswrapper[4707]: E0121 15:54:17.714506 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa\": container with ID starting with 29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa not found: ID does not exist" containerID="29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.714547 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa"} err="failed to get container status \"29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa\": rpc error: code = NotFound desc = could not find container \"29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa\": container with ID starting with 29acb29989c49b68f25c8b685429994ea4a9257119583c3a9a63de0776e881fa not found: ID does not exist" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-scripts\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6nwv\" (UniqueName: \"kubernetes.io/projected/e86ffe25-a844-40ac-ac5f-11835ee59208-kube-api-access-d6nwv\") pod \"e86ffe25-a844-40ac-ac5f-11835ee59208\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data-custom\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-config-data\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-combined-ca-bundle\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 
15:54:17.736739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-public-tls-certs\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-credential-keys\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-scripts\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84ca704f-a732-47ec-a479-dabdeea5b864-etc-machine-id\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-combined-ca-bundle\") pod \"e86ffe25-a844-40ac-ac5f-11835ee59208\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-fernet-keys\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk6db\" (UniqueName: \"kubernetes.io/projected/84ca704f-a732-47ec-a479-dabdeea5b864-kube-api-access-mk6db\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.736987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-public-tls-certs\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-config-data\") pod \"e86ffe25-a844-40ac-ac5f-11835ee59208\" (UID: \"e86ffe25-a844-40ac-ac5f-11835ee59208\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc6sh\" (UniqueName: \"kubernetes.io/projected/3596326f-8c58-4af1-963e-916d15976926-kube-api-access-tc6sh\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737079 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-internal-tls-certs\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-combined-ca-bundle\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ca704f-a732-47ec-a479-dabdeea5b864-logs\") pod \"84ca704f-a732-47ec-a479-dabdeea5b864\" (UID: \"84ca704f-a732-47ec-a479-dabdeea5b864\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-internal-tls-certs\") pod \"3596326f-8c58-4af1-963e-916d15976926\" (UID: \"3596326f-8c58-4af1-963e-916d15976926\") " Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.737664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.739154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.739188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.739380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdtt\" (UniqueName: \"kubernetes.io/projected/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kube-api-access-psdtt\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.739491 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.739536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.739578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.759761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" event={"ID":"fb142568-b2b7-467a-9b33-fc6c3bf725e3","Type":"ContainerStarted","Data":"a4d345f4aef43876ec0d8f9747eadb254c4cfb035eb0ac23f9808a04b5c70dd0"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.759879 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.764330 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.765088 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" podStartSLOduration=5.76506501 podStartE2EDuration="5.76506501s" podCreationTimestamp="2026-01-21 15:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:17.700763056 +0000 UTC m=+3154.882279278" watchObservedRunningTime="2026-01-21 15:54:17.76506501 +0000 UTC m=+3154.946581232" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.765530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ca704f-a732-47ec-a479-dabdeea5b864-logs" (OuterVolumeSpecName: "logs") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.765587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84ca704f-a732-47ec-a479-dabdeea5b864-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.772693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" event={"ID":"a1cb8df6-9987-4102-9599-2a0a39252432","Type":"ContainerStarted","Data":"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.773180 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-log" containerID="cri-o://c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.773556 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.773586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.775760 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-api" containerID="cri-o://9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.818336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3596326f-8c58-4af1-963e-916d15976926-kube-api-access-tc6sh" (OuterVolumeSpecName: "kube-api-access-tc6sh") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "kube-api-access-tc6sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.828779 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-log" containerID="cri-o://738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.829121 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.829994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e86ffe25-a844-40ac-ac5f-11835ee59208","Type":"ContainerDied","Data":"28e7c775b8d6a550d78ce1fe2166c81faa97a3847251fe24584b1ac5052220b5"} Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.830039 4707 scope.go:117] "RemoveContainer" containerID="20b345e63520b4db3415497327c3575148e3e1b43402edfc68843451a441b7df" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.830285 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-log" containerID="cri-o://06a9ecf9a704b3cf801825df72825f3564b9b3123770eb8150d305a227bca3ff" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.830432 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="728a3264-18ce-4d47-86f5-868f00cc4558" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://16c71ef64e21aa9d7aef3046c8569e917e4a9710d257ebb19afc5d621cdd72ff" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.830524 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-metadata" containerID="cri-o://e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.830628 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-api" containerID="cri-o://5efc0d6113609a65c6c8c728c1819ba46a5a6053e62048e66910b566697c74eb" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.870007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psdtt\" (UniqueName: \"kubernetes.io/projected/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kube-api-access-psdtt\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.872889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.893103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.893158 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84ca704f-a732-47ec-a479-dabdeea5b864-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.893197 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.893436 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc6sh\" (UniqueName: \"kubernetes.io/projected/3596326f-8c58-4af1-963e-916d15976926-kube-api-access-tc6sh\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.893474 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ca704f-a732-47ec-a479-dabdeea5b864-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.898916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kolla-config\") pod \"openstack-cell1-galera-0\" 
(UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.904042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.908196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.911725 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" podStartSLOduration=8.911711695 podStartE2EDuration="8.911711695s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:17.719204483 +0000 UTC m=+3154.900720705" watchObservedRunningTime="2026-01-21 15:54:17.911711695 +0000 UTC m=+3155.093227918" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.916115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.929555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd"] Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.929852 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener-log" containerID="cri-o://eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.929994 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener" containerID="cri-o://f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac" gracePeriod=30 Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.930144 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podStartSLOduration=8.930132635 podStartE2EDuration="8.930132635s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:17.825445501 +0000 UTC m=+3155.006961722" watchObservedRunningTime="2026-01-21 15:54:17.930132635 +0000 UTC m=+3155.111648846" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.933999 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-scripts" (OuterVolumeSpecName: "scripts") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.934357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ca704f-a732-47ec-a479-dabdeea5b864-kube-api-access-mk6db" (OuterVolumeSpecName: "kube-api-access-mk6db") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "kube-api-access-mk6db". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.943119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.943147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.943445 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-scripts" (OuterVolumeSpecName: "scripts") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.944111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.950076 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" podStartSLOduration=7.950046441 podStartE2EDuration="7.950046441s" podCreationTimestamp="2026-01-21 15:54:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:17.886420809 +0000 UTC m=+3155.067937031" watchObservedRunningTime="2026-01-21 15:54:17.950046441 +0000 UTC m=+3155.131562662" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.975206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.975242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86ffe25-a844-40ac-ac5f-11835ee59208-kube-api-access-d6nwv" (OuterVolumeSpecName: "kube-api-access-d6nwv") pod "e86ffe25-a844-40ac-ac5f-11835ee59208" (UID: "e86ffe25-a844-40ac-ac5f-11835ee59208"). InnerVolumeSpecName "kube-api-access-d6nwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:17 crc kubenswrapper[4707]: I0121 15:54:17.978272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdtt\" (UniqueName: \"kubernetes.io/projected/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kube-api-access-psdtt\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:17.999843 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.000254 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.000280 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.000868 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk6db\" (UniqueName: \"kubernetes.io/projected/84ca704f-a732-47ec-a479-dabdeea5b864-kube-api-access-mk6db\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.000888 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.000897 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6nwv\" (UniqueName: \"kubernetes.io/projected/e86ffe25-a844-40ac-ac5f-11835ee59208-kube-api-access-d6nwv\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.000905 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.141687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.163000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.168858 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-77c8997466-bd6bs"] Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.176004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e86ffe25-a844-40ac-ac5f-11835ee59208" (UID: "e86ffe25-a844-40ac-ac5f-11835ee59208"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.199927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.207308 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.207334 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.207343 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.225994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-config-data" (OuterVolumeSpecName: "config-data") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.228702 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.233943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.259950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3596326f-8c58-4af1-963e-916d15976926" (UID: "3596326f-8c58-4af1-963e-916d15976926"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.274787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-config-data" (OuterVolumeSpecName: "config-data") pod "e86ffe25-a844-40ac-ac5f-11835ee59208" (UID: "e86ffe25-a844-40ac-ac5f-11835ee59208"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.311504 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.311533 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3596326f-8c58-4af1-963e-916d15976926-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.311544 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86ffe25-a844-40ac-ac5f-11835ee59208-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.311552 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.318188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data" (OuterVolumeSpecName: "config-data") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.335429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.349893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "84ca704f-a732-47ec-a479-dabdeea5b864" (UID: "84ca704f-a732-47ec-a479-dabdeea5b864"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.416153 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.416181 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.416190 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca704f-a732-47ec-a479-dabdeea5b864-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.546999 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.547117 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.794024 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.836377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.847522 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:18 crc kubenswrapper[4707]: W0121 15:54:18.874891 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206e3d62_bdaa_4326_8c66_a615e81200d8.slice/crio-9039c2bd551565742c91068961f983f235ff686d5ec57640672a242876ff99ea WatchSource:0}: Error finding container 9039c2bd551565742c91068961f983f235ff686d5ec57640672a242876ff99ea: Status 404 returned error can't find the container with id 9039c2bd551565742c91068961f983f235ff686d5ec57640672a242876ff99ea Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.874984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj"] Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.885566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" event={"ID":"c197e98a-7956-4c34-a31a-911a28cf4a1a","Type":"ContainerStarted","Data":"20c5b4970d15c1a96b2d975842b377379f7937b4b5a64f863e062eb4ad7f03cd"} Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.887104 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.903715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"28a92eee-d8bb-46d5-bacb-e641b5be138f","Type":"ContainerStarted","Data":"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a"} Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.903933 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-log" containerID="cri-o://869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358" gracePeriod=30 Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.905546 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.905652 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-httpd" containerID="cri-o://b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a" gracePeriod=30 Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.918745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ddb5dfe8-ec47-4bb6-b6c0-403095118b07","Type":"ContainerStarted","Data":"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb"} Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.928171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data\") pod \"e57ace7a-26b6-433e-99aa-891427aaf04c\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.928283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data-custom\") pod \"e57ace7a-26b6-433e-99aa-891427aaf04c\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.928338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ace7a-26b6-433e-99aa-891427aaf04c-logs\") pod \"e57ace7a-26b6-433e-99aa-891427aaf04c\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.928395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-combined-ca-bundle\") pod \"e57ace7a-26b6-433e-99aa-891427aaf04c\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.928417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5b5s\" (UniqueName: \"kubernetes.io/projected/e57ace7a-26b6-433e-99aa-891427aaf04c-kube-api-access-h5b5s\") pod \"e57ace7a-26b6-433e-99aa-891427aaf04c\" (UID: \"e57ace7a-26b6-433e-99aa-891427aaf04c\") " Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.928924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/e57ace7a-26b6-433e-99aa-891427aaf04c-logs" (OuterVolumeSpecName: "logs") pod "e57ace7a-26b6-433e-99aa-891427aaf04c" (UID: "e57ace7a-26b6-433e-99aa-891427aaf04c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.932790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57ace7a-26b6-433e-99aa-891427aaf04c-kube-api-access-h5b5s" (OuterVolumeSpecName: "kube-api-access-h5b5s") pod "e57ace7a-26b6-433e-99aa-891427aaf04c" (UID: "e57ace7a-26b6-433e-99aa-891427aaf04c"). InnerVolumeSpecName "kube-api-access-h5b5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.933439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e57ace7a-26b6-433e-99aa-891427aaf04c" (UID: "e57ace7a-26b6-433e-99aa-891427aaf04c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.942937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.946383 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.979761 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.985767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"388a5540-d98c-4db0-bb1a-5a387ff8f7f3","Type":"ContainerStarted","Data":"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491"} Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.985969 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-log" containerID="cri-o://d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be" gracePeriod=30 Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.986164 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-httpd" containerID="cri-o://3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491" gracePeriod=30 Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.986792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7dd959896b-rsffl"] Jan 21 15:54:18 crc kubenswrapper[4707]: I0121 15:54:18.995548 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7dd959896b-rsffl"] Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.017283 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.017875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker-log" Jan 21 15:54:19 crc 
kubenswrapper[4707]: I0121 15:54:19.017943 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker-log" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.017993 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2613b63-6d79-4d28-9be6-ca7a8431c724" containerName="memcached" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018037 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2613b63-6d79-4d28-9be6-ca7a8431c724" containerName="memcached" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018098 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-log" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018208 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018270 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api-log" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018323 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-api" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018365 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-api" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018423 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-metadata" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018468 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-metadata" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018588 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018648 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018746 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3596326f-8c58-4af1-963e-916d15976926" containerName="keystone-api" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.018788 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3596326f-8c58-4af1-963e-916d15976926" containerName="keystone-api" Jan 21 15:54:19 crc kubenswrapper[4707]: E0121 15:54:19.018852 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-log" Jan 21 15:54:19 crc 
kubenswrapper[4707]: I0121 15:54:19.018910 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019105 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019162 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019210 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019256 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="087bdc72-3300-4e75-9b60-72380c090c95" containerName="nova-metadata-metadata" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019374 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" containerName="barbican-worker-log" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" containerName="cinder-api" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2613b63-6d79-4d28-9be6-ca7a8431c724" containerName="memcached" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019528 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3596326f-8c58-4af1-963e-916d15976926" containerName="keystone-api" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.019579 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" containerName="placement-api" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.020181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"12052361-f984-4545-a1e1-9a8b80e2cceb","Type":"ContainerStarted","Data":"66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.026934 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.024673 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.017404 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" containerID="cri-o://66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21" gracePeriod=30 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.030717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.030799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-public-tls-certs\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.030845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-combined-ca-bundle\") pod \"087bdc72-3300-4e75-9b60-72380c090c95\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.030942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087bdc72-3300-4e75-9b60-72380c090c95-logs\") pod \"087bdc72-3300-4e75-9b60-72380c090c95\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.030959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-combined-ca-bundle\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.030984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-config-data\") pod \"d2613b63-6d79-4d28-9be6-ca7a8431c724\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrtft\" (UniqueName: \"kubernetes.io/projected/d2613b63-6d79-4d28-9be6-ca7a8431c724-kube-api-access-jrtft\") pod \"d2613b63-6d79-4d28-9be6-ca7a8431c724\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-combined-ca-bundle\") pod \"d2613b63-6d79-4d28-9be6-ca7a8431c724\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zlvl\" (UniqueName: \"kubernetes.io/projected/a1cb8df6-9987-4102-9599-2a0a39252432-kube-api-access-2zlvl\") 
pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031091 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-nova-metadata-tls-certs\") pod \"087bdc72-3300-4e75-9b60-72380c090c95\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-config-data\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-kolla-config\") pod \"d2613b63-6d79-4d28-9be6-ca7a8431c724\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-config-data\") pod \"087bdc72-3300-4e75-9b60-72380c090c95\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cb8df6-9987-4102-9599-2a0a39252432-logs\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-memcached-tls-certs\") pod \"d2613b63-6d79-4d28-9be6-ca7a8431c724\" (UID: \"d2613b63-6d79-4d28-9be6-ca7a8431c724\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-internal-tls-certs\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqww\" (UniqueName: \"kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww\") pod \"087bdc72-3300-4e75-9b60-72380c090c95\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031691 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ace7a-26b6-433e-99aa-891427aaf04c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031705 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5b5s\" (UniqueName: \"kubernetes.io/projected/e57ace7a-26b6-433e-99aa-891427aaf04c-kube-api-access-h5b5s\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.031714 4707 reconciler_common.go:293] 
"Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.036215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.036639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d2613b63-6d79-4d28-9be6-ca7a8431c724" (UID: "d2613b63-6d79-4d28-9be6-ca7a8431c724"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.039735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cb8df6-9987-4102-9599-2a0a39252432-logs" (OuterVolumeSpecName: "logs") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.042587 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.042709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/087bdc72-3300-4e75-9b60-72380c090c95-logs" (OuterVolumeSpecName: "logs") pod "087bdc72-3300-4e75-9b60-72380c090c95" (UID: "087bdc72-3300-4e75-9b60-72380c090c95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.044728 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.051648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-config-data" (OuterVolumeSpecName: "config-data") pod "d2613b63-6d79-4d28-9be6-ca7a8431c724" (UID: "d2613b63-6d79-4d28-9be6-ca7a8431c724"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.071706 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=10.071687457 podStartE2EDuration="10.071687457s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:18.936870256 +0000 UTC m=+3156.118386478" watchObservedRunningTime="2026-01-21 15:54:19.071687457 +0000 UTC m=+3156.253203680" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.082507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" event={"ID":"e57ace7a-26b6-433e-99aa-891427aaf04c","Type":"ContainerDied","Data":"77db2c9a60e3bff5990a162414a8616cd6cb058f1c8a92db5832d810c00f9b7f"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.082553 4707 scope.go:117] "RemoveContainer" containerID="acc31783e633d011092b6f8764551146c9afb0f71a7bb5976536eae6e9de6e07" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.082662 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.091563 4707 generic.go:334] "Generic (PLEG): container finished" podID="25ea2916-144f-4962-92ac-8d9de133ea36" containerID="eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67" exitCode=143 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.091611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" event={"ID":"25ea2916-144f-4962-92ac-8d9de133ea36","Type":"ContainerDied","Data":"eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.092657 4707 generic.go:334] "Generic (PLEG): container finished" podID="728a3264-18ce-4d47-86f5-868f00cc4558" containerID="16c71ef64e21aa9d7aef3046c8569e917e4a9710d257ebb19afc5d621cdd72ff" exitCode=0 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.092694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"728a3264-18ce-4d47-86f5-868f00cc4558","Type":"ContainerDied","Data":"16c71ef64e21aa9d7aef3046c8569e917e4a9710d257ebb19afc5d621cdd72ff"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.111416 4707 generic.go:334] "Generic (PLEG): container finished" podID="087bdc72-3300-4e75-9b60-72380c090c95" containerID="e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa" exitCode=0 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.111446 4707 generic.go:334] "Generic (PLEG): container finished" podID="087bdc72-3300-4e75-9b60-72380c090c95" containerID="738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6" exitCode=143 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.111494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"087bdc72-3300-4e75-9b60-72380c090c95","Type":"ContainerDied","Data":"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.111515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"087bdc72-3300-4e75-9b60-72380c090c95","Type":"ContainerDied","Data":"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.111527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"087bdc72-3300-4e75-9b60-72380c090c95","Type":"ContainerDied","Data":"f003fb6700a1377638da83bcae282818afea3ca4f1f8c135f2b2688b4b9a85a1"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.112545 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.119452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"38825357-2812-4711-bc86-7638075f8daa","Type":"ContainerStarted","Data":"b08b894935f483cea6e9fe5cb8204c859d7f717fe6ecb7b8a6bb483b59d86af8"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.123489 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1cb8df6-9987-4102-9599-2a0a39252432" containerID="9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d" exitCode=0 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.123599 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1cb8df6-9987-4102-9599-2a0a39252432" containerID="c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d" exitCode=143 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.123748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" event={"ID":"a1cb8df6-9987-4102-9599-2a0a39252432","Type":"ContainerDied","Data":"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.123911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" event={"ID":"a1cb8df6-9987-4102-9599-2a0a39252432","Type":"ContainerDied","Data":"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.123995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" event={"ID":"a1cb8df6-9987-4102-9599-2a0a39252432","Type":"ContainerDied","Data":"65e36c174cc5207360bd0cf22fdea4e618ae32245ea43ee1288e1e6637670159"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.124189 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9d4657458-gkqnm" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.140655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3e495769-a033-41f2-8285-3f3274673250","Type":"ContainerStarted","Data":"25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.146612 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=7.146595265 podStartE2EDuration="7.146595265s" podCreationTimestamp="2026-01-21 15:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:19.017146128 +0000 UTC m=+3156.198662350" watchObservedRunningTime="2026-01-21 15:54:19.146595265 +0000 UTC m=+3156.328111487" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.149528 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts" (OuterVolumeSpecName: "scripts") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.150329 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=10.150319838 podStartE2EDuration="10.150319838s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:19.045121994 +0000 UTC m=+3156.226638216" watchObservedRunningTime="2026-01-21 15:54:19.150319838 +0000 UTC m=+3156.331836061" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.150605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww" (OuterVolumeSpecName: "kube-api-access-xlqww") pod "087bdc72-3300-4e75-9b60-72380c090c95" (UID: "087bdc72-3300-4e75-9b60-72380c090c95"). InnerVolumeSpecName "kube-api-access-xlqww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.151055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1cb8df6-9987-4102-9599-2a0a39252432-kube-api-access-2zlvl" (OuterVolumeSpecName: "kube-api-access-2zlvl") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "kube-api-access-2zlvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.155961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqww\" (UniqueName: \"kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww\") pod \"087bdc72-3300-4e75-9b60-72380c090c95\" (UID: \"087bdc72-3300-4e75-9b60-72380c090c95\") " Jan 21 15:54:19 crc kubenswrapper[4707]: W0121 15:54:19.156299 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/087bdc72-3300-4e75-9b60-72380c090c95/volumes/kubernetes.io~projected/kube-api-access-xlqww Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.156314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts\") pod \"a1cb8df6-9987-4102-9599-2a0a39252432\" (UID: \"a1cb8df6-9987-4102-9599-2a0a39252432\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.156356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww" (OuterVolumeSpecName: "kube-api-access-xlqww") pod "087bdc72-3300-4e75-9b60-72380c090c95" (UID: "087bdc72-3300-4e75-9b60-72380c090c95"). InnerVolumeSpecName "kube-api-access-xlqww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: W0121 15:54:19.156586 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a1cb8df6-9987-4102-9599-2a0a39252432/volumes/kubernetes.io~secret/scripts Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.156599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts" (OuterVolumeSpecName: "scripts") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7zz\" (UniqueName: \"kubernetes.io/projected/fb3e6e1d-19d1-4310-907f-a252e900b54a-kube-api-access-ql7zz\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158613 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2613b63-6d79-4d28-9be6-ca7a8431c724" containerID="c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8" exitCode=0 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d2613b63-6d79-4d28-9be6-ca7a8431c724","Type":"ContainerDied","Data":"c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d2613b63-6d79-4d28-9be6-ca7a8431c724","Type":"ContainerDied","Data":"bc186b096804ce32276dc98c1a294a7075e03b669bc0882cb5417fd3a9aee28d"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.158864 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159640 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087bdc72-3300-4e75-9b60-72380c090c95-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159662 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159675 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zlvl\" (UniqueName: \"kubernetes.io/projected/a1cb8df6-9987-4102-9599-2a0a39252432-kube-api-access-2zlvl\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159684 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d2613b63-6d79-4d28-9be6-ca7a8431c724-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159694 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cb8df6-9987-4102-9599-2a0a39252432-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159705 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlqww\" (UniqueName: \"kubernetes.io/projected/087bdc72-3300-4e75-9b60-72380c090c95-kube-api-access-xlqww\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.159713 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.171941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2613b63-6d79-4d28-9be6-ca7a8431c724-kube-api-access-jrtft" (OuterVolumeSpecName: "kube-api-access-jrtft") pod "d2613b63-6d79-4d28-9be6-ca7a8431c724" (UID: "d2613b63-6d79-4d28-9be6-ca7a8431c724"). InnerVolumeSpecName "kube-api-access-jrtft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.201781 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-central-agent" containerID="cri-o://804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" gracePeriod=30 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.202352 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="proxy-httpd" containerID="cri-o://df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" gracePeriod=30 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.202465 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-notification-agent" containerID="cri-o://99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" gracePeriod=30 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.202534 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="sg-core" containerID="cri-o://64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" gracePeriod=30 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.212926 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3596326f-8c58-4af1-963e-916d15976926" path="/var/lib/kubelet/pods/3596326f-8c58-4af1-963e-916d15976926/volumes" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.213556 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e0d910-5029-444b-83e7-92d44c8ae55a" path="/var/lib/kubelet/pods/35e0d910-5029-444b-83e7-92d44c8ae55a/volumes" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.217399 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86ffe25-a844-40ac-ac5f-11835ee59208" path="/var/lib/kubelet/pods/e86ffe25-a844-40ac-ac5f-11835ee59208/volumes" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.246118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e57ace7a-26b6-433e-99aa-891427aaf04c" (UID: "e57ace7a-26b6-433e-99aa-891427aaf04c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.265221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.265380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.265401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7zz\" (UniqueName: \"kubernetes.io/projected/fb3e6e1d-19d1-4310-907f-a252e900b54a-kube-api-access-ql7zz\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.265604 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.265615 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrtft\" (UniqueName: \"kubernetes.io/projected/d2613b63-6d79-4d28-9be6-ca7a8431c724-kube-api-access-jrtft\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.287657 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=5.287641519 podStartE2EDuration="5.287641519s" podCreationTimestamp="2026-01-21 15:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:19.13063382 +0000 UTC m=+3156.312150042" watchObservedRunningTime="2026-01-21 15:54:19.287641519 +0000 UTC m=+3156.469157742" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.290057 4707 generic.go:334] "Generic (PLEG): container finished" podID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerID="5efc0d6113609a65c6c8c728c1819ba46a5a6053e62048e66910b566697c74eb" exitCode=0 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.290087 4707 generic.go:334] "Generic (PLEG): container finished" podID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerID="06a9ecf9a704b3cf801825df72825f3564b9b3123770eb8150d305a227bca3ff" exitCode=143 Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.290176 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.291993 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=10.291985899 podStartE2EDuration="10.291985899s" podCreationTimestamp="2026-01-21 15:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:19.161171715 +0000 UTC m=+3156.342687937" watchObservedRunningTime="2026-01-21 15:54:19.291985899 +0000 UTC m=+3156.473502122" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.306350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.306555 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=4.366027698 podStartE2EDuration="12.306546299s" podCreationTimestamp="2026-01-21 15:54:07 +0000 UTC" firstStartedPulling="2026-01-21 15:54:09.11619122 +0000 UTC m=+3146.297707442" lastFinishedPulling="2026-01-21 15:54:17.05670982 +0000 UTC m=+3154.238226043" observedRunningTime="2026-01-21 15:54:19.241383265 +0000 UTC m=+3156.422899487" watchObservedRunningTime="2026-01-21 15:54:19.306546299 +0000 UTC m=+3156.488062520" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.311012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.311911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "087bdc72-3300-4e75-9b60-72380c090c95" (UID: "087bdc72-3300-4e75-9b60-72380c090c95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.313177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql7zz\" (UniqueName: \"kubernetes.io/projected/fb3e6e1d-19d1-4310-907f-a252e900b54a-kube-api-access-ql7zz\") pod \"nova-cell0-conductor-0\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.367750 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.460421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-config-data" (OuterVolumeSpecName: "config-data") pod "087bdc72-3300-4e75-9b60-72380c090c95" (UID: "087bdc72-3300-4e75-9b60-72380c090c95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.460514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "087bdc72-3300-4e75-9b60-72380c090c95" (UID: "087bdc72-3300-4e75-9b60-72380c090c95"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.471642 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.471671 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087bdc72-3300-4e75-9b60-72380c090c95-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.520919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2613b63-6d79-4d28-9be6-ca7a8431c724" (UID: "d2613b63-6d79-4d28-9be6-ca7a8431c724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.571549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "d2613b63-6d79-4d28-9be6-ca7a8431c724" (UID: "d2613b63-6d79-4d28-9be6-ca7a8431c724"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.576487 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.580071 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2613b63-6d79-4d28-9be6-ca7a8431c724-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.594060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.594515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data" (OuterVolumeSpecName: "config-data") pod "e57ace7a-26b6-433e-99aa-891427aaf04c" (UID: "e57ace7a-26b6-433e-99aa-891427aaf04c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.640374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-config-data" (OuterVolumeSpecName: "config-data") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.671145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.671977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a1cb8df6-9987-4102-9599-2a0a39252432" (UID: "a1cb8df6-9987-4102-9599-2a0a39252432"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.681611 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.681885 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.681897 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.681907 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cb8df6-9987-4102-9599-2a0a39252432-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.681915 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ace7a-26b6-433e-99aa-891427aaf04c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.831985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832017 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerStarted","Data":"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"d70efa8f-44f9-48d8-8beb-531c69225631","Type":"ContainerStarted","Data":"012e28043224893a29e9aa0317b193ef300ff4ee927d53fc44201c8909337525"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"da5cd14d-4d69-4f5a-9901-315fe321579c","Type":"ContainerDied","Data":"5efc0d6113609a65c6c8c728c1819ba46a5a6053e62048e66910b566697c74eb"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"da5cd14d-4d69-4f5a-9901-315fe321579c","Type":"ContainerDied","Data":"06a9ecf9a704b3cf801825df72825f3564b9b3123770eb8150d305a227bca3ff"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"da5cd14d-4d69-4f5a-9901-315fe321579c","Type":"ContainerDied","Data":"16700dde006dd42b01ae254a129416f3fa8868ef0abf67cf502defe2bbdb21fd"} Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.832095 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16700dde006dd42b01ae254a129416f3fa8868ef0abf67cf502defe2bbdb21fd" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.835503 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.848793 4707 scope.go:117] "RemoveContainer" containerID="a9a94e2300c8f0cebf8271a5715c7814c9ee668a8c0685fdaf947f3e5b501fa5" Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.991922 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-public-tls-certs\") pod \"da5cd14d-4d69-4f5a-9901-315fe321579c\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.992014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-config-data\") pod \"da5cd14d-4d69-4f5a-9901-315fe321579c\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.992121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbwq8\" (UniqueName: \"kubernetes.io/projected/da5cd14d-4d69-4f5a-9901-315fe321579c-kube-api-access-gbwq8\") pod \"da5cd14d-4d69-4f5a-9901-315fe321579c\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.992157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5cd14d-4d69-4f5a-9901-315fe321579c-logs\") pod \"da5cd14d-4d69-4f5a-9901-315fe321579c\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.992308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-internal-tls-certs\") pod \"da5cd14d-4d69-4f5a-9901-315fe321579c\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.992335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-combined-ca-bundle\") pod \"da5cd14d-4d69-4f5a-9901-315fe321579c\" (UID: \"da5cd14d-4d69-4f5a-9901-315fe321579c\") " Jan 21 15:54:19 crc kubenswrapper[4707]: I0121 15:54:19.992733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da5cd14d-4d69-4f5a-9901-315fe321579c-logs" (OuterVolumeSpecName: "logs") pod "da5cd14d-4d69-4f5a-9901-315fe321579c" (UID: "da5cd14d-4d69-4f5a-9901-315fe321579c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.022921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da5cd14d-4d69-4f5a-9901-315fe321579c" (UID: "da5cd14d-4d69-4f5a-9901-315fe321579c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.023329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5cd14d-4d69-4f5a-9901-315fe321579c-kube-api-access-gbwq8" (OuterVolumeSpecName: "kube-api-access-gbwq8") pod "da5cd14d-4d69-4f5a-9901-315fe321579c" (UID: "da5cd14d-4d69-4f5a-9901-315fe321579c"). InnerVolumeSpecName "kube-api-access-gbwq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.051725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "da5cd14d-4d69-4f5a-9901-315fe321579c" (UID: "da5cd14d-4d69-4f5a-9901-315fe321579c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.057418 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:48384->192.168.25.165:36655: write tcp 192.168.25.165:48384->192.168.25.165:36655: write: connection reset by peer Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.079986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-config-data" (OuterVolumeSpecName: "config-data") pod "da5cd14d-4d69-4f5a-9901-315fe321579c" (UID: "da5cd14d-4d69-4f5a-9901-315fe321579c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.094397 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.094424 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.094434 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.094444 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbwq8\" (UniqueName: \"kubernetes.io/projected/da5cd14d-4d69-4f5a-9901-315fe321579c-kube-api-access-gbwq8\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.094455 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5cd14d-4d69-4f5a-9901-315fe321579c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.095307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "da5cd14d-4d69-4f5a-9901-315fe321579c" (UID: "da5cd14d-4d69-4f5a-9901-315fe321579c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.126296 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.126353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.138824 4707 scope.go:117] "RemoveContainer" containerID="e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.177982 4707 scope.go:117] "RemoveContainer" containerID="738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.196493 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5cd14d-4d69-4f5a-9901-315fe321579c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.224009 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.225173 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.226471 4707 scope.go:117] "RemoveContainer" containerID="e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.232236 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa\": container with ID starting with e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa not found: ID does not exist" containerID="e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.232289 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa"} err="failed to get container status \"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa\": rpc error: code = NotFound desc = could not find container \"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa\": container with ID starting with e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa not found: ID does not exist" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.232316 4707 scope.go:117] "RemoveContainer" containerID="738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.235901 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6\": container with ID starting with 738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6 not found: ID does not exist" containerID="738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.235947 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6"} err="failed to get container status \"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6\": rpc error: code = NotFound desc = could not find container \"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6\": container with ID starting with 738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6 not found: ID does not exist" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.235968 4707 scope.go:117] "RemoveContainer" containerID="e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.238930 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.241723 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.241836 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa"} err="failed to get container status \"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa\": rpc error: code = NotFound desc = could not find container \"e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa\": container with ID starting with e607a9b84a526401ce07a186a21f17bc2b754e041ed3d125291b06adbdd14eaa not found: ID does not exist" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.241856 4707 scope.go:117] "RemoveContainer" containerID="738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.246070 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6"} err="failed to get container status \"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6\": rpc error: code = NotFound desc = could not find container \"738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6\": container with ID starting with 738a3f20a6ca0cd993ff26bb326d88378a0d45f4ebdc9a4dcc3b845537b737b6 not found: ID does not exist" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.246097 4707 scope.go:117] "RemoveContainer" containerID="9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.247288 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.252766 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.252857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253204 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="proxy-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253216 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="proxy-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253231 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="sg-core" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253237 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="sg-core" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253245 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253250 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253330 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a3264-18ce-4d47-86f5-868f00cc4558" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253337 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a3264-18ce-4d47-86f5-868f00cc4558" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253345 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-notification-agent" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253350 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-notification-agent" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253364 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-api" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253371 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-api" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253389 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253395 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-log" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253406 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253411 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-log" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253420 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253425 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253434 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-central-agent" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-central-agent" Jan 21 15:54:20 crc kubenswrapper[4707]: E0121 15:54:20.253461 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253648 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253658 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="proxy-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253665 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-log" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253687 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="sg-core" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253694 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a3264-18ce-4d47-86f5-868f00cc4558" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253703 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-central-agent" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253715 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" containerName="ceilometer-notification-agent" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253734 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" containerName="nova-api-api" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253741 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerName="glance-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.253749 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerName="glance-httpd" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.254659 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.267653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9d4657458-gkqnm"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.276899 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.284336 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.285169 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.297979 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.300854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-config-data\") pod \"728a3264-18ce-4d47-86f5-868f00cc4558\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.301011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xp47\" (UniqueName: \"kubernetes.io/projected/728a3264-18ce-4d47-86f5-868f00cc4558-kube-api-access-6xp47\") pod \"728a3264-18ce-4d47-86f5-868f00cc4558\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.301086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-vencrypt-tls-certs\") pod \"728a3264-18ce-4d47-86f5-868f00cc4558\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.301127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-nova-novncproxy-tls-certs\") pod \"728a3264-18ce-4d47-86f5-868f00cc4558\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.301245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-combined-ca-bundle\") pod \"728a3264-18ce-4d47-86f5-868f00cc4558\" (UID: \"728a3264-18ce-4d47-86f5-868f00cc4558\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.359946 4707 scope.go:117] "RemoveContainer" containerID="c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.367497 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-9d4657458-gkqnm"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.389600 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399495 4707 generic.go:334] "Generic (PLEG): container finished" podID="e7b163de-7daf-4893-abcd-9fefd3850234" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" exitCode=0 Jan 21 15:54:20 crc 
kubenswrapper[4707]: I0121 15:54:20.399518 4707 generic.go:334] "Generic (PLEG): container finished" podID="e7b163de-7daf-4893-abcd-9fefd3850234" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" exitCode=2 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399526 4707 generic.go:334] "Generic (PLEG): container finished" podID="e7b163de-7daf-4893-abcd-9fefd3850234" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" exitCode=0 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399532 4707 generic.go:334] "Generic (PLEG): container finished" podID="e7b163de-7daf-4893-abcd-9fefd3850234" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" exitCode=0 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerDied","Data":"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerDied","Data":"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerDied","Data":"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerDied","Data":"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b163de-7daf-4893-abcd-9fefd3850234","Type":"ContainerDied","Data":"8027aeac2953b1103c96572f6a7f5fae90c69fe974d0026f8d6c56b1eb92ba61"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.399816 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.403616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-internal-tls-certs\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.403913 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-sg-core-conf-yaml\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.404929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.405028 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-combined-ca-bundle\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wl6g\" (UniqueName: \"kubernetes.io/projected/e7b163de-7daf-4893-abcd-9fefd3850234-kube-api-access-7wl6g\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-logs\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqztn\" (UniqueName: \"kubernetes.io/projected/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-kube-api-access-nqztn\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-ceilometer-tls-certs\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-config-data\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-scripts\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: 
\"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-run-httpd\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-scripts\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-config-data\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-logs\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-combined-ca-bundle\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-log-httpd\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-config-data\") pod \"e7b163de-7daf-4893-abcd-9fefd3850234\" (UID: \"e7b163de-7daf-4893-abcd-9fefd3850234\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-scripts\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-public-tls-certs\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406973 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-combined-ca-bundle\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.406995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-httpd-run\") pod \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\" (UID: \"388a5540-d98c-4db0-bb1a-5a387ff8f7f3\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh9sg\" (UniqueName: \"kubernetes.io/projected/28a92eee-d8bb-46d5-bacb-e641b5be138f-kube-api-access-kh9sg\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-httpd-run\") pod \"28a92eee-d8bb-46d5-bacb-e641b5be138f\" (UID: \"28a92eee-d8bb-46d5-bacb-e641b5be138f\") " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-config-data\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gr9m\" (UniqueName: \"kubernetes.io/projected/68a74a76-0bd8-48e1-87c9-778b381f3fa0-kube-api-access-5gr9m\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68a74a76-0bd8-48e1-87c9-778b381f3fa0-logs\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.407646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.408117 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-logs" (OuterVolumeSpecName: "logs") pod 
"388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.430140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.432229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-logs" (OuterVolumeSpecName: "logs") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.436949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d70efa8f-44f9-48d8-8beb-531c69225631","Type":"ContainerStarted","Data":"3e29351ea171b68ab5f386f24df504ebd7cdf39483d33555a2145f0a35513183"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.453133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.453473 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.454846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.458021 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.459246 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.463205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.477293 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.478179 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.478224 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-2x7h6" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.493657 4707 generic.go:334] "Generic (PLEG): container finished" podID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerID="3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491" exitCode=0 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.493692 4707 generic.go:334] "Generic (PLEG): container finished" podID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" containerID="d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be" exitCode=143 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.493782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"388a5540-d98c-4db0-bb1a-5a387ff8f7f3","Type":"ContainerDied","Data":"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.493834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"388a5540-d98c-4db0-bb1a-5a387ff8f7f3","Type":"ContainerDied","Data":"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.493847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"388a5540-d98c-4db0-bb1a-5a387ff8f7f3","Type":"ContainerDied","Data":"c27b99c8e9dedf56fd90ae88b2072d2818dbe99855377c8baec414e6db09ee85"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.493980 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.503860 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.514967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-config-data\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gr9m\" (UniqueName: \"kubernetes.io/projected/68a74a76-0bd8-48e1-87c9-778b381f3fa0-kube-api-access-5gr9m\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68a74a76-0bd8-48e1-87c9-778b381f3fa0-logs\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515571 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515591 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515601 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515610 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a92eee-d8bb-46d5-bacb-e641b5be138f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515620 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b163de-7daf-4893-abcd-9fefd3850234-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.515629 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.516321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68a74a76-0bd8-48e1-87c9-778b381f3fa0-logs\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.534853 4707 generic.go:334] "Generic (PLEG): container finished" podID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerID="b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a" exitCode=0 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.534882 4707 generic.go:334] "Generic (PLEG): container finished" podID="28a92eee-d8bb-46d5-bacb-e641b5be138f" containerID="869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358" exitCode=143 Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.534977 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.535652 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.535675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"28a92eee-d8bb-46d5-bacb-e641b5be138f","Type":"ContainerDied","Data":"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.535695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"28a92eee-d8bb-46d5-bacb-e641b5be138f","Type":"ContainerDied","Data":"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.535706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"28a92eee-d8bb-46d5-bacb-e641b5be138f","Type":"ContainerDied","Data":"fdb85155a5d668dd87d07fc89c8d39d5fe58e426936b9aca1fec084e052fbd48"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.547452 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cc88854d5-cfgq8"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.552653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.558497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"01ec6ded-9697-471a-80d6-9ed5fdd9918e","Type":"ContainerStarted","Data":"eda33eb1062eb5c805e7b51d04e65e94eb0f25597b4db4bdfd5b219d72577a1c"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.560945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.564978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"728a3264-18ce-4d47-86f5-868f00cc4558","Type":"ContainerDied","Data":"52a876ea60f2280980602b262fa95fa606243036aec33822245ef9b90807649d"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.565083 4707 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.568184 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.570498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" event={"ID":"16882f47-6637-4e33-82a3-17da2f553dda","Type":"ContainerStarted","Data":"4473e83d9a8597f2bac4f36ff83e27194ed426055bac0014b63791e44f1531fe"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.570582 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.573180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"206e3d62-bdaa-4326-8c66-a615e81200d8","Type":"ContainerStarted","Data":"9039c2bd551565742c91068961f983f235ff686d5ec57640672a242876ff99ea"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.577473 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.579438 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.580135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" event={"ID":"c197e98a-7956-4c34-a31a-911a28cf4a1a","Type":"ContainerStarted","Data":"0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9"} Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.581507 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.581680 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.583398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-kolla-config\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-scripts\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-config-data\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620865 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76n8j\" (UniqueName: \"kubernetes.io/projected/7e21981d-0581-4e7c-9485-cfea4bffaa9d-kube-api-access-76n8j\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sf74\" (UniqueName: \"kubernetes.io/projected/658de89d-042f-452c-a654-586154c683a4-kube-api-access-6sf74\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.620997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e21981d-0581-4e7c-9485-cfea4bffaa9d-logs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.621068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.621084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.621098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.621289 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.621362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e21981d-0581-4e7c-9485-cfea4bffaa9d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.644102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728a3264-18ce-4d47-86f5-868f00cc4558-kube-api-access-6xp47" (OuterVolumeSpecName: "kube-api-access-6xp47") pod "728a3264-18ce-4d47-86f5-868f00cc4558" (UID: "728a3264-18ce-4d47-86f5-868f00cc4558"). InnerVolumeSpecName "kube-api-access-6xp47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.649395 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.655564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.657555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a92eee-d8bb-46d5-bacb-e641b5be138f-kube-api-access-kh9sg" (OuterVolumeSpecName: "kube-api-access-kh9sg") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "kube-api-access-kh9sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.659435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b163de-7daf-4893-abcd-9fefd3850234-kube-api-access-7wl6g" (OuterVolumeSpecName: "kube-api-access-7wl6g") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "kube-api-access-7wl6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.666658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-config-data\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.666906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-scripts" (OuterVolumeSpecName: "scripts") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.667069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.667863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.670123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gr9m\" (UniqueName: \"kubernetes.io/projected/68a74a76-0bd8-48e1-87c9-778b381f3fa0-kube-api-access-5gr9m\") pod \"nova-metadata-0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.672634 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-kube-api-access-nqztn" (OuterVolumeSpecName: "kube-api-access-nqztn") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "kube-api-access-nqztn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.676927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-scripts" (OuterVolumeSpecName: "scripts") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.676959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.681603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-scripts" (OuterVolumeSpecName: "scripts") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.723654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-kolla-config\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.723846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-scripts\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.723941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-config-data\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.723988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76n8j\" (UniqueName: \"kubernetes.io/projected/7e21981d-0581-4e7c-9485-cfea4bffaa9d-kube-api-access-76n8j\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sf74\" (UniqueName: \"kubernetes.io/projected/658de89d-042f-452c-a654-586154c683a4-kube-api-access-6sf74\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e21981d-0581-4e7c-9485-cfea4bffaa9d-logs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e21981d-0581-4e7c-9485-cfea4bffaa9d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724403 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724426 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724461 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh9sg\" (UniqueName: \"kubernetes.io/projected/28a92eee-d8bb-46d5-bacb-e641b5be138f-kube-api-access-kh9sg\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724475 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724484 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wl6g\" (UniqueName: \"kubernetes.io/projected/e7b163de-7daf-4893-abcd-9fefd3850234-kube-api-access-7wl6g\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724493 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqztn\" (UniqueName: \"kubernetes.io/projected/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-kube-api-access-nqztn\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724503 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xp47\" (UniqueName: \"kubernetes.io/projected/728a3264-18ce-4d47-86f5-868f00cc4558-kube-api-access-6xp47\") on node \"crc\" DevicePath \"\"" 
Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724535 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.724544 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.728645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-kolla-config\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.729052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e21981d-0581-4e7c-9485-cfea4bffaa9d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.729299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e21981d-0581-4e7c-9485-cfea4bffaa9d-logs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.730864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-config-data\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.746852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.752489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.752568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.752610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.752992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76n8j\" (UniqueName: 
\"kubernetes.io/projected/7e21981d-0581-4e7c-9485-cfea4bffaa9d-kube-api-access-76n8j\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.756427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-scripts\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.757493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.757854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.763690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.769398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sf74\" (UniqueName: \"kubernetes.io/projected/658de89d-042f-452c-a654-586154c683a4-kube-api-access-6sf74\") pod \"memcached-0\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.872541 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.929701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.935990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-config-data" (OuterVolumeSpecName: "config-data") pod "728a3264-18ce-4d47-86f5-868f00cc4558" (UID: "728a3264-18ce-4d47-86f5-868f00cc4558"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.956207 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:20 crc kubenswrapper[4707]: I0121 15:54:20.987547 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.012466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-config-data" (OuterVolumeSpecName: "config-data") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.026961 4707 scope.go:117] "RemoveContainer" containerID="9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.032351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d\": container with ID starting with 9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d not found: ID does not exist" containerID="9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.032410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d"} err="failed to get container status \"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d\": rpc error: code = NotFound desc = could not find container \"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d\": container with ID starting with 9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.032438 4707 scope.go:117] "RemoveContainer" containerID="c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.033044 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d\": container with ID starting with c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d not found: ID does not exist" containerID="c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.033087 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d"} err="failed to get container status \"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d\": rpc error: code = NotFound desc = could not find container \"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d\": container with ID starting with c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.033112 4707 scope.go:117] "RemoveContainer" containerID="9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.033419 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d"} err="failed to get container status \"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d\": rpc error: code = NotFound desc = could not find container \"9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d\": container with ID starting with 9cef1d27dd63f32f5d0e9fd0e716115bfa1ddaf408bda63b71cf69d142afcc5d not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.033441 4707 scope.go:117] "RemoveContainer" containerID="c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.034164 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.034194 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.034204 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.034215 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.034222 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d"} err="failed to get container status \"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d\": rpc error: code = NotFound desc = could not find container \"c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d\": container with ID starting with c8f12639a9f1fc415860670a73db06ad199d5a8dd200464a42e7c9706236630d not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.034241 4707 scope.go:117] "RemoveContainer" containerID="c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.063152 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.068022 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.076063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728a3264-18ce-4d47-86f5-868f00cc4558" (UID: "728a3264-18ce-4d47-86f5-868f00cc4558"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.079933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.080457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.098372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "728a3264-18ce-4d47-86f5-868f00cc4558" (UID: "728a3264-18ce-4d47-86f5-868f00cc4558"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.100618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "728a3264-18ce-4d47-86f5-868f00cc4558" (UID: "728a3264-18ce-4d47-86f5-868f00cc4558"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.119148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.128455 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.133282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-config-data" (OuterVolumeSpecName: "config-data") pod "388a5540-d98c-4db0-bb1a-5a387ff8f7f3" (UID: "388a5540-d98c-4db0-bb1a-5a387ff8f7f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137884 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137911 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137921 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137931 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a5540-d98c-4db0-bb1a-5a387ff8f7f3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137939 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137947 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137959 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.137969 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a3264-18ce-4d47-86f5-868f00cc4558-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.158905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "28a92eee-d8bb-46d5-bacb-e641b5be138f" (UID: "28a92eee-d8bb-46d5-bacb-e641b5be138f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.178574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.201298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.211597 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087bdc72-3300-4e75-9b60-72380c090c95" path="/var/lib/kubelet/pods/087bdc72-3300-4e75-9b60-72380c090c95/volumes" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.212290 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ca704f-a732-47ec-a479-dabdeea5b864" path="/var/lib/kubelet/pods/84ca704f-a732-47ec-a479-dabdeea5b864/volumes" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.212965 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1cb8df6-9987-4102-9599-2a0a39252432" path="/var/lib/kubelet/pods/a1cb8df6-9987-4102-9599-2a0a39252432/volumes" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.213995 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2613b63-6d79-4d28-9be6-ca7a8431c724" path="/var/lib/kubelet/pods/d2613b63-6d79-4d28-9be6-ca7a8431c724/volumes" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.214500 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57ace7a-26b6-433e-99aa-891427aaf04c" path="/var/lib/kubelet/pods/e57ace7a-26b6-433e-99aa-891427aaf04c/volumes" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.242546 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a92eee-d8bb-46d5-bacb-e641b5be138f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.242571 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.242581 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.248419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-config-data" (OuterVolumeSpecName: "config-data") pod "e7b163de-7daf-4893-abcd-9fefd3850234" (UID: "e7b163de-7daf-4893-abcd-9fefd3850234"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.319406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.357460 4707 scope.go:117] "RemoveContainer" containerID="c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.359717 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b163de-7daf-4893-abcd-9fefd3850234-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.359822 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8\": container with ID starting with c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8 not found: ID does not exist" containerID="c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.359851 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8"} err="failed to get container status \"c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8\": rpc error: code = NotFound desc = could not find container \"c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8\": container with ID starting with c543db71f776143b55a91e5478ca290ec521f5fd22fb595a6fa11cc61b6a92a8 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.359874 4707 scope.go:117] "RemoveContainer" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.447036 4707 scope.go:117] "RemoveContainer" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.449302 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.492583 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.507595 4707 scope.go:117] "RemoveContainer" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.518168 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.522669 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.534289 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.541883 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.552777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.552838 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.553144 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.553170 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.553385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.553398 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.554761 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.554788 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.554881 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.559380 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.559834 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.560187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.560842 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.568227 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.571912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4427\" (UniqueName: \"kubernetes.io/projected/12052361-f984-4545-a1e1-9a8b80e2cceb-kube-api-access-r4427\") pod \"12052361-f984-4545-a1e1-9a8b80e2cceb\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.572049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-config-data\") pod \"12052361-f984-4545-a1e1-9a8b80e2cceb\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.572122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-combined-ca-bundle\") pod \"12052361-f984-4545-a1e1-9a8b80e2cceb\" (UID: \"12052361-f984-4545-a1e1-9a8b80e2cceb\") " Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.572641 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.583949 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.583995 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.588515 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12052361-f984-4545-a1e1-9a8b80e2cceb-kube-api-access-r4427" (OuterVolumeSpecName: "kube-api-access-r4427") pod "12052361-f984-4545-a1e1-9a8b80e2cceb" (UID: "12052361-f984-4545-a1e1-9a8b80e2cceb"). InnerVolumeSpecName "kube-api-access-r4427". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.589121 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.590574 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-265m4" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.590829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.594003 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.602422 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.613066 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.616657 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.623253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.623744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.624110 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.626748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12052361-f984-4545-a1e1-9a8b80e2cceb" (UID: "12052361-f984-4545-a1e1-9a8b80e2cceb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.652085 4707 scope.go:117] "RemoveContainer" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.654215 4707 generic.go:334] "Generic (PLEG): container finished" podID="12052361-f984-4545-a1e1-9a8b80e2cceb" containerID="66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21" exitCode=1 Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.654335 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.655121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"12052361-f984-4545-a1e1-9a8b80e2cceb","Type":"ContainerDied","Data":"66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.655152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"12052361-f984-4545-a1e1-9a8b80e2cceb","Type":"ContainerDied","Data":"901f8ccae682b151d1499e809475a50dfa0f4837187a71b68b19da373e3405f8"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.663450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-config-data" (OuterVolumeSpecName: "config-data") pod "12052361-f984-4545-a1e1-9a8b80e2cceb" (UID: "12052361-f984-4545-a1e1-9a8b80e2cceb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.666096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"01ec6ded-9697-471a-80d6-9ed5fdd9918e","Type":"ContainerStarted","Data":"43d29cce317d9140ea844e0fa8a01802f3581985166c5b4847f73a0d0cd4079e"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdw4\" (UniqueName: \"kubernetes.io/projected/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-kube-api-access-wkdw4\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-log-httpd\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-config-data\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 
15:54:21.688547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-scripts\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-run-httpd\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.688887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc 
kubenswrapper[4707]: I0121 15:54:21.688903 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qj44\" (UniqueName: \"kubernetes.io/projected/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-kube-api-access-2qj44\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.689009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.689080 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4427\" (UniqueName: \"kubernetes.io/projected/12052361-f984-4545-a1e1-9a8b80e2cceb-kube-api-access-r4427\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.689091 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.689101 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12052361-f984-4545-a1e1-9a8b80e2cceb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.703428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3e6e1d-19d1-4310-907f-a252e900b54a","Type":"ContainerStarted","Data":"277a414f7fc85d83b0d0b7fec77b265156ca3796bce840a3ef15e4cd3698ba0f"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.703633 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.710964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" event={"ID":"c197e98a-7956-4c34-a31a-911a28cf4a1a","Type":"ContainerStarted","Data":"c50519e027e1683f85b799afca4576a3862101aad86a8a98197856394645b0ed"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.712376 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.734740 4707 scope.go:117] "RemoveContainer" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.737509 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": container with ID starting with df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f not found: ID does not exist" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.737533 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f"} err="failed to get container status 
\"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": rpc error: code = NotFound desc = could not find container \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": container with ID starting with df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.737550 4707 scope.go:117] "RemoveContainer" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.741859 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": container with ID starting with 64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157 not found: ID does not exist" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.741895 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157"} err="failed to get container status \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": rpc error: code = NotFound desc = could not find container \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": container with ID starting with 64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.741916 4707 scope.go:117] "RemoveContainer" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.744594 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": container with ID starting with 99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8 not found: ID does not exist" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.744646 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8"} err="failed to get container status \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": rpc error: code = NotFound desc = could not find container \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": container with ID starting with 99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.744663 4707 scope.go:117] "RemoveContainer" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" Jan 21 15:54:21 crc kubenswrapper[4707]: E0121 15:54:21.745310 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": container with ID starting with 804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381 not found: ID does not exist" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.745395 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381"} err="failed to get container status \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": rpc error: code = NotFound desc = could not find container \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": container with ID starting with 804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.745409 4707 scope.go:117] "RemoveContainer" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.746335 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f"} err="failed to get container status \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": rpc error: code = NotFound desc = could not find container \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": container with ID starting with df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.746353 4707 scope.go:117] "RemoveContainer" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.746584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" event={"ID":"16882f47-6637-4e33-82a3-17da2f553dda","Type":"ContainerStarted","Data":"d15803325d6a389441fe6a55dbaa6c70c8ec4a8581d8dc66243c60e1d055805a"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.746670 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.746689 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157"} err="failed to get container status \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": rpc error: code = NotFound desc = could not find container \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": container with ID starting with 64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.746786 4707 scope.go:117] "RemoveContainer" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.747553 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8"} err="failed to get container status \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": rpc error: code = NotFound desc = could not find container \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": container with ID starting with 99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.747657 4707 scope.go:117] "RemoveContainer" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 
15:54:21.755962 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381"} err="failed to get container status \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": rpc error: code = NotFound desc = could not find container \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": container with ID starting with 804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.756013 4707 scope.go:117] "RemoveContainer" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.763801 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f"} err="failed to get container status \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": rpc error: code = NotFound desc = could not find container \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": container with ID starting with df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.763847 4707 scope.go:117] "RemoveContainer" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.764414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"206e3d62-bdaa-4326-8c66-a615e81200d8","Type":"ContainerStarted","Data":"7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0"} Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.764439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.767137 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157"} err="failed to get container status \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": rpc error: code = NotFound desc = could not find container \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": container with ID starting with 64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.767385 4707 scope.go:117] "RemoveContainer" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.767737 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8"} err="failed to get container status \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": rpc error: code = NotFound desc = could not find container \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": container with ID starting with 99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.767760 4707 scope.go:117] "RemoveContainer" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" Jan 21 15:54:21 crc 
kubenswrapper[4707]: I0121 15:54:21.768033 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381"} err="failed to get container status \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": rpc error: code = NotFound desc = could not find container \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": container with ID starting with 804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.768062 4707 scope.go:117] "RemoveContainer" containerID="df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.769015 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f"} err="failed to get container status \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": rpc error: code = NotFound desc = could not find container \"df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f\": container with ID starting with df883f17268067ff066c976876370669173c4e40af6052f5c30755fd7bc4d12f not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.769036 4707 scope.go:117] "RemoveContainer" containerID="64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.769608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.773723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157"} err="failed to get container status \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": rpc error: code = NotFound desc = could not find container \"64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157\": container with ID starting with 64eefcb9f75883c57f3cb11147f79e559705c04bfe1ad875767dba23d1f13157 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.773745 4707 scope.go:117] "RemoveContainer" containerID="99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.774129 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8"} err="failed to get container status \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": rpc error: code = NotFound desc = could not find container \"99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8\": container with ID starting with 99b4e87682ebc678b5b180809c36cf018977c8f533f2ba7be6bf90089c2f78f8 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.774148 4707 scope.go:117] "RemoveContainer" containerID="804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.776705 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381"} err="failed to get container status 
\"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": rpc error: code = NotFound desc = could not find container \"804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381\": container with ID starting with 804f7badc3c4d860a112b340a2640d9dba86970fb734b465e64562da59858381 not found: ID does not exist" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.776724 4707 scope.go:117] "RemoveContainer" containerID="3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.791895 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.79187069 podStartE2EDuration="3.79187069s" podCreationTimestamp="2026-01-21 15:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:21.722426316 +0000 UTC m=+3158.903942548" watchObservedRunningTime="2026-01-21 15:54:21.79187069 +0000 UTC m=+3158.973386912" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.796668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.796763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.796802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.798772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.798858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.798912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qj44\" (UniqueName: \"kubernetes.io/projected/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-kube-api-access-2qj44\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.798971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.798989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-config-data\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-logs\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdw4\" (UniqueName: \"kubernetes.io/projected/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-kube-api-access-wkdw4\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-log-httpd\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-config-data\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-scripts\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.799597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.800029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.800218 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.800660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.800785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-log-httpd\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.801245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.801707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.801745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.801787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-run-httpd\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.801833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.802070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-run-httpd\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.802831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwgd\" (UniqueName: \"kubernetes.io/projected/7160b3be-b878-41e6-b4ad-530f998a5185-kube-api-access-sgwgd\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.806084 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" podStartSLOduration=5.805955445 podStartE2EDuration="5.805955445s" podCreationTimestamp="2026-01-21 15:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:21.73796254 +0000 UTC m=+3158.919478762" watchObservedRunningTime="2026-01-21 15:54:21.805955445 +0000 UTC m=+3158.987471667" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.815441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.815725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.819381 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.820165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.820832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkdw4\" (UniqueName: \"kubernetes.io/projected/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-kube-api-access-wkdw4\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.821179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.821476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-scripts\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.824860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.825489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.825877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-config-data\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.826570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qj44\" (UniqueName: \"kubernetes.io/projected/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-kube-api-access-2qj44\") pod \"ceilometer-0\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.863066 4707 scope.go:117] "RemoveContainer" containerID="d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.864987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.872714 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.897171 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" podStartSLOduration=5.89715081 podStartE2EDuration="5.89715081s" podCreationTimestamp="2026-01-21 15:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:21.756172373 +0000 UTC m=+3158.937688595" watchObservedRunningTime="2026-01-21 15:54:21.89715081 +0000 UTC m=+3159.078667022" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.905611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.906288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.906333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-config-data\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.907163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-logs\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.907670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.907698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.907883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-scripts\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.907959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.908095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwgd\" (UniqueName: \"kubernetes.io/projected/7160b3be-b878-41e6-b4ad-530f998a5185-kube-api-access-sgwgd\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.915595 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.943439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.967948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-config-data\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.982535 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=6.9825131769999995 podStartE2EDuration="6.982513177s" podCreationTimestamp="2026-01-21 15:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:21.778553569 +0000 UTC m=+3158.960069780" watchObservedRunningTime="2026-01-21 15:54:21.982513177 +0000 UTC m=+3159.164029398" Jan 21 15:54:21 crc kubenswrapper[4707]: I0121 15:54:21.991077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-logs\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:21.996289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-scripts\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:21.996312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 
15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.007319 4707 scope.go:117] "RemoveContainer" containerID="3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.008483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.011416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwgd\" (UniqueName: \"kubernetes.io/projected/7160b3be-b878-41e6-b4ad-530f998a5185-kube-api-access-sgwgd\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: E0121 15:54:22.021759 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491\": container with ID starting with 3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491 not found: ID does not exist" containerID="3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.021850 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491"} err="failed to get container status \"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491\": rpc error: code = NotFound desc = could not find container \"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491\": container with ID starting with 3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491 not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.021880 4707 scope.go:117] "RemoveContainer" containerID="d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be" Jan 21 15:54:22 crc kubenswrapper[4707]: E0121 15:54:22.022551 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be\": container with ID starting with d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be not found: ID does not exist" containerID="d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.022755 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be"} err="failed to get container status \"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be\": rpc error: code = NotFound desc = could not find container \"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be\": container with ID starting with d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.022767 4707 scope.go:117] "RemoveContainer" containerID="3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.033967 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491"} err="failed to get container status \"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491\": rpc error: code = NotFound desc = could not find container \"3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491\": container with ID starting with 3380486836c070144a804bff87b44963facc35ebe14ad1f0f1a2e2c992ca7491 not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.033996 4707 scope.go:117] "RemoveContainer" containerID="d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.037901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.044276 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be"} err="failed to get container status \"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be\": rpc error: code = NotFound desc = could not find container \"d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be\": container with ID starting with d4ab1b358e25fa34c3df0c65c936a36bd89335739f9be276ed20eaf59cdfb7be not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.044310 4707 scope.go:117] "RemoveContainer" containerID="b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.172072 4707 scope.go:117] "RemoveContainer" containerID="869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.186895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.198845 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.204929 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.206461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.209190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.209739 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.217048 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.248100 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.288252 4707 scope.go:117] "RemoveContainer" containerID="b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.290901 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: E0121 15:54:22.293637 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a\": container with ID starting with b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a not found: ID does not exist" containerID="b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.293682 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a"} err="failed to get container status \"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a\": rpc error: code = NotFound desc = could not find container \"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a\": container with ID starting with b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.293708 4707 scope.go:117] "RemoveContainer" containerID="869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358" Jan 21 15:54:22 crc kubenswrapper[4707]: E0121 15:54:22.294834 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358\": container with ID starting with 869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358 not found: ID does not exist" containerID="869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.294862 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358"} err="failed to get container status \"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358\": rpc error: code = NotFound desc = could not find container \"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358\": container with ID starting with 869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358 not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.294878 4707 scope.go:117] "RemoveContainer" containerID="b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.295118 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a"} err="failed to get container status \"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a\": rpc error: code = NotFound desc = could not find container \"b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a\": container with ID starting with b5ce808c5029889e4342c44107e9b83365e3db69679970f8705b0d1393ce2a1a not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.295148 
4707 scope.go:117] "RemoveContainer" containerID="869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.295366 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358"} err="failed to get container status \"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358\": rpc error: code = NotFound desc = could not find container \"869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358\": container with ID starting with 869cbe60a50daab9e7aa20c8a7f73e31536c1cf65c6bb40d3b6c1d4fe0e36358 not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.295384 4707 scope.go:117] "RemoveContainer" containerID="16c71ef64e21aa9d7aef3046c8569e917e4a9710d257ebb19afc5d621cdd72ff" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.297433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.319371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.319590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.320494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7c4q\" (UniqueName: \"kubernetes.io/projected/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-kube-api-access-k7c4q\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.405112 4707 scope.go:117] "RemoveContainer" containerID="66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.425095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.426639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7c4q\" (UniqueName: \"kubernetes.io/projected/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-kube-api-access-k7c4q\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.426948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.429478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.432251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.441434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7c4q\" (UniqueName: \"kubernetes.io/projected/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-kube-api-access-k7c4q\") pod \"nova-cell1-conductor-0\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.467556 4707 scope.go:117] "RemoveContainer" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.486653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.534313 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: W0121 15:54:22.553098 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3b19b6e_8bbc_4e8e_8a77_57b816be96a3.slice/crio-3d0a8a365a91eaa96330e21170c042f8876741f32549832030901d2056fe375d WatchSource:0}: Error finding container 3d0a8a365a91eaa96330e21170c042f8876741f32549832030901d2056fe375d: Status 404 returned error can't find the container with id 3d0a8a365a91eaa96330e21170c042f8876741f32549832030901d2056fe375d Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.557435 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.618795 4707 scope.go:117] "RemoveContainer" containerID="66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21" Jan 21 15:54:22 crc kubenswrapper[4707]: E0121 15:54:22.619455 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21\": container with ID starting with 66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21 not found: ID does not exist" containerID="66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.619533 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21"} err="failed to get container status \"66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21\": rpc error: code = NotFound desc = could not find container \"66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21\": container with ID starting with 66724ecae6276653d5c5aaaa63be53bb68a6cfa7fde57b19374517cd476c4b21 not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.619582 4707 scope.go:117] "RemoveContainer" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" Jan 21 15:54:22 crc kubenswrapper[4707]: E0121 15:54:22.620100 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622\": container with ID starting with 4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622 not found: ID does not exist" containerID="4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.620152 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622"} err="failed to get container status \"4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622\": rpc error: code = NotFound desc = could not find container \"4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622\": container with ID starting with 4ff5e7d12b546bb4e7cab607eb6588762cb0c3fcb5d6b10152a0827b690d7622 not found: ID does not exist" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.652605 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.735545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data\") pod \"25ea2916-144f-4962-92ac-8d9de133ea36\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.735665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data-custom\") pod \"25ea2916-144f-4962-92ac-8d9de133ea36\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.735893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b885\" (UniqueName: \"kubernetes.io/projected/25ea2916-144f-4962-92ac-8d9de133ea36-kube-api-access-5b885\") pod \"25ea2916-144f-4962-92ac-8d9de133ea36\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.735992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-combined-ca-bundle\") pod \"25ea2916-144f-4962-92ac-8d9de133ea36\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.736232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ea2916-144f-4962-92ac-8d9de133ea36-logs\") pod \"25ea2916-144f-4962-92ac-8d9de133ea36\" (UID: \"25ea2916-144f-4962-92ac-8d9de133ea36\") " Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.740486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ea2916-144f-4962-92ac-8d9de133ea36-logs" (OuterVolumeSpecName: "logs") pod "25ea2916-144f-4962-92ac-8d9de133ea36" (UID: "25ea2916-144f-4962-92ac-8d9de133ea36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.741188 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ea2916-144f-4962-92ac-8d9de133ea36-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.749122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25ea2916-144f-4962-92ac-8d9de133ea36" (UID: "25ea2916-144f-4962-92ac-8d9de133ea36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.749213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ea2916-144f-4962-92ac-8d9de133ea36-kube-api-access-5b885" (OuterVolumeSpecName: "kube-api-access-5b885") pod "25ea2916-144f-4962-92ac-8d9de133ea36" (UID: "25ea2916-144f-4962-92ac-8d9de133ea36"). InnerVolumeSpecName "kube-api-access-5b885". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.778638 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25ea2916-144f-4962-92ac-8d9de133ea36" (UID: "25ea2916-144f-4962-92ac-8d9de133ea36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.791617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerStarted","Data":"3d0a8a365a91eaa96330e21170c042f8876741f32549832030901d2056fe375d"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.806966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data" (OuterVolumeSpecName: "config-data") pod "25ea2916-144f-4962-92ac-8d9de133ea36" (UID: "25ea2916-144f-4962-92ac-8d9de133ea36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.814352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"68a74a76-0bd8-48e1-87c9-778b381f3fa0","Type":"ContainerStarted","Data":"664218f2c763fb08ecde9d7d311a3e55c24f62c7de90c8590719e316a1fa1de6"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.814385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"68a74a76-0bd8-48e1-87c9-778b381f3fa0","Type":"ContainerStarted","Data":"4a36a4cbd1862f2d84340378f60d7801d4f238035b05f3aa1e05e062a795ec01"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.814399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"68a74a76-0bd8-48e1-87c9-778b381f3fa0","Type":"ContainerStarted","Data":"577e1b1c17048e1569555f5db1237b142558c0abb88f4dc5820f89ce52879eb4"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.827295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"206e3d62-bdaa-4326-8c66-a615e81200d8","Type":"ContainerStarted","Data":"23f4a32980a3cc2291bacd76135da995267a0c14d4eb6427e7b2324696c9dd64"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.827417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.837918 4707 generic.go:334] "Generic (PLEG): container finished" podID="25ea2916-144f-4962-92ac-8d9de133ea36" containerID="f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac" exitCode=0 Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.837975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" event={"ID":"25ea2916-144f-4962-92ac-8d9de133ea36","Type":"ContainerDied","Data":"f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.837996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" 
event={"ID":"25ea2916-144f-4962-92ac-8d9de133ea36","Type":"ContainerDied","Data":"fb61cad2da4593b08840a1613b011375b83e97eea9b88325f2ae612da95d5502"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.838004 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.838012 4707 scope.go:117] "RemoveContainer" containerID="f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.845747 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.845762 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.845773 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b885\" (UniqueName: \"kubernetes.io/projected/25ea2916-144f-4962-92ac-8d9de133ea36-kube-api-access-5b885\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.845784 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea2916-144f-4962-92ac-8d9de133ea36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.852638 4707 generic.go:334] "Generic (PLEG): container finished" podID="d70efa8f-44f9-48d8-8beb-531c69225631" containerID="3e29351ea171b68ab5f386f24df504ebd7cdf39483d33555a2145f0a35513183" exitCode=0 Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.852715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d70efa8f-44f9-48d8-8beb-531c69225631","Type":"ContainerDied","Data":"3e29351ea171b68ab5f386f24df504ebd7cdf39483d33555a2145f0a35513183"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.873150 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.873472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7e21981d-0581-4e7c-9485-cfea4bffaa9d","Type":"ContainerStarted","Data":"0b751042b32585b77a3d7dd8604710c56f396c4378ab45c5f9a48fedf42e7d83"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.881112 4707 scope.go:117] "RemoveContainer" containerID="eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.883179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"658de89d-042f-452c-a654-586154c683a4","Type":"ContainerStarted","Data":"9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.883304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"658de89d-042f-452c-a654-586154c683a4","Type":"ContainerStarted","Data":"49cfe85510907338c092c956a9f9c82bb24b988f5082366b31d67ee24a07eab5"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.884963 
4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.884946742 podStartE2EDuration="2.884946742s" podCreationTimestamp="2026-01-21 15:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:22.829511451 +0000 UTC m=+3160.011027673" watchObservedRunningTime="2026-01-21 15:54:22.884946742 +0000 UTC m=+3160.066462964" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.886485 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.892841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3e6e1d-19d1-4310-907f-a252e900b54a","Type":"ContainerStarted","Data":"28b00c21715fd067ad0883206899e195822f24dc4636250897a66b7215c6e997"} Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.908374 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.914376 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c5cd8f8d-9qdxd"] Jan 21 15:54:22 crc kubenswrapper[4707]: I0121 15:54:22.930218 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.9301967209999997 podStartE2EDuration="2.930196721s" podCreationTimestamp="2026-01-21 15:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:22.918899678 +0000 UTC m=+3160.100415900" watchObservedRunningTime="2026-01-21 15:54:22.930196721 +0000 UTC m=+3160.111712943" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:22.997974 4707 scope.go:117] "RemoveContainer" containerID="f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac" Jan 21 15:54:23 crc kubenswrapper[4707]: E0121 15:54:23.003002 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac\": container with ID starting with f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac not found: ID does not exist" containerID="f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.003033 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac"} err="failed to get container status \"f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac\": rpc error: code = NotFound desc = could not find container \"f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac\": container with ID starting with f7bddf98d53ddaeaefee598da090f2cbfc10aba67ef80eba4f79386f69ad24ac not found: ID does not exist" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.003055 4707 scope.go:117] "RemoveContainer" containerID="eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67" Jan 21 15:54:23 crc kubenswrapper[4707]: E0121 15:54:23.010187 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67\": container with ID starting with eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67 not found: ID does not exist" containerID="eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.010265 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67"} err="failed to get container status \"eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67\": rpc error: code = NotFound desc = could not find container \"eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67\": container with ID starting with eadcbcc5366cc8556e646c41515b1415dc4e5da5f5730a9cd87f690587f71f67 not found: ID does not exist" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.130579 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:23 crc kubenswrapper[4707]: W0121 15:54:23.157671 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb3faf0a_bc0f_4ee1_a443_a05eff6d30ce.slice/crio-c19fb5832a4a02b512a1442f31b4a80dff24397da2f0b5e2a11fb314c3a4a2e7 WatchSource:0}: Error finding container c19fb5832a4a02b512a1442f31b4a80dff24397da2f0b5e2a11fb314c3a4a2e7: Status 404 returned error can't find the container with id c19fb5832a4a02b512a1442f31b4a80dff24397da2f0b5e2a11fb314c3a4a2e7 Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.194649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12052361-f984-4545-a1e1-9a8b80e2cceb" path="/var/lib/kubelet/pods/12052361-f984-4545-a1e1-9a8b80e2cceb/volumes" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.195907 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" path="/var/lib/kubelet/pods/25ea2916-144f-4962-92ac-8d9de133ea36/volumes" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.197143 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a92eee-d8bb-46d5-bacb-e641b5be138f" path="/var/lib/kubelet/pods/28a92eee-d8bb-46d5-bacb-e641b5be138f/volumes" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.199729 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388a5540-d98c-4db0-bb1a-5a387ff8f7f3" path="/var/lib/kubelet/pods/388a5540-d98c-4db0-bb1a-5a387ff8f7f3/volumes" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.200712 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b163de-7daf-4893-abcd-9fefd3850234" path="/var/lib/kubelet/pods/e7b163de-7daf-4893-abcd-9fefd3850234/volumes" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.556966 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.557395 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" probeResult="failure" output="Get 
\"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.916529 4707 generic.go:334] "Generic (PLEG): container finished" podID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerID="43d29cce317d9140ea844e0fa8a01802f3581985166c5b4847f73a0d0cd4079e" exitCode=0 Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.916586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"01ec6ded-9697-471a-80d6-9ed5fdd9918e","Type":"ContainerDied","Data":"43d29cce317d9140ea844e0fa8a01802f3581985166c5b4847f73a0d0cd4079e"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.925945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d3e6822-57a9-4dbe-a3c2-9fb531886f22","Type":"ContainerStarted","Data":"a448697007e558fe58e7db109d533620b53bc7ebefe7dfe620af036f04aec9a2"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.925982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d3e6822-57a9-4dbe-a3c2-9fb531886f22","Type":"ContainerStarted","Data":"c0b957e00447a6f2a25b953c88a6d91577001fa23f450181dc353f51189ad916"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.928181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d70efa8f-44f9-48d8-8beb-531c69225631","Type":"ContainerStarted","Data":"c0a16b0c06ec56b53325b0a694e257afb1a1c30b41915f7933e77e12c66df71b"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.930387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7e21981d-0581-4e7c-9485-cfea4bffaa9d","Type":"ContainerStarted","Data":"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.942672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce","Type":"ContainerStarted","Data":"16ee6b178cb41968a2fe3874c57b52c48c07f4367ddef704d168289078b6a88e"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.942904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce","Type":"ContainerStarted","Data":"c19fb5832a4a02b512a1442f31b4a80dff24397da2f0b5e2a11fb314c3a4a2e7"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.944588 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.947354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerStarted","Data":"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.951542 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=8.951530006 podStartE2EDuration="8.951530006s" podCreationTimestamp="2026-01-21 15:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
15:54:23.948421863 +0000 UTC m=+3161.129938105" watchObservedRunningTime="2026-01-21 15:54:23.951530006 +0000 UTC m=+3161.133046229" Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.968500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7160b3be-b878-41e6-b4ad-530f998a5185","Type":"ContainerStarted","Data":"458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.968543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7160b3be-b878-41e6-b4ad-530f998a5185","Type":"ContainerStarted","Data":"a855ca4407ab9294492df487922bd4b6cbad2e91db4bc1ca85890e5f5e6782c5"} Jan 21 15:54:23 crc kubenswrapper[4707]: I0121 15:54:23.979683 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.979665884 podStartE2EDuration="1.979665884s" podCreationTimestamp="2026-01-21 15:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:23.963123337 +0000 UTC m=+3161.144639559" watchObservedRunningTime="2026-01-21 15:54:23.979665884 +0000 UTC m=+3161.161182106" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.048944 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-548968857b-s96qt"] Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.049153 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.049361 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" containerID="cri-o://99af3d75a96b9e45347ec6f501959f5aaaf4cc00a9200522051fb91dc082665d" gracePeriod=30 Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.049852 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" containerID="cri-o://a4d345f4aef43876ec0d8f9747eadb254c4cfb035eb0ac23f9808a04b5c70dd0" gracePeriod=30 Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.061031 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": EOF" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.074663 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": EOF" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.074683 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": EOF" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.074848 4707 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": EOF" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.082729 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2"] Jan 21 15:54:24 crc kubenswrapper[4707]: E0121 15:54:24.083572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.083641 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener" Jan 21 15:54:24 crc kubenswrapper[4707]: E0121 15:54:24.083697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener-log" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.083706 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener-log" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.084084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.084117 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ea2916-144f-4962-92ac-8d9de133ea36" containerName="barbican-keystone-listener-log" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.110716 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.120124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2"] Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.319630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.319932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-internal-tls-certs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.320022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njgd9\" (UniqueName: \"kubernetes.io/projected/29a8677a-d3a5-44da-9e5f-01db306c2e32-kube-api-access-njgd9\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.320092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-combined-ca-bundle\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.320176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data-custom\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.320216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-public-tls-certs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.320288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a8677a-d3a5-44da-9e5f-01db306c2e32-logs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data-custom\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: 
\"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-public-tls-certs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a8677a-d3a5-44da-9e5f-01db306c2e32-logs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-internal-tls-certs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njgd9\" (UniqueName: \"kubernetes.io/projected/29a8677a-d3a5-44da-9e5f-01db306c2e32-kube-api-access-njgd9\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.422629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-combined-ca-bundle\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.423366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a8677a-d3a5-44da-9e5f-01db306c2e32-logs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.428448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data-custom\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.428581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data\") pod 
\"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.429026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-internal-tls-certs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.430121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-combined-ca-bundle\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.433196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-public-tls-certs\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.439360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njgd9\" (UniqueName: \"kubernetes.io/projected/29a8677a-d3a5-44da-9e5f-01db306c2e32-kube-api-access-njgd9\") pod \"barbican-api-546cc5fb76-t8kj2\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.441649 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.735671 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2"] Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.759675 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw"] Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.763733 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.772922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw"] Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.874639 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2"] Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmf79\" (UniqueName: \"kubernetes.io/projected/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-kube-api-access-hmf79\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-internal-tls-certs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-public-tls-certs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data-custom\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-logs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.930863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-combined-ca-bundle\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.977969 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" 
containerID="99af3d75a96b9e45347ec6f501959f5aaaf4cc00a9200522051fb91dc082665d" exitCode=143 Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.978040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" event={"ID":"fb142568-b2b7-467a-9b33-fc6c3bf725e3","Type":"ContainerDied","Data":"99af3d75a96b9e45347ec6f501959f5aaaf4cc00a9200522051fb91dc082665d"} Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.979533 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerID="28b00c21715fd067ad0883206899e195822f24dc4636250897a66b7215c6e997" exitCode=1 Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.979575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3e6e1d-19d1-4310-907f-a252e900b54a","Type":"ContainerDied","Data":"28b00c21715fd067ad0883206899e195822f24dc4636250897a66b7215c6e997"} Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.980185 4707 scope.go:117] "RemoveContainer" containerID="28b00c21715fd067ad0883206899e195822f24dc4636250897a66b7215c6e997" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.982741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7e21981d-0581-4e7c-9485-cfea4bffaa9d","Type":"ContainerStarted","Data":"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213"} Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.982970 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.984976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" event={"ID":"29a8677a-d3a5-44da-9e5f-01db306c2e32","Type":"ContainerStarted","Data":"426e33ded1bd3ec3264184febb07a59272b9a83363e3390457318d1318e3425f"} Jan 21 15:54:24 crc kubenswrapper[4707]: I0121 15:54:24.999289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerStarted","Data":"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666"} Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.001182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7160b3be-b878-41e6-b4ad-530f998a5185","Type":"ContainerStarted","Data":"721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d"} Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.012067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"01ec6ded-9697-471a-80d6-9ed5fdd9918e","Type":"ContainerStarted","Data":"84534c9d37714a2ecd0730e455dcc3dc74f38b7ed3f5fbd32c9d21d6b444b1c6"} Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.017075 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.017051346 podStartE2EDuration="5.017051346s" podCreationTimestamp="2026-01-21 15:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:25.011217787 +0000 UTC m=+3162.192734008" watchObservedRunningTime="2026-01-21 15:54:25.017051346 +0000 UTC m=+3162.198567568" Jan 21 15:54:25 crc kubenswrapper[4707]: 
I0121 15:54:25.017635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d3e6822-57a9-4dbe-a3c2-9fb531886f22","Type":"ContainerStarted","Data":"a9b30dfab469cfd98d55d611bfc65839a6329f4c91af56c02e35444e7c210451"} Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.032782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmf79\" (UniqueName: \"kubernetes.io/projected/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-kube-api-access-hmf79\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.032979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-internal-tls-certs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.033059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-public-tls-certs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.033140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data-custom\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.033226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.033317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-logs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.033395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-combined-ca-bundle\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.035571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-logs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.038014 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data-custom\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.040381 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.040358292 podStartE2EDuration="4.040358292s" podCreationTimestamp="2026-01-21 15:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:25.030418191 +0000 UTC m=+3162.211934413" watchObservedRunningTime="2026-01-21 15:54:25.040358292 +0000 UTC m=+3162.221874515" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.042484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-combined-ca-bundle\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.042859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-public-tls-certs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.047450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-internal-tls-certs\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.047611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.055943 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.055925354 podStartE2EDuration="4.055925354s" podCreationTimestamp="2026-01-21 15:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:25.054750635 +0000 UTC m=+3162.236266858" watchObservedRunningTime="2026-01-21 15:54:25.055925354 +0000 UTC m=+3162.237441576" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.069471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmf79\" (UniqueName: \"kubernetes.io/projected/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-kube-api-access-hmf79\") pod \"barbican-api-cb46d4bdd-s4nhw\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.083151 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.505449 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=8.505432022 podStartE2EDuration="8.505432022s" podCreationTimestamp="2026-01-21 15:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:25.083030614 +0000 UTC m=+3162.264546836" watchObservedRunningTime="2026-01-21 15:54:25.505432022 +0000 UTC m=+3162.686948245" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.512651 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw"] Jan 21 15:54:25 crc kubenswrapper[4707]: W0121 15:54:25.515345 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f27e5ec_c6d9_4b5d_bc14_1818306024f3.slice/crio-a656141261657f0ce20563f4eb8868810a7423f5175ce99e222aaa4c2582dab8 WatchSource:0}: Error finding container a656141261657f0ce20563f4eb8868810a7423f5175ce99e222aaa4c2582dab8: Status 404 returned error can't find the container with id a656141261657f0ce20563f4eb8868810a7423f5175ce99e222aaa4c2582dab8 Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.695412 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.132:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.956433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:25 crc kubenswrapper[4707]: I0121 15:54:25.956765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.025990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3e6e1d-19d1-4310-907f-a252e900b54a","Type":"ContainerStarted","Data":"d595cfcd7f15bd2970f5ab9f24d5e2ca4feaa5a5ba6498c06a7eec79734797d8"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.026992 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.030234 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerID="16ee6b178cb41968a2fe3874c57b52c48c07f4367ddef704d168289078b6a88e" exitCode=1 Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.030295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce","Type":"ContainerDied","Data":"16ee6b178cb41968a2fe3874c57b52c48c07f4367ddef704d168289078b6a88e"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.031038 4707 scope.go:117] "RemoveContainer" containerID="16ee6b178cb41968a2fe3874c57b52c48c07f4367ddef704d168289078b6a88e" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.033347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" event={"ID":"29a8677a-d3a5-44da-9e5f-01db306c2e32","Type":"ContainerStarted","Data":"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.033429 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api-log" containerID="cri-o://b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce" gracePeriod=30 Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.033482 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api" containerID="cri-o://8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68" gracePeriod=30 Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.034649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" event={"ID":"29a8677a-d3a5-44da-9e5f-01db306c2e32","Type":"ContainerStarted","Data":"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.034684 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.034697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.036052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerStarted","Data":"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.052111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" event={"ID":"8f27e5ec-c6d9-4b5d-bc14-1818306024f3","Type":"ContainerStarted","Data":"958b22b32e9e9e12c4b19bb7c2cc210c9881143654900671a1871c84040807a4"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.052143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" event={"ID":"8f27e5ec-c6d9-4b5d-bc14-1818306024f3","Type":"ContainerStarted","Data":"d7d4a778b9df39e14feb7740f3a7817dd4ceb96bd7dc4108a8e2e99f350903d3"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.052157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" event={"ID":"8f27e5ec-c6d9-4b5d-bc14-1818306024f3","Type":"ContainerStarted","Data":"a656141261657f0ce20563f4eb8868810a7423f5175ce99e222aaa4c2582dab8"} Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.052757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.052842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.089915 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" 
podStartSLOduration=2.089899812 podStartE2EDuration="2.089899812s" podCreationTimestamp="2026-01-21 15:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:26.084168255 +0000 UTC m=+3163.265684466" watchObservedRunningTime="2026-01-21 15:54:26.089899812 +0000 UTC m=+3163.271416035" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.109349 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" podStartSLOduration=2.109329488 podStartE2EDuration="2.109329488s" podCreationTimestamp="2026-01-21 15:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:26.104349725 +0000 UTC m=+3163.285865946" watchObservedRunningTime="2026-01-21 15:54:26.109329488 +0000 UTC m=+3163.290845710" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.545967 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681095 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a8677a-d3a5-44da-9e5f-01db306c2e32-logs\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data-custom\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-combined-ca-bundle\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-public-tls-certs\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-internal-tls-certs\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.681902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njgd9\" (UniqueName: 
\"kubernetes.io/projected/29a8677a-d3a5-44da-9e5f-01db306c2e32-kube-api-access-njgd9\") pod \"29a8677a-d3a5-44da-9e5f-01db306c2e32\" (UID: \"29a8677a-d3a5-44da-9e5f-01db306c2e32\") " Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.682133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a8677a-d3a5-44da-9e5f-01db306c2e32-logs" (OuterVolumeSpecName: "logs") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.682449 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a8677a-d3a5-44da-9e5f-01db306c2e32-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.695964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.699204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a8677a-d3a5-44da-9e5f-01db306c2e32-kube-api-access-njgd9" (OuterVolumeSpecName: "kube-api-access-njgd9") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "kube-api-access-njgd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.714940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.732499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data" (OuterVolumeSpecName: "config-data") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.743435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.745895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "29a8677a-d3a5-44da-9e5f-01db306c2e32" (UID: "29a8677a-d3a5-44da-9e5f-01db306c2e32"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.785149 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.785179 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.785189 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.785198 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.785207 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29a8677a-d3a5-44da-9e5f-01db306c2e32-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.785215 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njgd9\" (UniqueName: \"kubernetes.io/projected/29a8677a-d3a5-44da-9e5f-01db306c2e32-kube-api-access-njgd9\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.847178 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:26 crc kubenswrapper[4707]: I0121 15:54:26.847555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.070968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerStarted","Data":"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008"} Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.071173 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-central-agent" containerID="cri-o://86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.071396 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.071485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="proxy-httpd" containerID="cri-o://14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.071625 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-notification-agent" 
containerID="cri-o://1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.071671 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="sg-core" containerID="cri-o://abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.076484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce","Type":"ContainerStarted","Data":"ebb015e4da971718a1807a27f20362fd1ce0ce3b2c3b4d48cc3e743f028aa7fd"} Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.076639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.081247 4707 generic.go:334] "Generic (PLEG): container finished" podID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerID="8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68" exitCode=0 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.081279 4707 generic.go:334] "Generic (PLEG): container finished" podID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerID="b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce" exitCode=143 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.081426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" event={"ID":"29a8677a-d3a5-44da-9e5f-01db306c2e32","Type":"ContainerDied","Data":"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68"} Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.081478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" event={"ID":"29a8677a-d3a5-44da-9e5f-01db306c2e32","Type":"ContainerDied","Data":"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce"} Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.081493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" event={"ID":"29a8677a-d3a5-44da-9e5f-01db306c2e32","Type":"ContainerDied","Data":"426e33ded1bd3ec3264184febb07a59272b9a83363e3390457318d1318e3425f"} Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.081513 4707 scope.go:117] "RemoveContainer" containerID="8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.082769 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.110676 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.900795314 podStartE2EDuration="6.110660592s" podCreationTimestamp="2026-01-21 15:54:21 +0000 UTC" firstStartedPulling="2026-01-21 15:54:22.58134736 +0000 UTC m=+3159.762863583" lastFinishedPulling="2026-01-21 15:54:26.791212638 +0000 UTC m=+3163.972728861" observedRunningTime="2026-01-21 15:54:27.094477471 +0000 UTC m=+3164.275993693" watchObservedRunningTime="2026-01-21 15:54:27.110660592 +0000 UTC m=+3164.292176815" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.111750 4707 scope.go:117] "RemoveContainer" containerID="b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.143331 4707 scope.go:117] "RemoveContainer" containerID="8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.143677 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68\": container with ID starting with 8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68 not found: ID does not exist" containerID="8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.143709 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68"} err="failed to get container status \"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68\": rpc error: code = NotFound desc = could not find container \"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68\": container with ID starting with 8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68 not found: ID does not exist" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.143728 4707 scope.go:117] "RemoveContainer" containerID="b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.143991 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce\": container with ID starting with b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce not found: ID does not exist" containerID="b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.144015 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce"} err="failed to get container status \"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce\": rpc error: code = NotFound desc = could not find container \"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce\": container with ID starting with b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce not found: ID does not exist" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.144029 4707 scope.go:117] "RemoveContainer" containerID="8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68" Jan 21 15:54:27 
crc kubenswrapper[4707]: I0121 15:54:27.144253 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68"} err="failed to get container status \"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68\": rpc error: code = NotFound desc = could not find container \"8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68\": container with ID starting with 8319294d0bac13c7837207edba43e05b77627143daf9b5860bcf4213a30ace68 not found: ID does not exist" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.144288 4707 scope.go:117] "RemoveContainer" containerID="b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.144485 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce"} err="failed to get container status \"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce\": rpc error: code = NotFound desc = could not find container \"b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce\": container with ID starting with b141782b40290b4021b10e11c91dbde6c5f086662302e74a0adc8703718532ce not found: ID does not exist" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.144530 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.152375 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-546cc5fb76-t8kj2"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.203026 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" path="/var/lib/kubelet/pods/29a8677a-d3a5-44da-9e5f-01db306c2e32/volumes" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.244605 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:51500->192.168.25.165:36655: write tcp 192.168.25.165:51500->192.168.25.165:36655: write: broken pipe Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.642962 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.675991 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.676220 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="ovn-northd" containerID="cri-o://7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.676270 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="openstack-network-exporter" containerID="cri-o://23f4a32980a3cc2291bacd76135da995267a0c14d4eb6427e7b2324696c9dd64" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.688842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.689099 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="openstack-network-exporter" containerID="cri-o://b08b894935f483cea6e9fe5cb8204c859d7f717fe6ecb7b8a6bb483b59d86af8" gracePeriod=300 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.712613 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz"] Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.712988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713002 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.713016 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="sg-core" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713022 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="sg-core" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.713033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api-log" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713040 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api-log" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.713048 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-notification-agent" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713054 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-notification-agent" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.713066 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="proxy-httpd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713071 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="proxy-httpd" Jan 21 15:54:27 crc kubenswrapper[4707]: E0121 15:54:27.713084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-central-agent" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-central-agent" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713303 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-notification-agent" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713314 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="ceilometer-central-agent" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713329 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713340 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="proxy-httpd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713350 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a8677a-d3a5-44da-9e5f-01db306c2e32" containerName="barbican-api-log" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.713358 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerName="sg-core" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-log-httpd\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714330 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qj44\" (UniqueName: \"kubernetes.io/projected/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-kube-api-access-2qj44\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-sg-core-conf-yaml\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-run-httpd\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-combined-ca-bundle\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714826 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-scripts\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-ceilometer-tls-certs\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.714927 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-config-data\") pod \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\" (UID: \"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3\") " Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.715051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.715736 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.716001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.748048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-kube-api-access-2qj44" (OuterVolumeSpecName: "kube-api-access-2qj44") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "kube-api-access-2qj44". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.753535 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="ovsdbserver-nb" containerID="cri-o://5609ff7c7f27e1828f46696621a20e1e765e3c6ae795b3240b11334dc334d6c4" gracePeriod=300 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.753693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-scripts" (OuterVolumeSpecName: "scripts") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data-custom\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4483b-abb1-4c04-9092-d8f04755bc93-logs\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-combined-ca-bundle\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-669nz\" (UniqueName: \"kubernetes.io/projected/a4c4483b-abb1-4c04-9092-d8f04755bc93-kube-api-access-669nz\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817746 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qj44\" (UniqueName: \"kubernetes.io/projected/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-kube-api-access-2qj44\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817757 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.817766 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.827753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.829647 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.875408 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.897992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.908426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.921110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data-custom\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.929337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.929483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data-custom\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.929623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4483b-abb1-4c04-9092-d8f04755bc93-logs\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.929773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-combined-ca-bundle\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.929932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-669nz\" (UniqueName: \"kubernetes.io/projected/a4c4483b-abb1-4c04-9092-d8f04755bc93-kube-api-access-669nz\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.930019 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmrm\" (UniqueName: \"kubernetes.io/projected/28add001-f139-47b8-9ebb-659ba39858cc-kube-api-access-7mmrm\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.930125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.930274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-combined-ca-bundle\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.930359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28add001-f139-47b8-9ebb-659ba39858cc-logs\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.930545 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.934702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4483b-abb1-4c04-9092-d8f04755bc93-logs\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.934787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data-custom\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.955913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.958319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-combined-ca-bundle\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.961083 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.961355 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api-log" containerID="cri-o://1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.961482 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api" containerID="cri-o://55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213" gracePeriod=30 Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.979185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-669nz\" (UniqueName: \"kubernetes.io/projected/a4c4483b-abb1-4c04-9092-d8f04755bc93-kube-api-access-669nz\") pod \"barbican-keystone-listener-6b5f6558c6-lzljz\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:27 crc kubenswrapper[4707]: I0121 15:54:27.985989 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.021422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.027776 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-98765b5d4-w967c"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.028022 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-log" containerID="cri-o://07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381" gracePeriod=30 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.028382 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-api" containerID="cri-o://ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527" gracePeriod=30 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.031722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28add001-f139-47b8-9ebb-659ba39858cc-logs\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.032013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data-custom\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.032294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mmrm\" (UniqueName: \"kubernetes.io/projected/28add001-f139-47b8-9ebb-659ba39858cc-kube-api-access-7mmrm\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.032394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.032518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-combined-ca-bundle\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.032632 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.041200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.041586 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.139:8778/\": read tcp 10.217.0.2:54678->10.217.0.139:8778: read: connection reset by peer" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.032420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28add001-f139-47b8-9ebb-659ba39858cc-logs\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.043541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-combined-ca-bundle\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.045048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.047461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data-custom\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.050645 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.139:8778/\": EOF" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.051007 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.053380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmrm\" (UniqueName: \"kubernetes.io/projected/28add001-f139-47b8-9ebb-659ba39858cc-kube-api-access-7mmrm\") pod \"barbican-worker-789c64d4bf-l5hnd\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.063006 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.063492 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="openstack-network-exporter" 
containerID="cri-o://13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb" gracePeriod=300 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.104680 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.107631 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.109099 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5469fd854-s2lnn"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.115972 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.133765 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-config-data" (OuterVolumeSpecName: "config-data") pod "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" (UID: "f3b19b6e-8bbc-4e8e-8a77-57b816be96a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.135525 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.135546 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139084 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="ovsdbserver-sb" containerID="cri-o://2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593" gracePeriod=300 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139605 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" exitCode=0 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139710 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" exitCode=2 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139786 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" exitCode=0 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139947 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" exitCode=0 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139950 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.139930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerDied","Data":"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.140448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerDied","Data":"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.140538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerDied","Data":"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.140617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerDied","Data":"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.140693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f3b19b6e-8bbc-4e8e-8a77-57b816be96a3","Type":"ContainerDied","Data":"3d0a8a365a91eaa96330e21170c042f8876741f32549832030901d2056fe375d"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.159043 4707 scope.go:117] "RemoveContainer" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.167761 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.173947 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.181698 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5469fd854-s2lnn"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.191875 4707 generic.go:334] "Generic (PLEG): container finished" podID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerID="23f4a32980a3cc2291bacd76135da995267a0c14d4eb6427e7b2324696c9dd64" exitCode=2 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.191946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"206e3d62-bdaa-4326-8c66-a615e81200d8","Type":"ContainerDied","Data":"23f4a32980a3cc2291bacd76135da995267a0c14d4eb6427e7b2324696c9dd64"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.193464 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.211968 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerID="d595cfcd7f15bd2970f5ab9f24d5e2ca4feaa5a5ba6498c06a7eec79734797d8" exitCode=1 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.212035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3e6e1d-19d1-4310-907f-a252e900b54a","Type":"ContainerDied","Data":"d595cfcd7f15bd2970f5ab9f24d5e2ca4feaa5a5ba6498c06a7eec79734797d8"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.212693 4707 scope.go:117] "RemoveContainer" containerID="d595cfcd7f15bd2970f5ab9f24d5e2ca4feaa5a5ba6498c06a7eec79734797d8" Jan 21 15:54:28 crc kubenswrapper[4707]: E0121 15:54:28.213010 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(fb3e6e1d-19d1-4310-907f-a252e900b54a)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.222011 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_38825357-2812-4711-bc86-7638075f8daa/ovsdbserver-nb/0.log" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.222043 4707 generic.go:334] "Generic (PLEG): container finished" podID="38825357-2812-4711-bc86-7638075f8daa" containerID="b08b894935f483cea6e9fe5cb8204c859d7f717fe6ecb7b8a6bb483b59d86af8" exitCode=2 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.222053 4707 generic.go:334] "Generic (PLEG): container finished" podID="38825357-2812-4711-bc86-7638075f8daa" containerID="5609ff7c7f27e1828f46696621a20e1e765e3c6ae795b3240b11334dc334d6c4" exitCode=143 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.222100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"38825357-2812-4711-bc86-7638075f8daa","Type":"ContainerDied","Data":"b08b894935f483cea6e9fe5cb8204c859d7f717fe6ecb7b8a6bb483b59d86af8"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.222117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"38825357-2812-4711-bc86-7638075f8daa","Type":"ContainerDied","Data":"5609ff7c7f27e1828f46696621a20e1e765e3c6ae795b3240b11334dc334d6c4"} Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.223507 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api-log" containerID="cri-o://d7d4a778b9df39e14feb7740f3a7817dd4ceb96bd7dc4108a8e2e99f350903d3" gracePeriod=30 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.223728 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api" containerID="cri-o://958b22b32e9e9e12c4b19bb7c2cc210c9881143654900671a1871c84040807a4" gracePeriod=30 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.234660 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-scripts\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240736 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-public-tls-certs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-internal-tls-certs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-config-data\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-internal-tls-certs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " 
pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.240929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6676q\" (UniqueName: \"kubernetes.io/projected/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-kube-api-access-6676q\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.241539 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-combined-ca-bundle\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.241598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243109f6-78d0-4750-a43f-21a7b606626d-logs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.241722 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-logs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.249601 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerName="galera" containerID="cri-o://84534c9d37714a2ecd0730e455dcc3dc74f38b7ed3f5fbd32c9d21d6b444b1c6" gracePeriod=30 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.249869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghhf\" (UniqueName: \"kubernetes.io/projected/243109f6-78d0-4750-a43f-21a7b606626d-kube-api-access-gghhf\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.250052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-public-tls-certs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.250469 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data-custom\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.250511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-combined-ca-bundle\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.256662 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.340851 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.351917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-config-data\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.351953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-internal-tls-certs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.351977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6676q\" (UniqueName: \"kubernetes.io/projected/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-kube-api-access-6676q\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-combined-ca-bundle\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243109f6-78d0-4750-a43f-21a7b606626d-logs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-logs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gghhf\" (UniqueName: \"kubernetes.io/projected/243109f6-78d0-4750-a43f-21a7b606626d-kube-api-access-gghhf\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-public-tls-certs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data-custom\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-combined-ca-bundle\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-scripts\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-public-tls-certs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.352341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-internal-tls-certs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.353130 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.353509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-logs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.361920 4707 scope.go:117] "RemoveContainer" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.363007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.364388 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243109f6-78d0-4750-a43f-21a7b606626d-logs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.365183 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.371682 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.373740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-internal-tls-certs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.375629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-scripts\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.380201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-internal-tls-certs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.380301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-public-tls-certs\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.380377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-public-tls-certs\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.384689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.387579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-combined-ca-bundle\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.388301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data-custom\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 
21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.388708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-combined-ca-bundle\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.390074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-config-data\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.400018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.400492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6676q\" (UniqueName: \"kubernetes.io/projected/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-kube-api-access-6676q\") pod \"barbican-api-b87b87cdd-2lqlf\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.374121 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.378310 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.408750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghhf\" (UniqueName: \"kubernetes.io/projected/243109f6-78d0-4750-a43f-21a7b606626d-kube-api-access-gghhf\") pod \"placement-5469fd854-s2lnn\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.430735 4707 scope.go:117] "RemoveContainer" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.455661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlrc\" (UniqueName: \"kubernetes.io/projected/f9460600-2f70-404b-b7f3-2f83c235cef5-kube-api-access-ghlrc\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.455708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-scripts\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.456175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-config-data\") pod 
\"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.456288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.456324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.456404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.456429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-run-httpd\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.456526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-log-httpd\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.460081 4707 scope.go:117] "RemoveContainer" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.472220 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="galera" containerID="cri-o://c0a16b0c06ec56b53325b0a694e257afb1a1c30b41915f7933e77e12c66df71b" gracePeriod=30 Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.480641 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_38825357-2812-4711-bc86-7638075f8daa/ovsdbserver-nb/0.log" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.480711 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.491364 4707 scope.go:117] "RemoveContainer" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" Jan 21 15:54:28 crc kubenswrapper[4707]: E0121 15:54:28.491704 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": container with ID starting with 14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008 not found: ID does not exist" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.491730 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008"} err="failed to get container status \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": rpc error: code = NotFound desc = could not find container \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": container with ID starting with 14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.491749 4707 scope.go:117] "RemoveContainer" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" Jan 21 15:54:28 crc kubenswrapper[4707]: E0121 15:54:28.493411 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": container with ID starting with abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26 not found: ID does not exist" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.493442 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26"} err="failed to get container status \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": rpc error: code = NotFound desc = could not find container \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": container with ID starting with abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.493460 4707 scope.go:117] "RemoveContainer" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" Jan 21 15:54:28 crc kubenswrapper[4707]: E0121 15:54:28.493732 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": container with ID starting with 1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666 not found: ID does not exist" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.493783 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666"} err="failed to get container status \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": rpc error: code = NotFound 
desc = could not find container \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": container with ID starting with 1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.493796 4707 scope.go:117] "RemoveContainer" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" Jan 21 15:54:28 crc kubenswrapper[4707]: E0121 15:54:28.496639 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": container with ID starting with 86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e not found: ID does not exist" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.496664 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e"} err="failed to get container status \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": rpc error: code = NotFound desc = could not find container \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": container with ID starting with 86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.496678 4707 scope.go:117] "RemoveContainer" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.503849 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008"} err="failed to get container status \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": rpc error: code = NotFound desc = could not find container \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": container with ID starting with 14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.503883 4707 scope.go:117] "RemoveContainer" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.521649 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26"} err="failed to get container status \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": rpc error: code = NotFound desc = could not find container \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": container with ID starting with abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.521687 4707 scope.go:117] "RemoveContainer" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.527094 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666"} err="failed to get container status \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": rpc error: code = NotFound 
desc = could not find container \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": container with ID starting with 1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.527135 4707 scope.go:117] "RemoveContainer" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.532954 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e"} err="failed to get container status \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": rpc error: code = NotFound desc = could not find container \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": container with ID starting with 86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.533008 4707 scope.go:117] "RemoveContainer" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.545735 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008"} err="failed to get container status \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": rpc error: code = NotFound desc = could not find container \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": container with ID starting with 14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.545767 4707 scope.go:117] "RemoveContainer" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.547689 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26"} err="failed to get container status \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": rpc error: code = NotFound desc = could not find container \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": container with ID starting with abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.547741 4707 scope.go:117] "RemoveContainer" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.548307 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666"} err="failed to get container status \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": rpc error: code = NotFound desc = could not find container \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": container with ID starting with 1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.548326 4707 scope.go:117] "RemoveContainer" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 
15:54:28.548874 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e"} err="failed to get container status \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": rpc error: code = NotFound desc = could not find container \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": container with ID starting with 86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.548893 4707 scope.go:117] "RemoveContainer" containerID="14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.549150 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008"} err="failed to get container status \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": rpc error: code = NotFound desc = could not find container \"14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008\": container with ID starting with 14acb13cfba5834702ef77fb6c5dbbacef6cb6f8153101480f5f47a8598c8008 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.549163 4707 scope.go:117] "RemoveContainer" containerID="abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.550766 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26"} err="failed to get container status \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": rpc error: code = NotFound desc = could not find container \"abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26\": container with ID starting with abbcbed95be5d737c3bae8495e8066ca514fba36003b0c0b644c59cab6700f26 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.550786 4707 scope.go:117] "RemoveContainer" containerID="1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.554376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666"} err="failed to get container status \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": rpc error: code = NotFound desc = could not find container \"1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666\": container with ID starting with 1ecb2d9f1946a5abfa8bcafd9921f4fbef17710b79f962e70fc220b9ad228666 not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.554414 4707 scope.go:117] "RemoveContainer" containerID="86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.556215 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e"} err="failed to get container status \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": rpc error: code = NotFound desc = could not find container \"86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e\": container with ID starting with 
86c0fa41ee10792dd427a88f330ca128f73bdc6613af93d806d1bb0fc4215f4e not found: ID does not exist" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.556257 4707 scope.go:117] "RemoveContainer" containerID="28b00c21715fd067ad0883206899e195822f24dc4636250897a66b7215c6e997" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-ovsdbserver-nb-tls-certs\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtc4v\" (UniqueName: \"kubernetes.io/projected/38825357-2812-4711-bc86-7638075f8daa-kube-api-access-jtc4v\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38825357-2812-4711-bc86-7638075f8daa-ovsdb-rundir\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-metrics-certs-tls-certs\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-scripts\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-combined-ca-bundle\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-config\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.561804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"38825357-2812-4711-bc86-7638075f8daa\" (UID: \"38825357-2812-4711-bc86-7638075f8daa\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.562411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-scripts" (OuterVolumeSpecName: "scripts") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.562544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38825357-2812-4711-bc86-7638075f8daa-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.562693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-config-data\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.562891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.562930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.562981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-run-httpd\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-log-httpd\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-config" (OuterVolumeSpecName: "config") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlrc\" (UniqueName: \"kubernetes.io/projected/f9460600-2f70-404b-b7f3-2f83c235cef5-kube-api-access-ghlrc\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-scripts\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563242 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38825357-2812-4711-bc86-7638075f8daa-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563253 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563290 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38825357-2812-4711-bc86-7638075f8daa-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.563684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-run-httpd\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.565245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-log-httpd\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.566477 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.566905 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.566963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.577210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.577866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-scripts\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.579390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38825357-2812-4711-bc86-7638075f8daa-kube-api-access-jtc4v" (OuterVolumeSpecName: "kube-api-access-jtc4v") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "kube-api-access-jtc4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.589109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.590168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-config-data\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.591670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.595434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlrc\" (UniqueName: \"kubernetes.io/projected/f9460600-2f70-404b-b7f3-2f83c235cef5-kube-api-access-ghlrc\") pod \"ceilometer-0\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.631486 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.645534 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.672273 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.672305 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtc4v\" (UniqueName: \"kubernetes.io/projected/38825357-2812-4711-bc86-7638075f8daa-kube-api-access-jtc4v\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.682670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.695657 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.704905 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_ddb5dfe8-ec47-4bb6-b6c0-403095118b07/ovsdbserver-sb/0.log" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.704981 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.714091 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.740370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.775205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-kube-api-access-6vndk\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.775341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdb-rundir\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.775368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-combined-ca-bundle\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.776086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.776252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.776585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-config\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.776637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-metrics-certs-tls-certs\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.776663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-scripts\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.777015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdbserver-sb-tls-certs\") pod \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\" (UID: \"ddb5dfe8-ec47-4bb6-b6c0-403095118b07\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.777796 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.777868 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.777878 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.777888 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.779351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-config" (OuterVolumeSpecName: "config") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.780571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-scripts" (OuterVolumeSpecName: "scripts") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.782181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.796777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-kube-api-access-6vndk" (OuterVolumeSpecName: "kube-api-access-6vndk") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "kube-api-access-6vndk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.827038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "38825357-2812-4711-bc86-7638075f8daa" (UID: "38825357-2812-4711-bc86-7638075f8daa"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.856954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.857078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.881591 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38825357-2812-4711-bc86-7638075f8daa-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.881619 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-kube-api-access-6vndk\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.881629 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.881649 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.881659 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.881670 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.899579 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/openstack-galera-0" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="galera" probeResult="failure" output="" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.899646 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.906348 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.957908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.972050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ddb5dfe8-ec47-4bb6-b6c0-403095118b07" (UID: "ddb5dfe8-ec47-4bb6-b6c0-403095118b07"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.979706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.982869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e21981d-0581-4e7c-9485-cfea4bffaa9d-logs\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.982951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data-custom\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.982987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-combined-ca-bundle\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.983041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-internal-tls-certs\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.983057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-public-tls-certs\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.983084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e21981d-0581-4e7c-9485-cfea4bffaa9d-etc-machine-id\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.983113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.983183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76n8j\" (UniqueName: \"kubernetes.io/projected/7e21981d-0581-4e7c-9485-cfea4bffaa9d-kube-api-access-76n8j\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.983294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-scripts\") pod \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\" (UID: \"7e21981d-0581-4e7c-9485-cfea4bffaa9d\") " Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.984027 4707 reconciler_common.go:293] "Volume detached for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.984041 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.984052 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb5dfe8-ec47-4bb6-b6c0-403095118b07-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.985461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e21981d-0581-4e7c-9485-cfea4bffaa9d-logs" (OuterVolumeSpecName: "logs") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.985503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e21981d-0581-4e7c-9485-cfea4bffaa9d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.988521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd"] Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.995800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e21981d-0581-4e7c-9485-cfea4bffaa9d-kube-api-access-76n8j" (OuterVolumeSpecName: "kube-api-access-76n8j") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "kube-api-access-76n8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:28 crc kubenswrapper[4707]: I0121 15:54:28.996537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-scripts" (OuterVolumeSpecName: "scripts") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.007904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.029000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.063633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data" (OuterVolumeSpecName: "config-data") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.074541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.076391 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.078242 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088880 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e21981d-0581-4e7c-9485-cfea4bffaa9d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088909 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088921 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76n8j\" (UniqueName: \"kubernetes.io/projected/7e21981d-0581-4e7c-9485-cfea4bffaa9d-kube-api-access-76n8j\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088931 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088940 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e21981d-0581-4e7c-9485-cfea4bffaa9d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088948 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088956 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.088965 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.090009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7e21981d-0581-4e7c-9485-cfea4bffaa9d" (UID: "7e21981d-0581-4e7c-9485-cfea4bffaa9d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.191091 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e21981d-0581-4e7c-9485-cfea4bffaa9d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.198467 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b19b6e-8bbc-4e8e-8a77-57b816be96a3" path="/var/lib/kubelet/pods/f3b19b6e-8bbc-4e8e-8a77-57b816be96a3/volumes" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.226654 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.253840 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerID="958b22b32e9e9e12c4b19bb7c2cc210c9881143654900671a1871c84040807a4" exitCode=0 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.253950 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerID="d7d4a778b9df39e14feb7740f3a7817dd4ceb96bd7dc4108a8e2e99f350903d3" exitCode=143 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.253915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" event={"ID":"8f27e5ec-c6d9-4b5d-bc14-1818306024f3","Type":"ContainerDied","Data":"958b22b32e9e9e12c4b19bb7c2cc210c9881143654900671a1871c84040807a4"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.254643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" event={"ID":"8f27e5ec-c6d9-4b5d-bc14-1818306024f3","Type":"ContainerDied","Data":"d7d4a778b9df39e14feb7740f3a7817dd4ceb96bd7dc4108a8e2e99f350903d3"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.254675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" event={"ID":"8f27e5ec-c6d9-4b5d-bc14-1818306024f3","Type":"ContainerDied","Data":"a656141261657f0ce20563f4eb8868810a7423f5175ce99e222aaa4c2582dab8"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.254686 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a656141261657f0ce20563f4eb8868810a7423f5175ce99e222aaa4c2582dab8" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.256414 4707 generic.go:334] "Generic (PLEG): container finished" podID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerID="07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381" exitCode=143 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 
15:54:29.256473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" event={"ID":"318a94fd-66d7-4cfb-b312-a348e185ae2e","Type":"ContainerDied","Data":"07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258077 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerID="55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213" exitCode=0 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258102 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerID="1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c" exitCode=143 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7e21981d-0581-4e7c-9485-cfea4bffaa9d","Type":"ContainerDied","Data":"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7e21981d-0581-4e7c-9485-cfea4bffaa9d","Type":"ContainerDied","Data":"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7e21981d-0581-4e7c-9485-cfea4bffaa9d","Type":"ContainerDied","Data":"0b751042b32585b77a3d7dd8604710c56f396c4378ab45c5f9a48fedf42e7d83"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258194 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.258204 4707 scope.go:117] "RemoveContainer" containerID="55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.265404 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerID="ebb015e4da971718a1807a27f20362fd1ce0ce3b2c3b4d48cc3e743f028aa7fd" exitCode=1 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.265473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce","Type":"ContainerDied","Data":"ebb015e4da971718a1807a27f20362fd1ce0ce3b2c3b4d48cc3e743f028aa7fd"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.266030 4707 scope.go:117] "RemoveContainer" containerID="ebb015e4da971718a1807a27f20362fd1ce0ce3b2c3b4d48cc3e743f028aa7fd" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.267205 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.272635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" event={"ID":"28add001-f139-47b8-9ebb-659ba39858cc","Type":"ContainerStarted","Data":"26c5a891a2c404bb36e9542098c2586cffd1c00d906dd37ada375d009956e7a8"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.276621 4707 generic.go:334] "Generic (PLEG): container finished" podID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerID="84534c9d37714a2ecd0730e455dcc3dc74f38b7ed3f5fbd32c9d21d6b444b1c6" exitCode=0 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.276688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"01ec6ded-9697-471a-80d6-9ed5fdd9918e","Type":"ContainerDied","Data":"84534c9d37714a2ecd0730e455dcc3dc74f38b7ed3f5fbd32c9d21d6b444b1c6"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.308779 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_ddb5dfe8-ec47-4bb6-b6c0-403095118b07/ovsdbserver-sb/0.log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.309025 4707 generic.go:334] "Generic (PLEG): container finished" podID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerID="13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb" exitCode=2 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.309041 4707 generic.go:334] "Generic (PLEG): container finished" podID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerID="2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593" exitCode=143 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.309155 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.309671 4707 scope.go:117] "RemoveContainer" containerID="1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.310036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ddb5dfe8-ec47-4bb6-b6c0-403095118b07","Type":"ContainerDied","Data":"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.310068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ddb5dfe8-ec47-4bb6-b6c0-403095118b07","Type":"ContainerDied","Data":"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.310079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ddb5dfe8-ec47-4bb6-b6c0-403095118b07","Type":"ContainerDied","Data":"867ac410fbcc706feb2e5d8260d8f3a1a3fbbfbc66c12963ea695d4d1b924302"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.318056 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.324692 4707 generic.go:334] "Generic (PLEG): container finished" podID="d70efa8f-44f9-48d8-8beb-531c69225631" containerID="c0a16b0c06ec56b53325b0a694e257afb1a1c30b41915f7933e77e12c66df71b" exitCode=0 Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.324768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d70efa8f-44f9-48d8-8beb-531c69225631","Type":"ContainerDied","Data":"c0a16b0c06ec56b53325b0a694e257afb1a1c30b41915f7933e77e12c66df71b"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.352605 4707 scope.go:117] "RemoveContainer" containerID="d595cfcd7f15bd2970f5ab9f24d5e2ca4feaa5a5ba6498c06a7eec79734797d8" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.353107 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(fb3e6e1d-19d1-4310-907f-a252e900b54a)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.363959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" event={"ID":"a4c4483b-abb1-4c04-9092-d8f04755bc93","Type":"ContainerStarted","Data":"e682954da34993008005534dd037243e287aaf2a8a003495aee619f754b4a49e"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.377436 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_38825357-2812-4711-bc86-7638075f8daa/ovsdbserver-nb/0.log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.377572 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.377957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"38825357-2812-4711-bc86-7638075f8daa","Type":"ContainerDied","Data":"b33f328b366d5b406b5f6b66e852340992c7f44a32e052c5c36f355d0e2c103a"} Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.389436 4707 scope.go:117] "RemoveContainer" containerID="55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.389670 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213\": container with ID starting with 55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213 not found: ID does not exist" containerID="55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.389692 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213"} err="failed to get container status \"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213\": rpc error: code = NotFound desc = could not find container \"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213\": container with ID starting with 55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213 not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.389707 4707 scope.go:117] "RemoveContainer" containerID="1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.389880 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c\": container with ID starting with 1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c not found: ID does not exist" containerID="1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.389896 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c"} err="failed to get container status \"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c\": rpc error: code = NotFound desc = could not find container \"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c\": container with ID starting with 1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.389908 4707 scope.go:117] "RemoveContainer" containerID="55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.390078 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213"} err="failed to get container status \"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213\": rpc error: code = NotFound desc = could not find container \"55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213\": container with ID starting with 
55c65f3d245d37eeb7f019e2a312728afd2684747475eab8095e405000e23213 not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.390090 4707 scope.go:117] "RemoveContainer" containerID="1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.390243 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c"} err="failed to get container status \"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c\": rpc error: code = NotFound desc = could not find container \"1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c\": container with ID starting with 1a1666cc294ffa9fa43f03ac62ce40dbdf8d72b8afced32a55ec7c8463f4b86c not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.390255 4707 scope.go:117] "RemoveContainer" containerID="16ee6b178cb41968a2fe3874c57b52c48c07f4367ddef704d168289078b6a88e" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.398584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-combined-ca-bundle\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.398697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-logs\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.398718 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-public-tls-certs\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.398796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data-custom\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.399153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.399286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-internal-tls-certs\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.399341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmf79\" (UniqueName: \"kubernetes.io/projected/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-kube-api-access-hmf79\") pod \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\" (UID: \"8f27e5ec-c6d9-4b5d-bc14-1818306024f3\") " Jan 21 15:54:29 crc 
kubenswrapper[4707]: I0121 15:54:29.414740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-logs" (OuterVolumeSpecName: "logs") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.420224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-kube-api-access-hmf79" (OuterVolumeSpecName: "kube-api-access-hmf79") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "kube-api-access-hmf79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.429332 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.436896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.456599 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.474440 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.474910 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api-log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.474929 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api-log" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.474944 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.474950 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.474960 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="ovsdbserver-sb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.474966 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="ovsdbserver-sb" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.474997 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="ovsdbserver-nb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="ovsdbserver-nb" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.475016 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" 
containerName="openstack-network-exporter" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475022 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="openstack-network-exporter" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.475032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="openstack-network-exporter" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475038 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="openstack-network-exporter" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.475046 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475051 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.475061 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api-log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475067 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api-log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475224 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="openstack-network-exporter" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475236 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="openstack-network-exporter" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475252 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475274 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" containerName="barbican-api-log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475287 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="38825357-2812-4711-bc86-7638075f8daa" containerName="ovsdbserver-nb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475298 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" containerName="cinder-api-log" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.475305 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" containerName="ovsdbserver-sb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.476677 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.478868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.481371 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.483330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.497984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.502354 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmf79\" (UniqueName: \"kubernetes.io/projected/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-kube-api-access-hmf79\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.502446 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.502507 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.504218 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.509843 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.516959 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.520888 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.528991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.531874 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.533307 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.533510 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.533817 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.533884 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-6hxvs" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.557999 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.575595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.577726 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.579642 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.581432 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.581693 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.581733 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.581916 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.597711 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-logs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-scripts\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.604709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.606672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data-custom\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.606708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.606776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.606917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.606968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.607007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9cm\" (UniqueName: \"kubernetes.io/projected/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-kube-api-access-6r9cm\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.607064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxms8\" (UniqueName: \"kubernetes.io/projected/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-kube-api-access-dxms8\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.607124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.607147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.607230 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.628090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.637834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data" (OuterVolumeSpecName: "config-data") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.640661 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.648249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8f27e5ec-c6d9-4b5d-bc14-1818306024f3" (UID: "8f27e5ec-c6d9-4b5d-bc14-1818306024f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.655873 4707 scope.go:117] "RemoveContainer" containerID="13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.667469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.670274 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.696350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5469fd854-s2lnn"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.709058 4707 scope.go:117] "RemoveContainer" containerID="2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.709602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.709818 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-combined-ca-bundle\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.709866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5vc2\" (UniqueName: \"kubernetes.io/projected/d70efa8f-44f9-48d8-8beb-531c69225631-kube-api-access-k5vc2\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.709954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-galera-tls-certs\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.709979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-default\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-generated\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-galera-tls-certs\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psdtt\" (UniqueName: \"kubernetes.io/projected/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kube-api-access-psdtt\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-operator-scripts\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710170 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-kolla-config\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-generated\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-default\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kolla-config\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-operator-scripts\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\" (UID: \"01ec6ded-9697-471a-80d6-9ed5fdd9918e\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-combined-ca-bundle\") pod \"d70efa8f-44f9-48d8-8beb-531c69225631\" (UID: \"d70efa8f-44f9-48d8-8beb-531c69225631\") " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxms8\" (UniqueName: \"kubernetes.io/projected/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-kube-api-access-dxms8\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.710943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-logs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-scripts\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711933 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data-custom\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-config\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfk6m\" (UniqueName: \"kubernetes.io/projected/910c3c7f-e041-4efa-93f7-607f23a14bb7-kube-api-access-pfk6m\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9cm\" (UniqueName: \"kubernetes.io/projected/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-kube-api-access-6r9cm\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712460 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712474 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712485 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.712497 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f27e5ec-c6d9-4b5d-bc14-1818306024f3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.713669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc 
kubenswrapper[4707]: I0121 15:54:29.716143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.717364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-logs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.711173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.715912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.717431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.717780 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.717922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.727110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70efa8f-44f9-48d8-8beb-531c69225631-kube-api-access-k5vc2" (OuterVolumeSpecName: "kube-api-access-k5vc2") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "kube-api-access-k5vc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.727530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.734615 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.735672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxms8\" (UniqueName: \"kubernetes.io/projected/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-kube-api-access-dxms8\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.735766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.716735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.717055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.735850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.736461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.738193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.745744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kube-api-access-psdtt" (OuterVolumeSpecName: "kube-api-access-psdtt") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "kube-api-access-psdtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.746934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-scripts\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.747549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.747701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.748218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9cm\" (UniqueName: \"kubernetes.io/projected/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-kube-api-access-6r9cm\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.759082 4707 scope.go:117] "RemoveContainer" containerID="13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.759944 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb\": container with ID starting with 13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb not found: ID does not exist" containerID="13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.760003 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb"} err="failed to get container status \"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb\": rpc error: code = NotFound desc = could not find container \"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb\": container with ID starting with 
13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.760043 4707 scope.go:117] "RemoveContainer" containerID="2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593" Jan 21 15:54:29 crc kubenswrapper[4707]: E0121 15:54:29.760788 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593\": container with ID starting with 2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593 not found: ID does not exist" containerID="2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.760855 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593"} err="failed to get container status \"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593\": rpc error: code = NotFound desc = could not find container \"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593\": container with ID starting with 2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593 not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.760870 4707 scope.go:117] "RemoveContainer" containerID="13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.761217 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb"} err="failed to get container status \"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb\": rpc error: code = NotFound desc = could not find container \"13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb\": container with ID starting with 13c63c267bcf3e764c070350274f810bf839da47abf43a0a836a96276a0532eb not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.761311 4707 scope.go:117] "RemoveContainer" containerID="2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.761661 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593"} err="failed to get container status \"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593\": rpc error: code = NotFound desc = could not find container \"2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593\": container with ID starting with 2daeb465aed3942704e1a4a013ad14751180c9d331f6d2785c89d458f91fc593 not found: ID does not exist" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.761696 4707 scope.go:117] "RemoveContainer" containerID="b08b894935f483cea6e9fe5cb8204c859d7f717fe6ecb7b8a6bb483b59d86af8" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.762767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data-custom\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.762931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.763391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.767761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.796093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.800024 4707 scope.go:117] "RemoveContainer" containerID="5609ff7c7f27e1828f46696621a20e1e765e3c6ae795b3240b11334dc334d6c4" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.813904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.813964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-config\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 
15:54:29.814095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfk6m\" (UniqueName: \"kubernetes.io/projected/910c3c7f-e041-4efa-93f7-607f23a14bb7-kube-api-access-pfk6m\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814268 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814282 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d70efa8f-44f9-48d8-8beb-531c69225631-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814303 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814313 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814341 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814351 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d70efa8f-44f9-48d8-8beb-531c69225631-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814365 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814377 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5vc2\" (UniqueName: \"kubernetes.io/projected/d70efa8f-44f9-48d8-8beb-531c69225631-kube-api-access-k5vc2\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814387 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01ec6ded-9697-471a-80d6-9ed5fdd9918e-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: 
I0121 15:54:29.814412 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psdtt\" (UniqueName: \"kubernetes.io/projected/01ec6ded-9697-471a-80d6-9ed5fdd9918e-kube-api-access-psdtt\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.814422 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01ec6ded-9697-471a-80d6-9ed5fdd9918e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.815213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-config\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.816049 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.817354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.823352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.829637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.830153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.831284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfk6m\" (UniqueName: \"kubernetes.io/projected/910c3c7f-e041-4efa-93f7-607f23a14bb7-kube-api-access-pfk6m\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.841651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 
15:54:29.856563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.862210 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.865633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.869555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d70efa8f-44f9-48d8-8beb-531c69225631" (UID: "d70efa8f-44f9-48d8-8beb-531c69225631"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.880048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.902214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.908643 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.915676 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.915703 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.915714 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.915728 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.915738 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70efa8f-44f9-48d8-8beb-531c69225631-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.917150 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.925310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "01ec6ded-9697-471a-80d6-9ed5fdd9918e" (UID: "01ec6ded-9697-471a-80d6-9ed5fdd9918e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.943928 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:29 crc kubenswrapper[4707]: I0121 15:54:29.948737 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.018397 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ec6ded-9697-471a-80d6-9ed5fdd9918e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.461469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" event={"ID":"28add001-f139-47b8-9ebb-659ba39858cc","Type":"ContainerStarted","Data":"db4b9f5629cb09208131600005b17fc382e9d120bae3a7b38b5a8a1752ce7728"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.461946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" event={"ID":"28add001-f139-47b8-9ebb-659ba39858cc","Type":"ContainerStarted","Data":"73b1fd5e41c624d54ff550ab75b71d0f8bbc342e1260f7be7c15ab44c21e4e16"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.472891 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.477844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d70efa8f-44f9-48d8-8beb-531c69225631","Type":"ContainerDied","Data":"012e28043224893a29e9aa0317b193ef300ff4ee927d53fc44201c8909337525"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.477869 4707 scope.go:117] "RemoveContainer" containerID="c0a16b0c06ec56b53325b0a694e257afb1a1c30b41915f7933e77e12c66df71b" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.477956 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.489268 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" podStartSLOduration=3.489239961 podStartE2EDuration="3.489239961s" podCreationTimestamp="2026-01-21 15:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:30.487632789 +0000 UTC m=+3167.669149011" watchObservedRunningTime="2026-01-21 15:54:30.489239961 +0000 UTC m=+3167.670756182" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.491730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" event={"ID":"243109f6-78d0-4750-a43f-21a7b606626d","Type":"ContainerStarted","Data":"b3b250ebed84579c975d272e553aa69f879cc66363e018d563bfefd4ff7ef997"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.491758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" event={"ID":"243109f6-78d0-4750-a43f-21a7b606626d","Type":"ContainerStarted","Data":"cc0164bd92b9e0315765bd0cef3ea980987ab65b303b1a0cbcde9d86f9918385"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.491770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" event={"ID":"243109f6-78d0-4750-a43f-21a7b606626d","Type":"ContainerStarted","Data":"2632e16366185f74b5f4647a40e3b3adadd47e4c4847d2a650c73312c38e7396"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.491941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.491967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.518608 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5665575645-v8ktc"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.518837 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker-log" containerID="cri-o://e4cf394238e37ac5d15301506b18d0ba417d6a80b24ad0f97a43b3434616dd5a" gracePeriod=30 Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.518955 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker" containerID="cri-o://9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58" gracePeriod=30 Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.529830 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" podStartSLOduration=3.529786525 podStartE2EDuration="3.529786525s" podCreationTimestamp="2026-01-21 15:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:30.528517348 +0000 UTC m=+3167.710033560" watchObservedRunningTime="2026-01-21 15:54:30.529786525 +0000 UTC m=+3167.711302747" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.557915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"01ec6ded-9697-471a-80d6-9ed5fdd9918e","Type":"ContainerDied","Data":"eda33eb1062eb5c805e7b51d04e65e94eb0f25597b4db4bdfd5b219d72577a1c"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.558106 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.575602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerStarted","Data":"22d81c91523460a193b8835f540179f7d8d2fd3fea5b8373e05281ab6dbd6aed"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.589632 4707 scope.go:117] "RemoveContainer" containerID="3e29351ea171b68ab5f386f24df504ebd7cdf39483d33555a2145f0a35513183" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.593702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" event={"ID":"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7","Type":"ContainerStarted","Data":"c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.593743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" event={"ID":"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7","Type":"ContainerStarted","Data":"ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.593754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" event={"ID":"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7","Type":"ContainerStarted","Data":"ac729fbe14555219c0a09200bd5f7a530de88f8d8b6d1a038730cbeb9a223666"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.595022 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.595246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.629702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.672120 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.681082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" event={"ID":"a4c4483b-abb1-4c04-9092-d8f04755bc93","Type":"ContainerStarted","Data":"60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.682194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" event={"ID":"a4c4483b-abb1-4c04-9092-d8f04755bc93","Type":"ContainerStarted","Data":"e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f"} Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.686760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.698115 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.720879 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: E0121 15:54:30.722958 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerName="galera" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.722984 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerName="galera" Jan 21 15:54:30 crc kubenswrapper[4707]: E0121 15:54:30.723023 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="mysql-bootstrap" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.723031 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="mysql-bootstrap" Jan 21 15:54:30 crc kubenswrapper[4707]: E0121 15:54:30.723077 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="galera" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.723084 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="galera" Jan 21 15:54:30 crc kubenswrapper[4707]: E0121 15:54:30.723106 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerName="mysql-bootstrap" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.723111 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerName="mysql-bootstrap" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.723356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" containerName="galera" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.723415 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70efa8f-44f9-48d8-8beb-531c69225631" containerName="galera" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.728325 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.734248 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-m7p4k" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.734395 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.734885 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.739088 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.750826 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.132:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.755829 4707 scope.go:117] "RemoveContainer" containerID="84534c9d37714a2ecd0730e455dcc3dc74f38b7ed3f5fbd32c9d21d6b444b1c6" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.769207 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.788353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.803866 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.887307 4707 scope.go:117] "RemoveContainer" containerID="43d29cce317d9140ea844e0fa8a01802f3581985166c5b4847f73a0d0cd4079e" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.896509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjnl\" (UniqueName: \"kubernetes.io/projected/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kube-api-access-fsjnl\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.922933 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.924535 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podStartSLOduration=3.924517731 podStartE2EDuration="3.924517731s" podCreationTimestamp="2026-01-21 15:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:30.636208723 +0000 UTC m=+3167.817724944" watchObservedRunningTime="2026-01-21 15:54:30.924517731 +0000 UTC m=+3168.106033953" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.926756 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.926927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.927088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.927194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.927217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.927243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.927615 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.960246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.960302 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.969213 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.972925 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.979305 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-dr5v9" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.979617 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.985110 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:54:30 crc kubenswrapper[4707]: I0121 15:54:30.985361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.018904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.024018 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" podStartSLOduration=4.023998536 podStartE2EDuration="4.023998536s" podCreationTimestamp="2026-01-21 15:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:30.733311765 +0000 UTC m=+3167.914827988" watchObservedRunningTime="2026-01-21 15:54:31.023998536 +0000 UTC m=+3168.205514758" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.030857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb55j\" (UniqueName: \"kubernetes.io/projected/e5d83529-68a5-4c04-9d24-4524d92f1efb-kube-api-access-lb55j\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjnl\" (UniqueName: \"kubernetes.io/projected/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kube-api-access-fsjnl\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.031976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.032076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.032171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.032218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.032510 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.035516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.036163 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.036360 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener-log" containerID="cri-o://ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.036733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.036756 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener" containerID="cri-o://b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.045270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.047314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.057167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.057698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjnl\" (UniqueName: \"kubernetes.io/projected/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kube-api-access-fsjnl\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.077632 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.078183 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-cb46d4bdd-s4nhw"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.090036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-0\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134417 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb55j\" (UniqueName: \"kubernetes.io/projected/e5d83529-68a5-4c04-9d24-4524d92f1efb-kube-api-access-lb55j\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.135622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.134932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.135732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.135765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.135915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.136091 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.139474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.141334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.145658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.154914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb55j\" (UniqueName: \"kubernetes.io/projected/e5d83529-68a5-4c04-9d24-4524d92f1efb-kube-api-access-lb55j\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.186351 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:54:31 crc kubenswrapper[4707]: E0121 15:54:31.186661 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.191325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.231062 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ec6ded-9697-471a-80d6-9ed5fdd9918e" path="/var/lib/kubelet/pods/01ec6ded-9697-471a-80d6-9ed5fdd9918e/volumes" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.231832 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38825357-2812-4711-bc86-7638075f8daa" path="/var/lib/kubelet/pods/38825357-2812-4711-bc86-7638075f8daa/volumes" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.232827 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e21981d-0581-4e7c-9485-cfea4bffaa9d" path="/var/lib/kubelet/pods/7e21981d-0581-4e7c-9485-cfea4bffaa9d/volumes" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.233668 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f27e5ec-c6d9-4b5d-bc14-1818306024f3" path="/var/lib/kubelet/pods/8f27e5ec-c6d9-4b5d-bc14-1818306024f3/volumes" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.234319 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d70efa8f-44f9-48d8-8beb-531c69225631" path="/var/lib/kubelet/pods/d70efa8f-44f9-48d8-8beb-531c69225631/volumes" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.235341 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb5dfe8-ec47-4bb6-b6c0-403095118b07" path="/var/lib/kubelet/pods/ddb5dfe8-ec47-4bb6-b6c0-403095118b07/volumes" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.322916 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.367789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.494222 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.562861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.563391 4707 scope.go:117] "RemoveContainer" containerID="ebb015e4da971718a1807a27f20362fd1ce0ce3b2c3b4d48cc3e743f028aa7fd" Jan 21 15:54:31 crc kubenswrapper[4707]: E0121 15:54:31.570166 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.589591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.632112 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.641344 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.648250 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.648450 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-log" containerID="cri-o://a448697007e558fe58e7db109d533620b53bc7ebefe7dfe620af036f04aec9a2" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.648576 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-httpd" containerID="cri-o://a9b30dfab469cfd98d55d611bfc65839a6329f4c91af56c02e35444e7c210451" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.714557 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.761679 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:31 crc 
kubenswrapper[4707]: I0121 15:54:31.762247 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-httpd" containerID="cri-o://721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.761953 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-log" containerID="cri-o://458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.784609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3","Type":"ContainerStarted","Data":"a8accdc28b79cf7186bb104739e5401435f0d83f38d2889dbd2f860bcd015fc3"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.784643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3","Type":"ContainerStarted","Data":"bec143db53af283eadda7d685670091d76206e3e5d85336132185744cf6647c2"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.792144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"910c3c7f-e041-4efa-93f7-607f23a14bb7","Type":"ContainerStarted","Data":"26ebac10c4ccec7634080fa327059e6b9379a8b6887c1ceb3d03955ff2d8f89c"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.792248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"910c3c7f-e041-4efa-93f7-607f23a14bb7","Type":"ContainerStarted","Data":"7e225c44d31e0ecd598ab142e897367eddbb156998b69dc9264a1e0fafbda81d"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.796799 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.808835 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" podUID="16882f47-6637-4e33-82a3-17da2f553dda" containerName="keystone-api" containerID="cri-o://d15803325d6a389441fe6a55dbaa6c70c8ec4a8581d8dc66243c60e1d055805a" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.810767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerStarted","Data":"2fabda75fc3aa04b829aa45c860680c47715c9dbed14545eeda6d5ec18d2460e"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.810980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerStarted","Data":"f62e21515126150bf107422ad5dd88cd5dd40ac544ec2f1453d0195c8eb84574"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.821343 4707 generic.go:334] "Generic (PLEG): container finished" podID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerID="e4cf394238e37ac5d15301506b18d0ba417d6a80b24ad0f97a43b3434616dd5a" exitCode=143 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.821411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" event={"ID":"b7d3be32-a1ae-4a91-bf04-051fd3dde8af","Type":"ContainerDied","Data":"e4cf394238e37ac5d15301506b18d0ba417d6a80b24ad0f97a43b3434616dd5a"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.849490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"fc1f6b7f-a124-49b5-9083-fb4d33a57c57","Type":"ContainerStarted","Data":"0d67ca09b4e6d92b871be25030df5e25caa4be171966f29d06f33d16cdbaf85f"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.861453 4707 generic.go:334] "Generic (PLEG): container finished" podID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerID="ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297" exitCode=143 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.861602 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="658de89d-042f-452c-a654-586154c683a4" containerName="memcached" containerID="cri-o://9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.861679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" event={"ID":"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea","Type":"ContainerDied","Data":"ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297"} Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.862640 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-log" containerID="cri-o://4a36a4cbd1862f2d84340378f60d7801d4f238035b05f3aa1e05e062a795ec01" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.864166 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-metadata" containerID="cri-o://664218f2c763fb08ecde9d7d311a3e55c24f62c7de90c8590719e316a1fa1de6" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.873603 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.148:8775/\": EOF" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.873723 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.148:8775/\": EOF" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.874216 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6f9f7564f-zdrdh"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.875360 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.943367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f9f7564f-zdrdh"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.962576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.962683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-scripts\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.962744 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-fernet-keys\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.962840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-credential-keys\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.962880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.962926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.963008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5hn\" (UniqueName: \"kubernetes.io/projected/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-kube-api-access-cz5hn\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.963029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-config-data\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:31 crc 
kubenswrapper[4707]: I0121 15:54:31.963982 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-77c8997466-bd6bs"] Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.964341 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-api" containerID="cri-o://0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.964592 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-httpd" containerID="cri-o://c50519e027e1683f85b799afca4576a3862101aad86a8a98197856394645b0ed" gracePeriod=30 Jan 21 15:54:31 crc kubenswrapper[4707]: I0121 15:54:31.996123 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-76756f697d-jwhvs"] Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.022422 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.027965 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.056228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-76756f697d-jwhvs"] Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-config\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-config-data\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5hn\" (UniqueName: 
\"kubernetes.io/projected/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-kube-api-access-cz5hn\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-combined-ca-bundle\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-httpd-config\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-public-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ngm8\" (UniqueName: \"kubernetes.io/projected/c1ba53ab-abb3-455d-979b-cad1eab8e90a-kube-api-access-2ngm8\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-scripts\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-fernet-keys\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-internal-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070887 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-ovndb-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.070938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-credential-keys\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.082794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-credential-keys\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.095958 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-config-data\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.096751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.098335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5hn\" (UniqueName: \"kubernetes.io/projected/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-kube-api-access-cz5hn\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.099655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.099939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-scripts\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.100433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-fernet-keys\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.110823 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.111324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle\") pod \"keystone-6f9f7564f-zdrdh\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: W0121 15:54:32.116532 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec10a8f4_f5e6_4484_8c37_db290585f9b1.slice/crio-7dcb147429f61c94cdd8287ed0331c84ae66aa782c88e77279eba5f213e98eba WatchSource:0}: Error finding container 7dcb147429f61c94cdd8287ed0331c84ae66aa782c88e77279eba5f213e98eba: Status 404 returned error can't find the container with id 7dcb147429f61c94cdd8287ed0331c84ae66aa782c88e77279eba5f213e98eba Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.132989 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.188602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-internal-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.188838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-ovndb-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.188888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-config\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.188937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-combined-ca-bundle\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.188957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-httpd-config\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.188978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-public-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 
15:54:32.189019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ngm8\" (UniqueName: \"kubernetes.io/projected/c1ba53ab-abb3-455d-979b-cad1eab8e90a-kube-api-access-2ngm8\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.205467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-combined-ca-bundle\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.206612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-ovndb-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.206751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-config\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.207175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-httpd-config\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.210668 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.210830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-public-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.211767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-internal-tls-certs\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.238493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ngm8\" (UniqueName: \"kubernetes.io/projected/c1ba53ab-abb3-455d-979b-cad1eab8e90a-kube-api-access-2ngm8\") pod \"neutron-76756f697d-jwhvs\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: E0121 15:54:32.517136 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:32 crc kubenswrapper[4707]: E0121 15:54:32.523878 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:32 crc kubenswrapper[4707]: E0121 15:54:32.525161 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:54:32 crc kubenswrapper[4707]: E0121 15:54:32.525192 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="ovn-northd" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.551556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.581918 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.613565 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.700273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-combined-ca-bundle\") pod \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.700605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7c4q\" (UniqueName: \"kubernetes.io/projected/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-kube-api-access-k7c4q\") pod \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.700693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-config-data\") pod \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\" (UID: \"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce\") " Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.700717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-config-data\") pod \"fb3e6e1d-19d1-4310-907f-a252e900b54a\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.700756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql7zz\" (UniqueName: \"kubernetes.io/projected/fb3e6e1d-19d1-4310-907f-a252e900b54a-kube-api-access-ql7zz\") pod \"fb3e6e1d-19d1-4310-907f-a252e900b54a\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.700790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-combined-ca-bundle\") pod \"fb3e6e1d-19d1-4310-907f-a252e900b54a\" (UID: \"fb3e6e1d-19d1-4310-907f-a252e900b54a\") " Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.756089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3e6e1d-19d1-4310-907f-a252e900b54a-kube-api-access-ql7zz" (OuterVolumeSpecName: "kube-api-access-ql7zz") pod "fb3e6e1d-19d1-4310-907f-a252e900b54a" (UID: "fb3e6e1d-19d1-4310-907f-a252e900b54a"). InnerVolumeSpecName "kube-api-access-ql7zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.768547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-kube-api-access-k7c4q" (OuterVolumeSpecName: "kube-api-access-k7c4q") pod "eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" (UID: "eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce"). InnerVolumeSpecName "kube-api-access-k7c4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.809140 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql7zz\" (UniqueName: \"kubernetes.io/projected/fb3e6e1d-19d1-4310-907f-a252e900b54a-kube-api-access-ql7zz\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.809169 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7c4q\" (UniqueName: \"kubernetes.io/projected/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-kube-api-access-k7c4q\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.884880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3e6e1d-19d1-4310-907f-a252e900b54a","Type":"ContainerDied","Data":"277a414f7fc85d83b0d0b7fec77b265156ca3796bce840a3ef15e4cd3698ba0f"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.885065 4707 scope.go:117] "RemoveContainer" containerID="d595cfcd7f15bd2970f5ab9f24d5e2ca4feaa5a5ba6498c06a7eec79734797d8" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.885220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.894141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce","Type":"ContainerDied","Data":"c19fb5832a4a02b512a1442f31b4a80dff24397da2f0b5e2a11fb314c3a4a2e7"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.894272 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.900320 4707 generic.go:334] "Generic (PLEG): container finished" podID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerID="a9b30dfab469cfd98d55d611bfc65839a6329f4c91af56c02e35444e7c210451" exitCode=0 Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.900344 4707 generic.go:334] "Generic (PLEG): container finished" podID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerID="a448697007e558fe58e7db109d533620b53bc7ebefe7dfe620af036f04aec9a2" exitCode=143 Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.900391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d3e6822-57a9-4dbe-a3c2-9fb531886f22","Type":"ContainerDied","Data":"a9b30dfab469cfd98d55d611bfc65839a6329f4c91af56c02e35444e7c210451"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.900413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d3e6822-57a9-4dbe-a3c2-9fb531886f22","Type":"ContainerDied","Data":"a448697007e558fe58e7db109d533620b53bc7ebefe7dfe620af036f04aec9a2"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.910651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"fc1f6b7f-a124-49b5-9083-fb4d33a57c57","Type":"ContainerStarted","Data":"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.914174 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.920147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e5d83529-68a5-4c04-9d24-4524d92f1efb","Type":"ContainerStarted","Data":"107d073faec47bff764b520341b7578e3791f19a154324c2181ba6abef285c06"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.921570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"ec10a8f4-f5e6-4484-8c37-db290585f9b1","Type":"ContainerStarted","Data":"7dcb147429f61c94cdd8287ed0331c84ae66aa782c88e77279eba5f213e98eba"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.939506 4707 generic.go:334] "Generic (PLEG): container finished" podID="7160b3be-b878-41e6-b4ad-530f998a5185" containerID="721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d" exitCode=0 Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.939534 4707 generic.go:334] "Generic (PLEG): container finished" podID="7160b3be-b878-41e6-b4ad-530f998a5185" containerID="458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7" exitCode=143 Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.939608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7160b3be-b878-41e6-b4ad-530f998a5185","Type":"ContainerDied","Data":"721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.939633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7160b3be-b878-41e6-b4ad-530f998a5185","Type":"ContainerDied","Data":"458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.939696 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.945885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3","Type":"ContainerStarted","Data":"ae1887bc18dba44ce0e0dd605872e724ae817aeba984af267f9e8e85c4a01696"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.956090 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.956210 4707 generic.go:334] "Generic (PLEG): container finished" podID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerID="4a36a4cbd1862f2d84340378f60d7801d4f238035b05f3aa1e05e062a795ec01" exitCode=143 Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.956251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"68a74a76-0bd8-48e1-87c9-778b381f3fa0","Type":"ContainerDied","Data":"4a36a4cbd1862f2d84340378f60d7801d4f238035b05f3aa1e05e062a795ec01"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.964348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"910c3c7f-e041-4efa-93f7-607f23a14bb7","Type":"ContainerStarted","Data":"b263696ec070b984748b66e2f231cf93cda412825af0d39c15ed53b1bccf445b"} Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.974547 4707 generic.go:334] "Generic (PLEG): container finished" podID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerID="c50519e027e1683f85b799afca4576a3862101aad86a8a98197856394645b0ed" exitCode=0 Jan 21 15:54:32 crc kubenswrapper[4707]: I0121 15:54:32.974578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" event={"ID":"c197e98a-7956-4c34-a31a-911a28cf4a1a","Type":"ContainerDied","Data":"c50519e027e1683f85b799afca4576a3862101aad86a8a98197856394645b0ed"} Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.001063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f9f7564f-zdrdh"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.012441 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=4.012422912 podStartE2EDuration="4.012422912s" podCreationTimestamp="2026-01-21 15:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:32.98128562 +0000 UTC m=+3170.162801843" watchObservedRunningTime="2026-01-21 15:54:33.012422912 +0000 UTC m=+3170.193939134" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.015568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-combined-ca-bundle\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.015635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-internal-tls-certs\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.015698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkdw4\" (UniqueName: \"kubernetes.io/projected/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-kube-api-access-wkdw4\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.015739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.015774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-config-data\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.015798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-logs\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.016406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-logs" (OuterVolumeSpecName: "logs") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.016522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-scripts\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.016715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.016835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-httpd-run\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.016910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-combined-ca-bundle\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-scripts\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.016105 4707 scope.go:117] "RemoveContainer" containerID="ebb015e4da971718a1807a27f20362fd1ce0ce3b2c3b4d48cc3e743f028aa7fd" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwgd\" (UniqueName: \"kubernetes.io/projected/7160b3be-b878-41e6-b4ad-530f998a5185-kube-api-access-sgwgd\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-config-data\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-logs\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-public-tls-certs\") pod \"7160b3be-b878-41e6-b4ad-530f998a5185\" (UID: \"7160b3be-b878-41e6-b4ad-530f998a5185\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.017680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-httpd-run\") pod \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\" (UID: \"3d3e6822-57a9-4dbe-a3c2-9fb531886f22\") " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.018963 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.019103 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.027846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-logs" (OuterVolumeSpecName: "logs") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.030292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.040437 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=4.040406823 podStartE2EDuration="4.040406823s" podCreationTimestamp="2026-01-21 15:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:33.016544693 +0000 UTC m=+3170.198060915" watchObservedRunningTime="2026-01-21 15:54:33.040406823 +0000 UTC m=+3170.221923046" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.049523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.051327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-scripts" (OuterVolumeSpecName: "scripts") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.052133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7160b3be-b878-41e6-b4ad-530f998a5185-kube-api-access-sgwgd" (OuterVolumeSpecName: "kube-api-access-sgwgd") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "kube-api-access-sgwgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.053538 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-kube-api-access-wkdw4" (OuterVolumeSpecName: "kube-api-access-wkdw4") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "kube-api-access-wkdw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.058753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-scripts" (OuterVolumeSpecName: "scripts") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.075168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-config-data" (OuterVolumeSpecName: "config-data") pod "fb3e6e1d-19d1-4310-907f-a252e900b54a" (UID: "fb3e6e1d-19d1-4310-907f-a252e900b54a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.078562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" (UID: "eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.078744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121835 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkdw4\" (UniqueName: \"kubernetes.io/projected/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-kube-api-access-wkdw4\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121880 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121891 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121904 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121913 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121922 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121934 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwgd\" (UniqueName: \"kubernetes.io/projected/7160b3be-b878-41e6-b4ad-530f998a5185-kube-api-access-sgwgd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121942 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7160b3be-b878-41e6-b4ad-530f998a5185-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121952 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.121960 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc 
kubenswrapper[4707]: I0121 15:54:33.291016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.296063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3e6e1d-19d1-4310-907f-a252e900b54a" (UID: "fb3e6e1d-19d1-4310-907f-a252e900b54a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.309869 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.328669 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.344861 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.355319 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.355395 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.355473 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3e6e1d-19d1-4310-907f-a252e900b54a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.399617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-config-data" (OuterVolumeSpecName: "config-data") pod "eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" (UID: "eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.417942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.434122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-config-data" (OuterVolumeSpecName: "config-data") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.436129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.458351 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.458381 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.458392 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.458403 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.459440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-config-data" (OuterVolumeSpecName: "config-data") pod "3d3e6822-57a9-4dbe-a3c2-9fb531886f22" (UID: "3d3e6822-57a9-4dbe-a3c2-9fb531886f22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.477184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7160b3be-b878-41e6-b4ad-530f998a5185" (UID: "7160b3be-b878-41e6-b4ad-530f998a5185"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.479518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-76756f697d-jwhvs"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.531593 4707 scope.go:117] "RemoveContainer" containerID="721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.534799 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.544954 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.557139 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.557683 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.557765 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.557886 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.557970 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.558048 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.558116 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.558184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-log" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.558244 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-log" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.558318 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-httpd" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.558381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-httpd" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.558460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-httpd" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.558526 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-httpd" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.558599 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" 
containerName="glance-log" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.558655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-log" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.561401 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-httpd" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.561695 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-log" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.561804 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.561920 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.561994 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" containerName="glance-httpd" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.562078 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.562159 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" containerName="glance-log" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.562244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.562786 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e6822-57a9-4dbe-a3c2-9fb531886f22-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.562839 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7160b3be-b878-41e6-b4ad-530f998a5185-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.581577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.581694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.583499 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.584029 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.589228 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.589514 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.592337 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.592844 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.593446 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.593466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" containerName="nova-cell0-conductor-conductor" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.594424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.599953 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.618492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.627969 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.631376 4707 scope.go:117] "RemoveContainer" containerID="458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.638130 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.648889 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.650580 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.660276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.660514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.661543 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.670222 4707 scope.go:117] "RemoveContainer" containerID="721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.677912 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d\": container with ID starting with 721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d not found: ID does not exist" containerID="721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.678027 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d"} err="failed to get container status \"721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d\": rpc error: code = NotFound desc = could not find container \"721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d\": container with ID starting with 721049b6ed60908078336c66b60f68828cf5a3eff8baa79659c61b5ca162c99d not found: ID does not exist" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.678120 4707 scope.go:117] "RemoveContainer" containerID="458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.687609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.687936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.688134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.688161 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4xm\" (UniqueName: 
\"kubernetes.io/projected/03adcc99-dff8-4703-bacf-25c6576428f9-kube-api-access-nt4xm\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.688325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvz8v\" (UniqueName: \"kubernetes.io/projected/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-kube-api-access-wvz8v\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.688578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.692993 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7\": container with ID starting with 458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7 not found: ID does not exist" containerID="458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.693041 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7"} err="failed to get container status \"458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7\": rpc error: code = NotFound desc = could not find container \"458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7\": container with ID starting with 458c054f340fc51b4336b533f1c539ecba61fd835875deb0200e8255ee87e1c7 not found: ID does not exist" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.790333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pvt2\" (UniqueName: \"kubernetes.io/projected/8b901f78-f277-4b06-85d0-536e5a88427d-kube-api-access-9pvt2\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.790863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-logs\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.790994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4xm\" (UniqueName: \"kubernetes.io/projected/03adcc99-dff8-4703-bacf-25c6576428f9-kube-api-access-nt4xm\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.791924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.792113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvz8v\" (UniqueName: \"kubernetes.io/projected/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-kube-api-access-wvz8v\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.792210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.792321 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.792428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.797712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.800915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.801132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.801219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.818096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4xm\" (UniqueName: \"kubernetes.io/projected/03adcc99-dff8-4703-bacf-25c6576428f9-kube-api-access-nt4xm\") pod \"nova-cell0-conductor-0\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.840041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvz8v\" (UniqueName: \"kubernetes.io/projected/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-kube-api-access-wvz8v\") pod \"nova-cell1-conductor-0\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.863524 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.111:9696/\": dial tcp 10.217.0.111:9696: connect: connection refused" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.876233 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:33 crc kubenswrapper[4707]: E0121 15:54:33.876981 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-9pvt2 logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="8b901f78-f277-4b06-85d0-536e5a88427d" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pvt2\" (UniqueName: \"kubernetes.io/projected/8b901f78-f277-4b06-85d0-536e5a88427d-kube-api-access-9pvt2\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-logs\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.895303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.896038 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.896198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.899068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-logs\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.901051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.901410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.901887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.902057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.923802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pvt2\" (UniqueName: \"kubernetes.io/projected/8b901f78-f277-4b06-85d0-536e5a88427d-kube-api-access-9pvt2\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.947672 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.961901 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.970665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.999940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d3e6822-57a9-4dbe-a3c2-9fb531886f22","Type":"ContainerDied","Data":"c0b957e00447a6f2a25b953c88a6d91577001fa23f450181dc353f51189ad916"} Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.999972 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:33 crc kubenswrapper[4707]: I0121 15:54:33.999991 4707 scope.go:117] "RemoveContainer" containerID="a9b30dfab469cfd98d55d611bfc65839a6329f4c91af56c02e35444e7c210451" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.002833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e5d83529-68a5-4c04-9d24-4524d92f1efb","Type":"ContainerStarted","Data":"e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.003983 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.064125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"fc1f6b7f-a124-49b5-9083-fb4d33a57c57","Type":"ContainerStarted","Data":"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.065157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.077907 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.078423 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.093662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" event={"ID":"c1ba53ab-abb3-455d-979b-cad1eab8e90a","Type":"ContainerStarted","Data":"310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.093709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" 
event={"ID":"c1ba53ab-abb3-455d-979b-cad1eab8e90a","Type":"ContainerStarted","Data":"6ddc0dfa01a3bd9696865be0af3cf90bf3c02e7d3e1fd17c4b3a8c19c5ca9f93"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.102981 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.102964989 podStartE2EDuration="5.102964989s" podCreationTimestamp="2026-01-21 15:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:34.097389144 +0000 UTC m=+3171.278905367" watchObservedRunningTime="2026-01-21 15:54:34.102964989 +0000 UTC m=+3171.284481211" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.106919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"ec10a8f4-f5e6-4484-8c37-db290585f9b1","Type":"ContainerStarted","Data":"aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.124343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" event={"ID":"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85","Type":"ContainerStarted","Data":"b6eb35a562dda2cde1c5f1def59c03da57bb93b3257929860e2eadfa4f05fa41"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.124556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" event={"ID":"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85","Type":"ContainerStarted","Data":"8b952ccd01371b3e89fdc853ed7872b91d4d20965606e37c429f031d8647070a"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.124930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.164120 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" podStartSLOduration=3.164103976 podStartE2EDuration="3.164103976s" podCreationTimestamp="2026-01-21 15:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:34.150381001 +0000 UTC m=+3171.331897223" watchObservedRunningTime="2026-01-21 15:54:34.164103976 +0000 UTC m=+3171.345620198" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.203179 4707 scope.go:117] "RemoveContainer" containerID="a448697007e558fe58e7db109d533620b53bc7ebefe7dfe620af036f04aec9a2" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.251659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerStarted","Data":"d6070f3e197144a2901111d863557e8496a035ad9a3fc78850139ba433b250fc"} Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.251709 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.252365 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.350460 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.354307 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.404881 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.408268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-httpd-run\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.408366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-combined-ca-bundle\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.408433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.408515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pvt2\" (UniqueName: \"kubernetes.io/projected/8b901f78-f277-4b06-85d0-536e5a88427d-kube-api-access-9pvt2\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.408567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-public-tls-certs\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.409075 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.414796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-scripts\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.414895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-config-data\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.414996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-logs\") pod \"8b901f78-f277-4b06-85d0-536e5a88427d\" (UID: \"8b901f78-f277-4b06-85d0-536e5a88427d\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.417567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.418576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-logs" (OuterVolumeSpecName: "logs") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.422326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.423412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.424897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.442898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.443209 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.447790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.448649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-scripts" (OuterVolumeSpecName: "scripts") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.450357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-config-data" (OuterVolumeSpecName: "config-data") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.456048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b901f78-f277-4b06-85d0-536e5a88427d-kube-api-access-9pvt2" (OuterVolumeSpecName: "kube-api-access-9pvt2") pod "8b901f78-f277-4b06-85d0-536e5a88427d" (UID: "8b901f78-f277-4b06-85d0-536e5a88427d"). InnerVolumeSpecName "kube-api-access-9pvt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.521838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.522105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.522162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6kh4\" (UniqueName: \"kubernetes.io/projected/262e4109-657a-4194-a805-051b54efb1b8-kube-api-access-z6kh4\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.522240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.522271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523632 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523648 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pvt2\" (UniqueName: \"kubernetes.io/projected/8b901f78-f277-4b06-85d0-536e5a88427d-kube-api-access-9pvt2\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523658 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523668 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523677 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523685 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.523696 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b901f78-f277-4b06-85d0-536e5a88427d-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.524317 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b901f78-f277-4b06-85d0-536e5a88427d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.556561 4707 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6kh4\" (UniqueName: \"kubernetes.io/projected/262e4109-657a-4194-a805-051b54efb1b8-kube-api-access-z6kh4\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.626504 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc 
kubenswrapper[4707]: I0121 15:54:34.627950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.635929 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.642185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.644582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.645047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6kh4\" (UniqueName: \"kubernetes.io/projected/262e4109-657a-4194-a805-051b54efb1b8-kube-api-access-z6kh4\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.647901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.649081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.668384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.687275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 
15:54:34.716776 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.725181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.727198 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.778755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.829623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-memcached-tls-certs\") pod \"658de89d-042f-452c-a654-586154c683a4\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.829690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-config-data\") pod \"658de89d-042f-452c-a654-586154c683a4\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.829712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sf74\" (UniqueName: \"kubernetes.io/projected/658de89d-042f-452c-a654-586154c683a4-kube-api-access-6sf74\") pod \"658de89d-042f-452c-a654-586154c683a4\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.829759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-combined-ca-bundle\") pod \"658de89d-042f-452c-a654-586154c683a4\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.829777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-kolla-config\") pod \"658de89d-042f-452c-a654-586154c683a4\" (UID: \"658de89d-042f-452c-a654-586154c683a4\") " Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.830613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "658de89d-042f-452c-a654-586154c683a4" (UID: "658de89d-042f-452c-a654-586154c683a4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.830627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-config-data" (OuterVolumeSpecName: "config-data") pod "658de89d-042f-452c-a654-586154c683a4" (UID: "658de89d-042f-452c-a654-586154c683a4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.835207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658de89d-042f-452c-a654-586154c683a4-kube-api-access-6sf74" (OuterVolumeSpecName: "kube-api-access-6sf74") pod "658de89d-042f-452c-a654-586154c683a4" (UID: "658de89d-042f-452c-a654-586154c683a4"). InnerVolumeSpecName "kube-api-access-6sf74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.859187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "658de89d-042f-452c-a654-586154c683a4" (UID: "658de89d-042f-452c-a654-586154c683a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.866671 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.935148 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.935180 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.935191 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658de89d-042f-452c-a654-586154c683a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.935200 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sf74\" (UniqueName: \"kubernetes.io/projected/658de89d-042f-452c-a654-586154c683a4-kube-api-access-6sf74\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.944455 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.948998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:34 crc kubenswrapper[4707]: I0121 15:54:34.950101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "658de89d-042f-452c-a654-586154c683a4" (UID: "658de89d-042f-452c-a654-586154c683a4"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.036959 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658de89d-042f-452c-a654-586154c683a4-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.194239 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3e6822-57a9-4dbe-a3c2-9fb531886f22" path="/var/lib/kubelet/pods/3d3e6822-57a9-4dbe-a3c2-9fb531886f22/volumes" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.194905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7160b3be-b878-41e6-b4ad-530f998a5185" path="/var/lib/kubelet/pods/7160b3be-b878-41e6-b4ad-530f998a5185/volumes" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.195595 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce" path="/var/lib/kubelet/pods/eb3faf0a-bc0f-4ee1-a443-a05eff6d30ce/volumes" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.196523 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3e6e1d-19d1-4310-907f-a252e900b54a" path="/var/lib/kubelet/pods/fb3e6e1d-19d1-4310-907f-a252e900b54a/volumes" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.234974 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.259991 4707 generic.go:334] "Generic (PLEG): container finished" podID="658de89d-042f-452c-a654-586154c683a4" containerID="9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670" exitCode=0 Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.260109 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.260207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"658de89d-042f-452c-a654-586154c683a4","Type":"ContainerDied","Data":"9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.260281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"658de89d-042f-452c-a654-586154c683a4","Type":"ContainerDied","Data":"49cfe85510907338c092c956a9f9c82bb24b988f5082366b31d67ee24a07eab5"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.260306 4707 scope.go:117] "RemoveContainer" containerID="9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.261467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerStarted","Data":"a64ac7833066130e5e138ec3361f259f17f723cd49acdba0e6a895ea068c41ab"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.264629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerStarted","Data":"48264db1c0390508e51bad8d608c82fa44e5dc5b6fe6dad192148f5eaaf31fca"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.267540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" event={"ID":"c1ba53ab-abb3-455d-979b-cad1eab8e90a","Type":"ContainerStarted","Data":"2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.267597 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.274218 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_206e3d62-bdaa-4326-8c66-a615e81200d8/ovn-northd/0.log" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.274268 4707 generic.go:334] "Generic (PLEG): container finished" podID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerID="7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0" exitCode=139 Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.274339 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.274657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"206e3d62-bdaa-4326-8c66-a615e81200d8","Type":"ContainerDied","Data":"7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.274695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"206e3d62-bdaa-4326-8c66-a615e81200d8","Type":"ContainerDied","Data":"9039c2bd551565742c91068961f983f235ff686d5ec57640672a242876ff99ea"} Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.274707 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9039c2bd551565742c91068961f983f235ff686d5ec57640672a242876ff99ea" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.284895 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_206e3d62-bdaa-4326-8c66-a615e81200d8/ovn-northd/0.log" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.285176 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.301693 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" podStartSLOduration=4.301678003 podStartE2EDuration="4.301678003s" podCreationTimestamp="2026-01-21 15:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:35.290204478 +0000 UTC m=+3172.471720700" watchObservedRunningTime="2026-01-21 15:54:35.301678003 +0000 UTC m=+3172.483194245" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.308110 4707 scope.go:117] "RemoveContainer" containerID="9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670" Jan 21 15:54:35 crc kubenswrapper[4707]: E0121 15:54:35.308506 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670\": container with ID starting with 9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670 not found: ID does not exist" containerID="9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.308546 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670"} err="failed to get container status \"9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670\": rpc error: code = NotFound desc = could not find container \"9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670\": container with ID starting with 9bbcb3912f7f7743ab1fafc7fe68459b4139127ffc7e77f0a424283256e88670 not found: ID does not exist" Jan 21 15:54:35 crc kubenswrapper[4707]: W0121 15:54:35.335215 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod262e4109_657a_4194_a805_051b54efb1b8.slice/crio-4fb8af9d3db715aff455c8f13f9c1ad95556cd1246ec7f4b159073b370ee3d1e WatchSource:0}: Error finding container 4fb8af9d3db715aff455c8f13f9c1ad95556cd1246ec7f4b159073b370ee3d1e: 
Status 404 returned error can't find the container with id 4fb8af9d3db715aff455c8f13f9c1ad95556cd1246ec7f4b159073b370ee3d1e Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.343480 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.447467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7hbm\" (UniqueName: \"kubernetes.io/projected/206e3d62-bdaa-4326-8c66-a615e81200d8-kube-api-access-p7hbm\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.447663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-combined-ca-bundle\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-rundir\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-config\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-scripts\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-metrics-certs-tls-certs\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-northd-tls-certs\") pod \"206e3d62-bdaa-4326-8c66-a615e81200d8\" (UID: \"206e3d62-bdaa-4326-8c66-a615e81200d8\") " Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.448737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-scripts" (OuterVolumeSpecName: "scripts") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.449497 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.449512 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.451344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-config" (OuterVolumeSpecName: "config") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.472722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206e3d62-bdaa-4326-8c66-a615e81200d8-kube-api-access-p7hbm" (OuterVolumeSpecName: "kube-api-access-p7hbm") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). InnerVolumeSpecName "kube-api-access-p7hbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.528000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.553931 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.553959 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/206e3d62-bdaa-4326-8c66-a615e81200d8-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.553969 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7hbm\" (UniqueName: \"kubernetes.io/projected/206e3d62-bdaa-4326-8c66-a615e81200d8-kube-api-access-p7hbm\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.589179 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.608587 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.621331 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.650863 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.670026 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: E0121 15:54:35.671993 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658de89d-042f-452c-a654-586154c683a4" containerName="memcached" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.672106 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="658de89d-042f-452c-a654-586154c683a4" containerName="memcached" Jan 21 15:54:35 crc kubenswrapper[4707]: E0121 15:54:35.672188 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="openstack-network-exporter" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.672247 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="openstack-network-exporter" Jan 21 15:54:35 crc kubenswrapper[4707]: E0121 15:54:35.672324 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="ovn-northd" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.672382 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="ovn-northd" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.673101 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="658de89d-042f-452c-a654-586154c683a4" containerName="memcached" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.673241 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="ovn-northd" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.673346 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" containerName="openstack-network-exporter" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 
15:54:35.682882 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.684738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.686534 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.690917 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.691974 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.694123 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.695475 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.695607 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.695723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-2x7h6" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.700065 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.719419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.721044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "206e3d62-bdaa-4326-8c66-a615e81200d8" (UID: "206e3d62-bdaa-4326-8c66-a615e81200d8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.763433 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.763580 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/206e3d62-bdaa-4326-8c66-a615e81200d8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.794003 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.132:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.866304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-config-data\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.866672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbknm\" (UniqueName: \"kubernetes.io/projected/56a9521e-4024-4a1d-8b87-66d305c22eb4-kube-api-access-jbknm\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.866740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.866802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.866847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-kolla-config\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.866973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp5g2\" (UniqueName: \"kubernetes.io/projected/448bc08a-61e8-48bb-be55-256be3a355a2-kube-api-access-xp5g2\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867002 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.867509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-logs\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.944822 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.949162 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbknm\" (UniqueName: \"kubernetes.io/projected/56a9521e-4024-4a1d-8b87-66d305c22eb4-kube-api-access-jbknm\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 
15:54:35.969123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-kolla-config\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp5g2\" (UniqueName: \"kubernetes.io/projected/448bc08a-61e8-48bb-be55-256be3a355a2-kube-api-access-xp5g2\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969438 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-logs\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-config-data\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.969589 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.970122 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-config-data\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.970405 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.971490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-kolla-config\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.975135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-logs\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.981769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.984355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.984505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.985053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.986317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbknm\" (UniqueName: \"kubernetes.io/projected/56a9521e-4024-4a1d-8b87-66d305c22eb4-kube-api-access-jbknm\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.986962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.987333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.988486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp5g2\" (UniqueName: \"kubernetes.io/projected/448bc08a-61e8-48bb-be55-256be3a355a2-kube-api-access-xp5g2\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.989099 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:35 crc kubenswrapper[4707]: I0121 15:54:35.998151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.030364 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.035293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.293037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerStarted","Data":"544d5ca76e0fb54d974186a2110285e44bbf348f7f71b6273678005e1f776565"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.294219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.298518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerStarted","Data":"441cc55aae74c46696d852f6104493f16644abc2b957eb769aafc7cfbc0e94f3"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.298666 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.300638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerStarted","Data":"cae2309d9aaf78a8c9bafad7a7e7b8fa9f20cadfa5abaa88cafa500489be2946"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.301388 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.308408 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=3.308387861 podStartE2EDuration="3.308387861s" podCreationTimestamp="2026-01-21 15:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:36.307640335 +0000 UTC m=+3173.489156557" watchObservedRunningTime="2026-01-21 15:54:36.308387861 +0000 UTC m=+3173.489904073" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.308795 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerID="e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e" exitCode=0 Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.308917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e5d83529-68a5-4c04-9d24-4524d92f1efb","Type":"ContainerDied","Data":"e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.315304 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerID="aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972" exitCode=0 Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.315422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"ec10a8f4-f5e6-4484-8c37-db290585f9b1","Type":"ContainerDied","Data":"aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.321916 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.325077 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.326430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"262e4109-657a-4194-a805-051b54efb1b8","Type":"ContainerStarted","Data":"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.326457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"262e4109-657a-4194-a805-051b54efb1b8","Type":"ContainerStarted","Data":"4fb8af9d3db715aff455c8f13f9c1ad95556cd1246ec7f4b159073b370ee3d1e"} Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.326930 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api-log" containerID="cri-o://0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31" gracePeriod=30 Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.327084 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api" containerID="cri-o://98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c" gracePeriod=30 Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.359635 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.359616132 podStartE2EDuration="3.359616132s" podCreationTimestamp="2026-01-21 15:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:36.321771387 +0000 UTC m=+3173.503287609" watchObservedRunningTime="2026-01-21 15:54:36.359616132 +0000 UTC m=+3173.541132354" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.386108 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.011046221 podStartE2EDuration="8.386091556s" podCreationTimestamp="2026-01-21 15:54:28 +0000 UTC" firstStartedPulling="2026-01-21 15:54:29.657855747 +0000 UTC m=+3166.839371969" lastFinishedPulling="2026-01-21 15:54:35.032901081 +0000 UTC m=+3172.214417304" observedRunningTime="2026-01-21 15:54:36.339418291 +0000 UTC m=+3173.520934513" watchObservedRunningTime="2026-01-21 15:54:36.386091556 +0000 UTC m=+3173.567607778" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.396226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.398551 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.495237 4707 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.501099 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.509648 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.511184 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.516449 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.516589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-q9bf5" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.516692 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.517930 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.522124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.583830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.589871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-scripts\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.589937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9tb\" (UniqueName: \"kubernetes.io/projected/99baeb41-a129-437e-b701-c2a769595666-kube-api-access-5q9tb\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.589971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99baeb41-a129-437e-b701-c2a769595666-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.590045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.590083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " 
pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.590112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.590257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-config\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.693692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-scripts\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.693999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9tb\" (UniqueName: \"kubernetes.io/projected/99baeb41-a129-437e-b701-c2a769595666-kube-api-access-5q9tb\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.694027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99baeb41-a129-437e-b701-c2a769595666-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.694080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.694128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.694150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.694211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-config\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.695010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-config\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.695560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-scripts\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.695697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99baeb41-a129-437e-b701-c2a769595666-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.698005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.698552 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.703915 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: E0121 15:54:36.704753 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5q9tb metrics-certs-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ovn-northd-0" podUID="99baeb41-a129-437e-b701-c2a769595666" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.708498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.720841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9tb\" (UniqueName: \"kubernetes.io/projected/99baeb41-a129-437e-b701-c2a769595666-kube-api-access-5q9tb\") pod \"ovn-northd-0\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.869541 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.929674 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:54:36 crc kubenswrapper[4707]: I0121 15:54:36.987779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.077794 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.104642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-public-tls-certs\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.104705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-etc-machine-id\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.104779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.104900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-scripts\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.104974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data-custom\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.105004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxms8\" (UniqueName: \"kubernetes.io/projected/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-kube-api-access-dxms8\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.105048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-internal-tls-certs\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.105065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-combined-ca-bundle\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.105120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-logs\") pod \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\" (UID: \"fc1f6b7f-a124-49b5-9083-fb4d33a57c57\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.105861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-logs" (OuterVolumeSpecName: "logs") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.106009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.111990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.112021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-scripts" (OuterVolumeSpecName: "scripts") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.115757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-kube-api-access-dxms8" (OuterVolumeSpecName: "kube-api-access-dxms8") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "kube-api-access-dxms8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.136399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.170138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data" (OuterVolumeSpecName: "config-data") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.184931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.187076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc1f6b7f-a124-49b5-9083-fb4d33a57c57" (UID: "fc1f6b7f-a124-49b5-9083-fb4d33a57c57"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.191856 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="206e3d62-bdaa-4326-8c66-a615e81200d8" path="/var/lib/kubelet/pods/206e3d62-bdaa-4326-8c66-a615e81200d8/volumes" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.192926 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658de89d-042f-452c-a654-586154c683a4" path="/var/lib/kubelet/pods/658de89d-042f-452c-a654-586154c683a4/volumes" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.193488 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b901f78-f277-4b06-85d0-536e5a88427d" path="/var/lib/kubelet/pods/8b901f78-f277-4b06-85d0-536e5a88427d/volumes" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207578 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207605 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207614 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207622 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207631 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207643 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxms8\" (UniqueName: \"kubernetes.io/projected/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-kube-api-access-dxms8\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207652 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207661 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.207669 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fc1f6b7f-a124-49b5-9083-fb4d33a57c57-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346422 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerID="98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c" exitCode=0 Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346697 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346706 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerID="0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31" exitCode=143 Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"fc1f6b7f-a124-49b5-9083-fb4d33a57c57","Type":"ContainerDied","Data":"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"fc1f6b7f-a124-49b5-9083-fb4d33a57c57","Type":"ContainerDied","Data":"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"fc1f6b7f-a124-49b5-9083-fb4d33a57c57","Type":"ContainerDied","Data":"0d67ca09b4e6d92b871be25030df5e25caa4be171966f29d06f33d16cdbaf85f"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.346858 4707 scope.go:117] "RemoveContainer" containerID="98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.349433 4707 generic.go:334] "Generic (PLEG): container finished" podID="16882f47-6637-4e33-82a3-17da2f553dda" containerID="d15803325d6a389441fe6a55dbaa6c70c8ec4a8581d8dc66243c60e1d055805a" exitCode=0 Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.349481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" event={"ID":"16882f47-6637-4e33-82a3-17da2f553dda","Type":"ContainerDied","Data":"d15803325d6a389441fe6a55dbaa6c70c8ec4a8581d8dc66243c60e1d055805a"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.359074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e5d83529-68a5-4c04-9d24-4524d92f1efb","Type":"ContainerStarted","Data":"2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.368654 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.373919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"ec10a8f4-f5e6-4484-8c37-db290585f9b1","Type":"ContainerStarted","Data":"dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.382698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"262e4109-657a-4194-a805-051b54efb1b8","Type":"ContainerStarted","Data":"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.382749 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.382879 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-log" containerID="cri-o://e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64" gracePeriod=30 Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.383133 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-httpd" containerID="cri-o://661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71" gracePeriod=30 Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.391589 4707 scope.go:117] "RemoveContainer" containerID="0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.394586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"56a9521e-4024-4a1d-8b87-66d305c22eb4","Type":"ContainerStarted","Data":"7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.394633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"56a9521e-4024-4a1d-8b87-66d305c22eb4","Type":"ContainerStarted","Data":"dafe48a2197071c0507f0af076c626d5035a6ed80ff2862cca486423004ed811"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.394907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.403493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"448bc08a-61e8-48bb-be55-256be3a355a2","Type":"ContainerStarted","Data":"656409fcf40297cb92024510355ff8bb824028e49035d0f352c2fb57af41d414"} Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.404365 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.426388 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:37 crc kubenswrapper[4707]: E0121 15:54:37.426856 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.426868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api" Jan 21 15:54:37 crc kubenswrapper[4707]: E0121 15:54:37.426919 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api-log" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.426926 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api-log" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.427098 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.427119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" containerName="cinder-api-log" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.428131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.430330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.430425 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.430627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.434051 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.434024792 podStartE2EDuration="7.434024792s" podCreationTimestamp="2026-01-21 15:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:37.386623278 +0000 UTC m=+3174.568139499" watchObservedRunningTime="2026-01-21 15:54:37.434024792 +0000 UTC m=+3174.615541014" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.434308 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.439297 4707 scope.go:117] "RemoveContainer" containerID="98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c" Jan 21 15:54:37 crc kubenswrapper[4707]: E0121 15:54:37.443023 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c\": container with ID starting with 98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c not found: ID does not exist" containerID="98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.443074 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c"} err="failed to get container status \"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c\": rpc error: code = NotFound desc = could not find container \"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c\": container with ID starting with 98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c not found: ID does not exist" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.443095 4707 scope.go:117] "RemoveContainer" containerID="0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31" Jan 21 15:54:37 crc kubenswrapper[4707]: E0121 15:54:37.443585 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31\": container with ID starting with 0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31 not found: ID does not exist" containerID="0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.443608 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31"} err="failed to get container status \"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31\": rpc error: code = NotFound desc = could not find container \"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31\": container with ID starting with 0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31 not found: ID does not exist" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.443648 4707 scope.go:117] "RemoveContainer" containerID="98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.443854 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c"} err="failed to get container status \"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c\": rpc error: code = NotFound desc = could not find container \"98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c\": container with ID starting with 98852752e2e54c0181b533fb8625c5bb261e83be49fec924f8892f467f90da0c not found: ID does not exist" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.443896 4707 scope.go:117] "RemoveContainer" containerID="0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.444074 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31"} err="failed to get container status \"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31\": rpc error: code = NotFound desc = could not find container \"0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31\": container with ID starting with 0bae26d194d2318e41763c1bf93189059e6ea92c2f81f51ad468e12f6aa1cc31 not found: ID does not exist" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.465916 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.476037 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=7.476017274 podStartE2EDuration="7.476017274s" podCreationTimestamp="2026-01-21 15:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:37.410110363 +0000 UTC m=+3174.591626585" watchObservedRunningTime="2026-01-21 15:54:37.476017274 +0000 UTC m=+3174.657533497" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.482566 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.482551201 podStartE2EDuration="3.482551201s" podCreationTimestamp="2026-01-21 15:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:37.426288024 +0000 UTC m=+3174.607804245" watchObservedRunningTime="2026-01-21 15:54:37.482551201 +0000 UTC m=+3174.664067423" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.493333 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.4933198 podStartE2EDuration="2.4933198s" podCreationTimestamp="2026-01-21 15:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:37.446580371 +0000 UTC m=+3174.628096594" watchObservedRunningTime="2026-01-21 15:54:37.4933198 +0000 UTC m=+3174.674836023" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.514768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99baeb41-a129-437e-b701-c2a769595666-ovn-rundir\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.514829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-scripts\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.514878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-metrics-certs-tls-certs\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.514899 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-config\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.515523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-scripts" (OuterVolumeSpecName: "scripts") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.515564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-ovn-northd-tls-certs\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.515747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9tb\" (UniqueName: \"kubernetes.io/projected/99baeb41-a129-437e-b701-c2a769595666-kube-api-access-5q9tb\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.515791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-combined-ca-bundle\") pod \"99baeb41-a129-437e-b701-c2a769595666\" (UID: \"99baeb41-a129-437e-b701-c2a769595666\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-scripts\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b59f3c9f-adf5-49f3-8f0b-62044f91902c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data-custom\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsvpq\" (UniqueName: \"kubernetes.io/projected/b59f3c9f-adf5-49f3-8f0b-62044f91902c-kube-api-access-fsvpq\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b59f3c9f-adf5-49f3-8f0b-62044f91902c-logs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99baeb41-a129-437e-b701-c2a769595666-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.516665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.517566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-config" (OuterVolumeSpecName: "config") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.520968 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99baeb41-a129-437e-b701-c2a769595666-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.520989 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.520999 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99baeb41-a129-437e-b701-c2a769595666-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.526235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.526411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99baeb41-a129-437e-b701-c2a769595666-kube-api-access-5q9tb" (OuterVolumeSpecName: "kube-api-access-5q9tb") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "kube-api-access-5q9tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.526692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.532859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "99baeb41-a129-437e-b701-c2a769595666" (UID: "99baeb41-a129-437e-b701-c2a769595666"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.571696 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": read tcp 10.217.0.2:42846->10.217.0.135:9311: read: connection reset by peer" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.572503 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": dial tcp 10.217.0.135:9311: connect: connection refused" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.573207 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": read tcp 10.217.0.2:42856->10.217.0.135:9311: read: connection reset by peer" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.573819 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": dial tcp 10.217.0.135:9311: connect: connection refused" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.574086 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.135:9311/healthcheck\": dial tcp 10.217.0.135:9311: connect: connection refused" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsvpq\" (UniqueName: \"kubernetes.io/projected/b59f3c9f-adf5-49f3-8f0b-62044f91902c-kube-api-access-fsvpq\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 
15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b59f3c9f-adf5-49f3-8f0b-62044f91902c-logs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-scripts\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b59f3c9f-adf5-49f3-8f0b-62044f91902c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data-custom\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623877 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623892 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623904 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99baeb41-a129-437e-b701-c2a769595666-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.623913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9tb\" (UniqueName: \"kubernetes.io/projected/99baeb41-a129-437e-b701-c2a769595666-kube-api-access-5q9tb\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.625898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b59f3c9f-adf5-49f3-8f0b-62044f91902c-logs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.626141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b59f3c9f-adf5-49f3-8f0b-62044f91902c-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.633578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-scripts\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.633848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.634726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.636723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data-custom\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.639773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.640238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.646607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsvpq\" (UniqueName: \"kubernetes.io/projected/b59f3c9f-adf5-49f3-8f0b-62044f91902c-kube-api-access-fsvpq\") pod \"cinder-api-0\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.695636 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.758013 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.833779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-internal-tls-certs\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.833899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-fernet-keys\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.834025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-credential-keys\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.834248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-combined-ca-bundle\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.834306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-public-tls-certs\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.834377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-scripts\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.834475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knx4t\" (UniqueName: \"kubernetes.io/projected/16882f47-6637-4e33-82a3-17da2f553dda-kube-api-access-knx4t\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.834510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-config-data\") pod \"16882f47-6637-4e33-82a3-17da2f553dda\" (UID: \"16882f47-6637-4e33-82a3-17da2f553dda\") " Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.843631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-scripts" (OuterVolumeSpecName: "scripts") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.843702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16882f47-6637-4e33-82a3-17da2f553dda-kube-api-access-knx4t" (OuterVolumeSpecName: "kube-api-access-knx4t") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "kube-api-access-knx4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.846031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.856982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.912924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.923653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-config-data" (OuterVolumeSpecName: "config-data") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.932506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941194 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941221 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941230 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941240 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941249 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knx4t\" (UniqueName: \"kubernetes.io/projected/16882f47-6637-4e33-82a3-17da2f553dda-kube-api-access-knx4t\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941267 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.941275 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.943368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16882f47-6637-4e33-82a3-17da2f553dda" (UID: "16882f47-6637-4e33-82a3-17da2f553dda"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.974578 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.982985 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": EOF" Jan 21 15:54:37 crc kubenswrapper[4707]: I0121 15:54:37.983001 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": EOF" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-scripts\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-httpd-run\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-config-data\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-internal-tls-certs\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-combined-ca-bundle\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-logs\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.043733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6kh4\" (UniqueName: \"kubernetes.io/projected/262e4109-657a-4194-a805-051b54efb1b8-kube-api-access-z6kh4\") pod \"262e4109-657a-4194-a805-051b54efb1b8\" (UID: \"262e4109-657a-4194-a805-051b54efb1b8\") " Jan 21 15:54:38 crc 
kubenswrapper[4707]: I0121 15:54:38.043779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.044598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-logs" (OuterVolumeSpecName: "logs") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.045017 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16882f47-6637-4e33-82a3-17da2f553dda-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.045036 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.045047 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/262e4109-657a-4194-a805-051b54efb1b8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.053617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.062012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-scripts" (OuterVolumeSpecName: "scripts") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.062130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262e4109-657a-4194-a805-051b54efb1b8-kube-api-access-z6kh4" (OuterVolumeSpecName: "kube-api-access-z6kh4") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "kube-api-access-z6kh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.073178 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.110:8778/\": EOF" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.073643 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.110:8778/\": EOF" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.099618 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.139:8778/\": net/http: TLS handshake timeout" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.141904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.152063 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.152114 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.152132 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6kh4\" (UniqueName: \"kubernetes.io/projected/262e4109-657a-4194-a805-051b54efb1b8-kube-api-access-z6kh4\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.152143 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.180977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.196618 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.222745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-config-data" (OuterVolumeSpecName: "config-data") pod "262e4109-657a-4194-a805-051b54efb1b8" (UID: "262e4109-657a-4194-a805-051b54efb1b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.257243 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.257294 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.257312 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262e4109-657a-4194-a805-051b54efb1b8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.384775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.423005 4707 generic.go:334] "Generic (PLEG): container finished" podID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerID="7f6ca570badd360c4e550f469d4e2d6df888ec83adc2f833b5160cc1738f2b45" exitCode=137 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.423081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" event={"ID":"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a","Type":"ContainerDied","Data":"7f6ca570badd360c4e550f469d4e2d6df888ec83adc2f833b5160cc1738f2b45"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.427335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" event={"ID":"16882f47-6637-4e33-82a3-17da2f553dda","Type":"ContainerDied","Data":"4473e83d9a8597f2bac4f36ff83e27194ed426055bac0014b63791e44f1531fe"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.427461 4707 scope.go:117] "RemoveContainer" containerID="d15803325d6a389441fe6a55dbaa6c70c8ec4a8581d8dc66243c60e1d055805a" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.427626 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.439053 4707 generic.go:334] "Generic (PLEG): container finished" podID="262e4109-657a-4194-a805-051b54efb1b8" containerID="661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71" exitCode=0 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.439080 4707 generic.go:334] "Generic (PLEG): container finished" podID="262e4109-657a-4194-a805-051b54efb1b8" containerID="e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64" exitCode=143 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.439119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"262e4109-657a-4194-a805-051b54efb1b8","Type":"ContainerDied","Data":"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.439144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"262e4109-657a-4194-a805-051b54efb1b8","Type":"ContainerDied","Data":"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.439154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"262e4109-657a-4194-a805-051b54efb1b8","Type":"ContainerDied","Data":"4fb8af9d3db715aff455c8f13f9c1ad95556cd1246ec7f4b159073b370ee3d1e"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.439227 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.447051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"448bc08a-61e8-48bb-be55-256be3a355a2","Type":"ContainerStarted","Data":"4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.458054 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerID="298713c226aa15237d7cd72cb0e956083e7716fdc3d3e28167b1f08dfa128879" exitCode=137 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.458111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" event={"ID":"4ab385ad-11d0-4da8-bf53-70997a2b9fa3","Type":"ContainerDied","Data":"298713c226aa15237d7cd72cb0e956083e7716fdc3d3e28167b1f08dfa128879"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.462427 4707 generic.go:334] "Generic (PLEG): container finished" podID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerID="b5b35b997c87a59fe1f8a09cb7639122fe8e23066d82a87103fb4be086b8ce2e" exitCode=137 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.462474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" event={"ID":"3769b2e3-4438-441b-867e-65d6dbd6e36c","Type":"ContainerDied","Data":"b5b35b997c87a59fe1f8a09cb7639122fe8e23066d82a87103fb4be086b8ce2e"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.470556 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerID="a4d345f4aef43876ec0d8f9747eadb254c4cfb035eb0ac23f9808a04b5c70dd0" exitCode=0 Jan 21 15:54:38 crc 
kubenswrapper[4707]: I0121 15:54:38.470604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" event={"ID":"fb142568-b2b7-467a-9b33-fc6c3bf725e3","Type":"ContainerDied","Data":"a4d345f4aef43876ec0d8f9747eadb254c4cfb035eb0ac23f9808a04b5c70dd0"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.470627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" event={"ID":"fb142568-b2b7-467a-9b33-fc6c3bf725e3","Type":"ContainerDied","Data":"deaad5d8b622cef5089ba0038af4eeccc61b7dde827d3b035c88608a44bf63f7"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.470638 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deaad5d8b622cef5089ba0038af4eeccc61b7dde827d3b035c88608a44bf63f7" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.477789 4707 generic.go:334] "Generic (PLEG): container finished" podID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerID="fcfc18351f1445f8f4c492a625c615d629a4ab60f1b0501cb64068fcf7a7d318" exitCode=137 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.477875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" event={"ID":"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb","Type":"ContainerDied","Data":"fcfc18351f1445f8f4c492a625c615d629a4ab60f1b0501cb64068fcf7a7d318"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.486217 4707 generic.go:334] "Generic (PLEG): container finished" podID="03adcc99-dff8-4703-bacf-25c6576428f9" containerID="cae2309d9aaf78a8c9bafad7a7e7b8fa9f20cadfa5abaa88cafa500489be2946" exitCode=1 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.486241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerDied","Data":"cae2309d9aaf78a8c9bafad7a7e7b8fa9f20cadfa5abaa88cafa500489be2946"} Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.487208 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.487448 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="openstack-network-exporter" containerID="cri-o://ae1887bc18dba44ce0e0dd605872e724ae817aeba984af267f9e8e85c4a01696" gracePeriod=300 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.488054 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="openstack-network-exporter" containerID="cri-o://b263696ec070b984748b66e2f231cf93cda412825af0d39c15ed53b1bccf445b" gracePeriod=300 Jan 21 15:54:38 crc kubenswrapper[4707]: I0121 15:54:38.488369 4707 scope.go:117] "RemoveContainer" containerID="cae2309d9aaf78a8c9bafad7a7e7b8fa9f20cadfa5abaa88cafa500489be2946" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.575611 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="ovsdbserver-nb" containerID="cri-o://26ebac10c4ccec7634080fa327059e6b9379a8b6887c1ceb3d03955ff2d8f89c" gracePeriod=300 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.606094 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="ovsdbserver-sb" containerID="cri-o://a8accdc28b79cf7186bb104739e5401435f0d83f38d2889dbd2f860bcd015fc3" gracePeriod=300 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.606365 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.624736 4707 scope.go:117] "RemoveContainer" containerID="661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.648038 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:35482->192.168.25.165:36655: write tcp 192.168.25.165:35482->192.168.25.165:36655: write: broken pipe Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.668522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vbgc\" (UniqueName: \"kubernetes.io/projected/fb142568-b2b7-467a-9b33-fc6c3bf725e3-kube-api-access-9vbgc\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.668881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-internal-tls-certs\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.668939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-public-tls-certs\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.669085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.669279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data-custom\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.669481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb142568-b2b7-467a-9b33-fc6c3bf725e3-logs\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.669535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-combined-ca-bundle\") pod \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\" (UID: \"fb142568-b2b7-467a-9b33-fc6c3bf725e3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.674907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb142568-b2b7-467a-9b33-fc6c3bf725e3-logs" (OuterVolumeSpecName: "logs") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.674944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.681387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb142568-b2b7-467a-9b33-fc6c3bf725e3-kube-api-access-9vbgc" (OuterVolumeSpecName: "kube-api-access-9vbgc") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "kube-api-access-9vbgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.723003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.735274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data" (OuterVolumeSpecName: "config-data") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.750532 4707 scope.go:117] "RemoveContainer" containerID="e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.772967 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.772991 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.773003 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb142568-b2b7-467a-9b33-fc6c3bf725e3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.773012 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.773020 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vbgc\" (UniqueName: \"kubernetes.io/projected/fb142568-b2b7-467a-9b33-fc6c3bf725e3-kube-api-access-9vbgc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.774557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.784390 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.787054 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.794428 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.801203 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.803349 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.815554 4707 scope.go:117] "RemoveContainer" containerID="661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.815613 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.815947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.815958 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.815979 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.815985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker-log" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.815997 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-httpd" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.815994 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71\": container with ID starting with 661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71 not found: ID does not exist" containerID="661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816025 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71"} err="failed to get container status \"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71\": rpc error: code = NotFound desc = could not find container \"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71\": container with ID starting with 661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71 not found: ID does not exist" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816046 4707 scope.go:117] "RemoveContainer" containerID="e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-httpd" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816134 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816144 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816158 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 
15:54:38.816164 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816180 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816185 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816206 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816211 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener-log" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816223 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816228 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816238 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-log" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816258 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16882f47-6637-4e33-82a3-17da2f553dda" containerName="keystone-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816273 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="16882f47-6637-4e33-82a3-17da2f553dda" containerName="keystone-api" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.816286 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816291 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816568 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816579 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816593 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="262e4109-657a-4194-a805-051b54efb1b8" containerName="glance-httpd" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816599 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816607 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" 
Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816626 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816638 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" containerName="barbican-keystone-listener-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816648 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="16882f47-6637-4e33-82a3-17da2f553dda" containerName="keystone-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816659 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" containerName="barbican-worker" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.816669 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" containerName="barbican-api" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.818972 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64\": container with ID starting with e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64 not found: ID does not exist" containerID="e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.818993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64"} err="failed to get container status \"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64\": rpc error: code = NotFound desc = could not find container \"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64\": container with ID starting with e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64 not found: ID does not exist" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.819009 4707 scope.go:117] "RemoveContainer" containerID="661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.820039 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.820645 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.821993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71"} err="failed to get container status \"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71\": rpc error: code = NotFound desc = could not find container \"661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71\": container with ID starting with 661a9f87ed8d001170a0812a319c93d8207563740dd5627ef0d90d0d8bda3b71 not found: ID does not exist" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.822012 4707 scope.go:117] "RemoveContainer" containerID="e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.822991 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.823135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.823314 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64"} err="failed to get container status \"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64\": rpc error: code = NotFound desc = could not find container \"e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64\": container with ID starting with e1f21b18c640336b04153f9ab61b4835322d7d0057e9fa1ad70776cc78c5dc64 not found: ID does not exist" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.826883 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6fc7ff9f9-jtsmj"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.831583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb142568-b2b7-467a-9b33-fc6c3bf725e3" (UID: "fb142568-b2b7-467a-9b33-fc6c3bf725e3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.831960 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.834306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.854016 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.862873 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqswk\" (UniqueName: \"kubernetes.io/projected/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-kube-api-access-nqswk\") pod \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-logs\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrnpq\" (UniqueName: \"kubernetes.io/projected/3769b2e3-4438-441b-867e-65d6dbd6e36c-kube-api-access-jrnpq\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data\") pod \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-public-tls-certs\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-logs\") pod \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894196 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data-custom\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-config-data\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zf5hd\" (UniqueName: \"kubernetes.io/projected/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-kube-api-access-zf5hd\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data\") pod \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data-custom\") pod \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-scripts\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-logs\") pod \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-combined-ca-bundle\") pod \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flzvf\" (UniqueName: \"kubernetes.io/projected/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-kube-api-access-flzvf\") pod \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-combined-ca-bundle\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-internal-tls-certs\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-internal-tls-certs\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769b2e3-4438-441b-867e-65d6dbd6e36c-logs\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-combined-ca-bundle\") pod \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\" (UID: \"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-public-tls-certs\") pod \"3769b2e3-4438-441b-867e-65d6dbd6e36c\" (UID: \"3769b2e3-4438-441b-867e-65d6dbd6e36c\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data-custom\") pod \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\" (UID: \"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.894742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-combined-ca-bundle\") pod \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\" (UID: \"4ab385ad-11d0-4da8-bf53-70997a2b9fa3\") " Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.895940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.896056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67wc\" (UniqueName: \"kubernetes.io/projected/6c4d63f6-935c-47e2-92c0-4e27286253ba-kube-api-access-f67wc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.896332 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.896345 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb142568-b2b7-467a-9b33-fc6c3bf725e3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.897076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-logs" (OuterVolumeSpecName: "logs") pod "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" (UID: "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.897111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3769b2e3-4438-441b-867e-65d6dbd6e36c-logs" (OuterVolumeSpecName: "logs") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.910195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-logs" (OuterVolumeSpecName: "logs") pod "4ab385ad-11d0-4da8-bf53-70997a2b9fa3" (UID: "4ab385ad-11d0-4da8-bf53-70997a2b9fa3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.918620 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.918885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-logs" (OuterVolumeSpecName: "logs") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.920543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-kube-api-access-flzvf" (OuterVolumeSpecName: "kube-api-access-flzvf") pod "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" (UID: "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb"). InnerVolumeSpecName "kube-api-access-flzvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.921928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.922064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3769b2e3-4438-441b-867e-65d6dbd6e36c-kube-api-access-jrnpq" (OuterVolumeSpecName: "kube-api-access-jrnpq") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "kube-api-access-jrnpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.927905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-kube-api-access-nqswk" (OuterVolumeSpecName: "kube-api-access-nqswk") pod "4ab385ad-11d0-4da8-bf53-70997a2b9fa3" (UID: "4ab385ad-11d0-4da8-bf53-70997a2b9fa3"). InnerVolumeSpecName "kube-api-access-nqswk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.927929 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.927952 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-log" Jan 21 15:54:39 crc kubenswrapper[4707]: E0121 15:54:38.927984 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.927997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.929086 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.929117 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" containerName="placement-api" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.942719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-scripts" (OuterVolumeSpecName: "scripts") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.944214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" (UID: "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.960943 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.987584 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.987861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4ab385ad-11d0-4da8-bf53-70997a2b9fa3" (UID: "4ab385ad-11d0-4da8-bf53-70997a2b9fa3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.988754 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.988827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-q9bf5" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.989372 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.990439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67wc\" (UniqueName: \"kubernetes.io/projected/6c4d63f6-935c-47e2-92c0-4e27286253ba-kube-api-access-f67wc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 
21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999629 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999639 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999649 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flzvf\" (UniqueName: \"kubernetes.io/projected/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-kube-api-access-flzvf\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999659 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769b2e3-4438-441b-867e-65d6dbd6e36c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999668 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999676 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqswk\" (UniqueName: \"kubernetes.io/projected/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-kube-api-access-nqswk\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999684 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999694 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrnpq\" (UniqueName: \"kubernetes.io/projected/3769b2e3-4438-441b-867e-65d6dbd6e36c-kube-api-access-jrnpq\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999702 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999711 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:38.999720 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.000116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.000214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.002957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-kube-api-access-zf5hd" (OuterVolumeSpecName: "kube-api-access-zf5hd") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "kube-api-access-zf5hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.003526 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.023204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.037645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.038251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.040915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.042387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67wc\" (UniqueName: \"kubernetes.io/projected/6c4d63f6-935c-47e2-92c0-4e27286253ba-kube-api-access-f67wc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.044206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" (UID: "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.081543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-config\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-scripts\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.101962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72czz\" (UniqueName: \"kubernetes.io/projected/51bd9d97-e28f-4b29-a43b-0496f5fb517e-kube-api-access-72czz\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.102407 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf5hd\" (UniqueName: \"kubernetes.io/projected/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-kube-api-access-zf5hd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.102449 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.102460 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.104312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.110421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ab385ad-11d0-4da8-bf53-70997a2b9fa3" (UID: "4ab385ad-11d0-4da8-bf53-70997a2b9fa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.158348 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.188820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.200435 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16882f47-6637-4e33-82a3-17da2f553dda" path="/var/lib/kubelet/pods/16882f47-6637-4e33-82a3-17da2f553dda/volumes" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.201034 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262e4109-657a-4194-a805-051b54efb1b8" path="/var/lib/kubelet/pods/262e4109-657a-4194-a805-051b54efb1b8/volumes" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.201632 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99baeb41-a129-437e-b701-c2a769595666" path="/var/lib/kubelet/pods/99baeb41-a129-437e-b701-c2a769595666/volumes" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.202465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1f6b7f-a124-49b5-9083-fb4d33a57c57" path="/var/lib/kubelet/pods/fc1f6b7f-a124-49b5-9083-fb4d33a57c57/volumes" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-scripts\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72czz\" (UniqueName: \"kubernetes.io/projected/51bd9d97-e28f-4b29-a43b-0496f5fb517e-kube-api-access-72czz\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-config\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-combined-ca-bundle\") pod \"ovn-northd-0\" 
(UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.206991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.207010 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.207026 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.208449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-scripts\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.208587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-config\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.212359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.214409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.230241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.235369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72czz\" (UniqueName: \"kubernetes.io/projected/51bd9d97-e28f-4b29-a43b-0496f5fb517e-kube-api-access-72czz\") pod \"ovn-northd-0\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.241948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-config-data" (OuterVolumeSpecName: "config-data") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.264579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.308095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data" (OuterVolumeSpecName: "config-data") pod "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" (UID: "46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.309753 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.311791 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.311833 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.336368 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data" (OuterVolumeSpecName: "config-data") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.354938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3769b2e3-4438-441b-867e-65d6dbd6e36c" (UID: "3769b2e3-4438-441b-867e-65d6dbd6e36c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.412113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data" (OuterVolumeSpecName: "config-data") pod "4ab385ad-11d0-4da8-bf53-70997a2b9fa3" (UID: "4ab385ad-11d0-4da8-bf53-70997a2b9fa3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.424187 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.424212 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3769b2e3-4438-441b-867e-65d6dbd6e36c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.424221 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab385ad-11d0-4da8-bf53-70997a2b9fa3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.459153 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.469272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.510314 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.512628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" (UID: "d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.525832 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.525904 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.571765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" event={"ID":"4ab385ad-11d0-4da8-bf53-70997a2b9fa3","Type":"ContainerDied","Data":"ccca3340b0447c02009ad6b21494768aa53105116696120210244bb1bc5cc8a2"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.571831 4707 scope.go:117] "RemoveContainer" containerID="298713c226aa15237d7cd72cb0e956083e7716fdc3d3e28167b1f08dfa128879" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.571977 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.617379 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.618223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" event={"ID":"3769b2e3-4438-441b-867e-65d6dbd6e36c","Type":"ContainerDied","Data":"922b718c7c0e1bc6e40ba65052ce022f472ccf8f06c6dbe5534a84e4ffd5f9e1"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.636552 4707 generic.go:334] "Generic (PLEG): container finished" podID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerID="544d5ca76e0fb54d974186a2110285e44bbf348f7f71b6273678005e1f776565" exitCode=1 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.636600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerDied","Data":"544d5ca76e0fb54d974186a2110285e44bbf348f7f71b6273678005e1f776565"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.636996 4707 scope.go:117] "RemoveContainer" containerID="544d5ca76e0fb54d974186a2110285e44bbf348f7f71b6273678005e1f776565" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.646870 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.655976 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.657834 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_910c3c7f-e041-4efa-93f7-607f23a14bb7/ovsdbserver-nb/0.log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.657868 4707 generic.go:334] "Generic (PLEG): container finished" podID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerID="b263696ec070b984748b66e2f231cf93cda412825af0d39c15ed53b1bccf445b" exitCode=2 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.657880 4707 generic.go:334] "Generic (PLEG): container finished" podID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerID="26ebac10c4ccec7634080fa327059e6b9379a8b6887c1ceb3d03955ff2d8f89c" exitCode=143 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.657901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"910c3c7f-e041-4efa-93f7-607f23a14bb7","Type":"ContainerDied","Data":"b263696ec070b984748b66e2f231cf93cda412825af0d39c15ed53b1bccf445b"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.657954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"910c3c7f-e041-4efa-93f7-607f23a14bb7","Type":"ContainerDied","Data":"26ebac10c4ccec7634080fa327059e6b9379a8b6887c1ceb3d03955ff2d8f89c"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.661092 4707 scope.go:117] "RemoveContainer" containerID="4a5772dd092c131ecf81d8ee1a79e452a6686ec0a7bef349d5605a36d1c443da" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.667678 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6ffcdd9b58-khwxb"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 
15:54:39.680150 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.691450 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.691902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerStarted","Data":"0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.692156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.738288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b59f3c9f-adf5-49f3-8f0b-62044f91902c","Type":"ContainerStarted","Data":"40b1b860db6575bed685952498b04a04377e4b919333ddfad12e7e87103c47b9"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.750673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" event={"ID":"46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb","Type":"ContainerDied","Data":"296309b3efc562a705d8545a8884f5798a01ca9bd3d5f40e7c63975c66f5c748"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.750841 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.758195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"448bc08a-61e8-48bb-be55-256be3a355a2","Type":"ContainerStarted","Data":"292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.764856 4707 scope.go:117] "RemoveContainer" containerID="b5b35b997c87a59fe1f8a09cb7639122fe8e23066d82a87103fb4be086b8ce2e" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.785184 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.790641 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-57d775cd97-mhhlv"] Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.819515 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.819493704 podStartE2EDuration="4.819493704s" podCreationTimestamp="2026-01-21 15:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:39.810365559 +0000 UTC m=+3176.991881781" watchObservedRunningTime="2026-01-21 15:54:39.819493704 +0000 UTC m=+3177.001009926" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.832142 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3/ovsdbserver-sb/0.log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.832360 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" 
containerID="ae1887bc18dba44ce0e0dd605872e724ae817aeba984af267f9e8e85c4a01696" exitCode=2 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.832377 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerID="a8accdc28b79cf7186bb104739e5401435f0d83f38d2889dbd2f860bcd015fc3" exitCode=143 Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.832435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3","Type":"ContainerDied","Data":"ae1887bc18dba44ce0e0dd605872e724ae817aeba984af267f9e8e85c4a01696"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.832457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3","Type":"ContainerDied","Data":"a8accdc28b79cf7186bb104739e5401435f0d83f38d2889dbd2f860bcd015fc3"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.873083 4707 scope.go:117] "RemoveContainer" containerID="272acba6d788239712a5071adbac78f9174904c25e843ddb5f470eb2cbaf2e24" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.873888 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.873935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5b758d7546-vzkpl" event={"ID":"d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a","Type":"ContainerDied","Data":"126b84ccf58fbb664aa1a8939c6fdff7132c8e799df526d0c0189f405ef7ef6e"} Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.897498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548968857b-s96qt" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.909433 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3/ovsdbserver-sb/0.log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.909500 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.920068 4707 scope.go:117] "RemoveContainer" containerID="fcfc18351f1445f8f4c492a625c615d629a4ab60f1b0501cb64068fcf7a7d318" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.931632 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_910c3c7f-e041-4efa-93f7-607f23a14bb7/ovsdbserver-nb/0.log" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.931703 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.967902 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:39 crc kubenswrapper[4707]: I0121 15:54:39.992331 4707 scope.go:117] "RemoveContainer" containerID="66872aae878a51a83199443d8ec479ce07708d2a6515a66af88f405708649bdf" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049379 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9cm\" (UniqueName: \"kubernetes.io/projected/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-kube-api-access-6r9cm\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdbserver-sb-tls-certs\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-metrics-certs-tls-certs\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-scripts\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-config\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdb-rundir\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-config\") pod \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-combined-ca-bundle\") pod 
\"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\" (UID: \"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-scripts\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-combined-ca-bundle\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdbserver-nb-tls-certs\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfk6m\" (UniqueName: \"kubernetes.io/projected/910c3c7f-e041-4efa-93f7-607f23a14bb7-kube-api-access-pfk6m\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdb-rundir\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.049925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-metrics-certs-tls-certs\") pod \"910c3c7f-e041-4efa-93f7-607f23a14bb7\" (UID: \"910c3c7f-e041-4efa-93f7-607f23a14bb7\") " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.051958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-config" (OuterVolumeSpecName: "config") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.052393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-scripts" (OuterVolumeSpecName: "scripts") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.052767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-config" (OuterVolumeSpecName: "config") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.053032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.053278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.054077 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-548968857b-s96qt"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.062474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-scripts" (OuterVolumeSpecName: "scripts") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.064934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.070009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-kube-api-access-6r9cm" (OuterVolumeSpecName: "kube-api-access-6r9cm") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "kube-api-access-6r9cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.079964 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-548968857b-s96qt"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.089998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910c3c7f-e041-4efa-93f7-607f23a14bb7-kube-api-access-pfk6m" (OuterVolumeSpecName: "kube-api-access-pfk6m") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "kube-api-access-pfk6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.109280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.136884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5b758d7546-vzkpl"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.150024 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5b758d7546-vzkpl"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159110 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159334 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159414 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r9cm\" (UniqueName: \"kubernetes.io/projected/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-kube-api-access-6r9cm\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159913 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159927 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159936 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159944 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159952 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/910c3c7f-e041-4efa-93f7-607f23a14bb7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159962 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.159971 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfk6m\" (UniqueName: \"kubernetes.io/projected/910c3c7f-e041-4efa-93f7-607f23a14bb7-kube-api-access-pfk6m\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.178366 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.181818 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.182918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.207059 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.213005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.218040 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.234376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "910c3c7f-e041-4efa-93f7-607f23a14bb7" (UID: "910c3c7f-e041-4efa-93f7-607f23a14bb7"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.247406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.247512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" (UID: "4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.250185 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262038 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262123 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262179 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262340 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262391 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262476 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262503 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/910c3c7f-e041-4efa-93f7-607f23a14bb7-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.262519 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.310711 4707 scope.go:117] "RemoveContainer" containerID="7f6ca570badd360c4e550f469d4e2d6df888ec83adc2f833b5160cc1738f2b45" Jan 21 15:54:40 crc kubenswrapper[4707]: E0121 15:54:40.311318 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd85bbe28_55fb_4af7_8dad_8bc6ffdf5d1a.slice/crio-126b84ccf58fbb664aa1a8939c6fdff7132c8e799df526d0c0189f405ef7ef6e\": RecentStats: unable to find data in memory cache]" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.345602 4707 scope.go:117] "RemoveContainer" containerID="5317381dd2ab17c4e6cf8ac5d6c3209a97820d7d93db33f88df2d938cc564924" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.834953 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.132:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:40 crc 
kubenswrapper[4707]: I0121 15:54:40.925069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerStarted","Data":"30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.925146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.929526 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_910c3c7f-e041-4efa-93f7-607f23a14bb7/ovsdbserver-nb/0.log" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.929636 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.931302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"910c3c7f-e041-4efa-93f7-607f23a14bb7","Type":"ContainerDied","Data":"7e225c44d31e0ecd598ab142e897367eddbb156998b69dc9264a1e0fafbda81d"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.931345 4707 scope.go:117] "RemoveContainer" containerID="b263696ec070b984748b66e2f231cf93cda412825af0d39c15ed53b1bccf445b" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.948942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"51bd9d97-e28f-4b29-a43b-0496f5fb517e","Type":"ContainerStarted","Data":"893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.949206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.949223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"51bd9d97-e28f-4b29-a43b-0496f5fb517e","Type":"ContainerStarted","Data":"7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.949233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"51bd9d97-e28f-4b29-a43b-0496f5fb517e","Type":"ContainerStarted","Data":"ffc7a47341df072dafc96be5bbedd44c2d13743f41f4b82253faf9b66816abdf"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.949023 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="openstack-network-exporter" containerID="cri-o://893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317" gracePeriod=30 Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.948978 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="ovn-northd" containerID="cri-o://7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478" gracePeriod=30 Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.963115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b59f3c9f-adf5-49f3-8f0b-62044f91902c","Type":"ContainerStarted","Data":"0f293a8697eb3310b255d467efcfb20b65db8c27463105113488bd5b244efaf7"} Jan 21 15:54:40 crc 
kubenswrapper[4707]: I0121 15:54:40.963157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b59f3c9f-adf5-49f3-8f0b-62044f91902c","Type":"ContainerStarted","Data":"bed84625b1724689ba24d64b9ce564f5af89b171fde3cda59c73d1d35a9d2648"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.963949 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.973514 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.978103 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3/ovsdbserver-sb/0.log" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.978162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3","Type":"ContainerDied","Data":"bec143db53af283eadda7d685670091d76206e3e5d85336132185744cf6647c2"} Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.978227 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.986506 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:40 crc kubenswrapper[4707]: I0121 15:54:40.997499 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.997486084 podStartE2EDuration="2.997486084s" podCreationTimestamp="2026-01-21 15:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:40.987163253 +0000 UTC m=+3178.168679474" watchObservedRunningTime="2026-01-21 15:54:40.997486084 +0000 UTC m=+3178.179002306" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.005883 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:41 crc kubenswrapper[4707]: E0121 15:54:41.006288 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="openstack-network-exporter" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006305 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="openstack-network-exporter" Jan 21 15:54:41 crc kubenswrapper[4707]: E0121 15:54:41.006316 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="ovsdbserver-nb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006322 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="ovsdbserver-nb" Jan 21 15:54:41 crc kubenswrapper[4707]: E0121 15:54:41.006332 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="ovsdbserver-sb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006338 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="ovsdbserver-sb" Jan 21 15:54:41 crc kubenswrapper[4707]: E0121 15:54:41.006346 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="openstack-network-exporter" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006352 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="openstack-network-exporter" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006532 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="ovsdbserver-sb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006550 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="openstack-network-exporter" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006563 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" containerName="ovsdbserver-nb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.006576 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" containerName="openstack-network-exporter" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.007469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6c4d63f6-935c-47e2-92c0-4e27286253ba","Type":"ContainerStarted","Data":"7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e"} Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.007495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6c4d63f6-935c-47e2-92c0-4e27286253ba","Type":"ContainerStarted","Data":"b5294e944a1c6c65e2aa16f2048bc9bbaa5285e76c2a4fca8ff2dc10cfc829c5"} Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.016958 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.017866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.021578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-kdf5s" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.021857 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.022143 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.022275 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.037061 4707 scope.go:117] "RemoveContainer" containerID="26ebac10c4ccec7634080fa327059e6b9379a8b6887c1ceb3d03955ff2d8f89c" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.041225 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.044316 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=4.044297629 podStartE2EDuration="4.044297629s" podCreationTimestamp="2026-01-21 15:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:41.04211251 +0000 UTC m=+3178.223628732" watchObservedRunningTime="2026-01-21 15:54:41.044297629 +0000 UTC m=+3178.225813851" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.095634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-config\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096141 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqqvk\" (UniqueName: \"kubernetes.io/projected/feca9a31-2cdb-456b-8e43-17064ac87815-kube-api-access-rqqvk\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.096336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.200992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201297 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-config\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqqvk\" (UniqueName: \"kubernetes.io/projected/feca9a31-2cdb-456b-8e43-17064ac87815-kube-api-access-rqqvk\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.201338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.211833 4707 scope.go:117] "RemoveContainer" containerID="ae1887bc18dba44ce0e0dd605872e724ae817aeba984af267f9e8e85c4a01696" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.219749 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.223367 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" path="/var/lib/kubelet/pods/3769b2e3-4438-441b-867e-65d6dbd6e36c/volumes" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.233970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.240824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.241334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.242393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-config\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.243128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.248419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.275465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqqvk\" (UniqueName: \"kubernetes.io/projected/feca9a31-2cdb-456b-8e43-17064ac87815-kube-api-access-rqqvk\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.288928 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb" path="/var/lib/kubelet/pods/46ef7fb6-d2ea-4c55-ad87-5d4d49a748bb/volumes" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.297406 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab385ad-11d0-4da8-bf53-70997a2b9fa3" path="/var/lib/kubelet/pods/4ab385ad-11d0-4da8-bf53-70997a2b9fa3/volumes" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.388525 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910c3c7f-e041-4efa-93f7-607f23a14bb7" path="/var/lib/kubelet/pods/910c3c7f-e041-4efa-93f7-607f23a14bb7/volumes" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.389395 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a" path="/var/lib/kubelet/pods/d85bbe28-55fb-4af7-8dad-8bc6ffdf5d1a/volumes" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.392437 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb142568-b2b7-467a-9b33-fc6c3bf725e3" path="/var/lib/kubelet/pods/fb142568-b2b7-467a-9b33-fc6c3bf725e3/volumes" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.393088 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.393117 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.393137 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.401840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.401865 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.401876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.401890 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.404536 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.407635 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.407935 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.408122 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-6hxvs" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.408319 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.436328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.458400 4707 scope.go:117] "RemoveContainer" containerID="a8accdc28b79cf7186bb104739e5401435f0d83f38d2889dbd2f860bcd015fc3" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.464701 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-76756f697d-jwhvs"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.466441 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-api" containerID="cri-o://310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563" gracePeriod=30 Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.476248 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-httpd" containerID="cri-o://2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b" gracePeriod=30 Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.487874 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5b4488769b-2ldzb"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.489681 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.491102 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.495862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5b4488769b-2ldzb"] Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.512304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9sj\" (UniqueName: \"kubernetes.io/projected/a9d297fb-43eb-45d9-a000-55847e5b36cb-kube-api-access-pr9sj\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw5dr\" (UniqueName: \"kubernetes.io/projected/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-kube-api-access-vw5dr\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.662569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.762795 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9sj\" (UniqueName: \"kubernetes.io/projected/a9d297fb-43eb-45d9-a000-55847e5b36cb-kube-api-access-pr9sj\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc 
kubenswrapper[4707]: I0121 15:54:41.764920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.764994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw5dr\" (UniqueName: \"kubernetes.io/projected/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-kube-api-access-vw5dr\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.765018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.765056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.765108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.776878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.777411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.779180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.779476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.780092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.780989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.781922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.782911 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.786445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.787503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.788527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.790148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.797716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw5dr\" (UniqueName: \"kubernetes.io/projected/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-kube-api-access-vw5dr\") pod \"neutron-5b4488769b-2ldzb\" (UID: 
\"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.800116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs\") pod \"neutron-5b4488769b-2ldzb\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.823550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9sj\" (UniqueName: \"kubernetes.io/projected/a9d297fb-43eb-45d9-a000-55847e5b36cb-kube-api-access-pr9sj\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.867123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.896842 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.962648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.969940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-combined-ca-bundle\") pod \"7e31617e-3380-4e4f-9558-ceb4164fd190\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.970061 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e31617e-3380-4e4f-9558-ceb4164fd190-logs\") pod \"7e31617e-3380-4e4f-9558-ceb4164fd190\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.970111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data\") pod \"7e31617e-3380-4e4f-9558-ceb4164fd190\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.970162 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data-custom\") pod \"7e31617e-3380-4e4f-9558-ceb4164fd190\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.970221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjh8d\" (UniqueName: \"kubernetes.io/projected/7e31617e-3380-4e4f-9558-ceb4164fd190-kube-api-access-zjh8d\") pod \"7e31617e-3380-4e4f-9558-ceb4164fd190\" (UID: \"7e31617e-3380-4e4f-9558-ceb4164fd190\") " Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.972017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7e31617e-3380-4e4f-9558-ceb4164fd190-logs" (OuterVolumeSpecName: "logs") pod "7e31617e-3380-4e4f-9558-ceb4164fd190" (UID: "7e31617e-3380-4e4f-9558-ceb4164fd190"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.975116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e31617e-3380-4e4f-9558-ceb4164fd190" (UID: "7e31617e-3380-4e4f-9558-ceb4164fd190"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:41 crc kubenswrapper[4707]: I0121 15:54:41.975216 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.008127 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.018385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e31617e-3380-4e4f-9558-ceb4164fd190-kube-api-access-zjh8d" (OuterVolumeSpecName: "kube-api-access-zjh8d") pod "7e31617e-3380-4e4f-9558-ceb4164fd190" (UID: "7e31617e-3380-4e4f-9558-ceb4164fd190"). InnerVolumeSpecName "kube-api-access-zjh8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.034051 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-internal-tls-certs\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-scripts\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-combined-ca-bundle\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-config-data\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072213 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318a94fd-66d7-4cfb-b312-a348e185ae2e-logs\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072299 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcr2c\" (UniqueName: \"kubernetes.io/projected/318a94fd-66d7-4cfb-b312-a348e185ae2e-kube-api-access-wcr2c\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-public-tls-certs\") pod \"318a94fd-66d7-4cfb-b312-a348e185ae2e\" (UID: \"318a94fd-66d7-4cfb-b312-a348e185ae2e\") " Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072765 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e31617e-3380-4e4f-9558-ceb4164fd190-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072782 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.072792 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjh8d\" (UniqueName: \"kubernetes.io/projected/7e31617e-3380-4e4f-9558-ceb4164fd190-kube-api-access-zjh8d\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.073974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318a94fd-66d7-4cfb-b312-a348e185ae2e-logs" (OuterVolumeSpecName: "logs") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.086934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318a94fd-66d7-4cfb-b312-a348e185ae2e-kube-api-access-wcr2c" (OuterVolumeSpecName: "kube-api-access-wcr2c") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "kube-api-access-wcr2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.087216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e31617e-3380-4e4f-9558-ceb4164fd190" (UID: "7e31617e-3380-4e4f-9558-ceb4164fd190"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.088428 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.096298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-scripts" (OuterVolumeSpecName: "scripts") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.103749 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerID="fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e" exitCode=137 Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.103823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" event={"ID":"7e31617e-3380-4e4f-9558-ceb4164fd190","Type":"ContainerDied","Data":"fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.103852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" event={"ID":"7e31617e-3380-4e4f-9558-ceb4164fd190","Type":"ContainerDied","Data":"702acc4e961c18bafe2ee10a33468a9d5240115ee283e4e31e8cbfd0a20b4a6f"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.103869 4707 scope.go:117] "RemoveContainer" containerID="fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.103966 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.111917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data" (OuterVolumeSpecName: "config-data") pod "7e31617e-3380-4e4f-9558-ceb4164fd190" (UID: "7e31617e-3380-4e4f-9558-ceb4164fd190"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.158683 4707 generic.go:334] "Generic (PLEG): container finished" podID="03adcc99-dff8-4703-bacf-25c6576428f9" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" exitCode=1 Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.158743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerDied","Data":"0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.159439 4707 scope.go:117] "RemoveContainer" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.159715 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(03adcc99-dff8-4703-bacf-25c6576428f9)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.159991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-config-data" (OuterVolumeSpecName: "config-data") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.176066 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.176347 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.176359 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e31617e-3380-4e4f-9558-ceb4164fd190-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.176371 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.176383 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318a94fd-66d7-4cfb-b312-a348e185ae2e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.176393 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcr2c\" (UniqueName: \"kubernetes.io/projected/318a94fd-66d7-4cfb-b312-a348e185ae2e-kube-api-access-wcr2c\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.177025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6c4d63f6-935c-47e2-92c0-4e27286253ba","Type":"ContainerStarted","Data":"c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.184570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.186211 4707 generic.go:334] "Generic (PLEG): container finished" podID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerID="893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317" exitCode=2 Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.186419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"51bd9d97-e28f-4b29-a43b-0496f5fb517e","Type":"ContainerDied","Data":"893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.205901 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerID="2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b" exitCode=0 Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.205979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" event={"ID":"c1ba53ab-abb3-455d-979b-cad1eab8e90a","Type":"ContainerDied","Data":"2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.208391 4707 scope.go:117] "RemoveContainer" containerID="2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.211058 4707 generic.go:334] "Generic (PLEG): container finished" podID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerID="ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527" exitCode=0 Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.211106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" event={"ID":"318a94fd-66d7-4cfb-b312-a348e185ae2e","Type":"ContainerDied","Data":"ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.211128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" event={"ID":"318a94fd-66d7-4cfb-b312-a348e185ae2e","Type":"ContainerDied","Data":"38e1d35a04c5eafa1fb3e2fa41540ef0e683a29abae5339031ce3c86d2b67789"} Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.211209 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-98765b5d4-w967c" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.235867 4707 scope.go:117] "RemoveContainer" containerID="fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e" Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.236770 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e\": container with ID starting with fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e not found: ID does not exist" containerID="fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.236800 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e"} err="failed to get container status \"fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e\": rpc error: code = NotFound desc = could not find container \"fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e\": container with ID starting with fed24e108438fd01b2b82cba4b927a210ffb40054ca956619893e253d054469e not found: ID does not exist" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.236885 4707 scope.go:117] "RemoveContainer" containerID="2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00" Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.237547 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00\": container with ID starting with 2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00 not found: ID does not exist" containerID="2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.237590 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00"} err="failed to get container status \"2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00\": rpc error: code = NotFound desc = could not find container \"2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00\": container with ID starting with 2b2dcdfda05829100a8d7ce7bde90877598db31b34a4831797c3e784e4566c00 not found: ID does not exist" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.237606 4707 scope.go:117] "RemoveContainer" containerID="cae2309d9aaf78a8c9bafad7a7e7b8fa9f20cadfa5abaa88cafa500489be2946" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.287113 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.289836 4707 scope.go:117] "RemoveContainer" containerID="ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.291558 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.291532553 podStartE2EDuration="4.291532553s" podCreationTimestamp="2026-01-21 15:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:42.201393706 +0000 UTC m=+3179.382909927" watchObservedRunningTime="2026-01-21 15:54:42.291532553 +0000 UTC m=+3179.473048775" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.301583 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.306550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.312006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "318a94fd-66d7-4cfb-b312-a348e185ae2e" (UID: "318a94fd-66d7-4cfb-b312-a348e185ae2e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.316153 4707 scope.go:117] "RemoveContainer" containerID="07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.317984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.347395 4707 scope.go:117] "RemoveContainer" containerID="ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.347627 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.347936 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527\": container with ID starting with ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527 not found: ID does not exist" containerID="ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.347967 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527"} err="failed to get container status \"ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527\": rpc error: code = NotFound desc = could not find container \"ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527\": container with ID starting with ae9ed205b642959f1eb8a37ceb21b1eef507992a96477d06b360f670fd76d527 not found: ID does not exist" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.347990 4707 scope.go:117] "RemoveContainer" containerID="07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381" Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.348342 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381\": container with ID starting with 
07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381 not found: ID does not exist" containerID="07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.348376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381"} err="failed to get container status \"07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381\": rpc error: code = NotFound desc = could not find container \"07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381\": container with ID starting with 07b37abeb6aa70ae242aaac6ab353eccc7a83c5628c07e1cfd4a1f9f3771f381 not found: ID does not exist" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.391046 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.391325 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/318a94fd-66d7-4cfb-b312-a348e185ae2e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.470289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5"] Jan 21 15:54:42 crc kubenswrapper[4707]: W0121 15:54:42.479777 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8127e7_3a43_454e_a3c9_190cf9eab3d2.slice/crio-8cf872ec8a96d3164abdd36d413b0fbfbbdf8c240a2af32dced07696d24c22c8 WatchSource:0}: Error finding container 8cf872ec8a96d3164abdd36d413b0fbfbbdf8c240a2af32dced07696d24c22c8: Status 404 returned error can't find the container with id 8cf872ec8a96d3164abdd36d413b0fbfbbdf8c240a2af32dced07696d24c22c8 Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.483830 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7695bf75b5-6fdh5"] Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.498296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5b4488769b-2ldzb"] Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.576710 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-98765b5d4-w967c"] Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.585877 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-98765b5d4-w967c"] Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.649600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.652021 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:42 crc kubenswrapper[4707]: I0121 15:54:42.949886 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.966389 
4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618 is running failed: container process not found" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.967553 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618 is running failed: container process not found" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.967962 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618 is running failed: container process not found" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:54:42 crc kubenswrapper[4707]: E0121 15:54:42.968082 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618 is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.188440 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.210125 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" path="/var/lib/kubelet/pods/318a94fd-66d7-4cfb-b312-a348e185ae2e/volumes" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.210974 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3" path="/var/lib/kubelet/pods/4b4a1f09-6e3d-4468-b389-ab0bdf8e36e3/volumes" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.212653 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" path="/var/lib/kubelet/pods/7e31617e-3380-4e4f-9558-ceb4164fd190/volumes" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.254597 4707 scope.go:117] "RemoveContainer" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" Jan 21 15:54:43 crc kubenswrapper[4707]: E0121 15:54:43.254892 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(03adcc99-dff8-4703-bacf-25c6576428f9)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.282922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a9d297fb-43eb-45d9-a000-55847e5b36cb","Type":"ContainerStarted","Data":"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.283245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a9d297fb-43eb-45d9-a000-55847e5b36cb","Type":"ContainerStarted","Data":"b9cf66ef704e6570e55f16947bb54c9007553f8ab8950a3f9649fc428f1e1140"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.297218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"feca9a31-2cdb-456b-8e43-17064ac87815","Type":"ContainerStarted","Data":"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.297281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"feca9a31-2cdb-456b-8e43-17064ac87815","Type":"ContainerStarted","Data":"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.297294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"feca9a31-2cdb-456b-8e43-17064ac87815","Type":"ContainerStarted","Data":"a39e9bbd1d6760653c6df96779a6c274f806ef15a557d78cde58f9c8b28ca695"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.300836 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerID="7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c" exitCode=137 Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.300893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" 
event={"ID":"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd","Type":"ContainerDied","Data":"7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.300914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" event={"ID":"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd","Type":"ContainerDied","Data":"78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.300934 4707 scope.go:117] "RemoveContainer" containerID="7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.301078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.313048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-logs\") pod \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.313176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data-custom\") pod \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.313211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-combined-ca-bundle\") pod \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.313241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c5wh\" (UniqueName: \"kubernetes.io/projected/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-kube-api-access-5c5wh\") pod \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.313411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data\") pod \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\" (UID: \"5a307a5c-9c6e-42d4-80a8-7d6e50650fcd\") " Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.317516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" event={"ID":"dd8127e7-3a43-454e-a3c9-190cf9eab3d2","Type":"ContainerStarted","Data":"e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.317602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" event={"ID":"dd8127e7-3a43-454e-a3c9-190cf9eab3d2","Type":"ContainerStarted","Data":"48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.317614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" 
event={"ID":"dd8127e7-3a43-454e-a3c9-190cf9eab3d2","Type":"ContainerStarted","Data":"8cf872ec8a96d3164abdd36d413b0fbfbbdf8c240a2af32dced07696d24c22c8"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.321881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" (UID: "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.322221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-logs" (OuterVolumeSpecName: "logs") pod "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" (UID: "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.329869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-kube-api-access-5c5wh" (OuterVolumeSpecName: "kube-api-access-5c5wh") pod "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" (UID: "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd"). InnerVolumeSpecName "kube-api-access-5c5wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.330476 4707 generic.go:334] "Generic (PLEG): container finished" podID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" exitCode=1 Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.332312 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.332293371 podStartE2EDuration="3.332293371s" podCreationTimestamp="2026-01-21 15:54:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:43.317003009 +0000 UTC m=+3180.498519231" watchObservedRunningTime="2026-01-21 15:54:43.332293371 +0000 UTC m=+3180.513809593" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.334361 4707 scope.go:117] "RemoveContainer" containerID="7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.358375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.358422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerDied","Data":"30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618"} Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.360079 4707 scope.go:117] "RemoveContainer" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" Jan 21 15:54:43 crc kubenswrapper[4707]: E0121 15:54:43.366376 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(e22a9822-7aa7-4a38-916a-c3fc9fdb0895)\"" 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.373543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" (UID: "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.377275 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" podStartSLOduration=2.377252783 podStartE2EDuration="2.377252783s" podCreationTimestamp="2026-01-21 15:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:43.363550878 +0000 UTC m=+3180.545067100" watchObservedRunningTime="2026-01-21 15:54:43.377252783 +0000 UTC m=+3180.558769005" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.429707 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.429738 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.429755 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.429765 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c5wh\" (UniqueName: \"kubernetes.io/projected/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-kube-api-access-5c5wh\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.448561 4707 scope.go:117] "RemoveContainer" containerID="7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c" Jan 21 15:54:43 crc kubenswrapper[4707]: E0121 15:54:43.450529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c\": container with ID starting with 7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c not found: ID does not exist" containerID="7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.450588 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c"} err="failed to get container status \"7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c\": rpc error: code = NotFound desc = could not find container \"7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c\": container with ID starting with 7c9ada192453cb2e512a435d3075c9c5c4eedb81cb2355c473f30a992a00863c not found: ID does not exist" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.450615 4707 scope.go:117] "RemoveContainer" 
containerID="7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517" Jan 21 15:54:43 crc kubenswrapper[4707]: E0121 15:54:43.451156 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517\": container with ID starting with 7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517 not found: ID does not exist" containerID="7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.451193 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517"} err="failed to get container status \"7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517\": rpc error: code = NotFound desc = could not find container \"7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517\": container with ID starting with 7c163ef35d41cba3a2a7432918b04603d719080c778275f686b5237436892517 not found: ID does not exist" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.451222 4707 scope.go:117] "RemoveContainer" containerID="544d5ca76e0fb54d974186a2110285e44bbf348f7f71b6273678005e1f776565" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.451761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data" (OuterVolumeSpecName: "config-data") pod "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" (UID: "5a307a5c-9c6e-42d4-80a8-7d6e50650fcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.529641 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.529689 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-68f75c7cb4-2pzs9" podUID="3769b2e3-4438-441b-867e-65d6dbd6e36c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.109:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.531760 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.638276 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq"] Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.648999 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fddcb7cf6-46spq"] Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.657017 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" probeResult="failure" output="Get 
\"https://10.217.0.160:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:43 crc kubenswrapper[4707]: I0121 15:54:43.657301 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.361152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a9d297fb-43eb-45d9-a000-55847e5b36cb","Type":"ContainerStarted","Data":"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844"} Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.661974 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.764577 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.798773 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.798757066 podStartE2EDuration="3.798757066s" podCreationTimestamp="2026-01-21 15:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:44.384095872 +0000 UTC m=+3181.565612094" watchObservedRunningTime="2026-01-21 15:54:44.798757066 +0000 UTC m=+3181.980273289" Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.803960 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.805665 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-central-agent" containerID="cri-o://f62e21515126150bf107422ad5dd88cd5dd40ac544ec2f1453d0195c8eb84574" gracePeriod=30 Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.805697 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="sg-core" containerID="cri-o://d6070f3e197144a2901111d863557e8496a035ad9a3fc78850139ba433b250fc" gracePeriod=30 Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.805712 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="proxy-httpd" containerID="cri-o://441cc55aae74c46696d852f6104493f16644abc2b957eb769aafc7cfbc0e94f3" gracePeriod=30 Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.805722 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-notification-agent" 
containerID="cri-o://2fabda75fc3aa04b829aa45c860680c47715c9dbed14545eeda6d5ec18d2460e" gracePeriod=30 Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.895235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:44 crc kubenswrapper[4707]: E0121 15:54:44.970830 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:45040->192.168.25.165:36655: write tcp 192.168.25.165:45040->192.168.25.165:36655: write: broken pipe Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.997333 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_51bd9d97-e28f-4b29-a43b-0496f5fb517e/ovn-northd/0.log" Jan 21 15:54:44 crc kubenswrapper[4707]: I0121 15:54:44.997402 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.089387 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.113182 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.136:8778/\": EOF" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.113187 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.136:8778/\": EOF" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.131194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.162801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72czz\" (UniqueName: \"kubernetes.io/projected/51bd9d97-e28f-4b29-a43b-0496f5fb517e-kube-api-access-72czz\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.162883 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-metrics-certs-tls-certs\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.162934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-northd-tls-certs\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.162976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-scripts\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.163019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-combined-ca-bundle\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.163073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-config\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.163164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-rundir\") pod \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\" (UID: \"51bd9d97-e28f-4b29-a43b-0496f5fb517e\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.164085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.165244 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-scripts" (OuterVolumeSpecName: "scripts") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.167040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-config" (OuterVolumeSpecName: "config") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.172945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51bd9d97-e28f-4b29-a43b-0496f5fb517e-kube-api-access-72czz" (OuterVolumeSpecName: "kube-api-access-72czz") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "kube-api-access-72czz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.187299 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.187556 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.190116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.197163 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" path="/var/lib/kubelet/pods/5a307a5c-9c6e-42d4-80a8-7d6e50650fcd/volumes" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.256227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.266759 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.266788 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.266799 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.266823 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72czz\" (UniqueName: \"kubernetes.io/projected/51bd9d97-e28f-4b29-a43b-0496f5fb517e-kube-api-access-72czz\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.266834 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.266843 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51bd9d97-e28f-4b29-a43b-0496f5fb517e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.269610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "51bd9d97-e28f-4b29-a43b-0496f5fb517e" (UID: "51bd9d97-e28f-4b29-a43b-0496f5fb517e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.369044 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51bd9d97-e28f-4b29-a43b-0496f5fb517e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.384446 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerID="441cc55aae74c46696d852f6104493f16644abc2b957eb769aafc7cfbc0e94f3" exitCode=0 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.384472 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerID="d6070f3e197144a2901111d863557e8496a035ad9a3fc78850139ba433b250fc" exitCode=2 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.384482 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerID="2fabda75fc3aa04b829aa45c860680c47715c9dbed14545eeda6d5ec18d2460e" exitCode=0 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.384489 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerID="f62e21515126150bf107422ad5dd88cd5dd40ac544ec2f1453d0195c8eb84574" exitCode=0 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.399751 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_51bd9d97-e28f-4b29-a43b-0496f5fb517e/ovn-northd/0.log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.399798 4707 generic.go:334] "Generic (PLEG): container finished" podID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerID="7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478" exitCode=139 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.399949 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.411373 4707 generic.go:334] "Generic (PLEG): container finished" podID="943cfab1-eb95-4dcb-842d-9826513006f5" containerID="d19a578b16065b74d9943a39179f1fd2594611a7b821a2af35ebb6465bc0e896" exitCode=137 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.418770 4707 generic.go:334] "Generic (PLEG): container finished" podID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerID="664218f2c763fb08ecde9d7d311a3e55c24f62c7de90c8590719e316a1fa1de6" exitCode=0 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.439956 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.439985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerDied","Data":"441cc55aae74c46696d852f6104493f16644abc2b957eb769aafc7cfbc0e94f3"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.440222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerDied","Data":"d6070f3e197144a2901111d863557e8496a035ad9a3fc78850139ba433b250fc"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.440923 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.440976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerDied","Data":"2fabda75fc3aa04b829aa45c860680c47715c9dbed14545eeda6d5ec18d2460e"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.440999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerDied","Data":"f62e21515126150bf107422ad5dd88cd5dd40ac544ec2f1453d0195c8eb84574"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.441011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"51bd9d97-e28f-4b29-a43b-0496f5fb517e","Type":"ContainerDied","Data":"7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.441035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"51bd9d97-e28f-4b29-a43b-0496f5fb517e","Type":"ContainerDied","Data":"ffc7a47341df072dafc96be5bbedd44c2d13743f41f4b82253faf9b66816abdf"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.441046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" event={"ID":"943cfab1-eb95-4dcb-842d-9826513006f5","Type":"ContainerDied","Data":"d19a578b16065b74d9943a39179f1fd2594611a7b821a2af35ebb6465bc0e896"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.441057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"68a74a76-0bd8-48e1-87c9-778b381f3fa0","Type":"ContainerDied","Data":"664218f2c763fb08ecde9d7d311a3e55c24f62c7de90c8590719e316a1fa1de6"} Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.441094 4707 scope.go:117] "RemoveContainer" 
containerID="893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.466546 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.470936 4707 scope.go:117] "RemoveContainer" containerID="7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.500286 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.516847 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.521934 4707 scope.go:117] "RemoveContainer" containerID="893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.522426 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317\": container with ID starting with 893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317 not found: ID does not exist" containerID="893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.522471 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317"} err="failed to get container status \"893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317\": rpc error: code = NotFound desc = could not find container \"893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317\": container with ID starting with 893b9b1e34c7b1d3a52aa12de9e87c3cfb818405c445113245804d534eb59317 not found: ID does not exist" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.522489 4707 scope.go:117] "RemoveContainer" containerID="7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.523855 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478\": container with ID starting with 7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478 not found: ID does not exist" containerID="7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.523881 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478"} err="failed to get container status \"7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478\": rpc error: code = NotFound desc = could not find container \"7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478\": container with ID starting with 7069e31d691be730a908280530d3ae0b2a169193c65da157accbb39a93bf5478 not found: ID does not exist" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.536839 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-565595d96-hb7l7"] Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.537037 4707 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api-log" containerID="cri-o://2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20" gracePeriod=30 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.537396 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api" containerID="cri-o://26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704" gracePeriod=30 Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.556549 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.556945 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.556959 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener-log" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.556966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="openstack-network-exporter" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.556972 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="openstack-network-exporter" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.556979 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-api" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.556985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-api" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557001 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557007 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker-log" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557020 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557025 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557037 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557042 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-log" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="ovn-northd" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557054 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="ovn-northd" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557065 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-log" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557079 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.557095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-api" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557100 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-api" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557296 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="openstack-network-exporter" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557311 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-api" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557329 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557339 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557350 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="318a94fd-66d7-4cfb-b312-a348e185ae2e" containerName="placement-api" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557363 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" containerName="placement-log" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557373 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" containerName="ovn-northd" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557384 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e31617e-3380-4e4f-9558-ceb4164fd190" containerName="barbican-worker" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.557394 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a307a5c-9c6e-42d4-80a8-7d6e50650fcd" containerName="barbican-keystone-listener" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.558267 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.562650 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.562769 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-q9bf5" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.562803 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.563203 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.571158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-public-tls-certs\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-internal-tls-certs\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-config-data\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-scripts\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/943cfab1-eb95-4dcb-842d-9826513006f5-logs\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-combined-ca-bundle\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.573724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf84r\" (UniqueName: \"kubernetes.io/projected/943cfab1-eb95-4dcb-842d-9826513006f5-kube-api-access-hf84r\") pod \"943cfab1-eb95-4dcb-842d-9826513006f5\" (UID: \"943cfab1-eb95-4dcb-842d-9826513006f5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.579730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/943cfab1-eb95-4dcb-842d-9826513006f5-logs" (OuterVolumeSpecName: "logs") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.582915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943cfab1-eb95-4dcb-842d-9826513006f5-kube-api-access-hf84r" (OuterVolumeSpecName: "kube-api-access-hf84r") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "kube-api-access-hf84r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.584572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-scripts" (OuterVolumeSpecName: "scripts") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.633562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.641287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-config-data" (OuterVolumeSpecName: "config-data") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.645984 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-scripts\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-config\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v82fr\" (UniqueName: \"kubernetes.io/projected/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-kube-api-access-v82fr\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678514 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678524 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678532 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/943cfab1-eb95-4dcb-842d-9826513006f5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 
15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678540 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.678551 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf84r\" (UniqueName: \"kubernetes.io/projected/943cfab1-eb95-4dcb-842d-9826513006f5-kube-api-access-hf84r\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.685046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.688877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "943cfab1-eb95-4dcb-842d-9826513006f5" (UID: "943cfab1-eb95-4dcb-842d-9826513006f5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.698436 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.779912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghlrc\" (UniqueName: \"kubernetes.io/projected/f9460600-2f70-404b-b7f3-2f83c235cef5-kube-api-access-ghlrc\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.779989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-run-httpd\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780028 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68a74a76-0bd8-48e1-87c9-778b381f3fa0-logs\") pod \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-config-data\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-config-data\") pod \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-ceilometer-tls-certs\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-log-httpd\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-scripts\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-sg-core-conf-yaml\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-combined-ca-bundle\") pod \"f9460600-2f70-404b-b7f3-2f83c235cef5\" (UID: \"f9460600-2f70-404b-b7f3-2f83c235cef5\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-combined-ca-bundle\") pod \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-nova-metadata-tls-certs\") pod \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gr9m\" (UniqueName: \"kubernetes.io/projected/68a74a76-0bd8-48e1-87c9-778b381f3fa0-kube-api-access-5gr9m\") pod \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\" (UID: \"68a74a76-0bd8-48e1-87c9-778b381f3fa0\") " Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-scripts\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-config\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v82fr\" (UniqueName: \"kubernetes.io/projected/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-kube-api-access-v82fr\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780925 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780935 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.780945 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/943cfab1-eb95-4dcb-842d-9826513006f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.781243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.782088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.782745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9460600-2f70-404b-b7f3-2f83c235cef5-kube-api-access-ghlrc" (OuterVolumeSpecName: "kube-api-access-ghlrc") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "kube-api-access-ghlrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.783627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a74a76-0bd8-48e1-87c9-778b381f3fa0-kube-api-access-5gr9m" (OuterVolumeSpecName: "kube-api-access-5gr9m") pod "68a74a76-0bd8-48e1-87c9-778b381f3fa0" (UID: "68a74a76-0bd8-48e1-87c9-778b381f3fa0"). InnerVolumeSpecName "kube-api-access-5gr9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.783916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-scripts\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.784499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-config\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.784877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68a74a76-0bd8-48e1-87c9-778b381f3fa0-logs" (OuterVolumeSpecName: "logs") pod "68a74a76-0bd8-48e1-87c9-778b381f3fa0" (UID: "68a74a76-0bd8-48e1-87c9-778b381f3fa0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.785586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-scripts" (OuterVolumeSpecName: "scripts") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.787075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.788337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.794138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.796628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v82fr\" (UniqueName: \"kubernetes.io/projected/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-kube-api-access-v82fr\") pod \"ovn-northd-0\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.807931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-config-data" (OuterVolumeSpecName: "config-data") pod "68a74a76-0bd8-48e1-87c9-778b381f3fa0" (UID: "68a74a76-0bd8-48e1-87c9-778b381f3fa0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.808109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68a74a76-0bd8-48e1-87c9-778b381f3fa0" (UID: "68a74a76-0bd8-48e1-87c9-778b381f3fa0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.809842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.828965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "68a74a76-0bd8-48e1-87c9-778b381f3fa0" (UID: "68a74a76-0bd8-48e1-87c9-778b381f3fa0"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.829283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.843832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.857844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-config-data" (OuterVolumeSpecName: "config-data") pod "f9460600-2f70-404b-b7f3-2f83c235cef5" (UID: "f9460600-2f70-404b-b7f3-2f83c235cef5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.876950 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.132:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.882438 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883239 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883277 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9460600-2f70-404b-b7f3-2f83c235cef5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883287 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883295 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883305 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883315 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883323 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883332 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gr9m\" (UniqueName: \"kubernetes.io/projected/68a74a76-0bd8-48e1-87c9-778b381f3fa0-kube-api-access-5gr9m\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883340 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghlrc\" (UniqueName: \"kubernetes.io/projected/f9460600-2f70-404b-b7f3-2f83c235cef5-kube-api-access-ghlrc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883347 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68a74a76-0bd8-48e1-87c9-778b381f3fa0-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883355 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9460600-2f70-404b-b7f3-2f83c235cef5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.883363 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a74a76-0bd8-48e1-87c9-778b381f3fa0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.901250 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda33f40c3-3064-405e-a420-da052b2b4a66"] err="unable to destroy cgroup paths for cgroup [kubepods 
besteffort poda33f40c3-3064-405e-a420-da052b2b4a66] : Timed out while waiting for systemd to remove kubepods-besteffort-poda33f40c3_3064_405e_a420_da052b2b4a66.slice" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.901302 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poda33f40c3-3064-405e-a420-da052b2b4a66] : unable to destroy cgroup paths for cgroup [kubepods besteffort poda33f40c3-3064-405e-a420-da052b2b4a66] : Timed out while waiting for systemd to remove kubepods-besteffort-poda33f40c3_3064_405e_a420_da052b2b4a66.slice" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.962907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:45 crc kubenswrapper[4707]: I0121 15:54:45.963950 4707 scope.go:117] "RemoveContainer" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" Jan 21 15:54:45 crc kubenswrapper[4707]: E0121 15:54:45.965865 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(e22a9822-7aa7-4a38-916a-c3fc9fdb0895)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.254680 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.323151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.323194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.346948 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.350140 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.432189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd","Type":"ContainerStarted","Data":"d2ad05ae89fd084069e55e5f76ce6ece9032a6ef14b84c5118ec3897705e45be"} Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.436036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"68a74a76-0bd8-48e1-87c9-778b381f3fa0","Type":"ContainerDied","Data":"577e1b1c17048e1569555f5db1237b142558c0abb88f4dc5820f89ce52879eb4"} Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.436078 4707 scope.go:117] "RemoveContainer" containerID="664218f2c763fb08ecde9d7d311a3e55c24f62c7de90c8590719e316a1fa1de6" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.436161 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.444792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f9460600-2f70-404b-b7f3-2f83c235cef5","Type":"ContainerDied","Data":"22d81c91523460a193b8835f540179f7d8d2fd3fea5b8373e05281ab6dbd6aed"} Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.444871 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.450426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" event={"ID":"943cfab1-eb95-4dcb-842d-9826513006f5","Type":"ContainerDied","Data":"fc3a2dcbd831592615de29479506e92713c3b0c7e17ea131f37d130e661a4e09"} Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.450497 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-747cb955f8-j9rwx" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.453133 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerID="2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20" exitCode=143 Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.453209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" event={"ID":"dd1e5821-44a4-4219-aec0-56618e7dba22","Type":"ContainerDied","Data":"2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20"} Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.453319 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.453832 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.453852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.538621 4707 scope.go:117] "RemoveContainer" containerID="4a36a4cbd1862f2d84340378f60d7801d4f238035b05f3aa1e05e062a795ec01" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.581825 4707 scope.go:117] "RemoveContainer" containerID="441cc55aae74c46696d852f6104493f16644abc2b957eb769aafc7cfbc0e94f3" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.582623 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.599515 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.634088 4707 scope.go:117] "RemoveContainer" containerID="d6070f3e197144a2901111d863557e8496a035ad9a3fc78850139ba433b250fc" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.659528 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.670385 4707 scope.go:117] "RemoveContainer" containerID="2fabda75fc3aa04b829aa45c860680c47715c9dbed14545eeda6d5ec18d2460e" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.683534 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-75997c46fb-qf2kk"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696005 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.696416 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-central-agent" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696429 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-central-agent" Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.696449 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-metadata" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-metadata" Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.696465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="sg-core" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696470 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="sg-core" Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.696482 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-log" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696487 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-log" Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.696496 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-notification-agent" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696501 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-notification-agent" Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.696517 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="proxy-httpd" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696522 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="proxy-httpd" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696712 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-metadata" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696721 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-central-agent" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696728 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="sg-core" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696735 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="ceilometer-notification-agent" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696749 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" containerName="proxy-httpd" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.696757 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" containerName="nova-metadata-log" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.698395 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.700599 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.700755 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.700965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.704714 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.713630 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-747cb955f8-j9rwx"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.719895 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-747cb955f8-j9rwx"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.725438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.729276 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.729401 4707 scope.go:117] "RemoveContainer" containerID="f62e21515126150bf107422ad5dd88cd5dd40ac544ec2f1453d0195c8eb84574" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.731862 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.737409 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: E0121 15:54:46.737886 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db5205c-c226-4ce2-b734-b250d0c4f7f5" containerName="nova-scheduler-scheduler" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.737936 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db5205c-c226-4ce2-b734-b250d0c4f7f5" containerName="nova-scheduler-scheduler" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.738341 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db5205c-c226-4ce2-b734-b250d0c4f7f5" containerName="nova-scheduler-scheduler" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.740491 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.747379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.747567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.755925 4707 scope.go:117] "RemoveContainer" containerID="d19a578b16065b74d9943a39179f1fd2594611a7b821a2af35ebb6465bc0e896" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.757139 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.767925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78bf2\" (UniqueName: \"kubernetes.io/projected/0db5205c-c226-4ce2-b734-b250d0c4f7f5-kube-api-access-78bf2\") pod \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-config-data\") pod \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802559 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-combined-ca-bundle\") pod \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\" (UID: \"0db5205c-c226-4ce2-b734-b250d0c4f7f5\") " Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j925m\" (UniqueName: \"kubernetes.io/projected/2c412020-b207-4f3b-abe0-160022b70cc8-kube-api-access-j925m\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-scripts\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.802992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ff97\" (UniqueName: \"kubernetes.io/projected/54a93993-fb38-4bec-9fa6-2ffa280ccecd-kube-api-access-2ff97\") pod \"nova-metadata-0\" (UID: 
\"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-config-data\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a93993-fb38-4bec-9fa6-2ffa280ccecd-logs\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.803185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-config-data\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.808449 4707 scope.go:117] "RemoveContainer" containerID="cda7c8458976b8fcd3b85e2aeae92be31f32d570eec96376fd70a84ab0e2f2f9" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.818666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db5205c-c226-4ce2-b734-b250d0c4f7f5-kube-api-access-78bf2" (OuterVolumeSpecName: "kube-api-access-78bf2") pod "0db5205c-c226-4ce2-b734-b250d0c4f7f5" (UID: "0db5205c-c226-4ce2-b734-b250d0c4f7f5"). InnerVolumeSpecName "kube-api-access-78bf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.834219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0db5205c-c226-4ce2-b734-b250d0c4f7f5" (UID: "0db5205c-c226-4ce2-b734-b250d0c4f7f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.839422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-config-data" (OuterVolumeSpecName: "config-data") pod "0db5205c-c226-4ce2-b734-b250d0c4f7f5" (UID: "0db5205c-c226-4ce2-b734-b250d0c4f7f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.905512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j925m\" (UniqueName: \"kubernetes.io/projected/2c412020-b207-4f3b-abe0-160022b70cc8-kube-api-access-j925m\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.905585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-scripts\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.905618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ff97\" (UniqueName: \"kubernetes.io/projected/54a93993-fb38-4bec-9fa6-2ffa280ccecd-kube-api-access-2ff97\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.905639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-config-data\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.905668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.906737 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.906921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.906944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a93993-fb38-4bec-9fa6-2ffa280ccecd-logs\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-config-data\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907617 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.907634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a93993-fb38-4bec-9fa6-2ffa280ccecd-logs\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.909234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.909907 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db5205c-c226-4ce2-b734-b250d0c4f7f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.909942 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78bf2\" (UniqueName: \"kubernetes.io/projected/0db5205c-c226-4ce2-b734-b250d0c4f7f5-kube-api-access-78bf2\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.911125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-scripts\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.911311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.911713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-config-data\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.914383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.914574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.918547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-config-data\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.921277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.921316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.923399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j925m\" (UniqueName: \"kubernetes.io/projected/2c412020-b207-4f3b-abe0-160022b70cc8-kube-api-access-j925m\") pod \"ceilometer-0\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.923723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ff97\" (UniqueName: \"kubernetes.io/projected/54a93993-fb38-4bec-9fa6-2ffa280ccecd-kube-api-access-2ff97\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.925128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.952593 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-578d4bcc98-hmqrg_eaa206fe-a991-4947-8e0e-0a8f80879f55/neutron-api/0.log" Jan 21 15:54:46 crc kubenswrapper[4707]: I0121 15:54:46.952742 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.010934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbr65\" (UniqueName: \"kubernetes.io/projected/eaa206fe-a991-4947-8e0e-0a8f80879f55-kube-api-access-fbr65\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.011163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-httpd-config\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.011612 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-internal-tls-certs\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.011662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-combined-ca-bundle\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.014536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-config\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.014580 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-public-tls-certs\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.014613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-ovndb-tls-certs\") pod \"eaa206fe-a991-4947-8e0e-0a8f80879f55\" (UID: \"eaa206fe-a991-4947-8e0e-0a8f80879f55\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.025208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.028524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.036235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa206fe-a991-4947-8e0e-0a8f80879f55-kube-api-access-fbr65" (OuterVolumeSpecName: "kube-api-access-fbr65") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). InnerVolumeSpecName "kube-api-access-fbr65". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.056739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.058870 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.060354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-config" (OuterVolumeSpecName: "config") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.068978 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.075439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.075461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eaa206fe-a991-4947-8e0e-0a8f80879f55" (UID: "eaa206fe-a991-4947-8e0e-0a8f80879f55"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117370 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117401 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117411 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117419 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117427 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbr65\" (UniqueName: \"kubernetes.io/projected/eaa206fe-a991-4947-8e0e-0a8f80879f55-kube-api-access-fbr65\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117437 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.117444 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa206fe-a991-4947-8e0e-0a8f80879f55-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.130199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.195449 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51bd9d97-e28f-4b29-a43b-0496f5fb517e" path="/var/lib/kubelet/pods/51bd9d97-e28f-4b29-a43b-0496f5fb517e/volumes" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.196145 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a74a76-0bd8-48e1-87c9-778b381f3fa0" path="/var/lib/kubelet/pods/68a74a76-0bd8-48e1-87c9-778b381f3fa0/volumes" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.202377 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943cfab1-eb95-4dcb-842d-9826513006f5" path="/var/lib/kubelet/pods/943cfab1-eb95-4dcb-842d-9826513006f5/volumes" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.206882 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33f40c3-3064-405e-a420-da052b2b4a66" 
path="/var/lib/kubelet/pods/a33f40c3-3064-405e-a420-da052b2b4a66/volumes" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.207436 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9460600-2f70-404b-b7f3-2f83c235cef5" path="/var/lib/kubelet/pods/f9460600-2f70-404b-b7f3-2f83c235cef5/volumes" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.477450 4707 generic.go:334] "Generic (PLEG): container finished" podID="0db5205c-c226-4ce2-b734-b250d0c4f7f5" containerID="921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235" exitCode=137 Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.477666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0db5205c-c226-4ce2-b734-b250d0c4f7f5","Type":"ContainerDied","Data":"921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.477690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0db5205c-c226-4ce2-b734-b250d0c4f7f5","Type":"ContainerDied","Data":"6a969f47e9705be303cdf57e8935315f4c39ccdbb4dfcf02844b2537d3ff7ab1"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.477706 4707 scope.go:117] "RemoveContainer" containerID="921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.477785 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.483505 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.486834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd","Type":"ContainerStarted","Data":"488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.486865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd","Type":"ContainerStarted","Data":"edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.487676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.502469 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.502455261 podStartE2EDuration="2.502455261s" podCreationTimestamp="2026-01-21 15:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:47.500446704 +0000 UTC m=+3184.681962925" watchObservedRunningTime="2026-01-21 15:54:47.502455261 +0000 UTC m=+3184.683971483" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.510542 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-578d4bcc98-hmqrg_eaa206fe-a991-4947-8e0e-0a8f80879f55/neutron-api/0.log" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.510689 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaa206fe-a991-4947-8e0e-0a8f80879f55" 
containerID="a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065" exitCode=137 Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.510850 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.510875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" event={"ID":"eaa206fe-a991-4947-8e0e-0a8f80879f55","Type":"ContainerDied","Data":"a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.510909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-578d4bcc98-hmqrg" event={"ID":"eaa206fe-a991-4947-8e0e-0a8f80879f55","Type":"ContainerDied","Data":"e63269a3cac96ab1f9b8f88518da6d80d552868292608720525ace78853503c7"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.527371 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-77c8997466-bd6bs_c197e98a-7956-4c34-a31a-911a28cf4a1a/neutron-api/0.log" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.527408 4707 generic.go:334] "Generic (PLEG): container finished" podID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerID="0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9" exitCode=1 Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.528055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" event={"ID":"c197e98a-7956-4c34-a31a-911a28cf4a1a","Type":"ContainerDied","Data":"0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.528111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" event={"ID":"c197e98a-7956-4c34-a31a-911a28cf4a1a","Type":"ContainerDied","Data":"20c5b4970d15c1a96b2d975842b377379f7937b4b5a64f863e062eb4ad7f03cd"} Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.528126 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c5b4970d15c1a96b2d975842b377379f7937b4b5a64f863e062eb4ad7f03cd" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.561647 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-77c8997466-bd6bs_c197e98a-7956-4c34-a31a-911a28cf4a1a/neutron-api/0.log" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.566090 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.568286 4707 scope.go:117] "RemoveContainer" containerID="921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235" Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.568982 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235\": container with ID starting with 921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235 not found: ID does not exist" containerID="921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.569018 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235"} err="failed to get container status \"921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235\": rpc error: code = NotFound desc = could not find container \"921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235\": container with ID starting with 921c98d3cc571cd0d157516a5e2ee915113095fe6c77fa69389201d2aa569235 not found: ID does not exist" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.569040 4707 scope.go:117] "RemoveContainer" containerID="5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.579520 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-578d4bcc98-hmqrg"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.598099 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-578d4bcc98-hmqrg"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.603727 4707 scope.go:117] "RemoveContainer" containerID="a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.627361 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-internal-tls-certs\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-public-tls-certs\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-config\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-combined-ca-bundle\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: 
\"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-httpd-config\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-ovndb-tls-certs\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.629977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hcr\" (UniqueName: \"kubernetes.io/projected/c197e98a-7956-4c34-a31a-911a28cf4a1a-kube-api-access-29hcr\") pod \"c197e98a-7956-4c34-a31a-911a28cf4a1a\" (UID: \"c197e98a-7956-4c34-a31a-911a28cf4a1a\") " Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.636410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c197e98a-7956-4c34-a31a-911a28cf4a1a-kube-api-access-29hcr" (OuterVolumeSpecName: "kube-api-access-29hcr") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "kube-api-access-29hcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.636463 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.647678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.654466 4707 scope.go:117] "RemoveContainer" containerID="5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.654799 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.655676 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-httpd" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.655695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-httpd" Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.655723 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-api" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.655729 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-api" Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.655750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-httpd" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.655759 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-httpd" Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.655773 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-api" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.655778 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-api" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.656190 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-httpd" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.656211 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-httpd" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.656224 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-api" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.656238 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" containerName="neutron-api" Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.656784 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263\": container with ID starting with 5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263 not found: ID does not exist" containerID="5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.656825 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263"} err="failed to get container status 
\"5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263\": rpc error: code = NotFound desc = could not find container \"5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263\": container with ID starting with 5b64aa551d68abcafec68620362df5022b9c82c7e3bd16bcdf25583b81946263 not found: ID does not exist" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.656854 4707 scope.go:117] "RemoveContainer" containerID="a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065" Jan 21 15:54:47 crc kubenswrapper[4707]: E0121 15:54:47.658198 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065\": container with ID starting with a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065 not found: ID does not exist" containerID="a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.658227 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065"} err="failed to get container status \"a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065\": rpc error: code = NotFound desc = could not find container \"a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065\": container with ID starting with a272828d662b54c5bee4cfc61b63873854f66b0f3f45ab1ad1ec450e35998065 not found: ID does not exist" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.666622 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.668950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.686057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.702952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.715063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.717332 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsjnw\" (UniqueName: \"kubernetes.io/projected/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-kube-api-access-gsjnw\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-config-data\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731499 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731511 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29hcr\" (UniqueName: \"kubernetes.io/projected/c197e98a-7956-4c34-a31a-911a28cf4a1a-kube-api-access-29hcr\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731522 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.731529 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.733297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-config" (OuterVolumeSpecName: "config") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.738869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.758791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c197e98a-7956-4c34-a31a-911a28cf4a1a" (UID: "c197e98a-7956-4c34-a31a-911a28cf4a1a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.798201 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.833011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.833247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsjnw\" (UniqueName: \"kubernetes.io/projected/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-kube-api-access-gsjnw\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.833314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-config-data\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.833462 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.833475 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.833484 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c197e98a-7956-4c34-a31a-911a28cf4a1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.836732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.836795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-config-data\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.837047 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:54:47 crc 
kubenswrapper[4707]: I0121 15:54:47.846768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsjnw\" (UniqueName: \"kubernetes.io/projected/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-kube-api-access-gsjnw\") pod \"nova-scheduler-0\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:47 crc kubenswrapper[4707]: I0121 15:54:47.985203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.200479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:54:48 crc kubenswrapper[4707]: E0121 15:54:48.267111 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:45138->192.168.25.165:36655: read tcp 192.168.25.165:45138->192.168.25.165:36655: read: connection reset by peer Jan 21 15:54:48 crc kubenswrapper[4707]: E0121 15:54:48.438574 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:45150->192.168.25.165:36655: read tcp 192.168.25.165:45150->192.168.25.165:36655: read: connection reset by peer Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.541226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2","Type":"ContainerStarted","Data":"031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.541344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2","Type":"ContainerStarted","Data":"898650ceaf8f3d14295607e34f5a0c5fc5e993dec0e9cc8f02829b77da61c362"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.543376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerStarted","Data":"7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.543417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerStarted","Data":"f91fdb63d7d782c68864edc2873407e45c5363c0fd8bc3425e38421279ce630e"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.545913 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.547455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"54a93993-fb38-4bec-9fa6-2ffa280ccecd","Type":"ContainerStarted","Data":"90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.547483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"54a93993-fb38-4bec-9fa6-2ffa280ccecd","Type":"ContainerStarted","Data":"568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.547494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"54a93993-fb38-4bec-9fa6-2ffa280ccecd","Type":"ContainerStarted","Data":"59e35d4182fb31c6d7ffc4da98fa3ee8e54da9f10f35945bf0133222ecdb664e"} Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.549163 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.549218 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.560939 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.560930339 podStartE2EDuration="1.560930339s" podCreationTimestamp="2026-01-21 15:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:48.558022261 +0000 UTC m=+3185.739538483" watchObservedRunningTime="2026-01-21 15:54:48.560930339 +0000 UTC m=+3185.742446561" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.581623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.597272 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.597247871 podStartE2EDuration="2.597247871s" podCreationTimestamp="2026-01-21 15:54:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:48.592396047 +0000 UTC m=+3185.773912270" watchObservedRunningTime="2026-01-21 15:54:48.597247871 +0000 UTC m=+3185.778764094" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.613591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-77c8997466-bd6bs"] Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.619640 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-77c8997466-bd6bs"] Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.704145 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.240:9311/healthcheck\": read tcp 10.217.0.2:44730->10.217.1.240:9311: read: connection reset by peer" Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.705130 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.240:9311/healthcheck\": read tcp 10.217.0.2:44716->10.217.1.240:9311: read: connection reset by peer" Jan 21 15:54:48 crc kubenswrapper[4707]: E0121 15:54:48.770044 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:45168->192.168.25.165:36655: read tcp 192.168.25.165:45168->192.168.25.165:36655: read: connection reset by peer Jan 21 15:54:48 crc kubenswrapper[4707]: I0121 15:54:48.997737 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.159087 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.159133 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-public-tls-certs\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data-custom\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-combined-ca-bundle\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z7pc\" (UniqueName: \"kubernetes.io/projected/dd1e5821-44a4-4219-aec0-56618e7dba22-kube-api-access-7z7pc\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e5821-44a4-4219-aec0-56618e7dba22-logs\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.163461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-internal-tls-certs\") pod \"dd1e5821-44a4-4219-aec0-56618e7dba22\" (UID: \"dd1e5821-44a4-4219-aec0-56618e7dba22\") " Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.173510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd1e5821-44a4-4219-aec0-56618e7dba22-logs" (OuterVolumeSpecName: "logs") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.178451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1e5821-44a4-4219-aec0-56618e7dba22-kube-api-access-7z7pc" (OuterVolumeSpecName: "kube-api-access-7z7pc") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "kube-api-access-7z7pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.195865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.215596 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db5205c-c226-4ce2-b734-b250d0c4f7f5" path="/var/lib/kubelet/pods/0db5205c-c226-4ce2-b734-b250d0c4f7f5/volumes" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.218313 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" path="/var/lib/kubelet/pods/c197e98a-7956-4c34-a31a-911a28cf4a1a/volumes" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.219158 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa206fe-a991-4947-8e0e-0a8f80879f55" path="/var/lib/kubelet/pods/eaa206fe-a991-4947-8e0e-0a8f80879f55/volumes" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.270202 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.270229 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z7pc\" (UniqueName: \"kubernetes.io/projected/dd1e5821-44a4-4219-aec0-56618e7dba22-kube-api-access-7z7pc\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.270239 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e5821-44a4-4219-aec0-56618e7dba22-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.278277 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data" (OuterVolumeSpecName: "config-data") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.283057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.286158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.288877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dd1e5821-44a4-4219-aec0-56618e7dba22" (UID: "dd1e5821-44a4-4219-aec0-56618e7dba22"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.330016 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.331475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.374418 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.374600 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.374655 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.374703 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e5821-44a4-4219-aec0-56618e7dba22-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.558913 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerStarted","Data":"bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17"} Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.560648 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerID="26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704" exitCode=0 Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.560693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" event={"ID":"dd1e5821-44a4-4219-aec0-56618e7dba22","Type":"ContainerDied","Data":"26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704"} Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.560868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" 
event={"ID":"dd1e5821-44a4-4219-aec0-56618e7dba22","Type":"ContainerDied","Data":"4b53100cb868d85f8e050d1bc4829884fd74507bd51615aebc27917f46e11161"} Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.560912 4707 scope.go:117] "RemoveContainer" containerID="26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.560733 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-565595d96-hb7l7" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.562722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.562747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.588531 4707 scope.go:117] "RemoveContainer" containerID="2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.625005 4707 scope.go:117] "RemoveContainer" containerID="26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704" Jan 21 15:54:49 crc kubenswrapper[4707]: E0121 15:54:49.625413 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704\": container with ID starting with 26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704 not found: ID does not exist" containerID="26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.625439 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704"} err="failed to get container status \"26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704\": rpc error: code = NotFound desc = could not find container \"26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704\": container with ID starting with 26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704 not found: ID does not exist" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.625588 4707 scope.go:117] "RemoveContainer" containerID="2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20" Jan 21 15:54:49 crc kubenswrapper[4707]: E0121 15:54:49.625779 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20\": container with ID starting with 2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20 not found: ID does not exist" containerID="2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.625801 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20"} err="failed to get container status \"2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20\": rpc error: code = NotFound desc = could not find container \"2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20\": container with ID starting with 2e3ff8e5402f679122394461c9955aa6a02d8345f9e87d35a83598c366e7fa20 not found: ID does 
not exist" Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.631535 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-565595d96-hb7l7"] Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.636862 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-565595d96-hb7l7"] Jan 21 15:54:49 crc kubenswrapper[4707]: I0121 15:54:49.867433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:54:50 crc kubenswrapper[4707]: I0121 15:54:50.575125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerStarted","Data":"e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159"} Jan 21 15:54:50 crc kubenswrapper[4707]: I0121 15:54:50.917952 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.132:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:50 crc kubenswrapper[4707]: I0121 15:54:50.957083 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podda5cd14d-4d69-4f5a-9901-315fe321579c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podda5cd14d-4d69-4f5a-9901-315fe321579c] : Timed out while waiting for systemd to remove kubepods-besteffort-podda5cd14d_4d69_4f5a_9901_315fe321579c.slice" Jan 21 15:54:50 crc kubenswrapper[4707]: E0121 15:54:50.957126 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podda5cd14d-4d69-4f5a-9901-315fe321579c] : unable to destroy cgroup paths for cgroup [kubepods besteffort podda5cd14d-4d69-4f5a-9901-315fe321579c] : Timed out while waiting for systemd to remove kubepods-besteffort-podda5cd14d_4d69_4f5a_9901_315fe321579c.slice" pod="openstack-kuttl-tests/nova-api-0" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" Jan 21 15:54:50 crc kubenswrapper[4707]: I0121 15:54:50.996347 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.030294 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.319876 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod728a3264-18ce-4d47-86f5-868f00cc4558"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod728a3264-18ce-4d47-86f5-868f00cc4558] : Timed out while waiting for systemd to remove kubepods-besteffort-pod728a3264_18ce_4d47_86f5_868f00cc4558.slice" Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.319925 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod728a3264-18ce-4d47-86f5-868f00cc4558] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod728a3264-18ce-4d47-86f5-868f00cc4558] : Timed out while waiting for systemd to remove kubepods-besteffort-pod728a3264_18ce_4d47_86f5_868f00cc4558.slice" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="728a3264-18ce-4d47-86f5-868f00cc4558" Jan 21 15:54:51 crc 
kubenswrapper[4707]: I0121 15:54:51.352987 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" path="/var/lib/kubelet/pods/dd1e5821-44a4-4219-aec0-56618e7dba22/volumes" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.353538 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.353570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.439984 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod388a5540-d98c-4db0-bb1a-5a387ff8f7f3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod388a5540-d98c-4db0-bb1a-5a387ff8f7f3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod388a5540_d98c_4db0_bb1a_5a387ff8f7f3.slice" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.440399 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-76756f697d-jwhvs_c1ba53ab-abb3-455d-979b-cad1eab8e90a/neutron-api/0.log" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.440469 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.540967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-combined-ca-bundle\") pod \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.541070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-public-tls-certs\") pod \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.541130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-ovndb-tls-certs\") pod \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.541146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-internal-tls-certs\") pod \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.541211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-httpd-config\") pod \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.541255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ngm8\" (UniqueName: \"kubernetes.io/projected/c1ba53ab-abb3-455d-979b-cad1eab8e90a-kube-api-access-2ngm8\") pod 
\"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.541315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-config\") pod \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\" (UID: \"c1ba53ab-abb3-455d-979b-cad1eab8e90a\") " Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.545632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ba53ab-abb3-455d-979b-cad1eab8e90a-kube-api-access-2ngm8" (OuterVolumeSpecName: "kube-api-access-2ngm8") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "kube-api-access-2ngm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.546905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.581112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.582570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.583346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-config" (OuterVolumeSpecName: "config") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.588139 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-76756f697d-jwhvs_c1ba53ab-abb3-455d-979b-cad1eab8e90a/neutron-api/0.log" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.588203 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerID="310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563" exitCode=1 Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.588275 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.588285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" event={"ID":"c1ba53ab-abb3-455d-979b-cad1eab8e90a","Type":"ContainerDied","Data":"310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563"} Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.588334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76756f697d-jwhvs" event={"ID":"c1ba53ab-abb3-455d-979b-cad1eab8e90a","Type":"ContainerDied","Data":"6ddc0dfa01a3bd9696865be0af3cf90bf3c02e7d3e1fd17c4b3a8c19c5ca9f93"} Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.588357 4707 scope.go:117] "RemoveContainer" containerID="2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.593693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerStarted","Data":"1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6"} Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.594295 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.594632 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.594674 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.594777 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" containerID="cri-o://1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45" gracePeriod=30 Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.595007 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="probe" containerID="cri-o://25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92" gracePeriod=30 Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.595966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.600529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c1ba53ab-abb3-455d-979b-cad1eab8e90a" (UID: "c1ba53ab-abb3-455d-979b-cad1eab8e90a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.620224 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.14648891 podStartE2EDuration="5.620207869s" podCreationTimestamp="2026-01-21 15:54:46 +0000 UTC" firstStartedPulling="2026-01-21 15:54:47.494893882 +0000 UTC m=+3184.676410105" lastFinishedPulling="2026-01-21 15:54:50.968612843 +0000 UTC m=+3188.150129064" observedRunningTime="2026-01-21 15:54:51.615881604 +0000 UTC m=+3188.797397826" watchObservedRunningTime="2026-01-21 15:54:51.620207869 +0000 UTC m=+3188.801724091" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.646562 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.646978 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.646993 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.647003 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.647016 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ngm8\" (UniqueName: \"kubernetes.io/projected/c1ba53ab-abb3-455d-979b-cad1eab8e90a-kube-api-access-2ngm8\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.647038 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.647047 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ba53ab-abb3-455d-979b-cad1eab8e90a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.681545 4707 scope.go:117] "RemoveContainer" containerID="310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.715985 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.725984 4707 scope.go:117] "RemoveContainer" containerID="2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b" Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.726464 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b\": container with ID starting with 2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b not found: ID does not exist" containerID="2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b" Jan 21 15:54:51 crc 
kubenswrapper[4707]: I0121 15:54:51.726492 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b"} err="failed to get container status \"2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b\": rpc error: code = NotFound desc = could not find container \"2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b\": container with ID starting with 2e8c199909aad69ae58a739a55627c8d564d2adee89b54965446c74a09d1da7b not found: ID does not exist" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.726519 4707 scope.go:117] "RemoveContainer" containerID="310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563" Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.726785 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563\": container with ID starting with 310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563 not found: ID does not exist" containerID="310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.726800 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563"} err="failed to get container status \"310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563\": rpc error: code = NotFound desc = could not find container \"310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563\": container with ID starting with 310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563 not found: ID does not exist" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.729017 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.736773 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.744111 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750300 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.750690 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750708 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api" Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.750725 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api-log" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api-log" Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.750755 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-api" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750761 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-api" Jan 21 15:54:51 crc kubenswrapper[4707]: E0121 15:54:51.750786 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-httpd" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750794 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-httpd" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750977 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-httpd" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.750997 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.751017 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1e5821-44a4-4219-aec0-56618e7dba22" containerName="barbican-api-log" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.751027 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" containerName="neutron-api" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.751977 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.755182 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.755369 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.755488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.757054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.763257 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.764433 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.766150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.766517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.766776 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.768981 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604d874-1aab-419e-a67e-801734497292-logs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tptrl\" (UniqueName: \"kubernetes.io/projected/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-kube-api-access-tptrl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb54t\" (UniqueName: \"kubernetes.io/projected/2604d874-1aab-419e-a67e-801734497292-kube-api-access-tb54t\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-public-tls-certs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.852349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-config-data\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.921371 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-76756f697d-jwhvs"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.927800 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-76756f697d-jwhvs"] Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb54t\" (UniqueName: \"kubernetes.io/projected/2604d874-1aab-419e-a67e-801734497292-kube-api-access-tb54t\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-public-tls-certs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-config-data\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953908 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604d874-1aab-419e-a67e-801734497292-logs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tptrl\" (UniqueName: \"kubernetes.io/projected/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-kube-api-access-tptrl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.953994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.954020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.954059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.954110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.954142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.954511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604d874-1aab-419e-a67e-801734497292-logs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.960019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.960098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.960218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-config-data\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.960532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.960869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.961482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-public-tls-certs\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.961578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.967016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.969098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tptrl\" (UniqueName: \"kubernetes.io/projected/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-kube-api-access-tptrl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:51 crc kubenswrapper[4707]: I0121 15:54:51.969176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb54t\" (UniqueName: \"kubernetes.io/projected/2604d874-1aab-419e-a67e-801734497292-kube-api-access-tb54t\") pod \"nova-api-0\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.067374 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.069956 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.070542 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.087244 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.486917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.593752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.608286 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e495769-a033-41f2-8285-3f3274673250" containerID="25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92" exitCode=0 Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.608308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3e495769-a033-41f2-8285-3f3274673250","Type":"ContainerDied","Data":"25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92"} Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.609577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2604d874-1aab-419e-a67e-801734497292","Type":"ContainerStarted","Data":"f9f14ca061218e6674358237ac9cefa1f3ff05176447b7f19f802fddc87ac200"} Jan 21 15:54:52 crc kubenswrapper[4707]: I0121 15:54:52.985587 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.191231 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728a3264-18ce-4d47-86f5-868f00cc4558" path="/var/lib/kubelet/pods/728a3264-18ce-4d47-86f5-868f00cc4558/volumes" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.192500 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ba53ab-abb3-455d-979b-cad1eab8e90a" path="/var/lib/kubelet/pods/c1ba53ab-abb3-455d-979b-cad1eab8e90a/volumes" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.193158 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5cd14d-4d69-4f5a-9901-315fe321579c" path="/var/lib/kubelet/pods/da5cd14d-4d69-4f5a-9901-315fe321579c/volumes" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.223684 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.276673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e495769-a033-41f2-8285-3f3274673250-etc-machine-id\") pod \"3e495769-a033-41f2-8285-3f3274673250\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.276827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9vrd\" (UniqueName: \"kubernetes.io/projected/3e495769-a033-41f2-8285-3f3274673250-kube-api-access-r9vrd\") pod \"3e495769-a033-41f2-8285-3f3274673250\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.276853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-scripts\") pod \"3e495769-a033-41f2-8285-3f3274673250\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.276874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data-custom\") pod \"3e495769-a033-41f2-8285-3f3274673250\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.276923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data\") pod \"3e495769-a033-41f2-8285-3f3274673250\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.276943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-combined-ca-bundle\") pod \"3e495769-a033-41f2-8285-3f3274673250\" (UID: \"3e495769-a033-41f2-8285-3f3274673250\") " Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.278416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e495769-a033-41f2-8285-3f3274673250-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3e495769-a033-41f2-8285-3f3274673250" (UID: "3e495769-a033-41f2-8285-3f3274673250"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.291018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e495769-a033-41f2-8285-3f3274673250-kube-api-access-r9vrd" (OuterVolumeSpecName: "kube-api-access-r9vrd") pod "3e495769-a033-41f2-8285-3f3274673250" (UID: "3e495769-a033-41f2-8285-3f3274673250"). InnerVolumeSpecName "kube-api-access-r9vrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.291101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-scripts" (OuterVolumeSpecName: "scripts") pod "3e495769-a033-41f2-8285-3f3274673250" (UID: "3e495769-a033-41f2-8285-3f3274673250"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.292866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e495769-a033-41f2-8285-3f3274673250" (UID: "3e495769-a033-41f2-8285-3f3274673250"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.348935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e495769-a033-41f2-8285-3f3274673250" (UID: "3e495769-a033-41f2-8285-3f3274673250"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.379375 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9vrd\" (UniqueName: \"kubernetes.io/projected/3e495769-a033-41f2-8285-3f3274673250-kube-api-access-r9vrd\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.379403 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.379413 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.379421 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.379429 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e495769-a033-41f2-8285-3f3274673250-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.383595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data" (OuterVolumeSpecName: "config-data") pod "3e495769-a033-41f2-8285-3f3274673250" (UID: "3e495769-a033-41f2-8285-3f3274673250"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.481714 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e495769-a033-41f2-8285-3f3274673250-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.622892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7c05170c-aee6-46b7-bbc6-921bad0eb7ee","Type":"ContainerStarted","Data":"06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e"} Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.622943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7c05170c-aee6-46b7-bbc6-921bad0eb7ee","Type":"ContainerStarted","Data":"2fc04797f81dcb357b70cfb4ee3e330bf277c36ba600b1351501f0d77fbc32f0"} Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.626185 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e495769-a033-41f2-8285-3f3274673250" containerID="1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45" exitCode=0 Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.626248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3e495769-a033-41f2-8285-3f3274673250","Type":"ContainerDied","Data":"1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45"} Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.626321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3e495769-a033-41f2-8285-3f3274673250","Type":"ContainerDied","Data":"9d4e07313400d53cb0cb4138c088d5bde34b94479ebb36b4c0f0e080abe3f83f"} Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.626348 4707 scope.go:117] "RemoveContainer" containerID="25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.626299 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.629844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2604d874-1aab-419e-a67e-801734497292","Type":"ContainerStarted","Data":"7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7"} Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.629882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2604d874-1aab-419e-a67e-801734497292","Type":"ContainerStarted","Data":"702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f"} Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.661730 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.6616914879999998 podStartE2EDuration="2.661691488s" podCreationTimestamp="2026-01-21 15:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:53.640587314 +0000 UTC m=+3190.822103536" watchObservedRunningTime="2026-01-21 15:54:53.661691488 +0000 UTC m=+3190.843207710" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.664884 4707 scope.go:117] "RemoveContainer" containerID="1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.695239 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.695219264 podStartE2EDuration="2.695219264s" podCreationTimestamp="2026-01-21 15:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:53.674329383 +0000 UTC m=+3190.855845604" watchObservedRunningTime="2026-01-21 15:54:53.695219264 +0000 UTC m=+3190.876735486" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.710437 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.717840 4707 scope.go:117] "RemoveContainer" containerID="25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92" Jan 21 15:54:53 crc kubenswrapper[4707]: E0121 15:54:53.718336 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92\": container with ID starting with 25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92 not found: ID does not exist" containerID="25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.718395 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92"} err="failed to get container status \"25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92\": rpc error: code = NotFound desc = could not find container \"25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92\": container with ID starting with 25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92 not found: ID does not exist" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.718428 4707 scope.go:117] "RemoveContainer" 
containerID="1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45" Jan 21 15:54:53 crc kubenswrapper[4707]: E0121 15:54:53.720776 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45\": container with ID starting with 1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45 not found: ID does not exist" containerID="1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.720819 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45"} err="failed to get container status \"1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45\": rpc error: code = NotFound desc = could not find container \"1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45\": container with ID starting with 1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45 not found: ID does not exist" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.723016 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.736034 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:53 crc kubenswrapper[4707]: E0121 15:54:53.736838 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.736859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" Jan 21 15:54:53 crc kubenswrapper[4707]: E0121 15:54:53.736881 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="probe" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.736887 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="probe" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.737102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="probe" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.737124 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e495769-a033-41f2-8285-3f3274673250" containerName="cinder-scheduler" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.738053 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.738871 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.740299 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.800257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79a91989-c584-44cc-bb31-494f3d1a7a7d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.800321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.800354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.800401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvbkl\" (UniqueName: \"kubernetes.io/projected/79a91989-c584-44cc-bb31-494f3d1a7a7d-kube-api-access-zvbkl\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.800449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.800542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-scripts\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.902256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-scripts\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.902336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79a91989-c584-44cc-bb31-494f3d1a7a7d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.902375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.902398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.902433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvbkl\" (UniqueName: \"kubernetes.io/projected/79a91989-c584-44cc-bb31-494f3d1a7a7d-kube-api-access-zvbkl\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.902469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.903991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79a91989-c584-44cc-bb31-494f3d1a7a7d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.907422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.909471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.910133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.910638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-scripts\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:53 crc kubenswrapper[4707]: I0121 15:54:53.922534 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zvbkl\" (UniqueName: \"kubernetes.io/projected/79a91989-c584-44cc-bb31-494f3d1a7a7d-kube-api-access-zvbkl\") pod \"cinder-scheduler-0\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:54 crc kubenswrapper[4707]: I0121 15:54:54.061530 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:54 crc kubenswrapper[4707]: I0121 15:54:54.506407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:54:54 crc kubenswrapper[4707]: I0121 15:54:54.643037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"79a91989-c584-44cc-bb31-494f3d1a7a7d","Type":"ContainerStarted","Data":"8e391b4b7e37eb92ff609b9149d7f3da92176bcb2a3b530d12f255276a262a20"} Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.183199 4707 scope.go:117] "RemoveContainer" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.191762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e495769-a033-41f2-8285-3f3274673250" path="/var/lib/kubelet/pods/3e495769-a033-41f2-8285-3f3274673250/volumes" Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.654097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"79a91989-c584-44cc-bb31-494f3d1a7a7d","Type":"ContainerStarted","Data":"ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8"} Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.654435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"79a91989-c584-44cc-bb31-494f3d1a7a7d","Type":"ContainerStarted","Data":"ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39"} Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.655663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerStarted","Data":"bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70"} Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.655865 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:54:55 crc kubenswrapper[4707]: I0121 15:54:55.670844 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.670829123 podStartE2EDuration="2.670829123s" podCreationTimestamp="2026-01-21 15:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:54:55.666656927 +0000 UTC m=+3192.848173150" watchObservedRunningTime="2026-01-21 15:54:55.670829123 +0000 UTC m=+3192.852345345" Jan 21 15:54:57 crc kubenswrapper[4707]: I0121 15:54:57.069118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:57 crc kubenswrapper[4707]: I0121 15:54:57.069401 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:54:57 crc kubenswrapper[4707]: I0121 15:54:57.087819 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:54:57 crc kubenswrapper[4707]: I0121 15:54:57.986555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.011729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.078915 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.078940 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.183197 4707 scope.go:117] "RemoveContainer" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.697891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerStarted","Data":"21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749"} Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.698097 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:54:58 crc kubenswrapper[4707]: I0121 15:54:58.721992 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.063218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.236256 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.530485 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.531065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.600107 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh"] Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.602167 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-log" containerID="cri-o://4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5" gracePeriod=30 Jan 21 15:54:59 crc kubenswrapper[4707]: I0121 15:54:59.602302 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-api" containerID="cri-o://d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b" gracePeriod=30 Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.182804 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:55:00 crc kubenswrapper[4707]: E0121 15:55:00.183395 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.719896 4707 generic.go:334] "Generic (PLEG): container finished" podID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerID="9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58" exitCode=137 Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.719960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" event={"ID":"b7d3be32-a1ae-4a91-bf04-051fd3dde8af","Type":"ContainerDied","Data":"9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58"} Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.721196 4707 generic.go:334] "Generic (PLEG): container finished" podID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerID="4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5" exitCode=143 Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.721219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" event={"ID":"35251dbe-ab48-4cf5-b556-941dc2cfe6cc","Type":"ContainerDied","Data":"4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5"} Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.929241 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:55:00 crc kubenswrapper[4707]: I0121 15:55:00.984662 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.027734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-logs\") pod \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.027784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lbrk\" (UniqueName: \"kubernetes.io/projected/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-kube-api-access-5lbrk\") pod \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.027886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-combined-ca-bundle\") pod \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.027907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data-custom\") pod \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.027949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data\") pod \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\" (UID: \"b7d3be32-a1ae-4a91-bf04-051fd3dde8af\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.028122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-logs" (OuterVolumeSpecName: "logs") pod "b7d3be32-a1ae-4a91-bf04-051fd3dde8af" (UID: "b7d3be32-a1ae-4a91-bf04-051fd3dde8af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.028315 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.033286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b7d3be32-a1ae-4a91-bf04-051fd3dde8af" (UID: "b7d3be32-a1ae-4a91-bf04-051fd3dde8af"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.033581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-kube-api-access-5lbrk" (OuterVolumeSpecName: "kube-api-access-5lbrk") pod "b7d3be32-a1ae-4a91-bf04-051fd3dde8af" (UID: "b7d3be32-a1ae-4a91-bf04-051fd3dde8af"). InnerVolumeSpecName "kube-api-access-5lbrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.051350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d3be32-a1ae-4a91-bf04-051fd3dde8af" (UID: "b7d3be32-a1ae-4a91-bf04-051fd3dde8af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.072989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data" (OuterVolumeSpecName: "config-data") pod "b7d3be32-a1ae-4a91-bf04-051fd3dde8af" (UID: "b7d3be32-a1ae-4a91-bf04-051fd3dde8af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.129685 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.129713 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.129723 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.129732 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lbrk\" (UniqueName: \"kubernetes.io/projected/b7d3be32-a1ae-4a91-bf04-051fd3dde8af-kube-api-access-5lbrk\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: E0121 15:55:01.295164 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc197e98a_7956_4c34_a31a_911a28cf4a1a.slice/crio-20c5b4970d15c1a96b2d975842b377379f7937b4b5a64f863e062eb4ad7f03cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1e5821_44a4_4219_aec0_56618e7dba22.slice/crio-conmon-26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1ba53ab_abb3_455d_979b_cad1eab8e90a.slice/crio-conmon-310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35251dbe_ab48_4cf5_b556_941dc2cfe6cc.slice/crio-4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc197e98a_7956_4c34_a31a_911a28cf4a1a.slice/crio-conmon-0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d3be32_a1ae_4a91_bf04_051fd3dde8af.slice/crio-conmon-9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc197e98a_7956_4c34_a31a_911a28cf4a1a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1ba53ab_abb3_455d_979b_cad1eab8e90a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1e5821_44a4_4219_aec0_56618e7dba22.slice/crio-26520914b821108acae79993a688ef0d40bcf318de2a075a0ebda802ceed0704.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa206fe_a991_4947_8e0e_0a8f80879f55.slice/crio-e63269a3cac96ab1f9b8f88518da6d80d552868292608720525ace78853503c7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d3be32_a1ae_4a91_bf04_051fd3dde8af.slice/crio-9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e495769_a033_41f2_8285_3f3274673250.slice/crio-9d4e07313400d53cb0cb4138c088d5bde34b94479ebb36b4c0f0e080abe3f83f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc197e98a_7956_4c34_a31a_911a28cf4a1a.slice/crio-0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a307a5c_9c6e_42d4_80a8_7d6e50650fcd.slice/crio-78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db5205c_c226_4ce2_b734_b250d0c4f7f5.slice/crio-6a969f47e9705be303cdf57e8935315f4c39ccdbb4dfcf02844b2537d3ff7ab1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db5205c_c226_4ce2_b734_b250d0c4f7f5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e495769_a033_41f2_8285_3f3274673250.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e495769_a033_41f2_8285_3f3274673250.slice/crio-1f5c2ad6dc184b893e357cc028dbdae6e81fb88a7ced1e92e1fde435135a5c45.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1e5821_44a4_4219_aec0_56618e7dba22.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35251dbe_ab48_4cf5_b556_941dc2cfe6cc.slice/crio-conmon-4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa9d3f6_1251_4d45_b274_1bb23f9e4dea.slice/crio-b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1ba53ab_abb3_455d_979b_cad1eab8e90a.slice/crio-6ddc0dfa01a3bd9696865be0af3cf90bf3c02e7d3e1fd17c4b3a8c19c5ca9f93\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e495769_a033_41f2_8285_3f3274673250.slice/crio-conmon-25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa9d3f6_1251_4d45_b274_1bb23f9e4dea.slice/crio-conmon-b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e495769_a033_41f2_8285_3f3274673250.slice/crio-25f3e8b0872a1a54b185de704b5ec62ab781ae4a3f4d37388c4d9076ad1d3d92.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1ba53ab_abb3_455d_979b_cad1eab8e90a.slice/crio-310a27083a80fcfd83e72f73648bc21b32add2b072a89229f08197c1bb944563.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.321365 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.433727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-combined-ca-bundle\") pod \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.433830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-logs\") pod \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.433860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data-custom\") pod \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.433941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data\") pod \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.433996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9rwm\" (UniqueName: \"kubernetes.io/projected/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-kube-api-access-g9rwm\") pod \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\" (UID: \"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea\") " Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.434153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-logs" (OuterVolumeSpecName: "logs") pod "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" (UID: "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.434607 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.437068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" (UID: "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.437185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-kube-api-access-g9rwm" (OuterVolumeSpecName: "kube-api-access-g9rwm") pod "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" (UID: "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea"). InnerVolumeSpecName "kube-api-access-g9rwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.453836 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" (UID: "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.471411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data" (OuterVolumeSpecName: "config-data") pod "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" (UID: "1aa9d3f6-1251-4d45-b274-1bb23f9e4dea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.536648 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.536673 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9rwm\" (UniqueName: \"kubernetes.io/projected/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-kube-api-access-g9rwm\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.536684 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.536693 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.729722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" event={"ID":"b7d3be32-a1ae-4a91-bf04-051fd3dde8af","Type":"ContainerDied","Data":"3dfd9e4fed4070cd11fa67fab255b2afe50e8c44e87dc32ad836842420c0f197"} Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.729755 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5665575645-v8ktc" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.729771 4707 scope.go:117] "RemoveContainer" containerID="9832a63a4444f19da99ed41ffa2ee903cf61ce3db5cfce79cec9a1c0b9b19f58" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.732332 4707 generic.go:334] "Generic (PLEG): container finished" podID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerID="b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019" exitCode=137 Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.732369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" event={"ID":"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea","Type":"ContainerDied","Data":"b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019"} Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.732393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" event={"ID":"1aa9d3f6-1251-4d45-b274-1bb23f9e4dea","Type":"ContainerDied","Data":"8a3175ee33cf8cb0b2ce91ef546f6d5735dfc0d6a65ce7c80f7b289c294fc9ee"} Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.732400 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.792709 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt"] Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.793856 4707 scope.go:117] "RemoveContainer" containerID="e4cf394238e37ac5d15301506b18d0ba417d6a80b24ad0f97a43b3434616dd5a" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.802489 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9d5989c46-nlxmt"] Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.809754 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5665575645-v8ktc"] Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.813046 4707 scope.go:117] "RemoveContainer" containerID="b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.816792 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5665575645-v8ktc"] Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.833459 4707 scope.go:117] "RemoveContainer" containerID="ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.855941 4707 scope.go:117] "RemoveContainer" containerID="b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019" Jan 21 15:55:01 crc kubenswrapper[4707]: E0121 15:55:01.856308 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019\": container with ID starting with b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019 not found: ID does not exist" containerID="b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.856341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019"} err="failed to get container status \"b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019\": rpc error: code = NotFound desc = could not find container \"b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019\": container with ID starting with b75adeec3b29ad286391df4c4f6528cf6ad49be6bfb5337c3dead407ebe24019 not found: ID does not exist" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.856359 4707 scope.go:117] "RemoveContainer" containerID="ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297" Jan 21 15:55:01 crc kubenswrapper[4707]: E0121 15:55:01.856639 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297\": container with ID starting with ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297 not found: ID does not exist" containerID="ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297" Jan 21 15:55:01 crc kubenswrapper[4707]: I0121 15:55:01.856670 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297"} err="failed to get container status 
\"ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297\": rpc error: code = NotFound desc = could not find container \"ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297\": container with ID starting with ce748d10b137f1302ba54aca883bcfd4a3ac775a7b02c6d3ea4e8e0d058c7297 not found: ID does not exist" Jan 21 15:55:02 crc kubenswrapper[4707]: I0121 15:55:02.068333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:02 crc kubenswrapper[4707]: I0121 15:55:02.068568 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:02 crc kubenswrapper[4707]: I0121 15:55:02.088153 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:55:02 crc kubenswrapper[4707]: I0121 15:55:02.102009 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:55:02 crc kubenswrapper[4707]: I0121 15:55:02.760015 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.080052 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.080069 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.115049 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.190555 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" path="/var/lib/kubelet/pods/1aa9d3f6-1251-4d45-b274-1bb23f9e4dea/volumes" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.191296 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" path="/var/lib/kubelet/pods/b7d3be32-a1ae-4a91-bf04-051fd3dde8af/volumes" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.263462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-public-tls-certs\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.263499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-config-data\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.263523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-internal-tls-certs\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.263558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv8lc\" (UniqueName: \"kubernetes.io/projected/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-kube-api-access-mv8lc\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.263575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-combined-ca-bundle\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.264143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-scripts\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.264178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-logs\") pod \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\" (UID: \"35251dbe-ab48-4cf5-b556-941dc2cfe6cc\") " Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.265764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-logs" (OuterVolumeSpecName: "logs") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.268415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-scripts" (OuterVolumeSpecName: "scripts") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.268678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-kube-api-access-mv8lc" (OuterVolumeSpecName: "kube-api-access-mv8lc") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "kube-api-access-mv8lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.306471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.318286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-config-data" (OuterVolumeSpecName: "config-data") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.337719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.353866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "35251dbe-ab48-4cf5-b556-941dc2cfe6cc" (UID: "35251dbe-ab48-4cf5-b556-941dc2cfe6cc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.365861 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.365952 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.366007 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.366063 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv8lc\" (UniqueName: \"kubernetes.io/projected/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-kube-api-access-mv8lc\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.366114 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.366160 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.366205 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35251dbe-ab48-4cf5-b556-941dc2cfe6cc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.568614 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.643768 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7fd958d6cb-2thfc"] Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.643995 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" podUID="239fec41-be3f-42d6-b0f2-a17c6c13efc8" containerName="keystone-api" containerID="cri-o://7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a" gracePeriod=30 Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.754400 4707 generic.go:334] "Generic (PLEG): container finished" podID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerID="d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b" exitCode=0 Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.754463 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.754485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" event={"ID":"35251dbe-ab48-4cf5-b556-941dc2cfe6cc","Type":"ContainerDied","Data":"d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b"} Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.754521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh" event={"ID":"35251dbe-ab48-4cf5-b556-941dc2cfe6cc","Type":"ContainerDied","Data":"2f348aa12604f359438109c5914df2a157cd4d6585e01eedf61fd0c92542551b"} Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.754547 4707 scope.go:117] "RemoveContainer" containerID="d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.791445 4707 scope.go:117] "RemoveContainer" containerID="4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.799076 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh"] Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.810614 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-cdbcc8f6b-xn7nh"] Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.812181 4707 scope.go:117] "RemoveContainer" containerID="d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b" Jan 21 15:55:03 crc kubenswrapper[4707]: E0121 15:55:03.812537 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b\": container with ID starting with d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b not found: ID does not exist" containerID="d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.812617 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b"} err="failed to get container status \"d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b\": rpc error: code = NotFound desc = could not find container \"d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b\": container with ID starting with d1e2044df9637480699dd93c37a13c268eab5eb9604b7134e23ad9afbf9c9f1b not found: ID does not exist" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.812689 4707 scope.go:117] "RemoveContainer" containerID="4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5" Jan 21 15:55:03 crc kubenswrapper[4707]: E0121 15:55:03.813420 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5\": container with ID starting with 4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5 not found: ID does not exist" containerID="4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.813489 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5"} err="failed to 
get container status \"4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5\": rpc error: code = NotFound desc = could not find container \"4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5\": container with ID starting with 4c05b3a4dbd97bedd65068a85ad1e71e5d132c8baee39a159fc2be1cf1ae2ef5 not found: ID does not exist" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.969735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:55:03 crc kubenswrapper[4707]: I0121 15:55:03.989096 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:55:05 crc kubenswrapper[4707]: I0121 15:55:05.191185 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" path="/var/lib/kubelet/pods/35251dbe-ab48-4cf5-b556-941dc2cfe6cc/volumes" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550257 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx"] Jan 21 15:55:06 crc kubenswrapper[4707]: E0121 15:55:06.550635 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550650 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker" Jan 21 15:55:06 crc kubenswrapper[4707]: E0121 15:55:06.550662 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener-log" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550668 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener-log" Jan 21 15:55:06 crc kubenswrapper[4707]: E0121 15:55:06.550678 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-log" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-log" Jan 21 15:55:06 crc kubenswrapper[4707]: E0121 15:55:06.550694 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550699 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener" Jan 21 15:55:06 crc kubenswrapper[4707]: E0121 15:55:06.550715 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker-log" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550721 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker-log" Jan 21 15:55:06 crc kubenswrapper[4707]: E0121 15:55:06.550730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-api" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550734 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" 
containerName="placement-api" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550911 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550932 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa9d3f6-1251-4d45-b274-1bb23f9e4dea" containerName="barbican-keystone-listener-log" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550950 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d3be32-a1ae-4a91-bf04-051fd3dde8af" containerName="barbican-worker-log" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550962 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-api" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.550972 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35251dbe-ab48-4cf5-b556-941dc2cfe6cc" containerName="placement-log" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.551887 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.563020 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx"] Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.620200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-etc-swift\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.620461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-log-httpd\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.620652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-combined-ca-bundle\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.620745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-run-httpd\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.620868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-public-tls-certs\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.620958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-config-data\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.621049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrn9\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-kube-api-access-xzrn9\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.621136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-internal-tls-certs\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-combined-ca-bundle\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-run-httpd\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-public-tls-certs\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-config-data\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrn9\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-kube-api-access-xzrn9\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 
15:55:06.722780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-internal-tls-certs\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-etc-swift\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.722970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-run-httpd\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.723098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-log-httpd\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.723608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-log-httpd\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.727880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-internal-tls-certs\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.727940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-combined-ca-bundle\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.728627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-public-tls-certs\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.731895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-config-data\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.735390 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrn9\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-kube-api-access-xzrn9\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.743132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-etc-swift\") pod \"swift-proxy-68ccdc6bf4-gkfnx\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.849822 4707 scope.go:117] "RemoveContainer" containerID="3390dfaa1745c314422218fd6221fa92f2714b8258f227fa76da5bce0cf4e8f9" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.868943 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.919378 4707 scope.go:117] "RemoveContainer" containerID="971937516b3b19a806e3c97a43385c931b217beac81e6d595bdb7f3762b9b713" Jan 21 15:55:06 crc kubenswrapper[4707]: I0121 15:55:06.981648 4707 scope.go:117] "RemoveContainer" containerID="bd309fbc3fb22feff248a50774d5c25fb157e048f9cc47f6f6dd0f30e2a730b7" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.016126 4707 scope.go:117] "RemoveContainer" containerID="84b7c48321255d05f9f4ae3bebca5603a4d0fb4d5ce26d0a6812c180a204cc28" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.091134 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.092969 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.103961 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.115551 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58fxm\" (UniqueName: \"kubernetes.io/projected/239fec41-be3f-42d6-b0f2-a17c6c13efc8-kube-api-access-58fxm\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-combined-ca-bundle\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231257 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-credential-keys\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-fernet-keys\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-scripts\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-config-data\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-public-tls-certs\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.231466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-internal-tls-certs\") pod \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\" (UID: \"239fec41-be3f-42d6-b0f2-a17c6c13efc8\") " Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.236622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.236661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-scripts" (OuterVolumeSpecName: "scripts") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.236980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239fec41-be3f-42d6-b0f2-a17c6c13efc8-kube-api-access-58fxm" (OuterVolumeSpecName: "kube-api-access-58fxm") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "kube-api-access-58fxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.237071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.254151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.258479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-config-data" (OuterVolumeSpecName: "config-data") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.270134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.276520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "239fec41-be3f-42d6-b0f2-a17c6c13efc8" (UID: "239fec41-be3f-42d6-b0f2-a17c6c13efc8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.332390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx"] Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335404 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335433 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335498 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335511 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58fxm\" (UniqueName: \"kubernetes.io/projected/239fec41-be3f-42d6-b0f2-a17c6c13efc8-kube-api-access-58fxm\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335520 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335529 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335536 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.335545 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239fec41-be3f-42d6-b0f2-a17c6c13efc8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:07 crc kubenswrapper[4707]: W0121 15:55:07.351917 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc397cca9_7943_4169_9984_c5ec464de8de.slice/crio-6aa037d0871bf6ede6898ff8c7b5f47651612b51ea0f557be4abdb12d76759d7 WatchSource:0}: Error finding container 6aa037d0871bf6ede6898ff8c7b5f47651612b51ea0f557be4abdb12d76759d7: Status 404 returned error can't find the container with id 6aa037d0871bf6ede6898ff8c7b5f47651612b51ea0f557be4abdb12d76759d7 Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.799494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" event={"ID":"c397cca9-7943-4169-9984-c5ec464de8de","Type":"ContainerStarted","Data":"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b"} Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.799883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.799896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" event={"ID":"c397cca9-7943-4169-9984-c5ec464de8de","Type":"ContainerStarted","Data":"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356"} Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.799907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" event={"ID":"c397cca9-7943-4169-9984-c5ec464de8de","Type":"ContainerStarted","Data":"6aa037d0871bf6ede6898ff8c7b5f47651612b51ea0f557be4abdb12d76759d7"} Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.802541 4707 generic.go:334] "Generic (PLEG): container finished" podID="239fec41-be3f-42d6-b0f2-a17c6c13efc8" containerID="7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a" exitCode=0 Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.802585 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.802637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" event={"ID":"239fec41-be3f-42d6-b0f2-a17c6c13efc8","Type":"ContainerDied","Data":"7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a"} Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.802678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fd958d6cb-2thfc" event={"ID":"239fec41-be3f-42d6-b0f2-a17c6c13efc8","Type":"ContainerDied","Data":"0ace1425f2ca8e06ded67088cecc0119f964b3992e922c6374e0fee32c40fab2"} Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.802717 4707 scope.go:117] "RemoveContainer" containerID="7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.809101 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.814872 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" podStartSLOduration=1.8148618 podStartE2EDuration="1.8148618s" podCreationTimestamp="2026-01-21 15:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:55:07.811937581 +0000 UTC m=+3204.993453802" watchObservedRunningTime="2026-01-21 15:55:07.8148618 +0000 UTC m=+3204.996378021" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.873142 4707 scope.go:117] "RemoveContainer" containerID="7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a" Jan 21 15:55:07 crc kubenswrapper[4707]: E0121 15:55:07.873840 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a\": container with ID starting with 7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a not found: ID does not exist" containerID="7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.873887 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a"} err="failed to get container status \"7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a\": rpc 
error: code = NotFound desc = could not find container \"7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a\": container with ID starting with 7b47ce8e6a788d261e7b7aa7273176350d2b52f1d57562fafcb40b9f45b7694a not found: ID does not exist" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.882273 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7fd958d6cb-2thfc"] Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.890929 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7fd958d6cb-2thfc"] Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.933469 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.933704 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" containerName="openstackclient" containerID="cri-o://b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f" gracePeriod=2 Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.939536 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.972558 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:55:07 crc kubenswrapper[4707]: E0121 15:55:07.972948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" containerName="openstackclient" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.972968 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" containerName="openstackclient" Jan 21 15:55:07 crc kubenswrapper[4707]: E0121 15:55:07.972999 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239fec41-be3f-42d6-b0f2-a17c6c13efc8" containerName="keystone-api" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.973006 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="239fec41-be3f-42d6-b0f2-a17c6c13efc8" containerName="keystone-api" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.973197 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" containerName="openstackclient" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.973218 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="239fec41-be3f-42d6-b0f2-a17c6c13efc8" containerName="keystone-api" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.974240 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.976453 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" Jan 21 15:55:07 crc kubenswrapper[4707]: I0121 15:55:07.980794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.046968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.047233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fcv\" (UniqueName: \"kubernetes.io/projected/9d0e8b27-be36-4a9b-a897-cdfd131cf319-kube-api-access-65fcv\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.047350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.047417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.149070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.149142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65fcv\" (UniqueName: \"kubernetes.io/projected/9d0e8b27-be36-4a9b-a897-cdfd131cf319-kube-api-access-65fcv\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.149176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.149203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.149842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.153495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.160918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65fcv\" (UniqueName: \"kubernetes.io/projected/9d0e8b27-be36-4a9b-a897-cdfd131cf319-kube-api-access-65fcv\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.166632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.292453 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.812622 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:08 crc kubenswrapper[4707]: I0121 15:55:08.838832 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:55:09 crc kubenswrapper[4707]: I0121 15:55:09.190855 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239fec41-be3f-42d6-b0f2-a17c6c13efc8" path="/var/lib/kubelet/pods/239fec41-be3f-42d6-b0f2-a17c6c13efc8/volumes" Jan 21 15:55:09 crc kubenswrapper[4707]: I0121 15:55:09.820485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"9d0e8b27-be36-4a9b-a897-cdfd131cf319","Type":"ContainerStarted","Data":"bcdaa3a0428c88cb5e1d698160a35a96637e98eb4c2dcd7cf69eb5077af17c0b"} Jan 21 15:55:09 crc kubenswrapper[4707]: I0121 15:55:09.820518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"9d0e8b27-be36-4a9b-a897-cdfd131cf319","Type":"ContainerStarted","Data":"faed4d2a7c90219f52b588384a6fb3b85676eb45a76c142cd3a817d529c1d8a1"} Jan 21 15:55:09 crc kubenswrapper[4707]: I0121 15:55:09.845035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.845022256 podStartE2EDuration="2.845022256s" podCreationTimestamp="2026-01-21 15:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:55:09.831662995 +0000 UTC m=+3207.013179217" watchObservedRunningTime="2026-01-21 15:55:09.845022256 +0000 UTC m=+3207.026538478" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.173075 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.281750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-combined-ca-bundle\") pod \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.282348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config\") pod \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.283630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config-secret\") pod \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.283724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmm2j\" (UniqueName: \"kubernetes.io/projected/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-kube-api-access-tmm2j\") pod \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\" (UID: \"c6c4cdde-65b4-4956-aaf6-e7dba05f869d\") " Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.289729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-kube-api-access-tmm2j" (OuterVolumeSpecName: "kube-api-access-tmm2j") pod "c6c4cdde-65b4-4956-aaf6-e7dba05f869d" (UID: "c6c4cdde-65b4-4956-aaf6-e7dba05f869d"). InnerVolumeSpecName "kube-api-access-tmm2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.310168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6c4cdde-65b4-4956-aaf6-e7dba05f869d" (UID: "c6c4cdde-65b4-4956-aaf6-e7dba05f869d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.313043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c6c4cdde-65b4-4956-aaf6-e7dba05f869d" (UID: "c6c4cdde-65b4-4956-aaf6-e7dba05f869d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.326018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c6c4cdde-65b4-4956-aaf6-e7dba05f869d" (UID: "c6c4cdde-65b4-4956-aaf6-e7dba05f869d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.386434 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.386703 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.386716 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmm2j\" (UniqueName: \"kubernetes.io/projected/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-kube-api-access-tmm2j\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.386727 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c4cdde-65b4-4956-aaf6-e7dba05f869d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.830759 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" containerID="b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f" exitCode=137 Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.830847 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.830871 4707 scope.go:117] "RemoveContainer" containerID="b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.843776 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.847229 4707 scope.go:117] "RemoveContainer" containerID="b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f" Jan 21 15:55:10 crc kubenswrapper[4707]: E0121 15:55:10.847592 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f\": container with ID starting with b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f not found: ID does not exist" containerID="b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f" Jan 21 15:55:10 crc kubenswrapper[4707]: I0121 15:55:10.847635 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f"} err="failed to get container status \"b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f\": rpc error: code = NotFound desc = could not find container \"b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f\": container with ID starting with b1e4b07fb51124da4462276d3f18da423c0f734c9542e73ad6bf05aacae25f9f not found: ID does not exist" Jan 21 15:55:11 crc kubenswrapper[4707]: I0121 15:55:11.192438 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c4cdde-65b4-4956-aaf6-e7dba05f869d" 
path="/var/lib/kubelet/pods/c6c4cdde-65b4-4956-aaf6-e7dba05f869d/volumes" Jan 21 15:55:11 crc kubenswrapper[4707]: E0121 15:55:11.528801 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a307a5c_9c6e_42d4_80a8_7d6e50650fcd.slice/crio-78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97\": RecentStats: unable to find data in memory cache]" Jan 21 15:55:11 crc kubenswrapper[4707]: I0121 15:55:11.973367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.019804 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-cbb456574-nld82"] Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.020067 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-api" containerID="cri-o://59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1" gracePeriod=30 Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.020146 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-httpd" containerID="cri-o://9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6" gracePeriod=30 Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.074450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.074987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.084504 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.096038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.854127 4707 generic.go:334] "Generic (PLEG): container finished" podID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerID="9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6" exitCode=0 Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.854189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" event={"ID":"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1","Type":"ContainerDied","Data":"9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6"} Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.855044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:12 crc kubenswrapper[4707]: I0121 15:55:12.861200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.181923 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:55:14 crc kubenswrapper[4707]: E0121 15:55:14.182162 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.825438 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.874709 4707 generic.go:334] "Generic (PLEG): container finished" podID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerID="59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1" exitCode=0 Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.874760 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.874769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" event={"ID":"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1","Type":"ContainerDied","Data":"59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1"} Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.874819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cbb456574-nld82" event={"ID":"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1","Type":"ContainerDied","Data":"365f9f6f1807fddd92ab88221bda5d20a734a8e6bbe893232c6bec357b599dd2"} Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.874833 4707 scope.go:117] "RemoveContainer" containerID="9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.896114 4707 scope.go:117] "RemoveContainer" containerID="59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.912178 4707 scope.go:117] "RemoveContainer" containerID="9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6" Jan 21 15:55:14 crc kubenswrapper[4707]: E0121 15:55:14.912690 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6\": container with ID starting with 9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6 not found: ID does not exist" containerID="9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.912722 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6"} err="failed to get container status \"9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6\": rpc error: code = NotFound desc = could not find container \"9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6\": container with ID starting with 9187b519915886e3b7605db76da8e0ce3c6ecd7e202d2d2e82bed2b7b3b583d6 not found: ID does not exist" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.912741 4707 scope.go:117] "RemoveContainer" containerID="59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1" Jan 21 15:55:14 crc kubenswrapper[4707]: E0121 15:55:14.913214 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1\": container with ID starting with 59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1 not found: ID does not exist" containerID="59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.913251 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1"} err="failed to get container status \"59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1\": rpc error: code = NotFound desc = could not find container \"59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1\": container with ID starting with 59c4f4633b0dbe65e51b0c7be7201be44d6a3cbbd04c59be3dca1717e23700f1 not found: ID does not exist" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.970926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-combined-ca-bundle\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.970989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6tnw\" (UniqueName: \"kubernetes.io/projected/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-kube-api-access-c6tnw\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.971046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-internal-tls-certs\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.971116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-httpd-config\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.971643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-ovndb-tls-certs\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.971669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-public-tls-certs\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.971707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-config\") pod \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\" (UID: \"eeaa175c-a01b-47a8-a65b-ca5ec32f18d1\") " Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.975629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:14 crc kubenswrapper[4707]: I0121 15:55:14.975843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-kube-api-access-c6tnw" (OuterVolumeSpecName: "kube-api-access-c6tnw") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "kube-api-access-c6tnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.008291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.008641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.009057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.016942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-config" (OuterVolumeSpecName: "config") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.023446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" (UID: "eeaa175c-a01b-47a8-a65b-ca5ec32f18d1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073542 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073758 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073768 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073777 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073786 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073795 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6tnw\" (UniqueName: \"kubernetes.io/projected/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-kube-api-access-c6tnw\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.073802 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.201598 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-cbb456574-nld82"] Jan 21 15:55:15 crc kubenswrapper[4707]: I0121 15:55:15.208513 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-cbb456574-nld82"] Jan 21 15:55:16 crc kubenswrapper[4707]: I0121 15:55:16.873187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:16 crc kubenswrapper[4707]: I0121 15:55:16.874385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:55:16 crc kubenswrapper[4707]: I0121 15:55:16.956451 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7784df594-cj4dv"] Jan 21 15:55:16 crc kubenswrapper[4707]: I0121 15:55:16.956666 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-httpd" containerID="cri-o://092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a" gracePeriod=30 Jan 21 15:55:16 crc kubenswrapper[4707]: I0121 15:55:16.956802 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-server" 
containerID="cri-o://26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d" gracePeriod=30 Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.035565 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.196171 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" path="/var/lib/kubelet/pods/eeaa175c-a01b-47a8-a65b-ca5ec32f18d1/volumes" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.456047 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.500600 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-77c8997466-bd6bs" podUID="c197e98a-7956-4c34-a31a-911a28cf4a1a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.144:9696/\": dial tcp 10.217.0.144:9696: i/o timeout" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-etc-swift\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-config-data\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-run-httpd\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-log-httpd\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-combined-ca-bundle\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-internal-tls-certs\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4pz\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-kube-api-access-nl4pz\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 
21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.618958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-public-tls-certs\") pod \"a11afd40-c28b-4140-8546-fd2a270a2931\" (UID: \"a11afd40-c28b-4140-8546-fd2a270a2931\") " Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.619004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.619078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.619391 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.619409 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11afd40-c28b-4140-8546-fd2a270a2931-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.625198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-kube-api-access-nl4pz" (OuterVolumeSpecName: "kube-api-access-nl4pz") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "kube-api-access-nl4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.625472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.657513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-config-data" (OuterVolumeSpecName: "config-data") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.662383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.668848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.677188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a11afd40-c28b-4140-8546-fd2a270a2931" (UID: "a11afd40-c28b-4140-8546-fd2a270a2931"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.721053 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.721087 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.721100 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.721110 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl4pz\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-kube-api-access-nl4pz\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.721120 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a11afd40-c28b-4140-8546-fd2a270a2931-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.721130 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a11afd40-c28b-4140-8546-fd2a270a2931-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.907755 4707 generic.go:334] "Generic (PLEG): container finished" podID="a11afd40-c28b-4140-8546-fd2a270a2931" containerID="26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d" exitCode=0 Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.908173 4707 generic.go:334] "Generic (PLEG): container finished" podID="a11afd40-c28b-4140-8546-fd2a270a2931" containerID="092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a" exitCode=0 Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.907895 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.907827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" event={"ID":"a11afd40-c28b-4140-8546-fd2a270a2931","Type":"ContainerDied","Data":"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d"} Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.908276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" event={"ID":"a11afd40-c28b-4140-8546-fd2a270a2931","Type":"ContainerDied","Data":"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a"} Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.908308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7784df594-cj4dv" event={"ID":"a11afd40-c28b-4140-8546-fd2a270a2931","Type":"ContainerDied","Data":"5d26c8dfb940791e8964868cf2182628dc4e34d4e0ab3ff6643c0317a64d5e81"} Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.908336 4707 scope.go:117] "RemoveContainer" containerID="26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.938535 4707 scope.go:117] "RemoveContainer" containerID="092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.946653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7784df594-cj4dv"] Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.953502 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7784df594-cj4dv"] Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.959677 4707 scope.go:117] "RemoveContainer" containerID="26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d" Jan 21 15:55:17 crc kubenswrapper[4707]: E0121 15:55:17.960299 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d\": container with ID starting with 26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d not found: ID does not exist" containerID="26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.960340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d"} err="failed to get container status \"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d\": rpc error: code = NotFound desc = could not find container \"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d\": container with ID starting with 26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d not found: ID does not exist" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.960365 4707 scope.go:117] "RemoveContainer" containerID="092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a" Jan 21 15:55:17 crc kubenswrapper[4707]: E0121 15:55:17.960616 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a\": container with ID starting with 092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a not found: ID does not 
exist" containerID="092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.960646 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a"} err="failed to get container status \"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a\": rpc error: code = NotFound desc = could not find container \"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a\": container with ID starting with 092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a not found: ID does not exist" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.960662 4707 scope.go:117] "RemoveContainer" containerID="26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.961060 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d"} err="failed to get container status \"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d\": rpc error: code = NotFound desc = could not find container \"26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d\": container with ID starting with 26587527c59c27ac54a76aa2c3792c9c419c184e2db28d1c1c841fe746c99d8d not found: ID does not exist" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.961097 4707 scope.go:117] "RemoveContainer" containerID="092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a" Jan 21 15:55:17 crc kubenswrapper[4707]: I0121 15:55:17.961380 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a"} err="failed to get container status \"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a\": rpc error: code = NotFound desc = could not find container \"092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a\": container with ID starting with 092feed4d3b66af443106981abdba5e353e0e55a293920469f6ba84b97ef832a not found: ID does not exist" Jan 21 15:55:19 crc kubenswrapper[4707]: I0121 15:55:19.191916 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" path="/var/lib/kubelet/pods/a11afd40-c28b-4140-8546-fd2a270a2931/volumes" Jan 21 15:55:21 crc kubenswrapper[4707]: E0121 15:55:21.710520 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a307a5c_9c6e_42d4_80a8_7d6e50650fcd.slice/crio-78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97\": RecentStats: unable to find data in memory cache]" Jan 21 15:55:28 crc kubenswrapper[4707]: I0121 15:55:28.182463 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:55:28 crc kubenswrapper[4707]: E0121 15:55:28.182985 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" 
Jan 21 15:55:31 crc kubenswrapper[4707]: E0121 15:55:31.892407 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a307a5c_9c6e_42d4_80a8_7d6e50650fcd.slice/crio-78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97\": RecentStats: unable to find data in memory cache]" Jan 21 15:55:41 crc kubenswrapper[4707]: I0121 15:55:41.183593 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:55:41 crc kubenswrapper[4707]: E0121 15:55:41.184211 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:55:42 crc kubenswrapper[4707]: E0121 15:55:42.077456 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a307a5c_9c6e_42d4_80a8_7d6e50650fcd.slice/crio-78f5cd0ba65a53b051cc3795eff3c1fc240e7bcf238ab6cfaab628988c473f97\": RecentStats: unable to find data in memory cache]" Jan 21 15:55:54 crc kubenswrapper[4707]: I0121 15:55:54.182246 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:55:54 crc kubenswrapper[4707]: E0121 15:55:54.182828 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:56:06 crc kubenswrapper[4707]: I0121 15:56:06.182313 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:56:06 crc kubenswrapper[4707]: E0121 15:56:06.182992 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:56:08 crc kubenswrapper[4707]: I0121 15:56:08.063863 4707 scope.go:117] "RemoveContainer" containerID="6eb520f1f4eea9ecf2295d30d78fe4d6d732051d1f8bbded341746250154bc1d" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.334109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.334633 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" containerName="openstackclient" containerID="cri-o://bcdaa3a0428c88cb5e1d698160a35a96637e98eb4c2dcd7cf69eb5077af17c0b" gracePeriod=2 Jan 21 15:56:17 
crc kubenswrapper[4707]: I0121 15:56:17.345311 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.360524 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.369926 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6ncbl"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.412823 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5"] Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.419845 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-httpd" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.419867 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-httpd" Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.419893 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-httpd" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.419899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-httpd" Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.419916 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-server" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.419921 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-server" Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.419934 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-api" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.419939 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-api" Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.419947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" containerName="openstackclient" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.419952 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" containerName="openstackclient" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.420108 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-httpd" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.420118 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" containerName="openstackclient" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.420129 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-httpd" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.420142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11afd40-c28b-4140-8546-fd2a270a2931" containerName="proxy-server" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.420156 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="eeaa175c-a01b-47a8-a65b-ca5ec32f18d1" containerName="neutron-api" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.420699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.429151 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.455187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.459229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2d2743-f38a-48d3-bef6-bc752af7f488-operator-scripts\") pod \"cinder-fe8e-account-create-update-6nbj5\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.459320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpj7g\" (UniqueName: \"kubernetes.io/projected/ed2d2743-f38a-48d3-bef6-bc752af7f488-kube-api-access-jpj7g\") pod \"cinder-fe8e-account-create-update-6nbj5\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.475781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.499225 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-6jxtw"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.543343 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.544364 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.554038 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.560765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpj7g\" (UniqueName: \"kubernetes.io/projected/ed2d2743-f38a-48d3-bef6-bc752af7f488-kube-api-access-jpj7g\") pod \"cinder-fe8e-account-create-update-6nbj5\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.560900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2d2743-f38a-48d3-bef6-bc752af7f488-operator-scripts\") pod \"cinder-fe8e-account-create-update-6nbj5\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.561546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2d2743-f38a-48d3-bef6-bc752af7f488-operator-scripts\") pod \"cinder-fe8e-account-create-update-6nbj5\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.561582 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8j2jr"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.562547 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.581845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.581900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.603471 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8j2jr"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.628499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpj7g\" (UniqueName: \"kubernetes.io/projected/ed2d2743-f38a-48d3-bef6-bc752af7f488-kube-api-access-jpj7g\") pod \"cinder-fe8e-account-create-update-6nbj5\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.666072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt656\" (UniqueName: \"kubernetes.io/projected/11c197e2-8546-4ef2-8964-86a97f767446-kube-api-access-vt656\") pod \"barbican-5988-account-create-update-sdl8z\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.666338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c197e2-8546-4ef2-8964-86a97f767446-operator-scripts\") pod \"barbican-5988-account-create-update-sdl8z\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.683057 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kjkdz"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.708779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.738907 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kjkdz"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.746174 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.768001 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.769153 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.769747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt656\" (UniqueName: \"kubernetes.io/projected/11c197e2-8546-4ef2-8964-86a97f767446-kube-api-access-vt656\") pod \"barbican-5988-account-create-update-sdl8z\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.769875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c197e2-8546-4ef2-8964-86a97f767446-operator-scripts\") pod \"barbican-5988-account-create-update-sdl8z\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.770036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdmvl\" (UniqueName: \"kubernetes.io/projected/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-kube-api-access-xdmvl\") pod \"root-account-create-update-8j2jr\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.770158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts\") pod \"root-account-create-update-8j2jr\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.770598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c197e2-8546-4ef2-8964-86a97f767446-operator-scripts\") pod \"barbican-5988-account-create-update-sdl8z\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.771988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.791053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.795832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt656\" (UniqueName: \"kubernetes.io/projected/11c197e2-8546-4ef2-8964-86a97f767446-kube-api-access-vt656\") pod \"barbican-5988-account-create-update-sdl8z\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.855039 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.873062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdmvl\" (UniqueName: \"kubernetes.io/projected/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-kube-api-access-xdmvl\") pod 
\"root-account-create-update-8j2jr\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.873104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4226483-2932-4ba4-a0cd-d989688eb498-operator-scripts\") pod \"nova-api-b3eb-account-create-update-758s8\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.873217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts\") pod \"root-account-create-update-8j2jr\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.873323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs59c\" (UniqueName: \"kubernetes.io/projected/b4226483-2932-4ba4-a0cd-d989688eb498-kube-api-access-zs59c\") pod \"nova-api-b3eb-account-create-update-758s8\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.873839 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:17 crc kubenswrapper[4707]: E0121 15:56:17.873884 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data podName:1eaefcd6-8b64-47a3-8e3a-280d7130b63e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:18.373868485 +0000 UTC m=+3275.555384707 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data") pod "rabbitmq-cell1-server-0" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.874411 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-pvj99"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.874845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts\") pod \"root-account-create-update-8j2jr\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.880975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.881172 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="ovn-northd" containerID="cri-o://488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" gracePeriod=30 Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.881499 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="openstack-network-exporter" containerID="cri-o://edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9" gracePeriod=30 Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.894180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.908346 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.909680 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.913246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.934016 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.935237 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.935493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdmvl\" (UniqueName: \"kubernetes.io/projected/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-kube-api-access-xdmvl\") pod \"root-account-create-update-8j2jr\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.948779 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.951834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.973166 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg"] Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.974357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4226483-2932-4ba4-a0cd-d989688eb498-operator-scripts\") pod \"nova-api-b3eb-account-create-update-758s8\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.974536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs59c\" (UniqueName: \"kubernetes.io/projected/b4226483-2932-4ba4-a0cd-d989688eb498-kube-api-access-zs59c\") pod \"nova-api-b3eb-account-create-update-758s8\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.975490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4226483-2932-4ba4-a0cd-d989688eb498-operator-scripts\") pod \"nova-api-b3eb-account-create-update-758s8\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:17 crc kubenswrapper[4707]: I0121 15:56:17.982266 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-78d2-account-create-update-8drdj"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.008373 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.009685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.022853 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-78d2-account-create-update-8drdj"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.037477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.038135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-tt9db"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.057057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs59c\" (UniqueName: \"kubernetes.io/projected/b4226483-2932-4ba4-a0cd-d989688eb498-kube-api-access-zs59c\") pod \"nova-api-b3eb-account-create-update-758s8\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.061281 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-tt9db"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.119177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts\") pod \"nova-cell1-fd9c-account-create-update-49grg\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.119233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrfjj\" (UniqueName: \"kubernetes.io/projected/eb75667c-b816-420a-b6e1-bd1da2826e2e-kube-api-access-zrfjj\") pod \"nova-cell1-fd9c-account-create-update-49grg\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.119655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7hz\" (UniqueName: \"kubernetes.io/projected/49cd1191-1cd2-4afe-a66a-339adfd8cc18-kube-api-access-dc7hz\") pod \"nova-cell0-a56d-account-create-update-fsqrj\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.119961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49cd1191-1cd2-4afe-a66a-339adfd8cc18-operator-scripts\") pod \"nova-cell0-a56d-account-create-update-fsqrj\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.124288 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.126240 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.162534 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mczjr"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.193882 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.219364 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.235971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7hz\" (UniqueName: \"kubernetes.io/projected/49cd1191-1cd2-4afe-a66a-339adfd8cc18-kube-api-access-dc7hz\") pod \"nova-cell0-a56d-account-create-update-fsqrj\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.236015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49cd1191-1cd2-4afe-a66a-339adfd8cc18-operator-scripts\") pod \"nova-cell0-a56d-account-create-update-fsqrj\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.236107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts\") pod \"nova-cell1-fd9c-account-create-update-49grg\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.236132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrfjj\" (UniqueName: \"kubernetes.io/projected/eb75667c-b816-420a-b6e1-bd1da2826e2e-kube-api-access-zrfjj\") pod \"nova-cell1-fd9c-account-create-update-49grg\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.236418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d28d24-ed34-4aec-935a-ee218f56f442-operator-scripts\") pod \"neutron-b634-account-create-update-mqv4z\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.236467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc68n\" (UniqueName: \"kubernetes.io/projected/f5d28d24-ed34-4aec-935a-ee218f56f442-kube-api-access-zc68n\") pod \"neutron-b634-account-create-update-mqv4z\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: 
I0121 15:56:18.237382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts\") pod \"nova-cell1-fd9c-account-create-update-49grg\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.237789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49cd1191-1cd2-4afe-a66a-339adfd8cc18-operator-scripts\") pod \"nova-cell0-a56d-account-create-update-fsqrj\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.248869 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mczjr"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.259131 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-lsx2c"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.265787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7hz\" (UniqueName: \"kubernetes.io/projected/49cd1191-1cd2-4afe-a66a-339adfd8cc18-kube-api-access-dc7hz\") pod \"nova-cell0-a56d-account-create-update-fsqrj\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.266225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrfjj\" (UniqueName: \"kubernetes.io/projected/eb75667c-b816-420a-b6e1-bd1da2826e2e-kube-api-access-zrfjj\") pod \"nova-cell1-fd9c-account-create-update-49grg\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.267934 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-mjr9v"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.292241 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.322289 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.337657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d28d24-ed34-4aec-935a-ee218f56f442-operator-scripts\") pod \"neutron-b634-account-create-update-mqv4z\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.337713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc68n\" (UniqueName: \"kubernetes.io/projected/f5d28d24-ed34-4aec-935a-ee218f56f442-kube-api-access-zc68n\") pod \"neutron-b634-account-create-update-mqv4z\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.341432 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d28d24-ed34-4aec-935a-ee218f56f442-operator-scripts\") pod \"neutron-b634-account-create-update-mqv4z\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.361210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc68n\" (UniqueName: \"kubernetes.io/projected/f5d28d24-ed34-4aec-935a-ee218f56f442-kube-api-access-zc68n\") pod \"neutron-b634-account-create-update-mqv4z\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.375175 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.375470 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="openstack-network-exporter" containerID="cri-o://d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4" gracePeriod=300 Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.382518 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.393370 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.395021 4707 generic.go:334] "Generic (PLEG): container finished" podID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerID="edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9" exitCode=2 Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.395050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd","Type":"ContainerDied","Data":"edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9"} Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.401838 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-45blv"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.407839 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-6bb8-account-create-update-mrd6l"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.413414 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-45blv"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.418914 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.420447 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.440547 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-pcn5p"] Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.443084 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.443129 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data podName:1eaefcd6-8b64-47a3-8e3a-280d7130b63e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.443117778 +0000 UTC m=+3276.624634000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data") pod "rabbitmq-cell1-server-0" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.443842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.443884 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.443927 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data podName:1fb17d2c-7e86-4451-b56e-4e89fc494542 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:18.943910248 +0000 UTC m=+3276.125426470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data") pod "rabbitmq-server-0" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542") : configmap "rabbitmq-config-data" not found Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.444107 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="openstack-network-exporter" containerID="cri-o://fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844" gracePeriod=300 Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.469521 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="ovsdbserver-nb" containerID="cri-o://0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5" gracePeriod=300 Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.502745 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.503166 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.512647 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-jhb2f"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.535208 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.552057 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="ovsdbserver-sb" containerID="cri-o://ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2" gracePeriod=300 Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.552300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dbsz5"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.576852 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-t5t9t"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.595685 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dbsz5"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.611589 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5"] Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.693762 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:18 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:56:18 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:56:18 crc kubenswrapper[4707]: else Jan 21 15:56:18 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:18 crc kubenswrapper[4707]: fi Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:18 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:18 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:18 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:18 crc kubenswrapper[4707]: # support updates Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.694898 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" podUID="ed2d2743-f38a-48d3-bef6-bc752af7f488" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.759482 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tg5cr"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.767281 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tg5cr"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.774755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z"] Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.798406 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:18 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:56:18 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:56:18 crc kubenswrapper[4707]: else Jan 21 15:56:18 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:18 crc kubenswrapper[4707]: fi Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:18 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:18 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:18 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:18 crc kubenswrapper[4707]: # support updates Jan 21 15:56:18 crc kubenswrapper[4707]: Jan 21 15:56:18 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.799531 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" podUID="11c197e2-8546-4ef2-8964-86a97f767446" Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.862710 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.862925 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="cinder-scheduler" containerID="cri-o://ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39" gracePeriod=30 Jan 21 15:56:18 crc kubenswrapper[4707]: I0121 15:56:18.863247 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="probe" containerID="cri-o://ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8" gracePeriod=30 Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.968053 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:56:18 crc kubenswrapper[4707]: E0121 15:56:18.968110 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data podName:1fb17d2c-7e86-4451-b56e-4e89fc494542 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.968096203 +0000 UTC m=+3277.149612425 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data") pod "rabbitmq-server-0" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542") : configmap "rabbitmq-config-data" not found Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.087940 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.088401 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api-log" containerID="cri-o://bed84625b1724689ba24d64b9ce564f5af89b171fde3cda59c73d1d35a9d2648" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.088846 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api" containerID="cri-o://0f293a8697eb3310b255d467efcfb20b65db8c27463105113488bd5b244efaf7" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.181262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5469fd854-s2lnn"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.184005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-log" containerID="cri-o://cc0164bd92b9e0315765bd0cef3ea980987ab65b303b1a0cbcde9d86f9918385" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.184363 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-api" containerID="cri-o://b3b250ebed84579c975d272e553aa69f879cc66363e018d563bfefd4ff7ef997" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.184581 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" secret="" err="secret \"neutron-neutron-dockercfg-j7drz\" not found" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.185146 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.185362 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.203417 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025be631-d9c0-4382-a42f-1979af2fddd3" path="/var/lib/kubelet/pods/025be631-d9c0-4382-a42f-1979af2fddd3/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.204079 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058de5ec-317c-4ba6-8497-3abaa84e49db" path="/var/lib/kubelet/pods/058de5ec-317c-4ba6-8497-3abaa84e49db/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.205162 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8e495d-42cb-401a-ab41-854f544abfff" path="/var/lib/kubelet/pods/0f8e495d-42cb-401a-ab41-854f544abfff/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.207663 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc8577c-8dd6-4ae9-9057-7091c1a10853" path="/var/lib/kubelet/pods/1cc8577c-8dd6-4ae9-9057-7091c1a10853/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.208819 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4" path="/var/lib/kubelet/pods/26cb77e1-6ec3-4c79-be62-1e4e07eaa1a4/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.209979 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c937de4-4dbe-4167-9706-78d225b7de95" path="/var/lib/kubelet/pods/3c937de4-4dbe-4167-9706-78d225b7de95/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.211162 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5403a81d-469a-4fa3-bf5a-77d515c25015" path="/var/lib/kubelet/pods/5403a81d-469a-4fa3-bf5a-77d515c25015/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.212153 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d12d33-e904-4139-83d0-23eb2bb74b23" path="/var/lib/kubelet/pods/62d12d33-e904-4139-83d0-23eb2bb74b23/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.212728 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6338d565-911b-47d4-ba71-a9cdd9d6bf76" path="/var/lib/kubelet/pods/6338d565-911b-47d4-ba71-a9cdd9d6bf76/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.220451 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c" path="/var/lib/kubelet/pods/6f44f2b0-d61c-4f75-a6cc-4dbc0427ca7c/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.221056 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c26cf56-fc84-4283-af2d-edfba663233d" path="/var/lib/kubelet/pods/7c26cf56-fc84-4283-af2d-edfba663233d/volumes" Jan 21 
15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.221647 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f26e02-9064-49cf-a33b-d1aef107c16b" path="/var/lib/kubelet/pods/87f26e02-9064-49cf-a33b-d1aef107c16b/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.222635 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57de07a-627b-4881-bf68-124934dbe603" path="/var/lib/kubelet/pods/a57de07a-627b-4881-bf68-124934dbe603/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.224133 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6728371-3b90-4e51-bc07-8b3f08dc2420" path="/var/lib/kubelet/pods/d6728371-3b90-4e51-bc07-8b3f08dc2420/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.224780 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb454a8-70a4-4e80-be1c-bd3b9f6814ce" path="/var/lib/kubelet/pods/deb454a8-70a4-4e80-be1c-bd3b9f6814ce/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.225440 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe436520-0000-4a6c-8e4a-8a8b1c394b94" path="/var/lib/kubelet/pods/fe436520-0000-4a6c-8e4a-8a8b1c394b94/volumes" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.240968 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8"] Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.266898 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.268863 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" podUID="b4226483-2932-4ba4-a0cd-d989688eb498" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.284329 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-config: secret "neutron-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.284388 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.784364478 +0000 UTC m=+3276.965880700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285026 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-httpd-config: secret "neutron-httpd-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285083 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.785064724 +0000 UTC m=+3276.966580946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-httpd-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285369 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285414 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285432 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.785400285 +0000 UTC m=+3276.966916507 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-ovndbs" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285465 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285475 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.785467692 +0000 UTC m=+3276.966983913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-public-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285371 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285492 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.785481587 +0000 UTC m=+3276.966997809 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "combined-ca-bundle" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.285515 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:19.785497368 +0000 UTC m=+3276.967013590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-internal-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.305210 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_a9d297fb-43eb-45d9-a000-55847e5b36cb/ovsdbserver-sb/0.log" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.305264 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.332102 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_feca9a31-2cdb-456b-8e43-17064ac87815/ovsdbserver-nb/0.log" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.332184 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.387986 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8j2jr"] Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.422112 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.426496 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" podUID="4560e3cf-e7a2-4a34-984e-1d58d955a9d2" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.444332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" event={"ID":"b4226483-2932-4ba4-a0cd-d989688eb498","Type":"ContainerStarted","Data":"100a5144f04acebd478cf9063f957e07039893814e934d6782fcea7abf352b68"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.447460 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.447862 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-server" containerID="cri-o://8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.447964 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="swift-recon-cron" 
containerID="cri-o://2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448000 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="rsync" containerID="cri-o://3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448029 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-expirer" containerID="cri-o://5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448059 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-updater" containerID="cri-o://78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448088 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-auditor" containerID="cri-o://030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448117 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-replicator" containerID="cri-o://e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448142 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-server" containerID="cri-o://ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448169 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-updater" containerID="cri-o://705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-auditor" containerID="cri-o://0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448221 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-replicator" containerID="cri-o://0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448247 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" 
podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-server" containerID="cri-o://52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448285 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-reaper" containerID="cri-o://1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448314 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-auditor" containerID="cri-o://0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.448342 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-replicator" containerID="cri-o://e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.450331 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.452141 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" podUID="b4226483-2932-4ba4-a0cd-d989688eb498" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.455587 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-82fkc"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.481653 4707 generic.go:334] "Generic (PLEG): container finished" podID="243109f6-78d0-4750-a43f-21a7b606626d" containerID="cc0164bd92b9e0315765bd0cef3ea980987ab65b303b1a0cbcde9d86f9918385" exitCode=143 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.481735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" event={"ID":"243109f6-78d0-4750-a43f-21a7b606626d","Type":"ContainerDied","Data":"cc0164bd92b9e0315765bd0cef3ea980987ab65b303b1a0cbcde9d86f9918385"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.484824 4707 generic.go:334] "Generic (PLEG): container finished" podID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerID="bed84625b1724689ba24d64b9ce564f5af89b171fde3cda59c73d1d35a9d2648" exitCode=143 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.484936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b59f3c9f-adf5-49f3-8f0b-62044f91902c","Type":"ContainerDied","Data":"bed84625b1724689ba24d64b9ce564f5af89b171fde3cda59c73d1d35a9d2648"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.489905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-config\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.489951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.489975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-combined-ca-bundle\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdbserver-sb-tls-certs\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqqvk\" (UniqueName: \"kubernetes.io/projected/feca9a31-2cdb-456b-8e43-17064ac87815-kube-api-access-rqqvk\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-combined-ca-bundle\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-metrics-certs-tls-certs\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdbserver-nb-tls-certs\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-scripts\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-metrics-certs-tls-certs\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdb-rundir\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-config\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdb-rundir\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-scripts\") pod \"feca9a31-2cdb-456b-8e43-17064ac87815\" (UID: \"feca9a31-2cdb-456b-8e43-17064ac87815\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr9sj\" (UniqueName: \"kubernetes.io/projected/a9d297fb-43eb-45d9-a000-55847e5b36cb-kube-api-access-pr9sj\") pod \"a9d297fb-43eb-45d9-a000-55847e5b36cb\" (UID: \"a9d297fb-43eb-45d9-a000-55847e5b36cb\") " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.490761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-config" (OuterVolumeSpecName: "config") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.491351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" event={"ID":"11c197e2-8546-4ef2-8964-86a97f767446","Type":"ContainerStarted","Data":"0aa2a14d0251ab0e4dd3d13f329527a7138e4ffc26f59cd40580c30ac58604bd"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.492091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.492517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-scripts" (OuterVolumeSpecName: "scripts") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.492912 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.492982 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data podName:1eaefcd6-8b64-47a3-8e3a-280d7130b63e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:21.492966495 +0000 UTC m=+3278.674482717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data") pod "rabbitmq-cell1-server-0" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.493138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-config" (OuterVolumeSpecName: "config") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.497916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.503650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-scripts" (OuterVolumeSpecName: "scripts") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.504357 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-82fkc"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.510892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.519788 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.521782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d297fb-43eb-45d9-a000-55847e5b36cb-kube-api-access-pr9sj" (OuterVolumeSpecName: "kube-api-access-pr9sj") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "kube-api-access-pr9sj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.528077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" event={"ID":"ed2d2743-f38a-48d3-bef6-bc752af7f488","Type":"ContainerStarted","Data":"51e0d5c42c8b50c2bd222fc48b8406fed0cea941b74380a7c4b91ff31b03d332"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.543679 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" containerID="bcdaa3a0428c88cb5e1d698160a35a96637e98eb4c2dcd7cf69eb5077af17c0b" exitCode=137 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.579369 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x2skx"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580250 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_a9d297fb-43eb-45d9-a000-55847e5b36cb/ovsdbserver-sb/0.log" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580302 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerID="fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844" exitCode=2 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580316 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerID="ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2" exitCode=143 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a9d297fb-43eb-45d9-a000-55847e5b36cb","Type":"ContainerDied","Data":"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580383 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a9d297fb-43eb-45d9-a000-55847e5b36cb","Type":"ContainerDied","Data":"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"a9d297fb-43eb-45d9-a000-55847e5b36cb","Type":"ContainerDied","Data":"b9cf66ef704e6570e55f16947bb54c9007553f8ab8950a3f9649fc428f1e1140"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580406 4707 scope.go:117] "RemoveContainer" containerID="fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.580533 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.589587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feca9a31-2cdb-456b-8e43-17064ac87815-kube-api-access-rqqvk" (OuterVolumeSpecName: "kube-api-access-rqqvk") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "kube-api-access-rqqvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.590061 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.591856 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" podUID="11c197e2-8546-4ef2-8964-86a97f767446" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.594473 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.594663 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595302 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595331 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feca9a31-2cdb-456b-8e43-17064ac87815-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595344 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr9sj\" (UniqueName: \"kubernetes.io/projected/a9d297fb-43eb-45d9-a000-55847e5b36cb-kube-api-access-pr9sj\") on node \"crc\" DevicePath \"\"" Jan 21 
15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595358 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595380 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595392 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595402 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqqvk\" (UniqueName: \"kubernetes.io/projected/feca9a31-2cdb-456b-8e43-17064ac87815-kube-api-access-rqqvk\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.595412 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9d297fb-43eb-45d9-a000-55847e5b36cb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.611959 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_feca9a31-2cdb-456b-8e43-17064ac87815/ovsdbserver-nb/0.log" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.612000 4707 generic.go:334] "Generic (PLEG): container finished" podID="feca9a31-2cdb-456b-8e43-17064ac87815" containerID="d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4" exitCode=2 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.612016 4707 generic.go:334] "Generic (PLEG): container finished" podID="feca9a31-2cdb-456b-8e43-17064ac87815" containerID="0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5" exitCode=143 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.612038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"feca9a31-2cdb-456b-8e43-17064ac87815","Type":"ContainerDied","Data":"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.612061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"feca9a31-2cdb-456b-8e43-17064ac87815","Type":"ContainerDied","Data":"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.612071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"feca9a31-2cdb-456b-8e43-17064ac87815","Type":"ContainerDied","Data":"a39e9bbd1d6760653c6df96779a6c274f806ef15a557d78cde58f9c8b28ca695"} Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.612119 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.617841 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.619084 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x2skx"] Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.619921 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" podUID="ed2d2743-f38a-48d3-bef6-bc752af7f488" Jan 21 15:56:19 crc kubenswrapper[4707]: W0121 15:56:19.623541 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb75667c_b816_420a_b6e1_bd1da2826e2e.slice/crio-22fb96223b619ecdbfa02115bbda51ab73b7cfd1b462a3f7e5a34b0cb3062706 WatchSource:0}: Error finding container 22fb96223b619ecdbfa02115bbda51ab73b7cfd1b462a3f7e5a34b0cb3062706: Status 404 returned error can't find the container with id 22fb96223b619ecdbfa02115bbda51ab73b7cfd1b462a3f7e5a34b0cb3062706 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.639893 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.660573 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vjhlx"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.670967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.679443 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-vjhlx"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.688192 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.707034 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.707060 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.713722 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.722397 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.723951 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" podUID="eb75667c-b816-420a-b6e1-bd1da2826e2e" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.728839 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6"] Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.729253 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="openstack-network-exporter" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729311 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="openstack-network-exporter" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.729325 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="ovsdbserver-nb" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="ovsdbserver-nb" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.729398 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="ovsdbserver-sb" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="ovsdbserver-sb" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.729435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="openstack-network-exporter" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729442 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="openstack-network-exporter" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729656 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="openstack-network-exporter" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="openstack-network-exporter" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" containerName="ovsdbserver-nb" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.729685 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" containerName="ovsdbserver-sb" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.734367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.734462 4707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.739087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.749325 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.751553 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.762771 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5b4488769b-2ldzb"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.763027 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-api" containerID="cri-o://48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.763160 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-httpd" containerID="cri-o://e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.772412 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.778989 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.779174 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-log" containerID="cri-o://702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.779305 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-api" containerID="cri-o://7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.780511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.789889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.790024 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-log" containerID="cri-o://4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.790109 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-httpd" containerID="cri-o://292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.815052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.815155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfkzr\" (UniqueName: \"kubernetes.io/projected/bb62efde-9e0c-429f-85ed-7a251716c8d9-kube-api-access-qfkzr\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.815171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.815262 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.815285 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815357 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815399 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.815383832 +0000 UTC m=+3277.996900054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-public-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815690 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815714 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.81570705 +0000 UTC m=+3277.997223262 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "combined-ca-bundle" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815741 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-config: secret "neutron-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815758 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.815752365 +0000 UTC m=+3277.997268577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815790 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-httpd-config: secret "neutron-httpd-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815824 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.815801188 +0000 UTC m=+3277.997317410 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-httpd-config" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815853 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815869 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.815863885 +0000 UTC m=+3277.997380107 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-internal-svc" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815898 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.815913 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.815907627 +0000 UTC m=+3277.997423849 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-ovndbs" not found Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.844662 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.846159 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" podUID="49cd1191-1cd2-4afe-a66a-339adfd8cc18" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.858312 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:19 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 15:56:19 crc kubenswrapper[4707]: else Jan 21 15:56:19 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:19 crc kubenswrapper[4707]: fi Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:19 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:19 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:19 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:19 crc kubenswrapper[4707]: # support updates Jan 21 15:56:19 crc kubenswrapper[4707]: Jan 21 15:56:19 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:19 crc kubenswrapper[4707]: E0121 15:56:19.859895 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" podUID="f5d28d24-ed34-4aec-935a-ee218f56f442" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.865686 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.881207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.884855 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.892988 4707 scope.go:117] "RemoveContainer" containerID="ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.918242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfkzr\" (UniqueName: \"kubernetes.io/projected/bb62efde-9e0c-429f-85ed-7a251716c8d9-kube-api-access-qfkzr\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.918367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.918554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.919297 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.919372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.919988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.944703 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.944938 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-log" containerID="cri-o://7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.945283 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-httpd" 
containerID="cri-o://c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715" gracePeriod=30 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.948197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.949129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfkzr\" (UniqueName: \"kubernetes.io/projected/bb62efde-9e0c-429f-85ed-7a251716c8d9-kube-api-access-qfkzr\") pod \"dnsmasq-dnsmasq-84b9f45d47-5z8p6\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.956114 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.969219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-w54q5"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.977436 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-w54q5"] Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.983351 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="rabbitmq" containerID="cri-o://c684b390b448b0e7f019fce643a06fcefabe494a19baeedc6a8e4cf355706140" gracePeriod=604800 Jan 21 15:56:19 crc kubenswrapper[4707]: I0121 15:56:19.992165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "a9d297fb-43eb-45d9-a000-55847e5b36cb" (UID: "a9d297fb-43eb-45d9-a000-55847e5b36cb"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:19.997162 4707 scope.go:117] "RemoveContainer" containerID="fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.002998 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844\": container with ID starting with fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844 not found: ID does not exist" containerID="fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.003032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844"} err="failed to get container status \"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844\": rpc error: code = NotFound desc = could not find container \"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844\": container with ID starting with fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.003059 4707 scope.go:117] "RemoveContainer" containerID="ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.010883 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2\": container with ID starting with ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2 not found: ID does not exist" containerID="ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.010913 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2"} err="failed to get container status \"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2\": rpc error: code = NotFound desc = could not find container \"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2\": container with ID starting with ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.010936 4707 scope.go:117] "RemoveContainer" containerID="fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.014281 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.014471 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-log" containerID="cri-o://568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.014803 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-metadata" 
containerID="cri-o://90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.023024 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.023049 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d297fb-43eb-45d9-a000-55847e5b36cb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.023100 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.023136 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data podName:1fb17d2c-7e86-4451-b56e-4e89fc494542 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.02312485 +0000 UTC m=+3279.204641073 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data") pod "rabbitmq-server-0" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542") : configmap "rabbitmq-config-data" not found Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.032694 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844"} err="failed to get container status \"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844\": rpc error: code = NotFound desc = could not find container \"fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844\": container with ID starting with fbb9389eaa2fe157fcd0f2389cbb5ede128da5462f258449cfb2a375239b0844 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.032724 4707 scope.go:117] "RemoveContainer" containerID="ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.033157 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2"} err="failed to get container status \"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2\": rpc error: code = NotFound desc = could not find container \"ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2\": container with ID starting with ce02d4f8b51f0a5ee247644dd9e0aa59be95c1d3f6929e1eb31b977d63b060b2 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.033176 4707 scope.go:117] "RemoveContainer" containerID="d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.034892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "feca9a31-2cdb-456b-8e43-17064ac87815" (UID: "feca9a31-2cdb-456b-8e43-17064ac87815"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.065731 4707 scope.go:117] "RemoveContainer" containerID="0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.093994 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tfrs"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.108521 4707 scope.go:117] "RemoveContainer" containerID="d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.112353 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4\": container with ID starting with d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4 not found: ID does not exist" containerID="d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.112532 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4"} err="failed to get container status \"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4\": rpc error: code = NotFound desc = could not find container \"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4\": container with ID starting with d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.112644 4707 scope.go:117] "RemoveContainer" containerID="0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.116960 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5\": container with ID starting with 0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5 not found: ID does not exist" containerID="0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.117041 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5"} err="failed to get container status \"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5\": rpc error: code = NotFound desc = could not find container \"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5\": container with ID starting with 0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.117072 4707 scope.go:117] "RemoveContainer" containerID="d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.117728 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4"} err="failed to get container status \"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4\": rpc error: code = NotFound desc = could not find container \"d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4\": container with ID starting with 
d9291420b64a0aedb84ffdeebe2cc8773fef646f0b8c53c06a2fb9ef13fbc7d4 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.117766 4707 scope.go:117] "RemoveContainer" containerID="0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.118986 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5"} err="failed to get container status \"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5\": rpc error: code = NotFound desc = could not find container \"0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5\": container with ID starting with 0a7d19648764b00ade71197a0677f3b0cf056cca829d86a645f0e2609b1df4c5 not found: ID does not exist" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.125433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65fcv\" (UniqueName: \"kubernetes.io/projected/9d0e8b27-be36-4a9b-a897-cdfd131cf319-kube-api-access-65fcv\") pod \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.125765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config-secret\") pod \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.125888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config\") pod \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.126294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerName="galera" containerID="cri-o://2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.127168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-combined-ca-bundle\") pod \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\" (UID: \"9d0e8b27-be36-4a9b-a897-cdfd131cf319\") " Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.132083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0e8b27-be36-4a9b-a897-cdfd131cf319-kube-api-access-65fcv" (OuterVolumeSpecName: "kube-api-access-65fcv") pod "9d0e8b27-be36-4a9b-a897-cdfd131cf319" (UID: "9d0e8b27-be36-4a9b-a897-cdfd131cf319"). InnerVolumeSpecName "kube-api-access-65fcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.133422 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feca9a31-2cdb-456b-8e43-17064ac87815-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.133443 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65fcv\" (UniqueName: \"kubernetes.io/projected/9d0e8b27-be36-4a9b-a897-cdfd131cf319-kube-api-access-65fcv\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.170123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9d0e8b27-be36-4a9b-a897-cdfd131cf319" (UID: "9d0e8b27-be36-4a9b-a897-cdfd131cf319"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.180176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d0e8b27-be36-4a9b-a897-cdfd131cf319" (UID: "9d0e8b27-be36-4a9b-a897-cdfd131cf319"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.186973 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tfrs"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.188486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9d0e8b27-be36-4a9b-a897-cdfd131cf319" (UID: "9d0e8b27-be36-4a9b-a897-cdfd131cf319"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.201898 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.225576 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.234358 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.234382 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d0e8b27-be36-4a9b-a897-cdfd131cf319-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.234394 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0e8b27-be36-4a9b-a897-cdfd131cf319-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.235309 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.235365 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.735350873 +0000 UTC m=+3277.916867095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-internal-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.235777 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.235800 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.735793626 +0000 UTC m=+3277.917309848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "combined-ca-bundle" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.239112 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.239167 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:20.739153414 +0000 UTC m=+3277.920669636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-public-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.240606 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-9qfh6"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.246370 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.261852 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-9qfh6"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.265716 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.273344 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2cksb"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.275698 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-7fbgq"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.281019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-nj7qw"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.287843 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-2cksb"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.304940 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-nj7qw"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.308866 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-7fbgq"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.314211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.314407 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker-log" containerID="cri-o://73b1fd5e41c624d54ff550ab75b71d0f8bbc342e1260f7be7c15ab44c21e4e16" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.314743 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker" containerID="cri-o://db4b9f5629cb09208131600005b17fc382e9d120bae3a7b38b5a8a1752ce7728" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.320006 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.320211 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener-log" 
containerID="cri-o://e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.320327 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener" containerID="cri-o://60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.336570 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.339921 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.340125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" containerID="cri-o://ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.340462 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" containerID="cri-o://c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.352217 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.352431 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="7c05170c-aee6-46b7-bbc6-921bad0eb7ee" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.371611 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.381399 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.387245 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8j2jr"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.392473 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.403505 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.419855 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.424987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.425161 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" 
podUID="874c328a-6498-4a0a-9754-cb2051c03411" containerName="kube-state-metrics" containerID="cri-o://11762a52965a774c3c7ef42b8f5b5562d1e89a9129e880c1f1650e9be9144cda" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.501021 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="rabbitmq" containerID="cri-o://af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a" gracePeriod=604800 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.509759 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.510037 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-central-agent" containerID="cri-o://7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.510456 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="proxy-httpd" containerID="cri-o://1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.510520 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="sg-core" containerID="cri-o://e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.510744 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-notification-agent" containerID="cri-o://bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.550011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.550200 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-httpd" containerID="cri-o://32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.550520 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-server" containerID="cri-o://927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b" gracePeriod=30 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.627124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" event={"ID":"f5d28d24-ed34-4aec-935a-ee218f56f442","Type":"ContainerStarted","Data":"372e4f298a993f8f732059fa4c6eb1eee10140b82a351408e64786b29fecf578"} Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.629135 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc 
kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.630191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" podUID="f5d28d24-ed34-4aec-935a-ee218f56f442" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.650371 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerID="e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.650436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" event={"ID":"a4c4483b-abb1-4c04-9092-d8f04755bc93","Type":"ContainerDied","Data":"e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.651708 4707 generic.go:334] "Generic (PLEG): container finished" podID="874c328a-6498-4a0a-9754-cb2051c03411" containerID="11762a52965a774c3c7ef42b8f5b5562d1e89a9129e880c1f1650e9be9144cda" exitCode=2 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.651748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"874c328a-6498-4a0a-9754-cb2051c03411","Type":"ContainerDied","Data":"11762a52965a774c3c7ef42b8f5b5562d1e89a9129e880c1f1650e9be9144cda"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.662655 4707 scope.go:117] "RemoveContainer" containerID="bcdaa3a0428c88cb5e1d698160a35a96637e98eb4c2dcd7cf69eb5077af17c0b" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.662775 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.679046 4707 generic.go:334] "Generic (PLEG): container finished" podID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerID="568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.679109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"54a93993-fb38-4bec-9fa6-2ffa280ccecd","Type":"ContainerDied","Data":"568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.711117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" event={"ID":"eb75667c-b816-420a-b6e1-bd1da2826e2e","Type":"ContainerStarted","Data":"22fb96223b619ecdbfa02115bbda51ab73b7cfd1b462a3f7e5a34b0cb3062706"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.712010 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" secret="" err="secret \"galera-openstack-cell1-dockercfg-dr5v9\" not found" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.715471 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.722430 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" podUID="eb75667c-b816-420a-b6e1-bd1da2826e2e" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.723674 4707 generic.go:334] "Generic (PLEG): container finished" podID="28add001-f139-47b8-9ebb-659ba39858cc" containerID="73b1fd5e41c624d54ff550ab75b71d0f8bbc342e1260f7be7c15ab44c21e4e16" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.723724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" event={"ID":"28add001-f139-47b8-9ebb-659ba39858cc","Type":"ContainerDied","Data":"73b1fd5e41c624d54ff550ab75b71d0f8bbc342e1260f7be7c15ab44c21e4e16"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.727118 4707 generic.go:334] "Generic (PLEG): container finished" podID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerID="ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.727216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"79a91989-c584-44cc-bb31-494f3d1a7a7d","Type":"ContainerDied","Data":"ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.730938 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerID="7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.730981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6c4d63f6-935c-47e2-92c0-4e27286253ba","Type":"ContainerDied","Data":"7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.732615 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerID="e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.732646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" event={"ID":"dd8127e7-3a43-454e-a3c9-190cf9eab3d2","Type":"ContainerDied","Data":"e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.734126 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" event={"ID":"4560e3cf-e7a2-4a34-984e-1d58d955a9d2","Type":"ContainerStarted","Data":"7cddf9944defeeddc33e6845aae99e7e720462c7d11707fdd12e880a3887d38f"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.734657 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-8j2jr" secret="" err="secret \"galera-openstack-cell1-dockercfg-dr5v9\" not found" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747583 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747628 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts podName:4560e3cf-e7a2-4a34-984e-1d58d955a9d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:21.247614108 +0000 UTC m=+3278.429130330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts") pod "root-account-create-update-8j2jr" (UID: "4560e3cf-e7a2-4a34-984e-1d58d955a9d2") : configmap "openstack-cell1-scripts" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747703 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747765 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747777 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:21.747760904 +0000 UTC m=+3278.929277127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "combined-ca-bundle" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747794 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:21.747787655 +0000 UTC m=+3278.929303876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-public-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747858 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747881 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:21.74787521 +0000 UTC m=+3278.929391432 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-internal-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747926 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.747965 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts podName:eb75667c-b816-420a-b6e1-bd1da2826e2e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:21.247952926 +0000 UTC m=+3278.429469147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts") pod "nova-cell1-fd9c-account-create-update-49grg" (UID: "eb75667c-b816-420a-b6e1-bd1da2826e2e") : configmap "openstack-cell1-scripts" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.777188 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.779247 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" podUID="4560e3cf-e7a2-4a34-984e-1d58d955a9d2" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779334 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779354 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779360 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779366 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779372 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779378 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779383 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779389 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779395 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779400 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779406 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779411 4707 generic.go:334] 
"Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779416 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779421 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6" exitCode=0 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779596 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.779628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.784746 4707 generic.go:334] "Generic (PLEG): container finished" podID="448bc08a-61e8-48bb-be55-256be3a355a2" containerID="4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.784793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"448bc08a-61e8-48bb-be55-256be3a355a2","Type":"ContainerDied","Data":"4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.796096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6"] Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.799107 4707 generic.go:334] "Generic (PLEG): container finished" podID="2604d874-1aab-419e-a67e-801734497292" containerID="702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.799181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2604d874-1aab-419e-a67e-801734497292","Type":"ContainerDied","Data":"702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f"} Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.801885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" event={"ID":"49cd1191-1cd2-4afe-a66a-339adfd8cc18","Type":"ContainerStarted","Data":"b3fa5c9c2a3a3289502e1f5bf0f61c3bb3ba4d5d00f237d904c167f5dbe8f8f8"} Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.804286 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 
crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "nova_cell0" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell0" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.805447 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" podUID="49cd1191-1cd2-4afe-a66a-339adfd8cc18" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.808575 4707 generic.go:334] "Generic (PLEG): container finished" podID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerID="ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca" exitCode=143 Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.809283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" event={"ID":"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7","Type":"ContainerDied","Data":"ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca"} Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.811521 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc 
kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.811786 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.812321 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 15:56:20 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 15:56:20 crc kubenswrapper[4707]: else Jan 21 15:56:20 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 15:56:20 crc kubenswrapper[4707]: fi Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 15:56:20 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 15:56:20 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 15:56:20 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 15:56:20 crc kubenswrapper[4707]: # support updates Jan 21 15:56:20 crc kubenswrapper[4707]: Jan 21 15:56:20 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.812627 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" podUID="11c197e2-8546-4ef2-8964-86a97f767446" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.813720 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" podUID="b4226483-2932-4ba4-a0cd-d989688eb498" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.813738 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" podUID="ed2d2743-f38a-48d3-bef6-bc752af7f488" Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.850973 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851053 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.851039692 +0000 UTC m=+3280.032555913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-public-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851557 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-config: secret "neutron-config" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851587 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.851580468 +0000 UTC m=+3280.033096689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-config" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851618 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-httpd-config: secret "neutron-httpd-config" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851636 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:56:22.851630082 +0000 UTC m=+3280.033146304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-httpd-config" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851774 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851798 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.851792718 +0000 UTC m=+3280.033308940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-internal-svc" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851844 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.851863 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.85185777 +0000 UTC m=+3280.033373992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-ovndbs" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.852191 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.852240 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.852227926 +0000 UTC m=+3280.033744148 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "combined-ca-bundle" not found Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.886245 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.887306 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.898178 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 15:56:20 crc kubenswrapper[4707]: E0121 15:56:20.898207 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="ovn-northd" Jan 21 15:56:20 crc kubenswrapper[4707]: I0121 15:56:20.974650 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.004365 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-operator-scripts\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-kolla-config\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-certs\") pod \"874c328a-6498-4a0a-9754-cb2051c03411\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-combined-ca-bundle\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2l7\" (UniqueName: \"kubernetes.io/projected/874c328a-6498-4a0a-9754-cb2051c03411-kube-api-access-xh2l7\") pod \"874c328a-6498-4a0a-9754-cb2051c03411\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-combined-ca-bundle\") pod \"874c328a-6498-4a0a-9754-cb2051c03411\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-generated\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-config\") pod \"874c328a-6498-4a0a-9754-cb2051c03411\" (UID: \"874c328a-6498-4a0a-9754-cb2051c03411\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-galera-tls-certs\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-default\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb55j\" (UniqueName: \"kubernetes.io/projected/e5d83529-68a5-4c04-9d24-4524d92f1efb-kube-api-access-lb55j\") pod \"e5d83529-68a5-4c04-9d24-4524d92f1efb\" (UID: \"e5d83529-68a5-4c04-9d24-4524d92f1efb\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.052831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.053129 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.053676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.053714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.056842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.069995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d83529-68a5-4c04-9d24-4524d92f1efb-kube-api-access-lb55j" (OuterVolumeSpecName: "kube-api-access-lb55j") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "kube-api-access-lb55j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.102364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874c328a-6498-4a0a-9754-cb2051c03411-kube-api-access-xh2l7" (OuterVolumeSpecName: "kube-api-access-xh2l7") pod "874c328a-6498-4a0a-9754-cb2051c03411" (UID: "874c328a-6498-4a0a-9754-cb2051c03411"). InnerVolumeSpecName "kube-api-access-xh2l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.104922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "874c328a-6498-4a0a-9754-cb2051c03411" (UID: "874c328a-6498-4a0a-9754-cb2051c03411"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.113267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.115184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.124673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "874c328a-6498-4a0a-9754-cb2051c03411" (UID: "874c328a-6498-4a0a-9754-cb2051c03411"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.130723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "874c328a-6498-4a0a-9754-cb2051c03411" (UID: "874c328a-6498-4a0a-9754-cb2051c03411"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.132234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e5d83529-68a5-4c04-9d24-4524d92f1efb" (UID: "e5d83529-68a5-4c04-9d24-4524d92f1efb"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154784 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154827 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154839 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154862 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154871 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb55j\" (UniqueName: \"kubernetes.io/projected/e5d83529-68a5-4c04-9d24-4524d92f1efb-kube-api-access-lb55j\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154880 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d83529-68a5-4c04-9d24-4524d92f1efb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154889 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154897 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d83529-68a5-4c04-9d24-4524d92f1efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154905 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh2l7\" (UniqueName: \"kubernetes.io/projected/874c328a-6498-4a0a-9754-cb2051c03411-kube-api-access-xh2l7\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154913 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874c328a-6498-4a0a-9754-cb2051c03411-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.154920 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d83529-68a5-4c04-9d24-4524d92f1efb-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.169634 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.190700 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2098ba33-9f63-4eb2-95d1-c1b8219a3db7" 
path="/var/lib/kubelet/pods/2098ba33-9f63-4eb2-95d1-c1b8219a3db7/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.191362 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328632aa-3bca-4497-b6c5-ea71cceadd18" path="/var/lib/kubelet/pods/328632aa-3bca-4497-b6c5-ea71cceadd18/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.191863 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33110471-654e-4379-8027-e105b51e6c35" path="/var/lib/kubelet/pods/33110471-654e-4379-8027-e105b51e6c35/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.192716 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbffd0b-e235-483f-8487-ad5955d58dd7" path="/var/lib/kubelet/pods/7bbffd0b-e235-483f-8487-ad5955d58dd7/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.193226 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9384f581-d886-4c29-88d9-fe10aa949617" path="/var/lib/kubelet/pods/9384f581-d886-4c29-88d9-fe10aa949617/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.193701 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0e8b27-be36-4a9b-a897-cdfd131cf319" path="/var/lib/kubelet/pods/9d0e8b27-be36-4a9b-a897-cdfd131cf319/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.194235 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33d5677-3f18-49ea-9098-7b8d0c9d6cd3" path="/var/lib/kubelet/pods/a33d5677-3f18-49ea-9098-7b8d0c9d6cd3/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.195306 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d297fb-43eb-45d9-a000-55847e5b36cb" path="/var/lib/kubelet/pods/a9d297fb-43eb-45d9-a000-55847e5b36cb/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.195822 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf18429d-676f-4bef-b5c5-d06a69cc7b76" path="/var/lib/kubelet/pods/bf18429d-676f-4bef-b5c5-d06a69cc7b76/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.196299 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5f0055-734f-4cf4-977a-cf51a551157d" path="/var/lib/kubelet/pods/cf5f0055-734f-4cf4-977a-cf51a551157d/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.197126 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dfc499-69dd-414a-bc01-6eb524b099e1" path="/var/lib/kubelet/pods/d4dfc499-69dd-414a-bc01-6eb524b099e1/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.197677 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feca9a31-2cdb-456b-8e43-17064ac87815" path="/var/lib/kubelet/pods/feca9a31-2cdb-456b-8e43-17064ac87815/volumes" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.214032 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.256359 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.256410 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts podName:eb75667c-b816-420a-b6e1-bd1da2826e2e nodeName:}" failed. 
No retries permitted until 2026-01-21 15:56:22.256396759 +0000 UTC m=+3279.437912981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts") pod "nova-cell1-fd9c-account-create-update-49grg" (UID: "eb75667c-b816-420a-b6e1-bd1da2826e2e") : configmap "openstack-cell1-scripts" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.256787 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.256861 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts podName:4560e3cf-e7a2-4a34-984e-1d58d955a9d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.256846354 +0000 UTC m=+3279.438362577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts") pod "root-account-create-update-8j2jr" (UID: "4560e3cf-e7a2-4a34-984e-1d58d955a9d2") : configmap "openstack-cell1-scripts" not found Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.256922 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.357925 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.358143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-vencrypt-tls-certs\") pod \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.358223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tptrl\" (UniqueName: \"kubernetes.io/projected/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-kube-api-access-tptrl\") pod \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.358259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-config-data\") pod \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.358304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-nova-novncproxy-tls-certs\") pod \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.358391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-combined-ca-bundle\") pod \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\" (UID: \"7c05170c-aee6-46b7-bbc6-921bad0eb7ee\") " Jan 21 15:56:21 crc 
kubenswrapper[4707]: I0121 15:56:21.413637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-kube-api-access-tptrl" (OuterVolumeSpecName: "kube-api-access-tptrl") pod "7c05170c-aee6-46b7-bbc6-921bad0eb7ee" (UID: "7c05170c-aee6-46b7-bbc6-921bad0eb7ee"). InnerVolumeSpecName "kube-api-access-tptrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.443756 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-sdrlc"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.452857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw"] Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.453330 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerName="mysql-bootstrap" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerName="mysql-bootstrap" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.453358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874c328a-6498-4a0a-9754-cb2051c03411" containerName="kube-state-metrics" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="874c328a-6498-4a0a-9754-cb2051c03411" containerName="kube-state-metrics" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.453371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerName="galera" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453377 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerName="galera" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.453403 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c05170c-aee6-46b7-bbc6-921bad0eb7ee" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453409 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c05170c-aee6-46b7-bbc6-921bad0eb7ee" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453583 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerName="galera" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453599 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c05170c-aee6-46b7-bbc6-921bad0eb7ee" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.453606 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="874c328a-6498-4a0a-9754-cb2051c03411" containerName="kube-state-metrics" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.471335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-config-data" (OuterVolumeSpecName: "config-data") pod "7c05170c-aee6-46b7-bbc6-921bad0eb7ee" (UID: "7c05170c-aee6-46b7-bbc6-921bad0eb7ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.487319 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.487411 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.489426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.490021 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tptrl\" (UniqueName: \"kubernetes.io/projected/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-kube-api-access-tptrl\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.490053 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.547023 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-khvzf"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.556745 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-log-httpd\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrn9\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-kube-api-access-xzrn9\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-etc-swift\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-config-data\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-run-httpd\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-public-tls-certs\") pod 
\"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.591966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-internal-tls-certs\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.592023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-combined-ca-bundle\") pod \"c397cca9-7943-4169-9984-c5ec464de8de\" (UID: \"c397cca9-7943-4169-9984-c5ec464de8de\") " Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.592281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.592402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvbfr\" (UniqueName: \"kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.592718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-khvzf"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.592946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c05170c-aee6-46b7-bbc6-921bad0eb7ee" (UID: "7c05170c-aee6-46b7-bbc6-921bad0eb7ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.593040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.593096 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.593128 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data podName:1eaefcd6-8b64-47a3-8e3a-280d7130b63e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:25.593117002 +0000 UTC m=+3282.774633224 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data") pod "rabbitmq-cell1-server-0" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.603050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.603894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "7c05170c-aee6-46b7-bbc6-921bad0eb7ee" (UID: "7c05170c-aee6-46b7-bbc6-921bad0eb7ee"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.640037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.657032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fcvt8"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.658960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-kube-api-access-xzrn9" (OuterVolumeSpecName: "kube-api-access-xzrn9") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "kube-api-access-xzrn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.669903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "7c05170c-aee6-46b7-bbc6-921bad0eb7ee" (UID: "7c05170c-aee6-46b7-bbc6-921bad0eb7ee"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.694892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbfr\" (UniqueName: \"kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695232 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzrn9\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-kube-api-access-xzrn9\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695249 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c397cca9-7943-4169-9984-c5ec464de8de-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695259 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695267 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695282 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695290 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c05170c-aee6-46b7-bbc6-921bad0eb7ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.695298 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c397cca9-7943-4169-9984-c5ec464de8de-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.695366 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.695406 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts podName:52fce448-ab0e-46d9-94d6-2d9be4cb6df7 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.195393764 +0000 UTC m=+3279.376909977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts") pod "keystone-8d41-account-create-update-mddxw" (UID: "52fce448-ab0e-46d9-94d6-2d9be4cb6df7") : configmap "openstack-scripts" not found Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.724351 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-fcvt8"] Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.725914 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tvbfr for pod openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.725966 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr podName:52fce448-ab0e-46d9-94d6-2d9be4cb6df7 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.225951746 +0000 UTC m=+3279.407467968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tvbfr" (UniqueName: "kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr") pod "keystone-8d41-account-create-update-mddxw" (UID: "52fce448-ab0e-46d9-94d6-2d9be4cb6df7") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.740344 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f9f7564f-zdrdh"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.740523 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" podUID="eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" containerName="keystone-api" containerID="cri-o://b6eb35a562dda2cde1c5f1def59c03da57bb93b3257929860e2eadfa4f05fa41" gracePeriod=30 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.741667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.753047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.754443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.758323 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw"] Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.760989 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-tvbfr operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" podUID="52fce448-ab0e-46d9-94d6-2d9be4cb6df7" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.766238 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-pfmbn"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.772121 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-pfmbn"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.776382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.779767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-config-data" (OuterVolumeSpecName: "config-data") pod "c397cca9-7943-4169-9984-c5ec464de8de" (UID: "c397cca9-7943-4169-9984-c5ec464de8de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.796699 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.796722 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.796732 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.796740 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c397cca9-7943-4169-9984-c5ec464de8de-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.796829 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.796865 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.79685303 +0000 UTC m=+3280.978369252 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-public-svc" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.796904 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.796923 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.79691709 +0000 UTC m=+3280.978433311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-internal-svc" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.796949 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.796965 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.79695998 +0000 UTC m=+3280.978476203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "combined-ca-bundle" not found Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823378 4707 generic.go:334] "Generic (PLEG): container finished" podID="c397cca9-7943-4169-9984-c5ec464de8de" containerID="927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823469 4707 generic.go:334] "Generic (PLEG): container finished" podID="c397cca9-7943-4169-9984-c5ec464de8de" containerID="32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" event={"ID":"c397cca9-7943-4169-9984-c5ec464de8de","Type":"ContainerDied","Data":"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" event={"ID":"c397cca9-7943-4169-9984-c5ec464de8de","Type":"ContainerDied","Data":"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" event={"ID":"c397cca9-7943-4169-9984-c5ec464de8de","Type":"ContainerDied","Data":"6aa037d0871bf6ede6898ff8c7b5f47651612b51ea0f557be4abdb12d76759d7"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823759 4707 scope.go:117] 
"RemoveContainer" containerID="927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.823930 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.827514 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c412020-b207-4f3b-abe0-160022b70cc8" containerID="1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.827540 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c412020-b207-4f3b-abe0-160022b70cc8" containerID="e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159" exitCode=2 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.827549 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c412020-b207-4f3b-abe0-160022b70cc8" containerID="7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.827584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerDied","Data":"1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.827606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerDied","Data":"e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.827615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerDied","Data":"7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.830953 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerID="bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.830997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" event={"ID":"bb62efde-9e0c-429f-85ed-7a251716c8d9","Type":"ContainerDied","Data":"bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.831014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" event={"ID":"bb62efde-9e0c-429f-85ed-7a251716c8d9","Type":"ContainerStarted","Data":"eb011424c38a47c771085b80af1a5ac94f22c9d2e21b32f502c7915b4cf54a74"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.836840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"874c328a-6498-4a0a-9754-cb2051c03411","Type":"ContainerDied","Data":"9a553449177c427b49f07c66bf11e325ecd3d39b745f0103343310d286b949a3"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.836876 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.862065 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5d83529-68a5-4c04-9d24-4524d92f1efb" containerID="2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.862120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e5d83529-68a5-4c04-9d24-4524d92f1efb","Type":"ContainerDied","Data":"2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.862143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e5d83529-68a5-4c04-9d24-4524d92f1efb","Type":"ContainerDied","Data":"107d073faec47bff764b520341b7578e3791f19a154324c2181ba6abef285c06"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.862210 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.865089 4707 scope.go:117] "RemoveContainer" containerID="32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.867483 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c05170c-aee6-46b7-bbc6-921bad0eb7ee" containerID="06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e" exitCode=0 Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.867801 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.869037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7c05170c-aee6-46b7-bbc6-921bad0eb7ee","Type":"ContainerDied","Data":"06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.869063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7c05170c-aee6-46b7-bbc6-921bad0eb7ee","Type":"ContainerDied","Data":"2fc04797f81dcb357b70cfb4ee3e330bf277c36ba600b1351501f0d77fbc32f0"} Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.869143 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.904089 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.905868 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.915199 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-68ccdc6bf4-gkfnx"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.935871 4707 scope.go:117] "RemoveContainer" containerID="927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b" Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.940951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b\": container with ID starting with 927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b not found: ID does not exist" containerID="927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.940991 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b"} err="failed to get container status \"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b\": rpc error: code = NotFound desc = could not find container \"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b\": container with ID starting with 927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b not found: ID does not exist" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.941019 4707 scope.go:117] "RemoveContainer" containerID="32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.950116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.958707 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:56:21 crc kubenswrapper[4707]: E0121 15:56:21.959009 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356\": container with ID starting with 32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356 not found: ID does not exist" containerID="32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.959046 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356"} err="failed to get container status \"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356\": rpc error: code = NotFound desc = could not find container \"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356\": container with ID starting with 32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356 not found: ID does not exist" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.959066 4707 scope.go:117] "RemoveContainer" containerID="927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.959512 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b"} err="failed to get container status \"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b\": rpc error: code = NotFound desc = could not find container \"927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b\": container with ID starting with 927729fba4368d9fff5505b728f556bbcca0b1ba886c3b5492d650851355e03b not found: ID does not exist" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.959528 4707 scope.go:117] "RemoveContainer" containerID="32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.963613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356"} err="failed to get container status \"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356\": rpc error: code = NotFound desc = could not find container \"32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356\": container with ID starting with 32b4fa255d8639416902c855a3f03b250ec8654781cf9cd7895fb7ef77d62356 not found: ID does not exist" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.963651 4707 scope.go:117] "RemoveContainer" containerID="11762a52965a774c3c7ef42b8f5b5562d1e89a9129e880c1f1650e9be9144cda" Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.965901 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:56:21 crc kubenswrapper[4707]: I0121 15:56:21.972209 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.022113 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerName="galera" containerID="cri-o://dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d" gracePeriod=30 Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.066960 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.073351 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.077185 4707 scope.go:117] "RemoveContainer" containerID="2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.111289 4707 scope.go:117] "RemoveContainer" containerID="e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.115237 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.115292 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data podName:1fb17d2c-7e86-4451-b56e-4e89fc494542 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:26.115279823 +0000 UTC m=+3283.296796045 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data") pod "rabbitmq-server-0" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542") : configmap "rabbitmq-config-data" not found Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.139286 4707 scope.go:117] "RemoveContainer" containerID="2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.141311 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e\": container with ID starting with 2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e not found: ID does not exist" containerID="2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.141343 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e"} err="failed to get container status \"2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e\": rpc error: code = NotFound desc = could not find container \"2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e\": container with ID starting with 2b1cf69607dae71c8afc56bfda3c7f38fe125543d77f5c36ff31372003c8399e not found: ID does not exist" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.141364 4707 scope.go:117] "RemoveContainer" containerID="e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.142482 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e\": container with ID starting with e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e not found: ID does not exist" containerID="e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.142509 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e"} err="failed to get container status \"e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e\": rpc error: code = NotFound desc = could not find container \"e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e\": container with ID starting with e3afc9fb7d200ae01ee71dc558ceaf842501e0acf87985b39b8b6830c233ce7e not found: ID does not exist" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.142525 4707 scope.go:117] "RemoveContainer" containerID="06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.192896 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/nova-cell0-conductor-0" secret="" err="secret \"nova-nova-dockercfg-rplq6\" not found" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.193056 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-rplq6\" not found" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.217894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.218896 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.218956 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts podName:52fce448-ab0e-46d9-94d6-2d9be4cb6df7 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.218942781 +0000 UTC m=+3280.400459004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts") pod "keystone-8d41-account-create-update-mddxw" (UID: "52fce448-ab0e-46d9-94d6-2d9be4cb6df7") : configmap "openstack-scripts" not found Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.219088 4707 scope.go:117] "RemoveContainer" containerID="06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.219545 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e\": container with ID starting with 06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e not found: ID does not exist" containerID="06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.219570 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e"} err="failed to get container status \"06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e\": rpc error: code = NotFound desc = could not find container \"06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e\": container with ID starting with 06261eb9d3c6fd076ca2fe95befe252d88cea0781e5814414132063258bf1f1e not found: ID does not exist" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.235150 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.318904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d28d24-ed34-4aec-935a-ee218f56f442-operator-scripts\") pod \"f5d28d24-ed34-4aec-935a-ee218f56f442\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.319059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc68n\" (UniqueName: \"kubernetes.io/projected/f5d28d24-ed34-4aec-935a-ee218f56f442-kube-api-access-zc68n\") pod \"f5d28d24-ed34-4aec-935a-ee218f56f442\" (UID: \"f5d28d24-ed34-4aec-935a-ee218f56f442\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.319379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbfr\" (UniqueName: \"kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.319642 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.319704 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts podName:4560e3cf-e7a2-4a34-984e-1d58d955a9d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:24.319693625 +0000 UTC m=+3281.501209847 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts") pod "root-account-create-update-8j2jr" (UID: "4560e3cf-e7a2-4a34-984e-1d58d955a9d2") : configmap "openstack-cell1-scripts" not found Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.320233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d28d24-ed34-4aec-935a-ee218f56f442-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5d28d24-ed34-4aec-935a-ee218f56f442" (UID: "f5d28d24-ed34-4aec-935a-ee218f56f442"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.320911 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d28d24-ed34-4aec-935a-ee218f56f442-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.320950 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.320973 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts podName:eb75667c-b816-420a-b6e1-bd1da2826e2e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:24.320965086 +0000 UTC m=+3281.502481308 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts") pod "nova-cell1-fd9c-account-create-update-49grg" (UID: "eb75667c-b816-420a-b6e1-bd1da2826e2e") : configmap "openstack-cell1-scripts" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.321320 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.321382 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.821364667 +0000 UTC m=+3280.002880889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.321594 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.321618 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:22.821611211 +0000 UTC m=+3280.003127432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.323995 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tvbfr for pod openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.324058 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr podName:52fce448-ab0e-46d9-94d6-2d9be4cb6df7 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.324042552 +0000 UTC m=+3280.505558775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tvbfr" (UniqueName: "kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr") pod "keystone-8d41-account-create-update-mddxw" (UID: "52fce448-ab0e-46d9-94d6-2d9be4cb6df7") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.329898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d28d24-ed34-4aec-935a-ee218f56f442-kube-api-access-zc68n" (OuterVolumeSpecName: "kube-api-access-zc68n") pod "f5d28d24-ed34-4aec-935a-ee218f56f442" (UID: "f5d28d24-ed34-4aec-935a-ee218f56f442"). InnerVolumeSpecName "kube-api-access-zc68n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.368795 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.220:5671: connect: connection refused" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.379824 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.382921 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.390321 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.221:5671: connect: connection refused" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.393292 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.394151 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.400124 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4226483-2932-4ba4-a0cd-d989688eb498-operator-scripts\") pod \"b4226483-2932-4ba4-a0cd-d989688eb498\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrfjj\" (UniqueName: \"kubernetes.io/projected/eb75667c-b816-420a-b6e1-bd1da2826e2e-kube-api-access-zrfjj\") pod \"eb75667c-b816-420a-b6e1-bd1da2826e2e\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts\") pod \"eb75667c-b816-420a-b6e1-bd1da2826e2e\" (UID: \"eb75667c-b816-420a-b6e1-bd1da2826e2e\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdmvl\" (UniqueName: \"kubernetes.io/projected/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-kube-api-access-xdmvl\") pod \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpj7g\" (UniqueName: \"kubernetes.io/projected/ed2d2743-f38a-48d3-bef6-bc752af7f488-kube-api-access-jpj7g\") pod \"ed2d2743-f38a-48d3-bef6-bc752af7f488\" (UID: 
\"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts\") pod \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\" (UID: \"4560e3cf-e7a2-4a34-984e-1d58d955a9d2\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs59c\" (UniqueName: \"kubernetes.io/projected/b4226483-2932-4ba4-a0cd-d989688eb498-kube-api-access-zs59c\") pod \"b4226483-2932-4ba4-a0cd-d989688eb498\" (UID: \"b4226483-2932-4ba4-a0cd-d989688eb498\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4226483-2932-4ba4-a0cd-d989688eb498-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4226483-2932-4ba4-a0cd-d989688eb498" (UID: "b4226483-2932-4ba4-a0cd-d989688eb498"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.421910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2d2743-f38a-48d3-bef6-bc752af7f488-operator-scripts\") pod \"ed2d2743-f38a-48d3-bef6-bc752af7f488\" (UID: \"ed2d2743-f38a-48d3-bef6-bc752af7f488\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.422484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4560e3cf-e7a2-4a34-984e-1d58d955a9d2" (UID: "4560e3cf-e7a2-4a34-984e-1d58d955a9d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.422818 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.422834 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4226483-2932-4ba4-a0cd-d989688eb498-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.422833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb75667c-b816-420a-b6e1-bd1da2826e2e" (UID: "eb75667c-b816-420a-b6e1-bd1da2826e2e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.422844 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc68n\" (UniqueName: \"kubernetes.io/projected/f5d28d24-ed34-4aec-935a-ee218f56f442-kube-api-access-zc68n\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.422924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2d2743-f38a-48d3-bef6-bc752af7f488-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed2d2743-f38a-48d3-bef6-bc752af7f488" (UID: "ed2d2743-f38a-48d3-bef6-bc752af7f488"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.425792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-kube-api-access-xdmvl" (OuterVolumeSpecName: "kube-api-access-xdmvl") pod "4560e3cf-e7a2-4a34-984e-1d58d955a9d2" (UID: "4560e3cf-e7a2-4a34-984e-1d58d955a9d2"). InnerVolumeSpecName "kube-api-access-xdmvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.426070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4226483-2932-4ba4-a0cd-d989688eb498-kube-api-access-zs59c" (OuterVolumeSpecName: "kube-api-access-zs59c") pod "b4226483-2932-4ba4-a0cd-d989688eb498" (UID: "b4226483-2932-4ba4-a0cd-d989688eb498"). InnerVolumeSpecName "kube-api-access-zs59c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.426169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb75667c-b816-420a-b6e1-bd1da2826e2e-kube-api-access-zrfjj" (OuterVolumeSpecName: "kube-api-access-zrfjj") pod "eb75667c-b816-420a-b6e1-bd1da2826e2e" (UID: "eb75667c-b816-420a-b6e1-bd1da2826e2e"). InnerVolumeSpecName "kube-api-access-zrfjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.427117 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2d2743-f38a-48d3-bef6-bc752af7f488-kube-api-access-jpj7g" (OuterVolumeSpecName: "kube-api-access-jpj7g") pod "ed2d2743-f38a-48d3-bef6-bc752af7f488" (UID: "ed2d2743-f38a-48d3-bef6-bc752af7f488"). InnerVolumeSpecName "kube-api-access-jpj7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.463157 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.524932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc7hz\" (UniqueName: \"kubernetes.io/projected/49cd1191-1cd2-4afe-a66a-339adfd8cc18-kube-api-access-dc7hz\") pod \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt656\" (UniqueName: \"kubernetes.io/projected/11c197e2-8546-4ef2-8964-86a97f767446-kube-api-access-vt656\") pod \"11c197e2-8546-4ef2-8964-86a97f767446\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49cd1191-1cd2-4afe-a66a-339adfd8cc18-operator-scripts\") pod \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\" (UID: \"49cd1191-1cd2-4afe-a66a-339adfd8cc18\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c197e2-8546-4ef2-8964-86a97f767446-operator-scripts\") pod \"11c197e2-8546-4ef2-8964-86a97f767446\" (UID: \"11c197e2-8546-4ef2-8964-86a97f767446\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525929 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdmvl\" (UniqueName: \"kubernetes.io/projected/4560e3cf-e7a2-4a34-984e-1d58d955a9d2-kube-api-access-xdmvl\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525946 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpj7g\" (UniqueName: \"kubernetes.io/projected/ed2d2743-f38a-48d3-bef6-bc752af7f488-kube-api-access-jpj7g\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs59c\" (UniqueName: \"kubernetes.io/projected/b4226483-2932-4ba4-a0cd-d989688eb498-kube-api-access-zs59c\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525966 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2d2743-f38a-48d3-bef6-bc752af7f488-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525976 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrfjj\" (UniqueName: \"kubernetes.io/projected/eb75667c-b816-420a-b6e1-bd1da2826e2e-kube-api-access-zrfjj\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.525985 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb75667c-b816-420a-b6e1-bd1da2826e2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.526323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cd1191-1cd2-4afe-a66a-339adfd8cc18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49cd1191-1cd2-4afe-a66a-339adfd8cc18" (UID: 
"49cd1191-1cd2-4afe-a66a-339adfd8cc18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.526488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c197e2-8546-4ef2-8964-86a97f767446-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11c197e2-8546-4ef2-8964-86a97f767446" (UID: "11c197e2-8546-4ef2-8964-86a97f767446"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.568676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c197e2-8546-4ef2-8964-86a97f767446-kube-api-access-vt656" (OuterVolumeSpecName: "kube-api-access-vt656") pod "11c197e2-8546-4ef2-8964-86a97f767446" (UID: "11c197e2-8546-4ef2-8964-86a97f767446"). InnerVolumeSpecName "kube-api-access-vt656". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.569236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49cd1191-1cd2-4afe-a66a-339adfd8cc18-kube-api-access-dc7hz" (OuterVolumeSpecName: "kube-api-access-dc7hz") pod "49cd1191-1cd2-4afe-a66a-339adfd8cc18" (UID: "49cd1191-1cd2-4afe-a66a-339adfd8cc18"). InnerVolumeSpecName "kube-api-access-dc7hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.627764 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49cd1191-1cd2-4afe-a66a-339adfd8cc18-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.627794 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c197e2-8546-4ef2-8964-86a97f767446-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.627828 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc7hz\" (UniqueName: \"kubernetes.io/projected/49cd1191-1cd2-4afe-a66a-339adfd8cc18-kube-api-access-dc7hz\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.627840 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt656\" (UniqueName: \"kubernetes.io/projected/11c197e2-8546-4ef2-8964-86a97f767446-kube-api-access-vt656\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.759558 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.176:8776/healthcheck\": dial tcp 10.217.0.176:8776: connect: connection refused" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.781336 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.795281 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd/ovn-northd/0.log" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.795340 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.829903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-metrics-certs-tls-certs\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.829942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-config\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-669nz\" (UniqueName: \"kubernetes.io/projected/a4c4483b-abb1-4c04-9092-d8f04755bc93-kube-api-access-669nz\") pod \"a4c4483b-abb1-4c04-9092-d8f04755bc93\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v82fr\" (UniqueName: \"kubernetes.io/projected/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-kube-api-access-v82fr\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4483b-abb1-4c04-9092-d8f04755bc93-logs\") pod \"a4c4483b-abb1-4c04-9092-d8f04755bc93\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data-custom\") pod \"a4c4483b-abb1-4c04-9092-d8f04755bc93\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-scripts\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-rundir\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-combined-ca-bundle\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-combined-ca-bundle\") pod 
\"a4c4483b-abb1-4c04-9092-d8f04755bc93\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-northd-tls-certs\") pod \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\" (UID: \"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd\") " Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.830259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data\") pod \"a4c4483b-abb1-4c04-9092-d8f04755bc93\" (UID: \"a4c4483b-abb1-4c04-9092-d8f04755bc93\") " Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.830737 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.830779 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.830767394 +0000 UTC m=+3281.012283616 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.831014 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.831062 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:23.831049073 +0000 UTC m=+3281.012565285 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.831530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-scripts" (OuterVolumeSpecName: "scripts") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.831759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.833361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c4483b-abb1-4c04-9092-d8f04755bc93-kube-api-access-669nz" (OuterVolumeSpecName: "kube-api-access-669nz") pod "a4c4483b-abb1-4c04-9092-d8f04755bc93" (UID: "a4c4483b-abb1-4c04-9092-d8f04755bc93"). InnerVolumeSpecName "kube-api-access-669nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.833694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c4483b-abb1-4c04-9092-d8f04755bc93-logs" (OuterVolumeSpecName: "logs") pod "a4c4483b-abb1-4c04-9092-d8f04755bc93" (UID: "a4c4483b-abb1-4c04-9092-d8f04755bc93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.835882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-kube-api-access-v82fr" (OuterVolumeSpecName: "kube-api-access-v82fr") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "kube-api-access-v82fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.836384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4c4483b-abb1-4c04-9092-d8f04755bc93" (UID: "a4c4483b-abb1-4c04-9092-d8f04755bc93"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.836657 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-config" (OuterVolumeSpecName: "config") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.850037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.852419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4c4483b-abb1-4c04-9092-d8f04755bc93" (UID: "a4c4483b-abb1-4c04-9092-d8f04755bc93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.874646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data" (OuterVolumeSpecName: "config-data") pod "a4c4483b-abb1-4c04-9092-d8f04755bc93" (UID: "a4c4483b-abb1-4c04-9092-d8f04755bc93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.876725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" event={"ID":"f5d28d24-ed34-4aec-935a-ee218f56f442","Type":"ContainerDied","Data":"372e4f298a993f8f732059fa4c6eb1eee10140b82a351408e64786b29fecf578"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.876794 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.879324 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerID="60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85" exitCode=0 Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.879372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" event={"ID":"a4c4483b-abb1-4c04-9092-d8f04755bc93","Type":"ContainerDied","Data":"60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.879394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" event={"ID":"a4c4483b-abb1-4c04-9092-d8f04755bc93","Type":"ContainerDied","Data":"e682954da34993008005534dd037243e287aaf2a8a003495aee619f754b4a49e"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.879408 4707 scope.go:117] "RemoveContainer" containerID="60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.879482 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.883750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" event={"ID":"49cd1191-1cd2-4afe-a66a-339adfd8cc18","Type":"ContainerDied","Data":"b3fa5c9c2a3a3289502e1f5bf0f61c3bb3ba4d5d00f237d904c167f5dbe8f8f8"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.883916 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.896928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.897639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" event={"ID":"b4226483-2932-4ba4-a0cd-d989688eb498","Type":"ContainerDied","Data":"100a5144f04acebd478cf9063f957e07039893814e934d6782fcea7abf352b68"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.897707 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.916317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" (UID: "92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.917540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" event={"ID":"11c197e2-8546-4ef2-8964-86a97f767446","Type":"ContainerDied","Data":"0aa2a14d0251ab0e4dd3d13f329527a7138e4ffc26f59cd40580c30ac58604bd"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.917598 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.920568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" event={"ID":"eb75667c-b816-420a-b6e1-bd1da2826e2e","Type":"ContainerDied","Data":"22fb96223b619ecdbfa02115bbda51ab73b7cfd1b462a3f7e5a34b0cb3062706"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.920624 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933108 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933128 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933138 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933147 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933155 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933163 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933170 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc 
kubenswrapper[4707]: I0121 15:56:22.933178 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-669nz\" (UniqueName: \"kubernetes.io/projected/a4c4483b-abb1-4c04-9092-d8f04755bc93-kube-api-access-669nz\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933186 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v82fr\" (UniqueName: \"kubernetes.io/projected/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-kube-api-access-v82fr\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933193 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4483b-abb1-4c04-9092-d8f04755bc93-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933201 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4c4483b-abb1-4c04-9092-d8f04755bc93-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.933209 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933267 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933309 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:26.933297352 +0000 UTC m=+3284.114813574 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-internal-svc" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933428 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933449 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:26.933441874 +0000 UTC m=+3284.114958096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-ovndbs" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933532 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933552 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:56:26.933547023 +0000 UTC m=+3284.115063244 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-public-svc" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933630 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933648 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:26.933642722 +0000 UTC m=+3284.115158943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "combined-ca-bundle" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933723 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-config: secret "neutron-config" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933747 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:26.933741738 +0000 UTC m=+3284.115257960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-config" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933856 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-httpd-config: secret "neutron-httpd-config" not found Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.933880 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:26.933874678 +0000 UTC m=+3284.115390899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-httpd-config" not found Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.951869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" event={"ID":"ed2d2743-f38a-48d3-bef6-bc752af7f488","Type":"ContainerDied","Data":"51e0d5c42c8b50c2bd222fc48b8406fed0cea941b74380a7c4b91ff31b03d332"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.952027 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.961190 4707 scope.go:117] "RemoveContainer" containerID="e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.966438 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.967824 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd/ovn-northd/0.log" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.967852 4707 generic.go:334] "Generic (PLEG): container finished" podID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" exitCode=139 Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.967915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd","Type":"ContainerDied","Data":"488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.967937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd","Type":"ContainerDied","Data":"d2ad05ae89fd084069e55e5f76ce6ece9032a6ef14b84c5118ec3897705e45be"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.968000 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.977265 4707 generic.go:334] "Generic (PLEG): container finished" podID="243109f6-78d0-4750-a43f-21a7b606626d" containerID="b3b250ebed84579c975d272e553aa69f879cc66363e018d563bfefd4ff7ef997" exitCode=0 Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.977323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" event={"ID":"243109f6-78d0-4750-a43f-21a7b606626d","Type":"ContainerDied","Data":"b3b250ebed84579c975d272e553aa69f879cc66363e018d563bfefd4ff7ef997"} Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.982048 4707 scope.go:117] "RemoveContainer" containerID="60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.982149 4707 generic.go:334] "Generic (PLEG): container finished" podID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerID="0f293a8697eb3310b255d467efcfb20b65db8c27463105113488bd5b244efaf7" exitCode=0 Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.982192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b59f3c9f-adf5-49f3-8f0b-62044f91902c","Type":"ContainerDied","Data":"0f293a8697eb3310b255d467efcfb20b65db8c27463105113488bd5b244efaf7"} Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.983540 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85\": container with ID starting with 60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85 not found: ID does not exist" containerID="60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85" Jan 21 15:56:22 crc 
kubenswrapper[4707]: I0121 15:56:22.983595 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85"} err="failed to get container status \"60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85\": rpc error: code = NotFound desc = could not find container \"60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85\": container with ID starting with 60b886d442c9451d6dc59ebbc83dc85699c0985cf1cc5a57f930ce5323b8ea85 not found: ID does not exist" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.983617 4707 scope.go:117] "RemoveContainer" containerID="e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f" Jan 21 15:56:22 crc kubenswrapper[4707]: E0121 15:56:22.984357 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f\": container with ID starting with e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f not found: ID does not exist" containerID="e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.984383 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f"} err="failed to get container status \"e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f\": rpc error: code = NotFound desc = could not find container \"e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f\": container with ID starting with e19641518785933541a7abc90595def5986e9bbfe3cb1eac94a7194a07baf67f not found: ID does not exist" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.984396 4707 scope.go:117] "RemoveContainer" containerID="edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9" Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.999045 4707 generic.go:334] "Generic (PLEG): container finished" podID="28add001-f139-47b8-9ebb-659ba39858cc" containerID="db4b9f5629cb09208131600005b17fc382e9d120bae3a7b38b5a8a1752ce7728" exitCode=0 Jan 21 15:56:22 crc kubenswrapper[4707]: I0121 15:56:22.999094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" event={"ID":"28add001-f139-47b8-9ebb-659ba39858cc","Type":"ContainerDied","Data":"db4b9f5629cb09208131600005b17fc382e9d120bae3a7b38b5a8a1752ce7728"} Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.006518 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.008419 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerID="dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d" exitCode=0 Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.008547 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.009042 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.009128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"ec10a8f4-f5e6-4484-8c37-db290585f9b1","Type":"ContainerDied","Data":"dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d"} Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.009214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"ec10a8f4-f5e6-4484-8c37-db290585f9b1","Type":"ContainerDied","Data":"7dcb147429f61c94cdd8287ed0331c84ae66aa782c88e77279eba5f213e98eba"} Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.011453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" event={"ID":"4560e3cf-e7a2-4a34-984e-1d58d955a9d2","Type":"ContainerDied","Data":"7cddf9944defeeddc33e6845aae99e7e720462c7d11707fdd12e880a3887d38f"} Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.011562 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8j2jr" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.015557 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-b634-account-create-update-mqv4z"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.016040 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.016092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" event={"ID":"bb62efde-9e0c-429f-85ed-7a251716c8d9","Type":"ContainerStarted","Data":"8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc"} Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.024857 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kolla-config\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-default\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-combined-ca-bundle\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-internal-tls-certs\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gghhf\" (UniqueName: \"kubernetes.io/projected/243109f6-78d0-4750-a43f-21a7b606626d-kube-api-access-gghhf\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsjnl\" (UniqueName: \"kubernetes.io/projected/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kube-api-access-fsjnl\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-combined-ca-bundle\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-operator-scripts\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-public-tls-certs\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-config-data\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-scripts\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243109f6-78d0-4750-a43f-21a7b606626d-logs\") pod \"243109f6-78d0-4750-a43f-21a7b606626d\" (UID: \"243109f6-78d0-4750-a43f-21a7b606626d\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034718 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-galera-tls-certs\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.034737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-generated\") pod \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\" (UID: \"ec10a8f4-f5e6-4484-8c37-db290585f9b1\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.036158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.037173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.037699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.037720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.045919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243109f6-78d0-4750-a43f-21a7b606626d-logs" (OuterVolumeSpecName: "logs") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.058495 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243109f6-78d0-4750-a43f-21a7b606626d-kube-api-access-gghhf" (OuterVolumeSpecName: "kube-api-access-gghhf") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "kube-api-access-gghhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.062328 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.062798 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kube-api-access-fsjnl" (OuterVolumeSpecName: "kube-api-access-fsjnl") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "kube-api-access-fsjnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.063001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-scripts" (OuterVolumeSpecName: "scripts") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.070088 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.080870 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-fd9c-account-create-update-49grg"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.104776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.104784 4707 scope.go:117] "RemoveContainer" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.108858 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.120598 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.135443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.137652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data\") pod \"28add001-f139-47b8-9ebb-659ba39858cc\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.137873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-combined-ca-bundle\") pod \"28add001-f139-47b8-9ebb-659ba39858cc\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.137984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mmrm\" (UniqueName: \"kubernetes.io/projected/28add001-f139-47b8-9ebb-659ba39858cc-kube-api-access-7mmrm\") pod \"28add001-f139-47b8-9ebb-659ba39858cc\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.138014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28add001-f139-47b8-9ebb-659ba39858cc-logs\") pod \"28add001-f139-47b8-9ebb-659ba39858cc\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.138088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data-custom\") pod \"28add001-f139-47b8-9ebb-659ba39858cc\" (UID: \"28add001-f139-47b8-9ebb-659ba39858cc\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144229 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gghhf\" (UniqueName: \"kubernetes.io/projected/243109f6-78d0-4750-a43f-21a7b606626d-kube-api-access-gghhf\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144257 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsjnl\" (UniqueName: \"kubernetes.io/projected/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kube-api-access-fsjnl\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144283 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144294 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144302 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243109f6-78d0-4750-a43f-21a7b606626d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144311 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144325 4707 reconciler_common.go:293] "Volume detached for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144335 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.144345 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec10a8f4-f5e6-4484-8c37-db290585f9b1-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.147612 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.148000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28add001-f139-47b8-9ebb-659ba39858cc-logs" (OuterVolumeSpecName: "logs") pod "28add001-f139-47b8-9ebb-659ba39858cc" (UID: "28add001-f139-47b8-9ebb-659ba39858cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.156826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28add001-f139-47b8-9ebb-659ba39858cc" (UID: "28add001-f139-47b8-9ebb-659ba39858cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.164789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28add001-f139-47b8-9ebb-659ba39858cc-kube-api-access-7mmrm" (OuterVolumeSpecName: "kube-api-access-7mmrm") pod "28add001-f139-47b8-9ebb-659ba39858cc" (UID: "28add001-f139-47b8-9ebb-659ba39858cc"). InnerVolumeSpecName "kube-api-access-7mmrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.165641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-config-data" (OuterVolumeSpecName: "config-data") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.170702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.177333 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:56414->10.217.0.184:8775: read: connection reset by peer" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.177554 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:56406->10.217.0.184:8775: read: connection reset by peer" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.183525 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.184239 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-b3eb-account-create-update-758s8"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.185485 4707 scope.go:117] "RemoveContainer" containerID="edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.186984 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9\": container with ID starting with edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9 not found: ID does not exist" containerID="edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.187022 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9"} err="failed to get container status \"edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9\": rpc error: code = NotFound desc = could not find container \"edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9\": container with ID starting with edf38d507c6eb5ee49350ae28d1aa62ef5601cfe722414c0bda1288d2200b7c9 not found: ID does not exist" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.187040 4707 scope.go:117] "RemoveContainer" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.187504 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4\": container with ID starting with 488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4 not found: ID does not exist" containerID="488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.187544 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4"} err="failed to get container status 
\"488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4\": rpc error: code = NotFound desc = could not find container \"488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4\": container with ID starting with 488b69db924fbe8f17fd1acd60f65ca4a6252b30132c5c821d7021cb3b5d91f4 not found: ID does not exist" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.187566 4707 scope.go:117] "RemoveContainer" containerID="dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.194952 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" podStartSLOduration=4.194943785 podStartE2EDuration="4.194943785s" podCreationTimestamp="2026-01-21 15:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:56:23.104945371 +0000 UTC m=+3280.286461593" watchObservedRunningTime="2026-01-21 15:56:23.194943785 +0000 UTC m=+3280.376460007" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.206600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec10a8f4-f5e6-4484-8c37-db290585f9b1" (UID: "ec10a8f4-f5e6-4484-8c37-db290585f9b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.210239 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.212372 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be5de23-516b-4f0d-b987-a7adaac7d32f" path="/var/lib/kubelet/pods/7be5de23-516b-4f0d-b987-a7adaac7d32f/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.213979 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c05170c-aee6-46b7-bbc6-921bad0eb7ee" path="/var/lib/kubelet/pods/7c05170c-aee6-46b7-bbc6-921bad0eb7ee/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.215089 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6b93c5-8639-4f92-9141-d3b904db7bbb" path="/var/lib/kubelet/pods/7f6b93c5-8639-4f92-9141-d3b904db7bbb/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.216102 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="874c328a-6498-4a0a-9754-cb2051c03411" path="/var/lib/kubelet/pods/874c328a-6498-4a0a-9754-cb2051c03411/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.216564 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a32805d-3d95-44ab-8bd1-808d7391c49d" path="/var/lib/kubelet/pods/8a32805d-3d95-44ab-8bd1-808d7391c49d/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.217588 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" path="/var/lib/kubelet/pods/92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.218090 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada6986b-923f-4ca5-a85b-d0578223ae13" path="/var/lib/kubelet/pods/ada6986b-923f-4ca5-a85b-d0578223ae13/volumes" Jan 21 15:56:23 crc 
kubenswrapper[4707]: I0121 15:56:23.218556 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4226483-2932-4ba4-a0cd-d989688eb498" path="/var/lib/kubelet/pods/b4226483-2932-4ba4-a0cd-d989688eb498/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.218750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.218918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28add001-f139-47b8-9ebb-659ba39858cc" (UID: "28add001-f139-47b8-9ebb-659ba39858cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.219457 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c397cca9-7943-4169-9984-c5ec464de8de" path="/var/lib/kubelet/pods/c397cca9-7943-4169-9984-c5ec464de8de/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.221752 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d83529-68a5-4c04-9d24-4524d92f1efb" path="/var/lib/kubelet/pods/e5d83529-68a5-4c04-9d24-4524d92f1efb/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.222489 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb75667c-b816-420a-b6e1-bd1da2826e2e" path="/var/lib/kubelet/pods/eb75667c-b816-420a-b6e1-bd1da2826e2e/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.222944 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d28d24-ed34-4aec-935a-ee218f56f442" path="/var/lib/kubelet/pods/f5d28d24-ed34-4aec-935a-ee218f56f442/volumes" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-combined-ca-bundle\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-internal-tls-certs\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-public-tls-certs\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 
15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248470 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b59f3c9f-adf5-49f3-8f0b-62044f91902c-logs\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b59f3c9f-adf5-49f3-8f0b-62044f91902c-etc-machine-id\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsvpq\" (UniqueName: \"kubernetes.io/projected/b59f3c9f-adf5-49f3-8f0b-62044f91902c-kube-api-access-fsvpq\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-scripts\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.248600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data-custom\") pod \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\" (UID: \"b59f3c9f-adf5-49f3-8f0b-62044f91902c\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b59f3c9f-adf5-49f3-8f0b-62044f91902c-logs" (OuterVolumeSpecName: "logs") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249319 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec10a8f4-f5e6-4484-8c37-db290585f9b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249331 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249341 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249350 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b59f3c9f-adf5-49f3-8f0b-62044f91902c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249359 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mmrm\" (UniqueName: \"kubernetes.io/projected/28add001-f139-47b8-9ebb-659ba39858cc-kube-api-access-7mmrm\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249368 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28add001-f139-47b8-9ebb-659ba39858cc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249376 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249384 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.249392 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.249436 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.249466 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts podName:52fce448-ab0e-46d9-94d6-2d9be4cb6df7 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:25.249454177 +0000 UTC m=+3282.430970399 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts") pod "keystone-8d41-account-create-update-mddxw" (UID: "52fce448-ab0e-46d9-94d6-2d9be4cb6df7") : configmap "openstack-scripts" not found Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.252014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b59f3c9f-adf5-49f3-8f0b-62044f91902c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.252660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data" (OuterVolumeSpecName: "config-data") pod "28add001-f139-47b8-9ebb-659ba39858cc" (UID: "28add001-f139-47b8-9ebb-659ba39858cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.257860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.262334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.268082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59f3c9f-adf5-49f3-8f0b-62044f91902c-kube-api-access-fsvpq" (OuterVolumeSpecName: "kube-api-access-fsvpq") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "kube-api-access-fsvpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.269892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-scripts" (OuterVolumeSpecName: "scripts") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297215 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297245 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-fe8e-account-create-update-6nbj5"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297261 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297269 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-5988-account-create-update-sdl8z"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297291 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297299 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-a56d-account-create-update-fsqrj"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297309 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.297780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.304110 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b5f6558c6-lzljz"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.325958 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8j2jr"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.330532 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8j2jr"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.332123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "243109f6-78d0-4750-a43f-21a7b606626d" (UID: "243109f6-78d0-4750-a43f-21a7b606626d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.334936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.335039 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.338059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data" (OuterVolumeSpecName: "config-data") pod "b59f3c9f-adf5-49f3-8f0b-62044f91902c" (UID: "b59f3c9f-adf5-49f3-8f0b-62044f91902c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.338869 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.339547 4707 scope.go:117] "RemoveContainer" containerID="aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.350941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbfr\" (UniqueName: \"kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr\") pod \"keystone-8d41-account-create-update-mddxw\" (UID: \"52fce448-ab0e-46d9-94d6-2d9be4cb6df7\") " pod="openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351044 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351058 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351067 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b59f3c9f-adf5-49f3-8f0b-62044f91902c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351076 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsvpq\" (UniqueName: \"kubernetes.io/projected/b59f3c9f-adf5-49f3-8f0b-62044f91902c-kube-api-access-fsvpq\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351085 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351093 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351102 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351110 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351118 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59f3c9f-adf5-49f3-8f0b-62044f91902c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351126 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28add001-f139-47b8-9ebb-659ba39858cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.351133 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/243109f6-78d0-4750-a43f-21a7b606626d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.352228 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tvbfr for pod openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw: failed to fetch token: serviceaccounts "galera-openstack" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.352304 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr podName:52fce448-ab0e-46d9-94d6-2d9be4cb6df7 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:25.352289911 +0000 UTC m=+3282.533806132 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tvbfr" (UniqueName: "kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr") pod "keystone-8d41-account-create-update-mddxw" (UID: "52fce448-ab0e-46d9-94d6-2d9be4cb6df7") : failed to fetch token: serviceaccounts "galera-openstack" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.352925 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-8d41-account-create-update-mddxw"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.365361 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.371222 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.386548 4707 scope.go:117] "RemoveContainer" containerID="dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.386951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d\": container with ID starting with dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d not found: ID does not exist" containerID="dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.386988 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d"} err="failed to get container status \"dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d\": rpc error: code = NotFound desc = could not find container \"dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d\": container with ID starting with dc1edda17bf1886c2f32818ba4a4876b0320d5a0a52d61976b0fefe5a179338d not found: ID does not exist" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.387011 4707 scope.go:117] "RemoveContainer" containerID="aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.387316 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972\": container with ID starting with aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972 not found: ID does not exist" containerID="aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.387340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972"} err="failed to get container status \"aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972\": rpc error: code = NotFound desc = could not find container \"aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972\": container with ID starting with aeeedb10a3808a8c88ac2af91bdc6652795f10aba30a76e20594f12b86229972 not found: ID does not exist" Jan 21 15:56:23 crc kubenswrapper[4707]: 
I0121 15:56:23.398803 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp5g2\" (UniqueName: \"kubernetes.io/projected/448bc08a-61e8-48bb-be55-256be3a355a2-kube-api-access-xp5g2\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-combined-ca-bundle\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-config-data\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452671 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-scripts\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-public-tls-certs\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-httpd-run\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.452904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-logs\") pod \"448bc08a-61e8-48bb-be55-256be3a355a2\" (UID: \"448bc08a-61e8-48bb-be55-256be3a355a2\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.453447 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvbfr\" (UniqueName: \"kubernetes.io/projected/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-kube-api-access-tvbfr\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.453459 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52fce448-ab0e-46d9-94d6-2d9be4cb6df7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.453783 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-logs" (OuterVolumeSpecName: "logs") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.454004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.455470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/448bc08a-61e8-48bb-be55-256be3a355a2-kube-api-access-xp5g2" (OuterVolumeSpecName: "kube-api-access-xp5g2") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "kube-api-access-xp5g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.459898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.460485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-scripts" (OuterVolumeSpecName: "scripts") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.493490 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.502157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-config-data" (OuterVolumeSpecName: "config-data") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.505262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "448bc08a-61e8-48bb-be55-256be3a355a2" (UID: "448bc08a-61e8-48bb-be55-256be3a355a2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554637 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554660 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/448bc08a-61e8-48bb-be55-256be3a355a2-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554670 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp5g2\" (UniqueName: \"kubernetes.io/projected/448bc08a-61e8-48bb-be55-256be3a355a2-kube-api-access-xp5g2\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554680 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554689 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554696 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554703 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/448bc08a-61e8-48bb-be55-256be3a355a2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.554721 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.571957 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.585800 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.589855 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-internal-tls-certs\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ff97\" (UniqueName: \"kubernetes.io/projected/54a93993-fb38-4bec-9fa6-2ffa280ccecd-kube-api-access-2ff97\") pod \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-config-data\") pod \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-httpd-run\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a93993-fb38-4bec-9fa6-2ffa280ccecd-logs\") pod \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-logs\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-combined-ca-bundle\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-nova-metadata-tls-certs\") pod \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") 
" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-scripts\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67wc\" (UniqueName: \"kubernetes.io/projected/6c4d63f6-935c-47e2-92c0-4e27286253ba-kube-api-access-f67wc\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.656741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-combined-ca-bundle\") pod \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\" (UID: \"54a93993-fb38-4bec-9fa6-2ffa280ccecd\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.657176 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.657744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a93993-fb38-4bec-9fa6-2ffa280ccecd-logs" (OuterVolumeSpecName: "logs") pod "54a93993-fb38-4bec-9fa6-2ffa280ccecd" (UID: "54a93993-fb38-4bec-9fa6-2ffa280ccecd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.659947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-logs" (OuterVolumeSpecName: "logs") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.660107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.664100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.664115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-scripts" (OuterVolumeSpecName: "scripts") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.664365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a93993-fb38-4bec-9fa6-2ffa280ccecd-kube-api-access-2ff97" (OuterVolumeSpecName: "kube-api-access-2ff97") pod "54a93993-fb38-4bec-9fa6-2ffa280ccecd" (UID: "54a93993-fb38-4bec-9fa6-2ffa280ccecd"). InnerVolumeSpecName "kube-api-access-2ff97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.664404 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4d63f6-935c-47e2-92c0-4e27286253ba-kube-api-access-f67wc" (OuterVolumeSpecName: "kube-api-access-f67wc") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "kube-api-access-f67wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.677140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54a93993-fb38-4bec-9fa6-2ffa280ccecd" (UID: "54a93993-fb38-4bec-9fa6-2ffa280ccecd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.678276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.678787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-config-data" (OuterVolumeSpecName: "config-data") pod "54a93993-fb38-4bec-9fa6-2ffa280ccecd" (UID: "54a93993-fb38-4bec-9fa6-2ffa280ccecd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.691130 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data podName:6c4d63f6-935c-47e2-92c0-4e27286253ba nodeName:}" failed. No retries permitted until 2026-01-21 15:56:24.191110733 +0000 UTC m=+3281.372626955 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba") : error deleting /var/lib/kubelet/pods/6c4d63f6-935c-47e2-92c0-4e27286253ba/volume-subpaths: remove /var/lib/kubelet/pods/6c4d63f6-935c-47e2-92c0-4e27286253ba/volume-subpaths: no such file or directory Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.692840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.705326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "54a93993-fb38-4bec-9fa6-2ffa280ccecd" (UID: "54a93993-fb38-4bec-9fa6-2ffa280ccecd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759188 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759216 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759227 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67wc\" (UniqueName: \"kubernetes.io/projected/6c4d63f6-935c-47e2-92c0-4e27286253ba-kube-api-access-f67wc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759235 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759244 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759264 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759283 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ff97\" (UniqueName: \"kubernetes.io/projected/54a93993-fb38-4bec-9fa6-2ffa280ccecd-kube-api-access-2ff97\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759291 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a93993-fb38-4bec-9fa6-2ffa280ccecd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759298 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759305 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a93993-fb38-4bec-9fa6-2ffa280ccecd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759312 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d63f6-935c-47e2-92c0-4e27286253ba-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.759322 4707 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.775385 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.848200 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.860905 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.860971 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:27.860945803 +0000 UTC m=+3285.042462026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-public-svc" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861005 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861054 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:25.861039579 +0000 UTC m=+3283.042555801 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861082 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861131 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:27.861116244 +0000 UTC m=+3285.042632466 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "cert-keystone-internal-svc" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861158 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861184 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle podName:eb6b0d99-fa7f-4fce-8c2d-6737562bdd85 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:27.861177399 +0000 UTC m=+3285.042693621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle") pod "keystone-6f9f7564f-zdrdh" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85") : secret "combined-ca-bundle" not found Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.861242 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861365 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:23 crc kubenswrapper[4707]: E0121 15:56:23.861424 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:25.861412921 +0000 UTC m=+3283.042929143 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-logs\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data-custom\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6676q\" (UniqueName: \"kubernetes.io/projected/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-kube-api-access-6676q\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-public-tls-certs\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-internal-tls-certs\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.962589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-combined-ca-bundle\") pod \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\" (UID: \"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7\") " Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.965832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-logs" (OuterVolumeSpecName: "logs") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.967388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.969154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-kube-api-access-6676q" (OuterVolumeSpecName: "kube-api-access-6676q") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). InnerVolumeSpecName "kube-api-access-6676q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:23 crc kubenswrapper[4707]: I0121 15:56:23.980978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.002895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.008824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.009183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data" (OuterVolumeSpecName: "config-data") pod "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" (UID: "871b3b08-cd0d-4fee-96ab-7b75cf72a0a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.025162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b59f3c9f-adf5-49f3-8f0b-62044f91902c","Type":"ContainerDied","Data":"40b1b860db6575bed685952498b04a04377e4b919333ddfad12e7e87103c47b9"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.025218 4707 scope.go:117] "RemoveContainer" containerID="0f293a8697eb3310b255d467efcfb20b65db8c27463105113488bd5b244efaf7" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.025332 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.031822 4707 generic.go:334] "Generic (PLEG): container finished" podID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerID="c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b" exitCode=0 Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.031879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" event={"ID":"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7","Type":"ContainerDied","Data":"c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.031901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" event={"ID":"871b3b08-cd0d-4fee-96ab-7b75cf72a0a7","Type":"ContainerDied","Data":"ac729fbe14555219c0a09200bd5f7a530de88f8d8b6d1a038730cbeb9a223666"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.031945 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.035865 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.036238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5469fd854-s2lnn" event={"ID":"243109f6-78d0-4750-a43f-21a7b606626d","Type":"ContainerDied","Data":"2632e16366185f74b5f4647a40e3b3adadd47e4c4847d2a650c73312c38e7396"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.039685 4707 generic.go:334] "Generic (PLEG): container finished" podID="448bc08a-61e8-48bb-be55-256be3a355a2" containerID="292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55" exitCode=0 Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.039758 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.039758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"448bc08a-61e8-48bb-be55-256be3a355a2","Type":"ContainerDied","Data":"292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.039839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"448bc08a-61e8-48bb-be55-256be3a355a2","Type":"ContainerDied","Data":"656409fcf40297cb92024510355ff8bb824028e49035d0f352c2fb57af41d414"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.042710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" event={"ID":"28add001-f139-47b8-9ebb-659ba39858cc","Type":"ContainerDied","Data":"26c5a891a2c404bb36e9542098c2586cffd1c00d906dd37ada375d009956e7a8"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.042833 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.046572 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerID="c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715" exitCode=0 Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.046612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6c4d63f6-935c-47e2-92c0-4e27286253ba","Type":"ContainerDied","Data":"c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.046626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6c4d63f6-935c-47e2-92c0-4e27286253ba","Type":"ContainerDied","Data":"b5294e944a1c6c65e2aa16f2048bc9bbaa5285e76c2a4fca8ff2dc10cfc829c5"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.046664 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.059099 4707 generic.go:334] "Generic (PLEG): container finished" podID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerID="90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382" exitCode=0 Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.059177 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.059172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"54a93993-fb38-4bec-9fa6-2ffa280ccecd","Type":"ContainerDied","Data":"90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.059304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"54a93993-fb38-4bec-9fa6-2ffa280ccecd","Type":"ContainerDied","Data":"59e35d4182fb31c6d7ffc4da98fa3ee8e54da9f10f35945bf0133222ecdb664e"} Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064492 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064507 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064517 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064525 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064533 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6676q\" (UniqueName: 
\"kubernetes.io/projected/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-kube-api-access-6676q\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064542 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.064550 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.072821 4707 scope.go:117] "RemoveContainer" containerID="bed84625b1724689ba24d64b9ce564f5af89b171fde3cda59c73d1d35a9d2648" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.076770 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.090709 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.099499 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.104753 4707 scope.go:117] "RemoveContainer" containerID="c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.115994 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.125083 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.133857 4707 scope.go:117] "RemoveContainer" containerID="ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.135890 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-789c64d4bf-l5hnd"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.143020 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.147496 4707 scope.go:117] "RemoveContainer" containerID="c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.147553 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.147822 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b\": container with ID starting with c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b not found: ID does not exist" containerID="c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.147849 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b"} err="failed to get container status \"c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b\": rpc error: code 
= NotFound desc = could not find container \"c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b\": container with ID starting with c4a412333e9f02c730c4f28b376d92edb7b5d1c448e8bfc10afee230036db79b not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.147866 4707 scope.go:117] "RemoveContainer" containerID="ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.148133 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca\": container with ID starting with ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca not found: ID does not exist" containerID="ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.148157 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca"} err="failed to get container status \"ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca\": rpc error: code = NotFound desc = could not find container \"ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca\": container with ID starting with ad2dd4e7444096f5a3cd3c64b5a3fa4ff0266ebbc771d92ea85f4aca457b1fca not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.148175 4707 scope.go:117] "RemoveContainer" containerID="b3b250ebed84579c975d272e553aa69f879cc66363e018d563bfefd4ff7ef997" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.151997 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5469fd854-s2lnn"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.156364 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5469fd854-s2lnn"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.160199 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.163260 4707 scope.go:117] "RemoveContainer" containerID="cc0164bd92b9e0315765bd0cef3ea980987ab65b303b1a0cbcde9d86f9918385" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.163960 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.180795 4707 scope.go:117] "RemoveContainer" containerID="292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.201929 4707 scope.go:117] "RemoveContainer" containerID="4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.218127 4707 scope.go:117] "RemoveContainer" containerID="292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.218529 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55\": container with ID starting with 292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55 not found: ID does not exist" containerID="292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55" Jan 21 15:56:24 
crc kubenswrapper[4707]: I0121 15:56:24.218559 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55"} err="failed to get container status \"292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55\": rpc error: code = NotFound desc = could not find container \"292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55\": container with ID starting with 292d8fef021fc85e4f0844057abd61645e9eb2ddaffd36d78c4d5de75a085c55 not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.218577 4707 scope.go:117] "RemoveContainer" containerID="4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.218918 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387\": container with ID starting with 4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387 not found: ID does not exist" containerID="4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.218951 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387"} err="failed to get container status \"4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387\": rpc error: code = NotFound desc = could not find container \"4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387\": container with ID starting with 4816001f54fe0b74ff35cef69111f487ffbf5f530521b4d0fb8b3f568c3af387 not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.218972 4707 scope.go:117] "RemoveContainer" containerID="db4b9f5629cb09208131600005b17fc382e9d120bae3a7b38b5a8a1752ce7728" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.233934 4707 scope.go:117] "RemoveContainer" containerID="73b1fd5e41c624d54ff550ab75b71d0f8bbc342e1260f7be7c15ab44c21e4e16" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.248707 4707 scope.go:117] "RemoveContainer" containerID="c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.264679 4707 scope.go:117] "RemoveContainer" containerID="7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.267444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data\") pod \"6c4d63f6-935c-47e2-92c0-4e27286253ba\" (UID: \"6c4d63f6-935c-47e2-92c0-4e27286253ba\") " Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.269651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data" (OuterVolumeSpecName: "config-data") pod "6c4d63f6-935c-47e2-92c0-4e27286253ba" (UID: "6c4d63f6-935c-47e2-92c0-4e27286253ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.280658 4707 scope.go:117] "RemoveContainer" containerID="c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.280994 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715\": container with ID starting with c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715 not found: ID does not exist" containerID="c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.281020 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715"} err="failed to get container status \"c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715\": rpc error: code = NotFound desc = could not find container \"c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715\": container with ID starting with c6282e4b9f3c6368959a3abc092a7dd30efd2037cb626baf20b0331d92155715 not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.281034 4707 scope.go:117] "RemoveContainer" containerID="7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.281300 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e\": container with ID starting with 7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e not found: ID does not exist" containerID="7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.281322 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e"} err="failed to get container status \"7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e\": rpc error: code = NotFound desc = could not find container \"7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e\": container with ID starting with 7fd9fd2408c06db0cbe8a36331986f2419c75c24d17fd40e5744e995afd79c1e not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.281335 4707 scope.go:117] "RemoveContainer" containerID="90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.296586 4707 scope.go:117] "RemoveContainer" containerID="568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.313258 4707 scope.go:117] "RemoveContainer" containerID="90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.313545 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382\": container with ID starting with 90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382 not found: ID does not exist" containerID="90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382" Jan 21 15:56:24 crc 
kubenswrapper[4707]: I0121 15:56:24.313573 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382"} err="failed to get container status \"90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382\": rpc error: code = NotFound desc = could not find container \"90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382\": container with ID starting with 90f6b3d41550bfd8fe6f5ab9d80decaf6c9ef56a05e8727418d9c99dbb04a382 not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.313591 4707 scope.go:117] "RemoveContainer" containerID="568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a" Jan 21 15:56:24 crc kubenswrapper[4707]: E0121 15:56:24.313890 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a\": container with ID starting with 568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a not found: ID does not exist" containerID="568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.313926 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a"} err="failed to get container status \"568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a\": rpc error: code = NotFound desc = could not find container \"568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a\": container with ID starting with 568d29feedd612bc579812e5fd9b2b88b421bca342ef70a539954058d078659a not found: ID does not exist" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.369126 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d63f6-935c-47e2-92c0-4e27286253ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.372705 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:56:24 crc kubenswrapper[4707]: I0121 15:56:24.377886 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.071266 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" containerID="b6eb35a562dda2cde1c5f1def59c03da57bb93b3257929860e2eadfa4f05fa41" exitCode=0 Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.071417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" event={"ID":"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85","Type":"ContainerDied","Data":"b6eb35a562dda2cde1c5f1def59c03da57bb93b3257929860e2eadfa4f05fa41"} Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.189407 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c197e2-8546-4ef2-8964-86a97f767446" path="/var/lib/kubelet/pods/11c197e2-8546-4ef2-8964-86a97f767446/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.189783 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243109f6-78d0-4750-a43f-21a7b606626d" path="/var/lib/kubelet/pods/243109f6-78d0-4750-a43f-21a7b606626d/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 
15:56:25.190344 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28add001-f139-47b8-9ebb-659ba39858cc" path="/var/lib/kubelet/pods/28add001-f139-47b8-9ebb-659ba39858cc/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.191371 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" path="/var/lib/kubelet/pods/448bc08a-61e8-48bb-be55-256be3a355a2/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.191956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4560e3cf-e7a2-4a34-984e-1d58d955a9d2" path="/var/lib/kubelet/pods/4560e3cf-e7a2-4a34-984e-1d58d955a9d2/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.192308 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49cd1191-1cd2-4afe-a66a-339adfd8cc18" path="/var/lib/kubelet/pods/49cd1191-1cd2-4afe-a66a-339adfd8cc18/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.192527 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52fce448-ab0e-46d9-94d6-2d9be4cb6df7" path="/var/lib/kubelet/pods/52fce448-ab0e-46d9-94d6-2d9be4cb6df7/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.192832 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" path="/var/lib/kubelet/pods/54a93993-fb38-4bec-9fa6-2ffa280ccecd/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.193739 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" path="/var/lib/kubelet/pods/6c4d63f6-935c-47e2-92c0-4e27286253ba/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.194459 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" path="/var/lib/kubelet/pods/871b3b08-cd0d-4fee-96ab-7b75cf72a0a7/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.195713 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" path="/var/lib/kubelet/pods/a4c4483b-abb1-4c04-9092-d8f04755bc93/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.196249 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" path="/var/lib/kubelet/pods/b59f3c9f-adf5-49f3-8f0b-62044f91902c/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.198316 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" path="/var/lib/kubelet/pods/ec10a8f4-f5e6-4484-8c37-db290585f9b1/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.199591 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2d2743-f38a-48d3-bef6-bc752af7f488" path="/var/lib/kubelet/pods/ed2d2743-f38a-48d3-bef6-bc752af7f488/volumes" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.210414 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-credential-keys\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-scripts\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-config-data\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz5hn\" (UniqueName: \"kubernetes.io/projected/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-kube-api-access-cz5hn\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.279587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-fernet-keys\") pod \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\" (UID: \"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85\") " Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.283231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-scripts" (OuterVolumeSpecName: "scripts") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.283268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-kube-api-access-cz5hn" (OuterVolumeSpecName: "kube-api-access-cz5hn") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "kube-api-access-cz5hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.283613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.290916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.305754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.317839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.318995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.320388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-config-data" (OuterVolumeSpecName: "config-data") pod "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" (UID: "eb6b0d99-fa7f-4fce-8c2d-6737562bdd85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381393 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381414 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381423 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381432 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381439 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381446 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381454 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: I0121 15:56:25.381463 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz5hn\" (UniqueName: \"kubernetes.io/projected/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85-kube-api-access-cz5hn\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:25 crc kubenswrapper[4707]: E0121 15:56:25.686405 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:25 crc kubenswrapper[4707]: E0121 15:56:25.686466 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data podName:1eaefcd6-8b64-47a3-8e3a-280d7130b63e nodeName:}" failed. No retries permitted until 2026-01-21 15:56:33.686453548 +0000 UTC m=+3290.867969760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data") pod "rabbitmq-cell1-server-0" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 15:56:25 crc kubenswrapper[4707]: E0121 15:56:25.889651 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:25 crc kubenswrapper[4707]: E0121 15:56:25.889736 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:56:29.889718971 +0000 UTC m=+3287.071235194 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:25 crc kubenswrapper[4707]: E0121 15:56:25.889767 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:25 crc kubenswrapper[4707]: E0121 15:56:25.889835 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:29.889805433 +0000 UTC m=+3287.071321655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.105228 4707 generic.go:334] "Generic (PLEG): container finished" podID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerID="c684b390b448b0e7f019fce643a06fcefabe494a19baeedc6a8e4cf355706140" exitCode=0 Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.105305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1eaefcd6-8b64-47a3-8e3a-280d7130b63e","Type":"ContainerDied","Data":"c684b390b448b0e7f019fce643a06fcefabe494a19baeedc6a8e4cf355706140"} Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.107148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" event={"ID":"eb6b0d99-fa7f-4fce-8c2d-6737562bdd85","Type":"ContainerDied","Data":"8b952ccd01371b3e89fdc853ed7872b91d4d20965606e37c429f031d8647070a"} Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.107245 4707 scope.go:117] "RemoveContainer" containerID="b6eb35a562dda2cde1c5f1def59c03da57bb93b3257929860e2eadfa4f05fa41" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.107177 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9f7564f-zdrdh" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.154623 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f9f7564f-zdrdh"] Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.159761 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6f9f7564f-zdrdh"] Jan 21 15:56:26 crc kubenswrapper[4707]: E0121 15:56:26.204172 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 15:56:26 crc kubenswrapper[4707]: E0121 15:56:26.204236 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data podName:1fb17d2c-7e86-4451-b56e-4e89fc494542 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:34.204221856 +0000 UTC m=+3291.385738078 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data") pod "rabbitmq-server-0" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542") : configmap "rabbitmq-config-data" not found Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.371044 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5p7c\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-kube-api-access-w5p7c\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-erlang-cookie\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-tls\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-server-conf\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-confd\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-plugins\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.406963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-erlang-cookie-secret\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.407018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.407067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-pod-info\") pod 
\"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.407103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-plugins-conf\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.407141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data\") pod \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\" (UID: \"1eaefcd6-8b64-47a3-8e3a-280d7130b63e\") " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.408783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.408965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.409179 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.410734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-kube-api-access-w5p7c" (OuterVolumeSpecName: "kube-api-access-w5p7c") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "kube-api-access-w5p7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.411094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.411519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "persistence") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.411940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-pod-info" (OuterVolumeSpecName: "pod-info") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.423070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.425734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data" (OuterVolumeSpecName: "config-data") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.435243 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-server-conf" (OuterVolumeSpecName: "server-conf") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.466256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1eaefcd6-8b64-47a3-8e3a-280d7130b63e" (UID: "1eaefcd6-8b64-47a3-8e3a-280d7130b63e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.508930 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.508954 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.508967 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.508976 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.508999 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.509008 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.509016 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.509023 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.509032 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5p7c\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-kube-api-access-w5p7c\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.509040 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.509047 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1eaefcd6-8b64-47a3-8e3a-280d7130b63e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.522675 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.610567 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:26 crc kubenswrapper[4707]: I0121 15:56:26.966243 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-erlang-cookie\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-tls\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fb17d2c-7e86-4451-b56e-4e89fc494542-erlang-cookie-secret\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-plugins-conf\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-server-conf\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjshv\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-kube-api-access-tjshv\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.017985 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-confd\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.018001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.018018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fb17d2c-7e86-4451-b56e-4e89fc494542-pod-info\") pod 
\"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.018097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-plugins\") pod \"1fb17d2c-7e86-4451-b56e-4e89fc494542\" (UID: \"1fb17d2c-7e86-4451-b56e-4e89fc494542\") " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.018596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.018614 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.018687 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:35.01866309 +0000 UTC m=+3292.200179312 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-public-svc" not found Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.018605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019099 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-config: secret "neutron-config" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019302 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:35.019291582 +0000 UTC m=+3292.200807803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-config" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019136 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-httpd-config: secret "neutron-httpd-config" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019332 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:35.019325666 +0000 UTC m=+3292.200841889 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-httpd-config" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019161 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019429 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:35.019413761 +0000 UTC m=+3292.200929984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-ovndbs" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019163 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019467 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:35.019460609 +0000 UTC m=+3292.200976831 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-internal-svc" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019201 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.019490 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:35.019486138 +0000 UTC m=+3292.201002360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "combined-ca-bundle" not found Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.019505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.022910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb17d2c-7e86-4451-b56e-4e89fc494542-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.022943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.022959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1fb17d2c-7e86-4451-b56e-4e89fc494542-pod-info" (OuterVolumeSpecName: "pod-info") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.022975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-kube-api-access-tjshv" (OuterVolumeSpecName: "kube-api-access-tjshv") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "kube-api-access-tjshv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.027510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "persistence") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.033295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data" (OuterVolumeSpecName: "config-data") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.046537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-server-conf" (OuterVolumeSpecName: "server-conf") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.073973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1fb17d2c-7e86-4451-b56e-4e89fc494542" (UID: "1fb17d2c-7e86-4451-b56e-4e89fc494542"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.114644 4707 generic.go:334] "Generic (PLEG): container finished" podID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerID="af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a" exitCode=0 Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.114686 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.114697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1fb17d2c-7e86-4451-b56e-4e89fc494542","Type":"ContainerDied","Data":"af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a"} Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.114720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"1fb17d2c-7e86-4451-b56e-4e89fc494542","Type":"ContainerDied","Data":"b30872d48350038c1ef31ec76544f116a2f2e79bdeabe2733d3094e9fc29ca57"} Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.114734 4707 scope.go:117] "RemoveContainer" containerID="af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.117404 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.118075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1eaefcd6-8b64-47a3-8e3a-280d7130b63e","Type":"ContainerDied","Data":"a7fb614e0e11f27b97c72334e1130103e002ccc22d5af5fc4409cf3da5fc830c"} Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119603 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119623 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fb17d2c-7e86-4451-b56e-4e89fc494542-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119632 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119642 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119650 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fb17d2c-7e86-4451-b56e-4e89fc494542-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119657 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjshv\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-kube-api-access-tjshv\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119666 4707 reconciler_common.go:293] "Volume detached for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119681 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119690 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fb17d2c-7e86-4451-b56e-4e89fc494542-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119698 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.119706 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fb17d2c-7e86-4451-b56e-4e89fc494542-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.138667 4707 scope.go:117] "RemoveContainer" containerID="ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.145294 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.146821 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.161200 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.167142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.170039 4707 scope.go:117] "RemoveContainer" containerID="af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a" Jan 21 15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.170382 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a\": container with ID starting with af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a not found: ID does not exist" containerID="af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.170419 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a"} err="failed to get container status \"af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a\": rpc error: code = NotFound desc = could not find container \"af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a\": container with ID starting with af115aa251b883f6b502e4b8e645c525e55c11b9d96a4ca242b4fbd916e44a9a not found: ID does not exist" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.170439 4707 scope.go:117] "RemoveContainer" containerID="ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c" Jan 21 
15:56:27 crc kubenswrapper[4707]: E0121 15:56:27.170690 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c\": container with ID starting with ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c not found: ID does not exist" containerID="ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.170705 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c"} err="failed to get container status \"ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c\": rpc error: code = NotFound desc = could not find container \"ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c\": container with ID starting with ff80cb6da7c2ebd7f2d733f0a031e4171189eaeecf35f4d436e9f6c24530512c not found: ID does not exist" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.170717 4707 scope.go:117] "RemoveContainer" containerID="c684b390b448b0e7f019fce643a06fcefabe494a19baeedc6a8e4cf355706140" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.172160 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.186029 4707 scope.go:117] "RemoveContainer" containerID="14470d78188ddcd00f2db0dea2c3bc584052343e21da79ba6654c2da9d0b26e0" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.189560 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" path="/var/lib/kubelet/pods/1eaefcd6-8b64-47a3-8e3a-280d7130b63e/volumes" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.190238 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" path="/var/lib/kubelet/pods/1fb17d2c-7e86-4451-b56e-4e89fc494542/volumes" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.191151 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" path="/var/lib/kubelet/pods/eb6b0d99-fa7f-4fce-8c2d-6737562bdd85/volumes" Jan 21 15:56:27 crc kubenswrapper[4707]: I0121 15:56:27.221459 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:28 crc kubenswrapper[4707]: I0121 15:56:28.646801 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:56:28 crc kubenswrapper[4707]: I0121 15:56:28.646852 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-b87b87cdd-2lqlf" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 15:56:28 crc kubenswrapper[4707]: I0121 15:56:28.762347 4707 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:56:28 crc kubenswrapper[4707]: I0121 15:56:28.762535 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" containerID="cri-o://031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" gracePeriod=30 Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.420964 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.421128 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="56a9521e-4024-4a1d-8b87-66d305c22eb4" containerName="memcached" containerID="cri-o://7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf" gracePeriod=30 Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.438846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.443826 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.444025 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" containerID="cri-o://21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" gracePeriod=30 Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.450376 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vp5jf"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.454868 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.458487 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-rrqb9"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.462020 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:56:29 crc kubenswrapper[4707]: I0121 15:56:29.462194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" containerID="cri-o://bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" gracePeriod=30 Jan 21 15:56:29 crc kubenswrapper[4707]: E0121 15:56:29.953980 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:29 crc kubenswrapper[4707]: E0121 15:56:29.954033 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:37.954020921 +0000 UTC m=+3295.135537142 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:29 crc kubenswrapper[4707]: E0121 15:56:29.954300 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:29 crc kubenswrapper[4707]: E0121 15:56:29.954325 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:37.954318259 +0000 UTC m=+3295.135834472 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.249925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.291078 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j"] Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.291279 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerName="dnsmasq-dns" containerID="cri-o://9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd" gracePeriod=10 Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.714870 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.766287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dns-swift-storage-0\") pod \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.766340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dnsmasq-svc\") pod \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.766373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5mn2\" (UniqueName: \"kubernetes.io/projected/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-kube-api-access-t5mn2\") pod \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.766469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-config\") pod \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\" (UID: \"7d4d9381-62e0-4779-9664-b1f9a3ced4ba\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.770980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-kube-api-access-t5mn2" (OuterVolumeSpecName: "kube-api-access-t5mn2") pod "7d4d9381-62e0-4779-9664-b1f9a3ced4ba" (UID: "7d4d9381-62e0-4779-9664-b1f9a3ced4ba"). InnerVolumeSpecName "kube-api-access-t5mn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.774106 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.799037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "7d4d9381-62e0-4779-9664-b1f9a3ced4ba" (UID: "7d4d9381-62e0-4779-9664-b1f9a3ced4ba"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.799312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-config" (OuterVolumeSpecName: "config") pod "7d4d9381-62e0-4779-9664-b1f9a3ced4ba" (UID: "7d4d9381-62e0-4779-9664-b1f9a3ced4ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.801374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d4d9381-62e0-4779-9664-b1f9a3ced4ba" (UID: "7d4d9381-62e0-4779-9664-b1f9a3ced4ba"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.867553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-combined-ca-bundle\") pod \"56a9521e-4024-4a1d-8b87-66d305c22eb4\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.867588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbknm\" (UniqueName: \"kubernetes.io/projected/56a9521e-4024-4a1d-8b87-66d305c22eb4-kube-api-access-jbknm\") pod \"56a9521e-4024-4a1d-8b87-66d305c22eb4\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.867627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-kolla-config\") pod \"56a9521e-4024-4a1d-8b87-66d305c22eb4\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.867664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-memcached-tls-certs\") pod \"56a9521e-4024-4a1d-8b87-66d305c22eb4\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.867709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-config-data\") pod \"56a9521e-4024-4a1d-8b87-66d305c22eb4\" (UID: \"56a9521e-4024-4a1d-8b87-66d305c22eb4\") " Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "56a9521e-4024-4a1d-8b87-66d305c22eb4" (UID: "56a9521e-4024-4a1d-8b87-66d305c22eb4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-config-data" (OuterVolumeSpecName: "config-data") pod "56a9521e-4024-4a1d-8b87-66d305c22eb4" (UID: "56a9521e-4024-4a1d-8b87-66d305c22eb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868633 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5mn2\" (UniqueName: \"kubernetes.io/projected/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-kube-api-access-t5mn2\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868664 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868671 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a9521e-4024-4a1d-8b87-66d305c22eb4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868680 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.868687 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d4d9381-62e0-4779-9664-b1f9a3ced4ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.869681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a9521e-4024-4a1d-8b87-66d305c22eb4-kube-api-access-jbknm" (OuterVolumeSpecName: "kube-api-access-jbknm") pod "56a9521e-4024-4a1d-8b87-66d305c22eb4" (UID: "56a9521e-4024-4a1d-8b87-66d305c22eb4"). InnerVolumeSpecName "kube-api-access-jbknm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.881805 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56a9521e-4024-4a1d-8b87-66d305c22eb4" (UID: "56a9521e-4024-4a1d-8b87-66d305c22eb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.892799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "56a9521e-4024-4a1d-8b87-66d305c22eb4" (UID: "56a9521e-4024-4a1d-8b87-66d305c22eb4"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.970304 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.970506 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbknm\" (UniqueName: \"kubernetes.io/projected/56a9521e-4024-4a1d-8b87-66d305c22eb4-kube-api-access-jbknm\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:30 crc kubenswrapper[4707]: I0121 15:56:30.970517 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a9521e-4024-4a1d-8b87-66d305c22eb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.147166 4707 generic.go:334] "Generic (PLEG): container finished" podID="56a9521e-4024-4a1d-8b87-66d305c22eb4" containerID="7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf" exitCode=0 Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.147211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"56a9521e-4024-4a1d-8b87-66d305c22eb4","Type":"ContainerDied","Data":"7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf"} Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.147223 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.147285 4707 scope.go:117] "RemoveContainer" containerID="7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.147262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"56a9521e-4024-4a1d-8b87-66d305c22eb4","Type":"ContainerDied","Data":"dafe48a2197071c0507f0af076c626d5035a6ed80ff2862cca486423004ed811"} Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.148874 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerID="9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd" exitCode=0 Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.148904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" event={"ID":"7d4d9381-62e0-4779-9664-b1f9a3ced4ba","Type":"ContainerDied","Data":"9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd"} Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.148920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" event={"ID":"7d4d9381-62e0-4779-9664-b1f9a3ced4ba","Type":"ContainerDied","Data":"03248ed147576477673165d7a84bf20ab85caec1c459f8bbbe5c284d888e1d92"} Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.148959 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.168134 4707 scope.go:117] "RemoveContainer" containerID="7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf" Jan 21 15:56:31 crc kubenswrapper[4707]: E0121 15:56:31.168460 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf\": container with ID starting with 7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf not found: ID does not exist" containerID="7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.168482 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf"} err="failed to get container status \"7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf\": rpc error: code = NotFound desc = could not find container \"7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf\": container with ID starting with 7438fca70688110e3e2a7f86c17652d83000c49e716192f28f06574fc9e69ecf not found: ID does not exist" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.168499 4707 scope.go:117] "RemoveContainer" containerID="9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.174485 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j"] Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.193374 4707 scope.go:117] "RemoveContainer" containerID="71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.195003 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a11152-db37-403a-a567-131288939b37" path="/var/lib/kubelet/pods/55a11152-db37-403a-a567-131288939b37/volumes" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.195635 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82693866-aeda-4300-b19e-819e2f09922f" path="/var/lib/kubelet/pods/82693866-aeda-4300-b19e-819e2f09922f/volumes" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.196148 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859975766f-whq4j"] Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.199319 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.204219 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.207131 4707 scope.go:117] "RemoveContainer" containerID="9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd" Jan 21 15:56:31 crc kubenswrapper[4707]: E0121 15:56:31.207427 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd\": container with ID starting with 9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd not found: ID does not exist" containerID="9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 
15:56:31.207458 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd"} err="failed to get container status \"9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd\": rpc error: code = NotFound desc = could not find container \"9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd\": container with ID starting with 9ac0731e5390790c1c36c627c0cf6fca59103247f12376e3730ad468984537cd not found: ID does not exist" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.207478 4707 scope.go:117] "RemoveContainer" containerID="71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d" Jan 21 15:56:31 crc kubenswrapper[4707]: E0121 15:56:31.207717 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d\": container with ID starting with 71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d not found: ID does not exist" containerID="71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d" Jan 21 15:56:31 crc kubenswrapper[4707]: I0121 15:56:31.207802 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d"} err="failed to get container status \"71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d\": rpc error: code = NotFound desc = could not find container \"71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d\": container with ID starting with 71647ad5acb0f775e20df26d96e49bddddc73c5cb896cacca6d5b79a4c88f48d not found: ID does not exist" Jan 21 15:56:32 crc kubenswrapper[4707]: I0121 15:56:32.113953 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: TLS handshake timeout" Jan 21 15:56:32 crc kubenswrapper[4707]: I0121 15:56:32.114935 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: TLS handshake timeout" Jan 21 15:56:32 crc kubenswrapper[4707]: E0121 15:56:32.987446 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:32 crc kubenswrapper[4707]: E0121 15:56:32.988597 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:32 crc kubenswrapper[4707]: E0121 15:56:32.989715 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:32 crc kubenswrapper[4707]: E0121 15:56:32.989740 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.191181 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.192787 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a9521e-4024-4a1d-8b87-66d305c22eb4" path="/var/lib/kubelet/pods/56a9521e-4024-4a1d-8b87-66d305c22eb4/volumes" Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.192968 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.193581 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" path="/var/lib/kubelet/pods/7d4d9381-62e0-4779-9664-b1f9a3ced4ba/volumes" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.546768 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.605191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-public-tls-certs\") pod \"2604d874-1aab-419e-a67e-801734497292\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.605270 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-config-data\") pod \"2604d874-1aab-419e-a67e-801734497292\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.605333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-internal-tls-certs\") pod \"2604d874-1aab-419e-a67e-801734497292\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.605352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-combined-ca-bundle\") pod \"2604d874-1aab-419e-a67e-801734497292\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.605380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb54t\" (UniqueName: \"kubernetes.io/projected/2604d874-1aab-419e-a67e-801734497292-kube-api-access-tb54t\") pod \"2604d874-1aab-419e-a67e-801734497292\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.605487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604d874-1aab-419e-a67e-801734497292-logs\") pod \"2604d874-1aab-419e-a67e-801734497292\" (UID: \"2604d874-1aab-419e-a67e-801734497292\") " Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.606070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2604d874-1aab-419e-a67e-801734497292-logs" (OuterVolumeSpecName: "logs") pod "2604d874-1aab-419e-a67e-801734497292" (UID: "2604d874-1aab-419e-a67e-801734497292"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.609370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2604d874-1aab-419e-a67e-801734497292-kube-api-access-tb54t" (OuterVolumeSpecName: "kube-api-access-tb54t") pod "2604d874-1aab-419e-a67e-801734497292" (UID: "2604d874-1aab-419e-a67e-801734497292"). InnerVolumeSpecName "kube-api-access-tb54t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.621961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2604d874-1aab-419e-a67e-801734497292" (UID: "2604d874-1aab-419e-a67e-801734497292"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.621999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-config-data" (OuterVolumeSpecName: "config-data") pod "2604d874-1aab-419e-a67e-801734497292" (UID: "2604d874-1aab-419e-a67e-801734497292"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.632196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2604d874-1aab-419e-a67e-801734497292" (UID: "2604d874-1aab-419e-a67e-801734497292"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.633167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2604d874-1aab-419e-a67e-801734497292" (UID: "2604d874-1aab-419e-a67e-801734497292"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.707532 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604d874-1aab-419e-a67e-801734497292-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.707557 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.707567 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.707577 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.707586 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604d874-1aab-419e-a67e-801734497292-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:33 crc kubenswrapper[4707]: I0121 15:56:33.707594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb54t\" (UniqueName: \"kubernetes.io/projected/2604d874-1aab-419e-a67e-801734497292-kube-api-access-tb54t\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.950149 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.951264 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot 
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.952436 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.952468 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.964370 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.965217 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.966125 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:33 crc kubenswrapper[4707]: E0121 15:56:33.966151 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.175592 4707 generic.go:334] "Generic (PLEG): container finished" podID="2604d874-1aab-419e-a67e-801734497292" containerID="7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7" exitCode=0 Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.175639 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.175656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2604d874-1aab-419e-a67e-801734497292","Type":"ContainerDied","Data":"7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7"} Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.175948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2604d874-1aab-419e-a67e-801734497292","Type":"ContainerDied","Data":"f9f14ca061218e6674358237ac9cefa1f3ff05176447b7f19f802fddc87ac200"} Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.175970 4707 scope.go:117] "RemoveContainer" containerID="7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.205756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.210585 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.210746 4707 scope.go:117] "RemoveContainer" containerID="702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.224395 4707 scope.go:117] "RemoveContainer" containerID="7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7" Jan 21 15:56:34 crc kubenswrapper[4707]: E0121 15:56:34.224729 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7\": container with ID starting with 7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7 not found: ID does not exist" containerID="7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.224754 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7"} err="failed to get container status \"7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7\": rpc error: code = NotFound desc = could not find container \"7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7\": container with ID starting with 7de198ef8f0db184f109084b2078f9ce0c4734a155551b0fa9f44d4dcb246ea7 not found: ID does not exist" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.224776 4707 scope.go:117] "RemoveContainer" containerID="702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f" Jan 21 15:56:34 crc kubenswrapper[4707]: E0121 15:56:34.225034 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f\": container with ID starting with 702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f not found: ID does not exist" containerID="702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f" Jan 21 15:56:34 crc kubenswrapper[4707]: I0121 15:56:34.225055 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f"} err="failed to get container status 
\"702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f\": rpc error: code = NotFound desc = could not find container \"702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f\": container with ID starting with 702efd578e4cddf7cf7b8169b6eb192a7484eb07d1efb33b384b1aa9b6b2770f not found: ID does not exist" Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023117 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-config: secret "neutron-config" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023189 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:51.023173001 +0000 UTC m=+3308.204689233 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-config" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023188 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023188 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/neutron-httpd-config: secret "neutron-httpd-config" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023249 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:51.023234226 +0000 UTC m=+3308.204750448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-internal-svc" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023261 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023132 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023279 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:51.023262019 +0000 UTC m=+3308.204778240 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "neutron-httpd-config" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023348 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:51.023337932 +0000 UTC m=+3308.204854154 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-ovndbs" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023375 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:51.023354372 +0000 UTC m=+3308.204870595 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "combined-ca-bundle" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023559 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 15:56:35 crc kubenswrapper[4707]: E0121 15:56:35.023606 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs podName:dd8127e7-3a43-454e-a3c9-190cf9eab3d2 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:51.023595996 +0000 UTC m=+3308.205112228 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs") pod "neutron-5b4488769b-2ldzb" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2") : secret "cert-neutron-public-svc" not found Jan 21 15:56:35 crc kubenswrapper[4707]: I0121 15:56:35.211634 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2604d874-1aab-419e-a67e-801734497292" path="/var/lib/kubelet/pods/2604d874-1aab-419e-a67e-801734497292/volumes" Jan 21 15:56:37 crc kubenswrapper[4707]: E0121 15:56:37.987251 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:37 crc kubenswrapper[4707]: E0121 15:56:37.988230 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:37 crc kubenswrapper[4707]: E0121 15:56:37.989452 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:37 crc kubenswrapper[4707]: E0121 15:56:37.989481 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" 
containerName="nova-scheduler-scheduler" Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.053780 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.053838 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:54.053827708 +0000 UTC m=+3311.235343931 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.053780 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.053925 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:56:54.053915304 +0000 UTC m=+3311.235431526 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.949481 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.950723 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.951778 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.951896 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.963148 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.964046 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.965190 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:38 crc kubenswrapper[4707]: E0121 15:56:38.965302 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:56:41 crc kubenswrapper[4707]: I0121 15:56:41.964177 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.181:9696/\": dial tcp 10.217.0.181:9696: connect: connection refused" Jan 21 15:56:42 crc kubenswrapper[4707]: E0121 15:56:42.987299 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:42 crc kubenswrapper[4707]: E0121 15:56:42.988408 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:42 crc kubenswrapper[4707]: E0121 15:56:42.989400 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:42 crc kubenswrapper[4707]: E0121 15:56:42.989441 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.952843 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.953764 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.954638 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.954662 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.963643 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.964542 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.965345 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:43 crc kubenswrapper[4707]: E0121 15:56:43.965393 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:56:47 crc kubenswrapper[4707]: I0121 15:56:47.030354 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.183:3000/\": dial tcp 10.217.0.183:3000: connect: connection refused" Jan 21 15:56:47 crc kubenswrapper[4707]: E0121 15:56:47.987354 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:47 crc kubenswrapper[4707]: E0121 15:56:47.988724 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:47 crc kubenswrapper[4707]: E0121 15:56:47.989649 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:47 crc kubenswrapper[4707]: E0121 15:56:47.989685 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:56:48 crc kubenswrapper[4707]: I0121 15:56:48.182793 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.183026 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.949997 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.951229 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.952495 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.952543 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" 
containerName="nova-cell0-conductor-conductor" Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.963486 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.964714 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.966013 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:48 crc kubenswrapper[4707]: E0121 15:56:48.966060 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.164656 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.191016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data\") pod \"79a91989-c584-44cc-bb31-494f3d1a7a7d\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.191093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79a91989-c584-44cc-bb31-494f3d1a7a7d-etc-machine-id\") pod \"79a91989-c584-44cc-bb31-494f3d1a7a7d\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.191120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-scripts\") pod \"79a91989-c584-44cc-bb31-494f3d1a7a7d\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.191188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvbkl\" (UniqueName: \"kubernetes.io/projected/79a91989-c584-44cc-bb31-494f3d1a7a7d-kube-api-access-zvbkl\") pod \"79a91989-c584-44cc-bb31-494f3d1a7a7d\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.191212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data-custom\") pod \"79a91989-c584-44cc-bb31-494f3d1a7a7d\" (UID: 
\"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.191243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-combined-ca-bundle\") pod \"79a91989-c584-44cc-bb31-494f3d1a7a7d\" (UID: \"79a91989-c584-44cc-bb31-494f3d1a7a7d\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.192480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a91989-c584-44cc-bb31-494f3d1a7a7d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "79a91989-c584-44cc-bb31-494f3d1a7a7d" (UID: "79a91989-c584-44cc-bb31-494f3d1a7a7d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.195738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79a91989-c584-44cc-bb31-494f3d1a7a7d" (UID: "79a91989-c584-44cc-bb31-494f3d1a7a7d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.195676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-scripts" (OuterVolumeSpecName: "scripts") pod "79a91989-c584-44cc-bb31-494f3d1a7a7d" (UID: "79a91989-c584-44cc-bb31-494f3d1a7a7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.197997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a91989-c584-44cc-bb31-494f3d1a7a7d-kube-api-access-zvbkl" (OuterVolumeSpecName: "kube-api-access-zvbkl") pod "79a91989-c584-44cc-bb31-494f3d1a7a7d" (UID: "79a91989-c584-44cc-bb31-494f3d1a7a7d"). InnerVolumeSpecName "kube-api-access-zvbkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.221464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79a91989-c584-44cc-bb31-494f3d1a7a7d" (UID: "79a91989-c584-44cc-bb31-494f3d1a7a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.242345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data" (OuterVolumeSpecName: "config-data") pod "79a91989-c584-44cc-bb31-494f3d1a7a7d" (UID: "79a91989-c584-44cc-bb31-494f3d1a7a7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.287567 4707 generic.go:334] "Generic (PLEG): container finished" podID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerID="ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39" exitCode=137 Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.287599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"79a91989-c584-44cc-bb31-494f3d1a7a7d","Type":"ContainerDied","Data":"ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39"} Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.287619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"79a91989-c584-44cc-bb31-494f3d1a7a7d","Type":"ContainerDied","Data":"8e391b4b7e37eb92ff609b9149d7f3da92176bcb2a3b530d12f255276a262a20"} Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.287634 4707 scope.go:117] "RemoveContainer" containerID="ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.287826 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.293127 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvbkl\" (UniqueName: \"kubernetes.io/projected/79a91989-c584-44cc-bb31-494f3d1a7a7d-kube-api-access-zvbkl\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.293165 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.293175 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.293183 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.293193 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79a91989-c584-44cc-bb31-494f3d1a7a7d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.293201 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a91989-c584-44cc-bb31-494f3d1a7a7d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.310074 4707 scope.go:117] "RemoveContainer" containerID="ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.321628 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.321673 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.329823 4707 scope.go:117] "RemoveContainer" 
containerID="ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8" Jan 21 15:56:49 crc kubenswrapper[4707]: E0121 15:56:49.330196 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8\": container with ID starting with ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8 not found: ID does not exist" containerID="ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.330292 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8"} err="failed to get container status \"ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8\": rpc error: code = NotFound desc = could not find container \"ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8\": container with ID starting with ef775b044ae05d2353f51c991b92ceb6783b5b28771e4bdce238043bb8b9dac8 not found: ID does not exist" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.330363 4707 scope.go:117] "RemoveContainer" containerID="ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39" Jan 21 15:56:49 crc kubenswrapper[4707]: E0121 15:56:49.330593 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39\": container with ID starting with ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39 not found: ID does not exist" containerID="ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.330668 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39"} err="failed to get container status \"ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39\": rpc error: code = NotFound desc = could not find container \"ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39\": container with ID starting with ae3f9bff5c92dd763c17fc014149edb5a90b1ad0fa8137b2bdb2c824d522ad39 not found: ID does not exist" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.715981 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7tkx\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-kube-api-access-l7tkx\") pod \"d581f0c3-1972-43b1-9b92-dfca48129402\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-lock\") pod \"d581f0c3-1972-43b1-9b92-dfca48129402\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-cache\") pod \"d581f0c3-1972-43b1-9b92-dfca48129402\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-etc-swift\") pod \"d581f0c3-1972-43b1-9b92-dfca48129402\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d581f0c3-1972-43b1-9b92-dfca48129402\" (UID: \"d581f0c3-1972-43b1-9b92-dfca48129402\") " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-lock" (OuterVolumeSpecName: "lock") pod "d581f0c3-1972-43b1-9b92-dfca48129402" (UID: "d581f0c3-1972-43b1-9b92-dfca48129402"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.799963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-cache" (OuterVolumeSpecName: "cache") pod "d581f0c3-1972-43b1-9b92-dfca48129402" (UID: "d581f0c3-1972-43b1-9b92-dfca48129402"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.800059 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-lock\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.802583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-kube-api-access-l7tkx" (OuterVolumeSpecName: "kube-api-access-l7tkx") pod "d581f0c3-1972-43b1-9b92-dfca48129402" (UID: "d581f0c3-1972-43b1-9b92-dfca48129402"). InnerVolumeSpecName "kube-api-access-l7tkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.802594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "swift") pod "d581f0c3-1972-43b1-9b92-dfca48129402" (UID: "d581f0c3-1972-43b1-9b92-dfca48129402"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.802759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d581f0c3-1972-43b1-9b92-dfca48129402" (UID: "d581f0c3-1972-43b1-9b92-dfca48129402"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.901799 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7tkx\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-kube-api-access-l7tkx\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.901840 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d581f0c3-1972-43b1-9b92-dfca48129402-cache\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.901849 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d581f0c3-1972-43b1-9b92-dfca48129402-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.901874 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 15:56:49 crc kubenswrapper[4707]: I0121 15:56:49.918004 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.002718 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.062407 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5b4488769b-2ldzb_dd8127e7-3a43-454e-a3c9-190cf9eab3d2/neutron-api/0.log" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.062489 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw5dr\" (UniqueName: \"kubernetes.io/projected/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-kube-api-access-vw5dr\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.103798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle\") pod \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\" (UID: \"dd8127e7-3a43-454e-a3c9-190cf9eab3d2\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.106374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-kube-api-access-vw5dr" (OuterVolumeSpecName: "kube-api-access-vw5dr") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "kube-api-access-vw5dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.106427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.129352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.129791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.130002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config" (OuterVolumeSpecName: "config") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.130596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.141187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "dd8127e7-3a43-454e-a3c9-190cf9eab3d2" (UID: "dd8127e7-3a43-454e-a3c9-190cf9eab3d2"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.205997 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.206025 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.206037 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw5dr\" (UniqueName: \"kubernetes.io/projected/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-kube-api-access-vw5dr\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.206045 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.206054 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.206063 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.206073 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8127e7-3a43-454e-a3c9-190cf9eab3d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.299116 4707 generic.go:334] "Generic (PLEG): container finished" podID="d581f0c3-1972-43b1-9b92-dfca48129402" containerID="2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f" exitCode=137 Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.299173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f"} Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.299199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d581f0c3-1972-43b1-9b92-dfca48129402","Type":"ContainerDied","Data":"985d42d961e1bb057efe961cf872b5cd441aeef5a46fee354795a3ca8a954fe2"} Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.299217 4707 scope.go:117] "RemoveContainer" containerID="2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.299196 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.300699 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5b4488769b-2ldzb_dd8127e7-3a43-454e-a3c9-190cf9eab3d2/neutron-api/0.log" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.300732 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerID="48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb" exitCode=137 Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.300782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" event={"ID":"dd8127e7-3a43-454e-a3c9-190cf9eab3d2","Type":"ContainerDied","Data":"48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb"} Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.300799 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" event={"ID":"dd8127e7-3a43-454e-a3c9-190cf9eab3d2","Type":"ContainerDied","Data":"8cf872ec8a96d3164abdd36d413b0fbfbbdf8c240a2af32dced07696d24c22c8"} Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.300784 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5b4488769b-2ldzb" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.314470 4707 scope.go:117] "RemoveContainer" containerID="3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.325884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5b4488769b-2ldzb"] Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.331799 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5b4488769b-2ldzb"] Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.336493 4707 scope.go:117] "RemoveContainer" containerID="5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.338986 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.344059 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.350189 4707 scope.go:117] "RemoveContainer" containerID="78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.363984 4707 scope.go:117] "RemoveContainer" containerID="030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.376839 4707 scope.go:117] "RemoveContainer" containerID="e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.389937 4707 scope.go:117] "RemoveContainer" containerID="ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.403479 4707 scope.go:117] "RemoveContainer" containerID="705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.420330 4707 scope.go:117] "RemoveContainer" containerID="0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 
15:56:50.433790 4707 scope.go:117] "RemoveContainer" containerID="0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.448496 4707 scope.go:117] "RemoveContainer" containerID="52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.462239 4707 scope.go:117] "RemoveContainer" containerID="1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.475794 4707 scope.go:117] "RemoveContainer" containerID="0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.488131 4707 scope.go:117] "RemoveContainer" containerID="e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.499358 4707 scope.go:117] "RemoveContainer" containerID="8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.511612 4707 scope.go:117] "RemoveContainer" containerID="2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.511839 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f\": container with ID starting with 2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f not found: ID does not exist" containerID="2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.511871 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f"} err="failed to get container status \"2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f\": rpc error: code = NotFound desc = could not find container \"2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f\": container with ID starting with 2d5f6f6c3ff76bbe41aa2b33dbe8ae7a85e6778054794949eb3b7d272fef604f not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.511890 4707 scope.go:117] "RemoveContainer" containerID="3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.512199 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc\": container with ID starting with 3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc not found: ID does not exist" containerID="3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.512224 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc"} err="failed to get container status \"3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc\": rpc error: code = NotFound desc = could not find container \"3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc\": container with ID starting with 3b9a623b8f8506110c3304d696286f2101e261d93e598c33d80a8722a78f6efc not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 
15:56:50.512238 4707 scope.go:117] "RemoveContainer" containerID="5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.512524 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241\": container with ID starting with 5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241 not found: ID does not exist" containerID="5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.512554 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241"} err="failed to get container status \"5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241\": rpc error: code = NotFound desc = could not find container \"5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241\": container with ID starting with 5434b3482d42faed49f62b1840e2c45caeecc6b6e3d31efe59466a09c773b241 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.512572 4707 scope.go:117] "RemoveContainer" containerID="78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.512836 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f\": container with ID starting with 78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f not found: ID does not exist" containerID="78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.512858 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f"} err="failed to get container status \"78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f\": rpc error: code = NotFound desc = could not find container \"78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f\": container with ID starting with 78302f23519e11ec0e06a44fa59c4c5693c7bc900f5e91ef6d21a31e4323c62f not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.512870 4707 scope.go:117] "RemoveContainer" containerID="030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.513073 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f\": container with ID starting with 030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f not found: ID does not exist" containerID="030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.513097 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f"} err="failed to get container status \"030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f\": rpc error: code = NotFound desc = could not find container \"030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f\": container with ID 
starting with 030274958e1de45f5a73b3b354173ff6015516dd3424b3351f15902188b5ac3f not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.513110 4707 scope.go:117] "RemoveContainer" containerID="e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.513965 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff\": container with ID starting with e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff not found: ID does not exist" containerID="e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.513998 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff"} err="failed to get container status \"e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff\": rpc error: code = NotFound desc = could not find container \"e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff\": container with ID starting with e0438faad05977b494b3fbbb24f0bbe85665d41d2e3dafa533e568075292deff not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514015 4707 scope.go:117] "RemoveContainer" containerID="ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.514229 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d\": container with ID starting with ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d not found: ID does not exist" containerID="ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514255 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d"} err="failed to get container status \"ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d\": rpc error: code = NotFound desc = could not find container \"ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d\": container with ID starting with ea3b4c49f1efefd93c66bab629f04e766ebdfc6c25588e3a6d3fbe322bf7d87d not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514266 4707 scope.go:117] "RemoveContainer" containerID="705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.514514 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9\": container with ID starting with 705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9 not found: ID does not exist" containerID="705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514583 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9"} err="failed to get container status 
\"705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9\": rpc error: code = NotFound desc = could not find container \"705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9\": container with ID starting with 705dd9c5d6c8094d04b72ab44c2e5bf594bbe4fe9f1cf55356a22aefe3a8edb9 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514595 4707 scope.go:117] "RemoveContainer" containerID="0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.514849 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a\": container with ID starting with 0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a not found: ID does not exist" containerID="0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514882 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a"} err="failed to get container status \"0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a\": rpc error: code = NotFound desc = could not find container \"0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a\": container with ID starting with 0f52a50b1deb712bb18ca03f63cd3272cc40896d171441f0778a3f271023428a not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.514899 4707 scope.go:117] "RemoveContainer" containerID="0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.515098 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8\": container with ID starting with 0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8 not found: ID does not exist" containerID="0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.515123 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8"} err="failed to get container status \"0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8\": rpc error: code = NotFound desc = could not find container \"0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8\": container with ID starting with 0130bfbfba6641a7be0bc244e0641cbcc5440f3624996cd18ab15ae63cd3f2b8 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.515135 4707 scope.go:117] "RemoveContainer" containerID="52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.515389 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6\": container with ID starting with 52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6 not found: ID does not exist" containerID="52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.515433 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6"} err="failed to get container status \"52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6\": rpc error: code = NotFound desc = could not find container \"52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6\": container with ID starting with 52eb53abac2d7e643af4a33dc8a193343ebbb5e313a0e055ca82460c02d734b6 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.515449 4707 scope.go:117] "RemoveContainer" containerID="1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.516253 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669\": container with ID starting with 1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669 not found: ID does not exist" containerID="1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.516284 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669"} err="failed to get container status \"1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669\": rpc error: code = NotFound desc = could not find container \"1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669\": container with ID starting with 1c4618f9b89e3a75147882696f48dd8bc49f6a7c7ec4a2ade0b50e56f9ece669 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.516297 4707 scope.go:117] "RemoveContainer" containerID="0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.516512 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb\": container with ID starting with 0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb not found: ID does not exist" containerID="0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.516534 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb"} err="failed to get container status \"0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb\": rpc error: code = NotFound desc = could not find container \"0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb\": container with ID starting with 0b8366ca5787cb5a348704c019ad48445ee21ea30866f3f9b629847000d25deb not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.516547 4707 scope.go:117] "RemoveContainer" containerID="e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.516749 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086\": container with ID starting with e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086 not found: ID does not exist" 
containerID="e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.516767 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086"} err="failed to get container status \"e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086\": rpc error: code = NotFound desc = could not find container \"e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086\": container with ID starting with e686bb789be074e27c29f170e932447e154e8a9101874632e86d5eb303226086 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.516778 4707 scope.go:117] "RemoveContainer" containerID="8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.517002 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6\": container with ID starting with 8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6 not found: ID does not exist" containerID="8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.517032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6"} err="failed to get container status \"8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6\": rpc error: code = NotFound desc = could not find container \"8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6\": container with ID starting with 8529e284b51a01a86b36c1000eb093582f30585283055d41097e03c5366f39f6 not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.517044 4707 scope.go:117] "RemoveContainer" containerID="e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.531968 4707 scope.go:117] "RemoveContainer" containerID="48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.546241 4707 scope.go:117] "RemoveContainer" containerID="e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.546518 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a\": container with ID starting with e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a not found: ID does not exist" containerID="e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.546556 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a"} err="failed to get container status \"e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a\": rpc error: code = NotFound desc = could not find container \"e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a\": container with ID starting with e6cfc6e623bfd26821f6a673f81a897bdb26bec0cde86c57c63868cc1f9d379a not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 
15:56:50.546575 4707 scope.go:117] "RemoveContainer" containerID="48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb" Jan 21 15:56:50 crc kubenswrapper[4707]: E0121 15:56:50.546892 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb\": container with ID starting with 48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb not found: ID does not exist" containerID="48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.546915 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb"} err="failed to get container status \"48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb\": rpc error: code = NotFound desc = could not find container \"48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb\": container with ID starting with 48f9277614460cac7b9b0fa1013357994bdc562de6047137499f7b151bc82ceb not found: ID does not exist" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.755131 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.814315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-run-httpd\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.814624 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.814676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-scripts\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.814708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-sg-core-conf-yaml\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-log-httpd\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-ceilometer-tls-certs\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-combined-ca-bundle\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-config-data\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j925m\" (UniqueName: \"kubernetes.io/projected/2c412020-b207-4f3b-abe0-160022b70cc8-kube-api-access-j925m\") pod \"2c412020-b207-4f3b-abe0-160022b70cc8\" (UID: \"2c412020-b207-4f3b-abe0-160022b70cc8\") " Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815802 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.815839 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c412020-b207-4f3b-abe0-160022b70cc8-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.817299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-scripts" (OuterVolumeSpecName: "scripts") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.817386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c412020-b207-4f3b-abe0-160022b70cc8-kube-api-access-j925m" (OuterVolumeSpecName: "kube-api-access-j925m") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "kube-api-access-j925m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.828704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.839854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.852533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.864017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-config-data" (OuterVolumeSpecName: "config-data") pod "2c412020-b207-4f3b-abe0-160022b70cc8" (UID: "2c412020-b207-4f3b-abe0-160022b70cc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.917358 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.917391 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.917404 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.917414 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.917423 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c412020-b207-4f3b-abe0-160022b70cc8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:50 crc kubenswrapper[4707]: I0121 15:56:50.917431 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j925m\" (UniqueName: \"kubernetes.io/projected/2c412020-b207-4f3b-abe0-160022b70cc8-kube-api-access-j925m\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.193157 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" path="/var/lib/kubelet/pods/79a91989-c584-44cc-bb31-494f3d1a7a7d/volumes" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.193848 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" path="/var/lib/kubelet/pods/d581f0c3-1972-43b1-9b92-dfca48129402/volumes" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.195462 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" path="/var/lib/kubelet/pods/dd8127e7-3a43-454e-a3c9-190cf9eab3d2/volumes" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.311601 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c412020-b207-4f3b-abe0-160022b70cc8" containerID="bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17" exitCode=137 Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.311639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerDied","Data":"bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17"} Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.311651 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.311712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2c412020-b207-4f3b-abe0-160022b70cc8","Type":"ContainerDied","Data":"f91fdb63d7d782c68864edc2873407e45c5363c0fd8bc3425e38421279ce630e"} Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.311734 4707 scope.go:117] "RemoveContainer" containerID="1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.327505 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.327776 4707 scope.go:117] "RemoveContainer" containerID="e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.330512 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.341605 4707 scope.go:117] "RemoveContainer" containerID="bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.353900 4707 scope.go:117] "RemoveContainer" containerID="7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.366074 4707 scope.go:117] "RemoveContainer" containerID="1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6" Jan 21 15:56:51 crc kubenswrapper[4707]: E0121 15:56:51.366396 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6\": container with ID starting with 1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6 not found: ID does not exist" containerID="1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.366429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6"} err="failed to get container status \"1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6\": rpc error: code = NotFound desc = could not find container \"1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6\": container with ID starting with 1c7aa2d49c56b9554ce61cab73ba31af029d03eaa2e2a5ea0b23a7e0d4e2f4a6 not found: ID does not exist" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.366452 4707 scope.go:117] "RemoveContainer" containerID="e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159" Jan 21 15:56:51 crc kubenswrapper[4707]: E0121 15:56:51.366729 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159\": container with ID starting with e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159 not found: ID does not exist" containerID="e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.366756 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159"} err="failed to get container 
status \"e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159\": rpc error: code = NotFound desc = could not find container \"e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159\": container with ID starting with e9c64d3296d3db2cfefeb7c792f924f2817f04478479a82dbb2dc64858da7159 not found: ID does not exist" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.366772 4707 scope.go:117] "RemoveContainer" containerID="bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17" Jan 21 15:56:51 crc kubenswrapper[4707]: E0121 15:56:51.367028 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17\": container with ID starting with bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17 not found: ID does not exist" containerID="bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.367051 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17"} err="failed to get container status \"bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17\": rpc error: code = NotFound desc = could not find container \"bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17\": container with ID starting with bb51cd6338e4c885c0bce9e19e8aa0663fe44a267883be02f5df789ac8e7fa17 not found: ID does not exist" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.367064 4707 scope.go:117] "RemoveContainer" containerID="7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039" Jan 21 15:56:51 crc kubenswrapper[4707]: E0121 15:56:51.367235 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039\": container with ID starting with 7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039 not found: ID does not exist" containerID="7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039" Jan 21 15:56:51 crc kubenswrapper[4707]: I0121 15:56:51.367255 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039"} err="failed to get container status \"7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039\": rpc error: code = NotFound desc = could not find container \"7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039\": container with ID starting with 7c79a05813554606649e5ea1396d75b14c1d4bcaf3a3cab0516c5dfb8cc76039 not found: ID does not exist" Jan 21 15:56:52 crc kubenswrapper[4707]: E0121 15:56:52.987188 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:52 crc kubenswrapper[4707]: E0121 15:56:52.988264 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:52 crc kubenswrapper[4707]: E0121 15:56:52.989306 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:52 crc kubenswrapper[4707]: E0121 15:56:52.989362 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:56:53 crc kubenswrapper[4707]: I0121 15:56:53.188933 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" path="/var/lib/kubelet/pods/2c412020-b207-4f3b-abe0-160022b70cc8/volumes" Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.949620 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.950739 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.951758 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.951792 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.963262 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.964307 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.965321 4707 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:53 crc kubenswrapper[4707]: E0121 15:56:53.965360 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:56:54 crc kubenswrapper[4707]: E0121 15:56:54.059648 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:54 crc kubenswrapper[4707]: E0121 15:56:54.059673 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 15:56:54 crc kubenswrapper[4707]: E0121 15:56:54.059713 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle podName:e22a9822-7aa7-4a38-916a-c3fc9fdb0895 nodeName:}" failed. No retries permitted until 2026-01-21 15:57:26.059696836 +0000 UTC m=+3343.241213059 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895") : secret "combined-ca-bundle" not found Jan 21 15:56:54 crc kubenswrapper[4707]: E0121 15:56:54.059729 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle podName:03adcc99-dff8-4703-bacf-25c6576428f9 nodeName:}" failed. No retries permitted until 2026-01-21 15:57:26.059722434 +0000 UTC m=+3343.241238656 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle") pod "nova-cell0-conductor-0" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9") : secret "combined-ca-bundle" not found Jan 21 15:56:57 crc kubenswrapper[4707]: E0121 15:56:57.987468 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:57 crc kubenswrapper[4707]: E0121 15:56:57.988788 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:57 crc kubenswrapper[4707]: E0121 15:56:57.989862 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:56:57 crc kubenswrapper[4707]: E0121 15:56:57.989897 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.442687 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rqrbk"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.442955 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.442972 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.442989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.442994 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443004 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443010 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443017 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443023 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-server" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443031 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443037 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-server" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerName="mysql-bootstrap" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443055 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerName="mysql-bootstrap" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443065 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443071 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443081 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443086 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443094 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="proxy-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="proxy-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443109 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-api" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="rabbitmq" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443127 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="rabbitmq" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443135 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="cinder-scheduler" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443140 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="cinder-scheduler" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443147 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443152 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443159 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443164 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443170 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443175 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerName="init" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443189 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerName="init" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443196 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="swift-recon-cron" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="swift-recon-cron" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443209 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerName="dnsmasq-dns" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443214 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerName="dnsmasq-dns" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443220 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443225 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443235 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443240 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-server" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443245 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443249 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443257 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="rabbitmq" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443262 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="rabbitmq" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443271 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" containerName="keystone-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443289 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" containerName="keystone-api" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a9521e-4024-4a1d-8b87-66d305c22eb4" containerName="memcached" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443302 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a9521e-4024-4a1d-8b87-66d305c22eb4" containerName="memcached" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-expirer" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443317 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-expirer" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443323 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="probe" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443329 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="probe" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443338 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-central-agent" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-central-agent" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443350 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443355 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443361 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443367 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443375 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="rsync" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443380 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="rsync" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443388 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443394 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443402 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443407 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443424 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="setup-container" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443429 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="setup-container" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="sg-core" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="sg-core" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443465 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-api" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443484 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443497 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443501 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-api" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443510 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-metadata" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443515 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-metadata" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443523 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerName="galera" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443528 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerName="galera" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="openstack-network-exporter" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443539 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="openstack-network-exporter" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-updater" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443553 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-updater" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443562 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443566 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-server" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443573 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="setup-container" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443579 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="setup-container" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="ovn-northd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="ovn-northd" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443597 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443602 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443611 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-notification-agent" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443616 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-notification-agent" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443621 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-reaper" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443626 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-reaper" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443634 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443651 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443662 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-updater" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443666 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-updater" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443674 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443679 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443686 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443691 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.443699 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443703 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443835 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443847 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443855 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="proxy-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443860 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="cinder-scheduler" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443867 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4d9381-62e0-4779-9664-b1f9a3ced4ba" containerName="dnsmasq-dns" Jan 21 15:56:58 crc 
kubenswrapper[4707]: I0121 15:56:58.443872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b59f3c9f-adf5-49f3-8f0b-62044f91902c" containerName="cinder-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443881 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6b0d99-fa7f-4fce-8c2d-6737562bdd85" containerName="keystone-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443887 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-expirer" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443893 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-central-agent" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443900 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443906 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-updater" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443911 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443917 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443923 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443929 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="ceilometer-notification-agent" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443935 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443942 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443949 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="openstack-network-exporter" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443955 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443962 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443968 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443975 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8127e7-3a43-454e-a3c9-190cf9eab3d2" containerName="neutron-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443982 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-reaper" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.443991 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec10a8f4-f5e6-4484-8c37-db290585f9b1" containerName="galera" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444000 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="243109f6-78d0-4750-a43f-21a7b606626d" containerName="placement-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444007 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4d63f6-935c-47e2-92c0-4e27286253ba" containerName="glance-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444014 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c4483b-abb1-4c04-9092-d8f04755bc93" containerName="barbican-keystone-listener" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444022 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444029 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444035 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444042 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444047 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="448bc08a-61e8-48bb-be55-256be3a355a2" containerName="glance-httpd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444055 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="swift-recon-cron" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a91989-c584-44cc-bb31-494f3d1a7a7d" containerName="probe" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444068 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a93993-fb38-4bec-9fa6-2ffa280ccecd" containerName="nova-metadata-metadata" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444073 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="account-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444078 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb17d2c-7e86-4451-b56e-4e89fc494542" containerName="rabbitmq" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444086 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="object-updater" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444093 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-replicator" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c397cca9-7943-4169-9984-c5ec464de8de" containerName="proxy-server" Jan 
21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444108 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a9521e-4024-4a1d-8b87-66d305c22eb4" containerName="memcached" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eaefcd6-8b64-47a3-8e3a-280d7130b63e" containerName="rabbitmq" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="871b3b08-cd0d-4fee-96ab-7b75cf72a0a7" containerName="barbican-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444127 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28add001-f139-47b8-9ebb-659ba39858cc" containerName="barbican-worker-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444134 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-auditor" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444140 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="rsync" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d581f0c3-1972-43b1-9b92-dfca48129402" containerName="container-server" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-log" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444160 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2604d874-1aab-419e-a67e-801734497292" containerName="nova-api-api" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444166 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d4bcc8-6ad8-4a0b-9b73-3725fcf860cd" containerName="ovn-northd" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.444173 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c412020-b207-4f3b-abe0-160022b70cc8" containerName="sg-core" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.445081 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.449385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqrbk"] Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.516209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-catalog-content\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.516415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp22k\" (UniqueName: \"kubernetes.io/projected/7a915999-9f3b-470f-ab00-25bb814401e1-kube-api-access-jp22k\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.516470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-utilities\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.612580 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-fsnlz"] Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.616781 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-fsnlz"] Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.617524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-utilities\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.617606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-catalog-content\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.617713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp22k\" (UniqueName: \"kubernetes.io/projected/7a915999-9f3b-470f-ab00-25bb814401e1-kube-api-access-jp22k\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.617968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-utilities\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.618035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-catalog-content\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.635015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp22k\" (UniqueName: \"kubernetes.io/projected/7a915999-9f3b-470f-ab00-25bb814401e1-kube-api-access-jp22k\") pod \"community-operators-rqrbk\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.738321 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-zw66m"] Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.739156 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.740763 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.741024 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.741053 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.742164 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.748404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zw66m"] Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.763973 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.820188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsrk\" (UniqueName: \"kubernetes.io/projected/b1769052-bbf8-4a8c-8658-eaf13a8ce092-kube-api-access-xfsrk\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.820298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b1769052-bbf8-4a8c-8658-eaf13a8ce092-crc-storage\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.820318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b1769052-bbf8-4a8c-8658-eaf13a8ce092-node-mnt\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.921522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b1769052-bbf8-4a8c-8658-eaf13a8ce092-crc-storage\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.921723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b1769052-bbf8-4a8c-8658-eaf13a8ce092-node-mnt\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.921870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsrk\" (UniqueName: \"kubernetes.io/projected/b1769052-bbf8-4a8c-8658-eaf13a8ce092-kube-api-access-xfsrk\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.922794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b1769052-bbf8-4a8c-8658-eaf13a8ce092-crc-storage\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.922952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b1769052-bbf8-4a8c-8658-eaf13a8ce092-node-mnt\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: I0121 15:56:58.952600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsrk\" (UniqueName: \"kubernetes.io/projected/b1769052-bbf8-4a8c-8658-eaf13a8ce092-kube-api-access-xfsrk\") pod \"crc-storage-crc-zw66m\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 
15:56:58.952603 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.957935 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.960161 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.960202 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.968955 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.980578 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.990087 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:56:58 crc kubenswrapper[4707]: E0121 15:56:58.990141 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.052251 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.107799 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.189253 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e09300-dc96-4d0c-a1ec-9d93064430dc" path="/var/lib/kubelet/pods/78e09300-dc96-4d0c-a1ec-9d93064430dc/volumes" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.225449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsjnw\" (UniqueName: \"kubernetes.io/projected/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-kube-api-access-gsjnw\") pod \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.225628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-config-data\") pod \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.225681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-combined-ca-bundle\") pod \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\" (UID: \"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.229541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-kube-api-access-gsjnw" (OuterVolumeSpecName: "kube-api-access-gsjnw") pod "aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" (UID: "aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2"). InnerVolumeSpecName "kube-api-access-gsjnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.243062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-config-data" (OuterVolumeSpecName: "config-data") pod "aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" (UID: "aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.244049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" (UID: "aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.271803 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqrbk"] Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.327801 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsjnw\" (UniqueName: \"kubernetes.io/projected/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-kube-api-access-gsjnw\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.327852 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.327863 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.366208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerStarted","Data":"73eb68bb31db381dba8b78b5aab784a5b99d70a1c62d6cdf2f4704d86ea55aa3"} Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.367440 4707 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" exitCode=137 Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.367468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2","Type":"ContainerDied","Data":"031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df"} Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.367482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2","Type":"ContainerDied","Data":"898650ceaf8f3d14295607e34f5a0c5fc5e993dec0e9cc8f02829b77da61c362"} Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.367498 4707 scope.go:117] "RemoveContainer" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.367574 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.382678 4707 scope.go:117] "RemoveContainer" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" Jan 21 15:56:59 crc kubenswrapper[4707]: E0121 15:56:59.382967 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df\": container with ID starting with 031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df not found: ID does not exist" containerID="031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.382995 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df"} err="failed to get container status \"031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df\": rpc error: code = NotFound desc = could not find container \"031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df\": container with ID starting with 031a3a7f40682e6927691e6abaeb94962d1ed730a75f1a6d2d67c726062f57df not found: ID does not exist" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.394265 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.398077 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.459290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zw66m"] Jan 21 15:56:59 crc kubenswrapper[4707]: W0121 15:56:59.459330 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1769052_bbf8_4a8c_8658_eaf13a8ce092.slice/crio-0ea0131a24b54e1baa1b679d63e1ce0214d68d7adfd7aa19708122c5d7735084 WatchSource:0}: Error finding container 0ea0131a24b54e1baa1b679d63e1ce0214d68d7adfd7aa19708122c5d7735084: Status 404 returned error can't find the container with id 0ea0131a24b54e1baa1b679d63e1ce0214d68d7adfd7aa19708122c5d7735084 Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.841324 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.846992 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.934675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-config-data\") pod \"03adcc99-dff8-4703-bacf-25c6576428f9\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.934722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle\") pod \"03adcc99-dff8-4703-bacf-25c6576428f9\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.934747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle\") pod \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.934779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvz8v\" (UniqueName: \"kubernetes.io/projected/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-kube-api-access-wvz8v\") pod \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.934803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-config-data\") pod \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\" (UID: \"e22a9822-7aa7-4a38-916a-c3fc9fdb0895\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.934852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt4xm\" (UniqueName: \"kubernetes.io/projected/03adcc99-dff8-4703-bacf-25c6576428f9-kube-api-access-nt4xm\") pod \"03adcc99-dff8-4703-bacf-25c6576428f9\" (UID: \"03adcc99-dff8-4703-bacf-25c6576428f9\") " Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.939588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03adcc99-dff8-4703-bacf-25c6576428f9-kube-api-access-nt4xm" (OuterVolumeSpecName: "kube-api-access-nt4xm") pod "03adcc99-dff8-4703-bacf-25c6576428f9" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9"). InnerVolumeSpecName "kube-api-access-nt4xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.939631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-kube-api-access-wvz8v" (OuterVolumeSpecName: "kube-api-access-wvz8v") pod "e22a9822-7aa7-4a38-916a-c3fc9fdb0895" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895"). InnerVolumeSpecName "kube-api-access-wvz8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.951073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-config-data" (OuterVolumeSpecName: "config-data") pod "e22a9822-7aa7-4a38-916a-c3fc9fdb0895" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.952067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e22a9822-7aa7-4a38-916a-c3fc9fdb0895" (UID: "e22a9822-7aa7-4a38-916a-c3fc9fdb0895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.952146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-config-data" (OuterVolumeSpecName: "config-data") pod "03adcc99-dff8-4703-bacf-25c6576428f9" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:59 crc kubenswrapper[4707]: I0121 15:56:59.952621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03adcc99-dff8-4703-bacf-25c6576428f9" (UID: "03adcc99-dff8-4703-bacf-25c6576428f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.037012 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.037041 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adcc99-dff8-4703-bacf-25c6576428f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.037052 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.037061 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvz8v\" (UniqueName: \"kubernetes.io/projected/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-kube-api-access-wvz8v\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.037071 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a9822-7aa7-4a38-916a-c3fc9fdb0895-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.037078 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt4xm\" (UniqueName: \"kubernetes.io/projected/03adcc99-dff8-4703-bacf-25c6576428f9-kube-api-access-nt4xm\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.375623 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a915999-9f3b-470f-ab00-25bb814401e1" containerID="88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403" exitCode=0 Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.375778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" 
event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerDied","Data":"88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.377960 4707 generic.go:334] "Generic (PLEG): container finished" podID="b1769052-bbf8-4a8c-8658-eaf13a8ce092" containerID="32fad64bc9d7b12c80fb73bcc5468c75aa88ca61108101349352f58ea1f38d4c" exitCode=0 Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.378044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zw66m" event={"ID":"b1769052-bbf8-4a8c-8658-eaf13a8ce092","Type":"ContainerDied","Data":"32fad64bc9d7b12c80fb73bcc5468c75aa88ca61108101349352f58ea1f38d4c"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.378089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zw66m" event={"ID":"b1769052-bbf8-4a8c-8658-eaf13a8ce092","Type":"ContainerStarted","Data":"0ea0131a24b54e1baa1b679d63e1ce0214d68d7adfd7aa19708122c5d7735084"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.380316 4707 generic.go:334] "Generic (PLEG): container finished" podID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" exitCode=137 Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.380362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerDied","Data":"21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.380368 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.380382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e22a9822-7aa7-4a38-916a-c3fc9fdb0895","Type":"ContainerDied","Data":"a64ac7833066130e5e138ec3361f259f17f723cd49acdba0e6a895ea068c41ab"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.380398 4707 scope.go:117] "RemoveContainer" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.382171 4707 generic.go:334] "Generic (PLEG): container finished" podID="03adcc99-dff8-4703-bacf-25c6576428f9" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" exitCode=137 Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.382202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerDied","Data":"bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.382221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"03adcc99-dff8-4703-bacf-25c6576428f9","Type":"ContainerDied","Data":"48264db1c0390508e51bad8d608c82fa44e5dc5b6fe6dad192148f5eaaf31fca"} Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.382264 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.424307 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.428441 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.429867 4707 scope.go:117] "RemoveContainer" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.431545 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.435049 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.461863 4707 scope.go:117] "RemoveContainer" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" Jan 21 15:57:00 crc kubenswrapper[4707]: E0121 15:57:00.462143 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749\": container with ID starting with 21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749 not found: ID does not exist" containerID="21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.462173 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749"} err="failed to get container status \"21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749\": rpc error: code = NotFound desc = could not find container \"21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749\": container with ID starting with 21c856fcffed0fb2b86889be97beee7f2cf77edf948adee560ade97eb3513749 not found: ID does not exist" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.462194 4707 scope.go:117] "RemoveContainer" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" Jan 21 15:57:00 crc kubenswrapper[4707]: E0121 15:57:00.462573 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618\": container with ID starting with 30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618 not found: ID does not exist" containerID="30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.462603 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618"} err="failed to get container status \"30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618\": rpc error: code = NotFound desc = could not find container \"30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618\": container with ID starting with 30676c9c19f6fa8852e95c08245713de4f4e1fa7cbf2ec9d3184824750ee4618 not found: ID does not exist" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.462623 4707 scope.go:117] "RemoveContainer" 
containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.475019 4707 scope.go:117] "RemoveContainer" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.497749 4707 scope.go:117] "RemoveContainer" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" Jan 21 15:57:00 crc kubenswrapper[4707]: E0121 15:57:00.498090 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70\": container with ID starting with bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70 not found: ID does not exist" containerID="bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.498117 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70"} err="failed to get container status \"bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70\": rpc error: code = NotFound desc = could not find container \"bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70\": container with ID starting with bf7c005f3b3b40a3a4c7fc7fb3bf1938368e5190db51811f7e099d3d7fc9ba70 not found: ID does not exist" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.498134 4707 scope.go:117] "RemoveContainer" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" Jan 21 15:57:00 crc kubenswrapper[4707]: E0121 15:57:00.498394 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b\": container with ID starting with 0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b not found: ID does not exist" containerID="0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b" Jan 21 15:57:00 crc kubenswrapper[4707]: I0121 15:57:00.498425 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b"} err="failed to get container status \"0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b\": rpc error: code = NotFound desc = could not find container \"0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b\": container with ID starting with 0b5925feabdcb52c8e66e5b90381595c6cb8f0848ab2597c8094e26f36376a2b not found: ID does not exist" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.188915 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" path="/var/lib/kubelet/pods/03adcc99-dff8-4703-bacf-25c6576428f9/volumes" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.189669 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" path="/var/lib/kubelet/pods/aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2/volumes" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.190103 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" path="/var/lib/kubelet/pods/e22a9822-7aa7-4a38-916a-c3fc9fdb0895/volumes" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.390515 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerStarted","Data":"fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0"} Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.610401 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.655297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b1769052-bbf8-4a8c-8658-eaf13a8ce092-crc-storage\") pod \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.655353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b1769052-bbf8-4a8c-8658-eaf13a8ce092-node-mnt\") pod \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.655423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsrk\" (UniqueName: \"kubernetes.io/projected/b1769052-bbf8-4a8c-8658-eaf13a8ce092-kube-api-access-xfsrk\") pod \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\" (UID: \"b1769052-bbf8-4a8c-8658-eaf13a8ce092\") " Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.655505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1769052-bbf8-4a8c-8658-eaf13a8ce092-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b1769052-bbf8-4a8c-8658-eaf13a8ce092" (UID: "b1769052-bbf8-4a8c-8658-eaf13a8ce092"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.655764 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b1769052-bbf8-4a8c-8658-eaf13a8ce092-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.659216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1769052-bbf8-4a8c-8658-eaf13a8ce092-kube-api-access-xfsrk" (OuterVolumeSpecName: "kube-api-access-xfsrk") pod "b1769052-bbf8-4a8c-8658-eaf13a8ce092" (UID: "b1769052-bbf8-4a8c-8658-eaf13a8ce092"). InnerVolumeSpecName "kube-api-access-xfsrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.668601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1769052-bbf8-4a8c-8658-eaf13a8ce092-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b1769052-bbf8-4a8c-8658-eaf13a8ce092" (UID: "b1769052-bbf8-4a8c-8658-eaf13a8ce092"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.757562 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b1769052-bbf8-4a8c-8658-eaf13a8ce092-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:01 crc kubenswrapper[4707]: I0121 15:57:01.757747 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsrk\" (UniqueName: \"kubernetes.io/projected/b1769052-bbf8-4a8c-8658-eaf13a8ce092-kube-api-access-xfsrk\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:02 crc kubenswrapper[4707]: I0121 15:57:02.182125 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:57:02 crc kubenswrapper[4707]: E0121 15:57:02.182337 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:57:02 crc kubenswrapper[4707]: I0121 15:57:02.399025 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a915999-9f3b-470f-ab00-25bb814401e1" containerID="fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0" exitCode=0 Jan 21 15:57:02 crc kubenswrapper[4707]: I0121 15:57:02.399090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerDied","Data":"fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0"} Jan 21 15:57:02 crc kubenswrapper[4707]: I0121 15:57:02.400458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zw66m" event={"ID":"b1769052-bbf8-4a8c-8658-eaf13a8ce092","Type":"ContainerDied","Data":"0ea0131a24b54e1baa1b679d63e1ce0214d68d7adfd7aa19708122c5d7735084"} Jan 21 15:57:02 crc kubenswrapper[4707]: I0121 15:57:02.400494 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ea0131a24b54e1baa1b679d63e1ce0214d68d7adfd7aa19708122c5d7735084" Jan 21 15:57:02 crc kubenswrapper[4707]: I0121 15:57:02.400531 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zw66m" Jan 21 15:57:03 crc kubenswrapper[4707]: I0121 15:57:03.407489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerStarted","Data":"008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427"} Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.336994 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rqrbk" podStartSLOduration=3.869583175 podStartE2EDuration="6.336977792s" podCreationTimestamp="2026-01-21 15:56:58 +0000 UTC" firstStartedPulling="2026-01-21 15:57:00.376858088 +0000 UTC m=+3317.558374309" lastFinishedPulling="2026-01-21 15:57:02.844252714 +0000 UTC m=+3320.025768926" observedRunningTime="2026-01-21 15:57:03.419950574 +0000 UTC m=+3320.601466796" watchObservedRunningTime="2026-01-21 15:57:04.336977792 +0000 UTC m=+3321.518494014" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.338892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-zw66m"] Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.342654 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-zw66m"] Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460036 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-lvs4w"] Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460422 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460752 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460783 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1769052-bbf8-4a8c-8658-eaf13a8ce092" containerName="storage" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460791 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1769052-bbf8-4a8c-8658-eaf13a8ce092" containerName="storage" Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460839 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460882 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460889 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460904 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460921 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" 
containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460947 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: E0121 15:57:04.460957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.460967 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462159 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf8a2-ecb1-4af8-99df-6c630be5a2b2" containerName="nova-scheduler-scheduler" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462198 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462221 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462231 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462247 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adcc99-dff8-4703-bacf-25c6576428f9" containerName="nova-cell0-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462256 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.462281 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1769052-bbf8-4a8c-8658-eaf13a8ce092" containerName="storage" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.464045 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.468494 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.468843 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.469300 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.469565 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.470266 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lvs4w"] Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.491590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hdv7\" (UniqueName: \"kubernetes.io/projected/c287a9f3-dfaf-46b1-a73b-b1d66c346648-kube-api-access-9hdv7\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.491906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c287a9f3-dfaf-46b1-a73b-b1d66c346648-crc-storage\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.491959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c287a9f3-dfaf-46b1-a73b-b1d66c346648-node-mnt\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.593243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hdv7\" (UniqueName: \"kubernetes.io/projected/c287a9f3-dfaf-46b1-a73b-b1d66c346648-kube-api-access-9hdv7\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.593296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c287a9f3-dfaf-46b1-a73b-b1d66c346648-crc-storage\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.593324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c287a9f3-dfaf-46b1-a73b-b1d66c346648-node-mnt\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.593593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c287a9f3-dfaf-46b1-a73b-b1d66c346648-node-mnt\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " 
pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.594204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c287a9f3-dfaf-46b1-a73b-b1d66c346648-crc-storage\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.607689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hdv7\" (UniqueName: \"kubernetes.io/projected/c287a9f3-dfaf-46b1-a73b-b1d66c346648-kube-api-access-9hdv7\") pod \"crc-storage-crc-lvs4w\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:04 crc kubenswrapper[4707]: I0121 15:57:04.780303 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:05 crc kubenswrapper[4707]: I0121 15:57:05.189019 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1769052-bbf8-4a8c-8658-eaf13a8ce092" path="/var/lib/kubelet/pods/b1769052-bbf8-4a8c-8658-eaf13a8ce092/volumes" Jan 21 15:57:05 crc kubenswrapper[4707]: I0121 15:57:05.189787 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lvs4w"] Jan 21 15:57:05 crc kubenswrapper[4707]: I0121 15:57:05.422231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lvs4w" event={"ID":"c287a9f3-dfaf-46b1-a73b-b1d66c346648","Type":"ContainerStarted","Data":"9bc0ca8b85fd65edca1fea3ec17d181ede6c02564202479f5705ee4fbebd3fa7"} Jan 21 15:57:06 crc kubenswrapper[4707]: I0121 15:57:06.430019 4707 generic.go:334] "Generic (PLEG): container finished" podID="c287a9f3-dfaf-46b1-a73b-b1d66c346648" containerID="42e1a7b8fbb1c0bf8cd31040a2431ce633ee959814881423be4fe99ae7d68843" exitCode=0 Jan 21 15:57:06 crc kubenswrapper[4707]: I0121 15:57:06.430072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lvs4w" event={"ID":"c287a9f3-dfaf-46b1-a73b-b1d66c346648","Type":"ContainerDied","Data":"42e1a7b8fbb1c0bf8cd31040a2431ce633ee959814881423be4fe99ae7d68843"} Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.680552 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.733843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hdv7\" (UniqueName: \"kubernetes.io/projected/c287a9f3-dfaf-46b1-a73b-b1d66c346648-kube-api-access-9hdv7\") pod \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.733881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c287a9f3-dfaf-46b1-a73b-b1d66c346648-node-mnt\") pod \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.733907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c287a9f3-dfaf-46b1-a73b-b1d66c346648-crc-storage\") pod \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\" (UID: \"c287a9f3-dfaf-46b1-a73b-b1d66c346648\") " Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.734006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c287a9f3-dfaf-46b1-a73b-b1d66c346648-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c287a9f3-dfaf-46b1-a73b-b1d66c346648" (UID: "c287a9f3-dfaf-46b1-a73b-b1d66c346648"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.734205 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c287a9f3-dfaf-46b1-a73b-b1d66c346648-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.738144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c287a9f3-dfaf-46b1-a73b-b1d66c346648-kube-api-access-9hdv7" (OuterVolumeSpecName: "kube-api-access-9hdv7") pod "c287a9f3-dfaf-46b1-a73b-b1d66c346648" (UID: "c287a9f3-dfaf-46b1-a73b-b1d66c346648"). InnerVolumeSpecName "kube-api-access-9hdv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.748852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c287a9f3-dfaf-46b1-a73b-b1d66c346648-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c287a9f3-dfaf-46b1-a73b-b1d66c346648" (UID: "c287a9f3-dfaf-46b1-a73b-b1d66c346648"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.835847 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hdv7\" (UniqueName: \"kubernetes.io/projected/c287a9f3-dfaf-46b1-a73b-b1d66c346648-kube-api-access-9hdv7\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:07 crc kubenswrapper[4707]: I0121 15:57:07.835874 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c287a9f3-dfaf-46b1-a73b-b1d66c346648-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.203669 4707 scope.go:117] "RemoveContainer" containerID="14f2c43916f1c0eb479f1b960a8fbf2446166014a7f757766f24c9135adbac15" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.223601 4707 scope.go:117] "RemoveContainer" containerID="e71378c878762f4533c4855c6ebb5b163b8d3bb51fcbd291452e5c8d6f498513" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.242118 4707 scope.go:117] "RemoveContainer" containerID="5cffaed2737dd9a63cbf504e7def30c36c381a1b91168e8042ac65b329773df6" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.275909 4707 scope.go:117] "RemoveContainer" containerID="227fa53f6dbdedb9034190a4104e155e6b956798c322115d49360fef6dc181f3" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.290716 4707 scope.go:117] "RemoveContainer" containerID="ea797f16aa01ac7b3ee4f0754c5c7ce729ef2b755429c228fa55a073bdf52450" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.304025 4707 scope.go:117] "RemoveContainer" containerID="8e67c2ffb0682c00c6f22286ee55c8e92e062463d0d967c5528351a1e34c7ada" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.319147 4707 scope.go:117] "RemoveContainer" containerID="c230b11acb9618eb5c88ca18c142e1ad367210447000f32c35a942db7cc5585f" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.335863 4707 scope.go:117] "RemoveContainer" containerID="47158a7390e25365b78677d77f11c3992cf6c8641b534d23dcc5098499a9c974" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.353411 4707 scope.go:117] "RemoveContainer" containerID="017b62d5c64e8c72010733bce258aabed46e954ed91e7af4e64b2cff4d2c6e42" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.373229 4707 scope.go:117] "RemoveContainer" containerID="dc33abbace9177890b6f8417f5c628a1d2b93884c9226ae48996c1a08c975c09" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.388510 4707 scope.go:117] "RemoveContainer" containerID="1768b7b2effe8e34bac8d411159997a5bafe09b860da85c42876134ed5257e4e" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.409142 4707 scope.go:117] "RemoveContainer" containerID="415f4336b9f102676f316ec6eeca0a530fd2e6b550c3557faee9b4a9087b5cfc" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.424493 4707 scope.go:117] "RemoveContainer" containerID="564facd7467241d001d1ede0891aa84310b54a6635db128f420242316c948ab5" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.452062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lvs4w" event={"ID":"c287a9f3-dfaf-46b1-a73b-b1d66c346648","Type":"ContainerDied","Data":"9bc0ca8b85fd65edca1fea3ec17d181ede6c02564202479f5705ee4fbebd3fa7"} Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.452106 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc0ca8b85fd65edca1fea3ec17d181ede6c02564202479f5705ee4fbebd3fa7" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.452077 4707 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lvs4w" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.459957 4707 scope.go:117] "RemoveContainer" containerID="02e8a2611ee7d85c7fd5a2ff060701f99988408f3267b95159cbe36a24e03ce3" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.498200 4707 scope.go:117] "RemoveContainer" containerID="e002ca41987b39d839d15e073a0294cd8fac52b5b32405ad039756d1762a6460" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.538557 4707 scope.go:117] "RemoveContainer" containerID="e7eeebd1988f725464039fcc2c92c42910a2ec711071f2574ef7fd86fdba055a" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.555529 4707 scope.go:117] "RemoveContainer" containerID="f34721b7e3114cc6a1e9ef72c684808875796bac51d867e8523d5d153e69cb52" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.584529 4707 scope.go:117] "RemoveContainer" containerID="d3e1d5dbdd13bf53fcf6b6196650b7cbe7095451b0e1b8e6af9f3c9730be9398" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.629096 4707 scope.go:117] "RemoveContainer" containerID="0d6f9b4edffb5fceeb25511e4ca129919e0cfdbf7d8f30f04c99b5c9e6761ba5" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.645662 4707 scope.go:117] "RemoveContainer" containerID="d7d890aefc9c07d04c6f11b91dc8272424a8c811d55a0e0d7a7880ee6dcb0bf9" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.676801 4707 scope.go:117] "RemoveContainer" containerID="d31551146b8ebbbc731382dfb6cd2dabb404edd7c9171497db9313117e68d055" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.698767 4707 scope.go:117] "RemoveContainer" containerID="2b76cfe96a31c3a5c537442539103885ef5840bd5265479eefa63122173d5b66" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.722045 4707 scope.go:117] "RemoveContainer" containerID="c09faffd0bbe36299f2dd172904831c8c7693d38c1181a1172b98f971042e45c" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.749847 4707 scope.go:117] "RemoveContainer" containerID="324443ed55b5500f3191c63cafe8edcca7c7ff9370470a884b4139fe7922effa" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.764211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.764251 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.789924 4707 scope.go:117] "RemoveContainer" containerID="71b02c968c6f75c52a86bc5fe092ddddf094ad5abc9fe1f5c5ffb5d235679855" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.801853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.805909 4707 scope.go:117] "RemoveContainer" containerID="ada1d30de6c406ca7003129e9e09083afedc3ca70ab25f4fd5dd2ddc1aac0321" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.823751 4707 scope.go:117] "RemoveContainer" containerID="755d9d5adcd064ca0330aa73fc3e6cc0f540f9c4fa34be8fff019ef4103996e6" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.837146 4707 scope.go:117] "RemoveContainer" containerID="2f142df25ab095ac8ab812304764acf0c07b5fb5d97186a9832bc8e8eceb7abf" Jan 21 15:57:08 crc kubenswrapper[4707]: I0121 15:57:08.854539 4707 scope.go:117] "RemoveContainer" containerID="83e8027d614ff783de9b84f03d3856d99934e6591281a3e9bc31ec8d751eef10" Jan 21 15:57:08 crc 
kubenswrapper[4707]: I0121 15:57:08.871030 4707 scope.go:117] "RemoveContainer" containerID="89f7c57c91a7c289c44fe4c129ace200e5af39011c8ced5441224fe0f0e92e44" Jan 21 15:57:09 crc kubenswrapper[4707]: I0121 15:57:09.503751 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:57:09 crc kubenswrapper[4707]: I0121 15:57:09.536579 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqrbk"] Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.484057 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rqrbk" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="registry-server" containerID="cri-o://008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427" gracePeriod=2 Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.826904 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.892314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-catalog-content\") pod \"7a915999-9f3b-470f-ab00-25bb814401e1\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.892384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-utilities\") pod \"7a915999-9f3b-470f-ab00-25bb814401e1\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.892449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp22k\" (UniqueName: \"kubernetes.io/projected/7a915999-9f3b-470f-ab00-25bb814401e1-kube-api-access-jp22k\") pod \"7a915999-9f3b-470f-ab00-25bb814401e1\" (UID: \"7a915999-9f3b-470f-ab00-25bb814401e1\") " Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.893017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-utilities" (OuterVolumeSpecName: "utilities") pod "7a915999-9f3b-470f-ab00-25bb814401e1" (UID: "7a915999-9f3b-470f-ab00-25bb814401e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.896467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a915999-9f3b-470f-ab00-25bb814401e1-kube-api-access-jp22k" (OuterVolumeSpecName: "kube-api-access-jp22k") pod "7a915999-9f3b-470f-ab00-25bb814401e1" (UID: "7a915999-9f3b-470f-ab00-25bb814401e1"). InnerVolumeSpecName "kube-api-access-jp22k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.930617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a915999-9f3b-470f-ab00-25bb814401e1" (UID: "7a915999-9f3b-470f-ab00-25bb814401e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.993362 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.993538 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a915999-9f3b-470f-ab00-25bb814401e1-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:11 crc kubenswrapper[4707]: I0121 15:57:11.993549 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp22k\" (UniqueName: \"kubernetes.io/projected/7a915999-9f3b-470f-ab00-25bb814401e1-kube-api-access-jp22k\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.492156 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a915999-9f3b-470f-ab00-25bb814401e1" containerID="008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427" exitCode=0 Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.492191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerDied","Data":"008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427"} Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.492220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqrbk" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.492238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqrbk" event={"ID":"7a915999-9f3b-470f-ab00-25bb814401e1","Type":"ContainerDied","Data":"73eb68bb31db381dba8b78b5aab784a5b99d70a1c62d6cdf2f4704d86ea55aa3"} Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.492256 4707 scope.go:117] "RemoveContainer" containerID="008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.507295 4707 scope.go:117] "RemoveContainer" containerID="fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.517421 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqrbk"] Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.521981 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rqrbk"] Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.525414 4707 scope.go:117] "RemoveContainer" containerID="88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.541308 4707 scope.go:117] "RemoveContainer" containerID="008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427" Jan 21 15:57:12 crc kubenswrapper[4707]: E0121 15:57:12.541696 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427\": container with ID starting with 008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427 not found: ID does not exist" containerID="008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.541722 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427"} err="failed to get container status \"008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427\": rpc error: code = NotFound desc = could not find container \"008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427\": container with ID starting with 008420af625c1e03f2ea5cc4ef2623760da6d168a3638caedb87997224515427 not found: ID does not exist" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.541740 4707 scope.go:117] "RemoveContainer" containerID="fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0" Jan 21 15:57:12 crc kubenswrapper[4707]: E0121 15:57:12.542113 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0\": container with ID starting with fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0 not found: ID does not exist" containerID="fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.542146 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0"} err="failed to get container status \"fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0\": rpc error: code = NotFound desc = could not find container \"fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0\": container with ID starting with fc3aa680a6dd9b874ed4b344e85ca87d63338b0431ba36b1bb8e1528c3704be0 not found: ID does not exist" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.542165 4707 scope.go:117] "RemoveContainer" containerID="88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403" Jan 21 15:57:12 crc kubenswrapper[4707]: E0121 15:57:12.542461 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403\": container with ID starting with 88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403 not found: ID does not exist" containerID="88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403" Jan 21 15:57:12 crc kubenswrapper[4707]: I0121 15:57:12.542491 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403"} err="failed to get container status \"88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403\": rpc error: code = NotFound desc = could not find container \"88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403\": container with ID starting with 88cb371f8cbb9eddf93962f0482cb8bb78936f1a245b6ef1a83f70ac1f522403 not found: ID does not exist" Jan 21 15:57:13 crc kubenswrapper[4707]: I0121 15:57:13.188548 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" path="/var/lib/kubelet/pods/7a915999-9f3b-470f-ab00-25bb814401e1/volumes" Jan 21 15:57:16 crc kubenswrapper[4707]: I0121 15:57:16.182683 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:57:16 crc kubenswrapper[4707]: E0121 15:57:16.183093 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.501179 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:57:20 crc kubenswrapper[4707]: E0121 15:57:20.501798 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="extract-content" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.501824 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="extract-content" Jan 21 15:57:20 crc kubenswrapper[4707]: E0121 15:57:20.501838 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="registry-server" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.501843 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="registry-server" Jan 21 15:57:20 crc kubenswrapper[4707]: E0121 15:57:20.501853 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="extract-utilities" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.501859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="extract-utilities" Jan 21 15:57:20 crc kubenswrapper[4707]: E0121 15:57:20.501878 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c287a9f3-dfaf-46b1-a73b-b1d66c346648" containerName="storage" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.501883 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c287a9f3-dfaf-46b1-a73b-b1d66c346648" containerName="storage" Jan 21 15:57:20 crc kubenswrapper[4707]: E0121 15:57:20.501893 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.501898 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.502015 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a9822-7aa7-4a38-916a-c3fc9fdb0895" containerName="nova-cell1-conductor-conductor" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.502028 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a915999-9f3b-470f-ab00-25bb814401e1" containerName="registry-server" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.502039 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c287a9f3-dfaf-46b1-a73b-b1d66c346648" containerName="storage" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.502685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.504065 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.504369 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.504581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.504924 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.505129 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-m7v2m" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.505905 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.506204 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.513906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604321 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66wc\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-kube-api-access-l66wc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50ded377-6ace-4f39-8a60-bee575f7e8cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50ded377-6ace-4f39-8a60-bee575f7e8cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.604565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l66wc\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-kube-api-access-l66wc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705580 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50ded377-6ace-4f39-8a60-bee575f7e8cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50ded377-6ace-4f39-8a60-bee575f7e8cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.705760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.706148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.706272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.706321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.706377 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.706642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.706992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.708532 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.709596 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.710462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50ded377-6ace-4f39-8a60-bee575f7e8cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.711619 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.711942 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.712257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50ded377-6ace-4f39-8a60-bee575f7e8cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.712550 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.712751 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.712917 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.713052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-m7gbd" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.712551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.715919 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.719331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.720964 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.723861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66wc\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-kube-api-access-l66wc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.725100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4kc\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-kube-api-access-pc4kc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c37d5e5-4ffd-412c-a93a-86cb6474735c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 
15:57:20.807882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c37d5e5-4ffd-412c-a93a-86cb6474735c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.807927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.819565 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c37d5e5-4ffd-412c-a93a-86cb6474735c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c37d5e5-4ffd-412c-a93a-86cb6474735c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.909998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.910017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4kc\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-kube-api-access-pc4kc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.910041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.910485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.910956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.911103 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.911475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.911937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.912587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.913749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.914707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c37d5e5-4ffd-412c-a93a-86cb6474735c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.916171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c37d5e5-4ffd-412c-a93a-86cb6474735c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.917301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.925502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4kc\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-kube-api-access-pc4kc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:20 crc kubenswrapper[4707]: I0121 15:57:20.939065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:21 crc kubenswrapper[4707]: I0121 15:57:21.065303 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:21 crc kubenswrapper[4707]: I0121 15:57:21.196039 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 15:57:21 crc kubenswrapper[4707]: I0121 15:57:21.430965 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 15:57:21 crc kubenswrapper[4707]: W0121 15:57:21.431935 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c37d5e5_4ffd_412c_a93a_86cb6474735c.slice/crio-557e2ea25d4d161d78cb6fb283e2d22644b492800db38b794bd7bd717f3c20d5 WatchSource:0}: Error finding container 557e2ea25d4d161d78cb6fb283e2d22644b492800db38b794bd7bd717f3c20d5: Status 404 returned error can't find the container with id 557e2ea25d4d161d78cb6fb283e2d22644b492800db38b794bd7bd717f3c20d5 Jan 21 15:57:21 crc kubenswrapper[4707]: I0121 15:57:21.543458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1c37d5e5-4ffd-412c-a93a-86cb6474735c","Type":"ContainerStarted","Data":"557e2ea25d4d161d78cb6fb283e2d22644b492800db38b794bd7bd717f3c20d5"} Jan 21 15:57:21 crc kubenswrapper[4707]: I0121 15:57:21.544549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"50ded377-6ace-4f39-8a60-bee575f7e8cc","Type":"ContainerStarted","Data":"0f47ed0c3f538d528d4f1deba5f82156807e5b65d757738993e37fefc5bb139f"} Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.404855 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.406046 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.408660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.408668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-6glg2" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.408960 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.409187 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.413659 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.416467 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.429880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.429951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.429979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.430006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xprz2\" (UniqueName: \"kubernetes.io/projected/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kube-api-access-xprz2\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.430051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.430090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.430108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.430128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xprz2\" (UniqueName: \"kubernetes.io/projected/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kube-api-access-xprz2\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.531871 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.532300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.532560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.533295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.543095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.543099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.545334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xprz2\" (UniqueName: \"kubernetes.io/projected/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kube-api-access-xprz2\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.547561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.554538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"50ded377-6ace-4f39-8a60-bee575f7e8cc","Type":"ContainerStarted","Data":"322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab"} Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.557443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1c37d5e5-4ffd-412c-a93a-86cb6474735c","Type":"ContainerStarted","Data":"6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679"} Jan 21 15:57:22 crc kubenswrapper[4707]: I0121 15:57:22.721830 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.118098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 15:57:23 crc kubenswrapper[4707]: W0121 15:57:23.128638 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c95d6a_958d_45bf_8cf2_c9b915727cd4.slice/crio-415e6d046d93980e7818ca8a0519f3c87c8d871c99b7e72d21fc4a1d90f407a2 WatchSource:0}: Error finding container 415e6d046d93980e7818ca8a0519f3c87c8d871c99b7e72d21fc4a1d90f407a2: Status 404 returned error can't find the container with id 415e6d046d93980e7818ca8a0519f3c87c8d871c99b7e72d21fc4a1d90f407a2 Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.565649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"10c95d6a-958d-45bf-8cf2-c9b915727cd4","Type":"ContainerStarted","Data":"e79d60a7b5b8d4c758290826a1efe8d5479f6a04297437d3fe22a8705fa7d7b8"} Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.566481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"10c95d6a-958d-45bf-8cf2-c9b915727cd4","Type":"ContainerStarted","Data":"415e6d046d93980e7818ca8a0519f3c87c8d871c99b7e72d21fc4a1d90f407a2"} Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.787441 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.788620 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.790837 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.791017 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rch7x" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.791143 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.791398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.791975 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.950971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v25q5\" (UniqueName: \"kubernetes.io/projected/614dacb7-b2d4-4d19-b93d-c922660ad318-kube-api-access-v25q5\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951229 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:23 crc kubenswrapper[4707]: I0121 15:57:23.951273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v25q5\" (UniqueName: \"kubernetes.io/projected/614dacb7-b2d4-4d19-b93d-c922660ad318-kube-api-access-v25q5\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc 
kubenswrapper[4707]: I0121 15:57:24.052614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.052966 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.053527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.053687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.053713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.054207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.058143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.058231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.066485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v25q5\" (UniqueName: \"kubernetes.io/projected/614dacb7-b2d4-4d19-b93d-c922660ad318-kube-api-access-v25q5\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc 
kubenswrapper[4707]: I0121 15:57:24.069093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.101524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.156995 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.157982 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.160250 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-tgjb6" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.160690 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.161042 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.174233 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.356580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.356683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-config-data\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.356723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrtv\" (UniqueName: \"kubernetes.io/projected/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kube-api-access-jmrtv\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.356853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.356890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kolla-config\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc 
kubenswrapper[4707]: I0121 15:57:24.458529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.458587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-config-data\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.458610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrtv\" (UniqueName: \"kubernetes.io/projected/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kube-api-access-jmrtv\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.458674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.458702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kolla-config\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.459316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kolla-config\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.460436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-config-data\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.464262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.464327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.479303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrtv\" (UniqueName: \"kubernetes.io/projected/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kube-api-access-jmrtv\") pod \"memcached-0\" 
(UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.537312 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 15:57:24 crc kubenswrapper[4707]: W0121 15:57:24.544531 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod614dacb7_b2d4_4d19_b93d_c922660ad318.slice/crio-d71249bd7452f771849cb799f45131670980f0631ff6b22cb430d6f58d21911f WatchSource:0}: Error finding container d71249bd7452f771849cb799f45131670980f0631ff6b22cb430d6f58d21911f: Status 404 returned error can't find the container with id d71249bd7452f771849cb799f45131670980f0631ff6b22cb430d6f58d21911f Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.573338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"614dacb7-b2d4-4d19-b93d-c922660ad318","Type":"ContainerStarted","Data":"d71249bd7452f771849cb799f45131670980f0631ff6b22cb430d6f58d21911f"} Jan 21 15:57:24 crc kubenswrapper[4707]: I0121 15:57:24.771591 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.125059 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.579653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"614dacb7-b2d4-4d19-b93d-c922660ad318","Type":"ContainerStarted","Data":"fa323bea5e9e1d95534dd4ba7f41b30c1e310d988c5dd2b4f06c9ddc1db6a2e9"} Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.580862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"e5f232af-6f89-4112-a5a1-5fd2093c2b61","Type":"ContainerStarted","Data":"39951c1adfd56981da93394dd5526012af516ce4d8e4e81c75c853a81df75ff1"} Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.580901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"e5f232af-6f89-4112-a5a1-5fd2093c2b61","Type":"ContainerStarted","Data":"c9e92d7b6b68e40da63224c6160c0c0ba5da9124af56543eadab466a5c80785a"} Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.581036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.613227 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.613212985 podStartE2EDuration="1.613212985s" podCreationTimestamp="2026-01-21 15:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:25.607743721 +0000 UTC m=+3342.789259943" watchObservedRunningTime="2026-01-21 15:57:25.613212985 +0000 UTC m=+3342.794729208" Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.912876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.913649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.916042 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-rrc5d" Jan 21 15:57:25 crc kubenswrapper[4707]: I0121 15:57:25.924627 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.079670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jgrh\" (UniqueName: \"kubernetes.io/projected/ff49b548-7c18-4eba-a23b-9b22dc4e5982-kube-api-access-2jgrh\") pod \"kube-state-metrics-0\" (UID: \"ff49b548-7c18-4eba-a23b-9b22dc4e5982\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.153033 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dltnz"] Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.154566 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.165127 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dltnz"] Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.181382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jgrh\" (UniqueName: \"kubernetes.io/projected/ff49b548-7c18-4eba-a23b-9b22dc4e5982-kube-api-access-2jgrh\") pod \"kube-state-metrics-0\" (UID: \"ff49b548-7c18-4eba-a23b-9b22dc4e5982\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.200027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jgrh\" (UniqueName: \"kubernetes.io/projected/ff49b548-7c18-4eba-a23b-9b22dc4e5982-kube-api-access-2jgrh\") pod \"kube-state-metrics-0\" (UID: \"ff49b548-7c18-4eba-a23b-9b22dc4e5982\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.225626 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.284600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlj7k\" (UniqueName: \"kubernetes.io/projected/da46cb19-a3be-4378-a82f-47f24de3e2b1-kube-api-access-tlj7k\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.284643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-utilities\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.284732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-catalog-content\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.356768 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mpbh2"] Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.364476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.371616 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mpbh2"] Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.386493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-catalog-content\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.386645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlj7k\" (UniqueName: \"kubernetes.io/projected/da46cb19-a3be-4378-a82f-47f24de3e2b1-kube-api-access-tlj7k\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.386684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-utilities\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.387101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-utilities\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.387158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-catalog-content\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.427615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlj7k\" (UniqueName: \"kubernetes.io/projected/da46cb19-a3be-4378-a82f-47f24de3e2b1-kube-api-access-tlj7k\") pod \"redhat-marketplace-dltnz\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.467589 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.488421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-catalog-content\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.488464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-utilities\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.488562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchrc\" (UniqueName: \"kubernetes.io/projected/6fff3d01-d3b5-4323-992a-a8542065926b-kube-api-access-hchrc\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.588870 4707 generic.go:334] "Generic (PLEG): container finished" podID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerID="e79d60a7b5b8d4c758290826a1efe8d5479f6a04297437d3fe22a8705fa7d7b8" exitCode=0 Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.589409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"10c95d6a-958d-45bf-8cf2-c9b915727cd4","Type":"ContainerDied","Data":"e79d60a7b5b8d4c758290826a1efe8d5479f6a04297437d3fe22a8705fa7d7b8"} Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.589919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchrc\" (UniqueName: \"kubernetes.io/projected/6fff3d01-d3b5-4323-992a-a8542065926b-kube-api-access-hchrc\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.589965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-catalog-content\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.589995 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-utilities\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.590368 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-utilities\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.590423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-catalog-content\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.611986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchrc\" (UniqueName: \"kubernetes.io/projected/6fff3d01-d3b5-4323-992a-a8542065926b-kube-api-access-hchrc\") pod \"certified-operators-mpbh2\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.665687 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:57:26 crc kubenswrapper[4707]: W0121 15:57:26.668988 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff49b548_7c18_4eba_a23b_9b22dc4e5982.slice/crio-65e4e2036d324214de6fb4345cdfbcc3bbab91d8820d175f6cd136b4dc4d6fa2 WatchSource:0}: Error finding container 65e4e2036d324214de6fb4345cdfbcc3bbab91d8820d175f6cd136b4dc4d6fa2: Status 404 returned error can't find the container with id 65e4e2036d324214de6fb4345cdfbcc3bbab91d8820d175f6cd136b4dc4d6fa2 Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.671575 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.680085 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:26 crc kubenswrapper[4707]: I0121 15:57:26.887706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dltnz"] Jan 21 15:57:26 crc kubenswrapper[4707]: W0121 15:57:26.892400 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda46cb19_a3be_4378_a82f_47f24de3e2b1.slice/crio-128d966e8b40accbad8af16e05a641a7433ece376183678fe4c31eafe2b541bf WatchSource:0}: Error finding container 128d966e8b40accbad8af16e05a641a7433ece376183678fe4c31eafe2b541bf: Status 404 returned error can't find the container with id 128d966e8b40accbad8af16e05a641a7433ece376183678fe4c31eafe2b541bf Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.083395 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mpbh2"] Jan 21 15:57:27 crc kubenswrapper[4707]: W0121 15:57:27.124375 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fff3d01_d3b5_4323_992a_a8542065926b.slice/crio-71bd3e36c051854dae428b349d00f0dfde7fcc9299f91ae33ca7a902bf7009b7 WatchSource:0}: Error finding container 71bd3e36c051854dae428b349d00f0dfde7fcc9299f91ae33ca7a902bf7009b7: Status 404 returned error can't find the container with id 71bd3e36c051854dae428b349d00f0dfde7fcc9299f91ae33ca7a902bf7009b7 Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.595604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"10c95d6a-958d-45bf-8cf2-c9b915727cd4","Type":"ContainerStarted","Data":"1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.597622 4707 generic.go:334] "Generic (PLEG): container finished" podID="6fff3d01-d3b5-4323-992a-a8542065926b" containerID="22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e" exitCode=0 Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.597680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerDied","Data":"22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.597705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerStarted","Data":"71bd3e36c051854dae428b349d00f0dfde7fcc9299f91ae33ca7a902bf7009b7"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.598968 4707 generic.go:334] "Generic (PLEG): container finished" podID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerID="132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc" exitCode=0 Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.599018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dltnz" event={"ID":"da46cb19-a3be-4378-a82f-47f24de3e2b1","Type":"ContainerDied","Data":"132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.599039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dltnz" 
event={"ID":"da46cb19-a3be-4378-a82f-47f24de3e2b1","Type":"ContainerStarted","Data":"128d966e8b40accbad8af16e05a641a7433ece376183678fe4c31eafe2b541bf"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.601455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"ff49b548-7c18-4eba-a23b-9b22dc4e5982","Type":"ContainerStarted","Data":"38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.601483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"ff49b548-7c18-4eba-a23b-9b22dc4e5982","Type":"ContainerStarted","Data":"65e4e2036d324214de6fb4345cdfbcc3bbab91d8820d175f6cd136b4dc4d6fa2"} Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.601574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.615034 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.615023232 podStartE2EDuration="6.615023232s" podCreationTimestamp="2026-01-21 15:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:27.609127325 +0000 UTC m=+3344.790643537" watchObservedRunningTime="2026-01-21 15:57:27.615023232 +0000 UTC m=+3344.796539454" Jan 21 15:57:27 crc kubenswrapper[4707]: I0121 15:57:27.632869 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.335985397 podStartE2EDuration="2.63285878s" podCreationTimestamp="2026-01-21 15:57:25 +0000 UTC" firstStartedPulling="2026-01-21 15:57:26.671360779 +0000 UTC m=+3343.852877001" lastFinishedPulling="2026-01-21 15:57:26.968234162 +0000 UTC m=+3344.149750384" observedRunningTime="2026-01-21 15:57:27.62972136 +0000 UTC m=+3344.811237581" watchObservedRunningTime="2026-01-21 15:57:27.63285878 +0000 UTC m=+3344.814375001" Jan 21 15:57:28 crc kubenswrapper[4707]: I0121 15:57:28.609099 4707 generic.go:334] "Generic (PLEG): container finished" podID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerID="505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78" exitCode=0 Jan 21 15:57:28 crc kubenswrapper[4707]: I0121 15:57:28.609192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dltnz" event={"ID":"da46cb19-a3be-4378-a82f-47f24de3e2b1","Type":"ContainerDied","Data":"505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78"} Jan 21 15:57:28 crc kubenswrapper[4707]: I0121 15:57:28.611425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerStarted","Data":"fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317"} Jan 21 15:57:28 crc kubenswrapper[4707]: I0121 15:57:28.613006 4707 generic.go:334] "Generic (PLEG): container finished" podID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerID="fa323bea5e9e1d95534dd4ba7f41b30c1e310d988c5dd2b4f06c9ddc1db6a2e9" exitCode=0 Jan 21 15:57:28 crc kubenswrapper[4707]: I0121 15:57:28.613083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" 
event={"ID":"614dacb7-b2d4-4d19-b93d-c922660ad318","Type":"ContainerDied","Data":"fa323bea5e9e1d95534dd4ba7f41b30c1e310d988c5dd2b4f06c9ddc1db6a2e9"} Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.620539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dltnz" event={"ID":"da46cb19-a3be-4378-a82f-47f24de3e2b1","Type":"ContainerStarted","Data":"d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07"} Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.623019 4707 generic.go:334] "Generic (PLEG): container finished" podID="6fff3d01-d3b5-4323-992a-a8542065926b" containerID="fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317" exitCode=0 Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.623078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerDied","Data":"fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317"} Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.625412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"614dacb7-b2d4-4d19-b93d-c922660ad318","Type":"ContainerStarted","Data":"1c1f119c65c4fce9675289522fce83be1fb184f19095b0c15b0efe892c668d46"} Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.640148 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dltnz" podStartSLOduration=2.126158186 podStartE2EDuration="3.640135073s" podCreationTimestamp="2026-01-21 15:57:26 +0000 UTC" firstStartedPulling="2026-01-21 15:57:27.599884996 +0000 UTC m=+3344.781401218" lastFinishedPulling="2026-01-21 15:57:29.113861883 +0000 UTC m=+3346.295378105" observedRunningTime="2026-01-21 15:57:29.635229509 +0000 UTC m=+3346.816745731" watchObservedRunningTime="2026-01-21 15:57:29.640135073 +0000 UTC m=+3346.821651295" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.653211 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.653196543 podStartE2EDuration="7.653196543s" podCreationTimestamp="2026-01-21 15:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:29.649900706 +0000 UTC m=+3346.831416929" watchObservedRunningTime="2026-01-21 15:57:29.653196543 +0000 UTC m=+3346.834712766" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.856507 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.857654 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.859394 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.859440 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.859771 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-clk78" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.859865 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.860339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.868429 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc 
kubenswrapper[4707]: I0121 15:57:29.937738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fr5b\" (UniqueName: \"kubernetes.io/projected/909f1cc7-acf2-4e84-ae89-4af8bd83962d-kube-api-access-9fr5b\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:29 crc kubenswrapper[4707]: I0121 15:57:29.937776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-config\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fr5b\" (UniqueName: \"kubernetes.io/projected/909f1cc7-acf2-4e84-ae89-4af8bd83962d-kube-api-access-9fr5b\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.038993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-config\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.039019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.039374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.039415 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.039786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-config\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.039900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.043711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.043892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.044289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.052861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fr5b\" (UniqueName: \"kubernetes.io/projected/909f1cc7-acf2-4e84-ae89-4af8bd83962d-kube-api-access-9fr5b\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.056534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.170677 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.182519 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:57:30 crc kubenswrapper[4707]: E0121 15:57:30.182788 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.576680 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.643354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerStarted","Data":"97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0"} Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.645044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"909f1cc7-acf2-4e84-ae89-4af8bd83962d","Type":"ContainerStarted","Data":"ddb9006b93d05c8a539c45bb05dd6e7dcb7f50de147d5929f26a88d4c853ac6f"} Jan 21 15:57:30 crc kubenswrapper[4707]: I0121 15:57:30.656552 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mpbh2" podStartSLOduration=2.133237587 podStartE2EDuration="4.65653932s" podCreationTimestamp="2026-01-21 15:57:26 +0000 UTC" firstStartedPulling="2026-01-21 15:57:27.598745683 +0000 UTC m=+3344.780261895" lastFinishedPulling="2026-01-21 15:57:30.122047406 +0000 UTC m=+3347.303563628" observedRunningTime="2026-01-21 15:57:30.655147433 +0000 UTC m=+3347.836663655" watchObservedRunningTime="2026-01-21 15:57:30.65653932 +0000 UTC m=+3347.838055542" Jan 21 15:57:31 crc kubenswrapper[4707]: I0121 15:57:31.652415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"909f1cc7-acf2-4e84-ae89-4af8bd83962d","Type":"ContainerStarted","Data":"0fc100dde8fb0c9e0faf72a71eaa71c4587eafd3440b5e4f0e1ef1e3d4861bcb"} Jan 21 15:57:31 crc kubenswrapper[4707]: I0121 15:57:31.652638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"909f1cc7-acf2-4e84-ae89-4af8bd83962d","Type":"ContainerStarted","Data":"2f3ffa31dd88b33ac598f0e6e8fcfb46ab698b734a73b7deb6cda430c6834638"} Jan 21 15:57:31 crc kubenswrapper[4707]: I0121 15:57:31.666587 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.6665744609999997 podStartE2EDuration="3.666574461s" podCreationTimestamp="2026-01-21 15:57:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:31.663473179 +0000 UTC m=+3348.844989402" watchObservedRunningTime="2026-01-21 15:57:31.666574461 +0000 UTC m=+3348.848090683" Jan 21 
15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.115970 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.117088 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.118833 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-xp8nc" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.119655 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.120611 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.121562 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.124229 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-config\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxnb5\" (UniqueName: \"kubernetes.io/projected/c1857e43-c160-4d63-ba36-3b9cb96ebadc-kube-api-access-xxnb5\") pod \"ovsdbserver-sb-0\" (UID: 
\"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.269973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.370920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.370973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-config\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxnb5\" (UniqueName: \"kubernetes.io/projected/c1857e43-c160-4d63-ba36-3b9cb96ebadc-kube-api-access-xxnb5\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: 
I0121 15:57:32.371254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371548 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.371858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-config\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.372189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.378725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.378726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.379172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.383995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxnb5\" (UniqueName: \"kubernetes.io/projected/c1857e43-c160-4d63-ba36-3b9cb96ebadc-kube-api-access-xxnb5\") pod \"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.390514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.432038 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.723346 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.723575 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:32 crc kubenswrapper[4707]: W0121 15:57:32.822511 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1857e43_c160_4d63_ba36_3b9cb96ebadc.slice/crio-24b232d7760879a2a07eb128986900b6964a72b4f6d77e4aa82d63cdec49afcf WatchSource:0}: Error finding container 24b232d7760879a2a07eb128986900b6964a72b4f6d77e4aa82d63cdec49afcf: Status 404 returned error can't find the container with id 24b232d7760879a2a07eb128986900b6964a72b4f6d77e4aa82d63cdec49afcf Jan 21 15:57:32 crc kubenswrapper[4707]: I0121 15:57:32.823692 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.171145 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.201189 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.665761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1857e43-c160-4d63-ba36-3b9cb96ebadc","Type":"ContainerStarted","Data":"b2677b78e02f90668c6187316a0c1e86ff839677dd5fcae728e200c597003229"} Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.665804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1857e43-c160-4d63-ba36-3b9cb96ebadc","Type":"ContainerStarted","Data":"67452b60079d3b2cf3ba3e907030f0defbdf50d69c12ef1aad42ab4d0e43dbcf"} Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.665827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1857e43-c160-4d63-ba36-3b9cb96ebadc","Type":"ContainerStarted","Data":"24b232d7760879a2a07eb128986900b6964a72b4f6d77e4aa82d63cdec49afcf"} Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.665919 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:33 crc kubenswrapper[4707]: I0121 15:57:33.682099 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.682086503 podStartE2EDuration="2.682086503s" podCreationTimestamp="2026-01-21 15:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:33.678243508 +0000 UTC m=+3350.859759729" watchObservedRunningTime="2026-01-21 15:57:33.682086503 +0000 UTC m=+3350.863602726" Jan 21 15:57:34 crc kubenswrapper[4707]: I0121 15:57:34.102622 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:34 crc kubenswrapper[4707]: I0121 15:57:34.102845 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:34 crc kubenswrapper[4707]: I0121 15:57:34.772509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 15:57:34 crc kubenswrapper[4707]: I0121 15:57:34.960463 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:35 crc kubenswrapper[4707]: I0121 15:57:35.016735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 15:57:35 crc kubenswrapper[4707]: I0121 15:57:35.197526 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 15:57:35 crc kubenswrapper[4707]: I0121 15:57:35.432819 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:35 crc kubenswrapper[4707]: I0121 15:57:35.457926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:35 crc kubenswrapper[4707]: I0121 15:57:35.677623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.233451 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.314624 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.369531 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.467798 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.467889 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.501856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.681692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.681759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.716372 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:36 crc kubenswrapper[4707]: I0121 15:57:36.721651 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.164971 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.169130 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.170519 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.170821 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.171099 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.171786 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-vtp2w" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.179179 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.336392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.337005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.337184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-lock\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.337384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-cache\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.337462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsprd\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-kube-api-access-jsprd\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.439288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.439556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.439566 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.439721 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.439746 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.439773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-lock\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.439787 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift podName:f9e3466a-372e-4d53-9e3b-807d8403f17a nodeName:}" failed. No retries permitted until 2026-01-21 15:57:37.93977324 +0000 UTC m=+3355.121289462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift") pod "swift-storage-0" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a") : configmap "swift-ring-files" not found Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.439859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-cache\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.439885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsprd\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-kube-api-access-jsprd\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.440202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-lock\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.440268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-cache\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.456456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.457632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsprd\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-kube-api-access-jsprd\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.461684 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.560416 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6hpdr"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.561228 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.566265 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.566385 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.566397 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.574964 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6hpdr"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.580181 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-drb2f"] Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.580945 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-hxc59 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-hxc59 ring-data-devices scripts swiftconf]: context canceled" pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" podUID="a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.581131 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.590581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-drb2f"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.607765 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6hpdr"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.642896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-dispersionconf\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.642940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-scripts\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.642963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-scripts\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnw7\" (UniqueName: \"kubernetes.io/projected/5074e861-4d4c-4290-9c96-206e7d0b8f6d-kube-api-access-vwnw7\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643059 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-combined-ca-bundle\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-ring-data-devices\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxc59\" (UniqueName: \"kubernetes.io/projected/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-kube-api-access-hxc59\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-etc-swift\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-dispersionconf\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-swiftconf\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-combined-ca-bundle\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-swiftconf\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5074e861-4d4c-4290-9c96-206e7d0b8f6d-etc-swift\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.643688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-ring-data-devices\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.691040 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.714464 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.718942 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.719976 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.722023 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.722525 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.722918 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.722969 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-jvckj" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.735378 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxc59\" (UniqueName: \"kubernetes.io/projected/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-kube-api-access-hxc59\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-etc-swift\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-scripts\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-dispersionconf\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-swiftconf\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-combined-ca-bundle\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-swiftconf\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5074e861-4d4c-4290-9c96-206e7d0b8f6d-etc-swift\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x59g\" (UniqueName: \"kubernetes.io/projected/40dfa261-8913-4472-8f97-ac7b04d1f349-kube-api-access-2x59g\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-ring-data-devices\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-dispersionconf\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-scripts\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-scripts\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.745990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.746015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnw7\" (UniqueName: \"kubernetes.io/projected/5074e861-4d4c-4290-9c96-206e7d0b8f6d-kube-api-access-vwnw7\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.746035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-combined-ca-bundle\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.746053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-config\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.746069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-ring-data-devices\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.746092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.746172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-etc-swift\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.747120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-scripts\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.747448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-ring-data-devices\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.747602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-ring-data-devices\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.747937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5074e861-4d4c-4290-9c96-206e7d0b8f6d-etc-swift\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.748636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-scripts\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.752636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-dispersionconf\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.752893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-combined-ca-bundle\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.753801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-dispersionconf\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.753868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-combined-ca-bundle\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.755272 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.759952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-swiftconf\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.763582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-swiftconf\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.764201 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxc59\" (UniqueName: \"kubernetes.io/projected/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-kube-api-access-hxc59\") pod \"swift-ring-rebalance-6hpdr\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.764323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnw7\" (UniqueName: \"kubernetes.io/projected/5074e861-4d4c-4290-9c96-206e7d0b8f6d-kube-api-access-vwnw7\") pod \"swift-ring-rebalance-drb2f\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-scripts\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-etc-swift\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-ring-data-devices\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-scripts" (OuterVolumeSpecName: "scripts") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-scripts\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.847978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x59g\" (UniqueName: \"kubernetes.io/projected/40dfa261-8913-4472-8f97-ac7b04d1f349-kube-api-access-2x59g\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848259 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-config\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-scripts\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848765 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848845 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848860 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.848871 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.849077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-config\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.854496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.854516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.854600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.860303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x59g\" (UniqueName: \"kubernetes.io/projected/40dfa261-8913-4472-8f97-ac7b04d1f349-kube-api-access-2x59g\") pod \"ovn-northd-0\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.892232 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.951726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-swiftconf\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.951920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-dispersionconf\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.951948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxc59\" (UniqueName: \"kubernetes.io/projected/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-kube-api-access-hxc59\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.952038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-combined-ca-bundle\") pod \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\" (UID: \"a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0\") " Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.952476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.952644 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.952656 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:57:37 crc kubenswrapper[4707]: E0121 15:57:37.952693 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift podName:f9e3466a-372e-4d53-9e3b-807d8403f17a nodeName:}" failed. No retries permitted until 2026-01-21 15:57:38.952680467 +0000 UTC m=+3356.134196689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift") pod "swift-storage-0" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a") : configmap "swift-ring-files" not found Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.957761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.957789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.957838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-kube-api-access-hxc59" (OuterVolumeSpecName: "kube-api-access-hxc59") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "kube-api-access-hxc59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:37 crc kubenswrapper[4707]: I0121 15:57:37.957864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" (UID: "a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.034529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.053801 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.053839 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.053851 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxc59\" (UniqueName: \"kubernetes.io/projected/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-kube-api-access-hxc59\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.053860 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.244079 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-drb2f"] Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.418603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 15:57:38 crc kubenswrapper[4707]: W0121 15:57:38.421970 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40dfa261_8913_4472_8f97_ac7b04d1f349.slice/crio-0b60252a0b88b983b9502f95f09710d7e176a0684306d6ebf4a27b605f4aaa75 WatchSource:0}: Error finding container 0b60252a0b88b983b9502f95f09710d7e176a0684306d6ebf4a27b605f4aaa75: Status 404 returned error can't find the container with id 0b60252a0b88b983b9502f95f09710d7e176a0684306d6ebf4a27b605f4aaa75 Jan 21 15:57:38 crc 
kubenswrapper[4707]: I0121 15:57:38.697694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" event={"ID":"5074e861-4d4c-4290-9c96-206e7d0b8f6d","Type":"ContainerStarted","Data":"1e8e5b9cb01eeda526ec62456f9b5e1ddeccd8b52cd2f55d760453d82d8aabdf"} Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.697921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" event={"ID":"5074e861-4d4c-4290-9c96-206e7d0b8f6d","Type":"ContainerStarted","Data":"89706fea3fbc58aaa5171abf41642ec6b14a8c585f18ef0f509d4de39a0defc5"} Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.698783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"40dfa261-8913-4472-8f97-ac7b04d1f349","Type":"ContainerStarted","Data":"853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61"} Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.698836 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-6hpdr" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.698848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"40dfa261-8913-4472-8f97-ac7b04d1f349","Type":"ContainerStarted","Data":"0b60252a0b88b983b9502f95f09710d7e176a0684306d6ebf4a27b605f4aaa75"} Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.720563 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" podStartSLOduration=1.720545029 podStartE2EDuration="1.720545029s" podCreationTimestamp="2026-01-21 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:38.716024739 +0000 UTC m=+3355.897540961" watchObservedRunningTime="2026-01-21 15:57:38.720545029 +0000 UTC m=+3355.902061250" Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.753115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dltnz"] Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.753364 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dltnz" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="registry-server" containerID="cri-o://d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07" gracePeriod=2 Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.761376 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6hpdr"] Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.768406 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-6hpdr"] Jan 21 15:57:38 crc kubenswrapper[4707]: I0121 15:57:38.968978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:38 crc kubenswrapper[4707]: E0121 15:57:38.969132 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:57:38 crc kubenswrapper[4707]: E0121 15:57:38.969148 4707 projected.go:194] 
Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:57:38 crc kubenswrapper[4707]: E0121 15:57:38.969190 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift podName:f9e3466a-372e-4d53-9e3b-807d8403f17a nodeName:}" failed. No retries permitted until 2026-01-21 15:57:40.969178461 +0000 UTC m=+3358.150694683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift") pod "swift-storage-0" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a") : configmap "swift-ring-files" not found Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.189380 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0" path="/var/lib/kubelet/pods/a9fabad9-f89c-4bfb-a1d3-57e9c24c43c0/volumes" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.663711 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.707408 4707 generic.go:334] "Generic (PLEG): container finished" podID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerID="d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07" exitCode=0 Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.707461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dltnz" event={"ID":"da46cb19-a3be-4378-a82f-47f24de3e2b1","Type":"ContainerDied","Data":"d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07"} Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.707485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dltnz" event={"ID":"da46cb19-a3be-4378-a82f-47f24de3e2b1","Type":"ContainerDied","Data":"128d966e8b40accbad8af16e05a641a7433ece376183678fe4c31eafe2b541bf"} Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.707500 4707 scope.go:117] "RemoveContainer" containerID="d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.707588 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dltnz" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.711782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"40dfa261-8913-4472-8f97-ac7b04d1f349","Type":"ContainerStarted","Data":"bdec8c944354ba8edd4a696e67ef57315d22a16748e6ca2cf813eb67fe88a679"} Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.711840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.737254 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.7372394719999997 podStartE2EDuration="2.737239472s" podCreationTimestamp="2026-01-21 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:39.729306986 +0000 UTC m=+3356.910823228" watchObservedRunningTime="2026-01-21 15:57:39.737239472 +0000 UTC m=+3356.918755694" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.739394 4707 scope.go:117] "RemoveContainer" containerID="505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.759245 4707 scope.go:117] "RemoveContainer" containerID="132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.778763 4707 scope.go:117] "RemoveContainer" containerID="d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07" Jan 21 15:57:39 crc kubenswrapper[4707]: E0121 15:57:39.779135 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07\": container with ID starting with d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07 not found: ID does not exist" containerID="d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.779157 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07"} err="failed to get container status \"d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07\": rpc error: code = NotFound desc = could not find container \"d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07\": container with ID starting with d3eb1035ce2c50632371a6c70d8c753c39d16061cbe1b9e28a9d2ec4df8abf07 not found: ID does not exist" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.779174 4707 scope.go:117] "RemoveContainer" containerID="505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78" Jan 21 15:57:39 crc kubenswrapper[4707]: E0121 15:57:39.779406 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78\": container with ID starting with 505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78 not found: ID does not exist" containerID="505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.779420 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78"} err="failed to get container status \"505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78\": rpc error: code = NotFound desc = could not find container \"505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78\": container with ID starting with 505f9729924d2793acb106b55e1c494087ed67783443cdf7dcf3ddd5d3274f78 not found: ID does not exist" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.779430 4707 scope.go:117] "RemoveContainer" containerID="132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc" Jan 21 15:57:39 crc kubenswrapper[4707]: E0121 15:57:39.779589 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc\": container with ID starting with 132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc not found: ID does not exist" containerID="132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.779602 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc"} err="failed to get container status \"132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc\": rpc error: code = NotFound desc = could not find container \"132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc\": container with ID starting with 132e02776a4ab21cd13c46cbe7dc145bacd58f11389b98eeb2868837b933a4fc not found: ID does not exist" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.781886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-utilities\") pod \"da46cb19-a3be-4378-a82f-47f24de3e2b1\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.781943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-catalog-content\") pod \"da46cb19-a3be-4378-a82f-47f24de3e2b1\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.781984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlj7k\" (UniqueName: \"kubernetes.io/projected/da46cb19-a3be-4378-a82f-47f24de3e2b1-kube-api-access-tlj7k\") pod \"da46cb19-a3be-4378-a82f-47f24de3e2b1\" (UID: \"da46cb19-a3be-4378-a82f-47f24de3e2b1\") " Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.784239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-utilities" (OuterVolumeSpecName: "utilities") pod "da46cb19-a3be-4378-a82f-47f24de3e2b1" (UID: "da46cb19-a3be-4378-a82f-47f24de3e2b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.786981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da46cb19-a3be-4378-a82f-47f24de3e2b1-kube-api-access-tlj7k" (OuterVolumeSpecName: "kube-api-access-tlj7k") pod "da46cb19-a3be-4378-a82f-47f24de3e2b1" (UID: "da46cb19-a3be-4378-a82f-47f24de3e2b1"). InnerVolumeSpecName "kube-api-access-tlj7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.800208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da46cb19-a3be-4378-a82f-47f24de3e2b1" (UID: "da46cb19-a3be-4378-a82f-47f24de3e2b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.886398 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.886429 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da46cb19-a3be-4378-a82f-47f24de3e2b1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:39 crc kubenswrapper[4707]: I0121 15:57:39.886440 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlj7k\" (UniqueName: \"kubernetes.io/projected/da46cb19-a3be-4378-a82f-47f24de3e2b1-kube-api-access-tlj7k\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.035652 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dltnz"] Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.041922 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dltnz"] Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.145417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mpbh2"] Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.145635 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mpbh2" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="registry-server" containerID="cri-o://97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0" gracePeriod=2 Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.555009 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.603147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchrc\" (UniqueName: \"kubernetes.io/projected/6fff3d01-d3b5-4323-992a-a8542065926b-kube-api-access-hchrc\") pod \"6fff3d01-d3b5-4323-992a-a8542065926b\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.603209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-catalog-content\") pod \"6fff3d01-d3b5-4323-992a-a8542065926b\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.603262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-utilities\") pod \"6fff3d01-d3b5-4323-992a-a8542065926b\" (UID: \"6fff3d01-d3b5-4323-992a-a8542065926b\") " Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.604632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-utilities" (OuterVolumeSpecName: "utilities") pod "6fff3d01-d3b5-4323-992a-a8542065926b" (UID: "6fff3d01-d3b5-4323-992a-a8542065926b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.609197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fff3d01-d3b5-4323-992a-a8542065926b-kube-api-access-hchrc" (OuterVolumeSpecName: "kube-api-access-hchrc") pod "6fff3d01-d3b5-4323-992a-a8542065926b" (UID: "6fff3d01-d3b5-4323-992a-a8542065926b"). InnerVolumeSpecName "kube-api-access-hchrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.652099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fff3d01-d3b5-4323-992a-a8542065926b" (UID: "6fff3d01-d3b5-4323-992a-a8542065926b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.706037 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchrc\" (UniqueName: \"kubernetes.io/projected/6fff3d01-d3b5-4323-992a-a8542065926b-kube-api-access-hchrc\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.706907 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.706980 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fff3d01-d3b5-4323-992a-a8542065926b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.718442 4707 generic.go:334] "Generic (PLEG): container finished" podID="6fff3d01-d3b5-4323-992a-a8542065926b" containerID="97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0" exitCode=0 Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.718496 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mpbh2" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.718513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerDied","Data":"97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0"} Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.718780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mpbh2" event={"ID":"6fff3d01-d3b5-4323-992a-a8542065926b","Type":"ContainerDied","Data":"71bd3e36c051854dae428b349d00f0dfde7fcc9299f91ae33ca7a902bf7009b7"} Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.718798 4707 scope.go:117] "RemoveContainer" containerID="97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.744929 4707 scope.go:117] "RemoveContainer" containerID="fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.752889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mpbh2"] Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.755661 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mpbh2"] Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.762081 4707 scope.go:117] "RemoveContainer" containerID="22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.793750 4707 scope.go:117] "RemoveContainer" containerID="97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0" Jan 21 15:57:40 crc kubenswrapper[4707]: E0121 15:57:40.794143 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0\": container with ID starting with 97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0 not found: ID does not exist" containerID="97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.794258 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0"} err="failed to get container status \"97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0\": rpc error: code = NotFound desc = could not find container \"97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0\": container with ID starting with 97b056d1cfb37a0f52d5b9a0d05e8d5f611047dda78e5bf74b7e9f8b2ea23bd0 not found: ID does not exist" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.794361 4707 scope.go:117] "RemoveContainer" containerID="fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317" Jan 21 15:57:40 crc kubenswrapper[4707]: E0121 15:57:40.794653 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317\": container with ID starting with fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317 not found: ID does not exist" containerID="fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.794684 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317"} err="failed to get container status \"fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317\": rpc error: code = NotFound desc = could not find container \"fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317\": container with ID starting with fdce7123cdfea2849b51db524ce1ff3c5f1b80645969fa2334924641eacf1317 not found: ID does not exist" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.794704 4707 scope.go:117] "RemoveContainer" containerID="22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e" Jan 21 15:57:40 crc kubenswrapper[4707]: E0121 15:57:40.794955 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e\": container with ID starting with 22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e not found: ID does not exist" containerID="22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e" Jan 21 15:57:40 crc kubenswrapper[4707]: I0121 15:57:40.794978 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e"} err="failed to get container status \"22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e\": rpc error: code = NotFound desc = could not find container \"22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e\": container with ID starting with 22c3d3d4c5edccc3a99e74ba17b0a33922b21344fd5d5d44dd4e73b92da7501e not found: ID does not exist" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.010433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.010571 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not 
found Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.010593 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.010649 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift podName:f9e3466a-372e-4d53-9e3b-807d8403f17a nodeName:}" failed. No retries permitted until 2026-01-21 15:57:45.010632665 +0000 UTC m=+3362.192148898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift") pod "swift-storage-0" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a") : configmap "swift-ring-files" not found Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.189704 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" path="/var/lib/kubelet/pods/6fff3d01-d3b5-4323-992a-a8542065926b/volumes" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.190405 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" path="/var/lib/kubelet/pods/da46cb19-a3be-4378-a82f-47f24de3e2b1/volumes" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.375470 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-grmhl"] Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.375896 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="extract-utilities" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.375907 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="extract-utilities" Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.375926 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="registry-server" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.375932 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="registry-server" Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.375943 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="extract-utilities" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.375949 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="extract-utilities" Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.375956 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="extract-content" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.375961 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="extract-content" Jan 21 15:57:41 crc kubenswrapper[4707]: E0121 15:57:41.375970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="extract-content" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.375983 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="extract-content" Jan 21 15:57:41 crc 
kubenswrapper[4707]: E0121 15:57:41.376006 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="registry-server" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.376012 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="registry-server" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.376136 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fff3d01-d3b5-4323-992a-a8542065926b" containerName="registry-server" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.376156 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="da46cb19-a3be-4378-a82f-47f24de3e2b1" containerName="registry-server" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.376579 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.379140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.387215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-grmhl"] Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.416494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7kn\" (UniqueName: \"kubernetes.io/projected/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-kube-api-access-6f7kn\") pod \"root-account-create-update-grmhl\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.416692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-operator-scripts\") pod \"root-account-create-update-grmhl\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.518467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7kn\" (UniqueName: \"kubernetes.io/projected/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-kube-api-access-6f7kn\") pod \"root-account-create-update-grmhl\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.518649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-operator-scripts\") pod \"root-account-create-update-grmhl\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.519350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-operator-scripts\") pod \"root-account-create-update-grmhl\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.531445 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7kn\" (UniqueName: \"kubernetes.io/projected/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-kube-api-access-6f7kn\") pod \"root-account-create-update-grmhl\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:41 crc kubenswrapper[4707]: I0121 15:57:41.690323 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:42 crc kubenswrapper[4707]: I0121 15:57:42.046984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-grmhl"] Jan 21 15:57:42 crc kubenswrapper[4707]: I0121 15:57:42.740109 4707 generic.go:334] "Generic (PLEG): container finished" podID="06e10ad1-26ff-40fd-a3d6-d495462ecfd0" containerID="6069a182333f4865e5bff3fe2585423f7e539e8ae4f687c86c63a5733d4ed4a3" exitCode=0 Jan 21 15:57:42 crc kubenswrapper[4707]: I0121 15:57:42.740211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-grmhl" event={"ID":"06e10ad1-26ff-40fd-a3d6-d495462ecfd0","Type":"ContainerDied","Data":"6069a182333f4865e5bff3fe2585423f7e539e8ae4f687c86c63a5733d4ed4a3"} Jan 21 15:57:42 crc kubenswrapper[4707]: I0121 15:57:42.740355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-grmhl" event={"ID":"06e10ad1-26ff-40fd-a3d6-d495462ecfd0","Type":"ContainerStarted","Data":"350a4a1b0016b7246ef6e5565b73e9bae9e45546acb38a129d8db1ac3c87f4a4"} Jan 21 15:57:43 crc kubenswrapper[4707]: I0121 15:57:43.746989 4707 generic.go:334] "Generic (PLEG): container finished" podID="5074e861-4d4c-4290-9c96-206e7d0b8f6d" containerID="1e8e5b9cb01eeda526ec62456f9b5e1ddeccd8b52cd2f55d760453d82d8aabdf" exitCode=0 Jan 21 15:57:43 crc kubenswrapper[4707]: I0121 15:57:43.747060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" event={"ID":"5074e861-4d4c-4290-9c96-206e7d0b8f6d","Type":"ContainerDied","Data":"1e8e5b9cb01eeda526ec62456f9b5e1ddeccd8b52cd2f55d760453d82d8aabdf"} Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.016825 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.059088 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-75f22"] Jan 21 15:57:44 crc kubenswrapper[4707]: E0121 15:57:44.059430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e10ad1-26ff-40fd-a3d6-d495462ecfd0" containerName="mariadb-account-create-update" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.059447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e10ad1-26ff-40fd-a3d6-d495462ecfd0" containerName="mariadb-account-create-update" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.059580 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e10ad1-26ff-40fd-a3d6-d495462ecfd0" containerName="mariadb-account-create-update" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.060073 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.064212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-75f22"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.069096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f7kn\" (UniqueName: \"kubernetes.io/projected/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-kube-api-access-6f7kn\") pod \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.069189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-operator-scripts\") pod \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\" (UID: \"06e10ad1-26ff-40fd-a3d6-d495462ecfd0\") " Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.070492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06e10ad1-26ff-40fd-a3d6-d495462ecfd0" (UID: "06e10ad1-26ff-40fd-a3d6-d495462ecfd0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.075419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-kube-api-access-6f7kn" (OuterVolumeSpecName: "kube-api-access-6f7kn") pod "06e10ad1-26ff-40fd-a3d6-d495462ecfd0" (UID: "06e10ad1-26ff-40fd-a3d6-d495462ecfd0"). InnerVolumeSpecName "kube-api-access-6f7kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.161353 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.169594 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.170519 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.171959 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.174239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbmg\" (UniqueName: \"kubernetes.io/projected/a17672dc-c0ba-40d8-807c-9e1325921ed7-kube-api-access-2cbmg\") pod \"keystone-db-create-75f22\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.174446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17672dc-c0ba-40d8-807c-9e1325921ed7-operator-scripts\") pod \"keystone-db-create-75f22\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.174826 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f7kn\" (UniqueName: \"kubernetes.io/projected/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-kube-api-access-6f7kn\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.174862 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e10ad1-26ff-40fd-a3d6-d495462ecfd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.276533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dl7v\" (UniqueName: \"kubernetes.io/projected/1d47255a-8169-490c-88be-57ffaabcb2d3-kube-api-access-9dl7v\") pod \"keystone-bd81-account-create-update-xxz6k\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.276628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d47255a-8169-490c-88be-57ffaabcb2d3-operator-scripts\") pod \"keystone-bd81-account-create-update-xxz6k\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.276652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbmg\" (UniqueName: \"kubernetes.io/projected/a17672dc-c0ba-40d8-807c-9e1325921ed7-kube-api-access-2cbmg\") pod \"keystone-db-create-75f22\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.276675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17672dc-c0ba-40d8-807c-9e1325921ed7-operator-scripts\") pod \"keystone-db-create-75f22\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " 
pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.277268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17672dc-c0ba-40d8-807c-9e1325921ed7-operator-scripts\") pod \"keystone-db-create-75f22\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.289134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cbmg\" (UniqueName: \"kubernetes.io/projected/a17672dc-c0ba-40d8-807c-9e1325921ed7-kube-api-access-2cbmg\") pod \"keystone-db-create-75f22\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.377152 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-zt2mk"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.377776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dl7v\" (UniqueName: \"kubernetes.io/projected/1d47255a-8169-490c-88be-57ffaabcb2d3-kube-api-access-9dl7v\") pod \"keystone-bd81-account-create-update-xxz6k\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.377906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d47255a-8169-490c-88be-57ffaabcb2d3-operator-scripts\") pod \"keystone-bd81-account-create-update-xxz6k\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.378026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.378706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d47255a-8169-490c-88be-57ffaabcb2d3-operator-scripts\") pod \"keystone-bd81-account-create-update-xxz6k\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.385045 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zt2mk"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.390593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dl7v\" (UniqueName: \"kubernetes.io/projected/1d47255a-8169-490c-88be-57ffaabcb2d3-kube-api-access-9dl7v\") pod \"keystone-bd81-account-create-update-xxz6k\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.401655 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.479766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4q78\" (UniqueName: \"kubernetes.io/projected/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-kube-api-access-v4q78\") pod \"placement-db-create-zt2mk\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.480007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-operator-scripts\") pod \"placement-db-create-zt2mk\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.486849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.500884 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.501751 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.505132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.514098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.581662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4q78\" (UniqueName: \"kubernetes.io/projected/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-kube-api-access-v4q78\") pod \"placement-db-create-zt2mk\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.581783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c55d01-3387-4a00-be62-2fa602109b06-operator-scripts\") pod \"placement-7ef3-account-create-update-ptt6g\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.581932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhcv\" (UniqueName: \"kubernetes.io/projected/11c55d01-3387-4a00-be62-2fa602109b06-kube-api-access-zvhcv\") pod \"placement-7ef3-account-create-update-ptt6g\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.581976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-operator-scripts\") pod \"placement-db-create-zt2mk\" (UID: 
\"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.582746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-operator-scripts\") pod \"placement-db-create-zt2mk\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.596657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4q78\" (UniqueName: \"kubernetes.io/projected/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-kube-api-access-v4q78\") pod \"placement-db-create-zt2mk\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.683859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c55d01-3387-4a00-be62-2fa602109b06-operator-scripts\") pod \"placement-7ef3-account-create-update-ptt6g\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.683925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhcv\" (UniqueName: \"kubernetes.io/projected/11c55d01-3387-4a00-be62-2fa602109b06-kube-api-access-zvhcv\") pod \"placement-7ef3-account-create-update-ptt6g\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.684493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c55d01-3387-4a00-be62-2fa602109b06-operator-scripts\") pod \"placement-7ef3-account-create-update-ptt6g\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.697223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhcv\" (UniqueName: \"kubernetes.io/projected/11c55d01-3387-4a00-be62-2fa602109b06-kube-api-access-zvhcv\") pod \"placement-7ef3-account-create-update-ptt6g\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.716395 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.754381 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-grmhl" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.754379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-grmhl" event={"ID":"06e10ad1-26ff-40fd-a3d6-d495462ecfd0","Type":"ContainerDied","Data":"350a4a1b0016b7246ef6e5565b73e9bae9e45546acb38a129d8db1ac3c87f4a4"} Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.754427 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350a4a1b0016b7246ef6e5565b73e9bae9e45546acb38a129d8db1ac3c87f4a4" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.788643 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-75f22"] Jan 21 15:57:44 crc kubenswrapper[4707]: W0121 15:57:44.796190 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda17672dc_c0ba_40d8_807c_9e1325921ed7.slice/crio-0d9c17bbebea54ee80338b0ee9ac838df29308e719690a6ee2e8942f0041dd27 WatchSource:0}: Error finding container 0d9c17bbebea54ee80338b0ee9ac838df29308e719690a6ee2e8942f0041dd27: Status 404 returned error can't find the container with id 0d9c17bbebea54ee80338b0ee9ac838df29308e719690a6ee2e8942f0041dd27 Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.819340 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.865488 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-gbpp8"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.866612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.873209 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gbpp8"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.879585 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k"] Jan 21 15:57:44 crc kubenswrapper[4707]: W0121 15:57:44.884412 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d47255a_8169_490c_88be_57ffaabcb2d3.slice/crio-9b20a8390e4a0ed108e0e4943aff40fd4d0cbabecf9ae462189d55bfd382e25b WatchSource:0}: Error finding container 9b20a8390e4a0ed108e0e4943aff40fd4d0cbabecf9ae462189d55bfd382e25b: Status 404 returned error can't find the container with id 9b20a8390e4a0ed108e0e4943aff40fd4d0cbabecf9ae462189d55bfd382e25b Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.960697 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-789wf"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.961541 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.965393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.969948 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-789wf"] Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.991400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ffba812-ff2b-453b-92fb-2e669935e242-operator-scripts\") pod \"glance-db-create-gbpp8\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.991482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvctp\" (UniqueName: \"kubernetes.io/projected/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-kube-api-access-pvctp\") pod \"glance-82ac-account-create-update-789wf\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.991543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-operator-scripts\") pod \"glance-82ac-account-create-update-789wf\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:44 crc kubenswrapper[4707]: I0121 15:57:44.991632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8r9\" (UniqueName: \"kubernetes.io/projected/9ffba812-ff2b-453b-92fb-2e669935e242-kube-api-access-dl8r9\") pod \"glance-db-create-gbpp8\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.093027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.093853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ffba812-ff2b-453b-92fb-2e669935e242-operator-scripts\") pod \"glance-db-create-gbpp8\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.093886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvctp\" (UniqueName: \"kubernetes.io/projected/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-kube-api-access-pvctp\") pod \"glance-82ac-account-create-update-789wf\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.093950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-operator-scripts\") pod \"glance-82ac-account-create-update-789wf\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.093995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8r9\" (UniqueName: \"kubernetes.io/projected/9ffba812-ff2b-453b-92fb-2e669935e242-kube-api-access-dl8r9\") pod \"glance-db-create-gbpp8\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.095064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ffba812-ff2b-453b-92fb-2e669935e242-operator-scripts\") pod \"glance-db-create-gbpp8\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.095115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-operator-scripts\") pod \"glance-82ac-account-create-update-789wf\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.097007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"swift-storage-0\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.106511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8r9\" (UniqueName: \"kubernetes.io/projected/9ffba812-ff2b-453b-92fb-2e669935e242-kube-api-access-dl8r9\") pod \"glance-db-create-gbpp8\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.109223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvctp\" (UniqueName: \"kubernetes.io/projected/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-kube-api-access-pvctp\") pod \"glance-82ac-account-create-update-789wf\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.182646 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:57:45 crc kubenswrapper[4707]: E0121 15:57:45.182912 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.189142 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.241562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zt2mk"] Jan 21 15:57:45 crc kubenswrapper[4707]: W0121 15:57:45.259753 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dbab3d8_0477_4503_a6d1_310a3c0dfda3.slice/crio-4f00fc589e5f113adea0b4682270a10d975008fb01a59adfe0dabd2779bb445a WatchSource:0}: Error finding container 4f00fc589e5f113adea0b4682270a10d975008fb01a59adfe0dabd2779bb445a: Status 404 returned error can't find the container with id 4f00fc589e5f113adea0b4682270a10d975008fb01a59adfe0dabd2779bb445a Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.265337 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.283475 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5074e861-4d4c-4290-9c96-206e7d0b8f6d-etc-swift\") pod \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-ring-data-devices\") pod \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-scripts\") pod \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-dispersionconf\") pod \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-combined-ca-bundle\") pod \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwnw7\" (UniqueName: \"kubernetes.io/projected/5074e861-4d4c-4290-9c96-206e7d0b8f6d-kube-api-access-vwnw7\") pod \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.297642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-swiftconf\") pod 
\"5074e861-4d4c-4290-9c96-206e7d0b8f6d\" (UID: \"5074e861-4d4c-4290-9c96-206e7d0b8f6d\") " Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.298030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5074e861-4d4c-4290-9c96-206e7d0b8f6d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.299759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.303556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5074e861-4d4c-4290-9c96-206e7d0b8f6d-kube-api-access-vwnw7" (OuterVolumeSpecName: "kube-api-access-vwnw7") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "kube-api-access-vwnw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.309872 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.314550 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.325757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.326925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g"] Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.327902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-scripts" (OuterVolumeSpecName: "scripts") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.336372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5074e861-4d4c-4290-9c96-206e7d0b8f6d" (UID: "5074e861-4d4c-4290-9c96-206e7d0b8f6d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400302 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwnw7\" (UniqueName: \"kubernetes.io/projected/5074e861-4d4c-4290-9c96-206e7d0b8f6d-kube-api-access-vwnw7\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400323 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400333 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5074e861-4d4c-4290-9c96-206e7d0b8f6d-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400342 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400350 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5074e861-4d4c-4290-9c96-206e7d0b8f6d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400359 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.400386 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5074e861-4d4c-4290-9c96-206e7d0b8f6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.554001 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gbpp8"] Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.693027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 15:57:45 crc kubenswrapper[4707]: W0121 15:57:45.697572 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e3466a_372e_4d53_9e3b_807d8403f17a.slice/crio-59391ec14f4899ff2e8480562b605be820b6e687dbeb9f1d53fda43145cc3c82 WatchSource:0}: Error finding container 59391ec14f4899ff2e8480562b605be820b6e687dbeb9f1d53fda43145cc3c82: Status 404 returned error can't find the container with id 59391ec14f4899ff2e8480562b605be820b6e687dbeb9f1d53fda43145cc3c82 Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.754464 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-789wf"] Jan 21 15:57:45 crc kubenswrapper[4707]: W0121 15:57:45.761183 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6679ca9d_4a6e_446d_bf8a_fb91fc35f740.slice/crio-ff698ba3e6738ca42ed317918a2cd0d575ea60313b1c1e6ac6751c23e48f9882 WatchSource:0}: Error finding container ff698ba3e6738ca42ed317918a2cd0d575ea60313b1c1e6ac6751c23e48f9882: Status 404 returned error can't find the container with id ff698ba3e6738ca42ed317918a2cd0d575ea60313b1c1e6ac6751c23e48f9882 Jan 21 15:57:45 crc 
kubenswrapper[4707]: I0121 15:57:45.762419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gbpp8" event={"ID":"9ffba812-ff2b-453b-92fb-2e669935e242","Type":"ContainerStarted","Data":"2defb6091bb38073f9508d56ce37c68e9278659fbf74c33a5bb235151648243d"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.763947 4707 generic.go:334] "Generic (PLEG): container finished" podID="11c55d01-3387-4a00-be62-2fa602109b06" containerID="dcee4cc518ae11b51683e86bac3ad963d1df8d8a2067d8861c9950d6a0949189" exitCode=0 Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.763992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" event={"ID":"11c55d01-3387-4a00-be62-2fa602109b06","Type":"ContainerDied","Data":"dcee4cc518ae11b51683e86bac3ad963d1df8d8a2067d8861c9950d6a0949189"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.764011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" event={"ID":"11c55d01-3387-4a00-be62-2fa602109b06","Type":"ContainerStarted","Data":"ebfd01bbcae9b872f246e3bdde40dbaebce3f839071e2d234c213627db85880a"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.770256 4707 generic.go:334] "Generic (PLEG): container finished" podID="a17672dc-c0ba-40d8-807c-9e1325921ed7" containerID="cfa23d2d252b51b7251f2ad8cf18b083481d8f7870dd55f120337919c9261e22" exitCode=0 Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.770317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-75f22" event={"ID":"a17672dc-c0ba-40d8-807c-9e1325921ed7","Type":"ContainerDied","Data":"cfa23d2d252b51b7251f2ad8cf18b083481d8f7870dd55f120337919c9261e22"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.770333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-75f22" event={"ID":"a17672dc-c0ba-40d8-807c-9e1325921ed7","Type":"ContainerStarted","Data":"0d9c17bbebea54ee80338b0ee9ac838df29308e719690a6ee2e8942f0041dd27"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.776219 4707 generic.go:334] "Generic (PLEG): container finished" podID="1d47255a-8169-490c-88be-57ffaabcb2d3" containerID="90cf9dd71194d32afd09ab818cff3462448499ba00cfc5818ba70273315fb6fc" exitCode=0 Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.776305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" event={"ID":"1d47255a-8169-490c-88be-57ffaabcb2d3","Type":"ContainerDied","Data":"90cf9dd71194d32afd09ab818cff3462448499ba00cfc5818ba70273315fb6fc"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.776432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" event={"ID":"1d47255a-8169-490c-88be-57ffaabcb2d3","Type":"ContainerStarted","Data":"9b20a8390e4a0ed108e0e4943aff40fd4d0cbabecf9ae462189d55bfd382e25b"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.790064 4707 generic.go:334] "Generic (PLEG): container finished" podID="8dbab3d8-0477-4503-a6d1-310a3c0dfda3" containerID="fb9886b5992389d717a7412d85598e041bc98e41d12e67e8c72ca77e926c376d" exitCode=0 Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.790125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-zt2mk" 
event={"ID":"8dbab3d8-0477-4503-a6d1-310a3c0dfda3","Type":"ContainerDied","Data":"fb9886b5992389d717a7412d85598e041bc98e41d12e67e8c72ca77e926c376d"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.790148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-zt2mk" event={"ID":"8dbab3d8-0477-4503-a6d1-310a3c0dfda3","Type":"ContainerStarted","Data":"4f00fc589e5f113adea0b4682270a10d975008fb01a59adfe0dabd2779bb445a"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.791801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" event={"ID":"5074e861-4d4c-4290-9c96-206e7d0b8f6d","Type":"ContainerDied","Data":"89706fea3fbc58aaa5171abf41642ec6b14a8c585f18ef0f509d4de39a0defc5"} Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.791848 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89706fea3fbc58aaa5171abf41642ec6b14a8c585f18ef0f509d4de39a0defc5" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.791883 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-drb2f" Jan 21 15:57:45 crc kubenswrapper[4707]: I0121 15:57:45.796103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"59391ec14f4899ff2e8480562b605be820b6e687dbeb9f1d53fda43145cc3c82"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.845165 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ffba812-ff2b-453b-92fb-2e669935e242" containerID="0c414677c6601250942e98f55dc20db5fd68ccfc1904b7f748f1fbc358f6b862" exitCode=0 Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.845407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gbpp8" event={"ID":"9ffba812-ff2b-453b-92fb-2e669935e242","Type":"ContainerDied","Data":"0c414677c6601250942e98f55dc20db5fd68ccfc1904b7f748f1fbc358f6b862"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.853078 4707 generic.go:334] "Generic (PLEG): container finished" podID="6679ca9d-4a6e-446d-bf8a-fb91fc35f740" containerID="17525babd176f0edcd5f654572aaf712e365128834f9b88edc42dd4fb5ce4907" exitCode=0 Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.853144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" event={"ID":"6679ca9d-4a6e-446d-bf8a-fb91fc35f740","Type":"ContainerDied","Data":"17525babd176f0edcd5f654572aaf712e365128834f9b88edc42dd4fb5ce4907"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.853163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" event={"ID":"6679ca9d-4a6e-446d-bf8a-fb91fc35f740","Type":"ContainerStarted","Data":"ff698ba3e6738ca42ed317918a2cd0d575ea60313b1c1e6ac6751c23e48f9882"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"83dd8a63c20c1ec70a57c216ecd7483961393678c54904f2df7480290173458d"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"6c38bd2f0b39209b01d982cd4fee04e83c6c81e7941bf608676b8abd00f2a31b"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"20bd79ad3c444a59a7f1c99aa1748e0831cf7601587ba19e2fad1093aa7ed199"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"d0267ab63d88eb5d69c8abe4f8eab3c146378376ad2f3f267d6c6d0e34f4b1e9"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"13acb9441c4c075d19f2e5354fe2e984c78df04fd88ec52f38421d65c42ab238"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"cf5083b5384e2bc9efc2cd7c1c61330323f0a7a94248846ec194cdbbdf9f9085"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"6340fee114b99b10644db682b6b0b87773342ed0330aee24fb6bb4739c6c6b4b"} Jan 21 15:57:46 crc kubenswrapper[4707]: I0121 15:57:46.862906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"42e0fb7f1e1361b1e98d0c8a660d5f6cbd7733ca6677f678924d9b430951360e"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.353762 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.437008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvhcv\" (UniqueName: \"kubernetes.io/projected/11c55d01-3387-4a00-be62-2fa602109b06-kube-api-access-zvhcv\") pod \"11c55d01-3387-4a00-be62-2fa602109b06\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.437139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c55d01-3387-4a00-be62-2fa602109b06-operator-scripts\") pod \"11c55d01-3387-4a00-be62-2fa602109b06\" (UID: \"11c55d01-3387-4a00-be62-2fa602109b06\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.438344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c55d01-3387-4a00-be62-2fa602109b06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11c55d01-3387-4a00-be62-2fa602109b06" (UID: "11c55d01-3387-4a00-be62-2fa602109b06"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.444498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c55d01-3387-4a00-be62-2fa602109b06-kube-api-access-zvhcv" (OuterVolumeSpecName: "kube-api-access-zvhcv") pod "11c55d01-3387-4a00-be62-2fa602109b06" (UID: "11c55d01-3387-4a00-be62-2fa602109b06"). InnerVolumeSpecName "kube-api-access-zvhcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.468094 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.476324 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.494864 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.538691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-operator-scripts\") pod \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.538734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cbmg\" (UniqueName: \"kubernetes.io/projected/a17672dc-c0ba-40d8-807c-9e1325921ed7-kube-api-access-2cbmg\") pod \"a17672dc-c0ba-40d8-807c-9e1325921ed7\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.538842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dl7v\" (UniqueName: \"kubernetes.io/projected/1d47255a-8169-490c-88be-57ffaabcb2d3-kube-api-access-9dl7v\") pod \"1d47255a-8169-490c-88be-57ffaabcb2d3\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.538865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4q78\" (UniqueName: \"kubernetes.io/projected/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-kube-api-access-v4q78\") pod \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\" (UID: \"8dbab3d8-0477-4503-a6d1-310a3c0dfda3\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.538898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d47255a-8169-490c-88be-57ffaabcb2d3-operator-scripts\") pod \"1d47255a-8169-490c-88be-57ffaabcb2d3\" (UID: \"1d47255a-8169-490c-88be-57ffaabcb2d3\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.538917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17672dc-c0ba-40d8-807c-9e1325921ed7-operator-scripts\") pod \"a17672dc-c0ba-40d8-807c-9e1325921ed7\" (UID: \"a17672dc-c0ba-40d8-807c-9e1325921ed7\") " Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.539824 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c55d01-3387-4a00-be62-2fa602109b06-operator-scripts\") on 
node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.539848 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvhcv\" (UniqueName: \"kubernetes.io/projected/11c55d01-3387-4a00-be62-2fa602109b06-kube-api-access-zvhcv\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.540027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dbab3d8-0477-4503-a6d1-310a3c0dfda3" (UID: "8dbab3d8-0477-4503-a6d1-310a3c0dfda3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.540158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17672dc-c0ba-40d8-807c-9e1325921ed7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a17672dc-c0ba-40d8-807c-9e1325921ed7" (UID: "a17672dc-c0ba-40d8-807c-9e1325921ed7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.540482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d47255a-8169-490c-88be-57ffaabcb2d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d47255a-8169-490c-88be-57ffaabcb2d3" (UID: "1d47255a-8169-490c-88be-57ffaabcb2d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.542086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d47255a-8169-490c-88be-57ffaabcb2d3-kube-api-access-9dl7v" (OuterVolumeSpecName: "kube-api-access-9dl7v") pod "1d47255a-8169-490c-88be-57ffaabcb2d3" (UID: "1d47255a-8169-490c-88be-57ffaabcb2d3"). InnerVolumeSpecName "kube-api-access-9dl7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.542252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-kube-api-access-v4q78" (OuterVolumeSpecName: "kube-api-access-v4q78") pod "8dbab3d8-0477-4503-a6d1-310a3c0dfda3" (UID: "8dbab3d8-0477-4503-a6d1-310a3c0dfda3"). InnerVolumeSpecName "kube-api-access-v4q78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.542404 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a17672dc-c0ba-40d8-807c-9e1325921ed7-kube-api-access-2cbmg" (OuterVolumeSpecName: "kube-api-access-2cbmg") pod "a17672dc-c0ba-40d8-807c-9e1325921ed7" (UID: "a17672dc-c0ba-40d8-807c-9e1325921ed7"). InnerVolumeSpecName "kube-api-access-2cbmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.641355 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4q78\" (UniqueName: \"kubernetes.io/projected/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-kube-api-access-v4q78\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.641553 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dl7v\" (UniqueName: \"kubernetes.io/projected/1d47255a-8169-490c-88be-57ffaabcb2d3-kube-api-access-9dl7v\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.641565 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d47255a-8169-490c-88be-57ffaabcb2d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.641574 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a17672dc-c0ba-40d8-807c-9e1325921ed7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.641582 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbab3d8-0477-4503-a6d1-310a3c0dfda3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.641590 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cbmg\" (UniqueName: \"kubernetes.io/projected/a17672dc-c0ba-40d8-807c-9e1325921ed7-kube-api-access-2cbmg\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.761510 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-grmhl"] Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.765601 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-grmhl"] Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.871588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" event={"ID":"11c55d01-3387-4a00-be62-2fa602109b06","Type":"ContainerDied","Data":"ebfd01bbcae9b872f246e3bdde40dbaebce3f839071e2d234c213627db85880a"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.871623 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebfd01bbcae9b872f246e3bdde40dbaebce3f839071e2d234c213627db85880a" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.871594 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.873134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-75f22" event={"ID":"a17672dc-c0ba-40d8-807c-9e1325921ed7","Type":"ContainerDied","Data":"0d9c17bbebea54ee80338b0ee9ac838df29308e719690a6ee2e8942f0041dd27"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.873165 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d9c17bbebea54ee80338b0ee9ac838df29308e719690a6ee2e8942f0041dd27" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.873142 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-75f22" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.874789 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.874791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k" event={"ID":"1d47255a-8169-490c-88be-57ffaabcb2d3","Type":"ContainerDied","Data":"9b20a8390e4a0ed108e0e4943aff40fd4d0cbabecf9ae462189d55bfd382e25b"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.874912 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b20a8390e4a0ed108e0e4943aff40fd4d0cbabecf9ae462189d55bfd382e25b" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.891073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-zt2mk" event={"ID":"8dbab3d8-0477-4503-a6d1-310a3c0dfda3","Type":"ContainerDied","Data":"4f00fc589e5f113adea0b4682270a10d975008fb01a59adfe0dabd2779bb445a"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.891105 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f00fc589e5f113adea0b4682270a10d975008fb01a59adfe0dabd2779bb445a" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.891145 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-zt2mk" Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"a2ba006a14c1f79b7783113eb2fc356a7828b8e8106c77fb842a484c74dc2c5a"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"bed3e53b753f14dc32e9b1b6ab9f45190e2e849c1bd85308d5fb68b7ea133e5f"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"6f893355c882d903714e129e409f5ce4d4643085b9619147b4afbb2525f3b1e6"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"e47594c788362325c27848cc7da59200a8816e14df22f36bd527fc0d841b218a"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"c72a20daf0844253b29487d24bb0e8d55053110bc5b67076c0d93fa4b17a0fcf"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896383 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"90550d64477b14b9ef1a55ac4d5a33af74701b40b0ce866af985b7931988d125"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.896390 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerStarted","Data":"c50db20a23ae77d375778f281e89cb6339b6ce4bdaa193bba212e82b99bf20f3"} Jan 21 15:57:47 crc kubenswrapper[4707]: I0121 15:57:47.923871 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=11.923857943 podStartE2EDuration="11.923857943s" podCreationTimestamp="2026-01-21 15:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:47.922719652 +0000 UTC m=+3365.104235874" watchObservedRunningTime="2026-01-21 15:57:47.923857943 +0000 UTC m=+3365.105374165" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp"] Jan 21 15:57:48 crc kubenswrapper[4707]: E0121 15:57:48.019765 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c55d01-3387-4a00-be62-2fa602109b06" containerName="mariadb-account-create-update" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c55d01-3387-4a00-be62-2fa602109b06" containerName="mariadb-account-create-update" Jan 21 15:57:48 crc kubenswrapper[4707]: E0121 15:57:48.019800 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17672dc-c0ba-40d8-807c-9e1325921ed7" containerName="mariadb-database-create" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019818 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17672dc-c0ba-40d8-807c-9e1325921ed7" containerName="mariadb-database-create" Jan 21 15:57:48 crc kubenswrapper[4707]: E0121 15:57:48.019825 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbab3d8-0477-4503-a6d1-310a3c0dfda3" containerName="mariadb-database-create" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019831 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbab3d8-0477-4503-a6d1-310a3c0dfda3" containerName="mariadb-database-create" Jan 21 15:57:48 crc kubenswrapper[4707]: E0121 15:57:48.019836 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5074e861-4d4c-4290-9c96-206e7d0b8f6d" containerName="swift-ring-rebalance" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019842 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5074e861-4d4c-4290-9c96-206e7d0b8f6d" containerName="swift-ring-rebalance" Jan 21 15:57:48 crc kubenswrapper[4707]: E0121 15:57:48.019852 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d47255a-8169-490c-88be-57ffaabcb2d3" containerName="mariadb-account-create-update" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019857 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d47255a-8169-490c-88be-57ffaabcb2d3" containerName="mariadb-account-create-update" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.019982 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17672dc-c0ba-40d8-807c-9e1325921ed7" containerName="mariadb-database-create" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.020000 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbab3d8-0477-4503-a6d1-310a3c0dfda3" containerName="mariadb-database-create" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.020014 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d47255a-8169-490c-88be-57ffaabcb2d3" containerName="mariadb-account-create-update" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.020021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5074e861-4d4c-4290-9c96-206e7d0b8f6d" containerName="swift-ring-rebalance" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.020026 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c55d01-3387-4a00-be62-2fa602109b06" containerName="mariadb-account-create-update" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.020669 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.022800 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.031521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp"] Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.090547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.148965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-config\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.149195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.149292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwt7p\" (UniqueName: \"kubernetes.io/projected/b1ab8226-4400-48d8-908f-ef4d0778ef1e-kube-api-access-dwt7p\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.149328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.250849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwt7p\" (UniqueName: \"kubernetes.io/projected/b1ab8226-4400-48d8-908f-ef4d0778ef1e-kube-api-access-dwt7p\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.250899 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.251035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-config\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.251059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.251672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.251746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.252256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-config\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.264864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwt7p\" (UniqueName: \"kubernetes.io/projected/b1ab8226-4400-48d8-908f-ef4d0778ef1e-kube-api-access-dwt7p\") pod \"dnsmasq-dnsmasq-6db984dd9-bb7fp\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.291940 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.297492 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.352137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvctp\" (UniqueName: \"kubernetes.io/projected/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-kube-api-access-pvctp\") pod \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.352210 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-operator-scripts\") pod \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\" (UID: \"6679ca9d-4a6e-446d-bf8a-fb91fc35f740\") " Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.352245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ffba812-ff2b-453b-92fb-2e669935e242-operator-scripts\") pod \"9ffba812-ff2b-453b-92fb-2e669935e242\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.352272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl8r9\" (UniqueName: \"kubernetes.io/projected/9ffba812-ff2b-453b-92fb-2e669935e242-kube-api-access-dl8r9\") pod \"9ffba812-ff2b-453b-92fb-2e669935e242\" (UID: \"9ffba812-ff2b-453b-92fb-2e669935e242\") " Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.354213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6679ca9d-4a6e-446d-bf8a-fb91fc35f740" (UID: "6679ca9d-4a6e-446d-bf8a-fb91fc35f740"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.354331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffba812-ff2b-453b-92fb-2e669935e242-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ffba812-ff2b-453b-92fb-2e669935e242" (UID: "9ffba812-ff2b-453b-92fb-2e669935e242"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.355364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffba812-ff2b-453b-92fb-2e669935e242-kube-api-access-dl8r9" (OuterVolumeSpecName: "kube-api-access-dl8r9") pod "9ffba812-ff2b-453b-92fb-2e669935e242" (UID: "9ffba812-ff2b-453b-92fb-2e669935e242"). InnerVolumeSpecName "kube-api-access-dl8r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.356321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-kube-api-access-pvctp" (OuterVolumeSpecName: "kube-api-access-pvctp") pod "6679ca9d-4a6e-446d-bf8a-fb91fc35f740" (UID: "6679ca9d-4a6e-446d-bf8a-fb91fc35f740"). InnerVolumeSpecName "kube-api-access-pvctp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.380441 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.453375 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvctp\" (UniqueName: \"kubernetes.io/projected/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-kube-api-access-pvctp\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.453395 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6679ca9d-4a6e-446d-bf8a-fb91fc35f740-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.453404 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ffba812-ff2b-453b-92fb-2e669935e242-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.453411 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl8r9\" (UniqueName: \"kubernetes.io/projected/9ffba812-ff2b-453b-92fb-2e669935e242-kube-api-access-dl8r9\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.758130 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp"] Jan 21 15:57:48 crc kubenswrapper[4707]: W0121 15:57:48.760116 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1ab8226_4400_48d8_908f_ef4d0778ef1e.slice/crio-dbab3a85540ff33b93c8dc9ebb98fbcede275e1081b091a3a903950d85c5e174 WatchSource:0}: Error finding container dbab3a85540ff33b93c8dc9ebb98fbcede275e1081b091a3a903950d85c5e174: Status 404 returned error can't find the container with id dbab3a85540ff33b93c8dc9ebb98fbcede275e1081b091a3a903950d85c5e174 Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.904650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gbpp8" event={"ID":"9ffba812-ff2b-453b-92fb-2e669935e242","Type":"ContainerDied","Data":"2defb6091bb38073f9508d56ce37c68e9278659fbf74c33a5bb235151648243d"} Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.904688 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2defb6091bb38073f9508d56ce37c68e9278659fbf74c33a5bb235151648243d" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.904690 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gbpp8" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.907249 4707 generic.go:334] "Generic (PLEG): container finished" podID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerID="33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078" exitCode=0 Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.907310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" event={"ID":"b1ab8226-4400-48d8-908f-ef4d0778ef1e","Type":"ContainerDied","Data":"33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078"} Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.907326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" event={"ID":"b1ab8226-4400-48d8-908f-ef4d0778ef1e","Type":"ContainerStarted","Data":"dbab3a85540ff33b93c8dc9ebb98fbcede275e1081b091a3a903950d85c5e174"} Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.909220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.909229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-82ac-account-create-update-789wf" event={"ID":"6679ca9d-4a6e-446d-bf8a-fb91fc35f740","Type":"ContainerDied","Data":"ff698ba3e6738ca42ed317918a2cd0d575ea60313b1c1e6ac6751c23e48f9882"} Jan 21 15:57:48 crc kubenswrapper[4707]: I0121 15:57:48.909302 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff698ba3e6738ca42ed317918a2cd0d575ea60313b1c1e6ac6751c23e48f9882" Jan 21 15:57:49 crc kubenswrapper[4707]: I0121 15:57:49.191095 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e10ad1-26ff-40fd-a3d6-d495462ecfd0" path="/var/lib/kubelet/pods/06e10ad1-26ff-40fd-a3d6-d495462ecfd0/volumes" Jan 21 15:57:49 crc kubenswrapper[4707]: I0121 15:57:49.916391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" event={"ID":"b1ab8226-4400-48d8-908f-ef4d0778ef1e","Type":"ContainerStarted","Data":"49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065"} Jan 21 15:57:49 crc kubenswrapper[4707]: I0121 15:57:49.916586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:49 crc kubenswrapper[4707]: I0121 15:57:49.933194 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" podStartSLOduration=1.933179755 podStartE2EDuration="1.933179755s" podCreationTimestamp="2026-01-21 15:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:49.927789328 +0000 UTC m=+3367.109305550" watchObservedRunningTime="2026-01-21 15:57:49.933179755 +0000 UTC m=+3367.114695976" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.129044 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-kzkp8"] Jan 21 15:57:50 crc kubenswrapper[4707]: E0121 15:57:50.129358 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6679ca9d-4a6e-446d-bf8a-fb91fc35f740" containerName="mariadb-account-create-update" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 
15:57:50.129374 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6679ca9d-4a6e-446d-bf8a-fb91fc35f740" containerName="mariadb-account-create-update" Jan 21 15:57:50 crc kubenswrapper[4707]: E0121 15:57:50.129397 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffba812-ff2b-453b-92fb-2e669935e242" containerName="mariadb-database-create" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.129403 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffba812-ff2b-453b-92fb-2e669935e242" containerName="mariadb-database-create" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.129521 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6679ca9d-4a6e-446d-bf8a-fb91fc35f740" containerName="mariadb-account-create-update" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.129540 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffba812-ff2b-453b-92fb-2e669935e242" containerName="mariadb-database-create" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.130041 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.133183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dkchd" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.135418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-kzkp8"] Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.135827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.174326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-db-sync-config-data\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.174464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-config-data\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.174585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-combined-ca-bundle\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.174614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ptv\" (UniqueName: \"kubernetes.io/projected/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-kube-api-access-b8ptv\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.275622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-combined-ca-bundle\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.275659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ptv\" (UniqueName: \"kubernetes.io/projected/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-kube-api-access-b8ptv\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.275718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-db-sync-config-data\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.275781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-config-data\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.280881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-db-sync-config-data\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.280975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-combined-ca-bundle\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.282193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-config-data\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.290426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ptv\" (UniqueName: \"kubernetes.io/projected/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-kube-api-access-b8ptv\") pod \"glance-db-sync-kzkp8\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.442302 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.803730 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-kzkp8"] Jan 21 15:57:50 crc kubenswrapper[4707]: I0121 15:57:50.926789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" event={"ID":"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd","Type":"ContainerStarted","Data":"5af5f42d6442bd45de0cc0310ecf196f4f3fbc3298e505f3098e16671cdf70d4"} Jan 21 15:57:51 crc kubenswrapper[4707]: I0121 15:57:51.933945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" event={"ID":"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd","Type":"ContainerStarted","Data":"67b9016cf0d63654ebd9f4125a86329f8bb17d84c9a5f607da5903d91c3570ef"} Jan 21 15:57:51 crc kubenswrapper[4707]: I0121 15:57:51.946820 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" podStartSLOduration=1.9467905509999999 podStartE2EDuration="1.946790551s" podCreationTimestamp="2026-01-21 15:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:51.943568753 +0000 UTC m=+3369.125084974" watchObservedRunningTime="2026-01-21 15:57:51.946790551 +0000 UTC m=+3369.128306763" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.756016 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fh2md"] Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.757061 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.759588 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.763110 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fh2md"] Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.812427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtfjt\" (UniqueName: \"kubernetes.io/projected/abdf2ebb-d634-4cce-b6ea-573e7aeae062-kube-api-access-qtfjt\") pod \"root-account-create-update-fh2md\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.812464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abdf2ebb-d634-4cce-b6ea-573e7aeae062-operator-scripts\") pod \"root-account-create-update-fh2md\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.913301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtfjt\" (UniqueName: \"kubernetes.io/projected/abdf2ebb-d634-4cce-b6ea-573e7aeae062-kube-api-access-qtfjt\") pod \"root-account-create-update-fh2md\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:52 crc kubenswrapper[4707]: 
I0121 15:57:52.913337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abdf2ebb-d634-4cce-b6ea-573e7aeae062-operator-scripts\") pod \"root-account-create-update-fh2md\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.914244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abdf2ebb-d634-4cce-b6ea-573e7aeae062-operator-scripts\") pod \"root-account-create-update-fh2md\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:52 crc kubenswrapper[4707]: I0121 15:57:52.929751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtfjt\" (UniqueName: \"kubernetes.io/projected/abdf2ebb-d634-4cce-b6ea-573e7aeae062-kube-api-access-qtfjt\") pod \"root-account-create-update-fh2md\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.071414 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:53 crc kubenswrapper[4707]: W0121 15:57:53.441433 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdf2ebb_d634_4cce_b6ea_573e7aeae062.slice/crio-f8d8cf0f8cad95a7a059706dc1f57d25049dee16a04c0f5145d7157cbfc2ad17 WatchSource:0}: Error finding container f8d8cf0f8cad95a7a059706dc1f57d25049dee16a04c0f5145d7157cbfc2ad17: Status 404 returned error can't find the container with id f8d8cf0f8cad95a7a059706dc1f57d25049dee16a04c0f5145d7157cbfc2ad17 Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.441540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fh2md"] Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.953214 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerID="6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679" exitCode=0 Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.953298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1c37d5e5-4ffd-412c-a93a-86cb6474735c","Type":"ContainerDied","Data":"6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679"} Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.954337 4707 generic.go:334] "Generic (PLEG): container finished" podID="abdf2ebb-d634-4cce-b6ea-573e7aeae062" containerID="0f2dfde4fe3c90981f347f6443b260ee7f62e38ea81b5bbcfaef544400577e6d" exitCode=0 Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.954384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fh2md" event={"ID":"abdf2ebb-d634-4cce-b6ea-573e7aeae062","Type":"ContainerDied","Data":"0f2dfde4fe3c90981f347f6443b260ee7f62e38ea81b5bbcfaef544400577e6d"} Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.954400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fh2md" 
event={"ID":"abdf2ebb-d634-4cce-b6ea-573e7aeae062","Type":"ContainerStarted","Data":"f8d8cf0f8cad95a7a059706dc1f57d25049dee16a04c0f5145d7157cbfc2ad17"} Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.955657 4707 generic.go:334] "Generic (PLEG): container finished" podID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerID="322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab" exitCode=0 Jan 21 15:57:53 crc kubenswrapper[4707]: I0121 15:57:53.955677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"50ded377-6ace-4f39-8a60-bee575f7e8cc","Type":"ContainerDied","Data":"322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab"} Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.962647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1c37d5e5-4ffd-412c-a93a-86cb6474735c","Type":"ContainerStarted","Data":"2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813"} Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.963746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.965149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"50ded377-6ace-4f39-8a60-bee575f7e8cc","Type":"ContainerStarted","Data":"d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0"} Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.965556 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.967345 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" containerID="67b9016cf0d63654ebd9f4125a86329f8bb17d84c9a5f607da5903d91c3570ef" exitCode=0 Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.967448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" event={"ID":"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd","Type":"ContainerDied","Data":"67b9016cf0d63654ebd9f4125a86329f8bb17d84c9a5f607da5903d91c3570ef"} Jan 21 15:57:54 crc kubenswrapper[4707]: I0121 15:57:54.983067 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=35.983053474 podStartE2EDuration="35.983053474s" podCreationTimestamp="2026-01-21 15:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:54.982085013 +0000 UTC m=+3372.163601235" watchObservedRunningTime="2026-01-21 15:57:54.983053474 +0000 UTC m=+3372.164569726" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.028494 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.0284777 podStartE2EDuration="36.0284777s" podCreationTimestamp="2026-01-21 15:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:57:55.016866617 +0000 UTC m=+3372.198382838" watchObservedRunningTime="2026-01-21 15:57:55.0284777 +0000 UTC m=+3372.209993923" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.290820 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.351628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtfjt\" (UniqueName: \"kubernetes.io/projected/abdf2ebb-d634-4cce-b6ea-573e7aeae062-kube-api-access-qtfjt\") pod \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.351673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abdf2ebb-d634-4cce-b6ea-573e7aeae062-operator-scripts\") pod \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\" (UID: \"abdf2ebb-d634-4cce-b6ea-573e7aeae062\") " Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.352163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abdf2ebb-d634-4cce-b6ea-573e7aeae062-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abdf2ebb-d634-4cce-b6ea-573e7aeae062" (UID: "abdf2ebb-d634-4cce-b6ea-573e7aeae062"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.352404 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abdf2ebb-d634-4cce-b6ea-573e7aeae062-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.355602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdf2ebb-d634-4cce-b6ea-573e7aeae062-kube-api-access-qtfjt" (OuterVolumeSpecName: "kube-api-access-qtfjt") pod "abdf2ebb-d634-4cce-b6ea-573e7aeae062" (UID: "abdf2ebb-d634-4cce-b6ea-573e7aeae062"). InnerVolumeSpecName "kube-api-access-qtfjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.453622 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtfjt\" (UniqueName: \"kubernetes.io/projected/abdf2ebb-d634-4cce-b6ea-573e7aeae062-kube-api-access-qtfjt\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.974595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fh2md" event={"ID":"abdf2ebb-d634-4cce-b6ea-573e7aeae062","Type":"ContainerDied","Data":"f8d8cf0f8cad95a7a059706dc1f57d25049dee16a04c0f5145d7157cbfc2ad17"} Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.975306 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d8cf0f8cad95a7a059706dc1f57d25049dee16a04c0f5145d7157cbfc2ad17" Jan 21 15:57:55 crc kubenswrapper[4707]: I0121 15:57:55.974737 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fh2md" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.244695 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.264247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-config-data\") pod \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.264326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8ptv\" (UniqueName: \"kubernetes.io/projected/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-kube-api-access-b8ptv\") pod \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.264379 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-db-sync-config-data\") pod \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.264398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-combined-ca-bundle\") pod \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\" (UID: \"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd\") " Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.268018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-kube-api-access-b8ptv" (OuterVolumeSpecName: "kube-api-access-b8ptv") pod "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" (UID: "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd"). InnerVolumeSpecName "kube-api-access-b8ptv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.271224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" (UID: "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.283137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" (UID: "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.295951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-config-data" (OuterVolumeSpecName: "config-data") pod "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" (UID: "f4ac5b39-8cde-4b9d-9b74-b09b6be633bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.366397 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.366426 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.366438 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.366448 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8ptv\" (UniqueName: \"kubernetes.io/projected/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd-kube-api-access-b8ptv\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.982139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" event={"ID":"f4ac5b39-8cde-4b9d-9b74-b09b6be633bd","Type":"ContainerDied","Data":"5af5f42d6442bd45de0cc0310ecf196f4f3fbc3298e505f3098e16671cdf70d4"} Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.982176 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af5f42d6442bd45de0cc0310ecf196f4f3fbc3298e505f3098e16671cdf70d4" Jan 21 15:57:56 crc kubenswrapper[4707]: I0121 15:57:56.982192 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-kzkp8" Jan 21 15:57:57 crc kubenswrapper[4707]: I0121 15:57:57.183185 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:57:57 crc kubenswrapper[4707]: E0121 15:57:57.183592 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.381921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.429218 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6"] Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.429411 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerName="dnsmasq-dns" containerID="cri-o://8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc" gracePeriod=10 Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.829484 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.900859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfkzr\" (UniqueName: \"kubernetes.io/projected/bb62efde-9e0c-429f-85ed-7a251716c8d9-kube-api-access-qfkzr\") pod \"bb62efde-9e0c-429f-85ed-7a251716c8d9\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.900944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-dnsmasq-svc\") pod \"bb62efde-9e0c-429f-85ed-7a251716c8d9\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.901014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-config\") pod \"bb62efde-9e0c-429f-85ed-7a251716c8d9\" (UID: \"bb62efde-9e0c-429f-85ed-7a251716c8d9\") " Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.905267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb62efde-9e0c-429f-85ed-7a251716c8d9-kube-api-access-qfkzr" (OuterVolumeSpecName: "kube-api-access-qfkzr") pod "bb62efde-9e0c-429f-85ed-7a251716c8d9" (UID: "bb62efde-9e0c-429f-85ed-7a251716c8d9"). InnerVolumeSpecName "kube-api-access-qfkzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.927387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "bb62efde-9e0c-429f-85ed-7a251716c8d9" (UID: "bb62efde-9e0c-429f-85ed-7a251716c8d9"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.930015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-config" (OuterVolumeSpecName: "config") pod "bb62efde-9e0c-429f-85ed-7a251716c8d9" (UID: "bb62efde-9e0c-429f-85ed-7a251716c8d9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.994422 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerID="8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc" exitCode=0 Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.994458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" event={"ID":"bb62efde-9e0c-429f-85ed-7a251716c8d9","Type":"ContainerDied","Data":"8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc"} Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.994482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" event={"ID":"bb62efde-9e0c-429f-85ed-7a251716c8d9","Type":"ContainerDied","Data":"eb011424c38a47c771085b80af1a5ac94f22c9d2e21b32f502c7915b4cf54a74"} Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.994497 4707 scope.go:117] "RemoveContainer" containerID="8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc" Jan 21 15:57:58 crc kubenswrapper[4707]: I0121 15:57:58.994595 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.002354 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.002378 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfkzr\" (UniqueName: \"kubernetes.io/projected/bb62efde-9e0c-429f-85ed-7a251716c8d9-kube-api-access-qfkzr\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.002388 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/bb62efde-9e0c-429f-85ed-7a251716c8d9-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.011878 4707 scope.go:117] "RemoveContainer" containerID="bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.017346 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6"] Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.024594 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5z8p6"] Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.027547 4707 scope.go:117] "RemoveContainer" containerID="8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc" Jan 21 15:57:59 crc kubenswrapper[4707]: E0121 15:57:59.027898 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc\": container with ID starting with 8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc not found: ID does not exist" containerID="8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.027977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc"} err="failed to get container status \"8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc\": rpc error: code = NotFound desc = could not find container \"8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc\": container with ID starting with 8154547c39b7b6e86ffde9c1e783fb843b75ceb081c1665288e15f8029be44bc not found: ID does not exist" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.028060 4707 scope.go:117] "RemoveContainer" containerID="bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699" Jan 21 15:57:59 crc kubenswrapper[4707]: E0121 15:57:59.028412 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699\": container with ID starting with bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699 not found: ID does not exist" containerID="bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.028439 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699"} err="failed to get container status \"bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699\": rpc error: code = NotFound desc = could not find container \"bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699\": container with ID starting with bb90d261335c4acb9fd816b8252b1bf1520770a5df7eb95052e0211a089da699 not found: ID does not exist" Jan 21 15:57:59 crc kubenswrapper[4707]: I0121 15:57:59.190426 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" path="/var/lib/kubelet/pods/bb62efde-9e0c-429f-85ed-7a251716c8d9/volumes" Jan 21 15:58:09 crc kubenswrapper[4707]: I0121 15:58:09.526367 4707 scope.go:117] "RemoveContainer" containerID="a5766d5bf168913788538335dbb1615ed79f7befcea3074dcb7e8bac5a9a3946" Jan 21 15:58:09 crc kubenswrapper[4707]: I0121 15:58:09.543683 4707 scope.go:117] "RemoveContainer" containerID="741b7a02e8ff9cfd2005241421caecf0b4bf770c7ec62312f29cd16e26f0ef89" Jan 21 15:58:10 crc kubenswrapper[4707]: I0121 15:58:10.822998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.001950 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-q6hgd"] Jan 21 15:58:11 crc kubenswrapper[4707]: E0121 15:58:11.002249 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" containerName="glance-db-sync" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002267 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" containerName="glance-db-sync" Jan 21 15:58:11 crc kubenswrapper[4707]: E0121 15:58:11.002292 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerName="init" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002299 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerName="init" Jan 21 15:58:11 crc kubenswrapper[4707]: E0121 15:58:11.002308 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerName="dnsmasq-dns" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002313 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerName="dnsmasq-dns" Jan 21 15:58:11 crc kubenswrapper[4707]: E0121 15:58:11.002327 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdf2ebb-d634-4cce-b6ea-573e7aeae062" containerName="mariadb-account-create-update" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002333 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdf2ebb-d634-4cce-b6ea-573e7aeae062" containerName="mariadb-account-create-update" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002487 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdf2ebb-d634-4cce-b6ea-573e7aeae062" containerName="mariadb-account-create-update" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb62efde-9e0c-429f-85ed-7a251716c8d9" containerName="dnsmasq-dns" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002505 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" containerName="glance-db-sync" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.002972 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.022823 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-q6hgd"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.069180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.110723 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zfw44"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.111738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.116144 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.117015 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.119049 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.126161 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zfw44"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.145134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.154873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-operator-scripts\") pod \"cinder-db-create-q6hgd\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.154941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrz2d\" (UniqueName: \"kubernetes.io/projected/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-kube-api-access-wrz2d\") pod \"cinder-db-create-q6hgd\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.201193 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.202092 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.207140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.213911 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.256516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2th\" (UniqueName: \"kubernetes.io/projected/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-kube-api-access-qh2th\") pod \"barbican-db-create-zfw44\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.256596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-operator-scripts\") pod \"barbican-db-create-zfw44\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.256671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-operator-scripts\") pod \"cinder-db-create-q6hgd\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.256708 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9eb7db96-fb55-4d76-9275-d06f77115530-operator-scripts\") pod \"cinder-fcb5-account-create-update-f8hm9\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.256749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brb8f\" (UniqueName: \"kubernetes.io/projected/9eb7db96-fb55-4d76-9275-d06f77115530-kube-api-access-brb8f\") pod \"cinder-fcb5-account-create-update-f8hm9\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.256852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrz2d\" (UniqueName: \"kubernetes.io/projected/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-kube-api-access-wrz2d\") pod \"cinder-db-create-q6hgd\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.257394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-operator-scripts\") pod \"cinder-db-create-q6hgd\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.275139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrz2d\" (UniqueName: \"kubernetes.io/projected/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-kube-api-access-wrz2d\") pod \"cinder-db-create-q6hgd\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.317261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.325661 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-l2fmp"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.326481 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.329083 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.329391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vnmw4" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.329911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.331668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.338676 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-l2fmp"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.358505 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2th\" (UniqueName: \"kubernetes.io/projected/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-kube-api-access-qh2th\") pod \"barbican-db-create-zfw44\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.358627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-operator-scripts\") pod \"barbican-db-create-zfw44\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.358747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c611a8f-8f60-428c-9fb9-c2009be3534b-operator-scripts\") pod \"barbican-4b91-account-create-update-j98pz\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.358767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9eb7db96-fb55-4d76-9275-d06f77115530-operator-scripts\") pod \"cinder-fcb5-account-create-update-f8hm9\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.358825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brb8f\" (UniqueName: \"kubernetes.io/projected/9eb7db96-fb55-4d76-9275-d06f77115530-kube-api-access-brb8f\") pod \"cinder-fcb5-account-create-update-f8hm9\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.358925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc9mf\" (UniqueName: \"kubernetes.io/projected/1c611a8f-8f60-428c-9fb9-c2009be3534b-kube-api-access-jc9mf\") pod \"barbican-4b91-account-create-update-j98pz\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc 
kubenswrapper[4707]: I0121 15:58:11.359411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-operator-scripts\") pod \"barbican-db-create-zfw44\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.359461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9eb7db96-fb55-4d76-9275-d06f77115530-operator-scripts\") pod \"cinder-fcb5-account-create-update-f8hm9\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.373649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2th\" (UniqueName: \"kubernetes.io/projected/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-kube-api-access-qh2th\") pod \"barbican-db-create-zfw44\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.374829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brb8f\" (UniqueName: \"kubernetes.io/projected/9eb7db96-fb55-4d76-9275-d06f77115530-kube-api-access-brb8f\") pod \"cinder-fcb5-account-create-update-f8hm9\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.427936 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-9ln95"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.428860 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.429130 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.440013 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.443213 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.446521 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.448603 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.454674 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-9ln95"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.459983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-combined-ca-bundle\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.460030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-config-data\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.460058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c611a8f-8f60-428c-9fb9-c2009be3534b-operator-scripts\") pod \"barbican-4b91-account-create-update-j98pz\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.460074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjkk\" (UniqueName: \"kubernetes.io/projected/22f781fc-51ea-457e-b20f-129e4eaa8649-kube-api-access-jpjkk\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.460121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc9mf\" (UniqueName: \"kubernetes.io/projected/1c611a8f-8f60-428c-9fb9-c2009be3534b-kube-api-access-jc9mf\") pod \"barbican-4b91-account-create-update-j98pz\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.461155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c611a8f-8f60-428c-9fb9-c2009be3534b-operator-scripts\") pod \"barbican-4b91-account-create-update-j98pz\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.462238 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.481384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc9mf\" (UniqueName: \"kubernetes.io/projected/1c611a8f-8f60-428c-9fb9-c2009be3534b-kube-api-access-jc9mf\") pod \"barbican-4b91-account-create-update-j98pz\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " 
pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.520260 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.561567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a99ebc-2042-44a8-862c-e3508fdecf5c-operator-scripts\") pod \"neutron-db-create-9ln95\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.561600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4fj\" (UniqueName: \"kubernetes.io/projected/025cd7bf-ed7d-4465-8d3a-60257e90bdef-kube-api-access-8k4fj\") pod \"neutron-a0e7-account-create-update-82vfd\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.561630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbwb\" (UniqueName: \"kubernetes.io/projected/47a99ebc-2042-44a8-862c-e3508fdecf5c-kube-api-access-ffbwb\") pod \"neutron-db-create-9ln95\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.561654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-config-data\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.561672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjkk\" (UniqueName: \"kubernetes.io/projected/22f781fc-51ea-457e-b20f-129e4eaa8649-kube-api-access-jpjkk\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.561777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/025cd7bf-ed7d-4465-8d3a-60257e90bdef-operator-scripts\") pod \"neutron-a0e7-account-create-update-82vfd\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.562035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-combined-ca-bundle\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.565077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-config-data\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " 
pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.565689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-combined-ca-bundle\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.578514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjkk\" (UniqueName: \"kubernetes.io/projected/22f781fc-51ea-457e-b20f-129e4eaa8649-kube-api-access-jpjkk\") pod \"keystone-db-sync-l2fmp\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.664244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a99ebc-2042-44a8-862c-e3508fdecf5c-operator-scripts\") pod \"neutron-db-create-9ln95\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.664334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4fj\" (UniqueName: \"kubernetes.io/projected/025cd7bf-ed7d-4465-8d3a-60257e90bdef-kube-api-access-8k4fj\") pod \"neutron-a0e7-account-create-update-82vfd\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.664372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbwb\" (UniqueName: \"kubernetes.io/projected/47a99ebc-2042-44a8-862c-e3508fdecf5c-kube-api-access-ffbwb\") pod \"neutron-db-create-9ln95\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.664410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/025cd7bf-ed7d-4465-8d3a-60257e90bdef-operator-scripts\") pod \"neutron-a0e7-account-create-update-82vfd\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.664972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a99ebc-2042-44a8-862c-e3508fdecf5c-operator-scripts\") pod \"neutron-db-create-9ln95\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.665773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/025cd7bf-ed7d-4465-8d3a-60257e90bdef-operator-scripts\") pod \"neutron-a0e7-account-create-update-82vfd\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.682603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4fj\" (UniqueName: 
\"kubernetes.io/projected/025cd7bf-ed7d-4465-8d3a-60257e90bdef-kube-api-access-8k4fj\") pod \"neutron-a0e7-account-create-update-82vfd\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.688340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbwb\" (UniqueName: \"kubernetes.io/projected/47a99ebc-2042-44a8-862c-e3508fdecf5c-kube-api-access-ffbwb\") pod \"neutron-db-create-9ln95\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.723036 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.752917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-q6hgd"] Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.753356 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:11 crc kubenswrapper[4707]: W0121 15:58:11.771413 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod951a6efd_7bf7_48ea_a8f5_de1ad1c03b08.slice/crio-495a8cbdb02ca282d6b6a598ac7676a902a62f0ae0f4249f8e95003f6dc347e9 WatchSource:0}: Error finding container 495a8cbdb02ca282d6b6a598ac7676a902a62f0ae0f4249f8e95003f6dc347e9: Status 404 returned error can't find the container with id 495a8cbdb02ca282d6b6a598ac7676a902a62f0ae0f4249f8e95003f6dc347e9 Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.791050 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.882035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zfw44"] Jan 21 15:58:11 crc kubenswrapper[4707]: W0121 15:58:11.892507 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode75f12e9_2aae_4b40_9120_bc66aad8d0c1.slice/crio-c711e30ccc753a0b8c2d1a8045cf91211b53291dd31e71fd7ca19e3ae2b530de WatchSource:0}: Error finding container c711e30ccc753a0b8c2d1a8045cf91211b53291dd31e71fd7ca19e3ae2b530de: Status 404 returned error can't find the container with id c711e30ccc753a0b8c2d1a8045cf91211b53291dd31e71fd7ca19e3ae2b530de Jan 21 15:58:11 crc kubenswrapper[4707]: I0121 15:58:11.964519 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9"] Jan 21 15:58:11 crc kubenswrapper[4707]: W0121 15:58:11.979779 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eb7db96_fb55_4d76_9275_d06f77115530.slice/crio-24dba0888235d712f5effa21c4bd40c62d2f3665b7d550eb9b1f1859b23513db WatchSource:0}: Error finding container 24dba0888235d712f5effa21c4bd40c62d2f3665b7d550eb9b1f1859b23513db: Status 404 returned error can't find the container with id 24dba0888235d712f5effa21c4bd40c62d2f3665b7d550eb9b1f1859b23513db Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.011000 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz"] Jan 21 15:58:12 crc kubenswrapper[4707]: W0121 15:58:12.020005 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c611a8f_8f60_428c_9fb9_c2009be3534b.slice/crio-889c0dcf29a0739600bc83b5d687cf2000e8df502a651b878261c2e73212fec3 WatchSource:0}: Error finding container 889c0dcf29a0739600bc83b5d687cf2000e8df502a651b878261c2e73212fec3: Status 404 returned error can't find the container with id 889c0dcf29a0739600bc83b5d687cf2000e8df502a651b878261c2e73212fec3 Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.102575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" event={"ID":"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08","Type":"ContainerStarted","Data":"01d7064ebbc3f6ab8fc1f264955b1dffe3fed3888e517ae25228c53c2f45394f"} Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.102631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" event={"ID":"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08","Type":"ContainerStarted","Data":"495a8cbdb02ca282d6b6a598ac7676a902a62f0ae0f4249f8e95003f6dc347e9"} Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.121112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" event={"ID":"9eb7db96-fb55-4d76-9275-d06f77115530","Type":"ContainerStarted","Data":"24dba0888235d712f5effa21c4bd40c62d2f3665b7d550eb9b1f1859b23513db"} Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.129563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zfw44" 
event={"ID":"e75f12e9-2aae-4b40-9120-bc66aad8d0c1","Type":"ContainerStarted","Data":"bda8d3a244ebe6eb8a2cc33162b6791f9a27ee5b8d1ae3c29cbca5c78a213910"} Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.129603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zfw44" event={"ID":"e75f12e9-2aae-4b40-9120-bc66aad8d0c1","Type":"ContainerStarted","Data":"c711e30ccc753a0b8c2d1a8045cf91211b53291dd31e71fd7ca19e3ae2b530de"} Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.130124 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" podStartSLOduration=2.130104359 podStartE2EDuration="2.130104359s" podCreationTimestamp="2026-01-21 15:58:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:12.122361739 +0000 UTC m=+3389.303877961" watchObservedRunningTime="2026-01-21 15:58:12.130104359 +0000 UTC m=+3389.311620581" Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.130769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" event={"ID":"1c611a8f-8f60-428c-9fb9-c2009be3534b","Type":"ContainerStarted","Data":"889c0dcf29a0739600bc83b5d687cf2000e8df502a651b878261c2e73212fec3"} Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.147426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-l2fmp"] Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.152552 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-create-zfw44" podStartSLOduration=1.15254157 podStartE2EDuration="1.15254157s" podCreationTimestamp="2026-01-21 15:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:12.149736626 +0000 UTC m=+3389.331252848" watchObservedRunningTime="2026-01-21 15:58:12.15254157 +0000 UTC m=+3389.334057792" Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.182406 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:58:12 crc kubenswrapper[4707]: E0121 15:58:12.182613 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.239892 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-9ln95"] Jan 21 15:58:12 crc kubenswrapper[4707]: I0121 15:58:12.279136 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd"] Jan 21 15:58:12 crc kubenswrapper[4707]: W0121 15:58:12.285174 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47a99ebc_2042_44a8_862c_e3508fdecf5c.slice/crio-dfa5ba076bd6873d39984fc26348170954286453506c979a49051d8a60d6f0a9 WatchSource:0}: Error finding container 
dfa5ba076bd6873d39984fc26348170954286453506c979a49051d8a60d6f0a9: Status 404 returned error can't find the container with id dfa5ba076bd6873d39984fc26348170954286453506c979a49051d8a60d6f0a9 Jan 21 15:58:12 crc kubenswrapper[4707]: W0121 15:58:12.302070 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod025cd7bf_ed7d_4465_8d3a_60257e90bdef.slice/crio-f1d29ae2b3c222719a0b7fadea58012f50172cebbb9c5c67ac568cc3c2c3a11c WatchSource:0}: Error finding container f1d29ae2b3c222719a0b7fadea58012f50172cebbb9c5c67ac568cc3c2c3a11c: Status 404 returned error can't find the container with id f1d29ae2b3c222719a0b7fadea58012f50172cebbb9c5c67ac568cc3c2c3a11c Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.139838 4707 generic.go:334] "Generic (PLEG): container finished" podID="025cd7bf-ed7d-4465-8d3a-60257e90bdef" containerID="4eb952b277a1133feb396b43f7c020ea9b855f0beb6c6313e95dfccf24dfca6d" exitCode=0 Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.139939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" event={"ID":"025cd7bf-ed7d-4465-8d3a-60257e90bdef","Type":"ContainerDied","Data":"4eb952b277a1133feb396b43f7c020ea9b855f0beb6c6313e95dfccf24dfca6d"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.140108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" event={"ID":"025cd7bf-ed7d-4465-8d3a-60257e90bdef","Type":"ContainerStarted","Data":"f1d29ae2b3c222719a0b7fadea58012f50172cebbb9c5c67ac568cc3c2c3a11c"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.141854 4707 generic.go:334] "Generic (PLEG): container finished" podID="951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" containerID="01d7064ebbc3f6ab8fc1f264955b1dffe3fed3888e517ae25228c53c2f45394f" exitCode=0 Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.141916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" event={"ID":"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08","Type":"ContainerDied","Data":"01d7064ebbc3f6ab8fc1f264955b1dffe3fed3888e517ae25228c53c2f45394f"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.143485 4707 generic.go:334] "Generic (PLEG): container finished" podID="9eb7db96-fb55-4d76-9275-d06f77115530" containerID="9c5d8efdd505bcf5028e65801a46ebaa299c40cfdb0c320de7af53fbfb3b7410" exitCode=0 Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.143540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" event={"ID":"9eb7db96-fb55-4d76-9275-d06f77115530","Type":"ContainerDied","Data":"9c5d8efdd505bcf5028e65801a46ebaa299c40cfdb0c320de7af53fbfb3b7410"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.144910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" event={"ID":"22f781fc-51ea-457e-b20f-129e4eaa8649","Type":"ContainerStarted","Data":"b451099dbd5ad32127a7989542980a00d90d79b3ce6572d7316028bc4d8c18c4"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.144938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" event={"ID":"22f781fc-51ea-457e-b20f-129e4eaa8649","Type":"ContainerStarted","Data":"b569cd850293d86bac08fb95b08eeae92d70d10c1ff5523af8dd741918e7192e"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.146655 4707 generic.go:334] 
"Generic (PLEG): container finished" podID="e75f12e9-2aae-4b40-9120-bc66aad8d0c1" containerID="bda8d3a244ebe6eb8a2cc33162b6791f9a27ee5b8d1ae3c29cbca5c78a213910" exitCode=0 Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.146719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zfw44" event={"ID":"e75f12e9-2aae-4b40-9120-bc66aad8d0c1","Type":"ContainerDied","Data":"bda8d3a244ebe6eb8a2cc33162b6791f9a27ee5b8d1ae3c29cbca5c78a213910"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.147880 4707 generic.go:334] "Generic (PLEG): container finished" podID="47a99ebc-2042-44a8-862c-e3508fdecf5c" containerID="ece978f12d6b5e9ac8023040ae0a54c42a5fbd1433d12812045a7b80b733a412" exitCode=0 Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.147912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-9ln95" event={"ID":"47a99ebc-2042-44a8-862c-e3508fdecf5c","Type":"ContainerDied","Data":"ece978f12d6b5e9ac8023040ae0a54c42a5fbd1433d12812045a7b80b733a412"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.147942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-9ln95" event={"ID":"47a99ebc-2042-44a8-862c-e3508fdecf5c","Type":"ContainerStarted","Data":"dfa5ba076bd6873d39984fc26348170954286453506c979a49051d8a60d6f0a9"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.149080 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c611a8f-8f60-428c-9fb9-c2009be3534b" containerID="2a059741cb9103b48bbbb54f029ad118b4c2e8e0b020439897ea13f2f44172a8" exitCode=0 Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.149109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" event={"ID":"1c611a8f-8f60-428c-9fb9-c2009be3534b","Type":"ContainerDied","Data":"2a059741cb9103b48bbbb54f029ad118b4c2e8e0b020439897ea13f2f44172a8"} Jan 21 15:58:13 crc kubenswrapper[4707]: I0121 15:58:13.178560 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" podStartSLOduration=2.178544698 podStartE2EDuration="2.178544698s" podCreationTimestamp="2026-01-21 15:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:13.176566759 +0000 UTC m=+3390.358082980" watchObservedRunningTime="2026-01-21 15:58:13.178544698 +0000 UTC m=+3390.360060920" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.156787 4707 generic.go:334] "Generic (PLEG): container finished" podID="22f781fc-51ea-457e-b20f-129e4eaa8649" containerID="b451099dbd5ad32127a7989542980a00d90d79b3ce6572d7316028bc4d8c18c4" exitCode=0 Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.156869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" event={"ID":"22f781fc-51ea-457e-b20f-129e4eaa8649","Type":"ContainerDied","Data":"b451099dbd5ad32127a7989542980a00d90d79b3ce6572d7316028bc4d8c18c4"} Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.489402 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.612981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4fj\" (UniqueName: \"kubernetes.io/projected/025cd7bf-ed7d-4465-8d3a-60257e90bdef-kube-api-access-8k4fj\") pod \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.614001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/025cd7bf-ed7d-4465-8d3a-60257e90bdef-operator-scripts\") pod \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\" (UID: \"025cd7bf-ed7d-4465-8d3a-60257e90bdef\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.614946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025cd7bf-ed7d-4465-8d3a-60257e90bdef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "025cd7bf-ed7d-4465-8d3a-60257e90bdef" (UID: "025cd7bf-ed7d-4465-8d3a-60257e90bdef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.618612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025cd7bf-ed7d-4465-8d3a-60257e90bdef-kube-api-access-8k4fj" (OuterVolumeSpecName: "kube-api-access-8k4fj") pod "025cd7bf-ed7d-4465-8d3a-60257e90bdef" (UID: "025cd7bf-ed7d-4465-8d3a-60257e90bdef"). InnerVolumeSpecName "kube-api-access-8k4fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.640133 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.655709 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.667675 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.675113 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.685830 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.716714 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4fj\" (UniqueName: \"kubernetes.io/projected/025cd7bf-ed7d-4465-8d3a-60257e90bdef-kube-api-access-8k4fj\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.716762 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/025cd7bf-ed7d-4465-8d3a-60257e90bdef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brb8f\" (UniqueName: \"kubernetes.io/projected/9eb7db96-fb55-4d76-9275-d06f77115530-kube-api-access-brb8f\") pod \"9eb7db96-fb55-4d76-9275-d06f77115530\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffbwb\" (UniqueName: \"kubernetes.io/projected/47a99ebc-2042-44a8-862c-e3508fdecf5c-kube-api-access-ffbwb\") pod \"47a99ebc-2042-44a8-862c-e3508fdecf5c\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c611a8f-8f60-428c-9fb9-c2009be3534b-operator-scripts\") pod \"1c611a8f-8f60-428c-9fb9-c2009be3534b\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh2th\" (UniqueName: \"kubernetes.io/projected/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-kube-api-access-qh2th\") pod \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc9mf\" (UniqueName: \"kubernetes.io/projected/1c611a8f-8f60-428c-9fb9-c2009be3534b-kube-api-access-jc9mf\") pod \"1c611a8f-8f60-428c-9fb9-c2009be3534b\" (UID: \"1c611a8f-8f60-428c-9fb9-c2009be3534b\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a99ebc-2042-44a8-862c-e3508fdecf5c-operator-scripts\") pod \"47a99ebc-2042-44a8-862c-e3508fdecf5c\" (UID: \"47a99ebc-2042-44a8-862c-e3508fdecf5c\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9eb7db96-fb55-4d76-9275-d06f77115530-operator-scripts\") pod \"9eb7db96-fb55-4d76-9275-d06f77115530\" (UID: \"9eb7db96-fb55-4d76-9275-d06f77115530\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-operator-scripts\") pod \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " Jan 21 15:58:14 crc 
kubenswrapper[4707]: I0121 15:58:14.818595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-operator-scripts\") pod \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\" (UID: \"e75f12e9-2aae-4b40-9120-bc66aad8d0c1\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.818639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrz2d\" (UniqueName: \"kubernetes.io/projected/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-kube-api-access-wrz2d\") pod \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\" (UID: \"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08\") " Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47a99ebc-2042-44a8-862c-e3508fdecf5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47a99ebc-2042-44a8-862c-e3508fdecf5c" (UID: "47a99ebc-2042-44a8-862c-e3508fdecf5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" (UID: "951a6efd-7bf7-48ea-a8f5-de1ad1c03b08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c611a8f-8f60-428c-9fb9-c2009be3534b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c611a8f-8f60-428c-9fb9-c2009be3534b" (UID: "1c611a8f-8f60-428c-9fb9-c2009be3534b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb7db96-fb55-4d76-9275-d06f77115530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9eb7db96-fb55-4d76-9275-d06f77115530" (UID: "9eb7db96-fb55-4d76-9275-d06f77115530"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e75f12e9-2aae-4b40-9120-bc66aad8d0c1" (UID: "e75f12e9-2aae-4b40-9120-bc66aad8d0c1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819821 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819891 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.819944 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c611a8f-8f60-428c-9fb9-c2009be3534b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.820006 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a99ebc-2042-44a8-862c-e3508fdecf5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.820056 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9eb7db96-fb55-4d76-9275-d06f77115530-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.822012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb7db96-fb55-4d76-9275-d06f77115530-kube-api-access-brb8f" (OuterVolumeSpecName: "kube-api-access-brb8f") pod "9eb7db96-fb55-4d76-9275-d06f77115530" (UID: "9eb7db96-fb55-4d76-9275-d06f77115530"). InnerVolumeSpecName "kube-api-access-brb8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.822053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a99ebc-2042-44a8-862c-e3508fdecf5c-kube-api-access-ffbwb" (OuterVolumeSpecName: "kube-api-access-ffbwb") pod "47a99ebc-2042-44a8-862c-e3508fdecf5c" (UID: "47a99ebc-2042-44a8-862c-e3508fdecf5c"). InnerVolumeSpecName "kube-api-access-ffbwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.822220 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-kube-api-access-qh2th" (OuterVolumeSpecName: "kube-api-access-qh2th") pod "e75f12e9-2aae-4b40-9120-bc66aad8d0c1" (UID: "e75f12e9-2aae-4b40-9120-bc66aad8d0c1"). InnerVolumeSpecName "kube-api-access-qh2th". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.822296 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c611a8f-8f60-428c-9fb9-c2009be3534b-kube-api-access-jc9mf" (OuterVolumeSpecName: "kube-api-access-jc9mf") pod "1c611a8f-8f60-428c-9fb9-c2009be3534b" (UID: "1c611a8f-8f60-428c-9fb9-c2009be3534b"). InnerVolumeSpecName "kube-api-access-jc9mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.822408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-kube-api-access-wrz2d" (OuterVolumeSpecName: "kube-api-access-wrz2d") pod "951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" (UID: "951a6efd-7bf7-48ea-a8f5-de1ad1c03b08"). InnerVolumeSpecName "kube-api-access-wrz2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.922211 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffbwb\" (UniqueName: \"kubernetes.io/projected/47a99ebc-2042-44a8-862c-e3508fdecf5c-kube-api-access-ffbwb\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.922250 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh2th\" (UniqueName: \"kubernetes.io/projected/e75f12e9-2aae-4b40-9120-bc66aad8d0c1-kube-api-access-qh2th\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.922259 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc9mf\" (UniqueName: \"kubernetes.io/projected/1c611a8f-8f60-428c-9fb9-c2009be3534b-kube-api-access-jc9mf\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.922268 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrz2d\" (UniqueName: \"kubernetes.io/projected/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08-kube-api-access-wrz2d\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:14 crc kubenswrapper[4707]: I0121 15:58:14.922277 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brb8f\" (UniqueName: \"kubernetes.io/projected/9eb7db96-fb55-4d76-9275-d06f77115530-kube-api-access-brb8f\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.170950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" event={"ID":"1c611a8f-8f60-428c-9fb9-c2009be3534b","Type":"ContainerDied","Data":"889c0dcf29a0739600bc83b5d687cf2000e8df502a651b878261c2e73212fec3"} Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.170986 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="889c0dcf29a0739600bc83b5d687cf2000e8df502a651b878261c2e73212fec3" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.171039 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.174186 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.174248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd" event={"ID":"025cd7bf-ed7d-4465-8d3a-60257e90bdef","Type":"ContainerDied","Data":"f1d29ae2b3c222719a0b7fadea58012f50172cebbb9c5c67ac568cc3c2c3a11c"} Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.174296 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d29ae2b3c222719a0b7fadea58012f50172cebbb9c5c67ac568cc3c2c3a11c" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.175897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" event={"ID":"951a6efd-7bf7-48ea-a8f5-de1ad1c03b08","Type":"ContainerDied","Data":"495a8cbdb02ca282d6b6a598ac7676a902a62f0ae0f4249f8e95003f6dc347e9"} Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.175930 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495a8cbdb02ca282d6b6a598ac7676a902a62f0ae0f4249f8e95003f6dc347e9" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.175976 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-q6hgd" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.181434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" event={"ID":"9eb7db96-fb55-4d76-9275-d06f77115530","Type":"ContainerDied","Data":"24dba0888235d712f5effa21c4bd40c62d2f3665b7d550eb9b1f1859b23513db"} Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.181592 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24dba0888235d712f5effa21c4bd40c62d2f3665b7d550eb9b1f1859b23513db" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.181524 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.185454 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-zfw44" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.192356 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-9ln95" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.195211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-zfw44" event={"ID":"e75f12e9-2aae-4b40-9120-bc66aad8d0c1","Type":"ContainerDied","Data":"c711e30ccc753a0b8c2d1a8045cf91211b53291dd31e71fd7ca19e3ae2b530de"} Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.195234 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c711e30ccc753a0b8c2d1a8045cf91211b53291dd31e71fd7ca19e3ae2b530de" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.195293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-9ln95" event={"ID":"47a99ebc-2042-44a8-862c-e3508fdecf5c","Type":"ContainerDied","Data":"dfa5ba076bd6873d39984fc26348170954286453506c979a49051d8a60d6f0a9"} Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.195303 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa5ba076bd6873d39984fc26348170954286453506c979a49051d8a60d6f0a9" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.395214 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.529944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpjkk\" (UniqueName: \"kubernetes.io/projected/22f781fc-51ea-457e-b20f-129e4eaa8649-kube-api-access-jpjkk\") pod \"22f781fc-51ea-457e-b20f-129e4eaa8649\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.529993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-combined-ca-bundle\") pod \"22f781fc-51ea-457e-b20f-129e4eaa8649\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.530059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-config-data\") pod \"22f781fc-51ea-457e-b20f-129e4eaa8649\" (UID: \"22f781fc-51ea-457e-b20f-129e4eaa8649\") " Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.532916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f781fc-51ea-457e-b20f-129e4eaa8649-kube-api-access-jpjkk" (OuterVolumeSpecName: "kube-api-access-jpjkk") pod "22f781fc-51ea-457e-b20f-129e4eaa8649" (UID: "22f781fc-51ea-457e-b20f-129e4eaa8649"). InnerVolumeSpecName "kube-api-access-jpjkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.546964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22f781fc-51ea-457e-b20f-129e4eaa8649" (UID: "22f781fc-51ea-457e-b20f-129e4eaa8649"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.560827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-config-data" (OuterVolumeSpecName: "config-data") pod "22f781fc-51ea-457e-b20f-129e4eaa8649" (UID: "22f781fc-51ea-457e-b20f-129e4eaa8649"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.631306 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpjkk\" (UniqueName: \"kubernetes.io/projected/22f781fc-51ea-457e-b20f-129e4eaa8649-kube-api-access-jpjkk\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.631343 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:15 crc kubenswrapper[4707]: I0121 15:58:15.631352 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f781fc-51ea-457e-b20f-129e4eaa8649-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.200072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" event={"ID":"22f781fc-51ea-457e-b20f-129e4eaa8649","Type":"ContainerDied","Data":"b569cd850293d86bac08fb95b08eeae92d70d10c1ff5523af8dd741918e7192e"} Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.200337 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b569cd850293d86bac08fb95b08eeae92d70d10c1ff5523af8dd741918e7192e" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.200101 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-l2fmp" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542037 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fsks8"] Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542322 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f781fc-51ea-457e-b20f-129e4eaa8649" containerName="keystone-db-sync" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542339 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f781fc-51ea-457e-b20f-129e4eaa8649" containerName="keystone-db-sync" Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542354 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025cd7bf-ed7d-4465-8d3a-60257e90bdef" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542360 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="025cd7bf-ed7d-4465-8d3a-60257e90bdef" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542369 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb7db96-fb55-4d76-9275-d06f77115530" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542374 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb7db96-fb55-4d76-9275-d06f77115530" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542393 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c611a8f-8f60-428c-9fb9-c2009be3534b" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542398 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c611a8f-8f60-428c-9fb9-c2009be3534b" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542411 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a99ebc-2042-44a8-862c-e3508fdecf5c" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a99ebc-2042-44a8-862c-e3508fdecf5c" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542427 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75f12e9-2aae-4b40-9120-bc66aad8d0c1" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542433 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75f12e9-2aae-4b40-9120-bc66aad8d0c1" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: E0121 15:58:16.542445 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542451 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542599 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75f12e9-2aae-4b40-9120-bc66aad8d0c1" 
containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542610 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c611a8f-8f60-428c-9fb9-c2009be3534b" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542626 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb7db96-fb55-4d76-9275-d06f77115530" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542635 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="025cd7bf-ed7d-4465-8d3a-60257e90bdef" containerName="mariadb-account-create-update" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542644 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f781fc-51ea-457e-b20f-129e4eaa8649" containerName="keystone-db-sync" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.542655 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a99ebc-2042-44a8-862c-e3508fdecf5c" containerName="mariadb-database-create" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.543107 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.544244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-config-data\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.544301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-scripts\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.544353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52zt\" (UniqueName: \"kubernetes.io/projected/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-kube-api-access-m52zt\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.544415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-fernet-keys\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.544463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-combined-ca-bundle\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.544504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-credential-keys\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.548551 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.548791 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.548956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.549001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.550637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vnmw4" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.568002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fsks8"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.645546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-config-data\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.645594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-scripts\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.645722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m52zt\" (UniqueName: \"kubernetes.io/projected/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-kube-api-access-m52zt\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.645845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-fernet-keys\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.645919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-combined-ca-bundle\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.645988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-credential-keys\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.652766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-config-data\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.654631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-fernet-keys\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.654872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-scripts\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.659250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-credential-keys\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.659466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-combined-ca-bundle\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.663464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52zt\" (UniqueName: \"kubernetes.io/projected/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-kube-api-access-m52zt\") pod \"keystone-bootstrap-fsks8\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.688405 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.690101 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.691793 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.691989 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.696758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.747005 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-p8pwz"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.748414 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.751155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-w5lrz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.751323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.761135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.764165 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7ckwj"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.765051 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.775421 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.776204 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.776522 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rvgv7" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.799398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-p8pwz"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.826189 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7ckwj"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.840377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-ntkqn"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.841410 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.843762 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.851585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.851707 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-kw2tj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-logs\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdsqz\" (UniqueName: \"kubernetes.io/projected/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-kube-api-access-fdsqz\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dvz\" (UniqueName: \"kubernetes.io/projected/a8dc14bc-b18c-4937-9319-b73bb16aced4-kube-api-access-m2dvz\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-log-httpd\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-config\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8dc14bc-b18c-4937-9319-b73bb16aced4-etc-machine-id\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-db-sync-config-data\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8n6d9\" (UniqueName: \"kubernetes.io/projected/343110b7-9660-4e3f-a56f-d7ee9f89467d-kube-api-access-8n6d9\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjn55\" (UniqueName: \"kubernetes.io/projected/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-kube-api-access-xjn55\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-scripts\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-config-data\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-scripts\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-config-data\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-run-httpd\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-combined-ca-bundle\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853572 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-combined-ca-bundle\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-scripts\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-config-data\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-combined-ca-bundle\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.853667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.856385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.877829 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-ntkqn"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.893222 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tnwlj"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.894843 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.896401 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tnwlj"] Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.896663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.896829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-9dg6p" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.954680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-run-httpd\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.954717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-combined-ca-bundle\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.954933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-combined-ca-bundle\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-scripts\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-config-data\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-combined-ca-bundle\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-run-httpd\") pod \"ceilometer-0\" (UID: 
\"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-logs\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdsqz\" (UniqueName: \"kubernetes.io/projected/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-kube-api-access-fdsqz\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dvz\" (UniqueName: \"kubernetes.io/projected/a8dc14bc-b18c-4937-9319-b73bb16aced4-kube-api-access-m2dvz\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-log-httpd\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-config\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8dc14bc-b18c-4937-9319-b73bb16aced4-etc-machine-id\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-db-sync-config-data\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n6d9\" (UniqueName: \"kubernetes.io/projected/343110b7-9660-4e3f-a56f-d7ee9f89467d-kube-api-access-8n6d9\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc 
kubenswrapper[4707]: I0121 15:58:16.955603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjn55\" (UniqueName: \"kubernetes.io/projected/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-kube-api-access-xjn55\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-scripts\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-config-data\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-scripts\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.955750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-config-data\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.956211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8dc14bc-b18c-4937-9319-b73bb16aced4-etc-machine-id\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.956464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-log-httpd\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.956738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-logs\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.962020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-db-sync-config-data\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.962192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.962917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-config-data\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.963225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.963255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-combined-ca-bundle\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.965137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-config\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.965823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-config-data\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.969256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-combined-ca-bundle\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.969791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-scripts\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.970386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-config-data\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.970643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-scripts\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.971603 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjn55\" (UniqueName: \"kubernetes.io/projected/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-kube-api-access-xjn55\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.971851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-combined-ca-bundle\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.971896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-scripts\") pod \"placement-db-sync-ntkqn\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.973455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dvz\" (UniqueName: \"kubernetes.io/projected/a8dc14bc-b18c-4937-9319-b73bb16aced4-kube-api-access-m2dvz\") pod \"cinder-db-sync-7ckwj\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.978466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdsqz\" (UniqueName: \"kubernetes.io/projected/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-kube-api-access-fdsqz\") pod \"ceilometer-0\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:16 crc kubenswrapper[4707]: I0121 15:58:16.978889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n6d9\" (UniqueName: \"kubernetes.io/projected/343110b7-9660-4e3f-a56f-d7ee9f89467d-kube-api-access-8n6d9\") pod \"neutron-db-sync-p8pwz\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.017663 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.061709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9rc\" (UniqueName: \"kubernetes.io/projected/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-kube-api-access-tm9rc\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.061763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-db-sync-config-data\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.061784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-combined-ca-bundle\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.064591 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.086103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.159695 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.163167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9rc\" (UniqueName: \"kubernetes.io/projected/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-kube-api-access-tm9rc\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.163212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-db-sync-config-data\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.163232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-combined-ca-bundle\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.166802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-db-sync-config-data\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.169315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-combined-ca-bundle\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.178736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9rc\" (UniqueName: \"kubernetes.io/projected/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-kube-api-access-tm9rc\") pod \"barbican-db-sync-tnwlj\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.239673 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.277693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fsks8"] Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.626397 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.627536 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.630093 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.630498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dkchd" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.630587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.630681 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.641576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.680544 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.681620 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.684059 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.684196 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.700128 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-logs\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87tk\" (UniqueName: \"kubernetes.io/projected/7d2d4a6b-3223-4e31-883e-d0878a5cf089-kube-api-access-g87tk\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.775603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.876751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.876798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.877886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-logs\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.877996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g87tk\" (UniqueName: \"kubernetes.io/projected/7d2d4a6b-3223-4e31-883e-d0878a5cf089-kube-api-access-g87tk\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8zl5\" (UniqueName: 
\"kubernetes.io/projected/e4cd799d-c609-4193-be39-6ff724a769d6-kube-api-access-p8zl5\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878312 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.878419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.881077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-logs\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.881784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.881971 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.882654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.884401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.904427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.924294 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.933975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.939494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87tk\" (UniqueName: \"kubernetes.io/projected/7d2d4a6b-3223-4e31-883e-d0878a5cf089-kube-api-access-g87tk\") pod \"glance-default-external-api-0\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.949400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.986758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.986865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8zl5\" (UniqueName: \"kubernetes.io/projected/e4cd799d-c609-4193-be39-6ff724a769d6-kube-api-access-p8zl5\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.986930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.986955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.986974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.987008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.987032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.987094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.987529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.988862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:17 crc kubenswrapper[4707]: I0121 15:58:17.991153 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.009374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.014337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8zl5\" (UniqueName: \"kubernetes.io/projected/e4cd799d-c609-4193-be39-6ff724a769d6-kube-api-access-p8zl5\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.018575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.018788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.034292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.050647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.144162 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:18 crc kubenswrapper[4707]: W0121 15:58:18.147553 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71bb98ab_d1f3_479b_83d7_aa2c3d2b3581.slice/crio-18ff5e29b875a9ec41b5c18847a6568da737dca7a3703229621979056638d0f1 WatchSource:0}: Error finding container 18ff5e29b875a9ec41b5c18847a6568da737dca7a3703229621979056638d0f1: Status 404 returned error can't find the container with id 18ff5e29b875a9ec41b5c18847a6568da737dca7a3703229621979056638d0f1 Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.218657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerStarted","Data":"18ff5e29b875a9ec41b5c18847a6568da737dca7a3703229621979056638d0f1"} Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.219948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" event={"ID":"6de02bec-e42c-48c1-b956-2a9ed65e7ca9","Type":"ContainerStarted","Data":"e7b31926efdf6f8be375bbbffde94813f498decaf60c78ae20ff653568fe6c82"} Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.220020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" event={"ID":"6de02bec-e42c-48c1-b956-2a9ed65e7ca9","Type":"ContainerStarted","Data":"d24059594c1aa68a7f77abbc193843ca8c41327389db1c8648312f4dbb527d22"} Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.243740 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" podStartSLOduration=2.243719571 podStartE2EDuration="2.243719571s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:18.234592738 +0000 UTC m=+3395.416108960" watchObservedRunningTime="2026-01-21 15:58:18.243719571 +0000 UTC m=+3395.425235792" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.295627 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-p8pwz"] Jan 21 15:58:18 crc kubenswrapper[4707]: W0121 15:58:18.297634 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod343110b7_9660_4e3f_a56f_d7ee9f89467d.slice/crio-72e1059c11b42f36bae04841d44667817cf95eb4863f0affd29f15a4c4d2b6e5 WatchSource:0}: Error finding container 72e1059c11b42f36bae04841d44667817cf95eb4863f0affd29f15a4c4d2b6e5: Status 404 returned error can't find the container with id 72e1059c11b42f36bae04841d44667817cf95eb4863f0affd29f15a4c4d2b6e5 Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.301843 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tnwlj"] Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.311196 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7ckwj"] Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.312574 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.325433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-ntkqn"] Jan 21 15:58:18 crc kubenswrapper[4707]: W0121 15:58:18.332649 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93fe5a42_4cc9_4ebc_af4e_b2dd20fd10ff.slice/crio-136a7b22ef1a3abdbb62078c0e264cc3121d23b4114d802c7340a44b92497b14 WatchSource:0}: Error finding container 136a7b22ef1a3abdbb62078c0e264cc3121d23b4114d802c7340a44b92497b14: Status 404 returned error can't find the container with id 136a7b22ef1a3abdbb62078c0e264cc3121d23b4114d802c7340a44b92497b14 Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.485639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.774994 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:18 crc kubenswrapper[4707]: I0121 15:58:18.975902 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.065145 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.075483 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.230643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4cd799d-c609-4193-be39-6ff724a769d6","Type":"ContainerStarted","Data":"17481431cb5e51f406a23f2a01ad6288b5826ace972860405bbc97e540e70f2a"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.233272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerStarted","Data":"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.238304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" event={"ID":"343110b7-9660-4e3f-a56f-d7ee9f89467d","Type":"ContainerStarted","Data":"c5e539d0bcc848c6d8b89ed9ace7c4bb3a1f1e4ca972075b82e0e1f0ae7b1421"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.238331 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" event={"ID":"343110b7-9660-4e3f-a56f-d7ee9f89467d","Type":"ContainerStarted","Data":"72e1059c11b42f36bae04841d44667817cf95eb4863f0affd29f15a4c4d2b6e5"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.241115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" event={"ID":"a8dc14bc-b18c-4937-9319-b73bb16aced4","Type":"ContainerStarted","Data":"743ae25b8acb1b34607a2c58c911a56beecb00bee4c4c9e5e264c4b53d7a16c1"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.241140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" event={"ID":"a8dc14bc-b18c-4937-9319-b73bb16aced4","Type":"ContainerStarted","Data":"6ecf5aff37b7a17a1547b42a254c41d8588f102e47b4a5590d090456a9f8448a"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.242409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" event={"ID":"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff","Type":"ContainerStarted","Data":"fdbdbe661ec8d6b428eafed192668501fa504abde7e9628dd63f90c1ddfffed4"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.242431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" event={"ID":"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff","Type":"ContainerStarted","Data":"136a7b22ef1a3abdbb62078c0e264cc3121d23b4114d802c7340a44b92497b14"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.243803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" event={"ID":"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae","Type":"ContainerStarted","Data":"4457eabe5f34005325f77fb3e078f6a236a4762a6d975580d7780c5eb68a0db8"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.243863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" event={"ID":"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae","Type":"ContainerStarted","Data":"0c922fc0b256e1b5244590891edb52b7e2a403c9a2d8b466e32d7af45e2a899b"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.246744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7d2d4a6b-3223-4e31-883e-d0878a5cf089","Type":"ContainerStarted","Data":"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.246772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7d2d4a6b-3223-4e31-883e-d0878a5cf089","Type":"ContainerStarted","Data":"50a86d1375a89e96aa3b7605739188871a90fb35df8f9c23781e7603dab0c655"} Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.251030 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" podStartSLOduration=3.251017925 podStartE2EDuration="3.251017925s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:19.249702361 +0000 UTC m=+3396.431218584" watchObservedRunningTime="2026-01-21 15:58:19.251017925 +0000 UTC m=+3396.432534147" Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.268548 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" podStartSLOduration=3.268533872 podStartE2EDuration="3.268533872s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:19.266673774 +0000 UTC m=+3396.448189996" watchObservedRunningTime="2026-01-21 15:58:19.268533872 +0000 UTC m=+3396.450050084" Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.292459 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" podStartSLOduration=3.2924473 podStartE2EDuration="3.2924473s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:19.280250084 +0000 UTC m=+3396.461766307" watchObservedRunningTime="2026-01-21 15:58:19.2924473 +0000 UTC m=+3396.473963522" Jan 21 15:58:19 crc kubenswrapper[4707]: I0121 15:58:19.305699 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" podStartSLOduration=3.305687136 podStartE2EDuration="3.305687136s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:19.294430579 +0000 UTC m=+3396.475946801" watchObservedRunningTime="2026-01-21 15:58:19.305687136 +0000 UTC m=+3396.487203358" Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.255049 4707 generic.go:334] "Generic (PLEG): container finished" podID="93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" containerID="fdbdbe661ec8d6b428eafed192668501fa504abde7e9628dd63f90c1ddfffed4" exitCode=0 Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.255465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" event={"ID":"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff","Type":"ContainerDied","Data":"fdbdbe661ec8d6b428eafed192668501fa504abde7e9628dd63f90c1ddfffed4"} Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.257792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7d2d4a6b-3223-4e31-883e-d0878a5cf089","Type":"ContainerStarted","Data":"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407"} Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.257915 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-log" containerID="cri-o://83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675" gracePeriod=30 Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.258120 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-httpd" containerID="cri-o://b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407" gracePeriod=30 Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.261175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"e4cd799d-c609-4193-be39-6ff724a769d6","Type":"ContainerStarted","Data":"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e"} Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.261202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4cd799d-c609-4193-be39-6ff724a769d6","Type":"ContainerStarted","Data":"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5"} Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.261280 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-log" containerID="cri-o://87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5" gracePeriod=30 Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.261365 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-httpd" containerID="cri-o://b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e" gracePeriod=30 Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.268203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerStarted","Data":"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae"} Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.292124 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.292102573 podStartE2EDuration="4.292102573s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:20.28839503 +0000 UTC m=+3397.469911253" watchObservedRunningTime="2026-01-21 15:58:20.292102573 +0000 UTC m=+3397.473618794" Jan 21 15:58:20 crc kubenswrapper[4707]: I0121 15:58:20.906052 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g87tk\" (UniqueName: \"kubernetes.io/projected/7d2d4a6b-3223-4e31-883e-d0878a5cf089-kube-api-access-g87tk\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-logs\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-public-tls-certs\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-httpd-run\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053547 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-config-data\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-combined-ca-bundle\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-scripts\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\" (UID: \"7d2d4a6b-3223-4e31-883e-d0878a5cf089\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.053731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-logs" (OuterVolumeSpecName: "logs") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.054185 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.054248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.058043 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.058345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-scripts" (OuterVolumeSpecName: "scripts") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.058356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.058640 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2d4a6b-3223-4e31-883e-d0878a5cf089-kube-api-access-g87tk" (OuterVolumeSpecName: "kube-api-access-g87tk") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "kube-api-access-g87tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.078162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.093838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.101099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-config-data" (OuterVolumeSpecName: "config-data") pod "7d2d4a6b-3223-4e31-883e-d0878a5cf089" (UID: "7d2d4a6b-3223-4e31-883e-d0878a5cf089"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.155498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-logs\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.155984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-internal-tls-certs\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.156101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-config-data\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.156180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-httpd-run\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.156256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-scripts\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.156352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-combined-ca-bundle\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.156450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8zl5\" (UniqueName: \"kubernetes.io/projected/e4cd799d-c609-4193-be39-6ff724a769d6-kube-api-access-p8zl5\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.156524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"e4cd799d-c609-4193-be39-6ff724a769d6\" (UID: \"e4cd799d-c609-4193-be39-6ff724a769d6\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157038 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157096 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g87tk\" (UniqueName: \"kubernetes.io/projected/7d2d4a6b-3223-4e31-883e-d0878a5cf089-kube-api-access-g87tk\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157157 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157210 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2d4a6b-3223-4e31-883e-d0878a5cf089-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157254 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157315 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157361 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2d4a6b-3223-4e31-883e-d0878a5cf089-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.155901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-logs" (OuterVolumeSpecName: "logs") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.157949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.161662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-scripts" (OuterVolumeSpecName: "scripts") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.161855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cd799d-c609-4193-be39-6ff724a769d6-kube-api-access-p8zl5" (OuterVolumeSpecName: "kube-api-access-p8zl5") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "kube-api-access-p8zl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.161934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.173208 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.177414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.191895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.192219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-config-data" (OuterVolumeSpecName: "config-data") pod "e4cd799d-c609-4193-be39-6ff724a769d6" (UID: "e4cd799d-c609-4193-be39-6ff724a769d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.258805 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259020 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259031 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259040 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259049 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259058 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cd799d-c609-4193-be39-6ff724a769d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259066 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8zl5\" (UniqueName: \"kubernetes.io/projected/e4cd799d-c609-4193-be39-6ff724a769d6-kube-api-access-p8zl5\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259094 4707 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.259103 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cd799d-c609-4193-be39-6ff724a769d6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.273539 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.276993 4707 generic.go:334] "Generic (PLEG): container finished" podID="5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" containerID="4457eabe5f34005325f77fb3e078f6a236a4762a6d975580d7780c5eb68a0db8" exitCode=0 Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.277111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" event={"ID":"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae","Type":"ContainerDied","Data":"4457eabe5f34005325f77fb3e078f6a236a4762a6d975580d7780c5eb68a0db8"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.278909 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerID="b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407" exitCode=0 Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.278932 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerID="83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675" exitCode=143 Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.278949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7d2d4a6b-3223-4e31-883e-d0878a5cf089","Type":"ContainerDied","Data":"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.279001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7d2d4a6b-3223-4e31-883e-d0878a5cf089","Type":"ContainerDied","Data":"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.279016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7d2d4a6b-3223-4e31-883e-d0878a5cf089","Type":"ContainerDied","Data":"50a86d1375a89e96aa3b7605739188871a90fb35df8f9c23781e7603dab0c655"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.279063 4707 scope.go:117] "RemoveContainer" containerID="b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.279142 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.281517 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4cd799d-c609-4193-be39-6ff724a769d6" containerID="b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e" exitCode=0 Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.281544 4707 generic.go:334] "Generic (PLEG): container finished" podID="e4cd799d-c609-4193-be39-6ff724a769d6" containerID="87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5" exitCode=143 Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.281582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4cd799d-c609-4193-be39-6ff724a769d6","Type":"ContainerDied","Data":"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.281605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4cd799d-c609-4193-be39-6ff724a769d6","Type":"ContainerDied","Data":"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.281614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4cd799d-c609-4193-be39-6ff724a769d6","Type":"ContainerDied","Data":"17481431cb5e51f406a23f2a01ad6288b5826ace972860405bbc97e540e70f2a"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.281723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.285915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerStarted","Data":"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.289003 4707 generic.go:334] "Generic (PLEG): container finished" podID="6de02bec-e42c-48c1-b956-2a9ed65e7ca9" containerID="e7b31926efdf6f8be375bbbffde94813f498decaf60c78ae20ff653568fe6c82" exitCode=0 Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.289093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" event={"ID":"6de02bec-e42c-48c1-b956-2a9ed65e7ca9","Type":"ContainerDied","Data":"e7b31926efdf6f8be375bbbffde94813f498decaf60c78ae20ff653568fe6c82"} Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.310245 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.320895 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.335422 4707 scope.go:117] "RemoveContainer" containerID="83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.336268 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.336578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" 
containerName="glance-log" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.336594 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-log" Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.336604 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-httpd" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.336610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-httpd" Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.336621 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-httpd" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.336626 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-httpd" Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.336652 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-log" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.336657 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-log" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.337221 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-log" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.337244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-log" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.337262 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" containerName="glance-httpd" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.337270 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" containerName="glance-httpd" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.338044 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.340373 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dkchd" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.340746 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.341120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.351278 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.357605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.360849 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.362163 4707 scope.go:117] "RemoveContainer" containerID="b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407" Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.363098 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407\": container with ID starting with b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407 not found: ID does not exist" containerID="b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363124 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407"} err="failed to get container status \"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407\": rpc error: code = NotFound desc = could not find container \"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407\": container with ID starting with b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407 not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363143 4707 scope.go:117] "RemoveContainer" containerID="83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.363381 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675\": container with ID starting with 83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675 not found: ID does not exist" containerID="83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363454 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675"} err="failed to get 
container status \"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675\": rpc error: code = NotFound desc = could not find container \"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675\": container with ID starting with 83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675 not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363515 4707 scope.go:117] "RemoveContainer" containerID="b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363790 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407"} err="failed to get container status \"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407\": rpc error: code = NotFound desc = could not find container \"b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407\": container with ID starting with b7b650099ec13587cf84caa048f07daee2cb607fb65a82fa6ccdd9d5e6c99407 not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.363827 4707 scope.go:117] "RemoveContainer" containerID="83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.364017 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675"} err="failed to get container status \"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675\": rpc error: code = NotFound desc = could not find container \"83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675\": container with ID starting with 83f6a57ef74fa6ada9fa16960141df1b188c3903245fae37d68dae1784de3675 not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.364039 4707 scope.go:117] "RemoveContainer" containerID="b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.374691 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.387858 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.389262 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.392457 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.395132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.395136 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.406869 4707 scope.go:117] "RemoveContainer" containerID="87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.438936 4707 scope.go:117] "RemoveContainer" containerID="b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e" Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.439200 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e\": container with ID starting with b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e not found: ID does not exist" containerID="b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.439226 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e"} err="failed to get container status \"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e\": rpc error: code = NotFound desc = could not find container \"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e\": container with ID starting with b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.439242 4707 scope.go:117] "RemoveContainer" containerID="87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5" Jan 21 15:58:21 crc kubenswrapper[4707]: E0121 15:58:21.439614 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5\": container with ID starting with 87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5 not found: ID does not exist" containerID="87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.439636 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5"} err="failed to get container status \"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5\": rpc error: code = NotFound desc = could not find container \"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5\": container with ID starting with 87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5 not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.439651 4707 scope.go:117] "RemoveContainer" containerID="b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.440059 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e"} err="failed to get container status \"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e\": rpc error: code = NotFound desc = could not find container \"b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e\": container with ID starting with b10adfb3c5e608f703cb3b86dc7c7c0cce67da92849d067973d87c5d7e6ced6e not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.440079 4707 scope.go:117] "RemoveContainer" containerID="87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.440434 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5"} err="failed to get container status \"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5\": rpc error: code = NotFound desc = could not find container \"87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5\": container with ID starting with 87c0e4473098199b4aeba1663f8c23bd551d0acf38b17e772b7fed70f33381d5 not found: ID does not exist" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-logs\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-scripts\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88zm\" (UniqueName: \"kubernetes.io/projected/de71448c-16db-44cc-b37f-f5a01abed953-kube-api-access-v88zm\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-config-data\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.462760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.564675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.564787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncd6r\" (UniqueName: \"kubernetes.io/projected/8f9744f7-4974-4616-834e-61a327e2ffdb-kube-api-access-ncd6r\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.564953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-logs\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-scripts\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v88zm\" (UniqueName: \"kubernetes.io/projected/de71448c-16db-44cc-b37f-f5a01abed953-kube-api-access-v88zm\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565079 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-config-data\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.565449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.566306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-logs\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.571801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.572237 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.579844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.580127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.580492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-scripts\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.581959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-config-data\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.594479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v88zm\" (UniqueName: \"kubernetes.io/projected/de71448c-16db-44cc-b37f-f5a01abed953-kube-api-access-v88zm\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.646057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.663195 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncd6r\" (UniqueName: \"kubernetes.io/projected/8f9744f7-4974-4616-834e-61a327e2ffdb-kube-api-access-ncd6r\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.666695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.667925 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.671039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.678986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.682471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.686329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.687081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.707558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.727959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncd6r\" (UniqueName: 
\"kubernetes.io/projected/8f9744f7-4974-4616-834e-61a327e2ffdb-kube-api-access-ncd6r\") pod \"glance-default-internal-api-0\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.819525 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.973583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-scripts\") pod \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.973909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-logs\") pod \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.973935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-config-data\") pod \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.973958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjn55\" (UniqueName: \"kubernetes.io/projected/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-kube-api-access-xjn55\") pod \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.974060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-combined-ca-bundle\") pod \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\" (UID: \"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff\") " Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.974805 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-logs" (OuterVolumeSpecName: "logs") pod "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" (UID: "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.976915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-scripts" (OuterVolumeSpecName: "scripts") pod "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" (UID: "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.977698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-kube-api-access-xjn55" (OuterVolumeSpecName: "kube-api-access-xjn55") pod "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" (UID: "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff"). InnerVolumeSpecName "kube-api-access-xjn55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:21 crc kubenswrapper[4707]: I0121 15:58:21.991661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-config-data" (OuterVolumeSpecName: "config-data") pod "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" (UID: "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.000543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" (UID: "93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.002947 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.075960 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.075981 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.075993 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjn55\" (UniqueName: \"kubernetes.io/projected/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-kube-api-access-xjn55\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.076001 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.076009 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.172125 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.301442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" event={"ID":"93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff","Type":"ContainerDied","Data":"136a7b22ef1a3abdbb62078c0e264cc3121d23b4114d802c7340a44b92497b14"} Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.301631 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136a7b22ef1a3abdbb62078c0e264cc3121d23b4114d802c7340a44b92497b14" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.301680 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-ntkqn" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.307166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"de71448c-16db-44cc-b37f-f5a01abed953","Type":"ContainerStarted","Data":"d23f748a7978fc926e6cedbf2fa6e74320ff759d9b2cf9c105d7d760950daf4b"} Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.312389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerStarted","Data":"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7"} Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.312672 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-central-agent" containerID="cri-o://0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" gracePeriod=30 Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.312856 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="sg-core" containerID="cri-o://602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" gracePeriod=30 Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.312926 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="proxy-httpd" containerID="cri-o://9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" gracePeriod=30 Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.312977 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-notification-agent" containerID="cri-o://f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" gracePeriod=30 Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.342706 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.860656546 podStartE2EDuration="6.342694599s" podCreationTimestamp="2026-01-21 15:58:16 +0000 UTC" firstStartedPulling="2026-01-21 15:58:18.151392778 +0000 UTC m=+3395.332908990" lastFinishedPulling="2026-01-21 15:58:21.633430822 +0000 UTC m=+3398.814947043" observedRunningTime="2026-01-21 15:58:22.335535187 +0000 UTC m=+3399.517051408" watchObservedRunningTime="2026-01-21 15:58:22.342694599 +0000 UTC m=+3399.524210821" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.400640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:58:22 crc kubenswrapper[4707]: W0121 15:58:22.413972 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f9744f7_4974_4616_834e_61a327e2ffdb.slice/crio-8e9ac5471cefc1af516b8404caddd7f2aa8b2f70878ecb558937fab9992165e2 WatchSource:0}: Error finding container 8e9ac5471cefc1af516b8404caddd7f2aa8b2f70878ecb558937fab9992165e2: Status 404 returned error can't find the container with id 8e9ac5471cefc1af516b8404caddd7f2aa8b2f70878ecb558937fab9992165e2 Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 
15:58:22.650364 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.715144 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.786756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-db-sync-config-data\") pod \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.786825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-fernet-keys\") pod \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.786875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-combined-ca-bundle\") pod \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.786911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-scripts\") pod \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.786932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-config-data\") pod \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.786968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-combined-ca-bundle\") pod \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.787032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-credential-keys\") pod \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.787055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m52zt\" (UniqueName: \"kubernetes.io/projected/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-kube-api-access-m52zt\") pod \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\" (UID: \"6de02bec-e42c-48c1-b956-2a9ed65e7ca9\") " Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.787071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9rc\" (UniqueName: \"kubernetes.io/projected/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-kube-api-access-tm9rc\") pod \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\" (UID: \"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae\") " Jan 21 
15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.791040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6de02bec-e42c-48c1-b956-2a9ed65e7ca9" (UID: "6de02bec-e42c-48c1-b956-2a9ed65e7ca9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.791340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-kube-api-access-tm9rc" (OuterVolumeSpecName: "kube-api-access-tm9rc") pod "5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" (UID: "5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae"). InnerVolumeSpecName "kube-api-access-tm9rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.791959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-scripts" (OuterVolumeSpecName: "scripts") pod "6de02bec-e42c-48c1-b956-2a9ed65e7ca9" (UID: "6de02bec-e42c-48c1-b956-2a9ed65e7ca9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.793451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6de02bec-e42c-48c1-b956-2a9ed65e7ca9" (UID: "6de02bec-e42c-48c1-b956-2a9ed65e7ca9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.794734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-kube-api-access-m52zt" (OuterVolumeSpecName: "kube-api-access-m52zt") pod "6de02bec-e42c-48c1-b956-2a9ed65e7ca9" (UID: "6de02bec-e42c-48c1-b956-2a9ed65e7ca9"). InnerVolumeSpecName "kube-api-access-m52zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.797650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" (UID: "5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.820719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de02bec-e42c-48c1-b956-2a9ed65e7ca9" (UID: "6de02bec-e42c-48c1-b956-2a9ed65e7ca9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.826962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" (UID: "5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.831036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-config-data" (OuterVolumeSpecName: "config-data") pod "6de02bec-e42c-48c1-b956-2a9ed65e7ca9" (UID: "6de02bec-e42c-48c1-b956-2a9ed65e7ca9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888835 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888863 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888873 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888883 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888890 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888910 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888918 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888929 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m52zt\" (UniqueName: \"kubernetes.io/projected/6de02bec-e42c-48c1-b956-2a9ed65e7ca9-kube-api-access-m52zt\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.888938 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9rc\" (UniqueName: \"kubernetes.io/projected/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae-kube-api-access-tm9rc\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916114 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-676cb6d794-mt7xp"] Jan 21 15:58:22 crc kubenswrapper[4707]: E0121 15:58:22.916416 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de02bec-e42c-48c1-b956-2a9ed65e7ca9" containerName="keystone-bootstrap" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916433 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de02bec-e42c-48c1-b956-2a9ed65e7ca9" containerName="keystone-bootstrap" Jan 21 15:58:22 crc kubenswrapper[4707]: E0121 15:58:22.916448 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" containerName="barbican-db-sync" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916454 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" containerName="barbican-db-sync" Jan 21 15:58:22 crc kubenswrapper[4707]: E0121 15:58:22.916474 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" containerName="placement-db-sync" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916481 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" containerName="placement-db-sync" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916614 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" containerName="barbican-db-sync" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916622 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" containerName="placement-db-sync" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.916629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de02bec-e42c-48c1-b956-2a9ed65e7ca9" containerName="keystone-bootstrap" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.917358 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.922672 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.923316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.923378 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.923448 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-kw2tj" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.923620 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.944922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-676cb6d794-mt7xp"] Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.972327 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-public-tls-certs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-scripts\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-config-data\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08e764c-2c5a-4a9d-9922-7f21e90687ff-logs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjcbq\" (UniqueName: \"kubernetes.io/projected/f08e764c-2c5a-4a9d-9922-7f21e90687ff-kube-api-access-gjcbq\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-combined-ca-bundle\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:22 crc kubenswrapper[4707]: I0121 15:58:22.990574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-internal-tls-certs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-combined-ca-bundle\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091470 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdsqz\" (UniqueName: 
\"kubernetes.io/projected/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-kube-api-access-fdsqz\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-scripts\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-run-httpd\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-config-data\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-log-httpd\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-sg-core-conf-yaml\") pod \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\" (UID: \"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581\") " Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-combined-ca-bundle\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.091965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-internal-tls-certs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.092190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-scripts\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.092205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-public-tls-certs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.092230 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-config-data\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.092256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08e764c-2c5a-4a9d-9922-7f21e90687ff-logs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.092293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjcbq\" (UniqueName: \"kubernetes.io/projected/f08e764c-2c5a-4a9d-9922-7f21e90687ff-kube-api-access-gjcbq\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.094417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.094859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08e764c-2c5a-4a9d-9922-7f21e90687ff-logs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.095216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.100009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-scripts\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.100796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-internal-tls-certs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.101770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-config-data\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.102095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-combined-ca-bundle\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.103860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-kube-api-access-fdsqz" (OuterVolumeSpecName: "kube-api-access-fdsqz") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "kube-api-access-fdsqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.104263 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-scripts" (OuterVolumeSpecName: "scripts") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.105229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-public-tls-certs\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.108017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjcbq\" (UniqueName: \"kubernetes.io/projected/f08e764c-2c5a-4a9d-9922-7f21e90687ff-kube-api-access-gjcbq\") pod \"placement-676cb6d794-mt7xp\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.119032 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.153323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.171300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-config-data" (OuterVolumeSpecName: "config-data") pod "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" (UID: "71bb98ab-d1f3-479b-83d7-aa2c3d2b3581"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.196532 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2d4a6b-3223-4e31-883e-d0878a5cf089" path="/var/lib/kubelet/pods/7d2d4a6b-3223-4e31-883e-d0878a5cf089/volumes" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197423 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197452 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdsqz\" (UniqueName: \"kubernetes.io/projected/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-kube-api-access-fdsqz\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197605 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197615 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197624 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197631 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197642 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.197713 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4cd799d-c609-4193-be39-6ff724a769d6" path="/var/lib/kubelet/pods/e4cd799d-c609-4193-be39-6ff724a769d6/volumes" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.240324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.341416 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.341589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tnwlj" event={"ID":"5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae","Type":"ContainerDied","Data":"0c922fc0b256e1b5244590891edb52b7e2a403c9a2d8b466e32d7af45e2a899b"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.341626 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c922fc0b256e1b5244590891edb52b7e2a403c9a2d8b466e32d7af45e2a899b" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346235 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" exitCode=0 Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346259 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" exitCode=2 Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346267 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" exitCode=0 Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346274 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" exitCode=0 Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerDied","Data":"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerDied","Data":"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerDied","Data":"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerDied","Data":"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"71bb98ab-d1f3-479b-83d7-aa2c3d2b3581","Type":"ContainerDied","Data":"18ff5e29b875a9ec41b5c18847a6568da737dca7a3703229621979056638d0f1"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346382 4707 scope.go:117] "RemoveContainer" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.346468 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.349984 4707 generic.go:334] "Generic (PLEG): container finished" podID="a8dc14bc-b18c-4937-9319-b73bb16aced4" containerID="743ae25b8acb1b34607a2c58c911a56beecb00bee4c4c9e5e264c4b53d7a16c1" exitCode=0 Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.350069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" event={"ID":"a8dc14bc-b18c-4937-9319-b73bb16aced4","Type":"ContainerDied","Data":"743ae25b8acb1b34607a2c58c911a56beecb00bee4c4c9e5e264c4b53d7a16c1"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.353368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" event={"ID":"6de02bec-e42c-48c1-b956-2a9ed65e7ca9","Type":"ContainerDied","Data":"d24059594c1aa68a7f77abbc193843ca8c41327389db1c8648312f4dbb527d22"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.353392 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d24059594c1aa68a7f77abbc193843ca8c41327389db1c8648312f4dbb527d22" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.353454 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fsks8" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.385897 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.388432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8f9744f7-4974-4616-834e-61a327e2ffdb","Type":"ContainerStarted","Data":"251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.388468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8f9744f7-4974-4616-834e-61a327e2ffdb","Type":"ContainerStarted","Data":"8e9ac5471cefc1af516b8404caddd7f2aa8b2f70878ecb558937fab9992165e2"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.391228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"de71448c-16db-44cc-b37f-f5a01abed953","Type":"ContainerStarted","Data":"9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.391265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"de71448c-16db-44cc-b37f-f5a01abed953","Type":"ContainerStarted","Data":"70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c"} Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.399114 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.406194 4707 scope.go:117] "RemoveContainer" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.420887 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.421397 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="sg-core" Jan 21 15:58:23 
crc kubenswrapper[4707]: I0121 15:58:23.421415 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="sg-core" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.421425 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-notification-agent" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.421431 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-notification-agent" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.425233 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-central-agent" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.425250 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-central-agent" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.425267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="proxy-httpd" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.425273 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="proxy-httpd" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.425436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-notification-agent" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.425446 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="sg-core" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.425457 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="proxy-httpd" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.425468 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" containerName="ceilometer-central-agent" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.426707 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.430035 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.430617 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.435764 4707 scope.go:117] "RemoveContainer" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.451878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.451938 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.45192584 podStartE2EDuration="2.45192584s" podCreationTimestamp="2026-01-21 15:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:23.411755854 +0000 UTC m=+3400.593272096" watchObservedRunningTime="2026-01-21 15:58:23.45192584 +0000 UTC m=+3400.633442063" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.469773 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fsks8"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.474948 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fsks8"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.476326 4707 scope.go:117] "RemoveContainer" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.503973 4707 scope.go:117] "RemoveContainer" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.504959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.505094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-log-httpd\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.505147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-config-data\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.505260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.505327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-scripts\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.505405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4d9b\" (UniqueName: \"kubernetes.io/projected/d4341da7-00d4-472a-a479-16602bcb7495-kube-api-access-l4d9b\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.505467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-run-httpd\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.507038 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": container with ID starting with 9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7 not found: ID does not exist" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507075 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7"} err="failed to get container status \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": rpc error: code = NotFound desc = could not find container \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": container with ID starting with 9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7 not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507095 4707 scope.go:117] "RemoveContainer" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.507355 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": container with ID starting with 602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da not found: ID does not exist" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507393 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da"} err="failed to get container status \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": rpc error: code = NotFound desc = could not find container \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": container with ID starting with 602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507404 4707 
scope.go:117] "RemoveContainer" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.507607 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": container with ID starting with f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae not found: ID does not exist" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507622 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae"} err="failed to get container status \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": rpc error: code = NotFound desc = could not find container \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": container with ID starting with f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507634 4707 scope.go:117] "RemoveContainer" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" Jan 21 15:58:23 crc kubenswrapper[4707]: E0121 15:58:23.507838 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": container with ID starting with 0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a not found: ID does not exist" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507853 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a"} err="failed to get container status \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": rpc error: code = NotFound desc = could not find container \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": container with ID starting with 0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.507864 4707 scope.go:117] "RemoveContainer" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.508044 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7"} err="failed to get container status \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": rpc error: code = NotFound desc = could not find container \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": container with ID starting with 9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7 not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.508056 4707 scope.go:117] "RemoveContainer" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.508214 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da"} err="failed to get container status \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": rpc error: code = NotFound desc = could not find container \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": container with ID starting with 602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.508225 4707 scope.go:117] "RemoveContainer" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.508400 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae"} err="failed to get container status \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": rpc error: code = NotFound desc = could not find container \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": container with ID starting with f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.508460 4707 scope.go:117] "RemoveContainer" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.510827 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a"} err="failed to get container status \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": rpc error: code = NotFound desc = could not find container \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": container with ID starting with 0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.510853 4707 scope.go:117] "RemoveContainer" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.514677 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7"} err="failed to get container status \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": rpc error: code = NotFound desc = could not find container \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": container with ID starting with 9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7 not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.514894 4707 scope.go:117] "RemoveContainer" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.515116 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da"} err="failed to get container status \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": rpc error: code = NotFound desc = could not find container \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": container with ID starting with 602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da not found: ID does not exist" Jan 
21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.515131 4707 scope.go:117] "RemoveContainer" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.515823 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae"} err="failed to get container status \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": rpc error: code = NotFound desc = could not find container \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": container with ID starting with f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.515840 4707 scope.go:117] "RemoveContainer" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516025 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a"} err="failed to get container status \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": rpc error: code = NotFound desc = could not find container \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": container with ID starting with 0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516038 4707 scope.go:117] "RemoveContainer" containerID="9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516216 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7"} err="failed to get container status \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": rpc error: code = NotFound desc = could not find container \"9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7\": container with ID starting with 9a0f0e9ebdc4cf3795d84e8b6867e0218cf987d816d39ad6cf6713912152ffe7 not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516227 4707 scope.go:117] "RemoveContainer" containerID="602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516438 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da"} err="failed to get container status \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": rpc error: code = NotFound desc = could not find container \"602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da\": container with ID starting with 602740fa8be144e3f7fe77df804caad63a5f0487a37184e08d251aa07cd5f3da not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516470 4707 scope.go:117] "RemoveContainer" containerID="f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516676 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae"} err="failed to get container status 
\"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": rpc error: code = NotFound desc = could not find container \"f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae\": container with ID starting with f595da699629bca98fc16801cf96f13757fc7c4ad2ba7b6ccbfb720b80d56dae not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516727 4707 scope.go:117] "RemoveContainer" containerID="0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.516928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a"} err="failed to get container status \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": rpc error: code = NotFound desc = could not find container \"0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a\": container with ID starting with 0d8b6ed1174f8f88965753b76c3633e6fa5624791c6158dbefd525f63a90b85a not found: ID does not exist" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.545433 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bkt5g"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.546411 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.549884 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.550048 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.550162 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.550324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.550479 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vnmw4" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.559937 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bkt5g"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.606797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.606868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-credential-keys\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.606897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-scripts\") pod \"ceilometer-0\" (UID: 
\"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-config-data\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4d9b\" (UniqueName: \"kubernetes.io/projected/d4341da7-00d4-472a-a479-16602bcb7495-kube-api-access-l4d9b\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-run-httpd\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpqm\" (UniqueName: \"kubernetes.io/projected/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-kube-api-access-shpqm\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-scripts\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-fernet-keys\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-log-httpd\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-config-data\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-combined-ca-bundle\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.607869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-run-httpd\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.608085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-log-httpd\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.611671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-scripts\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.612007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.612128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-config-data\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.612181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.622057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4d9b\" (UniqueName: \"kubernetes.io/projected/d4341da7-00d4-472a-a479-16602bcb7495-kube-api-access-l4d9b\") pod \"ceilometer-0\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.657929 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-676cb6d794-mt7xp"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.708829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-combined-ca-bundle\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.708894 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-credential-keys\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.708924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-config-data\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.708971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shpqm\" (UniqueName: \"kubernetes.io/projected/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-kube-api-access-shpqm\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.708993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-scripts\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.709018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-fernet-keys\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.712120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-scripts\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.712258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-credential-keys\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.712607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-config-data\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.712692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-combined-ca-bundle\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.718023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-fernet-keys\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.729862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpqm\" (UniqueName: \"kubernetes.io/projected/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-kube-api-access-shpqm\") pod \"keystone-bootstrap-bkt5g\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.751888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.790158 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-75d7748979-s5w96"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.791357 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.793741 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-9dg6p" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.793836 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.801736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.811824 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-75d7748979-s5w96"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.821973 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.831324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.836729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.843306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.860412 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913042 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-647f58596d-4pnqh"] Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data-custom\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b20a8d-7771-44bf-9ca7-100e04650264-logs\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-combined-ca-bundle\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b7841fd-aeda-46f5-a60c-3baa9923275d-logs\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-combined-ca-bundle\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt6r6\" (UniqueName: 
\"kubernetes.io/projected/8b7841fd-aeda-46f5-a60c-3baa9923275d-kube-api-access-kt6r6\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data-custom\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.913873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mlc8\" (UniqueName: \"kubernetes.io/projected/79b20a8d-7771-44bf-9ca7-100e04650264-kube-api-access-6mlc8\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.914446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.916341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 15:58:23 crc kubenswrapper[4707]: I0121 15:58:23.928715 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-647f58596d-4pnqh"] Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.015825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-combined-ca-bundle\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.015948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data-custom\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt6r6\" (UniqueName: 
\"kubernetes.io/projected/8b7841fd-aeda-46f5-a60c-3baa9923275d-kube-api-access-kt6r6\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data-custom\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mlc8\" (UniqueName: \"kubernetes.io/projected/79b20a8d-7771-44bf-9ca7-100e04650264-kube-api-access-6mlc8\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data-custom\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720563ae-0736-4053-bd1d-8e36a8e1db51-logs\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w99zh\" (UniqueName: \"kubernetes.io/projected/720563ae-0736-4053-bd1d-8e36a8e1db51-kube-api-access-w99zh\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-combined-ca-bundle\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.016924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b20a8d-7771-44bf-9ca7-100e04650264-logs\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.017004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-combined-ca-bundle\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: 
\"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.017095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b7841fd-aeda-46f5-a60c-3baa9923275d-logs\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.017162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.018906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b20a8d-7771-44bf-9ca7-100e04650264-logs\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.019382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-combined-ca-bundle\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.019766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b7841fd-aeda-46f5-a60c-3baa9923275d-logs\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.021578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-combined-ca-bundle\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.022430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data-custom\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.023005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.028623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data-custom\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.037043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt6r6\" (UniqueName: \"kubernetes.io/projected/8b7841fd-aeda-46f5-a60c-3baa9923275d-kube-api-access-kt6r6\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.038301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data\") pod \"barbican-worker-75d7748979-s5w96\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.039128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mlc8\" (UniqueName: \"kubernetes.io/projected/79b20a8d-7771-44bf-9ca7-100e04650264-kube-api-access-6mlc8\") pod \"barbican-keystone-listener-56cd595886-jldwm\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.118210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720563ae-0736-4053-bd1d-8e36a8e1db51-logs\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.118263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w99zh\" (UniqueName: \"kubernetes.io/projected/720563ae-0736-4053-bd1d-8e36a8e1db51-kube-api-access-w99zh\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.118327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-combined-ca-bundle\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.118409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.118447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data-custom\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.119176 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720563ae-0736-4053-bd1d-8e36a8e1db51-logs\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.123898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.125183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-combined-ca-bundle\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.134224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w99zh\" (UniqueName: \"kubernetes.io/projected/720563ae-0736-4053-bd1d-8e36a8e1db51-kube-api-access-w99zh\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.142554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data-custom\") pod \"barbican-api-647f58596d-4pnqh\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.182692 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:58:24 crc kubenswrapper[4707]: E0121 15:58:24.182896 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.243977 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.254753 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.287030 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.290964 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:58:24 crc kubenswrapper[4707]: W0121 15:58:24.292336 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4341da7_00d4_472a_a479_16602bcb7495.slice/crio-0e72490b914a9d25f4f0daf8fc52d60495b972e225a456ab0506cf9f4a00f958 WatchSource:0}: Error finding container 0e72490b914a9d25f4f0daf8fc52d60495b972e225a456ab0506cf9f4a00f958: Status 404 returned error can't find the container with id 0e72490b914a9d25f4f0daf8fc52d60495b972e225a456ab0506cf9f4a00f958 Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.401871 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bkt5g"] Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.433361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" event={"ID":"f08e764c-2c5a-4a9d-9922-7f21e90687ff","Type":"ContainerStarted","Data":"e14681d4cf87d7520d2927a4c5c73b026b63cf5e240e1a16e2e0ba3ad541c350"} Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.433409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" event={"ID":"f08e764c-2c5a-4a9d-9922-7f21e90687ff","Type":"ContainerStarted","Data":"d376d4f2415e861701dbdf6f065d3a2783b3c5f5fa001d4b22538477483c226a"} Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.433420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" event={"ID":"f08e764c-2c5a-4a9d-9922-7f21e90687ff","Type":"ContainerStarted","Data":"75bde2be090b5aeab3fb578e661ef995d58f1e11b79c512b0cc6ef7cf5f800d9"} Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.433450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.433473 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.437769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8f9744f7-4974-4616-834e-61a327e2ffdb","Type":"ContainerStarted","Data":"5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba"} Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.441627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerStarted","Data":"0e72490b914a9d25f4f0daf8fc52d60495b972e225a456ab0506cf9f4a00f958"} Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.450732 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" podStartSLOduration=2.450718832 podStartE2EDuration="2.450718832s" podCreationTimestamp="2026-01-21 15:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:24.448982717 +0000 UTC m=+3401.630498939" watchObservedRunningTime="2026-01-21 15:58:24.450718832 +0000 UTC m=+3401.632235054" Jan 21 15:58:24 crc 
kubenswrapper[4707]: I0121 15:58:24.467741 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.467726984 podStartE2EDuration="3.467726984s" podCreationTimestamp="2026-01-21 15:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:24.463527597 +0000 UTC m=+3401.645043820" watchObservedRunningTime="2026-01-21 15:58:24.467726984 +0000 UTC m=+3401.649243206" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.635004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-75d7748979-s5w96"] Jan 21 15:58:24 crc kubenswrapper[4707]: W0121 15:58:24.640061 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7841fd_aeda_46f5_a60c_3baa9923275d.slice/crio-1fa7267d4fddac752419228bc6e6683c444d8561d6dd4edd716260a77f921426 WatchSource:0}: Error finding container 1fa7267d4fddac752419228bc6e6683c444d8561d6dd4edd716260a77f921426: Status 404 returned error can't find the container with id 1fa7267d4fddac752419228bc6e6683c444d8561d6dd4edd716260a77f921426 Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.745027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm"] Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.863307 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.876630 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-647f58596d-4pnqh"] Jan 21 15:58:24 crc kubenswrapper[4707]: W0121 15:58:24.886048 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720563ae_0736_4053_bd1d_8e36a8e1db51.slice/crio-d2145b542e0006506f0fbddbdfe10d01e4d3570507b72d007a7424127f4c4ee9 WatchSource:0}: Error finding container d2145b542e0006506f0fbddbdfe10d01e4d3570507b72d007a7424127f4c4ee9: Status 404 returned error can't find the container with id d2145b542e0006506f0fbddbdfe10d01e4d3570507b72d007a7424127f4c4ee9 Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.935885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-combined-ca-bundle\") pod \"a8dc14bc-b18c-4937-9319-b73bb16aced4\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.936146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8dc14bc-b18c-4937-9319-b73bb16aced4-etc-machine-id\") pod \"a8dc14bc-b18c-4937-9319-b73bb16aced4\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.936186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-config-data\") pod \"a8dc14bc-b18c-4937-9319-b73bb16aced4\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.936213 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-m2dvz\" (UniqueName: \"kubernetes.io/projected/a8dc14bc-b18c-4937-9319-b73bb16aced4-kube-api-access-m2dvz\") pod \"a8dc14bc-b18c-4937-9319-b73bb16aced4\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.936255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-db-sync-config-data\") pod \"a8dc14bc-b18c-4937-9319-b73bb16aced4\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.936388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-scripts\") pod \"a8dc14bc-b18c-4937-9319-b73bb16aced4\" (UID: \"a8dc14bc-b18c-4937-9319-b73bb16aced4\") " Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.938170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8dc14bc-b18c-4937-9319-b73bb16aced4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a8dc14bc-b18c-4937-9319-b73bb16aced4" (UID: "a8dc14bc-b18c-4937-9319-b73bb16aced4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.942131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-scripts" (OuterVolumeSpecName: "scripts") pod "a8dc14bc-b18c-4937-9319-b73bb16aced4" (UID: "a8dc14bc-b18c-4937-9319-b73bb16aced4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.942167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8dc14bc-b18c-4937-9319-b73bb16aced4-kube-api-access-m2dvz" (OuterVolumeSpecName: "kube-api-access-m2dvz") pod "a8dc14bc-b18c-4937-9319-b73bb16aced4" (UID: "a8dc14bc-b18c-4937-9319-b73bb16aced4"). InnerVolumeSpecName "kube-api-access-m2dvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.946380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a8dc14bc-b18c-4937-9319-b73bb16aced4" (UID: "a8dc14bc-b18c-4937-9319-b73bb16aced4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:24 crc kubenswrapper[4707]: I0121 15:58:24.992945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8dc14bc-b18c-4937-9319-b73bb16aced4" (UID: "a8dc14bc-b18c-4937-9319-b73bb16aced4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.018707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-config-data" (OuterVolumeSpecName: "config-data") pod "a8dc14bc-b18c-4937-9319-b73bb16aced4" (UID: "a8dc14bc-b18c-4937-9319-b73bb16aced4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.039948 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.039975 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.039986 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8dc14bc-b18c-4937-9319-b73bb16aced4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.039994 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.040003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2dvz\" (UniqueName: \"kubernetes.io/projected/a8dc14bc-b18c-4937-9319-b73bb16aced4-kube-api-access-m2dvz\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.040012 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8dc14bc-b18c-4937-9319-b73bb16aced4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.190932 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de02bec-e42c-48c1-b956-2a9ed65e7ca9" path="/var/lib/kubelet/pods/6de02bec-e42c-48c1-b956-2a9ed65e7ca9/volumes" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.191443 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bb98ab-d1f3-479b-83d7-aa2c3d2b3581" path="/var/lib/kubelet/pods/71bb98ab-d1f3-479b-83d7-aa2c3d2b3581/volumes" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.447112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" event={"ID":"720563ae-0736-4053-bd1d-8e36a8e1db51","Type":"ContainerStarted","Data":"cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.447156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.447167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" event={"ID":"720563ae-0736-4053-bd1d-8e36a8e1db51","Type":"ContainerStarted","Data":"b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.447177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" event={"ID":"720563ae-0736-4053-bd1d-8e36a8e1db51","Type":"ContainerStarted","Data":"d2145b542e0006506f0fbddbdfe10d01e4d3570507b72d007a7424127f4c4ee9"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.448253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerStarted","Data":"934f93e9e7e32bb8adeddf2391c02267a4cea98ae6d010ce86176d45f0fbb15b"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.449681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" event={"ID":"79b20a8d-7771-44bf-9ca7-100e04650264","Type":"ContainerStarted","Data":"2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.449709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" event={"ID":"79b20a8d-7771-44bf-9ca7-100e04650264","Type":"ContainerStarted","Data":"22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.449719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" event={"ID":"79b20a8d-7771-44bf-9ca7-100e04650264","Type":"ContainerStarted","Data":"934e32f649efc69ced9c8c6fd367b751076784a0bbe281b9961a0841789c904d"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.451028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" event={"ID":"a8dc14bc-b18c-4937-9319-b73bb16aced4","Type":"ContainerDied","Data":"6ecf5aff37b7a17a1547b42a254c41d8588f102e47b4a5590d090456a9f8448a"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.451072 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ecf5aff37b7a17a1547b42a254c41d8588f102e47b4a5590d090456a9f8448a" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.451039 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-7ckwj" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.454976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" event={"ID":"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466","Type":"ContainerStarted","Data":"6493e34684b568e4b46ad2000671bad1726d6a92122699a0a1930bed82439445"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.455006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" event={"ID":"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466","Type":"ContainerStarted","Data":"2dcd7666253e09be17ca3f63dfd053196bda368e66087473539d57f458a41198"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.456946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" event={"ID":"8b7841fd-aeda-46f5-a60c-3baa9923275d","Type":"ContainerStarted","Data":"380a9834052d0f5862f21968659125d6bfa398491f0f1d19cb8337ac171d6977"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.456975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" event={"ID":"8b7841fd-aeda-46f5-a60c-3baa9923275d","Type":"ContainerStarted","Data":"9682448c7c3521e5fabf41464b76348e4257e00745c27d84fde49e69a08cce5b"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.456986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" event={"ID":"8b7841fd-aeda-46f5-a60c-3baa9923275d","Type":"ContainerStarted","Data":"1fa7267d4fddac752419228bc6e6683c444d8561d6dd4edd716260a77f921426"} Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.471425 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" podStartSLOduration=2.4714121540000002 podStartE2EDuration="2.471412154s" podCreationTimestamp="2026-01-21 15:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:25.467969582 +0000 UTC m=+3402.649485804" watchObservedRunningTime="2026-01-21 15:58:25.471412154 +0000 UTC m=+3402.652928377" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.496669 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" podStartSLOduration=2.496653008 podStartE2EDuration="2.496653008s" podCreationTimestamp="2026-01-21 15:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:25.485946737 +0000 UTC m=+3402.667462948" watchObservedRunningTime="2026-01-21 15:58:25.496653008 +0000 UTC m=+3402.678169230" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.513533 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" podStartSLOduration=2.513518812 podStartE2EDuration="2.513518812s" podCreationTimestamp="2026-01-21 15:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:25.500071978 +0000 UTC m=+3402.681588199" watchObservedRunningTime="2026-01-21 15:58:25.513518812 +0000 UTC m=+3402.695035035" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.569045 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" podStartSLOduration=2.569027982 podStartE2EDuration="2.569027982s" podCreationTimestamp="2026-01-21 15:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:25.542230843 +0000 UTC m=+3402.723747065" watchObservedRunningTime="2026-01-21 15:58:25.569027982 +0000 UTC m=+3402.750544205" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.646704 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:25 crc kubenswrapper[4707]: E0121 15:58:25.647867 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dc14bc-b18c-4937-9319-b73bb16aced4" containerName="cinder-db-sync" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.647888 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dc14bc-b18c-4937-9319-b73bb16aced4" containerName="cinder-db-sync" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.648244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8dc14bc-b18c-4937-9319-b73bb16aced4" containerName="cinder-db-sync" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.649535 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.655166 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-rvgv7" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.655509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.655645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.655786 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.712900 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.732703 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.734106 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.736343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.749977 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.759998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.760068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.760146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.760167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-scripts\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.760185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.760200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkpkr\" (UniqueName: \"kubernetes.io/projected/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-kube-api-access-dkpkr\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-scripts\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" 
Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkpkr\" (UniqueName: \"kubernetes.io/projected/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-kube-api-access-dkpkr\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-scripts\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a10e5e-cc41-4411-aa34-ff805a6f4926-logs\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28kg\" (UniqueName: \"kubernetes.io/projected/92a10e5e-cc41-4411-aa34-ff805a6f4926-kube-api-access-h28kg\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data-custom\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92a10e5e-cc41-4411-aa34-ff805a6f4926-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862710 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.862888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.866612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.866757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-scripts\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.867233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.870516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.879681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkpkr\" (UniqueName: \"kubernetes.io/projected/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-kube-api-access-dkpkr\") pod \"cinder-scheduler-0\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data\") pod \"cinder-api-0\" 
(UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-scripts\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a10e5e-cc41-4411-aa34-ff805a6f4926-logs\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28kg\" (UniqueName: \"kubernetes.io/projected/92a10e5e-cc41-4411-aa34-ff805a6f4926-kube-api-access-h28kg\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data-custom\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92a10e5e-cc41-4411-aa34-ff805a6f4926-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92a10e5e-cc41-4411-aa34-ff805a6f4926-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.964842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a10e5e-cc41-4411-aa34-ff805a6f4926-logs\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.966993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-scripts\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.968242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data-custom\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.968265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.969425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:25 crc kubenswrapper[4707]: I0121 15:58:25.978682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28kg\" (UniqueName: \"kubernetes.io/projected/92a10e5e-cc41-4411-aa34-ff805a6f4926-kube-api-access-h28kg\") pod \"cinder-api-0\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.030715 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.049117 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.466754 4707 generic.go:334] "Generic (PLEG): container finished" podID="343110b7-9660-4e3f-a56f-d7ee9f89467d" containerID="c5e539d0bcc848c6d8b89ed9ace7c4bb3a1f1e4ca972075b82e0e1f0ae7b1421" exitCode=0 Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.466843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" event={"ID":"343110b7-9660-4e3f-a56f-d7ee9f89467d","Type":"ContainerDied","Data":"c5e539d0bcc848c6d8b89ed9ace7c4bb3a1f1e4ca972075b82e0e1f0ae7b1421"} Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.471530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerStarted","Data":"452e01c8458d06be38dd0725caeef55a6882076a46f333419461b139fd570790"} Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.471573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerStarted","Data":"df912f5a069085faa23dd8c3228fe5debb72d477b7f8f77d20df94f07d0641f9"} Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.473060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.498082 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:26 crc kubenswrapper[4707]: I0121 15:58:26.607346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:26 crc kubenswrapper[4707]: W0121 15:58:26.612628 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92a10e5e_cc41_4411_aa34_ff805a6f4926.slice/crio-d92f7f2dfdb10091f1c58cf9b6dedcec2b169317e50c782499b31783947a2122 WatchSource:0}: Error finding container d92f7f2dfdb10091f1c58cf9b6dedcec2b169317e50c782499b31783947a2122: Status 404 returned error can't find the container with id d92f7f2dfdb10091f1c58cf9b6dedcec2b169317e50c782499b31783947a2122 Jan 21 15:58:27 
crc kubenswrapper[4707]: I0121 15:58:27.487206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f","Type":"ContainerStarted","Data":"5437d1cfb64bfe99756650386287f336ddc7e044d8c686f5c56a2fb71f4974a7"} Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.487475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f","Type":"ContainerStarted","Data":"107edd2630c866d3ba0537554d5ad718dcb481fc68ca874bf69643d605bd3052"} Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.487492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f","Type":"ContainerStarted","Data":"2269a624e85f4bbf7643892e581a4b8aee4bfe3485d43969571772e01e94a2f3"} Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.500955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92a10e5e-cc41-4411-aa34-ff805a6f4926","Type":"ContainerStarted","Data":"7076b48d6f9315b8936689630cdf63882383354847034f19289822741bedd933"} Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.500999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92a10e5e-cc41-4411-aa34-ff805a6f4926","Type":"ContainerStarted","Data":"d92f7f2dfdb10091f1c58cf9b6dedcec2b169317e50c782499b31783947a2122"} Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.505255 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.505242672 podStartE2EDuration="2.505242672s" podCreationTimestamp="2026-01-21 15:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:27.503187518 +0000 UTC m=+3404.684703740" watchObservedRunningTime="2026-01-21 15:58:27.505242672 +0000 UTC m=+3404.686758895" Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.510176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" event={"ID":"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466","Type":"ContainerDied","Data":"6493e34684b568e4b46ad2000671bad1726d6a92122699a0a1930bed82439445"} Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.510184 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" containerID="6493e34684b568e4b46ad2000671bad1726d6a92122699a0a1930bed82439445" exitCode=0 Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.839788 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.920113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-combined-ca-bundle\") pod \"343110b7-9660-4e3f-a56f-d7ee9f89467d\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.920480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n6d9\" (UniqueName: \"kubernetes.io/projected/343110b7-9660-4e3f-a56f-d7ee9f89467d-kube-api-access-8n6d9\") pod \"343110b7-9660-4e3f-a56f-d7ee9f89467d\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.920686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-config\") pod \"343110b7-9660-4e3f-a56f-d7ee9f89467d\" (UID: \"343110b7-9660-4e3f-a56f-d7ee9f89467d\") " Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.926091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343110b7-9660-4e3f-a56f-d7ee9f89467d-kube-api-access-8n6d9" (OuterVolumeSpecName: "kube-api-access-8n6d9") pod "343110b7-9660-4e3f-a56f-d7ee9f89467d" (UID: "343110b7-9660-4e3f-a56f-d7ee9f89467d"). InnerVolumeSpecName "kube-api-access-8n6d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.941786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-config" (OuterVolumeSpecName: "config") pod "343110b7-9660-4e3f-a56f-d7ee9f89467d" (UID: "343110b7-9660-4e3f-a56f-d7ee9f89467d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:27 crc kubenswrapper[4707]: I0121 15:58:27.943051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "343110b7-9660-4e3f-a56f-d7ee9f89467d" (UID: "343110b7-9660-4e3f-a56f-d7ee9f89467d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.023760 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.023798 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n6d9\" (UniqueName: \"kubernetes.io/projected/343110b7-9660-4e3f-a56f-d7ee9f89467d-kube-api-access-8n6d9\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.023826 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/343110b7-9660-4e3f-a56f-d7ee9f89467d-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.520013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerStarted","Data":"847353d388943179ff5421f55370c460ae23d93dc684ed50ecce5e109974e786"} Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.520066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.522827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92a10e5e-cc41-4411-aa34-ff805a6f4926","Type":"ContainerStarted","Data":"f319150b8a2832729ed61b577147cbacc79a449eaf8cb169f9a2179850e486d8"} Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.523051 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.525190 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.525178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-p8pwz" event={"ID":"343110b7-9660-4e3f-a56f-d7ee9f89467d","Type":"ContainerDied","Data":"72e1059c11b42f36bae04841d44667817cf95eb4863f0affd29f15a4c4d2b6e5"} Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.525231 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72e1059c11b42f36bae04841d44667817cf95eb4863f0affd29f15a4c4d2b6e5" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.545610 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.422535988 podStartE2EDuration="5.545592058s" podCreationTimestamp="2026-01-21 15:58:23 +0000 UTC" firstStartedPulling="2026-01-21 15:58:24.311648292 +0000 UTC m=+3401.493164514" lastFinishedPulling="2026-01-21 15:58:27.434704362 +0000 UTC m=+3404.616220584" observedRunningTime="2026-01-21 15:58:28.538086784 +0000 UTC m=+3405.719602996" watchObservedRunningTime="2026-01-21 15:58:28.545592058 +0000 UTC m=+3405.727108280" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.553931 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.553921441 podStartE2EDuration="3.553921441s" podCreationTimestamp="2026-01-21 15:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:28.550341599 +0000 UTC m=+3405.731857821" watchObservedRunningTime="2026-01-21 15:58:28.553921441 +0000 UTC m=+3405.735437663" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.665358 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-845db986f7-l6gvl"] Jan 21 15:58:28 crc kubenswrapper[4707]: E0121 15:58:28.666010 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343110b7-9660-4e3f-a56f-d7ee9f89467d" containerName="neutron-db-sync" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.666029 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="343110b7-9660-4e3f-a56f-d7ee9f89467d" containerName="neutron-db-sync" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.666178 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="343110b7-9660-4e3f-a56f-d7ee9f89467d" containerName="neutron-db-sync" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.671091 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.673480 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.673637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.673965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.673971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-w5lrz" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.674632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-845db986f7-l6gvl"] Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.739519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-httpd-config\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.739592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-ovndb-tls-certs\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.739787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-combined-ca-bundle\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.739860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmbzk\" (UniqueName: \"kubernetes.io/projected/24b81dd8-32fc-4b34-a088-46fbdc2a3153-kube-api-access-wmbzk\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.739954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-config\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.841904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-combined-ca-bundle\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.841980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wmbzk\" (UniqueName: \"kubernetes.io/projected/24b81dd8-32fc-4b34-a088-46fbdc2a3153-kube-api-access-wmbzk\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.842063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-config\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.842210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-httpd-config\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.842259 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-ovndb-tls-certs\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.846131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-httpd-config\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.847315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-ovndb-tls-certs\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.847430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-config\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.847957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-combined-ca-bundle\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.856156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmbzk\" (UniqueName: \"kubernetes.io/projected/24b81dd8-32fc-4b34-a088-46fbdc2a3153-kube-api-access-wmbzk\") pod \"neutron-845db986f7-l6gvl\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.896609 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:28 crc kubenswrapper[4707]: I0121 15:58:28.987950 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.056329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-scripts\") pod \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.056373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-fernet-keys\") pod \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.056403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shpqm\" (UniqueName: \"kubernetes.io/projected/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-kube-api-access-shpqm\") pod \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.056463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-combined-ca-bundle\") pod \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.056510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-config-data\") pod \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.056635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-credential-keys\") pod \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\" (UID: \"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466\") " Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.062846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" (UID: "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.079106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-kube-api-access-shpqm" (OuterVolumeSpecName: "kube-api-access-shpqm") pod "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" (UID: "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466"). InnerVolumeSpecName "kube-api-access-shpqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.082908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-scripts" (OuterVolumeSpecName: "scripts") pod "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" (UID: "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.089927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" (UID: "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.110360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-config-data" (OuterVolumeSpecName: "config-data") pod "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" (UID: "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.113984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" (UID: "bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.159491 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.159540 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.159550 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.159566 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.159576 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.159587 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shpqm\" (UniqueName: \"kubernetes.io/projected/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466-kube-api-access-shpqm\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.533437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" 
event={"ID":"bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466","Type":"ContainerDied","Data":"2dcd7666253e09be17ca3f63dfd053196bda368e66087473539d57f458a41198"} Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.533477 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcd7666253e09be17ca3f63dfd053196bda368e66087473539d57f458a41198" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.533620 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bkt5g" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.603723 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-845db986f7-l6gvl"] Jan 21 15:58:29 crc kubenswrapper[4707]: W0121 15:58:29.614117 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24b81dd8_32fc_4b34_a088_46fbdc2a3153.slice/crio-8098f13feec3ca2b25d017a6f1cf3d73060a5d4bba624dbb5c547d0752d8743e WatchSource:0}: Error finding container 8098f13feec3ca2b25d017a6f1cf3d73060a5d4bba624dbb5c547d0752d8743e: Status 404 returned error can't find the container with id 8098f13feec3ca2b25d017a6f1cf3d73060a5d4bba624dbb5c547d0752d8743e Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.666493 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-65c9d986d8-drhzk"] Jan 21 15:58:29 crc kubenswrapper[4707]: E0121 15:58:29.666835 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" containerName="keystone-bootstrap" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.666852 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" containerName="keystone-bootstrap" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.667023 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" containerName="keystone-bootstrap" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.667536 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.672608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.672774 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vnmw4" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.672848 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.672950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.672995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.674068 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.680105 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-65c9d986d8-drhzk"] Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.770688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl4ms\" (UniqueName: \"kubernetes.io/projected/b998d320-944f-413b-a5b5-2a140207b229-kube-api-access-cl4ms\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.770922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-config-data\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.770955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-public-tls-certs\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.770988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-combined-ca-bundle\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.771011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-internal-tls-certs\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.771032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-credential-keys\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.771099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-scripts\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.771154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-fernet-keys\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-scripts\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-fernet-keys\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl4ms\" (UniqueName: \"kubernetes.io/projected/b998d320-944f-413b-a5b5-2a140207b229-kube-api-access-cl4ms\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-config-data\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-public-tls-certs\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-combined-ca-bundle\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872958 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-internal-tls-certs\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.872980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-credential-keys\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.877086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-public-tls-certs\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.877340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-fernet-keys\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.877868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-internal-tls-certs\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.877962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-credential-keys\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.878060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-combined-ca-bundle\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.879573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-scripts\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.879595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-config-data\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.888137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl4ms\" (UniqueName: 
\"kubernetes.io/projected/b998d320-944f-413b-a5b5-2a140207b229-kube-api-access-cl4ms\") pod \"keystone-65c9d986d8-drhzk\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:29 crc kubenswrapper[4707]: I0121 15:58:29.999511 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.407337 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-65c9d986d8-drhzk"] Jan 21 15:58:30 crc kubenswrapper[4707]: W0121 15:58:30.410135 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb998d320_944f_413b_a5b5_2a140207b229.slice/crio-b0134d4948fd07698704528909442219aa1b5af8bb0646cd44422082686ca787 WatchSource:0}: Error finding container b0134d4948fd07698704528909442219aa1b5af8bb0646cd44422082686ca787: Status 404 returned error can't find the container with id b0134d4948fd07698704528909442219aa1b5af8bb0646cd44422082686ca787 Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.551049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" event={"ID":"24b81dd8-32fc-4b34-a088-46fbdc2a3153","Type":"ContainerStarted","Data":"6ca44840ca88d9e7c7d4c674391885852c2c88359bb32cecc82cd63ccc92cfe1"} Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.551477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.551509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" event={"ID":"24b81dd8-32fc-4b34-a088-46fbdc2a3153","Type":"ContainerStarted","Data":"e8ad06e0a72754cf479badef467fcc4d22b1569e254808d9b209a49b761f92ff"} Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.551546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" event={"ID":"24b81dd8-32fc-4b34-a088-46fbdc2a3153","Type":"ContainerStarted","Data":"8098f13feec3ca2b25d017a6f1cf3d73060a5d4bba624dbb5c547d0752d8743e"} Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.552100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" event={"ID":"b998d320-944f-413b-a5b5-2a140207b229","Type":"ContainerStarted","Data":"b0134d4948fd07698704528909442219aa1b5af8bb0646cd44422082686ca787"} Jan 21 15:58:30 crc kubenswrapper[4707]: I0121 15:58:30.568874 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" podStartSLOduration=2.5688506159999998 podStartE2EDuration="2.568850616s" podCreationTimestamp="2026-01-21 15:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:30.565790412 +0000 UTC m=+3407.747306634" watchObservedRunningTime="2026-01-21 15:58:30.568850616 +0000 UTC m=+3407.750366838" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.031773 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.194610 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 
15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.194792 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api-log" containerID="cri-o://7076b48d6f9315b8936689630cdf63882383354847034f19289822741bedd933" gracePeriod=30 Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.194922 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api" containerID="cri-o://f319150b8a2832729ed61b577147cbacc79a449eaf8cb169f9a2179850e486d8" gracePeriod=30 Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.236767 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.562127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" event={"ID":"b998d320-944f-413b-a5b5-2a140207b229","Type":"ContainerStarted","Data":"de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae"} Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.562548 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.566059 4707 generic.go:334] "Generic (PLEG): container finished" podID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerID="f319150b8a2832729ed61b577147cbacc79a449eaf8cb169f9a2179850e486d8" exitCode=0 Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.566086 4707 generic.go:334] "Generic (PLEG): container finished" podID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerID="7076b48d6f9315b8936689630cdf63882383354847034f19289822741bedd933" exitCode=143 Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.566196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92a10e5e-cc41-4411-aa34-ff805a6f4926","Type":"ContainerDied","Data":"f319150b8a2832729ed61b577147cbacc79a449eaf8cb169f9a2179850e486d8"} Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.566282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92a10e5e-cc41-4411-aa34-ff805a6f4926","Type":"ContainerDied","Data":"7076b48d6f9315b8936689630cdf63882383354847034f19289822741bedd933"} Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.584303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" podStartSLOduration=2.584281714 podStartE2EDuration="2.584281714s" podCreationTimestamp="2026-01-21 15:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:31.577422536 +0000 UTC m=+3408.758938758" watchObservedRunningTime="2026-01-21 15:58:31.584281714 +0000 UTC m=+3408.765797935" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.611679 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.637855 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.663739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.663798 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.701969 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.727410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:31 crc kubenswrapper[4707]: I0121 15:58:31.895553 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.003714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.003791 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.018449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a10e5e-cc41-4411-aa34-ff805a6f4926-logs\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.018502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92a10e5e-cc41-4411-aa34-ff805a6f4926-etc-machine-id\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.018650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-scripts\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.018702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-combined-ca-bundle\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.018742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.018767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data-custom\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 
15:58:32.018788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h28kg\" (UniqueName: \"kubernetes.io/projected/92a10e5e-cc41-4411-aa34-ff805a6f4926-kube-api-access-h28kg\") pod \"92a10e5e-cc41-4411-aa34-ff805a6f4926\" (UID: \"92a10e5e-cc41-4411-aa34-ff805a6f4926\") " Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.023917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92a10e5e-cc41-4411-aa34-ff805a6f4926-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.024136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a10e5e-cc41-4411-aa34-ff805a6f4926-logs" (OuterVolumeSpecName: "logs") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.024388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-scripts" (OuterVolumeSpecName: "scripts") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.024652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a10e5e-cc41-4411-aa34-ff805a6f4926-kube-api-access-h28kg" (OuterVolumeSpecName: "kube-api-access-h28kg") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "kube-api-access-h28kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.026803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.034531 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.044094 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.055620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.069110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data" (OuterVolumeSpecName: "config-data") pod "92a10e5e-cc41-4411-aa34-ff805a6f4926" (UID: "92a10e5e-cc41-4411-aa34-ff805a6f4926"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121631 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121668 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121681 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121692 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92a10e5e-cc41-4411-aa34-ff805a6f4926-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121703 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h28kg\" (UniqueName: \"kubernetes.io/projected/92a10e5e-cc41-4411-aa34-ff805a6f4926-kube-api-access-h28kg\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121713 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a10e5e-cc41-4411-aa34-ff805a6f4926-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.121721 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92a10e5e-cc41-4411-aa34-ff805a6f4926-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.573848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92a10e5e-cc41-4411-aa34-ff805a6f4926","Type":"ContainerDied","Data":"d92f7f2dfdb10091f1c58cf9b6dedcec2b169317e50c782499b31783947a2122"} Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.573874 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.574265 4707 scope.go:117] "RemoveContainer" containerID="f319150b8a2832729ed61b577147cbacc79a449eaf8cb169f9a2179850e486d8" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.574216 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="probe" containerID="cri-o://5437d1cfb64bfe99756650386287f336ddc7e044d8c686f5c56a2fb71f4974a7" gracePeriod=30 Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.574159 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="cinder-scheduler" containerID="cri-o://107edd2630c866d3ba0537554d5ad718dcb481fc68ca874bf69643d605bd3052" gracePeriod=30 Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.574643 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.574693 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.574704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.575206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.597765 4707 scope.go:117] "RemoveContainer" containerID="7076b48d6f9315b8936689630cdf63882383354847034f19289822741bedd933" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.607890 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.618718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.629326 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:32 crc kubenswrapper[4707]: E0121 15:58:32.629663 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.629680 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api" Jan 21 15:58:32 crc kubenswrapper[4707]: E0121 15:58:32.629697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api-log" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.629703 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api-log" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.630171 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" containerName="cinder-api-log" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.630194 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" 
containerName="cinder-api" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.630999 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.633469 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.633716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.633911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.645747 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.730899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.730973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kjc\" (UniqueName: \"kubernetes.io/projected/b8a64a9a-c654-4d05-8504-0839ba022204-kube-api-access-n7kjc\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8a64a9a-c654-4d05-8504-0839ba022204-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a64a9a-c654-4d05-8504-0839ba022204-logs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-scripts\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.731149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data-custom\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kjc\" (UniqueName: \"kubernetes.io/projected/b8a64a9a-c654-4d05-8504-0839ba022204-kube-api-access-n7kjc\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8a64a9a-c654-4d05-8504-0839ba022204-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a64a9a-c654-4d05-8504-0839ba022204-logs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832547 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-scripts\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.832570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data-custom\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.833497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8a64a9a-c654-4d05-8504-0839ba022204-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.833803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a64a9a-c654-4d05-8504-0839ba022204-logs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.843221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-scripts\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.850584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data-custom\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.851395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.852268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.852631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.863253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.875489 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n7kjc\" (UniqueName: \"kubernetes.io/projected/b8a64a9a-c654-4d05-8504-0839ba022204-kube-api-access-n7kjc\") pod \"cinder-api-0\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:32 crc kubenswrapper[4707]: I0121 15:58:32.948126 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.196456 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a10e5e-cc41-4411-aa34-ff805a6f4926" path="/var/lib/kubelet/pods/92a10e5e-cc41-4411-aa34-ff805a6f4926/volumes" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.395674 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 15:58:33 crc kubenswrapper[4707]: W0121 15:58:33.414980 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8a64a9a_c654_4d05_8504_0839ba022204.slice/crio-ce975a22a57bfcc4a891f3fbd826af3e61d48a5d09a024fd529209db89980e72 WatchSource:0}: Error finding container ce975a22a57bfcc4a891f3fbd826af3e61d48a5d09a024fd529209db89980e72: Status 404 returned error can't find the container with id ce975a22a57bfcc4a891f3fbd826af3e61d48a5d09a024fd529209db89980e72 Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.590324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b8a64a9a-c654-4d05-8504-0839ba022204","Type":"ContainerStarted","Data":"ce975a22a57bfcc4a891f3fbd826af3e61d48a5d09a024fd529209db89980e72"} Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.603232 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerID="5437d1cfb64bfe99756650386287f336ddc7e044d8c686f5c56a2fb71f4974a7" exitCode=0 Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.603254 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerID="107edd2630c866d3ba0537554d5ad718dcb481fc68ca874bf69643d605bd3052" exitCode=0 Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.603305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f","Type":"ContainerDied","Data":"5437d1cfb64bfe99756650386287f336ddc7e044d8c686f5c56a2fb71f4974a7"} Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.603342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f","Type":"ContainerDied","Data":"107edd2630c866d3ba0537554d5ad718dcb481fc68ca874bf69643d605bd3052"} Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.709076 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.755149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-scripts\") pod \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.755207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data\") pod \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.755231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-etc-machine-id\") pod \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.755345 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkpkr\" (UniqueName: \"kubernetes.io/projected/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-kube-api-access-dkpkr\") pod \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.755374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data-custom\") pod \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.755444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-combined-ca-bundle\") pod \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\" (UID: \"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f\") " Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.756461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" (UID: "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.763781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-kube-api-access-dkpkr" (OuterVolumeSpecName: "kube-api-access-dkpkr") pod "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" (UID: "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f"). InnerVolumeSpecName "kube-api-access-dkpkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.765360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-scripts" (OuterVolumeSpecName: "scripts") pod "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" (UID: "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.777433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" (UID: "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.813996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" (UID: "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.858613 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkpkr\" (UniqueName: \"kubernetes.io/projected/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-kube-api-access-dkpkr\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.858644 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.858653 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.858662 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.858670 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.882007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data" (OuterVolumeSpecName: "config-data") pod "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" (UID: "a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:33 crc kubenswrapper[4707]: I0121 15:58:33.961990 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.504018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.541219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.604631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.629553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b8a64a9a-c654-4d05-8504-0839ba022204","Type":"ContainerStarted","Data":"8e2da5aaf70f2dca26c0a7481b9da9f4a95637fa22e3641c79f7ab645fd6829e"} Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.629606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b8a64a9a-c654-4d05-8504-0839ba022204","Type":"ContainerStarted","Data":"693aa9dee02beefab5f62e72db83696878302f9d43403043bdbdd0a96bbd28b4"} Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.631072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.665242 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.666490 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.666794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f","Type":"ContainerDied","Data":"2269a624e85f4bbf7643892e581a4b8aee4bfe3485d43969571772e01e94a2f3"} Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.666927 4707 scope.go:117] "RemoveContainer" containerID="5437d1cfb64bfe99756650386287f336ddc7e044d8c686f5c56a2fb71f4974a7" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.669214 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.6691917739999997 podStartE2EDuration="2.669191774s" podCreationTimestamp="2026-01-21 15:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:34.65565054 +0000 UTC m=+3411.837166762" watchObservedRunningTime="2026-01-21 15:58:34.669191774 +0000 UTC m=+3411.850707995" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.704331 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.710065 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.712954 4707 scope.go:117] "RemoveContainer" containerID="107edd2630c866d3ba0537554d5ad718dcb481fc68ca874bf69643d605bd3052" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.727705 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:34 crc kubenswrapper[4707]: E0121 15:58:34.728148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="probe" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.728161 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="probe" Jan 21 15:58:34 crc kubenswrapper[4707]: E0121 15:58:34.728171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="cinder-scheduler" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.728177 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="cinder-scheduler" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.728369 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="probe" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.728392 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" containerName="cinder-scheduler" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.729264 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.734875 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.743193 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.780798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-scripts\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.781571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.781648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.781677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdjjd\" (UniqueName: \"kubernetes.io/projected/c6da68d4-659d-4cf2-a796-899b51ef8939-kube-api-access-fdjjd\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.781959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.781981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6da68d4-659d-4cf2-a796-899b51ef8939-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.883258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.883308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdjjd\" (UniqueName: \"kubernetes.io/projected/c6da68d4-659d-4cf2-a796-899b51ef8939-kube-api-access-fdjjd\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.883399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.883417 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6da68d4-659d-4cf2-a796-899b51ef8939-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.883456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-scripts\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.883474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.886093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6da68d4-659d-4cf2-a796-899b51ef8939-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.898939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.901588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.902082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-scripts\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.914660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdjjd\" (UniqueName: \"kubernetes.io/projected/c6da68d4-659d-4cf2-a796-899b51ef8939-kube-api-access-fdjjd\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:34 crc kubenswrapper[4707]: I0121 15:58:34.915096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data\") pod \"cinder-scheduler-0\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.007696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.057830 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.193778 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f" path="/var/lib/kubelet/pods/a0a96eb3-ebd0-4ae9-a03a-0bfcf99f0f8f/volumes" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.490389 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 15:58:35 crc kubenswrapper[4707]: W0121 15:58:35.499467 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6da68d4_659d_4cf2_a796_899b51ef8939.slice/crio-38cc8ec1a09a6dcfe428bc7290accb1c02c7103180c8c183974ac06414a836a8 WatchSource:0}: Error finding container 38cc8ec1a09a6dcfe428bc7290accb1c02c7103180c8c183974ac06414a836a8: Status 404 returned error can't find the container with id 38cc8ec1a09a6dcfe428bc7290accb1c02c7103180c8c183974ac06414a836a8 Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.576860 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5cbd44866-48h8x"] Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.578008 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.583715 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.583924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.597712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmx4l\" (UniqueName: \"kubernetes.io/projected/d30e5918-95e9-4870-bd2e-d95fe0e448d8-kube-api-access-dmx4l\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.597745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-config\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.597769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-internal-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.597788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-httpd-config\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.597838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-ovndb-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.597981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-combined-ca-bundle\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.598102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-public-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.605953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5cbd44866-48h8x"] Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.688402 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c6da68d4-659d-4cf2-a796-899b51ef8939","Type":"ContainerStarted","Data":"38cc8ec1a09a6dcfe428bc7290accb1c02c7103180c8c183974ac06414a836a8"} Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmx4l\" (UniqueName: \"kubernetes.io/projected/d30e5918-95e9-4870-bd2e-d95fe0e448d8-kube-api-access-dmx4l\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-config\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-internal-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-httpd-config\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-ovndb-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-combined-ca-bundle\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.699994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-public-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.704216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-internal-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.704352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-httpd-config\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.705959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-combined-ca-bundle\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.706710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-config\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.707150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-ovndb-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.707226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-public-tls-certs\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.712531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmx4l\" (UniqueName: \"kubernetes.io/projected/d30e5918-95e9-4870-bd2e-d95fe0e448d8-kube-api-access-dmx4l\") pod \"neutron-5cbd44866-48h8x\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.809444 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.856416 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5b545794dd-j668n"] Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.870627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.880234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.880473 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.892463 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:35 crc kubenswrapper[4707]: I0121 15:58:35.927540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5b545794dd-j668n"] Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.011401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-combined-ca-bundle\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.011604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-internal-tls-certs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.011649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79k2p\" (UniqueName: \"kubernetes.io/projected/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-kube-api-access-79k2p\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.011671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-public-tls-certs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.011711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data-custom\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.011864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.012017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-logs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.024277 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-internal-tls-certs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79k2p\" (UniqueName: \"kubernetes.io/projected/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-kube-api-access-79k2p\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-public-tls-certs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data-custom\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-logs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.114269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-combined-ca-bundle\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.115300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-logs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.120723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-public-tls-certs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.120726 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data-custom\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.121088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.122593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-internal-tls-certs\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.126058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-combined-ca-bundle\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.128100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79k2p\" (UniqueName: \"kubernetes.io/projected/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-kube-api-access-79k2p\") pod \"barbican-api-5b545794dd-j668n\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.248472 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.401627 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5cbd44866-48h8x"] Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.679279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5b545794dd-j668n"] Jan 21 15:58:36 crc kubenswrapper[4707]: W0121 15:58:36.688272 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode82ec0cb_eb14_411c_a7a7_6d231875fe6e.slice/crio-d2cc4e8519a8e80f74f07e008baf76b9206d7734490a1bcc720f4b0f977d3122 WatchSource:0}: Error finding container d2cc4e8519a8e80f74f07e008baf76b9206d7734490a1bcc720f4b0f977d3122: Status 404 returned error can't find the container with id d2cc4e8519a8e80f74f07e008baf76b9206d7734490a1bcc720f4b0f977d3122 Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.698162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" event={"ID":"d30e5918-95e9-4870-bd2e-d95fe0e448d8","Type":"ContainerStarted","Data":"ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877"} Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.698199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" event={"ID":"d30e5918-95e9-4870-bd2e-d95fe0e448d8","Type":"ContainerStarted","Data":"36b42e557ae4bb41381ef7e924e97163466c22820d9e58e718f6052192da6df4"} Jan 21 15:58:36 crc kubenswrapper[4707]: I0121 15:58:36.700189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c6da68d4-659d-4cf2-a796-899b51ef8939","Type":"ContainerStarted","Data":"3b3595d781817f63d2073805ae987be2763662eff7d9cb701993e99941539a30"} Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.713771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" event={"ID":"d30e5918-95e9-4870-bd2e-d95fe0e448d8","Type":"ContainerStarted","Data":"13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871"} Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.714235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.715748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" event={"ID":"e82ec0cb-eb14-411c-a7a7-6d231875fe6e","Type":"ContainerStarted","Data":"3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064"} Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.715771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" event={"ID":"e82ec0cb-eb14-411c-a7a7-6d231875fe6e","Type":"ContainerStarted","Data":"0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9"} Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.715783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" event={"ID":"e82ec0cb-eb14-411c-a7a7-6d231875fe6e","Type":"ContainerStarted","Data":"d2cc4e8519a8e80f74f07e008baf76b9206d7734490a1bcc720f4b0f977d3122"} Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.716397 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.716423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.722668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c6da68d4-659d-4cf2-a796-899b51ef8939","Type":"ContainerStarted","Data":"35687d4e9d2f04767e4e660723736595964bd33eb7c312847844b1f32e3a1a5a"} Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.741427 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" podStartSLOduration=2.7414134949999998 podStartE2EDuration="2.741413495s" podCreationTimestamp="2026-01-21 15:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:37.731795679 +0000 UTC m=+3414.913311901" watchObservedRunningTime="2026-01-21 15:58:37.741413495 +0000 UTC m=+3414.922929716" Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.773282 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.7732657 podStartE2EDuration="3.7732657s" podCreationTimestamp="2026-01-21 15:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:37.755954467 +0000 UTC m=+3414.937470689" watchObservedRunningTime="2026-01-21 15:58:37.7732657 +0000 UTC m=+3414.954781922" Jan 21 15:58:37 crc kubenswrapper[4707]: I0121 15:58:37.775150 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" podStartSLOduration=2.775139093 podStartE2EDuration="2.775139093s" podCreationTimestamp="2026-01-21 15:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:58:37.771076222 +0000 UTC m=+3414.952592445" watchObservedRunningTime="2026-01-21 15:58:37.775139093 +0000 UTC m=+3414.956655315" Jan 21 15:58:38 crc kubenswrapper[4707]: I0121 15:58:38.182636 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:58:38 crc kubenswrapper[4707]: E0121 15:58:38.183028 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 15:58:40 crc kubenswrapper[4707]: I0121 15:58:40.058952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:42 crc kubenswrapper[4707]: I0121 15:58:42.518872 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 15:58:43 crc kubenswrapper[4707]: I0121 15:58:43.687975 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 
15:58:43 crc kubenswrapper[4707]: I0121 15:58:43.741504 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-647f58596d-4pnqh"] Jan 21 15:58:43 crc kubenswrapper[4707]: I0121 15:58:43.741769 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api-log" containerID="cri-o://b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0" gracePeriod=30 Jan 21 15:58:43 crc kubenswrapper[4707]: I0121 15:58:43.741909 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api" containerID="cri-o://cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8" gracePeriod=30 Jan 21 15:58:44 crc kubenswrapper[4707]: I0121 15:58:44.582714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 15:58:44 crc kubenswrapper[4707]: I0121 15:58:44.771152 4707 generic.go:334] "Generic (PLEG): container finished" podID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerID="b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0" exitCode=143 Jan 21 15:58:44 crc kubenswrapper[4707]: I0121 15:58:44.771191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" event={"ID":"720563ae-0736-4053-bd1d-8e36a8e1db51","Type":"ContainerDied","Data":"b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0"} Jan 21 15:58:45 crc kubenswrapper[4707]: I0121 15:58:45.226827 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 15:58:46 crc kubenswrapper[4707]: I0121 15:58:46.876972 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.250:9311/healthcheck\": read tcp 10.217.0.2:33048->10.217.0.250:9311: read: connection reset by peer" Jan 21 15:58:46 crc kubenswrapper[4707]: I0121 15:58:46.878152 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.250:9311/healthcheck\": read tcp 10.217.0.2:33064->10.217.0.250:9311: read: connection reset by peer" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.249209 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.408771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-combined-ca-bundle\") pod \"720563ae-0736-4053-bd1d-8e36a8e1db51\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.408842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w99zh\" (UniqueName: \"kubernetes.io/projected/720563ae-0736-4053-bd1d-8e36a8e1db51-kube-api-access-w99zh\") pod \"720563ae-0736-4053-bd1d-8e36a8e1db51\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.408946 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data\") pod \"720563ae-0736-4053-bd1d-8e36a8e1db51\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.408999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720563ae-0736-4053-bd1d-8e36a8e1db51-logs\") pod \"720563ae-0736-4053-bd1d-8e36a8e1db51\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.409045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data-custom\") pod \"720563ae-0736-4053-bd1d-8e36a8e1db51\" (UID: \"720563ae-0736-4053-bd1d-8e36a8e1db51\") " Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.409887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720563ae-0736-4053-bd1d-8e36a8e1db51-logs" (OuterVolumeSpecName: "logs") pod "720563ae-0736-4053-bd1d-8e36a8e1db51" (UID: "720563ae-0736-4053-bd1d-8e36a8e1db51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.410388 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720563ae-0736-4053-bd1d-8e36a8e1db51-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.414399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720563ae-0736-4053-bd1d-8e36a8e1db51-kube-api-access-w99zh" (OuterVolumeSpecName: "kube-api-access-w99zh") pod "720563ae-0736-4053-bd1d-8e36a8e1db51" (UID: "720563ae-0736-4053-bd1d-8e36a8e1db51"). InnerVolumeSpecName "kube-api-access-w99zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.415239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "720563ae-0736-4053-bd1d-8e36a8e1db51" (UID: "720563ae-0736-4053-bd1d-8e36a8e1db51"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.429053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "720563ae-0736-4053-bd1d-8e36a8e1db51" (UID: "720563ae-0736-4053-bd1d-8e36a8e1db51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.457059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data" (OuterVolumeSpecName: "config-data") pod "720563ae-0736-4053-bd1d-8e36a8e1db51" (UID: "720563ae-0736-4053-bd1d-8e36a8e1db51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.512316 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.512676 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.512687 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w99zh\" (UniqueName: \"kubernetes.io/projected/720563ae-0736-4053-bd1d-8e36a8e1db51-kube-api-access-w99zh\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.512699 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720563ae-0736-4053-bd1d-8e36a8e1db51-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.810067 4707 generic.go:334] "Generic (PLEG): container finished" podID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerID="cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8" exitCode=0 Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.810108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" event={"ID":"720563ae-0736-4053-bd1d-8e36a8e1db51","Type":"ContainerDied","Data":"cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8"} Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.810123 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.810153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-647f58596d-4pnqh" event={"ID":"720563ae-0736-4053-bd1d-8e36a8e1db51","Type":"ContainerDied","Data":"d2145b542e0006506f0fbddbdfe10d01e4d3570507b72d007a7424127f4c4ee9"} Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.810193 4707 scope.go:117] "RemoveContainer" containerID="cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.837929 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-647f58596d-4pnqh"] Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.843224 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-647f58596d-4pnqh"] Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.843826 4707 scope.go:117] "RemoveContainer" containerID="b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.859019 4707 scope.go:117] "RemoveContainer" containerID="cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8" Jan 21 15:58:47 crc kubenswrapper[4707]: E0121 15:58:47.859260 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8\": container with ID starting with cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8 not found: ID does not exist" containerID="cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.859297 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8"} err="failed to get container status \"cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8\": rpc error: code = NotFound desc = could not find container \"cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8\": container with ID starting with cf437c37b03bf2d49edddfd3669f6f411c66f45df5fae3c7d57040d33dc45dd8 not found: ID does not exist" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.859316 4707 scope.go:117] "RemoveContainer" containerID="b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0" Jan 21 15:58:47 crc kubenswrapper[4707]: E0121 15:58:47.859531 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0\": container with ID starting with b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0 not found: ID does not exist" containerID="b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0" Jan 21 15:58:47 crc kubenswrapper[4707]: I0121 15:58:47.859555 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0"} err="failed to get container status \"b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0\": rpc error: code = NotFound desc = could not find container \"b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0\": container with ID starting with 
b974d4dde554e4c0a4a41ad37ab3a1ccbe9ecb91a186da764d14fde284dea9e0 not found: ID does not exist" Jan 21 15:58:49 crc kubenswrapper[4707]: I0121 15:58:49.186133 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 15:58:49 crc kubenswrapper[4707]: I0121 15:58:49.194997 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" path="/var/lib/kubelet/pods/720563ae-0736-4053-bd1d-8e36a8e1db51/volumes" Jan 21 15:58:49 crc kubenswrapper[4707]: I0121 15:58:49.827603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"c95e964b40a0887841892803aa9f77fc9d3e35044d810c4610890872e712c158"} Jan 21 15:58:53 crc kubenswrapper[4707]: I0121 15:58:53.759179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:58:54 crc kubenswrapper[4707]: I0121 15:58:54.158195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:55 crc kubenswrapper[4707]: I0121 15:58:55.157862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 15:58:58 crc kubenswrapper[4707]: I0121 15:58:58.995887 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:59:01 crc kubenswrapper[4707]: I0121 15:59:01.292841 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.629409 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:05 crc kubenswrapper[4707]: E0121 15:59:05.630165 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.630178 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api" Jan 21 15:59:05 crc kubenswrapper[4707]: E0121 15:59:05.630204 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api-log" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.630210 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api-log" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.630366 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api-log" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.630376 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="720563ae-0736-4053-bd1d-8e36a8e1db51" containerName="barbican-api" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.630934 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.634514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.634936 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-9slqd" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.634946 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.643022 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.646747 4707 status_manager.go:875] "Failed to update status for pod" pod="openstack-kuttl-tests/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2kf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:59:05Z\\\"}}\" for pod \"openstack-kuttl-tests\"/\"openstackclient\": pods \"openstackclient\" not found" Jan 21 15:59:05 crc 
kubenswrapper[4707]: I0121 15:59:05.656940 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:05 crc kubenswrapper[4707]: E0121 15:59:05.657708 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-h2kf2 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-h2kf2 openstack-config openstack-config-secret]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.667060 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.673006 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.673995 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.684059 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.692987 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.708819 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nz6c\" (UniqueName: \"kubernetes.io/projected/23a190c1-7893-4f25-af82-4f97b95b4e68-kube-api-access-4nz6c\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.708857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config-secret\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.708881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.708921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.810026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nz6c\" (UniqueName: \"kubernetes.io/projected/23a190c1-7893-4f25-af82-4f97b95b4e68-kube-api-access-4nz6c\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.810072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config-secret\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.810098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.810138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.811103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.823251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config-secret\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.824715 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.826966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nz6c\" (UniqueName: \"kubernetes.io/projected/23a190c1-7893-4f25-af82-4f97b95b4e68-kube-api-access-4nz6c\") pod \"openstackclient\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.903121 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.937918 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.942372 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.949067 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-845db986f7-l6gvl"] Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.950529 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-api" containerID="cri-o://e8ad06e0a72754cf479badef467fcc4d22b1569e254808d9b209a49b761f92ff" gracePeriod=30 Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.951047 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-httpd" containerID="cri-o://6ca44840ca88d9e7c7d4c674391885852c2c88359bb32cecc82cd63ccc92cfe1" gracePeriod=30 Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.952425 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.956062 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" Jan 21 15:59:05 crc kubenswrapper[4707]: I0121 15:59:05.993494 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.430104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.948430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"23a190c1-7893-4f25-af82-4f97b95b4e68","Type":"ContainerStarted","Data":"8dac5ac8a890cd9197910e542e914fc479cea9be24fdd5964752473d60226d8d"} Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.948707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"23a190c1-7893-4f25-af82-4f97b95b4e68","Type":"ContainerStarted","Data":"eb4489d91c22069aa9b44e8dd07c81d67cbcd460363c1271b37051c7c0beb733"} Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.950536 4707 generic.go:334] "Generic (PLEG): container finished" podID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerID="6ca44840ca88d9e7c7d4c674391885852c2c88359bb32cecc82cd63ccc92cfe1" exitCode=0 Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.950620 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.950607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" event={"ID":"24b81dd8-32fc-4b34-a088-46fbdc2a3153","Type":"ContainerDied","Data":"6ca44840ca88d9e7c7d4c674391885852c2c88359bb32cecc82cd63ccc92cfe1"} Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.965638 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" Jan 21 15:59:06 crc kubenswrapper[4707]: I0121 15:59:06.966483 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.9664712930000001 podStartE2EDuration="1.966471293s" podCreationTimestamp="2026-01-21 15:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:59:06.960748521 +0000 UTC m=+3444.142264743" watchObservedRunningTime="2026-01-21 15:59:06.966471293 +0000 UTC m=+3444.147987515" Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.041710 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.041901 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="ff49b548-7c18-4eba-a23b-9b22dc4e5982" containerName="kube-state-metrics" containerID="cri-o://38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3" gracePeriod=30 Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.190953 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1" path="/var/lib/kubelet/pods/8e3a784a-4fe7-4bcc-bcb5-b58028ef67a1/volumes" Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.443708 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.640296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jgrh\" (UniqueName: \"kubernetes.io/projected/ff49b548-7c18-4eba-a23b-9b22dc4e5982-kube-api-access-2jgrh\") pod \"ff49b548-7c18-4eba-a23b-9b22dc4e5982\" (UID: \"ff49b548-7c18-4eba-a23b-9b22dc4e5982\") " Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.652354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff49b548-7c18-4eba-a23b-9b22dc4e5982-kube-api-access-2jgrh" (OuterVolumeSpecName: "kube-api-access-2jgrh") pod "ff49b548-7c18-4eba-a23b-9b22dc4e5982" (UID: "ff49b548-7c18-4eba-a23b-9b22dc4e5982"). InnerVolumeSpecName "kube-api-access-2jgrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.742910 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jgrh\" (UniqueName: \"kubernetes.io/projected/ff49b548-7c18-4eba-a23b-9b22dc4e5982-kube-api-access-2jgrh\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.965345 4707 generic.go:334] "Generic (PLEG): container finished" podID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerID="e8ad06e0a72754cf479badef467fcc4d22b1569e254808d9b209a49b761f92ff" exitCode=0 Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.965433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" event={"ID":"24b81dd8-32fc-4b34-a088-46fbdc2a3153","Type":"ContainerDied","Data":"e8ad06e0a72754cf479badef467fcc4d22b1569e254808d9b209a49b761f92ff"} Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.970101 4707 generic.go:334] "Generic (PLEG): container finished" podID="ff49b548-7c18-4eba-a23b-9b22dc4e5982" containerID="38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3" exitCode=2 Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.970154 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.970177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"ff49b548-7c18-4eba-a23b-9b22dc4e5982","Type":"ContainerDied","Data":"38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3"} Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.970194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"ff49b548-7c18-4eba-a23b-9b22dc4e5982","Type":"ContainerDied","Data":"65e4e2036d324214de6fb4345cdfbcc3bbab91d8820d175f6cd136b4dc4d6fa2"} Jan 21 15:59:07 crc kubenswrapper[4707]: I0121 15:59:07.970210 4707 scope.go:117] "RemoveContainer" containerID="38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.013903 4707 scope.go:117] "RemoveContainer" containerID="38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3" Jan 21 15:59:08 crc kubenswrapper[4707]: E0121 15:59:08.014658 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3\": container with ID starting with 38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3 not found: ID does not exist" containerID="38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.014688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3"} err="failed to get container status \"38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3\": rpc error: code = NotFound desc = could not find container \"38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3\": container with ID starting with 38844e90a733136c3a81a7b7219b6764f7f0a24a1bcac5abafe03a3aebc1e3f3 not found: ID does not exist" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.019280 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.029449 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.035598 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:59:08 crc kubenswrapper[4707]: E0121 15:59:08.035948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff49b548-7c18-4eba-a23b-9b22dc4e5982" containerName="kube-state-metrics" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.035963 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff49b548-7c18-4eba-a23b-9b22dc4e5982" containerName="kube-state-metrics" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.036141 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff49b548-7c18-4eba-a23b-9b22dc4e5982" containerName="kube-state-metrics" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.036653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.043995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.044008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.047785 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.137332 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.151696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.152044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfjd\" (UniqueName: \"kubernetes.io/projected/878bb642-989b-4ea2-825e-514263f9545a-kube-api-access-brfjd\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.152071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.152212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.158163 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc"] Jan 21 15:59:08 crc kubenswrapper[4707]: E0121 15:59:08.158578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-api" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.158648 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-api" Jan 21 15:59:08 crc kubenswrapper[4707]: E0121 15:59:08.158722 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-httpd" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.158765 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-httpd" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.158985 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-httpd" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.159046 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" containerName="neutron-api" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.175176 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.177580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.177783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.198876 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.210796 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc"] Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.253795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-httpd-config\") pod \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.253848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-ovndb-tls-certs\") pod \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.253980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-combined-ca-bundle\") pod \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.254342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-config\") pod \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.254908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmbzk\" (UniqueName: \"kubernetes.io/projected/24b81dd8-32fc-4b34-a088-46fbdc2a3153-kube-api-access-wmbzk\") pod \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\" (UID: \"24b81dd8-32fc-4b34-a088-46fbdc2a3153\") " Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-log-httpd\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-config-data\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-public-tls-certs\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p87d\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-kube-api-access-2p87d\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brfjd\" (UniqueName: \"kubernetes.io/projected/878bb642-989b-4ea2-825e-514263f9545a-kube-api-access-brfjd\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-run-httpd\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-combined-ca-bundle\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-etc-swift\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-internal-tls-certs\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " 
pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.255889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.258071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b81dd8-32fc-4b34-a088-46fbdc2a3153-kube-api-access-wmbzk" (OuterVolumeSpecName: "kube-api-access-wmbzk") pod "24b81dd8-32fc-4b34-a088-46fbdc2a3153" (UID: "24b81dd8-32fc-4b34-a088-46fbdc2a3153"). InnerVolumeSpecName "kube-api-access-wmbzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.259779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.259937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "24b81dd8-32fc-4b34-a088-46fbdc2a3153" (UID: "24b81dd8-32fc-4b34-a088-46fbdc2a3153"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.260650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.271144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfjd\" (UniqueName: \"kubernetes.io/projected/878bb642-989b-4ea2-825e-514263f9545a-kube-api-access-brfjd\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.276176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.290343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-config" (OuterVolumeSpecName: "config") pod "24b81dd8-32fc-4b34-a088-46fbdc2a3153" (UID: "24b81dd8-32fc-4b34-a088-46fbdc2a3153"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.292637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24b81dd8-32fc-4b34-a088-46fbdc2a3153" (UID: "24b81dd8-32fc-4b34-a088-46fbdc2a3153"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.304271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "24b81dd8-32fc-4b34-a088-46fbdc2a3153" (UID: "24b81dd8-32fc-4b34-a088-46fbdc2a3153"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-combined-ca-bundle\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-etc-swift\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-internal-tls-certs\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-log-httpd\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-config-data\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-public-tls-certs\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p87d\" (UniqueName: 
\"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-kube-api-access-2p87d\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-run-httpd\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357431 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357443 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357453 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmbzk\" (UniqueName: \"kubernetes.io/projected/24b81dd8-32fc-4b34-a088-46fbdc2a3153-kube-api-access-wmbzk\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357463 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357472 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b81dd8-32fc-4b34-a088-46fbdc2a3153-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.357783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-run-httpd\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.358361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-log-httpd\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.360051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-internal-tls-certs\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.360362 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.360955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-combined-ca-bundle\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.361862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-config-data\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.368643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-etc-swift\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.369028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-public-tls-certs\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.372242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p87d\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-kube-api-access-2p87d\") pod \"swift-proxy-8545fb474-dhvxc\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.513723 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.594861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.595083 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-central-agent" containerID="cri-o://934f93e9e7e32bb8adeddf2391c02267a4cea98ae6d010ce86176d45f0fbb15b" gracePeriod=30 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.595442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="proxy-httpd" containerID="cri-o://847353d388943179ff5421f55370c460ae23d93dc684ed50ecce5e109974e786" gracePeriod=30 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.595502 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="sg-core" containerID="cri-o://452e01c8458d06be38dd0725caeef55a6882076a46f333419461b139fd570790" gracePeriod=30 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.595536 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-notification-agent" containerID="cri-o://df912f5a069085faa23dd8c3228fe5debb72d477b7f8f77d20df94f07d0641f9" gracePeriod=30 Jan 21 15:59:08 crc kubenswrapper[4707]: W0121 15:59:08.734962 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod878bb642_989b_4ea2_825e_514263f9545a.slice/crio-ffc8dc78bc39353273f375e7210b8d9d0da42cb11e26ccb55670c33ff768584f WatchSource:0}: Error finding container ffc8dc78bc39353273f375e7210b8d9d0da42cb11e26ccb55670c33ff768584f: Status 404 returned error can't find the container with id ffc8dc78bc39353273f375e7210b8d9d0da42cb11e26ccb55670c33ff768584f Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.735788 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.909481 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc"] Jan 21 15:59:08 crc kubenswrapper[4707]: W0121 15:59:08.919831 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7476b843_007f_4dde_8bc6_c623dbc1c101.slice/crio-7657f3179bec23b81d3e1eb3e92db7994e09a2908896463bcc24b7cbcff134e7 WatchSource:0}: Error finding container 7657f3179bec23b81d3e1eb3e92db7994e09a2908896463bcc24b7cbcff134e7: Status 404 returned error can't find the container with id 7657f3179bec23b81d3e1eb3e92db7994e09a2908896463bcc24b7cbcff134e7 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.979198 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4341da7-00d4-472a-a479-16602bcb7495" containerID="847353d388943179ff5421f55370c460ae23d93dc684ed50ecce5e109974e786" exitCode=0 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.979220 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4341da7-00d4-472a-a479-16602bcb7495" 
containerID="452e01c8458d06be38dd0725caeef55a6882076a46f333419461b139fd570790" exitCode=2 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.979227 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4341da7-00d4-472a-a479-16602bcb7495" containerID="934f93e9e7e32bb8adeddf2391c02267a4cea98ae6d010ce86176d45f0fbb15b" exitCode=0 Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.979230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerDied","Data":"847353d388943179ff5421f55370c460ae23d93dc684ed50ecce5e109974e786"} Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.979271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerDied","Data":"452e01c8458d06be38dd0725caeef55a6882076a46f333419461b139fd570790"} Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.979283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerDied","Data":"934f93e9e7e32bb8adeddf2391c02267a4cea98ae6d010ce86176d45f0fbb15b"} Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.980748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" event={"ID":"24b81dd8-32fc-4b34-a088-46fbdc2a3153","Type":"ContainerDied","Data":"8098f13feec3ca2b25d017a6f1cf3d73060a5d4bba624dbb5c547d0752d8743e"} Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.980789 4707 scope.go:117] "RemoveContainer" containerID="6ca44840ca88d9e7c7d4c674391885852c2c88359bb32cecc82cd63ccc92cfe1" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.980789 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-845db986f7-l6gvl" Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.983278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"878bb642-989b-4ea2-825e-514263f9545a","Type":"ContainerStarted","Data":"ffc8dc78bc39353273f375e7210b8d9d0da42cb11e26ccb55670c33ff768584f"} Jan 21 15:59:08 crc kubenswrapper[4707]: I0121 15:59:08.984471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" event={"ID":"7476b843-007f-4dde-8bc6-c623dbc1c101","Type":"ContainerStarted","Data":"7657f3179bec23b81d3e1eb3e92db7994e09a2908896463bcc24b7cbcff134e7"} Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.045962 4707 scope.go:117] "RemoveContainer" containerID="e8ad06e0a72754cf479badef467fcc4d22b1569e254808d9b209a49b761f92ff" Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.065057 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-845db986f7-l6gvl"] Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.070157 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-845db986f7-l6gvl"] Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.191459 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b81dd8-32fc-4b34-a088-46fbdc2a3153" path="/var/lib/kubelet/pods/24b81dd8-32fc-4b34-a088-46fbdc2a3153/volumes" Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.191989 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff49b548-7c18-4eba-a23b-9b22dc4e5982" path="/var/lib/kubelet/pods/ff49b548-7c18-4eba-a23b-9b22dc4e5982/volumes" Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.629273 4707 scope.go:117] "RemoveContainer" containerID="8cf71330f22a9f711f2a9aa118097de5a675b96809b3a77139fbb95dc742909f" Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.645544 4707 scope.go:117] "RemoveContainer" containerID="7728de97aa2f6f9a13c08cc05716ea056bddb1521fae5153be06fdfe1e028f81" Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.994805 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"878bb642-989b-4ea2-825e-514263f9545a","Type":"ContainerStarted","Data":"6043287912bacccccabcf12d1ba27e4a2e47d6aa6e68c9430fbd4adf5af2e194"} Jan 21 15:59:09 crc kubenswrapper[4707]: I0121 15:59:09.994870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:10 crc kubenswrapper[4707]: I0121 15:59:10.003976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" event={"ID":"7476b843-007f-4dde-8bc6-c623dbc1c101","Type":"ContainerStarted","Data":"0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca"} Jan 21 15:59:10 crc kubenswrapper[4707]: I0121 15:59:10.004019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" event={"ID":"7476b843-007f-4dde-8bc6-c623dbc1c101","Type":"ContainerStarted","Data":"a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe"} Jan 21 15:59:10 crc kubenswrapper[4707]: I0121 15:59:10.004222 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:10 crc kubenswrapper[4707]: I0121 15:59:10.004334 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:10 crc kubenswrapper[4707]: I0121 15:59:10.018557 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.763323205 podStartE2EDuration="3.018540949s" podCreationTimestamp="2026-01-21 15:59:07 +0000 UTC" firstStartedPulling="2026-01-21 15:59:08.73688065 +0000 UTC m=+3445.918396873" lastFinishedPulling="2026-01-21 15:59:08.992098395 +0000 UTC m=+3446.173614617" observedRunningTime="2026-01-21 15:59:10.01748317 +0000 UTC m=+3447.198999392" watchObservedRunningTime="2026-01-21 15:59:10.018540949 +0000 UTC m=+3447.200057171" Jan 21 15:59:10 crc kubenswrapper[4707]: I0121 15:59:10.039510 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" podStartSLOduration=2.039495854 podStartE2EDuration="2.039495854s" podCreationTimestamp="2026-01-21 15:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:59:10.035750981 +0000 UTC m=+3447.217267204" watchObservedRunningTime="2026-01-21 15:59:10.039495854 +0000 UTC m=+3447.221012075" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.045173 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4341da7-00d4-472a-a479-16602bcb7495" containerID="df912f5a069085faa23dd8c3228fe5debb72d477b7f8f77d20df94f07d0641f9" exitCode=0 Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.045648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerDied","Data":"df912f5a069085faa23dd8c3228fe5debb72d477b7f8f77d20df94f07d0641f9"} Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.294417 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-log-httpd\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-sg-core-conf-yaml\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-combined-ca-bundle\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-run-httpd\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-config-data\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4d9b\" (UniqueName: \"kubernetes.io/projected/d4341da7-00d4-472a-a479-16602bcb7495-kube-api-access-l4d9b\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.338847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-scripts\") pod \"d4341da7-00d4-472a-a479-16602bcb7495\" (UID: \"d4341da7-00d4-472a-a479-16602bcb7495\") " Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.339732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.344906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.348098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4341da7-00d4-472a-a479-16602bcb7495-kube-api-access-l4d9b" (OuterVolumeSpecName: "kube-api-access-l4d9b") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "kube-api-access-l4d9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.349628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-scripts" (OuterVolumeSpecName: "scripts") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.363922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.406245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-config-data" (OuterVolumeSpecName: "config-data") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.408926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4341da7-00d4-472a-a479-16602bcb7495" (UID: "d4341da7-00d4-472a-a479-16602bcb7495"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441117 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441151 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441164 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441174 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4341da7-00d4-472a-a479-16602bcb7495-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441183 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441193 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4d9b\" (UniqueName: \"kubernetes.io/projected/d4341da7-00d4-472a-a479-16602bcb7495-kube-api-access-l4d9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:12 crc kubenswrapper[4707]: I0121 15:59:12.441202 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4341da7-00d4-472a-a479-16602bcb7495-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.058283 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.058302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d4341da7-00d4-472a-a479-16602bcb7495","Type":"ContainerDied","Data":"0e72490b914a9d25f4f0daf8fc52d60495b972e225a456ab0506cf9f4a00f958"} Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.058645 4707 scope.go:117] "RemoveContainer" containerID="847353d388943179ff5421f55370c460ae23d93dc684ed50ecce5e109974e786" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.077589 4707 scope.go:117] "RemoveContainer" containerID="452e01c8458d06be38dd0725caeef55a6882076a46f333419461b139fd570790" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.090149 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.097756 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.104199 4707 scope.go:117] "RemoveContainer" containerID="df912f5a069085faa23dd8c3228fe5debb72d477b7f8f77d20df94f07d0641f9" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106078 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: E0121 15:59:13.106449 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-notification-agent" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-notification-agent" Jan 21 15:59:13 crc kubenswrapper[4707]: E0121 15:59:13.106515 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="sg-core" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106521 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="sg-core" Jan 21 15:59:13 crc kubenswrapper[4707]: E0121 15:59:13.106535 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="proxy-httpd" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106542 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="proxy-httpd" Jan 21 15:59:13 crc kubenswrapper[4707]: E0121 15:59:13.106554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-central-agent" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106560 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-central-agent" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106715 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="proxy-httpd" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-central-agent" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106746 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="sg-core" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.106756 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4341da7-00d4-472a-a479-16602bcb7495" containerName="ceilometer-notification-agent" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.110726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.112743 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.112984 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.113203 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.122601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.138065 4707 scope.go:117] "RemoveContainer" containerID="934f93e9e7e32bb8adeddf2391c02267a4cea98ae6d010ce86176d45f0fbb15b" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.151449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-scripts\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.151517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpkl7\" (UniqueName: \"kubernetes.io/projected/69da223e-1ed5-4176-8dee-8a002c1524f9-kube-api-access-lpkl7\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.151666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.151832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-run-httpd\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.152020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-log-httpd\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.152112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.152219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-config-data\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.152449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.190604 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4341da7-00d4-472a-a479-16602bcb7495" path="/var/lib/kubelet/pods/d4341da7-00d4-472a-a479-16602bcb7495/volumes" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-config-data\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-scripts\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpkl7\" (UniqueName: \"kubernetes.io/projected/69da223e-1ed5-4176-8dee-8a002c1524f9-kube-api-access-lpkl7\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-run-httpd\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.255722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-log-httpd\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.256093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-log-httpd\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.256634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-run-httpd\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.261099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.261226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-scripts\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.261703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-config-data\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.262105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.262675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.275029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpkl7\" (UniqueName: \"kubernetes.io/projected/69da223e-1ed5-4176-8dee-8a002c1524f9-kube-api-access-lpkl7\") pod \"ceilometer-0\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.358891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.359143 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-log" containerID="cri-o://70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c" gracePeriod=30 Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.359297 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-httpd" containerID="cri-o://9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661" gracePeriod=30 Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.427242 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.816953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: W0121 15:59:13.819540 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69da223e_1ed5_4176_8dee_8a002c1524f9.slice/crio-6e05b185f949d184d21e9ee040b2fbe7d3c4c8eba8df2f907b1c1521f3830bd9 WatchSource:0}: Error finding container 6e05b185f949d184d21e9ee040b2fbe7d3c4c8eba8df2f907b1c1521f3830bd9: Status 404 returned error can't find the container with id 6e05b185f949d184d21e9ee040b2fbe7d3c4c8eba8df2f907b1c1521f3830bd9 Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.940655 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.940874 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-log" containerID="cri-o://251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a" gracePeriod=30 Jan 21 15:59:13 crc kubenswrapper[4707]: I0121 15:59:13.941005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-httpd" containerID="cri-o://5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba" gracePeriod=30 Jan 21 15:59:14 crc kubenswrapper[4707]: I0121 15:59:14.067663 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerID="251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a" exitCode=143 Jan 21 15:59:14 crc kubenswrapper[4707]: I0121 15:59:14.067741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8f9744f7-4974-4616-834e-61a327e2ffdb","Type":"ContainerDied","Data":"251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a"} Jan 21 15:59:14 crc kubenswrapper[4707]: I0121 15:59:14.068555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerStarted","Data":"6e05b185f949d184d21e9ee040b2fbe7d3c4c8eba8df2f907b1c1521f3830bd9"} Jan 21 15:59:14 crc kubenswrapper[4707]: I0121 
15:59:14.069743 4707 generic.go:334] "Generic (PLEG): container finished" podID="de71448c-16db-44cc-b37f-f5a01abed953" containerID="70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c" exitCode=143 Jan 21 15:59:14 crc kubenswrapper[4707]: I0121 15:59:14.069791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"de71448c-16db-44cc-b37f-f5a01abed953","Type":"ContainerDied","Data":"70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c"} Jan 21 15:59:14 crc kubenswrapper[4707]: I0121 15:59:14.339731 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.081170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerStarted","Data":"7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83"} Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.302047 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wwprz"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.303390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.315741 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wwprz"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.386615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxvnw\" (UniqueName: \"kubernetes.io/projected/528035e2-52ac-4a65-b069-312a9bfab07e-kube-api-access-wxvnw\") pod \"nova-api-db-create-wwprz\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.387053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/528035e2-52ac-4a65-b069-312a9bfab07e-operator-scripts\") pod \"nova-api-db-create-wwprz\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.392058 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-6zdng"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.393020 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.412612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-6zdng"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.488575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783ac2f5-58ea-4ead-a88e-7c4929cf4661-operator-scripts\") pod \"nova-cell0-db-create-6zdng\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.488702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jszjt\" (UniqueName: \"kubernetes.io/projected/783ac2f5-58ea-4ead-a88e-7c4929cf4661-kube-api-access-jszjt\") pod \"nova-cell0-db-create-6zdng\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.488767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxvnw\" (UniqueName: \"kubernetes.io/projected/528035e2-52ac-4a65-b069-312a9bfab07e-kube-api-access-wxvnw\") pod \"nova-api-db-create-wwprz\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.489040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/528035e2-52ac-4a65-b069-312a9bfab07e-operator-scripts\") pod \"nova-api-db-create-wwprz\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.489727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/528035e2-52ac-4a65-b069-312a9bfab07e-operator-scripts\") pod \"nova-api-db-create-wwprz\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.493059 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-dsspk"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.493973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.502629 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.503531 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.509176 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.512454 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxvnw\" (UniqueName: \"kubernetes.io/projected/528035e2-52ac-4a65-b069-312a9bfab07e-kube-api-access-wxvnw\") pod \"nova-api-db-create-wwprz\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.521143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.528174 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-dsspk"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.591603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/643f3f30-7d67-4b8b-af44-3637ad271586-operator-scripts\") pod \"nova-api-488e-account-create-update-27bp2\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.592386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62dfd2-875a-43ff-b154-e1755965171f-operator-scripts\") pod \"nova-cell1-db-create-dsspk\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.592543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsv5p\" (UniqueName: \"kubernetes.io/projected/643f3f30-7d67-4b8b-af44-3637ad271586-kube-api-access-vsv5p\") pod \"nova-api-488e-account-create-update-27bp2\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.592677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783ac2f5-58ea-4ead-a88e-7c4929cf4661-operator-scripts\") pod \"nova-cell0-db-create-6zdng\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.592928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jszjt\" (UniqueName: \"kubernetes.io/projected/783ac2f5-58ea-4ead-a88e-7c4929cf4661-kube-api-access-jszjt\") pod \"nova-cell0-db-create-6zdng\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.593165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmdhp\" (UniqueName: \"kubernetes.io/projected/ac62dfd2-875a-43ff-b154-e1755965171f-kube-api-access-dmdhp\") pod \"nova-cell1-db-create-dsspk\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " 
pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.593757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783ac2f5-58ea-4ead-a88e-7c4929cf4661-operator-scripts\") pod \"nova-cell0-db-create-6zdng\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.606765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jszjt\" (UniqueName: \"kubernetes.io/projected/783ac2f5-58ea-4ead-a88e-7c4929cf4661-kube-api-access-jszjt\") pod \"nova-cell0-db-create-6zdng\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.628425 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.694331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62dfd2-875a-43ff-b154-e1755965171f-operator-scripts\") pod \"nova-cell1-db-create-dsspk\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.694374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsv5p\" (UniqueName: \"kubernetes.io/projected/643f3f30-7d67-4b8b-af44-3637ad271586-kube-api-access-vsv5p\") pod \"nova-api-488e-account-create-update-27bp2\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.694466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmdhp\" (UniqueName: \"kubernetes.io/projected/ac62dfd2-875a-43ff-b154-e1755965171f-kube-api-access-dmdhp\") pod \"nova-cell1-db-create-dsspk\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.694514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/643f3f30-7d67-4b8b-af44-3637ad271586-operator-scripts\") pod \"nova-api-488e-account-create-update-27bp2\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.695453 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/643f3f30-7d67-4b8b-af44-3637ad271586-operator-scripts\") pod \"nova-api-488e-account-create-update-27bp2\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.703162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62dfd2-875a-43ff-b154-e1755965171f-operator-scripts\") pod \"nova-cell1-db-create-dsspk\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 
15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.708564 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.713334 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.714148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsv5p\" (UniqueName: \"kubernetes.io/projected/643f3f30-7d67-4b8b-af44-3637ad271586-kube-api-access-vsv5p\") pod \"nova-api-488e-account-create-update-27bp2\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.715189 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.719301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.725765 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.732862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmdhp\" (UniqueName: \"kubernetes.io/projected/ac62dfd2-875a-43ff-b154-e1755965171f-kube-api-access-dmdhp\") pod \"nova-cell1-db-create-dsspk\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.795036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k55r\" (UniqueName: \"kubernetes.io/projected/9ff8d805-9b36-4e84-a82b-aca4df5032cd-kube-api-access-2k55r\") pod \"nova-cell0-44f4-account-create-update-54jdg\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.795452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff8d805-9b36-4e84-a82b-aca4df5032cd-operator-scripts\") pod \"nova-cell0-44f4-account-create-update-54jdg\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.810009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.847788 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.904329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff8d805-9b36-4e84-a82b-aca4df5032cd-operator-scripts\") pod \"nova-cell0-44f4-account-create-update-54jdg\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.904375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k55r\" (UniqueName: \"kubernetes.io/projected/9ff8d805-9b36-4e84-a82b-aca4df5032cd-kube-api-access-2k55r\") pod \"nova-cell0-44f4-account-create-update-54jdg\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.904642 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.905305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff8d805-9b36-4e84-a82b-aca4df5032cd-operator-scripts\") pod \"nova-cell0-44f4-account-create-update-54jdg\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.906044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.910361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.914684 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj"] Jan 21 15:59:15 crc kubenswrapper[4707]: I0121 15:59:15.932130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k55r\" (UniqueName: \"kubernetes.io/projected/9ff8d805-9b36-4e84-a82b-aca4df5032cd-kube-api-access-2k55r\") pod \"nova-cell0-44f4-account-create-update-54jdg\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.006604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22499a49-41fa-4c4f-9e22-16f64c8757b4-operator-scripts\") pod \"nova-cell1-24b2-account-create-update-25rcj\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.006953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dksd\" (UniqueName: \"kubernetes.io/projected/22499a49-41fa-4c4f-9e22-16f64c8757b4-kube-api-access-7dksd\") pod \"nova-cell1-24b2-account-create-update-25rcj\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc 
kubenswrapper[4707]: I0121 15:59:16.046164 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.055722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wwprz"] Jan 21 15:59:16 crc kubenswrapper[4707]: W0121 15:59:16.065203 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod528035e2_52ac_4a65_b069_312a9bfab07e.slice/crio-edbdfb840c998fd379767cada97258042bbe0d7adc4efd9fb76a812be5048dff WatchSource:0}: Error finding container edbdfb840c998fd379767cada97258042bbe0d7adc4efd9fb76a812be5048dff: Status 404 returned error can't find the container with id edbdfb840c998fd379767cada97258042bbe0d7adc4efd9fb76a812be5048dff Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.108539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dksd\" (UniqueName: \"kubernetes.io/projected/22499a49-41fa-4c4f-9e22-16f64c8757b4-kube-api-access-7dksd\") pod \"nova-cell1-24b2-account-create-update-25rcj\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.108667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22499a49-41fa-4c4f-9e22-16f64c8757b4-operator-scripts\") pod \"nova-cell1-24b2-account-create-update-25rcj\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.109519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22499a49-41fa-4c4f-9e22-16f64c8757b4-operator-scripts\") pod \"nova-cell1-24b2-account-create-update-25rcj\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.117111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerStarted","Data":"b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b"} Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.117146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerStarted","Data":"442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984"} Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.122210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" event={"ID":"528035e2-52ac-4a65-b069-312a9bfab07e","Type":"ContainerStarted","Data":"edbdfb840c998fd379767cada97258042bbe0d7adc4efd9fb76a812be5048dff"} Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.124032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dksd\" (UniqueName: \"kubernetes.io/projected/22499a49-41fa-4c4f-9e22-16f64c8757b4-kube-api-access-7dksd\") pod \"nova-cell1-24b2-account-create-update-25rcj\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " 
pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.160992 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-6zdng"] Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.229067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.286335 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-dsspk"] Jan 21 15:59:16 crc kubenswrapper[4707]: W0121 15:59:16.292602 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac62dfd2_875a_43ff_b154_e1755965171f.slice/crio-4218fdf00ee104b900e3ad74cc5d7922842dfea0cb3b1f6a4f25adf15c97165f WatchSource:0}: Error finding container 4218fdf00ee104b900e3ad74cc5d7922842dfea0cb3b1f6a4f25adf15c97165f: Status 404 returned error can't find the container with id 4218fdf00ee104b900e3ad74cc5d7922842dfea0cb3b1f6a4f25adf15c97165f Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.374144 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2"] Jan 21 15:59:16 crc kubenswrapper[4707]: W0121 15:59:16.391084 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod643f3f30_7d67_4b8b_af44_3637ad271586.slice/crio-f86b51ab40683636f0c4dd0feee3ecf714e9e68e36941169e8f0b7c636c92918 WatchSource:0}: Error finding container f86b51ab40683636f0c4dd0feee3ecf714e9e68e36941169e8f0b7c636c92918: Status 404 returned error can't find the container with id f86b51ab40683636f0c4dd0feee3ecf714e9e68e36941169e8f0b7c636c92918 Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.542737 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg"] Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.680381 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj"] Jan 21 15:59:16 crc kubenswrapper[4707]: W0121 15:59:16.711644 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22499a49_41fa_4c4f_9e22_16f64c8757b4.slice/crio-c5e58ab714558f3f7cfe71cd1a4681f59c6f5d1a7e4c9940465ed3b7b96b6778 WatchSource:0}: Error finding container c5e58ab714558f3f7cfe71cd1a4681f59c6f5d1a7e4c9940465ed3b7b96b6778: Status 404 returned error can't find the container with id c5e58ab714558f3f7cfe71cd1a4681f59c6f5d1a7e4c9940465ed3b7b96b6778 Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.831913 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v88zm\" (UniqueName: \"kubernetes.io/projected/de71448c-16db-44cc-b37f-f5a01abed953-kube-api-access-v88zm\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-public-tls-certs\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-httpd-run\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-logs\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-combined-ca-bundle\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-config-data\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.926727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-scripts\") pod \"de71448c-16db-44cc-b37f-f5a01abed953\" (UID: \"de71448c-16db-44cc-b37f-f5a01abed953\") " Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.927191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.927271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-logs" (OuterVolumeSpecName: "logs") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.936434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.937027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-scripts" (OuterVolumeSpecName: "scripts") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.937314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de71448c-16db-44cc-b37f-f5a01abed953-kube-api-access-v88zm" (OuterVolumeSpecName: "kube-api-access-v88zm") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "kube-api-access-v88zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.952190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.971959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:16 crc kubenswrapper[4707]: I0121 15:59:16.976581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-config-data" (OuterVolumeSpecName: "config-data") pod "de71448c-16db-44cc-b37f-f5a01abed953" (UID: "de71448c-16db-44cc-b37f-f5a01abed953"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.028996 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029151 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029244 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v88zm\" (UniqueName: \"kubernetes.io/projected/de71448c-16db-44cc-b37f-f5a01abed953-kube-api-access-v88zm\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029336 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029404 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029505 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de71448c-16db-44cc-b37f-f5a01abed953-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029732 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de71448c-16db-44cc-b37f-f5a01abed953-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.029917 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.045374 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.131399 4707 generic.go:334] "Generic (PLEG): container finished" podID="783ac2f5-58ea-4ead-a88e-7c4929cf4661" containerID="ae353753b0fb18d8352ca556e1585eaa516abe93adbada375743082e9628ebc2" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.131496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" event={"ID":"783ac2f5-58ea-4ead-a88e-7c4929cf4661","Type":"ContainerDied","Data":"ae353753b0fb18d8352ca556e1585eaa516abe93adbada375743082e9628ebc2"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.131544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" event={"ID":"783ac2f5-58ea-4ead-a88e-7c4929cf4661","Type":"ContainerStarted","Data":"9f6f466906a25944f23ed9364fadf0262b7be0ede41bbd4174059b744a52982a"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.131692 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc 
kubenswrapper[4707]: I0121 15:59:17.132703 4707 generic.go:334] "Generic (PLEG): container finished" podID="528035e2-52ac-4a65-b069-312a9bfab07e" containerID="9e83185270fc11763a91c77246eff89a685150bc0a98b55aaf5b8eb5e54d524f" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.132751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" event={"ID":"528035e2-52ac-4a65-b069-312a9bfab07e","Type":"ContainerDied","Data":"9e83185270fc11763a91c77246eff89a685150bc0a98b55aaf5b8eb5e54d524f"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.134129 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac62dfd2-875a-43ff-b154-e1755965171f" containerID="e79cc9bb5b22b0eed5431d7ce834ba9dc92b3ac1898938e892029f4cdbc1812f" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.134193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" event={"ID":"ac62dfd2-875a-43ff-b154-e1755965171f","Type":"ContainerDied","Data":"e79cc9bb5b22b0eed5431d7ce834ba9dc92b3ac1898938e892029f4cdbc1812f"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.134219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" event={"ID":"ac62dfd2-875a-43ff-b154-e1755965171f","Type":"ContainerStarted","Data":"4218fdf00ee104b900e3ad74cc5d7922842dfea0cb3b1f6a4f25adf15c97165f"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.135507 4707 generic.go:334] "Generic (PLEG): container finished" podID="22499a49-41fa-4c4f-9e22-16f64c8757b4" containerID="98a9597fa3accf0fc1a3339f115e49717d4b9832f5b88f142f8e788aee42855c" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.135556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" event={"ID":"22499a49-41fa-4c4f-9e22-16f64c8757b4","Type":"ContainerDied","Data":"98a9597fa3accf0fc1a3339f115e49717d4b9832f5b88f142f8e788aee42855c"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.135571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" event={"ID":"22499a49-41fa-4c4f-9e22-16f64c8757b4","Type":"ContainerStarted","Data":"c5e58ab714558f3f7cfe71cd1a4681f59c6f5d1a7e4c9940465ed3b7b96b6778"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.136956 4707 generic.go:334] "Generic (PLEG): container finished" podID="643f3f30-7d67-4b8b-af44-3637ad271586" containerID="8997e6bc538ab68581eb681a7bc2920e440462c29436d549cdcef8caaa1a96dd" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.136995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" event={"ID":"643f3f30-7d67-4b8b-af44-3637ad271586","Type":"ContainerDied","Data":"8997e6bc538ab68581eb681a7bc2920e440462c29436d549cdcef8caaa1a96dd"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.137010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" event={"ID":"643f3f30-7d67-4b8b-af44-3637ad271586","Type":"ContainerStarted","Data":"f86b51ab40683636f0c4dd0feee3ecf714e9e68e36941169e8f0b7c636c92918"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.138798 4707 generic.go:334] "Generic (PLEG): container finished" podID="de71448c-16db-44cc-b37f-f5a01abed953" 
containerID="9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.138839 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.138845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"de71448c-16db-44cc-b37f-f5a01abed953","Type":"ContainerDied","Data":"9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.139066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"de71448c-16db-44cc-b37f-f5a01abed953","Type":"ContainerDied","Data":"d23f748a7978fc926e6cedbf2fa6e74320ff759d9b2cf9c105d7d760950daf4b"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.139092 4707 scope.go:117] "RemoveContainer" containerID="9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.140506 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ff8d805-9b36-4e84-a82b-aca4df5032cd" containerID="76d160cc07c271dde3d11395b543e852feef2c9280f3151293f7b7ecec53f818" exitCode=0 Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.140537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" event={"ID":"9ff8d805-9b36-4e84-a82b-aca4df5032cd","Type":"ContainerDied","Data":"76d160cc07c271dde3d11395b543e852feef2c9280f3151293f7b7ecec53f818"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.140552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" event={"ID":"9ff8d805-9b36-4e84-a82b-aca4df5032cd","Type":"ContainerStarted","Data":"f05966ade42885a4262e18f6ed2d6b96ecf25213b4508fb1f704d3bd0a1c7512"} Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.249829 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.254413 4707 scope.go:117] "RemoveContainer" containerID="70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.266934 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.275593 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:59:17 crc kubenswrapper[4707]: E0121 15:59:17.275966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-log" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.275979 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-log" Jan 21 15:59:17 crc kubenswrapper[4707]: E0121 15:59:17.275993 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-httpd" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.275998 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-httpd" Jan 21 
15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.276190 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-log" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.276207 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de71448c-16db-44cc-b37f-f5a01abed953" containerName="glance-httpd" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.277057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.278568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.279417 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.291897 4707 scope.go:117] "RemoveContainer" containerID="9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661" Jan 21 15:59:17 crc kubenswrapper[4707]: E0121 15:59:17.293476 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661\": container with ID starting with 9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661 not found: ID does not exist" containerID="9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.293576 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661"} err="failed to get container status \"9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661\": rpc error: code = NotFound desc = could not find container \"9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661\": container with ID starting with 9f8da926a7e6cd32f1d82025f7025c80312d5baeba1041c06e1cc61c2b67e661 not found: ID does not exist" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.293647 4707 scope.go:117] "RemoveContainer" containerID="70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c" Jan 21 15:59:17 crc kubenswrapper[4707]: E0121 15:59:17.294391 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c\": container with ID starting with 70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c not found: ID does not exist" containerID="70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.294423 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c"} err="failed to get container status \"70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c\": rpc error: code = NotFound desc = could not find container \"70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c\": container with ID starting with 70e6d5e044201812edf5e81ebc1782ffacdd4f72345ec5d568e29c221c21e84c not found: ID does not exist" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.307491 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7x5q\" (UniqueName: \"kubernetes.io/projected/fc9779e5-4e04-424b-b1e3-388ebf744d02-kube-api-access-c7x5q\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.334728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-logs\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7x5q\" (UniqueName: \"kubernetes.io/projected/fc9779e5-4e04-424b-b1e3-388ebf744d02-kube-api-access-c7x5q\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-logs\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.436768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-logs\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.437081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.437264 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.443532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.444241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.447382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.447962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.466203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7x5q\" (UniqueName: \"kubernetes.io/projected/fc9779e5-4e04-424b-b1e3-388ebf744d02-kube-api-access-c7x5q\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.469587 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.473170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.537644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-internal-tls-certs\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.537858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.537933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-combined-ca-bundle\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.538096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncd6r\" (UniqueName: \"kubernetes.io/projected/8f9744f7-4974-4616-834e-61a327e2ffdb-kube-api-access-ncd6r\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.539917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-config-data\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.539984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-httpd-run\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.540036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-logs\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.540062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-scripts\") pod \"8f9744f7-4974-4616-834e-61a327e2ffdb\" (UID: \"8f9744f7-4974-4616-834e-61a327e2ffdb\") " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.545652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). 
InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.546101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-logs" (OuterVolumeSpecName: "logs") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.546250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-scripts" (OuterVolumeSpecName: "scripts") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.548699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f9744f7-4974-4616-834e-61a327e2ffdb-kube-api-access-ncd6r" (OuterVolumeSpecName: "kube-api-access-ncd6r") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "kube-api-access-ncd6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.548835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.565232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.582004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.585942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-config-data" (OuterVolumeSpecName: "config-data") pod "8f9744f7-4974-4616-834e-61a327e2ffdb" (UID: "8f9744f7-4974-4616-834e-61a327e2ffdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.607748 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642570 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642827 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642839 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9744f7-4974-4616-834e-61a327e2ffdb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642848 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642856 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642888 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642897 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9744f7-4974-4616-834e-61a327e2ffdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.642906 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncd6r\" (UniqueName: \"kubernetes.io/projected/8f9744f7-4974-4616-834e-61a327e2ffdb-kube-api-access-ncd6r\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.658594 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.744894 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:17 crc kubenswrapper[4707]: I0121 15:59:17.999705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 15:59:18 crc kubenswrapper[4707]: W0121 15:59:18.008627 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc9779e5_4e04_424b_b1e3_388ebf744d02.slice/crio-cc1ee5d6c7e5f2b2456e3af05b8d762bb0fe47a4df8214f9837dd81fd6fc0a02 WatchSource:0}: Error finding container cc1ee5d6c7e5f2b2456e3af05b8d762bb0fe47a4df8214f9837dd81fd6fc0a02: Status 404 returned error can't find the container with id cc1ee5d6c7e5f2b2456e3af05b8d762bb0fe47a4df8214f9837dd81fd6fc0a02 Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.183691 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f9744f7-4974-4616-834e-61a327e2ffdb" 
containerID="5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba" exitCode=0 Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.183728 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.183772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8f9744f7-4974-4616-834e-61a327e2ffdb","Type":"ContainerDied","Data":"5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba"} Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.183837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8f9744f7-4974-4616-834e-61a327e2ffdb","Type":"ContainerDied","Data":"8e9ac5471cefc1af516b8404caddd7f2aa8b2f70878ecb558937fab9992165e2"} Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.183858 4707 scope.go:117] "RemoveContainer" containerID="5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.207196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerStarted","Data":"51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a"} Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.207332 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-central-agent" containerID="cri-o://7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83" gracePeriod=30 Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.207507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.207552 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="proxy-httpd" containerID="cri-o://51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a" gracePeriod=30 Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.207590 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="sg-core" containerID="cri-o://b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b" gracePeriod=30 Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.207622 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-notification-agent" containerID="cri-o://442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984" gracePeriod=30 Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.222876 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.236969 4707 scope.go:117] "RemoveContainer" containerID="251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.240085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fc9779e5-4e04-424b-b1e3-388ebf744d02","Type":"ContainerStarted","Data":"cc1ee5d6c7e5f2b2456e3af05b8d762bb0fe47a4df8214f9837dd81fd6fc0a02"} Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.267916 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.322990 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:59:18 crc kubenswrapper[4707]: E0121 15:59:18.323935 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-log" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.323953 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-log" Jan 21 15:59:18 crc kubenswrapper[4707]: E0121 15:59:18.323998 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-httpd" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.324004 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-httpd" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.324455 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-log" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.324491 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" containerName="glance-httpd" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.330111 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.342517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.343013 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.394092 4707 scope.go:117] "RemoveContainer" containerID="5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.394389 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:59:18 crc kubenswrapper[4707]: E0121 15:59:18.403961 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba\": container with ID starting with 5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba not found: ID does not exist" containerID="5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.404065 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba"} err="failed to get container status \"5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba\": rpc error: code = NotFound desc = could not find container \"5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba\": container with ID starting with 5f85c3f8893e9517fb67eae17fdc1328ff96e98ce51ab22cd10df981c21679ba not found: ID does not exist" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.404141 4707 scope.go:117] "RemoveContainer" containerID="251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.407035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9110738029999998 podStartE2EDuration="5.407024729s" podCreationTimestamp="2026-01-21 15:59:13 +0000 UTC" firstStartedPulling="2026-01-21 15:59:13.821481331 +0000 UTC m=+3451.002997553" lastFinishedPulling="2026-01-21 15:59:17.317432257 +0000 UTC m=+3454.498948479" observedRunningTime="2026-01-21 15:59:18.273391894 +0000 UTC m=+3455.454908116" watchObservedRunningTime="2026-01-21 15:59:18.407024729 +0000 UTC m=+3455.588540950" Jan 21 15:59:18 crc kubenswrapper[4707]: E0121 15:59:18.413999 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a\": container with ID starting with 251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a not found: ID does not exist" containerID="251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.414050 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a"} err="failed to get container status \"251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a\": rpc error: code = NotFound desc = could not find container 
\"251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a\": container with ID starting with 251df8b26483cb454432556e8470d4ac66fa3fde7190ccb24f12c34ea9a3228a not found: ID does not exist" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.423543 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.463894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-logs\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8tc\" (UniqueName: \"kubernetes.io/projected/32eaeddc-f8b5-4c3e-a456-f723f0698b30-kube-api-access-6q8tc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.464435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.530398 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.531655 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-logs\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567202 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8tc\" (UniqueName: \"kubernetes.io/projected/32eaeddc-f8b5-4c3e-a456-f723f0698b30-kube-api-access-6q8tc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.567776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.568544 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.571772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-logs\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.575432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.576205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.578872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.592529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8tc\" (UniqueName: \"kubernetes.io/projected/32eaeddc-f8b5-4c3e-a456-f723f0698b30-kube-api-access-6q8tc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.592778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.599567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.693605 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.704949 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.770801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22499a49-41fa-4c4f-9e22-16f64c8757b4-operator-scripts\") pod \"22499a49-41fa-4c4f-9e22-16f64c8757b4\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.770909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dksd\" (UniqueName: \"kubernetes.io/projected/22499a49-41fa-4c4f-9e22-16f64c8757b4-kube-api-access-7dksd\") pod \"22499a49-41fa-4c4f-9e22-16f64c8757b4\" (UID: \"22499a49-41fa-4c4f-9e22-16f64c8757b4\") " Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.773302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22499a49-41fa-4c4f-9e22-16f64c8757b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22499a49-41fa-4c4f-9e22-16f64c8757b4" (UID: "22499a49-41fa-4c4f-9e22-16f64c8757b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.774204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22499a49-41fa-4c4f-9e22-16f64c8757b4-kube-api-access-7dksd" (OuterVolumeSpecName: "kube-api-access-7dksd") pod "22499a49-41fa-4c4f-9e22-16f64c8757b4" (UID: "22499a49-41fa-4c4f-9e22-16f64c8757b4"). InnerVolumeSpecName "kube-api-access-7dksd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.877211 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dksd\" (UniqueName: \"kubernetes.io/projected/22499a49-41fa-4c4f-9e22-16f64c8757b4-kube-api-access-7dksd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.877246 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22499a49-41fa-4c4f-9e22-16f64c8757b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.928162 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.933556 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.940950 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.963456 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:18 crc kubenswrapper[4707]: I0121 15:59:18.979872 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/528035e2-52ac-4a65-b069-312a9bfab07e-operator-scripts\") pod \"528035e2-52ac-4a65-b069-312a9bfab07e\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083310 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmdhp\" (UniqueName: \"kubernetes.io/projected/ac62dfd2-875a-43ff-b154-e1755965171f-kube-api-access-dmdhp\") pod \"ac62dfd2-875a-43ff-b154-e1755965171f\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxvnw\" (UniqueName: \"kubernetes.io/projected/528035e2-52ac-4a65-b069-312a9bfab07e-kube-api-access-wxvnw\") pod \"528035e2-52ac-4a65-b069-312a9bfab07e\" (UID: \"528035e2-52ac-4a65-b069-312a9bfab07e\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k55r\" (UniqueName: \"kubernetes.io/projected/9ff8d805-9b36-4e84-a82b-aca4df5032cd-kube-api-access-2k55r\") pod \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783ac2f5-58ea-4ead-a88e-7c4929cf4661-operator-scripts\") pod \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsv5p\" (UniqueName: \"kubernetes.io/projected/643f3f30-7d67-4b8b-af44-3637ad271586-kube-api-access-vsv5p\") pod \"643f3f30-7d67-4b8b-af44-3637ad271586\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/643f3f30-7d67-4b8b-af44-3637ad271586-operator-scripts\") pod \"643f3f30-7d67-4b8b-af44-3637ad271586\" (UID: \"643f3f30-7d67-4b8b-af44-3637ad271586\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62dfd2-875a-43ff-b154-e1755965171f-operator-scripts\") pod \"ac62dfd2-875a-43ff-b154-e1755965171f\" (UID: \"ac62dfd2-875a-43ff-b154-e1755965171f\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jszjt\" (UniqueName: \"kubernetes.io/projected/783ac2f5-58ea-4ead-a88e-7c4929cf4661-kube-api-access-jszjt\") pod \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\" (UID: \"783ac2f5-58ea-4ead-a88e-7c4929cf4661\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.083636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff8d805-9b36-4e84-a82b-aca4df5032cd-operator-scripts\") pod \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\" (UID: \"9ff8d805-9b36-4e84-a82b-aca4df5032cd\") " Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.084133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783ac2f5-58ea-4ead-a88e-7c4929cf4661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "783ac2f5-58ea-4ead-a88e-7c4929cf4661" (UID: "783ac2f5-58ea-4ead-a88e-7c4929cf4661"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.084509 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783ac2f5-58ea-4ead-a88e-7c4929cf4661-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.084566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff8d805-9b36-4e84-a82b-aca4df5032cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ff8d805-9b36-4e84-a82b-aca4df5032cd" (UID: "9ff8d805-9b36-4e84-a82b-aca4df5032cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.084600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/643f3f30-7d67-4b8b-af44-3637ad271586-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "643f3f30-7d67-4b8b-af44-3637ad271586" (UID: "643f3f30-7d67-4b8b-af44-3637ad271586"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.085253 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac62dfd2-875a-43ff-b154-e1755965171f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac62dfd2-875a-43ff-b154-e1755965171f" (UID: "ac62dfd2-875a-43ff-b154-e1755965171f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.087713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528035e2-52ac-4a65-b069-312a9bfab07e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "528035e2-52ac-4a65-b069-312a9bfab07e" (UID: "528035e2-52ac-4a65-b069-312a9bfab07e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.089312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528035e2-52ac-4a65-b069-312a9bfab07e-kube-api-access-wxvnw" (OuterVolumeSpecName: "kube-api-access-wxvnw") pod "528035e2-52ac-4a65-b069-312a9bfab07e" (UID: "528035e2-52ac-4a65-b069-312a9bfab07e"). InnerVolumeSpecName "kube-api-access-wxvnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.089347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff8d805-9b36-4e84-a82b-aca4df5032cd-kube-api-access-2k55r" (OuterVolumeSpecName: "kube-api-access-2k55r") pod "9ff8d805-9b36-4e84-a82b-aca4df5032cd" (UID: "9ff8d805-9b36-4e84-a82b-aca4df5032cd"). InnerVolumeSpecName "kube-api-access-2k55r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.090354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac62dfd2-875a-43ff-b154-e1755965171f-kube-api-access-dmdhp" (OuterVolumeSpecName: "kube-api-access-dmdhp") pod "ac62dfd2-875a-43ff-b154-e1755965171f" (UID: "ac62dfd2-875a-43ff-b154-e1755965171f"). InnerVolumeSpecName "kube-api-access-dmdhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.092358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783ac2f5-58ea-4ead-a88e-7c4929cf4661-kube-api-access-jszjt" (OuterVolumeSpecName: "kube-api-access-jszjt") pod "783ac2f5-58ea-4ead-a88e-7c4929cf4661" (UID: "783ac2f5-58ea-4ead-a88e-7c4929cf4661"). InnerVolumeSpecName "kube-api-access-jszjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.093304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643f3f30-7d67-4b8b-af44-3637ad271586-kube-api-access-vsv5p" (OuterVolumeSpecName: "kube-api-access-vsv5p") pod "643f3f30-7d67-4b8b-af44-3637ad271586" (UID: "643f3f30-7d67-4b8b-af44-3637ad271586"). InnerVolumeSpecName "kube-api-access-vsv5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.185964 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/528035e2-52ac-4a65-b069-312a9bfab07e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186146 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmdhp\" (UniqueName: \"kubernetes.io/projected/ac62dfd2-875a-43ff-b154-e1755965171f-kube-api-access-dmdhp\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186160 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxvnw\" (UniqueName: \"kubernetes.io/projected/528035e2-52ac-4a65-b069-312a9bfab07e-kube-api-access-wxvnw\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186170 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k55r\" (UniqueName: \"kubernetes.io/projected/9ff8d805-9b36-4e84-a82b-aca4df5032cd-kube-api-access-2k55r\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186178 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsv5p\" (UniqueName: \"kubernetes.io/projected/643f3f30-7d67-4b8b-af44-3637ad271586-kube-api-access-vsv5p\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186188 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/643f3f30-7d67-4b8b-af44-3637ad271586-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186197 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62dfd2-875a-43ff-b154-e1755965171f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186205 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jszjt\" (UniqueName: \"kubernetes.io/projected/783ac2f5-58ea-4ead-a88e-7c4929cf4661-kube-api-access-jszjt\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.186214 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff8d805-9b36-4e84-a82b-aca4df5032cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.200654 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f9744f7-4974-4616-834e-61a327e2ffdb" path="/var/lib/kubelet/pods/8f9744f7-4974-4616-834e-61a327e2ffdb/volumes" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.201430 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de71448c-16db-44cc-b37f-f5a01abed953" path="/var/lib/kubelet/pods/de71448c-16db-44cc-b37f-f5a01abed953/volumes" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.247954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fc9779e5-4e04-424b-b1e3-388ebf744d02","Type":"ContainerStarted","Data":"f13e4d8a65800b110a365933161c21cf2baf677a94c2caac3c3ded57bbda9c10"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.249543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" 
event={"ID":"ac62dfd2-875a-43ff-b154-e1755965171f","Type":"ContainerDied","Data":"4218fdf00ee104b900e3ad74cc5d7922842dfea0cb3b1f6a4f25adf15c97165f"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.249568 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4218fdf00ee104b900e3ad74cc5d7922842dfea0cb3b1f6a4f25adf15c97165f" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.249610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-dsspk" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.253202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" event={"ID":"22499a49-41fa-4c4f-9e22-16f64c8757b4","Type":"ContainerDied","Data":"c5e58ab714558f3f7cfe71cd1a4681f59c6f5d1a7e4c9940465ed3b7b96b6778"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.253244 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e58ab714558f3f7cfe71cd1a4681f59c6f5d1a7e4c9940465ed3b7b96b6778" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.253274 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.257897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" event={"ID":"9ff8d805-9b36-4e84-a82b-aca4df5032cd","Type":"ContainerDied","Data":"f05966ade42885a4262e18f6ed2d6b96ecf25213b4508fb1f704d3bd0a1c7512"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.257932 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f05966ade42885a4262e18f6ed2d6b96ecf25213b4508fb1f704d3bd0a1c7512" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.257930 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.259218 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.259217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-6zdng" event={"ID":"783ac2f5-58ea-4ead-a88e-7c4929cf4661","Type":"ContainerDied","Data":"9f6f466906a25944f23ed9364fadf0262b7be0ede41bbd4174059b744a52982a"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.259323 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6f466906a25944f23ed9364fadf0262b7be0ede41bbd4174059b744a52982a" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.260826 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.260844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-wwprz" event={"ID":"528035e2-52ac-4a65-b069-312a9bfab07e","Type":"ContainerDied","Data":"edbdfb840c998fd379767cada97258042bbe0d7adc4efd9fb76a812be5048dff"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.260901 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edbdfb840c998fd379767cada97258042bbe0d7adc4efd9fb76a812be5048dff" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.262590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" event={"ID":"643f3f30-7d67-4b8b-af44-3637ad271586","Type":"ContainerDied","Data":"f86b51ab40683636f0c4dd0feee3ecf714e9e68e36941169e8f0b7c636c92918"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.262651 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86b51ab40683636f0c4dd0feee3ecf714e9e68e36941169e8f0b7c636c92918" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.262593 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2" Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.265631 4707 generic.go:334] "Generic (PLEG): container finished" podID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerID="51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a" exitCode=0 Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.265656 4707 generic.go:334] "Generic (PLEG): container finished" podID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerID="b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b" exitCode=2 Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.265665 4707 generic.go:334] "Generic (PLEG): container finished" podID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerID="442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984" exitCode=0 Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.265705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerDied","Data":"51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.265733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerDied","Data":"b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.265746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerDied","Data":"442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984"} Jan 21 15:59:19 crc kubenswrapper[4707]: I0121 15:59:19.285166 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.276639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"32eaeddc-f8b5-4c3e-a456-f723f0698b30","Type":"ContainerStarted","Data":"e988b288a8fcc44d803facd8cd151da8bf58f68ac8ebb2aef3d85ab144b16900"} Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.277370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"32eaeddc-f8b5-4c3e-a456-f723f0698b30","Type":"ContainerStarted","Data":"528b18552f9a4743a109d4c0a9bde81e896224749b7ecb137a938d625978f1ba"} Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.277389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"32eaeddc-f8b5-4c3e-a456-f723f0698b30","Type":"ContainerStarted","Data":"79e47180b22fff9ccbd5cf6da7c3d6fe16284b15bb2cd08cc05499fad9cb705b"} Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.279167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fc9779e5-4e04-424b-b1e3-388ebf744d02","Type":"ContainerStarted","Data":"5f46ece4d49ebd6ff61477cd26b6c7e5b21e6dfbdf8a342e363b3fa2825bf3d7"} Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.310032 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.310015674 podStartE2EDuration="2.310015674s" podCreationTimestamp="2026-01-21 15:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:59:20.304063572 +0000 UTC m=+3457.485579795" watchObservedRunningTime="2026-01-21 15:59:20.310015674 +0000 UTC m=+3457.491531897" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.333475 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.333437788 podStartE2EDuration="3.333437788s" podCreationTimestamp="2026-01-21 15:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:59:20.325256453 +0000 UTC m=+3457.506772695" watchObservedRunningTime="2026-01-21 15:59:20.333437788 +0000 UTC m=+3457.514954010" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.807442 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.917694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-sg-core-conf-yaml\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.917753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-ceilometer-tls-certs\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.917842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-run-httpd\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.917892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-log-httpd\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.917985 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-combined-ca-bundle\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.918105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-config-data\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.918154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-scripts\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.918187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpkl7\" (UniqueName: \"kubernetes.io/projected/69da223e-1ed5-4176-8dee-8a002c1524f9-kube-api-access-lpkl7\") pod \"69da223e-1ed5-4176-8dee-8a002c1524f9\" (UID: \"69da223e-1ed5-4176-8dee-8a002c1524f9\") " Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.919073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.919169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.925460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-scripts" (OuterVolumeSpecName: "scripts") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.925755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69da223e-1ed5-4176-8dee-8a002c1524f9-kube-api-access-lpkl7" (OuterVolumeSpecName: "kube-api-access-lpkl7") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "kube-api-access-lpkl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.951945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962106 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77"] Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962568 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-notification-agent" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962608 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-notification-agent" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962617 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643f3f30-7d67-4b8b-af44-3637ad271586" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="643f3f30-7d67-4b8b-af44-3637ad271586" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-central-agent" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-central-agent" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962724 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22499a49-41fa-4c4f-9e22-16f64c8757b4" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962731 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="22499a49-41fa-4c4f-9e22-16f64c8757b4" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962765 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="sg-core" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962770 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="sg-core" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783ac2f5-58ea-4ead-a88e-7c4929cf4661" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962782 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="783ac2f5-58ea-4ead-a88e-7c4929cf4661" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962796 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528035e2-52ac-4a65-b069-312a9bfab07e" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962801 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="528035e2-52ac-4a65-b069-312a9bfab07e" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="proxy-httpd" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962872 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="proxy-httpd" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962883 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac62dfd2-875a-43ff-b154-e1755965171f" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962888 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac62dfd2-875a-43ff-b154-e1755965171f" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: E0121 15:59:20.962901 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff8d805-9b36-4e84-a82b-aca4df5032cd" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.962906 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff8d805-9b36-4e84-a82b-aca4df5032cd" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963132 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="528035e2-52ac-4a65-b069-312a9bfab07e" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963145 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff8d805-9b36-4e84-a82b-aca4df5032cd" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963151 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="proxy-httpd" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963181 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac62dfd2-875a-43ff-b154-e1755965171f" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963196 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="783ac2f5-58ea-4ead-a88e-7c4929cf4661" containerName="mariadb-database-create" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963206 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22499a49-41fa-4c4f-9e22-16f64c8757b4" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963216 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-central-agent" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963233 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="643f3f30-7d67-4b8b-af44-3637ad271586" containerName="mariadb-account-create-update" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963260 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="ceilometer-notification-agent" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963269 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerName="sg-core" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.963916 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.972752 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.973011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-8244p" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.973028 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.979983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:20 crc kubenswrapper[4707]: I0121 15:59:20.992042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77"] Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.020439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.020487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-scripts\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.020547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-config-data\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.020626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gd8l\" (UniqueName: \"kubernetes.io/projected/a3ac6aad-da44-4874-8a19-8098a2b2dccc-kube-api-access-9gd8l\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.020804 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.020986 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69da223e-1ed5-4176-8dee-8a002c1524f9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.021037 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.021052 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpkl7\" (UniqueName: \"kubernetes.io/projected/69da223e-1ed5-4176-8dee-8a002c1524f9-kube-api-access-lpkl7\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.021062 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.021072 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" 
Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.028391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.032041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-config-data" (OuterVolumeSpecName: "config-data") pod "69da223e-1ed5-4176-8dee-8a002c1524f9" (UID: "69da223e-1ed5-4176-8dee-8a002c1524f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.122753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.122799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-scripts\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.122852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-config-data\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.122897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gd8l\" (UniqueName: \"kubernetes.io/projected/a3ac6aad-da44-4874-8a19-8098a2b2dccc-kube-api-access-9gd8l\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.122960 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.122973 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69da223e-1ed5-4176-8dee-8a002c1524f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.126149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-config-data\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.126351 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-scripts\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.128522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.139731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gd8l\" (UniqueName: \"kubernetes.io/projected/a3ac6aad-da44-4874-8a19-8098a2b2dccc-kube-api-access-9gd8l\") pod \"nova-cell0-conductor-db-sync-4vq77\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.284451 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.289711 4707 generic.go:334] "Generic (PLEG): container finished" podID="69da223e-1ed5-4176-8dee-8a002c1524f9" containerID="7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83" exitCode=0 Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.289755 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.289772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerDied","Data":"7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83"} Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.289826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69da223e-1ed5-4176-8dee-8a002c1524f9","Type":"ContainerDied","Data":"6e05b185f949d184d21e9ee040b2fbe7d3c4c8eba8df2f907b1c1521f3830bd9"} Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.289844 4707 scope.go:117] "RemoveContainer" containerID="51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.313347 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.315030 4707 scope.go:117] "RemoveContainer" containerID="b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.319407 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.333938 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.336406 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.338081 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.338744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.338948 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.369203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.387158 4707 scope.go:117] "RemoveContainer" containerID="442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.414945 4707 scope.go:117] "RemoveContainer" containerID="7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-run-httpd\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-config-data\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-log-httpd\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-scripts\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc679\" (UniqueName: \"kubernetes.io/projected/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-kube-api-access-qc679\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430473 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.430513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.439953 4707 scope.go:117] "RemoveContainer" containerID="51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a" Jan 21 15:59:21 crc kubenswrapper[4707]: E0121 15:59:21.445900 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a\": container with ID starting with 51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a not found: ID does not exist" containerID="51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.445936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a"} err="failed to get container status \"51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a\": rpc error: code = NotFound desc = could not find container \"51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a\": container with ID starting with 51db6425b955ed56e77a6cc622140fa92dcc73c510fec82c2ab168182508785a not found: ID does not exist" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.445955 4707 scope.go:117] "RemoveContainer" containerID="b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b" Jan 21 15:59:21 crc kubenswrapper[4707]: E0121 15:59:21.448044 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b\": container with ID starting with b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b not found: ID does not exist" containerID="b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.448076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b"} err="failed to get container status \"b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b\": rpc error: code = NotFound desc = could not find container \"b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b\": container with ID starting with b70fc275d5658aff64f8c0d7eb914c9eff98029d375c163d2dd6f3777468432b not found: ID does not exist" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.448092 4707 scope.go:117] "RemoveContainer" containerID="442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984" Jan 21 15:59:21 crc kubenswrapper[4707]: E0121 15:59:21.450954 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984\": container with ID starting with 442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984 not found: ID does not exist" containerID="442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.450975 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984"} err="failed to get container status \"442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984\": rpc error: code = NotFound desc = could not find container \"442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984\": container with ID starting with 442dc943d57847858c883daaf92106a66b4d188e18501084b73c60249c18a984 not found: ID does not exist" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.450990 4707 scope.go:117] "RemoveContainer" containerID="7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83" Jan 21 15:59:21 crc kubenswrapper[4707]: E0121 15:59:21.457687 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83\": container with ID starting with 7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83 not found: ID does not exist" containerID="7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.457723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83"} err="failed to get container status \"7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83\": rpc error: code = NotFound desc = could not find container \"7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83\": container with ID starting with 7049184d35729e89290ea8c5eab19530b3195d0145573e033e2bcf19fe524a83 not found: ID does not exist" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-run-httpd\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-config-data\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-log-httpd\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-scripts\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc679\" (UniqueName: \"kubernetes.io/projected/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-kube-api-access-qc679\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.532984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-run-httpd\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.533046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-log-httpd\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.537228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.537238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.537481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-config-data\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.537615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.538421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-scripts\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.544865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc679\" (UniqueName: \"kubernetes.io/projected/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-kube-api-access-qc679\") pod \"ceilometer-0\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.681950 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:21 crc kubenswrapper[4707]: I0121 15:59:21.718237 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77"] Jan 21 15:59:21 crc kubenswrapper[4707]: W0121 15:59:21.721500 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3ac6aad_da44_4874_8a19_8098a2b2dccc.slice/crio-639ac4b9f0d7a7c94ffe205b012912c05876283004fc6a02facc03aa170e79d8 WatchSource:0}: Error finding container 639ac4b9f0d7a7c94ffe205b012912c05876283004fc6a02facc03aa170e79d8: Status 404 returned error can't find the container with id 639ac4b9f0d7a7c94ffe205b012912c05876283004fc6a02facc03aa170e79d8 Jan 21 15:59:22 crc kubenswrapper[4707]: I0121 15:59:22.083008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:22 crc kubenswrapper[4707]: I0121 15:59:22.297281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerStarted","Data":"5b3db2f78ce143ddee3d81c0fd3917fd79bcf284e447a1338117ad3b2b26b257"} Jan 21 15:59:22 crc kubenswrapper[4707]: I0121 15:59:22.300200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" event={"ID":"a3ac6aad-da44-4874-8a19-8098a2b2dccc","Type":"ContainerStarted","Data":"50cdcb498e9416bb5590723b27e3bf3050fedf0e1a72b72bd0a12b1cbd5cea1e"} Jan 21 15:59:22 crc kubenswrapper[4707]: I0121 15:59:22.300237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" event={"ID":"a3ac6aad-da44-4874-8a19-8098a2b2dccc","Type":"ContainerStarted","Data":"639ac4b9f0d7a7c94ffe205b012912c05876283004fc6a02facc03aa170e79d8"} Jan 21 15:59:22 crc kubenswrapper[4707]: I0121 15:59:22.313439 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" podStartSLOduration=2.313427793 podStartE2EDuration="2.313427793s" podCreationTimestamp="2026-01-21 15:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:59:22.310361127 +0000 UTC m=+3459.491877349" watchObservedRunningTime="2026-01-21 15:59:22.313427793 +0000 UTC m=+3459.494944015" Jan 21 
15:59:23 crc kubenswrapper[4707]: I0121 15:59:23.149996 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:23 crc kubenswrapper[4707]: I0121 15:59:23.191011 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69da223e-1ed5-4176-8dee-8a002c1524f9" path="/var/lib/kubelet/pods/69da223e-1ed5-4176-8dee-8a002c1524f9/volumes" Jan 21 15:59:23 crc kubenswrapper[4707]: I0121 15:59:23.307871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerStarted","Data":"e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00"} Jan 21 15:59:24 crc kubenswrapper[4707]: I0121 15:59:24.316040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerStarted","Data":"bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb"} Jan 21 15:59:24 crc kubenswrapper[4707]: I0121 15:59:24.316251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerStarted","Data":"1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3"} Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.331304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerStarted","Data":"67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea"} Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.331631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.331452 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-central-agent" containerID="cri-o://e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00" gracePeriod=30 Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.331667 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="proxy-httpd" containerID="cri-o://67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea" gracePeriod=30 Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.331723 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-notification-agent" containerID="cri-o://1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3" gracePeriod=30 Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.331755 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="sg-core" containerID="cri-o://bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb" gracePeriod=30 Jan 21 15:59:26 crc kubenswrapper[4707]: I0121 15:59:26.355577 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.6683384110000001 podStartE2EDuration="5.355560806s" podCreationTimestamp="2026-01-21 
15:59:21 +0000 UTC" firstStartedPulling="2026-01-21 15:59:22.088092892 +0000 UTC m=+3459.269609113" lastFinishedPulling="2026-01-21 15:59:25.775315286 +0000 UTC m=+3462.956831508" observedRunningTime="2026-01-21 15:59:26.347596851 +0000 UTC m=+3463.529113073" watchObservedRunningTime="2026-01-21 15:59:26.355560806 +0000 UTC m=+3463.537077028" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.339536 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3ac6aad-da44-4874-8a19-8098a2b2dccc" containerID="50cdcb498e9416bb5590723b27e3bf3050fedf0e1a72b72bd0a12b1cbd5cea1e" exitCode=0 Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.339592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" event={"ID":"a3ac6aad-da44-4874-8a19-8098a2b2dccc","Type":"ContainerDied","Data":"50cdcb498e9416bb5590723b27e3bf3050fedf0e1a72b72bd0a12b1cbd5cea1e"} Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.342581 4707 generic.go:334] "Generic (PLEG): container finished" podID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerID="67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea" exitCode=0 Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.342605 4707 generic.go:334] "Generic (PLEG): container finished" podID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerID="bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb" exitCode=2 Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.342606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerDied","Data":"67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea"} Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.342632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerDied","Data":"bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb"} Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.342613 4707 generic.go:334] "Generic (PLEG): container finished" podID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerID="1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3" exitCode=0 Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.342649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerDied","Data":"1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3"} Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.607922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.608691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.629320 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.635658 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.713612 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.825879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-config-data\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.825911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-scripts\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.825939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc679\" (UniqueName: \"kubernetes.io/projected/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-kube-api-access-qc679\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.826017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-combined-ca-bundle\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.826648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-sg-core-conf-yaml\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.826676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-ceilometer-tls-certs\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.826737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-run-httpd\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.826760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-log-httpd\") pod \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\" (UID: \"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7\") " Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.827139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.827370 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.827422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.834650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-scripts" (OuterVolumeSpecName: "scripts") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.834886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-kube-api-access-qc679" (OuterVolumeSpecName: "kube-api-access-qc679") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "kube-api-access-qc679". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.846936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.858489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.871554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.889923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-config-data" (OuterVolumeSpecName: "config-data") pod "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" (UID: "8be5d369-b874-4473-9ce7-8a1cc3e5bcb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928876 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928901 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928911 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928918 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928927 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928935 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc679\" (UniqueName: \"kubernetes.io/projected/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-kube-api-access-qc679\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:27 crc kubenswrapper[4707]: I0121 15:59:27.928945 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351386 4707 generic.go:334] "Generic (PLEG): container finished" podID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerID="e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00" exitCode=0 Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerDied","Data":"e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00"} Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351440 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8be5d369-b874-4473-9ce7-8a1cc3e5bcb7","Type":"ContainerDied","Data":"5b3db2f78ce143ddee3d81c0fd3917fd79bcf284e447a1338117ad3b2b26b257"} Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351483 4707 scope.go:117] "RemoveContainer" containerID="67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351903 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.351918 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.372703 4707 scope.go:117] "RemoveContainer" containerID="bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.377667 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.383650 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.396577 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.396955 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-notification-agent" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.396971 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-notification-agent" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.396985 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-central-agent" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.396992 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-central-agent" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.397018 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="sg-core" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.397024 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="sg-core" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.397030 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="proxy-httpd" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.397036 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="proxy-httpd" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.397177 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="proxy-httpd" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.397187 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-central-agent" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.397199 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="ceilometer-notification-agent" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.397209 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" containerName="sg-core" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.403136 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.405757 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.405968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.408443 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.413614 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.429501 4707 scope.go:117] "RemoveContainer" containerID="1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.436421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-config-data\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.436449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fpn\" (UniqueName: \"kubernetes.io/projected/1925fff7-e12e-48ce-814b-ea36ba6965e1-kube-api-access-92fpn\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.436472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-log-httpd\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.436493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-scripts\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.436586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.438985 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.439188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.439230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-run-httpd\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.466995 4707 scope.go:117] "RemoveContainer" containerID="e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.483422 4707 scope.go:117] "RemoveContainer" containerID="67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.483769 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea\": container with ID starting with 67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea not found: ID does not exist" containerID="67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.483828 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea"} err="failed to get container status \"67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea\": rpc error: code = NotFound desc = could not find container \"67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea\": container with ID starting with 67e4c6771a4b6a4000a3b0a39c38bf1a9aa4787852658b8273cdf5db14a6a4ea not found: ID does not exist" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.483848 4707 scope.go:117] "RemoveContainer" containerID="bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.484103 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb\": container with ID starting with bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb not found: ID does not exist" containerID="bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.484135 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb"} err="failed to get container status \"bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb\": rpc error: code = NotFound desc = could not find container 
\"bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb\": container with ID starting with bd5c4a3e229abdc9ed4494fba26d49cef9ccc7bb47f95abf4fe9e1ffbe8566eb not found: ID does not exist" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.484155 4707 scope.go:117] "RemoveContainer" containerID="1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.484511 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3\": container with ID starting with 1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3 not found: ID does not exist" containerID="1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.484579 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3"} err="failed to get container status \"1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3\": rpc error: code = NotFound desc = could not find container \"1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3\": container with ID starting with 1d20f5802b61e8ce03677c242adc71b8104e4552de7bfb4667a169b14132bab3 not found: ID does not exist" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.484596 4707 scope.go:117] "RemoveContainer" containerID="e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00" Jan 21 15:59:28 crc kubenswrapper[4707]: E0121 15:59:28.484862 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00\": container with ID starting with e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00 not found: ID does not exist" containerID="e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.484903 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00"} err="failed to get container status \"e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00\": rpc error: code = NotFound desc = could not find container \"e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00\": container with ID starting with e577d59738da424e8a234ea37cd3d0177733a5b8f3c8f1877e9294f5f6bccb00 not found: ID does not exist" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-run-httpd\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-config-data\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-92fpn\" (UniqueName: \"kubernetes.io/projected/1925fff7-e12e-48ce-814b-ea36ba6965e1-kube-api-access-92fpn\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-log-httpd\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-scripts\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.540650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.541161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-run-httpd\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.541096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-log-httpd\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.544669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.545121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: 
I0121 15:59:28.545317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.546236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-config-data\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.548697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-scripts\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.555409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fpn\" (UniqueName: \"kubernetes.io/projected/1925fff7-e12e-48ce-814b-ea36ba6965e1-kube-api-access-92fpn\") pod \"ceilometer-0\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.606052 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.642691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-config-data\") pod \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.642919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gd8l\" (UniqueName: \"kubernetes.io/projected/a3ac6aad-da44-4874-8a19-8098a2b2dccc-kube-api-access-9gd8l\") pod \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.642973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-combined-ca-bundle\") pod \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.643005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-scripts\") pod \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\" (UID: \"a3ac6aad-da44-4874-8a19-8098a2b2dccc\") " Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.646222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ac6aad-da44-4874-8a19-8098a2b2dccc-kube-api-access-9gd8l" (OuterVolumeSpecName: "kube-api-access-9gd8l") pod "a3ac6aad-da44-4874-8a19-8098a2b2dccc" (UID: "a3ac6aad-da44-4874-8a19-8098a2b2dccc"). InnerVolumeSpecName "kube-api-access-9gd8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.646259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-scripts" (OuterVolumeSpecName: "scripts") pod "a3ac6aad-da44-4874-8a19-8098a2b2dccc" (UID: "a3ac6aad-da44-4874-8a19-8098a2b2dccc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.661349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3ac6aad-da44-4874-8a19-8098a2b2dccc" (UID: "a3ac6aad-da44-4874-8a19-8098a2b2dccc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.662140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-config-data" (OuterVolumeSpecName: "config-data") pod "a3ac6aad-da44-4874-8a19-8098a2b2dccc" (UID: "a3ac6aad-da44-4874-8a19-8098a2b2dccc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.705994 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.706027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.720351 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.736544 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.737547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.744911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gd8l\" (UniqueName: \"kubernetes.io/projected/a3ac6aad-da44-4874-8a19-8098a2b2dccc-kube-api-access-9gd8l\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.744931 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.744940 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:28 crc kubenswrapper[4707]: I0121 15:59:28.744948 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ac6aad-da44-4874-8a19-8098a2b2dccc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.114794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.199720 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be5d369-b874-4473-9ce7-8a1cc3e5bcb7" path="/var/lib/kubelet/pods/8be5d369-b874-4473-9ce7-8a1cc3e5bcb7/volumes" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.359304 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.359300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77" event={"ID":"a3ac6aad-da44-4874-8a19-8098a2b2dccc","Type":"ContainerDied","Data":"639ac4b9f0d7a7c94ffe205b012912c05876283004fc6a02facc03aa170e79d8"} Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.359440 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639ac4b9f0d7a7c94ffe205b012912c05876283004fc6a02facc03aa170e79d8" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.361660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerStarted","Data":"48dc0ce1fa6f16b06c94f49a6c344c15ecca936f06cb293f4f9f2d87bb5db812"} Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.362442 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.362468 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.419201 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:59:29 crc kubenswrapper[4707]: E0121 15:59:29.419568 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ac6aad-da44-4874-8a19-8098a2b2dccc" containerName="nova-cell0-conductor-db-sync" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.419586 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ac6aad-da44-4874-8a19-8098a2b2dccc" containerName="nova-cell0-conductor-db-sync" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.419751 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ac6aad-da44-4874-8a19-8098a2b2dccc" containerName="nova-cell0-conductor-db-sync" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.420281 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.421747 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-8244p" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.422742 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.425603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.455043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.455106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2hg\" (UniqueName: \"kubernetes.io/projected/f329b20a-0259-4bbe-b48d-6f716689ec29-kube-api-access-sv2hg\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.455166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.556580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.556703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.556745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2hg\" (UniqueName: \"kubernetes.io/projected/f329b20a-0259-4bbe-b48d-6f716689ec29-kube-api-access-sv2hg\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.561603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.562830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.569623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2hg\" (UniqueName: \"kubernetes.io/projected/f329b20a-0259-4bbe-b48d-6f716689ec29-kube-api-access-sv2hg\") pod \"nova-cell0-conductor-0\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.743445 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.967313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:29 crc kubenswrapper[4707]: I0121 15:59:29.976841 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.151436 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.374222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerStarted","Data":"32b5c0143a1e67051d5e43fdb17d98a94c97afd93825354e053560a85ad156cc"} Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.374416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerStarted","Data":"6285d1310ac7d804e4686f50a16a7ad8749d9f999e79e50a4f9496b160cfd4cf"} Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.376786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f329b20a-0259-4bbe-b48d-6f716689ec29","Type":"ContainerStarted","Data":"930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc"} Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.376858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f329b20a-0259-4bbe-b48d-6f716689ec29","Type":"ContainerStarted","Data":"426da9629d6642f8d388c12192aaacd8840cb0c61f5a41da80101443e7592ac6"} Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.388134 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.388113905 podStartE2EDuration="1.388113905s" podCreationTimestamp="2026-01-21 15:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:59:30.387649672 +0000 UTC m=+3467.569165893" watchObservedRunningTime="2026-01-21 15:59:30.388113905 +0000 UTC m=+3467.569630126" Jan 21 15:59:30 crc kubenswrapper[4707]: I0121 15:59:30.986317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:31 crc kubenswrapper[4707]: I0121 15:59:31.101983 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 15:59:31 crc kubenswrapper[4707]: I0121 15:59:31.331664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 15:59:31 crc kubenswrapper[4707]: I0121 15:59:31.389328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerStarted","Data":"51cf5dc8a8e871c734ab5e17fbba7a3d81d15beef0fddedf1f9408ff06a74018"} Jan 21 15:59:31 crc kubenswrapper[4707]: I0121 15:59:31.390007 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 15:59:31 crc kubenswrapper[4707]: I0121 15:59:31.539625 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:32 crc kubenswrapper[4707]: I0121 15:59:32.396841 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" containerID="cri-o://930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" gracePeriod=30 Jan 21 15:59:33 crc kubenswrapper[4707]: I0121 15:59:33.404767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerStarted","Data":"3e50eda8710a63233f2f55973ef42f7216dfb177d49e57ea190dd0c2a0fde94f"} Jan 21 15:59:33 crc kubenswrapper[4707]: I0121 15:59:33.405055 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-central-agent" containerID="cri-o://6285d1310ac7d804e4686f50a16a7ad8749d9f999e79e50a4f9496b160cfd4cf" gracePeriod=30 Jan 21 15:59:33 crc kubenswrapper[4707]: I0121 15:59:33.405194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="proxy-httpd" containerID="cri-o://3e50eda8710a63233f2f55973ef42f7216dfb177d49e57ea190dd0c2a0fde94f" gracePeriod=30 Jan 21 15:59:33 crc kubenswrapper[4707]: I0121 15:59:33.405235 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="sg-core" containerID="cri-o://51cf5dc8a8e871c734ab5e17fbba7a3d81d15beef0fddedf1f9408ff06a74018" gracePeriod=30 Jan 21 15:59:33 crc kubenswrapper[4707]: I0121 15:59:33.405257 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:33 crc kubenswrapper[4707]: I0121 15:59:33.405265 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-notification-agent" containerID="cri-o://32b5c0143a1e67051d5e43fdb17d98a94c97afd93825354e053560a85ad156cc" gracePeriod=30 Jan 21 15:59:34 crc kubenswrapper[4707]: I0121 15:59:34.413531 4707 generic.go:334] "Generic (PLEG): container finished" podID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerID="3e50eda8710a63233f2f55973ef42f7216dfb177d49e57ea190dd0c2a0fde94f" exitCode=0 Jan 21 15:59:34 crc kubenswrapper[4707]: I0121 15:59:34.413725 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerID="51cf5dc8a8e871c734ab5e17fbba7a3d81d15beef0fddedf1f9408ff06a74018" exitCode=2 Jan 21 15:59:34 crc kubenswrapper[4707]: I0121 15:59:34.413733 4707 generic.go:334] "Generic (PLEG): container finished" podID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerID="32b5c0143a1e67051d5e43fdb17d98a94c97afd93825354e053560a85ad156cc" exitCode=0 Jan 21 15:59:34 crc kubenswrapper[4707]: I0121 15:59:34.413600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerDied","Data":"3e50eda8710a63233f2f55973ef42f7216dfb177d49e57ea190dd0c2a0fde94f"} Jan 21 15:59:34 crc kubenswrapper[4707]: I0121 15:59:34.413765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerDied","Data":"51cf5dc8a8e871c734ab5e17fbba7a3d81d15beef0fddedf1f9408ff06a74018"} Jan 21 15:59:34 crc kubenswrapper[4707]: I0121 15:59:34.413779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerDied","Data":"32b5c0143a1e67051d5e43fdb17d98a94c97afd93825354e053560a85ad156cc"} Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.439301 4707 generic.go:334] "Generic (PLEG): container finished" podID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerID="6285d1310ac7d804e4686f50a16a7ad8749d9f999e79e50a4f9496b160cfd4cf" exitCode=0 Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.439383 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerDied","Data":"6285d1310ac7d804e4686f50a16a7ad8749d9f999e79e50a4f9496b160cfd4cf"} Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.439638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1925fff7-e12e-48ce-814b-ea36ba6965e1","Type":"ContainerDied","Data":"48dc0ce1fa6f16b06c94f49a6c344c15ecca936f06cb293f4f9f2d87bb5db812"} Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.439652 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48dc0ce1fa6f16b06c94f49a6c344c15ecca936f06cb293f4f9f2d87bb5db812" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.460452 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-config-data\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-scripts\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-combined-ca-bundle\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-sg-core-conf-yaml\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-log-httpd\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92fpn\" (UniqueName: \"kubernetes.io/projected/1925fff7-e12e-48ce-814b-ea36ba6965e1-kube-api-access-92fpn\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-ceilometer-tls-certs\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-run-httpd\") pod \"1925fff7-e12e-48ce-814b-ea36ba6965e1\" (UID: \"1925fff7-e12e-48ce-814b-ea36ba6965e1\") " Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.593868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.594133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.594313 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.594339 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1925fff7-e12e-48ce-814b-ea36ba6965e1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.597461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1925fff7-e12e-48ce-814b-ea36ba6965e1-kube-api-access-92fpn" (OuterVolumeSpecName: "kube-api-access-92fpn") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "kube-api-access-92fpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.597856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-scripts" (OuterVolumeSpecName: "scripts") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.614091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.628092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.639533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.654573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-config-data" (OuterVolumeSpecName: "config-data") pod "1925fff7-e12e-48ce-814b-ea36ba6965e1" (UID: "1925fff7-e12e-48ce-814b-ea36ba6965e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.695764 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.695787 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.695799 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92fpn\" (UniqueName: \"kubernetes.io/projected/1925fff7-e12e-48ce-814b-ea36ba6965e1-kube-api-access-92fpn\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.695826 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.695835 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:37 crc kubenswrapper[4707]: I0121 15:59:37.695843 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1925fff7-e12e-48ce-814b-ea36ba6965e1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.445141 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.468875 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.474657 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.483735 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:38 crc kubenswrapper[4707]: E0121 15:59:38.484059 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="proxy-httpd" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484075 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="proxy-httpd" Jan 21 15:59:38 crc kubenswrapper[4707]: E0121 15:59:38.484089 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-notification-agent" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484095 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-notification-agent" Jan 21 15:59:38 crc kubenswrapper[4707]: E0121 15:59:38.484109 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-central-agent" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-central-agent" Jan 21 15:59:38 crc kubenswrapper[4707]: E0121 15:59:38.484130 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="sg-core" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484134 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="sg-core" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484279 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="sg-core" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484303 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-central-agent" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="ceilometer-notification-agent" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.484325 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" containerName="proxy-httpd" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.485685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.488386 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.488837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.488922 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.493514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-config-data\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507554 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-run-httpd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-log-httpd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54npd\" (UniqueName: \"kubernetes.io/projected/75da6817-ed62-456c-a572-375325ca0bd5-kube-api-access-54npd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-scripts\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.507994 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-config-data\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609407 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-run-httpd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-log-httpd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54npd\" (UniqueName: \"kubernetes.io/projected/75da6817-ed62-456c-a572-375325ca0bd5-kube-api-access-54npd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-scripts\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.609878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-run-httpd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.610279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-log-httpd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.613080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.613253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-config-data\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.613333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.613998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-scripts\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.622154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.622403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54npd\" (UniqueName: \"kubernetes.io/projected/75da6817-ed62-456c-a572-375325ca0bd5-kube-api-access-54npd\") pod \"ceilometer-0\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:38 crc kubenswrapper[4707]: I0121 15:59:38.805352 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:39 crc kubenswrapper[4707]: I0121 15:59:39.172167 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 15:59:39 crc kubenswrapper[4707]: W0121 15:59:39.173218 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75da6817_ed62_456c_a572_375325ca0bd5.slice/crio-62f887e71283175cd8ecec6f3f36d061a827546fd63b126348c6094479486992 WatchSource:0}: Error finding container 62f887e71283175cd8ecec6f3f36d061a827546fd63b126348c6094479486992: Status 404 returned error can't find the container with id 62f887e71283175cd8ecec6f3f36d061a827546fd63b126348c6094479486992 Jan 21 15:59:39 crc kubenswrapper[4707]: I0121 15:59:39.195496 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1925fff7-e12e-48ce-814b-ea36ba6965e1" path="/var/lib/kubelet/pods/1925fff7-e12e-48ce-814b-ea36ba6965e1/volumes" Jan 21 15:59:39 crc kubenswrapper[4707]: I0121 15:59:39.452757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerStarted","Data":"62f887e71283175cd8ecec6f3f36d061a827546fd63b126348c6094479486992"} Jan 21 15:59:39 crc kubenswrapper[4707]: E0121 15:59:39.746102 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:39 crc kubenswrapper[4707]: E0121 15:59:39.747977 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:39 crc kubenswrapper[4707]: E0121 15:59:39.749073 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:39 crc kubenswrapper[4707]: E0121 15:59:39.749131 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 15:59:40 crc kubenswrapper[4707]: I0121 15:59:40.464883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerStarted","Data":"2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8"} Jan 21 15:59:41 crc kubenswrapper[4707]: I0121 15:59:41.474979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerStarted","Data":"a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9"} Jan 21 15:59:41 crc 
kubenswrapper[4707]: I0121 15:59:41.475305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerStarted","Data":"7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d"} Jan 21 15:59:43 crc kubenswrapper[4707]: I0121 15:59:43.490607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerStarted","Data":"e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd"} Jan 21 15:59:43 crc kubenswrapper[4707]: I0121 15:59:43.491000 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 15:59:44 crc kubenswrapper[4707]: E0121 15:59:44.745373 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:44 crc kubenswrapper[4707]: E0121 15:59:44.746621 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:44 crc kubenswrapper[4707]: E0121 15:59:44.747549 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:44 crc kubenswrapper[4707]: E0121 15:59:44.747598 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 15:59:49 crc kubenswrapper[4707]: E0121 15:59:49.745210 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:49 crc kubenswrapper[4707]: E0121 15:59:49.746858 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:49 crc kubenswrapper[4707]: E0121 15:59:49.748033 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:49 crc 
kubenswrapper[4707]: E0121 15:59:49.748089 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 15:59:54 crc kubenswrapper[4707]: E0121 15:59:54.745176 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:54 crc kubenswrapper[4707]: E0121 15:59:54.746518 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:54 crc kubenswrapper[4707]: E0121 15:59:54.747417 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:54 crc kubenswrapper[4707]: E0121 15:59:54.747445 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 15:59:59 crc kubenswrapper[4707]: E0121 15:59:59.745334 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:59 crc kubenswrapper[4707]: E0121 15:59:59.746763 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:59 crc kubenswrapper[4707]: E0121 15:59:59.747632 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 15:59:59 crc kubenswrapper[4707]: E0121 15:59:59.747676 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 
16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.131130 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=18.769503769 podStartE2EDuration="22.13111493s" podCreationTimestamp="2026-01-21 15:59:38 +0000 UTC" firstStartedPulling="2026-01-21 15:59:39.178335747 +0000 UTC m=+3476.359851969" lastFinishedPulling="2026-01-21 15:59:42.539946909 +0000 UTC m=+3479.721463130" observedRunningTime="2026-01-21 15:59:43.520626329 +0000 UTC m=+3480.702142551" watchObservedRunningTime="2026-01-21 16:00:00.13111493 +0000 UTC m=+3497.312631152" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.132275 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl"] Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.133452 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.135516 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.135652 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.141469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl"] Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.221004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-secret-volume\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.221291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-config-volume\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.221871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5v7b\" (UniqueName: \"kubernetes.io/projected/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-kube-api-access-q5v7b\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.322618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5v7b\" (UniqueName: \"kubernetes.io/projected/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-kube-api-access-q5v7b\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.322768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-secret-volume\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.322909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-config-volume\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.323622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-config-volume\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.328243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-secret-volume\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.335225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5v7b\" (UniqueName: \"kubernetes.io/projected/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-kube-api-access-q5v7b\") pod \"collect-profiles-29483520-r2xdl\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.449036 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:00 crc kubenswrapper[4707]: I0121 16:00:00.840320 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl"] Jan 21 16:00:01 crc kubenswrapper[4707]: I0121 16:00:01.614202 4707 generic.go:334] "Generic (PLEG): container finished" podID="9de2e0bd-a918-46bc-84b1-6a87cb36d98e" containerID="4e609a0862aef0158bf6a9e405656f087f50b6325376a2bd11fbe2989a0e0674" exitCode=0 Jan 21 16:00:01 crc kubenswrapper[4707]: I0121 16:00:01.614273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" event={"ID":"9de2e0bd-a918-46bc-84b1-6a87cb36d98e","Type":"ContainerDied","Data":"4e609a0862aef0158bf6a9e405656f087f50b6325376a2bd11fbe2989a0e0674"} Jan 21 16:00:01 crc kubenswrapper[4707]: I0121 16:00:01.614413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" event={"ID":"9de2e0bd-a918-46bc-84b1-6a87cb36d98e","Type":"ContainerStarted","Data":"3400832cb7dce3534ede3e1205c244bbd337daaad6cdd89467872e3dc399efd9"} Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.622727 4707 generic.go:334] "Generic (PLEG): container finished" podID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" exitCode=137 Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.622819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f329b20a-0259-4bbe-b48d-6f716689ec29","Type":"ContainerDied","Data":"930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc"} Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.721042 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.769058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-combined-ca-bundle\") pod \"f329b20a-0259-4bbe-b48d-6f716689ec29\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.769378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv2hg\" (UniqueName: \"kubernetes.io/projected/f329b20a-0259-4bbe-b48d-6f716689ec29-kube-api-access-sv2hg\") pod \"f329b20a-0259-4bbe-b48d-6f716689ec29\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.769432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-config-data\") pod \"f329b20a-0259-4bbe-b48d-6f716689ec29\" (UID: \"f329b20a-0259-4bbe-b48d-6f716689ec29\") " Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.779673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f329b20a-0259-4bbe-b48d-6f716689ec29-kube-api-access-sv2hg" (OuterVolumeSpecName: "kube-api-access-sv2hg") pod "f329b20a-0259-4bbe-b48d-6f716689ec29" (UID: "f329b20a-0259-4bbe-b48d-6f716689ec29"). InnerVolumeSpecName "kube-api-access-sv2hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.791230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f329b20a-0259-4bbe-b48d-6f716689ec29" (UID: "f329b20a-0259-4bbe-b48d-6f716689ec29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.791791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-config-data" (OuterVolumeSpecName: "config-data") pod "f329b20a-0259-4bbe-b48d-6f716689ec29" (UID: "f329b20a-0259-4bbe-b48d-6f716689ec29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.840520 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.870895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-config-volume\") pod \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5v7b\" (UniqueName: \"kubernetes.io/projected/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-kube-api-access-q5v7b\") pod \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-secret-volume\") pod \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\" (UID: \"9de2e0bd-a918-46bc-84b1-6a87cb36d98e\") " Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-config-volume" (OuterVolumeSpecName: "config-volume") pod "9de2e0bd-a918-46bc-84b1-6a87cb36d98e" (UID: "9de2e0bd-a918-46bc-84b1-6a87cb36d98e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871695 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871719 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv2hg\" (UniqueName: \"kubernetes.io/projected/f329b20a-0259-4bbe-b48d-6f716689ec29-kube-api-access-sv2hg\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871730 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.871739 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f329b20a-0259-4bbe-b48d-6f716689ec29-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.873585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-kube-api-access-q5v7b" (OuterVolumeSpecName: "kube-api-access-q5v7b") pod "9de2e0bd-a918-46bc-84b1-6a87cb36d98e" (UID: "9de2e0bd-a918-46bc-84b1-6a87cb36d98e"). InnerVolumeSpecName "kube-api-access-q5v7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.873905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9de2e0bd-a918-46bc-84b1-6a87cb36d98e" (UID: "9de2e0bd-a918-46bc-84b1-6a87cb36d98e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.974136 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5v7b\" (UniqueName: \"kubernetes.io/projected/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-kube-api-access-q5v7b\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:02 crc kubenswrapper[4707]: I0121 16:00:02.974175 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de2e0bd-a918-46bc-84b1-6a87cb36d98e-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.631210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" event={"ID":"9de2e0bd-a918-46bc-84b1-6a87cb36d98e","Type":"ContainerDied","Data":"3400832cb7dce3534ede3e1205c244bbd337daaad6cdd89467872e3dc399efd9"} Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.631268 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3400832cb7dce3534ede3e1205c244bbd337daaad6cdd89467872e3dc399efd9" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.631223 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.632683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f329b20a-0259-4bbe-b48d-6f716689ec29","Type":"ContainerDied","Data":"426da9629d6642f8d388c12192aaacd8840cb0c61f5a41da80101443e7592ac6"} Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.632726 4707 scope.go:117] "RemoveContainer" containerID="930720f3706137c3956411a6f032f789b257522b2a9ffc133ea332e3f79d78bc" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.632848 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.649994 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.656518 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.667353 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:00:03 crc kubenswrapper[4707]: E0121 16:00:03.667683 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.667701 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 16:00:03 crc kubenswrapper[4707]: E0121 16:00:03.667741 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de2e0bd-a918-46bc-84b1-6a87cb36d98e" containerName="collect-profiles" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.667748 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de2e0bd-a918-46bc-84b1-6a87cb36d98e" containerName="collect-profiles" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.667894 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de2e0bd-a918-46bc-84b1-6a87cb36d98e" containerName="collect-profiles" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.667909 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" containerName="nova-cell0-conductor-conductor" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.669402 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.671539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-8244p" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.671663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.681866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.688046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.688134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlsz8\" (UniqueName: \"kubernetes.io/projected/e94827e5-23b4-4693-a27e-b37a6f495523-kube-api-access-mlsz8\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.688165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.790157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.790375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlsz8\" (UniqueName: \"kubernetes.io/projected/e94827e5-23b4-4693-a27e-b37a6f495523-kube-api-access-mlsz8\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.790468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.793637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.793756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.802452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlsz8\" (UniqueName: \"kubernetes.io/projected/e94827e5-23b4-4693-a27e-b37a6f495523-kube-api-access-mlsz8\") pod \"nova-cell0-conductor-0\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.887008 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c"] Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.892233 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-8984c"] Jan 21 16:00:03 crc kubenswrapper[4707]: I0121 16:00:03.981197 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:04 crc kubenswrapper[4707]: I0121 16:00:04.350657 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:00:04 crc kubenswrapper[4707]: W0121 16:00:04.353140 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode94827e5_23b4_4693_a27e_b37a6f495523.slice/crio-cda4bd24da400b8c27af123feaae0312e1e7f278bcf1668a4c3f20bf91ed7393 WatchSource:0}: Error finding container cda4bd24da400b8c27af123feaae0312e1e7f278bcf1668a4c3f20bf91ed7393: Status 404 returned error can't find the container with id cda4bd24da400b8c27af123feaae0312e1e7f278bcf1668a4c3f20bf91ed7393 Jan 21 16:00:04 crc kubenswrapper[4707]: I0121 16:00:04.640849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e94827e5-23b4-4693-a27e-b37a6f495523","Type":"ContainerStarted","Data":"9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941"} Jan 21 16:00:04 crc kubenswrapper[4707]: I0121 16:00:04.641022 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:04 crc kubenswrapper[4707]: I0121 16:00:04.641034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e94827e5-23b4-4693-a27e-b37a6f495523","Type":"ContainerStarted","Data":"cda4bd24da400b8c27af123feaae0312e1e7f278bcf1668a4c3f20bf91ed7393"} Jan 21 16:00:04 crc kubenswrapper[4707]: I0121 16:00:04.657418 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.657405298 podStartE2EDuration="1.657405298s" podCreationTimestamp="2026-01-21 16:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:04.655210631 +0000 UTC m=+3501.836726853" watchObservedRunningTime="2026-01-21 16:00:04.657405298 +0000 UTC m=+3501.838921520" Jan 21 16:00:05 crc kubenswrapper[4707]: I0121 16:00:05.193525 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23aa5a49-72b2-4855-8655-448300642a47" 
path="/var/lib/kubelet/pods/23aa5a49-72b2-4855-8655-448300642a47/volumes" Jan 21 16:00:05 crc kubenswrapper[4707]: I0121 16:00:05.194643 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f329b20a-0259-4bbe-b48d-6f716689ec29" path="/var/lib/kubelet/pods/f329b20a-0259-4bbe-b48d-6f716689ec29/volumes" Jan 21 16:00:08 crc kubenswrapper[4707]: I0121 16:00:08.810917 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:00:09 crc kubenswrapper[4707]: I0121 16:00:09.807068 4707 scope.go:117] "RemoveContainer" containerID="d18981c375733008aca93a5d12ff6972ca95d89488d268551481a903f9faee35" Jan 21 16:00:09 crc kubenswrapper[4707]: I0121 16:00:09.837960 4707 scope.go:117] "RemoveContainer" containerID="abe6c330b2f7b9eeb298263fb49016da6c385ea40b62ceaa7db2f7a086c14819" Jan 21 16:00:09 crc kubenswrapper[4707]: I0121 16:00:09.855750 4707 scope.go:117] "RemoveContainer" containerID="31247faff77ca4ed1bf2d135533279d04b627fd3bed08f97d6daa1ac0c92750a" Jan 21 16:00:09 crc kubenswrapper[4707]: I0121 16:00:09.875046 4707 scope.go:117] "RemoveContainer" containerID="fc7d797bb4cca4bd6f7ee94c58585edda25fefd4c7c624324ba4fb6dae947345" Jan 21 16:00:09 crc kubenswrapper[4707]: I0121 16:00:09.903285 4707 scope.go:117] "RemoveContainer" containerID="505527e41043faf26c8fb4d1c3789cdbd1f623b88d8c64575cc6591ba70dc015" Jan 21 16:00:09 crc kubenswrapper[4707]: I0121 16:00:09.917176 4707 scope.go:117] "RemoveContainer" containerID="184182939c5463a5f2caf7ea1723c17fbff7ff92c20f068adf919f644df65ae3" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.003060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.374518 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.375565 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.377948 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.385865 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.394703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.442347 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.444568 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.453512 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.455933 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.509832 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.532061 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.536825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-config-data\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-config-data\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-scripts\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slr45\" (UniqueName: \"kubernetes.io/projected/8551d447-25a9-4955-90f9-b9154feb66b5-kube-api-access-slr45\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.554549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qx5km\" (UniqueName: \"kubernetes.io/projected/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-kube-api-access-qx5km\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.594876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.608417 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.609842 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.612970 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.622561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.644575 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.645715 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.651033 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.653740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-config-data\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665744 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-config-data\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-config-data\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t545l\" (UniqueName: \"kubernetes.io/projected/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-kube-api-access-t545l\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665826 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-logs\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-scripts\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjgq\" (UniqueName: \"kubernetes.io/projected/1372af71-cf78-43ee-9b7d-42022fd409f9-kube-api-access-6xjgq\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slr45\" (UniqueName: \"kubernetes.io/projected/8551d447-25a9-4955-90f9-b9154feb66b5-kube-api-access-slr45\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx5km\" (UniqueName: \"kubernetes.io/projected/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-kube-api-access-qx5km\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.665986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-config-data\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.666012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.666041 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372af71-cf78-43ee-9b7d-42022fd409f9-logs\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.666060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.666154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.666264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.672719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-config-data\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.672904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-scripts\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.673335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.679102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.679292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-config-data\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.697541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx5km\" (UniqueName: \"kubernetes.io/projected/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-kube-api-access-qx5km\") pod \"nova-cell0-cell-mapping-bncrz\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.699230 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.700091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slr45\" (UniqueName: \"kubernetes.io/projected/8551d447-25a9-4955-90f9-b9154feb66b5-kube-api-access-slr45\") pod \"nova-scheduler-0\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.764309 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-config-data\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372af71-cf78-43ee-9b7d-42022fd409f9-logs\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-config-data\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t545l\" (UniqueName: \"kubernetes.io/projected/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-kube-api-access-t545l\") 
pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-logs\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x78wd\" (UniqueName: \"kubernetes.io/projected/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-kube-api-access-x78wd\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.784665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjgq\" (UniqueName: \"kubernetes.io/projected/1372af71-cf78-43ee-9b7d-42022fd409f9-kube-api-access-6xjgq\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.785430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-logs\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.785569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372af71-cf78-43ee-9b7d-42022fd409f9-logs\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.788543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.789844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.793264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-config-data\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.793978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-config-data\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.803676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t545l\" (UniqueName: \"kubernetes.io/projected/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-kube-api-access-t545l\") pod \"nova-metadata-0\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.804507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjgq\" (UniqueName: \"kubernetes.io/projected/1372af71-cf78-43ee-9b7d-42022fd409f9-kube-api-access-6xjgq\") pod \"nova-api-0\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.869155 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.886255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.886354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.886429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x78wd\" (UniqueName: \"kubernetes.io/projected/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-kube-api-access-x78wd\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.893950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.894104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.907735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x78wd\" (UniqueName: \"kubernetes.io/projected/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-kube-api-access-x78wd\") pod \"nova-cell1-novncproxy-0\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.928789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.965710 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:14 crc kubenswrapper[4707]: I0121 16:00:14.979011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz"] Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.305095 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:15 crc kubenswrapper[4707]: W0121 16:00:15.306655 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8551d447_25a9_4955_90f9_b9154feb66b5.slice/crio-a96d86ca974f07873e3fa737fdb7b0f82af047ab21b8668e3e6ee4094a3d8357 WatchSource:0}: Error finding container a96d86ca974f07873e3fa737fdb7b0f82af047ab21b8668e3e6ee4094a3d8357: Status 404 returned error can't find the container with id a96d86ca974f07873e3fa737fdb7b0f82af047ab21b8668e3e6ee4094a3d8357 Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.397633 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.431790 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5"] Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.432922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.434595 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.434599 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.460375 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5"] Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.477803 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.539733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:15 crc kubenswrapper[4707]: W0121 16:00:15.542545 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88243dd5_4d19_4fa8_9c21_b2c1ea192e12.slice/crio-83695bbdca4f6670c2cc0ac0e980c76d9a59d89f1a25601b213053cdeb6bdd99 WatchSource:0}: Error finding container 83695bbdca4f6670c2cc0ac0e980c76d9a59d89f1a25601b213053cdeb6bdd99: Status 404 returned error can't find the container with id 83695bbdca4f6670c2cc0ac0e980c76d9a59d89f1a25601b213053cdeb6bdd99 Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.606890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-config-data\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.607203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbm46\" (UniqueName: 
\"kubernetes.io/projected/9340e961-22d7-4e45-b231-1fe7f197a9e9-kube-api-access-pbm46\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.607249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-scripts\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.607305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.712641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-config-data\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.712824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbm46\" (UniqueName: \"kubernetes.io/projected/9340e961-22d7-4e45-b231-1fe7f197a9e9-kube-api-access-pbm46\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.712870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-scripts\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.712910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.716610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.718162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-scripts\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 
crc kubenswrapper[4707]: I0121 16:00:15.718733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-config-data\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.735880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbm46\" (UniqueName: \"kubernetes.io/projected/9340e961-22d7-4e45-b231-1fe7f197a9e9-kube-api-access-pbm46\") pod \"nova-cell1-conductor-db-sync-x9hv5\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.741027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1372af71-cf78-43ee-9b7d-42022fd409f9","Type":"ContainerStarted","Data":"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.741063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1372af71-cf78-43ee-9b7d-42022fd409f9","Type":"ContainerStarted","Data":"298d32a63b57452411b67c032d74dec530a73a3acf8d231c2dadb86a1286ca89"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.745256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8551d447-25a9-4955-90f9-b9154feb66b5","Type":"ContainerStarted","Data":"80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.745308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8551d447-25a9-4955-90f9-b9154feb66b5","Type":"ContainerStarted","Data":"a96d86ca974f07873e3fa737fdb7b0f82af047ab21b8668e3e6ee4094a3d8357"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.749028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697","Type":"ContainerStarted","Data":"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.749053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697","Type":"ContainerStarted","Data":"faae4faccd7f7fbe54815491311a6b6c70cf44e10e5f9b64af10b0f76682b163"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.750243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"88243dd5-4d19-4fa8-9c21-b2c1ea192e12","Type":"ContainerStarted","Data":"c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.750266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"88243dd5-4d19-4fa8-9c21-b2c1ea192e12","Type":"ContainerStarted","Data":"83695bbdca4f6670c2cc0ac0e980c76d9a59d89f1a25601b213053cdeb6bdd99"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.751680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" 
event={"ID":"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a","Type":"ContainerStarted","Data":"242d6d6cc98648532f46e922558d3ad8d98cde0c4a39d7498647f465fc542cfe"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.751704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" event={"ID":"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a","Type":"ContainerStarted","Data":"2095bbd3d0514835190adf94c97261030138fc9a7566a42c7aee60cd2bac0858"} Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.755576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.768143 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.7681211540000001 podStartE2EDuration="1.768121154s" podCreationTimestamp="2026-01-21 16:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:15.764866774 +0000 UTC m=+3512.946382997" watchObservedRunningTime="2026-01-21 16:00:15.768121154 +0000 UTC m=+3512.949637376" Jan 21 16:00:15 crc kubenswrapper[4707]: I0121 16:00:15.800610 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" podStartSLOduration=1.8005951580000001 podStartE2EDuration="1.800595158s" podCreationTimestamp="2026-01-21 16:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:15.792961134 +0000 UTC m=+3512.974477355" watchObservedRunningTime="2026-01-21 16:00:15.800595158 +0000 UTC m=+3512.982111381" Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.200405 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.200389748 podStartE2EDuration="2.200389748s" podCreationTimestamp="2026-01-21 16:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:15.827116881 +0000 UTC m=+3513.008633102" watchObservedRunningTime="2026-01-21 16:00:16.200389748 +0000 UTC m=+3513.381905969" Jan 21 16:00:16 crc kubenswrapper[4707]: W0121 16:00:16.204350 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9340e961_22d7_4e45_b231_1fe7f197a9e9.slice/crio-8ab0f42253140438b0105ace8e525f7e1f4178dbaf1b9d3da04a5037d03a3bea WatchSource:0}: Error finding container 8ab0f42253140438b0105ace8e525f7e1f4178dbaf1b9d3da04a5037d03a3bea: Status 404 returned error can't find the container with id 8ab0f42253140438b0105ace8e525f7e1f4178dbaf1b9d3da04a5037d03a3bea Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.205494 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5"] Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.761257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" event={"ID":"9340e961-22d7-4e45-b231-1fe7f197a9e9","Type":"ContainerStarted","Data":"b70e55f78933c8327e6fe960fb5ad4107cb5d18bbc98a566057ce52134a76c2f"} Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.761382 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" event={"ID":"9340e961-22d7-4e45-b231-1fe7f197a9e9","Type":"ContainerStarted","Data":"8ab0f42253140438b0105ace8e525f7e1f4178dbaf1b9d3da04a5037d03a3bea"} Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.763736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697","Type":"ContainerStarted","Data":"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96"} Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.767538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1372af71-cf78-43ee-9b7d-42022fd409f9","Type":"ContainerStarted","Data":"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325"} Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.787835 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" podStartSLOduration=1.787817913 podStartE2EDuration="1.787817913s" podCreationTimestamp="2026-01-21 16:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:16.780251856 +0000 UTC m=+3513.961768078" watchObservedRunningTime="2026-01-21 16:00:16.787817913 +0000 UTC m=+3513.969334135" Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.830487 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.830470488 podStartE2EDuration="2.830470488s" podCreationTimestamp="2026-01-21 16:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:16.827455208 +0000 UTC m=+3514.008971429" watchObservedRunningTime="2026-01-21 16:00:16.830470488 +0000 UTC m=+3514.011986710" Jan 21 16:00:16 crc kubenswrapper[4707]: I0121 16:00:16.831986 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.831979726 podStartE2EDuration="2.831979726s" podCreationTimestamp="2026-01-21 16:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:16.81021529 +0000 UTC m=+3513.991731512" watchObservedRunningTime="2026-01-21 16:00:16.831979726 +0000 UTC m=+3514.013495948" Jan 21 16:00:17 crc kubenswrapper[4707]: I0121 16:00:17.738997 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:17 crc kubenswrapper[4707]: I0121 16:00:17.745042 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:17 crc kubenswrapper[4707]: I0121 16:00:17.774982 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="88243dd5-4d19-4fa8-9c21-b2c1ea192e12" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8" gracePeriod=30 Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.369492 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.458179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-config-data\") pod \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.458327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x78wd\" (UniqueName: \"kubernetes.io/projected/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-kube-api-access-x78wd\") pod \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.458346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-combined-ca-bundle\") pod \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\" (UID: \"88243dd5-4d19-4fa8-9c21-b2c1ea192e12\") " Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.473065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-kube-api-access-x78wd" (OuterVolumeSpecName: "kube-api-access-x78wd") pod "88243dd5-4d19-4fa8-9c21-b2c1ea192e12" (UID: "88243dd5-4d19-4fa8-9c21-b2c1ea192e12"). InnerVolumeSpecName "kube-api-access-x78wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.480358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-config-data" (OuterVolumeSpecName: "config-data") pod "88243dd5-4d19-4fa8-9c21-b2c1ea192e12" (UID: "88243dd5-4d19-4fa8-9c21-b2c1ea192e12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.481357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88243dd5-4d19-4fa8-9c21-b2c1ea192e12" (UID: "88243dd5-4d19-4fa8-9c21-b2c1ea192e12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.559697 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x78wd\" (UniqueName: \"kubernetes.io/projected/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-kube-api-access-x78wd\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.559726 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.559737 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88243dd5-4d19-4fa8-9c21-b2c1ea192e12-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784596 4707 generic.go:334] "Generic (PLEG): container finished" podID="88243dd5-4d19-4fa8-9c21-b2c1ea192e12" containerID="c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8" exitCode=0 Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784700 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"88243dd5-4d19-4fa8-9c21-b2c1ea192e12","Type":"ContainerDied","Data":"c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8"} Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784755 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-log" containerID="cri-o://db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc" gracePeriod=30 Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"88243dd5-4d19-4fa8-9c21-b2c1ea192e12","Type":"ContainerDied","Data":"83695bbdca4f6670c2cc0ac0e980c76d9a59d89f1a25601b213053cdeb6bdd99"} Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784783 4707 scope.go:117] "RemoveContainer" containerID="c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.784944 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-metadata" containerID="cri-o://37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96" gracePeriod=30 Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.808342 4707 scope.go:117] "RemoveContainer" containerID="c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8" Jan 21 16:00:18 crc kubenswrapper[4707]: E0121 16:00:18.808678 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8\": container with ID starting with c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8 not found: ID does not exist" containerID="c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8" Jan 21 16:00:18 crc kubenswrapper[4707]: 
I0121 16:00:18.808719 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8"} err="failed to get container status \"c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8\": rpc error: code = NotFound desc = could not find container \"c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8\": container with ID starting with c892f36dc8311209bcc5fa315a961ad9fc9e489e69635f56336157e024abd8c8 not found: ID does not exist" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.816530 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.830931 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.836334 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:18 crc kubenswrapper[4707]: E0121 16:00:18.836696 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88243dd5-4d19-4fa8-9c21-b2c1ea192e12" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.836715 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88243dd5-4d19-4fa8-9c21-b2c1ea192e12" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.836922 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88243dd5-4d19-4fa8-9c21-b2c1ea192e12" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.837498 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.839282 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.839673 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.839840 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.844581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.979564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf57v\" (UniqueName: \"kubernetes.io/projected/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-kube-api-access-kf57v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.979697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.979782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.979924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:18 crc kubenswrapper[4707]: I0121 16:00:18.979992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.082377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf57v\" (UniqueName: \"kubernetes.io/projected/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-kube-api-access-kf57v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.082577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.082624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.082695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.082745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.086923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.088430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.090006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.095747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.095966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf57v\" (UniqueName: \"kubernetes.io/projected/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-kube-api-access-kf57v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.159442 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.192169 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88243dd5-4d19-4fa8-9c21-b2c1ea192e12" path="/var/lib/kubelet/pods/88243dd5-4d19-4fa8-9c21-b2c1ea192e12/volumes" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.260279 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.387552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-config-data\") pod \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.387603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-logs\") pod \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.387673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t545l\" (UniqueName: \"kubernetes.io/projected/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-kube-api-access-t545l\") pod \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.387803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-combined-ca-bundle\") pod \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\" (UID: \"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697\") " Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.388139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-logs" (OuterVolumeSpecName: "logs") pod "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" (UID: "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.388599 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.393937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-kube-api-access-t545l" (OuterVolumeSpecName: "kube-api-access-t545l") pod "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" (UID: "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697"). InnerVolumeSpecName "kube-api-access-t545l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.407677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-config-data" (OuterVolumeSpecName: "config-data") pod "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" (UID: "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.420679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" (UID: "96fc30a7-c2fa-4d7f-bb28-73d9af9f6697"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.490089 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.490117 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.490127 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t545l\" (UniqueName: \"kubernetes.io/projected/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697-kube-api-access-t545l\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.545390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:00:19 crc kubenswrapper[4707]: W0121 16:00:19.547189 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb83aac_11ee_4ce1_8d4f_f634bc2fb8f8.slice/crio-02fef2ffbd4f577563d6f63f4339e36926f110f902e9236bf6f7744c3fd53913 WatchSource:0}: Error finding container 02fef2ffbd4f577563d6f63f4339e36926f110f902e9236bf6f7744c3fd53913: Status 404 returned error can't find the container with id 02fef2ffbd4f577563d6f63f4339e36926f110f902e9236bf6f7744c3fd53913 Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.764742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.792918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8","Type":"ContainerStarted","Data":"c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420"} Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.792952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8","Type":"ContainerStarted","Data":"02fef2ffbd4f577563d6f63f4339e36926f110f902e9236bf6f7744c3fd53913"} Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.794625 4707 generic.go:334] "Generic (PLEG): container finished" podID="9340e961-22d7-4e45-b231-1fe7f197a9e9" containerID="b70e55f78933c8327e6fe960fb5ad4107cb5d18bbc98a566057ce52134a76c2f" exitCode=0 Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.794956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" event={"ID":"9340e961-22d7-4e45-b231-1fe7f197a9e9","Type":"ContainerDied","Data":"b70e55f78933c8327e6fe960fb5ad4107cb5d18bbc98a566057ce52134a76c2f"} Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796525 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerID="37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96" exitCode=0 Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796546 4707 generic.go:334] "Generic (PLEG): container finished" podID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerID="db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc" exitCode=143 Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697","Type":"ContainerDied","Data":"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96"} Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697","Type":"ContainerDied","Data":"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc"} Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"96fc30a7-c2fa-4d7f-bb28-73d9af9f6697","Type":"ContainerDied","Data":"faae4faccd7f7fbe54815491311a6b6c70cf44e10e5f9b64af10b0f76682b163"} Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796619 4707 scope.go:117] "RemoveContainer" containerID="37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.796703 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.807416 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.807407574 podStartE2EDuration="1.807407574s" podCreationTimestamp="2026-01-21 16:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:19.803106205 +0000 UTC m=+3516.984622428" watchObservedRunningTime="2026-01-21 16:00:19.807407574 +0000 UTC m=+3516.988923796" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.839184 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.841497 4707 scope.go:117] "RemoveContainer" containerID="db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.849691 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.858134 4707 scope.go:117] "RemoveContainer" containerID="37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96" Jan 21 16:00:19 crc kubenswrapper[4707]: E0121 16:00:19.858434 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96\": container with ID starting with 37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96 not found: ID does not exist" containerID="37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.858482 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96"} err="failed to get container status \"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96\": rpc error: code = NotFound desc = could not find container \"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96\": container with ID starting with 37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96 not found: ID does not exist" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.858506 4707 scope.go:117] "RemoveContainer" containerID="db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc" Jan 21 16:00:19 crc kubenswrapper[4707]: E0121 16:00:19.858798 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc\": container with ID starting with db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc not found: ID does not exist" containerID="db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.858845 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc"} err="failed to get container status \"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc\": rpc error: code = NotFound desc = could not find container \"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc\": container with ID starting with db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc not found: ID does not exist" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.858861 4707 scope.go:117] "RemoveContainer" containerID="37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.859120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96"} err="failed to get container status \"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96\": rpc error: code = NotFound desc = could not find container \"37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96\": container with ID starting with 37af581da370125c626c855b87fff12b5dbab256432f483686abd5f8ea335e96 not found: ID does not exist" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.859142 4707 scope.go:117] "RemoveContainer" containerID="db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.859354 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc"} err="failed to get container status \"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc\": rpc error: code = NotFound desc = could not find container \"db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc\": container with ID starting with db768ea42d51bd8069a7468d8efdc0974bc860255150bbe93ecaa1e2f162c7fc not found: ID does not exist" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.867589 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:19 crc kubenswrapper[4707]: E0121 16:00:19.867957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-log" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.867975 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-log" Jan 21 16:00:19 crc kubenswrapper[4707]: E0121 16:00:19.867988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-metadata" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.867994 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-metadata" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.868146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-log" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.868167 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" containerName="nova-metadata-metadata" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.869020 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.871224 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.872563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.880134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.999389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnnzf\" (UniqueName: \"kubernetes.io/projected/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-kube-api-access-jnnzf\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.999588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.999715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-config-data\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.999834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:19 crc kubenswrapper[4707]: I0121 16:00:19.999860 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-logs\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.101309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnnzf\" (UniqueName: \"kubernetes.io/projected/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-kube-api-access-jnnzf\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.101399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.101480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-config-data\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.101662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.102063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-logs\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.102147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-logs\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.105449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.106373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-config-data\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.108454 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.114330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnnzf\" (UniqueName: \"kubernetes.io/projected/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-kube-api-access-jnnzf\") pod \"nova-metadata-0\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.180719 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:20 crc kubenswrapper[4707]: W0121 16:00:20.592453 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod712d2b76_e504_4ef1_8d3a_7426d60e1ed6.slice/crio-4ea5538070ece208c3b94b316e6a791b1ddf0b5e6b0bd79b1f77dc4bf2380ca2 WatchSource:0}: Error finding container 4ea5538070ece208c3b94b316e6a791b1ddf0b5e6b0bd79b1f77dc4bf2380ca2: Status 404 returned error can't find the container with id 4ea5538070ece208c3b94b316e6a791b1ddf0b5e6b0bd79b1f77dc4bf2380ca2 Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.592845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.811900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"712d2b76-e504-4ef1-8d3a-7426d60e1ed6","Type":"ContainerStarted","Data":"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320"} Jan 21 16:00:20 crc kubenswrapper[4707]: I0121 16:00:20.811929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"712d2b76-e504-4ef1-8d3a-7426d60e1ed6","Type":"ContainerStarted","Data":"4ea5538070ece208c3b94b316e6a791b1ddf0b5e6b0bd79b1f77dc4bf2380ca2"} Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.067866 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.192016 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fc30a7-c2fa-4d7f-bb28-73d9af9f6697" path="/var/lib/kubelet/pods/96fc30a7-c2fa-4d7f-bb28-73d9af9f6697/volumes" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.231111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-scripts\") pod \"9340e961-22d7-4e45-b231-1fe7f197a9e9\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.231190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-combined-ca-bundle\") pod \"9340e961-22d7-4e45-b231-1fe7f197a9e9\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.231215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-config-data\") pod \"9340e961-22d7-4e45-b231-1fe7f197a9e9\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.232198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbm46\" (UniqueName: \"kubernetes.io/projected/9340e961-22d7-4e45-b231-1fe7f197a9e9-kube-api-access-pbm46\") pod \"9340e961-22d7-4e45-b231-1fe7f197a9e9\" (UID: \"9340e961-22d7-4e45-b231-1fe7f197a9e9\") " Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.235676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9340e961-22d7-4e45-b231-1fe7f197a9e9-kube-api-access-pbm46" (OuterVolumeSpecName: "kube-api-access-pbm46") pod "9340e961-22d7-4e45-b231-1fe7f197a9e9" (UID: "9340e961-22d7-4e45-b231-1fe7f197a9e9"). InnerVolumeSpecName "kube-api-access-pbm46". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.237411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-scripts" (OuterVolumeSpecName: "scripts") pod "9340e961-22d7-4e45-b231-1fe7f197a9e9" (UID: "9340e961-22d7-4e45-b231-1fe7f197a9e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.252674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9340e961-22d7-4e45-b231-1fe7f197a9e9" (UID: "9340e961-22d7-4e45-b231-1fe7f197a9e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.254530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-config-data" (OuterVolumeSpecName: "config-data") pod "9340e961-22d7-4e45-b231-1fe7f197a9e9" (UID: "9340e961-22d7-4e45-b231-1fe7f197a9e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.334437 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.334466 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.334476 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbm46\" (UniqueName: \"kubernetes.io/projected/9340e961-22d7-4e45-b231-1fe7f197a9e9-kube-api-access-pbm46\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.334486 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9340e961-22d7-4e45-b231-1fe7f197a9e9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.819755 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.819760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5" event={"ID":"9340e961-22d7-4e45-b231-1fe7f197a9e9","Type":"ContainerDied","Data":"8ab0f42253140438b0105ace8e525f7e1f4178dbaf1b9d3da04a5037d03a3bea"} Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.819870 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ab0f42253140438b0105ace8e525f7e1f4178dbaf1b9d3da04a5037d03a3bea" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.821424 4707 generic.go:334] "Generic (PLEG): container finished" podID="a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" containerID="242d6d6cc98648532f46e922558d3ad8d98cde0c4a39d7498647f465fc542cfe" exitCode=0 Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.821487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" event={"ID":"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a","Type":"ContainerDied","Data":"242d6d6cc98648532f46e922558d3ad8d98cde0c4a39d7498647f465fc542cfe"} Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.823102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"712d2b76-e504-4ef1-8d3a-7426d60e1ed6","Type":"ContainerStarted","Data":"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41"} Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.884473 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.884444498 podStartE2EDuration="2.884444498s" podCreationTimestamp="2026-01-21 16:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:21.845514965 +0000 UTC m=+3519.027031187" watchObservedRunningTime="2026-01-21 16:00:21.884444498 +0000 UTC m=+3519.065960720" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.904874 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:00:21 crc kubenswrapper[4707]: E0121 
16:00:21.905265 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9340e961-22d7-4e45-b231-1fe7f197a9e9" containerName="nova-cell1-conductor-db-sync" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.905282 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9340e961-22d7-4e45-b231-1fe7f197a9e9" containerName="nova-cell1-conductor-db-sync" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.905517 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9340e961-22d7-4e45-b231-1fe7f197a9e9" containerName="nova-cell1-conductor-db-sync" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.906081 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.907777 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.912240 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.943213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.943250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn2gr\" (UniqueName: \"kubernetes.io/projected/8f8b72f5-0735-4187-aaec-0fb50df56674-kube-api-access-hn2gr\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:21 crc kubenswrapper[4707]: I0121 16:00:21.943306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.044351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.044535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.044559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn2gr\" (UniqueName: \"kubernetes.io/projected/8f8b72f5-0735-4187-aaec-0fb50df56674-kube-api-access-hn2gr\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" 
Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.048002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.048171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.057643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn2gr\" (UniqueName: \"kubernetes.io/projected/8f8b72f5-0735-4187-aaec-0fb50df56674-kube-api-access-hn2gr\") pod \"nova-cell1-conductor-0\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.220982 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.581164 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:00:22 crc kubenswrapper[4707]: W0121 16:00:22.581422 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f8b72f5_0735_4187_aaec_0fb50df56674.slice/crio-d4ed10b0e41f00fea85b47debfb90270a496b9f43b126cf47e367e657be44316 WatchSource:0}: Error finding container d4ed10b0e41f00fea85b47debfb90270a496b9f43b126cf47e367e657be44316: Status 404 returned error can't find the container with id d4ed10b0e41f00fea85b47debfb90270a496b9f43b126cf47e367e657be44316 Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.830098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"8f8b72f5-0735-4187-aaec-0fb50df56674","Type":"ContainerStarted","Data":"d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f"} Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.830332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"8f8b72f5-0735-4187-aaec-0fb50df56674","Type":"ContainerStarted","Data":"d4ed10b0e41f00fea85b47debfb90270a496b9f43b126cf47e367e657be44316"} Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.830380 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:22 crc kubenswrapper[4707]: I0121 16:00:22.847146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.847133028 podStartE2EDuration="1.847133028s" podCreationTimestamp="2026-01-21 16:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:22.842562784 +0000 UTC m=+3520.024079036" watchObservedRunningTime="2026-01-21 16:00:22.847133028 +0000 UTC m=+3520.028649250" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.046363 4707 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.059009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-config-data\") pod \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.059130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx5km\" (UniqueName: \"kubernetes.io/projected/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-kube-api-access-qx5km\") pod \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.059172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-scripts\") pod \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.059243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-combined-ca-bundle\") pod \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\" (UID: \"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a\") " Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.063077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-scripts" (OuterVolumeSpecName: "scripts") pod "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" (UID: "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.063119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-kube-api-access-qx5km" (OuterVolumeSpecName: "kube-api-access-qx5km") pod "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" (UID: "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a"). InnerVolumeSpecName "kube-api-access-qx5km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.080499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" (UID: "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.081574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-config-data" (OuterVolumeSpecName: "config-data") pod "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" (UID: "a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.160515 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx5km\" (UniqueName: \"kubernetes.io/projected/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-kube-api-access-qx5km\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.160538 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.160547 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.160556 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.839057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" event={"ID":"a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a","Type":"ContainerDied","Data":"2095bbd3d0514835190adf94c97261030138fc9a7566a42c7aee60cd2bac0858"} Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.839235 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2095bbd3d0514835190adf94c97261030138fc9a7566a42c7aee60cd2bac0858" Jan 21 16:00:23 crc kubenswrapper[4707]: I0121 16:00:23.839071 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.000003 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.000444 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-log" containerID="cri-o://6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061" gracePeriod=30 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.000847 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-api" containerID="cri-o://9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325" gracePeriod=30 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.013382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.013586 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="8551d447-25a9-4955-90f9-b9154feb66b5" containerName="nova-scheduler-scheduler" containerID="cri-o://80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932" gracePeriod=30 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.022362 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.022557 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-log" containerID="cri-o://aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320" gracePeriod=30 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.022907 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-metadata" containerID="cri-o://edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41" gracePeriod=30 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.161907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.514838 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.521150 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.690690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-config-data\") pod \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.690748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-nova-metadata-tls-certs\") pod \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.690834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-combined-ca-bundle\") pod \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.690865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnnzf\" (UniqueName: \"kubernetes.io/projected/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-kube-api-access-jnnzf\") pod \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.690911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-logs\") pod \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\" (UID: \"712d2b76-e504-4ef1-8d3a-7426d60e1ed6\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-config-data\") pod \"1372af71-cf78-43ee-9b7d-42022fd409f9\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-combined-ca-bundle\") pod \"1372af71-cf78-43ee-9b7d-42022fd409f9\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-logs" (OuterVolumeSpecName: "logs") pod "712d2b76-e504-4ef1-8d3a-7426d60e1ed6" (UID: "712d2b76-e504-4ef1-8d3a-7426d60e1ed6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xjgq\" (UniqueName: \"kubernetes.io/projected/1372af71-cf78-43ee-9b7d-42022fd409f9-kube-api-access-6xjgq\") pod \"1372af71-cf78-43ee-9b7d-42022fd409f9\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372af71-cf78-43ee-9b7d-42022fd409f9-logs\") pod \"1372af71-cf78-43ee-9b7d-42022fd409f9\" (UID: \"1372af71-cf78-43ee-9b7d-42022fd409f9\") " Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1372af71-cf78-43ee-9b7d-42022fd409f9-logs" (OuterVolumeSpecName: "logs") pod "1372af71-cf78-43ee-9b7d-42022fd409f9" (UID: "1372af71-cf78-43ee-9b7d-42022fd409f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691764 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.691783 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372af71-cf78-43ee-9b7d-42022fd409f9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.696696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-kube-api-access-jnnzf" (OuterVolumeSpecName: "kube-api-access-jnnzf") pod "712d2b76-e504-4ef1-8d3a-7426d60e1ed6" (UID: "712d2b76-e504-4ef1-8d3a-7426d60e1ed6"). InnerVolumeSpecName "kube-api-access-jnnzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.697799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1372af71-cf78-43ee-9b7d-42022fd409f9-kube-api-access-6xjgq" (OuterVolumeSpecName: "kube-api-access-6xjgq") pod "1372af71-cf78-43ee-9b7d-42022fd409f9" (UID: "1372af71-cf78-43ee-9b7d-42022fd409f9"). InnerVolumeSpecName "kube-api-access-6xjgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.712977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "712d2b76-e504-4ef1-8d3a-7426d60e1ed6" (UID: "712d2b76-e504-4ef1-8d3a-7426d60e1ed6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.722394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-config-data" (OuterVolumeSpecName: "config-data") pod "1372af71-cf78-43ee-9b7d-42022fd409f9" (UID: "1372af71-cf78-43ee-9b7d-42022fd409f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.726958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1372af71-cf78-43ee-9b7d-42022fd409f9" (UID: "1372af71-cf78-43ee-9b7d-42022fd409f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.729390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-config-data" (OuterVolumeSpecName: "config-data") pod "712d2b76-e504-4ef1-8d3a-7426d60e1ed6" (UID: "712d2b76-e504-4ef1-8d3a-7426d60e1ed6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.739057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "712d2b76-e504-4ef1-8d3a-7426d60e1ed6" (UID: "712d2b76-e504-4ef1-8d3a-7426d60e1ed6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.792946 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xjgq\" (UniqueName: \"kubernetes.io/projected/1372af71-cf78-43ee-9b7d-42022fd409f9-kube-api-access-6xjgq\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.792973 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.792983 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.792994 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.793002 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnnzf\" (UniqueName: \"kubernetes.io/projected/712d2b76-e504-4ef1-8d3a-7426d60e1ed6-kube-api-access-jnnzf\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.793011 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.793018 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372af71-cf78-43ee-9b7d-42022fd409f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847777 4707 generic.go:334] "Generic (PLEG): container finished" podID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerID="9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325" 
exitCode=0 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847802 4707 generic.go:334] "Generic (PLEG): container finished" podID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerID="6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061" exitCode=143 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1372af71-cf78-43ee-9b7d-42022fd409f9","Type":"ContainerDied","Data":"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325"} Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1372af71-cf78-43ee-9b7d-42022fd409f9","Type":"ContainerDied","Data":"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061"} Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1372af71-cf78-43ee-9b7d-42022fd409f9","Type":"ContainerDied","Data":"298d32a63b57452411b67c032d74dec530a73a3acf8d231c2dadb86a1286ca89"} Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847823 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.847886 4707 scope.go:117] "RemoveContainer" containerID="9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.850535 4707 generic.go:334] "Generic (PLEG): container finished" podID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerID="edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41" exitCode=0 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.850554 4707 generic.go:334] "Generic (PLEG): container finished" podID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerID="aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320" exitCode=143 Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.850619 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.850714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"712d2b76-e504-4ef1-8d3a-7426d60e1ed6","Type":"ContainerDied","Data":"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41"} Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.850823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"712d2b76-e504-4ef1-8d3a-7426d60e1ed6","Type":"ContainerDied","Data":"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320"} Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.850900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"712d2b76-e504-4ef1-8d3a-7426d60e1ed6","Type":"ContainerDied","Data":"4ea5538070ece208c3b94b316e6a791b1ddf0b5e6b0bd79b1f77dc4bf2380ca2"} Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.869717 4707 scope.go:117] "RemoveContainer" containerID="6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.874480 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.892265 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.904191 4707 scope.go:117] "RemoveContainer" containerID="9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.908650 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325\": container with ID starting with 9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325 not found: ID does not exist" containerID="9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.908690 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325"} err="failed to get container status \"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325\": rpc error: code = NotFound desc = could not find container \"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325\": container with ID starting with 9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.908714 4707 scope.go:117] "RemoveContainer" containerID="6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.909021 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061\": container with ID starting with 6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061 not found: ID does not exist" containerID="6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.909040 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061"} err="failed to get container status \"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061\": rpc error: code = NotFound desc = could not find container \"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061\": container with ID starting with 6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.909054 4707 scope.go:117] "RemoveContainer" containerID="9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.909984 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325"} err="failed to get container status \"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325\": rpc error: code = NotFound desc = could not find container \"9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325\": container with ID starting with 9bab32960b70b6120e064d8c2e2ce5fa28d0a100d16fdcc9b2641bfc94624325 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.910009 4707 scope.go:117] "RemoveContainer" containerID="6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.912465 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061"} err="failed to get container status \"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061\": rpc error: code = NotFound desc = could not find container \"6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061\": container with ID starting with 6a6aee6d99f3d660ff784d31f2c0bba42a3634a07d28f8b2a97cd326cde50061 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.912499 4707 scope.go:117] "RemoveContainer" containerID="edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.927254 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.934291 4707 scope.go:117] "RemoveContainer" containerID="aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.939902 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946145 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.946641 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-api" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-api" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.946684 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-log" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946690 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-log" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.946702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" containerName="nova-manage" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946707 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" containerName="nova-manage" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.946732 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-log" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946737 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-log" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.946748 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-metadata" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946754 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-metadata" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946964 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-metadata" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946982 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-log" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.946992 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" containerName="nova-metadata-log" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.947008 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" containerName="nova-api-api" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.947020 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" containerName="nova-manage" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.948046 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.951614 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.953749 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.954416 4707 scope.go:117] "RemoveContainer" containerID="edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.955926 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41\": container with ID starting with edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41 not found: ID does not exist" containerID="edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.955959 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41"} err="failed to get container status \"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41\": rpc error: code = NotFound desc = could not find container \"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41\": container with ID starting with edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.955980 4707 scope.go:117] "RemoveContainer" containerID="aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320" Jan 21 16:00:24 crc kubenswrapper[4707]: E0121 16:00:24.957211 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320\": container with ID starting with aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320 not found: ID does not exist" containerID="aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.957239 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320"} err="failed to get container status \"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320\": rpc error: code = NotFound desc = could not find container \"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320\": container with ID starting with aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.957260 4707 scope.go:117] "RemoveContainer" containerID="edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.960766 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41"} err="failed to get container status \"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41\": rpc error: code = NotFound desc = could not find container \"edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41\": container with ID starting with 
edc6c22bb768a4026cb5e5c32e95300a99a10277158074a03f0f4e7ac3a5cf41 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.960797 4707 scope.go:117] "RemoveContainer" containerID="aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.960918 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.961077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320"} err="failed to get container status \"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320\": rpc error: code = NotFound desc = could not find container \"aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320\": container with ID starting with aa6e4b86973d9e8669677baf8ac587a41c6e6b1a9fae1f38700944f21d15d320 not found: ID does not exist" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.962337 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.963755 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.964917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:00:24 crc kubenswrapper[4707]: I0121 16:00:24.968080 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.109526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-logs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.109561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.109599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjzs\" (UniqueName: \"kubernetes.io/projected/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-kube-api-access-8cjzs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.109884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm55z\" (UniqueName: \"kubernetes.io/projected/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-kube-api-access-jm55z\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.109986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-config-data\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.110108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.110206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.110245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-logs\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.110358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-config-data\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.189606 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1372af71-cf78-43ee-9b7d-42022fd409f9" path="/var/lib/kubelet/pods/1372af71-cf78-43ee-9b7d-42022fd409f9/volumes" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.190175 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712d2b76-e504-4ef1-8d3a-7426d60e1ed6" path="/var/lib/kubelet/pods/712d2b76-e504-4ef1-8d3a-7426d60e1ed6/volumes" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-logs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjzs\" (UniqueName: \"kubernetes.io/projected/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-kube-api-access-8cjzs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm55z\" (UniqueName: 
\"kubernetes.io/projected/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-kube-api-access-jm55z\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-config-data\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-logs\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-config-data\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-logs\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.211986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-logs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.215215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.215291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.215827 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-config-data\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.215914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.216247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-config-data\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.223684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm55z\" (UniqueName: \"kubernetes.io/projected/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-kube-api-access-jm55z\") pod \"nova-api-0\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.224232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjzs\" (UniqueName: \"kubernetes.io/projected/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-kube-api-access-8cjzs\") pod \"nova-metadata-0\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.264267 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.278449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:25 crc kubenswrapper[4707]: W0121 16:00:25.652567 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb8f520_1e2a_4f11_b9ba_54a2c2d40dfb.slice/crio-113cc6348f645751d5ce3866440afe0863a3d26efcb714a2952906631c244d1d WatchSource:0}: Error finding container 113cc6348f645751d5ce3866440afe0863a3d26efcb714a2952906631c244d1d: Status 404 returned error can't find the container with id 113cc6348f645751d5ce3866440afe0863a3d26efcb714a2952906631c244d1d Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.653318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.660114 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.860978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387","Type":"ContainerStarted","Data":"95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335"} Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.861023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387","Type":"ContainerStarted","Data":"0cb9f0c1aec32d254ac6ef714ac72ec000800f1dc1af3b3db02050694bcb555c"} Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.863452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb","Type":"ContainerStarted","Data":"2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc"} Jan 21 16:00:25 crc kubenswrapper[4707]: I0121 16:00:25.863492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb","Type":"ContainerStarted","Data":"113cc6348f645751d5ce3866440afe0863a3d26efcb714a2952906631c244d1d"} Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.587282 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.734200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-config-data\") pod \"8551d447-25a9-4955-90f9-b9154feb66b5\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.734268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-combined-ca-bundle\") pod \"8551d447-25a9-4955-90f9-b9154feb66b5\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.734306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slr45\" (UniqueName: \"kubernetes.io/projected/8551d447-25a9-4955-90f9-b9154feb66b5-kube-api-access-slr45\") pod \"8551d447-25a9-4955-90f9-b9154feb66b5\" (UID: \"8551d447-25a9-4955-90f9-b9154feb66b5\") " Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.738469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8551d447-25a9-4955-90f9-b9154feb66b5-kube-api-access-slr45" (OuterVolumeSpecName: "kube-api-access-slr45") pod "8551d447-25a9-4955-90f9-b9154feb66b5" (UID: "8551d447-25a9-4955-90f9-b9154feb66b5"). InnerVolumeSpecName "kube-api-access-slr45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.754329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8551d447-25a9-4955-90f9-b9154feb66b5" (UID: "8551d447-25a9-4955-90f9-b9154feb66b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.754658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-config-data" (OuterVolumeSpecName: "config-data") pod "8551d447-25a9-4955-90f9-b9154feb66b5" (UID: "8551d447-25a9-4955-90f9-b9154feb66b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.836789 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slr45\" (UniqueName: \"kubernetes.io/projected/8551d447-25a9-4955-90f9-b9154feb66b5-kube-api-access-slr45\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.837002 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.837015 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8551d447-25a9-4955-90f9-b9154feb66b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.873910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb","Type":"ContainerStarted","Data":"70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad"} Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.876288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387","Type":"ContainerStarted","Data":"e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8"} Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.877718 4707 generic.go:334] "Generic (PLEG): container finished" podID="8551d447-25a9-4955-90f9-b9154feb66b5" containerID="80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932" exitCode=0 Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.877746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8551d447-25a9-4955-90f9-b9154feb66b5","Type":"ContainerDied","Data":"80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932"} Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.877763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8551d447-25a9-4955-90f9-b9154feb66b5","Type":"ContainerDied","Data":"a96d86ca974f07873e3fa737fdb7b0f82af047ab21b8668e3e6ee4094a3d8357"} Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.877777 4707 scope.go:117] "RemoveContainer" containerID="80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.877854 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.893864 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.893851173 podStartE2EDuration="2.893851173s" podCreationTimestamp="2026-01-21 16:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:26.886363533 +0000 UTC m=+3524.067879755" watchObservedRunningTime="2026-01-21 16:00:26.893851173 +0000 UTC m=+3524.075367395" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.899509 4707 scope.go:117] "RemoveContainer" containerID="80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932" Jan 21 16:00:26 crc kubenswrapper[4707]: E0121 16:00:26.899916 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932\": container with ID starting with 80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932 not found: ID does not exist" containerID="80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.899999 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932"} err="failed to get container status \"80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932\": rpc error: code = NotFound desc = could not find container \"80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932\": container with ID starting with 80251f14d2305e28f5bd7683afb3bf111bb59ec5564a54372faf917bc318c932 not found: ID does not exist" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.907062 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.907052867 podStartE2EDuration="2.907052867s" podCreationTimestamp="2026-01-21 16:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:26.901185685 +0000 UTC m=+3524.082701906" watchObservedRunningTime="2026-01-21 16:00:26.907052867 +0000 UTC m=+3524.088569089" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.915725 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.924596 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.932214 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:26 crc kubenswrapper[4707]: E0121 16:00:26.932575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8551d447-25a9-4955-90f9-b9154feb66b5" containerName="nova-scheduler-scheduler" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.932593 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8551d447-25a9-4955-90f9-b9154feb66b5" containerName="nova-scheduler-scheduler" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.932756 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8551d447-25a9-4955-90f9-b9154feb66b5" 
containerName="nova-scheduler-scheduler" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.933279 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.934460 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:00:26 crc kubenswrapper[4707]: I0121 16:00:26.939352 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.039495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdc7\" (UniqueName: \"kubernetes.io/projected/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-kube-api-access-sbdc7\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.039577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-config-data\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.039877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.141732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdc7\" (UniqueName: \"kubernetes.io/projected/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-kube-api-access-sbdc7\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.141803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-config-data\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.141902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.145331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-config-data\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.145754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.154928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdc7\" (UniqueName: \"kubernetes.io/projected/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-kube-api-access-sbdc7\") pod \"nova-scheduler-0\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.190908 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8551d447-25a9-4955-90f9-b9154feb66b5" path="/var/lib/kubelet/pods/8551d447-25a9-4955-90f9-b9154feb66b5/volumes" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.240645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.245104 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.621203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:27 crc kubenswrapper[4707]: W0121 16:00:27.623120 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4811cd07_c8bd_4cfe_b3ba_8d0fd7715923.slice/crio-d46282b8a70f8d1b99c17ffdd53b1504d3cbe038973e25be78f58e014fbed7c0 WatchSource:0}: Error finding container d46282b8a70f8d1b99c17ffdd53b1504d3cbe038973e25be78f58e014fbed7c0: Status 404 returned error can't find the container with id d46282b8a70f8d1b99c17ffdd53b1504d3cbe038973e25be78f58e014fbed7c0 Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.889257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923","Type":"ContainerStarted","Data":"798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec"} Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.889482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923","Type":"ContainerStarted","Data":"d46282b8a70f8d1b99c17ffdd53b1504d3cbe038973e25be78f58e014fbed7c0"} Jan 21 16:00:27 crc kubenswrapper[4707]: I0121 16:00:27.904753 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.904740831 podStartE2EDuration="1.904740831s" podCreationTimestamp="2026-01-21 16:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:27.901270015 +0000 UTC m=+3525.082786236" watchObservedRunningTime="2026-01-21 16:00:27.904740831 +0000 UTC m=+3525.086257053" Jan 21 16:00:29 crc kubenswrapper[4707]: I0121 16:00:29.160056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:29 crc kubenswrapper[4707]: I0121 16:00:29.174213 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:29 crc kubenswrapper[4707]: I0121 16:00:29.918456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.018666 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9"] Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.019798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.021184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.023721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.024928 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9"] Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.191382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-config-data\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.191610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-scripts\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.191666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.191717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtq2t\" (UniqueName: \"kubernetes.io/projected/671e177f-2f9d-42bc-b150-5efee0b71d9c-kube-api-access-xtq2t\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.279644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.279691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.293272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-config-data\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.293334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-scripts\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.293395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.293445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtq2t\" (UniqueName: \"kubernetes.io/projected/671e177f-2f9d-42bc-b150-5efee0b71d9c-kube-api-access-xtq2t\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.298055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.298841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-scripts\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.298916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-config-data\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.307545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtq2t\" (UniqueName: \"kubernetes.io/projected/671e177f-2f9d-42bc-b150-5efee0b71d9c-kube-api-access-xtq2t\") pod \"nova-cell1-cell-mapping-b8pq9\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.333454 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.712172 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9"] Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.912967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" event={"ID":"671e177f-2f9d-42bc-b150-5efee0b71d9c","Type":"ContainerStarted","Data":"ad9cce4ce97e6242f33b520601f135d3c884134a1b63fb09b6ee8acb83dc3e92"} Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.913012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" event={"ID":"671e177f-2f9d-42bc-b150-5efee0b71d9c","Type":"ContainerStarted","Data":"d1f654f3f3831c97b649b5f871a1119561dbf380513d3ce9f4be9b18f5452894"} Jan 21 16:00:30 crc kubenswrapper[4707]: I0121 16:00:30.927220 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" podStartSLOduration=0.927204937 podStartE2EDuration="927.204937ms" podCreationTimestamp="2026-01-21 16:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:30.92345259 +0000 UTC m=+3528.104968813" watchObservedRunningTime="2026-01-21 16:00:30.927204937 +0000 UTC m=+3528.108721158" Jan 21 16:00:32 crc kubenswrapper[4707]: I0121 16:00:32.245870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:34 crc kubenswrapper[4707]: I0121 16:00:34.944216 4707 generic.go:334] "Generic (PLEG): container finished" podID="671e177f-2f9d-42bc-b150-5efee0b71d9c" containerID="ad9cce4ce97e6242f33b520601f135d3c884134a1b63fb09b6ee8acb83dc3e92" exitCode=0 Jan 21 16:00:34 crc kubenswrapper[4707]: I0121 16:00:34.944290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" event={"ID":"671e177f-2f9d-42bc-b150-5efee0b71d9c","Type":"ContainerDied","Data":"ad9cce4ce97e6242f33b520601f135d3c884134a1b63fb09b6ee8acb83dc3e92"} Jan 21 16:00:35 crc kubenswrapper[4707]: I0121 16:00:35.264869 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:35 crc kubenswrapper[4707]: I0121 16:00:35.264916 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:35 crc kubenswrapper[4707]: I0121 16:00:35.283429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:35 crc kubenswrapper[4707]: I0121 16:00:35.283503 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.215665 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.305960 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.32:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.391264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-scripts\") pod \"671e177f-2f9d-42bc-b150-5efee0b71d9c\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.391324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtq2t\" (UniqueName: \"kubernetes.io/projected/671e177f-2f9d-42bc-b150-5efee0b71d9c-kube-api-access-xtq2t\") pod \"671e177f-2f9d-42bc-b150-5efee0b71d9c\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.391441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-combined-ca-bundle\") pod \"671e177f-2f9d-42bc-b150-5efee0b71d9c\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.391490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-config-data\") pod \"671e177f-2f9d-42bc-b150-5efee0b71d9c\" (UID: \"671e177f-2f9d-42bc-b150-5efee0b71d9c\") " Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.393941 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.32:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.394166 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.33:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.394204 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.33:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.409397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671e177f-2f9d-42bc-b150-5efee0b71d9c-kube-api-access-xtq2t" (OuterVolumeSpecName: "kube-api-access-xtq2t") pod "671e177f-2f9d-42bc-b150-5efee0b71d9c" (UID: "671e177f-2f9d-42bc-b150-5efee0b71d9c"). InnerVolumeSpecName "kube-api-access-xtq2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.412573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "671e177f-2f9d-42bc-b150-5efee0b71d9c" (UID: "671e177f-2f9d-42bc-b150-5efee0b71d9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.414319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-scripts" (OuterVolumeSpecName: "scripts") pod "671e177f-2f9d-42bc-b150-5efee0b71d9c" (UID: "671e177f-2f9d-42bc-b150-5efee0b71d9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.414330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-config-data" (OuterVolumeSpecName: "config-data") pod "671e177f-2f9d-42bc-b150-5efee0b71d9c" (UID: "671e177f-2f9d-42bc-b150-5efee0b71d9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.493417 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.493445 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtq2t\" (UniqueName: \"kubernetes.io/projected/671e177f-2f9d-42bc-b150-5efee0b71d9c-kube-api-access-xtq2t\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.493459 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.493467 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e177f-2f9d-42bc-b150-5efee0b71d9c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.961162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" event={"ID":"671e177f-2f9d-42bc-b150-5efee0b71d9c","Type":"ContainerDied","Data":"d1f654f3f3831c97b649b5f871a1119561dbf380513d3ce9f4be9b18f5452894"} Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.961201 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1f654f3f3831c97b649b5f871a1119561dbf380513d3ce9f4be9b18f5452894" Jan 21 16:00:36 crc kubenswrapper[4707]: I0121 16:00:36.961247 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9" Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.124822 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.125140 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-log" containerID="cri-o://95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335" gracePeriod=30 Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.125643 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-api" containerID="cri-o://e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8" gracePeriod=30 Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.132572 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.132712 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" containerName="nova-scheduler-scheduler" containerID="cri-o://798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec" gracePeriod=30 Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.179640 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.179825 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-log" containerID="cri-o://2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc" gracePeriod=30 Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.179870 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-metadata" containerID="cri-o://70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad" gracePeriod=30 Jan 21 16:00:37 crc kubenswrapper[4707]: E0121 16:00:37.348671 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb8f520_1e2a_4f11_b9ba_54a2c2d40dfb.slice/crio-conmon-2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.969192 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerID="2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc" exitCode=143 Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.969264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb","Type":"ContainerDied","Data":"2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc"} Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.971394 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" 
containerID="95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335" exitCode=143 Jan 21 16:00:37 crc kubenswrapper[4707]: I0121 16:00:37.971434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387","Type":"ContainerDied","Data":"95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335"} Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.593685 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.734510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-combined-ca-bundle\") pod \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.734611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbdc7\" (UniqueName: \"kubernetes.io/projected/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-kube-api-access-sbdc7\") pod \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.734635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-config-data\") pod \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\" (UID: \"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923\") " Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.738879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-kube-api-access-sbdc7" (OuterVolumeSpecName: "kube-api-access-sbdc7") pod "4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" (UID: "4811cd07-c8bd-4cfe-b3ba-8d0fd7715923"). InnerVolumeSpecName "kube-api-access-sbdc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.754940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-config-data" (OuterVolumeSpecName: "config-data") pod "4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" (UID: "4811cd07-c8bd-4cfe-b3ba-8d0fd7715923"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.755313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" (UID: "4811cd07-c8bd-4cfe-b3ba-8d0fd7715923"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.836902 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.836997 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbdc7\" (UniqueName: \"kubernetes.io/projected/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-kube-api-access-sbdc7\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.837066 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.980930 4707 generic.go:334] "Generic (PLEG): container finished" podID="4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" containerID="798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec" exitCode=0 Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.980994 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.981001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923","Type":"ContainerDied","Data":"798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec"} Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.981439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4811cd07-c8bd-4cfe-b3ba-8d0fd7715923","Type":"ContainerDied","Data":"d46282b8a70f8d1b99c17ffdd53b1504d3cbe038973e25be78f58e014fbed7c0"} Jan 21 16:00:38 crc kubenswrapper[4707]: I0121 16:00:38.981462 4707 scope.go:117] "RemoveContainer" containerID="798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.001083 4707 scope.go:117] "RemoveContainer" containerID="798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec" Jan 21 16:00:39 crc kubenswrapper[4707]: E0121 16:00:39.001408 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec\": container with ID starting with 798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec not found: ID does not exist" containerID="798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.001447 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec"} err="failed to get container status \"798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec\": rpc error: code = NotFound desc = could not find container \"798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec\": container with ID starting with 798c19076f9526b8485a3909416d42ea3eb092a3122f763267824557102102ec not found: ID does not exist" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.003929 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:39 crc 
kubenswrapper[4707]: I0121 16:00:39.013205 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.021501 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:39 crc kubenswrapper[4707]: E0121 16:00:39.021923 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" containerName="nova-scheduler-scheduler" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.021990 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" containerName="nova-scheduler-scheduler" Jan 21 16:00:39 crc kubenswrapper[4707]: E0121 16:00:39.022069 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671e177f-2f9d-42bc-b150-5efee0b71d9c" containerName="nova-manage" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.022127 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="671e177f-2f9d-42bc-b150-5efee0b71d9c" containerName="nova-manage" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.022371 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="671e177f-2f9d-42bc-b150-5efee0b71d9c" containerName="nova-manage" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.022447 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" containerName="nova-scheduler-scheduler" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.023080 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.025409 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.030340 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.141921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.142195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcr4\" (UniqueName: \"kubernetes.io/projected/53e561f4-eb73-4a09-884c-0d5476044b72-kube-api-access-rxcr4\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.142277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-config-data\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.190053 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4811cd07-c8bd-4cfe-b3ba-8d0fd7715923" path="/var/lib/kubelet/pods/4811cd07-c8bd-4cfe-b3ba-8d0fd7715923/volumes" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 
16:00:39.244161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.244238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-config-data\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.244258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcr4\" (UniqueName: \"kubernetes.io/projected/53e561f4-eb73-4a09-884c-0d5476044b72-kube-api-access-rxcr4\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.247511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-config-data\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.247659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.257226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcr4\" (UniqueName: \"kubernetes.io/projected/53e561f4-eb73-4a09-884c-0d5476044b72-kube-api-access-rxcr4\") pod \"nova-scheduler-0\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.344387 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.706686 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.990887 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53e561f4-eb73-4a09-884c-0d5476044b72","Type":"ContainerStarted","Data":"a6140e292d56c1ea39d41e209b222050d7c5bcd750acd566047cd616b55a336c"} Jan 21 16:00:39 crc kubenswrapper[4707]: I0121 16:00:39.990935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53e561f4-eb73-4a09-884c-0d5476044b72","Type":"ContainerStarted","Data":"29ba37e42dd0b06196b3ffb4fc05b462a0857501790255412777e2974ba5f538"} Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.004217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.004193157 podStartE2EDuration="1.004193157s" podCreationTimestamp="2026-01-21 16:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:40.000350481 +0000 UTC m=+3537.181866703" watchObservedRunningTime="2026-01-21 16:00:40.004193157 +0000 UTC m=+3537.185709379" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.670385 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.763311 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.771770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cjzs\" (UniqueName: \"kubernetes.io/projected/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-kube-api-access-8cjzs\") pod \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.771829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-config-data\") pod \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.771887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-logs\") pod \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.771921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-nova-metadata-tls-certs\") pod \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\" (UID: \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.771957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-combined-ca-bundle\") pod \"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\" (UID: 
\"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.772274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-logs" (OuterVolumeSpecName: "logs") pod "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" (UID: "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.772723 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.779114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-kube-api-access-8cjzs" (OuterVolumeSpecName: "kube-api-access-8cjzs") pod "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" (UID: "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb"). InnerVolumeSpecName "kube-api-access-8cjzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.804989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-config-data" (OuterVolumeSpecName: "config-data") pod "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" (UID: "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.806310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" (UID: "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.821466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" (UID: "3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.873490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-config-data\") pod \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.873619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-combined-ca-bundle\") pod \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.873900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm55z\" (UniqueName: \"kubernetes.io/projected/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-kube-api-access-jm55z\") pod \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.873947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-logs\") pod \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\" (UID: \"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387\") " Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.874422 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.874441 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cjzs\" (UniqueName: \"kubernetes.io/projected/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-kube-api-access-8cjzs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.874452 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.874444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-logs" (OuterVolumeSpecName: "logs") pod "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" (UID: "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.874461 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.883591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-kube-api-access-jm55z" (OuterVolumeSpecName: "kube-api-access-jm55z") pod "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" (UID: "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387"). InnerVolumeSpecName "kube-api-access-jm55z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.891178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-config-data" (OuterVolumeSpecName: "config-data") pod "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" (UID: "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.892360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" (UID: "d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.976256 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.976282 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm55z\" (UniqueName: \"kubernetes.io/projected/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-kube-api-access-jm55z\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.976292 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:40 crc kubenswrapper[4707]: I0121 16:00:40.976317 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.000349 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerID="70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad" exitCode=0 Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.000399 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.000419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb","Type":"ContainerDied","Data":"70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad"} Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.000445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb","Type":"ContainerDied","Data":"113cc6348f645751d5ce3866440afe0863a3d26efcb714a2952906631c244d1d"} Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.000461 4707 scope.go:117] "RemoveContainer" containerID="70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.002539 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerID="e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8" exitCode=0 Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.002631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387","Type":"ContainerDied","Data":"e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8"} Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.002676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387","Type":"ContainerDied","Data":"0cb9f0c1aec32d254ac6ef714ac72ec000800f1dc1af3b3db02050694bcb555c"} Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.002650 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.017408 4707 scope.go:117] "RemoveContainer" containerID="2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.028229 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.035579 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.041486 4707 scope.go:117] "RemoveContainer" containerID="70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad" Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.043859 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad\": container with ID starting with 70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad not found: ID does not exist" containerID="70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.043889 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad"} err="failed to get container status \"70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad\": rpc error: code = NotFound desc = could not find container \"70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad\": container with ID starting with 70b9cbba133b9e3740cec5a44415229da43e533c8db8c3dc225af06a4e490aad not found: ID does not exist" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.043910 4707 scope.go:117] "RemoveContainer" containerID="2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.044262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.044755 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc\": container with ID starting with 2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc not found: ID does not exist" containerID="2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.044779 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc"} err="failed to get container status \"2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc\": rpc error: code = NotFound desc = could not find container \"2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc\": container with ID starting with 2ae8a1771b22b84335bcd6fab68ff25696f186ec130a68591fc9b14de5cc76bc not found: ID does not exist" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.044817 4707 scope.go:117] "RemoveContainer" containerID="e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.053538 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:41 
crc kubenswrapper[4707]: I0121 16:00:41.062664 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.063277 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-metadata" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.063355 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-metadata" Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.063431 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-log" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.063479 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-log" Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.063533 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-api" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.063586 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-api" Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.063648 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-log" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.063699 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-log" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.063919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-log" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.063983 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-log" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.064037 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" containerName="nova-api-api" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.064097 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" containerName="nova-metadata-metadata" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.065039 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.066693 4707 scope.go:117] "RemoveContainer" containerID="95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.067224 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.067236 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.067873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.074887 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.076217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.077668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.081338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.097633 4707 scope.go:117] "RemoveContainer" containerID="e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8" Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.097995 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8\": container with ID starting with e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8 not found: ID does not exist" containerID="e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.098122 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8"} err="failed to get container status \"e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8\": rpc error: code = NotFound desc = could not find container \"e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8\": container with ID starting with e07b794bd1c528d5aef375b7b9c1cb54f25580a0425327a5e3cad613daab8db8 not found: ID does not exist" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.098195 4707 scope.go:117] "RemoveContainer" containerID="95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335" Jan 21 16:00:41 crc kubenswrapper[4707]: E0121 16:00:41.098632 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335\": container with ID starting with 95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335 not found: ID does not exist" containerID="95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.098719 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335"} err="failed to get container status \"95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335\": rpc error: code = NotFound desc = could not find container \"95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335\": container with ID starting with 95ec79982a415f42e06c533866a11430fc8d980d551ce5a3f879f37e34716335 not found: ID does not exist" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5adfc17-d372-40cd-a326-5af09f323dce-logs\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcxn\" (UniqueName: \"kubernetes.io/projected/b5adfc17-d372-40cd-a326-5af09f323dce-kube-api-access-2lcxn\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60dc1db5-0c41-450a-8545-281cdbc0ff86-logs\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-config-data\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnw5\" (UniqueName: \"kubernetes.io/projected/60dc1db5-0c41-450a-8545-281cdbc0ff86-kube-api-access-tlnw5\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-config-data\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.181495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.205479 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb" path="/var/lib/kubelet/pods/3bb8f520-1e2a-4f11-b9ba-54a2c2d40dfb/volumes" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.206192 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387" path="/var/lib/kubelet/pods/d7c88c9a-9cf9-46fd-8fb3-bbec32b8e387/volumes" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnw5\" (UniqueName: \"kubernetes.io/projected/60dc1db5-0c41-450a-8545-281cdbc0ff86-kube-api-access-tlnw5\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-config-data\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5adfc17-d372-40cd-a326-5af09f323dce-logs\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcxn\" (UniqueName: \"kubernetes.io/projected/b5adfc17-d372-40cd-a326-5af09f323dce-kube-api-access-2lcxn\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60dc1db5-0c41-450a-8545-281cdbc0ff86-logs\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.283867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-config-data\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.284521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5adfc17-d372-40cd-a326-5af09f323dce-logs\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.285279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60dc1db5-0c41-450a-8545-281cdbc0ff86-logs\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.287443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-config-data\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.287451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-config-data\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.288131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.294686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.294880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.297201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcxn\" (UniqueName: \"kubernetes.io/projected/b5adfc17-d372-40cd-a326-5af09f323dce-kube-api-access-2lcxn\") pod \"nova-metadata-0\" (UID: 
\"b5adfc17-d372-40cd-a326-5af09f323dce\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.298620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnw5\" (UniqueName: \"kubernetes.io/projected/60dc1db5-0c41-450a-8545-281cdbc0ff86-kube-api-access-tlnw5\") pod \"nova-api-0\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.383515 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.397208 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.771356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:00:41 crc kubenswrapper[4707]: W0121 16:00:41.772976 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5adfc17_d372_40cd_a326_5af09f323dce.slice/crio-f97b57e63c4b0fc0d35bce0b9979cd7e2949ba4c614926e596c0b3a044b8f645 WatchSource:0}: Error finding container f97b57e63c4b0fc0d35bce0b9979cd7e2949ba4c614926e596c0b3a044b8f645: Status 404 returned error can't find the container with id f97b57e63c4b0fc0d35bce0b9979cd7e2949ba4c614926e596c0b3a044b8f645 Jan 21 16:00:41 crc kubenswrapper[4707]: I0121 16:00:41.831080 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:00:42 crc kubenswrapper[4707]: I0121 16:00:42.010091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5adfc17-d372-40cd-a326-5af09f323dce","Type":"ContainerStarted","Data":"51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384"} Jan 21 16:00:42 crc kubenswrapper[4707]: I0121 16:00:42.010318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5adfc17-d372-40cd-a326-5af09f323dce","Type":"ContainerStarted","Data":"f97b57e63c4b0fc0d35bce0b9979cd7e2949ba4c614926e596c0b3a044b8f645"} Jan 21 16:00:42 crc kubenswrapper[4707]: I0121 16:00:42.011449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"60dc1db5-0c41-450a-8545-281cdbc0ff86","Type":"ContainerStarted","Data":"40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829"} Jan 21 16:00:42 crc kubenswrapper[4707]: I0121 16:00:42.011472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"60dc1db5-0c41-450a-8545-281cdbc0ff86","Type":"ContainerStarted","Data":"c57c256aa3ba25c626516ea311159ff6fad27e5f74399128b67a048d66d21e0b"} Jan 21 16:00:43 crc kubenswrapper[4707]: I0121 16:00:43.025114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5adfc17-d372-40cd-a326-5af09f323dce","Type":"ContainerStarted","Data":"0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85"} Jan 21 16:00:43 crc kubenswrapper[4707]: I0121 16:00:43.027615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"60dc1db5-0c41-450a-8545-281cdbc0ff86","Type":"ContainerStarted","Data":"4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de"} Jan 
21 16:00:43 crc kubenswrapper[4707]: I0121 16:00:43.038673 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.038659842 podStartE2EDuration="2.038659842s" podCreationTimestamp="2026-01-21 16:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:43.038636098 +0000 UTC m=+3540.220152320" watchObservedRunningTime="2026-01-21 16:00:43.038659842 +0000 UTC m=+3540.220176064" Jan 21 16:00:43 crc kubenswrapper[4707]: I0121 16:00:43.052908 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.052892675 podStartE2EDuration="2.052892675s" podCreationTimestamp="2026-01-21 16:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:00:43.049761217 +0000 UTC m=+3540.231277439" watchObservedRunningTime="2026-01-21 16:00:43.052892675 +0000 UTC m=+3540.234408897" Jan 21 16:00:44 crc kubenswrapper[4707]: I0121 16:00:44.344476 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:46 crc kubenswrapper[4707]: I0121 16:00:46.384273 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:46 crc kubenswrapper[4707]: I0121 16:00:46.384509 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:49 crc kubenswrapper[4707]: I0121 16:00:49.345213 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:49 crc kubenswrapper[4707]: I0121 16:00:49.365565 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:50 crc kubenswrapper[4707]: I0121 16:00:50.092161 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:00:51 crc kubenswrapper[4707]: I0121 16:00:51.384130 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:51 crc kubenswrapper[4707]: I0121 16:00:51.384177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:00:51 crc kubenswrapper[4707]: I0121 16:00:51.397409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:51 crc kubenswrapper[4707]: I0121 16:00:51.397455 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:00:52 crc kubenswrapper[4707]: I0121 16:00:52.398975 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.37:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:52 crc kubenswrapper[4707]: I0121 16:00:52.398969 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.37:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:52 crc kubenswrapper[4707]: I0121 16:00:52.481007 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.38:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:00:52 crc kubenswrapper[4707]: I0121 16:00:52.481009 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.38:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.131571 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-cron-29483521-zkwgs"] Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.133084 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.137763 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29483521-zkwgs"] Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.178845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slsn\" (UniqueName: \"kubernetes.io/projected/bfd7d039-ca6f-477f-a698-e5fcd49e092b-kube-api-access-4slsn\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.178899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-combined-ca-bundle\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.178948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-config-data\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.178994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-fernet-keys\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.280551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-combined-ca-bundle\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" 
Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.280617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-config-data\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.280647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-fernet-keys\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.280737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slsn\" (UniqueName: \"kubernetes.io/projected/bfd7d039-ca6f-477f-a698-e5fcd49e092b-kube-api-access-4slsn\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.285289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-fernet-keys\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.285733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-config-data\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.285885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-combined-ca-bundle\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.294410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slsn\" (UniqueName: \"kubernetes.io/projected/bfd7d039-ca6f-477f-a698-e5fcd49e092b-kube-api-access-4slsn\") pod \"keystone-cron-29483521-zkwgs\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.456399 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:00 crc kubenswrapper[4707]: I0121 16:01:00.816287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29483521-zkwgs"] Jan 21 16:01:00 crc kubenswrapper[4707]: W0121 16:01:00.817335 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfd7d039_ca6f_477f_a698_e5fcd49e092b.slice/crio-2a7c38b5efd8bfe6b1c0c6d0c407243e7e31cbf539740168ba645fea897bf4b9 WatchSource:0}: Error finding container 2a7c38b5efd8bfe6b1c0c6d0c407243e7e31cbf539740168ba645fea897bf4b9: Status 404 returned error can't find the container with id 2a7c38b5efd8bfe6b1c0c6d0c407243e7e31cbf539740168ba645fea897bf4b9 Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.147036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" event={"ID":"bfd7d039-ca6f-477f-a698-e5fcd49e092b","Type":"ContainerStarted","Data":"f3747fccd776cf61a6bf6bea5e9650119f54f111c7925fb761524b94126ed398"} Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.147077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" event={"ID":"bfd7d039-ca6f-477f-a698-e5fcd49e092b","Type":"ContainerStarted","Data":"2a7c38b5efd8bfe6b1c0c6d0c407243e7e31cbf539740168ba645fea897bf4b9"} Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.161675 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" podStartSLOduration=1.161664334 podStartE2EDuration="1.161664334s" podCreationTimestamp="2026-01-21 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:01.15738657 +0000 UTC m=+3558.338902792" watchObservedRunningTime="2026-01-21 16:01:01.161664334 +0000 UTC m=+3558.343180555" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.388921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.391886 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.397546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.400472 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.400800 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.401472 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:01 crc kubenswrapper[4707]: I0121 16:01:01.406104 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:02 crc kubenswrapper[4707]: I0121 16:01:02.163940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:02 crc kubenswrapper[4707]: I0121 16:01:02.174770 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:02 crc kubenswrapper[4707]: I0121 16:01:02.176387 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.161082 4707 generic.go:334] "Generic (PLEG): container finished" podID="bfd7d039-ca6f-477f-a698-e5fcd49e092b" containerID="f3747fccd776cf61a6bf6bea5e9650119f54f111c7925fb761524b94126ed398" exitCode=0 Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.161160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" event={"ID":"bfd7d039-ca6f-477f-a698-e5fcd49e092b","Type":"ContainerDied","Data":"f3747fccd776cf61a6bf6bea5e9650119f54f111c7925fb761524b94126ed398"} Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.619243 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.619492 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-central-agent" containerID="cri-o://2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8" gracePeriod=30 Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.619559 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="proxy-httpd" containerID="cri-o://e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd" gracePeriod=30 Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.619590 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="sg-core" containerID="cri-o://a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9" gracePeriod=30 Jan 21 16:01:03 crc kubenswrapper[4707]: I0121 16:01:03.619570 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-notification-agent" containerID="cri-o://7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d" gracePeriod=30 Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.170428 4707 generic.go:334] "Generic (PLEG): container finished" podID="75da6817-ed62-456c-a572-375325ca0bd5" containerID="e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd" exitCode=0 Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.170458 4707 generic.go:334] "Generic (PLEG): container finished" podID="75da6817-ed62-456c-a572-375325ca0bd5" containerID="a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9" exitCode=2 Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.170465 4707 generic.go:334] "Generic (PLEG): container finished" podID="75da6817-ed62-456c-a572-375325ca0bd5" containerID="2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8" exitCode=0 Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.170485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerDied","Data":"e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd"} Jan 21 16:01:04 crc 
kubenswrapper[4707]: I0121 16:01:04.170530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerDied","Data":"a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9"} Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.170542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerDied","Data":"2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8"} Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.433313 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.551128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-fernet-keys\") pod \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.551253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4slsn\" (UniqueName: \"kubernetes.io/projected/bfd7d039-ca6f-477f-a698-e5fcd49e092b-kube-api-access-4slsn\") pod \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.551295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-config-data\") pod \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.551367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-combined-ca-bundle\") pod \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\" (UID: \"bfd7d039-ca6f-477f-a698-e5fcd49e092b\") " Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.555200 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.555800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd7d039-ca6f-477f-a698-e5fcd49e092b-kube-api-access-4slsn" (OuterVolumeSpecName: "kube-api-access-4slsn") pod "bfd7d039-ca6f-477f-a698-e5fcd49e092b" (UID: "bfd7d039-ca6f-477f-a698-e5fcd49e092b"). InnerVolumeSpecName "kube-api-access-4slsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.555927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bfd7d039-ca6f-477f-a698-e5fcd49e092b" (UID: "bfd7d039-ca6f-477f-a698-e5fcd49e092b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.577426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfd7d039-ca6f-477f-a698-e5fcd49e092b" (UID: "bfd7d039-ca6f-477f-a698-e5fcd49e092b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.591935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-config-data" (OuterVolumeSpecName: "config-data") pod "bfd7d039-ca6f-477f-a698-e5fcd49e092b" (UID: "bfd7d039-ca6f-477f-a698-e5fcd49e092b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.653187 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4slsn\" (UniqueName: \"kubernetes.io/projected/bfd7d039-ca6f-477f-a698-e5fcd49e092b-kube-api-access-4slsn\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.653219 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.653230 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:04 crc kubenswrapper[4707]: I0121 16:01:04.653239 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfd7d039-ca6f-477f-a698-e5fcd49e092b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:05 crc kubenswrapper[4707]: I0121 16:01:05.178625 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-log" containerID="cri-o://40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829" gracePeriod=30 Jan 21 16:01:05 crc kubenswrapper[4707]: I0121 16:01:05.178915 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" Jan 21 16:01:05 crc kubenswrapper[4707]: I0121 16:01:05.178983 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-api" containerID="cri-o://4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de" gracePeriod=30 Jan 21 16:01:05 crc kubenswrapper[4707]: I0121 16:01:05.179289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29483521-zkwgs" event={"ID":"bfd7d039-ca6f-477f-a698-e5fcd49e092b","Type":"ContainerDied","Data":"2a7c38b5efd8bfe6b1c0c6d0c407243e7e31cbf539740168ba645fea897bf4b9"} Jan 21 16:01:05 crc kubenswrapper[4707]: I0121 16:01:05.179330 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a7c38b5efd8bfe6b1c0c6d0c407243e7e31cbf539740168ba645fea897bf4b9" Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.185984 4707 generic.go:334] "Generic (PLEG): container finished" podID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerID="40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829" exitCode=143 Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.186016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"60dc1db5-0c41-450a-8545-281cdbc0ff86","Type":"ContainerDied","Data":"40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829"} Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.805427 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-combined-ca-bundle\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54npd\" (UniqueName: \"kubernetes.io/projected/75da6817-ed62-456c-a572-375325ca0bd5-kube-api-access-54npd\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-log-httpd\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-ceilometer-tls-certs\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-run-httpd\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 
16:01:06.994539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-sg-core-conf-yaml\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-config-data\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-scripts\") pod \"75da6817-ed62-456c-a572-375325ca0bd5\" (UID: \"75da6817-ed62-456c-a572-375325ca0bd5\") " Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.994930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.996310 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:06 crc kubenswrapper[4707]: I0121 16:01:06.996484 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75da6817-ed62-456c-a572-375325ca0bd5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.008541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-scripts" (OuterVolumeSpecName: "scripts") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.008591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75da6817-ed62-456c-a572-375325ca0bd5-kube-api-access-54npd" (OuterVolumeSpecName: "kube-api-access-54npd") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "kube-api-access-54npd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.018448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.034950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.054923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.071986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-config-data" (OuterVolumeSpecName: "config-data") pod "75da6817-ed62-456c-a572-375325ca0bd5" (UID: "75da6817-ed62-456c-a572-375325ca0bd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.097692 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.097715 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54npd\" (UniqueName: \"kubernetes.io/projected/75da6817-ed62-456c-a572-375325ca0bd5-kube-api-access-54npd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.097725 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.097734 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.097743 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.097751 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75da6817-ed62-456c-a572-375325ca0bd5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.194117 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="75da6817-ed62-456c-a572-375325ca0bd5" containerID="7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d" exitCode=0 Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.194155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerDied","Data":"7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d"} Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.194185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"75da6817-ed62-456c-a572-375325ca0bd5","Type":"ContainerDied","Data":"62f887e71283175cd8ecec6f3f36d061a827546fd63b126348c6094479486992"} Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.194201 4707 scope.go:117] "RemoveContainer" containerID="e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.194802 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.215694 4707 scope.go:117] "RemoveContainer" containerID="a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.229403 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.237451 4707 scope.go:117] "RemoveContainer" containerID="7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.243699 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.252894 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.253212 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-central-agent" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253231 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-central-agent" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.253247 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="sg-core" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253254 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="sg-core" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.253273 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd7d039-ca6f-477f-a698-e5fcd49e092b" containerName="keystone-cron" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253279 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd7d039-ca6f-477f-a698-e5fcd49e092b" containerName="keystone-cron" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.253288 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-notification-agent" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253293 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da6817-ed62-456c-a572-375325ca0bd5" 
containerName="ceilometer-notification-agent" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.253330 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="proxy-httpd" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253337 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="proxy-httpd" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253487 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="sg-core" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-central-agent" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253512 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="ceilometer-notification-agent" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253522 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd7d039-ca6f-477f-a698-e5fcd49e092b" containerName="keystone-cron" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.253534 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75da6817-ed62-456c-a572-375325ca0bd5" containerName="proxy-httpd" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.254908 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.256031 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.256408 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.256736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.259458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.265962 4707 scope.go:117] "RemoveContainer" containerID="2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.283481 4707 scope.go:117] "RemoveContainer" containerID="e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.283780 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd\": container with ID starting with e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd not found: ID does not exist" containerID="e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.283827 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd"} err="failed to get container status \"e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd\": rpc error: code = NotFound desc = could not find 
container \"e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd\": container with ID starting with e7ad74522f86304041f98c05e018f6479182c741fdb4bc9a1c22ff318c2972cd not found: ID does not exist" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.283848 4707 scope.go:117] "RemoveContainer" containerID="a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.284094 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9\": container with ID starting with a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9 not found: ID does not exist" containerID="a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.284124 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9"} err="failed to get container status \"a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9\": rpc error: code = NotFound desc = could not find container \"a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9\": container with ID starting with a0735a46f0009c24cd7b749e4a21afdcef201b45f8566563c6e8ad7556bae1d9 not found: ID does not exist" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.284142 4707 scope.go:117] "RemoveContainer" containerID="7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.284461 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d\": container with ID starting with 7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d not found: ID does not exist" containerID="7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.284501 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d"} err="failed to get container status \"7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d\": rpc error: code = NotFound desc = could not find container \"7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d\": container with ID starting with 7a456cecabd58d8b520f7d7fa58210d96a300dd13d3f19f2470800e0d4fbe91d not found: ID does not exist" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.284525 4707 scope.go:117] "RemoveContainer" containerID="2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8" Jan 21 16:01:07 crc kubenswrapper[4707]: E0121 16:01:07.284804 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8\": container with ID starting with 2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8 not found: ID does not exist" containerID="2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.284854 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8"} err="failed to get container status \"2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8\": rpc error: code = NotFound desc = could not find container \"2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8\": container with ID starting with 2c593fcff78e509d8bffdb2ecc0e48c6c972bc4d9360c1809d942f0d43d3f3a8 not found: ID does not exist" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-scripts\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-run-httpd\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-log-httpd\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hg4\" (UniqueName: \"kubernetes.io/projected/a788c56d-377f-4859-910d-a623173f0a74-kube-api-access-48hg4\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-config-data\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.301907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" 
Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-log-httpd\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48hg4\" (UniqueName: \"kubernetes.io/projected/a788c56d-377f-4859-910d-a623173f0a74-kube-api-access-48hg4\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-config-data\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-scripts\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.403923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-run-httpd\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.404556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-log-httpd\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.405447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.407584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-config-data\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.407972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.408617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-scripts\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.408738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.409228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.418365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48hg4\" (UniqueName: \"kubernetes.io/projected/a788c56d-377f-4859-910d-a623173f0a74-kube-api-access-48hg4\") pod \"ceilometer-0\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.575612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:07 crc kubenswrapper[4707]: I0121 16:01:07.971158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.201748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerStarted","Data":"23eb442cbe94a7e31d994e60c9d91679a09744385f08c1118e3bc951f113796e"} Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.860858 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.929516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60dc1db5-0c41-450a-8545-281cdbc0ff86-logs\") pod \"60dc1db5-0c41-450a-8545-281cdbc0ff86\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.929700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlnw5\" (UniqueName: \"kubernetes.io/projected/60dc1db5-0c41-450a-8545-281cdbc0ff86-kube-api-access-tlnw5\") pod \"60dc1db5-0c41-450a-8545-281cdbc0ff86\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.929730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-config-data\") pod \"60dc1db5-0c41-450a-8545-281cdbc0ff86\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.929767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-combined-ca-bundle\") pod \"60dc1db5-0c41-450a-8545-281cdbc0ff86\" (UID: \"60dc1db5-0c41-450a-8545-281cdbc0ff86\") " Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.930015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60dc1db5-0c41-450a-8545-281cdbc0ff86-logs" (OuterVolumeSpecName: "logs") pod "60dc1db5-0c41-450a-8545-281cdbc0ff86" (UID: "60dc1db5-0c41-450a-8545-281cdbc0ff86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.930220 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60dc1db5-0c41-450a-8545-281cdbc0ff86-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.932639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60dc1db5-0c41-450a-8545-281cdbc0ff86-kube-api-access-tlnw5" (OuterVolumeSpecName: "kube-api-access-tlnw5") pod "60dc1db5-0c41-450a-8545-281cdbc0ff86" (UID: "60dc1db5-0c41-450a-8545-281cdbc0ff86"). InnerVolumeSpecName "kube-api-access-tlnw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.951109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-config-data" (OuterVolumeSpecName: "config-data") pod "60dc1db5-0c41-450a-8545-281cdbc0ff86" (UID: "60dc1db5-0c41-450a-8545-281cdbc0ff86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:08 crc kubenswrapper[4707]: I0121 16:01:08.965156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60dc1db5-0c41-450a-8545-281cdbc0ff86" (UID: "60dc1db5-0c41-450a-8545-281cdbc0ff86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.031479 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlnw5\" (UniqueName: \"kubernetes.io/projected/60dc1db5-0c41-450a-8545-281cdbc0ff86-kube-api-access-tlnw5\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.031508 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.031519 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60dc1db5-0c41-450a-8545-281cdbc0ff86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.191576 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75da6817-ed62-456c-a572-375325ca0bd5" path="/var/lib/kubelet/pods/75da6817-ed62-456c-a572-375325ca0bd5/volumes" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.211277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerStarted","Data":"82201ef1750ba378349ad8aa6af1ad86347056341ef06a8fd9c79319643362ff"} Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.212939 4707 generic.go:334] "Generic (PLEG): container finished" podID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerID="4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de" exitCode=0 Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.212977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"60dc1db5-0c41-450a-8545-281cdbc0ff86","Type":"ContainerDied","Data":"4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de"} Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.212999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"60dc1db5-0c41-450a-8545-281cdbc0ff86","Type":"ContainerDied","Data":"c57c256aa3ba25c626516ea311159ff6fad27e5f74399128b67a048d66d21e0b"} Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.213014 4707 scope.go:117] "RemoveContainer" containerID="4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.213120 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.231415 4707 scope.go:117] "RemoveContainer" containerID="40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.233082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.250841 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.258793 4707 scope.go:117] "RemoveContainer" containerID="4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de" Jan 21 16:01:09 crc kubenswrapper[4707]: E0121 16:01:09.259129 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de\": container with ID starting with 4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de not found: ID does not exist" containerID="4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.259161 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de"} err="failed to get container status \"4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de\": rpc error: code = NotFound desc = could not find container \"4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de\": container with ID starting with 4ca1397c0d6081fc496e860b186868e2b2c35e0d0ff9c5c90f38330b3181a4de not found: ID does not exist" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.259179 4707 scope.go:117] "RemoveContainer" containerID="40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829" Jan 21 16:01:09 crc kubenswrapper[4707]: E0121 16:01:09.259593 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829\": container with ID starting with 40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829 not found: ID does not exist" containerID="40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.259631 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829"} err="failed to get container status \"40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829\": rpc error: code = NotFound desc = could not find container \"40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829\": container with ID starting with 40ef5b43b8d417614d829e98b2210504284bb3004febaa4ad9461f7e0b4ae829 not found: ID does not exist" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.260884 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:09 crc kubenswrapper[4707]: E0121 16:01:09.261269 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-log" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.261292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" 
containerName="nova-api-log" Jan 21 16:01:09 crc kubenswrapper[4707]: E0121 16:01:09.261322 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-api" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.261329 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-api" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.261532 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-api" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.261561 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" containerName="nova-api-log" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.262449 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.264249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.264431 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.264544 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.268019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.442022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.442068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.442274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-config-data\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.442330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7qs\" (UniqueName: \"kubernetes.io/projected/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-kube-api-access-np7qs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.442467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.442528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-logs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-config-data\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7qs\" (UniqueName: \"kubernetes.io/projected/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-kube-api-access-np7qs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-logs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.544527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-logs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.546997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.547003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-config-data\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.547625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.547821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.558099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7qs\" (UniqueName: \"kubernetes.io/projected/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-kube-api-access-np7qs\") pod \"nova-api-0\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.579163 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.946110 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.946461 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:01:09 crc kubenswrapper[4707]: I0121 16:01:09.980217 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:09 crc kubenswrapper[4707]: W0121 16:01:09.999121 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77eb4906_622c_4eb0_8e20_e91b6d3a8e27.slice/crio-59aae2fe83ccfea1cfd931182372261bdc38b2b85b89753a81dfdc12645ad314 WatchSource:0}: Error finding container 59aae2fe83ccfea1cfd931182372261bdc38b2b85b89753a81dfdc12645ad314: Status 404 returned error can't find the container with id 59aae2fe83ccfea1cfd931182372261bdc38b2b85b89753a81dfdc12645ad314 Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.050327 4707 scope.go:117] "RemoveContainer" containerID="99af3d75a96b9e45347ec6f501959f5aaaf4cc00a9200522051fb91dc082665d" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.076056 4707 scope.go:117] "RemoveContainer" containerID="4785e6cbd1c41004f4ccc7f605931833ce69946a0be235c3ed27a893a8ce09b4" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.096228 4707 scope.go:117] "RemoveContainer" containerID="958b22b32e9e9e12c4b19bb7c2cc210c9881143654900671a1871c84040807a4" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.113030 4707 scope.go:117] "RemoveContainer" 
containerID="7b39b7154fa0b95548ea7fac51861a1df55416b47e244ccf9463cee5f36581c0" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.137859 4707 scope.go:117] "RemoveContainer" containerID="06a9ecf9a704b3cf801825df72825f3564b9b3123770eb8150d305a227bca3ff" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.155387 4707 scope.go:117] "RemoveContainer" containerID="0134dae6d06704b28d3c1a93c597121155514f8f85383d373be3fbff4f9697e9" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.226645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"77eb4906-622c-4eb0-8e20-e91b6d3a8e27","Type":"ContainerStarted","Data":"e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71"} Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.226679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"77eb4906-622c-4eb0-8e20-e91b6d3a8e27","Type":"ContainerStarted","Data":"59aae2fe83ccfea1cfd931182372261bdc38b2b85b89753a81dfdc12645ad314"} Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.228829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerStarted","Data":"18320d7d0fe5ff7086b31b3cb35d4c5a0cc62611fd7a0661a1b29d50cedc2e6f"} Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.241046 4707 scope.go:117] "RemoveContainer" containerID="d7d4a778b9df39e14feb7740f3a7817dd4ceb96bd7dc4108a8e2e99f350903d3" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.272949 4707 scope.go:117] "RemoveContainer" containerID="23f4a32980a3cc2291bacd76135da995267a0c14d4eb6427e7b2324696c9dd64" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.294144 4707 scope.go:117] "RemoveContainer" containerID="14d0a43f08a2e749aa63184db127afde2233c4d4ac8dfaab8d8f291ada1ef02e" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.313582 4707 scope.go:117] "RemoveContainer" containerID="a4d345f4aef43876ec0d8f9747eadb254c4cfb035eb0ac23f9808a04b5c70dd0" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.332149 4707 scope.go:117] "RemoveContainer" containerID="718935d494e8b85ea1bc82992ceafdac9be0e8ac69c479cdd2bb1b38dadd80df" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.347173 4707 scope.go:117] "RemoveContainer" containerID="e3b398a48d728c6bd65d6a353d60864071f543173e91de5b29ab7fb55a485387" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.361206 4707 scope.go:117] "RemoveContainer" containerID="c50519e027e1683f85b799afca4576a3862101aad86a8a98197856394645b0ed" Jan 21 16:01:10 crc kubenswrapper[4707]: I0121 16:01:10.375475 4707 scope.go:117] "RemoveContainer" containerID="5efc0d6113609a65c6c8c728c1819ba46a5a6053e62048e66910b566697c74eb" Jan 21 16:01:11 crc kubenswrapper[4707]: I0121 16:01:11.190047 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60dc1db5-0c41-450a-8545-281cdbc0ff86" path="/var/lib/kubelet/pods/60dc1db5-0c41-450a-8545-281cdbc0ff86/volumes" Jan 21 16:01:11 crc kubenswrapper[4707]: I0121 16:01:11.244117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerStarted","Data":"3972cf4fcd549a276993eab0ef23faf54b7c1c1e4e805563f5c0014b1ef5faeb"} Jan 21 16:01:11 crc kubenswrapper[4707]: I0121 16:01:11.245794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"77eb4906-622c-4eb0-8e20-e91b6d3a8e27","Type":"ContainerStarted","Data":"081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78"} Jan 21 16:01:11 crc kubenswrapper[4707]: I0121 16:01:11.266865 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.266848886 podStartE2EDuration="2.266848886s" podCreationTimestamp="2026-01-21 16:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:11.257158475 +0000 UTC m=+3568.438674698" watchObservedRunningTime="2026-01-21 16:01:11.266848886 +0000 UTC m=+3568.448365108" Jan 21 16:01:13 crc kubenswrapper[4707]: I0121 16:01:13.261265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerStarted","Data":"ceaf1dadb0a41262987a363b58f66fe315a552cc7470f8d46cfc49b76e95959e"} Jan 21 16:01:13 crc kubenswrapper[4707]: I0121 16:01:13.261663 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:13 crc kubenswrapper[4707]: I0121 16:01:13.281074 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.882843536 podStartE2EDuration="6.281060312s" podCreationTimestamp="2026-01-21 16:01:07 +0000 UTC" firstStartedPulling="2026-01-21 16:01:07.974435021 +0000 UTC m=+3565.155951243" lastFinishedPulling="2026-01-21 16:01:12.372651797 +0000 UTC m=+3569.554168019" observedRunningTime="2026-01-21 16:01:13.274407633 +0000 UTC m=+3570.455923856" watchObservedRunningTime="2026-01-21 16:01:13.281060312 +0000 UTC m=+3570.462576535" Jan 21 16:01:19 crc kubenswrapper[4707]: I0121 16:01:19.579764 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:19 crc kubenswrapper[4707]: I0121 16:01:19.580178 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:20 crc kubenswrapper[4707]: I0121 16:01:20.592928 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.41:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:20 crc kubenswrapper[4707]: I0121 16:01:20.592945 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.41:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:29 crc kubenswrapper[4707]: I0121 16:01:29.585194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:29 crc kubenswrapper[4707]: I0121 16:01:29.585863 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:29 crc kubenswrapper[4707]: I0121 16:01:29.586406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:29 crc kubenswrapper[4707]: I0121 16:01:29.589822 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:30 crc kubenswrapper[4707]: I0121 16:01:30.374341 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:30 crc kubenswrapper[4707]: I0121 16:01:30.379767 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:37 crc kubenswrapper[4707]: I0121 16:01:37.581779 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:39 crc kubenswrapper[4707]: I0121 16:01:39.946125 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:01:39 crc kubenswrapper[4707]: I0121 16:01:39.946497 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.227703 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.227965 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" containerName="openstackclient" containerID="cri-o://8dac5ac8a890cd9197910e542e914fc479cea9be24fdd5964752473d60226d8d" gracePeriod=2 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.248359 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.262631 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.262855 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-log" containerID="cri-o://51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.263251 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-metadata" containerID="cri-o://0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.283581 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v"] Jan 21 16:01:40 crc kubenswrapper[4707]: E0121 16:01:40.288178 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" containerName="openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.288202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" containerName="openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: 
I0121 16:01:40.288389 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" containerName="openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.289264 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.294419 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.294677 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="ovn-northd" containerID="cri-o://853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.294803 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="openstack-network-exporter" containerID="cri-o://bdec8c944354ba8edd4a696e67ef57315d22a16748e6ca2cf813eb67fe88a679" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.306420 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="23a190c1-7893-4f25-af82-4f97b95b4e68" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.325852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.326184 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="openstack-network-exporter" containerID="cri-o://0fc100dde8fb0c9e0faf72a71eaa71c4587eafd3440b5e4f0e1ef1e3d4861bcb" gracePeriod=300 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.343198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.343405 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-log" containerID="cri-o://e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.343532 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-api" containerID="cri-o://081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.360862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.361099 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="e94827e5-23b4-4693-a27e-b37a6f495523" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 
16:01:40.392884 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.394090 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.413113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.463420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.463615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58856a6-ee91-4aee-874e-118980038628-logs\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.464062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data-custom\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.464099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-combined-ca-bundle\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.464202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpjf7\" (UniqueName: \"kubernetes.io/projected/f58856a6-ee91-4aee-874e-118980038628-kube-api-access-mpjf7\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.471876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.506861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.507078 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="53e561f4-eb73-4a09-884c-0d5476044b72" containerName="nova-scheduler-scheduler" containerID="cri-o://a6140e292d56c1ea39d41e209b222050d7c5bcd750acd566047cd616b55a336c" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.527858 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.528052 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="e5f232af-6f89-4112-a5a1-5fd2093c2b61" containerName="memcached" containerID="cri-o://39951c1adfd56981da93394dd5526012af516ce4d8e4e81c75c853a81df75ff1" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.541290 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.542764 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.551581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.567903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.567969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58856a6-ee91-4aee-874e-118980038628-logs\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data-custom\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bbjd\" (UniqueName: \"kubernetes.io/projected/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-kube-api-access-2bbjd\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-combined-ca-bundle\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config-secret\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568184 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.568266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpjf7\" (UniqueName: \"kubernetes.io/projected/f58856a6-ee91-4aee-874e-118980038628-kube-api-access-mpjf7\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.578290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58856a6-ee91-4aee-874e-118980038628-logs\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.582570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data-custom\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.589264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-combined-ca-bundle\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.600032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.600228 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-log" containerID="cri-o://f13e4d8a65800b110a365933161c21cf2baf677a94c2caac3c3ded57bbda9c10" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.600608 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-httpd" containerID="cri-o://5f46ece4d49ebd6ff61477cd26b6c7e5b21e6dfbdf8a342e363b3fa2825bf3d7" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.609330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpjf7\" (UniqueName: 
\"kubernetes.io/projected/f58856a6-ee91-4aee-874e-118980038628-kube-api-access-mpjf7\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.631863 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.633312 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.668614 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.669723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bbjd\" (UniqueName: \"kubernetes.io/projected/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-kube-api-access-2bbjd\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.669916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config-secret\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.670043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.670148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.670266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data-custom\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.670353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-combined-ca-bundle\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.670451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892fe45-66a3-4ddf-b0db-b50df71c6493-logs\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " 
pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.670975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.671022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27s6q\" (UniqueName: \"kubernetes.io/projected/c892fe45-66a3-4ddf-b0db-b50df71c6493-kube-api-access-27s6q\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.671755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.676705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config-secret\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.686450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.689355 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.704542 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="ovsdbserver-nb" containerID="cri-o://2f3ffa31dd88b33ac598f0e6e8fcfb46ab698b734a73b7deb6cda430c6834638" gracePeriod=300 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.704946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bbjd\" (UniqueName: \"kubernetes.io/projected/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-kube-api-access-2bbjd\") pod \"openstackclient\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.705419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data\") pod \"barbican-keystone-listener-68f4cbbf8-p929v\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.710857 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 
16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.711093 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="cinder-scheduler" containerID="cri-o://3b3595d781817f63d2073805ae987be2763662eff7d9cb701993e99941539a30" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.711386 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="probe" containerID="cri-o://35687d4e9d2f04767e4e660723736595964bd33eb7c312847844b1f32e3a1a5a" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.739253 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.739673 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api-log" containerID="cri-o://693aa9dee02beefab5f62e72db83696878302f9d43403043bdbdd0a96bbd28b4" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.740081 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api" containerID="cri-o://8e2da5aaf70f2dca26c0a7481b9da9f4a95637fa22e3641c79f7ab645fd6829e" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.755380 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5865994c8-cs68b"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.756918 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.772994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-combined-ca-bundle\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-public-tls-certs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data-custom\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data-custom\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-combined-ca-bundle\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cb3219d-64cd-4b49-adaa-723e74405eda-logs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892fe45-66a3-4ddf-b0db-b50df71c6493-logs\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wcqkg\" (UniqueName: \"kubernetes.io/projected/8cb3219d-64cd-4b49-adaa-723e74405eda-kube-api-access-wcqkg\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-internal-tls-certs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.773289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27s6q\" (UniqueName: \"kubernetes.io/projected/c892fe45-66a3-4ddf-b0db-b50df71c6493-kube-api-access-27s6q\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.774667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892fe45-66a3-4ddf-b0db-b50df71c6493-logs\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.779264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data-custom\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.784571 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.785139 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="openstack-network-exporter" containerID="cri-o://b2677b78e02f90668c6187316a0c1e86ff839677dd5fcae728e200c597003229" gracePeriod=300 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.793835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.795328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-combined-ca-bundle\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: 
\"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.812292 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.813654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27s6q\" (UniqueName: \"kubernetes.io/projected/c892fe45-66a3-4ddf-b0db-b50df71c6493-kube-api-access-27s6q\") pod \"barbican-worker-7457d87b7c-v8nld\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.833854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.891031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5865994c8-cs68b"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cb3219d-64cd-4b49-adaa-723e74405eda-logs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911744 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-scripts\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcqkg\" (UniqueName: \"kubernetes.io/projected/8cb3219d-64cd-4b49-adaa-723e74405eda-kube-api-access-wcqkg\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-combined-ca-bundle\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-internal-tls-certs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.911944 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-internal-tls-certs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.912003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-public-tls-certs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.912325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cb3219d-64cd-4b49-adaa-723e74405eda-logs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.913656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52d68\" (UniqueName: \"kubernetes.io/projected/d7d28e68-71cb-476f-a831-5536b2686514-kube-api-access-52d68\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.913845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-combined-ca-bundle\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.913947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-config-data\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.914032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-public-tls-certs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.914108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d28e68-71cb-476f-a831-5536b2686514-logs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.914236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data-custom\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " 
pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.917051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data-custom\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.921839 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.927257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-public-tls-certs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.932354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.932570 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="ovsdbserver-sb" containerID="cri-o://67452b60079d3b2cf3ba3e907030f0defbdf50d69c12ef1aad42ab4d0e43dbcf" gracePeriod=300 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.932739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-internal-tls-certs\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.934894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-combined-ca-bundle\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.936209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcqkg\" (UniqueName: \"kubernetes.io/projected/8cb3219d-64cd-4b49-adaa-723e74405eda-kube-api-access-wcqkg\") pod \"barbican-api-85b75f6d64-zntqc\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.975781 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5797484f74-ph7p9"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.980540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5797484f74-ph7p9"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.980627 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.985780 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.985949 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.998875 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.999415 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-log" containerID="cri-o://528b18552f9a4743a109d4c0a9bde81e896224749b7ecb137a938d625978f1ba" gracePeriod=30 Jan 21 16:01:40 crc kubenswrapper[4707]: I0121 16:01:40.999486 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-httpd" containerID="cri-o://e988b288a8fcc44d803facd8cd151da8bf58f68ac8ebb2aef3d85ab144b16900" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.013453 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-687544f45f-6qmlx"] Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.013534 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-scripts\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-combined-ca-bundle\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-internal-tls-certs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015537 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-public-tls-certs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52d68\" (UniqueName: \"kubernetes.io/projected/d7d28e68-71cb-476f-a831-5536b2686514-kube-api-access-52d68\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-config-data\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.015630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d28e68-71cb-476f-a831-5536b2686514-logs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.016418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d28e68-71cb-476f-a831-5536b2686514-logs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.017997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-687544f45f-6qmlx"] Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.018656 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.021517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-scripts\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.030050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-config-data\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.036211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-public-tls-certs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.043904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52d68\" (UniqueName: \"kubernetes.io/projected/d7d28e68-71cb-476f-a831-5536b2686514-kube-api-access-52d68\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.044055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-combined-ca-bundle\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.046179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-internal-tls-certs\") pod \"placement-5865994c8-cs68b\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.063855 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.103078 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.118935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xts5f\" (UniqueName: \"kubernetes.io/projected/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-kube-api-access-xts5f\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-scripts\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-fernet-keys\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-config\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-internal-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-credential-keys\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-public-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-ovndb-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-public-tls-certs\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-combined-ca-bundle\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-internal-tls-certs\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-httpd-config\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-combined-ca-bundle\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22wpg\" (UniqueName: \"kubernetes.io/projected/b2395d51-a312-47ba-9136-87fc6e74bf2c-kube-api-access-22wpg\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.119472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-config-data\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.127124 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerName="galera" containerID="cri-o://1c1f119c65c4fce9675289522fce83be1fb184f19095b0c15b0efe892c668d46" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-scripts\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226358 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-fernet-keys\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-config\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-internal-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-credential-keys\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-public-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-ovndb-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-public-tls-certs\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-combined-ca-bundle\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-internal-tls-certs\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226762 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-httpd-config\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-combined-ca-bundle\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22wpg\" (UniqueName: \"kubernetes.io/projected/b2395d51-a312-47ba-9136-87fc6e74bf2c-kube-api-access-22wpg\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-config-data\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.226992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xts5f\" (UniqueName: \"kubernetes.io/projected/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-kube-api-access-xts5f\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.247648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-config-data\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.286096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-scripts\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.286464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-public-tls-certs\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.287332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-ovndb-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.287718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-combined-ca-bundle\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.288079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-internal-tls-certs\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.288489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-fernet-keys\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.288894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-credential-keys\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.290424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-httpd-config\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.291210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-config\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.291442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22wpg\" (UniqueName: \"kubernetes.io/projected/b2395d51-a312-47ba-9136-87fc6e74bf2c-kube-api-access-22wpg\") pod \"keystone-687544f45f-6qmlx\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.294331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-internal-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.296503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-combined-ca-bundle\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.313859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xts5f\" (UniqueName: 
\"kubernetes.io/projected/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-kube-api-access-xts5f\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.335998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-public-tls-certs\") pod \"neutron-5797484f74-ph7p9\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.433290 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="galera" containerID="cri-o://1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.450572 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.503439 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.529869 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.530032 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="8f8b72f5-0735-4187-aaec-0fb50df56674" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.541929 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8a64a9a-c654-4d05-8504-0839ba022204" containerID="693aa9dee02beefab5f62e72db83696878302f9d43403043bdbdd0a96bbd28b4" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.542296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b8a64a9a-c654-4d05-8504-0839ba022204","Type":"ContainerDied","Data":"693aa9dee02beefab5f62e72db83696878302f9d43403043bdbdd0a96bbd28b4"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.544322 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerID="f13e4d8a65800b110a365933161c21cf2baf677a94c2caac3c3ded57bbda9c10" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.544480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fc9779e5-4e04-424b-b1e3-388ebf744d02","Type":"ContainerDied","Data":"f13e4d8a65800b110a365933161c21cf2baf677a94c2caac3c3ded57bbda9c10"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.566917 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerID="35687d4e9d2f04767e4e660723736595964bd33eb7c312847844b1f32e3a1a5a" exitCode=0 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.566992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" 
event={"ID":"c6da68d4-659d-4cf2-a796-899b51ef8939","Type":"ContainerDied","Data":"35687d4e9d2f04767e4e660723736595964bd33eb7c312847844b1f32e3a1a5a"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.571129 4707 generic.go:334] "Generic (PLEG): container finished" podID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerID="e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.571192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"77eb4906-622c-4eb0-8e20-e91b6d3a8e27","Type":"ContainerDied","Data":"e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.585473 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5f232af-6f89-4112-a5a1-5fd2093c2b61" containerID="39951c1adfd56981da93394dd5526012af516ce4d8e4e81c75c853a81df75ff1" exitCode=0 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.585575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"e5f232af-6f89-4112-a5a1-5fd2093c2b61","Type":"ContainerDied","Data":"39951c1adfd56981da93394dd5526012af516ce4d8e4e81c75c853a81df75ff1"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.593878 4707 generic.go:334] "Generic (PLEG): container finished" podID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerID="bdec8c944354ba8edd4a696e67ef57315d22a16748e6ca2cf813eb67fe88a679" exitCode=2 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.593912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"40dfa261-8913-4472-8f97-ac7b04d1f349","Type":"ContainerDied","Data":"bdec8c944354ba8edd4a696e67ef57315d22a16748e6ca2cf813eb67fe88a679"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.595710 4707 generic.go:334] "Generic (PLEG): container finished" podID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerID="528b18552f9a4743a109d4c0a9bde81e896224749b7ecb137a938d625978f1ba" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.595753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"32eaeddc-f8b5-4c3e-a456-f723f0698b30","Type":"ContainerDied","Data":"528b18552f9a4743a109d4c0a9bde81e896224749b7ecb137a938d625978f1ba"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.597260 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5adfc17-d372-40cd-a326-5af09f323dce" containerID="51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.597319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5adfc17-d372-40cd-a326-5af09f323dce","Type":"ContainerDied","Data":"51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.598910 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_c1857e43-c160-4d63-ba36-3b9cb96ebadc/ovsdbserver-sb/0.log" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.598946 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerID="b2677b78e02f90668c6187316a0c1e86ff839677dd5fcae728e200c597003229" exitCode=2 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.598957 4707 
generic.go:334] "Generic (PLEG): container finished" podID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerID="67452b60079d3b2cf3ba3e907030f0defbdf50d69c12ef1aad42ab4d0e43dbcf" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.598992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1857e43-c160-4d63-ba36-3b9cb96ebadc","Type":"ContainerDied","Data":"b2677b78e02f90668c6187316a0c1e86ff839677dd5fcae728e200c597003229"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.599008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1857e43-c160-4d63-ba36-3b9cb96ebadc","Type":"ContainerDied","Data":"67452b60079d3b2cf3ba3e907030f0defbdf50d69c12ef1aad42ab4d0e43dbcf"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.606171 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_909f1cc7-acf2-4e84-ae89-4af8bd83962d/ovsdbserver-nb/0.log" Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.606213 4707 generic.go:334] "Generic (PLEG): container finished" podID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerID="0fc100dde8fb0c9e0faf72a71eaa71c4587eafd3440b5e4f0e1ef1e3d4861bcb" exitCode=2 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.606229 4707 generic.go:334] "Generic (PLEG): container finished" podID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerID="2f3ffa31dd88b33ac598f0e6e8fcfb46ab698b734a73b7deb6cda430c6834638" exitCode=143 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.606245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"909f1cc7-acf2-4e84-ae89-4af8bd83962d","Type":"ContainerDied","Data":"0fc100dde8fb0c9e0faf72a71eaa71c4587eafd3440b5e4f0e1ef1e3d4861bcb"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.606265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"909f1cc7-acf2-4e84-ae89-4af8bd83962d","Type":"ContainerDied","Data":"2f3ffa31dd88b33ac598f0e6e8fcfb46ab698b734a73b7deb6cda430c6834638"} Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.692002 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.692230 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-central-agent" containerID="cri-o://82201ef1750ba378349ad8aa6af1ad86347056341ef06a8fd9c79319643362ff" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.692571 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="proxy-httpd" containerID="cri-o://ceaf1dadb0a41262987a363b58f66fe315a552cc7470f8d46cfc49b76e95959e" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.692622 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="sg-core" containerID="cri-o://3972cf4fcd549a276993eab0ef23faf54b7c1c1e4e805563f5c0014b1ef5faeb" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.692658 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-notification-agent" containerID="cri-o://18320d7d0fe5ff7086b31b3cb35d4c5a0cc62611fd7a0661a1b29d50cedc2e6f" gracePeriod=30 Jan 21 16:01:41 crc kubenswrapper[4707]: I0121 16:01:41.960067 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.048565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-memcached-tls-certs\") pod \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.048624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kolla-config\") pod \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.048700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-combined-ca-bundle\") pod \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.048746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmrtv\" (UniqueName: \"kubernetes.io/projected/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kube-api-access-jmrtv\") pod \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.048786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-config-data\") pod \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\" (UID: \"e5f232af-6f89-4112-a5a1-5fd2093c2b61\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.049320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e5f232af-6f89-4112-a5a1-5fd2093c2b61" (UID: "e5f232af-6f89-4112-a5a1-5fd2093c2b61"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.049711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-config-data" (OuterVolumeSpecName: "config-data") pod "e5f232af-6f89-4112-a5a1-5fd2093c2b61" (UID: "e5f232af-6f89-4112-a5a1-5fd2093c2b61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.056345 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_909f1cc7-acf2-4e84-ae89-4af8bd83962d/ovsdbserver-nb/0.log" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.056416 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.058585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kube-api-access-jmrtv" (OuterVolumeSpecName: "kube-api-access-jmrtv") pod "e5f232af-6f89-4112-a5a1-5fd2093c2b61" (UID: "e5f232af-6f89-4112-a5a1-5fd2093c2b61"). InnerVolumeSpecName "kube-api-access-jmrtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.071104 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_c1857e43-c160-4d63-ba36-3b9cb96ebadc/ovsdbserver-sb/0.log" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.071160 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.074971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f232af-6f89-4112-a5a1-5fd2093c2b61" (UID: "e5f232af-6f89-4112-a5a1-5fd2093c2b61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.125095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "e5f232af-6f89-4112-a5a1-5fd2093c2b61" (UID: "e5f232af-6f89-4112-a5a1-5fd2093c2b61"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.150607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-scripts\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.150650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.150671 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-combined-ca-bundle\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.150746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.150899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-combined-ca-bundle\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.150981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxnb5\" (UniqueName: \"kubernetes.io/projected/c1857e43-c160-4d63-ba36-3b9cb96ebadc-kube-api-access-xxnb5\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-metrics-certs-tls-certs\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-metrics-certs-tls-certs\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-config\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdb-rundir\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: 
\"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-scripts\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdbserver-nb-tls-certs\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdb-rundir\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdbserver-sb-tls-certs\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fr5b\" (UniqueName: \"kubernetes.io/projected/909f1cc7-acf2-4e84-ae89-4af8bd83962d-kube-api-access-9fr5b\") pod \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\" (UID: \"909f1cc7-acf2-4e84-ae89-4af8bd83962d\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-config\") pod \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\" (UID: \"c1857e43-c160-4d63-ba36-3b9cb96ebadc\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.151704 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.159107 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.159121 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f232af-6f89-4112-a5a1-5fd2093c2b61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.159131 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmrtv\" (UniqueName: \"kubernetes.io/projected/e5f232af-6f89-4112-a5a1-5fd2093c2b61-kube-api-access-jmrtv\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.159169 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f232af-6f89-4112-a5a1-5fd2093c2b61-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 
crc kubenswrapper[4707]: I0121 16:01:42.153491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-scripts" (OuterVolumeSpecName: "scripts") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.153719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-config" (OuterVolumeSpecName: "config") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.153827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.155467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.155716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-config" (OuterVolumeSpecName: "config") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.156747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.157416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1857e43-c160-4d63-ba36-3b9cb96ebadc-kube-api-access-xxnb5" (OuterVolumeSpecName: "kube-api-access-xxnb5") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "kube-api-access-xxnb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.158435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-scripts" (OuterVolumeSpecName: "scripts") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.159840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909f1cc7-acf2-4e84-ae89-4af8bd83962d-kube-api-access-9fr5b" (OuterVolumeSpecName: "kube-api-access-9fr5b") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "kube-api-access-9fr5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.171855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.219291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.231290 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.232055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.234905 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.242776 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.242888 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="8f8b72f5-0735-4187-aaec-0fb50df56674" containerName="nova-cell1-conductor-conductor" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263225 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fr5b\" (UniqueName: \"kubernetes.io/projected/909f1cc7-acf2-4e84-ae89-4af8bd83962d-kube-api-access-9fr5b\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263254 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263265 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263340 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263354 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263368 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263377 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263385 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxnb5\" (UniqueName: \"kubernetes.io/projected/c1857e43-c160-4d63-ba36-3b9cb96ebadc-kube-api-access-xxnb5\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263394 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/909f1cc7-acf2-4e84-ae89-4af8bd83962d-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263411 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263421 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1857e43-c160-4d63-ba36-3b9cb96ebadc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.263428 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.275726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.289443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "909f1cc7-acf2-4e84-ae89-4af8bd83962d" (UID: "909f1cc7-acf2-4e84-ae89-4af8bd83962d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.310759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.311081 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.316468 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.332172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c1857e43-c160-4d63-ba36-3b9cb96ebadc" (UID: "c1857e43-c160-4d63-ba36-3b9cb96ebadc"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.364893 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.365352 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.365400 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.365410 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.365419 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/909f1cc7-acf2-4e84-ae89-4af8bd83962d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.365429 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1857e43-c160-4d63-ba36-3b9cb96ebadc-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.365436 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.384828 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-combined-ca-bundle\") pod \"e94827e5-23b4-4693-a27e-b37a6f495523\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-vencrypt-tls-certs\") pod \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-combined-ca-bundle\") pod \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlsz8\" (UniqueName: \"kubernetes.io/projected/e94827e5-23b4-4693-a27e-b37a6f495523-kube-api-access-mlsz8\") pod \"e94827e5-23b4-4693-a27e-b37a6f495523\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-nova-novncproxy-tls-certs\") pod \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-config-data\") pod \"e94827e5-23b4-4693-a27e-b37a6f495523\" (UID: \"e94827e5-23b4-4693-a27e-b37a6f495523\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf57v\" (UniqueName: \"kubernetes.io/projected/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-kube-api-access-kf57v\") pod \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.466897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-config-data\") pod \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\" (UID: \"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8\") " Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.470079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94827e5-23b4-4693-a27e-b37a6f495523-kube-api-access-mlsz8" (OuterVolumeSpecName: "kube-api-access-mlsz8") pod "e94827e5-23b4-4693-a27e-b37a6f495523" (UID: "e94827e5-23b4-4693-a27e-b37a6f495523"). InnerVolumeSpecName "kube-api-access-mlsz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.477819 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-kube-api-access-kf57v" (OuterVolumeSpecName: "kube-api-access-kf57v") pod "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" (UID: "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8"). InnerVolumeSpecName "kube-api-access-kf57v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.504661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-config-data" (OuterVolumeSpecName: "config-data") pod "e94827e5-23b4-4693-a27e-b37a6f495523" (UID: "e94827e5-23b4-4693-a27e-b37a6f495523"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.539537 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.549134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.559740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e94827e5-23b4-4693-a27e-b37a6f495523" (UID: "e94827e5-23b4-4693-a27e-b37a6f495523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.560737 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5865994c8-cs68b"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.570164 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlsz8\" (UniqueName: \"kubernetes.io/projected/e94827e5-23b4-4693-a27e-b37a6f495523-kube-api-access-mlsz8\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.570189 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.570200 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf57v\" (UniqueName: \"kubernetes.io/projected/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-kube-api-access-kf57v\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.570207 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94827e5-23b4-4693-a27e-b37a6f495523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.583627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-config-data" (OuterVolumeSpecName: "config-data") pod "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" (UID: "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.583965 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.609800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" (UID: "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.610205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" (UID: "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.630240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" (UID: "1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.654465 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_909f1cc7-acf2-4e84-ae89-4af8bd83962d/ovsdbserver-nb/0.log" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.654531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"909f1cc7-acf2-4e84-ae89-4af8bd83962d","Type":"ContainerDied","Data":"ddb9006b93d05c8a539c45bb05dd6e7dcb7f50de147d5929f26a88d4c853ac6f"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.654561 4707 scope.go:117] "RemoveContainer" containerID="0fc100dde8fb0c9e0faf72a71eaa71c4587eafd3440b5e4f0e1ef1e3d4861bcb" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.654733 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.658768 4707 generic.go:334] "Generic (PLEG): container finished" podID="1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" containerID="c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.658804 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.658888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8","Type":"ContainerDied","Data":"c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.658915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8","Type":"ContainerDied","Data":"02fef2ffbd4f577563d6f63f4339e36926f110f902e9236bf6f7744c3fd53913"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.673100 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.673146 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.673158 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.673173 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.675211 4707 generic.go:334] "Generic (PLEG): container finished" podID="e94827e5-23b4-4693-a27e-b37a6f495523" containerID="9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.675368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e94827e5-23b4-4693-a27e-b37a6f495523","Type":"ContainerDied","Data":"9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.675393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e94827e5-23b4-4693-a27e-b37a6f495523","Type":"ContainerDied","Data":"cda4bd24da400b8c27af123feaae0312e1e7f278bcf1668a4c3f20bf91ed7393"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.675464 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.677697 4707 generic.go:334] "Generic (PLEG): container finished" podID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerID="1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.677763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"10c95d6a-958d-45bf-8cf2-c9b915727cd4","Type":"ContainerDied","Data":"1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.678906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" event={"ID":"f58856a6-ee91-4aee-874e-118980038628","Type":"ContainerStarted","Data":"9d1ddccc903bfca7efd8d342f1062183d9a041c519e23916f08d1983b65772c6"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.688343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"e5f232af-6f89-4112-a5a1-5fd2093c2b61","Type":"ContainerDied","Data":"c9e92d7b6b68e40da63224c6160c0c0ba5da9124af56543eadab466a5c80785a"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.688465 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.694455 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_c1857e43-c160-4d63-ba36-3b9cb96ebadc/ovsdbserver-sb/0.log" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.694517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1857e43-c160-4d63-ba36-3b9cb96ebadc","Type":"ContainerDied","Data":"24b232d7760879a2a07eb128986900b6964a72b4f6d77e4aa82d63cdec49afcf"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.694609 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.715918 4707 generic.go:334] "Generic (PLEG): container finished" podID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerID="1c1f119c65c4fce9675289522fce83be1fb184f19095b0c15b0efe892c668d46" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.715988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"614dacb7-b2d4-4d19-b93d-c922660ad318","Type":"ContainerDied","Data":"1c1f119c65c4fce9675289522fce83be1fb184f19095b0c15b0efe892c668d46"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.716006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"614dacb7-b2d4-4d19-b93d-c922660ad318","Type":"ContainerDied","Data":"d71249bd7452f771849cb799f45131670980f0631ff6b22cb430d6f58d21911f"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.716017 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71249bd7452f771849cb799f45131670980f0631ff6b22cb430d6f58d21911f" Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.729042 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b is running failed: container process not found" containerID="1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.734249 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b is running failed: container process not found" containerID="1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.734986 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b is running failed: container process not found" containerID="1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:01:42 crc kubenswrapper[4707]: E0121 16:01:42.735016 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="galera" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.741915 4707 generic.go:334] "Generic (PLEG): container finished" podID="a788c56d-377f-4859-910d-a623173f0a74" containerID="ceaf1dadb0a41262987a363b58f66fe315a552cc7470f8d46cfc49b76e95959e" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.741944 4707 generic.go:334] "Generic (PLEG): container finished" podID="a788c56d-377f-4859-910d-a623173f0a74" 
containerID="3972cf4fcd549a276993eab0ef23faf54b7c1c1e4e805563f5c0014b1ef5faeb" exitCode=2 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.741952 4707 generic.go:334] "Generic (PLEG): container finished" podID="a788c56d-377f-4859-910d-a623173f0a74" containerID="82201ef1750ba378349ad8aa6af1ad86347056341ef06a8fd9c79319643362ff" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.742000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerDied","Data":"ceaf1dadb0a41262987a363b58f66fe315a552cc7470f8d46cfc49b76e95959e"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.742024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerDied","Data":"3972cf4fcd549a276993eab0ef23faf54b7c1c1e4e805563f5c0014b1ef5faeb"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.742035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerDied","Data":"82201ef1750ba378349ad8aa6af1ad86347056341ef06a8fd9c79319643362ff"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.746612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" event={"ID":"d7d28e68-71cb-476f-a831-5536b2686514","Type":"ContainerStarted","Data":"b14731efe93702798fe2754503d984c60b5d687954099a5d9a9492997d0ada09"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.747861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" event={"ID":"c892fe45-66a3-4ddf-b0db-b50df71c6493","Type":"ContainerStarted","Data":"f540632b4ec7cb92d49b5808b375c6c827924357bbf777fc5cfeed459ef2f207"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.749466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" event={"ID":"8cb3219d-64cd-4b49-adaa-723e74405eda","Type":"ContainerStarted","Data":"4b87ba974014fb0245b4eb81f177a643e50e1d85cf74e1442f205407f5c2ab9d"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.755171 4707 generic.go:334] "Generic (PLEG): container finished" podID="23a190c1-7893-4f25-af82-4f97b95b4e68" containerID="8dac5ac8a890cd9197910e542e914fc479cea9be24fdd5964752473d60226d8d" exitCode=137 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.757235 4707 generic.go:334] "Generic (PLEG): container finished" podID="53e561f4-eb73-4a09-884c-0d5476044b72" containerID="a6140e292d56c1ea39d41e209b222050d7c5bcd750acd566047cd616b55a336c" exitCode=0 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.757281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53e561f4-eb73-4a09-884c-0d5476044b72","Type":"ContainerDied","Data":"a6140e292d56c1ea39d41e209b222050d7c5bcd750acd566047cd616b55a336c"} Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.825464 4707 scope.go:117] "RemoveContainer" containerID="2f3ffa31dd88b33ac598f0e6e8fcfb46ab698b734a73b7deb6cda430c6834638" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.938886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5797484f74-ph7p9"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.946001 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.955760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-687544f45f-6qmlx"] Jan 21 16:01:42 crc kubenswrapper[4707]: W0121 16:01:42.982096 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08d2edd9_a93d_4bb9_ac1e_6a23494df48c.slice/crio-e4c5b63cd11f873ed2a55b4f24dcbe0506e5d934f705d5c8ab970f8724d61383 WatchSource:0}: Error finding container e4c5b63cd11f873ed2a55b4f24dcbe0506e5d934f705d5c8ab970f8724d61383: Status 404 returned error can't find the container with id e4c5b63cd11f873ed2a55b4f24dcbe0506e5d934f705d5c8ab970f8724d61383 Jan 21 16:01:42 crc kubenswrapper[4707]: W0121 16:01:42.982554 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2395d51_a312_47ba_9136_87fc6e74bf2c.slice/crio-3dbd485ddc9bfa70a1a342ac567d2dfd4359a73aa0280b6102ad6e894baab869 WatchSource:0}: Error finding container 3dbd485ddc9bfa70a1a342ac567d2dfd4359a73aa0280b6102ad6e894baab869: Status 404 returned error can't find the container with id 3dbd485ddc9bfa70a1a342ac567d2dfd4359a73aa0280b6102ad6e894baab869 Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.988363 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:42 crc kubenswrapper[4707]: I0121 16:01:42.998946 4707 scope.go:117] "RemoveContainer" containerID="c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.039935 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.048774 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.072970 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.073040 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="ovn-northd" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-generated\") pod 
\"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-galera-tls-certs\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-kolla-config\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-combined-ca-bundle\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v25q5\" (UniqueName: \"kubernetes.io/projected/614dacb7-b2d4-4d19-b93d-c922660ad318-kube-api-access-v25q5\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-default\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.079825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-operator-scripts\") pod \"614dacb7-b2d4-4d19-b93d-c922660ad318\" (UID: \"614dacb7-b2d4-4d19-b93d-c922660ad318\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.081433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.082027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.082762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.082984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.084004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/614dacb7-b2d4-4d19-b93d-c922660ad318-kube-api-access-v25q5" (OuterVolumeSpecName: "kube-api-access-v25q5") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "kube-api-access-v25q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.087127 4707 scope.go:117] "RemoveContainer" containerID="c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.091386 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420\": container with ID starting with c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420 not found: ID does not exist" containerID="c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.091414 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420"} err="failed to get container status \"c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420\": rpc error: code = NotFound desc = could not find container \"c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420\": container with ID starting with c69ce275f5dbbcd5d4074878cc3308a096a3a30da9221f72246a7e94a4dfd420 not found: ID does not exist" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.091429 4707 scope.go:117] "RemoveContainer" containerID="9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.101406 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.115019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.123647 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.127545 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.140859 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.145959 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146359 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="mysql-bootstrap" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146376 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="mysql-bootstrap" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146391 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="ovsdbserver-nb" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146397 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="ovsdbserver-nb" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerName="mysql-bootstrap" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerName="mysql-bootstrap" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="openstack-network-exporter" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="openstack-network-exporter" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146444 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146450 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146458 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="ovsdbserver-sb" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146464 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="ovsdbserver-sb" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146474 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerName="galera" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146479 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerName="galera" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146490 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="openstack-network-exporter" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146497 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="openstack-network-exporter" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146505 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e561f4-eb73-4a09-884c-0d5476044b72" containerName="nova-scheduler-scheduler" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146510 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e561f4-eb73-4a09-884c-0d5476044b72" containerName="nova-scheduler-scheduler" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146521 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94827e5-23b4-4693-a27e-b37a6f495523" containerName="nova-cell0-conductor-conductor" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94827e5-23b4-4693-a27e-b37a6f495523" containerName="nova-cell0-conductor-conductor" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146535 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="galera" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146540 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="galera" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.146547 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f232af-6f89-4112-a5a1-5fd2093c2b61" containerName="memcached" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146552 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f232af-6f89-4112-a5a1-5fd2093c2b61" containerName="memcached" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146725 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f232af-6f89-4112-a5a1-5fd2093c2b61" containerName="memcached" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146738 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="openstack-network-exporter" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146750 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94827e5-23b4-4693-a27e-b37a6f495523" containerName="nova-cell0-conductor-conductor" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146757 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" containerName="galera" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146769 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="openstack-network-exporter" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" containerName="galera" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146790 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e561f4-eb73-4a09-884c-0d5476044b72" containerName="nova-scheduler-scheduler" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" containerName="ovsdbserver-sb" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146825 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" 
containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.146832 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" containerName="ovsdbserver-nb" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.147414 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.152562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.152908 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.153556 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.154850 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.163879 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.165090 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.169925 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.180962 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-galera-tls-certs\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-generated\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xprz2\" (UniqueName: \"kubernetes.io/projected/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kube-api-access-xprz2\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-default\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-combined-ca-bundle\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.181962 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kolla-config\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.182051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-operator-scripts\") pod \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\" (UID: \"10c95d6a-958d-45bf-8cf2-c9b915727cd4\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.182467 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.182478 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.182488 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v25q5\" (UniqueName: \"kubernetes.io/projected/614dacb7-b2d4-4d19-b93d-c922660ad318-kube-api-access-v25q5\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.182495 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.182504 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614dacb7-b2d4-4d19-b93d-c922660ad318-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.183493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.184575 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.186337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.189386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kube-api-access-xprz2" (OuterVolumeSpecName: "kube-api-access-xprz2") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "kube-api-access-xprz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.190929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.199205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.215079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.260945 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8" path="/var/lib/kubelet/pods/1cb83aac-11ee-4ce1-8d4f-f634bc2fb8f8/volumes" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.261778 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94827e5-23b4-4693-a27e-b37a6f495523" path="/var/lib/kubelet/pods/e94827e5-23b4-4693-a27e-b37a6f495523/volumes" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.283208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcr4\" (UniqueName: \"kubernetes.io/projected/53e561f4-eb73-4a09-884c-0d5476044b72-kube-api-access-rxcr4\") pod \"53e561f4-eb73-4a09-884c-0d5476044b72\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.283448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-combined-ca-bundle\") pod \"53e561f4-eb73-4a09-884c-0d5476044b72\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.283679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-config-data\") pod \"53e561f4-eb73-4a09-884c-0d5476044b72\" (UID: \"53e561f4-eb73-4a09-884c-0d5476044b72\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfwl\" (UniqueName: \"kubernetes.io/projected/71a22bb8-e120-4ad5-b4e7-171872ed4de0-kube-api-access-kcfwl\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sjgd\" (UniqueName: \"kubernetes.io/projected/4af60f39-57ca-4595-be87-1d1a0c869d72-kube-api-access-7sjgd\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284759 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284841 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xprz2\" (UniqueName: \"kubernetes.io/projected/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kube-api-access-xprz2\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.284967 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.286436 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10c95d6a-958d-45bf-8cf2-c9b915727cd4-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.286475 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.288255 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.288290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.288312 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.288321 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.288333 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.289704 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.293527 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.295065 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.295324 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.295437 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-xp8nc" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.295629 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.300174 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.301626 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.302177 4707 scope.go:117] "RemoveContainer" containerID="9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.307268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-tgjb6" Jan 21 16:01:43 crc kubenswrapper[4707]: E0121 16:01:43.307881 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941\": container with ID starting with 9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941 not found: ID does not exist" containerID="9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.307903 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941"} err="failed to get container status \"9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941\": rpc error: code = NotFound desc = could not find container \"9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941\": container with ID starting with 9bd401e590cb4d7a27ce013a49ccce49f3e505e05e33d3acac88330b53e3a941 not found: ID does not exist" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.307922 4707 scope.go:117] "RemoveContainer" containerID="39951c1adfd56981da93394dd5526012af516ce4d8e4e81c75c853a81df75ff1" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.308538 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.308726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.324980 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.330204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e561f4-eb73-4a09-884c-0d5476044b72-kube-api-access-rxcr4" (OuterVolumeSpecName: "kube-api-access-rxcr4") pod "53e561f4-eb73-4a09-884c-0d5476044b72" (UID: "53e561f4-eb73-4a09-884c-0d5476044b72"). InnerVolumeSpecName "kube-api-access-rxcr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.333037 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.342089 4707 scope.go:117] "RemoveContainer" containerID="b2677b78e02f90668c6187316a0c1e86ff839677dd5fcae728e200c597003229" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.343244 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.345693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.348882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.359321 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.364424 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.368992 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.381184 4707 scope.go:117] "RemoveContainer" containerID="67452b60079d3b2cf3ba3e907030f0defbdf50d69c12ef1aad42ab4d0e43dbcf" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.385316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.385735 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-clk78" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.385934 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.386087 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.388255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config\") pod \"23a190c1-7893-4f25-af82-4f97b95b4e68\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.388318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config-secret\") pod \"23a190c1-7893-4f25-af82-4f97b95b4e68\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.388400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nz6c\" (UniqueName: \"kubernetes.io/projected/23a190c1-7893-4f25-af82-4f97b95b4e68-kube-api-access-4nz6c\") pod \"23a190c1-7893-4f25-af82-4f97b95b4e68\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.388480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-combined-ca-bundle\") pod \"23a190c1-7893-4f25-af82-4f97b95b4e68\" (UID: \"23a190c1-7893-4f25-af82-4f97b95b4e68\") " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbht\" (UniqueName: \"kubernetes.io/projected/cf13695f-19af-4b6e-9b52-c5f795e265d7-kube-api-access-wfbht\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389260 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfwl\" (UniqueName: \"kubernetes.io/projected/71a22bb8-e120-4ad5-b4e7-171872ed4de0-kube-api-access-kcfwl\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sjgd\" (UniqueName: \"kubernetes.io/projected/4af60f39-57ca-4595-be87-1d1a0c869d72-kube-api-access-7sjgd\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-config-data\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389849 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98lnd\" (UniqueName: \"kubernetes.io/projected/a5c43f7c-ca96-403b-b852-675dee96ce9c-kube-api-access-98lnd\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-config\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.389967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-kolla-config\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390198 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390347 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.390363 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcr4\" (UniqueName: \"kubernetes.io/projected/53e561f4-eb73-4a09-884c-0d5476044b72-kube-api-access-rxcr4\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.400771 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.404642 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.404884 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.427091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.429654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.429666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.442111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.443774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfwl\" (UniqueName: \"kubernetes.io/projected/71a22bb8-e120-4ad5-b4e7-171872ed4de0-kube-api-access-kcfwl\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.444368 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.447735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sjgd\" (UniqueName: \"kubernetes.io/projected/4af60f39-57ca-4595-be87-1d1a0c869d72-kube-api-access-7sjgd\") pod \"nova-cell0-conductor-0\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.448698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.453386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a190c1-7893-4f25-af82-4f97b95b4e68-kube-api-access-4nz6c" (OuterVolumeSpecName: "kube-api-access-4nz6c") pod "23a190c1-7893-4f25-af82-4f97b95b4e68" (UID: "23a190c1-7893-4f25-af82-4f97b95b4e68"). InnerVolumeSpecName "kube-api-access-4nz6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.465917 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbht\" (UniqueName: \"kubernetes.io/projected/cf13695f-19af-4b6e-9b52-c5f795e265d7-kube-api-access-wfbht\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " 
pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-config-data\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.492565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-config\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-config-data\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493394 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98lnd\" (UniqueName: \"kubernetes.io/projected/a5c43f7c-ca96-403b-b852-675dee96ce9c-kube-api-access-98lnd\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-config\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-kolla-config\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.493842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.494254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-config\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.494338 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") device mount path 
\"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.494783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-kolla-config\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.495521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.496407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.497986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.498895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjqv\" (UniqueName: \"kubernetes.io/projected/7c469e86-c709-4bd2-986e-fa08e32052b0-kube-api-access-5qjqv\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.499048 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nz6c\" (UniqueName: \"kubernetes.io/projected/23a190c1-7893-4f25-af82-4f97b95b4e68-kube-api-access-4nz6c\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.508321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.508528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.512746 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.515868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.519050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98lnd\" (UniqueName: \"kubernetes.io/projected/a5c43f7c-ca96-403b-b852-675dee96ce9c-kube-api-access-98lnd\") pod \"memcached-0\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.521323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbht\" (UniqueName: \"kubernetes.io/projected/cf13695f-19af-4b6e-9b52-c5f795e265d7-kube-api-access-wfbht\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.601383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.601574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.601691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.601799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-config\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.601991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.602161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.603448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-5qjqv\" (UniqueName: \"kubernetes.io/projected/7c469e86-c709-4bd2-986e-fa08e32052b0-kube-api-access-5qjqv\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.603564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.603580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-config\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.602520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.602729 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.604330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.610995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.620629 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.622080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.628024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.632052 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.644974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjqv\" (UniqueName: \"kubernetes.io/projected/7c469e86-c709-4bd2-986e-fa08e32052b0-kube-api-access-5qjqv\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.661779 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.706330 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.706351 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.726395 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.734180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.768595 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.37:8775/\": read tcp 10.217.0.2:58438->10.217.1.37:8775: read: connection reset by peer" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.768605 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.37:8775/\": read tcp 10.217.0.2:58436->10.217.1.37:8775: read: connection reset by peer" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.785549 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.785734 4707 scope.go:117] "RemoveContainer" containerID="8dac5ac8a890cd9197910e542e914fc479cea9be24fdd5964752473d60226d8d" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.796831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" event={"ID":"c892fe45-66a3-4ddf-b0db-b50df71c6493","Type":"ContainerStarted","Data":"58be21a4ad10793fe38a4fd7bab7b968f3bb33396de3d91f81bf652728f3cc32"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.802267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" event={"ID":"d7d28e68-71cb-476f-a831-5536b2686514","Type":"ContainerStarted","Data":"5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.803970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.805005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"10c95d6a-958d-45bf-8cf2-c9b915727cd4","Type":"ContainerDied","Data":"415e6d046d93980e7818ca8a0519f3c87c8d871c99b7e72d21fc4a1d90f407a2"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.805050 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.805080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.807679 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.809128 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerID="3b3595d781817f63d2073805ae987be2763662eff7d9cb701993e99941539a30" exitCode=0 Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.809199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c6da68d4-659d-4cf2-a796-899b51ef8939","Type":"ContainerDied","Data":"3b3595d781817f63d2073805ae987be2763662eff7d9cb701993e99941539a30"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.810286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" event={"ID":"08d2edd9-a93d-4bb9-ac1e-6a23494df48c","Type":"ContainerStarted","Data":"e4c5b63cd11f873ed2a55b4f24dcbe0506e5d934f705d5c8ab970f8724d61383"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.811900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" 
event={"ID":"ef804e7a-6ccf-45ea-8025-b3c4df0f1122","Type":"ContainerStarted","Data":"f714cec7aa437014d66c584329c391f2120578382e450630378de4a4d84aa269"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.813191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" event={"ID":"f58856a6-ee91-4aee-874e-118980038628","Type":"ContainerStarted","Data":"a7d885ed9569b9160b3891c8377f8b2c583f8d5d64b28744210b903262b86e4d"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.815477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"53e561f4-eb73-4a09-884c-0d5476044b72","Type":"ContainerDied","Data":"29ba37e42dd0b06196b3ffb4fc05b462a0857501790255412777e2974ba5f538"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.815508 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.816777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" event={"ID":"8cb3219d-64cd-4b49-adaa-723e74405eda","Type":"ContainerStarted","Data":"b9b0292f4265ad697d0af76da80ef835b40922e6b0800a62697e9af19c69303c"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.822897 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.822938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" event={"ID":"b2395d51-a312-47ba-9136-87fc6e74bf2c","Type":"ContainerStarted","Data":"3dbd485ddc9bfa70a1a342ac567d2dfd4359a73aa0280b6102ad6e894baab869"} Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.823522 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.837142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53e561f4-eb73-4a09-884c-0d5476044b72" (UID: "53e561f4-eb73-4a09-884c-0d5476044b72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.852454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "614dacb7-b2d4-4d19-b93d-c922660ad318" (UID: "614dacb7-b2d4-4d19-b93d-c922660ad318"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.877378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "23a190c1-7893-4f25-af82-4f97b95b4e68" (UID: "23a190c1-7893-4f25-af82-4f97b95b4e68"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.886211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23a190c1-7893-4f25-af82-4f97b95b4e68" (UID: "23a190c1-7893-4f25-af82-4f97b95b4e68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.896384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-config-data" (OuterVolumeSpecName: "config-data") pod "53e561f4-eb73-4a09-884c-0d5476044b72" (UID: "53e561f4-eb73-4a09-884c-0d5476044b72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909254 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909286 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909305 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909315 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e561f4-eb73-4a09-884c-0d5476044b72-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909325 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614dacb7-b2d4-4d19-b93d-c922660ad318-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909333 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.909479 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.912868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "10c95d6a-958d-45bf-8cf2-c9b915727cd4" (UID: "10c95d6a-958d-45bf-8cf2-c9b915727cd4"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:43 crc kubenswrapper[4707]: I0121 16:01:43.935405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "23a190c1-7893-4f25-af82-4f97b95b4e68" (UID: "23a190c1-7893-4f25-af82-4f97b95b4e68"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.012416 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23a190c1-7893-4f25-af82-4f97b95b4e68-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.012447 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c95d6a-958d-45bf-8cf2-c9b915727cd4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.026337 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.031922 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.255:8776/healthcheck\": read tcp 10.217.0.2:40390->10.217.0.255:8776: read: connection reset by peer" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.037308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.337806 4707 scope.go:117] "RemoveContainer" containerID="1fe11deed64d422b065fb5d7dda69898845b396481572d0f42a58c8e9c445a0b" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.363704 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.373004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.581096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.671027 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.674849 4707 scope.go:117] "RemoveContainer" containerID="e79d60a7b5b8d4c758290826a1efe8d5479f6a04297437d3fe22a8705fa7d7b8" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6da68d4-659d-4cf2-a796-899b51ef8939-etc-machine-id\") pod \"c6da68d4-659d-4cf2-a796-899b51ef8939\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-combined-ca-bundle\") pod \"c6da68d4-659d-4cf2-a796-899b51ef8939\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data\") pod \"c6da68d4-659d-4cf2-a796-899b51ef8939\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6da68d4-659d-4cf2-a796-899b51ef8939-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c6da68d4-659d-4cf2-a796-899b51ef8939" (UID: "c6da68d4-659d-4cf2-a796-899b51ef8939"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-scripts\") pod \"c6da68d4-659d-4cf2-a796-899b51ef8939\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdjjd\" (UniqueName: \"kubernetes.io/projected/c6da68d4-659d-4cf2-a796-899b51ef8939-kube-api-access-fdjjd\") pod \"c6da68d4-659d-4cf2-a796-899b51ef8939\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data-custom\") pod \"c6da68d4-659d-4cf2-a796-899b51ef8939\" (UID: \"c6da68d4-659d-4cf2-a796-899b51ef8939\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.726913 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6da68d4-659d-4cf2-a796-899b51ef8939-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.730275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c6da68d4-659d-4cf2-a796-899b51ef8939" (UID: "c6da68d4-659d-4cf2-a796-899b51ef8939"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.730283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6da68d4-659d-4cf2-a796-899b51ef8939-kube-api-access-fdjjd" (OuterVolumeSpecName: "kube-api-access-fdjjd") pod "c6da68d4-659d-4cf2-a796-899b51ef8939" (UID: "c6da68d4-659d-4cf2-a796-899b51ef8939"). InnerVolumeSpecName "kube-api-access-fdjjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.731565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-scripts" (OuterVolumeSpecName: "scripts") pod "c6da68d4-659d-4cf2-a796-899b51ef8939" (UID: "c6da68d4-659d-4cf2-a796-899b51ef8939"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.768484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6da68d4-659d-4cf2-a796-899b51ef8939" (UID: "c6da68d4-659d-4cf2-a796-899b51ef8939"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.804680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data" (OuterVolumeSpecName: "config-data") pod "c6da68d4-659d-4cf2-a796-899b51ef8939" (UID: "c6da68d4-659d-4cf2-a796-899b51ef8939"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.848134 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdjjd\" (UniqueName: \"kubernetes.io/projected/c6da68d4-659d-4cf2-a796-899b51ef8939-kube-api-access-fdjjd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.848170 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.848187 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.848197 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.848208 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6da68d4-659d-4cf2-a796-899b51ef8939-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.850002 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.862532 4707 generic.go:334] "Generic (PLEG): container finished" podID="a788c56d-377f-4859-910d-a623173f0a74" containerID="18320d7d0fe5ff7086b31b3cb35d4c5a0cc62611fd7a0661a1b29d50cedc2e6f" exitCode=0 Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.862617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerDied","Data":"18320d7d0fe5ff7086b31b3cb35d4c5a0cc62611fd7a0661a1b29d50cedc2e6f"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.862644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a788c56d-377f-4859-910d-a623173f0a74","Type":"ContainerDied","Data":"23eb442cbe94a7e31d994e60c9d91679a09744385f08c1118e3bc951f113796e"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.862657 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23eb442cbe94a7e31d994e60c9d91679a09744385f08c1118e3bc951f113796e" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.864555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerStarted","Data":"68a32fa6a5dba99ec037bbd9966099a20c4339ff3e5070a7193d05f0305f1347"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.866195 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5adfc17-d372-40cd-a326-5af09f323dce" containerID="0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85" exitCode=0 Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.866260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5adfc17-d372-40cd-a326-5af09f323dce","Type":"ContainerDied","Data":"0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.866437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5adfc17-d372-40cd-a326-5af09f323dce","Type":"ContainerDied","Data":"f97b57e63c4b0fc0d35bce0b9979cd7e2949ba4c614926e596c0b3a044b8f645"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.866287 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.876258 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8a64a9a-c654-4d05-8504-0839ba022204" containerID="8e2da5aaf70f2dca26c0a7481b9da9f4a95637fa22e3641c79f7ab645fd6829e" exitCode=0 Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.876319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b8a64a9a-c654-4d05-8504-0839ba022204","Type":"ContainerDied","Data":"8e2da5aaf70f2dca26c0a7481b9da9f4a95637fa22e3641c79f7ab645fd6829e"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.876339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b8a64a9a-c654-4d05-8504-0839ba022204","Type":"ContainerDied","Data":"ce975a22a57bfcc4a891f3fbd826af3e61d48a5d09a024fd529209db89980e72"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.876350 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce975a22a57bfcc4a891f3fbd826af3e61d48a5d09a024fd529209db89980e72" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.881446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"ef804e7a-6ccf-45ea-8025-b3c4df0f1122","Type":"ContainerStarted","Data":"2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.891570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"71a22bb8-e120-4ad5-b4e7-171872ed4de0","Type":"ContainerStarted","Data":"65c04243e11b338e24b3bdbcaf7e06b5490cf116becaa0bcb4d85e777f8947fa"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.897637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"7c469e86-c709-4bd2-986e-fa08e32052b0","Type":"ContainerStarted","Data":"f439f77b16ce08da3199352dc751911f63fb6b7b1ef1c0169c31e7ad38531a9a"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.900897 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerID="5f46ece4d49ebd6ff61477cd26b6c7e5b21e6dfbdf8a342e363b3fa2825bf3d7" exitCode=0 Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.900943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fc9779e5-4e04-424b-b1e3-388ebf744d02","Type":"ContainerDied","Data":"5f46ece4d49ebd6ff61477cd26b6c7e5b21e6dfbdf8a342e363b3fa2825bf3d7"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.900960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fc9779e5-4e04-424b-b1e3-388ebf744d02","Type":"ContainerDied","Data":"cc1ee5d6c7e5f2b2456e3af05b8d762bb0fe47a4df8214f9837dd81fd6fc0a02"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.900970 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc1ee5d6c7e5f2b2456e3af05b8d762bb0fe47a4df8214f9837dd81fd6fc0a02" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.905502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c6da68d4-659d-4cf2-a796-899b51ef8939","Type":"ContainerDied","Data":"38cc8ec1a09a6dcfe428bc7290accb1c02c7103180c8c183974ac06414a836a8"} Jan 21 
16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.905625 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.911162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a5c43f7c-ca96-403b-b852-675dee96ce9c","Type":"ContainerStarted","Data":"12be752356156ba59ab4b887cf7857e8f5dbfe4de59bc40d4746b70f172728d3"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.911985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"cf13695f-19af-4b6e-9b52-c5f795e265d7","Type":"ContainerStarted","Data":"f9bb4f8dd4da4abef4adecf2a121d005e1b718ab3814f27bfa784d5b7afa5606"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.913797 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=4.913777809 podStartE2EDuration="4.913777809s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:44.90335891 +0000 UTC m=+3602.084875131" watchObservedRunningTime="2026-01-21 16:01:44.913777809 +0000 UTC m=+3602.095294032" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.914278 4707 generic.go:334] "Generic (PLEG): container finished" podID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerID="e988b288a8fcc44d803facd8cd151da8bf58f68ac8ebb2aef3d85ab144b16900" exitCode=0 Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.914329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"32eaeddc-f8b5-4c3e-a456-f723f0698b30","Type":"ContainerDied","Data":"e988b288a8fcc44d803facd8cd151da8bf58f68ac8ebb2aef3d85ab144b16900"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.914354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"32eaeddc-f8b5-4c3e-a456-f723f0698b30","Type":"ContainerDied","Data":"79e47180b22fff9ccbd5cf6da7c3d6fe16284b15bb2cd08cc05499fad9cb705b"} Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.914364 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79e47180b22fff9ccbd5cf6da7c3d6fe16284b15bb2cd08cc05499fad9cb705b" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.949160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-nova-metadata-tls-certs\") pod \"b5adfc17-d372-40cd-a326-5af09f323dce\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.949214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-combined-ca-bundle\") pod \"b5adfc17-d372-40cd-a326-5af09f323dce\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.949254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5adfc17-d372-40cd-a326-5af09f323dce-logs\") pod \"b5adfc17-d372-40cd-a326-5af09f323dce\" (UID: 
\"b5adfc17-d372-40cd-a326-5af09f323dce\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.949396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-config-data\") pod \"b5adfc17-d372-40cd-a326-5af09f323dce\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.949430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lcxn\" (UniqueName: \"kubernetes.io/projected/b5adfc17-d372-40cd-a326-5af09f323dce-kube-api-access-2lcxn\") pod \"b5adfc17-d372-40cd-a326-5af09f323dce\" (UID: \"b5adfc17-d372-40cd-a326-5af09f323dce\") " Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.950447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5adfc17-d372-40cd-a326-5af09f323dce-logs" (OuterVolumeSpecName: "logs") pod "b5adfc17-d372-40cd-a326-5af09f323dce" (UID: "b5adfc17-d372-40cd-a326-5af09f323dce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.954324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5adfc17-d372-40cd-a326-5af09f323dce-kube-api-access-2lcxn" (OuterVolumeSpecName: "kube-api-access-2lcxn") pod "b5adfc17-d372-40cd-a326-5af09f323dce" (UID: "b5adfc17-d372-40cd-a326-5af09f323dce"). InnerVolumeSpecName "kube-api-access-2lcxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.974653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5adfc17-d372-40cd-a326-5af09f323dce" (UID: "b5adfc17-d372-40cd-a326-5af09f323dce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.982656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-config-data" (OuterVolumeSpecName: "config-data") pod "b5adfc17-d372-40cd-a326-5af09f323dce" (UID: "b5adfc17-d372-40cd-a326-5af09f323dce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:44 crc kubenswrapper[4707]: I0121 16:01:44.988335 4707 scope.go:117] "RemoveContainer" containerID="a6140e292d56c1ea39d41e209b222050d7c5bcd750acd566047cd616b55a336c" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.023855 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.052093 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.056893 4707 scope.go:117] "RemoveContainer" containerID="0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.057150 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lcxn\" (UniqueName: \"kubernetes.io/projected/b5adfc17-d372-40cd-a326-5af09f323dce-kube-api-access-2lcxn\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.057178 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.057188 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5adfc17-d372-40cd-a326-5af09f323dce-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.057212 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.080273 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093221 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.093594 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="cinder-scheduler" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093613 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="cinder-scheduler" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.093624 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-metadata" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093629 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-metadata" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.093645 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093650 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api-log" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.093669 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093674 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.093689 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-log" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.093706 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="probe" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093712 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="probe" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="probe" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-metadata" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" containerName="cinder-scheduler" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093931 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093942 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" containerName="nova-metadata-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.093954 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" containerName="cinder-api-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.094836 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.096799 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.097079 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.097211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.097263 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-6glg2" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.115713 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.118122 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="23a190c1-7893-4f25-af82-4f97b95b4e68" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.122786 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.123854 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.133874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.149316 4707 scope.go:117] "RemoveContainer" containerID="51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.155784 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-config-data\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8a64a9a-c654-4d05-8504-0839ba022204-etc-machine-id\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-combined-ca-bundle\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-log-httpd\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-httpd-run\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q8tc\" (UniqueName: \"kubernetes.io/projected/32eaeddc-f8b5-4c3e-a456-f723f0698b30-kube-api-access-6q8tc\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-logs\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: 
\"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7x5q\" (UniqueName: \"kubernetes.io/projected/fc9779e5-4e04-424b-b1e3-388ebf744d02-kube-api-access-c7x5q\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data-custom\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159789 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-scripts\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-sg-core-conf-yaml\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48hg4\" (UniqueName: \"kubernetes.io/projected/a788c56d-377f-4859-910d-a623173f0a74-kube-api-access-48hg4\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-run-httpd\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-ceilometer-tls-certs\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-combined-ca-bundle\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a64a9a-c654-4d05-8504-0839ba022204-logs\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 
21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-public-tls-certs\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.159997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-internal-tls-certs\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-scripts\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-logs\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-config-data\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-scripts\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-public-tls-certs\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-config-data\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-combined-ca-bundle\") pod \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\" (UID: \"32eaeddc-f8b5-4c3e-a456-f723f0698b30\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160204 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-combined-ca-bundle\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-httpd-run\") pod \"fc9779e5-4e04-424b-b1e3-388ebf744d02\" (UID: \"fc9779e5-4e04-424b-b1e3-388ebf744d02\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-scripts\") pod \"a788c56d-377f-4859-910d-a623173f0a74\" (UID: \"a788c56d-377f-4859-910d-a623173f0a74\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-internal-tls-certs\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7kjc\" (UniqueName: \"kubernetes.io/projected/b8a64a9a-c654-4d05-8504-0839ba022204-kube-api-access-n7kjc\") pod \"b8a64a9a-c654-4d05-8504-0839ba022204\" (UID: \"b8a64a9a-c654-4d05-8504-0839ba022204\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-logs" (OuterVolumeSpecName: "logs") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-default\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-kolla-config\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-operator-scripts\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-generated\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrjp\" (UniqueName: \"kubernetes.io/projected/45062d46-6a83-4a28-bb58-8d2f5a22e270-kube-api-access-qtrjp\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.160961 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc 
kubenswrapper[4707]: I0121 16:01:45.168942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.169918 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.170940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8a64a9a-c654-4d05-8504-0839ba022204-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.173343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a64a9a-c654-4d05-8504-0839ba022204-logs" (OuterVolumeSpecName: "logs") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.182802 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.193741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-logs" (OuterVolumeSpecName: "logs") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.194096 4707 scope.go:117] "RemoveContainer" containerID="0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.197985 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85\": container with ID starting with 0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85 not found: ID does not exist" containerID="0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.198016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85"} err="failed to get container status \"0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85\": rpc error: code = NotFound desc = could not find container \"0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85\": container with ID starting with 0901f65e1981f7073ce8ae74e905a537ee7141d738094b66224b94592bad1a85 not found: ID does not exist" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.198034 4707 scope.go:117] "RemoveContainer" containerID="51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.198288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.200546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.200662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-scripts" (OuterVolumeSpecName: "scripts") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.200678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.201009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-scripts" (OuterVolumeSpecName: "scripts") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.201023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.201029 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384\": container with ID starting with 51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384 not found: ID does not exist" containerID="51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.201051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9779e5-4e04-424b-b1e3-388ebf744d02-kube-api-access-c7x5q" (OuterVolumeSpecName: "kube-api-access-c7x5q") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "kube-api-access-c7x5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.201082 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384"} err="failed to get container status \"51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384\": rpc error: code = NotFound desc = could not find container \"51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384\": container with ID starting with 51b59d9c6b35c2dc93533d5fc12690a0ef18d6da1c7cf80d681250d6425e1384 not found: ID does not exist" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.201105 4707 scope.go:117] "RemoveContainer" containerID="35687d4e9d2f04767e4e660723736595964bd33eb7c312847844b1f32e3a1a5a" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.202051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32eaeddc-f8b5-4c3e-a456-f723f0698b30-kube-api-access-6q8tc" (OuterVolumeSpecName: "kube-api-access-6q8tc") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "kube-api-access-6q8tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.206056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-scripts" (OuterVolumeSpecName: "scripts") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.207075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.223543 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.226113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.226893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-scripts" (OuterVolumeSpecName: "scripts") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.230859 4707 scope.go:117] "RemoveContainer" containerID="3b3595d781817f63d2073805ae987be2763662eff7d9cb701993e99941539a30" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.232102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a788c56d-377f-4859-910d-a623173f0a74-kube-api-access-48hg4" (OuterVolumeSpecName: "kube-api-access-48hg4") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "kube-api-access-48hg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.253636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a64a9a-c654-4d05-8504-0839ba022204-kube-api-access-n7kjc" (OuterVolumeSpecName: "kube-api-access-n7kjc") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "kube-api-access-n7kjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.254483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c95d6a-958d-45bf-8cf2-c9b915727cd4" path="/var/lib/kubelet/pods/10c95d6a-958d-45bf-8cf2-c9b915727cd4/volumes" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.260649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a190c1-7893-4f25-af82-4f97b95b4e68" path="/var/lib/kubelet/pods/23a190c1-7893-4f25-af82-4f97b95b4e68/volumes" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.272525 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909f1cc7-acf2-4e84-ae89-4af8bd83962d" path="/var/lib/kubelet/pods/909f1cc7-acf2-4e84-ae89-4af8bd83962d/volumes" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.274711 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1857e43-c160-4d63-ba36-3b9cb96ebadc" path="/var/lib/kubelet/pods/c1857e43-c160-4d63-ba36-3b9cb96ebadc/volumes" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.277256 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6da68d4-659d-4cf2-a796-899b51ef8939" path="/var/lib/kubelet/pods/c6da68d4-659d-4cf2-a796-899b51ef8939/volumes" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.273497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.280928 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f232af-6f89-4112-a5a1-5fd2093c2b61" path="/var/lib/kubelet/pods/e5f232af-6f89-4112-a5a1-5fd2093c2b61/volumes" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.282983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-default\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.285040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-kolla-config\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.285169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-operator-scripts\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.285284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-generated\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.285453 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-kolla-config\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.283924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-default\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrjp\" (UniqueName: \"kubernetes.io/projected/45062d46-6a83-4a28-bb58-8d2f5a22e270-kube-api-access-qtrjp\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-generated\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-operator-scripts\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286600 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc9779e5-4e04-424b-b1e3-388ebf744d02-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286624 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286670 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7kjc\" (UniqueName: \"kubernetes.io/projected/b8a64a9a-c654-4d05-8504-0839ba022204-kube-api-access-n7kjc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286682 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b8a64a9a-c654-4d05-8504-0839ba022204-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286701 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286711 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286719 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286728 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q8tc\" (UniqueName: \"kubernetes.io/projected/32eaeddc-f8b5-4c3e-a456-f723f0698b30-kube-api-access-6q8tc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286737 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7x5q\" (UniqueName: \"kubernetes.io/projected/fc9779e5-4e04-424b-b1e3-388ebf744d02-kube-api-access-c7x5q\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286746 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286754 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286766 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286774 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48hg4\" (UniqueName: \"kubernetes.io/projected/a788c56d-377f-4859-910d-a623173f0a74-kube-api-access-48hg4\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286783 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a788c56d-377f-4859-910d-a623173f0a74-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286791 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a64a9a-c654-4d05-8504-0839ba022204-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286799 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.286824 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32eaeddc-f8b5-4c3e-a456-f723f0698b30-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: 
I0121 16:01:45.286831 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.287055 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.292393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.303606 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.303641 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304040 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="sg-core" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304058 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="sg-core" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304074 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304080 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304098 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-log" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304109 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-central-agent" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304115 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-central-agent" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304125 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304131 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-log" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304141 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304147 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304154 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-api" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304159 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-api" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304164 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-notification-agent" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304169 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-notification-agent" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304178 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304183 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-log" Jan 21 16:01:45 crc kubenswrapper[4707]: E0121 16:01:45.304193 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="proxy-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304198 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="proxy-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304346 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-central-agent" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304359 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304371 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304382 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="ceilometer-notification-agent" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304391 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-api" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304398 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="proxy-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304408 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerName="nova-api-log" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304416 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" containerName="glance-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.304424 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a788c56d-377f-4859-910d-a623173f0a74" containerName="sg-core" Jan 21 16:01:45 crc 
kubenswrapper[4707]: I0121 16:01:45.304432 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" containerName="glance-httpd" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.305202 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.305224 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.305235 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.305246 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.306450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.307104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.307175 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.307505 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.307905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.308541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.309456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.309608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.309951 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.310014 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.310068 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.310581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rch7x" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.313873 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.317696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qtrjp\" (UniqueName: \"kubernetes.io/projected/45062d46-6a83-4a28-bb58-8d2f5a22e270-kube-api-access-qtrjp\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.387409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-config-data\") pod \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.387498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7qs\" (UniqueName: \"kubernetes.io/projected/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-kube-api-access-np7qs\") pod \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.387577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-combined-ca-bundle\") pod \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.387614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-internal-tls-certs\") pod \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.387632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-public-tls-certs\") pod \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.387803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-logs\") pod \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\" (UID: \"77eb4906-622c-4eb0-8e20-e91b6d3a8e27\") " Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.388114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.388151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8kvw\" (UniqueName: \"kubernetes.io/projected/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-kube-api-access-p8kvw\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.388192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.388221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.388254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.388273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.392254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-logs" (OuterVolumeSpecName: "logs") pod "77eb4906-622c-4eb0-8e20-e91b6d3a8e27" (UID: "77eb4906-622c-4eb0-8e20-e91b6d3a8e27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.407739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-kube-api-access-np7qs" (OuterVolumeSpecName: "kube-api-access-np7qs") pod "77eb4906-622c-4eb0-8e20-e91b6d3a8e27" (UID: "77eb4906-622c-4eb0-8e20-e91b6d3a8e27"). InnerVolumeSpecName "kube-api-access-np7qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.463745 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8x8s\" (UniqueName: \"kubernetes.io/projected/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-kube-api-access-t8x8s\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgzcf\" (UniqueName: \"kubernetes.io/projected/1de274a1-6480-4858-948a-d0423dbf7fe8-kube-api-access-hgzcf\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-config-data\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.490851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8kvw\" (UniqueName: \"kubernetes.io/projected/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-kube-api-access-p8kvw\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.491212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.491254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.491339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc 
kubenswrapper[4707]: I0121 16:01:45.491367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.491478 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.491493 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.491507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7qs\" (UniqueName: \"kubernetes.io/projected/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-kube-api-access-np7qs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.500331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.501193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.504333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.504530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.514491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8kvw\" (UniqueName: \"kubernetes.io/projected/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-kube-api-access-p8kvw\") pod \"cinder-scheduler-0\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8x8s\" (UniqueName: \"kubernetes.io/projected/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-kube-api-access-t8x8s\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgzcf\" (UniqueName: \"kubernetes.io/projected/1de274a1-6480-4858-948a-d0423dbf7fe8-kube-api-access-hgzcf\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.593500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-config-data\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.594268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.594521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.595607 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.596352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.602367 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.606761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.607044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-config-data\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.607860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.607887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.610592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8x8s\" (UniqueName: \"kubernetes.io/projected/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-kube-api-access-t8x8s\") pod \"nova-scheduler-0\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.611316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgzcf\" (UniqueName: \"kubernetes.io/projected/1de274a1-6480-4858-948a-d0423dbf7fe8-kube-api-access-hgzcf\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.615822 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.634671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.645983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.648804 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.698251 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.698279 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.710093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.738200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b5adfc17-d372-40cd-a326-5af09f323dce" (UID: "b5adfc17-d372-40cd-a326-5af09f323dce"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.785116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.800909 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.801123 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5adfc17-d372-40cd-a326-5af09f323dce-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.926924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" event={"ID":"b2395d51-a312-47ba-9136-87fc6e74bf2c","Type":"ContainerStarted","Data":"32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78"} Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.927067 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.947406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"7c469e86-c709-4bd2-986e-fa08e32052b0","Type":"ContainerStarted","Data":"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87"} Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.958358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.968963 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f8b72f5-0735-4187-aaec-0fb50df56674" containerID="d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f" exitCode=0 Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.969032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"8f8b72f5-0735-4187-aaec-0fb50df56674","Type":"ContainerDied","Data":"d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f"} Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.969059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"8f8b72f5-0735-4187-aaec-0fb50df56674","Type":"ContainerDied","Data":"d4ed10b0e41f00fea85b47debfb90270a496b9f43b126cf47e367e657be44316"} Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.969069 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4ed10b0e41f00fea85b47debfb90270a496b9f43b126cf47e367e657be44316" Jan 21 16:01:45 crc kubenswrapper[4707]: I0121 16:01:45.997971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" event={"ID":"c892fe45-66a3-4ddf-b0db-b50df71c6493","Type":"ContainerStarted","Data":"73b244ba7031f16385e7963d008d81acbd17ba6ba2ee40f7b1bdf960c503b84b"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.000700 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" podStartSLOduration=6.000685511 podStartE2EDuration="6.000685511s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:45.958356129 +0000 UTC m=+3603.139872351" watchObservedRunningTime="2026-01-21 16:01:46.000685511 +0000 UTC m=+3603.182201733" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.007095 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.047084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"71a22bb8-e120-4ad5-b4e7-171872ed4de0","Type":"ContainerStarted","Data":"8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.054823 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.081674 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" podStartSLOduration=6.081658818 podStartE2EDuration="6.081658818s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.04123048 +0000 UTC m=+3603.222746702" watchObservedRunningTime="2026-01-21 16:01:46.081658818 +0000 UTC m=+3603.263175040" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.083313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" event={"ID":"f58856a6-ee91-4aee-874e-118980038628","Type":"ContainerStarted","Data":"5e4bd3a5a9bcc7a9caf760e253e0ea48824f56e8b8b18cd49710b14d45577c27"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.084607 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-75d7748979-s5w96"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.084839 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker-log" containerID="cri-o://9682448c7c3521e5fabf41464b76348e4257e00745c27d84fde49e69a08cce5b" gracePeriod=30 Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.084850 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker" containerID="cri-o://380a9834052d0f5862f21968659125d6bfa398491f0f1d19cb8337ac171d6977" gracePeriod=30 Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.098481 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.098468616 podStartE2EDuration="3.098468616s" podCreationTimestamp="2026-01-21 16:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.093142561 +0000 UTC m=+3603.274658783" watchObservedRunningTime="2026-01-21 16:01:46.098468616 +0000 UTC m=+3603.279984828" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.132654 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.150835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" event={"ID":"d7d28e68-71cb-476f-a831-5536b2686514","Type":"ContainerStarted","Data":"e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.151431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.151565 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.159094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.168070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a5c43f7c-ca96-403b-b852-675dee96ce9c","Type":"ContainerStarted","Data":"8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.169364 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.170252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.179532 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" podStartSLOduration=6.179516232 podStartE2EDuration="6.179516232s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.15150425 +0000 UTC m=+3603.333020473" watchObservedRunningTime="2026-01-21 16:01:46.179516232 +0000 UTC m=+3603.361032454" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.181755 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.186280 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" podStartSLOduration=6.186265473 podStartE2EDuration="6.186265473s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.185602987 +0000 UTC m=+3603.367119209" watchObservedRunningTime="2026-01-21 16:01:46.186265473 +0000 UTC m=+3603.367781695" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.186978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" event={"ID":"08d2edd9-a93d-4bb9-ac1e-6a23494df48c","Type":"ContainerStarted","Data":"337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.187192 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener-log" containerID="cri-o://22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b" gracePeriod=30 Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.187801 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" 
podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener" containerID="cri-o://2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4" gracePeriod=30 Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.211206 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.211231 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.212175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"cf13695f-19af-4b6e-9b52-c5f795e265d7","Type":"ContainerStarted","Data":"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.233212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.294345 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=3.294327675 podStartE2EDuration="3.294327675s" podCreationTimestamp="2026-01-21 16:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.279570076 +0000 UTC m=+3603.461086298" watchObservedRunningTime="2026-01-21 16:01:46.294327675 +0000 UTC m=+3603.475843897" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.298496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerStarted","Data":"7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.299746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.324075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.326474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" event={"ID":"8cb3219d-64cd-4b49-adaa-723e74405eda","Type":"ContainerStarted","Data":"dfcfda9d4838b9c50d7eebf432589fed062da83ba1c5877073fb9c19903f4a98"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.326644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.326793 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.326889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.369786 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.369767947 podStartE2EDuration="3.369767947s" podCreationTimestamp="2026-01-21 16:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.359677053 +0000 UTC m=+3603.541193276" watchObservedRunningTime="2026-01-21 16:01:46.369767947 +0000 UTC m=+3603.551284169" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.371466 4707 generic.go:334] "Generic (PLEG): container finished" podID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" containerID="081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78" exitCode=0 Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.371860 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.372515 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.372696 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.373123 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.373339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"77eb4906-622c-4eb0-8e20-e91b6d3a8e27","Type":"ContainerDied","Data":"081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.373366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"77eb4906-622c-4eb0-8e20-e91b6d3a8e27","Type":"ContainerDied","Data":"59aae2fe83ccfea1cfd931182372261bdc38b2b85b89753a81dfdc12645ad314"} Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.373386 4707 scope.go:117] "RemoveContainer" containerID="081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.373524 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.397419 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podStartSLOduration=6.39740832 podStartE2EDuration="6.39740832s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:46.395632682 +0000 UTC m=+3603.577148904" watchObservedRunningTime="2026-01-21 16:01:46.39740832 +0000 UTC m=+3603.578924542" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.426144 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.481751 4707 scope.go:117] "RemoveContainer" containerID="e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.506207 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.531095 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.511005 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.539257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.552268 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.559948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.564136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "77eb4906-622c-4eb0-8e20-e91b6d3a8e27" (UID: "77eb4906-622c-4eb0-8e20-e91b6d3a8e27"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.573497 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:01:46 crc kubenswrapper[4707]: E0121 16:01:46.574194 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8b72f5-0735-4187-aaec-0fb50df56674" containerName="nova-cell1-conductor-conductor" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.574270 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8b72f5-0735-4187-aaec-0fb50df56674" containerName="nova-cell1-conductor-conductor" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.574963 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8b72f5-0735-4187-aaec-0fb50df56674" containerName="nova-cell1-conductor-conductor" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.576547 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.580954 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.581135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.624358 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.639339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-combined-ca-bundle\") pod \"8f8b72f5-0735-4187-aaec-0fb50df56674\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.639820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn2gr\" (UniqueName: \"kubernetes.io/projected/8f8b72f5-0735-4187-aaec-0fb50df56674-kube-api-access-hn2gr\") pod \"8f8b72f5-0735-4187-aaec-0fb50df56674\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.639908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-config-data\") pod \"8f8b72f5-0735-4187-aaec-0fb50df56674\" (UID: \"8f8b72f5-0735-4187-aaec-0fb50df56674\") " Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.640025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-config-data" (OuterVolumeSpecName: "config-data") pod "77eb4906-622c-4eb0-8e20-e91b6d3a8e27" (UID: "77eb4906-622c-4eb0-8e20-e91b6d3a8e27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.640714 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.640730 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.640742 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:46 crc kubenswrapper[4707]: I0121 16:01:46.640754 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.653593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77eb4906-622c-4eb0-8e20-e91b6d3a8e27" (UID: "77eb4906-622c-4eb0-8e20-e91b6d3a8e27"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.712412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8b72f5-0735-4187-aaec-0fb50df56674-kube-api-access-hn2gr" (OuterVolumeSpecName: "kube-api-access-hn2gr") pod "8f8b72f5-0735-4187-aaec-0fb50df56674" (UID: "8f8b72f5-0735-4187-aaec-0fb50df56674"). InnerVolumeSpecName "kube-api-access-hn2gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-config-data\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3caa588-23c8-4f8c-9a2d-4340345a5a70-logs\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9bw\" (UniqueName: \"kubernetes.io/projected/c3caa588-23c8-4f8c-9a2d-4340345a5a70-kube-api-access-rs9bw\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747825 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn2gr\" (UniqueName: \"kubernetes.io/projected/8f8b72f5-0735-4187-aaec-0fb50df56674-kube-api-access-hn2gr\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.747836 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.774357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.799272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-config-data" (OuterVolumeSpecName: "config-data") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.840063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-config-data" (OuterVolumeSpecName: "config-data") pod "32eaeddc-f8b5-4c3e-a456-f723f0698b30" (UID: "32eaeddc-f8b5-4c3e-a456-f723f0698b30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.850480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-config-data" (OuterVolumeSpecName: "config-data") pod "8f8b72f5-0735-4187-aaec-0fb50df56674" (UID: "8f8b72f5-0735-4187-aaec-0fb50df56674"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9bw\" (UniqueName: \"kubernetes.io/projected/c3caa588-23c8-4f8c-9a2d-4340345a5a70-kube-api-access-rs9bw\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-config-data\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3caa588-23c8-4f8c-9a2d-4340345a5a70-logs\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851702 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851712 4707 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851721 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eaeddc-f8b5-4c3e-a456-f723f0698b30-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.851730 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.853174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3caa588-23c8-4f8c-9a2d-4340345a5a70-logs\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.864325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.864553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.864696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f8b72f5-0735-4187-aaec-0fb50df56674" (UID: "8f8b72f5-0735-4187-aaec-0fb50df56674"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.867536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-config-data\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.876181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9bw\" (UniqueName: \"kubernetes.io/projected/c3caa588-23c8-4f8c-9a2d-4340345a5a70-kube-api-access-rs9bw\") pod \"nova-metadata-0\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.883070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data" (OuterVolumeSpecName: "config-data") pod "b8a64a9a-c654-4d05-8504-0839ba022204" (UID: "b8a64a9a-c654-4d05-8504-0839ba022204"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.883925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc9779e5-4e04-424b-b1e3-388ebf744d02" (UID: "fc9779e5-4e04-424b-b1e3-388ebf744d02"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.895246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "77eb4906-622c-4eb0-8e20-e91b6d3a8e27" (UID: "77eb4906-622c-4eb0-8e20-e91b6d3a8e27"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.913457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-config-data" (OuterVolumeSpecName: "config-data") pod "a788c56d-377f-4859-910d-a623173f0a74" (UID: "a788c56d-377f-4859-910d-a623173f0a74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.953315 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77eb4906-622c-4eb0-8e20-e91b6d3a8e27-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.953346 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8b72f5-0735-4187-aaec-0fb50df56674-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.953357 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a788c56d-377f-4859-910d-a623173f0a74-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.953366 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9779e5-4e04-424b-b1e3-388ebf744d02-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.953376 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a64a9a-c654-4d05-8504-0839ba022204-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:46.972673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.036711 4707 scope.go:117] "RemoveContainer" containerID="081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78" Jan 21 16:01:47 crc kubenswrapper[4707]: E0121 16:01:47.039509 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78\": container with ID starting with 081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78 not found: ID does not exist" containerID="081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 
16:01:47.039546 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78"} err="failed to get container status \"081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78\": rpc error: code = NotFound desc = could not find container \"081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78\": container with ID starting with 081cff6489874efed0509f30c26dc73de23d588189df21d3a7c503c1971aab78 not found: ID does not exist" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.039578 4707 scope.go:117] "RemoveContainer" containerID="e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71" Jan 21 16:01:47 crc kubenswrapper[4707]: E0121 16:01:47.041840 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71\": container with ID starting with e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71 not found: ID does not exist" containerID="e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.041867 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71"} err="failed to get container status \"e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71\": rpc error: code = NotFound desc = could not find container \"e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71\": container with ID starting with e8eff6a25b00d1bd4448defb3e4242f9620fdbd5c5c1f128fec7fde50785dd71 not found: ID does not exist" Jan 21 16:01:47 crc kubenswrapper[4707]: W0121 16:01:47.185123 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45062d46_6a83_4a28_bb58_8d2f5a22e270.slice/crio-736a7487b8094d2d9cd3aaae926443f542829fc73205aeff4b7a22993b5d4df4 WatchSource:0}: Error finding container 736a7487b8094d2d9cd3aaae926443f542829fc73205aeff4b7a22993b5d4df4: Status 404 returned error can't find the container with id 736a7487b8094d2d9cd3aaae926443f542829fc73205aeff4b7a22993b5d4df4 Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.199350 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e561f4-eb73-4a09-884c-0d5476044b72" path="/var/lib/kubelet/pods/53e561f4-eb73-4a09-884c-0d5476044b72/volumes" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.199983 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="614dacb7-b2d4-4d19-b93d-c922660ad318" path="/var/lib/kubelet/pods/614dacb7-b2d4-4d19-b93d-c922660ad318/volumes" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.200559 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5adfc17-d372-40cd-a326-5af09f323dce" path="/var/lib/kubelet/pods/b5adfc17-d372-40cd-a326-5af09f323dce/volumes" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.344422 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.367152 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.380959 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.399668 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.403481 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.418308 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.418566 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.418677 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.420417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.432395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" event={"ID":"79b20a8d-7771-44bf-9ca7-100e04650264","Type":"ContainerDied","Data":"22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.433206 4707 generic.go:334] "Generic (PLEG): container finished" podID="79b20a8d-7771-44bf-9ca7-100e04650264" containerID="22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b" exitCode=143 Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.436490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"45062d46-6a83-4a28-bb58-8d2f5a22e270","Type":"ContainerStarted","Data":"736a7487b8094d2d9cd3aaae926443f542829fc73205aeff4b7a22993b5d4df4"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.443737 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.455780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerStarted","Data":"933d0272ca2cd9ae33707f95f44a73f394c3a3383a282ae4d9a41fc5b512d063"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.464469 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.479272 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.489883 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.492497 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_40dfa261-8913-4472-8f97-ac7b04d1f349/ovn-northd/0.log" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.492536 4707 generic.go:334] "Generic (PLEG): container finished" podID="40dfa261-8913-4472-8f97-ac7b04d1f349" 
containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" exitCode=139 Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.492592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"40dfa261-8913-4472-8f97-ac7b04d1f349","Type":"ContainerDied","Data":"853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.517733 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.524674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.525253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"cf13695f-19af-4b6e-9b52-c5f795e265d7","Type":"ContainerStarted","Data":"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.543060 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.547379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.547587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.548190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.551959 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.562549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerStarted","Data":"20720a8d39c9d26c8b8fadb98b6fa4b1c16b2ed083603d1d60f4605e3481df11"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.572597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-scripts\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.572677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tftxj\" (UniqueName: \"kubernetes.io/projected/e9d59870-b78f-4c55-8105-ef69996cd835-kube-api-access-tftxj\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.572740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-run-httpd\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.573897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.573934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-log-httpd\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.573995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.574014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.574071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-config-data\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.590822 4707 generic.go:334] "Generic (PLEG): container finished" podID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerID="380a9834052d0f5862f21968659125d6bfa398491f0f1d19cb8337ac171d6977" exitCode=0 Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.590853 4707 generic.go:334] "Generic (PLEG): container finished" podID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerID="9682448c7c3521e5fabf41464b76348e4257e00745c27d84fde49e69a08cce5b" exitCode=143 Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.590941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" event={"ID":"8b7841fd-aeda-46f5-a60c-3baa9923275d","Type":"ContainerDied","Data":"380a9834052d0f5862f21968659125d6bfa398491f0f1d19cb8337ac171d6977"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.590975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" event={"ID":"8b7841fd-aeda-46f5-a60c-3baa9923275d","Type":"ContainerDied","Data":"9682448c7c3521e5fabf41464b76348e4257e00745c27d84fde49e69a08cce5b"} Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.593252 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.595148 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.598058 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.598350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.603396 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.640851 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.641182 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.663053 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.677662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-log-httpd\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.677898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.677936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-public-tls-certs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.677954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.677971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlf26\" (UniqueName: \"kubernetes.io/projected/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-kube-api-access-tlf26\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678070 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-config-data\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-config-data\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678170 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-scripts\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-logs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-logs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tftxj\" (UniqueName: \"kubernetes.io/projected/e9d59870-b78f-4c55-8105-ef69996cd835-kube-api-access-tftxj\") pod 
\"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-run-httpd\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-scripts\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8td\" (UniqueName: \"kubernetes.io/projected/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-kube-api-access-2w8td\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.678471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.685982 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.692004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-log-httpd\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.692973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-run-httpd\") pod \"ceilometer-0\" 
(UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.702755 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.704419 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.705311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-scripts\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.705951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.707450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-config-data\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.707840 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.708009 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dkchd" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.708134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.715989 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.717682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.718240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.722606 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.737251 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.738961 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.754556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tftxj\" (UniqueName: \"kubernetes.io/projected/e9d59870-b78f-4c55-8105-ef69996cd835-kube-api-access-tftxj\") pod \"ceilometer-0\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.762887 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.763667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-public-tls-certs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlf26\" (UniqueName: \"kubernetes.io/projected/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-kube-api-access-tlf26\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-config-data\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 
16:01:47.781850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-logs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.781985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-logs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-logs\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmrkv\" (UniqueName: \"kubernetes.io/projected/06f6e827-b699-4aa7-b3bc-6729a224186f-kube-api-access-rmrkv\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-scripts\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.782384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.783969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.800462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.806949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.807594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8td\" (UniqueName: \"kubernetes.io/projected/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-kube-api-access-2w8td\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.808072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-logs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.811270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" 
Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.815064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlf26\" (UniqueName: \"kubernetes.io/projected/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-kube-api-access-tlf26\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.815347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.817067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-scripts\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.824511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.833625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.835477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.843681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-logs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.864306 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.869975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.870466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-public-tls-certs\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.871328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.879091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-config-data\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.879558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8td\" (UniqueName: \"kubernetes.io/projected/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-kube-api-access-2w8td\") pod \"nova-api-0\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.887571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.906359 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmrkv\" (UniqueName: \"kubernetes.io/projected/06f6e827-b699-4aa7-b3bc-6729a224186f-kube-api-access-rmrkv\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-logs\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-scripts\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-config-data\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.924731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-logs\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.926085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-logs\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.926987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.933273 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.941140 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.946050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.948630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.949680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.949732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.951026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.958605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmrkv\" (UniqueName: \"kubernetes.io/projected/06f6e827-b699-4aa7-b3bc-6729a224186f-kube-api-access-rmrkv\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.975770 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=4.975751412 podStartE2EDuration="4.975751412s" podCreationTimestamp="2026-01-21 16:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:47.56218431 +0000 UTC m=+3604.743700533" watchObservedRunningTime="2026-01-21 16:01:47.975751412 +0000 UTC m=+3605.157267634" Jan 21 16:01:47 crc kubenswrapper[4707]: I0121 16:01:47.987135 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.9871248379999997 podStartE2EDuration="2.987124838s" podCreationTimestamp="2026-01-21 16:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 16:01:47.611354295 +0000 UTC m=+3604.792870518" watchObservedRunningTime="2026-01-21 16:01:47.987124838 +0000 UTC m=+3605.168641060" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.006399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.010003 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=5.009996203 podStartE2EDuration="5.009996203s" podCreationTimestamp="2026-01-21 16:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:47.669599796 +0000 UTC m=+3604.851116019" watchObservedRunningTime="2026-01-21 16:01:48.009996203 +0000 UTC m=+3605.191512425" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.018642 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.021844 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.025856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-combined-ca-bundle\") pod \"8b7841fd-aeda-46f5-a60c-3baa9923275d\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.025980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b7841fd-aeda-46f5-a60c-3baa9923275d-logs\") pod \"8b7841fd-aeda-46f5-a60c-3baa9923275d\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data-custom\") pod \"8b7841fd-aeda-46f5-a60c-3baa9923275d\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data\") pod \"8b7841fd-aeda-46f5-a60c-3baa9923275d\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt6r6\" (UniqueName: \"kubernetes.io/projected/8b7841fd-aeda-46f5-a60c-3baa9923275d-kube-api-access-kt6r6\") pod \"8b7841fd-aeda-46f5-a60c-3baa9923275d\" (UID: \"8b7841fd-aeda-46f5-a60c-3baa9923275d\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-scripts\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-config-data\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwph7\" (UniqueName: \"kubernetes.io/projected/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-kube-api-access-xwph7\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-logs\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.026856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.027103 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.043577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b7841fd-aeda-46f5-a60c-3baa9923275d-logs" (OuterVolumeSpecName: "logs") pod "8b7841fd-aeda-46f5-a60c-3baa9923275d" 
(UID: "8b7841fd-aeda-46f5-a60c-3baa9923275d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: E0121 16:01:48.045409 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61 is running failed: container process not found" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.046926 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.047047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-config-data\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.047554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-logs\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048075 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: E0121 16:01:48.048460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker-log" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048473 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker-log" Jan 21 16:01:48 crc kubenswrapper[4707]: E0121 16:01:48.048483 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048488 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-scripts\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048679 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.048692 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" containerName="barbican-worker-log" Jan 21 16:01:48 crc kubenswrapper[4707]: E0121 16:01:48.048733 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61 is running failed: container process not found" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.049428 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: E0121 16:01:48.049956 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61 is running failed: container process not found" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:01:48 crc kubenswrapper[4707]: E0121 16:01:48.049988 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="ovn-northd" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.053427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.055635 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.056249 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_40dfa261-8913-4472-8f97-ac7b04d1f349/ovn-northd/0.log" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.056322 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.060724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.067927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7841fd-aeda-46f5-a60c-3baa9923275d-kube-api-access-kt6r6" (OuterVolumeSpecName: "kube-api-access-kt6r6") pod "8b7841fd-aeda-46f5-a60c-3baa9923275d" (UID: "8b7841fd-aeda-46f5-a60c-3baa9923275d"). InnerVolumeSpecName "kube-api-access-kt6r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.071893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b7841fd-aeda-46f5-a60c-3baa9923275d" (UID: "8b7841fd-aeda-46f5-a60c-3baa9923275d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.078156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.107243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.121450 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.132956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwph7\" (UniqueName: \"kubernetes.io/projected/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-kube-api-access-xwph7\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.133109 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b7841fd-aeda-46f5-a60c-3baa9923275d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.133123 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.133133 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt6r6\" (UniqueName: \"kubernetes.io/projected/8b7841fd-aeda-46f5-a60c-3baa9923275d-kube-api-access-kt6r6\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.155501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwph7\" (UniqueName: \"kubernetes.io/projected/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-kube-api-access-xwph7\") pod \"glance-default-external-api-0\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.198432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b7841fd-aeda-46f5-a60c-3baa9923275d" (UID: "8b7841fd-aeda-46f5-a60c-3baa9923275d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.203667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.233956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-config\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.234080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-rundir\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.234127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-metrics-certs-tls-certs\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.234190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x59g\" (UniqueName: \"kubernetes.io/projected/40dfa261-8913-4472-8f97-ac7b04d1f349-kube-api-access-2x59g\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.234230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-northd-tls-certs\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.234268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-scripts\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.234365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-combined-ca-bundle\") pod \"40dfa261-8913-4472-8f97-ac7b04d1f349\" (UID: \"40dfa261-8913-4472-8f97-ac7b04d1f349\") " Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.235029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.235075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7r9l\" (UniqueName: \"kubernetes.io/projected/4b535454-f068-42cd-9817-a23bd7d7aa4d-kube-api-access-n7r9l\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 
16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.235130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.235212 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.242860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data" (OuterVolumeSpecName: "config-data") pod "8b7841fd-aeda-46f5-a60c-3baa9923275d" (UID: "8b7841fd-aeda-46f5-a60c-3baa9923275d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.243452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-scripts" (OuterVolumeSpecName: "scripts") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.243977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-config" (OuterVolumeSpecName: "config") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.244609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.254631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dfa261-8913-4472-8f97-ac7b04d1f349-kube-api-access-2x59g" (OuterVolumeSpecName: "kube-api-access-2x59g") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "kube-api-access-2x59g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.293192 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.326987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7r9l\" (UniqueName: \"kubernetes.io/projected/4b535454-f068-42cd-9817-a23bd7d7aa4d-kube-api-access-n7r9l\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339523 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7841fd-aeda-46f5-a60c-3baa9923275d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339536 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339546 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339554 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339563 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x59g\" (UniqueName: \"kubernetes.io/projected/40dfa261-8913-4472-8f97-ac7b04d1f349-kube-api-access-2x59g\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.339571 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dfa261-8913-4472-8f97-ac7b04d1f349-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.345226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.355946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.371444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7r9l\" (UniqueName: \"kubernetes.io/projected/4b535454-f068-42cd-9817-a23bd7d7aa4d-kube-api-access-n7r9l\") pod \"nova-cell1-conductor-0\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.373768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.384053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "40dfa261-8913-4472-8f97-ac7b04d1f349" (UID: "40dfa261-8913-4472-8f97-ac7b04d1f349"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.437800 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.442911 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.442945 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dfa261-8913-4472-8f97-ac7b04d1f349-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.468943 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.535639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.661547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"45062d46-6a83-4a28-bb58-8d2f5a22e270","Type":"ContainerStarted","Data":"0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.665873 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.671449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"1de274a1-6480-4858-948a-d0423dbf7fe8","Type":"ContainerStarted","Data":"4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.671472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"1de274a1-6480-4858-948a-d0423dbf7fe8","Type":"ContainerStarted","Data":"efddc62401e43db9e4271336c16dded86452fba487dc48c3385c4163b4148397"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.672758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7a2c79bc-f09f-4e7e-aa40-80cf2914a696","Type":"ContainerStarted","Data":"756cab360c125f911a47a2a4241a2a0d28caa9459001331a4cb7fd41a7c9ad99"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.679851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerStarted","Data":"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.682904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" event={"ID":"8b7841fd-aeda-46f5-a60c-3baa9923275d","Type":"ContainerDied","Data":"1fa7267d4fddac752419228bc6e6683c444d8561d6dd4edd716260a77f921426"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.682931 4707 scope.go:117] "RemoveContainer" containerID="380a9834052d0f5862f21968659125d6bfa398491f0f1d19cb8337ac171d6977" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.683024 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-75d7748979-s5w96" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.692629 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.703114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"7c469e86-c709-4bd2-986e-fa08e32052b0","Type":"ContainerStarted","Data":"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.741023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerStarted","Data":"82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.763508 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_40dfa261-8913-4472-8f97-ac7b04d1f349/ovn-northd/0.log" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.764229 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.765678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"40dfa261-8913-4472-8f97-ac7b04d1f349","Type":"ContainerDied","Data":"0b60252a0b88b983b9502f95f09710d7e176a0684306d6ebf4a27b605f4aaa75"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.770685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" event={"ID":"08d2edd9-a93d-4bb9-ac1e-6a23494df48c","Type":"ContainerStarted","Data":"5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.771964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.799928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c3caa588-23c8-4f8c-9a2d-4340345a5a70","Type":"ContainerStarted","Data":"9e9e09e5a13714abe5351750b589c12f25e4f4ece7aa12bdb33843b977c27a74"} Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.800003 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.833397 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.837889 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podStartSLOduration=8.837876699 podStartE2EDuration="8.837876699s" podCreationTimestamp="2026-01-21 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:48.800217377 +0000 UTC m=+3605.981733600" watchObservedRunningTime="2026-01-21 16:01:48.837876699 +0000 UTC m=+3606.019392921" Jan 21 16:01:48 crc kubenswrapper[4707]: I0121 16:01:48.910126 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.019636 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: W0121 16:01:49.036912 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d59870_b78f_4c55_8105_ef69996cd835.slice/crio-650728a47337d6e72385a0301a527b643944f08cd0b19fa391a9714e5fd6e8fa WatchSource:0}: Error finding container 650728a47337d6e72385a0301a527b643944f08cd0b19fa391a9714e5fd6e8fa: Status 404 returned error can't find the container with id 650728a47337d6e72385a0301a527b643944f08cd0b19fa391a9714e5fd6e8fa Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.037487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.050862 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.057251 4707 scope.go:117] "RemoveContainer" containerID="9682448c7c3521e5fabf41464b76348e4257e00745c27d84fde49e69a08cce5b" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 
16:01:49.066906 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.067282 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="openstack-network-exporter" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.067294 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="openstack-network-exporter" Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.067323 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="ovn-northd" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.067332 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="ovn-northd" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.067581 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="ovn-northd" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.067605 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" containerName="openstack-network-exporter" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.068637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.077578 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-75d7748979-s5w96"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.077716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.077881 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.077974 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.078934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-jvckj" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.111544 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-75d7748979-s5w96"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.136798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.148996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-config\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-scripts\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.164478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v787j\" (UniqueName: \"kubernetes.io/projected/6363e430-8c44-4cbc-8923-3ed9147d0cde-kube-api-access-v787j\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.253513 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b20a8d_7771_44bf_9ca7_100e04650264.slice/crio-2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7841fd_aeda_46f5_a60c_3baa9923275d.slice/crio-1fa7267d4fddac752419228bc6e6683c444d8561d6dd4edd716260a77f921426\": RecentStats: unable to find data in memory cache]" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.268316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.268359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc 
kubenswrapper[4707]: I0121 16:01:49.268380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.268475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-config\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.268511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-scripts\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.268534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.268564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v787j\" (UniqueName: \"kubernetes.io/projected/6363e430-8c44-4cbc-8923-3ed9147d0cde-kube-api-access-v787j\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.269489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.276121 4707 scope.go:117] "RemoveContainer" containerID="bdec8c944354ba8edd4a696e67ef57315d22a16748e6ca2cf813eb67fe88a679" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.278690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.279433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-config\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.279906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.283280 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.284788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v787j\" (UniqueName: \"kubernetes.io/projected/6363e430-8c44-4cbc-8923-3ed9147d0cde-kube-api-access-v787j\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.286964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-scripts\") pod \"ovn-northd-0\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.289045 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32eaeddc-f8b5-4c3e-a456-f723f0698b30" path="/var/lib/kubelet/pods/32eaeddc-f8b5-4c3e-a456-f723f0698b30/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.289925 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dfa261-8913-4472-8f97-ac7b04d1f349" path="/var/lib/kubelet/pods/40dfa261-8913-4472-8f97-ac7b04d1f349/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.290558 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77eb4906-622c-4eb0-8e20-e91b6d3a8e27" path="/var/lib/kubelet/pods/77eb4906-622c-4eb0-8e20-e91b6d3a8e27/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.291688 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7841fd-aeda-46f5-a60c-3baa9923275d" path="/var/lib/kubelet/pods/8b7841fd-aeda-46f5-a60c-3baa9923275d/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.292263 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f8b72f5-0735-4187-aaec-0fb50df56674" path="/var/lib/kubelet/pods/8f8b72f5-0735-4187-aaec-0fb50df56674/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.300936 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a788c56d-377f-4859-910d-a623173f0a74" path="/var/lib/kubelet/pods/a788c56d-377f-4859-910d-a623173f0a74/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.301731 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a64a9a-c654-4d05-8504-0839ba022204" path="/var/lib/kubelet/pods/b8a64a9a-c654-4d05-8504-0839ba022204/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.302557 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9779e5-4e04-424b-b1e3-388ebf744d02" path="/var/lib/kubelet/pods/fc9779e5-4e04-424b-b1e3-388ebf744d02/volumes" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.308982 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.322555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.331044 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.372935 4707 scope.go:117] "RemoveContainer" containerID="853ba46360b679abbdea0ab01c8132351116a57cfdc441bd1cb1793257177b61" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.475763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mlc8\" (UniqueName: \"kubernetes.io/projected/79b20a8d-7771-44bf-9ca7-100e04650264-kube-api-access-6mlc8\") pod \"79b20a8d-7771-44bf-9ca7-100e04650264\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.476100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b20a8d-7771-44bf-9ca7-100e04650264-logs\") pod \"79b20a8d-7771-44bf-9ca7-100e04650264\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.476122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data\") pod \"79b20a8d-7771-44bf-9ca7-100e04650264\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.476181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-combined-ca-bundle\") pod \"79b20a8d-7771-44bf-9ca7-100e04650264\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.476211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data-custom\") pod \"79b20a8d-7771-44bf-9ca7-100e04650264\" (UID: \"79b20a8d-7771-44bf-9ca7-100e04650264\") " Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.478559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b20a8d-7771-44bf-9ca7-100e04650264-logs" (OuterVolumeSpecName: "logs") pod "79b20a8d-7771-44bf-9ca7-100e04650264" (UID: "79b20a8d-7771-44bf-9ca7-100e04650264"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.486584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79b20a8d-7771-44bf-9ca7-100e04650264" (UID: "79b20a8d-7771-44bf-9ca7-100e04650264"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.515435 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd is running failed: container process not found" containerID="7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.516258 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd is running failed: container process not found" containerID="7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.516559 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd is running failed: container process not found" containerID="7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:01:49 crc kubenswrapper[4707]: E0121 16:01:49.516591 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.517271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b20a8d-7771-44bf-9ca7-100e04650264-kube-api-access-6mlc8" (OuterVolumeSpecName: "kube-api-access-6mlc8") pod "79b20a8d-7771-44bf-9ca7-100e04650264" (UID: "79b20a8d-7771-44bf-9ca7-100e04650264"). InnerVolumeSpecName "kube-api-access-6mlc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.578217 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.578243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mlc8\" (UniqueName: \"kubernetes.io/projected/79b20a8d-7771-44bf-9ca7-100e04650264-kube-api-access-6mlc8\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.578254 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b20a8d-7771-44bf-9ca7-100e04650264-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.581231 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.590157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79b20a8d-7771-44bf-9ca7-100e04650264" (UID: "79b20a8d-7771-44bf-9ca7-100e04650264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.619633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data" (OuterVolumeSpecName: "config-data") pod "79b20a8d-7771-44bf-9ca7-100e04650264" (UID: "79b20a8d-7771-44bf-9ca7-100e04650264"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.680342 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.680591 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b20a8d-7771-44bf-9ca7-100e04650264-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.807384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerStarted","Data":"d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.823926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerStarted","Data":"6297542b32b4ff82c9b3e69fe4587e55a7b0578cb8e7c93f713e581d4d3c7557"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.829082 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=4.829069015 podStartE2EDuration="4.829069015s" podCreationTimestamp="2026-01-21 16:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:49.821378254 +0000 UTC m=+3607.002894476" watchObservedRunningTime="2026-01-21 16:01:49.829069015 +0000 UTC m=+3607.010585237" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.831072 4707 generic.go:334] "Generic (PLEG): container finished" podID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerID="7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd" exitCode=1 Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.831123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerDied","Data":"7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.831678 4707 scope.go:117] "RemoveContainer" containerID="7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.840461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8","Type":"ContainerStarted","Data":"53fd1d91ef1568b5cb9106d4d34b7c158f3820b462f752349f1e523c68ba1441"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.842986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"06f6e827-b699-4aa7-b3bc-6729a224186f","Type":"ContainerStarted","Data":"1e4293add566145e540a7792a9f11a3eebd40c580e70db05bafeff4fac5c5e22"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.853423 4707 generic.go:334] "Generic (PLEG): container finished" podID="79b20a8d-7771-44bf-9ca7-100e04650264" containerID="2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4" exitCode=0 Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.853497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" event={"ID":"79b20a8d-7771-44bf-9ca7-100e04650264","Type":"ContainerDied","Data":"2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.853520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" event={"ID":"79b20a8d-7771-44bf-9ca7-100e04650264","Type":"ContainerDied","Data":"934e32f649efc69ced9c8c6fd367b751076784a0bbe281b9961a0841789c904d"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.853535 4707 scope.go:117] "RemoveContainer" containerID="2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.853639 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.860223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8","Type":"ContainerStarted","Data":"ce8419970875a55a6026044f1bcfa24491b13e9dd25837e7ee133bad0efef6b9"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.861419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerStarted","Data":"650728a47337d6e72385a0301a527b643944f08cd0b19fa391a9714e5fd6e8fa"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.863229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7a2c79bc-f09f-4e7e-aa40-80cf2914a696","Type":"ContainerStarted","Data":"c76746848586d16a8a6d12eabf0bf66bbcce199aa33388ae4e87eb3bdd155156"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.880777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c3caa588-23c8-4f8c-9a2d-4340345a5a70","Type":"ContainerStarted","Data":"202e532715339b4e74f96986e7e29341d437676e0174e845bd4d5cafd176f0c5"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.880827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c3caa588-23c8-4f8c-9a2d-4340345a5a70","Type":"ContainerStarted","Data":"f4e4007ad0d4c5f9910c3506d4489756309ca207796bd3930eac18605903d1db"} Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.892412 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.892399147 podStartE2EDuration="2.892399147s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:49.891580828 +0000 UTC m=+3607.073097050" watchObservedRunningTime="2026-01-21 16:01:49.892399147 +0000 UTC m=+3607.073915369" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.899142 4707 scope.go:117] "RemoveContainer" containerID="22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.909639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.924077 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.924063416 podStartE2EDuration="3.924063416s" podCreationTimestamp="2026-01-21 16:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:49.907555405 +0000 UTC m=+3607.089071627" watchObservedRunningTime="2026-01-21 16:01:49.924063416 +0000 UTC m=+3607.105579638" Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.969900 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm"] Jan 21 16:01:49 crc kubenswrapper[4707]: I0121 16:01:49.995934 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-56cd595886-jldwm"] Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.020270 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.030722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.031264 4707 scope.go:117] "RemoveContainer" containerID="2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.034308 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:01:50 crc kubenswrapper[4707]: E0121 16:01:50.037051 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4\": container with ID starting with 2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4 not found: ID does not exist" containerID="2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.037083 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4"} err="failed to get container status \"2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4\": rpc error: code = NotFound desc = could not find container \"2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4\": container with ID starting with 2a52e6712dfb97e9e45a2a6fceb5dd074c1d5fc9ec770e975f46a1a90bf3eff4 not found: ID does not exist" Jan 21 16:01:50 crc 
kubenswrapper[4707]: I0121 16:01:50.037106 4707 scope.go:117] "RemoveContainer" containerID="22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b" Jan 21 16:01:50 crc kubenswrapper[4707]: E0121 16:01:50.037736 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b\": container with ID starting with 22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b not found: ID does not exist" containerID="22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.037760 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b"} err="failed to get container status \"22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b\": rpc error: code = NotFound desc = could not find container \"22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b\": container with ID starting with 22667e87d75c6e5eca53ee5f06e6f7173ec44a8aa9ce3a9878860ac063c71a1b not found: ID does not exist" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.146490 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.635212 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.647294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.902172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7a2c79bc-f09f-4e7e-aa40-80cf2914a696","Type":"ContainerStarted","Data":"5d26b491314dc3eee1fb728f8d549b40c8203991cf2c9184bca2c9cc244b7041"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.909596 4707 generic.go:334] "Generic (PLEG): container finished" podID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerID="0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5" exitCode=0 Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.909674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"45062d46-6a83-4a28-bb58-8d2f5a22e270","Type":"ContainerDied","Data":"0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.922046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerStarted","Data":"a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.928622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8","Type":"ContainerStarted","Data":"0c91ea7d49d1293494ad3b71e6a2d9ea9e33ccb909db292e4603ce1aa9e9bb4d"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.937750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" 
event={"ID":"6363e430-8c44-4cbc-8923-3ed9147d0cde","Type":"ContainerStarted","Data":"05ec9f6333df97f3167e3d152d12769dc4312df78642cbffed6c3c74e6b24e76"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.937800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6363e430-8c44-4cbc-8923-3ed9147d0cde","Type":"ContainerStarted","Data":"3691aec55f628a7ca9a8989d0678c03a481be77fab6135bf1fef66f4aad67ebd"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.937828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6363e430-8c44-4cbc-8923-3ed9147d0cde","Type":"ContainerStarted","Data":"eb310c830cc7f22a95201abcdae076f8a724e61493eb24fa79d3f70352913353"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.937988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.951251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerStarted","Data":"c4dc4cfcfcba1aa33e6a181bb373be2414594c8e3acc632c81306d951b27e3f9"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.952032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.956949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8","Type":"ContainerStarted","Data":"b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.970788 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"06f6e827-b699-4aa7-b3bc-6729a224186f","Type":"ContainerStarted","Data":"b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.971514 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.9715008809999999 podStartE2EDuration="1.971500881s" podCreationTimestamp="2026-01-21 16:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:50.966315511 +0000 UTC m=+3608.147831733" watchObservedRunningTime="2026-01-21 16:01:50.971500881 +0000 UTC m=+3608.153017102" Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.994773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerStarted","Data":"0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21"} Jan 21 16:01:50 crc kubenswrapper[4707]: I0121 16:01:50.995296 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:50.996409 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="1de274a1-6480-4858-948a-d0423dbf7fe8" containerName="mysql-bootstrap" containerID="cri-o://4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b" gracePeriod=30 
Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.012800 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=4.01278044 podStartE2EDuration="4.01278044s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:50.982575104 +0000 UTC m=+3608.164091326" watchObservedRunningTime="2026-01-21 16:01:51.01278044 +0000 UTC m=+3608.194296662" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.030365 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.030347782 podStartE2EDuration="4.030347782s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:51.002237175 +0000 UTC m=+3608.183753397" watchObservedRunningTime="2026-01-21 16:01:51.030347782 +0000 UTC m=+3608.211864004" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.060278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.072945 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.066515099 podStartE2EDuration="4.066515099s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:51.054725691 +0000 UTC m=+3608.236241913" watchObservedRunningTime="2026-01-21 16:01:51.066515099 +0000 UTC m=+3608.248031321" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.086141 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.211127 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" path="/var/lib/kubelet/pods/79b20a8d-7771-44bf-9ca7-100e04650264/volumes" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.641857 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-combined-ca-bundle\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-operator-scripts\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749213 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgzcf\" (UniqueName: \"kubernetes.io/projected/1de274a1-6480-4858-948a-d0423dbf7fe8-kube-api-access-hgzcf\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-galera-tls-certs\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-kolla-config\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-default\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.749492 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-generated\") pod \"1de274a1-6480-4858-948a-d0423dbf7fe8\" (UID: \"1de274a1-6480-4858-948a-d0423dbf7fe8\") " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.750098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.750183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.750431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.750577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.753936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de274a1-6480-4858-948a-d0423dbf7fe8-kube-api-access-hgzcf" (OuterVolumeSpecName: "kube-api-access-hgzcf") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "kube-api-access-hgzcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.755121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.755894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.756457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "1de274a1-6480-4858-948a-d0423dbf7fe8" (UID: "1de274a1-6480-4858-948a-d0423dbf7fe8"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852155 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852187 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852199 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgzcf\" (UniqueName: \"kubernetes.io/projected/1de274a1-6480-4858-948a-d0423dbf7fe8-kube-api-access-hgzcf\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852207 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1de274a1-6480-4858-948a-d0423dbf7fe8-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852242 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852252 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852261 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.852269 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1de274a1-6480-4858-948a-d0423dbf7fe8-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.869266 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.945652 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5865994c8-cs68b"] Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.946258 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-log" containerID="cri-o://5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d" gracePeriod=30 Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.946309 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-api" containerID="cri-o://e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922" gracePeriod=30 Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.953467 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.954857 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.46:8778/\": EOF" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.980327 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-856c459894-wcndt"] Jan 21 16:01:51 crc kubenswrapper[4707]: E0121 16:01:51.980763 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.980859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener" Jan 21 16:01:51 crc kubenswrapper[4707]: E0121 16:01:51.980951 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de274a1-6480-4858-948a-d0423dbf7fe8" containerName="mysql-bootstrap" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.981004 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de274a1-6480-4858-948a-d0423dbf7fe8" containerName="mysql-bootstrap" Jan 21 16:01:51 crc kubenswrapper[4707]: E0121 16:01:51.981054 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener-log" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.981099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener-log" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.981331 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.981389 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b20a8d-7771-44bf-9ca7-100e04650264" containerName="barbican-keystone-listener-log" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.981435 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de274a1-6480-4858-948a-d0423dbf7fe8" containerName="mysql-bootstrap" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.982352 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:51 crc kubenswrapper[4707]: I0121 16:01:51.990561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-856c459894-wcndt"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.002555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerStarted","Data":"86a2a8ca4b2ba529787c883f220c177764765b546eb04737097e14f5425fe768"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.002596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerStarted","Data":"1edfa4083df3407b1693b03447db0d9cf41cec4d7d408748d3925fc4b1615068"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.004070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8","Type":"ContainerStarted","Data":"7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.033441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"06f6e827-b699-4aa7-b3bc-6729a224186f","Type":"ContainerStarted","Data":"10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.041128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"45062d46-6a83-4a28-bb58-8d2f5a22e270","Type":"ContainerStarted","Data":"f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.046243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8","Type":"ContainerStarted","Data":"03dd9833f6900410aad689e2c52cd17eb9a6b068055cdd52e3ee4b84e822c0c7"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.047344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.054979 4707 generic.go:334] "Generic (PLEG): container finished" podID="1de274a1-6480-4858-948a-d0423dbf7fe8" containerID="4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b" exitCode=0 Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.056196 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.058984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"1de274a1-6480-4858-948a-d0423dbf7fe8","Type":"ContainerDied","Data":"4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.059040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"1de274a1-6480-4858-948a-d0423dbf7fe8","Type":"ContainerDied","Data":"efddc62401e43db9e4271336c16dded86452fba487dc48c3385c4163b4148397"} Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.059060 4707 scope.go:117] "RemoveContainer" containerID="4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-public-tls-certs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-internal-tls-certs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-combined-ca-bundle\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgmt\" (UniqueName: \"kubernetes.io/projected/0164b5d5-db09-4d87-b6cb-2457e1397145-kube-api-access-xkgmt\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-config-data\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0164b5d5-db09-4d87-b6cb-2457e1397145-logs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.060768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-scripts\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.061008 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=7.0609915 podStartE2EDuration="7.0609915s" podCreationTimestamp="2026-01-21 16:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:52.059724998 +0000 UTC m=+3609.241241220" watchObservedRunningTime="2026-01-21 16:01:52.0609915 +0000 UTC m=+3609.242507722" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.095429 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.095417903 podStartE2EDuration="5.095417903s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:52.089480018 +0000 UTC m=+3609.270996240" watchObservedRunningTime="2026-01-21 16:01:52.095417903 +0000 UTC m=+3609.276934125" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.095966 4707 scope.go:117] "RemoveContainer" containerID="4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b" Jan 21 16:01:52 crc kubenswrapper[4707]: E0121 16:01:52.096843 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b\": container with ID starting with 4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b not found: ID does not exist" containerID="4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.096875 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b"} err="failed to get container status \"4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b\": rpc error: code = NotFound desc = could not find container \"4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b\": container with ID starting with 4958b8309509f02ee30ef9785fc579ba641580d4b4adaf53df88181d638b8d5b not found: ID does not exist" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.142336 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.153558 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.162784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-public-tls-certs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.163147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-internal-tls-certs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.163204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-combined-ca-bundle\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.163219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgmt\" (UniqueName: \"kubernetes.io/projected/0164b5d5-db09-4d87-b6cb-2457e1397145-kube-api-access-xkgmt\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.163248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-config-data\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.163423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0164b5d5-db09-4d87-b6cb-2457e1397145-logs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.163511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-scripts\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.171921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0164b5d5-db09-4d87-b6cb-2457e1397145-logs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.173235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-combined-ca-bundle\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.175350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-config-data\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.176965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-internal-tls-certs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.177313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-public-tls-certs\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.177659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-scripts\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.178239 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.180499 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.183032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rch7x" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.183277 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.183403 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.188442 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.188686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgmt\" (UniqueName: \"kubernetes.io/projected/0164b5d5-db09-4d87-b6cb-2457e1397145-kube-api-access-xkgmt\") pod \"placement-856c459894-wcndt\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.188890 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.249723 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-856c459894-wcndt"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.250487 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.279435 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-685994d4cb-zr62v"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.280925 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.288295 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-685994d4cb-zr62v"] Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367393 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.367481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.368340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9nv\" (UniqueName: \"kubernetes.io/projected/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kube-api-access-nw9nv\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.368388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 
16:01:52.368446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9rz\" (UniqueName: \"kubernetes.io/projected/5b206e03-5e3a-4613-8d4a-6108443785cb-kube-api-access-jh9rz\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9nv\" (UniqueName: \"kubernetes.io/projected/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kube-api-access-nw9nv\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-scripts\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-config-data\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-combined-ca-bundle\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b206e03-5e3a-4613-8d4a-6108443785cb-logs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.472007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.472244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.472291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.473760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.477706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.481592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.470952 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.486337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9nv\" (UniqueName: \"kubernetes.io/projected/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kube-api-access-nw9nv\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.512840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.517867 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-config-data\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-combined-ca-bundle\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b206e03-5e3a-4613-8d4a-6108443785cb-logs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9rz\" (UniqueName: \"kubernetes.io/projected/5b206e03-5e3a-4613-8d4a-6108443785cb-kube-api-access-jh9rz\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.572576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-scripts\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.573474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b206e03-5e3a-4613-8d4a-6108443785cb-logs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.577133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-scripts\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " 
pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.578830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.579562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-config-data\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.581173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.582295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-combined-ca-bundle\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.597026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9rz\" (UniqueName: \"kubernetes.io/projected/5b206e03-5e3a-4613-8d4a-6108443785cb-kube-api-access-jh9rz\") pod \"placement-685994d4cb-zr62v\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.602221 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:52 crc kubenswrapper[4707]: E0121 16:01:52.644435 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:42614->192.168.25.165:36655: write tcp 192.168.25.165:42614->192.168.25.165:36655: write: broken pipe Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.706936 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-856c459894-wcndt"] Jan 21 16:01:52 crc kubenswrapper[4707]: W0121 16:01:52.715475 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0164b5d5_db09_4d87_b6cb_2457e1397145.slice/crio-7eca49ce8f750e0af4725624cc2b83b748215789499a3cccdb3c549706022798 WatchSource:0}: Error finding container 7eca49ce8f750e0af4725624cc2b83b748215789499a3cccdb3c549706022798: Status 404 returned error can't find the container with id 7eca49ce8f750e0af4725624cc2b83b748215789499a3cccdb3c549706022798 Jan 21 16:01:52 crc kubenswrapper[4707]: I0121 16:01:52.764695 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:01:52 crc kubenswrapper[4707]: W0121 16:01:52.770235 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod072de5b8_01f6_48ea_b4cf_751f3a8a449e.slice/crio-436bda71347d55b3eb36e67c7983d73a9afd4257b1ace2cf6c1327852a29b33b WatchSource:0}: Error finding container 436bda71347d55b3eb36e67c7983d73a9afd4257b1ace2cf6c1327852a29b33b: Status 404 returned error can't find the container with id 436bda71347d55b3eb36e67c7983d73a9afd4257b1ace2cf6c1327852a29b33b Jan 21 16:01:53 crc kubenswrapper[4707]: W0121 16:01:53.057121 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b206e03_5e3a_4613_8d4a_6108443785cb.slice/crio-35794f68ad94b71e66696f834df51416b6a3bc46e7f01a9e1111db3cf9aa9e72 WatchSource:0}: Error finding container 35794f68ad94b71e66696f834df51416b6a3bc46e7f01a9e1111db3cf9aa9e72: Status 404 returned error can't find the container with id 35794f68ad94b71e66696f834df51416b6a3bc46e7f01a9e1111db3cf9aa9e72 Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.058223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-685994d4cb-zr62v"] Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.065696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-856c459894-wcndt" event={"ID":"0164b5d5-db09-4d87-b6cb-2457e1397145","Type":"ContainerStarted","Data":"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.065888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-856c459894-wcndt" event={"ID":"0164b5d5-db09-4d87-b6cb-2457e1397145","Type":"ContainerStarted","Data":"7eca49ce8f750e0af4725624cc2b83b748215789499a3cccdb3c549706022798"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.067744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"072de5b8-01f6-48ea-b4cf-751f3a8a449e","Type":"ContainerStarted","Data":"7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.067923 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"072de5b8-01f6-48ea-b4cf-751f3a8a449e","Type":"ContainerStarted","Data":"436bda71347d55b3eb36e67c7983d73a9afd4257b1ace2cf6c1327852a29b33b"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.076622 4707 generic.go:334] "Generic (PLEG): container finished" podID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerID="0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21" exitCode=1 Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.076782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerDied","Data":"0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.076903 4707 scope.go:117] "RemoveContainer" containerID="7c55ba536e080edde206def3a191146ce495626937fb76cabe97f1bf8a1973fd" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.077872 4707 scope.go:117] "RemoveContainer" containerID="0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21" Jan 21 16:01:53 crc kubenswrapper[4707]: E0121 16:01:53.078239 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(4af60f39-57ca-4595-be87-1d1a0c869d72)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.099195 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerID="c4dc4cfcfcba1aa33e6a181bb373be2414594c8e3acc632c81306d951b27e3f9" exitCode=1 Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.099240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerDied","Data":"c4dc4cfcfcba1aa33e6a181bb373be2414594c8e3acc632c81306d951b27e3f9"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.100146 4707 scope.go:117] "RemoveContainer" containerID="c4dc4cfcfcba1aa33e6a181bb373be2414594c8e3acc632c81306d951b27e3f9" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.105627 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7d28e68-71cb-476f-a831-5536b2686514" containerID="5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d" exitCode=143 Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.107131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" event={"ID":"d7d28e68-71cb-476f-a831-5536b2686514","Type":"ContainerDied","Data":"5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d"} Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.207482 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de274a1-6480-4858-948a-d0423dbf7fe8" path="/var/lib/kubelet/pods/1de274a1-6480-4858-948a-d0423dbf7fe8/volumes" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.467444 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.485760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.634966 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.668775 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.908841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-687544f45f-6qmlx"] Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.909074 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" podUID="b2395d51-a312-47ba-9136-87fc6e74bf2c" containerName="keystone-api" containerID="cri-o://32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78" gracePeriod=30 Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.919670 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" podUID="b2395d51-a312-47ba-9136-87fc6e74bf2c" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.49:5000/v3\": EOF" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.968822 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp"] Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.976795 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:53 crc kubenswrapper[4707]: I0121 16:01:53.990705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp"] Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-credential-keys\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-config-data\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-scripts\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8q62\" (UniqueName: \"kubernetes.io/projected/30ccb1b1-f7df-49e1-b264-546fe96b02c0-kube-api-access-m8q62\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107600 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-public-tls-certs\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-combined-ca-bundle\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-fernet-keys\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.107790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.124217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" event={"ID":"5b206e03-5e3a-4613-8d4a-6108443785cb","Type":"ContainerStarted","Data":"d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0"} Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.124256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" event={"ID":"5b206e03-5e3a-4613-8d4a-6108443785cb","Type":"ContainerStarted","Data":"484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4"} Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.124270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" event={"ID":"5b206e03-5e3a-4613-8d4a-6108443785cb","Type":"ContainerStarted","Data":"35794f68ad94b71e66696f834df51416b6a3bc46e7f01a9e1111db3cf9aa9e72"} Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.125285 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.125337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.133226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerStarted","Data":"4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18"} Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.133705 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.139915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-856c459894-wcndt" event={"ID":"0164b5d5-db09-4d87-b6cb-2457e1397145","Type":"ContainerStarted","Data":"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32"} Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.140067 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-856c459894-wcndt" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-log" containerID="cri-o://8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171" gracePeriod=30 Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.140584 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.140612 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.140643 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-856c459894-wcndt" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-api" containerID="cri-o://4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32" gracePeriod=30 Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.155883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerStarted","Data":"645bad01275e738dbf202604018070879944290b95b9bc8a2aab9d80b8e889b5"} Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.155906 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="a5c43f7c-ca96-403b-b852-675dee96ce9c" containerName="memcached" containerID="cri-o://8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee" gracePeriod=30 Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.165919 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" podStartSLOduration=2.165907731 podStartE2EDuration="2.165907731s" podCreationTimestamp="2026-01-21 16:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:54.146291396 +0000 UTC m=+3611.327807618" watchObservedRunningTime="2026-01-21 16:01:54.165907731 +0000 UTC m=+3611.347423952" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.183846 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.190261 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-856c459894-wcndt" podStartSLOduration=3.190241066 podStartE2EDuration="3.190241066s" podCreationTimestamp="2026-01-21 16:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:54.178273543 +0000 UTC m=+3611.359789764" watchObservedRunningTime="2026-01-21 16:01:54.190241066 +0000 UTC m=+3611.371757288" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210578 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.203327654 
podStartE2EDuration="7.210564751s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="2026-01-21 16:01:49.057124336 +0000 UTC m=+3606.238640559" lastFinishedPulling="2026-01-21 16:01:53.064361434 +0000 UTC m=+3610.245877656" observedRunningTime="2026-01-21 16:01:54.20446973 +0000 UTC m=+3611.385985952" watchObservedRunningTime="2026-01-21 16:01:54.210564751 +0000 UTC m=+3611.392080973" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-combined-ca-bundle\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-fernet-keys\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-credential-keys\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-config-data\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-scripts\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.210968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8q62\" (UniqueName: \"kubernetes.io/projected/30ccb1b1-f7df-49e1-b264-546fe96b02c0-kube-api-access-m8q62\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.211059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-public-tls-certs\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " 
pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.215754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-config-data\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.215975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-public-tls-certs\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.217211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.221282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-scripts\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.223276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-combined-ca-bundle\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.229156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-credential-keys\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.230689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8q62\" (UniqueName: \"kubernetes.io/projected/30ccb1b1-f7df-49e1-b264-546fe96b02c0-kube-api-access-m8q62\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.235484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-fernet-keys\") pod \"keystone-7b8fc55dc7-dn5dp\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.298453 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.704225 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp"] Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.713016 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.843611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-combined-ca-bundle\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.843667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgmt\" (UniqueName: \"kubernetes.io/projected/0164b5d5-db09-4d87-b6cb-2457e1397145-kube-api-access-xkgmt\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.843723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0164b5d5-db09-4d87-b6cb-2457e1397145-logs\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.843869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-internal-tls-certs\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.843993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-config-data\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.844043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-public-tls-certs\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.844074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-scripts\") pod \"0164b5d5-db09-4d87-b6cb-2457e1397145\" (UID: \"0164b5d5-db09-4d87-b6cb-2457e1397145\") " Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.845036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0164b5d5-db09-4d87-b6cb-2457e1397145-logs" (OuterVolumeSpecName: "logs") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.862928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-scripts" (OuterVolumeSpecName: "scripts") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.874341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0164b5d5-db09-4d87-b6cb-2457e1397145-kube-api-access-xkgmt" (OuterVolumeSpecName: "kube-api-access-xkgmt") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "kube-api-access-xkgmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.946654 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.946685 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgmt\" (UniqueName: \"kubernetes.io/projected/0164b5d5-db09-4d87-b6cb-2457e1397145-kube-api-access-xkgmt\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.946697 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0164b5d5-db09-4d87-b6cb-2457e1397145-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:54 crc kubenswrapper[4707]: I0121 16:01:54.949963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.008424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.034936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-config-data" (OuterVolumeSpecName: "config-data") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.048065 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.048094 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.048108 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.052663 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0164b5d5-db09-4d87-b6cb-2457e1397145" (UID: "0164b5d5-db09-4d87-b6cb-2457e1397145"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.082940 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.082978 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.149974 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0164b5d5-db09-4d87-b6cb-2457e1397145-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167000 4707 generic.go:334] "Generic (PLEG): container finished" podID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerID="4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32" exitCode=0 Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167051 4707 generic.go:334] "Generic (PLEG): container finished" podID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerID="8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171" exitCode=143 Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167065 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-856c459894-wcndt" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-856c459894-wcndt" event={"ID":"0164b5d5-db09-4d87-b6cb-2457e1397145","Type":"ContainerDied","Data":"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32"} Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-856c459894-wcndt" event={"ID":"0164b5d5-db09-4d87-b6cb-2457e1397145","Type":"ContainerDied","Data":"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171"} Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-856c459894-wcndt" event={"ID":"0164b5d5-db09-4d87-b6cb-2457e1397145","Type":"ContainerDied","Data":"7eca49ce8f750e0af4725624cc2b83b748215789499a3cccdb3c549706022798"} Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.167222 4707 scope.go:117] "RemoveContainer" containerID="4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.168481 4707 generic.go:334] "Generic (PLEG): container finished" podID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerID="7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268" exitCode=0 Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.168530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"072de5b8-01f6-48ea-b4cf-751f3a8a449e","Type":"ContainerDied","Data":"7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268"} Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.171694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" event={"ID":"30ccb1b1-f7df-49e1-b264-546fe96b02c0","Type":"ContainerStarted","Data":"2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8"} Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.171756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" event={"ID":"30ccb1b1-f7df-49e1-b264-546fe96b02c0","Type":"ContainerStarted","Data":"8b18577448492fa9c776041b61c122ff6298ba44141f220acc90e3f28a5c07f5"} Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.172329 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.201635 4707 scope.go:117] "RemoveContainer" containerID="8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.214389 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" podStartSLOduration=2.214371907 podStartE2EDuration="2.214371907s" podCreationTimestamp="2026-01-21 16:01:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:55.208492191 +0000 UTC m=+3612.390008413" watchObservedRunningTime="2026-01-21 16:01:55.214371907 +0000 UTC m=+3612.395888128" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.250332 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-856c459894-wcndt"] 
Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.258069 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-856c459894-wcndt"] Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.261941 4707 scope.go:117] "RemoveContainer" containerID="4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32" Jan 21 16:01:55 crc kubenswrapper[4707]: E0121 16:01:55.262334 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32\": container with ID starting with 4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32 not found: ID does not exist" containerID="4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.262365 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32"} err="failed to get container status \"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32\": rpc error: code = NotFound desc = could not find container \"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32\": container with ID starting with 4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32 not found: ID does not exist" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.262383 4707 scope.go:117] "RemoveContainer" containerID="8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171" Jan 21 16:01:55 crc kubenswrapper[4707]: E0121 16:01:55.263639 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171\": container with ID starting with 8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171 not found: ID does not exist" containerID="8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.263681 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171"} err="failed to get container status \"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171\": rpc error: code = NotFound desc = could not find container \"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171\": container with ID starting with 8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171 not found: ID does not exist" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.263694 4707 scope.go:117] "RemoveContainer" containerID="4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.263958 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32"} err="failed to get container status \"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32\": rpc error: code = NotFound desc = could not find container \"4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32\": container with ID starting with 4f1cda0e3253a326954b8bda79d26c74d1b42816fcb8b71d78c6f3556c7d0b32 not found: ID does not exist" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.263976 4707 scope.go:117] "RemoveContainer" 
containerID="8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.264231 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171"} err="failed to get container status \"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171\": rpc error: code = NotFound desc = could not find container \"8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171\": container with ID starting with 8c15d7c3d551f7447d6e1b2bb2ee9e8afabe8200847a713053763669d8bb6171 not found: ID does not exist" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.513066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.513782 4707 scope.go:117] "RemoveContainer" containerID="0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21" Jan 21 16:01:55 crc kubenswrapper[4707]: E0121 16:01:55.514126 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(4af60f39-57ca-4595-be87-1d1a0c869d72)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.635439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.665731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.761862 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.870662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-memcached-tls-certs\") pod \"a5c43f7c-ca96-403b-b852-675dee96ce9c\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.870767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-config-data\") pod \"a5c43f7c-ca96-403b-b852-675dee96ce9c\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.870888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98lnd\" (UniqueName: \"kubernetes.io/projected/a5c43f7c-ca96-403b-b852-675dee96ce9c-kube-api-access-98lnd\") pod \"a5c43f7c-ca96-403b-b852-675dee96ce9c\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.871041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-combined-ca-bundle\") pod \"a5c43f7c-ca96-403b-b852-675dee96ce9c\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.871066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-kolla-config\") pod \"a5c43f7c-ca96-403b-b852-675dee96ce9c\" (UID: \"a5c43f7c-ca96-403b-b852-675dee96ce9c\") " Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.871842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a5c43f7c-ca96-403b-b852-675dee96ce9c" (UID: "a5c43f7c-ca96-403b-b852-675dee96ce9c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.872766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-config-data" (OuterVolumeSpecName: "config-data") pod "a5c43f7c-ca96-403b-b852-675dee96ce9c" (UID: "a5c43f7c-ca96-403b-b852-675dee96ce9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.891043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c43f7c-ca96-403b-b852-675dee96ce9c-kube-api-access-98lnd" (OuterVolumeSpecName: "kube-api-access-98lnd") pod "a5c43f7c-ca96-403b-b852-675dee96ce9c" (UID: "a5c43f7c-ca96-403b-b852-675dee96ce9c"). InnerVolumeSpecName "kube-api-access-98lnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.897499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5c43f7c-ca96-403b-b852-675dee96ce9c" (UID: "a5c43f7c-ca96-403b-b852-675dee96ce9c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.922050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "a5c43f7c-ca96-403b-b852-675dee96ce9c" (UID: "a5c43f7c-ca96-403b-b852-675dee96ce9c"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.973006 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98lnd\" (UniqueName: \"kubernetes.io/projected/a5c43f7c-ca96-403b-b852-675dee96ce9c-kube-api-access-98lnd\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.973037 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.973047 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.973056 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c43f7c-ca96-403b-b852-675dee96ce9c-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:55 crc kubenswrapper[4707]: I0121 16:01:55.973065 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5c43f7c-ca96-403b-b852-675dee96ce9c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.055580 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.055622 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.073913 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.073948 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.114190 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.185428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" 
event={"ID":"072de5b8-01f6-48ea-b4cf-751f3a8a449e","Type":"ContainerStarted","Data":"6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02"} Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.187133 4707 generic.go:334] "Generic (PLEG): container finished" podID="a5c43f7c-ca96-403b-b852-675dee96ce9c" containerID="8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee" exitCode=0 Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.187178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a5c43f7c-ca96-403b-b852-675dee96ce9c","Type":"ContainerDied","Data":"8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee"} Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.187195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"a5c43f7c-ca96-403b-b852-675dee96ce9c","Type":"ContainerDied","Data":"12be752356156ba59ab4b887cf7857e8f5dbfe4de59bc40d4746b70f172728d3"} Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.187209 4707 scope.go:117] "RemoveContainer" containerID="8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.187275 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.191343 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerID="4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18" exitCode=1 Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.192451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerDied","Data":"4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18"} Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.192514 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.193175 4707 scope.go:117] "RemoveContainer" containerID="4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18" Jan 21 16:01:56 crc kubenswrapper[4707]: E0121 16:01:56.193394 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(4b535454-f068-42cd-9817-a23bd7d7aa4d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.210218 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=4.210201963 podStartE2EDuration="4.210201963s" podCreationTimestamp="2026-01-21 16:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:56.206332317 +0000 UTC m=+3613.387848538" watchObservedRunningTime="2026-01-21 16:01:56.210201963 +0000 UTC m=+3613.391718186" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.210779 4707 scope.go:117] "RemoveContainer" 
containerID="8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee" Jan 21 16:01:56 crc kubenswrapper[4707]: E0121 16:01:56.212477 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee\": container with ID starting with 8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee not found: ID does not exist" containerID="8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.212504 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee"} err="failed to get container status \"8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee\": rpc error: code = NotFound desc = could not find container \"8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee\": container with ID starting with 8afd64faa2a42cfd192ec5488cfc79d8fa8369738e1e49586c7f6ce6e56f79ee not found: ID does not exist" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.212522 4707 scope.go:117] "RemoveContainer" containerID="c4dc4cfcfcba1aa33e6a181bb373be2414594c8e3acc632c81306d951b27e3f9" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.236205 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.248061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.253445 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.275876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:56 crc kubenswrapper[4707]: E0121 16:01:56.276316 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c43f7c-ca96-403b-b852-675dee96ce9c" containerName="memcached" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.276400 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c43f7c-ca96-403b-b852-675dee96ce9c" containerName="memcached" Jan 21 16:01:56 crc kubenswrapper[4707]: E0121 16:01:56.276416 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-log" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.276421 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-log" Jan 21 16:01:56 crc kubenswrapper[4707]: E0121 16:01:56.276429 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-api" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.276435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-api" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.276609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-api" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.276626 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c43f7c-ca96-403b-b852-675dee96ce9c" containerName="memcached" Jan 21 
16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.276640 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" containerName="placement-log" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.277489 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.283234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.283422 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.283548 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-tgjb6" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.301285 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.330760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.383365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kolla-config\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.383422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.383452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.383606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-config-data\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.383832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fng8\" (UniqueName: \"kubernetes.io/projected/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kube-api-access-4fng8\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.485388 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kolla-config\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " 
pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.485443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.485498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.485649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-config-data\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.486134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fng8\" (UniqueName: \"kubernetes.io/projected/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kube-api-access-4fng8\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.486683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-config-data\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.486691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kolla-config\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.489538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.490718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.498619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fng8\" (UniqueName: \"kubernetes.io/projected/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kube-api-access-4fng8\") pod \"memcached-0\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:56 crc kubenswrapper[4707]: I0121 16:01:56.615253 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:56.997868 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.190603 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0164b5d5-db09-4d87-b6cb-2457e1397145" path="/var/lib/kubelet/pods/0164b5d5-db09-4d87-b6cb-2457e1397145/volumes" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.191294 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c43f7c-ca96-403b-b852-675dee96ce9c" path="/var/lib/kubelet/pods/a5c43f7c-ca96-403b-b852-675dee96ce9c/volumes" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.201634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"05aad5e9-0e3a-47b4-a11f-feca99ab8dac","Type":"ContainerStarted","Data":"0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd"} Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.201662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"05aad5e9-0e3a-47b4-a11f-feca99ab8dac","Type":"ContainerStarted","Data":"8a2d91f4d891393fd3b72798b84a31ba66390e1f2c4b9b0b7e2a4107ee091659"} Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.201821 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.221158 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.221143714 podStartE2EDuration="1.221143714s" podCreationTimestamp="2026-01-21 16:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:57.213376108 +0000 UTC m=+3614.394892320" watchObservedRunningTime="2026-01-21 16:01:57.221143714 +0000 UTC m=+3614.402659936" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.368405 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.368442 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.369739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.471202 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.512458 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5b545794dd-j668n"] Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.512644 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api-log" containerID="cri-o://0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9" gracePeriod=30 Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.512984 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api" containerID="cri-o://3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064" gracePeriod=30 Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.667122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.667775 4707 scope.go:117] "RemoveContainer" containerID="4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18" Jan 21 16:01:57 crc kubenswrapper[4707]: E0121 16:01:57.668126 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(4b535454-f068-42cd-9817-a23bd7d7aa4d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.907650 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:57 crc kubenswrapper[4707]: I0121 16:01:57.907876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.123105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.123163 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.154853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.164753 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.209272 4707 generic.go:334] "Generic (PLEG): container finished" podID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerID="0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9" exitCode=143 Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.209961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" event={"ID":"e82ec0cb-eb14-411c-a7a7-6d231875fe6e","Type":"ContainerDied","Data":"0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9"} Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.210018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.210037 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.377934 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.59:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.377954 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.59:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.438856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.438899 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.474357 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.478856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.842246 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.917936 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.61:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.917970 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.61:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.950386 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.950614 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-central-agent" containerID="cri-o://a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36" gracePeriod=30 Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.950991 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="proxy-httpd" containerID="cri-o://645bad01275e738dbf202604018070879944290b95b9bc8a2aab9d80b8e889b5" gracePeriod=30 Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.951045 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="sg-core" containerID="cri-o://86a2a8ca4b2ba529787c883f220c177764765b546eb04737097e14f5425fe768" gracePeriod=30 Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.951080 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" 
podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-notification-agent" containerID="cri-o://1edfa4083df3407b1693b03447db0d9cf41cec4d7d408748d3925fc4b1615068" gracePeriod=30 Jan 21 16:01:58 crc kubenswrapper[4707]: I0121 16:01:58.992157 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerName="galera" containerID="cri-o://f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592" gracePeriod=30 Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.284776 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9d59870-b78f-4c55-8105-ef69996cd835" containerID="645bad01275e738dbf202604018070879944290b95b9bc8a2aab9d80b8e889b5" exitCode=0 Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.284837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerDied","Data":"645bad01275e738dbf202604018070879944290b95b9bc8a2aab9d80b8e889b5"} Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.284869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerDied","Data":"86a2a8ca4b2ba529787c883f220c177764765b546eb04737097e14f5425fe768"} Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.284886 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9d59870-b78f-4c55-8105-ef69996cd835" containerID="86a2a8ca4b2ba529787c883f220c177764765b546eb04737097e14f5425fe768" exitCode=2 Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.286220 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.286245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:01:59 crc kubenswrapper[4707]: E0121 16:01:59.582547 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d59870_b78f_4c55_8105_ef69996cd835.slice/crio-a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d59870_b78f_4c55_8105_ef69996cd835.slice/crio-conmon-a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.934155 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:01:59 crc kubenswrapper[4707]: I0121 16:01:59.951129 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.092984 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.093026 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-config-data\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-public-tls-certs\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-galera-tls-certs\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-kolla-config\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-internal-tls-certs\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-operator-scripts\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-combined-ca-bundle\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-generated\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-credential-keys\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-combined-ca-bundle\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-default\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098548 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22wpg\" (UniqueName: \"kubernetes.io/projected/b2395d51-a312-47ba-9136-87fc6e74bf2c-kube-api-access-22wpg\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-fernet-keys\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-scripts\") pod \"b2395d51-a312-47ba-9136-87fc6e74bf2c\" (UID: \"b2395d51-a312-47ba-9136-87fc6e74bf2c\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.098682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtrjp\" (UniqueName: \"kubernetes.io/projected/45062d46-6a83-4a28-bb58-8d2f5a22e270-kube-api-access-qtrjp\") pod \"45062d46-6a83-4a28-bb58-8d2f5a22e270\" (UID: \"45062d46-6a83-4a28-bb58-8d2f5a22e270\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.100380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.102948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.107451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.108370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.109013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.113389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45062d46-6a83-4a28-bb58-8d2f5a22e270-kube-api-access-qtrjp" (OuterVolumeSpecName: "kube-api-access-qtrjp") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "kube-api-access-qtrjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.118757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-scripts" (OuterVolumeSpecName: "scripts") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.127786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2395d51-a312-47ba-9136-87fc6e74bf2c-kube-api-access-22wpg" (OuterVolumeSpecName: "kube-api-access-22wpg") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "kube-api-access-22wpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.132267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.152894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.176726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.177841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.178078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.178987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "45062d46-6a83-4a28-bb58-8d2f5a22e270" (UID: "45062d46-6a83-4a28-bb58-8d2f5a22e270"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.181529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-config-data" (OuterVolumeSpecName: "config-data") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.191583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b2395d51-a312-47ba-9136-87fc6e74bf2c" (UID: "b2395d51-a312-47ba-9136-87fc6e74bf2c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201533 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201556 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtrjp\" (UniqueName: \"kubernetes.io/projected/45062d46-6a83-4a28-bb58-8d2f5a22e270-kube-api-access-qtrjp\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201585 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201594 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201605 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201613 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201620 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201629 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201637 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201645 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45062d46-6a83-4a28-bb58-8d2f5a22e270-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201655 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201663 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201671 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 
16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201678 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45062d46-6a83-4a28-bb58-8d2f5a22e270-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201686 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22wpg\" (UniqueName: \"kubernetes.io/projected/b2395d51-a312-47ba-9136-87fc6e74bf2c-kube-api-access-22wpg\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.201694 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2395d51-a312-47ba-9136-87fc6e74bf2c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.218635 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.299251 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2395d51-a312-47ba-9136-87fc6e74bf2c" containerID="32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78" exitCode=0 Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.299322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" event={"ID":"b2395d51-a312-47ba-9136-87fc6e74bf2c","Type":"ContainerDied","Data":"32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78"} Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.299351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" event={"ID":"b2395d51-a312-47ba-9136-87fc6e74bf2c","Type":"ContainerDied","Data":"3dbd485ddc9bfa70a1a342ac567d2dfd4359a73aa0280b6102ad6e894baab869"} Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.299372 4707 scope.go:117] "RemoveContainer" containerID="32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.299421 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-687544f45f-6qmlx" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.302678 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.307132 4707 generic.go:334] "Generic (PLEG): container finished" podID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerID="f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592" exitCode=0 Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.307187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"45062d46-6a83-4a28-bb58-8d2f5a22e270","Type":"ContainerDied","Data":"f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592"} Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.307211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"45062d46-6a83-4a28-bb58-8d2f5a22e270","Type":"ContainerDied","Data":"736a7487b8094d2d9cd3aaae926443f542829fc73205aeff4b7a22993b5d4df4"} Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.307263 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.336915 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9d59870-b78f-4c55-8105-ef69996cd835" containerID="1edfa4083df3407b1693b03447db0d9cf41cec4d7d408748d3925fc4b1615068" exitCode=0 Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.336942 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9d59870-b78f-4c55-8105-ef69996cd835" containerID="a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36" exitCode=0 Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.336971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerDied","Data":"1edfa4083df3407b1693b03447db0d9cf41cec4d7d408748d3925fc4b1615068"} Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.337005 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.337008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerDied","Data":"a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36"} Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.337014 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.353199 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-687544f45f-6qmlx"] Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.358878 4707 scope.go:117] "RemoveContainer" containerID="32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78" Jan 21 16:02:00 crc kubenswrapper[4707]: E0121 16:02:00.361365 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78\": container with ID starting with 32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78 not found: ID does not exist" 
containerID="32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.361415 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78"} err="failed to get container status \"32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78\": rpc error: code = NotFound desc = could not find container \"32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78\": container with ID starting with 32f4dcb1442b8ad3fb93fbd79ce2381bcdfd85fd79b9ec3715a220a1da27dd78 not found: ID does not exist" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.361437 4707 scope.go:117] "RemoveContainer" containerID="f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.361545 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-687544f45f-6qmlx"] Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.379602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.402863 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.427249 4707 scope.go:117] "RemoveContainer" containerID="0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.436669 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:02:00 crc kubenswrapper[4707]: E0121 16:02:00.437081 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerName="mysql-bootstrap" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.437099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerName="mysql-bootstrap" Jan 21 16:02:00 crc kubenswrapper[4707]: E0121 16:02:00.437112 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerName="galera" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.437119 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerName="galera" Jan 21 16:02:00 crc kubenswrapper[4707]: E0121 16:02:00.437150 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2395d51-a312-47ba-9136-87fc6e74bf2c" containerName="keystone-api" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.437171 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2395d51-a312-47ba-9136-87fc6e74bf2c" containerName="keystone-api" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.437374 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" containerName="galera" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.437401 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2395d51-a312-47ba-9136-87fc6e74bf2c" containerName="keystone-api" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.438335 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.444351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.444490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-6glg2" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.444591 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.444692 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.452427 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.465899 4707 scope.go:117] "RemoveContainer" containerID="f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592" Jan 21 16:02:00 crc kubenswrapper[4707]: E0121 16:02:00.467256 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592\": container with ID starting with f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592 not found: ID does not exist" containerID="f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.467284 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592"} err="failed to get container status \"f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592\": rpc error: code = NotFound desc = could not find container \"f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592\": container with ID starting with f059f74bbeef163733d74aad4cac02895919d7a14faa84eb71f9dbd67844b592 not found: ID does not exist" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.467316 4707 scope.go:117] "RemoveContainer" containerID="0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5" Jan 21 16:02:00 crc kubenswrapper[4707]: E0121 16:02:00.469079 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5\": container with ID starting with 0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5 not found: ID does not exist" containerID="0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.469104 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5"} err="failed to get container status \"0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5\": rpc error: code = NotFound desc = could not find container \"0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5\": container with ID starting with 0ff428a0cc8b90dc27cf3851eadc9cb315e616d820a8b7c23c1597a73bbc85e5 not found: ID does not exist" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.508206 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.608671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.608779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kolla-config\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.608797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-default\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.608851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bdvw\" (UniqueName: \"kubernetes.io/projected/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kube-api-access-2bdvw\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.608904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.609005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.609052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.609079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.648714 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.691919 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.56:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-config-data\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-sg-core-conf-yaml\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tftxj\" (UniqueName: \"kubernetes.io/projected/e9d59870-b78f-4c55-8105-ef69996cd835-kube-api-access-tftxj\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-combined-ca-bundle\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-scripts\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-run-httpd\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-ceilometer-tls-certs\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.710836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-log-httpd\") pod \"e9d59870-b78f-4c55-8105-ef69996cd835\" (UID: \"e9d59870-b78f-4c55-8105-ef69996cd835\") " Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-default\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bdvw\" (UniqueName: \"kubernetes.io/projected/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kube-api-access-2bdvw\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.711475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.712725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.714085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-default\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.714563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.714803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.715711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kolla-config\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.716124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d59870-b78f-4c55-8105-ef69996cd835-kube-api-access-tftxj" (OuterVolumeSpecName: "kube-api-access-tftxj") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "kube-api-access-tftxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.716133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.716280 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.716805 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-scripts" (OuterVolumeSpecName: "scripts") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.721281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.736360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.737334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bdvw\" (UniqueName: \"kubernetes.io/projected/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kube-api-access-2bdvw\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.751489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.753397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.764882 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.800635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.800683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.811105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-config-data" (OuterVolumeSpecName: "config-data") pod "e9d59870-b78f-4c55-8105-ef69996cd835" (UID: "e9d59870-b78f-4c55-8105-ef69996cd835"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818045 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818071 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818082 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tftxj\" (UniqueName: \"kubernetes.io/projected/e9d59870-b78f-4c55-8105-ef69996cd835-kube-api-access-tftxj\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818091 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818099 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818107 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818114 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d59870-b78f-4c55-8105-ef69996cd835-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.818122 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9d59870-b78f-4c55-8105-ef69996cd835-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.985346 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 16:02:00 crc kubenswrapper[4707]: I0121 16:02:00.987338 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-combined-ca-bundle\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-internal-tls-certs\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-public-tls-certs\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79k2p\" (UniqueName: \"kubernetes.io/projected/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-kube-api-access-79k2p\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data-custom\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.140727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-logs\") pod \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\" (UID: \"e82ec0cb-eb14-411c-a7a7-6d231875fe6e\") " Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.142881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-logs" (OuterVolumeSpecName: "logs") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.147618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-kube-api-access-79k2p" (OuterVolumeSpecName: "kube-api-access-79k2p") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). 
InnerVolumeSpecName "kube-api-access-79k2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.149964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.169210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.194208 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45062d46-6a83-4a28-bb58-8d2f5a22e270" path="/var/lib/kubelet/pods/45062d46-6a83-4a28-bb58-8d2f5a22e270/volumes" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.195051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.195356 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2395d51-a312-47ba-9136-87fc6e74bf2c" path="/var/lib/kubelet/pods/b2395d51-a312-47ba-9136-87fc6e74bf2c/volumes" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.202250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.210797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data" (OuterVolumeSpecName: "config-data") pod "e82ec0cb-eb14-411c-a7a7-6d231875fe6e" (UID: "e82ec0cb-eb14-411c-a7a7-6d231875fe6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.243799 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.243945 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.244042 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.244131 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.244213 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79k2p\" (UniqueName: \"kubernetes.io/projected/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-kube-api-access-79k2p\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.244284 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.244408 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ec0cb-eb14-411c-a7a7-6d231875fe6e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.256026 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.319203 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.46:8778/\": EOF" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.351470 4707 generic.go:334] "Generic (PLEG): container finished" podID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerID="3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064" exitCode=0 Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.351535 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.351545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" event={"ID":"e82ec0cb-eb14-411c-a7a7-6d231875fe6e","Type":"ContainerDied","Data":"3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064"} Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.351680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5b545794dd-j668n" event={"ID":"e82ec0cb-eb14-411c-a7a7-6d231875fe6e","Type":"ContainerDied","Data":"d2cc4e8519a8e80f74f07e008baf76b9206d7734490a1bcc720f4b0f977d3122"} Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.351701 4707 scope.go:117] "RemoveContainer" containerID="3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.353780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e9d59870-b78f-4c55-8105-ef69996cd835","Type":"ContainerDied","Data":"650728a47337d6e72385a0301a527b643944f08cd0b19fa391a9714e5fd6e8fa"} Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.353864 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.357140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9","Type":"ContainerStarted","Data":"2289bcf020b0a530c16c6e70dc95b796dbbed43595c00ae3006d390f0d6b0fa6"} Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.357165 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.357201 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.390608 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.398731 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.399762 4707 scope.go:117] "RemoveContainer" containerID="0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.407836 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5b545794dd-j668n"] Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.414375 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5b545794dd-j668n"] Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.421903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.422293 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api-log" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422332 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api-log" Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.422348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422355 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api" Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.422362 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-central-agent" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422367 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-central-agent" Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.422376 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="sg-core" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="sg-core" Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.422391 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="proxy-httpd" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422396 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="proxy-httpd" Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.422408 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-notification-agent" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422414 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-notification-agent" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422584 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422597 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-notification-agent" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422607 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="ceilometer-central-agent" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" containerName="barbican-api-log" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422627 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="sg-core" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.422638 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" containerName="proxy-httpd" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.424098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.427114 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.427261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.427387 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.433992 4707 scope.go:117] "RemoveContainer" containerID="3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.434882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.438008 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064\": container with ID starting with 3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064 not found: ID does not exist" containerID="3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.438041 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064"} err="failed to get container status \"3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064\": rpc error: code = NotFound desc = could not find container \"3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064\": container with ID starting with 3ae4c025c6fe468bb01d6fe39b501f6022c9bc92163dd6b5739beef62a3bc064 not found: ID does not exist" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.438139 4707 scope.go:117] "RemoveContainer" containerID="0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9" Jan 21 16:02:01 crc kubenswrapper[4707]: E0121 16:02:01.441125 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9\": container with ID starting with 0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9 not found: ID does not exist" containerID="0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.441148 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9"} err="failed to get container status \"0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9\": rpc error: code = NotFound desc = could not find container \"0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9\": container with ID starting with 0686dd4caa27e1af736cdc087660717f40612424df65d425a2f114cff74776b9 not found: ID does not exist" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.441161 4707 scope.go:117] "RemoveContainer" containerID="645bad01275e738dbf202604018070879944290b95b9bc8a2aab9d80b8e889b5" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.464870 4707 scope.go:117] "RemoveContainer" 
containerID="86a2a8ca4b2ba529787c883f220c177764765b546eb04737097e14f5425fe768" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.466473 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.469685 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.488464 4707 scope.go:117] "RemoveContainer" containerID="1edfa4083df3407b1693b03447db0d9cf41cec4d7d408748d3925fc4b1615068" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.517906 4707 scope.go:117] "RemoveContainer" containerID="a609dee96e6747b2f7aecd1bbabda7f16bd305f62372daa8e579f23939d6fe36" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.550911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-config-data\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7gnd\" (UniqueName: \"kubernetes.io/projected/18bb66c8-1624-454d-82e6-0ce735f84fb1-kube-api-access-t7gnd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.551527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-scripts\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-config-data\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7gnd\" (UniqueName: \"kubernetes.io/projected/18bb66c8-1624-454d-82e6-0ce735f84fb1-kube-api-access-t7gnd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.653934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-scripts\") pod \"ceilometer-0\" (UID: 
\"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.654320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.656242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.656527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-config-data\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.656795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.657156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.657207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-scripts\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.667420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7gnd\" (UniqueName: \"kubernetes.io/projected/18bb66c8-1624-454d-82e6-0ce735f84fb1-kube-api-access-t7gnd\") pod \"ceilometer-0\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.749536 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:01 crc kubenswrapper[4707]: I0121 16:02:01.997985 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.46:8778/\": net/http: TLS handshake timeout" Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.152884 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.368017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerStarted","Data":"437da720c4094641f05f79afe16440777ed7d714b4be9d422ccf61c7a1b7650b"} Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.369452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9","Type":"ContainerStarted","Data":"2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36"} Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.518480 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.518688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.598425 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:02:02 crc kubenswrapper[4707]: I0121 16:02:02.946984 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.62:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:03 crc kubenswrapper[4707]: I0121 16:02:03.190465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82ec0cb-eb14-411c-a7a7-6d231875fe6e" path="/var/lib/kubelet/pods/e82ec0cb-eb14-411c-a7a7-6d231875fe6e/volumes" Jan 21 16:02:03 crc kubenswrapper[4707]: I0121 16:02:03.191506 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d59870-b78f-4c55-8105-ef69996cd835" path="/var/lib/kubelet/pods/e9d59870-b78f-4c55-8105-ef69996cd835/volumes" Jan 21 16:02:03 crc kubenswrapper[4707]: I0121 16:02:03.388713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerStarted","Data":"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa"} Jan 21 16:02:03 crc kubenswrapper[4707]: I0121 16:02:03.471691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:02:03 crc kubenswrapper[4707]: I0121 16:02:03.911981 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.397243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerStarted","Data":"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb"} Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.397437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerStarted","Data":"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632"} Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.398794 4707 generic.go:334] "Generic (PLEG): container finished" podID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerID="2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36" exitCode=0 Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.398843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9","Type":"ContainerDied","Data":"2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36"} Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.511159 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-684864457b-mxctv"] Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.513707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.522989 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-684864457b-mxctv"] Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-internal-tls-certs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-logs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-public-tls-certs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-combined-ca-bundle\") pod 
\"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65442\" (UniqueName: \"kubernetes.io/projected/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-kube-api-access-65442\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.608613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data-custom\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.629183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-combined-ca-bundle\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65442\" (UniqueName: \"kubernetes.io/projected/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-kube-api-access-65442\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data-custom\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-internal-tls-certs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-logs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " 
pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-public-tls-certs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.710754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-logs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.713059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data-custom\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.713359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-internal-tls-certs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.713519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-public-tls-certs\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.713591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-combined-ca-bundle\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.719872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.735913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65442\" (UniqueName: \"kubernetes.io/projected/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-kube-api-access-65442\") pod \"barbican-api-684864457b-mxctv\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.829233 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.921091 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-684864457b-mxctv"] Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.947727 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8"] Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.949257 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.950978 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.62:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:04 crc kubenswrapper[4707]: I0121 16:02:04.953407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8"] Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.016920 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/373e7e86-7df4-4678-8a16-cd6accf342bb-logs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.016998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data-custom\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.017102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwvwn\" (UniqueName: \"kubernetes.io/projected/373e7e86-7df4-4678-8a16-cd6accf342bb-kube-api-access-wwvwn\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.017136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-internal-tls-certs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.017256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.017414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-combined-ca-bundle\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.017500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-public-tls-certs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.119397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data-custom\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.119453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwvwn\" (UniqueName: \"kubernetes.io/projected/373e7e86-7df4-4678-8a16-cd6accf342bb-kube-api-access-wwvwn\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.119480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-internal-tls-certs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.119518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.119627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-combined-ca-bundle\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.119673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-public-tls-certs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.120363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/373e7e86-7df4-4678-8a16-cd6accf342bb-logs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.120719 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/373e7e86-7df4-4678-8a16-cd6accf342bb-logs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.123788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data-custom\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.124162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-public-tls-certs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.124346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-combined-ca-bundle\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.124368 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-internal-tls-certs\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.125080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.135693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwvwn\" (UniqueName: \"kubernetes.io/projected/373e7e86-7df4-4678-8a16-cd6accf342bb-kube-api-access-wwvwn\") pod \"barbican-api-85fcc969c8-gghq8\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.247567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-684864457b-mxctv"] Jan 21 16:02:05 crc kubenswrapper[4707]: W0121 16:02:05.251312 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5960e2d_3b1d_47b2_8efd_e7925ebd62ff.slice/crio-224ebda1dabaacf993f160316646e0b4dd4ae54109283b861123a3e595caa111 WatchSource:0}: Error finding container 224ebda1dabaacf993f160316646e0b4dd4ae54109283b861123a3e595caa111: Status 404 returned error can't find the container with id 224ebda1dabaacf993f160316646e0b4dd4ae54109283b861123a3e595caa111 Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.273384 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.415851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" event={"ID":"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff","Type":"ContainerStarted","Data":"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482"} Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.416034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" event={"ID":"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff","Type":"ContainerStarted","Data":"224ebda1dabaacf993f160316646e0b4dd4ae54109283b861123a3e595caa111"} Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.420080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9","Type":"ContainerStarted","Data":"b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1"} Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.657752 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.657737331 podStartE2EDuration="5.657737331s" podCreationTimestamp="2026-01-21 16:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:05.444087979 +0000 UTC m=+3622.625604201" watchObservedRunningTime="2026-01-21 16:02:05.657737331 +0000 UTC m=+3622.839253552" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.658745 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8"] Jan 21 16:02:05 crc kubenswrapper[4707]: W0121 16:02:05.665072 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod373e7e86_7df4_4678_8a16_cd6accf342bb.slice/crio-8452ccc6e066ed6876612b08167b493a289f32110fefcef1dab6981a93d2f2fb WatchSource:0}: Error finding container 8452ccc6e066ed6876612b08167b493a289f32110fefcef1dab6981a93d2f2fb: Status 404 returned error can't find the container with id 8452ccc6e066ed6876612b08167b493a289f32110fefcef1dab6981a93d2f2fb Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.735277 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.56:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.756313 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.833370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-scripts\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.833625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-config-data\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.833737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-internal-tls-certs\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.833786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-combined-ca-bundle\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.833877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52d68\" (UniqueName: \"kubernetes.io/projected/d7d28e68-71cb-476f-a831-5536b2686514-kube-api-access-52d68\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.833972 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d28e68-71cb-476f-a831-5536b2686514-logs\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.834002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-public-tls-certs\") pod \"d7d28e68-71cb-476f-a831-5536b2686514\" (UID: \"d7d28e68-71cb-476f-a831-5536b2686514\") " Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.840148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d28e68-71cb-476f-a831-5536b2686514-logs" (OuterVolumeSpecName: "logs") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.853136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-scripts" (OuterVolumeSpecName: "scripts") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.853190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d28e68-71cb-476f-a831-5536b2686514-kube-api-access-52d68" (OuterVolumeSpecName: "kube-api-access-52d68") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "kube-api-access-52d68". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.915359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-config-data" (OuterVolumeSpecName: "config-data") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.919794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.926995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.936441 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7d28e68-71cb-476f-a831-5536b2686514-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.936536 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.936624 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.936692 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.936749 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.936797 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52d68\" (UniqueName: \"kubernetes.io/projected/d7d28e68-71cb-476f-a831-5536b2686514-kube-api-access-52d68\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:05 crc kubenswrapper[4707]: I0121 16:02:05.965746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d7d28e68-71cb-476f-a831-5536b2686514" (UID: "d7d28e68-71cb-476f-a831-5536b2686514"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.038975 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d28e68-71cb-476f-a831-5536b2686514-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.429484 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7d28e68-71cb-476f-a831-5536b2686514" containerID="e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922" exitCode=0 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.429551 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.429565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" event={"ID":"d7d28e68-71cb-476f-a831-5536b2686514","Type":"ContainerDied","Data":"e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.429903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5865994c8-cs68b" event={"ID":"d7d28e68-71cb-476f-a831-5536b2686514","Type":"ContainerDied","Data":"b14731efe93702798fe2754503d984c60b5d687954099a5d9a9492997d0ada09"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.429927 4707 scope.go:117] "RemoveContainer" containerID="e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.433536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerStarted","Data":"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.433623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.433612 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-central-agent" containerID="cri-o://4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.433649 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="proxy-httpd" containerID="cri-o://33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.433706 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="sg-core" containerID="cri-o://a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.433754 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-notification-agent" containerID="cri-o://229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.439711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" event={"ID":"373e7e86-7df4-4678-8a16-cd6accf342bb","Type":"ContainerStarted","Data":"7dbbb9ef844ab37cf6aecd9f7cd00e150dba12f7539fb8826c773bd114ec81ca"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.439749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" event={"ID":"373e7e86-7df4-4678-8a16-cd6accf342bb","Type":"ContainerStarted","Data":"872d133770e4ab129cb4440be2cd434a18f1e4f6a4f18e56e319acdd51d5de90"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.439761 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" event={"ID":"373e7e86-7df4-4678-8a16-cd6accf342bb","Type":"ContainerStarted","Data":"8452ccc6e066ed6876612b08167b493a289f32110fefcef1dab6981a93d2f2fb"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.439896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.439930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.442689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" event={"ID":"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff","Type":"ContainerStarted","Data":"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86"} Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.442786 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api-log" containerID="cri-o://f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.442880 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.442901 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.442930 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api" containerID="cri-o://4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.458832 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.493694453 podStartE2EDuration="5.458818676s" podCreationTimestamp="2026-01-21 16:02:01 +0000 UTC" firstStartedPulling="2026-01-21 16:02:02.156920489 +0000 UTC m=+3619.338436710" lastFinishedPulling="2026-01-21 16:02:06.122044711 +0000 UTC m=+3623.303560933" observedRunningTime="2026-01-21 16:02:06.45167357 +0000 UTC m=+3623.633189792" watchObservedRunningTime="2026-01-21 16:02:06.458818676 +0000 UTC m=+3623.640334898" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.472026 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" podStartSLOduration=2.4720114730000002 podStartE2EDuration="2.472011473s" podCreationTimestamp="2026-01-21 16:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:06.465391474 +0000 UTC m=+3623.646907686" watchObservedRunningTime="2026-01-21 16:02:06.472011473 +0000 UTC m=+3623.653527694" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.491646 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" podStartSLOduration=2.491634038 
podStartE2EDuration="2.491634038s" podCreationTimestamp="2026-01-21 16:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:06.486702657 +0000 UTC m=+3623.668218878" watchObservedRunningTime="2026-01-21 16:02:06.491634038 +0000 UTC m=+3623.673150260" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.499973 4707 scope.go:117] "RemoveContainer" containerID="5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.513314 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5865994c8-cs68b"] Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.518324 4707 scope.go:117] "RemoveContainer" containerID="e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922" Jan 21 16:02:06 crc kubenswrapper[4707]: E0121 16:02:06.518668 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922\": container with ID starting with e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922 not found: ID does not exist" containerID="e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.518711 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922"} err="failed to get container status \"e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922\": rpc error: code = NotFound desc = could not find container \"e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922\": container with ID starting with e6b3ec4d92e8647070098005fae7e29034b0eddc57f596af44f2528047719922 not found: ID does not exist" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.518734 4707 scope.go:117] "RemoveContainer" containerID="5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d" Jan 21 16:02:06 crc kubenswrapper[4707]: E0121 16:02:06.519068 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d\": container with ID starting with 5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d not found: ID does not exist" containerID="5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.519102 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d"} err="failed to get container status \"5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d\": rpc error: code = NotFound desc = could not find container \"5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d\": container with ID starting with 5301a4d09ad3b2495e63ddd3aecbc85ae7993fe2e612f6952c61afcfba14ff7d not found: ID does not exist" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.519595 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5865994c8-cs68b"] Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.617448 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:02:06 crc 
kubenswrapper[4707]: I0121 16:02:06.720977 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.721282 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-log" containerID="cri-o://c76746848586d16a8a6d12eabf0bf66bbcce199aa33388ae4e87eb3bdd155156" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.721394 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-api" containerID="cri-o://5d26b491314dc3eee1fb728f8d549b40c8203991cf2c9184bca2c9cc244b7041" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.728183 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.728375 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="71a22bb8-e120-4ad5-b4e7-171872ed4de0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.734911 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.735348 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-metadata" containerID="cri-o://202e532715339b4e74f96986e7e29341d437676e0174e845bd4d5cafd176f0c5" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.735172 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-log" containerID="cri-o://f4e4007ad0d4c5f9910c3506d4489756309ca207796bd3930eac18605903d1db" gracePeriod=30 Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.945117 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-public-tls-certs\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data-custom\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-combined-ca-bundle\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65442\" (UniqueName: \"kubernetes.io/projected/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-kube-api-access-65442\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-logs\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.956497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-internal-tls-certs\") pod \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\" (UID: \"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff\") " Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.957479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-logs" (OuterVolumeSpecName: "logs") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.963460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.964052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-kube-api-access-65442" (OuterVolumeSpecName: "kube-api-access-65442") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "kube-api-access-65442". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:06 crc kubenswrapper[4707]: I0121 16:02:06.994941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.012100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.012354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.018886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data" (OuterVolumeSpecName: "config-data") pod "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" (UID: "c5960e2d-3b1d-47b2-8efd-e7925ebd62ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.058990 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.059015 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.059024 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.059032 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.059040 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.059049 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65442\" (UniqueName: \"kubernetes.io/projected/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-kube-api-access-65442\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.059057 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.184858 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.193401 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d28e68-71cb-476f-a831-5536b2686514" path="/var/lib/kubelet/pods/d7d28e68-71cb-476f-a831-5536b2686514/volumes" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.262735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-config-data\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.262788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7gnd\" (UniqueName: \"kubernetes.io/projected/18bb66c8-1624-454d-82e6-0ce735f84fb1-kube-api-access-t7gnd\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.262866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-scripts\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.262904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-combined-ca-bundle\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.262946 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-log-httpd\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.262967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-sg-core-conf-yaml\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.263108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-run-httpd\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.263151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-ceilometer-tls-certs\") pod \"18bb66c8-1624-454d-82e6-0ce735f84fb1\" (UID: \"18bb66c8-1624-454d-82e6-0ce735f84fb1\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.263513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.263617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.263743 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.263756 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18bb66c8-1624-454d-82e6-0ce735f84fb1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.267826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bb66c8-1624-454d-82e6-0ce735f84fb1-kube-api-access-t7gnd" (OuterVolumeSpecName: "kube-api-access-t7gnd") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "kube-api-access-t7gnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.267837 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-scripts" (OuterVolumeSpecName: "scripts") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.297402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.313313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.338610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.348064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-config-data" (OuterVolumeSpecName: "config-data") pod "18bb66c8-1624-454d-82e6-0ce735f84fb1" (UID: "18bb66c8-1624-454d-82e6-0ce735f84fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.365122 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.365146 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7gnd\" (UniqueName: \"kubernetes.io/projected/18bb66c8-1624-454d-82e6-0ce735f84fb1-kube-api-access-t7gnd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.365155 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.365164 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.365175 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.365182 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18bb66c8-1624-454d-82e6-0ce735f84fb1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.371020 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.451710 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerID="f4e4007ad0d4c5f9910c3506d4489756309ca207796bd3930eac18605903d1db" exitCode=143 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.451759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c3caa588-23c8-4f8c-9a2d-4340345a5a70","Type":"ContainerDied","Data":"f4e4007ad0d4c5f9910c3506d4489756309ca207796bd3930eac18605903d1db"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.453123 4707 generic.go:334] "Generic (PLEG): container finished" podID="71a22bb8-e120-4ad5-b4e7-171872ed4de0" containerID="8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0" exitCode=0 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.453158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"71a22bb8-e120-4ad5-b4e7-171872ed4de0","Type":"ContainerDied","Data":"8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.453174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"71a22bb8-e120-4ad5-b4e7-171872ed4de0","Type":"ContainerDied","Data":"65c04243e11b338e24b3bdbcaf7e06b5490cf116becaa0bcb4d85e777f8947fa"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.453190 4707 scope.go:117] "RemoveContainer" containerID="8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.453270 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456336 4707 generic.go:334] "Generic (PLEG): container finished" podID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" exitCode=0 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456355 4707 generic.go:334] "Generic (PLEG): container finished" podID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" exitCode=2 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456362 4707 generic.go:334] "Generic (PLEG): container finished" podID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" exitCode=0 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456368 4707 generic.go:334] "Generic (PLEG): container finished" podID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" exitCode=0 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerDied","Data":"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerDied","Data":"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerDied","Data":"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456434 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerDied","Data":"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.456579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"18bb66c8-1624-454d-82e6-0ce735f84fb1","Type":"ContainerDied","Data":"437da720c4094641f05f79afe16440777ed7d714b4be9d422ccf61c7a1b7650b"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.457896 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerID="c76746848586d16a8a6d12eabf0bf66bbcce199aa33388ae4e87eb3bdd155156" exitCode=143 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.457937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7a2c79bc-f09f-4e7e-aa40-80cf2914a696","Type":"ContainerDied","Data":"c76746848586d16a8a6d12eabf0bf66bbcce199aa33388ae4e87eb3bdd155156"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.459733 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerID="4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86" exitCode=0 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.459752 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerID="f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482" exitCode=143 Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.459790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" event={"ID":"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff","Type":"ContainerDied","Data":"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.459831 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.459835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" event={"ID":"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff","Type":"ContainerDied","Data":"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.459928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-684864457b-mxctv" event={"ID":"c5960e2d-3b1d-47b2-8efd-e7925ebd62ff","Type":"ContainerDied","Data":"224ebda1dabaacf993f160316646e0b4dd4ae54109283b861123a3e595caa111"} Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.465399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-config-data\") pod \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.465474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-nova-novncproxy-tls-certs\") pod \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.465497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfwl\" (UniqueName: \"kubernetes.io/projected/71a22bb8-e120-4ad5-b4e7-171872ed4de0-kube-api-access-kcfwl\") pod \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.465568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-vencrypt-tls-certs\") pod \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.465623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-combined-ca-bundle\") pod \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\" (UID: \"71a22bb8-e120-4ad5-b4e7-171872ed4de0\") " Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.468536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a22bb8-e120-4ad5-b4e7-171872ed4de0-kube-api-access-kcfwl" (OuterVolumeSpecName: "kube-api-access-kcfwl") pod "71a22bb8-e120-4ad5-b4e7-171872ed4de0" (UID: "71a22bb8-e120-4ad5-b4e7-171872ed4de0"). InnerVolumeSpecName "kube-api-access-kcfwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.489604 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a22bb8-e120-4ad5-b4e7-171872ed4de0" (UID: "71a22bb8-e120-4ad5-b4e7-171872ed4de0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.491722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-config-data" (OuterVolumeSpecName: "config-data") pod "71a22bb8-e120-4ad5-b4e7-171872ed4de0" (UID: "71a22bb8-e120-4ad5-b4e7-171872ed4de0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.505024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "71a22bb8-e120-4ad5-b4e7-171872ed4de0" (UID: "71a22bb8-e120-4ad5-b4e7-171872ed4de0"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.512942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "71a22bb8-e120-4ad5-b4e7-171872ed4de0" (UID: "71a22bb8-e120-4ad5-b4e7-171872ed4de0"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.568657 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.568685 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.568699 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.568709 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a22bb8-e120-4ad5-b4e7-171872ed4de0-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.568720 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcfwl\" (UniqueName: \"kubernetes.io/projected/71a22bb8-e120-4ad5-b4e7-171872ed4de0-kube-api-access-kcfwl\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.574473 4707 scope.go:117] "RemoveContainer" containerID="8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.583140 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0\": container with ID starting with 8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0 not found: ID does not exist" containerID="8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.583189 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0"} err="failed to get container status \"8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0\": rpc error: code = NotFound desc = could not find container \"8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0\": container with ID starting with 8c56024662bb5249fa7ce375b1d767c0d215ac82a8810cf68d08625afba036f0 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.583213 4707 scope.go:117] "RemoveContainer" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.596105 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-684864457b-mxctv"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.608147 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-684864457b-mxctv"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.620519 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.629874 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.631600 4707 scope.go:117] "RemoveContainer" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.653226 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654288 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-central-agent" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654316 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-central-agent" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654511 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api-log" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api-log" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654546 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a22bb8-e120-4ad5-b4e7-171872ed4de0" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654553 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a22bb8-e120-4ad5-b4e7-171872ed4de0" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654574 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-log" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654582 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-log" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654598 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654607 
4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654630 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-api" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654636 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-api" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.654643 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="sg-core" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.654649 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="sg-core" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.665730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="proxy-httpd" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.665748 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="proxy-httpd" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.665775 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-notification-agent" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.665781 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-notification-agent" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.665982 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.665999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-log" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666013 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-notification-agent" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666023 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="proxy-httpd" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666031 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a22bb8-e120-4ad5-b4e7-171872ed4de0" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666038 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" containerName="barbican-api-log" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666046 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" containerName="ceilometer-central-agent" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666055 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d28e68-71cb-476f-a831-5536b2686514" containerName="placement-api" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.666066 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" 
containerName="sg-core" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.667606 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.670596 4707 scope.go:117] "RemoveContainer" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.671527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.671654 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.684589 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.686355 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.724952 4707 scope.go:117] "RemoveContainer" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.776392 4707 scope.go:117] "RemoveContainer" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.777672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-run-httpd\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.777898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-scripts\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.778069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.778181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9wh\" (UniqueName: \"kubernetes.io/projected/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-kube-api-access-5l9wh\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.778268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-log-httpd\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.778382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.778700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.779043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-config-data\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.790908 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": container with ID starting with 33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61 not found: ID does not exist" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.790945 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61"} err="failed to get container status \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": rpc error: code = NotFound desc = could not find container \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": container with ID starting with 33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.790966 4707 scope.go:117] "RemoveContainer" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.795916 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": container with ID starting with a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb not found: ID does not exist" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.795945 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb"} err="failed to get container status \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": rpc error: code = NotFound desc = could not find container \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": container with ID starting with a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.795962 4707 scope.go:117] "RemoveContainer" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.796398 4707 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": container with ID starting with 229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632 not found: ID does not exist" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.796530 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632"} err="failed to get container status \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": rpc error: code = NotFound desc = could not find container \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": container with ID starting with 229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.796622 4707 scope.go:117] "RemoveContainer" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.796957 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": container with ID starting with 4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa not found: ID does not exist" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.797064 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa"} err="failed to get container status \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": rpc error: code = NotFound desc = could not find container \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": container with ID starting with 4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.797143 4707 scope.go:117] "RemoveContainer" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.797424 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61"} err="failed to get container status \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": rpc error: code = NotFound desc = could not find container \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": container with ID starting with 33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.797525 4707 scope.go:117] "RemoveContainer" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.797775 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb"} err="failed to get container status \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": rpc error: code = NotFound desc = could not find container 
\"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": container with ID starting with a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.797883 4707 scope.go:117] "RemoveContainer" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.798174 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632"} err="failed to get container status \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": rpc error: code = NotFound desc = could not find container \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": container with ID starting with 229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.798267 4707 scope.go:117] "RemoveContainer" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.798522 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa"} err="failed to get container status \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": rpc error: code = NotFound desc = could not find container \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": container with ID starting with 4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.798608 4707 scope.go:117] "RemoveContainer" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.798856 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61"} err="failed to get container status \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": rpc error: code = NotFound desc = could not find container \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": container with ID starting with 33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.798941 4707 scope.go:117] "RemoveContainer" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.799179 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb"} err="failed to get container status \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": rpc error: code = NotFound desc = could not find container \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": container with ID starting with a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.799266 4707 scope.go:117] "RemoveContainer" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.799525 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632"} err="failed to get container status \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": rpc error: code = NotFound desc = could not find container \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": container with ID starting with 229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.799613 4707 scope.go:117] "RemoveContainer" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.799859 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa"} err="failed to get container status \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": rpc error: code = NotFound desc = could not find container \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": container with ID starting with 4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.799947 4707 scope.go:117] "RemoveContainer" containerID="33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.804064 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61"} err="failed to get container status \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": rpc error: code = NotFound desc = could not find container \"33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61\": container with ID starting with 33d6ad05649f87b553fcec14b0aebd1458d31b45cd789bf8d1d9705f4a662d61 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.804199 4707 scope.go:117] "RemoveContainer" containerID="a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.804770 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb"} err="failed to get container status \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": rpc error: code = NotFound desc = could not find container \"a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb\": container with ID starting with a8ef26c3b2dc2e2dced19043028205f701d788c647b8e3bb55b37d1f3828c3bb not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.804890 4707 scope.go:117] "RemoveContainer" containerID="229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.808365 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632"} err="failed to get container status \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": rpc error: code = NotFound desc = could not find container \"229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632\": container with ID starting with 
229a26d08d293e4ae6dd203b8d8a6e32ad40f51d1542cb88f0eb256855176632 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.808479 4707 scope.go:117] "RemoveContainer" containerID="4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.808625 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.809019 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa"} err="failed to get container status \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": rpc error: code = NotFound desc = could not find container \"4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa\": container with ID starting with 4a6d9081f746950d73b248afcd861192865e6a8d57bed8433f5c715f84e6a8aa not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.809102 4707 scope.go:117] "RemoveContainer" containerID="4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.825165 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.832148 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.833229 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.836161 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.836730 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.836894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.848204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.858969 4707 scope.go:117] "RemoveContainer" containerID="f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.880582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.881321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-config-data\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.881821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-run-httpd\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.881508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-run-httpd\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.881904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-scripts\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.882212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.882237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9wh\" (UniqueName: \"kubernetes.io/projected/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-kube-api-access-5l9wh\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.882270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-log-httpd\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.882291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.883081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-log-httpd\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.885801 4707 scope.go:117] "RemoveContainer" containerID="4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.886284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.886461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.886519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.887587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-config-data\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.887675 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86\": container with ID starting with 4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86 not found: ID does not exist" containerID="4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.887697 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86"} err="failed to get container status \"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86\": rpc error: code = NotFound desc = could not find container \"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86\": container with ID starting with 4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.887716 4707 scope.go:117] "RemoveContainer" containerID="f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482" Jan 21 16:02:07 crc kubenswrapper[4707]: E0121 16:02:07.888078 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482\": container with ID starting with f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482 not found: ID does not exist" containerID="f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.888098 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482"} err="failed to get container status \"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482\": rpc error: code = NotFound desc = could not find container \"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482\": container with ID starting with f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.888112 4707 scope.go:117] "RemoveContainer" containerID="4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.888588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-scripts\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.889080 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86"} err="failed to get container status \"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86\": rpc error: code = NotFound desc = could not find container \"4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86\": container with ID starting with 4883e980f2f519931d0d79a487c5ac2e9cc503b281cdc91efa3a51b3fa653a86 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.889100 4707 scope.go:117] "RemoveContainer" containerID="f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.889735 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482"} err="failed to get container status \"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482\": rpc error: code = NotFound desc = could not find container \"f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482\": container with ID starting with f535e4a172f322da956c1934ded40582cf8d1f4d58664924f8e577c5d7095482 not found: ID does not exist" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.897355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9wh\" (UniqueName: \"kubernetes.io/projected/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-kube-api-access-5l9wh\") pod \"ceilometer-0\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.952950 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.62:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.958987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.984122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5ws\" (UniqueName: \"kubernetes.io/projected/eadd0643-8179-4e51-a4e4-f48691d24503-kube-api-access-km5ws\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.984166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.984201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.984438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:07 crc kubenswrapper[4707]: I0121 16:02:07.984497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.004730 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.086380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.086481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.086599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km5ws\" (UniqueName: \"kubernetes.io/projected/eadd0643-8179-4e51-a4e4-f48691d24503-kube-api-access-km5ws\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.086626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.086658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.089614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.091866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.092336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.096063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.099991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5ws\" (UniqueName: \"kubernetes.io/projected/eadd0643-8179-4e51-a4e4-f48691d24503-kube-api-access-km5ws\") pod \"nova-cell1-novncproxy-0\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.146965 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.389904 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:08 crc kubenswrapper[4707]: W0121 16:02:08.391948 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69bb9e05_ce31_46c3_9ebf_9cd0d05676d2.slice/crio-f0fc2a204f45c0c3bca8122fe10d5407cd0feec27f37af25ae2e737c292d6b5b WatchSource:0}: Error finding container f0fc2a204f45c0c3bca8122fe10d5407cd0feec27f37af25ae2e737c292d6b5b: Status 404 returned error can't find the container with id f0fc2a204f45c0c3bca8122fe10d5407cd0feec27f37af25ae2e737c292d6b5b Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.472481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerStarted","Data":"f0fc2a204f45c0c3bca8122fe10d5407cd0feec27f37af25ae2e737c292d6b5b"} Jan 21 16:02:08 crc kubenswrapper[4707]: I0121 16:02:08.504720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:02:08 crc kubenswrapper[4707]: W0121 16:02:08.508568 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeadd0643_8179_4e51_a4e4_f48691d24503.slice/crio-b2b7f0838840c36eee39ca1069531f931100e250fd0c72bf867400d2230ca63b WatchSource:0}: Error finding container b2b7f0838840c36eee39ca1069531f931100e250fd0c72bf867400d2230ca63b: Status 404 returned error can't find the container with id 
b2b7f0838840c36eee39ca1069531f931100e250fd0c72bf867400d2230ca63b Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.189677 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bb66c8-1624-454d-82e6-0ce735f84fb1" path="/var/lib/kubelet/pods/18bb66c8-1624-454d-82e6-0ce735f84fb1/volumes" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.190385 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a22bb8-e120-4ad5-b4e7-171872ed4de0" path="/var/lib/kubelet/pods/71a22bb8-e120-4ad5-b4e7-171872ed4de0/volumes" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.190898 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5960e2d-3b1d-47b2-8efd-e7925ebd62ff" path="/var/lib/kubelet/pods/c5960e2d-3b1d-47b2-8efd-e7925ebd62ff/volumes" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.480134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerStarted","Data":"ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b"} Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.481781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eadd0643-8179-4e51-a4e4-f48691d24503","Type":"ContainerStarted","Data":"0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5"} Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.481822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eadd0643-8179-4e51-a4e4-f48691d24503","Type":"ContainerStarted","Data":"b2b7f0838840c36eee39ca1069531f931100e250fd0c72bf867400d2230ca63b"} Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.496938 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.4969258500000002 podStartE2EDuration="2.49692585s" podCreationTimestamp="2026-01-21 16:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:09.492165159 +0000 UTC m=+3626.673681382" watchObservedRunningTime="2026-01-21 16:02:09.49692585 +0000 UTC m=+3626.678442072" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.945385 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.945434 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.945470 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.946197 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c95e964b40a0887841892803aa9f77fc9d3e35044d810c4610890872e712c158"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:02:09 crc kubenswrapper[4707]: I0121 16:02:09.946256 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://c95e964b40a0887841892803aa9f77fc9d3e35044d810c4610890872e712c158" gracePeriod=600 Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.181904 4707 scope.go:117] "RemoveContainer" containerID="0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21" Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.490454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerStarted","Data":"658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a"} Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.491048 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.492776 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="c95e964b40a0887841892803aa9f77fc9d3e35044d810c4610890872e712c158" exitCode=0 Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.492843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"c95e964b40a0887841892803aa9f77fc9d3e35044d810c4610890872e712c158"} Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.492873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391"} Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.492891 4707 scope.go:117] "RemoveContainer" containerID="8d6e5ffd894c785670529342574806bb7146de5041998720e3bd83ec7a19ac1d" Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.495143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerStarted","Data":"e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de"} Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.766071 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.766278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:10 crc kubenswrapper[4707]: I0121 16:02:10.776957 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.56:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:10 crc kubenswrapper[4707]: 
I0121 16:02:10.849050 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.182615 4707 scope.go:117] "RemoveContainer" containerID="4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18" Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.461009 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.461954 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.462755 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.506574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerStarted","Data":"fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de"} Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.508588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerStarted","Data":"0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9"} Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.508988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:02:11 crc kubenswrapper[4707]: I0121 16:02:11.576799 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.441796 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.442360 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-log" containerID="cri-o://b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34" gracePeriod=30 Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.442476 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-httpd" containerID="cri-o://7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504" gracePeriod=30 Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.491544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.491965 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-log" containerID="cri-o://b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643" gracePeriod=30 Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.492403 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-httpd" containerID="cri-o://10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740" gracePeriod=30 Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.537942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerStarted","Data":"a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163"} Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.539429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:12 crc kubenswrapper[4707]: I0121 16:02:12.590209 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.753262581 podStartE2EDuration="5.590192604s" podCreationTimestamp="2026-01-21 16:02:07 +0000 UTC" firstStartedPulling="2026-01-21 16:02:08.393823598 +0000 UTC m=+3625.575339820" lastFinishedPulling="2026-01-21 16:02:12.23075362 +0000 UTC m=+3629.412269843" observedRunningTime="2026-01-21 16:02:12.580678845 +0000 UTC m=+3629.762195066" watchObservedRunningTime="2026-01-21 16:02:12.590192604 +0000 UTC m=+3629.771708826" Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.147703 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.195775 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.195984 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api-log" containerID="cri-o://0c91ea7d49d1293494ad3b71e6a2d9ea9e33ccb909db292e4603ce1aa9e9bb4d" gracePeriod=30 Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.196029 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" containerID="cri-o://03dd9833f6900410aad689e2c52cd17eb9a6b068055cdd52e3ee4b84e822c0c7" gracePeriod=30 Jan 21 16:02:13 crc kubenswrapper[4707]: E0121 16:02:13.513954 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a is running failed: container process not found" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:02:13 crc kubenswrapper[4707]: E0121 16:02:13.514580 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a is running failed: container process not 
found" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:02:13 crc kubenswrapper[4707]: E0121 16:02:13.515067 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a is running failed: container process not found" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:02:13 crc kubenswrapper[4707]: E0121 16:02:13.515104 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.545089 4707 generic.go:334] "Generic (PLEG): container finished" podID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerID="b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34" exitCode=143 Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.545152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8","Type":"ContainerDied","Data":"b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34"} Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.546888 4707 generic.go:334] "Generic (PLEG): container finished" podID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerID="b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643" exitCode=143 Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.546925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"06f6e827-b699-4aa7-b3bc-6729a224186f","Type":"ContainerDied","Data":"b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643"} Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.549130 4707 generic.go:334] "Generic (PLEG): container finished" podID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" exitCode=1 Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.549146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerDied","Data":"658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a"} Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.549174 4707 scope.go:117] "RemoveContainer" containerID="0f202935d98db43b384ed17021e4ebfc0ee5225582322e135c3e18383a48cf21" Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.549730 4707 scope.go:117] "RemoveContainer" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:02:13 crc kubenswrapper[4707]: E0121 16:02:13.550044 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(4af60f39-57ca-4595-be87-1d1a0c869d72)\"" 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.550997 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerID="0c91ea7d49d1293494ad3b71e6a2d9ea9e33ccb909db292e4603ce1aa9e9bb4d" exitCode=143 Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.551040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8","Type":"ContainerDied","Data":"0c91ea7d49d1293494ad3b71e6a2d9ea9e33ccb909db292e4603ce1aa9e9bb4d"} Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.556561 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" exitCode=1 Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.557569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerDied","Data":"0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9"} Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.557852 4707 scope.go:117] "RemoveContainer" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:02:13 crc kubenswrapper[4707]: E0121 16:02:13.558037 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(4b535454-f068-42cd-9817-a23bd7d7aa4d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" Jan 21 16:02:13 crc kubenswrapper[4707]: I0121 16:02:13.597268 4707 scope.go:117] "RemoveContainer" containerID="4e80fc2e3b0edededad7615ba0b6b058d59247aa9433541a7da0a7af40085a18" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.278968 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.75:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.667704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.668434 4707 scope.go:117] "RemoveContainer" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:02:15 crc kubenswrapper[4707]: E0121 16:02:15.668761 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(4b535454-f068-42cd-9817-a23bd7d7aa4d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.680696 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5797484f74-ph7p9"] Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 
16:02:15.681122 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" containerID="cri-o://5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd" gracePeriod=30 Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.683516 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-api" containerID="cri-o://337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff" gracePeriod=30 Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.692974 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.48:9696/\": read tcp 10.217.0.2:51418->10.217.1.48:9696: read: connection reset by peer" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.750202 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-b94c958fd-2mlhk"] Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.752769 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.757108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b94c958fd-2mlhk"] Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.817964 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.56:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.850963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9lk\" (UniqueName: \"kubernetes.io/projected/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-kube-api-access-jf9lk\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.851026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-ovndb-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.851065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-httpd-config\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.851089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-public-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: 
\"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.851106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-config\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.851168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-internal-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.851331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-combined-ca-bundle\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.953854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-ovndb-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.955640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-httpd-config\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.955707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-public-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.955736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-config\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.955886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-internal-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.956154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-combined-ca-bundle\") pod \"neutron-b94c958fd-2mlhk\" (UID: 
\"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.956258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf9lk\" (UniqueName: \"kubernetes.io/projected/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-kube-api-access-jf9lk\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.960373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-config\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.962158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-ovndb-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.966189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-httpd-config\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.966265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-combined-ca-bundle\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.969187 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-public-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.981687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b94c958fd-2mlhk"] Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.982401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9lk\" (UniqueName: \"kubernetes.io/projected/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-kube-api-access-jf9lk\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:15 crc kubenswrapper[4707]: E0121 16:02:15.984393 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs kube-api-access-jf9lk], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" podUID="6df0440c-1e20-44b4-a6b8-6b932a2cb19f" Jan 21 16:02:15 crc kubenswrapper[4707]: I0121 16:02:15.987229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-internal-tls-certs\") pod \"neutron-b94c958fd-2mlhk\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.014160 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.023003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.025583 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.064208 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.103530 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-scripts\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-logs\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160602 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-combined-ca-bundle\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-public-tls-certs\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-httpd-run\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwph7\" (UniqueName: \"kubernetes.io/projected/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-kube-api-access-xwph7\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 
21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.160776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-config-data\") pod \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\" (UID: \"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-combined-ca-bundle\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgndw\" (UniqueName: \"kubernetes.io/projected/53a357b9-520a-4e1b-be4e-917657c1e24e-kube-api-access-fgndw\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-ovndb-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-config\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-public-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-httpd-config\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-internal-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.161519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-logs" (OuterVolumeSpecName: "logs") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.162194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.173956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-kube-api-access-xwph7" (OuterVolumeSpecName: "kube-api-access-xwph7") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "kube-api-access-xwph7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.179963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-scripts" (OuterVolumeSpecName: "scripts") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.182109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.209295 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.209517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-httpd-run\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-internal-tls-certs\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-scripts\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-logs\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-combined-ca-bundle\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmrkv\" (UniqueName: \"kubernetes.io/projected/06f6e827-b699-4aa7-b3bc-6729a224186f-kube-api-access-rmrkv\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-config-data\") pod \"06f6e827-b699-4aa7-b3bc-6729a224186f\" (UID: \"06f6e827-b699-4aa7-b3bc-6729a224186f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-combined-ca-bundle\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgndw\" (UniqueName: \"kubernetes.io/projected/53a357b9-520a-4e1b-be4e-917657c1e24e-kube-api-access-fgndw\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: 
\"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-ovndb-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.262998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-config\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-public-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-httpd-config\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-internal-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263176 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263185 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263193 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263201 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263210 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwph7\" (UniqueName: \"kubernetes.io/projected/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-kube-api-access-xwph7\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.263227 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.279554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.279827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-logs" (OuterVolumeSpecName: "logs") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.282378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-combined-ca-bundle\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.283612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-ovndb-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.302660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-config\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.304372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-httpd-config\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.304529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.315441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-public-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.315668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f6e827-b699-4aa7-b3bc-6729a224186f-kube-api-access-rmrkv" (OuterVolumeSpecName: "kube-api-access-rmrkv") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "kube-api-access-rmrkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.319244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-internal-tls-certs\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.322894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-scripts" (OuterVolumeSpecName: "scripts") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.325376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgndw\" (UniqueName: \"kubernetes.io/projected/53a357b9-520a-4e1b-be4e-917657c1e24e-kube-api-access-fgndw\") pod \"neutron-6fd5c4b78d-v98lf\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.331192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-config-data" (OuterVolumeSpecName: "config-data") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.359484 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.394289 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.394329 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.394338 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.394349 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f6e827-b699-4aa7-b3bc-6729a224186f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.394378 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.394389 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmrkv\" (UniqueName: \"kubernetes.io/projected/06f6e827-b699-4aa7-b3bc-6729a224186f-kube-api-access-rmrkv\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.408936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.430982 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.467959 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.467994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" (UID: "abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.484940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-config-data" (OuterVolumeSpecName: "config-data") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.487027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "06f6e827-b699-4aa7-b3bc-6729a224186f" (UID: "06f6e827-b699-4aa7-b3bc-6729a224186f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.496092 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.496116 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.496127 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.496136 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.496143 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.496151 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f6e827-b699-4aa7-b3bc-6729a224186f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.513978 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.514895 4707 scope.go:117] "RemoveContainer" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.515160 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(4af60f39-57ca-4595-be87-1d1a0c869d72)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.586480 4707 generic.go:334] "Generic (PLEG): container finished" podID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerID="7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504" exitCode=0 Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.586511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8","Type":"ContainerDied","Data":"7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504"} Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.586549 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8","Type":"ContainerDied","Data":"53fd1d91ef1568b5cb9106d4d34b7c158f3820b462f752349f1e523c68ba1441"} Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.586544 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.586569 4707 scope.go:117] "RemoveContainer" containerID="7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.593704 4707 generic.go:334] "Generic (PLEG): container finished" podID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerID="10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740" exitCode=0 Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.593754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"06f6e827-b699-4aa7-b3bc-6729a224186f","Type":"ContainerDied","Data":"10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740"} Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.593771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"06f6e827-b699-4aa7-b3bc-6729a224186f","Type":"ContainerDied","Data":"1e4293add566145e540a7792a9f11a3eebd40c580e70db05bafeff4fac5c5e22"} Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.593841 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.601448 4707 generic.go:334] "Generic (PLEG): container finished" podID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerID="5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd" exitCode=0 Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.601498 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.601540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" event={"ID":"08d2edd9-a93d-4bb9-ac1e-6a23494df48c","Type":"ContainerDied","Data":"5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd"} Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.610225 4707 scope.go:117] "RemoveContainer" containerID="b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.613520 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.651159 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.660379 4707 scope.go:117] "RemoveContainer" containerID="7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.662061 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504\": container with ID starting with 7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504 not found: ID does not exist" containerID="7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.662089 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504"} err="failed to get container status \"7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504\": rpc error: code = NotFound desc = could not find container \"7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504\": container with ID starting with 7ee3dbf02726ca171a13ce68ee9998685d4978bcbdd8f74bd1a83d0a18a34504 not found: ID does not exist" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.662107 4707 scope.go:117] "RemoveContainer" containerID="b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.662478 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34\": container with ID starting with b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34 not found: ID does not exist" containerID="b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.662502 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34"} err="failed to get container status \"b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34\": rpc error: code = NotFound desc = could not find container \"b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34\": container with ID starting with b34495303b691cb3cd4eee0ba9306ad5f694d6b96e904b1c330e3b89414c3d34 not found: ID does not exist" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.662514 4707 scope.go:117] "RemoveContainer" containerID="10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.671858 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683092 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.683438 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-httpd" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683450 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-httpd" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.683463 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-httpd" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683469 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-httpd" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.683484 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-log" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-log" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.683509 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-log" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683515 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-log" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683689 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-httpd" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683701 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-log" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683722 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" containerName="glance-httpd" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.683729 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" containerName="glance-log" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.684776 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.688029 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.688193 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dkchd" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.692343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.692394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.695092 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699581 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-public-tls-certs\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-httpd-config\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-config\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-internal-tls-certs\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf9lk\" (UniqueName: \"kubernetes.io/projected/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-kube-api-access-jf9lk\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-combined-ca-bundle\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.699837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-ovndb-tls-certs\") pod \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\" (UID: \"6df0440c-1e20-44b4-a6b8-6b932a2cb19f\") " Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.703859 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.706100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-kube-api-access-jf9lk" (OuterVolumeSpecName: "kube-api-access-jf9lk") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "kube-api-access-jf9lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.706507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.708845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.709370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-config" (OuterVolumeSpecName: "config") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.711172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.711764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6df0440c-1e20-44b4-a6b8-6b932a2cb19f" (UID: "6df0440c-1e20-44b4-a6b8-6b932a2cb19f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.713536 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.720338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.749792 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.753059 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.755295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.756055 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.773117 4707 scope.go:117] "RemoveContainer" containerID="b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.791193 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.803910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.803964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtkxf\" (UniqueName: \"kubernetes.io/projected/3bace925-e0b3-4554-94fe-fb8173144aef-kube-api-access-xtkxf\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804176 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-config-data\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") 
" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcngf\" (UniqueName: \"kubernetes.io/projected/ced77cf4-40f3-409a-a776-c47dabd7b84f-kube-api-access-wcngf\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-scripts\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-logs\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-logs\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804677 4707 scope.go:117] "RemoveContainer" containerID="10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804871 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804892 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804903 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804913 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804922 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf9lk\" (UniqueName: \"kubernetes.io/projected/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-kube-api-access-jf9lk\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804932 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.804940 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df0440c-1e20-44b4-a6b8-6b932a2cb19f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.805211 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740\": container with ID starting with 10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740 not found: ID does not exist" 
containerID="10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.805268 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740"} err="failed to get container status \"10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740\": rpc error: code = NotFound desc = could not find container \"10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740\": container with ID starting with 10f90bcb93553b67618b2b900cce605c0c940114d7e028a48dffc32a286d3740 not found: ID does not exist" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.805291 4707 scope.go:117] "RemoveContainer" containerID="b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643" Jan 21 16:02:16 crc kubenswrapper[4707]: E0121 16:02:16.806078 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643\": container with ID starting with b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643 not found: ID does not exist" containerID="b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.806120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643"} err="failed to get container status \"b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643\": rpc error: code = NotFound desc = could not find container \"b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643\": container with ID starting with b53d01024dfc0c4e35eb2b97b6381514de2028b5e0468584725126c20a357643 not found: ID does not exist" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-scripts\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906649 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-logs\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-logs\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.906922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.907000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtkxf\" (UniqueName: \"kubernetes.io/projected/3bace925-e0b3-4554-94fe-fb8173144aef-kube-api-access-xtkxf\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.907041 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-config-data\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.907072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcngf\" (UniqueName: \"kubernetes.io/projected/ced77cf4-40f3-409a-a776-c47dabd7b84f-kube-api-access-wcngf\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.907113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.908104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-logs\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.908148 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.908427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-logs\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.908641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.910154 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.910755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-scripts\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 
16:02:16.911163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.913273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.914601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.916087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.916213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.917906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.918168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-config-data\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.922285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.927708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcngf\" (UniqueName: \"kubernetes.io/projected/ced77cf4-40f3-409a-a776-c47dabd7b84f-kube-api-access-wcngf\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 
16:02:16.937750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtkxf\" (UniqueName: \"kubernetes.io/projected/3bace925-e0b3-4554-94fe-fb8173144aef-kube-api-access-xtkxf\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.938862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.946187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf"] Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.957122 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:16 crc kubenswrapper[4707]: I0121 16:02:16.963119 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.034392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.085186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc"] Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.085442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" containerID="cri-o://b9b0292f4265ad697d0af76da80ef835b40922e6b0800a62697e9af19c69303c" gracePeriod=30 Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.085802 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" containerID="cri-o://dfcfda9d4838b9c50d7eebf432589fed062da83ba1c5877073fb9c19903f4a98" gracePeriod=30 Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.087487 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.094758 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": EOF" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.108794 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.191997 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f6e827-b699-4aa7-b3bc-6729a224186f" path="/var/lib/kubelet/pods/06f6e827-b699-4aa7-b3bc-6729a224186f/volumes" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.192777 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8" path="/var/lib/kubelet/pods/abfd9d95-a8d9-4ecf-aeb6-f4e465f8c5d8/volumes" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.565934 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.617256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" event={"ID":"53a357b9-520a-4e1b-be4e-917657c1e24e","Type":"ContainerStarted","Data":"a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0"} Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.617297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" event={"ID":"53a357b9-520a-4e1b-be4e-917657c1e24e","Type":"ContainerStarted","Data":"9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30"} Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.617320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" event={"ID":"53a357b9-520a-4e1b-be4e-917657c1e24e","Type":"ContainerStarted","Data":"30a2e8789c91a984830ac98a8867d10f3ea13a3388ce8fbc35da7c0a7d9e733a"} Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.617541 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:17 crc kubenswrapper[4707]: W0121 16:02:17.619060 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podced77cf4_40f3_409a_a776_c47dabd7b84f.slice/crio-836f0c3146efe58a46bc1565139873dc0940b7862c90aebd55eee7b24457df35 WatchSource:0}: Error finding container 836f0c3146efe58a46bc1565139873dc0940b7862c90aebd55eee7b24457df35: Status 404 returned error can't find the container with id 836f0c3146efe58a46bc1565139873dc0940b7862c90aebd55eee7b24457df35 Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.619873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.620925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3bace925-e0b3-4554-94fe-fb8173144aef","Type":"ContainerStarted","Data":"e76a4362c7e4160720ed6485962cc5f4b7128cba5da3931760c11d3a3ad2b018"} Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.622521 4707 generic.go:334] "Generic (PLEG): container finished" podID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerID="b9b0292f4265ad697d0af76da80ef835b40922e6b0800a62697e9af19c69303c" exitCode=143 Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.622579 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b94c958fd-2mlhk" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.623062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" event={"ID":"8cb3219d-64cd-4b49-adaa-723e74405eda","Type":"ContainerDied","Data":"b9b0292f4265ad697d0af76da80ef835b40922e6b0800a62697e9af19c69303c"} Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.637875 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" podStartSLOduration=2.637862805 podStartE2EDuration="2.637862805s" podCreationTimestamp="2026-01-21 16:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:17.636085724 +0000 UTC m=+3634.817601946" watchObservedRunningTime="2026-01-21 16:02:17.637862805 +0000 UTC m=+3634.819379027" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.676071 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b94c958fd-2mlhk"] Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.679964 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-b94c958fd-2mlhk"] Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.907409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:17 crc kubenswrapper[4707]: I0121 16:02:17.907608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.147350 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.168164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.633312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ced77cf4-40f3-409a-a776-c47dabd7b84f","Type":"ContainerStarted","Data":"4389a6536c38ee5182d4c5496c074f0119e308ffa099463d2d0def3ee26597b4"} Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.633354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ced77cf4-40f3-409a-a776-c47dabd7b84f","Type":"ContainerStarted","Data":"bb6bca71e3d12e1edf4e10a2e9b41ae39d95b4a7860f5c4bfd9ccf3346a9dda6"} Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.633365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ced77cf4-40f3-409a-a776-c47dabd7b84f","Type":"ContainerStarted","Data":"836f0c3146efe58a46bc1565139873dc0940b7862c90aebd55eee7b24457df35"} Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.635490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3bace925-e0b3-4554-94fe-fb8173144aef","Type":"ContainerStarted","Data":"bcb487a9ff3183117b3e23897c23dfd1d280f8b5d1e1acafd6d3a73af5a188bb"} Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.635525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" 
event={"ID":"3bace925-e0b3-4554-94fe-fb8173144aef","Type":"ContainerStarted","Data":"40c8b7431aab97514f8bab7190356c7c5b4555133652fcb6de00896dcf5239f1"} Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.651961 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.651950712 podStartE2EDuration="2.651950712s" podCreationTimestamp="2026-01-21 16:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:18.646762897 +0000 UTC m=+3635.828279119" watchObservedRunningTime="2026-01-21 16:02:18.651950712 +0000 UTC m=+3635.833466934" Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.664018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:02:18 crc kubenswrapper[4707]: I0121 16:02:18.672760 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.672746826 podStartE2EDuration="2.672746826s" podCreationTimestamp="2026-01-21 16:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:18.67063753 +0000 UTC m=+3635.852153752" watchObservedRunningTime="2026-01-21 16:02:18.672746826 +0000 UTC m=+3635.854263038" Jan 21 16:02:19 crc kubenswrapper[4707]: I0121 16:02:19.191081 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df0440c-1e20-44b4-a6b8-6b932a2cb19f" path="/var/lib/kubelet/pods/6df0440c-1e20-44b4-a6b8-6b932a2cb19f/volumes" Jan 21 16:02:20 crc kubenswrapper[4707]: E0121 16:02:20.636915 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185 is running failed: container process not found" containerID="82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:02:20 crc kubenswrapper[4707]: E0121 16:02:20.638250 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185 is running failed: container process not found" containerID="82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:02:20 crc kubenswrapper[4707]: E0121 16:02:20.642622 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185 is running failed: container process not found" containerID="82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:02:20 crc kubenswrapper[4707]: E0121 16:02:20.642664 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" 
podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.660262 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerID="5d26b491314dc3eee1fb728f8d549b40c8203991cf2c9184bca2c9cc244b7041" exitCode=0 Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.660332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7a2c79bc-f09f-4e7e-aa40-80cf2914a696","Type":"ContainerDied","Data":"5d26b491314dc3eee1fb728f8d549b40c8203991cf2c9184bca2c9cc244b7041"} Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.665455 4707 generic.go:334] "Generic (PLEG): container finished" podID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerID="82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185" exitCode=1 Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.665505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerDied","Data":"82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185"} Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.666083 4707 scope.go:117] "RemoveContainer" containerID="82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185" Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.670147 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerID="202e532715339b4e74f96986e7e29341d437676e0174e845bd4d5cafd176f0c5" exitCode=0 Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.670220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c3caa588-23c8-4f8c-9a2d-4340345a5a70","Type":"ContainerDied","Data":"202e532715339b4e74f96986e7e29341d437676e0174e845bd4d5cafd176f0c5"} Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.815803 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.822319 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.990942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-public-tls-certs\") pod \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.990987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w8td\" (UniqueName: \"kubernetes.io/projected/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-kube-api-access-2w8td\") pod \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.991023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-internal-tls-certs\") pod \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.991056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-config-data\") pod \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.991084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-combined-ca-bundle\") pod \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.991106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9bw\" (UniqueName: \"kubernetes.io/projected/c3caa588-23c8-4f8c-9a2d-4340345a5a70-kube-api-access-rs9bw\") pod \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.991754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-combined-ca-bundle\") pod \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.991785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-nova-metadata-tls-certs\") pod \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.992122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3caa588-23c8-4f8c-9a2d-4340345a5a70-logs\") pod \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.992179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-config-data\") pod 
\"c3caa588-23c8-4f8c-9a2d-4340345a5a70\" (UID: \"c3caa588-23c8-4f8c-9a2d-4340345a5a70\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.992203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-logs\") pod \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\" (UID: \"7a2c79bc-f09f-4e7e-aa40-80cf2914a696\") " Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.992791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-logs" (OuterVolumeSpecName: "logs") pod "7a2c79bc-f09f-4e7e-aa40-80cf2914a696" (UID: "7a2c79bc-f09f-4e7e-aa40-80cf2914a696"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:20 crc kubenswrapper[4707]: I0121 16:02:20.992846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3caa588-23c8-4f8c-9a2d-4340345a5a70-logs" (OuterVolumeSpecName: "logs") pod "c3caa588-23c8-4f8c-9a2d-4340345a5a70" (UID: "c3caa588-23c8-4f8c-9a2d-4340345a5a70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.005791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3caa588-23c8-4f8c-9a2d-4340345a5a70-kube-api-access-rs9bw" (OuterVolumeSpecName: "kube-api-access-rs9bw") pod "c3caa588-23c8-4f8c-9a2d-4340345a5a70" (UID: "c3caa588-23c8-4f8c-9a2d-4340345a5a70"). InnerVolumeSpecName "kube-api-access-rs9bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.005895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-kube-api-access-2w8td" (OuterVolumeSpecName: "kube-api-access-2w8td") pod "7a2c79bc-f09f-4e7e-aa40-80cf2914a696" (UID: "7a2c79bc-f09f-4e7e-aa40-80cf2914a696"). InnerVolumeSpecName "kube-api-access-2w8td". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.015375 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a2c79bc-f09f-4e7e-aa40-80cf2914a696" (UID: "7a2c79bc-f09f-4e7e-aa40-80cf2914a696"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.016494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3caa588-23c8-4f8c-9a2d-4340345a5a70" (UID: "c3caa588-23c8-4f8c-9a2d-4340345a5a70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.023114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-config-data" (OuterVolumeSpecName: "config-data") pod "7a2c79bc-f09f-4e7e-aa40-80cf2914a696" (UID: "7a2c79bc-f09f-4e7e-aa40-80cf2914a696"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.029823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-config-data" (OuterVolumeSpecName: "config-data") pod "c3caa588-23c8-4f8c-9a2d-4340345a5a70" (UID: "c3caa588-23c8-4f8c-9a2d-4340345a5a70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.037591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7a2c79bc-f09f-4e7e-aa40-80cf2914a696" (UID: "7a2c79bc-f09f-4e7e-aa40-80cf2914a696"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.045658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a2c79bc-f09f-4e7e-aa40-80cf2914a696" (UID: "7a2c79bc-f09f-4e7e-aa40-80cf2914a696"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.046661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c3caa588-23c8-4f8c-9a2d-4340345a5a70" (UID: "c3caa588-23c8-4f8c-9a2d-4340345a5a70"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.094934 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w8td\" (UniqueName: \"kubernetes.io/projected/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-kube-api-access-2w8td\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.094967 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.094977 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.094988 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.094998 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9bw\" (UniqueName: \"kubernetes.io/projected/c3caa588-23c8-4f8c-9a2d-4340345a5a70-kube-api-access-rs9bw\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.095005 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.095013 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.095020 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3caa588-23c8-4f8c-9a2d-4340345a5a70-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.095028 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3caa588-23c8-4f8c-9a2d-4340345a5a70-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.095035 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.095042 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a2c79bc-f09f-4e7e-aa40-80cf2914a696-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.679279 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.679488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7a2c79bc-f09f-4e7e-aa40-80cf2914a696","Type":"ContainerDied","Data":"756cab360c125f911a47a2a4241a2a0d28caa9459001331a4cb7fd41a7c9ad99"} Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.679551 4707 scope.go:117] "RemoveContainer" containerID="5d26b491314dc3eee1fb728f8d549b40c8203991cf2c9184bca2c9cc244b7041" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.681269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerStarted","Data":"382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4"} Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.685419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c3caa588-23c8-4f8c-9a2d-4340345a5a70","Type":"ContainerDied","Data":"9e9e09e5a13714abe5351750b589c12f25e4f4ece7aa12bdb33843b977c27a74"} Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.685487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.698025 4707 scope.go:117] "RemoveContainer" containerID="c76746848586d16a8a6d12eabf0bf66bbcce199aa33388ae4e87eb3bdd155156" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.706601 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.715760 4707 scope.go:117] "RemoveContainer" containerID="202e532715339b4e74f96986e7e29341d437676e0174e845bd4d5cafd176f0c5" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.719908 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.733743 4707 scope.go:117] "RemoveContainer" containerID="f4e4007ad0d4c5f9910c3506d4489756309ca207796bd3930eac18605903d1db" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.737024 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.742151 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.747471 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: E0121 16:02:21.747856 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-api" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.747870 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-api" Jan 21 16:02:21 crc kubenswrapper[4707]: E0121 16:02:21.747882 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-metadata" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.747887 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-metadata" Jan 21 16:02:21 crc kubenswrapper[4707]: E0121 
16:02:21.747905 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-log" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.747911 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-log" Jan 21 16:02:21 crc kubenswrapper[4707]: E0121 16:02:21.747933 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-log" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.747938 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-log" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.748129 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-log" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.748143 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-metadata" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.748154 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" containerName="nova-metadata-log" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.748163 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" containerName="nova-api-api" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.749065 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.751368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.751522 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.752912 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.755056 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.756691 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.756886 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.757082 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.763653 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.768563 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425c52a3-93a2-49fc-82d9-d7d86f381335-logs\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-config-data\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-public-tls-certs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61bace1e-5cf4-462a-9d82-0f972aa44b75-logs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qhz\" (UniqueName: \"kubernetes.io/projected/61bace1e-5cf4-462a-9d82-0f972aa44b75-kube-api-access-74qhz\") pod \"nova-api-0\" (UID: 
\"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-config-data\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg2nj\" (UniqueName: \"kubernetes.io/projected/425c52a3-93a2-49fc-82d9-d7d86f381335-kube-api-access-hg2nj\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:21 crc kubenswrapper[4707]: I0121 16:02:21.911883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-public-tls-certs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61bace1e-5cf4-462a-9d82-0f972aa44b75-logs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qhz\" (UniqueName: \"kubernetes.io/projected/61bace1e-5cf4-462a-9d82-0f972aa44b75-kube-api-access-74qhz\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 
16:02:22.013905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-config-data\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg2nj\" (UniqueName: \"kubernetes.io/projected/425c52a3-93a2-49fc-82d9-d7d86f381335-kube-api-access-hg2nj\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.013978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.014000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425c52a3-93a2-49fc-82d9-d7d86f381335-logs\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.014039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-config-data\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.014637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425c52a3-93a2-49fc-82d9-d7d86f381335-logs\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.015993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61bace1e-5cf4-462a-9d82-0f972aa44b75-logs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.025632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-public-tls-certs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.026283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.027342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.027544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qhz\" (UniqueName: \"kubernetes.io/projected/61bace1e-5cf4-462a-9d82-0f972aa44b75-kube-api-access-74qhz\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.027587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.028098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.028538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-config-data\") pod \"nova-api-0\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.028121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-config-data\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.029121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg2nj\" (UniqueName: \"kubernetes.io/projected/425c52a3-93a2-49fc-82d9-d7d86f381335-kube-api-access-hg2nj\") pod \"nova-metadata-0\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.067774 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.080429 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.486869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.500760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:02:22 crc kubenswrapper[4707]: W0121 16:02:22.511378 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61bace1e_5cf4_462a_9d82_0f972aa44b75.slice/crio-54256dbc5cc393032b3025ee1aff02d0c3a2825895a7ce9c03406705f89c1f27 WatchSource:0}: Error finding container 54256dbc5cc393032b3025ee1aff02d0c3a2825895a7ce9c03406705f89c1f27: Status 404 returned error can't find the container with id 54256dbc5cc393032b3025ee1aff02d0c3a2825895a7ce9c03406705f89c1f27 Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.703017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"61bace1e-5cf4-462a-9d82-0f972aa44b75","Type":"ContainerStarted","Data":"bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445"} Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.703053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"61bace1e-5cf4-462a-9d82-0f972aa44b75","Type":"ContainerStarted","Data":"54256dbc5cc393032b3025ee1aff02d0c3a2825895a7ce9c03406705f89c1f27"} Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.708150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"425c52a3-93a2-49fc-82d9-d7d86f381335","Type":"ContainerStarted","Data":"d18433085f15d52dffb3af4e5e2caea7ff2d5749b988e012f1b428f0d7983a99"} Jan 21 16:02:22 crc kubenswrapper[4707]: I0121 16:02:22.942410 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.62:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.192676 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2c79bc-f09f-4e7e-aa40-80cf2914a696" path="/var/lib/kubelet/pods/7a2c79bc-f09f-4e7e-aa40-80cf2914a696/volumes" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.193267 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3caa588-23c8-4f8c-9a2d-4340345a5a70" path="/var/lib/kubelet/pods/c3caa588-23c8-4f8c-9a2d-4340345a5a70/volumes" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.481651 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.493417 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.555184 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-676cb6d794-mt7xp"] Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.555437 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" 
podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-log" containerID="cri-o://d376d4f2415e861701dbdf6f065d3a2783b3c5f5fa001d4b22538477483c226a" gracePeriod=30 Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.555495 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-api" containerID="cri-o://e14681d4cf87d7520d2927a4c5c73b026b63cf5e240e1a16e2e0ba3ad541c350" gracePeriod=30 Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.622024 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.62:8776/healthcheck\": read tcp 10.217.0.2:43828->10.217.1.62:8776: read: connection reset by peer" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.731549 4707 generic.go:334] "Generic (PLEG): container finished" podID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerID="d376d4f2415e861701dbdf6f065d3a2783b3c5f5fa001d4b22538477483c226a" exitCode=143 Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.731606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" event={"ID":"f08e764c-2c5a-4a9d-9922-7f21e90687ff","Type":"ContainerDied","Data":"d376d4f2415e861701dbdf6f065d3a2783b3c5f5fa001d4b22538477483c226a"} Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.733536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"61bace1e-5cf4-462a-9d82-0f972aa44b75","Type":"ContainerStarted","Data":"f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0"} Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.737549 4707 generic.go:334] "Generic (PLEG): container finished" podID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" exitCode=1 Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.737615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerDied","Data":"382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4"} Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.737646 4707 scope.go:117] "RemoveContainer" containerID="82014f703c414396dea36919be89565e6689374551cb369ca777e17ea9e10185" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.738249 4707 scope.go:117] "RemoveContainer" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:02:23 crc kubenswrapper[4707]: E0121 16:02:23.738467 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(c35a2e10-5d5d-44b3-b43e-9a1619f7105c)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.740849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"425c52a3-93a2-49fc-82d9-d7d86f381335","Type":"ContainerStarted","Data":"170b32ec6e5a16b7ab4b852d6a2315b320618427d7554c7752c0c21ea1350ffa"} Jan 21 16:02:23 crc 
kubenswrapper[4707]: I0121 16:02:23.740876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"425c52a3-93a2-49fc-82d9-d7d86f381335","Type":"ContainerStarted","Data":"7cfc0f593e80d12c7b3d86938139b541aeeef5f717713f7020df220d37653d45"} Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.743116 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerID="03dd9833f6900410aad689e2c52cd17eb9a6b068055cdd52e3ee4b84e822c0c7" exitCode=0 Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.743195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8","Type":"ContainerDied","Data":"03dd9833f6900410aad689e2c52cd17eb9a6b068055cdd52e3ee4b84e822c0c7"} Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.754833 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.754818383 podStartE2EDuration="2.754818383s" podCreationTimestamp="2026-01-21 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:23.745957903 +0000 UTC m=+3640.927474135" watchObservedRunningTime="2026-01-21 16:02:23.754818383 +0000 UTC m=+3640.936334606" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.788431 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.788409937 podStartE2EDuration="2.788409937s" podCreationTimestamp="2026-01-21 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:23.781003129 +0000 UTC m=+3640.962519352" watchObservedRunningTime="2026-01-21 16:02:23.788409937 +0000 UTC m=+3640.969926159" Jan 21 16:02:23 crc kubenswrapper[4707]: I0121 16:02:23.989266 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.152801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-scripts\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.152916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-combined-ca-bundle\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.152988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlf26\" (UniqueName: \"kubernetes.io/projected/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-kube-api-access-tlf26\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-internal-tls-certs\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-logs\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data-custom\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-public-tls-certs\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-etc-machine-id\") pod \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\" (UID: \"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8\") " Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.153551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: 
"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.154152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-logs" (OuterVolumeSpecName: "logs") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.157451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-scripts" (OuterVolumeSpecName: "scripts") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.157825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.158108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-kube-api-access-tlf26" (OuterVolumeSpecName: "kube-api-access-tlf26") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "kube-api-access-tlf26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.177110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.190658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.192381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data" (OuterVolumeSpecName: "config-data") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.193475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" (UID: "eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254754 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254783 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254796 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlf26\" (UniqueName: \"kubernetes.io/projected/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-kube-api-access-tlf26\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254805 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254826 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254834 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254843 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254850 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.254858 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.751950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8","Type":"ContainerDied","Data":"ce8419970875a55a6026044f1bcfa24491b13e9dd25837e7ee133bad0efef6b9"} Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.752179 4707 scope.go:117] "RemoveContainer" containerID="03dd9833f6900410aad689e2c52cd17eb9a6b068055cdd52e3ee4b84e822c0c7" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.752326 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.773978 4707 scope.go:117] "RemoveContainer" containerID="0c91ea7d49d1293494ad3b71e6a2d9ea9e33ccb909db292e4603ce1aa9e9bb4d" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.782372 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.799497 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.819950 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:02:24 crc kubenswrapper[4707]: E0121 16:02:24.820482 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.820496 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" Jan 21 16:02:24 crc kubenswrapper[4707]: E0121 16:02:24.820528 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api-log" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.820534 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api-log" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.820704 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.820724 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" containerName="cinder-api-log" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.821606 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.823670 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.823822 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.824026 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.828940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.966712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.966875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1b367d6-cfdb-44f9-906b-8799d80d073c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cqx\" (UniqueName: \"kubernetes.io/projected/a1b367d6-cfdb-44f9-906b-8799d80d073c-kube-api-access-m7cqx\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-scripts\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967505 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b367d6-cfdb-44f9-906b-8799d80d073c-logs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:24 crc kubenswrapper[4707]: I0121 16:02:24.967583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.068979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1b367d6-cfdb-44f9-906b-8799d80d073c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cqx\" (UniqueName: \"kubernetes.io/projected/a1b367d6-cfdb-44f9-906b-8799d80d073c-kube-api-access-m7cqx\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-scripts\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b367d6-cfdb-44f9-906b-8799d80d073c-logs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.069657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1b367d6-cfdb-44f9-906b-8799d80d073c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.070531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b367d6-cfdb-44f9-906b-8799d80d073c-logs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.073800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.073968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.073974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.074237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-scripts\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.085486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.093315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cqx\" (UniqueName: \"kubernetes.io/projected/a1b367d6-cfdb-44f9-906b-8799d80d073c-kube-api-access-m7cqx\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.093625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data\") pod \"cinder-api-0\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.140605 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.190952 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8" path="/var/lib/kubelet/pods/eb8a3deb-3b7f-42d0-a711-7291ed5d9bc8/volumes" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.506185 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": read tcp 10.217.0.2:35170->10.217.1.45:9311: read: connection reset by peer" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.506910 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": read tcp 10.217.0.2:35154->10.217.1.45:9311: read: connection reset by peer" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.524244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.635085 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.635341 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.635383 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.635405 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.635685 4707 scope.go:117] "RemoveContainer" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:02:25 crc kubenswrapper[4707]: E0121 16:02:25.635903 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(c35a2e10-5d5d-44b3-b43e-9a1619f7105c)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.759377 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.781833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a1b367d6-cfdb-44f9-906b-8799d80d073c","Type":"ContainerStarted","Data":"84abeb97bd70141bf1e6224f255e796ba24f3eb2484fb98985e286c0f02ebc46"} Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.799669 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerID="dfcfda9d4838b9c50d7eebf432589fed062da83ba1c5877073fb9c19903f4a98" exitCode=0 Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.799719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" event={"ID":"8cb3219d-64cd-4b49-adaa-723e74405eda","Type":"ContainerDied","Data":"dfcfda9d4838b9c50d7eebf432589fed062da83ba1c5877073fb9c19903f4a98"} Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.801690 4707 scope.go:117] "RemoveContainer" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:02:25 crc kubenswrapper[4707]: E0121 16:02:25.807057 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(c35a2e10-5d5d-44b3-b43e-9a1619f7105c)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.810298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-65c9d986d8-drhzk"] Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.810566 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" podUID="b998d320-944f-413b-a5b5-2a140207b229" containerName="keystone-api" containerID="cri-o://de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae" gracePeriod=30 Jan 21 16:02:25 crc kubenswrapper[4707]: I0121 16:02:25.922420 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.090630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-public-tls-certs\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.090751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.090835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data-custom\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.090942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-combined-ca-bundle\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.090976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cb3219d-64cd-4b49-adaa-723e74405eda-logs\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.091020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-internal-tls-certs\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.091044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcqkg\" (UniqueName: \"kubernetes.io/projected/8cb3219d-64cd-4b49-adaa-723e74405eda-kube-api-access-wcqkg\") pod \"8cb3219d-64cd-4b49-adaa-723e74405eda\" (UID: \"8cb3219d-64cd-4b49-adaa-723e74405eda\") " Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.091384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb3219d-64cd-4b49-adaa-723e74405eda-logs" (OuterVolumeSpecName: "logs") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.095089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb3219d-64cd-4b49-adaa-723e74405eda-kube-api-access-wcqkg" (OuterVolumeSpecName: "kube-api-access-wcqkg") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "kube-api-access-wcqkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.095097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.129420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.143499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.147629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data" (OuterVolumeSpecName: "config-data") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.152096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8cb3219d-64cd-4b49-adaa-723e74405eda" (UID: "8cb3219d-64cd-4b49-adaa-723e74405eda"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.186056 4707 scope.go:117] "RemoveContainer" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:02:26 crc kubenswrapper[4707]: E0121 16:02:26.186276 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(4b535454-f068-42cd-9817-a23bd7d7aa4d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193255 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193356 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cb3219d-64cd-4b49-adaa-723e74405eda-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193410 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193475 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcqkg\" (UniqueName: \"kubernetes.io/projected/8cb3219d-64cd-4b49-adaa-723e74405eda-kube-api-access-wcqkg\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193529 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193575 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.193621 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cb3219d-64cd-4b49-adaa-723e74405eda-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.810286 4707 generic.go:334] "Generic (PLEG): container finished" podID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerID="e14681d4cf87d7520d2927a4c5c73b026b63cf5e240e1a16e2e0ba3ad541c350" exitCode=0 Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.810365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" event={"ID":"f08e764c-2c5a-4a9d-9922-7f21e90687ff","Type":"ContainerDied","Data":"e14681d4cf87d7520d2927a4c5c73b026b63cf5e240e1a16e2e0ba3ad541c350"} Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.812531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a1b367d6-cfdb-44f9-906b-8799d80d073c","Type":"ContainerStarted","Data":"540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc"} Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.812577 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a1b367d6-cfdb-44f9-906b-8799d80d073c","Type":"ContainerStarted","Data":"20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810"} Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.812707 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.814131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" event={"ID":"8cb3219d-64cd-4b49-adaa-723e74405eda","Type":"ContainerDied","Data":"4b87ba974014fb0245b4eb81f177a643e50e1d85cf74e1442f205407f5c2ab9d"} Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.814180 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.814180 4707 scope.go:117] "RemoveContainer" containerID="dfcfda9d4838b9c50d7eebf432589fed062da83ba1c5877073fb9c19903f4a98" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.853540 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.853517061 podStartE2EDuration="2.853517061s" podCreationTimestamp="2026-01-21 16:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:26.835850592 +0000 UTC m=+3644.017366813" watchObservedRunningTime="2026-01-21 16:02:26.853517061 +0000 UTC m=+3644.035033283" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.857230 4707 scope.go:117] "RemoveContainer" containerID="b9b0292f4265ad697d0af76da80ef835b40922e6b0800a62697e9af19c69303c" Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.865441 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc"] Jan 21 16:02:26 crc kubenswrapper[4707]: I0121 16:02:26.872573 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-85b75f6d64-zntqc"] Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.068428 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.068870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.079760 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.088347 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.088498 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.110261 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.110306 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.122858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.123777 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.133194 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.141252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.191732 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" path="/var/lib/kubelet/pods/8cb3219d-64cd-4b49-adaa-723e74405eda/volumes" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjcbq\" (UniqueName: \"kubernetes.io/projected/f08e764c-2c5a-4a9d-9922-7f21e90687ff-kube-api-access-gjcbq\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-internal-tls-certs\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-scripts\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-public-tls-certs\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214470 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f08e764c-2c5a-4a9d-9922-7f21e90687ff-logs\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-combined-ca-bundle\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-config-data\") pod \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\" (UID: \"f08e764c-2c5a-4a9d-9922-7f21e90687ff\") " Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.214825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08e764c-2c5a-4a9d-9922-7f21e90687ff-logs" (OuterVolumeSpecName: "logs") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.215317 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08e764c-2c5a-4a9d-9922-7f21e90687ff-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.218431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-scripts" (OuterVolumeSpecName: "scripts") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.218671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08e764c-2c5a-4a9d-9922-7f21e90687ff-kube-api-access-gjcbq" (OuterVolumeSpecName: "kube-api-access-gjcbq") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "kube-api-access-gjcbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.252104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-config-data" (OuterVolumeSpecName: "config-data") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.254593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.280946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.285067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f08e764c-2c5a-4a9d-9922-7f21e90687ff" (UID: "f08e764c-2c5a-4a9d-9922-7f21e90687ff"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.317186 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.317210 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.317221 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.317230 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjcbq\" (UniqueName: \"kubernetes.io/projected/f08e764c-2c5a-4a9d-9922-7f21e90687ff-kube-api-access-gjcbq\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.317241 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.317249 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08e764c-2c5a-4a9d-9922-7f21e90687ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" event={"ID":"f08e764c-2c5a-4a9d-9922-7f21e90687ff","Type":"ContainerDied","Data":"75bde2be090b5aeab3fb578e661ef995d58f1e11b79c512b0cc6ef7cf5f800d9"} Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824325 4707 scope.go:117] "RemoveContainer" containerID="e14681d4cf87d7520d2927a4c5c73b026b63cf5e240e1a16e2e0ba3ad541c350" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824919 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824949 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.824979 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-676cb6d794-mt7xp" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.844477 4707 scope.go:117] "RemoveContainer" containerID="d376d4f2415e861701dbdf6f065d3a2783b3c5f5fa001d4b22538477483c226a" Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.859314 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-676cb6d794-mt7xp"] Jan 21 16:02:27 crc kubenswrapper[4707]: I0121 16:02:27.866236 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-676cb6d794-mt7xp"] Jan 21 16:02:28 crc kubenswrapper[4707]: I0121 16:02:28.183032 4707 scope.go:117] "RemoveContainer" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:02:28 crc kubenswrapper[4707]: E0121 16:02:28.183689 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(4af60f39-57ca-4595-be87-1d1a0c869d72)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.215879 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" path="/var/lib/kubelet/pods/f08e764c-2c5a-4a9d-9922-7f21e90687ff/volumes" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.336075 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.448962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.455953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-scripts\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-public-tls-certs\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-internal-tls-certs\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-combined-ca-bundle\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-config-data\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-credential-keys\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl4ms\" (UniqueName: \"kubernetes.io/projected/b998d320-944f-413b-a5b5-2a140207b229-kube-api-access-cl4ms\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.456388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-fernet-keys\") pod \"b998d320-944f-413b-a5b5-2a140207b229\" (UID: \"b998d320-944f-413b-a5b5-2a140207b229\") " Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.460784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-scripts" (OuterVolumeSpecName: "scripts") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.461046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.466208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.470408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b998d320-944f-413b-a5b5-2a140207b229-kube-api-access-cl4ms" (OuterVolumeSpecName: "kube-api-access-cl4ms") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "kube-api-access-cl4ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.483436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.494939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-config-data" (OuterVolumeSpecName: "config-data") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.508760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.510285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b998d320-944f-413b-a5b5-2a140207b229" (UID: "b998d320-944f-413b-a5b5-2a140207b229"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559167 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559194 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559207 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559215 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559225 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl4ms\" (UniqueName: \"kubernetes.io/projected/b998d320-944f-413b-a5b5-2a140207b229-kube-api-access-cl4ms\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559233 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559241 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.559248 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b998d320-944f-413b-a5b5-2a140207b229-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.565782 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.598941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.601079 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.840405 4707 generic.go:334] "Generic (PLEG): container finished" podID="b998d320-944f-413b-a5b5-2a140207b229" containerID="de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae" exitCode=0 Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.840462 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.840460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" event={"ID":"b998d320-944f-413b-a5b5-2a140207b229","Type":"ContainerDied","Data":"de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae"} Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.840796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65c9d986d8-drhzk" event={"ID":"b998d320-944f-413b-a5b5-2a140207b229","Type":"ContainerDied","Data":"b0134d4948fd07698704528909442219aa1b5af8bb0646cd44422082686ca787"} Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.840858 4707 scope.go:117] "RemoveContainer" containerID="de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.861564 4707 scope.go:117] "RemoveContainer" containerID="de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae" Jan 21 16:02:29 crc kubenswrapper[4707]: E0121 16:02:29.861853 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae\": container with ID starting with de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae not found: ID does not exist" containerID="de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.861882 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae"} err="failed to get container status \"de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae\": rpc error: code = NotFound desc = could not find container \"de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae\": container with ID starting with de02a966d40029506f544c1a64f8cd608c016057f075d5dcf0e557e6e99976ae not found: ID does not exist" Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.866031 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-65c9d986d8-drhzk"] Jan 21 16:02:29 crc kubenswrapper[4707]: I0121 16:02:29.872376 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-65c9d986d8-drhzk"] Jan 21 16:02:31 crc kubenswrapper[4707]: I0121 16:02:31.191198 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b998d320-944f-413b-a5b5-2a140207b229" path="/var/lib/kubelet/pods/b998d320-944f-413b-a5b5-2a140207b229/volumes" Jan 21 16:02:32 crc kubenswrapper[4707]: I0121 16:02:32.068532 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:32 crc kubenswrapper[4707]: I0121 16:02:32.068580 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:32 crc kubenswrapper[4707]: I0121 16:02:32.081145 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:32 crc kubenswrapper[4707]: I0121 16:02:32.081187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:33 crc kubenswrapper[4707]: I0121 16:02:33.081934 4707 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.82:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:33 crc kubenswrapper[4707]: I0121 16:02:33.081954 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.82:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:33 crc kubenswrapper[4707]: I0121 16:02:33.092966 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.83:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:33 crc kubenswrapper[4707]: I0121 16:02:33.092971 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.83:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.261881 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs"] Jan 21 16:02:35 crc kubenswrapper[4707]: E0121 16:02:35.262417 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262429 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" Jan 21 16:02:35 crc kubenswrapper[4707]: E0121 16:02:35.262444 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-log" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262450 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-log" Jan 21 16:02:35 crc kubenswrapper[4707]: E0121 16:02:35.262458 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b998d320-944f-413b-a5b5-2a140207b229" containerName="keystone-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262463 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b998d320-944f-413b-a5b5-2a140207b229" containerName="keystone-api" Jan 21 16:02:35 crc kubenswrapper[4707]: E0121 16:02:35.262478 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" Jan 21 16:02:35 crc kubenswrapper[4707]: E0121 16:02:35.262502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262506 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" 
containerName="placement-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262694 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262709 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262721 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb3219d-64cd-4b49-adaa-723e74405eda" containerName="barbican-api-log" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262730 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08e764c-2c5a-4a9d-9922-7f21e90687ff" containerName="placement-log" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.262737 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b998d320-944f-413b-a5b5-2a140207b229" containerName="keystone-api" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.263622 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.269991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs"] Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.354724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-internal-tls-certs\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.354771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-public-tls-certs\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.354794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-log-httpd\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.354831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-config-data\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.354847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-combined-ca-bundle\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 
16:02:35.354898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-etc-swift\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.354961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwwb\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-kube-api-access-wlwwb\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.355021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-run-httpd\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.456902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwwb\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-kube-api-access-wlwwb\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.456961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-run-httpd\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.457043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-internal-tls-certs\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.457062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-public-tls-certs\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.457081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-log-httpd\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.457095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-config-data\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " 
pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.457108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-combined-ca-bundle\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.457136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-etc-swift\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.458034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-run-httpd\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.458450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-log-httpd\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.462336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-public-tls-certs\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.462487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-combined-ca-bundle\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.464560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-internal-tls-certs\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.467205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-config-data\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.468370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-etc-swift\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " 
pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.473292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwwb\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-kube-api-access-wlwwb\") pod \"swift-proxy-58ff699466-ffxgs\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.578254 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:35 crc kubenswrapper[4707]: I0121 16:02:35.990116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs"] Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.907323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" event={"ID":"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be","Type":"ContainerStarted","Data":"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa"} Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.907723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" event={"ID":"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be","Type":"ContainerStarted","Data":"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb"} Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.907736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" event={"ID":"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be","Type":"ContainerStarted","Data":"f158eb4dec08518c041253742307fa9d0b4db27b4d8a049c5f839e5ffbf80f6a"} Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.909533 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.909571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.925938 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" podStartSLOduration=1.9259245919999999 podStartE2EDuration="1.925924592s" podCreationTimestamp="2026-01-21 16:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:02:36.923643051 +0000 UTC m=+3654.105159273" watchObservedRunningTime="2026-01-21 16:02:36.925924592 +0000 UTC m=+3654.107440813" Jan 21 16:02:36 crc kubenswrapper[4707]: I0121 16:02:36.960551 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:02:37 crc kubenswrapper[4707]: I0121 16:02:37.182348 4707 scope.go:117] "RemoveContainer" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:02:37 crc kubenswrapper[4707]: I0121 16:02:37.918278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerStarted","Data":"b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed"} Jan 21 16:02:38 crc kubenswrapper[4707]: I0121 16:02:38.011152 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:38 crc kubenswrapper[4707]: I0121 16:02:38.182437 4707 scope.go:117] "RemoveContainer" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:02:38 crc kubenswrapper[4707]: I0121 16:02:38.926908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerStarted","Data":"69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a"} Jan 21 16:02:38 crc kubenswrapper[4707]: I0121 16:02:38.928386 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.553072 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.553270 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-central-agent" containerID="cri-o://ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b" gracePeriod=30 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.553355 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="proxy-httpd" containerID="cri-o://a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163" gracePeriod=30 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.553389 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-notification-agent" containerID="cri-o://e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de" gracePeriod=30 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.553378 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="sg-core" containerID="cri-o://fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de" gracePeriod=30 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.935824 4707 generic.go:334] "Generic (PLEG): container finished" podID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerID="a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163" exitCode=0 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.935983 4707 generic.go:334] "Generic (PLEG): container finished" podID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerID="fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de" exitCode=2 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.935992 4707 generic.go:334] "Generic (PLEG): container finished" podID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerID="ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b" exitCode=0 Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.935998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerDied","Data":"a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163"} Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.936034 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerDied","Data":"fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de"} Jan 21 16:02:39 crc kubenswrapper[4707]: I0121 16:02:39.936046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerDied","Data":"ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b"} Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.182297 4707 scope.go:117] "RemoveContainer" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.186669 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.239539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-log-httpd\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.239870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-run-httpd\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.239950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-config-data\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-combined-ca-bundle\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l9wh\" (UniqueName: \"kubernetes.io/projected/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-kube-api-access-5l9wh\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-scripts\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-ceilometer-tls-certs\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-sg-core-conf-yaml\") pod \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\" (UID: \"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2\") " Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.240711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.242160 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.242181 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.245227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-kube-api-access-5l9wh" (OuterVolumeSpecName: "kube-api-access-5l9wh") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "kube-api-access-5l9wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.252895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-scripts" (OuterVolumeSpecName: "scripts") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.283893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.346882 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9wh\" (UniqueName: \"kubernetes.io/projected/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-kube-api-access-5l9wh\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.346911 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.346922 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.356409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.365948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.366927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-config-data" (OuterVolumeSpecName: "config-data") pod "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" (UID: "69bb9e05-ce31-46c3-9ebf-9cd0d05676d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.448169 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.448196 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.448207 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.635392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.944927 4707 generic.go:334] "Generic (PLEG): container finished" podID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerID="e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de" exitCode=0 Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.944978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerDied","Data":"e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de"} Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.945002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"69bb9e05-ce31-46c3-9ebf-9cd0d05676d2","Type":"ContainerDied","Data":"f0fc2a204f45c0c3bca8122fe10d5407cd0feec27f37af25ae2e737c292d6b5b"} Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.945018 4707 scope.go:117] "RemoveContainer" containerID="a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.945136 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.950019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerStarted","Data":"a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e"} Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.950197 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.971531 4707 scope.go:117] "RemoveContainer" containerID="fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.993902 4707 scope.go:117] "RemoveContainer" containerID="e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de" Jan 21 16:02:40 crc kubenswrapper[4707]: I0121 16:02:40.994551 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.011340 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.016725 4707 scope.go:117] "RemoveContainer" containerID="ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019225 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.019592 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-notification-agent" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019609 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-notification-agent" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.019632 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="proxy-httpd" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019638 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="proxy-httpd" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.019650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="sg-core" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="sg-core" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.019668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-central-agent" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019673 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-central-agent" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019877 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="sg-core" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019904 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-central-agent" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019922 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="ceilometer-notification-agent" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.019933 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" containerName="proxy-httpd" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.021461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.022756 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.022988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.023035 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.029755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.039517 4707 scope.go:117] "RemoveContainer" containerID="a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.040516 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163\": container with ID starting with a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163 not found: ID does not exist" containerID="a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.040550 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163"} err="failed to get container status \"a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163\": rpc error: code = NotFound desc = could not find container \"a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163\": container with ID starting with a8670f622a517be5cba78653b579809ebd8b97159d387552a6b7b07112214163 not found: ID does not exist" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.040572 4707 scope.go:117] "RemoveContainer" containerID="fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.041691 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de\": container with ID starting with fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de not found: ID does not exist" containerID="fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.041716 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de"} err="failed to get container status 
\"fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de\": rpc error: code = NotFound desc = could not find container \"fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de\": container with ID starting with fdecb60b7b719ada2174207d231f24f92cca93096d81ecde393cf3a7fdb5d6de not found: ID does not exist" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.041734 4707 scope.go:117] "RemoveContainer" containerID="e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.041962 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de\": container with ID starting with e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de not found: ID does not exist" containerID="e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.041977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de"} err="failed to get container status \"e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de\": rpc error: code = NotFound desc = could not find container \"e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de\": container with ID starting with e4c0d663148ed005a9df8907c4622dbc521e2b046c9afd74b751af0d1a5429de not found: ID does not exist" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.041990 4707 scope.go:117] "RemoveContainer" containerID="ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b" Jan 21 16:02:41 crc kubenswrapper[4707]: E0121 16:02:41.042414 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b\": container with ID starting with ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b not found: ID does not exist" containerID="ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.042430 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b"} err="failed to get container status \"ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b\": rpc error: code = NotFound desc = could not find container \"ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b\": container with ID starting with ff01040f7c3ed76f133bba587fb3c978815d4fefd94e8ec297aae167cc37604b not found: ID does not exist" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.160797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkws6\" (UniqueName: \"kubernetes.io/projected/a423cf5c-f81b-4df1-bf8f-51a11024e065-kube-api-access-bkws6\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-run-httpd\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-config-data\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-scripts\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.161503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-log-httpd\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.191504 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bb9e05-ce31-46c3-9ebf-9cd0d05676d2" path="/var/lib/kubelet/pods/69bb9e05-ce31-46c3-9ebf-9cd0d05676d2/volumes" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-run-httpd\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-config-data\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-scripts\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-log-httpd\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkws6\" (UniqueName: \"kubernetes.io/projected/a423cf5c-f81b-4df1-bf8f-51a11024e065-kube-api-access-bkws6\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.264091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-run-httpd\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.263892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-log-httpd\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.267951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-scripts\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.268100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.268291 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-config-data\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.268405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.273528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.278235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkws6\" (UniqueName: \"kubernetes.io/projected/a423cf5c-f81b-4df1-bf8f-51a11024e065-kube-api-access-bkws6\") pod \"ceilometer-0\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.341643 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.456227 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.48:9696/\": dial tcp 10.217.1.48:9696: connect: connection refused" Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.750510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:02:41 crc kubenswrapper[4707]: W0121 16:02:41.754888 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda423cf5c_f81b_4df1_bf8f_51a11024e065.slice/crio-39f7ba9a9da4e8ab5a4404f2a82abadc8b62dffab7357d91044be1cec47be9cd WatchSource:0}: Error finding container 39f7ba9a9da4e8ab5a4404f2a82abadc8b62dffab7357d91044be1cec47be9cd: Status 404 returned error can't find the container with id 39f7ba9a9da4e8ab5a4404f2a82abadc8b62dffab7357d91044be1cec47be9cd Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.756586 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:02:41 crc kubenswrapper[4707]: I0121 16:02:41.957363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerStarted","Data":"39f7ba9a9da4e8ab5a4404f2a82abadc8b62dffab7357d91044be1cec47be9cd"} Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.073901 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.075450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.078455 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.093126 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.093419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.093930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.098503 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.967463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerStarted","Data":"5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48"} Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.968447 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.978118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:02:42 crc kubenswrapper[4707]: I0121 16:02:42.978953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:02:43 crc kubenswrapper[4707]: I0121 16:02:43.700211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:02:43 crc kubenswrapper[4707]: I0121 16:02:43.975168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerStarted","Data":"8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06"} Jan 21 16:02:43 crc kubenswrapper[4707]: I0121 16:02:43.975207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerStarted","Data":"016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf"} Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.583967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.584321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.634853 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.659250 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.671539 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc"] Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.679626 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-httpd" containerID="cri-o://a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe" gracePeriod=30 Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.679753 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-server" containerID="cri-o://0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca" gracePeriod=30 Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.952324 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5797484f74-ph7p9_08d2edd9-a93d-4bb9-ac1e-6a23494df48c/neutron-api/0.log" Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.952529 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.997497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerStarted","Data":"46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687"} Jan 21 16:02:45 crc kubenswrapper[4707]: I0121 16:02:45.998604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.000004 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5797484f74-ph7p9_08d2edd9-a93d-4bb9-ac1e-6a23494df48c/neutron-api/0.log" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.000038 4707 generic.go:334] "Generic (PLEG): container finished" podID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerID="337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff" exitCode=137 Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.000129 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.000234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" event={"ID":"08d2edd9-a93d-4bb9-ac1e-6a23494df48c","Type":"ContainerDied","Data":"337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff"} Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.000290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5797484f74-ph7p9" event={"ID":"08d2edd9-a93d-4bb9-ac1e-6a23494df48c","Type":"ContainerDied","Data":"e4c5b63cd11f873ed2a55b4f24dcbe0506e5d934f705d5c8ab970f8724d61383"} Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.000319 4707 scope.go:117] "RemoveContainer" containerID="5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.002602 4707 generic.go:334] "Generic (PLEG): container finished" podID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerID="a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe" exitCode=0 Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.002669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" event={"ID":"7476b843-007f-4dde-8bc6-c623dbc1c101","Type":"ContainerDied","Data":"a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe"} Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.030773 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.428985056 podStartE2EDuration="6.030760061s" podCreationTimestamp="2026-01-21 16:02:40 +0000 UTC" firstStartedPulling="2026-01-21 16:02:41.756316839 +0000 UTC m=+3658.937833062" lastFinishedPulling="2026-01-21 16:02:45.358091845 +0000 UTC m=+3662.539608067" observedRunningTime="2026-01-21 16:02:46.016726583 +0000 UTC m=+3663.198242806" watchObservedRunningTime="2026-01-21 16:02:46.030760061 +0000 UTC m=+3663.212276282" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.042948 4707 scope.go:117] "RemoveContainer" containerID="337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.048212 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.059631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-httpd-config\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.059669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-internal-tls-certs\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.059765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-ovndb-tls-certs\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc 
kubenswrapper[4707]: I0121 16:02:46.059805 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-combined-ca-bundle\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.059896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xts5f\" (UniqueName: \"kubernetes.io/projected/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-kube-api-access-xts5f\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.059919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-public-tls-certs\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.059943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-config\") pod \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\" (UID: \"08d2edd9-a93d-4bb9-ac1e-6a23494df48c\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.061262 4707 scope.go:117] "RemoveContainer" containerID="5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd" Jan 21 16:02:46 crc kubenswrapper[4707]: E0121 16:02:46.061744 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd\": container with ID starting with 5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd not found: ID does not exist" containerID="5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.061783 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd"} err="failed to get container status \"5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd\": rpc error: code = NotFound desc = could not find container \"5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd\": container with ID starting with 5f6e7da538a12097f6d7c0e4b296cf1c91ec6d117cb1316dd612d4f76940c3dd not found: ID does not exist" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.061925 4707 scope.go:117] "RemoveContainer" containerID="337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff" Jan 21 16:02:46 crc kubenswrapper[4707]: E0121 16:02:46.062230 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff\": container with ID starting with 337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff not found: ID does not exist" containerID="337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.062258 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff"} err="failed to get container status 
\"337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff\": rpc error: code = NotFound desc = could not find container \"337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff\": container with ID starting with 337d0a6510e4c802410b5ffaa519ae596ed5a88312d93efca3e8546121560bff not found: ID does not exist" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.064266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-kube-api-access-xts5f" (OuterVolumeSpecName: "kube-api-access-xts5f") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "kube-api-access-xts5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.071382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.101199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-config" (OuterVolumeSpecName: "config") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.101315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.103790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.105923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.124364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "08d2edd9-a93d-4bb9-ac1e-6a23494df48c" (UID: "08d2edd9-a93d-4bb9-ac1e-6a23494df48c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162765 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162790 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162800 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xts5f\" (UniqueName: \"kubernetes.io/projected/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-kube-api-access-xts5f\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162822 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162831 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162840 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.162847 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d2edd9-a93d-4bb9-ac1e-6a23494df48c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.329911 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5797484f74-ph7p9"] Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.340248 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5797484f74-ph7p9"] Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.379850 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.435680 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5cbd44866-48h8x"] Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.436234 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-httpd" containerID="cri-o://13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871" gracePeriod=30 Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.435894 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-api" containerID="cri-o://ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877" gracePeriod=30 Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.598803 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.672940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-combined-ca-bundle\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-log-httpd\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-public-tls-certs\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-etc-swift\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-run-httpd\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-internal-tls-certs\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p87d\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-kube-api-access-2p87d\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.673321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-config-data\") pod \"7476b843-007f-4dde-8bc6-c623dbc1c101\" (UID: \"7476b843-007f-4dde-8bc6-c623dbc1c101\") " Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.674050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.674348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.679185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-kube-api-access-2p87d" (OuterVolumeSpecName: "kube-api-access-2p87d") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "kube-api-access-2p87d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.679293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.713995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-config-data" (OuterVolumeSpecName: "config-data") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.720317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.722203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.729007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7476b843-007f-4dde-8bc6-c623dbc1c101" (UID: "7476b843-007f-4dde-8bc6-c623dbc1c101"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775822 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775848 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775860 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p87d\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-kube-api-access-2p87d\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775868 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775875 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775882 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7476b843-007f-4dde-8bc6-c623dbc1c101-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775890 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7476b843-007f-4dde-8bc6-c623dbc1c101-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:46 crc kubenswrapper[4707]: I0121 16:02:46.775900 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7476b843-007f-4dde-8bc6-c623dbc1c101-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.010679 4707 generic.go:334] "Generic (PLEG): container finished" podID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerID="13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871" exitCode=0 Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.010736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" event={"ID":"d30e5918-95e9-4870-bd2e-d95fe0e448d8","Type":"ContainerDied","Data":"13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871"} Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.014015 4707 generic.go:334] "Generic (PLEG): container finished" podID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerID="0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca" exitCode=0 Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.014911 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.017849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" event={"ID":"7476b843-007f-4dde-8bc6-c623dbc1c101","Type":"ContainerDied","Data":"0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca"} Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.017877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc" event={"ID":"7476b843-007f-4dde-8bc6-c623dbc1c101","Type":"ContainerDied","Data":"7657f3179bec23b81d3e1eb3e92db7994e09a2908896463bcc24b7cbcff134e7"} Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.017892 4707 scope.go:117] "RemoveContainer" containerID="0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.037534 4707 scope.go:117] "RemoveContainer" containerID="a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.044338 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc"] Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.051540 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-8545fb474-dhvxc"] Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.070856 4707 scope.go:117] "RemoveContainer" containerID="0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca" Jan 21 16:02:47 crc kubenswrapper[4707]: E0121 16:02:47.071222 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca\": container with ID starting with 0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca not found: ID does not exist" containerID="0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.071256 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca"} err="failed to get container status \"0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca\": rpc error: code = NotFound desc = could not find container \"0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca\": container with ID starting with 0ec62b23efc4615734b977030124ce62fac6f8b82fd7fc91acdbc9c4b53b95ca not found: ID does not exist" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.071274 4707 scope.go:117] "RemoveContainer" containerID="a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe" Jan 21 16:02:47 crc kubenswrapper[4707]: E0121 16:02:47.071617 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe\": container with ID starting with a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe not found: ID does not exist" containerID="a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.071640 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe"} 
err="failed to get container status \"a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe\": rpc error: code = NotFound desc = could not find container \"a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe\": container with ID starting with a18cec1e2f25dc2a0a0ede297c037a7c94b828046376d1e2b1aa4cc3676c64fe not found: ID does not exist" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.191451 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" path="/var/lib/kubelet/pods/08d2edd9-a93d-4bb9-ac1e-6a23494df48c/volumes" Jan 21 16:02:47 crc kubenswrapper[4707]: I0121 16:02:47.192092 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" path="/var/lib/kubelet/pods/7476b843-007f-4dde-8bc6-c623dbc1c101/volumes" Jan 21 16:02:48 crc kubenswrapper[4707]: I0121 16:02:48.535489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:03:05 crc kubenswrapper[4707]: I0121 16:03:05.894976 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.1:9696/\": dial tcp 10.217.1.1:9696: connect: connection refused" Jan 21 16:03:11 crc kubenswrapper[4707]: I0121 16:03:11.003504 4707 scope.go:117] "RemoveContainer" containerID="32fad64bc9d7b12c80fb73bcc5468c75aa88ca61108101349352f58ea1f38d4c" Jan 21 16:03:11 crc kubenswrapper[4707]: I0121 16:03:11.348318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.843502 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5cbd44866-48h8x_d30e5918-95e9-4870-bd2e-d95fe0e448d8/neutron-api/0.log" Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.843862 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-combined-ca-bundle\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmx4l\" (UniqueName: \"kubernetes.io/projected/d30e5918-95e9-4870-bd2e-d95fe0e448d8-kube-api-access-dmx4l\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-config\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-internal-tls-certs\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-ovndb-tls-certs\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-public-tls-certs\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.966879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-httpd-config\") pod \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\" (UID: \"d30e5918-95e9-4870-bd2e-d95fe0e448d8\") " Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.974494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:16 crc kubenswrapper[4707]: I0121 16:03:16.974520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30e5918-95e9-4870-bd2e-d95fe0e448d8-kube-api-access-dmx4l" (OuterVolumeSpecName: "kube-api-access-dmx4l") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "kube-api-access-dmx4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.005919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-config" (OuterVolumeSpecName: "config") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.006698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.007573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.010292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.020943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d30e5918-95e9-4870-bd2e-d95fe0e448d8" (UID: "d30e5918-95e9-4870-bd2e-d95fe0e448d8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069697 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069721 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmx4l\" (UniqueName: \"kubernetes.io/projected/d30e5918-95e9-4870-bd2e-d95fe0e448d8-kube-api-access-dmx4l\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069732 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069742 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069750 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069758 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.069765 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d30e5918-95e9-4870-bd2e-d95fe0e448d8-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.232060 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5cbd44866-48h8x_d30e5918-95e9-4870-bd2e-d95fe0e448d8/neutron-api/0.log" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.232575 4707 generic.go:334] "Generic (PLEG): container finished" podID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerID="ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877" exitCode=137 Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.232631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" event={"ID":"d30e5918-95e9-4870-bd2e-d95fe0e448d8","Type":"ContainerDied","Data":"ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877"} Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.232670 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.232693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5cbd44866-48h8x" event={"ID":"d30e5918-95e9-4870-bd2e-d95fe0e448d8","Type":"ContainerDied","Data":"36b42e557ae4bb41381ef7e924e97163466c22820d9e58e718f6052192da6df4"} Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.232717 4707 scope.go:117] "RemoveContainer" containerID="13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.263077 4707 scope.go:117] "RemoveContainer" containerID="ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.265485 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5cbd44866-48h8x"] Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.274071 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5cbd44866-48h8x"] Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.279099 4707 scope.go:117] "RemoveContainer" containerID="13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871" Jan 21 16:03:17 crc kubenswrapper[4707]: E0121 16:03:17.279582 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871\": container with ID starting with 13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871 not found: ID does not exist" containerID="13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.279638 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871"} err="failed to get container status \"13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871\": rpc error: code = NotFound desc = could not find container \"13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871\": container with ID starting with 13caa73c4333d5446240e22873bd80bfccf22d6d2fc6c55f343280d91ecce871 not found: ID does not exist" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.279661 4707 scope.go:117] "RemoveContainer" containerID="ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877" Jan 21 16:03:17 crc kubenswrapper[4707]: E0121 16:03:17.279937 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877\": container with ID starting with ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877 not found: ID does not exist" containerID="ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877" Jan 21 16:03:17 crc kubenswrapper[4707]: I0121 16:03:17.279963 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877"} err="failed to get container status \"ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877\": rpc error: code = NotFound desc = could not find container \"ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877\": container with ID starting with ca5688db8adf2ed4c0bcd41ea2a6f144fd54cd021ce33f81a9ca93265c7c5877 not found: ID 
does not exist" Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.290154 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.290181 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.290252 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs podName:05aad5e9-0e3a-47b4-a11f-feca99ab8dac nodeName:}" failed. No retries permitted until 2026-01-21 16:03:18.790226782 +0000 UTC m=+3695.971743004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs") pod "memcached-0" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac") : secret "cert-memcached-svc" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.290258 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.290274 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:18.790265835 +0000 UTC m=+3695.971782057 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.290342 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:18.790324727 +0000 UTC m=+3695.971840949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovn-metrics" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.802826 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.803205 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs podName:05aad5e9-0e3a-47b4-a11f-feca99ab8dac nodeName:}" failed. No retries permitted until 2026-01-21 16:03:19.803181725 +0000 UTC m=+3696.984697948 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs") pod "memcached-0" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac") : secret "cert-memcached-svc" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.802906 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.802905 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.803343 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:19.803316698 +0000 UTC m=+3696.984832921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:18 crc kubenswrapper[4707]: E0121 16:03:18.803389 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:19.803373537 +0000 UTC m=+3696.984889758 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovn-metrics" not found Jan 21 16:03:19 crc kubenswrapper[4707]: I0121 16:03:19.191938 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" path="/var/lib/kubelet/pods/d30e5918-95e9-4870-bd2e-d95fe0e448d8/volumes" Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.313089 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.313164 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs podName:4cd21f18-14c0-44d2-bf3d-967e6d74b3f9 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:19.8131488 +0000 UTC m=+3696.994665032 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs") pod "openstack-galera-0" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9") : secret "cert-galera-openstack-svc" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824011 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824061 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824090 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs podName:4cd21f18-14c0-44d2-bf3d-967e6d74b3f9 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:20.824075349 +0000 UTC m=+3698.005591571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs") pod "openstack-galera-0" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9") : secret "cert-galera-openstack-svc" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824124 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs podName:05aad5e9-0e3a-47b4-a11f-feca99ab8dac nodeName:}" failed. No retries permitted until 2026-01-21 16:03:21.824108692 +0000 UTC m=+3699.005624913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs") pod "memcached-0" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac") : secret "cert-memcached-svc" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824012 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824209 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:21.824194694 +0000 UTC m=+3699.005710916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovn-metrics" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824026 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:19 crc kubenswrapper[4707]: E0121 16:03:19.824282 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:21.824268232 +0000 UTC m=+3699.005784454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:20 crc kubenswrapper[4707]: E0121 16:03:20.841931 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:03:20 crc kubenswrapper[4707]: E0121 16:03:20.841998 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs podName:4cd21f18-14c0-44d2-bf3d-967e6d74b3f9 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:22.841983259 +0000 UTC m=+3700.023499481 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs") pod "openstack-galera-0" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9") : secret "cert-galera-openstack-svc" not found Jan 21 16:03:20 crc kubenswrapper[4707]: I0121 16:03:20.984731 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.081241 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerName="galera" containerID="cri-o://b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1" gracePeriod=30 Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.630386 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.663886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-operator-scripts\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.663966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-generated\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kolla-config\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: 
\"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-default\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-combined-ca-bundle\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.664947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bdvw\" (UniqueName: \"kubernetes.io/projected/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kube-api-access-2bdvw\") pod \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\" (UID: \"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9\") " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.665797 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.665830 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.665842 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.665850 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.677601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kube-api-access-2bdvw" (OuterVolumeSpecName: "kube-api-access-2bdvw") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "kube-api-access-2bdvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.685715 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.697257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.700998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.719574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" (UID: "4cd21f18-14c0-44d2-bf3d-967e6d74b3f9"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.766546 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.766572 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bdvw\" (UniqueName: \"kubernetes.io/projected/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-kube-api-access-2bdvw\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.766598 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.766608 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.780822 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:03:21 crc kubenswrapper[4707]: E0121 16:03:21.868913 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:21 crc kubenswrapper[4707]: I0121 16:03:21.868945 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:21 crc kubenswrapper[4707]: E0121 16:03:21.868973 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:25.868960463 +0000 UTC m=+3703.050476685 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:21 crc kubenswrapper[4707]: E0121 16:03:21.869002 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:21 crc kubenswrapper[4707]: E0121 16:03:21.869009 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:03:21 crc kubenswrapper[4707]: E0121 16:03:21.869039 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:25.869030364 +0000 UTC m=+3703.050546586 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovn-metrics" not found Jan 21 16:03:21 crc kubenswrapper[4707]: E0121 16:03:21.869071 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs podName:05aad5e9-0e3a-47b4-a11f-feca99ab8dac nodeName:}" failed. No retries permitted until 2026-01-21 16:03:25.869056593 +0000 UTC m=+3703.050572816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs") pod "memcached-0" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac") : secret "cert-memcached-svc" not found Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.271202 4707 generic.go:334] "Generic (PLEG): container finished" podID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerID="b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1" exitCode=0 Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.271278 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.271276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9","Type":"ContainerDied","Data":"b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1"} Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.271608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4cd21f18-14c0-44d2-bf3d-967e6d74b3f9","Type":"ContainerDied","Data":"2289bcf020b0a530c16c6e70dc95b796dbbed43595c00ae3006d390f0d6b0fa6"} Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.271661 4707 scope.go:117] "RemoveContainer" containerID="b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.292784 4707 scope.go:117] "RemoveContainer" containerID="2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.296104 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.301975 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.316387 4707 scope.go:117] "RemoveContainer" containerID="b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.316754 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1\": container with ID starting with b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1 not found: ID does not exist" containerID="b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.316785 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1"} err="failed to get container status \"b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1\": rpc error: code = NotFound desc = could not find container \"b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1\": container with ID starting with b5dcfd6fc701b800565b8e685e4e4ac8b8cf7e65236a29e909663842f1b2bae1 not found: ID does not exist" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.316819 4707 scope.go:117] "RemoveContainer" containerID="2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.317875 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36\": container with ID starting with 2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36 not found: ID does not exist" containerID="2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.317912 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36"} err="failed to get container status 
\"2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36\": rpc error: code = NotFound desc = could not find container \"2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36\": container with ID starting with 2ce72b364bc7a97f2c507db04f5d4ca8540c5880b13856063ae5716387393d36 not found: ID does not exist" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326477 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326802 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-api" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326837 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-api" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326851 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-server" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326857 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-server" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326879 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerName="mysql-bootstrap" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326884 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerName="mysql-bootstrap" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326891 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326897 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326907 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-api" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326913 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-api" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326926 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326932 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326941 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326946 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: E0121 16:03:22.326959 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerName="galera" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.326965 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerName="galera" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327110 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" containerName="galera" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327123 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327132 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-api" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7476b843-007f-4dde-8bc6-c623dbc1c101" containerName="proxy-server" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327166 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30e5918-95e9-4870-bd2e-d95fe0e448d8" containerName="neutron-httpd" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.327190 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d2edd9-a93d-4bb9-ac1e-6a23494df48c" containerName="neutron-api" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.328030 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.329671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-6glg2" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.331210 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.331397 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.336369 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.336800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.477616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.477873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.477917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.477973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5mg5\" (UniqueName: \"kubernetes.io/projected/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kube-api-access-l5mg5\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.478084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.478125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.478191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kolla-config\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.478233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-default\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-default\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5mg5\" (UniqueName: \"kubernetes.io/projected/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kube-api-access-l5mg5\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kolla-config\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579798 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.579893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kolla-config\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.580008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.580043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-default\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.581203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.582469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.582869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.592845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5mg5\" (UniqueName: \"kubernetes.io/projected/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kube-api-access-l5mg5\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.597829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:22 crc kubenswrapper[4707]: I0121 16:03:22.641047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:23 crc kubenswrapper[4707]: I0121 16:03:23.024690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:03:23 crc kubenswrapper[4707]: I0121 16:03:23.190297 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd21f18-14c0-44d2-bf3d-967e6d74b3f9" path="/var/lib/kubelet/pods/4cd21f18-14c0-44d2-bf3d-967e6d74b3f9/volumes" Jan 21 16:03:23 crc kubenswrapper[4707]: I0121 16:03:23.279322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f74b5687-ee85-4b71-929c-cc6ac63ffd06","Type":"ContainerStarted","Data":"46cdecded34ba4caafda591ec18134d5e73fd6deb30f6c20644310f0985b7bb6"} Jan 21 16:03:23 crc kubenswrapper[4707]: I0121 16:03:23.279367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f74b5687-ee85-4b71-929c-cc6ac63ffd06","Type":"ContainerStarted","Data":"a50dddf8b460942ac358b7cf7c3cd16415bddef2914c6ce670300c4f426403e1"} Jan 21 16:03:24 crc kubenswrapper[4707]: I0121 16:03:24.660004 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 16:03:25 crc kubenswrapper[4707]: E0121 16:03:25.931858 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:25 crc kubenswrapper[4707]: E0121 16:03:25.932079 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs 
podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.93206628 +0000 UTC m=+3711.113582502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:25 crc kubenswrapper[4707]: E0121 16:03:25.932411 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:03:25 crc kubenswrapper[4707]: E0121 16:03:25.932439 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs podName:05aad5e9-0e3a-47b4-a11f-feca99ab8dac nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.932429994 +0000 UTC m=+3711.113946216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs") pod "memcached-0" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac") : secret "cert-memcached-svc" not found Jan 21 16:03:25 crc kubenswrapper[4707]: E0121 16:03:25.932469 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:25 crc kubenswrapper[4707]: E0121 16:03:25.932490 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.932484087 +0000 UTC m=+3711.114000309 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovn-metrics" not found Jan 21 16:03:26 crc kubenswrapper[4707]: I0121 16:03:26.300617 4707 generic.go:334] "Generic (PLEG): container finished" podID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerID="46cdecded34ba4caafda591ec18134d5e73fd6deb30f6c20644310f0985b7bb6" exitCode=0 Jan 21 16:03:26 crc kubenswrapper[4707]: I0121 16:03:26.300654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f74b5687-ee85-4b71-929c-cc6ac63ffd06","Type":"ContainerDied","Data":"46cdecded34ba4caafda591ec18134d5e73fd6deb30f6c20644310f0985b7bb6"} Jan 21 16:03:27 crc kubenswrapper[4707]: I0121 16:03:27.316050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f74b5687-ee85-4b71-929c-cc6ac63ffd06","Type":"ContainerStarted","Data":"03100c9eb9b8189faab039c677c5d529534dd8ab01bec2472f4fc82fd729aab9"} Jan 21 16:03:27 crc kubenswrapper[4707]: I0121 16:03:27.329770 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.329754982 podStartE2EDuration="5.329754982s" podCreationTimestamp="2026-01-21 16:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:27.329438347 +0000 UTC m=+3704.510954570" watchObservedRunningTime="2026-01-21 16:03:27.329754982 +0000 UTC m=+3704.511271204" Jan 21 16:03:27 crc kubenswrapper[4707]: I0121 16:03:27.662691 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 16:03:27 crc kubenswrapper[4707]: I0121 16:03:27.662758 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:03:27 crc kubenswrapper[4707]: I0121 16:03:27.663228 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7"} pod="openstack-kuttl-tests/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Jan 21 16:03:27 crc kubenswrapper[4707]: I0121 16:03:27.663278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" containerID="cri-o://9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7" gracePeriod=30 Jan 21 16:03:28 crc kubenswrapper[4707]: I0121 16:03:28.688691 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:03:28 crc kubenswrapper[4707]: I0121 16:03:28.689097 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api-log" containerID="cri-o://20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810" gracePeriod=30 Jan 21 16:03:28 crc 
kubenswrapper[4707]: I0121 16:03:28.689157 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api" containerID="cri-o://540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc" gracePeriod=30 Jan 21 16:03:29 crc kubenswrapper[4707]: I0121 16:03:29.329947 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerID="20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810" exitCode=143 Jan 21 16:03:29 crc kubenswrapper[4707]: I0121 16:03:29.329994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a1b367d6-cfdb-44f9-906b-8799d80d073c","Type":"ContainerDied","Data":"20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810"} Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.300264 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.300450 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.300481 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:30.800464979 +0000 UTC m=+3707.981981202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovn-metrics" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.300503 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:30.800496519 +0000 UTC m=+3707.982012741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.808924 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.808966 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.809008 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:31.808990453 +0000 UTC m=+3708.990506676 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:30 crc kubenswrapper[4707]: E0121 16:03:30.809025 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:31.809018867 +0000 UTC m=+3708.990535089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovn-metrics" not found Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.310325 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-65b994fc4-xql9t"] Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.312322 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.324624 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-65b994fc4-xql9t"] Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.368637 4707 generic.go:334] "Generic (PLEG): container finished" podID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerID="9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7" exitCode=0 Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.368670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerDied","Data":"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7"} Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.406205 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.421689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glxg5\" (UniqueName: \"kubernetes.io/projected/329b3e52-9792-42da-a6f0-9b0fc5fb866c-kube-api-access-glxg5\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.421823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-public-tls-certs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.421924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-internal-tls-certs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.422009 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data-custom\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.422111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b3e52-9792-42da-a6f0-9b0fc5fb866c-logs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.422233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-combined-ca-bundle\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.422389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.509061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerName="galera" containerID="cri-o://6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02" gracePeriod=30 Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-public-tls-certs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-internal-tls-certs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data-custom\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b3e52-9792-42da-a6f0-9b0fc5fb866c-logs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" 
Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-combined-ca-bundle\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.524641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glxg5\" (UniqueName: \"kubernetes.io/projected/329b3e52-9792-42da-a6f0-9b0fc5fb866c-kube-api-access-glxg5\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.525219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b3e52-9792-42da-a6f0-9b0fc5fb866c-logs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.528628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-internal-tls-certs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.528843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-public-tls-certs\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.529994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data-custom\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.530306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.531178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-combined-ca-bundle\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " 
pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.540523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glxg5\" (UniqueName: \"kubernetes.io/projected/329b3e52-9792-42da-a6f0-9b0fc5fb866c-kube-api-access-glxg5\") pod \"barbican-api-65b994fc4-xql9t\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.635796 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:31 crc kubenswrapper[4707]: I0121 16:03:31.818671 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.84:8776/healthcheck\": read tcp 10.217.0.2:41542->10.217.1.84:8776: read: connection reset by peer" Jan 21 16:03:31 crc kubenswrapper[4707]: E0121 16:03:31.830494 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:31 crc kubenswrapper[4707]: E0121 16:03:31.830548 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.830533799 +0000 UTC m=+3711.012050022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:31 crc kubenswrapper[4707]: E0121 16:03:31.830584 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:31 crc kubenswrapper[4707]: E0121 16:03:31.830643 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.830628377 +0000 UTC m=+3711.012144599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovn-metrics" not found Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.027323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-65b994fc4-xql9t"] Jan 21 16:03:32 crc kubenswrapper[4707]: W0121 16:03:32.029525 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329b3e52_9792_42da_a6f0_9b0fc5fb866c.slice/crio-e630e27749f0744751134435a1f2622edee5b04f5294b9423bd95eb91a1b775b WatchSource:0}: Error finding container e630e27749f0744751134435a1f2622edee5b04f5294b9423bd95eb91a1b775b: Status 404 returned error can't find the container with id e630e27749f0744751134435a1f2622edee5b04f5294b9423bd95eb91a1b775b Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.228987 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.237797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.237936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7cqx\" (UniqueName: \"kubernetes.io/projected/a1b367d6-cfdb-44f9-906b-8799d80d073c-kube-api-access-m7cqx\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.237978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data-custom\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.237990 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.238008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1b367d6-cfdb-44f9-906b-8799d80d073c-etc-machine-id\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.238040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-combined-ca-bundle\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.238067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-public-tls-certs\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.238088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b367d6-cfdb-44f9-906b-8799d80d073c-logs\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.238117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-internal-tls-certs\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.238153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-scripts\") pod \"a1b367d6-cfdb-44f9-906b-8799d80d073c\" (UID: \"a1b367d6-cfdb-44f9-906b-8799d80d073c\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.241512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1b367d6-cfdb-44f9-906b-8799d80d073c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.242721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-scripts" (OuterVolumeSpecName: "scripts") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.243991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b367d6-cfdb-44f9-906b-8799d80d073c-logs" (OuterVolumeSpecName: "logs") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.268933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b367d6-cfdb-44f9-906b-8799d80d073c-kube-api-access-m7cqx" (OuterVolumeSpecName: "kube-api-access-m7cqx") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "kube-api-access-m7cqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.288761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.311776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.328199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343029 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1b367d6-cfdb-44f9-906b-8799d80d073c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343064 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343082 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343092 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b367d6-cfdb-44f9-906b-8799d80d073c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343107 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343116 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7cqx\" (UniqueName: \"kubernetes.io/projected/a1b367d6-cfdb-44f9-906b-8799d80d073c-kube-api-access-m7cqx\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.343127 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.347336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.360985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data" (OuterVolumeSpecName: "config-data") pod "a1b367d6-cfdb-44f9-906b-8799d80d073c" (UID: "a1b367d6-cfdb-44f9-906b-8799d80d073c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.378902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" event={"ID":"329b3e52-9792-42da-a6f0-9b0fc5fb866c","Type":"ContainerStarted","Data":"192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.378947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" event={"ID":"329b3e52-9792-42da-a6f0-9b0fc5fb866c","Type":"ContainerStarted","Data":"e630e27749f0744751134435a1f2622edee5b04f5294b9423bd95eb91a1b775b"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.380259 4707 generic.go:334] "Generic (PLEG): container finished" podID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerID="6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02" exitCode=0 Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.380302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"072de5b8-01f6-48ea-b4cf-751f3a8a449e","Type":"ContainerDied","Data":"6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.380321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"072de5b8-01f6-48ea-b4cf-751f3a8a449e","Type":"ContainerDied","Data":"436bda71347d55b3eb36e67c7983d73a9afd4257b1ace2cf6c1327852a29b33b"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.380335 4707 scope.go:117] "RemoveContainer" containerID="6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.380441 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.383663 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerID="540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc" exitCode=0 Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.383728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a1b367d6-cfdb-44f9-906b-8799d80d073c","Type":"ContainerDied","Data":"540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.383750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a1b367d6-cfdb-44f9-906b-8799d80d073c","Type":"ContainerDied","Data":"84abeb97bd70141bf1e6224f255e796ba24f3eb2484fb98985e286c0f02ebc46"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.383994 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.395376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerStarted","Data":"f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44"} Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.415352 4707 scope.go:117] "RemoveContainer" containerID="7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.438782 4707 scope.go:117] "RemoveContainer" containerID="6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.439298 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02\": container with ID starting with 6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02 not found: ID does not exist" containerID="6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.439340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02"} err="failed to get container status \"6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02\": rpc error: code = NotFound desc = could not find container \"6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02\": container with ID starting with 6a61f7475db596541700a32686ff3c88631758d775423ae02e5e59488f909d02 not found: ID does not exist" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.439361 4707 scope.go:117] "RemoveContainer" containerID="7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.439667 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268\": container with ID starting with 7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268 not found: ID does not exist" containerID="7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.439702 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268"} err="failed to get container status \"7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268\": rpc error: code = NotFound desc = could not find container \"7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268\": container with ID starting with 7f9a17c462bee2f88ccc16618d451081210110ecd29512db43e8873186a2d268 not found: ID does not exist" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.439725 4707 scope.go:117] "RemoveContainer" containerID="540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.444744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9nv\" (UniqueName: \"kubernetes.io/projected/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kube-api-access-nw9nv\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: 
\"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.444838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-galera-tls-certs\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.444869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kolla-config\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-operator-scripts\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-default\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-combined-ca-bundle\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-generated\") pod \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\" (UID: \"072de5b8-01f6-48ea-b4cf-751f3a8a449e\") " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445975 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.445992 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b367d6-cfdb-44f9-906b-8799d80d073c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.446337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.446500 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.446544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.447345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.452930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.456671 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.463348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kube-api-access-nw9nv" (OuterVolumeSpecName: "kube-api-access-nw9nv") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "kube-api-access-nw9nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.464946 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472285 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.472612 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472623 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.472642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api-log" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472649 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api-log" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.472670 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerName="galera" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472676 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerName="galera" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.472688 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerName="mysql-bootstrap" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472694 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerName="mysql-bootstrap" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api-log" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472897 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" containerName="galera" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.472904 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" containerName="cinder-api" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.473722 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.478482 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.478697 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.478880 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.480398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.481264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.484970 4707 scope.go:117] "RemoveContainer" containerID="20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.508724 4707 scope.go:117] "RemoveContainer" containerID="540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.509213 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc\": container with ID starting with 540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc not found: ID does not exist" containerID="540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.509323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc"} err="failed to get container status \"540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc\": rpc error: code = NotFound desc = could not find container \"540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc\": container with ID starting with 540b10f073926766b2dfdfc2c6d4d761fc34dd39fc599f83e98a7ff36a9b85dc not found: ID does not exist" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.509381 4707 scope.go:117] "RemoveContainer" containerID="20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810" Jan 21 16:03:32 crc kubenswrapper[4707]: E0121 16:03:32.509611 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810\": container with ID starting with 20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810 not found: ID does not exist" containerID="20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.509704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810"} err="failed to get container status 
\"20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810\": rpc error: code = NotFound desc = could not find container \"20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810\": container with ID starting with 20cf12c2a375df4de13c946a07dbbb58657b033baa4c4cd40b92355e0c27e810 not found: ID does not exist" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.519413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "072de5b8-01f6-48ea-b4cf-751f3a8a449e" (UID: "072de5b8-01f6-48ea-b4cf-751f3a8a449e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547752 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547780 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547790 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547799 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547823 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/072de5b8-01f6-48ea-b4cf-751f3a8a449e-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547834 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9nv\" (UniqueName: \"kubernetes.io/projected/072de5b8-01f6-48ea-b4cf-751f3a8a449e-kube-api-access-nw9nv\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547843 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/072de5b8-01f6-48ea-b4cf-751f3a8a449e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.547862 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.568473 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.641858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.641903 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:32 
crc kubenswrapper[4707]: I0121 16:03:32.657612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6x8\" (UniqueName: \"kubernetes.io/projected/a3b559c3-4e82-4897-9b5c-1816989fffcd-kube-api-access-lt6x8\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-scripts\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b559c3-4e82-4897-9b5c-1816989fffcd-logs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3b559c3-4e82-4897-9b5c-1816989fffcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.657935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.658057 4707 reconciler_common.go:293] 
"Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.709301 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.720438 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.726309 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.727626 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.730396 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.730435 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rch7x" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.730716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.730876 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.737456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.755493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.762125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.762898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.762965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.763033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6x8\" (UniqueName: \"kubernetes.io/projected/a3b559c3-4e82-4897-9b5c-1816989fffcd-kube-api-access-lt6x8\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.763154 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-scripts\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.763274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b559c3-4e82-4897-9b5c-1816989fffcd-logs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.763529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b559c3-4e82-4897-9b5c-1816989fffcd-logs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.764094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.764124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3b559c3-4e82-4897-9b5c-1816989fffcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.764153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.764282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3b559c3-4e82-4897-9b5c-1816989fffcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.767198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-scripts\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.767532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.767753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: 
I0121 16:03:32.768394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.769946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.773033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.782004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6x8\" (UniqueName: \"kubernetes.io/projected/a3b559c3-4e82-4897-9b5c-1816989fffcd-kube-api-access-lt6x8\") pod \"cinder-api-0\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.801207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9pc\" (UniqueName: \"kubernetes.io/projected/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kube-api-access-pv9pc\") pod \"openstack-cell1-galera-0\" 
(UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866819 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.866897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.968933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9pc\" (UniqueName: \"kubernetes.io/projected/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kube-api-access-pv9pc\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969405 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.969838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.970299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.970509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.970996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.973745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.973783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.983027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9pc\" (UniqueName: \"kubernetes.io/projected/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kube-api-access-pv9pc\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:32 crc kubenswrapper[4707]: I0121 16:03:32.994075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.053914 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.192708 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="072de5b8-01f6-48ea-b4cf-751f3a8a449e" path="/var/lib/kubelet/pods/072de5b8-01f6-48ea-b4cf-751f3a8a449e/volumes" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.194767 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b367d6-cfdb-44f9-906b-8799d80d073c" path="/var/lib/kubelet/pods/a1b367d6-cfdb-44f9-906b-8799d80d073c/volumes" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.208001 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.273903 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.273954 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.273959 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.773945816 +0000 UTC m=+3710.955462038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-internal-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.274268 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:33.774250098 +0000 UTC m=+3710.955766321 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-public-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.405534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" event={"ID":"329b3e52-9792-42da-a6f0-9b0fc5fb866c","Type":"ContainerStarted","Data":"8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67"} Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.406685 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.406718 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.413534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a3b559c3-4e82-4897-9b5c-1816989fffcd","Type":"ContainerStarted","Data":"a7d24ac7fe729e19034129c26583a2e49624988d1fa4236b71701e7e7314a6cb"} Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.424481 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" podStartSLOduration=2.42446952 podStartE2EDuration="2.42446952s" podCreationTimestamp="2026-01-21 16:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:33.420690624 +0000 UTC m=+3710.602206846" watchObservedRunningTime="2026-01-21 16:03:33.42446952 +0000 UTC m=+3710.605985732" Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.487183 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:03:33 crc kubenswrapper[4707]: I0121 16:03:33.498637 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.782680 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.783004 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:34.782988683 +0000 UTC m=+3711.964504905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-public-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.783295 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.783408 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. 
No retries permitted until 2026-01-21 16:03:34.783392673 +0000 UTC m=+3711.964908894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-internal-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.884359 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.884412 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.884422 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:37.884409527 +0000 UTC m=+3715.065925749 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.884445 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:37.884436919 +0000 UTC m=+3715.065953141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovn-metrics" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.988863 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.989029 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.989737 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:49.98971736 +0000 UTC m=+3727.171233583 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovn-metrics" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.989768 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs podName:7c469e86-c709-4bd2-986e-fa08e32052b0 nodeName:}" failed. No retries permitted until 2026-01-21 16:03:49.989761213 +0000 UTC m=+3727.171277436 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.989948 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:03:33 crc kubenswrapper[4707]: E0121 16:03:33.989994 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs podName:05aad5e9-0e3a-47b4-a11f-feca99ab8dac nodeName:}" failed. No retries permitted until 2026-01-21 16:03:49.989987128 +0000 UTC m=+3727.171503350 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs") pod "memcached-0" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac") : secret "cert-memcached-svc" not found Jan 21 16:03:34 crc kubenswrapper[4707]: I0121 16:03:34.424359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf","Type":"ContainerStarted","Data":"3cbe82abedb90d7b7ea8a4dfc7b5a312e0052fff64a0a3f6bbe54174658450ee"} Jan 21 16:03:34 crc kubenswrapper[4707]: I0121 16:03:34.424403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf","Type":"ContainerStarted","Data":"f091c40f21036d1801be20ea6040f2764f81d0f16bd856b12466f03377cf804d"} Jan 21 16:03:34 crc kubenswrapper[4707]: I0121 16:03:34.426577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a3b559c3-4e82-4897-9b5c-1816989fffcd","Type":"ContainerStarted","Data":"030c52d40b8a7c37e53cff77e05fa947476307c66e7a847d083b6b3c32443d7f"} Jan 21 16:03:34 crc kubenswrapper[4707]: I0121 16:03:34.426614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a3b559c3-4e82-4897-9b5c-1816989fffcd","Type":"ContainerStarted","Data":"5e7ec7f63a2fc9f2128d4f3cf1c08e92faaab92b0eaa704e50baa0f7c8e983bc"} Jan 21 16:03:34 crc kubenswrapper[4707]: I0121 16:03:34.457787 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.457775373 podStartE2EDuration="2.457775373s" podCreationTimestamp="2026-01-21 16:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:34.451329913 +0000 UTC m=+3711.632846145" watchObservedRunningTime="2026-01-21 16:03:34.457775373 +0000 UTC m=+3711.639291595" Jan 21 16:03:34 crc kubenswrapper[4707]: E0121 16:03:34.808516 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:03:34 crc kubenswrapper[4707]: E0121 16:03:34.808598 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:36.808565523 +0000 UTC m=+3713.990081745 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-internal-svc" not found Jan 21 16:03:34 crc kubenswrapper[4707]: E0121 16:03:34.808614 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:03:34 crc kubenswrapper[4707]: E0121 16:03:34.808688 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:36.80867528 +0000 UTC m=+3713.990191502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-public-svc" not found Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.036492 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.036675 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="05aad5e9-0e3a-47b4-a11f-feca99ab8dac" containerName="memcached" containerID="cri-o://0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd" gracePeriod=30 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.065216 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.065436 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-log" containerID="cri-o://40c8b7431aab97514f8bab7190356c7c5b4555133652fcb6de00896dcf5239f1" gracePeriod=30 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.065538 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-httpd" containerID="cri-o://bcb487a9ff3183117b3e23897c23dfd1d280f8b5d1e1acafd6d3a73af5a188bb" gracePeriod=30 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.117475 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.117675 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-log" containerID="cri-o://bb6bca71e3d12e1edf4e10a2e9b41ae39d95b4a7860f5c4bfd9ccf3346a9dda6" gracePeriod=30 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.118049 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-httpd" containerID="cri-o://4389a6536c38ee5182d4c5496c074f0119e308ffa099463d2d0def3ee26597b4" gracePeriod=30 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.250356 
4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-9fc8c7fb9-vltft"] Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.251441 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.258484 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-9fc8c7fb9-vltft"] Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-credential-keys\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-public-tls-certs\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-combined-ca-bundle\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-config-data\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-fernet-keys\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-scripts\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6w8\" (UniqueName: \"kubernetes.io/projected/113f5252-dcd0-4719-8bd9-11800c33a828-kube-api-access-pd6w8\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.419346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-internal-tls-certs\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.434466 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bace925-e0b3-4554-94fe-fb8173144aef" containerID="40c8b7431aab97514f8bab7190356c7c5b4555133652fcb6de00896dcf5239f1" exitCode=143 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.434520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3bace925-e0b3-4554-94fe-fb8173144aef","Type":"ContainerDied","Data":"40c8b7431aab97514f8bab7190356c7c5b4555133652fcb6de00896dcf5239f1"} Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.435775 4707 generic.go:334] "Generic (PLEG): container finished" podID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerID="bb6bca71e3d12e1edf4e10a2e9b41ae39d95b4a7860f5c4bfd9ccf3346a9dda6" exitCode=143 Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.436433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ced77cf4-40f3-409a-a776-c47dabd7b84f","Type":"ContainerDied","Data":"bb6bca71e3d12e1edf4e10a2e9b41ae39d95b4a7860f5c4bfd9ccf3346a9dda6"} Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.436578 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-credential-keys\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-public-tls-certs\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-combined-ca-bundle\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-config-data\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-fernet-keys\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: 
I0121 16:03:35.521412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-scripts\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6w8\" (UniqueName: \"kubernetes.io/projected/113f5252-dcd0-4719-8bd9-11800c33a828-kube-api-access-pd6w8\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.521465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-internal-tls-certs\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.527013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-config-data\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.527270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-combined-ca-bundle\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.527434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-scripts\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.528203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-internal-tls-certs\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.528888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-fernet-keys\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.529155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-public-tls-certs\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.530165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-credential-keys\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.540146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6w8\" (UniqueName: \"kubernetes.io/projected/113f5252-dcd0-4719-8bd9-11800c33a828-kube-api-access-pd6w8\") pod \"keystone-9fc8c7fb9-vltft\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.574927 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.647878 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:03:35 crc kubenswrapper[4707]: I0121 16:03:35.951540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-9fc8c7fb9-vltft"] Jan 21 16:03:35 crc kubenswrapper[4707]: W0121 16:03:35.964840 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod113f5252_dcd0_4719_8bd9_11800c33a828.slice/crio-c6ad8c6493470ecd8264485c27ce2557f540b78ba288b48c5dcd5c898f8df4d5 WatchSource:0}: Error finding container c6ad8c6493470ecd8264485c27ce2557f540b78ba288b48c5dcd5c898f8df4d5: Status 404 returned error can't find the container with id c6ad8c6493470ecd8264485c27ce2557f540b78ba288b48c5dcd5c898f8df4d5 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.041594 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.132592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-config-data\") pod \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.132689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs\") pod \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.132825 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-combined-ca-bundle\") pod \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.132904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kolla-config\") pod \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.132938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fng8\" (UniqueName: \"kubernetes.io/projected/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kube-api-access-4fng8\") pod \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\" (UID: \"05aad5e9-0e3a-47b4-a11f-feca99ab8dac\") " Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.133450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "05aad5e9-0e3a-47b4-a11f-feca99ab8dac" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.133540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-config-data" (OuterVolumeSpecName: "config-data") pod "05aad5e9-0e3a-47b4-a11f-feca99ab8dac" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.133820 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.133839 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.450136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" event={"ID":"113f5252-dcd0-4719-8bd9-11800c33a828","Type":"ContainerStarted","Data":"c6ad8c6493470ecd8264485c27ce2557f540b78ba288b48c5dcd5c898f8df4d5"} Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.451770 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerID="3cbe82abedb90d7b7ea8a4dfc7b5a312e0052fff64a0a3f6bbe54174658450ee" exitCode=0 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.451846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf","Type":"ContainerDied","Data":"3cbe82abedb90d7b7ea8a4dfc7b5a312e0052fff64a0a3f6bbe54174658450ee"} Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.453590 4707 generic.go:334] "Generic (PLEG): container finished" podID="05aad5e9-0e3a-47b4-a11f-feca99ab8dac" containerID="0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd" exitCode=0 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.453677 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.453728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"05aad5e9-0e3a-47b4-a11f-feca99ab8dac","Type":"ContainerDied","Data":"0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd"} Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.453759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"05aad5e9-0e3a-47b4-a11f-feca99ab8dac","Type":"ContainerDied","Data":"8a2d91f4d891393fd3b72798b84a31ba66390e1f2c4b9b0b7e2a4107ee091659"} Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.453775 4707 scope.go:117] "RemoveContainer" containerID="0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.465095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kube-api-access-4fng8" (OuterVolumeSpecName: "kube-api-access-4fng8") pod "05aad5e9-0e3a-47b4-a11f-feca99ab8dac" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac"). InnerVolumeSpecName "kube-api-access-4fng8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.490722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05aad5e9-0e3a-47b4-a11f-feca99ab8dac" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.520954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "05aad5e9-0e3a-47b4-a11f-feca99ab8dac" (UID: "05aad5e9-0e3a-47b4-a11f-feca99ab8dac"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.547448 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fng8\" (UniqueName: \"kubernetes.io/projected/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-kube-api-access-4fng8\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.547478 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.547487 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aad5e9-0e3a-47b4-a11f-feca99ab8dac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.590437 4707 scope.go:117] "RemoveContainer" containerID="0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd" Jan 21 16:03:36 crc kubenswrapper[4707]: E0121 16:03:36.590857 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd\": container with ID starting with 0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd not found: ID does not exist" containerID="0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.590918 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd"} err="failed to get container status \"0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd\": rpc error: code = NotFound desc = could not find container \"0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd\": container with ID starting with 0500cdbd3d53641aecd7cac6ccaf9efcb808f58b794d2a94a4b157137d04cccd not found: ID does not exist" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.693407 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.693663 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-central-agent" containerID="cri-o://5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48" gracePeriod=30 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.693719 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="sg-core" containerID="cri-o://8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06" gracePeriod=30 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.693711 
4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="proxy-httpd" containerID="cri-o://46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687" gracePeriod=30 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.693790 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-notification-agent" containerID="cri-o://016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf" gracePeriod=30 Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.777942 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.784603 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.822025 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:03:36 crc kubenswrapper[4707]: E0121 16:03:36.822909 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05aad5e9-0e3a-47b4-a11f-feca99ab8dac" containerName="memcached" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.822995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="05aad5e9-0e3a-47b4-a11f-feca99ab8dac" containerName="memcached" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.823514 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="05aad5e9-0e3a-47b4-a11f-feca99ab8dac" containerName="memcached" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.824536 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.828522 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.828540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.828722 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-tgjb6" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.845542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:03:36 crc kubenswrapper[4707]: E0121 16:03:36.852402 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:03:36 crc kubenswrapper[4707]: E0121 16:03:36.852456 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:40.852441518 +0000 UTC m=+3718.033957740 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-internal-svc" not found Jan 21 16:03:36 crc kubenswrapper[4707]: E0121 16:03:36.852492 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:03:36 crc kubenswrapper[4707]: E0121 16:03:36.852575 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:40.852561194 +0000 UTC m=+3718.034077416 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-public-svc" not found Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.955917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-kolla-config\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.955954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-config-data\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.956013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.956092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62lw\" (UniqueName: \"kubernetes.io/projected/884872ef-7ca2-4262-9561-440673182eba-kube-api-access-t62lw\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:36 crc kubenswrapper[4707]: I0121 16:03:36.956261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.057727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.058436 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-kolla-config\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.058461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-config-data\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.059027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-kolla-config\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.059524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-config-data\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.059777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.059853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t62lw\" (UniqueName: \"kubernetes.io/projected/884872ef-7ca2-4262-9561-440673182eba-kube-api-access-t62lw\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.062989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.063394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.073212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62lw\" (UniqueName: \"kubernetes.io/projected/884872ef-7ca2-4262-9561-440673182eba-kube-api-access-t62lw\") pod \"memcached-0\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.180726 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.192005 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05aad5e9-0e3a-47b4-a11f-feca99ab8dac" path="/var/lib/kubelet/pods/05aad5e9-0e3a-47b4-a11f-feca99ab8dac/volumes" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.463073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf","Type":"ContainerStarted","Data":"6afcb360e0fbcf5cfa3da43b53ea242515a563c93456f93154197247f98df091"} Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.465456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" event={"ID":"113f5252-dcd0-4719-8bd9-11800c33a828","Type":"ContainerStarted","Data":"741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6"} Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.465591 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.467643 4707 generic.go:334] "Generic (PLEG): container finished" podID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerID="46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687" exitCode=0 Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.467668 4707 generic.go:334] "Generic (PLEG): container finished" podID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerID="8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06" exitCode=2 Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.467677 4707 generic.go:334] "Generic (PLEG): container finished" podID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerID="5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48" exitCode=0 Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.467699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerDied","Data":"46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687"} Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.467720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerDied","Data":"8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06"} Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.467729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerDied","Data":"5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48"} Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.484546 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=5.484531454 podStartE2EDuration="5.484531454s" podCreationTimestamp="2026-01-21 16:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:37.479456422 +0000 UTC m=+3714.660972644" watchObservedRunningTime="2026-01-21 16:03:37.484531454 +0000 UTC m=+3714.666047676" Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.494561 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" podStartSLOduration=2.49454831 podStartE2EDuration="2.49454831s" podCreationTimestamp="2026-01-21 16:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:37.491628719 +0000 UTC m=+3714.673144941" watchObservedRunningTime="2026-01-21 16:03:37.49454831 +0000 UTC m=+3714.676064532" Jan 21 16:03:37 crc kubenswrapper[4707]: W0121 16:03:37.558145 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod884872ef_7ca2_4262_9561_440673182eba.slice/crio-e2d7afa51a1e8f60c8a002a37bef43ce5e078fb74d003536c303462bcb4bba91 WatchSource:0}: Error finding container e2d7afa51a1e8f60c8a002a37bef43ce5e078fb74d003536c303462bcb4bba91: Status 404 returned error can't find the container with id e2d7afa51a1e8f60c8a002a37bef43ce5e078fb74d003536c303462bcb4bba91 Jan 21 16:03:37 crc kubenswrapper[4707]: I0121 16:03:37.558242 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:03:37 crc kubenswrapper[4707]: E0121 16:03:37.976718 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:03:37 crc kubenswrapper[4707]: E0121 16:03:37.976719 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:37 crc kubenswrapper[4707]: E0121 16:03:37.976782 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:45.976768741 +0000 UTC m=+3723.158284963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovn-metrics" not found Jan 21 16:03:37 crc kubenswrapper[4707]: E0121 16:03:37.976799 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs podName:6363e430-8c44-4cbc-8923-3ed9147d0cde nodeName:}" failed. No retries permitted until 2026-01-21 16:03:45.976790332 +0000 UTC m=+3723.158306554 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.477036 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bace925-e0b3-4554-94fe-fb8173144aef" containerID="bcb487a9ff3183117b3e23897c23dfd1d280f8b5d1e1acafd6d3a73af5a188bb" exitCode=0 Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.477172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3bace925-e0b3-4554-94fe-fb8173144aef","Type":"ContainerDied","Data":"bcb487a9ff3183117b3e23897c23dfd1d280f8b5d1e1acafd6d3a73af5a188bb"} Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.478977 4707 generic.go:334] "Generic (PLEG): container finished" podID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerID="4389a6536c38ee5182d4c5496c074f0119e308ffa099463d2d0def3ee26597b4" exitCode=0 Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.479018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ced77cf4-40f3-409a-a776-c47dabd7b84f","Type":"ContainerDied","Data":"4389a6536c38ee5182d4c5496c074f0119e308ffa099463d2d0def3ee26597b4"} Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.480748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"884872ef-7ca2-4262-9561-440673182eba","Type":"ContainerStarted","Data":"bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497"} Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.480786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"884872ef-7ca2-4262-9561-440673182eba","Type":"ContainerStarted","Data":"e2d7afa51a1e8f60c8a002a37bef43ce5e078fb74d003536c303462bcb4bba91"} Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.570836 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.599368 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.599353073 podStartE2EDuration="2.599353073s" podCreationTimestamp="2026-01-21 16:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:38.503291437 +0000 UTC m=+3715.684807659" watchObservedRunningTime="2026-01-21 16:03:38.599353073 +0000 UTC m=+3715.780869296" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-public-tls-certs\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-scripts\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-config-data\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-logs\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-httpd-run\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613883 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-combined-ca-bundle\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.613906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtkxf\" (UniqueName: \"kubernetes.io/projected/3bace925-e0b3-4554-94fe-fb8173144aef-kube-api-access-xtkxf\") pod \"3bace925-e0b3-4554-94fe-fb8173144aef\" (UID: \"3bace925-e0b3-4554-94fe-fb8173144aef\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.614405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-logs" (OuterVolumeSpecName: "logs") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.614422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.618801 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.619108 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bace925-e0b3-4554-94fe-fb8173144aef-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.618973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.620506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-scripts" (OuterVolumeSpecName: "scripts") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.621928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bace925-e0b3-4554-94fe-fb8173144aef-kube-api-access-xtkxf" (OuterVolumeSpecName: "kube-api-access-xtkxf") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "kube-api-access-xtkxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: E0121 16:03:38.624464 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:37264->192.168.25.165:36655: write tcp 192.168.25.165:37264->192.168.25.165:36655: write: broken pipe Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.632158 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.648088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.658886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-config-data" (OuterVolumeSpecName: "config-data") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.669712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3bace925-e0b3-4554-94fe-fb8173144aef" (UID: "3bace925-e0b3-4554-94fe-fb8173144aef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-internal-tls-certs\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-logs\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-httpd-run\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-scripts\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcngf\" (UniqueName: \"kubernetes.io/projected/ced77cf4-40f3-409a-a776-c47dabd7b84f-kube-api-access-wcngf\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-combined-ca-bundle\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-config-data\") pod \"ced77cf4-40f3-409a-a776-c47dabd7b84f\" (UID: \"ced77cf4-40f3-409a-a776-c47dabd7b84f\") " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.720792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721139 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721167 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721178 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721188 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtkxf\" (UniqueName: \"kubernetes.io/projected/3bace925-e0b3-4554-94fe-fb8173144aef-kube-api-access-xtkxf\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721197 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721207 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721215 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bace925-e0b3-4554-94fe-fb8173144aef-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.721300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-logs" (OuterVolumeSpecName: "logs") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.723013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-scripts" (OuterVolumeSpecName: "scripts") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.723621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.723992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced77cf4-40f3-409a-a776-c47dabd7b84f-kube-api-access-wcngf" (OuterVolumeSpecName: "kube-api-access-wcngf") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "kube-api-access-wcngf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.736641 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.743476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.759315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.766377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-config-data" (OuterVolumeSpecName: "config-data") pod "ced77cf4-40f3-409a-a776-c47dabd7b84f" (UID: "ced77cf4-40f3-409a-a776-c47dabd7b84f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822621 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822645 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822655 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcngf\" (UniqueName: \"kubernetes.io/projected/ced77cf4-40f3-409a-a776-c47dabd7b84f-kube-api-access-wcngf\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822666 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822675 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822683 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822691 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ced77cf4-40f3-409a-a776-c47dabd7b84f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.822700 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced77cf4-40f3-409a-a776-c47dabd7b84f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.839023 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:03:38 crc kubenswrapper[4707]: I0121 16:03:38.923935 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.448875 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.490003 4707 generic.go:334] "Generic (PLEG): container finished" podID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerID="016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf" exitCode=0 Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.490099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerDied","Data":"016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf"} Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.490146 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.490161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a423cf5c-f81b-4df1-bf8f-51a11024e065","Type":"ContainerDied","Data":"39f7ba9a9da4e8ab5a4404f2a82abadc8b62dffab7357d91044be1cec47be9cd"} Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.490183 4707 scope.go:117] "RemoveContainer" containerID="46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.493421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"3bace925-e0b3-4554-94fe-fb8173144aef","Type":"ContainerDied","Data":"e76a4362c7e4160720ed6485962cc5f4b7128cba5da3931760c11d3a3ad2b018"} Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.493508 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.495776 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.495880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ced77cf4-40f3-409a-a776-c47dabd7b84f","Type":"ContainerDied","Data":"836f0c3146efe58a46bc1565139873dc0940b7862c90aebd55eee7b24457df35"} Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.495962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.518977 4707 scope.go:117] "RemoveContainer" containerID="8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.542383 4707 scope.go:117] "RemoveContainer" containerID="016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.547089 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.557332 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568259 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-notification-agent" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568612 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-notification-agent" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568623 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="sg-core" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568628 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="sg-core" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568637 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-log" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568642 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-log" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568651 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568657 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-log" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-log" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568681 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568686 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-central-agent" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568708 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-central-agent" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.568730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="proxy-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568735 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="proxy-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568928 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568938 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" containerName="glance-log" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568954 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="sg-core" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568964 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-notification-agent" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568975 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="ceilometer-central-agent" Jan 21 16:03:39 crc kubenswrapper[4707]: 
I0121 16:03:39.568985 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" containerName="glance-log" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.568994 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" containerName="proxy-httpd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.569184 4707 scope.go:117] "RemoveContainer" containerID="5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.570768 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.573076 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dkchd" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.573273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.573334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.573391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.578341 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.584576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.593852 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.597955 4707 scope.go:117] "RemoveContainer" containerID="46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.598894 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687\": container with ID starting with 46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687 not found: ID does not exist" containerID="46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.598932 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687"} err="failed to get container status \"46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687\": rpc error: code = NotFound desc = could not find container \"46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687\": container with ID starting with 46f1e2cd645fa817da0e3565f0b45fdd74919cbe7d4568a7638f11ceaf02e687 not found: ID does not exist" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.598958 4707 scope.go:117] "RemoveContainer" containerID="8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.599592 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06\": container with ID starting with 8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06 not found: ID does not exist" containerID="8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.599618 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06"} err="failed to get container status \"8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06\": rpc error: code = NotFound desc = could not find container \"8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06\": container with ID starting with 8c54ab0c20637d203489e1b87a05b11acfff0c75dc1b45785359cedf83eb9e06 not found: ID does not exist" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.599634 4707 scope.go:117] "RemoveContainer" containerID="016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.600225 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.600352 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf\": container with ID starting with 016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf not found: ID does not exist" containerID="016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.600374 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf"} err="failed to get container status \"016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf\": rpc error: code = NotFound desc = could not find container \"016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf\": container with ID starting with 016049c7bb02d4942fcc7d1bcbde72d3131265051756874a98ba0d8f9b0877bf not found: ID does not exist" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.600387 4707 scope.go:117] "RemoveContainer" containerID="5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48" Jan 21 16:03:39 crc kubenswrapper[4707]: E0121 16:03:39.601059 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48\": container with ID starting with 5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48 not found: ID does not exist" containerID="5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.601077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48"} err="failed to get container status \"5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48\": rpc error: code = NotFound desc = could not find container \"5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48\": container with ID starting with 5c4bf6b64082b405abf5e67db1cac446af3b7f4781c5142a7e8df3312b4a0b48 not found: 
ID does not exist" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.601088 4707 scope.go:117] "RemoveContainer" containerID="bcb487a9ff3183117b3e23897c23dfd1d280f8b5d1e1acafd6d3a73af5a188bb" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.606755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.606939 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.612128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.612287 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.630690 4707 scope.go:117] "RemoveContainer" containerID="40c8b7431aab97514f8bab7190356c7c5b4555133652fcb6de00896dcf5239f1" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.634838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-scripts\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.634876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-log-httpd\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.634905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-sg-core-conf-yaml\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.634935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-combined-ca-bundle\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.634977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-config-data\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-run-httpd\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-ceilometer-tls-certs\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: 
\"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkws6\" (UniqueName: \"kubernetes.io/projected/a423cf5c-f81b-4df1-bf8f-51a11024e065-kube-api-access-bkws6\") pod \"a423cf5c-f81b-4df1-bf8f-51a11024e065\" (UID: \"a423cf5c-f81b-4df1-bf8f-51a11024e065\") " Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635973 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.635993 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a423cf5c-f81b-4df1-bf8f-51a11024e065-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.639082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a423cf5c-f81b-4df1-bf8f-51a11024e065-kube-api-access-bkws6" (OuterVolumeSpecName: "kube-api-access-bkws6") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "kube-api-access-bkws6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.639905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-scripts" (OuterVolumeSpecName: "scripts") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.648495 4707 scope.go:117] "RemoveContainer" containerID="4389a6536c38ee5182d4c5496c074f0119e308ffa099463d2d0def3ee26597b4" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.656281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.670496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.682512 4707 scope.go:117] "RemoveContainer" containerID="bb6bca71e3d12e1edf4e10a2e9b41ae39d95b4a7860f5c4bfd9ccf3346a9dda6" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.691153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.701780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-config-data" (OuterVolumeSpecName: "config-data") pod "a423cf5c-f81b-4df1-bf8f-51a11024e065" (UID: "a423cf5c-f81b-4df1-bf8f-51a11024e065"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5gq4\" (UniqueName: \"kubernetes.io/projected/61223200-c7c1-4ee8-b38a-e6074d65114a-kube-api-access-x5gq4\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzz7s\" (UniqueName: \"kubernetes.io/projected/1c00713b-bb97-4adf-82eb-8ff31b65d14b-kube-api-access-wzz7s\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-logs\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-scripts\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-config-data\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738462 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738473 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738483 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738490 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738499 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a423cf5c-f81b-4df1-bf8f-51a11024e065-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.738508 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkws6\" (UniqueName: \"kubernetes.io/projected/a423cf5c-f81b-4df1-bf8f-51a11024e065-kube-api-access-bkws6\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.819486 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.836663 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-scripts\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-config-data\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5gq4\" (UniqueName: \"kubernetes.io/projected/61223200-c7c1-4ee8-b38a-e6074d65114a-kube-api-access-x5gq4\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841753 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wzz7s\" (UniqueName: \"kubernetes.io/projected/1c00713b-bb97-4adf-82eb-8ff31b65d14b-kube-api-access-wzz7s\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-logs\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.841880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.842178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.842344 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.842491 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-logs\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.842614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.842710 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.844562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.845620 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.848753 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.851251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.851898 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.853760 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.854521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.854941 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.855661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-scripts\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.856588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.857124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-config-data\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.857215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5gq4\" (UniqueName: \"kubernetes.io/projected/61223200-c7c1-4ee8-b38a-e6074d65114a-kube-api-access-x5gq4\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.857473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.858939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzz7s\" (UniqueName: \"kubernetes.io/projected/1c00713b-bb97-4adf-82eb-8ff31b65d14b-kube-api-access-wzz7s\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.859292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.863626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.872548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.880590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.898769 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.929023 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-log-httpd\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-scripts\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9tx\" (UniqueName: \"kubernetes.io/projected/c468129d-ce39-4ba9-970f-4ecc0bb1df07-kube-api-access-jz9tx\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-config-data\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.943975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:39 crc kubenswrapper[4707]: I0121 16:03:39.944129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-run-httpd\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.046731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-log-httpd\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.046993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-scripts\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.047055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.047141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9tx\" (UniqueName: \"kubernetes.io/projected/c468129d-ce39-4ba9-970f-4ecc0bb1df07-kube-api-access-jz9tx\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.047273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.047389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-config-data\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.047468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.047609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-run-httpd\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.048205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-run-httpd\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.048425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-log-httpd\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.052020 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-config-data\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.052476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-scripts\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.052919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.052997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.053784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.063372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz9tx\" (UniqueName: \"kubernetes.io/projected/c468129d-ce39-4ba9-970f-4ecc0bb1df07-kube-api-access-jz9tx\") pod \"ceilometer-0\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.167414 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.297708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.346732 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:03:40 crc kubenswrapper[4707]: W0121 16:03:40.352041 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c00713b_bb97_4adf_82eb_8ff31b65d14b.slice/crio-702e397dcca94c3668336bb52f4bff29018f7892effd5441b94c8d2199f22f0a WatchSource:0}: Error finding container 702e397dcca94c3668336bb52f4bff29018f7892effd5441b94c8d2199f22f0a: Status 404 returned error can't find the container with id 702e397dcca94c3668336bb52f4bff29018f7892effd5441b94c8d2199f22f0a Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.510689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c00713b-bb97-4adf-82eb-8ff31b65d14b","Type":"ContainerStarted","Data":"702e397dcca94c3668336bb52f4bff29018f7892effd5441b94c8d2199f22f0a"} Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.513256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"61223200-c7c1-4ee8-b38a-e6074d65114a","Type":"ContainerStarted","Data":"b248ef8405ca158d4d7a8937f11f247005b0d7b28069f40241fbcaa4ec63ecc2"} Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.556183 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:03:40 crc kubenswrapper[4707]: W0121 16:03:40.563132 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc468129d_ce39_4ba9_970f_4ecc0bb1df07.slice/crio-f0a9124f09597fe32a56baf849b8987f1c2eb61272881fa6720fe5476f77d3ca WatchSource:0}: Error finding container f0a9124f09597fe32a56baf849b8987f1c2eb61272881fa6720fe5476f77d3ca: Status 404 returned error can't find the container with id f0a9124f09597fe32a56baf849b8987f1c2eb61272881fa6720fe5476f77d3ca Jan 21 16:03:40 crc kubenswrapper[4707]: I0121 16:03:40.667123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:03:40 crc kubenswrapper[4707]: E0121 16:03:40.865239 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:03:40 crc kubenswrapper[4707]: E0121 16:03:40.865316 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:48.865300306 +0000 UTC m=+3726.046816528 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-public-svc" not found Jan 21 16:03:40 crc kubenswrapper[4707]: E0121 16:03:40.865389 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:03:40 crc kubenswrapper[4707]: E0121 16:03:40.865547 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs podName:5b206e03-5e3a-4613-8d4a-6108443785cb nodeName:}" failed. No retries permitted until 2026-01-21 16:03:48.865510572 +0000 UTC m=+3726.047026793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs") pod "placement-685994d4cb-zr62v" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb") : secret "cert-placement-internal-svc" not found Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.223025 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bace925-e0b3-4554-94fe-fb8173144aef" path="/var/lib/kubelet/pods/3bace925-e0b3-4554-94fe-fb8173144aef/volumes" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.224854 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a423cf5c-f81b-4df1-bf8f-51a11024e065" path="/var/lib/kubelet/pods/a423cf5c-f81b-4df1-bf8f-51a11024e065/volumes" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.227326 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced77cf4-40f3-409a-a776-c47dabd7b84f" path="/var/lib/kubelet/pods/ced77cf4-40f3-409a-a776-c47dabd7b84f/volumes" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.427953 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.428489 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="openstack-network-exporter" containerID="cri-o://91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322" gracePeriod=300 Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.482585 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="ovsdbserver-nb" containerID="cri-o://16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87" gracePeriod=300 Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.528977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c00713b-bb97-4adf-82eb-8ff31b65d14b","Type":"ContainerStarted","Data":"e161f3943e924491a50c0f1e70e67b3a00296b6236a4e22683928578bbfa1875"} Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.529022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c00713b-bb97-4adf-82eb-8ff31b65d14b","Type":"ContainerStarted","Data":"798d6d03c5f9d0f6851d346fe3cbe26f854510991f5d3c2cad7db678016a87fb"} Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.531511 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"61223200-c7c1-4ee8-b38a-e6074d65114a","Type":"ContainerStarted","Data":"01def0d7c0c313086bab3b5dc9cadb48b9024281321ff3f9e84b864c2e80cc95"} Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.531789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"61223200-c7c1-4ee8-b38a-e6074d65114a","Type":"ContainerStarted","Data":"9201040871e23d6bc2a379fa2022d38ff6b9347b086e57318e146ace3cc99e12"} Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.533355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerStarted","Data":"ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047"} Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.533397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerStarted","Data":"f0a9124f09597fe32a56baf849b8987f1c2eb61272881fa6720fe5476f77d3ca"} Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.555047 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.55502816 podStartE2EDuration="2.55502816s" podCreationTimestamp="2026-01-21 16:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:41.542359709 +0000 UTC m=+3718.723875930" watchObservedRunningTime="2026-01-21 16:03:41.55502816 +0000 UTC m=+3718.736544382" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.568879 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.568862704 podStartE2EDuration="2.568862704s" podCreationTimestamp="2026-01-21 16:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:41.562254719 +0000 UTC m=+3718.743770940" watchObservedRunningTime="2026-01-21 16:03:41.568862704 +0000 UTC m=+3718.750378925" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.880185 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_7c469e86-c709-4bd2-986e-fa08e32052b0/ovsdbserver-nb/0.log" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.880461 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdb-rundir\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-combined-ca-bundle\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-scripts\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qjqv\" (UniqueName: \"kubernetes.io/projected/7c469e86-c709-4bd2-986e-fa08e32052b0-kube-api-access-5qjqv\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.996831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-config\") pod \"7c469e86-c709-4bd2-986e-fa08e32052b0\" (UID: \"7c469e86-c709-4bd2-986e-fa08e32052b0\") " Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.997890 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.998569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-config" (OuterVolumeSpecName: "config") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:41 crc kubenswrapper[4707]: I0121 16:03:41.998597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-scripts" (OuterVolumeSpecName: "scripts") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.001229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.005939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c469e86-c709-4bd2-986e-fa08e32052b0-kube-api-access-5qjqv" (OuterVolumeSpecName: "kube-api-access-5qjqv") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "kube-api-access-5qjqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.040170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.089744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.092522 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "7c469e86-c709-4bd2-986e-fa08e32052b0" (UID: "7c469e86-c709-4bd2-986e-fa08e32052b0"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100593 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100629 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100660 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100674 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100685 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qjqv\" (UniqueName: \"kubernetes.io/projected/7c469e86-c709-4bd2-986e-fa08e32052b0-kube-api-access-5qjqv\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100695 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c469e86-c709-4bd2-986e-fa08e32052b0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.100703 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c469e86-c709-4bd2-986e-fa08e32052b0-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.118561 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 
16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.183235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.202034 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.286407 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.286757 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="eadd0643-8179-4e51-a4e4-f48691d24503" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5" gracePeriod=30 Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.399772 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4"] Jan 21 16:03:42 crc kubenswrapper[4707]: E0121 16:03:42.400287 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="ovsdbserver-nb" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.400307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="ovsdbserver-nb" Jan 21 16:03:42 crc kubenswrapper[4707]: E0121 16:03:42.400332 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="openstack-network-exporter" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.400340 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="openstack-network-exporter" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.400514 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="openstack-network-exporter" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.400536 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerName="ovsdbserver-nb" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.401562 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.417887 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4"] Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-internal-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-public-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-httpd-config\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-ovndb-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-combined-ca-bundle\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-config\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.508392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55lmf\" (UniqueName: \"kubernetes.io/projected/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-kube-api-access-55lmf\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543586 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_7c469e86-c709-4bd2-986e-fa08e32052b0/ovsdbserver-nb/0.log" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543632 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c469e86-c709-4bd2-986e-fa08e32052b0" 
containerID="91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322" exitCode=2 Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543648 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c469e86-c709-4bd2-986e-fa08e32052b0" containerID="16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87" exitCode=143 Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"7c469e86-c709-4bd2-986e-fa08e32052b0","Type":"ContainerDied","Data":"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322"} Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"7c469e86-c709-4bd2-986e-fa08e32052b0","Type":"ContainerDied","Data":"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87"} Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"7c469e86-c709-4bd2-986e-fa08e32052b0","Type":"ContainerDied","Data":"f439f77b16ce08da3199352dc751911f63fb6b7b1ef1c0169c31e7ad38531a9a"} Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.543748 4707 scope.go:117] "RemoveContainer" containerID="91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.544796 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.545623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerStarted","Data":"3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546"} Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-combined-ca-bundle\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-config\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55lmf\" (UniqueName: \"kubernetes.io/projected/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-kube-api-access-55lmf\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-internal-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 
16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-public-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-httpd-config\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.609675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-ovndb-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.614696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-combined-ca-bundle\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.615063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-config\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.615792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-public-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.618722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-internal-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.620069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-ovndb-tls-certs\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.626398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-httpd-config\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.634468 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-55lmf\" (UniqueName: \"kubernetes.io/projected/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-kube-api-access-55lmf\") pod \"neutron-5c95f7cb7d-zhgp4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.725668 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.744744 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.774893 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.776332 4707 scope.go:117] "RemoveContainer" containerID="16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.789545 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.791593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.796196 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.797682 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-clk78" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.797724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.802427 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.812480 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.841937 4707 scope.go:117] "RemoveContainer" containerID="91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322" Jan 21 16:03:42 crc kubenswrapper[4707]: E0121 16:03:42.843523 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322\": container with ID starting with 91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322 not found: ID does not exist" containerID="91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.843547 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322"} err="failed to get container status \"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322\": rpc error: code = NotFound desc = could not find container \"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322\": container with ID starting with 91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322 not found: ID does not exist" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 
16:03:42.843568 4707 scope.go:117] "RemoveContainer" containerID="16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87" Jan 21 16:03:42 crc kubenswrapper[4707]: E0121 16:03:42.843732 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87\": container with ID starting with 16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87 not found: ID does not exist" containerID="16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.843747 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87"} err="failed to get container status \"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87\": rpc error: code = NotFound desc = could not find container \"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87\": container with ID starting with 16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87 not found: ID does not exist" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.843758 4707 scope.go:117] "RemoveContainer" containerID="91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.843926 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322"} err="failed to get container status \"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322\": rpc error: code = NotFound desc = could not find container \"91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322\": container with ID starting with 91a206ae4ae5314a7f749a6616cc71f4d951ac1213e85baf849c01df918bc322 not found: ID does not exist" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.843943 4707 scope.go:117] "RemoveContainer" containerID="16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.844093 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87"} err="failed to get container status \"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87\": rpc error: code = NotFound desc = could not find container \"16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87\": container with ID starting with 16249c0bb4f461e9790a4b405840574eec834170d4b3a2a6f07c737ef8464e87 not found: ID does not exist" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.914926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.914963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfks7\" (UniqueName: \"kubernetes.io/projected/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-kube-api-access-rfks7\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 
crc kubenswrapper[4707]: I0121 16:03:42.915017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.915165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.915287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.915340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-config\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.915383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.915427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:42 crc kubenswrapper[4707]: I0121 16:03:42.975467 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.019027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.019084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfks7\" (UniqueName: \"kubernetes.io/projected/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-kube-api-access-rfks7\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.019226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.019294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.019370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.019918 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.020026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-config\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.020081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.020141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.020988 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.021539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-config\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.023678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.026600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.034424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.043569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.045791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfks7\" (UniqueName: \"kubernetes.io/projected/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-kube-api-access-rfks7\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.054667 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.055386 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.076610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.082350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.121235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-vencrypt-tls-certs\") pod \"eadd0643-8179-4e51-a4e4-f48691d24503\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.121386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km5ws\" (UniqueName: \"kubernetes.io/projected/eadd0643-8179-4e51-a4e4-f48691d24503-kube-api-access-km5ws\") pod \"eadd0643-8179-4e51-a4e4-f48691d24503\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.121494 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-config-data\") pod \"eadd0643-8179-4e51-a4e4-f48691d24503\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.121561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-combined-ca-bundle\") pod \"eadd0643-8179-4e51-a4e4-f48691d24503\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.121682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs\") pod \"eadd0643-8179-4e51-a4e4-f48691d24503\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.145392 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eadd0643-8179-4e51-a4e4-f48691d24503-kube-api-access-km5ws" (OuterVolumeSpecName: "kube-api-access-km5ws") pod "eadd0643-8179-4e51-a4e4-f48691d24503" (UID: "eadd0643-8179-4e51-a4e4-f48691d24503"). InnerVolumeSpecName "kube-api-access-km5ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.167053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-config-data" (OuterVolumeSpecName: "config-data") pod "eadd0643-8179-4e51-a4e4-f48691d24503" (UID: "eadd0643-8179-4e51-a4e4-f48691d24503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.175177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.180697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.185220 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eadd0643-8179-4e51-a4e4-f48691d24503" (UID: "eadd0643-8179-4e51-a4e4-f48691d24503"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.190341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-clk78" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.201445 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.204191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "eadd0643-8179-4e51-a4e4-f48691d24503" (UID: "eadd0643-8179-4e51-a4e4-f48691d24503"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.222553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "eadd0643-8179-4e51-a4e4-f48691d24503" (UID: "eadd0643-8179-4e51-a4e4-f48691d24503"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.223282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs\") pod \"eadd0643-8179-4e51-a4e4-f48691d24503\" (UID: \"eadd0643-8179-4e51-a4e4-f48691d24503\") " Jan 21 16:03:43 crc kubenswrapper[4707]: W0121 16:03:43.223429 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/eadd0643-8179-4e51-a4e4-f48691d24503/volumes/kubernetes.io~secret/nova-novncproxy-tls-certs Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.223448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "eadd0643-8179-4e51-a4e4-f48691d24503" (UID: "eadd0643-8179-4e51-a4e4-f48691d24503"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.224321 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.224645 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km5ws\" (UniqueName: \"kubernetes.io/projected/eadd0643-8179-4e51-a4e4-f48691d24503-kube-api-access-km5ws\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.224731 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.225329 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.225423 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eadd0643-8179-4e51-a4e4-f48691d24503-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.235930 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c469e86-c709-4bd2-986e-fa08e32052b0" path="/var/lib/kubelet/pods/7c469e86-c709-4bd2-986e-fa08e32052b0/volumes" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.333191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.390491 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.390762 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api-log" containerID="cri-o://872d133770e4ab129cb4440be2cd434a18f1e4f6a4f18e56e319acdd51d5de90" gracePeriod=30 Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.391236 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api" containerID="cri-o://7dbbb9ef844ab37cf6aecd9f7cd00e150dba12f7539fb8826c773bd114ec81ca" gracePeriod=30 Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.587365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerStarted","Data":"66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d"} Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.590978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerStarted","Data":"0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af"} Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.591015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerStarted","Data":"428b26d9488a96b98c243d43737e0b33d2a462d063718223477db5688e275966"} Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.593040 4707 generic.go:334] "Generic (PLEG): container finished" podID="eadd0643-8179-4e51-a4e4-f48691d24503" containerID="0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5" exitCode=0 Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.593090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.593115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eadd0643-8179-4e51-a4e4-f48691d24503","Type":"ContainerDied","Data":"0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5"} Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.593163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eadd0643-8179-4e51-a4e4-f48691d24503","Type":"ContainerDied","Data":"b2b7f0838840c36eee39ca1069531f931100e250fd0c72bf867400d2230ca63b"} Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.593187 4707 scope.go:117] "RemoveContainer" containerID="0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.597853 4707 generic.go:334] "Generic (PLEG): container finished" podID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerID="872d133770e4ab129cb4440be2cd434a18f1e4f6a4f18e56e319acdd51d5de90" exitCode=143 Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.598007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" event={"ID":"373e7e86-7df4-4678-8a16-cd6accf342bb","Type":"ContainerDied","Data":"872d133770e4ab129cb4440be2cd434a18f1e4f6a4f18e56e319acdd51d5de90"} Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.617747 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.630793 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.637797 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:03:43 crc kubenswrapper[4707]: E0121 16:03:43.638187 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eadd0643-8179-4e51-a4e4-f48691d24503" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.638203 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eadd0643-8179-4e51-a4e4-f48691d24503" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.638380 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eadd0643-8179-4e51-a4e4-f48691d24503" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.638952 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.642254 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.642425 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.642560 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.650187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.659633 4707 scope.go:117] "RemoveContainer" containerID="0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5" Jan 21 16:03:43 crc kubenswrapper[4707]: E0121 16:03:43.660154 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5\": container with ID starting with 0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5 not found: ID does not exist" containerID="0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.660182 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5"} err="failed to get container status \"0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5\": rpc error: code = NotFound desc = could not find container \"0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5\": container with ID starting with 0fd4eb1cde31e10e17e25e5ca79e9e3ed83a17f3a02ae34adbf8161bc4b791a5 not found: ID does not exist" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.688492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.723861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.734994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.735205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.735374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.735460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.735544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtch\" (UniqueName: \"kubernetes.io/projected/273f261e-d7a5-4025-91a6-f9e102be79de-kube-api-access-9wtch\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.837635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.838012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.838052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wtch\" (UniqueName: \"kubernetes.io/projected/273f261e-d7a5-4025-91a6-f9e102be79de-kube-api-access-9wtch\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.838127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.838214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.847627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.850041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.851006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.851785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.864340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wtch\" (UniqueName: \"kubernetes.io/projected/273f261e-d7a5-4025-91a6-f9e102be79de-kube-api-access-9wtch\") pod \"nova-cell1-novncproxy-0\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.934376 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.934614 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-log" containerID="cri-o://bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445" gracePeriod=30 Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.934884 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-api" containerID="cri-o://f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0" gracePeriod=30 Jan 21 16:03:43 crc kubenswrapper[4707]: I0121 16:03:43.981111 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.500839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.611549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"26d9bdaa-7c4d-41c6-be5e-ea602171d57d","Type":"ContainerStarted","Data":"4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.611596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"26d9bdaa-7c4d-41c6-be5e-ea602171d57d","Type":"ContainerStarted","Data":"00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.611607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"26d9bdaa-7c4d-41c6-be5e-ea602171d57d","Type":"ContainerStarted","Data":"5b2f04f405dbdff1df3dce87be51d32f13a60dedb9f71c7cc362ddb25784da0f"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.618599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerStarted","Data":"f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.619318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.621177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerStarted","Data":"319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.621604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.623696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"273f261e-d7a5-4025-91a6-f9e102be79de","Type":"ContainerStarted","Data":"1e5626700bc617a5d897cb1ff108340acef8b9cc05358be3eb5df9be6d08ad36"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.626669 4707 generic.go:334] "Generic (PLEG): container finished" podID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerID="bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445" exitCode=143 Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.626743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"61bace1e-5cf4-462a-9d82-0f972aa44b75","Type":"ContainerDied","Data":"bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445"} Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.636721 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.636704731 podStartE2EDuration="2.636704731s" podCreationTimestamp="2026-01-21 16:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:44.634569535 +0000 UTC 
m=+3721.816085758" watchObservedRunningTime="2026-01-21 16:03:44.636704731 +0000 UTC m=+3721.818220952" Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.665576 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" podStartSLOduration=2.66556132 podStartE2EDuration="2.66556132s" podCreationTimestamp="2026-01-21 16:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:44.657007736 +0000 UTC m=+3721.838523958" watchObservedRunningTime="2026-01-21 16:03:44.66556132 +0000 UTC m=+3721.847077542" Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.689049 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.689635 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="openstack-network-exporter" containerID="cri-o://a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f" gracePeriod=300 Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.701192 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9929792530000001 podStartE2EDuration="5.701180194s" podCreationTimestamp="2026-01-21 16:03:39 +0000 UTC" firstStartedPulling="2026-01-21 16:03:40.5652644 +0000 UTC m=+3717.746780622" lastFinishedPulling="2026-01-21 16:03:44.273465341 +0000 UTC m=+3721.454981563" observedRunningTime="2026-01-21 16:03:44.681834699 +0000 UTC m=+3721.863350921" watchObservedRunningTime="2026-01-21 16:03:44.701180194 +0000 UTC m=+3721.882696417" Jan 21 16:03:44 crc kubenswrapper[4707]: I0121 16:03:44.734379 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="ovsdbserver-sb" containerID="cri-o://8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea" gracePeriod=300 Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.196996 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eadd0643-8179-4e51-a4e4-f48691d24503" path="/var/lib/kubelet/pods/eadd0643-8179-4e51-a4e4-f48691d24503/volumes" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.226506 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_cf13695f-19af-4b6e-9b52-c5f795e265d7/ovsdbserver-sb/0.log" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.226576 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.238440 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-combined-ca-bundle\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-config\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-scripts\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdbserver-sb-tls-certs\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdb-rundir\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfbht\" (UniqueName: \"kubernetes.io/projected/cf13695f-19af-4b6e-9b52-c5f795e265d7-kube-api-access-wfbht\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.269514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-metrics-certs-tls-certs\") pod \"cf13695f-19af-4b6e-9b52-c5f795e265d7\" (UID: \"cf13695f-19af-4b6e-9b52-c5f795e265d7\") " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.275713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-config" (OuterVolumeSpecName: "config") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.276182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.276381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-scripts" (OuterVolumeSpecName: "scripts") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.304951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.310827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf13695f-19af-4b6e-9b52-c5f795e265d7-kube-api-access-wfbht" (OuterVolumeSpecName: "kube-api-access-wfbht") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "kube-api-access-wfbht". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.319352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.372260 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.372293 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf13695f-19af-4b6e-9b52-c5f795e265d7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.372314 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.372324 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.372336 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfbht\" (UniqueName: \"kubernetes.io/projected/cf13695f-19af-4b6e-9b52-c5f795e265d7-kube-api-access-wfbht\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.372347 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.391208 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.400045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.402548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "cf13695f-19af-4b6e-9b52-c5f795e265d7" (UID: "cf13695f-19af-4b6e-9b52-c5f795e265d7"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.474402 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.474450 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.474460 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf13695f-19af-4b6e-9b52-c5f795e265d7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.572282 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.572548 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="ovn-northd" containerID="cri-o://3691aec55f628a7ca9a8989d0678c03a481be77fab6135bf1fef66f4aad67ebd" gracePeriod=30 Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.573036 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="openstack-network-exporter" containerID="cri-o://05ec9f6333df97f3167e3d152d12769dc4312df78642cbffed6c3c74e6b24e76" gracePeriod=30 Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.659668 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_cf13695f-19af-4b6e-9b52-c5f795e265d7/ovsdbserver-sb/0.log" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.660722 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerID="a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f" exitCode=2 Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.660756 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerID="8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea" exitCode=143 Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.660993 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.665008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"cf13695f-19af-4b6e-9b52-c5f795e265d7","Type":"ContainerDied","Data":"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f"} Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.665061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"cf13695f-19af-4b6e-9b52-c5f795e265d7","Type":"ContainerDied","Data":"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea"} Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.665079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"cf13695f-19af-4b6e-9b52-c5f795e265d7","Type":"ContainerDied","Data":"f9bb4f8dd4da4abef4adecf2a121d005e1b718ab3814f27bfa784d5b7afa5606"} Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.665108 4707 scope.go:117] "RemoveContainer" containerID="a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.672943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"273f261e-d7a5-4025-91a6-f9e102be79de","Type":"ContainerStarted","Data":"c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f"} Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.713239 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.7132176059999997 podStartE2EDuration="2.713217606s" podCreationTimestamp="2026-01-21 16:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:45.689907786 +0000 UTC m=+3722.871424018" watchObservedRunningTime="2026-01-21 16:03:45.713217606 +0000 UTC m=+3722.894733828" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.738315 4707 scope.go:117] "RemoveContainer" containerID="8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.747834 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.757185 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.769889 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:03:45 crc kubenswrapper[4707]: E0121 16:03:45.770232 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="ovsdbserver-sb" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.770249 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="ovsdbserver-sb" Jan 21 16:03:45 crc kubenswrapper[4707]: E0121 16:03:45.770272 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="openstack-network-exporter" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.770277 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" 
containerName="openstack-network-exporter" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.770425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="ovsdbserver-sb" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.770436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" containerName="openstack-network-exporter" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.771300 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.776578 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.776754 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.776869 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.776957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-xp8nc" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.783725 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.797166 4707 scope.go:117] "RemoveContainer" containerID="a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f" Jan 21 16:03:45 crc kubenswrapper[4707]: E0121 16:03:45.802256 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f\": container with ID starting with a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f not found: ID does not exist" containerID="a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.802291 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f"} err="failed to get container status \"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f\": rpc error: code = NotFound desc = could not find container \"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f\": container with ID starting with a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f not found: ID does not exist" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.802317 4707 scope.go:117] "RemoveContainer" containerID="8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea" Jan 21 16:03:45 crc kubenswrapper[4707]: E0121 16:03:45.806430 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea\": container with ID starting with 8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea not found: ID does not exist" containerID="8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.806456 4707 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea"} err="failed to get container status \"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea\": rpc error: code = NotFound desc = could not find container \"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea\": container with ID starting with 8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea not found: ID does not exist" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.806474 4707 scope.go:117] "RemoveContainer" containerID="a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.806856 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f"} err="failed to get container status \"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f\": rpc error: code = NotFound desc = could not find container \"a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f\": container with ID starting with a4c00abb35a8dd19b9056d52362e3132b8578acdad9a4a917d428a983a51270f not found: ID does not exist" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.806911 4707 scope.go:117] "RemoveContainer" containerID="8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.807169 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea"} err="failed to get container status \"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea\": rpc error: code = NotFound desc = could not find container \"8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea\": container with ID starting with 8d26a8c7768ec7ce27d4b5031ce938b04f8e248c4e4565a581a7b308959d54ea not found: ID does not exist" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-config\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.887776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmr4k\" (UniqueName: \"kubernetes.io/projected/c1e538db-f225-4c4a-a147-f2bcf3218f28-kube-api-access-kmr4k\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc 
kubenswrapper[4707]: I0121 16:03:45.989665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-config\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.989767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmr4k\" (UniqueName: \"kubernetes.io/projected/c1e538db-f225-4c4a-a147-f2bcf3218f28-kube-api-access-kmr4k\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.990267 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.990651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.991080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-config\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.991547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.995570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.995977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:45 crc kubenswrapper[4707]: I0121 16:03:45.997035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.004958 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmr4k\" (UniqueName: \"kubernetes.io/projected/c1e538db-f225-4c4a-a147-f2bcf3218f28-kube-api-access-kmr4k\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.016510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.106219 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.203579 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.564064 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.75:9311/healthcheck\": read tcp 10.217.0.2:34370->10.217.1.75:9311: read: connection reset by peer" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.564179 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.75:9311/healthcheck\": read tcp 10.217.0.2:34382->10.217.1.75:9311: read: connection reset by peer" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.581599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:03:46 crc kubenswrapper[4707]: W0121 16:03:46.640117 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e538db_f225_4c4a_a147_f2bcf3218f28.slice/crio-0e47204e826ec2ccd945d5d14a6c21d07ae997240536f43f126766c760b98c10 WatchSource:0}: Error finding container 0e47204e826ec2ccd945d5d14a6c21d07ae997240536f43f126766c760b98c10: Status 404 returned error can't find the container with id 0e47204e826ec2ccd945d5d14a6c21d07ae997240536f43f126766c760b98c10 Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.686377 4707 generic.go:334] "Generic (PLEG): container finished" podID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerID="7dbbb9ef844ab37cf6aecd9f7cd00e150dba12f7539fb8826c773bd114ec81ca" exitCode=0 Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.686434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" event={"ID":"373e7e86-7df4-4678-8a16-cd6accf342bb","Type":"ContainerDied","Data":"7dbbb9ef844ab37cf6aecd9f7cd00e150dba12f7539fb8826c773bd114ec81ca"} Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.689503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1e538db-f225-4c4a-a147-f2bcf3218f28","Type":"ContainerStarted","Data":"0e47204e826ec2ccd945d5d14a6c21d07ae997240536f43f126766c760b98c10"} Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.693426 4707 generic.go:334] "Generic (PLEG): container finished" podID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerID="05ec9f6333df97f3167e3d152d12769dc4312df78642cbffed6c3c74e6b24e76" exitCode=2 Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.694440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6363e430-8c44-4cbc-8923-3ed9147d0cde","Type":"ContainerDied","Data":"05ec9f6333df97f3167e3d152d12769dc4312df78642cbffed6c3c74e6b24e76"} Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.859542 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/373e7e86-7df4-4678-8a16-cd6accf342bb-logs\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data-custom\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-combined-ca-bundle\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-public-tls-certs\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwvwn\" (UniqueName: \"kubernetes.io/projected/373e7e86-7df4-4678-8a16-cd6accf342bb-kube-api-access-wwvwn\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-internal-tls-certs\") pod \"373e7e86-7df4-4678-8a16-cd6accf342bb\" (UID: \"373e7e86-7df4-4678-8a16-cd6accf342bb\") " Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907668 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/373e7e86-7df4-4678-8a16-cd6accf342bb-logs" (OuterVolumeSpecName: "logs") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.907977 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/373e7e86-7df4-4678-8a16-cd6accf342bb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.911239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373e7e86-7df4-4678-8a16-cd6accf342bb-kube-api-access-wwvwn" (OuterVolumeSpecName: "kube-api-access-wwvwn") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "kube-api-access-wwvwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.922919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.939407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.960454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.965505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data" (OuterVolumeSpecName: "config-data") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:46 crc kubenswrapper[4707]: I0121 16:03:46.975931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "373e7e86-7df4-4678-8a16-cd6accf342bb" (UID: "373e7e86-7df4-4678-8a16-cd6accf342bb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.010745 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.010794 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.010872 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwvwn\" (UniqueName: \"kubernetes.io/projected/373e7e86-7df4-4678-8a16-cd6accf342bb-kube-api-access-wwvwn\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.010887 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.010900 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.010909 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/373e7e86-7df4-4678-8a16-cd6accf342bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.208174 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf13695f-19af-4b6e-9b52-c5f795e265d7" path="/var/lib/kubelet/pods/cf13695f-19af-4b6e-9b52-c5f795e265d7/volumes" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.451016 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.519877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-internal-tls-certs\") pod \"61bace1e-5cf4-462a-9d82-0f972aa44b75\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.519987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-public-tls-certs\") pod \"61bace1e-5cf4-462a-9d82-0f972aa44b75\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.520007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-config-data\") pod \"61bace1e-5cf4-462a-9d82-0f972aa44b75\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.520059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74qhz\" (UniqueName: \"kubernetes.io/projected/61bace1e-5cf4-462a-9d82-0f972aa44b75-kube-api-access-74qhz\") pod \"61bace1e-5cf4-462a-9d82-0f972aa44b75\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.520102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61bace1e-5cf4-462a-9d82-0f972aa44b75-logs\") pod \"61bace1e-5cf4-462a-9d82-0f972aa44b75\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.520155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-combined-ca-bundle\") pod \"61bace1e-5cf4-462a-9d82-0f972aa44b75\" (UID: \"61bace1e-5cf4-462a-9d82-0f972aa44b75\") " Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.520649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bace1e-5cf4-462a-9d82-0f972aa44b75-logs" (OuterVolumeSpecName: "logs") pod "61bace1e-5cf4-462a-9d82-0f972aa44b75" (UID: "61bace1e-5cf4-462a-9d82-0f972aa44b75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.528048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bace1e-5cf4-462a-9d82-0f972aa44b75-kube-api-access-74qhz" (OuterVolumeSpecName: "kube-api-access-74qhz") pod "61bace1e-5cf4-462a-9d82-0f972aa44b75" (UID: "61bace1e-5cf4-462a-9d82-0f972aa44b75"). InnerVolumeSpecName "kube-api-access-74qhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.574774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-config-data" (OuterVolumeSpecName: "config-data") pod "61bace1e-5cf4-462a-9d82-0f972aa44b75" (UID: "61bace1e-5cf4-462a-9d82-0f972aa44b75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.578038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61bace1e-5cf4-462a-9d82-0f972aa44b75" (UID: "61bace1e-5cf4-462a-9d82-0f972aa44b75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.592666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "61bace1e-5cf4-462a-9d82-0f972aa44b75" (UID: "61bace1e-5cf4-462a-9d82-0f972aa44b75"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.603719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "61bace1e-5cf4-462a-9d82-0f972aa44b75" (UID: "61bace1e-5cf4-462a-9d82-0f972aa44b75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.621532 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74qhz\" (UniqueName: \"kubernetes.io/projected/61bace1e-5cf4-462a-9d82-0f972aa44b75-kube-api-access-74qhz\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.621557 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.621569 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61bace1e-5cf4-462a-9d82-0f972aa44b75-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.621587 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.621595 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.621604 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61bace1e-5cf4-462a-9d82-0f972aa44b75-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.704185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1e538db-f225-4c4a-a147-f2bcf3218f28","Type":"ContainerStarted","Data":"1d9f4ad28c410a19a1a52d47576b0ff87741fce0b1aca9cbf91d3cf607f6137a"} Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.704226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"c1e538db-f225-4c4a-a147-f2bcf3218f28","Type":"ContainerStarted","Data":"f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c"} Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.707414 4707 generic.go:334] "Generic (PLEG): container finished" podID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerID="f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0" exitCode=0 Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.707491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"61bace1e-5cf4-462a-9d82-0f972aa44b75","Type":"ContainerDied","Data":"f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0"} Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.707515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"61bace1e-5cf4-462a-9d82-0f972aa44b75","Type":"ContainerDied","Data":"54256dbc5cc393032b3025ee1aff02d0c3a2825895a7ce9c03406705f89c1f27"} Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.707532 4707 scope.go:117] "RemoveContainer" containerID="f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.707595 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.712578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" event={"ID":"373e7e86-7df4-4678-8a16-cd6accf342bb","Type":"ContainerDied","Data":"8452ccc6e066ed6876612b08167b493a289f32110fefcef1dab6981a93d2f2fb"} Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.712634 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.727951 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.727936997 podStartE2EDuration="2.727936997s" podCreationTimestamp="2026-01-21 16:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:47.722228663 +0000 UTC m=+3724.903744886" watchObservedRunningTime="2026-01-21 16:03:47.727936997 +0000 UTC m=+3724.909453219" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.746181 4707 scope.go:117] "RemoveContainer" containerID="bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.777403 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8"] Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.784201 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-85fcc969c8-gghq8"] Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.791405 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.801985 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806198 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:03:47 crc kubenswrapper[4707]: E0121 16:03:47.806595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806606 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api" Jan 21 16:03:47 crc kubenswrapper[4707]: E0121 16:03:47.806620 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api-log" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806626 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api-log" Jan 21 16:03:47 crc kubenswrapper[4707]: E0121 16:03:47.806650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-api" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-api" Jan 21 16:03:47 crc kubenswrapper[4707]: E0121 16:03:47.806663 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-log" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806668 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-log" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806203 4707 scope.go:117] "RemoveContainer" containerID="f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806973 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-log" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.806990 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api-log" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.807013 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" containerName="barbican-api" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.807022 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" containerName="nova-api-api" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.807860 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.809855 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.811291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:03:47 crc kubenswrapper[4707]: E0121 16:03:47.813346 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0\": container with ID starting with f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0 not found: ID does not exist" containerID="f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.813383 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0"} err="failed to get container status \"f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0\": rpc error: code = NotFound desc = could not find container \"f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0\": container with ID starting with f9148fb23c6e270d9085a3cc6811bdd9d585350d26029d000321b46b18e4beb0 not found: ID does not exist" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.813406 4707 scope.go:117] "RemoveContainer" containerID="bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.813514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.813610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:03:47 crc kubenswrapper[4707]: E0121 16:03:47.816086 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445\": container with ID starting with bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445 not found: ID does not exist" containerID="bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.816210 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445"} err="failed to get container status 
\"bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445\": rpc error: code = NotFound desc = could not find container \"bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445\": container with ID starting with bbfbe3f9d7665cb4455dc6e9950f130f763847bffd52a3e7a20eedb03134f445 not found: ID does not exist" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.816328 4707 scope.go:117] "RemoveContainer" containerID="7dbbb9ef844ab37cf6aecd9f7cd00e150dba12f7539fb8826c773bd114ec81ca" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.852684 4707 scope.go:117] "RemoveContainer" containerID="872d133770e4ab129cb4440be2cd434a18f1e4f6a4f18e56e319acdd51d5de90" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.931402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.931474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b70303-f297-4a57-b252-7bdc251fa8ef-logs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.931521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-public-tls-certs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.931545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cxm\" (UniqueName: \"kubernetes.io/projected/a1b70303-f297-4a57-b252-7bdc251fa8ef-kube-api-access-d4cxm\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.931564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:47 crc kubenswrapper[4707]: I0121 16:03:47.931633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-config-data\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.033363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-config-data\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.033483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.033636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b70303-f297-4a57-b252-7bdc251fa8ef-logs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.033979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b70303-f297-4a57-b252-7bdc251fa8ef-logs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.035320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-public-tls-certs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.035371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cxm\" (UniqueName: \"kubernetes.io/projected/a1b70303-f297-4a57-b252-7bdc251fa8ef-kube-api-access-d4cxm\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.035400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.038283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.038377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-config-data\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.039541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-public-tls-certs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.045266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.049476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d4cxm\" (UniqueName: \"kubernetes.io/projected/a1b70303-f297-4a57-b252-7bdc251fa8ef-kube-api-access-d4cxm\") pod \"nova-api-0\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.128872 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.202861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.518686 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:03:48 crc kubenswrapper[4707]: W0121 16:03:48.525945 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b70303_f297_4a57_b252_7bdc251fa8ef.slice/crio-c846cb49fa8e8fd526b07a61e1e8214e73481ec9a3990c74f86786779a294f98 WatchSource:0}: Error finding container c846cb49fa8e8fd526b07a61e1e8214e73481ec9a3990c74f86786779a294f98: Status 404 returned error can't find the container with id c846cb49fa8e8fd526b07a61e1e8214e73481ec9a3990c74f86786779a294f98 Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.598260 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-cf495cb46-rcdjz"] Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.648598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-cf495cb46-rcdjz"] Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.648706 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-internal-tls-certs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-scripts\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-public-tls-certs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-config-data\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749820 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-logs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-combined-ca-bundle\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.749868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc48n\" (UniqueName: \"kubernetes.io/projected/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-kube-api-access-sc48n\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.754798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a1b70303-f297-4a57-b252-7bdc251fa8ef","Type":"ContainerStarted","Data":"c846cb49fa8e8fd526b07a61e1e8214e73481ec9a3990c74f86786779a294f98"} Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.756850 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_6363e430-8c44-4cbc-8923-3ed9147d0cde/ovn-northd/0.log" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.756882 4707 generic.go:334] "Generic (PLEG): container finished" podID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerID="3691aec55f628a7ca9a8989d0678c03a481be77fab6135bf1fef66f4aad67ebd" exitCode=139 Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.757012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6363e430-8c44-4cbc-8923-3ed9147d0cde","Type":"ContainerDied","Data":"3691aec55f628a7ca9a8989d0678c03a481be77fab6135bf1fef66f4aad67ebd"} Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.852064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-public-tls-certs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.852509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-config-data\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.852533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-logs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.852954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-combined-ca-bundle\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.852989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc48n\" (UniqueName: \"kubernetes.io/projected/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-kube-api-access-sc48n\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.853044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-logs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.853088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-internal-tls-certs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.853179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-scripts\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.855055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-public-tls-certs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.855294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-config-data\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.858195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-scripts\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.860317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-internal-tls-certs\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.864274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-combined-ca-bundle\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.869252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc48n\" (UniqueName: \"kubernetes.io/projected/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-kube-api-access-sc48n\") pod \"placement-cf495cb46-rcdjz\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.936460 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_6363e430-8c44-4cbc-8923-3ed9147d0cde/ovn-northd/0.log" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.936538 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:48 crc kubenswrapper[4707]: I0121 16:03:48.981378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.058993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-combined-ca-bundle\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.059064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.059150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-scripts\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.059254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-rundir\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.059282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.059303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-config\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.059328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v787j\" (UniqueName: 
\"kubernetes.io/projected/6363e430-8c44-4cbc-8923-3ed9147d0cde-kube-api-access-v787j\") pod \"6363e430-8c44-4cbc-8923-3ed9147d0cde\" (UID: \"6363e430-8c44-4cbc-8923-3ed9147d0cde\") " Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.060986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.061021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-scripts" (OuterVolumeSpecName: "scripts") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.061190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-config" (OuterVolumeSpecName: "config") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.063364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6363e430-8c44-4cbc-8923-3ed9147d0cde-kube-api-access-v787j" (OuterVolumeSpecName: "kube-api-access-v787j") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "kube-api-access-v787j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.081560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.103281 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.107268 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.129915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.133999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "6363e430-8c44-4cbc-8923-3ed9147d0cde" (UID: "6363e430-8c44-4cbc-8923-3ed9147d0cde"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.147894 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.161938 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.161981 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.161993 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.162003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v787j\" (UniqueName: \"kubernetes.io/projected/6363e430-8c44-4cbc-8923-3ed9147d0cde-kube-api-access-v787j\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.162011 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.162119 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6363e430-8c44-4cbc-8923-3ed9147d0cde-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.162430 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6363e430-8c44-4cbc-8923-3ed9147d0cde-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.194144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373e7e86-7df4-4678-8a16-cd6accf342bb" path="/var/lib/kubelet/pods/373e7e86-7df4-4678-8a16-cd6accf342bb/volumes" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.194754 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61bace1e-5cf4-462a-9d82-0f972aa44b75" path="/var/lib/kubelet/pods/61bace1e-5cf4-462a-9d82-0f972aa44b75/volumes" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.240352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.288904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.536008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/placement-cf495cb46-rcdjz"] Jan 21 16:03:49 crc kubenswrapper[4707]: W0121 16:03:49.540891 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff12a6c_5d02_4e45_9e7e_e589b746f1dc.slice/crio-83a59ed7cec5c61fc2188caad0a9e33a181eddc8cac3308b151b01bfb9bae715 WatchSource:0}: Error finding container 83a59ed7cec5c61fc2188caad0a9e33a181eddc8cac3308b151b01bfb9bae715: Status 404 returned error can't find the container with id 83a59ed7cec5c61fc2188caad0a9e33a181eddc8cac3308b151b01bfb9bae715 Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.764411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" event={"ID":"eff12a6c-5d02-4e45-9e7e-e589b746f1dc","Type":"ContainerStarted","Data":"32a92f7f2012dcf1fe6f0d9760599775639a7067de28634de63ab2fdd9b6962c"} Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.764448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" event={"ID":"eff12a6c-5d02-4e45-9e7e-e589b746f1dc","Type":"ContainerStarted","Data":"83a59ed7cec5c61fc2188caad0a9e33a181eddc8cac3308b151b01bfb9bae715"} Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.766017 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_6363e430-8c44-4cbc-8923-3ed9147d0cde/ovn-northd/0.log" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.766066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6363e430-8c44-4cbc-8923-3ed9147d0cde","Type":"ContainerDied","Data":"eb310c830cc7f22a95201abcdae076f8a724e61493eb24fa79d3f70352913353"} Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.766099 4707 scope.go:117] "RemoveContainer" containerID="05ec9f6333df97f3167e3d152d12769dc4312df78642cbffed6c3c74e6b24e76" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.766184 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.770322 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c95f7cb7d-zhgp4_3e1ab3b0-53e5-437a-8882-ce4f23c015b4/neutron-api/0.log" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.770353 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerID="0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af" exitCode=1 Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.770394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerDied","Data":"0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af"} Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.770710 4707 scope.go:117] "RemoveContainer" containerID="0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.774069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a1b70303-f297-4a57-b252-7bdc251fa8ef","Type":"ContainerStarted","Data":"5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e"} Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.774102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a1b70303-f297-4a57-b252-7bdc251fa8ef","Type":"ContainerStarted","Data":"23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd"} Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.774114 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.790033 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.823967 4707 scope.go:117] "RemoveContainer" containerID="3691aec55f628a7ca9a8989d0678c03a481be77fab6135bf1fef66f4aad67ebd" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.828917 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.848522 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.856785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:03:49 crc kubenswrapper[4707]: E0121 16:03:49.857178 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="openstack-network-exporter" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.857191 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="openstack-network-exporter" Jan 21 16:03:49 crc kubenswrapper[4707]: E0121 16:03:49.857203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="ovn-northd" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.857210 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="ovn-northd" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.857365 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="openstack-network-exporter" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.857385 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" containerName="ovn-northd" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.858283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.860066 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.860094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-jvckj" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.860145 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.860204 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.864657 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.868336 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.868322622 podStartE2EDuration="2.868322622s" podCreationTimestamp="2026-01-21 16:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:49.819603895 +0000 UTC m=+3727.001120117" watchObservedRunningTime="2026-01-21 16:03:49.868322622 +0000 UTC m=+3727.049838844" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.899739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.899928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.929633 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.929717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.929817 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.953104 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.962982 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.964966 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhhwt\" (UniqueName: \"kubernetes.io/projected/6d840a57-4237-4336-9882-01a52c8a2c09-kube-api-access-vhhwt\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-scripts\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-config\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:49 crc kubenswrapper[4707]: I0121 16:03:49.978441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-scripts\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " 
pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-config\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhhwt\" (UniqueName: \"kubernetes.io/projected/6d840a57-4237-4336-9882-01a52c8a2c09-kube-api-access-vhhwt\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.080972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-scripts\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.081525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-config\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.085102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.085313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.085797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.093728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhhwt\" (UniqueName: \"kubernetes.io/projected/6d840a57-4237-4336-9882-01a52c8a2c09-kube-api-access-vhhwt\") pod \"ovn-northd-0\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.182185 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.561986 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:03:50 crc kubenswrapper[4707]: W0121 16:03:50.566777 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d840a57_4237_4336_9882_01a52c8a2c09.slice/crio-7cdba8929411bf5f77c2cfbbb4e46f35b0508c5a0157c7e3cf760c65407d5be9 WatchSource:0}: Error finding container 7cdba8929411bf5f77c2cfbbb4e46f35b0508c5a0157c7e3cf760c65407d5be9: Status 404 returned error can't find the container with id 7cdba8929411bf5f77c2cfbbb4e46f35b0508c5a0157c7e3cf760c65407d5be9 Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.785523 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c95f7cb7d-zhgp4_3e1ab3b0-53e5-437a-8882-ce4f23c015b4/neutron-api/0.log" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.785649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerStarted","Data":"75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6"} Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.787502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6d840a57-4237-4336-9882-01a52c8a2c09","Type":"ContainerStarted","Data":"77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261"} Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.787543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6d840a57-4237-4336-9882-01a52c8a2c09","Type":"ContainerStarted","Data":"7cdba8929411bf5f77c2cfbbb4e46f35b0508c5a0157c7e3cf760c65407d5be9"} Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.789058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" event={"ID":"eff12a6c-5d02-4e45-9e7e-e589b746f1dc","Type":"ContainerStarted","Data":"1cfbfa0717635e65d7326b7c6d12c3ff52804eea66d22885b209dc75b069dd24"} Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.789880 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 
16:03:50.789926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.790168 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.792944 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.792985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.793214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.793234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:50 crc kubenswrapper[4707]: I0121 16:03:50.820336 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" podStartSLOduration=2.820323005 podStartE2EDuration="2.820323005s" podCreationTimestamp="2026-01-21 16:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:03:50.815329755 +0000 UTC m=+3727.996845978" watchObservedRunningTime="2026-01-21 16:03:50.820323005 +0000 UTC m=+3728.001839227" Jan 21 16:03:51 crc kubenswrapper[4707]: I0121 16:03:51.136479 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:03:51 crc kubenswrapper[4707]: I0121 16:03:51.204222 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6363e430-8c44-4cbc-8923-3ed9147d0cde" path="/var/lib/kubelet/pods/6363e430-8c44-4cbc-8923-3ed9147d0cde/volumes" Jan 21 16:03:51 crc kubenswrapper[4707]: I0121 16:03:51.799055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6d840a57-4237-4336-9882-01a52c8a2c09","Type":"ContainerStarted","Data":"b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6"} Jan 21 16:03:52 crc kubenswrapper[4707]: I0121 16:03:52.383301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:52 crc kubenswrapper[4707]: I0121 16:03:52.385369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:03:52 crc kubenswrapper[4707]: I0121 16:03:52.401147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:52 crc kubenswrapper[4707]: I0121 16:03:52.403725 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=3.403713226 podStartE2EDuration="3.403713226s" podCreationTimestamp="2026-01-21 16:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 16:03:51.816241066 +0000 UTC m=+3728.997757289" watchObservedRunningTime="2026-01-21 16:03:52.403713226 +0000 UTC m=+3729.585229448" Jan 21 16:03:52 crc kubenswrapper[4707]: I0121 16:03:52.429930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:03:52 crc kubenswrapper[4707]: I0121 16:03:52.807487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:03:53 crc kubenswrapper[4707]: I0121 16:03:53.981840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:54 crc kubenswrapper[4707]: I0121 16:03:54.012114 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:54 crc kubenswrapper[4707]: I0121 16:03:54.833396 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:03:58 crc kubenswrapper[4707]: I0121 16:03:58.130158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:58 crc kubenswrapper[4707]: I0121 16:03:58.130511 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:03:59 crc kubenswrapper[4707]: I0121 16:03:59.139902 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.100:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:03:59 crc kubenswrapper[4707]: I0121 16:03:59.139913 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.100:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:04:00 crc kubenswrapper[4707]: I0121 16:04:00.227299 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:04:06 crc kubenswrapper[4707]: I0121 16:04:06.871733 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:04:06 crc kubenswrapper[4707]: I0121 16:04:06.909325 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp"] Jan 21 16:04:06 crc kubenswrapper[4707]: I0121 16:04:06.909527 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" podUID="30ccb1b1-f7df-49e1-b264-546fe96b02c0" containerName="keystone-api" containerID="cri-o://2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8" gracePeriod=30 Jan 21 16:04:08 crc kubenswrapper[4707]: I0121 16:04:08.135493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:04:08 crc kubenswrapper[4707]: I0121 16:04:08.135721 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:04:08 crc kubenswrapper[4707]: I0121 16:04:08.136029 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:04:08 crc kubenswrapper[4707]: I0121 16:04:08.136065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:04:08 crc kubenswrapper[4707]: I0121 16:04:08.140998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:04:08 crc kubenswrapper[4707]: I0121 16:04:08.141939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.175372 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.387883 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8q62\" (UniqueName: \"kubernetes.io/projected/30ccb1b1-f7df-49e1-b264-546fe96b02c0-kube-api-access-m8q62\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-combined-ca-bundle\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-scripts\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-credential-keys\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-fernet-keys\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-config-data\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.412963 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-public-tls-certs\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.419136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.424078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-scripts" (OuterVolumeSpecName: "scripts") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.444924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ccb1b1-f7df-49e1-b264-546fe96b02c0-kube-api-access-m8q62" (OuterVolumeSpecName: "kube-api-access-m8q62") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "kube-api-access-m8q62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.448483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.462155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.462645 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-config-data" (OuterVolumeSpecName: "config-data") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: E0121 16:04:10.492125 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs podName:30ccb1b1-f7df-49e1-b264-546fe96b02c0 nodeName:}" failed. No retries permitted until 2026-01-21 16:04:10.992106382 +0000 UTC m=+3748.173622603 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0") : error deleting /var/lib/kubelet/pods/30ccb1b1-f7df-49e1-b264-546fe96b02c0/volume-subpaths: remove /var/lib/kubelet/pods/30ccb1b1-f7df-49e1-b264-546fe96b02c0/volume-subpaths: no such file or directory Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.494666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.514942 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.514965 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.514991 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.515009 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.515020 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8q62\" (UniqueName: \"kubernetes.io/projected/30ccb1b1-f7df-49e1-b264-546fe96b02c0-kube-api-access-m8q62\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.515031 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.515039 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.934906 4707 generic.go:334] "Generic (PLEG): container finished" podID="30ccb1b1-f7df-49e1-b264-546fe96b02c0" containerID="2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8" exitCode=0 Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.934942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" event={"ID":"30ccb1b1-f7df-49e1-b264-546fe96b02c0","Type":"ContainerDied","Data":"2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8"} Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.934964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" 
event={"ID":"30ccb1b1-f7df-49e1-b264-546fe96b02c0","Type":"ContainerDied","Data":"8b18577448492fa9c776041b61c122ff6298ba44141f220acc90e3f28a5c07f5"} Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.934978 4707 scope.go:117] "RemoveContainer" containerID="2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.935083 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.954561 4707 scope.go:117] "RemoveContainer" containerID="2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8" Jan 21 16:04:10 crc kubenswrapper[4707]: E0121 16:04:10.954967 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8\": container with ID starting with 2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8 not found: ID does not exist" containerID="2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8" Jan 21 16:04:10 crc kubenswrapper[4707]: I0121 16:04:10.954991 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8"} err="failed to get container status \"2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8\": rpc error: code = NotFound desc = could not find container \"2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8\": container with ID starting with 2ce3c04fd5b2611c3cebc2a58b6ddeface6861d7e2d3437556467e0cfd40b1b8 not found: ID does not exist" Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.022632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs\") pod \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\" (UID: \"30ccb1b1-f7df-49e1-b264-546fe96b02c0\") " Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.026417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30ccb1b1-f7df-49e1-b264-546fe96b02c0" (UID: "30ccb1b1-f7df-49e1-b264-546fe96b02c0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.124722 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ccb1b1-f7df-49e1-b264-546fe96b02c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.175829 4707 scope.go:117] "RemoveContainer" containerID="6069a182333f4865e5bff3fe2585423f7e539e8ae4f687c86c63a5733d4ed4a3" Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.207972 4707 scope.go:117] "RemoveContainer" containerID="fa323bea5e9e1d95534dd4ba7f41b30c1e310d988c5dd2b4f06c9ddc1db6a2e9" Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.233293 4707 scope.go:117] "RemoveContainer" containerID="1c1f119c65c4fce9675289522fce83be1fb184f19095b0c15b0efe892c668d46" Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.276968 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp"] Jan 21 16:04:11 crc kubenswrapper[4707]: I0121 16:04:11.284024 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7b8fc55dc7-dn5dp"] Jan 21 16:04:12 crc kubenswrapper[4707]: I0121 16:04:12.736554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:04:12 crc kubenswrapper[4707]: I0121 16:04:12.789802 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf"] Jan 21 16:04:12 crc kubenswrapper[4707]: I0121 16:04:12.790099 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-httpd" containerID="cri-o://a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0" gracePeriod=30 Jan 21 16:04:12 crc kubenswrapper[4707]: I0121 16:04:12.790027 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-api" containerID="cri-o://9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30" gracePeriod=30 Jan 21 16:04:12 crc kubenswrapper[4707]: I0121 16:04:12.950747 4707 generic.go:334] "Generic (PLEG): container finished" podID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerID="a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0" exitCode=0 Jan 21 16:04:12 crc kubenswrapper[4707]: I0121 16:04:12.950787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" event={"ID":"53a357b9-520a-4e1b-be4e-917657c1e24e","Type":"ContainerDied","Data":"a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0"} Jan 21 16:04:13 crc kubenswrapper[4707]: I0121 16:04:13.191144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ccb1b1-f7df-49e1-b264-546fe96b02c0" path="/var/lib/kubelet/pods/30ccb1b1-f7df-49e1-b264-546fe96b02c0/volumes" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.188718 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-567575559b-gqwkg"] Jan 21 16:04:14 crc kubenswrapper[4707]: E0121 16:04:14.189216 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ccb1b1-f7df-49e1-b264-546fe96b02c0" containerName="keystone-api" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 
16:04:14.189228 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ccb1b1-f7df-49e1-b264-546fe96b02c0" containerName="keystone-api" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.189405 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ccb1b1-f7df-49e1-b264-546fe96b02c0" containerName="keystone-api" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.190246 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.196794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567575559b-gqwkg"] Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78q44\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-kube-api-access-78q44\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-public-tls-certs\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-log-httpd\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-run-httpd\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-internal-tls-certs\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-config-data\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-etc-swift\") pod \"swift-proxy-567575559b-gqwkg\" (UID: 
\"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.275782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-combined-ca-bundle\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.376846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-log-httpd\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.376903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-run-httpd\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.376957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-internal-tls-certs\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.376979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-config-data\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.377046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-etc-swift\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.377077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-combined-ca-bundle\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.377108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78q44\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-kube-api-access-78q44\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.377139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-public-tls-certs\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.377295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-log-httpd\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.377399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-run-httpd\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.382423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-internal-tls-certs\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.382609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-public-tls-certs\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.385398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-config-data\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.385406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-etc-swift\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.385974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-combined-ca-bundle\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.389800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78q44\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-kube-api-access-78q44\") pod \"swift-proxy-567575559b-gqwkg\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.506455 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.895614 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567575559b-gqwkg"] Jan 21 16:04:14 crc kubenswrapper[4707]: I0121 16:04:14.971618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" event={"ID":"2772c54d-baf5-4cd4-8677-0db11b97c5ef","Type":"ContainerStarted","Data":"4305a20f633fc4957c8c0e27ca4e0246bc5f26542a84c6d503dfe7054c2af499"} Jan 21 16:04:15 crc kubenswrapper[4707]: I0121 16:04:15.979328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" event={"ID":"2772c54d-baf5-4cd4-8677-0db11b97c5ef","Type":"ContainerStarted","Data":"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47"} Jan 21 16:04:15 crc kubenswrapper[4707]: I0121 16:04:15.979550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" event={"ID":"2772c54d-baf5-4cd4-8677-0db11b97c5ef","Type":"ContainerStarted","Data":"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3"} Jan 21 16:04:15 crc kubenswrapper[4707]: I0121 16:04:15.979564 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:15 crc kubenswrapper[4707]: I0121 16:04:15.979707 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:16 crc kubenswrapper[4707]: I0121 16:04:16.361844 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.79:9696/\": dial tcp 10.217.1.79:9696: connect: connection refused" Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.976377 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.994769 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" podStartSLOduration=3.994755577 podStartE2EDuration="3.994755577s" podCreationTimestamp="2026-01-21 16:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:04:16.001298365 +0000 UTC m=+3753.182814587" watchObservedRunningTime="2026-01-21 16:04:17.994755577 +0000 UTC m=+3755.176271799" Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.996940 4707 generic.go:334] "Generic (PLEG): container finished" podID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerID="9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30" exitCode=0 Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.996987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" event={"ID":"53a357b9-520a-4e1b-be4e-917657c1e24e","Type":"ContainerDied","Data":"9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30"} Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.997017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" event={"ID":"53a357b9-520a-4e1b-be4e-917657c1e24e","Type":"ContainerDied","Data":"30a2e8789c91a984830ac98a8867d10f3ea13a3388ce8fbc35da7c0a7d9e733a"} Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.997030 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf" Jan 21 16:04:17 crc kubenswrapper[4707]: I0121 16:04:17.997033 4707 scope.go:117] "RemoveContainer" containerID="a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.013609 4707 scope.go:117] "RemoveContainer" containerID="9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.029173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-combined-ca-bundle\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.029295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-internal-tls-certs\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.029314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-config\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.029332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-httpd-config\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: 
I0121 16:04:18.029387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-ovndb-tls-certs\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.029408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-public-tls-certs\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.029459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgndw\" (UniqueName: \"kubernetes.io/projected/53a357b9-520a-4e1b-be4e-917657c1e24e-kube-api-access-fgndw\") pod \"53a357b9-520a-4e1b-be4e-917657c1e24e\" (UID: \"53a357b9-520a-4e1b-be4e-917657c1e24e\") " Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.033601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a357b9-520a-4e1b-be4e-917657c1e24e-kube-api-access-fgndw" (OuterVolumeSpecName: "kube-api-access-fgndw") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "kube-api-access-fgndw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.034090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.035833 4707 scope.go:117] "RemoveContainer" containerID="a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0" Jan 21 16:04:18 crc kubenswrapper[4707]: E0121 16:04:18.047176 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0\": container with ID starting with a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0 not found: ID does not exist" containerID="a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.047237 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0"} err="failed to get container status \"a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0\": rpc error: code = NotFound desc = could not find container \"a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0\": container with ID starting with a9bb6cd8a3451b8049448d0a9eebfa5ad780cf4193b2205fa7b0fce71e599ce0 not found: ID does not exist" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.047262 4707 scope.go:117] "RemoveContainer" containerID="9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30" Jan 21 16:04:18 crc kubenswrapper[4707]: E0121 16:04:18.047581 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30\": container with ID starting with 9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30 not found: ID does not exist" containerID="9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.047613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30"} err="failed to get container status \"9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30\": rpc error: code = NotFound desc = could not find container \"9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30\": container with ID starting with 9b9333635e68de65f18dd8134bd829ca670ccbc9d918795793928a169f5b9c30 not found: ID does not exist" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.066479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.067213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.068782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-config" (OuterVolumeSpecName: "config") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.069043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.083483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "53a357b9-520a-4e1b-be4e-917657c1e24e" (UID: "53a357b9-520a-4e1b-be4e-917657c1e24e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131438 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131590 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131654 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131718 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131776 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131851 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53a357b9-520a-4e1b-be4e-917657c1e24e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.131914 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgndw\" (UniqueName: \"kubernetes.io/projected/53a357b9-520a-4e1b-be4e-917657c1e24e-kube-api-access-fgndw\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.322105 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf"] Jan 21 16:04:18 crc kubenswrapper[4707]: I0121 16:04:18.328774 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/neutron-6fd5c4b78d-v98lf"] Jan 21 16:04:19 crc kubenswrapper[4707]: I0121 16:04:19.189485 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" path="/var/lib/kubelet/pods/53a357b9-520a-4e1b-be4e-917657c1e24e/volumes" Jan 21 16:04:19 crc kubenswrapper[4707]: I0121 16:04:19.947131 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:04:19 crc kubenswrapper[4707]: I0121 16:04:19.947759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:04:20 crc kubenswrapper[4707]: I0121 16:04:20.031204 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-685994d4cb-zr62v"] Jan 21 16:04:20 crc kubenswrapper[4707]: I0121 16:04:20.031461 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-log" containerID="cri-o://484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4" gracePeriod=30 Jan 21 16:04:20 crc kubenswrapper[4707]: I0121 16:04:20.031602 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-api" containerID="cri-o://d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0" gracePeriod=30 Jan 21 16:04:21 crc kubenswrapper[4707]: I0121 16:04:21.025789 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerID="484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4" exitCode=143 Jan 21 16:04:21 crc kubenswrapper[4707]: I0121 16:04:21.025874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" event={"ID":"5b206e03-5e3a-4613-8d4a-6108443785cb","Type":"ContainerDied","Data":"484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4"} Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.144206 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.69:8778/\": read tcp 10.217.0.2:34788->10.217.1.69:8778: read: connection reset by peer" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.144246 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.69:8778/\": read tcp 10.217.0.2:34800->10.217.1.69:8778: read: connection reset by peer" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.582984 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.641762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh9rz\" (UniqueName: \"kubernetes.io/projected/5b206e03-5e3a-4613-8d4a-6108443785cb-kube-api-access-jh9rz\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-config-data\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-combined-ca-bundle\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-scripts\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642167 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642210 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b206e03-5e3a-4613-8d4a-6108443785cb-logs\") pod \"5b206e03-5e3a-4613-8d4a-6108443785cb\" (UID: \"5b206e03-5e3a-4613-8d4a-6108443785cb\") " Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.642790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b206e03-5e3a-4613-8d4a-6108443785cb-logs" (OuterVolumeSpecName: "logs") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.647240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-scripts" (OuterVolumeSpecName: "scripts") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.647249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b206e03-5e3a-4613-8d4a-6108443785cb-kube-api-access-jh9rz" (OuterVolumeSpecName: "kube-api-access-jh9rz") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "kube-api-access-jh9rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.678976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-config-data" (OuterVolumeSpecName: "config-data") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.680703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.710632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.720491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5b206e03-5e3a-4613-8d4a-6108443785cb" (UID: "5b206e03-5e3a-4613-8d4a-6108443785cb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743283 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743311 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743321 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b206e03-5e3a-4613-8d4a-6108443785cb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743332 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh9rz\" (UniqueName: \"kubernetes.io/projected/5b206e03-5e3a-4613-8d4a-6108443785cb-kube-api-access-jh9rz\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743343 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743352 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:23 crc kubenswrapper[4707]: I0121 16:04:23.743359 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b206e03-5e3a-4613-8d4a-6108443785cb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.052287 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerID="d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0" exitCode=0 Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.052329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" event={"ID":"5b206e03-5e3a-4613-8d4a-6108443785cb","Type":"ContainerDied","Data":"d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0"} Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.052346 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.052365 4707 scope.go:117] "RemoveContainer" containerID="d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.052354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-685994d4cb-zr62v" event={"ID":"5b206e03-5e3a-4613-8d4a-6108443785cb","Type":"ContainerDied","Data":"35794f68ad94b71e66696f834df51416b6a3bc46e7f01a9e1111db3cf9aa9e72"} Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.071393 4707 scope.go:117] "RemoveContainer" containerID="484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.081294 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-685994d4cb-zr62v"] Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.086295 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-685994d4cb-zr62v"] Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.091448 4707 scope.go:117] "RemoveContainer" containerID="d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0" Jan 21 16:04:24 crc kubenswrapper[4707]: E0121 16:04:24.091821 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0\": container with ID starting with d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0 not found: ID does not exist" containerID="d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.091852 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0"} err="failed to get container status \"d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0\": rpc error: code = NotFound desc = could not find container \"d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0\": container with ID starting with d13903b7e652e6838d684b596b217064e2a38b9d4776479eead6d44ce580bec0 not found: ID does not exist" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.091871 4707 scope.go:117] "RemoveContainer" containerID="484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4" Jan 21 16:04:24 crc kubenswrapper[4707]: E0121 16:04:24.092209 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4\": container with ID starting with 484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4 not found: ID does not exist" containerID="484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.092239 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4"} err="failed to get container status \"484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4\": rpc error: code = NotFound desc = could not find container \"484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4\": container with ID starting with 484dbf896697d3cc87865b46a79fad62e682122dd05b8083a7ea08b314a432b4 
not found: ID does not exist" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.510198 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.512175 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.564636 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs"] Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.565101 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-httpd" containerID="cri-o://915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb" gracePeriod=30 Jan 21 16:04:24 crc kubenswrapper[4707]: I0121 16:04:24.565173 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-server" containerID="cri-o://11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa" gracePeriod=30 Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.030120 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.068985 4707 generic.go:334] "Generic (PLEG): container finished" podID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerID="11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa" exitCode=0 Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069014 4707 generic.go:334] "Generic (PLEG): container finished" podID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerID="915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb" exitCode=0 Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069056 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" event={"ID":"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be","Type":"ContainerDied","Data":"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa"} Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" event={"ID":"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be","Type":"ContainerDied","Data":"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb"} Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs" event={"ID":"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be","Type":"ContainerDied","Data":"f158eb4dec08518c041253742307fa9d0b4db27b4d8a049c5f839e5ffbf80f6a"} Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069135 4707 scope.go:117] "RemoveContainer" containerID="11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-internal-tls-certs\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-public-tls-certs\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-combined-ca-bundle\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlwwb\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-kube-api-access-wlwwb\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.069992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-log-httpd\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.070016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-run-httpd\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.070047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-config-data\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.070075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-etc-swift\") pod \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\" (UID: \"331689e3-b8a3-4b08-9ac3-f9fc8d7a27be\") " Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.070393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.070795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.074078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.076432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-kube-api-access-wlwwb" (OuterVolumeSpecName: "kube-api-access-wlwwb") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "kube-api-access-wlwwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.085526 4707 scope.go:117] "RemoveContainer" containerID="915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.101400 4707 scope.go:117] "RemoveContainer" containerID="11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa" Jan 21 16:04:25 crc kubenswrapper[4707]: E0121 16:04:25.101657 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa\": container with ID starting with 11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa not found: ID does not exist" containerID="11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.101690 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa"} err="failed to get container status \"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa\": rpc error: code = NotFound desc = could not find container \"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa\": container with ID starting with 11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa not found: ID does not exist" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.101711 4707 scope.go:117] "RemoveContainer" containerID="915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb" Jan 21 16:04:25 crc kubenswrapper[4707]: E0121 16:04:25.101988 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb\": container with ID starting with 915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb not found: ID does not exist" containerID="915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.102029 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb"} err="failed to get container status \"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb\": rpc error: code = NotFound desc = could not find container \"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb\": container with ID starting with 915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb not found: ID does not exist" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.102054 4707 scope.go:117] "RemoveContainer" containerID="11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.102290 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa"} err="failed to get container status \"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa\": rpc error: code = NotFound desc = could not find container \"11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa\": container with ID starting with 11dbfbbb9063d022abd779a4ce3f18341bbd73b77858a8ea1b4913f7a6d222fa not found: ID does not exist" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.102317 4707 
scope.go:117] "RemoveContainer" containerID="915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.102504 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb"} err="failed to get container status \"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb\": rpc error: code = NotFound desc = could not find container \"915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb\": container with ID starting with 915931ed77f80bf858711a98c6b9fc3e41dd38d5b74eac344d26a7a1f259febb not found: ID does not exist" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.104595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-config-data" (OuterVolumeSpecName: "config-data") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.104727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.106930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.111218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" (UID: "331689e3-b8a3-4b08-9ac3-f9fc8d7a27be"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172515 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172712 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlwwb\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-kube-api-access-wlwwb\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172724 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172733 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172743 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172752 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172761 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.172769 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.192875 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" path="/var/lib/kubelet/pods/5b206e03-5e3a-4613-8d4a-6108443785cb/volumes" Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.390059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs"] Jan 21 16:04:25 crc kubenswrapper[4707]: I0121 16:04:25.395838 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58ff699466-ffxgs"] Jan 21 16:04:27 crc kubenswrapper[4707]: I0121 16:04:27.190332 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" path="/var/lib/kubelet/pods/331689e3-b8a3-4b08-9ac3-f9fc8d7a27be/volumes" Jan 21 16:04:39 crc kubenswrapper[4707]: I0121 16:04:39.945563 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:04:39 crc kubenswrapper[4707]: I0121 16:04:39.946634 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:04:40 crc kubenswrapper[4707]: E0121 16:04:40.460963 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:41874->192.168.25.165:36655: read tcp 192.168.25.165:41874->192.168.25.165:36655: read: connection reset by peer Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.221006 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod30ccb1b1-f7df-49e1-b264-546fe96b02c0"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod30ccb1b1-f7df-49e1-b264-546fe96b02c0] : Timed out while waiting for systemd to remove kubepods-besteffort-pod30ccb1b1_f7df_49e1_b264_546fe96b02c0.slice" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551218 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kbnb4"] Jan 21 16:04:41 crc kubenswrapper[4707]: E0121 16:04:41.551526 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-api" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551543 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-api" Jan 21 16:04:41 crc kubenswrapper[4707]: E0121 16:04:41.551558 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-httpd" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551564 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-httpd" Jan 21 16:04:41 crc kubenswrapper[4707]: E0121 16:04:41.551573 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-httpd" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551580 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-httpd" Jan 21 16:04:41 crc kubenswrapper[4707]: E0121 16:04:41.551587 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-server" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551592 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-server" Jan 21 16:04:41 crc kubenswrapper[4707]: E0121 16:04:41.551613 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-log" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551618 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-log" Jan 21 16:04:41 crc kubenswrapper[4707]: E0121 16:04:41.551628 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-api" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551633 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-api" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 
16:04:41.551802 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-httpd" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-api" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551846 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-httpd" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551855 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a357b9-520a-4e1b-be4e-917657c1e24e" containerName="neutron-api" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="331689e3-b8a3-4b08-9ac3-f9fc8d7a27be" containerName="proxy-server" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.551876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b206e03-5e3a-4613-8d4a-6108443785cb" containerName="placement-log" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.553010 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.559176 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbnb4"] Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.621304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssptg\" (UniqueName: \"kubernetes.io/projected/a2e118f5-6354-4e6a-b996-1a6a23037434-kube-api-access-ssptg\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.621566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-utilities\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.621700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-catalog-content\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.723167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssptg\" (UniqueName: \"kubernetes.io/projected/a2e118f5-6354-4e6a-b996-1a6a23037434-kube-api-access-ssptg\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.723423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-utilities\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc 
kubenswrapper[4707]: I0121 16:04:41.723564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-catalog-content\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.723774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-utilities\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.724004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-catalog-content\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.741628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssptg\" (UniqueName: \"kubernetes.io/projected/a2e118f5-6354-4e6a-b996-1a6a23037434-kube-api-access-ssptg\") pod \"redhat-operators-kbnb4\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:41 crc kubenswrapper[4707]: I0121 16:04:41.867629 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:42 crc kubenswrapper[4707]: I0121 16:04:42.257817 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbnb4"] Jan 21 16:04:43 crc kubenswrapper[4707]: I0121 16:04:43.184015 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerID="0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c" exitCode=0 Jan 21 16:04:43 crc kubenswrapper[4707]: I0121 16:04:43.192971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbnb4" event={"ID":"a2e118f5-6354-4e6a-b996-1a6a23037434","Type":"ContainerDied","Data":"0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c"} Jan 21 16:04:43 crc kubenswrapper[4707]: I0121 16:04:43.193002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbnb4" event={"ID":"a2e118f5-6354-4e6a-b996-1a6a23037434","Type":"ContainerStarted","Data":"0504b225eaab6b8d69f986531f734cb8d427104e9f8ceed4da7e67fbd2000b81"} Jan 21 16:04:45 crc kubenswrapper[4707]: I0121 16:04:45.198595 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerID="37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca" exitCode=0 Jan 21 16:04:45 crc kubenswrapper[4707]: I0121 16:04:45.198750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbnb4" event={"ID":"a2e118f5-6354-4e6a-b996-1a6a23037434","Type":"ContainerDied","Data":"37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca"} Jan 21 16:04:46 crc kubenswrapper[4707]: I0121 16:04:46.206895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbnb4" 
event={"ID":"a2e118f5-6354-4e6a-b996-1a6a23037434","Type":"ContainerStarted","Data":"8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a"} Jan 21 16:04:46 crc kubenswrapper[4707]: I0121 16:04:46.223956 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kbnb4" podStartSLOduration=2.717255169 podStartE2EDuration="5.223942642s" podCreationTimestamp="2026-01-21 16:04:41 +0000 UTC" firstStartedPulling="2026-01-21 16:04:43.192190156 +0000 UTC m=+3780.373706378" lastFinishedPulling="2026-01-21 16:04:45.698877629 +0000 UTC m=+3782.880393851" observedRunningTime="2026-01-21 16:04:46.219478508 +0000 UTC m=+3783.400994731" watchObservedRunningTime="2026-01-21 16:04:46.223942642 +0000 UTC m=+3783.405458864" Jan 21 16:04:47 crc kubenswrapper[4707]: E0121 16:04:47.947382 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:57062->192.168.25.165:36655: read tcp 192.168.25.165:57062->192.168.25.165:36655: read: connection reset by peer Jan 21 16:04:51 crc kubenswrapper[4707]: I0121 16:04:51.868408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:51 crc kubenswrapper[4707]: I0121 16:04:51.868784 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:51 crc kubenswrapper[4707]: I0121 16:04:51.973703 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:52 crc kubenswrapper[4707]: I0121 16:04:52.291332 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:52 crc kubenswrapper[4707]: I0121 16:04:52.943719 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbnb4"] Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.273916 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kbnb4" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="registry-server" containerID="cri-o://8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a" gracePeriod=2 Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.653641 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.704306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-catalog-content\") pod \"a2e118f5-6354-4e6a-b996-1a6a23037434\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.704680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-utilities\") pod \"a2e118f5-6354-4e6a-b996-1a6a23037434\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.704707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssptg\" (UniqueName: \"kubernetes.io/projected/a2e118f5-6354-4e6a-b996-1a6a23037434-kube-api-access-ssptg\") pod \"a2e118f5-6354-4e6a-b996-1a6a23037434\" (UID: \"a2e118f5-6354-4e6a-b996-1a6a23037434\") " Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.705511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-utilities" (OuterVolumeSpecName: "utilities") pod "a2e118f5-6354-4e6a-b996-1a6a23037434" (UID: "a2e118f5-6354-4e6a-b996-1a6a23037434"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.710271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e118f5-6354-4e6a-b996-1a6a23037434-kube-api-access-ssptg" (OuterVolumeSpecName: "kube-api-access-ssptg") pod "a2e118f5-6354-4e6a-b996-1a6a23037434" (UID: "a2e118f5-6354-4e6a-b996-1a6a23037434"). InnerVolumeSpecName "kube-api-access-ssptg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.806633 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:54 crc kubenswrapper[4707]: I0121 16:04:54.806661 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssptg\" (UniqueName: \"kubernetes.io/projected/a2e118f5-6354-4e6a-b996-1a6a23037434-kube-api-access-ssptg\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:55 crc kubenswrapper[4707]: E0121 16:04:55.133686 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:51832->192.168.25.165:36655: write tcp 192.168.25.165:51832->192.168.25.165:36655: write: broken pipe Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.281791 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerID="8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a" exitCode=0 Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.281855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbnb4" event={"ID":"a2e118f5-6354-4e6a-b996-1a6a23037434","Type":"ContainerDied","Data":"8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a"} Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.281882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbnb4" event={"ID":"a2e118f5-6354-4e6a-b996-1a6a23037434","Type":"ContainerDied","Data":"0504b225eaab6b8d69f986531f734cb8d427104e9f8ceed4da7e67fbd2000b81"} Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.281900 4707 scope.go:117] "RemoveContainer" containerID="8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.282013 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbnb4" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.297989 4707 scope.go:117] "RemoveContainer" containerID="37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.313526 4707 scope.go:117] "RemoveContainer" containerID="0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.344031 4707 scope.go:117] "RemoveContainer" containerID="8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a" Jan 21 16:04:55 crc kubenswrapper[4707]: E0121 16:04:55.344309 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a\": container with ID starting with 8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a not found: ID does not exist" containerID="8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.344341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a"} err="failed to get container status \"8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a\": rpc error: code = NotFound desc = could not find container \"8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a\": container with ID starting with 8da5977f29ede0447372fa8c0f94b879a3a95d012c756ff79fa5f9aa5be8352a not found: ID does not exist" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.344360 4707 scope.go:117] "RemoveContainer" containerID="37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca" Jan 21 16:04:55 crc kubenswrapper[4707]: E0121 16:04:55.344741 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca\": container with ID starting with 37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca not found: ID does not exist" containerID="37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.344771 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca"} err="failed to get container status \"37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca\": rpc error: code = NotFound desc = could not find container \"37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca\": container with ID starting with 37e978e345c33b93438bf8d6134d42970017e46165a421223a7dd3389e90f4ca not found: ID does not exist" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.344792 4707 scope.go:117] "RemoveContainer" containerID="0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c" Jan 21 16:04:55 crc kubenswrapper[4707]: E0121 16:04:55.345228 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c\": container with ID starting with 0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c not found: ID does not exist" containerID="0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c" 
Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.345254 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c"} err="failed to get container status \"0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c\": rpc error: code = NotFound desc = could not find container \"0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c\": container with ID starting with 0807c8817b7f01beb37d3260f66bf3efeee01e9d37d6643117fffa1e0760183c not found: ID does not exist" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.754725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2e118f5-6354-4e6a-b996-1a6a23037434" (UID: "a2e118f5-6354-4e6a-b996-1a6a23037434"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.821504 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e118f5-6354-4e6a-b996-1a6a23037434-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.907970 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbnb4"] Jan 21 16:04:55 crc kubenswrapper[4707]: I0121 16:04:55.914642 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kbnb4"] Jan 21 16:04:57 crc kubenswrapper[4707]: I0121 16:04:57.189928 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" path="/var/lib/kubelet/pods/a2e118f5-6354-4e6a-b996-1a6a23037434/volumes" Jan 21 16:05:09 crc kubenswrapper[4707]: I0121 16:05:09.946106 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:05:09 crc kubenswrapper[4707]: I0121 16:05:09.946473 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:05:11 crc kubenswrapper[4707]: I0121 16:05:11.509991 4707 scope.go:117] "RemoveContainer" containerID="693aa9dee02beefab5f62e72db83696878302f9d43403043bdbdd0a96bbd28b4" Jan 21 16:05:11 crc kubenswrapper[4707]: I0121 16:05:11.525325 4707 scope.go:117] "RemoveContainer" containerID="8e2da5aaf70f2dca26c0a7481b9da9f4a95637fa22e3641c79f7ab645fd6829e" Jan 21 16:05:11 crc kubenswrapper[4707]: I0121 16:05:11.541513 4707 scope.go:117] "RemoveContainer" containerID="e7b31926efdf6f8be375bbbffde94813f498decaf60c78ae20ff653568fe6c82" Jan 21 16:05:13 crc kubenswrapper[4707]: E0121 16:05:13.015144 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:60968->192.168.25.165:36655: write tcp 192.168.25.165:60968->192.168.25.165:36655: write: broken pipe Jan 21 16:05:27 crc kubenswrapper[4707]: E0121 16:05:27.242077 4707 upgradeaware.go:427] Error proxying data 
from client to backend: readfrom tcp 192.168.25.165:42318->192.168.25.165:36655: write tcp 192.168.25.165:42318->192.168.25.165:36655: write: broken pipe Jan 21 16:05:35 crc kubenswrapper[4707]: E0121 16:05:35.478390 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:55166->192.168.25.165:36655: read tcp 192.168.25.165:55166->192.168.25.165:36655: read: connection reset by peer Jan 21 16:05:39 crc kubenswrapper[4707]: I0121 16:05:39.945440 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:05:39 crc kubenswrapper[4707]: I0121 16:05:39.945929 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:05:39 crc kubenswrapper[4707]: I0121 16:05:39.945968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:05:39 crc kubenswrapper[4707]: I0121 16:05:39.946779 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:05:39 crc kubenswrapper[4707]: I0121 16:05:39.946846 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" gracePeriod=600 Jan 21 16:05:40 crc kubenswrapper[4707]: E0121 16:05:40.070387 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:05:40 crc kubenswrapper[4707]: I0121 16:05:40.585794 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" exitCode=0 Jan 21 16:05:40 crc kubenswrapper[4707]: I0121 16:05:40.585876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391"} Jan 21 16:05:40 crc kubenswrapper[4707]: I0121 16:05:40.585932 4707 scope.go:117] "RemoveContainer" containerID="c95e964b40a0887841892803aa9f77fc9d3e35044d810c4610890872e712c158" Jan 21 16:05:40 crc kubenswrapper[4707]: I0121 16:05:40.586701 4707 scope.go:117] 
"RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:05:40 crc kubenswrapper[4707]: E0121 16:05:40.587021 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:05:51 crc kubenswrapper[4707]: I0121 16:05:51.182624 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:05:51 crc kubenswrapper[4707]: E0121 16:05:51.183279 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:06:03 crc kubenswrapper[4707]: I0121 16:06:03.188551 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:06:03 crc kubenswrapper[4707]: E0121 16:06:03.189171 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.861952 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.863407 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="probe" containerID="cri-o://d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c" gracePeriod=30 Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.863613 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" containerID="cri-o://f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44" gracePeriod=30 Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.919640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.937737 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kw7d7"] Jan 21 16:06:08 crc kubenswrapper[4707]: E0121 16:06:08.938281 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="registry-server" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.938299 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" 
containerName="registry-server" Jan 21 16:06:08 crc kubenswrapper[4707]: E0121 16:06:08.938313 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="extract-content" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.938319 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="extract-content" Jan 21 16:06:08 crc kubenswrapper[4707]: E0121 16:06:08.938337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="extract-utilities" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.938343 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="extract-utilities" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.938491 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e118f5-6354-4e6a-b996-1a6a23037434" containerName="registry-server" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.939068 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.947181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.951035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kw7d7"] Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.967173 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.967350 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" containerName="openstackclient" containerID="cri-o://2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1" gracePeriod=2 Jan 21 16:06:08 crc kubenswrapper[4707]: I0121 16:06:08.988129 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.005274 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc"] Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.005667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" containerName="openstackclient" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.005680 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" containerName="openstackclient" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.005914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" containerName="openstackclient" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.006478 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.014147 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.021149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc"] Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.026696 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.026737 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data podName:50ded377-6ace-4f39-8a60-bee575f7e8cc nodeName:}" failed. No retries permitted until 2026-01-21 16:06:09.526724441 +0000 UTC m=+3866.708240662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data") pod "rabbitmq-server-0" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc") : configmap "rabbitmq-config-data" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.041394 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.042922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.089612 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.095172 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.120704 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fh2md"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.127923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40470182-aaeb-43cd-b058-942f0ebe2483-logs\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.127960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-operator-scripts\") pod \"glance-82ac-account-create-update-6w6kc\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128170 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxvx\" (UniqueName: \"kubernetes.io/projected/40470182-aaeb-43cd-b058-942f0ebe2483-kube-api-access-9hxvx\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data-custom\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts\") pod \"root-account-create-update-kw7d7\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vwb\" (UniqueName: \"kubernetes.io/projected/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-kube-api-access-p8vwb\") pod \"glance-82ac-account-create-update-6w6kc\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25v2r\" (UniqueName: 
\"kubernetes.io/projected/2b620e1f-426b-4363-ad99-f56ba333fbfd-kube-api-access-25v2r\") pod \"root-account-create-update-kw7d7\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.128574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.131329 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fh2md"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.164497 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.179946 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.202614 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdf2ebb-d634-4cce-b6ea-573e7aeae062" path="/var/lib/kubelet/pods/abdf2ebb-d634-4cce-b6ea-573e7aeae062/volumes" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.216957 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-789wf"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data-custom\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62fd99ff-fa07-4608-8a30-69ce0bbe814b-logs\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts\") pod \"root-account-create-update-kw7d7\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data-custom\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vwb\" (UniqueName: \"kubernetes.io/projected/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-kube-api-access-p8vwb\") pod \"glance-82ac-account-create-update-6w6kc\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25v2r\" (UniqueName: \"kubernetes.io/projected/2b620e1f-426b-4363-ad99-f56ba333fbfd-kube-api-access-25v2r\") pod \"root-account-create-update-kw7d7\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvl5k\" (UniqueName: \"kubernetes.io/projected/62fd99ff-fa07-4608-8a30-69ce0bbe814b-kube-api-access-hvl5k\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40470182-aaeb-43cd-b058-942f0ebe2483-logs\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.232984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-operator-scripts\") pod \"glance-82ac-account-create-update-6w6kc\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.233089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: 
\"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.233120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxvx\" (UniqueName: \"kubernetes.io/projected/40470182-aaeb-43cd-b058-942f0ebe2483-kube-api-access-9hxvx\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.234161 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.234208 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle podName:40470182-aaeb-43cd-b058-942f0ebe2483 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:09.734195263 +0000 UTC m=+3866.915711485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle") pod "barbican-worker-8b5858557-7fxnc" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483") : secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.234839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts\") pod \"root-account-create-update-kw7d7\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.235317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40470182-aaeb-43cd-b058-942f0ebe2483-logs\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.235788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-operator-scripts\") pod \"glance-82ac-account-create-update-6w6kc\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.250868 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-789wf"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.255027 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.256157 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.267231 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.273972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data-custom\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.274949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.285139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.299257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25v2r\" (UniqueName: \"kubernetes.io/projected/2b620e1f-426b-4363-ad99-f56ba333fbfd-kube-api-access-25v2r\") pod \"root-account-create-update-kw7d7\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.299258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vwb\" (UniqueName: \"kubernetes.io/projected/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-kube-api-access-p8vwb\") pod \"glance-82ac-account-create-update-6w6kc\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.299657 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.313255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxvx\" (UniqueName: \"kubernetes.io/projected/40470182-aaeb-43cd-b058-942f0ebe2483-kube-api-access-9hxvx\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.335705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.335775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62fd99ff-fa07-4608-8a30-69ce0bbe814b-logs\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.335939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data-custom\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.336032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9c62ea6-096a-4f1a-9326-e14e3ce91911-operator-scripts\") pod \"cinder-fcb5-account-create-update-nfc55\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.336086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvl5k\" (UniqueName: \"kubernetes.io/projected/62fd99ff-fa07-4608-8a30-69ce0bbe814b-kube-api-access-hvl5k\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.336109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.336143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gklzr\" (UniqueName: \"kubernetes.io/projected/b9c62ea6-096a-4f1a-9326-e14e3ce91911-kube-api-access-gklzr\") pod \"cinder-fcb5-account-create-update-nfc55\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 
21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.338394 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.338458 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle podName:62fd99ff-fa07-4608-8a30-69ce0bbe814b nodeName:}" failed. No retries permitted until 2026-01-21 16:06:09.838442613 +0000 UTC m=+3867.019958835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle") pod "barbican-keystone-listener-594ddb85bd-8glq2" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b") : secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.341423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62fd99ff-fa07-4608-8a30-69ce0bbe814b-logs\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.341513 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.341551 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle podName:f74b5687-ee85-4b71-929c-cc6ac63ffd06 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:09.841537502 +0000 UTC m=+3867.023053724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle") pod "openstack-galera-0" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06") : secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.342246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.342997 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.345740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.351380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data-custom\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.369040 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.370623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.379125 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.379995 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.380660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.394194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.408541 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-f8hm9"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.415295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvl5k\" (UniqueName: \"kubernetes.io/projected/62fd99ff-fa07-4608-8a30-69ce0bbe814b-kube-api-access-hvl5k\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.438912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9c62ea6-096a-4f1a-9326-e14e3ce91911-operator-scripts\") pod \"cinder-fcb5-account-create-update-nfc55\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.438993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gklzr\" (UniqueName: \"kubernetes.io/projected/b9c62ea6-096a-4f1a-9326-e14e3ce91911-kube-api-access-gklzr\") pod \"cinder-fcb5-account-create-update-nfc55\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.439043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q9sb\" (UniqueName: \"kubernetes.io/projected/491a0b28-d1e3-4359-af60-d6fb1767153f-kube-api-access-9q9sb\") pod \"placement-7ef3-account-create-update-g8jgp\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.439090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/491a0b28-d1e3-4359-af60-d6fb1767153f-operator-scripts\") pod \"placement-7ef3-account-create-update-g8jgp\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.439206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrvkh\" (UniqueName: \"kubernetes.io/projected/f3838495-e77f-4884-b7e3-ca30b8c6e13e-kube-api-access-xrvkh\") pod \"nova-api-488e-account-create-update-6kvnh\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.439341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f3838495-e77f-4884-b7e3-ca30b8c6e13e-operator-scripts\") pod \"nova-api-488e-account-create-update-6kvnh\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.451450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9c62ea6-096a-4f1a-9326-e14e3ce91911-operator-scripts\") pod \"cinder-fcb5-account-create-update-nfc55\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.478848 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.479059 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api-log" containerID="cri-o://5e7ec7f63a2fc9f2128d4f3cf1c08e92faaab92b0eaa704e50baa0f7c8e983bc" gracePeriod=30 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.479178 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api" containerID="cri-o://030c52d40b8a7c37e53cff77e05fa947476307c66e7a847d083b6b3c32443d7f" gracePeriod=30 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.484849 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.490404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.524592 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.524629 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-27bp2"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.540895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q9sb\" (UniqueName: \"kubernetes.io/projected/491a0b28-d1e3-4359-af60-d6fb1767153f-kube-api-access-9q9sb\") pod \"placement-7ef3-account-create-update-g8jgp\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.540968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/491a0b28-d1e3-4359-af60-d6fb1767153f-operator-scripts\") pod \"placement-7ef3-account-create-update-g8jgp\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.541080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrvkh\" (UniqueName: \"kubernetes.io/projected/f3838495-e77f-4884-b7e3-ca30b8c6e13e-kube-api-access-xrvkh\") pod \"nova-api-488e-account-create-update-6kvnh\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " 
pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.541216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3838495-e77f-4884-b7e3-ca30b8c6e13e-operator-scripts\") pod \"nova-api-488e-account-create-update-6kvnh\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.542347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/491a0b28-d1e3-4359-af60-d6fb1767153f-operator-scripts\") pod \"placement-7ef3-account-create-update-g8jgp\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.542557 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.542602 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data podName:50ded377-6ace-4f39-8a60-bee575f7e8cc nodeName:}" failed. No retries permitted until 2026-01-21 16:06:10.542590248 +0000 UTC m=+3867.724106459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data") pod "rabbitmq-server-0" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc") : configmap "rabbitmq-config-data" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.543207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3838495-e77f-4884-b7e3-ca30b8c6e13e-operator-scripts\") pod \"nova-api-488e-account-create-update-6kvnh\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.551343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gklzr\" (UniqueName: \"kubernetes.io/projected/b9c62ea6-096a-4f1a-9326-e14e3ce91911-kube-api-access-gklzr\") pod \"cinder-fcb5-account-create-update-nfc55\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.556667 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.556977 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="openstack-network-exporter" containerID="cri-o://4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c" gracePeriod=300 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.612350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q9sb\" (UniqueName: \"kubernetes.io/projected/491a0b28-d1e3-4359-af60-d6fb1767153f-kube-api-access-9q9sb\") pod \"placement-7ef3-account-create-update-g8jgp\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " 
pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.637301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrvkh\" (UniqueName: \"kubernetes.io/projected/f3838495-e77f-4884-b7e3-ca30b8c6e13e-kube-api-access-xrvkh\") pod \"nova-api-488e-account-create-update-6kvnh\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.645487 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.680477 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.711990 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-ptt6g"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.792317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.792900 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.792964 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle podName:40470182-aaeb-43cd-b058-942f0ebe2483 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:10.792948563 +0000 UTC m=+3867.974464786 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle") pod "barbican-worker-8b5858557-7fxnc" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483") : secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.801464 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.836949 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.847110 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.878760 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.881283 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="ovn-northd" containerID="cri-o://77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" gracePeriod=30 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.882746 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="openstack-network-exporter" containerID="cri-o://b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6" gracePeriod=30 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.896527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.896838 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.896880 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle podName:f74b5687-ee85-4b71-929c-cc6ac63ffd06 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:10.896863778 +0000 UTC m=+3868.078380000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle") pod "openstack-galera-0" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06") : secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.896969 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: E0121 16:06:09.897056 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle podName:62fd99ff-fa07-4608-8a30-69ce0bbe814b nodeName:}" failed. No retries permitted until 2026-01-21 16:06:10.897028697 +0000 UTC m=+3868.078544919 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle") pod "barbican-keystone-listener-594ddb85bd-8glq2" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b") : secret "combined-ca-bundle" not found Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.911163 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerID="5e7ec7f63a2fc9f2128d4f3cf1c08e92faaab92b0eaa704e50baa0f7c8e983bc" exitCode=143 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.911206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a3b559c3-4e82-4897-9b5c-1816989fffcd","Type":"ContainerDied","Data":"5e7ec7f63a2fc9f2128d4f3cf1c08e92faaab92b0eaa704e50baa0f7c8e983bc"} Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.954764 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="ovsdbserver-nb" containerID="cri-o://00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08" gracePeriod=300 Jan 21 16:06:09 crc kubenswrapper[4707]: I0121 16:06:09.969536 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-j98pz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.020347 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.089253 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.090541 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.093171 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.107012 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-25rcj"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.112376 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.115146 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.117220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.129741 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.152397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.152785 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="openstack-network-exporter" containerID="cri-o://1d9f4ad28c410a19a1a52d47576b0ff87741fce0b1aca9cbf91d3cf607f6137a" gracePeriod=300 Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.199686 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.206275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.207402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66adee9-b1b8-4d17-b204-58e840cf62b3-operator-scripts\") pod \"nova-cell1-24b2-account-create-update-hstnz\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.207542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42c42\" (UniqueName: \"kubernetes.io/projected/f66adee9-b1b8-4d17-b204-58e840cf62b3-kube-api-access-42c42\") pod \"nova-cell1-24b2-account-create-update-hstnz\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.207591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f9818d-d674-41dc-8f81-7095412fd946-operator-scripts\") pod \"barbican-4b91-account-create-update-cgjh8\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.207625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzc4\" (UniqueName: \"kubernetes.io/projected/44f9818d-d674-41dc-8f81-7095412fd946-kube-api-access-nvzc4\") pod \"barbican-4b91-account-create-update-cgjh8\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.211084 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.215018 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.215090 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data podName:1c37d5e5-4ffd-412c-a93a-86cb6474735c nodeName:}" failed. No retries permitted until 2026-01-21 16:06:10.715072684 +0000 UTC m=+3867.896588906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data") pod "rabbitmq-cell1-server-0" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.248714 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.248869 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="ovn-northd" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.254218 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="ovsdbserver-sb" containerID="cri-o://f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c" gracePeriod=300 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.302318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.316407 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f9818d-d674-41dc-8f81-7095412fd946-operator-scripts\") pod \"barbican-4b91-account-create-update-cgjh8\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.316480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzc4\" (UniqueName: \"kubernetes.io/projected/44f9818d-d674-41dc-8f81-7095412fd946-kube-api-access-nvzc4\") pod \"barbican-4b91-account-create-update-cgjh8\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.316683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f66adee9-b1b8-4d17-b204-58e840cf62b3-operator-scripts\") pod \"nova-cell1-24b2-account-create-update-hstnz\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.316828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42c42\" (UniqueName: \"kubernetes.io/projected/f66adee9-b1b8-4d17-b204-58e840cf62b3-kube-api-access-42c42\") pod \"nova-cell1-24b2-account-create-update-hstnz\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.317242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f9818d-d674-41dc-8f81-7095412fd946-operator-scripts\") pod \"barbican-4b91-account-create-update-cgjh8\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.317753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66adee9-b1b8-4d17-b204-58e840cf62b3-operator-scripts\") pod \"nova-cell1-24b2-account-create-update-hstnz\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.341011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7ckwj"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.354070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42c42\" (UniqueName: \"kubernetes.io/projected/f66adee9-b1b8-4d17-b204-58e840cf62b3-kube-api-access-42c42\") pod \"nova-cell1-24b2-account-create-update-hstnz\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.361052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzc4\" (UniqueName: \"kubernetes.io/projected/44f9818d-d674-41dc-8f81-7095412fd946-kube-api-access-nvzc4\") pod \"barbican-4b91-account-create-update-cgjh8\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.362871 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-7ckwj"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.380856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-kzkp8"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.386900 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-kzkp8"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.395069 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.400216 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-44f4-account-create-update-54jdg"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.407865 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.413952 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-a0e7-account-create-update-82vfd"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.419496 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-ntkqn"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.426430 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-ntkqn"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.441071 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-p8pwz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.449936 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-p8pwz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.451874 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.466902 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-bncrz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.468623 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.468984 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.477057 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-b8pq9"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.480844 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tnwlj"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.481724 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.493943 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tnwlj"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.495395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.495597 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-log" containerID="cri-o://9201040871e23d6bc2a379fa2022d38ff6b9347b086e57318e146ace3cc99e12" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.496190 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-httpd" containerID="cri-o://01def0d7c0c313086bab3b5dc9cadb48b9024281321ff3f9e84b864c2e80cc95" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.502501 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-drb2f"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.509845 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-drb2f"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.513066 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.520295 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.520675 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-server" containerID="cri-o://42e0fb7f1e1361b1e98d0c8a660d5f6cbd7733ca6677f678924d9b430951360e" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521011 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="swift-recon-cron" containerID="cri-o://a2ba006a14c1f79b7783113eb2fc356a7828b8e8106c77fb842a484c74dc2c5a" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521065 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="rsync" containerID="cri-o://bed3e53b753f14dc32e9b1b6ab9f45190e2e849c1bd85308d5fb68b7ea133e5f" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521094 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-expirer" containerID="cri-o://6f893355c882d903714e129e409f5ce4d4643085b9619147b4afbb2525f3b1e6" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521124 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" 
containerName="object-updater" containerID="cri-o://e47594c788362325c27848cc7da59200a8816e14df22f36bd527fc0d841b218a" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521302 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-auditor" containerID="cri-o://c72a20daf0844253b29487d24bb0e8d55053110bc5b67076c0d93fa4b17a0fcf" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521392 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-replicator" containerID="cri-o://90550d64477b14b9ef1a55ac4d5a33af74701b40b0ce866af985b7931988d125" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521425 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-server" containerID="cri-o://c50db20a23ae77d375778f281e89cb6339b6ce4bdaa193bba212e82b99bf20f3" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521455 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-updater" containerID="cri-o://83dd8a63c20c1ec70a57c216ecd7483961393678c54904f2df7480290173458d" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521495 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-auditor" containerID="cri-o://6c38bd2f0b39209b01d982cd4fee04e83c6c81e7941bf608676b8abd00f2a31b" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521521 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-replicator" containerID="cri-o://20bd79ad3c444a59a7f1c99aa1748e0831cf7601587ba19e2fad1093aa7ed199" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521547 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-server" containerID="cri-o://d0267ab63d88eb5d69c8abe4f8eab3c146378376ad2f3f267d6c6d0e34f4b1e9" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.521918 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-auditor" containerID="cri-o://cf5083b5384e2bc9efc2cd7c1c61330323f0a7a94248846ec194cdbbdf9f9085" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.522018 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-reaper" containerID="cri-o://13acb9441c4c075d19f2e5354fe2e984c78df04fd88ec52f38421d65c42ab238" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.522062 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-storage-0" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-replicator" containerID="cri-o://6340fee114b99b10644db682b6b0b87773342ed0330aee24fb6bb4739c6c6b4b" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.523007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kw7d7"] Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.573375 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:10 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 16:06:10 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 16:06:10 crc kubenswrapper[4707]: else Jan 21 16:06:10 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:10 crc kubenswrapper[4707]: fi Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:10 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:10 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:10 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:10 crc kubenswrapper[4707]: # support updates Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.575025 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" podUID="6af371a2-cf51-495d-bd0e-a5561b6b9d7c" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.594552 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-cf495cb46-rcdjz"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.594764 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-log" containerID="cri-o://32a92f7f2012dcf1fe6f0d9760599775639a7067de28634de63ab2fdd9b6962c" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.596197 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-api" containerID="cri-o://1cfbfa0717635e65d7326b7c6d12c3ff52804eea66d22885b209dc75b069dd24" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.626177 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.626234 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data podName:50ded377-6ace-4f39-8a60-bee575f7e8cc nodeName:}" failed. No retries permitted until 2026-01-21 16:06:12.626220967 +0000 UTC m=+3869.807737190 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data") pod "rabbitmq-server-0" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc") : configmap "rabbitmq-config-data" not found Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.702340 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.727659 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.730359 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.730406 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data podName:1c37d5e5-4ffd-412c-a93a-86cb6474735c nodeName:}" failed. No retries permitted until 2026-01-21 16:06:11.73039564 +0000 UTC m=+3868.911911862 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data") pod "rabbitmq-cell1-server-0" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.741535 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.791887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.792140 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-log" containerID="cri-o://798d6d03c5f9d0f6851d346fe3cbe26f854510991f5d3c2cad7db678016a87fb" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.792594 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-httpd" containerID="cri-o://e161f3943e924491a50c0f1e70e67b3a00296b6236a4e22683928578bbfa1875" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.833627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.833743 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.833823 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle podName:40470182-aaeb-43cd-b058-942f0ebe2483 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:12.833776339 +0000 UTC m=+3870.015292562 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle") pod "barbican-worker-8b5858557-7fxnc" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483") : secret "combined-ca-bundle" not found Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.834288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.834389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.834421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hb8\" (UniqueName: \"kubernetes.io/projected/044a3d9d-de3b-4a2b-8007-9d21847149a5-kube-api-access-h4hb8\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.872034 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.883725 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.883948 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" containerID="cri-o://319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.884054 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" containerID="cri-o://75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6" gracePeriod=30 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.913706 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gbpp8"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.920601 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.926149 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d840a57-4237-4336-9882-01a52c8a2c09" containerID="b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6" exitCode=2 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.926249 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gbpp8"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.926274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6d840a57-4237-4336-9882-01a52c8a2c09","Type":"ContainerDied","Data":"b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.932343 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerName="rabbitmq" containerID="cri-o://d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0" gracePeriod=604800 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.933134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" event={"ID":"6af371a2-cf51-495d-bd0e-a5561b6b9d7c","Type":"ContainerStarted","Data":"3edec84002032cfbc28619c2d50cf3a6f1546f0653ccdf4367b8ef8388cd27fe"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.936522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.936565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.936634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.936639 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.936655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hb8\" (UniqueName: \"kubernetes.io/projected/044a3d9d-de3b-4a2b-8007-9d21847149a5-kube-api-access-h4hb8\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.936693 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle podName:62fd99ff-fa07-4608-8a30-69ce0bbe814b nodeName:}" failed. No retries permitted until 2026-01-21 16:06:12.936678379 +0000 UTC m=+3870.118194601 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle") pod "barbican-keystone-listener-594ddb85bd-8glq2" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b") : secret "combined-ca-bundle" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.937747 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.937773 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle podName:f74b5687-ee85-4b71-929c-cc6ac63ffd06 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:12.937766214 +0000 UTC m=+3870.119282437 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle") pod "openstack-galera-0" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06") : secret "combined-ca-bundle" not found Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.937515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.938197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.945778 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:10 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: if [ -n "glance" ]; then Jan 21 16:06:10 crc kubenswrapper[4707]: GRANT_DATABASE="glance" Jan 21 16:06:10 crc kubenswrapper[4707]: else Jan 21 16:06:10 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:10 crc kubenswrapper[4707]: fi Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:10 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:10 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:10 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:10 crc kubenswrapper[4707]: # support updates Jan 21 16:06:10 crc kubenswrapper[4707]: Jan 21 16:06:10 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:10 crc kubenswrapper[4707]: E0121 16:06:10.948775 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" podUID="6af371a2-cf51-495d-bd0e-a5561b6b9d7c" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.959335 4707 generic.go:334] "Generic (PLEG): container finished" podID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerID="32a92f7f2012dcf1fe6f0d9760599775639a7067de28634de63ab2fdd9b6962c" exitCode=143 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.959394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" event={"ID":"eff12a6c-5d02-4e45-9e7e-e589b746f1dc","Type":"ContainerDied","Data":"32a92f7f2012dcf1fe6f0d9760599775639a7067de28634de63ab2fdd9b6962c"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.972307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hb8\" (UniqueName: \"kubernetes.io/projected/044a3d9d-de3b-4a2b-8007-9d21847149a5-kube-api-access-h4hb8\") pod \"dnsmasq-dnsmasq-84b9f45d47-l2ftn\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978283 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="6f893355c882d903714e129e409f5ce4d4643085b9619147b4afbb2525f3b1e6" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978308 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="e47594c788362325c27848cc7da59200a8816e14df22f36bd527fc0d841b218a" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978315 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="c72a20daf0844253b29487d24bb0e8d55053110bc5b67076c0d93fa4b17a0fcf" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978323 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="90550d64477b14b9ef1a55ac4d5a33af74701b40b0ce866af985b7931988d125" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978331 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="c50db20a23ae77d375778f281e89cb6339b6ce4bdaa193bba212e82b99bf20f3" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978336 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="83dd8a63c20c1ec70a57c216ecd7483961393678c54904f2df7480290173458d" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978342 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="6c38bd2f0b39209b01d982cd4fee04e83c6c81e7941bf608676b8abd00f2a31b" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978462 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="20bd79ad3c444a59a7f1c99aa1748e0831cf7601587ba19e2fad1093aa7ed199" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978471 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="d0267ab63d88eb5d69c8abe4f8eab3c146378376ad2f3f267d6c6d0e34f4b1e9" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978477 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="13acb9441c4c075d19f2e5354fe2e984c78df04fd88ec52f38421d65c42ab238" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978482 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="cf5083b5384e2bc9efc2cd7c1c61330323f0a7a94248846ec194cdbbdf9f9085" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978488 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="6340fee114b99b10644db682b6b0b87773342ed0330aee24fb6bb4739c6c6b4b" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"6f893355c882d903714e129e409f5ce4d4643085b9619147b4afbb2525f3b1e6"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"e47594c788362325c27848cc7da59200a8816e14df22f36bd527fc0d841b218a"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"c72a20daf0844253b29487d24bb0e8d55053110bc5b67076c0d93fa4b17a0fcf"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"90550d64477b14b9ef1a55ac4d5a33af74701b40b0ce866af985b7931988d125"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"c50db20a23ae77d375778f281e89cb6339b6ce4bdaa193bba212e82b99bf20f3"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"83dd8a63c20c1ec70a57c216ecd7483961393678c54904f2df7480290173458d"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"6c38bd2f0b39209b01d982cd4fee04e83c6c81e7941bf608676b8abd00f2a31b"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"20bd79ad3c444a59a7f1c99aa1748e0831cf7601587ba19e2fad1093aa7ed199"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"d0267ab63d88eb5d69c8abe4f8eab3c146378376ad2f3f267d6c6d0e34f4b1e9"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"13acb9441c4c075d19f2e5354fe2e984c78df04fd88ec52f38421d65c42ab238"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"cf5083b5384e2bc9efc2cd7c1c61330323f0a7a94248846ec194cdbbdf9f9085"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.978638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"6340fee114b99b10644db682b6b0b87773342ed0330aee24fb6bb4739c6c6b4b"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.988170 4707 generic.go:334] "Generic (PLEG): container finished" podID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerID="9201040871e23d6bc2a379fa2022d38ff6b9347b086e57318e146ace3cc99e12" exitCode=143 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.988220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"61223200-c7c1-4ee8-b38a-e6074d65114a","Type":"ContainerDied","Data":"9201040871e23d6bc2a379fa2022d38ff6b9347b086e57318e146ace3cc99e12"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.993519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" event={"ID":"2b620e1f-426b-4363-ad99-f56ba333fbfd","Type":"ContainerStarted","Data":"841a0e20c33f319acc9564e8dedd6e8d4f9ea14e977f1cf34ce8c8474652f992"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.993571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" event={"ID":"2b620e1f-426b-4363-ad99-f56ba333fbfd","Type":"ContainerStarted","Data":"a238d128e70958c0ba0c4a4ae2ac1ef76b53970869e6b8a51d32db231151b274"} Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.998649 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zt2mk"] Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.998930 4707 generic.go:334] "Generic (PLEG): container finished" podID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerID="d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c" exitCode=0 Jan 21 16:06:10 crc kubenswrapper[4707]: I0121 16:06:10.998999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerDied","Data":"d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c"} Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.000793 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_26d9bdaa-7c4d-41c6-be5e-ea602171d57d/ovsdbserver-nb/0.log" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.000870 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.001718 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_26d9bdaa-7c4d-41c6-be5e-ea602171d57d/ovsdbserver-nb/0.log" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.001758 4707 generic.go:334] "Generic (PLEG): container finished" podID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerID="4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c" exitCode=2 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.001771 4707 generic.go:334] "Generic (PLEG): container finished" podID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerID="00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08" exitCode=143 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.001803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"26d9bdaa-7c4d-41c6-be5e-ea602171d57d","Type":"ContainerDied","Data":"4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c"} Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.001837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"26d9bdaa-7c4d-41c6-be5e-ea602171d57d","Type":"ContainerDied","Data":"00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08"} Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.001851 4707 scope.go:117] "RemoveContainer" containerID="4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.013137 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_c1e538db-f225-4c4a-a147-f2bcf3218f28/ovsdbserver-sb/0.log" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.013171 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerID="1d9f4ad28c410a19a1a52d47576b0ff87741fce0b1aca9cbf91d3cf607f6137a" exitCode=2 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.013183 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerID="f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c" exitCode=143 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.013217 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-zt2mk"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.013236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1e538db-f225-4c4a-a147-f2bcf3218f28","Type":"ContainerDied","Data":"1d9f4ad28c410a19a1a52d47576b0ff87741fce0b1aca9cbf91d3cf607f6137a"} Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.013248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1e538db-f225-4c4a-a147-f2bcf3218f28","Type":"ContainerDied","Data":"f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c"} Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.018641 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/root-account-create-update-kw7d7" podStartSLOduration=3.018626799 podStartE2EDuration="3.018626799s" podCreationTimestamp="2026-01-21 16:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:06:11.005349493 +0000 UTC m=+3868.186865716" watchObservedRunningTime="2026-01-21 16:06:11.018626799 +0000 UTC m=+3868.200143021" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.020141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.020336 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-log" containerID="cri-o://170b32ec6e5a16b7ab4b852d6a2315b320618427d7554c7752c0c21ea1350ffa" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.020733 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-metadata" containerID="cri-o://7cfc0f593e80d12c7b3d86938139b541aeeef5f717713f7020df220d37653d45" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.049801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-config\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.049884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-scripts\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.049904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdbserver-nb-tls-certs\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.049942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-metrics-certs-tls-certs\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.049963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.050029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfks7\" (UniqueName: \"kubernetes.io/projected/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-kube-api-access-rfks7\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.050087 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-combined-ca-bundle\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.051849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdb-rundir\") pod \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\" (UID: \"26d9bdaa-7c4d-41c6-be5e-ea602171d57d\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.052024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-scripts" (OuterVolumeSpecName: "scripts") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.052348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.056573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.058012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-kube-api-access-rfks7" (OuterVolumeSpecName: "kube-api-access-rfks7") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "kube-api-access-rfks7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.060852 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfks7\" (UniqueName: \"kubernetes.io/projected/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-kube-api-access-rfks7\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.060874 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.060883 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.060921 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.064059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-config" (OuterVolumeSpecName: "config") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.078660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.085910 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.100693 4707 scope.go:117] "RemoveContainer" containerID="00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.114785 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c is running failed: container process not found" containerID="f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.115890 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c is running failed: container process not found" containerID="f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.116486 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c is running failed: container process not found" containerID="f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.116516 4707 
prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="ovsdbserver-sb" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.135514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.163732 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.163758 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.163769 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.165430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.166551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "26d9bdaa-7c4d-41c6-be5e-ea602171d57d" (UID: "26d9bdaa-7c4d-41c6-be5e-ea602171d57d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.167217 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.220432 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025cd7bf-ed7d-4465-8d3a-60257e90bdef" path="/var/lib/kubelet/pods/025cd7bf-ed7d-4465-8d3a-60257e90bdef/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.220981 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c55d01-3387-4a00-be62-2fa602109b06" path="/var/lib/kubelet/pods/11c55d01-3387-4a00-be62-2fa602109b06/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.221491 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c611a8f-8f60-428c-9fb9-c2009be3534b" path="/var/lib/kubelet/pods/1c611a8f-8f60-428c-9fb9-c2009be3534b/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.221988 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22499a49-41fa-4c4f-9e22-16f64c8757b4" path="/var/lib/kubelet/pods/22499a49-41fa-4c4f-9e22-16f64c8757b4/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.222885 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343110b7-9660-4e3f-a56f-d7ee9f89467d" path="/var/lib/kubelet/pods/343110b7-9660-4e3f-a56f-d7ee9f89467d/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.223421 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5074e861-4d4c-4290-9c96-206e7d0b8f6d" path="/var/lib/kubelet/pods/5074e861-4d4c-4290-9c96-206e7d0b8f6d/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.223906 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae" path="/var/lib/kubelet/pods/5341cdbb-80e0-4eec-b6a9-3f7d97fcb4ae/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.229250 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="643f3f30-7d67-4b8b-af44-3637ad271586" path="/var/lib/kubelet/pods/643f3f30-7d67-4b8b-af44-3637ad271586/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.229261 4707 scope.go:117] "RemoveContainer" containerID="4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.229764 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6679ca9d-4a6e-446d-bf8a-fb91fc35f740" path="/var/lib/kubelet/pods/6679ca9d-4a6e-446d-bf8a-fb91fc35f740/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.230293 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671e177f-2f9d-42bc-b150-5efee0b71d9c" path="/var/lib/kubelet/pods/671e177f-2f9d-42bc-b150-5efee0b71d9c/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.231191 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dbab3d8-0477-4503-a6d1-310a3c0dfda3" path="/var/lib/kubelet/pods/8dbab3d8-0477-4503-a6d1-310a3c0dfda3/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.231722 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff" path="/var/lib/kubelet/pods/93fe5a42-4cc9-4ebc-af4e-b2dd20fd10ff/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.232214 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9eb7db96-fb55-4d76-9275-d06f77115530" path="/var/lib/kubelet/pods/9eb7db96-fb55-4d76-9275-d06f77115530/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.232785 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c\": container with ID starting with 4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c not found: ID does not exist" containerID="4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.232831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c"} err="failed to get container status \"4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c\": rpc error: code = NotFound desc = could not find container \"4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c\": container with ID starting with 4688dd5dff8784e8324c5c01d6956e485e109b60beee127ba3c0481c1a4d654c not found: ID does not exist" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.232849 4707 scope.go:117] "RemoveContainer" containerID="00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.233055 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08\": container with ID starting with 00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08 not found: ID does not exist" containerID="00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.233081 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08"} err="failed to get container status \"00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08\": rpc error: code = NotFound desc = could not find container \"00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08\": container with ID starting with 00b95f194604b5e3212a2a12501a418a6987fe77a87011d513dcf1b4bcd20a08 not found: ID does not exist" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.233325 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff8d805-9b36-4e84-a82b-aca4df5032cd" path="/var/lib/kubelet/pods/9ff8d805-9b36-4e84-a82b-aca4df5032cd/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.234014 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffba812-ff2b-453b-92fb-2e669935e242" path="/var/lib/kubelet/pods/9ffba812-ff2b-453b-92fb-2e669935e242/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.234498 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a" path="/var/lib/kubelet/pods/a82c42ae-ee37-49e7-bcbb-c6ec154b9a7a/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.234979 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8dc14bc-b18c-4937-9319-b73bb16aced4" path="/var/lib/kubelet/pods/a8dc14bc-b18c-4937-9319-b73bb16aced4/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.236274 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f4ac5b39-8cde-4b9d-9b74-b09b6be633bd" path="/var/lib/kubelet/pods/f4ac5b39-8cde-4b9d-9b74-b09b6be633bd/volumes" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.237002 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.237032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-9ln95"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.241305 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-9ln95"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.268438 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.268473 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9bdaa-7c4d-41c6-be5e-ea602171d57d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.269735 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.269924 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-log" containerID="cri-o://23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.270264 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-api" containerID="cri-o://5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.338303 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.363487 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-dsspk"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.390926 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.405678 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:11 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: if [ -n 
"placement" ]; then Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="placement" Jan 21 16:06:11 crc kubenswrapper[4707]: else Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:11 crc kubenswrapper[4707]: fi Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:11 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:11 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:11 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:11 crc kubenswrapper[4707]: # support updates Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.406487 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-dsspk"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.407092 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" podUID="491a0b28-d1e3-4359-af60-d6fb1767153f" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.407699 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:11 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 16:06:11 crc kubenswrapper[4707]: else Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:11 crc kubenswrapper[4707]: fi Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:11 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:11 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:11 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:11 crc kubenswrapper[4707]: # support updates Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.410940 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" podUID="f3838495-e77f-4884-b7e3-ca30b8c6e13e" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.411017 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.418474 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:11 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 16:06:11 crc kubenswrapper[4707]: else Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:11 crc kubenswrapper[4707]: fi Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:11 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:11 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:11 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:11 crc kubenswrapper[4707]: # support updates Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.420642 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" podUID="b9c62ea6-096a-4f1a-9326-e14e3ce91911" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.438032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.439231 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.446060 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wwprz"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.452514 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-q6hgd"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.459855 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerName="galera" containerID="cri-o://6afcb360e0fbcf5cfa3da43b53ea242515a563c93456f93154197247f98df091" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.459979 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-wwprz"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.465943 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.470521 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-q6hgd"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.475351 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-6zdng"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.481654 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-6zdng"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.487529 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zfw44"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.492199 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-zfw44"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.493744 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_c1e538db-f225-4c4a-a147-f2bcf3218f28/ovsdbserver-sb/0.log" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.493802 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.498574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.539360 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.539767 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="273f261e-d7a5-4025-91a6-f9e102be79de" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.563520 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdbserver-sb-tls-certs\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmr4k\" (UniqueName: \"kubernetes.io/projected/c1e538db-f225-4c4a-a147-f2bcf3218f28-kube-api-access-kmr4k\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-metrics-certs-tls-certs\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdb-rundir\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-combined-ca-bundle\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-config\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.585974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.586029 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-scripts\") pod \"c1e538db-f225-4c4a-a147-f2bcf3218f28\" (UID: \"c1e538db-f225-4c4a-a147-f2bcf3218f28\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.587391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-scripts" (OuterVolumeSpecName: "scripts") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.588945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.589289 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-4vq77"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.589707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-config" (OuterVolumeSpecName: "config") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.595494 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.610962 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.611147 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.611378 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-central-agent" containerID="cri-o://ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.611787 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="proxy-httpd" containerID="cri-o://f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.611871 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="sg-core" containerID="cri-o://66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d" gracePeriod=30 Jan 21 16:06:11 crc 
kubenswrapper[4707]: I0121 16:06:11.611881 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-notification-agent" containerID="cri-o://3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.614960 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.618244 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.618394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e538db-f225-4c4a-a147-f2bcf3218f28-kube-api-access-kmr4k" (OuterVolumeSpecName: "kube-api-access-kmr4k") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "kube-api-access-kmr4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.627959 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x9hv5"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.634874 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.635768 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.649013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.656991 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.659894 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" podUID="62fd99ff-fa07-4608-8a30-69ce0bbe814b" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.664930 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc"] Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.665316 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" podUID="40470182-aaeb-43cd-b058-942f0ebe2483" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.670608 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.674790 4707 scope.go:117] "RemoveContainer" containerID="51cf5dc8a8e871c734ab5e17fbba7a3d81d15beef0fddedf1f9408ff06a74018" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.679829 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker-log" containerID="cri-o://58be21a4ad10793fe38a4fd7bab7b968f3bb33396de3d91f81bf652728f3cc32" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.680218 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker" containerID="cri-o://73b244ba7031f16385e7963d008d81acbd17ba6ba2ee40f7b1bdf960c503b84b" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.681880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.682145 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener-log" containerID="cri-o://a7d885ed9569b9160b3891c8377f8b2c583f8d5d64b28744210b903262b86e4d" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.682391 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener" containerID="cri-o://5e4bd3a5a9bcc7a9caf760e253e0ea48824f56e8b8b18cd49710b14d45577c27" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.690292 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 
16:06:11.690323 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.690332 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.690350 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.690359 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e538db-f225-4c4a-a147-f2bcf3218f28-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.690368 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmr4k\" (UniqueName: \"kubernetes.io/projected/c1e538db-f225-4c4a-a147-f2bcf3218f28-kube-api-access-kmr4k\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.730871 4707 scope.go:117] "RemoveContainer" containerID="fdbdbe661ec8d6b428eafed192668501fa504abde7e9628dd63f90c1ddfffed4" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.731897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.749921 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.760992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c1e538db-f225-4c4a-a147-f2bcf3218f28" (UID: "c1e538db-f225-4c4a-a147-f2bcf3218f28"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.782995 4707 scope.go:117] "RemoveContainer" containerID="4eb952b277a1133feb396b43f7c020ea9b855f0beb6c6313e95dfccf24dfca6d" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.787460 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.795018 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.795136 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.795204 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e538db-f225-4c4a-a147-f2bcf3218f28-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.795322 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.795415 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data podName:1c37d5e5-4ffd-412c-a93a-86cb6474735c nodeName:}" failed. No retries permitted until 2026-01-21 16:06:13.795401139 +0000 UTC m=+3870.976917361 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data") pod "rabbitmq-cell1-server-0" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.800706 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-65b994fc4-xql9t"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.800904 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api-log" containerID="cri-o://192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.801297 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api" containerID="cri-o://8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.808854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.809052 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="878bb642-989b-4ea2-825e-514263f9545a" containerName="kube-state-metrics" containerID="cri-o://6043287912bacccccabcf12d1ba27e4a2e47d6aa6e68c9430fbd4adf5af2e194" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.816677 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.819841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.820065 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" containerID="cri-o://b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.830017 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.835682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.857917 4707 scope.go:117] "RemoveContainer" containerID="3e50eda8710a63233f2f55973ef42f7216dfb177d49e57ea190dd0c2a0fde94f" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.859303 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:11 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 16:06:11 crc kubenswrapper[4707]: else Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:11 crc kubenswrapper[4707]: fi Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:11 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:11 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:11 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:11 crc kubenswrapper[4707]: # support updates Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.860939 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" podUID="44f9818d-d674-41dc-8f81-7095412fd946" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.872149 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567575559b-gqwkg"] Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.872340 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-httpd" containerID="cri-o://16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.872457 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-server" containerID="cri-o://69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47" gracePeriod=30 Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.874485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerName="rabbitmq" containerID="cri-o://2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813" gracePeriod=604800 Jan 21 16:06:11 crc kubenswrapper[4707]: W0121 16:06:11.892308 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf66adee9_b1b8_4d17_b204_58e840cf62b3.slice/crio-e20cd145e4dd347445c24f4814680dae203fea9a96d94ac9502223f5755d150c WatchSource:0}: Error finding container e20cd145e4dd347445c24f4814680dae203fea9a96d94ac9502223f5755d150c: Status 404 returned error can't find the container with id e20cd145e4dd347445c24f4814680dae203fea9a96d94ac9502223f5755d150c Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.895949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bbjd\" (UniqueName: \"kubernetes.io/projected/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-kube-api-access-2bbjd\") pod \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.896166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config\") pod \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.896234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-combined-ca-bundle\") pod \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " Jan 
21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.896279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config-secret\") pod \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\" (UID: \"ef804e7a-6ccf-45ea-8025-b3c4df0f1122\") " Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.901247 4707 scope.go:117] "RemoveContainer" containerID="0c414677c6601250942e98f55dc20db5fd68ccfc1904b7f748f1fbc358f6b862" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.904162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-kube-api-access-2bbjd" (OuterVolumeSpecName: "kube-api-access-2bbjd") pod "ef804e7a-6ccf-45ea-8025-b3c4df0f1122" (UID: "ef804e7a-6ccf-45ea-8025-b3c4df0f1122"). InnerVolumeSpecName "kube-api-access-2bbjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.906073 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:11 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 16:06:11 crc kubenswrapper[4707]: else Jan 21 16:06:11 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:11 crc kubenswrapper[4707]: fi Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:11 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:11 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:11 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:11 crc kubenswrapper[4707]: # support updates Jan 21 16:06:11 crc kubenswrapper[4707]: Jan 21 16:06:11 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:11 crc kubenswrapper[4707]: E0121 16:06:11.907319 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" podUID="f66adee9-b1b8-4d17-b204-58e840cf62b3" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.925497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ef804e7a-6ccf-45ea-8025-b3c4df0f1122" (UID: "ef804e7a-6ccf-45ea-8025-b3c4df0f1122"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.938795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef804e7a-6ccf-45ea-8025-b3c4df0f1122" (UID: "ef804e7a-6ccf-45ea-8025-b3c4df0f1122"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.938898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ef804e7a-6ccf-45ea-8025-b3c4df0f1122" (UID: "ef804e7a-6ccf-45ea-8025-b3c4df0f1122"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:11 crc kubenswrapper[4707]: I0121 16:06:11.958661 4707 scope.go:117] "RemoveContainer" containerID="5f46ece4d49ebd6ff61477cd26b6c7e5b21e6dfbdf8a342e363b3fa2825bf3d7" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.001104 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.001148 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.001159 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.001168 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bbjd\" (UniqueName: \"kubernetes.io/projected/ef804e7a-6ccf-45ea-8025-b3c4df0f1122-kube-api-access-2bbjd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.012301 4707 scope.go:117] "RemoveContainer" containerID="1e8e5b9cb01eeda526ec62456f9b5e1ddeccd8b52cd2f55d760453d82d8aabdf" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.027616 4707 generic.go:334] "Generic (PLEG): container finished" podID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerID="66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d" exitCode=2 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.027666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerDied","Data":"66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.029395 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c95f7cb7d-zhgp4_3e1ab3b0-53e5-437a-8882-ce4f23c015b4/neutron-api/0.log" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.029416 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerID="319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1" exitCode=0 Jan 21 16:06:12 crc 
kubenswrapper[4707]: I0121 16:06:12.029446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerDied","Data":"319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.034144 4707 generic.go:334] "Generic (PLEG): container finished" podID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerID="192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887" exitCode=143 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.034206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" event={"ID":"329b3e52-9792-42da-a6f0-9b0fc5fb866c","Type":"ContainerDied","Data":"192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.039023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" event={"ID":"b9c62ea6-096a-4f1a-9326-e14e3ce91911","Type":"ContainerStarted","Data":"b09498e74f293e56726f74418c15ab455c7a808d51b2fbcb72fbe3c97cc25214"} Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.042636 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:12 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 16:06:12 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 16:06:12 crc kubenswrapper[4707]: else Jan 21 16:06:12 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:12 crc kubenswrapper[4707]: fi Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:12 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:12 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:12 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:12 crc kubenswrapper[4707]: # support updates Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.043744 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" podUID="b9c62ea6-096a-4f1a-9326-e14e3ce91911" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.054417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"26d9bdaa-7c4d-41c6-be5e-ea602171d57d","Type":"ContainerDied","Data":"5b2f04f405dbdff1df3dce87be51d32f13a60dedb9f71c7cc362ddb25784da0f"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.054544 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.062612 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" containerID="2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1" exitCode=137 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.062681 4707 scope.go:117] "RemoveContainer" containerID="2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.062804 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.068657 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerID="798d6d03c5f9d0f6851d346fe3cbe26f854510991f5d3c2cad7db678016a87fb" exitCode=143 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.068713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c00713b-bb97-4adf-82eb-8ff31b65d14b","Type":"ContainerDied","Data":"798d6d03c5f9d0f6851d346fe3cbe26f854510991f5d3c2cad7db678016a87fb"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.071254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" event={"ID":"f66adee9-b1b8-4d17-b204-58e840cf62b3","Type":"ContainerStarted","Data":"e20cd145e4dd347445c24f4814680dae203fea9a96d94ac9502223f5755d150c"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.074120 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerID="23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd" exitCode=143 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.074170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a1b70303-f297-4a57-b252-7bdc251fa8ef","Type":"ContainerDied","Data":"23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.076334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" 
event={"ID":"491a0b28-d1e3-4359-af60-d6fb1767153f","Type":"ContainerStarted","Data":"7fbfa46f076437cc3080a2f620c2543d6f264ace883efedfa9df8130ea64d20d"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.088713 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.094298 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.138426 4707 scope.go:117] "RemoveContainer" containerID="2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.138967 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1\": container with ID starting with 2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1 not found: ID does not exist" containerID="2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.138995 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1"} err="failed to get container status \"2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1\": rpc error: code = NotFound desc = could not find container \"2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1\": container with ID starting with 2b2055f60777f9913c6bc8653c7f38f569fdd5515bb58f263dbee51b73b0f9c1 not found: ID does not exist" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.142323 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="bed3e53b753f14dc32e9b1b6ab9f45190e2e849c1bd85308d5fb68b7ea133e5f" exitCode=0 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.142348 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="42e0fb7f1e1361b1e98d0c8a660d5f6cbd7733ca6677f678924d9b430951360e" exitCode=0 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.142386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"bed3e53b753f14dc32e9b1b6ab9f45190e2e849c1bd85308d5fb68b7ea133e5f"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.142411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"42e0fb7f1e1361b1e98d0c8a660d5f6cbd7733ca6677f678924d9b430951360e"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.168243 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_c1e538db-f225-4c4a-a147-f2bcf3218f28/ovsdbserver-sb/0.log" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.168407 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.169132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"c1e538db-f225-4c4a-a147-f2bcf3218f28","Type":"ContainerDied","Data":"0e47204e826ec2ccd945d5d14a6c21d07ae997240536f43f126766c760b98c10"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.169167 4707 scope.go:117] "RemoveContainer" containerID="1d9f4ad28c410a19a1a52d47576b0ff87741fce0b1aca9cbf91d3cf607f6137a" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.169707 4707 scope.go:117] "RemoveContainer" containerID="dcee4cc518ae11b51683e86bac3ad963d1df8d8a2067d8861c9950d6a0949189" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.177656 4707 generic.go:334] "Generic (PLEG): container finished" podID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerID="170b32ec6e5a16b7ab4b852d6a2315b320618427d7554c7752c0c21ea1350ffa" exitCode=143 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.177763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"425c52a3-93a2-49fc-82d9-d7d86f381335","Type":"ContainerDied","Data":"170b32ec6e5a16b7ab4b852d6a2315b320618427d7554c7752c0c21ea1350ffa"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.181655 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.183858 4707 generic.go:334] "Generic (PLEG): container finished" podID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerID="58be21a4ad10793fe38a4fd7bab7b968f3bb33396de3d91f81bf652728f3cc32" exitCode=143 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.183905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" event={"ID":"c892fe45-66a3-4ddf-b0db-b50df71c6493","Type":"ContainerDied","Data":"58be21a4ad10793fe38a4fd7bab7b968f3bb33396de3d91f81bf652728f3cc32"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.189016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" event={"ID":"44f9818d-d674-41dc-8f81-7095412fd946","Type":"ContainerStarted","Data":"865e705c9b9469b1174e749b5442919be50279577ad4a77a46eb34bd2dd4762b"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.191451 4707 generic.go:334] "Generic (PLEG): container finished" podID="f58856a6-ee91-4aee-874e-118980038628" containerID="a7d885ed9569b9160b3891c8377f8b2c583f8d5d64b28744210b903262b86e4d" exitCode=143 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.191486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" event={"ID":"f58856a6-ee91-4aee-874e-118980038628","Type":"ContainerDied","Data":"a7d885ed9569b9160b3891c8377f8b2c583f8d5d64b28744210b903262b86e4d"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.192774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" event={"ID":"f3838495-e77f-4884-b7e3-ca30b8c6e13e","Type":"ContainerStarted","Data":"b39e57bffa259ca251a7a832f7d927525a294577290c6780a7dda539446c9f94"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.203969 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:06:12 crc 
kubenswrapper[4707]: I0121 16:06:12.204239 4707 scope.go:117] "RemoveContainer" containerID="f7be3dd6d48b956d8bb0f9a09a2a1b031bc5758a8687d898dc58c07a04641f6c" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.209149 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerID="6afcb360e0fbcf5cfa3da43b53ea242515a563c93456f93154197247f98df091" exitCode=0 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.209211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf","Type":"ContainerDied","Data":"6afcb360e0fbcf5cfa3da43b53ea242515a563c93456f93154197247f98df091"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.213674 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.214853 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerID="841a0e20c33f319acc9564e8dedd6e8d4f9ea14e977f1cf34ce8c8474652f992" exitCode=1 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.214885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" event={"ID":"2b620e1f-426b-4363-ad99-f56ba333fbfd","Type":"ContainerDied","Data":"841a0e20c33f319acc9564e8dedd6e8d4f9ea14e977f1cf34ce8c8474652f992"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.215606 4707 scope.go:117] "RemoveContainer" containerID="841a0e20c33f319acc9564e8dedd6e8d4f9ea14e977f1cf34ce8c8474652f992" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.218017 4707 generic.go:334] "Generic (PLEG): container finished" podID="878bb642-989b-4ea2-825e-514263f9545a" containerID="6043287912bacccccabcf12d1ba27e4a2e47d6aa6e68c9430fbd4adf5af2e194" exitCode=2 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.218070 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.218568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"878bb642-989b-4ea2-825e-514263f9545a","Type":"ContainerDied","Data":"6043287912bacccccabcf12d1ba27e4a2e47d6aa6e68c9430fbd4adf5af2e194"} Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.218606 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.219150 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:06:12 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 16:06:12 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 16:06:12 crc kubenswrapper[4707]: else Jan 21 16:06:12 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:06:12 crc kubenswrapper[4707]: fi Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:06:12 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:06:12 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:06:12 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:06:12 crc kubenswrapper[4707]: # support updates Jan 21 16:06:12 crc kubenswrapper[4707]: Jan 21 16:06:12 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.220998 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" podUID="f3838495-e77f-4884-b7e3-ca30b8c6e13e" Jan 21 16:06:12 crc kubenswrapper[4707]: W0121 16:06:12.224565 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod044a3d9d_de3b_4a2b_8007_9d21847149a5.slice/crio-a922d9aa2997f2237686993372c3cb8b882120ecd6b866a26b11662d5b6882cd WatchSource:0}: Error finding container a922d9aa2997f2237686993372c3cb8b882120ecd6b866a26b11662d5b6882cd: Status 404 returned error can't find the container with id a922d9aa2997f2237686993372c3cb8b882120ecd6b866a26b11662d5b6882cd Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.237039 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.270396 4707 scope.go:117] "RemoveContainer" containerID="9c5d8efdd505bcf5028e65801a46ebaa299c40cfdb0c320de7af53fbfb3b7410" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.306580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data-custom\") pod \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.306637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62fd99ff-fa07-4608-8a30-69ce0bbe814b-logs\") pod \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.306720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data\") pod \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.306765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvl5k\" (UniqueName: \"kubernetes.io/projected/62fd99ff-fa07-4608-8a30-69ce0bbe814b-kube-api-access-hvl5k\") pod \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.307194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fd99ff-fa07-4608-8a30-69ce0bbe814b-logs" (OuterVolumeSpecName: "logs") pod "62fd99ff-fa07-4608-8a30-69ce0bbe814b" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.307459 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62fd99ff-fa07-4608-8a30-69ce0bbe814b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.309680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fd99ff-fa07-4608-8a30-69ce0bbe814b-kube-api-access-hvl5k" (OuterVolumeSpecName: "kube-api-access-hvl5k") pod "62fd99ff-fa07-4608-8a30-69ce0bbe814b" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b"). InnerVolumeSpecName "kube-api-access-hvl5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.311362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62fd99ff-fa07-4608-8a30-69ce0bbe814b" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.311874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data" (OuterVolumeSpecName: "config-data") pod "62fd99ff-fa07-4608-8a30-69ce0bbe814b" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.408588 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.408614 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.408623 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvl5k\" (UniqueName: \"kubernetes.io/projected/62fd99ff-fa07-4608-8a30-69ce0bbe814b-kube-api-access-hvl5k\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.567203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.581911 4707 scope.go:117] "RemoveContainer" containerID="6285d1310ac7d804e4686f50a16a7ad8749d9f999e79e50a4f9496b160cfd4cf" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.611193 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data-custom\") pod \"40470182-aaeb-43cd-b058-942f0ebe2483\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.611250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hxvx\" (UniqueName: \"kubernetes.io/projected/40470182-aaeb-43cd-b058-942f0ebe2483-kube-api-access-9hxvx\") pod \"40470182-aaeb-43cd-b058-942f0ebe2483\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.611315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data\") pod \"40470182-aaeb-43cd-b058-942f0ebe2483\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.611337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40470182-aaeb-43cd-b058-942f0ebe2483-logs\") pod \"40470182-aaeb-43cd-b058-942f0ebe2483\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.622894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40470182-aaeb-43cd-b058-942f0ebe2483-logs" (OuterVolumeSpecName: "logs") pod "40470182-aaeb-43cd-b058-942f0ebe2483" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.628651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40470182-aaeb-43cd-b058-942f0ebe2483-kube-api-access-9hxvx" (OuterVolumeSpecName: "kube-api-access-9hxvx") pod "40470182-aaeb-43cd-b058-942f0ebe2483" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483"). InnerVolumeSpecName "kube-api-access-9hxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.638915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40470182-aaeb-43cd-b058-942f0ebe2483" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.640364 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.640673 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="884872ef-7ca2-4262-9561-440673182eba" containerName="memcached" containerID="cri-o://bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497" gracePeriod=30 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.641385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data" (OuterVolumeSpecName: "config-data") pod "40470182-aaeb-43cd-b058-942f0ebe2483" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.655849 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.687374 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.705437 4707 scope.go:117] "RemoveContainer" containerID="67b9016cf0d63654ebd9f4125a86329f8bb17d84c9a5f607da5903d91c3570ef" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kolla-config\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-combined-ca-bundle\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-generated\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv9pc\" (UniqueName: \"kubernetes.io/projected/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kube-api-access-pv9pc\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-operator-scripts\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-galera-tls-certs\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.718700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-default\") pod \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\" (UID: \"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.723913 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.723941 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hxvx\" (UniqueName: 
\"kubernetes.io/projected/40470182-aaeb-43cd-b058-942f0ebe2483-kube-api-access-9hxvx\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.723952 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.723964 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40470182-aaeb-43cd-b058-942f0ebe2483-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.724047 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.724087 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data podName:50ded377-6ace-4f39-8a60-bee575f7e8cc nodeName:}" failed. No retries permitted until 2026-01-21 16:06:16.724074999 +0000 UTC m=+3873.905591221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data") pod "rabbitmq-server-0" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc") : configmap "rabbitmq-config-data" not found Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.729641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.735505 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.96:9696/\": dial tcp 10.217.1.96:9696: connect: connection refused" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.736481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.740648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.741004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). 
InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.750852 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.751830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kube-api-access-pv9pc" (OuterVolumeSpecName: "kube-api-access-pv9pc") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "kube-api-access-pv9pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.751901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-xxz6k"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.768502 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv"] Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.770845 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="878bb642-989b-4ea2-825e-514263f9545a" containerName="kube-state-metrics" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.770865 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="878bb642-989b-4ea2-825e-514263f9545a" containerName="kube-state-metrics" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.770881 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerName="galera" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.770892 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerName="galera" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.770912 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerName="mysql-bootstrap" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.770920 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerName="mysql-bootstrap" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.770952 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="openstack-network-exporter" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.770958 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="openstack-network-exporter" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.770983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="ovsdbserver-sb" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.770989 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="ovsdbserver-sb" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.771010 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="openstack-network-exporter" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771015 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="openstack-network-exporter" Jan 21 
16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.771038 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="ovsdbserver-nb" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771050 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="ovsdbserver-nb" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771320 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="878bb642-989b-4ea2-825e-514263f9545a" containerName="kube-state-metrics" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771342 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" containerName="galera" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771353 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="ovsdbserver-nb" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771378 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" containerName="openstack-network-exporter" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771390 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="openstack-network-exporter" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.771396 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" containerName="ovsdbserver-sb" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.772131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.793671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.811757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.814724 4707 scope.go:117] "RemoveContainer" containerID="32b5c0143a1e67051d5e43fdb17d98a94c97afd93825354e053560a85ad156cc" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.816872 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.821712 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.821899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.824466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-certs\") pod \"878bb642-989b-4ea2-825e-514263f9545a\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.824673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brfjd\" (UniqueName: \"kubernetes.io/projected/878bb642-989b-4ea2-825e-514263f9545a-kube-api-access-brfjd\") pod \"878bb642-989b-4ea2-825e-514263f9545a\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.824737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-config\") pod \"878bb642-989b-4ea2-825e-514263f9545a\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.824771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-combined-ca-bundle\") pod \"878bb642-989b-4ea2-825e-514263f9545a\" (UID: \"878bb642-989b-4ea2-825e-514263f9545a\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9hlc\" (UniqueName: \"kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825359 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825424 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825448 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825458 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv9pc\" (UniqueName: \"kubernetes.io/projected/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-kube-api-access-pv9pc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 
crc kubenswrapper[4707]: I0121 16:06:12.825486 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.825496 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.832012 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bkt5g"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.837903 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bkt5g"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.845088 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29483521-zkwgs"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.850525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/878bb642-989b-4ea2-825e-514263f9545a-kube-api-access-brfjd" (OuterVolumeSpecName: "kube-api-access-brfjd") pod "878bb642-989b-4ea2-825e-514263f9545a" (UID: "878bb642-989b-4ea2-825e-514263f9545a"). InnerVolumeSpecName "kube-api-access-brfjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.851324 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-l2fmp"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.851437 4707 scope.go:117] "RemoveContainer" containerID="f13e4d8a65800b110a365933161c21cf2baf677a94c2caac3c3ded57bbda9c10" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.855981 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.862760 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-9fc8c7fb9-vltft"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.862929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" podUID="113f5252-dcd0-4719-8bd9-11800c33a828" containerName="keystone-api" containerID="cri-o://741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6" gracePeriod=30 Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.869025 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.876754 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29483521-zkwgs"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.891615 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-l2fmp"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.893293 4707 scope.go:117] "RemoveContainer" containerID="fb9886b5992389d717a7412d85598e041bc98e41d12e67e8c72ca77e926c376d" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.909389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.909582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.909781 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-l9hlc operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" podUID="669fe6b4-a99b-41bb-a7f9-cade7b712e87" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.914929 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-75f22"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.922419 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-75f22"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.923431 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.932302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kw7d7"] Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.933721 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42c42\" (UniqueName: \"kubernetes.io/projected/f66adee9-b1b8-4d17-b204-58e840cf62b3-kube-api-access-42c42\") pod \"f66adee9-b1b8-4d17-b204-58e840cf62b3\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f9818d-d674-41dc-8f81-7095412fd946-operator-scripts\") pod \"44f9818d-d674-41dc-8f81-7095412fd946\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvzc4\" (UniqueName: \"kubernetes.io/projected/44f9818d-d674-41dc-8f81-7095412fd946-kube-api-access-nvzc4\") pod \"44f9818d-d674-41dc-8f81-7095412fd946\" (UID: \"44f9818d-d674-41dc-8f81-7095412fd946\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/491a0b28-d1e3-4359-af60-d6fb1767153f-operator-scripts\") pod \"491a0b28-d1e3-4359-af60-d6fb1767153f\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q9sb\" (UniqueName: \"kubernetes.io/projected/491a0b28-d1e3-4359-af60-d6fb1767153f-kube-api-access-9q9sb\") pod \"491a0b28-d1e3-4359-af60-d6fb1767153f\" (UID: \"491a0b28-d1e3-4359-af60-d6fb1767153f\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66adee9-b1b8-4d17-b204-58e840cf62b3-operator-scripts\") pod \"f66adee9-b1b8-4d17-b204-58e840cf62b3\" (UID: \"f66adee9-b1b8-4d17-b204-58e840cf62b3\") " Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle\") pod \"barbican-worker-8b5858557-7fxnc\" (UID: \"40470182-aaeb-43cd-b058-942f0ebe2483\") " pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.934545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 
16:06:12.935379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hlc\" (UniqueName: \"kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.935489 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.935499 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.935509 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brfjd\" (UniqueName: \"kubernetes.io/projected/878bb642-989b-4ea2-825e-514263f9545a-kube-api-access-brfjd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.935532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f9818d-d674-41dc-8f81-7095412fd946-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44f9818d-d674-41dc-8f81-7095412fd946" (UID: "44f9818d-d674-41dc-8f81-7095412fd946"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.936469 4707 scope.go:117] "RemoveContainer" containerID="8997e6bc538ab68581eb681a7bc2920e440462c29436d549cdcef8caaa1a96dd" Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.936487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491a0b28-d1e3-4359-af60-d6fb1767153f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "491a0b28-d1e3-4359-af60-d6fb1767153f" (UID: "491a0b28-d1e3-4359-af60-d6fb1767153f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.936666 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.936705 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle podName:40470182-aaeb-43cd-b058-942f0ebe2483 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:16.936691578 +0000 UTC m=+3874.118207799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle") pod "barbican-worker-8b5858557-7fxnc" (UID: "40470182-aaeb-43cd-b058-942f0ebe2483") : secret "combined-ca-bundle" not found Jan 21 16:06:12 crc kubenswrapper[4707]: I0121 16:06:12.937466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66adee9-b1b8-4d17-b204-58e840cf62b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f66adee9-b1b8-4d17-b204-58e840cf62b3" (UID: "f66adee9-b1b8-4d17-b204-58e840cf62b3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.937855 4707 projected.go:194] Error preparing data for projected volume kube-api-access-l9hlc for pod openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.937885 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc podName:669fe6b4-a99b-41bb-a7f9-cade7b712e87 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:13.437877077 +0000 UTC m=+3870.619393300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l9hlc" (UniqueName: "kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc") pod "keystone-bd81-account-create-update-g6mbv" (UID: "669fe6b4-a99b-41bb-a7f9-cade7b712e87") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.940043 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:06:12 crc kubenswrapper[4707]: E0121 16:06:12.940088 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts podName:669fe6b4-a99b-41bb-a7f9-cade7b712e87 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:13.440074108 +0000 UTC m=+3870.621590320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts") pod "keystone-bd81-account-create-update-g6mbv" (UID: "669fe6b4-a99b-41bb-a7f9-cade7b712e87") : configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.036579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-config-data\") pod \"273f261e-d7a5-4025-91a6-f9e102be79de\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.036686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wtch\" (UniqueName: \"kubernetes.io/projected/273f261e-d7a5-4025-91a6-f9e102be79de-kube-api-access-9wtch\") pod \"273f261e-d7a5-4025-91a6-f9e102be79de\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.036731 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-vencrypt-tls-certs\") pod \"273f261e-d7a5-4025-91a6-f9e102be79de\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.036751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-combined-ca-bundle\") pod \"273f261e-d7a5-4025-91a6-f9e102be79de\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.036780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-nova-novncproxy-tls-certs\") pod \"273f261e-d7a5-4025-91a6-f9e102be79de\" (UID: \"273f261e-d7a5-4025-91a6-f9e102be79de\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.037308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle\") pod \"barbican-keystone-listener-594ddb85bd-8glq2\" (UID: \"62fd99ff-fa07-4608-8a30-69ce0bbe814b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.037839 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/491a0b28-d1e3-4359-af60-d6fb1767153f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.037859 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66adee9-b1b8-4d17-b204-58e840cf62b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.037870 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f9818d-d674-41dc-8f81-7095412fd946-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.037943 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.037985 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle podName:62fd99ff-fa07-4608-8a30-69ce0bbe814b nodeName:}" failed. No retries permitted until 2026-01-21 16:06:17.037971798 +0000 UTC m=+3874.219488021 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle") pod "barbican-keystone-listener-594ddb85bd-8glq2" (UID: "62fd99ff-fa07-4608-8a30-69ce0bbe814b") : secret "combined-ca-bundle" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.038302 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.038373 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle podName:f74b5687-ee85-4b71-929c-cc6ac63ffd06 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:17.038356442 +0000 UTC m=+3874.219872664 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle") pod "openstack-galera-0" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06") : secret "combined-ca-bundle" not found Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.059509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f9818d-d674-41dc-8f81-7095412fd946-kube-api-access-nvzc4" (OuterVolumeSpecName: "kube-api-access-nvzc4") pod "44f9818d-d674-41dc-8f81-7095412fd946" (UID: "44f9818d-d674-41dc-8f81-7095412fd946"). InnerVolumeSpecName "kube-api-access-nvzc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.059572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491a0b28-d1e3-4359-af60-d6fb1767153f-kube-api-access-9q9sb" (OuterVolumeSpecName: "kube-api-access-9q9sb") pod "491a0b28-d1e3-4359-af60-d6fb1767153f" (UID: "491a0b28-d1e3-4359-af60-d6fb1767153f"). InnerVolumeSpecName "kube-api-access-9q9sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.059978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66adee9-b1b8-4d17-b204-58e840cf62b3-kube-api-access-42c42" (OuterVolumeSpecName: "kube-api-access-42c42") pod "f66adee9-b1b8-4d17-b204-58e840cf62b3" (UID: "f66adee9-b1b8-4d17-b204-58e840cf62b3"). InnerVolumeSpecName "kube-api-access-42c42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.060093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273f261e-d7a5-4025-91a6-f9e102be79de-kube-api-access-9wtch" (OuterVolumeSpecName: "kube-api-access-9wtch") pod "273f261e-d7a5-4025-91a6-f9e102be79de" (UID: "273f261e-d7a5-4025-91a6-f9e102be79de"). InnerVolumeSpecName "kube-api-access-9wtch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.077274 4707 scope.go:117] "RemoveContainer" containerID="17525babd176f0edcd5f654572aaf712e365128834f9b88edc42dd4fb5ce4907" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.097620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "878bb642-989b-4ea2-825e-514263f9545a" (UID: "878bb642-989b-4ea2-825e-514263f9545a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.099777 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.101276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "878bb642-989b-4ea2-825e-514263f9545a" (UID: "878bb642-989b-4ea2-825e-514263f9545a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.106077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" (UID: "b0ab5a8f-7bbf-449b-b37e-ba4d800237cf"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.109596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "273f261e-d7a5-4025-91a6-f9e102be79de" (UID: "273f261e-d7a5-4025-91a6-f9e102be79de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.123516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-config-data" (OuterVolumeSpecName: "config-data") pod "273f261e-d7a5-4025-91a6-f9e102be79de" (UID: "273f261e-d7a5-4025-91a6-f9e102be79de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.124721 4707 scope.go:117] "RemoveContainer" containerID="0f2dfde4fe3c90981f347f6443b260ee7f62e38ea81b5bbcfaef544400577e6d" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.126964 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.137641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "878bb642-989b-4ea2-825e-514263f9545a" (UID: "878bb642-989b-4ea2-825e-514263f9545a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.138537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8vwb\" (UniqueName: \"kubernetes.io/projected/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-kube-api-access-p8vwb\") pod \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.138657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-operator-scripts\") pod \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\" (UID: \"6af371a2-cf51-495d-bd0e-a5561b6b9d7c\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139099 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q9sb\" (UniqueName: \"kubernetes.io/projected/491a0b28-d1e3-4359-af60-d6fb1767153f-kube-api-access-9q9sb\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139111 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139119 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139129 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139139 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139147 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/878bb642-989b-4ea2-825e-514263f9545a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139157 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42c42\" (UniqueName: \"kubernetes.io/projected/f66adee9-b1b8-4d17-b204-58e840cf62b3-kube-api-access-42c42\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139165 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wtch\" (UniqueName: \"kubernetes.io/projected/273f261e-d7a5-4025-91a6-f9e102be79de-kube-api-access-9wtch\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139172 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.139180 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvzc4\" (UniqueName: \"kubernetes.io/projected/44f9818d-d674-41dc-8f81-7095412fd946-kube-api-access-nvzc4\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.141046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6af371a2-cf51-495d-bd0e-a5561b6b9d7c" (UID: "6af371a2-cf51-495d-bd0e-a5561b6b9d7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.141766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "273f261e-d7a5-4025-91a6-f9e102be79de" (UID: "273f261e-d7a5-4025-91a6-f9e102be79de"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.147930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-kube-api-access-p8vwb" (OuterVolumeSpecName: "kube-api-access-p8vwb") pod "6af371a2-cf51-495d-bd0e-a5561b6b9d7c" (UID: "6af371a2-cf51-495d-bd0e-a5561b6b9d7c"). InnerVolumeSpecName "kube-api-access-p8vwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.176833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "273f261e-d7a5-4025-91a6-f9e102be79de" (UID: "273f261e-d7a5-4025-91a6-f9e102be79de"). 
InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.190677 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d47255a-8169-490c-88be-57ffaabcb2d3" path="/var/lib/kubelet/pods/1d47255a-8169-490c-88be-57ffaabcb2d3/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.191410 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f781fc-51ea-457e-b20f-129e4eaa8649" path="/var/lib/kubelet/pods/22f781fc-51ea-457e-b20f-129e4eaa8649/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.192076 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d9bdaa-7c4d-41c6-be5e-ea602171d57d" path="/var/lib/kubelet/pods/26d9bdaa-7c4d-41c6-be5e-ea602171d57d/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.193143 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a99ebc-2042-44a8-862c-e3508fdecf5c" path="/var/lib/kubelet/pods/47a99ebc-2042-44a8-862c-e3508fdecf5c/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.193711 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528035e2-52ac-4a65-b069-312a9bfab07e" path="/var/lib/kubelet/pods/528035e2-52ac-4a65-b069-312a9bfab07e/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.194194 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783ac2f5-58ea-4ead-a88e-7c4929cf4661" path="/var/lib/kubelet/pods/783ac2f5-58ea-4ead-a88e-7c4929cf4661/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.194263 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerName="galera" containerID="cri-o://03100c9eb9b8189faab039c677c5d529534dd8ab01bec2472f4fc82fd729aab9" gracePeriod=30 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.195226 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9340e961-22d7-4e45-b231-1fe7f197a9e9" path="/var/lib/kubelet/pods/9340e961-22d7-4e45-b231-1fe7f197a9e9/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.195759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951a6efd-7bf7-48ea-a8f5-de1ad1c03b08" path="/var/lib/kubelet/pods/951a6efd-7bf7-48ea-a8f5-de1ad1c03b08/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.196687 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a17672dc-c0ba-40d8-807c-9e1325921ed7" path="/var/lib/kubelet/pods/a17672dc-c0ba-40d8-807c-9e1325921ed7/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.197399 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ac6aad-da44-4874-8a19-8098a2b2dccc" path="/var/lib/kubelet/pods/a3ac6aad-da44-4874-8a19-8098a2b2dccc/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.200514 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac62dfd2-875a-43ff-b154-e1755965171f" path="/var/lib/kubelet/pods/ac62dfd2-875a-43ff-b154-e1755965171f/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.201150 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466" path="/var/lib/kubelet/pods/bb9a74aa-8a87-4bb8-9a6a-d2f1a4f40466/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.201652 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bfd7d039-ca6f-477f-a698-e5fcd49e092b" path="/var/lib/kubelet/pods/bfd7d039-ca6f-477f-a698-e5fcd49e092b/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.203055 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e538db-f225-4c4a-a147-f2bcf3218f28" path="/var/lib/kubelet/pods/c1e538db-f225-4c4a-a147-f2bcf3218f28/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.203704 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75f12e9-2aae-4b40-9120-bc66aad8d0c1" path="/var/lib/kubelet/pods/e75f12e9-2aae-4b40-9120-bc66aad8d0c1/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.204429 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef804e7a-6ccf-45ea-8025-b3c4df0f1122" path="/var/lib/kubelet/pods/ef804e7a-6ccf-45ea-8025-b3c4df0f1122/volumes" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.229026 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.229645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"b0ab5a8f-7bbf-449b-b37e-ba4d800237cf","Type":"ContainerDied","Data":"f091c40f21036d1801be20ea6040f2764f81d0f16bd856b12466f03377cf804d"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.229740 4707 scope.go:117] "RemoveContainer" containerID="6afcb360e0fbcf5cfa3da43b53ea242515a563c93456f93154197247f98df091" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.237874 4707 scope.go:117] "RemoveContainer" containerID="743ae25b8acb1b34607a2c58c911a56beecb00bee4c4c9e5e264c4b53d7a16c1" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.239713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7r9l\" (UniqueName: \"kubernetes.io/projected/4b535454-f068-42cd-9817-a23bd7d7aa4d-kube-api-access-n7r9l\") pod \"4b535454-f068-42cd-9817-a23bd7d7aa4d\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.240018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-combined-ca-bundle\") pod \"4b535454-f068-42cd-9817-a23bd7d7aa4d\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.240125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-config-data\") pod \"4b535454-f068-42cd-9817-a23bd7d7aa4d\" (UID: \"4b535454-f068-42cd-9817-a23bd7d7aa4d\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.240564 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.240578 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.240587 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8vwb\" 
(UniqueName: \"kubernetes.io/projected/6af371a2-cf51-495d-bd0e-a5561b6b9d7c-kube-api-access-p8vwb\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.240596 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/273f261e-d7a5-4025-91a6-f9e102be79de-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.241710 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.260876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"878bb642-989b-4ea2-825e-514263f9545a","Type":"ContainerDied","Data":"ffc8dc78bc39353273f375e7210b8d9d0da42cb11e26ccb55670c33ff768584f"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.261961 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.265908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b535454-f068-42cd-9817-a23bd7d7aa4d-kube-api-access-n7r9l" (OuterVolumeSpecName: "kube-api-access-n7r9l") pod "4b535454-f068-42cd-9817-a23bd7d7aa4d" (UID: "4b535454-f068-42cd-9817-a23bd7d7aa4d"). InnerVolumeSpecName "kube-api-access-n7r9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.265918 4707 generic.go:334] "Generic (PLEG): container finished" podID="273f261e-d7a5-4025-91a6-f9e102be79de" containerID="c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.265934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"273f261e-d7a5-4025-91a6-f9e102be79de","Type":"ContainerDied","Data":"c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.266881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"273f261e-d7a5-4025-91a6-f9e102be79de","Type":"ContainerDied","Data":"1e5626700bc617a5d897cb1ff108340acef8b9cc05358be3eb5df9be6d08ad36"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.265960 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.271512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b535454-f068-42cd-9817-a23bd7d7aa4d" (UID: "4b535454-f068-42cd-9817-a23bd7d7aa4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.271675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-config-data" (OuterVolumeSpecName: "config-data") pod "4b535454-f068-42cd-9817-a23bd7d7aa4d" (UID: "4b535454-f068-42cd-9817-a23bd7d7aa4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.296759 4707 generic.go:334] "Generic (PLEG): container finished" podID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerID="e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.296995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" event={"ID":"044a3d9d-de3b-4a2b-8007-9d21847149a5","Type":"ContainerDied","Data":"e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.297015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" event={"ID":"044a3d9d-de3b-4a2b-8007-9d21847149a5","Type":"ContainerStarted","Data":"a922d9aa2997f2237686993372c3cb8b882120ecd6b866a26b11662d5b6882cd"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.302867 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerID="69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.302922 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.302938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerDied","Data":"69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.302963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"4b535454-f068-42cd-9817-a23bd7d7aa4d","Type":"ContainerDied","Data":"6297542b32b4ff82c9b3e69fe4587e55a7b0578cb8e7c93f713e581d4d3c7557"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.304091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" event={"ID":"491a0b28-d1e3-4359-af60-d6fb1767153f","Type":"ContainerDied","Data":"7fbfa46f076437cc3080a2f620c2543d6f264ace883efedfa9df8130ea64d20d"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.304298 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.321096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" event={"ID":"44f9818d-d674-41dc-8f81-7095412fd946","Type":"ContainerDied","Data":"865e705c9b9469b1174e749b5442919be50279577ad4a77a46eb34bd2dd4762b"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.321179 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.329281 4707 generic.go:334] "Generic (PLEG): container finished" podID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerID="f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.332383 4707 generic.go:334] "Generic (PLEG): container finished" podID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerID="ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.329414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerDied","Data":"f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.332538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerDied","Data":"ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.335904 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" event={"ID":"6af371a2-cf51-495d-bd0e-a5561b6b9d7c","Type":"ContainerDied","Data":"3edec84002032cfbc28619c2d50cf3a6f1546f0653ccdf4367b8ef8388cd27fe"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.336160 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.341351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-config-data\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.341383 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-run-httpd\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.341426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78q44\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-kube-api-access-78q44\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.342042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-log-httpd\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.342250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-public-tls-certs\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc 
kubenswrapper[4707]: I0121 16:06:13.342319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-combined-ca-bundle\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.342385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-internal-tls-certs\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.342435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-etc-swift\") pod \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\" (UID: \"2772c54d-baf5-4cd4-8677-0db11b97c5ef\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.344514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.356575 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7r9l\" (UniqueName: \"kubernetes.io/projected/4b535454-f068-42cd-9817-a23bd7d7aa4d-kube-api-access-n7r9l\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.356591 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.356601 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.356611 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b535454-f068-42cd-9817-a23bd7d7aa4d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.361683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-kube-api-access-78q44" (OuterVolumeSpecName: "kube-api-access-78q44") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "kube-api-access-78q44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362668 4707 generic.go:334] "Generic (PLEG): container finished" podID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerID="69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362684 4707 generic.go:334] "Generic (PLEG): container finished" podID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerID="16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3" exitCode=0 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362760 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" event={"ID":"2772c54d-baf5-4cd4-8677-0db11b97c5ef","Type":"ContainerDied","Data":"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" event={"ID":"2772c54d-baf5-4cd4-8677-0db11b97c5ef","Type":"ContainerDied","Data":"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.362925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567575559b-gqwkg" event={"ID":"2772c54d-baf5-4cd4-8677-0db11b97c5ef","Type":"ContainerDied","Data":"4305a20f633fc4957c8c0e27ca4e0246bc5f26542a84c6d503dfe7054c2af499"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.366573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.381999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.382833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" event={"ID":"f66adee9-b1b8-4d17-b204-58e840cf62b3","Type":"ContainerDied","Data":"e20cd145e4dd347445c24f4814680dae203fea9a96d94ac9502223f5755d150c"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.382893 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.392424 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerID="b92d7390d144ce91eac5fff590d199d75cae8e6be0b2fd4b6cf081f59dbf0240" exitCode=1 Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.392672 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-kw7d7" secret="" err="secret \"galera-openstack-dockercfg-6glg2\" not found" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.392703 4707 scope.go:117] "RemoveContainer" containerID="b92d7390d144ce91eac5fff590d199d75cae8e6be0b2fd4b6cf081f59dbf0240" Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.392902 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-kw7d7_openstack-kuttl-tests(2b620e1f-426b-4363-ad99-f56ba333fbfd)\"" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.392940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" event={"ID":"2b620e1f-426b-4363-ad99-f56ba333fbfd","Type":"ContainerDied","Data":"b92d7390d144ce91eac5fff590d199d75cae8e6be0b2fd4b6cf081f59dbf0240"} Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.397726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.397736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.397850 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.412669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.461219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.466568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hlc\" (UniqueName: \"kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.466862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.467795 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78q44\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-kube-api-access-78q44\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.477913 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.477965 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts podName:2b620e1f-426b-4363-ad99-f56ba333fbfd nodeName:}" failed. No retries permitted until 2026-01-21 16:06:13.977950683 +0000 UTC m=+3871.159466904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts") pod "root-account-create-update-kw7d7" (UID: "2b620e1f-426b-4363-ad99-f56ba333fbfd") : configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.478314 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.478341 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts podName:669fe6b4-a99b-41bb-a7f9-cade7b712e87 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:14.478334093 +0000 UTC m=+3871.659850316 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts") pod "keystone-bd81-account-create-update-g6mbv" (UID: "669fe6b4-a99b-41bb-a7f9-cade7b712e87") : configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.479827 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.479866 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.479884 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.479898 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2772c54d-baf5-4cd4-8677-0db11b97c5ef-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.479907 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2772c54d-baf5-4cd4-8677-0db11b97c5ef-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.483173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-config-data" (OuterVolumeSpecName: "config-data") pod "2772c54d-baf5-4cd4-8677-0db11b97c5ef" (UID: "2772c54d-baf5-4cd4-8677-0db11b97c5ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.484724 4707 projected.go:194] Error preparing data for projected volume kube-api-access-l9hlc for pod openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.484783 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc podName:669fe6b4-a99b-41bb-a7f9-cade7b712e87 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:14.484766339 +0000 UTC m=+3871.666282560 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l9hlc" (UniqueName: "kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc") pod "keystone-bd81-account-create-update-g6mbv" (UID: "669fe6b4-a99b-41bb-a7f9-cade7b712e87") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.524369 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.528201 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.537747 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.537921 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.552975 4707 scope.go:117] "RemoveContainer" containerID="3cbe82abedb90d7b7ea8a4dfc7b5a312e0052fff64a0a3f6bbe54174658450ee" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.555790 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.583716 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772c54d-baf5-4cd4-8677-0db11b97c5ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.594155 4707 scope.go:117] "RemoveContainer" containerID="b451099dbd5ad32127a7989542980a00d90d79b3ce6572d7316028bc4d8c18c4" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.721140 4707 scope.go:117] "RemoveContainer" containerID="6043287912bacccccabcf12d1ba27e4a2e47d6aa6e68c9430fbd4adf5af2e194" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.721474 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.786344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t62lw\" (UniqueName: \"kubernetes.io/projected/884872ef-7ca2-4262-9561-440673182eba-kube-api-access-t62lw\") pod \"884872ef-7ca2-4262-9561-440673182eba\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.786418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-memcached-tls-certs\") pod \"884872ef-7ca2-4262-9561-440673182eba\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.786437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-combined-ca-bundle\") pod \"884872ef-7ca2-4262-9561-440673182eba\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.786893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-kolla-config\") pod \"884872ef-7ca2-4262-9561-440673182eba\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.786934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-config-data\") pod \"884872ef-7ca2-4262-9561-440673182eba\" (UID: \"884872ef-7ca2-4262-9561-440673182eba\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.787639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "884872ef-7ca2-4262-9561-440673182eba" (UID: "884872ef-7ca2-4262-9561-440673182eba"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.789442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-config-data" (OuterVolumeSpecName: "config-data") pod "884872ef-7ca2-4262-9561-440673182eba" (UID: "884872ef-7ca2-4262-9561-440673182eba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.790642 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.790663 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884872ef-7ca2-4262-9561-440673182eba-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.792767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884872ef-7ca2-4262-9561-440673182eba-kube-api-access-t62lw" (OuterVolumeSpecName: "kube-api-access-t62lw") pod "884872ef-7ca2-4262-9561-440673182eba" (UID: "884872ef-7ca2-4262-9561-440673182eba"). InnerVolumeSpecName "kube-api-access-t62lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.805883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.813587 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-594ddb85bd-8glq2"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.816347 4707 scope.go:117] "RemoveContainer" containerID="2a059741cb9103b48bbbb54f029ad118b4c2e8e0b020439897ea13f2f44172a8" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.819842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.829732 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.831541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "884872ef-7ca2-4262-9561-440673182eba" (UID: "884872ef-7ca2-4262-9561-440673182eba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.836906 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.849928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "884872ef-7ca2-4262-9561-440673182eba" (UID: "884872ef-7ca2-4262-9561-440673182eba"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.849987 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.858002 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.858858 4707 scope.go:117] "RemoveContainer" containerID="c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.860941 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.866850 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7ef3-account-create-update-g8jgp"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.879691 4707 scope.go:117] "RemoveContainer" containerID="cfa23d2d252b51b7251f2ad8cf18b083481d8f7870dd55f120337919c9261e22" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.887467 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.892407 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.892576 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884872ef-7ca2-4262-9561-440673182eba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.892591 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fd99ff-fa07-4608-8a30-69ce0bbe814b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.892601 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t62lw\" (UniqueName: \"kubernetes.io/projected/884872ef-7ca2-4262-9561-440673182eba-kube-api-access-t62lw\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.892527 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.892675 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data podName:1c37d5e5-4ffd-412c-a93a-86cb6474735c nodeName:}" failed. No retries permitted until 2026-01-21 16:06:17.892660272 +0000 UTC m=+3875.074176494 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data") pod "rabbitmq-cell1-server-0" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.898200 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-24b2-account-create-update-hstnz"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.904776 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567575559b-gqwkg"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.908975 4707 scope.go:117] "RemoveContainer" containerID="c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f" Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.909676 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f\": container with ID starting with c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f not found: ID does not exist" containerID="c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.909704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f"} err="failed to get container status \"c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f\": rpc error: code = NotFound desc = could not find container \"c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f\": container with ID starting with c53eaa09847321e3874c7990e1b54842520b73227e3a93fcd8d6bcde1716710f not found: ID does not exist" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.909722 4707 scope.go:117] "RemoveContainer" containerID="69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.912271 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567575559b-gqwkg"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.927122 4707 scope.go:117] "RemoveContainer" containerID="98a9597fa3accf0fc1a3339f115e49717d4b9832f5b88f142f8e788aee42855c" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.934429 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.947064 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-8b5858557-7fxnc"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.959572 4707 scope.go:117] "RemoveContainer" containerID="9e83185270fc11763a91c77246eff89a685150bc0a98b55aaf5b8eb5e54d524f" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.961170 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.967758 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-4b91-account-create-update-cgjh8"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.970979 4707 scope.go:117] "RemoveContainer" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 
16:06:13.981135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.982132 4707 scope.go:117] "RemoveContainer" containerID="ece978f12d6b5e9ac8023040ae0a54c42a5fbd1433d12812045a7b80b733a412" Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.986053 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-82ac-account-create-update-6w6kc"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.990865 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.993676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3838495-e77f-4884-b7e3-ca30b8c6e13e-operator-scripts\") pod \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.993754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrvkh\" (UniqueName: \"kubernetes.io/projected/f3838495-e77f-4884-b7e3-ca30b8c6e13e-kube-api-access-xrvkh\") pod \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\" (UID: \"f3838495-e77f-4884-b7e3-ca30b8c6e13e\") " Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.994460 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: E0121 16:06:13.994501 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts podName:2b620e1f-426b-4363-ad99-f56ba333fbfd nodeName:}" failed. No retries permitted until 2026-01-21 16:06:14.994489384 +0000 UTC m=+3872.176005606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts") pod "root-account-create-update-kw7d7" (UID: "2b620e1f-426b-4363-ad99-f56ba333fbfd") : configmap "openstack-scripts" not found Jan 21 16:06:13 crc kubenswrapper[4707]: I0121 16:06:13.995113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3838495-e77f-4884-b7e3-ca30b8c6e13e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3838495-e77f-4884-b7e3-ca30b8c6e13e" (UID: "f3838495-e77f-4884-b7e3-ca30b8c6e13e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.003125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3838495-e77f-4884-b7e3-ca30b8c6e13e-kube-api-access-xrvkh" (OuterVolumeSpecName: "kube-api-access-xrvkh") pod "f3838495-e77f-4884-b7e3-ca30b8c6e13e" (UID: "f3838495-e77f-4884-b7e3-ca30b8c6e13e"). InnerVolumeSpecName "kube-api-access-xrvkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.005850 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.012848 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.015568 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.023928 4707 scope.go:117] "RemoveContainer" containerID="69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a" Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.031853 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a\": container with ID starting with 69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a not found: ID does not exist" containerID="69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.031879 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a"} err="failed to get container status \"69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a\": rpc error: code = NotFound desc = could not find container \"69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a\": container with ID starting with 69889ea1de43935597cfb9fe3399014ac5a58e7951f5158c3b9805364172136a not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.031899 4707 scope.go:117] "RemoveContainer" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.032218 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9\": container with ID starting with 0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9 not found: ID does not exist" containerID="0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.032251 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9"} err="failed to get container status \"0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9\": rpc error: code = NotFound desc = could not find container \"0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9\": container with ID starting with 0013d0198fc8f8e7a1b3f4a0aca1bfc9e60d3ced22a73a961e0b28069f309da9 not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.032274 4707 scope.go:117] "RemoveContainer" containerID="69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.076987 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.89:8776/healthcheck\": read tcp 
10.217.0.2:49610->10.217.1.89:8776: read: connection reset by peer" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.088770 4707 scope.go:117] "RemoveContainer" containerID="16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.095471 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3838495-e77f-4884-b7e3-ca30b8c6e13e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.095486 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrvkh\" (UniqueName: \"kubernetes.io/projected/f3838495-e77f-4884-b7e3-ca30b8c6e13e-kube-api-access-xrvkh\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.095495 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40470182-aaeb-43cd-b058-942f0ebe2483-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.101300 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.138908 4707 scope.go:117] "RemoveContainer" containerID="76d160cc07c271dde3d11395b543e852feef2c9280f3151293f7b7ecec53f818" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.170146 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.82:8775/\": read tcp 10.217.0.2:35480->10.217.1.82:8775: read: connection reset by peer" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.171759 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.82:8775/\": read tcp 10.217.0.2:35478->10.217.1.82:8775: read: connection reset by peer" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.188629 4707 scope.go:117] "RemoveContainer" containerID="69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47" Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.193887 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47\": container with ID starting with 69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47 not found: ID does not exist" containerID="69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.193919 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47"} err="failed to get container status \"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47\": rpc error: code = NotFound desc = could not find container \"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47\": container with ID starting with 69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47 not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.193936 4707 
scope.go:117] "RemoveContainer" containerID="16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.196966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9c62ea6-096a-4f1a-9326-e14e3ce91911-operator-scripts\") pod \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.197015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gklzr\" (UniqueName: \"kubernetes.io/projected/b9c62ea6-096a-4f1a-9326-e14e3ce91911-kube-api-access-gklzr\") pod \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\" (UID: \"b9c62ea6-096a-4f1a-9326-e14e3ce91911\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.198615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c62ea6-096a-4f1a-9326-e14e3ce91911-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9c62ea6-096a-4f1a-9326-e14e3ce91911" (UID: "b9c62ea6-096a-4f1a-9326-e14e3ce91911"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.196469 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3\": container with ID starting with 16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3 not found: ID does not exist" containerID="16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.200633 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3"} err="failed to get container status \"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3\": rpc error: code = NotFound desc = could not find container \"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3\": container with ID starting with 16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3 not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.200681 4707 scope.go:117] "RemoveContainer" containerID="69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.204095 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47"} err="failed to get container status \"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47\": rpc error: code = NotFound desc = could not find container \"69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47\": container with ID starting with 69cdc8f0880c897aef64836bd23a78598dd03ce6f08e1f0cde4605a096cbeb47 not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.204127 4707 scope.go:117] "RemoveContainer" containerID="16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.204101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c62ea6-096a-4f1a-9326-e14e3ce91911-kube-api-access-gklzr" (OuterVolumeSpecName: 
"kube-api-access-gklzr") pod "b9c62ea6-096a-4f1a-9326-e14e3ce91911" (UID: "b9c62ea6-096a-4f1a-9326-e14e3ce91911"). InnerVolumeSpecName "kube-api-access-gklzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.204200 4707 scope.go:117] "RemoveContainer" containerID="c5e539d0bcc848c6d8b89ed9ace7c4bb3a1f1e4ca972075b82e0e1f0ae7b1421" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.204899 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3"} err="failed to get container status \"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3\": rpc error: code = NotFound desc = could not find container \"16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3\": container with ID starting with 16d11ba335a93c69d6a1d3c42b52aaafcf1d935ce28fc0c82bb232c819b6b3b3 not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.204929 4707 scope.go:117] "RemoveContainer" containerID="841a0e20c33f319acc9564e8dedd6e8d4f9ea14e977f1cf34ce8c8474652f992" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.299414 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gklzr\" (UniqueName: \"kubernetes.io/projected/b9c62ea6-096a-4f1a-9326-e14e3ce91911-kube-api-access-gklzr\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.299587 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9c62ea6-096a-4f1a-9326-e14e3ce91911-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.423697 4707 generic.go:334] "Generic (PLEG): container finished" podID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerID="7cfc0f593e80d12c7b3d86938139b541aeeef5f717713f7020df220d37653d45" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.423745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"425c52a3-93a2-49fc-82d9-d7d86f381335","Type":"ContainerDied","Data":"7cfc0f593e80d12c7b3d86938139b541aeeef5f717713f7020df220d37653d45"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.427458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" event={"ID":"b9c62ea6-096a-4f1a-9326-e14e3ce91911","Type":"ContainerDied","Data":"b09498e74f293e56726f74418c15ab455c7a808d51b2fbcb72fbe3c97cc25214"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.427514 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.435308 4707 generic.go:334] "Generic (PLEG): container finished" podID="884872ef-7ca2-4262-9561-440673182eba" containerID="bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.435367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"884872ef-7ca2-4262-9561-440673182eba","Type":"ContainerDied","Data":"bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.435391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"884872ef-7ca2-4262-9561-440673182eba","Type":"ContainerDied","Data":"e2d7afa51a1e8f60c8a002a37bef43ce5e078fb74d003536c303462bcb4bba91"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.435442 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.444582 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerID="e161f3943e924491a50c0f1e70e67b3a00296b6236a4e22683928578bbfa1875" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.444642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c00713b-bb97-4adf-82eb-8ff31b65d14b","Type":"ContainerDied","Data":"e161f3943e924491a50c0f1e70e67b3a00296b6236a4e22683928578bbfa1875"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.447039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" event={"ID":"044a3d9d-de3b-4a2b-8007-9d21847149a5","Type":"ContainerStarted","Data":"96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.448168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.472908 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" podStartSLOduration=4.472896957 podStartE2EDuration="4.472896957s" podCreationTimestamp="2026-01-21 16:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:06:14.461892925 +0000 UTC m=+3871.643409148" watchObservedRunningTime="2026-01-21 16:06:14.472896957 +0000 UTC m=+3871.654413180" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.477382 4707 generic.go:334] "Generic (PLEG): container finished" podID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerID="1cfbfa0717635e65d7326b7c6d12c3ff52804eea66d22885b209dc75b069dd24" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.477430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" event={"ID":"eff12a6c-5d02-4e45-9e7e-e589b746f1dc","Type":"ContainerDied","Data":"1cfbfa0717635e65d7326b7c6d12c3ff52804eea66d22885b209dc75b069dd24"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.477457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" event={"ID":"eff12a6c-5d02-4e45-9e7e-e589b746f1dc","Type":"ContainerDied","Data":"83a59ed7cec5c61fc2188caad0a9e33a181eddc8cac3308b151b01bfb9bae715"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.477475 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a59ed7cec5c61fc2188caad0a9e33a181eddc8cac3308b151b01bfb9bae715" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.483047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" event={"ID":"f3838495-e77f-4884-b7e3-ca30b8c6e13e","Type":"ContainerDied","Data":"b39e57bffa259ca251a7a832f7d927525a294577290c6780a7dda539446c9f94"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.483090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.491724 4707 generic.go:334] "Generic (PLEG): container finished" podID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerID="01def0d7c0c313086bab3b5dc9cadb48b9024281321ff3f9e84b864c2e80cc95" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.491781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"61223200-c7c1-4ee8-b38a-e6074d65114a","Type":"ContainerDied","Data":"01def0d7c0c313086bab3b5dc9cadb48b9024281321ff3f9e84b864c2e80cc95"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.491837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"61223200-c7c1-4ee8-b38a-e6074d65114a","Type":"ContainerDied","Data":"b248ef8405ca158d4d7a8937f11f247005b0d7b28069f40241fbcaa4ec63ecc2"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.491849 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b248ef8405ca158d4d7a8937f11f247005b0d7b28069f40241fbcaa4ec63ecc2" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.499133 4707 generic.go:334] "Generic (PLEG): container finished" podID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerID="03100c9eb9b8189faab039c677c5d529534dd8ab01bec2472f4fc82fd729aab9" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.499175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f74b5687-ee85-4b71-929c-cc6ac63ffd06","Type":"ContainerDied","Data":"03100c9eb9b8189faab039c677c5d529534dd8ab01bec2472f4fc82fd729aab9"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.499191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f74b5687-ee85-4b71-929c-cc6ac63ffd06","Type":"ContainerDied","Data":"a50dddf8b460942ac358b7cf7c3cd16415bddef2914c6ce670300c4f426403e1"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.499203 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a50dddf8b460942ac358b7cf7c3cd16415bddef2914c6ce670300c4f426403e1" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.500606 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerID="030c52d40b8a7c37e53cff77e05fa947476307c66e7a847d083b6b3c32443d7f" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.500655 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a3b559c3-4e82-4897-9b5c-1816989fffcd","Type":"ContainerDied","Data":"030c52d40b8a7c37e53cff77e05fa947476307c66e7a847d083b6b3c32443d7f"} Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.502309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hlc\" (UniqueName: \"kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.502399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts\") pod \"keystone-bd81-account-create-update-g6mbv\" (UID: \"669fe6b4-a99b-41bb-a7f9-cade7b712e87\") " pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.502597 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.502671 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts podName:669fe6b4-a99b-41bb-a7f9-cade7b712e87 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:16.502654681 +0000 UTC m=+3873.684170903 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts") pod "keystone-bd81-account-create-update-g6mbv" (UID: "669fe6b4-a99b-41bb-a7f9-cade7b712e87") : configmap "openstack-scripts" not found Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.504919 4707 generic.go:334] "Generic (PLEG): container finished" podID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerID="73b244ba7031f16385e7963d008d81acbd17ba6ba2ee40f7b1bdf960c503b84b" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.504991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" event={"ID":"c892fe45-66a3-4ddf-b0db-b50df71c6493","Type":"ContainerDied","Data":"73b244ba7031f16385e7963d008d81acbd17ba6ba2ee40f7b1bdf960c503b84b"} Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.505000 4707 projected.go:194] Error preparing data for projected volume kube-api-access-l9hlc for pod openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.505043 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc podName:669fe6b4-a99b-41bb-a7f9-cade7b712e87 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:16.505029537 +0000 UTC m=+3873.686545759 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l9hlc" (UniqueName: "kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc") pod "keystone-bd81-account-create-update-g6mbv" (UID: "669fe6b4-a99b-41bb-a7f9-cade7b712e87") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.508238 4707 generic.go:334] "Generic (PLEG): container finished" podID="f58856a6-ee91-4aee-874e-118980038628" containerID="5e4bd3a5a9bcc7a9caf760e253e0ea48824f56e8b8b18cd49710b14d45577c27" exitCode=0 Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.508303 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.508310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" event={"ID":"f58856a6-ee91-4aee-874e-118980038628","Type":"ContainerDied","Data":"5e4bd3a5a9bcc7a9caf760e253e0ea48824f56e8b8b18cd49710b14d45577c27"} Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.558203 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4af60f39_57ca_4595_be87_1d1a0c869d72.slice/crio-conmon-a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b70303_f297_4a57_b252_7bdc251fa8ef.slice/crio-5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.771972 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.779171 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.785715 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.806307 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-operator-scripts\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-generated\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-default\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-galera-tls-certs\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811550 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5mg5\" (UniqueName: \"kubernetes.io/projected/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kube-api-access-l5mg5\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.811573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kolla-config\") pod \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\" (UID: \"f74b5687-ee85-4b71-929c-cc6ac63ffd06\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.812193 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.812394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.812712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.813690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.823071 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.823889 4707 scope.go:117] "RemoveContainer" containerID="4457eabe5f34005325f77fb3e078f6a236a4762a6d975580d7780c5eb68a0db8" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.826766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kube-api-access-l5mg5" (OuterVolumeSpecName: "kube-api-access-l5mg5") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "kube-api-access-l5mg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.842869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.847448 4707 scope.go:117] "RemoveContainer" containerID="bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.859324 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.865108 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.883985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.885135 4707 scope.go:117] "RemoveContainer" containerID="e988b288a8fcc44d803facd8cd151da8bf58f68ac8ebb2aef3d85ab144b16900" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.896232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f74b5687-ee85-4b71-929c-cc6ac63ffd06" (UID: "f74b5687-ee85-4b71-929c-cc6ac63ffd06"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.897960 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.900048 4707 scope.go:117] "RemoveContainer" containerID="bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497" Jan 21 16:06:14 crc kubenswrapper[4707]: E0121 16:06:14.900993 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497\": container with ID starting with bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497 not found: ID does not exist" containerID="bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.901024 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497"} err="failed to get container status \"bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497\": rpc error: code = NotFound desc = could not find container \"bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497\": container with ID starting with bf941c8f3ba55bbab6c96ba882f2b3880e1c28cfdadbf4dba17754705d29f497 not found: ID does not exist" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.902917 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-fcb5-account-create-update-nfc55"] Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.912986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-scripts\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-logs\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-scripts\") pod 
\"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data-custom\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc48n\" (UniqueName: \"kubernetes.io/projected/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-kube-api-access-sc48n\") pod \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-internal-tls-certs\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-logs\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-config-data\") pod \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-combined-ca-bundle\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-scripts\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-internal-tls-certs\") pod \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b559c3-4e82-4897-9b5c-1816989fffcd-logs\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: 
\"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt6x8\" (UniqueName: \"kubernetes.io/projected/a3b559c3-4e82-4897-9b5c-1816989fffcd-kube-api-access-lt6x8\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3b559c3-4e82-4897-9b5c-1816989fffcd-etc-machine-id\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-combined-ca-bundle\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.913707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-logs" (OuterVolumeSpecName: "logs") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.917201 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5gq4\" (UniqueName: \"kubernetes.io/projected/61223200-c7c1-4ee8-b38a-e6074d65114a-kube-api-access-x5gq4\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-logs\") pod \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-config-data\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-internal-tls-certs\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-public-tls-certs\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 
16:06:14.921436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-httpd-run\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-scripts\") pod \"a3b559c3-4e82-4897-9b5c-1816989fffcd\" (UID: \"a3b559c3-4e82-4897-9b5c-1816989fffcd\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-public-tls-certs\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3b559c3-4e82-4897-9b5c-1816989fffcd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.921500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-combined-ca-bundle\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.922893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-config-data\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.922950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-httpd-run\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.922971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-public-tls-certs\") pod \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.923025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzz7s\" (UniqueName: \"kubernetes.io/projected/1c00713b-bb97-4adf-82eb-8ff31b65d14b-kube-api-access-wzz7s\") pod \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\" (UID: \"1c00713b-bb97-4adf-82eb-8ff31b65d14b\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.923056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-combined-ca-bundle\") pod \"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\" (UID: 
\"eff12a6c-5d02-4e45-9e7e-e589b746f1dc\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.923077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"61223200-c7c1-4ee8-b38a-e6074d65114a\" (UID: \"61223200-c7c1-4ee8-b38a-e6074d65114a\") " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924104 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924121 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924139 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924148 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924158 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924166 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f74b5687-ee85-4b71-929c-cc6ac63ffd06-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924175 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5mg5\" (UniqueName: \"kubernetes.io/projected/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kube-api-access-l5mg5\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924184 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924192 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3b559c3-4e82-4897-9b5c-1816989fffcd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924201 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74b5687-ee85-4b71-929c-cc6ac63ffd06-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.924218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-logs" (OuterVolumeSpecName: "logs") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.925324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-kube-api-access-sc48n" (OuterVolumeSpecName: "kube-api-access-sc48n") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "kube-api-access-sc48n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.929765 4707 scope.go:117] "RemoveContainer" containerID="528b18552f9a4743a109d4c0a9bde81e896224749b7ecb137a938d625978f1ba" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.930258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-scripts" (OuterVolumeSpecName: "scripts") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.930330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.930422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.938739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b559c3-4e82-4897-9b5c-1816989fffcd-logs" (OuterVolumeSpecName: "logs") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.942181 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.943001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.958967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.961022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-scripts" (OuterVolumeSpecName: "scripts") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.962144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-logs" (OuterVolumeSpecName: "logs") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.964914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c00713b-bb97-4adf-82eb-8ff31b65d14b-kube-api-access-wzz7s" (OuterVolumeSpecName: "kube-api-access-wzz7s") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "kube-api-access-wzz7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.965150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.972031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61223200-c7c1-4ee8-b38a-e6074d65114a-kube-api-access-x5gq4" (OuterVolumeSpecName: "kube-api-access-x5gq4") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "kube-api-access-x5gq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.973478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b559c3-4e82-4897-9b5c-1816989fffcd-kube-api-access-lt6x8" (OuterVolumeSpecName: "kube-api-access-lt6x8") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "kube-api-access-lt6x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.973565 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_6d840a57-4237-4336-9882-01a52c8a2c09/ovn-northd/0.log" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.973622 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.974188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.985724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-scripts" (OuterVolumeSpecName: "scripts") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:14 crc kubenswrapper[4707]: I0121 16:06:14.994208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-scripts" (OuterVolumeSpecName: "scripts") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.009005 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.014556 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bd81-account-create-update-g6mbv"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-config\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sjgd\" (UniqueName: \"kubernetes.io/projected/4af60f39-57ca-4595-be87-1d1a0c869d72-kube-api-access-7sjgd\") pod \"4af60f39-57ca-4595-be87-1d1a0c869d72\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data\") pod \"f58856a6-ee91-4aee-874e-118980038628\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58856a6-ee91-4aee-874e-118980038628-logs\") pod \"f58856a6-ee91-4aee-874e-118980038628\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-scripts\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data-custom\") pod \"f58856a6-ee91-4aee-874e-118980038628\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c892fe45-66a3-4ddf-b0db-b50df71c6493-logs\") pod \"c892fe45-66a3-4ddf-b0db-b50df71c6493\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-metrics-certs-tls-certs\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhhwt\" (UniqueName: \"kubernetes.io/projected/6d840a57-4237-4336-9882-01a52c8a2c09-kube-api-access-vhhwt\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data\") pod \"c892fe45-66a3-4ddf-b0db-b50df71c6493\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-combined-ca-bundle\") pod \"4af60f39-57ca-4595-be87-1d1a0c869d72\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-combined-ca-bundle\") pod \"f58856a6-ee91-4aee-874e-118980038628\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpjf7\" (UniqueName: \"kubernetes.io/projected/f58856a6-ee91-4aee-874e-118980038628-kube-api-access-mpjf7\") pod \"f58856a6-ee91-4aee-874e-118980038628\" (UID: \"f58856a6-ee91-4aee-874e-118980038628\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-northd-tls-certs\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.036966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27s6q\" (UniqueName: \"kubernetes.io/projected/c892fe45-66a3-4ddf-b0db-b50df71c6493-kube-api-access-27s6q\") pod \"c892fe45-66a3-4ddf-b0db-b50df71c6493\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-rundir\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-combined-ca-bundle\") pod \"6d840a57-4237-4336-9882-01a52c8a2c09\" (UID: \"6d840a57-4237-4336-9882-01a52c8a2c09\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data-custom\") pod \"c892fe45-66a3-4ddf-b0db-b50df71c6493\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-config-data\") pod \"4af60f39-57ca-4595-be87-1d1a0c869d72\" (UID: \"4af60f39-57ca-4595-be87-1d1a0c869d72\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-combined-ca-bundle\") pod \"c892fe45-66a3-4ddf-b0db-b50df71c6493\" (UID: \"c892fe45-66a3-4ddf-b0db-b50df71c6493\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037873 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037915 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037926 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc48n\" (UniqueName: \"kubernetes.io/projected/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-kube-api-access-sc48n\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037944 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037953 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037962 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037970 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b559c3-4e82-4897-9b5c-1816989fffcd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.037996 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt6x8\" (UniqueName: \"kubernetes.io/projected/a3b559c3-4e82-4897-9b5c-1816989fffcd-kube-api-access-lt6x8\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038005 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5gq4\" (UniqueName: 
\"kubernetes.io/projected/61223200-c7c1-4ee8-b38a-e6074d65114a-kube-api-access-x5gq4\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038013 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038020 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c00713b-bb97-4adf-82eb-8ff31b65d14b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038028 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038036 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61223200-c7c1-4ee8-b38a-e6074d65114a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038045 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzz7s\" (UniqueName: \"kubernetes.io/projected/1c00713b-bb97-4adf-82eb-8ff31b65d14b-kube-api-access-wzz7s\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038077 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58856a6-ee91-4aee-874e-118980038628-logs" (OuterVolumeSpecName: "logs") pod "f58856a6-ee91-4aee-874e-118980038628" (UID: "f58856a6-ee91-4aee-874e-118980038628"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038087 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.038766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-scripts" (OuterVolumeSpecName: "scripts") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.039009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.039282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c892fe45-66a3-4ddf-b0db-b50df71c6493-logs" (OuterVolumeSpecName: "logs") pod "c892fe45-66a3-4ddf-b0db-b50df71c6493" (UID: "c892fe45-66a3-4ddf-b0db-b50df71c6493"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.039477 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.039594 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts podName:2b620e1f-426b-4363-ad99-f56ba333fbfd nodeName:}" failed. No retries permitted until 2026-01-21 16:06:17.039578523 +0000 UTC m=+3874.221094745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts") pod "root-account-create-update-kw7d7" (UID: "2b620e1f-426b-4363-ad99-f56ba333fbfd") : configmap "openstack-scripts" not found Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.045298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-config" (OuterVolumeSpecName: "config") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.050646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d840a57-4237-4336-9882-01a52c8a2c09-kube-api-access-vhhwt" (OuterVolumeSpecName: "kube-api-access-vhhwt") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "kube-api-access-vhhwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.051292 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.056492 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-488e-account-create-update-6kvnh"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.060447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f58856a6-ee91-4aee-874e-118980038628" (UID: "f58856a6-ee91-4aee-874e-118980038628"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.062176 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.079840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c892fe45-66a3-4ddf-b0db-b50df71c6493-kube-api-access-27s6q" (OuterVolumeSpecName: "kube-api-access-27s6q") pod "c892fe45-66a3-4ddf-b0db-b50df71c6493" (UID: "c892fe45-66a3-4ddf-b0db-b50df71c6493"). InnerVolumeSpecName "kube-api-access-27s6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.079882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c892fe45-66a3-4ddf-b0db-b50df71c6493" (UID: "c892fe45-66a3-4ddf-b0db-b50df71c6493"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.083324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58856a6-ee91-4aee-874e-118980038628-kube-api-access-mpjf7" (OuterVolumeSpecName: "kube-api-access-mpjf7") pod "f58856a6-ee91-4aee-874e-118980038628" (UID: "f58856a6-ee91-4aee-874e-118980038628"). InnerVolumeSpecName "kube-api-access-mpjf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.087177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af60f39-57ca-4595-be87-1d1a0c869d72-kube-api-access-7sjgd" (OuterVolumeSpecName: "kube-api-access-7sjgd") pod "4af60f39-57ca-4595-be87-1d1a0c869d72" (UID: "4af60f39-57ca-4595-be87-1d1a0c869d72"). InnerVolumeSpecName "kube-api-access-7sjgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.088587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.090597 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.096782 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.099957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.111522 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.114934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data" (OuterVolumeSpecName: "config-data") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.116485 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.118297 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140631 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669fe6b4-a99b-41bb-a7f9-cade7b712e87-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140657 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sjgd\" (UniqueName: \"kubernetes.io/projected/4af60f39-57ca-4595-be87-1d1a0c869d72-kube-api-access-7sjgd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140667 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140674 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58856a6-ee91-4aee-874e-118980038628-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140682 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140691 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140699 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892fe45-66a3-4ddf-b0db-b50df71c6493-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140707 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140714 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140722 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140730 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhhwt\" (UniqueName: \"kubernetes.io/projected/6d840a57-4237-4336-9882-01a52c8a2c09-kube-api-access-vhhwt\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140738 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpjf7\" (UniqueName: \"kubernetes.io/projected/f58856a6-ee91-4aee-874e-118980038628-kube-api-access-mpjf7\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140745 4707 reconciler_common.go:293] "Volume detached for volume 
\"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140753 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27s6q\" (UniqueName: \"kubernetes.io/projected/c892fe45-66a3-4ddf-b0db-b50df71c6493-kube-api-access-27s6q\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140762 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140769 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140777 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140785 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840a57-4237-4336-9882-01a52c8a2c09-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.140792 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9hlc\" (UniqueName: \"kubernetes.io/projected/669fe6b4-a99b-41bb-a7f9-cade7b712e87-kube-api-access-l9hlc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.148247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.185571 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.185880 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.190160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f58856a6-ee91-4aee-874e-118980038628" (UID: "f58856a6-ee91-4aee-874e-118980038628"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.192291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.193410 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273f261e-d7a5-4025-91a6-f9e102be79de" path="/var/lib/kubelet/pods/273f261e-d7a5-4025-91a6-f9e102be79de/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.193994 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" path="/var/lib/kubelet/pods/2772c54d-baf5-4cd4-8677-0db11b97c5ef/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.194529 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40470182-aaeb-43cd-b058-942f0ebe2483" path="/var/lib/kubelet/pods/40470182-aaeb-43cd-b058-942f0ebe2483/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.194975 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f9818d-d674-41dc-8f81-7095412fd946" path="/var/lib/kubelet/pods/44f9818d-d674-41dc-8f81-7095412fd946/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.195687 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491a0b28-d1e3-4359-af60-d6fb1767153f" path="/var/lib/kubelet/pods/491a0b28-d1e3-4359-af60-d6fb1767153f/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.195750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af60f39-57ca-4595-be87-1d1a0c869d72" (UID: "4af60f39-57ca-4595-be87-1d1a0c869d72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.196276 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" path="/var/lib/kubelet/pods/4b535454-f068-42cd-9817-a23bd7d7aa4d/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.197062 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62fd99ff-fa07-4608-8a30-69ce0bbe814b" path="/var/lib/kubelet/pods/62fd99ff-fa07-4608-8a30-69ce0bbe814b/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.197567 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669fe6b4-a99b-41bb-a7f9-cade7b712e87" path="/var/lib/kubelet/pods/669fe6b4-a99b-41bb-a7f9-cade7b712e87/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.199737 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af371a2-cf51-495d-bd0e-a5561b6b9d7c" path="/var/lib/kubelet/pods/6af371a2-cf51-495d-bd0e-a5561b6b9d7c/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.200085 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="878bb642-989b-4ea2-825e-514263f9545a" path="/var/lib/kubelet/pods/878bb642-989b-4ea2-825e-514263f9545a/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.200165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c892fe45-66a3-4ddf-b0db-b50df71c6493" (UID: "c892fe45-66a3-4ddf-b0db-b50df71c6493"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.203164 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884872ef-7ca2-4262-9561-440673182eba" path="/var/lib/kubelet/pods/884872ef-7ca2-4262-9561-440673182eba/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.203855 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ab5a8f-7bbf-449b-b37e-ba4d800237cf" path="/var/lib/kubelet/pods/b0ab5a8f-7bbf-449b-b37e-ba4d800237cf/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.205354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.206036 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c62ea6-096a-4f1a-9326-e14e3ce91911" path="/var/lib/kubelet/pods/b9c62ea6-096a-4f1a-9326-e14e3ce91911/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.206565 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3838495-e77f-4884-b7e3-ca30b8c6e13e" path="/var/lib/kubelet/pods/f3838495-e77f-4884-b7e3-ca30b8c6e13e/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.206973 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66adee9-b1b8-4d17-b204-58e840cf62b3" path="/var/lib/kubelet/pods/f66adee9-b1b8-4d17-b204-58e840cf62b3/volumes" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.207238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-config-data" (OuterVolumeSpecName: "config-data") pod "4af60f39-57ca-4595-be87-1d1a0c869d72" (UID: "4af60f39-57ca-4595-be87-1d1a0c869d72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.210753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a3b559c3-4e82-4897-9b5c-1816989fffcd" (UID: "a3b559c3-4e82-4897-9b5c-1816989fffcd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.215721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data" (OuterVolumeSpecName: "config-data") pod "c892fe45-66a3-4ddf-b0db-b50df71c6493" (UID: "c892fe45-66a3-4ddf-b0db-b50df71c6493"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.231039 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.239283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.240583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data" (OuterVolumeSpecName: "config-data") pod "f58856a6-ee91-4aee-874e-118980038628" (UID: "f58856a6-ee91-4aee-874e-118980038628"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.241878 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25v2r\" (UniqueName: \"kubernetes.io/projected/2b620e1f-426b-4363-ad99-f56ba333fbfd-kube-api-access-25v2r\") pod \"2b620e1f-426b-4363-ad99-f56ba333fbfd\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.241925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-public-tls-certs\") pod \"a1b70303-f297-4a57-b252-7bdc251fa8ef\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.241962 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg2nj\" (UniqueName: \"kubernetes.io/projected/425c52a3-93a2-49fc-82d9-d7d86f381335-kube-api-access-hg2nj\") pod \"425c52a3-93a2-49fc-82d9-d7d86f381335\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-combined-ca-bundle\") pod \"a1b70303-f297-4a57-b252-7bdc251fa8ef\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-combined-ca-bundle\") pod \"425c52a3-93a2-49fc-82d9-d7d86f381335\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-config-data\") pod \"a1b70303-f297-4a57-b252-7bdc251fa8ef\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts\") pod \"2b620e1f-426b-4363-ad99-f56ba333fbfd\" (UID: \"2b620e1f-426b-4363-ad99-f56ba333fbfd\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425c52a3-93a2-49fc-82d9-d7d86f381335-logs\") pod \"425c52a3-93a2-49fc-82d9-d7d86f381335\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-nova-metadata-tls-certs\") pod \"425c52a3-93a2-49fc-82d9-d7d86f381335\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b70303-f297-4a57-b252-7bdc251fa8ef-logs\") pod \"a1b70303-f297-4a57-b252-7bdc251fa8ef\" 
(UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4cxm\" (UniqueName: \"kubernetes.io/projected/a1b70303-f297-4a57-b252-7bdc251fa8ef-kube-api-access-d4cxm\") pod \"a1b70303-f297-4a57-b252-7bdc251fa8ef\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-config-data\") pod \"425c52a3-93a2-49fc-82d9-d7d86f381335\" (UID: \"425c52a3-93a2-49fc-82d9-d7d86f381335\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-internal-tls-certs\") pod \"a1b70303-f297-4a57-b252-7bdc251fa8ef\" (UID: \"a1b70303-f297-4a57-b252-7bdc251fa8ef\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242612 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242630 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242639 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242648 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242657 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242665 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58856a6-ee91-4aee-874e-118980038628-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242672 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242680 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242688 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 
16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242696 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af60f39-57ca-4595-be87-1d1a0c869d72-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242703 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892fe45-66a3-4ddf-b0db-b50df71c6493-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.242711 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b559c3-4e82-4897-9b5c-1816989fffcd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.243522 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b70303-f297-4a57-b252-7bdc251fa8ef-logs" (OuterVolumeSpecName: "logs") pod "a1b70303-f297-4a57-b252-7bdc251fa8ef" (UID: "a1b70303-f297-4a57-b252-7bdc251fa8ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.243851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.244035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425c52a3-93a2-49fc-82d9-d7d86f381335-logs" (OuterVolumeSpecName: "logs") pod "425c52a3-93a2-49fc-82d9-d7d86f381335" (UID: "425c52a3-93a2-49fc-82d9-d7d86f381335"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.244167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b620e1f-426b-4363-ad99-f56ba333fbfd" (UID: "2b620e1f-426b-4363-ad99-f56ba333fbfd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.255125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-config-data" (OuterVolumeSpecName: "config-data") pod "1c00713b-bb97-4adf-82eb-8ff31b65d14b" (UID: "1c00713b-bb97-4adf-82eb-8ff31b65d14b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.255423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.256182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b620e1f-426b-4363-ad99-f56ba333fbfd-kube-api-access-25v2r" (OuterVolumeSpecName: "kube-api-access-25v2r") pod "2b620e1f-426b-4363-ad99-f56ba333fbfd" (UID: "2b620e1f-426b-4363-ad99-f56ba333fbfd"). InnerVolumeSpecName "kube-api-access-25v2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.269300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-config-data" (OuterVolumeSpecName: "config-data") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.271624 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6d840a57-4237-4336-9882-01a52c8a2c09" (UID: "6d840a57-4237-4336-9882-01a52c8a2c09"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.275797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425c52a3-93a2-49fc-82d9-d7d86f381335-kube-api-access-hg2nj" (OuterVolumeSpecName: "kube-api-access-hg2nj") pod "425c52a3-93a2-49fc-82d9-d7d86f381335" (UID: "425c52a3-93a2-49fc-82d9-d7d86f381335"). InnerVolumeSpecName "kube-api-access-hg2nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.275843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.275849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b70303-f297-4a57-b252-7bdc251fa8ef-kube-api-access-d4cxm" (OuterVolumeSpecName: "kube-api-access-d4cxm") pod "a1b70303-f297-4a57-b252-7bdc251fa8ef" (UID: "a1b70303-f297-4a57-b252-7bdc251fa8ef"). InnerVolumeSpecName "kube-api-access-d4cxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.283152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-config-data" (OuterVolumeSpecName: "config-data") pod "a1b70303-f297-4a57-b252-7bdc251fa8ef" (UID: "a1b70303-f297-4a57-b252-7bdc251fa8ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.289079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-config-data" (OuterVolumeSpecName: "config-data") pod "425c52a3-93a2-49fc-82d9-d7d86f381335" (UID: "425c52a3-93a2-49fc-82d9-d7d86f381335"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.291014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-config-data" (OuterVolumeSpecName: "config-data") pod "61223200-c7c1-4ee8-b38a-e6074d65114a" (UID: "61223200-c7c1-4ee8-b38a-e6074d65114a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.291666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1b70303-f297-4a57-b252-7bdc251fa8ef" (UID: "a1b70303-f297-4a57-b252-7bdc251fa8ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.293102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425c52a3-93a2-49fc-82d9-d7d86f381335" (UID: "425c52a3-93a2-49fc-82d9-d7d86f381335"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.299270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "425c52a3-93a2-49fc-82d9-d7d86f381335" (UID: "425c52a3-93a2-49fc-82d9-d7d86f381335"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.301425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a1b70303-f297-4a57-b252-7bdc251fa8ef" (UID: "a1b70303-f297-4a57-b252-7bdc251fa8ef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.313252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a1b70303-f297-4a57-b252-7bdc251fa8ef" (UID: "a1b70303-f297-4a57-b252-7bdc251fa8ef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.335205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eff12a6c-5d02-4e45-9e7e-e589b746f1dc" (UID: "eff12a6c-5d02-4e45-9e7e-e589b746f1dc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344111 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344135 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344145 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b620e1f-426b-4363-ad99-f56ba333fbfd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344155 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425c52a3-93a2-49fc-82d9-d7d86f381335-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344163 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344172 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b70303-f297-4a57-b252-7bdc251fa8ef-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344179 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344187 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344197 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4cxm\" (UniqueName: \"kubernetes.io/projected/a1b70303-f297-4a57-b252-7bdc251fa8ef-kube-api-access-d4cxm\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344204 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344212 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344220 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344229 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d840a57-4237-4336-9882-01a52c8a2c09-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 
crc kubenswrapper[4707]: I0121 16:06:15.344237 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff12a6c-5d02-4e45-9e7e-e589b746f1dc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344245 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25v2r\" (UniqueName: \"kubernetes.io/projected/2b620e1f-426b-4363-ad99-f56ba333fbfd-kube-api-access-25v2r\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344253 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344260 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg2nj\" (UniqueName: \"kubernetes.io/projected/425c52a3-93a2-49fc-82d9-d7d86f381335-kube-api-access-hg2nj\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344268 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61223200-c7c1-4ee8-b38a-e6074d65114a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344275 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b70303-f297-4a57-b252-7bdc251fa8ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344283 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c00713b-bb97-4adf-82eb-8ff31b65d14b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.344290 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425c52a3-93a2-49fc-82d9-d7d86f381335-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.461497 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.519381 4707 generic.go:334] "Generic (PLEG): container finished" podID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerID="8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67" exitCode=0 Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.519430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" event={"ID":"329b3e52-9792-42da-a6f0-9b0fc5fb866c","Type":"ContainerDied","Data":"8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.519450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" event={"ID":"329b3e52-9792-42da-a6f0-9b0fc5fb866c","Type":"ContainerDied","Data":"e630e27749f0744751134435a1f2622edee5b04f5294b9423bd95eb91a1b775b"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.519466 4707 scope.go:117] "RemoveContainer" containerID="8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.519557 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-65b994fc4-xql9t" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.521747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"a3b559c3-4e82-4897-9b5c-1816989fffcd","Type":"ContainerDied","Data":"a7d24ac7fe729e19034129c26583a2e49624988d1fa4236b71701e7e7314a6cb"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.521892 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.525213 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_6d840a57-4237-4336-9882-01a52c8a2c09/ovn-northd/0.log" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.525243 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d840a57-4237-4336-9882-01a52c8a2c09" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" exitCode=139 Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.525285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6d840a57-4237-4336-9882-01a52c8a2c09","Type":"ContainerDied","Data":"77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.525301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"6d840a57-4237-4336-9882-01a52c8a2c09","Type":"ContainerDied","Data":"7cdba8929411bf5f77c2cfbbb4e46f35b0508c5a0157c7e3cf760c65407d5be9"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.525341 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.537196 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.537585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld" event={"ID":"c892fe45-66a3-4ddf-b0db-b50df71c6493","Type":"ContainerDied","Data":"f540632b4ec7cb92d49b5808b375c6c827924357bbf777fc5cfeed459ef2f207"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.539235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" event={"ID":"f58856a6-ee91-4aee-874e-118980038628","Type":"ContainerDied","Data":"9d1ddccc903bfca7efd8d342f1062183d9a041c519e23916f08d1983b65772c6"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.539295 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.540318 4707 scope.go:117] "RemoveContainer" containerID="192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.541892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" event={"ID":"2b620e1f-426b-4363-ad99-f56ba333fbfd","Type":"ContainerDied","Data":"a238d128e70958c0ba0c4a4ae2ac1ef76b53970869e6b8a51d32db231151b274"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.541950 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-kw7d7" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.548911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b3e52-9792-42da-a6f0-9b0fc5fb866c-logs\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.548982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data-custom\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.549006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-public-tls-certs\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.549045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.549104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glxg5\" (UniqueName: \"kubernetes.io/projected/329b3e52-9792-42da-a6f0-9b0fc5fb866c-kube-api-access-glxg5\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.549126 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-internal-tls-certs\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.549153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-combined-ca-bundle\") pod \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\" (UID: \"329b3e52-9792-42da-a6f0-9b0fc5fb866c\") " Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.551105 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329b3e52-9792-42da-a6f0-9b0fc5fb866c-logs" (OuterVolumeSpecName: "logs") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.563649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329b3e52-9792-42da-a6f0-9b0fc5fb866c-kube-api-access-glxg5" (OuterVolumeSpecName: "kube-api-access-glxg5") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "kube-api-access-glxg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.563717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c00713b-bb97-4adf-82eb-8ff31b65d14b","Type":"ContainerDied","Data":"702e397dcca94c3668336bb52f4bff29018f7892effd5441b94c8d2199f22f0a"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.563774 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.564722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.570942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"425c52a3-93a2-49fc-82d9-d7d86f381335","Type":"ContainerDied","Data":"d18433085f15d52dffb3af4e5e2caea7ff2d5749b988e012f1b428f0d7983a99"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.571098 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.580008 4707 generic.go:334] "Generic (PLEG): container finished" podID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" exitCode=0 Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.580096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerDied","Data":"a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.580130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4af60f39-57ca-4595-be87-1d1a0c869d72","Type":"ContainerDied","Data":"68a32fa6a5dba99ec037bbd9966099a20c4339ff3e5070a7193d05f0305f1347"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.580257 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.580464 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.582334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.582777 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.582796 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerID="5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e" exitCode=0 Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.582846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a1b70303-f297-4a57-b252-7bdc251fa8ef","Type":"ContainerDied","Data":"5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.582886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a1b70303-f297-4a57-b252-7bdc251fa8ef","Type":"ContainerDied","Data":"c846cb49fa8e8fd526b07a61e1e8214e73481ec9a3990c74f86786779a294f98"} Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583001 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583081 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583167 4707 scope.go:117] "RemoveContainer" containerID="8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583194 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-cf495cb46-rcdjz" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.583417 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67\": container with ID starting with 8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67 not found: ID does not exist" containerID="8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583444 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67"} err="failed to get container status \"8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67\": rpc error: code = NotFound desc = could not find container \"8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67\": container with ID starting with 8bf56eb521f9f1a96d28dd6977d92a243a78e6ab93bd86b9bb4ba46d8429ed67 not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583460 4707 scope.go:117] "RemoveContainer" containerID="192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.583850 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887\": container with ID starting with 192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887 not found: ID does not exist" containerID="192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583905 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887"} err="failed to get container status \"192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887\": rpc error: code = NotFound desc = could not find container \"192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887\": container with ID starting with 192678f59dab89462197a4cb258559df1c2e4dc76caba17159c9e3adea7f0887 not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.583927 4707 scope.go:117] "RemoveContainer" containerID="030c52d40b8a7c37e53cff77e05fa947476307c66e7a847d083b6b3c32443d7f" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.598902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.598934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.599931 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.607115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.610023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data" (OuterVolumeSpecName: "config-data") pod "329b3e52-9792-42da-a6f0-9b0fc5fb866c" (UID: "329b3e52-9792-42da-a6f0-9b0fc5fb866c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.616798 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.624122 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.628580 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7457d87b7c-v8nld"] Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.636685 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.637528 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.638452 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.638481 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.643438 4707 scope.go:117] "RemoveContainer" containerID="5e7ec7f63a2fc9f2128d4f3cf1c08e92faaab92b0eaa704e50baa0f7c8e983bc" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653720 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653799 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653841 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b3e52-9792-42da-a6f0-9b0fc5fb866c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653855 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653863 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653872 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b3e52-9792-42da-a6f0-9b0fc5fb866c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.653882 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glxg5\" (UniqueName: \"kubernetes.io/projected/329b3e52-9792-42da-a6f0-9b0fc5fb866c-kube-api-access-glxg5\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.662242 4707 scope.go:117] "RemoveContainer" containerID="b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.665035 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.674133 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-68f4cbbf8-p929v"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.682281 4707 scope.go:117] "RemoveContainer" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.683114 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.696870 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.702242 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.702441 4707 scope.go:117] "RemoveContainer" containerID="b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.703091 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6\": container with ID starting with b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6 not found: ID does not exist" containerID="b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.703122 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6"} err="failed to 
get container status \"b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6\": rpc error: code = NotFound desc = could not find container \"b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6\": container with ID starting with b3861effe589c1f13c4b7d4528573d445e08c048024849b2fe040106e195f7f6 not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.703139 4707 scope.go:117] "RemoveContainer" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.704710 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261\": container with ID starting with 77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261 not found: ID does not exist" containerID="77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.704732 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261"} err="failed to get container status \"77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261\": rpc error: code = NotFound desc = could not find container \"77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261\": container with ID starting with 77f8fc0d55d176231ef535e8b2ce8ee8105ca8019ecf295e99b17f61c500d261 not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.704746 4707 scope.go:117] "RemoveContainer" containerID="73b244ba7031f16385e7963d008d81acbd17ba6ba2ee40f7b1bdf960c503b84b" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.708825 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.713896 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kw7d7"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.717691 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-kw7d7"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.720025 4707 scope.go:117] "RemoveContainer" containerID="58be21a4ad10793fe38a4fd7bab7b968f3bb33396de3d91f81bf652728f3cc32" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.723601 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.729840 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.737366 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.741099 4707 scope.go:117] "RemoveContainer" containerID="5e4bd3a5a9bcc7a9caf760e253e0ea48824f56e8b8b18cd49710b14d45577c27" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.741583 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.745413 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:06:15 crc 
kubenswrapper[4707]: I0121 16:06:15.749180 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.752941 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.754493 4707 scope.go:117] "RemoveContainer" containerID="a7d885ed9569b9160b3891c8377f8b2c583f8d5d64b28744210b903262b86e4d" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.756845 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.760493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-cf495cb46-rcdjz"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.764223 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-cf495cb46-rcdjz"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.766408 4707 scope.go:117] "RemoveContainer" containerID="b92d7390d144ce91eac5fff590d199d75cae8e6be0b2fd4b6cf081f59dbf0240" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.779565 4707 scope.go:117] "RemoveContainer" containerID="e161f3943e924491a50c0f1e70e67b3a00296b6236a4e22683928578bbfa1875" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.795398 4707 scope.go:117] "RemoveContainer" containerID="798d6d03c5f9d0f6851d346fe3cbe26f854510991f5d3c2cad7db678016a87fb" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.811955 4707 scope.go:117] "RemoveContainer" containerID="7cfc0f593e80d12c7b3d86938139b541aeeef5f717713f7020df220d37653d45" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.824290 4707 scope.go:117] "RemoveContainer" containerID="170b32ec6e5a16b7ab4b852d6a2315b320618427d7554c7752c0c21ea1350ffa" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.835510 4707 scope.go:117] "RemoveContainer" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.847783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-65b994fc4-xql9t"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.852219 4707 scope.go:117] "RemoveContainer" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.853235 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-65b994fc4-xql9t"] Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.873888 4707 scope.go:117] "RemoveContainer" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.874154 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e\": container with ID starting with a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e not found: ID does not exist" containerID="a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.874190 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e"} err="failed to get container status 
\"a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e\": rpc error: code = NotFound desc = could not find container \"a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e\": container with ID starting with a190186a9d8bf7fbf52e1950c44036e878bd8786c0448e81d1425ae7d25d748e not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.874215 4707 scope.go:117] "RemoveContainer" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.874426 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a\": container with ID starting with 658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a not found: ID does not exist" containerID="658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.874446 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a"} err="failed to get container status \"658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a\": rpc error: code = NotFound desc = could not find container \"658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a\": container with ID starting with 658a7e3350a0d20df215c7066985d8e8837ec371de9fef1028423d60e6686a0a not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.874458 4707 scope.go:117] "RemoveContainer" containerID="5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.887681 4707 scope.go:117] "RemoveContainer" containerID="23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.899679 4707 scope.go:117] "RemoveContainer" containerID="5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.899961 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e\": container with ID starting with 5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e not found: ID does not exist" containerID="5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.899992 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e"} err="failed to get container status \"5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e\": rpc error: code = NotFound desc = could not find container \"5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e\": container with ID starting with 5cf4c4e79183160e42f0ae62cc2c7fcd443d226731a2518fff5a11ee83db631e not found: ID does not exist" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.900011 4707 scope.go:117] "RemoveContainer" containerID="23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd" Jan 21 16:06:15 crc kubenswrapper[4707]: E0121 16:06:15.900287 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd\": container with ID starting with 23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd not found: ID does not exist" containerID="23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd" Jan 21 16:06:15 crc kubenswrapper[4707]: I0121 16:06:15.900320 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd"} err="failed to get container status \"23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd\": rpc error: code = NotFound desc = could not find container \"23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd\": container with ID starting with 23dfc90a865a933cbbb48dffdf11c558655e6f6c2b2ad6d24afd9b76000c46dd not found: ID does not exist" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.380695 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-public-tls-certs\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-credential-keys\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-combined-ca-bundle\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-internal-tls-certs\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6w8\" (UniqueName: \"kubernetes.io/projected/113f5252-dcd0-4719-8bd9-11800c33a828-kube-api-access-pd6w8\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-config-data\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-scripts\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 
16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.464471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-fernet-keys\") pod \"113f5252-dcd0-4719-8bd9-11800c33a828\" (UID: \"113f5252-dcd0-4719-8bd9-11800c33a828\") " Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.477246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.477750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113f5252-dcd0-4719-8bd9-11800c33a828-kube-api-access-pd6w8" (OuterVolumeSpecName: "kube-api-access-pd6w8") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "kube-api-access-pd6w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.478661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-scripts" (OuterVolumeSpecName: "scripts") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.483313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.500120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.500422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-config-data" (OuterVolumeSpecName: "config-data") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.504163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.514821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "113f5252-dcd0-4719-8bd9-11800c33a828" (UID: "113f5252-dcd0-4719-8bd9-11800c33a828"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.565984 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566095 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566163 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566224 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566308 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566376 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6w8\" (UniqueName: \"kubernetes.io/projected/113f5252-dcd0-4719-8bd9-11800c33a828-kube-api-access-pd6w8\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566465 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.566522 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113f5252-dcd0-4719-8bd9-11800c33a828-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.601593 4707 generic.go:334] "Generic (PLEG): container finished" podID="113f5252-dcd0-4719-8bd9-11800c33a828" containerID="741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6" exitCode=0 Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.601684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" event={"ID":"113f5252-dcd0-4719-8bd9-11800c33a828","Type":"ContainerDied","Data":"741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6"} Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.601721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" event={"ID":"113f5252-dcd0-4719-8bd9-11800c33a828","Type":"ContainerDied","Data":"c6ad8c6493470ecd8264485c27ce2557f540b78ba288b48c5dcd5c898f8df4d5"} Jan 21 16:06:16 crc 
kubenswrapper[4707]: I0121 16:06:16.601741 4707 scope.go:117] "RemoveContainer" containerID="741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.601914 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-9fc8c7fb9-vltft" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.627512 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-9fc8c7fb9-vltft"] Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.629943 4707 scope.go:117] "RemoveContainer" containerID="741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6" Jan 21 16:06:16 crc kubenswrapper[4707]: E0121 16:06:16.630324 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6\": container with ID starting with 741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6 not found: ID does not exist" containerID="741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.630362 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6"} err="failed to get container status \"741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6\": rpc error: code = NotFound desc = could not find container \"741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6\": container with ID starting with 741b11e01782d99ebcae46ebf7fd800348f75ee6b716675b0fcd8b51264aeab6 not found: ID does not exist" Jan 21 16:06:16 crc kubenswrapper[4707]: I0121 16:06:16.631513 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-9fc8c7fb9-vltft"] Jan 21 16:06:16 crc kubenswrapper[4707]: E0121 16:06:16.770034 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:06:16 crc kubenswrapper[4707]: E0121 16:06:16.770095 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data podName:50ded377-6ace-4f39-8a60-bee575f7e8cc nodeName:}" failed. No retries permitted until 2026-01-21 16:06:24.770081686 +0000 UTC m=+3881.951597908 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data") pod "rabbitmq-server-0" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc") : configmap "rabbitmq-config-data" not found Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.190924 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113f5252-dcd0-4719-8bd9-11800c33a828" path="/var/lib/kubelet/pods/113f5252-dcd0-4719-8bd9-11800c33a828/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.191492 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" path="/var/lib/kubelet/pods/1c00713b-bb97-4adf-82eb-8ff31b65d14b/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.192023 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" path="/var/lib/kubelet/pods/2b620e1f-426b-4363-ad99-f56ba333fbfd/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.192935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" path="/var/lib/kubelet/pods/329b3e52-9792-42da-a6f0-9b0fc5fb866c/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.193450 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" path="/var/lib/kubelet/pods/425c52a3-93a2-49fc-82d9-d7d86f381335/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.193964 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" path="/var/lib/kubelet/pods/4af60f39-57ca-4595-be87-1d1a0c869d72/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.194934 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" path="/var/lib/kubelet/pods/61223200-c7c1-4ee8-b38a-e6074d65114a/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.195712 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" path="/var/lib/kubelet/pods/6d840a57-4237-4336-9882-01a52c8a2c09/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.196879 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" path="/var/lib/kubelet/pods/a1b70303-f297-4a57-b252-7bdc251fa8ef/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.197385 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" path="/var/lib/kubelet/pods/a3b559c3-4e82-4897-9b5c-1816989fffcd/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.198187 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" path="/var/lib/kubelet/pods/c892fe45-66a3-4ddf-b0db-b50df71c6493/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.199085 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" path="/var/lib/kubelet/pods/eff12a6c-5d02-4e45-9e7e-e589b746f1dc/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.199598 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58856a6-ee91-4aee-874e-118980038628" path="/var/lib/kubelet/pods/f58856a6-ee91-4aee-874e-118980038628/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: 
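The MountVolume.SetUp failure above (configmap "rabbitmq-config-data" not found, retried no sooner than 8s later) is governed by the kubelet's per-volume exponential backoff: each consecutive failure roughly doubles durationBeforeRetry until a cap is reached. The exact constants live in the kubelet's nested pending operations code; the sketch below only reproduces the doubling pattern, assuming a 500 ms initial delay and a cap of about two minutes, which is consistent with the 8s seen here but is illustrative rather than authoritative:

```python
from datetime import timedelta

# Illustrative doubling backoff, not the kubelet source: assume a 500 ms
# initial delay that doubles per failure, capped at roughly two minutes.
INITIAL = timedelta(milliseconds=500)
CAP = timedelta(minutes=2, seconds=2)

def backoff_schedule(failures: int):
    delay = INITIAL
    for attempt in range(1, failures + 1):
        yield attempt, delay
        delay = min(delay * 2, CAP)

for attempt, delay in backoff_schedule(8):
    print(f"failure {attempt}: retry after {delay.total_seconds():>6.1f}s")
# Under these assumed constants, failure 5 waits 8.0s, matching
# "durationBeforeRetry 8s" in the entries above.
```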
I0121 16:06:17.200652 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" path="/var/lib/kubelet/pods/f74b5687-ee85-4b71-929c-cc6ac63ffd06/volumes" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.442576 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-tls\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-erlang-cookie\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l66wc\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-kube-api-access-l66wc\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-confd\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50ded377-6ace-4f39-8a60-bee575f7e8cc-erlang-cookie-secret\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-plugins\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-plugins-conf\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: 
\"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-server-conf\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.580793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50ded377-6ace-4f39-8a60-bee575f7e8cc-pod-info\") pod \"50ded377-6ace-4f39-8a60-bee575f7e8cc\" (UID: \"50ded377-6ace-4f39-8a60-bee575f7e8cc\") " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.581103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.581326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.581570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.585091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/50ded377-6ace-4f39-8a60-bee575f7e8cc-pod-info" (OuterVolumeSpecName: "pod-info") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.585110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.585098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ded377-6ace-4f39-8a60-bee575f7e8cc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.603116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "persistence") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.604397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data" (OuterVolumeSpecName: "config-data") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.604776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-kube-api-access-l66wc" (OuterVolumeSpecName: "kube-api-access-l66wc") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "kube-api-access-l66wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.610225 4707 generic.go:334] "Generic (PLEG): container finished" podID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerID="d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0" exitCode=0 Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.610275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"50ded377-6ace-4f39-8a60-bee575f7e8cc","Type":"ContainerDied","Data":"d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0"} Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.610299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"50ded377-6ace-4f39-8a60-bee575f7e8cc","Type":"ContainerDied","Data":"0f47ed0c3f538d528d4f1deba5f82156807e5b65d757738993e37fefc5bb139f"} Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.610315 4707 scope.go:117] "RemoveContainer" containerID="d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.610416 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.621192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-server-conf" (OuterVolumeSpecName: "server-conf") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.634924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "50ded377-6ace-4f39-8a60-bee575f7e8cc" (UID: "50ded377-6ace-4f39-8a60-bee575f7e8cc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.654679 4707 scope.go:117] "RemoveContainer" containerID="322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.668038 4707 scope.go:117] "RemoveContainer" containerID="d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0" Jan 21 16:06:17 crc kubenswrapper[4707]: E0121 16:06:17.668354 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0\": container with ID starting with d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0 not found: ID does not exist" containerID="d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.668387 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0"} err="failed to get container status \"d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0\": rpc error: code = NotFound desc = could not find container \"d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0\": container with ID starting with d2c59c3b438c562735280fa033d56d59fad31cd8c1d384ff7af2537767b4dde0 not found: ID does not exist" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.668408 4707 scope.go:117] "RemoveContainer" containerID="322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab" Jan 21 16:06:17 crc kubenswrapper[4707]: E0121 16:06:17.668769 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab\": container with ID starting with 322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab not found: ID does not exist" containerID="322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.668792 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab"} err="failed to get container status \"322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab\": rpc error: code = NotFound desc = could not find container \"322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab\": container with ID starting with 322c7a62eb45e568a1f9309d1cf73839dfd6e632ba742292d4a6e99b8d8870ab not found: ID does not exist" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682649 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682674 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50ded377-6ace-4f39-8a60-bee575f7e8cc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682684 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc 
kubenswrapper[4707]: I0121 16:06:17.682710 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682721 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682729 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682737 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50ded377-6ace-4f39-8a60-bee575f7e8cc-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682745 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50ded377-6ace-4f39-8a60-bee575f7e8cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682753 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682761 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50ded377-6ace-4f39-8a60-bee575f7e8cc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.682770 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l66wc\" (UniqueName: \"kubernetes.io/projected/50ded377-6ace-4f39-8a60-bee575f7e8cc-kube-api-access-l66wc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.694646 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.783774 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.948213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:17 crc kubenswrapper[4707]: I0121 16:06:17.954117 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:17 crc kubenswrapper[4707]: E0121 16:06:17.990707 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:17 crc kubenswrapper[4707]: E0121 16:06:17.990768 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data podName:1c37d5e5-4ffd-412c-a93a-86cb6474735c nodeName:}" failed. No retries permitted until 2026-01-21 16:06:25.99075461 +0000 UTC m=+3883.172270832 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data") pod "rabbitmq-cell1-server-0" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.233746 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.394875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c37d5e5-4ffd-412c-a93a-86cb6474735c-erlang-cookie-secret\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-plugins-conf\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-confd\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395158 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-plugins\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-server-conf\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-tls\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-erlang-cookie\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc4kc\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-kube-api-access-pc4kc\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.395708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.396207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.396239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c37d5e5-4ffd-412c-a93a-86cb6474735c-pod-info\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.396283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data\") pod \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\" (UID: \"1c37d5e5-4ffd-412c-a93a-86cb6474735c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.396515 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.396539 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.396548 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.401008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1c37d5e5-4ffd-412c-a93a-86cb6474735c-pod-info" (OuterVolumeSpecName: "pod-info") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.401004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c37d5e5-4ffd-412c-a93a-86cb6474735c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.401025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-kube-api-access-pc4kc" (OuterVolumeSpecName: "kube-api-access-pc4kc") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "kube-api-access-pc4kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.407062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.415149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.418918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data" (OuterVolumeSpecName: "config-data") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.431082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-server-conf" (OuterVolumeSpecName: "server-conf") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.456186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1c37d5e5-4ffd-412c-a93a-86cb6474735c" (UID: "1c37d5e5-4ffd-412c-a93a-86cb6474735c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497686 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497705 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497715 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497723 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc4kc\" (UniqueName: \"kubernetes.io/projected/1c37d5e5-4ffd-412c-a93a-86cb6474735c-kube-api-access-pc4kc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497732 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c37d5e5-4ffd-412c-a93a-86cb6474735c-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497739 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c37d5e5-4ffd-412c-a93a-86cb6474735c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497760 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c37d5e5-4ffd-412c-a93a-86cb6474735c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.497788 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.505765 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.511629 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.589802 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-ceilometer-tls-certs\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-config-data\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-scripts\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-sg-core-conf-yaml\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz9tx\" (UniqueName: \"kubernetes.io/projected/c468129d-ce39-4ba9-970f-4ecc0bb1df07-kube-api-access-jz9tx\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-log-httpd\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-run-httpd\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-combined-ca-bundle\") pod \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\" (UID: \"c468129d-ce39-4ba9-970f-4ecc0bb1df07\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.598705 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.600576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.602540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.602855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c468129d-ce39-4ba9-970f-4ecc0bb1df07-kube-api-access-jz9tx" (OuterVolumeSpecName: "kube-api-access-jz9tx") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "kube-api-access-jz9tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.605611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-scripts" (OuterVolumeSpecName: "scripts") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.616909 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.617319 4707 generic.go:334] "Generic (PLEG): container finished" podID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" exitCode=0 Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.617371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerDied","Data":"b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed"} Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.617379 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.617394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c35a2e10-5d5d-44b3-b43e-9a1619f7105c","Type":"ContainerDied","Data":"20720a8d39c9d26c8b8fadb98b6fa4b1c16b2ed083603d1d60f4605e3481df11"} Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.617408 4707 scope.go:117] "RemoveContainer" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.620992 4707 generic.go:334] "Generic (PLEG): container finished" podID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerID="3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546" exitCode=0 Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.621035 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.621035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerDied","Data":"3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546"} Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.621129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c468129d-ce39-4ba9-970f-4ecc0bb1df07","Type":"ContainerDied","Data":"f0a9124f09597fe32a56baf849b8987f1c2eb61272881fa6720fe5476f77d3ca"} Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.623156 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerID="2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813" exitCode=0 Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.623222 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.623324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1c37d5e5-4ffd-412c-a93a-86cb6474735c","Type":"ContainerDied","Data":"2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813"} Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.623429 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"1c37d5e5-4ffd-412c-a93a-86cb6474735c","Type":"ContainerDied","Data":"557e2ea25d4d161d78cb6fb283e2d22644b492800db38b794bd7bd717f3c20d5"} Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.635566 4707 scope.go:117] "RemoveContainer" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.640472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.647897 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.653767 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.659089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.669542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-config-data" (OuterVolumeSpecName: "config-data") pod "c468129d-ce39-4ba9-970f-4ecc0bb1df07" (UID: "c468129d-ce39-4ba9-970f-4ecc0bb1df07"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.670181 4707 scope.go:117] "RemoveContainer" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.670500 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed\": container with ID starting with b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed not found: ID does not exist" containerID="b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.670539 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed"} err="failed to get container status \"b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed\": rpc error: code = NotFound desc = could not find container \"b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed\": container with ID starting with b4adccaaa3408ecd74d921dcb6618f1b7b16ab6ec456e7958647ee85aa48d4ed not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.670560 4707 scope.go:117] "RemoveContainer" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.670949 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4\": container with ID starting with 382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4 not found: ID does not exist" containerID="382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.670971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4"} err="failed to get container status \"382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4\": rpc error: code = NotFound desc = could not find container \"382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4\": container with ID starting with 382788d87d7d544857b12c5a30f5a8ab0fcb6d7f8ad4913e529c1d93587c0ee4 not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.670984 4707 scope.go:117] "RemoveContainer" containerID="f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.683031 4707 scope.go:117] "RemoveContainer" containerID="66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.695700 4707 scope.go:117] "RemoveContainer" containerID="3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.699975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-combined-ca-bundle\") pod \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700037 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-config-data\") pod \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8x8s\" (UniqueName: \"kubernetes.io/projected/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-kube-api-access-t8x8s\") pod \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\" (UID: \"c35a2e10-5d5d-44b3-b43e-9a1619f7105c\") " Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700475 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700492 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700502 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz9tx\" (UniqueName: \"kubernetes.io/projected/c468129d-ce39-4ba9-970f-4ecc0bb1df07-kube-api-access-jz9tx\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700510 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700518 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c468129d-ce39-4ba9-970f-4ecc0bb1df07-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700541 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700549 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.700556 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c468129d-ce39-4ba9-970f-4ecc0bb1df07-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.702353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-kube-api-access-t8x8s" (OuterVolumeSpecName: "kube-api-access-t8x8s") pod "c35a2e10-5d5d-44b3-b43e-9a1619f7105c" (UID: "c35a2e10-5d5d-44b3-b43e-9a1619f7105c"). InnerVolumeSpecName "kube-api-access-t8x8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.708291 4707 scope.go:117] "RemoveContainer" containerID="ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.712647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-config-data" (OuterVolumeSpecName: "config-data") pod "c35a2e10-5d5d-44b3-b43e-9a1619f7105c" (UID: "c35a2e10-5d5d-44b3-b43e-9a1619f7105c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.714633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c35a2e10-5d5d-44b3-b43e-9a1619f7105c" (UID: "c35a2e10-5d5d-44b3-b43e-9a1619f7105c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.720442 4707 scope.go:117] "RemoveContainer" containerID="f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.720757 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a\": container with ID starting with f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a not found: ID does not exist" containerID="f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.720870 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a"} err="failed to get container status \"f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a\": rpc error: code = NotFound desc = could not find container \"f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a\": container with ID starting with f1a17413106368f8a3f6c66622139e9524f1e510e59e6118ef2eb9c67a85095a not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.720946 4707 scope.go:117] "RemoveContainer" containerID="66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.721241 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d\": container with ID starting with 66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d not found: ID does not exist" containerID="66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.721270 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d"} err="failed to get container status \"66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d\": rpc error: code = NotFound desc = could not find container \"66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d\": container with ID starting with 66a6fd2b486865a71ec31cbf7e0485cb5d0e6acad2736ff6ca452bdbf96d4a0d 
not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.721287 4707 scope.go:117] "RemoveContainer" containerID="3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.721628 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546\": container with ID starting with 3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546 not found: ID does not exist" containerID="3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.721652 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546"} err="failed to get container status \"3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546\": rpc error: code = NotFound desc = could not find container \"3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546\": container with ID starting with 3adbff52501f546fceac074d4cc86efca7da0bb4f3cf791537da113dff439546 not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.721666 4707 scope.go:117] "RemoveContainer" containerID="ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.721951 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047\": container with ID starting with ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047 not found: ID does not exist" containerID="ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.722030 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047"} err="failed to get container status \"ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047\": rpc error: code = NotFound desc = could not find container \"ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047\": container with ID starting with ff5a6986c329e28fdbaf2387aed92852076912da69e9456b7a9cf33d1ea05047 not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.722094 4707 scope.go:117] "RemoveContainer" containerID="2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.734847 4707 scope.go:117] "RemoveContainer" containerID="6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.747516 4707 scope.go:117] "RemoveContainer" containerID="2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.747749 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813\": container with ID starting with 2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813 not found: ID does not exist" containerID="2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 
16:06:18.747770 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813"} err="failed to get container status \"2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813\": rpc error: code = NotFound desc = could not find container \"2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813\": container with ID starting with 2155f094e9b375b082a84248685a9c294de1d689872e98b3995a88771faa7813 not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.747785 4707 scope.go:117] "RemoveContainer" containerID="6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679" Jan 21 16:06:18 crc kubenswrapper[4707]: E0121 16:06:18.748014 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679\": container with ID starting with 6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679 not found: ID does not exist" containerID="6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.748032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679"} err="failed to get container status \"6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679\": rpc error: code = NotFound desc = could not find container \"6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679\": container with ID starting with 6dd426ff9be3c279039c856766b938434648f2b5c5bd25211780c7a15289f679 not found: ID does not exist" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.801461 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.801495 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8x8s\" (UniqueName: \"kubernetes.io/projected/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-kube-api-access-t8x8s\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.801506 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35a2e10-5d5d-44b3-b43e-9a1619f7105c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.948343 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.953083 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.957508 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:06:18 crc kubenswrapper[4707]: I0121 16:06:18.961674 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:06:19 crc kubenswrapper[4707]: I0121 16:06:19.190503 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" path="/var/lib/kubelet/pods/1c37d5e5-4ffd-412c-a93a-86cb6474735c/volumes" Jan 21 16:06:19 crc kubenswrapper[4707]: I0121 
16:06:19.191224 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" path="/var/lib/kubelet/pods/50ded377-6ace-4f39-8a60-bee575f7e8cc/volumes" Jan 21 16:06:19 crc kubenswrapper[4707]: I0121 16:06:19.192113 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" path="/var/lib/kubelet/pods/c35a2e10-5d5d-44b3-b43e-9a1619f7105c/volumes" Jan 21 16:06:19 crc kubenswrapper[4707]: I0121 16:06:19.192657 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" path="/var/lib/kubelet/pods/c468129d-ce39-4ba9-970f-4ecc0bb1df07/volumes" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.080493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.119127 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp"] Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.119327 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerName="dnsmasq-dns" containerID="cri-o://49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065" gracePeriod=10 Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.507943 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.631526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwt7p\" (UniqueName: \"kubernetes.io/projected/b1ab8226-4400-48d8-908f-ef4d0778ef1e-kube-api-access-dwt7p\") pod \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.631582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dns-swift-storage-0\") pod \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.631606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dnsmasq-svc\") pod \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.631645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-config\") pod \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\" (UID: \"b1ab8226-4400-48d8-908f-ef4d0778ef1e\") " Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.635965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ab8226-4400-48d8-908f-ef4d0778ef1e-kube-api-access-dwt7p" (OuterVolumeSpecName: "kube-api-access-dwt7p") pod "b1ab8226-4400-48d8-908f-ef4d0778ef1e" (UID: "b1ab8226-4400-48d8-908f-ef4d0778ef1e"). InnerVolumeSpecName "kube-api-access-dwt7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.656926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b1ab8226-4400-48d8-908f-ef4d0778ef1e" (UID: "b1ab8226-4400-48d8-908f-ef4d0778ef1e"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.657720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1ab8226-4400-48d8-908f-ef4d0778ef1e" (UID: "b1ab8226-4400-48d8-908f-ef4d0778ef1e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.657796 4707 generic.go:334] "Generic (PLEG): container finished" podID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerID="49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065" exitCode=0 Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.657837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" event={"ID":"b1ab8226-4400-48d8-908f-ef4d0778ef1e","Type":"ContainerDied","Data":"49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065"} Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.657859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" event={"ID":"b1ab8226-4400-48d8-908f-ef4d0778ef1e","Type":"ContainerDied","Data":"dbab3a85540ff33b93c8dc9ebb98fbcede275e1081b091a3a903950d85c5e174"} Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.657868 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.657873 4707 scope.go:117] "RemoveContainer" containerID="49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.667194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-config" (OuterVolumeSpecName: "config") pod "b1ab8226-4400-48d8-908f-ef4d0778ef1e" (UID: "b1ab8226-4400-48d8-908f-ef4d0778ef1e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.689934 4707 scope.go:117] "RemoveContainer" containerID="33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.702029 4707 scope.go:117] "RemoveContainer" containerID="49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065" Jan 21 16:06:21 crc kubenswrapper[4707]: E0121 16:06:21.702304 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065\": container with ID starting with 49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065 not found: ID does not exist" containerID="49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.702330 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065"} err="failed to get container status \"49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065\": rpc error: code = NotFound desc = could not find container \"49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065\": container with ID starting with 49abcecd4ab5287945f4e84470cb13e8cd7e8c0479cac9932e17c2837dd0b065 not found: ID does not exist" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.702345 4707 scope.go:117] "RemoveContainer" containerID="33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078" Jan 21 16:06:21 crc kubenswrapper[4707]: E0121 16:06:21.702622 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078\": container with ID starting with 33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078 not found: ID does not exist" containerID="33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.702644 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078"} err="failed to get container status \"33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078\": rpc error: code = NotFound desc = could not find container \"33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078\": container with ID starting with 33e69e2963a6e69603c5994166b361f08663c7122e0bf791efe151ca5631d078 not found: ID does not exist" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.733569 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwt7p\" (UniqueName: \"kubernetes.io/projected/b1ab8226-4400-48d8-908f-ef4d0778ef1e-kube-api-access-dwt7p\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.733589 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.733599 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:21 crc 
kubenswrapper[4707]: I0121 16:06:21.733626 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1ab8226-4400-48d8-908f-ef4d0778ef1e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.979271 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp"] Jan 21 16:06:21 crc kubenswrapper[4707]: I0121 16:06:21.985722 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6db984dd9-bb7fp"] Jan 21 16:06:23 crc kubenswrapper[4707]: I0121 16:06:23.188554 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" path="/var/lib/kubelet/pods/b1ab8226-4400-48d8-908f-ef4d0778ef1e/volumes" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.182199 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:06:26 crc kubenswrapper[4707]: E0121 16:06:26.182586 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.422767 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c95f7cb7d-zhgp4_3e1ab3b0-53e5-437a-8882-ce4f23c015b4/neutron-api/0.log" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.422963 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-httpd-config\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-ovndb-tls-certs\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55lmf\" (UniqueName: \"kubernetes.io/projected/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-kube-api-access-55lmf\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-public-tls-certs\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-internal-tls-certs\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-combined-ca-bundle\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.599874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-config\") pod \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\" (UID: \"3e1ab3b0-53e5-437a-8882-ce4f23c015b4\") " Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.605961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-kube-api-access-55lmf" (OuterVolumeSpecName: "kube-api-access-55lmf") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "kube-api-access-55lmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.606476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.625792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.626298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.628887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.629215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-config" (OuterVolumeSpecName: "config") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.636378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3e1ab3b0-53e5-437a-8882-ce4f23c015b4" (UID: "3e1ab3b0-53e5-437a-8882-ce4f23c015b4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.691250 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c95f7cb7d-zhgp4_3e1ab3b0-53e5-437a-8882-ce4f23c015b4/neutron-api/0.log" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.691296 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerID="75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6" exitCode=0 Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.691322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerDied","Data":"75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6"} Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.691346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" event={"ID":"3e1ab3b0-53e5-437a-8882-ce4f23c015b4","Type":"ContainerDied","Data":"428b26d9488a96b98c243d43737e0b33d2a462d063718223477db5688e275966"} Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.691351 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.691361 4707 scope.go:117] "RemoveContainer" containerID="75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701141 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55lmf\" (UniqueName: \"kubernetes.io/projected/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-kube-api-access-55lmf\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701159 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701168 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701176 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701185 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701193 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.701202 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1ab3b0-53e5-437a-8882-ce4f23c015b4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.706252 4707 scope.go:117] "RemoveContainer" containerID="319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.720226 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4"] Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.726684 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5c95f7cb7d-zhgp4"] Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.731484 4707 scope.go:117] "RemoveContainer" containerID="0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.764992 4707 scope.go:117] "RemoveContainer" containerID="75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6" Jan 21 16:06:26 crc kubenswrapper[4707]: E0121 16:06:26.765407 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6\": container with ID starting with 75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6 not found: ID does not exist" containerID="75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.765438 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6"} err="failed to get container status \"75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6\": rpc error: code = NotFound desc = could not find container \"75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6\": container with ID starting with 75f3ba3fe25cd1f3b7ad5bf3234ebab4a56295c0700022d0b56f72ebba76d6d6 not found: ID does not exist" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.765457 4707 scope.go:117] "RemoveContainer" containerID="319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1" Jan 21 16:06:26 crc kubenswrapper[4707]: E0121 16:06:26.765781 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1\": container with ID starting with 319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1 not found: ID does not exist" containerID="319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.765837 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1"} err="failed to get container status \"319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1\": rpc error: code = NotFound desc = could not find container \"319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1\": container with ID starting with 319d11bf4ee253248cba294568c6b2ebde107a73d923f8478bad91e1766400d1 not found: ID does not exist" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.765859 4707 scope.go:117] "RemoveContainer" containerID="0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af" Jan 21 16:06:26 crc kubenswrapper[4707]: E0121 16:06:26.766222 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af\": container with ID starting with 0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af not found: ID does not exist" containerID="0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af" Jan 21 16:06:26 crc kubenswrapper[4707]: I0121 16:06:26.766247 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af"} err="failed to get container status \"0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af\": rpc error: code = NotFound desc = could not find container \"0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af\": container with ID starting with 0bed65114e873bde8aeef892127cf41683fe897d7657b096c835c9078e0c48af not found: ID does not exist" Jan 21 16:06:27 crc kubenswrapper[4707]: I0121 16:06:27.193843 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" path="/var/lib/kubelet/pods/3e1ab3b0-53e5-437a-8882-ce4f23c015b4/volumes" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.174020 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-etc-machine-id\") pod \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "99b5c23a-57ba-423e-8afb-878f9f0cf0f2" (UID: "99b5c23a-57ba-423e-8afb-878f9f0cf0f2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data-custom\") pod \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-combined-ca-bundle\") pod \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8kvw\" (UniqueName: \"kubernetes.io/projected/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-kube-api-access-p8kvw\") pod \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-scripts\") pod \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data\") pod \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\" (UID: \"99b5c23a-57ba-423e-8afb-878f9f0cf0f2\") " Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.360633 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.365072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "99b5c23a-57ba-423e-8afb-878f9f0cf0f2" (UID: "99b5c23a-57ba-423e-8afb-878f9f0cf0f2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.365149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-scripts" (OuterVolumeSpecName: "scripts") pod "99b5c23a-57ba-423e-8afb-878f9f0cf0f2" (UID: "99b5c23a-57ba-423e-8afb-878f9f0cf0f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.365391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-kube-api-access-p8kvw" (OuterVolumeSpecName: "kube-api-access-p8kvw") pod "99b5c23a-57ba-423e-8afb-878f9f0cf0f2" (UID: "99b5c23a-57ba-423e-8afb-878f9f0cf0f2"). InnerVolumeSpecName "kube-api-access-p8kvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.386605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99b5c23a-57ba-423e-8afb-878f9f0cf0f2" (UID: "99b5c23a-57ba-423e-8afb-878f9f0cf0f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.407461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data" (OuterVolumeSpecName: "config-data") pod "99b5c23a-57ba-423e-8afb-878f9f0cf0f2" (UID: "99b5c23a-57ba-423e-8afb-878f9f0cf0f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.461983 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.462012 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.462023 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8kvw\" (UniqueName: \"kubernetes.io/projected/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-kube-api-access-p8kvw\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.462034 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.462044 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b5c23a-57ba-423e-8afb-878f9f0cf0f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.772669 4707 generic.go:334] "Generic (PLEG): container finished" podID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerID="f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44" exitCode=137 Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.772707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerDied","Data":"f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44"} Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.772720 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.772732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"99b5c23a-57ba-423e-8afb-878f9f0cf0f2","Type":"ContainerDied","Data":"933d0272ca2cd9ae33707f95f44a73f394c3a3383a282ae4d9a41fc5b512d063"} Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.772749 4707 scope.go:117] "RemoveContainer" containerID="f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.789054 4707 scope.go:117] "RemoveContainer" containerID="d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.795402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.799723 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.804835 4707 scope.go:117] "RemoveContainer" containerID="9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.816939 4707 scope.go:117] "RemoveContainer" containerID="f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44" Jan 21 16:06:39 crc kubenswrapper[4707]: E0121 16:06:39.817279 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44\": container with ID starting with f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44 not found: ID does not exist" containerID="f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.817320 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44"} err="failed to get container status \"f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44\": rpc error: code = NotFound desc = could not find container \"f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44\": container with ID starting with f709ac7f9407c37efd989a46341fbd1153409274509cddef19ca630ff1eeca44 not found: ID does not exist" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.817346 4707 scope.go:117] "RemoveContainer" containerID="d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c" Jan 21 16:06:39 crc kubenswrapper[4707]: E0121 16:06:39.817623 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c\": container with ID starting with d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c not found: ID does not exist" containerID="d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.817666 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c"} err="failed to get container status \"d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c\": rpc error: code = NotFound desc = could not find container \"d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c\": container with ID starting with d3300bbcc1e77b73e84506d3922e6817ba45ee0765b26df2399fd8e3cd3eff6c not found: ID does not exist" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.817683 4707 scope.go:117] "RemoveContainer" containerID="9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7" Jan 21 16:06:39 crc kubenswrapper[4707]: E0121 16:06:39.817971 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7\": container with ID starting with 9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7 not found: ID does not exist" containerID="9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7" Jan 21 16:06:39 crc kubenswrapper[4707]: I0121 16:06:39.818016 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7"} err="failed to get container status \"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7\": rpc error: code = NotFound desc = could not find container \"9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7\": container with ID starting with 9e6bbe0cd7c8f6384190cf0ee2f936c389ce8651bcd33faf7ea9870d3343fab7 not found: ID does not exist" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.182570 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:06:40 crc kubenswrapper[4707]: E0121 16:06:40.182887 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.785518 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerID="a2ba006a14c1f79b7783113eb2fc356a7828b8e8106c77fb842a484c74dc2c5a" exitCode=137 Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.785589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"a2ba006a14c1f79b7783113eb2fc356a7828b8e8106c77fb842a484c74dc2c5a"} Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.845072 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.983186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") pod \"f9e3466a-372e-4d53-9e3b-807d8403f17a\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.983227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-lock\") pod \"f9e3466a-372e-4d53-9e3b-807d8403f17a\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.983304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-cache\") pod \"f9e3466a-372e-4d53-9e3b-807d8403f17a\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.983325 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsprd\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-kube-api-access-jsprd\") pod \"f9e3466a-372e-4d53-9e3b-807d8403f17a\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.983359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f9e3466a-372e-4d53-9e3b-807d8403f17a\" (UID: \"f9e3466a-372e-4d53-9e3b-807d8403f17a\") " Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.990016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f9e3466a-372e-4d53-9e3b-807d8403f17a" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.990315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-lock" (OuterVolumeSpecName: "lock") pod "f9e3466a-372e-4d53-9e3b-807d8403f17a" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.990603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-cache" (OuterVolumeSpecName: "cache") pod "f9e3466a-372e-4d53-9e3b-807d8403f17a" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.992056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "f9e3466a-372e-4d53-9e3b-807d8403f17a" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:06:40 crc kubenswrapper[4707]: I0121 16:06:40.992554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-kube-api-access-jsprd" (OuterVolumeSpecName: "kube-api-access-jsprd") pod "f9e3466a-372e-4d53-9e3b-807d8403f17a" (UID: "f9e3466a-372e-4d53-9e3b-807d8403f17a"). InnerVolumeSpecName "kube-api-access-jsprd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.085002 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsprd\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-kube-api-access-jsprd\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.085200 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-cache\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.085228 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.085238 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f9e3466a-372e-4d53-9e3b-807d8403f17a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.085247 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f9e3466a-372e-4d53-9e3b-807d8403f17a-lock\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.095051 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.185882 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.193230 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" path="/var/lib/kubelet/pods/99b5c23a-57ba-423e-8afb-878f9f0cf0f2/volumes" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.798082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f9e3466a-372e-4d53-9e3b-807d8403f17a","Type":"ContainerDied","Data":"59391ec14f4899ff2e8480562b605be820b6e687dbeb9f1d53fda43145cc3c82"} Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.798131 4707 scope.go:117] "RemoveContainer" containerID="a2ba006a14c1f79b7783113eb2fc356a7828b8e8106c77fb842a484c74dc2c5a" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.798180 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.811261 4707 scope.go:117] "RemoveContainer" containerID="bed3e53b753f14dc32e9b1b6ab9f45190e2e849c1bd85308d5fb68b7ea133e5f" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.817109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.819265 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.823347 4707 scope.go:117] "RemoveContainer" containerID="6f893355c882d903714e129e409f5ce4d4643085b9619147b4afbb2525f3b1e6" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.837552 4707 scope.go:117] "RemoveContainer" containerID="e47594c788362325c27848cc7da59200a8816e14df22f36bd527fc0d841b218a" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.850239 4707 scope.go:117] "RemoveContainer" containerID="c72a20daf0844253b29487d24bb0e8d55053110bc5b67076c0d93fa4b17a0fcf" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.863591 4707 scope.go:117] "RemoveContainer" containerID="90550d64477b14b9ef1a55ac4d5a33af74701b40b0ce866af985b7931988d125" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.876502 4707 scope.go:117] "RemoveContainer" containerID="c50db20a23ae77d375778f281e89cb6339b6ce4bdaa193bba212e82b99bf20f3" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.889009 4707 scope.go:117] "RemoveContainer" containerID="83dd8a63c20c1ec70a57c216ecd7483961393678c54904f2df7480290173458d" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.901850 4707 scope.go:117] "RemoveContainer" containerID="6c38bd2f0b39209b01d982cd4fee04e83c6c81e7941bf608676b8abd00f2a31b" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.916333 4707 scope.go:117] "RemoveContainer" containerID="20bd79ad3c444a59a7f1c99aa1748e0831cf7601587ba19e2fad1093aa7ed199" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.927338 4707 scope.go:117] "RemoveContainer" containerID="d0267ab63d88eb5d69c8abe4f8eab3c146378376ad2f3f267d6c6d0e34f4b1e9" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.943393 4707 scope.go:117] "RemoveContainer" containerID="13acb9441c4c075d19f2e5354fe2e984c78df04fd88ec52f38421d65c42ab238" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.963688 4707 scope.go:117] "RemoveContainer" containerID="cf5083b5384e2bc9efc2cd7c1c61330323f0a7a94248846ec194cdbbdf9f9085" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.976557 4707 scope.go:117] "RemoveContainer" containerID="6340fee114b99b10644db682b6b0b87773342ed0330aee24fb6bb4739c6c6b4b" Jan 21 16:06:41 crc kubenswrapper[4707]: I0121 16:06:41.993216 4707 scope.go:117] "RemoveContainer" containerID="42e0fb7f1e1361b1e98d0c8a660d5f6cbd7733ca6677f678924d9b430951360e" Jan 21 16:06:43 crc kubenswrapper[4707]: I0121 16:06:43.191099 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" path="/var/lib/kubelet/pods/f9e3466a-372e-4d53-9e3b-807d8403f17a/volumes" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.492472 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493048 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerName="setup-container" Jan 21 16:06:51 crc 
kubenswrapper[4707]: I0121 16:06:51.493060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerName="setup-container" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493068 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493073 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493085 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493096 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493101 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493115 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493121 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493129 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493134 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493144 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493152 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-expirer" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493158 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-expirer" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493167 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493172 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493181 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" 
containerName="setup-container" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493186 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerName="setup-container" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493192 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493198 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493208 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="proxy-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493213 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="proxy-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493223 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493256 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493261 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493266 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493271 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="rsync" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493283 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="rsync" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493291 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273f261e-d7a5-4025-91a6-f9e102be79de" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493298 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="273f261e-d7a5-4025-91a6-f9e102be79de" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493308 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493313 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-server" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493318 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="swift-recon-cron" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493323 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="swift-recon-cron" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493330 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493335 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-updater" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-updater" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493353 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493358 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493363 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493367 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493372 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113f5252-dcd0-4719-8bd9-11800c33a828" containerName="keystone-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493377 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="113f5252-dcd0-4719-8bd9-11800c33a828" containerName="keystone-api" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493388 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493393 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-server" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493402 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-updater" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493419 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-updater" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493432 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc 
kubenswrapper[4707]: E0121 16:06:51.493438 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerName="dnsmasq-dns" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493443 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerName="dnsmasq-dns" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493464 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493471 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-server" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493478 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493495 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493501 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-server" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-notification-agent" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493518 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-notification-agent" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493525 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-metadata" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493530 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-metadata" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493544 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493550 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493555 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493562 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493567 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493580 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-api" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493587 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884872ef-7ca2-4262-9561-440673182eba" containerName="memcached" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493592 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="884872ef-7ca2-4262-9561-440673182eba" containerName="memcached" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493598 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493603 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493608 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-reaper" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493613 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-reaper" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493620 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493625 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493634 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerName="rabbitmq" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493639 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerName="rabbitmq" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493644 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerName="galera" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493649 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerName="galera" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493655 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493673 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493679 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="openstack-network-exporter" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="openstack-network-exporter" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493692 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="probe" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493697 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="probe" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493716 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-central-agent" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493720 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-central-agent" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493729 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493734 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493740 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="sg-core" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493745 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="sg-core" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-log" Jan 21 16:06:51 
crc kubenswrapper[4707]: I0121 16:06:51.493756 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493763 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerName="init" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493768 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerName="init" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerName="rabbitmq" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493782 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerName="rabbitmq" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493791 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493795 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493800 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerName="mysql-bootstrap" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493820 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerName="mysql-bootstrap" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493829 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493834 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493842 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493846 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493854 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493877 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerName="mariadb-account-create-update" 
Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493882 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerName="mariadb-account-create-update" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493891 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493896 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493905 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="ovn-northd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493909 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="ovn-northd" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493918 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerName="mariadb-account-create-update" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493923 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerName="mariadb-account-create-update" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493936 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.493945 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.493950 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="proxy-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494124 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494129 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494135 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerName="mariadb-account-create-update" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494144 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494149 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494155 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494160 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494169 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-expirer" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494175 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494181 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-central-agent" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494189 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494195 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74b5687-ee85-4b71-929c-cc6ac63ffd06" containerName="galera" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494199 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494208 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b70303-f297-4a57-b252-7bdc251fa8ef" containerName="nova-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494216 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="113f5252-dcd0-4719-8bd9-11800c33a828" containerName="keystone-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494222 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="rsync" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494230 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494237 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494249 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58856a6-ee91-4aee-874e-118980038628" containerName="barbican-keystone-listener-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494257 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494265 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="openstack-network-exporter" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494272 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-updater" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494281 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494288 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c00713b-bb97-4adf-82eb-8ff31b65d14b" containerName="glance-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494295 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b559c3-4e82-4897-9b5c-1816989fffcd" containerName="cinder-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494303 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-reaper" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494311 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494325 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494331 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494337 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494344 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="sg-core" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494352 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="swift-recon-cron" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494361 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494367 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ab8226-4400-48d8-908f-ef4d0778ef1e" containerName="dnsmasq-dns" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494374 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494381 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="cinder-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494387 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="61223200-c7c1-4ee8-b38a-e6074d65114a" containerName="glance-log" Jan 21 16:06:51 
crc kubenswrapper[4707]: I0121 16:06:51.494392 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ded377-6ace-4f39-8a60-bee575f7e8cc" containerName="rabbitmq" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494399 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494417 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494422 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff12a6c-5d02-4e45-9e7e-e589b746f1dc" containerName="placement-log" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494430 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c468129d-ce39-4ba9-970f-4ecc0bb1df07" containerName="ceilometer-notification-agent" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2772c54d-baf5-4cd4-8677-0db11b97c5ef" containerName="proxy-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494446 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="425c52a3-93a2-49fc-82d9-d7d86f381335" containerName="nova-metadata-metadata" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494453 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="object-server" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494461 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c892fe45-66a3-4ddf-b0db-b50df71c6493" containerName="barbican-worker" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494469 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="884872ef-7ca2-4262-9561-440673182eba" containerName="memcached" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494480 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c37d5e5-4ffd-412c-a93a-86cb6474735c" containerName="rabbitmq" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d840a57-4237-4336-9882-01a52c8a2c09" containerName="ovn-northd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494492 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-httpd" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494498 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-replicator" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494506 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b5c23a-57ba-423e-8afb-878f9f0cf0f2" containerName="probe" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494512 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494520 4707 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="273f261e-d7a5-4025-91a6-f9e102be79de" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494526 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b3e52-9792-42da-a6f0-9b0fc5fb866c" containerName="barbican-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494534 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35a2e10-5d5d-44b3-b43e-9a1619f7105c" containerName="nova-scheduler-scheduler" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494539 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="account-auditor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494544 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e3466a-372e-4d53-9e3b-807d8403f17a" containerName="container-updater" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.494667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494674 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.494687 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: E0121 16:06:51.494702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494707 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494848 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af60f39-57ca-4595-be87-1d1a0c869d72" containerName="nova-cell0-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494856 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1ab3b0-53e5-437a-8882-ce4f23c015b4" containerName="neutron-api" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494868 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b620e1f-426b-4363-ad99-f56ba333fbfd" containerName="mariadb-account-create-update" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.494880 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b535454-f068-42cd-9817-a23bd7d7aa4d" containerName="nova-cell1-conductor-conductor" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.495307 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.497664 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.497830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-978bc" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.497843 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.497988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.498020 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.498121 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.500168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.500679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e42cb4d-8129-4286-a6c7-9f8e94524e91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612431 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhwt\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-kube-api-access-rjhwt\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e42cb4d-8129-4286-a6c7-9f8e94524e91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.612724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714102 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e42cb4d-8129-4286-a6c7-9f8e94524e91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhwt\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-kube-api-access-rjhwt\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e42cb4d-8129-4286-a6c7-9f8e94524e91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714450 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.714745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.715018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.715199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.715483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.715642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.718348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e42cb4d-8129-4286-a6c7-9f8e94524e91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.718402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.719953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.720616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e42cb4d-8129-4286-a6c7-9f8e94524e91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.726248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhwt\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-kube-api-access-rjhwt\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" 
Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.728311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.748491 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.750085 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.752966 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.753231 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-gdrtx" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.753649 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.753826 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.754010 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.754141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.755159 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.759750 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:51 crc kubenswrapper[4707]: I0121 16:06:51.810953 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.916864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d633804-5d9a-4792-b183-51e2c1a2ebe2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c2nn\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-kube-api-access-2c2nn\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:51.917535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d633804-5d9a-4792-b183-51e2c1a2ebe2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c2nn\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-kube-api-access-2c2nn\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d633804-5d9a-4792-b183-51e2c1a2ebe2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d633804-5d9a-4792-b183-51e2c1a2ebe2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.020195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.019364 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.020645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.020767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.020915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.023075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d633804-5d9a-4792-b183-51e2c1a2ebe2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.026065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.026165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d633804-5d9a-4792-b183-51e2c1a2ebe2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.026272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.033010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c2nn\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-kube-api-access-2c2nn\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.034188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.074656 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.436886 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.437896 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.439218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-s4nbl" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.439707 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.440090 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.440256 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.451607 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.454647 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.628946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwkkz\" (UniqueName: \"kubernetes.io/projected/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kube-api-access-jwkkz\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.629608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.647864 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.653272 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:06:52 crc kubenswrapper[4707]: W0121 16:06:52.655177 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e42cb4d_8129_4286_a6c7_9f8e94524e91.slice/crio-92d3f4d21b24762879c93f8be5164eac2cb7f4afb5a767a7092ece30e10139c7 WatchSource:0}: Error finding container 92d3f4d21b24762879c93f8be5164eac2cb7f4afb5a767a7092ece30e10139c7: Status 404 returned error can't find the container with id 92d3f4d21b24762879c93f8be5164eac2cb7f4afb5a767a7092ece30e10139c7 Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.730858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkz\" (UniqueName: \"kubernetes.io/projected/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kube-api-access-jwkkz\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.730915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.730944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.731913 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.732258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.732373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.732562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.733782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 
16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.735280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.743078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwkkz\" (UniqueName: \"kubernetes.io/projected/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kube-api-access-jwkkz\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.746067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.750915 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.870293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"0e42cb4d-8129-4286-a6c7-9f8e94524e91","Type":"ContainerStarted","Data":"92d3f4d21b24762879c93f8be5164eac2cb7f4afb5a767a7092ece30e10139c7"} Jan 21 16:06:52 crc kubenswrapper[4707]: I0121 16:06:52.873678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"5d633804-5d9a-4792-b183-51e2c1a2ebe2","Type":"ContainerStarted","Data":"b8e8702569b7b5ee9afd82303e64707367b8c9a5c8162bac1315dac272fc8bdd"} Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.116568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:06:53 crc kubenswrapper[4707]: W0121 16:06:53.116607 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e173738_a3c5_4e4c_b681_6d1f3cbb90b7.slice/crio-2e4f3225c6d332cc9a63aff5921551afd29acccbb74f6a10d3fbe3a88a9a91b6 WatchSource:0}: Error finding container 2e4f3225c6d332cc9a63aff5921551afd29acccbb74f6a10d3fbe3a88a9a91b6: Status 404 returned error can't find the container with id 2e4f3225c6d332cc9a63aff5921551afd29acccbb74f6a10d3fbe3a88a9a91b6 Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.907729 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.909919 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.913249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-bhtdn" Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.914008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.916709 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.918734 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.926412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7","Type":"ContainerStarted","Data":"aa7356dd24df7564e2d5f877b2a487343b9965b9b92961ba366738c209325070"} Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.926587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7","Type":"ContainerStarted","Data":"2e4f3225c6d332cc9a63aff5921551afd29acccbb74f6a10d3fbe3a88a9a91b6"} Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.936887 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.943335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"0e42cb4d-8129-4286-a6c7-9f8e94524e91","Type":"ContainerStarted","Data":"f5188077018b4348f2419002120e3a5b4d29c9ab4682810c48d0553e5ac5fccf"} Jan 21 16:06:53 crc kubenswrapper[4707]: I0121 16:06:53.949349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"5d633804-5d9a-4792-b183-51e2c1a2ebe2","Type":"ContainerStarted","Data":"99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86"} Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.052225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.052496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.052564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wx9\" (UniqueName: \"kubernetes.io/projected/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kube-api-access-d6wx9\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc 
kubenswrapper[4707]: I0121 16:06:54.052596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.052625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.052693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.052736 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.053148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6wx9\" (UniqueName: \"kubernetes.io/projected/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kube-api-access-d6wx9\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.154914 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.159334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.159665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.159907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.160347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.164774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.164981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.171775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.173036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wx9\" (UniqueName: \"kubernetes.io/projected/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kube-api-access-d6wx9\") pod \"openstack-cell1-galera-0\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.182529 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:06:54 crc kubenswrapper[4707]: E0121 16:06:54.182720 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.249494 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:06:54 crc kubenswrapper[4707]: W0121 16:06:54.619768 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0ab18b_efe2_4553_8e4d_cf4f1468c460.slice/crio-e91aaa9c7ac43f3a7514125b48c135e813d8bff6757c89f33328a6f2f7612eee WatchSource:0}: Error finding container e91aaa9c7ac43f3a7514125b48c135e813d8bff6757c89f33328a6f2f7612eee: Status 404 returned error can't find the container with id e91aaa9c7ac43f3a7514125b48c135e813d8bff6757c89f33328a6f2f7612eee Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.620378 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.955245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"fa0ab18b-efe2-4553-8e4d-cf4f1468c460","Type":"ContainerStarted","Data":"5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092"} Jan 21 16:06:54 crc kubenswrapper[4707]: I0121 16:06:54.955480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"fa0ab18b-efe2-4553-8e4d-cf4f1468c460","Type":"ContainerStarted","Data":"e91aaa9c7ac43f3a7514125b48c135e813d8bff6757c89f33328a6f2f7612eee"} Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.419215 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.420206 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: W0121 16:06:55.427140 4707 reflector.go:561] object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-vp54k": failed to list *v1.Secret: secrets "memcached-memcached-dockercfg-vp54k" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 16:06:55 crc kubenswrapper[4707]: E0121 16:06:55.427176 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"memcached-memcached-dockercfg-vp54k\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"memcached-memcached-dockercfg-vp54k\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 16:06:55 crc kubenswrapper[4707]: W0121 16:06:55.427224 4707 reflector.go:561] object-"openstack-kuttl-tests"/"cert-memcached-svc": failed to list *v1.Secret: secrets "cert-memcached-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 16:06:55 crc kubenswrapper[4707]: E0121 16:06:55.427235 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"cert-memcached-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-memcached-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 16:06:55 crc 
kubenswrapper[4707]: W0121 16:06:55.427269 4707 reflector.go:561] object-"openstack-kuttl-tests"/"memcached-config-data": failed to list *v1.ConfigMap: configmaps "memcached-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 16:06:55 crc kubenswrapper[4707]: E0121 16:06:55.427277 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"memcached-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"memcached-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.433932 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.475980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.476087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.476136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.476157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tcqw\" (UniqueName: \"kubernetes.io/projected/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kube-api-access-5tcqw\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.476223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.577783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.577870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.577892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tcqw\" (UniqueName: \"kubernetes.io/projected/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kube-api-access-5tcqw\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.577963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.578015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.582800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.611242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tcqw\" (UniqueName: \"kubernetes.io/projected/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kube-api-access-5tcqw\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.962011 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerID="aa7356dd24df7564e2d5f877b2a487343b9965b9b92961ba366738c209325070" exitCode=0 Jan 21 16:06:55 crc kubenswrapper[4707]: I0121 16:06:55.962082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7","Type":"ContainerDied","Data":"aa7356dd24df7564e2d5f877b2a487343b9965b9b92961ba366738c209325070"} Jan 21 16:06:56 crc kubenswrapper[4707]: I0121 16:06:56.438326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:06:56 crc kubenswrapper[4707]: I0121 16:06:56.442369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:56 crc kubenswrapper[4707]: E0121 16:06:56.579198 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/memcached-config-data: failed to sync configmap cache: timed out waiting for the condition Jan 21 16:06:56 crc kubenswrapper[4707]: E0121 16:06:56.579271 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data podName:7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:57.079252851 +0000 UTC m=+3914.260769073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data") pod "memcached-0" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77") : failed to sync configmap cache: timed out waiting for the condition Jan 21 16:06:56 crc kubenswrapper[4707]: E0121 16:06:56.579199 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/memcached-config-data: failed to sync configmap cache: timed out waiting for the condition Jan 21 16:06:56 crc kubenswrapper[4707]: E0121 16:06:56.579486 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config podName:7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77 nodeName:}" failed. No retries permitted until 2026-01-21 16:06:57.079475188 +0000 UTC m=+3914.260991411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config") pod "memcached-0" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77") : failed to sync configmap cache: timed out waiting for the condition Jan 21 16:06:56 crc kubenswrapper[4707]: I0121 16:06:56.631075 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-vp54k" Jan 21 16:06:56 crc kubenswrapper[4707]: I0121 16:06:56.780466 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:06:56 crc kubenswrapper[4707]: I0121 16:06:56.970936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7","Type":"ContainerStarted","Data":"48493be85c0a9d73f2fef3ed9d02b2356ce6d369c4ebbf5cf3e2300c4c9f60dc"} Jan 21 16:06:56 crc kubenswrapper[4707]: I0121 16:06:56.986127 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=4.986112961 podStartE2EDuration="4.986112961s" podCreationTimestamp="2026-01-21 16:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:06:56.984356258 +0000 UTC m=+3914.165872480" watchObservedRunningTime="2026-01-21 16:06:56.986112961 +0000 UTC m=+3914.167629183" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.099906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.099957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.100659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.100656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config\") pod \"memcached-0\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.237258 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.302593 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.303355 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.305532 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-rmm48" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.312682 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.404277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfcg\" (UniqueName: \"kubernetes.io/projected/38940cc9-cd7d-49b2-809d-7bc6b0facfea-kube-api-access-5rfcg\") pod \"kube-state-metrics-0\" (UID: \"38940cc9-cd7d-49b2-809d-7bc6b0facfea\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.505872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfcg\" (UniqueName: \"kubernetes.io/projected/38940cc9-cd7d-49b2-809d-7bc6b0facfea-kube-api-access-5rfcg\") pod \"kube-state-metrics-0\" (UID: \"38940cc9-cd7d-49b2-809d-7bc6b0facfea\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.526655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfcg\" (UniqueName: \"kubernetes.io/projected/38940cc9-cd7d-49b2-809d-7bc6b0facfea-kube-api-access-5rfcg\") pod \"kube-state-metrics-0\" (UID: \"38940cc9-cd7d-49b2-809d-7bc6b0facfea\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.626098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.683876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:06:57 crc kubenswrapper[4707]: W0121 16:06:57.686317 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b3b7d0b_5e91_4ba8_bc0d_41c4a4f9cd77.slice/crio-94d0410e107937fab682edebb91dd06a150f7b35a9c98b1c2bf2d2b57621e2ae WatchSource:0}: Error finding container 94d0410e107937fab682edebb91dd06a150f7b35a9c98b1c2bf2d2b57621e2ae: Status 404 returned error can't find the container with id 94d0410e107937fab682edebb91dd06a150f7b35a9c98b1c2bf2d2b57621e2ae Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.978243 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerID="5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092" exitCode=0 Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.978332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"fa0ab18b-efe2-4553-8e4d-cf4f1468c460","Type":"ContainerDied","Data":"5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092"} Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.980237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77","Type":"ContainerStarted","Data":"398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d"} Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.980275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77","Type":"ContainerStarted","Data":"94d0410e107937fab682edebb91dd06a150f7b35a9c98b1c2bf2d2b57621e2ae"} Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.980324 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:06:57 crc kubenswrapper[4707]: I0121 16:06:57.996659 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:06:58 crc kubenswrapper[4707]: I0121 16:06:58.024484 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=3.024467335 podStartE2EDuration="3.024467335s" podCreationTimestamp="2026-01-21 16:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:06:58.002678556 +0000 UTC m=+3915.184194768" watchObservedRunningTime="2026-01-21 16:06:58.024467335 +0000 UTC m=+3915.205983558" Jan 21 16:06:58 crc kubenswrapper[4707]: I0121 16:06:58.986715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"38940cc9-cd7d-49b2-809d-7bc6b0facfea","Type":"ContainerStarted","Data":"0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c"} Jan 21 16:06:58 crc kubenswrapper[4707]: I0121 16:06:58.986934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"38940cc9-cd7d-49b2-809d-7bc6b0facfea","Type":"ContainerStarted","Data":"abd17ae86b2d3f2d6b805dac4f86ce4966cb592f642bbf2ef991fb4148f2301f"} Jan 21 16:06:58 crc kubenswrapper[4707]: 
I0121 16:06:58.986950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:06:58 crc kubenswrapper[4707]: I0121 16:06:58.988653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"fa0ab18b-efe2-4553-8e4d-cf4f1468c460","Type":"ContainerStarted","Data":"6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e"} Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.001565 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.7063983710000001 podStartE2EDuration="2.001555723s" podCreationTimestamp="2026-01-21 16:06:57 +0000 UTC" firstStartedPulling="2026-01-21 16:06:58.002125596 +0000 UTC m=+3915.183641818" lastFinishedPulling="2026-01-21 16:06:58.297282948 +0000 UTC m=+3915.478799170" observedRunningTime="2026-01-21 16:06:58.997275704 +0000 UTC m=+3916.178791926" watchObservedRunningTime="2026-01-21 16:06:59.001555723 +0000 UTC m=+3916.183071945" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.011587 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=6.011574851 podStartE2EDuration="6.011574851s" podCreationTimestamp="2026-01-21 16:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:06:59.009357071 +0000 UTC m=+3916.190873293" watchObservedRunningTime="2026-01-21 16:06:59.011574851 +0000 UTC m=+3916.193091073" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.985744 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.986904 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.988202 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.988733 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.988829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-tpgml" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.988912 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 16:06:59 crc kubenswrapper[4707]: I0121 16:06:59.990525 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.011608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.141616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.141711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.141739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.141765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-config\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.141788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q774q\" (UniqueName: \"kubernetes.io/projected/69a16f95-9cdc-4bab-b68d-275212d909a5-kube-api-access-q774q\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.141988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 
16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.142123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.142152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-config\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q774q\" (UniqueName: \"kubernetes.io/projected/69a16f95-9cdc-4bab-b68d-275212d909a5-kube-api-access-q774q\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.243929 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.244190 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.248596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-config\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.249564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.249686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.249823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.263364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.276825 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.285449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q774q\" (UniqueName: \"kubernetes.io/projected/69a16f95-9cdc-4bab-b68d-275212d909a5-kube-api-access-q774q\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.294051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.309597 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:00 crc kubenswrapper[4707]: I0121 16:07:00.702980 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.004002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69a16f95-9cdc-4bab-b68d-275212d909a5","Type":"ContainerStarted","Data":"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de"} Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.004778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69a16f95-9cdc-4bab-b68d-275212d909a5","Type":"ContainerStarted","Data":"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733"} Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.004863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69a16f95-9cdc-4bab-b68d-275212d909a5","Type":"ContainerStarted","Data":"e0083d66a546921aa01d5b2122ee438a6dbc5523197d429a39befc2035537c69"} Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.023323 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.023312076 podStartE2EDuration="2.023312076s" podCreationTimestamp="2026-01-21 16:06:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:01.018464211 +0000 UTC m=+3918.199980433" watchObservedRunningTime="2026-01-21 16:07:01.023312076 +0000 UTC m=+3918.204828297" Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.890513 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.892664 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.895723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.896138 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.896324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hdph9" Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.896918 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 16:07:01 crc kubenswrapper[4707]: I0121 16:07:01.905025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.065986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgxm\" (UniqueName: \"kubernetes.io/projected/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-kube-api-access-gdgxm\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-config\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.066477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.167755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.167793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-config\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.167877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.167988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgxm\" (UniqueName: \"kubernetes.io/projected/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-kube-api-access-gdgxm\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168125 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-config\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.168749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.169253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.172703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.178413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.181123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.184268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgxm\" (UniqueName: \"kubernetes.io/projected/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-kube-api-access-gdgxm\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.195047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.205620 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.238930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.605861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:07:02 crc kubenswrapper[4707]: W0121 16:07:02.608560 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod687b8d2a_f54f_4c5c_9d23_d5b0874682bb.slice/crio-bdeed3f4c39e501ff3d5e2e1630d9a320fbe8f0e15a353040c4f122753d2f73e WatchSource:0}: Error finding container bdeed3f4c39e501ff3d5e2e1630d9a320fbe8f0e15a353040c4f122753d2f73e: Status 404 returned error can't find the container with id bdeed3f4c39e501ff3d5e2e1630d9a320fbe8f0e15a353040c4f122753d2f73e Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.751311 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.751939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:07:02 crc kubenswrapper[4707]: I0121 16:07:02.812044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.016697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"687b8d2a-f54f-4c5c-9d23-d5b0874682bb","Type":"ContainerStarted","Data":"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106"} Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.016735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"687b8d2a-f54f-4c5c-9d23-d5b0874682bb","Type":"ContainerStarted","Data":"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7"} Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.016748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"687b8d2a-f54f-4c5c-9d23-d5b0874682bb","Type":"ContainerStarted","Data":"bdeed3f4c39e501ff3d5e2e1630d9a320fbe8f0e15a353040c4f122753d2f73e"} Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.035524 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.035509445 podStartE2EDuration="2.035509445s" podCreationTimestamp="2026-01-21 16:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:03.034641272 +0000 UTC m=+3920.216157494" watchObservedRunningTime="2026-01-21 16:07:03.035509445 +0000 UTC m=+3920.217025667" Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.089066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.310733 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:03 crc kubenswrapper[4707]: I0121 16:07:03.336422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:04 crc kubenswrapper[4707]: I0121 16:07:04.023116 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:04 crc kubenswrapper[4707]: I0121 16:07:04.249743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:07:04 crc kubenswrapper[4707]: I0121 16:07:04.249974 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:07:04 crc kubenswrapper[4707]: I0121 16:07:04.300756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.053477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.094428 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.206556 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.231185 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.320518 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-rzg8t"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.321352 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.324414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-rzg8t"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.414907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phx6\" (UniqueName: \"kubernetes.io/projected/c3134eb5-7694-439d-8556-fa12b646b0f4-kube-api-access-8phx6\") pod \"keystone-db-create-rzg8t\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.415197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3134eb5-7694-439d-8556-fa12b646b0f4-operator-scripts\") pod \"keystone-db-create-rzg8t\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.419821 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.420723 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.423032 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.427326 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.516389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d18151-76a1-4301-8747-0e41ad71b2e4-operator-scripts\") pod \"keystone-fd0d-account-create-update-pskrb\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.516528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3134eb5-7694-439d-8556-fa12b646b0f4-operator-scripts\") pod \"keystone-db-create-rzg8t\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.516592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6n2v\" (UniqueName: \"kubernetes.io/projected/50d18151-76a1-4301-8747-0e41ad71b2e4-kube-api-access-z6n2v\") pod \"keystone-fd0d-account-create-update-pskrb\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.516648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phx6\" (UniqueName: \"kubernetes.io/projected/c3134eb5-7694-439d-8556-fa12b646b0f4-kube-api-access-8phx6\") pod \"keystone-db-create-rzg8t\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.517336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3134eb5-7694-439d-8556-fa12b646b0f4-operator-scripts\") pod \"keystone-db-create-rzg8t\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.531659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phx6\" (UniqueName: \"kubernetes.io/projected/c3134eb5-7694-439d-8556-fa12b646b0f4-kube-api-access-8phx6\") pod \"keystone-db-create-rzg8t\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.618304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d18151-76a1-4301-8747-0e41ad71b2e4-operator-scripts\") pod \"keystone-fd0d-account-create-update-pskrb\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.618459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z6n2v\" (UniqueName: \"kubernetes.io/projected/50d18151-76a1-4301-8747-0e41ad71b2e4-kube-api-access-z6n2v\") pod \"keystone-fd0d-account-create-update-pskrb\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.619016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d18151-76a1-4301-8747-0e41ad71b2e4-operator-scripts\") pod \"keystone-fd0d-account-create-update-pskrb\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.630888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6n2v\" (UniqueName: \"kubernetes.io/projected/50d18151-76a1-4301-8747-0e41ad71b2e4-kube-api-access-z6n2v\") pod \"keystone-fd0d-account-create-update-pskrb\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.636647 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.734168 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.739851 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-55qx5"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.740647 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.771031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-55qx5"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.789648 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-eb97-account-create-update-kc79d"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.790492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.794210 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.794625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-eb97-account-create-update-kc79d"] Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.822368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6q4j\" (UniqueName: \"kubernetes.io/projected/ee7b5814-7f54-4d09-8bd2-2a3618903157-kube-api-access-w6q4j\") pod \"placement-db-create-55qx5\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.822605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee7b5814-7f54-4d09-8bd2-2a3618903157-operator-scripts\") pod \"placement-db-create-55qx5\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.822642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgd6f\" (UniqueName: \"kubernetes.io/projected/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-kube-api-access-lgd6f\") pod \"placement-eb97-account-create-update-kc79d\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.822696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-operator-scripts\") pod \"placement-eb97-account-create-update-kc79d\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.924499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee7b5814-7f54-4d09-8bd2-2a3618903157-operator-scripts\") pod \"placement-db-create-55qx5\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.924556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgd6f\" (UniqueName: \"kubernetes.io/projected/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-kube-api-access-lgd6f\") pod \"placement-eb97-account-create-update-kc79d\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.924594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-operator-scripts\") pod \"placement-eb97-account-create-update-kc79d\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 
16:07:05.924629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6q4j\" (UniqueName: \"kubernetes.io/projected/ee7b5814-7f54-4d09-8bd2-2a3618903157-kube-api-access-w6q4j\") pod \"placement-db-create-55qx5\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.926183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-operator-scripts\") pod \"placement-eb97-account-create-update-kc79d\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.927247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee7b5814-7f54-4d09-8bd2-2a3618903157-operator-scripts\") pod \"placement-db-create-55qx5\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.946261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6q4j\" (UniqueName: \"kubernetes.io/projected/ee7b5814-7f54-4d09-8bd2-2a3618903157-kube-api-access-w6q4j\") pod \"placement-db-create-55qx5\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:05 crc kubenswrapper[4707]: I0121 16:07:05.948600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgd6f\" (UniqueName: \"kubernetes.io/projected/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-kube-api-access-lgd6f\") pod \"placement-eb97-account-create-update-kc79d\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.033702 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.046025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-rzg8t"] Jan 21 16:07:06 crc kubenswrapper[4707]: W0121 16:07:06.048279 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3134eb5_7694_439d_8556_fa12b646b0f4.slice/crio-17e1bf2c735d272990c3be1187a1b6419c9baad8f77834d680d3acb1346532cb WatchSource:0}: Error finding container 17e1bf2c735d272990c3be1187a1b6419c9baad8f77834d680d3acb1346532cb: Status 404 returned error can't find the container with id 17e1bf2c735d272990c3be1187a1b6419c9baad8f77834d680d3acb1346532cb Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.096590 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.115100 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.145516 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb"] Jan 21 16:07:06 crc kubenswrapper[4707]: W0121 16:07:06.147323 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d18151_76a1_4301_8747_0e41ad71b2e4.slice/crio-e2a5f6834d35dfbda9ebf9b8fd17cb196b239e5160350c0111f16aae197af98f WatchSource:0}: Error finding container e2a5f6834d35dfbda9ebf9b8fd17cb196b239e5160350c0111f16aae197af98f: Status 404 returned error can't find the container with id e2a5f6834d35dfbda9ebf9b8fd17cb196b239e5160350c0111f16aae197af98f Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.488738 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-55qx5"] Jan 21 16:07:06 crc kubenswrapper[4707]: W0121 16:07:06.524246 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee7b5814_7f54_4d09_8bd2_2a3618903157.slice/crio-138dfb3e7904a569479dc5faba6fab3a6a52612c034ac692b10e71e061ac0f1e WatchSource:0}: Error finding container 138dfb3e7904a569479dc5faba6fab3a6a52612c034ac692b10e71e061ac0f1e: Status 404 returned error can't find the container with id 138dfb3e7904a569479dc5faba6fab3a6a52612c034ac692b10e71e061ac0f1e Jan 21 16:07:06 crc kubenswrapper[4707]: I0121 16:07:06.603303 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-eb97-account-create-update-kc79d"] Jan 21 16:07:06 crc kubenswrapper[4707]: W0121 16:07:06.609635 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7fad7e_b367_4cf0_8ed3_565ec653bc17.slice/crio-09a02b634b48ddbce1827434703fefc3314202e189dd477fc654d6268c212131 WatchSource:0}: Error finding container 09a02b634b48ddbce1827434703fefc3314202e189dd477fc654d6268c212131: Status 404 returned error can't find the container with id 09a02b634b48ddbce1827434703fefc3314202e189dd477fc654d6268c212131 Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.041107 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee7b5814-7f54-4d09-8bd2-2a3618903157" containerID="9dbde5c66335b6740d73be0a6801440b8c6adb6737788ea519f025ec1a4c33d4" exitCode=0 Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.041152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-55qx5" event={"ID":"ee7b5814-7f54-4d09-8bd2-2a3618903157","Type":"ContainerDied","Data":"9dbde5c66335b6740d73be0a6801440b8c6adb6737788ea519f025ec1a4c33d4"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.041199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-55qx5" event={"ID":"ee7b5814-7f54-4d09-8bd2-2a3618903157","Type":"ContainerStarted","Data":"138dfb3e7904a569479dc5faba6fab3a6a52612c034ac692b10e71e061ac0f1e"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.042510 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b7fad7e-b367-4cf0-8ed3-565ec653bc17" containerID="2d8a72a6540e65f712baf7c89e1c838c5fd6f3327fb32d1684d970e96405a48a" exitCode=0 Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.042546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" event={"ID":"9b7fad7e-b367-4cf0-8ed3-565ec653bc17","Type":"ContainerDied","Data":"2d8a72a6540e65f712baf7c89e1c838c5fd6f3327fb32d1684d970e96405a48a"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.042578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" event={"ID":"9b7fad7e-b367-4cf0-8ed3-565ec653bc17","Type":"ContainerStarted","Data":"09a02b634b48ddbce1827434703fefc3314202e189dd477fc654d6268c212131"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.043853 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3134eb5-7694-439d-8556-fa12b646b0f4" containerID="4d2e09304453da89fa1ebfc18a78fea8433713fb84782f539b40233e5b40f194" exitCode=0 Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.043906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" event={"ID":"c3134eb5-7694-439d-8556-fa12b646b0f4","Type":"ContainerDied","Data":"4d2e09304453da89fa1ebfc18a78fea8433713fb84782f539b40233e5b40f194"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.043952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" event={"ID":"c3134eb5-7694-439d-8556-fa12b646b0f4","Type":"ContainerStarted","Data":"17e1bf2c735d272990c3be1187a1b6419c9baad8f77834d680d3acb1346532cb"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.045031 4707 generic.go:334] "Generic (PLEG): container finished" podID="50d18151-76a1-4301-8747-0e41ad71b2e4" containerID="a2d2e548b2324c60f458f301629f67a22437c8af5ea4cd732d3d8c06e695a507" exitCode=0 Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.045109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" event={"ID":"50d18151-76a1-4301-8747-0e41ad71b2e4","Type":"ContainerDied","Data":"a2d2e548b2324c60f458f301629f67a22437c8af5ea4cd732d3d8c06e695a507"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.045146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" event={"ID":"50d18151-76a1-4301-8747-0e41ad71b2e4","Type":"ContainerStarted","Data":"e2a5f6834d35dfbda9ebf9b8fd17cb196b239e5160350c0111f16aae197af98f"} Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.080889 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.212023 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.213136 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.215124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.215149 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.215647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-ljxg7" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.220633 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.223237 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmslp\" (UniqueName: \"kubernetes.io/projected/bd6d7315-b24c-411b-acc8-bda614951af2-kube-api-access-jmslp\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-scripts\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-config\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.246306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-scripts\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-config\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.347908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmslp\" (UniqueName: \"kubernetes.io/projected/bd6d7315-b24c-411b-acc8-bda614951af2-kube-api-access-jmslp\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.348490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.348761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-config\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.348789 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-scripts\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.352285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.353610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.354136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.362371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmslp\" (UniqueName: \"kubernetes.io/projected/bd6d7315-b24c-411b-acc8-bda614951af2-kube-api-access-jmslp\") pod \"ovn-northd-0\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.526377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.634554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.722599 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.728508 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.734146 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-nvkh6" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.734383 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.734499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.734611 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.756412 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.757643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-lock\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.757731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.757753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-cache\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.757904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.757939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzcjc\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-kube-api-access-kzcjc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.837675 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x94hb"] Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.838611 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.840843 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.841390 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.841499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.859415 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x94hb"] Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-lock\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-dispersionconf\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-cache\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-ring-data-devices\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-combined-ca-bundle\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qpj\" (UniqueName: \"kubernetes.io/projected/ec033d89-b803-4077-8fe1-58d264bd8772-kube-api-access-46qpj\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 
16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-swiftconf\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-scripts\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzcjc\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-kube-api-access-kzcjc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.860478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec033d89-b803-4077-8fe1-58d264bd8772-etc-swift\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.861115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-lock\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.861356 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: E0121 16:07:07.863170 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:07:07 crc kubenswrapper[4707]: E0121 16:07:07.863195 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:07:07 crc kubenswrapper[4707]: E0121 16:07:07.863231 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift podName:f49362da-7cec-499b-be08-6930c4f470e6 nodeName:}" failed. No retries permitted until 2026-01-21 16:07:08.363215762 +0000 UTC m=+3925.544731984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift") pod "swift-storage-0" (UID: "f49362da-7cec-499b-be08-6930c4f470e6") : configmap "swift-ring-files" not found Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.863525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-cache\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.900906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.912910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzcjc\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-kube-api-access-kzcjc\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-dispersionconf\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-ring-data-devices\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-combined-ca-bundle\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qpj\" (UniqueName: \"kubernetes.io/projected/ec033d89-b803-4077-8fe1-58d264bd8772-kube-api-access-46qpj\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-swiftconf\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-scripts\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.962953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec033d89-b803-4077-8fe1-58d264bd8772-etc-swift\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.964136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec033d89-b803-4077-8fe1-58d264bd8772-etc-swift\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.970068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-ring-data-devices\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.973425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-scripts\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.985971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-combined-ca-bundle\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.995991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-dispersionconf\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:07 crc kubenswrapper[4707]: I0121 16:07:07.998343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-swiftconf\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.004425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qpj\" (UniqueName: \"kubernetes.io/projected/ec033d89-b803-4077-8fe1-58d264bd8772-kube-api-access-46qpj\") pod \"swift-ring-rebalance-x94hb\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.182506 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:07:08 crc kubenswrapper[4707]: E0121 16:07:08.182802 
4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.199665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.269405 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.368582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:08 crc kubenswrapper[4707]: E0121 16:07:08.368745 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:07:08 crc kubenswrapper[4707]: E0121 16:07:08.368840 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:07:08 crc kubenswrapper[4707]: E0121 16:07:08.368889 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift podName:f49362da-7cec-499b-be08-6930c4f470e6 nodeName:}" failed. No retries permitted until 2026-01-21 16:07:09.368874806 +0000 UTC m=+3926.550391029 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift") pod "swift-storage-0" (UID: "f49362da-7cec-499b-be08-6930c4f470e6") : configmap "swift-ring-files" not found Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.421667 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.573232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d18151-76a1-4301-8747-0e41ad71b2e4-operator-scripts\") pod \"50d18151-76a1-4301-8747-0e41ad71b2e4\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.573296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6n2v\" (UniqueName: \"kubernetes.io/projected/50d18151-76a1-4301-8747-0e41ad71b2e4-kube-api-access-z6n2v\") pod \"50d18151-76a1-4301-8747-0e41ad71b2e4\" (UID: \"50d18151-76a1-4301-8747-0e41ad71b2e4\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.573760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50d18151-76a1-4301-8747-0e41ad71b2e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50d18151-76a1-4301-8747-0e41ad71b2e4" (UID: "50d18151-76a1-4301-8747-0e41ad71b2e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.574030 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d18151-76a1-4301-8747-0e41ad71b2e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.577182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d18151-76a1-4301-8747-0e41ad71b2e4-kube-api-access-z6n2v" (OuterVolumeSpecName: "kube-api-access-z6n2v") pod "50d18151-76a1-4301-8747-0e41ad71b2e4" (UID: "50d18151-76a1-4301-8747-0e41ad71b2e4"). InnerVolumeSpecName "kube-api-access-z6n2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.640078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.645375 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.653038 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.676339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6n2v\" (UniqueName: \"kubernetes.io/projected/50d18151-76a1-4301-8747-0e41ad71b2e4-kube-api-access-z6n2v\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.761959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x94hb"] Jan 21 16:07:08 crc kubenswrapper[4707]: W0121 16:07:08.764242 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec033d89_b803_4077_8fe1_58d264bd8772.slice/crio-b6c554d04a4c38bfa85e7302024e602c3d1fa3db7b6d2f471d0f2a5b47e62e70 WatchSource:0}: Error finding container b6c554d04a4c38bfa85e7302024e602c3d1fa3db7b6d2f471d0f2a5b47e62e70: Status 404 returned error can't find the container with id b6c554d04a4c38bfa85e7302024e602c3d1fa3db7b6d2f471d0f2a5b47e62e70 Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.777798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phx6\" (UniqueName: \"kubernetes.io/projected/c3134eb5-7694-439d-8556-fa12b646b0f4-kube-api-access-8phx6\") pod \"c3134eb5-7694-439d-8556-fa12b646b0f4\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.777858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee7b5814-7f54-4d09-8bd2-2a3618903157-operator-scripts\") pod \"ee7b5814-7f54-4d09-8bd2-2a3618903157\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.777931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3134eb5-7694-439d-8556-fa12b646b0f4-operator-scripts\") pod \"c3134eb5-7694-439d-8556-fa12b646b0f4\" (UID: \"c3134eb5-7694-439d-8556-fa12b646b0f4\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.777998 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-operator-scripts\") pod \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.778047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6q4j\" (UniqueName: \"kubernetes.io/projected/ee7b5814-7f54-4d09-8bd2-2a3618903157-kube-api-access-w6q4j\") pod \"ee7b5814-7f54-4d09-8bd2-2a3618903157\" (UID: \"ee7b5814-7f54-4d09-8bd2-2a3618903157\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.778081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgd6f\" (UniqueName: \"kubernetes.io/projected/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-kube-api-access-lgd6f\") pod \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\" (UID: \"9b7fad7e-b367-4cf0-8ed3-565ec653bc17\") " Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.779247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b7fad7e-b367-4cf0-8ed3-565ec653bc17" (UID: "9b7fad7e-b367-4cf0-8ed3-565ec653bc17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.779493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3134eb5-7694-439d-8556-fa12b646b0f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3134eb5-7694-439d-8556-fa12b646b0f4" (UID: "c3134eb5-7694-439d-8556-fa12b646b0f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.779659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee7b5814-7f54-4d09-8bd2-2a3618903157-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee7b5814-7f54-4d09-8bd2-2a3618903157" (UID: "ee7b5814-7f54-4d09-8bd2-2a3618903157"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.787175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7b5814-7f54-4d09-8bd2-2a3618903157-kube-api-access-w6q4j" (OuterVolumeSpecName: "kube-api-access-w6q4j") pod "ee7b5814-7f54-4d09-8bd2-2a3618903157" (UID: "ee7b5814-7f54-4d09-8bd2-2a3618903157"). InnerVolumeSpecName "kube-api-access-w6q4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.787480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-kube-api-access-lgd6f" (OuterVolumeSpecName: "kube-api-access-lgd6f") pod "9b7fad7e-b367-4cf0-8ed3-565ec653bc17" (UID: "9b7fad7e-b367-4cf0-8ed3-565ec653bc17"). InnerVolumeSpecName "kube-api-access-lgd6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.787946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3134eb5-7694-439d-8556-fa12b646b0f4-kube-api-access-8phx6" (OuterVolumeSpecName: "kube-api-access-8phx6") pod "c3134eb5-7694-439d-8556-fa12b646b0f4" (UID: "c3134eb5-7694-439d-8556-fa12b646b0f4"). InnerVolumeSpecName "kube-api-access-8phx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.880296 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6q4j\" (UniqueName: \"kubernetes.io/projected/ee7b5814-7f54-4d09-8bd2-2a3618903157-kube-api-access-w6q4j\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.880322 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgd6f\" (UniqueName: \"kubernetes.io/projected/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-kube-api-access-lgd6f\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.880332 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phx6\" (UniqueName: \"kubernetes.io/projected/c3134eb5-7694-439d-8556-fa12b646b0f4-kube-api-access-8phx6\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.880351 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee7b5814-7f54-4d09-8bd2-2a3618903157-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.880360 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3134eb5-7694-439d-8556-fa12b646b0f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:08 crc kubenswrapper[4707]: I0121 16:07:08.880368 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7fad7e-b367-4cf0-8ed3-565ec653bc17-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.058962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" event={"ID":"9b7fad7e-b367-4cf0-8ed3-565ec653bc17","Type":"ContainerDied","Data":"09a02b634b48ddbce1827434703fefc3314202e189dd477fc654d6268c212131"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.058998 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09a02b634b48ddbce1827434703fefc3314202e189dd477fc654d6268c212131" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.058976 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-eb97-account-create-update-kc79d" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.060301 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.060301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-rzg8t" event={"ID":"c3134eb5-7694-439d-8556-fa12b646b0f4","Type":"ContainerDied","Data":"17e1bf2c735d272990c3be1187a1b6419c9baad8f77834d680d3acb1346532cb"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.060424 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17e1bf2c735d272990c3be1187a1b6419c9baad8f77834d680d3acb1346532cb" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.061553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" event={"ID":"50d18151-76a1-4301-8747-0e41ad71b2e4","Type":"ContainerDied","Data":"e2a5f6834d35dfbda9ebf9b8fd17cb196b239e5160350c0111f16aae197af98f"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.061600 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2a5f6834d35dfbda9ebf9b8fd17cb196b239e5160350c0111f16aae197af98f" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.061559 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.062664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"bd6d7315-b24c-411b-acc8-bda614951af2","Type":"ContainerStarted","Data":"edc9c99ba54dd7b30862a576d3d32b484fff02281a8a134d2f7a03fdef6a53c6"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.062690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"bd6d7315-b24c-411b-acc8-bda614951af2","Type":"ContainerStarted","Data":"f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.062701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"bd6d7315-b24c-411b-acc8-bda614951af2","Type":"ContainerStarted","Data":"d929e54dc6614dc80117812f2e8dcdeca2febaf9705801ba94cc500ea1e5b3db"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.063370 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.064467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-55qx5" event={"ID":"ee7b5814-7f54-4d09-8bd2-2a3618903157","Type":"ContainerDied","Data":"138dfb3e7904a569479dc5faba6fab3a6a52612c034ac692b10e71e061ac0f1e"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.064486 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="138dfb3e7904a569479dc5faba6fab3a6a52612c034ac692b10e71e061ac0f1e" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.064521 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-55qx5" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.066274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" event={"ID":"ec033d89-b803-4077-8fe1-58d264bd8772","Type":"ContainerStarted","Data":"d452bc6e221810ea066cd64f5b6bffe50921334f2dde676c274b42522b363fcf"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.066313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" event={"ID":"ec033d89-b803-4077-8fe1-58d264bd8772","Type":"ContainerStarted","Data":"b6c554d04a4c38bfa85e7302024e602c3d1fa3db7b6d2f471d0f2a5b47e62e70"} Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.086040 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.086027308 podStartE2EDuration="2.086027308s" podCreationTimestamp="2026-01-21 16:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:09.081042184 +0000 UTC m=+3926.262558406" watchObservedRunningTime="2026-01-21 16:07:09.086027308 +0000 UTC m=+3926.267543530" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.101658 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" podStartSLOduration=2.101645725 podStartE2EDuration="2.101645725s" podCreationTimestamp="2026-01-21 16:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:09.098856961 +0000 UTC m=+3926.280373183" watchObservedRunningTime="2026-01-21 16:07:09.101645725 +0000 UTC m=+3926.283161948" Jan 21 16:07:09 crc kubenswrapper[4707]: I0121 16:07:09.387381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:09 crc kubenswrapper[4707]: E0121 16:07:09.387565 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:07:09 crc kubenswrapper[4707]: E0121 16:07:09.387591 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:07:09 crc kubenswrapper[4707]: E0121 16:07:09.387644 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift podName:f49362da-7cec-499b-be08-6930c4f470e6 nodeName:}" failed. No retries permitted until 2026-01-21 16:07:11.387628757 +0000 UTC m=+3928.569144979 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift") pod "swift-storage-0" (UID: "f49362da-7cec-499b-be08-6930c4f470e6") : configmap "swift-ring-files" not found Jan 21 16:07:11 crc kubenswrapper[4707]: I0121 16:07:11.413916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:11 crc kubenswrapper[4707]: E0121 16:07:11.414141 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:07:11 crc kubenswrapper[4707]: E0121 16:07:11.414170 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:07:11 crc kubenswrapper[4707]: E0121 16:07:11.414227 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift podName:f49362da-7cec-499b-be08-6930c4f470e6 nodeName:}" failed. No retries permitted until 2026-01-21 16:07:15.414209912 +0000 UTC m=+3932.595726144 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift") pod "swift-storage-0" (UID: "f49362da-7cec-499b-be08-6930c4f470e6") : configmap "swift-ring-files" not found Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.398664 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-w4qwr"] Jan 21 16:07:12 crc kubenswrapper[4707]: E0121 16:07:12.399141 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7b5814-7f54-4d09-8bd2-2a3618903157" containerName="mariadb-database-create" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7b5814-7f54-4d09-8bd2-2a3618903157" containerName="mariadb-database-create" Jan 21 16:07:12 crc kubenswrapper[4707]: E0121 16:07:12.399177 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3134eb5-7694-439d-8556-fa12b646b0f4" containerName="mariadb-database-create" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399183 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3134eb5-7694-439d-8556-fa12b646b0f4" containerName="mariadb-database-create" Jan 21 16:07:12 crc kubenswrapper[4707]: E0121 16:07:12.399195 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7fad7e-b367-4cf0-8ed3-565ec653bc17" containerName="mariadb-account-create-update" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399200 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7fad7e-b367-4cf0-8ed3-565ec653bc17" containerName="mariadb-account-create-update" Jan 21 16:07:12 crc kubenswrapper[4707]: E0121 16:07:12.399213 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d18151-76a1-4301-8747-0e41ad71b2e4" containerName="mariadb-account-create-update" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399218 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d18151-76a1-4301-8747-0e41ad71b2e4" containerName="mariadb-account-create-update" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 
16:07:12.399380 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3134eb5-7694-439d-8556-fa12b646b0f4" containerName="mariadb-database-create" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399396 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7fad7e-b367-4cf0-8ed3-565ec653bc17" containerName="mariadb-account-create-update" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399406 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7b5814-7f54-4d09-8bd2-2a3618903157" containerName="mariadb-database-create" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399413 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d18151-76a1-4301-8747-0e41ad71b2e4" containerName="mariadb-account-create-update" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.399862 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.407047 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-w4qwr"] Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.505982 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v5rnt"] Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.506833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.510602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.518198 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v5rnt"] Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.534019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bae039e-7430-4b54-9235-8aad8a9b834c-operator-scripts\") pod \"glance-db-create-w4qwr\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.534119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwq5g\" (UniqueName: \"kubernetes.io/projected/3bae039e-7430-4b54-9235-8aad8a9b834c-kube-api-access-xwq5g\") pod \"glance-db-create-w4qwr\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.606529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9"] Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.607389 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.609508 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.614371 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9"] Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.636181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-operator-scripts\") pod \"root-account-create-update-v5rnt\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.636440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwq5g\" (UniqueName: \"kubernetes.io/projected/3bae039e-7430-4b54-9235-8aad8a9b834c-kube-api-access-xwq5g\") pod \"glance-db-create-w4qwr\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.636599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ppq\" (UniqueName: \"kubernetes.io/projected/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-kube-api-access-l2ppq\") pod \"root-account-create-update-v5rnt\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.636678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bae039e-7430-4b54-9235-8aad8a9b834c-operator-scripts\") pod \"glance-db-create-w4qwr\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.637370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bae039e-7430-4b54-9235-8aad8a9b834c-operator-scripts\") pod \"glance-db-create-w4qwr\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.652066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwq5g\" (UniqueName: \"kubernetes.io/projected/3bae039e-7430-4b54-9235-8aad8a9b834c-kube-api-access-xwq5g\") pod \"glance-db-create-w4qwr\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.714776 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.738542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ppq\" (UniqueName: \"kubernetes.io/projected/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-kube-api-access-l2ppq\") pod \"root-account-create-update-v5rnt\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.738607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/163a4187-712b-4e94-b883-6f12462cfc13-operator-scripts\") pod \"glance-b0b5-account-create-update-d92q9\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.738642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-operator-scripts\") pod \"root-account-create-update-v5rnt\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.738663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqv77\" (UniqueName: \"kubernetes.io/projected/163a4187-712b-4e94-b883-6f12462cfc13-kube-api-access-bqv77\") pod \"glance-b0b5-account-create-update-d92q9\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.739235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-operator-scripts\") pod \"root-account-create-update-v5rnt\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.751909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ppq\" (UniqueName: \"kubernetes.io/projected/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-kube-api-access-l2ppq\") pod \"root-account-create-update-v5rnt\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.819136 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.839877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/163a4187-712b-4e94-b883-6f12462cfc13-operator-scripts\") pod \"glance-b0b5-account-create-update-d92q9\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.839937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqv77\" (UniqueName: \"kubernetes.io/projected/163a4187-712b-4e94-b883-6f12462cfc13-kube-api-access-bqv77\") pod \"glance-b0b5-account-create-update-d92q9\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.840922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/163a4187-712b-4e94-b883-6f12462cfc13-operator-scripts\") pod \"glance-b0b5-account-create-update-d92q9\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.853692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqv77\" (UniqueName: \"kubernetes.io/projected/163a4187-712b-4e94-b883-6f12462cfc13-kube-api-access-bqv77\") pod \"glance-b0b5-account-create-update-d92q9\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:12 crc kubenswrapper[4707]: I0121 16:07:12.923474 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:13 crc kubenswrapper[4707]: I0121 16:07:13.101456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-w4qwr"] Jan 21 16:07:13 crc kubenswrapper[4707]: W0121 16:07:13.110322 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bae039e_7430_4b54_9235_8aad8a9b834c.slice/crio-99a5374a84bac0110be1f220cc1d83cd4af3a008e34522e411244b83ffe8aac0 WatchSource:0}: Error finding container 99a5374a84bac0110be1f220cc1d83cd4af3a008e34522e411244b83ffe8aac0: Status 404 returned error can't find the container with id 99a5374a84bac0110be1f220cc1d83cd4af3a008e34522e411244b83ffe8aac0 Jan 21 16:07:13 crc kubenswrapper[4707]: W0121 16:07:13.207196 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4fd89cf_9f3d_4202_81ee_e70d62596f4e.slice/crio-667559752cf2dbd779c8d84ac612539d9c59231e07eefa22a7badeb2737564cf WatchSource:0}: Error finding container 667559752cf2dbd779c8d84ac612539d9c59231e07eefa22a7badeb2737564cf: Status 404 returned error can't find the container with id 667559752cf2dbd779c8d84ac612539d9c59231e07eefa22a7badeb2737564cf Jan 21 16:07:13 crc kubenswrapper[4707]: I0121 16:07:13.208908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v5rnt"] Jan 21 16:07:13 crc kubenswrapper[4707]: I0121 16:07:13.308129 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9"] Jan 21 16:07:13 crc kubenswrapper[4707]: W0121 16:07:13.313979 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163a4187_712b_4e94_b883_6f12462cfc13.slice/crio-b562d563a4fe6eafd1e44550045973c17073af1555def5e88d176597eb0b757a WatchSource:0}: Error finding container b562d563a4fe6eafd1e44550045973c17073af1555def5e88d176597eb0b757a: Status 404 returned error can't find the container with id b562d563a4fe6eafd1e44550045973c17073af1555def5e88d176597eb0b757a Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.108483 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bae039e-7430-4b54-9235-8aad8a9b834c" containerID="cd4925de544fedf8e5e8b19e458b6dd165d182b18e2bd7e6a19f58b6fb856b84" exitCode=0 Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.108555 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-w4qwr" event={"ID":"3bae039e-7430-4b54-9235-8aad8a9b834c","Type":"ContainerDied","Data":"cd4925de544fedf8e5e8b19e458b6dd165d182b18e2bd7e6a19f58b6fb856b84"} Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.108721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-w4qwr" event={"ID":"3bae039e-7430-4b54-9235-8aad8a9b834c","Type":"ContainerStarted","Data":"99a5374a84bac0110be1f220cc1d83cd4af3a008e34522e411244b83ffe8aac0"} Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.111828 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4fd89cf-9f3d-4202-81ee-e70d62596f4e" containerID="531c98ba1ff359ad78152d8e44092a209ec3fab7f1115bfbad71fa1070d2ad28" exitCode=0 Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.111950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/root-account-create-update-v5rnt" event={"ID":"b4fd89cf-9f3d-4202-81ee-e70d62596f4e","Type":"ContainerDied","Data":"531c98ba1ff359ad78152d8e44092a209ec3fab7f1115bfbad71fa1070d2ad28"} Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.111979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-v5rnt" event={"ID":"b4fd89cf-9f3d-4202-81ee-e70d62596f4e","Type":"ContainerStarted","Data":"667559752cf2dbd779c8d84ac612539d9c59231e07eefa22a7badeb2737564cf"} Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.114753 4707 generic.go:334] "Generic (PLEG): container finished" podID="163a4187-712b-4e94-b883-6f12462cfc13" containerID="8a63ed3c9b50f6cc80dfcc95323d2f7c984bdc6c47f6e52d33e7abe224e613c3" exitCode=0 Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.114785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" event={"ID":"163a4187-712b-4e94-b883-6f12462cfc13","Type":"ContainerDied","Data":"8a63ed3c9b50f6cc80dfcc95323d2f7c984bdc6c47f6e52d33e7abe224e613c3"} Jan 21 16:07:14 crc kubenswrapper[4707]: I0121 16:07:14.114820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" event={"ID":"163a4187-712b-4e94-b883-6f12462cfc13","Type":"ContainerStarted","Data":"b562d563a4fe6eafd1e44550045973c17073af1555def5e88d176597eb0b757a"} Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.126677 4707 generic.go:334] "Generic (PLEG): container finished" podID="ec033d89-b803-4077-8fe1-58d264bd8772" containerID="d452bc6e221810ea066cd64f5b6bffe50921334f2dde676c274b42522b363fcf" exitCode=0 Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.126773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" event={"ID":"ec033d89-b803-4077-8fe1-58d264bd8772","Type":"ContainerDied","Data":"d452bc6e221810ea066cd64f5b6bffe50921334f2dde676c274b42522b363fcf"} Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.397459 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.481864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bae039e-7430-4b54-9235-8aad8a9b834c-operator-scripts\") pod \"3bae039e-7430-4b54-9235-8aad8a9b834c\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.481927 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwq5g\" (UniqueName: \"kubernetes.io/projected/3bae039e-7430-4b54-9235-8aad8a9b834c-kube-api-access-xwq5g\") pod \"3bae039e-7430-4b54-9235-8aad8a9b834c\" (UID: \"3bae039e-7430-4b54-9235-8aad8a9b834c\") " Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.482333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.482739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bae039e-7430-4b54-9235-8aad8a9b834c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bae039e-7430-4b54-9235-8aad8a9b834c" (UID: "3bae039e-7430-4b54-9235-8aad8a9b834c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.488754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bae039e-7430-4b54-9235-8aad8a9b834c-kube-api-access-xwq5g" (OuterVolumeSpecName: "kube-api-access-xwq5g") pod "3bae039e-7430-4b54-9235-8aad8a9b834c" (UID: "3bae039e-7430-4b54-9235-8aad8a9b834c"). InnerVolumeSpecName "kube-api-access-xwq5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.498067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"swift-storage-0\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.538224 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.577822 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.583680 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bae039e-7430-4b54-9235-8aad8a9b834c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.583709 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwq5g\" (UniqueName: \"kubernetes.io/projected/3bae039e-7430-4b54-9235-8aad8a9b834c-kube-api-access-xwq5g\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.603187 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.626859 4707 scope.go:117] "RemoveContainer" containerID="f3747fccd776cf61a6bf6bea5e9650119f54f111c7925fb761524b94126ed398" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.653886 4707 scope.go:117] "RemoveContainer" containerID="ae353753b0fb18d8352ca556e1585eaa516abe93adbada375743082e9628ebc2" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.673697 4707 scope.go:117] "RemoveContainer" containerID="90cf9dd71194d32afd09ab818cff3462448499ba00cfc5818ba70273315fb6fc" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.684855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqv77\" (UniqueName: \"kubernetes.io/projected/163a4187-712b-4e94-b883-6f12462cfc13-kube-api-access-bqv77\") pod \"163a4187-712b-4e94-b883-6f12462cfc13\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.684943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2ppq\" (UniqueName: \"kubernetes.io/projected/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-kube-api-access-l2ppq\") pod \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.684987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-operator-scripts\") pod \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\" (UID: \"b4fd89cf-9f3d-4202-81ee-e70d62596f4e\") " Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.685006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/163a4187-712b-4e94-b883-6f12462cfc13-operator-scripts\") pod \"163a4187-712b-4e94-b883-6f12462cfc13\" (UID: \"163a4187-712b-4e94-b883-6f12462cfc13\") " Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.685961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163a4187-712b-4e94-b883-6f12462cfc13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "163a4187-712b-4e94-b883-6f12462cfc13" (UID: "163a4187-712b-4e94-b883-6f12462cfc13"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.686884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4fd89cf-9f3d-4202-81ee-e70d62596f4e" (UID: "b4fd89cf-9f3d-4202-81ee-e70d62596f4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.688222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-kube-api-access-l2ppq" (OuterVolumeSpecName: "kube-api-access-l2ppq") pod "b4fd89cf-9f3d-4202-81ee-e70d62596f4e" (UID: "b4fd89cf-9f3d-4202-81ee-e70d62596f4e"). InnerVolumeSpecName "kube-api-access-l2ppq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.688287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/163a4187-712b-4e94-b883-6f12462cfc13-kube-api-access-bqv77" (OuterVolumeSpecName: "kube-api-access-bqv77") pod "163a4187-712b-4e94-b883-6f12462cfc13" (UID: "163a4187-712b-4e94-b883-6f12462cfc13"). InnerVolumeSpecName "kube-api-access-bqv77". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.696048 4707 scope.go:117] "RemoveContainer" containerID="e79cc9bb5b22b0eed5431d7ce834ba9dc92b3ac1898938e892029f4cdbc1812f" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.731833 4707 scope.go:117] "RemoveContainer" containerID="ceaf1dadb0a41262987a363b58f66fe315a552cc7470f8d46cfc49b76e95959e" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.746624 4707 scope.go:117] "RemoveContainer" containerID="242d6d6cc98648532f46e922558d3ad8d98cde0c4a39d7498647f465fc542cfe" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.782189 4707 scope.go:117] "RemoveContainer" containerID="82201ef1750ba378349ad8aa6af1ad86347056341ef06a8fd9c79319643362ff" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.786596 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqv77\" (UniqueName: \"kubernetes.io/projected/163a4187-712b-4e94-b883-6f12462cfc13-kube-api-access-bqv77\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.786622 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2ppq\" (UniqueName: \"kubernetes.io/projected/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-kube-api-access-l2ppq\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.786633 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fd89cf-9f3d-4202-81ee-e70d62596f4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.786642 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/163a4187-712b-4e94-b883-6f12462cfc13-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.796423 4707 scope.go:117] "RemoveContainer" containerID="d09d9e5a0fd4869c4422a15b0398bb6bfdf3645c34fe7a84bc0e0e94d680986f" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.809180 4707 scope.go:117] "RemoveContainer" containerID="6493e34684b568e4b46ad2000671bad1726d6a92122699a0a1930bed82439445" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.831623 4707 scope.go:117] "RemoveContainer" containerID="18320d7d0fe5ff7086b31b3cb35d4c5a0cc62611fd7a0661a1b29d50cedc2e6f" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.845775 4707 scope.go:117] "RemoveContainer" containerID="01d7064ebbc3f6ab8fc1f264955b1dffe3fed3888e517ae25228c53c2f45394f" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.861660 4707 scope.go:117] "RemoveContainer" containerID="ad9cce4ce97e6242f33b520601f135d3c884134a1b63fb09b6ee8acb83dc3e92" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.889518 4707 scope.go:117] "RemoveContainer" containerID="b70e55f78933c8327e6fe960fb5ad4107cb5d18bbc98a566057ce52134a76c2f" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.917827 4707 scope.go:117] "RemoveContainer" 
containerID="50cdcb498e9416bb5590723b27e3bf3050fedf0e1a72b72bd0a12b1cbd5cea1e" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.944725 4707 scope.go:117] "RemoveContainer" containerID="bda8d3a244ebe6eb8a2cc33162b6791f9a27ee5b8d1ae3c29cbca5c78a213910" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.961904 4707 scope.go:117] "RemoveContainer" containerID="3972cf4fcd549a276993eab0ef23faf54b7c1c1e4e805563f5c0014b1ef5faeb" Jan 21 16:07:15 crc kubenswrapper[4707]: I0121 16:07:15.963076 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:07:15 crc kubenswrapper[4707]: W0121 16:07:15.967868 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf49362da_7cec_499b_be08_6930c4f470e6.slice/crio-64bca4d7bc91d4ddb1d6623f99b462d34b91c22910298ad1e7fc8c16003a20f2 WatchSource:0}: Error finding container 64bca4d7bc91d4ddb1d6623f99b462d34b91c22910298ad1e7fc8c16003a20f2: Status 404 returned error can't find the container with id 64bca4d7bc91d4ddb1d6623f99b462d34b91c22910298ad1e7fc8c16003a20f2 Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.147366 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.147362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9" event={"ID":"163a4187-712b-4e94-b883-6f12462cfc13","Type":"ContainerDied","Data":"b562d563a4fe6eafd1e44550045973c17073af1555def5e88d176597eb0b757a"} Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.147832 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b562d563a4fe6eafd1e44550045973c17073af1555def5e88d176597eb0b757a" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.156009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60"} Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.156050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"64bca4d7bc91d4ddb1d6623f99b462d34b91c22910298ad1e7fc8c16003a20f2"} Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.157525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-v5rnt" event={"ID":"b4fd89cf-9f3d-4202-81ee-e70d62596f4e","Type":"ContainerDied","Data":"667559752cf2dbd779c8d84ac612539d9c59231e07eefa22a7badeb2737564cf"} Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.157550 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667559752cf2dbd779c8d84ac612539d9c59231e07eefa22a7badeb2737564cf" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.157618 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v5rnt" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.166126 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-w4qwr" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.166123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-w4qwr" event={"ID":"3bae039e-7430-4b54-9235-8aad8a9b834c","Type":"ContainerDied","Data":"99a5374a84bac0110be1f220cc1d83cd4af3a008e34522e411244b83ffe8aac0"} Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.166514 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a5374a84bac0110be1f220cc1d83cd4af3a008e34522e411244b83ffe8aac0" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.495717 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.598685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-scripts\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.598935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qpj\" (UniqueName: \"kubernetes.io/projected/ec033d89-b803-4077-8fe1-58d264bd8772-kube-api-access-46qpj\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.599007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-combined-ca-bundle\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.599035 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-ring-data-devices\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.599062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec033d89-b803-4077-8fe1-58d264bd8772-etc-swift\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.599119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-dispersionconf\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.599142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-swiftconf\") pod \"ec033d89-b803-4077-8fe1-58d264bd8772\" (UID: \"ec033d89-b803-4077-8fe1-58d264bd8772\") " Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.600275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod 
"ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.600519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec033d89-b803-4077-8fe1-58d264bd8772-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.619914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec033d89-b803-4077-8fe1-58d264bd8772-kube-api-access-46qpj" (OuterVolumeSpecName: "kube-api-access-46qpj") pod "ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "kube-api-access-46qpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.627232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.630050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.631206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-scripts" (OuterVolumeSpecName: "scripts") pod "ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.635014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec033d89-b803-4077-8fe1-58d264bd8772" (UID: "ec033d89-b803-4077-8fe1-58d264bd8772"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700760 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700792 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700801 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700823 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qpj\" (UniqueName: \"kubernetes.io/projected/ec033d89-b803-4077-8fe1-58d264bd8772-kube-api-access-46qpj\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700834 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec033d89-b803-4077-8fe1-58d264bd8772-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700844 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec033d89-b803-4077-8fe1-58d264bd8772-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:16 crc kubenswrapper[4707]: I0121 16:07:16.700851 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec033d89-b803-4077-8fe1-58d264bd8772-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.184071 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.189191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-x94hb" event={"ID":"ec033d89-b803-4077-8fe1-58d264bd8772","Type":"ContainerDied","Data":"b6c554d04a4c38bfa85e7302024e602c3d1fa3db7b6d2f471d0f2a5b47e62e70"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.189226 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c554d04a4c38bfa85e7302024e602c3d1fa3db7b6d2f471d0f2a5b47e62e70" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.192890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479"} Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742204 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zhsbz"] Jan 21 16:07:17 crc kubenswrapper[4707]: E0121 16:07:17.742445 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bae039e-7430-4b54-9235-8aad8a9b834c" containerName="mariadb-database-create" Jan 21 16:07:17 crc kubenswrapper[4707]: 
I0121 16:07:17.742461 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bae039e-7430-4b54-9235-8aad8a9b834c" containerName="mariadb-database-create" Jan 21 16:07:17 crc kubenswrapper[4707]: E0121 16:07:17.742479 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec033d89-b803-4077-8fe1-58d264bd8772" containerName="swift-ring-rebalance" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742485 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec033d89-b803-4077-8fe1-58d264bd8772" containerName="swift-ring-rebalance" Jan 21 16:07:17 crc kubenswrapper[4707]: E0121 16:07:17.742500 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163a4187-712b-4e94-b883-6f12462cfc13" containerName="mariadb-account-create-update" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742505 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="163a4187-712b-4e94-b883-6f12462cfc13" containerName="mariadb-account-create-update" Jan 21 16:07:17 crc kubenswrapper[4707]: E0121 16:07:17.742517 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4fd89cf-9f3d-4202-81ee-e70d62596f4e" containerName="mariadb-account-create-update" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742522 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4fd89cf-9f3d-4202-81ee-e70d62596f4e" containerName="mariadb-account-create-update" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bae039e-7430-4b54-9235-8aad8a9b834c" containerName="mariadb-database-create" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec033d89-b803-4077-8fe1-58d264bd8772" containerName="swift-ring-rebalance" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742662 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="163a4187-712b-4e94-b883-6f12462cfc13" containerName="mariadb-account-create-update" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.742672 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4fd89cf-9f3d-4202-81ee-e70d62596f4e" containerName="mariadb-account-create-update" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.743150 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.747496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.747798 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-97m6r" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.765344 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zhsbz"] Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.918292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4csm9\" (UniqueName: \"kubernetes.io/projected/306d1f96-06ed-4428-987c-081fc0031dd1-kube-api-access-4csm9\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.918372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-combined-ca-bundle\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.918618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-db-sync-config-data\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:17 crc kubenswrapper[4707]: I0121 16:07:17.918731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-config-data\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.019727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-db-sync-config-data\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.019798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-config-data\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.019890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4csm9\" (UniqueName: \"kubernetes.io/projected/306d1f96-06ed-4428-987c-081fc0031dd1-kube-api-access-4csm9\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.019949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-combined-ca-bundle\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.024340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-db-sync-config-data\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.024454 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-combined-ca-bundle\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.029833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-config-data\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.046969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4csm9\" (UniqueName: \"kubernetes.io/projected/306d1f96-06ed-4428-987c-081fc0031dd1-kube-api-access-4csm9\") pod \"glance-db-sync-zhsbz\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.055772 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.204227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6"} Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.204257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b"} Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.204285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4"} Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.204294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25"} Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.204302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf"} Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.204319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerStarted","Data":"41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446"} Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.232523 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=11.232512682 podStartE2EDuration="11.232512682s" podCreationTimestamp="2026-01-21 16:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:18.223921047 +0000 UTC m=+3935.405437269" watchObservedRunningTime="2026-01-21 16:07:18.232512682 +0000 UTC m=+3935.414028904" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.323183 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8"] Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.335015 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.338104 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.346284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8"] Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.423659 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zhsbz"] Jan 21 16:07:18 crc kubenswrapper[4707]: W0121 16:07:18.427881 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod306d1f96_06ed_4428_987c_081fc0031dd1.slice/crio-3e6f1f9c6c47f7fe22da97132f7c3e88ef94b980c114f4f5debc73920687b701 WatchSource:0}: Error finding container 3e6f1f9c6c47f7fe22da97132f7c3e88ef94b980c114f4f5debc73920687b701: Status 404 returned error can't find the container with id 3e6f1f9c6c47f7fe22da97132f7c3e88ef94b980c114f4f5debc73920687b701 Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.429297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.429377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-config\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.429458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pcj6\" (UniqueName: \"kubernetes.io/projected/9ddec444-846e-4b2e-a046-4a0eee45b90f-kube-api-access-9pcj6\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.429570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.530841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.530898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-config\") pod 
\"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.530959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pcj6\" (UniqueName: \"kubernetes.io/projected/9ddec444-846e-4b2e-a046-4a0eee45b90f-kube-api-access-9pcj6\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.531008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.531707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.531737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-config\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.531866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.547085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pcj6\" (UniqueName: \"kubernetes.io/projected/9ddec444-846e-4b2e-a046-4a0eee45b90f-kube-api-access-9pcj6\") pod \"dnsmasq-dnsmasq-84cf554b49-q4ff8\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.651245 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.963590 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v5rnt"] Jan 21 16:07:18 crc kubenswrapper[4707]: I0121 16:07:18.967318 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v5rnt"] Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.024539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8"] Jan 21 16:07:19 crc kubenswrapper[4707]: W0121 16:07:19.028618 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ddec444_846e_4b2e_a046_4a0eee45b90f.slice/crio-f8bb4681e87046a5343de41aa872ccd0791c624406be49afd408961a807763da WatchSource:0}: Error finding container f8bb4681e87046a5343de41aa872ccd0791c624406be49afd408961a807763da: Status 404 returned error can't find the container with id f8bb4681e87046a5343de41aa872ccd0791c624406be49afd408961a807763da Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.190597 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4fd89cf-9f3d-4202-81ee-e70d62596f4e" path="/var/lib/kubelet/pods/b4fd89cf-9f3d-4202-81ee-e70d62596f4e/volumes" Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.211423 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerID="32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3" exitCode=0 Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.211492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" event={"ID":"9ddec444-846e-4b2e-a046-4a0eee45b90f","Type":"ContainerDied","Data":"32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3"} Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.211515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" event={"ID":"9ddec444-846e-4b2e-a046-4a0eee45b90f","Type":"ContainerStarted","Data":"f8bb4681e87046a5343de41aa872ccd0791c624406be49afd408961a807763da"} Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.213578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" event={"ID":"306d1f96-06ed-4428-987c-081fc0031dd1","Type":"ContainerStarted","Data":"5b2c98cc04fc5070f09e606ed3d0835a9cca88ad4946d72f7be2878e99f67f40"} Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.213608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" event={"ID":"306d1f96-06ed-4428-987c-081fc0031dd1","Type":"ContainerStarted","Data":"3e6f1f9c6c47f7fe22da97132f7c3e88ef94b980c114f4f5debc73920687b701"} Jan 21 16:07:19 crc kubenswrapper[4707]: I0121 16:07:19.260219 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" podStartSLOduration=2.260202545 podStartE2EDuration="2.260202545s" podCreationTimestamp="2026-01-21 16:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:19.248722619 +0000 UTC m=+3936.430238842" watchObservedRunningTime="2026-01-21 16:07:19.260202545 +0000 UTC m=+3936.441718767" Jan 21 
16:07:20 crc kubenswrapper[4707]: I0121 16:07:20.222099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" event={"ID":"9ddec444-846e-4b2e-a046-4a0eee45b90f","Type":"ContainerStarted","Data":"41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc"} Jan 21 16:07:20 crc kubenswrapper[4707]: I0121 16:07:20.222145 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:20 crc kubenswrapper[4707]: I0121 16:07:20.241512 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" podStartSLOduration=2.241498763 podStartE2EDuration="2.241498763s" podCreationTimestamp="2026-01-21 16:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:20.23780048 +0000 UTC m=+3937.419316701" watchObservedRunningTime="2026-01-21 16:07:20.241498763 +0000 UTC m=+3937.423014985" Jan 21 16:07:21 crc kubenswrapper[4707]: I0121 16:07:21.182982 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:07:21 crc kubenswrapper[4707]: E0121 16:07:21.183196 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:07:21 crc kubenswrapper[4707]: I0121 16:07:21.227703 4707 generic.go:334] "Generic (PLEG): container finished" podID="306d1f96-06ed-4428-987c-081fc0031dd1" containerID="5b2c98cc04fc5070f09e606ed3d0835a9cca88ad4946d72f7be2878e99f67f40" exitCode=0 Jan 21 16:07:21 crc kubenswrapper[4707]: I0121 16:07:21.227792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" event={"ID":"306d1f96-06ed-4428-987c-081fc0031dd1","Type":"ContainerDied","Data":"5b2c98cc04fc5070f09e606ed3d0835a9cca88ad4946d72f7be2878e99f67f40"} Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.521474 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.570084 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.686653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-config-data\") pod \"306d1f96-06ed-4428-987c-081fc0031dd1\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.686700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-combined-ca-bundle\") pod \"306d1f96-06ed-4428-987c-081fc0031dd1\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.686742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4csm9\" (UniqueName: \"kubernetes.io/projected/306d1f96-06ed-4428-987c-081fc0031dd1-kube-api-access-4csm9\") pod \"306d1f96-06ed-4428-987c-081fc0031dd1\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.687064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-db-sync-config-data\") pod \"306d1f96-06ed-4428-987c-081fc0031dd1\" (UID: \"306d1f96-06ed-4428-987c-081fc0031dd1\") " Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.701085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "306d1f96-06ed-4428-987c-081fc0031dd1" (UID: "306d1f96-06ed-4428-987c-081fc0031dd1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.701128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306d1f96-06ed-4428-987c-081fc0031dd1-kube-api-access-4csm9" (OuterVolumeSpecName: "kube-api-access-4csm9") pod "306d1f96-06ed-4428-987c-081fc0031dd1" (UID: "306d1f96-06ed-4428-987c-081fc0031dd1"). InnerVolumeSpecName "kube-api-access-4csm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.704911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "306d1f96-06ed-4428-987c-081fc0031dd1" (UID: "306d1f96-06ed-4428-987c-081fc0031dd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.716092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-config-data" (OuterVolumeSpecName: "config-data") pod "306d1f96-06ed-4428-987c-081fc0031dd1" (UID: "306d1f96-06ed-4428-987c-081fc0031dd1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.788614 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.788647 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.788660 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4csm9\" (UniqueName: \"kubernetes.io/projected/306d1f96-06ed-4428-987c-081fc0031dd1-kube-api-access-4csm9\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:22 crc kubenswrapper[4707]: I0121 16:07:22.788672 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/306d1f96-06ed-4428-987c-081fc0031dd1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.241074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" event={"ID":"306d1f96-06ed-4428-987c-081fc0031dd1","Type":"ContainerDied","Data":"3e6f1f9c6c47f7fe22da97132f7c3e88ef94b980c114f4f5debc73920687b701"} Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.241112 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6f1f9c6c47f7fe22da97132f7c3e88ef94b980c114f4f5debc73920687b701" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.241138 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zhsbz" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.969245 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fxjzc"] Jan 21 16:07:23 crc kubenswrapper[4707]: E0121 16:07:23.969518 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306d1f96-06ed-4428-987c-081fc0031dd1" containerName="glance-db-sync" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.969529 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="306d1f96-06ed-4428-987c-081fc0031dd1" containerName="glance-db-sync" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.969661 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="306d1f96-06ed-4428-987c-081fc0031dd1" containerName="glance-db-sync" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.970114 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.973198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 16:07:23 crc kubenswrapper[4707]: I0121 16:07:23.975227 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fxjzc"] Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.108867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a5a472-b8bf-458d-8dc8-be569df09d73-operator-scripts\") pod \"root-account-create-update-fxjzc\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.108925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwscv\" (UniqueName: \"kubernetes.io/projected/25a5a472-b8bf-458d-8dc8-be569df09d73-kube-api-access-nwscv\") pod \"root-account-create-update-fxjzc\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.210499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a5a472-b8bf-458d-8dc8-be569df09d73-operator-scripts\") pod \"root-account-create-update-fxjzc\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.210755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwscv\" (UniqueName: \"kubernetes.io/projected/25a5a472-b8bf-458d-8dc8-be569df09d73-kube-api-access-nwscv\") pod \"root-account-create-update-fxjzc\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.211199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a5a472-b8bf-458d-8dc8-be569df09d73-operator-scripts\") pod \"root-account-create-update-fxjzc\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.230917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwscv\" (UniqueName: \"kubernetes.io/projected/25a5a472-b8bf-458d-8dc8-be569df09d73-kube-api-access-nwscv\") pod \"root-account-create-update-fxjzc\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.281894 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:24 crc kubenswrapper[4707]: I0121 16:07:24.632830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fxjzc"] Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.255050 4707 generic.go:334] "Generic (PLEG): container finished" podID="25a5a472-b8bf-458d-8dc8-be569df09d73" containerID="08784ad7d03c8490ce19607b5c0a3e275886c4606da94d81de1a7da1fb4fa777" exitCode=0 Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.255112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" event={"ID":"25a5a472-b8bf-458d-8dc8-be569df09d73","Type":"ContainerDied","Data":"08784ad7d03c8490ce19607b5c0a3e275886c4606da94d81de1a7da1fb4fa777"} Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.255133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" event={"ID":"25a5a472-b8bf-458d-8dc8-be569df09d73","Type":"ContainerStarted","Data":"db0d91d989e1b69c9e78bf45f4aab266f896964f394da83e85af491f79a357a1"} Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.256062 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerID="99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86" exitCode=0 Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.256121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"5d633804-5d9a-4792-b183-51e2c1a2ebe2","Type":"ContainerDied","Data":"99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86"} Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.257902 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerID="f5188077018b4348f2419002120e3a5b4d29c9ab4682810c48d0553e5ac5fccf" exitCode=0 Jan 21 16:07:25 crc kubenswrapper[4707]: I0121 16:07:25.257927 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"0e42cb4d-8129-4286-a6c7-9f8e94524e91","Type":"ContainerDied","Data":"f5188077018b4348f2419002120e3a5b4d29c9ab4682810c48d0553e5ac5fccf"} Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.268661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"0e42cb4d-8129-4286-a6c7-9f8e94524e91","Type":"ContainerStarted","Data":"43c33ad38b01642b5b25214359a49931c587fc6383778e43bacab56ce73f1322"} Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.269072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.270453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"5d633804-5d9a-4792-b183-51e2c1a2ebe2","Type":"ContainerStarted","Data":"be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49"} Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.270682 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.289208 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.289193257 
podStartE2EDuration="36.289193257s" podCreationTimestamp="2026-01-21 16:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:26.287391069 +0000 UTC m=+3943.468907291" watchObservedRunningTime="2026-01-21 16:07:26.289193257 +0000 UTC m=+3943.470709480" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.308393 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.308380385 podStartE2EDuration="36.308380385s" podCreationTimestamp="2026-01-21 16:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:26.306412365 +0000 UTC m=+3943.487928587" watchObservedRunningTime="2026-01-21 16:07:26.308380385 +0000 UTC m=+3943.489896607" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.563334 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.644261 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwscv\" (UniqueName: \"kubernetes.io/projected/25a5a472-b8bf-458d-8dc8-be569df09d73-kube-api-access-nwscv\") pod \"25a5a472-b8bf-458d-8dc8-be569df09d73\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.644359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a5a472-b8bf-458d-8dc8-be569df09d73-operator-scripts\") pod \"25a5a472-b8bf-458d-8dc8-be569df09d73\" (UID: \"25a5a472-b8bf-458d-8dc8-be569df09d73\") " Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.644673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a5a472-b8bf-458d-8dc8-be569df09d73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25a5a472-b8bf-458d-8dc8-be569df09d73" (UID: "25a5a472-b8bf-458d-8dc8-be569df09d73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.648354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a5a472-b8bf-458d-8dc8-be569df09d73-kube-api-access-nwscv" (OuterVolumeSpecName: "kube-api-access-nwscv") pod "25a5a472-b8bf-458d-8dc8-be569df09d73" (UID: "25a5a472-b8bf-458d-8dc8-be569df09d73"). InnerVolumeSpecName "kube-api-access-nwscv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.746232 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwscv\" (UniqueName: \"kubernetes.io/projected/25a5a472-b8bf-458d-8dc8-be569df09d73-kube-api-access-nwscv\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:26 crc kubenswrapper[4707]: I0121 16:07:26.746260 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a5a472-b8bf-458d-8dc8-be569df09d73-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.278113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" event={"ID":"25a5a472-b8bf-458d-8dc8-be569df09d73","Type":"ContainerDied","Data":"db0d91d989e1b69c9e78bf45f4aab266f896964f394da83e85af491f79a357a1"} Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.278150 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db0d91d989e1b69c9e78bf45f4aab266f896964f394da83e85af491f79a357a1" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.278158 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fxjzc" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.798529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nnbvb"] Jan 21 16:07:27 crc kubenswrapper[4707]: E0121 16:07:27.798965 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a5a472-b8bf-458d-8dc8-be569df09d73" containerName="mariadb-account-create-update" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.798981 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a5a472-b8bf-458d-8dc8-be569df09d73" containerName="mariadb-account-create-update" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.799135 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a5a472-b8bf-458d-8dc8-be569df09d73" containerName="mariadb-account-create-update" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.800055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.810902 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnbvb"] Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.860574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-utilities\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.860611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-catalog-content\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.860639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7f98\" (UniqueName: \"kubernetes.io/projected/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-kube-api-access-b7f98\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.961828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-utilities\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.961863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-catalog-content\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.961892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7f98\" (UniqueName: \"kubernetes.io/projected/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-kube-api-access-b7f98\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.962310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-utilities\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.962372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-catalog-content\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:27 crc kubenswrapper[4707]: I0121 16:07:27.977004 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b7f98\" (UniqueName: \"kubernetes.io/projected/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-kube-api-access-b7f98\") pod \"redhat-marketplace-nnbvb\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:28 crc kubenswrapper[4707]: I0121 16:07:28.112588 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:28 crc kubenswrapper[4707]: I0121 16:07:28.516397 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnbvb"] Jan 21 16:07:28 crc kubenswrapper[4707]: W0121 16:07:28.519020 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ada8eb7_00d7_44fc_8b5d_ade50df89e86.slice/crio-84956ef5ff3b705414c06449f44dc7d8ed080f381366edfdb6f4e826839a7f4e WatchSource:0}: Error finding container 84956ef5ff3b705414c06449f44dc7d8ed080f381366edfdb6f4e826839a7f4e: Status 404 returned error can't find the container with id 84956ef5ff3b705414c06449f44dc7d8ed080f381366edfdb6f4e826839a7f4e Jan 21 16:07:28 crc kubenswrapper[4707]: I0121 16:07:28.652942 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:07:28 crc kubenswrapper[4707]: I0121 16:07:28.700646 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn"] Jan 21 16:07:28 crc kubenswrapper[4707]: I0121 16:07:28.700847 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerName="dnsmasq-dns" containerID="cri-o://96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049" gracePeriod=10 Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.090927 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.181864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-dnsmasq-svc\") pod \"044a3d9d-de3b-4a2b-8007-9d21847149a5\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.181960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-config\") pod \"044a3d9d-de3b-4a2b-8007-9d21847149a5\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.181986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4hb8\" (UniqueName: \"kubernetes.io/projected/044a3d9d-de3b-4a2b-8007-9d21847149a5-kube-api-access-h4hb8\") pod \"044a3d9d-de3b-4a2b-8007-9d21847149a5\" (UID: \"044a3d9d-de3b-4a2b-8007-9d21847149a5\") " Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.187940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044a3d9d-de3b-4a2b-8007-9d21847149a5-kube-api-access-h4hb8" (OuterVolumeSpecName: "kube-api-access-h4hb8") pod "044a3d9d-de3b-4a2b-8007-9d21847149a5" (UID: "044a3d9d-de3b-4a2b-8007-9d21847149a5"). InnerVolumeSpecName "kube-api-access-h4hb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.212559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-config" (OuterVolumeSpecName: "config") pod "044a3d9d-de3b-4a2b-8007-9d21847149a5" (UID: "044a3d9d-de3b-4a2b-8007-9d21847149a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.213041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "044a3d9d-de3b-4a2b-8007-9d21847149a5" (UID: "044a3d9d-de3b-4a2b-8007-9d21847149a5"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.283641 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.283766 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4hb8\" (UniqueName: \"kubernetes.io/projected/044a3d9d-de3b-4a2b-8007-9d21847149a5-kube-api-access-h4hb8\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.283914 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/044a3d9d-de3b-4a2b-8007-9d21847149a5-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.291687 4707 generic.go:334] "Generic (PLEG): container finished" podID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerID="96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049" exitCode=0 Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.291749 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.291757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" event={"ID":"044a3d9d-de3b-4a2b-8007-9d21847149a5","Type":"ContainerDied","Data":"96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049"} Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.291782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn" event={"ID":"044a3d9d-de3b-4a2b-8007-9d21847149a5","Type":"ContainerDied","Data":"a922d9aa2997f2237686993372c3cb8b882120ecd6b866a26b11662d5b6882cd"} Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.291797 4707 scope.go:117] "RemoveContainer" containerID="96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.293372 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerID="b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425" exitCode=0 Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.293405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnbvb" event={"ID":"2ada8eb7-00d7-44fc-8b5d-ade50df89e86","Type":"ContainerDied","Data":"b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425"} Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.293427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnbvb" event={"ID":"2ada8eb7-00d7-44fc-8b5d-ade50df89e86","Type":"ContainerStarted","Data":"84956ef5ff3b705414c06449f44dc7d8ed080f381366edfdb6f4e826839a7f4e"} Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.307831 4707 scope.go:117] "RemoveContainer" containerID="e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.319981 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn"] Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.324256 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l2ftn"] Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.339075 4707 scope.go:117] "RemoveContainer" containerID="96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049" Jan 21 16:07:29 crc kubenswrapper[4707]: E0121 16:07:29.339384 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049\": container with ID starting with 96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049 not found: ID does not exist" containerID="96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.339413 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049"} err="failed to get container status \"96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049\": rpc error: code = NotFound desc = could not find container \"96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049\": container with ID starting with 96c53aa37ad5f3c632f9e7a5d66812b532124946a6ef77d784a8ba5f817ec049 not found: ID does not exist" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.339429 4707 scope.go:117] "RemoveContainer" containerID="e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4" Jan 21 16:07:29 crc kubenswrapper[4707]: E0121 16:07:29.339718 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4\": container with ID starting with e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4 not found: ID does not exist" containerID="e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4" Jan 21 16:07:29 crc kubenswrapper[4707]: I0121 16:07:29.339734 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4"} err="failed to get container status \"e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4\": rpc error: code = NotFound desc = could not find container \"e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4\": container with ID starting with e7309290639f592422982cc9ad4fa86f71bfb4aea2162bd639699993437158d4 not found: ID does not exist" Jan 21 16:07:30 crc kubenswrapper[4707]: I0121 16:07:30.302741 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerID="314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e" exitCode=0 Jan 21 16:07:30 crc kubenswrapper[4707]: I0121 16:07:30.302830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnbvb" event={"ID":"2ada8eb7-00d7-44fc-8b5d-ade50df89e86","Type":"ContainerDied","Data":"314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e"} Jan 21 16:07:31 crc kubenswrapper[4707]: I0121 16:07:31.189240 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" path="/var/lib/kubelet/pods/044a3d9d-de3b-4a2b-8007-9d21847149a5/volumes" Jan 21 16:07:31 crc kubenswrapper[4707]: I0121 16:07:31.310515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnbvb" 
event={"ID":"2ada8eb7-00d7-44fc-8b5d-ade50df89e86","Type":"ContainerStarted","Data":"540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39"} Jan 21 16:07:31 crc kubenswrapper[4707]: I0121 16:07:31.326192 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nnbvb" podStartSLOduration=2.641508938 podStartE2EDuration="4.326178998s" podCreationTimestamp="2026-01-21 16:07:27 +0000 UTC" firstStartedPulling="2026-01-21 16:07:29.294993315 +0000 UTC m=+3946.476509537" lastFinishedPulling="2026-01-21 16:07:30.979663376 +0000 UTC m=+3948.161179597" observedRunningTime="2026-01-21 16:07:31.320684988 +0000 UTC m=+3948.502201211" watchObservedRunningTime="2026-01-21 16:07:31.326178998 +0000 UTC m=+3948.507695220" Jan 21 16:07:34 crc kubenswrapper[4707]: I0121 16:07:34.182168 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:07:34 crc kubenswrapper[4707]: E0121 16:07:34.182726 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:07:38 crc kubenswrapper[4707]: I0121 16:07:38.113769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:38 crc kubenswrapper[4707]: I0121 16:07:38.114126 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:38 crc kubenswrapper[4707]: I0121 16:07:38.142150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:38 crc kubenswrapper[4707]: I0121 16:07:38.373313 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:38 crc kubenswrapper[4707]: I0121 16:07:38.404966 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnbvb"] Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.357175 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nnbvb" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="registry-server" containerID="cri-o://540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39" gracePeriod=2 Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.728068 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.838556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7f98\" (UniqueName: \"kubernetes.io/projected/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-kube-api-access-b7f98\") pod \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.838619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-catalog-content\") pod \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.838675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-utilities\") pod \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\" (UID: \"2ada8eb7-00d7-44fc-8b5d-ade50df89e86\") " Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.839466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-utilities" (OuterVolumeSpecName: "utilities") pod "2ada8eb7-00d7-44fc-8b5d-ade50df89e86" (UID: "2ada8eb7-00d7-44fc-8b5d-ade50df89e86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.843436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-kube-api-access-b7f98" (OuterVolumeSpecName: "kube-api-access-b7f98") pod "2ada8eb7-00d7-44fc-8b5d-ade50df89e86" (UID: "2ada8eb7-00d7-44fc-8b5d-ade50df89e86"). InnerVolumeSpecName "kube-api-access-b7f98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.860100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ada8eb7-00d7-44fc-8b5d-ade50df89e86" (UID: "2ada8eb7-00d7-44fc-8b5d-ade50df89e86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.940646 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7f98\" (UniqueName: \"kubernetes.io/projected/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-kube-api-access-b7f98\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.940863 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:40 crc kubenswrapper[4707]: I0121 16:07:40.940875 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ada8eb7-00d7-44fc-8b5d-ade50df89e86-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.366655 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerID="540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39" exitCode=0 Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.366698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnbvb" event={"ID":"2ada8eb7-00d7-44fc-8b5d-ade50df89e86","Type":"ContainerDied","Data":"540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39"} Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.366730 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnbvb" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.366747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnbvb" event={"ID":"2ada8eb7-00d7-44fc-8b5d-ade50df89e86","Type":"ContainerDied","Data":"84956ef5ff3b705414c06449f44dc7d8ed080f381366edfdb6f4e826839a7f4e"} Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.366766 4707 scope.go:117] "RemoveContainer" containerID="540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.381649 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnbvb"] Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.386191 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnbvb"] Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.386841 4707 scope.go:117] "RemoveContainer" containerID="314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.400935 4707 scope.go:117] "RemoveContainer" containerID="b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.425292 4707 scope.go:117] "RemoveContainer" containerID="540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39" Jan 21 16:07:41 crc kubenswrapper[4707]: E0121 16:07:41.425657 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39\": container with ID starting with 540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39 not found: ID does not exist" containerID="540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.425703 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39"} err="failed to get container status \"540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39\": rpc error: code = NotFound desc = could not find container \"540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39\": container with ID starting with 540053e4546051a87fc5101883f1c39cd7911f1d99df05f61fbfca187ed94c39 not found: ID does not exist" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.425722 4707 scope.go:117] "RemoveContainer" containerID="314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e" Jan 21 16:07:41 crc kubenswrapper[4707]: E0121 16:07:41.426045 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e\": container with ID starting with 314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e not found: ID does not exist" containerID="314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.426068 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e"} err="failed to get container status \"314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e\": rpc error: code = NotFound desc = could not find container \"314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e\": container with ID starting with 314ac28041516cb79ec59d6c03bb5d596c8909f5df362f385e3be97dc6322f2e not found: ID does not exist" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.426086 4707 scope.go:117] "RemoveContainer" containerID="b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425" Jan 21 16:07:41 crc kubenswrapper[4707]: E0121 16:07:41.426317 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425\": container with ID starting with b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425 not found: ID does not exist" containerID="b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.426336 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425"} err="failed to get container status \"b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425\": rpc error: code = NotFound desc = could not find container \"b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425\": container with ID starting with b2a98ab7ecfda88005c51d0a30a96d18bf36de53a26d2e8b74b7ebb033fbb425 not found: ID does not exist" Jan 21 16:07:41 crc kubenswrapper[4707]: I0121 16:07:41.813563 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036108 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-zwv4m"] Jan 21 16:07:42 crc kubenswrapper[4707]: E0121 16:07:42.036410 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerName="dnsmasq-dns" Jan 
21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036427 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerName="dnsmasq-dns" Jan 21 16:07:42 crc kubenswrapper[4707]: E0121 16:07:42.036447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerName="init" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036453 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerName="init" Jan 21 16:07:42 crc kubenswrapper[4707]: E0121 16:07:42.036467 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="extract-utilities" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036472 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="extract-utilities" Jan 21 16:07:42 crc kubenswrapper[4707]: E0121 16:07:42.036482 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="registry-server" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="registry-server" Jan 21 16:07:42 crc kubenswrapper[4707]: E0121 16:07:42.036502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="extract-content" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="extract-content" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036668 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="044a3d9d-de3b-4a2b-8007-9d21847149a5" containerName="dnsmasq-dns" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.036690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" containerName="registry-server" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.037146 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.043173 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-zwv4m"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.078950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.143326 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-mzhtx"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.144350 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.147765 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.148741 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.149830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.153708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-mzhtx"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.157408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw46g\" (UniqueName: \"kubernetes.io/projected/0bc19662-4ee3-4885-9103-313f65c41fd9-kube-api-access-sw46g\") pod \"cinder-db-create-zwv4m\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.157502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bc19662-4ee3-4885-9103-313f65c41fd9-operator-scripts\") pod \"cinder-db-create-zwv4m\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.161668 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.239012 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.240066 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.241497 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.247160 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.259910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dec82238-89a9-422f-bed8-d3a1992d60e9-operator-scripts\") pod \"barbican-db-create-mzhtx\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.259986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbds\" (UniqueName: \"kubernetes.io/projected/dec82238-89a9-422f-bed8-d3a1992d60e9-kube-api-access-bkbds\") pod \"barbican-db-create-mzhtx\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.260018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-operator-scripts\") pod \"cinder-b36a-account-create-update-nhmxp\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 
16:07:42.260056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2n8\" (UniqueName: \"kubernetes.io/projected/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-kube-api-access-pz2n8\") pod \"cinder-b36a-account-create-update-nhmxp\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.260152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw46g\" (UniqueName: \"kubernetes.io/projected/0bc19662-4ee3-4885-9103-313f65c41fd9-kube-api-access-sw46g\") pod \"cinder-db-create-zwv4m\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.260276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bc19662-4ee3-4885-9103-313f65c41fd9-operator-scripts\") pod \"cinder-db-create-zwv4m\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.260970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bc19662-4ee3-4885-9103-313f65c41fd9-operator-scripts\") pod \"cinder-db-create-zwv4m\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.276155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw46g\" (UniqueName: \"kubernetes.io/projected/0bc19662-4ee3-4885-9103-313f65c41fd9-kube-api-access-sw46g\") pod \"cinder-db-create-zwv4m\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.340789 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-67cvx"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.341834 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.343459 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.345644 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g4v5h"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.346488 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.349614 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.361878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-operator-scripts\") pod \"neutron-13f5-account-create-update-2t9gw\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.361947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dec82238-89a9-422f-bed8-d3a1992d60e9-operator-scripts\") pod \"barbican-db-create-mzhtx\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.361987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbds\" (UniqueName: \"kubernetes.io/projected/dec82238-89a9-422f-bed8-d3a1992d60e9-kube-api-access-bkbds\") pod \"barbican-db-create-mzhtx\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.362013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-operator-scripts\") pod \"cinder-b36a-account-create-update-nhmxp\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.362037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2n8\" (UniqueName: \"kubernetes.io/projected/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-kube-api-access-pz2n8\") pod \"cinder-b36a-account-create-update-nhmxp\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.362062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tt69\" (UniqueName: \"kubernetes.io/projected/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-kube-api-access-9tt69\") pod \"neutron-13f5-account-create-update-2t9gw\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.362619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dec82238-89a9-422f-bed8-d3a1992d60e9-operator-scripts\") pod \"barbican-db-create-mzhtx\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.362909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-operator-scripts\") pod \"cinder-b36a-account-create-update-nhmxp\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.395402 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g4v5h"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.400624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbds\" (UniqueName: \"kubernetes.io/projected/dec82238-89a9-422f-bed8-d3a1992d60e9-kube-api-access-bkbds\") pod \"barbican-db-create-mzhtx\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.404043 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-67cvx"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.407534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2n8\" (UniqueName: \"kubernetes.io/projected/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-kube-api-access-pz2n8\") pod \"cinder-b36a-account-create-update-nhmxp\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.460878 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.463258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46101c43-94be-44b4-9638-7f5a94535611-operator-scripts\") pod \"neutron-db-create-g4v5h\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.463311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-operator-scripts\") pod \"barbican-db52-account-create-update-67cvx\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.463351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tt69\" (UniqueName: \"kubernetes.io/projected/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-kube-api-access-9tt69\") pod \"neutron-13f5-account-create-update-2t9gw\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.463372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h75hp\" (UniqueName: \"kubernetes.io/projected/46101c43-94be-44b4-9638-7f5a94535611-kube-api-access-h75hp\") pod \"neutron-db-create-g4v5h\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.463474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7llj\" (UniqueName: \"kubernetes.io/projected/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-kube-api-access-x7llj\") pod \"barbican-db52-account-create-update-67cvx\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.463507 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-operator-scripts\") pod \"neutron-13f5-account-create-update-2t9gw\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.464103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-operator-scripts\") pod \"neutron-13f5-account-create-update-2t9gw\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.465910 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.484304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tt69\" (UniqueName: \"kubernetes.io/projected/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-kube-api-access-9tt69\") pod \"neutron-13f5-account-create-update-2t9gw\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.500917 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pwngz"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.501944 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.503704 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.504197 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rbnr" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.504278 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.504333 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.518852 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pwngz"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.552532 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.565161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7llj\" (UniqueName: \"kubernetes.io/projected/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-kube-api-access-x7llj\") pod \"barbican-db52-account-create-update-67cvx\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.565277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46101c43-94be-44b4-9638-7f5a94535611-operator-scripts\") pod \"neutron-db-create-g4v5h\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.565309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-operator-scripts\") pod \"barbican-db52-account-create-update-67cvx\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.565336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h75hp\" (UniqueName: \"kubernetes.io/projected/46101c43-94be-44b4-9638-7f5a94535611-kube-api-access-h75hp\") pod \"neutron-db-create-g4v5h\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.566461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46101c43-94be-44b4-9638-7f5a94535611-operator-scripts\") pod \"neutron-db-create-g4v5h\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.566711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-operator-scripts\") pod \"barbican-db52-account-create-update-67cvx\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.579940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7llj\" (UniqueName: \"kubernetes.io/projected/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-kube-api-access-x7llj\") pod \"barbican-db52-account-create-update-67cvx\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.580276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h75hp\" (UniqueName: \"kubernetes.io/projected/46101c43-94be-44b4-9638-7f5a94535611-kube-api-access-h75hp\") pod \"neutron-db-create-g4v5h\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.666553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-config-data\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.666654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-combined-ca-bundle\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.666757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhsqb\" (UniqueName: \"kubernetes.io/projected/bd39e035-830d-4cc3-ab9f-2127564ecddf-kube-api-access-lhsqb\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.668008 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.680406 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.769236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-config-data\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.769353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-combined-ca-bundle\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.769389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhsqb\" (UniqueName: \"kubernetes.io/projected/bd39e035-830d-4cc3-ab9f-2127564ecddf-kube-api-access-lhsqb\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.774791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-combined-ca-bundle\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.774919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-config-data\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.785451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-lhsqb\" (UniqueName: \"kubernetes.io/projected/bd39e035-830d-4cc3-ab9f-2127564ecddf-kube-api-access-lhsqb\") pod \"keystone-db-sync-pwngz\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.804445 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.836398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-zwv4m"] Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.877738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:42 crc kubenswrapper[4707]: I0121 16:07:42.923036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-mzhtx"] Jan 21 16:07:42 crc kubenswrapper[4707]: W0121 16:07:42.938171 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec82238_89a9_422f_bed8_d3a1992d60e9.slice/crio-5b7a751878deba08907859f4e45a7d71126e3bf7b6f1a0596a58333eec36ec5d WatchSource:0}: Error finding container 5b7a751878deba08907859f4e45a7d71126e3bf7b6f1a0596a58333eec36ec5d: Status 404 returned error can't find the container with id 5b7a751878deba08907859f4e45a7d71126e3bf7b6f1a0596a58333eec36ec5d Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.139869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw"] Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.200355 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ada8eb7-00d7-44fc-8b5d-ade50df89e86" path="/var/lib/kubelet/pods/2ada8eb7-00d7-44fc-8b5d-ade50df89e86/volumes" Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.211783 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g4v5h"] Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.270848 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-67cvx"] Jan 21 16:07:43 crc kubenswrapper[4707]: W0121 16:07:43.284247 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e6ceb90_d7f0_450a_80f2_a22c4f0dfdff.slice/crio-6676b8c3f46759461f81cacc38fea53bc0db5b8d37ee4a0461ef81fca0f908d2 WatchSource:0}: Error finding container 6676b8c3f46759461f81cacc38fea53bc0db5b8d37ee4a0461ef81fca0f908d2: Status 404 returned error can't find the container with id 6676b8c3f46759461f81cacc38fea53bc0db5b8d37ee4a0461ef81fca0f908d2 Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.288359 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.371026 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pwngz"] Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.405874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" event={"ID":"bd39e035-830d-4cc3-ab9f-2127564ecddf","Type":"ContainerStarted","Data":"ee6c23a56985bf844fc07467c9bd848439fc17f80330b0fb93e06cc1f00c5d44"} Jan 21 16:07:43 crc 
kubenswrapper[4707]: I0121 16:07:43.408163 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" containerID="df63a0f96d2b4c2cd36be6f9bcc5c322311735a28d277daebc97d5ec5b9bfe6c" exitCode=0 Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.408234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" event={"ID":"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7","Type":"ContainerDied","Data":"df63a0f96d2b4c2cd36be6f9bcc5c322311735a28d277daebc97d5ec5b9bfe6c"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.408258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" event={"ID":"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7","Type":"ContainerStarted","Data":"df4b37601a6b998c1fa1e40fff957db86bf5d15cc9d82af1b57b6ecf5e149bd6"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.410303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" event={"ID":"f4e3a9c8-3c3a-4782-b8ef-75125cda3933","Type":"ContainerStarted","Data":"6e5c1a5371a24097ab7530f393c17000f9250266e47dc1732fbe3880f11b7c8f"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.410344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" event={"ID":"f4e3a9c8-3c3a-4782-b8ef-75125cda3933","Type":"ContainerStarted","Data":"de9bb458d8adb50ec4db6fb00306fd520cbb63439ecba239ff0144ad3963c352"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.412014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" event={"ID":"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff","Type":"ContainerStarted","Data":"868cc8f8724909904912ff258b32f90becf5b16e946b2fdedb12537284739009"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.412041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" event={"ID":"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff","Type":"ContainerStarted","Data":"6676b8c3f46759461f81cacc38fea53bc0db5b8d37ee4a0461ef81fca0f908d2"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.421456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" event={"ID":"dec82238-89a9-422f-bed8-d3a1992d60e9","Type":"ContainerStarted","Data":"e6c5b3ded6d120827ac01f6ac458fb4657209930d6d59a84ed3c4e36e0aabc76"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.421485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" event={"ID":"dec82238-89a9-422f-bed8-d3a1992d60e9","Type":"ContainerStarted","Data":"5b7a751878deba08907859f4e45a7d71126e3bf7b6f1a0596a58333eec36ec5d"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.425213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" event={"ID":"0bc19662-4ee3-4885-9103-313f65c41fd9","Type":"ContainerStarted","Data":"f3ef8e10752a7b3f655f3ba2c159cbe3e42d3a2fec7a9a6ec78eb3474905e38b"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.425251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" event={"ID":"0bc19662-4ee3-4885-9103-313f65c41fd9","Type":"ContainerStarted","Data":"c120697779ac028d7111997efeaf9a1d656216a0f193b91d0d2d41b25613a873"} 
Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.426761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" event={"ID":"46101c43-94be-44b4-9638-7f5a94535611","Type":"ContainerStarted","Data":"fa05439844a1c6e8d944655fe665d7c8f92a474ed2877ea8b4617cf47356577b"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.426787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" event={"ID":"46101c43-94be-44b4-9638-7f5a94535611","Type":"ContainerStarted","Data":"307b6c464ffb840efe09888b31642355e6b934f2bf7191957cea06d078c198f5"} Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.449182 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" podStartSLOduration=1.449167733 podStartE2EDuration="1.449167733s" podCreationTimestamp="2026-01-21 16:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:43.443479237 +0000 UTC m=+3960.624995459" watchObservedRunningTime="2026-01-21 16:07:43.449167733 +0000 UTC m=+3960.630683955" Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.461949 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" podStartSLOduration=1.4619370329999999 podStartE2EDuration="1.461937033s" podCreationTimestamp="2026-01-21 16:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:43.46101009 +0000 UTC m=+3960.642526312" watchObservedRunningTime="2026-01-21 16:07:43.461937033 +0000 UTC m=+3960.643453255" Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.480146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" podStartSLOduration=1.480131856 podStartE2EDuration="1.480131856s" podCreationTimestamp="2026-01-21 16:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:43.476982153 +0000 UTC m=+3960.658498376" watchObservedRunningTime="2026-01-21 16:07:43.480131856 +0000 UTC m=+3960.661648088" Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.514596 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" podStartSLOduration=1.514579068 podStartE2EDuration="1.514579068s" podCreationTimestamp="2026-01-21 16:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:43.500754142 +0000 UTC m=+3960.682270364" watchObservedRunningTime="2026-01-21 16:07:43.514579068 +0000 UTC m=+3960.696095290" Jan 21 16:07:43 crc kubenswrapper[4707]: I0121 16:07:43.516130 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" podStartSLOduration=1.5161215879999999 podStartE2EDuration="1.516121588s" podCreationTimestamp="2026-01-21 16:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:43.511396914 +0000 UTC m=+3960.692913136" watchObservedRunningTime="2026-01-21 16:07:43.516121588 +0000 
UTC m=+3960.697637811" Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.434098 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" containerID="868cc8f8724909904912ff258b32f90becf5b16e946b2fdedb12537284739009" exitCode=0 Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.434339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" event={"ID":"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff","Type":"ContainerDied","Data":"868cc8f8724909904912ff258b32f90becf5b16e946b2fdedb12537284739009"} Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.435394 4707 generic.go:334] "Generic (PLEG): container finished" podID="dec82238-89a9-422f-bed8-d3a1992d60e9" containerID="e6c5b3ded6d120827ac01f6ac458fb4657209930d6d59a84ed3c4e36e0aabc76" exitCode=0 Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.435436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" event={"ID":"dec82238-89a9-422f-bed8-d3a1992d60e9","Type":"ContainerDied","Data":"e6c5b3ded6d120827ac01f6ac458fb4657209930d6d59a84ed3c4e36e0aabc76"} Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.436443 4707 generic.go:334] "Generic (PLEG): container finished" podID="0bc19662-4ee3-4885-9103-313f65c41fd9" containerID="f3ef8e10752a7b3f655f3ba2c159cbe3e42d3a2fec7a9a6ec78eb3474905e38b" exitCode=0 Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.436478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" event={"ID":"0bc19662-4ee3-4885-9103-313f65c41fd9","Type":"ContainerDied","Data":"f3ef8e10752a7b3f655f3ba2c159cbe3e42d3a2fec7a9a6ec78eb3474905e38b"} Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.437425 4707 generic.go:334] "Generic (PLEG): container finished" podID="46101c43-94be-44b4-9638-7f5a94535611" containerID="fa05439844a1c6e8d944655fe665d7c8f92a474ed2877ea8b4617cf47356577b" exitCode=0 Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.437460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" event={"ID":"46101c43-94be-44b4-9638-7f5a94535611","Type":"ContainerDied","Data":"fa05439844a1c6e8d944655fe665d7c8f92a474ed2877ea8b4617cf47356577b"} Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.439951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" event={"ID":"bd39e035-830d-4cc3-ab9f-2127564ecddf","Type":"ContainerStarted","Data":"b629d96d3befccaa520a351b2d69f22752fe28a8a8419fa92d63093c7d989cb2"} Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.441310 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4e3a9c8-3c3a-4782-b8ef-75125cda3933" containerID="6e5c1a5371a24097ab7530f393c17000f9250266e47dc1732fbe3880f11b7c8f" exitCode=0 Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.441398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" event={"ID":"f4e3a9c8-3c3a-4782-b8ef-75125cda3933","Type":"ContainerDied","Data":"6e5c1a5371a24097ab7530f393c17000f9250266e47dc1732fbe3880f11b7c8f"} Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.502074 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" podStartSLOduration=2.502059674 podStartE2EDuration="2.502059674s" podCreationTimestamp="2026-01-21 16:07:42 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:44.497082335 +0000 UTC m=+3961.678598557" watchObservedRunningTime="2026-01-21 16:07:44.502059674 +0000 UTC m=+3961.683575896" Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.722453 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.806128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2n8\" (UniqueName: \"kubernetes.io/projected/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-kube-api-access-pz2n8\") pod \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.807652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-operator-scripts\") pod \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\" (UID: \"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7\") " Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.808731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" (UID: "fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.809476 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.816855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-kube-api-access-pz2n8" (OuterVolumeSpecName: "kube-api-access-pz2n8") pod "fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" (UID: "fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7"). InnerVolumeSpecName "kube-api-access-pz2n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:44 crc kubenswrapper[4707]: I0121 16:07:44.911527 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2n8\" (UniqueName: \"kubernetes.io/projected/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7-kube-api-access-pz2n8\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.449591 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.449586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp" event={"ID":"fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7","Type":"ContainerDied","Data":"df4b37601a6b998c1fa1e40fff957db86bf5d15cc9d82af1b57b6ecf5e149bd6"} Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.450232 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4b37601a6b998c1fa1e40fff957db86bf5d15cc9d82af1b57b6ecf5e149bd6" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.451770 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd39e035-830d-4cc3-ab9f-2127564ecddf" containerID="b629d96d3befccaa520a351b2d69f22752fe28a8a8419fa92d63093c7d989cb2" exitCode=0 Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.451824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" event={"ID":"bd39e035-830d-4cc3-ab9f-2127564ecddf","Type":"ContainerDied","Data":"b629d96d3befccaa520a351b2d69f22752fe28a8a8419fa92d63093c7d989cb2"} Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.725875 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.825524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw46g\" (UniqueName: \"kubernetes.io/projected/0bc19662-4ee3-4885-9103-313f65c41fd9-kube-api-access-sw46g\") pod \"0bc19662-4ee3-4885-9103-313f65c41fd9\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.825556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bc19662-4ee3-4885-9103-313f65c41fd9-operator-scripts\") pod \"0bc19662-4ee3-4885-9103-313f65c41fd9\" (UID: \"0bc19662-4ee3-4885-9103-313f65c41fd9\") " Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.826914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc19662-4ee3-4885-9103-313f65c41fd9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bc19662-4ee3-4885-9103-313f65c41fd9" (UID: "0bc19662-4ee3-4885-9103-313f65c41fd9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.829898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc19662-4ee3-4885-9103-313f65c41fd9-kube-api-access-sw46g" (OuterVolumeSpecName: "kube-api-access-sw46g") pod "0bc19662-4ee3-4885-9103-313f65c41fd9" (UID: "0bc19662-4ee3-4885-9103-313f65c41fd9"). InnerVolumeSpecName "kube-api-access-sw46g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.881243 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.891437 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.897965 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.901339 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.929620 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bc19662-4ee3-4885-9103-313f65c41fd9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:45 crc kubenswrapper[4707]: I0121 16:07:45.929651 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw46g\" (UniqueName: \"kubernetes.io/projected/0bc19662-4ee3-4885-9103-313f65c41fd9-kube-api-access-sw46g\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-operator-scripts\") pod \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\" (UID: \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-operator-scripts\") pod \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030167 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dec82238-89a9-422f-bed8-d3a1992d60e9-operator-scripts\") pod \"dec82238-89a9-422f-bed8-d3a1992d60e9\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030200 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkbds\" (UniqueName: \"kubernetes.io/projected/dec82238-89a9-422f-bed8-d3a1992d60e9-kube-api-access-bkbds\") pod \"dec82238-89a9-422f-bed8-d3a1992d60e9\" (UID: \"dec82238-89a9-422f-bed8-d3a1992d60e9\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h75hp\" (UniqueName: \"kubernetes.io/projected/46101c43-94be-44b4-9638-7f5a94535611-kube-api-access-h75hp\") pod \"46101c43-94be-44b4-9638-7f5a94535611\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46101c43-94be-44b4-9638-7f5a94535611-operator-scripts\") pod \"46101c43-94be-44b4-9638-7f5a94535611\" (UID: \"46101c43-94be-44b4-9638-7f5a94535611\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7llj\" (UniqueName: \"kubernetes.io/projected/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-kube-api-access-x7llj\") pod \"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\" (UID: 
\"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tt69\" (UniqueName: \"kubernetes.io/projected/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-kube-api-access-9tt69\") pod \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\" (UID: \"f4e3a9c8-3c3a-4782-b8ef-75125cda3933\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4e3a9c8-3c3a-4782-b8ef-75125cda3933" (UID: "f4e3a9c8-3c3a-4782-b8ef-75125cda3933"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.030565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dec82238-89a9-422f-bed8-d3a1992d60e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dec82238-89a9-422f-bed8-d3a1992d60e9" (UID: "dec82238-89a9-422f-bed8-d3a1992d60e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.031057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46101c43-94be-44b4-9638-7f5a94535611-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46101c43-94be-44b4-9638-7f5a94535611" (UID: "46101c43-94be-44b4-9638-7f5a94535611"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.031176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" (UID: "2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.033090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-kube-api-access-9tt69" (OuterVolumeSpecName: "kube-api-access-9tt69") pod "f4e3a9c8-3c3a-4782-b8ef-75125cda3933" (UID: "f4e3a9c8-3c3a-4782-b8ef-75125cda3933"). InnerVolumeSpecName "kube-api-access-9tt69". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.033424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46101c43-94be-44b4-9638-7f5a94535611-kube-api-access-h75hp" (OuterVolumeSpecName: "kube-api-access-h75hp") pod "46101c43-94be-44b4-9638-7f5a94535611" (UID: "46101c43-94be-44b4-9638-7f5a94535611"). InnerVolumeSpecName "kube-api-access-h75hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.033646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-kube-api-access-x7llj" (OuterVolumeSpecName: "kube-api-access-x7llj") pod "2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" (UID: "2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff"). InnerVolumeSpecName "kube-api-access-x7llj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.033797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec82238-89a9-422f-bed8-d3a1992d60e9-kube-api-access-bkbds" (OuterVolumeSpecName: "kube-api-access-bkbds") pod "dec82238-89a9-422f-bed8-d3a1992d60e9" (UID: "dec82238-89a9-422f-bed8-d3a1992d60e9"). InnerVolumeSpecName "kube-api-access-bkbds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131519 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131545 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131554 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dec82238-89a9-422f-bed8-d3a1992d60e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131565 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkbds\" (UniqueName: \"kubernetes.io/projected/dec82238-89a9-422f-bed8-d3a1992d60e9-kube-api-access-bkbds\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131574 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h75hp\" (UniqueName: \"kubernetes.io/projected/46101c43-94be-44b4-9638-7f5a94535611-kube-api-access-h75hp\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131582 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46101c43-94be-44b4-9638-7f5a94535611-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131591 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7llj\" (UniqueName: \"kubernetes.io/projected/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff-kube-api-access-x7llj\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.131600 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tt69\" (UniqueName: \"kubernetes.io/projected/f4e3a9c8-3c3a-4782-b8ef-75125cda3933-kube-api-access-9tt69\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.459331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" event={"ID":"f4e3a9c8-3c3a-4782-b8ef-75125cda3933","Type":"ContainerDied","Data":"de9bb458d8adb50ec4db6fb00306fd520cbb63439ecba239ff0144ad3963c352"} Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.459541 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9bb458d8adb50ec4db6fb00306fd520cbb63439ecba239ff0144ad3963c352" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.459355 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.461005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" event={"ID":"2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff","Type":"ContainerDied","Data":"6676b8c3f46759461f81cacc38fea53bc0db5b8d37ee4a0461ef81fca0f908d2"} Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.461026 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6676b8c3f46759461f81cacc38fea53bc0db5b8d37ee4a0461ef81fca0f908d2" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.461043 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-67cvx" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.462938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" event={"ID":"dec82238-89a9-422f-bed8-d3a1992d60e9","Type":"ContainerDied","Data":"5b7a751878deba08907859f4e45a7d71126e3bf7b6f1a0596a58333eec36ec5d"} Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.463056 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7a751878deba08907859f4e45a7d71126e3bf7b6f1a0596a58333eec36ec5d" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.463148 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-mzhtx" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.463936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" event={"ID":"0bc19662-4ee3-4885-9103-313f65c41fd9","Type":"ContainerDied","Data":"c120697779ac028d7111997efeaf9a1d656216a0f193b91d0d2d41b25613a873"} Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.463958 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c120697779ac028d7111997efeaf9a1d656216a0f193b91d0d2d41b25613a873" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.464078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-zwv4m" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.465102 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.465131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-g4v5h" event={"ID":"46101c43-94be-44b4-9638-7f5a94535611","Type":"ContainerDied","Data":"307b6c464ffb840efe09888b31642355e6b934f2bf7191957cea06d078c198f5"} Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.465155 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="307b6c464ffb840efe09888b31642355e6b934f2bf7191957cea06d078c198f5" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.665381 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.740261 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-combined-ca-bundle\") pod \"bd39e035-830d-4cc3-ab9f-2127564ecddf\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.740324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-config-data\") pod \"bd39e035-830d-4cc3-ab9f-2127564ecddf\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.740406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhsqb\" (UniqueName: \"kubernetes.io/projected/bd39e035-830d-4cc3-ab9f-2127564ecddf-kube-api-access-lhsqb\") pod \"bd39e035-830d-4cc3-ab9f-2127564ecddf\" (UID: \"bd39e035-830d-4cc3-ab9f-2127564ecddf\") " Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.746938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd39e035-830d-4cc3-ab9f-2127564ecddf-kube-api-access-lhsqb" (OuterVolumeSpecName: "kube-api-access-lhsqb") pod "bd39e035-830d-4cc3-ab9f-2127564ecddf" (UID: "bd39e035-830d-4cc3-ab9f-2127564ecddf"). InnerVolumeSpecName "kube-api-access-lhsqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.759840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd39e035-830d-4cc3-ab9f-2127564ecddf" (UID: "bd39e035-830d-4cc3-ab9f-2127564ecddf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.768311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-config-data" (OuterVolumeSpecName: "config-data") pod "bd39e035-830d-4cc3-ab9f-2127564ecddf" (UID: "bd39e035-830d-4cc3-ab9f-2127564ecddf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.841778 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.841806 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39e035-830d-4cc3-ab9f-2127564ecddf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:46 crc kubenswrapper[4707]: I0121 16:07:46.841830 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhsqb\" (UniqueName: \"kubernetes.io/projected/bd39e035-830d-4cc3-ab9f-2127564ecddf-kube-api-access-lhsqb\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.182956 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.183247 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.472585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" event={"ID":"bd39e035-830d-4cc3-ab9f-2127564ecddf","Type":"ContainerDied","Data":"ee6c23a56985bf844fc07467c9bd848439fc17f80330b0fb93e06cc1f00c5d44"} Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.472832 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee6c23a56985bf844fc07467c9bd848439fc17f80330b0fb93e06cc1f00c5d44" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.472631 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pwngz" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838219 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q25mp"] Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838525 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838543 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd39e035-830d-4cc3-ab9f-2127564ecddf" containerName="keystone-db-sync" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd39e035-830d-4cc3-ab9f-2127564ecddf" containerName="keystone-db-sync" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838574 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e3a9c8-3c3a-4782-b8ef-75125cda3933" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838581 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e3a9c8-3c3a-4782-b8ef-75125cda3933" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838587 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec82238-89a9-422f-bed8-d3a1992d60e9" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838592 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec82238-89a9-422f-bed8-d3a1992d60e9" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838608 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc19662-4ee3-4885-9103-313f65c41fd9" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838613 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc19662-4ee3-4885-9103-313f65c41fd9" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838627 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46101c43-94be-44b4-9638-7f5a94535611" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838633 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46101c43-94be-44b4-9638-7f5a94535611" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: E0121 16:07:47.838643 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838649 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838799 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838827 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd39e035-830d-4cc3-ab9f-2127564ecddf" 
containerName="keystone-db-sync" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e3a9c8-3c3a-4782-b8ef-75125cda3933" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838853 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc19662-4ee3-4885-9103-313f65c41fd9" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838864 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec82238-89a9-422f-bed8-d3a1992d60e9" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46101c43-94be-44b4-9638-7f5a94535611" containerName="mariadb-database-create" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.838887 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" containerName="mariadb-account-create-update" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.839350 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.844096 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.844165 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.844403 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.844521 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rbnr" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.844639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.858733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q25mp"] Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.898407 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.900672 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.905697 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-97m6r" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.905887 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.910516 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.914125 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.914748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.958755 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-scripts\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-combined-ca-bundle\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-config-data\") pod \"keystone-bootstrap-q25mp\" (UID: 
\"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcczd\" (UniqueName: \"kubernetes.io/projected/2a1dd97a-3057-4f3b-9940-66d922b6c34b-kube-api-access-gcczd\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-credential-keys\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959911 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.959913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5wt\" (UniqueName: \"kubernetes.io/projected/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-kube-api-access-tn5wt\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.960276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-fernet-keys\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.960319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.960392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.960411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-logs\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.961426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.962490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:07:47 crc kubenswrapper[4707]: I0121 16:07:47.979866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.016788 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.018775 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.022390 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.022734 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.036539 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mq5pt"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.037509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.039451 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.039614 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.041086 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-nhwsw" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.043604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-scripts\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-scripts\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: 
I0121 16:07:48.062121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-combined-ca-bundle\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-config-data\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcczd\" (UniqueName: \"kubernetes.io/projected/2a1dd97a-3057-4f3b-9940-66d922b6c34b-kube-api-access-gcczd\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkx95\" (UniqueName: \"kubernetes.io/projected/1a5b8167-6a2b-41e6-b896-500d3c5401d2-kube-api-access-jkx95\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-run-httpd\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: 
I0121 16:07:48.062309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-log-httpd\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-credential-keys\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062374 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mq5pt"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-config-data\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbc8w\" (UniqueName: \"kubernetes.io/projected/f1274f2d-1e64-467f-8f46-c91ec758b5c0-kube-api-access-mbc8w\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062469 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn5wt\" (UniqueName: 
\"kubernetes.io/projected/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-kube-api-access-tn5wt\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-fernet-keys\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-logs\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.062771 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.065221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.065386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-logs\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.069579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.071918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.071962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.072271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-fernet-keys\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.072554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-credential-keys\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.073102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-combined-ca-bundle\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.073268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.073968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-scripts\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.084068 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-8mqm4"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.085076 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.085273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn5wt\" (UniqueName: \"kubernetes.io/projected/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-kube-api-access-tn5wt\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.102135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-config-data\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.102594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcczd\" (UniqueName: \"kubernetes.io/projected/2a1dd97a-3057-4f3b-9940-66d922b6c34b-kube-api-access-gcczd\") pod \"keystone-bootstrap-q25mp\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.103379 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-8mqm4"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.119174 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.119375 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-tjf5j" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.119495 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.168622 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.172466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-combined-ca-bundle\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.173031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.173372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-config-data\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjvs\" (UniqueName: \"kubernetes.io/projected/58df5777-ecc8-4eda-8a67-c58f10b6e920-kube-api-access-hzjvs\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-scripts\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") 
" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkx95\" (UniqueName: \"kubernetes.io/projected/1a5b8167-6a2b-41e6-b896-500d3c5401d2-kube-api-access-jkx95\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-run-httpd\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.174953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-scripts\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.175507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-log-httpd\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.175858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-db-sync-config-data\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-combined-ca-bundle\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: 
I0121 16:07:48.176469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-config-data\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbc8w\" (UniqueName: \"kubernetes.io/projected/f1274f2d-1e64-467f-8f46-c91ec758b5c0-kube-api-access-mbc8w\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-config\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-run-httpd\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b37e7b-be66-400a-a500-d80241ee7ac3-etc-machine-id\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.176618 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.182578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.183101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-log-httpd\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.184713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.185738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klv6c\" (UniqueName: \"kubernetes.io/projected/72b37e7b-be66-400a-a500-d80241ee7ac3-kube-api-access-klv6c\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.186568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-scripts\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.188698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.190817 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hsp7r"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.190925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.192881 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.194603 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.196259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.196609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.201617 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.201880 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-4nz5r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.209405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.227357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.232626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkx95\" (UniqueName: \"kubernetes.io/projected/1a5b8167-6a2b-41e6-b896-500d3c5401d2-kube-api-access-jkx95\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.233179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbc8w\" (UniqueName: \"kubernetes.io/projected/f1274f2d-1e64-467f-8f46-c91ec758b5c0-kube-api-access-mbc8w\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.233662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.236983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hsp7r"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.240026 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-config-data\") pod \"ceilometer-0\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.260377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tlxvk"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.261363 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.263333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.263735 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-7q2dt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.264424 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.266250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tlxvk"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.273737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjvs\" (UniqueName: \"kubernetes.io/projected/58df5777-ecc8-4eda-8a67-c58f10b6e920-kube-api-access-hzjvs\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n58tc\" (UniqueName: \"kubernetes.io/projected/5d9101f3-dd71-4ccb-a894-1778491da333-kube-api-access-n58tc\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-scripts\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9101f3-dd71-4ccb-a894-1778491da333-logs\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-db-sync-config-data\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-combined-ca-bundle\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-config\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.289971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-config-data\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b37e7b-be66-400a-a500-d80241ee7ac3-etc-machine-id\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b37e7b-be66-400a-a500-d80241ee7ac3-etc-machine-id\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klv6c\" (UniqueName: \"kubernetes.io/projected/72b37e7b-be66-400a-a500-d80241ee7ac3-kube-api-access-klv6c\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-scripts\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-combined-ca-bundle\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-config-data\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.291660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-combined-ca-bundle\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.293079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-db-sync-config-data\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.293523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-config\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.293928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-combined-ca-bundle\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.295247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-config-data\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.297679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-scripts\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.305442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-combined-ca-bundle\") pod \"cinder-db-sync-mq5pt\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.315361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjvs\" (UniqueName: \"kubernetes.io/projected/58df5777-ecc8-4eda-8a67-c58f10b6e920-kube-api-access-hzjvs\") pod \"neutron-db-sync-8mqm4\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.317291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klv6c\" (UniqueName: \"kubernetes.io/projected/72b37e7b-be66-400a-a500-d80241ee7ac3-kube-api-access-klv6c\") pod \"cinder-db-sync-mq5pt\" (UID: 
\"72b37e7b-be66-400a-a500-d80241ee7ac3\") " pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.333879 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.352011 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-scripts\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-combined-ca-bundle\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-db-sync-config-data\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hdrx\" (UniqueName: \"kubernetes.io/projected/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-kube-api-access-9hdrx\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n58tc\" (UniqueName: \"kubernetes.io/projected/5d9101f3-dd71-4ccb-a894-1778491da333-kube-api-access-n58tc\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-combined-ca-bundle\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9101f3-dd71-4ccb-a894-1778491da333-logs\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.392838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-config-data\") pod \"placement-db-sync-hsp7r\" (UID: 
\"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.395228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9101f3-dd71-4ccb-a894-1778491da333-logs\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.397469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-combined-ca-bundle\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.397658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-scripts\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.398097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-config-data\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.411562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n58tc\" (UniqueName: \"kubernetes.io/projected/5d9101f3-dd71-4ccb-a894-1778491da333-kube-api-access-n58tc\") pod \"placement-db-sync-hsp7r\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.494395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-db-sync-config-data\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.494444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hdrx\" (UniqueName: \"kubernetes.io/projected/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-kube-api-access-9hdrx\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.494476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-combined-ca-bundle\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.497074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-combined-ca-bundle\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " 
pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.497362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-db-sync-config-data\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.509477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hdrx\" (UniqueName: \"kubernetes.io/projected/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-kube-api-access-9hdrx\") pod \"barbican-db-sync-tlxvk\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.519595 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.581292 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.595173 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.603858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.658439 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q25mp"] Jan 21 16:07:48 crc kubenswrapper[4707]: W0121 16:07:48.665057 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a1dd97a_3057_4f3b_9940_66d922b6c34b.slice/crio-29b9c2fee5825529ed52beb2df215efd0e043de5b6d0623ff4d35871c1f2072a WatchSource:0}: Error finding container 29b9c2fee5825529ed52beb2df215efd0e043de5b6d0623ff4d35871c1f2072a: Status 404 returned error can't find the container with id 29b9c2fee5825529ed52beb2df215efd0e043de5b6d0623ff4d35871c1f2072a Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.738479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.792914 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:48 crc kubenswrapper[4707]: W0121 16:07:48.798648 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a5b8167_6a2b_41e6_b896_500d3c5401d2.slice/crio-b7590cc600ba97c9036bbde1dca8350b4ec3e199298eede790556c55792307f0 WatchSource:0}: Error finding container b7590cc600ba97c9036bbde1dca8350b4ec3e199298eede790556c55792307f0: Status 404 returned error can't find the container with id b7590cc600ba97c9036bbde1dca8350b4ec3e199298eede790556c55792307f0 Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.802428 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.878878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mq5pt"] Jan 21 16:07:48 crc 
kubenswrapper[4707]: W0121 16:07:48.884192 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b37e7b_be66_400a_a500_d80241ee7ac3.slice/crio-8a27066f9834ae980be5213b9ca3704f3ad9614e3c39c19e24fa77634c9f9c7c WatchSource:0}: Error finding container 8a27066f9834ae980be5213b9ca3704f3ad9614e3c39c19e24fa77634c9f9c7c: Status 404 returned error can't find the container with id 8a27066f9834ae980be5213b9ca3704f3ad9614e3c39c19e24fa77634c9f9c7c Jan 21 16:07:48 crc kubenswrapper[4707]: I0121 16:07:48.976834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.030673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-8mqm4"] Jan 21 16:07:49 crc kubenswrapper[4707]: W0121 16:07:49.033459 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58df5777_ecc8_4eda_8a67_c58f10b6e920.slice/crio-280d458c1a407c520bdfb674d621f79081b5e7d7bb7f725d67a005d26c4d1017 WatchSource:0}: Error finding container 280d458c1a407c520bdfb674d621f79081b5e7d7bb7f725d67a005d26c4d1017: Status 404 returned error can't find the container with id 280d458c1a407c520bdfb674d621f79081b5e7d7bb7f725d67a005d26c4d1017 Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.089146 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hsp7r"] Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.111950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tlxvk"] Jan 21 16:07:49 crc kubenswrapper[4707]: W0121 16:07:49.117167 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e2d0133_c8a1_44aa_8d4a_59d0125942b3.slice/crio-a466fcad6514b372ac122f2387c0fc348eb4234bbc9e22fa42aa2b847e422739 WatchSource:0}: Error finding container a466fcad6514b372ac122f2387c0fc348eb4234bbc9e22fa42aa2b847e422739: Status 404 returned error can't find the container with id a466fcad6514b372ac122f2387c0fc348eb4234bbc9e22fa42aa2b847e422739 Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.496373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" event={"ID":"1e2d0133-c8a1-44aa-8d4a-59d0125942b3","Type":"ContainerStarted","Data":"09b6bb9a89e57fb4efbbe1768b6bbd8a7239804021d9c1e24c897539673129b4"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.496563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" event={"ID":"1e2d0133-c8a1-44aa-8d4a-59d0125942b3","Type":"ContainerStarted","Data":"a466fcad6514b372ac122f2387c0fc348eb4234bbc9e22fa42aa2b847e422739"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.505302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" event={"ID":"5d9101f3-dd71-4ccb-a894-1778491da333","Type":"ContainerStarted","Data":"4b897ebc0ad2bb55b6bc3e5391a8b08293a8e6797572f87736673c368f860c54"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.505329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" 
event={"ID":"5d9101f3-dd71-4ccb-a894-1778491da333","Type":"ContainerStarted","Data":"ea5c49a74159ef5f3a97fb7b2f871dba94157b06bf53d67bf5a354859b3f185a"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.515557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" event={"ID":"2a1dd97a-3057-4f3b-9940-66d922b6c34b","Type":"ContainerStarted","Data":"80e9733e9d9a5164525e952079104194c595f92d328fae8c657f54b22e91c862"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.515596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" event={"ID":"2a1dd97a-3057-4f3b-9940-66d922b6c34b","Type":"ContainerStarted","Data":"29b9c2fee5825529ed52beb2df215efd0e043de5b6d0623ff4d35871c1f2072a"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.516327 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" podStartSLOduration=1.5163139829999999 podStartE2EDuration="1.516313983s" podCreationTimestamp="2026-01-21 16:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:49.513356241 +0000 UTC m=+3966.694872453" watchObservedRunningTime="2026-01-21 16:07:49.516313983 +0000 UTC m=+3966.697830205" Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.516467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" event={"ID":"72b37e7b-be66-400a-a500-d80241ee7ac3","Type":"ContainerStarted","Data":"8a27066f9834ae980be5213b9ca3704f3ad9614e3c39c19e24fa77634c9f9c7c"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.523767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerStarted","Data":"b7590cc600ba97c9036bbde1dca8350b4ec3e199298eede790556c55792307f0"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.524663 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" podStartSLOduration=1.5246457690000002 podStartE2EDuration="1.524645769s" podCreationTimestamp="2026-01-21 16:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:49.524057573 +0000 UTC m=+3966.705573795" watchObservedRunningTime="2026-01-21 16:07:49.524645769 +0000 UTC m=+3966.706161991" Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.532451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e","Type":"ContainerStarted","Data":"9a43feb77cd328d0822c3aeeb9c89163cb3bfc01557f39bc6056d243b7541a21"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.533743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f1274f2d-1e64-467f-8f46-c91ec758b5c0","Type":"ContainerStarted","Data":"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.533766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"f1274f2d-1e64-467f-8f46-c91ec758b5c0","Type":"ContainerStarted","Data":"4198adbf956179069d16debd6c30367b5dd93896fd540513c2d698e8335cb0a4"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.535902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" event={"ID":"58df5777-ecc8-4eda-8a67-c58f10b6e920","Type":"ContainerStarted","Data":"a005d0adcf1ca25f60db5126c522d38f536eb886e79a89b2052b955fee4ea881"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.535928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" event={"ID":"58df5777-ecc8-4eda-8a67-c58f10b6e920","Type":"ContainerStarted","Data":"280d458c1a407c520bdfb674d621f79081b5e7d7bb7f725d67a005d26c4d1017"} Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.554875 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" podStartSLOduration=2.554860993 podStartE2EDuration="2.554860993s" podCreationTimestamp="2026-01-21 16:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:49.542709244 +0000 UTC m=+3966.724225467" watchObservedRunningTime="2026-01-21 16:07:49.554860993 +0000 UTC m=+3966.736377216" Jan 21 16:07:49 crc kubenswrapper[4707]: I0121 16:07:49.558084 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" podStartSLOduration=1.558075808 podStartE2EDuration="1.558075808s" podCreationTimestamp="2026-01-21 16:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:49.554168141 +0000 UTC m=+3966.735684363" watchObservedRunningTime="2026-01-21 16:07:49.558075808 +0000 UTC m=+3966.739592021" Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.293970 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.339186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.353001 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.577019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" event={"ID":"72b37e7b-be66-400a-a500-d80241ee7ac3","Type":"ContainerStarted","Data":"d5feaa8ac1d4f32dd02f665fb416dd696dc70202a06de401c7aae4945d988bb3"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.604363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerStarted","Data":"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.604420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerStarted","Data":"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.609102 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" podStartSLOduration=2.609091232 podStartE2EDuration="2.609091232s" podCreationTimestamp="2026-01-21 16:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:50.601064669 +0000 UTC m=+3967.782580880" watchObservedRunningTime="2026-01-21 16:07:50.609091232 +0000 UTC m=+3967.790607454" Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.633659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e","Type":"ContainerStarted","Data":"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.633696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e","Type":"ContainerStarted","Data":"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.662873 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.662859824 podStartE2EDuration="3.662859824s" podCreationTimestamp="2026-01-21 16:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:50.658993154 +0000 UTC m=+3967.840509375" watchObservedRunningTime="2026-01-21 16:07:50.662859824 +0000 UTC m=+3967.844376046" Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.676981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f1274f2d-1e64-467f-8f46-c91ec758b5c0","Type":"ContainerStarted","Data":"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.706111 4707 generic.go:334] "Generic (PLEG): container finished" podID="1e2d0133-c8a1-44aa-8d4a-59d0125942b3" containerID="09b6bb9a89e57fb4efbbe1768b6bbd8a7239804021d9c1e24c897539673129b4" exitCode=0 Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.706997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" event={"ID":"1e2d0133-c8a1-44aa-8d4a-59d0125942b3","Type":"ContainerDied","Data":"09b6bb9a89e57fb4efbbe1768b6bbd8a7239804021d9c1e24c897539673129b4"} Jan 21 16:07:50 crc kubenswrapper[4707]: I0121 16:07:50.710798 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.710784628 podStartE2EDuration="3.710784628s" podCreationTimestamp="2026-01-21 16:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:50.70866373 +0000 UTC m=+3967.890179953" watchObservedRunningTime="2026-01-21 16:07:50.710784628 +0000 UTC m=+3967.892300851" Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.715205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerStarted","Data":"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d"} Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.717453 4707 
generic.go:334] "Generic (PLEG): container finished" podID="5d9101f3-dd71-4ccb-a894-1778491da333" containerID="4b897ebc0ad2bb55b6bc3e5391a8b08293a8e6797572f87736673c368f860c54" exitCode=0 Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.717580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" event={"ID":"5d9101f3-dd71-4ccb-a894-1778491da333","Type":"ContainerDied","Data":"4b897ebc0ad2bb55b6bc3e5391a8b08293a8e6797572f87736673c368f860c54"} Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.717900 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-log" containerID="cri-o://31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0" gracePeriod=30 Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.717997 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-httpd" containerID="cri-o://ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078" gracePeriod=30 Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.718393 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-log" containerID="cri-o://140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c" gracePeriod=30 Jan 21 16:07:51 crc kubenswrapper[4707]: I0121 16:07:51.718485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-httpd" containerID="cri-o://85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae" gracePeriod=30 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.115835 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.163603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hdrx\" (UniqueName: \"kubernetes.io/projected/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-kube-api-access-9hdrx\") pod \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.163637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-db-sync-config-data\") pod \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.163763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-combined-ca-bundle\") pod \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\" (UID: \"1e2d0133-c8a1-44aa-8d4a-59d0125942b3\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.171449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-kube-api-access-9hdrx" (OuterVolumeSpecName: "kube-api-access-9hdrx") pod "1e2d0133-c8a1-44aa-8d4a-59d0125942b3" (UID: "1e2d0133-c8a1-44aa-8d4a-59d0125942b3"). InnerVolumeSpecName "kube-api-access-9hdrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.184084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1e2d0133-c8a1-44aa-8d4a-59d0125942b3" (UID: "1e2d0133-c8a1-44aa-8d4a-59d0125942b3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.203607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e2d0133-c8a1-44aa-8d4a-59d0125942b3" (UID: "1e2d0133-c8a1-44aa-8d4a-59d0125942b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.267707 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hdrx\" (UniqueName: \"kubernetes.io/projected/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-kube-api-access-9hdrx\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.267736 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.267746 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2d0133-c8a1-44aa-8d4a-59d0125942b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.531461 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.537508 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-logs\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-combined-ca-bundle\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbc8w\" (UniqueName: \"kubernetes.io/projected/f1274f2d-1e64-467f-8f46-c91ec758b5c0-kube-api-access-mbc8w\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-httpd-run\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-config-data\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-internal-tls-certs\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-scripts\") pod \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\" (UID: \"f1274f2d-1e64-467f-8f46-c91ec758b5c0\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.572881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.573031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-logs" (OuterVolumeSpecName: "logs") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.575947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.577470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-scripts" (OuterVolumeSpecName: "scripts") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.580271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1274f2d-1e64-467f-8f46-c91ec758b5c0-kube-api-access-mbc8w" (OuterVolumeSpecName: "kube-api-access-mbc8w") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "kube-api-access-mbc8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.649775 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm"] Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.650356 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-httpd" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-httpd" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.650395 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-httpd" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-httpd" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.650417 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2d0133-c8a1-44aa-8d4a-59d0125942b3" containerName="barbican-db-sync" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650423 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2d0133-c8a1-44aa-8d4a-59d0125942b3" containerName="barbican-db-sync" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.650437 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-log" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650442 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-log" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 
16:07:52.650456 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-log" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650461 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-log" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-httpd" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerName="glance-log" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650637 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-httpd" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650644 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2d0133-c8a1-44aa-8d4a-59d0125942b3" containerName="barbican-db-sync" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.650651 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerName="glance-log" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.653024 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.654905 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.667476 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm"] Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.674729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-combined-ca-bundle\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.674780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.674841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-config-data\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.678853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-public-tls-certs\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.678950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn5wt\" (UniqueName: \"kubernetes.io/projected/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-kube-api-access-tn5wt\") 
pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.678976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-httpd-run\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-logs\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-scripts\") pod \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\" (UID: \"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e\") " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679544 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679562 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679572 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679581 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbc8w\" (UniqueName: \"kubernetes.io/projected/f1274f2d-1e64-467f-8f46-c91ec758b5c0-kube-api-access-mbc8w\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.679591 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1274f2d-1e64-467f-8f46-c91ec758b5c0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.681874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-logs" (OuterVolumeSpecName: "logs") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.681951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.705292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-scripts" (OuterVolumeSpecName: "scripts") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.708699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.716998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.717056 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w"] Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.718318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.719576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w"] Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.719964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.727987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-kube-api-access-tn5wt" (OuterVolumeSpecName: "kube-api-access-tn5wt") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "kube-api-access-tn5wt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.728641 4707 generic.go:334] "Generic (PLEG): container finished" podID="72b37e7b-be66-400a-a500-d80241ee7ac3" containerID="d5feaa8ac1d4f32dd02f665fb416dd696dc70202a06de401c7aae4945d988bb3" exitCode=0 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.728720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" event={"ID":"72b37e7b-be66-400a-a500-d80241ee7ac3","Type":"ContainerDied","Data":"d5feaa8ac1d4f32dd02f665fb416dd696dc70202a06de401c7aae4945d988bb3"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733271 4707 generic.go:334] "Generic (PLEG): container finished" podID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerID="85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae" exitCode=0 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733300 4707 generic.go:334] "Generic (PLEG): container finished" podID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" containerID="140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c" exitCode=143 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e","Type":"ContainerDied","Data":"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e","Type":"ContainerDied","Data":"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e","Type":"ContainerDied","Data":"9a43feb77cd328d0822c3aeeb9c89163cb3bfc01557f39bc6056d243b7541a21"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733423 4707 scope.go:117] "RemoveContainer" containerID="85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.733553 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.736573 4707 generic.go:334] "Generic (PLEG): container finished" podID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerID="ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078" exitCode=0 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.736588 4707 generic.go:334] "Generic (PLEG): container finished" podID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" containerID="31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0" exitCode=143 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.736618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f1274f2d-1e64-467f-8f46-c91ec758b5c0","Type":"ContainerDied","Data":"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.736635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f1274f2d-1e64-467f-8f46-c91ec758b5c0","Type":"ContainerDied","Data":"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.736645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"f1274f2d-1e64-467f-8f46-c91ec758b5c0","Type":"ContainerDied","Data":"4198adbf956179069d16debd6c30367b5dd93896fd540513c2d698e8335cb0a4"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.736686 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.743555 4707 generic.go:334] "Generic (PLEG): container finished" podID="58df5777-ecc8-4eda-8a67-c58f10b6e920" containerID="a005d0adcf1ca25f60db5126c522d38f536eb886e79a89b2052b955fee4ea881" exitCode=0 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.743610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" event={"ID":"58df5777-ecc8-4eda-8a67-c58f10b6e920","Type":"ContainerDied","Data":"a005d0adcf1ca25f60db5126c522d38f536eb886e79a89b2052b955fee4ea881"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.762146 4707 scope.go:117] "RemoveContainer" containerID="140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.765956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" event={"ID":"1e2d0133-c8a1-44aa-8d4a-59d0125942b3","Type":"ContainerDied","Data":"a466fcad6514b372ac122f2387c0fc348eb4234bbc9e22fa42aa2b847e422739"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.766056 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a466fcad6514b372ac122f2387c0fc348eb4234bbc9e22fa42aa2b847e422739" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.766158 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-tlxvk" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.774425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.775915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.788503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-config-data" (OuterVolumeSpecName: "config-data") pod "f1274f2d-1e64-467f-8f46-c91ec758b5c0" (UID: "f1274f2d-1e64-467f-8f46-c91ec758b5c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.788778 4707 generic.go:334] "Generic (PLEG): container finished" podID="2a1dd97a-3057-4f3b-9940-66d922b6c34b" containerID="80e9733e9d9a5164525e952079104194c595f92d328fae8c657f54b22e91c862" exitCode=0 Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.789081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" event={"ID":"2a1dd97a-3057-4f3b-9940-66d922b6c34b","Type":"ContainerDied","Data":"80e9733e9d9a5164525e952079104194c595f92d328fae8c657f54b22e91c862"} Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.789399 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data-custom\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6fa0ac-e4ce-419d-a680-ce63883886a7-logs\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc4c055-2df1-4d1f-b331-30020327ed16-logs\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793501 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4j7\" (UniqueName: \"kubernetes.io/projected/1bc4c055-2df1-4d1f-b331-30020327ed16-kube-api-access-sp4j7\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-combined-ca-bundle\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data-custom\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xvj\" (UniqueName: \"kubernetes.io/projected/4e6fa0ac-e4ce-419d-a680-ce63883886a7-kube-api-access-k7xvj\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-combined-ca-bundle\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793754 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793770 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793795 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793819 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793832 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793841 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793850 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1274f2d-1e64-467f-8f46-c91ec758b5c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793859 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn5wt\" (UniqueName: \"kubernetes.io/projected/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-kube-api-access-tn5wt\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793872 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.793909 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.812146 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl"] Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.814522 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.817169 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.824170 4707 scope.go:117] "RemoveContainer" containerID="85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.824639 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae\": container with ID starting with 85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae not found: ID does not exist" containerID="85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.824669 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae"} err="failed to get container status \"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae\": rpc error: code = NotFound desc = could not find container \"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae\": container with ID starting with 85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.824690 4707 scope.go:117] "RemoveContainer" containerID="140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.825388 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c\": container with ID starting with 140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c not found: ID does not exist" containerID="140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.825418 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c"} err="failed to get container status \"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c\": rpc error: code = NotFound desc = could not find container \"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c\": container with ID starting with 140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.825446 4707 scope.go:117] "RemoveContainer" containerID="85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.825649 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae"} err="failed to get container status \"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae\": rpc error: code = NotFound desc = could not find container \"85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae\": container with ID starting with 85d65c3875dedeb377af6dcf6990315515deead5b5bd9f2ddaff288ae431c3ae not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: 
I0121 16:07:52.825669 4707 scope.go:117] "RemoveContainer" containerID="140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.827275 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.827672 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c"} err="failed to get container status \"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c\": rpc error: code = NotFound desc = could not find container \"140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c\": container with ID starting with 140d8be9eccb8f878fbafb009b2f78e9107d2e2af4e045e7a33287c4038a6a2c not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.827695 4707 scope.go:117] "RemoveContainer" containerID="ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.834010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.837657 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl"] Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.851255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-config-data" (OuterVolumeSpecName: "config-data") pod "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" (UID: "ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.856226 4707 scope.go:117] "RemoveContainer" containerID="31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.877973 4707 scope.go:117] "RemoveContainer" containerID="ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.878671 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078\": container with ID starting with ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078 not found: ID does not exist" containerID="ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.878707 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078"} err="failed to get container status \"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078\": rpc error: code = NotFound desc = could not find container \"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078\": container with ID starting with ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078 not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.878748 4707 scope.go:117] "RemoveContainer" containerID="31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0" Jan 21 16:07:52 crc kubenswrapper[4707]: E0121 16:07:52.881940 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0\": container with ID starting with 31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0 not found: ID does not exist" containerID="31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.881977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0"} err="failed to get container status \"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0\": rpc error: code = NotFound desc = could not find container \"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0\": container with ID starting with 31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0 not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.881999 4707 scope.go:117] "RemoveContainer" containerID="ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.882286 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078"} err="failed to get container status \"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078\": rpc error: code = NotFound desc = could not find container \"ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078\": container with ID starting with ac1fdb9b6393c8f95f793f53c938c5c28aa5db14886943bb5d3c9968e8473078 not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.882331 4707 
scope.go:117] "RemoveContainer" containerID="31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.882524 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0"} err="failed to get container status \"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0\": rpc error: code = NotFound desc = could not find container \"31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0\": container with ID starting with 31784f6823ea7aa79913702ba89f907ffc0058a6adc223e841e7b6457130fda0 not found: ID does not exist" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp4j7\" (UniqueName: \"kubernetes.io/projected/1bc4c055-2df1-4d1f-b331-30020327ed16-kube-api-access-sp4j7\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-combined-ca-bundle\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5cc\" (UniqueName: \"kubernetes.io/projected/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-kube-api-access-kv5cc\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data-custom\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xvj\" (UniqueName: \"kubernetes.io/projected/4e6fa0ac-e4ce-419d-a680-ce63883886a7-kube-api-access-k7xvj\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data\") pod 
\"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-combined-ca-bundle\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-combined-ca-bundle\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data-custom\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data-custom\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6fa0ac-e4ce-419d-a680-ce63883886a7-logs\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-logs\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc4c055-2df1-4d1f-b331-30020327ed16-logs\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.895995 4707 reconciler_common.go:293] "Volume 
detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.896005 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.896015 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.898316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc4c055-2df1-4d1f-b331-30020327ed16-logs\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.898677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6fa0ac-e4ce-419d-a680-ce63883886a7-logs\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.906550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-combined-ca-bundle\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.906963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data-custom\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.906971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.907663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.909961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4j7\" (UniqueName: \"kubernetes.io/projected/1bc4c055-2df1-4d1f-b331-30020327ed16-kube-api-access-sp4j7\") pod \"barbican-worker-5cdcb8d65c-2rlbm\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.911115 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xvj\" (UniqueName: \"kubernetes.io/projected/4e6fa0ac-e4ce-419d-a680-ce63883886a7-kube-api-access-k7xvj\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.912887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-combined-ca-bundle\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.914072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data-custom\") pod \"barbican-keystone-listener-f74c7bd86-hlp8w\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.970274 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.996854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5cc\" (UniqueName: \"kubernetes.io/projected/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-kube-api-access-kv5cc\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.997055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.997131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-combined-ca-bundle\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.997154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data-custom\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.997195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-logs\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:52 crc kubenswrapper[4707]: I0121 16:07:52.997553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-logs\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.000392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-combined-ca-bundle\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.000525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data-custom\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.001256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.010421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5cc\" (UniqueName: \"kubernetes.io/projected/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-kube-api-access-kv5cc\") pod \"barbican-api-76464b79f6-l8nvl\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.038282 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.053519 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.101403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-config-data\") pod \"5d9101f3-dd71-4ccb-a894-1778491da333\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.101495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-scripts\") pod \"5d9101f3-dd71-4ccb-a894-1778491da333\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.101582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-combined-ca-bundle\") pod \"5d9101f3-dd71-4ccb-a894-1778491da333\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.101662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9101f3-dd71-4ccb-a894-1778491da333-logs\") pod \"5d9101f3-dd71-4ccb-a894-1778491da333\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.101684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n58tc\" (UniqueName: \"kubernetes.io/projected/5d9101f3-dd71-4ccb-a894-1778491da333-kube-api-access-n58tc\") pod \"5d9101f3-dd71-4ccb-a894-1778491da333\" (UID: \"5d9101f3-dd71-4ccb-a894-1778491da333\") " Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.103878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9101f3-dd71-4ccb-a894-1778491da333-logs" (OuterVolumeSpecName: "logs") pod "5d9101f3-dd71-4ccb-a894-1778491da333" (UID: "5d9101f3-dd71-4ccb-a894-1778491da333"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.113245 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.114988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-scripts" (OuterVolumeSpecName: "scripts") pod "5d9101f3-dd71-4ccb-a894-1778491da333" (UID: "5d9101f3-dd71-4ccb-a894-1778491da333"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.135795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9101f3-dd71-4ccb-a894-1778491da333-kube-api-access-n58tc" (OuterVolumeSpecName: "kube-api-access-n58tc") pod "5d9101f3-dd71-4ccb-a894-1778491da333" (UID: "5d9101f3-dd71-4ccb-a894-1778491da333"). InnerVolumeSpecName "kube-api-access-n58tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.141179 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.143874 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.151616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d9101f3-dd71-4ccb-a894-1778491da333" (UID: "5d9101f3-dd71-4ccb-a894-1778491da333"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.166723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-config-data" (OuterVolumeSpecName: "config-data") pod "5d9101f3-dd71-4ccb-a894-1778491da333" (UID: "5d9101f3-dd71-4ccb-a894-1778491da333"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.174759 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: E0121 16:07:53.175248 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9101f3-dd71-4ccb-a894-1778491da333" containerName="placement-db-sync" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.175266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9101f3-dd71-4ccb-a894-1778491da333" containerName="placement-db-sync" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.175441 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9101f3-dd71-4ccb-a894-1778491da333" containerName="placement-db-sync" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.176258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.191319 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.191490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.191569 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-97m6r" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.191607 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.196091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e" path="/var/lib/kubelet/pods/ff4eb7bd-bc9f-4a77-baa7-fc8e4ba2030e/volumes" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.197201 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.197247 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.199119 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.205060 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.205087 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.205097 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9101f3-dd71-4ccb-a894-1778491da333-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.205105 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n58tc\" (UniqueName: \"kubernetes.io/projected/5d9101f3-dd71-4ccb-a894-1778491da333-kube-api-access-n58tc\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.205114 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9101f3-dd71-4ccb-a894-1778491da333-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.212337 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.214111 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.219531 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.219695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.224845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.306972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-logs\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-config-data\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 
21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9m7\" (UniqueName: \"kubernetes.io/projected/825bff4e-d6ba-4010-97b5-919025083000-kube-api-access-fw9m7\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-logs\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-scripts\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.307838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8x2\" (UniqueName: 
\"kubernetes.io/projected/2f2511c6-b349-44ef-bd96-a121829bd0b8-kube-api-access-cz8x2\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9m7\" (UniqueName: \"kubernetes.io/projected/825bff4e-d6ba-4010-97b5-919025083000-kube-api-access-fw9m7\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409438 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-logs\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-scripts\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.409988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cz8x2\" (UniqueName: \"kubernetes.io/projected/2f2511c6-b349-44ef-bd96-a121829bd0b8-kube-api-access-cz8x2\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-logs\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-config-data\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410511 4707 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.410793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-logs\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.413973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.414211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-logs\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.415160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.417985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.419456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.421080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.423152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-scripts\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.423565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.424520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8x2\" (UniqueName: \"kubernetes.io/projected/2f2511c6-b349-44ef-bd96-a121829bd0b8-kube-api-access-cz8x2\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.424884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.427139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.437232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw9m7\" (UniqueName: \"kubernetes.io/projected/825bff4e-d6ba-4010-97b5-919025083000-kube-api-access-fw9m7\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.440043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.452132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-config-data\") pod \"glance-default-internal-api-0\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.454495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.533595 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.550799 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.562314 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.637784 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.805622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" event={"ID":"1bc4c055-2df1-4d1f-b331-30020327ed16","Type":"ContainerStarted","Data":"92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.805655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" event={"ID":"1bc4c055-2df1-4d1f-b331-30020327ed16","Type":"ContainerStarted","Data":"8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.805666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" event={"ID":"1bc4c055-2df1-4d1f-b331-30020327ed16","Type":"ContainerStarted","Data":"dda11bed22650a80cbd1e89e066168e6698086b16561d374618a5c4168529dd0"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.817947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" event={"ID":"92bc7502-e503-4b19-b9f3-8ef8149a1e7d","Type":"ContainerStarted","Data":"a7801d9d5ee47a2f0d9786a9cb40df520effb2580cbdb96f216bddab5e9295b7"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.817995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" event={"ID":"92bc7502-e503-4b19-b9f3-8ef8149a1e7d","Type":"ContainerStarted","Data":"9813424d91373477bd684a56e1522f7a27a94e184c5fc1e47102c6414a80cd4b"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.823728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" event={"ID":"4e6fa0ac-e4ce-419d-a680-ce63883886a7","Type":"ContainerStarted","Data":"f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.823762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" event={"ID":"4e6fa0ac-e4ce-419d-a680-ce63883886a7","Type":"ContainerStarted","Data":"210cb3f58c8c9c6f1aa1eaf7bbf08a773fac3b0cd7d8fbd8cb8b964e69e76872"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.826305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" event={"ID":"5d9101f3-dd71-4ccb-a894-1778491da333","Type":"ContainerDied","Data":"ea5c49a74159ef5f3a97fb7b2f871dba94157b06bf53d67bf5a354859b3f185a"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.826326 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea5c49a74159ef5f3a97fb7b2f871dba94157b06bf53d67bf5a354859b3f185a" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.826392 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hsp7r" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.837108 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" podStartSLOduration=1.837089655 podStartE2EDuration="1.837089655s" podCreationTimestamp="2026-01-21 16:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:53.818658588 +0000 UTC m=+3971.000174810" watchObservedRunningTime="2026-01-21 16:07:53.837089655 +0000 UTC m=+3971.018605877" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.837729 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-central-agent" containerID="cri-o://1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" gracePeriod=30 Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.837918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerStarted","Data":"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13"} Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.838006 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="proxy-httpd" containerID="cri-o://2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" gracePeriod=30 Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.838171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.838224 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="sg-core" containerID="cri-o://cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" gracePeriod=30 Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.838285 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-notification-agent" containerID="cri-o://7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" gracePeriod=30 Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.885275 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.372973262 podStartE2EDuration="6.885259049s" podCreationTimestamp="2026-01-21 16:07:47 +0000 UTC" firstStartedPulling="2026-01-21 16:07:48.802232917 +0000 UTC m=+3965.983749138" lastFinishedPulling="2026-01-21 16:07:52.314518703 +0000 UTC m=+3969.496034925" observedRunningTime="2026-01-21 16:07:53.876252373 +0000 UTC m=+3971.057768594" watchObservedRunningTime="2026-01-21 16:07:53.885259049 +0000 UTC m=+3971.066775271" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.916063 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6f9dc896b6-x5875"] Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.918213 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.920664 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.921174 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.921297 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-4nz5r" Jan 21 16:07:53 crc kubenswrapper[4707]: I0121 16:07:53.921954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6f9dc896b6-x5875"] Jan 21 16:07:54 crc kubenswrapper[4707]: W0121 16:07:54.010167 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod825bff4e_d6ba_4010_97b5_919025083000.slice/crio-9994bd28d85b012fa6f57829f1010eda91e909a2125204c95d0315bf89a231d0 WatchSource:0}: Error finding container 9994bd28d85b012fa6f57829f1010eda91e909a2125204c95d0315bf89a231d0: Status 404 returned error can't find the container with id 9994bd28d85b012fa6f57829f1010eda91e909a2125204c95d0315bf89a231d0 Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.011402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.069832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-combined-ca-bundle\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.069883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-scripts\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.069945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-config-data\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.070022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkd9\" (UniqueName: \"kubernetes.io/projected/a2b79667-aae1-4cad-b7d9-52c4893cf000-kube-api-access-cfkd9\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.070047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2b79667-aae1-4cad-b7d9-52c4893cf000-logs\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " 
pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.115033 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.172044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-combined-ca-bundle\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.172111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-scripts\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.172201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-config-data\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.172343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkd9\" (UniqueName: \"kubernetes.io/projected/a2b79667-aae1-4cad-b7d9-52c4893cf000-kube-api-access-cfkd9\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.172393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2b79667-aae1-4cad-b7d9-52c4893cf000-logs\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.172846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2b79667-aae1-4cad-b7d9-52c4893cf000-logs\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.176100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-combined-ca-bundle\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.176574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-scripts\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.177589 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-config-data\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.185631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkd9\" (UniqueName: \"kubernetes.io/projected/a2b79667-aae1-4cad-b7d9-52c4893cf000-kube-api-access-cfkd9\") pod \"placement-6f9dc896b6-x5875\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.282170 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.462362 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-68f4b84bbb-5wq8w"] Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.467112 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.469679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.469924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.483601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-68f4b84bbb-5wq8w"] Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.498564 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.541112 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.583361 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-credential-keys\") pod \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-config-data\") pod \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcczd\" (UniqueName: \"kubernetes.io/projected/2a1dd97a-3057-4f3b-9940-66d922b6c34b-kube-api-access-gcczd\") pod \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-scripts\") pod \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-combined-ca-bundle\") pod \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-fernet-keys\") pod \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\" (UID: \"2a1dd97a-3057-4f3b-9940-66d922b6c34b\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6g48\" (UniqueName: \"kubernetes.io/projected/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-kube-api-access-k6g48\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-config-data\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-combined-ca-bundle\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602938 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-public-tls-certs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.602965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-logs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.603046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-internal-tls-certs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.603070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-scripts\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.611626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1dd97a-3057-4f3b-9940-66d922b6c34b-kube-api-access-gcczd" (OuterVolumeSpecName: "kube-api-access-gcczd") pod "2a1dd97a-3057-4f3b-9940-66d922b6c34b" (UID: "2a1dd97a-3057-4f3b-9940-66d922b6c34b"). InnerVolumeSpecName "kube-api-access-gcczd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.612918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a1dd97a-3057-4f3b-9940-66d922b6c34b" (UID: "2a1dd97a-3057-4f3b-9940-66d922b6c34b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.613455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-scripts" (OuterVolumeSpecName: "scripts") pod "2a1dd97a-3057-4f3b-9940-66d922b6c34b" (UID: "2a1dd97a-3057-4f3b-9940-66d922b6c34b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.622008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2a1dd97a-3057-4f3b-9940-66d922b6c34b" (UID: "2a1dd97a-3057-4f3b-9940-66d922b6c34b"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.624045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-config-data" (OuterVolumeSpecName: "config-data") pod "2a1dd97a-3057-4f3b-9940-66d922b6c34b" (UID: "2a1dd97a-3057-4f3b-9940-66d922b6c34b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.624535 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a1dd97a-3057-4f3b-9940-66d922b6c34b" (UID: "2a1dd97a-3057-4f3b-9940-66d922b6c34b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-combined-ca-bundle\") pod \"72b37e7b-be66-400a-a500-d80241ee7ac3\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klv6c\" (UniqueName: \"kubernetes.io/projected/72b37e7b-be66-400a-a500-d80241ee7ac3-kube-api-access-klv6c\") pod \"72b37e7b-be66-400a-a500-d80241ee7ac3\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-db-sync-config-data\") pod \"72b37e7b-be66-400a-a500-d80241ee7ac3\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-scripts\") pod \"72b37e7b-be66-400a-a500-d80241ee7ac3\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-combined-ca-bundle\") pod \"58df5777-ecc8-4eda-8a67-c58f10b6e920\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-config-data\") pod \"72b37e7b-be66-400a-a500-d80241ee7ac3\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-config\") pod \"58df5777-ecc8-4eda-8a67-c58f10b6e920\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b37e7b-be66-400a-a500-d80241ee7ac3-etc-machine-id\") pod \"72b37e7b-be66-400a-a500-d80241ee7ac3\" (UID: \"72b37e7b-be66-400a-a500-d80241ee7ac3\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzjvs\" (UniqueName: \"kubernetes.io/projected/58df5777-ecc8-4eda-8a67-c58f10b6e920-kube-api-access-hzjvs\") pod \"58df5777-ecc8-4eda-8a67-c58f10b6e920\" (UID: \"58df5777-ecc8-4eda-8a67-c58f10b6e920\") " Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-internal-tls-certs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-scripts\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6g48\" (UniqueName: \"kubernetes.io/projected/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-kube-api-access-k6g48\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-config-data\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-combined-ca-bundle\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-public-tls-certs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-logs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710082 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcczd\" (UniqueName: 
\"kubernetes.io/projected/2a1dd97a-3057-4f3b-9940-66d922b6c34b-kube-api-access-gcczd\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710091 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710100 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710109 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710116 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710125 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1dd97a-3057-4f3b-9940-66d922b6c34b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.710406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-logs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.721098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-scripts\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.709860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72b37e7b-be66-400a-a500-d80241ee7ac3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "72b37e7b-be66-400a-a500-d80241ee7ac3" (UID: "72b37e7b-be66-400a-a500-d80241ee7ac3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.721237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72b37e7b-be66-400a-a500-d80241ee7ac3" (UID: "72b37e7b-be66-400a-a500-d80241ee7ac3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.728041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-scripts" (OuterVolumeSpecName: "scripts") pod "72b37e7b-be66-400a-a500-d80241ee7ac3" (UID: "72b37e7b-be66-400a-a500-d80241ee7ac3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.731535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6g48\" (UniqueName: \"kubernetes.io/projected/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-kube-api-access-k6g48\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.731903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-internal-tls-certs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.732747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-config-data\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.739067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58df5777-ecc8-4eda-8a67-c58f10b6e920-kube-api-access-hzjvs" (OuterVolumeSpecName: "kube-api-access-hzjvs") pod "58df5777-ecc8-4eda-8a67-c58f10b6e920" (UID: "58df5777-ecc8-4eda-8a67-c58f10b6e920"). InnerVolumeSpecName "kube-api-access-hzjvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.739799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b37e7b-be66-400a-a500-d80241ee7ac3-kube-api-access-klv6c" (OuterVolumeSpecName: "kube-api-access-klv6c") pod "72b37e7b-be66-400a-a500-d80241ee7ac3" (UID: "72b37e7b-be66-400a-a500-d80241ee7ac3"). InnerVolumeSpecName "kube-api-access-klv6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.741409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-public-tls-certs\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.754505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-combined-ca-bundle\") pod \"placement-68f4b84bbb-5wq8w\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.799610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58df5777-ecc8-4eda-8a67-c58f10b6e920" (UID: "58df5777-ecc8-4eda-8a67-c58f10b6e920"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.802676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-config" (OuterVolumeSpecName: "config") pod "58df5777-ecc8-4eda-8a67-c58f10b6e920" (UID: "58df5777-ecc8-4eda-8a67-c58f10b6e920"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.803859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72b37e7b-be66-400a-a500-d80241ee7ac3" (UID: "72b37e7b-be66-400a-a500-d80241ee7ac3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.806512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-config-data" (OuterVolumeSpecName: "config-data") pod "72b37e7b-be66-400a-a500-d80241ee7ac3" (UID: "72b37e7b-be66-400a-a500-d80241ee7ac3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812322 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812345 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klv6c\" (UniqueName: \"kubernetes.io/projected/72b37e7b-be66-400a-a500-d80241ee7ac3-kube-api-access-klv6c\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812357 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812365 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812374 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812382 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b37e7b-be66-400a-a500-d80241ee7ac3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812390 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/58df5777-ecc8-4eda-8a67-c58f10b6e920-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812398 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b37e7b-be66-400a-a500-d80241ee7ac3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.812406 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzjvs\" (UniqueName: \"kubernetes.io/projected/58df5777-ecc8-4eda-8a67-c58f10b6e920-kube-api-access-hzjvs\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.815649 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.861302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" event={"ID":"4e6fa0ac-e4ce-419d-a680-ce63883886a7","Type":"ContainerStarted","Data":"326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea"} Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.872085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"825bff4e-d6ba-4010-97b5-919025083000","Type":"ContainerStarted","Data":"9994bd28d85b012fa6f57829f1010eda91e909a2125204c95d0315bf89a231d0"} Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.902573 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" podStartSLOduration=2.902556861 podStartE2EDuration="2.902556861s" podCreationTimestamp="2026-01-21 16:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:54.878278029 +0000 UTC m=+3972.059794251" watchObservedRunningTime="2026-01-21 16:07:54.902556861 +0000 UTC m=+3972.084073083" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.903532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2f2511c6-b349-44ef-bd96-a121829bd0b8","Type":"ContainerStarted","Data":"e332a430c43dc7ac34fdb89aa2984f6777afd57422de00a18c8c732bfb793028"} Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.924841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" event={"ID":"2a1dd97a-3057-4f3b-9940-66d922b6c34b","Type":"ContainerDied","Data":"29b9c2fee5825529ed52beb2df215efd0e043de5b6d0623ff4d35871c1f2072a"} Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.924878 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29b9c2fee5825529ed52beb2df215efd0e043de5b6d0623ff4d35871c1f2072a" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.924898 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q25mp" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.945623 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.949006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-mq5pt" event={"ID":"72b37e7b-be66-400a-a500-d80241ee7ac3","Type":"ContainerDied","Data":"8a27066f9834ae980be5213b9ca3704f3ad9614e3c39c19e24fa77634c9f9c7c"} Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.949040 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a27066f9834ae980be5213b9ca3704f3ad9614e3c39c19e24fa77634c9f9c7c" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.950525 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.957852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q25mp"] Jan 21 16:07:54 crc kubenswrapper[4707]: I0121 16:07:54.960913 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q25mp"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034240 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerID="2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" exitCode=0 Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034270 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerID="cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" exitCode=2 Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034279 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerID="7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" exitCode=0 Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034286 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerID="1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" exitCode=0 Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerDied","Data":"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13"} Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerDied","Data":"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d"} Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerDied","Data":"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b"} Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerDied","Data":"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5"} Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.034378 4707 scope.go:117] "RemoveContainer" 
containerID="2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.057755 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-ssq68"] Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="proxy-httpd" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058138 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="proxy-httpd" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058150 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-central-agent" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058156 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-central-agent" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058168 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-notification-agent" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058185 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-notification-agent" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058194 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="sg-core" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058199 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="sg-core" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058219 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58df5777-ecc8-4eda-8a67-c58f10b6e920" containerName="neutron-db-sync" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058225 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="58df5777-ecc8-4eda-8a67-c58f10b6e920" containerName="neutron-db-sync" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058238 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1dd97a-3057-4f3b-9940-66d922b6c34b" containerName="keystone-bootstrap" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058244 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1dd97a-3057-4f3b-9940-66d922b6c34b" containerName="keystone-bootstrap" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.058253 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b37e7b-be66-400a-a500-d80241ee7ac3" containerName="cinder-db-sync" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058259 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b37e7b-be66-400a-a500-d80241ee7ac3" containerName="cinder-db-sync" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058389 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-central-agent" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058404 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b37e7b-be66-400a-a500-d80241ee7ac3" containerName="cinder-db-sync" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058417 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="ceilometer-notification-agent" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="proxy-httpd" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1dd97a-3057-4f3b-9940-66d922b6c34b" containerName="keystone-bootstrap" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058446 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="58df5777-ecc8-4eda-8a67-c58f10b6e920" containerName="neutron-db-sync" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058455 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" containerName="sg-core" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.058969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.062825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" event={"ID":"58df5777-ecc8-4eda-8a67-c58f10b6e920","Type":"ContainerDied","Data":"280d458c1a407c520bdfb674d621f79081b5e7d7bb7f725d67a005d26c4d1017"} Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.062858 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="280d458c1a407c520bdfb674d621f79081b5e7d7bb7f725d67a005d26c4d1017" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.062922 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-8mqm4" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.063122 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.063298 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rbnr" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.063480 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.074827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" event={"ID":"92bc7502-e503-4b19-b9f3-8ef8149a1e7d","Type":"ContainerStarted","Data":"174499063bbae139016787c58fd86006b2b9a43b498657447a7247aca279ac14"} Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.074862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.074874 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.074892 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.074973 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.082512 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.102724 4707 scope.go:117] "RemoveContainer" containerID="cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.110348 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.119133 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.119306 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-nhwsw" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.119481 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.121366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-sg-core-conf-yaml\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-config-data\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-combined-ca-bundle\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-log-httpd\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkx95\" (UniqueName: \"kubernetes.io/projected/1a5b8167-6a2b-41e6-b896-500d3c5401d2-kube-api-access-jkx95\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-scripts\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.122701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-run-httpd\") pod \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\" (UID: \"1a5b8167-6a2b-41e6-b896-500d3c5401d2\") " Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.123687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: 
"1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.152189 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-78d5cb5d89-rgx54"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.153064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: "1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.154525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.158227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5b8167-6a2b-41e6-b896-500d3c5401d2-kube-api-access-jkx95" (OuterVolumeSpecName: "kube-api-access-jkx95") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: "1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "kube-api-access-jkx95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.164371 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.164637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.167261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-tjf5j" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.179467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.221651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-scripts" (OuterVolumeSpecName: "scripts") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: "1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.230275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.230329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-config\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.230532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.230577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-credential-keys\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.230664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dx4\" (UniqueName: \"kubernetes.io/projected/ee652951-c0ac-42b5-a87f-0ea719db7b98-kube-api-access-q7dx4\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.231351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-config-data\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.231375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-scripts\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.231404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-scripts\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.231439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stnk7\" (UniqueName: \"kubernetes.io/projected/83157088-df54-4610-ae05-f3483eaa70b4-kube-api-access-stnk7\") pod 
\"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.231479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-fernet-keys\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.231781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-combined-ca-bundle\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee652951-c0ac-42b5-a87f-0ea719db7b98-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-combined-ca-bundle\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpctm\" (UniqueName: \"kubernetes.io/projected/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-kube-api-access-fpctm\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-httpd-config\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-ovndb-tls-certs\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241879 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241902 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a5b8167-6a2b-41e6-b896-500d3c5401d2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkx95\" (UniqueName: \"kubernetes.io/projected/1a5b8167-6a2b-41e6-b896-500d3c5401d2-kube-api-access-jkx95\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.241923 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.286480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: "1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.301986 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1dd97a-3057-4f3b-9940-66d922b6c34b" path="/var/lib/kubelet/pods/2a1dd97a-3057-4f3b-9940-66d922b6c34b/volumes" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.302965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: "1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.303069 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1274f2d-1e64-467f-8f46-c91ec758b5c0" path="/var/lib/kubelet/pods/f1274f2d-1e64-467f-8f46-c91ec758b5c0/volumes" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.309486 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.309522 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78d5cb5d89-rgx54"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.309534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-ssq68"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.309547 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.310934 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.313322 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.317646 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6f9dc896b6-x5875"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.328620 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.329913 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" podStartSLOduration=3.329901219 podStartE2EDuration="3.329901219s" podCreationTimestamp="2026-01-21 16:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:55.15204968 +0000 UTC m=+3972.333565902" watchObservedRunningTime="2026-01-21 16:07:55.329901219 +0000 UTC m=+3972.511417431" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-combined-ca-bundle\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee652951-c0ac-42b5-a87f-0ea719db7b98-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-combined-ca-bundle\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpctm\" (UniqueName: \"kubernetes.io/projected/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-kube-api-access-fpctm\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-httpd-config\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-ovndb-tls-certs\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc 
kubenswrapper[4707]: I0121 16:07:55.343738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-config\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-credential-keys\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dx4\" (UniqueName: \"kubernetes.io/projected/ee652951-c0ac-42b5-a87f-0ea719db7b98-kube-api-access-q7dx4\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-config-data\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-scripts\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-scripts\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.343987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stnk7\" (UniqueName: \"kubernetes.io/projected/83157088-df54-4610-ae05-f3483eaa70b4-kube-api-access-stnk7\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.344029 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-fernet-keys\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.344032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee652951-c0ac-42b5-a87f-0ea719db7b98-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.344050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.344117 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.344128 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.346541 4707 scope.go:117] "RemoveContainer" containerID="7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.351921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-combined-ca-bundle\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.354582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-combined-ca-bundle\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.355579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.358634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-scripts\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.358981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-fernet-keys\") pod \"keystone-bootstrap-ssq68\" (UID: 
\"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.359514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-ovndb-tls-certs\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.361437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-credential-keys\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.361532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dx4\" (UniqueName: \"kubernetes.io/projected/ee652951-c0ac-42b5-a87f-0ea719db7b98-kube-api-access-q7dx4\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.361957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-scripts\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.362880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-config\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.363474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.364489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stnk7\" (UniqueName: \"kubernetes.io/projected/83157088-df54-4610-ae05-f3483eaa70b4-kube-api-access-stnk7\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.364682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-config-data\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.365468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.366726 
4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-config-data" (OuterVolumeSpecName: "config-data") pod "1a5b8167-6a2b-41e6-b896-500d3c5401d2" (UID: "1a5b8167-6a2b-41e6-b896-500d3c5401d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.373856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpctm\" (UniqueName: \"kubernetes.io/projected/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-kube-api-access-fpctm\") pod \"keystone-bootstrap-ssq68\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.374073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-httpd-config\") pod \"neutron-78d5cb5d89-rgx54\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.382886 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.404681 4707 scope.go:117] "RemoveContainer" containerID="1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.432025 4707 scope.go:117] "RemoveContainer" containerID="2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.432510 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": container with ID starting with 2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13 not found: ID does not exist" containerID="2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.432548 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13"} err="failed to get container status \"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": rpc error: code = NotFound desc = could not find container \"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": container with ID starting with 2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13 not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.432572 4707 scope.go:117] "RemoveContainer" containerID="cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.433941 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": container with ID starting with cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d not found: ID does not exist" containerID="cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.433975 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d"} err="failed to get container status \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": rpc error: code = NotFound desc = could not find container \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": container with ID starting with cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.433997 4707 scope.go:117] "RemoveContainer" containerID="7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.434323 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": container with ID starting with 7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b not found: ID does not exist" containerID="7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434354 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b"} err="failed to get container status \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": rpc error: code = NotFound desc = could not find container \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": container with ID starting with 7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434374 4707 scope.go:117] "RemoveContainer" containerID="1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" Jan 21 16:07:55 crc kubenswrapper[4707]: E0121 16:07:55.434497 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": container with ID starting with 1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5 not found: ID does not exist" containerID="1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434518 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5"} err="failed to get container status \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": rpc error: code = NotFound desc = could not find container \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": container with ID starting with 1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5 not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434531 4707 scope.go:117] "RemoveContainer" containerID="2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434643 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13"} err="failed to get container status \"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": rpc error: code = NotFound desc = could not find container 
\"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": container with ID starting with 2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13 not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434661 4707 scope.go:117] "RemoveContainer" containerID="cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434760 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d"} err="failed to get container status \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": rpc error: code = NotFound desc = could not find container \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": container with ID starting with cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434780 4707 scope.go:117] "RemoveContainer" containerID="7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434966 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b"} err="failed to get container status \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": rpc error: code = NotFound desc = could not find container \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": container with ID starting with 7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.434985 4707 scope.go:117] "RemoveContainer" containerID="1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.435146 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5"} err="failed to get container status \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": rpc error: code = NotFound desc = could not find container \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": container with ID starting with 1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5 not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.435163 4707 scope.go:117] "RemoveContainer" containerID="2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.435973 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13"} err="failed to get container status \"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": rpc error: code = NotFound desc = could not find container \"2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13\": container with ID starting with 2a516a532cc8d4fe9cf550624bf5deb7ceb062d6d7e22405493b4de0bedb7b13 not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.436010 4707 scope.go:117] "RemoveContainer" containerID="cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.436303 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d"} err="failed to get container status \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": rpc error: code = NotFound desc = could not find container \"cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d\": container with ID starting with cc19920aa1b62357b77d937c6c493a867ed6b2e989868ed39d6d332cd2eee71d not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.436331 4707 scope.go:117] "RemoveContainer" containerID="7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.436521 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b"} err="failed to get container status \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": rpc error: code = NotFound desc = could not find container \"7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b\": container with ID starting with 7a536b73f781c59d9f149d0daa074647e4f6ee0796a7793027cbf7b1ccf3848b not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.436539 4707 scope.go:117] "RemoveContainer" containerID="1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.436932 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5"} err="failed to get container status \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": rpc error: code = NotFound desc = could not find container \"1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5\": container with ID starting with 1a6fc37e53dedf25b625b624650f24b6135aa75654a02fe6098daa29900a37f5 not found: ID does not exist" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsc2r\" (UniqueName: \"kubernetes.io/projected/bf27773b-b6c3-4de4-9e6f-f224d4711c58-kube-api-access-xsc2r\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-scripts\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf27773b-b6c3-4de4-9e6f-f224d4711c58-logs\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf27773b-b6c3-4de4-9e6f-f224d4711c58-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") 
" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.446586 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5b8167-6a2b-41e6-b896-500d3c5401d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-scripts\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf27773b-b6c3-4de4-9e6f-f224d4711c58-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf27773b-b6c3-4de4-9e6f-f224d4711c58-logs\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-combined-ca-bundle\") pod \"cinder-api-0\" 
(UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.547877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsc2r\" (UniqueName: \"kubernetes.io/projected/bf27773b-b6c3-4de4-9e6f-f224d4711c58-kube-api-access-xsc2r\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.550668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf27773b-b6c3-4de4-9e6f-f224d4711c58-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.551377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-scripts\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.551886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.553348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.557210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf27773b-b6c3-4de4-9e6f-f224d4711c58-logs\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.562779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-68f4b84bbb-5wq8w"] Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.564651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.573504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsc2r\" (UniqueName: \"kubernetes.io/projected/bf27773b-b6c3-4de4-9e6f-f224d4711c58-kube-api-access-xsc2r\") pod \"cinder-api-0\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.606404 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.661959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.693365 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:55 crc kubenswrapper[4707]: I0121 16:07:55.824357 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78d5cb5d89-rgx54"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.065140 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-ssq68"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.130468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2f2511c6-b349-44ef-bd96-a121829bd0b8","Type":"ContainerStarted","Data":"2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.130507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2f2511c6-b349-44ef-bd96-a121829bd0b8","Type":"ContainerStarted","Data":"4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.132971 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.139902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" event={"ID":"a2b79667-aae1-4cad-b7d9-52c4893cf000","Type":"ContainerStarted","Data":"0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.139942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" event={"ID":"a2b79667-aae1-4cad-b7d9-52c4893cf000","Type":"ContainerStarted","Data":"1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.139954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" event={"ID":"a2b79667-aae1-4cad-b7d9-52c4893cf000","Type":"ContainerStarted","Data":"20cb6d462022ab1a96675fa54721d6307cc7edd031840668461616bfd9770424"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.139984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.140158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.141968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1a5b8167-6a2b-41e6-b896-500d3c5401d2","Type":"ContainerDied","Data":"b7590cc600ba97c9036bbde1dca8350b4ec3e199298eede790556c55792307f0"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.142200 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.145663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" event={"ID":"83157088-df54-4610-ae05-f3483eaa70b4","Type":"ContainerStarted","Data":"de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.154936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" event={"ID":"83157088-df54-4610-ae05-f3483eaa70b4","Type":"ContainerStarted","Data":"b82dfe665add1347a13ea6106c6cdcba6bcc168bfda01dd93a66d5a187de4c89"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.153746 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.15373201 podStartE2EDuration="3.15373201s" podCreationTimestamp="2026-01-21 16:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:56.151647351 +0000 UTC m=+3973.333163573" watchObservedRunningTime="2026-01-21 16:07:56.15373201 +0000 UTC m=+3973.335248232" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.162878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" event={"ID":"3ebbb7c6-3987-4c0b-826f-10b691ca1e39","Type":"ContainerStarted","Data":"b61fcf03feb13351b8d028376c265c54d145d49154a87773bc65d349bf33430f"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.162921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" event={"ID":"3ebbb7c6-3987-4c0b-826f-10b691ca1e39","Type":"ContainerStarted","Data":"6a2974206b54c7b9a04ddf70f0fd4dfbd63679099f8248cfc05fbbcb49b95ea3"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.162932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" event={"ID":"3ebbb7c6-3987-4c0b-826f-10b691ca1e39","Type":"ContainerStarted","Data":"5555e7e7ff61cd4ecf8f90c0d6e8f78e60b9de611226fd6c5f388f89e16bec2d"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.163123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.163153 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.164045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" event={"ID":"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd","Type":"ContainerStarted","Data":"0147e25a9a4707e9ee7890d4680786fce042494a7ce39ca7bd665f5b212efe3e"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.174546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"825bff4e-d6ba-4010-97b5-919025083000","Type":"ContainerStarted","Data":"590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.174574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"825bff4e-d6ba-4010-97b5-919025083000","Type":"ContainerStarted","Data":"fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b"} Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.179851 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" podStartSLOduration=3.179836735 podStartE2EDuration="3.179836735s" podCreationTimestamp="2026-01-21 16:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:56.17312237 +0000 UTC m=+3973.354638593" watchObservedRunningTime="2026-01-21 16:07:56.179836735 +0000 UTC m=+3973.361352957" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.212217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" podStartSLOduration=2.21220041 podStartE2EDuration="2.21220041s" podCreationTimestamp="2026-01-21 16:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:56.187275112 +0000 UTC m=+3973.368791334" watchObservedRunningTime="2026-01-21 16:07:56.21220041 +0000 UTC m=+3973.393716633" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.220545 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.220532196 podStartE2EDuration="3.220532196s" podCreationTimestamp="2026-01-21 16:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:56.205635876 +0000 UTC m=+3973.387152108" watchObservedRunningTime="2026-01-21 16:07:56.220532196 +0000 UTC m=+3973.402048418" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.259761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.320555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.347906 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.355491 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.357398 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.358961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.359040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.362377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:56 crc kubenswrapper[4707]: E0121 16:07:56.409594 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a5b8167_6a2b_41e6_b896_500d3c5401d2.slice\": RecentStats: unable to find data in memory cache]" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.465650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.465904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4sss\" (UniqueName: \"kubernetes.io/projected/06d5ea70-00dc-4469-a839-6a644f076265-kube-api-access-f4sss\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.465927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.466019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-config-data\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.466107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-log-httpd\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.466254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-scripts\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.466295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-run-httpd\") pod \"ceilometer-0\" (UID: 
\"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.567791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.567846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4sss\" (UniqueName: \"kubernetes.io/projected/06d5ea70-00dc-4469-a839-6a644f076265-kube-api-access-f4sss\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.567864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.567887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-config-data\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.567922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-log-httpd\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.568015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-scripts\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.568047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-run-httpd\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.568448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-run-httpd\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.568656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-log-httpd\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.573374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.573405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.574351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-scripts\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.576591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-config-data\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.591505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4sss\" (UniqueName: \"kubernetes.io/projected/06d5ea70-00dc-4469-a839-6a644f076265-kube-api-access-f4sss\") pod \"ceilometer-0\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:56 crc kubenswrapper[4707]: I0121 16:07:56.682140 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.082625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.197619 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" podStartSLOduration=3.197604551 podStartE2EDuration="3.197604551s" podCreationTimestamp="2026-01-21 16:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:57.192302733 +0000 UTC m=+3974.373818955" watchObservedRunningTime="2026-01-21 16:07:57.197604551 +0000 UTC m=+3974.379120774" Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.198608 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5b8167-6a2b-41e6-b896-500d3c5401d2" path="/var/lib/kubelet/pods/1a5b8167-6a2b-41e6-b896-500d3c5401d2/volumes" Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.199325 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.199344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" event={"ID":"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd","Type":"ContainerStarted","Data":"71f850a8352ca042d8a8b332e9e647c7e46ebf9f2515e3ed09d30a39424ae892"} Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.199356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" 
event={"ID":"83157088-df54-4610-ae05-f3483eaa70b4","Type":"ContainerStarted","Data":"18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7"} Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.204161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ee652951-c0ac-42b5-a87f-0ea719db7b98","Type":"ContainerStarted","Data":"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9"} Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.204328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ee652951-c0ac-42b5-a87f-0ea719db7b98","Type":"ContainerStarted","Data":"c7cb259e2cc6fe520fc94f92389ab782797261e7de2276141bfc6aa4bc3ce65a"} Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.211300 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" podStartSLOduration=3.211289054 podStartE2EDuration="3.211289054s" podCreationTimestamp="2026-01-21 16:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:57.211272562 +0000 UTC m=+3974.392788785" watchObservedRunningTime="2026-01-21 16:07:57.211289054 +0000 UTC m=+3974.392805275" Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.211949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerStarted","Data":"c6f7523ff01e763d4c53b1b6d662e1fd0ca5325badb4a1b0f6b309d4ebd208b1"} Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.216389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bf27773b-b6c3-4de4-9e6f-f224d4711c58","Type":"ContainerStarted","Data":"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9"} Jan 21 16:07:57 crc kubenswrapper[4707]: I0121 16:07:57.216416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bf27773b-b6c3-4de4-9e6f-f224d4711c58","Type":"ContainerStarted","Data":"dedfd4ed5f64082ac6db94b84e766bc3902db5846a518ac4cb3a223e0d00fcd6"} Jan 21 16:07:58 crc kubenswrapper[4707]: I0121 16:07:58.182990 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:07:58 crc kubenswrapper[4707]: E0121 16:07:58.183883 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:07:58 crc kubenswrapper[4707]: I0121 16:07:58.247364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ee652951-c0ac-42b5-a87f-0ea719db7b98","Type":"ContainerStarted","Data":"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304"} Jan 21 16:07:58 crc kubenswrapper[4707]: I0121 16:07:58.260883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerStarted","Data":"8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac"} Jan 21 16:07:58 crc kubenswrapper[4707]: I0121 16:07:58.265780 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=4.2657672269999996 podStartE2EDuration="4.265767227s" podCreationTimestamp="2026-01-21 16:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:58.263025422 +0000 UTC m=+3975.444541643" watchObservedRunningTime="2026-01-21 16:07:58.265767227 +0000 UTC m=+3975.447283449" Jan 21 16:07:58 crc kubenswrapper[4707]: I0121 16:07:58.280691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bf27773b-b6c3-4de4-9e6f-f224d4711c58","Type":"ContainerStarted","Data":"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8"} Jan 21 16:07:58 crc kubenswrapper[4707]: I0121 16:07:58.308675 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.308661753 podStartE2EDuration="3.308661753s" podCreationTimestamp="2026-01-21 16:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:07:58.300003273 +0000 UTC m=+3975.481519494" watchObservedRunningTime="2026-01-21 16:07:58.308661753 +0000 UTC m=+3975.490177975" Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.289203 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" containerID="71f850a8352ca042d8a8b332e9e647c7e46ebf9f2515e3ed09d30a39424ae892" exitCode=0 Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.289289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" event={"ID":"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd","Type":"ContainerDied","Data":"71f850a8352ca042d8a8b332e9e647c7e46ebf9f2515e3ed09d30a39424ae892"} Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.292911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerStarted","Data":"e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4"} Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.292941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerStarted","Data":"0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f"} Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.293253 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.393222 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:07:59 crc kubenswrapper[4707]: I0121 16:07:59.892883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.588199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:08:00 crc 
kubenswrapper[4707]: I0121 16:08:00.615727 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.662267 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.736932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-fernet-keys\") pod \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.737011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-credential-keys\") pod \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.737100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-scripts\") pod \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.737136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-combined-ca-bundle\") pod \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.737164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpctm\" (UniqueName: \"kubernetes.io/projected/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-kube-api-access-fpctm\") pod \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.737192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-config-data\") pod \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\" (UID: \"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd\") " Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.742392 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" (UID: "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.743129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-scripts" (OuterVolumeSpecName: "scripts") pod "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" (UID: "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.743288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" (UID: "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.744486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-kube-api-access-fpctm" (OuterVolumeSpecName: "kube-api-access-fpctm") pod "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" (UID: "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd"). InnerVolumeSpecName "kube-api-access-fpctm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.758011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" (UID: "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.759269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-config-data" (OuterVolumeSpecName: "config-data") pod "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" (UID: "bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.839591 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.839697 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.839709 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.839720 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.839730 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpctm\" (UniqueName: \"kubernetes.io/projected/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-kube-api-access-fpctm\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:00 crc kubenswrapper[4707]: I0121 16:08:00.839738 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.308117 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerStarted","Data":"0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9"} Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.308485 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.310093 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.310196 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api-log" containerID="cri-o://91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9" gracePeriod=30 Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.310274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-ssq68" event={"ID":"bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd","Type":"ContainerDied","Data":"0147e25a9a4707e9ee7890d4680786fce042494a7ce39ca7bd665f5b212efe3e"} Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.310296 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0147e25a9a4707e9ee7890d4680786fce042494a7ce39ca7bd665f5b212efe3e" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.310328 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api" containerID="cri-o://6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8" gracePeriod=30 Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.336381 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.088481632 podStartE2EDuration="5.336369824s" podCreationTimestamp="2026-01-21 16:07:56 +0000 UTC" firstStartedPulling="2026-01-21 16:07:57.143549059 +0000 UTC m=+3974.325065282" lastFinishedPulling="2026-01-21 16:08:00.391437252 +0000 UTC m=+3977.572953474" observedRunningTime="2026-01-21 16:08:01.329799971 +0000 UTC m=+3978.511316194" watchObservedRunningTime="2026-01-21 16:08:01.336369824 +0000 UTC m=+3978.517886046" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.435279 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p"] Jan 21 16:08:01 crc kubenswrapper[4707]: E0121 16:08:01.435593 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" containerName="keystone-bootstrap" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.435610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" containerName="keystone-bootstrap" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.435755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" containerName="keystone-bootstrap" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.437258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.443479 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.443525 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.444563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.444606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.445180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rbnr" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.451094 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p"] Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.453043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-fernet-keys\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfl5t\" (UniqueName: \"kubernetes.io/projected/7d6ee277-b629-49a0-94da-030a39be02f5-kube-api-access-nfl5t\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548363 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-config-data\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-credential-keys\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-scripts\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.548660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-combined-ca-bundle\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.649944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfl5t\" (UniqueName: \"kubernetes.io/projected/7d6ee277-b629-49a0-94da-030a39be02f5-kube-api-access-nfl5t\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-config-data\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-credential-keys\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-scripts\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650222 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-combined-ca-bundle\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.650308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-fernet-keys\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.653608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.657119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-fernet-keys\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.657723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.662628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-credential-keys\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.663036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-config-data\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.663975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-scripts\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.666297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfl5t\" (UniqueName: \"kubernetes.io/projected/7d6ee277-b629-49a0-94da-030a39be02f5-kube-api-access-nfl5t\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.709096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-combined-ca-bundle\") pod \"keystone-6c79bb4b5d-xfj6p\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.774247 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.802125 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.852966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsc2r\" (UniqueName: \"kubernetes.io/projected/bf27773b-b6c3-4de4-9e6f-f224d4711c58-kube-api-access-xsc2r\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-scripts\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data-custom\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf27773b-b6c3-4de4-9e6f-f224d4711c58-logs\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-combined-ca-bundle\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf27773b-b6c3-4de4-9e6f-f224d4711c58-etc-machine-id\") pod \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\" (UID: \"bf27773b-b6c3-4de4-9e6f-f224d4711c58\") " Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf27773b-b6c3-4de4-9e6f-f224d4711c58-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.853637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf27773b-b6c3-4de4-9e6f-f224d4711c58-logs" (OuterVolumeSpecName: "logs") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.854046 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf27773b-b6c3-4de4-9e6f-f224d4711c58-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.854065 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf27773b-b6c3-4de4-9e6f-f224d4711c58-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.856736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-scripts" (OuterVolumeSpecName: "scripts") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.857138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf27773b-b6c3-4de4-9e6f-f224d4711c58-kube-api-access-xsc2r" (OuterVolumeSpecName: "kube-api-access-xsc2r") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "kube-api-access-xsc2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.857557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.879975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.900309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data" (OuterVolumeSpecName: "config-data") pod "bf27773b-b6c3-4de4-9e6f-f224d4711c58" (UID: "bf27773b-b6c3-4de4-9e6f-f224d4711c58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.955140 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.955178 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.955189 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.955200 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsc2r\" (UniqueName: \"kubernetes.io/projected/bf27773b-b6c3-4de4-9e6f-f224d4711c58-kube-api-access-xsc2r\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:01 crc kubenswrapper[4707]: I0121 16:08:01.955207 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf27773b-b6c3-4de4-9e6f-f224d4711c58-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.199116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p"] Jan 21 16:08:02 crc kubenswrapper[4707]: W0121 16:08:02.204669 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d6ee277_b629_49a0_94da_030a39be02f5.slice/crio-50e0c9ee3558493db0c817a196fe3806b8dee17eb4e53a84c7295e16f6adec04 WatchSource:0}: Error finding container 50e0c9ee3558493db0c817a196fe3806b8dee17eb4e53a84c7295e16f6adec04: Status 404 returned error can't find the container with id 50e0c9ee3558493db0c817a196fe3806b8dee17eb4e53a84c7295e16f6adec04 Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.317944 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerID="6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8" exitCode=0 Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.317970 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerID="91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9" exitCode=143 Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.318013 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.318024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bf27773b-b6c3-4de4-9e6f-f224d4711c58","Type":"ContainerDied","Data":"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8"} Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.318050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bf27773b-b6c3-4de4-9e6f-f224d4711c58","Type":"ContainerDied","Data":"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9"} Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.318060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bf27773b-b6c3-4de4-9e6f-f224d4711c58","Type":"ContainerDied","Data":"dedfd4ed5f64082ac6db94b84e766bc3902db5846a518ac4cb3a223e0d00fcd6"} Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.318074 4707 scope.go:117] "RemoveContainer" containerID="6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.319559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" event={"ID":"7d6ee277-b629-49a0-94da-030a39be02f5","Type":"ContainerStarted","Data":"50e0c9ee3558493db0c817a196fe3806b8dee17eb4e53a84c7295e16f6adec04"} Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.340560 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.351385 4707 scope.go:117] "RemoveContainer" containerID="91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.354669 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.370258 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:08:02 crc kubenswrapper[4707]: E0121 16:08:02.370600 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api-log" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.370618 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api-log" Jan 21 16:08:02 crc kubenswrapper[4707]: E0121 16:08:02.370649 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.370656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.370845 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api-log" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.370876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" containerName="cinder-api" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.371659 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.375086 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.375321 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.375325 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.378801 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.390521 4707 scope.go:117] "RemoveContainer" containerID="6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8" Jan 21 16:08:02 crc kubenswrapper[4707]: E0121 16:08:02.391265 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8\": container with ID starting with 6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8 not found: ID does not exist" containerID="6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.391300 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8"} err="failed to get container status \"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8\": rpc error: code = NotFound desc = could not find container \"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8\": container with ID starting with 6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8 not found: ID does not exist" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.391320 4707 scope.go:117] "RemoveContainer" containerID="91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9" Jan 21 16:08:02 crc kubenswrapper[4707]: E0121 16:08:02.392004 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9\": container with ID starting with 91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9 not found: ID does not exist" containerID="91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.392032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9"} err="failed to get container status \"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9\": rpc error: code = NotFound desc = could not find container \"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9\": container with ID starting with 91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9 not found: ID does not exist" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.392048 4707 scope.go:117] "RemoveContainer" containerID="6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.392515 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8"} err="failed to get container status \"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8\": rpc error: code = NotFound desc = could not find container \"6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8\": container with ID starting with 6e05ef4b9d66635fe16bb9e0afcc3adc99ac00a613974419e105779734530bb8 not found: ID does not exist" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.392534 4707 scope.go:117] "RemoveContainer" containerID="91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.392753 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9"} err="failed to get container status \"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9\": rpc error: code = NotFound desc = could not find container \"91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9\": container with ID starting with 91a75bc735b79c84e582b18cabf8a100cd1fa0b0658f8fd3cdad3a2a381dc4b9 not found: ID does not exist" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nhg\" (UniqueName: \"kubernetes.io/projected/b4fbc761-3cab-4cee-b4f5-59a2306a3650-kube-api-access-t8nhg\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data-custom\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-scripts\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4fbc761-3cab-4cee-b4f5-59a2306a3650-logs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.463878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4fbc761-3cab-4cee-b4f5-59a2306a3650-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.564835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-scripts\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.564881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.564906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.564960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4fbc761-3cab-4cee-b4f5-59a2306a3650-logs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.565018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4fbc761-3cab-4cee-b4f5-59a2306a3650-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.565084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.565129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nhg\" 
(UniqueName: \"kubernetes.io/projected/b4fbc761-3cab-4cee-b4f5-59a2306a3650-kube-api-access-t8nhg\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.565164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data-custom\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.565178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.565613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4fbc761-3cab-4cee-b4f5-59a2306a3650-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.566104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4fbc761-3cab-4cee-b4f5-59a2306a3650-logs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.571361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-scripts\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.571434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.571903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.572429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data-custom\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.576761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.578392 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.578619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nhg\" (UniqueName: \"kubernetes.io/projected/b4fbc761-3cab-4cee-b4f5-59a2306a3650-kube-api-access-t8nhg\") pod \"cinder-api-0\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:02 crc kubenswrapper[4707]: I0121 16:08:02.744656 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.111248 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:08:03 crc kubenswrapper[4707]: W0121 16:08:03.112605 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4fbc761_3cab_4cee_b4f5_59a2306a3650.slice/crio-72cb3a1e0a541104f4491c324ff2e8d835f65bd589656fd31f4d4ffea57b8749 WatchSource:0}: Error finding container 72cb3a1e0a541104f4491c324ff2e8d835f65bd589656fd31f4d4ffea57b8749: Status 404 returned error can't find the container with id 72cb3a1e0a541104f4491c324ff2e8d835f65bd589656fd31f4d4ffea57b8749 Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.191655 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf27773b-b6c3-4de4-9e6f-f224d4711c58" path="/var/lib/kubelet/pods/bf27773b-b6c3-4de4-9e6f-f224d4711c58/volumes" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.330071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" event={"ID":"7d6ee277-b629-49a0-94da-030a39be02f5","Type":"ContainerStarted","Data":"a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c"} Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.330175 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.332105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b4fbc761-3cab-4cee-b4f5-59a2306a3650","Type":"ContainerStarted","Data":"72cb3a1e0a541104f4491c324ff2e8d835f65bd589656fd31f4d4ffea57b8749"} Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.348007 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" podStartSLOduration=2.347993895 podStartE2EDuration="2.347993895s" podCreationTimestamp="2026-01-21 16:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:03.342379088 +0000 UTC m=+3980.523895310" watchObservedRunningTime="2026-01-21 16:08:03.347993895 +0000 UTC m=+3980.529510117" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.560346 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.560784 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.564667 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.564859 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.583541 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.589455 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.593696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:03 crc kubenswrapper[4707]: I0121 16:08:03.594340 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.058240 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6799769478-qpd8z"] Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.060381 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.062666 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.066356 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.069960 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6799769478-qpd8z"] Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb472\" (UniqueName: \"kubernetes.io/projected/88131bb3-fc80-40c1-a068-ae5dfec982f9-kube-api-access-rb472\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-combined-ca-bundle\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-internal-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-ovndb-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-httpd-config\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-config\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.092476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-public-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.193767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-httpd-config\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.193805 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-config\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.193855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-public-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.193936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb472\" (UniqueName: \"kubernetes.io/projected/88131bb3-fc80-40c1-a068-ae5dfec982f9-kube-api-access-rb472\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.193978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-combined-ca-bundle\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.194033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-internal-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.194087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-ovndb-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.197835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-config\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.198203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-ovndb-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.198755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-combined-ca-bundle\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.200009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-httpd-config\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.202727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-internal-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.202783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-public-tls-certs\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.211358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb472\" (UniqueName: \"kubernetes.io/projected/88131bb3-fc80-40c1-a068-ae5dfec982f9-kube-api-access-rb472\") pod \"neutron-6799769478-qpd8z\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.340752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" 
event={"ID":"b4fbc761-3cab-4cee-b4f5-59a2306a3650","Type":"ContainerStarted","Data":"fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa"} Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.340798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b4fbc761-3cab-4cee-b4f5-59a2306a3650","Type":"ContainerStarted","Data":"f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d"} Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.341982 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.342024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.342035 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.342043 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.361295 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.361281107 podStartE2EDuration="2.361281107s" podCreationTimestamp="2026-01-21 16:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:04.354833363 +0000 UTC m=+3981.536349585" watchObservedRunningTime="2026-01-21 16:08:04.361281107 +0000 UTC m=+3981.542797330" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.374775 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:04 crc kubenswrapper[4707]: I0121 16:08:04.772974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6799769478-qpd8z"] Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.348964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" event={"ID":"88131bb3-fc80-40c1-a068-ae5dfec982f9","Type":"ContainerStarted","Data":"e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7"} Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.349634 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.349651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" event={"ID":"88131bb3-fc80-40c1-a068-ae5dfec982f9","Type":"ContainerStarted","Data":"dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822"} Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.349661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" event={"ID":"88131bb3-fc80-40c1-a068-ae5dfec982f9","Type":"ContainerStarted","Data":"4ec2dd7ee49214f084a6ef339c999bdd54b7d49bcca7c7f133d9e4ec0332f11e"} Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.349771 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.364580 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" podStartSLOduration=1.364567615 podStartE2EDuration="1.364567615s" podCreationTimestamp="2026-01-21 16:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:05.360697036 +0000 UTC m=+3982.542213248" watchObservedRunningTime="2026-01-21 16:08:05.364567615 +0000 UTC m=+3982.546083826" Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.831955 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.884224 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.986333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:05 crc kubenswrapper[4707]: I0121 16:08:05.988560 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.007533 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.007580 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.355900 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="cinder-scheduler" 
containerID="cri-o://16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9" gracePeriod=30 Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.355979 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="probe" containerID="cri-o://64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304" gracePeriod=30 Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.596989 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj"] Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.598953 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.601948 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.602837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.609692 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj"] Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.636706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc2h6\" (UniqueName: \"kubernetes.io/projected/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-kube-api-access-fc2h6\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.636761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-public-tls-certs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.636875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-logs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.636933 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data-custom\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.636964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-internal-tls-certs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.637004 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-combined-ca-bundle\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.637096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-logs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738328 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data-custom\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-internal-tls-certs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-combined-ca-bundle\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc2h6\" (UniqueName: \"kubernetes.io/projected/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-kube-api-access-fc2h6\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-public-tls-certs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " 
pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.738700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-logs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.743969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.744345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-internal-tls-certs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.744888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data-custom\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.749241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-public-tls-certs\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.755353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-combined-ca-bundle\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.755358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc2h6\" (UniqueName: \"kubernetes.io/projected/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-kube-api-access-fc2h6\") pod \"barbican-api-778bdc54f8-tnxrj\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:06 crc kubenswrapper[4707]: I0121 16:08:06.918607 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.302108 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.336464 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj"] Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data-custom\") pod \"ee652951-c0ac-42b5-a87f-0ea719db7b98\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dx4\" (UniqueName: \"kubernetes.io/projected/ee652951-c0ac-42b5-a87f-0ea719db7b98-kube-api-access-q7dx4\") pod \"ee652951-c0ac-42b5-a87f-0ea719db7b98\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee652951-c0ac-42b5-a87f-0ea719db7b98-etc-machine-id\") pod \"ee652951-c0ac-42b5-a87f-0ea719db7b98\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-combined-ca-bundle\") pod \"ee652951-c0ac-42b5-a87f-0ea719db7b98\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee652951-c0ac-42b5-a87f-0ea719db7b98-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ee652951-c0ac-42b5-a87f-0ea719db7b98" (UID: "ee652951-c0ac-42b5-a87f-0ea719db7b98"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data\") pod \"ee652951-c0ac-42b5-a87f-0ea719db7b98\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.353948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-scripts\") pod \"ee652951-c0ac-42b5-a87f-0ea719db7b98\" (UID: \"ee652951-c0ac-42b5-a87f-0ea719db7b98\") " Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.354279 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee652951-c0ac-42b5-a87f-0ea719db7b98-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.359036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee652951-c0ac-42b5-a87f-0ea719db7b98" (UID: "ee652951-c0ac-42b5-a87f-0ea719db7b98"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.359055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-scripts" (OuterVolumeSpecName: "scripts") pod "ee652951-c0ac-42b5-a87f-0ea719db7b98" (UID: "ee652951-c0ac-42b5-a87f-0ea719db7b98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.359059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee652951-c0ac-42b5-a87f-0ea719db7b98-kube-api-access-q7dx4" (OuterVolumeSpecName: "kube-api-access-q7dx4") pod "ee652951-c0ac-42b5-a87f-0ea719db7b98" (UID: "ee652951-c0ac-42b5-a87f-0ea719db7b98"). InnerVolumeSpecName "kube-api-access-q7dx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.365203 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerID="64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304" exitCode=0 Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.365247 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerID="16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9" exitCode=0 Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.365355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ee652951-c0ac-42b5-a87f-0ea719db7b98","Type":"ContainerDied","Data":"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304"} Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.365485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ee652951-c0ac-42b5-a87f-0ea719db7b98","Type":"ContainerDied","Data":"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9"} Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.365578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"ee652951-c0ac-42b5-a87f-0ea719db7b98","Type":"ContainerDied","Data":"c7cb259e2cc6fe520fc94f92389ab782797261e7de2276141bfc6aa4bc3ce65a"} Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.365558 4707 scope.go:117] "RemoveContainer" containerID="64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.366113 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.366373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" event={"ID":"6794a1f0-de70-4b87-8f05-1f6c657b3d5e","Type":"ContainerStarted","Data":"734ecfc73708a430a5a7a653e92565be67825c819643eaeccbf2f9ba156f5079"} Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.379794 4707 scope.go:117] "RemoveContainer" containerID="16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.388972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee652951-c0ac-42b5-a87f-0ea719db7b98" (UID: "ee652951-c0ac-42b5-a87f-0ea719db7b98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.396076 4707 scope.go:117] "RemoveContainer" containerID="64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304" Jan 21 16:08:07 crc kubenswrapper[4707]: E0121 16:08:07.396355 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304\": container with ID starting with 64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304 not found: ID does not exist" containerID="64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.396390 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304"} err="failed to get container status \"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304\": rpc error: code = NotFound desc = could not find container \"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304\": container with ID starting with 64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304 not found: ID does not exist" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.396410 4707 scope.go:117] "RemoveContainer" containerID="16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9" Jan 21 16:08:07 crc kubenswrapper[4707]: E0121 16:08:07.396630 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9\": container with ID starting with 16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9 not found: ID does not exist" containerID="16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.396653 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9"} err="failed to get container status \"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9\": rpc error: code = NotFound desc = could not find container \"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9\": container with ID starting with 16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9 not found: ID does not exist" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 
16:08:07.396666 4707 scope.go:117] "RemoveContainer" containerID="64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.396853 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304"} err="failed to get container status \"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304\": rpc error: code = NotFound desc = could not find container \"64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304\": container with ID starting with 64c99e15d5921e7095e3c1c8710d47f9f15ab55febecc9e72854ffea839ac304 not found: ID does not exist" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.396877 4707 scope.go:117] "RemoveContainer" containerID="16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.397213 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9"} err="failed to get container status \"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9\": rpc error: code = NotFound desc = could not find container \"16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9\": container with ID starting with 16415a9017edceca21062388ce669ac8c2f98ef9a64557cf92b8deb4fc00d9d9 not found: ID does not exist" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.420973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data" (OuterVolumeSpecName: "config-data") pod "ee652951-c0ac-42b5-a87f-0ea719db7b98" (UID: "ee652951-c0ac-42b5-a87f-0ea719db7b98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.455754 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.455778 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.455795 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7dx4\" (UniqueName: \"kubernetes.io/projected/ee652951-c0ac-42b5-a87f-0ea719db7b98-kube-api-access-q7dx4\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.455833 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.455844 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee652951-c0ac-42b5-a87f-0ea719db7b98-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.693895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.699899 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.707748 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:08:07 crc kubenswrapper[4707]: E0121 16:08:07.708100 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="probe" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.708120 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="probe" Jan 21 16:08:07 crc kubenswrapper[4707]: E0121 16:08:07.708138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="cinder-scheduler" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.708144 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="cinder-scheduler" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.708362 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="cinder-scheduler" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.708386 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" containerName="probe" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.709204 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.710496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.716408 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.759597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.759644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.759666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.759860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8p42\" (UniqueName: \"kubernetes.io/projected/5a6deb2f-fd3d-494e-8d58-c6da07de2498-kube-api-access-q8p42\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.759907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6deb2f-fd3d-494e-8d58-c6da07de2498-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.759939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8p42\" (UniqueName: \"kubernetes.io/projected/5a6deb2f-fd3d-494e-8d58-c6da07de2498-kube-api-access-q8p42\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6deb2f-fd3d-494e-8d58-c6da07de2498-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.861489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6deb2f-fd3d-494e-8d58-c6da07de2498-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.865773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.866225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.866260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.866483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:07 crc kubenswrapper[4707]: I0121 16:08:07.874788 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q8p42\" (UniqueName: \"kubernetes.io/projected/5a6deb2f-fd3d-494e-8d58-c6da07de2498-kube-api-access-q8p42\") pod \"cinder-scheduler-0\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.022508 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.379637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" event={"ID":"6794a1f0-de70-4b87-8f05-1f6c657b3d5e","Type":"ContainerStarted","Data":"fd8230b429f5a36de29bc3f49e9e63eee6ddbd339099d3d326be9dd24261d9a6"} Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.379892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" event={"ID":"6794a1f0-de70-4b87-8f05-1f6c657b3d5e","Type":"ContainerStarted","Data":"4415cfbe061679e58ce159e4806960e2e4da012d85c65836e8d01c2df9bdd7cc"} Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.379946 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.379958 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.392705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:08:08 crc kubenswrapper[4707]: I0121 16:08:08.396427 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" podStartSLOduration=2.396415771 podStartE2EDuration="2.396415771s" podCreationTimestamp="2026-01-21 16:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:08.393586541 +0000 UTC m=+3985.575102763" watchObservedRunningTime="2026-01-21 16:08:08.396415771 +0000 UTC m=+3985.577931993" Jan 21 16:08:09 crc kubenswrapper[4707]: I0121 16:08:09.183990 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:08:09 crc kubenswrapper[4707]: E0121 16:08:09.184327 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:08:09 crc kubenswrapper[4707]: I0121 16:08:09.191083 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee652951-c0ac-42b5-a87f-0ea719db7b98" path="/var/lib/kubelet/pods/ee652951-c0ac-42b5-a87f-0ea719db7b98/volumes" Jan 21 16:08:09 crc kubenswrapper[4707]: I0121 16:08:09.387911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5a6deb2f-fd3d-494e-8d58-c6da07de2498","Type":"ContainerStarted","Data":"6da93903072338fa09723fab8b4ae763bcee0d8308c91e3e562a7090cddaa6a2"} Jan 21 16:08:09 crc kubenswrapper[4707]: I0121 
16:08:09.388111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5a6deb2f-fd3d-494e-8d58-c6da07de2498","Type":"ContainerStarted","Data":"b61d287158b2568660f871fafa7221b083307857176a321166ff5c5c0acca885"} Jan 21 16:08:09 crc kubenswrapper[4707]: I0121 16:08:09.388124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5a6deb2f-fd3d-494e-8d58-c6da07de2498","Type":"ContainerStarted","Data":"a852b1ab4b43c2fdb82060e07cce70bd140df65d53843b20e0bb4bba7817b83e"} Jan 21 16:08:13 crc kubenswrapper[4707]: I0121 16:08:13.023138 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:13 crc kubenswrapper[4707]: I0121 16:08:13.179105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:08:13 crc kubenswrapper[4707]: I0121 16:08:13.206443 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=6.206427076 podStartE2EDuration="6.206427076s" podCreationTimestamp="2026-01-21 16:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:09.403614483 +0000 UTC m=+3986.585130705" watchObservedRunningTime="2026-01-21 16:08:13.206427076 +0000 UTC m=+3990.387943298" Jan 21 16:08:14 crc kubenswrapper[4707]: I0121 16:08:14.299130 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.071505 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.092988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.127394 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl"] Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.127586 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api-log" containerID="cri-o://a7801d9d5ee47a2f0d9786a9cb40df520effb2580cbdb96f216bddab5e9295b7" gracePeriod=30 Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.128176 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api" containerID="cri-o://174499063bbae139016787c58fd86006b2b9a43b498657447a7247aca279ac14" gracePeriod=30 Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.142730 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.155:9311/healthcheck\": EOF" Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.455053 4707 generic.go:334] "Generic (PLEG): container finished" podID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" 
containerID="a7801d9d5ee47a2f0d9786a9cb40df520effb2580cbdb96f216bddab5e9295b7" exitCode=143 Jan 21 16:08:18 crc kubenswrapper[4707]: I0121 16:08:18.455774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" event={"ID":"92bc7502-e503-4b19-b9f3-8ef8149a1e7d","Type":"ContainerDied","Data":"a7801d9d5ee47a2f0d9786a9cb40df520effb2580cbdb96f216bddab5e9295b7"} Jan 21 16:08:20 crc kubenswrapper[4707]: I0121 16:08:20.182931 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:08:20 crc kubenswrapper[4707]: E0121 16:08:20.183871 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.478182 4707 generic.go:334] "Generic (PLEG): container finished" podID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerID="174499063bbae139016787c58fd86006b2b9a43b498657447a7247aca279ac14" exitCode=0 Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.478360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" event={"ID":"92bc7502-e503-4b19-b9f3-8ef8149a1e7d","Type":"ContainerDied","Data":"174499063bbae139016787c58fd86006b2b9a43b498657447a7247aca279ac14"} Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.621194 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.771352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5cc\" (UniqueName: \"kubernetes.io/projected/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-kube-api-access-kv5cc\") pod \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.771453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data\") pod \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.771500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-logs\") pod \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.771546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-combined-ca-bundle\") pod \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.771611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data-custom\") pod \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\" (UID: \"92bc7502-e503-4b19-b9f3-8ef8149a1e7d\") " Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.772106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-logs" (OuterVolumeSpecName: "logs") pod "92bc7502-e503-4b19-b9f3-8ef8149a1e7d" (UID: "92bc7502-e503-4b19-b9f3-8ef8149a1e7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.772214 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.775979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92bc7502-e503-4b19-b9f3-8ef8149a1e7d" (UID: "92bc7502-e503-4b19-b9f3-8ef8149a1e7d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.776019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-kube-api-access-kv5cc" (OuterVolumeSpecName: "kube-api-access-kv5cc") pod "92bc7502-e503-4b19-b9f3-8ef8149a1e7d" (UID: "92bc7502-e503-4b19-b9f3-8ef8149a1e7d"). InnerVolumeSpecName "kube-api-access-kv5cc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.790434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92bc7502-e503-4b19-b9f3-8ef8149a1e7d" (UID: "92bc7502-e503-4b19-b9f3-8ef8149a1e7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.804951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data" (OuterVolumeSpecName: "config-data") pod "92bc7502-e503-4b19-b9f3-8ef8149a1e7d" (UID: "92bc7502-e503-4b19-b9f3-8ef8149a1e7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.873654 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv5cc\" (UniqueName: \"kubernetes.io/projected/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-kube-api-access-kv5cc\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.873685 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.873696 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:21 crc kubenswrapper[4707]: I0121 16:08:21.873705 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92bc7502-e503-4b19-b9f3-8ef8149a1e7d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:22 crc kubenswrapper[4707]: I0121 16:08:22.486973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" event={"ID":"92bc7502-e503-4b19-b9f3-8ef8149a1e7d","Type":"ContainerDied","Data":"9813424d91373477bd684a56e1522f7a27a94e184c5fc1e47102c6414a80cd4b"} Jan 21 16:08:22 crc kubenswrapper[4707]: I0121 16:08:22.487028 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl" Jan 21 16:08:22 crc kubenswrapper[4707]: I0121 16:08:22.487641 4707 scope.go:117] "RemoveContainer" containerID="174499063bbae139016787c58fd86006b2b9a43b498657447a7247aca279ac14" Jan 21 16:08:22 crc kubenswrapper[4707]: I0121 16:08:22.505946 4707 scope.go:117] "RemoveContainer" containerID="a7801d9d5ee47a2f0d9786a9cb40df520effb2580cbdb96f216bddab5e9295b7" Jan 21 16:08:22 crc kubenswrapper[4707]: I0121 16:08:22.515876 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl"] Jan 21 16:08:22 crc kubenswrapper[4707]: I0121 16:08:22.522792 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-76464b79f6-l8nvl"] Jan 21 16:08:23 crc kubenswrapper[4707]: I0121 16:08:23.191663 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" path="/var/lib/kubelet/pods/92bc7502-e503-4b19-b9f3-8ef8149a1e7d/volumes" Jan 21 16:08:25 crc kubenswrapper[4707]: I0121 16:08:25.131386 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:08:25 crc kubenswrapper[4707]: I0121 16:08:25.132613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:08:25 crc kubenswrapper[4707]: I0121 16:08:25.390870 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:08:25 crc kubenswrapper[4707]: I0121 16:08:25.750582 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:08:25 crc kubenswrapper[4707]: I0121 16:08:25.751805 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:08:25 crc kubenswrapper[4707]: I0121 16:08:25.801985 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6f9dc896b6-x5875"] Jan 21 16:08:26 crc kubenswrapper[4707]: I0121 16:08:26.515673 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-log" containerID="cri-o://1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47" gracePeriod=30 Jan 21 16:08:26 crc kubenswrapper[4707]: I0121 16:08:26.515692 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-api" containerID="cri-o://0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6" gracePeriod=30 Jan 21 16:08:26 crc kubenswrapper[4707]: I0121 16:08:26.686946 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:27 crc kubenswrapper[4707]: I0121 16:08:27.523296 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerID="1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47" exitCode=143 Jan 21 16:08:27 crc kubenswrapper[4707]: I0121 16:08:27.523381 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" 
event={"ID":"a2b79667-aae1-4cad-b7d9-52c4893cf000","Type":"ContainerDied","Data":"1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47"} Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.877596 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.988407 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-config-data\") pod \"a2b79667-aae1-4cad-b7d9-52c4893cf000\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.988452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-combined-ca-bundle\") pod \"a2b79667-aae1-4cad-b7d9-52c4893cf000\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.988501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-scripts\") pod \"a2b79667-aae1-4cad-b7d9-52c4893cf000\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.988557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2b79667-aae1-4cad-b7d9-52c4893cf000-logs\") pod \"a2b79667-aae1-4cad-b7d9-52c4893cf000\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.988603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfkd9\" (UniqueName: \"kubernetes.io/projected/a2b79667-aae1-4cad-b7d9-52c4893cf000-kube-api-access-cfkd9\") pod \"a2b79667-aae1-4cad-b7d9-52c4893cf000\" (UID: \"a2b79667-aae1-4cad-b7d9-52c4893cf000\") " Jan 21 16:08:29 crc kubenswrapper[4707]: I0121 16:08:29.989387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b79667-aae1-4cad-b7d9-52c4893cf000-logs" (OuterVolumeSpecName: "logs") pod "a2b79667-aae1-4cad-b7d9-52c4893cf000" (UID: "a2b79667-aae1-4cad-b7d9-52c4893cf000"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.005274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b79667-aae1-4cad-b7d9-52c4893cf000-kube-api-access-cfkd9" (OuterVolumeSpecName: "kube-api-access-cfkd9") pod "a2b79667-aae1-4cad-b7d9-52c4893cf000" (UID: "a2b79667-aae1-4cad-b7d9-52c4893cf000"). InnerVolumeSpecName "kube-api-access-cfkd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.005492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-scripts" (OuterVolumeSpecName: "scripts") pod "a2b79667-aae1-4cad-b7d9-52c4893cf000" (UID: "a2b79667-aae1-4cad-b7d9-52c4893cf000"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.020742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-config-data" (OuterVolumeSpecName: "config-data") pod "a2b79667-aae1-4cad-b7d9-52c4893cf000" (UID: "a2b79667-aae1-4cad-b7d9-52c4893cf000"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.022324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2b79667-aae1-4cad-b7d9-52c4893cf000" (UID: "a2b79667-aae1-4cad-b7d9-52c4893cf000"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.090979 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.091168 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.091181 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b79667-aae1-4cad-b7d9-52c4893cf000-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.091190 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2b79667-aae1-4cad-b7d9-52c4893cf000-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.091199 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfkd9\" (UniqueName: \"kubernetes.io/projected/a2b79667-aae1-4cad-b7d9-52c4893cf000-kube-api-access-cfkd9\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.542927 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerID="0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6" exitCode=0 Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.542965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" event={"ID":"a2b79667-aae1-4cad-b7d9-52c4893cf000","Type":"ContainerDied","Data":"0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6"} Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.542989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" event={"ID":"a2b79667-aae1-4cad-b7d9-52c4893cf000","Type":"ContainerDied","Data":"20cb6d462022ab1a96675fa54721d6307cc7edd031840668461616bfd9770424"} Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.542995 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6f9dc896b6-x5875" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.543007 4707 scope.go:117] "RemoveContainer" containerID="0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.559986 4707 scope.go:117] "RemoveContainer" containerID="1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.569058 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6f9dc896b6-x5875"] Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.574688 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6f9dc896b6-x5875"] Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.575080 4707 scope.go:117] "RemoveContainer" containerID="0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6" Jan 21 16:08:30 crc kubenswrapper[4707]: E0121 16:08:30.575368 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6\": container with ID starting with 0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6 not found: ID does not exist" containerID="0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.575396 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6"} err="failed to get container status \"0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6\": rpc error: code = NotFound desc = could not find container \"0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6\": container with ID starting with 0f9fb21ce044709315ab4b6d44f4a9fa77c060d1d2d79919754c3870041231b6 not found: ID does not exist" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.575416 4707 scope.go:117] "RemoveContainer" containerID="1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47" Jan 21 16:08:30 crc kubenswrapper[4707]: E0121 16:08:30.575789 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47\": container with ID starting with 1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47 not found: ID does not exist" containerID="1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47" Jan 21 16:08:30 crc kubenswrapper[4707]: I0121 16:08:30.575835 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47"} err="failed to get container status \"1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47\": rpc error: code = NotFound desc = could not find container \"1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47\": container with ID starting with 1de58dbb12c8fbaf47a5528acbfaa3d38fc438b3b3fc05e8fcf11a4118d68b47 not found: ID does not exist" Jan 21 16:08:31 crc kubenswrapper[4707]: I0121 16:08:31.189541 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" path="/var/lib/kubelet/pods/a2b79667-aae1-4cad-b7d9-52c4893cf000/volumes" Jan 21 16:08:33 crc kubenswrapper[4707]: 
I0121 16:08:33.071557 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:08:33 crc kubenswrapper[4707]: I0121 16:08:33.186717 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:08:33 crc kubenswrapper[4707]: E0121 16:08:33.186947 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:08:34 crc kubenswrapper[4707]: I0121 16:08:34.384274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:08:34 crc kubenswrapper[4707]: I0121 16:08:34.422978 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78d5cb5d89-rgx54"] Jan 21 16:08:34 crc kubenswrapper[4707]: I0121 16:08:34.423174 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-api" containerID="cri-o://de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b" gracePeriod=30 Jan 21 16:08:34 crc kubenswrapper[4707]: I0121 16:08:34.423502 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-httpd" containerID="cri-o://18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7" gracePeriod=30 Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.575166 4707 generic.go:334] "Generic (PLEG): container finished" podID="83157088-df54-4610-ae05-f3483eaa70b4" containerID="18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7" exitCode=0 Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.575366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" event={"ID":"83157088-df54-4610-ae05-f3483eaa70b4","Type":"ContainerDied","Data":"18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7"} Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.830651 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.830888 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-central-agent" containerID="cri-o://8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac" gracePeriod=30 Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.831013 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="proxy-httpd" containerID="cri-o://0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9" gracePeriod=30 Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.831101 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="sg-core" containerID="cri-o://e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4" gracePeriod=30 Jan 21 16:08:35 crc kubenswrapper[4707]: I0121 16:08:35.831028 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-notification-agent" containerID="cri-o://0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f" gracePeriod=30 Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.507230 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.585241 4707 generic.go:334] "Generic (PLEG): container finished" podID="06d5ea70-00dc-4469-a839-6a644f076265" containerID="0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9" exitCode=0 Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.585267 4707 generic.go:334] "Generic (PLEG): container finished" podID="06d5ea70-00dc-4469-a839-6a644f076265" containerID="e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4" exitCode=2 Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.585275 4707 generic.go:334] "Generic (PLEG): container finished" podID="06d5ea70-00dc-4469-a839-6a644f076265" containerID="8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac" exitCode=0 Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.585306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerDied","Data":"0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9"} Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.585337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerDied","Data":"e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4"} Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.585347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerDied","Data":"8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac"} Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.586996 4707 generic.go:334] "Generic (PLEG): container finished" podID="83157088-df54-4610-ae05-f3483eaa70b4" containerID="de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b" exitCode=0 Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.587032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" event={"ID":"83157088-df54-4610-ae05-f3483eaa70b4","Type":"ContainerDied","Data":"de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b"} Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.587055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" event={"ID":"83157088-df54-4610-ae05-f3483eaa70b4","Type":"ContainerDied","Data":"b82dfe665add1347a13ea6106c6cdcba6bcc168bfda01dd93a66d5a187de4c89"} Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.587072 4707 scope.go:117] "RemoveContainer" containerID="18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7" Jan 
21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.587034 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78d5cb5d89-rgx54" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.605030 4707 scope.go:117] "RemoveContainer" containerID="de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.620145 4707 scope.go:117] "RemoveContainer" containerID="18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7" Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.620421 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7\": container with ID starting with 18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7 not found: ID does not exist" containerID="18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.620448 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7"} err="failed to get container status \"18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7\": rpc error: code = NotFound desc = could not find container \"18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7\": container with ID starting with 18d1bb70f4951046d2a3e63adc0a88cdac2dd43112b8b6987aca17c26a3954a7 not found: ID does not exist" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.620463 4707 scope.go:117] "RemoveContainer" containerID="de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b" Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.620777 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b\": container with ID starting with de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b not found: ID does not exist" containerID="de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.620805 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b"} err="failed to get container status \"de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b\": rpc error: code = NotFound desc = could not find container \"de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b\": container with ID starting with de4cee29de8531292c4e03a77830b56a1d31e3f1438ab5b897809c3941a6077b not found: ID does not exist" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.698190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-httpd-config\") pod \"83157088-df54-4610-ae05-f3483eaa70b4\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.698267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stnk7\" (UniqueName: \"kubernetes.io/projected/83157088-df54-4610-ae05-f3483eaa70b4-kube-api-access-stnk7\") pod \"83157088-df54-4610-ae05-f3483eaa70b4\" (UID: 
\"83157088-df54-4610-ae05-f3483eaa70b4\") " Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.698315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-config\") pod \"83157088-df54-4610-ae05-f3483eaa70b4\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.698355 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-combined-ca-bundle\") pod \"83157088-df54-4610-ae05-f3483eaa70b4\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.698378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-ovndb-tls-certs\") pod \"83157088-df54-4610-ae05-f3483eaa70b4\" (UID: \"83157088-df54-4610-ae05-f3483eaa70b4\") " Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.703851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83157088-df54-4610-ae05-f3483eaa70b4-kube-api-access-stnk7" (OuterVolumeSpecName: "kube-api-access-stnk7") pod "83157088-df54-4610-ae05-f3483eaa70b4" (UID: "83157088-df54-4610-ae05-f3483eaa70b4"). InnerVolumeSpecName "kube-api-access-stnk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.707448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "83157088-df54-4610-ae05-f3483eaa70b4" (UID: "83157088-df54-4610-ae05-f3483eaa70b4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.736120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-config" (OuterVolumeSpecName: "config") pod "83157088-df54-4610-ae05-f3483eaa70b4" (UID: "83157088-df54-4610-ae05-f3483eaa70b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.738314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83157088-df54-4610-ae05-f3483eaa70b4" (UID: "83157088-df54-4610-ae05-f3483eaa70b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.751320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "83157088-df54-4610-ae05-f3483eaa70b4" (UID: "83157088-df54-4610-ae05-f3483eaa70b4"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.800515 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stnk7\" (UniqueName: \"kubernetes.io/projected/83157088-df54-4610-ae05-f3483eaa70b4-kube-api-access-stnk7\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.800547 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.800559 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.800567 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.800576 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83157088-df54-4610-ae05-f3483eaa70b4-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.852453 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.852894 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-httpd" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.852911 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-httpd" Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.852923 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api-log" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.852928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api-log" Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.852940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.852946 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api" Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.852954 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-log" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.852959 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-log" Jan 21 16:08:36 crc kubenswrapper[4707]: E0121 16:08:36.852974 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.852985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-api" Jan 21 16:08:36 crc kubenswrapper[4707]: 
E0121 16:08:36.852996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853144 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-log" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853165 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853174 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b79667-aae1-4cad-b7d9-52c4893cf000" containerName="placement-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853187 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853202 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="83157088-df54-4610-ae05-f3483eaa70b4" containerName="neutron-httpd" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853209 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bc7502-e503-4b19-b9f3-8ef8149a1e7d" containerName="barbican-api-log" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.853822 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.855396 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.855452 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.855472 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-6kq7j" Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.861320 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.912586 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78d5cb5d89-rgx54"] Jan 21 16:08:36 crc kubenswrapper[4707]: I0121 16:08:36.918979 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-78d5cb5d89-rgx54"] Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.002928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config-secret\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.002998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.003415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjsqt\" (UniqueName: \"kubernetes.io/projected/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-kube-api-access-kjsqt\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.003525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.105338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config-secret\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.105395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.105470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjsqt\" (UniqueName: \"kubernetes.io/projected/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-kube-api-access-kjsqt\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.105494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.106285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.108916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.109069 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config-secret\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.119917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjsqt\" (UniqueName: \"kubernetes.io/projected/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-kube-api-access-kjsqt\") pod \"openstackclient\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.168821 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.190005 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83157088-df54-4610-ae05-f3483eaa70b4" path="/var/lib/kubelet/pods/83157088-df54-4610-ae05-f3483eaa70b4/volumes" Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.557830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:08:37 crc kubenswrapper[4707]: W0121 16:08:37.564090 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1e1e63f_0c08_41e3_a60d_cd45b774ddbe.slice/crio-756f90a9b1c88e00bc75ee670c3eafa77f702fa933995afd6eee323eeb6cb73b WatchSource:0}: Error finding container 756f90a9b1c88e00bc75ee670c3eafa77f702fa933995afd6eee323eeb6cb73b: Status 404 returned error can't find the container with id 756f90a9b1c88e00bc75ee670c3eafa77f702fa933995afd6eee323eeb6cb73b Jan 21 16:08:37 crc kubenswrapper[4707]: I0121 16:08:37.606987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe","Type":"ContainerStarted","Data":"756f90a9b1c88e00bc75ee670c3eafa77f702fa933995afd6eee323eeb6cb73b"} Jan 21 16:08:38 crc kubenswrapper[4707]: I0121 16:08:38.614406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe","Type":"ContainerStarted","Data":"9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049"} Jan 21 16:08:38 crc kubenswrapper[4707]: I0121 16:08:38.630874 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.630860047 podStartE2EDuration="2.630860047s" podCreationTimestamp="2026-01-21 16:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:38.627409729 +0000 UTC m=+4015.808925942" watchObservedRunningTime="2026-01-21 16:08:38.630860047 +0000 UTC m=+4015.812376269" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.698391 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7"] Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.699870 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.701549 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.701712 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.702256 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.715071 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7"] Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.849571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-etc-swift\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.849610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-log-httpd\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.849683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8vx\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-kube-api-access-5v8vx\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.849899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-run-httpd\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.849986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-public-tls-certs\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.850022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-internal-tls-certs\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.850471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-config-data\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.850501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-combined-ca-bundle\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-config-data\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-combined-ca-bundle\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-etc-swift\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-log-httpd\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8vx\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-kube-api-access-5v8vx\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-run-httpd\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-public-tls-certs\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.951938 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-internal-tls-certs\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.952357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-log-httpd\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.952398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-run-httpd\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.956916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-internal-tls-certs\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.957451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-etc-swift\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.957460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-config-data\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.957648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-public-tls-certs\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.966824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-combined-ca-bundle\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:39 crc kubenswrapper[4707]: I0121 16:08:39.967028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8vx\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-kube-api-access-5v8vx\") pod \"swift-proxy-5f557d5d74-bnmf7\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.015000 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.333622 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.358963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-combined-ca-bundle\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-scripts\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-run-httpd\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4sss\" (UniqueName: \"kubernetes.io/projected/06d5ea70-00dc-4469-a839-6a644f076265-kube-api-access-f4sss\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-config-data\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-sg-core-conf-yaml\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-log-httpd\") pod \"06d5ea70-00dc-4469-a839-6a644f076265\" (UID: \"06d5ea70-00dc-4469-a839-6a644f076265\") " Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359500 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.359757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.360003 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.360022 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d5ea70-00dc-4469-a839-6a644f076265-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.363057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-scripts" (OuterVolumeSpecName: "scripts") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.363098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d5ea70-00dc-4469-a839-6a644f076265-kube-api-access-f4sss" (OuterVolumeSpecName: "kube-api-access-f4sss") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "kube-api-access-f4sss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.388595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.417062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.429314 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-config-data" (OuterVolumeSpecName: "config-data") pod "06d5ea70-00dc-4469-a839-6a644f076265" (UID: "06d5ea70-00dc-4469-a839-6a644f076265"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.461150 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.461191 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.461202 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4sss\" (UniqueName: \"kubernetes.io/projected/06d5ea70-00dc-4469-a839-6a644f076265-kube-api-access-f4sss\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.461211 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.461219 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06d5ea70-00dc-4469-a839-6a644f076265-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.461504 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7"] Jan 21 16:08:40 crc kubenswrapper[4707]: W0121 16:08:40.464797 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod693d76e3_e5ce_4267_8646_42944a6d6678.slice/crio-c95b41494dc6c8b09e027126e7e51a3fe6b8a30ebad223ae4cf9aaefbd0411c4 WatchSource:0}: Error finding container c95b41494dc6c8b09e027126e7e51a3fe6b8a30ebad223ae4cf9aaefbd0411c4: Status 404 returned error can't find the container with id c95b41494dc6c8b09e027126e7e51a3fe6b8a30ebad223ae4cf9aaefbd0411c4 Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.626874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" event={"ID":"693d76e3-e5ce-4267-8646-42944a6d6678","Type":"ContainerStarted","Data":"4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685"} Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.626912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" event={"ID":"693d76e3-e5ce-4267-8646-42944a6d6678","Type":"ContainerStarted","Data":"c95b41494dc6c8b09e027126e7e51a3fe6b8a30ebad223ae4cf9aaefbd0411c4"} Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.628898 4707 generic.go:334] "Generic (PLEG): container finished" podID="06d5ea70-00dc-4469-a839-6a644f076265" containerID="0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f" exitCode=0 Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.628922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerDied","Data":"0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f"} Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.628937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"06d5ea70-00dc-4469-a839-6a644f076265","Type":"ContainerDied","Data":"c6f7523ff01e763d4c53b1b6d662e1fd0ca5325badb4a1b0f6b309d4ebd208b1"} Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.628951 4707 scope.go:117] "RemoveContainer" containerID="0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.629054 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.654578 4707 scope.go:117] "RemoveContainer" containerID="e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.654905 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.660667 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.674049 4707 scope.go:117] "RemoveContainer" containerID="0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.677667 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.678069 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-central-agent" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678096 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-central-agent" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.678110 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-notification-agent" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678115 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-notification-agent" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.678140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="proxy-httpd" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="proxy-httpd" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.678173 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="sg-core" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="sg-core" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="ceilometer-central-agent" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678379 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="sg-core" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678397 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5ea70-00dc-4469-a839-6a644f076265" 
containerName="ceilometer-notification-agent" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.678404 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5ea70-00dc-4469-a839-6a644f076265" containerName="proxy-httpd" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.690067 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.690179 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.692018 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.692190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.694076 4707 scope.go:117] "RemoveContainer" containerID="8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.713925 4707 scope.go:117] "RemoveContainer" containerID="0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.715498 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9\": container with ID starting with 0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9 not found: ID does not exist" containerID="0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.715704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9"} err="failed to get container status \"0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9\": rpc error: code = NotFound desc = could not find container \"0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9\": container with ID starting with 0d18361756761dd09a4e936297f42a517ccc91eaae8bed34ff7f10f21d1b83e9 not found: ID does not exist" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.715726 4707 scope.go:117] "RemoveContainer" containerID="e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.716171 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4\": container with ID starting with e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4 not found: ID does not exist" containerID="e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.716247 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4"} err="failed to get container status \"e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4\": rpc error: code = NotFound desc = could not find container \"e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4\": container with ID starting with 
e0f00a7435c3bc2b7df0f1892f6bde2715e92c1d132287e9c10f4766acb0bfa4 not found: ID does not exist" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.716310 4707 scope.go:117] "RemoveContainer" containerID="0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.716592 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f\": container with ID starting with 0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f not found: ID does not exist" containerID="0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.716673 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f"} err="failed to get container status \"0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f\": rpc error: code = NotFound desc = could not find container \"0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f\": container with ID starting with 0b0e36a69e859defb597f12c47a09dc07a022429a23835dfa0d9cb102616f88f not found: ID does not exist" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.716887 4707 scope.go:117] "RemoveContainer" containerID="8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac" Jan 21 16:08:40 crc kubenswrapper[4707]: E0121 16:08:40.717235 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac\": container with ID starting with 8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac not found: ID does not exist" containerID="8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.717259 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac"} err="failed to get container status \"8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac\": rpc error: code = NotFound desc = could not find container \"8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac\": container with ID starting with 8867da96a122004afd7681af3e016625acf016ac36a1a1eb6dc65a35557157ac not found: ID does not exist" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-log-httpd\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-scripts\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbllh\" (UniqueName: \"kubernetes.io/projected/47c0b03e-0ca0-4250-b374-7070f8bd1699-kube-api-access-dbllh\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-config-data\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.865873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-run-httpd\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-config-data\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-run-httpd\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-log-httpd\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 
16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-scripts\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbllh\" (UniqueName: \"kubernetes.io/projected/47c0b03e-0ca0-4250-b374-7070f8bd1699-kube-api-access-dbllh\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.967800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-log-httpd\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.968115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-run-httpd\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.970270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.970748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-config-data\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.971333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.971754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-scripts\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:40 crc kubenswrapper[4707]: I0121 16:08:40.982059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbllh\" (UniqueName: \"kubernetes.io/projected/47c0b03e-0ca0-4250-b374-7070f8bd1699-kube-api-access-dbllh\") pod \"ceilometer-0\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.008458 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.197666 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d5ea70-00dc-4469-a839-6a644f076265" path="/var/lib/kubelet/pods/06d5ea70-00dc-4469-a839-6a644f076265/volumes" Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.399339 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.638020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerStarted","Data":"f0071cf767e064f96ec60072b213c895cc53c511d40ef0d3593b71b32507a3f3"} Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.639884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" event={"ID":"693d76e3-e5ce-4267-8646-42944a6d6678","Type":"ContainerStarted","Data":"f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6"} Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.640113 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.640301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:41 crc kubenswrapper[4707]: I0121 16:08:41.653453 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" podStartSLOduration=2.653438193 podStartE2EDuration="2.653438193s" podCreationTimestamp="2026-01-21 16:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:41.652406553 +0000 UTC m=+4018.833922775" watchObservedRunningTime="2026-01-21 16:08:41.653438193 +0000 UTC m=+4018.834954415" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.123290 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-pvg29"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.124422 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.130077 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-pvg29"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.224743 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-dvq7g"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.225852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.237548 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.238522 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.240182 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.243733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-dvq7g"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.250084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.291768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtf4\" (UniqueName: \"kubernetes.io/projected/972548d1-bc2a-4edc-9d47-7f31b98459fd-kube-api-access-fbtf4\") pod \"nova-api-db-create-pvg29\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.292388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/972548d1-bc2a-4edc-9d47-7f31b98459fd-operator-scripts\") pod \"nova-api-db-create-pvg29\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.324837 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-v95sk"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.325792 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.331603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-v95sk"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.393519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5w2\" (UniqueName: \"kubernetes.io/projected/73047435-3cb8-43e3-bfc8-560c961e34cf-kube-api-access-2m5w2\") pod \"nova-api-dc17-account-create-update-trb5l\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.393579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a82682c-ac19-48c1-a9bf-30e1eca19329-operator-scripts\") pod \"nova-cell0-db-create-dvq7g\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.393723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtf4\" (UniqueName: \"kubernetes.io/projected/972548d1-bc2a-4edc-9d47-7f31b98459fd-kube-api-access-fbtf4\") pod \"nova-api-db-create-pvg29\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.393767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqgg\" (UniqueName: 
\"kubernetes.io/projected/7a82682c-ac19-48c1-a9bf-30e1eca19329-kube-api-access-4nqgg\") pod \"nova-cell0-db-create-dvq7g\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.393981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73047435-3cb8-43e3-bfc8-560c961e34cf-operator-scripts\") pod \"nova-api-dc17-account-create-update-trb5l\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.394100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/972548d1-bc2a-4edc-9d47-7f31b98459fd-operator-scripts\") pod \"nova-api-db-create-pvg29\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.394744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/972548d1-bc2a-4edc-9d47-7f31b98459fd-operator-scripts\") pod \"nova-api-db-create-pvg29\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.412880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtf4\" (UniqueName: \"kubernetes.io/projected/972548d1-bc2a-4edc-9d47-7f31b98459fd-kube-api-access-fbtf4\") pod \"nova-api-db-create-pvg29\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.428645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.429612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.435149 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.441368 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.447499 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.495178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44l9t\" (UniqueName: \"kubernetes.io/projected/2933067b-5f4b-4431-9b9b-5da784bab778-kube-api-access-44l9t\") pod \"nova-cell1-db-create-v95sk\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.495357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5w2\" (UniqueName: \"kubernetes.io/projected/73047435-3cb8-43e3-bfc8-560c961e34cf-kube-api-access-2m5w2\") pod \"nova-api-dc17-account-create-update-trb5l\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.495388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2933067b-5f4b-4431-9b9b-5da784bab778-operator-scripts\") pod \"nova-cell1-db-create-v95sk\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.495439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a82682c-ac19-48c1-a9bf-30e1eca19329-operator-scripts\") pod \"nova-cell0-db-create-dvq7g\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.495472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqgg\" (UniqueName: \"kubernetes.io/projected/7a82682c-ac19-48c1-a9bf-30e1eca19329-kube-api-access-4nqgg\") pod \"nova-cell0-db-create-dvq7g\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.495583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73047435-3cb8-43e3-bfc8-560c961e34cf-operator-scripts\") pod \"nova-api-dc17-account-create-update-trb5l\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.496092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a82682c-ac19-48c1-a9bf-30e1eca19329-operator-scripts\") pod \"nova-cell0-db-create-dvq7g\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.496239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73047435-3cb8-43e3-bfc8-560c961e34cf-operator-scripts\") pod \"nova-api-dc17-account-create-update-trb5l\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.509245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-2m5w2\" (UniqueName: \"kubernetes.io/projected/73047435-3cb8-43e3-bfc8-560c961e34cf-kube-api-access-2m5w2\") pod \"nova-api-dc17-account-create-update-trb5l\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.518564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqgg\" (UniqueName: \"kubernetes.io/projected/7a82682c-ac19-48c1-a9bf-30e1eca19329-kube-api-access-4nqgg\") pod \"nova-cell0-db-create-dvq7g\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.540199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.567957 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.597300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44l9t\" (UniqueName: \"kubernetes.io/projected/2933067b-5f4b-4431-9b9b-5da784bab778-kube-api-access-44l9t\") pod \"nova-cell1-db-create-v95sk\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.597405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2933067b-5f4b-4431-9b9b-5da784bab778-operator-scripts\") pod \"nova-cell1-db-create-v95sk\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.597482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df737a8e-a932-4546-99f4-2e75115f4bb9-operator-scripts\") pod \"nova-cell0-0e3e-account-create-update-w8rtv\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.597619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mgq\" (UniqueName: \"kubernetes.io/projected/df737a8e-a932-4546-99f4-2e75115f4bb9-kube-api-access-l8mgq\") pod \"nova-cell0-0e3e-account-create-update-w8rtv\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.598257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2933067b-5f4b-4431-9b9b-5da784bab778-operator-scripts\") pod \"nova-cell1-db-create-v95sk\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.624166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44l9t\" (UniqueName: \"kubernetes.io/projected/2933067b-5f4b-4431-9b9b-5da784bab778-kube-api-access-44l9t\") pod \"nova-cell1-db-create-v95sk\" (UID: 
\"2933067b-5f4b-4431-9b9b-5da784bab778\") " pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.644386 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.653293 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.654889 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.657073 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.665611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.674947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerStarted","Data":"bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93"} Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.698712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df737a8e-a932-4546-99f4-2e75115f4bb9-operator-scripts\") pod \"nova-cell0-0e3e-account-create-update-w8rtv\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.698804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mgq\" (UniqueName: \"kubernetes.io/projected/df737a8e-a932-4546-99f4-2e75115f4bb9-kube-api-access-l8mgq\") pod \"nova-cell0-0e3e-account-create-update-w8rtv\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.699473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df737a8e-a932-4546-99f4-2e75115f4bb9-operator-scripts\") pod \"nova-cell0-0e3e-account-create-update-w8rtv\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.715002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mgq\" (UniqueName: \"kubernetes.io/projected/df737a8e-a932-4546-99f4-2e75115f4bb9-kube-api-access-l8mgq\") pod \"nova-cell0-0e3e-account-create-update-w8rtv\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.759243 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.803836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzp6\" (UniqueName: \"kubernetes.io/projected/891bddcb-ef77-491e-958f-6bc9d740c3be-kube-api-access-vnzp6\") pod \"nova-cell1-524d-account-create-update-hkhxn\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.803976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/891bddcb-ef77-491e-958f-6bc9d740c3be-operator-scripts\") pod \"nova-cell1-524d-account-create-update-hkhxn\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.821997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-pvg29"] Jan 21 16:08:42 crc kubenswrapper[4707]: W0121 16:08:42.826208 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod972548d1_bc2a_4edc_9d47_7f31b98459fd.slice/crio-f095bbab3a268b8381ace164cc18b4406339460be7a8adeb5699908a19ba4c78 WatchSource:0}: Error finding container f095bbab3a268b8381ace164cc18b4406339460be7a8adeb5699908a19ba4c78: Status 404 returned error can't find the container with id f095bbab3a268b8381ace164cc18b4406339460be7a8adeb5699908a19ba4c78 Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.905663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzp6\" (UniqueName: \"kubernetes.io/projected/891bddcb-ef77-491e-958f-6bc9d740c3be-kube-api-access-vnzp6\") pod \"nova-cell1-524d-account-create-update-hkhxn\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.905759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/891bddcb-ef77-491e-958f-6bc9d740c3be-operator-scripts\") pod \"nova-cell1-524d-account-create-update-hkhxn\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.907110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/891bddcb-ef77-491e-958f-6bc9d740c3be-operator-scripts\") pod \"nova-cell1-524d-account-create-update-hkhxn\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.921315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzp6\" (UniqueName: \"kubernetes.io/projected/891bddcb-ef77-491e-958f-6bc9d740c3be-kube-api-access-vnzp6\") pod \"nova-cell1-524d-account-create-update-hkhxn\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.964936 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-dvq7g"] Jan 21 16:08:42 crc kubenswrapper[4707]: I0121 16:08:42.976320 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.084965 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l"] Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.107167 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-v95sk"] Jan 21 16:08:43 crc kubenswrapper[4707]: W0121 16:08:43.112271 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2933067b_5f4b_4431_9b9b_5da784bab778.slice/crio-fbe9bdc1f5e590f6483c9a124987c9fb1fd62a44d8725615c5b2913e208979c2 WatchSource:0}: Error finding container fbe9bdc1f5e590f6483c9a124987c9fb1fd62a44d8725615c5b2913e208979c2: Status 404 returned error can't find the container with id fbe9bdc1f5e590f6483c9a124987c9fb1fd62a44d8725615c5b2913e208979c2 Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.222714 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv"] Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.235690 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.405275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn"] Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.414100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.681750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" event={"ID":"891bddcb-ef77-491e-958f-6bc9d740c3be","Type":"ContainerStarted","Data":"0f6f130367f0801fae21c33ab0ad026600cc933e3887547756a0851ac9b67247"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.682597 4707 generic.go:334] "Generic (PLEG): container finished" podID="73047435-3cb8-43e3-bfc8-560c961e34cf" containerID="58830d3499b14fa51fbf78280bba3cf9c8496d8b7a93968c43ae679eaba07c04" exitCode=0 Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.682637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" event={"ID":"73047435-3cb8-43e3-bfc8-560c961e34cf","Type":"ContainerDied","Data":"58830d3499b14fa51fbf78280bba3cf9c8496d8b7a93968c43ae679eaba07c04"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.682652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" event={"ID":"73047435-3cb8-43e3-bfc8-560c961e34cf","Type":"ContainerStarted","Data":"32c49b6479a2832ae29d15ca9cf77bb60f276e5988e25a60a07a3650718cafd0"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.684116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerStarted","Data":"376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81"} Jan 21 
16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.686046 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a82682c-ac19-48c1-a9bf-30e1eca19329" containerID="ea92c2d6019c27b4829b3e9f434a365dd09e2813d074984c8109dbf9ed386c84" exitCode=0 Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.686087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" event={"ID":"7a82682c-ac19-48c1-a9bf-30e1eca19329","Type":"ContainerDied","Data":"ea92c2d6019c27b4829b3e9f434a365dd09e2813d074984c8109dbf9ed386c84"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.686129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" event={"ID":"7a82682c-ac19-48c1-a9bf-30e1eca19329","Type":"ContainerStarted","Data":"6979a01bb343e24386c09a76ff68485ef149c65efde6ab8838881cc65e859a4e"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.688391 4707 generic.go:334] "Generic (PLEG): container finished" podID="df737a8e-a932-4546-99f4-2e75115f4bb9" containerID="8d6806b4d276c4e814c81f7eb9a13232ef63c8f4ed5080fb5caadd173a7acb91" exitCode=0 Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.688442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" event={"ID":"df737a8e-a932-4546-99f4-2e75115f4bb9","Type":"ContainerDied","Data":"8d6806b4d276c4e814c81f7eb9a13232ef63c8f4ed5080fb5caadd173a7acb91"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.688463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" event={"ID":"df737a8e-a932-4546-99f4-2e75115f4bb9","Type":"ContainerStarted","Data":"6bb22f42f5d399bc49e935d89435a0b46e2976a282eb5464d2f0c32b9d25e707"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.689582 4707 generic.go:334] "Generic (PLEG): container finished" podID="2933067b-5f4b-4431-9b9b-5da784bab778" containerID="fdfba2521f191a956805a9894a059b391490e094209faadcb9e1196d96a653a7" exitCode=0 Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.689620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" event={"ID":"2933067b-5f4b-4431-9b9b-5da784bab778","Type":"ContainerDied","Data":"fdfba2521f191a956805a9894a059b391490e094209faadcb9e1196d96a653a7"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.689633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" event={"ID":"2933067b-5f4b-4431-9b9b-5da784bab778","Type":"ContainerStarted","Data":"fbe9bdc1f5e590f6483c9a124987c9fb1fd62a44d8725615c5b2913e208979c2"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.692187 4707 generic.go:334] "Generic (PLEG): container finished" podID="972548d1-bc2a-4edc-9d47-7f31b98459fd" containerID="35168c1fabdc2f1d8d3b47af44542532da9c2604d8d33b99a98c14e589f3ba68" exitCode=0 Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.692210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" event={"ID":"972548d1-bc2a-4edc-9d47-7f31b98459fd","Type":"ContainerDied","Data":"35168c1fabdc2f1d8d3b47af44542532da9c2604d8d33b99a98c14e589f3ba68"} Jan 21 16:08:43 crc kubenswrapper[4707]: I0121 16:08:43.692223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" 
event={"ID":"972548d1-bc2a-4edc-9d47-7f31b98459fd","Type":"ContainerStarted","Data":"f095bbab3a268b8381ace164cc18b4406339460be7a8adeb5699908a19ba4c78"} Jan 21 16:08:44 crc kubenswrapper[4707]: I0121 16:08:44.699515 4707 generic.go:334] "Generic (PLEG): container finished" podID="891bddcb-ef77-491e-958f-6bc9d740c3be" containerID="2836fb5072518b31b20d2d2ab0bac82ac6aaa2d7fa44b4ce3c1c244ed3389a72" exitCode=0 Jan 21 16:08:44 crc kubenswrapper[4707]: I0121 16:08:44.699567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" event={"ID":"891bddcb-ef77-491e-958f-6bc9d740c3be","Type":"ContainerDied","Data":"2836fb5072518b31b20d2d2ab0bac82ac6aaa2d7fa44b4ce3c1c244ed3389a72"} Jan 21 16:08:44 crc kubenswrapper[4707]: I0121 16:08:44.701323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerStarted","Data":"a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.031109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.040753 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.112099 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.143947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mgq\" (UniqueName: \"kubernetes.io/projected/df737a8e-a932-4546-99f4-2e75115f4bb9-kube-api-access-l8mgq\") pod \"df737a8e-a932-4546-99f4-2e75115f4bb9\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.144034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df737a8e-a932-4546-99f4-2e75115f4bb9-operator-scripts\") pod \"df737a8e-a932-4546-99f4-2e75115f4bb9\" (UID: \"df737a8e-a932-4546-99f4-2e75115f4bb9\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.144569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df737a8e-a932-4546-99f4-2e75115f4bb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df737a8e-a932-4546-99f4-2e75115f4bb9" (UID: "df737a8e-a932-4546-99f4-2e75115f4bb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.148728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df737a8e-a932-4546-99f4-2e75115f4bb9-kube-api-access-l8mgq" (OuterVolumeSpecName: "kube-api-access-l8mgq") pod "df737a8e-a932-4546-99f4-2e75115f4bb9" (UID: "df737a8e-a932-4546-99f4-2e75115f4bb9"). InnerVolumeSpecName "kube-api-access-l8mgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.188845 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:08:45 crc kubenswrapper[4707]: E0121 16:08:45.189219 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.205801 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.223090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.224005 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.228361 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m5w2\" (UniqueName: \"kubernetes.io/projected/73047435-3cb8-43e3-bfc8-560c961e34cf-kube-api-access-2m5w2\") pod \"73047435-3cb8-43e3-bfc8-560c961e34cf\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2933067b-5f4b-4431-9b9b-5da784bab778-operator-scripts\") pod \"2933067b-5f4b-4431-9b9b-5da784bab778\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nqgg\" (UniqueName: \"kubernetes.io/projected/7a82682c-ac19-48c1-a9bf-30e1eca19329-kube-api-access-4nqgg\") pod \"7a82682c-ac19-48c1-a9bf-30e1eca19329\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73047435-3cb8-43e3-bfc8-560c961e34cf-operator-scripts\") pod \"73047435-3cb8-43e3-bfc8-560c961e34cf\" (UID: \"73047435-3cb8-43e3-bfc8-560c961e34cf\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/972548d1-bc2a-4edc-9d47-7f31b98459fd-operator-scripts\") pod \"972548d1-bc2a-4edc-9d47-7f31b98459fd\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44l9t\" (UniqueName: 
\"kubernetes.io/projected/2933067b-5f4b-4431-9b9b-5da784bab778-kube-api-access-44l9t\") pod \"2933067b-5f4b-4431-9b9b-5da784bab778\" (UID: \"2933067b-5f4b-4431-9b9b-5da784bab778\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbtf4\" (UniqueName: \"kubernetes.io/projected/972548d1-bc2a-4edc-9d47-7f31b98459fd-kube-api-access-fbtf4\") pod \"972548d1-bc2a-4edc-9d47-7f31b98459fd\" (UID: \"972548d1-bc2a-4edc-9d47-7f31b98459fd\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a82682c-ac19-48c1-a9bf-30e1eca19329-operator-scripts\") pod \"7a82682c-ac19-48c1-a9bf-30e1eca19329\" (UID: \"7a82682c-ac19-48c1-a9bf-30e1eca19329\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245952 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df737a8e-a932-4546-99f4-2e75115f4bb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.245967 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8mgq\" (UniqueName: \"kubernetes.io/projected/df737a8e-a932-4546-99f4-2e75115f4bb9-kube-api-access-l8mgq\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.255051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73047435-3cb8-43e3-bfc8-560c961e34cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73047435-3cb8-43e3-bfc8-560c961e34cf" (UID: "73047435-3cb8-43e3-bfc8-560c961e34cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.255462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2933067b-5f4b-4431-9b9b-5da784bab778-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2933067b-5f4b-4431-9b9b-5da784bab778" (UID: "2933067b-5f4b-4431-9b9b-5da784bab778"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.255573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a82682c-ac19-48c1-a9bf-30e1eca19329-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a82682c-ac19-48c1-a9bf-30e1eca19329" (UID: "7a82682c-ac19-48c1-a9bf-30e1eca19329"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.256539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972548d1-bc2a-4edc-9d47-7f31b98459fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "972548d1-bc2a-4edc-9d47-7f31b98459fd" (UID: "972548d1-bc2a-4edc-9d47-7f31b98459fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.256821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73047435-3cb8-43e3-bfc8-560c961e34cf-kube-api-access-2m5w2" (OuterVolumeSpecName: "kube-api-access-2m5w2") pod "73047435-3cb8-43e3-bfc8-560c961e34cf" (UID: "73047435-3cb8-43e3-bfc8-560c961e34cf"). InnerVolumeSpecName "kube-api-access-2m5w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.257731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2933067b-5f4b-4431-9b9b-5da784bab778-kube-api-access-44l9t" (OuterVolumeSpecName: "kube-api-access-44l9t") pod "2933067b-5f4b-4431-9b9b-5da784bab778" (UID: "2933067b-5f4b-4431-9b9b-5da784bab778"). InnerVolumeSpecName "kube-api-access-44l9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.269397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972548d1-bc2a-4edc-9d47-7f31b98459fd-kube-api-access-fbtf4" (OuterVolumeSpecName: "kube-api-access-fbtf4") pod "972548d1-bc2a-4edc-9d47-7f31b98459fd" (UID: "972548d1-bc2a-4edc-9d47-7f31b98459fd"). InnerVolumeSpecName "kube-api-access-fbtf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.269517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a82682c-ac19-48c1-a9bf-30e1eca19329-kube-api-access-4nqgg" (OuterVolumeSpecName: "kube-api-access-4nqgg") pod "7a82682c-ac19-48c1-a9bf-30e1eca19329" (UID: "7a82682c-ac19-48c1-a9bf-30e1eca19329"). InnerVolumeSpecName "kube-api-access-4nqgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347583 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a82682c-ac19-48c1-a9bf-30e1eca19329-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347614 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m5w2\" (UniqueName: \"kubernetes.io/projected/73047435-3cb8-43e3-bfc8-560c961e34cf-kube-api-access-2m5w2\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347625 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2933067b-5f4b-4431-9b9b-5da784bab778-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347634 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nqgg\" (UniqueName: \"kubernetes.io/projected/7a82682c-ac19-48c1-a9bf-30e1eca19329-kube-api-access-4nqgg\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347643 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73047435-3cb8-43e3-bfc8-560c961e34cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347651 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/972548d1-bc2a-4edc-9d47-7f31b98459fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347659 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44l9t\" (UniqueName: \"kubernetes.io/projected/2933067b-5f4b-4431-9b9b-5da784bab778-kube-api-access-44l9t\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.347667 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbtf4\" (UniqueName: \"kubernetes.io/projected/972548d1-bc2a-4edc-9d47-7f31b98459fd-kube-api-access-fbtf4\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.709445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerStarted","Data":"889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.710296 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.711585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" event={"ID":"7a82682c-ac19-48c1-a9bf-30e1eca19329","Type":"ContainerDied","Data":"6979a01bb343e24386c09a76ff68485ef149c65efde6ab8838881cc65e859a4e"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.711608 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6979a01bb343e24386c09a76ff68485ef149c65efde6ab8838881cc65e859a4e" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.711643 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-dvq7g" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.716998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" event={"ID":"df737a8e-a932-4546-99f4-2e75115f4bb9","Type":"ContainerDied","Data":"6bb22f42f5d399bc49e935d89435a0b46e2976a282eb5464d2f0c32b9d25e707"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.717048 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb22f42f5d399bc49e935d89435a0b46e2976a282eb5464d2f0c32b9d25e707" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.717021 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.718329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" event={"ID":"2933067b-5f4b-4431-9b9b-5da784bab778","Type":"ContainerDied","Data":"fbe9bdc1f5e590f6483c9a124987c9fb1fd62a44d8725615c5b2913e208979c2"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.718366 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe9bdc1f5e590f6483c9a124987c9fb1fd62a44d8725615c5b2913e208979c2" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.718414 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-v95sk" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.720173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" event={"ID":"972548d1-bc2a-4edc-9d47-7f31b98459fd","Type":"ContainerDied","Data":"f095bbab3a268b8381ace164cc18b4406339460be7a8adeb5699908a19ba4c78"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.720197 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f095bbab3a268b8381ace164cc18b4406339460be7a8adeb5699908a19ba4c78" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.720239 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-pvg29" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.721718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" event={"ID":"73047435-3cb8-43e3-bfc8-560c961e34cf","Type":"ContainerDied","Data":"32c49b6479a2832ae29d15ca9cf77bb60f276e5988e25a60a07a3650718cafd0"} Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.721738 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c49b6479a2832ae29d15ca9cf77bb60f276e5988e25a60a07a3650718cafd0" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.721767 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.728791 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.192984456 podStartE2EDuration="5.728781632s" podCreationTimestamp="2026-01-21 16:08:40 +0000 UTC" firstStartedPulling="2026-01-21 16:08:41.398782857 +0000 UTC m=+4018.580299069" lastFinishedPulling="2026-01-21 16:08:44.934580023 +0000 UTC m=+4022.116096245" observedRunningTime="2026-01-21 16:08:45.727352354 +0000 UTC m=+4022.908868576" watchObservedRunningTime="2026-01-21 16:08:45.728781632 +0000 UTC m=+4022.910297854" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.940277 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.957484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnzp6\" (UniqueName: \"kubernetes.io/projected/891bddcb-ef77-491e-958f-6bc9d740c3be-kube-api-access-vnzp6\") pod \"891bddcb-ef77-491e-958f-6bc9d740c3be\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.957553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/891bddcb-ef77-491e-958f-6bc9d740c3be-operator-scripts\") pod \"891bddcb-ef77-491e-958f-6bc9d740c3be\" (UID: \"891bddcb-ef77-491e-958f-6bc9d740c3be\") " Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.958143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/891bddcb-ef77-491e-958f-6bc9d740c3be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "891bddcb-ef77-491e-958f-6bc9d740c3be" (UID: "891bddcb-ef77-491e-958f-6bc9d740c3be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:08:45 crc kubenswrapper[4707]: I0121 16:08:45.961858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891bddcb-ef77-491e-958f-6bc9d740c3be-kube-api-access-vnzp6" (OuterVolumeSpecName: "kube-api-access-vnzp6") pod "891bddcb-ef77-491e-958f-6bc9d740c3be" (UID: "891bddcb-ef77-491e-958f-6bc9d740c3be"). InnerVolumeSpecName "kube-api-access-vnzp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:46 crc kubenswrapper[4707]: I0121 16:08:46.059555 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnzp6\" (UniqueName: \"kubernetes.io/projected/891bddcb-ef77-491e-958f-6bc9d740c3be-kube-api-access-vnzp6\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:46 crc kubenswrapper[4707]: I0121 16:08:46.059769 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/891bddcb-ef77-491e-958f-6bc9d740c3be-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:46 crc kubenswrapper[4707]: I0121 16:08:46.729943 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" Jan 21 16:08:46 crc kubenswrapper[4707]: I0121 16:08:46.733959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn" event={"ID":"891bddcb-ef77-491e-958f-6bc9d740c3be","Type":"ContainerDied","Data":"0f6f130367f0801fae21c33ab0ad026600cc933e3887547756a0851ac9b67247"} Jan 21 16:08:46 crc kubenswrapper[4707]: I0121 16:08:46.734000 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6f130367f0801fae21c33ab0ad026600cc933e3887547756a0851ac9b67247" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.488914 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.633216 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4"] Jan 21 16:08:47 crc kubenswrapper[4707]: E0121 16:08:47.633879 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df737a8e-a932-4546-99f4-2e75115f4bb9" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.633899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df737a8e-a932-4546-99f4-2e75115f4bb9" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: E0121 16:08:47.633923 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73047435-3cb8-43e3-bfc8-560c961e34cf" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.633930 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="73047435-3cb8-43e3-bfc8-560c961e34cf" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: E0121 16:08:47.633952 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2933067b-5f4b-4431-9b9b-5da784bab778" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.633959 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2933067b-5f4b-4431-9b9b-5da784bab778" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: E0121 16:08:47.633976 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891bddcb-ef77-491e-958f-6bc9d740c3be" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.633982 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="891bddcb-ef77-491e-958f-6bc9d740c3be" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: E0121 16:08:47.634005 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972548d1-bc2a-4edc-9d47-7f31b98459fd" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634012 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="972548d1-bc2a-4edc-9d47-7f31b98459fd" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: E0121 16:08:47.634020 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a82682c-ac19-48c1-a9bf-30e1eca19329" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a82682c-ac19-48c1-a9bf-30e1eca19329" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 
16:08:47.634449 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="73047435-3cb8-43e3-bfc8-560c961e34cf" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634477 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df737a8e-a932-4546-99f4-2e75115f4bb9" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634505 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2933067b-5f4b-4431-9b9b-5da784bab778" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634527 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="972548d1-bc2a-4edc-9d47-7f31b98459fd" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634538 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a82682c-ac19-48c1-a9bf-30e1eca19329" containerName="mariadb-database-create" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.634554 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="891bddcb-ef77-491e-958f-6bc9d740c3be" containerName="mariadb-account-create-update" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.635323 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.650195 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-h8dbq" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.651166 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.686145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-scripts\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.686264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db79d\" (UniqueName: \"kubernetes.io/projected/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-kube-api-access-db79d\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.686302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-config-data\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.686329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc 
kubenswrapper[4707]: I0121 16:08:47.687787 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.701623 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4"] Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.787909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-scripts\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.788134 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db79d\" (UniqueName: \"kubernetes.io/projected/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-kube-api-access-db79d\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.788205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-config-data\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.788243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.793190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-scripts\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.793858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-config-data\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.794584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.811713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db79d\" (UniqueName: \"kubernetes.io/projected/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-kube-api-access-db79d\") pod \"nova-cell0-conductor-db-sync-zc2z4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:47 crc kubenswrapper[4707]: I0121 16:08:47.978875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.361329 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4"] Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.754821 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-central-agent" containerID="cri-o://bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93" gracePeriod=30 Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.755734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" event={"ID":"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4","Type":"ContainerStarted","Data":"52e0023d3e6851f48959386497daba5bed222e165b85dbf4d2543fae005a5f62"} Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.755764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" event={"ID":"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4","Type":"ContainerStarted","Data":"1a5d02dceb34c23d2099d667f573518f216ae8709db5d542874d4433b7bade12"} Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.755988 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="proxy-httpd" containerID="cri-o://889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626" gracePeriod=30 Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.756042 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="sg-core" containerID="cri-o://a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42" gracePeriod=30 Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.756072 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-notification-agent" containerID="cri-o://376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81" gracePeriod=30 Jan 21 16:08:48 crc kubenswrapper[4707]: I0121 16:08:48.775369 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" podStartSLOduration=1.775355447 podStartE2EDuration="1.775355447s" podCreationTimestamp="2026-01-21 16:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:48.77160769 +0000 UTC m=+4025.953123912" watchObservedRunningTime="2026-01-21 16:08:48.775355447 +0000 UTC m=+4025.956871669" Jan 21 16:08:49 crc kubenswrapper[4707]: I0121 16:08:49.762932 4707 generic.go:334] "Generic (PLEG): container finished" podID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerID="889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626" exitCode=0 Jan 21 16:08:49 crc kubenswrapper[4707]: I0121 16:08:49.763139 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerID="a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42" exitCode=2 Jan 21 16:08:49 crc kubenswrapper[4707]: I0121 16:08:49.762969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerDied","Data":"889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626"} Jan 21 16:08:49 crc kubenswrapper[4707]: I0121 16:08:49.763192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerDied","Data":"a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42"} Jan 21 16:08:49 crc kubenswrapper[4707]: I0121 16:08:49.763206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerDied","Data":"376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81"} Jan 21 16:08:49 crc kubenswrapper[4707]: I0121 16:08:49.763148 4707 generic.go:334] "Generic (PLEG): container finished" podID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerID="376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81" exitCode=0 Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.088659 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-config-data\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-sg-core-conf-yaml\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-log-httpd\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-run-httpd\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbllh\" (UniqueName: \"kubernetes.io/projected/47c0b03e-0ca0-4250-b374-7070f8bd1699-kube-api-access-dbllh\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158714 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-scripts\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: 
\"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.158794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-combined-ca-bundle\") pod \"47c0b03e-0ca0-4250-b374-7070f8bd1699\" (UID: \"47c0b03e-0ca0-4250-b374-7070f8bd1699\") " Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.159420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.159730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.168271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-scripts" (OuterVolumeSpecName: "scripts") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.168688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c0b03e-0ca0-4250-b374-7070f8bd1699-kube-api-access-dbllh" (OuterVolumeSpecName: "kube-api-access-dbllh") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "kube-api-access-dbllh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.182127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.213930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.229509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-config-data" (OuterVolumeSpecName: "config-data") pod "47c0b03e-0ca0-4250-b374-7070f8bd1699" (UID: "47c0b03e-0ca0-4250-b374-7070f8bd1699"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260898 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbllh\" (UniqueName: \"kubernetes.io/projected/47c0b03e-0ca0-4250-b374-7070f8bd1699-kube-api-access-dbllh\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260926 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260936 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260944 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260952 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260960 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47c0b03e-0ca0-4250-b374-7070f8bd1699-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.260967 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47c0b03e-0ca0-4250-b374-7070f8bd1699-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.786262 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" containerID="52e0023d3e6851f48959386497daba5bed222e165b85dbf4d2543fae005a5f62" exitCode=0 Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.786341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" event={"ID":"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4","Type":"ContainerDied","Data":"52e0023d3e6851f48959386497daba5bed222e165b85dbf4d2543fae005a5f62"} Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.788636 4707 generic.go:334] "Generic (PLEG): container finished" podID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerID="bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93" exitCode=0 Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.788669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerDied","Data":"bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93"} Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.788687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"47c0b03e-0ca0-4250-b374-7070f8bd1699","Type":"ContainerDied","Data":"f0071cf767e064f96ec60072b213c895cc53c511d40ef0d3593b71b32507a3f3"} Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.788701 4707 scope.go:117] "RemoveContainer" containerID="889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626" Jan 21 
16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.788787 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.815145 4707 scope.go:117] "RemoveContainer" containerID="a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.824193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.832202 4707 scope.go:117] "RemoveContainer" containerID="376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.833881 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.840318 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.840718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-central-agent" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.840736 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-central-agent" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.840745 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-notification-agent" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.840752 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-notification-agent" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.840769 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="sg-core" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.840775 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="sg-core" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.840800 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="proxy-httpd" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.840824 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="proxy-httpd" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.841001 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-notification-agent" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.841018 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="ceilometer-central-agent" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.841036 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="sg-core" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.841047 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" containerName="proxy-httpd" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.842425 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.843703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.847137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.847234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.865929 4707 scope.go:117] "RemoveContainer" containerID="bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-run-httpd\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-log-httpd\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-config-data\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mq9\" (UniqueName: \"kubernetes.io/projected/951167cd-6411-4b71-a8b8-2da2745eee6e-kube-api-access-h2mq9\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-scripts\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.868975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.881914 4707 scope.go:117] 
"RemoveContainer" containerID="889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.882239 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626\": container with ID starting with 889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626 not found: ID does not exist" containerID="889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882263 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626"} err="failed to get container status \"889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626\": rpc error: code = NotFound desc = could not find container \"889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626\": container with ID starting with 889d48eed05e21be40e51940762da52d3cccf14a37b9844d1eca23500a0e2626 not found: ID does not exist" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882280 4707 scope.go:117] "RemoveContainer" containerID="a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.882489 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42\": container with ID starting with a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42 not found: ID does not exist" containerID="a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882506 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42"} err="failed to get container status \"a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42\": rpc error: code = NotFound desc = could not find container \"a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42\": container with ID starting with a262052f942ebcc15e02cfd437155b865b115f7e34f12b0dd14056cacba59c42 not found: ID does not exist" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882519 4707 scope.go:117] "RemoveContainer" containerID="376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.882705 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81\": container with ID starting with 376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81 not found: ID does not exist" containerID="376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882725 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81"} err="failed to get container status \"376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81\": rpc error: code = NotFound desc = could not find container \"376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81\": container with ID starting with 
376a0e2904a990f578ccc6a43463f29f3deeeb72e18fca9c1de9451661e8cb81 not found: ID does not exist" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882736 4707 scope.go:117] "RemoveContainer" containerID="bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93" Jan 21 16:08:52 crc kubenswrapper[4707]: E0121 16:08:52.882957 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93\": container with ID starting with bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93 not found: ID does not exist" containerID="bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.882979 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93"} err="failed to get container status \"bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93\": rpc error: code = NotFound desc = could not find container \"bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93\": container with ID starting with bc7365ac6aca7e43a0dad940bc19ce3313cdb3c29e87be335e996b19d381fc93 not found: ID does not exist" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.969968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-run-httpd\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-log-httpd\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-config-data\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-run-httpd\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mq9\" (UniqueName: \"kubernetes.io/projected/951167cd-6411-4b71-a8b8-2da2745eee6e-kube-api-access-h2mq9\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc 
kubenswrapper[4707]: I0121 16:08:52.970593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-scripts\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.970544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-log-httpd\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.974038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.974486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.974993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-scripts\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.982526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-config-data\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:52 crc kubenswrapper[4707]: I0121 16:08:52.983074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mq9\" (UniqueName: \"kubernetes.io/projected/951167cd-6411-4b71-a8b8-2da2745eee6e-kube-api-access-h2mq9\") pod \"ceilometer-0\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:53 crc kubenswrapper[4707]: I0121 16:08:53.158493 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:53 crc kubenswrapper[4707]: I0121 16:08:53.192726 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c0b03e-0ca0-4250-b374-7070f8bd1699" path="/var/lib/kubelet/pods/47c0b03e-0ca0-4250-b374-7070f8bd1699/volumes" Jan 21 16:08:53 crc kubenswrapper[4707]: I0121 16:08:53.542069 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:53 crc kubenswrapper[4707]: I0121 16:08:53.796345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerStarted","Data":"8d14d3a6287462e382075e14957228c8fd2c4e8dd6409d0ab81805606449e236"} Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.065968 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.085552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-combined-ca-bundle\") pod \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.085589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-config-data\") pod \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.085653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db79d\" (UniqueName: \"kubernetes.io/projected/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-kube-api-access-db79d\") pod \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.085729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-scripts\") pod \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\" (UID: \"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4\") " Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.099949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-kube-api-access-db79d" (OuterVolumeSpecName: "kube-api-access-db79d") pod "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" (UID: "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4"). InnerVolumeSpecName "kube-api-access-db79d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.100045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-scripts" (OuterVolumeSpecName: "scripts") pod "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" (UID: "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.109988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" (UID: "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.122456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-config-data" (OuterVolumeSpecName: "config-data") pod "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" (UID: "1ede7fec-2e6f-47e1-9653-a55e80d7c9c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.187381 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.187410 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.187421 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.187431 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db79d\" (UniqueName: \"kubernetes.io/projected/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4-kube-api-access-db79d\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.806133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerStarted","Data":"b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76"} Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.807603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" event={"ID":"1ede7fec-2e6f-47e1-9653-a55e80d7c9c4","Type":"ContainerDied","Data":"1a5d02dceb34c23d2099d667f573518f216ae8709db5d542874d4433b7bade12"} Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.807668 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5d02dceb34c23d2099d667f573518f216ae8709db5d542874d4433b7bade12" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.807631 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.853833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:08:54 crc kubenswrapper[4707]: E0121 16:08:54.854185 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" containerName="nova-cell0-conductor-db-sync" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.854212 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" containerName="nova-cell0-conductor-db-sync" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.854367 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" containerName="nova-cell0-conductor-db-sync" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.854866 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.856268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-h8dbq" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.856589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.861800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.898678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7m8f\" (UniqueName: \"kubernetes.io/projected/ef2c2b15-f4e1-4409-bbe8-f929ecced405-kube-api-access-t7m8f\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.898732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.898803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.999762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7m8f\" (UniqueName: \"kubernetes.io/projected/ef2c2b15-f4e1-4409-bbe8-f929ecced405-kube-api-access-t7m8f\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.999806 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:54 crc kubenswrapper[4707]: I0121 16:08:54.999883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.007381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.007421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.032778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7m8f\" (UniqueName: \"kubernetes.io/projected/ef2c2b15-f4e1-4409-bbe8-f929ecced405-kube-api-access-t7m8f\") pod \"nova-cell0-conductor-0\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.158238 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.158629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-log" containerID="cri-o://4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db" gracePeriod=30 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.158747 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-httpd" containerID="cri-o://2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e" gracePeriod=30 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.182685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.595294 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:08:55 crc kubenswrapper[4707]: W0121 16:08:55.595898 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef2c2b15_f4e1_4409_bbe8_f929ecced405.slice/crio-285c195401b46e4290d98a6b978b15978b2dda7aa3713eedabc2cd4189bde514 WatchSource:0}: Error finding container 285c195401b46e4290d98a6b978b15978b2dda7aa3713eedabc2cd4189bde514: Status 404 returned error can't find the container with id 285c195401b46e4290d98a6b978b15978b2dda7aa3713eedabc2cd4189bde514 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.636004 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.670270 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.670923 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-log" containerID="cri-o://fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b" gracePeriod=30 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.671289 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-httpd" containerID="cri-o://590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444" gracePeriod=30 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.815132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerStarted","Data":"456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e"} Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.815341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerStarted","Data":"7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2"} Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.816801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ef2c2b15-f4e1-4409-bbe8-f929ecced405","Type":"ContainerStarted","Data":"602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb"} Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.816838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ef2c2b15-f4e1-4409-bbe8-f929ecced405","Type":"ContainerStarted","Data":"285c195401b46e4290d98a6b978b15978b2dda7aa3713eedabc2cd4189bde514"} Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.816912 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" containerID="cri-o://602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" gracePeriod=30 Jan 21 16:08:55 crc 
kubenswrapper[4707]: I0121 16:08:55.817085 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.818915 4707 generic.go:334] "Generic (PLEG): container finished" podID="825bff4e-d6ba-4010-97b5-919025083000" containerID="fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b" exitCode=143 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.818963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"825bff4e-d6ba-4010-97b5-919025083000","Type":"ContainerDied","Data":"fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b"} Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.820985 4707 generic.go:334] "Generic (PLEG): container finished" podID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerID="4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db" exitCode=143 Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.821007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2f2511c6-b349-44ef-bd96-a121829bd0b8","Type":"ContainerDied","Data":"4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db"} Jan 21 16:08:55 crc kubenswrapper[4707]: I0121 16:08:55.830781 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.830773271 podStartE2EDuration="1.830773271s" podCreationTimestamp="2026-01-21 16:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:08:55.828177559 +0000 UTC m=+4033.009693782" watchObservedRunningTime="2026-01-21 16:08:55.830773271 +0000 UTC m=+4033.012289493" Jan 21 16:08:56 crc kubenswrapper[4707]: I0121 16:08:56.002391 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.840759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerStarted","Data":"9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f"} Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.841774 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.841471 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="proxy-httpd" containerID="cri-o://9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f" gracePeriod=30 Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.841098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-central-agent" containerID="cri-o://b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76" gracePeriod=30 Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.841494 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-notification-agent" 
containerID="cri-o://7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2" gracePeriod=30 Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.841485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="sg-core" containerID="cri-o://456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e" gracePeriod=30 Jan 21 16:08:57 crc kubenswrapper[4707]: I0121 16:08:57.873969 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.381206791 podStartE2EDuration="5.873952215s" podCreationTimestamp="2026-01-21 16:08:52 +0000 UTC" firstStartedPulling="2026-01-21 16:08:53.548200827 +0000 UTC m=+4030.729717049" lastFinishedPulling="2026-01-21 16:08:57.040946251 +0000 UTC m=+4034.222462473" observedRunningTime="2026-01-21 16:08:57.857136647 +0000 UTC m=+4035.038652868" watchObservedRunningTime="2026-01-21 16:08:57.873952215 +0000 UTC m=+4035.055468437" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.630078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-httpd-run\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-logs\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-combined-ca-bundle\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-scripts\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8x2\" (UniqueName: \"kubernetes.io/projected/2f2511c6-b349-44ef-bd96-a121829bd0b8-kube-api-access-cz8x2\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.670899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-config-data\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.671007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-public-tls-certs\") pod \"2f2511c6-b349-44ef-bd96-a121829bd0b8\" (UID: \"2f2511c6-b349-44ef-bd96-a121829bd0b8\") " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.671368 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.671569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-logs" (OuterVolumeSpecName: "logs") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.675661 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.682044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-scripts" (OuterVolumeSpecName: "scripts") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.682983 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2511c6-b349-44ef-bd96-a121829bd0b8-kube-api-access-cz8x2" (OuterVolumeSpecName: "kube-api-access-cz8x2") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "kube-api-access-cz8x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.698784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.717166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-config-data" (OuterVolumeSpecName: "config-data") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.723765 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f2511c6-b349-44ef-bd96-a121829bd0b8" (UID: "2f2511c6-b349-44ef-bd96-a121829bd0b8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773120 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773148 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773189 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773199 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz8x2\" (UniqueName: \"kubernetes.io/projected/2f2511c6-b349-44ef-bd96-a121829bd0b8-kube-api-access-cz8x2\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773210 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773218 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2511c6-b349-44ef-bd96-a121829bd0b8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.773225 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f2511c6-b349-44ef-bd96-a121829bd0b8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.789404 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.848740 4707 generic.go:334] "Generic (PLEG): container finished" podID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerID="2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e" exitCode=0 Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.848803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" 
event={"ID":"2f2511c6-b349-44ef-bd96-a121829bd0b8","Type":"ContainerDied","Data":"2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e"} Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.848851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2f2511c6-b349-44ef-bd96-a121829bd0b8","Type":"ContainerDied","Data":"e332a430c43dc7ac34fdb89aa2984f6777afd57422de00a18c8c732bfb793028"} Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.848870 4707 scope.go:117] "RemoveContainer" containerID="2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.849428 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.851448 4707 generic.go:334] "Generic (PLEG): container finished" podID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerID="9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f" exitCode=0 Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.851475 4707 generic.go:334] "Generic (PLEG): container finished" podID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerID="456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e" exitCode=2 Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.851483 4707 generic.go:334] "Generic (PLEG): container finished" podID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerID="7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2" exitCode=0 Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.851500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerDied","Data":"9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f"} Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.851521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerDied","Data":"456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e"} Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.851530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerDied","Data":"7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2"} Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.868601 4707 scope.go:117] "RemoveContainer" containerID="4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.874472 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.877188 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.882538 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.897774 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:08:58 crc kubenswrapper[4707]: E0121 
16:08:58.898050 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-log" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.898068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-log" Jan 21 16:08:58 crc kubenswrapper[4707]: E0121 16:08:58.898081 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-httpd" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.898088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-httpd" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.898243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-log" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.898258 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" containerName="glance-httpd" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.899668 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.902311 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.902934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.910124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.921034 4707 scope.go:117] "RemoveContainer" containerID="2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e" Jan 21 16:08:58 crc kubenswrapper[4707]: E0121 16:08:58.921418 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e\": container with ID starting with 2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e not found: ID does not exist" containerID="2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.921446 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e"} err="failed to get container status \"2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e\": rpc error: code = NotFound desc = could not find container \"2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e\": container with ID starting with 2967078a53b6b6025b038337029eceb3cfff44518b1d237e7a1985baa731952e not found: ID does not exist" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.921464 4707 scope.go:117] "RemoveContainer" containerID="4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db" Jan 21 16:08:58 crc kubenswrapper[4707]: E0121 16:08:58.921674 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db\": container with ID starting with 4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db not found: ID does not exist" containerID="4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.921691 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db"} err="failed to get container status \"4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db\": rpc error: code = NotFound desc = could not find container \"4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db\": container with ID starting with 4610f8ad78527ba18f9d870c86cec2a0558acb0d667c9b637ff223fade1b24db not found: ID does not exist" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-logs\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75kzc\" (UniqueName: \"kubernetes.io/projected/d8bdcff4-af97-483e-80bb-21b36c870c0b-kube-api-access-75kzc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:58 crc kubenswrapper[4707]: I0121 16:08:58.976503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.078988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-logs\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75kzc\" (UniqueName: \"kubernetes.io/projected/d8bdcff4-af97-483e-80bb-21b36c870c0b-kube-api-access-75kzc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.079589 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.081141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-logs\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.082778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.082826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.082987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.084278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.098306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75kzc\" (UniqueName: \"kubernetes.io/projected/d8bdcff4-af97-483e-80bb-21b36c870c0b-kube-api-access-75kzc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.102297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.151306 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-internal-tls-certs\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-httpd-run\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-logs\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-scripts\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-config-data\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw9m7\" (UniqueName: \"kubernetes.io/projected/825bff4e-d6ba-4010-97b5-919025083000-kube-api-access-fw9m7\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.181944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.182025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-combined-ca-bundle\") pod \"825bff4e-d6ba-4010-97b5-919025083000\" (UID: \"825bff4e-d6ba-4010-97b5-919025083000\") " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.188908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-scripts" (OuterVolumeSpecName: "scripts") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.191223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.191447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-logs" (OuterVolumeSpecName: "logs") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.194237 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:08:59 crc kubenswrapper[4707]: E0121 16:08:59.194439 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.197207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.198488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825bff4e-d6ba-4010-97b5-919025083000-kube-api-access-fw9m7" (OuterVolumeSpecName: "kube-api-access-fw9m7") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "kube-api-access-fw9m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.217228 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.226356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.226530 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2511c6-b349-44ef-bd96-a121829bd0b8" path="/var/lib/kubelet/pods/2f2511c6-b349-44ef-bd96-a121829bd0b8/volumes" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.233823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-config-data" (OuterVolumeSpecName: "config-data") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.252058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "825bff4e-d6ba-4010-97b5-919025083000" (UID: "825bff4e-d6ba-4010-97b5-919025083000"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284562 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284585 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284596 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284604 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825bff4e-d6ba-4010-97b5-919025083000-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284611 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284619 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825bff4e-d6ba-4010-97b5-919025083000-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284627 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw9m7\" (UniqueName: \"kubernetes.io/projected/825bff4e-d6ba-4010-97b5-919025083000-kube-api-access-fw9m7\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.284645 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.299794 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 
16:08:59.386369 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.601566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:08:59 crc kubenswrapper[4707]: W0121 16:08:59.603710 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bdcff4_af97_483e_80bb_21b36c870c0b.slice/crio-e6480f04a38d5ddb2b8ba72eab165d69da1db7ca61b3918ceccae1298d8eb670 WatchSource:0}: Error finding container e6480f04a38d5ddb2b8ba72eab165d69da1db7ca61b3918ceccae1298d8eb670: Status 404 returned error can't find the container with id e6480f04a38d5ddb2b8ba72eab165d69da1db7ca61b3918ceccae1298d8eb670 Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.858896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8bdcff4-af97-483e-80bb-21b36c870c0b","Type":"ContainerStarted","Data":"e6480f04a38d5ddb2b8ba72eab165d69da1db7ca61b3918ceccae1298d8eb670"} Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.862556 4707 generic.go:334] "Generic (PLEG): container finished" podID="825bff4e-d6ba-4010-97b5-919025083000" containerID="590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444" exitCode=0 Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.862602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"825bff4e-d6ba-4010-97b5-919025083000","Type":"ContainerDied","Data":"590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444"} Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.862620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"825bff4e-d6ba-4010-97b5-919025083000","Type":"ContainerDied","Data":"9994bd28d85b012fa6f57829f1010eda91e909a2125204c95d0315bf89a231d0"} Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.862634 4707 scope.go:117] "RemoveContainer" containerID="590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.862707 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.886468 4707 scope.go:117] "RemoveContainer" containerID="fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.907549 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.914837 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.916549 4707 scope.go:117] "RemoveContainer" containerID="590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444" Jan 21 16:08:59 crc kubenswrapper[4707]: E0121 16:08:59.917601 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444\": container with ID starting with 590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444 not found: ID does not exist" containerID="590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.917630 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444"} err="failed to get container status \"590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444\": rpc error: code = NotFound desc = could not find container \"590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444\": container with ID starting with 590a341c737a303afdce29dbf69850fcaec8f8f39cdf0fcbff70b66baec87444 not found: ID does not exist" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.917649 4707 scope.go:117] "RemoveContainer" containerID="fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.921556 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:08:59 crc kubenswrapper[4707]: E0121 16:08:59.922498 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-httpd" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.922517 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-httpd" Jan 21 16:08:59 crc kubenswrapper[4707]: E0121 16:08:59.923347 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-log" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.923361 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-log" Jan 21 16:08:59 crc kubenswrapper[4707]: E0121 16:08:59.923486 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b\": container with ID starting with fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b not found: ID does not exist" containerID="fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.923518 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b"} err="failed to get container status \"fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b\": rpc error: code = NotFound desc = could not find container \"fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b\": container with ID starting with fb6286e6e76060a6ed72012ed9b7dc520650fc4b8a1625b609ceea77228a710b not found: ID does not exist" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.923545 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-httpd" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.923571 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="825bff4e-d6ba-4010-97b5-919025083000" containerName="glance-log" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.924401 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.927441 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.927789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:08:59 crc kubenswrapper[4707]: I0121 16:08:59.927953 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.003682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.003745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.003818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.003856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpt8\" (UniqueName: \"kubernetes.io/projected/dd534564-de96-44c7-9026-a55d66da26ef-kube-api-access-7vpt8\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.003914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.004037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.004172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.004213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpt8\" (UniqueName: \"kubernetes.io/projected/dd534564-de96-44c7-9026-a55d66da26ef-kube-api-access-7vpt8\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105595 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.105591 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.106080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.106195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.108094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.108651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.109619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.116696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.117946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpt8\" (UniqueName: \"kubernetes.io/projected/dd534564-de96-44c7-9026-a55d66da26ef-kube-api-access-7vpt8\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.123749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: E0121 16:09:00.185518 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:00 crc kubenswrapper[4707]: E0121 16:09:00.187081 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:00 crc kubenswrapper[4707]: E0121 16:09:00.188340 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:00 crc kubenswrapper[4707]: E0121 16:09:00.188367 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.239058 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.614943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.873637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8bdcff4-af97-483e-80bb-21b36c870c0b","Type":"ContainerStarted","Data":"7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a"} Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.873675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8bdcff4-af97-483e-80bb-21b36c870c0b","Type":"ContainerStarted","Data":"117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0"} Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.874601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dd534564-de96-44c7-9026-a55d66da26ef","Type":"ContainerStarted","Data":"5c1fd8a000f323e786259549c867c5f97ba05b1eff8e3a671c2199832f2a107b"} Jan 21 16:09:00 crc kubenswrapper[4707]: I0121 16:09:00.889277 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.889263478 podStartE2EDuration="2.889263478s" podCreationTimestamp="2026-01-21 16:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:00.886726056 +0000 UTC m=+4038.068242278" watchObservedRunningTime="2026-01-21 16:09:00.889263478 +0000 UTC m=+4038.070779699" Jan 21 16:09:01 crc kubenswrapper[4707]: I0121 16:09:01.191133 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="825bff4e-d6ba-4010-97b5-919025083000" path="/var/lib/kubelet/pods/825bff4e-d6ba-4010-97b5-919025083000/volumes" Jan 21 16:09:01 crc kubenswrapper[4707]: I0121 16:09:01.884045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dd534564-de96-44c7-9026-a55d66da26ef","Type":"ContainerStarted","Data":"240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60"} Jan 21 16:09:01 crc kubenswrapper[4707]: I0121 16:09:01.884081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dd534564-de96-44c7-9026-a55d66da26ef","Type":"ContainerStarted","Data":"c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5"} Jan 21 16:09:01 crc kubenswrapper[4707]: I0121 16:09:01.898151 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.8981389220000002 podStartE2EDuration="2.898138922s" podCreationTimestamp="2026-01-21 16:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:01.896638843 +0000 UTC m=+4039.078155065" watchObservedRunningTime="2026-01-21 16:09:01.898138922 +0000 UTC m=+4039.079655145" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.410029 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-combined-ca-bundle\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2mq9\" (UniqueName: \"kubernetes.io/projected/951167cd-6411-4b71-a8b8-2da2745eee6e-kube-api-access-h2mq9\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-config-data\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-sg-core-conf-yaml\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-scripts\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-log-httpd\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.541755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-run-httpd\") pod \"951167cd-6411-4b71-a8b8-2da2745eee6e\" (UID: \"951167cd-6411-4b71-a8b8-2da2745eee6e\") " Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.542136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.542379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.545610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-scripts" (OuterVolumeSpecName: "scripts") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.550990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951167cd-6411-4b71-a8b8-2da2745eee6e-kube-api-access-h2mq9" (OuterVolumeSpecName: "kube-api-access-h2mq9") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "kube-api-access-h2mq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.643830 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.643855 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.643865 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/951167cd-6411-4b71-a8b8-2da2745eee6e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.643873 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2mq9\" (UniqueName: \"kubernetes.io/projected/951167cd-6411-4b71-a8b8-2da2745eee6e-kube-api-access-h2mq9\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.750584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.781946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.795416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-config-data" (OuterVolumeSpecName: "config-data") pod "951167cd-6411-4b71-a8b8-2da2745eee6e" (UID: "951167cd-6411-4b71-a8b8-2da2745eee6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.845872 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.845896 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.845905 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/951167cd-6411-4b71-a8b8-2da2745eee6e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.892152 4707 generic.go:334] "Generic (PLEG): container finished" podID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerID="b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76" exitCode=0 Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.892220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.892254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerDied","Data":"b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76"} Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.892290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"951167cd-6411-4b71-a8b8-2da2745eee6e","Type":"ContainerDied","Data":"8d14d3a6287462e382075e14957228c8fd2c4e8dd6409d0ab81805606449e236"} Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.892307 4707 scope.go:117] "RemoveContainer" containerID="9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.908005 4707 scope.go:117] "RemoveContainer" containerID="456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.918371 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.923032 4707 scope.go:117] "RemoveContainer" containerID="7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.929577 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.936017 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.942534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="sg-core" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942556 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="sg-core" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.942572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-notification-agent" Jan 21 16:09:02 crc 
kubenswrapper[4707]: I0121 16:09:02.942578 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-notification-agent" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.942591 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-central-agent" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-central-agent" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.942612 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="proxy-httpd" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942617 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="proxy-httpd" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942762 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-central-agent" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942776 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="sg-core" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942784 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="proxy-httpd" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.942796 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" containerName="ceilometer-notification-agent" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.944078 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.944165 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.945904 4707 scope.go:117] "RemoveContainer" containerID="b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.945990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.949248 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.967756 4707 scope.go:117] "RemoveContainer" containerID="9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.968245 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f\": container with ID starting with 9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f not found: ID does not exist" containerID="9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.968276 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f"} err="failed to get container status \"9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f\": rpc error: code = NotFound desc = could not find container \"9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f\": container with ID starting with 9dd4085122415ad0c60c67246ba789b65a90f5a8578079cce72d447a3be8a83f not found: ID does not exist" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.968299 4707 scope.go:117] "RemoveContainer" containerID="456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.968766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e\": container with ID starting with 456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e not found: ID does not exist" containerID="456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.968791 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e"} err="failed to get container status \"456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e\": rpc error: code = NotFound desc = could not find container \"456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e\": container with ID starting with 456a813b84e12c00b05aeabf6ff1d1b3c74bc831ef4ad293af53163cc7e1715e not found: ID does not exist" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.968881 4707 scope.go:117] "RemoveContainer" containerID="7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.969275 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2\": container with ID starting with 
7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2 not found: ID does not exist" containerID="7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.969302 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2"} err="failed to get container status \"7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2\": rpc error: code = NotFound desc = could not find container \"7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2\": container with ID starting with 7d1ec51cbfd255e26b73856f379f4073d91c0cee598b08bbb24696b3e368eee2 not found: ID does not exist" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.969336 4707 scope.go:117] "RemoveContainer" containerID="b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76" Jan 21 16:09:02 crc kubenswrapper[4707]: E0121 16:09:02.969877 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76\": container with ID starting with b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76 not found: ID does not exist" containerID="b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76" Jan 21 16:09:02 crc kubenswrapper[4707]: I0121 16:09:02.969922 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76"} err="failed to get container status \"b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76\": rpc error: code = NotFound desc = could not find container \"b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76\": container with ID starting with b43f3e2e242f693bf64df49e2ad4f15b1e55c2bda4da38077f1b2f4ee295ab76 not found: ID does not exist" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.047837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-config-data\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.047967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-scripts\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.047994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-run-httpd\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.048179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbr8g\" (UniqueName: \"kubernetes.io/projected/b57d26d0-b96a-4d62-94cf-ddb928148048-kube-api-access-xbr8g\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc 
kubenswrapper[4707]: I0121 16:09:03.048288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.048342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.048376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-log-httpd\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.149840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbr8g\" (UniqueName: \"kubernetes.io/projected/b57d26d0-b96a-4d62-94cf-ddb928148048-kube-api-access-xbr8g\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.149891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.149920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.149945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-log-httpd\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.149959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-config-data\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.150002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-scripts\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.150017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-run-httpd\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.150830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-log-httpd\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.150851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-run-httpd\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.189531 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951167cd-6411-4b71-a8b8-2da2745eee6e" path="/var/lib/kubelet/pods/951167cd-6411-4b71-a8b8-2da2745eee6e/volumes" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.548584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.548937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbr8g\" (UniqueName: \"kubernetes.io/projected/b57d26d0-b96a-4d62-94cf-ddb928148048-kube-api-access-xbr8g\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.548992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-config-data\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.548717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.549098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-scripts\") pod \"ceilometer-0\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.567902 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:03 crc kubenswrapper[4707]: I0121 16:09:03.930537 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:04 crc kubenswrapper[4707]: I0121 16:09:04.908746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerStarted","Data":"fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c"} Jan 21 16:09:04 crc kubenswrapper[4707]: I0121 16:09:04.908979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerStarted","Data":"83ca996af47aab531e29283493076da3ddb49d5a8113893bb100408770c71a20"} Jan 21 16:09:05 crc kubenswrapper[4707]: E0121 16:09:05.186464 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:05 crc kubenswrapper[4707]: E0121 16:09:05.187551 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:05 crc kubenswrapper[4707]: E0121 16:09:05.188771 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:05 crc kubenswrapper[4707]: E0121 16:09:05.188802 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:05 crc kubenswrapper[4707]: I0121 16:09:05.916414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerStarted","Data":"fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74"} Jan 21 16:09:05 crc kubenswrapper[4707]: I0121 16:09:05.916604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerStarted","Data":"69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84"} Jan 21 16:09:07 crc kubenswrapper[4707]: I0121 16:09:07.933859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerStarted","Data":"76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8"} Jan 21 16:09:07 crc kubenswrapper[4707]: I0121 16:09:07.934222 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:07 crc 
kubenswrapper[4707]: I0121 16:09:07.951286 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.795054453 podStartE2EDuration="5.951272646s" podCreationTimestamp="2026-01-21 16:09:02 +0000 UTC" firstStartedPulling="2026-01-21 16:09:03.932164797 +0000 UTC m=+4041.113681019" lastFinishedPulling="2026-01-21 16:09:07.08838299 +0000 UTC m=+4044.269899212" observedRunningTime="2026-01-21 16:09:07.947804433 +0000 UTC m=+4045.129320675" watchObservedRunningTime="2026-01-21 16:09:07.951272646 +0000 UTC m=+4045.132788867" Jan 21 16:09:09 crc kubenswrapper[4707]: I0121 16:09:09.218547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:09 crc kubenswrapper[4707]: I0121 16:09:09.218754 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:09 crc kubenswrapper[4707]: I0121 16:09:09.469503 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:09 crc kubenswrapper[4707]: I0121 16:09:09.562454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:09 crc kubenswrapper[4707]: I0121 16:09:09.947090 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:09 crc kubenswrapper[4707]: I0121 16:09:09.947122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:10 crc kubenswrapper[4707]: E0121 16:09:10.186455 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:10 crc kubenswrapper[4707]: E0121 16:09:10.187906 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:10 crc kubenswrapper[4707]: E0121 16:09:10.188929 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:10 crc kubenswrapper[4707]: E0121 16:09:10.188955 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:10 crc kubenswrapper[4707]: I0121 16:09:10.240244 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 
16:09:10 crc kubenswrapper[4707]: I0121 16:09:10.240304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:10 crc kubenswrapper[4707]: I0121 16:09:10.264609 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:10 crc kubenswrapper[4707]: I0121 16:09:10.275945 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:10 crc kubenswrapper[4707]: I0121 16:09:10.952788 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:10 crc kubenswrapper[4707]: I0121 16:09:10.952836 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:11 crc kubenswrapper[4707]: I0121 16:09:11.501635 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:11 crc kubenswrapper[4707]: I0121 16:09:11.501856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:09:12 crc kubenswrapper[4707]: I0121 16:09:12.182516 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:09:12 crc kubenswrapper[4707]: E0121 16:09:12.183062 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:09:12 crc kubenswrapper[4707]: I0121 16:09:12.506667 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:12 crc kubenswrapper[4707]: I0121 16:09:12.543215 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:09:15 crc kubenswrapper[4707]: E0121 16:09:15.184787 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:15 crc kubenswrapper[4707]: E0121 16:09:15.185976 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:15 crc kubenswrapper[4707]: E0121 16:09:15.187184 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:15 crc kubenswrapper[4707]: E0121 16:09:15.187235 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:20 crc kubenswrapper[4707]: E0121 16:09:20.185028 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:20 crc kubenswrapper[4707]: E0121 16:09:20.186849 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:20 crc kubenswrapper[4707]: E0121 16:09:20.188032 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:20 crc kubenswrapper[4707]: E0121 16:09:20.188067 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:23 crc kubenswrapper[4707]: I0121 16:09:23.186715 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:09:23 crc kubenswrapper[4707]: E0121 16:09:23.187121 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:09:25 crc kubenswrapper[4707]: E0121 16:09:25.184205 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:25 crc kubenswrapper[4707]: E0121 16:09:25.192864 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:25 crc kubenswrapper[4707]: E0121 16:09:25.194752 
4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:09:25 crc kubenswrapper[4707]: E0121 16:09:25.194835 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.043709 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" exitCode=137 Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.044000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ef2c2b15-f4e1-4409-bbe8-f929ecced405","Type":"ContainerDied","Data":"602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb"} Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.122213 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.203448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-combined-ca-bundle\") pod \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.203571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-config-data\") pod \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.203660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7m8f\" (UniqueName: \"kubernetes.io/projected/ef2c2b15-f4e1-4409-bbe8-f929ecced405-kube-api-access-t7m8f\") pod \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\" (UID: \"ef2c2b15-f4e1-4409-bbe8-f929ecced405\") " Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.207444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2c2b15-f4e1-4409-bbe8-f929ecced405-kube-api-access-t7m8f" (OuterVolumeSpecName: "kube-api-access-t7m8f") pod "ef2c2b15-f4e1-4409-bbe8-f929ecced405" (UID: "ef2c2b15-f4e1-4409-bbe8-f929ecced405"). InnerVolumeSpecName "kube-api-access-t7m8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.222513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-config-data" (OuterVolumeSpecName: "config-data") pod "ef2c2b15-f4e1-4409-bbe8-f929ecced405" (UID: "ef2c2b15-f4e1-4409-bbe8-f929ecced405"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.223131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef2c2b15-f4e1-4409-bbe8-f929ecced405" (UID: "ef2c2b15-f4e1-4409-bbe8-f929ecced405"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.305725 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.305798 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2c2b15-f4e1-4409-bbe8-f929ecced405-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:26 crc kubenswrapper[4707]: I0121 16:09:26.305852 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7m8f\" (UniqueName: \"kubernetes.io/projected/ef2c2b15-f4e1-4409-bbe8-f929ecced405-kube-api-access-t7m8f\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.051124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ef2c2b15-f4e1-4409-bbe8-f929ecced405","Type":"ContainerDied","Data":"285c195401b46e4290d98a6b978b15978b2dda7aa3713eedabc2cd4189bde514"} Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.051168 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.051367 4707 scope.go:117] "RemoveContainer" containerID="602386b839d5a9282e1c58163477bd88c882106e1138a4eeae18b234c74dfabb" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.077798 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.081555 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.090365 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:09:27 crc kubenswrapper[4707]: E0121 16:09:27.090929 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.090996 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.091242 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" containerName="nova-cell0-conductor-conductor" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.091958 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.094447 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.095305 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-h8dbq" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.096234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.190079 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2c2b15-f4e1-4409-bbe8-f929ecced405" path="/var/lib/kubelet/pods/ef2c2b15-f4e1-4409-bbe8-f929ecced405/volumes" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.219096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchtl\" (UniqueName: \"kubernetes.io/projected/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-kube-api-access-tchtl\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.219196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.219216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.320970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchtl\" (UniqueName: \"kubernetes.io/projected/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-kube-api-access-tchtl\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.321273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.321322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.327399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.328284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.340231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchtl\" (UniqueName: \"kubernetes.io/projected/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-kube-api-access-tchtl\") pod \"nova-cell0-conductor-0\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.404961 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:27 crc kubenswrapper[4707]: I0121 16:09:27.764608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:09:28 crc kubenswrapper[4707]: I0121 16:09:28.060607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151","Type":"ContainerStarted","Data":"6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a"} Jan 21 16:09:28 crc kubenswrapper[4707]: I0121 16:09:28.060820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151","Type":"ContainerStarted","Data":"ceb6df4d206d8fe9f2969327f5ec824ca7c4afbfa0367adfd142e01b31aa6440"} Jan 21 16:09:28 crc kubenswrapper[4707]: I0121 16:09:28.063595 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:28 crc kubenswrapper[4707]: I0121 16:09:28.087035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.087020919 podStartE2EDuration="1.087020919s" podCreationTimestamp="2026-01-21 16:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:28.07951254 +0000 UTC m=+4065.261028762" watchObservedRunningTime="2026-01-21 16:09:28.087020919 +0000 UTC m=+4065.268537141" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.424676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.797614 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz"] Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.798608 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.804074 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.804201 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.804992 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz"] Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.908517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ktn\" (UniqueName: \"kubernetes.io/projected/ca088bd1-a4bf-47c4-ba45-d25107adb619-kube-api-access-m9ktn\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.908552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-scripts\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.908573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-config-data\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:32 crc kubenswrapper[4707]: I0121 16:09:32.908601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.010527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9ktn\" (UniqueName: \"kubernetes.io/projected/ca088bd1-a4bf-47c4-ba45-d25107adb619-kube-api-access-m9ktn\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.010746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-scripts\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.010947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-config-data\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 
16:09:33.011052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.016284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-scripts\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.019317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-config-data\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.020844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.063625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ktn\" (UniqueName: \"kubernetes.io/projected/ca088bd1-a4bf-47c4-ba45-d25107adb619-kube-api-access-m9ktn\") pod \"nova-cell0-cell-mapping-sc8lz\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.082724 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.092826 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.103120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.120796 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.136210 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.216855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7446\" (UniqueName: \"kubernetes.io/projected/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-kube-api-access-n7446\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.216983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.217096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-config-data\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.217117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-logs\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.257197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.258442 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.268219 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.277558 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.278610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.281128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.293626 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.308890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.311142 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.312180 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.314241 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.319252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7446\" (UniqueName: \"kubernetes.io/projected/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-kube-api-access-n7446\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.319387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.319508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-config-data\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.319543 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-logs\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.320571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-logs\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.325632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.328338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.331437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-config-data\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.341242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7446\" (UniqueName: \"kubernetes.io/projected/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-kube-api-access-n7446\") pod \"nova-api-0\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-logs\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kqd\" (UniqueName: \"kubernetes.io/projected/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-kube-api-access-g6kqd\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqp62\" (UniqueName: \"kubernetes.io/projected/ea121fac-9d87-4988-bb58-e5b18100388f-kube-api-access-zqp62\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-config-data\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-config-data\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.421693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8pzkm\" (UniqueName: \"kubernetes.io/projected/54baaa16-b753-4cd6-a66b-87a57aed3bff-kube-api-access-8pzkm\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.429380 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqp62\" (UniqueName: \"kubernetes.io/projected/ea121fac-9d87-4988-bb58-e5b18100388f-kube-api-access-zqp62\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-config-data\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-config-data\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pzkm\" (UniqueName: \"kubernetes.io/projected/54baaa16-b753-4cd6-a66b-87a57aed3bff-kube-api-access-8pzkm\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-logs\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.524988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6kqd\" (UniqueName: \"kubernetes.io/projected/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-kube-api-access-g6kqd\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.525017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.525042 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.525061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.525445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-logs\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.528768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-config-data\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.532472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.533875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.540475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.542991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqp62\" (UniqueName: \"kubernetes.io/projected/ea121fac-9d87-4988-bb58-e5b18100388f-kube-api-access-zqp62\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.543577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-config-data\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.546140 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pzkm\" (UniqueName: 
\"kubernetes.io/projected/54baaa16-b753-4cd6-a66b-87a57aed3bff-kube-api-access-8pzkm\") pod \"nova-scheduler-0\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.546394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6kqd\" (UniqueName: \"kubernetes.io/projected/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-kube-api-access-g6kqd\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.551313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.573823 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.582209 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.607972 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.643553 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.661141 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.757121 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.758342 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.766344 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.766571 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.790313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.831010 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.934800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-config-data\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.934867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.934919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-scripts\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:33 crc kubenswrapper[4707]: I0121 16:09:33.934976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnh72\" (UniqueName: \"kubernetes.io/projected/7a0b2b2e-2a3d-4b99-874c-08306819d928-kube-api-access-rnh72\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.036693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnh72\" (UniqueName: \"kubernetes.io/projected/7a0b2b2e-2a3d-4b99-874c-08306819d928-kube-api-access-rnh72\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.036860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-config-data\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.036905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.036962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-scripts\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.040325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-scripts\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.040647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-config-data\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.041129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.050341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnh72\" (UniqueName: \"kubernetes.io/projected/7a0b2b2e-2a3d-4b99-874c-08306819d928-kube-api-access-rnh72\") pod \"nova-cell1-conductor-db-sync-mzglm\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.081668 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:34 crc kubenswrapper[4707]: W0121 16:09:34.110634 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7a2c821_67b0_4d2b_8fd3_b506ac9db599.slice/crio-2bbb74f591c32a1249c52838c0bfebe2286aca91087aeeec4484011e26eb523c WatchSource:0}: Error finding container 2bbb74f591c32a1249c52838c0bfebe2286aca91087aeeec4484011e26eb523c: Status 404 returned error can't find the container with id 2bbb74f591c32a1249c52838c0bfebe2286aca91087aeeec4484011e26eb523c Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.117720 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.117757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239","Type":"ContainerStarted","Data":"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d"} Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.117792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239","Type":"ContainerStarted","Data":"af241f248c25e27ac3a11eca3e8d147c27c5c51ebfd237b1796b9ab719f57873"} Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.123878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" event={"ID":"ca088bd1-a4bf-47c4-ba45-d25107adb619","Type":"ContainerStarted","Data":"1d299162f4afa98964d4ec11db17d73e66c0c0a221338dbed9777d7db58d8b6f"} Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.123916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" event={"ID":"ca088bd1-a4bf-47c4-ba45-d25107adb619","Type":"ContainerStarted","Data":"2965177a0d1938b1c544871e6f10bb0d1c0d3e14f79b6c6b30ef47c35e577074"} Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.144533 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" podStartSLOduration=2.144519959 podStartE2EDuration="2.144519959s" podCreationTimestamp="2026-01-21 16:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:34.140676693 +0000 UTC m=+4071.322192914" watchObservedRunningTime="2026-01-21 16:09:34.144519959 +0000 UTC m=+4071.326036181" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.184139 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:09:34 crc kubenswrapper[4707]: E0121 16:09:34.184730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.243624 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 16:09:34.292407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:34 crc kubenswrapper[4707]: W0121 16:09:34.312084 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea121fac_9d87_4988_bb58_e5b18100388f.slice/crio-0753cb2a74bc606558ec3dd5236793aed64d90ddeca08689bf1fc836acb8c496 WatchSource:0}: Error finding container 0753cb2a74bc606558ec3dd5236793aed64d90ddeca08689bf1fc836acb8c496: Status 404 returned error can't find the container with id 0753cb2a74bc606558ec3dd5236793aed64d90ddeca08689bf1fc836acb8c496 Jan 21 16:09:34 crc kubenswrapper[4707]: I0121 
16:09:34.571313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm"] Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.132004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" event={"ID":"7a0b2b2e-2a3d-4b99-874c-08306819d928","Type":"ContainerStarted","Data":"23373f1cfb2f8dcf2f285c27279eea29343e9ee4d18981e295e313a664f144f6"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.132500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" event={"ID":"7a0b2b2e-2a3d-4b99-874c-08306819d928","Type":"ContainerStarted","Data":"06e28c37fc6f9c87f25e64ddc4233256403a4cdb9b6f634ba735f836cb5b86e3"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.136341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d7a2c821-67b0-4d2b-8fd3-b506ac9db599","Type":"ContainerStarted","Data":"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.136367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d7a2c821-67b0-4d2b-8fd3-b506ac9db599","Type":"ContainerStarted","Data":"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.136378 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d7a2c821-67b0-4d2b-8fd3-b506ac9db599","Type":"ContainerStarted","Data":"2bbb74f591c32a1249c52838c0bfebe2286aca91087aeeec4484011e26eb523c"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.138974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239","Type":"ContainerStarted","Data":"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.141147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"54baaa16-b753-4cd6-a66b-87a57aed3bff","Type":"ContainerStarted","Data":"9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.141184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"54baaa16-b753-4cd6-a66b-87a57aed3bff","Type":"ContainerStarted","Data":"6cdcaed1bca8b87e9d06cc56b33b7bd61dc8a1b5bd7f9d597a47f685c52d9172"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.144352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ea121fac-9d87-4988-bb58-e5b18100388f","Type":"ContainerStarted","Data":"12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.144377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ea121fac-9d87-4988-bb58-e5b18100388f","Type":"ContainerStarted","Data":"0753cb2a74bc606558ec3dd5236793aed64d90ddeca08689bf1fc836acb8c496"} Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.162177 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" 
podStartSLOduration=2.162150256 podStartE2EDuration="2.162150256s" podCreationTimestamp="2026-01-21 16:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:35.150002606 +0000 UTC m=+4072.331518828" watchObservedRunningTime="2026-01-21 16:09:35.162150256 +0000 UTC m=+4072.343666479" Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.179924 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.179906634 podStartE2EDuration="2.179906634s" podCreationTimestamp="2026-01-21 16:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:35.173584226 +0000 UTC m=+4072.355100449" watchObservedRunningTime="2026-01-21 16:09:35.179906634 +0000 UTC m=+4072.361422857" Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.213826 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.219867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.231506 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.231489066 podStartE2EDuration="3.231489066s" podCreationTimestamp="2026-01-21 16:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:35.202950324 +0000 UTC m=+4072.384466547" watchObservedRunningTime="2026-01-21 16:09:35.231489066 +0000 UTC m=+4072.413005289" Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.232296 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.232290653 podStartE2EDuration="2.232290653s" podCreationTimestamp="2026-01-21 16:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:35.22524219 +0000 UTC m=+4072.406758411" watchObservedRunningTime="2026-01-21 16:09:35.232290653 +0000 UTC m=+4072.413806876" Jan 21 16:09:35 crc kubenswrapper[4707]: I0121 16:09:35.244827 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.244819311 podStartE2EDuration="2.244819311s" podCreationTimestamp="2026-01-21 16:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:35.238726535 +0000 UTC m=+4072.420242757" watchObservedRunningTime="2026-01-21 16:09:35.244819311 +0000 UTC m=+4072.426335534" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.168719 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a0b2b2e-2a3d-4b99-874c-08306819d928" containerID="23373f1cfb2f8dcf2f285c27279eea29343e9ee4d18981e295e313a664f144f6" exitCode=0 Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.168862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" 
event={"ID":"7a0b2b2e-2a3d-4b99-874c-08306819d928","Type":"ContainerDied","Data":"23373f1cfb2f8dcf2f285c27279eea29343e9ee4d18981e295e313a664f144f6"} Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.169240 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="ea121fac-9d87-4988-bb58-e5b18100388f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8" gracePeriod=30 Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.169513 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-metadata" containerID="cri-o://128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b" gracePeriod=30 Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.169523 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-log" containerID="cri-o://e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8" gracePeriod=30 Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.771054 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.771621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="38940cc9-cd7d-49b2-809d-7bc6b0facfea" containerName="kube-state-metrics" containerID="cri-o://0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c" gracePeriod=30 Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.781600 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.788551 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.827909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6kqd\" (UniqueName: \"kubernetes.io/projected/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-kube-api-access-g6kqd\") pod \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.828024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-combined-ca-bundle\") pod \"ea121fac-9d87-4988-bb58-e5b18100388f\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.828069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-config-data\") pod \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.828187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-combined-ca-bundle\") pod \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.828294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-config-data\") pod \"ea121fac-9d87-4988-bb58-e5b18100388f\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.828397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqp62\" (UniqueName: \"kubernetes.io/projected/ea121fac-9d87-4988-bb58-e5b18100388f-kube-api-access-zqp62\") pod \"ea121fac-9d87-4988-bb58-e5b18100388f\" (UID: \"ea121fac-9d87-4988-bb58-e5b18100388f\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.828528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-logs\") pod \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\" (UID: \"d7a2c821-67b0-4d2b-8fd3-b506ac9db599\") " Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.829472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-logs" (OuterVolumeSpecName: "logs") pod "d7a2c821-67b0-4d2b-8fd3-b506ac9db599" (UID: "d7a2c821-67b0-4d2b-8fd3-b506ac9db599"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.835316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea121fac-9d87-4988-bb58-e5b18100388f-kube-api-access-zqp62" (OuterVolumeSpecName: "kube-api-access-zqp62") pod "ea121fac-9d87-4988-bb58-e5b18100388f" (UID: "ea121fac-9d87-4988-bb58-e5b18100388f"). InnerVolumeSpecName "kube-api-access-zqp62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.840625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-kube-api-access-g6kqd" (OuterVolumeSpecName: "kube-api-access-g6kqd") pod "d7a2c821-67b0-4d2b-8fd3-b506ac9db599" (UID: "d7a2c821-67b0-4d2b-8fd3-b506ac9db599"). InnerVolumeSpecName "kube-api-access-g6kqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.864946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-config-data" (OuterVolumeSpecName: "config-data") pod "d7a2c821-67b0-4d2b-8fd3-b506ac9db599" (UID: "d7a2c821-67b0-4d2b-8fd3-b506ac9db599"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.864999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea121fac-9d87-4988-bb58-e5b18100388f" (UID: "ea121fac-9d87-4988-bb58-e5b18100388f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.867599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-config-data" (OuterVolumeSpecName: "config-data") pod "ea121fac-9d87-4988-bb58-e5b18100388f" (UID: "ea121fac-9d87-4988-bb58-e5b18100388f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.872705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7a2c821-67b0-4d2b-8fd3-b506ac9db599" (UID: "d7a2c821-67b0-4d2b-8fd3-b506ac9db599"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.930936 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.931419 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.932356 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqp62\" (UniqueName: \"kubernetes.io/projected/ea121fac-9d87-4988-bb58-e5b18100388f-kube-api-access-zqp62\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.932432 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.932495 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6kqd\" (UniqueName: \"kubernetes.io/projected/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-kube-api-access-g6kqd\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.932556 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea121fac-9d87-4988-bb58-e5b18100388f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:37 crc kubenswrapper[4707]: I0121 16:09:37.932608 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a2c821-67b0-4d2b-8fd3-b506ac9db599-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.130958 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.185883 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca088bd1-a4bf-47c4-ba45-d25107adb619" containerID="1d299162f4afa98964d4ec11db17d73e66c0c0a221338dbed9777d7db58d8b6f" exitCode=0 Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.185961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" event={"ID":"ca088bd1-a4bf-47c4-ba45-d25107adb619","Type":"ContainerDied","Data":"1d299162f4afa98964d4ec11db17d73e66c0c0a221338dbed9777d7db58d8b6f"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.188095 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea121fac-9d87-4988-bb58-e5b18100388f" containerID="12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8" exitCode=0 Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.188146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ea121fac-9d87-4988-bb58-e5b18100388f","Type":"ContainerDied","Data":"12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.188181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ea121fac-9d87-4988-bb58-e5b18100388f","Type":"ContainerDied","Data":"0753cb2a74bc606558ec3dd5236793aed64d90ddeca08689bf1fc836acb8c496"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.188198 4707 scope.go:117] "RemoveContainer" containerID="12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.188296 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.190635 4707 generic.go:334] "Generic (PLEG): container finished" podID="38940cc9-cd7d-49b2-809d-7bc6b0facfea" containerID="0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c" exitCode=2 Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.190712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"38940cc9-cd7d-49b2-809d-7bc6b0facfea","Type":"ContainerDied","Data":"0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.190735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"38940cc9-cd7d-49b2-809d-7bc6b0facfea","Type":"ContainerDied","Data":"abd17ae86b2d3f2d6b805dac4f86ce4966cb592f642bbf2ef991fb4148f2301f"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.190777 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.194215 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerID="128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b" exitCode=0 Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.194239 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerID="e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8" exitCode=143 Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.194275 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.194274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d7a2c821-67b0-4d2b-8fd3-b506ac9db599","Type":"ContainerDied","Data":"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.194331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d7a2c821-67b0-4d2b-8fd3-b506ac9db599","Type":"ContainerDied","Data":"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.194347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"d7a2c821-67b0-4d2b-8fd3-b506ac9db599","Type":"ContainerDied","Data":"2bbb74f591c32a1249c52838c0bfebe2286aca91087aeeec4484011e26eb523c"} Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.242260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rfcg\" (UniqueName: \"kubernetes.io/projected/38940cc9-cd7d-49b2-809d-7bc6b0facfea-kube-api-access-5rfcg\") pod \"38940cc9-cd7d-49b2-809d-7bc6b0facfea\" (UID: \"38940cc9-cd7d-49b2-809d-7bc6b0facfea\") " Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.246588 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.248113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38940cc9-cd7d-49b2-809d-7bc6b0facfea-kube-api-access-5rfcg" (OuterVolumeSpecName: "kube-api-access-5rfcg") pod "38940cc9-cd7d-49b2-809d-7bc6b0facfea" (UID: "38940cc9-cd7d-49b2-809d-7bc6b0facfea"). InnerVolumeSpecName "kube-api-access-5rfcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.248249 4707 scope.go:117] "RemoveContainer" containerID="12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.252680 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8\": container with ID starting with 12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8 not found: ID does not exist" containerID="12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.252732 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8"} err="failed to get container status \"12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8\": rpc error: code = NotFound desc = could not find container \"12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8\": container with ID starting with 12cf5d1567ddb45b28955074c38c2b8e01888282365d41cd21419793c02fe9a8 not found: ID does not exist" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.252752 4707 scope.go:117] "RemoveContainer" containerID="0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.272344 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.297946 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.329319 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.345652 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rfcg\" (UniqueName: \"kubernetes.io/projected/38940cc9-cd7d-49b2-809d-7bc6b0facfea-kube-api-access-5rfcg\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.350966 4707 scope.go:117] "RemoveContainer" containerID="0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.352474 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c\": container with ID starting with 0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c not found: ID does not exist" containerID="0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.352504 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c"} err="failed to get container status \"0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c\": rpc error: code = NotFound desc = could not find container \"0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c\": container with ID starting with 0d81462173d09873e123fff99f0998ccf0bda7c0d4eb6c1725450796b3d1bc8c not found: ID does not exist" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.352526 
4707 scope.go:117] "RemoveContainer" containerID="128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.353847 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.354241 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea121fac-9d87-4988-bb58-e5b18100388f" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354533 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea121fac-9d87-4988-bb58-e5b18100388f" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.354561 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-log" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354568 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-log" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.354581 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38940cc9-cd7d-49b2-809d-7bc6b0facfea" containerName="kube-state-metrics" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354587 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="38940cc9-cd7d-49b2-809d-7bc6b0facfea" containerName="kube-state-metrics" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.354601 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-metadata" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354607 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-metadata" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354784 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea121fac-9d87-4988-bb58-e5b18100388f" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354799 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-log" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354825 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="38940cc9-cd7d-49b2-809d-7bc6b0facfea" containerName="kube-state-metrics" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.354836 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" containerName="nova-metadata-metadata" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.355463 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.359360 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.359549 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.359664 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.370470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.378575 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.380943 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.382854 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.382895 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.385178 4707 scope.go:117] "RemoveContainer" containerID="e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.392945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.407404 4707 scope.go:117] "RemoveContainer" containerID="128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.407750 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b\": container with ID starting with 128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b not found: ID does not exist" containerID="128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.407777 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b"} err="failed to get container status \"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b\": rpc error: code = NotFound desc = could not find container \"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b\": container with ID starting with 128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b not found: ID does not exist" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.407798 4707 scope.go:117] "RemoveContainer" containerID="e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.408151 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8\": 
container with ID starting with e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8 not found: ID does not exist" containerID="e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.408216 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8"} err="failed to get container status \"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8\": rpc error: code = NotFound desc = could not find container \"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8\": container with ID starting with e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8 not found: ID does not exist" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.408253 4707 scope.go:117] "RemoveContainer" containerID="128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.408600 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b"} err="failed to get container status \"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b\": rpc error: code = NotFound desc = could not find container \"128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b\": container with ID starting with 128b1b18ce0b4307e97333635eec6cff16645bb5fc6cd3d09c47868edad4762b not found: ID does not exist" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.408624 4707 scope.go:117] "RemoveContainer" containerID="e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.408921 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8"} err="failed to get container status \"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8\": rpc error: code = NotFound desc = could not find container \"e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8\": container with ID starting with e3dee11ca85cd5611012cb3d10e0d82d402780a19564f34a0c4fbb07260d96a8 not found: ID does not exist" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.446868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8phz\" (UniqueName: \"kubernetes.io/projected/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-kube-api-access-c8phz\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.446912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.446952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7386d13c-aac3-478b-8174-4e2f4be72e87-logs\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 
16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.446967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.446986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.447006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-config-data\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.447052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfd6z\" (UniqueName: \"kubernetes.io/projected/7386d13c-aac3-478b-8174-4e2f4be72e87-kube-api-access-jfd6z\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.447078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.447106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.447141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: E0121 16:09:38.466559 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7a2c821_67b0_4d2b_8fd3_b506ac9db599.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea121fac_9d87_4988_bb58_e5b18100388f.slice\": RecentStats: unable to find data in memory cache]" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.526207 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:09:38 crc 
kubenswrapper[4707]: I0121 16:09:38.535867 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7386d13c-aac3-478b-8174-4e2f4be72e87-logs\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-config-data\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfd6z\" (UniqueName: \"kubernetes.io/projected/7386d13c-aac3-478b-8174-4e2f4be72e87-kube-api-access-jfd6z\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8phz\" (UniqueName: \"kubernetes.io/projected/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-kube-api-access-c8phz\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.549394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.553898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.554966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-config-data\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.555127 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.557661 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.557895 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.558723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.559765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.561103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7386d13c-aac3-478b-8174-4e2f4be72e87-logs\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.562055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.562534 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.562862 4707 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.566072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.568047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.579772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfd6z\" (UniqueName: \"kubernetes.io/projected/7386d13c-aac3-478b-8174-4e2f4be72e87-kube-api-access-jfd6z\") pod \"nova-metadata-0\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.580489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8phz\" (UniqueName: \"kubernetes.io/projected/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-kube-api-access-c8phz\") pod \"nova-cell1-novncproxy-0\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.609257 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.625677 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.652604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-combined-ca-bundle\") pod \"7a0b2b2e-2a3d-4b99-874c-08306819d928\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.653014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72pnt\" (UniqueName: \"kubernetes.io/projected/10090d75-68eb-4597-98f6-b3f0282513e6-kube-api-access-72pnt\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.653098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.653234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.653547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.679727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a0b2b2e-2a3d-4b99-874c-08306819d928" (UID: "7a0b2b2e-2a3d-4b99-874c-08306819d928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.682203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.704246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.754841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnh72\" (UniqueName: \"kubernetes.io/projected/7a0b2b2e-2a3d-4b99-874c-08306819d928-kube-api-access-rnh72\") pod \"7a0b2b2e-2a3d-4b99-874c-08306819d928\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-scripts\") pod \"7a0b2b2e-2a3d-4b99-874c-08306819d928\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-config-data\") pod \"7a0b2b2e-2a3d-4b99-874c-08306819d928\" (UID: \"7a0b2b2e-2a3d-4b99-874c-08306819d928\") " Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72pnt\" (UniqueName: \"kubernetes.io/projected/10090d75-68eb-4597-98f6-b3f0282513e6-kube-api-access-72pnt\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.755998 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.762654 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.763165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.763242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-scripts" (OuterVolumeSpecName: "scripts") pod "7a0b2b2e-2a3d-4b99-874c-08306819d928" (UID: "7a0b2b2e-2a3d-4b99-874c-08306819d928"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.763328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0b2b2e-2a3d-4b99-874c-08306819d928-kube-api-access-rnh72" (OuterVolumeSpecName: "kube-api-access-rnh72") pod "7a0b2b2e-2a3d-4b99-874c-08306819d928" (UID: "7a0b2b2e-2a3d-4b99-874c-08306819d928"). InnerVolumeSpecName "kube-api-access-rnh72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.763970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.772449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72pnt\" (UniqueName: \"kubernetes.io/projected/10090d75-68eb-4597-98f6-b3f0282513e6-kube-api-access-72pnt\") pod \"kube-state-metrics-0\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.790390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-config-data" (OuterVolumeSpecName: "config-data") pod "7a0b2b2e-2a3d-4b99-874c-08306819d928" (UID: "7a0b2b2e-2a3d-4b99-874c-08306819d928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.857302 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnh72\" (UniqueName: \"kubernetes.io/projected/7a0b2b2e-2a3d-4b99-874c-08306819d928-kube-api-access-rnh72\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.857330 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.857341 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b2b2e-2a3d-4b99-874c-08306819d928-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:38 crc kubenswrapper[4707]: I0121 16:09:38.935692 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.118324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.194219 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38940cc9-cd7d-49b2-809d-7bc6b0facfea" path="/var/lib/kubelet/pods/38940cc9-cd7d-49b2-809d-7bc6b0facfea/volumes" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.194736 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a2c821-67b0-4d2b-8fd3-b506ac9db599" path="/var/lib/kubelet/pods/d7a2c821-67b0-4d2b-8fd3-b506ac9db599/volumes" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.195273 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea121fac-9d87-4988-bb58-e5b18100388f" path="/var/lib/kubelet/pods/ea121fac-9d87-4988-bb58-e5b18100388f/volumes" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.218732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" event={"ID":"7a0b2b2e-2a3d-4b99-874c-08306819d928","Type":"ContainerDied","Data":"06e28c37fc6f9c87f25e64ddc4233256403a4cdb9b6f634ba735f836cb5b86e3"} Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.218763 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06e28c37fc6f9c87f25e64ddc4233256403a4cdb9b6f634ba735f836cb5b86e3" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.218827 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm" Jan 21 16:09:39 crc kubenswrapper[4707]: W0121 16:09:39.221290 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7386d13c_aac3_478b_8174_4e2f4be72e87.slice/crio-3c59a20f524a87d6fc0f06255886fb5c08cc30df754cbf7af5a9fb10919cacf8 WatchSource:0}: Error finding container 3c59a20f524a87d6fc0f06255886fb5c08cc30df754cbf7af5a9fb10919cacf8: Status 404 returned error can't find the container with id 3c59a20f524a87d6fc0f06255886fb5c08cc30df754cbf7af5a9fb10919cacf8 Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.230195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.237099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef","Type":"ContainerStarted","Data":"e3017366764c70929229cc28608c44a1e8f46d0b5bb1dfa621d22fa38aa86f08"} Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.284165 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:09:39 crc kubenswrapper[4707]: E0121 16:09:39.284498 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0b2b2e-2a3d-4b99-874c-08306819d928" containerName="nova-cell1-conductor-db-sync" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.284515 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0b2b2e-2a3d-4b99-874c-08306819d928" containerName="nova-cell1-conductor-db-sync" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.284713 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0b2b2e-2a3d-4b99-874c-08306819d928" 
containerName="nova-cell1-conductor-db-sync" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.285259 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.301568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.301783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.412282 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.473767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.474125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxk6w\" (UniqueName: \"kubernetes.io/projected/54b025fd-f070-4b20-b44c-e49aa1930987-kube-api-access-sxk6w\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.474220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.576332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.576452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.576607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxk6w\" (UniqueName: \"kubernetes.io/projected/54b025fd-f070-4b20-b44c-e49aa1930987-kube-api-access-sxk6w\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.582285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 
16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.594301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxk6w\" (UniqueName: \"kubernetes.io/projected/54b025fd-f070-4b20-b44c-e49aa1930987-kube-api-access-sxk6w\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.594402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.692973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.755059 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.883829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9ktn\" (UniqueName: \"kubernetes.io/projected/ca088bd1-a4bf-47c4-ba45-d25107adb619-kube-api-access-m9ktn\") pod \"ca088bd1-a4bf-47c4-ba45-d25107adb619\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.884733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-combined-ca-bundle\") pod \"ca088bd1-a4bf-47c4-ba45-d25107adb619\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.885695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-scripts\") pod \"ca088bd1-a4bf-47c4-ba45-d25107adb619\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.885942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-config-data\") pod \"ca088bd1-a4bf-47c4-ba45-d25107adb619\" (UID: \"ca088bd1-a4bf-47c4-ba45-d25107adb619\") " Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.890508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-scripts" (OuterVolumeSpecName: "scripts") pod "ca088bd1-a4bf-47c4-ba45-d25107adb619" (UID: "ca088bd1-a4bf-47c4-ba45-d25107adb619"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.891060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca088bd1-a4bf-47c4-ba45-d25107adb619-kube-api-access-m9ktn" (OuterVolumeSpecName: "kube-api-access-m9ktn") pod "ca088bd1-a4bf-47c4-ba45-d25107adb619" (UID: "ca088bd1-a4bf-47c4-ba45-d25107adb619"). InnerVolumeSpecName "kube-api-access-m9ktn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.909481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca088bd1-a4bf-47c4-ba45-d25107adb619" (UID: "ca088bd1-a4bf-47c4-ba45-d25107adb619"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.914077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-config-data" (OuterVolumeSpecName: "config-data") pod "ca088bd1-a4bf-47c4-ba45-d25107adb619" (UID: "ca088bd1-a4bf-47c4-ba45-d25107adb619"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.989136 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.989177 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.989191 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9ktn\" (UniqueName: \"kubernetes.io/projected/ca088bd1-a4bf-47c4-ba45-d25107adb619-kube-api-access-m9ktn\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:39 crc kubenswrapper[4707]: I0121 16:09:39.989204 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca088bd1-a4bf-47c4-ba45-d25107adb619-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.003203 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.003477 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-central-agent" containerID="cri-o://fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.003529 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="proxy-httpd" containerID="cri-o://76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.003575 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="sg-core" containerID="cri-o://fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.003579 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-notification-agent" 
containerID="cri-o://69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.107880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:09:40 crc kubenswrapper[4707]: W0121 16:09:40.116934 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54b025fd_f070_4b20_b44c_e49aa1930987.slice/crio-43a7c7915ac3875c47d8555e1d035fa118b1903778addc0ebd7155d8e87474f7 WatchSource:0}: Error finding container 43a7c7915ac3875c47d8555e1d035fa118b1903778addc0ebd7155d8e87474f7: Status 404 returned error can't find the container with id 43a7c7915ac3875c47d8555e1d035fa118b1903778addc0ebd7155d8e87474f7 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.245072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"54b025fd-f070-4b20-b44c-e49aa1930987","Type":"ContainerStarted","Data":"15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.245119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"54b025fd-f070-4b20-b44c-e49aa1930987","Type":"ContainerStarted","Data":"43a7c7915ac3875c47d8555e1d035fa118b1903778addc0ebd7155d8e87474f7"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.245136 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.246894 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef","Type":"ContainerStarted","Data":"3fa65774e091d2648c8e5d199347bcc9ac19de4c12027001ad23996032d7b010"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.248681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7386d13c-aac3-478b-8174-4e2f4be72e87","Type":"ContainerStarted","Data":"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.248728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7386d13c-aac3-478b-8174-4e2f4be72e87","Type":"ContainerStarted","Data":"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.248739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7386d13c-aac3-478b-8174-4e2f4be72e87","Type":"ContainerStarted","Data":"3c59a20f524a87d6fc0f06255886fb5c08cc30df754cbf7af5a9fb10919cacf8"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.253544 4707 generic.go:334] "Generic (PLEG): container finished" podID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerID="76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8" exitCode=0 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.253568 4707 generic.go:334] "Generic (PLEG): container finished" podID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerID="fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74" exitCode=2 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.253610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerDied","Data":"76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.253629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerDied","Data":"fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.256281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"10090d75-68eb-4597-98f6-b3f0282513e6","Type":"ContainerStarted","Data":"2cec1240e19583cb646030fe86a81a2167b97fef2ae3246d40fd82a422fe2b74"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.256311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"10090d75-68eb-4597-98f6-b3f0282513e6","Type":"ContainerStarted","Data":"ea0d2fd30a0d46c82d89ccaed6aa727e763a327dc24df044b01a7102712782ba"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.257044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.263320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" event={"ID":"ca088bd1-a4bf-47c4-ba45-d25107adb619","Type":"ContainerDied","Data":"2965177a0d1938b1c544871e6f10bb0d1c0d3e14f79b6c6b30ef47c35e577074"} Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.263397 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2965177a0d1938b1c544871e6f10bb0d1c0d3e14f79b6c6b30ef47c35e577074" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.263510 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.291698 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.291660695 podStartE2EDuration="2.291660695s" podCreationTimestamp="2026-01-21 16:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:40.286046849 +0000 UTC m=+4077.467563070" watchObservedRunningTime="2026-01-21 16:09:40.291660695 +0000 UTC m=+4077.473176917" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.292795 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=1.292784066 podStartE2EDuration="1.292784066s" podCreationTimestamp="2026-01-21 16:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:40.262548845 +0000 UTC m=+4077.444065068" watchObservedRunningTime="2026-01-21 16:09:40.292784066 +0000 UTC m=+4077.474300288" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.309822 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.975371307 podStartE2EDuration="2.309789071s" podCreationTimestamp="2026-01-21 16:09:38 +0000 UTC" firstStartedPulling="2026-01-21 16:09:39.438821175 +0000 UTC m=+4076.620337396" lastFinishedPulling="2026-01-21 16:09:39.773238939 +0000 UTC m=+4076.954755160" observedRunningTime="2026-01-21 16:09:40.309207127 +0000 UTC m=+4077.490723349" watchObservedRunningTime="2026-01-21 16:09:40.309789071 +0000 UTC m=+4077.491305293" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.339699 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.339659476 podStartE2EDuration="2.339659476s" podCreationTimestamp="2026-01-21 16:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:40.332237721 +0000 UTC m=+4077.513753944" watchObservedRunningTime="2026-01-21 16:09:40.339659476 +0000 UTC m=+4077.521175698" Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.447298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.447798 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-log" containerID="cri-o://2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.448367 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-api" containerID="cri-o://6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.472101 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.472360 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="54baaa16-b753-4cd6-a66b-87a57aed3bff" containerName="nova-scheduler-scheduler" containerID="cri-o://9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d" gracePeriod=30 Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.480652 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:40 crc kubenswrapper[4707]: I0121 16:09:40.948067 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.113403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-logs\") pod \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.113677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7446\" (UniqueName: \"kubernetes.io/projected/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-kube-api-access-n7446\") pod \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.114098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-logs" (OuterVolumeSpecName: "logs") pod "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" (UID: "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.113800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-combined-ca-bundle\") pod \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.114477 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-config-data\") pod \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\" (UID: \"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239\") " Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.115540 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.118793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-kube-api-access-n7446" (OuterVolumeSpecName: "kube-api-access-n7446") pod "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" (UID: "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239"). InnerVolumeSpecName "kube-api-access-n7446". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.138663 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" (UID: "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.141284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-config-data" (OuterVolumeSpecName: "config-data") pod "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" (UID: "fd9de56c-ab8b-47d8-a8c2-d2cea65ad239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.218436 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7446\" (UniqueName: \"kubernetes.io/projected/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-kube-api-access-n7446\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.218473 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.218490 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274452 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerID="6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57" exitCode=0 Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274689 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerID="2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d" exitCode=143 Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274499 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239","Type":"ContainerDied","Data":"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57"} Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239","Type":"ContainerDied","Data":"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d"} Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fd9de56c-ab8b-47d8-a8c2-d2cea65ad239","Type":"ContainerDied","Data":"af241f248c25e27ac3a11eca3e8d147c27c5c51ebfd237b1796b9ab719f57873"} Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.274785 4707 scope.go:117] "RemoveContainer" containerID="6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.277490 4707 generic.go:334] "Generic (PLEG): container finished" podID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerID="fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c" exitCode=0 Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.278268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerDied","Data":"fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c"} Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.319702 4707 scope.go:117] "RemoveContainer" containerID="2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.329839 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.338488 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.339463 4707 scope.go:117] "RemoveContainer" containerID="6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57" Jan 21 16:09:41 crc kubenswrapper[4707]: E0121 16:09:41.339971 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57\": container with ID starting with 6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57 not found: ID does not exist" containerID="6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.340010 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57"} err="failed to get container status \"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57\": rpc error: code = NotFound desc = could not find container \"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57\": container with ID starting with 6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57 not found: ID does not exist" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.340048 4707 scope.go:117] 
"RemoveContainer" containerID="2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d" Jan 21 16:09:41 crc kubenswrapper[4707]: E0121 16:09:41.340419 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d\": container with ID starting with 2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d not found: ID does not exist" containerID="2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.340460 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d"} err="failed to get container status \"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d\": rpc error: code = NotFound desc = could not find container \"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d\": container with ID starting with 2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d not found: ID does not exist" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.340495 4707 scope.go:117] "RemoveContainer" containerID="6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.340873 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57"} err="failed to get container status \"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57\": rpc error: code = NotFound desc = could not find container \"6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57\": container with ID starting with 6dcf73b803ebf640b13d0ed35bce663929a952d2b095c8a6d15e1de38737bd57 not found: ID does not exist" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.340899 4707 scope.go:117] "RemoveContainer" containerID="2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.341130 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d"} err="failed to get container status \"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d\": rpc error: code = NotFound desc = could not find container \"2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d\": container with ID starting with 2b1935999b771f7a3727b25f4f64bf7afc721003ed60ff561a844e5774a6814d not found: ID does not exist" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.346483 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:41 crc kubenswrapper[4707]: E0121 16:09:41.346823 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-api" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.346840 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-api" Jan 21 16:09:41 crc kubenswrapper[4707]: E0121 16:09:41.346856 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-log" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.346861 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-log" Jan 21 16:09:41 crc kubenswrapper[4707]: E0121 16:09:41.346875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca088bd1-a4bf-47c4-ba45-d25107adb619" containerName="nova-manage" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.346881 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca088bd1-a4bf-47c4-ba45-d25107adb619" containerName="nova-manage" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.347033 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-log" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.347051 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca088bd1-a4bf-47c4-ba45-d25107adb619" containerName="nova-manage" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.347062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" containerName="nova-api-api" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.347874 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.354261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.367746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.523581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txqs5\" (UniqueName: \"kubernetes.io/projected/a373955d-893e-405a-82eb-b36431f0df12-kube-api-access-txqs5\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.523827 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a373955d-893e-405a-82eb-b36431f0df12-logs\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.524069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-config-data\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:41 crc kubenswrapper[4707]: I0121 16:09:41.524311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.625895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txqs5\" (UniqueName: \"kubernetes.io/projected/a373955d-893e-405a-82eb-b36431f0df12-kube-api-access-txqs5\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 
16:09:41.626086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a373955d-893e-405a-82eb-b36431f0df12-logs\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.626236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-config-data\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.626632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a373955d-893e-405a-82eb-b36431f0df12-logs\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.626688 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.638823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.639843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-config-data\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.645168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txqs5\" (UniqueName: \"kubernetes.io/projected/a373955d-893e-405a-82eb-b36431f0df12-kube-api-access-txqs5\") pod \"nova-api-0\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.663375 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.762293 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.931031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pzkm\" (UniqueName: \"kubernetes.io/projected/54baaa16-b753-4cd6-a66b-87a57aed3bff-kube-api-access-8pzkm\") pod \"54baaa16-b753-4cd6-a66b-87a57aed3bff\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.931150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-config-data\") pod \"54baaa16-b753-4cd6-a66b-87a57aed3bff\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.931221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-combined-ca-bundle\") pod \"54baaa16-b753-4cd6-a66b-87a57aed3bff\" (UID: \"54baaa16-b753-4cd6-a66b-87a57aed3bff\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.937542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54baaa16-b753-4cd6-a66b-87a57aed3bff-kube-api-access-8pzkm" (OuterVolumeSpecName: "kube-api-access-8pzkm") pod "54baaa16-b753-4cd6-a66b-87a57aed3bff" (UID: "54baaa16-b753-4cd6-a66b-87a57aed3bff"). InnerVolumeSpecName "kube-api-access-8pzkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.953847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54baaa16-b753-4cd6-a66b-87a57aed3bff" (UID: "54baaa16-b753-4cd6-a66b-87a57aed3bff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:41.955126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-config-data" (OuterVolumeSpecName: "config-data") pod "54baaa16-b753-4cd6-a66b-87a57aed3bff" (UID: "54baaa16-b753-4cd6-a66b-87a57aed3bff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.033904 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pzkm\" (UniqueName: \"kubernetes.io/projected/54baaa16-b753-4cd6-a66b-87a57aed3bff-kube-api-access-8pzkm\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.033924 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.033935 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54baaa16-b753-4cd6-a66b-87a57aed3bff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.287702 4707 generic.go:334] "Generic (PLEG): container finished" podID="54baaa16-b753-4cd6-a66b-87a57aed3bff" containerID="9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d" exitCode=0 Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.287761 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.287795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"54baaa16-b753-4cd6-a66b-87a57aed3bff","Type":"ContainerDied","Data":"9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d"} Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.287856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"54baaa16-b753-4cd6-a66b-87a57aed3bff","Type":"ContainerDied","Data":"6cdcaed1bca8b87e9d06cc56b33b7bd61dc8a1b5bd7f9d597a47f685c52d9172"} Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.287873 4707 scope.go:117] "RemoveContainer" containerID="9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.292538 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-log" containerID="cri-o://7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6" gracePeriod=30 Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.292855 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-metadata" containerID="cri-o://54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7" gracePeriod=30 Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.345789 4707 scope.go:117] "RemoveContainer" containerID="9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d" Jan 21 16:09:42 crc kubenswrapper[4707]: E0121 16:09:42.346338 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d\": container with ID starting with 9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d not found: ID does not exist" containerID="9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 
16:09:42.346369 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d"} err="failed to get container status \"9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d\": rpc error: code = NotFound desc = could not find container \"9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d\": container with ID starting with 9192bbfc93deb5f5821db1d45bae3676da0227c9435f23bea185ece01c0ae97d not found: ID does not exist" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.356201 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.363945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.370521 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:42 crc kubenswrapper[4707]: E0121 16:09:42.370894 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54baaa16-b753-4cd6-a66b-87a57aed3bff" containerName="nova-scheduler-scheduler" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.370912 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54baaa16-b753-4cd6-a66b-87a57aed3bff" containerName="nova-scheduler-scheduler" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.371082 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54baaa16-b753-4cd6-a66b-87a57aed3bff" containerName="nova-scheduler-scheduler" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.371658 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.373691 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.377018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.449450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:42 crc kubenswrapper[4707]: W0121 16:09:42.454077 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda373955d_893e_405a_82eb_b36431f0df12.slice/crio-928a8c54e7fbca672615598cde77167dfc542a417b97e169b33dd09c94cd374a WatchSource:0}: Error finding container 928a8c54e7fbca672615598cde77167dfc542a417b97e169b33dd09c94cd374a: Status 404 returned error can't find the container with id 928a8c54e7fbca672615598cde77167dfc542a417b97e169b33dd09c94cd374a Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.548934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-config-data\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.549242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6nb7\" (UniqueName: 
\"kubernetes.io/projected/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-kube-api-access-x6nb7\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.549268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.651426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6nb7\" (UniqueName: \"kubernetes.io/projected/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-kube-api-access-x6nb7\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.651469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.651677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-config-data\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.656430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-config-data\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.657011 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.669006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6nb7\" (UniqueName: \"kubernetes.io/projected/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-kube-api-access-x6nb7\") pod \"nova-scheduler-0\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.691789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.749011 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.855188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-combined-ca-bundle\") pod \"7386d13c-aac3-478b-8174-4e2f4be72e87\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.855458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7386d13c-aac3-478b-8174-4e2f4be72e87-logs\") pod \"7386d13c-aac3-478b-8174-4e2f4be72e87\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.855564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-nova-metadata-tls-certs\") pod \"7386d13c-aac3-478b-8174-4e2f4be72e87\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.855600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-config-data\") pod \"7386d13c-aac3-478b-8174-4e2f4be72e87\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.855629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfd6z\" (UniqueName: \"kubernetes.io/projected/7386d13c-aac3-478b-8174-4e2f4be72e87-kube-api-access-jfd6z\") pod \"7386d13c-aac3-478b-8174-4e2f4be72e87\" (UID: \"7386d13c-aac3-478b-8174-4e2f4be72e87\") " Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.856175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7386d13c-aac3-478b-8174-4e2f4be72e87-logs" (OuterVolumeSpecName: "logs") pod "7386d13c-aac3-478b-8174-4e2f4be72e87" (UID: "7386d13c-aac3-478b-8174-4e2f4be72e87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.860166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7386d13c-aac3-478b-8174-4e2f4be72e87-kube-api-access-jfd6z" (OuterVolumeSpecName: "kube-api-access-jfd6z") pod "7386d13c-aac3-478b-8174-4e2f4be72e87" (UID: "7386d13c-aac3-478b-8174-4e2f4be72e87"). InnerVolumeSpecName "kube-api-access-jfd6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.878731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7386d13c-aac3-478b-8174-4e2f4be72e87" (UID: "7386d13c-aac3-478b-8174-4e2f4be72e87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.880764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-config-data" (OuterVolumeSpecName: "config-data") pod "7386d13c-aac3-478b-8174-4e2f4be72e87" (UID: "7386d13c-aac3-478b-8174-4e2f4be72e87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.898095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7386d13c-aac3-478b-8174-4e2f4be72e87" (UID: "7386d13c-aac3-478b-8174-4e2f4be72e87"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.958593 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.958644 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.958666 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfd6z\" (UniqueName: \"kubernetes.io/projected/7386d13c-aac3-478b-8174-4e2f4be72e87-kube-api-access-jfd6z\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.958680 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386d13c-aac3-478b-8174-4e2f4be72e87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:42 crc kubenswrapper[4707]: I0121 16:09:42.958694 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7386d13c-aac3-478b-8174-4e2f4be72e87-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.143098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.227660 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54baaa16-b753-4cd6-a66b-87a57aed3bff" path="/var/lib/kubelet/pods/54baaa16-b753-4cd6-a66b-87a57aed3bff/volumes" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.228438 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9de56c-ab8b-47d8-a8c2-d2cea65ad239" path="/var/lib/kubelet/pods/fd9de56c-ab8b-47d8-a8c2-d2cea65ad239/volumes" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.301034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a","Type":"ContainerStarted","Data":"88ca45f0265a7551414c6d643e0ceb9b0052d4cb105bb6e508a3fc93c0b54674"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.303198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a373955d-893e-405a-82eb-b36431f0df12","Type":"ContainerStarted","Data":"90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.303226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a373955d-893e-405a-82eb-b36431f0df12","Type":"ContainerStarted","Data":"62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.303237 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a373955d-893e-405a-82eb-b36431f0df12","Type":"ContainerStarted","Data":"928a8c54e7fbca672615598cde77167dfc542a417b97e169b33dd09c94cd374a"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305052 4707 generic.go:334] "Generic (PLEG): container finished" podID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerID="54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7" exitCode=0 Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305124 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305139 4707 generic.go:334] "Generic (PLEG): container finished" podID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerID="7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6" exitCode=143 Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7386d13c-aac3-478b-8174-4e2f4be72e87","Type":"ContainerDied","Data":"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7386d13c-aac3-478b-8174-4e2f4be72e87","Type":"ContainerDied","Data":"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7386d13c-aac3-478b-8174-4e2f4be72e87","Type":"ContainerDied","Data":"3c59a20f524a87d6fc0f06255886fb5c08cc30df754cbf7af5a9fb10919cacf8"} Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.305952 4707 scope.go:117] "RemoveContainer" containerID="54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.339465 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.339432938 podStartE2EDuration="2.339432938s" podCreationTimestamp="2026-01-21 16:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:43.322224202 +0000 UTC m=+4080.503740444" watchObservedRunningTime="2026-01-21 16:09:43.339432938 +0000 UTC m=+4080.520949161" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.362029 4707 scope.go:117] "RemoveContainer" containerID="7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.383494 4707 scope.go:117] "RemoveContainer" containerID="54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.385249 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:43 crc kubenswrapper[4707]: E0121 16:09:43.385853 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7\": container with ID starting with 54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7 not found: ID does not exist" 
containerID="54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.385907 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7"} err="failed to get container status \"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7\": rpc error: code = NotFound desc = could not find container \"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7\": container with ID starting with 54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7 not found: ID does not exist" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.385939 4707 scope.go:117] "RemoveContainer" containerID="7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6" Jan 21 16:09:43 crc kubenswrapper[4707]: E0121 16:09:43.386352 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6\": container with ID starting with 7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6 not found: ID does not exist" containerID="7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.386377 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6"} err="failed to get container status \"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6\": rpc error: code = NotFound desc = could not find container \"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6\": container with ID starting with 7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6 not found: ID does not exist" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.386401 4707 scope.go:117] "RemoveContainer" containerID="54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.387008 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7"} err="failed to get container status \"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7\": rpc error: code = NotFound desc = could not find container \"54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7\": container with ID starting with 54ce6856cb3497a2640ea411d0b16f59f55eb4a8da2150efeb616f60d47229c7 not found: ID does not exist" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.387071 4707 scope.go:117] "RemoveContainer" containerID="7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.387466 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6"} err="failed to get container status \"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6\": rpc error: code = NotFound desc = could not find container \"7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6\": container with ID starting with 7867230c2cfd2c933758b9260334f86292da0954bbe367b47db37d70b8e50ec6 not found: ID does not exist" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.423281 4707 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.432377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:43 crc kubenswrapper[4707]: E0121 16:09:43.432911 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-log" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.432934 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-log" Jan 21 16:09:43 crc kubenswrapper[4707]: E0121 16:09:43.432948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-metadata" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.432956 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-metadata" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.433197 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-metadata" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.433230 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" containerName="nova-metadata-log" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.434668 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.437419 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.439772 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.443084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.509306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.509364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hnt\" (UniqueName: \"kubernetes.io/projected/2401fc34-ec2a-41f1-9080-0c3320c276b1-kube-api-access-98hnt\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.509450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.509489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-config-data\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.509518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401fc34-ec2a-41f1-9080-0c3320c276b1-logs\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.611443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401fc34-ec2a-41f1-9080-0c3320c276b1-logs\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.611623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.611664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hnt\" (UniqueName: \"kubernetes.io/projected/2401fc34-ec2a-41f1-9080-0c3320c276b1-kube-api-access-98hnt\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.611780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.611833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401fc34-ec2a-41f1-9080-0c3320c276b1-logs\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.611860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-config-data\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.615534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.616592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-config-data\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.628377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.630567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hnt\" (UniqueName: \"kubernetes.io/projected/2401fc34-ec2a-41f1-9080-0c3320c276b1-kube-api-access-98hnt\") pod \"nova-metadata-0\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.682410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:43 crc kubenswrapper[4707]: I0121 16:09:43.753373 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:44 crc kubenswrapper[4707]: I0121 16:09:44.165416 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:44 crc kubenswrapper[4707]: W0121 16:09:44.166800 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2401fc34_ec2a_41f1_9080_0c3320c276b1.slice/crio-83160d73984ae47cb6e562d23c161e650baebc631bb304679e9e89f5459ebd3e WatchSource:0}: Error finding container 83160d73984ae47cb6e562d23c161e650baebc631bb304679e9e89f5459ebd3e: Status 404 returned error can't find the container with id 83160d73984ae47cb6e562d23c161e650baebc631bb304679e9e89f5459ebd3e Jan 21 16:09:44 crc kubenswrapper[4707]: I0121 16:09:44.322885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a","Type":"ContainerStarted","Data":"cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1"} Jan 21 16:09:44 crc kubenswrapper[4707]: I0121 16:09:44.326696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2401fc34-ec2a-41f1-9080-0c3320c276b1","Type":"ContainerStarted","Data":"4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa"} Jan 21 16:09:44 crc kubenswrapper[4707]: I0121 16:09:44.326734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2401fc34-ec2a-41f1-9080-0c3320c276b1","Type":"ContainerStarted","Data":"83160d73984ae47cb6e562d23c161e650baebc631bb304679e9e89f5459ebd3e"} Jan 21 16:09:45 crc kubenswrapper[4707]: I0121 16:09:45.183053 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:09:45 crc kubenswrapper[4707]: E0121 16:09:45.183528 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:09:45 crc kubenswrapper[4707]: I0121 16:09:45.194735 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7386d13c-aac3-478b-8174-4e2f4be72e87" path="/var/lib/kubelet/pods/7386d13c-aac3-478b-8174-4e2f4be72e87/volumes" Jan 21 16:09:45 crc kubenswrapper[4707]: I0121 16:09:45.349848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2401fc34-ec2a-41f1-9080-0c3320c276b1","Type":"ContainerStarted","Data":"55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811"} Jan 21 16:09:45 crc kubenswrapper[4707]: I0121 16:09:45.365592 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.365568894 podStartE2EDuration="3.365568894s" podCreationTimestamp="2026-01-21 16:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:44.343587667 +0000 UTC m=+4081.525103889" watchObservedRunningTime="2026-01-21 16:09:45.365568894 +0000 UTC m=+4082.547085116" Jan 21 16:09:45 crc kubenswrapper[4707]: I0121 16:09:45.367920 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.367914213 podStartE2EDuration="2.367914213s" podCreationTimestamp="2026-01-21 16:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:45.364395227 +0000 UTC m=+4082.545911469" watchObservedRunningTime="2026-01-21 16:09:45.367914213 +0000 UTC m=+4082.549430435" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.311422 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.359864 4707 generic.go:334] "Generic (PLEG): container finished" podID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerID="69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84" exitCode=0 Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.359902 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.359901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerDied","Data":"69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84"} Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.360007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b57d26d0-b96a-4d62-94cf-ddb928148048","Type":"ContainerDied","Data":"83ca996af47aab531e29283493076da3ddb49d5a8113893bb100408770c71a20"} Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.360026 4707 scope.go:117] "RemoveContainer" containerID="76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.382120 4707 scope.go:117] "RemoveContainer" containerID="fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.397884 4707 scope.go:117] "RemoveContainer" containerID="69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.419451 4707 scope.go:117] "RemoveContainer" containerID="fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.461418 4707 scope.go:117] "RemoveContainer" containerID="76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.461827 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8\": container with ID starting with 76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8 not found: ID does not exist" containerID="76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.461863 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8"} err="failed to get container status \"76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8\": rpc error: code = NotFound desc = could not find container \"76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8\": container with ID starting with 76169dba210266c58f89053afeb092d8338121b54d47c5370cbf4938f030b2e8 not found: ID does not exist" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.461885 4707 scope.go:117] "RemoveContainer" containerID="fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.462310 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74\": container with ID starting with fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74 not found: ID does not exist" containerID="fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.462338 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74"} err="failed to get container status 
\"fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74\": rpc error: code = NotFound desc = could not find container \"fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74\": container with ID starting with fa23d0f0c42811b96d11a24820bb6db070a3fdd7c0c7f550d4fc34caffcccf74 not found: ID does not exist" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.462353 4707 scope.go:117] "RemoveContainer" containerID="69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.462655 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84\": container with ID starting with 69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84 not found: ID does not exist" containerID="69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.462680 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84"} err="failed to get container status \"69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84\": rpc error: code = NotFound desc = could not find container \"69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84\": container with ID starting with 69e99e7c69fb115dc412aab3f04c172eb8c05f8082db5668afd0b9f9a2738a84 not found: ID does not exist" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.462695 4707 scope.go:117] "RemoveContainer" containerID="fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.463008 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c\": container with ID starting with fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c not found: ID does not exist" containerID="fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.463032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c"} err="failed to get container status \"fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c\": rpc error: code = NotFound desc = could not find container \"fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c\": container with ID starting with fdd5d9f62bd618936bdd8bfceabc68081f3337309b51d1cfc8e3cb91ff65b11c not found: ID does not exist" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.493026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbr8g\" (UniqueName: \"kubernetes.io/projected/b57d26d0-b96a-4d62-94cf-ddb928148048-kube-api-access-xbr8g\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.493086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-sg-core-conf-yaml\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc 
kubenswrapper[4707]: I0121 16:09:46.493112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-config-data\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.493422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-log-httpd\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.493452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-combined-ca-bundle\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.493511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-scripts\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.493564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-run-httpd\") pod \"b57d26d0-b96a-4d62-94cf-ddb928148048\" (UID: \"b57d26d0-b96a-4d62-94cf-ddb928148048\") " Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.494648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.494764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.499130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-scripts" (OuterVolumeSpecName: "scripts") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.509939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57d26d0-b96a-4d62-94cf-ddb928148048-kube-api-access-xbr8g" (OuterVolumeSpecName: "kube-api-access-xbr8g") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "kube-api-access-xbr8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.516015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.556540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.570092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-config-data" (OuterVolumeSpecName: "config-data") pod "b57d26d0-b96a-4d62-94cf-ddb928148048" (UID: "b57d26d0-b96a-4d62-94cf-ddb928148048"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595168 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbr8g\" (UniqueName: \"kubernetes.io/projected/b57d26d0-b96a-4d62-94cf-ddb928148048-kube-api-access-xbr8g\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595254 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595309 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595358 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595415 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595462 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57d26d0-b96a-4d62-94cf-ddb928148048-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.595511 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b57d26d0-b96a-4d62-94cf-ddb928148048-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.707991 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.723653 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 
16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734271 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.734693 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-central-agent" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734708 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-central-agent" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.734734 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="sg-core" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734741 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="sg-core" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.734753 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-notification-agent" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734760 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-notification-agent" Jan 21 16:09:46 crc kubenswrapper[4707]: E0121 16:09:46.734771 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="proxy-httpd" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734776 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="proxy-httpd" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734967 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-central-agent" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.734988 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="sg-core" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.735002 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="ceilometer-notification-agent" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.735010 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" containerName="proxy-httpd" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.736881 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.738744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.738982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.739426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.744912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.903835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-scripts\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.903894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-log-httpd\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.903935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.904462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfbr\" (UniqueName: \"kubernetes.io/projected/0942ee4d-0234-4c83-ab9a-04320d525c7d-kube-api-access-2lfbr\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.904573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.904643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-run-httpd\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.905095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:46 crc kubenswrapper[4707]: I0121 16:09:46.905200 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-config-data\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-config-data\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-scripts\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-log-httpd\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfbr\" (UniqueName: \"kubernetes.io/projected/0942ee4d-0234-4c83-ab9a-04320d525c7d-kube-api-access-2lfbr\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.007666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-run-httpd\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.008050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.008276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-log-httpd\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.011468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.011641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-scripts\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.011684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.012311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-config-data\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.012436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.022732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfbr\" (UniqueName: \"kubernetes.io/projected/0942ee4d-0234-4c83-ab9a-04320d525c7d-kube-api-access-2lfbr\") pod \"ceilometer-0\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.053907 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.198051 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57d26d0-b96a-4d62-94cf-ddb928148048" path="/var/lib/kubelet/pods/b57d26d0-b96a-4d62-94cf-ddb928148048/volumes" Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.477951 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:09:47 crc kubenswrapper[4707]: W0121 16:09:47.492748 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0942ee4d_0234_4c83_ab9a_04320d525c7d.slice/crio-e0d244d5e7212e4a2eb4d3979d91cf14f095a2b4a6a0fbd536b88c98725d8452 WatchSource:0}: Error finding container e0d244d5e7212e4a2eb4d3979d91cf14f095a2b4a6a0fbd536b88c98725d8452: Status 404 returned error can't find the container with id e0d244d5e7212e4a2eb4d3979d91cf14f095a2b4a6a0fbd536b88c98725d8452 Jan 21 16:09:47 crc kubenswrapper[4707]: I0121 16:09:47.693297 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.379903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerStarted","Data":"81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5"} Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.380195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerStarted","Data":"e0d244d5e7212e4a2eb4d3979d91cf14f095a2b4a6a0fbd536b88c98725d8452"} Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.682495 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.700343 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.753657 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.753709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:48 crc kubenswrapper[4707]: I0121 16:09:48.950869 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:09:49 crc kubenswrapper[4707]: I0121 16:09:49.389712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerStarted","Data":"952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25"} Jan 21 16:09:49 crc kubenswrapper[4707]: I0121 16:09:49.403842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:09:49 crc kubenswrapper[4707]: I0121 16:09:49.727051 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.242027 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng"] Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.243450 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.246286 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.246456 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.265645 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng"] Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.394605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-config-data\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.394963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-scripts\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.395021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjr2\" (UniqueName: \"kubernetes.io/projected/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-kube-api-access-fbjr2\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.395096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.401491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerStarted","Data":"209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337"} Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.496861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.496954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-config-data\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " 
pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.497011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-scripts\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.497063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjr2\" (UniqueName: \"kubernetes.io/projected/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-kube-api-access-fbjr2\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.502595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-config-data\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.503106 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-scripts\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.509835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.513001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjr2\" (UniqueName: \"kubernetes.io/projected/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-kube-api-access-fbjr2\") pod \"nova-cell1-cell-mapping-bdjng\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.560866 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:50 crc kubenswrapper[4707]: I0121 16:09:50.963682 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng"] Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.409751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" event={"ID":"ab9f6631-421a-4fd3-977e-7f03a4d13e2e","Type":"ContainerStarted","Data":"ad264b7e1f00fc4984b3fc2c3c7f5bb618cb111057cf37ecd2ef5c32ea47b50d"} Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.410086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" event={"ID":"ab9f6631-421a-4fd3-977e-7f03a4d13e2e","Type":"ContainerStarted","Data":"f0c68c94d5ef35fd1bb0d03528d926cf66d187a62282d1da76a4782a214ad9bb"} Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.412191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerStarted","Data":"84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988"} Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.412382 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.460001 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" podStartSLOduration=1.459986243 podStartE2EDuration="1.459986243s" podCreationTimestamp="2026-01-21 16:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:09:51.437825714 +0000 UTC m=+4088.619341937" watchObservedRunningTime="2026-01-21 16:09:51.459986243 +0000 UTC m=+4088.641502464" Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.460264 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.185948835 podStartE2EDuration="5.460260359s" podCreationTimestamp="2026-01-21 16:09:46 +0000 UTC" firstStartedPulling="2026-01-21 16:09:47.495680314 +0000 UTC m=+4084.677196536" lastFinishedPulling="2026-01-21 16:09:50.769991838 +0000 UTC m=+4087.951508060" observedRunningTime="2026-01-21 16:09:51.452503984 +0000 UTC m=+4088.634020206" watchObservedRunningTime="2026-01-21 16:09:51.460260359 +0000 UTC m=+4088.641776580" Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.664422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:51 crc kubenswrapper[4707]: I0121 16:09:51.664472 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:09:52 crc kubenswrapper[4707]: I0121 16:09:52.692862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:52 crc kubenswrapper[4707]: I0121 16:09:52.714328 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:52 crc kubenswrapper[4707]: I0121 16:09:52.745965 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a373955d-893e-405a-82eb-b36431f0df12" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:09:52 crc kubenswrapper[4707]: I0121 16:09:52.746004 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:09:53 crc kubenswrapper[4707]: I0121 16:09:53.451414 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:53 crc kubenswrapper[4707]: I0121 16:09:53.753611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:53 crc kubenswrapper[4707]: I0121 16:09:53.753663 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:09:54 crc kubenswrapper[4707]: I0121 16:09:54.767930 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:09:54 crc kubenswrapper[4707]: I0121 16:09:54.767975 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:09:56 crc kubenswrapper[4707]: I0121 16:09:56.451951 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab9f6631-421a-4fd3-977e-7f03a4d13e2e" containerID="ad264b7e1f00fc4984b3fc2c3c7f5bb618cb111057cf37ecd2ef5c32ea47b50d" exitCode=0 Jan 21 16:09:56 crc kubenswrapper[4707]: I0121 16:09:56.452038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" event={"ID":"ab9f6631-421a-4fd3-977e-7f03a4d13e2e","Type":"ContainerDied","Data":"ad264b7e1f00fc4984b3fc2c3c7f5bb618cb111057cf37ecd2ef5c32ea47b50d"} Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.768256 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.951909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-config-data\") pod \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.952048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbjr2\" (UniqueName: \"kubernetes.io/projected/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-kube-api-access-fbjr2\") pod \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.952106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-combined-ca-bundle\") pod \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.952236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-scripts\") pod \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\" (UID: \"ab9f6631-421a-4fd3-977e-7f03a4d13e2e\") " Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.958725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-scripts" (OuterVolumeSpecName: "scripts") pod "ab9f6631-421a-4fd3-977e-7f03a4d13e2e" (UID: "ab9f6631-421a-4fd3-977e-7f03a4d13e2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.959395 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-kube-api-access-fbjr2" (OuterVolumeSpecName: "kube-api-access-fbjr2") pod "ab9f6631-421a-4fd3-977e-7f03a4d13e2e" (UID: "ab9f6631-421a-4fd3-977e-7f03a4d13e2e"). InnerVolumeSpecName "kube-api-access-fbjr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.980845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab9f6631-421a-4fd3-977e-7f03a4d13e2e" (UID: "ab9f6631-421a-4fd3-977e-7f03a4d13e2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:57 crc kubenswrapper[4707]: I0121 16:09:57.981074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-config-data" (OuterVolumeSpecName: "config-data") pod "ab9f6631-421a-4fd3-977e-7f03a4d13e2e" (UID: "ab9f6631-421a-4fd3-977e-7f03a4d13e2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.056864 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.056911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbjr2\" (UniqueName: \"kubernetes.io/projected/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-kube-api-access-fbjr2\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.056927 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.056938 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f6631-421a-4fd3-977e-7f03a4d13e2e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.182171 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:09:58 crc kubenswrapper[4707]: E0121 16:09:58.183490 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.468471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" event={"ID":"ab9f6631-421a-4fd3-977e-7f03a4d13e2e","Type":"ContainerDied","Data":"f0c68c94d5ef35fd1bb0d03528d926cf66d187a62282d1da76a4782a214ad9bb"} Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.468541 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c68c94d5ef35fd1bb0d03528d926cf66d187a62282d1da76a4782a214ad9bb" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.468565 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng" Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.684116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.684438 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-log" containerID="cri-o://4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa" gracePeriod=30 Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.684504 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-metadata" containerID="cri-o://55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811" gracePeriod=30 Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.695547 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.695788 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-log" containerID="cri-o://62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4" gracePeriod=30 Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.695898 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-api" containerID="cri-o://90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27" gracePeriod=30 Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.704567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:58 crc kubenswrapper[4707]: I0121 16:09:58.704739 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" containerName="nova-scheduler-scheduler" containerID="cri-o://cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1" gracePeriod=30 Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.374475 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.481901 4707 generic.go:334] "Generic (PLEG): container finished" podID="091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" containerID="cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1" exitCode=0 Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.481954 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.481987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a","Type":"ContainerDied","Data":"cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1"} Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.482041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a","Type":"ContainerDied","Data":"88ca45f0265a7551414c6d643e0ceb9b0052d4cb105bb6e508a3fc93c0b54674"} Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.482063 4707 scope.go:117] "RemoveContainer" containerID="cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.484791 4707 generic.go:334] "Generic (PLEG): container finished" podID="a373955d-893e-405a-82eb-b36431f0df12" containerID="62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4" exitCode=143 Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.484920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a373955d-893e-405a-82eb-b36431f0df12","Type":"ContainerDied","Data":"62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4"} Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.486022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6nb7\" (UniqueName: \"kubernetes.io/projected/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-kube-api-access-x6nb7\") pod \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.486148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-config-data\") pod \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.486650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-combined-ca-bundle\") pod \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\" (UID: \"091cea2f-7c79-4cba-b4df-0c85b8fbfe7a\") " Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.492286 4707 generic.go:334] "Generic (PLEG): container finished" podID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerID="4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa" exitCode=143 Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.492338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2401fc34-ec2a-41f1-9080-0c3320c276b1","Type":"ContainerDied","Data":"4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa"} Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.492756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-kube-api-access-x6nb7" (OuterVolumeSpecName: "kube-api-access-x6nb7") pod "091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" (UID: "091cea2f-7c79-4cba-b4df-0c85b8fbfe7a"). InnerVolumeSpecName "kube-api-access-x6nb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.508837 4707 scope.go:117] "RemoveContainer" containerID="cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1" Jan 21 16:09:59 crc kubenswrapper[4707]: E0121 16:09:59.509800 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1\": container with ID starting with cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1 not found: ID does not exist" containerID="cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.509872 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1"} err="failed to get container status \"cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1\": rpc error: code = NotFound desc = could not find container \"cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1\": container with ID starting with cc98c5e4e7e3d4532c8d5edaca1bbd7fd6a86a575b7eb788795f88720b0b86e1 not found: ID does not exist" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.521042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-config-data" (OuterVolumeSpecName: "config-data") pod "091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" (UID: "091cea2f-7c79-4cba-b4df-0c85b8fbfe7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.521066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" (UID: "091cea2f-7c79-4cba-b4df-0c85b8fbfe7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.589206 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6nb7\" (UniqueName: \"kubernetes.io/projected/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-kube-api-access-x6nb7\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.589233 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.589244 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.809682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.814945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.830754 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:59 crc kubenswrapper[4707]: E0121 16:09:59.831254 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9f6631-421a-4fd3-977e-7f03a4d13e2e" containerName="nova-manage" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.831275 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9f6631-421a-4fd3-977e-7f03a4d13e2e" containerName="nova-manage" Jan 21 16:09:59 crc kubenswrapper[4707]: E0121 16:09:59.831298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" containerName="nova-scheduler-scheduler" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.831307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" containerName="nova-scheduler-scheduler" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.831471 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" containerName="nova-scheduler-scheduler" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.831491 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9f6631-421a-4fd3-977e-7f03a4d13e2e" containerName="nova-manage" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.832118 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.833941 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.846404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.895180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.895429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6cm\" (UniqueName: \"kubernetes.io/projected/43b25315-bece-4c48-bd94-3132650e2b14-kube-api-access-5z6cm\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.895557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.997749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6cm\" (UniqueName: \"kubernetes.io/projected/43b25315-bece-4c48-bd94-3132650e2b14-kube-api-access-5z6cm\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.997953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:09:59 crc kubenswrapper[4707]: I0121 16:09:59.998091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:00 crc kubenswrapper[4707]: I0121 16:10:00.002168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:00 crc kubenswrapper[4707]: I0121 16:10:00.002204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:00 crc kubenswrapper[4707]: I0121 16:10:00.011407 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z6cm\" (UniqueName: \"kubernetes.io/projected/43b25315-bece-4c48-bd94-3132650e2b14-kube-api-access-5z6cm\") pod \"nova-scheduler-0\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:00 crc kubenswrapper[4707]: I0121 16:10:00.146027 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:00 crc kubenswrapper[4707]: I0121 16:10:00.580362 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:10:00 crc kubenswrapper[4707]: W0121 16:10:00.582422 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43b25315_bece_4c48_bd94_3132650e2b14.slice/crio-505a7b08eead2cf1c6175f645a0b4eea548123018a8032e6282223933b6932c3 WatchSource:0}: Error finding container 505a7b08eead2cf1c6175f645a0b4eea548123018a8032e6282223933b6932c3: Status 404 returned error can't find the container with id 505a7b08eead2cf1c6175f645a0b4eea548123018a8032e6282223933b6932c3 Jan 21 16:10:01 crc kubenswrapper[4707]: I0121 16:10:01.195129 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091cea2f-7c79-4cba-b4df-0c85b8fbfe7a" path="/var/lib/kubelet/pods/091cea2f-7c79-4cba-b4df-0c85b8fbfe7a/volumes" Jan 21 16:10:01 crc kubenswrapper[4707]: I0121 16:10:01.516122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"43b25315-bece-4c48-bd94-3132650e2b14","Type":"ContainerStarted","Data":"5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04"} Jan 21 16:10:01 crc kubenswrapper[4707]: I0121 16:10:01.516182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"43b25315-bece-4c48-bd94-3132650e2b14","Type":"ContainerStarted","Data":"505a7b08eead2cf1c6175f645a0b4eea548123018a8032e6282223933b6932c3"} Jan 21 16:10:01 crc kubenswrapper[4707]: I0121 16:10:01.538838 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.538794856 podStartE2EDuration="2.538794856s" podCreationTimestamp="2026-01-21 16:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:10:01.536662557 +0000 UTC m=+4098.718178779" watchObservedRunningTime="2026-01-21 16:10:01.538794856 +0000 UTC m=+4098.720311079" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.327679 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.358343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-combined-ca-bundle\") pod \"a373955d-893e-405a-82eb-b36431f0df12\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.358607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-config-data\") pod \"a373955d-893e-405a-82eb-b36431f0df12\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.358737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txqs5\" (UniqueName: \"kubernetes.io/projected/a373955d-893e-405a-82eb-b36431f0df12-kube-api-access-txqs5\") pod \"a373955d-893e-405a-82eb-b36431f0df12\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.358762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a373955d-893e-405a-82eb-b36431f0df12-logs\") pod \"a373955d-893e-405a-82eb-b36431f0df12\" (UID: \"a373955d-893e-405a-82eb-b36431f0df12\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.359658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a373955d-893e-405a-82eb-b36431f0df12-logs" (OuterVolumeSpecName: "logs") pod "a373955d-893e-405a-82eb-b36431f0df12" (UID: "a373955d-893e-405a-82eb-b36431f0df12"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.379943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a373955d-893e-405a-82eb-b36431f0df12-kube-api-access-txqs5" (OuterVolumeSpecName: "kube-api-access-txqs5") pod "a373955d-893e-405a-82eb-b36431f0df12" (UID: "a373955d-893e-405a-82eb-b36431f0df12"). InnerVolumeSpecName "kube-api-access-txqs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.404462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-config-data" (OuterVolumeSpecName: "config-data") pod "a373955d-893e-405a-82eb-b36431f0df12" (UID: "a373955d-893e-405a-82eb-b36431f0df12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.427892 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6t6xq"] Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.428594 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-log" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.428614 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-log" Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.428651 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-api" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.428657 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-api" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.428831 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-api" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.428863 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a373955d-893e-405a-82eb-b36431f0df12" containerName="nova-api-log" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.430270 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.439916 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t6xq"] Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.446240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a373955d-893e-405a-82eb-b36431f0df12" (UID: "a373955d-893e-405a-82eb-b36431f0df12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-catalog-content\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-utilities\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2p6w\" (UniqueName: \"kubernetes.io/projected/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-kube-api-access-b2p6w\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464488 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464502 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a373955d-893e-405a-82eb-b36431f0df12-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txqs5\" (UniqueName: \"kubernetes.io/projected/a373955d-893e-405a-82eb-b36431f0df12-kube-api-access-txqs5\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.464523 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a373955d-893e-405a-82eb-b36431f0df12-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.467136 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.529092 4707 generic.go:334] "Generic (PLEG): container finished" podID="a373955d-893e-405a-82eb-b36431f0df12" containerID="90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27" exitCode=0 Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.529186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a373955d-893e-405a-82eb-b36431f0df12","Type":"ContainerDied","Data":"90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27"} Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.529220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a373955d-893e-405a-82eb-b36431f0df12","Type":"ContainerDied","Data":"928a8c54e7fbca672615598cde77167dfc542a417b97e169b33dd09c94cd374a"} Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.529242 4707 scope.go:117] "RemoveContainer" containerID="90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.529412 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.536263 4707 generic.go:334] "Generic (PLEG): container finished" podID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerID="55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811" exitCode=0 Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.537246 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.537273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2401fc34-ec2a-41f1-9080-0c3320c276b1","Type":"ContainerDied","Data":"55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811"} Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.537850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"2401fc34-ec2a-41f1-9080-0c3320c276b1","Type":"ContainerDied","Data":"83160d73984ae47cb6e562d23c161e650baebc631bb304679e9e89f5459ebd3e"} Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.557836 4707 scope.go:117] "RemoveContainer" containerID="62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.566568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-nova-metadata-tls-certs\") pod \"2401fc34-ec2a-41f1-9080-0c3320c276b1\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.567152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-config-data\") pod \"2401fc34-ec2a-41f1-9080-0c3320c276b1\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.567615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401fc34-ec2a-41f1-9080-0c3320c276b1-logs\") pod \"2401fc34-ec2a-41f1-9080-0c3320c276b1\" (UID: 
\"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.567654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-combined-ca-bundle\") pod \"2401fc34-ec2a-41f1-9080-0c3320c276b1\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.567708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98hnt\" (UniqueName: \"kubernetes.io/projected/2401fc34-ec2a-41f1-9080-0c3320c276b1-kube-api-access-98hnt\") pod \"2401fc34-ec2a-41f1-9080-0c3320c276b1\" (UID: \"2401fc34-ec2a-41f1-9080-0c3320c276b1\") " Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.568289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-catalog-content\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.568327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-utilities\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.570049 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.570124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2p6w\" (UniqueName: \"kubernetes.io/projected/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-kube-api-access-b2p6w\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.571647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2401fc34-ec2a-41f1-9080-0c3320c276b1-logs" (OuterVolumeSpecName: "logs") pod "2401fc34-ec2a-41f1-9080-0c3320c276b1" (UID: "2401fc34-ec2a-41f1-9080-0c3320c276b1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.573177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-utilities\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.575992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-catalog-content\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.583893 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.591595 4707 scope.go:117] "RemoveContainer" containerID="90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.594737 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.594695 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27\": container with ID starting with 90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27 not found: ID does not exist" containerID="90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.594935 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27"} err="failed to get container status \"90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27\": rpc error: code = NotFound desc = could not find container \"90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27\": container with ID starting with 90163514b59c8a3cb7299f1a1e51b9355a7643e01ac50ee9ceab0c1cbfcefc27 not found: ID does not exist" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.594962 4707 scope.go:117] "RemoveContainer" containerID="62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4" Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.595312 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-log" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.595336 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-log" Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.595363 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-metadata" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.595369 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-metadata" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.595658 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" 
containerName="nova-metadata-log" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.595689 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" containerName="nova-metadata-metadata" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.596919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.599106 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4\": container with ID starting with 62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4 not found: ID does not exist" containerID="62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.599128 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4"} err="failed to get container status \"62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4\": rpc error: code = NotFound desc = could not find container \"62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4\": container with ID starting with 62b919cc8d22defcede4ff25b4893ba139754cdff294149b9aacb2aa45f995d4 not found: ID does not exist" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.599146 4707 scope.go:117] "RemoveContainer" containerID="55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.599227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.619850 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.624656 4707 scope.go:117] "RemoveContainer" containerID="4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.649189 4707 scope.go:117] "RemoveContainer" containerID="55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811" Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.654596 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811\": container with ID starting with 55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811 not found: ID does not exist" containerID="55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.654626 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811"} err="failed to get container status \"55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811\": rpc error: code = NotFound desc = could not find container \"55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811\": container with ID starting with 55c934ce9e2bd074e679c4efac38d2439a74e16b43b371410f4393c88423d811 not found: ID does not exist" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.654645 4707 scope.go:117] "RemoveContainer" 
containerID="4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa" Jan 21 16:10:02 crc kubenswrapper[4707]: E0121 16:10:02.655437 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa\": container with ID starting with 4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa not found: ID does not exist" containerID="4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.655459 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa"} err="failed to get container status \"4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa\": rpc error: code = NotFound desc = could not find container \"4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa\": container with ID starting with 4f3fcd1eda729da257da1717209fe1ca1b4e4d4621eb89cedb2545fd1d74c8fa not found: ID does not exist" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.672131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-config-data\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.672175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkwdr\" (UniqueName: \"kubernetes.io/projected/02297ae7-b37f-41c6-ac59-947e2996f44d-kube-api-access-qkwdr\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.672484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.672547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02297ae7-b37f-41c6-ac59-947e2996f44d-logs\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.672646 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401fc34-ec2a-41f1-9080-0c3320c276b1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.774412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-config-data\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.774456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkwdr\" (UniqueName: \"kubernetes.io/projected/02297ae7-b37f-41c6-ac59-947e2996f44d-kube-api-access-qkwdr\") pod 
\"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.774578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.774607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02297ae7-b37f-41c6-ac59-947e2996f44d-logs\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.775009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02297ae7-b37f-41c6-ac59-947e2996f44d-logs\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.948525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2401fc34-ec2a-41f1-9080-0c3320c276b1-kube-api-access-98hnt" (OuterVolumeSpecName: "kube-api-access-98hnt") pod "2401fc34-ec2a-41f1-9080-0c3320c276b1" (UID: "2401fc34-ec2a-41f1-9080-0c3320c276b1"). InnerVolumeSpecName "kube-api-access-98hnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.949028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2p6w\" (UniqueName: \"kubernetes.io/projected/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-kube-api-access-b2p6w\") pod \"certified-operators-6t6xq\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.949206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.949476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-config-data\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.950085 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkwdr\" (UniqueName: \"kubernetes.io/projected/02297ae7-b37f-41c6-ac59-947e2996f44d-kube-api-access-qkwdr\") pod \"nova-api-0\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.968229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-config-data" (OuterVolumeSpecName: "config-data") pod "2401fc34-ec2a-41f1-9080-0c3320c276b1" (UID: "2401fc34-ec2a-41f1-9080-0c3320c276b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.976415 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98hnt\" (UniqueName: \"kubernetes.io/projected/2401fc34-ec2a-41f1-9080-0c3320c276b1-kube-api-access-98hnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.976437 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:02 crc kubenswrapper[4707]: I0121 16:10:02.994056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2401fc34-ec2a-41f1-9080-0c3320c276b1" (UID: "2401fc34-ec2a-41f1-9080-0c3320c276b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.007207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2401fc34-ec2a-41f1-9080-0c3320c276b1" (UID: "2401fc34-ec2a-41f1-9080-0c3320c276b1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.055712 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.078540 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.078571 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401fc34-ec2a-41f1-9080-0c3320c276b1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.186934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.210270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a373955d-893e-405a-82eb-b36431f0df12" path="/var/lib/kubelet/pods/a373955d-893e-405a-82eb-b36431f0df12/volumes" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.211397 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.218852 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.221914 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.226296 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.228580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.228790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.237623 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.280910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.280964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-config-data\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.281038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcsv\" (UniqueName: \"kubernetes.io/projected/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-kube-api-access-jvcsv\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.281182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-logs\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.281215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.384736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-config-data\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.385189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcsv\" (UniqueName: \"kubernetes.io/projected/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-kube-api-access-jvcsv\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.385435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-logs\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.385473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.385661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.385995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-logs\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.390417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.390547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-config-data\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.392188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.402637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcsv\" (UniqueName: \"kubernetes.io/projected/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-kube-api-access-jvcsv\") pod \"nova-metadata-0\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.524568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t6xq"] Jan 21 16:10:03 crc kubenswrapper[4707]: W0121 16:10:03.526077 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d750c86_177f_4f1d_a5b5_0c4cde73c43b.slice/crio-9b8771111ed23e909de5b5fb0b04454116006880a80abca571cf8f8b416f9548 WatchSource:0}: Error finding container 9b8771111ed23e909de5b5fb0b04454116006880a80abca571cf8f8b416f9548: Status 404 returned error can't find the container with id 9b8771111ed23e909de5b5fb0b04454116006880a80abca571cf8f8b416f9548 Jan 21 16:10:03 crc kubenswrapper[4707]: 
I0121 16:10:03.547295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerStarted","Data":"9b8771111ed23e909de5b5fb0b04454116006880a80abca571cf8f8b416f9548"} Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.554587 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.646797 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:03 crc kubenswrapper[4707]: W0121 16:10:03.650234 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02297ae7_b37f_41c6_ac59_947e2996f44d.slice/crio-4532452c66f4fb7316ba369c1d6dde6f8d93e331af3a3d4782133ba7bd87f78a WatchSource:0}: Error finding container 4532452c66f4fb7316ba369c1d6dde6f8d93e331af3a3d4782133ba7bd87f78a: Status 404 returned error can't find the container with id 4532452c66f4fb7316ba369c1d6dde6f8d93e331af3a3d4782133ba7bd87f78a Jan 21 16:10:03 crc kubenswrapper[4707]: I0121 16:10:03.959201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:10:03 crc kubenswrapper[4707]: W0121 16:10:03.966879 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9ba957_bcd9_4e63_bbf2_4e2398ac259a.slice/crio-b9e71e7fcf09edfb909804e6bf87a12e584d6bd34652be5e2af98fa85d658c69 WatchSource:0}: Error finding container b9e71e7fcf09edfb909804e6bf87a12e584d6bd34652be5e2af98fa85d658c69: Status 404 returned error can't find the container with id b9e71e7fcf09edfb909804e6bf87a12e584d6bd34652be5e2af98fa85d658c69 Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.558104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a","Type":"ContainerStarted","Data":"72ebb7a26f99df0fa91b1dd0b938c53301a324860220aedf1fc9ada4708a5afc"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.559295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a","Type":"ContainerStarted","Data":"899a26b8e50ba33fca57b72d44974297421aa331badc78e13d0e83c15ab03b22"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.559315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a","Type":"ContainerStarted","Data":"b9e71e7fcf09edfb909804e6bf87a12e584d6bd34652be5e2af98fa85d658c69"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.560108 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerID="4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526" exitCode=0 Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.560172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerDied","Data":"4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.561857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"02297ae7-b37f-41c6-ac59-947e2996f44d","Type":"ContainerStarted","Data":"117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.561896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"02297ae7-b37f-41c6-ac59-947e2996f44d","Type":"ContainerStarted","Data":"57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.561906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"02297ae7-b37f-41c6-ac59-947e2996f44d","Type":"ContainerStarted","Data":"4532452c66f4fb7316ba369c1d6dde6f8d93e331af3a3d4782133ba7bd87f78a"} Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.584729 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.584708687 podStartE2EDuration="1.584708687s" podCreationTimestamp="2026-01-21 16:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:10:04.575184097 +0000 UTC m=+4101.756700319" watchObservedRunningTime="2026-01-21 16:10:04.584708687 +0000 UTC m=+4101.766224908" Jan 21 16:10:04 crc kubenswrapper[4707]: I0121 16:10:04.610667 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.6106428 podStartE2EDuration="2.6106428s" podCreationTimestamp="2026-01-21 16:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:10:04.604242766 +0000 UTC m=+4101.785758979" watchObservedRunningTime="2026-01-21 16:10:04.6106428 +0000 UTC m=+4101.792159022" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.146769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.194479 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2401fc34-ec2a-41f1-9080-0c3320c276b1" path="/var/lib/kubelet/pods/2401fc34-ec2a-41f1-9080-0c3320c276b1/volumes" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.423962 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fdvpj"] Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.425770 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.441722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdvpj"] Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.526436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-catalog-content\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.526612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7dp\" (UniqueName: \"kubernetes.io/projected/2b2c1b18-6185-4e89-af41-a8e0892afe68-kube-api-access-hl7dp\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.526756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-utilities\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.575207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerStarted","Data":"592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2"} Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.628521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-utilities\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.629192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-utilities\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.629208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-catalog-content\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.629538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-catalog-content\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.629732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7dp\" (UniqueName: 
\"kubernetes.io/projected/2b2c1b18-6185-4e89-af41-a8e0892afe68-kube-api-access-hl7dp\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.644318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7dp\" (UniqueName: \"kubernetes.io/projected/2b2c1b18-6185-4e89-af41-a8e0892afe68-kube-api-access-hl7dp\") pod \"community-operators-fdvpj\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:05 crc kubenswrapper[4707]: I0121 16:10:05.747929 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:06 crc kubenswrapper[4707]: I0121 16:10:06.179758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdvpj"] Jan 21 16:10:06 crc kubenswrapper[4707]: I0121 16:10:06.590581 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerID="ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9" exitCode=0 Jan 21 16:10:06 crc kubenswrapper[4707]: I0121 16:10:06.590696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerDied","Data":"ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9"} Jan 21 16:10:06 crc kubenswrapper[4707]: I0121 16:10:06.591180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerStarted","Data":"c28fd83defa49c99f339d28117824dc86a75734de0d843d567bc1aaa4326df0b"} Jan 21 16:10:06 crc kubenswrapper[4707]: I0121 16:10:06.594333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerDied","Data":"592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2"} Jan 21 16:10:06 crc kubenswrapper[4707]: I0121 16:10:06.594214 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerID="592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2" exitCode=0 Jan 21 16:10:07 crc kubenswrapper[4707]: I0121 16:10:07.604087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerStarted","Data":"5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961"} Jan 21 16:10:07 crc kubenswrapper[4707]: I0121 16:10:07.608007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerStarted","Data":"3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081"} Jan 21 16:10:07 crc kubenswrapper[4707]: I0121 16:10:07.641149 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6t6xq" podStartSLOduration=3.154449738 podStartE2EDuration="5.6411335s" podCreationTimestamp="2026-01-21 16:10:02 +0000 UTC" firstStartedPulling="2026-01-21 16:10:04.562026898 +0000 UTC m=+4101.743543120" lastFinishedPulling="2026-01-21 
16:10:07.04871066 +0000 UTC m=+4104.230226882" observedRunningTime="2026-01-21 16:10:07.637640081 +0000 UTC m=+4104.819156303" watchObservedRunningTime="2026-01-21 16:10:07.6411335 +0000 UTC m=+4104.822649712" Jan 21 16:10:08 crc kubenswrapper[4707]: I0121 16:10:08.555020 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:08 crc kubenswrapper[4707]: I0121 16:10:08.555111 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:08 crc kubenswrapper[4707]: I0121 16:10:08.615781 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerID="5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961" exitCode=0 Jan 21 16:10:08 crc kubenswrapper[4707]: I0121 16:10:08.615868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerDied","Data":"5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961"} Jan 21 16:10:09 crc kubenswrapper[4707]: I0121 16:10:09.183150 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:10:09 crc kubenswrapper[4707]: E0121 16:10:09.183683 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:10:09 crc kubenswrapper[4707]: I0121 16:10:09.625627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerStarted","Data":"eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f"} Jan 21 16:10:10 crc kubenswrapper[4707]: I0121 16:10:10.146588 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:10 crc kubenswrapper[4707]: I0121 16:10:10.168823 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:10 crc kubenswrapper[4707]: I0121 16:10:10.180092 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fdvpj" podStartSLOduration=2.60748002 podStartE2EDuration="5.180075261s" podCreationTimestamp="2026-01-21 16:10:05 +0000 UTC" firstStartedPulling="2026-01-21 16:10:06.593400121 +0000 UTC m=+4103.774916343" lastFinishedPulling="2026-01-21 16:10:09.165995362 +0000 UTC m=+4106.347511584" observedRunningTime="2026-01-21 16:10:09.637881967 +0000 UTC m=+4106.819398190" watchObservedRunningTime="2026-01-21 16:10:10.180075261 +0000 UTC m=+4107.361591483" Jan 21 16:10:11 crc kubenswrapper[4707]: I0121 16:10:11.261322 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.055889 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:13 crc kubenswrapper[4707]: 
I0121 16:10:13.056102 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.086981 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.222515 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.222559 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.555520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.555556 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.686399 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:13 crc kubenswrapper[4707]: I0121 16:10:13.721335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t6xq"] Jan 21 16:10:14 crc kubenswrapper[4707]: I0121 16:10:14.303968 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:10:14 crc kubenswrapper[4707]: I0121 16:10:14.304270 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:10:14 crc kubenswrapper[4707]: I0121 16:10:14.569912 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:10:14 crc kubenswrapper[4707]: I0121 16:10:14.569973 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:10:15 crc kubenswrapper[4707]: I0121 16:10:15.664326 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6t6xq" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="registry-server" containerID="cri-o://3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081" gracePeriod=2 Jan 21 16:10:15 crc kubenswrapper[4707]: I0121 16:10:15.749027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:15 crc kubenswrapper[4707]: I0121 
16:10:15.749188 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:15 crc kubenswrapper[4707]: I0121 16:10:15.780092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.014694 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.209528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2p6w\" (UniqueName: \"kubernetes.io/projected/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-kube-api-access-b2p6w\") pod \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.209643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-catalog-content\") pod \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.209683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-utilities\") pod \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\" (UID: \"4d750c86-177f-4f1d-a5b5-0c4cde73c43b\") " Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.210350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-utilities" (OuterVolumeSpecName: "utilities") pod "4d750c86-177f-4f1d-a5b5-0c4cde73c43b" (UID: "4d750c86-177f-4f1d-a5b5-0c4cde73c43b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.213517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-kube-api-access-b2p6w" (OuterVolumeSpecName: "kube-api-access-b2p6w") pod "4d750c86-177f-4f1d-a5b5-0c4cde73c43b" (UID: "4d750c86-177f-4f1d-a5b5-0c4cde73c43b"). InnerVolumeSpecName "kube-api-access-b2p6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.241067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d750c86-177f-4f1d-a5b5-0c4cde73c43b" (UID: "4d750c86-177f-4f1d-a5b5-0c4cde73c43b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.311496 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2p6w\" (UniqueName: \"kubernetes.io/projected/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-kube-api-access-b2p6w\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.311519 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.311529 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d750c86-177f-4f1d-a5b5-0c4cde73c43b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.555565 4707 scope.go:117] "RemoveContainer" containerID="1cfbfa0717635e65d7326b7c6d12c3ff52804eea66d22885b209dc75b069dd24" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.572941 4707 scope.go:117] "RemoveContainer" containerID="03100c9eb9b8189faab039c677c5d529534dd8ab01bec2472f4fc82fd729aab9" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.587563 4707 scope.go:117] "RemoveContainer" containerID="46cdecded34ba4caafda591ec18134d5e73fd6deb30f6c20644310f0985b7bb6" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.601931 4707 scope.go:117] "RemoveContainer" containerID="9201040871e23d6bc2a379fa2022d38ff6b9347b086e57318e146ace3cc99e12" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.628485 4707 scope.go:117] "RemoveContainer" containerID="32a92f7f2012dcf1fe6f0d9760599775639a7067de28634de63ab2fdd9b6962c" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.640731 4707 scope.go:117] "RemoveContainer" containerID="01def0d7c0c313086bab3b5dc9cadb48b9024281321ff3f9e84b864c2e80cc95" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.671800 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerID="3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081" exitCode=0 Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.671866 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t6xq" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.671876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerDied","Data":"3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081"} Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.671903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6xq" event={"ID":"4d750c86-177f-4f1d-a5b5-0c4cde73c43b","Type":"ContainerDied","Data":"9b8771111ed23e909de5b5fb0b04454116006880a80abca571cf8f8b416f9548"} Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.671920 4707 scope.go:117] "RemoveContainer" containerID="3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.700551 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t6xq"] Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.706042 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6t6xq"] Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.709224 4707 scope.go:117] "RemoveContainer" containerID="592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.713053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.730829 4707 scope.go:117] "RemoveContainer" containerID="4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.750836 4707 scope.go:117] "RemoveContainer" containerID="3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081" Jan 21 16:10:16 crc kubenswrapper[4707]: E0121 16:10:16.751175 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081\": container with ID starting with 3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081 not found: ID does not exist" containerID="3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.751207 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081"} err="failed to get container status \"3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081\": rpc error: code = NotFound desc = could not find container \"3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081\": container with ID starting with 3718459b79f283100d0fc1be18d16ff09c730df8c78a32ad8402a78c184e5081 not found: ID does not exist" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.751228 4707 scope.go:117] "RemoveContainer" containerID="592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2" Jan 21 16:10:16 crc kubenswrapper[4707]: E0121 16:10:16.751556 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2\": container with ID starting with 
592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2 not found: ID does not exist" containerID="592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.751580 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2"} err="failed to get container status \"592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2\": rpc error: code = NotFound desc = could not find container \"592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2\": container with ID starting with 592070e75b324d3a3c76bbaad4162393f2eef4498b07f870f40d565915161cd2 not found: ID does not exist" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.751597 4707 scope.go:117] "RemoveContainer" containerID="4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526" Jan 21 16:10:16 crc kubenswrapper[4707]: E0121 16:10:16.751827 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526\": container with ID starting with 4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526 not found: ID does not exist" containerID="4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526" Jan 21 16:10:16 crc kubenswrapper[4707]: I0121 16:10:16.751852 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526"} err="failed to get container status \"4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526\": rpc error: code = NotFound desc = could not find container \"4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526\": container with ID starting with 4f07a55dab998d8c4b8e637b427ec8c8d66e513e0d2298a04308d2f18b991526 not found: ID does not exist" Jan 21 16:10:17 crc kubenswrapper[4707]: I0121 16:10:17.060432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:17 crc kubenswrapper[4707]: I0121 16:10:17.190389 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" path="/var/lib/kubelet/pods/4d750c86-177f-4f1d-a5b5-0c4cde73c43b/volumes" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.004542 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdvpj"] Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.004897 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fdvpj" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="registry-server" containerID="cri-o://eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f" gracePeriod=2 Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.672009 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.698671 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerID="eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f" exitCode=0 Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.698706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerDied","Data":"eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f"} Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.698729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdvpj" event={"ID":"2b2c1b18-6185-4e89-af41-a8e0892afe68","Type":"ContainerDied","Data":"c28fd83defa49c99f339d28117824dc86a75734de0d843d567bc1aaa4326df0b"} Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.698744 4707 scope.go:117] "RemoveContainer" containerID="eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.698770 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdvpj" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.714120 4707 scope.go:117] "RemoveContainer" containerID="5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.729057 4707 scope.go:117] "RemoveContainer" containerID="ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.758572 4707 scope.go:117] "RemoveContainer" containerID="eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f" Jan 21 16:10:19 crc kubenswrapper[4707]: E0121 16:10:19.758928 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f\": container with ID starting with eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f not found: ID does not exist" containerID="eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.758957 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f"} err="failed to get container status \"eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f\": rpc error: code = NotFound desc = could not find container \"eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f\": container with ID starting with eef16f35e28bebb0043f2cc6c483378fb4e42471d497a1bd9c7d92c17dbbb08f not found: ID does not exist" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.758974 4707 scope.go:117] "RemoveContainer" containerID="5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961" Jan 21 16:10:19 crc kubenswrapper[4707]: E0121 16:10:19.759272 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961\": container with ID starting with 5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961 not found: ID does not exist" 
containerID="5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.759295 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961"} err="failed to get container status \"5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961\": rpc error: code = NotFound desc = could not find container \"5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961\": container with ID starting with 5f2053b28b3e94efe1cb4bcb1897e401336625fe99715084f667979561100961 not found: ID does not exist" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.759314 4707 scope.go:117] "RemoveContainer" containerID="ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9" Jan 21 16:10:19 crc kubenswrapper[4707]: E0121 16:10:19.759536 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9\": container with ID starting with ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9 not found: ID does not exist" containerID="ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.759556 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9"} err="failed to get container status \"ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9\": rpc error: code = NotFound desc = could not find container \"ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9\": container with ID starting with ad1258c15ebaed1d23943365b816c14c95b456d1028c191655e0613deefd63e9 not found: ID does not exist" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.774024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl7dp\" (UniqueName: \"kubernetes.io/projected/2b2c1b18-6185-4e89-af41-a8e0892afe68-kube-api-access-hl7dp\") pod \"2b2c1b18-6185-4e89-af41-a8e0892afe68\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.774224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-utilities\") pod \"2b2c1b18-6185-4e89-af41-a8e0892afe68\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.774254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-catalog-content\") pod \"2b2c1b18-6185-4e89-af41-a8e0892afe68\" (UID: \"2b2c1b18-6185-4e89-af41-a8e0892afe68\") " Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.774614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-utilities" (OuterVolumeSpecName: "utilities") pod "2b2c1b18-6185-4e89-af41-a8e0892afe68" (UID: "2b2c1b18-6185-4e89-af41-a8e0892afe68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.774786 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.778259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2c1b18-6185-4e89-af41-a8e0892afe68-kube-api-access-hl7dp" (OuterVolumeSpecName: "kube-api-access-hl7dp") pod "2b2c1b18-6185-4e89-af41-a8e0892afe68" (UID: "2b2c1b18-6185-4e89-af41-a8e0892afe68"). InnerVolumeSpecName "kube-api-access-hl7dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.812453 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b2c1b18-6185-4e89-af41-a8e0892afe68" (UID: "2b2c1b18-6185-4e89-af41-a8e0892afe68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.876506 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl7dp\" (UniqueName: \"kubernetes.io/projected/2b2c1b18-6185-4e89-af41-a8e0892afe68-kube-api-access-hl7dp\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:19 crc kubenswrapper[4707]: I0121 16:10:19.876532 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b2c1b18-6185-4e89-af41-a8e0892afe68-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:20 crc kubenswrapper[4707]: I0121 16:10:20.021218 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdvpj"] Jan 21 16:10:20 crc kubenswrapper[4707]: I0121 16:10:20.026469 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fdvpj"] Jan 21 16:10:21 crc kubenswrapper[4707]: I0121 16:10:21.189849 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" path="/var/lib/kubelet/pods/2b2c1b18-6185-4e89-af41-a8e0892afe68/volumes" Jan 21 16:10:22 crc kubenswrapper[4707]: I0121 16:10:22.182115 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:10:22 crc kubenswrapper[4707]: E0121 16:10:22.182372 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.225990 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.226050 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.226256 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.226287 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.227883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.228134 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.559712 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.561908 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.563399 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:23 crc kubenswrapper[4707]: I0121 16:10:23.733888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:10:24 crc kubenswrapper[4707]: I0121 16:10:24.977108 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:10:24 crc kubenswrapper[4707]: I0121 16:10:24.977629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-central-agent" containerID="cri-o://81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5" gracePeriod=30 Jan 21 16:10:24 crc kubenswrapper[4707]: I0121 16:10:24.977701 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="sg-core" containerID="cri-o://209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337" gracePeriod=30 Jan 21 16:10:24 crc kubenswrapper[4707]: I0121 16:10:24.977734 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-notification-agent" containerID="cri-o://952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25" gracePeriod=30 Jan 21 16:10:24 crc kubenswrapper[4707]: I0121 16:10:24.977734 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="proxy-httpd" containerID="cri-o://84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988" gracePeriod=30 Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.068930 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745705 4707 generic.go:334] "Generic (PLEG): container finished" podID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerID="84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988" exitCode=0 Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745742 4707 generic.go:334] "Generic (PLEG): container finished" podID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerID="209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337" 
exitCode=2 Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745751 4707 generic.go:334] "Generic (PLEG): container finished" podID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerID="81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5" exitCode=0 Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerDied","Data":"84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988"} Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerDied","Data":"209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337"} Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerDied","Data":"81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5"} Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.745982 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-log" containerID="cri-o://57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664" gracePeriod=30 Jan 21 16:10:25 crc kubenswrapper[4707]: I0121 16:10:25.746197 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-api" containerID="cri-o://117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0" gracePeriod=30 Jan 21 16:10:26 crc kubenswrapper[4707]: I0121 16:10:26.756101 4707 generic.go:334] "Generic (PLEG): container finished" podID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerID="57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664" exitCode=143 Jan 21 16:10:26 crc kubenswrapper[4707]: I0121 16:10:26.756207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"02297ae7-b37f-41c6-ac59-947e2996f44d","Type":"ContainerDied","Data":"57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664"} Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.138658 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.226096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkwdr\" (UniqueName: \"kubernetes.io/projected/02297ae7-b37f-41c6-ac59-947e2996f44d-kube-api-access-qkwdr\") pod \"02297ae7-b37f-41c6-ac59-947e2996f44d\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.226242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-combined-ca-bundle\") pod \"02297ae7-b37f-41c6-ac59-947e2996f44d\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.226286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02297ae7-b37f-41c6-ac59-947e2996f44d-logs\") pod \"02297ae7-b37f-41c6-ac59-947e2996f44d\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.226394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-config-data\") pod \"02297ae7-b37f-41c6-ac59-947e2996f44d\" (UID: \"02297ae7-b37f-41c6-ac59-947e2996f44d\") " Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.228453 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02297ae7-b37f-41c6-ac59-947e2996f44d-logs" (OuterVolumeSpecName: "logs") pod "02297ae7-b37f-41c6-ac59-947e2996f44d" (UID: "02297ae7-b37f-41c6-ac59-947e2996f44d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.232777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02297ae7-b37f-41c6-ac59-947e2996f44d-kube-api-access-qkwdr" (OuterVolumeSpecName: "kube-api-access-qkwdr") pod "02297ae7-b37f-41c6-ac59-947e2996f44d" (UID: "02297ae7-b37f-41c6-ac59-947e2996f44d"). InnerVolumeSpecName "kube-api-access-qkwdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.250954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02297ae7-b37f-41c6-ac59-947e2996f44d" (UID: "02297ae7-b37f-41c6-ac59-947e2996f44d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.251252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-config-data" (OuterVolumeSpecName: "config-data") pod "02297ae7-b37f-41c6-ac59-947e2996f44d" (UID: "02297ae7-b37f-41c6-ac59-947e2996f44d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.327990 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02297ae7-b37f-41c6-ac59-947e2996f44d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.328018 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.328030 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkwdr\" (UniqueName: \"kubernetes.io/projected/02297ae7-b37f-41c6-ac59-947e2996f44d-kube-api-access-qkwdr\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.328040 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02297ae7-b37f-41c6-ac59-947e2996f44d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.777469 4707 generic.go:334] "Generic (PLEG): container finished" podID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerID="117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0" exitCode=0 Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.777762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"02297ae7-b37f-41c6-ac59-947e2996f44d","Type":"ContainerDied","Data":"117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0"} Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.777789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"02297ae7-b37f-41c6-ac59-947e2996f44d","Type":"ContainerDied","Data":"4532452c66f4fb7316ba369c1d6dde6f8d93e331af3a3d4782133ba7bd87f78a"} Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.777823 4707 scope.go:117] "RemoveContainer" containerID="117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.777959 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.801720 4707 scope.go:117] "RemoveContainer" containerID="57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.815825 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.823985 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.826667 4707 scope.go:117] "RemoveContainer" containerID="117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.827530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0\": container with ID starting with 117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0 not found: ID does not exist" containerID="117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.827562 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0"} err="failed to get container status \"117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0\": rpc error: code = NotFound desc = could not find container \"117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0\": container with ID starting with 117c770c127f9e3ca302c324e8b038bcedc7f9ac5891ecbc55088ff8778e94e0 not found: ID does not exist" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.827582 4707 scope.go:117] "RemoveContainer" containerID="57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.827820 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664\": container with ID starting with 57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664 not found: ID does not exist" containerID="57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.827861 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664"} err="failed to get container status \"57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664\": rpc error: code = NotFound desc = could not find container \"57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664\": container with ID starting with 57375738fac0b62a05dd1b6c0c29521a62f66f7509590bc8657e53051643b664 not found: ID does not exist" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.834397 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.835128 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-log" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.835147 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" 
containerName="nova-api-log" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.835213 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="extract-content" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.835221 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="extract-content" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.835239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="registry-server" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.835246 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="registry-server" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.836047 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="extract-content" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836061 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="extract-content" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.836073 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-api" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836079 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-api" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.836099 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="extract-utilities" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836106 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="extract-utilities" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.836114 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="extract-utilities" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836119 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="extract-utilities" Jan 21 16:10:29 crc kubenswrapper[4707]: E0121 16:10:29.836140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="registry-server" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836145 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="registry-server" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836340 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d750c86-177f-4f1d-a5b5-0c4cde73c43b" containerName="registry-server" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836359 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-log" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836365 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2c1b18-6185-4e89-af41-a8e0892afe68" containerName="registry-server" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.836374 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" containerName="nova-api-api" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.837318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.838912 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.839436 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.839449 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.847542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.961851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-config-data\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.961969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.962016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.962031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-public-tls-certs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.962069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xv59\" (UniqueName: \"kubernetes.io/projected/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-kube-api-access-7xv59\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:29 crc kubenswrapper[4707]: I0121 16:10:29.962093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-logs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.063591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-config-data\") pod \"nova-api-0\" (UID: 
\"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.063945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.063984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.063998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-public-tls-certs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.064028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xv59\" (UniqueName: \"kubernetes.io/projected/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-kube-api-access-7xv59\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.064046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-logs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.064402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-logs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.069129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.069596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-config-data\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.069858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.077949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xv59\" (UniqueName: 
\"kubernetes.io/projected/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-kube-api-access-7xv59\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.082860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-public-tls-certs\") pod \"nova-api-0\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.139447 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.157725 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.266669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-ceilometer-tls-certs\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.266707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-combined-ca-bundle\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.266734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-log-httpd\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.266771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-run-httpd\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.266871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lfbr\" (UniqueName: \"kubernetes.io/projected/0942ee4d-0234-4c83-ab9a-04320d525c7d-kube-api-access-2lfbr\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.266941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-config-data\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.267300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-sg-core-conf-yaml\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.267367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-scripts\") pod \"0942ee4d-0234-4c83-ab9a-04320d525c7d\" (UID: \"0942ee4d-0234-4c83-ab9a-04320d525c7d\") " Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.267514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.267999 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.268277 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.270817 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-scripts" (OuterVolumeSpecName: "scripts") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.270861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0942ee4d-0234-4c83-ab9a-04320d525c7d-kube-api-access-2lfbr" (OuterVolumeSpecName: "kube-api-access-2lfbr") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "kube-api-access-2lfbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.288462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.304725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.331370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.345037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-config-data" (OuterVolumeSpecName: "config-data") pod "0942ee4d-0234-4c83-ab9a-04320d525c7d" (UID: "0942ee4d-0234-4c83-ab9a-04320d525c7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371702 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371737 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0942ee4d-0234-4c83-ab9a-04320d525c7d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371747 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371757 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lfbr\" (UniqueName: \"kubernetes.io/projected/0942ee4d-0234-4c83-ab9a-04320d525c7d-kube-api-access-2lfbr\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371766 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371774 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.371781 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0942ee4d-0234-4c83-ab9a-04320d525c7d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.545001 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.801413 4707 generic.go:334] "Generic (PLEG): container finished" podID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerID="952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25" exitCode=0 Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.801444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerDied","Data":"952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25"} Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.801479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0942ee4d-0234-4c83-ab9a-04320d525c7d","Type":"ContainerDied","Data":"e0d244d5e7212e4a2eb4d3979d91cf14f095a2b4a6a0fbd536b88c98725d8452"} Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.801505 4707 scope.go:117] "RemoveContainer" 
containerID="84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.801447 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.804954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f072c1c5-92fb-4b9c-86dd-6f34cffb4166","Type":"ContainerStarted","Data":"3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40"} Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.804989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f072c1c5-92fb-4b9c-86dd-6f34cffb4166","Type":"ContainerStarted","Data":"56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e"} Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.805000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f072c1c5-92fb-4b9c-86dd-6f34cffb4166","Type":"ContainerStarted","Data":"790d957d81693df9bfc0deb8ac0aa00fdda85b4d65813b91db8458d52703bb4b"} Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.828199 4707 scope.go:117] "RemoveContainer" containerID="209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.828302 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.8282889039999999 podStartE2EDuration="1.828288904s" podCreationTimestamp="2026-01-21 16:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:10:30.821076451 +0000 UTC m=+4128.002592673" watchObservedRunningTime="2026-01-21 16:10:30.828288904 +0000 UTC m=+4128.009805125" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.849011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.858215 4707 scope.go:117] "RemoveContainer" containerID="952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.867445 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.884660 4707 scope.go:117] "RemoveContainer" containerID="81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.887280 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.887691 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="sg-core" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.887710 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="sg-core" Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.887740 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="proxy-httpd" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.887747 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="proxy-httpd" Jan 21 
16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.887761 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-notification-agent" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.887767 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-notification-agent" Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.887780 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-central-agent" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.887785 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-central-agent" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.888000 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-notification-agent" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.888020 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="sg-core" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.888041 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="ceilometer-central-agent" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.888050 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" containerName="proxy-httpd" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.889529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.892391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.892591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.892889 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.900869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.903015 4707 scope.go:117] "RemoveContainer" containerID="84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988" Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.905405 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988\": container with ID starting with 84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988 not found: ID does not exist" containerID="84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.905450 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988"} err="failed to get container status \"84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988\": rpc error: code = NotFound desc = 
could not find container \"84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988\": container with ID starting with 84ffdeda21df5ab963b7b4e1debc78e5fa8e90e76a0473bd79e70e62ed6e0988 not found: ID does not exist" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.905479 4707 scope.go:117] "RemoveContainer" containerID="209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337" Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.905838 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337\": container with ID starting with 209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337 not found: ID does not exist" containerID="209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.905863 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337"} err="failed to get container status \"209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337\": rpc error: code = NotFound desc = could not find container \"209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337\": container with ID starting with 209d33a7d93a7870a737cf1bd8ae2b2b23aeb9bf5f60cfd2a46525d86e0e6337 not found: ID does not exist" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.905884 4707 scope.go:117] "RemoveContainer" containerID="952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25" Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.906208 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25\": container with ID starting with 952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25 not found: ID does not exist" containerID="952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.906249 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25"} err="failed to get container status \"952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25\": rpc error: code = NotFound desc = could not find container \"952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25\": container with ID starting with 952a2c8a303707c3994c79b4ee2554e608c2d72e1594152e8a60d592edc71c25 not found: ID does not exist" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.906278 4707 scope.go:117] "RemoveContainer" containerID="81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5" Jan 21 16:10:30 crc kubenswrapper[4707]: E0121 16:10:30.906504 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5\": container with ID starting with 81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5 not found: ID does not exist" containerID="81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.906522 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5"} err="failed to get container status \"81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5\": rpc error: code = NotFound desc = could not find container \"81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5\": container with ID starting with 81046b0c46d53fa1737eb0c2c3d1d9d7d32da834223c535d873c6a9e08f979a5 not found: ID does not exist" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfj82\" (UniqueName: \"kubernetes.io/projected/97e2beb7-1eea-43c2-909e-732a2e8c822c-kube-api-access-kfj82\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-run-httpd\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-scripts\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-log-httpd\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:30 crc kubenswrapper[4707]: I0121 16:10:30.987624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-config-data\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" 
Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.088800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.088875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-config-data\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.088935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfj82\" (UniqueName: \"kubernetes.io/projected/97e2beb7-1eea-43c2-909e-732a2e8c822c-kube-api-access-kfj82\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.088957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.089003 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-run-httpd\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.089026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.089056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-scripts\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.089085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-log-httpd\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.089484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-log-httpd\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.089742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.092211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.092559 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.092574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.092935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-config-data\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.097373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-scripts\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.104719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfj82\" (UniqueName: \"kubernetes.io/projected/97e2beb7-1eea-43c2-909e-732a2e8c822c-kube-api-access-kfj82\") pod \"ceilometer-0\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.191595 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02297ae7-b37f-41c6-ac59-947e2996f44d" path="/var/lib/kubelet/pods/02297ae7-b37f-41c6-ac59-947e2996f44d/volumes" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.192275 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0942ee4d-0234-4c83-ab9a-04320d525c7d" path="/var/lib/kubelet/pods/0942ee4d-0234-4c83-ab9a-04320d525c7d/volumes" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.206639 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.591748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:10:31 crc kubenswrapper[4707]: W0121 16:10:31.597081 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97e2beb7_1eea_43c2_909e_732a2e8c822c.slice/crio-183d24e9abd1ee2106920a99a99c89fa85eeea60076365fab7636e9243d872f7 WatchSource:0}: Error finding container 183d24e9abd1ee2106920a99a99c89fa85eeea60076365fab7636e9243d872f7: Status 404 returned error can't find the container with id 183d24e9abd1ee2106920a99a99c89fa85eeea60076365fab7636e9243d872f7 Jan 21 16:10:31 crc kubenswrapper[4707]: I0121 16:10:31.816107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerStarted","Data":"183d24e9abd1ee2106920a99a99c89fa85eeea60076365fab7636e9243d872f7"} Jan 21 16:10:32 crc kubenswrapper[4707]: I0121 16:10:32.825290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerStarted","Data":"6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b"} Jan 21 16:10:33 crc kubenswrapper[4707]: I0121 16:10:33.834446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerStarted","Data":"53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4"} Jan 21 16:10:34 crc kubenswrapper[4707]: I0121 16:10:34.843526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerStarted","Data":"23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065"} Jan 21 16:10:35 crc kubenswrapper[4707]: I0121 16:10:35.185950 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:10:35 crc kubenswrapper[4707]: E0121 16:10:35.188535 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:10:35 crc kubenswrapper[4707]: I0121 16:10:35.853026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerStarted","Data":"579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53"} Jan 21 16:10:35 crc kubenswrapper[4707]: I0121 16:10:35.853470 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:10:35 crc kubenswrapper[4707]: I0121 16:10:35.871029 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.13292729 podStartE2EDuration="5.871017526s" podCreationTimestamp="2026-01-21 16:10:30 +0000 UTC" firstStartedPulling="2026-01-21 16:10:31.598348964 +0000 UTC m=+4128.779865186" 
lastFinishedPulling="2026-01-21 16:10:35.336439201 +0000 UTC m=+4132.517955422" observedRunningTime="2026-01-21 16:10:35.868763778 +0000 UTC m=+4133.050280000" watchObservedRunningTime="2026-01-21 16:10:35.871017526 +0000 UTC m=+4133.052533748" Jan 21 16:10:40 crc kubenswrapper[4707]: I0121 16:10:40.158299 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:40 crc kubenswrapper[4707]: I0121 16:10:40.158715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:41 crc kubenswrapper[4707]: I0121 16:10:41.167911 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:10:41 crc kubenswrapper[4707]: I0121 16:10:41.167946 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:10:48 crc kubenswrapper[4707]: I0121 16:10:48.183334 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:10:48 crc kubenswrapper[4707]: I0121 16:10:48.949940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"973e8f3f79963508f548e1f1508104218595db51536e1edaf4cd1d819983387a"} Jan 21 16:10:50 crc kubenswrapper[4707]: I0121 16:10:50.164323 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:50 crc kubenswrapper[4707]: I0121 16:10:50.165153 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:50 crc kubenswrapper[4707]: I0121 16:10:50.168524 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:50 crc kubenswrapper[4707]: I0121 16:10:50.170667 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:50 crc kubenswrapper[4707]: I0121 16:10:50.962120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:10:50 crc kubenswrapper[4707]: I0121 16:10:50.969876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:01 crc kubenswrapper[4707]: I0121 16:11:01.213907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.458004 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.458598 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" containerName="openstackclient" 
containerID="cri-o://9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049" gracePeriod=2 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.465331 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.477532 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.477725 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="ovn-northd" containerID="cri-o://f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.477871 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="openstack-network-exporter" containerID="cri-o://edc9c99ba54dd7b30862a576d3d32b484fff02281a8a134d2f7a03fdef6a53c6" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.518604 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9"] Jan 21 16:11:03 crc kubenswrapper[4707]: E0121 16:11:03.519297 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" containerName="openstackclient" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.519317 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" containerName="openstackclient" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.519533 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" containerName="openstackclient" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.520576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.531082 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.544075 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.570878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.591902 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.605172 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.605377 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-log" containerID="cri-o://56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.607645 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-api" containerID="cri-o://3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.619200 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.619516 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="openstack-network-exporter" containerID="cri-o://8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106" gracePeriod=300 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.626460 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.626601 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="43b25315-bece-4c48-bd94-3132650e2b14" containerName="nova-scheduler-scheduler" containerID="cri-o://5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.634454 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.634670 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-log" containerID="cri-o://117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.634788 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-httpd" containerID="cri-o://7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.643520 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.643695 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" containerName="memcached" containerID="cri-o://398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.651882 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.652069 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" containerID="cri-o://899a26b8e50ba33fca57b72d44974297421aa331badc78e13d0e83c15ab03b22" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.652181 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" containerID="cri-o://72ebb7a26f99df0fa91b1dd0b938c53301a324860220aedf1fc9ada4708a5afc" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.665350 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.665629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="cinder-scheduler" containerID="cri-o://b61d287158b2568660f871fafa7221b083307857176a321166ff5c5c0acca885" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.665784 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="probe" containerID="cri-o://6da93903072338fa09723fab8b4ae763bcee0d8308c91e3e562a7090cddaa6a2" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.686681 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.686898 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.705606 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.705884 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api-log" containerID="cri-o://f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.706036 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api" containerID="cri-o://fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 
16:11:03.710622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.710660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data-custom\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.711780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f927e632-11ba-4d56-993a-f89357f885a4-logs\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.711835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b115ad1a-72c2-4d2e-8669-4d125acf6751-logs\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.711862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.711965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrsr\" (UniqueName: \"kubernetes.io/projected/b115ad1a-72c2-4d2e-8669-4d125acf6751-kube-api-access-cqrsr\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.712000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data-custom\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.712018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-combined-ca-bundle\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.712038 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-combined-ca-bundle\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.712072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zg5\" (UniqueName: \"kubernetes.io/projected/f927e632-11ba-4d56-993a-f89357f885a4-kube-api-access-44zg5\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.741867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.742054 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3fa65774e091d2648c8e5d199347bcc9ac19de4c12027001ad23996032d7b010" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.796249 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": EOF" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.796339 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": read tcp 10.217.0.2:48840->10.217.1.209:8775: read: connection reset by peer" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.796454 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": EOF" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.796596 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5c9b5b5886-675s9"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.797684 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.813848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrsr\" (UniqueName: \"kubernetes.io/projected/b115ad1a-72c2-4d2e-8669-4d125acf6751-kube-api-access-cqrsr\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.813892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data-custom\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.813910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-combined-ca-bundle\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.813937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-combined-ca-bundle\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.813954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zg5\" (UniqueName: \"kubernetes.io/projected/f927e632-11ba-4d56-993a-f89357f885a4-kube-api-access-44zg5\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.814019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.814041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data-custom\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.814110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f927e632-11ba-4d56-993a-f89357f885a4-logs\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.814130 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b115ad1a-72c2-4d2e-8669-4d125acf6751-logs\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.814153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.821246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f927e632-11ba-4d56-993a-f89357f885a4-logs\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.821521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b115ad1a-72c2-4d2e-8669-4d125acf6751-logs\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.824292 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.824502 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-log" containerID="cri-o://c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.824636 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-httpd" containerID="cri-o://240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60" gracePeriod=30 Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.827516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data-custom\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.828028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data-custom\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.830318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-combined-ca-bundle\") pod 
\"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.839844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.842121 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.845287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-combined-ca-bundle\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.847395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.867869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zg5\" (UniqueName: \"kubernetes.io/projected/f927e632-11ba-4d56-993a-f89357f885a4-kube-api-access-44zg5\") pod \"barbican-worker-784ddd84c9-bhvh9\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.884703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrsr\" (UniqueName: \"kubernetes.io/projected/b115ad1a-72c2-4d2e-8669-4d125acf6751-kube-api-access-cqrsr\") pod \"barbican-keystone-listener-74486d786b-zsvr9\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.884778 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c9b5b5886-675s9"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.911990 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.916883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-scripts\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.916967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-combined-ca-bundle\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " 
pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.916996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-credential-keys\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.917030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.917053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.917082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7lmq\" (UniqueName: \"kubernetes.io/projected/1528d71b-31ac-461e-b1d9-29c348f3d63e-kube-api-access-k7lmq\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.917098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-config-data\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.930892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-fernet-keys\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.976248 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:11:03 crc kubenswrapper[4707]: I0121 16:11:03.997224 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.023712 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.025243 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-scripts\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-combined-ca-bundle\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-credential-keys\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7lmq\" (UniqueName: \"kubernetes.io/projected/1528d71b-31ac-461e-b1d9-29c348f3d63e-kube-api-access-k7lmq\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-config-data\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.043466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-fernet-keys\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.047325 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5878d64444-swb5j"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.049055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.049459 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.049502 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:04.549486976 +0000 UTC m=+4161.731003199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.050054 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.050103 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:04.550087566 +0000 UTC m=+4161.731603788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.065637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-credential-keys\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.067335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-combined-ca-bundle\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.085195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-scripts\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.085624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-fernet-keys\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.091128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7lmq\" (UniqueName: 
\"kubernetes.io/projected/1528d71b-31ac-461e-b1d9-29c348f3d63e-kube-api-access-k7lmq\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.093437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-config-data\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.112149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.148601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5878d64444-swb5j"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-logs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-combined-ca-bundle\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-config-data\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data-custom\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163611 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-combined-ca-bundle\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwl2q\" (UniqueName: \"kubernetes.io/projected/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-kube-api-access-jwl2q\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-scripts\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.163848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blwt9\" (UniqueName: \"kubernetes.io/projected/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-kube-api-access-blwt9\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.166787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.167117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-logs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.183876 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="ovsdbserver-sb" 
containerID="cri-o://3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7" gracePeriod=300 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.191985 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.192516 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="openstack-network-exporter" containerID="cri-o://c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de" gracePeriod=300 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.198707 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6f795569d6-mmhxt"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.234063 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.244655 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerID="117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0" exitCode=143 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.244948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8bdcff4-af97-483e-80bb-21b36c870c0b","Type":"ContainerDied","Data":"117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0"} Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-config-data\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data-custom\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " 
pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.283962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-combined-ca-bundle\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.284000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwl2q\" (UniqueName: \"kubernetes.io/projected/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-kube-api-access-jwl2q\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.284101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-scripts\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.284127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blwt9\" (UniqueName: \"kubernetes.io/projected/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-kube-api-access-blwt9\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.285034 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd6d7315-b24c-411b-acc8-bda614951af2" containerID="edc9c99ba54dd7b30862a576d3d32b484fff02281a8a134d2f7a03fdef6a53c6" exitCode=2 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.285170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"bd6d7315-b24c-411b-acc8-bda614951af2","Type":"ContainerDied","Data":"edc9c99ba54dd7b30862a576d3d32b484fff02281a8a134d2f7a03fdef6a53c6"} Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.288947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.289149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-logs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.289206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-logs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.289226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-combined-ca-bundle\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.290253 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.290359 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:04.790326157 +0000 UTC m=+4161.971842368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.291333 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.291385 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:04.791373134 +0000 UTC m=+4161.972889357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.291448 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.291475 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:04.791468404 +0000 UTC m=+4161.972984626 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.291519 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.291544 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:04.791535711 +0000 UTC m=+4161.973051932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.296227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-scripts\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.297248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-combined-ca-bundle\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.297527 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f795569d6-mmhxt"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.298276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-logs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.298494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-logs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.309243 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerID="899a26b8e50ba33fca57b72d44974297421aa331badc78e13d0e83c15ab03b22" exitCode=143 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.309289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a","Type":"ContainerDied","Data":"899a26b8e50ba33fca57b72d44974297421aa331badc78e13d0e83c15ab03b22"} Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.330482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jwl2q\" (UniqueName: \"kubernetes.io/projected/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-kube-api-access-jwl2q\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.330865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-config-data\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.331495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blwt9\" (UniqueName: \"kubernetes.io/projected/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-kube-api-access-blwt9\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.333208 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-combined-ca-bundle\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.338487 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="ovsdbserver-nb" containerID="cri-o://337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733" gracePeriod=300 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.360462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data-custom\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.375932 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerName="galera" containerID="cri-o://6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e" gracePeriod=30 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.397493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-config\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.397622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-httpd-config\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.397689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.397790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-combined-ca-bundle\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.398142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.398208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tf5n\" (UniqueName: \"kubernetes.io/projected/f904e1e0-53bb-45dc-90c4-df6a6783f58a-kube-api-access-4tf5n\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.398238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.399303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.457933 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerName="galera" containerID="cri-o://48493be85c0a9d73f2fef3ed9d02b2356ce6d369c4ebbf5cf3e2300c4c9f60dc" gracePeriod=30 Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-config\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-httpd-config\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-combined-ca-bundle\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tf5n\" (UniqueName: \"kubernetes.io/projected/f904e1e0-53bb-45dc-90c4-df6a6783f58a-kube-api-access-4tf5n\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.502979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.503092 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.503136 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.003122566 +0000 UTC m=+4162.184638788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.507206 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.507266 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.007248033 +0000 UTC m=+4162.188764254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.507339 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.507375 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.007365924 +0000 UTC m=+4162.188882147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-ovndbs" not found Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.510185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-config\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.512365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-combined-ca-bundle\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.517889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-httpd-config\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.527394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tf5n\" (UniqueName: \"kubernetes.io/projected/f904e1e0-53bb-45dc-90c4-df6a6783f58a-kube-api-access-4tf5n\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.607355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.607444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.607569 4707 secret.go:188] Couldn't get secret 
openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.607685 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.607710 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.607690645 +0000 UTC m=+4162.789206867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.607848 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.607775966 +0000 UTC m=+4162.789292188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.683137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9"] Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.813702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.814148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.814315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.814353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.814494 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: 
secret "cert-barbican-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.814554 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.814540044 +0000 UTC m=+4162.996056266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.814956 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.814991 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.814981584 +0000 UTC m=+4162.996497805 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-public-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.814993 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.815028 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.815057 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.815041957 +0000 UTC m=+4162.996558179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: E0121 16:11:04.815075 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:05.815068587 +0000 UTC m=+4162.996584808 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.983430 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_687b8d2a-f54f-4c5c-9d23-d5b0874682bb/ovsdbserver-sb/0.log" Jan 21 16:11:04 crc kubenswrapper[4707]: I0121 16:11:04.983718 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.013343 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_69a16f95-9cdc-4bab-b68d-275212d909a5/ovsdbserver-nb/0.log" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.013412 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.021333 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.022566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-metrics-certs-tls-certs\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.022623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.022653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-memcached-tls-certs\") pod \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.022675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-combined-ca-bundle\") pod \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.022693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tcqw\" (UniqueName: \"kubernetes.io/projected/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kube-api-access-5tcqw\") pod \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.022715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-metrics-certs-tls-certs\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.023075 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.023136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.023345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.023440 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.023469 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.023502 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.023485372 +0000 UTC m=+4163.205001594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.023519 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.023513224 +0000 UTC m=+4163.205029447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.023576 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.023613 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.023601421 +0000 UTC m=+4163.205117643 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-ovndbs" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.064063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.064248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kube-api-access-5tcqw" (OuterVolumeSpecName: "kube-api-access-5tcqw") pod "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77"). InnerVolumeSpecName "kube-api-access-5tcqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.087656 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q774q\" (UniqueName: \"kubernetes.io/projected/69a16f95-9cdc-4bab-b68d-275212d909a5-kube-api-access-q774q\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-config\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config\") pod \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdb-rundir\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-scripts\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-combined-ca-bundle\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124605 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data\") pod \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\" (UID: \"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-config\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdbserver-sb-tls-certs\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdb-rundir\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-scripts\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.124954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-combined-ca-bundle\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdgxm\" (UniqueName: \"kubernetes.io/projected/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-kube-api-access-gdgxm\") pod \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\" (UID: \"687b8d2a-f54f-4c5c-9d23-d5b0874682bb\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdbserver-nb-tls-certs\") pod \"69a16f95-9cdc-4bab-b68d-275212d909a5\" (UID: \"69a16f95-9cdc-4bab-b68d-275212d909a5\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125651 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125677 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125687 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tcqw\" (UniqueName: \"kubernetes.io/projected/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kube-api-access-5tcqw\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.125847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-config" (OuterVolumeSpecName: "config") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.126237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.127204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data" (OuterVolumeSpecName: "config-data") pod "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.127721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-scripts" (OuterVolumeSpecName: "scripts") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.128140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.128280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-config" (OuterVolumeSpecName: "config") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.128726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-scripts" (OuterVolumeSpecName: "scripts") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.128820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.142316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.149436 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.150593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a16f95-9cdc-4bab-b68d-275212d909a5-kube-api-access-q774q" (OuterVolumeSpecName: "kube-api-access-q774q") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "kube-api-access-q774q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.150912 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.152198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-kube-api-access-gdgxm" (OuterVolumeSpecName: "kube-api-access-gdgxm") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "kube-api-access-gdgxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.154272 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.154320 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="43b25315-bece-4c48-bd94-3132650e2b14" containerName="nova-scheduler-scheduler" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.171168 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.198486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" (UID: "7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.200717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.203957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232127 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232151 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232177 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232187 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232207 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232218 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232228 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232238 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232250 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232262 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232271 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232280 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdgxm\" (UniqueName: \"kubernetes.io/projected/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-kube-api-access-gdgxm\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232289 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q774q\" (UniqueName: \"kubernetes.io/projected/69a16f95-9cdc-4bab-b68d-275212d909a5-kube-api-access-q774q\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 
16:11:05.232299 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a16f95-9cdc-4bab-b68d-275212d909a5-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.232309 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.241103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.248020 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.318674 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerID="6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e" exitCode=0 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.318752 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.318774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"fa0ab18b-efe2-4553-8e4d-cf4f1468c460","Type":"ContainerDied","Data":"6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.318861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"fa0ab18b-efe2-4553-8e4d-cf4f1468c460","Type":"ContainerDied","Data":"e91aaa9c7ac43f3a7514125b48c135e813d8bff6757c89f33328a6f2f7612eee"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.318883 4707 scope.go:117] "RemoveContainer" containerID="6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.321533 4707 generic.go:334] "Generic (PLEG): container finished" podID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerID="48493be85c0a9d73f2fef3ed9d02b2356ce6d369c4ebbf5cf3e2300c4c9f60dc" exitCode=0 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.321655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7","Type":"ContainerDied","Data":"48493be85c0a9d73f2fef3ed9d02b2356ce6d369c4ebbf5cf3e2300c4c9f60dc"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.326369 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerID="6da93903072338fa09723fab8b4ae763bcee0d8308c91e3e562a7090cddaa6a2" exitCode=0 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.326440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5a6deb2f-fd3d-494e-8d58-c6da07de2498","Type":"ContainerDied","Data":"6da93903072338fa09723fab8b4ae763bcee0d8308c91e3e562a7090cddaa6a2"} Jan 21 
16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.329247 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd534564-de96-44c7-9026-a55d66da26ef" containerID="c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5" exitCode=143 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.329323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dd534564-de96-44c7-9026-a55d66da26ef","Type":"ContainerDied","Data":"c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332235 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_69a16f95-9cdc-4bab-b68d-275212d909a5/ovsdbserver-nb/0.log" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332286 4707 generic.go:334] "Generic (PLEG): container finished" podID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerID="c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de" exitCode=2 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332301 4707 generic.go:334] "Generic (PLEG): container finished" podID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerID="337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733" exitCode=143 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69a16f95-9cdc-4bab-b68d-275212d909a5","Type":"ContainerDied","Data":"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69a16f95-9cdc-4bab-b68d-275212d909a5","Type":"ContainerDied","Data":"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69a16f95-9cdc-4bab-b68d-275212d909a5","Type":"ContainerDied","Data":"e0083d66a546921aa01d5b2122ee438a6dbc5523197d429a39befc2035537c69"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.332478 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.334804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-generated\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-combined-ca-bundle\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-default\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-galera-tls-certs\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kolla-config\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-operator-scripts\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6wx9\" (UniqueName: \"kubernetes.io/projected/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kube-api-access-d6wx9\") pod \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\" (UID: \"fa0ab18b-efe2-4553-8e4d-cf4f1468c460\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.335469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.336224 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.336239 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.337611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.338460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.339633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.341857 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_687b8d2a-f54f-4c5c-9d23-d5b0874682bb/ovsdbserver-sb/0.log" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.341909 4707 generic.go:334] "Generic (PLEG): container finished" podID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerID="8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106" exitCode=2 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.341927 4707 generic.go:334] "Generic (PLEG): container finished" podID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerID="3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7" exitCode=143 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.341992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"687b8d2a-f54f-4c5c-9d23-d5b0874682bb","Type":"ContainerDied","Data":"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.342019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"687b8d2a-f54f-4c5c-9d23-d5b0874682bb","Type":"ContainerDied","Data":"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.342029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"687b8d2a-f54f-4c5c-9d23-d5b0874682bb","Type":"ContainerDied","Data":"bdeed3f4c39e501ff3d5e2e1630d9a320fbe8f0e15a353040c4f122753d2f73e"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.342038 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.343990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" event={"ID":"f927e632-11ba-4d56-993a-f89357f885a4","Type":"ContainerStarted","Data":"e5dbc21ea96fe8af7c22e5a8b83b9ad5a23d88f49da9a7633ac37b17857d42d8"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.346967 4707 generic.go:334] "Generic (PLEG): container finished" podID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerID="56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e" exitCode=143 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.347113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f072c1c5-92fb-4b9c-86dd-6f34cffb4166","Type":"ContainerDied","Data":"56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.349632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" event={"ID":"b115ad1a-72c2-4d2e-8669-4d125acf6751","Type":"ContainerStarted","Data":"c64d5fccd6f9fe5912e6644f1e191a0c804f30b7b9271b2b9b4aaaefbe8774f5"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.349731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" event={"ID":"b115ad1a-72c2-4d2e-8669-4d125acf6751","Type":"ContainerStarted","Data":"ffc2e91804b972561f45233d7b51f3ca8db7af0e53ef85d4b155d0c761a635bb"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.363456 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerID="f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d" exitCode=143 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.363532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b4fbc761-3cab-4cee-b4f5-59a2306a3650","Type":"ContainerDied","Data":"f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.373018 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" containerID="398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d" exitCode=0 Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.373054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77","Type":"ContainerDied","Data":"398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.373074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77","Type":"ContainerDied","Data":"94d0410e107937fab682edebb91dd06a150f7b35a9c98b1c2bf2d2b57621e2ae"} Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.373078 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.373154 4707 scope.go:117] "RemoveContainer" containerID="5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.381460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kube-api-access-d6wx9" (OuterVolumeSpecName: "kube-api-access-d6wx9") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "kube-api-access-d6wx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.385003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.392078 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.397598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.404680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "69a16f95-9cdc-4bab-b68d-275212d909a5" (UID: "69a16f95-9cdc-4bab-b68d-275212d909a5"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.405251 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.409079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "687b8d2a-f54f-4c5c-9d23-d5b0874682bb" (UID: "687b8d2a-f54f-4c5c-9d23-d5b0874682bb"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.421743 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "fa0ab18b-efe2-4553-8e4d-cf4f1468c460" (UID: "fa0ab18b-efe2-4553-8e4d-cf4f1468c460"). 
InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.431072 4707 scope.go:117] "RemoveContainer" containerID="6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.431790 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e\": container with ID starting with 6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e not found: ID does not exist" containerID="6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.431837 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e"} err="failed to get container status \"6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e\": rpc error: code = NotFound desc = could not find container \"6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e\": container with ID starting with 6725ad66168c7188fb3e2a28c1e3c944dfd8375d48f0cd64a8574bffa9914c8e not found: ID does not exist" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.431856 4707 scope.go:117] "RemoveContainer" containerID="5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.437909 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.437975 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092\": container with ID starting with 5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092 not found: ID does not exist" containerID="5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438013 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092"} err="failed to get container status \"5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092\": rpc error: code = NotFound desc = could not find container \"5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092\": container with ID starting with 5dd73ff91eeb6b3de16de1b6a4e45c4f701ff0c1875aed795db89cb6b6d43092 not found: ID does not exist" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438040 4707 scope.go:117] "RemoveContainer" containerID="c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438880 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438901 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438912 4707 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6wx9\" (UniqueName: \"kubernetes.io/projected/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-kube-api-access-d6wx9\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438920 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438929 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438941 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/687b8d2a-f54f-4c5c-9d23-d5b0874682bb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438969 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438981 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438990 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a16f95-9cdc-4bab-b68d-275212d909a5-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.438999 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.439007 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ab18b-efe2-4553-8e4d-cf4f1468c460-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.449909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.476116 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.486344 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.486912 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="ovsdbserver-nb" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.486927 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="ovsdbserver-nb" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.486947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="ovsdbserver-sb" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.486952 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="ovsdbserver-sb" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.486966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="openstack-network-exporter" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.486972 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="openstack-network-exporter" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.486982 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerName="galera" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.486987 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerName="galera" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.486996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" containerName="memcached" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487002 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" containerName="memcached" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.487015 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerName="galera" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487020 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerName="galera" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.487032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerName="mysql-bootstrap" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487038 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerName="mysql-bootstrap" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.487052 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerName="mysql-bootstrap" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487057 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerName="mysql-bootstrap" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.487066 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="openstack-network-exporter" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487071 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="openstack-network-exporter" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487225 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" containerName="memcached" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487237 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="ovsdbserver-nb" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487249 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" containerName="galera" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487260 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" containerName="openstack-network-exporter" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487269 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="ovsdbserver-sb" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487276 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" containerName="galera" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487289 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" containerName="openstack-network-exporter" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.487841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.489585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-vp54k" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.489866 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.491747 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.493353 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.506822 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.517363 4707 scope.go:117] "RemoveContainer" containerID="337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwkkz\" (UniqueName: \"kubernetes.io/projected/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kube-api-access-jwkkz\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kolla-config\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-operator-scripts\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-galera-tls-certs\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540439 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-combined-ca-bundle\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-default\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.540587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-generated\") pod \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\" (UID: \"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7\") " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.541268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-kolla-config\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.541297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-config-data\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.541436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmcg\" (UniqueName: \"kubernetes.io/projected/97adda3a-b739-44a3-8119-516f049b2784-kube-api-access-9xmcg\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.541452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-combined-ca-bundle\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.541477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.541556 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 
16:11:05.544262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.544834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.544927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.545134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.547998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kube-api-access-jwkkz" (OuterVolumeSpecName: "kube-api-access-jwkkz") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "kube-api-access-jwkkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.560778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.563580 4707 scope.go:117] "RemoveContainer" containerID="c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.565958 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de\": container with ID starting with c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de not found: ID does not exist" containerID="c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.566062 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de"} err="failed to get container status \"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de\": rpc error: code = NotFound desc = could not find container \"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de\": container with ID starting with c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de not found: ID does not exist" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.566145 4707 scope.go:117] "RemoveContainer" containerID="337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.566660 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733\": container with ID starting with 337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733 not found: ID does not exist" containerID="337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.566691 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733"} err="failed to get container status \"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733\": rpc error: code = NotFound desc = could not find container \"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733\": container with ID starting with 337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733 not found: ID does not exist" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.566715 4707 scope.go:117] "RemoveContainer" containerID="c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.567199 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de"} err="failed to get container status \"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de\": rpc error: code = NotFound desc = could not find container \"c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de\": container with ID starting with c93a33bebe72c8bc6aabb89ddeda99150bd62c97e1583c9738504e33d47f66de not found: ID does not exist" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.567229 4707 scope.go:117] "RemoveContainer" containerID="337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.567713 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733"} err="failed to get container status \"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733\": rpc error: code = NotFound desc = could not find container \"337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733\": container with ID starting with 337d91f3f8089891897f18097678dacea486a1407ad8440c52b26f0e350b2733 not found: ID does not exist" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.567732 4707 scope.go:117] "RemoveContainer" containerID="8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.575208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.594716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" (UID: "3e173738-a3c5-4e4c-b681-6d1f3cbb90b7"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.611281 4707 scope.go:117] "RemoveContainer" containerID="3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-kolla-config\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-config-data\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmcg\" (UniqueName: \"kubernetes.io/projected/97adda3a-b739-44a3-8119-516f049b2784-kube-api-access-9xmcg\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-combined-ca-bundle\") pod 
\"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642907 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642919 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642929 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642947 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642959 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642968 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642976 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.642985 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwkkz\" (UniqueName: \"kubernetes.io/projected/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7-kube-api-access-jwkkz\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.653876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-config-data\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.654029 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: 
secret "cert-keystone-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.654114 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.654088783 +0000 UTC m=+4164.835605006 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.655058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-kolla-config\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.655664 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.655796 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.65577846 +0000 UTC m=+4164.837294682 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.656479 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.656558 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs podName:97adda3a-b739-44a3-8119-516f049b2784 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.156539802 +0000 UTC m=+4163.338056023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs") pod "memcached-0" (UID: "97adda3a-b739-44a3-8119-516f049b2784") : secret "cert-memcached-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.741610 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.756010 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.792562 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.807383 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.820887 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.824891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.825134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.827740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.829393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hdph9" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.829501 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.835138 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.845565 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.860033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.860087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.860713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-config\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc 
kubenswrapper[4707]: I0121 16:11:05.860820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.860954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pls4\" (UniqueName: \"kubernetes.io/projected/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-kube-api-access-4pls4\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.860997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " 
pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.861485 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.861524 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.861513112 +0000 UTC m=+4165.043029325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.861921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.861972 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.861998 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.861990471 +0000 UTC m=+4165.043506692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-internal-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.862033 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.862051 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.862045644 +0000 UTC m=+4165.043561866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.862117 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.862187 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.86214523 +0000 UTC m=+4165.043661453 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-public-svc" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.871567 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.873153 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.878260 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.879079 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.879216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-bhtdn" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.879246 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.882201 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.883673 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.886257 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.886417 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.886584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-tpgml" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.887265 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.896003 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.911165 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.963431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pls4\" (UniqueName: \"kubernetes.io/projected/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-kube-api-access-4pls4\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.963524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.964569 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.964678 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.464629652 +0000 UTC m=+4163.646145874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.964781 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964802 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: E0121 16:11:05.964829 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.464820061 +0000 UTC m=+4163.646336293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovn-metrics" not found Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.964938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-config\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.965757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-config\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:05 crc kubenswrapper[4707]: I0121 16:11:05.965762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.048292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-combined-ca-bundle\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.048317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmcg\" (UniqueName: 
\"kubernetes.io/projected/97adda3a-b739-44a3-8119-516f049b2784-kube-api-access-9xmcg\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.060969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pls4\" (UniqueName: \"kubernetes.io/projected/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-kube-api-access-4pls4\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.061024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.062246 4707 scope.go:117] "RemoveContainer" containerID="8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcqz\" (UniqueName: \"kubernetes.io/projected/a020fbaa-3cc2-46c6-84af-9f5f0323842a-kube-api-access-kqcqz\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.067327 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106\": container with ID starting with 8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106 not 
found: ID does not exist" containerID="8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067362 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106"} err="failed to get container status \"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106\": rpc error: code = NotFound desc = could not find container \"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106\": container with ID starting with 8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106 not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067387 4707 scope.go:117] "RemoveContainer" containerID="3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.067631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.067695 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.067735 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:08.067722807 +0000 UTC m=+4165.249239030 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-internal-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.068096 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.068224 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:08.068192311 +0000 UTC m=+4165.249708533 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.068369 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7\": container with ID starting with 3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7 not found: ID does not exist" containerID="3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068405 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7"} err="failed to get container status \"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7\": rpc error: code = NotFound desc = could not find container \"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7\": container with ID starting with 3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7 not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068427 4707 scope.go:117] "RemoveContainer" containerID="8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068628 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms5s5\" (UniqueName: \"kubernetes.io/projected/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kube-api-access-ms5s5\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068681 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-config\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.068844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.069055 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.069096 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:08.069081332 +0000 UTC m=+4165.250597554 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.069112 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106"} err="failed to get container status \"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106\": rpc error: code = NotFound desc = could not find container \"8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106\": container with ID starting with 8ca7e6a1063518118ce0415248b27b73372aeb6e4d6739270968424bd9971106 not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.069131 4707 scope.go:117] "RemoveContainer" containerID="3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.069545 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.069924 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7"} err="failed to get container status \"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7\": rpc error: code = NotFound desc = could not find container \"3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7\": container with ID starting with 3a600577c2ddbb9f00a10d227c6f5aba0bd26e59d6d1cf8c45f12f387b344ed7 not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.069953 4707 scope.go:117] "RemoveContainer" containerID="398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.102330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.172338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.172397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.172480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc 
kubenswrapper[4707]: I0121 16:11:06.172502 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.172942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.172982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms5s5\" (UniqueName: \"kubernetes.io/projected/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kube-api-access-ms5s5\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-config\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173271 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173323 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcqz\" (UniqueName: \"kubernetes.io/projected/a020fbaa-3cc2-46c6-84af-9f5f0323842a-kube-api-access-kqcqz\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.173857 4707 reconciler_common.go:293] "Volume detached for 
volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.174304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.174693 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.175045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.175097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-config\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.173678 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.175191 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs podName:23f0ac73-fdf4-4d9f-b00d-90906390fe70 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.675173908 +0000 UTC m=+4163.856690129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.174482 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.175296 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.675284846 +0000 UTC m=+4163.856801069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovn-metrics" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.174749 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.175326 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:06.675318649 +0000 UTC m=+4163.856834872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.175126 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.175357 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs podName:97adda3a-b739-44a3-8119-516f049b2784 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.175350549 +0000 UTC m=+4164.356866771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs") pod "memcached-0" (UID: "97adda3a-b739-44a3-8119-516f049b2784") : secret "cert-memcached-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.176510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.178536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.179016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.194653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcqz\" (UniqueName: \"kubernetes.io/projected/a020fbaa-3cc2-46c6-84af-9f5f0323842a-kube-api-access-kqcqz\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: 
I0121 16:11:06.198462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.201351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms5s5\" (UniqueName: \"kubernetes.io/projected/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kube-api-access-ms5s5\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.211794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.239902 4707 scope.go:117] "RemoveContainer" containerID="398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.247109 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.248195 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d\": container with ID starting with 398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d not found: ID does not exist" containerID="398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.248232 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d"} err="failed to get container status \"398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d\": rpc error: code = NotFound desc = could not find container \"398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d\": container with ID starting with 398fd69a2888a8f8041a315889555ecab01dbae6902afc7420e3542c7bfe2f3d not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.253734 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config\") pod \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-combined-ca-bundle\") pod \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config-secret\") pod \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-config-data\") pod \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-combined-ca-bundle\") pod \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjsqt\" (UniqueName: \"kubernetes.io/projected/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-kube-api-access-kjsqt\") pod \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\" (UID: \"b1e1e63f-0c08-41e3-a60d-cd45b774ddbe\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.377517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tchtl\" (UniqueName: \"kubernetes.io/projected/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-kube-api-access-tchtl\") pod \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\" (UID: \"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.390120 4707 generic.go:334] "Generic (PLEG): container finished" podID="b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" containerID="9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049" exitCode=137 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.390212 4707 scope.go:117] "RemoveContainer" containerID="9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.390303 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.397424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-kube-api-access-tchtl" (OuterVolumeSpecName: "kube-api-access-tchtl") pod "fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" (UID: "fb3c866f-fcb9-416f-aa0a-f34a7c9e5151"). InnerVolumeSpecName "kube-api-access-tchtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.399568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-kube-api-access-kjsqt" (OuterVolumeSpecName: "kube-api-access-kjsqt") pod "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" (UID: "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe"). InnerVolumeSpecName "kube-api-access-kjsqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.409714 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" containerID="6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a" exitCode=0 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.409992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151","Type":"ContainerDied","Data":"6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.410023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"fb3c866f-fcb9-416f-aa0a-f34a7c9e5151","Type":"ContainerDied","Data":"ceb6df4d206d8fe9f2969327f5ec824ca7c4afbfa0367adfd142e01b31aa6440"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.410071 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.423852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" (UID: "fb3c866f-fcb9-416f-aa0a-f34a7c9e5151"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.424546 4707 scope.go:117] "RemoveContainer" containerID="9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.425318 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049\": container with ID starting with 9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049 not found: ID does not exist" containerID="9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.425351 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049"} err="failed to get container status \"9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049\": rpc error: code = NotFound desc = could not find container \"9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049\": container with ID starting with 9b751921e64e7f6742586da1900fb998403d23481755038eed532758eb920049 not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.425398 4707 scope.go:117] "RemoveContainer" containerID="6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.426980 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.426993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"3e173738-a3c5-4e4c-b681-6d1f3cbb90b7","Type":"ContainerDied","Data":"2e4f3225c6d332cc9a63aff5921551afd29acccbb74f6a10d3fbe3a88a9a91b6"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.432108 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerID="b61d287158b2568660f871fafa7221b083307857176a321166ff5c5c0acca885" exitCode=0 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.432186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5a6deb2f-fd3d-494e-8d58-c6da07de2498","Type":"ContainerDied","Data":"b61d287158b2568660f871fafa7221b083307857176a321166ff5c5c0acca885"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.433518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-config-data" (OuterVolumeSpecName: "config-data") pod "fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" (UID: "fb3c866f-fcb9-416f-aa0a-f34a7c9e5151"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.437458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" event={"ID":"b115ad1a-72c2-4d2e-8669-4d125acf6751","Type":"ContainerStarted","Data":"d3f4f9fc5bc4ad2e024ecd33cb433c74190a4054f4d5fc7992c2d5cfaceae52f"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.448867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" event={"ID":"f927e632-11ba-4d56-993a-f89357f885a4","Type":"ContainerStarted","Data":"f9eec9f2777bc56f0de0809e130e7eb56b4ccf7f6d582d5054978a6f37712858"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.448982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" event={"ID":"f927e632-11ba-4d56-993a-f89357f885a4","Type":"ContainerStarted","Data":"55a8ca4d4d4f845958bddd165606fd0a9aa08d68148d5ce1f6d84578e370b654"} Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.460558 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" (UID: "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.469502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" (UID: "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.471546 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.473037 4707 scope.go:117] "RemoveContainer" containerID="6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.476041 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a\": container with ID starting with 6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a not found: ID does not exist" containerID="6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.476073 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a"} err="failed to get container status \"6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a\": rpc error: code = NotFound desc = could not find container \"6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a\": container with ID starting with 6b9f24cb9db09c81b19828e4dda6a799e5cdcc66438feaa5e7157f0e0104df9a not found: ID does not exist" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.476171 4707 scope.go:117] "RemoveContainer" containerID="48493be85c0a9d73f2fef3ed9d02b2356ce6d369c4ebbf5cf3e2300c4c9f60dc" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.476883 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" podStartSLOduration=3.476858473 podStartE2EDuration="3.476858473s" podCreationTimestamp="2026-01-21 16:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:06.458900417 +0000 UTC m=+4163.640416640" watchObservedRunningTime="2026-01-21 16:11:06.476858473 +0000 UTC m=+4163.658374696" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.483452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.483717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.483950 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.483970 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjsqt\" (UniqueName: \"kubernetes.io/projected/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-kube-api-access-kjsqt\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.483982 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tchtl\" (UniqueName: \"kubernetes.io/projected/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-kube-api-access-tchtl\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.483993 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.484002 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.484011 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.484489 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.484539 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.484522043 +0000 UTC m=+4164.666038265 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovn-metrics" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.485293 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.485343 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.485329342 +0000 UTC m=+4164.666845565 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.520824 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.521058 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener-log" containerID="cri-o://f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0" gracePeriod=30 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.521089 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" podStartSLOduration=3.521069153 podStartE2EDuration="3.521069153s" podCreationTimestamp="2026-01-21 16:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:06.481413729 +0000 UTC m=+4163.662929951" watchObservedRunningTime="2026-01-21 16:11:06.521069153 +0000 UTC m=+4163.702585375" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.521183 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener" containerID="cri-o://326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea" gracePeriod=30 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.522007 4707 scope.go:117] "RemoveContainer" containerID="aa7356dd24df7564e2d5f877b2a487343b9965b9b92961ba366738c209325070" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.527175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" (UID: "b1e1e63f-0c08-41e3-a60d-cd45b774ddbe"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.564903 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.585277 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-combined-ca-bundle\") pod \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.585719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8p42\" (UniqueName: \"kubernetes.io/projected/5a6deb2f-fd3d-494e-8d58-c6da07de2498-kube-api-access-q8p42\") pod \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.585844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6deb2f-fd3d-494e-8d58-c6da07de2498-etc-machine-id\") pod \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.585954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data\") pod \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.586300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-scripts\") pod \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.586402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data-custom\") pod \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\" (UID: \"5a6deb2f-fd3d-494e-8d58-c6da07de2498\") " Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.587191 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.587792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a6deb2f-fd3d-494e-8d58-c6da07de2498-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5a6deb2f-fd3d-494e-8d58-c6da07de2498" (UID: "5a6deb2f-fd3d-494e-8d58-c6da07de2498"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.589427 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.593293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a6deb2f-fd3d-494e-8d58-c6da07de2498" (UID: "5a6deb2f-fd3d-494e-8d58-c6da07de2498"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.593431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-scripts" (OuterVolumeSpecName: "scripts") pod "5a6deb2f-fd3d-494e-8d58-c6da07de2498" (UID: "5a6deb2f-fd3d-494e-8d58-c6da07de2498"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.593643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6deb2f-fd3d-494e-8d58-c6da07de2498-kube-api-access-q8p42" (OuterVolumeSpecName: "kube-api-access-q8p42") pod "5a6deb2f-fd3d-494e-8d58-c6da07de2498" (UID: "5a6deb2f-fd3d-494e-8d58-c6da07de2498"). InnerVolumeSpecName "kube-api-access-q8p42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.604969 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.605267 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker-log" containerID="cri-o://8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44" gracePeriod=30 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.605662 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker" containerID="cri-o://92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe" gracePeriod=30 Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.628018 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.628686 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="probe" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.629492 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="probe" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.629578 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="cinder-scheduler" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.629625 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="cinder-scheduler" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.629674 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" containerName="nova-cell0-conductor-conductor" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.629729 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" containerName="nova-cell0-conductor-conductor" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.630019 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" containerName="nova-cell0-conductor-conductor" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.630084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="probe" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.630131 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" containerName="cinder-scheduler" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.631230 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.633326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.633620 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.633862 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-s4nbl" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.634405 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.644412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a6deb2f-fd3d-494e-8d58-c6da07de2498" (UID: "5a6deb2f-fd3d-494e-8d58-c6da07de2498"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.646109 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.688675 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.688921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.688954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.689012 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.689023 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8p42\" (UniqueName: \"kubernetes.io/projected/5a6deb2f-fd3d-494e-8d58-c6da07de2498-kube-api-access-q8p42\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.689033 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6deb2f-fd3d-494e-8d58-c6da07de2498-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.689042 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.689050 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.689122 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.689169 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.689149103 +0000 UTC m=+4164.870665325 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.689327 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.689375 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.689364839 +0000 UTC m=+4164.870881061 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovn-metrics" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.689396 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.689420 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs podName:23f0ac73-fdf4-4d9f-b00d-90906390fe70 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.689413351 +0000 UTC m=+4164.870929572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.696063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data" (OuterVolumeSpecName: "config-data") pod "5a6deb2f-fd3d-494e-8d58-c6da07de2498" (UID: "5a6deb2f-fd3d-494e-8d58-c6da07de2498"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.790280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.790443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.790558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/449654c7-aa25-4489-b5b9-500f51719353-config-data-generated\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.790690 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tjp\" (UniqueName: \"kubernetes.io/projected/449654c7-aa25-4489-b5b9-500f51719353-kube-api-access-s7tjp\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.790783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.790887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-config-data-default\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.791086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-kolla-config\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.791201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-operator-scripts\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.791325 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6deb2f-fd3d-494e-8d58-c6da07de2498-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:06 crc 
kubenswrapper[4707]: I0121 16:11:06.857737 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.896959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-operator-scripts\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.897122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.897474 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.899477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.899579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/449654c7-aa25-4489-b5b9-500f51719353-config-data-generated\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.899735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tjp\" (UniqueName: \"kubernetes.io/projected/449654c7-aa25-4489-b5b9-500f51719353-kube-api-access-s7tjp\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.899797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.899872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-config-data-default\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.900085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.900971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-kolla-config\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.901069 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: E0121 16:11:06.901115 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs podName:449654c7-aa25-4489-b5b9-500f51719353 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:07.40110365 +0000 UTC m=+4164.582619873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs") pod "openstack-galera-0" (UID: "449654c7-aa25-4489-b5b9-500f51719353") : secret "cert-galera-openstack-svc" not found Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.901581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/449654c7-aa25-4489-b5b9-500f51719353-config-data-generated\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.909197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-config-data-default\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.909543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.909685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-operator-scripts\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.946422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tjp\" (UniqueName: \"kubernetes.io/projected/449654c7-aa25-4489-b5b9-500f51719353-kube-api-access-s7tjp\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.947484 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.954869 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.956045 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.963143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:11:06 crc kubenswrapper[4707]: I0121 16:11:06.971800 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.103517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.103850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.104200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8jb\" (UniqueName: \"kubernetes.io/projected/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-kube-api-access-rn8jb\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.190983 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e173738-a3c5-4e4c-b681-6d1f3cbb90b7" path="/var/lib/kubelet/pods/3e173738-a3c5-4e4c-b681-6d1f3cbb90b7/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.191665 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687b8d2a-f54f-4c5c-9d23-d5b0874682bb" path="/var/lib/kubelet/pods/687b8d2a-f54f-4c5c-9d23-d5b0874682bb/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.192777 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a16f95-9cdc-4bab-b68d-275212d909a5" path="/var/lib/kubelet/pods/69a16f95-9cdc-4bab-b68d-275212d909a5/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.193349 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77" path="/var/lib/kubelet/pods/7b3b7d0b-5e91-4ba8-bc0d-41c4a4f9cd77/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.193832 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e1e63f-0c08-41e3-a60d-cd45b774ddbe" path="/var/lib/kubelet/pods/b1e1e63f-0c08-41e3-a60d-cd45b774ddbe/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.194746 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0ab18b-efe2-4553-8e4d-cf4f1468c460" path="/var/lib/kubelet/pods/fa0ab18b-efe2-4553-8e4d-cf4f1468c460/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.195268 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3c866f-fcb9-416f-aa0a-f34a7c9e5151" 
path="/var/lib/kubelet/pods/fb3c866f-fcb9-416f-aa0a-f34a7c9e5151/volumes" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.205790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8jb\" (UniqueName: \"kubernetes.io/projected/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-kube-api-access-rn8jb\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.206024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.206170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.206326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.206201 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.206534 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs podName:97adda3a-b739-44a3-8119-516f049b2784 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.20651788 +0000 UTC m=+4166.388034102 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs") pod "memcached-0" (UID: "97adda3a-b739-44a3-8119-516f049b2784") : secret "cert-memcached-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.410657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.410824 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.410883 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs podName:449654c7-aa25-4489-b5b9-500f51719353 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:08.410867006 +0000 UTC m=+4165.592383238 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs") pod "openstack-galera-0" (UID: "449654c7-aa25-4489-b5b9-500f51719353") : secret "cert-galera-openstack-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.448361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.448397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.449653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.451289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8jb\" (UniqueName: \"kubernetes.io/projected/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-kube-api-access-rn8jb\") pod \"nova-cell0-conductor-0\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.458392 4707 generic.go:334] "Generic (PLEG): container finished" podID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerID="8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44" exitCode=143 Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.458470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" event={"ID":"1bc4c055-2df1-4d1f-b331-30020327ed16","Type":"ContainerDied","Data":"8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44"} Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.461199 4707 generic.go:334] "Generic (PLEG): container finished" podID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerID="f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0" exitCode=143 Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.461249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" event={"ID":"4e6fa0ac-e4ce-419d-a680-ce63883886a7","Type":"ContainerDied","Data":"f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0"} Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.465138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5a6deb2f-fd3d-494e-8d58-c6da07de2498","Type":"ContainerDied","Data":"a852b1ab4b43c2fdb82060e07cce70bd140df65d53843b20e0bb4bba7817b83e"} Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.465274 4707 scope.go:117] "RemoveContainer" containerID="6da93903072338fa09723fab8b4ae763bcee0d8308c91e3e562a7090cddaa6a2" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.465173 4707 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.512035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.512177 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.512210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.512221 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.512208188 +0000 UTC m=+4166.693724410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovn-metrics" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.512332 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.512388 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.512373428 +0000 UTC m=+4166.693889650 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.528079 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.529530 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.530719 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.530756 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="ovn-northd" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.583128 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.716199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.716437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.716474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.716517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.716566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.716829 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.716899 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.716882485 +0000 UTC m=+4168.898398708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.716934 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.716981 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs podName:23f0ac73-fdf4-4d9f-b00d-90906390fe70 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.716968007 +0000 UTC m=+4166.898484229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.717023 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.717042 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.717035945 +0000 UTC m=+4166.898552166 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.717069 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.717132 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.717212 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.71719823 +0000 UTC m=+4166.898714452 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovn-metrics" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.717465 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.717450975 +0000 UTC m=+4168.898967197 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.749090 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.171:8776/healthcheck\": dial tcp 10.217.1.171:8776: connect: connection refused" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.752117 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.756297 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.773215 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.775518 4707 scope.go:117] "RemoveContainer" containerID="b61d287158b2568660f871fafa7221b083307857176a321166ff5c5c0acca885" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.817868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-logs\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-scripts\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-combined-ca-bundle\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-public-tls-certs\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-config-data\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-httpd-run\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.818416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75kzc\" (UniqueName: \"kubernetes.io/projected/d8bdcff4-af97-483e-80bb-21b36c870c0b-kube-api-access-75kzc\") pod \"d8bdcff4-af97-483e-80bb-21b36c870c0b\" (UID: \"d8bdcff4-af97-483e-80bb-21b36c870c0b\") " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.819608 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:11:07 crc 
kubenswrapper[4707]: E0121 16:11:07.819995 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-log" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.820013 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-log" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.820030 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-httpd" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.820047 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-httpd" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.820243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-httpd" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.820271 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerName="glance-log" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.820886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-logs" (OuterVolumeSpecName: "logs") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.821136 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.824051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bdcff4-af97-483e-80bb-21b36c870c0b-kube-api-access-75kzc" (OuterVolumeSpecName: "kube-api-access-75kzc") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "kube-api-access-75kzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.824330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.824572 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.830978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.833737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-scripts" (OuterVolumeSpecName: "scripts") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.853981 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.892667 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.920774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.920839 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-scripts\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.920870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.920924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.920978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmszn\" (UniqueName: \"kubernetes.io/projected/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-kube-api-access-qmszn\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921040 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921054 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921097 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.921080699 +0000 UTC m=+4169.102596922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-internal-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921133 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921227 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921238 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921276 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.921262741 +0000 UTC m=+4169.102778964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-public-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921296 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921300 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75kzc\" (UniqueName: \"kubernetes.io/projected/d8bdcff4-af97-483e-80bb-21b36c870c0b-kube-api-access-75kzc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921331 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.921319277 +0000 UTC m=+4169.102835500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921351 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921364 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8bdcff4-af97-483e-80bb-21b36c870c0b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.921372 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921880 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: E0121 16:11:07.921970 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.921952349 +0000 UTC m=+4169.103468571 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-public-svc" not found Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.937119 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.951745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.957229 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.967521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-config-data" (OuterVolumeSpecName: "config-data") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:07 crc kubenswrapper[4707]: I0121 16:11:07.969382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d8bdcff4-af97-483e-80bb-21b36c870c0b" (UID: "d8bdcff4-af97-483e-80bb-21b36c870c0b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.022840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vpt8\" (UniqueName: \"kubernetes.io/projected/dd534564-de96-44c7-9026-a55d66da26ef-kube-api-access-7vpt8\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.022920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-httpd-run\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.022951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-internal-tls-certs\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4fbc761-3cab-4cee-b4f5-59a2306a3650-logs\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-combined-ca-bundle\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-logs\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-scripts\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023091 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-internal-tls-certs\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8nhg\" (UniqueName: \"kubernetes.io/projected/b4fbc761-3cab-4cee-b4f5-59a2306a3650-kube-api-access-t8nhg\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc 
kubenswrapper[4707]: I0121 16:11:08.023197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-scripts\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data-custom\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4fbc761-3cab-4cee-b4f5-59a2306a3650-etc-machine-id\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-public-tls-certs\") pod \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\" (UID: \"b4fbc761-3cab-4cee-b4f5-59a2306a3650\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-combined-ca-bundle\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-config-data\") pod \"dd534564-de96-44c7-9026-a55d66da26ef\" (UID: \"dd534564-de96-44c7-9026-a55d66da26ef\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.023670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4fbc761-3cab-4cee-b4f5-59a2306a3650-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.024121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.024187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-scripts\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.024592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.026512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmszn\" (UniqueName: \"kubernetes.io/projected/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-kube-api-access-qmszn\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.026584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.026612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd534564-de96-44c7-9026-a55d66da26ef-kube-api-access-7vpt8" (OuterVolumeSpecName: "kube-api-access-7vpt8") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "kube-api-access-7vpt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.026701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.026837 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vpt8\" (UniqueName: \"kubernetes.io/projected/dd534564-de96-44c7-9026-a55d66da26ef-kube-api-access-7vpt8\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.027544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4fbc761-3cab-4cee-b4f5-59a2306a3650-kube-api-access-t8nhg" (OuterVolumeSpecName: "kube-api-access-t8nhg") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "kube-api-access-t8nhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.026856 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.027787 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.027802 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.027826 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4fbc761-3cab-4cee-b4f5-59a2306a3650-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.027835 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.027845 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8bdcff4-af97-483e-80bb-21b36c870c0b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.028636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-scripts" (OuterVolumeSpecName: "scripts") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.029782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-logs" (OuterVolumeSpecName: "logs") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.029874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.029920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-scripts" (OuterVolumeSpecName: "scripts") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.030182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4fbc761-3cab-4cee-b4f5-59a2306a3650-logs" (OuterVolumeSpecName: "logs") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.030237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.033370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.033448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.034932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.039347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.039547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-scripts\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.048201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmszn\" (UniqueName: \"kubernetes.io/projected/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-kube-api-access-qmszn\") pod \"cinder-scheduler-0\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.054701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.056534 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.075100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.077340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.083653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data" (OuterVolumeSpecName: "config-data") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.089979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-config-data" (OuterVolumeSpecName: "config-data") pod "dd534564-de96-44c7-9026-a55d66da26ef" (UID: "dd534564-de96-44c7-9026-a55d66da26ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.102087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b4fbc761-3cab-4cee-b4f5-59a2306a3650" (UID: "b4fbc761-3cab-4cee-b4f5-59a2306a3650"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.135800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.135952 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.136013 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. 
No retries permitted until 2026-01-21 16:11:12.135998769 +0000 UTC m=+4169.317514991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-ovndbs" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.136228 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.136278 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.13626508 +0000 UTC m=+4169.317781302 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-internal-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136297 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136310 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4fbc761-3cab-4cee-b4f5-59a2306a3650-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.136312 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136319 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136329 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd534564-de96-44c7-9026-a55d66da26ef-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.136335 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. 
No retries permitted until 2026-01-21 16:11:12.136328969 +0000 UTC m=+4169.317845192 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136350 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136360 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136385 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136397 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8nhg\" (UniqueName: \"kubernetes.io/projected/b4fbc761-3cab-4cee-b4f5-59a2306a3650-kube-api-access-t8nhg\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136406 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136415 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136423 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136432 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136440 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4fbc761-3cab-4cee-b4f5-59a2306a3650-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.136448 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd534564-de96-44c7-9026-a55d66da26ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.151371 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.188378 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.225075 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: W0121 16:11:08.235073 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd914cc_fc30_48bb_b6bb_0fa2f8bbbb56.slice/crio-42518b90ebd1540df899bb3b30c8c78594e131cd0f489c1d21293ba12545ed6d WatchSource:0}: Error finding container 42518b90ebd1540df899bb3b30c8c78594e131cd0f489c1d21293ba12545ed6d: Status 404 returned error can't find the container with id 42518b90ebd1540df899bb3b30c8c78594e131cd0f489c1d21293ba12545ed6d Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.237647 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.394822 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.441692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.441957 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.442289 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs podName:449654c7-aa25-4489-b5b9-500f51719353 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.442262265 +0000 UTC m=+4167.623778486 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs") pod "openstack-galera-0" (UID: "449654c7-aa25-4489-b5b9-500f51719353") : secret "cert-galera-openstack-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.475707 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerID="fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa" exitCode=0 Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.475758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b4fbc761-3cab-4cee-b4f5-59a2306a3650","Type":"ContainerDied","Data":"fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.475782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b4fbc761-3cab-4cee-b4f5-59a2306a3650","Type":"ContainerDied","Data":"72cb3a1e0a541104f4491c324ff2e8d835f65bd589656fd31f4d4ffea57b8749"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.475798 4707 scope.go:117] "RemoveContainer" containerID="fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.475938 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.482069 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8bdcff4-af97-483e-80bb-21b36c870c0b" containerID="7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a" exitCode=0 Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.482154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8bdcff4-af97-483e-80bb-21b36c870c0b","Type":"ContainerDied","Data":"7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.482193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d8bdcff4-af97-483e-80bb-21b36c870c0b","Type":"ContainerDied","Data":"e6480f04a38d5ddb2b8ba72eab165d69da1db7ca61b3918ceccae1298d8eb670"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.482311 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.491570 4707 generic.go:334] "Generic (PLEG): container finished" podID="dd534564-de96-44c7-9026-a55d66da26ef" containerID="240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60" exitCode=0 Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.491633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dd534564-de96-44c7-9026-a55d66da26ef","Type":"ContainerDied","Data":"240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.491655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dd534564-de96-44c7-9026-a55d66da26ef","Type":"ContainerDied","Data":"5c1fd8a000f323e786259549c867c5f97ba05b1eff8e3a671c2199832f2a107b"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.491679 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.493986 4707 generic.go:334] "Generic (PLEG): container finished" podID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerID="3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40" exitCode=0 Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.494188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f072c1c5-92fb-4b9c-86dd-6f34cffb4166","Type":"ContainerDied","Data":"3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.494323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f072c1c5-92fb-4b9c-86dd-6f34cffb4166","Type":"ContainerDied","Data":"790d957d81693df9bfc0deb8ac0aa00fdda85b4d65813b91db8458d52703bb4b"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.494391 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.497254 4707 generic.go:334] "Generic (PLEG): container finished" podID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerID="72ebb7a26f99df0fa91b1dd0b938c53301a324860220aedf1fc9ada4708a5afc" exitCode=0 Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.497282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a","Type":"ContainerDied","Data":"72ebb7a26f99df0fa91b1dd0b938c53301a324860220aedf1fc9ada4708a5afc"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.499251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerStarted","Data":"cfbed99679c4ef966f0c2a8850897c643850ca266da8c7489dc2e08a36f36c46"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.499281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerStarted","Data":"42518b90ebd1540df899bb3b30c8c78594e131cd0f489c1d21293ba12545ed6d"} Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.499432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.504591 4707 scope.go:117] "RemoveContainer" containerID="f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.542793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xv59\" (UniqueName: \"kubernetes.io/projected/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-kube-api-access-7xv59\") pod \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.542881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-public-tls-certs\") pod \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.542997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-combined-ca-bundle\") pod \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.543035 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-internal-tls-certs\") pod \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.543056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-logs\") pod \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.543104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-config-data\") pod \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\" (UID: \"f072c1c5-92fb-4b9c-86dd-6f34cffb4166\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.545766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.560516 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.560716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-logs" (OuterVolumeSpecName: "logs") pod "f072c1c5-92fb-4b9c-86dd-6f34cffb4166" (UID: "f072c1c5-92fb-4b9c-86dd-6f34cffb4166"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.561941 4707 scope.go:117] "RemoveContainer" containerID="fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.571573 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa\": container with ID starting with fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa not found: ID does not exist" containerID="fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.571749 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa"} err="failed to get container status \"fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa\": rpc error: code = NotFound desc = could not find container \"fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa\": container with ID starting with fcc099a44f46c51f7bfce440a585c39cc9cc907fcec1a8712300abb8c979f3fa not found: ID does not exist" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.571879 4707 scope.go:117] "RemoveContainer" containerID="f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.578993 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d\": container with ID starting with f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d not found: ID does not exist" containerID="f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.579108 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d"} err="failed to get container status \"f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d\": rpc error: code = NotFound desc = could not find container \"f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d\": container with ID starting with f5ec9d2438e5a1ac09614fbf4c628e48edd0588e8238887714d530ab5ba21d1d not found: ID does not exist" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.579198 4707 scope.go:117] "RemoveContainer" 
containerID="7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.607860 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.609044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-kube-api-access-7xv59" (OuterVolumeSpecName: "kube-api-access-7xv59") pod "f072c1c5-92fb-4b9c-86dd-6f34cffb4166" (UID: "f072c1c5-92fb-4b9c-86dd-6f34cffb4166"). InnerVolumeSpecName "kube-api-access-7xv59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.626367 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635027 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.635633 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635659 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-log" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.635679 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635687 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api-log" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.635703 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-api" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-api" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.635723 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-httpd" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635729 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-httpd" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.635745 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635753 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-log" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.635771 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635778 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635969 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635983 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" containerName="nova-api-api" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.635994 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.636007 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" containerName="cinder-api-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.636017 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.636029 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd534564-de96-44c7-9026-a55d66da26ef" containerName="glance-httpd" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.636561 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.636550023 podStartE2EDuration="2.636550023s" podCreationTimestamp="2026-01-21 16:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:08.53082519 +0000 UTC m=+4165.712341411" watchObservedRunningTime="2026-01-21 16:11:08.636550023 +0000 UTC m=+4165.818066245" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.637022 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.639213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f072c1c5-92fb-4b9c-86dd-6f34cffb4166" (UID: "f072c1c5-92fb-4b9c-86dd-6f34cffb4166"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.639292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-config-data" (OuterVolumeSpecName: "config-data") pod "f072c1c5-92fb-4b9c-86dd-6f34cffb4166" (UID: "f072c1c5-92fb-4b9c-86dd-6f34cffb4166"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.641318 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.641372 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.641442 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.645308 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.645334 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.645345 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xv59\" (UniqueName: \"kubernetes.io/projected/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-kube-api-access-7xv59\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.645357 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.648390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.649620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f072c1c5-92fb-4b9c-86dd-6f34cffb4166" (UID: "f072c1c5-92fb-4b9c-86dd-6f34cffb4166"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.653876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.655307 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.656873 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.657246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.657336 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-97m6r" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.664689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.668815 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.672422 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.674698 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.687291 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.688887 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.690223 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.690672 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.696425 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.697402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f072c1c5-92fb-4b9c-86dd-6f34cffb4166" (UID: "f072c1c5-92fb-4b9c-86dd-6f34cffb4166"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.704302 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-scripts\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdjj\" (UniqueName: \"kubernetes.io/projected/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-kube-api-access-xkdjj\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data-custom\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-logs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747841 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rpqm\" (UniqueName: \"kubernetes.io/projected/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-kube-api-access-2rpqm\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-logs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.747980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.748011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.748055 4707 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.748067 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f072c1c5-92fb-4b9c-86dd-6f34cffb4166-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.814635 4707 scope.go:117] "RemoveContainer" containerID="117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.827591 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.839276 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdjj\" (UniqueName: \"kubernetes.io/projected/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-kube-api-access-xkdjj\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cr4\" (UniqueName: \"kubernetes.io/projected/45335228-5b76-4f29-a370-98578b0dfd62-kube-api-access-m7cr4\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-logs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data-custom\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854324 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-logs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rpqm\" (UniqueName: \"kubernetes.io/projected/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-kube-api-access-2rpqm\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-logs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.854790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.857726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-scripts\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.863438 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.863851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-scripts\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.863950 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.864015 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.363996571 +0000 UTC m=+4166.545512792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-internal-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.864690 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.868376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-logs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.868785 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.868865 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.368843834 +0000 UTC m=+4166.550360047 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-public-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.873665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data-custom\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.875908 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.875992 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.375974783 +0000 UTC m=+4166.557491005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : secret "cert-glance-default-public-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.876652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-logs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.876723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.877916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.883642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.886762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rpqm\" (UniqueName: \"kubernetes.io/projected/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-kube-api-access-2rpqm\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.888519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.889642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.889892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.892173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdjj\" (UniqueName: \"kubernetes.io/projected/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-kube-api-access-xkdjj\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.901769 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.903903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.916735 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.917223 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.917245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.917274 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.917280 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.917489 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.917501 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.918424 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.920523 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.920541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.920964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.937492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.959502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-nova-metadata-tls-certs\") pod \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.959606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvcsv\" (UniqueName: \"kubernetes.io/projected/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-kube-api-access-jvcsv\") pod \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.959859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-combined-ca-bundle\") pod \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.959939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-config-data\") pod \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.960027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-logs\") pod \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\" (UID: \"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a\") " Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.960884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cr4\" (UniqueName: \"kubernetes.io/projected/45335228-5b76-4f29-a370-98578b0dfd62-kube-api-access-m7cr4\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.960935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-logs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.960974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.961108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.961447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.961515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.961600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.961802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.962803 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: E0121 16:11:08.962884 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs podName:45335228-5b76-4f29-a370-98578b0dfd62 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.462865583 +0000 UTC m=+4166.644381805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "45335228-5b76-4f29-a370-98578b0dfd62") : secret "cert-glance-default-internal-svc" not found Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.964878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-logs" (OuterVolumeSpecName: "logs") pod "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" (UID: "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.966155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-logs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.966504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.967338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.967890 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:08 crc kubenswrapper[4707]: I0121 16:11:08.989686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.005090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.010116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cr4\" (UniqueName: \"kubernetes.io/projected/45335228-5b76-4f29-a370-98578b0dfd62-kube-api-access-m7cr4\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.025149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" (UID: "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.027985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-config-data" (OuterVolumeSpecName: "config-data") pod "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" (UID: "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.028004 4707 scope.go:117] "RemoveContainer" containerID="7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.028625 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a\": container with ID starting with 7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a not found: ID does not exist" containerID="7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.028696 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a"} err="failed to get container status \"7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a\": rpc error: code = NotFound desc = could not find container \"7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a\": container with ID starting with 7725dc32fab8ad72aec37aa875559726c454833255811eb191840199ed06796a not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.028726 4707 scope.go:117] "RemoveContainer" containerID="117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.029400 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0\": container with ID starting with 117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0 not found: ID does not exist" containerID="117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.029429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0"} err="failed to get container status \"117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0\": rpc error: code = NotFound desc = could not find container \"117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0\": container with ID starting with 117df19445336d65dcc83c2d3278242703441e3ffc765f05049a9685791c23b0 not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.029447 4707 scope.go:117] "RemoveContainer" containerID="240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.032435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-kube-api-access-jvcsv" (OuterVolumeSpecName: "kube-api-access-jvcsv") pod "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" (UID: "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a"). InnerVolumeSpecName "kube-api-access-jvcsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.039988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.054309 4707 scope.go:117] "RemoveContainer" containerID="c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.064734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms87k\" (UniqueName: \"kubernetes.io/projected/97c00bba-c016-4c12-9df2-819557bb8d8f-kube-api-access-ms87k\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.064856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.064888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-config-data\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.064957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.065012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c00bba-c016-4c12-9df2-819557bb8d8f-logs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.067150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" (UID: "ef9ba957-bcd9-4e63-bbf2-4e2398ac259a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.067800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.068443 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.068462 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.068497 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.068517 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.068526 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvcsv\" (UniqueName: \"kubernetes.io/projected/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a-kube-api-access-jvcsv\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.073215 4707 scope.go:117] "RemoveContainer" containerID="240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.074021 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60\": container with ID starting with 240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60 not found: ID does not exist" containerID="240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.074108 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60"} err="failed to get container status \"240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60\": rpc error: code = NotFound desc = could not find container \"240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60\": container with ID starting with 240ca7476680b8751aa6c9151326b13497f603e01f7ca2852d7b63c14bed2c60 not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.074150 4707 scope.go:117] "RemoveContainer" containerID="c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.074873 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5\": container with ID starting with 
c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5 not found: ID does not exist" containerID="c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.074902 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5"} err="failed to get container status \"c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5\": rpc error: code = NotFound desc = could not find container \"c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5\": container with ID starting with c7ec5424945b8ccf1ab5a1733b718bd5adfcfb04cfdb5e0491f86f02142258a5 not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.074944 4707 scope.go:117] "RemoveContainer" containerID="3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.105977 4707 scope.go:117] "RemoveContainer" containerID="56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.145406 4707 scope.go:117] "RemoveContainer" containerID="3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.146629 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40\": container with ID starting with 3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40 not found: ID does not exist" containerID="3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.146687 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40"} err="failed to get container status \"3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40\": rpc error: code = NotFound desc = could not find container \"3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40\": container with ID starting with 3dd17a88796babd9bd167bfd0e50e377a2b32f595a08de99bc3672da1fe18c40 not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.146719 4707 scope.go:117] "RemoveContainer" containerID="56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.147190 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e\": container with ID starting with 56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e not found: ID does not exist" containerID="56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.147225 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e"} err="failed to get container status \"56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e\": rpc error: code = NotFound desc = could not find container \"56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e\": container with ID starting with 
56b51fb32f3881be512ce59105f3db7bcba647c96f1e6f24276b8ca85e1a762e not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.157854 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.170072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms87k\" (UniqueName: \"kubernetes.io/projected/97c00bba-c016-4c12-9df2-819557bb8d8f-kube-api-access-ms87k\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.170136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.170176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-config-data\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.170218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.170251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c00bba-c016-4c12-9df2-819557bb8d8f-logs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.171181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.171685 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.171736 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.671721264 +0000 UTC m=+4166.853237485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.181946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.182005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c00bba-c016-4c12-9df2-819557bb8d8f-logs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.182056 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.182093 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.682081283 +0000 UTC m=+4166.863597506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.188771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-config-data\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.190841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms87k\" (UniqueName: \"kubernetes.io/projected/97c00bba-c016-4c12-9df2-819557bb8d8f-kube-api-access-ms87k\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.198465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6deb2f-fd3d-494e-8d58-c6da07de2498" path="/var/lib/kubelet/pods/5a6deb2f-fd3d-494e-8d58-c6da07de2498/volumes" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.199243 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4fbc761-3cab-4cee-b4f5-59a2306a3650" path="/var/lib/kubelet/pods/b4fbc761-3cab-4cee-b4f5-59a2306a3650/volumes" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.199897 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bdcff4-af97-483e-80bb-21b36c870c0b" path="/var/lib/kubelet/pods/d8bdcff4-af97-483e-80bb-21b36c870c0b/volumes" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.201228 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd534564-de96-44c7-9026-a55d66da26ef" path="/var/lib/kubelet/pods/dd534564-de96-44c7-9026-a55d66da26ef/volumes" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 
16:11:09.202131 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f072c1c5-92fb-4b9c-86dd-6f34cffb4166" path="/var/lib/kubelet/pods/f072c1c5-92fb-4b9c-86dd-6f34cffb4166/volumes" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.273705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data\") pod \"43b25315-bece-4c48-bd94-3132650e2b14\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.273799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-combined-ca-bundle\") pod \"43b25315-bece-4c48-bd94-3132650e2b14\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.273922 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z6cm\" (UniqueName: \"kubernetes.io/projected/43b25315-bece-4c48-bd94-3132650e2b14-kube-api-access-5z6cm\") pod \"43b25315-bece-4c48-bd94-3132650e2b14\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.274525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.276317 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.276391 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs podName:97adda3a-b739-44a3-8119-516f049b2784 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:13.276373321 +0000 UTC m=+4170.457889542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs") pod "memcached-0" (UID: "97adda3a-b739-44a3-8119-516f049b2784") : secret "cert-memcached-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.279427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b25315-bece-4c48-bd94-3132650e2b14-kube-api-access-5z6cm" (OuterVolumeSpecName: "kube-api-access-5z6cm") pod "43b25315-bece-4c48-bd94-3132650e2b14" (UID: "43b25315-bece-4c48-bd94-3132650e2b14"). InnerVolumeSpecName "kube-api-access-5z6cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.301465 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data podName:43b25315-bece-4c48-bd94-3132650e2b14 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:09.801441365 +0000 UTC m=+4166.982957588 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data") pod "43b25315-bece-4c48-bd94-3132650e2b14" (UID: "43b25315-bece-4c48-bd94-3132650e2b14") : error deleting /var/lib/kubelet/pods/43b25315-bece-4c48-bd94-3132650e2b14/volume-subpaths: remove /var/lib/kubelet/pods/43b25315-bece-4c48-bd94-3132650e2b14/volume-subpaths: no such file or directory Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.304849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43b25315-bece-4c48-bd94-3132650e2b14" (UID: "43b25315-bece-4c48-bd94-3132650e2b14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.376699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.376799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.376861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.377055 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.377073 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z6cm\" (UniqueName: \"kubernetes.io/projected/43b25315-bece-4c48-bd94-3132650e2b14-kube-api-access-5z6cm\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.377171 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.377214 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.377218 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.377204794 +0000 UTC m=+4167.558721016 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.377269 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.377254146 +0000 UTC m=+4167.558770369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.377174 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.377293 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.377286798 +0000 UTC m=+4167.558803009 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : secret "cert-glance-default-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.468923 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.478378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.478533 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.478610 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs podName:45335228-5b76-4f29-a370-98578b0dfd62 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.478593744 +0000 UTC m=+4167.660109967 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "45335228-5b76-4f29-a370-98578b0dfd62") : secret "cert-glance-default-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.514972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2f6304c7-b127-4fcb-a74b-b5009a2b1a50","Type":"ContainerStarted","Data":"22c7df73a4078571e27ee73b13d827243c425325781c88d7bd9adb9efeb6f37a"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.515029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2f6304c7-b127-4fcb-a74b-b5009a2b1a50","Type":"ContainerStarted","Data":"4368e0e23f4023ca7c8ec88a8592985a35fe982982f1cbc296355f1a66ebe007"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.518687 4707 generic.go:334] "Generic (PLEG): container finished" podID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerID="326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea" exitCode=0 Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.518726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" event={"ID":"4e6fa0ac-e4ce-419d-a680-ce63883886a7","Type":"ContainerDied","Data":"326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.518774 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.518795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w" event={"ID":"4e6fa0ac-e4ce-419d-a680-ce63883886a7","Type":"ContainerDied","Data":"210cb3f58c8c9c6f1aa1eaf7bbf08a773fac3b0cd7d8fbd8cb8b964e69e76872"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.518833 4707 scope.go:117] "RemoveContainer" containerID="326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.538319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ef9ba957-bcd9-4e63-bbf2-4e2398ac259a","Type":"ContainerDied","Data":"b9e71e7fcf09edfb909804e6bf87a12e584d6bd34652be5e2af98fa85d658c69"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.538404 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.541016 4707 generic.go:334] "Generic (PLEG): container finished" podID="43b25315-bece-4c48-bd94-3132650e2b14" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" exitCode=0 Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.541064 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"43b25315-bece-4c48-bd94-3132650e2b14","Type":"ContainerDied","Data":"5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.541084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"43b25315-bece-4c48-bd94-3132650e2b14","Type":"ContainerDied","Data":"505a7b08eead2cf1c6175f645a0b4eea548123018a8032e6282223933b6932c3"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.541115 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.549499 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_bd6d7315-b24c-411b-acc8-bda614951af2/ovn-northd/0.log" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.549533 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd6d7315-b24c-411b-acc8-bda614951af2" containerID="f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7" exitCode=139 Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.551135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"bd6d7315-b24c-411b-acc8-bda614951af2","Type":"ContainerDied","Data":"f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7"} Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.562715 4707 scope.go:117] "RemoveContainer" containerID="f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.568031 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.577759 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.582884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data\") pod \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.582979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data-custom\") pod \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.583088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-combined-ca-bundle\") pod \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.583179 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7xvj\" (UniqueName: \"kubernetes.io/projected/4e6fa0ac-e4ce-419d-a680-ce63883886a7-kube-api-access-k7xvj\") pod \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.583269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6fa0ac-e4ce-419d-a680-ce63883886a7-logs\") pod \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\" (UID: \"4e6fa0ac-e4ce-419d-a680-ce63883886a7\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.584967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.585150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.585453 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.585545 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:13.585533093 +0000 UTC m=+4170.767049315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovn-metrics" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.587869 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.588143 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.588208 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:13.588191401 +0000 UTC m=+4170.769707624 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.588321 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b25315-bece-4c48-bd94-3132650e2b14" containerName="nova-scheduler-scheduler" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.588338 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b25315-bece-4c48-bd94-3132650e2b14" containerName="nova-scheduler-scheduler" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.588352 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.588359 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.588369 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener-log" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.588375 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener-log" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.588543 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.588556 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b25315-bece-4c48-bd94-3132650e2b14" containerName="nova-scheduler-scheduler" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.588568 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" containerName="barbican-keystone-listener-log" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.589476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.589918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e6fa0ac-e4ce-419d-a680-ce63883886a7-logs" (OuterVolumeSpecName: "logs") pod "4e6fa0ac-e4ce-419d-a680-ce63883886a7" (UID: "4e6fa0ac-e4ce-419d-a680-ce63883886a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.592563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4e6fa0ac-e4ce-419d-a680-ce63883886a7" (UID: "4e6fa0ac-e4ce-419d-a680-ce63883886a7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.593649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6fa0ac-e4ce-419d-a680-ce63883886a7-kube-api-access-k7xvj" (OuterVolumeSpecName: "kube-api-access-k7xvj") pod "4e6fa0ac-e4ce-419d-a680-ce63883886a7" (UID: "4e6fa0ac-e4ce-419d-a680-ce63883886a7"). InnerVolumeSpecName "kube-api-access-k7xvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.593877 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.594418 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.596931 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.601151 4707 scope.go:117] "RemoveContainer" containerID="326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.603514 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea\": container with ID starting with 326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea not found: ID does not exist" containerID="326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.603544 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea"} err="failed to get container status \"326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea\": rpc error: code = NotFound desc = could not find container \"326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea\": container with ID starting with 326e8f179ff33046c476e36a19a4c9fa87c081ce22d0e12d1e455414a15165ea not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.603562 4707 scope.go:117] "RemoveContainer" containerID="f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.604307 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0\": container with ID starting with f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0 not found: ID does not exist" containerID="f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.604355 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0"} err="failed to get container status \"f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0\": rpc error: code = NotFound desc = could not find container \"f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0\": container with ID starting with f2888a8d55bd9e8d80441ffc9c92b633e6e4422f3cfd17626976a5111185eda0 not found: ID does not exist" Jan 21 16:11:09 crc 
kubenswrapper[4707]: I0121 16:11:09.604371 4707 scope.go:117] "RemoveContainer" containerID="72ebb7a26f99df0fa91b1dd0b938c53301a324860220aedf1fc9ada4708a5afc" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.614421 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_bd6d7315-b24c-411b-acc8-bda614951af2/ovn-northd/0.log" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.614489 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.620250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e6fa0ac-e4ce-419d-a680-ce63883886a7" (UID: "4e6fa0ac-e4ce-419d-a680-ce63883886a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.641399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data" (OuterVolumeSpecName: "config-data") pod "4e6fa0ac-e4ce-419d-a680-ce63883886a7" (UID: "4e6fa0ac-e4ce-419d-a680-ce63883886a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.654042 4707 scope.go:117] "RemoveContainer" containerID="899a26b8e50ba33fca57b72d44974297421aa331badc78e13d0e83c15ab03b22" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-rundir\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-scripts\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-northd-tls-certs\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-config\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmslp\" (UniqueName: \"kubernetes.io/projected/bd6d7315-b24c-411b-acc8-bda614951af2-kube-api-access-jmslp\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-combined-ca-bundle\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-metrics-certs-tls-certs\") pod \"bd6d7315-b24c-411b-acc8-bda614951af2\" (UID: \"bd6d7315-b24c-411b-acc8-bda614951af2\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-logs\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44g5\" (UniqueName: \"kubernetes.io/projected/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-kube-api-access-d44g5\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.688917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-config-data\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.689029 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc 
kubenswrapper[4707]: I0121 16:11:09.689039 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.689049 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fa0ac-e4ce-419d-a680-ce63883886a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.689058 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7xvj\" (UniqueName: \"kubernetes.io/projected/4e6fa0ac-e4ce-419d-a680-ce63883886a7-kube-api-access-k7xvj\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.689066 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6fa0ac-e4ce-419d-a680-ce63883886a7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.689716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.690180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-scripts" (OuterVolumeSpecName: "scripts") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.690800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-config" (OuterVolumeSpecName: "config") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.694195 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.694247 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.694233272 +0000 UTC m=+4167.875749494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-internal-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.697072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6d7315-b24c-411b-acc8-bda614951af2-kube-api-access-jmslp" (OuterVolumeSpecName: "kube-api-access-jmslp") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). 
InnerVolumeSpecName "kube-api-access-jmslp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.697198 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.697235 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:10.697222191 +0000 UTC m=+4167.878738413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-public-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.731992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.766589 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.778637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "bd6d7315-b24c-411b-acc8-bda614951af2" (UID: "bd6d7315-b24c-411b-acc8-bda614951af2"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-config-data\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-logs\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d44g5\" (UniqueName: \"kubernetes.io/projected/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-kube-api-access-d44g5\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791638 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791650 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791660 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791669 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791678 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6d7315-b24c-411b-acc8-bda614951af2-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791687 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6d7315-b24c-411b-acc8-bda614951af2-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.791695 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmslp\" (UniqueName: \"kubernetes.io/projected/bd6d7315-b24c-411b-acc8-bda614951af2-kube-api-access-jmslp\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.791771 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.792417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-logs\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.792522 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.792676 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:13.791799424 +0000 UTC m=+4170.973315646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovn-metrics" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.792699 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:13.79269057 +0000 UTC m=+4170.974206792 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.792863 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.792908 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs podName:23f0ac73-fdf4-4d9f-b00d-90906390fe70 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:13.792892429 +0000 UTC m=+4170.974408652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.795004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-config-data\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.795530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.796550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.807246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44g5\" (UniqueName: \"kubernetes.io/projected/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-kube-api-access-d44g5\") pod \"nova-metadata-0\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.852503 4707 scope.go:117] "RemoveContainer" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.869191 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w"] Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.875247 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f74c7bd86-hlp8w"] Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.880434 4707 scope.go:117] "RemoveContainer" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" Jan 21 16:11:09 crc kubenswrapper[4707]: E0121 16:11:09.880777 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04\": container with ID starting with 5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04 not found: ID does not exist" containerID="5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.880802 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04"} err="failed to get container status \"5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04\": rpc error: code = NotFound desc = could not find container \"5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04\": container with ID starting with 5e1c50c4935c8d07166b7d0db3f7e0cd9c7956ad91aa06cbf351195b8b993d04 not found: ID does not exist" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.892733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data\") pod \"43b25315-bece-4c48-bd94-3132650e2b14\" (UID: \"43b25315-bece-4c48-bd94-3132650e2b14\") " Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.895550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data" (OuterVolumeSpecName: "config-data") pod "43b25315-bece-4c48-bd94-3132650e2b14" (UID: "43b25315-bece-4c48-bd94-3132650e2b14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.919225 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:09 crc kubenswrapper[4707]: I0121 16:11:09.995300 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b25315-bece-4c48-bd94-3132650e2b14-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.178044 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.196570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.206562 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.212632 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.213142 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="ovn-northd" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.213168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="ovn-northd" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.213182 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="openstack-network-exporter" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.213190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="openstack-network-exporter" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.213567 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="ovn-northd" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.213621 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" containerName="openstack-network-exporter" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.214462 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.216155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.218442 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.301621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-config-data\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.301669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.301985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fr27\" (UniqueName: \"kubernetes.io/projected/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-kube-api-access-2fr27\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.308090 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43b25315_bece_4c48_bd94_3132650e2b14.slice/crio-505a7b08eead2cf1c6175f645a0b4eea548123018a8032e6282223933b6932c3\": RecentStats: unable to find data in memory cache]" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.403906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.404055 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.404490 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.404467532 +0000 UTC m=+4169.585983754 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-public-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.404401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.404640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-config-data\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.404703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.405211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fr27\" (UniqueName: \"kubernetes.io/projected/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-kube-api-access-2fr27\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.405268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.405401 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.405456 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.405443147 +0000 UTC m=+4169.586959358 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : secret "cert-glance-default-public-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.405605 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.405722 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. 
No retries permitted until 2026-01-21 16:11:12.405709056 +0000 UTC m=+4169.587225279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-internal-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.411490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-config-data\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.411764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.419220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fr27\" (UniqueName: \"kubernetes.io/projected/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-kube-api-access-2fr27\") pod \"nova-scheduler-0\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.507348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.507539 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.507618 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs podName:45335228-5b76-4f29-a370-98578b0dfd62 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.507602345 +0000 UTC m=+4169.689118567 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "45335228-5b76-4f29-a370-98578b0dfd62") : secret "cert-glance-default-internal-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.507885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.508092 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.508142 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs podName:449654c7-aa25-4489-b5b9-500f51719353 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:14.50813113 +0000 UTC m=+4171.689647352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs") pod "openstack-galera-0" (UID: "449654c7-aa25-4489-b5b9-500f51719353") : secret "cert-galera-openstack-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.534617 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.559508 4707 generic.go:334] "Generic (PLEG): container finished" podID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerID="cfbed99679c4ef966f0c2a8850897c643850ca266da8c7489dc2e08a36f36c46" exitCode=1 Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.559602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerDied","Data":"cfbed99679c4ef966f0c2a8850897c643850ca266da8c7489dc2e08a36f36c46"} Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.560459 4707 scope.go:117] "RemoveContainer" containerID="cfbed99679c4ef966f0c2a8850897c643850ca266da8c7489dc2e08a36f36c46" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.565662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2f6304c7-b127-4fcb-a74b-b5009a2b1a50","Type":"ContainerStarted","Data":"cd779a5c94d7063de0abc7e3a7f2b71e9ae69f23b6f7b82dbce250ade905a2db"} Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.571722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a56f6441-8492-43ae-8a5a-8e38eb6b7cab","Type":"ContainerStarted","Data":"f1f33074f9c4884125c41e8f10998b0cfb13e6b6903b0de59bbc6d2a374caca5"} Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.571797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a56f6441-8492-43ae-8a5a-8e38eb6b7cab","Type":"ContainerStarted","Data":"3805d4c67848713fa21d3835fea8fa9390b192a873e21f06f95e32eafd197ef0"} Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.571836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a56f6441-8492-43ae-8a5a-8e38eb6b7cab","Type":"ContainerStarted","Data":"ee4df396e056207a57c588b23cf0363049858f0ddecf9fe9b16f81dfa131d113"} Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.583294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.594916 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.594872659 podStartE2EDuration="1.594872659s" podCreationTimestamp="2026-01-21 16:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:10.592741622 +0000 UTC m=+4167.774257845" watchObservedRunningTime="2026-01-21 16:11:10.594872659 +0000 UTC m=+4167.776388882" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.599431 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_bd6d7315-b24c-411b-acc8-bda614951af2/ovn-northd/0.log" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.599544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"bd6d7315-b24c-411b-acc8-bda614951af2","Type":"ContainerDied","Data":"d929e54dc6614dc80117812f2e8dcdeca2febaf9705801ba94cc500ea1e5b3db"} Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.599610 4707 scope.go:117] "RemoveContainer" containerID="edc9c99ba54dd7b30862a576d3d32b484fff02281a8a134d2f7a03fdef6a53c6" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.599859 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.621697 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.621681288 podStartE2EDuration="3.621681288s" podCreationTimestamp="2026-01-21 16:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:10.618080056 +0000 UTC m=+4167.799596279" watchObservedRunningTime="2026-01-21 16:11:10.621681288 +0000 UTC m=+4167.803197510" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.712764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.713115 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.713203 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.713185252 +0000 UTC m=+4169.894701475 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-internal-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.713781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.714057 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: E0121 16:11:10.714105 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.714094522 +0000 UTC m=+4169.895610744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-public-svc" not found Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.772685 4707 scope.go:117] "RemoveContainer" containerID="f5905c256265d74b62ed26557c2803f5ee8e0dde3974c37bd97e1831a18746e7" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.812067 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.837236 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.850928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.852751 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.855030 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.855135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.855350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-ljxg7" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.858490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.858649 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.922240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.922477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-config\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.922538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-scripts\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.922691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.923187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nkrv\" (UniqueName: \"kubernetes.io/projected/a18848e3-7a28-4f1e-913d-b8aa7440df5e-kube-api-access-2nkrv\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.923336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:10 crc kubenswrapper[4707]: I0121 16:11:10.923512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.025512 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.025560 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.025584 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.525567424 +0000 UTC m=+4168.707083646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovn-metrics" not found Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-config\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.025611 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:11.525597921 +0000 UTC m=+4168.707114143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-scripts\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.025982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nkrv\" (UniqueName: \"kubernetes.io/projected/a18848e3-7a28-4f1e-913d-b8aa7440df5e-kube-api-access-2nkrv\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.026243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-config\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.026563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-scripts\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.037467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.042396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nkrv\" (UniqueName: \"kubernetes.io/projected/a18848e3-7a28-4f1e-913d-b8aa7440df5e-kube-api-access-2nkrv\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.095546 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.111511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.202984 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b25315-bece-4c48-bd94-3132650e2b14" path="/var/lib/kubelet/pods/43b25315-bece-4c48-bd94-3132650e2b14/volumes" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.203914 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6fa0ac-e4ce-419d-a680-ce63883886a7" path="/var/lib/kubelet/pods/4e6fa0ac-e4ce-419d-a680-ce63883886a7/volumes" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.204779 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6d7315-b24c-411b-acc8-bda614951af2" path="/var/lib/kubelet/pods/bd6d7315-b24c-411b-acc8-bda614951af2/volumes" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.208593 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" path="/var/lib/kubelet/pods/ef9ba957-bcd9-4e63-bbf2-4e2398ac259a/volumes" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.231598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-combined-ca-bundle\") pod \"1bc4c055-2df1-4d1f-b331-30020327ed16\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.231980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data\") pod \"1bc4c055-2df1-4d1f-b331-30020327ed16\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.232155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc4c055-2df1-4d1f-b331-30020327ed16-logs\") pod \"1bc4c055-2df1-4d1f-b331-30020327ed16\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.232191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data-custom\") pod \"1bc4c055-2df1-4d1f-b331-30020327ed16\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.232285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp4j7\" (UniqueName: \"kubernetes.io/projected/1bc4c055-2df1-4d1f-b331-30020327ed16-kube-api-access-sp4j7\") pod \"1bc4c055-2df1-4d1f-b331-30020327ed16\" (UID: \"1bc4c055-2df1-4d1f-b331-30020327ed16\") " Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.237535 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc4c055-2df1-4d1f-b331-30020327ed16-logs" (OuterVolumeSpecName: "logs") pod "1bc4c055-2df1-4d1f-b331-30020327ed16" (UID: "1bc4c055-2df1-4d1f-b331-30020327ed16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.257020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1bc4c055-2df1-4d1f-b331-30020327ed16" (UID: "1bc4c055-2df1-4d1f-b331-30020327ed16"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.290796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc4c055-2df1-4d1f-b331-30020327ed16-kube-api-access-sp4j7" (OuterVolumeSpecName: "kube-api-access-sp4j7") pod "1bc4c055-2df1-4d1f-b331-30020327ed16" (UID: "1bc4c055-2df1-4d1f-b331-30020327ed16"). InnerVolumeSpecName "kube-api-access-sp4j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.340173 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc4c055-2df1-4d1f-b331-30020327ed16-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.340201 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.340214 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp4j7\" (UniqueName: \"kubernetes.io/projected/1bc4c055-2df1-4d1f-b331-30020327ed16-kube-api-access-sp4j7\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.341064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bc4c055-2df1-4d1f-b331-30020327ed16" (UID: "1bc4c055-2df1-4d1f-b331-30020327ed16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.374799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data" (OuterVolumeSpecName: "config-data") pod "1bc4c055-2df1-4d1f-b331-30020327ed16" (UID: "1bc4c055-2df1-4d1f-b331-30020327ed16"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.442179 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.442206 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc4c055-2df1-4d1f-b331-30020327ed16-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.543709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.543784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.543892 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.543913 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.543949 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.543935018 +0000 UTC m=+4169.725451240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovn-metrics" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.543975 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:12.543960685 +0000 UTC m=+4169.725476908 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.626981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerStarted","Data":"2f792b82ffa7260df2fe2acbed939f4f7e50fdc85ed7854f6905c5767651ed06"} Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.627025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerStarted","Data":"0f515b5801bd198300964126774afddd1c6e414cd602008997bdec48127d09c5"} Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.631227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerStarted","Data":"2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5"} Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.631341 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.632709 4707 generic.go:334] "Generic (PLEG): container finished" podID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerID="92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe" exitCode=0 Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.632760 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.632793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" event={"ID":"1bc4c055-2df1-4d1f-b331-30020327ed16","Type":"ContainerDied","Data":"92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe"} Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.632843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm" event={"ID":"1bc4c055-2df1-4d1f-b331-30020327ed16","Type":"ContainerDied","Data":"dda11bed22650a80cbd1e89e066168e6698086b16561d374618a5c4168529dd0"} Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.632863 4707 scope.go:117] "RemoveContainer" containerID="92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.644115 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.644085861 podStartE2EDuration="1.644085861s" podCreationTimestamp="2026-01-21 16:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:11.638093744 +0000 UTC m=+4168.819609966" watchObservedRunningTime="2026-01-21 16:11:11.644085861 +0000 UTC m=+4168.825602083" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.671996 4707 scope.go:117] "RemoveContainer" containerID="8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.677551 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm"] Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.684227 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5cdcb8d65c-2rlbm"] Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.687225 4707 scope.go:117] "RemoveContainer" containerID="92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.687498 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe\": container with ID starting with 92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe not found: ID does not exist" containerID="92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.687534 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe"} err="failed to get container status \"92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe\": rpc error: code = NotFound desc = could not find container \"92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe\": container with ID starting with 92f911407c9f5f55f8a8b30c15bad6acf26cfce85721b12771e8a8bae660d1fe not found: ID does not exist" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.687551 4707 scope.go:117] "RemoveContainer" containerID="8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.687770 4707 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44\": container with ID starting with 8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44 not found: ID does not exist" containerID="8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.687793 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44"} err="failed to get container status \"8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44\": rpc error: code = NotFound desc = could not find container \"8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44\": container with ID starting with 8ee97cdc9adcaa70472c96dbfb68262b8ee12a326aebe7cc30368967e614ae44 not found: ID does not exist" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.745871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.745909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.747189 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.747231 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:19.747219222 +0000 UTC m=+4176.928735444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.747619 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.747652 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:19.747645364 +0000 UTC m=+4176.929161585 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.949888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.950062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.950089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:11 crc kubenswrapper[4707]: I0121 16:11:11.950109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950242 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950286 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:19.950274527 +0000 UTC m=+4177.131790748 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950532 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950567 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:19.950558621 +0000 UTC m=+4177.132074842 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-internal-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950596 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950621 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:19.950615378 +0000 UTC m=+4177.132131600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-public-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950647 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:11:11 crc kubenswrapper[4707]: E0121 16:11:11.950664 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:19.950658028 +0000 UTC m=+4177.132174250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.157768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.157949 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.158010 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:20.157995934 +0000 UTC m=+4177.339512156 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.157957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.158391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.158500 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.158535 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:20.158528335 +0000 UTC m=+4177.340044558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.164526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.463238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.463331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.463367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.463604 4707 secret.go:188] Couldn't get 
secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.463654 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:16.463641929 +0000 UTC m=+4173.645158151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.463955 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.463983 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:11:16.463976628 +0000 UTC m=+4173.645492850 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : secret "cert-glance-default-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.464014 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.464032 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:16.464026712 +0000 UTC m=+4173.645542934 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.564522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.564796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.564840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.565032 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.565068 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:14.565059755 +0000 UTC m=+4171.746575977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.565077 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.565150 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:14.565132552 +0000 UTC m=+4171.746648774 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovn-metrics" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.565215 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.565240 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs podName:45335228-5b76-4f29-a370-98578b0dfd62 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:16.56523269 +0000 UTC m=+4173.746748912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "45335228-5b76-4f29-a370-98578b0dfd62") : secret "cert-glance-default-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.767196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:12 crc kubenswrapper[4707]: I0121 16:11:12.767380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.768090 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.768129 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:16.768118566 +0000 UTC m=+4173.949634788 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-public-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.769142 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 21 16:11:12 crc kubenswrapper[4707]: E0121 16:11:12.769185 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:16.769177636 +0000 UTC m=+4173.950693859 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-internal-svc" not found Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.189861 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" path="/var/lib/kubelet/pods/1bc4c055-2df1-4d1f-b331-30020327ed16/volumes" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.190400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.377288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.448062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"memcached-0\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.560520 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.560891 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ef9ba957-bcd9-4e63-bbf2-4e2398ac259a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.209:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.583649 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5 is running failed: container process not found" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.584013 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5 is running failed: container process not found" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.584301 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5 is running failed: container process not found" 
containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.584336 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5 is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.637206 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.647094 4707 generic.go:334] "Generic (PLEG): container finished" podID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" exitCode=1 Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.647127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerDied","Data":"2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5"} Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.647154 4707 scope.go:117] "RemoveContainer" containerID="cfbed99679c4ef966f0c2a8850897c643850ca266da8c7489dc2e08a36f36c46" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.647523 4707 scope.go:117] "RemoveContainer" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.647754 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.682965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.683104 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.683167 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:21.683144005 +0000 UTC m=+4178.864660227 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.683246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.683529 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.683556 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:21.683550279 +0000 UTC m=+4178.865066500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovn-metrics" not found Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.887172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.887497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.887319 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.887569 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:21.88755136 +0000 UTC m=+4179.069067582 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovn-metrics" not found Jan 21 16:11:13 crc kubenswrapper[4707]: I0121 16:11:13.887603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.887679 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.887744 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:21.887728463 +0000 UTC m=+4179.069244685 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.887769 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:13 crc kubenswrapper[4707]: E0121 16:11:13.887844 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs podName:23f0ac73-fdf4-4d9f-b00d-90906390fe70 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:21.887828422 +0000 UTC m=+4179.069344643 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70") : secret "cert-galera-openstack-cell1-svc" not found Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.038170 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.599126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:14 crc kubenswrapper[4707]: E0121 16:11:14.599263 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 21 16:11:14 crc kubenswrapper[4707]: E0121 16:11:14.599449 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs podName:449654c7-aa25-4489-b5b9-500f51719353 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:22.59942298 +0000 UTC m=+4179.780939202 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs") pod "openstack-galera-0" (UID: "449654c7-aa25-4489-b5b9-500f51719353") : secret "cert-galera-openstack-svc" not found Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.599472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.599545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:14 crc kubenswrapper[4707]: E0121 16:11:14.599621 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:14 crc kubenswrapper[4707]: E0121 16:11:14.599655 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:18.5996468 +0000 UTC m=+4175.781163023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovn-metrics" not found Jan 21 16:11:14 crc kubenswrapper[4707]: E0121 16:11:14.599757 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:14 crc kubenswrapper[4707]: E0121 16:11:14.599835 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:18.599820157 +0000 UTC m=+4175.781336379 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.656009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"97adda3a-b739-44a3-8119-516f049b2784","Type":"ContainerStarted","Data":"5aed7ec7960a72f577ec69b2bad9b80e35f61e4ed5447c6a1499ce16cca77434"} Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.656044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"97adda3a-b739-44a3-8119-516f049b2784","Type":"ContainerStarted","Data":"b72fb37a556adaa68dbfff32a806e793362e5c46a77ab09b3d0ab39a0cf5cf71"} Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.656135 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.671202 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=9.671187007 podStartE2EDuration="9.671187007s" podCreationTimestamp="2026-01-21 16:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:14.666264001 +0000 UTC m=+4171.847780223" watchObservedRunningTime="2026-01-21 16:11:14.671187007 +0000 UTC m=+4171.852703229" Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.919589 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:14 crc kubenswrapper[4707]: I0121 16:11:14.919645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.243627 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:15 crc kubenswrapper[4707]: E0121 16:11:15.245014 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[galera-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="23f0ac73-fdf4-4d9f-b00d-90906390fe70" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.535601 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.661677 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.670470 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.731476 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:15 crc kubenswrapper[4707]: E0121 16:11:15.732311 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[galera-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstack-galera-0" podUID="449654c7-aa25-4489-b5b9-500f51719353" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.820380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-default\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.820653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kolla-config\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.820834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.820899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-generated\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms5s5\" (UniqueName: \"kubernetes.io/projected/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kube-api-access-ms5s5\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821179 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-combined-ca-bundle\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-operator-scripts\") pod \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\" (UID: \"23f0ac73-fdf4-4d9f-b00d-90906390fe70\") " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.821910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.822403 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.822482 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.822536 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.822584 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.824959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.825323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kube-api-access-ms5s5" (OuterVolumeSpecName: "kube-api-access-ms5s5") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "kube-api-access-ms5s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.825649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23f0ac73-fdf4-4d9f-b00d-90906390fe70" (UID: "23f0ac73-fdf4-4d9f-b00d-90906390fe70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.924848 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms5s5\" (UniqueName: \"kubernetes.io/projected/23f0ac73-fdf4-4d9f-b00d-90906390fe70-kube-api-access-ms5s5\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.924876 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.924905 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:11:15 crc kubenswrapper[4707]: I0121 16:11:15.939280 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.026720 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.535215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.535293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.535368 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.535425 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. 
No retries permitted until 2026-01-21 16:11:24.535410269 +0000 UTC m=+4181.716926492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-public-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.535572 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.535617 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.535646 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.53563921 +0000 UTC m=+4181.717155431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : secret "cert-glance-default-public-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.535581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.535699 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.535673293 +0000 UTC m=+4181.717189516 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-internal-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.583218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.583699 4707 scope.go:117] "RemoveContainer" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.583974 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.637753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.637940 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.638022 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs podName:45335228-5b76-4f29-a370-98578b0dfd62 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.638002423 +0000 UTC m=+4181.819518645 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "45335228-5b76-4f29-a370-98578b0dfd62") : secret "cert-glance-default-internal-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.667738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.667755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.677876 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.713406 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.720324 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.735238 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.735658 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker-log" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.735677 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker-log" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.735691 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.735697 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.735851 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.735871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc4c055-2df1-4d1f-b331-30020327ed16" containerName="barbican-worker-log" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.736792 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.741299 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.741712 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-bhtdn" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.742266 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.742440 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.744968 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.841376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-operator-scripts\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.841433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tjp\" (UniqueName: \"kubernetes.io/projected/449654c7-aa25-4489-b5b9-500f51719353-kube-api-access-s7tjp\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.841540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-config-data-default\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.841588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.841706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-combined-ca-bundle\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/449654c7-aa25-4489-b5b9-500f51719353-config-data-generated\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-kolla-config\") pod \"449654c7-aa25-4489-b5b9-500f51719353\" (UID: \"449654c7-aa25-4489-b5b9-500f51719353\") " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842872 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449654c7-aa25-4489-b5b9-500f51719353-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.842979 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.842980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.843035 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.843015017 +0000 UTC m=+4182.024531239 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-internal-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhzk5\" (UniqueName: \"kubernetes.io/projected/58427ff4-8738-48bd-a3ea-a99986023664-kube-api-access-nhzk5\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58427ff4-8738-48bd-a3ea-a99986023664-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843941 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/449654c7-aa25-4489-b5b9-500f51719353-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843961 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843972 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843980 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f0ac73-fdf4-4d9f-b00d-90906390fe70-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.843990 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/449654c7-aa25-4489-b5b9-500f51719353-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.844034 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: E0121 16:11:16.844072 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs podName:97c00bba-c016-4c12-9df2-819557bb8d8f nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.844063198 +0000 UTC m=+4182.025579410 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs") pod "nova-api-0" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f") : secret "cert-nova-public-svc" not found Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.846168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.846370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449654c7-aa25-4489-b5b9-500f51719353-kube-api-access-s7tjp" (OuterVolumeSpecName: "kube-api-access-s7tjp") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "kube-api-access-s7tjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.849443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "449654c7-aa25-4489-b5b9-500f51719353" (UID: "449654c7-aa25-4489-b5b9-500f51719353"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.946559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58427ff4-8738-48bd-a3ea-a99986023664-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.946963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.947129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58427ff4-8738-48bd-a3ea-a99986023664-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.947829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.948016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.948789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.948875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.948954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhzk5\" (UniqueName: \"kubernetes.io/projected/58427ff4-8738-48bd-a3ea-a99986023664-kube-api-access-nhzk5\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949145 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tjp\" (UniqueName: \"kubernetes.io/projected/449654c7-aa25-4489-b5b9-500f51719353-kube-api-access-s7tjp\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949549 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949565 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.949702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.950951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.952508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.952834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.963718 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.964610 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nhzk5\" (UniqueName: \"kubernetes.io/projected/58427ff4-8738-48bd-a3ea-a99986023664-kube-api-access-nhzk5\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:16 crc kubenswrapper[4707]: I0121 16:11:16.976866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.051825 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.063912 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.224179 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f0ac73-fdf4-4d9f-b00d-90906390fe70" path="/var/lib/kubelet/pods/23f0ac73-fdf4-4d9f-b00d-90906390fe70/volumes" Jan 21 16:11:17 crc kubenswrapper[4707]: W0121 16:11:17.478083 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58427ff4_8738_48bd_a3ea_a99986023664.slice/crio-02a85cbfb13496723c88bf261d79f640fc16c5d9738d9a6caa90e8a03b802bb1 WatchSource:0}: Error finding container 02a85cbfb13496723c88bf261d79f640fc16c5d9738d9a6caa90e8a03b802bb1: Status 404 returned error can't find the container with id 02a85cbfb13496723c88bf261d79f640fc16c5d9738d9a6caa90e8a03b802bb1 Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.478749 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.678424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"58427ff4-8738-48bd-a3ea-a99986023664","Type":"ContainerStarted","Data":"3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098"} Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.678626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"58427ff4-8738-48bd-a3ea-a99986023664","Type":"ContainerStarted","Data":"02a85cbfb13496723c88bf261d79f640fc16c5d9738d9a6caa90e8a03b802bb1"} Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.678455 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.744067 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.753182 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.759915 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.761350 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.764101 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-s4nbl" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.764332 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.764537 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.764911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.779881 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.871871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jhb\" (UniqueName: \"kubernetes.io/projected/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kube-api-access-l9jhb\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kolla-config\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872369 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-default\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.872861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.873106 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/449654c7-aa25-4489-b5b9-500f51719353-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jhb\" (UniqueName: \"kubernetes.io/projected/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kube-api-access-l9jhb\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kolla-config\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-default\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.974877 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.975264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.975682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kolla-config\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.975755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-default\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.976536 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.979225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.979904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.989270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jhb\" (UniqueName: \"kubernetes.io/projected/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kube-api-access-l9jhb\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:17 crc kubenswrapper[4707]: I0121 16:11:17.999579 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:18 crc kubenswrapper[4707]: I0121 16:11:18.082479 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:18 crc kubenswrapper[4707]: I0121 16:11:18.459077 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:11:18 crc kubenswrapper[4707]: W0121 16:11:18.461175 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1ca9dac_9e6a_4626_a8ff_111cfee2ef3b.slice/crio-4599c305e572f34ed5c16b20e2dedae0100fe3465c84e3075b360da360309573 WatchSource:0}: Error finding container 4599c305e572f34ed5c16b20e2dedae0100fe3465c84e3075b360da360309573: Status 404 returned error can't find the container with id 4599c305e572f34ed5c16b20e2dedae0100fe3465c84e3075b360da360309573 Jan 21 16:11:18 crc kubenswrapper[4707]: I0121 16:11:18.685995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b","Type":"ContainerStarted","Data":"062b57f9c5689c8c3511ae8fbeb3929d0f15a809640fc51bf28faf8a24ac751f"} Jan 21 16:11:18 crc kubenswrapper[4707]: I0121 16:11:18.686382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b","Type":"ContainerStarted","Data":"4599c305e572f34ed5c16b20e2dedae0100fe3465c84e3075b360da360309573"} Jan 21 16:11:18 crc kubenswrapper[4707]: I0121 16:11:18.695832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:18 crc kubenswrapper[4707]: I0121 16:11:18.695916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:18 crc kubenswrapper[4707]: E0121 16:11:18.696190 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:18 crc kubenswrapper[4707]: E0121 16:11:18.696231 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:26.69622013 +0000 UTC m=+4183.877736353 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:18 crc kubenswrapper[4707]: E0121 16:11:18.696331 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:18 crc kubenswrapper[4707]: E0121 16:11:18.696404 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:26.696389199 +0000 UTC m=+4183.877905421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovn-metrics" not found Jan 21 16:11:19 crc kubenswrapper[4707]: I0121 16:11:19.193748 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449654c7-aa25-4489-b5b9-500f51719353" path="/var/lib/kubelet/pods/449654c7-aa25-4489-b5b9-500f51719353/volumes" Jan 21 16:11:19 crc kubenswrapper[4707]: I0121 16:11:19.818172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:19 crc kubenswrapper[4707]: I0121 16:11:19.818245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:19 crc kubenswrapper[4707]: E0121 16:11:19.818803 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:19 crc kubenswrapper[4707]: E0121 16:11:19.818930 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:19 crc kubenswrapper[4707]: E0121 16:11:19.819042 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:35.818950066 +0000 UTC m=+4193.000466288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:19 crc kubenswrapper[4707]: E0121 16:11:19.819118 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. 
No retries permitted until 2026-01-21 16:11:35.819107753 +0000 UTC m=+4193.000623975 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:19 crc kubenswrapper[4707]: I0121 16:11:19.919477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:19 crc kubenswrapper[4707]: I0121 16:11:19.919512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.021775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.021853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.021880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.022010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") pod \"placement-5878d64444-swb5j\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.022282 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.022335 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:36.022320231 +0000 UTC m=+4193.203836453 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-public-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.022402 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.022460 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:36.022448012 +0000 UTC m=+4193.203964234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.022494 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.022514 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs podName:8e33fc10-40c0-4167-bf53-bf5d535ce2f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:36.022508415 +0000 UTC m=+4193.204024638 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs") pod "placement-5878d64444-swb5j" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6") : secret "cert-placement-internal-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.027380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.224376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.224435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.224618 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: E0121 16:11:20.224684 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:36.224668045 +0000 UTC m=+4193.406184277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.228283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.535117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.588052 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.703554 4707 generic.go:334] "Generic (PLEG): container finished" podID="58427ff4-8738-48bd-a3ea-a99986023664" containerID="3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098" exitCode=0 Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.703637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"58427ff4-8738-48bd-a3ea-a99986023664","Type":"ContainerDied","Data":"3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098"} Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.727604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.932978 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:20 crc kubenswrapper[4707]: I0121 16:11:20.933039 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.710502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"58427ff4-8738-48bd-a3ea-a99986023664","Type":"ContainerStarted","Data":"bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4"} Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.711955 4707 generic.go:334] "Generic (PLEG): container finished" podID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerID="062b57f9c5689c8c3511ae8fbeb3929d0f15a809640fc51bf28faf8a24ac751f" exitCode=0 Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.712141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b","Type":"ContainerDied","Data":"062b57f9c5689c8c3511ae8fbeb3929d0f15a809640fc51bf28faf8a24ac751f"} Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.726220 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=5.726206722 podStartE2EDuration="5.726206722s" podCreationTimestamp="2026-01-21 16:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:21.725791682 +0000 UTC m=+4178.907307905" watchObservedRunningTime="2026-01-21 16:11:21.726206722 +0000 UTC m=+4178.907722945" Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.756585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.756717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.756779 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.756870 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:37.75685424 +0000 UTC m=+4194.938370472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovn-metrics" not found Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.756876 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.756927 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs podName:f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:37.75691288 +0000 UTC m=+4194.938429102 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.960030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:21 crc kubenswrapper[4707]: I0121 16:11:21.960094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.960237 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.960308 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:37.96029069 +0000 UTC m=+4195.141806912 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovn-metrics" not found Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.960347 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:21 crc kubenswrapper[4707]: E0121 16:11:21.960386 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs podName:a020fbaa-3cc2-46c6-84af-9f5f0323842a nodeName:}" failed. No retries permitted until 2026-01-21 16:11:37.960374227 +0000 UTC m=+4195.141890449 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 21 16:11:22 crc kubenswrapper[4707]: I0121 16:11:22.723776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b","Type":"ContainerStarted","Data":"2ce45f981554c524b9a786326d92c8a0c4cc2bbf20e590dd6c7773290192d2cd"} Jan 21 16:11:22 crc kubenswrapper[4707]: I0121 16:11:22.740182 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.740152932 podStartE2EDuration="5.740152932s" podCreationTimestamp="2026-01-21 16:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:22.736857185 +0000 UTC m=+4179.918373408" watchObservedRunningTime="2026-01-21 16:11:22.740152932 +0000 UTC m=+4179.921669154" Jan 21 16:11:23 crc kubenswrapper[4707]: I0121 16:11:23.230977 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.225:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.288844 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.288916 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:23.788901241 +0000 UTC m=+4180.970417463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-public-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.289221 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.289261 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:23.789250919 +0000 UTC m=+4180.970767140 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: I0121 16:11:23.638474 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:23 crc kubenswrapper[4707]: I0121 16:11:23.662398 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:23 crc kubenswrapper[4707]: I0121 16:11:23.729675 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="97adda3a-b739-44a3-8119-516f049b2784" containerName="memcached" containerID="cri-o://5aed7ec7960a72f577ec69b2bad9b80e35f61e4ed5447c6a1499ce16cca77434" gracePeriod=30 Jan 21 16:11:23 crc kubenswrapper[4707]: I0121 16:11:23.772485 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:11:23 crc kubenswrapper[4707]: I0121 16:11:23.772687 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="54b025fd-f070-4b20-b44c-e49aa1930987" containerName="nova-cell1-conductor-conductor" containerID="cri-o://15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" gracePeriod=30 Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.798069 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.798209 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.798190486 +0000 UTC m=+4181.979706708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-public-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.798264 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:23 crc kubenswrapper[4707]: E0121 16:11:23.798373 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:24.798365345 +0000 UTC m=+4181.979881567 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.612132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.612231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.612551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.612653 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.612725 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:11:40.612710243 +0000 UTC m=+4197.794226465 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : secret "cert-glance-default-public-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.612724 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.612900 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs podName:d6195d18-9ddf-4470-9f2f-5a3fe9717c53 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:40.612881776 +0000 UTC m=+4197.794397998 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs") pod "cinder-api-0" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53") : secret "cert-cinder-internal-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.616760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.667388 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.668229 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/nova-api-0" podUID="97c00bba-c016-4c12-9df2-819557bb8d8f" Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.695671 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.696750 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.697982 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.698082 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="54b025fd-f070-4b20-b44c-e49aa1930987" containerName="nova-cell1-conductor-conductor" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.714034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.717126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: 
I0121 16:11:24.735639 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.750271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.817036 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.817100 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:26.81708553 +0000 UTC m=+4183.998601752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.817184 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: E0121 16:11:24.817255 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:26.817242024 +0000 UTC m=+4183.998758246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-public-svc" not found Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.906003 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.918776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-combined-ca-bundle\") pod \"97c00bba-c016-4c12-9df2-819557bb8d8f\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.918905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms87k\" (UniqueName: \"kubernetes.io/projected/97c00bba-c016-4c12-9df2-819557bb8d8f-kube-api-access-ms87k\") pod \"97c00bba-c016-4c12-9df2-819557bb8d8f\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.918955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c00bba-c016-4c12-9df2-819557bb8d8f-logs\") pod \"97c00bba-c016-4c12-9df2-819557bb8d8f\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.919075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-config-data\") pod \"97c00bba-c016-4c12-9df2-819557bb8d8f\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.919399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c00bba-c016-4c12-9df2-819557bb8d8f-logs" (OuterVolumeSpecName: "logs") pod "97c00bba-c016-4c12-9df2-819557bb8d8f" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.919413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.919552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.919702 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c00bba-c016-4c12-9df2-819557bb8d8f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.922095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c00bba-c016-4c12-9df2-819557bb8d8f-kube-api-access-ms87k" (OuterVolumeSpecName: "kube-api-access-ms87k") pod "97c00bba-c016-4c12-9df2-819557bb8d8f" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f"). InnerVolumeSpecName "kube-api-access-ms87k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.923173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97c00bba-c016-4c12-9df2-819557bb8d8f" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.924665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.924738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-config-data" (OuterVolumeSpecName: "config-data") pod "97c00bba-c016-4c12-9df2-819557bb8d8f" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:24 crc kubenswrapper[4707]: I0121 16:11:24.926546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.020315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") pod \"97c00bba-c016-4c12-9df2-819557bb8d8f\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.020473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") pod \"97c00bba-c016-4c12-9df2-819557bb8d8f\" (UID: \"97c00bba-c016-4c12-9df2-819557bb8d8f\") " Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.021100 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms87k\" (UniqueName: \"kubernetes.io/projected/97c00bba-c016-4c12-9df2-819557bb8d8f-kube-api-access-ms87k\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.021253 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.021263 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.025418 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "97c00bba-c016-4c12-9df2-819557bb8d8f" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.026952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "97c00bba-c016-4c12-9df2-819557bb8d8f" (UID: "97c00bba-c016-4c12-9df2-819557bb8d8f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.123270 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.123304 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c00bba-c016-4c12-9df2-819557bb8d8f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.290347 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.747275 4707 generic.go:334] "Generic (PLEG): container finished" podID="97adda3a-b739-44a3-8119-516f049b2784" containerID="5aed7ec7960a72f577ec69b2bad9b80e35f61e4ed5447c6a1499ce16cca77434" exitCode=0 Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.747353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"97adda3a-b739-44a3-8119-516f049b2784","Type":"ContainerDied","Data":"5aed7ec7960a72f577ec69b2bad9b80e35f61e4ed5447c6a1499ce16cca77434"} Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.750776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"45335228-5b76-4f29-a370-98578b0dfd62","Type":"ContainerStarted","Data":"681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60"} Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.750802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"45335228-5b76-4f29-a370-98578b0dfd62","Type":"ContainerStarted","Data":"ee67adb083f54026653faa6a58edfd853f55522fe0f32b5f8210e39d69312e76"} Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.750835 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.794616 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.803458 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.808761 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.812457 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.814334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.814474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.814666 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.816694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.892105 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.938756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-public-tls-certs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.938877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjg2v\" (UniqueName: \"kubernetes.io/projected/8d8f30cf-308e-4177-8b29-5d0d6811b551-kube-api-access-cjg2v\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.938959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.938998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.939039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8f30cf-308e-4177-8b29-5d0d6811b551-logs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.939074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-config-data\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:25 crc kubenswrapper[4707]: I0121 16:11:25.977491 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:25 crc kubenswrapper[4707]: E0121 16:11:25.978183 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[metrics-certs-tls-certs ovsdbserver-nb-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="a020fbaa-3cc2-46c6-84af-9f5f0323842a" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.040761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-combined-ca-bundle\") pod \"97adda3a-b739-44a3-8119-516f049b2784\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.040798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") pod \"97adda3a-b739-44a3-8119-516f049b2784\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.040950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-config-data\") pod \"97adda3a-b739-44a3-8119-516f049b2784\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.040981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xmcg\" (UniqueName: \"kubernetes.io/projected/97adda3a-b739-44a3-8119-516f049b2784-kube-api-access-9xmcg\") pod \"97adda3a-b739-44a3-8119-516f049b2784\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-kolla-config\") pod \"97adda3a-b739-44a3-8119-516f049b2784\" (UID: \"97adda3a-b739-44a3-8119-516f049b2784\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8f30cf-308e-4177-8b29-5d0d6811b551-logs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-config-data\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041562 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-public-tls-certs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjg2v\" (UniqueName: \"kubernetes.io/projected/8d8f30cf-308e-4177-8b29-5d0d6811b551-kube-api-access-cjg2v\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.041956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-config-data" (OuterVolumeSpecName: "config-data") pod "97adda3a-b739-44a3-8119-516f049b2784" (UID: "97adda3a-b739-44a3-8119-516f049b2784"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.042491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "97adda3a-b739-44a3-8119-516f049b2784" (UID: "97adda3a-b739-44a3-8119-516f049b2784"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.043323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8f30cf-308e-4177-8b29-5d0d6811b551-logs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.045723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.045733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97adda3a-b739-44a3-8119-516f049b2784-kube-api-access-9xmcg" (OuterVolumeSpecName: "kube-api-access-9xmcg") pod "97adda3a-b739-44a3-8119-516f049b2784" (UID: "97adda3a-b739-44a3-8119-516f049b2784"). InnerVolumeSpecName "kube-api-access-9xmcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.045907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.046646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-config-data\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.048845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-public-tls-certs\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.056204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjg2v\" (UniqueName: \"kubernetes.io/projected/8d8f30cf-308e-4177-8b29-5d0d6811b551-kube-api-access-cjg2v\") pod \"nova-api-0\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.070487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97adda3a-b739-44a3-8119-516f049b2784" (UID: "97adda3a-b739-44a3-8119-516f049b2784"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.090135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "97adda3a-b739-44a3-8119-516f049b2784" (UID: "97adda3a-b739-44a3-8119-516f049b2784"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.129573 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.143008 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.143247 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xmcg\" (UniqueName: \"kubernetes.io/projected/97adda3a-b739-44a3-8119-516f049b2784-kube-api-access-9xmcg\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.143317 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97adda3a-b739-44a3-8119-516f049b2784-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.143370 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.143419 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97adda3a-b739-44a3-8119-516f049b2784-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:26 crc kubenswrapper[4707]: W0121 16:11:26.488886 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d8f30cf_308e_4177_8b29_5d0d6811b551.slice/crio-303a95345de521cf994c2aa7b38cdbeeb9040656ca262c2b9cc64c79d8ef4901 WatchSource:0}: Error finding container 303a95345de521cf994c2aa7b38cdbeeb9040656ca262c2b9cc64c79d8ef4901: Status 404 returned error can't find the container with id 303a95345de521cf994c2aa7b38cdbeeb9040656ca262c2b9cc64c79d8ef4901 Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.490686 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.757342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.757437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.757664 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.757733 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs podName:a18848e3-7a28-4f1e-913d-b8aa7440df5e nodeName:}" failed. No retries permitted until 2026-01-21 16:11:42.757717833 +0000 UTC m=+4199.939234055 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e") : secret "cert-ovnnorthd-ovndbs" not found Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.760445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.763793 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.764244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"97adda3a-b739-44a3-8119-516f049b2784","Type":"ContainerDied","Data":"b72fb37a556adaa68dbfff32a806e793362e5c46a77ab09b3d0ab39a0cf5cf71"} Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.764282 4707 scope.go:117] "RemoveContainer" containerID="5aed7ec7960a72f577ec69b2bad9b80e35f61e4ed5447c6a1499ce16cca77434" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.767958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"45335228-5b76-4f29-a370-98578b0dfd62","Type":"ContainerStarted","Data":"39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0"} Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.769997 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.770064 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8d8f30cf-308e-4177-8b29-5d0d6811b551","Type":"ContainerStarted","Data":"78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d"} Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.770094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8d8f30cf-308e-4177-8b29-5d0d6811b551","Type":"ContainerStarted","Data":"ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb"} Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.770105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8d8f30cf-308e-4177-8b29-5d0d6811b551","Type":"ContainerStarted","Data":"303a95345de521cf994c2aa7b38cdbeeb9040656ca262c2b9cc64c79d8ef4901"} Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.799972 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=18.799958288 podStartE2EDuration="18.799958288s" podCreationTimestamp="2026-01-21 16:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:26.789951181 +0000 UTC m=+4183.971467404" watchObservedRunningTime="2026-01-21 16:11:26.799958288 +0000 UTC m=+4183.981474510" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.802543 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.817737 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.817724734 podStartE2EDuration="1.817724734s" podCreationTimestamp="2026-01-21 16:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:26.806052416 +0000 UTC m=+4183.987568638" watchObservedRunningTime="2026-01-21 16:11:26.817724734 +0000 UTC m=+4183.999240955" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.826682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.834125 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.843661 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.843990 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97adda3a-b739-44a3-8119-516f049b2784" containerName="memcached" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.844009 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97adda3a-b739-44a3-8119-516f049b2784" containerName="memcached" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.844200 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97adda3a-b739-44a3-8119-516f049b2784" containerName="memcached" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.844719 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.846821 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-vp54k" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.846962 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.848837 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.859451 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.859498 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:30.859486479 +0000 UTC m=+4188.041002701 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-public-svc" not found Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.860221 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:26 crc kubenswrapper[4707]: E0121 16:11:26.860277 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:30.860260835 +0000 UTC m=+4188.041777056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.865437 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdb-rundir\") pod \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-config\") pod \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-scripts\") pod \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqcqz\" (UniqueName: \"kubernetes.io/projected/a020fbaa-3cc2-46c6-84af-9f5f0323842a-kube-api-access-kqcqz\") pod \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-combined-ca-bundle\") pod \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\" (UID: \"a020fbaa-3cc2-46c6-84af-9f5f0323842a\") " Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kolla-config\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.960973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-config-data\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.961036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.961079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vcpt\" (UniqueName: \"kubernetes.io/projected/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kube-api-access-5vcpt\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.961439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "a020fbaa-3cc2-46c6-84af-9f5f0323842a" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.961705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-config" (OuterVolumeSpecName: "config") pod "a020fbaa-3cc2-46c6-84af-9f5f0323842a" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.961987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-scripts" (OuterVolumeSpecName: "scripts") pod "a020fbaa-3cc2-46c6-84af-9f5f0323842a" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.966023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "a020fbaa-3cc2-46c6-84af-9f5f0323842a" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.966029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a020fbaa-3cc2-46c6-84af-9f5f0323842a" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:26 crc kubenswrapper[4707]: I0121 16:11:26.979709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a020fbaa-3cc2-46c6-84af-9f5f0323842a-kube-api-access-kqcqz" (OuterVolumeSpecName: "kube-api-access-kqcqz") pod "a020fbaa-3cc2-46c6-84af-9f5f0323842a" (UID: "a020fbaa-3cc2-46c6-84af-9f5f0323842a"). InnerVolumeSpecName "kube-api-access-kqcqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vcpt\" (UniqueName: \"kubernetes.io/projected/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kube-api-access-5vcpt\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kolla-config\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-config-data\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062630 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqcqz\" (UniqueName: \"kubernetes.io/projected/a020fbaa-3cc2-46c6-84af-9f5f0323842a-kube-api-access-kqcqz\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062640 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062649 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062659 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062668 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020fbaa-3cc2-46c6-84af-9f5f0323842a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.062686 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.063946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kolla-config\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.065293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-config-data\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.066025 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.066061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.066765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.068352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.077779 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.082092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vcpt\" (UniqueName: \"kubernetes.io/projected/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kube-api-access-5vcpt\") pod \"memcached-0\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.123086 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.164978 4707 
reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.169173 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.190614 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97adda3a-b739-44a3-8119-516f049b2784" path="/var/lib/kubelet/pods/97adda3a-b739-44a3-8119-516f049b2784/volumes" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.191149 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c00bba-c016-4c12-9df2-819557bb8d8f" path="/var/lib/kubelet/pods/97c00bba-c016-4c12-9df2-819557bb8d8f/volumes" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.525010 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:27 crc kubenswrapper[4707]: W0121 16:11:27.525231 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87763ac_1165_4dec_b7c5_4397e2fcff3e.slice/crio-2f29bc765f434e5f9f65ed88dfb837ea5e2a564ae01b9f92044f548c5ebe9b59 WatchSource:0}: Error finding container 2f29bc765f434e5f9f65ed88dfb837ea5e2a564ae01b9f92044f548c5ebe9b59: Status 404 returned error can't find the container with id 2f29bc765f434e5f9f65ed88dfb837ea5e2a564ae01b9f92044f548c5ebe9b59 Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.777732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"b87763ac-1165-4dec-b7c5-4397e2fcff3e","Type":"ContainerStarted","Data":"d6fddc205c6890a454e9f163957631dcca8806c9925d38125376e97d6b85a4e2"} Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.777772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"b87763ac-1165-4dec-b7c5-4397e2fcff3e","Type":"ContainerStarted","Data":"2f29bc765f434e5f9f65ed88dfb837ea5e2a564ae01b9f92044f548c5ebe9b59"} Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.777960 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.802126 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.8021108510000001 podStartE2EDuration="1.802110851s" podCreationTimestamp="2026-01-21 16:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:27.793644673 +0000 UTC m=+4184.975160894" watchObservedRunningTime="2026-01-21 16:11:27.802110851 +0000 UTC m=+4184.983627073" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.854517 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:27 crc kubenswrapper[4707]: E0121 16:11:27.855007 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ovn-northd-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ovn-northd-0" podUID="a18848e3-7a28-4f1e-913d-b8aa7440df5e" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.873496 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.880446 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.893712 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.895271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.897030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.897213 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.898227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-tpgml" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.898369 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.900478 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.904996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-config\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.979880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6fc\" (UniqueName: \"kubernetes.io/projected/b86320b8-c3c8-42d6-b48b-206526f89271-kube-api-access-fc6fc\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.980020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.980083 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:27 crc kubenswrapper[4707]: I0121 16:11:27.980101 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020fbaa-3cc2-46c6-84af-9f5f0323842a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.016613 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:28 crc kubenswrapper[4707]: E0121 16:11:28.017270 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs-tls-certs ovsdbserver-sb-tls-certs], unattached volumes=[], failed to process volumes=[]: context 
canceled" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-config\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6fc\" (UniqueName: \"kubernetes.io/projected/b86320b8-c3c8-42d6-b48b-206526f89271-kube-api-access-fc6fc\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082687 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.082993 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"b86320b8-c3c8-42d6-b48b-206526f89271\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.083055 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.083099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.083565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-config\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.083577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.086357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.087587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.092481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.098361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6fc\" (UniqueName: \"kubernetes.io/projected/b86320b8-c3c8-42d6-b48b-206526f89271-kube-api-access-fc6fc\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.107043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.161082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.195218 4707 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.212985 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.271924 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.225:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.286170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-combined-ca-bundle\") pod \"54b025fd-f070-4b20-b44c-e49aa1930987\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.286228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxk6w\" (UniqueName: \"kubernetes.io/projected/54b025fd-f070-4b20-b44c-e49aa1930987-kube-api-access-sxk6w\") pod \"54b025fd-f070-4b20-b44c-e49aa1930987\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.286420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-config-data\") pod \"54b025fd-f070-4b20-b44c-e49aa1930987\" (UID: \"54b025fd-f070-4b20-b44c-e49aa1930987\") " Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.290309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b025fd-f070-4b20-b44c-e49aa1930987-kube-api-access-sxk6w" (OuterVolumeSpecName: "kube-api-access-sxk6w") pod "54b025fd-f070-4b20-b44c-e49aa1930987" (UID: "54b025fd-f070-4b20-b44c-e49aa1930987"). InnerVolumeSpecName "kube-api-access-sxk6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.305668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-config-data" (OuterVolumeSpecName: "config-data") pod "54b025fd-f070-4b20-b44c-e49aa1930987" (UID: "54b025fd-f070-4b20-b44c-e49aa1930987"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.306129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b025fd-f070-4b20-b44c-e49aa1930987" (UID: "54b025fd-f070-4b20-b44c-e49aa1930987"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.391031 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxk6w\" (UniqueName: \"kubernetes.io/projected/54b025fd-f070-4b20-b44c-e49aa1930987-kube-api-access-sxk6w\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.391064 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.391075 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b025fd-f070-4b20-b44c-e49aa1930987-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.510202 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786192 4707 generic.go:334] "Generic (PLEG): container finished" podID="54b025fd-f070-4b20-b44c-e49aa1930987" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" exitCode=0 Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786229 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786249 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"54b025fd-f070-4b20-b44c-e49aa1930987","Type":"ContainerDied","Data":"15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7"} Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"54b025fd-f070-4b20-b44c-e49aa1930987","Type":"ContainerDied","Data":"43a7c7915ac3875c47d8555e1d035fa118b1903778addc0ebd7155d8e87474f7"} Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786316 4707 scope.go:117] "RemoveContainer" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786366 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.786968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.989848 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:28 crc kubenswrapper[4707]: I0121 16:11:28.995704 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.001658 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.007938 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.015392 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: E0121 16:11:29.015730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b025fd-f070-4b20-b44c-e49aa1930987" containerName="nova-cell1-conductor-conductor" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.015748 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b025fd-f070-4b20-b44c-e49aa1930987" containerName="nova-cell1-conductor-conductor" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.015953 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b025fd-f070-4b20-b44c-e49aa1930987" containerName="nova-cell1-conductor-conductor" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.016465 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.018046 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.027346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.063283 4707 scope.go:117] "RemoveContainer" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" Jan 21 16:11:29 crc kubenswrapper[4707]: E0121 16:11:29.063574 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7\": container with ID starting with 15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7 not found: ID does not exist" containerID="15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.063613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7"} err="failed to get container status \"15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7\": rpc error: code = NotFound desc = could not find container \"15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7\": container with ID starting with 15756d9348a17be2ba16ab10c7832f788e4ddfa3bd28c1501fb5375090c838d7 not found: ID does not exist" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-config\") pod \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-scripts\") pod \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pls4\" (UniqueName: \"kubernetes.io/projected/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-kube-api-access-4pls4\") pod \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-config\") pod \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-combined-ca-bundle\") pod \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-rundir\") pod \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-scripts\") pod \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-combined-ca-bundle\") pod \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") pod \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102581 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nkrv\" (UniqueName: \"kubernetes.io/projected/a18848e3-7a28-4f1e-913d-b8aa7440df5e-kube-api-access-2nkrv\") pod \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\" (UID: \"a18848e3-7a28-4f1e-913d-b8aa7440df5e\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.102624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdb-rundir\") pod \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\" (UID: \"f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2\") " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "a18848e3-7a28-4f1e-913d-b8aa7440df5e" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-scripts" (OuterVolumeSpecName: "scripts") pod "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-config" (OuterVolumeSpecName: "config") pod "a18848e3-7a28-4f1e-913d-b8aa7440df5e" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-scripts" (OuterVolumeSpecName: "scripts") pod "a18848e3-7a28-4f1e-913d-b8aa7440df5e" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-config" (OuterVolumeSpecName: "config") pod "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66ks\" (UniqueName: \"kubernetes.io/projected/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-kube-api-access-x66ks\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103939 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103951 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103982 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18848e3-7a28-4f1e-913d-b8aa7440df5e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.103999 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.104010 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.104019 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.104731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.105017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a18848e3-7a28-4f1e-913d-b8aa7440df5e" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.106031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-kube-api-access-4pls4" (OuterVolumeSpecName: "kube-api-access-4pls4") pod "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2"). InnerVolumeSpecName "kube-api-access-4pls4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.106093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18848e3-7a28-4f1e-913d-b8aa7440df5e-kube-api-access-2nkrv" (OuterVolumeSpecName: "kube-api-access-2nkrv") pod "a18848e3-7a28-4f1e-913d-b8aa7440df5e" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e"). InnerVolumeSpecName "kube-api-access-2nkrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.106318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a18848e3-7a28-4f1e-913d-b8aa7440df5e" (UID: "a18848e3-7a28-4f1e-913d-b8aa7440df5e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.106403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" (UID: "f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.190027 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b025fd-f070-4b20-b44c-e49aa1930987" path="/var/lib/kubelet/pods/54b025fd-f070-4b20-b44c-e49aa1930987/volumes" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.190617 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a020fbaa-3cc2-46c6-84af-9f5f0323842a" path="/var/lib/kubelet/pods/a020fbaa-3cc2-46c6-84af-9f5f0323842a/volumes" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.199255 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.205571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.205650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66ks\" (UniqueName: \"kubernetes.io/projected/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-kube-api-access-x66ks\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.205713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.205856 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pls4\" (UniqueName: \"kubernetes.io/projected/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-kube-api-access-4pls4\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.205882 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.205970 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.206005 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.206016 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.206028 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nkrv\" (UniqueName: \"kubernetes.io/projected/a18848e3-7a28-4f1e-913d-b8aa7440df5e-kube-api-access-2nkrv\") on node \"crc\" DevicePath 
\"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.210134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.210427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.221769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66ks\" (UniqueName: \"kubernetes.io/projected/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-kube-api-access-x66ks\") pod \"nova-cell1-conductor-0\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.226824 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.307942 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.338382 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.722616 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.794686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b86320b8-c3c8-42d6-b48b-206526f89271","Type":"ContainerStarted","Data":"7019254782a506b2d815df0e4448644e9e93a6b8330ab0977344a2a08dc91ea2"} Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.794721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b86320b8-c3c8-42d6-b48b-206526f89271","Type":"ContainerStarted","Data":"7d1010a0b55312ee73410bd885472062d3c784a9d367b85e66eb856b023fc87c"} Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.794733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b86320b8-c3c8-42d6-b48b-206526f89271","Type":"ContainerStarted","Data":"3b1af069d581d3464a3b38ec400f4fe584002411e02285c4eab03a9ebd21c0a9"} Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.796645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerStarted","Data":"3aa25acbd4de9cf12483122ba0fa631336d240c2bad44047c3d8daf054a45b76"} Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.799075 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.799262 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.808981 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.80896416 podStartE2EDuration="2.80896416s" podCreationTimestamp="2026-01-21 16:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:29.808036505 +0000 UTC m=+4186.989552728" watchObservedRunningTime="2026-01-21 16:11:29.80896416 +0000 UTC m=+4186.990480382" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.871133 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.886226 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.902855 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.910936 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.916761 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.919009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.921635 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.921708 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.921933 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.922132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-ljxg7" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.922591 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.924040 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.927628 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.927943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.928052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hdph9" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.928179 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.929608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:29 crc kubenswrapper[4707]: I0121 16:11:29.934389 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qfc\" (UniqueName: \"kubernetes.io/projected/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-kube-api-access-d9qfc\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-scripts\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xkqb\" (UniqueName: \"kubernetes.io/projected/bb06987b-a8ab-40c8-b73b-f0244ac056c7-kube-api-access-8xkqb\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019557 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-config\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019764 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.019876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.020098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-config\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.020146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.020234 4707 
reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18848e3-7a28-4f1e-913d-b8aa7440df5e-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.020256 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.020270 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.121699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-config\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.121902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.121950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qfc\" (UniqueName: \"kubernetes.io/projected/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-kube-api-access-d9qfc\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.121978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.121993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-scripts\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xkqb\" (UniqueName: 
\"kubernetes.io/projected/bb06987b-a8ab-40c8-b73b-f0244ac056c7-kube-api-access-8xkqb\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-config\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122443 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") device mount path 
\"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-config\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.122843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-scripts\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.123405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.123834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-config\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.123921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.183391 4707 scope.go:117] "RemoveContainer" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.348502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.348533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.348851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.349581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: 
I0121 16:11:30.350527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.350867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.350914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qfc\" (UniqueName: \"kubernetes.io/projected/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-kube-api-access-d9qfc\") pod \"ovn-northd-0\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.363896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xkqb\" (UniqueName: \"kubernetes.io/projected/bb06987b-a8ab-40c8-b73b-f0244ac056c7-kube-api-access-8xkqb\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.368616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.542334 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.551606 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.806646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerStarted","Data":"3e2e572e2b9c3ab78970333ab0779e1558d12c60389c19900d5ef68470e127c9"} Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.806882 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.809585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerStarted","Data":"c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be"} Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.821845 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.821833193 podStartE2EDuration="2.821833193s" podCreationTimestamp="2026-01-21 16:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:30.81764571 +0000 UTC m=+4187.999161942" watchObservedRunningTime="2026-01-21 16:11:30.821833193 +0000 UTC m=+4188.003349416" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.924033 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.927954 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:30 crc kubenswrapper[4707]: E0121 16:11:30.944238 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:30 crc kubenswrapper[4707]: E0121 16:11:30.944283 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:38.944268768 +0000 UTC m=+4196.125784981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-public-svc" not found Jan 21 16:11:30 crc kubenswrapper[4707]: E0121 16:11:30.944508 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:11:30 crc kubenswrapper[4707]: E0121 16:11:30.944540 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs podName:7d6ee277-b629-49a0-94da-030a39be02f5 nodeName:}" failed. No retries permitted until 2026-01-21 16:11:38.944532945 +0000 UTC m=+4196.126049167 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs") pod "keystone-6c79bb4b5d-xfj6p" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5") : secret "cert-keystone-internal-svc" not found Jan 21 16:11:30 crc kubenswrapper[4707]: I0121 16:11:30.951320 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.017376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:11:31 crc kubenswrapper[4707]: W0121 16:11:31.030640 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e10ec4e_d4dd_412a_be77_d3d7d7ae27d8.slice/crio-10a89f75f33c3601c50829a918237a55bde8d9395e3825b4addc64c864a74c34 WatchSource:0}: Error finding container 10a89f75f33c3601c50829a918237a55bde8d9395e3825b4addc64c864a74c34: Status 404 returned error can't find the container with id 10a89f75f33c3601c50829a918237a55bde8d9395e3825b4addc64c864a74c34 Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.195169 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18848e3-7a28-4f1e-913d-b8aa7440df5e" path="/var/lib/kubelet/pods/a18848e3-7a28-4f1e-913d-b8aa7440df5e/volumes" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.195848 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2" path="/var/lib/kubelet/pods/f58126d0-aefa-41e0-8ba5-ecb3ef73e1f2/volumes" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.213056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.696022 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5878d64444-swb5j"] Jan 21 16:11:31 crc kubenswrapper[4707]: E0121 16:11:31.696592 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/placement-5878d64444-swb5j" podUID="8e33fc10-40c0-4167-bf53-bf5d535ce2f6" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.716795 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-56454c4d6b-f6jz7"] Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.718176 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.729588 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-56454c4d6b-f6jz7"] Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.755924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4hx\" (UniqueName: \"kubernetes.io/projected/72880cef-939a-498b-bdea-2be7388c8037-kube-api-access-np4hx\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.756146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-combined-ca-bundle\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.756266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-scripts\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.756357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-internal-tls-certs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.756432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-config-data\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.756530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72880cef-939a-498b-bdea-2be7388c8037-logs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.756650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-public-tls-certs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.818013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8","Type":"ContainerStarted","Data":"cc159cdf2eb7eebf0df0489fca214415a177fd59dd0017097f91d3bdaa7b7b49"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.818062 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8","Type":"ContainerStarted","Data":"6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.818074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8","Type":"ContainerStarted","Data":"10a89f75f33c3601c50829a918237a55bde8d9395e3825b4addc64c864a74c34"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.818180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.819254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"bb06987b-a8ab-40c8-b73b-f0244ac056c7","Type":"ContainerStarted","Data":"89922e5f5460efebce04fba540cf6fbd73efdfb666c0b1bc1af0fa90f06ec5bb"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.819403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"bb06987b-a8ab-40c8-b73b-f0244ac056c7","Type":"ContainerStarted","Data":"5a00f55ff9ae0fdbc1e8d372d22b6a13fe41087fb6a5f3e20c7888d085b9c8e6"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.819465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"bb06987b-a8ab-40c8-b73b-f0244ac056c7","Type":"ContainerStarted","Data":"73e49e6cdf62ccfb699d60be36f133e6e742d6be284d34183d967f2cbb1e40ca"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.820956 4707 generic.go:334] "Generic (PLEG): container finished" podID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerID="3e2e572e2b9c3ab78970333ab0779e1558d12c60389c19900d5ef68470e127c9" exitCode=1 Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.821017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerDied","Data":"3e2e572e2b9c3ab78970333ab0779e1558d12c60389c19900d5ef68470e127c9"} Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.821078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.821365 4707 scope.go:117] "RemoveContainer" containerID="3e2e572e2b9c3ab78970333ab0779e1558d12c60389c19900d5ef68470e127c9" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.831311 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.857820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-combined-ca-bundle\") pod \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.857863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-scripts\") pod \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwl2q\" (UniqueName: \"kubernetes.io/projected/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-kube-api-access-jwl2q\") pod \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-config-data\") pod \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-logs\") pod \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\" (UID: \"8e33fc10-40c0-4167-bf53-bf5d535ce2f6\") " Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858332 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-scripts\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-config-data\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-internal-tls-certs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72880cef-939a-498b-bdea-2be7388c8037-logs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-public-tls-certs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np4hx\" (UniqueName: \"kubernetes.io/projected/72880cef-939a-498b-bdea-2be7388c8037-kube-api-access-np4hx\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.858754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-combined-ca-bundle\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.863202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72880cef-939a-498b-bdea-2be7388c8037-logs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.863568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-logs" (OuterVolumeSpecName: "logs") pod "8e33fc10-40c0-4167-bf53-bf5d535ce2f6" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.865935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-config-data\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.870493 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.87048031 podStartE2EDuration="2.87048031s" podCreationTimestamp="2026-01-21 16:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:31.841937461 +0000 UTC m=+4189.023453683" watchObservedRunningTime="2026-01-21 16:11:31.87048031 +0000 UTC m=+4189.051996533" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.872696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-config-data" (OuterVolumeSpecName: "config-data") pod "8e33fc10-40c0-4167-bf53-bf5d535ce2f6" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.872858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e33fc10-40c0-4167-bf53-bf5d535ce2f6" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.872979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-scripts" (OuterVolumeSpecName: "scripts") pod "8e33fc10-40c0-4167-bf53-bf5d535ce2f6" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.873056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-kube-api-access-jwl2q" (OuterVolumeSpecName: "kube-api-access-jwl2q") pod "8e33fc10-40c0-4167-bf53-bf5d535ce2f6" (UID: "8e33fc10-40c0-4167-bf53-bf5d535ce2f6"). InnerVolumeSpecName "kube-api-access-jwl2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.873103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-public-tls-certs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.877004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-combined-ca-bundle\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.878328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-scripts\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.878615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-internal-tls-certs\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.878940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np4hx\" (UniqueName: \"kubernetes.io/projected/72880cef-939a-498b-bdea-2be7388c8037-kube-api-access-np4hx\") pod \"placement-56454c4d6b-f6jz7\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.888100 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.888086635 podStartE2EDuration="2.888086635s" podCreationTimestamp="2026-01-21 16:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:31.87338897 +0000 UTC m=+4189.054905192" watchObservedRunningTime="2026-01-21 16:11:31.888086635 +0000 UTC m=+4189.069602857" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 
16:11:31.960695 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwl2q\" (UniqueName: \"kubernetes.io/projected/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-kube-api-access-jwl2q\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.960782 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.960868 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.960938 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:31 crc kubenswrapper[4707]: I0121 16:11:31.960991 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.034418 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.177024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.339082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.426724 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-56454c4d6b-f6jz7"] Jan 21 16:11:32 crc kubenswrapper[4707]: W0121 16:11:32.430279 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72880cef_939a_498b_bdea_2be7388c8037.slice/crio-84f5ba8378aab57b6bc81b1d4b3dc82d0e8866408568e564472b4ad1a4bcc062 WatchSource:0}: Error finding container 84f5ba8378aab57b6bc81b1d4b3dc82d0e8866408568e564472b4ad1a4bcc062: Status 404 returned error can't find the container with id 84f5ba8378aab57b6bc81b1d4b3dc82d0e8866408568e564472b4ad1a4bcc062 Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.583180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.828725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerStarted","Data":"377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52"} Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.829452 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.830724 4707 generic.go:334] "Generic (PLEG): container finished" podID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerID="c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be" exitCode=1 Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.830764 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerDied","Data":"c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be"} Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.830785 4707 scope.go:117] "RemoveContainer" containerID="2ffcbb4c3ae958308ad55b6321ec6e276b8dabf227806e02d4c84d0da7f78aa5" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.831459 4707 scope.go:117] "RemoveContainer" containerID="c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be" Jan 21 16:11:32 crc kubenswrapper[4707]: E0121 16:11:32.831803 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.849341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" event={"ID":"72880cef-939a-498b-bdea-2be7388c8037","Type":"ContainerStarted","Data":"c4c68a89695bf606a0a4cc6f42584ecaa9e2bf69fe90294a201d349ae0a85e8f"} Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.849563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" event={"ID":"72880cef-939a-498b-bdea-2be7388c8037","Type":"ContainerStarted","Data":"418de46e8ddd98abdd313f3caa5188dc17cb30561b35253f50ddd8686752659b"} Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.849575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" event={"ID":"72880cef-939a-498b-bdea-2be7388c8037","Type":"ContainerStarted","Data":"84f5ba8378aab57b6bc81b1d4b3dc82d0e8866408568e564472b4ad1a4bcc062"} Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.849426 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5878d64444-swb5j" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.850086 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.850237 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.906514 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" podStartSLOduration=1.906500525 podStartE2EDuration="1.906500525s" podCreationTimestamp="2026-01-21 16:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:32.884604854 +0000 UTC m=+4190.066121076" watchObservedRunningTime="2026-01-21 16:11:32.906500525 +0000 UTC m=+4190.088016747" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.914729 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5878d64444-swb5j"] Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.922114 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5878d64444-swb5j"] Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.978260 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:32 crc kubenswrapper[4707]: I0121 16:11:32.978392 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e33fc10-40c0-4167-bf53-bf5d535ce2f6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.191797 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e33fc10-40c0-4167-bf53-bf5d535ce2f6" path="/var/lib/kubelet/pods/8e33fc10-40c0-4167-bf53-bf5d535ce2f6/volumes" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.214146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.313955 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.225:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.552660 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.858688 4707 generic.go:334] "Generic (PLEG): container finished" podID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerID="377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52" exitCode=1 Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.858738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerDied","Data":"377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52"} Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 
16:11:33.858767 4707 scope.go:117] "RemoveContainer" containerID="3e2e572e2b9c3ab78970333ab0779e1558d12c60389c19900d5ef68470e127c9" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.859318 4707 scope.go:117] "RemoveContainer" containerID="377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52" Jan 21 16:11:33 crc kubenswrapper[4707]: E0121 16:11:33.859544 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(047c06ee-71d5-4bcf-83e0-dd8aa295cdb3)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.867225 4707 generic.go:334] "Generic (PLEG): container finished" podID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerID="2f792b82ffa7260df2fe2acbed939f4f7e50fdc85ed7854f6905c5767651ed06" exitCode=1 Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.867303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerDied","Data":"2f792b82ffa7260df2fe2acbed939f4f7e50fdc85ed7854f6905c5767651ed06"} Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.867596 4707 scope.go:117] "RemoveContainer" containerID="2f792b82ffa7260df2fe2acbed939f4f7e50fdc85ed7854f6905c5767651ed06" Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.876681 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" containerID="3fa65774e091d2648c8e5d199347bcc9ac19de4c12027001ad23996032d7b010" exitCode=137 Jan 21 16:11:33 crc kubenswrapper[4707]: I0121 16:11:33.876719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef","Type":"ContainerDied","Data":"3fa65774e091d2648c8e5d199347bcc9ac19de4c12027001ad23996032d7b010"} Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.127688 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.247855 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.283922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.301347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8phz\" (UniqueName: \"kubernetes.io/projected/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-kube-api-access-c8phz\") pod \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.301512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-vencrypt-tls-certs\") pod \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.301542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-combined-ca-bundle\") pod \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.301613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-config-data\") pod \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.301673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-nova-novncproxy-tls-certs\") pod \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\" (UID: \"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef\") " Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.306930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-kube-api-access-c8phz" (OuterVolumeSpecName: "kube-api-access-c8phz") pod "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" (UID: "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef"). InnerVolumeSpecName "kube-api-access-c8phz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.334149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" (UID: "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.337502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-config-data" (OuterVolumeSpecName: "config-data") pod "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" (UID: "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.354663 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" (UID: "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.361629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" (UID: "f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.405366 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.405608 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.405618 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8phz\" (UniqueName: \"kubernetes.io/projected/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-kube-api-access-c8phz\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.405628 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.405636 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.584064 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.584674 4707 scope.go:117] "RemoveContainer" containerID="c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be" Jan 21 16:11:34 crc kubenswrapper[4707]: E0121 16:11:34.584971 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.889212 4707 scope.go:117] "RemoveContainer" containerID="377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52" Jan 21 16:11:34 crc kubenswrapper[4707]: E0121 16:11:34.889433 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(047c06ee-71d5-4bcf-83e0-dd8aa295cdb3)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.891625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerStarted","Data":"1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d"} Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.893887 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.893926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef","Type":"ContainerDied","Data":"e3017366764c70929229cc28608c44a1e8f46d0b5bb1dfa621d22fa38aa86f08"} Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.893955 4707 scope.go:117] "RemoveContainer" containerID="3fa65774e091d2648c8e5d199347bcc9ac19de4c12027001ad23996032d7b010" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.906782 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.907048 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.944541 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.948996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.954883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.966232 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.977641 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:11:34 crc kubenswrapper[4707]: E0121 16:11:34.978051 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.978070 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.978265 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.978839 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.981428 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.981629 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:11:34 crc kubenswrapper[4707]: I0121 16:11:34.984613 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.000084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.065945 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:11:35 crc kubenswrapper[4707]: E0121 16:11:35.066609 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="6a3ae0f0-4069-4bac-8927-e0d9dce26caf" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.083079 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.121366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.121442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.121548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.121579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfr7s\" (UniqueName: \"kubernetes.io/projected/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-kube-api-access-zfr7s\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.121617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.191062 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef" path="/var/lib/kubelet/pods/f3eba8e2-4051-4016-bcd2-3f2ca9fd9fef/volumes" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.223077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.223310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.223542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.224004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfr7s\" (UniqueName: \"kubernetes.io/projected/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-kube-api-access-zfr7s\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.224102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.226901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.226999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.227499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.227962 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.242563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfr7s\" (UniqueName: \"kubernetes.io/projected/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-kube-api-access-zfr7s\") pod \"nova-cell1-novncproxy-0\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.302298 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.338763 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.509137 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.509665 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-central-agent" containerID="cri-o://6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b" gracePeriod=30 Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.509776 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="proxy-httpd" containerID="cri-o://579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53" gracePeriod=30 Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.509831 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="sg-core" containerID="cri-o://23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065" gracePeriod=30 Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.509860 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-notification-agent" containerID="cri-o://53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4" gracePeriod=30 Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.535440 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.552339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.681067 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.833991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: 
\"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.834038 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:35 crc kubenswrapper[4707]: E0121 16:11:35.834193 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:11:35 crc kubenswrapper[4707]: E0121 16:11:35.834249 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs podName:1528d71b-31ac-461e-b1d9-29c348f3d63e nodeName:}" failed. No retries permitted until 2026-01-21 16:12:07.834235543 +0000 UTC m=+4225.015751766 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs") pod "keystone-5c9b5b5886-675s9" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e") : secret "cert-keystone-public-svc" not found Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.906167 4707 generic.go:334] "Generic (PLEG): container finished" podID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerID="579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53" exitCode=0 Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.906190 4707 generic.go:334] "Generic (PLEG): container finished" podID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerID="23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065" exitCode=2 Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.906224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerDied","Data":"579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53"} Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.906253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerDied","Data":"23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065"} Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.907454 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.907726 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.907747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.908039 4707 scope.go:117] "RemoveContainer" containerID="377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52" Jan 21 16:11:35 crc kubenswrapper[4707]: E0121 16:11:35.908239 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(047c06ee-71d5-4bcf-83e0-dd8aa295cdb3)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" Jan 21 16:11:35 crc kubenswrapper[4707]: I0121 16:11:35.950418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"keystone-5c9b5b5886-675s9\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.037111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") pod \"barbican-api-fb8df9d8-8g2ns\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:36 crc kubenswrapper[4707]: E0121 16:11:36.040073 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:11:36 crc kubenswrapper[4707]: E0121 16:11:36.040148 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs podName:6227e663-2e19-4ddc-b8cb-ce6e35eb67c2 nodeName:}" failed. No retries permitted until 2026-01-21 16:12:08.040128523 +0000 UTC m=+4225.221644735 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs") pod "barbican-api-fb8df9d8-8g2ns" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2") : secret "cert-barbican-internal-svc" not found Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.046475 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c9b5b5886-675s9"] Jan 21 16:11:36 crc kubenswrapper[4707]: E0121 16:11:36.046998 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" podUID="1528d71b-31ac-461e-b1d9-29c348f3d63e" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.067751 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5c46885fd8-bdfw6"] Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.070566 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.077850 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c46885fd8-bdfw6"] Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.130407 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.130439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.193672 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-credential-keys\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-combined-ca-bundle\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhhx\" (UniqueName: \"kubernetes.io/projected/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-kube-api-access-tjhhx\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-internal-tls-certs\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-scripts\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-fernet-keys\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-public-tls-certs\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") pod \"neutron-6f795569d6-mmhxt\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.241540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-config-data\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: E0121 16:11:36.242500 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 21 16:11:36 crc kubenswrapper[4707]: E0121 16:11:36.242547 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs podName:f904e1e0-53bb-45dc-90c4-df6a6783f58a nodeName:}" failed. No retries permitted until 2026-01-21 16:12:08.242533635 +0000 UTC m=+4225.424049857 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs") pod "neutron-6f795569d6-mmhxt" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a") : secret "cert-neutron-public-svc" not found Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkdjj\" (UniqueName: \"kubernetes.io/projected/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-kube-api-access-xkdjj\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-config-data\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-scripts\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-httpd-run\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-combined-ca-bundle\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-logs\") pod \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\" (UID: \"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.342948 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-logs" (OuterVolumeSpecName: "logs") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-config-data\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-credential-keys\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-combined-ca-bundle\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhhx\" (UniqueName: \"kubernetes.io/projected/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-kube-api-access-tjhhx\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-internal-tls-certs\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.343964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-scripts\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.344002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-fernet-keys\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc 
kubenswrapper[4707]: I0121 16:11:36.344035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-public-tls-certs\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.344106 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.344122 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.347134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.347140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-scripts" (OuterVolumeSpecName: "scripts") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.348693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.349715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-config-data" (OuterVolumeSpecName: "config-data") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.349857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-kube-api-access-xkdjj" (OuterVolumeSpecName: "kube-api-access-xkdjj") pod "6a3ae0f0-4069-4bac-8927-e0d9dce26caf" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf"). InnerVolumeSpecName "kube-api-access-xkdjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.350186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-combined-ca-bundle\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.352296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-config-data\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.353431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-internal-tls-certs\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.353637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-public-tls-certs\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.355046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-scripts\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.355524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-fernet-keys\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.355606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-credential-keys\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.364177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhhx\" (UniqueName: \"kubernetes.io/projected/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-kube-api-access-tjhhx\") pod \"keystone-5c46885fd8-bdfw6\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.446325 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkdjj\" (UniqueName: \"kubernetes.io/projected/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-kube-api-access-xkdjj\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.446532 4707 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.446542 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.446568 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.446578 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.461216 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.504919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.550321 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.597487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.656992 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.916923 4707 generic.go:334] "Generic (PLEG): container finished" podID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerID="6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b" exitCode=0 Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.916984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerDied","Data":"6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b"} Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.918230 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c46885fd8-bdfw6"] Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.918667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"38eec5c3-a42c-45b1-9cb9-d21e8c88de27","Type":"ContainerStarted","Data":"23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc"} Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.918698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"38eec5c3-a42c-45b1-9cb9-d21e8c88de27","Type":"ContainerStarted","Data":"7213c8df36de41b0edb774db638801343eac416614b300b2596866389eb5b8f8"} Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.923511 4707 generic.go:334] "Generic (PLEG): container finished" podID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" 
containerID="1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d" exitCode=1 Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.923562 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.925211 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.925284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerDied","Data":"1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d"} Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.925324 4707 scope.go:117] "RemoveContainer" containerID="2f792b82ffa7260df2fe2acbed939f4f7e50fdc85ed7854f6905c5767651ed06" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.925365 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-log" containerID="cri-o://681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60" gracePeriod=30 Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.925561 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-httpd" containerID="cri-o://39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0" gracePeriod=30 Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.925676 4707 scope.go:117] "RemoveContainer" containerID="1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d" Jan 21 16:11:36 crc kubenswrapper[4707]: E0121 16:11:36.925891 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(647f33a8-e7c7-4941-9c0d-f9c5e4b07409)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.952635 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.952617703 podStartE2EDuration="2.952617703s" podCreationTimestamp="2026-01-21 16:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:36.939426279 +0000 UTC m=+4194.120942501" watchObservedRunningTime="2026-01-21 16:11:36.952617703 +0000 UTC m=+4194.134133925" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.960432 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.228:9292/healthcheck\": EOF" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.976453 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-log" probeResult="failure" output="Get 
\"https://10.217.1.228:9292/healthcheck\": read tcp 10.217.0.2:59934->10.217.1.228:9292: read: connection reset by peer" Jan 21 16:11:36 crc kubenswrapper[4707]: I0121 16:11:36.977000 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.228:9292/healthcheck\": EOF" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.144137 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.145194 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.463885 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.569344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-config-data\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.569513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-credential-keys\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.569624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-fernet-keys\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.569699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-scripts\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.569771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7lmq\" (UniqueName: \"kubernetes.io/projected/1528d71b-31ac-461e-b1d9-29c348f3d63e-kube-api-access-k7lmq\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.569890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 
16:11:37.570030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-combined-ca-bundle\") pod \"1528d71b-31ac-461e-b1d9-29c348f3d63e\" (UID: \"1528d71b-31ac-461e-b1d9-29c348f3d63e\") " Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.573621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1528d71b-31ac-461e-b1d9-29c348f3d63e-kube-api-access-k7lmq" (OuterVolumeSpecName: "kube-api-access-k7lmq") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "kube-api-access-k7lmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.573669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-scripts" (OuterVolumeSpecName: "scripts") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.574008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-config-data" (OuterVolumeSpecName: "config-data") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.574100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.575153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.576210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.576365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1528d71b-31ac-461e-b1d9-29c348f3d63e" (UID: "1528d71b-31ac-461e-b1d9-29c348f3d63e"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672006 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672290 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672355 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672418 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672479 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672532 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7lmq\" (UniqueName: \"kubernetes.io/projected/1528d71b-31ac-461e-b1d9-29c348f3d63e-kube-api-access-k7lmq\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.672585 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.860179 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:37 crc kubenswrapper[4707]: E0121 16:11:37.861032 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/cinder-api-0" podUID="d6195d18-9ddf-4470-9f2f-5a3fe9717c53" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.932098 4707 generic.go:334] "Generic (PLEG): container finished" podID="45335228-5b76-4f29-a370-98578b0dfd62" containerID="681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60" exitCode=143 Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.932172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"45335228-5b76-4f29-a370-98578b0dfd62","Type":"ContainerDied","Data":"681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60"} Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.933289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" event={"ID":"61a39f5f-0c54-4def-ab69-4c0d7ef44e57","Type":"ContainerStarted","Data":"159bdb08582167cf1c8abdc43ff23a21e055879346a2d331601f1afb7b6503b5"} Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.933316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" 
event={"ID":"61a39f5f-0c54-4def-ab69-4c0d7ef44e57","Type":"ContainerStarted","Data":"e3112b3473d85e54b9df13f053a62a9760a93292d4632f67ee945c4122924e05"} Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.934216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.935601 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.935639 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c9b5b5886-675s9" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.942097 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.963850 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" podStartSLOduration=1.963838828 podStartE2EDuration="1.963838828s" podCreationTimestamp="2026-01-21 16:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:37.958423235 +0000 UTC m=+4195.139939457" watchObservedRunningTime="2026-01-21 16:11:37.963838828 +0000 UTC m=+4195.145355050" Jan 21 16:11:37 crc kubenswrapper[4707]: I0121 16:11:37.995948 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c9b5b5886-675s9"] Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.021088 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5c9b5b5886-675s9"] Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rpqm\" (UniqueName: \"kubernetes.io/projected/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-kube-api-access-2rpqm\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-logs\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data-custom\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-scripts\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: 
\"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-combined-ca-bundle\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.080404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-etc-machine-id\") pod \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\" (UID: \"d6195d18-9ddf-4470-9f2f-5a3fe9717c53\") " Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.081726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.081960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-logs" (OuterVolumeSpecName: "logs") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.085907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data" (OuterVolumeSpecName: "config-data") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.086475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.086601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-kube-api-access-2rpqm" (OuterVolumeSpecName: "kube-api-access-2rpqm") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "kube-api-access-2rpqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.087908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.091084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.093912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-scripts" (OuterVolumeSpecName: "scripts") pod "d6195d18-9ddf-4470-9f2f-5a3fe9717c53" (UID: "d6195d18-9ddf-4470-9f2f-5a3fe9717c53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182351 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182545 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1528d71b-31ac-461e-b1d9-29c348f3d63e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182610 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182669 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rpqm\" (UniqueName: \"kubernetes.io/projected/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-kube-api-access-2rpqm\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182729 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182783 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182868 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182926 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.182983 4707 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.303783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns"] Jan 21 16:11:38 crc kubenswrapper[4707]: E0121 16:11:38.304196 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" podUID="6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.319104 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-cff485676-6h5mm"] Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.320318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.333517 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-cff485676-6h5mm"] Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.355961 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.225:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-combined-ca-bundle\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0812fb22-492c-4b73-9d75-3cfaf898c043-logs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data-custom\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-public-tls-certs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-internal-tls-certs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.386725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7xs\" (UniqueName: \"kubernetes.io/projected/0812fb22-492c-4b73-9d75-3cfaf898c043-kube-api-access-2d7xs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.458574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-combined-ca-bundle\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0812fb22-492c-4b73-9d75-3cfaf898c043-logs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data-custom\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488504 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-public-tls-certs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-internal-tls-certs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data\") pod \"barbican-api-cff485676-6h5mm\" (UID: 
\"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.488575 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7xs\" (UniqueName: \"kubernetes.io/projected/0812fb22-492c-4b73-9d75-3cfaf898c043-kube-api-access-2d7xs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.489440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0812fb22-492c-4b73-9d75-3cfaf898c043-logs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.493413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data-custom\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.494706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.495055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-internal-tls-certs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.498141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-combined-ca-bundle\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.498186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-public-tls-certs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.504000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7xs\" (UniqueName: \"kubernetes.io/projected/0812fb22-492c-4b73-9d75-3cfaf898c043-kube-api-access-2d7xs\") pod \"barbican-api-cff485676-6h5mm\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.632958 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.942403 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.942424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.951778 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.977064 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:38 crc kubenswrapper[4707]: I0121 16:11:38.982075 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.004441 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.005844 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.011411 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.011627 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.011788 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.022225 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.033775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-cff485676-6h5mm"] Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.102787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blwt9\" (UniqueName: \"kubernetes.io/projected/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-kube-api-access-blwt9\") pod \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.102889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-logs\") pod \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.102923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-combined-ca-bundle\") pod \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.102976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") pod \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\" 
(UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data-custom\") pod \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data\") pod \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\" (UID: \"6227e663-2e19-4ddc-b8cb-ce6e35eb67c2\") " Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data-custom\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t9pp\" (UniqueName: \"kubernetes.io/projected/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-kube-api-access-8t9pp\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-logs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.103480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.104081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-logs" (OuterVolumeSpecName: "logs") pod "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.104217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-public-tls-certs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.104370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-scripts\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.104793 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.104903 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.105473 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.105495 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6195d18-9ddf-4470-9f2f-5a3fe9717c53-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.107118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data" (OuterVolumeSpecName: "config-data") pod "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.107177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.107345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.107397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-kube-api-access-blwt9" (OuterVolumeSpecName: "kube-api-access-blwt9") pod "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2"). InnerVolumeSpecName "kube-api-access-blwt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.107433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" (UID: "6227e663-2e19-4ddc-b8cb-ce6e35eb67c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.190559 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1528d71b-31ac-461e-b1d9-29c348f3d63e" path="/var/lib/kubelet/pods/1528d71b-31ac-461e-b1d9-29c348f3d63e/volumes" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.191056 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6195d18-9ddf-4470-9f2f-5a3fe9717c53" path="/var/lib/kubelet/pods/d6195d18-9ddf-4470-9f2f-5a3fe9717c53/volumes" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-logs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-public-tls-certs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-scripts\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data-custom\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.206939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t9pp\" (UniqueName: \"kubernetes.io/projected/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-kube-api-access-8t9pp\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207002 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blwt9\" (UniqueName: \"kubernetes.io/projected/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-kube-api-access-blwt9\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207015 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207023 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207032 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207041 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-logs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.207169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.209988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.210489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-scripts\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.210894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-public-tls-certs\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.211263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data-custom\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.211992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.225096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.226409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t9pp\" (UniqueName: \"kubernetes.io/projected/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-kube-api-access-8t9pp\") pod \"cinder-api-0\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.352364 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.718336 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:39 crc kubenswrapper[4707]: W0121 16:11:39.720421 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58ae19a2_5a2f_40e1_8d21_f226cfcbdc02.slice/crio-27b8f6482505a4faf2da46dd71e8d3500ebefb78b8ce9274845d600c046ad7a8 WatchSource:0}: Error finding container 27b8f6482505a4faf2da46dd71e8d3500ebefb78b8ce9274845d600c046ad7a8: Status 404 returned error can't find the container with id 27b8f6482505a4faf2da46dd71e8d3500ebefb78b8ce9274845d600c046ad7a8 Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.923618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.924186 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.927353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.953513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02","Type":"ContainerStarted","Data":"27b8f6482505a4faf2da46dd71e8d3500ebefb78b8ce9274845d600c046ad7a8"} Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.956085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" event={"ID":"0812fb22-492c-4b73-9d75-3cfaf898c043","Type":"ContainerStarted","Data":"059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d"} Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.956119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" event={"ID":"0812fb22-492c-4b73-9d75-3cfaf898c043","Type":"ContainerStarted","Data":"67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c"} Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.956129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" event={"ID":"0812fb22-492c-4b73-9d75-3cfaf898c043","Type":"ContainerStarted","Data":"e373c970f1ba6bf80bff3e56699ad92d9b88c2971726527d8812c4b7e3323a0d"} Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.956194 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.956317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.956342 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.962896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:11:39 crc kubenswrapper[4707]: I0121 16:11:39.975653 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" podStartSLOduration=1.975639079 podStartE2EDuration="1.975639079s" podCreationTimestamp="2026-01-21 16:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:39.966633866 +0000 UTC m=+4197.148150088" watchObservedRunningTime="2026-01-21 16:11:39.975639079 +0000 UTC m=+4197.157155301" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.009146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns"] Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.019868 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-fb8df9d8-8g2ns"] Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.126298 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.302573 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.535195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.535376 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.535385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.536015 4707 scope.go:117] "RemoveContainer" containerID="1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d" Jan 21 16:11:40 crc kubenswrapper[4707]: E0121 16:11:40.536231 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(647f33a8-e7c7-4941-9c0d-f9c5e4b07409)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.640699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"6a3ae0f0-4069-4bac-8927-e0d9dce26caf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:11:40 crc kubenswrapper[4707]: E0121 16:11:40.641491 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: object "openstack-kuttl-tests"/"cert-glance-default-public-svc" not registered Jan 21 16:11:40 crc kubenswrapper[4707]: E0121 16:11:40.641533 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs podName:6a3ae0f0-4069-4bac-8927-e0d9dce26caf nodeName:}" failed. No retries permitted until 2026-01-21 16:12:12.641521224 +0000 UTC m=+4229.823037446 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs") pod "glance-default-external-api-0" (UID: "6a3ae0f0-4069-4bac-8927-e0d9dce26caf") : object "openstack-kuttl-tests"/"cert-glance-default-public-svc" not registered Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.964718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02","Type":"ContainerStarted","Data":"fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850"} Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.964760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02","Type":"ContainerStarted","Data":"78fc3c225bc4604ce6cf99b5a6bdb1580e0cfe49cd998dab11af86c1c5998a54"} Jan 21 16:11:40 crc kubenswrapper[4707]: I0121 16:11:40.979591 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.979574023 podStartE2EDuration="2.979574023s" podCreationTimestamp="2026-01-21 16:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:40.979385027 +0000 UTC m=+4198.160901259" watchObservedRunningTime="2026-01-21 16:11:40.979574023 +0000 UTC m=+4198.161090245" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.191478 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6227e663-2e19-4ddc-b8cb-ce6e35eb67c2" path="/var/lib/kubelet/pods/6227e663-2e19-4ddc-b8cb-ce6e35eb67c2/volumes" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.344905 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.228:9292/healthcheck\": read tcp 10.217.0.2:59940->10.217.1.228:9292: read: connection reset by peer" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.733526 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.860802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.860901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7cr4\" (UniqueName: \"kubernetes.io/projected/45335228-5b76-4f29-a370-98578b0dfd62-kube-api-access-m7cr4\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.860964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-config-data\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.860998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-combined-ca-bundle\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.861061 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-logs\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.861076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.861096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-scripts\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.861112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-httpd-run\") pod \"45335228-5b76-4f29-a370-98578b0dfd62\" (UID: \"45335228-5b76-4f29-a370-98578b0dfd62\") " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.862257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-logs" (OuterVolumeSpecName: "logs") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.862401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.865612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.865857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45335228-5b76-4f29-a370-98578b0dfd62-kube-api-access-m7cr4" (OuterVolumeSpecName: "kube-api-access-m7cr4") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "kube-api-access-m7cr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.865984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-scripts" (OuterVolumeSpecName: "scripts") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.885188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.899042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.899913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-config-data" (OuterVolumeSpecName: "config-data") pod "45335228-5b76-4f29-a370-98578b0dfd62" (UID: "45335228-5b76-4f29-a370-98578b0dfd62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963216 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7cr4\" (UniqueName: \"kubernetes.io/projected/45335228-5b76-4f29-a370-98578b0dfd62-kube-api-access-m7cr4\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963255 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963265 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963273 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963281 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963289 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45335228-5b76-4f29-a370-98578b0dfd62-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.963296 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45335228-5b76-4f29-a370-98578b0dfd62-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.972672 4707 generic.go:334] "Generic (PLEG): container finished" podID="45335228-5b76-4f29-a370-98578b0dfd62" containerID="39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0" exitCode=0 Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.972773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"45335228-5b76-4f29-a370-98578b0dfd62","Type":"ContainerDied","Data":"39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0"} Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.972800 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.973394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"45335228-5b76-4f29-a370-98578b0dfd62","Type":"ContainerDied","Data":"ee67adb083f54026653faa6a58edfd853f55522fe0f32b5f8210e39d69312e76"} Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.973421 4707 scope.go:117] "RemoveContainer" containerID="39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.973546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.985310 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:11:41 crc kubenswrapper[4707]: I0121 16:11:41.999520 4707 scope.go:117] "RemoveContainer" containerID="681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.013755 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.015108 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.040115 4707 scope.go:117] "RemoveContainer" containerID="39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0" Jan 21 16:11:42 crc kubenswrapper[4707]: E0121 16:11:42.040896 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0\": container with ID starting with 39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0 not found: ID does not exist" containerID="39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.040931 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0"} err="failed to get container status \"39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0\": rpc error: code = NotFound desc = could not find container \"39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0\": container with ID starting with 39cb5437ba2556d423965f639736d07fda3408a1396b868497da02932baba5b0 not found: ID does not exist" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.040950 4707 scope.go:117] "RemoveContainer" containerID="681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60" Jan 21 16:11:42 crc kubenswrapper[4707]: E0121 16:11:42.045032 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60\": container with ID starting with 681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60 not found: ID does not exist" containerID="681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.045058 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60"} err="failed to get container status \"681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60\": rpc error: code = NotFound desc = could not find container \"681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60\": container with ID starting with 681f9b60fbdc9fe32219cf42578d3a89fd20a1cd67a08b6e9fcd5fba9e4c1a60 not found: ID does not exist" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.045246 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:42 crc kubenswrapper[4707]: E0121 16:11:42.045701 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-log" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.045772 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-log" Jan 21 16:11:42 crc kubenswrapper[4707]: E0121 16:11:42.045870 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-httpd" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.045923 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-httpd" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.046214 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-httpd" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.046285 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="45335228-5b76-4f29-a370-98578b0dfd62" containerName="glance-log" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.047246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.051256 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-97m6r" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.051331 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.051422 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.052138 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.058225 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.066534 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.167855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.167891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgpxd\" (UniqueName: \"kubernetes.io/projected/5d267e3f-3947-43f5-9045-0320e5112dc1-kube-api-access-qgpxd\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.167942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.167965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.168151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.168222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.168352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.168402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.269976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.270014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgpxd\" (UniqueName: \"kubernetes.io/projected/5d267e3f-3947-43f5-9045-0320e5112dc1-kube-api-access-qgpxd\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.270059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.270081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.270207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.270873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.270909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.271202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.271199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.271308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.271458 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.273917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.274102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.274144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.274490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.285649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qgpxd\" (UniqueName: \"kubernetes.io/projected/5d267e3f-3947-43f5-9045-0320e5112dc1-kube-api-access-qgpxd\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.292034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.372076 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.747035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:42 crc kubenswrapper[4707]: W0121 16:11:42.748270 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d267e3f_3947_43f5_9045_0320e5112dc1.slice/crio-218488e53bb763f88d93b8e79f832ec4bc9708df9c43d0ed218e3b1974b42fd0 WatchSource:0}: Error finding container 218488e53bb763f88d93b8e79f832ec4bc9708df9c43d0ed218e3b1974b42fd0: Status 404 returned error can't find the container with id 218488e53bb763f88d93b8e79f832ec4bc9708df9c43d0ed218e3b1974b42fd0 Jan 21 16:11:42 crc kubenswrapper[4707]: I0121 16:11:42.997512 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"5d267e3f-3947-43f5-9045-0320e5112dc1","Type":"ContainerStarted","Data":"218488e53bb763f88d93b8e79f832ec4bc9708df9c43d0ed218e3b1974b42fd0"} Jan 21 16:11:43 crc kubenswrapper[4707]: I0121 16:11:43.198268 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45335228-5b76-4f29-a370-98578b0dfd62" path="/var/lib/kubelet/pods/45335228-5b76-4f29-a370-98578b0dfd62/volumes" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.005720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"5d267e3f-3947-43f5-9045-0320e5112dc1","Type":"ContainerStarted","Data":"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e"} Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.414849 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-ceilometer-tls-certs\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-log-httpd\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfj82\" (UniqueName: \"kubernetes.io/projected/97e2beb7-1eea-43c2-909e-732a2e8c822c-kube-api-access-kfj82\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-config-data\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-scripts\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-run-httpd\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-sg-core-conf-yaml\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531714 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-combined-ca-bundle\") pod \"97e2beb7-1eea-43c2-909e-732a2e8c822c\" (UID: \"97e2beb7-1eea-43c2-909e-732a2e8c822c\") " Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.531982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.532180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.532232 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.536392 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e2beb7-1eea-43c2-909e-732a2e8c822c-kube-api-access-kfj82" (OuterVolumeSpecName: "kube-api-access-kfj82") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "kube-api-access-kfj82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.536472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-scripts" (OuterVolumeSpecName: "scripts") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.606957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-config-data" (OuterVolumeSpecName: "config-data") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.634072 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfj82\" (UniqueName: \"kubernetes.io/projected/97e2beb7-1eea-43c2-909e-732a2e8c822c-kube-api-access-kfj82\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.634101 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.634110 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.634121 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97e2beb7-1eea-43c2-909e-732a2e8c822c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.658942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.675574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.688956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97e2beb7-1eea-43c2-909e-732a2e8c822c" (UID: "97e2beb7-1eea-43c2-909e-732a2e8c822c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.735796 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.735848 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:44 crc kubenswrapper[4707]: I0121 16:11:44.735858 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97e2beb7-1eea-43c2-909e-732a2e8c822c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.012659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"5d267e3f-3947-43f5-9045-0320e5112dc1","Type":"ContainerStarted","Data":"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202"} Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.015993 4707 generic.go:334] "Generic (PLEG): container finished" podID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerID="53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4" exitCode=0 Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.016025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerDied","Data":"53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4"} Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.016044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"97e2beb7-1eea-43c2-909e-732a2e8c822c","Type":"ContainerDied","Data":"183d24e9abd1ee2106920a99a99c89fa85eeea60076365fab7636e9243d872f7"} Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.016058 4707 scope.go:117] "RemoveContainer" containerID="579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.016057 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.030669 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.030653412 podStartE2EDuration="3.030653412s" podCreationTimestamp="2026-01-21 16:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:45.028101003 +0000 UTC m=+4202.209617225" watchObservedRunningTime="2026-01-21 16:11:45.030653412 +0000 UTC m=+4202.212169634" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.035621 4707 scope.go:117] "RemoveContainer" containerID="23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.051184 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.062688 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.071723 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.072126 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-notification-agent" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072143 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-notification-agent" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.072171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="sg-core" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072178 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="sg-core" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.072193 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="proxy-httpd" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072198 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="proxy-httpd" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.072207 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-central-agent" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072212 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-central-agent" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072348 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-central-agent" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072357 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="proxy-httpd" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.072370 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="sg-core" Jan 21 16:11:45 crc 
kubenswrapper[4707]: I0121 16:11:45.072384 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" containerName="ceilometer-notification-agent" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.073829 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.077418 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.078417 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.078534 4707 scope.go:117] "RemoveContainer" containerID="53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.078588 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.078656 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.098522 4707 scope.go:117] "RemoveContainer" containerID="6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.114640 4707 scope.go:117] "RemoveContainer" containerID="579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.115010 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53\": container with ID starting with 579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53 not found: ID does not exist" containerID="579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.115051 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53"} err="failed to get container status \"579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53\": rpc error: code = NotFound desc = could not find container \"579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53\": container with ID starting with 579d4c6f9bec1d51c7f580031b0d65d8dcd39c7ac1df960d01445f35d6f34c53 not found: ID does not exist" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.115075 4707 scope.go:117] "RemoveContainer" containerID="23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.115451 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065\": container with ID starting with 23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065 not found: ID does not exist" containerID="23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.115481 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065"} 
err="failed to get container status \"23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065\": rpc error: code = NotFound desc = could not find container \"23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065\": container with ID starting with 23308d2169ebd3fa2ad19b2073e7d3f703838213b22f0a37863f68900139f065 not found: ID does not exist" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.115504 4707 scope.go:117] "RemoveContainer" containerID="53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.115772 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4\": container with ID starting with 53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4 not found: ID does not exist" containerID="53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.115795 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4"} err="failed to get container status \"53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4\": rpc error: code = NotFound desc = could not find container \"53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4\": container with ID starting with 53a6ff31c7b776d6ff43ac290f5e02b4dc59aafcb57f7aafecbb8698c8ba7bd4 not found: ID does not exist" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.115822 4707 scope.go:117] "RemoveContainer" containerID="6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b" Jan 21 16:11:45 crc kubenswrapper[4707]: E0121 16:11:45.116063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b\": container with ID starting with 6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b not found: ID does not exist" containerID="6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.116083 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b"} err="failed to get container status \"6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b\": rpc error: code = NotFound desc = could not find container \"6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b\": container with ID starting with 6b4a347704f662cba10d8cb22e0a3f6568bbc6122e5904935f5e83b7f6f1a07b not found: ID does not exist" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.191330 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e2beb7-1eea-43c2-909e-732a2e8c822c" path="/var/lib/kubelet/pods/97e2beb7-1eea-43c2-909e-732a2e8c822c/volumes" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.243803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-run-httpd\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.243852 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-log-httpd\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.243872 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.243962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.243999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-scripts\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.244021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-config-data\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.244050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9scm\" (UniqueName: \"kubernetes.io/projected/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-kube-api-access-q9scm\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.244084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.302703 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.317144 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.345797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.345860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-scripts\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.345887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-config-data\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.345916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9scm\" (UniqueName: \"kubernetes.io/projected/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-kube-api-access-q9scm\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.345957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.346022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-run-httpd\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.346046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-log-httpd\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.346060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.346742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-run-httpd\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.347786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-log-httpd\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.747628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.748035 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-scripts\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.748093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.748180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-config-data\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.748552 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.748649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9scm\" (UniqueName: \"kubernetes.io/projected/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-kube-api-access-q9scm\") pod \"ceilometer-0\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.886914 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:11:45 crc kubenswrapper[4707]: I0121 16:11:45.989009 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.049673 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.143039 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.143819 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.150924 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.168455 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c46885fd8-bdfw6"] Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.168617 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" podUID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" containerName="keystone-api" containerID="cri-o://159bdb08582167cf1c8abdc43ff23a21e055879346a2d331601f1afb7b6503b5" gracePeriod=30 Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.177199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.228619 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6d55599ff6-55gnd"] Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.230493 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.312034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6d55599ff6-55gnd"] Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.375994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-fernet-keys\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-combined-ca-bundle\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-public-tls-certs\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-scripts\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-credential-keys\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-internal-tls-certs\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-config-data\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.376693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ql7s\" (UniqueName: \"kubernetes.io/projected/1876cc6c-ebc8-4c95-a937-27d2e01adf64-kube-api-access-8ql7s\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" 
Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.406532 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ql7s\" (UniqueName: \"kubernetes.io/projected/1876cc6c-ebc8-4c95-a937-27d2e01adf64-kube-api-access-8ql7s\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-fernet-keys\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-combined-ca-bundle\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-public-tls-certs\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-scripts\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-credential-keys\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-internal-tls-certs\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.478709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-config-data\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.485304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-internal-tls-certs\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.488030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-scripts\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.497323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-fernet-keys\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.497343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-config-data\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.499151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-credential-keys\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.501330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-combined-ca-bundle\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.501719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-public-tls-certs\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.503180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ql7s\" (UniqueName: \"kubernetes.io/projected/1876cc6c-ebc8-4c95-a937-27d2e01adf64-kube-api-access-8ql7s\") pod \"keystone-6d55599ff6-55gnd\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.604196 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.604620 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api-log" containerID="cri-o://78fc3c225bc4604ce6cf99b5a6bdb1580e0cfe49cd998dab11af86c1c5998a54" gracePeriod=30 Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.604706 4707 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" containerID="cri-o://fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850" gracePeriod=30 Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.610550 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" podUID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.243:5000/v3\": read tcp 10.217.0.2:43422->10.217.1.243:5000: read: connection reset by peer" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.611085 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.245:8776/healthcheck\": EOF" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.611088 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.245:8776/healthcheck\": EOF" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.622752 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.931346 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f795569d6-mmhxt"] Jan 21 16:11:46 crc kubenswrapper[4707]: E0121 16:11:46.953543 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" podUID="f904e1e0-53bb-45dc-90c4-df6a6783f58a" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.962131 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-8657895bb4-ls6tf"] Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.964087 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:46 crc kubenswrapper[4707]: I0121 16:11:46.992871 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8657895bb4-ls6tf"] Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.044646 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6d55599ff6-55gnd"] Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.046317 4707 generic.go:334] "Generic (PLEG): container finished" podID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerID="78fc3c225bc4604ce6cf99b5a6bdb1580e0cfe49cd998dab11af86c1c5998a54" exitCode=143 Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.046375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02","Type":"ContainerDied","Data":"78fc3c225bc4604ce6cf99b5a6bdb1580e0cfe49cd998dab11af86c1c5998a54"} Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.048468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerStarted","Data":"2921dc59f4d65fee2b79c41825972f23829cc2d8c33303ffcf285877197e5610"} Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.049920 4707 generic.go:334] "Generic (PLEG): container finished" podID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" containerID="159bdb08582167cf1c8abdc43ff23a21e055879346a2d331601f1afb7b6503b5" exitCode=0 Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.050996 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.051049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" event={"ID":"61a39f5f-0c54-4def-ab69-4c0d7ef44e57","Type":"ContainerDied","Data":"159bdb08582167cf1c8abdc43ff23a21e055879346a2d331601f1afb7b6503b5"} Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.051084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" event={"ID":"61a39f5f-0c54-4def-ab69-4c0d7ef44e57","Type":"ContainerDied","Data":"e3112b3473d85e54b9df13f053a62a9760a93292d4632f67ee945c4122924e05"} Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.051094 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3112b3473d85e54b9df13f053a62a9760a93292d4632f67ee945c4122924e05" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.051112 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.056105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvdv\" (UniqueName: \"kubernetes.io/projected/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-kube-api-access-hfvdv\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-internal-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-config\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-ovndb-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-public-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-httpd-config\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.098416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-combined-ca-bundle\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.115499 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.127773 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvdv\" (UniqueName: \"kubernetes.io/projected/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-kube-api-access-hfvdv\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-internal-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-config\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-ovndb-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-public-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-httpd-config\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.200435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-combined-ca-bundle\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.204886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-httpd-config\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.205205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-combined-ca-bundle\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " 
pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.205339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-internal-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.206244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-ovndb-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.206382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-config\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.206591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-public-tls-certs\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.215475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvdv\" (UniqueName: \"kubernetes.io/projected/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-kube-api-access-hfvdv\") pod \"neutron-8657895bb4-ls6tf\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") pod \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-credential-keys\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-combined-ca-bundle\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") pod \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-config-data\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-public-tls-certs\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-httpd-config\") pod \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-fernet-keys\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.301975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-scripts\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.302048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-internal-tls-certs\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.302071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-config\") pod \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.302107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tf5n\" (UniqueName: \"kubernetes.io/projected/f904e1e0-53bb-45dc-90c4-df6a6783f58a-kube-api-access-4tf5n\") pod \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.302131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhhx\" (UniqueName: \"kubernetes.io/projected/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-kube-api-access-tjhhx\") pod \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\" (UID: \"61a39f5f-0c54-4def-ab69-4c0d7ef44e57\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.302206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-combined-ca-bundle\") pod \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\" (UID: \"f904e1e0-53bb-45dc-90c4-df6a6783f58a\") " Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.306952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f904e1e0-53bb-45dc-90c4-df6a6783f58a" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.307270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.307291 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f904e1e0-53bb-45dc-90c4-df6a6783f58a" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.307586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f904e1e0-53bb-45dc-90c4-df6a6783f58a" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.312024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-config" (OuterVolumeSpecName: "config") pod "f904e1e0-53bb-45dc-90c4-df6a6783f58a" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.312027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.312738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-scripts" (OuterVolumeSpecName: "scripts") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.312752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-kube-api-access-tjhhx" (OuterVolumeSpecName: "kube-api-access-tjhhx") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "kube-api-access-tjhhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.313770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f904e1e0-53bb-45dc-90c4-df6a6783f58a-kube-api-access-4tf5n" (OuterVolumeSpecName: "kube-api-access-4tf5n") pod "f904e1e0-53bb-45dc-90c4-df6a6783f58a" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a"). InnerVolumeSpecName "kube-api-access-4tf5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.313912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f904e1e0-53bb-45dc-90c4-df6a6783f58a" (UID: "f904e1e0-53bb-45dc-90c4-df6a6783f58a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.333081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-config-data" (OuterVolumeSpecName: "config-data") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.338338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.353309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.358372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "61a39f5f-0c54-4def-ab69-4c0d7ef44e57" (UID: "61a39f5f-0c54-4def-ab69-4c0d7ef44e57"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.405011 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.405323 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.405418 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.405496 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.405570 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tf5n\" (UniqueName: \"kubernetes.io/projected/f904e1e0-53bb-45dc-90c4-df6a6783f58a-kube-api-access-4tf5n\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.407258 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhhx\" (UniqueName: \"kubernetes.io/projected/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-kube-api-access-tjhhx\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.407334 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.410210 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.410318 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.410374 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.410450 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.410573 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.410660 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a39f5f-0c54-4def-ab69-4c0d7ef44e57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc 
kubenswrapper[4707]: I0121 16:11:47.410732 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.426888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:47 crc kubenswrapper[4707]: I0121 16:11:47.842825 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8657895bb4-ls6tf"] Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.065382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" event={"ID":"1876cc6c-ebc8-4c95-a937-27d2e01adf64","Type":"ContainerStarted","Data":"a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63"} Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.065652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" event={"ID":"1876cc6c-ebc8-4c95-a937-27d2e01adf64","Type":"ContainerStarted","Data":"56120464f973bba17190ae40e00a75bd42ca12f05b0c0ffccbe2dcdb9c13aa9d"} Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.065775 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.070733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" event={"ID":"de7fba6f-a3f2-4a62-a094-056e0cccd7fa","Type":"ContainerStarted","Data":"0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92"} Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.070762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" event={"ID":"de7fba6f-a3f2-4a62-a094-056e0cccd7fa","Type":"ContainerStarted","Data":"5bdc43924b302fde29989f06c46bbbed65e642b144e77c412f0c86831402ab88"} Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.072097 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c46885fd8-bdfw6" Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.079733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerStarted","Data":"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c"} Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.079755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerStarted","Data":"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3"} Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.079786 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6f795569d6-mmhxt" Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.092878 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" podStartSLOduration=2.092863964 podStartE2EDuration="2.092863964s" podCreationTimestamp="2026-01-21 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:48.084294982 +0000 UTC m=+4205.265811204" watchObservedRunningTime="2026-01-21 16:11:48.092863964 +0000 UTC m=+4205.274380186" Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.119089 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c46885fd8-bdfw6"] Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.134515 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5c46885fd8-bdfw6"] Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.168370 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f795569d6-mmhxt"] Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.174174 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6f795569d6-mmhxt"] Jan 21 16:11:48 crc kubenswrapper[4707]: I0121 16:11:48.231536 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f904e1e0-53bb-45dc-90c4-df6a6783f58a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.080257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" event={"ID":"de7fba6f-a3f2-4a62-a094-056e0cccd7fa","Type":"ContainerStarted","Data":"f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50"} Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.080561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.083439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerStarted","Data":"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459"} Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.102604 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" podStartSLOduration=3.10259084 podStartE2EDuration="3.10259084s" podCreationTimestamp="2026-01-21 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:49.098076762 +0000 UTC m=+4206.279592984" watchObservedRunningTime="2026-01-21 16:11:49.10259084 +0000 UTC m=+4206.284107051" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.145240 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-56454c4d6b-f6jz7"] Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.145546 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-log" containerID="cri-o://418de46e8ddd98abdd313f3caa5188dc17cb30561b35253f50ddd8686752659b" gracePeriod=30 
Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.145659 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-api" containerID="cri-o://c4c68a89695bf606a0a4cc6f42584ecaa9e2bf69fe90294a201d349ae0a85e8f" gracePeriod=30 Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.175017 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-644c6c574-8cgjw"] Jan 21 16:11:49 crc kubenswrapper[4707]: E0121 16:11:49.176033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" containerName="keystone-api" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.176059 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" containerName="keystone-api" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.176261 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" containerName="keystone-api" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.178153 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.183549 4707 scope.go:117] "RemoveContainer" containerID="c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be" Jan 21 16:11:49 crc kubenswrapper[4707]: E0121 16:11:49.183755 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.210901 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a39f5f-0c54-4def-ab69-4c0d7ef44e57" path="/var/lib/kubelet/pods/61a39f5f-0c54-4def-ab69-4c0d7ef44e57/volumes" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.212073 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f904e1e0-53bb-45dc-90c4-df6a6783f58a" path="/var/lib/kubelet/pods/f904e1e0-53bb-45dc-90c4-df6a6783f58a/volumes" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.225307 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-644c6c574-8cgjw"] Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.252459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkn5h\" (UniqueName: \"kubernetes.io/projected/ab3292fc-fbc8-481c-803a-3e32ff576d6d-kube-api-access-vkn5h\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.252514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-public-tls-certs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 
16:11:49.252534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3292fc-fbc8-481c-803a-3e32ff576d6d-logs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.252604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-config-data\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.252675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-combined-ca-bundle\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.252708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-internal-tls-certs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.252803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-scripts\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.353774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkn5h\" (UniqueName: \"kubernetes.io/projected/ab3292fc-fbc8-481c-803a-3e32ff576d6d-kube-api-access-vkn5h\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.353991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3292fc-fbc8-481c-803a-3e32ff576d6d-logs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.354084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-public-tls-certs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.354247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-config-data\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" 
Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.354470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3292fc-fbc8-481c-803a-3e32ff576d6d-logs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.354841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-combined-ca-bundle\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.355281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-internal-tls-certs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.355453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-scripts\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.359946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-public-tls-certs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.360451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-config-data\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.360964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-combined-ca-bundle\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.361779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-scripts\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.372478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-internal-tls-certs\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.372907 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkn5h\" (UniqueName: \"kubernetes.io/projected/ab3292fc-fbc8-481c-803a-3e32ff576d6d-kube-api-access-vkn5h\") pod \"placement-644c6c574-8cgjw\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.478987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.479216 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-log" containerID="cri-o://11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e" gracePeriod=30 Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.479297 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-httpd" containerID="cri-o://a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202" gracePeriod=30 Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.512247 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.562043 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.241:8778/\": read tcp 10.217.0.2:34802->10.217.1.241:8778: read: connection reset by peer" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.562218 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.241:8778/\": read tcp 10.217.0.2:34818->10.217.1.241:8778: read: connection reset by peer" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.945322 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.949214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:11:49 crc kubenswrapper[4707]: I0121 16:11:49.970763 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-644c6c574-8cgjw"] Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.038782 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj"] Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.039398 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api-log" containerID="cri-o://4415cfbe061679e58ce159e4806960e2e4da012d85c65836e8d01c2df9bdd7cc" gracePeriod=30 Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.039453 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" 
podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api" containerID="cri-o://fd8230b429f5a36de29bc3f49e9e63eee6ddbd339099d3d326be9dd24261d9a6" gracePeriod=30 Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.177864 4707 generic.go:334] "Generic (PLEG): container finished" podID="72880cef-939a-498b-bdea-2be7388c8037" containerID="c4c68a89695bf606a0a4cc6f42584ecaa9e2bf69fe90294a201d349ae0a85e8f" exitCode=0 Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.177896 4707 generic.go:334] "Generic (PLEG): container finished" podID="72880cef-939a-498b-bdea-2be7388c8037" containerID="418de46e8ddd98abdd313f3caa5188dc17cb30561b35253f50ddd8686752659b" exitCode=143 Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.177932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" event={"ID":"72880cef-939a-498b-bdea-2be7388c8037","Type":"ContainerDied","Data":"c4c68a89695bf606a0a4cc6f42584ecaa9e2bf69fe90294a201d349ae0a85e8f"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.177957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" event={"ID":"72880cef-939a-498b-bdea-2be7388c8037","Type":"ContainerDied","Data":"418de46e8ddd98abdd313f3caa5188dc17cb30561b35253f50ddd8686752659b"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.177965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" event={"ID":"72880cef-939a-498b-bdea-2be7388c8037","Type":"ContainerDied","Data":"84f5ba8378aab57b6bc81b1d4b3dc82d0e8866408568e564472b4ad1a4bcc062"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.177974 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84f5ba8378aab57b6bc81b1d4b3dc82d0e8866408568e564472b4ad1a4bcc062" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.178088 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.183447 4707 scope.go:117] "RemoveContainer" containerID="377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.184988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" event={"ID":"ab3292fc-fbc8-481c-803a-3e32ff576d6d","Type":"ContainerStarted","Data":"937b13686bf85801046f72c011a90669b57937a4d5c6fa09f6ad12296a5683dd"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.196965 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.244142 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerID="a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202" exitCode=0 Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.244180 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerID="11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e" exitCode=143 Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.245071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"5d267e3f-3947-43f5-9045-0320e5112dc1","Type":"ContainerDied","Data":"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.245098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"5d267e3f-3947-43f5-9045-0320e5112dc1","Type":"ContainerDied","Data":"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.245124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"5d267e3f-3947-43f5-9045-0320e5112dc1","Type":"ContainerDied","Data":"218488e53bb763f88d93b8e79f832ec4bc9708df9c43d0ed218e3b1974b42fd0"} Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.245139 4707 scope.go:117] "RemoveContainer" containerID="a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277376 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-httpd-run\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-internal-tls-certs\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72880cef-939a-498b-bdea-2be7388c8037-logs\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np4hx\" (UniqueName: \"kubernetes.io/projected/72880cef-939a-498b-bdea-2be7388c8037-kube-api-access-np4hx\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-combined-ca-bundle\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc 
kubenswrapper[4707]: I0121 16:11:50.277558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-internal-tls-certs\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-logs\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgpxd\" (UniqueName: \"kubernetes.io/projected/5d267e3f-3947-43f5-9045-0320e5112dc1-kube-api-access-qgpxd\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-scripts\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-scripts\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-config-data\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-public-tls-certs\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-config-data\") pod \"72880cef-939a-498b-bdea-2be7388c8037\" (UID: \"72880cef-939a-498b-bdea-2be7388c8037\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.277898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-combined-ca-bundle\") pod \"5d267e3f-3947-43f5-9045-0320e5112dc1\" (UID: \"5d267e3f-3947-43f5-9045-0320e5112dc1\") " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.279021 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/72880cef-939a-498b-bdea-2be7388c8037-logs" (OuterVolumeSpecName: "logs") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.279340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.286455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-logs" (OuterVolumeSpecName: "logs") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.307349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-scripts" (OuterVolumeSpecName: "scripts") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.337962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.340925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-scripts" (OuterVolumeSpecName: "scripts") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.341022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d267e3f-3947-43f5-9045-0320e5112dc1-kube-api-access-qgpxd" (OuterVolumeSpecName: "kube-api-access-qgpxd") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "kube-api-access-qgpxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381001 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381028 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381054 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381064 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381073 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72880cef-939a-498b-bdea-2be7388c8037-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381080 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d267e3f-3947-43f5-9045-0320e5112dc1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.381089 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgpxd\" (UniqueName: \"kubernetes.io/projected/5d267e3f-3947-43f5-9045-0320e5112dc1-kube-api-access-qgpxd\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.383535 4707 scope.go:117] "RemoveContainer" containerID="11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.384519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72880cef-939a-498b-bdea-2be7388c8037-kube-api-access-np4hx" (OuterVolumeSpecName: "kube-api-access-np4hx") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "kube-api-access-np4hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.429212 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.437958 4707 scope.go:117] "RemoveContainer" containerID="a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202" Jan 21 16:11:50 crc kubenswrapper[4707]: E0121 16:11:50.438655 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202\": container with ID starting with a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202 not found: ID does not exist" containerID="a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.438861 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202"} err="failed to get container status \"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202\": rpc error: code = NotFound desc = could not find container \"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202\": container with ID starting with a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202 not found: ID does not exist" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.438900 4707 scope.go:117] "RemoveContainer" containerID="11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e" Jan 21 16:11:50 crc kubenswrapper[4707]: E0121 16:11:50.442108 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e\": container with ID starting with 11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e not found: ID does not exist" containerID="11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.442179 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e"} err="failed to get container status \"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e\": rpc error: code = NotFound desc = could not find container \"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e\": container with ID starting with 11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e not found: ID does not exist" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.442203 4707 scope.go:117] "RemoveContainer" containerID="a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.442549 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202"} err="failed to get container status \"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202\": rpc error: code = NotFound desc = could not find container \"a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202\": container with ID starting with a4a3cb8b3a76f26b687c19ecc2cb01a419141be8694c2cddec0fdecb1e279202 not found: ID does not exist" Jan 21 16:11:50 crc 
kubenswrapper[4707]: I0121 16:11:50.442570 4707 scope.go:117] "RemoveContainer" containerID="11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.444575 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e"} err="failed to get container status \"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e\": rpc error: code = NotFound desc = could not find container \"11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e\": container with ID starting with 11417f156a25a021ca5c987b46c1f6f3efbb3d09899027d9b045f9932ac6362e not found: ID does not exist" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.484066 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.484095 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np4hx\" (UniqueName: \"kubernetes.io/projected/72880cef-939a-498b-bdea-2be7388c8037-kube-api-access-np4hx\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.507613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-config-data" (OuterVolumeSpecName: "config-data") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.585271 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.633366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.653195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.673917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-config-data" (OuterVolumeSpecName: "config-data") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.684563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d267e3f-3947-43f5-9045-0320e5112dc1" (UID: "5d267e3f-3947-43f5-9045-0320e5112dc1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.688909 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.689096 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.689118 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.689129 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d267e3f-3947-43f5-9045-0320e5112dc1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.728190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.760908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "72880cef-939a-498b-bdea-2be7388c8037" (UID: "72880cef-939a-498b-bdea-2be7388c8037"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.791749 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:50 crc kubenswrapper[4707]: I0121 16:11:50.791783 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72880cef-939a-498b-bdea-2be7388c8037-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.015251 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.245:8776/healthcheck\": read tcp 10.217.0.2:35378->10.217.1.245:8776: read: connection reset by peer" Jan 21 16:11:51 crc kubenswrapper[4707]: E0121 16:11:51.195580 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58ae19a2_5a2f_40e1_8d21_f226cfcbdc02.slice/crio-conmon-fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58ae19a2_5a2f_40e1_8d21_f226cfcbdc02.slice/crio-fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.256715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerStarted","Data":"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83"} Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.257891 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.262426 4707 generic.go:334] "Generic (PLEG): container finished" podID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerID="4415cfbe061679e58ce159e4806960e2e4da012d85c65836e8d01c2df9bdd7cc" exitCode=143 Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.262531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" event={"ID":"6794a1f0-de70-4b87-8f05-1f6c657b3d5e","Type":"ContainerDied","Data":"4415cfbe061679e58ce159e4806960e2e4da012d85c65836e8d01c2df9bdd7cc"} Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.264936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerStarted","Data":"0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd"} Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.266353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.270642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" event={"ID":"ab3292fc-fbc8-481c-803a-3e32ff576d6d","Type":"ContainerStarted","Data":"142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb"} Jan 21 
16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.270682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" event={"ID":"ab3292fc-fbc8-481c-803a-3e32ff576d6d","Type":"ContainerStarted","Data":"8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e"} Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.271150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.271225 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.288681 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.296870 4707 generic.go:334] "Generic (PLEG): container finished" podID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerID="fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850" exitCode=0 Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.296976 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-56454c4d6b-f6jz7" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.297462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02","Type":"ContainerDied","Data":"fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850"} Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.302597 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.436421628 podStartE2EDuration="6.302509067s" podCreationTimestamp="2026-01-21 16:11:45 +0000 UTC" firstStartedPulling="2026-01-21 16:11:46.421828964 +0000 UTC m=+4203.603345186" lastFinishedPulling="2026-01-21 16:11:50.287916403 +0000 UTC m=+4207.469432625" observedRunningTime="2026-01-21 16:11:51.288014753 +0000 UTC m=+4208.469530975" watchObservedRunningTime="2026-01-21 16:11:51.302509067 +0000 UTC m=+4208.484025278" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.362656 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" podStartSLOduration=2.362632456 podStartE2EDuration="2.362632456s" podCreationTimestamp="2026-01-21 16:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:51.317170756 +0000 UTC m=+4208.498686978" watchObservedRunningTime="2026-01-21 16:11:51.362632456 +0000 UTC m=+4208.544148678" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.404867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.411376 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.417235 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-56454c4d6b-f6jz7"] Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.423539 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:51 crc kubenswrapper[4707]: E0121 16:11:51.424115 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-httpd" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424136 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-httpd" Jan 21 16:11:51 crc kubenswrapper[4707]: E0121 16:11:51.424202 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-log" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424210 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-log" Jan 21 16:11:51 crc kubenswrapper[4707]: E0121 16:11:51.424221 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-log" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-log" Jan 21 16:11:51 crc kubenswrapper[4707]: E0121 16:11:51.424237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-api" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424243 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-api" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424429 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-log" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424446 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="72880cef-939a-498b-bdea-2be7388c8037" containerName="placement-api" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424460 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-log" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.424470 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" containerName="glance-httpd" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.425697 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.428068 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.428485 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-97m6r" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.428829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.430608 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-56454c4d6b-f6jz7"] Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.434886 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.435470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.514625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.515188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.515309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mbx\" (UniqueName: \"kubernetes.io/projected/804ec34d-12d2-48ee-9559-ca08aca94c3d-kube-api-access-b4mbx\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.515416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-logs\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.515520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.515644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.517055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.517190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.533024 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.618735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.618902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-public-tls-certs\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.618981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-scripts\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data-custom\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t9pp\" (UniqueName: \"kubernetes.io/projected/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-kube-api-access-8t9pp\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-combined-ca-bundle\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-internal-tls-certs\") pod 
\"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-etc-machine-id\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-logs\") pod \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\" (UID: \"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02\") " Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.619994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mbx\" (UniqueName: \"kubernetes.io/projected/804ec34d-12d2-48ee-9559-ca08aca94c3d-kube-api-access-b4mbx\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-logs\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.620991 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.621241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-logs" (OuterVolumeSpecName: "logs") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.624110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.624486 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.624504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-scripts" (OuterVolumeSpecName: "scripts") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.624925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-logs\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.628484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.630925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.632823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.637469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-kube-api-access-8t9pp" (OuterVolumeSpecName: "kube-api-access-8t9pp") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "kube-api-access-8t9pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.638289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.638700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.643318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mbx\" (UniqueName: \"kubernetes.io/projected/804ec34d-12d2-48ee-9559-ca08aca94c3d-kube-api-access-b4mbx\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.663359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.669975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data" (OuterVolumeSpecName: "config-data") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.674568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.686142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.688492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" (UID: "58ae19a2-5a2f-40e1-8d21-f226cfcbdc02"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723136 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723168 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723181 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723193 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723203 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t9pp\" (UniqueName: \"kubernetes.io/projected/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-kube-api-access-8t9pp\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723211 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723220 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.723229 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:51 crc kubenswrapper[4707]: I0121 16:11:51.826623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.236053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.304721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"804ec34d-12d2-48ee-9559-ca08aca94c3d","Type":"ContainerStarted","Data":"aee46c0b9fcf24d5f30507b57ccefd43c64c6daa63400c140278a8cd79ae7693"} Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.307061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"58ae19a2-5a2f-40e1-8d21-f226cfcbdc02","Type":"ContainerDied","Data":"27b8f6482505a4faf2da46dd71e8d3500ebefb78b8ce9274845d600c046ad7a8"} Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.307177 4707 scope.go:117] "RemoveContainer" containerID="fd7d65ab3ac1da47baee9cd1bb7a8c6e225241aecccbc966467d4b9c5d3c4850" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.307096 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.367772 4707 scope.go:117] "RemoveContainer" containerID="78fc3c225bc4604ce6cf99b5a6bdb1580e0cfe49cd998dab11af86c1c5998a54" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.372732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.421479 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.434769 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:52 crc kubenswrapper[4707]: E0121 16:11:52.435435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.435473 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" Jan 21 16:11:52 crc kubenswrapper[4707]: E0121 16:11:52.435489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api-log" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.435495 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api-log" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.435761 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api-log" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.435791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" containerName="cinder-api" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.437327 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.440137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.440317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.440430 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.443942 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca35a833-6d19-485b-8d6b-f2a4d469964e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-scripts\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542785 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dwc\" (UniqueName: \"kubernetes.io/projected/ca35a833-6d19-485b-8d6b-f2a4d469964e-kube-api-access-t5dwc\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542827 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca35a833-6d19-485b-8d6b-f2a4d469964e-logs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542891 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.542951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.645654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca35a833-6d19-485b-8d6b-f2a4d469964e-logs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.645776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.645919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca35a833-6d19-485b-8d6b-f2a4d469964e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca35a833-6d19-485b-8d6b-f2a4d469964e-logs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-scripts\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dwc\" (UniqueName: \"kubernetes.io/projected/ca35a833-6d19-485b-8d6b-f2a4d469964e-kube-api-access-t5dwc\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.646277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca35a833-6d19-485b-8d6b-f2a4d469964e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.652366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.652568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.653001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.653198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.653499 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-scripts\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.667884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.668715 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dwc\" (UniqueName: 
\"kubernetes.io/projected/ca35a833-6d19-485b-8d6b-f2a4d469964e-kube-api-access-t5dwc\") pod \"cinder-api-0\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:52 crc kubenswrapper[4707]: I0121 16:11:52.756087 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.170947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.198121 4707 scope.go:117] "RemoveContainer" containerID="1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.201240 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ae19a2-5a2f-40e1-8d21-f226cfcbdc02" path="/var/lib/kubelet/pods/58ae19a2-5a2f-40e1-8d21-f226cfcbdc02/volumes" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.201976 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d267e3f-3947-43f5-9045-0320e5112dc1" path="/var/lib/kubelet/pods/5d267e3f-3947-43f5-9045-0320e5112dc1/volumes" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.202586 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72880cef-939a-498b-bdea-2be7388c8037" path="/var/lib/kubelet/pods/72880cef-939a-498b-bdea-2be7388c8037/volumes" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.223841 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.173:9311/healthcheck\": read tcp 10.217.0.2:56506->10.217.1.173:9311: read: connection reset by peer" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.223910 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.173:9311/healthcheck\": read tcp 10.217.0.2:56490->10.217.1.173:9311: read: connection reset by peer" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.319945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"804ec34d-12d2-48ee-9559-ca08aca94c3d","Type":"ContainerStarted","Data":"7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809"} Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.323506 4707 generic.go:334] "Generic (PLEG): container finished" podID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerID="fd8230b429f5a36de29bc3f49e9e63eee6ddbd339099d3d326be9dd24261d9a6" exitCode=0 Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.323677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" event={"ID":"6794a1f0-de70-4b87-8f05-1f6c657b3d5e","Type":"ContainerDied","Data":"fd8230b429f5a36de29bc3f49e9e63eee6ddbd339099d3d326be9dd24261d9a6"} Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.324955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ca35a833-6d19-485b-8d6b-f2a4d469964e","Type":"ContainerStarted","Data":"67dc4e26125adc86fe2f2a40833479f8c399b6f348c6a1749a74a2d0a7634e05"} Jan 21 16:11:53 crc kubenswrapper[4707]: 
I0121 16:11:53.328623 4707 generic.go:334] "Generic (PLEG): container finished" podID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" exitCode=1 Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.330362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerDied","Data":"0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd"} Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.331540 4707 scope.go:117] "RemoveContainer" containerID="377c2980821b46c6c12e8389069e3d10dae1a719d2505cc988a6fd8a8c900c52" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.332475 4707 scope.go:117] "RemoveContainer" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 16:11:53 crc kubenswrapper[4707]: E0121 16:11:53.332843 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(047c06ee-71d5-4bcf-83e0-dd8aa295cdb3)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.339410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:11:53 crc kubenswrapper[4707]: I0121 16:11:53.988403 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.084753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-public-tls-certs\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.085109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-internal-tls-certs\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.085289 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc2h6\" (UniqueName: \"kubernetes.io/projected/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-kube-api-access-fc2h6\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.085420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-logs\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.085520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc 
kubenswrapper[4707]: I0121 16:11:54.085691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data-custom\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.085743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-combined-ca-bundle\") pod \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\" (UID: \"6794a1f0-de70-4b87-8f05-1f6c657b3d5e\") " Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.086426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-logs" (OuterVolumeSpecName: "logs") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.088893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-kube-api-access-fc2h6" (OuterVolumeSpecName: "kube-api-access-fc2h6") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "kube-api-access-fc2h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.089058 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc2h6\" (UniqueName: \"kubernetes.io/projected/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-kube-api-access-fc2h6\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.089077 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.090461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.139106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.142821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.142911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data" (OuterVolumeSpecName: "config-data") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.190452 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.190477 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.190487 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.190496 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.247269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6794a1f0-de70-4b87-8f05-1f6c657b3d5e" (UID: "6794a1f0-de70-4b87-8f05-1f6c657b3d5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.292018 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6794a1f0-de70-4b87-8f05-1f6c657b3d5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.355211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerStarted","Data":"7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840"} Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.359964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"804ec34d-12d2-48ee-9559-ca08aca94c3d","Type":"ContainerStarted","Data":"a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11"} Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.363256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" event={"ID":"6794a1f0-de70-4b87-8f05-1f6c657b3d5e","Type":"ContainerDied","Data":"734ecfc73708a430a5a7a653e92565be67825c819643eaeccbf2f9ba156f5079"} Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.363273 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.363323 4707 scope.go:117] "RemoveContainer" containerID="fd8230b429f5a36de29bc3f49e9e63eee6ddbd339099d3d326be9dd24261d9a6" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.364800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ca35a833-6d19-485b-8d6b-f2a4d469964e","Type":"ContainerStarted","Data":"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91"} Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.370026 4707 scope.go:117] "RemoveContainer" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 16:11:54 crc kubenswrapper[4707]: E0121 16:11:54.375382 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(047c06ee-71d5-4bcf-83e0-dd8aa295cdb3)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.395284 4707 scope.go:117] "RemoveContainer" containerID="4415cfbe061679e58ce159e4806960e2e4da012d85c65836e8d01c2df9bdd7cc" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.403890 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.403879421 podStartE2EDuration="3.403879421s" podCreationTimestamp="2026-01-21 16:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:54.398027469 +0000 UTC m=+4211.579543690" watchObservedRunningTime="2026-01-21 16:11:54.403879421 +0000 UTC m=+4211.585395643" Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.458993 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj"] Jan 21 16:11:54 crc kubenswrapper[4707]: I0121 16:11:54.470537 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-778bdc54f8-tnxrj"] Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.125014 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.125565 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="b87763ac-1165-4dec-b7c5-4397e2fcff3e" containerName="memcached" containerID="cri-o://d6fddc205c6890a454e9f163957631dcca8806c9925d38125376e97d6b85a4e2" gracePeriod=30 Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.193317 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" path="/var/lib/kubelet/pods/6794a1f0-de70-4b87-8f05-1f6c657b3d5e/volumes" Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.380506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ca35a833-6d19-485b-8d6b-f2a4d469964e","Type":"ContainerStarted","Data":"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca"} Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.380664 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.395032 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.395015872 podStartE2EDuration="3.395015872s" podCreationTimestamp="2026-01-21 16:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:55.393576324 +0000 UTC m=+4212.575092547" watchObservedRunningTime="2026-01-21 16:11:55.395015872 +0000 UTC m=+4212.576532094" Jan 21 16:11:55 crc kubenswrapper[4707]: I0121 16:11:55.534697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.185763 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.396670 4707 generic.go:334] "Generic (PLEG): container finished" podID="b87763ac-1165-4dec-b7c5-4397e2fcff3e" containerID="d6fddc205c6890a454e9f163957631dcca8806c9925d38125376e97d6b85a4e2" exitCode=0 Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.396777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"b87763ac-1165-4dec-b7c5-4397e2fcff3e","Type":"ContainerDied","Data":"d6fddc205c6890a454e9f163957631dcca8806c9925d38125376e97d6b85a4e2"} Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.400633 4707 generic.go:334] "Generic (PLEG): container finished" podID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerID="7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840" exitCode=1 Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.400791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerDied","Data":"7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840"} Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.400866 4707 scope.go:117] "RemoveContainer" containerID="1dfd5b276b63120434749292bace0314a46bb3c0904a213de5adf4eae13bbb7d" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.401593 4707 scope.go:117] "RemoveContainer" containerID="7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840" Jan 21 16:11:56 crc kubenswrapper[4707]: E0121 16:11:56.401966 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(647f33a8-e7c7-4941-9c0d-f9c5e4b07409)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.518686 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.665450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vcpt\" (UniqueName: \"kubernetes.io/projected/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kube-api-access-5vcpt\") pod \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.665524 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-memcached-tls-certs\") pod \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.665679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kolla-config\") pod \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.665707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-combined-ca-bundle\") pod \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.665822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-config-data\") pod \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\" (UID: \"b87763ac-1165-4dec-b7c5-4397e2fcff3e\") " Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.666358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b87763ac-1165-4dec-b7c5-4397e2fcff3e" (UID: "b87763ac-1165-4dec-b7c5-4397e2fcff3e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.666457 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.666874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-config-data" (OuterVolumeSpecName: "config-data") pod "b87763ac-1165-4dec-b7c5-4397e2fcff3e" (UID: "b87763ac-1165-4dec-b7c5-4397e2fcff3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.686297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kube-api-access-5vcpt" (OuterVolumeSpecName: "kube-api-access-5vcpt") pod "b87763ac-1165-4dec-b7c5-4397e2fcff3e" (UID: "b87763ac-1165-4dec-b7c5-4397e2fcff3e"). InnerVolumeSpecName "kube-api-access-5vcpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.689143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b87763ac-1165-4dec-b7c5-4397e2fcff3e" (UID: "b87763ac-1165-4dec-b7c5-4397e2fcff3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.705548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "b87763ac-1165-4dec-b7c5-4397e2fcff3e" (UID: "b87763ac-1165-4dec-b7c5-4397e2fcff3e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.769862 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b87763ac-1165-4dec-b7c5-4397e2fcff3e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.769904 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vcpt\" (UniqueName: \"kubernetes.io/projected/b87763ac-1165-4dec-b7c5-4397e2fcff3e-kube-api-access-5vcpt\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.769924 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:56 crc kubenswrapper[4707]: I0121 16:11:56.769939 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87763ac-1165-4dec-b7c5-4397e2fcff3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.409453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"b87763ac-1165-4dec-b7c5-4397e2fcff3e","Type":"ContainerDied","Data":"2f29bc765f434e5f9f65ed88dfb837ea5e2a564ae01b9f92044f548c5ebe9b59"} Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.409513 4707 scope.go:117] "RemoveContainer" containerID="d6fddc205c6890a454e9f163957631dcca8806c9925d38125376e97d6b85a4e2" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.409564 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.411332 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api-log" containerID="cri-o://7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91" gracePeriod=30 Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.411396 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api" containerID="cri-o://9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca" gracePeriod=30 Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.436017 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.453130 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.462077 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:57 crc kubenswrapper[4707]: E0121 16:11:57.462593 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87763ac-1165-4dec-b7c5-4397e2fcff3e" containerName="memcached" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.462619 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87763ac-1165-4dec-b7c5-4397e2fcff3e" containerName="memcached" Jan 21 16:11:57 crc kubenswrapper[4707]: E0121 16:11:57.462663 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api-log" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.462671 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api-log" Jan 21 16:11:57 crc kubenswrapper[4707]: E0121 16:11:57.462685 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.462691 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.463580 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api-log" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.463629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87763ac-1165-4dec-b7c5-4397e2fcff3e" containerName="memcached" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.463639 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6794a1f0-de70-4b87-8f05-1f6c657b3d5e" containerName="barbican-api" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.464420 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.469824 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.470517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.470667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-vp54k" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.473113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.482110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bzj\" (UniqueName: \"kubernetes.io/projected/deb46fb3-5bea-4b35-9e6c-6816415058af-kube-api-access-z2bzj\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.482210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-config-data\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.482238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-kolla-config\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.482272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.482313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.583567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-config-data\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.583920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-kolla-config\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.583968 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.584017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.584064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bzj\" (UniqueName: \"kubernetes.io/projected/deb46fb3-5bea-4b35-9e6c-6816415058af-kube-api-access-z2bzj\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.584489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-config-data\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.584786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-kolla-config\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.587897 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.588571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.598914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bzj\" (UniqueName: \"kubernetes.io/projected/deb46fb3-5bea-4b35-9e6c-6816415058af-kube-api-access-z2bzj\") pod \"memcached-0\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.780915 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.877921 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.989663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.989851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca35a833-6d19-485b-8d6b-f2a4d469964e-logs\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-combined-ca-bundle\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-internal-tls-certs\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-scripts\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca35a833-6d19-485b-8d6b-f2a4d469964e-etc-machine-id\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca35a833-6d19-485b-8d6b-f2a4d469964e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data-custom\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-public-tls-certs\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5dwc\" (UniqueName: \"kubernetes.io/projected/ca35a833-6d19-485b-8d6b-f2a4d469964e-kube-api-access-t5dwc\") pod \"ca35a833-6d19-485b-8d6b-f2a4d469964e\" (UID: \"ca35a833-6d19-485b-8d6b-f2a4d469964e\") " Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.990730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca35a833-6d19-485b-8d6b-f2a4d469964e-logs" (OuterVolumeSpecName: "logs") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.991687 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca35a833-6d19-485b-8d6b-f2a4d469964e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.991717 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca35a833-6d19-485b-8d6b-f2a4d469964e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.995292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-scripts" (OuterVolumeSpecName: "scripts") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.995606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca35a833-6d19-485b-8d6b-f2a4d469964e-kube-api-access-t5dwc" (OuterVolumeSpecName: "kube-api-access-t5dwc") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "kube-api-access-t5dwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:57 crc kubenswrapper[4707]: I0121 16:11:57.995660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.012151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.027093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data" (OuterVolumeSpecName: "config-data") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.029197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.030996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca35a833-6d19-485b-8d6b-f2a4d469964e" (UID: "ca35a833-6d19-485b-8d6b-f2a4d469964e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092441 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092473 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092483 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092493 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092501 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5dwc\" (UniqueName: \"kubernetes.io/projected/ca35a833-6d19-485b-8d6b-f2a4d469964e-kube-api-access-t5dwc\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.092522 4707 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca35a833-6d19-485b-8d6b-f2a4d469964e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.187424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:11:58 crc kubenswrapper[4707]: W0121 16:11:58.190829 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb46fb3_5bea_4b35_9e6c_6816415058af.slice/crio-42e1f70bcbe8876300c9e3e1ec3625a74d1b6332a4404da58829101af41b0d68 WatchSource:0}: Error finding container 42e1f70bcbe8876300c9e3e1ec3625a74d1b6332a4404da58829101af41b0d68: Status 404 returned error can't find the container with id 42e1f70bcbe8876300c9e3e1ec3625a74d1b6332a4404da58829101af41b0d68 Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.419769 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerID="9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca" exitCode=0 Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.420028 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerID="7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91" exitCode=143 Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.419834 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.419853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ca35a833-6d19-485b-8d6b-f2a4d469964e","Type":"ContainerDied","Data":"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca"} Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.420124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ca35a833-6d19-485b-8d6b-f2a4d469964e","Type":"ContainerDied","Data":"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91"} Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.420136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ca35a833-6d19-485b-8d6b-f2a4d469964e","Type":"ContainerDied","Data":"67dc4e26125adc86fe2f2a40833479f8c399b6f348c6a1749a74a2d0a7634e05"} Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.420150 4707 scope.go:117] "RemoveContainer" containerID="9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.436230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deb46fb3-5bea-4b35-9e6c-6816415058af","Type":"ContainerStarted","Data":"9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7"} Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.436304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deb46fb3-5bea-4b35-9e6c-6816415058af","Type":"ContainerStarted","Data":"42e1f70bcbe8876300c9e3e1ec3625a74d1b6332a4404da58829101af41b0d68"} Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.436854 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.448740 4707 scope.go:117] "RemoveContainer" 
containerID="7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.468125 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.468107343 podStartE2EDuration="1.468107343s" podCreationTimestamp="2026-01-21 16:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:11:58.452751018 +0000 UTC m=+4215.634267240" watchObservedRunningTime="2026-01-21 16:11:58.468107343 +0000 UTC m=+4215.649623565" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.473593 4707 scope.go:117] "RemoveContainer" containerID="9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca" Jan 21 16:11:58 crc kubenswrapper[4707]: E0121 16:11:58.476069 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca\": container with ID starting with 9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca not found: ID does not exist" containerID="9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.476106 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca"} err="failed to get container status \"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca\": rpc error: code = NotFound desc = could not find container \"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca\": container with ID starting with 9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca not found: ID does not exist" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.476133 4707 scope.go:117] "RemoveContainer" containerID="7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91" Jan 21 16:11:58 crc kubenswrapper[4707]: E0121 16:11:58.476480 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91\": container with ID starting with 7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91 not found: ID does not exist" containerID="7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.476521 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91"} err="failed to get container status \"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91\": rpc error: code = NotFound desc = could not find container \"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91\": container with ID starting with 7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91 not found: ID does not exist" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.476539 4707 scope.go:117] "RemoveContainer" containerID="9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.476867 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca"} err="failed to get 
container status \"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca\": rpc error: code = NotFound desc = could not find container \"9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca\": container with ID starting with 9b3817cce3aa3eaaeb8e56a71123e918265c02c4e63d0c7ba0ef61975548b7ca not found: ID does not exist" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.476922 4707 scope.go:117] "RemoveContainer" containerID="7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.477359 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91"} err="failed to get container status \"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91\": rpc error: code = NotFound desc = could not find container \"7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91\": container with ID starting with 7fa3c3a4e0229bd700990f26a78799813705eae34cb9db5e5c7025a38dd41b91 not found: ID does not exist" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.483339 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.496871 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.512426 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:58 crc kubenswrapper[4707]: E0121 16:11:58.512774 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.512793 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api" Jan 21 16:11:58 crc kubenswrapper[4707]: E0121 16:11:58.512867 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api-log" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.512875 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api-log" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.513118 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.513140 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" containerName="cinder-api-log" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.514327 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.516186 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.516732 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.518499 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.521765 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/559741cf-87c9-47d4-98a3-6b02ffb4610c-logs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/559741cf-87c9-47d4-98a3-6b02ffb4610c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604914 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.604930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mswgl\" (UniqueName: \"kubernetes.io/projected/559741cf-87c9-47d4-98a3-6b02ffb4610c-kube-api-access-mswgl\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/559741cf-87c9-47d4-98a3-6b02ffb4610c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/559741cf-87c9-47d4-98a3-6b02ffb4610c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.706876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.707018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mswgl\" (UniqueName: \"kubernetes.io/projected/559741cf-87c9-47d4-98a3-6b02ffb4610c-kube-api-access-mswgl\") 
pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.707683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/559741cf-87c9-47d4-98a3-6b02ffb4610c-logs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.708156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/559741cf-87c9-47d4-98a3-6b02ffb4610c-logs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.708242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.712559 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.712929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.713086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.716265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.716561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.716827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.723966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mswgl\" (UniqueName: 
\"kubernetes.io/projected/559741cf-87c9-47d4-98a3-6b02ffb4610c-kube-api-access-mswgl\") pod \"cinder-api-0\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:58 crc kubenswrapper[4707]: I0121 16:11:58.828520 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:11:59 crc kubenswrapper[4707]: I0121 16:11:59.190995 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87763ac-1165-4dec-b7c5-4397e2fcff3e" path="/var/lib/kubelet/pods/b87763ac-1165-4dec-b7c5-4397e2fcff3e/volumes" Jan 21 16:11:59 crc kubenswrapper[4707]: I0121 16:11:59.191790 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca35a833-6d19-485b-8d6b-f2a4d469964e" path="/var/lib/kubelet/pods/ca35a833-6d19-485b-8d6b-f2a4d469964e/volumes" Jan 21 16:11:59 crc kubenswrapper[4707]: I0121 16:11:59.311124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:11:59 crc kubenswrapper[4707]: W0121 16:11:59.319769 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod559741cf_87c9_47d4_98a3_6b02ffb4610c.slice/crio-d79c34f29b08fd0ad6c8ad1d97bf1f3506743a117855e0388eb175ee1db2a8c4 WatchSource:0}: Error finding container d79c34f29b08fd0ad6c8ad1d97bf1f3506743a117855e0388eb175ee1db2a8c4: Status 404 returned error can't find the container with id d79c34f29b08fd0ad6c8ad1d97bf1f3506743a117855e0388eb175ee1db2a8c4 Jan 21 16:11:59 crc kubenswrapper[4707]: I0121 16:11:59.453248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"559741cf-87c9-47d4-98a3-6b02ffb4610c","Type":"ContainerStarted","Data":"d79c34f29b08fd0ad6c8ad1d97bf1f3506743a117855e0388eb175ee1db2a8c4"} Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.469112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"559741cf-87c9-47d4-98a3-6b02ffb4610c","Type":"ContainerStarted","Data":"b7106616e826175cc64ed9a092fca4a51264ccb076e1e9ed7b657799fb68de52"} Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.469151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"559741cf-87c9-47d4-98a3-6b02ffb4610c","Type":"ContainerStarted","Data":"30b75fdd71f74771e52987f8d2a7b8e272eabb3c7db15d9c46d8dfc4759a563c"} Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.470475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.488058 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.488043932 podStartE2EDuration="2.488043932s" podCreationTimestamp="2026-01-21 16:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:00.482503274 +0000 UTC m=+4217.664019496" watchObservedRunningTime="2026-01-21 16:12:00.488043932 +0000 UTC m=+4217.669560154" Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.535208 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.535339 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.535356 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:00 crc kubenswrapper[4707]: I0121 16:12:00.536828 4707 scope.go:117] "RemoveContainer" containerID="7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840" Jan 21 16:12:00 crc kubenswrapper[4707]: E0121 16:12:00.537430 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(647f33a8-e7c7-4941-9c0d-f9c5e4b07409)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" Jan 21 16:12:01 crc kubenswrapper[4707]: I0121 16:12:01.183370 4707 scope.go:117] "RemoveContainer" containerID="c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be" Jan 21 16:12:01 crc kubenswrapper[4707]: I0121 16:12:01.826864 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:01 crc kubenswrapper[4707]: I0121 16:12:01.826912 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:01 crc kubenswrapper[4707]: I0121 16:12:01.855521 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:01 crc kubenswrapper[4707]: I0121 16:12:01.862415 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:02 crc kubenswrapper[4707]: I0121 16:12:02.491778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerStarted","Data":"1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1"} Jan 21 16:12:02 crc kubenswrapper[4707]: I0121 16:12:02.492488 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:02 crc kubenswrapper[4707]: I0121 16:12:02.492544 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:02 crc kubenswrapper[4707]: I0121 16:12:02.583371 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.132960 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.185358 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.512609 4707 generic.go:334] "Generic (PLEG): container finished" podID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" exitCode=1 Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.512670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerDied","Data":"1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1"} Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.512723 4707 scope.go:117] "RemoveContainer" containerID="c43bdee51f4936382c063ef548a600d72b44db1be83fde94ba90c55aa3abc9be" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.513388 4707 scope.go:117] "RemoveContainer" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" Jan 21 16:12:04 crc kubenswrapper[4707]: E0121 16:12:04.513635 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.558790 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr"] Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.561189 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.586613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.589749 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr"] Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-combined-ca-bundle\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-internal-tls-certs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data-custom\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749415 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-public-tls-certs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggr5\" (UniqueName: \"kubernetes.io/projected/0ec7c890-e349-422b-bbea-c40083057a7f-kube-api-access-7ggr5\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.749459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec7c890-e349-422b-bbea-c40083057a7f-logs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.851698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-internal-tls-certs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.851761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.851795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data-custom\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.851952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-public-tls-certs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.851977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggr5\" (UniqueName: \"kubernetes.io/projected/0ec7c890-e349-422b-bbea-c40083057a7f-kube-api-access-7ggr5\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.852012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec7c890-e349-422b-bbea-c40083057a7f-logs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " 
pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.852293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-combined-ca-bundle\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.852861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec7c890-e349-422b-bbea-c40083057a7f-logs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.858385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-internal-tls-certs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.858719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data-custom\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.860391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-public-tls-certs\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.864537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.866553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-combined-ca-bundle\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.869307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggr5\" (UniqueName: \"kubernetes.io/projected/0ec7c890-e349-422b-bbea-c40083057a7f-kube-api-access-7ggr5\") pod \"barbican-api-bf9cb7fc8-rhjfr\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:04 crc kubenswrapper[4707]: I0121 16:12:04.879689 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:05 crc kubenswrapper[4707]: I0121 16:12:05.183437 4707 scope.go:117] "RemoveContainer" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 16:12:05 crc kubenswrapper[4707]: E0121 16:12:05.183919 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(047c06ee-71d5-4bcf-83e0-dd8aa295cdb3)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" Jan 21 16:12:05 crc kubenswrapper[4707]: I0121 16:12:05.305331 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr"] Jan 21 16:12:05 crc kubenswrapper[4707]: I0121 16:12:05.536277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" event={"ID":"0ec7c890-e349-422b-bbea-c40083057a7f","Type":"ContainerStarted","Data":"c1dbdd29f67438ae9743e9e0b54ae682bc2556fd91e8655fdc8ef260aa96d9ec"} Jan 21 16:12:05 crc kubenswrapper[4707]: I0121 16:12:05.536588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" event={"ID":"0ec7c890-e349-422b-bbea-c40083057a7f","Type":"ContainerStarted","Data":"df2a6b15ed5cac857930bf02ffb07acf15c9254211dd5940dd1772f2502852ac"} Jan 21 16:12:05 crc kubenswrapper[4707]: I0121 16:12:05.539388 4707 scope.go:117] "RemoveContainer" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" Jan 21 16:12:05 crc kubenswrapper[4707]: E0121 16:12:05.539578 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:12:06 crc kubenswrapper[4707]: I0121 16:12:06.547872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" event={"ID":"0ec7c890-e349-422b-bbea-c40083057a7f","Type":"ContainerStarted","Data":"5abfe7dfab2c387f03fc384a6b2f564e952ec647ef9f0271b33a23ad90c8e35e"} Jan 21 16:12:06 crc kubenswrapper[4707]: I0121 16:12:06.548273 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:06 crc kubenswrapper[4707]: I0121 16:12:06.548299 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.457055 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6a3ae0f0-4069-4bac-8927-e0d9dce26caf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6a3ae0f0-4069-4bac-8927-e0d9dce26caf] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6a3ae0f0_4069_4bac_8927_e0d9dce26caf.slice" Jan 21 16:12:07 crc kubenswrapper[4707]: E0121 16:12:07.457474 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort 
pod6a3ae0f0-4069-4bac-8927-e0d9dce26caf] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6a3ae0f0-4069-4bac-8927-e0d9dce26caf] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6a3ae0f0_4069_4bac_8927_e0d9dce26caf.slice" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="6a3ae0f0-4069-4bac-8927-e0d9dce26caf" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.554537 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.580072 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" podStartSLOduration=3.580050032 podStartE2EDuration="3.580050032s" podCreationTimestamp="2026-01-21 16:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:06.566419408 +0000 UTC m=+4223.747935650" watchObservedRunningTime="2026-01-21 16:12:07.580050032 +0000 UTC m=+4224.761566255" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.588632 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.605882 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.619084 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.621742 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.624199 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.625246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.630275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.727492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9s4w\" (UniqueName: \"kubernetes.io/projected/cc201295-13b6-4e01-b791-f89da0093402-kube-api-access-c9s4w\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.727890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.728647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.728747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-logs\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.729177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.729529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.729581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 
16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.729827 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.731530 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3ae0f0-4069-4bac-8927-e0d9dce26caf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.782732 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.833894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.833935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.833967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-logs\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9s4w\" (UniqueName: \"kubernetes.io/projected/cc201295-13b6-4e01-b791-f89da0093402-kube-api-access-c9s4w\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834637 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.834827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.835286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-logs\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.841839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.844190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.847461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.851803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9s4w\" (UniqueName: \"kubernetes.io/projected/cc201295-13b6-4e01-b791-f89da0093402-kube-api-access-c9s4w\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.852125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.866464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.892853 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.893094 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" containerID="cri-o://3805d4c67848713fa21d3835fea8fa9390b192a873e21f06f95e32eafd197ef0" gracePeriod=30 Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.893175 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" containerID="cri-o://f1f33074f9c4884125c41e8f10998b0cfb13e6b6903b0de59bbc6d2a374caca5" gracePeriod=30 Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.913856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.914265 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-log" containerID="cri-o://ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb" gracePeriod=30 Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.914395 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-api" containerID="cri-o://78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d" gracePeriod=30 Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.922354 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.923405 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.988274 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.988591 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-log" containerID="cri-o://7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809" gracePeriod=30 Jan 21 16:12:07 crc kubenswrapper[4707]: I0121 16:12:07.989405 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-httpd" containerID="cri-o://a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11" gracePeriod=30 Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.415551 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.571484 4707 generic.go:334] "Generic (PLEG): container finished" podID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerID="7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809" exitCode=143 Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.571592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"804ec34d-12d2-48ee-9559-ca08aca94c3d","Type":"ContainerDied","Data":"7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809"} Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.574186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"cc201295-13b6-4e01-b791-f89da0093402","Type":"ContainerStarted","Data":"f090688fe38dcff346fcfc62fe945dbf0100d4f3fac3b53af5c75ae4e36f724b"} Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.576047 4707 generic.go:334] "Generic (PLEG): container finished" podID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerID="3805d4c67848713fa21d3835fea8fa9390b192a873e21f06f95e32eafd197ef0" exitCode=143 Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.576105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a56f6441-8492-43ae-8a5a-8e38eb6b7cab","Type":"ContainerDied","Data":"3805d4c67848713fa21d3835fea8fa9390b192a873e21f06f95e32eafd197ef0"} Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.578871 4707 generic.go:334] "Generic (PLEG): container finished" podID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerID="ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb" exitCode=143 Jan 21 16:12:08 crc kubenswrapper[4707]: I0121 16:12:08.578893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8d8f30cf-308e-4177-8b29-5d0d6811b551","Type":"ContainerDied","Data":"ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb"} Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.009995 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8657895bb4-ls6tf"] Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.010468 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-api" containerID="cri-o://0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92" gracePeriod=30 Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.010732 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-httpd" containerID="cri-o://f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50" gracePeriod=30 Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.021384 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.249:9696/\": EOF" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.033193 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp"] Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.034492 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.062636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp"] Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.170232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-httpd-config\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.170424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64tc9\" (UniqueName: \"kubernetes.io/projected/317a81bb-283b-43fe-88cb-ad8fadf45471-kube-api-access-64tc9\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.170537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-ovndb-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.170609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-internal-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.170642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-config\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 
16:12:09.170683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-combined-ca-bundle\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.170705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-public-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.200926 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3ae0f0-4069-4bac-8927-e0d9dce26caf" path="/var/lib/kubelet/pods/6a3ae0f0-4069-4bac-8927-e0d9dce26caf/volumes" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-ovndb-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-internal-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-config\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-combined-ca-bundle\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-public-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-httpd-config\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.273865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64tc9\" 
(UniqueName: \"kubernetes.io/projected/317a81bb-283b-43fe-88cb-ad8fadf45471-kube-api-access-64tc9\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.277831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-httpd-config\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.278362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-ovndb-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.279240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-internal-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.279472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-public-tls-certs\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.284436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-config\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.288048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-combined-ca-bundle\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.290974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64tc9\" (UniqueName: \"kubernetes.io/projected/317a81bb-283b-43fe-88cb-ad8fadf45471-kube-api-access-64tc9\") pod \"neutron-58d6d5d4f4-ldvtp\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.357939 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.595786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"cc201295-13b6-4e01-b791-f89da0093402","Type":"ContainerStarted","Data":"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087"} Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.596236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"cc201295-13b6-4e01-b791-f89da0093402","Type":"ContainerStarted","Data":"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2"} Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.596355 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-log" containerID="cri-o://959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2" gracePeriod=30 Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.596583 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-httpd" containerID="cri-o://68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087" gracePeriod=30 Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.602150 4707 generic.go:334] "Generic (PLEG): container finished" podID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerID="f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50" exitCode=0 Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.602232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" event={"ID":"de7fba6f-a3f2-4a62-a094-056e0cccd7fa","Type":"ContainerDied","Data":"f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50"} Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.625315 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.625299127 podStartE2EDuration="2.625299127s" podCreationTimestamp="2026-01-21 16:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:09.613697993 +0000 UTC m=+4226.795214214" watchObservedRunningTime="2026-01-21 16:12:09.625299127 +0000 UTC m=+4226.806815349" Jan 21 16:12:09 crc kubenswrapper[4707]: I0121 16:12:09.799694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp"] Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.003909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr"] Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.004131 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api-log" containerID="cri-o://c1dbdd29f67438ae9743e9e0b54ae682bc2556fd91e8655fdc8ef260aa96d9ec" gracePeriod=30 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.004234 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api" containerID="cri-o://5abfe7dfab2c387f03fc384a6b2f564e952ec647ef9f0271b33a23ad90c8e35e" gracePeriod=30 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.009458 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.5:9311/healthcheck\": read tcp 10.217.0.2:43874->10.217.0.5:9311: read: connection reset by peer" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.009832 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.5:9311/healthcheck\": read tcp 10.217.0.2:43878->10.217.0.5:9311: read: connection reset by peer" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.031994 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-ff884c948-lmvhf"] Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.033940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.045510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff884c948-lmvhf"] Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.109319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.109649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-logs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.109932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-internal-tls-certs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.109975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-combined-ca-bundle\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.110031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxtq\" (UniqueName: \"kubernetes.io/projected/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-kube-api-access-jlxtq\") pod 
\"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.110150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-public-tls-certs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.110186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data-custom\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.214956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-internal-tls-certs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.215039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-combined-ca-bundle\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.215109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxtq\" (UniqueName: \"kubernetes.io/projected/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-kube-api-access-jlxtq\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.215212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-public-tls-certs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.215240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data-custom\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.215402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.215436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-logs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.216205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-logs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.220837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-combined-ca-bundle\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.220929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-internal-tls-certs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.221377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data-custom\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.223547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.227275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-public-tls-certs\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.233714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxtq\" (UniqueName: \"kubernetes.io/projected/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-kube-api-access-jlxtq\") pod \"barbican-api-ff884c948-lmvhf\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.410519 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.423492 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.423689 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="38eec5c3-a42c-45b1-9cb9-d21e8c88de27" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc" gracePeriod=30 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.480726 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.519929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-httpd-config\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.520145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-public-tls-certs\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.520307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfvdv\" (UniqueName: \"kubernetes.io/projected/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-kube-api-access-hfvdv\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.520445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-ovndb-tls-certs\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.520496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-internal-tls-certs\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.520568 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-config\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.520611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-combined-ca-bundle\") pod \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\" (UID: \"de7fba6f-a3f2-4a62-a094-056e0cccd7fa\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.521778 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.527102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-kube-api-access-hfvdv" (OuterVolumeSpecName: "kube-api-access-hfvdv") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "kube-api-access-hfvdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.527414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.623668 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ec7c890-e349-422b-bbea-c40083057a7f" containerID="5abfe7dfab2c387f03fc384a6b2f564e952ec647ef9f0271b33a23ad90c8e35e" exitCode=0 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.623919 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ec7c890-e349-422b-bbea-c40083057a7f" containerID="c1dbdd29f67438ae9743e9e0b54ae682bc2556fd91e8655fdc8ef260aa96d9ec" exitCode=143 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.623719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" event={"ID":"0ec7c890-e349-422b-bbea-c40083057a7f","Type":"ContainerDied","Data":"5abfe7dfab2c387f03fc384a6b2f564e952ec647ef9f0271b33a23ad90c8e35e"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" event={"ID":"0ec7c890-e349-422b-bbea-c40083057a7f","Type":"ContainerDied","Data":"c1dbdd29f67438ae9743e9e0b54ae682bc2556fd91e8655fdc8ef260aa96d9ec"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-scripts\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9s4w\" (UniqueName: \"kubernetes.io/projected/cc201295-13b6-4e01-b791-f89da0093402-kube-api-access-c9s4w\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-combined-ca-bundle\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 
16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-logs\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-public-tls-certs\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-httpd-run\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.624621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.625169 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.625182 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfvdv\" (UniqueName: \"kubernetes.io/projected/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-kube-api-access-hfvdv\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.628971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-logs" (OuterVolumeSpecName: "logs") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.629373 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.631020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" event={"ID":"317a81bb-283b-43fe-88cb-ad8fadf45471","Type":"ContainerStarted","Data":"da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.631089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" event={"ID":"317a81bb-283b-43fe-88cb-ad8fadf45471","Type":"ContainerStarted","Data":"ed02f5813ce6ae9bea3dafe0363b9f0b1730c57e577d54c423ea7f70ac3cd7dc"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.648508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-scripts" (OuterVolumeSpecName: "scripts") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.648646 4707 generic.go:334] "Generic (PLEG): container finished" podID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerID="0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92" exitCode=0 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.648696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" event={"ID":"de7fba6f-a3f2-4a62-a094-056e0cccd7fa","Type":"ContainerDied","Data":"0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.648716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" event={"ID":"de7fba6f-a3f2-4a62-a094-056e0cccd7fa","Type":"ContainerDied","Data":"5bdc43924b302fde29989f06c46bbbed65e642b144e77c412f0c86831402ab88"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.648736 4707 scope.go:117] "RemoveContainer" containerID="f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.648765 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.649262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.652095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc201295-13b6-4e01-b791-f89da0093402-kube-api-access-c9s4w" (OuterVolumeSpecName: "kube-api-access-c9s4w") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "kube-api-access-c9s4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.663306 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc201295-13b6-4e01-b791-f89da0093402" containerID="68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087" exitCode=143 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.663355 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc201295-13b6-4e01-b791-f89da0093402" containerID="959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2" exitCode=143 Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.663378 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"cc201295-13b6-4e01-b791-f89da0093402","Type":"ContainerDied","Data":"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.663405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"cc201295-13b6-4e01-b791-f89da0093402","Type":"ContainerDied","Data":"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.663440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"cc201295-13b6-4e01-b791-f89da0093402","Type":"ContainerDied","Data":"f090688fe38dcff346fcfc62fe945dbf0100d4f3fac3b53af5c75ae4e36f724b"} Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.663491 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.670908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.676113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.691199 4707 scope.go:117] "RemoveContainer" containerID="0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.693275 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.698917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.719142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-config" (OuterVolumeSpecName: "config") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.727594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.740429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.740657 4707 scope.go:117] "RemoveContainer" containerID="f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50" Jan 21 16:12:10 crc kubenswrapper[4707]: E0121 16:12:10.743865 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50\": container with ID starting with f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50 not found: ID does not exist" containerID="f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.743902 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50"} err="failed to get container status \"f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50\": rpc error: code = NotFound desc = could not find container \"f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50\": container with ID starting with f7b0e213cfaa8dc8e31e07071c4e4533b9108a5259c62bcb5612f14b62d82b50 not found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.743922 4707 scope.go:117] "RemoveContainer" containerID="0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92" Jan 21 16:12:10 crc kubenswrapper[4707]: E0121 16:12:10.744535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92\": container with ID starting with 0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92 not found: ID does not exist" containerID="0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.744582 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92"} err="failed to get container status \"0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92\": rpc error: code = NotFound desc = could not find container \"0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92\": container with ID starting with 0244b6d1a1ecab933b5af1b72241f3973e6dfe3ec41b3e604a5c9b795d9b9b92 not found: ID does not 
exist" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.744606 4707 scope.go:117] "RemoveContainer" containerID="68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.785778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-public-tls-certs\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.785866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.786075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ggr5\" (UniqueName: \"kubernetes.io/projected/0ec7c890-e349-422b-bbea-c40083057a7f-kube-api-access-7ggr5\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.786215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec7c890-e349-422b-bbea-c40083057a7f-logs\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.786326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data-custom\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.786372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-internal-tls-certs\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.786416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-combined-ca-bundle\") pod \"0ec7c890-e349-422b-bbea-c40083057a7f\" (UID: \"0ec7c890-e349-422b-bbea-c40083057a7f\") " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.787340 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.787368 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.787387 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.787412 4707 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.787428 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.787438 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.789291 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9s4w\" (UniqueName: \"kubernetes.io/projected/cc201295-13b6-4e01-b791-f89da0093402-kube-api-access-c9s4w\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.789316 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.789330 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc201295-13b6-4e01-b791-f89da0093402-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.789339 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.791964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec7c890-e349-422b-bbea-c40083057a7f-logs" (OuterVolumeSpecName: "logs") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.793782 4707 scope.go:117] "RemoveContainer" containerID="959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.801408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.813037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de7fba6f-a3f2-4a62-a094-056e0cccd7fa" (UID: "de7fba6f-a3f2-4a62-a094-056e0cccd7fa"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.822849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec7c890-e349-422b-bbea-c40083057a7f-kube-api-access-7ggr5" (OuterVolumeSpecName: "kube-api-access-7ggr5") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "kube-api-access-7ggr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.833907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.841861 4707 scope.go:117] "RemoveContainer" containerID="68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087" Jan 21 16:12:10 crc kubenswrapper[4707]: E0121 16:12:10.842863 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087\": container with ID starting with 68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087 not found: ID does not exist" containerID="68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.842902 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087"} err="failed to get container status \"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087\": rpc error: code = NotFound desc = could not find container \"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087\": container with ID starting with 68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087 not found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.842927 4707 scope.go:117] "RemoveContainer" containerID="959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2" Jan 21 16:12:10 crc kubenswrapper[4707]: E0121 16:12:10.843184 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2\": container with ID starting with 959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2 not found: ID does not exist" containerID="959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.843216 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2"} err="failed to get container status \"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2\": rpc error: code = NotFound desc = could not find container \"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2\": container with ID starting with 959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2 not found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.843235 4707 scope.go:117] "RemoveContainer" 
containerID="68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.846302 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087"} err="failed to get container status \"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087\": rpc error: code = NotFound desc = could not find container \"68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087\": container with ID starting with 68ff15c1a8c4e7f3bde7eef2ae5c4769e82f4e4d480b0bf0772309a0202ca087 not found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.846327 4707 scope.go:117] "RemoveContainer" containerID="959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.846563 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2"} err="failed to get container status \"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2\": rpc error: code = NotFound desc = could not find container \"959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2\": container with ID starting with 959824c567adb02d6639af7bab28a04aa8ac4e50cefe55e50aeb1a263e40a7a2 not found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.892524 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ggr5\" (UniqueName: \"kubernetes.io/projected/0ec7c890-e349-422b-bbea-c40083057a7f-kube-api-access-7ggr5\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.892550 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.892562 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec7c890-e349-422b-bbea-c40083057a7f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.892572 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7fba6f-a3f2-4a62-a094-056e0cccd7fa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.892581 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.948758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff884c948-lmvhf"] Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.968086 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:12:10 crc kubenswrapper[4707]: I0121 16:12:10.994552 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.087606 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.088871 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.230:8775/\": read tcp 10.217.0.2:37334->10.217.1.230:8775: read: connection reset by peer" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.089323 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.230:8775/\": read tcp 10.217.0.2:37332->10.217.1.230:8775: read: connection reset by peer" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.095324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.095530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data" (OuterVolumeSpecName: "config-data") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.097378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data\") pod \"cc201295-13b6-4e01-b791-f89da0093402\" (UID: \"cc201295-13b6-4e01-b791-f89da0093402\") " Jan 21 16:12:11 crc kubenswrapper[4707]: W0121 16:12:11.097712 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cc201295-13b6-4e01-b791-f89da0093402/volumes/kubernetes.io~secret/config-data Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.097741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data" (OuterVolumeSpecName: "config-data") pod "cc201295-13b6-4e01-b791-f89da0093402" (UID: "cc201295-13b6-4e01-b791-f89da0093402"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.098207 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.098224 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.098234 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc201295-13b6-4e01-b791-f89da0093402-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.116887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.133534 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data" (OuterVolumeSpecName: "config-data") pod "0ec7c890-e349-422b-bbea-c40083057a7f" (UID: "0ec7c890-e349-422b-bbea-c40083057a7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.184383 4707 scope.go:117] "RemoveContainer" containerID="7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840" Jan 21 16:12:11 crc kubenswrapper[4707]: E0121 16:12:11.184777 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(647f33a8-e7c7-4941-9c0d-f9c5e4b07409)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.199482 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.199511 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec7c890-e349-422b-bbea-c40083057a7f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.675216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" event={"ID":"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0","Type":"ContainerStarted","Data":"f5fa696efe6598371fba15c3d9ef300ea51e9809ba27a262ee7247e760e06552"} Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.679801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" event={"ID":"0ec7c890-e349-422b-bbea-c40083057a7f","Type":"ContainerDied","Data":"df2a6b15ed5cac857930bf02ffb07acf15c9254211dd5940dd1772f2502852ac"} Jan 21 16:12:11 
crc kubenswrapper[4707]: I0121 16:12:11.679905 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.680066 4707 scope.go:117] "RemoveContainer" containerID="5abfe7dfab2c387f03fc384a6b2f564e952ec647ef9f0271b33a23ad90c8e35e" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.682982 4707 generic.go:334] "Generic (PLEG): container finished" podID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerID="f1f33074f9c4884125c41e8f10998b0cfb13e6b6903b0de59bbc6d2a374caca5" exitCode=0 Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.683084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a56f6441-8492-43ae-8a5a-8e38eb6b7cab","Type":"ContainerDied","Data":"f1f33074f9c4884125c41e8f10998b0cfb13e6b6903b0de59bbc6d2a374caca5"} Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.683112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a56f6441-8492-43ae-8a5a-8e38eb6b7cab","Type":"ContainerDied","Data":"ee4df396e056207a57c588b23cf0363049858f0ddecf9fe9b16f81dfa131d113"} Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.683129 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee4df396e056207a57c588b23cf0363049858f0ddecf9fe9b16f81dfa131d113" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.686713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" event={"ID":"317a81bb-283b-43fe-88cb-ad8fadf45471","Type":"ContainerStarted","Data":"555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4"} Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.686896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.721338 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" podStartSLOduration=2.721317592 podStartE2EDuration="2.721317592s" podCreationTimestamp="2026-01-21 16:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:11.711583429 +0000 UTC m=+4228.893099651" watchObservedRunningTime="2026-01-21 16:12:11.721317592 +0000 UTC m=+4228.902833814" Jan 21 16:12:11 crc kubenswrapper[4707]: I0121 16:12:11.815000 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.116:5671: connect: connection reset by peer" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.140207 4707 scope.go:117] "RemoveContainer" containerID="c1dbdd29f67438ae9743e9e0b54ae682bc2556fd91e8655fdc8ef260aa96d9ec" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.176392 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.243365 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.273122 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.297510 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.322078 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.330630 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-bf9cb7fc8-rhjfr"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.331957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-config-data\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-combined-ca-bundle\") pod \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-nova-metadata-tls-certs\") pod \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4mbx\" (UniqueName: \"kubernetes.io/projected/804ec34d-12d2-48ee-9559-ca08aca94c3d-kube-api-access-b4mbx\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-httpd-run\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d44g5\" (UniqueName: \"kubernetes.io/projected/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-kube-api-access-d44g5\") pod \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-logs\") pod 
\"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-scripts\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-config-data\") pod \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-logs\") pod \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\" (UID: \"a56f6441-8492-43ae-8a5a-8e38eb6b7cab\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-internal-tls-certs\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.332506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-combined-ca-bundle\") pod \"804ec34d-12d2-48ee-9559-ca08aca94c3d\" (UID: \"804ec34d-12d2-48ee-9559-ca08aca94c3d\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.340239 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.343618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-logs" (OuterVolumeSpecName: "logs") pod "a56f6441-8492-43ae-8a5a-8e38eb6b7cab" (UID: "a56f6441-8492-43ae-8a5a-8e38eb6b7cab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.345241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.350507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-logs" (OuterVolumeSpecName: "logs") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.355428 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-kube-api-access-d44g5" (OuterVolumeSpecName: "kube-api-access-d44g5") pod "a56f6441-8492-43ae-8a5a-8e38eb6b7cab" (UID: "a56f6441-8492-43ae-8a5a-8e38eb6b7cab"). InnerVolumeSpecName "kube-api-access-d44g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.355480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804ec34d-12d2-48ee-9559-ca08aca94c3d-kube-api-access-b4mbx" (OuterVolumeSpecName: "kube-api-access-b4mbx") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "kube-api-access-b4mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.356136 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.357067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.362933 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363281 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363307 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363320 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-log" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363338 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363357 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-log" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api-log" Jan 21 16:12:12 crc 
kubenswrapper[4707]: I0121 16:12:12.363376 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api-log" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363394 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-log" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363411 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-api" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-api" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38eec5c3-a42c-45b1-9cb9-d21e8c88de27" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="38eec5c3-a42c-45b1-9cb9-d21e8c88de27" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363442 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363459 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363464 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363478 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-api" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-api" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363490 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363496 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.363506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363511 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363679 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api" Jan 
21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363689 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-api" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363701 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-metadata" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363711 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363723 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" containerName="nova-metadata-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363730 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363737 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" containerName="barbican-api-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363746 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363754 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc201295-13b6-4e01-b791-f89da0093402" containerName="glance-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" containerName="neutron-api" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="38eec5c3-a42c-45b1-9cb9-d21e8c88de27" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363777 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerName="nova-api-log" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.363787 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerName="glance-httpd" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.364696 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.366445 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.366707 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.368258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-scripts" (OuterVolumeSpecName: "scripts") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.370473 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.374612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.396079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-config-data" (OuterVolumeSpecName: "config-data") pod "a56f6441-8492-43ae-8a5a-8e38eb6b7cab" (UID: "a56f6441-8492-43ae-8a5a-8e38eb6b7cab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.401598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a56f6441-8492-43ae-8a5a-8e38eb6b7cab" (UID: "a56f6441-8492-43ae-8a5a-8e38eb6b7cab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.417930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a56f6441-8492-43ae-8a5a-8e38eb6b7cab" (UID: "a56f6441-8492-43ae-8a5a-8e38eb6b7cab"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.426609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-config-data" (OuterVolumeSpecName: "config-data") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjg2v\" (UniqueName: \"kubernetes.io/projected/8d8f30cf-308e-4177-8b29-5d0d6811b551-kube-api-access-cjg2v\") pod \"8d8f30cf-308e-4177-8b29-5d0d6811b551\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-nova-novncproxy-tls-certs\") pod \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-public-tls-certs\") pod \"8d8f30cf-308e-4177-8b29-5d0d6811b551\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8f30cf-308e-4177-8b29-5d0d6811b551-logs\") pod \"8d8f30cf-308e-4177-8b29-5d0d6811b551\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-config-data\") pod \"8d8f30cf-308e-4177-8b29-5d0d6811b551\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfr7s\" (UniqueName: \"kubernetes.io/projected/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-kube-api-access-zfr7s\") pod \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-config-data\") pod \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-vencrypt-tls-certs\") pod \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.434957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-combined-ca-bundle\") pod \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\" (UID: \"38eec5c3-a42c-45b1-9cb9-d21e8c88de27\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-combined-ca-bundle\") pod 
\"8d8f30cf-308e-4177-8b29-5d0d6811b551\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-internal-tls-certs\") pod \"8d8f30cf-308e-4177-8b29-5d0d6811b551\" (UID: \"8d8f30cf-308e-4177-8b29-5d0d6811b551\") " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435588 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435605 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435615 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435628 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435646 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435655 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435664 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4mbx\" (UniqueName: \"kubernetes.io/projected/804ec34d-12d2-48ee-9559-ca08aca94c3d-kube-api-access-b4mbx\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435674 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435683 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d44g5\" (UniqueName: \"kubernetes.io/projected/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-kube-api-access-d44g5\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435691 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804ec34d-12d2-48ee-9559-ca08aca94c3d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435699 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.435708 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a56f6441-8492-43ae-8a5a-8e38eb6b7cab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.438524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d8f30cf-308e-4177-8b29-5d0d6811b551-logs" (OuterVolumeSpecName: "logs") pod "8d8f30cf-308e-4177-8b29-5d0d6811b551" (UID: "8d8f30cf-308e-4177-8b29-5d0d6811b551"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.462928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "804ec34d-12d2-48ee-9559-ca08aca94c3d" (UID: "804ec34d-12d2-48ee-9559-ca08aca94c3d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.463127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-kube-api-access-zfr7s" (OuterVolumeSpecName: "kube-api-access-zfr7s") pod "38eec5c3-a42c-45b1-9cb9-d21e8c88de27" (UID: "38eec5c3-a42c-45b1-9cb9-d21e8c88de27"). InnerVolumeSpecName "kube-api-access-zfr7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.463267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8f30cf-308e-4177-8b29-5d0d6811b551-kube-api-access-cjg2v" (OuterVolumeSpecName: "kube-api-access-cjg2v") pod "8d8f30cf-308e-4177-8b29-5d0d6811b551" (UID: "8d8f30cf-308e-4177-8b29-5d0d6811b551"). InnerVolumeSpecName "kube-api-access-cjg2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.474654 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.492862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-config-data" (OuterVolumeSpecName: "config-data") pod "38eec5c3-a42c-45b1-9cb9-d21e8c88de27" (UID: "38eec5c3-a42c-45b1-9cb9-d21e8c88de27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.499111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38eec5c3-a42c-45b1-9cb9-d21e8c88de27" (UID: "38eec5c3-a42c-45b1-9cb9-d21e8c88de27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.512936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-config-data" (OuterVolumeSpecName: "config-data") pod "8d8f30cf-308e-4177-8b29-5d0d6811b551" (UID: "8d8f30cf-308e-4177-8b29-5d0d6811b551"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.516711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d8f30cf-308e-4177-8b29-5d0d6811b551" (UID: "8d8f30cf-308e-4177-8b29-5d0d6811b551"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.517773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "38eec5c3-a42c-45b1-9cb9-d21e8c88de27" (UID: "38eec5c3-a42c-45b1-9cb9-d21e8c88de27"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.524024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8d8f30cf-308e-4177-8b29-5d0d6811b551" (UID: "8d8f30cf-308e-4177-8b29-5d0d6811b551"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.524907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8d8f30cf-308e-4177-8b29-5d0d6811b551" (UID: "8d8f30cf-308e-4177-8b29-5d0d6811b551"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.525109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "38eec5c3-a42c-45b1-9cb9-d21e8c88de27" (UID: "38eec5c3-a42c-45b1-9cb9-d21e8c88de27"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.539333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.539757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.539883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.539942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.540037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-logs\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.540285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.540687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.540871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg558\" (UniqueName: \"kubernetes.io/projected/1162dc59-3a54-42e6-84c5-466eafb062b4-kube-api-access-fg558\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541087 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541149 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/804ec34d-12d2-48ee-9559-ca08aca94c3d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541224 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541276 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541331 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541418 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541494 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541572 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjg2v\" (UniqueName: \"kubernetes.io/projected/8d8f30cf-308e-4177-8b29-5d0d6811b551-kube-api-access-cjg2v\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541649 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541721 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.541848 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8f30cf-308e-4177-8b29-5d0d6811b551-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.542156 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8f30cf-308e-4177-8b29-5d0d6811b551-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.542289 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfr7s\" (UniqueName: \"kubernetes.io/projected/38eec5c3-a42c-45b1-9cb9-d21e8c88de27-kube-api-access-zfr7s\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-logs\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg558\" (UniqueName: \"kubernetes.io/projected/1162dc59-3a54-42e6-84c5-466eafb062b4-kube-api-access-fg558\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.644529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.645109 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-logs\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.645830 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.646302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.648706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.649471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.649625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.659793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.663694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg558\" (UniqueName: \"kubernetes.io/projected/1162dc59-3a54-42e6-84c5-466eafb062b4-kube-api-access-fg558\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.673388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.704060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" event={"ID":"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0","Type":"ContainerStarted","Data":"34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.705121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" 
event={"ID":"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0","Type":"ContainerStarted","Data":"408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.705238 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.705301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.715296 4707 generic.go:334] "Generic (PLEG): container finished" podID="8d8f30cf-308e-4177-8b29-5d0d6811b551" containerID="78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d" exitCode=0 Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.715416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8d8f30cf-308e-4177-8b29-5d0d6811b551","Type":"ContainerDied","Data":"78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.715462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"8d8f30cf-308e-4177-8b29-5d0d6811b551","Type":"ContainerDied","Data":"303a95345de521cf994c2aa7b38cdbeeb9040656ca262c2b9cc64c79d8ef4901"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.715489 4707 scope.go:117] "RemoveContainer" containerID="78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.715662 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.720672 4707 generic.go:334] "Generic (PLEG): container finished" podID="804ec34d-12d2-48ee-9559-ca08aca94c3d" containerID="a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11" exitCode=0 Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.720774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"804ec34d-12d2-48ee-9559-ca08aca94c3d","Type":"ContainerDied","Data":"a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.720857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"804ec34d-12d2-48ee-9559-ca08aca94c3d","Type":"ContainerDied","Data":"aee46c0b9fcf24d5f30507b57ccefd43c64c6daa63400c140278a8cd79ae7693"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.720983 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.733195 4707 generic.go:334] "Generic (PLEG): container finished" podID="38eec5c3-a42c-45b1-9cb9-d21e8c88de27" containerID="23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc" exitCode=0 Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.733267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"38eec5c3-a42c-45b1-9cb9-d21e8c88de27","Type":"ContainerDied","Data":"23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.733318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"38eec5c3-a42c-45b1-9cb9-d21e8c88de27","Type":"ContainerDied","Data":"7213c8df36de41b0edb774db638801343eac416614b300b2596866389eb5b8f8"} Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.733389 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.733619 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.738633 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" podStartSLOduration=2.738613509 podStartE2EDuration="2.738613509s" podCreationTimestamp="2026-01-21 16:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:12.72223316 +0000 UTC m=+4229.903749381" watchObservedRunningTime="2026-01-21 16:12:12.738613509 +0000 UTC m=+4229.920129731" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.744224 4707 scope.go:117] "RemoveContainer" containerID="ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.776545 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.785997 4707 scope.go:117] "RemoveContainer" containerID="78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.786432 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d\": container with ID starting with 78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d not found: ID does not exist" containerID="78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.786568 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d"} err="failed to get container status \"78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d\": rpc error: code = NotFound desc = could not find container \"78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d\": container with ID starting with 78e1e263ebb58018d1dc4661a030bb76315d742a84fd6fe4a46a0b2c2312634d not found: ID does not exist" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.786645 4707 scope.go:117] "RemoveContainer" containerID="ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb" Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.791896 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb\": container with ID starting with ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb not found: ID does not exist" containerID="ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.791940 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb"} err="failed to get container status \"ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb\": rpc error: code = NotFound desc = could not find container \"ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb\": container with ID starting with ea220bd0932d54ab22c264f10c14330ede4ba526423b872308218c0e16480bbb not found: ID does not exist" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.791965 4707 scope.go:117] "RemoveContainer" containerID="a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.831864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.858656 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.892679 4707 scope.go:117] "RemoveContainer" containerID="7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.900891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.902314 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.904358 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.904637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.913912 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.921499 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.923742 4707 scope.go:117] "RemoveContainer" containerID="a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.929795 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.929974 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11\": container with ID starting with a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11 not found: ID does not exist" containerID="a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.930032 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11"} err="failed to get container status \"a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11\": rpc error: code = NotFound desc = could not find container \"a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11\": container with ID starting with a22a9ca7c3a6cc665c7c278548bdb90be4e5a85fbbaa3b4fc813b3f35d6cfa11 not found: ID does not exist" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.930066 4707 scope.go:117] "RemoveContainer" containerID="7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.936977 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.942208 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809\": container with ID starting with 7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809 not found: ID does not exist" containerID="7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.942242 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809"} err="failed to get container status \"7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809\": rpc error: code = NotFound desc = could not find container \"7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809\": container with ID starting with 
7dea43414c0fbb835bafe867be9e8be726ecb7755d947ef7c59c73fa78fd3809 not found: ID does not exist" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.942258 4707 scope.go:117] "RemoveContainer" containerID="23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.942365 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.946853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.948277 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.950488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.950956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.956145 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.959334 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.961577 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.963066 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.963980 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.964193 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.972028 4707 scope.go:117] "RemoveContainer" containerID="23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.972174 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: E0121 16:12:12.972533 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc\": container with ID starting with 23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc not found: ID does not exist" containerID="23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.972559 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc"} err="failed to get container status \"23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc\": rpc error: code = NotFound desc = could not find container \"23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc\": container with ID starting 
with 23a896d9717a751f665397433e0148d136b7c441ab23f9ef60933b0b3100e7fc not found: ID does not exist" Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.978455 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:12:12 crc kubenswrapper[4707]: I0121 16:12:12.984253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:12.998758 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.000276 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.001889 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.002844 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.002964 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.008027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.052839 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvfgk\" (UniqueName: \"kubernetes.io/projected/69e40397-47ac-49d6-9b01-50b995d290f7-kube-api-access-lvfgk\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.052884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-logs\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.052907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053344 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-config-data\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.053968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.054039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htmzz\" (UniqueName: \"kubernetes.io/projected/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-kube-api-access-htmzz\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.054078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.054139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6pt\" (UniqueName: \"kubernetes.io/projected/561add48-0cde-40f7-95b6-09300d00eec4-kube-api-access-2d6pt\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-public-tls-certs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-config-data\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156467 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"69e40397-47ac-49d6-9b01-50b995d290f7\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnkkz\" (UniqueName: \"kubernetes.io/projected/594873ed-1daa-422c-853c-65fab465dbdb-kube-api-access-nnkkz\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156706 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.156987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-config-data\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htmzz\" (UniqueName: \"kubernetes.io/projected/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-kube-api-access-htmzz\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6pt\" (UniqueName: \"kubernetes.io/projected/561add48-0cde-40f7-95b6-09300d00eec4-kube-api-access-2d6pt\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvfgk\" (UniqueName: \"kubernetes.io/projected/69e40397-47ac-49d6-9b01-50b995d290f7-kube-api-access-lvfgk\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-logs\") pod \"nova-metadata-0\" (UID: 
\"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.157756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594873ed-1daa-422c-853c-65fab465dbdb-logs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.158747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-logs\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.158979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.159480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.163465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.169380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.169487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.169509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 
16:12:13.169497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.169875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.169955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.170089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.170222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.170228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.170553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-config-data\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.172686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6pt\" (UniqueName: \"kubernetes.io/projected/561add48-0cde-40f7-95b6-09300d00eec4-kube-api-access-2d6pt\") pod \"nova-cell1-novncproxy-0\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.172764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htmzz\" (UniqueName: \"kubernetes.io/projected/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-kube-api-access-htmzz\") pod \"nova-metadata-0\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.173552 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lvfgk\" (UniqueName: \"kubernetes.io/projected/69e40397-47ac-49d6-9b01-50b995d290f7-kube-api-access-lvfgk\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.183547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-0\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.195272 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec7c890-e349-422b-bbea-c40083057a7f" path="/var/lib/kubelet/pods/0ec7c890-e349-422b-bbea-c40083057a7f/volumes" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.195876 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38eec5c3-a42c-45b1-9cb9-d21e8c88de27" path="/var/lib/kubelet/pods/38eec5c3-a42c-45b1-9cb9-d21e8c88de27/volumes" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.196446 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804ec34d-12d2-48ee-9559-ca08aca94c3d" path="/var/lib/kubelet/pods/804ec34d-12d2-48ee-9559-ca08aca94c3d/volumes" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.197435 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8f30cf-308e-4177-8b29-5d0d6811b551" path="/var/lib/kubelet/pods/8d8f30cf-308e-4177-8b29-5d0d6811b551/volumes" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.198015 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56f6441-8492-43ae-8a5a-8e38eb6b7cab" path="/var/lib/kubelet/pods/a56f6441-8492-43ae-8a5a-8e38eb6b7cab/volumes" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.199028 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc201295-13b6-4e01-b791-f89da0093402" path="/var/lib/kubelet/pods/cc201295-13b6-4e01-b791-f89da0093402/volumes" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.217590 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.260006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-config-data\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.260412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnkkz\" (UniqueName: \"kubernetes.io/projected/594873ed-1daa-422c-853c-65fab465dbdb-kube-api-access-nnkkz\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.260879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.260919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.262943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594873ed-1daa-422c-853c-65fab465dbdb-logs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.262999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-public-tls-certs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.263402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594873ed-1daa-422c-853c-65fab465dbdb-logs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.264983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-config-data\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.265184 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.266098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.266268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-public-tls-certs\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.266603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.274123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnkkz\" (UniqueName: \"kubernetes.io/projected/594873ed-1daa-422c-853c-65fab465dbdb-kube-api-access-nnkkz\") pod \"nova-api-0\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.277281 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.348653 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.366515 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:13 crc kubenswrapper[4707]: W0121 16:12:13.367828 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1162dc59_3a54_42e6_84c5_466eafb062b4.slice/crio-38d27dd9bb919d4d3cdd89043730417d1b73ecdd18fafcb0e3bc9684d17af3f1 WatchSource:0}: Error finding container 38d27dd9bb919d4d3cdd89043730417d1b73ecdd18fafcb0e3bc9684d17af3f1: Status 404 returned error can't find the container with id 38d27dd9bb919d4d3cdd89043730417d1b73ecdd18fafcb0e3bc9684d17af3f1 Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.641699 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: W0121 16:12:13.651022 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e40397_47ac_49d6_9b01_50b995d290f7.slice/crio-a6c7cd084f1ec726b7c15a7e37b8c82152c44796da07eb2e51101755c1bb1094 WatchSource:0}: Error finding container a6c7cd084f1ec726b7c15a7e37b8c82152c44796da07eb2e51101755c1bb1094: Status 404 returned error can't find the container with id a6c7cd084f1ec726b7c15a7e37b8c82152c44796da07eb2e51101755c1bb1094 Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.758733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1162dc59-3a54-42e6-84c5-466eafb062b4","Type":"ContainerStarted","Data":"38d27dd9bb919d4d3cdd89043730417d1b73ecdd18fafcb0e3bc9684d17af3f1"} Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.765537 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.768879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"69e40397-47ac-49d6-9b01-50b995d290f7","Type":"ContainerStarted","Data":"a6c7cd084f1ec726b7c15a7e37b8c82152c44796da07eb2e51101755c1bb1094"} Jan 21 16:12:13 crc kubenswrapper[4707]: W0121 16:12:13.770758 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod561add48_0cde_40f7_95b6_09300d00eec4.slice/crio-5daab13c19a07ca007adfa9b4b37be73cd9f33e8ab7fc149dcca36c7474fcae2 WatchSource:0}: Error finding container 5daab13c19a07ca007adfa9b4b37be73cd9f33e8ab7fc149dcca36c7474fcae2: Status 404 returned error can't find the container with id 5daab13c19a07ca007adfa9b4b37be73cd9f33e8ab7fc149dcca36c7474fcae2 Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.824579 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: I0121 16:12:13.922446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:12:13 crc kubenswrapper[4707]: W0121 16:12:13.939893 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod594873ed_1daa_422c_853c_65fab465dbdb.slice/crio-bdd8aa679b0f57b3df344998456e4d250d7959b5e962261728ab17ce6dd9b21b WatchSource:0}: Error finding container bdd8aa679b0f57b3df344998456e4d250d7959b5e962261728ab17ce6dd9b21b: Status 404 returned error can't find the container with id bdd8aa679b0f57b3df344998456e4d250d7959b5e962261728ab17ce6dd9b21b 
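The entries above show the kubelet's volume reconciliation for the newly admitted pods (nova-api-0, nova-metadata-0, nova-cell1-novncproxy-0, glance-default-internal-api-0): each volume passes through VerifyControllerAttachedVolume, then "MountVolume started", then "MountVolume.SetUp succeeded", and only afterwards does the kubelet report that no sandbox exists and create one. As a reading aid, the sketch below is a small, self-contained Go program; it is an assumption for illustration only, not part of the kubelet or of any tooling referenced in this log. It reads a saved journal excerpt on stdin and tallies SetUp successes per pod, assuming one journal entry per line as journalctl normally emits (the wrapped lines in this excerpt would first need to be re-joined).

    // tallymounts.go: illustrative sketch only (an assumption, not kubelet code).
    // Reads a saved kubelet journal excerpt on stdin and counts, per pod, the
    // volumes for which "MountVolume.SetUp succeeded" was logged.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var (
        // Inside the kubelet's structured message the volume name is quoted and
        // escaped, so it appears as \"name\" in the raw journal text.
        volRe = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"`)
        // The trailing pod="namespace/name" key/value field is not escaped.
        podRe = regexp.MustCompile(`pod="([^"]+)"`)
    )

    func main() {
        mounted := map[string][]string{} // pod -> volumes that completed SetUp

        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
        for sc.Scan() {
            line := sc.Text()
            vol := volRe.FindStringSubmatch(line)
            pod := podRe.FindStringSubmatch(line)
            if vol == nil || pod == nil {
                continue // not a SetUp-succeeded entry
            }
            mounted[pod[1]] = append(mounted[pod[1]], vol[1])
        }
        if err := sc.Err(); err != nil {
            fmt.Fprintln(os.Stderr, "scan:", err)
            os.Exit(1)
        }

        for pod, vols := range mounted {
            fmt.Printf("%-55s %2d volumes: %v\n", pod, len(vols), vols)
        }
    }

Piping a saved excerpt through it (for example, journalctl -u kubelet --no-pager | go run tallymounts.go, assuming the kubelet runs under a unit named kubelet here) lists each pod with the volumes that reached SetUp, which makes it easier to spot a pod whose mounts never complete before sandbox creation.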
Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.799594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2","Type":"ContainerStarted","Data":"a3f629a586eeaf3a0430289235f8ed826a343e4d540c7552bf23587188a60f57"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.800004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2","Type":"ContainerStarted","Data":"75695222855ca1c4f0fbbb6b2725a344e12f18c6673f5479b9fa3e2af236f9ba"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.800018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2","Type":"ContainerStarted","Data":"7319f861ffb7c9c24e473247fe12ac8f3757571c1b479da39dc27d4a08e061ed"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.802375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"69e40397-47ac-49d6-9b01-50b995d290f7","Type":"ContainerStarted","Data":"cd8eeca7ee1cb0f9e8acd6fb2fc5f3841099beedc093a3deaf46d81c1d640e58"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.803717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1162dc59-3a54-42e6-84c5-466eafb062b4","Type":"ContainerStarted","Data":"d9c4cae2facb86f1f23dc52a9d0d7d3cf12140812ce2d41bc7723b83171f967b"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.806604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1162dc59-3a54-42e6-84c5-466eafb062b4","Type":"ContainerStarted","Data":"0d655dcc30fe3941863c06455c345616e3fae519df331a4a8dd1db93d04f7978"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.808222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"594873ed-1daa-422c-853c-65fab465dbdb","Type":"ContainerStarted","Data":"8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.808259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"594873ed-1daa-422c-853c-65fab465dbdb","Type":"ContainerStarted","Data":"ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.808269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"594873ed-1daa-422c-853c-65fab465dbdb","Type":"ContainerStarted","Data":"bdd8aa679b0f57b3df344998456e4d250d7959b5e962261728ab17ce6dd9b21b"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.810251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"561add48-0cde-40f7-95b6-09300d00eec4","Type":"ContainerStarted","Data":"d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.810277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"561add48-0cde-40f7-95b6-09300d00eec4","Type":"ContainerStarted","Data":"5daab13c19a07ca007adfa9b4b37be73cd9f33e8ab7fc149dcca36c7474fcae2"} Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 
16:12:14.816754 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.8167453829999998 podStartE2EDuration="2.816745383s" podCreationTimestamp="2026-01-21 16:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:14.813133321 +0000 UTC m=+4231.994649543" watchObservedRunningTime="2026-01-21 16:12:14.816745383 +0000 UTC m=+4231.998261604" Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.837731 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.83771438 podStartE2EDuration="2.83771438s" podCreationTimestamp="2026-01-21 16:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:14.828844402 +0000 UTC m=+4232.010360624" watchObservedRunningTime="2026-01-21 16:12:14.83771438 +0000 UTC m=+4232.019230603" Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.864839 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.864822552 podStartE2EDuration="2.864822552s" podCreationTimestamp="2026-01-21 16:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:14.860018699 +0000 UTC m=+4232.041534921" watchObservedRunningTime="2026-01-21 16:12:14.864822552 +0000 UTC m=+4232.046338774" Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.905411 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.905392415 podStartE2EDuration="2.905392415s" podCreationTimestamp="2026-01-21 16:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:14.880642918 +0000 UTC m=+4232.062159140" watchObservedRunningTime="2026-01-21 16:12:14.905392415 +0000 UTC m=+4232.086908637" Jan 21 16:12:14 crc kubenswrapper[4707]: I0121 16:12:14.906603 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.90659751 podStartE2EDuration="2.90659751s" podCreationTimestamp="2026-01-21 16:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:14.899985849 +0000 UTC m=+4232.081502071" watchObservedRunningTime="2026-01-21 16:12:14.90659751 +0000 UTC m=+4232.088113733" Jan 21 16:12:15 crc kubenswrapper[4707]: I0121 16:12:15.821665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"69e40397-47ac-49d6-9b01-50b995d290f7","Type":"ContainerStarted","Data":"8e765e71026c6eb75d97a822236ee30c3868681e753b9c3940e5cd5d39e67c94"} Jan 21 16:12:16 crc kubenswrapper[4707]: I0121 16:12:16.006066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:16 crc kubenswrapper[4707]: I0121 16:12:16.182465 4707 scope.go:117] "RemoveContainer" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 16:12:16 crc 
kubenswrapper[4707]: I0121 16:12:16.844478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerStarted","Data":"2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700"} Jan 21 16:12:16 crc kubenswrapper[4707]: I0121 16:12:16.845562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:12:17 crc kubenswrapper[4707]: I0121 16:12:17.950298 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.017317 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p"] Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.018902 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" podUID="7d6ee277-b629-49a0-94da-030a39be02f5" containerName="keystone-api" containerID="cri-o://a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c" gracePeriod=30 Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.265747 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.277461 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.277516 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.559398 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.560617 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.564180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.568116 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.569124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-6kq7j" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.578057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.610398 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:18 crc kubenswrapper[4707]: E0121 16:12:18.611445 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-5569z openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="c690e3bc-184a-4844-8556-661cd5c2c69c" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.618105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.618174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5569z\" (UniqueName: \"kubernetes.io/projected/c690e3bc-184a-4844-8556-661cd5c2c69c-kube-api-access-5569z\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.618239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.618297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.630846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.634438 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.635639 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.642370 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config-secret\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5569z\" (UniqueName: \"kubernetes.io/projected/c690e3bc-184a-4844-8556-661cd5c2c69c-kube-api-access-5569z\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.720350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9qc4\" (UniqueName: \"kubernetes.io/projected/85d128b6-18c3-4018-a74a-86b025e2196d-kube-api-access-w9qc4\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.721121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: E0121 16:12:18.722435 4707 projected.go:194] Error preparing data for projected volume kube-api-access-5569z for pod openstack-kuttl-tests/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c690e3bc-184a-4844-8556-661cd5c2c69c) does not match the UID in record. The object might have been deleted and then recreated Jan 21 16:12:18 crc kubenswrapper[4707]: E0121 16:12:18.722501 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c690e3bc-184a-4844-8556-661cd5c2c69c-kube-api-access-5569z podName:c690e3bc-184a-4844-8556-661cd5c2c69c nodeName:}" failed. No retries permitted until 2026-01-21 16:12:19.222484277 +0000 UTC m=+4236.404000499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5569z" (UniqueName: "kubernetes.io/projected/c690e3bc-184a-4844-8556-661cd5c2c69c-kube-api-access-5569z") pod "openstackclient" (UID: "c690e3bc-184a-4844-8556-661cd5c2c69c") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c690e3bc-184a-4844-8556-661cd5c2c69c) does not match the UID in record. The object might have been deleted and then recreated Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.725562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.726881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.821394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.821458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9qc4\" (UniqueName: \"kubernetes.io/projected/85d128b6-18c3-4018-a74a-86b025e2196d-kube-api-access-w9qc4\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.821509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config-secret\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.821547 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.822390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.825177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.825606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config-secret\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.837145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9qc4\" (UniqueName: \"kubernetes.io/projected/85d128b6-18c3-4018-a74a-86b025e2196d-kube-api-access-w9qc4\") pod \"openstackclient\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.860939 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.863932 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="c690e3bc-184a-4844-8556-661cd5c2c69c" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.896798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:18 crc kubenswrapper[4707]: I0121 16:12:18.955533 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.024090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-combined-ca-bundle\") pod \"c690e3bc-184a-4844-8556-661cd5c2c69c\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.024367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config-secret\") pod \"c690e3bc-184a-4844-8556-661cd5c2c69c\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.024642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config\") pod \"c690e3bc-184a-4844-8556-661cd5c2c69c\" (UID: \"c690e3bc-184a-4844-8556-661cd5c2c69c\") " Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.025023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c690e3bc-184a-4844-8556-661cd5c2c69c" (UID: "c690e3bc-184a-4844-8556-661cd5c2c69c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.025447 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.025527 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5569z\" (UniqueName: \"kubernetes.io/projected/c690e3bc-184a-4844-8556-661cd5c2c69c-kube-api-access-5569z\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.027957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c690e3bc-184a-4844-8556-661cd5c2c69c" (UID: "c690e3bc-184a-4844-8556-661cd5c2c69c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.028017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c690e3bc-184a-4844-8556-661cd5c2c69c" (UID: "c690e3bc-184a-4844-8556-661cd5c2c69c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.128256 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.128561 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c690e3bc-184a-4844-8556-661cd5c2c69c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.194489 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c690e3bc-184a-4844-8556-661cd5c2c69c" path="/var/lib/kubelet/pods/c690e3bc-184a-4844-8556-661cd5c2c69c/volumes" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.363484 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:12:19 crc kubenswrapper[4707]: W0121 16:12:19.367583 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d128b6_18c3_4018_a74a_86b025e2196d.slice/crio-fd8cc84a2f0107c14e207ce977604c96e03b516dc5d04e63eda5a413a6a15f72 WatchSource:0}: Error finding container fd8cc84a2f0107c14e207ce977604c96e03b516dc5d04e63eda5a413a6a15f72: Status 404 returned error can't find the container with id fd8cc84a2f0107c14e207ce977604c96e03b516dc5d04e63eda5a413a6a15f72 Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.875425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"85d128b6-18c3-4018-a74a-86b025e2196d","Type":"ContainerStarted","Data":"dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69"} Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.875467 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.875958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"85d128b6-18c3-4018-a74a-86b025e2196d","Type":"ContainerStarted","Data":"fd8cc84a2f0107c14e207ce977604c96e03b516dc5d04e63eda5a413a6a15f72"} Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.891842 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="c690e3bc-184a-4844-8556-661cd5c2c69c" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" Jan 21 16:12:19 crc kubenswrapper[4707]: I0121 16:12:19.918676 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.918655534 podStartE2EDuration="1.918655534s" podCreationTimestamp="2026-01-21 16:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:19.888729905 +0000 UTC m=+4237.070246127" watchObservedRunningTime="2026-01-21 16:12:19.918655534 +0000 UTC m=+4237.100171756" Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.182623 4707 scope.go:117] "RemoveContainer" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" Jan 21 16:12:20 crc kubenswrapper[4707]: E0121 16:12:20.182912 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.532379 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.542737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.618376 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-68f4b84bbb-5wq8w"] Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.618626 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-log" containerID="cri-o://6a2974206b54c7b9a04ddf70f0fd4dfbd63679099f8248cfc05fbbcb49b95ea3" gracePeriod=30 Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.619103 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-api" containerID="cri-o://b61fcf03feb13351b8d028376c265c54d145d49154a87773bc65d349bf33430f" gracePeriod=30 Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.885661 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerID="6a2974206b54c7b9a04ddf70f0fd4dfbd63679099f8248cfc05fbbcb49b95ea3" exitCode=143 Jan 21 16:12:20 crc kubenswrapper[4707]: I0121 16:12:20.885762 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" event={"ID":"3ebbb7c6-3987-4c0b-826f-10b691ca1e39","Type":"ContainerDied","Data":"6a2974206b54c7b9a04ddf70f0fd4dfbd63679099f8248cfc05fbbcb49b95ea3"} Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.584646 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.682254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.682312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.682381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-combined-ca-bundle\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.683454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-config-data\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.683532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfl5t\" (UniqueName: \"kubernetes.io/projected/7d6ee277-b629-49a0-94da-030a39be02f5-kube-api-access-nfl5t\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.683728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-credential-keys\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.683833 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-fernet-keys\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.683975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-scripts\") pod \"7d6ee277-b629-49a0-94da-030a39be02f5\" (UID: \"7d6ee277-b629-49a0-94da-030a39be02f5\") " Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.690191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d6ee277-b629-49a0-94da-030a39be02f5-kube-api-access-nfl5t" (OuterVolumeSpecName: "kube-api-access-nfl5t") pod 
"7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "kube-api-access-nfl5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.690924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.692309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.693474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-scripts" (OuterVolumeSpecName: "scripts") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.720211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-config-data" (OuterVolumeSpecName: "config-data") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.735597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.739307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.745138 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.747851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.747877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d6ee277-b629-49a0-94da-030a39be02f5" (UID: "7d6ee277-b629-49a0-94da-030a39be02f5"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787557 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787605 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787621 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787633 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787645 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfl5t\" (UniqueName: \"kubernetes.io/projected/7d6ee277-b629-49a0-94da-030a39be02f5-kube-api-access-nfl5t\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787662 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787673 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.787683 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6ee277-b629-49a0-94da-030a39be02f5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.841379 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-cff485676-6h5mm"] Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.841607 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api-log" containerID="cri-o://67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c" gracePeriod=30 Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.842109 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api" containerID="cri-o://059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d" gracePeriod=30 Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.901716 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d6ee277-b629-49a0-94da-030a39be02f5" containerID="a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c" exitCode=0 Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.901863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" event={"ID":"7d6ee277-b629-49a0-94da-030a39be02f5","Type":"ContainerDied","Data":"a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c"} Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.901910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" event={"ID":"7d6ee277-b629-49a0-94da-030a39be02f5","Type":"ContainerDied","Data":"50e0c9ee3558493db0c817a196fe3806b8dee17eb4e53a84c7295e16f6adec04"} Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.901930 4707 scope.go:117] "RemoveContainer" containerID="a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.902113 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.933958 4707 scope.go:117] "RemoveContainer" containerID="a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c" Jan 21 16:12:21 crc kubenswrapper[4707]: E0121 16:12:21.937073 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c\": container with ID starting with a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c not found: ID does not exist" containerID="a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.937127 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c"} err="failed to get container status \"a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c\": rpc error: code = NotFound desc = could not find container \"a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c\": container with ID starting with a76e93007c836bd51716c6e2c9b4d39549c8bf6b32a9c1dd2f1ebd60f81efc3c not found: ID does not exist" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.938024 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p"] Jan 21 16:12:21 crc kubenswrapper[4707]: E0121 16:12:21.947935 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0812fb22_492c_4b73_9d75_3cfaf898c043.slice/crio-67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:12:21 crc kubenswrapper[4707]: I0121 16:12:21.956731 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6c79bb4b5d-xfj6p"] Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.777955 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.778254 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.813142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.814132 
4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.911616 4707 generic.go:334] "Generic (PLEG): container finished" podID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerID="67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c" exitCode=143 Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.911863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" event={"ID":"0812fb22-492c-4b73-9d75-3cfaf898c043","Type":"ContainerDied","Data":"67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c"} Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.911976 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:22 crc kubenswrapper[4707]: I0121 16:12:22.911998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.193329 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d6ee277-b629-49a0-94da-030a39be02f5" path="/var/lib/kubelet/pods/7d6ee277-b629-49a0-94da-030a39be02f5/volumes" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.217682 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.217772 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.247392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.263013 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.265892 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.277734 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.277797 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.301611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.366933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.366975 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.382662 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n"] Jan 21 16:12:23 crc kubenswrapper[4707]: E0121 16:12:23.383203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d6ee277-b629-49a0-94da-030a39be02f5" containerName="keystone-api" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.383225 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6ee277-b629-49a0-94da-030a39be02f5" containerName="keystone-api" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.383430 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d6ee277-b629-49a0-94da-030a39be02f5" containerName="keystone-api" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.388071 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.393774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n"] Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.527259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-log-httpd\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.527318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-internal-tls-certs\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.527672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-run-httpd\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.527853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2gw\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-kube-api-access-nw2gw\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.528051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-etc-swift\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.528340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-public-tls-certs\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.528521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-combined-ca-bundle\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.528583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-config-data\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.630907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-public-tls-certs\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.630964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-combined-ca-bundle\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.630995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-config-data\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.631539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-log-httpd\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.631571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-internal-tls-certs\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.631613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-run-httpd\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.631639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2gw\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-kube-api-access-nw2gw\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.631668 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-etc-swift\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.632319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-run-httpd\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.632875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-log-httpd\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.638511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-public-tls-certs\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.639004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-etc-swift\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.640465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-combined-ca-bundle\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.643594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-config-data\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.650445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2gw\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-kube-api-access-nw2gw\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.656203 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-internal-tls-certs\") pod \"swift-proxy-7c98dd475b-csd2n\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.706707 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.922110 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerID="b61fcf03feb13351b8d028376c265c54d145d49154a87773bc65d349bf33430f" exitCode=0 Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.922197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" event={"ID":"3ebbb7c6-3987-4c0b-826f-10b691ca1e39","Type":"ContainerDied","Data":"b61fcf03feb13351b8d028376c265c54d145d49154a87773bc65d349bf33430f"} Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.924267 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.924315 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:23 crc kubenswrapper[4707]: I0121 16:12:23.947326 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.187972 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.196258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n"] Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.289947 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.70:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.290030 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.70:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.358907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-config-data\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-logs\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6g48\" (UniqueName: \"kubernetes.io/projected/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-kube-api-access-k6g48\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-internal-tls-certs\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-combined-ca-bundle\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-scripts\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-public-tls-certs\") pod \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\" (UID: \"3ebbb7c6-3987-4c0b-826f-10b691ca1e39\") " Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.359770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-logs" (OuterVolumeSpecName: "logs") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.360899 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.373726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-scripts" (OuterVolumeSpecName: "scripts") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.382770 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.382960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-kube-api-access-k6g48" (OuterVolumeSpecName: "kube-api-access-k6g48") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "kube-api-access-k6g48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.384089 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.91:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.384293 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.91:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.427954 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.428278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-central-agent" containerID="cri-o://64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" gracePeriod=30 Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.428744 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="proxy-httpd" containerID="cri-o://cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" gracePeriod=30 Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.428798 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="sg-core" containerID="cri-o://d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" gracePeriod=30 Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.428861 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-notification-agent" containerID="cri-o://48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" gracePeriod=30 Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.479629 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6g48\" (UniqueName: \"kubernetes.io/projected/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-kube-api-access-k6g48\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.479659 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.672088 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.711176 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.753083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.802843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-config-data" (OuterVolumeSpecName: "config-data") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.813079 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.813113 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.827518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3ebbb7c6-3987-4c0b-826f-10b691ca1e39" (UID: "3ebbb7c6-3987-4c0b-826f-10b691ca1e39"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.915072 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebbb7c6-3987-4c0b-826f-10b691ca1e39-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.946434 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerID="d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" exitCode=2 Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.946518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerDied","Data":"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459"} Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.948397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" event={"ID":"81457b6e-3af3-4da8-9f83-0e896795cb2c","Type":"ContainerStarted","Data":"d9b5938294531b6e015f18518c545afd2fc7bc61634c8904c5434b33da8fd969"} Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.948436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" event={"ID":"81457b6e-3af3-4da8-9f83-0e896795cb2c","Type":"ContainerStarted","Data":"770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa"} Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.950401 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.952872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-68f4b84bbb-5wq8w" event={"ID":"3ebbb7c6-3987-4c0b-826f-10b691ca1e39","Type":"ContainerDied","Data":"5555e7e7ff61cd4ecf8f90c0d6e8f78e60b9de611226fd6c5f388f89e16bec2d"} Jan 21 16:12:24 crc kubenswrapper[4707]: I0121 16:12:24.952923 4707 scope.go:117] "RemoveContainer" containerID="b61fcf03feb13351b8d028376c265c54d145d49154a87773bc65d349bf33430f" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.002966 4707 scope.go:117] "RemoveContainer" containerID="6a2974206b54c7b9a04ddf70f0fd4dfbd63679099f8248cfc05fbbcb49b95ea3" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.010953 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-68f4b84bbb-5wq8w"] Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.036330 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-68f4b84bbb-5wq8w"] Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.091547 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.244:9311/healthcheck\": read tcp 10.217.0.2:54230->10.217.1.244:9311: read: connection reset by peer" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.091767 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.244:9311/healthcheck\": read tcp 10.217.0.2:54224->10.217.1.244:9311: 
read: connection reset by peer" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.182928 4707 scope.go:117] "RemoveContainer" containerID="7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.195027 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" path="/var/lib/kubelet/pods/3ebbb7c6-3987-4c0b-826f-10b691ca1e39/volumes" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.357608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.357711 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.362056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.522276 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0812fb22-492c-4b73-9d75-3cfaf898c043-logs\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-combined-ca-bundle\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-internal-tls-certs\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data-custom\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-public-tls-certs\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.639672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d7xs\" (UniqueName: 
\"kubernetes.io/projected/0812fb22-492c-4b73-9d75-3cfaf898c043-kube-api-access-2d7xs\") pod \"0812fb22-492c-4b73-9d75-3cfaf898c043\" (UID: \"0812fb22-492c-4b73-9d75-3cfaf898c043\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.643883 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0812fb22-492c-4b73-9d75-3cfaf898c043-logs" (OuterVolumeSpecName: "logs") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.663653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0812fb22-492c-4b73-9d75-3cfaf898c043-kube-api-access-2d7xs" (OuterVolumeSpecName: "kube-api-access-2d7xs") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "kube-api-access-2d7xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.664625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.728170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.743279 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0812fb22-492c-4b73-9d75-3cfaf898c043-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.743302 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.743315 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.743327 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d7xs\" (UniqueName: \"kubernetes.io/projected/0812fb22-492c-4b73-9d75-3cfaf898c043-kube-api-access-2d7xs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.776024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.790898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data" (OuterVolumeSpecName: "config-data") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.798954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0812fb22-492c-4b73-9d75-3cfaf898c043" (UID: "0812fb22-492c-4b73-9d75-3cfaf898c043"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.844260 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.844288 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.844299 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0812fb22-492c-4b73-9d75-3cfaf898c043-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.847254 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.945506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-ceilometer-tls-certs\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.945896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9scm\" (UniqueName: \"kubernetes.io/projected/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-kube-api-access-q9scm\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.945959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-config-data\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.945991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-log-httpd\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.946009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-sg-core-conf-yaml\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.946124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-scripts\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.946219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-run-httpd\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.946242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-combined-ca-bundle\") pod \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\" (UID: \"2b9e4dcd-b1d8-40a8-92c6-8c4030484599\") " Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.956475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-kube-api-access-q9scm" (OuterVolumeSpecName: "kube-api-access-q9scm") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "kube-api-access-q9scm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.958694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.959059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.961086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-scripts" (OuterVolumeSpecName: "scripts") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.983682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerStarted","Data":"ffe9e5471773aa44d24bb089adcede94c9a0a7c07e343fc71a5076c836a67866"} Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.985011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.991489 4707 generic.go:334] "Generic (PLEG): container finished" podID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerID="059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d" exitCode=0 Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.991589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" event={"ID":"0812fb22-492c-4b73-9d75-3cfaf898c043","Type":"ContainerDied","Data":"059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d"} Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.991641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" event={"ID":"0812fb22-492c-4b73-9d75-3cfaf898c043","Type":"ContainerDied","Data":"e373c970f1ba6bf80bff3e56699ad92d9b88c2971726527d8812c4b7e3323a0d"} Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.991668 4707 scope.go:117] "RemoveContainer" containerID="059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d" Jan 21 16:12:25 crc kubenswrapper[4707]: I0121 16:12:25.991902 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-cff485676-6h5mm" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.013846 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerID="cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" exitCode=0 Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.013875 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerID="48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" exitCode=0 Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.013884 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerID="64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" exitCode=0 Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.013968 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.014444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerDied","Data":"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83"} Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.014476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerDied","Data":"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c"} Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.014489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerDied","Data":"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3"} Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.014497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2b9e4dcd-b1d8-40a8-92c6-8c4030484599","Type":"ContainerDied","Data":"2921dc59f4d65fee2b79c41825972f23829cc2d8c33303ffcf285877197e5610"} Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.017073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.017958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" event={"ID":"81457b6e-3af3-4da8-9f83-0e896795cb2c","Type":"ContainerStarted","Data":"6a3e0961eaad6a849ce19556e597cc407293421db0ba0b4457b3b4372a4b61d6"} Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.018323 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.018418 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.018954 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.018974 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.041526 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" podStartSLOduration=3.041514731 podStartE2EDuration="3.041514731s" podCreationTimestamp="2026-01-21 16:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:12:26.031033905 +0000 UTC m=+4243.212550127" watchObservedRunningTime="2026-01-21 16:12:26.041514731 +0000 UTC m=+4243.223030944" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.044543 4707 scope.go:117] "RemoveContainer" containerID="67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.048424 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.048449 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.048460 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.048470 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9scm\" (UniqueName: \"kubernetes.io/projected/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-kube-api-access-q9scm\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.048481 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.048490 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.077474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.081327 4707 scope.go:117] "RemoveContainer" containerID="059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.081645 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d\": container with ID starting with 059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d not found: ID does not exist" containerID="059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.081670 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d"} err="failed to get container status \"059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d\": rpc error: code = NotFound desc = could not find container \"059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d\": container with ID starting with 059f58bf1bdd0a92b75625cecb18a0e8cb28576f79bc7cacfcbb3bbc41d6e66d not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.081703 4707 scope.go:117] "RemoveContainer" containerID="67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.081968 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c\": container with ID starting with 67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c not found: ID does not exist" containerID="67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.081989 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c"} err="failed to get container status \"67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c\": rpc error: code = NotFound desc = could not find container \"67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c\": container with ID starting with 67ecc0e00107e2b8de846c92787a0b015f6d411192a585c53d5d963bab393a5c not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.082004 4707 scope.go:117] "RemoveContainer" containerID="cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.091480 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-cff485676-6h5mm"] Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.098840 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-cff485676-6h5mm"] Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.101235 4707 scope.go:117] "RemoveContainer" containerID="d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.114999 
4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-config-data" (OuterVolumeSpecName: "config-data") pod "2b9e4dcd-b1d8-40a8-92c6-8c4030484599" (UID: "2b9e4dcd-b1d8-40a8-92c6-8c4030484599"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.150080 4707 scope.go:117] "RemoveContainer" containerID="48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.153348 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.153373 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9e4dcd-b1d8-40a8-92c6-8c4030484599-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.187369 4707 scope.go:117] "RemoveContainer" containerID="64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.208116 4707 scope.go:117] "RemoveContainer" containerID="cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.208533 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": container with ID starting with cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83 not found: ID does not exist" containerID="cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.208560 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83"} err="failed to get container status \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": rpc error: code = NotFound desc = could not find container \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": container with ID starting with cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.208576 4707 scope.go:117] "RemoveContainer" containerID="d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.209294 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": container with ID starting with d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459 not found: ID does not exist" containerID="d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.209320 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459"} err="failed to get container status \"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": rpc error: code = NotFound desc = could not find container 
\"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": container with ID starting with d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.209335 4707 scope.go:117] "RemoveContainer" containerID="48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.209910 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": container with ID starting with 48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c not found: ID does not exist" containerID="48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.209931 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c"} err="failed to get container status \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": rpc error: code = NotFound desc = could not find container \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": container with ID starting with 48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.209946 4707 scope.go:117] "RemoveContainer" containerID="64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.210152 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": container with ID starting with 64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3 not found: ID does not exist" containerID="64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.210186 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3"} err="failed to get container status \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": rpc error: code = NotFound desc = could not find container \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": container with ID starting with 64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.210198 4707 scope.go:117] "RemoveContainer" containerID="cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.211682 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83"} err="failed to get container status \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": rpc error: code = NotFound desc = could not find container \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": container with ID starting with cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.211759 4707 scope.go:117] "RemoveContainer" 
containerID="d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.212466 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459"} err="failed to get container status \"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": rpc error: code = NotFound desc = could not find container \"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": container with ID starting with d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.212489 4707 scope.go:117] "RemoveContainer" containerID="48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.212689 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c"} err="failed to get container status \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": rpc error: code = NotFound desc = could not find container \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": container with ID starting with 48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.212726 4707 scope.go:117] "RemoveContainer" containerID="64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213004 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3"} err="failed to get container status \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": rpc error: code = NotFound desc = could not find container \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": container with ID starting with 64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213042 4707 scope.go:117] "RemoveContainer" containerID="cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213249 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83"} err="failed to get container status \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": rpc error: code = NotFound desc = could not find container \"cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83\": container with ID starting with cd0d78ff9415827f47c8aba7fef964d750344e46a59eb661ecf8403f68542d83 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213268 4707 scope.go:117] "RemoveContainer" containerID="d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213469 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459"} err="failed to get container status \"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": rpc error: code = NotFound desc = could not find 
container \"d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459\": container with ID starting with d76cbe90f2c03f3d2656ff084bfc5801dabae94d8d31b9d1e544fe46ba2bc459 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213487 4707 scope.go:117] "RemoveContainer" containerID="48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213654 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c"} err="failed to get container status \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": rpc error: code = NotFound desc = could not find container \"48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c\": container with ID starting with 48fab01d558c719ea4ffecc819b219c0d4fa656b30dbc6680d43dbaf7f715d6c not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213692 4707 scope.go:117] "RemoveContainer" containerID="64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.213877 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3"} err="failed to get container status \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": rpc error: code = NotFound desc = could not find container \"64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3\": container with ID starting with 64c2faf9f5fa7ff2d9fe94a9ca04f3e404e9c8a52db3e70ea4ee235b5a8125b3 not found: ID does not exist" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.349555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.357345 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.375637 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376094 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-api" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376115 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-api" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376129 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="sg-core" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376136 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="sg-core" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376156 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-central-agent" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376173 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-central-agent" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376191 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-notification-agent" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376196 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-notification-agent" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376210 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="proxy-httpd" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="proxy-httpd" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376226 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376232 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376247 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-log" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376351 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-log" Jan 21 16:12:26 crc kubenswrapper[4707]: E0121 16:12:26.376361 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api-log" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.376367 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api-log" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385396 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-central-agent" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385433 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-log" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385458 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="proxy-httpd" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385473 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="sg-core" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385488 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385497 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" containerName="barbican-api-log" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385504 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebbb7c6-3987-4c0b-826f-10b691ca1e39" containerName="placement-api" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.385518 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" containerName="ceilometer-notification-agent" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 
16:12:26.387480 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.392089 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.392307 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.392435 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.392722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.459368 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-scripts\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-run-httpd\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-config-data\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgrc\" (UniqueName: \"kubernetes.io/projected/5595e246-81d0-4a4e-9a94-b131456c089e-kube-api-access-wcgrc\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464722 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-log-httpd\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.464782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-log-httpd\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-scripts\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-run-httpd\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566543 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-config-data\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.566677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgrc\" (UniqueName: \"kubernetes.io/projected/5595e246-81d0-4a4e-9a94-b131456c089e-kube-api-access-wcgrc\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc 
kubenswrapper[4707]: I0121 16:12:26.566973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-log-httpd\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.567052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-run-httpd\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.630369 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.847528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.847913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-scripts\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.848294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.848529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-config-data\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.849146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgrc\" (UniqueName: \"kubernetes.io/projected/5595e246-81d0-4a4e-9a94-b131456c089e-kube-api-access-wcgrc\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:26 crc kubenswrapper[4707]: I0121 16:12:26.849205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:27 crc kubenswrapper[4707]: I0121 16:12:27.013590 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:27 crc kubenswrapper[4707]: I0121 16:12:27.192075 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0812fb22-492c-4b73-9d75-3cfaf898c043" path="/var/lib/kubelet/pods/0812fb22-492c-4b73-9d75-3cfaf898c043/volumes" Jan 21 16:12:27 crc kubenswrapper[4707]: I0121 16:12:27.193132 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9e4dcd-b1d8-40a8-92c6-8c4030484599" path="/var/lib/kubelet/pods/2b9e4dcd-b1d8-40a8-92c6-8c4030484599/volumes" Jan 21 16:12:27 crc kubenswrapper[4707]: I0121 16:12:27.449379 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:27 crc kubenswrapper[4707]: I0121 16:12:27.667435 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:28 crc kubenswrapper[4707]: I0121 16:12:28.047564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerStarted","Data":"a9fe0b8c70dbc2aef85b10b9efb5e027be6fe363f7fa5cf209011e520cbeb42e"} Jan 21 16:12:29 crc kubenswrapper[4707]: I0121 16:12:29.055522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerStarted","Data":"54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1"} Jan 21 16:12:30 crc kubenswrapper[4707]: I0121 16:12:30.076791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerStarted","Data":"0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e"} Jan 21 16:12:30 crc kubenswrapper[4707]: I0121 16:12:30.535393 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:30 crc kubenswrapper[4707]: I0121 16:12:30.535709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:30 crc kubenswrapper[4707]: I0121 16:12:30.564336 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:31 crc kubenswrapper[4707]: I0121 16:12:31.092243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerStarted","Data":"dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0"} Jan 21 16:12:31 crc kubenswrapper[4707]: I0121 16:12:31.120483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.107410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerStarted","Data":"7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6"} Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.108130 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.107947 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" 
podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="proxy-httpd" containerID="cri-o://7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6" gracePeriod=30 Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.107963 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="sg-core" containerID="cri-o://dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0" gracePeriod=30 Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.107978 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-notification-agent" containerID="cri-o://0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e" gracePeriod=30 Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.107647 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-central-agent" containerID="cri-o://54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1" gracePeriod=30 Jan 21 16:12:32 crc kubenswrapper[4707]: I0121 16:12:32.128678 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.90683626 podStartE2EDuration="6.128658194s" podCreationTimestamp="2026-01-21 16:12:26 +0000 UTC" firstStartedPulling="2026-01-21 16:12:27.455574302 +0000 UTC m=+4244.637090525" lastFinishedPulling="2026-01-21 16:12:31.677396238 +0000 UTC m=+4248.858912459" observedRunningTime="2026-01-21 16:12:32.123416677 +0000 UTC m=+4249.304932900" watchObservedRunningTime="2026-01-21 16:12:32.128658194 +0000 UTC m=+4249.310174416" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.119909 4707 generic.go:334] "Generic (PLEG): container finished" podID="5595e246-81d0-4a4e-9a94-b131456c089e" containerID="7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6" exitCode=0 Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.120255 4707 generic.go:334] "Generic (PLEG): container finished" podID="5595e246-81d0-4a4e-9a94-b131456c089e" containerID="dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0" exitCode=2 Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.120266 4707 generic.go:334] "Generic (PLEG): container finished" podID="5595e246-81d0-4a4e-9a94-b131456c089e" containerID="0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e" exitCode=0 Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.119984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerDied","Data":"7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6"} Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.120308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerDied","Data":"dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0"} Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.120322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerDied","Data":"0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e"} Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.188432 4707 scope.go:117] "RemoveContainer" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" Jan 21 16:12:33 crc kubenswrapper[4707]: E0121 16:12:33.188797 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.281803 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.291079 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.291521 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.384882 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.385562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.387483 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.389704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.711429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.712642 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.787402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7"] Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.787645 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-httpd" containerID="cri-o://4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685" gracePeriod=30 Jan 21 16:12:33 crc kubenswrapper[4707]: I0121 16:12:33.787833 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-server" containerID="cri-o://f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6" gracePeriod=30 Jan 21 16:12:34 crc kubenswrapper[4707]: I0121 16:12:34.139455 4707 generic.go:334] "Generic (PLEG): container finished" podID="693d76e3-e5ce-4267-8646-42944a6d6678" 
containerID="4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685" exitCode=0 Jan 21 16:12:34 crc kubenswrapper[4707]: I0121 16:12:34.139607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" event={"ID":"693d76e3-e5ce-4267-8646-42944a6d6678","Type":"ContainerDied","Data":"4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685"} Jan 21 16:12:34 crc kubenswrapper[4707]: I0121 16:12:34.140618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:34 crc kubenswrapper[4707]: I0121 16:12:34.146572 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:12:34 crc kubenswrapper[4707]: I0121 16:12:34.147415 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:12:34 crc kubenswrapper[4707]: I0121 16:12:34.978699 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.051748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-combined-ca-bundle\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v8vx\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-kube-api-access-5v8vx\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-public-tls-certs\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-etc-swift\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-internal-tls-certs\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-log-httpd\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-run-httpd\") pod 
\"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.052431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-config-data\") pod \"693d76e3-e5ce-4267-8646-42944a6d6678\" (UID: \"693d76e3-e5ce-4267-8646-42944a6d6678\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.054460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.054981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.058748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-kube-api-access-5v8vx" (OuterVolumeSpecName: "kube-api-access-5v8vx") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "kube-api-access-5v8vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.061972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.101562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.114103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.117242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.121830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-config-data" (OuterVolumeSpecName: "config-data") pod "693d76e3-e5ce-4267-8646-42944a6d6678" (UID: "693d76e3-e5ce-4267-8646-42944a6d6678"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.149710 4707 generic.go:334] "Generic (PLEG): container finished" podID="693d76e3-e5ce-4267-8646-42944a6d6678" containerID="f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6" exitCode=0 Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.149778 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.149819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" event={"ID":"693d76e3-e5ce-4267-8646-42944a6d6678","Type":"ContainerDied","Data":"f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6"} Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.149867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7" event={"ID":"693d76e3-e5ce-4267-8646-42944a6d6678","Type":"ContainerDied","Data":"c95b41494dc6c8b09e027126e7e51a3fe6b8a30ebad223ae4cf9aaefbd0411c4"} Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.149891 4707 scope.go:117] "RemoveContainer" containerID="f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156209 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156234 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156253 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v8vx\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-kube-api-access-5v8vx\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156264 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156273 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/693d76e3-e5ce-4267-8646-42944a6d6678-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156282 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/693d76e3-e5ce-4267-8646-42944a6d6678-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156291 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.156298 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/693d76e3-e5ce-4267-8646-42944a6d6678-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.174645 4707 scope.go:117] "RemoveContainer" containerID="4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.194995 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7"] Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.195031 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5f557d5d74-bnmf7"] Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.200339 4707 scope.go:117] "RemoveContainer" containerID="f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6" Jan 21 16:12:35 crc kubenswrapper[4707]: E0121 16:12:35.200877 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6\": container with ID starting with f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6 not found: ID does not exist" containerID="f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.200936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6"} err="failed to get container status \"f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6\": rpc error: code = NotFound desc = could not find container \"f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6\": container with ID starting with f755dfef53bd10f0e0fcc10de7c4edc7f325c71f1ba764a2da44fc151f6f16b6 not found: ID does not exist" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.200960 4707 scope.go:117] "RemoveContainer" containerID="4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685" Jan 21 16:12:35 crc kubenswrapper[4707]: E0121 16:12:35.201335 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685\": container with ID starting with 4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685 not found: ID does not exist" containerID="4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.201357 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685"} err="failed to get container status \"4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685\": rpc error: code = NotFound desc = could not find container \"4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685\": container with ID starting with 4f74da52d76e45c99d172a8dfeabc259bbc501a27c12db5a2da17e8591b27685 not found: ID does not exist" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.550882 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.668886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-sg-core-conf-yaml\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.668971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcgrc\" (UniqueName: \"kubernetes.io/projected/5595e246-81d0-4a4e-9a94-b131456c089e-kube-api-access-wcgrc\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-scripts\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-config-data\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-run-httpd\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-combined-ca-bundle\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-ceilometer-tls-certs\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.669980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-log-httpd\") pod \"5595e246-81d0-4a4e-9a94-b131456c089e\" (UID: \"5595e246-81d0-4a4e-9a94-b131456c089e\") " Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.670420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.670887 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.671048 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5595e246-81d0-4a4e-9a94-b131456c089e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.675961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5595e246-81d0-4a4e-9a94-b131456c089e-kube-api-access-wcgrc" (OuterVolumeSpecName: "kube-api-access-wcgrc") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "kube-api-access-wcgrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.677153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-scripts" (OuterVolumeSpecName: "scripts") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.697069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.711237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.728755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.745000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-config-data" (OuterVolumeSpecName: "config-data") pod "5595e246-81d0-4a4e-9a94-b131456c089e" (UID: "5595e246-81d0-4a4e-9a94-b131456c089e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.773436 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.773466 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.773479 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.773490 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcgrc\" (UniqueName: \"kubernetes.io/projected/5595e246-81d0-4a4e-9a94-b131456c089e-kube-api-access-wcgrc\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.773502 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:35 crc kubenswrapper[4707]: I0121 16:12:35.773511 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595e246-81d0-4a4e-9a94-b131456c089e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.159386 4707 generic.go:334] "Generic (PLEG): container finished" podID="5595e246-81d0-4a4e-9a94-b131456c089e" containerID="54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1" exitCode=0 Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.159447 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.159448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerDied","Data":"54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1"} Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.159568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5595e246-81d0-4a4e-9a94-b131456c089e","Type":"ContainerDied","Data":"a9fe0b8c70dbc2aef85b10b9efb5e027be6fe363f7fa5cf209011e520cbeb42e"} Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.159588 4707 scope.go:117] "RemoveContainer" containerID="7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.181015 4707 scope.go:117] "RemoveContainer" containerID="dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.187953 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.193489 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.199619 4707 scope.go:117] "RemoveContainer" containerID="0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.210633 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.211129 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="sg-core" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211149 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="sg-core" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.211169 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-httpd" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211175 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-httpd" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.211190 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="proxy-httpd" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211196 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="proxy-httpd" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.211206 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-server" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211213 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-server" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.211236 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-notification-agent" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211242 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-notification-agent" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.211250 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-central-agent" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211255 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-central-agent" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211431 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-notification-agent" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211447 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-server" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211455 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" containerName="proxy-httpd" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211465 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="proxy-httpd" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211476 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="ceilometer-central-agent" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.211536 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" containerName="sg-core" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.213415 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.217007 4707 scope.go:117] "RemoveContainer" containerID="54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.217833 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.217906 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.218114 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.230014 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.243416 4707 scope.go:117] "RemoveContainer" containerID="7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.244387 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6\": container with ID starting with 7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6 not found: ID does not exist" containerID="7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.245042 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6"} err="failed to get container status \"7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6\": rpc error: code = NotFound desc = could not find container \"7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6\": container with ID starting with 7c323f76caf00e728ee703ee8643ffe7dec470ed8ebadd8cc1212e6b989179f6 not found: ID does not exist" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.245462 4707 scope.go:117] "RemoveContainer" containerID="dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.248576 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0\": container with ID starting with dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0 not found: ID does not exist" containerID="dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.248797 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0"} err="failed to get container status \"dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0\": rpc error: code = NotFound desc = could not find container \"dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0\": container with ID starting with dd410c5268febeb6c44518b70389acb6f816dac9902793d6c02bc47b8296b4a0 not found: ID does not exist" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.249081 4707 scope.go:117] "RemoveContainer" 
containerID="0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.249939 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e\": container with ID starting with 0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e not found: ID does not exist" containerID="0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.250292 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e"} err="failed to get container status \"0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e\": rpc error: code = NotFound desc = could not find container \"0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e\": container with ID starting with 0b7ad61b31f14158578390c28db92847d3d037e64fb597d0a0e0ccffa4121d1e not found: ID does not exist" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.250431 4707 scope.go:117] "RemoveContainer" containerID="54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1" Jan 21 16:12:36 crc kubenswrapper[4707]: E0121 16:12:36.252836 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1\": container with ID starting with 54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1 not found: ID does not exist" containerID="54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.252884 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1"} err="failed to get container status \"54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1\": rpc error: code = NotFound desc = could not find container \"54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1\": container with ID starting with 54e3e70967e68a8c397bc07485c2592a520f481b8d4f97f382cece7cec1ff6e1 not found: ID does not exist" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.281453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.281706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.281946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc 
kubenswrapper[4707]: I0121 16:12:36.282042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbvh\" (UniqueName: \"kubernetes.io/projected/f6195e76-f5d4-4062-99a0-1597e6e85ba0-kube-api-access-xcbvh\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.282260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-scripts\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.282424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-config-data\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.282905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.283022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-scripts\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-config-data\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.384751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbvh\" (UniqueName: \"kubernetes.io/projected/f6195e76-f5d4-4062-99a0-1597e6e85ba0-kube-api-access-xcbvh\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.385301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.385696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.448278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-scripts\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.448522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.449108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.449506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.449688 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbvh\" (UniqueName: \"kubernetes.io/projected/f6195e76-f5d4-4062-99a0-1597e6e85ba0-kube-api-access-xcbvh\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.449905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-config-data\") pod \"ceilometer-0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.532337 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:36 crc kubenswrapper[4707]: I0121 16:12:36.933113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:12:37 crc kubenswrapper[4707]: I0121 16:12:37.170662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerStarted","Data":"7fa872b809cb716fcbd369a8602cc94d33fd5cd7c7a3eb40e9470678df9b8f0d"} Jan 21 16:12:37 crc kubenswrapper[4707]: I0121 16:12:37.190361 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5595e246-81d0-4a4e-9a94-b131456c089e" path="/var/lib/kubelet/pods/5595e246-81d0-4a4e-9a94-b131456c089e/volumes" Jan 21 16:12:37 crc kubenswrapper[4707]: I0121 16:12:37.191263 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693d76e3-e5ce-4267-8646-42944a6d6678" path="/var/lib/kubelet/pods/693d76e3-e5ce-4267-8646-42944a6d6678/volumes" Jan 21 16:12:39 crc kubenswrapper[4707]: I0121 16:12:39.191965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerStarted","Data":"c775c3ced09d3af1afb4330ece26d7b04215632d1b6e25a92627d6902a9f8f3e"} Jan 21 16:12:39 crc kubenswrapper[4707]: I0121 16:12:39.192475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerStarted","Data":"669bb6d5318148cc1ac21fae845cda6d0b6d3e81db0f375ed6ebf15b8e802dd4"} Jan 21 16:12:39 crc kubenswrapper[4707]: I0121 16:12:39.370367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:12:39 crc kubenswrapper[4707]: I0121 16:12:39.432236 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6799769478-qpd8z"] Jan 21 16:12:39 crc kubenswrapper[4707]: I0121 16:12:39.432455 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-api" containerID="cri-o://dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822" gracePeriod=30 Jan 21 16:12:39 crc kubenswrapper[4707]: I0121 16:12:39.432568 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-httpd" containerID="cri-o://e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7" gracePeriod=30 Jan 21 16:12:40 crc 
kubenswrapper[4707]: I0121 16:12:40.201355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerStarted","Data":"778167efb85219d760ca2bee2211cb50dd31b9555451c93aea631b587f5c4e24"} Jan 21 16:12:40 crc kubenswrapper[4707]: I0121 16:12:40.202777 4707 generic.go:334] "Generic (PLEG): container finished" podID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerID="e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7" exitCode=0 Jan 21 16:12:40 crc kubenswrapper[4707]: I0121 16:12:40.202803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" event={"ID":"88131bb3-fc80-40c1-a068-ae5dfec982f9","Type":"ContainerDied","Data":"e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7"} Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.107801 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podde7fba6f-a3f2-4a62-a094-056e0cccd7fa"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podde7fba6f-a3f2-4a62-a094-056e0cccd7fa] : Timed out while waiting for systemd to remove kubepods-besteffort-podde7fba6f_a3f2_4a62_a094_056e0cccd7fa.slice" Jan 21 16:12:42 crc kubenswrapper[4707]: E0121 16:12:42.108459 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podde7fba6f-a3f2-4a62-a094-056e0cccd7fa] : unable to destroy cgroup paths for cgroup [kubepods besteffort podde7fba6f-a3f2-4a62-a094-056e0cccd7fa] : Timed out while waiting for systemd to remove kubepods-besteffort-podde7fba6f_a3f2_4a62_a094_056e0cccd7fa.slice" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.116655 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podcc201295-13b6-4e01-b791-f89da0093402"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podcc201295-13b6-4e01-b791-f89da0093402] : Timed out while waiting for systemd to remove kubepods-besteffort-podcc201295_13b6_4e01_b791_f89da0093402.slice" Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.221712 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8657895bb4-ls6tf" Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.221736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerStarted","Data":"274d838448ffed3d5e993bce9fc0532d447a3d723e068308801479939c591e55"} Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.222298 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.225670 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod0ec7c890-e349-422b-bbea-c40083057a7f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod0ec7c890-e349-422b-bbea-c40083057a7f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0ec7c890_e349_422b_bbea_c40083057a7f.slice" Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.248243 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.159970187 podStartE2EDuration="6.248204657s" podCreationTimestamp="2026-01-21 16:12:36 +0000 UTC" firstStartedPulling="2026-01-21 16:12:36.94748257 +0000 UTC m=+4254.128998793" lastFinishedPulling="2026-01-21 16:12:41.035717031 +0000 UTC m=+4258.217233263" observedRunningTime="2026-01-21 16:12:42.247150204 +0000 UTC m=+4259.428666426" watchObservedRunningTime="2026-01-21 16:12:42.248204657 +0000 UTC m=+4259.429720880" Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.282440 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8657895bb4-ls6tf"] Jan 21 16:12:42 crc kubenswrapper[4707]: I0121 16:12:42.291182 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-8657895bb4-ls6tf"] Jan 21 16:12:43 crc kubenswrapper[4707]: I0121 16:12:43.215407 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de7fba6f-a3f2-4a62-a094-056e0cccd7fa" path="/var/lib/kubelet/pods/de7fba6f-a3f2-4a62-a094-056e0cccd7fa/volumes" Jan 21 16:12:46 crc kubenswrapper[4707]: I0121 16:12:46.182285 4707 scope.go:117] "RemoveContainer" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" Jan 21 16:12:47 crc kubenswrapper[4707]: I0121 16:12:47.259120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerStarted","Data":"4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975"} Jan 21 16:12:47 crc kubenswrapper[4707]: I0121 16:12:47.260105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:12:52 crc kubenswrapper[4707]: I0121 16:12:52.610955 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:13:04 crc kubenswrapper[4707]: I0121 16:13:04.376615 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.172:9696/\": dial tcp 10.217.1.172:9696: connect: connection refused" Jan 21 16:13:06 crc kubenswrapper[4707]: I0121 16:13:06.538341 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.808281 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6799769478-qpd8z_88131bb3-fc80-40c1-a068-ae5dfec982f9/neutron-api/0.log" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.808464 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.945620 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.945676 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-internal-tls-certs\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-public-tls-certs\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb472\" (UniqueName: \"kubernetes.io/projected/88131bb3-fc80-40c1-a068-ae5dfec982f9-kube-api-access-rb472\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-config\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-httpd-config\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-combined-ca-bundle\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.951901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-ovndb-tls-certs\") pod \"88131bb3-fc80-40c1-a068-ae5dfec982f9\" (UID: \"88131bb3-fc80-40c1-a068-ae5dfec982f9\") " Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.956231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88131bb3-fc80-40c1-a068-ae5dfec982f9-kube-api-access-rb472" (OuterVolumeSpecName: "kube-api-access-rb472") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "kube-api-access-rb472". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.956242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.989633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.989937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.991695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:09 crc kubenswrapper[4707]: I0121 16:13:09.994278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-config" (OuterVolumeSpecName: "config") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.005542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "88131bb3-fc80-40c1-a068-ae5dfec982f9" (UID: "88131bb3-fc80-40c1-a068-ae5dfec982f9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053773 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053804 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053828 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053836 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053845 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053854 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb472\" (UniqueName: \"kubernetes.io/projected/88131bb3-fc80-40c1-a068-ae5dfec982f9-kube-api-access-rb472\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.053862 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/88131bb3-fc80-40c1-a068-ae5dfec982f9-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.436626 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6799769478-qpd8z_88131bb3-fc80-40c1-a068-ae5dfec982f9/neutron-api/0.log" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.436677 4707 generic.go:334] "Generic (PLEG): container finished" podID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerID="dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822" exitCode=137 Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.436705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" event={"ID":"88131bb3-fc80-40c1-a068-ae5dfec982f9","Type":"ContainerDied","Data":"dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822"} Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.436729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" event={"ID":"88131bb3-fc80-40c1-a068-ae5dfec982f9","Type":"ContainerDied","Data":"4ec2dd7ee49214f084a6ef339c999bdd54b7d49bcca7c7f133d9e4ec0332f11e"} Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.436744 4707 scope.go:117] "RemoveContainer" containerID="e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.436762 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6799769478-qpd8z" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.457882 4707 scope.go:117] "RemoveContainer" containerID="dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.467852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6799769478-qpd8z"] Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.475245 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6799769478-qpd8z"] Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.475607 4707 scope.go:117] "RemoveContainer" containerID="e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7" Jan 21 16:13:10 crc kubenswrapper[4707]: E0121 16:13:10.475992 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7\": container with ID starting with e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7 not found: ID does not exist" containerID="e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.476027 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7"} err="failed to get container status \"e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7\": rpc error: code = NotFound desc = could not find container \"e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7\": container with ID starting with e011fb794e0d50920e1f783bd5d3019b128a64923a65d5ddf4514f1e06cfe4f7 not found: ID does not exist" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.476047 4707 scope.go:117] "RemoveContainer" containerID="dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822" Jan 21 16:13:10 crc kubenswrapper[4707]: E0121 16:13:10.476309 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822\": container with ID starting with dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822 not found: ID does not exist" containerID="dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822" Jan 21 16:13:10 crc kubenswrapper[4707]: I0121 16:13:10.476325 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822"} err="failed to get container status \"dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822\": rpc error: code = NotFound desc = could not find container \"dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822\": container with ID starting with dd5860f730c7f8a33c8caa17c6749bb598a5c99d0463f923c25ca83e91a82822 not found: ID does not exist" Jan 21 16:13:11 crc kubenswrapper[4707]: I0121 16:13:11.190521 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" path="/var/lib/kubelet/pods/88131bb3-fc80-40c1-a068-ae5dfec982f9/volumes" Jan 21 16:13:17 crc kubenswrapper[4707]: I0121 16:13:17.479934 4707 scope.go:117] "RemoveContainer" containerID="531c98ba1ff359ad78152d8e44092a209ec3fab7f1115bfbad71fa1070d2ad28" Jan 21 16:13:18 crc kubenswrapper[4707]: E0121 
16:13:18.609966 4707 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.165:53378->192.168.25.165:36655: read tcp 192.168.25.165:53378->192.168.25.165:36655: read: connection reset by peer Jan 21 16:13:19 crc kubenswrapper[4707]: E0121 16:13:19.002119 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.165:53398->192.168.25.165:36655: write tcp 192.168.25.165:53398->192.168.25.165:36655: write: broken pipe Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.001408 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.140194 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-rg6q7"] Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.140547 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-httpd" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.140564 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-httpd" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.140586 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-api" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.140593 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-api" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.140783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-httpd" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.140798 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88131bb3-fc80-40c1-a068-ae5dfec982f9" containerName="neutron-api" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.141325 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.144223 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.156385 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.157529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.165352 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.171615 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.172584 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.180667 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.180735 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data podName:5d633804-5d9a-4792-b183-51e2c1a2ebe2 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:38.680712858 +0000 UTC m=+4315.862229080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data") pod "rabbitmq-cell1-server-0" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.189110 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.200304 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.207503 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-rg6q7"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.247510 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.249174 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.282795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e69885-5099-471a-8f06-ef85f63f18b7-operator-scripts\") pod \"root-account-create-update-rg6q7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.282845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw7t6\" (UniqueName: \"kubernetes.io/projected/b6e69885-5099-471a-8f06-ef85f63f18b7-kube-api-access-rw7t6\") pod \"root-account-create-update-rg6q7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.283105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgd5k\" (UniqueName: \"kubernetes.io/projected/309eb8bc-de8e-454e-8835-811c8df0ed63-kube-api-access-lgd5k\") pod \"cinder-b36a-account-create-update-7zlnl\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.283156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d5dc988-7c0d-4bef-a471-656baf486bc9-operator-scripts\") pod \"neutron-13f5-account-create-update-4gg48\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.283222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8ncz\" (UniqueName: \"kubernetes.io/projected/7d5dc988-7c0d-4bef-a471-656baf486bc9-kube-api-access-t8ncz\") pod \"neutron-13f5-account-create-update-4gg48\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.283264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/309eb8bc-de8e-454e-8835-811c8df0ed63-operator-scripts\") pod \"cinder-b36a-account-create-update-7zlnl\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.287676 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.287731 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle podName:047c06ee-71d5-4bcf-83e0-dd8aa295cdb3 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:38.787717056 +0000 UTC m=+4315.969233279 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3") : secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.289888 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.291887 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.360858 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.370765 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data-custom\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data-custom\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/153f9936-d723-4c86-a4b2-176325dc9513-logs\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgd5k\" (UniqueName: \"kubernetes.io/projected/309eb8bc-de8e-454e-8835-811c8df0ed63-kube-api-access-lgd5k\") pod \"cinder-b36a-account-create-update-7zlnl\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d5dc988-7c0d-4bef-a471-656baf486bc9-operator-scripts\") pod \"neutron-13f5-account-create-update-4gg48\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " 
pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cds9t\" (UniqueName: \"kubernetes.io/projected/153f9936-d723-4c86-a4b2-176325dc9513-kube-api-access-cds9t\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8ncz\" (UniqueName: \"kubernetes.io/projected/7d5dc988-7c0d-4bef-a471-656baf486bc9-kube-api-access-t8ncz\") pod \"neutron-13f5-account-create-update-4gg48\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/309eb8bc-de8e-454e-8835-811c8df0ed63-operator-scripts\") pod \"cinder-b36a-account-create-update-7zlnl\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-logs\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.384985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e69885-5099-471a-8f06-ef85f63f18b7-operator-scripts\") pod \"root-account-create-update-rg6q7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 
21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.385000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7t6\" (UniqueName: \"kubernetes.io/projected/b6e69885-5099-471a-8f06-ef85f63f18b7-kube-api-access-rw7t6\") pod \"root-account-create-update-rg6q7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.385021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wbpz\" (UniqueName: \"kubernetes.io/projected/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-kube-api-access-6wbpz\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.385767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d5dc988-7c0d-4bef-a471-656baf486bc9-operator-scripts\") pod \"neutron-13f5-account-create-update-4gg48\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.386371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/309eb8bc-de8e-454e-8835-811c8df0ed63-operator-scripts\") pod \"cinder-b36a-account-create-update-7zlnl\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.386796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e69885-5099-471a-8f06-ef85f63f18b7-operator-scripts\") pod \"root-account-create-update-rg6q7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.389895 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.436849 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.438597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.443349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw7t6\" (UniqueName: \"kubernetes.io/projected/b6e69885-5099-471a-8f06-ef85f63f18b7-kube-api-access-rw7t6\") pod \"root-account-create-update-rg6q7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.459955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgd5k\" (UniqueName: \"kubernetes.io/projected/309eb8bc-de8e-454e-8835-811c8df0ed63-kube-api-access-lgd5k\") pod \"cinder-b36a-account-create-update-7zlnl\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.460623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.464555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8ncz\" (UniqueName: \"kubernetes.io/projected/7d5dc988-7c0d-4bef-a471-656baf486bc9-kube-api-access-t8ncz\") pod \"neutron-13f5-account-create-update-4gg48\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.477943 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.482778 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.489299 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cds9t\" (UniqueName: \"kubernetes.io/projected/153f9936-d723-4c86-a4b2-176325dc9513-kube-api-access-cds9t\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-logs\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wbpz\" (UniqueName: \"kubernetes.io/projected/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-kube-api-access-6wbpz\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data-custom\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.497953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data-custom\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 
16:13:38.498019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/153f9936-d723-4c86-a4b2-176325dc9513-logs\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.498120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.498342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-logs\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.498556 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.498598 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle podName:153f9936-d723-4c86-a4b2-176325dc9513 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:38.998585201 +0000 UTC m=+4316.180101424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle") pod "barbican-worker-5ff6c79dfc-9wvl9" (UID: "153f9936-d723-4c86-a4b2-176325dc9513") : secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.498627 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.498668 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle podName:e9e813fb-84dc-4d38-b3b5-1be17e1848f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:38.998655634 +0000 UTC m=+4316.180171856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle") pod "barbican-keystone-listener-5fcdff7b8d-2cscv" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6") : secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.499062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/153f9936-d723-4c86-a4b2-176325dc9513-logs\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.512157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data-custom\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.518100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.522174 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-nhmxp"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.531761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.533107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cds9t\" (UniqueName: \"kubernetes.io/projected/153f9936-d723-4c86-a4b2-176325dc9513-kube-api-access-cds9t\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.558090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data-custom\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.600800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data-custom\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.600902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.600928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.601017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872bac5e-90cd-4cb0-9279-baa312892916-logs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.601052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.601098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.601151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6j8\" (UniqueName: \"kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.601322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.603016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wbpz\" (UniqueName: \"kubernetes.io/projected/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-kube-api-access-6wbpz\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.704666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.705548 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:13:38 crc 
kubenswrapper[4707]: I0121 16:13:38.705571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872bac5e-90cd-4cb0-9279-baa312892916-logs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.705633 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.205615319 +0000 UTC m=+4316.387131541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "barbican-config-data" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.705668 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.705673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.705749 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data podName:5d633804-5d9a-4792-b183-51e2c1a2ebe2 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.705728571 +0000 UTC m=+4316.887244793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data") pod "rabbitmq-cell1-server-0" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.705764 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.705872 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.205862113 +0000 UTC m=+4316.387378335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-internal-svc" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.705942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.706062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6j8\" (UniqueName: \"kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.706140 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.706185 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.206172778 +0000 UTC m=+4316.387688989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-public-svc" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.706119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872bac5e-90cd-4cb0-9279-baa312892916-logs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.706395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data-custom\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.706518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.707764 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.707895 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle 
podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.207877713 +0000 UTC m=+4316.389393936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.737414 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tz6j8 for pod openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.737533 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8 podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.23750402 +0000 UTC m=+4316.419020242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tz6j8" (UniqueName: "kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.757223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data-custom\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.804395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw"] Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.850775 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: E0121 16:13:38.850889 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle podName:047c06ee-71d5-4bcf-83e0-dd8aa295cdb3 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.850865252 +0000 UTC m=+4317.032381475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3") : secret "combined-ca-bundle" not found Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.878837 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-2t9gw"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.899084 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.900860 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.903563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.911317 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.933849 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.934060 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" containerName="openstackclient" containerID="cri-o://dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69" gracePeriod=2 Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.944560 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.952854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.953104 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="ovn-northd" containerID="cri-o://6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb" gracePeriod=30 Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.953587 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="openstack-network-exporter" containerID="cri-o://cc159cdf2eb7eebf0df0489fca214415a177fd59dd0017097f91d3bdaa7b7b49" gracePeriod=30 Jan 21 16:13:38 crc kubenswrapper[4707]: I0121 16:13:38.969563 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fxjzc"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:38.980388 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fxjzc"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.011322 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-67cvx"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.019077 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-67cvx"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.027116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zhsbz"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.037229 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mq5pt"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.046690 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.054253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle\") 
pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.054320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-operator-scripts\") pod \"nova-api-dc17-account-create-update-b8j7b\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.054395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r994t\" (UniqueName: \"kubernetes.io/projected/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-kube-api-access-r994t\") pod \"nova-api-dc17-account-create-update-b8j7b\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.054490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.054606 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.054654 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle podName:153f9936-d723-4c86-a4b2-176325dc9513 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.054643545 +0000 UTC m=+4317.236159767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle") pod "barbican-worker-5ff6c79dfc-9wvl9" (UID: "153f9936-d723-4c86-a4b2-176325dc9513") : secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.054690 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.054708 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle podName:e9e813fb-84dc-4d38-b3b5-1be17e1848f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.054701925 +0000 UTC m=+4317.236218147 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle") pod "barbican-keystone-listener-5fcdff7b8d-2cscv" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6") : secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.060710 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.061239 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="openstack-network-exporter" containerID="cri-o://89922e5f5460efebce04fba540cf6fbd73efdfb666c0b1bc1af0fa90f06ec5bb" gracePeriod=300 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.066470 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zhsbz"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.075759 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-mq5pt"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.082528 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-trb5l"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.088067 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv"] Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.088564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" containerName="openstackclient" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.088578 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" containerName="openstackclient" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.088792 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" containerName="openstackclient" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.090747 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.092943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.097445 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.122788 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.131936 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-8mqm4"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.138273 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-8mqm4"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.150890 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hsp7r"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.156262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-operator-scripts\") pod \"barbican-db52-account-create-update-qz4cv\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.156294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r994t\" (UniqueName: \"kubernetes.io/projected/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-kube-api-access-r994t\") pod \"nova-api-dc17-account-create-update-b8j7b\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.156418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh82n\" (UniqueName: \"kubernetes.io/projected/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-kube-api-access-nh82n\") pod \"barbican-db52-account-create-update-qz4cv\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.156613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-operator-scripts\") pod \"nova-api-dc17-account-create-update-b8j7b\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.157478 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.157549 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data podName:0e42cb4d-8129-4286-a6c7-9f8e94524e91 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.657530714 +0000 UTC m=+4316.839046936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data") pod "rabbitmq-server-0" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91") : configmap "rabbitmq-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.158371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-operator-scripts\") pod \"nova-api-dc17-account-create-update-b8j7b\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.161228 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hsp7r"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.171916 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.176985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r994t\" (UniqueName: \"kubernetes.io/projected/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-kube-api-access-r994t\") pod \"nova-api-dc17-account-create-update-b8j7b\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.178419 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0e3e-account-create-update-w8rtv"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.182955 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/cinder-api-0" secret="" err="secret \"cinder-cinder-dockercfg-nhwsw\" not found" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.203484 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a5a472-b8bf-458d-8dc8-be569df09d73" path="/var/lib/kubelet/pods/25a5a472-b8bf-458d-8dc8-be569df09d73/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.204499 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff" path="/var/lib/kubelet/pods/2e6ceb90-d7f0-450a-80f2-a22c4f0dfdff/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.205328 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306d1f96-06ed-4428-987c-081fc0031dd1" path="/var/lib/kubelet/pods/306d1f96-06ed-4428-987c-081fc0031dd1/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.206237 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58df5777-ecc8-4eda-8a67-c58f10b6e920" path="/var/lib/kubelet/pods/58df5777-ecc8-4eda-8a67-c58f10b6e920/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.207878 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9101f3-dd71-4ccb-a894-1778491da333" path="/var/lib/kubelet/pods/5d9101f3-dd71-4ccb-a894-1778491da333/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.208707 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b37e7b-be66-400a-a500-d80241ee7ac3" path="/var/lib/kubelet/pods/72b37e7b-be66-400a-a500-d80241ee7ac3/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.210345 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73047435-3cb8-43e3-bfc8-560c961e34cf" path="/var/lib/kubelet/pods/73047435-3cb8-43e3-bfc8-560c961e34cf/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.211717 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df737a8e-a932-4546-99f4-2e75115f4bb9" path="/var/lib/kubelet/pods/df737a8e-a932-4546-99f4-2e75115f4bb9/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.212265 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e3a9c8-3c3a-4782-b8ef-75125cda3933" path="/var/lib/kubelet/pods/f4e3a9c8-3c3a-4782-b8ef-75125cda3933/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.212734 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7" path="/var/lib/kubelet/pods/fb1aa9cd-3ee4-44f1-bc34-40a4ecaf3ab7/volumes" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.213643 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.213682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.213695 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-524d-account-create-update-hkhxn"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.213778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.214009 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" 
containerName="cinder-scheduler" containerID="cri-o://22c7df73a4078571e27ee73b13d827243c425325781c88d7bd9adb9efeb6f37a" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.214317 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="openstack-network-exporter" containerID="cri-o://7019254782a506b2d815df0e4448644e9e93a6b8330ab0977344a2a08dc91ea2" gracePeriod=300 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.214411 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="probe" containerID="cri-o://cd779a5c94d7063de0abc7e3a7f2b71e9ae69f23b6f7b82dbce250ade905a2db" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.220846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tlxvk"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.225861 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.228889 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-tlxvk"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.234307 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.234522 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-log" containerID="cri-o://cd8eeca7ee1cb0f9e8acd6fb2fc5f3841099beedc093a3deaf46d81c1d640e58" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.234656 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-httpd" containerID="cri-o://8e765e71026c6eb75d97a822236ee30c3868681e753b9c3940e5cd5d39e67c94" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260156 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh82n\" (UniqueName: \"kubernetes.io/projected/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-kube-api-access-nh82n\") pod \"barbican-db52-account-create-update-qz4cv\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " 
pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6j8\" (UniqueName: \"kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.260524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-operator-scripts\") pod \"barbican-db52-account-create-update-qz4cv\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.260647 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.260747 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.260692338 +0000 UTC m=+4317.442208560 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "barbican-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.261199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-operator-scripts\") pod \"barbican-db52-account-create-update-qz4cv\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261270 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261294 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-api-config-data: secret "cinder-api-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261334 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261372 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261644 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261685 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261698 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261304 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.261292908 +0000 UTC m=+4317.442809130 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-internal-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261729 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261737 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.761723597 +0000 UTC m=+4316.943239820 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-api-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261750 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.761744476 +0000 UTC m=+4316.943260699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-internal-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261700 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261762 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.261756178 +0000 UTC m=+4317.443272401 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-public-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261798 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.761788229 +0000 UTC m=+4316.943304451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-public-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261830 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.761823926 +0000 UTC m=+4316.943340147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261841 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.761835858 +0000 UTC m=+4316.943352080 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-scripts" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261853 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.261848471 +0000 UTC m=+4317.443364694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.261865 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:39.761859402 +0000 UTC m=+4316.943375624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.264233 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tz6j8 for pod openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.264303 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8 podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.264286837 +0000 UTC m=+4317.445803058 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tz6j8" (UniqueName: "kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.287512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh82n\" (UniqueName: \"kubernetes.io/projected/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-kube-api-access-nh82n\") pod \"barbican-db52-account-create-update-qz4cv\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.299797 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="ovsdbserver-nb" containerID="cri-o://7d1010a0b55312ee73410bd885472062d3c784a9d367b85e66eb856b023fc87c" gracePeriod=300 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.332084 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.343430 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="ovsdbserver-sb" containerID="cri-o://5a00f55ff9ae0fdbc1e8d372d22b6a13fe41087fb6a5f3e20c7888d085b9c8e6" gracePeriod=300 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.387556 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.387844 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-api" containerID="cri-o://da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.388224 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-httpd" containerID="cri-o://555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.399736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-644c6c574-8cgjw"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.399973 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-log" containerID="cri-o://8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.400250 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-api" containerID="cri-o://142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.487026 4707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.577921 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.607233 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-bdjng"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.619952 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.628506 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-sc8lz"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.639467 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.686889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.687273 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-log" containerID="cri-o://0d655dcc30fe3941863c06455c345616e3fae519df331a4a8dd1db93d04f7978" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.687777 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-httpd" containerID="cri-o://d9c4cae2facb86f1f23dc52a9d0d7d3cf12140812ce2d41bc7723b83171f967b" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.733975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-w4qwr"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.761425 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-w4qwr"] Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763097 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763174 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.763149155 +0000 UTC m=+4317.944665377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763245 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763307 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. 
No retries permitted until 2026-01-21 16:13:40.763293136 +0000 UTC m=+4317.944809359 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-internal-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763370 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763411 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data podName:0e42cb4d-8129-4286-a6c7-9f8e94524e91 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.763397052 +0000 UTC m=+4317.944913274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data") pod "rabbitmq-server-0" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91") : configmap "rabbitmq-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763472 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763502 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.763496198 +0000 UTC m=+4317.945012420 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763524 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763539 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data podName:5d633804-5d9a-4792-b183-51e2c1a2ebe2 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:41.76353458 +0000 UTC m=+4318.945050803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data") pod "rabbitmq-cell1-server-0" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763567 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-api-config-data: secret "cinder-api-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763582 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.763577841 +0000 UTC m=+4317.945094063 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-api-config-data" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763609 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763623 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.763618849 +0000 UTC m=+4317.945135071 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-scripts" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763659 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.763677 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:40.76367263 +0000 UTC m=+4317.945188852 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-public-svc" not found Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.763867 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerID="cc159cdf2eb7eebf0df0489fca214415a177fd59dd0017097f91d3bdaa7b7b49" exitCode=2 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.763998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8","Type":"ContainerDied","Data":"cc159cdf2eb7eebf0df0489fca214415a177fd59dd0017097f91d3bdaa7b7b49"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.768558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.773508 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-zwv4m"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.780267 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-zwv4m"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.785437 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-b0b5-account-create-update-d92q9"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.789429 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.789863 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" 
containerName="account-server" containerID="cri-o://59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.789900 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-server" containerID="cri-o://49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.789936 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-updater" containerID="cri-o://47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.789975 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-auditor" containerID="cri-o://6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.789985 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-reaper" containerID="cri-o://8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790008 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-replicator" containerID="cri-o://1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790090 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-auditor" containerID="cri-o://9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790132 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-replicator" containerID="cri-o://78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790181 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-server" containerID="cri-o://ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790206 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="swift-recon-cron" containerID="cri-o://f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790245 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-updater" containerID="cri-o://3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790261 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-auditor" containerID="cri-o://0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790287 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_b86320b8-c3c8-42d6-b48b-206526f89271/ovsdbserver-nb/0.log" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790307 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-replicator" containerID="cri-o://41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790314 4707 generic.go:334] "Generic (PLEG): container finished" podID="b86320b8-c3c8-42d6-b48b-206526f89271" containerID="7019254782a506b2d815df0e4448644e9e93a6b8330ab0977344a2a08dc91ea2" exitCode=2 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790326 4707 generic.go:334] "Generic (PLEG): container finished" podID="b86320b8-c3c8-42d6-b48b-206526f89271" containerID="7d1010a0b55312ee73410bd885472062d3c784a9d367b85e66eb856b023fc87c" exitCode=143 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790302 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="rsync" containerID="cri-o://8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b86320b8-c3c8-42d6-b48b-206526f89271","Type":"ContainerDied","Data":"7019254782a506b2d815df0e4448644e9e93a6b8330ab0977344a2a08dc91ea2"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b86320b8-c3c8-42d6-b48b-206526f89271","Type":"ContainerDied","Data":"7d1010a0b55312ee73410bd885472062d3c784a9d367b85e66eb856b023fc87c"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.790326 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-expirer" containerID="cri-o://ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.804882 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x94hb"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.804939 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-x94hb"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.810518 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 
16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.825854 4707 generic.go:334] "Generic (PLEG): container finished" podID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerID="555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4" exitCode=0 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.825940 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-55qx5"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.825959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" event={"ID":"317a81bb-283b-43fe-88cb-ad8fadf45471","Type":"ContainerDied","Data":"555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.831528 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-55qx5"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.841153 4707 generic.go:334] "Generic (PLEG): container finished" podID="69e40397-47ac-49d6-9b01-50b995d290f7" containerID="cd8eeca7ee1cb0f9e8acd6fb2fc5f3841099beedc093a3deaf46d81c1d640e58" exitCode=143 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.841236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"69e40397-47ac-49d6-9b01-50b995d290f7","Type":"ContainerDied","Data":"cd8eeca7ee1cb0f9e8acd6fb2fc5f3841099beedc093a3deaf46d81c1d640e58"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.851148 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_bb06987b-a8ab-40c8-b73b-f0244ac056c7/ovsdbserver-sb/0.log" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.851205 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerID="89922e5f5460efebce04fba540cf6fbd73efdfb666c0b1bc1af0fa90f06ec5bb" exitCode=2 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.851224 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerID="5a00f55ff9ae0fdbc1e8d372d22b6a13fe41087fb6a5f3e20c7888d085b9c8e6" exitCode=143 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.851277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"bb06987b-a8ab-40c8-b73b-f0244ac056c7","Type":"ContainerDied","Data":"89922e5f5460efebce04fba540cf6fbd73efdfb666c0b1bc1af0fa90f06ec5bb"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.851300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"bb06987b-a8ab-40c8-b73b-f0244ac056c7","Type":"ContainerDied","Data":"5a00f55ff9ae0fdbc1e8d372d22b6a13fe41087fb6a5f3e20c7888d085b9c8e6"} Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.872488 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: E0121 16:13:39.872550 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle podName:047c06ee-71d5-4bcf-83e0-dd8aa295cdb3 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:41.872536606 +0000 UTC m=+4319.054052828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3") : secret "combined-ca-bundle" not found Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.903873 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.905831 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.915956 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="rabbitmq" containerID="cri-o://be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49" gracePeriod=604800 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.916269 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerID="8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e" exitCode=143 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.916501 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api-log" containerID="cri-o://30b75fdd71f74771e52987f8d2a7b8e272eabb3c7db15d9c46d8dfc4759a563c" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.916577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" event={"ID":"ab3292fc-fbc8-481c-803a-3e32ff576d6d","Type":"ContainerDied","Data":"8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e"} Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.916586 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api" containerID="cri-o://b7106616e826175cc64ed9a092fca4a51264ccb076e1e9ed7b657799fb68de52" gracePeriod=30 Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.953276 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.953337 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.959695 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb"] Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.974686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: 
\"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.974761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:39 crc kubenswrapper[4707]: I0121 16:13:39.975416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fds5l\" (UniqueName: \"kubernetes.io/projected/3cd910ba-3565-4935-a15f-dda572968edd-kube-api-access-fds5l\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.010609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-eb97-account-create-update-kc79d"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.065063 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-eb97-account-create-update-kc79d"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.075238 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.076984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.077144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.077217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.077357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fds5l\" (UniqueName: \"kubernetes.io/projected/3cd910ba-3565-4935-a15f-dda572968edd-kube-api-access-fds5l\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.077455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " 
pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.077643 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.077681 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle podName:153f9936-d723-4c86-a4b2-176325dc9513 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.077669117 +0000 UTC m=+4319.259185338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle") pod "barbican-worker-5ff6c79dfc-9wvl9" (UID: "153f9936-d723-4c86-a4b2-176325dc9513") : secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.077717 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.077736 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle podName:e9e813fb-84dc-4d38-b3b5-1be17e1848f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.077730772 +0000 UTC m=+4319.259246994 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle") pod "barbican-keystone-listener-5fcdff7b8d-2cscv" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6") : secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.079080 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.079345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.081545 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g4v5h"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.090623 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-g4v5h"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.092442 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_bb06987b-a8ab-40c8-b73b-f0244ac056c7/ovsdbserver-sb/0.log" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.093599 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.100595 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.111056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-dvq7g"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.113209 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-dvq7g"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.115149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fds5l\" (UniqueName: \"kubernetes.io/projected/3cd910ba-3565-4935-a15f-dda572968edd-kube-api-access-fds5l\") pod \"dnsmasq-dnsmasq-84b9f45d47-bbvdb\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.117437 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-mzhtx"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.124553 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-mzhtx"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.169984 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.170216 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-log" containerID="cri-o://75695222855ca1c4f0fbbb6b2725a344e12f18c6673f5479b9fa3e2af236f9ba" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.170606 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-metadata" containerID="cri-o://a3f629a586eeaf3a0430289235f8ed826a343e4d540c7552bf23587188a60f57" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.176434 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.194640 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.194673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-rg6q7"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.234785 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-pvg29"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.286029 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.286234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdb-rundir\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc 
kubenswrapper[4707]: I0121 16:13:40.286272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.286315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdbserver-sb-tls-certs\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.295551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.297012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xkqb\" (UniqueName: \"kubernetes.io/projected/bb06987b-a8ab-40c8-b73b-f0244ac056c7-kube-api-access-8xkqb\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.300174 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.300562 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.297055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-combined-ca-bundle\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.304116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-scripts\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.304169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-config\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.304195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-metrics-certs-tls-certs\") pod \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\" (UID: \"bb06987b-a8ab-40c8-b73b-f0244ac056c7\") " Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.304589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.306076 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.306281 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-log" containerID="cri-o://ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.306582 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.306635 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.306620642 +0000 UTC m=+4319.488136865 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "barbican-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.306052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). 
InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.306643 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-api" containerID="cri-o://8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.307141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-scripts" (OuterVolumeSpecName: "scripts") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.307176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-config" (OuterVolumeSpecName: "config") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.307639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.307734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.307753 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.307796 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.307782377 +0000 UTC m=+4319.489298599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-internal-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.307835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6j8\" (UniqueName: \"kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.308072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.308156 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.308180 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb06987b-a8ab-40c8-b73b-f0244ac056c7-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.308190 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.308208 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.308155 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.308520 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.308510997 +0000 UTC m=+4319.490027219 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-public-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.308427 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.308551 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.308546003 +0000 UTC m=+4319.490062215 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.310456 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tz6j8 for pod openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.310504 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8 podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.310490158 +0000 UTC m=+4319.492006380 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tz6j8" (UniqueName: "kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.361863 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-pvg29"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.403784 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.416309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb06987b-a8ab-40c8-b73b-f0244ac056c7-kube-api-access-8xkqb" (OuterVolumeSpecName: "kube-api-access-8xkqb") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "kube-api-access-8xkqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.419216 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xkqb\" (UniqueName: \"kubernetes.io/projected/bb06987b-a8ab-40c8-b73b-f0244ac056c7-kube-api-access-8xkqb\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.442796 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-v95sk"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.452910 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-v95sk"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.459507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.475082 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.488921 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9"] Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.494131 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" podUID="153f9936-d723-4c86-a4b2-176325dc9513" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.513146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.513385 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker-log" containerID="cri-o://55a8ca4d4d4f845958bddd165606fd0a9aa08d68148d5ce1f6d84578e370b654" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.513778 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker" containerID="cri-o://f9eec9f2777bc56f0de0809e130e7eb56b4ccf7f6d582d5054978a6f37712858" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.518723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.522988 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.523012 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.523023 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.532533 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv"] Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.533268 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" podUID="e9e813fb-84dc-4d38-b3b5-1be17e1848f6" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.540574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.541355 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener" containerID="cri-o://d3f4f9fc5bc4ad2e024ecd33cb433c74190a4054f4d5fc7992c2d5cfaceae52f" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.542102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener-log" containerID="cri-o://c64d5fccd6f9fe5912e6644f1e191a0c804f30b7b9271b2b9b4aaaefbe8774f5" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.549125 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv"] Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.551227 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data internal-tls-certs kube-api-access-tz6j8 public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" podUID="872bac5e-90cd-4cb0-9279-baa312892916" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.555236 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff884c948-lmvhf"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.555482 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api-log" containerID="cri-o://408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6" gracePeriod=30 Jan 21 16:13:40 crc 
kubenswrapper[4707]: I0121 16:13:40.556033 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api" containerID="cri-o://34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.556340 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.561457 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.567942 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.569563 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.569599 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="ovn-northd" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.573990 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.574197 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="561add48-0cde-40f7-95b6-09300d00eec4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.580080 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-rg6q7"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.583210 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="58427ff4-8738-48bd-a3ea-a99986023664" containerName="galera" containerID="cri-o://bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.588870 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.589050 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" 
podUID="10090d75-68eb-4597-98f6-b3f0282513e6" containerName="kube-state-metrics" containerID="cri-o://2cec1240e19583cb646030fe86a81a2167b97fef2ae3246d40fd82a422fe2b74" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.597725 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.605923 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.606182 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-central-agent" containerID="cri-o://669bb6d5318148cc1ac21fae845cda6d0b6d3e81db0f375ed6ebf15b8e802dd4" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.606639 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="proxy-httpd" containerID="cri-o://274d838448ffed3d5e993bce9fc0532d447a3d723e068308801479939c591e55" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.606778 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-notification-agent" containerID="cri-o://c775c3ced09d3af1afb4330ece26d7b04215632d1b6e25a92627d6902a9f8f3e" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.606866 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="sg-core" containerID="cri-o://778167efb85219d760ca2bee2211cb50dd31b9555451c93aea631b587f5c4e24" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.608423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "bb06987b-a8ab-40c8-b73b-f0244ac056c7" (UID: "bb06987b-a8ab-40c8-b73b-f0244ac056c7"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.633177 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb06987b-a8ab-40c8-b73b-f0244ac056c7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.675922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.690072 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-mzglm"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.690115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.690296 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.701986 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.705411 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-zc2z4"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.710840 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.710999 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.715212 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.715390 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" containerID="cri-o://ffe9e5471773aa44d24bb089adcede94c9a0a7c07e343fc71a5076c836a67866" gracePeriod=30 Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.744755 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:13:40 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:13:40 crc 
kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 16:13:40 crc kubenswrapper[4707]: else Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:13:40 crc kubenswrapper[4707]: fi Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:13:40 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:13:40 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:13:40 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:13:40 crc kubenswrapper[4707]: # support updates Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.746596 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" podUID="7d5dc988-7c0d-4bef-a471-656baf486bc9" Jan 21 16:13:40 crc kubenswrapper[4707]: W0121 16:13:40.748365 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6e69885_5099_471a_8f06_ef85f63f18b7.slice/crio-cedbe1c40b4e58a4c686c5837aeff97eea323b5f17c2c15bf3a130d890ecdd99 WatchSource:0}: Error finding container cedbe1c40b4e58a4c686c5837aeff97eea323b5f17c2c15bf3a130d890ecdd99: Status 404 returned error can't find the container with id cedbe1c40b4e58a4c686c5837aeff97eea323b5f17c2c15bf3a130d890ecdd99 Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.748710 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:13:40 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: if [ -n "cinder" ]; then Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="cinder" Jan 21 16:13:40 crc kubenswrapper[4707]: else Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:13:40 crc kubenswrapper[4707]: fi Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:13:40 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:13:40 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:13:40 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:13:40 crc kubenswrapper[4707]: # support updates Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.749953 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" podUID="309eb8bc-de8e-454e-8835-811c8df0ed63" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.751240 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:13:40 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: if [ -n "nova_api" ]; then Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="nova_api" Jan 21 16:13:40 crc kubenswrapper[4707]: else Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:13:40 crc kubenswrapper[4707]: fi Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:13:40 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:13:40 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:13:40 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:13:40 crc kubenswrapper[4707]: # support updates Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:13:40 crc kubenswrapper[4707]: W0121 16:13:40.751290 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b93df4_cdbb_470e_8147_2a9bd7e5b74e.slice/crio-9a493328799ba4b7c7fb965e9a58ea1b071f20121979ce5e57356d0723a4699c WatchSource:0}: Error finding container 9a493328799ba4b7c7fb965e9a58ea1b071f20121979ce5e57356d0723a4699c: Status 404 returned error can't find the container with id 9a493328799ba4b7c7fb965e9a58ea1b071f20121979ce5e57356d0723a4699c Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.752657 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" podUID="b9d5ac98-e3ae-450a-90fb-45b53c48f9ce" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.759525 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:13:40 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: if [ -n "barbican" ]; then Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="barbican" Jan 21 16:13:40 crc kubenswrapper[4707]: else Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:13:40 crc kubenswrapper[4707]: fi Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:13:40 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:13:40 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:13:40 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:13:40 crc kubenswrapper[4707]: # support updates Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.759787 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:13:40 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 16:13:40 crc kubenswrapper[4707]: else Jan 21 16:13:40 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:13:40 crc kubenswrapper[4707]: fi Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:13:40 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:13:40 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:13:40 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:13:40 crc kubenswrapper[4707]: # support updates Jan 21 16:13:40 crc kubenswrapper[4707]: Jan 21 16:13:40 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.765601 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" podUID="b6e69885-5099-471a-8f06-ef85f63f18b7" Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.766623 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" podUID="06b93df4-cdbb-470e-8147-2a9bd7e5b74e" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.780295 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="rabbitmq" containerID="cri-o://43c33ad38b01642b5b25214359a49931c587fc6383778e43bacab56ce73f1322" gracePeriod=604800 Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.836965 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837300 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 21 16:13:40 crc 
kubenswrapper[4707]: E0121 16:13:40.837327 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data podName:0e42cb4d-8129-4286-a6c7-9f8e94524e91 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.837312922 +0000 UTC m=+4320.018829144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data") pod "rabbitmq-server-0" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91") : configmap "rabbitmq-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837345 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.837337157 +0000 UTC m=+4320.018853379 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-scripts" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837241 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837362 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837370 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.83736531 +0000 UTC m=+4320.018881532 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "combined-ca-bundle" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837382 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.837377052 +0000 UTC m=+4320.018893274 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-internal-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837407 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837437 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.837422448 +0000 UTC m=+4320.018938670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-public-svc" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837466 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837486 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.837480336 +0000 UTC m=+4320.018996558 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837275 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-api-config-data: secret "cinder-api-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: E0121 16:13:40.837557 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:42.83755137 +0000 UTC m=+4320.019067592 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-api-config-data" not found Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.948649 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_b86320b8-c3c8-42d6-b48b-206526f89271/ovsdbserver-nb/0.log" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.948764 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.958892 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_b86320b8-c3c8-42d6-b48b-206526f89271/ovsdbserver-nb/0.log" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.959012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"b86320b8-c3c8-42d6-b48b-206526f89271","Type":"ContainerDied","Data":"3b1af069d581d3464a3b38ec400f4fe584002411e02285c4eab03a9ebd21c0a9"} Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.959051 4707 scope.go:117] "RemoveContainer" containerID="7019254782a506b2d815df0e4448644e9e93a6b8330ab0977344a2a08dc91ea2" Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.964953 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerID="408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6" exitCode=143 Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.965018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" event={"ID":"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0","Type":"ContainerDied","Data":"408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6"} Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.967745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" event={"ID":"7d5dc988-7c0d-4bef-a471-656baf486bc9","Type":"ContainerStarted","Data":"0ed70ca13ea572c8e8a6b2f742413f35cc0451cad77272e07d17cc50400d8d34"} Jan 21 16:13:40 crc kubenswrapper[4707]: I0121 16:13:40.975521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" event={"ID":"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce","Type":"ContainerStarted","Data":"c0e322245408b6886107b77735e060fab647f8fe6c1172720ba7bd3abcfa8f80"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.007230 4707 generic.go:334] "Generic (PLEG): container finished" podID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerID="30b75fdd71f74771e52987f8d2a7b8e272eabb3c7db15d9c46d8dfc4759a563c" exitCode=143 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.007335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"559741cf-87c9-47d4-98a3-6b02ffb4610c","Type":"ContainerDied","Data":"30b75fdd71f74771e52987f8d2a7b8e272eabb3c7db15d9c46d8dfc4759a563c"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.038561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" event={"ID":"b6e69885-5099-471a-8f06-ef85f63f18b7","Type":"ContainerStarted","Data":"cedbe1c40b4e58a4c686c5837aeff97eea323b5f17c2c15bf3a130d890ecdd99"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-config\") pod 
\"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-combined-ca-bundle\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-scripts\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc6fc\" (UniqueName: \"kubernetes.io/projected/b86320b8-c3c8-42d6-b48b-206526f89271-kube-api-access-fc6fc\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdb-rundir\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.046976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-metrics-certs-tls-certs\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.047002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdbserver-nb-tls-certs\") pod \"b86320b8-c3c8-42d6-b48b-206526f89271\" (UID: \"b86320b8-c3c8-42d6-b48b-206526f89271\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.051991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-scripts" (OuterVolumeSpecName: "scripts") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.053063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-config" (OuterVolumeSpecName: "config") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.059820 4707 generic.go:334] "Generic (PLEG): container finished" podID="10090d75-68eb-4597-98f6-b3f0282513e6" containerID="2cec1240e19583cb646030fe86a81a2167b97fef2ae3246d40fd82a422fe2b74" exitCode=2 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.059923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"10090d75-68eb-4597-98f6-b3f0282513e6","Type":"ContainerDied","Data":"2cec1240e19583cb646030fe86a81a2167b97fef2ae3246d40fd82a422fe2b74"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.065973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.080031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.085592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86320b8-c3c8-42d6-b48b-206526f89271-kube-api-access-fc6fc" (OuterVolumeSpecName: "kube-api-access-fc6fc") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "kube-api-access-fc6fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.086406 4707 scope.go:117] "RemoveContainer" containerID="7d1010a0b55312ee73410bd885472062d3c784a9d367b85e66eb856b023fc87c" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116706 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116736 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116744 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116752 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116758 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116765 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116771 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116776 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116783 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116789 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116794 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116800 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116837 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479" 
exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116844 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.116995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.117003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.117011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.132106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" event={"ID":"06b93df4-cdbb-470e-8147-2a9bd7e5b74e","Type":"ContainerStarted","Data":"9a493328799ba4b7c7fb965e9a58ea1b071f20121979ce5e57356d0723a4699c"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.158856 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.158887 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.158898 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b86320b8-c3c8-42d6-b48b-206526f89271-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.158907 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc6fc\" (UniqueName: \"kubernetes.io/projected/b86320b8-c3c8-42d6-b48b-206526f89271-kube-api-access-fc6fc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.158915 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.200252 4707 generic.go:334] "Generic (PLEG): container finished" podID="594873ed-1daa-422c-853c-65fab465dbdb" containerID="ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2" exitCode=143 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.241141 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc19662-4ee3-4885-9103-313f65c41fd9" path="/var/lib/kubelet/pods/0bc19662-4ee3-4885-9103-313f65c41fd9/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.242447 4707 generic.go:334] "Generic (PLEG): container finished" podID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerID="0d655dcc30fe3941863c06455c345616e3fae519df331a4a8dd1db93d04f7978" exitCode=143 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 
16:13:41.249491 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="163a4187-712b-4e94-b883-6f12462cfc13" path="/var/lib/kubelet/pods/163a4187-712b-4e94-b883-6f12462cfc13/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.250468 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2d0133-c8a1-44aa-8d4a-59d0125942b3" path="/var/lib/kubelet/pods/1e2d0133-c8a1-44aa-8d4a-59d0125942b3/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.251231 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ede7fec-2e6f-47e1-9653-a55e80d7c9c4" path="/var/lib/kubelet/pods/1ede7fec-2e6f-47e1-9653-a55e80d7c9c4/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.255283 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.269201 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2933067b-5f4b-4431-9b9b-5da784bab778" path="/var/lib/kubelet/pods/2933067b-5f4b-4431-9b9b-5da784bab778/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.269865 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bae039e-7430-4b54-9235-8aad8a9b834c" path="/var/lib/kubelet/pods/3bae039e-7430-4b54-9235-8aad8a9b834c/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.270507 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46101c43-94be-44b4-9638-7f5a94535611" path="/var/lib/kubelet/pods/46101c43-94be-44b4-9638-7f5a94535611/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.297437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.310330 4707 generic.go:334] "Generic (PLEG): container finished" podID="f927e632-11ba-4d56-993a-f89357f885a4" containerID="55a8ca4d4d4f845958bddd165606fd0a9aa08d68148d5ce1f6d84578e370b654" exitCode=143 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.334106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.336312 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerID="75695222855ca1c4f0fbbb6b2725a344e12f18c6673f5479b9fa3e2af236f9ba" exitCode=143 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.347243 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0b2b2e-2a3d-4b99-874c-08306819d928" path="/var/lib/kubelet/pods/7a0b2b2e-2a3d-4b99-874c-08306819d928/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.348278 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a82682c-ac19-48c1-a9bf-30e1eca19329" path="/var/lib/kubelet/pods/7a82682c-ac19-48c1-a9bf-30e1eca19329/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.355844 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891bddcb-ef77-491e-958f-6bc9d740c3be" path="/var/lib/kubelet/pods/891bddcb-ef77-491e-958f-6bc9d740c3be/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.356354 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972548d1-bc2a-4edc-9d47-7f31b98459fd" path="/var/lib/kubelet/pods/972548d1-bc2a-4edc-9d47-7f31b98459fd/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.356849 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7fad7e-b367-4cf0-8ed3-565ec653bc17" path="/var/lib/kubelet/pods/9b7fad7e-b367-4cf0-8ed3-565ec653bc17/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.357111 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.377578 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerID="274d838448ffed3d5e993bce9fc0532d447a3d723e068308801479939c591e55" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.377601 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerID="778167efb85219d760ca2bee2211cb50dd31b9555451c93aea631b587f5c4e24" exitCode=2 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.378414 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.378437 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.378447 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.379171 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9f6631-421a-4fd3-977e-7f03a4d13e2e" path="/var/lib/kubelet/pods/ab9f6631-421a-4fd3-977e-7f03a4d13e2e/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.379842 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca088bd1-a4bf-47c4-ba45-d25107adb619" 
path="/var/lib/kubelet/pods/ca088bd1-a4bf-47c4-ba45-d25107adb619/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.380499 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec82238-89a9-422f-bed8-d3a1992d60e9" path="/var/lib/kubelet/pods/dec82238-89a9-422f-bed8-d3a1992d60e9/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.386440 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec033d89-b803-4077-8fe1-58d264bd8772" path="/var/lib/kubelet/pods/ec033d89-b803-4077-8fe1-58d264bd8772/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.387152 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee7b5814-7f54-4d09-8bd2-2a3618903157" path="/var/lib/kubelet/pods/ee7b5814-7f54-4d09-8bd2-2a3618903157/volumes" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400126 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n"] Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400173 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb"] Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"594873ed-1daa-422c-853c-65fab465dbdb","Type":"ContainerDied","Data":"ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1162dc59-3a54-42e6-84c5-466eafb062b4","Type":"ContainerDied","Data":"0d655dcc30fe3941863c06455c345616e3fae519df331a4a8dd1db93d04f7978"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" event={"ID":"309eb8bc-de8e-454e-8835-811c8df0ed63","Type":"ContainerStarted","Data":"c137677e0e36c3d9ee3b269e74666e1c3abbfe801db743e844e825fdfc61d98b"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" event={"ID":"f927e632-11ba-4d56-993a-f89357f885a4","Type":"ContainerDied","Data":"55a8ca4d4d4f845958bddd165606fd0a9aa08d68148d5ce1f6d84578e370b654"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2","Type":"ContainerDied","Data":"75695222855ca1c4f0fbbb6b2725a344e12f18c6673f5479b9fa3e2af236f9ba"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerDied","Data":"274d838448ffed3d5e993bce9fc0532d447a3d723e068308801479939c591e55"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerDied","Data":"778167efb85219d760ca2bee2211cb50dd31b9555451c93aea631b587f5c4e24"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400454 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-httpd" containerID="cri-o://d9b5938294531b6e015f18518c545afd2fc7bc61634c8904c5434b33da8fd969" gracePeriod=30 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.400801 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-server" containerID="cri-o://6a3e0961eaad6a849ce19556e597cc407293421db0ba0b4457b3b4372a4b61d6" gracePeriod=30 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.439663 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_bb06987b-a8ab-40c8-b73b-f0244ac056c7/ovsdbserver-sb/0.log" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.439974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"bb06987b-a8ab-40c8-b73b-f0244ac056c7","Type":"ContainerDied","Data":"73e49e6cdf62ccfb699d60be36f133e6e742d6be284d34183d967f2cbb1e40ca"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.440011 4707 scope.go:117] "RemoveContainer" containerID="89922e5f5460efebce04fba540cf6fbd73efdfb666c0b1bc1af0fa90f06ec5bb" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.440107 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.479932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72pnt\" (UniqueName: \"kubernetes.io/projected/10090d75-68eb-4597-98f6-b3f0282513e6-kube-api-access-72pnt\") pod \"10090d75-68eb-4597-98f6-b3f0282513e6\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.479995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-combined-ca-bundle\") pod \"10090d75-68eb-4597-98f6-b3f0282513e6\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.480022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-certs\") pod \"10090d75-68eb-4597-98f6-b3f0282513e6\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.480129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-config\") pod \"10090d75-68eb-4597-98f6-b3f0282513e6\" (UID: \"10090d75-68eb-4597-98f6-b3f0282513e6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.523189 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10090d75-68eb-4597-98f6-b3f0282513e6-kube-api-access-72pnt" (OuterVolumeSpecName: "kube-api-access-72pnt") pod "10090d75-68eb-4597-98f6-b3f0282513e6" (UID: "10090d75-68eb-4597-98f6-b3f0282513e6"). InnerVolumeSpecName "kube-api-access-72pnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.528970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b86320b8-c3c8-42d6-b48b-206526f89271" (UID: "b86320b8-c3c8-42d6-b48b-206526f89271"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.547437 4707 generic.go:334] "Generic (PLEG): container finished" podID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerID="c64d5fccd6f9fe5912e6644f1e191a0c804f30b7b9271b2b9b4aaaefbe8774f5" exitCode=143 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.547512 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" event={"ID":"b115ad1a-72c2-4d2e-8669-4d125acf6751","Type":"ContainerDied","Data":"c64d5fccd6f9fe5912e6644f1e191a0c804f30b7b9271b2b9b4aaaefbe8774f5"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.552071 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.571988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "10090d75-68eb-4597-98f6-b3f0282513e6" (UID: "10090d75-68eb-4597-98f6-b3f0282513e6"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.572123 4707 generic.go:334] "Generic (PLEG): container finished" podID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerID="cd779a5c94d7063de0abc7e3a7f2b71e9ae69f23b6f7b82dbce250ade905a2db" exitCode=0 Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.572203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.573533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2f6304c7-b127-4fcb-a74b-b5009a2b1a50","Type":"ContainerDied","Data":"cd779a5c94d7063de0abc7e3a7f2b71e9ae69f23b6f7b82dbce250ade905a2db"} Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.573618 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.573713 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.583414 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72pnt\" (UniqueName: \"kubernetes.io/projected/10090d75-68eb-4597-98f6-b3f0282513e6-kube-api-access-72pnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.583438 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.583450 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b86320b8-c3c8-42d6-b48b-206526f89271-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.589481 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.591005 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.596648 4707 scope.go:117] "RemoveContainer" containerID="5a00f55ff9ae0fdbc1e8d372d22b6a13fe41087fb6a5f3e20c7888d085b9c8e6" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.611900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10090d75-68eb-4597-98f6-b3f0282513e6" (UID: "10090d75-68eb-4597-98f6-b3f0282513e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.611911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "10090d75-68eb-4597-98f6-b3f0282513e6" (UID: "10090d75-68eb-4597-98f6-b3f0282513e6"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.620042 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.621055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-logs\") pod \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cds9t\" (UniqueName: \"kubernetes.io/projected/153f9936-d723-4c86-a4b2-176325dc9513-kube-api-access-cds9t\") pod \"153f9936-d723-4c86-a4b2-176325dc9513\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data-custom\") pod \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data-custom\") pod \"872bac5e-90cd-4cb0-9279-baa312892916\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wbpz\" (UniqueName: \"kubernetes.io/projected/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-kube-api-access-6wbpz\") pod \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data\") pod \"153f9936-d723-4c86-a4b2-176325dc9513\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data-custom\") pod \"153f9936-d723-4c86-a4b2-176325dc9513\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data\") pod \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872bac5e-90cd-4cb0-9279-baa312892916-logs\") pod \"872bac5e-90cd-4cb0-9279-baa312892916\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.684875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/153f9936-d723-4c86-a4b2-176325dc9513-logs\") pod \"153f9936-d723-4c86-a4b2-176325dc9513\" (UID: 
\"153f9936-d723-4c86-a4b2-176325dc9513\") " Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.685269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-logs" (OuterVolumeSpecName: "logs") pod "e9e813fb-84dc-4d38-b3b5-1be17e1848f6" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.685519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/153f9936-d723-4c86-a4b2-176325dc9513-logs" (OuterVolumeSpecName: "logs") pod "153f9936-d723-4c86-a4b2-176325dc9513" (UID: "153f9936-d723-4c86-a4b2-176325dc9513"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.685710 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.685736 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10090d75-68eb-4597-98f6-b3f0282513e6-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.685757 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/153f9936-d723-4c86-a4b2-176325dc9513-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.685768 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.689959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153f9936-d723-4c86-a4b2-176325dc9513-kube-api-access-cds9t" (OuterVolumeSpecName: "kube-api-access-cds9t") pod "153f9936-d723-4c86-a4b2-176325dc9513" (UID: "153f9936-d723-4c86-a4b2-176325dc9513"). InnerVolumeSpecName "kube-api-access-cds9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.690824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872bac5e-90cd-4cb0-9279-baa312892916-logs" (OuterVolumeSpecName: "logs") pod "872bac5e-90cd-4cb0-9279-baa312892916" (UID: "872bac5e-90cd-4cb0-9279-baa312892916"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.690907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e9e813fb-84dc-4d38-b3b5-1be17e1848f6" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.696315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "872bac5e-90cd-4cb0-9279-baa312892916" (UID: "872bac5e-90cd-4cb0-9279-baa312892916"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.696602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data" (OuterVolumeSpecName: "config-data") pod "153f9936-d723-4c86-a4b2-176325dc9513" (UID: "153f9936-d723-4c86-a4b2-176325dc9513"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.696752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "153f9936-d723-4c86-a4b2-176325dc9513" (UID: "153f9936-d723-4c86-a4b2-176325dc9513"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.711640 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-kube-api-access-6wbpz" (OuterVolumeSpecName: "kube-api-access-6wbpz") pod "e9e813fb-84dc-4d38-b3b5-1be17e1848f6" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6"). InnerVolumeSpecName "kube-api-access-6wbpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.720302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data" (OuterVolumeSpecName: "config-data") pod "e9e813fb-84dc-4d38-b3b5-1be17e1848f6" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788596 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788933 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788944 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/872bac5e-90cd-4cb0-9279-baa312892916-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788955 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cds9t\" (UniqueName: \"kubernetes.io/projected/153f9936-d723-4c86-a4b2-176325dc9513-kube-api-access-cds9t\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788966 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788976 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788987 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wbpz\" (UniqueName: \"kubernetes.io/projected/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-kube-api-access-6wbpz\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.788998 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:41 crc kubenswrapper[4707]: E0121 16:13:41.788775 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:41 crc kubenswrapper[4707]: E0121 16:13:41.789049 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data podName:5d633804-5d9a-4792-b183-51e2c1a2ebe2 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:45.789035291 +0000 UTC m=+4322.970551514 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data") pod "rabbitmq-cell1-server-0" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.822798 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.116:5671: connect: connection refused" Jan 21 16:13:41 crc kubenswrapper[4707]: E0121 16:13:41.893404 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:41 crc kubenswrapper[4707]: E0121 16:13:41.893484 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle podName:047c06ee-71d5-4bcf-83e0-dd8aa295cdb3 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:45.89346488 +0000 UTC m=+4323.074981102 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle") pod "nova-cell1-conductor-0" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3") : secret "combined-ca-bundle" not found Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.964396 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:41 crc kubenswrapper[4707]: I0121 16:13:41.976709 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.075213 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.117:5671: connect: connection refused" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.103283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d5dc988-7c0d-4bef-a471-656baf486bc9-operator-scripts\") pod \"7d5dc988-7c0d-4bef-a471-656baf486bc9\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.103451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8ncz\" (UniqueName: \"kubernetes.io/projected/7d5dc988-7c0d-4bef-a471-656baf486bc9-kube-api-access-t8ncz\") pod \"7d5dc988-7c0d-4bef-a471-656baf486bc9\" (UID: \"7d5dc988-7c0d-4bef-a471-656baf486bc9\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.103535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-operator-scripts\") pod \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.103607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r994t\" (UniqueName: \"kubernetes.io/projected/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-kube-api-access-r994t\") pod 
\"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\" (UID: \"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.103777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5dc988-7c0d-4bef-a471-656baf486bc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d5dc988-7c0d-4bef-a471-656baf486bc9" (UID: "7d5dc988-7c0d-4bef-a471-656baf486bc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.104484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9d5ac98-e3ae-450a-90fb-45b53c48f9ce" (UID: "b9d5ac98-e3ae-450a-90fb-45b53c48f9ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.104906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle\") pod \"barbican-worker-5ff6c79dfc-9wvl9\" (UID: \"153f9936-d723-4c86-a4b2-176325dc9513\") " pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.104958 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.105002 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle podName:153f9936-d723-4c86-a4b2-176325dc9513 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.104987636 +0000 UTC m=+4323.286503857 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle") pod "barbican-worker-5ff6c79dfc-9wvl9" (UID: "153f9936-d723-4c86-a4b2-176325dc9513") : secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.105095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle\") pod \"barbican-keystone-listener-5fcdff7b8d-2cscv\" (UID: \"e9e813fb-84dc-4d38-b3b5-1be17e1848f6\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.105345 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.105381 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle podName:e9e813fb-84dc-4d38-b3b5-1be17e1848f6 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.10537297 +0000 UTC m=+4323.286889192 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle") pod "barbican-keystone-listener-5fcdff7b8d-2cscv" (UID: "e9e813fb-84dc-4d38-b3b5-1be17e1848f6") : secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.105569 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d5dc988-7c0d-4bef-a471-656baf486bc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.105588 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.109716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5dc988-7c0d-4bef-a471-656baf486bc9-kube-api-access-t8ncz" (OuterVolumeSpecName: "kube-api-access-t8ncz") pod "7d5dc988-7c0d-4bef-a471-656baf486bc9" (UID: "7d5dc988-7c0d-4bef-a471-656baf486bc9"). InnerVolumeSpecName "kube-api-access-t8ncz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.114881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-kube-api-access-r994t" (OuterVolumeSpecName: "kube-api-access-r994t") pod "b9d5ac98-e3ae-450a-90fb-45b53c48f9ce" (UID: "b9d5ac98-e3ae-450a-90fb-45b53c48f9ce"). InnerVolumeSpecName "kube-api-access-r994t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.177364 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.182057 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.188753 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.194708 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.207200 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8ncz\" (UniqueName: \"kubernetes.io/projected/7d5dc988-7c0d-4bef-a471-656baf486bc9-kube-api-access-t8ncz\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.207230 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r994t\" (UniqueName: \"kubernetes.io/projected/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce-kube-api-access-r994t\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.211382 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.217001 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-galera-tls-certs\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9qc4\" (UniqueName: \"kubernetes.io/projected/85d128b6-18c3-4018-a74a-86b025e2196d-kube-api-access-w9qc4\") pod \"85d128b6-18c3-4018-a74a-86b025e2196d\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-combined-ca-bundle\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw7t6\" (UniqueName: \"kubernetes.io/projected/b6e69885-5099-471a-8f06-ef85f63f18b7-kube-api-access-rw7t6\") pod \"b6e69885-5099-471a-8f06-ef85f63f18b7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-vencrypt-tls-certs\") pod \"561add48-0cde-40f7-95b6-09300d00eec4\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/309eb8bc-de8e-454e-8835-811c8df0ed63-operator-scripts\") pod \"309eb8bc-de8e-454e-8835-811c8df0ed63\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58427ff4-8738-48bd-a3ea-a99986023664-config-data-generated\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e69885-5099-471a-8f06-ef85f63f18b7-operator-scripts\") pod \"b6e69885-5099-471a-8f06-ef85f63f18b7\" (UID: \"b6e69885-5099-471a-8f06-ef85f63f18b7\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgd5k\" (UniqueName: \"kubernetes.io/projected/309eb8bc-de8e-454e-8835-811c8df0ed63-kube-api-access-lgd5k\") pod \"309eb8bc-de8e-454e-8835-811c8df0ed63\" (UID: \"309eb8bc-de8e-454e-8835-811c8df0ed63\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage20-crc\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config-secret\") pod \"85d128b6-18c3-4018-a74a-86b025e2196d\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-config-data-default\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-operator-scripts\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.315989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-config-data\") pod \"561add48-0cde-40f7-95b6-09300d00eec4\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhzk5\" (UniqueName: \"kubernetes.io/projected/58427ff4-8738-48bd-a3ea-a99986023664-kube-api-access-nhzk5\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-nova-novncproxy-tls-certs\") pod \"561add48-0cde-40f7-95b6-09300d00eec4\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh82n\" (UniqueName: \"kubernetes.io/projected/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-kube-api-access-nh82n\") pod \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-combined-ca-bundle\") pod \"85d128b6-18c3-4018-a74a-86b025e2196d\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d6pt\" (UniqueName: \"kubernetes.io/projected/561add48-0cde-40f7-95b6-09300d00eec4-kube-api-access-2d6pt\") pod \"561add48-0cde-40f7-95b6-09300d00eec4\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-kolla-config\") pod \"58427ff4-8738-48bd-a3ea-a99986023664\" (UID: \"58427ff4-8738-48bd-a3ea-a99986023664\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config\") pod \"85d128b6-18c3-4018-a74a-86b025e2196d\" (UID: \"85d128b6-18c3-4018-a74a-86b025e2196d\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-operator-scripts\") pod \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\" (UID: \"06b93df4-cdbb-470e-8147-2a9bd7e5b74e\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.316377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-combined-ca-bundle\") pod \"561add48-0cde-40f7-95b6-09300d00eec4\" (UID: \"561add48-0cde-40f7-95b6-09300d00eec4\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.317068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.317146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.317208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6j8\" (UniqueName: \"kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.317322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.317357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data\") pod \"barbican-api-6cd44fb754-n5rlv\" (UID: \"872bac5e-90cd-4cb0-9279-baa312892916\") " pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.321108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-kolla-config" 
(OuterVolumeSpecName: "kolla-config") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.328192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d128b6-18c3-4018-a74a-86b025e2196d-kube-api-access-w9qc4" (OuterVolumeSpecName: "kube-api-access-w9qc4") pod "85d128b6-18c3-4018-a74a-86b025e2196d" (UID: "85d128b6-18c3-4018-a74a-86b025e2196d"). InnerVolumeSpecName "kube-api-access-w9qc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.328259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e69885-5099-471a-8f06-ef85f63f18b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6e69885-5099-471a-8f06-ef85f63f18b7" (UID: "b6e69885-5099-471a-8f06-ef85f63f18b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.328306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309eb8bc-de8e-454e-8835-811c8df0ed63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "309eb8bc-de8e-454e-8835-811c8df0ed63" (UID: "309eb8bc-de8e-454e-8835-811c8df0ed63"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.328642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58427ff4-8738-48bd-a3ea-a99986023664-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.328758 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.328836 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.328800631 +0000 UTC m=+4323.510316852 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-public-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.328919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.329479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.331021 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.331103 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.331080727 +0000 UTC m=+4323.512596949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.332398 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.332466 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.332449471 +0000 UTC m=+4323.513965693 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "cert-barbican-internal-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.332494 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.332531 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.332520534 +0000 UTC m=+4323.514036756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : secret "barbican-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.335025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06b93df4-cdbb-470e-8147-2a9bd7e5b74e" (UID: "06b93df4-cdbb-470e-8147-2a9bd7e5b74e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.336976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58427ff4-8738-48bd-a3ea-a99986023664-kube-api-access-nhzk5" (OuterVolumeSpecName: "kube-api-access-nhzk5") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "kube-api-access-nhzk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.343081 4707 projected.go:194] Error preparing data for projected volume kube-api-access-tz6j8 for pod openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.343144 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8 podName:872bac5e-90cd-4cb0-9279-baa312892916 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.343127598 +0000 UTC m=+4323.524643821 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tz6j8" (UniqueName: "kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8") pod "barbican-api-6cd44fb754-n5rlv" (UID: "872bac5e-90cd-4cb0-9279-baa312892916") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.352874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309eb8bc-de8e-454e-8835-811c8df0ed63-kube-api-access-lgd5k" (OuterVolumeSpecName: "kube-api-access-lgd5k") pod "309eb8bc-de8e-454e-8835-811c8df0ed63" (UID: "309eb8bc-de8e-454e-8835-811c8df0ed63"). InnerVolumeSpecName "kube-api-access-lgd5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.356857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-kube-api-access-nh82n" (OuterVolumeSpecName: "kube-api-access-nh82n") pod "06b93df4-cdbb-470e-8147-2a9bd7e5b74e" (UID: "06b93df4-cdbb-470e-8147-2a9bd7e5b74e"). InnerVolumeSpecName "kube-api-access-nh82n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.360739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561add48-0cde-40f7-95b6-09300d00eec4-kube-api-access-2d6pt" (OuterVolumeSpecName: "kube-api-access-2d6pt") pod "561add48-0cde-40f7-95b6-09300d00eec4" (UID: "561add48-0cde-40f7-95b6-09300d00eec4"). InnerVolumeSpecName "kube-api-access-2d6pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.360932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e69885-5099-471a-8f06-ef85f63f18b7-kube-api-access-rw7t6" (OuterVolumeSpecName: "kube-api-access-rw7t6") pod "b6e69885-5099-471a-8f06-ef85f63f18b7" (UID: "b6e69885-5099-471a-8f06-ef85f63f18b7"). InnerVolumeSpecName "kube-api-access-rw7t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.384797 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.393967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420353 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh82n\" (UniqueName: \"kubernetes.io/projected/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-kube-api-access-nh82n\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420381 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d6pt\" (UniqueName: \"kubernetes.io/projected/561add48-0cde-40f7-95b6-09300d00eec4-kube-api-access-2d6pt\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420409 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420422 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06b93df4-cdbb-470e-8147-2a9bd7e5b74e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420434 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9qc4\" (UniqueName: \"kubernetes.io/projected/85d128b6-18c3-4018-a74a-86b025e2196d-kube-api-access-w9qc4\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420443 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw7t6\" (UniqueName: \"kubernetes.io/projected/b6e69885-5099-471a-8f06-ef85f63f18b7-kube-api-access-rw7t6\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420451 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/309eb8bc-de8e-454e-8835-811c8df0ed63-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420460 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/58427ff4-8738-48bd-a3ea-a99986023664-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420469 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e69885-5099-471a-8f06-ef85f63f18b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420495 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgd5k\" (UniqueName: \"kubernetes.io/projected/309eb8bc-de8e-454e-8835-811c8df0ed63-kube-api-access-lgd5k\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420516 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:13:42 crc 
kubenswrapper[4707]: I0121 16:13:42.420525 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420534 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58427ff4-8738-48bd-a3ea-a99986023664-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.420542 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhzk5\" (UniqueName: \"kubernetes.io/projected/58427ff4-8738-48bd-a3ea-a99986023664-kube-api-access-nhzk5\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.426218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "561add48-0cde-40f7-95b6-09300d00eec4" (UID: "561add48-0cde-40f7-95b6-09300d00eec4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.435355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85d128b6-18c3-4018-a74a-86b025e2196d" (UID: "85d128b6-18c3-4018-a74a-86b025e2196d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.447950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.449251 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.487127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "85d128b6-18c3-4018-a74a-86b025e2196d" (UID: "85d128b6-18c3-4018-a74a-86b025e2196d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.490016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-config-data" (OuterVolumeSpecName: "config-data") pod "561add48-0cde-40f7-95b6-09300d00eec4" (UID: "561add48-0cde-40f7-95b6-09300d00eec4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.521401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66ks\" (UniqueName: \"kubernetes.io/projected/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-kube-api-access-x66ks\") pod \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.521628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-config-data\") pod \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.521763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle\") pod \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\" (UID: \"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.522362 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.522429 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.522510 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.527135 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.527256 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.527315 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.543651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "561add48-0cde-40f7-95b6-09300d00eec4" (UID: "561add48-0cde-40f7-95b6-09300d00eec4"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.577542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-kube-api-access-x66ks" (OuterVolumeSpecName: "kube-api-access-x66ks") pod "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3"). 
InnerVolumeSpecName "kube-api-access-x66ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.577741 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb"] Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.593453 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.619692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "85d128b6-18c3-4018-a74a-86b025e2196d" (UID: "85d128b6-18c3-4018-a74a-86b025e2196d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.619898 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.625902 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.625951 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.627688 4707 generic.go:334] "Generic (PLEG): container finished" podID="3cd910ba-3565-4935-a15f-dda572968edd" containerID="a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.627762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" event={"ID":"3cd910ba-3565-4935-a15f-dda572968edd","Type":"ContainerDied","Data":"a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.627789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" event={"ID":"3cd910ba-3565-4935-a15f-dda572968edd","Type":"ContainerStarted","Data":"a6ec8071ef1e47f5ab62d7a5187112ccb543dceafbc05d75056866db528dfeb6"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.628734 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x66ks\" (UniqueName: \"kubernetes.io/projected/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-kube-api-access-x66ks\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc 
kubenswrapper[4707]: I0121 16:13:42.631240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "561add48-0cde-40f7-95b6-09300d00eec4" (UID: "561add48-0cde-40f7-95b6-09300d00eec4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.632827 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85d128b6-18c3-4018-a74a-86b025e2196d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.632862 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.633406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.640545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-config-data" (OuterVolumeSpecName: "config-data") pod "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" (UID: "047c06ee-71d5-4bcf-83e0-dd8aa295cdb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.641447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" event={"ID":"309eb8bc-de8e-454e-8835-811c8df0ed63","Type":"ContainerDied","Data":"c137677e0e36c3d9ee3b269e74666e1c3abbfe801db743e844e825fdfc61d98b"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.641550 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.654358 4707 generic.go:334] "Generic (PLEG): container finished" podID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerID="2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.654397 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.654446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerDied","Data":"2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.654480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"047c06ee-71d5-4bcf-83e0-dd8aa295cdb3","Type":"ContainerDied","Data":"3aa25acbd4de9cf12483122ba0fa631336d240c2bad44047c3d8daf054a45b76"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.654499 4707 scope.go:117] "RemoveContainer" containerID="2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.655211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "58427ff4-8738-48bd-a3ea-a99986023664" (UID: "58427ff4-8738-48bd-a3ea-a99986023664"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.659714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" event={"ID":"b9d5ac98-e3ae-450a-90fb-45b53c48f9ce","Type":"ContainerDied","Data":"c0e322245408b6886107b77735e060fab647f8fe6c1172720ba7bd3abcfa8f80"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.660094 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.666412 4707 generic.go:334] "Generic (PLEG): container finished" podID="58427ff4-8738-48bd-a3ea-a99986023664" containerID="bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.666513 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.667193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"58427ff4-8738-48bd-a3ea-a99986023664","Type":"ContainerDied","Data":"bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.667224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"58427ff4-8738-48bd-a3ea-a99986023664","Type":"ContainerDied","Data":"02a85cbfb13496723c88bf261d79f640fc16c5d9738d9a6caa90e8a03b802bb1"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.671116 4707 generic.go:334] "Generic (PLEG): container finished" podID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerID="6a3e0961eaad6a849ce19556e597cc407293421db0ba0b4457b3b4372a4b61d6" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.671142 4707 generic.go:334] "Generic (PLEG): container finished" podID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerID="d9b5938294531b6e015f18518c545afd2fc7bc61634c8904c5434b33da8fd969" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.671187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" event={"ID":"81457b6e-3af3-4da8-9f83-0e896795cb2c","Type":"ContainerDied","Data":"6a3e0961eaad6a849ce19556e597cc407293421db0ba0b4457b3b4372a4b61d6"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.671209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" event={"ID":"81457b6e-3af3-4da8-9f83-0e896795cb2c","Type":"ContainerDied","Data":"d9b5938294531b6e015f18518c545afd2fc7bc61634c8904c5434b33da8fd969"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.676347 4707 generic.go:334] "Generic (PLEG): container finished" podID="561add48-0cde-40f7-95b6-09300d00eec4" containerID="d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.676401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"561add48-0cde-40f7-95b6-09300d00eec4","Type":"ContainerDied","Data":"d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.676442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"561add48-0cde-40f7-95b6-09300d00eec4","Type":"ContainerDied","Data":"5daab13c19a07ca007adfa9b4b37be73cd9f33e8ab7fc149dcca36c7474fcae2"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.676412 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.676973 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.677235 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="deb46fb3-5bea-4b35-9e6c-6816415058af" containerName="memcached" containerID="cri-o://9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7" gracePeriod=30 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.684429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.684886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48" event={"ID":"7d5dc988-7c0d-4bef-a471-656baf486bc9","Type":"ContainerDied","Data":"0ed70ca13ea572c8e8a6b2f742413f35cc0451cad77272e07d17cc50400d8d34"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.684936 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-pskrb"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.690474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"10090d75-68eb-4597-98f6-b3f0282513e6","Type":"ContainerDied","Data":"ea0d2fd30a0d46c82d89ccaed6aa727e763a327dc24df044b01a7102712782ba"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.690549 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.692286 4707 generic.go:334] "Generic (PLEG): container finished" podID="85d128b6-18c3-4018-a74a-86b025e2196d" containerID="dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69" exitCode=137 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.692457 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.695631 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn"] Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696135 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696169 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561add48-0cde-40f7-95b6-09300d00eec4" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696175 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="561add48-0cde-40f7-95b6-09300d00eec4" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696189 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696195 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696220 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="ovsdbserver-nb" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696225 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="ovsdbserver-nb" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696238 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58427ff4-8738-48bd-a3ea-a99986023664" containerName="galera" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696243 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="58427ff4-8738-48bd-a3ea-a99986023664" containerName="galera" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696258 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58427ff4-8738-48bd-a3ea-a99986023664" containerName="mysql-bootstrap" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696263 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="58427ff4-8738-48bd-a3ea-a99986023664" containerName="mysql-bootstrap" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696292 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="ovsdbserver-sb" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696298 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="ovsdbserver-sb" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="openstack-network-exporter" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696315 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="openstack-network-exporter" Jan 21 16:13:42 crc 
kubenswrapper[4707]: E0121 16:13:42.696332 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696338 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696349 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="openstack-network-exporter" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696355 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="openstack-network-exporter" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.696365 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10090d75-68eb-4597-98f6-b3f0282513e6" containerName="kube-state-metrics" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696370 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10090d75-68eb-4597-98f6-b3f0282513e6" containerName="kube-state-metrics" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696542 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="openstack-network-exporter" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696556 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696564 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696574 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="561add48-0cde-40f7-95b6-09300d00eec4" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10090d75-68eb-4597-98f6-b3f0282513e6" containerName="kube-state-metrics" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696591 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="ovsdbserver-sb" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696602 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" containerName="openstack-network-exporter" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696610 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" containerName="ovsdbserver-nb" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.696619 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="58427ff4-8738-48bd-a3ea-a99986023664" containerName="galera" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.697369 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.698417 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerID="669bb6d5318148cc1ac21fae845cda6d0b6d3e81db0f375ed6ebf15b8e802dd4" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.698445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerDied","Data":"669bb6d5318148cc1ac21fae845cda6d0b6d3e81db0f375ed6ebf15b8e802dd4"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.699222 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.699611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" event={"ID":"06b93df4-cdbb-470e-8147-2a9bd7e5b74e","Type":"ContainerDied","Data":"9a493328799ba4b7c7fb965e9a58ea1b071f20121979ce5e57356d0723a4699c"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.699667 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.700671 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.710217 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.711670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" event={"ID":"b6e69885-5099-471a-8f06-ef85f63f18b7","Type":"ContainerDied","Data":"cedbe1c40b4e58a4c686c5837aeff97eea323b5f17c2c15bf3a130d890ecdd99"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.711781 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-rg6q7" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.716962 4707 generic.go:334] "Generic (PLEG): container finished" podID="69e40397-47ac-49d6-9b01-50b995d290f7" containerID="8e765e71026c6eb75d97a822236ee30c3868681e753b9c3940e5cd5d39e67c94" exitCode=0 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.717043 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.717464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"69e40397-47ac-49d6-9b01-50b995d290f7","Type":"ContainerDied","Data":"8e765e71026c6eb75d97a822236ee30c3868681e753b9c3940e5cd5d39e67c94"} Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.717544 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.717634 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.723563 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-ssq68"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.734845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pwngz"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.736588 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/561add48-0cde-40f7-95b6-09300d00eec4-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.736610 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.736621 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.736632 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/58427ff4-8738-48bd-a3ea-a99986023664-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.738855 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pwngz"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.745832 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-ssq68"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.768264 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.773746 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6d55599ff6-55gnd"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.774018 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" podUID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" containerName="keystone-api" containerID="cri-o://a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63" gracePeriod=30 Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.812636 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.843134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843314 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843338 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: 
secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843351 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843425 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843426 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.843401221 +0000 UTC m=+4324.024917444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843448 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.843439884 +0000 UTC m=+4324.024956106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-public-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843466 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data podName:0e42cb4d-8129-4286-a6c7-9f8e94524e91 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.843454772 +0000 UTC m=+4324.024970994 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data") pod "rabbitmq-server-0" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91") : configmap "rabbitmq-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843487 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-api-config-data: secret "cinder-api-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843520 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.843506369 +0000 UTC m=+4324.025022591 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-api-config-data" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843589 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843629 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.84361296 +0000 UTC m=+4324.025129182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cinder-scripts" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843659 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.843644749 +0000 UTC m=+4324.025160971 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "combined-ca-bundle" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843686 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.843687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wclvw\" (UniqueName: \"kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.843712 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs podName:559741cf-87c9-47d4-98a3-6b02ffb4610c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.843702929 +0000 UTC m=+4324.025219151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs") pod "cinder-api-0" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c") : secret "cert-cinder-internal-svc" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.851634 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-rzg8t"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.863965 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-rzg8t"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.890044 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.894242 4707 scope.go:117] "RemoveContainer" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.936572 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-config-data\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-run-httpd\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-log-httpd\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw2gw\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-kube-api-access-nw2gw\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-combined-ca-bundle\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-public-tls-certs\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-etc-swift\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.945770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-internal-tls-certs\") pod \"81457b6e-3af3-4da8-9f83-0e896795cb2c\" (UID: \"81457b6e-3af3-4da8-9f83-0e896795cb2c\") " Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.946011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.946872 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.947231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclvw\" (UniqueName: \"kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.947353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.947587 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.947601 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81457b6e-3af3-4da8-9f83-0e896795cb2c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.947658 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.947696 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts podName:42ec36fa-88c2-46f1-affb-ba010670839c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:43.447684344 +0000 UTC m=+4320.629200566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts") pod "keystone-fd0d-account-create-update-wp9kn" (UID: "42ec36fa-88c2-46f1-affb-ba010670839c") : configmap "openstack-scripts" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.951753 4707 projected.go:194] Error preparing data for projected volume kube-api-access-wclvw for pod openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:13:42 crc kubenswrapper[4707]: E0121 16:13:42.952134 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw podName:42ec36fa-88c2-46f1-affb-ba010670839c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:43.452105457 +0000 UTC m=+4320.633621679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wclvw" (UniqueName: "kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw") pod "keystone-fd0d-account-create-update-wp9kn" (UID: "42ec36fa-88c2-46f1-affb-ba010670839c") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.961206 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-b36a-account-create-update-7zlnl"] Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.961415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.961479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-kube-api-access-nw2gw" (OuterVolumeSpecName: "kube-api-access-nw2gw") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "kube-api-access-nw2gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.991348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-config-data" (OuterVolumeSpecName: "config-data") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:42 crc kubenswrapper[4707]: I0121 16:13:42.991419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.003377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.024362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "81457b6e-3af3-4da8-9f83-0e896795cb2c" (UID: "81457b6e-3af3-4da8-9f83-0e896795cb2c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.050567 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.050644 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.050699 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.050760 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw2gw\" (UniqueName: \"kubernetes.io/projected/81457b6e-3af3-4da8-9f83-0e896795cb2c-kube-api-access-nw2gw\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.050826 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.050888 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81457b6e-3af3-4da8-9f83-0e896795cb2c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.059458 4707 scope.go:117] "RemoveContainer" containerID="2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.059895 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700\": container with ID starting with 2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700 not found: ID does not exist" containerID="2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.059982 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700"} err="failed to get container status \"2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700\": rpc error: code = NotFound desc = could not find container \"2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700\": container with ID starting with 2cf905461933c7586008a2568f8aa2dc615a51274fa2a1c47275c0a53b95a700 not found: ID does not exist" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.060057 4707 scope.go:117] "RemoveContainer" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.060376 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd\": container with ID starting with 0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd not found: ID does not exist" containerID="0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd" Jan 21 
16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.060412 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd"} err="failed to get container status \"0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd\": rpc error: code = NotFound desc = could not find container \"0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd\": container with ID starting with 0ce9a409416f8d3902bef1534bdef23f0d5ffe727af5022eef32be66be6ec6fd not found: ID does not exist" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.060436 4707 scope.go:117] "RemoveContainer" containerID="bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.061019 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.070595 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wclvw operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" podUID="42ec36fa-88c2-46f1-affb-ba010670839c" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.102488 4707 scope.go:117] "RemoveContainer" containerID="3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.102718 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerName="galera" containerID="cri-o://2ce45f981554c524b9a786326d92c8a0c4cc2bbf20e590dd6c7773290192d2cd" gracePeriod=30 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.105553 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.139881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.151358 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5ff6c79dfc-9wvl9"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-internal-tls-certs\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvfgk\" (UniqueName: \"kubernetes.io/projected/69e40397-47ac-49d6-9b01-50b995d290f7-kube-api-access-lvfgk\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-scripts\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-httpd-run\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-logs\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.153969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-combined-ca-bundle\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.154128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-config-data\") pod \"69e40397-47ac-49d6-9b01-50b995d290f7\" (UID: \"69e40397-47ac-49d6-9b01-50b995d290f7\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.156673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e40397-47ac-49d6-9b01-50b995d290f7-kube-api-access-lvfgk" (OuterVolumeSpecName: "kube-api-access-lvfgk") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: 
"69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "kube-api-access-lvfgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.156929 4707 scope.go:117] "RemoveContainer" containerID="bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.159608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.160408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-logs" (OuterVolumeSpecName: "logs") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.165236 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4\": container with ID starting with bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4 not found: ID does not exist" containerID="bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.165388 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4"} err="failed to get container status \"bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4\": rpc error: code = NotFound desc = could not find container \"bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4\": container with ID starting with bddc05fd0231d3affe8adf3706f9ec101fbcd56a5ae06f68e2bc514227f62cc4 not found: ID does not exist" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.165430 4707 scope.go:117] "RemoveContainer" containerID="3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.171007 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098\": container with ID starting with 3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098 not found: ID does not exist" containerID="3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.171045 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098"} err="failed to get container status \"3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098\": rpc error: code = NotFound desc = could not find container \"3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098\": container with ID starting with 3f8d80b2218261110aec839aee431ecd10d2abed9df91037867df90e1f06f098 not found: ID does not exist" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.171069 4707 scope.go:117] "RemoveContainer" 
containerID="d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.178664 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.54:9292/healthcheck\": read tcp 10.217.0.2:58148->10.217.0.54:9292: read: connection reset by peer" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.178665 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.54:9292/healthcheck\": read tcp 10.217.0.2:58150->10.217.0.54:9292: read: connection reset by peer" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.180031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-scripts" (OuterVolumeSpecName: "scripts") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.209107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.210976 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153f9936-d723-4c86-a4b2-176325dc9513" path="/var/lib/kubelet/pods/153f9936-d723-4c86-a4b2-176325dc9513/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.211419 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309eb8bc-de8e-454e-8835-811c8df0ed63" path="/var/lib/kubelet/pods/309eb8bc-de8e-454e-8835-811c8df0ed63/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.211833 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d18151-76a1-4301-8747-0e41ad71b2e4" path="/var/lib/kubelet/pods/50d18151-76a1-4301-8747-0e41ad71b2e4/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.212375 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d128b6-18c3-4018-a74a-86b025e2196d" path="/var/lib/kubelet/pods/85d128b6-18c3-4018-a74a-86b025e2196d/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.224352 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb06987b-a8ab-40c8-b73b-f0244ac056c7" path="/var/lib/kubelet/pods/bb06987b-a8ab-40c8-b73b-f0244ac056c7/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.225142 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd" path="/var/lib/kubelet/pods/bb42e16b-54bc-4c45-a1b9-7f8ad04dd4fd/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.225661 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd39e035-830d-4cc3-ab9f-2127564ecddf" path="/var/lib/kubelet/pods/bd39e035-830d-4cc3-ab9f-2127564ecddf/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.226682 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3134eb5-7694-439d-8556-fa12b646b0f4" path="/var/lib/kubelet/pods/c3134eb5-7694-439d-8556-fa12b646b0f4/volumes" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.235647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.236020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-config-data" (OuterVolumeSpecName: "config-data") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.246714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e40397-47ac-49d6-9b01-50b995d290f7" (UID: "69e40397-47ac-49d6-9b01-50b995d290f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.255788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-combined-ca-bundle\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.255851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3292fc-fbc8-481c-803a-3e32ff576d6d-logs\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.255936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-scripts\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkn5h\" (UniqueName: \"kubernetes.io/projected/ab3292fc-fbc8-481c-803a-3e32ff576d6d-kube-api-access-vkn5h\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-public-tls-certs\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-config-data\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" 
(UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-internal-tls-certs\") pod \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\" (UID: \"ab3292fc-fbc8-481c-803a-3e32ff576d6d\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256733 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256746 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256756 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/153f9936-d723-4c86-a4b2-176325dc9513-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256766 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvfgk\" (UniqueName: \"kubernetes.io/projected/69e40397-47ac-49d6-9b01-50b995d290f7-kube-api-access-lvfgk\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256785 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256793 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256802 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256828 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e40397-47ac-49d6-9b01-50b995d290f7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.256837 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e40397-47ac-49d6-9b01-50b995d290f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.261907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3292fc-fbc8-481c-803a-3e32ff576d6d-logs" (OuterVolumeSpecName: "logs") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267926 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-13f5-account-create-update-4gg48"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267946 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-rg6q7"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267956 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-rg6q7"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267965 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267975 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.267987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.269742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-scripts" (OuterVolumeSpecName: "scripts") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.275374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3292fc-fbc8-481c-803a-3e32ff576d6d-kube-api-access-vkn5h" (OuterVolumeSpecName: "kube-api-access-vkn5h") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "kube-api-access-vkn5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.283520 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db52-account-create-update-qz4cv"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.311467 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.318576 4707 scope.go:117] "RemoveContainer" containerID="d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.328968 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8\": container with ID starting with d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8 not found: ID does not exist" containerID="d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.329004 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8"} err="failed to get container status \"d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8\": rpc error: code = NotFound desc = could not find container \"d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8\": container with ID starting with d771625fd3d86ba6a5d2ed08e70612a0fcf52113ad63a435d596700feaf30bb8 not found: ID does not exist" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.329026 4707 scope.go:117] "RemoveContainer" containerID="2cec1240e19583cb646030fe86a81a2167b97fef2ae3246d40fd82a422fe2b74" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.346059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-config-data" (OuterVolumeSpecName: "config-data") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.363789 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkn5h\" (UniqueName: \"kubernetes.io/projected/ab3292fc-fbc8-481c-803a-3e32ff576d6d-kube-api-access-vkn5h\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.363824 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.363834 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.363844 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3292fc-fbc8-481c-803a-3e32ff576d6d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.363852 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.376438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.385002 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cd44fb754-n5rlv"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.388042 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.393846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.396907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.397262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.398889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.402026 4707 scope.go:117] "RemoveContainer" containerID="dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.407593 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.416907 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.426847 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.428564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab3292fc-fbc8-481c-803a-3e32ff576d6d" (UID: "ab3292fc-fbc8-481c-803a-3e32ff576d6d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.448674 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.455720 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5fcdff7b8d-2cscv"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.466583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclvw\" (UniqueName: \"kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.466733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467061 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467088 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467102 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467116 4707 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467128 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467140 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/872bac5e-90cd-4cb0-9279-baa312892916-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467151 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab3292fc-fbc8-481c-803a-3e32ff576d6d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.467174 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz6j8\" (UniqueName: \"kubernetes.io/projected/872bac5e-90cd-4cb0-9279-baa312892916-kube-api-access-tz6j8\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.467253 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.467311 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts podName:42ec36fa-88c2-46f1-affb-ba010670839c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:44.467295738 +0000 UTC m=+4321.648811959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts") pod "keystone-fd0d-account-create-update-wp9kn" (UID: "42ec36fa-88c2-46f1-affb-ba010670839c") : configmap "openstack-scripts" not found Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.470551 4707 projected.go:194] Error preparing data for projected volume kube-api-access-wclvw for pod openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.470628 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw podName:42ec36fa-88c2-46f1-affb-ba010670839c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:44.470608206 +0000 UTC m=+4321.652124427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wclvw" (UniqueName: "kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw") pod "keystone-fd0d-account-create-update-wp9kn" (UID: "42ec36fa-88c2-46f1-affb-ba010670839c") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.486267 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.488633 4707 scope.go:117] "RemoveContainer" containerID="dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.490348 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69\": container with ID starting with dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69 not found: ID does not exist" containerID="dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.490384 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69"} err="failed to get container status \"dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69\": rpc error: code = NotFound desc = could not find container \"dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69\": container with ID starting with dbae17cdb625acc62bf6186b186afbff6c7e17c98c48ba0281b4b8df1c6e0c69 not found: ID does not exist" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.504310 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-dc17-account-create-update-b8j7b"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.513951 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.527024 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.569983 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e813fb-84dc-4d38-b3b5-1be17e1848f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.596389 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.70:8775/\": read tcp 10.217.0.2:37318->10.217.0.70:8775: read: connection reset by peer" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.596394 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.70:8775/\": read tcp 10.217.0.2:37326->10.217.0.70:8775: read: connection reset by peer" Jan 21 16:13:43 crc kubenswrapper[4707]: E0121 16:13:43.613471 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice/crio-770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa\": RecentStats: unable to find data in memory cache]" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.729010 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.744996 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.91:8774/\": read tcp 10.217.0.2:38012->10.217.0.91:8774: read: connection reset by peer" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.745390 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.91:8774/\": read tcp 10.217.0.2:38024->10.217.0.91:8774: read: connection reset by peer" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.756745 4707 generic.go:334] "Generic (PLEG): container finished" podID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerID="d9c4cae2facb86f1f23dc52a9d0d7d3cf12140812ce2d41bc7723b83171f967b" exitCode=0 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.756933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1162dc59-3a54-42e6-84c5-466eafb062b4","Type":"ContainerDied","Data":"d9c4cae2facb86f1f23dc52a9d0d7d3cf12140812ce2d41bc7723b83171f967b"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.762033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" event={"ID":"81457b6e-3af3-4da8-9f83-0e896795cb2c","Type":"ContainerDied","Data":"770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.762079 4707 scope.go:117] "RemoveContainer" containerID="6a3e0961eaad6a849ce19556e597cc407293421db0ba0b4457b3b4372a4b61d6" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.762239 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.774296 4707 generic.go:334] "Generic (PLEG): container finished" podID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerID="2ce45f981554c524b9a786326d92c8a0c4cc2bbf20e590dd6c7773290192d2cd" exitCode=0 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.774388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b","Type":"ContainerDied","Data":"2ce45f981554c524b9a786326d92c8a0c4cc2bbf20e590dd6c7773290192d2cd"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.784097 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.799887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.808586 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c98dd475b-csd2n"] Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.820957 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerID="142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb" exitCode=0 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.821230 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.827997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" event={"ID":"ab3292fc-fbc8-481c-803a-3e32ff576d6d","Type":"ContainerDied","Data":"142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.828098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-644c6c574-8cgjw" event={"ID":"ab3292fc-fbc8-481c-803a-3e32ff576d6d","Type":"ContainerDied","Data":"937b13686bf85801046f72c011a90669b57937a4d5c6fa09f6ad12296a5683dd"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.838774 4707 generic.go:334] "Generic (PLEG): container finished" podID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerID="b7106616e826175cc64ed9a092fca4a51264ccb076e1e9ed7b657799fb68de52" exitCode=0 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.838847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"559741cf-87c9-47d4-98a3-6b02ffb4610c","Type":"ContainerDied","Data":"b7106616e826175cc64ed9a092fca4a51264ccb076e1e9ed7b657799fb68de52"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.852982 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerID="a3f629a586eeaf3a0430289235f8ed826a343e4d540c7552bf23587188a60f57" exitCode=0 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.853082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2","Type":"ContainerDied","Data":"a3f629a586eeaf3a0430289235f8ed826a343e4d540c7552bf23587188a60f57"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-config-data\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-config-data\") pod \"deb46fb3-5bea-4b35-9e6c-6816415058af\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-combined-ca-bundle\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-logs\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-public-tls-certs\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-memcached-tls-certs\") pod \"deb46fb3-5bea-4b35-9e6c-6816415058af\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-httpd-run\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.888931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg558\" (UniqueName: \"kubernetes.io/projected/1162dc59-3a54-42e6-84c5-466eafb062b4-kube-api-access-fg558\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.889038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-kolla-config\") pod \"deb46fb3-5bea-4b35-9e6c-6816415058af\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.889088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-combined-ca-bundle\") pod \"deb46fb3-5bea-4b35-9e6c-6816415058af\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.891275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-config-data" (OuterVolumeSpecName: "config-data") pod "deb46fb3-5bea-4b35-9e6c-6816415058af" (UID: "deb46fb3-5bea-4b35-9e6c-6816415058af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.896312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-scripts\") pod \"1162dc59-3a54-42e6-84c5-466eafb062b4\" (UID: \"1162dc59-3a54-42e6-84c5-466eafb062b4\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.896412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2bzj\" (UniqueName: \"kubernetes.io/projected/deb46fb3-5bea-4b35-9e6c-6816415058af-kube-api-access-z2bzj\") pod \"deb46fb3-5bea-4b35-9e6c-6816415058af\" (UID: \"deb46fb3-5bea-4b35-9e6c-6816415058af\") " Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.897845 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.910544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.915573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-logs" (OuterVolumeSpecName: "logs") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.915688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "deb46fb3-5bea-4b35-9e6c-6816415058af" (UID: "deb46fb3-5bea-4b35-9e6c-6816415058af"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.919238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb46fb3-5bea-4b35-9e6c-6816415058af-kube-api-access-z2bzj" (OuterVolumeSpecName: "kube-api-access-z2bzj") pod "deb46fb3-5bea-4b35-9e6c-6816415058af" (UID: "deb46fb3-5bea-4b35-9e6c-6816415058af"). InnerVolumeSpecName "kube-api-access-z2bzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.928721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-scripts" (OuterVolumeSpecName: "scripts") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.928821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"69e40397-47ac-49d6-9b01-50b995d290f7","Type":"ContainerDied","Data":"a6c7cd084f1ec726b7c15a7e37b8c82152c44796da07eb2e51101755c1bb1094"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.928861 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.932944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1162dc59-3a54-42e6-84c5-466eafb062b4-kube-api-access-fg558" (OuterVolumeSpecName: "kube-api-access-fg558") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "kube-api-access-fg558". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.952235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.973777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" event={"ID":"3cd910ba-3565-4935-a15f-dda572968edd","Type":"ContainerStarted","Data":"11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.974568 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.983715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.998044 4707 generic.go:334] "Generic (PLEG): container finished" podID="deb46fb3-5bea-4b35-9e6c-6816415058af" containerID="9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7" exitCode=0 Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.998193 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.998329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deb46fb3-5bea-4b35-9e6c-6816415058af","Type":"ContainerDied","Data":"9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7"} Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.998345 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:13:43 crc kubenswrapper[4707]: I0121 16:13:43.998355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deb46fb3-5bea-4b35-9e6c-6816415058af","Type":"ContainerDied","Data":"42e1f70bcbe8876300c9e3e1ec3625a74d1b6332a4404da58829101af41b0d68"} Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000398 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2bzj\" (UniqueName: \"kubernetes.io/projected/deb46fb3-5bea-4b35-9e6c-6816415058af-kube-api-access-z2bzj\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000428 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000439 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000466 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000477 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1162dc59-3a54-42e6-84c5-466eafb062b4-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000487 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg558\" (UniqueName: \"kubernetes.io/projected/1162dc59-3a54-42e6-84c5-466eafb062b4-kube-api-access-fg558\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000495 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb46fb3-5bea-4b35-9e6c-6816415058af-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.000502 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.017262 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" podStartSLOduration=5.017242761 podStartE2EDuration="5.017242761s" podCreationTimestamp="2026-01-21 16:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:13:43.998614985 +0000 UTC m=+4321.180131207" watchObservedRunningTime="2026-01-21 16:13:44.017242761 +0000 UTC m=+4321.198758982" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.023270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-config-data" (OuterVolumeSpecName: "config-data") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.029417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1162dc59-3a54-42e6-84c5-466eafb062b4" (UID: "1162dc59-3a54-42e6-84c5-466eafb062b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.047510 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.047523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb46fb3-5bea-4b35-9e6c-6816415058af" (UID: "deb46fb3-5bea-4b35-9e6c-6816415058af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.066884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "deb46fb3-5bea-4b35-9e6c-6816415058af" (UID: "deb46fb3-5bea-4b35-9e6c-6816415058af"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.105119 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.105432 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1162dc59-3a54-42e6-84c5-466eafb062b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.105444 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.105453 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.105463 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb46fb3-5bea-4b35-9e6c-6816415058af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.151877 4707 scope.go:117] "RemoveContainer" containerID="d9b5938294531b6e015f18518c545afd2fc7bc61634c8904c5434b33da8fd969" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.160855 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-644c6c574-8cgjw"] Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.161282 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.163034 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.166723 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-644c6c574-8cgjw"] Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.171847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.176023 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.183043 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.183213 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.199157 4707 scope.go:117] "RemoveContainer" containerID="142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-default\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-combined-ca-bundle\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htmzz\" (UniqueName: \"kubernetes.io/projected/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-kube-api-access-htmzz\") pod \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-nova-metadata-tls-certs\") pod \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 
16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308581 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-logs\") pod \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9jhb\" (UniqueName: \"kubernetes.io/projected/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kube-api-access-l9jhb\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kolla-config\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/559741cf-87c9-47d4-98a3-6b02ffb4610c-etc-machine-id\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-generated\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-config-data\") pod \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 
16:13:44.308835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-operator-scripts\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mswgl\" (UniqueName: \"kubernetes.io/projected/559741cf-87c9-47d4-98a3-6b02ffb4610c-kube-api-access-mswgl\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-combined-ca-bundle\") pod \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\" (UID: \"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-galera-tls-certs\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/559741cf-87c9-47d4-98a3-6b02ffb4610c-logs\") pod \"559741cf-87c9-47d4-98a3-6b02ffb4610c\" (UID: \"559741cf-87c9-47d4-98a3-6b02ffb4610c\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.308969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\" (UID: \"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.312580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.312617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.313389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.314139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559741cf-87c9-47d4-98a3-6b02ffb4610c-logs" (OuterVolumeSpecName: "logs") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.325293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/559741cf-87c9-47d4-98a3-6b02ffb4610c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.325341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-kube-api-access-htmzz" (OuterVolumeSpecName: "kube-api-access-htmzz") pod "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" (UID: "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2"). InnerVolumeSpecName "kube-api-access-htmzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.325562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.325891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.326184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-logs" (OuterVolumeSpecName: "logs") pod "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" (UID: "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.329938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts" (OuterVolumeSpecName: "scripts") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.332919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559741cf-87c9-47d4-98a3-6b02ffb4610c-kube-api-access-mswgl" (OuterVolumeSpecName: "kube-api-access-mswgl") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "kube-api-access-mswgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.339435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kube-api-access-l9jhb" (OuterVolumeSpecName: "kube-api-access-l9jhb") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "kube-api-access-l9jhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.347306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-config-data" (OuterVolumeSpecName: "config-data") pod "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" (UID: "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.360430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.369320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.377860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.393352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" (UID: "f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.393593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" (UID: "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.400995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data" (OuterVolumeSpecName: "config-data") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.402928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.403432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" (UID: "a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.408974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "559741cf-87c9-47d4-98a3-6b02ffb4610c" (UID: "559741cf-87c9-47d4-98a3-6b02ffb4610c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415074 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/559741cf-87c9-47d4-98a3-6b02ffb4610c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415106 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415119 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415130 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415140 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mswgl\" (UniqueName: \"kubernetes.io/projected/559741cf-87c9-47d4-98a3-6b02ffb4610c-kube-api-access-mswgl\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415151 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415170 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415178 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/559741cf-87c9-47d4-98a3-6b02ffb4610c-logs\") on node \"crc\" DevicePath 
\"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415208 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415218 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415227 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415238 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415249 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htmzz\" (UniqueName: \"kubernetes.io/projected/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-kube-api-access-htmzz\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415257 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415267 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415277 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415286 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415293 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415303 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9jhb\" (UniqueName: \"kubernetes.io/projected/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kube-api-access-l9jhb\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415312 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.415320 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc 
kubenswrapper[4707]: I0121 16:13:44.415329 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559741cf-87c9-47d4-98a3-6b02ffb4610c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.439081 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.442114 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.442494 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.446730 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.448182 4707 scope.go:117] "RemoveContainer" containerID="8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.454886 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.487211 4707 scope.go:117] "RemoveContainer" containerID="142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb" Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.493124 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb\": container with ID starting with 142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb not found: ID does not exist" containerID="142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.493189 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb"} err="failed to get container status \"142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb\": rpc error: code = NotFound desc = could not find container \"142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb\": container with ID starting with 142cfeb16fd5aa66761b6947f6c9cb8ba9bbd086739e3b485d6a380c100d5adb not found: ID does not exist" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.493216 4707 scope.go:117] "RemoveContainer" containerID="8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e" Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.495636 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e\": container with ID starting with 8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e not found: ID does not exist" containerID="8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.495665 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e"} err="failed to get container status 
\"8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e\": rpc error: code = NotFound desc = could not find container \"8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e\": container with ID starting with 8e4cb210835dbcd7d204d29f1c50e6cfe106a711310352f4aab8f6deab73ab1e not found: ID does not exist" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.495680 4707 scope.go:117] "RemoveContainer" containerID="8e765e71026c6eb75d97a822236ee30c3868681e753b9c3940e5cd5d39e67c94" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.516705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclvw\" (UniqueName: \"kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.516898 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.517178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts\") pod \"keystone-fd0d-account-create-update-wp9kn\" (UID: \"42ec36fa-88c2-46f1-affb-ba010670839c\") " pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.517287 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts podName:42ec36fa-88c2-46f1-affb-ba010670839c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.517251986 +0000 UTC m=+4323.698768208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts") pod "keystone-fd0d-account-create-update-wp9kn" (UID: "42ec36fa-88c2-46f1-affb-ba010670839c") : configmap "openstack-scripts" not found Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.517469 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.525993 4707 projected.go:194] Error preparing data for projected volume kube-api-access-wclvw for pod openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.526237 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw podName:42ec36fa-88c2-46f1-affb-ba010670839c nodeName:}" failed. No retries permitted until 2026-01-21 16:13:46.526080326 +0000 UTC m=+4323.707596548 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wclvw" (UniqueName: "kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw") pod "keystone-fd0d-account-create-update-wp9kn" (UID: "42ec36fa-88c2-46f1-affb-ba010670839c") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.534400 4707 scope.go:117] "RemoveContainer" containerID="cd8eeca7ee1cb0f9e8acd6fb2fc5f3841099beedc093a3deaf46d81c1d640e58" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.557917 4707 scope.go:117] "RemoveContainer" containerID="9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.584262 4707 scope.go:117] "RemoveContainer" containerID="9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7" Jan 21 16:13:44 crc kubenswrapper[4707]: E0121 16:13:44.584730 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7\": container with ID starting with 9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7 not found: ID does not exist" containerID="9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.584758 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7"} err="failed to get container status \"9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7\": rpc error: code = NotFound desc = could not find container \"9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7\": container with ID starting with 9d6baef4adc6b4d9bd5eb08aa857b2e81bea98ce7af88ccde0f88c4f4a282eb7 not found: ID does not exist" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.618954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data-custom\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-public-tls-certs\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlxtq\" (UniqueName: \"kubernetes.io/projected/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-kube-api-access-jlxtq\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/594873ed-1daa-422c-853c-65fab465dbdb-logs\") pod \"594873ed-1daa-422c-853c-65fab465dbdb\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-public-tls-certs\") pod \"594873ed-1daa-422c-853c-65fab465dbdb\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-combined-ca-bundle\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnkkz\" (UniqueName: \"kubernetes.io/projected/594873ed-1daa-422c-853c-65fab465dbdb-kube-api-access-nnkkz\") pod \"594873ed-1daa-422c-853c-65fab465dbdb\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-internal-tls-certs\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-internal-tls-certs\") pod \"594873ed-1daa-422c-853c-65fab465dbdb\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-config-data\") pod \"594873ed-1daa-422c-853c-65fab465dbdb\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-combined-ca-bundle\") pod \"594873ed-1daa-422c-853c-65fab465dbdb\" (UID: \"594873ed-1daa-422c-853c-65fab465dbdb\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.619796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-logs\") pod \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\" (UID: \"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0\") " Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.622154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594873ed-1daa-422c-853c-65fab465dbdb-logs" (OuterVolumeSpecName: "logs") pod "594873ed-1daa-422c-853c-65fab465dbdb" (UID: "594873ed-1daa-422c-853c-65fab465dbdb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.622308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-logs" (OuterVolumeSpecName: "logs") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.724972 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:44 crc kubenswrapper[4707]: I0121 16:13:44.725347 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594873ed-1daa-422c-853c-65fab465dbdb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.009136 4707 generic.go:334] "Generic (PLEG): container finished" podID="594873ed-1daa-422c-853c-65fab465dbdb" containerID="8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a" exitCode=0 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.009204 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.009236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"594873ed-1daa-422c-853c-65fab465dbdb","Type":"ContainerDied","Data":"8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.009479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"594873ed-1daa-422c-853c-65fab465dbdb","Type":"ContainerDied","Data":"bdd8aa679b0f57b3df344998456e4d250d7959b5e962261728ab17ce6dd9b21b"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.009530 4707 scope.go:117] "RemoveContainer" containerID="8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.015863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"559741cf-87c9-47d4-98a3-6b02ffb4610c","Type":"ContainerDied","Data":"d79c34f29b08fd0ad6c8ad1d97bf1f3506743a117855e0388eb175ee1db2a8c4"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.015928 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.020121 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerID="34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0" exitCode=0 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.020194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" event={"ID":"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0","Type":"ContainerDied","Data":"34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.020216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" event={"ID":"c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0","Type":"ContainerDied","Data":"f5fa696efe6598371fba15c3d9ef300ea51e9809ba27a262ee7247e760e06552"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.020268 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff884c948-lmvhf" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.024667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1162dc59-3a54-42e6-84c5-466eafb062b4","Type":"ContainerDied","Data":"38d27dd9bb919d4d3cdd89043730417d1b73ecdd18fafcb0e3bc9684d17af3f1"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.024740 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.040761 4707 generic.go:334] "Generic (PLEG): container finished" podID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerID="d3f4f9fc5bc4ad2e024ecd33cb433c74190a4054f4d5fc7992c2d5cfaceae52f" exitCode=0 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.040895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" event={"ID":"b115ad1a-72c2-4d2e-8669-4d125acf6751","Type":"ContainerDied","Data":"d3f4f9fc5bc4ad2e024ecd33cb433c74190a4054f4d5fc7992c2d5cfaceae52f"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.043763 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8/ovn-northd/0.log" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.043831 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerID="6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb" exitCode=139 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.043956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8","Type":"ContainerDied","Data":"6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.049269 4707 generic.go:334] "Generic (PLEG): container finished" podID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerID="4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975" exitCode=0 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.049349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" 
event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerDied","Data":"4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.055280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b","Type":"ContainerDied","Data":"4599c305e572f34ed5c16b20e2dedae0100fe3465c84e3075b360da360309573"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.055431 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.059106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2","Type":"ContainerDied","Data":"7319f861ffb7c9c24e473247fe12ac8f3757571c1b479da39dc27d4a08e061ed"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.059121 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.061623 4707 generic.go:334] "Generic (PLEG): container finished" podID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerID="ffe9e5471773aa44d24bb089adcede94c9a0a7c07e343fc71a5076c836a67866" exitCode=0 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.061690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerDied","Data":"ffe9e5471773aa44d24bb089adcede94c9a0a7c07e343fc71a5076c836a67866"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.068287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.068391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594873ed-1daa-422c-853c-65fab465dbdb-kube-api-access-nnkkz" (OuterVolumeSpecName: "kube-api-access-nnkkz") pod "594873ed-1daa-422c-853c-65fab465dbdb" (UID: "594873ed-1daa-422c-853c-65fab465dbdb"). InnerVolumeSpecName "kube-api-access-nnkkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.069301 4707 generic.go:334] "Generic (PLEG): container finished" podID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerID="22c7df73a4078571e27ee73b13d827243c425325781c88d7bd9adb9efeb6f37a" exitCode=0 Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.069441 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.069303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2f6304c7-b127-4fcb-a74b-b5009a2b1a50","Type":"ContainerDied","Data":"22c7df73a4078571e27ee73b13d827243c425325781c88d7bd9adb9efeb6f37a"} Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.073326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-kube-api-access-jlxtq" (OuterVolumeSpecName: "kube-api-access-jlxtq") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "kube-api-access-jlxtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.115707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-config-data" (OuterVolumeSpecName: "config-data") pod "594873ed-1daa-422c-853c-65fab465dbdb" (UID: "594873ed-1daa-422c-853c-65fab465dbdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.118951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.127489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594873ed-1daa-422c-853c-65fab465dbdb" (UID: "594873ed-1daa-422c-853c-65fab465dbdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.138475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "594873ed-1daa-422c-853c-65fab465dbdb" (UID: "594873ed-1daa-422c-853c-65fab465dbdb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.144234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.144516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data" (OuterVolumeSpecName: "config-data") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.147902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "594873ed-1daa-422c-853c-65fab465dbdb" (UID: "594873ed-1daa-422c-853c-65fab465dbdb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.158987 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159018 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159033 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159046 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159057 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159068 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlxtq\" (UniqueName: \"kubernetes.io/projected/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-kube-api-access-jlxtq\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159080 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594873ed-1daa-422c-853c-65fab465dbdb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159090 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159100 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnkkz\" (UniqueName: \"kubernetes.io/projected/594873ed-1daa-422c-853c-65fab465dbdb-kube-api-access-nnkkz\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.159114 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.191443 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" path="/var/lib/kubelet/pods/047c06ee-71d5-4bcf-83e0-dd8aa295cdb3/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.192046 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b93df4-cdbb-470e-8147-2a9bd7e5b74e" path="/var/lib/kubelet/pods/06b93df4-cdbb-470e-8147-2a9bd7e5b74e/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.192439 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10090d75-68eb-4597-98f6-b3f0282513e6" path="/var/lib/kubelet/pods/10090d75-68eb-4597-98f6-b3f0282513e6/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.192973 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561add48-0cde-40f7-95b6-09300d00eec4" path="/var/lib/kubelet/pods/561add48-0cde-40f7-95b6-09300d00eec4/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.194283 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58427ff4-8738-48bd-a3ea-a99986023664" path="/var/lib/kubelet/pods/58427ff4-8738-48bd-a3ea-a99986023664/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.194949 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" path="/var/lib/kubelet/pods/69e40397-47ac-49d6-9b01-50b995d290f7/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.196352 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5dc988-7c0d-4bef-a471-656baf486bc9" path="/var/lib/kubelet/pods/7d5dc988-7c0d-4bef-a471-656baf486bc9/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.196762 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" path="/var/lib/kubelet/pods/81457b6e-3af3-4da8-9f83-0e896795cb2c/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.197333 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872bac5e-90cd-4cb0-9279-baa312892916" path="/var/lib/kubelet/pods/872bac5e-90cd-4cb0-9279-baa312892916/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.197729 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" path="/var/lib/kubelet/pods/ab3292fc-fbc8-481c-803a-3e32ff576d6d/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.198865 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e69885-5099-471a-8f06-ef85f63f18b7" path="/var/lib/kubelet/pods/b6e69885-5099-471a-8f06-ef85f63f18b7/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.199373 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b86320b8-c3c8-42d6-b48b-206526f89271" path="/var/lib/kubelet/pods/b86320b8-c3c8-42d6-b48b-206526f89271/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.200009 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d5ac98-e3ae-450a-90fb-45b53c48f9ce" path="/var/lib/kubelet/pods/b9d5ac98-e3ae-450a-90fb-45b53c48f9ce/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.200936 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb46fb3-5bea-4b35-9e6c-6816415058af" path="/var/lib/kubelet/pods/deb46fb3-5bea-4b35-9e6c-6816415058af/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.202370 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e813fb-84dc-4d38-b3b5-1be17e1848f6" path="/var/lib/kubelet/pods/e9e813fb-84dc-4d38-b3b5-1be17e1848f6/volumes" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.242715 4707 scope.go:117] "RemoveContainer" 
containerID="ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.251081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" (UID: "c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.260852 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.308618 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.329123 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.334461 4707 scope.go:117] "RemoveContainer" containerID="8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a" Jan 21 16:13:45 crc kubenswrapper[4707]: E0121 16:13:45.335343 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a\": container with ID starting with 8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a not found: ID does not exist" containerID="8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.335376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a"} err="failed to get container status \"8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a\": rpc error: code = NotFound desc = could not find container \"8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a\": container with ID starting with 8a7b8d7ea41484ac9469cc6f0e66e0b85e6f42aba8428a8256128dfa2c8d700a not found: ID does not exist" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.335399 4707 scope.go:117] "RemoveContainer" containerID="ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2" Jan 21 16:13:45 crc kubenswrapper[4707]: E0121 16:13:45.335682 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2\": container with ID starting with ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2 not found: ID does not exist" containerID="ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.335739 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2"} err="failed to get container status \"ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2\": rpc error: code = NotFound desc = could not find container \"ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2\": container with ID starting 
with ff83c06e0b031316eaa302077b22c15c3dc569706eac7dd9cc35867ca428a2b2 not found: ID does not exist" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.336201 4707 scope.go:117] "RemoveContainer" containerID="b7106616e826175cc64ed9a092fca4a51264ccb076e1e9ed7b657799fb68de52" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.345662 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8/ovn-northd/0.log" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.345756 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.358225 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.363084 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-scripts\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-config\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn8jb\" (UniqueName: \"kubernetes.io/projected/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-kube-api-access-rn8jb\") pod \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmszn\" (UniqueName: \"kubernetes.io/projected/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-kube-api-access-qmszn\") pod \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-combined-ca-bundle\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-combined-ca-bundle\") pod \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-rundir\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: 
I0121 16:13:45.365429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9qfc\" (UniqueName: \"kubernetes.io/projected/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-kube-api-access-d9qfc\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-etc-machine-id\") pod \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.365989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-scripts\") pod \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.367290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-config-data\") pod \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.367692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data-custom\") pod \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.367872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data\") pod \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\" (UID: \"2f6304c7-b127-4fcb-a74b-b5009a2b1a50\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.368225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-config" (OuterVolumeSpecName: "config") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.368309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-metrics-certs-tls-certs\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.368394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-combined-ca-bundle\") pod \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\" (UID: \"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.368417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-northd-tls-certs\") pod \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\" (UID: \"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.370766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-scripts" (OuterVolumeSpecName: "scripts") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.371633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.371748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2f6304c7-b127-4fcb-a74b-b5009a2b1a50" (UID: "2f6304c7-b127-4fcb-a74b-b5009a2b1a50"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.373023 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.374587 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.374611 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.374622 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.374633 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.377560 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-fd0d-account-create-update-wp9kn"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.381059 4707 scope.go:117] "RemoveContainer" containerID="30b75fdd71f74771e52987f8d2a7b8e272eabb3c7db15d9c46d8dfc4759a563c" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.388007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-kube-api-access-d9qfc" (OuterVolumeSpecName: "kube-api-access-d9qfc") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "kube-api-access-d9qfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.390864 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.391085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2f6304c7-b127-4fcb-a74b-b5009a2b1a50" (UID: "2f6304c7-b127-4fcb-a74b-b5009a2b1a50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.405873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-scripts" (OuterVolumeSpecName: "scripts") pod "2f6304c7-b127-4fcb-a74b-b5009a2b1a50" (UID: "2f6304c7-b127-4fcb-a74b-b5009a2b1a50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.428857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-kube-api-access-rn8jb" (OuterVolumeSpecName: "kube-api-access-rn8jb") pod "dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" (UID: "dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56"). 
InnerVolumeSpecName "kube-api-access-rn8jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.436536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-kube-api-access-qmszn" (OuterVolumeSpecName: "kube-api-access-qmszn") pod "2f6304c7-b127-4fcb-a74b-b5009a2b1a50" (UID: "2f6304c7-b127-4fcb-a74b-b5009a2b1a50"). InnerVolumeSpecName "kube-api-access-qmszn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.440333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-config-data" (OuterVolumeSpecName: "config-data") pod "dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" (UID: "dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.448650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.449756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.450371 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.462051 4707 scope.go:117] "RemoveContainer" containerID="34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.465262 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.472323 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff884c948-lmvhf"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.475991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-combined-ca-bundle\") pod \"b115ad1a-72c2-4d2e-8669-4d125acf6751\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data\") pod \"b115ad1a-72c2-4d2e-8669-4d125acf6751\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-config-data\") pod \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b115ad1a-72c2-4d2e-8669-4d125acf6751-logs\") pod \"b115ad1a-72c2-4d2e-8669-4d125acf6751\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fr27\" (UniqueName: \"kubernetes.io/projected/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-kube-api-access-2fr27\") pod \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data-custom\") pod \"b115ad1a-72c2-4d2e-8669-4d125acf6751\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqrsr\" (UniqueName: \"kubernetes.io/projected/b115ad1a-72c2-4d2e-8669-4d125acf6751-kube-api-access-cqrsr\") pod \"b115ad1a-72c2-4d2e-8669-4d125acf6751\" (UID: \"b115ad1a-72c2-4d2e-8669-4d125acf6751\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.476501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-combined-ca-bundle\") pod \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\" (UID: \"647f33a8-e7c7-4941-9c0d-f9c5e4b07409\") " Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477224 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn8jb\" (UniqueName: \"kubernetes.io/projected/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-kube-api-access-rn8jb\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477247 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477259 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmszn\" (UniqueName: \"kubernetes.io/projected/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-kube-api-access-qmszn\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477269 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9qfc\" (UniqueName: \"kubernetes.io/projected/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-kube-api-access-d9qfc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477279 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477289 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477300 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42ec36fa-88c2-46f1-affb-ba010670839c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc 
kubenswrapper[4707]: I0121 16:13:45.477310 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477319 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wclvw\" (UniqueName: \"kubernetes.io/projected/42ec36fa-88c2-46f1-affb-ba010670839c-kube-api-access-wclvw\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b115ad1a-72c2-4d2e-8669-4d125acf6751-logs" (OuterVolumeSpecName: "logs") pod "b115ad1a-72c2-4d2e-8669-4d125acf6751" (UID: "b115ad1a-72c2-4d2e-8669-4d125acf6751"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.477943 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff884c948-lmvhf"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.483598 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.485777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b115ad1a-72c2-4d2e-8669-4d125acf6751-kube-api-access-cqrsr" (OuterVolumeSpecName: "kube-api-access-cqrsr") pod "b115ad1a-72c2-4d2e-8669-4d125acf6751" (UID: "b115ad1a-72c2-4d2e-8669-4d125acf6751"). InnerVolumeSpecName "kube-api-access-cqrsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.486432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-kube-api-access-2fr27" (OuterVolumeSpecName: "kube-api-access-2fr27") pod "647f33a8-e7c7-4941-9c0d-f9c5e4b07409" (UID: "647f33a8-e7c7-4941-9c0d-f9c5e4b07409"). InnerVolumeSpecName "kube-api-access-2fr27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.488957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b115ad1a-72c2-4d2e-8669-4d125acf6751" (UID: "b115ad1a-72c2-4d2e-8669-4d125acf6751"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.491536 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.497778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" (UID: "dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.497862 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.498771 4707 scope.go:117] "RemoveContainer" containerID="408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.509346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "647f33a8-e7c7-4941-9c0d-f9c5e4b07409" (UID: "647f33a8-e7c7-4941-9c0d-f9c5e4b07409"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.513030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f6304c7-b127-4fcb-a74b-b5009a2b1a50" (UID: "2f6304c7-b127-4fcb-a74b-b5009a2b1a50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.517962 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.524737 4707 scope.go:117] "RemoveContainer" containerID="34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.527719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-config-data" (OuterVolumeSpecName: "config-data") pod "647f33a8-e7c7-4941-9c0d-f9c5e4b07409" (UID: "647f33a8-e7c7-4941-9c0d-f9c5e4b07409"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.527759 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: E0121 16:13:45.529467 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0\": container with ID starting with 34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0 not found: ID does not exist" containerID="34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.529505 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0"} err="failed to get container status \"34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0\": rpc error: code = NotFound desc = could not find container \"34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0\": container with ID starting with 34e9b45d23c43cf942e44e1a9a226fe5c0d0fe2650bfd94777275eb594203ae0 not found: ID does not exist" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.529548 4707 scope.go:117] "RemoveContainer" containerID="408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.530677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b115ad1a-72c2-4d2e-8669-4d125acf6751" (UID: "b115ad1a-72c2-4d2e-8669-4d125acf6751"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: E0121 16:13:45.531047 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6\": container with ID starting with 408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6 not found: ID does not exist" containerID="408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.531135 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6"} err="failed to get container status \"408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6\": rpc error: code = NotFound desc = could not find container \"408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6\": container with ID starting with 408fd9c69d4e142ea774836149f77d664c0252e1345e5c8ee3a6bcc4258b28c6 not found: ID does not exist" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.531216 4707 scope.go:117] "RemoveContainer" containerID="d9c4cae2facb86f1f23dc52a9d0d7d3cf12140812ce2d41bc7723b83171f967b" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.540942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.541707 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.543710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data" (OuterVolumeSpecName: "config-data") pod "b115ad1a-72c2-4d2e-8669-4d125acf6751" (UID: "b115ad1a-72c2-4d2e-8669-4d125acf6751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.553769 4707 scope.go:117] "RemoveContainer" containerID="0d655dcc30fe3941863c06455c345616e3fae519df331a4a8dd1db93d04f7978" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.556444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" (UID: "8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.559420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data" (OuterVolumeSpecName: "config-data") pod "2f6304c7-b127-4fcb-a74b-b5009a2b1a50" (UID: "2f6304c7-b127-4fcb-a74b-b5009a2b1a50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579229 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579260 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579272 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579282 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579292 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579302 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b115ad1a-72c2-4d2e-8669-4d125acf6751-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579313 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fr27\" (UniqueName: \"kubernetes.io/projected/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-kube-api-access-2fr27\") on node \"crc\" 
DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579323 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579332 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579340 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b115ad1a-72c2-4d2e-8669-4d125acf6751-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579349 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqrsr\" (UniqueName: \"kubernetes.io/projected/b115ad1a-72c2-4d2e-8669-4d125acf6751-kube-api-access-cqrsr\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579357 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647f33a8-e7c7-4941-9c0d-f9c5e4b07409-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.579365 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6304c7-b127-4fcb-a74b-b5009a2b1a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.587450 4707 scope.go:117] "RemoveContainer" containerID="1408b5018e888ee2df7c2a74b879464d9d7f9397fbdd091d97fbff81697afea1" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.609748 4707 scope.go:117] "RemoveContainer" containerID="2ce45f981554c524b9a786326d92c8a0c4cc2bbf20e590dd6c7773290192d2cd" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.632304 4707 scope.go:117] "RemoveContainer" containerID="062b57f9c5689c8c3511ae8fbeb3929d0f15a809640fc51bf28faf8a24ac751f" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.650537 4707 scope.go:117] "RemoveContainer" containerID="a3f629a586eeaf3a0430289235f8ed826a343e4d540c7552bf23587188a60f57" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.677059 4707 scope.go:117] "RemoveContainer" containerID="75695222855ca1c4f0fbbb6b2725a344e12f18c6673f5479b9fa3e2af236f9ba" Jan 21 16:13:45 crc kubenswrapper[4707]: I0121 16:13:45.690291 4707 scope.go:117] "RemoveContainer" containerID="7261e8621661d753a0ab84dc58a6333b9f4bf934a0bff35dc7ae6582d4347840" Jan 21 16:13:45 crc kubenswrapper[4707]: E0121 16:13:45.884107 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:45 crc kubenswrapper[4707]: E0121 16:13:45.884328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data podName:5d633804-5d9a-4792-b183-51e2c1a2ebe2 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:53.884312134 +0000 UTC m=+4331.065828356 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data") pod "rabbitmq-cell1-server-0" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.080326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2f6304c7-b127-4fcb-a74b-b5009a2b1a50","Type":"ContainerDied","Data":"4368e0e23f4023ca7c8ec88a8592985a35fe982982f1cbc296355f1a66ebe007"} Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.080397 4707 scope.go:117] "RemoveContainer" containerID="cd779a5c94d7063de0abc7e3a7f2b71e9ae69f23b6f7b82dbce250ade905a2db" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.080525 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.083451 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8/ovn-northd/0.log" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.083539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8","Type":"ContainerDied","Data":"10a89f75f33c3601c50829a918237a55bde8d9395e3825b4addc64c864a74c34"} Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.083567 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.085442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"647f33a8-e7c7-4941-9c0d-f9c5e4b07409","Type":"ContainerDied","Data":"0f515b5801bd198300964126774afddd1c6e414cd602008997bdec48127d09c5"} Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.085511 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.090658 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" event={"ID":"b115ad1a-72c2-4d2e-8669-4d125acf6751","Type":"ContainerDied","Data":"ffc2e91804b972561f45233d7b51f3ca8db7af0e53ef85d4b155d0c761a635bb"} Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.090740 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.110624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56","Type":"ContainerDied","Data":"42518b90ebd1540df899bb3b30c8c78594e131cd0f489c1d21293ba12545ed6d"} Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.110685 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.248453 4707 scope.go:117] "RemoveContainer" containerID="22c7df73a4078571e27ee73b13d827243c425325781c88d7bd9adb9efeb6f37a" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.250192 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.270414 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.289109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.310763 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.314349 4707 scope.go:117] "RemoveContainer" containerID="cc159cdf2eb7eebf0df0489fca214415a177fd59dd0017097f91d3bdaa7b7b49" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.319151 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.326706 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.356841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.369269 4707 scope.go:117] "RemoveContainer" containerID="6fb82acd6cb43c9c4e99edd1e2836e009b79b9244ac1f9188cfc0eec4e39f3eb" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.370075 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74486d786b-zsvr9"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.375949 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.379494 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.431286 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.437596 4707 scope.go:117] "RemoveContainer" containerID="ffe9e5471773aa44d24bb089adcede94c9a0a7c07e343fc71a5076c836a67866" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.459447 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.463886 4707 scope.go:117] "RemoveContainer" containerID="d3f4f9fc5bc4ad2e024ecd33cb433c74190a4054f4d5fc7992c2d5cfaceae52f" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.493565 4707 scope.go:117] "RemoveContainer" containerID="c64d5fccd6f9fe5912e6644f1e191a0c804f30b7b9271b2b9b4aaaefbe8774f5" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.498750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-public-tls-certs\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.498830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-erlang-cookie\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.498911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ql7s\" (UniqueName: \"kubernetes.io/projected/1876cc6c-ebc8-4c95-a937-27d2e01adf64-kube-api-access-8ql7s\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.498977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-tls\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d633804-5d9a-4792-b183-51e2c1a2ebe2-pod-info\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c2nn\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-kube-api-access-2c2nn\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-confd\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-fernet-keys\") pod 
\"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-scripts\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-config-data\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-internal-tls-certs\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-plugins\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d633804-5d9a-4792-b183-51e2c1a2ebe2-erlang-cookie-secret\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-plugins-conf\") pod \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-combined-ca-bundle\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-credential-keys\") pod \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\" (UID: \"1876cc6c-ebc8-4c95-a937-27d2e01adf64\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-server-conf\") pod 
\"5d633804-5d9a-4792-b183-51e2c1a2ebe2\" (UID: \"5d633804-5d9a-4792-b183-51e2c1a2ebe2\") " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.499901 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.500291 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.500699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.501626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.510441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d633804-5d9a-4792-b183-51e2c1a2ebe2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.510612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5d633804-5d9a-4792-b183-51e2c1a2ebe2-pod-info" (OuterVolumeSpecName: "pod-info") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.510717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.510880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1876cc6c-ebc8-4c95-a937-27d2e01adf64-kube-api-access-8ql7s" (OuterVolumeSpecName: "kube-api-access-8ql7s") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "kube-api-access-8ql7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.511130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-kube-api-access-2c2nn" (OuterVolumeSpecName: "kube-api-access-2c2nn") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "kube-api-access-2c2nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.513613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.514195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.516078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.516423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-scripts" (OuterVolumeSpecName: "scripts") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.522312 4707 scope.go:117] "RemoveContainer" containerID="4b218d98dcea319cdfaa8ddb7b590949e9f120aa66afb2e52a36db4e18a4d975" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.527249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data" (OuterVolumeSpecName: "config-data") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.530625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-config-data" (OuterVolumeSpecName: "config-data") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.537751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.539350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-server-conf" (OuterVolumeSpecName: "server-conf") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.540606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.548046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1876cc6c-ebc8-4c95-a937-27d2e01adf64" (UID: "1876cc6c-ebc8-4c95-a937-27d2e01adf64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.576714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5d633804-5d9a-4792-b183-51e2c1a2ebe2" (UID: "5d633804-5d9a-4792-b183-51e2c1a2ebe2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615515 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c2nn\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-kube-api-access-2c2nn\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615550 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615561 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615573 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615582 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615591 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615602 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615612 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615621 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d633804-5d9a-4792-b183-51e2c1a2ebe2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615629 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615638 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615648 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615657 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d633804-5d9a-4792-b183-51e2c1a2ebe2-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc 
kubenswrapper[4707]: I0121 16:13:46.615667 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1876cc6c-ebc8-4c95-a937-27d2e01adf64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615676 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ql7s\" (UniqueName: \"kubernetes.io/projected/1876cc6c-ebc8-4c95-a937-27d2e01adf64-kube-api-access-8ql7s\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615687 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d633804-5d9a-4792-b183-51e2c1a2ebe2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615721 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.615731 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d633804-5d9a-4792-b183-51e2c1a2ebe2-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.625488 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 16:13:46 crc kubenswrapper[4707]: I0121 16:13:46.717541 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:46 crc kubenswrapper[4707]: E0121 16:13:46.921185 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:13:46 crc kubenswrapper[4707]: E0121 16:13:46.921281 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data podName:0e42cb4d-8129-4286-a6c7-9f8e94524e91 nodeName:}" failed. No retries permitted until 2026-01-21 16:13:54.921258109 +0000 UTC m=+4332.102774331 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data") pod "rabbitmq-server-0" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91") : configmap "rabbitmq-config-data" not found Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.160183 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerID="be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49" exitCode=0 Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.160273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"5d633804-5d9a-4792-b183-51e2c1a2ebe2","Type":"ContainerDied","Data":"be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49"} Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.160315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"5d633804-5d9a-4792-b183-51e2c1a2ebe2","Type":"ContainerDied","Data":"b8e8702569b7b5ee9afd82303e64707367b8c9a5c8162bac1315dac272fc8bdd"} Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.160343 4707 scope.go:117] "RemoveContainer" containerID="be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.160510 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.190093 4707 scope.go:117] "RemoveContainer" containerID="99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.191647 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerID="43c33ad38b01642b5b25214359a49931c587fc6383778e43bacab56ce73f1322" exitCode=0 Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.197649 4707 generic.go:334] "Generic (PLEG): container finished" podID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" containerID="a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63" exitCode=0 Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.197794 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.201399 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" path="/var/lib/kubelet/pods/1162dc59-3a54-42e6-84c5-466eafb062b4/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.202192 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" path="/var/lib/kubelet/pods/2f6304c7-b127-4fcb-a74b-b5009a2b1a50/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.202619 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ec36fa-88c2-46f1-affb-ba010670839c" path="/var/lib/kubelet/pods/42ec36fa-88c2-46f1-affb-ba010670839c/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.203033 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" path="/var/lib/kubelet/pods/559741cf-87c9-47d4-98a3-6b02ffb4610c/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.206650 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594873ed-1daa-422c-853c-65fab465dbdb" path="/var/lib/kubelet/pods/594873ed-1daa-422c-853c-65fab465dbdb/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.210199 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" path="/var/lib/kubelet/pods/647f33a8-e7c7-4941-9c0d-f9c5e4b07409/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.211424 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" path="/var/lib/kubelet/pods/8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.212262 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" path="/var/lib/kubelet/pods/a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.213416 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" path="/var/lib/kubelet/pods/b115ad1a-72c2-4d2e-8669-4d125acf6751/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.214028 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" path="/var/lib/kubelet/pods/c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.214844 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" path="/var/lib/kubelet/pods/dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.215955 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" path="/var/lib/kubelet/pods/f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b/volumes" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.216892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.216920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.216960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"0e42cb4d-8129-4286-a6c7-9f8e94524e91","Type":"ContainerDied","Data":"43c33ad38b01642b5b25214359a49931c587fc6383778e43bacab56ce73f1322"} Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.217028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" event={"ID":"1876cc6c-ebc8-4c95-a937-27d2e01adf64","Type":"ContainerDied","Data":"a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63"} Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.217047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" event={"ID":"1876cc6c-ebc8-4c95-a937-27d2e01adf64","Type":"ContainerDied","Data":"56120464f973bba17190ae40e00a75bd42ca12f05b0c0ffccbe2dcdb9c13aa9d"} Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.226680 4707 scope.go:117] "RemoveContainer" containerID="be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49" Jan 21 16:13:47 crc kubenswrapper[4707]: E0121 16:13:47.228644 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49\": container with ID starting with be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49 not found: ID does not exist" containerID="be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.228982 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49"} err="failed to get container status \"be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49\": rpc error: code = NotFound desc = could not find container \"be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49\": container with ID starting with be152ebf0e70c87d06ee1eb00312f31ecab5e79bee118cfe694def499de48d49 not found: ID does not exist" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.229127 4707 scope.go:117] "RemoveContainer" containerID="99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86" Jan 21 16:13:47 crc kubenswrapper[4707]: E0121 16:13:47.229564 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86\": container with ID starting with 99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86 not found: ID does not exist" containerID="99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.229613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86"} err="failed to get container status \"99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86\": rpc error: code = NotFound desc = could not find container \"99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86\": container with ID starting with 99991e5c9c10a62f501596712500ca90b215ca05f55db628f57fd38432effd86 not found: ID does not exist" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.229651 4707 scope.go:117] "RemoveContainer" containerID="a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.259205 
4707 scope.go:117] "RemoveContainer" containerID="a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63" Jan 21 16:13:47 crc kubenswrapper[4707]: E0121 16:13:47.260033 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63\": container with ID starting with a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63 not found: ID does not exist" containerID="a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.260067 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63"} err="failed to get container status \"a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63\": rpc error: code = NotFound desc = could not find container \"a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63\": container with ID starting with a5b1c239562bbeb049c0e1a336d6046a1b56135174d2aede3c37cc0a3e84ae63 not found: ID does not exist" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.299489 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjhwt\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-kube-api-access-rjhwt\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e42cb4d-8129-4286-a6c7-9f8e94524e91-pod-info\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-erlang-cookie\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-server-conf\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e42cb4d-8129-4286-a6c7-9f8e94524e91-erlang-cookie-secret\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329628 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-tls\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-confd\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-plugins-conf\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.329761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-plugins\") pod \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\" (UID: \"0e42cb4d-8129-4286-a6c7-9f8e94524e91\") " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.335334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-kube-api-access-rjhwt" (OuterVolumeSpecName: "kube-api-access-rjhwt") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "kube-api-access-rjhwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.335451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.335698 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0e42cb4d-8129-4286-a6c7-9f8e94524e91-pod-info" (OuterVolumeSpecName: "pod-info") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.337052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.338191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.340496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.342243 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e42cb4d-8129-4286-a6c7-9f8e94524e91-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.342998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.350019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data" (OuterVolumeSpecName: "config-data") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.363430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-server-conf" (OuterVolumeSpecName: "server-conf") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.391037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0e42cb4d-8129-4286-a6c7-9f8e94524e91" (UID: "0e42cb4d-8129-4286-a6c7-9f8e94524e91"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432201 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432236 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432252 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e42cb4d-8129-4286-a6c7-9f8e94524e91-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432291 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432302 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432313 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432324 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432352 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e42cb4d-8129-4286-a6c7-9f8e94524e91-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432361 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e42cb4d-8129-4286-a6c7-9f8e94524e91-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432374 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjhwt\" (UniqueName: \"kubernetes.io/projected/0e42cb4d-8129-4286-a6c7-9f8e94524e91-kube-api-access-rjhwt\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.432386 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e42cb4d-8129-4286-a6c7-9f8e94524e91-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.442966 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 16:13:47 crc kubenswrapper[4707]: I0121 16:13:47.534027 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.221348 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"0e42cb4d-8129-4286-a6c7-9f8e94524e91","Type":"ContainerDied","Data":"92d3f4d21b24762879c93f8be5164eac2cb7f4afb5a767a7092ece30e10139c7"} Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.221391 4707 scope.go:117] "RemoveContainer" containerID="43c33ad38b01642b5b25214359a49931c587fc6383778e43bacab56ce73f1322" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.221393 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.230577 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerID="c775c3ced09d3af1afb4330ece26d7b04215632d1b6e25a92627d6902a9f8f3e" exitCode=0 Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.230644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerDied","Data":"c775c3ced09d3af1afb4330ece26d7b04215632d1b6e25a92627d6902a9f8f3e"} Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.239267 4707 generic.go:334] "Generic (PLEG): container finished" podID="f927e632-11ba-4d56-993a-f89357f885a4" containerID="f9eec9f2777bc56f0de0809e130e7eb56b4ccf7f6d582d5054978a6f37712858" exitCode=0 Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.239294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" event={"ID":"f927e632-11ba-4d56-993a-f89357f885a4","Type":"ContainerDied","Data":"f9eec9f2777bc56f0de0809e130e7eb56b4ccf7f6d582d5054978a6f37712858"} Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.240920 4707 scope.go:117] "RemoveContainer" containerID="f5188077018b4348f2419002120e3a5b4d29c9ab4682810c48d0553e5ac5fccf" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.251918 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.257529 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.401945 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.423561 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.446905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-ceilometer-tls-certs\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.446966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data-custom\") pod \"f927e632-11ba-4d56-993a-f89357f885a4\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-combined-ca-bundle\") pod \"f927e632-11ba-4d56-993a-f89357f885a4\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcbvh\" (UniqueName: \"kubernetes.io/projected/f6195e76-f5d4-4062-99a0-1597e6e85ba0-kube-api-access-xcbvh\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data\") pod \"f927e632-11ba-4d56-993a-f89357f885a4\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-scripts\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44zg5\" (UniqueName: \"kubernetes.io/projected/f927e632-11ba-4d56-993a-f89357f885a4-kube-api-access-44zg5\") pod \"f927e632-11ba-4d56-993a-f89357f885a4\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-log-httpd\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f927e632-11ba-4d56-993a-f89357f885a4-logs\") pod \"f927e632-11ba-4d56-993a-f89357f885a4\" (UID: \"f927e632-11ba-4d56-993a-f89357f885a4\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-sg-core-conf-yaml\") pod 
\"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447355 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-combined-ca-bundle\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-config-data\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-run-httpd\") pod \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\" (UID: \"f6195e76-f5d4-4062-99a0-1597e6e85ba0\") " Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.447748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.448072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.448213 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.448227 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6195e76-f5d4-4062-99a0-1597e6e85ba0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.448521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f927e632-11ba-4d56-993a-f89357f885a4-logs" (OuterVolumeSpecName: "logs") pod "f927e632-11ba-4d56-993a-f89357f885a4" (UID: "f927e632-11ba-4d56-993a-f89357f885a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.459409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-scripts" (OuterVolumeSpecName: "scripts") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.459475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f927e632-11ba-4d56-993a-f89357f885a4-kube-api-access-44zg5" (OuterVolumeSpecName: "kube-api-access-44zg5") pod "f927e632-11ba-4d56-993a-f89357f885a4" (UID: "f927e632-11ba-4d56-993a-f89357f885a4"). InnerVolumeSpecName "kube-api-access-44zg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.460969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f927e632-11ba-4d56-993a-f89357f885a4" (UID: "f927e632-11ba-4d56-993a-f89357f885a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.462831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6195e76-f5d4-4062-99a0-1597e6e85ba0-kube-api-access-xcbvh" (OuterVolumeSpecName: "kube-api-access-xcbvh") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "kube-api-access-xcbvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.471410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.472198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f927e632-11ba-4d56-993a-f89357f885a4" (UID: "f927e632-11ba-4d56-993a-f89357f885a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.482918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.486068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data" (OuterVolumeSpecName: "config-data") pod "f927e632-11ba-4d56-993a-f89357f885a4" (UID: "f927e632-11ba-4d56-993a-f89357f885a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.496188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.513119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-config-data" (OuterVolumeSpecName: "config-data") pod "f6195e76-f5d4-4062-99a0-1597e6e85ba0" (UID: "f6195e76-f5d4-4062-99a0-1597e6e85ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550684 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550727 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44zg5\" (UniqueName: \"kubernetes.io/projected/f927e632-11ba-4d56-993a-f89357f885a4-kube-api-access-44zg5\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550743 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f927e632-11ba-4d56-993a-f89357f885a4-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550759 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550775 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550785 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550796 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6195e76-f5d4-4062-99a0-1597e6e85ba0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550821 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550831 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550842 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcbvh\" (UniqueName: \"kubernetes.io/projected/f6195e76-f5d4-4062-99a0-1597e6e85ba0-kube-api-access-xcbvh\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.550851 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f927e632-11ba-4d56-993a-f89357f885a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:48 crc kubenswrapper[4707]: I0121 16:13:48.831216 4707 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.254:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.192324 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" path="/var/lib/kubelet/pods/0e42cb4d-8129-4286-a6c7-9f8e94524e91/volumes" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.193068 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" path="/var/lib/kubelet/pods/5d633804-5d9a-4792-b183-51e2c1a2ebe2/volumes" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.249773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" event={"ID":"f927e632-11ba-4d56-993a-f89357f885a4","Type":"ContainerDied","Data":"e5dbc21ea96fe8af7c22e5a8b83b9ad5a23d88f49da9a7633ac37b17857d42d8"} Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.249851 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.249878 4707 scope.go:117] "RemoveContainer" containerID="f9eec9f2777bc56f0de0809e130e7eb56b4ccf7f6d582d5054978a6f37712858" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.257551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f6195e76-f5d4-4062-99a0-1597e6e85ba0","Type":"ContainerDied","Data":"7fa872b809cb716fcbd369a8602cc94d33fd5cd7c7a3eb40e9470678df9b8f0d"} Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.257702 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.273941 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9"] Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.279333 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-784ddd84c9-bhvh9"] Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.281367 4707 scope.go:117] "RemoveContainer" containerID="55a8ca4d4d4f845958bddd165606fd0a9aa08d68148d5ce1f6d84578e370b654" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.293788 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.296930 4707 scope.go:117] "RemoveContainer" containerID="274d838448ffed3d5e993bce9fc0532d447a3d723e068308801479939c591e55" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.298717 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.310423 4707 scope.go:117] "RemoveContainer" containerID="778167efb85219d760ca2bee2211cb50dd31b9555451c93aea631b587f5c4e24" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.330713 4707 scope.go:117] "RemoveContainer" containerID="c775c3ced09d3af1afb4330ece26d7b04215632d1b6e25a92627d6902a9f8f3e" Jan 21 16:13:49 crc kubenswrapper[4707]: I0121 16:13:49.343209 4707 scope.go:117] "RemoveContainer" containerID="669bb6d5318148cc1ac21fae845cda6d0b6d3e81db0f375ed6ebf15b8e802dd4" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.303085 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.347446 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8"] Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.347745 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerName="dnsmasq-dns" containerID="cri-o://41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc" gracePeriod=10 Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.756334 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.800767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dns-swift-storage-0\") pod \"9ddec444-846e-4b2e-a046-4a0eee45b90f\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.800948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-config\") pod \"9ddec444-846e-4b2e-a046-4a0eee45b90f\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.801022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dnsmasq-svc\") pod \"9ddec444-846e-4b2e-a046-4a0eee45b90f\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.801082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pcj6\" (UniqueName: \"kubernetes.io/projected/9ddec444-846e-4b2e-a046-4a0eee45b90f-kube-api-access-9pcj6\") pod \"9ddec444-846e-4b2e-a046-4a0eee45b90f\" (UID: \"9ddec444-846e-4b2e-a046-4a0eee45b90f\") " Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.810440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ddec444-846e-4b2e-a046-4a0eee45b90f-kube-api-access-9pcj6" (OuterVolumeSpecName: "kube-api-access-9pcj6") pod "9ddec444-846e-4b2e-a046-4a0eee45b90f" (UID: "9ddec444-846e-4b2e-a046-4a0eee45b90f"). InnerVolumeSpecName "kube-api-access-9pcj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.854430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-config" (OuterVolumeSpecName: "config") pod "9ddec444-846e-4b2e-a046-4a0eee45b90f" (UID: "9ddec444-846e-4b2e-a046-4a0eee45b90f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.858269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ddec444-846e-4b2e-a046-4a0eee45b90f" (UID: "9ddec444-846e-4b2e-a046-4a0eee45b90f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.875170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "9ddec444-846e-4b2e-a046-4a0eee45b90f" (UID: "9ddec444-846e-4b2e-a046-4a0eee45b90f"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.902890 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.902922 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pcj6\" (UniqueName: \"kubernetes.io/projected/9ddec444-846e-4b2e-a046-4a0eee45b90f-kube-api-access-9pcj6\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.902933 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:50 crc kubenswrapper[4707]: I0121 16:13:50.902942 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddec444-846e-4b2e-a046-4a0eee45b90f-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.191063 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" path="/var/lib/kubelet/pods/f6195e76-f5d4-4062-99a0-1597e6e85ba0/volumes" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.191964 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f927e632-11ba-4d56-993a-f89357f885a4" path="/var/lib/kubelet/pods/f927e632-11ba-4d56-993a-f89357f885a4/volumes" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.278991 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerID="41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc" exitCode=0 Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.279024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" event={"ID":"9ddec444-846e-4b2e-a046-4a0eee45b90f","Type":"ContainerDied","Data":"41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc"} Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.279046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" event={"ID":"9ddec444-846e-4b2e-a046-4a0eee45b90f","Type":"ContainerDied","Data":"f8bb4681e87046a5343de41aa872ccd0791c624406be49afd408961a807763da"} Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.279062 4707 scope.go:117] "RemoveContainer" containerID="41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.279132 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.293843 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8"] Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.297770 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84cf554b49-q4ff8"] Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.306111 4707 scope.go:117] "RemoveContainer" containerID="32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.325563 4707 scope.go:117] "RemoveContainer" containerID="41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc" Jan 21 16:13:51 crc kubenswrapper[4707]: E0121 16:13:51.326003 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc\": container with ID starting with 41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc not found: ID does not exist" containerID="41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.326044 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc"} err="failed to get container status \"41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc\": rpc error: code = NotFound desc = could not find container \"41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc\": container with ID starting with 41e7db4b33ab5020a8343493f01bbe310688cec634f3cdfd0fe92859012506cc not found: ID does not exist" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.326062 4707 scope.go:117] "RemoveContainer" containerID="32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3" Jan 21 16:13:51 crc kubenswrapper[4707]: E0121 16:13:51.326424 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3\": container with ID starting with 32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3 not found: ID does not exist" containerID="32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3" Jan 21 16:13:51 crc kubenswrapper[4707]: I0121 16:13:51.326451 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3"} err="failed to get container status \"32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3\": rpc error: code = NotFound desc = could not find container \"32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3\": container with ID starting with 32a09da87fa72db158007f2e5bab230e0608abf8afa091eb1b6e03b705e1fcf3 not found: ID does not exist" Jan 21 16:13:53 crc kubenswrapper[4707]: I0121 16:13:53.192689 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" path="/var/lib/kubelet/pods/9ddec444-846e-4b2e-a046-4a0eee45b90f/volumes" Jan 21 16:13:53 crc kubenswrapper[4707]: E0121 16:13:53.790008 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice/crio-770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa\": RecentStats: unable to find data in memory cache]" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.236139 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.263626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-internal-tls-certs\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.263720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-config\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.263940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-httpd-config\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.264071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-ovndb-tls-certs\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.264128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-public-tls-certs\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.264188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64tc9\" (UniqueName: \"kubernetes.io/projected/317a81bb-283b-43fe-88cb-ad8fadf45471-kube-api-access-64tc9\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.264239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-combined-ca-bundle\") pod \"317a81bb-283b-43fe-88cb-ad8fadf45471\" (UID: \"317a81bb-283b-43fe-88cb-ad8fadf45471\") " Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.271125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.272740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317a81bb-283b-43fe-88cb-ad8fadf45471-kube-api-access-64tc9" (OuterVolumeSpecName: "kube-api-access-64tc9") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "kube-api-access-64tc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.297071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.298348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-config" (OuterVolumeSpecName: "config") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.300667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.304559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.318273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "317a81bb-283b-43fe-88cb-ad8fadf45471" (UID: "317a81bb-283b-43fe-88cb-ad8fadf45471"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368045 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368084 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368097 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368113 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64tc9\" (UniqueName: \"kubernetes.io/projected/317a81bb-283b-43fe-88cb-ad8fadf45471-kube-api-access-64tc9\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368125 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368135 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.368147 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/317a81bb-283b-43fe-88cb-ad8fadf45471-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.377101 4707 generic.go:334] "Generic (PLEG): container finished" podID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerID="da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd" exitCode=0 Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.377140 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.377144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" event={"ID":"317a81bb-283b-43fe-88cb-ad8fadf45471","Type":"ContainerDied","Data":"da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd"} Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.377223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp" event={"ID":"317a81bb-283b-43fe-88cb-ad8fadf45471","Type":"ContainerDied","Data":"ed02f5813ce6ae9bea3dafe0363b9f0b1730c57e577d54c423ea7f70ac3cd7dc"} Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.377241 4707 scope.go:117] "RemoveContainer" containerID="555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.397334 4707 scope.go:117] "RemoveContainer" containerID="da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.415555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp"] Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.424439 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-58d6d5d4f4-ldvtp"] Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.426833 4707 scope.go:117] "RemoveContainer" containerID="555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4" Jan 21 16:14:02 crc kubenswrapper[4707]: E0121 16:14:02.427235 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4\": container with ID starting with 555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4 not found: ID does not exist" containerID="555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.427274 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4"} err="failed to get container status \"555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4\": rpc error: code = NotFound desc = could not find container \"555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4\": container with ID starting with 555e9b8cd7547f8e53504fc734a56c0ee7750db4fa9bb374babefcddd25027a4 not found: ID does not exist" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.427297 4707 scope.go:117] "RemoveContainer" containerID="da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd" Jan 21 16:14:02 crc kubenswrapper[4707]: E0121 16:14:02.427572 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd\": container with ID starting with da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd not found: ID does not exist" containerID="da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd" Jan 21 16:14:02 crc kubenswrapper[4707]: I0121 16:14:02.427596 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd"} err="failed to get 
container status \"da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd\": rpc error: code = NotFound desc = could not find container \"da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd\": container with ID starting with da3ec1efc69d7e34a136155776e196f8cbe2808a22503e7738b077aedf1179fd not found: ID does not exist" Jan 21 16:14:03 crc kubenswrapper[4707]: I0121 16:14:03.188689 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" path="/var/lib/kubelet/pods/317a81bb-283b-43fe-88cb-ad8fadf45471/volumes" Jan 21 16:14:03 crc kubenswrapper[4707]: E0121 16:14:03.935517 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice/crio-770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa\": RecentStats: unable to find data in memory cache]" Jan 21 16:14:09 crc kubenswrapper[4707]: I0121 16:14:09.945783 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:14:09 crc kubenswrapper[4707]: I0121 16:14:09.946367 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:14:09 crc kubenswrapper[4707]: I0121 16:14:09.946412 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:14:09 crc kubenswrapper[4707]: I0121 16:14:09.947316 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"973e8f3f79963508f548e1f1508104218595db51536e1edaf4cd1d819983387a"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:14:09 crc kubenswrapper[4707]: I0121 16:14:09.947366 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://973e8f3f79963508f548e1f1508104218595db51536e1edaf4cd1d819983387a" gracePeriod=600 Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.101029 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.172736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") pod \"f49362da-7cec-499b-be08-6930c4f470e6\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.172958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f49362da-7cec-499b-be08-6930c4f470e6\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.173097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-cache\") pod \"f49362da-7cec-499b-be08-6930c4f470e6\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.173454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-cache" (OuterVolumeSpecName: "cache") pod "f49362da-7cec-499b-be08-6930c4f470e6" (UID: "f49362da-7cec-499b-be08-6930c4f470e6"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.173564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-lock\") pod \"f49362da-7cec-499b-be08-6930c4f470e6\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.173835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-lock" (OuterVolumeSpecName: "lock") pod "f49362da-7cec-499b-be08-6930c4f470e6" (UID: "f49362da-7cec-499b-be08-6930c4f470e6"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.173600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzcjc\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-kube-api-access-kzcjc\") pod \"f49362da-7cec-499b-be08-6930c4f470e6\" (UID: \"f49362da-7cec-499b-be08-6930c4f470e6\") " Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.174244 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-lock\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.174268 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f49362da-7cec-499b-be08-6930c4f470e6-cache\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.177427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f49362da-7cec-499b-be08-6930c4f470e6" (UID: "f49362da-7cec-499b-be08-6930c4f470e6"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.177581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "f49362da-7cec-499b-be08-6930c4f470e6" (UID: "f49362da-7cec-499b-be08-6930c4f470e6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.177778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-kube-api-access-kzcjc" (OuterVolumeSpecName: "kube-api-access-kzcjc") pod "f49362da-7cec-499b-be08-6930c4f470e6" (UID: "f49362da-7cec-499b-be08-6930c4f470e6"). InnerVolumeSpecName "kube-api-access-kzcjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.275427 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.275472 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.275485 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzcjc\" (UniqueName: \"kubernetes.io/projected/f49362da-7cec-499b-be08-6930c4f470e6-kube-api-access-kzcjc\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.287095 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.377608 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.439504 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="973e8f3f79963508f548e1f1508104218595db51536e1edaf4cd1d819983387a" exitCode=0 Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.439539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"973e8f3f79963508f548e1f1508104218595db51536e1edaf4cd1d819983387a"} Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.439612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466"} Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.439635 4707 scope.go:117] "RemoveContainer" containerID="a3626861978c47e9a3123211f1aa66f6b8b95a3dce2699250bd878ddb6b0b391" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.458775 4707 generic.go:334] "Generic (PLEG): container finished" podID="f49362da-7cec-499b-be08-6930c4f470e6" containerID="f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6" 
exitCode=137 Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.458864 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.458887 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6"} Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.459127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f49362da-7cec-499b-be08-6930c4f470e6","Type":"ContainerDied","Data":"64bca4d7bc91d4ddb1d6623f99b462d34b91c22910298ad1e7fc8c16003a20f2"} Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.459489 4707 scope.go:117] "RemoveContainer" containerID="f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.476311 4707 scope.go:117] "RemoveContainer" containerID="8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.487561 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.492540 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.495831 4707 scope.go:117] "RemoveContainer" containerID="ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.514773 4707 scope.go:117] "RemoveContainer" containerID="3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.529663 4707 scope.go:117] "RemoveContainer" containerID="0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.543529 4707 scope.go:117] "RemoveContainer" containerID="41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.563552 4707 scope.go:117] "RemoveContainer" containerID="49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.578783 4707 scope.go:117] "RemoveContainer" containerID="47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.591746 4707 scope.go:117] "RemoveContainer" containerID="9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.611228 4707 scope.go:117] "RemoveContainer" containerID="78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.625623 4707 scope.go:117] "RemoveContainer" containerID="ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.648947 4707 scope.go:117] "RemoveContainer" containerID="8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.668057 4707 scope.go:117] "RemoveContainer" containerID="6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.685869 4707 
scope.go:117] "RemoveContainer" containerID="1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.703946 4707 scope.go:117] "RemoveContainer" containerID="59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.721039 4707 scope.go:117] "RemoveContainer" containerID="f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.721436 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6\": container with ID starting with f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6 not found: ID does not exist" containerID="f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.721494 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6"} err="failed to get container status \"f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6\": rpc error: code = NotFound desc = could not find container \"f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6\": container with ID starting with f2bb98c16a1c33840b2310b57d7feca4070a90875ee1f2081bec9fb8ab4491f6 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.721517 4707 scope.go:117] "RemoveContainer" containerID="8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.721921 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b\": container with ID starting with 8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b not found: ID does not exist" containerID="8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.721948 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b"} err="failed to get container status \"8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b\": rpc error: code = NotFound desc = could not find container \"8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b\": container with ID starting with 8ecf9cc189600f9538dad2d6c52946394daba83fdd7e7ce4177ce90fcc87190b not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.721971 4707 scope.go:117] "RemoveContainer" containerID="ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.722297 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4\": container with ID starting with ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4 not found: ID does not exist" containerID="ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.722325 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4"} err="failed to get container status \"ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4\": rpc error: code = NotFound desc = could not find container \"ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4\": container with ID starting with ff86c3b302cd35ec59f0eaf2bc231e40df1f493c4cec8c61be3a43c590609dd4 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.722340 4707 scope.go:117] "RemoveContainer" containerID="3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.722737 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25\": container with ID starting with 3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25 not found: ID does not exist" containerID="3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.722758 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25"} err="failed to get container status \"3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25\": rpc error: code = NotFound desc = could not find container \"3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25\": container with ID starting with 3f042e89d2e05eb8d24314e58acb432294f9d2b7d048362a6c215b1280aacf25 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.722770 4707 scope.go:117] "RemoveContainer" containerID="0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.723140 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf\": container with ID starting with 0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf not found: ID does not exist" containerID="0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.723205 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf"} err="failed to get container status \"0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf\": rpc error: code = NotFound desc = could not find container \"0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf\": container with ID starting with 0a98787c3019b4f547453c8615a95070545370aeda2f5852d3469101456ecbbf not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.723220 4707 scope.go:117] "RemoveContainer" containerID="41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.723537 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446\": container with ID starting with 41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446 not found: ID does not exist" 
containerID="41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.723562 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446"} err="failed to get container status \"41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446\": rpc error: code = NotFound desc = could not find container \"41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446\": container with ID starting with 41bb1b2c9ea8560e295c73eda469288e98c6d673008d39882d232ae8c7043446 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.723575 4707 scope.go:117] "RemoveContainer" containerID="49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.724001 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620\": container with ID starting with 49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620 not found: ID does not exist" containerID="49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.724056 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620"} err="failed to get container status \"49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620\": rpc error: code = NotFound desc = could not find container \"49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620\": container with ID starting with 49672511765e7fe708feab3be840d58c29d43fe34f966c5d58e9de28e7e64620 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.724070 4707 scope.go:117] "RemoveContainer" containerID="47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.724376 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261\": container with ID starting with 47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261 not found: ID does not exist" containerID="47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.724421 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261"} err="failed to get container status \"47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261\": rpc error: code = NotFound desc = could not find container \"47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261\": container with ID starting with 47da4b4d0e9a676b1ae9f96d61dc98f2a156b86fe2002e65b2fa1b610897c261 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.724435 4707 scope.go:117] "RemoveContainer" containerID="9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.724725 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a\": container with ID starting with 9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a not found: ID does not exist" containerID="9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.724746 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a"} err="failed to get container status \"9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a\": rpc error: code = NotFound desc = could not find container \"9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a\": container with ID starting with 9437c147b4555b3a25d0d4dc4ccd0b8eb8e79df41f6bb689bf22cfd8223c251a not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.724761 4707 scope.go:117] "RemoveContainer" containerID="78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.725251 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf\": container with ID starting with 78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf not found: ID does not exist" containerID="78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.725290 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf"} err="failed to get container status \"78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf\": rpc error: code = NotFound desc = could not find container \"78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf\": container with ID starting with 78f8357dc369a64c53ca3abf14700a9d110f8be7e695c2c9a3069b9dfef9f6cf not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.725306 4707 scope.go:117] "RemoveContainer" containerID="ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.725577 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e\": container with ID starting with ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e not found: ID does not exist" containerID="ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.725600 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e"} err="failed to get container status \"ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e\": rpc error: code = NotFound desc = could not find container \"ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e\": container with ID starting with ca401aa76110b27bad4a5bff1fdf2164853b37e31ed3eab037f0a53a0810f23e not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.725612 4707 scope.go:117] "RemoveContainer" containerID="8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794" Jan 21 16:14:10 crc 
kubenswrapper[4707]: E0121 16:14:10.725850 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794\": container with ID starting with 8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794 not found: ID does not exist" containerID="8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.725870 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794"} err="failed to get container status \"8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794\": rpc error: code = NotFound desc = could not find container \"8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794\": container with ID starting with 8ad0aac79b61938b28cf55a0c9ab1508181bbe3da17741e1a01df73a24c7f794 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.725882 4707 scope.go:117] "RemoveContainer" containerID="6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.726090 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0\": container with ID starting with 6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0 not found: ID does not exist" containerID="6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.726110 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0"} err="failed to get container status \"6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0\": rpc error: code = NotFound desc = could not find container \"6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0\": container with ID starting with 6222d7129f474db6e9c3929f968691bac9db117630068cac80226e583cbbb8b0 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.726124 4707 scope.go:117] "RemoveContainer" containerID="1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.726580 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479\": container with ID starting with 1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479 not found: ID does not exist" containerID="1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.726605 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479"} err="failed to get container status \"1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479\": rpc error: code = NotFound desc = could not find container \"1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479\": container with ID starting with 1e205397568e9aea30406fd3d2e8416f2f4c6d57bc5b5fe28b89dcab9c6dd479 not found: ID does not exist" Jan 21 16:14:10 crc kubenswrapper[4707]: 
I0121 16:14:10.726619 4707 scope.go:117] "RemoveContainer" containerID="59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60" Jan 21 16:14:10 crc kubenswrapper[4707]: E0121 16:14:10.726929 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60\": container with ID starting with 59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60 not found: ID does not exist" containerID="59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60" Jan 21 16:14:10 crc kubenswrapper[4707]: I0121 16:14:10.726954 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60"} err="failed to get container status \"59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60\": rpc error: code = NotFound desc = could not find container \"59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60\": container with ID starting with 59b046d90b8da476e0a74d35a6c5d7b9de7d63c7a98b954c6596e97b0ef67c60 not found: ID does not exist" Jan 21 16:14:11 crc kubenswrapper[4707]: I0121 16:14:11.191341 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49362da-7cec-499b-be08-6930c4f470e6" path="/var/lib/kubelet/pods/f49362da-7cec-499b-be08-6930c4f470e6/volumes" Jan 21 16:14:14 crc kubenswrapper[4707]: E0121 16:14:14.082671 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice/crio-770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice\": RecentStats: unable to find data in memory cache]" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.239341 4707 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1876cc6c-ebc8-4c95-a937-27d2e01adf64"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1876cc6c-ebc8-4c95-a937-27d2e01adf64] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1876cc6c_ebc8_4c95_a937_27d2e01adf64.slice" Jan 21 16:14:17 crc kubenswrapper[4707]: E0121 16:14:17.239591 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod1876cc6c-ebc8-4c95-a937-27d2e01adf64] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod1876cc6c-ebc8-4c95-a937-27d2e01adf64] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1876cc6c_ebc8_4c95_a937_27d2e01adf64.slice" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" podUID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.543777 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6d55599ff6-55gnd" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.557991 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6d55599ff6-55gnd"] Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.561535 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6d55599ff6-55gnd"] Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.605120 4707 scope.go:117] "RemoveContainer" containerID="71f850a8352ca042d8a8b332e9e647c7e46ebf9f2515e3ed09d30a39424ae892" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.646154 4707 scope.go:117] "RemoveContainer" containerID="8a63ed3c9b50f6cc80dfcc95323d2f7c984bdc6c47f6e52d33e7abe224e613c3" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.661706 4707 scope.go:117] "RemoveContainer" containerID="6e5c1a5371a24097ab7530f393c17000f9250266e47dc1732fbe3880f11b7c8f" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.680366 4707 scope.go:117] "RemoveContainer" containerID="cd4925de544fedf8e5e8b19e458b6dd165d182b18e2bd7e6a19f58b6fb856b84" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.700131 4707 scope.go:117] "RemoveContainer" containerID="2d8a72a6540e65f712baf7c89e1c838c5fd6f3327fb32d1684d970e96405a48a" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.719422 4707 scope.go:117] "RemoveContainer" containerID="80e9733e9d9a5164525e952079104194c595f92d328fae8c657f54b22e91c862" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.755633 4707 scope.go:117] "RemoveContainer" containerID="fa05439844a1c6e8d944655fe665d7c8f92a474ed2877ea8b4617cf47356577b" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.770579 4707 scope.go:117] "RemoveContainer" containerID="e6c5b3ded6d120827ac01f6ac458fb4657209930d6d59a84ed3c4e36e0aabc76" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.784746 4707 scope.go:117] "RemoveContainer" containerID="4b897ebc0ad2bb55b6bc3e5391a8b08293a8e6797572f87736673c368f860c54" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.804205 4707 scope.go:117] "RemoveContainer" containerID="a005d0adcf1ca25f60db5126c522d38f536eb886e79a89b2052b955fee4ea881" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.843211 4707 scope.go:117] "RemoveContainer" containerID="d452bc6e221810ea066cd64f5b6bffe50921334f2dde676c274b42522b363fcf" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.876511 4707 scope.go:117] "RemoveContainer" containerID="b629d96d3befccaa520a351b2d69f22752fe28a8a8419fa92d63093c7d989cb2" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.923742 4707 scope.go:117] "RemoveContainer" containerID="08784ad7d03c8490ce19607b5c0a3e275886c4606da94d81de1a7da1fb4fa777" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.945754 4707 scope.go:117] "RemoveContainer" containerID="868cc8f8724909904912ff258b32f90becf5b16e946b2fdedb12537284739009" Jan 21 16:14:17 crc kubenswrapper[4707]: I0121 16:14:17.980515 4707 scope.go:117] "RemoveContainer" containerID="9dbde5c66335b6740d73be0a6801440b8c6adb6737788ea519f025ec1a4c33d4" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.006660 4707 scope.go:117] "RemoveContainer" containerID="d5feaa8ac1d4f32dd02f665fb416dd696dc70202a06de401c7aae4945d988bb3" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.047553 4707 scope.go:117] "RemoveContainer" containerID="09b6bb9a89e57fb4efbbe1768b6bbd8a7239804021d9c1e24c897539673129b4" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.066366 4707 scope.go:117] 
"RemoveContainer" containerID="f3ef8e10752a7b3f655f3ba2c159cbe3e42d3a2fec7a9a6ec78eb3474905e38b" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.079573 4707 scope.go:117] "RemoveContainer" containerID="5b2c98cc04fc5070f09e606ed3d0835a9cca88ad4946d72f7be2878e99f67f40" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.112555 4707 scope.go:117] "RemoveContainer" containerID="4d2e09304453da89fa1ebfc18a78fea8433713fb84782f539b40233e5b40f194" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.127098 4707 scope.go:117] "RemoveContainer" containerID="df63a0f96d2b4c2cd36be6f9bcc5c322311735a28d277daebc97d5ec5b9bfe6c" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.139479 4707 scope.go:117] "RemoveContainer" containerID="a2d2e548b2324c60f458f301629f67a22437c8af5ea4cd732d3d8c06e695a507" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.527436 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-lvs4w"] Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.531229 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-lvs4w"] Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-78np7"] Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634900 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="probe" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="probe" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634934 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634940 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-server" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634948 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634953 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-api" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634963 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="swift-recon-cron" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634968 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="swift-recon-cron" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634975 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="setup-container" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634980 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="setup-container" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634986 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.634992 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.634999 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="rabbitmq" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635004 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="rabbitmq" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635011 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635016 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635022 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635027 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635038 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635047 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635119 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635126 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635133 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635138 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635144 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="rsync" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="rsync" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 
16:14:18.635180 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-server" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635186 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="sg-core" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="sg-core" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635200 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635205 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635211 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635216 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635224 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635230 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635238 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635244 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635250 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635255 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635262 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635267 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635275 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerName="dnsmasq-dns" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635280 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerName="dnsmasq-dns" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635285 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" 
containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635290 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635295 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635300 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-api" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635308 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635313 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635323 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb46fb3-5bea-4b35-9e6c-6816415058af" containerName="memcached" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635328 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb46fb3-5bea-4b35-9e6c-6816415058af" containerName="memcached" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635334 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635340 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635346 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" containerName="keystone-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635351 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" containerName="keystone-api" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635360 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="setup-container" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635365 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="setup-container" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635375 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-metadata" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635380 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-metadata" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635387 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-server" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635400 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-updater" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635405 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-updater" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635418 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-api" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635424 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635429 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635440 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="ovn-northd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635452 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="ovn-northd" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635459 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635464 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635471 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-updater" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635476 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-updater" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635483 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635488 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635496 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerName="init" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635501 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerName="init" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635510 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="rabbitmq" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635515 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="rabbitmq" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635523 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635528 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-server" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635536 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635554 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635564 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635569 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635577 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerName="mysql-bootstrap" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635582 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerName="mysql-bootstrap" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635588 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-reaper" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635593 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-reaper" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635599 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerName="galera" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635604 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerName="galera" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635610 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635615 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635623 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-central-agent" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635628 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-central-agent" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635634 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635639 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635644 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="openstack-network-exporter" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635649 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="openstack-network-exporter" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635660 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635665 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635672 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635677 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635682 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635688 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-expirer" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635702 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-expirer" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635709 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635713 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635721 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-notification-agent" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635726 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-notification-agent" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635731 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635737 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" 
containerName="proxy-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635745 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635750 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635756 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="proxy-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635761 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="proxy-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: E0121 16:14:18.635769 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635774 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635925 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635940 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635954 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-expirer" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635962 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-updater" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635970 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635976 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635982 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635988 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-updater" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.635996 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e42cb4d-8129-4286-a6c7-9f8e94524e91" containerName="rabbitmq" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636005 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636014 4707 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9ddec444-846e-4b2e-a046-4a0eee45b90f" containerName="dnsmasq-dns" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636023 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636032 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" containerName="keystone-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636039 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-central-agent" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636044 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636052 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="ovn-northd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636059 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636066 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="swift-recon-cron" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636075 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636083 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636089 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="047c06ee-71d5-4bcf-83e0-dd8aa295cdb3" containerName="nova-cell1-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636097 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="ceilometer-notification-agent" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636107 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="sg-core" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636112 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="594873ed-1daa-422c-853c-65fab465dbdb" containerName="nova-api-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b115ad1a-72c2-4d2e-8669-4d125acf6751" containerName="barbican-keystone-listener-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636131 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e40397-47ac-49d6-9b01-50b995d290f7" containerName="glance-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636139 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="81457b6e-3af3-4da8-9f83-0e896795cb2c" containerName="proxy-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636148 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ca9dac-9e6a-4626-a8ff-111cfee2ef3b" containerName="galera" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636171 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636177 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636182 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636188 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb46fb3-5bea-4b35-9e6c-6816415058af" containerName="memcached" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636194 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636201 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636207 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636217 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3292fc-fbc8-481c-803a-3e32ff576d6d" containerName="placement-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636223 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-server" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636231 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636238 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d633804-5d9a-4792-b183-51e2c1a2ebe2" containerName="rabbitmq" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636250 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="account-reaper" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636256 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="rsync" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636264 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api-log" Jan 21 16:14:18 crc 
kubenswrapper[4707]: I0121 16:14:18.636273 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0dcc4b7-8d5f-47b6-9da1-d135be5db8f2" containerName="nova-metadata-metadata" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636282 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f927e632-11ba-4d56-993a-f89357f885a4" containerName="barbican-worker" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636288 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="probe" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636295 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="container-auditor" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636301 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e10ec4e-d4dd-412a-be77-d3d7d7ae27d8" containerName="openstack-network-exporter" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f6304c7-b127-4fcb-a74b-b5009a2b1a50" containerName="cinder-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636314 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49362da-7cec-499b-be08-6930c4f470e6" containerName="object-replicator" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636322 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6195e76-f5d4-4062-99a0-1597e6e85ba0" containerName="proxy-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636328 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b4f4cb-72f6-44e7-ad6b-07f25a3f08b0" containerName="barbican-api" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636335 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="559741cf-87c9-47d4-98a3-6b02ffb4610c" containerName="cinder-api-log" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636343 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1162dc59-3a54-42e6-84c5-466eafb062b4" containerName="glance-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636349 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="317a81bb-283b-43fe-88cb-ad8fadf45471" containerName="neutron-httpd" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636355 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.636724 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.638939 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.638947 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.639274 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.639565 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.641873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-78np7"] Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.690538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f7e2307d-fe01-4826-af55-eebb2c98de3b-crc-storage\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.690791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f7e2307d-fe01-4826-af55-eebb2c98de3b-node-mnt\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.690838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68dhl\" (UniqueName: \"kubernetes.io/projected/f7e2307d-fe01-4826-af55-eebb2c98de3b-kube-api-access-68dhl\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.791767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f7e2307d-fe01-4826-af55-eebb2c98de3b-crc-storage\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.791840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f7e2307d-fe01-4826-af55-eebb2c98de3b-node-mnt\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.791880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68dhl\" (UniqueName: \"kubernetes.io/projected/f7e2307d-fe01-4826-af55-eebb2c98de3b-kube-api-access-68dhl\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.792100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f7e2307d-fe01-4826-af55-eebb2c98de3b-node-mnt\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " 
pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.792458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f7e2307d-fe01-4826-af55-eebb2c98de3b-crc-storage\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.807063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68dhl\" (UniqueName: \"kubernetes.io/projected/f7e2307d-fe01-4826-af55-eebb2c98de3b-kube-api-access-68dhl\") pod \"crc-storage-crc-78np7\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:18 crc kubenswrapper[4707]: I0121 16:14:18.960962 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:19 crc kubenswrapper[4707]: I0121 16:14:19.190147 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1876cc6c-ebc8-4c95-a937-27d2e01adf64" path="/var/lib/kubelet/pods/1876cc6c-ebc8-4c95-a937-27d2e01adf64/volumes" Jan 21 16:14:19 crc kubenswrapper[4707]: I0121 16:14:19.190981 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c287a9f3-dfaf-46b1-a73b-b1d66c346648" path="/var/lib/kubelet/pods/c287a9f3-dfaf-46b1-a73b-b1d66c346648/volumes" Jan 21 16:14:19 crc kubenswrapper[4707]: I0121 16:14:19.321565 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-78np7"] Jan 21 16:14:19 crc kubenswrapper[4707]: I0121 16:14:19.329839 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:14:19 crc kubenswrapper[4707]: I0121 16:14:19.562066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-78np7" event={"ID":"f7e2307d-fe01-4826-af55-eebb2c98de3b","Type":"ContainerStarted","Data":"f8f725efcce04d69cc3b89c6539fdc9f844ecbce00a2ed1a1a68a27334ae4542"} Jan 21 16:14:20 crc kubenswrapper[4707]: I0121 16:14:20.570145 4707 generic.go:334] "Generic (PLEG): container finished" podID="f7e2307d-fe01-4826-af55-eebb2c98de3b" containerID="3a4490cc8084048a122cc3dc772edf3f70456b2316b556413b3953de47b4a3f8" exitCode=0 Jan 21 16:14:20 crc kubenswrapper[4707]: I0121 16:14:20.570199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-78np7" event={"ID":"f7e2307d-fe01-4826-af55-eebb2c98de3b","Type":"ContainerDied","Data":"3a4490cc8084048a122cc3dc772edf3f70456b2316b556413b3953de47b4a3f8"} Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.801173 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.931655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68dhl\" (UniqueName: \"kubernetes.io/projected/f7e2307d-fe01-4826-af55-eebb2c98de3b-kube-api-access-68dhl\") pod \"f7e2307d-fe01-4826-af55-eebb2c98de3b\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.931715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f7e2307d-fe01-4826-af55-eebb2c98de3b-crc-storage\") pod \"f7e2307d-fe01-4826-af55-eebb2c98de3b\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.931832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f7e2307d-fe01-4826-af55-eebb2c98de3b-node-mnt\") pod \"f7e2307d-fe01-4826-af55-eebb2c98de3b\" (UID: \"f7e2307d-fe01-4826-af55-eebb2c98de3b\") " Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.932093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7e2307d-fe01-4826-af55-eebb2c98de3b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f7e2307d-fe01-4826-af55-eebb2c98de3b" (UID: "f7e2307d-fe01-4826-af55-eebb2c98de3b"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.935774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e2307d-fe01-4826-af55-eebb2c98de3b-kube-api-access-68dhl" (OuterVolumeSpecName: "kube-api-access-68dhl") pod "f7e2307d-fe01-4826-af55-eebb2c98de3b" (UID: "f7e2307d-fe01-4826-af55-eebb2c98de3b"). InnerVolumeSpecName "kube-api-access-68dhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:21 crc kubenswrapper[4707]: I0121 16:14:21.946158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e2307d-fe01-4826-af55-eebb2c98de3b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f7e2307d-fe01-4826-af55-eebb2c98de3b" (UID: "f7e2307d-fe01-4826-af55-eebb2c98de3b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:14:22 crc kubenswrapper[4707]: I0121 16:14:22.032692 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68dhl\" (UniqueName: \"kubernetes.io/projected/f7e2307d-fe01-4826-af55-eebb2c98de3b-kube-api-access-68dhl\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:22 crc kubenswrapper[4707]: I0121 16:14:22.032714 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f7e2307d-fe01-4826-af55-eebb2c98de3b-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:22 crc kubenswrapper[4707]: I0121 16:14:22.032724 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f7e2307d-fe01-4826-af55-eebb2c98de3b-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:22 crc kubenswrapper[4707]: I0121 16:14:22.590445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-78np7" event={"ID":"f7e2307d-fe01-4826-af55-eebb2c98de3b","Type":"ContainerDied","Data":"f8f725efcce04d69cc3b89c6539fdc9f844ecbce00a2ed1a1a68a27334ae4542"} Jan 21 16:14:22 crc kubenswrapper[4707]: I0121 16:14:22.590669 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8f725efcce04d69cc3b89c6539fdc9f844ecbce00a2ed1a1a68a27334ae4542" Jan 21 16:14:22 crc kubenswrapper[4707]: I0121 16:14:22.590487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-78np7" Jan 21 16:14:24 crc kubenswrapper[4707]: E0121 16:14:24.242134 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice/crio-770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa\": RecentStats: unable to find data in memory cache]" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.615936 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-78np7"] Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.620363 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-78np7"] Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714175 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jm9xf"] Jan 21 16:14:24 crc kubenswrapper[4707]: E0121 16:14:24.714429 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e2307d-fe01-4826-af55-eebb2c98de3b" containerName="storage" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e2307d-fe01-4826-af55-eebb2c98de3b" containerName="storage" Jan 21 16:14:24 crc kubenswrapper[4707]: E0121 16:14:24.714466 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714472 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:24 crc kubenswrapper[4707]: E0121 16:14:24.714479 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714487 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:24 crc kubenswrapper[4707]: E0121 16:14:24.714498 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714504 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714606 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e2307d-fe01-4826-af55-eebb2c98de3b" containerName="storage" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="647f33a8-e7c7-4941-9c0d-f9c5e4b07409" containerName="nova-scheduler-scheduler" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714631 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.714639 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd914cc-fc30-48bb-b6bb-0fa2f8bbbb56" containerName="nova-cell0-conductor-conductor" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.715037 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.716387 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.716688 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.716854 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.718913 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.722673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jm9xf"] Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.766171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6281cf61-0f59-4ad0-90a3-6f31bb120016-crc-storage\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.766509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6281cf61-0f59-4ad0-90a3-6f31bb120016-node-mnt\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.766596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mxm\" (UniqueName: 
\"kubernetes.io/projected/6281cf61-0f59-4ad0-90a3-6f31bb120016-kube-api-access-t9mxm\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.867507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6281cf61-0f59-4ad0-90a3-6f31bb120016-node-mnt\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.867553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mxm\" (UniqueName: \"kubernetes.io/projected/6281cf61-0f59-4ad0-90a3-6f31bb120016-kube-api-access-t9mxm\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.867607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6281cf61-0f59-4ad0-90a3-6f31bb120016-crc-storage\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.868006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6281cf61-0f59-4ad0-90a3-6f31bb120016-node-mnt\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.868230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6281cf61-0f59-4ad0-90a3-6f31bb120016-crc-storage\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:24 crc kubenswrapper[4707]: I0121 16:14:24.883181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mxm\" (UniqueName: \"kubernetes.io/projected/6281cf61-0f59-4ad0-90a3-6f31bb120016-kube-api-access-t9mxm\") pod \"crc-storage-crc-jm9xf\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:25 crc kubenswrapper[4707]: I0121 16:14:25.031043 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:25 crc kubenswrapper[4707]: I0121 16:14:25.190563 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e2307d-fe01-4826-af55-eebb2c98de3b" path="/var/lib/kubelet/pods/f7e2307d-fe01-4826-af55-eebb2c98de3b/volumes" Jan 21 16:14:25 crc kubenswrapper[4707]: I0121 16:14:25.400371 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jm9xf"] Jan 21 16:14:25 crc kubenswrapper[4707]: W0121 16:14:25.403919 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6281cf61_0f59_4ad0_90a3_6f31bb120016.slice/crio-5fbae2b53fba2289fe0cafb6ff5eb05c8ca5e0def1de9cf934216a7c34272bfe WatchSource:0}: Error finding container 5fbae2b53fba2289fe0cafb6ff5eb05c8ca5e0def1de9cf934216a7c34272bfe: Status 404 returned error can't find the container with id 5fbae2b53fba2289fe0cafb6ff5eb05c8ca5e0def1de9cf934216a7c34272bfe Jan 21 16:14:25 crc kubenswrapper[4707]: I0121 16:14:25.622567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jm9xf" event={"ID":"6281cf61-0f59-4ad0-90a3-6f31bb120016","Type":"ContainerStarted","Data":"5fbae2b53fba2289fe0cafb6ff5eb05c8ca5e0def1de9cf934216a7c34272bfe"} Jan 21 16:14:26 crc kubenswrapper[4707]: I0121 16:14:26.631221 4707 generic.go:334] "Generic (PLEG): container finished" podID="6281cf61-0f59-4ad0-90a3-6f31bb120016" containerID="27a61ac48846f110d86e798aec4b399e84383f396bcc0b0c6292cfd9a73082ad" exitCode=0 Jan 21 16:14:26 crc kubenswrapper[4707]: I0121 16:14:26.631272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jm9xf" event={"ID":"6281cf61-0f59-4ad0-90a3-6f31bb120016","Type":"ContainerDied","Data":"27a61ac48846f110d86e798aec4b399e84383f396bcc0b0c6292cfd9a73082ad"} Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.867089 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.904214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6281cf61-0f59-4ad0-90a3-6f31bb120016-crc-storage\") pod \"6281cf61-0f59-4ad0-90a3-6f31bb120016\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.904267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6281cf61-0f59-4ad0-90a3-6f31bb120016-node-mnt\") pod \"6281cf61-0f59-4ad0-90a3-6f31bb120016\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.904301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9mxm\" (UniqueName: \"kubernetes.io/projected/6281cf61-0f59-4ad0-90a3-6f31bb120016-kube-api-access-t9mxm\") pod \"6281cf61-0f59-4ad0-90a3-6f31bb120016\" (UID: \"6281cf61-0f59-4ad0-90a3-6f31bb120016\") " Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.904430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6281cf61-0f59-4ad0-90a3-6f31bb120016-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6281cf61-0f59-4ad0-90a3-6f31bb120016" (UID: "6281cf61-0f59-4ad0-90a3-6f31bb120016"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.904714 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6281cf61-0f59-4ad0-90a3-6f31bb120016-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.910058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6281cf61-0f59-4ad0-90a3-6f31bb120016-kube-api-access-t9mxm" (OuterVolumeSpecName: "kube-api-access-t9mxm") pod "6281cf61-0f59-4ad0-90a3-6f31bb120016" (UID: "6281cf61-0f59-4ad0-90a3-6f31bb120016"). InnerVolumeSpecName "kube-api-access-t9mxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:27 crc kubenswrapper[4707]: I0121 16:14:27.921245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6281cf61-0f59-4ad0-90a3-6f31bb120016-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6281cf61-0f59-4ad0-90a3-6f31bb120016" (UID: "6281cf61-0f59-4ad0-90a3-6f31bb120016"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:14:28 crc kubenswrapper[4707]: I0121 16:14:28.005393 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6281cf61-0f59-4ad0-90a3-6f31bb120016-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:28 crc kubenswrapper[4707]: I0121 16:14:28.005420 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9mxm\" (UniqueName: \"kubernetes.io/projected/6281cf61-0f59-4ad0-90a3-6f31bb120016-kube-api-access-t9mxm\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:28 crc kubenswrapper[4707]: I0121 16:14:28.645531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jm9xf" event={"ID":"6281cf61-0f59-4ad0-90a3-6f31bb120016","Type":"ContainerDied","Data":"5fbae2b53fba2289fe0cafb6ff5eb05c8ca5e0def1de9cf934216a7c34272bfe"} Jan 21 16:14:28 crc kubenswrapper[4707]: I0121 16:14:28.645568 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbae2b53fba2289fe0cafb6ff5eb05c8ca5e0def1de9cf934216a7c34272bfe" Jan 21 16:14:28 crc kubenswrapper[4707]: I0121 16:14:28.645571 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jm9xf" Jan 21 16:14:34 crc kubenswrapper[4707]: E0121 16:14:34.395506 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81457b6e_3af3_4da8_9f83_0e896795cb2c.slice/crio-770e28dc760349d846a328f6dcc5814425a16f9414e79e379a58b6c895cb04aa\": RecentStats: unable to find data in memory cache]" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.739585 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:14:40 crc kubenswrapper[4707]: E0121 16:14:40.740913 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6281cf61-0f59-4ad0-90a3-6f31bb120016" containerName="storage" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.740995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6281cf61-0f59-4ad0-90a3-6f31bb120016" containerName="storage" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.741158 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6281cf61-0f59-4ad0-90a3-6f31bb120016" containerName="storage" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.741862 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.744187 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.746466 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.746644 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.747258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-rzxhp" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.747375 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.748710 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.749717 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.749839 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tm8vs\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-kube-api-access-tm8vs\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908872 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.908989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.909010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.909024 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.961143 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.962189 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.965431 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.965430 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.965517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.965677 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.969020 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.969030 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.969084 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-w925z" Jan 21 16:14:40 crc kubenswrapper[4707]: I0121 16:14:40.971499 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm8vs\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-kube-api-access-tm8vs\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.009989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.010011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.010027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.010068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.010621 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.010672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc 
kubenswrapper[4707]: I0121 16:14:41.011016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.011217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.011281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.011658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.017224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.017244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.017347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.017561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.025621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm8vs\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-kube-api-access-tm8vs\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.026610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.057713 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4pl7\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-kube-api-access-r4pl7\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00504354-68de-4d18-8192-144ce961d72e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00504354-68de-4d18-8192-144ce961d72e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.111702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4pl7\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-kube-api-access-r4pl7\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00504354-68de-4d18-8192-144ce961d72e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00504354-68de-4d18-8192-144ce961d72e-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.213642 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.214036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.214071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.214121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.214263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.220644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.220752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00504354-68de-4d18-8192-144ce961d72e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.220900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00504354-68de-4d18-8192-144ce961d72e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.221060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.222182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.228339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4pl7\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-kube-api-access-r4pl7\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.235149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.278056 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.429221 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:14:41 crc kubenswrapper[4707]: W0121 16:14:41.434688 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda7ffdb4_4647_47fb_8ec9_012b0c81a83e.slice/crio-c8fa2204280b79a08ef1789a71e9bf7a5a3772f555f13c5d2b21714ba407ae92 WatchSource:0}: Error finding container c8fa2204280b79a08ef1789a71e9bf7a5a3772f555f13c5d2b21714ba407ae92: Status 404 returned error can't find the container with id c8fa2204280b79a08ef1789a71e9bf7a5a3772f555f13c5d2b21714ba407ae92 Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.651148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:14:41 crc kubenswrapper[4707]: W0121 16:14:41.653916 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00504354_68de_4d18_8192_144ce961d72e.slice/crio-5de853c01221fcced37194ede5d5fcdc7c902fad87689859a9b851771df82d45 WatchSource:0}: Error finding container 5de853c01221fcced37194ede5d5fcdc7c902fad87689859a9b851771df82d45: Status 404 returned error can't find the container with id 5de853c01221fcced37194ede5d5fcdc7c902fad87689859a9b851771df82d45 Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.738454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"00504354-68de-4d18-8192-144ce961d72e","Type":"ContainerStarted","Data":"5de853c01221fcced37194ede5d5fcdc7c902fad87689859a9b851771df82d45"} Jan 21 16:14:41 crc kubenswrapper[4707]: I0121 16:14:41.739705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"da7ffdb4-4647-47fb-8ec9-012b0c81a83e","Type":"ContainerStarted","Data":"c8fa2204280b79a08ef1789a71e9bf7a5a3772f555f13c5d2b21714ba407ae92"} Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.564430 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.566346 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.569339 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.569408 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-jvtfj" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.569366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.573937 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.574299 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.576516 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-operator-scripts\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcxwf\" (UniqueName: \"kubernetes.io/projected/41784af8-4684-4cb8-959b-2cff22fe7f16-kube-api-access-gcxwf\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-default\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631486 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-generated\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-kolla-config\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.631522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcxwf\" (UniqueName: \"kubernetes.io/projected/41784af8-4684-4cb8-959b-2cff22fe7f16-kube-api-access-gcxwf\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-default\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-generated\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-kolla-config\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.732846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 
16:14:42.732882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-operator-scripts\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.733854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-kolla-config\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.733986 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.734305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-operator-scripts\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.737090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-generated\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.737495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-default\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.738881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.740079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.745710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcxwf\" (UniqueName: \"kubernetes.io/projected/41784af8-4684-4cb8-959b-2cff22fe7f16-kube-api-access-gcxwf\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.752497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" 
event={"ID":"da7ffdb4-4647-47fb-8ec9-012b0c81a83e","Type":"ContainerStarted","Data":"66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075"} Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.757626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:42 crc kubenswrapper[4707]: I0121 16:14:42.888432 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.290198 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.760830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"41784af8-4684-4cb8-959b-2cff22fe7f16","Type":"ContainerStarted","Data":"3412449b8abfb0bc2880a2a1cb3170df4e64a694075dab079c2ba72a7ec9b33b"} Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.762415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"41784af8-4684-4cb8-959b-2cff22fe7f16","Type":"ContainerStarted","Data":"c1283d78e23a2058042cfcc6a44c7656fc8f658b2d5da061e23aeafa88042067"} Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.762540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"00504354-68de-4d18-8192-144ce961d72e","Type":"ContainerStarted","Data":"c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356"} Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.934870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.936459 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.938259 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-ncfmm" Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.938738 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.940525 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.941059 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 21 16:14:43 crc kubenswrapper[4707]: I0121 16:14:43.944992 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjxk5\" (UniqueName: \"kubernetes.io/projected/11bd3474-14bf-4c21-a327-c4dcbefdec96-kube-api-access-xjxk5\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 
16:14:44.060366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.060450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjxk5\" (UniqueName: \"kubernetes.io/projected/11bd3474-14bf-4c21-a327-c4dcbefdec96-kube-api-access-xjxk5\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.162474 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.164157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.167016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.167117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.167230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.176191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.185857 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjxk5\" (UniqueName: \"kubernetes.io/projected/11bd3474-14bf-4c21-a327-c4dcbefdec96-kube-api-access-xjxk5\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.192524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.207744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.225848 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.226682 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.228188 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.228345 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-h2mdv" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.228369 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.250641 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.256898 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.368202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.368325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjz76\" (UniqueName: \"kubernetes.io/projected/243cdfad-87ba-4917-8c2c-061272251dab-kube-api-access-fjz76\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.368366 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-kolla-config\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.368422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-config-data\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.368761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " 
pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.470236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjz76\" (UniqueName: \"kubernetes.io/projected/243cdfad-87ba-4917-8c2c-061272251dab-kube-api-access-fjz76\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.470631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-kolla-config\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.470672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-config-data\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.470731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.470799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.471371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-kolla-config\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.472037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-config-data\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.476108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.477006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.484339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjz76\" (UniqueName: 
\"kubernetes.io/projected/243cdfad-87ba-4917-8c2c-061272251dab-kube-api-access-fjz76\") pod \"memcached-0\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.567593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:44 crc kubenswrapper[4707]: W0121 16:14:44.656618 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11bd3474_14bf_4c21_a327_c4dcbefdec96.slice/crio-3f202e2e853ef921bd55bc21f919397144ec6481adbb438660ca4b94b96dad87 WatchSource:0}: Error finding container 3f202e2e853ef921bd55bc21f919397144ec6481adbb438660ca4b94b96dad87: Status 404 returned error can't find the container with id 3f202e2e853ef921bd55bc21f919397144ec6481adbb438660ca4b94b96dad87 Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.657878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.755108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:14:44 crc kubenswrapper[4707]: W0121 16:14:44.769210 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod243cdfad_87ba_4917_8c2c_061272251dab.slice/crio-8d73c24cf36e5d8118e4eebeaecd3536a73be52df33a91e6f1e726024c381b8e WatchSource:0}: Error finding container 8d73c24cf36e5d8118e4eebeaecd3536a73be52df33a91e6f1e726024c381b8e: Status 404 returned error can't find the container with id 8d73c24cf36e5d8118e4eebeaecd3536a73be52df33a91e6f1e726024c381b8e Jan 21 16:14:44 crc kubenswrapper[4707]: I0121 16:14:44.769692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"11bd3474-14bf-4c21-a327-c4dcbefdec96","Type":"ContainerStarted","Data":"3f202e2e853ef921bd55bc21f919397144ec6481adbb438660ca4b94b96dad87"} Jan 21 16:14:45 crc kubenswrapper[4707]: I0121 16:14:45.782561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"11bd3474-14bf-4c21-a327-c4dcbefdec96","Type":"ContainerStarted","Data":"53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64"} Jan 21 16:14:45 crc kubenswrapper[4707]: I0121 16:14:45.785568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"243cdfad-87ba-4917-8c2c-061272251dab","Type":"ContainerStarted","Data":"7c0e62cfad943f1c8d4c59525642ec4ccf69200f67d5f018cb5bef6717a97838"} Jan 21 16:14:45 crc kubenswrapper[4707]: I0121 16:14:45.785606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"243cdfad-87ba-4917-8c2c-061272251dab","Type":"ContainerStarted","Data":"8d73c24cf36e5d8118e4eebeaecd3536a73be52df33a91e6f1e726024c381b8e"} Jan 21 16:14:45 crc kubenswrapper[4707]: I0121 16:14:45.785707 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:45 crc kubenswrapper[4707]: I0121 16:14:45.835459 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.835427406 podStartE2EDuration="1.835427406s" podCreationTimestamp="2026-01-21 16:14:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:14:45.825922534 +0000 UTC m=+4383.007438756" watchObservedRunningTime="2026-01-21 16:14:45.835427406 +0000 UTC m=+4383.016943627" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.196632 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.197885 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.202087 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-h6x24" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.221335 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.299075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp2bv\" (UniqueName: \"kubernetes.io/projected/02a1be71-0142-4970-8a8a-9a04c4bd03da-kube-api-access-tp2bv\") pod \"kube-state-metrics-0\" (UID: \"02a1be71-0142-4970-8a8a-9a04c4bd03da\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.401274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp2bv\" (UniqueName: \"kubernetes.io/projected/02a1be71-0142-4970-8a8a-9a04c4bd03da-kube-api-access-tp2bv\") pod \"kube-state-metrics-0\" (UID: \"02a1be71-0142-4970-8a8a-9a04c4bd03da\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.421866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp2bv\" (UniqueName: \"kubernetes.io/projected/02a1be71-0142-4970-8a8a-9a04c4bd03da-kube-api-access-tp2bv\") pod \"kube-state-metrics-0\" (UID: \"02a1be71-0142-4970-8a8a-9a04c4bd03da\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.516540 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.792921 4707 generic.go:334] "Generic (PLEG): container finished" podID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerID="3412449b8abfb0bc2880a2a1cb3170df4e64a694075dab079c2ba72a7ec9b33b" exitCode=0 Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.793010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"41784af8-4684-4cb8-959b-2cff22fe7f16","Type":"ContainerDied","Data":"3412449b8abfb0bc2880a2a1cb3170df4e64a694075dab079c2ba72a7ec9b33b"} Jan 21 16:14:46 crc kubenswrapper[4707]: I0121 16:14:46.899652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:14:46 crc kubenswrapper[4707]: W0121 16:14:46.909453 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a1be71_0142_4970_8a8a_9a04c4bd03da.slice/crio-df6210c03f9f87c4d68a70b6e968050a4b14db8482dad0d2cbae022dfabf63f8 WatchSource:0}: Error finding container df6210c03f9f87c4d68a70b6e968050a4b14db8482dad0d2cbae022dfabf63f8: Status 404 returned error can't find the container with id df6210c03f9f87c4d68a70b6e968050a4b14db8482dad0d2cbae022dfabf63f8 Jan 21 16:14:47 crc kubenswrapper[4707]: I0121 16:14:47.802516 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02a1be71-0142-4970-8a8a-9a04c4bd03da","Type":"ContainerStarted","Data":"41464610db9257d72f412902dad7f3147f55589f358a384de5c478a302cf91eb"} Jan 21 16:14:47 crc kubenswrapper[4707]: I0121 16:14:47.802563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02a1be71-0142-4970-8a8a-9a04c4bd03da","Type":"ContainerStarted","Data":"df6210c03f9f87c4d68a70b6e968050a4b14db8482dad0d2cbae022dfabf63f8"} Jan 21 16:14:47 crc kubenswrapper[4707]: I0121 16:14:47.802637 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:47 crc kubenswrapper[4707]: I0121 16:14:47.804675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"41784af8-4684-4cb8-959b-2cff22fe7f16","Type":"ContainerStarted","Data":"c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee"} Jan 21 16:14:47 crc kubenswrapper[4707]: I0121 16:14:47.823820 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.5116525140000001 podStartE2EDuration="1.823791118s" podCreationTimestamp="2026-01-21 16:14:46 +0000 UTC" firstStartedPulling="2026-01-21 16:14:46.913264699 +0000 UTC m=+4384.094780920" lastFinishedPulling="2026-01-21 16:14:47.225403311 +0000 UTC m=+4384.406919524" observedRunningTime="2026-01-21 16:14:47.815054651 +0000 UTC m=+4384.996570872" watchObservedRunningTime="2026-01-21 16:14:47.823791118 +0000 UTC m=+4385.005307339" Jan 21 16:14:47 crc kubenswrapper[4707]: I0121 16:14:47.830253 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.830240254 podStartE2EDuration="6.830240254s" podCreationTimestamp="2026-01-21 16:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 16:14:47.828894724 +0000 UTC m=+4385.010410946" watchObservedRunningTime="2026-01-21 16:14:47.830240254 +0000 UTC m=+4385.011756476" Jan 21 16:14:48 crc kubenswrapper[4707]: I0121 16:14:48.813131 4707 generic.go:334] "Generic (PLEG): container finished" podID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerID="53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64" exitCode=0 Jan 21 16:14:48 crc kubenswrapper[4707]: I0121 16:14:48.813223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"11bd3474-14bf-4c21-a327-c4dcbefdec96","Type":"ContainerDied","Data":"53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64"} Jan 21 16:14:49 crc kubenswrapper[4707]: I0121 16:14:49.568940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:14:49 crc kubenswrapper[4707]: I0121 16:14:49.823307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"11bd3474-14bf-4c21-a327-c4dcbefdec96","Type":"ContainerStarted","Data":"0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b"} Jan 21 16:14:49 crc kubenswrapper[4707]: I0121 16:14:49.845052 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.845039473 podStartE2EDuration="7.845039473s" podCreationTimestamp="2026-01-21 16:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:14:49.839276857 +0000 UTC m=+4387.020793080" watchObservedRunningTime="2026-01-21 16:14:49.845039473 +0000 UTC m=+4387.026555696" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.087263 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.088660 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.090793 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-m9xjc" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.091303 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.091467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.095136 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.096220 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.096255 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.260984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq5p6\" (UniqueName: \"kubernetes.io/projected/48c6c533-464a-46bb-a15b-c54a45c7043d-kube-api-access-jq5p6\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.261054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-config\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.261082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.261108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.261328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.261461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc 
kubenswrapper[4707]: I0121 16:14:50.261510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.261567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq5p6\" (UniqueName: \"kubernetes.io/projected/48c6c533-464a-46bb-a15b-c54a45c7043d-kube-api-access-jq5p6\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-config\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362552 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.362604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.363450 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.363968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.364485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.364598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-config\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.371727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.371908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.373513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.377667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq5p6\" (UniqueName: \"kubernetes.io/projected/48c6c533-464a-46bb-a15b-c54a45c7043d-kube-api-access-jq5p6\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.381057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.402884 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.699286 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.700765 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.702631 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.702708 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.703236 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.705778 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-b7twx" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.710744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.787788 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.832470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"48c6c533-464a-46bb-a15b-c54a45c7043d","Type":"ContainerStarted","Data":"e444e4dc8220979ca199654f217e3684b3870875e525e7cd1e04c82a7f9c99a1"} Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.873253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.873655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcx5m\" (UniqueName: \"kubernetes.io/projected/9b4a6d46-528c-4e6b-b890-391a0b3d4331-kube-api-access-vcx5m\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.873739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.873847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.874015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.874077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.874111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.874205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.975849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.975930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.975964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.976013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.976056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.976098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vcx5m\" (UniqueName: \"kubernetes.io/projected/9b4a6d46-528c-4e6b-b890-391a0b3d4331-kube-api-access-vcx5m\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.976120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.976149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.977217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.978154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.979402 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.979754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.980956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.981232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.987772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:50 crc kubenswrapper[4707]: I0121 16:14:50.990712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcx5m\" (UniqueName: \"kubernetes.io/projected/9b4a6d46-528c-4e6b-b890-391a0b3d4331-kube-api-access-vcx5m\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.005131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.016115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.413946 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.847000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"48c6c533-464a-46bb-a15b-c54a45c7043d","Type":"ContainerStarted","Data":"1650aea6f1311ecc422cd7ecb402deed854b461e426b372cb16d23e5a7607f1c"} Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.847589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"48c6c533-464a-46bb-a15b-c54a45c7043d","Type":"ContainerStarted","Data":"23f18fa4c78ae9693483b4f51cc0a08db358f8a47f9ebc8353d056ecdd44f584"} Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.850655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"9b4a6d46-528c-4e6b-b890-391a0b3d4331","Type":"ContainerStarted","Data":"b09353f46c3b967cbbf4f0521a507dc2668c538f723bcf4edb30041804e018aa"} Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.850693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"9b4a6d46-528c-4e6b-b890-391a0b3d4331","Type":"ContainerStarted","Data":"393a518fbca493795d494a3aaca1d986424d5e9241c8300a7062bf397b08d31a"} Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.850707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"9b4a6d46-528c-4e6b-b890-391a0b3d4331","Type":"ContainerStarted","Data":"d46e557a41974565774706796039f73ec258264ea6bb57259d8bf9d9ac007090"} Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.870250 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.870229369 podStartE2EDuration="2.870229369s" podCreationTimestamp="2026-01-21 16:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:14:51.860670727 +0000 UTC m=+4389.042186948" watchObservedRunningTime="2026-01-21 16:14:51.870229369 +0000 UTC m=+4389.051745592" Jan 21 16:14:51 crc kubenswrapper[4707]: I0121 16:14:51.877952 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.877936071 
podStartE2EDuration="2.877936071s" podCreationTimestamp="2026-01-21 16:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:14:51.876464292 +0000 UTC m=+4389.057980515" watchObservedRunningTime="2026-01-21 16:14:51.877936071 +0000 UTC m=+4389.059452292" Jan 21 16:14:52 crc kubenswrapper[4707]: I0121 16:14:52.888928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:52 crc kubenswrapper[4707]: I0121 16:14:52.889350 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:52 crc kubenswrapper[4707]: I0121 16:14:52.949029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:53 crc kubenswrapper[4707]: I0121 16:14:53.403978 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:53 crc kubenswrapper[4707]: I0121 16:14:53.433334 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:53 crc kubenswrapper[4707]: I0121 16:14:53.865011 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:53 crc kubenswrapper[4707]: I0121 16:14:53.919218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.016638 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.043641 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.155615 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-67lxl"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.156914 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.161758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-67lxl"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.251142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.251195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.262290 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.263632 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.272186 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.278748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.344301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15dfa52-789a-420d-ad9f-4dfc7d755569-operator-scripts\") pod \"keystone-db-create-67lxl\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.344788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7sh9\" (UniqueName: \"kubernetes.io/projected/f15dfa52-789a-420d-ad9f-4dfc7d755569-kube-api-access-f7sh9\") pod \"keystone-db-create-67lxl\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.446084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7sh9\" (UniqueName: \"kubernetes.io/projected/f15dfa52-789a-420d-ad9f-4dfc7d755569-kube-api-access-f7sh9\") pod \"keystone-db-create-67lxl\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.446131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdp64\" (UniqueName: \"kubernetes.io/projected/2dc01904-c145-45f7-bd12-98f776502dfa-kube-api-access-jdp64\") pod \"keystone-fa94-account-create-update-wbrj5\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.446208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc01904-c145-45f7-bd12-98f776502dfa-operator-scripts\") pod \"keystone-fa94-account-create-update-wbrj5\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.446235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15dfa52-789a-420d-ad9f-4dfc7d755569-operator-scripts\") pod \"keystone-db-create-67lxl\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.446955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15dfa52-789a-420d-ad9f-4dfc7d755569-operator-scripts\") pod \"keystone-db-create-67lxl\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.463174 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/placement-db-create-hp9rb"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.464289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.469013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-hp9rb"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.470365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7sh9\" (UniqueName: \"kubernetes.io/projected/f15dfa52-789a-420d-ad9f-4dfc7d755569-kube-api-access-f7sh9\") pod \"keystone-db-create-67lxl\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.473364 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.552265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc01904-c145-45f7-bd12-98f776502dfa-operator-scripts\") pod \"keystone-fa94-account-create-update-wbrj5\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.552512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdp64\" (UniqueName: \"kubernetes.io/projected/2dc01904-c145-45f7-bd12-98f776502dfa-kube-api-access-jdp64\") pod \"keystone-fa94-account-create-update-wbrj5\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.555555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc01904-c145-45f7-bd12-98f776502dfa-operator-scripts\") pod \"keystone-fa94-account-create-update-wbrj5\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.571823 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-c324-account-create-update-mlsqx"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.572728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.576868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.580370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdp64\" (UniqueName: \"kubernetes.io/projected/2dc01904-c145-45f7-bd12-98f776502dfa-kube-api-access-jdp64\") pod \"keystone-fa94-account-create-update-wbrj5\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.582516 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.603075 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-c324-account-create-update-mlsqx"] Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.655317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49cj\" (UniqueName: \"kubernetes.io/projected/8af4275d-327c-428a-b9ef-6daf1e3b32dc-kube-api-access-w49cj\") pod \"placement-db-create-hp9rb\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.655487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8af4275d-327c-428a-b9ef-6daf1e3b32dc-operator-scripts\") pod \"placement-db-create-hp9rb\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.756481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w49cj\" (UniqueName: \"kubernetes.io/projected/8af4275d-327c-428a-b9ef-6daf1e3b32dc-kube-api-access-w49cj\") pod \"placement-db-create-hp9rb\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.756551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2479\" (UniqueName: \"kubernetes.io/projected/cc53221c-f932-4750-b193-c43af935722c-kube-api-access-p2479\") pod \"placement-c324-account-create-update-mlsqx\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.756578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc53221c-f932-4750-b193-c43af935722c-operator-scripts\") pod \"placement-c324-account-create-update-mlsqx\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.756599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8af4275d-327c-428a-b9ef-6daf1e3b32dc-operator-scripts\") pod \"placement-db-create-hp9rb\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.757304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8af4275d-327c-428a-b9ef-6daf1e3b32dc-operator-scripts\") pod \"placement-db-create-hp9rb\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.773231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49cj\" (UniqueName: \"kubernetes.io/projected/8af4275d-327c-428a-b9ef-6daf1e3b32dc-kube-api-access-w49cj\") pod \"placement-db-create-hp9rb\" (UID: 
\"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.844940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.858498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2479\" (UniqueName: \"kubernetes.io/projected/cc53221c-f932-4750-b193-c43af935722c-kube-api-access-p2479\") pod \"placement-c324-account-create-update-mlsqx\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.858547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc53221c-f932-4750-b193-c43af935722c-operator-scripts\") pod \"placement-c324-account-create-update-mlsqx\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.859570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc53221c-f932-4750-b193-c43af935722c-operator-scripts\") pod \"placement-c324-account-create-update-mlsqx\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.872216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.873291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2479\" (UniqueName: \"kubernetes.io/projected/cc53221c-f932-4750-b193-c43af935722c-kube-api-access-p2479\") pod \"placement-c324-account-create-update-mlsqx\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.931290 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:54 crc kubenswrapper[4707]: I0121 16:14:54.932139 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-67lxl"] Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.029156 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5"] Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.227771 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-hp9rb"] Jan 21 16:14:55 crc kubenswrapper[4707]: W0121 16:14:55.229912 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8af4275d_327c_428a_b9ef_6daf1e3b32dc.slice/crio-d56d17d3ecdaf639661251fe47c107e451fc25f2c4f963e1cdb9db89add71965 WatchSource:0}: Error finding container d56d17d3ecdaf639661251fe47c107e451fc25f2c4f963e1cdb9db89add71965: Status 404 returned error can't find the container with id d56d17d3ecdaf639661251fe47c107e451fc25f2c4f963e1cdb9db89add71965 Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.308968 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-c324-account-create-update-mlsqx"] Jan 21 16:14:55 crc kubenswrapper[4707]: W0121 16:14:55.374787 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc53221c_f932_4750_b193_c43af935722c.slice/crio-99f2b256fcf6bdf13c20878221686128e09b14855fd9477393cfb775a77b603c WatchSource:0}: Error finding container 99f2b256fcf6bdf13c20878221686128e09b14855fd9477393cfb775a77b603c: Status 404 returned error can't find the container with id 99f2b256fcf6bdf13c20878221686128e09b14855fd9477393cfb775a77b603c Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.442626 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.879587 4707 generic.go:334] "Generic (PLEG): container finished" podID="8af4275d-327c-428a-b9ef-6daf1e3b32dc" containerID="8080e1820e8617e734e7f6d3591626364addc6c0715323456f5b545599c9e1a6" exitCode=0 Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.879673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-hp9rb" event={"ID":"8af4275d-327c-428a-b9ef-6daf1e3b32dc","Type":"ContainerDied","Data":"8080e1820e8617e734e7f6d3591626364addc6c0715323456f5b545599c9e1a6"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.879700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-hp9rb" event={"ID":"8af4275d-327c-428a-b9ef-6daf1e3b32dc","Type":"ContainerStarted","Data":"d56d17d3ecdaf639661251fe47c107e451fc25f2c4f963e1cdb9db89add71965"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.881336 4707 generic.go:334] "Generic (PLEG): container finished" podID="cc53221c-f932-4750-b193-c43af935722c" containerID="968fa043e9d75f074257ba63bdd2414573937b0e4092066cd458b52c27017104" exitCode=0 Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.881421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" 
event={"ID":"cc53221c-f932-4750-b193-c43af935722c","Type":"ContainerDied","Data":"968fa043e9d75f074257ba63bdd2414573937b0e4092066cd458b52c27017104"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.881517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" event={"ID":"cc53221c-f932-4750-b193-c43af935722c","Type":"ContainerStarted","Data":"99f2b256fcf6bdf13c20878221686128e09b14855fd9477393cfb775a77b603c"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.883441 4707 generic.go:334] "Generic (PLEG): container finished" podID="2dc01904-c145-45f7-bd12-98f776502dfa" containerID="00b2e9080849f70f06e72d0f8cf6a6bb5cd6fdd4652e62a5fc3b7c49f6d06b04" exitCode=0 Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.883517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" event={"ID":"2dc01904-c145-45f7-bd12-98f776502dfa","Type":"ContainerDied","Data":"00b2e9080849f70f06e72d0f8cf6a6bb5cd6fdd4652e62a5fc3b7c49f6d06b04"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.883540 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" event={"ID":"2dc01904-c145-45f7-bd12-98f776502dfa","Type":"ContainerStarted","Data":"0efcd50a90c8043e4a93da1a79ec9cecf61a8e73e21679a88216e288810133ce"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.885017 4707 generic.go:334] "Generic (PLEG): container finished" podID="f15dfa52-789a-420d-ad9f-4dfc7d755569" containerID="4973b6ceb8c94a4209098aa2976ed9a0c40751c09bb67854d097b046b15d1209" exitCode=0 Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.885060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-67lxl" event={"ID":"f15dfa52-789a-420d-ad9f-4dfc7d755569","Type":"ContainerDied","Data":"4973b6ceb8c94a4209098aa2976ed9a0c40751c09bb67854d097b046b15d1209"} Jan 21 16:14:55 crc kubenswrapper[4707]: I0121 16:14:55.885125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-67lxl" event={"ID":"f15dfa52-789a-420d-ad9f-4dfc7d755569","Type":"ContainerStarted","Data":"196574728b73dc6bcf1c16f3bac7ce4937dcbfc7ce755369e6d85fe7c2e34403"} Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.043645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.153488 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.154649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.156598 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.156780 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-f5sl7" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.162654 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.167844 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.178889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.280977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.281117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.281515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhhtc\" (UniqueName: \"kubernetes.io/projected/e961b985-3419-4a6d-9cc5-e7c615f54129-kube-api-access-jhhtc\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.281569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-config\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.281846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.282270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.282370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-scripts\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhhtc\" (UniqueName: \"kubernetes.io/projected/e961b985-3419-4a6d-9cc5-e7c615f54129-kube-api-access-jhhtc\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-config\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.384544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-scripts\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.385480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-config\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.385492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-scripts\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.385717 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.408096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.408407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.408431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.410224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhhtc\" (UniqueName: \"kubernetes.io/projected/e961b985-3419-4a6d-9cc5-e7c615f54129-kube-api-access-jhhtc\") pod \"ovn-northd-0\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.473127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.520765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:14:56 crc kubenswrapper[4707]: I0121 16:14:56.959408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.066578 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.362014 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.473653 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.486302 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.486922 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.491523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.491956 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15dfa52-789a-420d-ad9f-4dfc7d755569" containerName="mariadb-database-create" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.491977 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15dfa52-789a-420d-ad9f-4dfc7d755569" containerName="mariadb-database-create" Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.491995 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc53221c-f932-4750-b193-c43af935722c" containerName="mariadb-account-create-update" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492004 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc53221c-f932-4750-b193-c43af935722c" containerName="mariadb-account-create-update" Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.492029 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc01904-c145-45f7-bd12-98f776502dfa" containerName="mariadb-account-create-update" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492036 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc01904-c145-45f7-bd12-98f776502dfa" containerName="mariadb-account-create-update" Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.492043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af4275d-327c-428a-b9ef-6daf1e3b32dc" containerName="mariadb-database-create" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492049 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af4275d-327c-428a-b9ef-6daf1e3b32dc" containerName="mariadb-database-create" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15dfa52-789a-420d-ad9f-4dfc7d755569" containerName="mariadb-database-create" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492263 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af4275d-327c-428a-b9ef-6daf1e3b32dc" containerName="mariadb-database-create" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492277 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc53221c-f932-4750-b193-c43af935722c" containerName="mariadb-account-create-update" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.492286 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc01904-c145-45f7-bd12-98f776502dfa" containerName="mariadb-account-create-update" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.509912 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.511091 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc01904-c145-45f7-bd12-98f776502dfa-operator-scripts\") pod \"2dc01904-c145-45f7-bd12-98f776502dfa\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.511403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdp64\" (UniqueName: \"kubernetes.io/projected/2dc01904-c145-45f7-bd12-98f776502dfa-kube-api-access-jdp64\") pod \"2dc01904-c145-45f7-bd12-98f776502dfa\" (UID: \"2dc01904-c145-45f7-bd12-98f776502dfa\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.512454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc01904-c145-45f7-bd12-98f776502dfa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dc01904-c145-45f7-bd12-98f776502dfa" (UID: "2dc01904-c145-45f7-bd12-98f776502dfa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.512520 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.514119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.514382 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.514528 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-w4v67" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.520367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.531650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc01904-c145-45f7-bd12-98f776502dfa-kube-api-access-jdp64" (OuterVolumeSpecName: "kube-api-access-jdp64") pod "2dc01904-c145-45f7-bd12-98f776502dfa" (UID: "2dc01904-c145-45f7-bd12-98f776502dfa"). InnerVolumeSpecName "kube-api-access-jdp64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.591079 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc53221c-f932-4750-b193-c43af935722c-operator-scripts\") pod \"cc53221c-f932-4750-b193-c43af935722c\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7sh9\" (UniqueName: \"kubernetes.io/projected/f15dfa52-789a-420d-ad9f-4dfc7d755569-kube-api-access-f7sh9\") pod \"f15dfa52-789a-420d-ad9f-4dfc7d755569\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w49cj\" (UniqueName: \"kubernetes.io/projected/8af4275d-327c-428a-b9ef-6daf1e3b32dc-kube-api-access-w49cj\") pod \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2479\" (UniqueName: \"kubernetes.io/projected/cc53221c-f932-4750-b193-c43af935722c-kube-api-access-p2479\") pod \"cc53221c-f932-4750-b193-c43af935722c\" (UID: \"cc53221c-f932-4750-b193-c43af935722c\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613540 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15dfa52-789a-420d-ad9f-4dfc7d755569-operator-scripts\") pod \"f15dfa52-789a-420d-ad9f-4dfc7d755569\" (UID: \"f15dfa52-789a-420d-ad9f-4dfc7d755569\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8af4275d-327c-428a-b9ef-6daf1e3b32dc-operator-scripts\") pod \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\" (UID: \"8af4275d-327c-428a-b9ef-6daf1e3b32dc\") " Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc53221c-f932-4750-b193-c43af935722c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc53221c-f932-4750-b193-c43af935722c" (UID: "cc53221c-f932-4750-b193-c43af935722c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.613927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjfwm\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-kube-api-access-rjfwm\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-cache\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-lock\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614369 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc01904-c145-45f7-bd12-98f776502dfa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614387 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc53221c-f932-4750-b193-c43af935722c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614400 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdp64\" (UniqueName: \"kubernetes.io/projected/2dc01904-c145-45f7-bd12-98f776502dfa-kube-api-access-jdp64\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15dfa52-789a-420d-ad9f-4dfc7d755569-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f15dfa52-789a-420d-ad9f-4dfc7d755569" (UID: "f15dfa52-789a-420d-ad9f-4dfc7d755569"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.614420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af4275d-327c-428a-b9ef-6daf1e3b32dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8af4275d-327c-428a-b9ef-6daf1e3b32dc" (UID: "8af4275d-327c-428a-b9ef-6daf1e3b32dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.616723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15dfa52-789a-420d-ad9f-4dfc7d755569-kube-api-access-f7sh9" (OuterVolumeSpecName: "kube-api-access-f7sh9") pod "f15dfa52-789a-420d-ad9f-4dfc7d755569" (UID: "f15dfa52-789a-420d-ad9f-4dfc7d755569"). InnerVolumeSpecName "kube-api-access-f7sh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.616777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc53221c-f932-4750-b193-c43af935722c-kube-api-access-p2479" (OuterVolumeSpecName: "kube-api-access-p2479") pod "cc53221c-f932-4750-b193-c43af935722c" (UID: "cc53221c-f932-4750-b193-c43af935722c"). InnerVolumeSpecName "kube-api-access-p2479". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.617141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af4275d-327c-428a-b9ef-6daf1e3b32dc-kube-api-access-w49cj" (OuterVolumeSpecName: "kube-api-access-w49cj") pod "8af4275d-327c-428a-b9ef-6daf1e3b32dc" (UID: "8af4275d-327c-428a-b9ef-6daf1e3b32dc"). InnerVolumeSpecName "kube-api-access-w49cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjfwm\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-kube-api-access-rjfwm\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-cache\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-lock\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716494 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7sh9\" (UniqueName: \"kubernetes.io/projected/f15dfa52-789a-420d-ad9f-4dfc7d755569-kube-api-access-f7sh9\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716508 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w49cj\" (UniqueName: \"kubernetes.io/projected/8af4275d-327c-428a-b9ef-6daf1e3b32dc-kube-api-access-w49cj\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716520 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2479\" (UniqueName: \"kubernetes.io/projected/cc53221c-f932-4750-b193-c43af935722c-kube-api-access-p2479\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716534 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15dfa52-789a-420d-ad9f-4dfc7d755569-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716544 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8af4275d-327c-428a-b9ef-6daf1e3b32dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.716627 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: 
\"2c2f6089-1b67-4c36-ae6f-66c394da5881\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.717090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-cache\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.717119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-lock\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.717156 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.717191 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:14:57 crc kubenswrapper[4707]: E0121 16:14:57.717251 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift podName:2c2f6089-1b67-4c36-ae6f-66c394da5881 nodeName:}" failed. No retries permitted until 2026-01-21 16:14:58.21723462 +0000 UTC m=+4395.398750842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift") pod "swift-storage-0" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881") : configmap "swift-ring-files" not found Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.734435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjfwm\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-kube-api-access-rjfwm\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.735590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.909181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"e961b985-3419-4a6d-9cc5-e7c615f54129","Type":"ContainerStarted","Data":"0d0f736ac687f2851035b080d22d8bb9669464f300eba02e20e36b392b5e41d4"} Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.909235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"e961b985-3419-4a6d-9cc5-e7c615f54129","Type":"ContainerStarted","Data":"f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5"} Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.909268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"e961b985-3419-4a6d-9cc5-e7c615f54129","Type":"ContainerStarted","Data":"d30c4adb31cf9fb4497122bf11d268303d3a823d3ea674fa40ce5a5d5cd533cb"} Jan 21 16:14:57 crc 
kubenswrapper[4707]: I0121 16:14:57.909284 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.910772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" event={"ID":"cc53221c-f932-4750-b193-c43af935722c","Type":"ContainerDied","Data":"99f2b256fcf6bdf13c20878221686128e09b14855fd9477393cfb775a77b603c"} Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.910800 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-c324-account-create-update-mlsqx" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.910820 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f2b256fcf6bdf13c20878221686128e09b14855fd9477393cfb775a77b603c" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.913180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" event={"ID":"2dc01904-c145-45f7-bd12-98f776502dfa","Type":"ContainerDied","Data":"0efcd50a90c8043e4a93da1a79ec9cecf61a8e73e21679a88216e288810133ce"} Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.913195 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.913203 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0efcd50a90c8043e4a93da1a79ec9cecf61a8e73e21679a88216e288810133ce" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.924664 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-67lxl" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.924701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-67lxl" event={"ID":"f15dfa52-789a-420d-ad9f-4dfc7d755569","Type":"ContainerDied","Data":"196574728b73dc6bcf1c16f3bac7ce4937dcbfc7ce755369e6d85fe7c2e34403"} Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.924748 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="196574728b73dc6bcf1c16f3bac7ce4937dcbfc7ce755369e6d85fe7c2e34403" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.926198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-hp9rb" event={"ID":"8af4275d-327c-428a-b9ef-6daf1e3b32dc","Type":"ContainerDied","Data":"d56d17d3ecdaf639661251fe47c107e451fc25f2c4f963e1cdb9db89add71965"} Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.926234 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d56d17d3ecdaf639661251fe47c107e451fc25f2c4f963e1cdb9db89add71965" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.926269 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-hp9rb" Jan 21 16:14:57 crc kubenswrapper[4707]: I0121 16:14:57.969417 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.969397931 podStartE2EDuration="1.969397931s" podCreationTimestamp="2026-01-21 16:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:14:57.949866908 +0000 UTC m=+4395.131383130" watchObservedRunningTime="2026-01-21 16:14:57.969397931 +0000 UTC m=+4395.150914153" Jan 21 16:14:58 crc kubenswrapper[4707]: I0121 16:14:58.228584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:58 crc kubenswrapper[4707]: E0121 16:14:58.228782 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:14:58 crc kubenswrapper[4707]: E0121 16:14:58.229845 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:14:58 crc kubenswrapper[4707]: E0121 16:14:58.229900 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift podName:2c2f6089-1b67-4c36-ae6f-66c394da5881 nodeName:}" failed. No retries permitted until 2026-01-21 16:14:59.229885586 +0000 UTC m=+4396.411401808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift") pod "swift-storage-0" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881") : configmap "swift-ring-files" not found Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.249693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:14:59 crc kubenswrapper[4707]: E0121 16:14:59.250335 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:14:59 crc kubenswrapper[4707]: E0121 16:14:59.250356 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:14:59 crc kubenswrapper[4707]: E0121 16:14:59.250406 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift podName:2c2f6089-1b67-4c36-ae6f-66c394da5881 nodeName:}" failed. No retries permitted until 2026-01-21 16:15:01.250388473 +0000 UTC m=+4398.431904695 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift") pod "swift-storage-0" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881") : configmap "swift-ring-files" not found Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.687496 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-8khz6"] Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.688422 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.694318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-8khz6"] Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.759485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4mhm\" (UniqueName: \"kubernetes.io/projected/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-kube-api-access-v4mhm\") pod \"glance-db-create-8khz6\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.759579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-operator-scripts\") pod \"glance-db-create-8khz6\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.801648 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5"] Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.803090 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.811124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.813655 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5"] Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.860843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4mhm\" (UniqueName: \"kubernetes.io/projected/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-kube-api-access-v4mhm\") pod \"glance-db-create-8khz6\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.860889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-operator-scripts\") pod \"glance-db-create-8khz6\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.861503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-operator-scripts\") pod \"glance-db-create-8khz6\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.880990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4mhm\" (UniqueName: \"kubernetes.io/projected/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-kube-api-access-v4mhm\") pod \"glance-db-create-8khz6\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.961775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-operator-scripts\") pod \"glance-0b3b-account-create-update-xcdb5\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:14:59 crc kubenswrapper[4707]: I0121 16:14:59.961859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsvl\" (UniqueName: \"kubernetes.io/projected/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-kube-api-access-kdsvl\") pod \"glance-0b3b-account-create-update-xcdb5\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.005269 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.063619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-operator-scripts\") pod \"glance-0b3b-account-create-update-xcdb5\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.063692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsvl\" (UniqueName: \"kubernetes.io/projected/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-kube-api-access-kdsvl\") pod \"glance-0b3b-account-create-update-xcdb5\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.064366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-operator-scripts\") pod \"glance-0b3b-account-create-update-xcdb5\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.078020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsvl\" (UniqueName: \"kubernetes.io/projected/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-kube-api-access-kdsvl\") pod \"glance-0b3b-account-create-update-xcdb5\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.127022 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.152366 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc"] Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.154881 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.157839 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.158017 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.172248 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc"] Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.272762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7609d4eb-aad1-45f2-bc49-3b2322e8a817-config-volume\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.272857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7609d4eb-aad1-45f2-bc49-3b2322e8a817-secret-volume\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.272893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zntw2\" (UniqueName: \"kubernetes.io/projected/7609d4eb-aad1-45f2-bc49-3b2322e8a817-kube-api-access-zntw2\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.376261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7609d4eb-aad1-45f2-bc49-3b2322e8a817-config-volume\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.376345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7609d4eb-aad1-45f2-bc49-3b2322e8a817-secret-volume\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.376406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zntw2\" (UniqueName: \"kubernetes.io/projected/7609d4eb-aad1-45f2-bc49-3b2322e8a817-kube-api-access-zntw2\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.377120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7609d4eb-aad1-45f2-bc49-3b2322e8a817-config-volume\") pod 
\"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.382061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7609d4eb-aad1-45f2-bc49-3b2322e8a817-secret-volume\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.391868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zntw2\" (UniqueName: \"kubernetes.io/projected/7609d4eb-aad1-45f2-bc49-3b2322e8a817-kube-api-access-zntw2\") pod \"collect-profiles-29483535-2nkhc\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: W0121 16:15:00.438525 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea63c116_37f3_4c5d_9ae3_ed07ceb1a530.slice/crio-63d06bba46daa46f37718a7110de6f11192f767cd54699dc7495853c6bcce938 WatchSource:0}: Error finding container 63d06bba46daa46f37718a7110de6f11192f767cd54699dc7495853c6bcce938: Status 404 returned error can't find the container with id 63d06bba46daa46f37718a7110de6f11192f767cd54699dc7495853c6bcce938 Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.439355 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-8khz6"] Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.496225 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.532367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5"] Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.892875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc"] Jan 21 16:15:00 crc kubenswrapper[4707]: W0121 16:15:00.895435 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7609d4eb_aad1_45f2_bc49_3b2322e8a817.slice/crio-3c9504c9fbba23682cf8d4948fd76d923547d4dfdc48df67d055342dc8985313 WatchSource:0}: Error finding container 3c9504c9fbba23682cf8d4948fd76d923547d4dfdc48df67d055342dc8985313: Status 404 returned error can't find the container with id 3c9504c9fbba23682cf8d4948fd76d923547d4dfdc48df67d055342dc8985313 Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.948439 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" containerID="0bb9c48ab5870f30efc043dff3b25c1812f174f3b39367fec91edd9d13f5bb5f" exitCode=0 Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.948537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-8khz6" event={"ID":"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530","Type":"ContainerDied","Data":"0bb9c48ab5870f30efc043dff3b25c1812f174f3b39367fec91edd9d13f5bb5f"} Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.948746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-8khz6" event={"ID":"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530","Type":"ContainerStarted","Data":"63d06bba46daa46f37718a7110de6f11192f767cd54699dc7495853c6bcce938"} Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.950326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" event={"ID":"7609d4eb-aad1-45f2-bc49-3b2322e8a817","Type":"ContainerStarted","Data":"3c9504c9fbba23682cf8d4948fd76d923547d4dfdc48df67d055342dc8985313"} Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.951754 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8f9ee62-8c3a-4e58-b616-5537c6220ff9" containerID="1ea7c9e00fa5bb2f612af778f3c7730e56fc51b09d34fc2f817000b446962d3e" exitCode=0 Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.951783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" event={"ID":"c8f9ee62-8c3a-4e58-b616-5537c6220ff9","Type":"ContainerDied","Data":"1ea7c9e00fa5bb2f612af778f3c7730e56fc51b09d34fc2f817000b446962d3e"} Jan 21 16:15:00 crc kubenswrapper[4707]: I0121 16:15:00.951958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" event={"ID":"c8f9ee62-8c3a-4e58-b616-5537c6220ff9","Type":"ContainerStarted","Data":"d6b0b67a94e83141816836f5a613e45fc69922a5d12bd95114750971a76fa04f"} Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.298127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" 
Jan 21 16:15:01 crc kubenswrapper[4707]: E0121 16:15:01.298359 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:15:01 crc kubenswrapper[4707]: E0121 16:15:01.298385 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:15:01 crc kubenswrapper[4707]: E0121 16:15:01.298453 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift podName:2c2f6089-1b67-4c36-ae6f-66c394da5881 nodeName:}" failed. No retries permitted until 2026-01-21 16:15:05.298432693 +0000 UTC m=+4402.479948915 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift") pod "swift-storage-0" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881") : configmap "swift-ring-files" not found Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.363251 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-wlhs4"] Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.364293 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.366371 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.366387 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.366513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.381026 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-wlhs4"] Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.501859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-ring-data-devices\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.502570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/348b9fd8-56d9-479d-ad68-6e98a249d50e-etc-swift\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.502874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt7bn\" (UniqueName: \"kubernetes.io/projected/348b9fd8-56d9-479d-ad68-6e98a249d50e-kube-api-access-mt7bn\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.503030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-dispersionconf\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.503092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-combined-ca-bundle\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.503174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-scripts\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.503222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-swiftconf\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.538055 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n7lst"] Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.538966 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.540676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.548945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n7lst"] Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49lc8\" (UniqueName: \"kubernetes.io/projected/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-kube-api-access-49lc8\") pod \"root-account-create-update-n7lst\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt7bn\" (UniqueName: \"kubernetes.io/projected/348b9fd8-56d9-479d-ad68-6e98a249d50e-kube-api-access-mt7bn\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-dispersionconf\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604898 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-combined-ca-bundle\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-operator-scripts\") pod \"root-account-create-update-n7lst\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-scripts\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.604975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-swiftconf\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.605104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-ring-data-devices\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.605743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-scripts\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.605988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-ring-data-devices\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.606095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/348b9fd8-56d9-479d-ad68-6e98a249d50e-etc-swift\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.606372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/348b9fd8-56d9-479d-ad68-6e98a249d50e-etc-swift\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.610258 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-swiftconf\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.610361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-dispersionconf\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.610431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-combined-ca-bundle\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.619037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt7bn\" (UniqueName: \"kubernetes.io/projected/348b9fd8-56d9-479d-ad68-6e98a249d50e-kube-api-access-mt7bn\") pod \"swift-ring-rebalance-wlhs4\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.696414 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.707708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49lc8\" (UniqueName: \"kubernetes.io/projected/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-kube-api-access-49lc8\") pod \"root-account-create-update-n7lst\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.707793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-operator-scripts\") pod \"root-account-create-update-n7lst\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.708500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-operator-scripts\") pod \"root-account-create-update-n7lst\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.722031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49lc8\" (UniqueName: \"kubernetes.io/projected/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-kube-api-access-49lc8\") pod \"root-account-create-update-n7lst\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.851291 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.965080 4707 generic.go:334] "Generic (PLEG): container finished" podID="7609d4eb-aad1-45f2-bc49-3b2322e8a817" containerID="f0b05456edefd34b6599529e3b34703d766d0113ca61f5aca013d8272a793a36" exitCode=0 Jan 21 16:15:01 crc kubenswrapper[4707]: I0121 16:15:01.965849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" event={"ID":"7609d4eb-aad1-45f2-bc49-3b2322e8a817","Type":"ContainerDied","Data":"f0b05456edefd34b6599529e3b34703d766d0113ca61f5aca013d8272a793a36"} Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.113708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-wlhs4"] Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.263726 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n7lst"] Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.295696 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.301605 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.422650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsvl\" (UniqueName: \"kubernetes.io/projected/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-kube-api-access-kdsvl\") pod \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.422713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-operator-scripts\") pod \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.422750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4mhm\" (UniqueName: \"kubernetes.io/projected/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-kube-api-access-v4mhm\") pod \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\" (UID: \"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530\") " Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.422777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-operator-scripts\") pod \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\" (UID: \"c8f9ee62-8c3a-4e58-b616-5537c6220ff9\") " Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.423494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" (UID: "ea63c116-37f3-4c5d-9ae3-ed07ceb1a530"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.423543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8f9ee62-8c3a-4e58-b616-5537c6220ff9" (UID: "c8f9ee62-8c3a-4e58-b616-5537c6220ff9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.426612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-kube-api-access-v4mhm" (OuterVolumeSpecName: "kube-api-access-v4mhm") pod "ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" (UID: "ea63c116-37f3-4c5d-9ae3-ed07ceb1a530"). InnerVolumeSpecName "kube-api-access-v4mhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.426676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-kube-api-access-kdsvl" (OuterVolumeSpecName: "kube-api-access-kdsvl") pod "c8f9ee62-8c3a-4e58-b616-5537c6220ff9" (UID: "c8f9ee62-8c3a-4e58-b616-5537c6220ff9"). InnerVolumeSpecName "kube-api-access-kdsvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.524182 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsvl\" (UniqueName: \"kubernetes.io/projected/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-kube-api-access-kdsvl\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.524210 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.524220 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4mhm\" (UniqueName: \"kubernetes.io/projected/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530-kube-api-access-v4mhm\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.524229 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f9ee62-8c3a-4e58-b616-5537c6220ff9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.649756 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khrx7"] Jan 21 16:15:02 crc kubenswrapper[4707]: E0121 16:15:02.650045 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f9ee62-8c3a-4e58-b616-5537c6220ff9" containerName="mariadb-account-create-update" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.650063 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f9ee62-8c3a-4e58-b616-5537c6220ff9" containerName="mariadb-account-create-update" Jan 21 16:15:02 crc kubenswrapper[4707]: E0121 16:15:02.650085 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" containerName="mariadb-database-create" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.650091 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" containerName="mariadb-database-create" Jan 21 
16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.650244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f9ee62-8c3a-4e58-b616-5537c6220ff9" containerName="mariadb-account-create-update" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.650261 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" containerName="mariadb-database-create" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.651218 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.658400 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khrx7"] Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.726524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-utilities\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.726601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9z97\" (UniqueName: \"kubernetes.io/projected/2c2458c7-ba58-4c56-bbea-725b7c873708-kube-api-access-v9z97\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.726668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-catalog-content\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.828419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-utilities\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.828503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9z97\" (UniqueName: \"kubernetes.io/projected/2c2458c7-ba58-4c56-bbea-725b7c873708-kube-api-access-v9z97\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.828544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-catalog-content\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.829110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-catalog-content\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " 
pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.829180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-utilities\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.846440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9z97\" (UniqueName: \"kubernetes.io/projected/2c2458c7-ba58-4c56-bbea-725b7c873708-kube-api-access-v9z97\") pod \"redhat-operators-khrx7\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.964705 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.987634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" event={"ID":"c8f9ee62-8c3a-4e58-b616-5537c6220ff9","Type":"ContainerDied","Data":"d6b0b67a94e83141816836f5a613e45fc69922a5d12bd95114750971a76fa04f"} Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.987934 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6b0b67a94e83141816836f5a613e45fc69922a5d12bd95114750971a76fa04f" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.987996 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.996292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-8khz6" event={"ID":"ea63c116-37f3-4c5d-9ae3-ed07ceb1a530","Type":"ContainerDied","Data":"63d06bba46daa46f37718a7110de6f11192f767cd54699dc7495853c6bcce938"} Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.996334 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d06bba46daa46f37718a7110de6f11192f767cd54699dc7495853c6bcce938" Jan 21 16:15:02 crc kubenswrapper[4707]: I0121 16:15:02.996383 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-8khz6" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.005061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" event={"ID":"348b9fd8-56d9-479d-ad68-6e98a249d50e","Type":"ContainerStarted","Data":"bb46f9e83f66a772289c0448dc2ee29a80460b2788d5947207c573f27eed0606"} Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.005105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" event={"ID":"348b9fd8-56d9-479d-ad68-6e98a249d50e","Type":"ContainerStarted","Data":"8f49c2258cb6e43079f6d39865026788852f76629040cfd64a39f781e2dbf15a"} Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.006961 4707 generic.go:334] "Generic (PLEG): container finished" podID="4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" containerID="5a1fdd32b7c048c2b08b9b7548470c2376175782a4962b0b050b54f1d5d49ea7" exitCode=0 Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.007132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-n7lst" event={"ID":"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66","Type":"ContainerDied","Data":"5a1fdd32b7c048c2b08b9b7548470c2376175782a4962b0b050b54f1d5d49ea7"} Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.007150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-n7lst" event={"ID":"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66","Type":"ContainerStarted","Data":"34a528b3937482424b856e3414319932511fdf40899a158df06f5449b7688ca5"} Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.037404 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" podStartSLOduration=2.037385308 podStartE2EDuration="2.037385308s" podCreationTimestamp="2026-01-21 16:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:03.018302698 +0000 UTC m=+4400.199818919" watchObservedRunningTime="2026-01-21 16:15:03.037385308 +0000 UTC m=+4400.218901531" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.400619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khrx7"] Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.413736 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.543784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7609d4eb-aad1-45f2-bc49-3b2322e8a817-secret-volume\") pod \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.544029 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zntw2\" (UniqueName: \"kubernetes.io/projected/7609d4eb-aad1-45f2-bc49-3b2322e8a817-kube-api-access-zntw2\") pod \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.544132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7609d4eb-aad1-45f2-bc49-3b2322e8a817-config-volume\") pod \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\" (UID: \"7609d4eb-aad1-45f2-bc49-3b2322e8a817\") " Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.545079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7609d4eb-aad1-45f2-bc49-3b2322e8a817-config-volume" (OuterVolumeSpecName: "config-volume") pod "7609d4eb-aad1-45f2-bc49-3b2322e8a817" (UID: "7609d4eb-aad1-45f2-bc49-3b2322e8a817"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.549494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7609d4eb-aad1-45f2-bc49-3b2322e8a817-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7609d4eb-aad1-45f2-bc49-3b2322e8a817" (UID: "7609d4eb-aad1-45f2-bc49-3b2322e8a817"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.550120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7609d4eb-aad1-45f2-bc49-3b2322e8a817-kube-api-access-zntw2" (OuterVolumeSpecName: "kube-api-access-zntw2") pod "7609d4eb-aad1-45f2-bc49-3b2322e8a817" (UID: "7609d4eb-aad1-45f2-bc49-3b2322e8a817"). InnerVolumeSpecName "kube-api-access-zntw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.648104 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7609d4eb-aad1-45f2-bc49-3b2322e8a817-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.648144 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zntw2\" (UniqueName: \"kubernetes.io/projected/7609d4eb-aad1-45f2-bc49-3b2322e8a817-kube-api-access-zntw2\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:03 crc kubenswrapper[4707]: I0121 16:15:03.648157 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7609d4eb-aad1-45f2-bc49-3b2322e8a817-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.015565 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerID="546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb" exitCode=0 Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.015648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrx7" event={"ID":"2c2458c7-ba58-4c56-bbea-725b7c873708","Type":"ContainerDied","Data":"546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb"} Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.015680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrx7" event={"ID":"2c2458c7-ba58-4c56-bbea-725b7c873708","Type":"ContainerStarted","Data":"62bf86e9ae3733e2f740393fc006d017cdc4a10215661c3b46a18eaec8cefda8"} Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.018300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" event={"ID":"7609d4eb-aad1-45f2-bc49-3b2322e8a817","Type":"ContainerDied","Data":"3c9504c9fbba23682cf8d4948fd76d923547d4dfdc48df67d055342dc8985313"} Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.018404 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9504c9fbba23682cf8d4948fd76d923547d4dfdc48df67d055342dc8985313" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.018329 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.335467 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.461030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-operator-scripts\") pod \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.461335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49lc8\" (UniqueName: \"kubernetes.io/projected/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-kube-api-access-49lc8\") pod \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\" (UID: \"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66\") " Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.461504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" (UID: "4cd0b4da-ec8b-4973-bc1f-1a63731b7c66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.461909 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.466639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-kube-api-access-49lc8" (OuterVolumeSpecName: "kube-api-access-49lc8") pod "4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" (UID: "4cd0b4da-ec8b-4973-bc1f-1a63731b7c66"). InnerVolumeSpecName "kube-api-access-49lc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.497684 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8"] Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.502701 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-9p6v8"] Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.563269 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49lc8\" (UniqueName: \"kubernetes.io/projected/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66-kube-api-access-49lc8\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.848544 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rtvph"] Jan 21 16:15:04 crc kubenswrapper[4707]: E0121 16:15:04.849205 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" containerName="mariadb-account-create-update" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.849217 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" containerName="mariadb-account-create-update" Jan 21 16:15:04 crc kubenswrapper[4707]: E0121 16:15:04.849248 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7609d4eb-aad1-45f2-bc49-3b2322e8a817" containerName="collect-profiles" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.849254 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7609d4eb-aad1-45f2-bc49-3b2322e8a817" containerName="collect-profiles" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.849394 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" containerName="mariadb-account-create-update" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.849407 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7609d4eb-aad1-45f2-bc49-3b2322e8a817" containerName="collect-profiles" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.850124 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.852211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dxb2g" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.852468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.877970 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rtvph"] Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.970248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzm7t\" (UniqueName: \"kubernetes.io/projected/bfe542c2-1818-48fd-ba38-1998c812cac3-kube-api-access-dzm7t\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.970293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-combined-ca-bundle\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.970324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-db-sync-config-data\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:04 crc kubenswrapper[4707]: I0121 16:15:04.970346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-config-data\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.028327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-n7lst" event={"ID":"4cd0b4da-ec8b-4973-bc1f-1a63731b7c66","Type":"ContainerDied","Data":"34a528b3937482424b856e3414319932511fdf40899a158df06f5449b7688ca5"} Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.028373 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34a528b3937482424b856e3414319932511fdf40899a158df06f5449b7688ca5" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.028421 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n7lst" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.072668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzm7t\" (UniqueName: \"kubernetes.io/projected/bfe542c2-1818-48fd-ba38-1998c812cac3-kube-api-access-dzm7t\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.072734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-combined-ca-bundle\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.072777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-db-sync-config-data\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.072819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-config-data\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.076688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-db-sync-config-data\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.079700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-combined-ca-bundle\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.086097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-config-data\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.088019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzm7t\" (UniqueName: \"kubernetes.io/projected/bfe542c2-1818-48fd-ba38-1998c812cac3-kube-api-access-dzm7t\") pod \"glance-db-sync-rtvph\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.174546 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.191597 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b25168-75cb-4128-9228-1547e29e5af9" path="/var/lib/kubelet/pods/60b25168-75cb-4128-9228-1547e29e5af9/volumes" Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.379210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:15:05 crc kubenswrapper[4707]: E0121 16:15:05.379442 4707 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 21 16:15:05 crc kubenswrapper[4707]: E0121 16:15:05.379638 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 21 16:15:05 crc kubenswrapper[4707]: E0121 16:15:05.379713 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift podName:2c2f6089-1b67-4c36-ae6f-66c394da5881 nodeName:}" failed. No retries permitted until 2026-01-21 16:15:13.379694152 +0000 UTC m=+4410.561210374 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift") pod "swift-storage-0" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881") : configmap "swift-ring-files" not found Jan 21 16:15:05 crc kubenswrapper[4707]: I0121 16:15:05.577363 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rtvph"] Jan 21 16:15:06 crc kubenswrapper[4707]: I0121 16:15:06.075306 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerID="6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae" exitCode=0 Jan 21 16:15:06 crc kubenswrapper[4707]: I0121 16:15:06.076183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrx7" event={"ID":"2c2458c7-ba58-4c56-bbea-725b7c873708","Type":"ContainerDied","Data":"6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae"} Jan 21 16:15:06 crc kubenswrapper[4707]: I0121 16:15:06.078733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rtvph" event={"ID":"bfe542c2-1818-48fd-ba38-1998c812cac3","Type":"ContainerStarted","Data":"5ed44944455dbc1a114604c766a45c1ca972a4606703683a2633724ca91ebe9c"} Jan 21 16:15:06 crc kubenswrapper[4707]: I0121 16:15:06.078756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rtvph" event={"ID":"bfe542c2-1818-48fd-ba38-1998c812cac3","Type":"ContainerStarted","Data":"3a1f69f80f44cc99ae9ccbc91f3677d0df2d8011aae9003e75c42cdbd4bbd657"} Jan 21 16:15:06 crc kubenswrapper[4707]: I0121 16:15:06.112790 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-rtvph" podStartSLOduration=2.112776965 podStartE2EDuration="2.112776965s" podCreationTimestamp="2026-01-21 16:15:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
16:15:06.111676265 +0000 UTC m=+4403.293192488" watchObservedRunningTime="2026-01-21 16:15:06.112776965 +0000 UTC m=+4403.294293187" Jan 21 16:15:07 crc kubenswrapper[4707]: I0121 16:15:07.907059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n7lst"] Jan 21 16:15:07 crc kubenswrapper[4707]: I0121 16:15:07.914652 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n7lst"] Jan 21 16:15:08 crc kubenswrapper[4707]: I0121 16:15:08.094269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrx7" event={"ID":"2c2458c7-ba58-4c56-bbea-725b7c873708","Type":"ContainerStarted","Data":"631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9"} Jan 21 16:15:08 crc kubenswrapper[4707]: I0121 16:15:08.114041 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khrx7" podStartSLOduration=3.055985925 podStartE2EDuration="6.114029031s" podCreationTimestamp="2026-01-21 16:15:02 +0000 UTC" firstStartedPulling="2026-01-21 16:15:04.017824264 +0000 UTC m=+4401.199340486" lastFinishedPulling="2026-01-21 16:15:07.07586737 +0000 UTC m=+4404.257383592" observedRunningTime="2026-01-21 16:15:08.111146151 +0000 UTC m=+4405.292662373" watchObservedRunningTime="2026-01-21 16:15:08.114029031 +0000 UTC m=+4405.295545254" Jan 21 16:15:09 crc kubenswrapper[4707]: I0121 16:15:09.104524 4707 generic.go:334] "Generic (PLEG): container finished" podID="348b9fd8-56d9-479d-ad68-6e98a249d50e" containerID="bb46f9e83f66a772289c0448dc2ee29a80460b2788d5947207c573f27eed0606" exitCode=0 Jan 21 16:15:09 crc kubenswrapper[4707]: I0121 16:15:09.104641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" event={"ID":"348b9fd8-56d9-479d-ad68-6e98a249d50e","Type":"ContainerDied","Data":"bb46f9e83f66a772289c0448dc2ee29a80460b2788d5947207c573f27eed0606"} Jan 21 16:15:09 crc kubenswrapper[4707]: I0121 16:15:09.193129 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd0b4da-ec8b-4973-bc1f-1a63731b7c66" path="/var/lib/kubelet/pods/4cd0b4da-ec8b-4973-bc1f-1a63731b7c66/volumes" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.415102 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.467767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-dispersionconf\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.467839 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-ring-data-devices\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.467908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-combined-ca-bundle\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.467966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-scripts\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.468065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/348b9fd8-56d9-479d-ad68-6e98a249d50e-etc-swift\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.468126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt7bn\" (UniqueName: \"kubernetes.io/projected/348b9fd8-56d9-479d-ad68-6e98a249d50e-kube-api-access-mt7bn\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.468414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.469066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/348b9fd8-56d9-479d-ad68-6e98a249d50e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.469178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-swiftconf\") pod \"348b9fd8-56d9-479d-ad68-6e98a249d50e\" (UID: \"348b9fd8-56d9-479d-ad68-6e98a249d50e\") " Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.470221 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/348b9fd8-56d9-479d-ad68-6e98a249d50e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.470241 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.474455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348b9fd8-56d9-479d-ad68-6e98a249d50e-kube-api-access-mt7bn" (OuterVolumeSpecName: "kube-api-access-mt7bn") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "kube-api-access-mt7bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.480660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.488935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.490018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-scripts" (OuterVolumeSpecName: "scripts") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.494306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "348b9fd8-56d9-479d-ad68-6e98a249d50e" (UID: "348b9fd8-56d9-479d-ad68-6e98a249d50e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.573257 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.573494 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.573557 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348b9fd8-56d9-479d-ad68-6e98a249d50e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.573625 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt7bn\" (UniqueName: \"kubernetes.io/projected/348b9fd8-56d9-479d-ad68-6e98a249d50e-kube-api-access-mt7bn\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:10 crc kubenswrapper[4707]: I0121 16:15:10.573683 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/348b9fd8-56d9-479d-ad68-6e98a249d50e-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:11 crc kubenswrapper[4707]: I0121 16:15:11.123523 4707 generic.go:334] "Generic (PLEG): container finished" podID="bfe542c2-1818-48fd-ba38-1998c812cac3" containerID="5ed44944455dbc1a114604c766a45c1ca972a4606703683a2633724ca91ebe9c" exitCode=0 Jan 21 16:15:11 crc kubenswrapper[4707]: I0121 16:15:11.123606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rtvph" event={"ID":"bfe542c2-1818-48fd-ba38-1998c812cac3","Type":"ContainerDied","Data":"5ed44944455dbc1a114604c766a45c1ca972a4606703683a2633724ca91ebe9c"} Jan 21 16:15:11 crc kubenswrapper[4707]: I0121 16:15:11.126176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" event={"ID":"348b9fd8-56d9-479d-ad68-6e98a249d50e","Type":"ContainerDied","Data":"8f49c2258cb6e43079f6d39865026788852f76629040cfd64a39f781e2dbf15a"} Jan 21 16:15:11 crc kubenswrapper[4707]: I0121 16:15:11.126290 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f49c2258cb6e43079f6d39865026788852f76629040cfd64a39f781e2dbf15a" Jan 21 16:15:11 crc kubenswrapper[4707]: I0121 16:15:11.126234 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-wlhs4" Jan 21 16:15:11 crc kubenswrapper[4707]: I0121 16:15:11.548486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.439970 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.510309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzm7t\" (UniqueName: \"kubernetes.io/projected/bfe542c2-1818-48fd-ba38-1998c812cac3-kube-api-access-dzm7t\") pod \"bfe542c2-1818-48fd-ba38-1998c812cac3\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.510400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-combined-ca-bundle\") pod \"bfe542c2-1818-48fd-ba38-1998c812cac3\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.510461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-db-sync-config-data\") pod \"bfe542c2-1818-48fd-ba38-1998c812cac3\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.510515 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-config-data\") pod \"bfe542c2-1818-48fd-ba38-1998c812cac3\" (UID: \"bfe542c2-1818-48fd-ba38-1998c812cac3\") " Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.517111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bfe542c2-1818-48fd-ba38-1998c812cac3" (UID: "bfe542c2-1818-48fd-ba38-1998c812cac3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.517944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe542c2-1818-48fd-ba38-1998c812cac3-kube-api-access-dzm7t" (OuterVolumeSpecName: "kube-api-access-dzm7t") pod "bfe542c2-1818-48fd-ba38-1998c812cac3" (UID: "bfe542c2-1818-48fd-ba38-1998c812cac3"). InnerVolumeSpecName "kube-api-access-dzm7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.532415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfe542c2-1818-48fd-ba38-1998c812cac3" (UID: "bfe542c2-1818-48fd-ba38-1998c812cac3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.546777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-config-data" (OuterVolumeSpecName: "config-data") pod "bfe542c2-1818-48fd-ba38-1998c812cac3" (UID: "bfe542c2-1818-48fd-ba38-1998c812cac3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.613488 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.613543 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzm7t\" (UniqueName: \"kubernetes.io/projected/bfe542c2-1818-48fd-ba38-1998c812cac3-kube-api-access-dzm7t\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.613562 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.613579 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bfe542c2-1818-48fd-ba38-1998c812cac3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.918757 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-9xw7x"] Jan 21 16:15:12 crc kubenswrapper[4707]: E0121 16:15:12.919124 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe542c2-1818-48fd-ba38-1998c812cac3" containerName="glance-db-sync" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.919143 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe542c2-1818-48fd-ba38-1998c812cac3" containerName="glance-db-sync" Jan 21 16:15:12 crc kubenswrapper[4707]: E0121 16:15:12.919174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348b9fd8-56d9-479d-ad68-6e98a249d50e" containerName="swift-ring-rebalance" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.919181 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="348b9fd8-56d9-479d-ad68-6e98a249d50e" containerName="swift-ring-rebalance" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.919341 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="348b9fd8-56d9-479d-ad68-6e98a249d50e" containerName="swift-ring-rebalance" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.919367 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe542c2-1818-48fd-ba38-1998c812cac3" containerName="glance-db-sync" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.919919 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.922280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.934424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-9xw7x"] Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.965479 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.965722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:12 crc kubenswrapper[4707]: I0121 16:15:12.999618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.020831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dn7r\" (UniqueName: \"kubernetes.io/projected/f0d00d19-3404-42e9-be55-942f24e995f0-kube-api-access-5dn7r\") pod \"root-account-create-update-9xw7x\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.021005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0d00d19-3404-42e9-be55-942f24e995f0-operator-scripts\") pod \"root-account-create-update-9xw7x\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.122474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0d00d19-3404-42e9-be55-942f24e995f0-operator-scripts\") pod \"root-account-create-update-9xw7x\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.122563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dn7r\" (UniqueName: \"kubernetes.io/projected/f0d00d19-3404-42e9-be55-942f24e995f0-kube-api-access-5dn7r\") pod \"root-account-create-update-9xw7x\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.123744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0d00d19-3404-42e9-be55-942f24e995f0-operator-scripts\") pod \"root-account-create-update-9xw7x\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.137661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dn7r\" (UniqueName: \"kubernetes.io/projected/f0d00d19-3404-42e9-be55-942f24e995f0-kube-api-access-5dn7r\") pod \"root-account-create-update-9xw7x\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " pod="openstack-kuttl-tests/root-account-create-update-9xw7x" 
Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.142803 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-rtvph" event={"ID":"bfe542c2-1818-48fd-ba38-1998c812cac3","Type":"ContainerDied","Data":"3a1f69f80f44cc99ae9ccbc91f3677d0df2d8011aae9003e75c42cdbd4bbd657"} Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.142860 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a1f69f80f44cc99ae9ccbc91f3677d0df2d8011aae9003e75c42cdbd4bbd657" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.142874 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-rtvph" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.195935 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.234877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khrx7"] Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.235073 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.427763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.434386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"swift-storage-0\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.652180 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-9xw7x"] Jan 21 16:15:13 crc kubenswrapper[4707]: W0121 16:15:13.654318 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0d00d19_3404_42e9_be55_942f24e995f0.slice/crio-fec69c496adf2ca8214c5ec2d667ad525d2ec1a206d061e69099f9254c8811a8 WatchSource:0}: Error finding container fec69c496adf2ca8214c5ec2d667ad525d2ec1a206d061e69099f9254c8811a8: Status 404 returned error can't find the container with id fec69c496adf2ca8214c5ec2d667ad525d2ec1a206d061e69099f9254c8811a8 Jan 21 16:15:13 crc kubenswrapper[4707]: I0121 16:15:13.732374 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.128016 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.174287 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0d00d19-3404-42e9-be55-942f24e995f0" containerID="8363967c7b8268c252b183ff9c78e4bc582e3b7632d312b491c23c4ee83dcc32" exitCode=0 Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.174435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" event={"ID":"f0d00d19-3404-42e9-be55-942f24e995f0","Type":"ContainerDied","Data":"8363967c7b8268c252b183ff9c78e4bc582e3b7632d312b491c23c4ee83dcc32"} Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.174510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" event={"ID":"f0d00d19-3404-42e9-be55-942f24e995f0","Type":"ContainerStarted","Data":"fec69c496adf2ca8214c5ec2d667ad525d2ec1a206d061e69099f9254c8811a8"} Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.176132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"7ac3d2c350757ccb932d512ede416f9f9f38057a71361cc853fb1e70481913b6"} Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.178183 4707 generic.go:334] "Generic (PLEG): container finished" podID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerID="66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075" exitCode=0 Jan 21 16:15:14 crc kubenswrapper[4707]: I0121 16:15:14.178206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"da7ffdb4-4647-47fb-8ec9-012b0c81a83e","Type":"ContainerDied","Data":"66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.189723 4707 generic.go:334] "Generic (PLEG): container finished" podID="00504354-68de-4d18-8192-144ce961d72e" containerID="c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356" exitCode=0 Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.189836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"00504354-68de-4d18-8192-144ce961d72e","Type":"ContainerDied","Data":"c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6"} Jan 21 
16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.193646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.202109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"da7ffdb4-4647-47fb-8ec9-012b0c81a83e","Type":"ContainerStarted","Data":"90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929"} Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.202222 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-khrx7" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="registry-server" containerID="cri-o://631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9" gracePeriod=2 Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.261254 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.261238862 podStartE2EDuration="36.261238862s" podCreationTimestamp="2026-01-21 16:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:15.257286651 +0000 UTC m=+4412.438802872" watchObservedRunningTime="2026-01-21 16:15:15.261238862 +0000 UTC m=+4412.442755084" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.594788 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.731378 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.770078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dn7r\" (UniqueName: \"kubernetes.io/projected/f0d00d19-3404-42e9-be55-942f24e995f0-kube-api-access-5dn7r\") pod \"f0d00d19-3404-42e9-be55-942f24e995f0\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.770351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0d00d19-3404-42e9-be55-942f24e995f0-operator-scripts\") pod \"f0d00d19-3404-42e9-be55-942f24e995f0\" (UID: \"f0d00d19-3404-42e9-be55-942f24e995f0\") " Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.771900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0d00d19-3404-42e9-be55-942f24e995f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0d00d19-3404-42e9-be55-942f24e995f0" (UID: "f0d00d19-3404-42e9-be55-942f24e995f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.779108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d00d19-3404-42e9-be55-942f24e995f0-kube-api-access-5dn7r" (OuterVolumeSpecName: "kube-api-access-5dn7r") pod "f0d00d19-3404-42e9-be55-942f24e995f0" (UID: "f0d00d19-3404-42e9-be55-942f24e995f0"). InnerVolumeSpecName "kube-api-access-5dn7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.872013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-utilities\") pod \"2c2458c7-ba58-4c56-bbea-725b7c873708\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.872402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9z97\" (UniqueName: \"kubernetes.io/projected/2c2458c7-ba58-4c56-bbea-725b7c873708-kube-api-access-v9z97\") pod \"2c2458c7-ba58-4c56-bbea-725b7c873708\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.872502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-catalog-content\") pod \"2c2458c7-ba58-4c56-bbea-725b7c873708\" (UID: \"2c2458c7-ba58-4c56-bbea-725b7c873708\") " Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.872968 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dn7r\" (UniqueName: \"kubernetes.io/projected/f0d00d19-3404-42e9-be55-942f24e995f0-kube-api-access-5dn7r\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.872983 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0d00d19-3404-42e9-be55-942f24e995f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.872991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-utilities" 
(OuterVolumeSpecName: "utilities") pod "2c2458c7-ba58-4c56-bbea-725b7c873708" (UID: "2c2458c7-ba58-4c56-bbea-725b7c873708"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.876945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2458c7-ba58-4c56-bbea-725b7c873708-kube-api-access-v9z97" (OuterVolumeSpecName: "kube-api-access-v9z97") pod "2c2458c7-ba58-4c56-bbea-725b7c873708" (UID: "2c2458c7-ba58-4c56-bbea-725b7c873708"). InnerVolumeSpecName "kube-api-access-v9z97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.975680 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:15 crc kubenswrapper[4707]: I0121 16:15:15.975718 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9z97\" (UniqueName: \"kubernetes.io/projected/2c2458c7-ba58-4c56-bbea-725b7c873708-kube-api-access-v9z97\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.211860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"00504354-68de-4d18-8192-144ce961d72e","Type":"ContainerStarted","Data":"ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.212366 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.220267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.220311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.220323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.220332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.220340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.220350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.222666 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerID="631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9" exitCode=0 Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.222724 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khrx7" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.222728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrx7" event={"ID":"2c2458c7-ba58-4c56-bbea-725b7c873708","Type":"ContainerDied","Data":"631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.222855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khrx7" event={"ID":"2c2458c7-ba58-4c56-bbea-725b7c873708","Type":"ContainerDied","Data":"62bf86e9ae3733e2f740393fc006d017cdc4a10215661c3b46a18eaec8cefda8"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.222891 4707 scope.go:117] "RemoveContainer" containerID="631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.224898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" event={"ID":"f0d00d19-3404-42e9-be55-942f24e995f0","Type":"ContainerDied","Data":"fec69c496adf2ca8214c5ec2d667ad525d2ec1a206d061e69099f9254c8811a8"} Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.224931 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fec69c496adf2ca8214c5ec2d667ad525d2ec1a206d061e69099f9254c8811a8" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.224955 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-9xw7x" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.253428 4707 scope.go:117] "RemoveContainer" containerID="6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.274767 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=37.274747067 podStartE2EDuration="37.274747067s" podCreationTimestamp="2026-01-21 16:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:16.264654069 +0000 UTC m=+4413.446170291" watchObservedRunningTime="2026-01-21 16:15:16.274747067 +0000 UTC m=+4413.456263290" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.339698 4707 scope.go:117] "RemoveContainer" containerID="546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.380334 4707 scope.go:117] "RemoveContainer" containerID="631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9" Jan 21 16:15:16 crc kubenswrapper[4707]: E0121 16:15:16.380736 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9\": container with ID starting with 631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9 not found: ID does not exist" containerID="631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.380765 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9"} err="failed to get container status \"631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9\": rpc error: code = NotFound desc = could not find container \"631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9\": container with ID starting with 631b81ba0f50462d9d0aaac5665a89849dd86dd0eb80473098aaca42086b38b9 not found: ID does not exist" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.380785 4707 scope.go:117] "RemoveContainer" containerID="6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae" Jan 21 16:15:16 crc kubenswrapper[4707]: E0121 16:15:16.381095 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae\": container with ID starting with 6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae not found: ID does not exist" containerID="6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.381134 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae"} err="failed to get container status \"6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae\": rpc error: code = NotFound desc = could not find container \"6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae\": container with ID starting with 6552d5d3c8ab39c4f917dcd5b5e064b4a8c89a26984e71e76cde74bcad8449ae not found: ID does not exist" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 
16:15:16.381155 4707 scope.go:117] "RemoveContainer" containerID="546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb" Jan 21 16:15:16 crc kubenswrapper[4707]: E0121 16:15:16.381460 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb\": container with ID starting with 546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb not found: ID does not exist" containerID="546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.381497 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb"} err="failed to get container status \"546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb\": rpc error: code = NotFound desc = could not find container \"546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb\": container with ID starting with 546d56a74bd9f73bca5a558cf589167e095bcc9ad9bc89048bbcbf28e2a493bb not found: ID does not exist" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.845988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c2458c7-ba58-4c56-bbea-725b7c873708" (UID: "2c2458c7-ba58-4c56-bbea-725b7c873708"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:16 crc kubenswrapper[4707]: I0121 16:15:16.893928 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2458c7-ba58-4c56-bbea-725b7c873708-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.155373 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khrx7"] Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.162656 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-khrx7"] Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.194004 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" path="/var/lib/kubelet/pods/2c2458c7-ba58-4c56-bbea-725b7c873708/volumes" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.239560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec"} Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.239616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerStarted","Data":"2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d"} Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.266738 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=21.266713818 podStartE2EDuration="21.266713818s" podCreationTimestamp="2026-01-21 16:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 16:15:17.266394678 +0000 UTC m=+4414.447910901" watchObservedRunningTime="2026-01-21 16:15:17.266713818 +0000 UTC m=+4414.448230041" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403172 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854"] Jan 21 16:15:17 crc kubenswrapper[4707]: E0121 16:15:17.403544 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="registry-server" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403564 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="registry-server" Jan 21 16:15:17 crc kubenswrapper[4707]: E0121 16:15:17.403579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d00d19-3404-42e9-be55-942f24e995f0" containerName="mariadb-account-create-update" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d00d19-3404-42e9-be55-942f24e995f0" containerName="mariadb-account-create-update" Jan 21 16:15:17 crc kubenswrapper[4707]: E0121 16:15:17.403596 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="extract-utilities" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403602 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="extract-utilities" Jan 21 16:15:17 crc kubenswrapper[4707]: E0121 16:15:17.403613 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="extract-content" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403618 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="extract-content" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403772 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2458c7-ba58-4c56-bbea-725b7c873708" containerName="registry-server" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.403794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d00d19-3404-42e9-be55-942f24e995f0" containerName="mariadb-account-create-update" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.404664 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.406854 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.427142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854"] Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.513514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.513573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.513642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxrzz\" (UniqueName: \"kubernetes.io/projected/3d94635b-5310-4d9e-b655-163963e945b9-kube-api-access-vxrzz\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.513661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-config\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.616254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.616327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.616402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxrzz\" (UniqueName: \"kubernetes.io/projected/3d94635b-5310-4d9e-b655-163963e945b9-kube-api-access-vxrzz\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.616426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-config\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.617700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-config\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.617707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.617706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.635207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxrzz\" (UniqueName: \"kubernetes.io/projected/3d94635b-5310-4d9e-b655-163963e945b9-kube-api-access-vxrzz\") pod \"dnsmasq-dnsmasq-64ddcf65d5-mr854\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:17 crc kubenswrapper[4707]: I0121 16:15:17.726505 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.167502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854"] Jan 21 16:15:18 crc kubenswrapper[4707]: W0121 16:15:18.168299 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d94635b_5310_4d9e_b655_163963e945b9.slice/crio-21e8b7c0cbfa4420d0baafce2562ba5e86595d9f888cdf7e060204960e1e7111 WatchSource:0}: Error finding container 21e8b7c0cbfa4420d0baafce2562ba5e86595d9f888cdf7e060204960e1e7111: Status 404 returned error can't find the container with id 21e8b7c0cbfa4420d0baafce2562ba5e86595d9f888cdf7e060204960e1e7111 Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.249230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" event={"ID":"3d94635b-5310-4d9e-b655-163963e945b9","Type":"ContainerStarted","Data":"21e8b7c0cbfa4420d0baafce2562ba5e86595d9f888cdf7e060204960e1e7111"} Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.759096 4707 scope.go:117] "RemoveContainer" containerID="a39188bac11b6baaf80c99303d7a4fcc4cc0ab5f6d005043fe277c2caf45f82c" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.774600 4707 scope.go:117] "RemoveContainer" containerID="2836fb5072518b31b20d2d2ab0bac82ac6aaa2d7fa44b4ce3c1c244ed3389a72" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.796615 4707 scope.go:117] "RemoveContainer" containerID="ea92c2d6019c27b4829b3e9f434a365dd09e2813d074984c8109dbf9ed386c84" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.822740 4707 scope.go:117] "RemoveContainer" containerID="35168c1fabdc2f1d8d3b47af44542532da9c2604d8d33b99a98c14e589f3ba68" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.843530 4707 scope.go:117] "RemoveContainer" containerID="fdfba2521f191a956805a9894a059b391490e094209faadcb9e1196d96a653a7" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.867532 4707 scope.go:117] "RemoveContainer" containerID="8d6806b4d276c4e814c81f7eb9a13232ef63c8f4ed5080fb5caadd173a7acb91" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.885764 4707 scope.go:117] "RemoveContainer" containerID="52e0023d3e6851f48959386497daba5bed222e165b85dbf4d2543fae005a5f62" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.927381 4707 scope.go:117] "RemoveContainer" containerID="58830d3499b14fa51fbf78280bba3cf9c8496d8b7a93968c43ae679eaba07c04" Jan 21 16:15:18 crc kubenswrapper[4707]: I0121 16:15:18.945468 4707 scope.go:117] "RemoveContainer" containerID="42e1a7b8fbb1c0bf8cd31040a2431ce633ee959814881423be4fe99ae7d68843" Jan 21 16:15:19 crc kubenswrapper[4707]: I0121 16:15:19.258827 4707 generic.go:334] "Generic (PLEG): container finished" podID="3d94635b-5310-4d9e-b655-163963e945b9" containerID="0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa" exitCode=0 Jan 21 16:15:19 crc kubenswrapper[4707]: I0121 16:15:19.258893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" event={"ID":"3d94635b-5310-4d9e-b655-163963e945b9","Type":"ContainerDied","Data":"0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa"} Jan 21 16:15:20 crc kubenswrapper[4707]: I0121 16:15:20.266074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" 
event={"ID":"3d94635b-5310-4d9e-b655-163963e945b9","Type":"ContainerStarted","Data":"2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c"} Jan 21 16:15:20 crc kubenswrapper[4707]: I0121 16:15:20.266339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:20 crc kubenswrapper[4707]: I0121 16:15:20.279035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" podStartSLOduration=3.279019685 podStartE2EDuration="3.279019685s" podCreationTimestamp="2026-01-21 16:15:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:20.277705223 +0000 UTC m=+4417.459221445" watchObservedRunningTime="2026-01-21 16:15:20.279019685 +0000 UTC m=+4417.460535907" Jan 21 16:15:21 crc kubenswrapper[4707]: I0121 16:15:21.058847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:15:27 crc kubenswrapper[4707]: I0121 16:15:27.727677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:15:27 crc kubenswrapper[4707]: I0121 16:15:27.766306 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb"] Jan 21 16:15:27 crc kubenswrapper[4707]: I0121 16:15:27.766568 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" podUID="3cd910ba-3565-4935-a15f-dda572968edd" containerName="dnsmasq-dns" containerID="cri-o://11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b" gracePeriod=10 Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.158868 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.287083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fds5l\" (UniqueName: \"kubernetes.io/projected/3cd910ba-3565-4935-a15f-dda572968edd-kube-api-access-fds5l\") pod \"3cd910ba-3565-4935-a15f-dda572968edd\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.287128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-config\") pod \"3cd910ba-3565-4935-a15f-dda572968edd\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.287208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-dnsmasq-svc\") pod \"3cd910ba-3565-4935-a15f-dda572968edd\" (UID: \"3cd910ba-3565-4935-a15f-dda572968edd\") " Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.292924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd910ba-3565-4935-a15f-dda572968edd-kube-api-access-fds5l" (OuterVolumeSpecName: "kube-api-access-fds5l") pod "3cd910ba-3565-4935-a15f-dda572968edd" (UID: "3cd910ba-3565-4935-a15f-dda572968edd"). InnerVolumeSpecName "kube-api-access-fds5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.319973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "3cd910ba-3565-4935-a15f-dda572968edd" (UID: "3cd910ba-3565-4935-a15f-dda572968edd"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.320984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-config" (OuterVolumeSpecName: "config") pod "3cd910ba-3565-4935-a15f-dda572968edd" (UID: "3cd910ba-3565-4935-a15f-dda572968edd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.324187 4707 generic.go:334] "Generic (PLEG): container finished" podID="3cd910ba-3565-4935-a15f-dda572968edd" containerID="11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b" exitCode=0 Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.324227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" event={"ID":"3cd910ba-3565-4935-a15f-dda572968edd","Type":"ContainerDied","Data":"11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b"} Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.324242 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.324257 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb" event={"ID":"3cd910ba-3565-4935-a15f-dda572968edd","Type":"ContainerDied","Data":"a6ec8071ef1e47f5ab62d7a5187112ccb543dceafbc05d75056866db528dfeb6"} Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.324283 4707 scope.go:117] "RemoveContainer" containerID="11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.362084 4707 scope.go:117] "RemoveContainer" containerID="a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.368033 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb"] Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.372629 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-bbvdb"] Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.388945 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.388971 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fds5l\" (UniqueName: \"kubernetes.io/projected/3cd910ba-3565-4935-a15f-dda572968edd-kube-api-access-fds5l\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.388982 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd910ba-3565-4935-a15f-dda572968edd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:28 
crc kubenswrapper[4707]: I0121 16:15:28.390617 4707 scope.go:117] "RemoveContainer" containerID="11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b" Jan 21 16:15:28 crc kubenswrapper[4707]: E0121 16:15:28.390912 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b\": container with ID starting with 11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b not found: ID does not exist" containerID="11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.390945 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b"} err="failed to get container status \"11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b\": rpc error: code = NotFound desc = could not find container \"11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b\": container with ID starting with 11d73f4954692436f67ecb2acff41c2c93832668e2feb6bee0d926293305ec5b not found: ID does not exist" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.390964 4707 scope.go:117] "RemoveContainer" containerID="a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0" Jan 21 16:15:28 crc kubenswrapper[4707]: E0121 16:15:28.391274 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0\": container with ID starting with a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0 not found: ID does not exist" containerID="a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0" Jan 21 16:15:28 crc kubenswrapper[4707]: I0121 16:15:28.391298 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0"} err="failed to get container status \"a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0\": rpc error: code = NotFound desc = could not find container \"a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0\": container with ID starting with a4c328538ab4e1ab434ccb7b0d2adbe0d2c0aa44c28c5585131cf0bbfc77e3b0 not found: ID does not exist" Jan 21 16:15:29 crc kubenswrapper[4707]: I0121 16:15:29.190423 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd910ba-3565-4935-a15f-dda572968edd" path="/var/lib/kubelet/pods/3cd910ba-3565-4935-a15f-dda572968edd/volumes" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.061961 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.280962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.384583 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r"] Jan 21 16:15:31 crc kubenswrapper[4707]: E0121 16:15:31.385110 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd910ba-3565-4935-a15f-dda572968edd" containerName="dnsmasq-dns" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.385133 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3cd910ba-3565-4935-a15f-dda572968edd" containerName="dnsmasq-dns" Jan 21 16:15:31 crc kubenswrapper[4707]: E0121 16:15:31.385189 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd910ba-3565-4935-a15f-dda572968edd" containerName="init" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.385196 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd910ba-3565-4935-a15f-dda572968edd" containerName="init" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.385383 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd910ba-3565-4935-a15f-dda572968edd" containerName="dnsmasq-dns" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.386122 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.387946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.391032 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-cq9z7"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.392204 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.397485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-cq9z7"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.405785 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.466948 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-t4c2x"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.468130 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.477018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-t4c2x"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.534173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f1d5774-2251-43d5-938b-7e87e5fee400-operator-scripts\") pod \"cinder-db-create-cq9z7\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.534242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vf77\" (UniqueName: \"kubernetes.io/projected/8f1d5774-2251-43d5-938b-7e87e5fee400-kube-api-access-6vf77\") pod \"cinder-db-create-cq9z7\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.534283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab7b82b-a7d1-41af-abc2-21322ebdd01f-operator-scripts\") pod \"barbican-b43e-account-create-update-qwc4r\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.534308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js9gw\" (UniqueName: \"kubernetes.io/projected/eab7b82b-a7d1-41af-abc2-21322ebdd01f-kube-api-access-js9gw\") pod \"barbican-b43e-account-create-update-qwc4r\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.571871 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.572775 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.574297 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.579829 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.627362 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-8dtzx"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.628436 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.630134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.631320 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-shwt7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.636581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab7b82b-a7d1-41af-abc2-21322ebdd01f-operator-scripts\") pod \"barbican-b43e-account-create-update-qwc4r\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.636648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js9gw\" (UniqueName: \"kubernetes.io/projected/eab7b82b-a7d1-41af-abc2-21322ebdd01f-kube-api-access-js9gw\") pod \"barbican-b43e-account-create-update-qwc4r\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.636860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-operator-scripts\") pod \"barbican-db-create-t4c2x\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.637032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f1d5774-2251-43d5-938b-7e87e5fee400-operator-scripts\") pod \"cinder-db-create-cq9z7\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.637198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vf77\" (UniqueName: \"kubernetes.io/projected/8f1d5774-2251-43d5-938b-7e87e5fee400-kube-api-access-6vf77\") pod \"cinder-db-create-cq9z7\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.637286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spr74\" (UniqueName: \"kubernetes.io/projected/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-kube-api-access-spr74\") pod \"barbican-db-create-t4c2x\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.637290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab7b82b-a7d1-41af-abc2-21322ebdd01f-operator-scripts\") pod \"barbican-b43e-account-create-update-qwc4r\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.637636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8f1d5774-2251-43d5-938b-7e87e5fee400-operator-scripts\") pod \"cinder-db-create-cq9z7\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.638542 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.640556 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.640966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-8dtzx"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.654947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js9gw\" (UniqueName: \"kubernetes.io/projected/eab7b82b-a7d1-41af-abc2-21322ebdd01f-kube-api-access-js9gw\") pod \"barbican-b43e-account-create-update-qwc4r\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.660307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vf77\" (UniqueName: \"kubernetes.io/projected/8f1d5774-2251-43d5-938b-7e87e5fee400-kube-api-access-6vf77\") pod \"cinder-db-create-cq9z7\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.683152 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jkw2f"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.684301 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.700954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jkw2f"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.705487 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-794d2"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.706446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.707612 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.711227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.713127 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-794d2"] Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.725147 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2n29\" (UniqueName: \"kubernetes.io/projected/f87d50e5-d59b-48b8-a36d-5672422fa831-kube-api-access-b2n29\") pod \"cinder-f22d-account-create-update-dqhzk\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttdn2\" (UniqueName: \"kubernetes.io/projected/934765c4-0ef2-4748-831a-3fa09a941a42-kube-api-access-ttdn2\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-operator-scripts\") pod \"barbican-db-create-t4c2x\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87d50e5-d59b-48b8-a36d-5672422fa831-operator-scripts\") pod \"cinder-f22d-account-create-update-dqhzk\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-config-data\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-combined-ca-bundle\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.738876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spr74\" (UniqueName: \"kubernetes.io/projected/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-kube-api-access-spr74\") pod \"barbican-db-create-t4c2x\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.739286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-operator-scripts\") pod \"barbican-db-create-t4c2x\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.754113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-spr74\" (UniqueName: \"kubernetes.io/projected/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-kube-api-access-spr74\") pod \"barbican-db-create-t4c2x\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.793472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x74hc\" (UniqueName: \"kubernetes.io/projected/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-kube-api-access-x74hc\") pod \"neutron-90a1-account-create-update-794d2\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-operator-scripts\") pod \"neutron-90a1-account-create-update-794d2\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2n29\" (UniqueName: \"kubernetes.io/projected/f87d50e5-d59b-48b8-a36d-5672422fa831-kube-api-access-b2n29\") pod \"cinder-f22d-account-create-update-dqhzk\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttdn2\" (UniqueName: \"kubernetes.io/projected/934765c4-0ef2-4748-831a-3fa09a941a42-kube-api-access-ttdn2\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab563e0-5667-4a88-902e-0f6d8d3da539-operator-scripts\") pod \"neutron-db-create-jkw2f\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87d50e5-d59b-48b8-a36d-5672422fa831-operator-scripts\") pod \"cinder-f22d-account-create-update-dqhzk\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.841772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-config-data\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.842236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-combined-ca-bundle\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.842383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j57jf\" (UniqueName: \"kubernetes.io/projected/2ab563e0-5667-4a88-902e-0f6d8d3da539-kube-api-access-j57jf\") pod \"neutron-db-create-jkw2f\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.843717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87d50e5-d59b-48b8-a36d-5672422fa831-operator-scripts\") pod \"cinder-f22d-account-create-update-dqhzk\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.849708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-combined-ca-bundle\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.849849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-config-data\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.862354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2n29\" (UniqueName: \"kubernetes.io/projected/f87d50e5-d59b-48b8-a36d-5672422fa831-kube-api-access-b2n29\") pod \"cinder-f22d-account-create-update-dqhzk\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.864340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttdn2\" (UniqueName: \"kubernetes.io/projected/934765c4-0ef2-4748-831a-3fa09a941a42-kube-api-access-ttdn2\") pod \"keystone-db-sync-8dtzx\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.890006 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.943753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j57jf\" (UniqueName: \"kubernetes.io/projected/2ab563e0-5667-4a88-902e-0f6d8d3da539-kube-api-access-j57jf\") pod \"neutron-db-create-jkw2f\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.943825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x74hc\" (UniqueName: \"kubernetes.io/projected/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-kube-api-access-x74hc\") pod \"neutron-90a1-account-create-update-794d2\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.943850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-operator-scripts\") pod \"neutron-90a1-account-create-update-794d2\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.943881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab563e0-5667-4a88-902e-0f6d8d3da539-operator-scripts\") pod \"neutron-db-create-jkw2f\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.944572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab563e0-5667-4a88-902e-0f6d8d3da539-operator-scripts\") pod \"neutron-db-create-jkw2f\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.945364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-operator-scripts\") pod \"neutron-90a1-account-create-update-794d2\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.947222 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.959179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j57jf\" (UniqueName: \"kubernetes.io/projected/2ab563e0-5667-4a88-902e-0f6d8d3da539-kube-api-access-j57jf\") pod \"neutron-db-create-jkw2f\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:31 crc kubenswrapper[4707]: I0121 16:15:31.961390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x74hc\" (UniqueName: \"kubernetes.io/projected/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-kube-api-access-x74hc\") pod \"neutron-90a1-account-create-update-794d2\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.021075 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.097217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.141906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r"] Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.220446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-cq9z7"] Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.283661 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-t4c2x"] Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.349769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk"] Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.438842 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-8dtzx"] Jan 21 16:15:32 crc kubenswrapper[4707]: W0121 16:15:32.454784 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab7b82b_a7d1_41af_abc2_21322ebdd01f.slice/crio-32eb78cb716de1a454b83bbd48572b6847a39849ab4645a65b2a6424d334f585 WatchSource:0}: Error finding container 32eb78cb716de1a454b83bbd48572b6847a39849ab4645a65b2a6424d334f585: Status 404 returned error can't find the container with id 32eb78cb716de1a454b83bbd48572b6847a39849ab4645a65b2a6424d334f585 Jan 21 16:15:32 crc kubenswrapper[4707]: W0121 16:15:32.458022 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f1d5774_2251_43d5_938b_7e87e5fee400.slice/crio-b4909b609e94b0ffa3b85f4638906ae3c97be7dec3b4864b1c8f844c4cdf9524 WatchSource:0}: Error finding container b4909b609e94b0ffa3b85f4638906ae3c97be7dec3b4864b1c8f844c4cdf9524: Status 404 returned error can't find the container with id b4909b609e94b0ffa3b85f4638906ae3c97be7dec3b4864b1c8f844c4cdf9524 Jan 21 16:15:32 crc kubenswrapper[4707]: W0121 16:15:32.460776 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6b8d4c_bb07_4c24_b97e_870ad37de2d9.slice/crio-6bcaf0214d8c5329177067a84b12600880d3fcaa08964fe156e16cf4cd2e5ade WatchSource:0}: Error finding container 6bcaf0214d8c5329177067a84b12600880d3fcaa08964fe156e16cf4cd2e5ade: Status 404 returned error can't find the container with id 6bcaf0214d8c5329177067a84b12600880d3fcaa08964fe156e16cf4cd2e5ade Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.916004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-794d2"] Jan 21 16:15:32 crc kubenswrapper[4707]: W0121 16:15:32.985191 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d21d851_8cc6_4b92_ad04_7e4fe88fb166.slice/crio-06163ab0ba922aced21c1fda3b7bcfb2af532ee0912af065be146b8724fcdd26 WatchSource:0}: Error finding container 06163ab0ba922aced21c1fda3b7bcfb2af532ee0912af065be146b8724fcdd26: Status 404 returned error can't find the container with id 06163ab0ba922aced21c1fda3b7bcfb2af532ee0912af065be146b8724fcdd26 Jan 21 16:15:32 crc kubenswrapper[4707]: I0121 16:15:32.995710 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jkw2f"] Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.374125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" event={"ID":"934765c4-0ef2-4748-831a-3fa09a941a42","Type":"ContainerStarted","Data":"c6d62380a3ef675f453ba1bede62df33562aea21e545e2c9252332cfb4df99bd"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.374489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" event={"ID":"934765c4-0ef2-4748-831a-3fa09a941a42","Type":"ContainerStarted","Data":"4f524f6b0134a9b6e9c3ae7a9c7cbcce58423736635ee6ef8c87159b358d006f"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.376801 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f1d5774-2251-43d5-938b-7e87e5fee400" containerID="49fe3cdf415f022d620f2d71a91e520b9ef9b344fdc73a3dfc14029d3d981a93" exitCode=0 Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.376935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" event={"ID":"8f1d5774-2251-43d5-938b-7e87e5fee400","Type":"ContainerDied","Data":"49fe3cdf415f022d620f2d71a91e520b9ef9b344fdc73a3dfc14029d3d981a93"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.377006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" event={"ID":"8f1d5774-2251-43d5-938b-7e87e5fee400","Type":"ContainerStarted","Data":"b4909b609e94b0ffa3b85f4638906ae3c97be7dec3b4864b1c8f844c4cdf9524"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.379064 4707 generic.go:334] "Generic (PLEG): container finished" podID="f87d50e5-d59b-48b8-a36d-5672422fa831" containerID="b2f29327c12a3fe9a1b70817c1826ceec703c5e985e6fecbd37b4a63aadbba45" exitCode=0 Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.379139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" event={"ID":"f87d50e5-d59b-48b8-a36d-5672422fa831","Type":"ContainerDied","Data":"b2f29327c12a3fe9a1b70817c1826ceec703c5e985e6fecbd37b4a63aadbba45"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.379181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" event={"ID":"f87d50e5-d59b-48b8-a36d-5672422fa831","Type":"ContainerStarted","Data":"06b3becd8b31fb53bb397e23f6fd66314abf2d45537dc2873ba9c2246d76452a"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.380637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" event={"ID":"9d21d851-8cc6-4b92-ad04-7e4fe88fb166","Type":"ContainerStarted","Data":"3d192501b8ea2c70eceb4505c10c6f626fcbc95a51cb226de357496781cd162e"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.380676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" event={"ID":"9d21d851-8cc6-4b92-ad04-7e4fe88fb166","Type":"ContainerStarted","Data":"06163ab0ba922aced21c1fda3b7bcfb2af532ee0912af065be146b8724fcdd26"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.383731 4707 generic.go:334] "Generic (PLEG): container finished" podID="eab7b82b-a7d1-41af-abc2-21322ebdd01f" containerID="cf355fbbc590a30098b111959e33b999901e90e3f0d26f06bc1c371e2d6021e0" exitCode=0 Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.383781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" event={"ID":"eab7b82b-a7d1-41af-abc2-21322ebdd01f","Type":"ContainerDied","Data":"cf355fbbc590a30098b111959e33b999901e90e3f0d26f06bc1c371e2d6021e0"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.384203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" event={"ID":"eab7b82b-a7d1-41af-abc2-21322ebdd01f","Type":"ContainerStarted","Data":"32eb78cb716de1a454b83bbd48572b6847a39849ab4645a65b2a6424d334f585"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.385995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" event={"ID":"2ab563e0-5667-4a88-902e-0f6d8d3da539","Type":"ContainerStarted","Data":"581de8169d595296309fc4da76d672bf0da9c756c1bb9e4be004b67a1328005f"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.386040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" event={"ID":"2ab563e0-5667-4a88-902e-0f6d8d3da539","Type":"ContainerStarted","Data":"eb74c6eae48e42124b5d0adbb43155b3b92c65efd9c36a1871e8ec85506f6046"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.388018 4707 generic.go:334] "Generic (PLEG): container finished" podID="4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" containerID="3a483ecb202201b5633ca0fcde9357132b9ebe781a6f6f398d8cff7856e0cd29" exitCode=0 Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.388053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" event={"ID":"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9","Type":"ContainerDied","Data":"3a483ecb202201b5633ca0fcde9357132b9ebe781a6f6f398d8cff7856e0cd29"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.388070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" event={"ID":"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9","Type":"ContainerStarted","Data":"6bcaf0214d8c5329177067a84b12600880d3fcaa08964fe156e16cf4cd2e5ade"} Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.395280 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" podStartSLOduration=2.395270761 podStartE2EDuration="2.395270761s" podCreationTimestamp="2026-01-21 16:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:33.388528323 +0000 UTC m=+4430.570044545" watchObservedRunningTime="2026-01-21 16:15:33.395270761 +0000 UTC m=+4430.576786983" Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.411062 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" podStartSLOduration=2.411041454 podStartE2EDuration="2.411041454s" podCreationTimestamp="2026-01-21 16:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:33.404724226 +0000 UTC m=+4430.586240449" watchObservedRunningTime="2026-01-21 16:15:33.411041454 +0000 UTC m=+4430.592557676" Jan 21 16:15:33 crc kubenswrapper[4707]: I0121 16:15:33.448064 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" podStartSLOduration=2.448046174 podStartE2EDuration="2.448046174s" podCreationTimestamp="2026-01-21 16:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:33.447295293 +0000 UTC m=+4430.628811504" watchObservedRunningTime="2026-01-21 16:15:33.448046174 +0000 UTC m=+4430.629562396" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.396792 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d21d851-8cc6-4b92-ad04-7e4fe88fb166" containerID="3d192501b8ea2c70eceb4505c10c6f626fcbc95a51cb226de357496781cd162e" exitCode=0 Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.396906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" event={"ID":"9d21d851-8cc6-4b92-ad04-7e4fe88fb166","Type":"ContainerDied","Data":"3d192501b8ea2c70eceb4505c10c6f626fcbc95a51cb226de357496781cd162e"} Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.399176 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ab563e0-5667-4a88-902e-0f6d8d3da539" containerID="581de8169d595296309fc4da76d672bf0da9c756c1bb9e4be004b67a1328005f" exitCode=0 Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.399347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" event={"ID":"2ab563e0-5667-4a88-902e-0f6d8d3da539","Type":"ContainerDied","Data":"581de8169d595296309fc4da76d672bf0da9c756c1bb9e4be004b67a1328005f"} Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.726841 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.812031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f1d5774-2251-43d5-938b-7e87e5fee400-operator-scripts\") pod \"8f1d5774-2251-43d5-938b-7e87e5fee400\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.812120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vf77\" (UniqueName: \"kubernetes.io/projected/8f1d5774-2251-43d5-938b-7e87e5fee400-kube-api-access-6vf77\") pod \"8f1d5774-2251-43d5-938b-7e87e5fee400\" (UID: \"8f1d5774-2251-43d5-938b-7e87e5fee400\") " Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.813320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f1d5774-2251-43d5-938b-7e87e5fee400-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f1d5774-2251-43d5-938b-7e87e5fee400" (UID: "8f1d5774-2251-43d5-938b-7e87e5fee400"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.817277 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f1d5774-2251-43d5-938b-7e87e5fee400-kube-api-access-6vf77" (OuterVolumeSpecName: "kube-api-access-6vf77") pod "8f1d5774-2251-43d5-938b-7e87e5fee400" (UID: "8f1d5774-2251-43d5-938b-7e87e5fee400"). InnerVolumeSpecName "kube-api-access-6vf77". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.861113 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.868370 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.877523 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.919762 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vf77\" (UniqueName: \"kubernetes.io/projected/8f1d5774-2251-43d5-938b-7e87e5fee400-kube-api-access-6vf77\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:34 crc kubenswrapper[4707]: I0121 16:15:34.919849 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f1d5774-2251-43d5-938b-7e87e5fee400-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spr74\" (UniqueName: \"kubernetes.io/projected/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-kube-api-access-spr74\") pod \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js9gw\" (UniqueName: \"kubernetes.io/projected/eab7b82b-a7d1-41af-abc2-21322ebdd01f-kube-api-access-js9gw\") pod \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-operator-scripts\") pod \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\" (UID: \"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2n29\" (UniqueName: \"kubernetes.io/projected/f87d50e5-d59b-48b8-a36d-5672422fa831-kube-api-access-b2n29\") pod \"f87d50e5-d59b-48b8-a36d-5672422fa831\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87d50e5-d59b-48b8-a36d-5672422fa831-operator-scripts\") pod \"f87d50e5-d59b-48b8-a36d-5672422fa831\" (UID: \"f87d50e5-d59b-48b8-a36d-5672422fa831\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab7b82b-a7d1-41af-abc2-21322ebdd01f-operator-scripts\") pod \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\" (UID: \"eab7b82b-a7d1-41af-abc2-21322ebdd01f\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab7b82b-a7d1-41af-abc2-21322ebdd01f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eab7b82b-a7d1-41af-abc2-21322ebdd01f" (UID: "eab7b82b-a7d1-41af-abc2-21322ebdd01f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.021988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" (UID: "4a6b8d4c-bb07-4c24-b97e-870ad37de2d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.022146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87d50e5-d59b-48b8-a36d-5672422fa831-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f87d50e5-d59b-48b8-a36d-5672422fa831" (UID: "f87d50e5-d59b-48b8-a36d-5672422fa831"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.024820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-kube-api-access-spr74" (OuterVolumeSpecName: "kube-api-access-spr74") pod "4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" (UID: "4a6b8d4c-bb07-4c24-b97e-870ad37de2d9"). InnerVolumeSpecName "kube-api-access-spr74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.025133 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87d50e5-d59b-48b8-a36d-5672422fa831-kube-api-access-b2n29" (OuterVolumeSpecName: "kube-api-access-b2n29") pod "f87d50e5-d59b-48b8-a36d-5672422fa831" (UID: "f87d50e5-d59b-48b8-a36d-5672422fa831"). InnerVolumeSpecName "kube-api-access-b2n29". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.025317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab7b82b-a7d1-41af-abc2-21322ebdd01f-kube-api-access-js9gw" (OuterVolumeSpecName: "kube-api-access-js9gw") pod "eab7b82b-a7d1-41af-abc2-21322ebdd01f" (UID: "eab7b82b-a7d1-41af-abc2-21322ebdd01f"). InnerVolumeSpecName "kube-api-access-js9gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.122512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spr74\" (UniqueName: \"kubernetes.io/projected/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-kube-api-access-spr74\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.122536 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js9gw\" (UniqueName: \"kubernetes.io/projected/eab7b82b-a7d1-41af-abc2-21322ebdd01f-kube-api-access-js9gw\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.122549 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.122559 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2n29\" (UniqueName: \"kubernetes.io/projected/f87d50e5-d59b-48b8-a36d-5672422fa831-kube-api-access-b2n29\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.122567 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87d50e5-d59b-48b8-a36d-5672422fa831-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.122575 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab7b82b-a7d1-41af-abc2-21322ebdd01f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.409491 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.409482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r" event={"ID":"eab7b82b-a7d1-41af-abc2-21322ebdd01f","Type":"ContainerDied","Data":"32eb78cb716de1a454b83bbd48572b6847a39849ab4645a65b2a6424d334f585"} Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.409968 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32eb78cb716de1a454b83bbd48572b6847a39849ab4645a65b2a6424d334f585" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.412229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" event={"ID":"4a6b8d4c-bb07-4c24-b97e-870ad37de2d9","Type":"ContainerDied","Data":"6bcaf0214d8c5329177067a84b12600880d3fcaa08964fe156e16cf4cd2e5ade"} Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.412335 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bcaf0214d8c5329177067a84b12600880d3fcaa08964fe156e16cf4cd2e5ade" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.412326 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-t4c2x" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.414948 4707 generic.go:334] "Generic (PLEG): container finished" podID="934765c4-0ef2-4748-831a-3fa09a941a42" containerID="c6d62380a3ef675f453ba1bede62df33562aea21e545e2c9252332cfb4df99bd" exitCode=0 Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.415049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" event={"ID":"934765c4-0ef2-4748-831a-3fa09a941a42","Type":"ContainerDied","Data":"c6d62380a3ef675f453ba1bede62df33562aea21e545e2c9252332cfb4df99bd"} Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.418216 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.418225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-cq9z7" event={"ID":"8f1d5774-2251-43d5-938b-7e87e5fee400","Type":"ContainerDied","Data":"b4909b609e94b0ffa3b85f4638906ae3c97be7dec3b4864b1c8f844c4cdf9524"} Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.418442 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4909b609e94b0ffa3b85f4638906ae3c97be7dec3b4864b1c8f844c4cdf9524" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.421832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" event={"ID":"f87d50e5-d59b-48b8-a36d-5672422fa831","Type":"ContainerDied","Data":"06b3becd8b31fb53bb397e23f6fd66314abf2d45537dc2873ba9c2246d76452a"} Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.421862 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b3becd8b31fb53bb397e23f6fd66314abf2d45537dc2873ba9c2246d76452a" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.422910 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.711403 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.782108 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.837427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j57jf\" (UniqueName: \"kubernetes.io/projected/2ab563e0-5667-4a88-902e-0f6d8d3da539-kube-api-access-j57jf\") pod \"2ab563e0-5667-4a88-902e-0f6d8d3da539\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.837734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab563e0-5667-4a88-902e-0f6d8d3da539-operator-scripts\") pod \"2ab563e0-5667-4a88-902e-0f6d8d3da539\" (UID: \"2ab563e0-5667-4a88-902e-0f6d8d3da539\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.838630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab563e0-5667-4a88-902e-0f6d8d3da539-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ab563e0-5667-4a88-902e-0f6d8d3da539" (UID: "2ab563e0-5667-4a88-902e-0f6d8d3da539"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.843378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab563e0-5667-4a88-902e-0f6d8d3da539-kube-api-access-j57jf" (OuterVolumeSpecName: "kube-api-access-j57jf") pod "2ab563e0-5667-4a88-902e-0f6d8d3da539" (UID: "2ab563e0-5667-4a88-902e-0f6d8d3da539"). InnerVolumeSpecName "kube-api-access-j57jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.940221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x74hc\" (UniqueName: \"kubernetes.io/projected/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-kube-api-access-x74hc\") pod \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.940562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-operator-scripts\") pod \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\" (UID: \"9d21d851-8cc6-4b92-ad04-7e4fe88fb166\") " Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.941129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d21d851-8cc6-4b92-ad04-7e4fe88fb166" (UID: "9d21d851-8cc6-4b92-ad04-7e4fe88fb166"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.941351 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab563e0-5667-4a88-902e-0f6d8d3da539-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.941465 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.941573 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j57jf\" (UniqueName: \"kubernetes.io/projected/2ab563e0-5667-4a88-902e-0f6d8d3da539-kube-api-access-j57jf\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:35 crc kubenswrapper[4707]: I0121 16:15:35.943602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-kube-api-access-x74hc" (OuterVolumeSpecName: "kube-api-access-x74hc") pod "9d21d851-8cc6-4b92-ad04-7e4fe88fb166" (UID: "9d21d851-8cc6-4b92-ad04-7e4fe88fb166"). InnerVolumeSpecName "kube-api-access-x74hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.043411 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x74hc\" (UniqueName: \"kubernetes.io/projected/9d21d851-8cc6-4b92-ad04-7e4fe88fb166-kube-api-access-x74hc\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.434252 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.434251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jkw2f" event={"ID":"2ab563e0-5667-4a88-902e-0f6d8d3da539","Type":"ContainerDied","Data":"eb74c6eae48e42124b5d0adbb43155b3b92c65efd9c36a1871e8ec85506f6046"} Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.434431 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb74c6eae48e42124b5d0adbb43155b3b92c65efd9c36a1871e8ec85506f6046" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.443567 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.443637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-794d2" event={"ID":"9d21d851-8cc6-4b92-ad04-7e4fe88fb166","Type":"ContainerDied","Data":"06163ab0ba922aced21c1fda3b7bcfb2af532ee0912af065be146b8724fcdd26"} Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.443707 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06163ab0ba922aced21c1fda3b7bcfb2af532ee0912af065be146b8724fcdd26" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.694779 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.856847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-config-data\") pod \"934765c4-0ef2-4748-831a-3fa09a941a42\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.856951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttdn2\" (UniqueName: \"kubernetes.io/projected/934765c4-0ef2-4748-831a-3fa09a941a42-kube-api-access-ttdn2\") pod \"934765c4-0ef2-4748-831a-3fa09a941a42\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.857077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-combined-ca-bundle\") pod \"934765c4-0ef2-4748-831a-3fa09a941a42\" (UID: \"934765c4-0ef2-4748-831a-3fa09a941a42\") " Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.862407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934765c4-0ef2-4748-831a-3fa09a941a42-kube-api-access-ttdn2" (OuterVolumeSpecName: "kube-api-access-ttdn2") pod "934765c4-0ef2-4748-831a-3fa09a941a42" (UID: "934765c4-0ef2-4748-831a-3fa09a941a42"). InnerVolumeSpecName "kube-api-access-ttdn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.878278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "934765c4-0ef2-4748-831a-3fa09a941a42" (UID: "934765c4-0ef2-4748-831a-3fa09a941a42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.890096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-config-data" (OuterVolumeSpecName: "config-data") pod "934765c4-0ef2-4748-831a-3fa09a941a42" (UID: "934765c4-0ef2-4748-831a-3fa09a941a42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.959403 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.959444 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttdn2\" (UniqueName: \"kubernetes.io/projected/934765c4-0ef2-4748-831a-3fa09a941a42-kube-api-access-ttdn2\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:36 crc kubenswrapper[4707]: I0121 16:15:36.959459 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934765c4-0ef2-4748-831a-3fa09a941a42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.457181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" event={"ID":"934765c4-0ef2-4748-831a-3fa09a941a42","Type":"ContainerDied","Data":"4f524f6b0134a9b6e9c3ae7a9c7cbcce58423736635ee6ef8c87159b358d006f"} Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.457228 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-8dtzx" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.457251 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f524f6b0134a9b6e9c3ae7a9c7cbcce58423736635ee6ef8c87159b358d006f" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.533336 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7n7ck"] Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.533924 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab563e0-5667-4a88-902e-0f6d8d3da539" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.533948 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab563e0-5667-4a88-902e-0f6d8d3da539" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.533962 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87d50e5-d59b-48b8-a36d-5672422fa831" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.533969 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87d50e5-d59b-48b8-a36d-5672422fa831" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.533979 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f1d5774-2251-43d5-938b-7e87e5fee400" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.533985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f1d5774-2251-43d5-938b-7e87e5fee400" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.533995 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab7b82b-a7d1-41af-abc2-21322ebdd01f" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534001 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab7b82b-a7d1-41af-abc2-21322ebdd01f" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.534016 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9d21d851-8cc6-4b92-ad04-7e4fe88fb166" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534023 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d21d851-8cc6-4b92-ad04-7e4fe88fb166" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.534049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534055 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: E0121 16:15:37.534070 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934765c4-0ef2-4748-831a-3fa09a941a42" containerName="keystone-db-sync" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534076 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="934765c4-0ef2-4748-831a-3fa09a941a42" containerName="keystone-db-sync" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534310 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="934765c4-0ef2-4748-831a-3fa09a941a42" containerName="keystone-db-sync" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534322 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab563e0-5667-4a88-902e-0f6d8d3da539" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534337 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87d50e5-d59b-48b8-a36d-5672422fa831" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534345 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f1d5774-2251-43d5-938b-7e87e5fee400" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534352 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d21d851-8cc6-4b92-ad04-7e4fe88fb166" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534366 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab7b82b-a7d1-41af-abc2-21322ebdd01f" containerName="mariadb-account-create-update" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.534379 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" containerName="mariadb-database-create" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.535750 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.544421 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7n7ck"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.548827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.548800 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-shwt7" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.549552 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.549621 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.549781 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.652503 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.654227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.655888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.656213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.669415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-credential-keys\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.669531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-scripts\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.669610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-config-data\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.669675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72llt\" (UniqueName: \"kubernetes.io/projected/073f18c4-b272-4307-9e4c-123432b25fb2-kube-api-access-72llt\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.669716 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-fernet-keys\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.669766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-combined-ca-bundle\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.672190 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.738101 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-ppt78"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.739077 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.744758 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-6g6g9" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.744780 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.744988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.748245 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-sx8r2"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.749229 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.752008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.752521 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.752661 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-9bcpk" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-config-data\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-config-data\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7vk\" (UniqueName: \"kubernetes.io/projected/090cb60b-998d-4415-b080-aa7f6a6b45c6-kube-api-access-6s7vk\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72llt\" (UniqueName: \"kubernetes.io/projected/073f18c4-b272-4307-9e4c-123432b25fb2-kube-api-access-72llt\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-fernet-keys\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-combined-ca-bundle\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-log-httpd\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-run-httpd\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-scripts\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-credential-keys\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.771607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-scripts\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.774047 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-ppt78"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.775637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-scripts\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.781592 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-sx8r2"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.784448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-credential-keys\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.786870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-fernet-keys\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.789407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-config-data\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.791667 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-psp6t"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.792689 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.795688 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.796283 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-r4rhn" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.797527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.798242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-combined-ca-bundle\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.829489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72llt\" (UniqueName: \"kubernetes.io/projected/073f18c4-b272-4307-9e4c-123432b25fb2-kube-api-access-72llt\") pod \"keystone-bootstrap-7n7ck\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.830673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-psp6t"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.848482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-krrl7"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.851747 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.853702 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-krrl7"] Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.853872 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.857863 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.857902 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-x9wl2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.873212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-combined-ca-bundle\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.873263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-etc-machine-id\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.873290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-config\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.873328 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-log-httpd\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.873362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scsg6\" (UniqueName: \"kubernetes.io/projected/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-kube-api-access-scsg6\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.873380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-scripts\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-db-sync-config-data\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-scripts\") pod \"placement-db-sync-psp6t\" (UID: 
\"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-run-httpd\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-config-data\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsc2\" (UniqueName: \"kubernetes.io/projected/7d31e188-7526-4747-b72b-5fb81bb77006-kube-api-access-glsc2\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-run-httpd\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-log-httpd\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.874775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-scripts\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-config-data\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnjc2\" (UniqueName: \"kubernetes.io/projected/228927a0-3a8f-43f9-853f-99fc8df8e727-kube-api-access-nnjc2\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc 
kubenswrapper[4707]: I0121 16:15:37.875525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-combined-ca-bundle\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d31e188-7526-4747-b72b-5fb81bb77006-logs\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-config-data\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7vk\" (UniqueName: \"kubernetes.io/projected/090cb60b-998d-4415-b080-aa7f6a6b45c6-kube-api-access-6s7vk\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.875789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-combined-ca-bundle\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.881718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-scripts\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.882768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.883963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.886749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-config-data\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.892123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s7vk\" (UniqueName: \"kubernetes.io/projected/090cb60b-998d-4415-b080-aa7f6a6b45c6-kube-api-access-6s7vk\") pod \"ceilometer-0\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.967352 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-etc-machine-id\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977222 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-config\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977266 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scsg6\" (UniqueName: \"kubernetes.io/projected/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-kube-api-access-scsg6\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-scripts\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-db-sync-config-data\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-scripts\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-config-data\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-glsc2\" (UniqueName: \"kubernetes.io/projected/7d31e188-7526-4747-b72b-5fb81bb77006-kube-api-access-glsc2\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-config-data\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnjc2\" (UniqueName: \"kubernetes.io/projected/228927a0-3a8f-43f9-853f-99fc8df8e727-kube-api-access-nnjc2\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6dfz\" (UniqueName: \"kubernetes.io/projected/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-kube-api-access-l6dfz\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-combined-ca-bundle\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-combined-ca-bundle\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d31e188-7526-4747-b72b-5fb81bb77006-logs\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-combined-ca-bundle\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-db-sync-config-data\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977577 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-combined-ca-bundle\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.977978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-etc-machine-id\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.979576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d31e188-7526-4747-b72b-5fb81bb77006-logs\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.982198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-combined-ca-bundle\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.982646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-combined-ca-bundle\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.983028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-config\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.983359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-scripts\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.983555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-scripts\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.984750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-combined-ca-bundle\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.986045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-config-data\") pod \"placement-db-sync-psp6t\" 
(UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.987267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-db-sync-config-data\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.989314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-config-data\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:37 crc kubenswrapper[4707]: I0121 16:15:37.996515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsc2\" (UniqueName: \"kubernetes.io/projected/7d31e188-7526-4747-b72b-5fb81bb77006-kube-api-access-glsc2\") pod \"placement-db-sync-psp6t\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.000031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnjc2\" (UniqueName: \"kubernetes.io/projected/228927a0-3a8f-43f9-853f-99fc8df8e727-kube-api-access-nnjc2\") pod \"neutron-db-sync-ppt78\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.006223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scsg6\" (UniqueName: \"kubernetes.io/projected/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-kube-api-access-scsg6\") pod \"cinder-db-sync-sx8r2\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.059660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.070685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.082565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-db-sync-config-data\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.082670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6dfz\" (UniqueName: \"kubernetes.io/projected/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-kube-api-access-l6dfz\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.082690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-combined-ca-bundle\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.086738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-combined-ca-bundle\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.107204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-db-sync-config-data\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.108776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6dfz\" (UniqueName: \"kubernetes.io/projected/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-kube-api-access-l6dfz\") pod \"barbican-db-sync-krrl7\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.199659 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.231820 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.245529 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7n7ck"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.475771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" event={"ID":"073f18c4-b272-4307-9e4c-123432b25fb2","Type":"ContainerStarted","Data":"423dd05f876b8af355a9ef2e2edac5427fed97701a85990d3147e5d77e0b4362"} Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.594606 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:38 crc kubenswrapper[4707]: W0121 16:15:38.606321 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod090cb60b_998d_4415_b080_aa7f6a6b45c6.slice/crio-97a6c5869e0e8aad0ddfdd129607b8d5840f33b0693e44487b2df7a32e1df9a5 WatchSource:0}: Error finding container 97a6c5869e0e8aad0ddfdd129607b8d5840f33b0693e44487b2df7a32e1df9a5: Status 404 returned error can't find the container with id 97a6c5869e0e8aad0ddfdd129607b8d5840f33b0693e44487b2df7a32e1df9a5 Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.624433 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.625829 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.631108 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.631341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.631460 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-dxb2g" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.631670 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.650768 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.692017 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.693300 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.695900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.696157 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.701022 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rbhq\" (UniqueName: \"kubernetes.io/projected/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-kube-api-access-7rbhq\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-logs\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-scripts\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 
16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.705533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-config-data\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.726335 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-psp6t"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.738249 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-ppt78"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.742539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-sx8r2"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-scripts\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbzr4\" (UniqueName: \"kubernetes.io/projected/abd77318-16d4-4199-bf94-dd884ef91d75-kube-api-access-dbzr4\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-logs\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-config-data\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.807997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rbhq\" (UniqueName: \"kubernetes.io/projected/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-kube-api-access-7rbhq\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.808021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.808049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-logs\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.808084 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.808751 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.809124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.809126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-logs\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.857435 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-krrl7"] Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.908931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbzr4\" (UniqueName: \"kubernetes.io/projected/abd77318-16d4-4199-bf94-dd884ef91d75-kube-api-access-dbzr4\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-logs\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909897 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:38 crc kubenswrapper[4707]: I0121 16:15:38.909891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbzr4\" (UniqueName: \"kubernetes.io/projected/abd77318-16d4-4199-bf94-dd884ef91d75-kube-api-access-dbzr4\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: W0121 16:15:39.061105 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod228927a0_3a8f_43f9_853f_99fc8df8e727.slice/crio-14fd7cffbe4103d7110ec99c95981c777d5f98930f5ba2f23f3a70826902f02d WatchSource:0}: Error finding container 14fd7cffbe4103d7110ec99c95981c777d5f98930f5ba2f23f3a70826902f02d: Status 404 returned error can't find the container with id 14fd7cffbe4103d7110ec99c95981c777d5f98930f5ba2f23f3a70826902f02d Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061691 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.061777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-scripts\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.062155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.064197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-config-data\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.074509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rbhq\" (UniqueName: \"kubernetes.io/projected/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-kube-api-access-7rbhq\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.106736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.126496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.256298 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.322563 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.523946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerStarted","Data":"97a6c5869e0e8aad0ddfdd129607b8d5840f33b0693e44487b2df7a32e1df9a5"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.545855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" event={"ID":"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4","Type":"ContainerStarted","Data":"b52afd5f2e01f239ce45f1d19e62887eb3138eeefb455653380278b022c9f0a2"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.546132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" event={"ID":"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4","Type":"ContainerStarted","Data":"caaffe1502cbf325e898d5a49fff91a3612077fea5ca1d42855b1b85a6108435"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.549709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" event={"ID":"f82f8d8c-cc8e-4295-8453-61fb1b1c3848","Type":"ContainerStarted","Data":"e0ad20db1e76f4b5cac527b70c6a6a7974706e6e4cfd81bd09911bd0adc9f337"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.554035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" event={"ID":"073f18c4-b272-4307-9e4c-123432b25fb2","Type":"ContainerStarted","Data":"1ec9f8e9cdd570c1c0f9a3cff013e5233069dabb7c23b0432d95418fcef9886f"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.555654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" event={"ID":"228927a0-3a8f-43f9-853f-99fc8df8e727","Type":"ContainerStarted","Data":"b895c33c9e96e01533b1f6162a9b38e42a966f465e3f68cd50d1c00b4fe94657"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.555681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" event={"ID":"228927a0-3a8f-43f9-853f-99fc8df8e727","Type":"ContainerStarted","Data":"14fd7cffbe4103d7110ec99c95981c777d5f98930f5ba2f23f3a70826902f02d"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.571564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-db-sync-psp6t" event={"ID":"7d31e188-7526-4747-b72b-5fb81bb77006","Type":"ContainerStarted","Data":"3b952b60555ceb376dabbb9df62869e70378873b1c14e61dbc327e332bbb85b6"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.571612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-psp6t" event={"ID":"7d31e188-7526-4747-b72b-5fb81bb77006","Type":"ContainerStarted","Data":"0d87388d867d25bb94506a74396da8c9537a2414609c3b39d6a2118edbc56fab"} Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.593993 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" podStartSLOduration=2.593962546 podStartE2EDuration="2.593962546s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:39.57023932 +0000 UTC m=+4436.751755542" watchObservedRunningTime="2026-01-21 16:15:39.593962546 +0000 UTC m=+4436.775478759" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.595001 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" podStartSLOduration=2.594993654 podStartE2EDuration="2.594993654s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:39.590063084 +0000 UTC m=+4436.771579307" watchObservedRunningTime="2026-01-21 16:15:39.594993654 +0000 UTC m=+4436.776509877" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.617867 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" podStartSLOduration=2.6178470259999997 podStartE2EDuration="2.617847026s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:39.602670489 +0000 UTC m=+4436.784186711" watchObservedRunningTime="2026-01-21 16:15:39.617847026 +0000 UTC m=+4436.799363248" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.633708 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-psp6t" podStartSLOduration=2.6336938229999998 podStartE2EDuration="2.633693823s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:39.616607054 +0000 UTC m=+4436.798123277" watchObservedRunningTime="2026-01-21 16:15:39.633693823 +0000 UTC m=+4436.815210045" Jan 21 16:15:39 crc kubenswrapper[4707]: I0121 16:15:39.876981 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.001733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.044715 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.061459 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:40 
crc kubenswrapper[4707]: I0121 16:15:40.070286 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.580780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"abd77318-16d4-4199-bf94-dd884ef91d75","Type":"ContainerStarted","Data":"77f827576a3663f5bb74c1b657cf94d5fd20882a7309b8b1a595c6e648ce61b5"} Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.582378 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" event={"ID":"f82f8d8c-cc8e-4295-8453-61fb1b1c3848","Type":"ContainerStarted","Data":"b64f3b8df76efe4f95911f51150ea7f0f45391b29547dd5c2934533c217187fa"} Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.584631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerStarted","Data":"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b"} Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.584677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerStarted","Data":"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5"} Jan 21 16:15:40 crc kubenswrapper[4707]: I0121 16:15:40.586851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4bbf7a3d-ae4a-4afb-953b-02346807fcfc","Type":"ContainerStarted","Data":"73d8facd0034ea977db42bbf9d1585bf2a357e809d7fda1292def166264b829a"} Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.602276 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d31e188-7526-4747-b72b-5fb81bb77006" containerID="3b952b60555ceb376dabbb9df62869e70378873b1c14e61dbc327e332bbb85b6" exitCode=0 Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.602416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-psp6t" event={"ID":"7d31e188-7526-4747-b72b-5fb81bb77006","Type":"ContainerDied","Data":"3b952b60555ceb376dabbb9df62869e70378873b1c14e61dbc327e332bbb85b6"} Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.606777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerStarted","Data":"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6"} Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.611129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4bbf7a3d-ae4a-4afb-953b-02346807fcfc","Type":"ContainerStarted","Data":"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030"} Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.618834 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-log" containerID="cri-o://980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5" gracePeriod=30 Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.618938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"abd77318-16d4-4199-bf94-dd884ef91d75","Type":"ContainerStarted","Data":"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5"} Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.619027 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-httpd" containerID="cri-o://26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b" gracePeriod=30 Jan 21 16:15:41 crc kubenswrapper[4707]: I0121 16:15:41.635498 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" podStartSLOduration=4.635484151 podStartE2EDuration="4.635484151s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:40.605493394 +0000 UTC m=+4437.787009636" watchObservedRunningTime="2026-01-21 16:15:41.635484151 +0000 UTC m=+4438.817000363" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.066681 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-combined-ca-bundle\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbzr4\" (UniqueName: \"kubernetes.io/projected/abd77318-16d4-4199-bf94-dd884ef91d75-kube-api-access-dbzr4\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-logs\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-config-data\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-internal-tls-certs\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-httpd-run\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-scripts\") pod \"abd77318-16d4-4199-bf94-dd884ef91d75\" (UID: \"abd77318-16d4-4199-bf94-dd884ef91d75\") " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.204986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-logs" (OuterVolumeSpecName: "logs") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.205027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.205209 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.205221 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abd77318-16d4-4199-bf94-dd884ef91d75-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.209089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd77318-16d4-4199-bf94-dd884ef91d75-kube-api-access-dbzr4" (OuterVolumeSpecName: "kube-api-access-dbzr4") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "kube-api-access-dbzr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.209170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.212642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-scripts" (OuterVolumeSpecName: "scripts") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.230301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.247007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.255902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-config-data" (OuterVolumeSpecName: "config-data") pod "abd77318-16d4-4199-bf94-dd884ef91d75" (UID: "abd77318-16d4-4199-bf94-dd884ef91d75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.308244 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.308302 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.308317 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.308329 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.308342 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbzr4\" (UniqueName: \"kubernetes.io/projected/abd77318-16d4-4199-bf94-dd884ef91d75-kube-api-access-dbzr4\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.308358 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd77318-16d4-4199-bf94-dd884ef91d75-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.327149 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.409672 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.625964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4bbf7a3d-ae4a-4afb-953b-02346807fcfc","Type":"ContainerStarted","Data":"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206"} Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.626298 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" 
podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-log" containerID="cri-o://905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030" gracePeriod=30 Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.626410 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-httpd" containerID="cri-o://2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206" gracePeriod=30 Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.628532 4707 generic.go:334] "Generic (PLEG): container finished" podID="4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" containerID="b52afd5f2e01f239ce45f1d19e62887eb3138eeefb455653380278b022c9f0a2" exitCode=0 Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.628582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" event={"ID":"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4","Type":"ContainerDied","Data":"b52afd5f2e01f239ce45f1d19e62887eb3138eeefb455653380278b022c9f0a2"} Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.630712 4707 generic.go:334] "Generic (PLEG): container finished" podID="abd77318-16d4-4199-bf94-dd884ef91d75" containerID="26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b" exitCode=143 Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.630734 4707 generic.go:334] "Generic (PLEG): container finished" podID="abd77318-16d4-4199-bf94-dd884ef91d75" containerID="980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5" exitCode=143 Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.630801 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.631469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"abd77318-16d4-4199-bf94-dd884ef91d75","Type":"ContainerDied","Data":"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b"} Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.631517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"abd77318-16d4-4199-bf94-dd884ef91d75","Type":"ContainerDied","Data":"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5"} Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.631529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"abd77318-16d4-4199-bf94-dd884ef91d75","Type":"ContainerDied","Data":"77f827576a3663f5bb74c1b657cf94d5fd20882a7309b8b1a595c6e648ce61b5"} Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.631546 4707 scope.go:117] "RemoveContainer" containerID="26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.633881 4707 generic.go:334] "Generic (PLEG): container finished" podID="073f18c4-b272-4307-9e4c-123432b25fb2" containerID="1ec9f8e9cdd570c1c0f9a3cff013e5233069dabb7c23b0432d95418fcef9886f" exitCode=0 Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.633942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" 
event={"ID":"073f18c4-b272-4307-9e4c-123432b25fb2","Type":"ContainerDied","Data":"1ec9f8e9cdd570c1c0f9a3cff013e5233069dabb7c23b0432d95418fcef9886f"} Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.654100 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=5.6540893709999995 podStartE2EDuration="5.654089371s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:42.653039997 +0000 UTC m=+4439.834556219" watchObservedRunningTime="2026-01-21 16:15:42.654089371 +0000 UTC m=+4439.835605593" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.661619 4707 scope.go:117] "RemoveContainer" containerID="980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.763346 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.791121 4707 scope.go:117] "RemoveContainer" containerID="26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b" Jan 21 16:15:42 crc kubenswrapper[4707]: E0121 16:15:42.791791 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b\": container with ID starting with 26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b not found: ID does not exist" containerID="26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.791844 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b"} err="failed to get container status \"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b\": rpc error: code = NotFound desc = could not find container \"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b\": container with ID starting with 26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b not found: ID does not exist" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.791863 4707 scope.go:117] "RemoveContainer" containerID="980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5" Jan 21 16:15:42 crc kubenswrapper[4707]: E0121 16:15:42.811905 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5\": container with ID starting with 980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5 not found: ID does not exist" containerID="980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.811940 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5"} err="failed to get container status \"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5\": rpc error: code = NotFound desc = could not find container \"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5\": container with ID starting with 980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5 not found: ID 
does not exist" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.811960 4707 scope.go:117] "RemoveContainer" containerID="26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.828538 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b"} err="failed to get container status \"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b\": rpc error: code = NotFound desc = could not find container \"26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b\": container with ID starting with 26813970807b6a38000965221e9a5a5cafde4273627d68c66e86c3196615a80b not found: ID does not exist" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.828564 4707 scope.go:117] "RemoveContainer" containerID="980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.843395 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5"} err="failed to get container status \"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5\": rpc error: code = NotFound desc = could not find container \"980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5\": container with ID starting with 980016f2facd7544c023739832ba36217b37d0c21a04b02ab53aa2fb7b5868f5 not found: ID does not exist" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.843440 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.854668 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:42 crc kubenswrapper[4707]: E0121 16:15:42.855023 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-httpd" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.855039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-httpd" Jan 21 16:15:42 crc kubenswrapper[4707]: E0121 16:15:42.855054 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-log" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.855061 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-log" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.855222 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-log" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.855244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" containerName="glance-httpd" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.856049 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.858310 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.861042 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.902402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slpm8\" (UniqueName: \"kubernetes.io/projected/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-kube-api-access-slpm8\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:42 crc kubenswrapper[4707]: I0121 16:15:42.921452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slpm8\" (UniqueName: \"kubernetes.io/projected/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-kube-api-access-slpm8\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.030725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.033621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.034102 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.034542 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.038594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.054427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.057383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.061118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.069494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slpm8\" (UniqueName: \"kubernetes.io/projected/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-kube-api-access-slpm8\") pod \"glance-default-internal-api-0\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.092609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.098039 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.132845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-combined-ca-bundle\") pod \"7d31e188-7526-4747-b72b-5fb81bb77006\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.132957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-scripts\") pod \"7d31e188-7526-4747-b72b-5fb81bb77006\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.133170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glsc2\" (UniqueName: \"kubernetes.io/projected/7d31e188-7526-4747-b72b-5fb81bb77006-kube-api-access-glsc2\") pod \"7d31e188-7526-4747-b72b-5fb81bb77006\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.133223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d31e188-7526-4747-b72b-5fb81bb77006-logs\") pod \"7d31e188-7526-4747-b72b-5fb81bb77006\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.133263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-config-data\") pod \"7d31e188-7526-4747-b72b-5fb81bb77006\" (UID: \"7d31e188-7526-4747-b72b-5fb81bb77006\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.136082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d31e188-7526-4747-b72b-5fb81bb77006-logs" (OuterVolumeSpecName: "logs") pod "7d31e188-7526-4747-b72b-5fb81bb77006" (UID: "7d31e188-7526-4747-b72b-5fb81bb77006"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.143319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-scripts" (OuterVolumeSpecName: "scripts") pod "7d31e188-7526-4747-b72b-5fb81bb77006" (UID: "7d31e188-7526-4747-b72b-5fb81bb77006"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.145999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d31e188-7526-4747-b72b-5fb81bb77006-kube-api-access-glsc2" (OuterVolumeSpecName: "kube-api-access-glsc2") pod "7d31e188-7526-4747-b72b-5fb81bb77006" (UID: "7d31e188-7526-4747-b72b-5fb81bb77006"). InnerVolumeSpecName "kube-api-access-glsc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.158326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d31e188-7526-4747-b72b-5fb81bb77006" (UID: "7d31e188-7526-4747-b72b-5fb81bb77006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.167666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-config-data" (OuterVolumeSpecName: "config-data") pod "7d31e188-7526-4747-b72b-5fb81bb77006" (UID: "7d31e188-7526-4747-b72b-5fb81bb77006"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.196195 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abd77318-16d4-4199-bf94-dd884ef91d75" path="/var/lib/kubelet/pods/abd77318-16d4-4199-bf94-dd884ef91d75/volumes" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.239316 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.239340 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.239353 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glsc2\" (UniqueName: \"kubernetes.io/projected/7d31e188-7526-4747-b72b-5fb81bb77006-kube-api-access-glsc2\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.239363 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d31e188-7526-4747-b72b-5fb81bb77006-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.239374 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31e188-7526-4747-b72b-5fb81bb77006-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.279834 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.464438 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-public-tls-certs\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-logs\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-combined-ca-bundle\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-scripts\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544826 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-httpd-run\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-config-data\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.544916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rbhq\" (UniqueName: \"kubernetes.io/projected/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-kube-api-access-7rbhq\") pod \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\" (UID: \"4bbf7a3d-ae4a-4afb-953b-02346807fcfc\") " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.545031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-logs" (OuterVolumeSpecName: "logs") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.545228 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.546049 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.546082 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.548470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.548539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-scripts" (OuterVolumeSpecName: "scripts") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.550443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-kube-api-access-7rbhq" (OuterVolumeSpecName: "kube-api-access-7rbhq") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "kube-api-access-7rbhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.567249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.581474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-config-data" (OuterVolumeSpecName: "config-data") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.585284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4bbf7a3d-ae4a-4afb-953b-02346807fcfc" (UID: "4bbf7a3d-ae4a-4afb-953b-02346807fcfc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.647426 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.647454 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.647470 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.647480 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.647488 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.647500 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rbhq\" (UniqueName: \"kubernetes.io/projected/4bbf7a3d-ae4a-4afb-953b-02346807fcfc-kube-api-access-7rbhq\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.654459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-psp6t" event={"ID":"7d31e188-7526-4747-b72b-5fb81bb77006","Type":"ContainerDied","Data":"0d87388d867d25bb94506a74396da8c9537a2414609c3b39d6a2118edbc56fab"} Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.654508 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d87388d867d25bb94506a74396da8c9537a2414609c3b39d6a2118edbc56fab" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.654571 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-psp6t" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.661290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerStarted","Data":"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa"} Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.661470 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-central-agent" containerID="cri-o://da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" gracePeriod=30 Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.661770 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.662121 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="proxy-httpd" containerID="cri-o://4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" gracePeriod=30 Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.662199 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="sg-core" containerID="cri-o://74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" gracePeriod=30 Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.662244 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-notification-agent" containerID="cri-o://263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" gracePeriod=30 Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.667887 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669217 4707 generic.go:334] "Generic (PLEG): container finished" podID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerID="2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206" exitCode=0 Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669242 4707 generic.go:334] "Generic (PLEG): container finished" podID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerID="905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030" exitCode=143 Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669306 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4bbf7a3d-ae4a-4afb-953b-02346807fcfc","Type":"ContainerDied","Data":"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206"} Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4bbf7a3d-ae4a-4afb-953b-02346807fcfc","Type":"ContainerDied","Data":"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030"} Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4bbf7a3d-ae4a-4afb-953b-02346807fcfc","Type":"ContainerDied","Data":"73d8facd0034ea977db42bbf9d1585bf2a357e809d7fda1292def166264b829a"} Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.669392 4707 scope.go:117] "RemoveContainer" containerID="2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.693097 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.4563490310000002 podStartE2EDuration="6.693080652s" podCreationTimestamp="2026-01-21 16:15:37 +0000 UTC" firstStartedPulling="2026-01-21 16:15:38.608223485 +0000 UTC m=+4435.789739706" lastFinishedPulling="2026-01-21 16:15:42.844955104 +0000 UTC m=+4440.026471327" observedRunningTime="2026-01-21 16:15:43.678112798 +0000 UTC m=+4440.859629020" watchObservedRunningTime="2026-01-21 16:15:43.693080652 +0000 UTC m=+4440.874596874" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.716489 4707 scope.go:117] "RemoveContainer" containerID="905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.726856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.733045 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.740374 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.749658 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750203 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:43 crc kubenswrapper[4707]: E0121 16:15:43.750549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-log" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-log" Jan 21 16:15:43 crc kubenswrapper[4707]: E0121 16:15:43.750598 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d31e188-7526-4747-b72b-5fb81bb77006" 
containerName="placement-db-sync" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750605 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d31e188-7526-4747-b72b-5fb81bb77006" containerName="placement-db-sync" Jan 21 16:15:43 crc kubenswrapper[4707]: E0121 16:15:43.750619 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-httpd" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750626 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-httpd" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-httpd" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750785 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d31e188-7526-4747-b72b-5fb81bb77006" containerName="placement-db-sync" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.750800 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" containerName="glance-log" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.751591 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754014 4707 scope.go:117] "RemoveContainer" containerID="2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206" Jan 21 16:15:43 crc kubenswrapper[4707]: E0121 16:15:43.754407 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206\": container with ID starting with 2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206 not found: ID does not exist" containerID="2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754438 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206"} err="failed to get container status \"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206\": rpc error: code = NotFound desc = could not find container \"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206\": container with ID starting with 2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206 not found: ID does not exist" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754461 4707 scope.go:117] "RemoveContainer" containerID="905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030" Jan 21 16:15:43 crc kubenswrapper[4707]: E0121 16:15:43.754666 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030\": container with ID starting with 905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030 not found: ID does not exist" containerID="905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754697 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030"} err="failed to get 
container status \"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030\": rpc error: code = NotFound desc = could not find container \"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030\": container with ID starting with 905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030 not found: ID does not exist" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754710 4707 scope.go:117] "RemoveContainer" containerID="2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754903 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206"} err="failed to get container status \"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206\": rpc error: code = NotFound desc = could not find container \"2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206\": container with ID starting with 2afb93489f0f11e87777a2c69396e0eb0b791edb855e1c80044b6c99188e2206 not found: ID does not exist" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.754916 4707 scope.go:117] "RemoveContainer" containerID="905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.755141 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030"} err="failed to get container status \"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030\": rpc error: code = NotFound desc = could not find container \"905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030\": container with ID starting with 905629472c19ed2fff43452e783697a98d4ec12b76ac870e95b5e561fd008030 not found: ID does not exist" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.766995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.767323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.774485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.795384 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6c876dcb7d-x45vx"] Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.797150 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.801025 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.801195 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-r4rhn" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.802149 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.833512 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6c876dcb7d-x45vx"] Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.958754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-config-data\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.958857 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-scripts\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.958912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/1a330953-fa65-465d-b502-3143953dd0ab-kube-api-access-p2hxc\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.958976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-logs\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-config-data\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvjk\" (UniqueName: 
\"kubernetes.io/projected/42d3240b-126c-4815-b24a-0b02f8416c58-kube-api-access-sqvjk\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-combined-ca-bundle\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-scripts\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a330953-fa65-465d-b502-3143953dd0ab-logs\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:43 crc kubenswrapper[4707]: I0121 16:15:43.959378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-scripts\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/1a330953-fa65-465d-b502-3143953dd0ab-kube-api-access-p2hxc\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061650 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-logs\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-config-data\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqvjk\" (UniqueName: \"kubernetes.io/projected/42d3240b-126c-4815-b24a-0b02f8416c58-kube-api-access-sqvjk\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-combined-ca-bundle\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-scripts\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.061962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a330953-fa65-465d-b502-3143953dd0ab-logs\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.062039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.062069 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.062100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-config-data\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.063999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a330953-fa65-465d-b502-3143953dd0ab-logs\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.064182 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.065797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.065883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-logs\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.068584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-scripts\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.068931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.069311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-config-data\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.073113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-combined-ca-bundle\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.073535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-config-data\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.078986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvjk\" (UniqueName: \"kubernetes.io/projected/42d3240b-126c-4815-b24a-0b02f8416c58-kube-api-access-sqvjk\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.079134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.079608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/1a330953-fa65-465d-b502-3143953dd0ab-kube-api-access-p2hxc\") pod \"placement-6c876dcb7d-x45vx\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.080032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-scripts\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.088178 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.091211 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.154597 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.164250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6dfz\" (UniqueName: \"kubernetes.io/projected/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-kube-api-access-l6dfz\") pod \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.164400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-db-sync-config-data\") pod \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.164438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-combined-ca-bundle\") pod \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\" (UID: \"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.169472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-kube-api-access-l6dfz" (OuterVolumeSpecName: "kube-api-access-l6dfz") pod "4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" (UID: "4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4"). InnerVolumeSpecName "kube-api-access-l6dfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.171759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" (UID: "4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.197652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" (UID: "4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.207032 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.266716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-combined-ca-bundle\") pod \"073f18c4-b272-4307-9e4c-123432b25fb2\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.266770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72llt\" (UniqueName: \"kubernetes.io/projected/073f18c4-b272-4307-9e4c-123432b25fb2-kube-api-access-72llt\") pod \"073f18c4-b272-4307-9e4c-123432b25fb2\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.266843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-config-data\") pod \"073f18c4-b272-4307-9e4c-123432b25fb2\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.266909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-scripts\") pod \"073f18c4-b272-4307-9e4c-123432b25fb2\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.266956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-fernet-keys\") pod \"073f18c4-b272-4307-9e4c-123432b25fb2\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.267172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-credential-keys\") pod \"073f18c4-b272-4307-9e4c-123432b25fb2\" (UID: \"073f18c4-b272-4307-9e4c-123432b25fb2\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.267976 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.267997 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.268007 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6dfz\" (UniqueName: \"kubernetes.io/projected/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4-kube-api-access-l6dfz\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.271612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-scripts" (OuterVolumeSpecName: "scripts") pod "073f18c4-b272-4307-9e4c-123432b25fb2" (UID: "073f18c4-b272-4307-9e4c-123432b25fb2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.271748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073f18c4-b272-4307-9e4c-123432b25fb2-kube-api-access-72llt" (OuterVolumeSpecName: "kube-api-access-72llt") pod "073f18c4-b272-4307-9e4c-123432b25fb2" (UID: "073f18c4-b272-4307-9e4c-123432b25fb2"). InnerVolumeSpecName "kube-api-access-72llt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.271927 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.282035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "073f18c4-b272-4307-9e4c-123432b25fb2" (UID: "073f18c4-b272-4307-9e4c-123432b25fb2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.282129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "073f18c4-b272-4307-9e4c-123432b25fb2" (UID: "073f18c4-b272-4307-9e4c-123432b25fb2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.296842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-config-data" (OuterVolumeSpecName: "config-data") pod "073f18c4-b272-4307-9e4c-123432b25fb2" (UID: "073f18c4-b272-4307-9e4c-123432b25fb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.300034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "073f18c4-b272-4307-9e4c-123432b25fb2" (UID: "073f18c4-b272-4307-9e4c-123432b25fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.336304 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.372063 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.372086 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.372095 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.372107 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.372118 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72llt\" (UniqueName: \"kubernetes.io/projected/073f18c4-b272-4307-9e4c-123432b25fb2-kube-api-access-72llt\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.372127 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073f18c4-b272-4307-9e4c-123432b25fb2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.408870 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7bbc9795db-8fjxd"] Jan 21 16:15:44 crc kubenswrapper[4707]: E0121 16:15:44.409356 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="sg-core" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="sg-core" Jan 21 16:15:44 crc kubenswrapper[4707]: E0121 16:15:44.409392 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-notification-agent" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-notification-agent" Jan 21 16:15:44 crc kubenswrapper[4707]: E0121 16:15:44.409421 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-central-agent" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409428 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-central-agent" Jan 21 16:15:44 crc kubenswrapper[4707]: E0121 16:15:44.409438 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073f18c4-b272-4307-9e4c-123432b25fb2" containerName="keystone-bootstrap" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409444 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="073f18c4-b272-4307-9e4c-123432b25fb2" containerName="keystone-bootstrap" Jan 21 16:15:44 crc kubenswrapper[4707]: E0121 16:15:44.409455 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="proxy-httpd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409462 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="proxy-httpd" Jan 21 16:15:44 crc kubenswrapper[4707]: E0121 16:15:44.409481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" containerName="barbican-db-sync" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409487 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" containerName="barbican-db-sync" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409675 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="073f18c4-b272-4307-9e4c-123432b25fb2" containerName="keystone-bootstrap" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409693 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" containerName="barbican-db-sync" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409703 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="sg-core" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409712 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="proxy-httpd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409721 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-central-agent" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.409733 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerName="ceilometer-notification-agent" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.410777 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.416120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.416329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.429342 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7bbc9795db-8fjxd"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.472454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-sg-core-conf-yaml\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.472721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s7vk\" (UniqueName: \"kubernetes.io/projected/090cb60b-998d-4415-b080-aa7f6a6b45c6-kube-api-access-6s7vk\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.472781 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-log-httpd\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.472963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-scripts\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.472991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-config-data\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.473017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-combined-ca-bundle\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.473080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-run-httpd\") pod \"090cb60b-998d-4415-b080-aa7f6a6b45c6\" (UID: \"090cb60b-998d-4415-b080-aa7f6a6b45c6\") " Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.473877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.475979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.485838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090cb60b-998d-4415-b080-aa7f6a6b45c6-kube-api-access-6s7vk" (OuterVolumeSpecName: "kube-api-access-6s7vk") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "kube-api-access-6s7vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.485907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-scripts" (OuterVolumeSpecName: "scripts") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.509272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.570630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-logs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-combined-ca-bundle\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-internal-tls-certs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-public-tls-certs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bv2b\" (UniqueName: \"kubernetes.io/projected/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-kube-api-access-4bv2b\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-scripts\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-config-data\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575861 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575891 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575903 4707 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575914 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575923 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s7vk\" (UniqueName: \"kubernetes.io/projected/090cb60b-998d-4415-b080-aa7f6a6b45c6-kube-api-access-6s7vk\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.575932 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/090cb60b-998d-4415-b080-aa7f6a6b45c6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.587339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-config-data" (OuterVolumeSpecName: "config-data") pod "090cb60b-998d-4415-b080-aa7f6a6b45c6" (UID: "090cb60b-998d-4415-b080-aa7f6a6b45c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.677535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-logs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.677959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-combined-ca-bundle\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.677991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-logs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.678032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-internal-tls-certs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.678074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-public-tls-certs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.678131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bv2b\" 
(UniqueName: \"kubernetes.io/projected/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-kube-api-access-4bv2b\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.678183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-scripts\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.678239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-config-data\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.678351 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090cb60b-998d-4415-b080-aa7f6a6b45c6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.682174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-internal-tls-certs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.685579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-scripts\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.686444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-config-data\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.687535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-public-tls-certs\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.687543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-combined-ca-bundle\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.706121 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.706446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-krrl7" event={"ID":"4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4","Type":"ContainerDied","Data":"caaffe1502cbf325e898d5a49fff91a3612077fea5ca1d42855b1b85a6108435"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.706482 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caaffe1502cbf325e898d5a49fff91a3612077fea5ca1d42855b1b85a6108435" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.708061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bv2b\" (UniqueName: \"kubernetes.io/projected/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-kube-api-access-4bv2b\") pod \"placement-7bbc9795db-8fjxd\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.710650 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.717378 4707 generic.go:334] "Generic (PLEG): container finished" podID="f82f8d8c-cc8e-4295-8453-61fb1b1c3848" containerID="b64f3b8df76efe4f95911f51150ea7f0f45391b29547dd5c2934533c217187fa" exitCode=0 Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.717421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" event={"ID":"f82f8d8c-cc8e-4295-8453-61fb1b1c3848","Type":"ContainerDied","Data":"b64f3b8df76efe4f95911f51150ea7f0f45391b29547dd5c2934533c217187fa"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.722278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea","Type":"ContainerStarted","Data":"0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.722324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea","Type":"ContainerStarted","Data":"8896a2707c13bf3ae97b270f90ba3046300518ab781d56f48ed17d9cd6eb2255"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.734147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" event={"ID":"073f18c4-b272-4307-9e4c-123432b25fb2","Type":"ContainerDied","Data":"423dd05f876b8af355a9ef2e2edac5427fed97701a85990d3147e5d77e0b4362"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.734193 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423dd05f876b8af355a9ef2e2edac5427fed97701a85990d3147e5d77e0b4362" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.734218 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7n7ck" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753451 4707 generic.go:334] "Generic (PLEG): container finished" podID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" exitCode=0 Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753490 4707 generic.go:334] "Generic (PLEG): container finished" podID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" exitCode=2 Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753500 4707 generic.go:334] "Generic (PLEG): container finished" podID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" exitCode=0 Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753508 4707 generic.go:334] "Generic (PLEG): container finished" podID="090cb60b-998d-4415-b080-aa7f6a6b45c6" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" exitCode=0 Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerDied","Data":"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerDied","Data":"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerDied","Data":"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerDied","Data":"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"090cb60b-998d-4415-b080-aa7f6a6b45c6","Type":"ContainerDied","Data":"97a6c5869e0e8aad0ddfdd129607b8d5840f33b0693e44487b2df7a32e1df9a5"} Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753618 4707 scope.go:117] "RemoveContainer" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.753781 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.759845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7n7ck"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.771527 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7n7ck"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.772498 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.785021 4707 scope.go:117] "RemoveContainer" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.809270 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.824222 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.838928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.846249 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.847189 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.851743 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.857432 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6c876dcb7d-x45vx"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.864042 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.888420 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bmjbg"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.890064 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.902288 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.902584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.902701 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-shwt7" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.902905 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.903011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.908593 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bmjbg"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.956721 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.958327 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.969667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.969731 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-x9wl2" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.969858 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.973787 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m"] Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.980436 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.984118 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.988792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-config-data\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.988984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-credential-keys\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-config-data\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-fernet-keys\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qcr\" (UniqueName: \"kubernetes.io/projected/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-kube-api-access-56qcr\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvsp\" (UniqueName: \"kubernetes.io/projected/d5dd96c7-ee47-4191-9f21-fc10d99edbec-kube-api-access-hxvsp\") pod \"keystone-bootstrap-bmjbg\" 
(UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-scripts\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-scripts\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-run-httpd\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-combined-ca-bundle\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:44 crc kubenswrapper[4707]: I0121 16:15:44.989487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-log-httpd\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.010612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.067126 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.097430 4707 scope.go:117] "RemoveContainer" containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-credential-keys\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-config-data\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-fernet-keys\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qcr\" (UniqueName: \"kubernetes.io/projected/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-kube-api-access-56qcr\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107478 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwjv\" (UniqueName: \"kubernetes.io/projected/3f8cfcd0-7878-4185-b558-a0ad021f06df-kube-api-access-sgwjv\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvsp\" (UniqueName: \"kubernetes.io/projected/d5dd96c7-ee47-4191-9f21-fc10d99edbec-kube-api-access-hxvsp\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data-custom\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-scripts\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107803 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-logs\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-scripts\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-combined-ca-bundle\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.107966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-run-httpd\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-combined-ca-bundle\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data-custom\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc 
kubenswrapper[4707]: I0121 16:15:45.108115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8cfcd0-7878-4185-b558-a0ad021f06df-logs\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cnwv\" (UniqueName: \"kubernetes.io/projected/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-kube-api-access-7cnwv\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-log-httpd\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.108288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-config-data\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.115107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-credential-keys\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.115611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-config-data\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.116778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-combined-ca-bundle\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.116879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-log-httpd\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc 
kubenswrapper[4707]: I0121 16:15:45.117057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-run-httpd\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.117084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.117979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.120113 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.120307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-scripts\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.121936 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.124925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-scripts\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.125189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-fernet-keys\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.125769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-config-data\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.126104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.136962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qcr\" (UniqueName: \"kubernetes.io/projected/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-kube-api-access-56qcr\") pod \"ceilometer-0\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.136999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-hxvsp\" (UniqueName: \"kubernetes.io/projected/d5dd96c7-ee47-4191-9f21-fc10d99edbec-kube-api-access-hxvsp\") pod \"keystone-bootstrap-bmjbg\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.148393 4707 scope.go:117] "RemoveContainer" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.152529 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.188954 4707 scope.go:117] "RemoveContainer" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" Jan 21 16:15:45 crc kubenswrapper[4707]: E0121 16:15:45.192990 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": container with ID starting with 4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa not found: ID does not exist" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.193030 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa"} err="failed to get container status \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": rpc error: code = NotFound desc = could not find container \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": container with ID starting with 4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.193052 4707 scope.go:117] "RemoveContainer" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" Jan 21 16:15:45 crc kubenswrapper[4707]: E0121 16:15:45.194909 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": container with ID starting with 74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6 not found: ID does not exist" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.194936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6"} err="failed to get container status \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": rpc error: code = NotFound desc = could not find container \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": container with ID starting with 74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.194951 4707 scope.go:117] "RemoveContainer" containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" Jan 21 16:15:45 crc kubenswrapper[4707]: E0121 16:15:45.196421 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": container 
with ID starting with 263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5 not found: ID does not exist" containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.196456 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5"} err="failed to get container status \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": rpc error: code = NotFound desc = could not find container \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": container with ID starting with 263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.196477 4707 scope.go:117] "RemoveContainer" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" Jan 21 16:15:45 crc kubenswrapper[4707]: E0121 16:15:45.201261 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": container with ID starting with da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b not found: ID does not exist" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.201293 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b"} err="failed to get container status \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": rpc error: code = NotFound desc = could not find container \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": container with ID starting with da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.201308 4707 scope.go:117] "RemoveContainer" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.201670 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa"} err="failed to get container status \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": rpc error: code = NotFound desc = could not find container \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": container with ID starting with 4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.201702 4707 scope.go:117] "RemoveContainer" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.202135 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073f18c4-b272-4307-9e4c-123432b25fb2" path="/var/lib/kubelet/pods/073f18c4-b272-4307-9e4c-123432b25fb2/volumes" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.202592 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6"} err="failed to get container status 
\"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": rpc error: code = NotFound desc = could not find container \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": container with ID starting with 74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.202612 4707 scope.go:117] "RemoveContainer" containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.202863 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090cb60b-998d-4415-b080-aa7f6a6b45c6" path="/var/lib/kubelet/pods/090cb60b-998d-4415-b080-aa7f6a6b45c6/volumes" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.202975 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5"} err="failed to get container status \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": rpc error: code = NotFound desc = could not find container \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": container with ID starting with 263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.202991 4707 scope.go:117] "RemoveContainer" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.203769 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b"} err="failed to get container status \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": rpc error: code = NotFound desc = could not find container \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": container with ID starting with da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.203801 4707 scope.go:117] "RemoveContainer" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.204312 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bbf7a3d-ae4a-4afb-953b-02346807fcfc" path="/var/lib/kubelet/pods/4bbf7a3d-ae4a-4afb-953b-02346807fcfc/volumes" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.204429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa"} err="failed to get container status \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": rpc error: code = NotFound desc = could not find container \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": container with ID starting with 4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.204559 4707 scope.go:117] "RemoveContainer" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.205100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6"} 
err="failed to get container status \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": rpc error: code = NotFound desc = could not find container \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": container with ID starting with 74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.205119 4707 scope.go:117] "RemoveContainer" containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.205437 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5"} err="failed to get container status \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": rpc error: code = NotFound desc = could not find container \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": container with ID starting with 263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.205457 4707 scope.go:117] "RemoveContainer" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.205897 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b"} err="failed to get container status \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": rpc error: code = NotFound desc = could not find container \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": container with ID starting with da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.205920 4707 scope.go:117] "RemoveContainer" containerID="4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.206364 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa"} err="failed to get container status \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": rpc error: code = NotFound desc = could not find container \"4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa\": container with ID starting with 4b09e5fed2048c7e52d8c6dce93f97177ba9eb7aa8ddd4adf20290676b56d4fa not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.206387 4707 scope.go:117] "RemoveContainer" containerID="74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.208250 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6"} err="failed to get container status \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": rpc error: code = NotFound desc = could not find container \"74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6\": container with ID starting with 74f67bff6add1f3e2620989a1d17fa1463cfdfd90205ae2fd9e113eafed7e4d6 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.208270 4707 scope.go:117] "RemoveContainer" 
containerID="263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209044 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5"} err="failed to get container status \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": rpc error: code = NotFound desc = could not find container \"263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5\": container with ID starting with 263b0205357524ce6c3a24d49692f40056f1e259b1a552b8cc149024a92942a5 not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209065 4707 scope.go:117] "RemoveContainer" containerID="da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209281 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b"} err="failed to get container status \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": rpc error: code = NotFound desc = could not find container \"da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b\": container with ID starting with da6eba348bda5bec232b45b359b8913f833bc86d996077a65b0d9439f0348d9b not found: ID does not exist" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-combined-ca-bundle\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707fba96-29a7-4fd4-ad1f-4a9cfae28021-logs\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs2lt\" (UniqueName: \"kubernetes.io/projected/707fba96-29a7-4fd4-ad1f-4a9cfae28021-kube-api-access-xs2lt\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 
16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data-custom\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8cfcd0-7878-4185-b558-a0ad021f06df-logs\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cnwv\" (UniqueName: \"kubernetes.io/projected/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-kube-api-access-7cnwv\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data-custom\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwjv\" (UniqueName: \"kubernetes.io/projected/3f8cfcd0-7878-4185-b558-a0ad021f06df-kube-api-access-sgwjv\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-combined-ca-bundle\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data-custom\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.209633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-logs\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.210375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-logs\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.210543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8cfcd0-7878-4185-b558-a0ad021f06df-logs\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.214344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data-custom\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.214728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.216496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-combined-ca-bundle\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.216837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.216912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data-custom\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.218158 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.225891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cnwv\" (UniqueName: \"kubernetes.io/projected/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-kube-api-access-7cnwv\") pod \"barbican-keystone-listener-6b6fbc4c8b-9448m\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.226052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwjv\" (UniqueName: \"kubernetes.io/projected/3f8cfcd0-7878-4185-b558-a0ad021f06df-kube-api-access-sgwjv\") pod \"barbican-worker-6f7c5bdff-mpg79\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.290068 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.310933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs2lt\" (UniqueName: \"kubernetes.io/projected/707fba96-29a7-4fd4-ad1f-4a9cfae28021-kube-api-access-xs2lt\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.310991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.311069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data-custom\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.311154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-combined-ca-bundle\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.311242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707fba96-29a7-4fd4-ad1f-4a9cfae28021-logs\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.312115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/707fba96-29a7-4fd4-ad1f-4a9cfae28021-logs\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.316669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-combined-ca-bundle\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.317896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data-custom\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.320692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.326769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.327315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs2lt\" (UniqueName: \"kubernetes.io/projected/707fba96-29a7-4fd4-ad1f-4a9cfae28021-kube-api-access-xs2lt\") pod \"barbican-api-75d65b4bdd-zn2lc\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.358922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.378700 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.457659 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.462744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7bbc9795db-8fjxd"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.769764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea","Type":"ContainerStarted","Data":"68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.775571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42d3240b-126c-4815-b24a-0b02f8416c58","Type":"ContainerStarted","Data":"8c3d1aa4c7e07ced0f9c9649b79271951b2b5ac01a281dd0765d10564964c5eb"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.775609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42d3240b-126c-4815-b24a-0b02f8416c58","Type":"ContainerStarted","Data":"da50fde175b1926f100bc671b0b7feac1bb445d445efe64cac69c6f9b1f7b970"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.780793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" event={"ID":"1a330953-fa65-465d-b502-3143953dd0ab","Type":"ContainerStarted","Data":"4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.780916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" event={"ID":"1a330953-fa65-465d-b502-3143953dd0ab","Type":"ContainerStarted","Data":"2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.780930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" event={"ID":"1a330953-fa65-465d-b502-3143953dd0ab","Type":"ContainerStarted","Data":"48e3cd6813db854c805d4e11f38563c55dcd976be791929874e5aed0f9acf24c"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.781628 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.781662 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.782996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" event={"ID":"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2","Type":"ContainerStarted","Data":"54dd56ada64ce67d274cd0b3bb2acc42f882289b5f7422926e8209f09d92c818"} Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.784348 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.794398 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.794384646 podStartE2EDuration="3.794384646s" podCreationTimestamp="2026-01-21 16:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:45.794049335 +0000 UTC 
m=+4442.975565557" watchObservedRunningTime="2026-01-21 16:15:45.794384646 +0000 UTC m=+4442.975900868" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.837248 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" podStartSLOduration=2.8372263110000002 podStartE2EDuration="2.837226311s" podCreationTimestamp="2026-01-21 16:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:45.828989633 +0000 UTC m=+4443.010505875" watchObservedRunningTime="2026-01-21 16:15:45.837226311 +0000 UTC m=+4443.018742532" Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.890903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bmjbg"] Jan 21 16:15:45 crc kubenswrapper[4707]: I0121 16:15:45.916949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m"] Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.098933 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79"] Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.223733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc"] Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.259411 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.329791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-db-sync-config-data\") pod \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.329855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scsg6\" (UniqueName: \"kubernetes.io/projected/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-kube-api-access-scsg6\") pod \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.330005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-config-data\") pod \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.330092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-scripts\") pod \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.330112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-etc-machine-id\") pod \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.330218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-combined-ca-bundle\") pod \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\" (UID: \"f82f8d8c-cc8e-4295-8453-61fb1b1c3848\") " Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.334896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f82f8d8c-cc8e-4295-8453-61fb1b1c3848" (UID: "f82f8d8c-cc8e-4295-8453-61fb1b1c3848"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.355986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-kube-api-access-scsg6" (OuterVolumeSpecName: "kube-api-access-scsg6") pod "f82f8d8c-cc8e-4295-8453-61fb1b1c3848" (UID: "f82f8d8c-cc8e-4295-8453-61fb1b1c3848"). InnerVolumeSpecName "kube-api-access-scsg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.369057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-scripts" (OuterVolumeSpecName: "scripts") pod "f82f8d8c-cc8e-4295-8453-61fb1b1c3848" (UID: "f82f8d8c-cc8e-4295-8453-61fb1b1c3848"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.378960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f82f8d8c-cc8e-4295-8453-61fb1b1c3848" (UID: "f82f8d8c-cc8e-4295-8453-61fb1b1c3848"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.410681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f82f8d8c-cc8e-4295-8453-61fb1b1c3848" (UID: "f82f8d8c-cc8e-4295-8453-61fb1b1c3848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.430278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-config-data" (OuterVolumeSpecName: "config-data") pod "f82f8d8c-cc8e-4295-8453-61fb1b1c3848" (UID: "f82f8d8c-cc8e-4295-8453-61fb1b1c3848"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.433346 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.433374 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.433387 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.433399 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.433408 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.433420 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scsg6\" (UniqueName: \"kubernetes.io/projected/f82f8d8c-cc8e-4295-8453-61fb1b1c3848-kube-api-access-scsg6\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.791385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" event={"ID":"d5dd96c7-ee47-4191-9f21-fc10d99edbec","Type":"ContainerStarted","Data":"22d5a5d536394a7c6034ffab06541d4868a12a093533861fc0bb90582e15484c"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.791649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" event={"ID":"d5dd96c7-ee47-4191-9f21-fc10d99edbec","Type":"ContainerStarted","Data":"b2db6c403813d323467092b10f044f6cb332872cb0c032583a46d7585a765006"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.794286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerStarted","Data":"65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.794314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerStarted","Data":"d5deef518bc0417711bf032fe4f59cad060a80c70807de9b591c04ea62753edd"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.795779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" event={"ID":"707fba96-29a7-4fd4-ad1f-4a9cfae28021","Type":"ContainerStarted","Data":"5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.795802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" 
event={"ID":"707fba96-29a7-4fd4-ad1f-4a9cfae28021","Type":"ContainerStarted","Data":"5685ec8c791774b73780dcb61cb55f22f6384000868bfe83e69723d342386d37"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.797847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" event={"ID":"2ba05f78-2a71-4b67-8e95-db9c04a4ed42","Type":"ContainerStarted","Data":"530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.797872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" event={"ID":"2ba05f78-2a71-4b67-8e95-db9c04a4ed42","Type":"ContainerStarted","Data":"1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.797882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" event={"ID":"2ba05f78-2a71-4b67-8e95-db9c04a4ed42","Type":"ContainerStarted","Data":"2f7e97b26f7795278c6e122326b465e0f0fb739574c352d50272ec391309e236"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.803142 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.803158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-sx8r2" event={"ID":"f82f8d8c-cc8e-4295-8453-61fb1b1c3848","Type":"ContainerDied","Data":"e0ad20db1e76f4b5cac527b70c6a6a7974706e6e4cfd81bd09911bd0adc9f337"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.803205 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ad20db1e76f4b5cac527b70c6a6a7974706e6e4cfd81bd09911bd0adc9f337" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.805125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" event={"ID":"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2","Type":"ContainerStarted","Data":"f31a9e28d650086abf54eb41930bbd347007b6dda6d23d5b073d912b680a7659"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.805183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" event={"ID":"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2","Type":"ContainerStarted","Data":"2edd1c6516797b8cc69e132ff4ce12d5ff92480e5675a5122e825d8deab90b2f"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.806798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" event={"ID":"3f8cfcd0-7878-4185-b558-a0ad021f06df","Type":"ContainerStarted","Data":"65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.806844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" event={"ID":"3f8cfcd0-7878-4185-b558-a0ad021f06df","Type":"ContainerStarted","Data":"f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.806855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" event={"ID":"3f8cfcd0-7878-4185-b558-a0ad021f06df","Type":"ContainerStarted","Data":"9b546e8f4cce9f56195c0a68364d2fea906f9c32f8e14716e77d3c107da162fb"} Jan 21 
16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.810051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42d3240b-126c-4815-b24a-0b02f8416c58","Type":"ContainerStarted","Data":"ed5d874d9c4d8d662d1aaf49cf031a81b9c998a70b0df4bb18a92e7b5f5ccae2"} Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.824938 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" podStartSLOduration=2.824927181 podStartE2EDuration="2.824927181s" podCreationTimestamp="2026-01-21 16:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:46.816987181 +0000 UTC m=+4443.998503403" watchObservedRunningTime="2026-01-21 16:15:46.824927181 +0000 UTC m=+4444.006443403" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.857285 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.8572686320000003 podStartE2EDuration="3.857268632s" podCreationTimestamp="2026-01-21 16:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:46.850993283 +0000 UTC m=+4444.032509515" watchObservedRunningTime="2026-01-21 16:15:46.857268632 +0000 UTC m=+4444.038784854" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.876451 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" podStartSLOduration=2.876438106 podStartE2EDuration="2.876438106s" podCreationTimestamp="2026-01-21 16:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:46.870765832 +0000 UTC m=+4444.052282053" watchObservedRunningTime="2026-01-21 16:15:46.876438106 +0000 UTC m=+4444.057954329" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.890984 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" podStartSLOduration=2.890947038 podStartE2EDuration="2.890947038s" podCreationTimestamp="2026-01-21 16:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:46.890045723 +0000 UTC m=+4444.071561955" watchObservedRunningTime="2026-01-21 16:15:46.890947038 +0000 UTC m=+4444.072463260" Jan 21 16:15:46 crc kubenswrapper[4707]: I0121 16:15:46.932393 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" podStartSLOduration=2.932372499 podStartE2EDuration="2.932372499s" podCreationTimestamp="2026-01-21 16:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:46.90914773 +0000 UTC m=+4444.090663952" watchObservedRunningTime="2026-01-21 16:15:46.932372499 +0000 UTC m=+4444.113888721" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.017868 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:15:47 crc kubenswrapper[4707]: E0121 16:15:47.018293 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f82f8d8c-cc8e-4295-8453-61fb1b1c3848" containerName="cinder-db-sync" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.018311 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82f8d8c-cc8e-4295-8453-61fb1b1c3848" containerName="cinder-db-sync" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.018471 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82f8d8c-cc8e-4295-8453-61fb1b1c3848" containerName="cinder-db-sync" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.019343 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.025382 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.025584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-9bcpk" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.026094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.026215 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.045032 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.069928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.070186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.070211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.070239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.075021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh5s\" (UniqueName: \"kubernetes.io/projected/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-kube-api-access-dvh5s\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 
16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.075091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.132025 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.133310 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.136096 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.178307 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.178931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.192723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.192757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.192798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.192836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh5s\" (UniqueName: \"kubernetes.io/projected/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-kube-api-access-dvh5s\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.192883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.186029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.193874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.202158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.204725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.204773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.225306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh5s\" (UniqueName: \"kubernetes.io/projected/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-kube-api-access-dvh5s\") pod \"cinder-scheduler-0\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.295961 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ce0f55d-19f5-4878-9173-3deaddab452c-logs\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.296002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r666z\" (UniqueName: \"kubernetes.io/projected/7ce0f55d-19f5-4878-9173-3deaddab452c-kube-api-access-r666z\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.296024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.296050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ce0f55d-19f5-4878-9173-3deaddab452c-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.296068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.296630 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.296914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-scripts\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.350858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.401306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ce0f55d-19f5-4878-9173-3deaddab452c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.401385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.401602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.401797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-scripts\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.401983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ce0f55d-19f5-4878-9173-3deaddab452c-logs\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.402023 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r666z\" (UniqueName: \"kubernetes.io/projected/7ce0f55d-19f5-4878-9173-3deaddab452c-kube-api-access-r666z\") pod \"cinder-api-0\" (UID: 
\"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.402052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.404007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ce0f55d-19f5-4878-9173-3deaddab452c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.404411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ce0f55d-19f5-4878-9173-3deaddab452c-logs\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.408607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.409308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-scripts\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.421572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.422726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.429495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r666z\" (UniqueName: \"kubernetes.io/projected/7ce0f55d-19f5-4878-9173-3deaddab452c-kube-api-access-r666z\") pod \"cinder-api-0\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.478849 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.661767 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.840950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bd341c6b-6920-4954-a5d8-51a7e5ab13bc","Type":"ContainerStarted","Data":"3becbf7604c5862b42c05b23e6b4803de9ecbedbe8ef4981fd7e7246f42e8d1d"} Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.850750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerStarted","Data":"f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05"} Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.861348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" event={"ID":"707fba96-29a7-4fd4-ad1f-4a9cfae28021","Type":"ContainerStarted","Data":"511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2"} Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.861908 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.862017 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.862602 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.862639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.881015 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" podStartSLOduration=2.880986826 podStartE2EDuration="2.880986826s" podCreationTimestamp="2026-01-21 16:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:47.876647927 +0000 UTC m=+4445.058164149" watchObservedRunningTime="2026-01-21 16:15:47.880986826 +0000 UTC m=+4445.062503047" Jan 21 16:15:47 crc kubenswrapper[4707]: I0121 16:15:47.968857 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:48 crc kubenswrapper[4707]: I0121 16:15:48.872106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7ce0f55d-19f5-4878-9173-3deaddab452c","Type":"ContainerStarted","Data":"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572"} Jan 21 16:15:48 crc kubenswrapper[4707]: I0121 16:15:48.872528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7ce0f55d-19f5-4878-9173-3deaddab452c","Type":"ContainerStarted","Data":"9b529da1710488ce0bb19b6f6de1eb147a8868fac6f635c9e674228b462176b7"} Jan 21 16:15:48 crc kubenswrapper[4707]: I0121 16:15:48.874480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" 
event={"ID":"bd341c6b-6920-4954-a5d8-51a7e5ab13bc","Type":"ContainerStarted","Data":"a4c0bcb50d74e4edcf57b0a0c6d0174d6f705cb46fa205f5901dbdcbab03caa3"} Jan 21 16:15:48 crc kubenswrapper[4707]: I0121 16:15:48.878172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerStarted","Data":"834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5"} Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.887912 4707 generic.go:334] "Generic (PLEG): container finished" podID="d5dd96c7-ee47-4191-9f21-fc10d99edbec" containerID="22d5a5d536394a7c6034ffab06541d4868a12a093533861fc0bb90582e15484c" exitCode=0 Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.887996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" event={"ID":"d5dd96c7-ee47-4191-9f21-fc10d99edbec","Type":"ContainerDied","Data":"22d5a5d536394a7c6034ffab06541d4868a12a093533861fc0bb90582e15484c"} Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.891028 4707 generic.go:334] "Generic (PLEG): container finished" podID="228927a0-3a8f-43f9-853f-99fc8df8e727" containerID="b895c33c9e96e01533b1f6162a9b38e42a966f465e3f68cd50d1c00b4fe94657" exitCode=0 Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.891087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" event={"ID":"228927a0-3a8f-43f9-853f-99fc8df8e727","Type":"ContainerDied","Data":"b895c33c9e96e01533b1f6162a9b38e42a966f465e3f68cd50d1c00b4fe94657"} Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.893779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7ce0f55d-19f5-4878-9173-3deaddab452c","Type":"ContainerStarted","Data":"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5"} Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.894710 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.896884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bd341c6b-6920-4954-a5d8-51a7e5ab13bc","Type":"ContainerStarted","Data":"2a1dc6c4ab4a2697f15febf2854084fc6b8cc8c57b6b4ba7851ac9717f05cfe7"} Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.965542 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.965518062 podStartE2EDuration="2.965518062s" podCreationTimestamp="2026-01-21 16:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:49.935584537 +0000 UTC m=+4447.117100760" watchObservedRunningTime="2026-01-21 16:15:49.965518062 +0000 UTC m=+4447.147034284" Jan 21 16:15:49 crc kubenswrapper[4707]: I0121 16:15:49.971431 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.971423435 podStartE2EDuration="3.971423435s" podCreationTimestamp="2026-01-21 16:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:49.954433349 +0000 UTC m=+4447.135949571" watchObservedRunningTime="2026-01-21 16:15:49.971423435 +0000 UTC m=+4447.152939657" Jan 
21 16:15:50 crc kubenswrapper[4707]: I0121 16:15:50.907457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerStarted","Data":"f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc"} Jan 21 16:15:50 crc kubenswrapper[4707]: I0121 16:15:50.908116 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:15:50 crc kubenswrapper[4707]: I0121 16:15:50.936763 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.891814749 podStartE2EDuration="6.936733374s" podCreationTimestamp="2026-01-21 16:15:44 +0000 UTC" firstStartedPulling="2026-01-21 16:15:45.799193818 +0000 UTC m=+4442.980710040" lastFinishedPulling="2026-01-21 16:15:49.844112443 +0000 UTC m=+4447.025628665" observedRunningTime="2026-01-21 16:15:50.930361413 +0000 UTC m=+4448.111877635" watchObservedRunningTime="2026-01-21 16:15:50.936733374 +0000 UTC m=+4448.118249586" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.162764 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.167412 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.321938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-fernet-keys\") pod \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.321992 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-config-data\") pod \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxvsp\" (UniqueName: \"kubernetes.io/projected/d5dd96c7-ee47-4191-9f21-fc10d99edbec-kube-api-access-hxvsp\") pod \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-credential-keys\") pod \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnjc2\" (UniqueName: \"kubernetes.io/projected/228927a0-3a8f-43f9-853f-99fc8df8e727-kube-api-access-nnjc2\") pod \"228927a0-3a8f-43f9-853f-99fc8df8e727\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-config\") pod 
\"228927a0-3a8f-43f9-853f-99fc8df8e727\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-combined-ca-bundle\") pod \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-scripts\") pod \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\" (UID: \"d5dd96c7-ee47-4191-9f21-fc10d99edbec\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.322484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-combined-ca-bundle\") pod \"228927a0-3a8f-43f9-853f-99fc8df8e727\" (UID: \"228927a0-3a8f-43f9-853f-99fc8df8e727\") " Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.334326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dd96c7-ee47-4191-9f21-fc10d99edbec-kube-api-access-hxvsp" (OuterVolumeSpecName: "kube-api-access-hxvsp") pod "d5dd96c7-ee47-4191-9f21-fc10d99edbec" (UID: "d5dd96c7-ee47-4191-9f21-fc10d99edbec"). InnerVolumeSpecName "kube-api-access-hxvsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.334630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228927a0-3a8f-43f9-853f-99fc8df8e727-kube-api-access-nnjc2" (OuterVolumeSpecName: "kube-api-access-nnjc2") pod "228927a0-3a8f-43f9-853f-99fc8df8e727" (UID: "228927a0-3a8f-43f9-853f-99fc8df8e727"). InnerVolumeSpecName "kube-api-access-nnjc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.338127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d5dd96c7-ee47-4191-9f21-fc10d99edbec" (UID: "d5dd96c7-ee47-4191-9f21-fc10d99edbec"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.338323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-scripts" (OuterVolumeSpecName: "scripts") pod "d5dd96c7-ee47-4191-9f21-fc10d99edbec" (UID: "d5dd96c7-ee47-4191-9f21-fc10d99edbec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.350799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d5dd96c7-ee47-4191-9f21-fc10d99edbec" (UID: "d5dd96c7-ee47-4191-9f21-fc10d99edbec"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.351606 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.365086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-config" (OuterVolumeSpecName: "config") pod "228927a0-3a8f-43f9-853f-99fc8df8e727" (UID: "228927a0-3a8f-43f9-853f-99fc8df8e727"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.365074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-config-data" (OuterVolumeSpecName: "config-data") pod "d5dd96c7-ee47-4191-9f21-fc10d99edbec" (UID: "d5dd96c7-ee47-4191-9f21-fc10d99edbec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.365148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "228927a0-3a8f-43f9-853f-99fc8df8e727" (UID: "228927a0-3a8f-43f9-853f-99fc8df8e727"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.366383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5dd96c7-ee47-4191-9f21-fc10d99edbec" (UID: "d5dd96c7-ee47-4191-9f21-fc10d99edbec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427460 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427496 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427510 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427521 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427529 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427540 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxvsp\" (UniqueName: \"kubernetes.io/projected/d5dd96c7-ee47-4191-9f21-fc10d99edbec-kube-api-access-hxvsp\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427553 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dd96c7-ee47-4191-9f21-fc10d99edbec-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427562 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnjc2\" (UniqueName: \"kubernetes.io/projected/228927a0-3a8f-43f9-853f-99fc8df8e727-kube-api-access-nnjc2\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.427571 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/228927a0-3a8f-43f9-853f-99fc8df8e727-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.707976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.931246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" event={"ID":"d5dd96c7-ee47-4191-9f21-fc10d99edbec","Type":"ContainerDied","Data":"b2db6c403813d323467092b10f044f6cb332872cb0c032583a46d7585a765006"} Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.931297 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2db6c403813d323467092b10f044f6cb332872cb0c032583a46d7585a765006" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.931389 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bmjbg" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.933890 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api-log" containerID="cri-o://c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572" gracePeriod=30 Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.934158 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api" containerID="cri-o://0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5" gracePeriod=30 Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.934170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" event={"ID":"228927a0-3a8f-43f9-853f-99fc8df8e727","Type":"ContainerDied","Data":"14fd7cffbe4103d7110ec99c95981c777d5f98930f5ba2f23f3a70826902f02d"} Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.934390 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fd7cffbe4103d7110ec99c95981c777d5f98930f5ba2f23f3a70826902f02d" Jan 21 16:15:52 crc kubenswrapper[4707]: I0121 16:15:52.934209 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-ppt78" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.219157 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp"] Jan 21 16:15:53 crc kubenswrapper[4707]: E0121 16:15:53.219699 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dd96c7-ee47-4191-9f21-fc10d99edbec" containerName="keystone-bootstrap" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.219711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dd96c7-ee47-4191-9f21-fc10d99edbec" containerName="keystone-bootstrap" Jan 21 16:15:53 crc kubenswrapper[4707]: E0121 16:15:53.219747 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228927a0-3a8f-43f9-853f-99fc8df8e727" containerName="neutron-db-sync" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.219752 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="228927a0-3a8f-43f9-853f-99fc8df8e727" containerName="neutron-db-sync" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.219919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="228927a0-3a8f-43f9-853f-99fc8df8e727" containerName="neutron-db-sync" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.219931 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dd96c7-ee47-4191-9f21-fc10d99edbec" containerName="keystone-bootstrap" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.220714 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.224699 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.225827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.241489 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp"] Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.280303 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.280547 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.335255 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h"] Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.336665 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.339533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.339883 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.340000 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.340082 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-6g6g9" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.344192 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7d947f86fd-wpfh7"] Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.345348 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.351335 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.351626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.351841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.352116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-shwt7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.352346 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.352669 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.352798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h"] Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.353870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-scripts\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.353943 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-public-tls-certs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.353983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-credential-keys\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pks9z\" (UniqueName: \"kubernetes.io/projected/90275cb5-d75d-4fbc-b84d-31fc918a19d1-kube-api-access-pks9z\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data-custom\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-config-data\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25947cd-d79d-4983-abfa-5d0f80850cdb-logs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-combined-ca-bundle\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sdbw\" (UniqueName: \"kubernetes.io/projected/c25947cd-d79d-4983-abfa-5d0f80850cdb-kube-api-access-6sdbw\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-fernet-keys\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-internal-tls-certs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " 
pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.354648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.361157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7d947f86fd-wpfh7"] Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244hb\" (UniqueName: \"kubernetes.io/projected/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-kube-api-access-244hb\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-scripts\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-public-tls-certs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-credential-keys\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pks9z\" (UniqueName: \"kubernetes.io/projected/90275cb5-d75d-4fbc-b84d-31fc918a19d1-kube-api-access-pks9z\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456341 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data-custom\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-httpd-config\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-config-data\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25947cd-d79d-4983-abfa-5d0f80850cdb-logs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-ovndb-tls-certs\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-combined-ca-bundle\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-combined-ca-bundle\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sdbw\" (UniqueName: \"kubernetes.io/projected/c25947cd-d79d-4983-abfa-5d0f80850cdb-kube-api-access-6sdbw\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-fernet-keys\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc 
kubenswrapper[4707]: I0121 16:15:53.456581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-internal-tls-certs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.456637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-config\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.475908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25947cd-d79d-4983-abfa-5d0f80850cdb-logs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.558257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-httpd-config\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.558324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-ovndb-tls-certs\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.558345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-combined-ca-bundle\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.558477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-config\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 
16:15:53.558510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244hb\" (UniqueName: \"kubernetes.io/projected/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-kube-api-access-244hb\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.648115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-scripts\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.648473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-public-tls-certs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.648663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-combined-ca-bundle\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.650292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-internal-tls-certs\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.650686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-fernet-keys\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.650732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data-custom\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.650773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-credential-keys\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.651779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.651971 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.652742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.653036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.653502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-config-data\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.653970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sdbw\" (UniqueName: \"kubernetes.io/projected/c25947cd-d79d-4983-abfa-5d0f80850cdb-kube-api-access-6sdbw\") pod \"barbican-api-5894fb5d9b-krfgp\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.654399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pks9z\" (UniqueName: \"kubernetes.io/projected/90275cb5-d75d-4fbc-b84d-31fc918a19d1-kube-api-access-pks9z\") pod \"keystone-7d947f86fd-wpfh7\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.655291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-config\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.656228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-ovndb-tls-certs\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.658148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-httpd-config\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.663145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-combined-ca-bundle\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.663546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244hb\" (UniqueName: \"kubernetes.io/projected/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-kube-api-access-244hb\") pod \"neutron-6f8c6b4fd7-t478h\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.673311 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.679562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.684399 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.782214 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.839385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864061 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ce0f55d-19f5-4878-9173-3deaddab452c-logs\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r666z\" (UniqueName: \"kubernetes.io/projected/7ce0f55d-19f5-4878-9173-3deaddab452c-kube-api-access-r666z\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-scripts\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-combined-ca-bundle\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7ce0f55d-19f5-4878-9173-3deaddab452c-etc-machine-id\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data-custom\") pod \"7ce0f55d-19f5-4878-9173-3deaddab452c\" (UID: \"7ce0f55d-19f5-4878-9173-3deaddab452c\") " Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.864590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce0f55d-19f5-4878-9173-3deaddab452c-logs" (OuterVolumeSpecName: "logs") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.865017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ce0f55d-19f5-4878-9173-3deaddab452c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.865245 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ce0f55d-19f5-4878-9173-3deaddab452c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.865266 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ce0f55d-19f5-4878-9173-3deaddab452c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.869142 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.871969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce0f55d-19f5-4878-9173-3deaddab452c-kube-api-access-r666z" (OuterVolumeSpecName: "kube-api-access-r666z") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "kube-api-access-r666z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.884549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-scripts" (OuterVolumeSpecName: "scripts") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.901457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.930740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data" (OuterVolumeSpecName: "config-data") pod "7ce0f55d-19f5-4878-9173-3deaddab452c" (UID: "7ce0f55d-19f5-4878-9173-3deaddab452c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.949104 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerID="0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5" exitCode=0 Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.949128 4707 generic.go:334] "Generic (PLEG): container finished" podID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerID="c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572" exitCode=143 Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.950407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7ce0f55d-19f5-4878-9173-3deaddab452c","Type":"ContainerDied","Data":"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5"} Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.950443 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.950454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7ce0f55d-19f5-4878-9173-3deaddab452c","Type":"ContainerDied","Data":"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572"} Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.950464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"7ce0f55d-19f5-4878-9173-3deaddab452c","Type":"ContainerDied","Data":"9b529da1710488ce0bb19b6f6de1eb147a8868fac6f635c9e674228b462176b7"} Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.950480 4707 scope.go:117] "RemoveContainer" containerID="0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.950598 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.951796 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.959399 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.967214 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.967251 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.967262 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.967276 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r666z\" (UniqueName: \"kubernetes.io/projected/7ce0f55d-19f5-4878-9173-3deaddab452c-kube-api-access-r666z\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.967286 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ce0f55d-19f5-4878-9173-3deaddab452c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:53 crc kubenswrapper[4707]: I0121 16:15:53.991022 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.004024 4707 scope.go:117] "RemoveContainer" containerID="c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.016919 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.033844 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:54 crc kubenswrapper[4707]: E0121 16:15:54.035126 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api-log" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.035152 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api-log" Jan 21 16:15:54 crc kubenswrapper[4707]: E0121 16:15:54.035304 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.035320 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.036221 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api-log" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.036244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" containerName="cinder-api" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.038895 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.048441 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.048495 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.048641 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.059261 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.070877 4707 scope.go:117] "RemoveContainer" containerID="0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5" Jan 21 16:15:54 crc kubenswrapper[4707]: E0121 16:15:54.073882 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5\": container with ID starting with 0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5 not found: ID does not exist" containerID="0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.073953 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5"} err="failed to get container status \"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5\": rpc error: code = NotFound desc = could not find container \"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5\": container with ID starting with 0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5 not found: ID does not exist" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.073981 4707 scope.go:117] "RemoveContainer" containerID="c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572" Jan 21 16:15:54 crc kubenswrapper[4707]: E0121 16:15:54.074518 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572\": container with ID starting with c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572 not found: ID does not exist" containerID="c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.074587 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572"} err="failed to get container status \"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572\": rpc error: code = NotFound desc = could not find container \"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572\": container with ID starting with c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572 not found: ID does not exist" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.074619 4707 scope.go:117] "RemoveContainer" containerID="0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.075069 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5"} err="failed to get container status \"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5\": rpc error: code = NotFound desc = could not find container \"0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5\": container with ID starting with 0f8d3aedf6c167aa3713286b7a8601f5107747cb783bc040cdf7e1c649748ab5 not found: ID does not exist" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.075103 4707 scope.go:117] "RemoveContainer" containerID="c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.080094 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572"} err="failed to get container status \"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572\": rpc error: code = NotFound desc = could not find container \"c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572\": container with ID starting with c836cf9678846bd52ecdfaf77abb76818b9cf5a6c4cacba93836e12382feb572 not found: ID does not exist" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28274132-ebaa-415b-a596-28b30796449c-logs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28274132-ebaa-415b-a596-28b30796449c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 
16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-scripts\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.174968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data-custom\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.175005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdmd8\" (UniqueName: \"kubernetes.io/projected/28274132-ebaa-415b-a596-28b30796449c-kube-api-access-rdmd8\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.176111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7d947f86fd-wpfh7"] Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.207716 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.207907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.260179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.277150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28274132-ebaa-415b-a596-28b30796449c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-scripts\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data-custom\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdmd8\" (UniqueName: \"kubernetes.io/projected/28274132-ebaa-415b-a596-28b30796449c-kube-api-access-rdmd8\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28274132-ebaa-415b-a596-28b30796449c-logs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.278981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28274132-ebaa-415b-a596-28b30796449c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.284733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.287155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28274132-ebaa-415b-a596-28b30796449c-logs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.289044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data-custom\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.295439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.296454 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-scripts\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.297634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.303533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.307320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdmd8\" (UniqueName: \"kubernetes.io/projected/28274132-ebaa-415b-a596-28b30796449c-kube-api-access-rdmd8\") pod \"cinder-api-0\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.344293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp"] Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.417089 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.572539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h"] Jan 21 16:15:54 crc kubenswrapper[4707]: W0121 16:15:54.589293 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68f0c9a1_1b61_4381_9d0d_b006b5bcda52.slice/crio-db94fb9b99c29f99329e283846c83fbd76dde6731c8e9a503365e5cdef254664 WatchSource:0}: Error finding container db94fb9b99c29f99329e283846c83fbd76dde6731c8e9a503365e5cdef254664: Status 404 returned error can't find the container with id db94fb9b99c29f99329e283846c83fbd76dde6731c8e9a503365e5cdef254664 Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.717459 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.977937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" event={"ID":"c25947cd-d79d-4983-abfa-5d0f80850cdb","Type":"ContainerStarted","Data":"120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06"} Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.978254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" event={"ID":"c25947cd-d79d-4983-abfa-5d0f80850cdb","Type":"ContainerStarted","Data":"edf53ef52dd9868f5e3e85f0e21b3d871899444426637012805d8c871d99cfb0"} Jan 21 16:15:54 crc kubenswrapper[4707]: I0121 16:15:54.996888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" event={"ID":"68f0c9a1-1b61-4381-9d0d-b006b5bcda52","Type":"ContainerStarted","Data":"db94fb9b99c29f99329e283846c83fbd76dde6731c8e9a503365e5cdef254664"} Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.000989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"28274132-ebaa-415b-a596-28b30796449c","Type":"ContainerStarted","Data":"462db79e6c81a738011acae018aa85f23ff0d788b376df5efc871b765d328cbb"} Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.006053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" event={"ID":"90275cb5-d75d-4fbc-b84d-31fc918a19d1","Type":"ContainerStarted","Data":"38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e"} Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.006106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" event={"ID":"90275cb5-d75d-4fbc-b84d-31fc918a19d1","Type":"ContainerStarted","Data":"7c2b6db4473d8ecfc0f5132feda7e9eb77e1d7bcd769cfb486f6d3f0573eb63c"} Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.008480 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.008507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.042771 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" podStartSLOduration=2.042753397 podStartE2EDuration="2.042753397s" podCreationTimestamp="2026-01-21 
16:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:55.035683975 +0000 UTC m=+4452.217200196" watchObservedRunningTime="2026-01-21 16:15:55.042753397 +0000 UTC m=+4452.224269619" Jan 21 16:15:55 crc kubenswrapper[4707]: I0121 16:15:55.230595 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce0f55d-19f5-4878-9173-3deaddab452c" path="/var/lib/kubelet/pods/7ce0f55d-19f5-4878-9173-3deaddab452c/volumes" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.021713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" event={"ID":"c25947cd-d79d-4983-abfa-5d0f80850cdb","Type":"ContainerStarted","Data":"67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d"} Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.023979 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.033768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" event={"ID":"68f0c9a1-1b61-4381-9d0d-b006b5bcda52","Type":"ContainerStarted","Data":"39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce"} Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.033831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" event={"ID":"68f0c9a1-1b61-4381-9d0d-b006b5bcda52","Type":"ContainerStarted","Data":"da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0"} Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.033986 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.064107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"28274132-ebaa-415b-a596-28b30796449c","Type":"ContainerStarted","Data":"b1866e7c5c31c45c31a0a123206f96d06c54b25f3bf493859e89f55cf818e3aa"} Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.064278 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.064356 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.064420 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.064839 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.084933 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" podStartSLOduration=3.084920229 podStartE2EDuration="3.084920229s" podCreationTimestamp="2026-01-21 16:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:56.07555012 +0000 UTC m=+4453.257066343" watchObservedRunningTime="2026-01-21 16:15:56.084920229 +0000 UTC m=+4453.266436442" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.087358 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" podStartSLOduration=3.087350829 podStartE2EDuration="3.087350829s" podCreationTimestamp="2026-01-21 16:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:56.055290727 +0000 UTC m=+4453.236806949" watchObservedRunningTime="2026-01-21 16:15:56.087350829 +0000 UTC m=+4453.268867051" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.104831 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.104795931 podStartE2EDuration="3.104795931s" podCreationTimestamp="2026-01-21 16:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:56.096673699 +0000 UTC m=+4453.278189921" watchObservedRunningTime="2026-01-21 16:15:56.104795931 +0000 UTC m=+4453.286312153" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.662239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.727142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:15:56 crc kubenswrapper[4707]: I0121 16:15:56.834490 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.077406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"28274132-ebaa-415b-a596-28b30796449c","Type":"ContainerStarted","Data":"a35a0a3b6e7558ea488bb3ad5745e40ecbe3628ac61c149c4cd49b174c91a74f"} Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.078313 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.078334 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.079172 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.411719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-69b97779c8-jlqt9"] Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.413803 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.416048 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.419604 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.437899 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-69b97779c8-jlqt9"] Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-httpd-config\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-config\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-internal-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-public-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-combined-ca-bundle\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-ovndb-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.474493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvnc\" (UniqueName: \"kubernetes.io/projected/64190d62-9a33-4999-bc4a-3eff62940162-kube-api-access-4vvnc\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 
16:15:57.576213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvnc\" (UniqueName: \"kubernetes.io/projected/64190d62-9a33-4999-bc4a-3eff62940162-kube-api-access-4vvnc\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.576329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-httpd-config\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.576366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-config\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.576393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-internal-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.576413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-public-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.576495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-combined-ca-bundle\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.576545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-ovndb-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.585192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-combined-ca-bundle\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.585386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-internal-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.589425 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-public-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.589477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-config\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.589516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-httpd-config\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.589524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-ovndb-tls-certs\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.596015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvnc\" (UniqueName: \"kubernetes.io/projected/64190d62-9a33-4999-bc4a-3eff62940162-kube-api-access-4vvnc\") pod \"neutron-69b97779c8-jlqt9\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.729479 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.729963 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.765041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.852549 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.869457 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:57 crc kubenswrapper[4707]: I0121 16:15:57.989507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:15:58 crc kubenswrapper[4707]: I0121 16:15:58.084599 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="cinder-scheduler" containerID="cri-o://a4c0bcb50d74e4edcf57b0a0c6d0174d6f705cb46fa205f5901dbdcbab03caa3" gracePeriod=30 Jan 21 16:15:58 crc kubenswrapper[4707]: I0121 16:15:58.085588 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="probe" containerID="cri-o://2a1dc6c4ab4a2697f15febf2854084fc6b8cc8c57b6b4ba7851ac9717f05cfe7" gracePeriod=30 Jan 21 16:15:58 crc kubenswrapper[4707]: I0121 16:15:58.212670 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-69b97779c8-jlqt9"] Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.096306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" event={"ID":"64190d62-9a33-4999-bc4a-3eff62940162","Type":"ContainerStarted","Data":"5b3ebbd8a16f076cbe84710aa19479370ca9661ce99511215aea7ce272f068eb"} Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.097590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" event={"ID":"64190d62-9a33-4999-bc4a-3eff62940162","Type":"ContainerStarted","Data":"d2d7c5789da9a58eb70a75b4a3d6ee778cf3a4ed19f27c737a94bbdf5be149df"} Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.097721 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.097824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" event={"ID":"64190d62-9a33-4999-bc4a-3eff62940162","Type":"ContainerStarted","Data":"64e00f319a49f010f4afb5370013e1976165b3f1ae2cd24d5e334ea917351241"} Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.102300 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerID="2a1dc6c4ab4a2697f15febf2854084fc6b8cc8c57b6b4ba7851ac9717f05cfe7" exitCode=0 Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.102345 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerID="a4c0bcb50d74e4edcf57b0a0c6d0174d6f705cb46fa205f5901dbdcbab03caa3" exitCode=0 Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.102437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" 
event={"ID":"bd341c6b-6920-4954-a5d8-51a7e5ab13bc","Type":"ContainerDied","Data":"2a1dc6c4ab4a2697f15febf2854084fc6b8cc8c57b6b4ba7851ac9717f05cfe7"} Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.102552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bd341c6b-6920-4954-a5d8-51a7e5ab13bc","Type":"ContainerDied","Data":"a4c0bcb50d74e4edcf57b0a0c6d0174d6f705cb46fa205f5901dbdcbab03caa3"} Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.128615 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" podStartSLOduration=2.128599687 podStartE2EDuration="2.128599687s" podCreationTimestamp="2026-01-21 16:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:15:59.119103582 +0000 UTC m=+4456.300619805" watchObservedRunningTime="2026-01-21 16:15:59.128599687 +0000 UTC m=+4456.310115910" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.536741 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.616750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-combined-ca-bundle\") pod \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.616957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data\") pod \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.617001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvh5s\" (UniqueName: \"kubernetes.io/projected/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-kube-api-access-dvh5s\") pod \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.617071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-etc-machine-id\") pod \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.617094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-scripts\") pod \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.617171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data-custom\") pod \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\" (UID: \"bd341c6b-6920-4954-a5d8-51a7e5ab13bc\") " Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.617875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bd341c6b-6920-4954-a5d8-51a7e5ab13bc" (UID: "bd341c6b-6920-4954-a5d8-51a7e5ab13bc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.618022 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.626992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-kube-api-access-dvh5s" (OuterVolumeSpecName: "kube-api-access-dvh5s") pod "bd341c6b-6920-4954-a5d8-51a7e5ab13bc" (UID: "bd341c6b-6920-4954-a5d8-51a7e5ab13bc"). InnerVolumeSpecName "kube-api-access-dvh5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.628907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd341c6b-6920-4954-a5d8-51a7e5ab13bc" (UID: "bd341c6b-6920-4954-a5d8-51a7e5ab13bc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.639923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-scripts" (OuterVolumeSpecName: "scripts") pod "bd341c6b-6920-4954-a5d8-51a7e5ab13bc" (UID: "bd341c6b-6920-4954-a5d8-51a7e5ab13bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.671964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd341c6b-6920-4954-a5d8-51a7e5ab13bc" (UID: "bd341c6b-6920-4954-a5d8-51a7e5ab13bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.720837 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvh5s\" (UniqueName: \"kubernetes.io/projected/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-kube-api-access-dvh5s\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.720870 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.720883 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.720893 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.759054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data" (OuterVolumeSpecName: "config-data") pod "bd341c6b-6920-4954-a5d8-51a7e5ab13bc" (UID: "bd341c6b-6920-4954-a5d8-51a7e5ab13bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:59 crc kubenswrapper[4707]: I0121 16:15:59.822884 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd341c6b-6920-4954-a5d8-51a7e5ab13bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.120639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bd341c6b-6920-4954-a5d8-51a7e5ab13bc","Type":"ContainerDied","Data":"3becbf7604c5862b42c05b23e6b4803de9ecbedbe8ef4981fd7e7246f42e8d1d"} Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.120934 4707 scope.go:117] "RemoveContainer" containerID="2a1dc6c4ab4a2697f15febf2854084fc6b8cc8c57b6b4ba7851ac9717f05cfe7" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.120698 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.161494 4707 scope.go:117] "RemoveContainer" containerID="a4c0bcb50d74e4edcf57b0a0c6d0174d6f705cb46fa205f5901dbdcbab03caa3" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.163790 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.171648 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.177301 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:16:00 crc kubenswrapper[4707]: E0121 16:16:00.177689 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="cinder-scheduler" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.177709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="cinder-scheduler" Jan 21 16:16:00 crc kubenswrapper[4707]: E0121 16:16:00.177731 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="probe" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.177738 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="probe" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.177927 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="cinder-scheduler" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.177948 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" containerName="probe" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.178929 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.183088 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.198479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.228093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.228267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.228425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-scripts\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.228565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eddda5d-3cfc-4038-821c-4c628aeb3c69-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.228618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.228686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbjj8\" (UniqueName: \"kubernetes.io/projected/1eddda5d-3cfc-4038-821c-4c628aeb3c69-kube-api-access-hbjj8\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-scripts\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eddda5d-3cfc-4038-821c-4c628aeb3c69-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbjj8\" (UniqueName: \"kubernetes.io/projected/1eddda5d-3cfc-4038-821c-4c628aeb3c69-kube-api-access-hbjj8\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.330630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eddda5d-3cfc-4038-821c-4c628aeb3c69-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.334378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-scripts\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.334484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.335313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.335483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.344929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-hbjj8\" (UniqueName: \"kubernetes.io/projected/1eddda5d-3cfc-4038-821c-4c628aeb3c69-kube-api-access-hbjj8\") pod \"cinder-scheduler-0\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:00 crc kubenswrapper[4707]: I0121 16:16:00.501047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:01 crc kubenswrapper[4707]: I0121 16:16:01.039156 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:16:01 crc kubenswrapper[4707]: I0121 16:16:01.193086 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd341c6b-6920-4954-a5d8-51a7e5ab13bc" path="/var/lib/kubelet/pods/bd341c6b-6920-4954-a5d8-51a7e5ab13bc/volumes" Jan 21 16:16:01 crc kubenswrapper[4707]: I0121 16:16:01.251280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:16:02 crc kubenswrapper[4707]: I0121 16:16:02.163099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1eddda5d-3cfc-4038-821c-4c628aeb3c69","Type":"ContainerStarted","Data":"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5"} Jan 21 16:16:02 crc kubenswrapper[4707]: I0121 16:16:02.163440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1eddda5d-3cfc-4038-821c-4c628aeb3c69","Type":"ContainerStarted","Data":"202dc5272231aec953a6ab5106342e284658733962780834a351c961bc7b2387"} Jan 21 16:16:02 crc kubenswrapper[4707]: I0121 16:16:02.284735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:16:02 crc kubenswrapper[4707]: I0121 16:16:02.343112 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc"] Jan 21 16:16:02 crc kubenswrapper[4707]: I0121 16:16:02.343752 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api-log" containerID="cri-o://5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310" gracePeriod=30 Jan 21 16:16:02 crc kubenswrapper[4707]: I0121 16:16:02.344195 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api" containerID="cri-o://511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2" gracePeriod=30 Jan 21 16:16:03 crc kubenswrapper[4707]: I0121 16:16:03.175122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1eddda5d-3cfc-4038-821c-4c628aeb3c69","Type":"ContainerStarted","Data":"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e"} Jan 21 16:16:03 crc kubenswrapper[4707]: I0121 16:16:03.176963 4707 generic.go:334] "Generic (PLEG): container finished" podID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerID="5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310" exitCode=143 Jan 21 16:16:03 crc kubenswrapper[4707]: I0121 16:16:03.177012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" event={"ID":"707fba96-29a7-4fd4-ad1f-4a9cfae28021","Type":"ContainerDied","Data":"5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310"} Jan 21 16:16:03 crc kubenswrapper[4707]: I0121 16:16:03.195777 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.195757578 podStartE2EDuration="3.195757578s" podCreationTimestamp="2026-01-21 16:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:03.190269589 +0000 UTC m=+4460.371785812" watchObservedRunningTime="2026-01-21 16:16:03.195757578 +0000 UTC m=+4460.377273800" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.458908 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": dial tcp 10.217.0.157:9311: connect: connection refused" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.458908 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": dial tcp 10.217.0.157:9311: connect: connection refused" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.502189 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.927568 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.979643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data\") pod \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.979845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-combined-ca-bundle\") pod \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.979908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707fba96-29a7-4fd4-ad1f-4a9cfae28021-logs\") pod \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.980003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs2lt\" (UniqueName: \"kubernetes.io/projected/707fba96-29a7-4fd4-ad1f-4a9cfae28021-kube-api-access-xs2lt\") pod \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.980314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data-custom\") pod \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\" (UID: \"707fba96-29a7-4fd4-ad1f-4a9cfae28021\") " Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.981060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707fba96-29a7-4fd4-ad1f-4a9cfae28021-logs" (OuterVolumeSpecName: "logs") pod "707fba96-29a7-4fd4-ad1f-4a9cfae28021" (UID: "707fba96-29a7-4fd4-ad1f-4a9cfae28021"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.986742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707fba96-29a7-4fd4-ad1f-4a9cfae28021-kube-api-access-xs2lt" (OuterVolumeSpecName: "kube-api-access-xs2lt") pod "707fba96-29a7-4fd4-ad1f-4a9cfae28021" (UID: "707fba96-29a7-4fd4-ad1f-4a9cfae28021"). InnerVolumeSpecName "kube-api-access-xs2lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:05 crc kubenswrapper[4707]: I0121 16:16:05.990033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "707fba96-29a7-4fd4-ad1f-4a9cfae28021" (UID: "707fba96-29a7-4fd4-ad1f-4a9cfae28021"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.005092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707fba96-29a7-4fd4-ad1f-4a9cfae28021" (UID: "707fba96-29a7-4fd4-ad1f-4a9cfae28021"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.019499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data" (OuterVolumeSpecName: "config-data") pod "707fba96-29a7-4fd4-ad1f-4a9cfae28021" (UID: "707fba96-29a7-4fd4-ad1f-4a9cfae28021"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.082270 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.082302 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.082314 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707fba96-29a7-4fd4-ad1f-4a9cfae28021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.082325 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707fba96-29a7-4fd4-ad1f-4a9cfae28021-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.082340 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs2lt\" (UniqueName: \"kubernetes.io/projected/707fba96-29a7-4fd4-ad1f-4a9cfae28021-kube-api-access-xs2lt\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.123024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.207145 4707 generic.go:334] "Generic (PLEG): container finished" podID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerID="511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2" exitCode=0 Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.207226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" event={"ID":"707fba96-29a7-4fd4-ad1f-4a9cfae28021","Type":"ContainerDied","Data":"511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2"} Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.207235 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.207256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc" event={"ID":"707fba96-29a7-4fd4-ad1f-4a9cfae28021","Type":"ContainerDied","Data":"5685ec8c791774b73780dcb61cb55f22f6384000868bfe83e69723d342386d37"} Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.207295 4707 scope.go:117] "RemoveContainer" containerID="511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.231564 4707 scope.go:117] "RemoveContainer" containerID="5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.245269 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc"] Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.257932 4707 scope.go:117] "RemoveContainer" containerID="511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2" Jan 21 16:16:06 crc kubenswrapper[4707]: E0121 16:16:06.258888 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2\": container with ID starting with 511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2 not found: ID does not exist" containerID="511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.258927 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2"} err="failed to get container status \"511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2\": rpc error: code = NotFound desc = could not find container \"511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2\": container with ID starting with 511433e58daf76ebbac58069d90d5a7d7a8b6f69d93533fe35d856e2fbbf90b2 not found: ID does not exist" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.258956 4707 scope.go:117] "RemoveContainer" containerID="5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310" Jan 21 16:16:06 crc kubenswrapper[4707]: E0121 16:16:06.259514 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310\": container with ID starting with 5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310 not found: ID does not exist" containerID="5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.259540 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310"} err="failed to get container status \"5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310\": rpc error: code = NotFound desc = could not find container \"5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310\": container with ID starting with 5ca3f147ae70cf2dd4f42f8c24cc8e20b41c3d3e74aa482d6027c942bb241310 not found: ID does not exist" Jan 21 16:16:06 crc kubenswrapper[4707]: I0121 16:16:06.277937 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/barbican-api-75d65b4bdd-zn2lc"] Jan 21 16:16:07 crc kubenswrapper[4707]: I0121 16:16:07.193544 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" path="/var/lib/kubelet/pods/707fba96-29a7-4fd4-ad1f-4a9cfae28021/volumes" Jan 21 16:16:10 crc kubenswrapper[4707]: I0121 16:16:10.693739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:16:15 crc kubenswrapper[4707]: I0121 16:16:15.162270 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:16:15 crc kubenswrapper[4707]: I0121 16:16:15.271766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:16:15 crc kubenswrapper[4707]: I0121 16:16:15.325511 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:15 crc kubenswrapper[4707]: I0121 16:16:15.707523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:16:15 crc kubenswrapper[4707]: I0121 16:16:15.719590 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:16:15 crc kubenswrapper[4707]: I0121 16:16:15.771068 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6c876dcb7d-x45vx"] Jan 21 16:16:16 crc kubenswrapper[4707]: I0121 16:16:16.296074 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-log" containerID="cri-o://2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23" gracePeriod=30 Jan 21 16:16:16 crc kubenswrapper[4707]: I0121 16:16:16.296269 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-api" containerID="cri-o://4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c" gracePeriod=30 Jan 21 16:16:17 crc kubenswrapper[4707]: I0121 16:16:17.303921 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a330953-fa65-465d-b502-3143953dd0ab" containerID="2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23" exitCode=143 Jan 21 16:16:17 crc kubenswrapper[4707]: I0121 16:16:17.304612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" event={"ID":"1a330953-fa65-465d-b502-3143953dd0ab","Type":"ContainerDied","Data":"2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23"} Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.074241 4707 scope.go:117] "RemoveContainer" containerID="ad264b7e1f00fc4984b3fc2c3c7f5bb618cb111057cf37ecd2ef5c32ea47b50d" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.109848 4707 scope.go:117] "RemoveContainer" containerID="1d299162f4afa98964d4ec11db17d73e66c0c0a221338dbed9777d7db58d8b6f" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.146960 4707 scope.go:117] "RemoveContainer" containerID="23373f1cfb2f8dcf2f285c27279eea29343e9ee4d18981e295e313a664f144f6" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.692539 4707 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.828867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-combined-ca-bundle\") pod \"1a330953-fa65-465d-b502-3143953dd0ab\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.828924 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-config-data\") pod \"1a330953-fa65-465d-b502-3143953dd0ab\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.829005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/1a330953-fa65-465d-b502-3143953dd0ab-kube-api-access-p2hxc\") pod \"1a330953-fa65-465d-b502-3143953dd0ab\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.829049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-scripts\") pod \"1a330953-fa65-465d-b502-3143953dd0ab\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.829067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a330953-fa65-465d-b502-3143953dd0ab-logs\") pod \"1a330953-fa65-465d-b502-3143953dd0ab\" (UID: \"1a330953-fa65-465d-b502-3143953dd0ab\") " Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.829683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a330953-fa65-465d-b502-3143953dd0ab-logs" (OuterVolumeSpecName: "logs") pod "1a330953-fa65-465d-b502-3143953dd0ab" (UID: "1a330953-fa65-465d-b502-3143953dd0ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.836205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-scripts" (OuterVolumeSpecName: "scripts") pod "1a330953-fa65-465d-b502-3143953dd0ab" (UID: "1a330953-fa65-465d-b502-3143953dd0ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.836355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a330953-fa65-465d-b502-3143953dd0ab-kube-api-access-p2hxc" (OuterVolumeSpecName: "kube-api-access-p2hxc") pod "1a330953-fa65-465d-b502-3143953dd0ab" (UID: "1a330953-fa65-465d-b502-3143953dd0ab"). InnerVolumeSpecName "kube-api-access-p2hxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.868236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a330953-fa65-465d-b502-3143953dd0ab" (UID: "1a330953-fa65-465d-b502-3143953dd0ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.868621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-config-data" (OuterVolumeSpecName: "config-data") pod "1a330953-fa65-465d-b502-3143953dd0ab" (UID: "1a330953-fa65-465d-b502-3143953dd0ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.931441 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/1a330953-fa65-465d-b502-3143953dd0ab-kube-api-access-p2hxc\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.931473 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.931484 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a330953-fa65-465d-b502-3143953dd0ab-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.931497 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:19 crc kubenswrapper[4707]: I0121 16:16:19.931506 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a330953-fa65-465d-b502-3143953dd0ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.345033 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a330953-fa65-465d-b502-3143953dd0ab" containerID="4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c" exitCode=0 Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.345073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" event={"ID":"1a330953-fa65-465d-b502-3143953dd0ab","Type":"ContainerDied","Data":"4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c"} Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.345082 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.345095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c876dcb7d-x45vx" event={"ID":"1a330953-fa65-465d-b502-3143953dd0ab","Type":"ContainerDied","Data":"48e3cd6813db854c805d4e11f38563c55dcd976be791929874e5aed0f9acf24c"} Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.345110 4707 scope.go:117] "RemoveContainer" containerID="4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.365791 4707 scope.go:117] "RemoveContainer" containerID="2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.378647 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6c876dcb7d-x45vx"] Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.387654 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6c876dcb7d-x45vx"] Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.400835 4707 scope.go:117] "RemoveContainer" containerID="4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c" Jan 21 16:16:20 crc kubenswrapper[4707]: E0121 16:16:20.401247 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c\": container with ID starting with 4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c not found: ID does not exist" containerID="4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.401280 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c"} err="failed to get container status \"4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c\": rpc error: code = NotFound desc = could not find container \"4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c\": container with ID starting with 4dca84efc8545c7a631048d70adb445380f87ecdc665d1eae6c48fa20695c05c not found: ID does not exist" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.401298 4707 scope.go:117] "RemoveContainer" containerID="2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23" Jan 21 16:16:20 crc kubenswrapper[4707]: E0121 16:16:20.401559 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23\": container with ID starting with 2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23 not found: ID does not exist" containerID="2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23" Jan 21 16:16:20 crc kubenswrapper[4707]: I0121 16:16:20.401580 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23"} err="failed to get container status \"2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23\": rpc error: code = NotFound desc = could not find container \"2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23\": container with ID starting with 2294c9ec8587a36a099a37bc5906ead812d81ca3cb73696c1018006e0ce74f23 
not found: ID does not exist" Jan 21 16:16:21 crc kubenswrapper[4707]: I0121 16:16:21.193600 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a330953-fa65-465d-b502-3143953dd0ab" path="/var/lib/kubelet/pods/1a330953-fa65-465d-b502-3143953dd0ab/volumes" Jan 21 16:16:23 crc kubenswrapper[4707]: I0121 16:16:23.970117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.046639 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.887888 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:16:25 crc kubenswrapper[4707]: E0121 16:16:25.888530 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-api" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888548 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-api" Jan 21 16:16:25 crc kubenswrapper[4707]: E0121 16:16:25.888561 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api-log" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888567 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api-log" Jan 21 16:16:25 crc kubenswrapper[4707]: E0121 16:16:25.888584 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-log" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888589 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-log" Jan 21 16:16:25 crc kubenswrapper[4707]: E0121 16:16:25.888605 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888612 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888773 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-log" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888789 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888800 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a330953-fa65-465d-b502-3143953dd0ab" containerName="placement-api" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.888824 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="707fba96-29a7-4fd4-ad1f-4a9cfae28021" containerName="barbican-api-log" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.889411 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.891838 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.891876 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.893867 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.898003 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-45k5v" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.949007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-combined-ca-bundle\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.949228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config-secret\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.949272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:25 crc kubenswrapper[4707]: I0121 16:16:25.949469 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crn52\" (UniqueName: \"kubernetes.io/projected/453b486c-659b-4fc0-853e-2c8e17129d27-kube-api-access-crn52\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.051867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-combined-ca-bundle\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.052098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config-secret\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.052131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" 
Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.052261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crn52\" (UniqueName: \"kubernetes.io/projected/453b486c-659b-4fc0-853e-2c8e17129d27-kube-api-access-crn52\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.053383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.058431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-combined-ca-bundle\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.066402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config-secret\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.068510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crn52\" (UniqueName: \"kubernetes.io/projected/453b486c-659b-4fc0-853e-2c8e17129d27-kube-api-access-crn52\") pod \"openstackclient\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.204904 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:16:26 crc kubenswrapper[4707]: W0121 16:16:26.638826 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod453b486c_659b_4fc0_853e_2c8e17129d27.slice/crio-e1a617ee687a03ec02b2d658bd8bdbdbf4704a580c35a2712bef12f196f20e1c WatchSource:0}: Error finding container e1a617ee687a03ec02b2d658bd8bdbdbf4704a580c35a2712bef12f196f20e1c: Status 404 returned error can't find the container with id e1a617ee687a03ec02b2d658bd8bdbdbf4704a580c35a2712bef12f196f20e1c Jan 21 16:16:26 crc kubenswrapper[4707]: I0121 16:16:26.642518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.414770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"453b486c-659b-4fc0-853e-2c8e17129d27","Type":"ContainerStarted","Data":"49fdcf9618939364f2f9c95eb510f65f5cbb896c2cbd65a978458a287b5f3b5c"} Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.415383 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"453b486c-659b-4fc0-853e-2c8e17129d27","Type":"ContainerStarted","Data":"e1a617ee687a03ec02b2d658bd8bdbdbf4704a580c35a2712bef12f196f20e1c"} Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.429350 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.429332263 podStartE2EDuration="2.429332263s" podCreationTimestamp="2026-01-21 16:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:27.428270727 +0000 UTC m=+4484.609786949" watchObservedRunningTime="2026-01-21 16:16:27.429332263 +0000 UTC m=+4484.610848484" Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.764269 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.845180 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h"] Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.845447 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-api" containerID="cri-o://da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0" gracePeriod=30 Jan 21 16:16:27 crc kubenswrapper[4707]: I0121 16:16:27.845585 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-httpd" containerID="cri-o://39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce" gracePeriod=30 Jan 21 16:16:28 crc kubenswrapper[4707]: I0121 16:16:28.424006 4707 generic.go:334] "Generic (PLEG): container finished" podID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerID="39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce" exitCode=0 Jan 21 16:16:28 crc kubenswrapper[4707]: I0121 16:16:28.424195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" 
event={"ID":"68f0c9a1-1b61-4381-9d0d-b006b5bcda52","Type":"ContainerDied","Data":"39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce"} Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.214471 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45"] Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.216004 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.219394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.219648 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.224521 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.227282 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45"] Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-internal-tls-certs\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267197 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-config-data\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-etc-swift\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-combined-ca-bundle\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-public-tls-certs\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrpv\" (UniqueName: 
\"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-kube-api-access-fkrpv\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-log-httpd\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.267383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-run-httpd\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.297002 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-combined-ca-bundle\") pod \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-httpd-config\") pod \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-config\") pod \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-244hb\" (UniqueName: \"kubernetes.io/projected/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-kube-api-access-244hb\") pod \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-ovndb-tls-certs\") pod \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\" (UID: \"68f0c9a1-1b61-4381-9d0d-b006b5bcda52\") " Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrpv\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-kube-api-access-fkrpv\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368897 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-log-httpd\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.368917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-run-httpd\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.369035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-internal-tls-certs\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.369056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-config-data\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.369078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-etc-swift\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.369104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-combined-ca-bundle\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.369123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-public-tls-certs\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.371381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-log-httpd\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.371142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-run-httpd\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.376274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "68f0c9a1-1b61-4381-9d0d-b006b5bcda52" (UID: "68f0c9a1-1b61-4381-9d0d-b006b5bcda52"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.376991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-config-data\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.376990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-combined-ca-bundle\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.378463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-internal-tls-certs\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.378984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-kube-api-access-244hb" (OuterVolumeSpecName: "kube-api-access-244hb") pod "68f0c9a1-1b61-4381-9d0d-b006b5bcda52" (UID: "68f0c9a1-1b61-4381-9d0d-b006b5bcda52"). InnerVolumeSpecName "kube-api-access-244hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.379523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-public-tls-certs\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.383822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-etc-swift\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.386540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrpv\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-kube-api-access-fkrpv\") pod \"swift-proxy-6886f88d58-xtb45\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.431252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-config" (OuterVolumeSpecName: "config") pod "68f0c9a1-1b61-4381-9d0d-b006b5bcda52" (UID: "68f0c9a1-1b61-4381-9d0d-b006b5bcda52"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.435553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68f0c9a1-1b61-4381-9d0d-b006b5bcda52" (UID: "68f0c9a1-1b61-4381-9d0d-b006b5bcda52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.445438 4707 generic.go:334] "Generic (PLEG): container finished" podID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerID="da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0" exitCode=0 Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.445493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" event={"ID":"68f0c9a1-1b61-4381-9d0d-b006b5bcda52","Type":"ContainerDied","Data":"da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0"} Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.445541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" event={"ID":"68f0c9a1-1b61-4381-9d0d-b006b5bcda52","Type":"ContainerDied","Data":"db94fb9b99c29f99329e283846c83fbd76dde6731c8e9a503365e5cdef254664"} Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.445565 4707 scope.go:117] "RemoveContainer" containerID="39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.445960 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.454389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "68f0c9a1-1b61-4381-9d0d-b006b5bcda52" (UID: "68f0c9a1-1b61-4381-9d0d-b006b5bcda52"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.466684 4707 scope.go:117] "RemoveContainer" containerID="da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.472423 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-244hb\" (UniqueName: \"kubernetes.io/projected/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-kube-api-access-244hb\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.472456 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.472473 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.472582 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.472596 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/68f0c9a1-1b61-4381-9d0d-b006b5bcda52-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.490870 4707 scope.go:117] "RemoveContainer" containerID="39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce" Jan 21 16:16:29 crc kubenswrapper[4707]: E0121 16:16:29.491380 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce\": container with ID starting with 39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce not found: ID does not exist" containerID="39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.491429 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce"} err="failed to get container status \"39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce\": rpc error: code = NotFound desc = could not find container \"39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce\": container with ID starting with 39f57716967d64f0357709c1d44e5317fffd5b55dc0a44173b5fc2a5c8dcf3ce not found: ID does not exist" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.491468 4707 scope.go:117] "RemoveContainer" containerID="da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0" Jan 21 16:16:29 crc kubenswrapper[4707]: E0121 16:16:29.491773 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0\": container with ID starting with da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0 not found: ID does not exist" containerID="da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.491796 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0"} err="failed to get container status \"da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0\": rpc error: code = NotFound desc = could not find container \"da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0\": container with ID starting with da0a28282c0bbb051f803bec7f55b46548d858bc36e86d9bcc17ac358b8c6ae0 not found: ID does not exist" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.532847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.697231 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.698064 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-central-agent" containerID="cri-o://65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08" gracePeriod=30 Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.698238 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="proxy-httpd" containerID="cri-o://f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc" gracePeriod=30 Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.698285 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="sg-core" containerID="cri-o://834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5" gracePeriod=30 Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.698320 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-notification-agent" containerID="cri-o://f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05" gracePeriod=30 Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.773995 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h"] Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.780125 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6f8c6b4fd7-t478h"] Jan 21 16:16:29 crc kubenswrapper[4707]: W0121 16:16:29.996045 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb73f5d48_3358_4ea2_8b12_cfd7118a7337.slice/crio-4015c59744765628cfcba9fe29cb7ef93ca8f3243bdf14f278576c88f5673f58 WatchSource:0}: Error finding container 4015c59744765628cfcba9fe29cb7ef93ca8f3243bdf14f278576c88f5673f58: Status 404 returned error can't find the container with id 4015c59744765628cfcba9fe29cb7ef93ca8f3243bdf14f278576c88f5673f58 Jan 21 16:16:29 crc kubenswrapper[4707]: I0121 16:16:29.996570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45"] Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.459385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" 
event={"ID":"b73f5d48-3358-4ea2-8b12-cfd7118a7337","Type":"ContainerStarted","Data":"c4d2cce318ed39a68499016f449e682bd2f407153f80ddf6bfa77db0849f99da"} Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.459735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.459984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" event={"ID":"b73f5d48-3358-4ea2-8b12-cfd7118a7337","Type":"ContainerStarted","Data":"24a13662b41e6b6bfb847addf208fe2012b93376a6c253308ab52b66871cd124"} Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.459997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" event={"ID":"b73f5d48-3358-4ea2-8b12-cfd7118a7337","Type":"ContainerStarted","Data":"4015c59744765628cfcba9fe29cb7ef93ca8f3243bdf14f278576c88f5673f58"} Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.460012 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.462068 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerID="f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc" exitCode=0 Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.462105 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerID="834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5" exitCode=2 Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.462114 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerID="65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08" exitCode=0 Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.462173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerDied","Data":"f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc"} Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.462205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerDied","Data":"834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5"} Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.462215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerDied","Data":"65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08"} Jan 21 16:16:30 crc kubenswrapper[4707]: I0121 16:16:30.480346 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" podStartSLOduration=1.480334591 podStartE2EDuration="1.480334591s" podCreationTimestamp="2026-01-21 16:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:30.473456898 +0000 UTC m=+4487.654973121" watchObservedRunningTime="2026-01-21 16:16:30.480334591 +0000 UTC m=+4487.661850813" Jan 21 16:16:31 crc kubenswrapper[4707]: I0121 16:16:31.192665 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" path="/var/lib/kubelet/pods/68f0c9a1-1b61-4381-9d0d-b006b5bcda52/volumes" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.763634 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-4rqnw"] Jan 21 16:16:32 crc kubenswrapper[4707]: E0121 16:16:32.764432 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-api" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.764448 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-api" Jan 21 16:16:32 crc kubenswrapper[4707]: E0121 16:16:32.764465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-httpd" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.764472 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-httpd" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.764694 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-api" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.764710 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f0c9a1-1b61-4381-9d0d-b006b5bcda52" containerName="neutron-httpd" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.765472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.772530 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-4rqnw"] Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.859923 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-s6wlc"] Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.861313 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.869385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-s6wlc"] Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.944523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035033e-3578-409b-865f-ec2e3812a9f7-operator-scripts\") pod \"nova-api-db-create-4rqnw\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.944577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mmjh\" (UniqueName: \"kubernetes.io/projected/c035033e-3578-409b-865f-ec2e3812a9f7-kube-api-access-8mmjh\") pod \"nova-api-db-create-4rqnw\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.979266 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj"] Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.980892 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.982559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 21 16:16:32 crc kubenswrapper[4707]: I0121 16:16:32.986845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.046886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwpj\" (UniqueName: \"kubernetes.io/projected/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-kube-api-access-nwwpj\") pod \"nova-cell0-db-create-s6wlc\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.047496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035033e-3578-409b-865f-ec2e3812a9f7-operator-scripts\") pod \"nova-api-db-create-4rqnw\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.047536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-operator-scripts\") pod \"nova-cell0-db-create-s6wlc\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.047583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mmjh\" (UniqueName: \"kubernetes.io/projected/c035033e-3578-409b-865f-ec2e3812a9f7-kube-api-access-8mmjh\") pod \"nova-api-db-create-4rqnw\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.048605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035033e-3578-409b-865f-ec2e3812a9f7-operator-scripts\") pod \"nova-api-db-create-4rqnw\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.067794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mmjh\" (UniqueName: \"kubernetes.io/projected/c035033e-3578-409b-865f-ec2e3812a9f7-kube-api-access-8mmjh\") pod \"nova-api-db-create-4rqnw\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.073946 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-nx25r"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.075079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.084353 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.089951 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-nx25r"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.149631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625cd2cf-9a3e-4870-8df0-de16c44cf081-operator-scripts\") pod \"nova-api-54d1-account-create-update-qh9zj\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.149712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmmv\" (UniqueName: \"kubernetes.io/projected/db4739a7-e3aa-4fce-8e12-8fe520467ba2-kube-api-access-gtmmv\") pod \"nova-cell1-db-create-nx25r\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.149739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4n9\" (UniqueName: \"kubernetes.io/projected/625cd2cf-9a3e-4870-8df0-de16c44cf081-kube-api-access-9z4n9\") pod \"nova-api-54d1-account-create-update-qh9zj\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.150008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwpj\" (UniqueName: \"kubernetes.io/projected/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-kube-api-access-nwwpj\") pod \"nova-cell0-db-create-s6wlc\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.150234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-operator-scripts\") pod \"nova-cell0-db-create-s6wlc\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.150275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4739a7-e3aa-4fce-8e12-8fe520467ba2-operator-scripts\") pod \"nova-cell1-db-create-nx25r\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.151343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-operator-scripts\") pod \"nova-cell0-db-create-s6wlc\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.169989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwpj\" (UniqueName: \"kubernetes.io/projected/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-kube-api-access-nwwpj\") pod \"nova-cell0-db-create-s6wlc\" (UID: 
\"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.173528 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.174928 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.182363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.185411 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.228599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.251604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625cd2cf-9a3e-4870-8df0-de16c44cf081-operator-scripts\") pod \"nova-api-54d1-account-create-update-qh9zj\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.251686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmmv\" (UniqueName: \"kubernetes.io/projected/db4739a7-e3aa-4fce-8e12-8fe520467ba2-kube-api-access-gtmmv\") pod \"nova-cell1-db-create-nx25r\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.251710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4n9\" (UniqueName: \"kubernetes.io/projected/625cd2cf-9a3e-4870-8df0-de16c44cf081-kube-api-access-9z4n9\") pod \"nova-api-54d1-account-create-update-qh9zj\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.251762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264a35d9-80a3-47e0-9489-5d088f07da4f-operator-scripts\") pod \"nova-cell0-7ab6-account-create-update-9rm92\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.251828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pwv\" (UniqueName: \"kubernetes.io/projected/264a35d9-80a3-47e0-9489-5d088f07da4f-kube-api-access-g6pwv\") pod \"nova-cell0-7ab6-account-create-update-9rm92\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.251861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4739a7-e3aa-4fce-8e12-8fe520467ba2-operator-scripts\") pod 
\"nova-cell1-db-create-nx25r\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.252481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4739a7-e3aa-4fce-8e12-8fe520467ba2-operator-scripts\") pod \"nova-cell1-db-create-nx25r\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.253096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625cd2cf-9a3e-4870-8df0-de16c44cf081-operator-scripts\") pod \"nova-api-54d1-account-create-update-qh9zj\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.266468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4n9\" (UniqueName: \"kubernetes.io/projected/625cd2cf-9a3e-4870-8df0-de16c44cf081-kube-api-access-9z4n9\") pod \"nova-api-54d1-account-create-update-qh9zj\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.268157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmmv\" (UniqueName: \"kubernetes.io/projected/db4739a7-e3aa-4fce-8e12-8fe520467ba2-kube-api-access-gtmmv\") pod \"nova-cell1-db-create-nx25r\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.305919 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.354610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264a35d9-80a3-47e0-9489-5d088f07da4f-operator-scripts\") pod \"nova-cell0-7ab6-account-create-update-9rm92\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.354722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pwv\" (UniqueName: \"kubernetes.io/projected/264a35d9-80a3-47e0-9489-5d088f07da4f-kube-api-access-g6pwv\") pod \"nova-cell0-7ab6-account-create-update-9rm92\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.356387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264a35d9-80a3-47e0-9489-5d088f07da4f-operator-scripts\") pod \"nova-cell0-7ab6-account-create-update-9rm92\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.378987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pwv\" (UniqueName: \"kubernetes.io/projected/264a35d9-80a3-47e0-9489-5d088f07da4f-kube-api-access-g6pwv\") pod \"nova-cell0-7ab6-account-create-update-9rm92\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.382483 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.383737 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.385754 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.390443 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.456960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-operator-scripts\") pod \"nova-cell1-a90f-account-create-update-9fg92\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.457150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62zh\" (UniqueName: \"kubernetes.io/projected/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-kube-api-access-z62zh\") pod \"nova-cell1-a90f-account-create-update-9fg92\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.507536 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.554549 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-4rqnw"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.559743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-operator-scripts\") pod \"nova-cell1-a90f-account-create-update-9fg92\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.559920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62zh\" (UniqueName: \"kubernetes.io/projected/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-kube-api-access-z62zh\") pod \"nova-cell1-a90f-account-create-update-9fg92\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.561018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-operator-scripts\") pod \"nova-cell1-a90f-account-create-update-9fg92\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.564996 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.578833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62zh\" (UniqueName: \"kubernetes.io/projected/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-kube-api-access-z62zh\") pod \"nova-cell1-a90f-account-create-update-9fg92\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.654550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-s6wlc"] Jan 21 16:16:33 crc kubenswrapper[4707]: W0121 16:16:33.660333 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1d7a1c2_ca46_4884_8c5b_f96cb24dbd3c.slice/crio-89fb1a56eaf9a8957e863540549453ec054ddf0d08ac6372be09dc7fc4fc6690 WatchSource:0}: Error finding container 89fb1a56eaf9a8957e863540549453ec054ddf0d08ac6372be09dc7fc4fc6690: Status 404 returned error can't find the container with id 89fb1a56eaf9a8957e863540549453ec054ddf0d08ac6372be09dc7fc4fc6690 Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.698059 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.755480 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj"] Jan 21 16:16:33 crc kubenswrapper[4707]: I0121 16:16:33.949889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-nx25r"] Jan 21 16:16:33 crc kubenswrapper[4707]: W0121 16:16:33.991983 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb4739a7_e3aa_4fce_8e12_8fe520467ba2.slice/crio-99f38ee297a42a6329ed5f2a37877dfcb43481bc513fbf6bf0cd78965f47c244 WatchSource:0}: Error finding container 99f38ee297a42a6329ed5f2a37877dfcb43481bc513fbf6bf0cd78965f47c244: Status 404 returned error can't find the container with id 99f38ee297a42a6329ed5f2a37877dfcb43481bc513fbf6bf0cd78965f47c244 Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.053404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92"] Jan 21 16:16:34 crc kubenswrapper[4707]: W0121 16:16:34.060435 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod264a35d9_80a3_47e0_9489_5d088f07da4f.slice/crio-00f2681a9e80315ac34f6024d3e90835753d021b62d5ba161a3cde37ae2bfbdd WatchSource:0}: Error finding container 00f2681a9e80315ac34f6024d3e90835753d021b62d5ba161a3cde37ae2bfbdd: Status 404 returned error can't find the container with id 00f2681a9e80315ac34f6024d3e90835753d021b62d5ba161a3cde37ae2bfbdd Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.153123 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92"] Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.508946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" 
event={"ID":"264a35d9-80a3-47e0-9489-5d088f07da4f","Type":"ContainerStarted","Data":"00f2681a9e80315ac34f6024d3e90835753d021b62d5ba161a3cde37ae2bfbdd"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.510680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" event={"ID":"625cd2cf-9a3e-4870-8df0-de16c44cf081","Type":"ContainerStarted","Data":"1901cb8cd260095a37ab695ec27b9cc5306d788abfe05ef6e14099fddbd0617d"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.510731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" event={"ID":"625cd2cf-9a3e-4870-8df0-de16c44cf081","Type":"ContainerStarted","Data":"f5015803816db2db937786f623ef844a4a655190b6f70ec6d19cd3392412179a"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.512890 4707 generic.go:334] "Generic (PLEG): container finished" podID="e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" containerID="3a326b0249b7701a07598cdebb6cc82a855666df6ab38b7cb3d19e5643ea4f64" exitCode=0 Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.512959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" event={"ID":"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c","Type":"ContainerDied","Data":"3a326b0249b7701a07598cdebb6cc82a855666df6ab38b7cb3d19e5643ea4f64"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.512978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" event={"ID":"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c","Type":"ContainerStarted","Data":"89fb1a56eaf9a8957e863540549453ec054ddf0d08ac6372be09dc7fc4fc6690"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.514794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" event={"ID":"db4739a7-e3aa-4fce-8e12-8fe520467ba2","Type":"ContainerStarted","Data":"99f38ee297a42a6329ed5f2a37877dfcb43481bc513fbf6bf0cd78965f47c244"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.516494 4707 generic.go:334] "Generic (PLEG): container finished" podID="c035033e-3578-409b-865f-ec2e3812a9f7" containerID="72f0c21a64d5101061525331cfce47a016ad7faf1b203bdda3c3a11eff4ed108" exitCode=0 Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.516523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" event={"ID":"c035033e-3578-409b-865f-ec2e3812a9f7","Type":"ContainerDied","Data":"72f0c21a64d5101061525331cfce47a016ad7faf1b203bdda3c3a11eff4ed108"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.516541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" event={"ID":"c035033e-3578-409b-865f-ec2e3812a9f7","Type":"ContainerStarted","Data":"2f1c01c48598c99f7705b35c76be42748ce64f440d31236fba19538ab7729278"} Jan 21 16:16:34 crc kubenswrapper[4707]: I0121 16:16:34.531336 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" podStartSLOduration=2.531316917 podStartE2EDuration="2.531316917s" podCreationTimestamp="2026-01-21 16:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:34.52385699 +0000 UTC m=+4491.705373211" watchObservedRunningTime="2026-01-21 16:16:34.531316917 +0000 UTC m=+4491.712833129" 
Jan 21 16:16:34 crc kubenswrapper[4707]: W0121 16:16:34.569992 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ab2a23_9f4a_4e22_b752_b8ce428104f2.slice/crio-5d713c686d31e9e14f4d78da4217380dcb86a27009ac1cd68c622e2dfac0e20b WatchSource:0}: Error finding container 5d713c686d31e9e14f4d78da4217380dcb86a27009ac1cd68c622e2dfac0e20b: Status 404 returned error can't find the container with id 5d713c686d31e9e14f4d78da4217380dcb86a27009ac1cd68c622e2dfac0e20b Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.049922 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.092749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-combined-ca-bundle\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.092792 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56qcr\" (UniqueName: \"kubernetes.io/projected/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-kube-api-access-56qcr\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.099917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-kube-api-access-56qcr" (OuterVolumeSpecName: "kube-api-access-56qcr") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "kube-api-access-56qcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.158478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.194898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-config-data\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.194967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-sg-core-conf-yaml\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.195013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-scripts\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.195185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-run-httpd\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.195251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-log-httpd\") pod \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\" (UID: \"9a8dfc0d-088f-45e7-b4a7-61a17bc99768\") " Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.196037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.196282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.197382 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.197401 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56qcr\" (UniqueName: \"kubernetes.io/projected/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-kube-api-access-56qcr\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.203990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-scripts" (OuterVolumeSpecName: "scripts") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.221019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.280251 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-config-data" (OuterVolumeSpecName: "config-data") pod "9a8dfc0d-088f-45e7-b4a7-61a17bc99768" (UID: "9a8dfc0d-088f-45e7-b4a7-61a17bc99768"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.303957 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.304005 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.304018 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.304034 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.304047 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a8dfc0d-088f-45e7-b4a7-61a17bc99768-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.543334 4707 generic.go:334] "Generic (PLEG): container finished" podID="db4739a7-e3aa-4fce-8e12-8fe520467ba2" containerID="ab1850d3462b9d70ea9580abe0ab8b461756cad0bf489d65b52f19dc9153b2ce" exitCode=0 Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.543839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" event={"ID":"db4739a7-e3aa-4fce-8e12-8fe520467ba2","Type":"ContainerDied","Data":"ab1850d3462b9d70ea9580abe0ab8b461756cad0bf489d65b52f19dc9153b2ce"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.545464 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4ab2a23-9f4a-4e22-b752-b8ce428104f2" containerID="ab8f2b3c20e84db0ca40c7bb13debf7c50ed553488dd77b15a16fee6c242e20f" exitCode=0 Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.545506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" event={"ID":"a4ab2a23-9f4a-4e22-b752-b8ce428104f2","Type":"ContainerDied","Data":"ab8f2b3c20e84db0ca40c7bb13debf7c50ed553488dd77b15a16fee6c242e20f"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.545521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" event={"ID":"a4ab2a23-9f4a-4e22-b752-b8ce428104f2","Type":"ContainerStarted","Data":"5d713c686d31e9e14f4d78da4217380dcb86a27009ac1cd68c622e2dfac0e20b"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.548869 4707 generic.go:334] "Generic (PLEG): container finished" podID="264a35d9-80a3-47e0-9489-5d088f07da4f" containerID="8619c0d6694f4fb15f617a41dd4e1b24fec26125230f5be9bd32bb4e75e89e0b" exitCode=0 Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.548966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" event={"ID":"264a35d9-80a3-47e0-9489-5d088f07da4f","Type":"ContainerDied","Data":"8619c0d6694f4fb15f617a41dd4e1b24fec26125230f5be9bd32bb4e75e89e0b"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.550609 4707 generic.go:334] "Generic (PLEG): container finished" podID="625cd2cf-9a3e-4870-8df0-de16c44cf081" containerID="1901cb8cd260095a37ab695ec27b9cc5306d788abfe05ef6e14099fddbd0617d" exitCode=0 Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.550649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" event={"ID":"625cd2cf-9a3e-4870-8df0-de16c44cf081","Type":"ContainerDied","Data":"1901cb8cd260095a37ab695ec27b9cc5306d788abfe05ef6e14099fddbd0617d"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.558507 4707 generic.go:334] "Generic (PLEG): container finished" podID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerID="f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05" exitCode=0 Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.558616 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.558670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerDied","Data":"f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.558701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9a8dfc0d-088f-45e7-b4a7-61a17bc99768","Type":"ContainerDied","Data":"d5deef518bc0417711bf032fe4f59cad060a80c70807de9b591c04ea62753edd"} Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.558722 4707 scope.go:117] "RemoveContainer" containerID="f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.595884 4707 scope.go:117] "RemoveContainer" containerID="834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.644933 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.645285 4707 scope.go:117] "RemoveContainer" containerID="f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.656778 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.663829 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.664237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="sg-core" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664250 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="sg-core" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.664267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-central-agent" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664273 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-central-agent" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.664286 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-notification-agent" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-notification-agent" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.664304 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="proxy-httpd" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664310 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="proxy-httpd" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664471 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-central-agent" Jan 21 16:16:35 crc 
kubenswrapper[4707]: I0121 16:16:35.664486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="ceilometer-notification-agent" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664497 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="proxy-httpd" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.664510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" containerName="sg-core" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.668041 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.668802 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.671395 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.671538 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.690692 4707 scope.go:117] "RemoveContainer" containerID="65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.716957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-scripts\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.717126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.717235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-run-httpd\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.717273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-config-data\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.717295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.717331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9sbxl\" (UniqueName: \"kubernetes.io/projected/1153bb3c-307b-48fc-b0f9-50db76937f43-kube-api-access-9sbxl\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.717364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-log-httpd\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.719864 4707 scope.go:117] "RemoveContainer" containerID="f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.720543 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc\": container with ID starting with f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc not found: ID does not exist" containerID="f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.720582 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc"} err="failed to get container status \"f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc\": rpc error: code = NotFound desc = could not find container \"f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc\": container with ID starting with f3419c98cf4c15c4c8c0653d80f167665eb07ba332febfc73a69948ed210d9dc not found: ID does not exist" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.720610 4707 scope.go:117] "RemoveContainer" containerID="834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.720960 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5\": container with ID starting with 834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5 not found: ID does not exist" containerID="834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.720995 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5"} err="failed to get container status \"834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5\": rpc error: code = NotFound desc = could not find container \"834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5\": container with ID starting with 834827bee87411628f9c3d49046cb16b84bf7fef2f6ba758ff873390fbfc77a5 not found: ID does not exist" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.721018 4707 scope.go:117] "RemoveContainer" containerID="f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.721351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05\": container with ID 
starting with f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05 not found: ID does not exist" containerID="f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.721379 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05"} err="failed to get container status \"f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05\": rpc error: code = NotFound desc = could not find container \"f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05\": container with ID starting with f84b34e0fe6289af5d9d18f97c1f57f36537f1d3e8c9462ca34136e56e988c05 not found: ID does not exist" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.721397 4707 scope.go:117] "RemoveContainer" containerID="65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08" Jan 21 16:16:35 crc kubenswrapper[4707]: E0121 16:16:35.721685 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08\": container with ID starting with 65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08 not found: ID does not exist" containerID="65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.721722 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08"} err="failed to get container status \"65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08\": rpc error: code = NotFound desc = could not find container \"65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08\": container with ID starting with 65ee4746888a0310c99944ce3145331c5b17879a562404fc11387c6d47028d08 not found: ID does not exist" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-run-httpd\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-config-data\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821389 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbxl\" (UniqueName: \"kubernetes.io/projected/1153bb3c-307b-48fc-b0f9-50db76937f43-kube-api-access-9sbxl\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-log-httpd\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.821469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-scripts\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.822228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-run-httpd\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.822309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-log-httpd\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.826003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.826859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-config-data\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.827594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.828740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-scripts\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc kubenswrapper[4707]: I0121 16:16:35.838344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbxl\" (UniqueName: \"kubernetes.io/projected/1153bb3c-307b-48fc-b0f9-50db76937f43-kube-api-access-9sbxl\") pod \"ceilometer-0\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:35 crc 
kubenswrapper[4707]: I0121 16:16:35.992671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.539142 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.553148 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.583387 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.583383 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-s6wlc" event={"ID":"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c","Type":"ContainerDied","Data":"89fb1a56eaf9a8957e863540549453ec054ddf0d08ac6372be09dc7fc4fc6690"} Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.583453 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89fb1a56eaf9a8957e863540549453ec054ddf0d08ac6372be09dc7fc4fc6690" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.585293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" event={"ID":"c035033e-3578-409b-865f-ec2e3812a9f7","Type":"ContainerDied","Data":"2f1c01c48598c99f7705b35c76be42748ce64f440d31236fba19538ab7729278"} Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.585327 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1c01c48598c99f7705b35c76be42748ce64f440d31236fba19538ab7729278" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.585543 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-4rqnw" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.637895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-operator-scripts\") pod \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.638203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwwpj\" (UniqueName: \"kubernetes.io/projected/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-kube-api-access-nwwpj\") pod \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\" (UID: \"e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c\") " Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.639933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" (UID: "e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.653115 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-kube-api-access-nwwpj" (OuterVolumeSpecName: "kube-api-access-nwwpj") pod "e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" (UID: "e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c"). InnerVolumeSpecName "kube-api-access-nwwpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.741561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035033e-3578-409b-865f-ec2e3812a9f7-operator-scripts\") pod \"c035033e-3578-409b-865f-ec2e3812a9f7\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.742299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c035033e-3578-409b-865f-ec2e3812a9f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c035033e-3578-409b-865f-ec2e3812a9f7" (UID: "c035033e-3578-409b-865f-ec2e3812a9f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.742476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mmjh\" (UniqueName: \"kubernetes.io/projected/c035033e-3578-409b-865f-ec2e3812a9f7-kube-api-access-8mmjh\") pod \"c035033e-3578-409b-865f-ec2e3812a9f7\" (UID: \"c035033e-3578-409b-865f-ec2e3812a9f7\") " Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.743224 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035033e-3578-409b-865f-ec2e3812a9f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.743245 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwwpj\" (UniqueName: \"kubernetes.io/projected/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-kube-api-access-nwwpj\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.743261 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.746197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c035033e-3578-409b-865f-ec2e3812a9f7-kube-api-access-8mmjh" (OuterVolumeSpecName: "kube-api-access-8mmjh") pod "c035033e-3578-409b-865f-ec2e3812a9f7" (UID: "c035033e-3578-409b-865f-ec2e3812a9f7"). InnerVolumeSpecName "kube-api-access-8mmjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:36 crc kubenswrapper[4707]: I0121 16:16:36.857626 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mmjh\" (UniqueName: \"kubernetes.io/projected/c035033e-3578-409b-865f-ec2e3812a9f7-kube-api-access-8mmjh\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.088532 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.092315 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.101661 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.109291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.110631 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.193866 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a8dfc0d-088f-45e7-b4a7-61a17bc99768" path="/var/lib/kubelet/pods/9a8dfc0d-088f-45e7-b4a7-61a17bc99768/volumes" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.265733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4739a7-e3aa-4fce-8e12-8fe520467ba2-operator-scripts\") pod \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.265824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4n9\" (UniqueName: \"kubernetes.io/projected/625cd2cf-9a3e-4870-8df0-de16c44cf081-kube-api-access-9z4n9\") pod \"625cd2cf-9a3e-4870-8df0-de16c44cf081\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.265975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625cd2cf-9a3e-4870-8df0-de16c44cf081-operator-scripts\") pod \"625cd2cf-9a3e-4870-8df0-de16c44cf081\" (UID: \"625cd2cf-9a3e-4870-8df0-de16c44cf081\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmmv\" (UniqueName: \"kubernetes.io/projected/db4739a7-e3aa-4fce-8e12-8fe520467ba2-kube-api-access-gtmmv\") pod \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\" (UID: \"db4739a7-e3aa-4fce-8e12-8fe520467ba2\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264a35d9-80a3-47e0-9489-5d088f07da4f-operator-scripts\") pod \"264a35d9-80a3-47e0-9489-5d088f07da4f\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6pwv\" (UniqueName: \"kubernetes.io/projected/264a35d9-80a3-47e0-9489-5d088f07da4f-kube-api-access-g6pwv\") pod \"264a35d9-80a3-47e0-9489-5d088f07da4f\" (UID: \"264a35d9-80a3-47e0-9489-5d088f07da4f\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/264a35d9-80a3-47e0-9489-5d088f07da4f-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "264a35d9-80a3-47e0-9489-5d088f07da4f" (UID: "264a35d9-80a3-47e0-9489-5d088f07da4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625cd2cf-9a3e-4870-8df0-de16c44cf081-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "625cd2cf-9a3e-4870-8df0-de16c44cf081" (UID: "625cd2cf-9a3e-4870-8df0-de16c44cf081"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4739a7-e3aa-4fce-8e12-8fe520467ba2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db4739a7-e3aa-4fce-8e12-8fe520467ba2" (UID: "db4739a7-e3aa-4fce-8e12-8fe520467ba2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-operator-scripts\") pod \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.267273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4ab2a23-9f4a-4e22-b752-b8ce428104f2" (UID: "a4ab2a23-9f4a-4e22-b752-b8ce428104f2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.266905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z62zh\" (UniqueName: \"kubernetes.io/projected/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-kube-api-access-z62zh\") pod \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\" (UID: \"a4ab2a23-9f4a-4e22-b752-b8ce428104f2\") " Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.267955 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4739a7-e3aa-4fce-8e12-8fe520467ba2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.267974 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625cd2cf-9a3e-4870-8df0-de16c44cf081-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.267983 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264a35d9-80a3-47e0-9489-5d088f07da4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.267992 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.270859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625cd2cf-9a3e-4870-8df0-de16c44cf081-kube-api-access-9z4n9" (OuterVolumeSpecName: "kube-api-access-9z4n9") pod "625cd2cf-9a3e-4870-8df0-de16c44cf081" (UID: "625cd2cf-9a3e-4870-8df0-de16c44cf081"). InnerVolumeSpecName "kube-api-access-9z4n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.275585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-kube-api-access-z62zh" (OuterVolumeSpecName: "kube-api-access-z62zh") pod "a4ab2a23-9f4a-4e22-b752-b8ce428104f2" (UID: "a4ab2a23-9f4a-4e22-b752-b8ce428104f2"). InnerVolumeSpecName "kube-api-access-z62zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.275670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4739a7-e3aa-4fce-8e12-8fe520467ba2-kube-api-access-gtmmv" (OuterVolumeSpecName: "kube-api-access-gtmmv") pod "db4739a7-e3aa-4fce-8e12-8fe520467ba2" (UID: "db4739a7-e3aa-4fce-8e12-8fe520467ba2"). InnerVolumeSpecName "kube-api-access-gtmmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.276864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264a35d9-80a3-47e0-9489-5d088f07da4f-kube-api-access-g6pwv" (OuterVolumeSpecName: "kube-api-access-g6pwv") pod "264a35d9-80a3-47e0-9489-5d088f07da4f" (UID: "264a35d9-80a3-47e0-9489-5d088f07da4f"). InnerVolumeSpecName "kube-api-access-g6pwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.371734 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z4n9\" (UniqueName: \"kubernetes.io/projected/625cd2cf-9a3e-4870-8df0-de16c44cf081-kube-api-access-9z4n9\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.372204 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmmv\" (UniqueName: \"kubernetes.io/projected/db4739a7-e3aa-4fce-8e12-8fe520467ba2-kube-api-access-gtmmv\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.372219 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6pwv\" (UniqueName: \"kubernetes.io/projected/264a35d9-80a3-47e0-9489-5d088f07da4f-kube-api-access-g6pwv\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.372233 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z62zh\" (UniqueName: \"kubernetes.io/projected/a4ab2a23-9f4a-4e22-b752-b8ce428104f2-kube-api-access-z62zh\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.593980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerStarted","Data":"de97be19548ea511de91af37cf6a8000ada3855eef025dae3e3bf63813444287"} Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.595804 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.596252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92" event={"ID":"264a35d9-80a3-47e0-9489-5d088f07da4f","Type":"ContainerDied","Data":"00f2681a9e80315ac34f6024d3e90835753d021b62d5ba161a3cde37ae2bfbdd"} Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.596301 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00f2681a9e80315ac34f6024d3e90835753d021b62d5ba161a3cde37ae2bfbdd" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.597913 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.599535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj" event={"ID":"625cd2cf-9a3e-4870-8df0-de16c44cf081","Type":"ContainerDied","Data":"f5015803816db2db937786f623ef844a4a655190b6f70ec6d19cd3392412179a"} Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.599617 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5015803816db2db937786f623ef844a4a655190b6f70ec6d19cd3392412179a" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.600950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" event={"ID":"db4739a7-e3aa-4fce-8e12-8fe520467ba2","Type":"ContainerDied","Data":"99f38ee297a42a6329ed5f2a37877dfcb43481bc513fbf6bf0cd78965f47c244"} Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.600973 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f38ee297a42a6329ed5f2a37877dfcb43481bc513fbf6bf0cd78965f47c244" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.601044 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-nx25r" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.617701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" event={"ID":"a4ab2a23-9f4a-4e22-b752-b8ce428104f2","Type":"ContainerDied","Data":"5d713c686d31e9e14f4d78da4217380dcb86a27009ac1cd68c622e2dfac0e20b"} Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.617728 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d713c686d31e9e14f4d78da4217380dcb86a27009ac1cd68c622e2dfac0e20b" Jan 21 16:16:37 crc kubenswrapper[4707]: I0121 16:16:37.617753 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.394458 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc"] Jan 21 16:16:38 crc kubenswrapper[4707]: E0121 16:16:38.395045 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4739a7-e3aa-4fce-8e12-8fe520467ba2" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395063 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4739a7-e3aa-4fce-8e12-8fe520467ba2" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: E0121 16:16:38.395088 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c035033e-3578-409b-865f-ec2e3812a9f7" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395095 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c035033e-3578-409b-865f-ec2e3812a9f7" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: E0121 16:16:38.395104 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: E0121 16:16:38.395119 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625cd2cf-9a3e-4870-8df0-de16c44cf081" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395126 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="625cd2cf-9a3e-4870-8df0-de16c44cf081" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: E0121 16:16:38.395138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264a35d9-80a3-47e0-9489-5d088f07da4f" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395144 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="264a35d9-80a3-47e0-9489-5d088f07da4f" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: E0121 16:16:38.395158 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ab2a23-9f4a-4e22-b752-b8ce428104f2" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395172 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ab2a23-9f4a-4e22-b752-b8ce428104f2" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395312 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4739a7-e3aa-4fce-8e12-8fe520467ba2" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395328 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ab2a23-9f4a-4e22-b752-b8ce428104f2" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395337 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="625cd2cf-9a3e-4870-8df0-de16c44cf081" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395348 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c035033e-3578-409b-865f-ec2e3812a9f7" containerName="mariadb-database-create" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395366 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="264a35d9-80a3-47e0-9489-5d088f07da4f" containerName="mariadb-account-create-update" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.395876 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.398926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-cwf4d" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.399540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.400502 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.435616 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc"] Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.498233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-scripts\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.499760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxmbl\" (UniqueName: \"kubernetes.io/projected/9815602d-6e85-4f3f-8aeb-fa674768302c-kube-api-access-jxmbl\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.500381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.500454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-config-data\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.603113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-scripts\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.603260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxmbl\" (UniqueName: \"kubernetes.io/projected/9815602d-6e85-4f3f-8aeb-fa674768302c-kube-api-access-jxmbl\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.603773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.603885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-config-data\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.610548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-config-data\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.610564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-scripts\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.614348 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.620599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxmbl\" (UniqueName: \"kubernetes.io/projected/9815602d-6e85-4f3f-8aeb-fa674768302c-kube-api-access-jxmbl\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.631396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9qlqc\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.631967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerStarted","Data":"fec7b9d03cf855300bc8135bf210cdb5dd7483d660057b697c4f93052c49f94a"} Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.632000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerStarted","Data":"28c25f89e44ed8cabbbec74c68027574605ed40d203aa29efc5e8224e10c3416"} Jan 21 16:16:38 crc kubenswrapper[4707]: I0121 16:16:38.761519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.193195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc"] Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.539651 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.540366 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.645777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerStarted","Data":"8412b0f341c16bc17bfaad23e702bccb5ab9e0ada0073ce3c9956d9c040fc596"} Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.647554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" event={"ID":"9815602d-6e85-4f3f-8aeb-fa674768302c","Type":"ContainerStarted","Data":"2c40331b7a55e46ad04bd0aeecb0ef1a000647b4cd6a521d0565d70d76c94cbb"} Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.647654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" event={"ID":"9815602d-6e85-4f3f-8aeb-fa674768302c","Type":"ContainerStarted","Data":"9a3c2776f8ac97e67630fe8be0bbafe440f6069184209273dc3b4461f36552b0"} Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.670913 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" podStartSLOduration=1.6708976 podStartE2EDuration="1.6708976s" podCreationTimestamp="2026-01-21 16:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:39.664320031 +0000 UTC m=+4496.845836253" watchObservedRunningTime="2026-01-21 16:16:39.6708976 +0000 UTC m=+4496.852413822" Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.945379 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:16:39 crc kubenswrapper[4707]: I0121 16:16:39.945440 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.672418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerStarted","Data":"0c2a354a717277f321efbaf9c5f82322f516e84463f972414b1ee29361c3f043"} Jan 21 
16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.673126 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="sg-core" containerID="cri-o://8412b0f341c16bc17bfaad23e702bccb5ab9e0ada0073ce3c9956d9c040fc596" gracePeriod=30 Jan 21 16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.673150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.673111 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-notification-agent" containerID="cri-o://fec7b9d03cf855300bc8135bf210cdb5dd7483d660057b697c4f93052c49f94a" gracePeriod=30 Jan 21 16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.672717 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-central-agent" containerID="cri-o://28c25f89e44ed8cabbbec74c68027574605ed40d203aa29efc5e8224e10c3416" gracePeriod=30 Jan 21 16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.673061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="proxy-httpd" containerID="cri-o://0c2a354a717277f321efbaf9c5f82322f516e84463f972414b1ee29361c3f043" gracePeriod=30 Jan 21 16:16:41 crc kubenswrapper[4707]: I0121 16:16:41.706924 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.180504869 podStartE2EDuration="6.706900717s" podCreationTimestamp="2026-01-21 16:16:35 +0000 UTC" firstStartedPulling="2026-01-21 16:16:37.129206678 +0000 UTC m=+4494.310722900" lastFinishedPulling="2026-01-21 16:16:40.655602526 +0000 UTC m=+4497.837118748" observedRunningTime="2026-01-21 16:16:41.701412908 +0000 UTC m=+4498.882929131" watchObservedRunningTime="2026-01-21 16:16:41.706900717 +0000 UTC m=+4498.888416939" Jan 21 16:16:42 crc kubenswrapper[4707]: I0121 16:16:42.686149 4707 generic.go:334] "Generic (PLEG): container finished" podID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerID="0c2a354a717277f321efbaf9c5f82322f516e84463f972414b1ee29361c3f043" exitCode=0 Jan 21 16:16:42 crc kubenswrapper[4707]: I0121 16:16:42.686492 4707 generic.go:334] "Generic (PLEG): container finished" podID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerID="8412b0f341c16bc17bfaad23e702bccb5ab9e0ada0073ce3c9956d9c040fc596" exitCode=2 Jan 21 16:16:42 crc kubenswrapper[4707]: I0121 16:16:42.686573 4707 generic.go:334] "Generic (PLEG): container finished" podID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerID="fec7b9d03cf855300bc8135bf210cdb5dd7483d660057b697c4f93052c49f94a" exitCode=0 Jan 21 16:16:42 crc kubenswrapper[4707]: I0121 16:16:42.686348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerDied","Data":"0c2a354a717277f321efbaf9c5f82322f516e84463f972414b1ee29361c3f043"} Jan 21 16:16:42 crc kubenswrapper[4707]: I0121 16:16:42.686628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerDied","Data":"8412b0f341c16bc17bfaad23e702bccb5ab9e0ada0073ce3c9956d9c040fc596"} Jan 21 16:16:42 crc kubenswrapper[4707]: I0121 16:16:42.686644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerDied","Data":"fec7b9d03cf855300bc8135bf210cdb5dd7483d660057b697c4f93052c49f94a"} Jan 21 16:16:45 crc kubenswrapper[4707]: I0121 16:16:45.715268 4707 generic.go:334] "Generic (PLEG): container finished" podID="9815602d-6e85-4f3f-8aeb-fa674768302c" containerID="2c40331b7a55e46ad04bd0aeecb0ef1a000647b4cd6a521d0565d70d76c94cbb" exitCode=0 Jan 21 16:16:45 crc kubenswrapper[4707]: I0121 16:16:45.715335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" event={"ID":"9815602d-6e85-4f3f-8aeb-fa674768302c","Type":"ContainerDied","Data":"2c40331b7a55e46ad04bd0aeecb0ef1a000647b4cd6a521d0565d70d76c94cbb"} Jan 21 16:16:46 crc kubenswrapper[4707]: I0121 16:16:46.725674 4707 generic.go:334] "Generic (PLEG): container finished" podID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerID="28c25f89e44ed8cabbbec74c68027574605ed40d203aa29efc5e8224e10c3416" exitCode=0 Jan 21 16:16:46 crc kubenswrapper[4707]: I0121 16:16:46.725751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerDied","Data":"28c25f89e44ed8cabbbec74c68027574605ed40d203aa29efc5e8224e10c3416"} Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.120259 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.199235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxmbl\" (UniqueName: \"kubernetes.io/projected/9815602d-6e85-4f3f-8aeb-fa674768302c-kube-api-access-jxmbl\") pod \"9815602d-6e85-4f3f-8aeb-fa674768302c\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.199308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-combined-ca-bundle\") pod \"9815602d-6e85-4f3f-8aeb-fa674768302c\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.199488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-config-data\") pod \"9815602d-6e85-4f3f-8aeb-fa674768302c\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.199543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-scripts\") pod \"9815602d-6e85-4f3f-8aeb-fa674768302c\" (UID: \"9815602d-6e85-4f3f-8aeb-fa674768302c\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.206510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9815602d-6e85-4f3f-8aeb-fa674768302c-kube-api-access-jxmbl" (OuterVolumeSpecName: "kube-api-access-jxmbl") pod "9815602d-6e85-4f3f-8aeb-fa674768302c" (UID: 
"9815602d-6e85-4f3f-8aeb-fa674768302c"). InnerVolumeSpecName "kube-api-access-jxmbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.206594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-scripts" (OuterVolumeSpecName: "scripts") pod "9815602d-6e85-4f3f-8aeb-fa674768302c" (UID: "9815602d-6e85-4f3f-8aeb-fa674768302c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.223693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-config-data" (OuterVolumeSpecName: "config-data") pod "9815602d-6e85-4f3f-8aeb-fa674768302c" (UID: "9815602d-6e85-4f3f-8aeb-fa674768302c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.224175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9815602d-6e85-4f3f-8aeb-fa674768302c" (UID: "9815602d-6e85-4f3f-8aeb-fa674768302c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.275639 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.300868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-run-httpd\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-config-data\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-scripts\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301158 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-log-httpd\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sbxl\" (UniqueName: \"kubernetes.io/projected/1153bb3c-307b-48fc-b0f9-50db76937f43-kube-api-access-9sbxl\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-combined-ca-bundle\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.301341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-sg-core-conf-yaml\") pod \"1153bb3c-307b-48fc-b0f9-50db76937f43\" (UID: \"1153bb3c-307b-48fc-b0f9-50db76937f43\") " Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.302106 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxmbl\" (UniqueName: \"kubernetes.io/projected/9815602d-6e85-4f3f-8aeb-fa674768302c-kube-api-access-jxmbl\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.302129 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.302141 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.302152 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.302173 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9815602d-6e85-4f3f-8aeb-fa674768302c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.302279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.315012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-scripts" (OuterVolumeSpecName: "scripts") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.337437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1153bb3c-307b-48fc-b0f9-50db76937f43-kube-api-access-9sbxl" (OuterVolumeSpecName: "kube-api-access-9sbxl") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "kube-api-access-9sbxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.353887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.362712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.400434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-config-data" (OuterVolumeSpecName: "config-data") pod "1153bb3c-307b-48fc-b0f9-50db76937f43" (UID: "1153bb3c-307b-48fc-b0f9-50db76937f43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.403960 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.403988 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.403999 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.404010 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1153bb3c-307b-48fc-b0f9-50db76937f43-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.404022 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sbxl\" (UniqueName: \"kubernetes.io/projected/1153bb3c-307b-48fc-b0f9-50db76937f43-kube-api-access-9sbxl\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.404036 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1153bb3c-307b-48fc-b0f9-50db76937f43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.739544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" event={"ID":"9815602d-6e85-4f3f-8aeb-fa674768302c","Type":"ContainerDied","Data":"9a3c2776f8ac97e67630fe8be0bbafe440f6069184209273dc3b4461f36552b0"} Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.739601 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a3c2776f8ac97e67630fe8be0bbafe440f6069184209273dc3b4461f36552b0" Jan 21 16:16:47 crc 
kubenswrapper[4707]: I0121 16:16:47.739644 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.742841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1153bb3c-307b-48fc-b0f9-50db76937f43","Type":"ContainerDied","Data":"de97be19548ea511de91af37cf6a8000ada3855eef025dae3e3bf63813444287"} Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.742932 4707 scope.go:117] "RemoveContainer" containerID="0c2a354a717277f321efbaf9c5f82322f516e84463f972414b1ee29361c3f043" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.743212 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.778427 4707 scope.go:117] "RemoveContainer" containerID="8412b0f341c16bc17bfaad23e702bccb5ab9e0ada0073ce3c9956d9c040fc596" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.804446 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.811423 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.821903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:47 crc kubenswrapper[4707]: E0121 16:16:47.822299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9815602d-6e85-4f3f-8aeb-fa674768302c" containerName="nova-cell0-conductor-db-sync" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.822656 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9815602d-6e85-4f3f-8aeb-fa674768302c" containerName="nova-cell0-conductor-db-sync" Jan 21 16:16:47 crc kubenswrapper[4707]: E0121 16:16:47.822673 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="sg-core" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.822679 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="sg-core" Jan 21 16:16:47 crc kubenswrapper[4707]: E0121 16:16:47.822690 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-notification-agent" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.822710 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-notification-agent" Jan 21 16:16:47 crc kubenswrapper[4707]: E0121 16:16:47.822722 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="proxy-httpd" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.822728 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="proxy-httpd" Jan 21 16:16:47 crc kubenswrapper[4707]: E0121 16:16:47.823654 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-central-agent" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.823756 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-central-agent" Jan 
21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.823938 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9815602d-6e85-4f3f-8aeb-fa674768302c" containerName="nova-cell0-conductor-db-sync" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.823949 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-central-agent" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.823966 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="ceilometer-notification-agent" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.823975 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="proxy-httpd" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.823986 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" containerName="sg-core" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.825513 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.829950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.829967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.835857 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.846959 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.848544 4707 scope.go:117] "RemoveContainer" containerID="fec7b9d03cf855300bc8135bf210cdb5dd7483d660057b697c4f93052c49f94a" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.848558 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.853353 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.853489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-cwf4d" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.857199 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.900882 4707 scope.go:117] "RemoveContainer" containerID="28c25f89e44ed8cabbbec74c68027574605ed40d203aa29efc5e8224e10c3416" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-run-httpd\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pngw4\" (UniqueName: \"kubernetes.io/projected/dece1ad7-067e-4834-9540-b6743be93db6-kube-api-access-pngw4\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-log-httpd\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-scripts\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-config-data\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:47 
crc kubenswrapper[4707]: I0121 16:16:47.915506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvf6g\" (UniqueName: \"kubernetes.io/projected/df0edeb9-c8a1-4b02-903a-af1c8de3f440-kube-api-access-kvf6g\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:47 crc kubenswrapper[4707]: I0121 16:16:47.915547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.016851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pngw4\" (UniqueName: \"kubernetes.io/projected/dece1ad7-067e-4834-9540-b6743be93db6-kube-api-access-pngw4\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.016902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-log-httpd\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.016923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-scripts\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.016981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-config-data\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvf6g\" (UniqueName: \"kubernetes.io/projected/df0edeb9-c8a1-4b02-903a-af1c8de3f440-kube-api-access-kvf6g\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-run-httpd\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.017829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-run-httpd\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.018365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-log-httpd\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.033532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.033743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.033855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-scripts\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.035044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-config-data\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.035425 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.035696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.038613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pngw4\" (UniqueName: \"kubernetes.io/projected/dece1ad7-067e-4834-9540-b6743be93db6-kube-api-access-pngw4\") pod \"nova-cell0-conductor-0\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.039146 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvf6g\" (UniqueName: \"kubernetes.io/projected/df0edeb9-c8a1-4b02-903a-af1c8de3f440-kube-api-access-kvf6g\") pod \"ceilometer-0\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.147196 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.200461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.571697 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.649854 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:48 crc kubenswrapper[4707]: W0121 16:16:48.649996 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddece1ad7_067e_4834_9540_b6743be93db6.slice/crio-ba19939958d90f408b93fcb401b29df45d40d61f431b89f30d055b0dc42cde4a WatchSource:0}: Error finding container ba19939958d90f408b93fcb401b29df45d40d61f431b89f30d055b0dc42cde4a: Status 404 returned error can't find the container with id ba19939958d90f408b93fcb401b29df45d40d61f431b89f30d055b0dc42cde4a Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.754146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerStarted","Data":"f61b914a1bfbd2de4bd56e43c20c632465a9763580e20f0fec7c2ac721ef15c2"} Jan 21 16:16:48 crc kubenswrapper[4707]: I0121 16:16:48.757290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dece1ad7-067e-4834-9540-b6743be93db6","Type":"ContainerStarted","Data":"ba19939958d90f408b93fcb401b29df45d40d61f431b89f30d055b0dc42cde4a"} Jan 21 16:16:49 crc kubenswrapper[4707]: I0121 16:16:49.193922 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1153bb3c-307b-48fc-b0f9-50db76937f43" 
path="/var/lib/kubelet/pods/1153bb3c-307b-48fc-b0f9-50db76937f43/volumes" Jan 21 16:16:49 crc kubenswrapper[4707]: I0121 16:16:49.767863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dece1ad7-067e-4834-9540-b6743be93db6","Type":"ContainerStarted","Data":"c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161"} Jan 21 16:16:49 crc kubenswrapper[4707]: I0121 16:16:49.768274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:49 crc kubenswrapper[4707]: I0121 16:16:49.769751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerStarted","Data":"1597feabb8d21f3510c2a3bef67f6d2250cbc18d1b39b3864fe125c0caf4ee50"} Jan 21 16:16:50 crc kubenswrapper[4707]: I0121 16:16:50.782902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerStarted","Data":"54f3fd6d7f4c27e1565ae48afce64db522ef5e0bf6f32d8e333882ed6b5ae842"} Jan 21 16:16:50 crc kubenswrapper[4707]: I0121 16:16:50.783371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerStarted","Data":"ef1839ed46b9bd37ab4f97f30c6925c49a2de7a8a40de4236fdc49b99dbe0f27"} Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.243498 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=5.243474605 podStartE2EDuration="5.243474605s" podCreationTimestamp="2026-01-21 16:16:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:49.786664213 +0000 UTC m=+4506.968180435" watchObservedRunningTime="2026-01-21 16:16:52.243474605 +0000 UTC m=+4509.424990827" Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.246466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.246679 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-log" containerID="cri-o://8c3d1aa4c7e07ced0f9c9649b79271951b2b5ac01a281dd0765d10564964c5eb" gracePeriod=30 Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.246768 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-httpd" containerID="cri-o://ed5d874d9c4d8d662d1aaf49cf031a81b9c998a70b0df4bb18a92e7b5f5ccae2" gracePeriod=30 Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.797896 4707 generic.go:334] "Generic (PLEG): container finished" podID="42d3240b-126c-4815-b24a-0b02f8416c58" containerID="8c3d1aa4c7e07ced0f9c9649b79271951b2b5ac01a281dd0765d10564964c5eb" exitCode=143 Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.797961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" 
event={"ID":"42d3240b-126c-4815-b24a-0b02f8416c58","Type":"ContainerDied","Data":"8c3d1aa4c7e07ced0f9c9649b79271951b2b5ac01a281dd0765d10564964c5eb"} Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.800334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerStarted","Data":"4612bae2bcc2ddd52c87c41bb62385e1e556bbc875be1e1cfcdbbaddd73ce0fa"} Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.800493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:52 crc kubenswrapper[4707]: I0121 16:16:52.821333 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.362360247 podStartE2EDuration="5.821319925s" podCreationTimestamp="2026-01-21 16:16:47 +0000 UTC" firstStartedPulling="2026-01-21 16:16:48.575317272 +0000 UTC m=+4505.756833495" lastFinishedPulling="2026-01-21 16:16:52.034276951 +0000 UTC m=+4509.215793173" observedRunningTime="2026-01-21 16:16:52.815346103 +0000 UTC m=+4509.996862325" watchObservedRunningTime="2026-01-21 16:16:52.821319925 +0000 UTC m=+4510.002836148" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.227636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.262593 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.784744 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-57llq"] Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.785993 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.788662 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.788882 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.797643 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-57llq"] Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.807654 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="dece1ad7-067e-4834-9540-b6743be93db6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161" gracePeriod=30 Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.837687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-scripts\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.837757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-config-data\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.837904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.837958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49lkk\" (UniqueName: \"kubernetes.io/projected/c9b17afa-1293-4da3-9638-839df6d49df9-kube-api-access-49lkk\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.914007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.918649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.925023 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.932324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49lkk\" (UniqueName: \"kubernetes.io/projected/c9b17afa-1293-4da3-9638-839df6d49df9-kube-api-access-49lkk\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qvm\" (UniqueName: \"kubernetes.io/projected/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-kube-api-access-n5qvm\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-config-data\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-scripts\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-config-data\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.939710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-logs\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:53 crc 
kubenswrapper[4707]: I0121 16:16:53.959482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.967659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-scripts\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.974915 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-config-data\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.975505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49lkk\" (UniqueName: \"kubernetes.io/projected/c9b17afa-1293-4da3-9638-839df6d49df9-kube-api-access-49lkk\") pod \"nova-cell0-cell-mapping-57llq\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.976616 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.978284 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.982227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:16:53 crc kubenswrapper[4707]: I0121 16:16:53.994410 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.038857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.040046 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.041568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-logs\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.041855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qvm\" (UniqueName: \"kubernetes.io/projected/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-kube-api-access-n5qvm\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.041950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-config-data\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.041985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.044294 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.045049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-logs\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.047468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.051775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-config-data\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.052362 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.062441 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.063895 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.065382 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.068453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.100798 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.105322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qvm\" (UniqueName: \"kubernetes.io/projected/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-kube-api-access-n5qvm\") pod \"nova-api-0\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.144523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-logs\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.144664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.144768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-config-data\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.144826 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.144902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glfv8\" (UniqueName: \"kubernetes.io/projected/e7217542-f073-4e86-99e0-7b94c00199f1-kube-api-access-glfv8\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.144994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bbjc\" (UniqueName: \"kubernetes.io/projected/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-kube-api-access-4bbjc\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.145027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-config-data\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.238970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.247931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.247997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-config-data\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glfv8\" (UniqueName: \"kubernetes.io/projected/e7217542-f073-4e86-99e0-7b94c00199f1-kube-api-access-glfv8\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bbjc\" (UniqueName: \"kubernetes.io/projected/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-kube-api-access-4bbjc\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-config-data\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248226 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-logs\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.248257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhc5t\" (UniqueName: \"kubernetes.io/projected/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-kube-api-access-vhc5t\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.249857 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-logs\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.274246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-config-data\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.275504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glfv8\" (UniqueName: \"kubernetes.io/projected/e7217542-f073-4e86-99e0-7b94c00199f1-kube-api-access-glfv8\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.276276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-config-data\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.283277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.283443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.284611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bbjc\" (UniqueName: \"kubernetes.io/projected/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-kube-api-access-4bbjc\") pod \"nova-metadata-0\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.349301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.349395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhc5t\" (UniqueName: \"kubernetes.io/projected/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-kube-api-access-vhc5t\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.349434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.354994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.359485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.371770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhc5t\" (UniqueName: \"kubernetes.io/projected/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-kube-api-access-vhc5t\") pod \"nova-cell1-novncproxy-0\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.514696 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.523446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.530503 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.549153 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-57llq"] Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.790625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:16:54 crc kubenswrapper[4707]: W0121 16:16:54.812307 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda23ccb03_08c8_451b_a812_30e8c8e5d2c1.slice/crio-3fb279ee4e25f2d09339a92a7585281acaa0d178b732cce28bf076432e1d167c WatchSource:0}: Error finding container 3fb279ee4e25f2d09339a92a7585281acaa0d178b732cce28bf076432e1d167c: Status 404 returned error can't find the container with id 3fb279ee4e25f2d09339a92a7585281acaa0d178b732cce28bf076432e1d167c Jan 21 16:16:54 crc kubenswrapper[4707]: I0121 16:16:54.868647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" event={"ID":"c9b17afa-1293-4da3-9638-839df6d49df9","Type":"ContainerStarted","Data":"eb865e9ac4398425fb1838af239a7dacfbbd4b04ac170382b98cc01c0a4a2581"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.003650 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.298367 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.298691 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-central-agent" containerID="cri-o://1597feabb8d21f3510c2a3bef67f6d2250cbc18d1b39b3864fe125c0caf4ee50" gracePeriod=30 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.299546 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="proxy-httpd" containerID="cri-o://4612bae2bcc2ddd52c87c41bb62385e1e556bbc875be1e1cfcdbbaddd73ce0fa" gracePeriod=30 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.299631 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="sg-core" containerID="cri-o://54f3fd6d7f4c27e1565ae48afce64db522ef5e0bf6f32d8e333882ed6b5ae842" gracePeriod=30 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.299681 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-notification-agent" containerID="cri-o://ef1839ed46b9bd37ab4f97f30c6925c49a2de7a8a40de4236fdc49b99dbe0f27" gracePeriod=30 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.404950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.439130 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:55 crc kubenswrapper[4707]: W0121 16:16:55.651065 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2be5d8fa_2912_4fa0_b63c_c6c4c900676c.slice/crio-db483f7796f91c72d457de57cfa9e32931c54b9fcdc8c0ebdae99f547e7beb4f WatchSource:0}: Error finding container db483f7796f91c72d457de57cfa9e32931c54b9fcdc8c0ebdae99f547e7beb4f: Status 404 returned error can't find the container with id db483f7796f91c72d457de57cfa9e32931c54b9fcdc8c0ebdae99f547e7beb4f Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.890052 4707 generic.go:334] "Generic (PLEG): container finished" podID="42d3240b-126c-4815-b24a-0b02f8416c58" containerID="ed5d874d9c4d8d662d1aaf49cf031a81b9c998a70b0df4bb18a92e7b5f5ccae2" exitCode=0 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.890349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42d3240b-126c-4815-b24a-0b02f8416c58","Type":"ContainerDied","Data":"ed5d874d9c4d8d662d1aaf49cf031a81b9c998a70b0df4bb18a92e7b5f5ccae2"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.892496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"2be5d8fa-2912-4fa0-b63c-c6c4c900676c","Type":"ContainerStarted","Data":"db483f7796f91c72d457de57cfa9e32931c54b9fcdc8c0ebdae99f547e7beb4f"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.893717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a23ccb03-08c8-451b-a812-30e8c8e5d2c1","Type":"ContainerStarted","Data":"3fb279ee4e25f2d09339a92a7585281acaa0d178b732cce28bf076432e1d167c"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.895616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3","Type":"ContainerStarted","Data":"41f3181a0e2a262a4cffdc46068c82ea7b64b20663e6263af623a75991f25d23"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.897924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" event={"ID":"c9b17afa-1293-4da3-9638-839df6d49df9","Type":"ContainerStarted","Data":"d84a5d6d4141a509df8d8a1c995f601b8feea571b5b15f5471acb0649ff8cffc"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.907528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e7217542-f073-4e86-99e0-7b94c00199f1","Type":"ContainerStarted","Data":"956e9d766cde0081bc8352a62eccda34e5fc064b98b9c9804cb3abc1ddb94c21"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.922799 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" podStartSLOduration=2.92278649 podStartE2EDuration="2.92278649s" podCreationTimestamp="2026-01-21 16:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:55.918157677 +0000 UTC m=+4513.099673899" watchObservedRunningTime="2026-01-21 16:16:55.92278649 +0000 UTC m=+4513.104302703" Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.928985 4707 generic.go:334] "Generic (PLEG): container finished" podID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerID="4612bae2bcc2ddd52c87c41bb62385e1e556bbc875be1e1cfcdbbaddd73ce0fa" exitCode=0 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.929021 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerID="54f3fd6d7f4c27e1565ae48afce64db522ef5e0bf6f32d8e333882ed6b5ae842" exitCode=2 Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.929023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerDied","Data":"4612bae2bcc2ddd52c87c41bb62385e1e556bbc875be1e1cfcdbbaddd73ce0fa"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.929056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerDied","Data":"54f3fd6d7f4c27e1565ae48afce64db522ef5e0bf6f32d8e333882ed6b5ae842"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.929090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerDied","Data":"ef1839ed46b9bd37ab4f97f30c6925c49a2de7a8a40de4236fdc49b99dbe0f27"} Jan 21 16:16:55 crc kubenswrapper[4707]: I0121 16:16:55.929031 4707 generic.go:334] "Generic (PLEG): container finished" podID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerID="ef1839ed46b9bd37ab4f97f30c6925c49a2de7a8a40de4236fdc49b99dbe0f27" exitCode=0 Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.217934 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-combined-ca-bundle\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-scripts\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-httpd-run\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-config-data\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-logs\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " 
Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-public-tls-certs\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.353527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvjk\" (UniqueName: \"kubernetes.io/projected/42d3240b-126c-4815-b24a-0b02f8416c58-kube-api-access-sqvjk\") pod \"42d3240b-126c-4815-b24a-0b02f8416c58\" (UID: \"42d3240b-126c-4815-b24a-0b02f8416c58\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.355278 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.355455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-logs" (OuterVolumeSpecName: "logs") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.361724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-scripts" (OuterVolumeSpecName: "scripts") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.378293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.380942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d3240b-126c-4815-b24a-0b02f8416c58-kube-api-access-sqvjk" (OuterVolumeSpecName: "kube-api-access-sqvjk") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "kube-api-access-sqvjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.417466 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6"] Jan 21 16:16:56 crc kubenswrapper[4707]: E0121 16:16:56.418075 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-httpd" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.418097 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-httpd" Jan 21 16:16:56 crc kubenswrapper[4707]: E0121 16:16:56.418141 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-log" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.418149 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-log" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.418493 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-httpd" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.418520 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" containerName="glance-log" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.419371 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.421830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.421954 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.424889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6"] Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.428345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.430635 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.441034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-config-data" (OuterVolumeSpecName: "config-data") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.457923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcxh7\" (UniqueName: \"kubernetes.io/projected/c1201e72-7dfd-4656-9a72-89fce1540f36-kube-api-access-fcxh7\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.458030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-scripts\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-config-data\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459844 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459888 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459900 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d3240b-126c-4815-b24a-0b02f8416c58-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvjk\" (UniqueName: \"kubernetes.io/projected/42d3240b-126c-4815-b24a-0b02f8416c58-kube-api-access-sqvjk\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459923 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459970 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.459980 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.498218 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.502965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "42d3240b-126c-4815-b24a-0b02f8416c58" (UID: "42d3240b-126c-4815-b24a-0b02f8416c58"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.561025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-config-data\") pod \"dece1ad7-067e-4834-9540-b6743be93db6\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.561131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-combined-ca-bundle\") pod \"dece1ad7-067e-4834-9540-b6743be93db6\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.561248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pngw4\" (UniqueName: \"kubernetes.io/projected/dece1ad7-067e-4834-9540-b6743be93db6-kube-api-access-pngw4\") pod \"dece1ad7-067e-4834-9540-b6743be93db6\" (UID: \"dece1ad7-067e-4834-9540-b6743be93db6\") " Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.561817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcxh7\" (UniqueName: \"kubernetes.io/projected/c1201e72-7dfd-4656-9a72-89fce1540f36-kube-api-access-fcxh7\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.561927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.561971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-scripts\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.562058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-config-data\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 
16:16:56.562136 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3240b-126c-4815-b24a-0b02f8416c58-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.562154 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.570040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dece1ad7-067e-4834-9540-b6743be93db6-kube-api-access-pngw4" (OuterVolumeSpecName: "kube-api-access-pngw4") pod "dece1ad7-067e-4834-9540-b6743be93db6" (UID: "dece1ad7-067e-4834-9540-b6743be93db6"). InnerVolumeSpecName "kube-api-access-pngw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.571218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.576634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-scripts\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.586426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-config-data\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.595423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcxh7\" (UniqueName: \"kubernetes.io/projected/c1201e72-7dfd-4656-9a72-89fce1540f36-kube-api-access-fcxh7\") pod \"nova-cell1-conductor-db-sync-6c4n6\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.610184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-config-data" (OuterVolumeSpecName: "config-data") pod "dece1ad7-067e-4834-9540-b6743be93db6" (UID: "dece1ad7-067e-4834-9540-b6743be93db6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.627942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dece1ad7-067e-4834-9540-b6743be93db6" (UID: "dece1ad7-067e-4834-9540-b6743be93db6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.664712 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.664761 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dece1ad7-067e-4834-9540-b6743be93db6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.664782 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pngw4\" (UniqueName: \"kubernetes.io/projected/dece1ad7-067e-4834-9540-b6743be93db6-kube-api-access-pngw4\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.748522 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.929633 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.949909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.993905 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.996047 4707 generic.go:334] "Generic (PLEG): container finished" podID="dece1ad7-067e-4834-9540-b6743be93db6" containerID="c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161" exitCode=0 Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.996132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dece1ad7-067e-4834-9540-b6743be93db6","Type":"ContainerDied","Data":"c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161"} Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.996172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"dece1ad7-067e-4834-9540-b6743be93db6","Type":"ContainerDied","Data":"ba19939958d90f408b93fcb401b29df45d40d61f431b89f30d055b0dc42cde4a"} Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.996192 4707 scope.go:117] "RemoveContainer" containerID="c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161" Jan 21 16:16:56 crc kubenswrapper[4707]: I0121 16:16:56.996341 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.019675 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.020002 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-log" containerID="cri-o://0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d" gracePeriod=30 Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.020233 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-httpd" containerID="cri-o://68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd" gracePeriod=30 Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.024208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3","Type":"ContainerStarted","Data":"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.024241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3","Type":"ContainerStarted","Data":"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.033624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e7217542-f073-4e86-99e0-7b94c00199f1","Type":"ContainerStarted","Data":"f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.037452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42d3240b-126c-4815-b24a-0b02f8416c58","Type":"ContainerDied","Data":"da50fde175b1926f100bc671b0b7feac1bb445d445efe64cac69c6f9b1f7b970"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.037495 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.039173 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.041731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"2be5d8fa-2912-4fa0-b63c-c6c4c900676c","Type":"ContainerStarted","Data":"9b67c6c7ea85f7da2d715aeaa27b3d5b036ff782626c55f570dd79f6bf1909ba"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.049224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a23ccb03-08c8-451b-a812-30e8c8e5d2c1","Type":"ContainerStarted","Data":"b00a575790136fb51f83a36937c0bf685fbf96721e874df4f43633241a5e7e5b"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.049254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a23ccb03-08c8-451b-a812-30e8c8e5d2c1","Type":"ContainerStarted","Data":"9be2e202a9a4eb0ad36284ec6bf6b0abd927658eb1a4c2e893312eb2e77eb498"} Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.070677 4707 scope.go:117] "RemoveContainer" containerID="c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161" Jan 21 16:16:57 crc kubenswrapper[4707]: E0121 16:16:57.071071 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161\": container with ID starting with c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161 not found: ID does not exist" containerID="c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.071106 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161"} err="failed to get container status \"c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161\": rpc error: code = NotFound desc = could not find container \"c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161\": container with ID starting with c45c824b278dda45f37c3e1058db3dad27380624ed9009b85c53f2aad845d161 not found: ID does not exist" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.071127 4707 scope.go:117] "RemoveContainer" containerID="ed5d874d9c4d8d662d1aaf49cf031a81b9c998a70b0df4bb18a92e7b5f5ccae2" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.080861 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=4.080840515 podStartE2EDuration="4.080840515s" podCreationTimestamp="2026-01-21 16:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:57.047223385 +0000 UTC m=+4514.228739606" watchObservedRunningTime="2026-01-21 16:16:57.080840515 +0000 UTC m=+4514.262356736" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.095609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.110355 4707 scope.go:117] "RemoveContainer" containerID="8c3d1aa4c7e07ced0f9c9649b79271951b2b5ac01a281dd0765d10564964c5eb" Jan 21 16:16:57 crc 
kubenswrapper[4707]: I0121 16:16:57.110455 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.126342 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=4.1263098209999995 podStartE2EDuration="4.126309821s" podCreationTimestamp="2026-01-21 16:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:57.083793728 +0000 UTC m=+4514.265309950" watchObservedRunningTime="2026-01-21 16:16:57.126309821 +0000 UTC m=+4514.307826043" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.226337 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dece1ad7-067e-4834-9540-b6743be93db6" path="/var/lib/kubelet/pods/dece1ad7-067e-4834-9540-b6743be93db6/volumes" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.229057 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: E0121 16:16:57.229385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dece1ad7-067e-4834-9540-b6743be93db6" containerName="nova-cell0-conductor-conductor" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.229397 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dece1ad7-067e-4834-9540-b6743be93db6" containerName="nova-cell0-conductor-conductor" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.229628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dece1ad7-067e-4834-9540-b6743be93db6" containerName="nova-cell0-conductor-conductor" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.230523 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.237341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.249626 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.250202 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.2501789 podStartE2EDuration="3.2501789s" podCreationTimestamp="2026-01-21 16:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:57.110505243 +0000 UTC m=+4514.292021466" watchObservedRunningTime="2026-01-21 16:16:57.2501789 +0000 UTC m=+4514.431695123" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.265181 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.271912 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.277029 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.278651 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.280259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.280329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9467\" (UniqueName: \"kubernetes.io/projected/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-kube-api-access-m9467\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.280393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.281061 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.281430 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.284296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.286228 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=4.286214168 podStartE2EDuration="4.286214168s" podCreationTimestamp="2026-01-21 16:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:57.172449356 +0000 UTC m=+4514.353965578" watchObservedRunningTime="2026-01-21 16:16:57.286214168 +0000 UTC m=+4514.467730390" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.302620 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6"] Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381771 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-logs\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.381989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.382051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.382078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqmff\" (UniqueName: \"kubernetes.io/projected/f0537ef6-aeda-4293-aafa-5889f66c493a-kube-api-access-dqmff\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.382100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9467\" (UniqueName: \"kubernetes.io/projected/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-kube-api-access-m9467\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.382143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.386070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.386549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.406543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9467\" (UniqueName: \"kubernetes.io/projected/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-kube-api-access-m9467\") pod \"nova-cell0-conductor-0\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqmff\" (UniqueName: \"kubernetes.io/projected/f0537ef6-aeda-4293-aafa-5889f66c493a-kube-api-access-dqmff\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-logs\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.484687 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.485159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-logs\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.487549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.487742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.489409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.489590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.491362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.498663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqmff\" (UniqueName: \"kubernetes.io/projected/f0537ef6-aeda-4293-aafa-5889f66c493a-kube-api-access-dqmff\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.515231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.697352 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:57 crc kubenswrapper[4707]: I0121 16:16:57.713550 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.059036 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerID="0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d" exitCode=143 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.059340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea","Type":"ContainerDied","Data":"0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d"} Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.061363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" event={"ID":"c1201e72-7dfd-4656-9a72-89fce1540f36","Type":"ContainerStarted","Data":"38f07e4dab137756d0ec3b0c9768ae2674630a8f493a3424ecef92d8bef537e0"} Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.061424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" event={"ID":"c1201e72-7dfd-4656-9a72-89fce1540f36","Type":"ContainerStarted","Data":"804393e15f726ee971a049ef8e53177ea7de052717b845e99254a7835cb3ce0d"} Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.062715 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="2be5d8fa-2912-4fa0-b63c-c6c4c900676c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9b67c6c7ea85f7da2d715aeaa27b3d5b036ff782626c55f570dd79f6bf1909ba" gracePeriod=30 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.064309 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="e7217542-f073-4e86-99e0-7b94c00199f1" containerName="nova-scheduler-scheduler" containerID="cri-o://f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6" gracePeriod=30 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.064531 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-log" 
containerID="cri-o://9be2e202a9a4eb0ad36284ec6bf6b0abd927658eb1a4c2e893312eb2e77eb498" gracePeriod=30 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.064669 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-log" containerID="cri-o://7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246" gracePeriod=30 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.064754 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-api" containerID="cri-o://b00a575790136fb51f83a36937c0bf685fbf96721e874df4f43633241a5e7e5b" gracePeriod=30 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.064864 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-metadata" containerID="cri-o://474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127" gracePeriod=30 Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.096928 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" podStartSLOduration=2.096909355 podStartE2EDuration="2.096909355s" podCreationTimestamp="2026-01-21 16:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:58.08531805 +0000 UTC m=+4515.266834273" watchObservedRunningTime="2026-01-21 16:16:58.096909355 +0000 UTC m=+4515.278425577" Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.200908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.310118 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:16:58 crc kubenswrapper[4707]: W0121 16:16:58.376731 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0537ef6_aeda_4293_aafa_5889f66c493a.slice/crio-bf63591a178edb42ff637bbe6dcc4de0a9ba3705b9a710e72718c2ab42fdcf9a WatchSource:0}: Error finding container bf63591a178edb42ff637bbe6dcc4de0a9ba3705b9a710e72718c2ab42fdcf9a: Status 404 returned error can't find the container with id bf63591a178edb42ff637bbe6dcc4de0a9ba3705b9a710e72718c2ab42fdcf9a Jan 21 16:16:58 crc kubenswrapper[4707]: I0121 16:16:58.942076 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091498 4707 generic.go:334] "Generic (PLEG): container finished" podID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerID="474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127" exitCode=0 Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091539 4707 generic.go:334] "Generic (PLEG): container finished" podID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerID="7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246" exitCode=143 Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3","Type":"ContainerDied","Data":"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3","Type":"ContainerDied","Data":"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3","Type":"ContainerDied","Data":"41f3181a0e2a262a4cffdc46068c82ea7b64b20663e6263af623a75991f25d23"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091650 4707 scope.go:117] "RemoveContainer" containerID="474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.091754 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.118010 4707 generic.go:334] "Generic (PLEG): container finished" podID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerID="1597feabb8d21f3510c2a3bef67f6d2250cbc18d1b39b3864fe125c0caf4ee50" exitCode=0 Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.118098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerDied","Data":"1597feabb8d21f3510c2a3bef67f6d2250cbc18d1b39b3864fe125c0caf4ee50"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.118129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"df0edeb9-c8a1-4b02-903a-af1c8de3f440","Type":"ContainerDied","Data":"f61b914a1bfbd2de4bd56e43c20c632465a9763580e20f0fec7c2ac721ef15c2"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.118161 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61b914a1bfbd2de4bd56e43c20c632465a9763580e20f0fec7c2ac721ef15c2" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.131184 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.131463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-logs\") pod \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.131510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-combined-ca-bundle\") pod \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.131551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bbjc\" (UniqueName: \"kubernetes.io/projected/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-kube-api-access-4bbjc\") pod \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.131570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-config-data\") pod \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\" (UID: \"bfd729e3-3442-4502-8ba1-4ccc06a2f8f3\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.132456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-logs" (OuterVolumeSpecName: "logs") pod "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" (UID: "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.133455 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.138193 4707 generic.go:334] "Generic (PLEG): container finished" podID="2be5d8fa-2912-4fa0-b63c-c6c4c900676c" containerID="9b67c6c7ea85f7da2d715aeaa27b3d5b036ff782626c55f570dd79f6bf1909ba" exitCode=0 Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.138255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"2be5d8fa-2912-4fa0-b63c-c6c4c900676c","Type":"ContainerDied","Data":"9b67c6c7ea85f7da2d715aeaa27b3d5b036ff782626c55f570dd79f6bf1909ba"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.139613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f0537ef6-aeda-4293-aafa-5889f66c493a","Type":"ContainerStarted","Data":"bf63591a178edb42ff637bbe6dcc4de0a9ba3705b9a710e72718c2ab42fdcf9a"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.140642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-kube-api-access-4bbjc" (OuterVolumeSpecName: "kube-api-access-4bbjc") pod "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" (UID: "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3"). InnerVolumeSpecName "kube-api-access-4bbjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.167083 4707 generic.go:334] "Generic (PLEG): container finished" podID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerID="b00a575790136fb51f83a36937c0bf685fbf96721e874df4f43633241a5e7e5b" exitCode=0 Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.167107 4707 generic.go:334] "Generic (PLEG): container finished" podID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerID="9be2e202a9a4eb0ad36284ec6bf6b0abd927658eb1a4c2e893312eb2e77eb498" exitCode=143 Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.167151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a23ccb03-08c8-451b-a812-30e8c8e5d2c1","Type":"ContainerDied","Data":"b00a575790136fb51f83a36937c0bf685fbf96721e874df4f43633241a5e7e5b"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.167186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a23ccb03-08c8-451b-a812-30e8c8e5d2c1","Type":"ContainerDied","Data":"9be2e202a9a4eb0ad36284ec6bf6b0abd927658eb1a4c2e893312eb2e77eb498"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.200039 4707 scope.go:117] "RemoveContainer" containerID="7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.200856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3","Type":"ContainerStarted","Data":"6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.200895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.200907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3","Type":"ContainerStarted","Data":"b32176cc9111901d94ff9d59ba9a5095d770d3610158e6ca39652624c20a15da"} Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.201504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" (UID: "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.203011 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-config-data" (OuterVolumeSpecName: "config-data") pod "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" (UID: "bfd729e3-3442-4502-8ba1-4ccc06a2f8f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.235900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-scripts\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.235973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-combined-ca-bundle\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.236050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-config-data\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.236083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-log-httpd\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.236129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-run-httpd\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.236214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvf6g\" (UniqueName: \"kubernetes.io/projected/df0edeb9-c8a1-4b02-903a-af1c8de3f440-kube-api-access-kvf6g\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.236317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-sg-core-conf-yaml\") pod \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\" (UID: \"df0edeb9-c8a1-4b02-903a-af1c8de3f440\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.237289 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.237311 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bbjc\" (UniqueName: \"kubernetes.io/projected/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-kube-api-access-4bbjc\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.237324 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.239071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.239441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.253333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-scripts" (OuterVolumeSpecName: "scripts") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.256297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0edeb9-c8a1-4b02-903a-af1c8de3f440-kube-api-access-kvf6g" (OuterVolumeSpecName: "kube-api-access-kvf6g") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "kube-api-access-kvf6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.275779 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.275762549 podStartE2EDuration="2.275762549s" podCreationTimestamp="2026-01-21 16:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:16:59.216598092 +0000 UTC m=+4516.398114324" watchObservedRunningTime="2026-01-21 16:16:59.275762549 +0000 UTC m=+4516.457278771" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.277043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.313852 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d3240b-126c-4815-b24a-0b02f8416c58" path="/var/lib/kubelet/pods/42d3240b-126c-4815-b24a-0b02f8416c58/volumes" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.326094 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.341472 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.341512 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvf6g\" (UniqueName: \"kubernetes.io/projected/df0edeb9-c8a1-4b02-903a-af1c8de3f440-kube-api-access-kvf6g\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.341531 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.341542 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.341554 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df0edeb9-c8a1-4b02-903a-af1c8de3f440-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.349016 4707 scope.go:117] "RemoveContainer" containerID="474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.350870 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127\": container with ID starting with 474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127 not found: ID does not exist" containerID="474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.350914 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127"} err="failed to get container status \"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127\": rpc error: code = NotFound desc = could not find container \"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127\": container with ID starting with 474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127 not found: ID does not exist" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.350946 4707 scope.go:117] "RemoveContainer" containerID="7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.351503 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246\": container with ID starting with 7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246 not found: ID does not exist" containerID="7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.351531 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246"} err="failed to get container status 
\"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246\": rpc error: code = NotFound desc = could not find container \"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246\": container with ID starting with 7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246 not found: ID does not exist" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.351547 4707 scope.go:117] "RemoveContainer" containerID="474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.351983 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127"} err="failed to get container status \"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127\": rpc error: code = NotFound desc = could not find container \"474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127\": container with ID starting with 474c7625ac8e5ce720dadc11c0f5fb9313787183036f2f61d73d296b7f386127 not found: ID does not exist" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.352000 4707 scope.go:117] "RemoveContainer" containerID="7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.352286 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246"} err="failed to get container status \"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246\": rpc error: code = NotFound desc = could not find container \"7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246\": container with ID starting with 7abeddad1d23285631cc4324738cc41ecc3415aa26542088403bc831df3e5246 not found: ID does not exist" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.401520 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.429778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.442872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-logs\") pod \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.443035 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-config-data\") pod \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.443237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-combined-ca-bundle\") pod \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.443284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5qvm\" (UniqueName: \"kubernetes.io/projected/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-kube-api-access-n5qvm\") pod \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\" (UID: \"a23ccb03-08c8-451b-a812-30e8c8e5d2c1\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.456776 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.464203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-logs" (OuterVolumeSpecName: "logs") pod "a23ccb03-08c8-451b-a812-30e8c8e5d2c1" (UID: "a23ccb03-08c8-451b-a812-30e8c8e5d2c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.466743 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-kube-api-access-n5qvm" (OuterVolumeSpecName: "kube-api-access-n5qvm") pod "a23ccb03-08c8-451b-a812-30e8c8e5d2c1" (UID: "a23ccb03-08c8-451b-a812-30e8c8e5d2c1"). InnerVolumeSpecName "kube-api-access-n5qvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.480070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.493244 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.493859 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-log" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.493879 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-log" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.493907 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-log" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.493915 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-log" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.493931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-notification-agent" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.493938 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-notification-agent" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.493957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be5d8fa-2912-4fa0-b63c-c6c4c900676c" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.493967 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be5d8fa-2912-4fa0-b63c-c6c4c900676c" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.493981 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-metadata" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.493988 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-metadata" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.494002 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-central-agent" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494009 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-central-agent" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.494019 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="proxy-httpd" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="proxy-httpd" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.494034 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="sg-core" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494041 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="sg-core" Jan 21 16:16:59 crc kubenswrapper[4707]: E0121 16:16:59.494064 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-api" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494071 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-api" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494303 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be5d8fa-2912-4fa0-b63c-c6c4c900676c" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494322 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-notification-agent" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494332 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-log" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494341 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-api" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494353 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" containerName="nova-api-log" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494367 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="sg-core" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494380 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="proxy-httpd" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494393 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" containerName="nova-metadata-metadata" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.494407 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" containerName="ceilometer-central-agent" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.495794 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.499360 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.512521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.524563 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.545189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-config-data\") pod \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.545490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhc5t\" (UniqueName: \"kubernetes.io/projected/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-kube-api-access-vhc5t\") pod \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.545583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-combined-ca-bundle\") pod \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\" (UID: \"2be5d8fa-2912-4fa0-b63c-c6c4c900676c\") " Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.546304 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.546326 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.546339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5qvm\" (UniqueName: \"kubernetes.io/projected/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-kube-api-access-n5qvm\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.555998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-kube-api-access-vhc5t" (OuterVolumeSpecName: "kube-api-access-vhc5t") pod "2be5d8fa-2912-4fa0-b63c-c6c4c900676c" (UID: "2be5d8fa-2912-4fa0-b63c-c6c4c900676c"). InnerVolumeSpecName "kube-api-access-vhc5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.577953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a23ccb03-08c8-451b-a812-30e8c8e5d2c1" (UID: "a23ccb03-08c8-451b-a812-30e8c8e5d2c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.583387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-config-data" (OuterVolumeSpecName: "config-data") pod "a23ccb03-08c8-451b-a812-30e8c8e5d2c1" (UID: "a23ccb03-08c8-451b-a812-30e8c8e5d2c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.599825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-config-data" (OuterVolumeSpecName: "config-data") pod "2be5d8fa-2912-4fa0-b63c-c6c4c900676c" (UID: "2be5d8fa-2912-4fa0-b63c-c6c4c900676c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.601733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2be5d8fa-2912-4fa0-b63c-c6c4c900676c" (UID: "2be5d8fa-2912-4fa0-b63c-c6c4c900676c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.603122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-config-data" (OuterVolumeSpecName: "config-data") pod "df0edeb9-c8a1-4b02-903a-af1c8de3f440" (UID: "df0edeb9-c8a1-4b02-903a-af1c8de3f440"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.649858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnxx5\" (UniqueName: \"kubernetes.io/projected/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-kube-api-access-jnxx5\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-logs\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-config-data\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650411 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650475 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhc5t\" (UniqueName: \"kubernetes.io/projected/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-kube-api-access-vhc5t\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650534 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0edeb9-c8a1-4b02-903a-af1c8de3f440-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650587 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650641 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23ccb03-08c8-451b-a812-30e8c8e5d2c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.650695 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be5d8fa-2912-4fa0-b63c-c6c4c900676c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.753220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnxx5\" (UniqueName: \"kubernetes.io/projected/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-kube-api-access-jnxx5\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.753333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-logs\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.753485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-config-data\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.753582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.754002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-logs\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.759554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-config-data\") pod \"nova-metadata-0\" 
(UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.763303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.770230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnxx5\" (UniqueName: \"kubernetes.io/projected/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-kube-api-access-jnxx5\") pod \"nova-metadata-0\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:16:59 crc kubenswrapper[4707]: I0121 16:16:59.833040 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.192351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a23ccb03-08c8-451b-a812-30e8c8e5d2c1","Type":"ContainerDied","Data":"3fb279ee4e25f2d09339a92a7585281acaa0d178b732cce28bf076432e1d167c"} Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.192902 4707 scope.go:117] "RemoveContainer" containerID="b00a575790136fb51f83a36937c0bf685fbf96721e874df4f43633241a5e7e5b" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.192405 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.202033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"2be5d8fa-2912-4fa0-b63c-c6c4c900676c","Type":"ContainerDied","Data":"db483f7796f91c72d457de57cfa9e32931c54b9fcdc8c0ebdae99f547e7beb4f"} Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.202109 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.207827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f0537ef6-aeda-4293-aafa-5889f66c493a","Type":"ContainerStarted","Data":"9edf57f9b05658747e57d4b578bd476b8dbc99dfcb2823b90d95e7f0335c5d65"} Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.207868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f0537ef6-aeda-4293-aafa-5889f66c493a","Type":"ContainerStarted","Data":"b2d316c0e1b9039139944364e3e8f6a2b8d263a177cb3b436cd75caafe27246a"} Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.207973 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.213254 4707 scope.go:117] "RemoveContainer" containerID="9be2e202a9a4eb0ad36284ec6bf6b0abd927658eb1a4c2e893312eb2e77eb498" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.228386 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.228366235 podStartE2EDuration="3.228366235s" podCreationTimestamp="2026-01-21 16:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:00.227377105 +0000 UTC m=+4517.408893328" watchObservedRunningTime="2026-01-21 16:17:00.228366235 +0000 UTC m=+4517.409882457" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.277405 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.286185 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.303741 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.305291 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.309350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.354869 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.392825 4707 scope.go:117] "RemoveContainer" containerID="9b67c6c7ea85f7da2d715aeaa27b3d5b036ff782626c55f570dd79f6bf1909ba" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.406380 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.426879 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.443419 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: E0121 16:17:00.445017 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-cjbkl logs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/nova-api-0" podUID="d8252089-7af3-4b4c-97a4-40e2cf8ce246" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.459889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.466940 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.475884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.482893 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 
16:17:00.484945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjbkl\" (UniqueName: \"kubernetes.io/projected/d8252089-7af3-4b4c-97a4-40e2cf8ce246-kube-api-access-cjbkl\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.485051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-config-data\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.485251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8252089-7af3-4b4c-97a4-40e2cf8ce246-logs\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.485312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.488749 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.506859 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.508216 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.512175 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.535094 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.537502 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.539501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.539675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.557499 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.569621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.589912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.590046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.590295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.590419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwsj\" (UniqueName: \"kubernetes.io/projected/b523514b-6772-4ddc-9db6-df7a8ddff1dc-kube-api-access-xrwsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.590509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjbkl\" (UniqueName: \"kubernetes.io/projected/d8252089-7af3-4b4c-97a4-40e2cf8ce246-kube-api-access-cjbkl\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.590599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-config-data\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.590728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8252089-7af3-4b4c-97a4-40e2cf8ce246-logs\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.591422 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8252089-7af3-4b4c-97a4-40e2cf8ce246-logs\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.609928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.623352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-config-data\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.624937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjbkl\" (UniqueName: \"kubernetes.io/projected/d8252089-7af3-4b4c-97a4-40e2cf8ce246-kube-api-access-cjbkl\") pod \"nova-api-0\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.696888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.696962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92tqq\" (UniqueName: \"kubernetes.io/projected/a660bc26-b5c3-47d0-a229-d1339c0ae215-kube-api-access-92tqq\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwsj\" (UniqueName: \"kubernetes.io/projected/b523514b-6772-4ddc-9db6-df7a8ddff1dc-kube-api-access-xrwsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-run-httpd\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-scripts\") pod 
\"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-config-data\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-log-httpd\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.697547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.701226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.704302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.719898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwsj\" (UniqueName: \"kubernetes.io/projected/b523514b-6772-4ddc-9db6-df7a8ddff1dc-kube-api-access-xrwsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.771261 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92tqq\" (UniqueName: \"kubernetes.io/projected/a660bc26-b5c3-47d0-a229-d1339c0ae215-kube-api-access-92tqq\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-run-httpd\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-scripts\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-config-data\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-log-httpd\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.803914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-run-httpd\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.804571 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-log-httpd\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.807319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.814392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-scripts\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.816083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.819767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-config-data\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.823875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92tqq\" (UniqueName: \"kubernetes.io/projected/a660bc26-b5c3-47d0-a229-d1339c0ae215-kube-api-access-92tqq\") pod \"ceilometer-0\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-combined-ca-bundle\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-logs\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slpm8\" (UniqueName: \"kubernetes.io/projected/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-kube-api-access-slpm8\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-internal-tls-certs\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-scripts\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-config-data\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.904983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-httpd-run\") pod \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\" (UID: \"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea\") " Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.905256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-logs" (OuterVolumeSpecName: "logs") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.905584 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.905851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.911926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-kube-api-access-slpm8" (OuterVolumeSpecName: "kube-api-access-slpm8") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "kube-api-access-slpm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.912043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.920864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-scripts" (OuterVolumeSpecName: "scripts") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.933401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.948973 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.951603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.956031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-config-data" (OuterVolumeSpecName: "config-data") pod "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" (UID: "7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:00 crc kubenswrapper[4707]: I0121 16:17:00.962254 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.008994 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.009029 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slpm8\" (UniqueName: \"kubernetes.io/projected/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-kube-api-access-slpm8\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.009067 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.009080 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.009090 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.009126 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.009136 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.032338 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.112889 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.194919 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be5d8fa-2912-4fa0-b63c-c6c4c900676c" path="/var/lib/kubelet/pods/2be5d8fa-2912-4fa0-b63c-c6c4c900676c/volumes" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.195631 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a23ccb03-08c8-451b-a812-30e8c8e5d2c1" path="/var/lib/kubelet/pods/a23ccb03-08c8-451b-a812-30e8c8e5d2c1/volumes" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.196204 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd729e3-3442-4502-8ba1-4ccc06a2f8f3" path="/var/lib/kubelet/pods/bfd729e3-3442-4502-8ba1-4ccc06a2f8f3/volumes" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.197172 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0edeb9-c8a1-4b02-903a-af1c8de3f440" path="/var/lib/kubelet/pods/df0edeb9-c8a1-4b02-903a-af1c8de3f440/volumes" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.222195 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerID="68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd" exitCode=0 Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.222265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea","Type":"ContainerDied","Data":"68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd"} Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.222299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea","Type":"ContainerDied","Data":"8896a2707c13bf3ae97b270f90ba3046300518ab781d56f48ed17d9cd6eb2255"} Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.222318 4707 scope.go:117] "RemoveContainer" containerID="68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.222463 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.231573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d","Type":"ContainerStarted","Data":"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815"} Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.231694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d","Type":"ContainerStarted","Data":"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6"} Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.231769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d","Type":"ContainerStarted","Data":"a9b66084f7ca2cb7b21a6b0053559ef8f23e5597b37c3de84e8ea7a632edf155"} Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.231752 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-metadata" containerID="cri-o://da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815" gracePeriod=30 Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.231640 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-log" containerID="cri-o://fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6" gracePeriod=30 Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.245274 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.245553 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" gracePeriod=30 Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.263262 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.263248661 podStartE2EDuration="2.263248661s" podCreationTimestamp="2026-01-21 16:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:01.245512812 +0000 UTC m=+4518.427029033" watchObservedRunningTime="2026-01-21 16:17:01.263248661 +0000 UTC m=+4518.444764883" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.265670 4707 scope.go:117] "RemoveContainer" containerID="0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.265980 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.281052 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.303791 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.312193 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:17:01 crc kubenswrapper[4707]: E0121 16:17:01.312730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-httpd" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.312744 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-httpd" Jan 21 16:17:01 crc kubenswrapper[4707]: E0121 16:17:01.312757 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-log" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.312762 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-log" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.312939 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-httpd" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.312960 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" containerName="glance-log" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.313958 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.316677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.317970 4707 scope.go:117] "RemoveContainer" containerID="68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.318155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.318592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 21 16:17:01 crc kubenswrapper[4707]: E0121 16:17:01.319901 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd\": container with ID starting with 68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd not found: ID does not exist" containerID="68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.319939 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd"} err="failed to get container status \"68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd\": rpc error: code = NotFound desc = could not find container \"68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd\": container with ID starting with 68026bcae8ee285d36422d47adfbbe187b6fd02872b3c9bf35954fcb1dc1a2fd not found: ID does not exist" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.319983 4707 scope.go:117] "RemoveContainer" containerID="0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d" Jan 21 16:17:01 crc kubenswrapper[4707]: E0121 16:17:01.321138 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d\": container with ID starting with 0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d not found: ID does not exist" containerID="0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.321172 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d"} err="failed to get container status \"0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d\": rpc error: code = NotFound desc = could not find container \"0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d\": container with ID starting with 0897eb77b1880a69f15ab67fcfc72cedbe5b3982e07b762e58ff50448268e94d not found: ID does not exist" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.419110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-config-data\") pod \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.419529 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8252089-7af3-4b4c-97a4-40e2cf8ce246-logs\") pod \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.419620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-combined-ca-bundle\") pod \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.419662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjbkl\" (UniqueName: \"kubernetes.io/projected/d8252089-7af3-4b4c-97a4-40e2cf8ce246-kube-api-access-cjbkl\") pod \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\" (UID: \"d8252089-7af3-4b4c-97a4-40e2cf8ce246\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtzd\" (UniqueName: \"kubernetes.io/projected/6cd161af-38fc-439e-8b3f-d4549dc3708f-kube-api-access-bvtzd\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-logs\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420726 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.420802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.421411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8252089-7af3-4b4c-97a4-40e2cf8ce246-logs" (OuterVolumeSpecName: "logs") pod "d8252089-7af3-4b4c-97a4-40e2cf8ce246" (UID: "d8252089-7af3-4b4c-97a4-40e2cf8ce246"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.426106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8252089-7af3-4b4c-97a4-40e2cf8ce246-kube-api-access-cjbkl" (OuterVolumeSpecName: "kube-api-access-cjbkl") pod "d8252089-7af3-4b4c-97a4-40e2cf8ce246" (UID: "d8252089-7af3-4b4c-97a4-40e2cf8ce246"). InnerVolumeSpecName "kube-api-access-cjbkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.426507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8252089-7af3-4b4c-97a4-40e2cf8ce246" (UID: "d8252089-7af3-4b4c-97a4-40e2cf8ce246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.433158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-config-data" (OuterVolumeSpecName: "config-data") pod "d8252089-7af3-4b4c-97a4-40e2cf8ce246" (UID: "d8252089-7af3-4b4c-97a4-40e2cf8ce246"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.449693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:01 crc kubenswrapper[4707]: W0121 16:17:01.465367 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda660bc26_b5c3_47d0_a229_d1339c0ae215.slice/crio-cd2c8c2374caa5de231cbcc5b98dfda881c4ef0036dc9bc11310e5e8c4d964bd WatchSource:0}: Error finding container cd2c8c2374caa5de231cbcc5b98dfda881c4ef0036dc9bc11310e5e8c4d964bd: Status 404 returned error can't find the container with id cd2c8c2374caa5de231cbcc5b98dfda881c4ef0036dc9bc11310e5e8c4d964bd Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.525100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.525155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.525196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.525899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.526342 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.526364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtzd\" (UniqueName: \"kubernetes.io/projected/6cd161af-38fc-439e-8b3f-d4549dc3708f-kube-api-access-bvtzd\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.526399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 
crc kubenswrapper[4707]: I0121 16:17:01.526429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-logs\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.526538 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjbkl\" (UniqueName: \"kubernetes.io/projected/d8252089-7af3-4b4c-97a4-40e2cf8ce246-kube-api-access-cjbkl\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.526682 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.527269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.527336 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8252089-7af3-4b4c-97a4-40e2cf8ce246-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.527354 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8252089-7af3-4b4c-97a4-40e2cf8ce246-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.527500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-logs\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.528342 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.534127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.536425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.536927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.538434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.539368 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.548432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtzd\" (UniqueName: \"kubernetes.io/projected/6cd161af-38fc-439e-8b3f-d4549dc3708f-kube-api-access-bvtzd\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: W0121 16:17:01.555947 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb523514b_6772_4ddc_9db6_df7a8ddff1dc.slice/crio-fb994d5739668f4a5cf65e4a1ab69785a7a0beccc96847c90ab343f6e4a25f84 WatchSource:0}: Error finding container fb994d5739668f4a5cf65e4a1ab69785a7a0beccc96847c90ab343f6e4a25f84: Status 404 returned error can't find the container with id fb994d5739668f4a5cf65e4a1ab69785a7a0beccc96847c90ab343f6e4a25f84 Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.569299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.642112 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.706404 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.831348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-logs\") pod \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.831622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-config-data\") pod \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.831695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-combined-ca-bundle\") pod \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.831773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnxx5\" (UniqueName: \"kubernetes.io/projected/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-kube-api-access-jnxx5\") pod \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\" (UID: \"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d\") " Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.832485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-logs" (OuterVolumeSpecName: "logs") pod "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" (UID: "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.839472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-kube-api-access-jnxx5" (OuterVolumeSpecName: "kube-api-access-jnxx5") pod "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" (UID: "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d"). InnerVolumeSpecName "kube-api-access-jnxx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.866697 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" (UID: "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.878762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-config-data" (OuterVolumeSpecName: "config-data") pod "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" (UID: "a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.934384 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.934412 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnxx5\" (UniqueName: \"kubernetes.io/projected/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-kube-api-access-jnxx5\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.934425 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:01 crc kubenswrapper[4707]: I0121 16:17:01.934435 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.104375 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: W0121 16:17:02.114847 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd161af_38fc_439e_8b3f_d4549dc3708f.slice/crio-09b1ae05a7530b916744156378a126a0470924029ea3c389f8b733f335203e22 WatchSource:0}: Error finding container 09b1ae05a7530b916744156378a126a0470924029ea3c389f8b733f335203e22: Status 404 returned error can't find the container with id 09b1ae05a7530b916744156378a126a0470924029ea3c389f8b733f335203e22 Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.261976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6cd161af-38fc-439e-8b3f-d4549dc3708f","Type":"ContainerStarted","Data":"09b1ae05a7530b916744156378a126a0470924029ea3c389f8b733f335203e22"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.281796 4707 generic.go:334] "Generic (PLEG): container finished" podID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerID="da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815" exitCode=0 Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.281841 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.281862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d","Type":"ContainerDied","Data":"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.281896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d","Type":"ContainerDied","Data":"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.281917 4707 scope.go:117] "RemoveContainer" containerID="da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.281845 4707 generic.go:334] "Generic (PLEG): container finished" podID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerID="fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6" exitCode=143 Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.282060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d","Type":"ContainerDied","Data":"a9b66084f7ca2cb7b21a6b0053559ef8f23e5597b37c3de84e8ea7a632edf155"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.295639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerStarted","Data":"8a279eb41652f50925eae67ee02e9400a99566ac4608b2d5dd7f51dedd0ce87f"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.295671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerStarted","Data":"cd2c8c2374caa5de231cbcc5b98dfda881c4ef0036dc9bc11310e5e8c4d964bd"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.299129 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.299490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b523514b-6772-4ddc-9db6-df7a8ddff1dc","Type":"ContainerStarted","Data":"d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.299522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b523514b-6772-4ddc-9db6-df7a8ddff1dc","Type":"ContainerStarted","Data":"fb994d5739668f4a5cf65e4a1ab69785a7a0beccc96847c90ab343f6e4a25f84"} Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.337033 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.346903 4707 scope.go:117] "RemoveContainer" containerID="fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.380871 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.399876 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: E0121 16:17:02.400341 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-metadata" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.400362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-metadata" Jan 21 16:17:02 crc kubenswrapper[4707]: E0121 16:17:02.400405 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-log" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.400413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-log" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.400617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-metadata" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.400647 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" containerName="nova-metadata-log" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.401635 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.409666 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.422918 4707 scope.go:117] "RemoveContainer" containerID="da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815" Jan 21 16:17:02 crc kubenswrapper[4707]: E0121 16:17:02.426956 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815\": container with ID starting with da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815 not found: ID does not exist" containerID="da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.427008 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815"} err="failed to get container status \"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815\": rpc error: code = NotFound desc = could not find container \"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815\": container with ID starting with da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815 not found: ID does not exist" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.427031 4707 scope.go:117] "RemoveContainer" containerID="fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.438916 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.438891707 podStartE2EDuration="2.438891707s" podCreationTimestamp="2026-01-21 16:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:02.336846102 +0000 UTC m=+4519.518362324" watchObservedRunningTime="2026-01-21 16:17:02.438891707 +0000 UTC m=+4519.620407930" Jan 21 16:17:02 crc kubenswrapper[4707]: E0121 16:17:02.475614 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6\": container with ID starting with fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6 not found: ID does not exist" containerID="fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.475663 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6"} err="failed to get container status \"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6\": rpc error: code = NotFound desc = could not find container \"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6\": container with ID starting with fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6 not found: ID does not exist" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.475690 4707 scope.go:117] "RemoveContainer" containerID="da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.498125 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815"} err="failed to get container status \"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815\": rpc error: code = NotFound desc = could not find container \"da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815\": container with ID starting with da6fc6cdd0348f25ca9ce7abffd9930d9eeff36f9d0ba9770d11ca3056b4e815 not found: ID does not exist" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.498201 4707 scope.go:117] "RemoveContainer" containerID="fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.499709 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6"} err="failed to get container status \"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6\": rpc error: code = NotFound desc = could not find container \"fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6\": container with ID starting with fbfdcced052afc02177b5ff2810abb49da0ca27e475bd53bf11a20b248f243a6 not found: ID does not exist" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.628964 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.667477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-config-data\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.667963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.668014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739fbd7e-caa0-48cd-ba01-baff873fd04d-logs\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.668076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf5v6\" (UniqueName: \"kubernetes.io/projected/739fbd7e-caa0-48cd-ba01-baff873fd04d-kube-api-access-zf5v6\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.718875 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.726146 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.767064 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 
16:17:02.768725 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.772364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.773756 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.798559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.798617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739fbd7e-caa0-48cd-ba01-baff873fd04d-logs\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.798667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf5v6\" (UniqueName: \"kubernetes.io/projected/739fbd7e-caa0-48cd-ba01-baff873fd04d-kube-api-access-zf5v6\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.798852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-config-data\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.802142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739fbd7e-caa0-48cd-ba01-baff873fd04d-logs\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.802585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-config-data\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.803105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.817674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf5v6\" (UniqueName: \"kubernetes.io/projected/739fbd7e-caa0-48cd-ba01-baff873fd04d-kube-api-access-zf5v6\") pod \"nova-metadata-0\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.900696 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-logs\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.900958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxdf\" (UniqueName: \"kubernetes.io/projected/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-kube-api-access-msxdf\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.901106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-config-data\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:02 crc kubenswrapper[4707]: I0121 16:17:02.901349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.002962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxdf\" (UniqueName: \"kubernetes.io/projected/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-kube-api-access-msxdf\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.003076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-config-data\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.003236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.003279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-logs\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.003696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-logs\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.007217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-config-data\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.009570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.019636 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.020935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxdf\" (UniqueName: \"kubernetes.io/projected/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-kube-api-access-msxdf\") pod \"nova-api-0\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.106829 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.113992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.195227 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea" path="/var/lib/kubelet/pods/7fca9095-d32f-4cb7-bfcb-24d3ad16e5ea/volumes" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.196042 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d" path="/var/lib/kubelet/pods/a724d2c6-5e64-4fc1-bdfd-bbd81c6cb14d/volumes" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.196894 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8252089-7af3-4b4c-97a4-40e2cf8ce246" path="/var/lib/kubelet/pods/d8252089-7af3-4b4c-97a4-40e2cf8ce246/volumes" Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.333899 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1201e72-7dfd-4656-9a72-89fce1540f36" containerID="38f07e4dab137756d0ec3b0c9768ae2674630a8f493a3424ecef92d8bef537e0" exitCode=0 Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.333991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" event={"ID":"c1201e72-7dfd-4656-9a72-89fce1540f36","Type":"ContainerDied","Data":"38f07e4dab137756d0ec3b0c9768ae2674630a8f493a3424ecef92d8bef537e0"} Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.340107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerStarted","Data":"cdc53b0ee16f78a3a929a6f9eb00801b7d82235ca338de85dfc6908e779a7e6a"} Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.348341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6cd161af-38fc-439e-8b3f-d4549dc3708f","Type":"ContainerStarted","Data":"bd0c1af3ceb1fa7c36bbcc33de112908051c4e7e8cc5d7685057a8514317b590"} Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.610757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:03 crc kubenswrapper[4707]: W0121 16:17:03.617532 4707 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986 WatchSource:0}: Error finding container 132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986: Status 404 returned error can't find the container with id 132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986 Jan 21 16:17:03 crc kubenswrapper[4707]: I0121 16:17:03.658588 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.372648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"739fbd7e-caa0-48cd-ba01-baff873fd04d","Type":"ContainerStarted","Data":"c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.373266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"739fbd7e-caa0-48cd-ba01-baff873fd04d","Type":"ContainerStarted","Data":"deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.373283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"739fbd7e-caa0-48cd-ba01-baff873fd04d","Type":"ContainerStarted","Data":"132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.375080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c","Type":"ContainerStarted","Data":"a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.375128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c","Type":"ContainerStarted","Data":"3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.375141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c","Type":"ContainerStarted","Data":"726f897c5299a3b5116dc324a924970daa30483116a89d14b56c21e61fe74c5e"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.378363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6cd161af-38fc-439e-8b3f-d4549dc3708f","Type":"ContainerStarted","Data":"97e0171e46f5c97fd7e6a3240f7093a22ce564fe525fb140934b1ee56a46b19b"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.382312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerStarted","Data":"972ba68f057fe75a77c05a1dc89d80d0f75c0a22e0b10c25d6d9a84a49bf1484"} Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.397609 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.397595381 podStartE2EDuration="2.397595381s" podCreationTimestamp="2026-01-21 16:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:04.388643428 +0000 
UTC m=+4521.570159650" watchObservedRunningTime="2026-01-21 16:17:04.397595381 +0000 UTC m=+4521.579111602" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.413913 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.413895108 podStartE2EDuration="2.413895108s" podCreationTimestamp="2026-01-21 16:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:04.406144004 +0000 UTC m=+4521.587660226" watchObservedRunningTime="2026-01-21 16:17:04.413895108 +0000 UTC m=+4521.595411330" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.428536 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.4285215510000002 podStartE2EDuration="3.428521551s" podCreationTimestamp="2026-01-21 16:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:04.421938372 +0000 UTC m=+4521.603454594" watchObservedRunningTime="2026-01-21 16:17:04.428521551 +0000 UTC m=+4521.610037773" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.789627 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.945478 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-combined-ca-bundle\") pod \"c1201e72-7dfd-4656-9a72-89fce1540f36\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.945552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-scripts\") pod \"c1201e72-7dfd-4656-9a72-89fce1540f36\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.945644 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-config-data\") pod \"c1201e72-7dfd-4656-9a72-89fce1540f36\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.945863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcxh7\" (UniqueName: \"kubernetes.io/projected/c1201e72-7dfd-4656-9a72-89fce1540f36-kube-api-access-fcxh7\") pod \"c1201e72-7dfd-4656-9a72-89fce1540f36\" (UID: \"c1201e72-7dfd-4656-9a72-89fce1540f36\") " Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.951408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-scripts" (OuterVolumeSpecName: "scripts") pod "c1201e72-7dfd-4656-9a72-89fce1540f36" (UID: "c1201e72-7dfd-4656-9a72-89fce1540f36"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.952108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1201e72-7dfd-4656-9a72-89fce1540f36-kube-api-access-fcxh7" (OuterVolumeSpecName: "kube-api-access-fcxh7") pod "c1201e72-7dfd-4656-9a72-89fce1540f36" (UID: "c1201e72-7dfd-4656-9a72-89fce1540f36"). InnerVolumeSpecName "kube-api-access-fcxh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.974072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1201e72-7dfd-4656-9a72-89fce1540f36" (UID: "c1201e72-7dfd-4656-9a72-89fce1540f36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:04 crc kubenswrapper[4707]: I0121 16:17:04.978081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-config-data" (OuterVolumeSpecName: "config-data") pod "c1201e72-7dfd-4656-9a72-89fce1540f36" (UID: "c1201e72-7dfd-4656-9a72-89fce1540f36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.048012 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcxh7\" (UniqueName: \"kubernetes.io/projected/c1201e72-7dfd-4656-9a72-89fce1540f36-kube-api-access-fcxh7\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.048042 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.048053 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.048064 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1201e72-7dfd-4656-9a72-89fce1540f36-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.393920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" event={"ID":"c1201e72-7dfd-4656-9a72-89fce1540f36","Type":"ContainerDied","Data":"804393e15f726ee971a049ef8e53177ea7de052717b845e99254a7835cb3ce0d"} Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.393979 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="804393e15f726ee971a049ef8e53177ea7de052717b845e99254a7835cb3ce0d" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.394419 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.398241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerStarted","Data":"4a484853505c0156fe752cc950956b20f3325aa6441ef3eb24710da7767d4789"} Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.398455 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-central-agent" containerID="cri-o://8a279eb41652f50925eae67ee02e9400a99566ac4608b2d5dd7f51dedd0ce87f" gracePeriod=30 Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.398752 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.398762 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="sg-core" containerID="cri-o://972ba68f057fe75a77c05a1dc89d80d0f75c0a22e0b10c25d6d9a84a49bf1484" gracePeriod=30 Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.398839 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="proxy-httpd" containerID="cri-o://4a484853505c0156fe752cc950956b20f3325aa6441ef3eb24710da7767d4789" gracePeriod=30 Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.398935 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-notification-agent" containerID="cri-o://cdc53b0ee16f78a3a929a6f9eb00801b7d82235ca338de85dfc6908e779a7e6a" gracePeriod=30 Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.403899 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9b17afa-1293-4da3-9638-839df6d49df9" containerID="d84a5d6d4141a509df8d8a1c995f601b8feea571b5b15f5471acb0649ff8cffc" exitCode=0 Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.403972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" event={"ID":"c9b17afa-1293-4da3-9638-839df6d49df9","Type":"ContainerDied","Data":"d84a5d6d4141a509df8d8a1c995f601b8feea571b5b15f5471acb0649ff8cffc"} Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.417853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:17:05 crc kubenswrapper[4707]: E0121 16:17:05.418622 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1201e72-7dfd-4656-9a72-89fce1540f36" containerName="nova-cell1-conductor-db-sync" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.418644 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1201e72-7dfd-4656-9a72-89fce1540f36" containerName="nova-cell1-conductor-db-sync" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.418857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1201e72-7dfd-4656-9a72-89fce1540f36" containerName="nova-cell1-conductor-db-sync" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.419776 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.421580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.446620 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.455536 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.203600199 podStartE2EDuration="5.455515333s" podCreationTimestamp="2026-01-21 16:17:00 +0000 UTC" firstStartedPulling="2026-01-21 16:17:01.467720549 +0000 UTC m=+4518.649236770" lastFinishedPulling="2026-01-21 16:17:04.719635682 +0000 UTC m=+4521.901151904" observedRunningTime="2026-01-21 16:17:05.435565912 +0000 UTC m=+4522.617082134" watchObservedRunningTime="2026-01-21 16:17:05.455515333 +0000 UTC m=+4522.637031555" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.562114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.562219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjmg\" (UniqueName: \"kubernetes.io/projected/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-kube-api-access-dhjmg\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.562334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.664203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.664278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhjmg\" (UniqueName: \"kubernetes.io/projected/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-kube-api-access-dhjmg\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.664334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.668438 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.669343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.684052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjmg\" (UniqueName: \"kubernetes.io/projected/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-kube-api-access-dhjmg\") pod \"nova-cell1-conductor-0\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.774236 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:05 crc kubenswrapper[4707]: I0121 16:17:05.949915 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.434101 4707 generic.go:334] "Generic (PLEG): container finished" podID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerID="4a484853505c0156fe752cc950956b20f3325aa6441ef3eb24710da7767d4789" exitCode=0 Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.434266 4707 generic.go:334] "Generic (PLEG): container finished" podID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerID="972ba68f057fe75a77c05a1dc89d80d0f75c0a22e0b10c25d6d9a84a49bf1484" exitCode=2 Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.434275 4707 generic.go:334] "Generic (PLEG): container finished" podID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerID="cdc53b0ee16f78a3a929a6f9eb00801b7d82235ca338de85dfc6908e779a7e6a" exitCode=0 Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.434188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerDied","Data":"4a484853505c0156fe752cc950956b20f3325aa6441ef3eb24710da7767d4789"} Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.434441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerDied","Data":"972ba68f057fe75a77c05a1dc89d80d0f75c0a22e0b10c25d6d9a84a49bf1484"} Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.434455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerDied","Data":"cdc53b0ee16f78a3a929a6f9eb00801b7d82235ca338de85dfc6908e779a7e6a"} Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.759414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:17:06 crc kubenswrapper[4707]: W0121 16:17:06.773749 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17faeb8_a9ee_44c5_a9d3_e7467017e0c2.slice/crio-f48c05da66ea329433c24a894735e1ec849124341f9f4dcd29f8dff7f290b1c1 WatchSource:0}: Error finding container f48c05da66ea329433c24a894735e1ec849124341f9f4dcd29f8dff7f290b1c1: Status 404 returned error can't find the container with id f48c05da66ea329433c24a894735e1ec849124341f9f4dcd29f8dff7f290b1c1 Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.893990 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.989949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-scripts\") pod \"c9b17afa-1293-4da3-9638-839df6d49df9\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.990288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-config-data\") pod \"c9b17afa-1293-4da3-9638-839df6d49df9\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.990319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-combined-ca-bundle\") pod \"c9b17afa-1293-4da3-9638-839df6d49df9\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.991009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49lkk\" (UniqueName: \"kubernetes.io/projected/c9b17afa-1293-4da3-9638-839df6d49df9-kube-api-access-49lkk\") pod \"c9b17afa-1293-4da3-9638-839df6d49df9\" (UID: \"c9b17afa-1293-4da3-9638-839df6d49df9\") " Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.995347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b17afa-1293-4da3-9638-839df6d49df9-kube-api-access-49lkk" (OuterVolumeSpecName: "kube-api-access-49lkk") pod "c9b17afa-1293-4da3-9638-839df6d49df9" (UID: "c9b17afa-1293-4da3-9638-839df6d49df9"). InnerVolumeSpecName "kube-api-access-49lkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:06 crc kubenswrapper[4707]: I0121 16:17:06.995875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-scripts" (OuterVolumeSpecName: "scripts") pod "c9b17afa-1293-4da3-9638-839df6d49df9" (UID: "c9b17afa-1293-4da3-9638-839df6d49df9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.015746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-config-data" (OuterVolumeSpecName: "config-data") pod "c9b17afa-1293-4da3-9638-839df6d49df9" (UID: "c9b17afa-1293-4da3-9638-839df6d49df9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.018073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9b17afa-1293-4da3-9638-839df6d49df9" (UID: "c9b17afa-1293-4da3-9638-839df6d49df9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.093374 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.093415 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49lkk\" (UniqueName: \"kubernetes.io/projected/c9b17afa-1293-4da3-9638-839df6d49df9-kube-api-access-49lkk\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.093431 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.093444 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b17afa-1293-4da3-9638-839df6d49df9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.468676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2","Type":"ContainerStarted","Data":"6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469"} Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.468723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2","Type":"ContainerStarted","Data":"f48c05da66ea329433c24a894735e1ec849124341f9f4dcd29f8dff7f290b1c1"} Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.469793 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.472700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" event={"ID":"c9b17afa-1293-4da3-9638-839df6d49df9","Type":"ContainerDied","Data":"eb865e9ac4398425fb1838af239a7dacfbbd4b04ac170382b98cc01c0a4a2581"} Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.472734 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb865e9ac4398425fb1838af239a7dacfbbd4b04ac170382b98cc01c0a4a2581" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.472774 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-57llq" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.475500 4707 generic.go:334] "Generic (PLEG): container finished" podID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerID="8a279eb41652f50925eae67ee02e9400a99566ac4608b2d5dd7f51dedd0ce87f" exitCode=0 Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.475526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerDied","Data":"8a279eb41652f50925eae67ee02e9400a99566ac4608b2d5dd7f51dedd0ce87f"} Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.530364 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.5303445570000003 podStartE2EDuration="2.530344557s" podCreationTimestamp="2026-01-21 16:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:07.501126026 +0000 UTC m=+4524.682642249" watchObservedRunningTime="2026-01-21 16:17:07.530344557 +0000 UTC m=+4524.711860778" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.542559 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:07 crc kubenswrapper[4707]: E0121 16:17:07.700354 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:07 crc kubenswrapper[4707]: E0121 16:17:07.702433 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:07 crc kubenswrapper[4707]: E0121 16:17:07.704574 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:07 crc kubenswrapper[4707]: E0121 16:17:07.704647 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.708887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-scripts\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.708977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-combined-ca-bundle\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.709041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-sg-core-conf-yaml\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.709112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-config-data\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.709204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-log-httpd\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.709327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92tqq\" (UniqueName: \"kubernetes.io/projected/a660bc26-b5c3-47d0-a229-d1339c0ae215-kube-api-access-92tqq\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.709392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-run-httpd\") pod \"a660bc26-b5c3-47d0-a229-d1339c0ae215\" (UID: \"a660bc26-b5c3-47d0-a229-d1339c0ae215\") " Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.711018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.713112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.714391 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.714496 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.714989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-scripts" (OuterVolumeSpecName: "scripts") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.719072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a660bc26-b5c3-47d0-a229-d1339c0ae215-kube-api-access-92tqq" (OuterVolumeSpecName: "kube-api-access-92tqq") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "kube-api-access-92tqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.739185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.744617 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.750999 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.789000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.791386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-config-data" (OuterVolumeSpecName: "config-data") pod "a660bc26-b5c3-47d0-a229-d1339c0ae215" (UID: "a660bc26-b5c3-47d0-a229-d1339c0ae215"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.823937 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92tqq\" (UniqueName: \"kubernetes.io/projected/a660bc26-b5c3-47d0-a229-d1339c0ae215-kube-api-access-92tqq\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.824004 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.824018 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.824036 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.824048 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.824063 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a660bc26-b5c3-47d0-a229-d1339c0ae215-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:07 crc kubenswrapper[4707]: I0121 16:17:07.824073 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a660bc26-b5c3-47d0-a229-d1339c0ae215-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.115154 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.115236 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.495572 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.497210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a660bc26-b5c3-47d0-a229-d1339c0ae215","Type":"ContainerDied","Data":"cd2c8c2374caa5de231cbcc5b98dfda881c4ef0036dc9bc11310e5e8c4d964bd"} Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.497392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.497447 4707 scope.go:117] "RemoveContainer" containerID="4a484853505c0156fe752cc950956b20f3325aa6441ef3eb24710da7767d4789" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.498016 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.522991 4707 scope.go:117] "RemoveContainer" containerID="972ba68f057fe75a77c05a1dc89d80d0f75c0a22e0b10c25d6d9a84a49bf1484" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.533213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.542458 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.564523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.564867 4707 scope.go:117] "RemoveContainer" containerID="cdc53b0ee16f78a3a929a6f9eb00801b7d82235ca338de85dfc6908e779a7e6a" Jan 21 16:17:08 crc kubenswrapper[4707]: E0121 16:17:08.565022 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-central-agent" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565057 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-central-agent" Jan 21 16:17:08 crc kubenswrapper[4707]: E0121 16:17:08.565073 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b17afa-1293-4da3-9638-839df6d49df9" containerName="nova-manage" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565079 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b17afa-1293-4da3-9638-839df6d49df9" containerName="nova-manage" Jan 21 16:17:08 crc kubenswrapper[4707]: E0121 16:17:08.565095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="sg-core" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565100 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="sg-core" Jan 21 16:17:08 crc kubenswrapper[4707]: E0121 16:17:08.565118 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-notification-agent" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565124 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-notification-agent" Jan 21 16:17:08 crc kubenswrapper[4707]: E0121 16:17:08.565148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" 
containerName="proxy-httpd" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565154 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="proxy-httpd" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565354 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-central-agent" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565376 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="sg-core" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565386 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b17afa-1293-4da3-9638-839df6d49df9" containerName="nova-manage" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565397 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="proxy-httpd" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.565410 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" containerName="ceilometer-notification-agent" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.570650 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.575577 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.575739 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.575877 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.612478 4707 scope.go:117] "RemoveContainer" containerID="8a279eb41652f50925eae67ee02e9400a99566ac4608b2d5dd7f51dedd0ce87f" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.755272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.755462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7w28\" (UniqueName: \"kubernetes.io/projected/4c614425-d063-4ab8-a352-3502ac8cd8c3-kube-api-access-k7w28\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.755720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-config-data\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.756046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.756184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-scripts\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.756480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.756585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7w28\" (UniqueName: \"kubernetes.io/projected/4c614425-d063-4ab8-a352-3502ac8cd8c3-kube-api-access-k7w28\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-config-data\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc 
kubenswrapper[4707]: I0121 16:17:08.859561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-scripts\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.859774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.865229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.865935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-config-data\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.866827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-scripts\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.867796 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.889234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7w28\" (UniqueName: \"kubernetes.io/projected/4c614425-d063-4ab8-a352-3502ac8cd8c3-kube-api-access-k7w28\") pod \"ceilometer-0\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:08 crc kubenswrapper[4707]: I0121 16:17:08.899641 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:09 crc kubenswrapper[4707]: I0121 16:17:09.195952 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a660bc26-b5c3-47d0-a229-d1339c0ae215" path="/var/lib/kubelet/pods/a660bc26-b5c3-47d0-a229-d1339c0ae215/volumes" Jan 21 16:17:09 crc kubenswrapper[4707]: I0121 16:17:09.339909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:09 crc kubenswrapper[4707]: W0121 16:17:09.340568 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c614425_d063_4ab8_a352_3502ac8cd8c3.slice/crio-68f9d5d84416bb81b580af5cf9e2ed8f5128832e28cd21a649f7f6ecb21e9899 WatchSource:0}: Error finding container 68f9d5d84416bb81b580af5cf9e2ed8f5128832e28cd21a649f7f6ecb21e9899: Status 404 returned error can't find the container with id 68f9d5d84416bb81b580af5cf9e2ed8f5128832e28cd21a649f7f6ecb21e9899 Jan 21 16:17:09 crc kubenswrapper[4707]: I0121 16:17:09.512464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerStarted","Data":"68f9d5d84416bb81b580af5cf9e2ed8f5128832e28cd21a649f7f6ecb21e9899"} Jan 21 16:17:09 crc kubenswrapper[4707]: I0121 16:17:09.945778 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:17:09 crc kubenswrapper[4707]: I0121 16:17:09.946185 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:17:10 crc kubenswrapper[4707]: I0121 16:17:10.265092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:10 crc kubenswrapper[4707]: I0121 16:17:10.266245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:17:10 crc kubenswrapper[4707]: I0121 16:17:10.528251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerStarted","Data":"7bd9b3688c0b791b6bd09562e9cc13b95d473710050456e110ee5ba4860df742"} Jan 21 16:17:10 crc kubenswrapper[4707]: I0121 16:17:10.949914 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:10 crc kubenswrapper[4707]: I0121 16:17:10.963565 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.538544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerStarted","Data":"55d0cc5193f2c2d16b3550deb8e51232a5a052fac7261c623d17313148865d9b"} Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.538758 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerStarted","Data":"6cf9d360647ef53cb137039369ca082a96c8915351bf2af5ac55355a634fdab4"} Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.556591 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.642954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.643068 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.668411 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:11 crc kubenswrapper[4707]: I0121 16:17:11.678230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:12 crc kubenswrapper[4707]: I0121 16:17:12.547752 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:12 crc kubenswrapper[4707]: I0121 16:17:12.548143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:12 crc kubenswrapper[4707]: E0121 16:17:12.700918 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:12 crc kubenswrapper[4707]: E0121 16:17:12.703380 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:12 crc kubenswrapper[4707]: E0121 16:17:12.706201 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:12 crc kubenswrapper[4707]: E0121 16:17:12.706259 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:13 crc kubenswrapper[4707]: I0121 16:17:13.108060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:13 crc kubenswrapper[4707]: I0121 16:17:13.108381 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:13 crc 
kubenswrapper[4707]: I0121 16:17:13.115234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:13 crc kubenswrapper[4707]: I0121 16:17:13.115283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:13 crc kubenswrapper[4707]: I0121 16:17:13.558374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerStarted","Data":"da44097fcef9c37d7356e68f27f5b9fdc7b54b83a2c269b4169230f0a612a360"} Jan 21 16:17:13 crc kubenswrapper[4707]: I0121 16:17:13.596666 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.008555797 podStartE2EDuration="5.596645829s" podCreationTimestamp="2026-01-21 16:17:08 +0000 UTC" firstStartedPulling="2026-01-21 16:17:09.344885289 +0000 UTC m=+4526.526401512" lastFinishedPulling="2026-01-21 16:17:12.932975322 +0000 UTC m=+4530.114491544" observedRunningTime="2026-01-21 16:17:13.589866972 +0000 UTC m=+4530.771383194" watchObservedRunningTime="2026-01-21 16:17:13.596645829 +0000 UTC m=+4530.778162051" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.274089 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.0.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.274280 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.274340 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.274768 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.274987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.275047 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:17:14 crc kubenswrapper[4707]: I0121 16:17:14.573536 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:15 crc kubenswrapper[4707]: I0121 16:17:15.799655 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 
16:17:17 crc kubenswrapper[4707]: E0121 16:17:17.700804 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:17 crc kubenswrapper[4707]: E0121 16:17:17.703669 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:17 crc kubenswrapper[4707]: E0121 16:17:17.705366 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:17 crc kubenswrapper[4707]: E0121 16:17:17.705417 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:19 crc kubenswrapper[4707]: I0121 16:17:19.331664 4707 scope.go:117] "RemoveContainer" containerID="3805d4c67848713fa21d3835fea8fa9390b192a873e21f06f95e32eafd197ef0" Jan 21 16:17:19 crc kubenswrapper[4707]: I0121 16:17:19.356288 4707 scope.go:117] "RemoveContainer" containerID="f1f33074f9c4884125c41e8f10998b0cfb13e6b6903b0de59bbc6d2a374caca5" Jan 21 16:17:22 crc kubenswrapper[4707]: E0121 16:17:22.700979 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:22 crc kubenswrapper[4707]: E0121 16:17:22.703587 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:22 crc kubenswrapper[4707]: E0121 16:17:22.705236 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:22 crc kubenswrapper[4707]: E0121 16:17:22.705354 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:23 crc kubenswrapper[4707]: 
I0121 16:17:23.111028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.111111 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.111558 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.111614 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.114564 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.114624 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.116583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.117335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.118176 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:23 crc kubenswrapper[4707]: I0121 16:17:23.667574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:24 crc kubenswrapper[4707]: I0121 16:17:24.716081 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:24 crc kubenswrapper[4707]: I0121 16:17:24.716424 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-central-agent" containerID="cri-o://7bd9b3688c0b791b6bd09562e9cc13b95d473710050456e110ee5ba4860df742" gracePeriod=30 Jan 21 16:17:24 crc kubenswrapper[4707]: I0121 16:17:24.716477 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="sg-core" containerID="cri-o://55d0cc5193f2c2d16b3550deb8e51232a5a052fac7261c623d17313148865d9b" gracePeriod=30 Jan 21 16:17:24 crc kubenswrapper[4707]: I0121 16:17:24.716519 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-notification-agent" containerID="cri-o://6cf9d360647ef53cb137039369ca082a96c8915351bf2af5ac55355a634fdab4" gracePeriod=30 Jan 21 16:17:24 crc kubenswrapper[4707]: I0121 16:17:24.716547 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="proxy-httpd" containerID="cri-o://da44097fcef9c37d7356e68f27f5b9fdc7b54b83a2c269b4169230f0a612a360" gracePeriod=30 Jan 21 16:17:24 crc kubenswrapper[4707]: I0121 16:17:24.726972 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" 
containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.194:3000/\": EOF" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.686422 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerID="da44097fcef9c37d7356e68f27f5b9fdc7b54b83a2c269b4169230f0a612a360" exitCode=0 Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.686927 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerID="55d0cc5193f2c2d16b3550deb8e51232a5a052fac7261c623d17313148865d9b" exitCode=2 Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.686942 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerID="6cf9d360647ef53cb137039369ca082a96c8915351bf2af5ac55355a634fdab4" exitCode=0 Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.686953 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerID="7bd9b3688c0b791b6bd09562e9cc13b95d473710050456e110ee5ba4860df742" exitCode=0 Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.686509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerDied","Data":"da44097fcef9c37d7356e68f27f5b9fdc7b54b83a2c269b4169230f0a612a360"} Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.687043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerDied","Data":"55d0cc5193f2c2d16b3550deb8e51232a5a052fac7261c623d17313148865d9b"} Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.687061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerDied","Data":"6cf9d360647ef53cb137039369ca082a96c8915351bf2af5ac55355a634fdab4"} Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.687072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerDied","Data":"7bd9b3688c0b791b6bd09562e9cc13b95d473710050456e110ee5ba4860df742"} Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.687081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c614425-d063-4ab8-a352-3502ac8cd8c3","Type":"ContainerDied","Data":"68f9d5d84416bb81b580af5cf9e2ed8f5128832e28cd21a649f7f6ecb21e9899"} Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.687096 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f9d5d84416bb81b580af5cf9e2ed8f5128832e28cd21a649f7f6ecb21e9899" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.706224 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.769618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-log-httpd\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.769746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-scripts\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.769778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-sg-core-conf-yaml\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.769830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-config-data\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.769865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-combined-ca-bundle\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.769892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-run-httpd\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.770051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7w28\" (UniqueName: \"kubernetes.io/projected/4c614425-d063-4ab8-a352-3502ac8cd8c3-kube-api-access-k7w28\") pod \"4c614425-d063-4ab8-a352-3502ac8cd8c3\" (UID: \"4c614425-d063-4ab8-a352-3502ac8cd8c3\") " Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.771499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.771563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.778740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-scripts" (OuterVolumeSpecName: "scripts") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.779227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c614425-d063-4ab8-a352-3502ac8cd8c3-kube-api-access-k7w28" (OuterVolumeSpecName: "kube-api-access-k7w28") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "kube-api-access-k7w28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.799460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.843851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.856610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-config-data" (OuterVolumeSpecName: "config-data") pod "4c614425-d063-4ab8-a352-3502ac8cd8c3" (UID: "4c614425-d063-4ab8-a352-3502ac8cd8c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871897 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7w28\" (UniqueName: \"kubernetes.io/projected/4c614425-d063-4ab8-a352-3502ac8cd8c3-kube-api-access-k7w28\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871927 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871939 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871951 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871962 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871971 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c614425-d063-4ab8-a352-3502ac8cd8c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:25 crc kubenswrapper[4707]: I0121 16:17:25.871979 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c614425-d063-4ab8-a352-3502ac8cd8c3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.694016 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.722770 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.730601 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.740941 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:26 crc kubenswrapper[4707]: E0121 16:17:26.742989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-notification-agent" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.743013 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-notification-agent" Jan 21 16:17:26 crc kubenswrapper[4707]: E0121 16:17:26.743026 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="proxy-httpd" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.743033 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="proxy-httpd" Jan 21 16:17:26 crc kubenswrapper[4707]: E0121 16:17:26.743045 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-central-agent" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.743052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-central-agent" Jan 21 16:17:26 crc kubenswrapper[4707]: E0121 16:17:26.743073 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="sg-core" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.743079 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="sg-core" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.744214 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-central-agent" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.744243 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="sg-core" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.744256 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="ceilometer-notification-agent" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.744266 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" containerName="proxy-httpd" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.745798 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.748578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.764455 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.775886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.891378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.891896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-scripts\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.891975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-config-data\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.892017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vfx\" (UniqueName: \"kubernetes.io/projected/b223c18b-f22c-43eb-be75-70b22769154f-kube-api-access-r6vfx\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.892097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.892127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-run-httpd\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.892232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-log-httpd\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.994461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.994851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-scripts\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.994996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-config-data\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.995111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vfx\" (UniqueName: \"kubernetes.io/projected/b223c18b-f22c-43eb-be75-70b22769154f-kube-api-access-r6vfx\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.995233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.995314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-run-httpd\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.995401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-log-httpd\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.995889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-run-httpd\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:26 crc kubenswrapper[4707]: I0121 16:17:26.995916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-log-httpd\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.193942 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c614425-d063-4ab8-a352-3502ac8cd8c3" path="/var/lib/kubelet/pods/4c614425-d063-4ab8-a352-3502ac8cd8c3/volumes" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.349006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.349017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-scripts\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.349365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-config-data\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.350914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vfx\" (UniqueName: \"kubernetes.io/projected/b223c18b-f22c-43eb-be75-70b22769154f-kube-api-access-r6vfx\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.356581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.364400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:27 crc kubenswrapper[4707]: E0121 16:17:27.700717 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:27 crc kubenswrapper[4707]: E0121 16:17:27.702620 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:27 crc kubenswrapper[4707]: E0121 16:17:27.704758 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:17:27 crc kubenswrapper[4707]: E0121 16:17:27.704824 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:27 crc kubenswrapper[4707]: I0121 16:17:27.778613 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:27 crc kubenswrapper[4707]: W0121 16:17:27.782478 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb223c18b_f22c_43eb_be75_70b22769154f.slice/crio-60168759174ac6d5ea0b4b371b46bfce76df885475aff8095642251ebc6a54ff WatchSource:0}: Error finding container 60168759174ac6d5ea0b4b371b46bfce76df885475aff8095642251ebc6a54ff: Status 404 returned error can't find the container with id 60168759174ac6d5ea0b4b371b46bfce76df885475aff8095642251ebc6a54ff Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.541840 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.713173 4707 generic.go:334] "Generic (PLEG): container finished" podID="e7217542-f073-4e86-99e0-7b94c00199f1" containerID="f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6" exitCode=137 Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.713240 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.713273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e7217542-f073-4e86-99e0-7b94c00199f1","Type":"ContainerDied","Data":"f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6"} Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.713357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e7217542-f073-4e86-99e0-7b94c00199f1","Type":"ContainerDied","Data":"956e9d766cde0081bc8352a62eccda34e5fc064b98b9c9804cb3abc1ddb94c21"} Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.713402 4707 scope.go:117] "RemoveContainer" containerID="f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.714975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerStarted","Data":"e8501037e0d9e56601120e2025eb13965509d978b9b0e1b9984337d4eed17472"} Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.715016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerStarted","Data":"60168759174ac6d5ea0b4b371b46bfce76df885475aff8095642251ebc6a54ff"} Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.732654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-config-data\") pod \"e7217542-f073-4e86-99e0-7b94c00199f1\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.732748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-combined-ca-bundle\") pod \"e7217542-f073-4e86-99e0-7b94c00199f1\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.732878 4707 scope.go:117] "RemoveContainer" containerID="f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.732897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glfv8\" (UniqueName: 
\"kubernetes.io/projected/e7217542-f073-4e86-99e0-7b94c00199f1-kube-api-access-glfv8\") pod \"e7217542-f073-4e86-99e0-7b94c00199f1\" (UID: \"e7217542-f073-4e86-99e0-7b94c00199f1\") " Jan 21 16:17:28 crc kubenswrapper[4707]: E0121 16:17:28.734563 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6\": container with ID starting with f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6 not found: ID does not exist" containerID="f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.734778 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6"} err="failed to get container status \"f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6\": rpc error: code = NotFound desc = could not find container \"f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6\": container with ID starting with f1dd6ad2f44e005447ebaefec1930561923ea33f497a3f4ab13c204a8818ffb6 not found: ID does not exist" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.750004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7217542-f073-4e86-99e0-7b94c00199f1-kube-api-access-glfv8" (OuterVolumeSpecName: "kube-api-access-glfv8") pod "e7217542-f073-4e86-99e0-7b94c00199f1" (UID: "e7217542-f073-4e86-99e0-7b94c00199f1"). InnerVolumeSpecName "kube-api-access-glfv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.764485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-config-data" (OuterVolumeSpecName: "config-data") pod "e7217542-f073-4e86-99e0-7b94c00199f1" (UID: "e7217542-f073-4e86-99e0-7b94c00199f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.765393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7217542-f073-4e86-99e0-7b94c00199f1" (UID: "e7217542-f073-4e86-99e0-7b94c00199f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.836395 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.836426 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7217542-f073-4e86-99e0-7b94c00199f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:28 crc kubenswrapper[4707]: I0121 16:17:28.836444 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glfv8\" (UniqueName: \"kubernetes.io/projected/e7217542-f073-4e86-99e0-7b94c00199f1-kube-api-access-glfv8\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.044367 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.052374 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.074861 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:29 crc kubenswrapper[4707]: E0121 16:17:29.075412 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7217542-f073-4e86-99e0-7b94c00199f1" containerName="nova-scheduler-scheduler" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.075434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7217542-f073-4e86-99e0-7b94c00199f1" containerName="nova-scheduler-scheduler" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.075643 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7217542-f073-4e86-99e0-7b94c00199f1" containerName="nova-scheduler-scheduler" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.076369 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.078258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.098889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.144747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.144869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxjm\" (UniqueName: \"kubernetes.io/projected/fa2c575b-9647-4363-9cde-79e4aac4e25d-kube-api-access-9sxjm\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.144948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-config-data\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.196761 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7217542-f073-4e86-99e0-7b94c00199f1" path="/var/lib/kubelet/pods/e7217542-f073-4e86-99e0-7b94c00199f1/volumes" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.247059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.247182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxjm\" (UniqueName: \"kubernetes.io/projected/fa2c575b-9647-4363-9cde-79e4aac4e25d-kube-api-access-9sxjm\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.247263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-config-data\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.250406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-config-data\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.250832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.261949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxjm\" (UniqueName: \"kubernetes.io/projected/fa2c575b-9647-4363-9cde-79e4aac4e25d-kube-api-access-9sxjm\") pod \"nova-scheduler-0\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.396520 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.727481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerStarted","Data":"96cea6f17e082b00611e932baca2a541cb2c735a64dd28a957779e7b222ab73a"} Jan 21 16:17:29 crc kubenswrapper[4707]: I0121 16:17:29.841076 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:29 crc kubenswrapper[4707]: W0121 16:17:29.853345 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa2c575b_9647_4363_9cde_79e4aac4e25d.slice/crio-0fabdc0afb457bdd2270e710d1ec056e218be72920e473202491b80876d0cdd1 WatchSource:0}: Error finding container 0fabdc0afb457bdd2270e710d1ec056e218be72920e473202491b80876d0cdd1: Status 404 returned error can't find the container with id 0fabdc0afb457bdd2270e710d1ec056e218be72920e473202491b80876d0cdd1 Jan 21 16:17:30 crc kubenswrapper[4707]: I0121 16:17:30.738644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerStarted","Data":"2f8e595a8f18c8666a5028e8db6093e07511a0d63fb185918543bf831373f5f3"} Jan 21 16:17:30 crc kubenswrapper[4707]: I0121 16:17:30.740417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fa2c575b-9647-4363-9cde-79e4aac4e25d","Type":"ContainerStarted","Data":"2ecd68c84a528f6383f1f9ce5d24f2c2c91469d2e3f8624746503b05307ecb2b"} Jan 21 16:17:30 crc kubenswrapper[4707]: I0121 16:17:30.740440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fa2c575b-9647-4363-9cde-79e4aac4e25d","Type":"ContainerStarted","Data":"0fabdc0afb457bdd2270e710d1ec056e218be72920e473202491b80876d0cdd1"} Jan 21 16:17:30 crc kubenswrapper[4707]: I0121 16:17:30.759043 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.759024439 podStartE2EDuration="1.759024439s" podCreationTimestamp="2026-01-21 16:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:30.753587096 +0000 UTC m=+4547.935103307" watchObservedRunningTime="2026-01-21 16:17:30.759024439 +0000 UTC m=+4547.940540661" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.623089 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.752011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerStarted","Data":"c68bb51f5c8ff11631cf44b3469ccbb44de1692e0875e3a7fbf38f4118fa4ea5"} Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.752181 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.754004 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.753967 4707 generic.go:334] "Generic (PLEG): container finished" podID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" exitCode=137 Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.754267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3","Type":"ContainerDied","Data":"6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e"} Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.754392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3","Type":"ContainerDied","Data":"b32176cc9111901d94ff9d59ba9a5095d770d3610158e6ca39652624c20a15da"} Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.754446 4707 scope.go:117] "RemoveContainer" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.781243 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.548004939 podStartE2EDuration="5.78122025s" podCreationTimestamp="2026-01-21 16:17:26 +0000 UTC" firstStartedPulling="2026-01-21 16:17:27.784898081 +0000 UTC m=+4544.966414304" lastFinishedPulling="2026-01-21 16:17:31.018113394 +0000 UTC m=+4548.199629615" observedRunningTime="2026-01-21 16:17:31.77111568 +0000 UTC m=+4548.952631902" watchObservedRunningTime="2026-01-21 16:17:31.78122025 +0000 UTC m=+4548.962736472" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.783546 4707 scope.go:117] "RemoveContainer" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" Jan 21 16:17:31 crc kubenswrapper[4707]: E0121 16:17:31.784061 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e\": container with ID starting with 6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e not found: ID does not exist" containerID="6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.784193 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e"} err="failed to get container status \"6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e\": rpc error: code = NotFound desc = could not find container 
\"6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e\": container with ID starting with 6ea4b72c3e123cb9d30e9f46b303eb3048ab8908b714e17d4d33cadd43517e3e not found: ID does not exist" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.809829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9467\" (UniqueName: \"kubernetes.io/projected/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-kube-api-access-m9467\") pod \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.810346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle\") pod \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.810552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-config-data\") pod \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.815374 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-kube-api-access-m9467" (OuterVolumeSpecName: "kube-api-access-m9467") pod "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" (UID: "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3"). InnerVolumeSpecName "kube-api-access-m9467". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:31 crc kubenswrapper[4707]: E0121 16:17:31.837333 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle podName:a8a27bd6-271d-4e78-aa4f-40ff0ee885d3 nodeName:}" failed. No retries permitted until 2026-01-21 16:17:32.337284017 +0000 UTC m=+4549.518800239 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle") pod "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" (UID: "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3") : error deleting /var/lib/kubelet/pods/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3/volume-subpaths: remove /var/lib/kubelet/pods/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3/volume-subpaths: no such file or directory Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.840037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-config-data" (OuterVolumeSpecName: "config-data") pod "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" (UID: "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.915119 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:31 crc kubenswrapper[4707]: I0121 16:17:31.915169 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9467\" (UniqueName: \"kubernetes.io/projected/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-kube-api-access-m9467\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.422208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle\") pod \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\" (UID: \"a8a27bd6-271d-4e78-aa4f-40ff0ee885d3\") " Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.425159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" (UID: "a8a27bd6-271d-4e78-aa4f-40ff0ee885d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.524932 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.695965 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.714638 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.722881 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:17:32 crc kubenswrapper[4707]: E0121 16:17:32.723652 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.723722 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.724058 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" containerName="nova-cell0-conductor-conductor" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.725068 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.728570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.730352 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.831871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.831950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pscn4\" (UniqueName: \"kubernetes.io/projected/89eecc6d-db4a-4705-8ca5-cfdff9119615-kube-api-access-pscn4\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.832407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.934560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.934678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.934728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pscn4\" (UniqueName: \"kubernetes.io/projected/89eecc6d-db4a-4705-8ca5-cfdff9119615-kube-api-access-pscn4\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.938651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.945899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:32 crc kubenswrapper[4707]: I0121 16:17:32.965215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pscn4\" (UniqueName: \"kubernetes.io/projected/89eecc6d-db4a-4705-8ca5-cfdff9119615-kube-api-access-pscn4\") pod \"nova-cell0-conductor-0\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.052675 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.193364 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a27bd6-271d-4e78-aa4f-40ff0ee885d3" path="/var/lib/kubelet/pods/a8a27bd6-271d-4e78-aa4f-40ff0ee885d3/volumes" Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.447393 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:17:33 crc kubenswrapper[4707]: W0121 16:17:33.452686 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89eecc6d_db4a_4705_8ca5_cfdff9119615.slice/crio-88d667de9ee20a089cbfa9bd66599fd99091aa6848a057d74ebe97a44084cd72 WatchSource:0}: Error finding container 88d667de9ee20a089cbfa9bd66599fd99091aa6848a057d74ebe97a44084cd72: Status 404 returned error can't find the container with id 88d667de9ee20a089cbfa9bd66599fd99091aa6848a057d74ebe97a44084cd72 Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.780908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"89eecc6d-db4a-4705-8ca5-cfdff9119615","Type":"ContainerStarted","Data":"86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4"} Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.780963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"89eecc6d-db4a-4705-8ca5-cfdff9119615","Type":"ContainerStarted","Data":"88d667de9ee20a089cbfa9bd66599fd99091aa6848a057d74ebe97a44084cd72"} Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.781887 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:33 crc kubenswrapper[4707]: I0121 16:17:33.798210 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.798195961 podStartE2EDuration="1.798195961s" podCreationTimestamp="2026-01-21 16:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:33.793528064 +0000 UTC m=+4550.975044287" watchObservedRunningTime="2026-01-21 16:17:33.798195961 +0000 UTC m=+4550.979712183" Jan 21 16:17:34 crc kubenswrapper[4707]: I0121 16:17:34.397342 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:38 crc kubenswrapper[4707]: I0121 16:17:38.080304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.006694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:39 crc 
kubenswrapper[4707]: I0121 16:17:39.007306 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="fa2c575b-9647-4363-9cde-79e4aac4e25d" containerName="nova-scheduler-scheduler" containerID="cri-o://2ecd68c84a528f6383f1f9ce5d24f2c2c91469d2e3f8624746503b05307ecb2b" gracePeriod=30 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.016121 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.016389 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="b523514b-6772-4ddc-9db6-df7a8ddff1dc" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1" gracePeriod=30 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.023122 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.023545 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-api" containerID="cri-o://a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd" gracePeriod=30 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.023681 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-log" containerID="cri-o://3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a" gracePeriod=30 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.112846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.113446 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-log" containerID="cri-o://deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d" gracePeriod=30 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.113542 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-metadata" containerID="cri-o://c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57" gracePeriod=30 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.395970 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp"] Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.397154 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.398822 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.400175 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.404382 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp"] Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.562145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-scripts\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.562428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjdd8\" (UniqueName: \"kubernetes.io/projected/b5c33702-68b3-4723-8d09-5548858db57c-kube-api-access-kjdd8\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.562481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.562627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-config-data\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.664408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjdd8\" (UniqueName: \"kubernetes.io/projected/b5c33702-68b3-4723-8d09-5548858db57c-kube-api-access-kjdd8\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.664460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.664502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-config-data\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc 
kubenswrapper[4707]: I0121 16:17:39.664584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-scripts\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.670712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-config-data\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.670741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-scripts\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.672473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.682754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjdd8\" (UniqueName: \"kubernetes.io/projected/b5c33702-68b3-4723-8d09-5548858db57c-kube-api-access-kjdd8\") pod \"nova-cell1-cell-mapping-vqlcp\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.739775 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.762139 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.841461 4707 generic.go:334] "Generic (PLEG): container finished" podID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerID="deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d" exitCode=143 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.841547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"739fbd7e-caa0-48cd-ba01-baff873fd04d","Type":"ContainerDied","Data":"deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d"} Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.859043 4707 generic.go:334] "Generic (PLEG): container finished" podID="b523514b-6772-4ddc-9db6-df7a8ddff1dc" containerID="d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1" exitCode=0 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.859104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b523514b-6772-4ddc-9db6-df7a8ddff1dc","Type":"ContainerDied","Data":"d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1"} Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.859126 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b523514b-6772-4ddc-9db6-df7a8ddff1dc","Type":"ContainerDied","Data":"fb994d5739668f4a5cf65e4a1ab69785a7a0beccc96847c90ab343f6e4a25f84"} Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.859142 4707 scope.go:117] "RemoveContainer" containerID="d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.859172 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.865265 4707 generic.go:334] "Generic (PLEG): container finished" podID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerID="3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a" exitCode=143 Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.865319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c","Type":"ContainerDied","Data":"3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a"} Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.867180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrwsj\" (UniqueName: \"kubernetes.io/projected/b523514b-6772-4ddc-9db6-df7a8ddff1dc-kube-api-access-xrwsj\") pod \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.867391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-config-data\") pod \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.867651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-combined-ca-bundle\") pod \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\" (UID: \"b523514b-6772-4ddc-9db6-df7a8ddff1dc\") " Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.874155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b523514b-6772-4ddc-9db6-df7a8ddff1dc-kube-api-access-xrwsj" (OuterVolumeSpecName: "kube-api-access-xrwsj") pod "b523514b-6772-4ddc-9db6-df7a8ddff1dc" (UID: "b523514b-6772-4ddc-9db6-df7a8ddff1dc"). InnerVolumeSpecName "kube-api-access-xrwsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.895380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-config-data" (OuterVolumeSpecName: "config-data") pod "b523514b-6772-4ddc-9db6-df7a8ddff1dc" (UID: "b523514b-6772-4ddc-9db6-df7a8ddff1dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.909374 4707 scope.go:117] "RemoveContainer" containerID="d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1" Jan 21 16:17:39 crc kubenswrapper[4707]: E0121 16:17:39.909826 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1\": container with ID starting with d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1 not found: ID does not exist" containerID="d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.909861 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1"} err="failed to get container status \"d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1\": rpc error: code = NotFound desc = could not find container \"d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1\": container with ID starting with d09d9dbf9bac2022036de9ea7a9483916df87247d4b2d99b4e9bd4495c1be1b1 not found: ID does not exist" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.911989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b523514b-6772-4ddc-9db6-df7a8ddff1dc" (UID: "b523514b-6772-4ddc-9db6-df7a8ddff1dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.946086 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.946149 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.946211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.946982 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.947039 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" gracePeriod=600 Jan 21 16:17:39 crc 
kubenswrapper[4707]: I0121 16:17:39.969284 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrwsj\" (UniqueName: \"kubernetes.io/projected/b523514b-6772-4ddc-9db6-df7a8ddff1dc-kube-api-access-xrwsj\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.969315 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:39 crc kubenswrapper[4707]: I0121 16:17:39.969326 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b523514b-6772-4ddc-9db6-df7a8ddff1dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:40 crc kubenswrapper[4707]: E0121 16:17:40.070370 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.191909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.204016 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.218377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp"] Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.228893 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:40 crc kubenswrapper[4707]: E0121 16:17:40.229344 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b523514b-6772-4ddc-9db6-df7a8ddff1dc" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.229364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b523514b-6772-4ddc-9db6-df7a8ddff1dc" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.229555 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b523514b-6772-4ddc-9db6-df7a8ddff1dc" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.238149 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.241303 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.241519 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.241638 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.249370 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.375468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.375522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.375613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.375723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xf6h\" (UniqueName: \"kubernetes.io/projected/c1dee117-e591-43b9-b94a-677ff9fae495-kube-api-access-6xf6h\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.375959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.477265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.477380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.477411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.477502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.477539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xf6h\" (UniqueName: \"kubernetes.io/projected/c1dee117-e591-43b9-b94a-677ff9fae495-kube-api-access-6xf6h\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.482437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.482456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.483313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.483459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.494594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xf6h\" (UniqueName: \"kubernetes.io/projected/c1dee117-e591-43b9-b94a-677ff9fae495-kube-api-access-6xf6h\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.637356 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.799653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.799966 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-central-agent" containerID="cri-o://e8501037e0d9e56601120e2025eb13965509d978b9b0e1b9984337d4eed17472" gracePeriod=30 Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.800423 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="proxy-httpd" containerID="cri-o://c68bb51f5c8ff11631cf44b3469ccbb44de1692e0875e3a7fbf38f4118fa4ea5" gracePeriod=30 Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.800498 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="sg-core" containerID="cri-o://2f8e595a8f18c8666a5028e8db6093e07511a0d63fb185918543bf831373f5f3" gracePeriod=30 Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.800539 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-notification-agent" containerID="cri-o://96cea6f17e082b00611e932baca2a541cb2c735a64dd28a957779e7b222ab73a" gracePeriod=30 Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.876797 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa2c575b-9647-4363-9cde-79e4aac4e25d" containerID="2ecd68c84a528f6383f1f9ce5d24f2c2c91469d2e3f8624746503b05307ecb2b" exitCode=0 Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.876861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fa2c575b-9647-4363-9cde-79e4aac4e25d","Type":"ContainerDied","Data":"2ecd68c84a528f6383f1f9ce5d24f2c2c91469d2e3f8624746503b05307ecb2b"} Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.879451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" event={"ID":"b5c33702-68b3-4723-8d09-5548858db57c","Type":"ContainerStarted","Data":"bba48301a1d812e5b1d562f60f016bd4942fac35d4db8dd307d57b200b5ae46e"} Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.879480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" event={"ID":"b5c33702-68b3-4723-8d09-5548858db57c","Type":"ContainerStarted","Data":"d32f1b94896dab038674b4272ae004abbeeab8a7c2fbd7b6fcefbbc201ce7e18"} Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.885320 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" exitCode=0 Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.885439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466"} Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 
16:17:40.885475 4707 scope.go:117] "RemoveContainer" containerID="973e8f3f79963508f548e1f1508104218595db51536e1edaf4cd1d819983387a" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.886236 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:17:40 crc kubenswrapper[4707]: E0121 16:17:40.886750 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:17:40 crc kubenswrapper[4707]: I0121 16:17:40.897628 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" podStartSLOduration=1.897615753 podStartE2EDuration="1.897615753s" podCreationTimestamp="2026-01-21 16:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:40.895993803 +0000 UTC m=+4558.077510025" watchObservedRunningTime="2026-01-21 16:17:40.897615753 +0000 UTC m=+4558.079131976" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.070757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:17:41 crc kubenswrapper[4707]: W0121 16:17:41.070992 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1dee117_e591_43b9_b94a_677ff9fae495.slice/crio-c4477c5e1007832575618f98de7c8a262201d19fee905b9ef6faa23eb3c644cd WatchSource:0}: Error finding container c4477c5e1007832575618f98de7c8a262201d19fee905b9ef6faa23eb3c644cd: Status 404 returned error can't find the container with id c4477c5e1007832575618f98de7c8a262201d19fee905b9ef6faa23eb3c644cd Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.117899 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.199993 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b523514b-6772-4ddc-9db6-df7a8ddff1dc" path="/var/lib/kubelet/pods/b523514b-6772-4ddc-9db6-df7a8ddff1dc/volumes" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.301854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-config-data\") pod \"fa2c575b-9647-4363-9cde-79e4aac4e25d\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.301897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-combined-ca-bundle\") pod \"fa2c575b-9647-4363-9cde-79e4aac4e25d\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.301984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxjm\" (UniqueName: \"kubernetes.io/projected/fa2c575b-9647-4363-9cde-79e4aac4e25d-kube-api-access-9sxjm\") pod \"fa2c575b-9647-4363-9cde-79e4aac4e25d\" (UID: \"fa2c575b-9647-4363-9cde-79e4aac4e25d\") " Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.311174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2c575b-9647-4363-9cde-79e4aac4e25d-kube-api-access-9sxjm" (OuterVolumeSpecName: "kube-api-access-9sxjm") pod "fa2c575b-9647-4363-9cde-79e4aac4e25d" (UID: "fa2c575b-9647-4363-9cde-79e4aac4e25d"). InnerVolumeSpecName "kube-api-access-9sxjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.326071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-config-data" (OuterVolumeSpecName: "config-data") pod "fa2c575b-9647-4363-9cde-79e4aac4e25d" (UID: "fa2c575b-9647-4363-9cde-79e4aac4e25d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.330190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa2c575b-9647-4363-9cde-79e4aac4e25d" (UID: "fa2c575b-9647-4363-9cde-79e4aac4e25d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.404322 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.404353 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2c575b-9647-4363-9cde-79e4aac4e25d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.404363 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxjm\" (UniqueName: \"kubernetes.io/projected/fa2c575b-9647-4363-9cde-79e4aac4e25d-kube-api-access-9sxjm\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.894184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c1dee117-e591-43b9-b94a-677ff9fae495","Type":"ContainerStarted","Data":"9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b"} Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.894232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c1dee117-e591-43b9-b94a-677ff9fae495","Type":"ContainerStarted","Data":"c4477c5e1007832575618f98de7c8a262201d19fee905b9ef6faa23eb3c644cd"} Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.903422 4707 generic.go:334] "Generic (PLEG): container finished" podID="b223c18b-f22c-43eb-be75-70b22769154f" containerID="c68bb51f5c8ff11631cf44b3469ccbb44de1692e0875e3a7fbf38f4118fa4ea5" exitCode=0 Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.903451 4707 generic.go:334] "Generic (PLEG): container finished" podID="b223c18b-f22c-43eb-be75-70b22769154f" containerID="2f8e595a8f18c8666a5028e8db6093e07511a0d63fb185918543bf831373f5f3" exitCode=2 Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.903458 4707 generic.go:334] "Generic (PLEG): container finished" podID="b223c18b-f22c-43eb-be75-70b22769154f" containerID="e8501037e0d9e56601120e2025eb13965509d978b9b0e1b9984337d4eed17472" exitCode=0 Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.903508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerDied","Data":"c68bb51f5c8ff11631cf44b3469ccbb44de1692e0875e3a7fbf38f4118fa4ea5"} Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.903578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerDied","Data":"2f8e595a8f18c8666a5028e8db6093e07511a0d63fb185918543bf831373f5f3"} Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.903590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerDied","Data":"e8501037e0d9e56601120e2025eb13965509d978b9b0e1b9984337d4eed17472"} Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.905766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fa2c575b-9647-4363-9cde-79e4aac4e25d","Type":"ContainerDied","Data":"0fabdc0afb457bdd2270e710d1ec056e218be72920e473202491b80876d0cdd1"} Jan 21 16:17:41 crc 
kubenswrapper[4707]: I0121 16:17:41.905798 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.905834 4707 scope.go:117] "RemoveContainer" containerID="2ecd68c84a528f6383f1f9ce5d24f2c2c91469d2e3f8624746503b05307ecb2b" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.917668 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.9176546079999999 podStartE2EDuration="1.917654608s" podCreationTimestamp="2026-01-21 16:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:41.909475348 +0000 UTC m=+4559.090991570" watchObservedRunningTime="2026-01-21 16:17:41.917654608 +0000 UTC m=+4559.099170830" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.941099 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.951140 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.970089 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:41 crc kubenswrapper[4707]: E0121 16:17:41.970826 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2c575b-9647-4363-9cde-79e4aac4e25d" containerName="nova-scheduler-scheduler" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.970847 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2c575b-9647-4363-9cde-79e4aac4e25d" containerName="nova-scheduler-scheduler" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.971114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2c575b-9647-4363-9cde-79e4aac4e25d" containerName="nova-scheduler-scheduler" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.972101 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.975178 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:17:41 crc kubenswrapper[4707]: I0121 16:17:41.981740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.115864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-config-data\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.116247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztggs\" (UniqueName: \"kubernetes.io/projected/2d63bace-6bb5-4e18-af27-7c173e0fa410-kube-api-access-ztggs\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.116418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.217863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-config-data\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.217931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztggs\" (UniqueName: \"kubernetes.io/projected/2d63bace-6bb5-4e18-af27-7c173e0fa410-kube-api-access-ztggs\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.218046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.222314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-config-data\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.226829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.232851 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztggs\" (UniqueName: \"kubernetes.io/projected/2d63bace-6bb5-4e18-af27-7c173e0fa410-kube-api-access-ztggs\") pod \"nova-scheduler-0\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.304579 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.581498 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.645784 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.728420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-config-data\") pod \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.728546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-logs\") pod \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.728598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-combined-ca-bundle\") pod \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.728704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxdf\" (UniqueName: \"kubernetes.io/projected/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-kube-api-access-msxdf\") pod \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\" (UID: \"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.729417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-logs" (OuterVolumeSpecName: "logs") pod "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" (UID: "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.729799 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.734759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-kube-api-access-msxdf" (OuterVolumeSpecName: "kube-api-access-msxdf") pod "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" (UID: "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c"). InnerVolumeSpecName "kube-api-access-msxdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.760906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-config-data" (OuterVolumeSpecName: "config-data") pod "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" (UID: "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.765527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" (UID: "6517b4eb-60c1-4d39-a5ed-1940dc0ef28c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle\") pod \"739fbd7e-caa0-48cd-ba01-baff873fd04d\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739fbd7e-caa0-48cd-ba01-baff873fd04d-logs\") pod \"739fbd7e-caa0-48cd-ba01-baff873fd04d\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-config-data\") pod \"739fbd7e-caa0-48cd-ba01-baff873fd04d\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831330 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf5v6\" (UniqueName: \"kubernetes.io/projected/739fbd7e-caa0-48cd-ba01-baff873fd04d-kube-api-access-zf5v6\") pod \"739fbd7e-caa0-48cd-ba01-baff873fd04d\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831779 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831798 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxdf\" (UniqueName: \"kubernetes.io/projected/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-kube-api-access-msxdf\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.831824 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.832004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739fbd7e-caa0-48cd-ba01-baff873fd04d-logs" (OuterVolumeSpecName: "logs") pod "739fbd7e-caa0-48cd-ba01-baff873fd04d" (UID: "739fbd7e-caa0-48cd-ba01-baff873fd04d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.837646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739fbd7e-caa0-48cd-ba01-baff873fd04d-kube-api-access-zf5v6" (OuterVolumeSpecName: "kube-api-access-zf5v6") pod "739fbd7e-caa0-48cd-ba01-baff873fd04d" (UID: "739fbd7e-caa0-48cd-ba01-baff873fd04d"). InnerVolumeSpecName "kube-api-access-zf5v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.847231 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:42 crc kubenswrapper[4707]: W0121 16:17:42.851550 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d63bace_6bb5_4e18_af27_7c173e0fa410.slice/crio-f15a8cd2542e7c28c95267a2f66a0c5a8b5ff06a2bc26c1e02b0c10927269bf8 WatchSource:0}: Error finding container f15a8cd2542e7c28c95267a2f66a0c5a8b5ff06a2bc26c1e02b0c10927269bf8: Status 404 returned error can't find the container with id f15a8cd2542e7c28c95267a2f66a0c5a8b5ff06a2bc26c1e02b0c10927269bf8 Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.855225 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle podName:739fbd7e-caa0-48cd-ba01-baff873fd04d nodeName:}" failed. No retries permitted until 2026-01-21 16:17:43.355197056 +0000 UTC m=+4560.536713277 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle") pod "739fbd7e-caa0-48cd-ba01-baff873fd04d" (UID: "739fbd7e-caa0-48cd-ba01-baff873fd04d") : error deleting /var/lib/kubelet/pods/739fbd7e-caa0-48cd-ba01-baff873fd04d/volume-subpaths: remove /var/lib/kubelet/pods/739fbd7e-caa0-48cd-ba01-baff873fd04d/volume-subpaths: no such file or directory Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.857571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-config-data" (OuterVolumeSpecName: "config-data") pod "739fbd7e-caa0-48cd-ba01-baff873fd04d" (UID: "739fbd7e-caa0-48cd-ba01-baff873fd04d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.914526 4707 generic.go:334] "Generic (PLEG): container finished" podID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerID="a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd" exitCode=0 Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.914792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c","Type":"ContainerDied","Data":"a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd"} Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.914838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6517b4eb-60c1-4d39-a5ed-1940dc0ef28c","Type":"ContainerDied","Data":"726f897c5299a3b5116dc324a924970daa30483116a89d14b56c21e61fe74c5e"} Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.914855 4707 scope.go:117] "RemoveContainer" containerID="a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.914949 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.920388 4707 generic.go:334] "Generic (PLEG): container finished" podID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerID="c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57" exitCode=0 Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.920487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"739fbd7e-caa0-48cd-ba01-baff873fd04d","Type":"ContainerDied","Data":"c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57"} Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.920514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"739fbd7e-caa0-48cd-ba01-baff873fd04d","Type":"ContainerDied","Data":"132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986"} Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.920590 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.923292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2d63bace-6bb5-4e18-af27-7c173e0fa410","Type":"ContainerStarted","Data":"f15a8cd2542e7c28c95267a2f66a0c5a8b5ff06a2bc26c1e02b0c10927269bf8"} Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.934910 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739fbd7e-caa0-48cd-ba01-baff873fd04d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.934936 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.934949 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf5v6\" (UniqueName: \"kubernetes.io/projected/739fbd7e-caa0-48cd-ba01-baff873fd04d-kube-api-access-zf5v6\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.946145 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.949761 4707 scope.go:117] "RemoveContainer" containerID="3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.951824 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968246 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.968586 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-api" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968607 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-api" Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.968618 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-log" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968626 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-log" Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.968639 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-log" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968646 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-log" Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.968656 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-metadata" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-metadata" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968847 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-log" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" containerName="nova-api-api" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-log" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.968887 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" containerName="nova-metadata-metadata" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.969791 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.979367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.983283 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.983599 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.983729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.984220 4707 scope.go:117] "RemoveContainer" containerID="a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd" Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.984803 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd\": container with ID starting with a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd not found: ID does not exist" containerID="a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.984857 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd"} err="failed to get container status \"a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd\": rpc error: code = NotFound desc = could not find container \"a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd\": container with ID starting with a0cb36b4c4fd84f400ac6222306d7f7ae7034e85aa25c85bece45867946863dd not found: ID does not exist" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.984877 4707 scope.go:117] "RemoveContainer" containerID="3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a" Jan 21 16:17:42 crc kubenswrapper[4707]: E0121 16:17:42.985132 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a\": container with ID starting with 3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a not found: ID does not exist" containerID="3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.985208 4707 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a"} err="failed to get container status \"3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a\": rpc error: code = NotFound desc = could not find container \"3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a\": container with ID starting with 3f633402fe576cc6179bac79c01df42dc848400494370d129d55f01f2b1ba41a not found: ID does not exist" Jan 21 16:17:42 crc kubenswrapper[4707]: I0121 16:17:42.985224 4707 scope.go:117] "RemoveContainer" containerID="c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.028453 4707 scope.go:117] "RemoveContainer" containerID="deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.036107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9bgq\" (UniqueName: \"kubernetes.io/projected/769a116a-43c8-43ef-b681-5af6219cd28e-kube-api-access-k9bgq\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.036225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.036274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-config-data\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.036314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/769a116a-43c8-43ef-b681-5af6219cd28e-logs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.036336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-public-tls-certs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.036360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.066927 4707 scope.go:117] "RemoveContainer" containerID="c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57" Jan 21 16:17:43 crc kubenswrapper[4707]: E0121 16:17:43.067573 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57\": container with ID starting with c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57 not found: ID does not exist" containerID="c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.067645 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57"} err="failed to get container status \"c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57\": rpc error: code = NotFound desc = could not find container \"c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57\": container with ID starting with c830e3eb5cf66e1d4eaf55f87387508b8497c906227bd0547628d2cdede61f57 not found: ID does not exist" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.067676 4707 scope.go:117] "RemoveContainer" containerID="deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d" Jan 21 16:17:43 crc kubenswrapper[4707]: E0121 16:17:43.068644 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d\": container with ID starting with deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d not found: ID does not exist" containerID="deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.068700 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d"} err="failed to get container status \"deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d\": rpc error: code = NotFound desc = could not find container \"deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d\": container with ID starting with deedd60677a170f269e4dbbca4a498de60bb87e451b5dcfa6bcf269a98999f7d not found: ID does not exist" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.137704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-config-data\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.137828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/769a116a-43c8-43ef-b681-5af6219cd28e-logs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.137857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-public-tls-certs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.137889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 
16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.137984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9bgq\" (UniqueName: \"kubernetes.io/projected/769a116a-43c8-43ef-b681-5af6219cd28e-kube-api-access-k9bgq\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.138431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/769a116a-43c8-43ef-b681-5af6219cd28e-logs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.139017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.142557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.143343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-public-tls-certs\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.143759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.151527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-config-data\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.163607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9bgq\" (UniqueName: \"kubernetes.io/projected/769a116a-43c8-43ef-b681-5af6219cd28e-kube-api-access-k9bgq\") pod \"nova-api-0\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.198051 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6517b4eb-60c1-4d39-a5ed-1940dc0ef28c" path="/var/lib/kubelet/pods/6517b4eb-60c1-4d39-a5ed-1940dc0ef28c/volumes" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.198905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2c575b-9647-4363-9cde-79e4aac4e25d" path="/var/lib/kubelet/pods/fa2c575b-9647-4363-9cde-79e4aac4e25d/volumes" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.299291 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.448264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle\") pod \"739fbd7e-caa0-48cd-ba01-baff873fd04d\" (UID: \"739fbd7e-caa0-48cd-ba01-baff873fd04d\") " Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.454442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "739fbd7e-caa0-48cd-ba01-baff873fd04d" (UID: "739fbd7e-caa0-48cd-ba01-baff873fd04d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.549759 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.551150 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739fbd7e-caa0-48cd-ba01-baff873fd04d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.560505 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.567087 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.569254 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.570769 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.571514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.584873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.653421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d21c587-54fc-4a07-982e-6e6419cc11f8-logs\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.653517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvdv\" (UniqueName: \"kubernetes.io/projected/9d21c587-54fc-4a07-982e-6e6419cc11f8-kube-api-access-qfvdv\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.653710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 
crc kubenswrapper[4707]: I0121 16:17:43.653772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.653951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-config-data\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.697524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.756954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.757329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.757388 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-config-data\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.757473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d21c587-54fc-4a07-982e-6e6419cc11f8-logs\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.757509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvdv\" (UniqueName: \"kubernetes.io/projected/9d21c587-54fc-4a07-982e-6e6419cc11f8-kube-api-access-qfvdv\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.758006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d21c587-54fc-4a07-982e-6e6419cc11f8-logs\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.760985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc 
kubenswrapper[4707]: I0121 16:17:43.761959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.762490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-config-data\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.771083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvdv\" (UniqueName: \"kubernetes.io/projected/9d21c587-54fc-4a07-982e-6e6419cc11f8-kube-api-access-qfvdv\") pod \"nova-metadata-0\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.888412 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.938207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2d63bace-6bb5-4e18-af27-7c173e0fa410","Type":"ContainerStarted","Data":"d3dc9fa87dcc0f8e4ace50b18b9015b05b452b2bfeca4b67b3bdfa26bf758cef"} Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.946347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"769a116a-43c8-43ef-b681-5af6219cd28e","Type":"ContainerStarted","Data":"9d565128b88e07372216fc5b02fb6fc9a64472ff30da1ffcb272fe9c92a40041"} Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.946404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"769a116a-43c8-43ef-b681-5af6219cd28e","Type":"ContainerStarted","Data":"746f6d23044ef38a52986a6c99bf13fa6520df90a6e7cf91d7deebca79c2287e"} Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.974105 4707 generic.go:334] "Generic (PLEG): container finished" podID="b223c18b-f22c-43eb-be75-70b22769154f" containerID="96cea6f17e082b00611e932baca2a541cb2c735a64dd28a957779e7b222ab73a" exitCode=0 Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.974413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerDied","Data":"96cea6f17e082b00611e932baca2a541cb2c735a64dd28a957779e7b222ab73a"} Jan 21 16:17:43 crc kubenswrapper[4707]: I0121 16:17:43.977589 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.977573215 podStartE2EDuration="2.977573215s" podCreationTimestamp="2026-01-21 16:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:43.971235087 +0000 UTC m=+4561.152751309" watchObservedRunningTime="2026-01-21 16:17:43.977573215 +0000 UTC m=+4561.159089437" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.024189 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-scripts\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-run-httpd\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-sg-core-conf-yaml\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-log-httpd\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-config-data\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-combined-ca-bundle\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.176769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6vfx\" (UniqueName: \"kubernetes.io/projected/b223c18b-f22c-43eb-be75-70b22769154f-kube-api-access-r6vfx\") pod \"b223c18b-f22c-43eb-be75-70b22769154f\" (UID: \"b223c18b-f22c-43eb-be75-70b22769154f\") " Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.177172 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.177266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.178140 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.178180 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b223c18b-f22c-43eb-be75-70b22769154f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.182281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-scripts" (OuterVolumeSpecName: "scripts") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.182443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b223c18b-f22c-43eb-be75-70b22769154f-kube-api-access-r6vfx" (OuterVolumeSpecName: "kube-api-access-r6vfx") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "kube-api-access-r6vfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.201701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.238940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.251927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-config-data" (OuterVolumeSpecName: "config-data") pod "b223c18b-f22c-43eb-be75-70b22769154f" (UID: "b223c18b-f22c-43eb-be75-70b22769154f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.281933 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6vfx\" (UniqueName: \"kubernetes.io/projected/b223c18b-f22c-43eb-be75-70b22769154f-kube-api-access-r6vfx\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.281977 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.281988 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.281999 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.282015 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223c18b-f22c-43eb-be75-70b22769154f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.321031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.990440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"769a116a-43c8-43ef-b681-5af6219cd28e","Type":"ContainerStarted","Data":"2917839fbaec0d038ba217855d1896f0c2eb80cde105fd1293e6851a3cfa05d2"} Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.992342 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5c33702-68b3-4723-8d09-5548858db57c" containerID="bba48301a1d812e5b1d562f60f016bd4942fac35d4db8dd307d57b200b5ae46e" exitCode=0 Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.992411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" event={"ID":"b5c33702-68b3-4723-8d09-5548858db57c","Type":"ContainerDied","Data":"bba48301a1d812e5b1d562f60f016bd4942fac35d4db8dd307d57b200b5ae46e"} Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.995038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b223c18b-f22c-43eb-be75-70b22769154f","Type":"ContainerDied","Data":"60168759174ac6d5ea0b4b371b46bfce76df885475aff8095642251ebc6a54ff"} Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.995186 4707 scope.go:117] "RemoveContainer" containerID="c68bb51f5c8ff11631cf44b3469ccbb44de1692e0875e3a7fbf38f4118fa4ea5" Jan 21 16:17:44 crc kubenswrapper[4707]: I0121 16:17:44.995109 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.002206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d21c587-54fc-4a07-982e-6e6419cc11f8","Type":"ContainerStarted","Data":"e0cd0acf3c9bc7f032f2f593e925d070692cb72baee1474c9ea774e822f89fcd"} Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.002233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d21c587-54fc-4a07-982e-6e6419cc11f8","Type":"ContainerStarted","Data":"1404116304fa06706e93edb4c9e8d8fe9ba66c7e98bb0107f358edab06f7d548"} Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.002245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d21c587-54fc-4a07-982e-6e6419cc11f8","Type":"ContainerStarted","Data":"0002e954f29737320a704adde724d97df8b6947b0a975a07cb7726767e7aa52c"} Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.023772 4707 scope.go:117] "RemoveContainer" containerID="2f8e595a8f18c8666a5028e8db6093e07511a0d63fb185918543bf831373f5f3" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.039839 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.03982363 podStartE2EDuration="3.03982363s" podCreationTimestamp="2026-01-21 16:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:45.017589613 +0000 UTC m=+4562.199105836" watchObservedRunningTime="2026-01-21 16:17:45.03982363 +0000 UTC m=+4562.221339852" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.048954 4707 scope.go:117] "RemoveContainer" containerID="96cea6f17e082b00611e932baca2a541cb2c735a64dd28a957779e7b222ab73a" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.053898 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.085692 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.095891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.096863 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.096848533 podStartE2EDuration="2.096848533s" podCreationTimestamp="2026-01-21 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:45.066155112 +0000 UTC m=+4562.247671333" watchObservedRunningTime="2026-01-21 16:17:45.096848533 +0000 UTC m=+4562.278364755" Jan 21 16:17:45 crc kubenswrapper[4707]: E0121 16:17:45.096992 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="proxy-httpd" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.097016 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="proxy-httpd" Jan 21 16:17:45 crc kubenswrapper[4707]: E0121 16:17:45.097062 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b223c18b-f22c-43eb-be75-70b22769154f" 
containerName="sg-core" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.097070 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="sg-core" Jan 21 16:17:45 crc kubenswrapper[4707]: E0121 16:17:45.097089 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-central-agent" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.097096 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-central-agent" Jan 21 16:17:45 crc kubenswrapper[4707]: E0121 16:17:45.097122 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-notification-agent" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.097129 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-notification-agent" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.100709 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="sg-core" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.100762 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-notification-agent" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.101346 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="ceilometer-central-agent" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.101383 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b223c18b-f22c-43eb-be75-70b22769154f" containerName="proxy-httpd" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.104191 4707 scope.go:117] "RemoveContainer" containerID="e8501037e0d9e56601120e2025eb13965509d978b9b0e1b9984337d4eed17472" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.104701 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.107676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.108063 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.137612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.193753 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739fbd7e-caa0-48cd-ba01-baff873fd04d" path="/var/lib/kubelet/pods/739fbd7e-caa0-48cd-ba01-baff873fd04d/volumes" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.194902 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b223c18b-f22c-43eb-be75-70b22769154f" path="/var/lib/kubelet/pods/b223c18b-f22c-43eb-be75-70b22769154f/volumes" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.206120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.206409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-scripts\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.206658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-log-httpd\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.206746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-config-data\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.206830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-run-httpd\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.207086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.207338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k94cb\" (UniqueName: \"kubernetes.io/projected/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-kube-api-access-k94cb\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.309781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94cb\" (UniqueName: \"kubernetes.io/projected/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-kube-api-access-k94cb\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.309883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.309989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-scripts\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.310097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-log-httpd\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.310180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-config-data\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.310220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-run-httpd\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.310257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.311674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-log-httpd\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.311723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-run-httpd\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 
16:17:45.316063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-config-data\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.316735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.317304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-scripts\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.323299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.325728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94cb\" (UniqueName: \"kubernetes.io/projected/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-kube-api-access-k94cb\") pod \"ceilometer-0\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.431520 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.637875 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:45 crc kubenswrapper[4707]: I0121 16:17:45.815057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:17:45 crc kubenswrapper[4707]: W0121 16:17:45.816420 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf4de2c0_d36d_46d5_8f7c_6fc093df4d2c.slice/crio-88ea5dd632509979cbe8e0e763b5acaf82f51f68a57f080ab6f444678cd9368f WatchSource:0}: Error finding container 88ea5dd632509979cbe8e0e763b5acaf82f51f68a57f080ab6f444678cd9368f: Status 404 returned error can't find the container with id 88ea5dd632509979cbe8e0e763b5acaf82f51f68a57f080ab6f444678cd9368f Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.013134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerStarted","Data":"88ea5dd632509979cbe8e0e763b5acaf82f51f68a57f080ab6f444678cd9368f"} Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.316836 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.441359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-scripts\") pod \"b5c33702-68b3-4723-8d09-5548858db57c\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.441555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-combined-ca-bundle\") pod \"b5c33702-68b3-4723-8d09-5548858db57c\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.441610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-config-data\") pod \"b5c33702-68b3-4723-8d09-5548858db57c\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.441723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjdd8\" (UniqueName: \"kubernetes.io/projected/b5c33702-68b3-4723-8d09-5548858db57c-kube-api-access-kjdd8\") pod \"b5c33702-68b3-4723-8d09-5548858db57c\" (UID: \"b5c33702-68b3-4723-8d09-5548858db57c\") " Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.444752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c33702-68b3-4723-8d09-5548858db57c-kube-api-access-kjdd8" (OuterVolumeSpecName: "kube-api-access-kjdd8") pod "b5c33702-68b3-4723-8d09-5548858db57c" (UID: "b5c33702-68b3-4723-8d09-5548858db57c"). InnerVolumeSpecName "kube-api-access-kjdd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.444879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-scripts" (OuterVolumeSpecName: "scripts") pod "b5c33702-68b3-4723-8d09-5548858db57c" (UID: "b5c33702-68b3-4723-8d09-5548858db57c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.468059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5c33702-68b3-4723-8d09-5548858db57c" (UID: "b5c33702-68b3-4723-8d09-5548858db57c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.470564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-config-data" (OuterVolumeSpecName: "config-data") pod "b5c33702-68b3-4723-8d09-5548858db57c" (UID: "b5c33702-68b3-4723-8d09-5548858db57c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.545198 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.545464 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjdd8\" (UniqueName: \"kubernetes.io/projected/b5c33702-68b3-4723-8d09-5548858db57c-kube-api-access-kjdd8\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.545481 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:46 crc kubenswrapper[4707]: I0121 16:17:46.545497 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c33702-68b3-4723-8d09-5548858db57c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:47 crc kubenswrapper[4707]: I0121 16:17:47.026022 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" Jan 21 16:17:47 crc kubenswrapper[4707]: I0121 16:17:47.026017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp" event={"ID":"b5c33702-68b3-4723-8d09-5548858db57c","Type":"ContainerDied","Data":"d32f1b94896dab038674b4272ae004abbeeab8a7c2fbd7b6fcefbbc201ce7e18"} Jan 21 16:17:47 crc kubenswrapper[4707]: I0121 16:17:47.026229 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d32f1b94896dab038674b4272ae004abbeeab8a7c2fbd7b6fcefbbc201ce7e18" Jan 21 16:17:47 crc kubenswrapper[4707]: I0121 16:17:47.028911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerStarted","Data":"b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a"} Jan 21 16:17:47 crc kubenswrapper[4707]: I0121 16:17:47.305489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:48 crc kubenswrapper[4707]: I0121 16:17:48.040339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerStarted","Data":"10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e"} Jan 21 16:17:48 crc kubenswrapper[4707]: I0121 16:17:48.040934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerStarted","Data":"448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38"} Jan 21 16:17:48 crc kubenswrapper[4707]: E0121 16:17:48.874401 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986\": RecentStats: unable to find data in memory cache]" Jan 21 16:17:48 crc kubenswrapper[4707]: I0121 16:17:48.889322 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:48 crc kubenswrapper[4707]: I0121 16:17:48.889377 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:50 crc kubenswrapper[4707]: I0121 16:17:50.062878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerStarted","Data":"d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1"} Jan 21 16:17:50 crc kubenswrapper[4707]: I0121 16:17:50.063498 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:17:50 crc kubenswrapper[4707]: I0121 16:17:50.085613 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.72422325 podStartE2EDuration="5.085593531s" podCreationTimestamp="2026-01-21 16:17:45 +0000 UTC" firstStartedPulling="2026-01-21 16:17:45.818895105 +0000 UTC m=+4563.000411327" lastFinishedPulling="2026-01-21 16:17:49.180265386 +0000 UTC m=+4566.361781608" observedRunningTime="2026-01-21 16:17:50.080307792 +0000 UTC m=+4567.261824014" watchObservedRunningTime="2026-01-21 16:17:50.085593531 +0000 UTC m=+4567.267109753" Jan 21 16:17:50 crc kubenswrapper[4707]: I0121 16:17:50.637737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:50 crc kubenswrapper[4707]: I0121 16:17:50.657528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.091402 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.183748 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:17:51 crc kubenswrapper[4707]: E0121 16:17:51.184411 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.217735 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.218090 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-log" containerID="cri-o://9d565128b88e07372216fc5b02fb6fc9a64472ff30da1ffcb272fe9c92a40041" gracePeriod=30 Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.218621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-api" containerID="cri-o://2917839fbaec0d038ba217855d1896f0c2eb80cde105fd1293e6851a3cfa05d2" gracePeriod=30 Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.227498 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.227739 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="2d63bace-6bb5-4e18-af27-7c173e0fa410" containerName="nova-scheduler-scheduler" containerID="cri-o://d3dc9fa87dcc0f8e4ace50b18b9015b05b452b2bfeca4b67b3bdfa26bf758cef" gracePeriod=30 Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.292009 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.292727 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-log" containerID="cri-o://1404116304fa06706e93edb4c9e8d8fe9ba66c7e98bb0107f358edab06f7d548" gracePeriod=30 Jan 21 16:17:51 crc kubenswrapper[4707]: I0121 16:17:51.292945 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-metadata" containerID="cri-o://e0cd0acf3c9bc7f032f2f593e925d070692cb72baee1474c9ea774e822f89fcd" gracePeriod=30 Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.128230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d21c587-54fc-4a07-982e-6e6419cc11f8","Type":"ContainerDied","Data":"e0cd0acf3c9bc7f032f2f593e925d070692cb72baee1474c9ea774e822f89fcd"} Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.127897 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerID="e0cd0acf3c9bc7f032f2f593e925d070692cb72baee1474c9ea774e822f89fcd" exitCode=0 Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.129935 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerID="1404116304fa06706e93edb4c9e8d8fe9ba66c7e98bb0107f358edab06f7d548" exitCode=143 Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.130039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d21c587-54fc-4a07-982e-6e6419cc11f8","Type":"ContainerDied","Data":"1404116304fa06706e93edb4c9e8d8fe9ba66c7e98bb0107f358edab06f7d548"} Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.137452 4707 generic.go:334] "Generic (PLEG): container finished" podID="769a116a-43c8-43ef-b681-5af6219cd28e" containerID="2917839fbaec0d038ba217855d1896f0c2eb80cde105fd1293e6851a3cfa05d2" exitCode=0 Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.137479 4707 generic.go:334] "Generic (PLEG): container finished" podID="769a116a-43c8-43ef-b681-5af6219cd28e" containerID="9d565128b88e07372216fc5b02fb6fc9a64472ff30da1ffcb272fe9c92a40041" exitCode=143 Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.137680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"769a116a-43c8-43ef-b681-5af6219cd28e","Type":"ContainerDied","Data":"2917839fbaec0d038ba217855d1896f0c2eb80cde105fd1293e6851a3cfa05d2"} Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.137778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"769a116a-43c8-43ef-b681-5af6219cd28e","Type":"ContainerDied","Data":"9d565128b88e07372216fc5b02fb6fc9a64472ff30da1ffcb272fe9c92a40041"} Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.137852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"769a116a-43c8-43ef-b681-5af6219cd28e","Type":"ContainerDied","Data":"746f6d23044ef38a52986a6c99bf13fa6520df90a6e7cf91d7deebca79c2287e"} Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.137909 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="746f6d23044ef38a52986a6c99bf13fa6520df90a6e7cf91d7deebca79c2287e" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.185691 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.278473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-internal-tls-certs\") pod \"769a116a-43c8-43ef-b681-5af6219cd28e\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.278637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/769a116a-43c8-43ef-b681-5af6219cd28e-logs\") pod \"769a116a-43c8-43ef-b681-5af6219cd28e\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.278845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-public-tls-certs\") pod \"769a116a-43c8-43ef-b681-5af6219cd28e\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.278986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/769a116a-43c8-43ef-b681-5af6219cd28e-logs" (OuterVolumeSpecName: "logs") pod "769a116a-43c8-43ef-b681-5af6219cd28e" (UID: "769a116a-43c8-43ef-b681-5af6219cd28e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.279143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-config-data\") pod \"769a116a-43c8-43ef-b681-5af6219cd28e\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.279323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9bgq\" (UniqueName: \"kubernetes.io/projected/769a116a-43c8-43ef-b681-5af6219cd28e-kube-api-access-k9bgq\") pod \"769a116a-43c8-43ef-b681-5af6219cd28e\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.279443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-combined-ca-bundle\") pod \"769a116a-43c8-43ef-b681-5af6219cd28e\" (UID: \"769a116a-43c8-43ef-b681-5af6219cd28e\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.281255 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/769a116a-43c8-43ef-b681-5af6219cd28e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.285470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769a116a-43c8-43ef-b681-5af6219cd28e-kube-api-access-k9bgq" (OuterVolumeSpecName: "kube-api-access-k9bgq") pod "769a116a-43c8-43ef-b681-5af6219cd28e" (UID: "769a116a-43c8-43ef-b681-5af6219cd28e"). InnerVolumeSpecName "kube-api-access-k9bgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.302186 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.306781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-config-data" (OuterVolumeSpecName: "config-data") pod "769a116a-43c8-43ef-b681-5af6219cd28e" (UID: "769a116a-43c8-43ef-b681-5af6219cd28e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.322454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "769a116a-43c8-43ef-b681-5af6219cd28e" (UID: "769a116a-43c8-43ef-b681-5af6219cd28e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.340222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "769a116a-43c8-43ef-b681-5af6219cd28e" (UID: "769a116a-43c8-43ef-b681-5af6219cd28e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.342100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "769a116a-43c8-43ef-b681-5af6219cd28e" (UID: "769a116a-43c8-43ef-b681-5af6219cd28e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.383766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvdv\" (UniqueName: \"kubernetes.io/projected/9d21c587-54fc-4a07-982e-6e6419cc11f8-kube-api-access-qfvdv\") pod \"9d21c587-54fc-4a07-982e-6e6419cc11f8\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.383869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d21c587-54fc-4a07-982e-6e6419cc11f8-logs\") pod \"9d21c587-54fc-4a07-982e-6e6419cc11f8\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.383916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-combined-ca-bundle\") pod \"9d21c587-54fc-4a07-982e-6e6419cc11f8\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-config-data\") pod \"9d21c587-54fc-4a07-982e-6e6419cc11f8\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-nova-metadata-tls-certs\") pod \"9d21c587-54fc-4a07-982e-6e6419cc11f8\" (UID: \"9d21c587-54fc-4a07-982e-6e6419cc11f8\") " Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d21c587-54fc-4a07-982e-6e6419cc11f8-logs" (OuterVolumeSpecName: "logs") pod "9d21c587-54fc-4a07-982e-6e6419cc11f8" (UID: "9d21c587-54fc-4a07-982e-6e6419cc11f8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384730 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384752 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384761 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d21c587-54fc-4a07-982e-6e6419cc11f8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384774 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384783 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9bgq\" (UniqueName: \"kubernetes.io/projected/769a116a-43c8-43ef-b681-5af6219cd28e-kube-api-access-k9bgq\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.384794 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/769a116a-43c8-43ef-b681-5af6219cd28e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.387541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d21c587-54fc-4a07-982e-6e6419cc11f8-kube-api-access-qfvdv" (OuterVolumeSpecName: "kube-api-access-qfvdv") pod "9d21c587-54fc-4a07-982e-6e6419cc11f8" (UID: "9d21c587-54fc-4a07-982e-6e6419cc11f8"). InnerVolumeSpecName "kube-api-access-qfvdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.406003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d21c587-54fc-4a07-982e-6e6419cc11f8" (UID: "9d21c587-54fc-4a07-982e-6e6419cc11f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.408155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-config-data" (OuterVolumeSpecName: "config-data") pod "9d21c587-54fc-4a07-982e-6e6419cc11f8" (UID: "9d21c587-54fc-4a07-982e-6e6419cc11f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.437598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9d21c587-54fc-4a07-982e-6e6419cc11f8" (UID: "9d21c587-54fc-4a07-982e-6e6419cc11f8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.487098 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.487132 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.487147 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfvdv\" (UniqueName: \"kubernetes.io/projected/9d21c587-54fc-4a07-982e-6e6419cc11f8-kube-api-access-qfvdv\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:52 crc kubenswrapper[4707]: I0121 16:17:52.487158 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d21c587-54fc-4a07-982e-6e6419cc11f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.146659 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.153896 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.154173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9d21c587-54fc-4a07-982e-6e6419cc11f8","Type":"ContainerDied","Data":"0002e954f29737320a704adde724d97df8b6947b0a975a07cb7726767e7aa52c"} Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.154213 4707 scope.go:117] "RemoveContainer" containerID="e0cd0acf3c9bc7f032f2f593e925d070692cb72baee1474c9ea774e822f89fcd" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.182302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.185964 4707 scope.go:117] "RemoveContainer" containerID="1404116304fa06706e93edb4c9e8d8fe9ba66c7e98bb0107f358edab06f7d548" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.209275 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.228453 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243222 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: E0121 16:17:53.243656 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-log" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243680 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-log" Jan 21 16:17:53 crc kubenswrapper[4707]: E0121 16:17:53.243706 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c33702-68b3-4723-8d09-5548858db57c" containerName="nova-manage" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243713 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b5c33702-68b3-4723-8d09-5548858db57c" containerName="nova-manage" Jan 21 16:17:53 crc kubenswrapper[4707]: E0121 16:17:53.243729 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-metadata" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243735 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-metadata" Jan 21 16:17:53 crc kubenswrapper[4707]: E0121 16:17:53.243751 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-api" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243756 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-api" Jan 21 16:17:53 crc kubenswrapper[4707]: E0121 16:17:53.243768 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-log" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243774 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-log" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243963 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c33702-68b3-4723-8d09-5548858db57c" containerName="nova-manage" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243980 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-api" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.243989 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-metadata" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.244001 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" containerName="nova-metadata-log" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.244015 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" containerName="nova-api-log" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.245023 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.247544 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.247926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.248207 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.259670 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.264401 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.273558 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.275231 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.277179 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.278996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.280040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-config-data\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-config-data\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298486 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhc6t\" (UniqueName: \"kubernetes.io/projected/5de9761d-81cf-4d45-aec0-7407347f0449-kube-api-access-xhc6t\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5de9761d-81cf-4d45-aec0-7407347f0449-logs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d463b3-4ebf-46d8-9a72-3142c370d2f9-logs\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-public-tls-certs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.298871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dbp5\" (UniqueName: \"kubernetes.io/projected/36d463b3-4ebf-46d8-9a72-3142c370d2f9-kube-api-access-9dbp5\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-public-tls-certs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dbp5\" (UniqueName: \"kubernetes.io/projected/36d463b3-4ebf-46d8-9a72-3142c370d2f9-kube-api-access-9dbp5\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-config-data\") pod 
\"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-config-data\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhc6t\" (UniqueName: \"kubernetes.io/projected/5de9761d-81cf-4d45-aec0-7407347f0449-kube-api-access-xhc6t\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de9761d-81cf-4d45-aec0-7407347f0449-logs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.400970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.401039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.401094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d463b3-4ebf-46d8-9a72-3142c370d2f9-logs\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.401132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.401618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d463b3-4ebf-46d8-9a72-3142c370d2f9-logs\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.401955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/5de9761d-81cf-4d45-aec0-7407347f0449-logs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.406429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-config-data\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.406865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.406901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-public-tls-certs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.406906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.407863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.408859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.409363 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-config-data\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.416049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dbp5\" (UniqueName: \"kubernetes.io/projected/36d463b3-4ebf-46d8-9a72-3142c370d2f9-kube-api-access-9dbp5\") pod \"nova-metadata-0\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.417921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhc6t\" (UniqueName: \"kubernetes.io/projected/5de9761d-81cf-4d45-aec0-7407347f0449-kube-api-access-xhc6t\") pod \"nova-api-0\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.564517 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.589185 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:53 crc kubenswrapper[4707]: I0121 16:17:53.991704 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.072836 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:17:54 crc kubenswrapper[4707]: W0121 16:17:54.075613 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d463b3_4ebf_46d8_9a72_3142c370d2f9.slice/crio-b792c1a33a8e70b2e1f838bf97d0e480870d4a0407c3fdd650a3eee907e476df WatchSource:0}: Error finding container b792c1a33a8e70b2e1f838bf97d0e480870d4a0407c3fdd650a3eee907e476df: Status 404 returned error can't find the container with id b792c1a33a8e70b2e1f838bf97d0e480870d4a0407c3fdd650a3eee907e476df Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.170158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"36d463b3-4ebf-46d8-9a72-3142c370d2f9","Type":"ContainerStarted","Data":"b792c1a33a8e70b2e1f838bf97d0e480870d4a0407c3fdd650a3eee907e476df"} Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.172245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5de9761d-81cf-4d45-aec0-7407347f0449","Type":"ContainerStarted","Data":"ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89"} Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.172271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5de9761d-81cf-4d45-aec0-7407347f0449","Type":"ContainerStarted","Data":"ea99fc79106b3eb034f6f6a5b8020ad6adf39ab3d0c2c1c516c831351a6e2995"} Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.173517 4707 generic.go:334] "Generic (PLEG): container finished" podID="2d63bace-6bb5-4e18-af27-7c173e0fa410" containerID="d3dc9fa87dcc0f8e4ace50b18b9015b05b452b2bfeca4b67b3bdfa26bf758cef" exitCode=0 Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.173547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2d63bace-6bb5-4e18-af27-7c173e0fa410","Type":"ContainerDied","Data":"d3dc9fa87dcc0f8e4ace50b18b9015b05b452b2bfeca4b67b3bdfa26bf758cef"} Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.228040 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.319192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-config-data\") pod \"2d63bace-6bb5-4e18-af27-7c173e0fa410\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.319469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-combined-ca-bundle\") pod \"2d63bace-6bb5-4e18-af27-7c173e0fa410\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.319554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztggs\" (UniqueName: \"kubernetes.io/projected/2d63bace-6bb5-4e18-af27-7c173e0fa410-kube-api-access-ztggs\") pod \"2d63bace-6bb5-4e18-af27-7c173e0fa410\" (UID: \"2d63bace-6bb5-4e18-af27-7c173e0fa410\") " Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.324852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63bace-6bb5-4e18-af27-7c173e0fa410-kube-api-access-ztggs" (OuterVolumeSpecName: "kube-api-access-ztggs") pod "2d63bace-6bb5-4e18-af27-7c173e0fa410" (UID: "2d63bace-6bb5-4e18-af27-7c173e0fa410"). InnerVolumeSpecName "kube-api-access-ztggs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.338959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-config-data" (OuterVolumeSpecName: "config-data") pod "2d63bace-6bb5-4e18-af27-7c173e0fa410" (UID: "2d63bace-6bb5-4e18-af27-7c173e0fa410"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.341061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d63bace-6bb5-4e18-af27-7c173e0fa410" (UID: "2d63bace-6bb5-4e18-af27-7c173e0fa410"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.424108 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.424174 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d63bace-6bb5-4e18-af27-7c173e0fa410-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:54 crc kubenswrapper[4707]: I0121 16:17:54.424194 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztggs\" (UniqueName: \"kubernetes.io/projected/2d63bace-6bb5-4e18-af27-7c173e0fa410-kube-api-access-ztggs\") on node \"crc\" DevicePath \"\"" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.191692 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.193059 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769a116a-43c8-43ef-b681-5af6219cd28e" path="/var/lib/kubelet/pods/769a116a-43c8-43ef-b681-5af6219cd28e/volumes" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.193712 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d21c587-54fc-4a07-982e-6e6419cc11f8" path="/var/lib/kubelet/pods/9d21c587-54fc-4a07-982e-6e6419cc11f8/volumes" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.194609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"36d463b3-4ebf-46d8-9a72-3142c370d2f9","Type":"ContainerStarted","Data":"533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22"} Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.194642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"36d463b3-4ebf-46d8-9a72-3142c370d2f9","Type":"ContainerStarted","Data":"c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65"} Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.194665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5de9761d-81cf-4d45-aec0-7407347f0449","Type":"ContainerStarted","Data":"6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41"} Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.194678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2d63bace-6bb5-4e18-af27-7c173e0fa410","Type":"ContainerDied","Data":"f15a8cd2542e7c28c95267a2f66a0c5a8b5ff06a2bc26c1e02b0c10927269bf8"} Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.194700 4707 scope.go:117] "RemoveContainer" containerID="d3dc9fa87dcc0f8e4ace50b18b9015b05b452b2bfeca4b67b3bdfa26bf758cef" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.216672 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.216654315 podStartE2EDuration="2.216654315s" podCreationTimestamp="2026-01-21 16:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:55.209230676 +0000 UTC m=+4572.390746898" watchObservedRunningTime="2026-01-21 16:17:55.216654315 +0000 UTC m=+4572.398170537" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.233485 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.244922 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.255496 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.2554787259999998 podStartE2EDuration="2.255478726s" podCreationTimestamp="2026-01-21 16:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:55.238561236 +0000 UTC m=+4572.420077458" watchObservedRunningTime="2026-01-21 16:17:55.255478726 +0000 UTC m=+4572.436994948" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.269581 4707 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:55 crc kubenswrapper[4707]: E0121 16:17:55.269974 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d63bace-6bb5-4e18-af27-7c173e0fa410" containerName="nova-scheduler-scheduler" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.269994 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d63bace-6bb5-4e18-af27-7c173e0fa410" containerName="nova-scheduler-scheduler" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.270205 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d63bace-6bb5-4e18-af27-7c173e0fa410" containerName="nova-scheduler-scheduler" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.270767 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.272606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.279727 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.343986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.344068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-config-data\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.344519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbmbp\" (UniqueName: \"kubernetes.io/projected/80f502e2-066e-4230-b504-29825f19918a-kube-api-access-kbmbp\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.446050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-config-data\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.446153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbmbp\" (UniqueName: \"kubernetes.io/projected/80f502e2-066e-4230-b504-29825f19918a-kube-api-access-kbmbp\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.446218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.451052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-config-data\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.451444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.463697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbmbp\" (UniqueName: \"kubernetes.io/projected/80f502e2-066e-4230-b504-29825f19918a-kube-api-access-kbmbp\") pod \"nova-scheduler-0\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:55 crc kubenswrapper[4707]: I0121 16:17:55.595067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:17:56 crc kubenswrapper[4707]: W0121 16:17:56.005028 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f502e2_066e_4230_b504_29825f19918a.slice/crio-e104d37b0c635c6d067bb8de883529400b116f8fa86c601f8ffaf7a8db22717b WatchSource:0}: Error finding container e104d37b0c635c6d067bb8de883529400b116f8fa86c601f8ffaf7a8db22717b: Status 404 returned error can't find the container with id e104d37b0c635c6d067bb8de883529400b116f8fa86c601f8ffaf7a8db22717b Jan 21 16:17:56 crc kubenswrapper[4707]: I0121 16:17:56.006479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:17:56 crc kubenswrapper[4707]: I0121 16:17:56.204519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"80f502e2-066e-4230-b504-29825f19918a","Type":"ContainerStarted","Data":"e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a"} Jan 21 16:17:56 crc kubenswrapper[4707]: I0121 16:17:56.204573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"80f502e2-066e-4230-b504-29825f19918a","Type":"ContainerStarted","Data":"e104d37b0c635c6d067bb8de883529400b116f8fa86c601f8ffaf7a8db22717b"} Jan 21 16:17:56 crc kubenswrapper[4707]: I0121 16:17:56.242544 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.242523913 podStartE2EDuration="1.242523913s" podCreationTimestamp="2026-01-21 16:17:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:17:56.234617647 +0000 UTC m=+4573.416133869" watchObservedRunningTime="2026-01-21 16:17:56.242523913 +0000 UTC m=+4573.424040135" Jan 21 16:17:57 crc kubenswrapper[4707]: I0121 16:17:57.190900 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d63bace-6bb5-4e18-af27-7c173e0fa410" path="/var/lib/kubelet/pods/2d63bace-6bb5-4e18-af27-7c173e0fa410/volumes" Jan 21 16:17:58 crc kubenswrapper[4707]: I0121 
16:17:58.590893 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:58 crc kubenswrapper[4707]: I0121 16:17:58.591335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:17:59 crc kubenswrapper[4707]: E0121 16:17:59.103895 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986\": RecentStats: unable to find data in memory cache]" Jan 21 16:18:00 crc kubenswrapper[4707]: I0121 16:18:00.595293 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:18:03 crc kubenswrapper[4707]: I0121 16:18:03.190531 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:18:03 crc kubenswrapper[4707]: E0121 16:18:03.191587 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:18:03 crc kubenswrapper[4707]: I0121 16:18:03.565593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:03 crc kubenswrapper[4707]: I0121 16:18:03.565778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:03 crc kubenswrapper[4707]: I0121 16:18:03.589965 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:18:03 crc kubenswrapper[4707]: I0121 16:18:03.590283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:18:04 crc kubenswrapper[4707]: I0121 16:18:04.581932 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:18:04 crc kubenswrapper[4707]: I0121 16:18:04.581951 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:18:04 crc kubenswrapper[4707]: I0121 16:18:04.600942 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:18:04 crc kubenswrapper[4707]: I0121 16:18:04.600959 4707 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack-kuttl-tests/nova-metadata-0" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:18:05 crc kubenswrapper[4707]: I0121 16:18:05.596213 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:18:05 crc kubenswrapper[4707]: I0121 16:18:05.620577 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:18:06 crc kubenswrapper[4707]: I0121 16:18:06.348986 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:18:09 crc kubenswrapper[4707]: E0121 16:18:09.326275 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986\": RecentStats: unable to find data in memory cache]" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.574702 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.575374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.575613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.580645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.600613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.601044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:18:13 crc kubenswrapper[4707]: I0121 16:18:13.604410 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:18:14 crc kubenswrapper[4707]: I0121 16:18:14.395848 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:14 crc kubenswrapper[4707]: I0121 16:18:14.399719 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:18:14 crc kubenswrapper[4707]: I0121 16:18:14.401040 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:18:15 crc kubenswrapper[4707]: I0121 16:18:15.183263 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:18:15 crc kubenswrapper[4707]: E0121 16:18:15.184278 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:18:15 crc kubenswrapper[4707]: I0121 16:18:15.437335 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.192766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.194318 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="02a1be71-0142-4970-8a8a-9a04c4bd03da" containerName="kube-state-metrics" containerID="cri-o://41464610db9257d72f412902dad7f3147f55589f358a384de5c478a302cf91eb" gracePeriod=30 Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.443140 4707 generic.go:334] "Generic (PLEG): container finished" podID="02a1be71-0142-4970-8a8a-9a04c4bd03da" containerID="41464610db9257d72f412902dad7f3147f55589f358a384de5c478a302cf91eb" exitCode=2 Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.443207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02a1be71-0142-4970-8a8a-9a04c4bd03da","Type":"ContainerDied","Data":"41464610db9257d72f412902dad7f3147f55589f358a384de5c478a302cf91eb"} Jan 21 16:18:19 crc kubenswrapper[4707]: E0121 16:18:19.535486 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986\": RecentStats: unable to find data in memory cache]" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.566605 4707 scope.go:117] "RemoveContainer" containerID="418de46e8ddd98abdd313f3caa5188dc17cb30561b35253f50ddd8686752659b" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.620267 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.624859 4707 scope.go:117] "RemoveContainer" containerID="159bdb08582167cf1c8abdc43ff23a21e055879346a2d331601f1afb7b6503b5" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.664037 4707 scope.go:117] "RemoveContainer" containerID="c4c68a89695bf606a0a4cc6f42584ecaa9e2bf69fe90294a201d349ae0a85e8f" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.747062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp2bv\" (UniqueName: \"kubernetes.io/projected/02a1be71-0142-4970-8a8a-9a04c4bd03da-kube-api-access-tp2bv\") pod \"02a1be71-0142-4970-8a8a-9a04c4bd03da\" (UID: \"02a1be71-0142-4970-8a8a-9a04c4bd03da\") " Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.765280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a1be71-0142-4970-8a8a-9a04c4bd03da-kube-api-access-tp2bv" (OuterVolumeSpecName: "kube-api-access-tp2bv") pod "02a1be71-0142-4970-8a8a-9a04c4bd03da" (UID: "02a1be71-0142-4970-8a8a-9a04c4bd03da"). InnerVolumeSpecName "kube-api-access-tp2bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:18:19 crc kubenswrapper[4707]: I0121 16:18:19.850076 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp2bv\" (UniqueName: \"kubernetes.io/projected/02a1be71-0142-4970-8a8a-9a04c4bd03da-kube-api-access-tp2bv\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.453983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02a1be71-0142-4970-8a8a-9a04c4bd03da","Type":"ContainerDied","Data":"df6210c03f9f87c4d68a70b6e968050a4b14db8482dad0d2cbae022dfabf63f8"} Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.454302 4707 scope.go:117] "RemoveContainer" containerID="41464610db9257d72f412902dad7f3147f55589f358a384de5c478a302cf91eb" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.454029 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.481335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.486988 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.496564 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:18:20 crc kubenswrapper[4707]: E0121 16:18:20.496976 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a1be71-0142-4970-8a8a-9a04c4bd03da" containerName="kube-state-metrics" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.496995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a1be71-0142-4970-8a8a-9a04c4bd03da" containerName="kube-state-metrics" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.497186 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a1be71-0142-4970-8a8a-9a04c4bd03da" containerName="kube-state-metrics" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.498053 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.500214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.500373 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.507446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfvhf\" (UniqueName: \"kubernetes.io/projected/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-api-access-sfvhf\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664499 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-central-agent" containerID="cri-o://b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a" gracePeriod=30 Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664532 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="proxy-httpd" containerID="cri-o://d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1" gracePeriod=30 Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664610 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-notification-agent" containerID="cri-o://448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38" gracePeriod=30 Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.664550 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="sg-core" containerID="cri-o://10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e" gracePeriod=30 Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 
16:18:20.664873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.767919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfvhf\" (UniqueName: \"kubernetes.io/projected/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-api-access-sfvhf\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.767993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.768063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.768115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.774537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.774593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.776260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.787396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfvhf\" (UniqueName: \"kubernetes.io/projected/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-api-access-sfvhf\") pod \"kube-state-metrics-0\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " pod="openstack-kuttl-tests/kube-state-metrics-0" 
Jan 21 16:18:20 crc kubenswrapper[4707]: I0121 16:18:20.816784 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.190985 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a1be71-0142-4970-8a8a-9a04c4bd03da" path="/var/lib/kubelet/pods/02a1be71-0142-4970-8a8a-9a04c4bd03da/volumes" Jan 21 16:18:21 crc kubenswrapper[4707]: W0121 16:18:21.221594 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f7ba87_baff_4fe7_97ab_3ea947e9c102.slice/crio-f43c0ff8d567ada75b9e91d164f81aa7acca5d3ab89ecc106c6e346959b57d88 WatchSource:0}: Error finding container f43c0ff8d567ada75b9e91d164f81aa7acca5d3ab89ecc106c6e346959b57d88: Status 404 returned error can't find the container with id f43c0ff8d567ada75b9e91d164f81aa7acca5d3ab89ecc106c6e346959b57d88 Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.222944 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.467204 4707 generic.go:334] "Generic (PLEG): container finished" podID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerID="d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1" exitCode=0 Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.467526 4707 generic.go:334] "Generic (PLEG): container finished" podID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerID="10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e" exitCode=2 Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.467539 4707 generic.go:334] "Generic (PLEG): container finished" podID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerID="b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a" exitCode=0 Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.467339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerDied","Data":"d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1"} Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.467617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerDied","Data":"10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e"} Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.467636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerDied","Data":"b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a"} Jan 21 16:18:21 crc kubenswrapper[4707]: I0121 16:18:21.470680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"b5f7ba87-baff-4fe7-97ab-3ea947e9c102","Type":"ContainerStarted","Data":"f43c0ff8d567ada75b9e91d164f81aa7acca5d3ab89ecc106c6e346959b57d88"} Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.194746 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.301945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-run-httpd\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-log-httpd\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94cb\" (UniqueName: \"kubernetes.io/projected/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-kube-api-access-k94cb\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-config-data\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-sg-core-conf-yaml\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-scripts\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-combined-ca-bundle\") pod \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\" (UID: \"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c\") " Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.302827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.303018 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.303041 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.310556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-scripts" (OuterVolumeSpecName: "scripts") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.310940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-kube-api-access-k94cb" (OuterVolumeSpecName: "kube-api-access-k94cb") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "kube-api-access-k94cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.326068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.357662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.371177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-config-data" (OuterVolumeSpecName: "config-data") pod "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" (UID: "af4de2c0-d36d-46d5-8f7c-6fc093df4d2c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.405261 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94cb\" (UniqueName: \"kubernetes.io/projected/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-kube-api-access-k94cb\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.405318 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.405329 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.405340 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.405356 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.480483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"b5f7ba87-baff-4fe7-97ab-3ea947e9c102","Type":"ContainerStarted","Data":"91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8"} Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.480544 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.482734 4707 generic.go:334] "Generic (PLEG): container finished" podID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerID="448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38" exitCode=0 Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.482776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerDied","Data":"448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38"} Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.482788 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.482823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"af4de2c0-d36d-46d5-8f7c-6fc093df4d2c","Type":"ContainerDied","Data":"88ea5dd632509979cbe8e0e763b5acaf82f51f68a57f080ab6f444678cd9368f"} Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.482842 4707 scope.go:117] "RemoveContainer" containerID="d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.498108 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.2044332779999998 podStartE2EDuration="2.498091082s" podCreationTimestamp="2026-01-21 16:18:20 +0000 UTC" firstStartedPulling="2026-01-21 16:18:21.223563833 +0000 UTC m=+4598.405080055" lastFinishedPulling="2026-01-21 16:18:21.517221637 +0000 UTC m=+4598.698737859" observedRunningTime="2026-01-21 16:18:22.493941409 +0000 UTC m=+4599.675457632" watchObservedRunningTime="2026-01-21 16:18:22.498091082 +0000 UTC m=+4599.679607304" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.511857 4707 scope.go:117] "RemoveContainer" containerID="10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.524389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.535323 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.552487 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-notification-agent" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552504 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-notification-agent" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.552528 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="proxy-httpd" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552534 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="proxy-httpd" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.552555 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="sg-core" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="sg-core" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.552572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-central-agent" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552580 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-central-agent" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552735 4707 scope.go:117] 
"RemoveContainer" containerID="448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552801 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-notification-agent" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552856 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="sg-core" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="proxy-httpd" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.552878 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" containerName="ceilometer-central-agent" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.554470 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.557135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.557311 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.557433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.560163 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.585062 4707 scope.go:117] "RemoveContainer" containerID="b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.603413 4707 scope.go:117] "RemoveContainer" containerID="d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.603767 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1\": container with ID starting with d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1 not found: ID does not exist" containerID="d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.603801 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1"} err="failed to get container status \"d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1\": rpc error: code = NotFound desc = could not find container \"d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1\": container with ID starting with d8e46ff57176996b19850008297f5ea2cb5eff878168cedad572d87337247ea1 not found: ID does not exist" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.603847 4707 scope.go:117] "RemoveContainer" containerID="10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.604342 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e\": container with ID starting with 10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e not found: ID does not exist" containerID="10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.604384 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e"} err="failed to get container status \"10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e\": rpc error: code = NotFound desc = could not find container \"10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e\": container with ID starting with 10f6bcb244c1b6028088435fc2c674b6d7d63452fcb72f6611c260d20ed5406e not found: ID does not exist" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.604408 4707 scope.go:117] "RemoveContainer" containerID="448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.604674 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38\": container with ID starting with 448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38 not found: ID does not exist" containerID="448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.604700 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38"} err="failed to get container status \"448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38\": rpc error: code = NotFound desc = could not find container \"448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38\": container with ID starting with 448b91638c2774ffbd2ca92901d7a028025e40316259250c0fea4ac06eba1e38 not found: ID does not exist" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.604717 4707 scope.go:117] "RemoveContainer" containerID="b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a" Jan 21 16:18:22 crc kubenswrapper[4707]: E0121 16:18:22.604989 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a\": container with ID starting with b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a not found: ID does not exist" containerID="b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.605014 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a"} err="failed to get container status \"b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a\": rpc error: code = NotFound desc = could not find container \"b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a\": container with ID starting with b9118822a44d479f8e53bbe1a0826146ce3b57c9bf26b00889cc383962ea547a not found: ID does not exist" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607805 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-scripts\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjjll\" (UniqueName: \"kubernetes.io/projected/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-kube-api-access-tjjll\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-run-httpd\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-log-httpd\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.607972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-config-data\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.608022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.709636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-run-httpd\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.709722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-log-httpd\") pod \"ceilometer-0\" 
(UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.709850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.709894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-config-data\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.710024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.710114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.710150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-scripts\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.710238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjjll\" (UniqueName: \"kubernetes.io/projected/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-kube-api-access-tjjll\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.710250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-run-httpd\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.710359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-log-httpd\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.713750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.714761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-scripts\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.715115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.715238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-config-data\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.717917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.725116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjjll\" (UniqueName: \"kubernetes.io/projected/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-kube-api-access-tjjll\") pod \"ceilometer-0\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:22 crc kubenswrapper[4707]: I0121 16:18:22.870937 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:23 crc kubenswrapper[4707]: I0121 16:18:23.191541 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4de2c0-d36d-46d5-8f7c-6fc093df4d2c" path="/var/lib/kubelet/pods/af4de2c0-d36d-46d5-8f7c-6fc093df4d2c/volumes" Jan 21 16:18:23 crc kubenswrapper[4707]: I0121 16:18:23.309424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:18:23 crc kubenswrapper[4707]: I0121 16:18:23.494187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerStarted","Data":"d6b12b269502fd2de890e43a1d712e337c91384d942b74252b62541be9d3b607"} Jan 21 16:18:24 crc kubenswrapper[4707]: I0121 16:18:24.503209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerStarted","Data":"32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65"} Jan 21 16:18:25 crc kubenswrapper[4707]: I0121 16:18:25.513147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerStarted","Data":"290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9"} Jan 21 16:18:26 crc kubenswrapper[4707]: I0121 16:18:26.523732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerStarted","Data":"2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b"} Jan 21 16:18:28 crc kubenswrapper[4707]: I0121 16:18:28.182288 4707 
scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:18:28 crc kubenswrapper[4707]: E0121 16:18:28.182856 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:18:28 crc kubenswrapper[4707]: I0121 16:18:28.544761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerStarted","Data":"928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c"} Jan 21 16:18:28 crc kubenswrapper[4707]: I0121 16:18:28.545238 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:29 crc kubenswrapper[4707]: E0121 16:18:29.756371 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986\": RecentStats: unable to find data in memory cache]" Jan 21 16:18:30 crc kubenswrapper[4707]: I0121 16:18:30.826866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:18:30 crc kubenswrapper[4707]: I0121 16:18:30.845878 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=4.834957183 podStartE2EDuration="8.845854889s" podCreationTimestamp="2026-01-21 16:18:22 +0000 UTC" firstStartedPulling="2026-01-21 16:18:23.31128122 +0000 UTC m=+4600.492797442" lastFinishedPulling="2026-01-21 16:18:27.322178925 +0000 UTC m=+4604.503695148" observedRunningTime="2026-01-21 16:18:28.561004754 +0000 UTC m=+4605.742520976" watchObservedRunningTime="2026-01-21 16:18:30.845854889 +0000 UTC m=+4608.027371111" Jan 21 16:18:39 crc kubenswrapper[4707]: I0121 16:18:39.183132 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:18:39 crc kubenswrapper[4707]: E0121 16:18:39.184531 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:18:39 crc kubenswrapper[4707]: E0121 16:18:39.963068 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739fbd7e_caa0_48cd_ba01_baff873fd04d.slice/crio-132b5a35062691c11445d555090bdcd6c2a1fc9ee0ac13aae9e7ad595d6ca986\": RecentStats: unable to find data in memory cache]" Jan 21 16:18:52 crc kubenswrapper[4707]: I0121 16:18:52.879876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:18:53 crc kubenswrapper[4707]: I0121 16:18:53.187864 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:18:53 crc kubenswrapper[4707]: E0121 16:18:53.188137 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.183053 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:19:04 crc kubenswrapper[4707]: E0121 16:19:04.183655 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.295462 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pjb"] Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.297184 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.319002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pjb"] Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.444846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-catalog-content\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.445070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz6sf\" (UniqueName: \"kubernetes.io/projected/d48c9611-090f-4751-ae18-8a781982b122-kube-api-access-xz6sf\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.445150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-utilities\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.546507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz6sf\" (UniqueName: \"kubernetes.io/projected/d48c9611-090f-4751-ae18-8a781982b122-kube-api-access-xz6sf\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " 
pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.546577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-utilities\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.546630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-catalog-content\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.547007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-utilities\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.547034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-catalog-content\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.562091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz6sf\" (UniqueName: \"kubernetes.io/projected/d48c9611-090f-4751-ae18-8a781982b122-kube-api-access-xz6sf\") pod \"redhat-marketplace-x2pjb\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:04 crc kubenswrapper[4707]: I0121 16:19:04.612763 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:05 crc kubenswrapper[4707]: I0121 16:19:05.063897 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pjb"] Jan 21 16:19:05 crc kubenswrapper[4707]: I0121 16:19:05.850589 4707 generic.go:334] "Generic (PLEG): container finished" podID="d48c9611-090f-4751-ae18-8a781982b122" containerID="680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03" exitCode=0 Jan 21 16:19:05 crc kubenswrapper[4707]: I0121 16:19:05.850795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerDied","Data":"680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03"} Jan 21 16:19:05 crc kubenswrapper[4707]: I0121 16:19:05.850837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerStarted","Data":"5dafad34b3fa6c645ca9f5d337072d208dec30287531b91e52dc2aeb945d6a43"} Jan 21 16:19:06 crc kubenswrapper[4707]: I0121 16:19:06.860043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerStarted","Data":"1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c"} Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.597794 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.598057 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="453b486c-659b-4fc0-853e-2c8e17129d27" containerName="openstackclient" containerID="cri-o://49fdcf9618939364f2f9c95eb510f65f5cbb896c2cbd65a978458a287b5f3b5c" gracePeriod=2 Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.612421 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.643253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.802097 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-9xw7x"] Jan 21 16:19:07 crc kubenswrapper[4707]: E0121 16:19:07.804195 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:07 crc kubenswrapper[4707]: E0121 16:19:07.804286 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data podName:00504354-68de-4d18-8192-144ce961d72e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:08.304249032 +0000 UTC m=+4645.485765254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data") pod "rabbitmq-cell1-server-0" (UID: "00504354-68de-4d18-8192-144ce961d72e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.817624 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.837295 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-9xw7x"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.851614 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-9fg92"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.855994 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t4mx8"] Jan 21 16:19:07 crc kubenswrapper[4707]: E0121 16:19:07.856403 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453b486c-659b-4fc0-853e-2c8e17129d27" containerName="openstackclient" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.856423 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="453b486c-659b-4fc0-853e-2c8e17129d27" containerName="openstackclient" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.856633 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="453b486c-659b-4fc0-853e-2c8e17129d27" containerName="openstackclient" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.857289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.859984 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.866054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t4mx8"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.901372 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.902691 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.903779 4707 generic.go:334] "Generic (PLEG): container finished" podID="d48c9611-090f-4751-ae18-8a781982b122" containerID="1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c" exitCode=0 Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.903835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerDied","Data":"1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c"} Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.917255 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.945991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6"] Jan 21 16:19:07 crc kubenswrapper[4707]: I0121 16:19:07.986605 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.006686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts\") pod \"root-account-create-update-t4mx8\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.006874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts\") pod \"nova-cell1-a90f-account-create-update-gc8v6\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.006925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxgg\" (UniqueName: \"kubernetes.io/projected/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-kube-api-access-msxgg\") pod \"root-account-create-update-t4mx8\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.006957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88th\" (UniqueName: \"kubernetes.io/projected/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-kube-api-access-q88th\") pod \"nova-cell1-a90f-account-create-update-gc8v6\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.009986 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-0b3b-account-create-update-xcdb5"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.103751 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.105626 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.109011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts\") pod \"root-account-create-update-t4mx8\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.109224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts\") pod \"nova-cell1-a90f-account-create-update-gc8v6\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.109708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts\") pod \"root-account-create-update-t4mx8\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.110003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts\") pod \"nova-cell1-a90f-account-create-update-gc8v6\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.110101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxgg\" (UniqueName: \"kubernetes.io/projected/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-kube-api-access-msxgg\") pod \"root-account-create-update-t4mx8\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.110161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88th\" (UniqueName: \"kubernetes.io/projected/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-kube-api-access-q88th\") pod \"nova-cell1-a90f-account-create-update-gc8v6\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.114379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.145457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88th\" (UniqueName: \"kubernetes.io/projected/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-kube-api-access-q88th\") pod \"nova-cell1-a90f-account-create-update-gc8v6\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.173679 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.212851 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.216229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvqlk\" (UniqueName: \"kubernetes.io/projected/944ff79a-ae02-4251-9db3-921fcb360ae3-kube-api-access-hvqlk\") pod \"neutron-90a1-account-create-update-5hvgd\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.216315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/944ff79a-ae02-4251-9db3-921fcb360ae3-operator-scripts\") pod \"neutron-90a1-account-create-update-5hvgd\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.241976 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.271579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxgg\" (UniqueName: \"kubernetes.io/projected/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-kube-api-access-msxgg\") pod \"root-account-create-update-t4mx8\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.321533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/944ff79a-ae02-4251-9db3-921fcb360ae3-operator-scripts\") pod \"neutron-90a1-account-create-update-5hvgd\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.321852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvqlk\" (UniqueName: \"kubernetes.io/projected/944ff79a-ae02-4251-9db3-921fcb360ae3-kube-api-access-hvqlk\") pod \"neutron-90a1-account-create-update-5hvgd\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: E0121 16:19:08.322322 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:19:08 crc kubenswrapper[4707]: E0121 16:19:08.322381 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data podName:da7ffdb4-4647-47fb-8ec9-012b0c81a83e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:08.822359072 +0000 UTC m=+4646.003875294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data") pod "rabbitmq-server-0" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e") : configmap "rabbitmq-config-data" not found Jan 21 16:19:08 crc kubenswrapper[4707]: E0121 16:19:08.322520 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:08 crc kubenswrapper[4707]: E0121 16:19:08.322563 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data podName:00504354-68de-4d18-8192-144ce961d72e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:09.322549009 +0000 UTC m=+4646.504065231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data") pod "rabbitmq-cell1-server-0" (UID: "00504354-68de-4d18-8192-144ce961d72e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.322851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/944ff79a-ae02-4251-9db3-921fcb360ae3-operator-scripts\") pod \"neutron-90a1-account-create-update-5hvgd\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.372854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.386946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvqlk\" (UniqueName: \"kubernetes.io/projected/944ff79a-ae02-4251-9db3-921fcb360ae3-kube-api-access-hvqlk\") pod \"neutron-90a1-account-create-update-5hvgd\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.442598 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.442716 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.448795 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="openstack-network-exporter" containerID="cri-o://b09353f46c3b967cbbf4f0521a507dc2668c538f723bcf4edb30041804e018aa" gracePeriod=300 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.479963 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-b43e-account-create-update-qwc4r"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.490495 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.502607 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-c324-account-create-update-mlsqx"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.528856 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-c324-account-create-update-mlsqx"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.540424 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-sx8r2"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.572972 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-sx8r2"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.581802 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.582391 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="openstack-network-exporter" containerID="cri-o://1650aea6f1311ecc422cd7ecb402deed854b461e426b372cb16d23e5a7607f1c" gracePeriod=300 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.591652 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.591825 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="ovn-northd" containerID="cri-o://f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5" gracePeriod=30 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.592077 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="openstack-network-exporter" containerID="cri-o://0d0f736ac687f2851035b080d22d8bb9669464f300eba02e20e36b392b5e41d4" gracePeriod=30 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.604864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rtvph"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.618886 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-rtvph"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.619303 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-794d2"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.641507 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-794d2"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.646294 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-psp6t"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.652565 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-psp6t"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.695334 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="ovsdbserver-sb" 
containerID="cri-o://23f18fa4c78ae9693483b4f51cc0a08db358f8a47f9ebc8353d056ecdd44f584" gracePeriod=300 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.695445 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="ovsdbserver-nb" containerID="cri-o://393a518fbca493795d494a3aaca1d986424d5e9241c8300a7062bf397b08d31a" gracePeriod=300 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.708244 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-krrl7"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.720675 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-krrl7"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.835964 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-57llq"] Jan 21 16:19:08 crc kubenswrapper[4707]: E0121 16:19:08.845466 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:19:08 crc kubenswrapper[4707]: E0121 16:19:08.845560 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data podName:da7ffdb4-4647-47fb-8ec9-012b0c81a83e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:09.845534827 +0000 UTC m=+4647.027051048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data") pod "rabbitmq-server-0" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e") : configmap "rabbitmq-config-data" not found Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.866350 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-57llq"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.871852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.885143 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-vqlcp"] Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.953725 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_9b4a6d46-528c-4e6b-b890-391a0b3d4331/ovsdbserver-nb/0.log" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.953926 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerID="b09353f46c3b967cbbf4f0521a507dc2668c538f723bcf4edb30041804e018aa" exitCode=2 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.953944 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerID="393a518fbca493795d494a3aaca1d986424d5e9241c8300a7062bf397b08d31a" exitCode=143 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.954028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"9b4a6d46-528c-4e6b-b890-391a0b3d4331","Type":"ContainerDied","Data":"b09353f46c3b967cbbf4f0521a507dc2668c538f723bcf4edb30041804e018aa"} Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.954074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"9b4a6d46-528c-4e6b-b890-391a0b3d4331","Type":"ContainerDied","Data":"393a518fbca493795d494a3aaca1d986424d5e9241c8300a7062bf397b08d31a"} Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.980385 4707 generic.go:334] "Generic (PLEG): container finished" podID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerID="0d0f736ac687f2851035b080d22d8bb9669464f300eba02e20e36b392b5e41d4" exitCode=2 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.980456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"e961b985-3419-4a6d-9cc5-e7c615f54129","Type":"ContainerDied","Data":"0d0f736ac687f2851035b080d22d8bb9669464f300eba02e20e36b392b5e41d4"} Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.989376 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_48c6c533-464a-46bb-a15b-c54a45c7043d/ovsdbserver-sb/0.log" Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.989421 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerID="1650aea6f1311ecc422cd7ecb402deed854b461e426b372cb16d23e5a7607f1c" exitCode=2 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.989437 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerID="23f18fa4c78ae9693483b4f51cc0a08db358f8a47f9ebc8353d056ecdd44f584" exitCode=143 Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.989509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"48c6c533-464a-46bb-a15b-c54a45c7043d","Type":"ContainerDied","Data":"1650aea6f1311ecc422cd7ecb402deed854b461e426b372cb16d23e5a7607f1c"} Jan 21 16:19:08 crc kubenswrapper[4707]: I0121 16:19:08.989533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"48c6c533-464a-46bb-a15b-c54a45c7043d","Type":"ContainerDied","Data":"23f18fa4c78ae9693483b4f51cc0a08db358f8a47f9ebc8353d056ecdd44f584"} Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.023566 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2pjb" podStartSLOduration=2.232217347 podStartE2EDuration="5.023549648s" podCreationTimestamp="2026-01-21 16:19:04 +0000 UTC" firstStartedPulling="2026-01-21 16:19:05.852624088 +0000 UTC m=+4643.034140311" lastFinishedPulling="2026-01-21 16:19:08.6439564 +0000 UTC m=+4645.825472612" observedRunningTime="2026-01-21 16:19:09.021319645 +0000 UTC m=+4646.202835867" watchObservedRunningTime="2026-01-21 16:19:09.023549648 +0000 UTC m=+4646.205065870" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.114247 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-ppt78"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.119788 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-ppt78"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.146674 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.206966 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228927a0-3a8f-43f9-853f-99fc8df8e727" path="/var/lib/kubelet/pods/228927a0-3a8f-43f9-853f-99fc8df8e727/volumes" Jan 21 16:19:09 crc 
kubenswrapper[4707]: I0121 16:19:09.207727 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4" path="/var/lib/kubelet/pods/4fcc53eb-4bcb-465f-bbb6-cd6d216d11d4/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.208293 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d31e188-7526-4747-b72b-5fb81bb77006" path="/var/lib/kubelet/pods/7d31e188-7526-4747-b72b-5fb81bb77006/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.209419 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d21d851-8cc6-4b92-ad04-7e4fe88fb166" path="/var/lib/kubelet/pods/9d21d851-8cc6-4b92-ad04-7e4fe88fb166/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.210048 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ab2a23-9f4a-4e22-b752-b8ce428104f2" path="/var/lib/kubelet/pods/a4ab2a23-9f4a-4e22-b752-b8ce428104f2/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.210822 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c33702-68b3-4723-8d09-5548858db57c" path="/var/lib/kubelet/pods/b5c33702-68b3-4723-8d09-5548858db57c/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.211287 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe542c2-1818-48fd-ba38-1998c812cac3" path="/var/lib/kubelet/pods/bfe542c2-1818-48fd-ba38-1998c812cac3/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.212649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f9ee62-8c3a-4e58-b616-5537c6220ff9" path="/var/lib/kubelet/pods/c8f9ee62-8c3a-4e58-b616-5537c6220ff9/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.212974 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:09 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 16:19:09 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 16:19:09 crc kubenswrapper[4707]: else Jan 21 16:19:09 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:09 crc kubenswrapper[4707]: fi Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:09 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:09 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:09 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:09 crc kubenswrapper[4707]: # support updates Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.214952 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" podUID="4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.216437 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b17afa-1293-4da3-9638-839df6d49df9" path="/var/lib/kubelet/pods/c9b17afa-1293-4da3-9638-839df6d49df9/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.217011 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc53221c-f932-4750-b193-c43af935722c" path="/var/lib/kubelet/pods/cc53221c-f932-4750-b193-c43af935722c/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.218105 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab7b82b-a7d1-41af-abc2-21322ebdd01f" path="/var/lib/kubelet/pods/eab7b82b-a7d1-41af-abc2-21322ebdd01f/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.218840 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d00d19-3404-42e9-be55-942f24e995f0" path="/var/lib/kubelet/pods/f0d00d19-3404-42e9-be55-942f24e995f0/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.221565 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82f8d8c-cc8e-4295-8453-61fb1b1c3848" path="/var/lib/kubelet/pods/f82f8d8c-cc8e-4295-8453-61fb1b1c3848/volumes" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.222186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-wlhs4"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.222212 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-wlhs4"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.236299 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.236517 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="cinder-scheduler" containerID="cri-o://2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.236913 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="probe" containerID="cri-o://8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.275788 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.275879 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs 
podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:09.775855367 +0000 UTC m=+4646.957371590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-public-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.275937 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.276000 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:09.775981715 +0000 UTC m=+4646.957497936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-internal-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.276974 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.277022 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:09.777005189 +0000 UTC m=+4646.958521411 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "combined-ca-bundle" not found Jan 21 16:19:09 crc kubenswrapper[4707]: W0121 16:19:09.283655 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38168a2d_01c6_47f0_b3f5_e2c3b10aa1d3.slice/crio-fdde001be6674e2e51d803db12a98b67868af84cf1d829246de3880fb0300cf2 WatchSource:0}: Error finding container fdde001be6674e2e51d803db12a98b67868af84cf1d829246de3880fb0300cf2: Status 404 returned error can't find the container with id fdde001be6674e2e51d803db12a98b67868af84cf1d829246de3880fb0300cf2 Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.287501 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:09 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 16:19:09 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 16:19:09 crc kubenswrapper[4707]: else Jan 21 16:19:09 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:09 crc kubenswrapper[4707]: fi Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:09 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:09 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:09 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:09 crc kubenswrapper[4707]: # support updates Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.287849 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t4mx8"] Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.289452 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" podUID="38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316208 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316629 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-server" containerID="cri-o://a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316681 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-server" containerID="cri-o://7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316741 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-updater" containerID="cri-o://98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316729 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-expirer" containerID="cri-o://3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316757 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-updater" containerID="cri-o://d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316777 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-auditor" containerID="cri-o://ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316819 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-auditor" containerID="cri-o://0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d" gracePeriod=30 Jan 21 16:19:09 crc 
kubenswrapper[4707]: I0121 16:19:09.316833 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-server" containerID="cri-o://b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316823 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-replicator" containerID="cri-o://c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316871 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-reaper" containerID="cri-o://83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316883 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-replicator" containerID="cri-o://d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316901 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-auditor" containerID="cri-o://2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.316928 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-replicator" containerID="cri-o://97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.317151 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="swift-recon-cron" containerID="cri-o://d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.317193 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="rsync" containerID="cri-o://2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.384897 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.385141 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data podName:00504354-68de-4d18-8192-144ce961d72e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:11.385126219 +0000 UTC m=+4648.566642440 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data") pod "rabbitmq-cell1-server-0" (UID: "00504354-68de-4d18-8192-144ce961d72e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.407624 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.407873 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api-log" containerID="cri-o://b1866e7c5c31c45c31a0a123206f96d06c54b25f3bf493859e89f55cf818e3aa" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.408014 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api" containerID="cri-o://a35a0a3b6e7558ea488bb3ad5745e40ecbe3628ac61c149c4cd49b174c91a74f" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.442001 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.442233 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-log" containerID="cri-o://b2d316c0e1b9039139944364e3e8f6a2b8d263a177cb3b436cd75caafe27246a" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.442586 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-httpd" containerID="cri-o://9edf57f9b05658747e57d4b578bd476b8dbc99dfcb2823b90d95e7f0335c5d65" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.472703 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.474197 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.500920 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.516477 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7bbc9795db-8fjxd"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.516685 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-log" containerID="cri-o://2edd1c6516797b8cc69e132ff4ce12d5ff92480e5675a5122e825d8deab90b2f" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.516804 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-api" containerID="cri-o://f31a9e28d650086abf54eb41930bbd347007b6dda6d23d5b073d912b680a7659" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.530084 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-cq9z7"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.561386 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-cq9z7"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.576746 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_48c6c533-464a-46bb-a15b-c54a45c7043d/ovsdbserver-sb/0.log" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.576859 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.607999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.608372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgjz\" (UniqueName: \"kubernetes.io/projected/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-kube-api-access-5pgjz\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.608502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.610883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.611207 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-log" containerID="cri-o://bd0c1af3ceb1fa7c36bbcc33de112908051c4e7e8cc5d7685057a8514317b590" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.611742 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-httpd" containerID="cri-o://97e0171e46f5c97fd7e6a3240f7093a22ce564fe525fb140934b1ee56a46b19b" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.630019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-8khz6"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.682704 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.698271 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-f22d-account-create-update-dqhzk"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.710653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq5p6\" (UniqueName: \"kubernetes.io/projected/48c6c533-464a-46bb-a15b-c54a45c7043d-kube-api-access-jq5p6\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.711544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-combined-ca-bundle\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 
crc kubenswrapper[4707]: I0121 16:19:09.713101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-metrics-certs-tls-certs\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.713562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdb-rundir\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.713700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.714328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.714705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdbserver-sb-tls-certs\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.714799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-scripts\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.714903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-config\") pod \"48c6c533-464a-46bb-a15b-c54a45c7043d\" (UID: \"48c6c533-464a-46bb-a15b-c54a45c7043d\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.715359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgjz\" (UniqueName: \"kubernetes.io/projected/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-kube-api-access-5pgjz\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.715527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.715893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.716121 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.716785 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-config" (OuterVolumeSpecName: "config") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.716834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-scripts" (OuterVolumeSpecName: "scripts") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.717295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.717426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.718117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.718366 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-8khz6"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.719297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c6c533-464a-46bb-a15b-c54a45c7043d-kube-api-access-jq5p6" (OuterVolumeSpecName: "kube-api-access-jq5p6") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "kube-api-access-jq5p6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.730426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.745092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgjz\" (UniqueName: \"kubernetes.io/projected/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-kube-api-access-5pgjz\") pod \"dnsmasq-dnsmasq-84b9f45d47-jdlhq\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.758385 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-t4c2x"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.761437 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-t4c2x"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.762628 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_9b4a6d46-528c-4e6b-b890-391a0b3d4331/ovsdbserver-nb/0.log" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.762697 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.765920 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:09 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 16:19:09 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 16:19:09 crc kubenswrapper[4707]: else Jan 21 16:19:09 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:09 crc kubenswrapper[4707]: fi Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:09 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:09 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:09 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:09 crc kubenswrapper[4707]: # support updates Jan 21 16:19:09 crc kubenswrapper[4707]: Jan 21 16:19:09 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.773158 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" podUID="944ff79a-ae02-4251-9db3-921fcb360ae3" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.774152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.774390 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" containerID="cri-o://e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.781325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.810955 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-4rqnw"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.818040 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq5p6\" (UniqueName: \"kubernetes.io/projected/48c6c533-464a-46bb-a15b-c54a45c7043d-kube-api-access-jq5p6\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.818068 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.818086 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.818096 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.818105 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c6c533-464a-46bb-a15b-c54a45c7043d-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.825205 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.825348 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 
nodeName:}" failed. No retries permitted until 2026-01-21 16:19:10.825326005 +0000 UTC m=+4648.006842226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-internal-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.832509 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.832578 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:10.832559345 +0000 UTC m=+4648.014075568 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "combined-ca-bundle" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.832760 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.832923 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:10.832907991 +0000 UTC m=+4648.014424214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-public-svc" not found Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.866447 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.878021 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-4rqnw"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.904009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
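The keystone pod is hitting the same class of failure as the rabbitmq pods in this window: MountVolume.SetUp cannot find the Secret or ConfigMap the volume references, and nestedpendingoperations schedules a retry after the durationBeforeRetry shown (500ms, 1s, 2s as the backoff grows). During kuttl teardown this is usually transient, but the quickest check is to ask the API server which of the referenced objects actually exist; a sketch, with the names copied from the errors above:

    # Namespace and object names are taken from the MountVolume.SetUp errors in this log.
    NS=openstack-kuttl-tests
    oc -n "$NS" get secret combined-ca-bundle cert-keystone-internal-svc cert-keystone-public-svc
    oc -n "$NS" get configmap rabbitmq-cell1-config-data rabbitmq-config-data

Any name that comes back NotFound keeps the corresponding pod in ContainerCreating until the owning operator recreates the object.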
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.915151 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-combined-ca-bundle\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-config\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdbserver-nb-tls-certs\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-metrics-certs-tls-certs\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcx5m\" (UniqueName: \"kubernetes.io/projected/9b4a6d46-528c-4e6b-b890-391a0b3d4331-kube-api-access-vcx5m\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-scripts\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.919531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdb-rundir\") pod \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\" (UID: \"9b4a6d46-528c-4e6b-b890-391a0b3d4331\") " Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.920125 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.920146 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-metrics-certs-tls-certs\") on node 
\"crc\" DevicePath \"\"" Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.920210 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:19:09 crc kubenswrapper[4707]: E0121 16:19:09.920253 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data podName:da7ffdb4-4647-47fb-8ec9-012b0c81a83e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:11.920237246 +0000 UTC m=+4649.101753468 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data") pod "rabbitmq-server-0" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e") : configmap "rabbitmq-config-data" not found Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.924554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-scripts" (OuterVolumeSpecName: "scripts") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.925320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.925648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-config" (OuterVolumeSpecName: "config") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.925658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.927963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "48c6c533-464a-46bb-a15b-c54a45c7043d" (UID: "48c6c533-464a-46bb-a15b-c54a45c7043d"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.928011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.930635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4a6d46-528c-4e6b-b890-391a0b3d4331-kube-api-access-vcx5m" (OuterVolumeSpecName: "kube-api-access-vcx5m") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "kube-api-access-vcx5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.934619 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-69b97779c8-jlqt9"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.934842 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-api" containerID="cri-o://d2d7c5789da9a58eb70a75b4a3d6ee778cf3a4ed19f27c737a94bbdf5be149df" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.935204 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-httpd" containerID="cri-o://5b3ebbd8a16f076cbe84710aa19479370ca9661ce99511215aea7ce272f068eb" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.939653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.970213 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-54d1-account-create-update-qh9zj"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.978654 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.978874 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-log" containerID="cri-o://ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89" gracePeriod=30 Jan 21 16:19:09 crc kubenswrapper[4707]: I0121 16:19:09.979294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-api" containerID="cri-o://6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:09.986923 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-hp9rb"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:09.990694 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-hp9rb"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.019912 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.025851 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-cell0-db-create-s6wlc"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.046588 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.046617 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcx5m\" (UniqueName: \"kubernetes.io/projected/9b4a6d46-528c-4e6b-b890-391a0b3d4331-kube-api-access-vcx5m\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.046630 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a6d46-528c-4e6b-b890-391a0b3d4331-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.046639 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c6c533-464a-46bb-a15b-c54a45c7043d-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.046647 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.046671 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.071650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.085382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.085513 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-s6wlc"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.085556 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="rabbitmq" containerID="cri-o://ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4" gracePeriod=604800 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.086567 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.106732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerStarted","Data":"a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.123000 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.123236 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-log" containerID="cri-o://c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.123349 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-metadata" containerID="cri-o://533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.147942 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-nx25r"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.152028 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.152050 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.152061 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.164793 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-nx25r"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.166608 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerID="2edd1c6516797b8cc69e132ff4ce12d5ff92480e5675a5122e825d8deab90b2f" exitCode=143 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.166704 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" event={"ID":"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2","Type":"ContainerDied","Data":"2edd1c6516797b8cc69e132ff4ce12d5ff92480e5675a5122e825d8deab90b2f"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.178878 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.182965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "9b4a6d46-528c-4e6b-b890-391a0b3d4331" (UID: "9b4a6d46-528c-4e6b-b890-391a0b3d4331"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.204883 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7ab6-account-create-update-9rm92"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219042 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219072 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219081 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219088 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219094 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219099 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219118 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219124 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219132 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219137 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" 
containerID="b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219143 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219148 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219153 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219159 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9" exitCode=0 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.219344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.233942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" event={"ID":"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3","Type":"ContainerStarted","Data":"fdde001be6674e2e51d803db12a98b67868af84cf1d829246de3880fb0300cf2"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.242175 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-t4mx8" secret="" err="secret \"galera-openstack-cell1-dockercfg-ncfmm\" not found" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.243594 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.263557 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerID="b2d316c0e1b9039139944364e3e8f6a2b8d263a177cb3b436cd75caafe27246a" exitCode=143 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.263620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f0537ef6-aeda-4293-aafa-5889f66c493a","Type":"ContainerDied","Data":"b2d316c0e1b9039139944364e3e8f6a2b8d263a177cb3b436cd75caafe27246a"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.264895 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4a6d46-528c-4e6b-b890-391a0b3d4331-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.266390 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.266434 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts podName:38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:10.76641894 +0000 UTC m=+4647.947935162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts") pod "root-account-create-update-t4mx8" (UID: "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3") : configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.282121 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.288456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" event={"ID":"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f","Type":"ContainerStarted","Data":"1a62045677d532d61a81b81697707b1458d1f55a5df686ade80b6cc73d089756"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.288724 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" secret="" err="secret \"galera-openstack-cell1-dockercfg-ncfmm\" not found" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.303277 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jkw2f"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.309888 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:10 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 16:19:10 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 16:19:10 crc kubenswrapper[4707]: else Jan 21 16:19:10 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:10 crc kubenswrapper[4707]: fi Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:10 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:10 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:10 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:10 crc kubenswrapper[4707]: # support updates Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.309961 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:10 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: if [ -n "nova_cell1" ]; then Jan 21 16:19:10 crc kubenswrapper[4707]: GRANT_DATABASE="nova_cell1" Jan 21 16:19:10 crc kubenswrapper[4707]: else Jan 21 16:19:10 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:10 crc kubenswrapper[4707]: fi Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:10 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:10 crc kubenswrapper[4707]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:10 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:10 crc kubenswrapper[4707]: # support updates Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.313077 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_48c6c533-464a-46bb-a15b-c54a45c7043d/ovsdbserver-sb/0.log" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.313134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"48c6c533-464a-46bb-a15b-c54a45c7043d","Type":"ContainerDied","Data":"e444e4dc8220979ca199654f217e3684b3870875e525e7cd1e04c82a7f9c99a1"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.313158 4707 scope.go:117] "RemoveContainer" containerID="1650aea6f1311ecc422cd7ecb402deed854b461e426b372cb16d23e5a7607f1c" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.313253 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.315738 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" podUID="38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.315870 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jkw2f"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.315930 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" podUID="4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.320549 4707 generic.go:334] "Generic (PLEG): container finished" podID="453b486c-659b-4fc0-853e-2c8e17129d27" containerID="49fdcf9618939364f2f9c95eb510f65f5cbb896c2cbd65a978458a287b5f3b5c" exitCode=137 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.324269 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.331839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" event={"ID":"944ff79a-ae02-4251-9db3-921fcb360ae3","Type":"ContainerStarted","Data":"c8e5f8a83f9a088aca7915884c485102f33c09133e14c4350b1d61152457d978"} Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.334449 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:10 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:10 
crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: if [ -n "neutron" ]; then Jan 21 16:19:10 crc kubenswrapper[4707]: GRANT_DATABASE="neutron" Jan 21 16:19:10 crc kubenswrapper[4707]: else Jan 21 16:19:10 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:10 crc kubenswrapper[4707]: fi Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:10 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:10 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:10 crc kubenswrapper[4707]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:10 crc kubenswrapper[4707]: # support updates Jan 21 16:19:10 crc kubenswrapper[4707]: Jan 21 16:19:10 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.335719 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" podUID="944ff79a-ae02-4251-9db3-921fcb360ae3" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.336493 4707 generic.go:334] "Generic (PLEG): container finished" podID="28274132-ebaa-415b-a596-28b30796449c" containerID="b1866e7c5c31c45c31a0a123206f96d06c54b25f3bf493859e89f55cf818e3aa" exitCode=143 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.336549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"28274132-ebaa-415b-a596-28b30796449c","Type":"ContainerDied","Data":"b1866e7c5c31c45c31a0a123206f96d06c54b25f3bf493859e89f55cf818e3aa"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.338493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.338686 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker-log" containerID="cri-o://f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.338795 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker" containerID="cri-o://65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.343600 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="rabbitmq" containerID="cri-o://90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929" gracePeriod=604800 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.348981 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.349541 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_9b4a6d46-528c-4e6b-b890-391a0b3d4331/ovsdbserver-nb/0.log" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.349599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"9b4a6d46-528c-4e6b-b890-391a0b3d4331","Type":"ContainerDied","Data":"d46e557a41974565774706796039f73ec258264ea6bb57259d8bf9d9ac007090"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.353485 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener-log" containerID="cri-o://1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.353580 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.354265 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener" containerID="cri-o://530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.357650 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.357826 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api-log" containerID="cri-o://120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.357916 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api" containerID="cri-o://67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.366380 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.366824 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" containerID="cri-o://6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.368275 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.368328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts 
podName:4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:10.868312039 +0000 UTC m=+4648.049828252 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts") pod "nova-cell1-a90f-account-create-update-gc8v6" (UID: "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f") : configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.418181 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.418237 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-6c4n6"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.438723 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.438993 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c1dee117-e591-43b9-b94a-677ff9fae495" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.470514 4707 generic.go:334] "Generic (PLEG): container finished" podID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerID="bd0c1af3ceb1fa7c36bbcc33de112908051c4e7e8cc5d7685057a8514317b590" exitCode=143 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.470556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6cd161af-38fc-439e-8b3f-d4549dc3708f","Type":"ContainerDied","Data":"bd0c1af3ceb1fa7c36bbcc33de112908051c4e7e8cc5d7685057a8514317b590"} Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.470612 4707 scope.go:117] "RemoveContainer" containerID="23f18fa4c78ae9693483b4f51cc0a08db358f8a47f9ebc8353d056ecdd44f584" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.478053 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.491859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-9qlqc"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.509583 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.510689 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerName="galera" containerID="cri-o://0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.517935 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.518161 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" containerID="cri-o://86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.528107 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t4mx8"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.539854 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.540111 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-central-agent" containerID="cri-o://32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.540245 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="proxy-httpd" containerID="cri-o://928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.540279 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="sg-core" containerID="cri-o://2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.540322 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-notification-agent" containerID="cri-o://290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.546152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.546373 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-httpd" containerID="cri-o://24a13662b41e6b6bfb847addf208fe2012b93376a6c253308ab52b66871cd124" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.546511 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" 
podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-server" containerID="cri-o://c4d2cce318ed39a68499016f449e682bd2f407153f80ddf6bfa77db0849f99da" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.552007 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.552194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" containerName="kube-state-metrics" containerID="cri-o://91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8" gracePeriod=30 Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.573721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-combined-ca-bundle\") pod \"453b486c-659b-4fc0-853e-2c8e17129d27\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.573892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config\") pod \"453b486c-659b-4fc0-853e-2c8e17129d27\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.573979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crn52\" (UniqueName: \"kubernetes.io/projected/453b486c-659b-4fc0-853e-2c8e17129d27-kube-api-access-crn52\") pod \"453b486c-659b-4fc0-853e-2c8e17129d27\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.590991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453b486c-659b-4fc0-853e-2c8e17129d27-kube-api-access-crn52" (OuterVolumeSpecName: "kube-api-access-crn52") pod "453b486c-659b-4fc0-853e-2c8e17129d27" (UID: "453b486c-659b-4fc0-853e-2c8e17129d27"). InnerVolumeSpecName "kube-api-access-crn52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.601738 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.601877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.614309 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.614417 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.618945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "453b486c-659b-4fc0-853e-2c8e17129d27" (UID: "453b486c-659b-4fc0-853e-2c8e17129d27"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.618988 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.623262 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.623308 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.625272 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.631011 4707 scope.go:117] "RemoveContainer" containerID="b09353f46c3b967cbbf4f0521a507dc2668c538f723bcf4edb30041804e018aa" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.639708 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c1dee117-e591-43b9-b94a-677ff9fae495" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.199:6080/vnc_lite.html\": dial tcp 10.217.0.199:6080: connect: connection refused" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.684702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config-secret\") pod \"453b486c-659b-4fc0-853e-2c8e17129d27\" (UID: \"453b486c-659b-4fc0-853e-2c8e17129d27\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.686427 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.686522 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crn52\" (UniqueName: \"kubernetes.io/projected/453b486c-659b-4fc0-853e-2c8e17129d27-kube-api-access-crn52\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.703822 4707 scope.go:117] "RemoveContainer" containerID="393a518fbca493795d494a3aaca1d986424d5e9241c8300a7062bf397b08d31a" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.714463 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "453b486c-659b-4fc0-853e-2c8e17129d27" (UID: "453b486c-659b-4fc0-853e-2c8e17129d27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.733503 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.764333 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c6c533_464a_46bb_a15b_c54a45c7043d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25947cd_d79d_4983_abfa_5d0f80850cdb.slice/crio-conmon-120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c6c533_464a_46bb_a15b_c54a45c7043d.slice/crio-e444e4dc8220979ca199654f217e3684b3870875e525e7cd1e04c82a7f9c99a1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ba05f78_2a71_4b67_8e95_db9c04a4ed42.slice/crio-1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fc9cfd6_8a8a_461b_96ed_e1781b8898c4.slice/crio-conmon-2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f8cfcd0_7878_4185_b558_a0ad021f06df.slice/crio-conmon-f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fc9cfd6_8a8a_461b_96ed_e1781b8898c4.slice/crio-2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25947cd_d79d_4983_abfa_5d0f80850cdb.slice/crio-120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4a6d46_528c_4e6b_b890_391a0b3d4331.slice/crio-d46e557a41974565774706796039f73ec258264ea6bb57259d8bf9d9ac007090\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4a6d46_528c_4e6b_b890_391a0b3d4331.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f7ba87_baff_4fe7_97ab_3ea947e9c102.slice/crio-conmon-91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fc9cfd6_8a8a_461b_96ed_e1781b8898c4.slice/crio-conmon-928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f8cfcd0_7878_4185_b558_a0ad021f06df.slice/crio-f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fc9cfd6_8a8a_461b_96ed_e1781b8898c4.slice/crio-928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.776513 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.781852 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.784936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "453b486c-659b-4fc0-853e-2c8e17129d27" (UID: "453b486c-659b-4fc0-853e-2c8e17129d27"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.786471 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.786504 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.787945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data\") pod \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.788636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbjj8\" (UniqueName: \"kubernetes.io/projected/1eddda5d-3cfc-4038-821c-4c628aeb3c69-kube-api-access-hbjj8\") pod \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.788697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data-custom\") pod \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.788717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-scripts\") pod \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.788774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eddda5d-3cfc-4038-821c-4c628aeb3c69-etc-machine-id\") pod \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.788845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-combined-ca-bundle\") pod \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\" (UID: \"1eddda5d-3cfc-4038-821c-4c628aeb3c69\") " Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.789300 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.789311 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453b486c-659b-4fc0-853e-2c8e17129d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc 
kubenswrapper[4707]: E0121 16:19:10.789364 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.789398 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts podName:38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:11.789384931 +0000 UTC m=+4648.970901153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts") pod "root-account-create-update-t4mx8" (UID: "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3") : configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.792426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1eddda5d-3cfc-4038-821c-4c628aeb3c69-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1eddda5d-3cfc-4038-821c-4c628aeb3c69" (UID: "1eddda5d-3cfc-4038-821c-4c628aeb3c69"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.792841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1eddda5d-3cfc-4038-821c-4c628aeb3c69" (UID: "1eddda5d-3cfc-4038-821c-4c628aeb3c69"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.794275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-scripts" (OuterVolumeSpecName: "scripts") pod "1eddda5d-3cfc-4038-821c-4c628aeb3c69" (UID: "1eddda5d-3cfc-4038-821c-4c628aeb3c69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.808268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eddda5d-3cfc-4038-821c-4c628aeb3c69-kube-api-access-hbjj8" (OuterVolumeSpecName: "kube-api-access-hbjj8") pod "1eddda5d-3cfc-4038-821c-4c628aeb3c69" (UID: "1eddda5d-3cfc-4038-821c-4c628aeb3c69"). InnerVolumeSpecName "kube-api-access-hbjj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.818158 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.207:8081/readyz\": dial tcp 10.217.0.207:8081: connect: connection refused" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.831762 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq"] Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.864018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eddda5d-3cfc-4038-821c-4c628aeb3c69" (UID: "1eddda5d-3cfc-4038-821c-4c628aeb3c69"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: W0121 16:19:10.876405 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f8b562_33b5_4553_ad14_4ad5230c3ac5.slice/crio-80df44a86aad1969dcde255c006876df40ad2b6b9d241593772c25cd9db6f05d WatchSource:0}: Error finding container 80df44a86aad1969dcde255c006876df40ad2b6b9d241593772c25cd9db6f05d: Status 404 returned error can't find the container with id 80df44a86aad1969dcde255c006876df40ad2b6b9d241593772c25cd9db6f05d Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.891296 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbjj8\" (UniqueName: \"kubernetes.io/projected/1eddda5d-3cfc-4038-821c-4c628aeb3c69-kube-api-access-hbjj8\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.891325 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.891335 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.891344 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eddda5d-3cfc-4038-821c-4c628aeb3c69-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.891353 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891432 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891475 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:12.89145865 +0000 UTC m=+4650.072974871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-public-svc" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891512 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891531 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts podName:4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:11.891525135 +0000 UTC m=+4649.073041357 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts") pod "nova-cell1-a90f-account-create-update-gc8v6" (UID: "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f") : configmap "openstack-cell1-scripts" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891564 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891580 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:12.89157563 +0000 UTC m=+4650.073091851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "combined-ca-bundle" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891608 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:19:10 crc kubenswrapper[4707]: E0121 16:19:10.891625 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:12.891620494 +0000 UTC m=+4650.073136716 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-internal-svc" not found Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.942064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data" (OuterVolumeSpecName: "config-data") pod "1eddda5d-3cfc-4038-821c-4c628aeb3c69" (UID: "1eddda5d-3cfc-4038-821c-4c628aeb3c69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:10 crc kubenswrapper[4707]: I0121 16:19:10.993410 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eddda5d-3cfc-4038-821c-4c628aeb3c69-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.058882 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.111:5671: connect: connection refused" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.079354 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.197853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-combined-ca-bundle\") pod \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.198018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-certs\") pod \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.198099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-config\") pod \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.198096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264a35d9-80a3-47e0-9489-5d088f07da4f" path="/var/lib/kubelet/pods/264a35d9-80a3-47e0-9489-5d088f07da4f/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.198252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfvhf\" (UniqueName: \"kubernetes.io/projected/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-api-access-sfvhf\") pod \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\" (UID: \"b5f7ba87-baff-4fe7-97ab-3ea947e9c102\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.198678 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab563e0-5667-4a88-902e-0f6d8d3da539" path="/var/lib/kubelet/pods/2ab563e0-5667-4a88-902e-0f6d8d3da539/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.199309 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348b9fd8-56d9-479d-ad68-6e98a249d50e" path="/var/lib/kubelet/pods/348b9fd8-56d9-479d-ad68-6e98a249d50e/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.200312 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453b486c-659b-4fc0-853e-2c8e17129d27" path="/var/lib/kubelet/pods/453b486c-659b-4fc0-853e-2c8e17129d27/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.201112 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" path="/var/lib/kubelet/pods/48c6c533-464a-46bb-a15b-c54a45c7043d/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.201690 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6b8d4c-bb07-4c24-b97e-870ad37de2d9" path="/var/lib/kubelet/pods/4a6b8d4c-bb07-4c24-b97e-870ad37de2d9/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.202784 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625cd2cf-9a3e-4870-8df0-de16c44cf081" path="/var/lib/kubelet/pods/625cd2cf-9a3e-4870-8df0-de16c44cf081/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.204152 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af4275d-327c-428a-b9ef-6daf1e3b32dc" 
path="/var/lib/kubelet/pods/8af4275d-327c-428a-b9ef-6daf1e3b32dc/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.204712 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f1d5774-2251-43d5-938b-7e87e5fee400" path="/var/lib/kubelet/pods/8f1d5774-2251-43d5-938b-7e87e5fee400/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.205259 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9815602d-6e85-4f3f-8aeb-fa674768302c" path="/var/lib/kubelet/pods/9815602d-6e85-4f3f-8aeb-fa674768302c/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.206521 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" path="/var/lib/kubelet/pods/9b4a6d46-528c-4e6b-b890-391a0b3d4331/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.207130 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c035033e-3578-409b-865f-ec2e3812a9f7" path="/var/lib/kubelet/pods/c035033e-3578-409b-865f-ec2e3812a9f7/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.207605 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1201e72-7dfd-4656-9a72-89fce1540f36" path="/var/lib/kubelet/pods/c1201e72-7dfd-4656-9a72-89fce1540f36/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.208433 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4739a7-e3aa-4fce-8e12-8fe520467ba2" path="/var/lib/kubelet/pods/db4739a7-e3aa-4fce-8e12-8fe520467ba2/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.208872 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c" path="/var/lib/kubelet/pods/e1d7a1c2-ca46-4884-8c5b-f96cb24dbd3c/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.209339 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea63c116-37f3-4c5d-9ae3-ed07ceb1a530" path="/var/lib/kubelet/pods/ea63c116-37f3-4c5d-9ae3-ed07ceb1a530/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.210182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87d50e5-d59b-48b8-a36d-5672422fa831" path="/var/lib/kubelet/pods/f87d50e5-d59b-48b8-a36d-5672422fa831/volumes" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.216791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-api-access-sfvhf" (OuterVolumeSpecName: "kube-api-access-sfvhf") pod "b5f7ba87-baff-4fe7-97ab-3ea947e9c102" (UID: "b5f7ba87-baff-4fe7-97ab-3ea947e9c102"). InnerVolumeSpecName "kube-api-access-sfvhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.273007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "b5f7ba87-baff-4fe7-97ab-3ea947e9c102" (UID: "b5f7ba87-baff-4fe7-97ab-3ea947e9c102"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.275597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5f7ba87-baff-4fe7-97ab-3ea947e9c102" (UID: "b5f7ba87-baff-4fe7-97ab-3ea947e9c102"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.281105 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.112:5671: connect: connection refused" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.286311 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "b5f7ba87-baff-4fe7-97ab-3ea947e9c102" (UID: "b5f7ba87-baff-4fe7-97ab-3ea947e9c102"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.301999 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfvhf\" (UniqueName: \"kubernetes.io/projected/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-api-access-sfvhf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.302031 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.302041 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.302052 4707 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b5f7ba87-baff-4fe7-97ab-3ea947e9c102-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.390096 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.398852 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.403579 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-config-data\") pod \"c1dee117-e591-43b9-b94a-677ff9fae495\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.403655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-nova-novncproxy-tls-certs\") pod \"c1dee117-e591-43b9-b94a-677ff9fae495\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.403682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-combined-ca-bundle\") pod \"c1dee117-e591-43b9-b94a-677ff9fae495\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.403891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-vencrypt-tls-certs\") pod \"c1dee117-e591-43b9-b94a-677ff9fae495\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.404306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xf6h\" (UniqueName: \"kubernetes.io/projected/c1dee117-e591-43b9-b94a-677ff9fae495-kube-api-access-6xf6h\") pod \"c1dee117-e591-43b9-b94a-677ff9fae495\" (UID: \"c1dee117-e591-43b9-b94a-677ff9fae495\") " Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.411196 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.411253 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data podName:00504354-68de-4d18-8192-144ce961d72e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:15.411237978 +0000 UTC m=+4652.592754200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data") pod "rabbitmq-cell1-server-0" (UID: "00504354-68de-4d18-8192-144ce961d72e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.422098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1dee117-e591-43b9-b94a-677ff9fae495-kube-api-access-6xf6h" (OuterVolumeSpecName: "kube-api-access-6xf6h") pod "c1dee117-e591-43b9-b94a-677ff9fae495" (UID: "c1dee117-e591-43b9-b94a-677ff9fae495"). InnerVolumeSpecName "kube-api-access-6xf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.447951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1dee117-e591-43b9-b94a-677ff9fae495" (UID: "c1dee117-e591-43b9-b94a-677ff9fae495"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.451478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-config-data" (OuterVolumeSpecName: "config-data") pod "c1dee117-e591-43b9-b94a-677ff9fae495" (UID: "c1dee117-e591-43b9-b94a-677ff9fae495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.479852 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.481085 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.486941 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.487029 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="ovn-northd" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.492089 4707 scope.go:117] "RemoveContainer" containerID="49fdcf9618939364f2f9c95eb510f65f5cbb896c2cbd65a978458a287b5f3b5c" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.492318 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.498433 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" containerID="91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8" exitCode=2 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.498517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"b5f7ba87-baff-4fe7-97ab-3ea947e9c102","Type":"ContainerDied","Data":"91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.498549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"b5f7ba87-baff-4fe7-97ab-3ea947e9c102","Type":"ContainerDied","Data":"f43c0ff8d567ada75b9e91d164f81aa7acca5d3ab89ecc106c6e346959b57d88"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.498652 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.505123 4707 generic.go:334] "Generic (PLEG): container finished" podID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerID="c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65" exitCode=143 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.505201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"36d463b3-4ebf-46d8-9a72-3142c370d2f9","Type":"ContainerDied","Data":"c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.507652 4707 generic.go:334] "Generic (PLEG): container finished" podID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerID="8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.507682 4707 generic.go:334] "Generic (PLEG): container finished" podID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerID="2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.507699 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.507736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1eddda5d-3cfc-4038-821c-4c628aeb3c69","Type":"ContainerDied","Data":"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.507769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1eddda5d-3cfc-4038-821c-4c628aeb3c69","Type":"ContainerDied","Data":"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.507780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1eddda5d-3cfc-4038-821c-4c628aeb3c69","Type":"ContainerDied","Data":"202dc5272231aec953a6ab5106342e284658733962780834a351c961bc7b2387"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512602 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjxk5\" (UniqueName: \"kubernetes.io/projected/11bd3474-14bf-4c21-a327-c4dcbefdec96-kube-api-access-xjxk5\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-operator-scripts\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-kolla-config\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-combined-ca-bundle\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-galera-tls-certs\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.512998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-generated\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.513122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-default\") pod \"11bd3474-14bf-4c21-a327-c4dcbefdec96\" (UID: \"11bd3474-14bf-4c21-a327-c4dcbefdec96\") " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.513915 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xf6h\" (UniqueName: \"kubernetes.io/projected/c1dee117-e591-43b9-b94a-677ff9fae495-kube-api-access-6xf6h\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.513937 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.513948 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.514049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.514501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.514560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.514946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.515847 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1dee117-e591-43b9-b94a-677ff9fae495" containerID="9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.516012 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.516204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c1dee117-e591-43b9-b94a-677ff9fae495","Type":"ContainerDied","Data":"9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.516240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c1dee117-e591-43b9-b94a-677ff9fae495","Type":"ContainerDied","Data":"c4477c5e1007832575618f98de7c8a262201d19fee905b9ef6faa23eb3c644cd"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.518075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bd3474-14bf-4c21-a327-c4dcbefdec96-kube-api-access-xjxk5" (OuterVolumeSpecName: "kube-api-access-xjxk5") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "kube-api-access-xjxk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.523292 4707 generic.go:334] "Generic (PLEG): container finished" podID="64190d62-9a33-4999-bc4a-3eff62940162" containerID="5b3ebbd8a16f076cbe84710aa19479370ca9661ce99511215aea7ce272f068eb" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.523420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" event={"ID":"64190d62-9a33-4999-bc4a-3eff62940162","Type":"ContainerDied","Data":"5b3ebbd8a16f076cbe84710aa19479370ca9661ce99511215aea7ce272f068eb"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.526090 4707 generic.go:334] "Generic (PLEG): container finished" podID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerID="c4d2cce318ed39a68499016f449e682bd2f407153f80ddf6bfa77db0849f99da" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.526118 4707 generic.go:334] "Generic (PLEG): container finished" podID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerID="24a13662b41e6b6bfb847addf208fe2012b93376a6c253308ab52b66871cd124" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.526212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" event={"ID":"b73f5d48-3358-4ea2-8b12-cfd7118a7337","Type":"ContainerDied","Data":"c4d2cce318ed39a68499016f449e682bd2f407153f80ddf6bfa77db0849f99da"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.526255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" event={"ID":"b73f5d48-3358-4ea2-8b12-cfd7118a7337","Type":"ContainerDied","Data":"24a13662b41e6b6bfb847addf208fe2012b93376a6c253308ab52b66871cd124"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.530696 4707 scope.go:117] "RemoveContainer" containerID="91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.546945 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerID="1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c" exitCode=143 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.547006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" event={"ID":"2ba05f78-2a71-4b67-8e95-db9c04a4ed42","Type":"ContainerDied","Data":"1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.548659 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.553076 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerID="f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0" exitCode=143 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.553160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" event={"ID":"3f8cfcd0-7878-4185-b558-a0ad021f06df","Type":"ContainerDied","Data":"f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.555758 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.556296 4707 
generic.go:334] "Generic (PLEG): container finished" podID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerID="0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.556345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"11bd3474-14bf-4c21-a327-c4dcbefdec96","Type":"ContainerDied","Data":"0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.556355 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.556364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"11bd3474-14bf-4c21-a327-c4dcbefdec96","Type":"ContainerDied","Data":"3f202e2e853ef921bd55bc21f919397144ec6481adbb438660ca4b94b96dad87"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.565240 4707 scope.go:117] "RemoveContainer" containerID="91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.565746 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8\": container with ID starting with 91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8 not found: ID does not exist" containerID="91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.565780 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8"} err="failed to get container status \"91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8\": rpc error: code = NotFound desc = could not find container \"91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8\": container with ID starting with 91a545ec6193a2bca4948dc4ebd90093fe5e301c131a1883d7c75dc9550beae8 not found: ID does not exist" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.565800 4707 scope.go:117] "RemoveContainer" containerID="8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.566489 4707 generic.go:334] "Generic (PLEG): container finished" podID="5de9761d-81cf-4d45-aec0-7407347f0449" containerID="ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89" exitCode=143 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.566538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5de9761d-81cf-4d45-aec0-7407347f0449","Type":"ContainerDied","Data":"ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.571059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.572171 4707 generic.go:334] "Generic (PLEG): container finished" podID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerID="120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06" exitCode=143 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.572211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" event={"ID":"c25947cd-d79d-4983-abfa-5d0f80850cdb","Type":"ContainerDied","Data":"120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.576221 4707 generic.go:334] "Generic (PLEG): container finished" podID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerID="928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.576241 4707 generic.go:334] "Generic (PLEG): container finished" podID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerID="2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b" exitCode=2 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.576249 4707 generic.go:334] "Generic (PLEG): container finished" podID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerID="32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.576282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerDied","Data":"928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.576299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerDied","Data":"2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.576311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerDied","Data":"32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.577482 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerID="045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159" exitCode=0 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.577754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" event={"ID":"d6f8b562-33b5-4553-ad14-4ad5230c3ac5","Type":"ContainerDied","Data":"045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.577821 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.577848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" event={"ID":"d6f8b562-33b5-4553-ad14-4ad5230c3ac5","Type":"ContainerStarted","Data":"80df44a86aad1969dcde255c006876df40ad2b6b9d241593772c25cd9db6f05d"} Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.577955 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-t4mx8" secret="" err="secret \"galera-openstack-cell1-dockercfg-ncfmm\" not found" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.615637 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.615662 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.615672 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjxk5\" (UniqueName: \"kubernetes.io/projected/11bd3474-14bf-4c21-a327-c4dcbefdec96-kube-api-access-xjxk5\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.615681 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.615692 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11bd3474-14bf-4c21-a327-c4dcbefdec96-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.705141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.719775 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.720336 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="243cdfad-87ba-4917-8c2c-061272251dab" containerName="memcached" containerID="cri-o://7c0e62cfad943f1c8d4c59525642ec4ccf69200f67d5f018cb5bef6717a97838" gracePeriod=30 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.756557 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-wbrj5"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.767360 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-k6497"] Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.768082 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="ovsdbserver-nb" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.768200 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="ovsdbserver-nb" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.768290 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="ovsdbserver-sb" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.768375 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="ovsdbserver-sb" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.768450 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" 
containerName="mysql-bootstrap" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.768498 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerName="mysql-bootstrap" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.768584 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="cinder-scheduler" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.768648 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="cinder-scheduler" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.768733 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="probe" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.768781 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="probe" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.768865 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="openstack-network-exporter" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.768913 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="openstack-network-exporter" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.769007 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dee117-e591-43b9-b94a-677ff9fae495" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769087 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dee117-e591-43b9-b94a-677ff9fae495" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.769161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" containerName="kube-state-metrics" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769241 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" containerName="kube-state-metrics" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.769319 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="openstack-network-exporter" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769367 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="openstack-network-exporter" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.769438 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerName="galera" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerName="galera" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769745 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="ovsdbserver-sb" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769905 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="cinder-scheduler" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.769970 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" containerName="kube-state-metrics" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770022 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" containerName="galera" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c6c533-464a-46bb-a15b-c54a45c7043d" containerName="openstack-network-exporter" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770163 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="openstack-network-exporter" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770225 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4a6d46-528c-4e6b-b890-391a0b3d4331" containerName="ovsdbserver-nb" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770279 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1dee117-e591-43b9-b94a-677ff9fae495" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770326 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" containerName="probe" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.770968 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.775281 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.787493 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-k6497"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.791537 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-8dtzx"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.797842 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bmjbg"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.819049 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-8dtzx"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.829931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjh66\" (UniqueName: \"kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.830005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.830331 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 
21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.830389 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts podName:38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:13.830373948 +0000 UTC m=+4651.011890170 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts") pod "root-account-create-update-t4mx8" (UID: "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3") : configmap "openstack-cell1-scripts" not found Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.851892 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bmjbg"] Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.877148 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 16:19:11 crc kubenswrapper[4707]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 21 16:19:11 crc kubenswrapper[4707]: Jan 21 16:19:11 crc kubenswrapper[4707]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 21 16:19:11 crc kubenswrapper[4707]: Jan 21 16:19:11 crc kubenswrapper[4707]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 21 16:19:11 crc kubenswrapper[4707]: Jan 21 16:19:11 crc kubenswrapper[4707]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 21 16:19:11 crc kubenswrapper[4707]: Jan 21 16:19:11 crc kubenswrapper[4707]: if [ -n "" ]; then Jan 21 16:19:11 crc kubenswrapper[4707]: GRANT_DATABASE="" Jan 21 16:19:11 crc kubenswrapper[4707]: else Jan 21 16:19:11 crc kubenswrapper[4707]: GRANT_DATABASE="*" Jan 21 16:19:11 crc kubenswrapper[4707]: fi Jan 21 16:19:11 crc kubenswrapper[4707]: Jan 21 16:19:11 crc kubenswrapper[4707]: # going for maximum compatibility here: Jan 21 16:19:11 crc kubenswrapper[4707]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 21 16:19:11 crc kubenswrapper[4707]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 21 16:19:11 crc kubenswrapper[4707]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 21 16:19:11 crc kubenswrapper[4707]: # support updates Jan 21 16:19:11 crc kubenswrapper[4707]: Jan 21 16:19:11 crc kubenswrapper[4707]: $MYSQL_CMD < logger="UnhandledError" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.878696 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" podUID="38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.882936 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7d947f86fd-wpfh7"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.883309 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" podUID="90275cb5-d75d-4fbc-b84d-31fc918a19d1" containerName="keystone-api" containerID="cri-o://38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e" gracePeriod=30 Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.887509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "mysql-db") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.946358 4707 scope.go:117] "RemoveContainer" containerID="2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.948759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.953443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjh66\" (UniqueName: \"kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.953488 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.953538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.953854 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.953878 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.954559 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.954608 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts podName:4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:13.954587066 +0000 UTC m=+4651.136103288 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts") pod "nova-cell1-a90f-account-create-update-gc8v6" (UID: "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f") : configmap "openstack-cell1-scripts" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.954652 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.954678 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts podName:3c00242d-c671-49e9-a8ad-9aed66dcb94f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:12.454671414 +0000 UTC m=+4649.636187636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts") pod "keystone-fa94-account-create-update-k6497" (UID: "3c00242d-c671-49e9-a8ad-9aed66dcb94f") : configmap "openstack-scripts" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.954691 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data podName:da7ffdb4-4647-47fb-8ec9-012b0c81a83e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:15.954686082 +0000 UTC m=+4653.136202304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data") pod "rabbitmq-server-0" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e") : configmap "rabbitmq-config-data" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.960565 4707 projected.go:194] Error preparing data for projected volume kube-api-access-mjh66 for pod openstack-kuttl-tests/keystone-fa94-account-create-update-k6497: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:19:11 crc kubenswrapper[4707]: E0121 16:19:11.960601 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66 podName:3c00242d-c671-49e9-a8ad-9aed66dcb94f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:12.460590023 +0000 UTC m=+4649.642106245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mjh66" (UniqueName: "kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66") pod "keystone-fa94-account-create-update-k6497" (UID: "3c00242d-c671-49e9-a8ad-9aed66dcb94f") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.971068 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.981333 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.983658 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:19:11 crc kubenswrapper[4707]: I0121 16:19:11.990901 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-67lxl"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:11.997062 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-67lxl"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.002739 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.007861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-k6497"] Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.008665 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-mjh66 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" podUID="3c00242d-c671-49e9-a8ad-9aed66dcb94f" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.030754 4707 scope.go:117] "RemoveContainer" containerID="8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.031257 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e\": container with ID starting with 8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e not found: ID does not exist" containerID="8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.031314 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e"} err="failed to get container status \"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e\": rpc error: code = NotFound desc = could not find container \"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e\": container with ID starting with 8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.031342 4707 scope.go:117] "RemoveContainer" containerID="2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.031674 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5\": container with ID starting with 2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5 not found: ID does not exist" containerID="2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.031698 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5"} err="failed to get container status \"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5\": rpc error: code = NotFound desc = could not find container 
\"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5\": container with ID starting with 2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5 not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.031710 4707 scope.go:117] "RemoveContainer" containerID="8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.032275 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e"} err="failed to get container status \"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e\": rpc error: code = NotFound desc = could not find container \"8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e\": container with ID starting with 8bffa6dcfad231656e184d32e07d162650774c8e50947f4f0c05e3bb9da1722e not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.032297 4707 scope.go:117] "RemoveContainer" containerID="2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.032579 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5"} err="failed to get container status \"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5\": rpc error: code = NotFound desc = could not find container \"2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5\": container with ID starting with 2f5ba2c71bed4484d06b4a33e4f8e0b47efe076c9974a5bbb8340a5cd0220bf5 not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.032613 4707 scope.go:117] "RemoveContainer" containerID="9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.048020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11bd3474-14bf-4c21-a327-c4dcbefdec96" (UID: "11bd3474-14bf-4c21-a327-c4dcbefdec96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.048161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "c1dee117-e591-43b9-b94a-677ff9fae495" (UID: "c1dee117-e591-43b9-b94a-677ff9fae495"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.048191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "c1dee117-e591-43b9-b94a-677ff9fae495" (UID: "c1dee117-e591-43b9-b94a-677ff9fae495"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.059114 4707 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.059145 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.059157 4707 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1dee117-e591-43b9-b94a-677ff9fae495-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.059177 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bd3474-14bf-4c21-a327-c4dcbefdec96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.063484 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.065613 4707 scope.go:117] "RemoveContainer" containerID="9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.066024 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b\": container with ID starting with 9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b not found: ID does not exist" containerID="9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.066069 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b"} err="failed to get container status \"9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b\": rpc error: code = NotFound desc = could not find container \"9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b\": container with ID starting with 9b72771006f63d7f20b6ce610684e59cc92b1c885b3a2b1b9b5858d33ba3618b not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.066133 4707 scope.go:117] "RemoveContainer" containerID="0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.111986 4707 scope.go:117] "RemoveContainer" containerID="53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.153150 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.159783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-etc-swift\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.159896 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts\") pod \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkrpv\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-kube-api-access-fkrpv\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-log-httpd\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-run-httpd\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-combined-ca-bundle\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-config-data\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160261 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-public-tls-certs\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q88th\" (UniqueName: \"kubernetes.io/projected/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-kube-api-access-q88th\") pod \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\" (UID: \"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-internal-tls-certs\") pod \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\" (UID: \"b73f5d48-3358-4ea2-8b12-cfd7118a7337\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.160831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.161227 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.162089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f" (UID: "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.162287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.162583 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.165890 4707 scope.go:117] "RemoveContainer" containerID="0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.169602 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b\": container with ID starting with 0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b not found: ID does not exist" containerID="0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.169629 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b"} err="failed to get container status \"0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b\": rpc error: code = NotFound desc = could not find container \"0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b\": container with ID starting with 0138c397e489e6b477375fbf738c40b0f7eac0bd7c54846db140ee927ea09d8b not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.169647 4707 scope.go:117] "RemoveContainer" containerID="53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.170041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-kube-api-access-fkrpv" (OuterVolumeSpecName: "kube-api-access-fkrpv") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "kube-api-access-fkrpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.172781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-kube-api-access-q88th" (OuterVolumeSpecName: "kube-api-access-q88th") pod "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f" (UID: "4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f"). InnerVolumeSpecName "kube-api-access-q88th". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.176376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.177937 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64\": container with ID starting with 53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64 not found: ID does not exist" containerID="53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.177987 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64"} err="failed to get container status \"53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64\": rpc error: code = NotFound desc = could not find container \"53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64\": container with ID starting with 53fa770c1ea874b9e2692e549cc5ea6aae00f4e48de6ddc771a2a1c3100bff64 not found: ID does not exist" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.202684 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.214751 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.222211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.222331 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="galera" containerID="cri-o://c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee" gracePeriod=30 Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.228996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.231244 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.236881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-config-data" (OuterVolumeSpecName: "config-data") pod "b73f5d48-3358-4ea2-8b12-cfd7118a7337" (UID: "b73f5d48-3358-4ea2-8b12-cfd7118a7337"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.262965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/944ff79a-ae02-4251-9db3-921fcb360ae3-operator-scripts\") pod \"944ff79a-ae02-4251-9db3-921fcb360ae3\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.263026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvqlk\" (UniqueName: \"kubernetes.io/projected/944ff79a-ae02-4251-9db3-921fcb360ae3-kube-api-access-hvqlk\") pod \"944ff79a-ae02-4251-9db3-921fcb360ae3\" (UID: \"944ff79a-ae02-4251-9db3-921fcb360ae3\") " Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944ff79a-ae02-4251-9db3-921fcb360ae3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "944ff79a-ae02-4251-9db3-921fcb360ae3" (UID: "944ff79a-ae02-4251-9db3-921fcb360ae3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264671 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264723 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264739 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkrpv\" (UniqueName: \"kubernetes.io/projected/b73f5d48-3358-4ea2-8b12-cfd7118a7337-kube-api-access-fkrpv\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264751 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b73f5d48-3358-4ea2-8b12-cfd7118a7337-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264765 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264774 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264785 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264797 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q88th\" (UniqueName: \"kubernetes.io/projected/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f-kube-api-access-q88th\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.264821 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b73f5d48-3358-4ea2-8b12-cfd7118a7337-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.267213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944ff79a-ae02-4251-9db3-921fcb360ae3-kube-api-access-hvqlk" (OuterVolumeSpecName: "kube-api-access-hvqlk") pod "944ff79a-ae02-4251-9db3-921fcb360ae3" (UID: "944ff79a-ae02-4251-9db3-921fcb360ae3"). InnerVolumeSpecName "kube-api-access-hvqlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.367483 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/944ff79a-ae02-4251-9db3-921fcb360ae3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.368611 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvqlk\" (UniqueName: \"kubernetes.io/projected/944ff79a-ae02-4251-9db3-921fcb360ae3-kube-api-access-hvqlk\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.470180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjh66\" (UniqueName: \"kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.470241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.470742 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.470873 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts podName:3c00242d-c671-49e9-a8ad-9aed66dcb94f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:13.4708486 +0000 UTC m=+4650.652364832 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts") pod "keystone-fa94-account-create-update-k6497" (UID: "3c00242d-c671-49e9-a8ad-9aed66dcb94f") : configmap "openstack-scripts" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.474381 4707 projected.go:194] Error preparing data for projected volume kube-api-access-mjh66 for pod openstack-kuttl-tests/keystone-fa94-account-create-update-k6497: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.474483 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66 podName:3c00242d-c671-49e9-a8ad-9aed66dcb94f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:13.474458387 +0000 UTC m=+4650.655974609 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mjh66" (UniqueName: "kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66") pod "keystone-fa94-account-create-update-k6497" (UID: "3c00242d-c671-49e9-a8ad-9aed66dcb94f") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.593201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" event={"ID":"4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f","Type":"ContainerDied","Data":"1a62045677d532d61a81b81697707b1458d1f55a5df686ade80b6cc73d089756"} Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.593400 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.602792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" event={"ID":"944ff79a-ae02-4251-9db3-921fcb360ae3","Type":"ContainerDied","Data":"c8e5f8a83f9a088aca7915884c485102f33c09133e14c4350b1d61152457d978"} Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.602892 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.612354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" event={"ID":"b73f5d48-3358-4ea2-8b12-cfd7118a7337","Type":"ContainerDied","Data":"4015c59744765628cfcba9fe29cb7ef93ca8f3243bdf14f278576c88f5673f58"} Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.612387 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.612408 4707 scope.go:117] "RemoveContainer" containerID="c4d2cce318ed39a68499016f449e682bd2f407153f80ddf6bfa77db0849f99da" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.618685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" event={"ID":"d6f8b562-33b5-4553-ad14-4ad5230c3ac5","Type":"ContainerStarted","Data":"f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c"} Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.618959 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.621261 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.640053 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" podStartSLOduration=3.64001827 podStartE2EDuration="3.64001827s" podCreationTimestamp="2026-01-21 16:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:19:12.632698747 +0000 UTC m=+4649.814214969" watchObservedRunningTime="2026-01-21 16:19:12.64001827 +0000 UTC m=+4649.821534492" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.643837 4707 scope.go:117] "RemoveContainer" containerID="24a13662b41e6b6bfb847addf208fe2012b93376a6c253308ab52b66871cd124" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.657965 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.702163 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.709214 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6886f88d58-xtb45"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.738367 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.746430 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-90a1-account-create-update-5hvgd"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.751160 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.755530 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a90f-account-create-update-gc8v6"] Jan 21 16:19:12 crc kubenswrapper[4707]: I0121 16:19:12.819935 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": read tcp 10.217.0.2:46050->10.217.0.163:8776: read: connection reset by peer" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.891917 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.894371 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.896937 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.897006 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="galera" Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.984487 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.984601 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:16.984575299 +0000 UTC m=+4654.166091521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-public-svc" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.984598 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.984638 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.984645 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:16.984638007 +0000 UTC m=+4654.166154228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "cert-keystone-internal-svc" not found Jan 21 16:19:12 crc kubenswrapper[4707]: E0121 16:19:12.984732 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle podName:90275cb5-d75d-4fbc-b84d-31fc918a19d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:16.984711935 +0000 UTC m=+4654.166228157 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle") pod "keystone-7d947f86fd-wpfh7" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1") : secret "combined-ca-bundle" not found Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.054318 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.058001 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.059118 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.059158 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.192711 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.201182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11bd3474-14bf-4c21-a327-c4dcbefdec96" path="/var/lib/kubelet/pods/11bd3474-14bf-4c21-a327-c4dcbefdec96/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.203513 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eddda5d-3cfc-4038-821c-4c628aeb3c69" path="/var/lib/kubelet/pods/1eddda5d-3cfc-4038-821c-4c628aeb3c69/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.204447 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc01904-c145-45f7-bd12-98f776502dfa" path="/var/lib/kubelet/pods/2dc01904-c145-45f7-bd12-98f776502dfa/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.205316 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f" path="/var/lib/kubelet/pods/4e2a00b3-dc03-44af-957a-0e5f0ff2cf1f/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.206194 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934765c4-0ef2-4748-831a-3fa09a941a42" path="/var/lib/kubelet/pods/934765c4-0ef2-4748-831a-3fa09a941a42/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.207181 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944ff79a-ae02-4251-9db3-921fcb360ae3" path="/var/lib/kubelet/pods/944ff79a-ae02-4251-9db3-921fcb360ae3/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.207524 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f7ba87-baff-4fe7-97ab-3ea947e9c102" path="/var/lib/kubelet/pods/b5f7ba87-baff-4fe7-97ab-3ea947e9c102/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.208096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" path="/var/lib/kubelet/pods/b73f5d48-3358-4ea2-8b12-cfd7118a7337/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.209139 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1dee117-e591-43b9-b94a-677ff9fae495" path="/var/lib/kubelet/pods/c1dee117-e591-43b9-b94a-677ff9fae495/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.209602 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dd96c7-ee47-4191-9f21-fc10d99edbec" path="/var/lib/kubelet/pods/d5dd96c7-ee47-4191-9f21-fc10d99edbec/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.210078 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15dfa52-789a-420d-ad9f-4dfc7d755569" path="/var/lib/kubelet/pods/f15dfa52-789a-420d-ad9f-4dfc7d755569/volumes" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.289861 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxgg\" (UniqueName: \"kubernetes.io/projected/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-kube-api-access-msxgg\") pod \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\" (UID: \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.289900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts\") pod \"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\" (UID: 
\"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3\") " Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.291752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3" (UID: "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.392155 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.493684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjh66\" (UniqueName: \"kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.493739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts\") pod \"keystone-fa94-account-create-update-k6497\" (UID: \"3c00242d-c671-49e9-a8ad-9aed66dcb94f\") " pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.493873 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.493923 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts podName:3c00242d-c671-49e9-a8ad-9aed66dcb94f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:15.493907273 +0000 UTC m=+4652.675423495 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts") pod "keystone-fa94-account-create-update-k6497" (UID: "3c00242d-c671-49e9-a8ad-9aed66dcb94f") : configmap "openstack-scripts" not found Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.498029 4707 projected.go:194] Error preparing data for projected volume kube-api-access-mjh66 for pod openstack-kuttl-tests/keystone-fa94-account-create-update-k6497: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:19:13 crc kubenswrapper[4707]: E0121 16:19:13.498073 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66 podName:3c00242d-c671-49e9-a8ad-9aed66dcb94f nodeName:}" failed. No retries permitted until 2026-01-21 16:19:15.498062256 +0000 UTC m=+4652.679578468 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mjh66" (UniqueName: "kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66") pod "keystone-fa94-account-create-update-k6497" (UID: "3c00242d-c671-49e9-a8ad-9aed66dcb94f") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.590851 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": dial tcp 10.217.0.205:8775: connect: connection refused" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.590860 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": dial tcp 10.217.0.205:8775: connect: connection refused" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.633966 4707 generic.go:334] "Generic (PLEG): container finished" podID="243cdfad-87ba-4917-8c2c-061272251dab" containerID="7c0e62cfad943f1c8d4c59525642ec4ccf69200f67d5f018cb5bef6717a97838" exitCode=0 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.634030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"243cdfad-87ba-4917-8c2c-061272251dab","Type":"ContainerDied","Data":"7c0e62cfad943f1c8d4c59525642ec4ccf69200f67d5f018cb5bef6717a97838"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.634054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"243cdfad-87ba-4917-8c2c-061272251dab","Type":"ContainerDied","Data":"8d73c24cf36e5d8118e4eebeaecd3536a73be52df33a91e6f1e726024c381b8e"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.634065 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d73c24cf36e5d8118e4eebeaecd3536a73be52df33a91e6f1e726024c381b8e" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.653510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-kube-api-access-msxgg" (OuterVolumeSpecName: "kube-api-access-msxgg") pod "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3" (UID: "38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3"). InnerVolumeSpecName "kube-api-access-msxgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.655609 4707 generic.go:334] "Generic (PLEG): container finished" podID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerID="c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee" exitCode=0 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.655679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"41784af8-4684-4cb8-959b-2cff22fe7f16","Type":"ContainerDied","Data":"c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.664970 4707 generic.go:334] "Generic (PLEG): container finished" podID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerID="97e0171e46f5c97fd7e6a3240f7093a22ce564fe525fb140934b1ee56a46b19b" exitCode=0 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.665029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6cd161af-38fc-439e-8b3f-d4549dc3708f","Type":"ContainerDied","Data":"97e0171e46f5c97fd7e6a3240f7093a22ce564fe525fb140934b1ee56a46b19b"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.669857 4707 generic.go:334] "Generic (PLEG): container finished" podID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerID="f31a9e28d650086abf54eb41930bbd347007b6dda6d23d5b073d912b680a7659" exitCode=0 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.669933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" event={"ID":"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2","Type":"ContainerDied","Data":"f31a9e28d650086abf54eb41930bbd347007b6dda6d23d5b073d912b680a7659"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.672545 4707 generic.go:334] "Generic (PLEG): container finished" podID="28274132-ebaa-415b-a596-28b30796449c" containerID="a35a0a3b6e7558ea488bb3ad5745e40ecbe3628ac61c149c4cd49b174c91a74f" exitCode=0 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.672613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"28274132-ebaa-415b-a596-28b30796449c","Type":"ContainerDied","Data":"a35a0a3b6e7558ea488bb3ad5745e40ecbe3628ac61c149c4cd49b174c91a74f"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.672638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"28274132-ebaa-415b-a596-28b30796449c","Type":"ContainerDied","Data":"462db79e6c81a738011acae018aa85f23ff0d788b376df5efc871b765d328cbb"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.672651 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="462db79e6c81a738011acae018aa85f23ff0d788b376df5efc871b765d328cbb" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.675028 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_e961b985-3419-4a6d-9cc5-e7c615f54129/ovn-northd/0.log" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.675058 4707 generic.go:334] "Generic (PLEG): container finished" podID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerID="f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5" exitCode=139 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.675119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" 
event={"ID":"e961b985-3419-4a6d-9cc5-e7c615f54129","Type":"ContainerDied","Data":"f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.676502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" event={"ID":"38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3","Type":"ContainerDied","Data":"fdde001be6674e2e51d803db12a98b67868af84cf1d829246de3880fb0300cf2"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.676561 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t4mx8" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.679522 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerID="9edf57f9b05658747e57d4b578bd476b8dbc99dfcb2823b90d95e7f0335c5d65" exitCode=0 Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.679579 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-fa94-account-create-update-k6497" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.679595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f0537ef6-aeda-4293-aafa-5889f66c493a","Type":"ContainerDied","Data":"9edf57f9b05658747e57d4b578bd476b8dbc99dfcb2823b90d95e7f0335c5d65"} Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.695824 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxgg\" (UniqueName: \"kubernetes.io/projected/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3-kube-api-access-msxgg\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.840662 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Jan 21 16:19:13 crc kubenswrapper[4707]: I0121 16:19:13.840687 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": dial tcp 10.217.0.160:9311: connect: connection refused" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.179598 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.188264 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.196518 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_e961b985-3419-4a6d-9cc5-e7c615f54129/ovn-northd/0.log" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.196578 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28274132-ebaa-415b-a596-28b30796449c-etc-machine-id\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-metrics-certs-tls-certs\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhhtc\" (UniqueName: \"kubernetes.io/projected/e961b985-3419-4a6d-9cc5-e7c615f54129-kube-api-access-jhhtc\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjz76\" (UniqueName: \"kubernetes.io/projected/243cdfad-87ba-4917-8c2c-061272251dab-kube-api-access-fjz76\") pod \"243cdfad-87ba-4917-8c2c-061272251dab\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdmd8\" (UniqueName: \"kubernetes.io/projected/28274132-ebaa-415b-a596-28b30796449c-kube-api-access-rdmd8\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-rundir\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-combined-ca-bundle\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-kolla-config\") pod \"243cdfad-87ba-4917-8c2c-061272251dab\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-combined-ca-bundle\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-scripts\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28274132-ebaa-415b-a596-28b30796449c-logs\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-public-tls-certs\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-combined-ca-bundle\") pod \"243cdfad-87ba-4917-8c2c-061272251dab\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-config\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-scripts\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.208987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-northd-tls-certs\") pod \"e961b985-3419-4a6d-9cc5-e7c615f54129\" (UID: \"e961b985-3419-4a6d-9cc5-e7c615f54129\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.209011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-config-data\") pod \"243cdfad-87ba-4917-8c2c-061272251dab\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.209046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data-custom\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.209066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-memcached-tls-certs\") pod \"243cdfad-87ba-4917-8c2c-061272251dab\" (UID: \"243cdfad-87ba-4917-8c2c-061272251dab\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.209138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-internal-tls-certs\") pod \"28274132-ebaa-415b-a596-28b30796449c\" (UID: \"28274132-ebaa-415b-a596-28b30796449c\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.210536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28274132-ebaa-415b-a596-28b30796449c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.212868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.216885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28274132-ebaa-415b-a596-28b30796449c-kube-api-access-rdmd8" (OuterVolumeSpecName: "kube-api-access-rdmd8") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "kube-api-access-rdmd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.217265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-config" (OuterVolumeSpecName: "config") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.217716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-config-data" (OuterVolumeSpecName: "config-data") pod "243cdfad-87ba-4917-8c2c-061272251dab" (UID: "243cdfad-87ba-4917-8c2c-061272251dab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.218009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-scripts" (OuterVolumeSpecName: "scripts") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.218545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28274132-ebaa-415b-a596-28b30796449c-logs" (OuterVolumeSpecName: "logs") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.223565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "243cdfad-87ba-4917-8c2c-061272251dab" (UID: "243cdfad-87ba-4917-8c2c-061272251dab"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.228184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e961b985-3419-4a6d-9cc5-e7c615f54129-kube-api-access-jhhtc" (OuterVolumeSpecName: "kube-api-access-jhhtc") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "kube-api-access-jhhtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.228468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243cdfad-87ba-4917-8c2c-061272251dab-kube-api-access-fjz76" (OuterVolumeSpecName: "kube-api-access-fjz76") pod "243cdfad-87ba-4917-8c2c-061272251dab" (UID: "243cdfad-87ba-4917-8c2c-061272251dab"). InnerVolumeSpecName "kube-api-access-fjz76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.241856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.242134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-scripts" (OuterVolumeSpecName: "scripts") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.255207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.259423 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t4mx8"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.281671 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t4mx8"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.284925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "243cdfad-87ba-4917-8c2c-061272251dab" (UID: "243cdfad-87ba-4917-8c2c-061272251dab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.301955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.304231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data" (OuterVolumeSpecName: "config-data") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.305476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.307044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "243cdfad-87ba-4917-8c2c-061272251dab" (UID: "243cdfad-87ba-4917-8c2c-061272251dab"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.310325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "28274132-ebaa-415b-a596-28b30796449c" (UID: "28274132-ebaa-415b-a596-28b30796449c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.311206 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-k6497"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312300 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312333 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdmd8\" (UniqueName: \"kubernetes.io/projected/28274132-ebaa-415b-a596-28b30796449c-kube-api-access-rdmd8\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312367 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312380 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312390 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312401 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312411 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312438 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28274132-ebaa-415b-a596-28b30796449c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312450 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312460 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.312470 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e961b985-3419-4a6d-9cc5-e7c615f54129-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.314923 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.314940 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/243cdfad-87ba-4917-8c2c-061272251dab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.314954 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.314964 4707 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/243cdfad-87ba-4917-8c2c-061272251dab-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.314975 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28274132-ebaa-415b-a596-28b30796449c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.314986 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28274132-ebaa-415b-a596-28b30796449c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.315000 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhhtc\" (UniqueName: \"kubernetes.io/projected/e961b985-3419-4a6d-9cc5-e7c615f54129-kube-api-access-jhhtc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.315012 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjz76\" (UniqueName: \"kubernetes.io/projected/243cdfad-87ba-4917-8c2c-061272251dab-kube-api-access-fjz76\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.320538 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-fa94-account-create-update-k6497"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.348263 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.352617 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "e961b985-3419-4a6d-9cc5-e7c615f54129" (UID: "e961b985-3419-4a6d-9cc5-e7c615f54129"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.370877 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.379610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.386870 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.400493 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.406778 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-httpd-run\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-internal-tls-certs\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415863 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-combined-ca-bundle\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-combined-ca-bundle\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415922 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data\") pod \"3f8cfcd0-7878-4185-b558-a0ad021f06df\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqmff\" (UniqueName: \"kubernetes.io/projected/f0537ef6-aeda-4293-aafa-5889f66c493a-kube-api-access-dqmff\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415970 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data-custom\") pod \"3f8cfcd0-7878-4185-b558-a0ad021f06df\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.415998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-logs\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-scripts\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-default\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8cfcd0-7878-4185-b558-a0ad021f06df-logs\") pod \"3f8cfcd0-7878-4185-b558-a0ad021f06df\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-scripts\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-logs\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-scripts\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416175 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-logs\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-generated\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvtzd\" (UniqueName: \"kubernetes.io/projected/6cd161af-38fc-439e-8b3f-d4549dc3708f-kube-api-access-bvtzd\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-galera-tls-certs\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-config-data\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-public-tls-certs\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-operator-scripts\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-config-data\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-config-data\") pod \"6cd161af-38fc-439e-8b3f-d4549dc3708f\" (UID: \"6cd161af-38fc-439e-8b3f-d4549dc3708f\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-public-tls-certs\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcxwf\" (UniqueName: \"kubernetes.io/projected/41784af8-4684-4cb8-959b-2cff22fe7f16-kube-api-access-gcxwf\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-generated" (OuterVolumeSpecName: 
"config-data-generated") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-combined-ca-bundle\") pod \"3f8cfcd0-7878-4185-b558-a0ad021f06df\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-httpd-run\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f8cfcd0-7878-4185-b558-a0ad021f06df-logs" (OuterVolumeSpecName: "logs") pod "3f8cfcd0-7878-4185-b558-a0ad021f06df" (UID: "3f8cfcd0-7878-4185-b558-a0ad021f06df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-internal-tls-certs\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416800 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-combined-ca-bundle\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwjv\" (UniqueName: \"kubernetes.io/projected/3f8cfcd0-7878-4185-b558-a0ad021f06df-kube-api-access-sgwjv\") pod \"3f8cfcd0-7878-4185-b558-a0ad021f06df\" (UID: \"3f8cfcd0-7878-4185-b558-a0ad021f06df\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-combined-ca-bundle\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-kolla-config\") pod \"41784af8-4684-4cb8-959b-2cff22fe7f16\" (UID: \"41784af8-4684-4cb8-959b-2cff22fe7f16\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.416967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bv2b\" (UniqueName: \"kubernetes.io/projected/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-kube-api-access-4bv2b\") pod \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\" (UID: \"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 
16:19:14.416989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f0537ef6-aeda-4293-aafa-5889f66c493a\" (UID: \"f0537ef6-aeda-4293-aafa-5889f66c493a\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418032 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8cfcd0-7878-4185-b558-a0ad021f06df-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418057 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c00242d-c671-49e9-a8ad-9aed66dcb94f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418070 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418087 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418097 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjh66\" (UniqueName: \"kubernetes.io/projected/3c00242d-c671-49e9-a8ad-9aed66dcb94f-kube-api-access-mjh66\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418107 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418118 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e961b985-3419-4a6d-9cc5-e7c615f54129-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.418712 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.420564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.421218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-logs" (OuterVolumeSpecName: "logs") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.421409 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-scripts" (OuterVolumeSpecName: "scripts") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.423857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.424594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.428480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-logs" (OuterVolumeSpecName: "logs") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.428561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-scripts" (OuterVolumeSpecName: "scripts") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.431254 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41784af8-4684-4cb8-959b-2cff22fe7f16-kube-api-access-gcxwf" (OuterVolumeSpecName: "kube-api-access-gcxwf") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "kube-api-access-gcxwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.432968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.433237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-logs" (OuterVolumeSpecName: "logs") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.437768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.440341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.443298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-scripts" (OuterVolumeSpecName: "scripts") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.444501 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.444668 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.450269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd161af-38fc-439e-8b3f-d4549dc3708f-kube-api-access-bvtzd" (OuterVolumeSpecName: "kube-api-access-bvtzd") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "kube-api-access-bvtzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.461102 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.467061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f8cfcd0-7878-4185-b558-a0ad021f06df" (UID: "3f8cfcd0-7878-4185-b558-a0ad021f06df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.467067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-kube-api-access-4bv2b" (OuterVolumeSpecName: "kube-api-access-4bv2b") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "kube-api-access-4bv2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.472473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8cfcd0-7878-4185-b558-a0ad021f06df-kube-api-access-sgwjv" (OuterVolumeSpecName: "kube-api-access-sgwjv") pod "3f8cfcd0-7878-4185-b558-a0ad021f06df" (UID: "3f8cfcd0-7878-4185-b558-a0ad021f06df"). InnerVolumeSpecName "kube-api-access-sgwjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.472887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0537ef6-aeda-4293-aafa-5889f66c493a-kube-api-access-dqmff" (OuterVolumeSpecName: "kube-api-access-dqmff") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "kube-api-access-dqmff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.494087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.500247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-internal-tls-certs\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhc6t\" (UniqueName: \"kubernetes.io/projected/5de9761d-81cf-4d45-aec0-7407347f0449-kube-api-access-xhc6t\") pod \"5de9761d-81cf-4d45-aec0-7407347f0449\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-combined-ca-bundle\") pod \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-nova-metadata-tls-certs\") pod \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-combined-ca-bundle\") pod \"5de9761d-81cf-4d45-aec0-7407347f0449\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-internal-tls-certs\") pod \"5de9761d-81cf-4d45-aec0-7407347f0449\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cnwv\" (UniqueName: \"kubernetes.io/projected/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-kube-api-access-7cnwv\") pod \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25947cd-d79d-4983-abfa-5d0f80850cdb-logs\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-config-data\") pod \"5de9761d-81cf-4d45-aec0-7407347f0449\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data-custom\") pod 
\"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-logs\") pod \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dbp5\" (UniqueName: \"kubernetes.io/projected/36d463b3-4ebf-46d8-9a72-3142c370d2f9-kube-api-access-9dbp5\") pod \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-config-data\") pod \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data-custom\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data\") pod \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\" (UID: \"2ba05f78-2a71-4b67-8e95-db9c04a4ed42\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sdbw\" (UniqueName: \"kubernetes.io/projected/c25947cd-d79d-4983-abfa-5d0f80850cdb-kube-api-access-6sdbw\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-public-tls-certs\") pod \"5de9761d-81cf-4d45-aec0-7407347f0449\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d463b3-4ebf-46d8-9a72-3142c370d2f9-logs\") pod \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\" (UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-combined-ca-bundle\") pod \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\" 
(UID: \"36d463b3-4ebf-46d8-9a72-3142c370d2f9\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-combined-ca-bundle\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-public-tls-certs\") pod \"c25947cd-d79d-4983-abfa-5d0f80850cdb\" (UID: \"c25947cd-d79d-4983-abfa-5d0f80850cdb\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.519830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de9761d-81cf-4d45-aec0-7407347f0449-logs\") pod \"5de9761d-81cf-4d45-aec0-7407347f0449\" (UID: \"5de9761d-81cf-4d45-aec0-7407347f0449\") " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.520881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25947cd-d79d-4983-abfa-5d0f80850cdb-logs" (OuterVolumeSpecName: "logs") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521350 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521370 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd161af-38fc-439e-8b3f-d4549dc3708f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521381 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521391 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521400 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvtzd\" (UniqueName: \"kubernetes.io/projected/6cd161af-38fc-439e-8b3f-d4549dc3708f-kube-api-access-bvtzd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521422 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521438 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521449 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521460 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcxwf\" (UniqueName: \"kubernetes.io/projected/41784af8-4684-4cb8-959b-2cff22fe7f16-kube-api-access-gcxwf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521470 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521480 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwjv\" (UniqueName: \"kubernetes.io/projected/3f8cfcd0-7878-4185-b558-a0ad021f06df-kube-api-access-sgwjv\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521488 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521497 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bv2b\" (UniqueName: \"kubernetes.io/projected/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-kube-api-access-4bv2b\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521510 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521520 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521530 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqmff\" (UniqueName: \"kubernetes.io/projected/f0537ef6-aeda-4293-aafa-5889f66c493a-kube-api-access-dqmff\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521539 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521547 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0537ef6-aeda-4293-aafa-5889f66c493a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521555 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.521564 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/41784af8-4684-4cb8-959b-2cff22fe7f16-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.522589 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-logs" (OuterVolumeSpecName: "logs") pod "2ba05f78-2a71-4b67-8e95-db9c04a4ed42" (UID: 
"2ba05f78-2a71-4b67-8e95-db9c04a4ed42"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.528695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.528947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d463b3-4ebf-46d8-9a72-3142c370d2f9-logs" (OuterVolumeSpecName: "logs") pod "36d463b3-4ebf-46d8-9a72-3142c370d2f9" (UID: "36d463b3-4ebf-46d8-9a72-3142c370d2f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.530452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de9761d-81cf-4d45-aec0-7407347f0449-logs" (OuterVolumeSpecName: "logs") pod "5de9761d-81cf-4d45-aec0-7407347f0449" (UID: "5de9761d-81cf-4d45-aec0-7407347f0449"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.531577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ba05f78-2a71-4b67-8e95-db9c04a4ed42" (UID: "2ba05f78-2a71-4b67-8e95-db9c04a4ed42"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.531719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de9761d-81cf-4d45-aec0-7407347f0449-kube-api-access-xhc6t" (OuterVolumeSpecName: "kube-api-access-xhc6t") pod "5de9761d-81cf-4d45-aec0-7407347f0449" (UID: "5de9761d-81cf-4d45-aec0-7407347f0449"). InnerVolumeSpecName "kube-api-access-xhc6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.533713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d463b3-4ebf-46d8-9a72-3142c370d2f9-kube-api-access-9dbp5" (OuterVolumeSpecName: "kube-api-access-9dbp5") pod "36d463b3-4ebf-46d8-9a72-3142c370d2f9" (UID: "36d463b3-4ebf-46d8-9a72-3142c370d2f9"). InnerVolumeSpecName "kube-api-access-9dbp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.534945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.534972 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.539890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25947cd-d79d-4983-abfa-5d0f80850cdb-kube-api-access-6sdbw" (OuterVolumeSpecName: "kube-api-access-6sdbw") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "kube-api-access-6sdbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.540061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-kube-api-access-7cnwv" (OuterVolumeSpecName: "kube-api-access-7cnwv") pod "2ba05f78-2a71-4b67-8e95-db9c04a4ed42" (UID: "2ba05f78-2a71-4b67-8e95-db9c04a4ed42"). InnerVolumeSpecName "kube-api-access-7cnwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.609737 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.612308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.613306 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.613823 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.621508 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.622759 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623314 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cnwv\" (UniqueName: \"kubernetes.io/projected/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-kube-api-access-7cnwv\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623390 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c25947cd-d79d-4983-abfa-5d0f80850cdb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623453 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623504 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623561 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623611 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dbp5\" (UniqueName: \"kubernetes.io/projected/36d463b3-4ebf-46d8-9a72-3142c370d2f9-kube-api-access-9dbp5\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623669 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623719 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623766 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623837 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sdbw\" (UniqueName: \"kubernetes.io/projected/c25947cd-d79d-4983-abfa-5d0f80850cdb-kube-api-access-6sdbw\") on node \"crc\" 
DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623901 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d463b3-4ebf-46d8-9a72-3142c370d2f9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.623953 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de9761d-81cf-4d45-aec0-7407347f0449-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.624008 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.624063 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.624115 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhc6t\" (UniqueName: \"kubernetes.io/projected/5de9761d-81cf-4d45-aec0-7407347f0449-kube-api-access-xhc6t\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.624184 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.630529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-config-data" (OuterVolumeSpecName: "config-data") pod "36d463b3-4ebf-46d8-9a72-3142c370d2f9" (UID: "36d463b3-4ebf-46d8-9a72-3142c370d2f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.637961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5de9761d-81cf-4d45-aec0-7407347f0449" (UID: "5de9761d-81cf-4d45-aec0-7407347f0449"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.654342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.658306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-config-data" (OuterVolumeSpecName: "config-data") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.659059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-config-data" (OuterVolumeSpecName: "config-data") pod "6cd161af-38fc-439e-8b3f-d4549dc3708f" (UID: "6cd161af-38fc-439e-8b3f-d4549dc3708f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.661399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-config-data" (OuterVolumeSpecName: "config-data") pod "5de9761d-81cf-4d45-aec0-7407347f0449" (UID: "5de9761d-81cf-4d45-aec0-7407347f0449"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.662470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f8cfcd0-7878-4185-b558-a0ad021f06df" (UID: "3f8cfcd0-7878-4185-b558-a0ad021f06df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.669061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "36d463b3-4ebf-46d8-9a72-3142c370d2f9" (UID: "36d463b3-4ebf-46d8-9a72-3142c370d2f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.669479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5de9761d-81cf-4d45-aec0-7407347f0449" (UID: "5de9761d-81cf-4d45-aec0-7407347f0449"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.672146 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.673634 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.674044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.676542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data" (OuterVolumeSpecName: "config-data") pod "3f8cfcd0-7878-4185-b558-a0ad021f06df" (UID: "3f8cfcd0-7878-4185-b558-a0ad021f06df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.680378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ba05f78-2a71-4b67-8e95-db9c04a4ed42" (UID: "2ba05f78-2a71-4b67-8e95-db9c04a4ed42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.683443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-config-data" (OuterVolumeSpecName: "config-data") pod "f0537ef6-aeda-4293-aafa-5889f66c493a" (UID: "f0537ef6-aeda-4293-aafa-5889f66c493a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.684006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36d463b3-4ebf-46d8-9a72-3142c370d2f9" (UID: "36d463b3-4ebf-46d8-9a72-3142c370d2f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.692187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data" (OuterVolumeSpecName: "config-data") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.692384 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "41784af8-4684-4cb8-959b-2cff22fe7f16" (UID: "41784af8-4684-4cb8-959b-2cff22fe7f16"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.695050 4707 generic.go:334] "Generic (PLEG): container finished" podID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerID="533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22" exitCode=0 Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.695207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"36d463b3-4ebf-46d8-9a72-3142c370d2f9","Type":"ContainerDied","Data":"533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.695326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"36d463b3-4ebf-46d8-9a72-3142c370d2f9","Type":"ContainerDied","Data":"b792c1a33a8e70b2e1f838bf97d0e480870d4a0407c3fdd650a3eee907e476df"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.695402 4707 scope.go:117] "RemoveContainer" containerID="533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.695650 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.698677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.700691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"41784af8-4684-4cb8-959b-2cff22fe7f16","Type":"ContainerDied","Data":"c1283d78e23a2058042cfcc6a44c7656fc8f658b2d5da061e23aeafa88042067"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.700719 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.704080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.704919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6cd161af-38fc-439e-8b3f-d4549dc3708f","Type":"ContainerDied","Data":"09b1ae05a7530b916744156378a126a0470924029ea3c389f8b733f335203e22"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.704937 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.708448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" event={"ID":"9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2","Type":"ContainerDied","Data":"54dd56ada64ce67d274cd0b3bb2acc42f882289b5f7422926e8209f09d92c818"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.708499 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7bbc9795db-8fjxd" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.710332 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerID="65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500" exitCode=0 Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.710420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" event={"ID":"3f8cfcd0-7878-4185-b558-a0ad021f06df","Type":"ContainerDied","Data":"65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.710442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" event={"ID":"3f8cfcd0-7878-4185-b558-a0ad021f06df","Type":"ContainerDied","Data":"9b546e8f4cce9f56195c0a68364d2fea906f9c32f8e14716e77d3c107da162fb"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.710522 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.711382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5de9761d-81cf-4d45-aec0-7407347f0449" (UID: "5de9761d-81cf-4d45-aec0-7407347f0449"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.715949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"f0537ef6-aeda-4293-aafa-5889f66c493a","Type":"ContainerDied","Data":"bf63591a178edb42ff637bbe6dcc4de0a9ba3705b9a710e72718c2ab42fdcf9a"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.716018 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.719834 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.719958 4707 generic.go:334] "Generic (PLEG): container finished" podID="5de9761d-81cf-4d45-aec0-7407347f0449" containerID="6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41" exitCode=0 Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.720027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5de9761d-81cf-4d45-aec0-7407347f0449","Type":"ContainerDied","Data":"6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.720047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5de9761d-81cf-4d45-aec0-7407347f0449","Type":"ContainerDied","Data":"ea99fc79106b3eb034f6f6a5b8020ad6adf39ab3d0c2c1c516c831351a6e2995"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.725443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data" (OuterVolumeSpecName: "config-data") pod "2ba05f78-2a71-4b67-8e95-db9c04a4ed42" (UID: "2ba05f78-2a71-4b67-8e95-db9c04a4ed42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727502 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727527 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727538 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727549 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727560 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727569 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727579 4707 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/41784af8-4684-4cb8-959b-2cff22fe7f16-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727588 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727598 4707 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0537ef6-aeda-4293-aafa-5889f66c493a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727607 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd161af-38fc-439e-8b3f-d4549dc3708f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727616 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727628 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727637 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727647 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8cfcd0-7878-4185-b558-a0ad021f06df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727657 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba05f78-2a71-4b67-8e95-db9c04a4ed42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727671 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d463b3-4ebf-46d8-9a72-3142c370d2f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727682 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727692 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727701 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727710 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de9761d-81cf-4d45-aec0-7407347f0449-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.727718 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.732885 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" (UID: "9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.739750 4707 generic.go:334] "Generic (PLEG): container finished" podID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerID="67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d" exitCode=0 Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.739960 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.740223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" event={"ID":"c25947cd-d79d-4983-abfa-5d0f80850cdb","Type":"ContainerDied","Data":"67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.740408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp" event={"ID":"c25947cd-d79d-4983-abfa-5d0f80850cdb","Type":"ContainerDied","Data":"edf53ef52dd9868f5e3e85f0e21b3d871899444426637012805d8c871d99cfb0"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.744385 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_e961b985-3419-4a6d-9cc5-e7c615f54129/ovn-northd/0.log" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.744463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"e961b985-3419-4a6d-9cc5-e7c615f54129","Type":"ContainerDied","Data":"d30c4adb31cf9fb4497122bf11d268303d3a823d3ea674fa40ce5a5d5cd533cb"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.744532 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.744936 4707 scope.go:117] "RemoveContainer" containerID="c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.746716 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.754786 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerID="530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6" exitCode=0 Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.754916 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.755011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" event={"ID":"2ba05f78-2a71-4b67-8e95-db9c04a4ed42","Type":"ContainerDied","Data":"530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.755031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" event={"ID":"2ba05f78-2a71-4b67-8e95-db9c04a4ed42","Type":"ContainerDied","Data":"2f7e97b26f7795278c6e122326b465e0f0fb739574c352d50272ec391309e236"} Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.755070 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.755582 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.755717 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.756693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c25947cd-d79d-4983-abfa-5d0f80850cdb" (UID: "c25947cd-d79d-4983-abfa-5d0f80850cdb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.762066 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.766440 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.795578 4707 scope.go:117] "RemoveContainer" containerID="533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22" Jan 21 16:19:14 crc kubenswrapper[4707]: E0121 16:19:14.797593 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22\": container with ID starting with 533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22 not found: ID does not exist" containerID="533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.797650 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22"} err="failed to get container status \"533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22\": rpc error: code = NotFound desc = could not find container \"533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22\": container with ID starting with 533c4c11b2db982bdbff8cdba202d5b88bed4d558cbed8d56f0c57c2126b6a22 not found: ID does not exist" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.797676 4707 scope.go:117] "RemoveContainer" containerID="c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65" Jan 21 16:19:14 crc kubenswrapper[4707]: E0121 
16:19:14.797958 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65\": container with ID starting with c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65 not found: ID does not exist" containerID="c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.798003 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65"} err="failed to get container status \"c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65\": rpc error: code = NotFound desc = could not find container \"c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65\": container with ID starting with c2379c9566b791b12faf63a4ef6738a211ef813e80d5ad89091bd4ca84705f65 not found: ID does not exist" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.798020 4707 scope.go:117] "RemoveContainer" containerID="c2938dcd35595546d7f18672cb9df771c2c4e180300200fe8fd98bae1671e6ee" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.812698 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.827030 4707 scope.go:117] "RemoveContainer" containerID="3412449b8abfb0bc2880a2a1cb3170df4e64a694075dab079c2ba72a7ec9b33b" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.834534 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c25947cd-d79d-4983-abfa-5d0f80850cdb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.834570 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.848547 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.879675 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6f7c5bdff-mpg79"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.884726 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.889656 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.899142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.900425 4707 scope.go:117] "RemoveContainer" containerID="97e0171e46f5c97fd7e6a3240f7093a22ce564fe525fb140934b1ee56a46b19b" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.925859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.927549 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.929459 4707 scope.go:117] "RemoveContainer" 
containerID="bd0c1af3ceb1fa7c36bbcc33de112908051c4e7e8cc5d7685057a8514317b590" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.932422 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.941439 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.948470 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.951976 4707 scope.go:117] "RemoveContainer" containerID="f31a9e28d650086abf54eb41930bbd347007b6dda6d23d5b073d912b680a7659" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.953949 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.958862 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.964417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.972707 4707 scope.go:117] "RemoveContainer" containerID="2edd1c6516797b8cc69e132ff4ce12d5ff92480e5675a5122e825d8deab90b2f" Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.980731 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6b6fbc4c8b-9448m"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.986415 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:19:14 crc kubenswrapper[4707]: I0121 16:19:14.992803 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.002215 4707 scope.go:117] "RemoveContainer" containerID="65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.020368 4707 scope.go:117] "RemoveContainer" containerID="f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.035847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7bbc9795db-8fjxd"] Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.039080 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7bbc9795db-8fjxd"] Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.048680 4707 scope.go:117] "RemoveContainer" containerID="65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.049368 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500\": container with ID starting with 65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500 not found: ID does not exist" containerID="65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.049393 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500"} err="failed to get container status \"65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500\": rpc error: code = NotFound desc = could not find container \"65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500\": container with ID starting with 65469f198171361e37f06aef6df6aed5a18ae10d87c318075cd3951179503500 not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.049409 4707 scope.go:117] "RemoveContainer" containerID="f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.049656 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0\": container with ID starting with f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0 not found: ID does not exist" containerID="f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.049674 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0"} err="failed to get container status \"f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0\": rpc error: code = NotFound desc = could not find container \"f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0\": container with ID starting with f3875c7b6f2c35b1103deeccd962ffe9c31dbf36fa3d2fc921bb23e7348465d0 not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.049686 4707 scope.go:117] "RemoveContainer" containerID="9edf57f9b05658747e57d4b578bd476b8dbc99dfcb2823b90d95e7f0335c5d65" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.063724 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp"] Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.072083 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5894fb5d9b-krfgp"] Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.076849 4707 scope.go:117] "RemoveContainer" containerID="b2d316c0e1b9039139944364e3e8f6a2b8d263a177cb3b436cd75caafe27246a" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.127833 4707 scope.go:117] "RemoveContainer" containerID="6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.155287 4707 scope.go:117] "RemoveContainer" containerID="ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.181228 4707 scope.go:117] "RemoveContainer" containerID="6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.183025 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41\": container with ID starting with 6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41 not found: ID does not exist" containerID="6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.183082 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41"} err="failed to get container status \"6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41\": rpc error: code = NotFound desc = could not find container \"6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41\": container with ID starting with 6979c370af57cb26318c2f1698bf6d4d74a100f7f47bda9573307a1b56e0ab41 not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.183116 4707 scope.go:117] "RemoveContainer" containerID="ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.183527 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89\": container with ID starting with ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89 not found: ID does not exist" containerID="ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.183576 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89"} err="failed to get container status \"ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89\": rpc error: code = NotFound desc = could not find container \"ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89\": container with ID starting with ab85562204b6015429774960d6466d4dd3f6e8410a674d1a710d209ef84aed89 not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.183605 4707 scope.go:117] "RemoveContainer" containerID="67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.195074 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243cdfad-87ba-4917-8c2c-061272251dab" path="/var/lib/kubelet/pods/243cdfad-87ba-4917-8c2c-061272251dab/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.195743 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28274132-ebaa-415b-a596-28b30796449c" path="/var/lib/kubelet/pods/28274132-ebaa-415b-a596-28b30796449c/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.200143 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" path="/var/lib/kubelet/pods/2ba05f78-2a71-4b67-8e95-db9c04a4ed42/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.201384 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" path="/var/lib/kubelet/pods/36d463b3-4ebf-46d8-9a72-3142c370d2f9/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.202107 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3" path="/var/lib/kubelet/pods/38168a2d-01c6-47f0-b3f5-e2c3b10aa1d3/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.202420 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c00242d-c671-49e9-a8ad-9aed66dcb94f" path="/var/lib/kubelet/pods/3c00242d-c671-49e9-a8ad-9aed66dcb94f/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.203549 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" 
path="/var/lib/kubelet/pods/3f8cfcd0-7878-4185-b558-a0ad021f06df/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.204279 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" path="/var/lib/kubelet/pods/41784af8-4684-4cb8-959b-2cff22fe7f16/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.204912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" path="/var/lib/kubelet/pods/5de9761d-81cf-4d45-aec0-7407347f0449/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.206187 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" path="/var/lib/kubelet/pods/6cd161af-38fc-439e-8b3f-d4549dc3708f/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.206998 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" path="/var/lib/kubelet/pods/9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.208117 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" path="/var/lib/kubelet/pods/c25947cd-d79d-4983-abfa-5d0f80850cdb/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.208821 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" path="/var/lib/kubelet/pods/e961b985-3419-4a6d-9cc5-e7c615f54129/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.209518 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" path="/var/lib/kubelet/pods/f0537ef6-aeda-4293-aafa-5889f66c493a/volumes" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.214649 4707 scope.go:117] "RemoveContainer" containerID="120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.348118 4707 scope.go:117] "RemoveContainer" containerID="67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.348583 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d\": container with ID starting with 67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d not found: ID does not exist" containerID="67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.348618 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d"} err="failed to get container status \"67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d\": rpc error: code = NotFound desc = could not find container \"67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d\": container with ID starting with 67c1de40585b5bdf7580321f763e39134528cf736c4a105f3a38371dc5d4172d not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.348641 4707 scope.go:117] "RemoveContainer" containerID="120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.348923 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06\": container with ID starting with 120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06 not found: ID does not exist" containerID="120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.349006 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06"} err="failed to get container status \"120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06\": rpc error: code = NotFound desc = could not find container \"120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06\": container with ID starting with 120ea5dea528e7713329f36aa184db9a5e0b44b8b32da16f58dd3079f2132c06 not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.349037 4707 scope.go:117] "RemoveContainer" containerID="0d0f736ac687f2851035b080d22d8bb9669464f300eba02e20e36b392b5e41d4" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.407928 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.427088 4707 scope.go:117] "RemoveContainer" containerID="f9aa7b9bc4195c0f2d38a8d78f5b06de6d67fd973cba295a3dafcbda4c366fb5" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.445020 4707 scope.go:117] "RemoveContainer" containerID="530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-config-data\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-credential-keys\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-scripts\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-fernet-keys\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pks9z\" (UniqueName: \"kubernetes.io/projected/90275cb5-d75d-4fbc-b84d-31fc918a19d1-kube-api-access-pks9z\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.446872 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle\") pod \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\" (UID: \"90275cb5-d75d-4fbc-b84d-31fc918a19d1\") " Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.447250 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.447307 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data podName:00504354-68de-4d18-8192-144ce961d72e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:23.447288578 +0000 UTC m=+4660.628804800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data") pod "rabbitmq-cell1-server-0" (UID: "00504354-68de-4d18-8192-144ce961d72e") : configmap "rabbitmq-cell1-config-data" not found Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.452830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-scripts" (OuterVolumeSpecName: "scripts") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.452793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.461970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90275cb5-d75d-4fbc-b84d-31fc918a19d1-kube-api-access-pks9z" (OuterVolumeSpecName: "kube-api-access-pks9z") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "kube-api-access-pks9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.464414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.466184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.468325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-config-data" (OuterVolumeSpecName: "config-data") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.469723 4707 scope.go:117] "RemoveContainer" containerID="1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.481280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.493279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "90275cb5-d75d-4fbc-b84d-31fc918a19d1" (UID: "90275cb5-d75d-4fbc-b84d-31fc918a19d1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.516184 4707 scope.go:117] "RemoveContainer" containerID="530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.516607 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6\": container with ID starting with 530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6 not found: ID does not exist" containerID="530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.516652 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6"} err="failed to get container status \"530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6\": rpc error: code = NotFound desc = could not find container \"530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6\": container with ID starting with 530e4c9995849175d9049143f9dcf154fff6486e855c3476089384550931e5a6 not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.516680 4707 scope.go:117] "RemoveContainer" containerID="1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.517102 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c\": container with ID starting with 1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c not found: ID does not exist" containerID="1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.517151 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c"} err="failed to get container status \"1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c\": rpc error: code = NotFound desc = could not find container \"1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c\": container with ID starting with 1f3358d72dde0145b5101bfb9ba06b5ebafed8b9e1106432dba300857276e55c not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549281 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549304 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549317 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pks9z\" (UniqueName: \"kubernetes.io/projected/90275cb5-d75d-4fbc-b84d-31fc918a19d1-kube-api-access-pks9z\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549331 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549341 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549350 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549358 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.549366 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90275cb5-d75d-4fbc-b84d-31fc918a19d1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.596625 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.597713 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.598871 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.598921 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.766824 4707 generic.go:334] "Generic (PLEG): container finished" podID="90275cb5-d75d-4fbc-b84d-31fc918a19d1" containerID="38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e" exitCode=0 Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.767203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" event={"ID":"90275cb5-d75d-4fbc-b84d-31fc918a19d1","Type":"ContainerDied","Data":"38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e"} Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.767233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" 
event={"ID":"90275cb5-d75d-4fbc-b84d-31fc918a19d1","Type":"ContainerDied","Data":"7c2b6db4473d8ecfc0f5132feda7e9eb77e1d7bcd769cfb486f6d3f0573eb63c"} Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.767255 4707 scope.go:117] "RemoveContainer" containerID="38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.767999 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7d947f86fd-wpfh7" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.776369 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.781343 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.782337 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.782373 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.799270 4707 scope.go:117] "RemoveContainer" containerID="38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e" Jan 21 16:19:15 crc kubenswrapper[4707]: E0121 16:19:15.799700 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e\": container with ID starting with 38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e not found: ID does not exist" containerID="38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.799739 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e"} err="failed to get container status \"38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e\": rpc error: code = NotFound desc = could not find container \"38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e\": container with ID starting with 38bbce33a54ab662d530f20d305000177d32c5bd8a0f3ae171c2e7a8f21d3c3e not found: ID does not exist" Jan 21 16:19:15 crc kubenswrapper[4707]: I0121 16:19:15.803216 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7d947f86fd-wpfh7"] Jan 21 16:19:15 crc 
kubenswrapper[4707]: I0121 16:19:15.811362 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7d947f86fd-wpfh7"] Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.055845 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.055934 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data podName:da7ffdb4-4647-47fb-8ec9-012b0c81a83e nodeName:}" failed. No retries permitted until 2026-01-21 16:19:24.055909897 +0000 UTC m=+4661.237426119 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data") pod "rabbitmq-server-0" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e") : configmap "rabbitmq-config-data" not found Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.260486 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.260556 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0 podName:3d94635b-5310-4d9e-b655-163963e945b9 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:16.760537989 +0000 UTC m=+4653.942054210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0") pod "dnsmasq-dnsmasq-64ddcf65d5-mr854" (UID: "3d94635b-5310-4d9e-b655-163963e945b9") : configmap "dns-swift-storage-0" not found Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.595456 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.663784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00504354-68de-4d18-8192-144ce961d72e-pod-info\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.663854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4pl7\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-kube-api-access-r4pl7\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.663935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-erlang-cookie\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-plugins\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-plugins-conf\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-confd\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-tls\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-server-conf\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: 
\"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00504354-68de-4d18-8192-144ce961d72e-erlang-cookie-secret\") pod \"00504354-68de-4d18-8192-144ce961d72e\" (UID: \"00504354-68de-4d18-8192-144ce961d72e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664528 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664897 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.664539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.665578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.670074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00504354-68de-4d18-8192-144ce961d72e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.670400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.671182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/00504354-68de-4d18-8192-144ce961d72e-pod-info" (OuterVolumeSpecName: "pod-info") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.671256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-kube-api-access-r4pl7" (OuterVolumeSpecName: "kube-api-access-r4pl7") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "kube-api-access-r4pl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.671851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.683080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data" (OuterVolumeSpecName: "config-data") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.704634 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-server-conf" (OuterVolumeSpecName: "server-conf") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.723346 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.734800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "00504354-68de-4d18-8192-144ce961d72e" (UID: "00504354-68de-4d18-8192-144ce961d72e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.765286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-erlang-cookie-secret\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.766148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-server-conf\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.766366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.766487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-tls\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.766580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-erlang-cookie\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.766684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-plugins\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-confd\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-plugins-conf\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm8vs\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-kube-api-access-tm8vs\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-pod-info\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.767823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data\") pod \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\" (UID: \"da7ffdb4-4647-47fb-8ec9-012b0c81a83e\") " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768907 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00504354-68de-4d18-8192-144ce961d72e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768934 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00504354-68de-4d18-8192-144ce961d72e-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768949 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4pl7\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-kube-api-access-r4pl7\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768963 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768975 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768986 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.768995 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769028 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769038 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769048 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769057 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00504354-68de-4d18-8192-144ce961d72e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.769056 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769083 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769097 4707 
reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.769124 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0 podName:3d94635b-5310-4d9e-b655-163963e945b9 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:17.769096438 +0000 UTC m=+4654.950612661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0") pod "dnsmasq-dnsmasq-64ddcf65d5-mr854" (UID: "3d94635b-5310-4d9e-b655-163963e945b9") : configmap "dns-swift-storage-0" not found Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.769148 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00504354-68de-4d18-8192-144ce961d72e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.770647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.771647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-kube-api-access-tm8vs" (OuterVolumeSpecName: "kube-api-access-tm8vs") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "kube-api-access-tm8vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.771580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-pod-info" (OuterVolumeSpecName: "pod-info") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.771975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.788882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data" (OuterVolumeSpecName: "config-data") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.793586 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.796696 4707 generic.go:334] "Generic (PLEG): container finished" podID="00504354-68de-4d18-8192-144ce961d72e" containerID="ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4" exitCode=0 Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.796763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"00504354-68de-4d18-8192-144ce961d72e","Type":"ContainerDied","Data":"ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4"} Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.796801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"00504354-68de-4d18-8192-144ce961d72e","Type":"ContainerDied","Data":"5de853c01221fcced37194ede5d5fcdc7c902fad87689859a9b851771df82d45"} Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.796837 4707 scope.go:117] "RemoveContainer" containerID="ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.797004 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.798754 4707 generic.go:334] "Generic (PLEG): container finished" podID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerID="90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929" exitCode=0 Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.798801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"da7ffdb4-4647-47fb-8ec9-012b0c81a83e","Type":"ContainerDied","Data":"90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929"} Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.798840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"da7ffdb4-4647-47fb-8ec9-012b0c81a83e","Type":"ContainerDied","Data":"c8fa2204280b79a08ef1789a71e9bf7a5a3772f555f13c5d2b21714ba407ae92"} Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.798913 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.800443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-server-conf" (OuterVolumeSpecName: "server-conf") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.839746 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.840956 4707 scope.go:117] "RemoveContainer" containerID="c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.840954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "da7ffdb4-4647-47fb-8ec9-012b0c81a83e" (UID: "da7ffdb4-4647-47fb-8ec9-012b0c81a83e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.846530 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.868318 4707 scope.go:117] "RemoveContainer" containerID="ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4" Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.868845 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4\": container with ID starting with ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4 not found: ID does not exist" containerID="ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.868892 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4"} err="failed to get container status \"ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4\": rpc error: code = NotFound desc = could not find container \"ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4\": container with ID starting with ca7c51b1d5ca8ae85ab7cdfb5cc911eecd46d233f4880f9d695c33b432d886e4 not found: ID does not exist" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.868920 4707 scope.go:117] "RemoveContainer" containerID="c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356" Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.869250 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356\": container with ID starting with c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356 not found: ID does not exist" containerID="c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.869288 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356"} err="failed to get container status \"c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356\": rpc error: code = NotFound desc = could not find container \"c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356\": container with ID starting with c6c001ce991e2331272a33132feee7da83187a2146dc810cde313af4e9b82356 not found: ID does not exist" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 
16:19:16.869314 4707 scope.go:117] "RemoveContainer" containerID="90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870183 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm8vs\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-kube-api-access-tm8vs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870208 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870224 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870234 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870245 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870276 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870292 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.870302 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da7ffdb4-4647-47fb-8ec9-012b0c81a83e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.892243 4707 scope.go:117] "RemoveContainer" containerID="66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.895921 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.912536 4707 scope.go:117] "RemoveContainer" containerID="90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929" Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.912869 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929\": container with ID starting with 90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929 not found: ID does not exist" containerID="90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.912897 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929"} err="failed to get container status 
\"90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929\": rpc error: code = NotFound desc = could not find container \"90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929\": container with ID starting with 90e515f7432223772146a9d57cb8cf5cd1f2d32760c01f8da2606bb1082f9929 not found: ID does not exist" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.912919 4707 scope.go:117] "RemoveContainer" containerID="66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075" Jan 21 16:19:16 crc kubenswrapper[4707]: E0121 16:19:16.913242 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075\": container with ID starting with 66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075 not found: ID does not exist" containerID="66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.913277 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075"} err="failed to get container status \"66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075\": rpc error: code = NotFound desc = could not find container \"66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075\": container with ID starting with 66989a9555ea907f288ed3df8848c796b2dedd43eb92ed0d12ec2077b0f96075 not found: ID does not exist" Jan 21 16:19:16 crc kubenswrapper[4707]: I0121 16:19:16.973492 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.140382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.148495 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.194093 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00504354-68de-4d18-8192-144ce961d72e" path="/var/lib/kubelet/pods/00504354-68de-4d18-8192-144ce961d72e/volumes" Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.194677 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90275cb5-d75d-4fbc-b84d-31fc918a19d1" path="/var/lib/kubelet/pods/90275cb5-d75d-4fbc-b84d-31fc918a19d1/volumes" Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.195365 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" path="/var/lib/kubelet/pods/da7ffdb4-4647-47fb-8ec9-012b0c81a83e/volumes" Jan 21 16:19:17 crc kubenswrapper[4707]: E0121 16:19:17.785451 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 16:19:17 crc kubenswrapper[4707]: E0121 16:19:17.785568 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0 podName:3d94635b-5310-4d9e-b655-163963e945b9 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:19.785550634 +0000 UTC m=+4656.967066856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0") pod "dnsmasq-dnsmasq-64ddcf65d5-mr854" (UID: "3d94635b-5310-4d9e-b655-163963e945b9") : configmap "dns-swift-storage-0" not found Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.892346 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pjb"] Jan 21 16:19:17 crc kubenswrapper[4707]: I0121 16:19:17.892916 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2pjb" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="registry-server" containerID="cri-o://a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d" gracePeriod=2 Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.056211 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.057662 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.061184 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.061233 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.282016 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.292981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz6sf\" (UniqueName: \"kubernetes.io/projected/d48c9611-090f-4751-ae18-8a781982b122-kube-api-access-xz6sf\") pod \"d48c9611-090f-4751-ae18-8a781982b122\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.293129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-catalog-content\") pod \"d48c9611-090f-4751-ae18-8a781982b122\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.293267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-utilities\") pod \"d48c9611-090f-4751-ae18-8a781982b122\" (UID: \"d48c9611-090f-4751-ae18-8a781982b122\") " Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.298770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-utilities" (OuterVolumeSpecName: "utilities") pod "d48c9611-090f-4751-ae18-8a781982b122" (UID: "d48c9611-090f-4751-ae18-8a781982b122"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.298991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48c9611-090f-4751-ae18-8a781982b122-kube-api-access-xz6sf" (OuterVolumeSpecName: "kube-api-access-xz6sf") pod "d48c9611-090f-4751-ae18-8a781982b122" (UID: "d48c9611-090f-4751-ae18-8a781982b122"). InnerVolumeSpecName "kube-api-access-xz6sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.326098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d48c9611-090f-4751-ae18-8a781982b122" (UID: "d48c9611-090f-4751-ae18-8a781982b122"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.394790 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.394841 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48c9611-090f-4751-ae18-8a781982b122-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.394855 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz6sf\" (UniqueName: \"kubernetes.io/projected/d48c9611-090f-4751-ae18-8a781982b122-kube-api-access-xz6sf\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.833860 4707 generic.go:334] "Generic (PLEG): container finished" podID="d48c9611-090f-4751-ae18-8a781982b122" containerID="a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d" exitCode=0 Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.833919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerDied","Data":"a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d"} Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.834186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pjb" event={"ID":"d48c9611-090f-4751-ae18-8a781982b122","Type":"ContainerDied","Data":"5dafad34b3fa6c645ca9f5d337072d208dec30287531b91e52dc2aeb945d6a43"} Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.834210 4707 scope.go:117] "RemoveContainer" containerID="a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.833938 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pjb" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.863763 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pjb"] Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.865375 4707 scope.go:117] "RemoveContainer" containerID="1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.869023 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pjb"] Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.885426 4707 scope.go:117] "RemoveContainer" containerID="680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.908205 4707 scope.go:117] "RemoveContainer" containerID="a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d" Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.908498 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d\": container with ID starting with a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d not found: ID does not exist" containerID="a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.908531 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d"} err="failed to get container status \"a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d\": rpc error: code = NotFound desc = could not find container \"a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d\": container with ID starting with a586a759b180be18d988b21e3f75a3dbdcc95e5995699e0dd9fc7b9de381e65d not found: ID does not exist" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.908555 4707 scope.go:117] "RemoveContainer" containerID="1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c" Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.908835 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c\": container with ID starting with 1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c not found: ID does not exist" containerID="1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.908859 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c"} err="failed to get container status \"1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c\": rpc error: code = NotFound desc = could not find container \"1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c\": container with ID starting with 1cc60eb771f3ed548493f0ceb7a082cf8a0d6d8ce750002df56372b4876f9e5c not found: ID does not exist" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.908874 4707 scope.go:117] "RemoveContainer" containerID="680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03" Jan 21 16:19:18 crc kubenswrapper[4707]: E0121 16:19:18.909122 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03\": container with ID starting with 680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03 not found: ID does not exist" containerID="680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03" Jan 21 16:19:18 crc kubenswrapper[4707]: I0121 16:19:18.909147 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03"} err="failed to get container status \"680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03\": rpc error: code = NotFound desc = could not find container \"680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03\": container with ID starting with 680e5b7d25fa3a637e6fc8d4958bfa27bfb1a9f65406fe4e1b678dba68bb7e03 not found: ID does not exist" Jan 21 16:19:19 crc kubenswrapper[4707]: I0121 16:19:19.182845 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:19:19 crc kubenswrapper[4707]: E0121 16:19:19.183085 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:19:19 crc kubenswrapper[4707]: I0121 16:19:19.192563 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48c9611-090f-4751-ae18-8a781982b122" path="/var/lib/kubelet/pods/d48c9611-090f-4751-ae18-8a781982b122/volumes" Jan 21 16:19:19 crc kubenswrapper[4707]: E0121 16:19:19.815132 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 21 16:19:19 crc kubenswrapper[4707]: E0121 16:19:19.815291 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0 podName:3d94635b-5310-4d9e-b655-163963e945b9 nodeName:}" failed. No retries permitted until 2026-01-21 16:19:23.815267742 +0000 UTC m=+4660.996783964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0") pod "dnsmasq-dnsmasq-64ddcf65d5-mr854" (UID: "3d94635b-5310-4d9e-b655-163963e945b9") : configmap "dns-swift-storage-0" not found Jan 21 16:19:19 crc kubenswrapper[4707]: I0121 16:19:19.942823 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:19:19 crc kubenswrapper[4707]: I0121 16:19:19.989149 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854"] Jan 21 16:19:19 crc kubenswrapper[4707]: I0121 16:19:19.989373 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" podUID="3d94635b-5310-4d9e-b655-163963e945b9" containerName="dnsmasq-dns" containerID="cri-o://2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c" gracePeriod=10 Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.401266 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.529114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-config\") pod \"3d94635b-5310-4d9e-b655-163963e945b9\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.529238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0\") pod \"3d94635b-5310-4d9e-b655-163963e945b9\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.529369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dnsmasq-svc\") pod \"3d94635b-5310-4d9e-b655-163963e945b9\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.529420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxrzz\" (UniqueName: \"kubernetes.io/projected/3d94635b-5310-4d9e-b655-163963e945b9-kube-api-access-vxrzz\") pod \"3d94635b-5310-4d9e-b655-163963e945b9\" (UID: \"3d94635b-5310-4d9e-b655-163963e945b9\") " Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.597823 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.599199 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.600643 4707 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.600674 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.776435 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.778585 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.780845 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.780896 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.851162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d94635b-5310-4d9e-b655-163963e945b9-kube-api-access-vxrzz" (OuterVolumeSpecName: "kube-api-access-vxrzz") pod "3d94635b-5310-4d9e-b655-163963e945b9" (UID: "3d94635b-5310-4d9e-b655-163963e945b9"). InnerVolumeSpecName "kube-api-access-vxrzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.860656 4707 generic.go:334] "Generic (PLEG): container finished" podID="3d94635b-5310-4d9e-b655-163963e945b9" containerID="2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c" exitCode=0 Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.860725 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.860710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" event={"ID":"3d94635b-5310-4d9e-b655-163963e945b9","Type":"ContainerDied","Data":"2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c"} Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.860874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854" event={"ID":"3d94635b-5310-4d9e-b655-163963e945b9","Type":"ContainerDied","Data":"21e8b7c0cbfa4420d0baafce2562ba5e86595d9f888cdf7e060204960e1e7111"} Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.860939 4707 scope.go:117] "RemoveContainer" containerID="2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.877643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "3d94635b-5310-4d9e-b655-163963e945b9" (UID: "3d94635b-5310-4d9e-b655-163963e945b9"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.884732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d94635b-5310-4d9e-b655-163963e945b9" (UID: "3d94635b-5310-4d9e-b655-163963e945b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.884897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-config" (OuterVolumeSpecName: "config") pod "3d94635b-5310-4d9e-b655-163963e945b9" (UID: "3d94635b-5310-4d9e-b655-163963e945b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.918048 4707 scope.go:117] "RemoveContainer" containerID="0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.934628 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.934665 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxrzz\" (UniqueName: \"kubernetes.io/projected/3d94635b-5310-4d9e-b655-163963e945b9-kube-api-access-vxrzz\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.934680 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.934690 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d94635b-5310-4d9e-b655-163963e945b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.936620 4707 scope.go:117] "RemoveContainer" containerID="2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c" Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.937113 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c\": container with ID starting with 2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c not found: ID does not exist" containerID="2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.937180 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c"} err="failed to get container status \"2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c\": rpc error: code = NotFound desc = could not find container \"2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c\": container with ID starting with 2d4a94170886dc1817d8c5e244086df0ad5fa4ab88f9151d11ae7919d287aa7c not found: ID does not exist" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.937218 4707 scope.go:117] "RemoveContainer" containerID="0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa" Jan 21 16:19:20 crc kubenswrapper[4707]: E0121 16:19:20.937492 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa\": container with ID starting with 0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa not found: ID does not exist" containerID="0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa" Jan 21 16:19:20 crc kubenswrapper[4707]: I0121 16:19:20.937521 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa"} err="failed to get container status \"0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa\": rpc error: code = NotFound 
desc = could not find container \"0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa\": container with ID starting with 0bf798845436da3faa90d229a1e614516921c3d7ef0dc634e02d8a0cb0ceb0fa not found: ID does not exist" Jan 21 16:19:21 crc kubenswrapper[4707]: I0121 16:19:21.202936 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854"] Jan 21 16:19:21 crc kubenswrapper[4707]: I0121 16:19:21.203374 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64ddcf65d5-mr854"] Jan 21 16:19:22 crc kubenswrapper[4707]: I0121 16:19:22.874646 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.208:3000/\": dial tcp 10.217.0.208:3000: connect: connection refused" Jan 21 16:19:23 crc kubenswrapper[4707]: E0121 16:19:23.055629 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:23 crc kubenswrapper[4707]: E0121 16:19:23.056762 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:23 crc kubenswrapper[4707]: E0121 16:19:23.058072 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:23 crc kubenswrapper[4707]: E0121 16:19:23.058120 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:23 crc kubenswrapper[4707]: I0121 16:19:23.189773 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d94635b-5310-4d9e-b655-163963e945b9" path="/var/lib/kubelet/pods/3d94635b-5310-4d9e-b655-163963e945b9/volumes" Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.597920 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.599464 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.600938 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.600978 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.777533 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.778901 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.780561 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:25 crc kubenswrapper[4707]: E0121 16:19:25.780606 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:27 crc kubenswrapper[4707]: I0121 16:19:27.733221 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9696/\": dial tcp 10.217.0.164:9696: connect: connection refused" Jan 21 16:19:28 crc kubenswrapper[4707]: E0121 16:19:28.054958 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:28 crc kubenswrapper[4707]: E0121 16:19:28.056142 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:28 crc kubenswrapper[4707]: E0121 16:19:28.057106 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:28 crc kubenswrapper[4707]: E0121 16:19:28.057157 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.598039 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.599956 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.601823 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.601867 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.776440 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.777865 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.779371 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:30 crc kubenswrapper[4707]: E0121 16:19:30.779408 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:33 crc kubenswrapper[4707]: E0121 16:19:33.054738 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:33 crc kubenswrapper[4707]: E0121 16:19:33.056334 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:33 crc kubenswrapper[4707]: E0121 16:19:33.057301 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:33 crc kubenswrapper[4707]: E0121 16:19:33.057332 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:34 crc kubenswrapper[4707]: I0121 16:19:34.182412 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:19:34 crc kubenswrapper[4707]: E0121 16:19:34.182659 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.598202 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.600612 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.602212 
4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.602292 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.780718 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.782060 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.783194 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:35 crc kubenswrapper[4707]: E0121 16:19:35.783240 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:38 crc kubenswrapper[4707]: E0121 16:19:38.055224 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:38 crc kubenswrapper[4707]: E0121 16:19:38.057435 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:38 crc kubenswrapper[4707]: E0121 16:19:38.058676 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:38 crc kubenswrapper[4707]: E0121 16:19:38.058711 4707 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.669760 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.792285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjfwm\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-kube-api-access-rjfwm\") pod \"2c2f6089-1b67-4c36-ae6f-66c394da5881\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.792530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-cache\") pod \"2c2f6089-1b67-4c36-ae6f-66c394da5881\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.792590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-lock\") pod \"2c2f6089-1b67-4c36-ae6f-66c394da5881\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.792615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2c2f6089-1b67-4c36-ae6f-66c394da5881\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.792632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") pod \"2c2f6089-1b67-4c36-ae6f-66c394da5881\" (UID: \"2c2f6089-1b67-4c36-ae6f-66c394da5881\") " Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.793075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-cache" (OuterVolumeSpecName: "cache") pod "2c2f6089-1b67-4c36-ae6f-66c394da5881" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.793378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-lock" (OuterVolumeSpecName: "lock") pod "2c2f6089-1b67-4c36-ae6f-66c394da5881" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.796396 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-kube-api-access-rjfwm" (OuterVolumeSpecName: "kube-api-access-rjfwm") pod "2c2f6089-1b67-4c36-ae6f-66c394da5881" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881"). InnerVolumeSpecName "kube-api-access-rjfwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.796670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2c2f6089-1b67-4c36-ae6f-66c394da5881" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.797831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "2c2f6089-1b67-4c36-ae6f-66c394da5881" (UID: "2c2f6089-1b67-4c36-ae6f-66c394da5881"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.893590 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjfwm\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-kube-api-access-rjfwm\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.893697 4707 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-cache\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.893858 4707 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c2f6089-1b67-4c36-ae6f-66c394da5881-lock\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.893963 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c2f6089-1b67-4c36-ae6f-66c394da5881-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.894361 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.908268 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 16:19:39 crc kubenswrapper[4707]: I0121 16:19:39.996340 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.038500 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerID="d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec" exitCode=137 Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.038566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec"} Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.038597 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.038631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"2c2f6089-1b67-4c36-ae6f-66c394da5881","Type":"ContainerDied","Data":"7ac3d2c350757ccb932d512ede416f9f9f38057a71361cc853fb1e70481913b6"} Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.038660 4707 scope.go:117] "RemoveContainer" containerID="d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.042495 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-69b97779c8-jlqt9_64190d62-9a33-4999-bc4a-3eff62940162/neutron-api/0.log" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.042557 4707 generic.go:334] "Generic (PLEG): container finished" podID="64190d62-9a33-4999-bc4a-3eff62940162" containerID="d2d7c5789da9a58eb70a75b4a3d6ee778cf3a4ed19f27c737a94bbdf5be149df" exitCode=137 Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.042629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" event={"ID":"64190d62-9a33-4999-bc4a-3eff62940162","Type":"ContainerDied","Data":"d2d7c5789da9a58eb70a75b4a3d6ee778cf3a4ed19f27c737a94bbdf5be149df"} Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.044137 4707 generic.go:334] "Generic (PLEG): container finished" podID="80f502e2-066e-4230-b504-29825f19918a" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" exitCode=137 Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.044201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"80f502e2-066e-4230-b504-29825f19918a","Type":"ContainerDied","Data":"e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a"} Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.078684 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.081927 4707 scope.go:117] "RemoveContainer" containerID="2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.096862 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbmbp\" (UniqueName: \"kubernetes.io/projected/80f502e2-066e-4230-b504-29825f19918a-kube-api-access-kbmbp\") pod \"80f502e2-066e-4230-b504-29825f19918a\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.096903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-combined-ca-bundle\") pod \"80f502e2-066e-4230-b504-29825f19918a\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.096948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-config-data\") pod \"80f502e2-066e-4230-b504-29825f19918a\" (UID: \"80f502e2-066e-4230-b504-29825f19918a\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.100289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.105381 4707 scope.go:117] "RemoveContainer" containerID="3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.105439 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.115352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f502e2-066e-4230-b504-29825f19918a-kube-api-access-kbmbp" (OuterVolumeSpecName: "kube-api-access-kbmbp") pod "80f502e2-066e-4230-b504-29825f19918a" (UID: "80f502e2-066e-4230-b504-29825f19918a"). InnerVolumeSpecName "kube-api-access-kbmbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.118143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80f502e2-066e-4230-b504-29825f19918a" (UID: "80f502e2-066e-4230-b504-29825f19918a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.120007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-config-data" (OuterVolumeSpecName: "config-data") pod "80f502e2-066e-4230-b504-29825f19918a" (UID: "80f502e2-066e-4230-b504-29825f19918a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.120078 4707 scope.go:117] "RemoveContainer" containerID="d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.137131 4707 scope.go:117] "RemoveContainer" containerID="0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.153458 4707 scope.go:117] "RemoveContainer" containerID="d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.168654 4707 scope.go:117] "RemoveContainer" containerID="7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.190844 4707 scope.go:117] "RemoveContainer" containerID="98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.198511 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbmbp\" (UniqueName: \"kubernetes.io/projected/80f502e2-066e-4230-b504-29825f19918a-kube-api-access-kbmbp\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.198617 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.198677 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f502e2-066e-4230-b504-29825f19918a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.210977 4707 scope.go:117] "RemoveContainer" containerID="ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.226462 4707 scope.go:117] "RemoveContainer" containerID="c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.244690 4707 scope.go:117] "RemoveContainer" containerID="b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.263526 4707 scope.go:117] "RemoveContainer" containerID="83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.263775 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-69b97779c8-jlqt9_64190d62-9a33-4999-bc4a-3eff62940162/neutron-api/0.log" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.263871 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.283355 4707 scope.go:117] "RemoveContainer" containerID="2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-combined-ca-bundle\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-public-tls-certs\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-httpd-config\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vvnc\" (UniqueName: \"kubernetes.io/projected/64190d62-9a33-4999-bc4a-3eff62940162-kube-api-access-4vvnc\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-config\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-ovndb-tls-certs\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.299822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-internal-tls-certs\") pod \"64190d62-9a33-4999-bc4a-3eff62940162\" (UID: \"64190d62-9a33-4999-bc4a-3eff62940162\") " Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.302488 4707 scope.go:117] "RemoveContainer" containerID="97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.302860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64190d62-9a33-4999-bc4a-3eff62940162-kube-api-access-4vvnc" (OuterVolumeSpecName: "kube-api-access-4vvnc") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "kube-api-access-4vvnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.305038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.321931 4707 scope.go:117] "RemoveContainer" containerID="a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.327468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.331153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.331630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-config" (OuterVolumeSpecName: "config") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.335727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.339689 4707 scope.go:117] "RemoveContainer" containerID="d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.340148 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec\": container with ID starting with d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec not found: ID does not exist" containerID="d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.340198 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec"} err="failed to get container status \"d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec\": rpc error: code = NotFound desc = could not find container \"d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec\": container with ID starting with d4738a63e9f1b71ae9f72f3694c8a48212a0cc90417ee99360a6fafec92cdbec not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.340235 4707 scope.go:117] "RemoveContainer" containerID="2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.340535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d\": container with ID starting with 2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d not found: ID does not exist" containerID="2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.340572 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d"} err="failed to get container status \"2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d\": rpc error: code = NotFound desc = could not find container \"2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d\": container with ID starting with 2ff48242e2b2ba540ebec3b088235b5083c33b5c51a15029bc02e9e9e4ded73d not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.340598 4707 scope.go:117] "RemoveContainer" containerID="3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.340948 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b\": container with ID starting with 3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b not found: ID does not exist" containerID="3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.340976 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b"} err="failed to get container status \"3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b\": rpc error: code = NotFound desc = could not 
find container \"3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b\": container with ID starting with 3a3d7871cc7694bf894fb481495dec447e86acb68bea39460ff7ea96e01c6b6b not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.340993 4707 scope.go:117] "RemoveContainer" containerID="d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.341319 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4\": container with ID starting with d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4 not found: ID does not exist" containerID="d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.341341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4"} err="failed to get container status \"d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4\": rpc error: code = NotFound desc = could not find container \"d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4\": container with ID starting with d9003488ba14ecef0f3cc81a275904e03f35eaf38a8350dbec2a5487878e44e4 not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.341355 4707 scope.go:117] "RemoveContainer" containerID="0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.341673 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d\": container with ID starting with 0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d not found: ID does not exist" containerID="0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.341699 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d"} err="failed to get container status \"0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d\": rpc error: code = NotFound desc = could not find container \"0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d\": container with ID starting with 0cf7741ca51e63659d077ebe81be64f104b3d4eb2a62c7f5b44913318476d11d not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.341713 4707 scope.go:117] "RemoveContainer" containerID="d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.341947 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118\": container with ID starting with d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118 not found: ID does not exist" containerID="d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.341967 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118"} err="failed to get container status \"d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118\": rpc error: code = NotFound desc = could not find container \"d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118\": container with ID starting with d27cbfa10bc5f497eb4213647ee48f55a3f87b18706910e026cbe393b9cac118 not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.341978 4707 scope.go:117] "RemoveContainer" containerID="7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.342217 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae\": container with ID starting with 7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae not found: ID does not exist" containerID="7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342238 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae"} err="failed to get container status \"7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae\": rpc error: code = NotFound desc = could not find container \"7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae\": container with ID starting with 7c9da2f3bbd052a6dd083e3e8c7d16f7ae0633b66112e924a885185490f1a4ae not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342249 4707 scope.go:117] "RemoveContainer" containerID="98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "64190d62-9a33-4999-bc4a-3eff62940162" (UID: "64190d62-9a33-4999-bc4a-3eff62940162"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.342479 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d\": container with ID starting with 98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d not found: ID does not exist" containerID="98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342502 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d"} err="failed to get container status \"98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d\": rpc error: code = NotFound desc = could not find container \"98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d\": container with ID starting with 98654a4a0c91f45a1d7a6a1f20f05a214dfa00c2cfcea01ddb9ec40133690e3d not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342521 4707 scope.go:117] "RemoveContainer" containerID="ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.342880 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e\": container with ID starting with ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e not found: ID does not exist" containerID="ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342913 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e"} err="failed to get container status \"ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e\": rpc error: code = NotFound desc = could not find container \"ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e\": container with ID starting with ea3ef4bc24851e4510ce36d6096ed0d757163e956e0b659760f40fa86fb61c4e not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.342937 4707 scope.go:117] "RemoveContainer" containerID="c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.343189 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce\": container with ID starting with c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce not found: ID does not exist" containerID="c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.343211 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce"} err="failed to get container status \"c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce\": rpc error: code = NotFound desc = could not find container \"c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce\": container with ID starting with c93f74e1f9adbe49d25f2fff1319eafe4824c02cb456f0bb2781708ff181c3ce 
not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.343224 4707 scope.go:117] "RemoveContainer" containerID="b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.343437 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6\": container with ID starting with b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6 not found: ID does not exist" containerID="b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.343460 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6"} err="failed to get container status \"b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6\": rpc error: code = NotFound desc = could not find container \"b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6\": container with ID starting with b4d5b9fde4f5836e5419d8871e1b452d7d297194bbeecea3336d25e4a1a353b6 not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.343472 4707 scope.go:117] "RemoveContainer" containerID="83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.343723 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0\": container with ID starting with 83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0 not found: ID does not exist" containerID="83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.343759 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0"} err="failed to get container status \"83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0\": rpc error: code = NotFound desc = could not find container \"83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0\": container with ID starting with 83604ee1f5d606366d34c3e89314b1deeae17dbc8277d767d200e72dbb54fea0 not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.343780 4707 scope.go:117] "RemoveContainer" containerID="2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.344096 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf\": container with ID starting with 2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf not found: ID does not exist" containerID="2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.344177 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf"} err="failed to get container status \"2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf\": rpc error: code = NotFound desc = could not find container 
\"2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf\": container with ID starting with 2a7863e40451410988a824b6c27c518df8def73087aa472c3da477cfd9fbebbf not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.344198 4707 scope.go:117] "RemoveContainer" containerID="97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.344444 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453\": container with ID starting with 97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453 not found: ID does not exist" containerID="97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.344463 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453"} err="failed to get container status \"97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453\": rpc error: code = NotFound desc = could not find container \"97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453\": container with ID starting with 97d969aa4fdace50784928ea705a500a3f9dc5b1ad2c8ba2e26d6af1dba06453 not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.344479 4707 scope.go:117] "RemoveContainer" containerID="a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.344774 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9\": container with ID starting with a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9 not found: ID does not exist" containerID="a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.344794 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9"} err="failed to get container status \"a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9\": rpc error: code = NotFound desc = could not find container \"a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9\": container with ID starting with a0f8e5ed1e2cac46aa62008c5476479e5edf2ff37f64aab8b31ed3ebb0c3d7a9 not found: ID does not exist" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401188 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401212 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401224 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401233 4707 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401243 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401253 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64190d62-9a33-4999-bc4a-3eff62940162-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: I0121 16:19:40.401265 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vvnc\" (UniqueName: \"kubernetes.io/projected/64190d62-9a33-4999-bc4a-3eff62940162-kube-api-access-4vvnc\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.775805 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469 is running failed: container process not found" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.776655 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469 is running failed: container process not found" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.777095 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469 is running failed: container process not found" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 21 16:19:40 crc kubenswrapper[4707]: E0121 16:19:40.777144 4707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.056200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"80f502e2-066e-4230-b504-29825f19918a","Type":"ContainerDied","Data":"e104d37b0c635c6d067bb8de883529400b116f8fa86c601f8ffaf7a8db22717b"} Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.056285 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.056296 4707 scope.go:117] "RemoveContainer" containerID="e40780eb8f6c8868fcfe6cea95b53b9863e85ffd96154caea248f739bcd4c77a" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.058063 4707 generic.go:334] "Generic (PLEG): container finished" podID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" exitCode=137 Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.058108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2","Type":"ContainerDied","Data":"6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469"} Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.061669 4707 generic.go:334] "Generic (PLEG): container finished" podID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" exitCode=137 Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.061726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"89eecc6d-db4a-4705-8ca5-cfdff9119615","Type":"ContainerDied","Data":"86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4"} Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.068587 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-69b97779c8-jlqt9_64190d62-9a33-4999-bc4a-3eff62940162/neutron-api/0.log" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.068652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" event={"ID":"64190d62-9a33-4999-bc4a-3eff62940162","Type":"ContainerDied","Data":"64e00f319a49f010f4afb5370013e1976165b3f1ae2cd24d5e334ea917351241"} Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.068751 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-69b97779c8-jlqt9" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.191587 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" path="/var/lib/kubelet/pods/2c2f6089-1b67-4c36-ae6f-66c394da5881/volumes" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.199261 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.202115 4707 scope.go:117] "RemoveContainer" containerID="5b3ebbd8a16f076cbe84710aa19479370ca9661ce99511215aea7ce272f068eb" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.221669 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.225037 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.228863 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-69b97779c8-jlqt9"] Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.234923 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-69b97779c8-jlqt9"] Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.236559 4707 scope.go:117] "RemoveContainer" containerID="d2d7c5789da9a58eb70a75b4a3d6ee778cf3a4ed19f27c737a94bbdf5be149df" Jan 21 16:19:41 crc kubenswrapper[4707]: E0121 16:19:41.304303 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f502e2_066e_4230_b504_29825f19918a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f502e2_066e_4230_b504_29825f19918a.slice/crio-e104d37b0c635c6d067bb8de883529400b116f8fa86c601f8ffaf7a8db22717b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64190d62_9a33_4999_bc4a_3eff62940162.slice\": RecentStats: unable to find data in memory cache]" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.312356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-combined-ca-bundle\") pod \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.312442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjmg\" (UniqueName: \"kubernetes.io/projected/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-kube-api-access-dhjmg\") pod \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.312710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-config-data\") pod \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\" (UID: \"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.318410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-kube-api-access-dhjmg" (OuterVolumeSpecName: "kube-api-access-dhjmg") pod "e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" (UID: "e17faeb8-a9ee-44c5-a9d3-e7467017e0c2"). InnerVolumeSpecName "kube-api-access-dhjmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.338294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-config-data" (OuterVolumeSpecName: "config-data") pod "e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" (UID: "e17faeb8-a9ee-44c5-a9d3-e7467017e0c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.340744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" (UID: "e17faeb8-a9ee-44c5-a9d3-e7467017e0c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.359273 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.414396 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.414435 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.414450 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhjmg\" (UniqueName: \"kubernetes.io/projected/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2-kube-api-access-dhjmg\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.420144 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.515766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-config-data\") pod \"89eecc6d-db4a-4705-8ca5-cfdff9119615\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.516063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pscn4\" (UniqueName: \"kubernetes.io/projected/89eecc6d-db4a-4705-8ca5-cfdff9119615-kube-api-access-pscn4\") pod \"89eecc6d-db4a-4705-8ca5-cfdff9119615\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.516104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-combined-ca-bundle\") pod \"89eecc6d-db4a-4705-8ca5-cfdff9119615\" (UID: \"89eecc6d-db4a-4705-8ca5-cfdff9119615\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.519453 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89eecc6d-db4a-4705-8ca5-cfdff9119615-kube-api-access-pscn4" (OuterVolumeSpecName: "kube-api-access-pscn4") pod "89eecc6d-db4a-4705-8ca5-cfdff9119615" (UID: "89eecc6d-db4a-4705-8ca5-cfdff9119615"). InnerVolumeSpecName "kube-api-access-pscn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.534770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-config-data" (OuterVolumeSpecName: "config-data") pod "89eecc6d-db4a-4705-8ca5-cfdff9119615" (UID: "89eecc6d-db4a-4705-8ca5-cfdff9119615"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.535090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89eecc6d-db4a-4705-8ca5-cfdff9119615" (UID: "89eecc6d-db4a-4705-8ca5-cfdff9119615"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.617567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-ceilometer-tls-certs\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.617657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-run-httpd\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.617707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-sg-core-conf-yaml\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.617762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-combined-ca-bundle\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.617846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-config-data\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.617953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-scripts\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjjll\" (UniqueName: \"kubernetes.io/projected/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-kube-api-access-tjjll\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-log-httpd\") pod \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\" (UID: \"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4\") " Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618473 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618496 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618514 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pscn4\" (UniqueName: \"kubernetes.io/projected/89eecc6d-db4a-4705-8ca5-cfdff9119615-kube-api-access-pscn4\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618524 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89eecc6d-db4a-4705-8ca5-cfdff9119615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.618659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.621191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-scripts" (OuterVolumeSpecName: "scripts") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.621710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-kube-api-access-tjjll" (OuterVolumeSpecName: "kube-api-access-tjjll") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "kube-api-access-tjjll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.633947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.647293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.658325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.670492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-config-data" (OuterVolumeSpecName: "config-data") pod "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" (UID: "8fc9cfd6-8a8a-461b-96ed-e1781b8898c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719842 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjjll\" (UniqueName: \"kubernetes.io/projected/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-kube-api-access-tjjll\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719877 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719892 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719902 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719914 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719923 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:41 crc kubenswrapper[4707]: I0121 16:19:41.719934 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.081365 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.081340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e17faeb8-a9ee-44c5-a9d3-e7467017e0c2","Type":"ContainerDied","Data":"f48c05da66ea329433c24a894735e1ec849124341f9f4dcd29f8dff7f290b1c1"} Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.081531 4707 scope.go:117] "RemoveContainer" containerID="6c95a35e072510726b7866697fb021f2a46962ab8eabac5165d35f5a7268f469" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.083510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"89eecc6d-db4a-4705-8ca5-cfdff9119615","Type":"ContainerDied","Data":"88d667de9ee20a089cbfa9bd66599fd99091aa6848a057d74ebe97a44084cd72"} Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.083568 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.087445 4707 generic.go:334] "Generic (PLEG): container finished" podID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerID="290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9" exitCode=137 Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.087511 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerDied","Data":"290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9"} Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.087529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fc9cfd6-8a8a-461b-96ed-e1781b8898c4","Type":"ContainerDied","Data":"d6b12b269502fd2de890e43a1d712e337c91384d942b74252b62541be9d3b607"} Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.087578 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.107476 4707 scope.go:117] "RemoveContainer" containerID="86bf1d49e28441f68968975e4bbc56da47d4d7315808e33c5dff82b9b89c25e4" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.120944 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.126182 4707 scope.go:117] "RemoveContainer" containerID="928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.134255 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.143722 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.149930 4707 scope.go:117] "RemoveContainer" containerID="2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.151969 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.159211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.163352 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.167244 4707 scope.go:117] "RemoveContainer" containerID="290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.183698 4707 scope.go:117] "RemoveContainer" containerID="32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.204351 4707 scope.go:117] "RemoveContainer" containerID="928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c" Jan 21 16:19:42 crc kubenswrapper[4707]: E0121 16:19:42.204962 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c\": container with ID starting with 928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c not found: ID does not exist" containerID="928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205006 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c"} err="failed to get container status \"928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c\": rpc error: code = NotFound desc = could not find container \"928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c\": container with ID starting with 928deea81abdfad12830d1ad2b11a9a6a53428a957abddb387c7fa8e863b3a2c not found: ID does not exist" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205039 4707 scope.go:117] "RemoveContainer" containerID="2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b" Jan 21 16:19:42 crc kubenswrapper[4707]: E0121 16:19:42.205319 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b\": container with ID starting with 2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b not found: ID does not exist" containerID="2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205355 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b"} err="failed to get container status \"2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b\": rpc error: code = NotFound desc = could not find container \"2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b\": container with ID starting with 2831aa466c35d063e6b298e4df342edd84ba70704fb6ac4a09b470032f0f105b not found: ID does not exist" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205377 4707 scope.go:117] "RemoveContainer" containerID="290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9" Jan 21 16:19:42 crc kubenswrapper[4707]: E0121 16:19:42.205632 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9\": container with ID starting with 290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9 not found: ID does not exist" containerID="290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205660 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9"} err="failed to get container status \"290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9\": rpc error: code = NotFound desc = could not find container \"290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9\": container with ID starting with 290a86a351947527dbde1c9e7cbee77c5c773281ca2fa30ba14eee3261305ab9 not found: ID does not exist" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205676 4707 scope.go:117] "RemoveContainer" containerID="32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65" Jan 21 16:19:42 crc kubenswrapper[4707]: E0121 16:19:42.205923 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65\": container with ID starting with 32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65 not found: ID does not exist" containerID="32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65" Jan 21 16:19:42 crc kubenswrapper[4707]: I0121 16:19:42.205951 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65"} err="failed to get container status \"32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65\": rpc error: code = NotFound desc = could not find container \"32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65\": container with ID starting with 32341fc8e0c19810d73b8ae96f0d16c65d95270c6d431464022c8661e2535c65 not found: ID does not exist" Jan 21 16:19:43 crc kubenswrapper[4707]: I0121 16:19:43.192793 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64190d62-9a33-4999-bc4a-3eff62940162" 
path="/var/lib/kubelet/pods/64190d62-9a33-4999-bc4a-3eff62940162/volumes" Jan 21 16:19:43 crc kubenswrapper[4707]: I0121 16:19:43.193745 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f502e2-066e-4230-b504-29825f19918a" path="/var/lib/kubelet/pods/80f502e2-066e-4230-b504-29825f19918a/volumes" Jan 21 16:19:43 crc kubenswrapper[4707]: I0121 16:19:43.194218 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" path="/var/lib/kubelet/pods/89eecc6d-db4a-4705-8ca5-cfdff9119615/volumes" Jan 21 16:19:43 crc kubenswrapper[4707]: I0121 16:19:43.195338 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" path="/var/lib/kubelet/pods/8fc9cfd6-8a8a-461b-96ed-e1781b8898c4/volumes" Jan 21 16:19:43 crc kubenswrapper[4707]: I0121 16:19:43.196246 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" path="/var/lib/kubelet/pods/e17faeb8-a9ee-44c5-a9d3-e7467017e0c2/volumes" Jan 21 16:19:48 crc kubenswrapper[4707]: I0121 16:19:48.961934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jm9xf"] Jan 21 16:19:48 crc kubenswrapper[4707]: I0121 16:19:48.966580 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jm9xf"] Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.059981 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bxxv8"] Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060300 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060320 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060330 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060336 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060347 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060352 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060360 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060365 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060378 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-updater" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060385 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" 
containerName="container-updater" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060395 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060401 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060410 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-api" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060423 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="galera" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060428 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="galera" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060434 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060439 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-server" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060446 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-updater" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060451 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-updater" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-expirer" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060467 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-expirer" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060474 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="rabbitmq" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060480 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="rabbitmq" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060488 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060495 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-server" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060503 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="ovn-northd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" 
containerName="ovn-northd" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-notification-agent" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-notification-agent" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060526 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="sg-core" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060533 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="sg-core" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060541 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060548 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060555 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d94635b-5310-4d9e-b655-163963e945b9" containerName="init" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060560 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d94635b-5310-4d9e-b655-163963e945b9" containerName="init" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060568 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060573 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060583 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060589 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060594 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060605 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="mysql-bootstrap" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="mysql-bootstrap" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060617 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="extract-utilities" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060623 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48c9611-090f-4751-ae18-8a781982b122" 
containerName="extract-utilities" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060631 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="proxy-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060636 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="proxy-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060647 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060653 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="swift-recon-cron" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060658 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="swift-recon-cron" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060680 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="setup-container" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060685 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="setup-container" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060696 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060701 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060706 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-reaper" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060711 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-reaper" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060720 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="registry-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060725 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="registry-server" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060730 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060737 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060746 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060751 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-server" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060757 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060762 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060770 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-metadata" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060775 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-metadata" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060783 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-central-agent" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060789 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-central-agent" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060798 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060803 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-api" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060828 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060833 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060840 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060846 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060853 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="extract-content" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060858 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="extract-content" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060863 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90275cb5-d75d-4fbc-b84d-31fc918a19d1" containerName="keystone-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 
16:19:49.060868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="90275cb5-d75d-4fbc-b84d-31fc918a19d1" containerName="keystone-api" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060877 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060883 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060891 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060896 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060906 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060910 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060918 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060923 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060936 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060945 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d94635b-5310-4d9e-b655-163963e945b9" containerName="dnsmasq-dns" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060950 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d94635b-5310-4d9e-b655-163963e945b9" containerName="dnsmasq-dns" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060962 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="rsync" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060975 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="rsync" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060982 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-log" Jan 21 16:19:49 crc 
kubenswrapper[4707]: I0121 16:19:49.060986 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-log" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.060993 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="setup-container" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.060998 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="setup-container" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061005 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="openstack-network-exporter" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061010 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="openstack-network-exporter" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061016 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="rabbitmq" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061021 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="rabbitmq" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061030 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061036 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-api" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061045 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061050 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061056 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061062 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061071 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061077 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061085 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243cdfad-87ba-4917-8c2c-061272251dab" containerName="memcached" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="243cdfad-87ba-4917-8c2c-061272251dab" containerName="memcached" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.061097 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" 
containerName="proxy-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061248 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="swift-recon-cron" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061258 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061268 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061274 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061282 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de9761d-81cf-4d45-aec0-7407347f0449" containerName="nova-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061291 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061298 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061307 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73f5d48-3358-4ea2-8b12-cfd7118a7337" containerName="proxy-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061315 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061324 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061336 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="00504354-68de-4d18-8192-144ce961d72e" containerName="rabbitmq" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061344 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061350 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="proxy-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061359 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061365 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="90275cb5-d75d-4fbc-b84d-31fc918a19d1" containerName="keystone-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061373 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061382 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061390 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17faeb8-a9ee-44c5-a9d3-e7467017e0c2" containerName="nova-cell1-conductor-conductor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061398 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-central-agent" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061407 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="243cdfad-87ba-4917-8c2c-061272251dab" containerName="memcached" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061414 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f502e2-066e-4230-b504-29825f19918a" containerName="nova-scheduler-scheduler" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061421 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-expirer" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061427 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d463b3-4ebf-46d8-9a72-3142c370d2f9" containerName="nova-metadata-metadata" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb70bcb-b47f-467d-bd5f-8ea6c4a02fb2" containerName="placement-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061445 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="64190d62-9a33-4999-bc4a-3eff62940162" containerName="neutron-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061453 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48c9611-090f-4751-ae18-8a781982b122" containerName="registry-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061459 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-reaper" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061467 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-server" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061482 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061489 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7ffdb4-4647-47fb-8ec9-012b0c81a83e" containerName="rabbitmq" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061496 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="ceilometer-notification-agent" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061503 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="41784af8-4684-4cb8-959b-2cff22fe7f16" containerName="galera" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28274132-ebaa-415b-a596-28b30796449c" containerName="cinder-api-log" 
Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061516 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0537ef6-aeda-4293-aafa-5889f66c493a" containerName="glance-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061525 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-auditor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061531 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25947cd-d79d-4983-abfa-5d0f80850cdb" containerName="barbican-api-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061537 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="89eecc6d-db4a-4705-8ca5-cfdff9119615" containerName="nova-cell0-conductor-conductor" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061546 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="object-updater" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061552 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc9cfd6-8a8a-461b-96ed-e1781b8898c4" containerName="sg-core" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061559 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061567 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061573 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061582 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd161af-38fc-439e-8b3f-d4549dc3708f" containerName="glance-httpd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061589 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener-log" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061595 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba05f78-2a71-4b67-8e95-db9c04a4ed42" containerName="barbican-keystone-listener" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061600 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="account-replicator" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061609 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="container-updater" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061618 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8cfcd0-7878-4185-b558-a0ad021f06df" containerName="barbican-worker" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061623 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d94635b-5310-4d9e-b655-163963e945b9" containerName="dnsmasq-dns" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061630 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="openstack-network-exporter" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061636 
4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e961b985-3419-4a6d-9cc5-e7c615f54129" containerName="ovn-northd" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.061642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2f6089-1b67-4c36-ae6f-66c394da5881" containerName="rsync" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.062116 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.063930 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.064025 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.064117 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.064293 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.067031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bxxv8"] Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.182588 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:19:49 crc kubenswrapper[4707]: E0121 16:19:49.182878 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.191744 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6281cf61-0f59-4ad0-90a3-6f31bb120016" path="/var/lib/kubelet/pods/6281cf61-0f59-4ad0-90a3-6f31bb120016/volumes" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.223522 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xrzg\" (UniqueName: \"kubernetes.io/projected/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-kube-api-access-9xrzg\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.223725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-node-mnt\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.223867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-crc-storage\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.325701 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-node-mnt\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.325837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-crc-storage\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.325955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xrzg\" (UniqueName: \"kubernetes.io/projected/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-kube-api-access-9xrzg\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.325956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-node-mnt\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.326605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-crc-storage\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.344043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xrzg\" (UniqueName: \"kubernetes.io/projected/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-kube-api-access-9xrzg\") pod \"crc-storage-crc-bxxv8\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.376340 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.759533 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bxxv8"] Jan 21 16:19:49 crc kubenswrapper[4707]: I0121 16:19:49.765651 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:19:50 crc kubenswrapper[4707]: I0121 16:19:50.176995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxxv8" event={"ID":"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc","Type":"ContainerStarted","Data":"bdc2f1fc697315d01639bcabafb42b709301e8fb42fdc96e02645b9c67feb867"} Jan 21 16:19:51 crc kubenswrapper[4707]: I0121 16:19:51.189018 4707 generic.go:334] "Generic (PLEG): container finished" podID="b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" containerID="9aaf40628f28453c96b4650e0084ceff71754fc2ea886263f867dfe4d5d698d4" exitCode=0 Jan 21 16:19:51 crc kubenswrapper[4707]: I0121 16:19:51.190465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxxv8" event={"ID":"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc","Type":"ContainerDied","Data":"9aaf40628f28453c96b4650e0084ceff71754fc2ea886263f867dfe4d5d698d4"} Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.440424 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.473123 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-node-mnt\") pod \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.473257 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-crc-storage\") pod \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.473264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" (UID: "b01688b8-fc4a-40b6-b1b3-ddcda634a0dc"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.473675 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.492638 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" (UID: "b01688b8-fc4a-40b6-b1b3-ddcda634a0dc"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.574848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xrzg\" (UniqueName: \"kubernetes.io/projected/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-kube-api-access-9xrzg\") pod \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\" (UID: \"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc\") " Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.575116 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.577794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-kube-api-access-9xrzg" (OuterVolumeSpecName: "kube-api-access-9xrzg") pod "b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" (UID: "b01688b8-fc4a-40b6-b1b3-ddcda634a0dc"). InnerVolumeSpecName "kube-api-access-9xrzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:52 crc kubenswrapper[4707]: I0121 16:19:52.676573 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xrzg\" (UniqueName: \"kubernetes.io/projected/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc-kube-api-access-9xrzg\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:53 crc kubenswrapper[4707]: I0121 16:19:53.207012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxxv8" event={"ID":"b01688b8-fc4a-40b6-b1b3-ddcda634a0dc","Type":"ContainerDied","Data":"bdc2f1fc697315d01639bcabafb42b709301e8fb42fdc96e02645b9c67feb867"} Jan 21 16:19:53 crc kubenswrapper[4707]: I0121 16:19:53.207070 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc2f1fc697315d01639bcabafb42b709301e8fb42fdc96e02645b9c67feb867" Jan 21 16:19:53 crc kubenswrapper[4707]: I0121 16:19:53.207106 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxxv8" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.418444 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bxxv8"] Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.422640 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bxxv8"] Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.537425 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-khf6w"] Jan 21 16:19:55 crc kubenswrapper[4707]: E0121 16:19:55.537764 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" containerName="storage" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.537783 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" containerName="storage" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.537973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" containerName="storage" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.538504 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.541038 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.541219 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.541929 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.543031 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.545502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-khf6w"] Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.620741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/85fee3b2-8fca-41cc-9e28-889eddc7252f-node-mnt\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.620795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/85fee3b2-8fca-41cc-9e28-889eddc7252f-crc-storage\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.620851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjkl\" (UniqueName: \"kubernetes.io/projected/85fee3b2-8fca-41cc-9e28-889eddc7252f-kube-api-access-lbjkl\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.722163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/85fee3b2-8fca-41cc-9e28-889eddc7252f-node-mnt\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.722241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/85fee3b2-8fca-41cc-9e28-889eddc7252f-crc-storage\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.722272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbjkl\" (UniqueName: \"kubernetes.io/projected/85fee3b2-8fca-41cc-9e28-889eddc7252f-kube-api-access-lbjkl\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.722629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/85fee3b2-8fca-41cc-9e28-889eddc7252f-node-mnt\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " 
pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.723058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/85fee3b2-8fca-41cc-9e28-889eddc7252f-crc-storage\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.739287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjkl\" (UniqueName: \"kubernetes.io/projected/85fee3b2-8fca-41cc-9e28-889eddc7252f-kube-api-access-lbjkl\") pod \"crc-storage-crc-khf6w\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:55 crc kubenswrapper[4707]: I0121 16:19:55.854356 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:56 crc kubenswrapper[4707]: I0121 16:19:56.240896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-khf6w"] Jan 21 16:19:57 crc kubenswrapper[4707]: I0121 16:19:57.191563 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01688b8-fc4a-40b6-b1b3-ddcda634a0dc" path="/var/lib/kubelet/pods/b01688b8-fc4a-40b6-b1b3-ddcda634a0dc/volumes" Jan 21 16:19:57 crc kubenswrapper[4707]: I0121 16:19:57.240564 4707 generic.go:334] "Generic (PLEG): container finished" podID="85fee3b2-8fca-41cc-9e28-889eddc7252f" containerID="8a781f9ff083bdf77a86357cdffbd3b1a3913bbd178af634a3b1b48ba7ca0a17" exitCode=0 Jan 21 16:19:57 crc kubenswrapper[4707]: I0121 16:19:57.240623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khf6w" event={"ID":"85fee3b2-8fca-41cc-9e28-889eddc7252f","Type":"ContainerDied","Data":"8a781f9ff083bdf77a86357cdffbd3b1a3913bbd178af634a3b1b48ba7ca0a17"} Jan 21 16:19:57 crc kubenswrapper[4707]: I0121 16:19:57.240691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khf6w" event={"ID":"85fee3b2-8fca-41cc-9e28-889eddc7252f","Type":"ContainerStarted","Data":"a231dd52035f582e16cdad531f3543e7b94dc80047ce70bf99c2b31ac7610aa4"} Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.454941 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.475293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/85fee3b2-8fca-41cc-9e28-889eddc7252f-crc-storage\") pod \"85fee3b2-8fca-41cc-9e28-889eddc7252f\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.475369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/85fee3b2-8fca-41cc-9e28-889eddc7252f-node-mnt\") pod \"85fee3b2-8fca-41cc-9e28-889eddc7252f\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.475521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbjkl\" (UniqueName: \"kubernetes.io/projected/85fee3b2-8fca-41cc-9e28-889eddc7252f-kube-api-access-lbjkl\") pod \"85fee3b2-8fca-41cc-9e28-889eddc7252f\" (UID: \"85fee3b2-8fca-41cc-9e28-889eddc7252f\") " Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.475839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85fee3b2-8fca-41cc-9e28-889eddc7252f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "85fee3b2-8fca-41cc-9e28-889eddc7252f" (UID: "85fee3b2-8fca-41cc-9e28-889eddc7252f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.476197 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/85fee3b2-8fca-41cc-9e28-889eddc7252f-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.480869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fee3b2-8fca-41cc-9e28-889eddc7252f-kube-api-access-lbjkl" (OuterVolumeSpecName: "kube-api-access-lbjkl") pod "85fee3b2-8fca-41cc-9e28-889eddc7252f" (UID: "85fee3b2-8fca-41cc-9e28-889eddc7252f"). InnerVolumeSpecName "kube-api-access-lbjkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.489850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85fee3b2-8fca-41cc-9e28-889eddc7252f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "85fee3b2-8fca-41cc-9e28-889eddc7252f" (UID: "85fee3b2-8fca-41cc-9e28-889eddc7252f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.578250 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbjkl\" (UniqueName: \"kubernetes.io/projected/85fee3b2-8fca-41cc-9e28-889eddc7252f-kube-api-access-lbjkl\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:58 crc kubenswrapper[4707]: I0121 16:19:58.578282 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/85fee3b2-8fca-41cc-9e28-889eddc7252f-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:19:59 crc kubenswrapper[4707]: I0121 16:19:59.256229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khf6w" event={"ID":"85fee3b2-8fca-41cc-9e28-889eddc7252f","Type":"ContainerDied","Data":"a231dd52035f582e16cdad531f3543e7b94dc80047ce70bf99c2b31ac7610aa4"} Jan 21 16:19:59 crc kubenswrapper[4707]: I0121 16:19:59.256314 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a231dd52035f582e16cdad531f3543e7b94dc80047ce70bf99c2b31ac7610aa4" Jan 21 16:19:59 crc kubenswrapper[4707]: I0121 16:19:59.256263 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-khf6w" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.910637 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm"] Jan 21 16:20:01 crc kubenswrapper[4707]: E0121 16:20:01.911208 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fee3b2-8fca-41cc-9e28-889eddc7252f" containerName="storage" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.911221 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fee3b2-8fca-41cc-9e28-889eddc7252f" containerName="storage" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.911347 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fee3b2-8fca-41cc-9e28-889eddc7252f" containerName="storage" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.912034 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.913535 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-ipam" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.920389 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm"] Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.924634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.924781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmtvw\" (UniqueName: \"kubernetes.io/projected/a2eba5da-9cb4-49d1-8406-b2d533c6209e-kube-api-access-qmtvw\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.924836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-config\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:01 crc kubenswrapper[4707]: I0121 16:20:01.924978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.026002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmtvw\" (UniqueName: \"kubernetes.io/projected/a2eba5da-9cb4-49d1-8406-b2d533c6209e-kube-api-access-qmtvw\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.026061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-config\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.026146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.026185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.027150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.027192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.027387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-config\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.183535 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:20:02 crc kubenswrapper[4707]: E0121 16:20:02.184283 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.249189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmtvw\" (UniqueName: \"kubernetes.io/projected/a2eba5da-9cb4-49d1-8406-b2d533c6209e-kube-api-access-qmtvw\") pod \"dnsmasq-dnsmasq-5b68c79f89-4h4tm\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.527404 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:02 crc kubenswrapper[4707]: I0121 16:20:02.922580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm"] Jan 21 16:20:03 crc kubenswrapper[4707]: I0121 16:20:03.291033 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerID="d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8" exitCode=0 Jan 21 16:20:03 crc kubenswrapper[4707]: I0121 16:20:03.291109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" event={"ID":"a2eba5da-9cb4-49d1-8406-b2d533c6209e","Type":"ContainerDied","Data":"d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8"} Jan 21 16:20:03 crc kubenswrapper[4707]: I0121 16:20:03.291154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" event={"ID":"a2eba5da-9cb4-49d1-8406-b2d533c6209e","Type":"ContainerStarted","Data":"759be508f9392e36cc37f8829a44ace95418a7a25dbdf7f89551806e077fb453"} Jan 21 16:20:04 crc kubenswrapper[4707]: I0121 16:20:04.301206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" event={"ID":"a2eba5da-9cb4-49d1-8406-b2d533c6209e","Type":"ContainerStarted","Data":"3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767"} Jan 21 16:20:04 crc kubenswrapper[4707]: I0121 16:20:04.302079 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:04 crc kubenswrapper[4707]: I0121 16:20:04.320473 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" podStartSLOduration=3.320450036 podStartE2EDuration="3.320450036s" podCreationTimestamp="2026-01-21 16:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:20:04.315702521 +0000 UTC m=+4701.497218742" watchObservedRunningTime="2026-01-21 16:20:04.320450036 +0000 UTC m=+4701.501966258" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.529006 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.571213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq"] Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.571523 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerName="dnsmasq-dns" containerID="cri-o://f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c" gracePeriod=10 Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.667443 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2"] Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.668584 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.685455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2"] Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.779410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.779721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6pdr\" (UniqueName: \"kubernetes.io/projected/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-kube-api-access-f6pdr\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.779781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-config\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.779901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.881783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6pdr\" (UniqueName: \"kubernetes.io/projected/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-kube-api-access-f6pdr\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.882294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-config\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.882429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.882657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") 
" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.883722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-config\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.884073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.884151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.901995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6pdr\" (UniqueName: \"kubernetes.io/projected/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-kube-api-access-f6pdr\") pod \"dnsmasq-dnsmasq-79667f9c49-qwdr2\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.935320 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.984005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pgjz\" (UniqueName: \"kubernetes.io/projected/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-kube-api-access-5pgjz\") pod \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.984110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-config\") pod \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.984190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-dnsmasq-svc\") pod \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\" (UID: \"d6f8b562-33b5-4553-ad14-4ad5230c3ac5\") " Jan 21 16:20:12 crc kubenswrapper[4707]: I0121 16:20:12.988643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-kube-api-access-5pgjz" (OuterVolumeSpecName: "kube-api-access-5pgjz") pod "d6f8b562-33b5-4553-ad14-4ad5230c3ac5" (UID: "d6f8b562-33b5-4553-ad14-4ad5230c3ac5"). InnerVolumeSpecName "kube-api-access-5pgjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.005310 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.018910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-config" (OuterVolumeSpecName: "config") pod "d6f8b562-33b5-4553-ad14-4ad5230c3ac5" (UID: "d6f8b562-33b5-4553-ad14-4ad5230c3ac5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.019317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d6f8b562-33b5-4553-ad14-4ad5230c3ac5" (UID: "d6f8b562-33b5-4553-ad14-4ad5230c3ac5"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.085337 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pgjz\" (UniqueName: \"kubernetes.io/projected/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-kube-api-access-5pgjz\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.085366 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.085378 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d6f8b562-33b5-4553-ad14-4ad5230c3ac5-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.372304 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerID="f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c" exitCode=0 Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.372361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" event={"ID":"d6f8b562-33b5-4553-ad14-4ad5230c3ac5","Type":"ContainerDied","Data":"f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c"} Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.372374 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.372398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq" event={"ID":"d6f8b562-33b5-4553-ad14-4ad5230c3ac5","Type":"ContainerDied","Data":"80df44a86aad1969dcde255c006876df40ad2b6b9d241593772c25cd9db6f05d"} Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.372419 4707 scope.go:117] "RemoveContainer" containerID="f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.395652 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq"] Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.398279 4707 scope.go:117] "RemoveContainer" containerID="045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.398459 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-jdlhq"] Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.408291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2"] Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.417625 4707 scope.go:117] "RemoveContainer" containerID="f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c" Jan 21 16:20:13 crc kubenswrapper[4707]: E0121 16:20:13.418180 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c\": container with ID starting with f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c not found: ID does not exist" containerID="f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.418220 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c"} err="failed to get container status \"f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c\": rpc error: code = NotFound desc = could not find container \"f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c\": container with ID starting with f7540dfe4cac42b65545962f49fbe432e206818a3d6b310cdede872b65f3650c not found: ID does not exist" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.418251 4707 scope.go:117] "RemoveContainer" containerID="045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159" Jan 21 16:20:13 crc kubenswrapper[4707]: E0121 16:20:13.418592 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159\": container with ID starting with 045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159 not found: ID does not exist" containerID="045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159" Jan 21 16:20:13 crc kubenswrapper[4707]: I0121 16:20:13.418617 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159"} err="failed to get container status \"045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159\": rpc error: code = 
NotFound desc = could not find container \"045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159\": container with ID starting with 045575e95ac234bb03baf9c902ffe0ae7ade223eba199a772657f7d7cf51c159 not found: ID does not exist" Jan 21 16:20:13 crc kubenswrapper[4707]: W0121 16:20:13.422179 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2fb26e1_aab5_49d4_904d_34ee7b1e5c74.slice/crio-4025096ae7e4593f6f5cf0496a193074a9d50f5fca0a6b60bf08fd6693b9c5a2 WatchSource:0}: Error finding container 4025096ae7e4593f6f5cf0496a193074a9d50f5fca0a6b60bf08fd6693b9c5a2: Status 404 returned error can't find the container with id 4025096ae7e4593f6f5cf0496a193074a9d50f5fca0a6b60bf08fd6693b9c5a2 Jan 21 16:20:14 crc kubenswrapper[4707]: I0121 16:20:14.385002 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerID="0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d" exitCode=0 Jan 21 16:20:14 crc kubenswrapper[4707]: I0121 16:20:14.385056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" event={"ID":"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74","Type":"ContainerDied","Data":"0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d"} Jan 21 16:20:14 crc kubenswrapper[4707]: I0121 16:20:14.385361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" event={"ID":"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74","Type":"ContainerStarted","Data":"4025096ae7e4593f6f5cf0496a193074a9d50f5fca0a6b60bf08fd6693b9c5a2"} Jan 21 16:20:15 crc kubenswrapper[4707]: I0121 16:20:15.192326 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" path="/var/lib/kubelet/pods/d6f8b562-33b5-4553-ad14-4ad5230c3ac5/volumes" Jan 21 16:20:15 crc kubenswrapper[4707]: I0121 16:20:15.399543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" event={"ID":"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74","Type":"ContainerStarted","Data":"21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a"} Jan 21 16:20:15 crc kubenswrapper[4707]: I0121 16:20:15.400282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:15 crc kubenswrapper[4707]: I0121 16:20:15.420093 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" podStartSLOduration=3.420074011 podStartE2EDuration="3.420074011s" podCreationTimestamp="2026-01-21 16:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:20:15.414047129 +0000 UTC m=+4712.595563351" watchObservedRunningTime="2026-01-21 16:20:15.420074011 +0000 UTC m=+4712.601590233" Jan 21 16:20:16 crc kubenswrapper[4707]: I0121 16:20:16.183025 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:20:16 crc kubenswrapper[4707]: E0121 16:20:16.183655 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:20:20 crc kubenswrapper[4707]: I0121 16:20:20.281897 4707 scope.go:117] "RemoveContainer" containerID="3a4490cc8084048a122cc3dc772edf3f70456b2316b556413b3953de47b4a3f8" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.007007 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.051265 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm"] Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.051519 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerName="dnsmasq-dns" containerID="cri-o://3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767" gracePeriod=10 Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.419676 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.472244 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerID="3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767" exitCode=0 Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.472287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" event={"ID":"a2eba5da-9cb4-49d1-8406-b2d533c6209e","Type":"ContainerDied","Data":"3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767"} Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.472312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" event={"ID":"a2eba5da-9cb4-49d1-8406-b2d533c6209e","Type":"ContainerDied","Data":"759be508f9392e36cc37f8829a44ace95418a7a25dbdf7f89551806e077fb453"} Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.472312 4707 util.go:48] "No ready sandbox for pod can be found. 
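
The entries above show a pattern that recurs through these dnsmasq rollovers: the kubelet issues RemoveContainer for the deleted pod's containers, and the follow-up ContainerStatus query to the runtime comes back NotFound because CRI-O has already pruned them, so the logged DeleteContainer error is informational rather than a real failure. A minimal Python sketch, assuming the journal text is piped in on stdin and with regexes inferred from the entries in this dump, that lists the container IDs affected by that race:

    import re
    import sys

    # The kubelet logs the removal request in scope.go and, when CRI-O has
    # already pruned the container, a NotFound reply from the runtime service.
    REMOVE_RE = re.compile(r'scope\.go:\d+\] "RemoveContainer" containerID="([0-9a-f]{64})"')
    NOTFOUND_RE = re.compile(r'could not find container \\?"([0-9a-f]{64})\\?"')

    def removal_races(journal_text):
        """Return container IDs whose RemoveContainer call found them already gone."""
        requested = set(REMOVE_RE.findall(journal_text))
        missing = set(NOTFOUND_RE.findall(journal_text))
        return sorted(requested & missing)

    if __name__ == "__main__":
        for cid in removal_races(sys.stdin.read()):
            print("already pruned when RemoveContainer ran:", cid[:12])

Run against the text above it would report, for example, the f7540dfe... and 045575e9... containers of the jdlhq pod, which is consistent with the NotFound errors logged for them.
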
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.472329 4707 scope.go:117] "RemoveContainer" containerID="3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.488978 4707 scope.go:117] "RemoveContainer" containerID="d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.503858 4707 scope.go:117] "RemoveContainer" containerID="3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767" Jan 21 16:20:23 crc kubenswrapper[4707]: E0121 16:20:23.504217 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767\": container with ID starting with 3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767 not found: ID does not exist" containerID="3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.504259 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767"} err="failed to get container status \"3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767\": rpc error: code = NotFound desc = could not find container \"3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767\": container with ID starting with 3a42bd7a261cdea78200038924c7b36ee8b0e9034fb4732c247bf1bc43a58767 not found: ID does not exist" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.504287 4707 scope.go:117] "RemoveContainer" containerID="d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8" Jan 21 16:20:23 crc kubenswrapper[4707]: E0121 16:20:23.505192 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8\": container with ID starting with d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8 not found: ID does not exist" containerID="d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.505224 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8"} err="failed to get container status \"d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8\": rpc error: code = NotFound desc = could not find container \"d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8\": container with ID starting with d3232649113a266694d567320d05d8cb300ebdd2e778b23cf57f39c8b5d074c8 not found: ID does not exist" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.556608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-dnsmasq-svc\") pod \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.556750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-config\") pod \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\" 
(UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.556975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-openstack-edpm-ipam\") pod \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.557201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmtvw\" (UniqueName: \"kubernetes.io/projected/a2eba5da-9cb4-49d1-8406-b2d533c6209e-kube-api-access-qmtvw\") pod \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\" (UID: \"a2eba5da-9cb4-49d1-8406-b2d533c6209e\") " Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.563484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2eba5da-9cb4-49d1-8406-b2d533c6209e-kube-api-access-qmtvw" (OuterVolumeSpecName: "kube-api-access-qmtvw") pod "a2eba5da-9cb4-49d1-8406-b2d533c6209e" (UID: "a2eba5da-9cb4-49d1-8406-b2d533c6209e"). InnerVolumeSpecName "kube-api-access-qmtvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.604181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a2eba5da-9cb4-49d1-8406-b2d533c6209e" (UID: "a2eba5da-9cb4-49d1-8406-b2d533c6209e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.611098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "a2eba5da-9cb4-49d1-8406-b2d533c6209e" (UID: "a2eba5da-9cb4-49d1-8406-b2d533c6209e"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.619494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-config" (OuterVolumeSpecName: "config") pod "a2eba5da-9cb4-49d1-8406-b2d533c6209e" (UID: "a2eba5da-9cb4-49d1-8406-b2d533c6209e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.620921 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn"] Jan 21 16:20:23 crc kubenswrapper[4707]: E0121 16:20:23.621251 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerName="init" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.621270 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerName="init" Jan 21 16:20:23 crc kubenswrapper[4707]: E0121 16:20:23.621299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerName="dnsmasq-dns" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.621305 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerName="dnsmasq-dns" Jan 21 16:20:23 crc kubenswrapper[4707]: E0121 16:20:23.621314 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerName="init" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.621320 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerName="init" Jan 21 16:20:23 crc kubenswrapper[4707]: E0121 16:20:23.621331 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerName="dnsmasq-dns" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.621337 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerName="dnsmasq-dns" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.621490 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f8b562-33b5-4553-ad14-4ad5230c3ac5" containerName="dnsmasq-dns" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.621515 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" containerName="dnsmasq-dns" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.622230 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.633405 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn"] Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45lg4\" (UniqueName: \"kubernetes.io/projected/ce835f7c-9b6d-4700-894c-531c98d0a74e-kube-api-access-45lg4\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660793 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660853 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660869 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmtvw\" (UniqueName: \"kubernetes.io/projected/a2eba5da-9cb4-49d1-8406-b2d533c6209e-kube-api-access-qmtvw\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.660878 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2eba5da-9cb4-49d1-8406-b2d533c6209e-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.761631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.761676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45lg4\" (UniqueName: \"kubernetes.io/projected/ce835f7c-9b6d-4700-894c-531c98d0a74e-kube-api-access-45lg4\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.761737 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.762603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.762766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.776381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45lg4\" (UniqueName: \"kubernetes.io/projected/ce835f7c-9b6d-4700-894c-531c98d0a74e-kube-api-access-45lg4\") pod \"dnsmasq-dnsmasq-84b9f45d47-mtwcn\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.798147 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm"] Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.803679 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-4h4tm"] Jan 21 16:20:23 crc kubenswrapper[4707]: I0121 16:20:23.940439 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:24 crc kubenswrapper[4707]: I0121 16:20:24.317148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn"] Jan 21 16:20:24 crc kubenswrapper[4707]: I0121 16:20:24.486014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" event={"ID":"ce835f7c-9b6d-4700-894c-531c98d0a74e","Type":"ContainerStarted","Data":"f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd"} Jan 21 16:20:24 crc kubenswrapper[4707]: I0121 16:20:24.486071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" event={"ID":"ce835f7c-9b6d-4700-894c-531c98d0a74e","Type":"ContainerStarted","Data":"c491c896239b11fe1726baadd6a3810ac1d7b29ec42d451fdff78bd5158a40dc"} Jan 21 16:20:25 crc kubenswrapper[4707]: I0121 16:20:25.196516 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2eba5da-9cb4-49d1-8406-b2d533c6209e" path="/var/lib/kubelet/pods/a2eba5da-9cb4-49d1-8406-b2d533c6209e/volumes" Jan 21 16:20:25 crc kubenswrapper[4707]: I0121 16:20:25.506358 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerID="f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd" exitCode=0 Jan 21 16:20:25 crc kubenswrapper[4707]: I0121 16:20:25.506423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" event={"ID":"ce835f7c-9b6d-4700-894c-531c98d0a74e","Type":"ContainerDied","Data":"f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd"} Jan 21 16:20:26 crc kubenswrapper[4707]: I0121 16:20:26.515700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" event={"ID":"ce835f7c-9b6d-4700-894c-531c98d0a74e","Type":"ContainerStarted","Data":"480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f"} Jan 21 16:20:26 crc kubenswrapper[4707]: I0121 16:20:26.516079 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:26 crc kubenswrapper[4707]: I0121 16:20:26.537673 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" podStartSLOduration=3.537649063 podStartE2EDuration="3.537649063s" podCreationTimestamp="2026-01-21 16:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:20:26.532311546 +0000 UTC m=+4723.713827769" watchObservedRunningTime="2026-01-21 16:20:26.537649063 +0000 UTC m=+4723.719165285" Jan 21 16:20:28 crc kubenswrapper[4707]: I0121 16:20:28.182636 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:20:28 crc kubenswrapper[4707]: E0121 16:20:28.182993 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 
16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.772256 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-khf6w"] Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.776007 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-khf6w"] Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.896675 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bq7gs"] Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.898035 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.899952 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.900314 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.901311 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.901559 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.906189 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bq7gs"] Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.959911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwpk9\" (UniqueName: \"kubernetes.io/projected/196f8a03-4521-41cc-9f98-abad610d414e-kube-api-access-mwpk9\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.959978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/196f8a03-4521-41cc-9f98-abad610d414e-node-mnt\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:29 crc kubenswrapper[4707]: I0121 16:20:29.960186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/196f8a03-4521-41cc-9f98-abad610d414e-crc-storage\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.060951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwpk9\" (UniqueName: \"kubernetes.io/projected/196f8a03-4521-41cc-9f98-abad610d414e-kube-api-access-mwpk9\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.061026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/196f8a03-4521-41cc-9f98-abad610d414e-node-mnt\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.061072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/196f8a03-4521-41cc-9f98-abad610d414e-crc-storage\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.061331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/196f8a03-4521-41cc-9f98-abad610d414e-node-mnt\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.061829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/196f8a03-4521-41cc-9f98-abad610d414e-crc-storage\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.149126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwpk9\" (UniqueName: \"kubernetes.io/projected/196f8a03-4521-41cc-9f98-abad610d414e-kube-api-access-mwpk9\") pod \"crc-storage-crc-bq7gs\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.214051 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:30 crc kubenswrapper[4707]: I0121 16:20:30.616691 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bq7gs"] Jan 21 16:20:30 crc kubenswrapper[4707]: W0121 16:20:30.621745 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod196f8a03_4521_41cc_9f98_abad610d414e.slice/crio-d9166ec637906664c0caa31f654bbbcd084c264c631e06ea11de19c631f9eb39 WatchSource:0}: Error finding container d9166ec637906664c0caa31f654bbbcd084c264c631e06ea11de19c631f9eb39: Status 404 returned error can't find the container with id d9166ec637906664c0caa31f654bbbcd084c264c631e06ea11de19c631f9eb39 Jan 21 16:20:31 crc kubenswrapper[4707]: I0121 16:20:31.200855 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85fee3b2-8fca-41cc-9e28-889eddc7252f" path="/var/lib/kubelet/pods/85fee3b2-8fca-41cc-9e28-889eddc7252f/volumes" Jan 21 16:20:31 crc kubenswrapper[4707]: I0121 16:20:31.553491 4707 generic.go:334] "Generic (PLEG): container finished" podID="196f8a03-4521-41cc-9f98-abad610d414e" containerID="59f2960cb3336ee1780116af90c9ac84bf7b16f6fdd099fb23c012229bf2b205" exitCode=0 Jan 21 16:20:31 crc kubenswrapper[4707]: I0121 16:20:31.553570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bq7gs" event={"ID":"196f8a03-4521-41cc-9f98-abad610d414e","Type":"ContainerDied","Data":"59f2960cb3336ee1780116af90c9ac84bf7b16f6fdd099fb23c012229bf2b205"} Jan 21 16:20:31 crc kubenswrapper[4707]: I0121 16:20:31.553669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bq7gs" event={"ID":"196f8a03-4521-41cc-9f98-abad610d414e","Type":"ContainerStarted","Data":"d9166ec637906664c0caa31f654bbbcd084c264c631e06ea11de19c631f9eb39"} Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.791475 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.808147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwpk9\" (UniqueName: \"kubernetes.io/projected/196f8a03-4521-41cc-9f98-abad610d414e-kube-api-access-mwpk9\") pod \"196f8a03-4521-41cc-9f98-abad610d414e\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.808410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/196f8a03-4521-41cc-9f98-abad610d414e-node-mnt\") pod \"196f8a03-4521-41cc-9f98-abad610d414e\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.808474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/196f8a03-4521-41cc-9f98-abad610d414e-crc-storage\") pod \"196f8a03-4521-41cc-9f98-abad610d414e\" (UID: \"196f8a03-4521-41cc-9f98-abad610d414e\") " Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.808577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/196f8a03-4521-41cc-9f98-abad610d414e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "196f8a03-4521-41cc-9f98-abad610d414e" (UID: "196f8a03-4521-41cc-9f98-abad610d414e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.808844 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/196f8a03-4521-41cc-9f98-abad610d414e-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.814019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196f8a03-4521-41cc-9f98-abad610d414e-kube-api-access-mwpk9" (OuterVolumeSpecName: "kube-api-access-mwpk9") pod "196f8a03-4521-41cc-9f98-abad610d414e" (UID: "196f8a03-4521-41cc-9f98-abad610d414e"). InnerVolumeSpecName "kube-api-access-mwpk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.827287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/196f8a03-4521-41cc-9f98-abad610d414e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "196f8a03-4521-41cc-9f98-abad610d414e" (UID: "196f8a03-4521-41cc-9f98-abad610d414e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.909610 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwpk9\" (UniqueName: \"kubernetes.io/projected/196f8a03-4521-41cc-9f98-abad610d414e-kube-api-access-mwpk9\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:32 crc kubenswrapper[4707]: I0121 16:20:32.909652 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/196f8a03-4521-41cc-9f98-abad610d414e-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:33 crc kubenswrapper[4707]: I0121 16:20:33.569585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bq7gs" event={"ID":"196f8a03-4521-41cc-9f98-abad610d414e","Type":"ContainerDied","Data":"d9166ec637906664c0caa31f654bbbcd084c264c631e06ea11de19c631f9eb39"} Jan 21 16:20:33 crc kubenswrapper[4707]: I0121 16:20:33.569630 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9166ec637906664c0caa31f654bbbcd084c264c631e06ea11de19c631f9eb39" Jan 21 16:20:33 crc kubenswrapper[4707]: I0121 16:20:33.569642 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bq7gs" Jan 21 16:20:33 crc kubenswrapper[4707]: I0121 16:20:33.942010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:33 crc kubenswrapper[4707]: I0121 16:20:33.985841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2"] Jan 21 16:20:33 crc kubenswrapper[4707]: I0121 16:20:33.986098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerName="dnsmasq-dns" containerID="cri-o://21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a" gracePeriod=10 Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.352681 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.434907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-dnsmasq-svc\") pod \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.435015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6pdr\" (UniqueName: \"kubernetes.io/projected/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-kube-api-access-f6pdr\") pod \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.435078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-config\") pod \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.435194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-openstack-edpm-ipam\") pod \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\" (UID: \"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74\") " Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.440585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-kube-api-access-f6pdr" (OuterVolumeSpecName: "kube-api-access-f6pdr") pod "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" (UID: "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74"). InnerVolumeSpecName "kube-api-access-f6pdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.464440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-config" (OuterVolumeSpecName: "config") pod "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" (UID: "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.464631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" (UID: "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.469896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" (UID: "c2fb26e1-aab5-49d4-904d-34ee7b1e5c74"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.537071 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.537284 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6pdr\" (UniqueName: \"kubernetes.io/projected/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-kube-api-access-f6pdr\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.537361 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.537424 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.580991 4707 generic.go:334] "Generic (PLEG): container finished" podID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerID="21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a" exitCode=0 Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.581057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" event={"ID":"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74","Type":"ContainerDied","Data":"21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a"} Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.581106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" event={"ID":"c2fb26e1-aab5-49d4-904d-34ee7b1e5c74","Type":"ContainerDied","Data":"4025096ae7e4593f6f5cf0496a193074a9d50f5fca0a6b60bf08fd6693b9c5a2"} Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.581114 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.581130 4707 scope.go:117] "RemoveContainer" containerID="21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.602757 4707 scope.go:117] "RemoveContainer" containerID="0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.609600 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2"] Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.613799 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-qwdr2"] Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.622265 4707 scope.go:117] "RemoveContainer" containerID="21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a" Jan 21 16:20:34 crc kubenswrapper[4707]: E0121 16:20:34.622613 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a\": container with ID starting with 21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a not found: ID does not exist" containerID="21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.622654 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a"} err="failed to get container status \"21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a\": rpc error: code = NotFound desc = could not find container \"21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a\": container with ID starting with 21cfbe86e7dbbf57f04b279668d0a0755b92cec056e8bfb49ada325c6a2c810a not found: ID does not exist" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.622680 4707 scope.go:117] "RemoveContainer" containerID="0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d" Jan 21 16:20:34 crc kubenswrapper[4707]: E0121 16:20:34.624024 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d\": container with ID starting with 0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d not found: ID does not exist" containerID="0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d" Jan 21 16:20:34 crc kubenswrapper[4707]: I0121 16:20:34.624088 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d"} err="failed to get container status \"0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d\": rpc error: code = NotFound desc = could not find container \"0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d\": container with ID starting with 0aefd4fa47493c4bb77c59e81986ab18de74b03b3e02726d52ed0a9b91b0db0d not found: ID does not exist" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.190217 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" path="/var/lib/kubelet/pods/c2fb26e1-aab5-49d4-904d-34ee7b1e5c74/volumes" Jan 21 16:20:35 crc 
kubenswrapper[4707]: I0121 16:20:35.747181 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bq7gs"] Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.751380 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bq7gs"] Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.852133 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7kt6t"] Jan 21 16:20:35 crc kubenswrapper[4707]: E0121 16:20:35.852439 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerName="init" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.852457 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerName="init" Jan 21 16:20:35 crc kubenswrapper[4707]: E0121 16:20:35.852476 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerName="dnsmasq-dns" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.852482 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerName="dnsmasq-dns" Jan 21 16:20:35 crc kubenswrapper[4707]: E0121 16:20:35.852502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196f8a03-4521-41cc-9f98-abad610d414e" containerName="storage" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.852508 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="196f8a03-4521-41cc-9f98-abad610d414e" containerName="storage" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.852616 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="196f8a03-4521-41cc-9f98-abad610d414e" containerName="storage" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.852634 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fb26e1-aab5-49d4-904d-34ee7b1e5c74" containerName="dnsmasq-dns" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.853130 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.855078 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.855078 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.855181 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.855890 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.856946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqld5\" (UniqueName: \"kubernetes.io/projected/c9faafb8-ddc7-4892-8b73-5f4353988f41-kube-api-access-nqld5\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.856985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9faafb8-ddc7-4892-8b73-5f4353988f41-crc-storage\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.857009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9faafb8-ddc7-4892-8b73-5f4353988f41-node-mnt\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.859344 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7kt6t"] Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.957714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqld5\" (UniqueName: \"kubernetes.io/projected/c9faafb8-ddc7-4892-8b73-5f4353988f41-kube-api-access-nqld5\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.957766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9faafb8-ddc7-4892-8b73-5f4353988f41-crc-storage\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.957790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9faafb8-ddc7-4892-8b73-5f4353988f41-node-mnt\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.958047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9faafb8-ddc7-4892-8b73-5f4353988f41-node-mnt\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " 
pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.958757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9faafb8-ddc7-4892-8b73-5f4353988f41-crc-storage\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:35 crc kubenswrapper[4707]: I0121 16:20:35.973895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqld5\" (UniqueName: \"kubernetes.io/projected/c9faafb8-ddc7-4892-8b73-5f4353988f41-kube-api-access-nqld5\") pod \"crc-storage-crc-7kt6t\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:36 crc kubenswrapper[4707]: I0121 16:20:36.166054 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:36 crc kubenswrapper[4707]: I0121 16:20:36.554754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7kt6t"] Jan 21 16:20:36 crc kubenswrapper[4707]: I0121 16:20:36.594483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7kt6t" event={"ID":"c9faafb8-ddc7-4892-8b73-5f4353988f41","Type":"ContainerStarted","Data":"aa8bcddc8ef8ae7c62cb2ccab5f942bdf02429f5c213707b6b458502f405104e"} Jan 21 16:20:37 crc kubenswrapper[4707]: I0121 16:20:37.190015 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196f8a03-4521-41cc-9f98-abad610d414e" path="/var/lib/kubelet/pods/196f8a03-4521-41cc-9f98-abad610d414e/volumes" Jan 21 16:20:37 crc kubenswrapper[4707]: I0121 16:20:37.602299 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9faafb8-ddc7-4892-8b73-5f4353988f41" containerID="c40f4856db69864ca0de8dff247e67dbbafc91072cd055579f2f59cb90b0f9f4" exitCode=0 Jan 21 16:20:37 crc kubenswrapper[4707]: I0121 16:20:37.602359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7kt6t" event={"ID":"c9faafb8-ddc7-4892-8b73-5f4353988f41","Type":"ContainerDied","Data":"c40f4856db69864ca0de8dff247e67dbbafc91072cd055579f2f59cb90b0f9f4"} Jan 21 16:20:38 crc kubenswrapper[4707]: I0121 16:20:38.841671 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:38 crc kubenswrapper[4707]: I0121 16:20:38.997099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9faafb8-ddc7-4892-8b73-5f4353988f41-node-mnt\") pod \"c9faafb8-ddc7-4892-8b73-5f4353988f41\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " Jan 21 16:20:38 crc kubenswrapper[4707]: I0121 16:20:38.997184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9faafb8-ddc7-4892-8b73-5f4353988f41-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c9faafb8-ddc7-4892-8b73-5f4353988f41" (UID: "c9faafb8-ddc7-4892-8b73-5f4353988f41"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:20:38 crc kubenswrapper[4707]: I0121 16:20:38.997208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqld5\" (UniqueName: \"kubernetes.io/projected/c9faafb8-ddc7-4892-8b73-5f4353988f41-kube-api-access-nqld5\") pod \"c9faafb8-ddc7-4892-8b73-5f4353988f41\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " Jan 21 16:20:38 crc kubenswrapper[4707]: I0121 16:20:38.997271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9faafb8-ddc7-4892-8b73-5f4353988f41-crc-storage\") pod \"c9faafb8-ddc7-4892-8b73-5f4353988f41\" (UID: \"c9faafb8-ddc7-4892-8b73-5f4353988f41\") " Jan 21 16:20:38 crc kubenswrapper[4707]: I0121 16:20:38.997598 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9faafb8-ddc7-4892-8b73-5f4353988f41-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.147239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9faafb8-ddc7-4892-8b73-5f4353988f41-kube-api-access-nqld5" (OuterVolumeSpecName: "kube-api-access-nqld5") pod "c9faafb8-ddc7-4892-8b73-5f4353988f41" (UID: "c9faafb8-ddc7-4892-8b73-5f4353988f41"). InnerVolumeSpecName "kube-api-access-nqld5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.158732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9faafb8-ddc7-4892-8b73-5f4353988f41-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c9faafb8-ddc7-4892-8b73-5f4353988f41" (UID: "c9faafb8-ddc7-4892-8b73-5f4353988f41"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.199744 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqld5\" (UniqueName: \"kubernetes.io/projected/c9faafb8-ddc7-4892-8b73-5f4353988f41-kube-api-access-nqld5\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.199768 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9faafb8-ddc7-4892-8b73-5f4353988f41-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.617332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7kt6t" event={"ID":"c9faafb8-ddc7-4892-8b73-5f4353988f41","Type":"ContainerDied","Data":"aa8bcddc8ef8ae7c62cb2ccab5f942bdf02429f5c213707b6b458502f405104e"} Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.617373 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa8bcddc8ef8ae7c62cb2ccab5f942bdf02429f5c213707b6b458502f405104e" Jan 21 16:20:39 crc kubenswrapper[4707]: I0121 16:20:39.617384 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7kt6t" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.182930 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:20:41 crc kubenswrapper[4707]: E0121 16:20:41.183435 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.891929 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb"] Jan 21 16:20:41 crc kubenswrapper[4707]: E0121 16:20:41.892221 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9faafb8-ddc7-4892-8b73-5f4353988f41" containerName="storage" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.892234 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9faafb8-ddc7-4892-8b73-5f4353988f41" containerName="storage" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.892395 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9faafb8-ddc7-4892-8b73-5f4353988f41" containerName="storage" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.893026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.894473 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-global" Jan 21 16:20:41 crc kubenswrapper[4707]: I0121 16:20:41.906518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb"] Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.038701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-config\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.039269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mcnm\" (UniqueName: \"kubernetes.io/projected/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-kube-api-access-9mcnm\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.039524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.039867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-global\" (UniqueName: 
\"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.141380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mcnm\" (UniqueName: \"kubernetes.io/projected/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-kube-api-access-9mcnm\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.141473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.141621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.142335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.142378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.142464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-config\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.143057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-config\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.156500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mcnm\" (UniqueName: \"kubernetes.io/projected/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-kube-api-access-9mcnm\") pod \"dnsmasq-dnsmasq-7d78464677-rvvqb\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.206799 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.620064 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb"] Jan 21 16:20:42 crc kubenswrapper[4707]: I0121 16:20:42.643886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" event={"ID":"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3","Type":"ContainerStarted","Data":"c7ff6ba43dcc6e193fa942b86d8782e659008bf9cfc09eba347b1014ae41a1e5"} Jan 21 16:20:43 crc kubenswrapper[4707]: I0121 16:20:43.657147 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerID="b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4" exitCode=0 Jan 21 16:20:43 crc kubenswrapper[4707]: I0121 16:20:43.657282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" event={"ID":"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3","Type":"ContainerDied","Data":"b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4"} Jan 21 16:20:44 crc kubenswrapper[4707]: I0121 16:20:44.666023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" event={"ID":"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3","Type":"ContainerStarted","Data":"897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60"} Jan 21 16:20:44 crc kubenswrapper[4707]: I0121 16:20:44.666183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:44 crc kubenswrapper[4707]: I0121 16:20:44.684085 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" podStartSLOduration=3.6840591050000002 podStartE2EDuration="3.684059105s" podCreationTimestamp="2026-01-21 16:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:20:44.678930571 +0000 UTC m=+4741.860446793" watchObservedRunningTime="2026-01-21 16:20:44.684059105 +0000 UTC m=+4741.865575327" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.208019 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.243383 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn"] Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.243599 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerName="dnsmasq-dns" containerID="cri-o://480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f" gracePeriod=10 Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.580314 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.672578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-config\") pod \"ce835f7c-9b6d-4700-894c-531c98d0a74e\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.672639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45lg4\" (UniqueName: \"kubernetes.io/projected/ce835f7c-9b6d-4700-894c-531c98d0a74e-kube-api-access-45lg4\") pod \"ce835f7c-9b6d-4700-894c-531c98d0a74e\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.672665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-dnsmasq-svc\") pod \"ce835f7c-9b6d-4700-894c-531c98d0a74e\" (UID: \"ce835f7c-9b6d-4700-894c-531c98d0a74e\") " Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.677083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce835f7c-9b6d-4700-894c-531c98d0a74e-kube-api-access-45lg4" (OuterVolumeSpecName: "kube-api-access-45lg4") pod "ce835f7c-9b6d-4700-894c-531c98d0a74e" (UID: "ce835f7c-9b6d-4700-894c-531c98d0a74e"). InnerVolumeSpecName "kube-api-access-45lg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.699414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-config" (OuterVolumeSpecName: "config") pod "ce835f7c-9b6d-4700-894c-531c98d0a74e" (UID: "ce835f7c-9b6d-4700-894c-531c98d0a74e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.701429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "ce835f7c-9b6d-4700-894c-531c98d0a74e" (UID: "ce835f7c-9b6d-4700-894c-531c98d0a74e"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.719717 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerID="480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f" exitCode=0 Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.719770 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.719790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" event={"ID":"ce835f7c-9b6d-4700-894c-531c98d0a74e","Type":"ContainerDied","Data":"480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f"} Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.720842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn" event={"ID":"ce835f7c-9b6d-4700-894c-531c98d0a74e","Type":"ContainerDied","Data":"c491c896239b11fe1726baadd6a3810ac1d7b29ec42d451fdff78bd5158a40dc"} Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.720960 4707 scope.go:117] "RemoveContainer" containerID="480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.743222 4707 scope.go:117] "RemoveContainer" containerID="f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.745752 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn"] Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.750689 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-mtwcn"] Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.774850 4707 scope.go:117] "RemoveContainer" containerID="480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f" Jan 21 16:20:52 crc kubenswrapper[4707]: E0121 16:20:52.776083 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f\": container with ID starting with 480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f not found: ID does not exist" containerID="480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.776137 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f"} err="failed to get container status \"480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f\": rpc error: code = NotFound desc = could not find container \"480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f\": container with ID starting with 480d386d5110bf5b0fcc3d2e2e208e404de2e1391bf0a65e29511bb3c0737f8f not found: ID does not exist" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.776177 4707 scope.go:117] "RemoveContainer" containerID="f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.776428 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.776456 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45lg4\" (UniqueName: \"kubernetes.io/projected/ce835f7c-9b6d-4700-894c-531c98d0a74e-kube-api-access-45lg4\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.776469 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" 
(UniqueName: \"kubernetes.io/configmap/ce835f7c-9b6d-4700-894c-531c98d0a74e-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:20:52 crc kubenswrapper[4707]: E0121 16:20:52.776901 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd\": container with ID starting with f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd not found: ID does not exist" containerID="f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.776936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd"} err="failed to get container status \"f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd\": rpc error: code = NotFound desc = could not find container \"f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd\": container with ID starting with f84851a13c097d87094d02f7af2367b36ec655042ce510674ac16d5b519683dd not found: ID does not exist" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.907100 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k"] Jan 21 16:20:52 crc kubenswrapper[4707]: E0121 16:20:52.907397 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerName="dnsmasq-dns" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.907415 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerName="dnsmasq-dns" Jan 21 16:20:52 crc kubenswrapper[4707]: E0121 16:20:52.907428 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerName="init" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.907434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerName="init" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.907562 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" containerName="dnsmasq-dns" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.907986 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.910031 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.910220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.910355 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.910500 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.919445 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k"] Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.977960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-inventory\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.978006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:52 crc kubenswrapper[4707]: I0121 16:20:52.978046 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmlj5\" (UniqueName: \"kubernetes.io/projected/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-kube-api-access-fmlj5\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.079334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-inventory\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.079378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 
16:20:53.079425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmlj5\" (UniqueName: \"kubernetes.io/projected/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-kube-api-access-fmlj5\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.082198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.082213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-inventory\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.098607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmlj5\" (UniqueName: \"kubernetes.io/projected/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-kube-api-access-fmlj5\") pod \"download-cache-edpm-compute-global-edpm-compute-global-vxq5k\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.189300 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce835f7c-9b6d-4700-894c-531c98d0a74e" path="/var/lib/kubelet/pods/ce835f7c-9b6d-4700-894c-531c98d0a74e/volumes" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.220003 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.585038 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k"] Jan 21 16:20:53 crc kubenswrapper[4707]: W0121 16:20:53.589509 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ceb86b_bb52_4402_a1ec_aea7a5ebf2ea.slice/crio-c849c2a7958d7111d2e7db3059595d790e1dc35b8ca24e08ebffd8643af91e56 WatchSource:0}: Error finding container c849c2a7958d7111d2e7db3059595d790e1dc35b8ca24e08ebffd8643af91e56: Status 404 returned error can't find the container with id c849c2a7958d7111d2e7db3059595d790e1dc35b8ca24e08ebffd8643af91e56 Jan 21 16:20:53 crc kubenswrapper[4707]: I0121 16:20:53.728012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" event={"ID":"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea","Type":"ContainerStarted","Data":"c849c2a7958d7111d2e7db3059595d790e1dc35b8ca24e08ebffd8643af91e56"} Jan 21 16:20:56 crc kubenswrapper[4707]: I0121 16:20:56.183069 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:20:56 crc kubenswrapper[4707]: E0121 16:20:56.183378 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:21:04 crc kubenswrapper[4707]: I0121 16:21:04.808462 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" containerID="c8f134230d517327fd10f894c45ae7587f2dab72a8d3a15458bf6e7c45d0502d" exitCode=0 Jan 21 16:21:04 crc kubenswrapper[4707]: I0121 16:21:04.808615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" event={"ID":"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea","Type":"ContainerDied","Data":"c8f134230d517327fd10f894c45ae7587f2dab72a8d3a15458bf6e7c45d0502d"} Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.038167 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.235535 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmlj5\" (UniqueName: \"kubernetes.io/projected/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-kube-api-access-fmlj5\") pod \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.236038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-ssh-key-edpm-compute-global\") pod \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.236108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-inventory\") pod \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\" (UID: \"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea\") " Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.446973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-kube-api-access-fmlj5" (OuterVolumeSpecName: "kube-api-access-fmlj5") pod "a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" (UID: "a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea"). InnerVolumeSpecName "kube-api-access-fmlj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.540449 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmlj5\" (UniqueName: \"kubernetes.io/projected/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-kube-api-access-fmlj5\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.559791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" (UID: "a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.559986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-inventory" (OuterVolumeSpecName: "inventory") pod "a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" (UID: "a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.641383 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.641414 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.828358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" event={"ID":"a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea","Type":"ContainerDied","Data":"c849c2a7958d7111d2e7db3059595d790e1dc35b8ca24e08ebffd8643af91e56"} Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.828397 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c849c2a7958d7111d2e7db3059595d790e1dc35b8ca24e08ebffd8643af91e56" Jan 21 16:21:06 crc kubenswrapper[4707]: I0121 16:21:06.828402 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.091960 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl"] Jan 21 16:21:07 crc kubenswrapper[4707]: E0121 16:21:07.092395 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" containerName="download-cache-edpm-compute-global-edpm-compute-global" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.092410 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" containerName="download-cache-edpm-compute-global-edpm-compute-global" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.092556 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" containerName="download-cache-edpm-compute-global-edpm-compute-global" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.093007 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.094323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.094627 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.094840 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.094861 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.100325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl"] Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.102694 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.146542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-inventory\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.146606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22j2b\" (UniqueName: \"kubernetes.io/projected/28b191d2-e6b5-453f-b03a-a6adcc296c39-kube-api-access-22j2b\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.146632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.146651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.247452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-inventory\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") 
" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.247698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22j2b\" (UniqueName: \"kubernetes.io/projected/28b191d2-e6b5-453f-b03a-a6adcc296c39-kube-api-access-22j2b\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.247750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.247770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.250534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.250619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-inventory\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.250902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.260566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22j2b\" (UniqueName: \"kubernetes.io/projected/28b191d2-e6b5-453f-b03a-a6adcc296c39-kube-api-access-22j2b\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-b9csl\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.405671 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:07 crc kubenswrapper[4707]: I0121 16:21:07.942536 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl"] Jan 21 16:21:07 crc kubenswrapper[4707]: W0121 16:21:07.947658 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b191d2_e6b5_453f_b03a_a6adcc296c39.slice/crio-8f4691f7b1ef572febda0396fa21f1486c6b357692f43397e2268178a969606a WatchSource:0}: Error finding container 8f4691f7b1ef572febda0396fa21f1486c6b357692f43397e2268178a969606a: Status 404 returned error can't find the container with id 8f4691f7b1ef572febda0396fa21f1486c6b357692f43397e2268178a969606a Jan 21 16:21:08 crc kubenswrapper[4707]: I0121 16:21:08.182499 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:21:08 crc kubenswrapper[4707]: E0121 16:21:08.182722 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:21:08 crc kubenswrapper[4707]: I0121 16:21:08.849598 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" event={"ID":"28b191d2-e6b5-453f-b03a-a6adcc296c39","Type":"ContainerStarted","Data":"c514d5e67f43f6a296162f67392a88d6c3015c1f19561e3dafbe7c7a270e3d48"} Jan 21 16:21:08 crc kubenswrapper[4707]: I0121 16:21:08.849638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" event={"ID":"28b191d2-e6b5-453f-b03a-a6adcc296c39","Type":"ContainerStarted","Data":"8f4691f7b1ef572febda0396fa21f1486c6b357692f43397e2268178a969606a"} Jan 21 16:21:08 crc kubenswrapper[4707]: I0121 16:21:08.865886 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" podStartSLOduration=1.324643559 podStartE2EDuration="1.865875877s" podCreationTimestamp="2026-01-21 16:21:07 +0000 UTC" firstStartedPulling="2026-01-21 16:21:07.950733508 +0000 UTC m=+4765.132249730" lastFinishedPulling="2026-01-21 16:21:08.491965825 +0000 UTC m=+4765.673482048" observedRunningTime="2026-01-21 16:21:08.860727597 +0000 UTC m=+4766.042243819" watchObservedRunningTime="2026-01-21 16:21:08.865875877 +0000 UTC m=+4766.047392089" Jan 21 16:21:09 crc kubenswrapper[4707]: I0121 16:21:09.857920 4707 generic.go:334] "Generic (PLEG): container finished" podID="28b191d2-e6b5-453f-b03a-a6adcc296c39" containerID="c514d5e67f43f6a296162f67392a88d6c3015c1f19561e3dafbe7c7a270e3d48" exitCode=0 Jan 21 16:21:09 crc kubenswrapper[4707]: I0121 16:21:09.858533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" event={"ID":"28b191d2-e6b5-453f-b03a-a6adcc296c39","Type":"ContainerDied","Data":"c514d5e67f43f6a296162f67392a88d6c3015c1f19561e3dafbe7c7a270e3d48"} Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 
16:21:11.069943 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.192166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-bootstrap-combined-ca-bundle\") pod \"28b191d2-e6b5-453f-b03a-a6adcc296c39\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.192360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-ssh-key-edpm-compute-global\") pod \"28b191d2-e6b5-453f-b03a-a6adcc296c39\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.192421 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22j2b\" (UniqueName: \"kubernetes.io/projected/28b191d2-e6b5-453f-b03a-a6adcc296c39-kube-api-access-22j2b\") pod \"28b191d2-e6b5-453f-b03a-a6adcc296c39\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.192449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-inventory\") pod \"28b191d2-e6b5-453f-b03a-a6adcc296c39\" (UID: \"28b191d2-e6b5-453f-b03a-a6adcc296c39\") " Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.196437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "28b191d2-e6b5-453f-b03a-a6adcc296c39" (UID: "28b191d2-e6b5-453f-b03a-a6adcc296c39"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.196517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b191d2-e6b5-453f-b03a-a6adcc296c39-kube-api-access-22j2b" (OuterVolumeSpecName: "kube-api-access-22j2b") pod "28b191d2-e6b5-453f-b03a-a6adcc296c39" (UID: "28b191d2-e6b5-453f-b03a-a6adcc296c39"). InnerVolumeSpecName "kube-api-access-22j2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.208414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-inventory" (OuterVolumeSpecName: "inventory") pod "28b191d2-e6b5-453f-b03a-a6adcc296c39" (UID: "28b191d2-e6b5-453f-b03a-a6adcc296c39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.208734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "28b191d2-e6b5-453f-b03a-a6adcc296c39" (UID: "28b191d2-e6b5-453f-b03a-a6adcc296c39"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.293985 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.294017 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.294031 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22j2b\" (UniqueName: \"kubernetes.io/projected/28b191d2-e6b5-453f-b03a-a6adcc296c39-kube-api-access-22j2b\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.294041 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b191d2-e6b5-453f-b03a-a6adcc296c39-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.873189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" event={"ID":"28b191d2-e6b5-453f-b03a-a6adcc296c39","Type":"ContainerDied","Data":"8f4691f7b1ef572febda0396fa21f1486c6b357692f43397e2268178a969606a"} Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.873227 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f4691f7b1ef572febda0396fa21f1486c6b357692f43397e2268178a969606a" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.873240 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.915036 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch"] Jan 21 16:21:11 crc kubenswrapper[4707]: E0121 16:21:11.915343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b191d2-e6b5-453f-b03a-a6adcc296c39" containerName="bootstrap-edpm-compute-global-edpm-compute-global" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.915358 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b191d2-e6b5-453f-b03a-a6adcc296c39" containerName="bootstrap-edpm-compute-global-edpm-compute-global" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.915530 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b191d2-e6b5-453f-b03a-a6adcc296c39" containerName="bootstrap-edpm-compute-global-edpm-compute-global" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.916022 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.918463 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.918754 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.924110 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.924263 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:11 crc kubenswrapper[4707]: I0121 16:21:11.924306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch"] Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.001673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.001720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-inventory\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.001766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96r5h\" (UniqueName: \"kubernetes.io/projected/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-kube-api-access-96r5h\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.104123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.104382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-inventory\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 
16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.104437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96r5h\" (UniqueName: \"kubernetes.io/projected/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-kube-api-access-96r5h\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.109561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-inventory\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.110285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.117626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96r5h\" (UniqueName: \"kubernetes.io/projected/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-kube-api-access-96r5h\") pod \"configure-network-edpm-compute-global-edpm-compute-global-lzrch\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.233094 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.583290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch"] Jan 21 16:21:12 crc kubenswrapper[4707]: W0121 16:21:12.586984 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf1ce9a_43c8_43bd_bfb8_72695ba737f5.slice/crio-a8ce2d1ef5d36bc324650763e286c65f64c32c7142ec1f90584d181b90fc32a2 WatchSource:0}: Error finding container a8ce2d1ef5d36bc324650763e286c65f64c32c7142ec1f90584d181b90fc32a2: Status 404 returned error can't find the container with id a8ce2d1ef5d36bc324650763e286c65f64c32c7142ec1f90584d181b90fc32a2 Jan 21 16:21:12 crc kubenswrapper[4707]: I0121 16:21:12.881415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" event={"ID":"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5","Type":"ContainerStarted","Data":"a8ce2d1ef5d36bc324650763e286c65f64c32c7142ec1f90584d181b90fc32a2"} Jan 21 16:21:13 crc kubenswrapper[4707]: I0121 16:21:13.889907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" event={"ID":"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5","Type":"ContainerStarted","Data":"f991d10c9307a9ef3f0cc39b2cbee80969c3a939ee868490fca2e9996f363d94"} Jan 21 16:21:14 crc kubenswrapper[4707]: I0121 16:21:14.899043 4707 generic.go:334] "Generic (PLEG): container finished" podID="3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" containerID="f991d10c9307a9ef3f0cc39b2cbee80969c3a939ee868490fca2e9996f363d94" exitCode=0 Jan 21 16:21:14 crc kubenswrapper[4707]: I0121 16:21:14.899159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" event={"ID":"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5","Type":"ContainerDied","Data":"f991d10c9307a9ef3f0cc39b2cbee80969c3a939ee868490fca2e9996f363d94"} Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.106558 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.246284 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx"] Jan 21 16:21:16 crc kubenswrapper[4707]: E0121 16:21:16.246546 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" containerName="configure-network-edpm-compute-global-edpm-compute-global" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.246567 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" containerName="configure-network-edpm-compute-global-edpm-compute-global" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.246702 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" containerName="configure-network-edpm-compute-global-edpm-compute-global" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.247157 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.253401 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx"] Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.254663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-inventory\") pod \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.255164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-ssh-key-edpm-compute-global\") pod \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.255220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96r5h\" (UniqueName: \"kubernetes.io/projected/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-kube-api-access-96r5h\") pod \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\" (UID: \"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5\") " Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.255383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhnxz\" (UniqueName: \"kubernetes.io/projected/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-kube-api-access-mhnxz\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.255420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-inventory\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.255486 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.258987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-kube-api-access-96r5h" (OuterVolumeSpecName: "kube-api-access-96r5h") pod "3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" (UID: "3cf1ce9a-43c8-43bd-bfb8-72695ba737f5"). InnerVolumeSpecName "kube-api-access-96r5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.273601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" (UID: "3cf1ce9a-43c8-43bd-bfb8-72695ba737f5"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.277717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-inventory" (OuterVolumeSpecName: "inventory") pod "3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" (UID: "3cf1ce9a-43c8-43bd-bfb8-72695ba737f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.356466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhnxz\" (UniqueName: \"kubernetes.io/projected/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-kube-api-access-mhnxz\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.356510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-inventory\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.356565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.356649 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.356675 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.356759 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96r5h\" (UniqueName: \"kubernetes.io/projected/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5-kube-api-access-96r5h\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.359298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" 
(UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.360846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-inventory\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.368852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhnxz\" (UniqueName: \"kubernetes.io/projected/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-kube-api-access-mhnxz\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hjlcx\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.607273 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.913507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" event={"ID":"3cf1ce9a-43c8-43bd-bfb8-72695ba737f5","Type":"ContainerDied","Data":"a8ce2d1ef5d36bc324650763e286c65f64c32c7142ec1f90584d181b90fc32a2"} Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.913684 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ce2d1ef5d36bc324650763e286c65f64c32c7142ec1f90584d181b90fc32a2" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.913564 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch" Jan 21 16:21:16 crc kubenswrapper[4707]: I0121 16:21:16.958510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx"] Jan 21 16:21:16 crc kubenswrapper[4707]: W0121 16:21:16.961448 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8e2dc7_e713_4d79_9deb_db8ecba6b34a.slice/crio-e75d67006340bd55345fc2e42ed9ca6ddcab343326ab5272574f9d10994910b0 WatchSource:0}: Error finding container e75d67006340bd55345fc2e42ed9ca6ddcab343326ab5272574f9d10994910b0: Status 404 returned error can't find the container with id e75d67006340bd55345fc2e42ed9ca6ddcab343326ab5272574f9d10994910b0 Jan 21 16:21:17 crc kubenswrapper[4707]: I0121 16:21:17.922509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" event={"ID":"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a","Type":"ContainerStarted","Data":"21b07e47a12d2ad4fc614006cb635a84cc5beac49075189f2bffe191e1b2e1ed"} Jan 21 16:21:17 crc kubenswrapper[4707]: I0121 16:21:17.922851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" event={"ID":"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a","Type":"ContainerStarted","Data":"e75d67006340bd55345fc2e42ed9ca6ddcab343326ab5272574f9d10994910b0"} Jan 21 16:21:17 crc kubenswrapper[4707]: I0121 16:21:17.935487 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" podStartSLOduration=1.392173571 podStartE2EDuration="1.935470249s" podCreationTimestamp="2026-01-21 16:21:16 +0000 UTC" firstStartedPulling="2026-01-21 16:21:16.966381095 +0000 UTC m=+4774.147897316" lastFinishedPulling="2026-01-21 16:21:17.509677773 +0000 UTC m=+4774.691193994" observedRunningTime="2026-01-21 16:21:17.932220137 +0000 UTC m=+4775.113736360" watchObservedRunningTime="2026-01-21 16:21:17.935470249 +0000 UTC m=+4775.116986471" Jan 21 16:21:17 crc kubenswrapper[4707]: I0121 16:21:17.969851 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2bvs5"] Jan 21 16:21:17 crc kubenswrapper[4707]: I0121 16:21:17.971096 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:17 crc kubenswrapper[4707]: I0121 16:21:17.978138 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bvs5"] Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.072687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-utilities\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.073211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-catalog-content\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.073462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9j55\" (UniqueName: \"kubernetes.io/projected/576ccbae-e4b9-4053-8a02-95a936f311a9-kube-api-access-t9j55\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.175227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9j55\" (UniqueName: \"kubernetes.io/projected/576ccbae-e4b9-4053-8a02-95a936f311a9-kube-api-access-t9j55\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.175329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-utilities\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.175396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-catalog-content\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.175836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-catalog-content\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.176079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-utilities\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.188455 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t9j55\" (UniqueName: \"kubernetes.io/projected/576ccbae-e4b9-4053-8a02-95a936f311a9-kube-api-access-t9j55\") pod \"community-operators-2bvs5\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.283014 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.683051 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bvs5"] Jan 21 16:21:18 crc kubenswrapper[4707]: W0121 16:21:18.686189 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576ccbae_e4b9_4053_8a02_95a936f311a9.slice/crio-9438a78e31adb2fbef90a4e1cfa05ebc86cf360f589ddd1ab3dcd4fab467c2d1 WatchSource:0}: Error finding container 9438a78e31adb2fbef90a4e1cfa05ebc86cf360f589ddd1ab3dcd4fab467c2d1: Status 404 returned error can't find the container with id 9438a78e31adb2fbef90a4e1cfa05ebc86cf360f589ddd1ab3dcd4fab467c2d1 Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.930448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bvs5" event={"ID":"576ccbae-e4b9-4053-8a02-95a936f311a9","Type":"ContainerStarted","Data":"9438a78e31adb2fbef90a4e1cfa05ebc86cf360f589ddd1ab3dcd4fab467c2d1"} Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.932155 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" containerID="21b07e47a12d2ad4fc614006cb635a84cc5beac49075189f2bffe191e1b2e1ed" exitCode=0 Jan 21 16:21:18 crc kubenswrapper[4707]: I0121 16:21:18.932198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" event={"ID":"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a","Type":"ContainerDied","Data":"21b07e47a12d2ad4fc614006cb635a84cc5beac49075189f2bffe191e1b2e1ed"} Jan 21 16:21:19 crc kubenswrapper[4707]: I0121 16:21:19.941741 4707 generic.go:334] "Generic (PLEG): container finished" podID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerID="82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0" exitCode=0 Jan 21 16:21:19 crc kubenswrapper[4707]: I0121 16:21:19.941839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bvs5" event={"ID":"576ccbae-e4b9-4053-8a02-95a936f311a9","Type":"ContainerDied","Data":"82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0"} Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.170882 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.196790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-ssh-key-edpm-compute-global\") pod \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.196893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhnxz\" (UniqueName: \"kubernetes.io/projected/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-kube-api-access-mhnxz\") pod \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.196969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-inventory\") pod \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\" (UID: \"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a\") " Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.206221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-kube-api-access-mhnxz" (OuterVolumeSpecName: "kube-api-access-mhnxz") pod "6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" (UID: "6d8e2dc7-e713-4d79-9deb-db8ecba6b34a"). InnerVolumeSpecName "kube-api-access-mhnxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.213982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-inventory" (OuterVolumeSpecName: "inventory") pod "6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" (UID: "6d8e2dc7-e713-4d79-9deb-db8ecba6b34a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.214198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" (UID: "6d8e2dc7-e713-4d79-9deb-db8ecba6b34a"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.298822 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.298849 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.298862 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhnxz\" (UniqueName: \"kubernetes.io/projected/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a-kube-api-access-mhnxz\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.467829 4707 scope.go:117] "RemoveContainer" containerID="5ed44944455dbc1a114604c766a45c1ca972a4606703683a2633724ca91ebe9c" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.501701 4707 scope.go:117] "RemoveContainer" containerID="1ea7c9e00fa5bb2f612af778f3c7730e56fc51b09d34fc2f817000b446962d3e" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.516411 4707 scope.go:117] "RemoveContainer" containerID="00b2e9080849f70f06e72d0f8cf6a6bb5cd6fdd4652e62a5fc3b7c49f6d06b04" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.531382 4707 scope.go:117] "RemoveContainer" containerID="bb46f9e83f66a772289c0448dc2ee29a80460b2788d5947207c573f27eed0606" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.548542 4707 scope.go:117] "RemoveContainer" containerID="8363967c7b8268c252b183ff9c78e4bc582e3b7632d312b491c23c4ee83dcc32" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.562820 4707 scope.go:117] "RemoveContainer" containerID="8080e1820e8617e734e7f6d3591626364addc6c0715323456f5b545599c9e1a6" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.575989 4707 scope.go:117] "RemoveContainer" containerID="27a61ac48846f110d86e798aec4b399e84383f396bcc0b0c6292cfd9a73082ad" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.592771 4707 scope.go:117] "RemoveContainer" containerID="0bb9c48ab5870f30efc043dff3b25c1812f174f3b39367fec91edd9d13f5bb5f" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.616931 4707 scope.go:117] "RemoveContainer" containerID="968fa043e9d75f074257ba63bdd2414573937b0e4092066cd458b52c27017104" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.631523 4707 scope.go:117] "RemoveContainer" containerID="5a1fdd32b7c048c2b08b9b7548470c2376175782a4962b0b050b54f1d5d49ea7" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.648185 4707 scope.go:117] "RemoveContainer" containerID="7c0e62cfad943f1c8d4c59525642ec4ccf69200f67d5f018cb5bef6717a97838" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.668584 4707 scope.go:117] "RemoveContainer" containerID="4973b6ceb8c94a4209098aa2976ed9a0c40751c09bb67854d097b046b15d1209" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.955293 4707 generic.go:334] "Generic (PLEG): container finished" podID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerID="3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc" exitCode=0 Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.955362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bvs5" 
event={"ID":"576ccbae-e4b9-4053-8a02-95a936f311a9","Type":"ContainerDied","Data":"3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc"} Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.962167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" event={"ID":"6d8e2dc7-e713-4d79-9deb-db8ecba6b34a","Type":"ContainerDied","Data":"e75d67006340bd55345fc2e42ed9ca6ddcab343326ab5272574f9d10994910b0"} Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.962198 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e75d67006340bd55345fc2e42ed9ca6ddcab343326ab5272574f9d10994910b0" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.962253 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.984596 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz"] Jan 21 16:21:20 crc kubenswrapper[4707]: E0121 16:21:20.984890 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" containerName="validate-network-edpm-compute-global-edpm-compute-global" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.984911 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" containerName="validate-network-edpm-compute-global-edpm-compute-global" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.985061 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" containerName="validate-network-edpm-compute-global-edpm-compute-global" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.985500 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.988228 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.988947 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.989088 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.989214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:20 crc kubenswrapper[4707]: I0121 16:21:20.996236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz"] Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.005729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-inventory\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.005772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-ssh-key-edpm-compute-global\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.005898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crpbk\" (UniqueName: \"kubernetes.io/projected/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-kube-api-access-crpbk\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.107665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-ssh-key-edpm-compute-global\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.107766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crpbk\" (UniqueName: \"kubernetes.io/projected/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-kube-api-access-crpbk\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.107906 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-inventory\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.547945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-ssh-key-edpm-compute-global\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.547945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-inventory\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.548196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crpbk\" (UniqueName: \"kubernetes.io/projected/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-kube-api-access-crpbk\") pod \"install-os-edpm-compute-global-edpm-compute-global-qvpjz\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.599165 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.951108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz"] Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.971080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bvs5" event={"ID":"576ccbae-e4b9-4053-8a02-95a936f311a9","Type":"ContainerStarted","Data":"e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4"} Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.972119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" event={"ID":"55d10636-2fe8-4b6c-8f84-34e2ecc03d94","Type":"ContainerStarted","Data":"b00cc44a66dea731fede06c08913050372240b92dd4639fae233dc73b1c1f84c"} Jan 21 16:21:21 crc kubenswrapper[4707]: I0121 16:21:21.988253 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2bvs5" podStartSLOduration=3.460463983 podStartE2EDuration="4.988238983s" podCreationTimestamp="2026-01-21 16:21:17 +0000 UTC" firstStartedPulling="2026-01-21 16:21:19.942833646 +0000 UTC m=+4777.124349868" lastFinishedPulling="2026-01-21 16:21:21.470608645 +0000 UTC m=+4778.652124868" observedRunningTime="2026-01-21 16:21:21.982509942 +0000 UTC m=+4779.164026164" watchObservedRunningTime="2026-01-21 16:21:21.988238983 +0000 UTC m=+4779.169755205" Jan 21 16:21:22 crc kubenswrapper[4707]: I0121 16:21:22.183299 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:21:22 crc kubenswrapper[4707]: E0121 16:21:22.183520 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:21:22 crc kubenswrapper[4707]: I0121 16:21:22.980036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" event={"ID":"55d10636-2fe8-4b6c-8f84-34e2ecc03d94","Type":"ContainerStarted","Data":"601258fd340ef99dd30bb7dc8fd432200df8ce3c3b793750c9a5e21783f3d36b"} Jan 21 16:21:22 crc kubenswrapper[4707]: I0121 16:21:22.998076 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" podStartSLOduration=2.457519801 podStartE2EDuration="2.998061199s" podCreationTimestamp="2026-01-21 16:21:20 +0000 UTC" firstStartedPulling="2026-01-21 16:21:21.960751619 +0000 UTC m=+4779.142267841" lastFinishedPulling="2026-01-21 16:21:22.501293017 +0000 UTC m=+4779.682809239" observedRunningTime="2026-01-21 16:21:22.989904722 +0000 UTC m=+4780.171420944" watchObservedRunningTime="2026-01-21 16:21:22.998061199 +0000 UTC m=+4780.179577421" Jan 21 16:21:23 crc kubenswrapper[4707]: I0121 16:21:23.987766 4707 generic.go:334] "Generic (PLEG): container finished" podID="55d10636-2fe8-4b6c-8f84-34e2ecc03d94" 
containerID="601258fd340ef99dd30bb7dc8fd432200df8ce3c3b793750c9a5e21783f3d36b" exitCode=0 Jan 21 16:21:23 crc kubenswrapper[4707]: I0121 16:21:23.987825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" event={"ID":"55d10636-2fe8-4b6c-8f84-34e2ecc03d94","Type":"ContainerDied","Data":"601258fd340ef99dd30bb7dc8fd432200df8ce3c3b793750c9a5e21783f3d36b"} Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.216632 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.356030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-ssh-key-edpm-compute-global\") pod \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.356137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crpbk\" (UniqueName: \"kubernetes.io/projected/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-kube-api-access-crpbk\") pod \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.356203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-inventory\") pod \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\" (UID: \"55d10636-2fe8-4b6c-8f84-34e2ecc03d94\") " Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.361381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-kube-api-access-crpbk" (OuterVolumeSpecName: "kube-api-access-crpbk") pod "55d10636-2fe8-4b6c-8f84-34e2ecc03d94" (UID: "55d10636-2fe8-4b6c-8f84-34e2ecc03d94"). InnerVolumeSpecName "kube-api-access-crpbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.373271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-inventory" (OuterVolumeSpecName: "inventory") pod "55d10636-2fe8-4b6c-8f84-34e2ecc03d94" (UID: "55d10636-2fe8-4b6c-8f84-34e2ecc03d94"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.374081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "55d10636-2fe8-4b6c-8f84-34e2ecc03d94" (UID: "55d10636-2fe8-4b6c-8f84-34e2ecc03d94"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.457192 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.457216 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:25 crc kubenswrapper[4707]: I0121 16:21:25.457228 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crpbk\" (UniqueName: \"kubernetes.io/projected/55d10636-2fe8-4b6c-8f84-34e2ecc03d94-kube-api-access-crpbk\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.003900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" event={"ID":"55d10636-2fe8-4b6c-8f84-34e2ecc03d94","Type":"ContainerDied","Data":"b00cc44a66dea731fede06c08913050372240b92dd4639fae233dc73b1c1f84c"} Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.003943 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b00cc44a66dea731fede06c08913050372240b92dd4639fae233dc73b1c1f84c" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.003947 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.040744 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v"] Jan 21 16:21:26 crc kubenswrapper[4707]: E0121 16:21:26.041027 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d10636-2fe8-4b6c-8f84-34e2ecc03d94" containerName="install-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.041046 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d10636-2fe8-4b6c-8f84-34e2ecc03d94" containerName="install-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.041187 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d10636-2fe8-4b6c-8f84-34e2ecc03d94" containerName="install-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.041621 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.043489 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.043509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.043728 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.044370 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.049257 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v"] Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.164596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.164638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-inventory\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.164690 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/df9f1e23-b845-450b-9b39-4af232b4d161-kube-api-access-2qdw9\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.265359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/df9f1e23-b845-450b-9b39-4af232b4d161-kube-api-access-2qdw9\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.265460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 
16:21:26.265485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-inventory\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.269231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.269253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-inventory\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.278861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/df9f1e23-b845-450b-9b39-4af232b4d161-kube-api-access-2qdw9\") pod \"configure-os-edpm-compute-global-edpm-compute-global-z645v\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.353330 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:26 crc kubenswrapper[4707]: I0121 16:21:26.703824 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v"] Jan 21 16:21:27 crc kubenswrapper[4707]: I0121 16:21:27.012001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" event={"ID":"df9f1e23-b845-450b-9b39-4af232b4d161","Type":"ContainerStarted","Data":"ee678f12be4a6c0bde64e9f172e0de0189488e30b9e71b2fe6e2434ddca9778b"} Jan 21 16:21:28 crc kubenswrapper[4707]: I0121 16:21:28.056430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" event={"ID":"df9f1e23-b845-450b-9b39-4af232b4d161","Type":"ContainerStarted","Data":"aa25f10913d0af191d7c72475ca20b7c364d06ad6fc1216c00c1d82286658039"} Jan 21 16:21:28 crc kubenswrapper[4707]: I0121 16:21:28.070740 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" podStartSLOduration=1.544088194 podStartE2EDuration="2.070724367s" podCreationTimestamp="2026-01-21 16:21:26 +0000 UTC" firstStartedPulling="2026-01-21 16:21:26.70781788 +0000 UTC m=+4783.889334101" lastFinishedPulling="2026-01-21 16:21:27.234454051 +0000 UTC m=+4784.415970274" observedRunningTime="2026-01-21 16:21:28.06850827 +0000 UTC m=+4785.250024492" watchObservedRunningTime="2026-01-21 16:21:28.070724367 +0000 UTC m=+4785.252240590" Jan 21 16:21:28 crc kubenswrapper[4707]: I0121 16:21:28.284286 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:28 crc kubenswrapper[4707]: I0121 16:21:28.284340 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:28 crc kubenswrapper[4707]: I0121 16:21:28.315731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:29 crc kubenswrapper[4707]: I0121 16:21:29.064847 4707 generic.go:334] "Generic (PLEG): container finished" podID="df9f1e23-b845-450b-9b39-4af232b4d161" containerID="aa25f10913d0af191d7c72475ca20b7c364d06ad6fc1216c00c1d82286658039" exitCode=0 Jan 21 16:21:29 crc kubenswrapper[4707]: I0121 16:21:29.064917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" event={"ID":"df9f1e23-b845-450b-9b39-4af232b4d161","Type":"ContainerDied","Data":"aa25f10913d0af191d7c72475ca20b7c364d06ad6fc1216c00c1d82286658039"} Jan 21 16:21:29 crc kubenswrapper[4707]: I0121 16:21:29.096618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:29 crc kubenswrapper[4707]: I0121 16:21:29.127363 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bvs5"] Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.282189 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.411108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/df9f1e23-b845-450b-9b39-4af232b4d161-kube-api-access-2qdw9\") pod \"df9f1e23-b845-450b-9b39-4af232b4d161\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.411177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-inventory\") pod \"df9f1e23-b845-450b-9b39-4af232b4d161\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.411218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-ssh-key-edpm-compute-global\") pod \"df9f1e23-b845-450b-9b39-4af232b4d161\" (UID: \"df9f1e23-b845-450b-9b39-4af232b4d161\") " Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.415283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9f1e23-b845-450b-9b39-4af232b4d161-kube-api-access-2qdw9" (OuterVolumeSpecName: "kube-api-access-2qdw9") pod "df9f1e23-b845-450b-9b39-4af232b4d161" (UID: "df9f1e23-b845-450b-9b39-4af232b4d161"). InnerVolumeSpecName "kube-api-access-2qdw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.425774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-inventory" (OuterVolumeSpecName: "inventory") pod "df9f1e23-b845-450b-9b39-4af232b4d161" (UID: "df9f1e23-b845-450b-9b39-4af232b4d161"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.426279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "df9f1e23-b845-450b-9b39-4af232b4d161" (UID: "df9f1e23-b845-450b-9b39-4af232b4d161"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.512884 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.512940 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/df9f1e23-b845-450b-9b39-4af232b4d161-kube-api-access-2qdw9\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:30 crc kubenswrapper[4707]: I0121 16:21:30.512951 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9f1e23-b845-450b-9b39-4af232b4d161-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.077299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" event={"ID":"df9f1e23-b845-450b-9b39-4af232b4d161","Type":"ContainerDied","Data":"ee678f12be4a6c0bde64e9f172e0de0189488e30b9e71b2fe6e2434ddca9778b"} Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.077526 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee678f12be4a6c0bde64e9f172e0de0189488e30b9e71b2fe6e2434ddca9778b" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.077433 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2bvs5" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="registry-server" containerID="cri-o://e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4" gracePeriod=2 Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.077319 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.120906 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2"] Jan 21 16:21:31 crc kubenswrapper[4707]: E0121 16:21:31.121179 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9f1e23-b845-450b-9b39-4af232b4d161" containerName="configure-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.121197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9f1e23-b845-450b-9b39-4af232b4d161" containerName="configure-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.121342 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9f1e23-b845-450b-9b39-4af232b4d161" containerName="configure-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.121787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.124902 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.125098 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.125453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.125995 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.129705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2"] Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.221486 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspsg\" (UniqueName: \"kubernetes.io/projected/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-kube-api-access-rspsg\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.221528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-inventory\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.221560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-ssh-key-edpm-compute-global\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.309224 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j26gv"] Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.311368 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.321414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j26gv"] Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.323283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rspsg\" (UniqueName: \"kubernetes.io/projected/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-kube-api-access-rspsg\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.323330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-inventory\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.323402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-ssh-key-edpm-compute-global\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.342549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-inventory\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.343003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspsg\" (UniqueName: \"kubernetes.io/projected/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-kube-api-access-rspsg\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.347956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-ssh-key-edpm-compute-global\") pod \"run-os-edpm-compute-global-edpm-compute-global-tz6k2\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.402234 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.424651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwztq\" (UniqueName: \"kubernetes.io/projected/0081bc2d-6952-41a0-9bf5-469ead880674-kube-api-access-lwztq\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.424898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-catalog-content\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.424967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-utilities\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.513667 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.525967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-utilities\") pod \"576ccbae-e4b9-4053-8a02-95a936f311a9\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-catalog-content\") pod \"576ccbae-e4b9-4053-8a02-95a936f311a9\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9j55\" (UniqueName: \"kubernetes.io/projected/576ccbae-e4b9-4053-8a02-95a936f311a9-kube-api-access-t9j55\") pod \"576ccbae-e4b9-4053-8a02-95a936f311a9\" (UID: \"576ccbae-e4b9-4053-8a02-95a936f311a9\") " Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-utilities\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwztq\" (UniqueName: \"kubernetes.io/projected/0081bc2d-6952-41a0-9bf5-469ead880674-kube-api-access-lwztq\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-catalog-content\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-utilities\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.526830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-catalog-content\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.527057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-utilities" (OuterVolumeSpecName: "utilities") pod "576ccbae-e4b9-4053-8a02-95a936f311a9" (UID: "576ccbae-e4b9-4053-8a02-95a936f311a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.529286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576ccbae-e4b9-4053-8a02-95a936f311a9-kube-api-access-t9j55" (OuterVolumeSpecName: "kube-api-access-t9j55") pod "576ccbae-e4b9-4053-8a02-95a936f311a9" (UID: "576ccbae-e4b9-4053-8a02-95a936f311a9"). InnerVolumeSpecName "kube-api-access-t9j55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.543268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwztq\" (UniqueName: \"kubernetes.io/projected/0081bc2d-6952-41a0-9bf5-469ead880674-kube-api-access-lwztq\") pod \"certified-operators-j26gv\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.569869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "576ccbae-e4b9-4053-8a02-95a936f311a9" (UID: "576ccbae-e4b9-4053-8a02-95a936f311a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.627845 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.627891 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9j55\" (UniqueName: \"kubernetes.io/projected/576ccbae-e4b9-4053-8a02-95a936f311a9-kube-api-access-t9j55\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.627903 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576ccbae-e4b9-4053-8a02-95a936f311a9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.636159 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:31 crc kubenswrapper[4707]: I0121 16:21:31.920079 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2"] Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.068441 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j26gv"] Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.086951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerStarted","Data":"2e02513f72dcfcf1aaeb7f4b7f6e4260cfad80bdd7826f8e2acc16038aafd124"} Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.088171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" event={"ID":"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec","Type":"ContainerStarted","Data":"976337a240c58943c78daa071be6b0db71d73df624c75e2587afab1b8e5277b0"} Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.090600 4707 generic.go:334] "Generic (PLEG): container finished" podID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerID="e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4" exitCode=0 Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.090644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bvs5" event={"ID":"576ccbae-e4b9-4053-8a02-95a936f311a9","Type":"ContainerDied","Data":"e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4"} Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.090678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bvs5" event={"ID":"576ccbae-e4b9-4053-8a02-95a936f311a9","Type":"ContainerDied","Data":"9438a78e31adb2fbef90a4e1cfa05ebc86cf360f589ddd1ab3dcd4fab467c2d1"} Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.090687 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bvs5" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.090694 4707 scope.go:117] "RemoveContainer" containerID="e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.106377 4707 scope.go:117] "RemoveContainer" containerID="3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.118056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bvs5"] Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.122797 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2bvs5"] Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.149884 4707 scope.go:117] "RemoveContainer" containerID="82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.172093 4707 scope.go:117] "RemoveContainer" containerID="e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4" Jan 21 16:21:32 crc kubenswrapper[4707]: E0121 16:21:32.172530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4\": container with ID starting with e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4 not found: ID does not exist" containerID="e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.172581 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4"} err="failed to get container status \"e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4\": rpc error: code = NotFound desc = could not find container \"e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4\": container with ID starting with e5e21aef3552cb3c41d2d99fdcf35b24dcaffba7215a13aa0d06944a81f9d7a4 not found: ID does not exist" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.172610 4707 scope.go:117] "RemoveContainer" containerID="3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc" Jan 21 16:21:32 crc kubenswrapper[4707]: E0121 16:21:32.172972 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc\": container with ID starting with 3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc not found: ID does not exist" containerID="3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.172994 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc"} err="failed to get container status \"3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc\": rpc error: code = NotFound desc = could not find container \"3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc\": container with ID starting with 3270cb83593d4d185d9d55b4b4ffa9baf4034f4df47aa143296117a4394d6bcc not found: ID does not exist" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.173006 4707 scope.go:117] "RemoveContainer" 
containerID="82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0" Jan 21 16:21:32 crc kubenswrapper[4707]: E0121 16:21:32.173217 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0\": container with ID starting with 82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0 not found: ID does not exist" containerID="82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0" Jan 21 16:21:32 crc kubenswrapper[4707]: I0121 16:21:32.173236 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0"} err="failed to get container status \"82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0\": rpc error: code = NotFound desc = could not find container \"82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0\": container with ID starting with 82c6c722b8d37ddc92af03ee8b46447fd3cdf9f1bed153a69b412784531658b0 not found: ID does not exist" Jan 21 16:21:33 crc kubenswrapper[4707]: I0121 16:21:33.103028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" event={"ID":"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec","Type":"ContainerStarted","Data":"5091d466d1f58f629fcfd4ffcfa33e247d8aa0fd1063c62a57b4950a23e31c7d"} Jan 21 16:21:33 crc kubenswrapper[4707]: I0121 16:21:33.114945 4707 generic.go:334] "Generic (PLEG): container finished" podID="0081bc2d-6952-41a0-9bf5-469ead880674" containerID="95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec" exitCode=0 Jan 21 16:21:33 crc kubenswrapper[4707]: I0121 16:21:33.114998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerDied","Data":"95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec"} Jan 21 16:21:33 crc kubenswrapper[4707]: I0121 16:21:33.121355 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" podStartSLOduration=1.600735139 podStartE2EDuration="2.121335852s" podCreationTimestamp="2026-01-21 16:21:31 +0000 UTC" firstStartedPulling="2026-01-21 16:21:31.917052663 +0000 UTC m=+4789.098568885" lastFinishedPulling="2026-01-21 16:21:32.437653376 +0000 UTC m=+4789.619169598" observedRunningTime="2026-01-21 16:21:33.113316633 +0000 UTC m=+4790.294832855" watchObservedRunningTime="2026-01-21 16:21:33.121335852 +0000 UTC m=+4790.302852074" Jan 21 16:21:33 crc kubenswrapper[4707]: I0121 16:21:33.198333 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" path="/var/lib/kubelet/pods/576ccbae-e4b9-4053-8a02-95a936f311a9/volumes" Jan 21 16:21:34 crc kubenswrapper[4707]: I0121 16:21:34.123912 4707 generic.go:334] "Generic (PLEG): container finished" podID="6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" containerID="5091d466d1f58f629fcfd4ffcfa33e247d8aa0fd1063c62a57b4950a23e31c7d" exitCode=0 Jan 21 16:21:34 crc kubenswrapper[4707]: I0121 16:21:34.124071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" 
event={"ID":"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec","Type":"ContainerDied","Data":"5091d466d1f58f629fcfd4ffcfa33e247d8aa0fd1063c62a57b4950a23e31c7d"} Jan 21 16:21:34 crc kubenswrapper[4707]: I0121 16:21:34.126592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerStarted","Data":"12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545"} Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.134726 4707 generic.go:334] "Generic (PLEG): container finished" podID="0081bc2d-6952-41a0-9bf5-469ead880674" containerID="12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545" exitCode=0 Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.134827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerDied","Data":"12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545"} Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.367928 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.487120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rspsg\" (UniqueName: \"kubernetes.io/projected/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-kube-api-access-rspsg\") pod \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.487205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-inventory\") pod \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.487248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-ssh-key-edpm-compute-global\") pod \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\" (UID: \"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec\") " Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.493943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-kube-api-access-rspsg" (OuterVolumeSpecName: "kube-api-access-rspsg") pod "6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" (UID: "6fd91bc6-4821-4a19-a0c3-78fc1a3afdec"). InnerVolumeSpecName "kube-api-access-rspsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.507695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" (UID: "6fd91bc6-4821-4a19-a0c3-78fc1a3afdec"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.510475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-inventory" (OuterVolumeSpecName: "inventory") pod "6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" (UID: "6fd91bc6-4821-4a19-a0c3-78fc1a3afdec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.590095 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rspsg\" (UniqueName: \"kubernetes.io/projected/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-kube-api-access-rspsg\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.590123 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:35 crc kubenswrapper[4707]: I0121 16:21:35.590134 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.146721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerStarted","Data":"61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce"} Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.148395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" event={"ID":"6fd91bc6-4821-4a19-a0c3-78fc1a3afdec","Type":"ContainerDied","Data":"976337a240c58943c78daa071be6b0db71d73df624c75e2587afab1b8e5277b0"} Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.148432 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976337a240c58943c78daa071be6b0db71d73df624c75e2587afab1b8e5277b0" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.148479 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.165784 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j26gv" podStartSLOduration=2.70850215 podStartE2EDuration="5.165769509s" podCreationTimestamp="2026-01-21 16:21:31 +0000 UTC" firstStartedPulling="2026-01-21 16:21:33.117315773 +0000 UTC m=+4790.298831995" lastFinishedPulling="2026-01-21 16:21:35.574583132 +0000 UTC m=+4792.756099354" observedRunningTime="2026-01-21 16:21:36.163112131 +0000 UTC m=+4793.344628354" watchObservedRunningTime="2026-01-21 16:21:36.165769509 +0000 UTC m=+4793.347285730" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177110 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk"] Jan 21 16:21:36 crc kubenswrapper[4707]: E0121 16:21:36.177375 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="extract-content" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="extract-content" Jan 21 16:21:36 crc kubenswrapper[4707]: E0121 16:21:36.177414 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="extract-utilities" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177420 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="extract-utilities" Jan 21 16:21:36 crc kubenswrapper[4707]: E0121 16:21:36.177426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" containerName="run-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177433 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" containerName="run-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:36 crc kubenswrapper[4707]: E0121 16:21:36.177446 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="registry-server" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177452 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="registry-server" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177584 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="576ccbae-e4b9-4053-8a02-95a936f311a9" containerName="registry-server" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.177594 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" containerName="run-os-edpm-compute-global-edpm-compute-global" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.178031 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.179730 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.179768 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.179924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.181427 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.182908 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.196872 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk"] Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.299060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8j5g\" (UniqueName: \"kubernetes.io/projected/f2b44fd4-888a-45c9-b34d-10ed6c447905-kube-api-access-h8j5g\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.299157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.299713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.299831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.299987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.300544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.401828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.401892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.401919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.401944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.401967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.401988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.402008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.402030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.402068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8j5g\" (UniqueName: \"kubernetes.io/projected/f2b44fd4-888a-45c9-b34d-10ed6c447905-kube-api-access-h8j5g\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.402098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.402117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.402154 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.407497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.407504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " 
pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.407503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.407656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.408090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.408471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.408669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.409153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.409256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.409424 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.410312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.418173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8j5g\" (UniqueName: \"kubernetes.io/projected/f2b44fd4-888a-45c9-b34d-10ed6c447905-kube-api-access-h8j5g\") pod \"install-certs-edpm-compute-global-edpm-compute-global-7k4fk\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.490446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:36 crc kubenswrapper[4707]: I0121 16:21:36.880653 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk"] Jan 21 16:21:36 crc kubenswrapper[4707]: W0121 16:21:36.884509 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b44fd4_888a_45c9_b34d_10ed6c447905.slice/crio-600100df5dcefff5bfded1747b361034c09819c197ec19b4055df63dd4794e0c WatchSource:0}: Error finding container 600100df5dcefff5bfded1747b361034c09819c197ec19b4055df63dd4794e0c: Status 404 returned error can't find the container with id 600100df5dcefff5bfded1747b361034c09819c197ec19b4055df63dd4794e0c Jan 21 16:21:37 crc kubenswrapper[4707]: I0121 16:21:37.156087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" event={"ID":"f2b44fd4-888a-45c9-b34d-10ed6c447905","Type":"ContainerStarted","Data":"600100df5dcefff5bfded1747b361034c09819c197ec19b4055df63dd4794e0c"} Jan 21 16:21:37 crc kubenswrapper[4707]: I0121 16:21:37.183013 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:21:37 crc kubenswrapper[4707]: E0121 16:21:37.183351 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:21:38 crc kubenswrapper[4707]: I0121 16:21:38.166258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" 
event={"ID":"f2b44fd4-888a-45c9-b34d-10ed6c447905","Type":"ContainerStarted","Data":"8ff4482037b4612eb13a12cdf7e8da466957ff38a18590f74a1ba798a7133659"} Jan 21 16:21:38 crc kubenswrapper[4707]: I0121 16:21:38.186090 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" podStartSLOduration=1.7204963850000001 podStartE2EDuration="2.186070042s" podCreationTimestamp="2026-01-21 16:21:36 +0000 UTC" firstStartedPulling="2026-01-21 16:21:36.887421227 +0000 UTC m=+4794.068937448" lastFinishedPulling="2026-01-21 16:21:37.352994884 +0000 UTC m=+4794.534511105" observedRunningTime="2026-01-21 16:21:38.185304181 +0000 UTC m=+4795.366820404" watchObservedRunningTime="2026-01-21 16:21:38.186070042 +0000 UTC m=+4795.367586264" Jan 21 16:21:39 crc kubenswrapper[4707]: I0121 16:21:39.173746 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2b44fd4-888a-45c9-b34d-10ed6c447905" containerID="8ff4482037b4612eb13a12cdf7e8da466957ff38a18590f74a1ba798a7133659" exitCode=0 Jan 21 16:21:39 crc kubenswrapper[4707]: I0121 16:21:39.173793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" event={"ID":"f2b44fd4-888a-45c9-b34d-10ed6c447905","Type":"ContainerDied","Data":"8ff4482037b4612eb13a12cdf7e8da466957ff38a18590f74a1ba798a7133659"} Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.406015 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-sriov-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-metadata-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-nova-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-custom-global-service-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-ovn-combined-ca-bundle\") pod 
\"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-dhcp-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565310 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ssh-key-edpm-compute-global\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.565400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8j5g\" (UniqueName: \"kubernetes.io/projected/f2b44fd4-888a-45c9-b34d-10ed6c447905-kube-api-access-h8j5g\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.566151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-bootstrap-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.566229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-libvirt-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.566263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ovn-combined-ca-bundle\") pod \"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). 
InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b44fd4-888a-45c9-b34d-10ed6c447905-kube-api-access-h8j5g" (OuterVolumeSpecName: "kube-api-access-h8j5g") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "kube-api-access-h8j5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.572492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.573376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: E0121 16:21:40.583949 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory podName:f2b44fd4-888a-45c9-b34d-10ed6c447905 nodeName:}" failed. No retries permitted until 2026-01-21 16:21:41.083922293 +0000 UTC m=+4798.265438516 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905") : error deleting /var/lib/kubelet/pods/f2b44fd4-888a-45c9-b34d-10ed6c447905/volume-subpaths: remove /var/lib/kubelet/pods/f2b44fd4-888a-45c9-b34d-10ed6c447905/volume-subpaths: no such file or directory Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.585707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667511 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667708 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667720 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667755 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667766 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667775 4707 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667785 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667794 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667827 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667839 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8j5g\" (UniqueName: \"kubernetes.io/projected/f2b44fd4-888a-45c9-b34d-10ed6c447905-kube-api-access-h8j5g\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:40 crc kubenswrapper[4707]: I0121 16:21:40.667848 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.173989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory\") pod 
\"f2b44fd4-888a-45c9-b34d-10ed6c447905\" (UID: \"f2b44fd4-888a-45c9-b34d-10ed6c447905\") " Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.198902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" event={"ID":"f2b44fd4-888a-45c9-b34d-10ed6c447905","Type":"ContainerDied","Data":"600100df5dcefff5bfded1747b361034c09819c197ec19b4055df63dd4794e0c"} Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.198958 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600100df5dcefff5bfded1747b361034c09819c197ec19b4055df63dd4794e0c" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.198971 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.246792 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd"] Jan 21 16:21:41 crc kubenswrapper[4707]: E0121 16:21:41.247136 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b44fd4-888a-45c9-b34d-10ed6c447905" containerName="install-certs-edpm-compute-global-edpm-compute-global" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.247160 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b44fd4-888a-45c9-b34d-10ed6c447905" containerName="install-certs-edpm-compute-global-edpm-compute-global" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.247315 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b44fd4-888a-45c9-b34d-10ed6c447905" containerName="install-certs-edpm-compute-global-edpm-compute-global" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.247797 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.249667 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.255680 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd"] Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.276073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovncontroller-config-0\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.276208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-inventory\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.276442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.276505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ssh-key-edpm-compute-global\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.276537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brwnw\" (UniqueName: \"kubernetes.io/projected/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-kube-api-access-brwnw\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.377084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brwnw\" (UniqueName: \"kubernetes.io/projected/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-kube-api-access-brwnw\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.377160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovncontroller-config-0\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.377207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-inventory\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.377252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.377277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ssh-key-edpm-compute-global\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.378168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovncontroller-config-0\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.447895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory" (OuterVolumeSpecName: "inventory") pod "f2b44fd4-888a-45c9-b34d-10ed6c447905" (UID: "f2b44fd4-888a-45c9-b34d-10ed6c447905"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.448775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.448883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ssh-key-edpm-compute-global\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.449147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brwnw\" (UniqueName: \"kubernetes.io/projected/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-kube-api-access-brwnw\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.449190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-inventory\") pod \"ovn-edpm-compute-global-edpm-compute-global-2gwdd\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.478522 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b44fd4-888a-45c9-b34d-10ed6c447905-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.564638 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.637082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.637122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:41 crc kubenswrapper[4707]: I0121 16:21:41.723036 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:42 crc kubenswrapper[4707]: I0121 16:21:42.062610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd"] Jan 21 16:21:42 crc kubenswrapper[4707]: W0121 16:21:42.066494 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ee2225c_4c91_4fc9_a247_a4cf255bc7fc.slice/crio-32d964e613f6f42b731778360c0426621cf67606c63fe745923d698ab86f78dc WatchSource:0}: Error finding container 32d964e613f6f42b731778360c0426621cf67606c63fe745923d698ab86f78dc: Status 404 returned error can't find the container with id 32d964e613f6f42b731778360c0426621cf67606c63fe745923d698ab86f78dc Jan 21 16:21:42 crc kubenswrapper[4707]: I0121 16:21:42.206879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" event={"ID":"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc","Type":"ContainerStarted","Data":"32d964e613f6f42b731778360c0426621cf67606c63fe745923d698ab86f78dc"} Jan 21 16:21:42 crc kubenswrapper[4707]: I0121 16:21:42.240846 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:42 crc kubenswrapper[4707]: I0121 16:21:42.278852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j26gv"] Jan 21 16:21:43 crc kubenswrapper[4707]: I0121 16:21:43.215325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" event={"ID":"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc","Type":"ContainerStarted","Data":"b7f8026af2dd64655fab311721cd723e934b382da4deec287308ef00bb9dd160"} Jan 21 16:21:43 crc kubenswrapper[4707]: I0121 16:21:43.228508 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" podStartSLOduration=1.732092229 podStartE2EDuration="2.228490294s" podCreationTimestamp="2026-01-21 16:21:41 +0000 UTC" firstStartedPulling="2026-01-21 16:21:42.068945687 +0000 UTC m=+4799.250461909" lastFinishedPulling="2026-01-21 16:21:42.565343752 +0000 UTC m=+4799.746859974" observedRunningTime="2026-01-21 16:21:43.227116942 +0000 UTC m=+4800.408633164" watchObservedRunningTime="2026-01-21 16:21:43.228490294 +0000 UTC m=+4800.410006516" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.224761 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" containerID="b7f8026af2dd64655fab311721cd723e934b382da4deec287308ef00bb9dd160" exitCode=0 Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.224865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" event={"ID":"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc","Type":"ContainerDied","Data":"b7f8026af2dd64655fab311721cd723e934b382da4deec287308ef00bb9dd160"} Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.225677 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j26gv" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="registry-server" containerID="cri-o://61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce" gracePeriod=2 Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.551800 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.723398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-catalog-content\") pod \"0081bc2d-6952-41a0-9bf5-469ead880674\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.723459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwztq\" (UniqueName: \"kubernetes.io/projected/0081bc2d-6952-41a0-9bf5-469ead880674-kube-api-access-lwztq\") pod \"0081bc2d-6952-41a0-9bf5-469ead880674\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.723500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-utilities\") pod \"0081bc2d-6952-41a0-9bf5-469ead880674\" (UID: \"0081bc2d-6952-41a0-9bf5-469ead880674\") " Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.724279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-utilities" (OuterVolumeSpecName: "utilities") pod "0081bc2d-6952-41a0-9bf5-469ead880674" (UID: "0081bc2d-6952-41a0-9bf5-469ead880674"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.728729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0081bc2d-6952-41a0-9bf5-469ead880674-kube-api-access-lwztq" (OuterVolumeSpecName: "kube-api-access-lwztq") pod "0081bc2d-6952-41a0-9bf5-469ead880674" (UID: "0081bc2d-6952-41a0-9bf5-469ead880674"). InnerVolumeSpecName "kube-api-access-lwztq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.761125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0081bc2d-6952-41a0-9bf5-469ead880674" (UID: "0081bc2d-6952-41a0-9bf5-469ead880674"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.824819 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.824852 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwztq\" (UniqueName: \"kubernetes.io/projected/0081bc2d-6952-41a0-9bf5-469ead880674-kube-api-access-lwztq\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:44 crc kubenswrapper[4707]: I0121 16:21:44.824863 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0081bc2d-6952-41a0-9bf5-469ead880674-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.233483 4707 generic.go:334] "Generic (PLEG): container finished" podID="0081bc2d-6952-41a0-9bf5-469ead880674" containerID="61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce" exitCode=0 Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.233550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerDied","Data":"61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce"} Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.233564 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j26gv" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.233593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j26gv" event={"ID":"0081bc2d-6952-41a0-9bf5-469ead880674","Type":"ContainerDied","Data":"2e02513f72dcfcf1aaeb7f4b7f6e4260cfad80bdd7826f8e2acc16038aafd124"} Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.233614 4707 scope.go:117] "RemoveContainer" containerID="61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.248791 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j26gv"] Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.252645 4707 scope.go:117] "RemoveContainer" containerID="12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.256920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j26gv"] Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.270915 4707 scope.go:117] "RemoveContainer" containerID="95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.289642 4707 scope.go:117] "RemoveContainer" containerID="61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce" Jan 21 16:21:45 crc kubenswrapper[4707]: E0121 16:21:45.290019 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce\": container with ID starting with 61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce not found: ID does not exist" containerID="61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.290069 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce"} err="failed to get container status \"61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce\": rpc error: code = NotFound desc = could not find container \"61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce\": container with ID starting with 61b7f1c70bee4acb733007bc1a74d50ed3154ca958f11af48688d70015cec1ce not found: ID does not exist" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.290091 4707 scope.go:117] "RemoveContainer" containerID="12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545" Jan 21 16:21:45 crc kubenswrapper[4707]: E0121 16:21:45.290498 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545\": container with ID starting with 12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545 not found: ID does not exist" containerID="12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.290521 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545"} err="failed to get container status \"12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545\": rpc error: code = NotFound desc = could not find container \"12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545\": container with ID starting with 12c06accacd032f627aa4aacd199640f71005230979c6f9ceee6e9b6d1ada545 not found: ID does not exist" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.290533 4707 scope.go:117] "RemoveContainer" containerID="95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec" Jan 21 16:21:45 crc kubenswrapper[4707]: E0121 16:21:45.291031 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec\": container with ID starting with 95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec not found: ID does not exist" containerID="95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.291073 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec"} err="failed to get container status \"95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec\": rpc error: code = NotFound desc = could not find container \"95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec\": container with ID starting with 95c896e999cceb93f06a1076c3db6474d59d109d81acd93a9ab13d0d0c218eec not found: ID does not exist" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.451223 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.634451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ssh-key-edpm-compute-global\") pod \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.634565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovncontroller-config-0\") pod \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.634603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovn-combined-ca-bundle\") pod \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.634634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brwnw\" (UniqueName: \"kubernetes.io/projected/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-kube-api-access-brwnw\") pod \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.634682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-inventory\") pod \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\" (UID: \"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc\") " Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.642827 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-kube-api-access-brwnw" (OuterVolumeSpecName: "kube-api-access-brwnw") pod "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" (UID: "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc"). InnerVolumeSpecName "kube-api-access-brwnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.642845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" (UID: "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.649301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" (UID: "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.651060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-inventory" (OuterVolumeSpecName: "inventory") pod "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" (UID: "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.651632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" (UID: "9ee2225c-4c91-4fc9-a247-a4cf255bc7fc"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.737075 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.737113 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.737127 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.737149 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:45 crc kubenswrapper[4707]: I0121 16:21:45.737160 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brwnw\" (UniqueName: \"kubernetes.io/projected/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc-kube-api-access-brwnw\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.262590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" event={"ID":"9ee2225c-4c91-4fc9-a247-a4cf255bc7fc","Type":"ContainerDied","Data":"32d964e613f6f42b731778360c0426621cf67606c63fe745923d698ab86f78dc"} Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.262669 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32d964e613f6f42b731778360c0426621cf67606c63fe745923d698ab86f78dc" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.262746 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265063 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp"] Jan 21 16:21:46 crc kubenswrapper[4707]: E0121 16:21:46.265513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="extract-content" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265532 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="extract-content" Jan 21 16:21:46 crc kubenswrapper[4707]: E0121 16:21:46.265554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="extract-utilities" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="extract-utilities" Jan 21 16:21:46 crc kubenswrapper[4707]: E0121 16:21:46.265582 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" containerName="ovn-edpm-compute-global-edpm-compute-global" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265588 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" containerName="ovn-edpm-compute-global-edpm-compute-global" Jan 21 16:21:46 crc kubenswrapper[4707]: E0121 16:21:46.265595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="registry-server" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265600 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="registry-server" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265751 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" containerName="ovn-edpm-compute-global-edpm-compute-global" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.265768 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" containerName="registry-server" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.266406 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.267765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.268692 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.268767 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.268803 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.268903 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.268995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.270359 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.272798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp"] Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55fsd\" (UniqueName: \"kubernetes.io/projected/b2591815-0b9e-4b37-91a6-3a4c54197b74-kube-api-access-55fsd\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-inventory\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.350594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-inventory\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-2\") pod 
\"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.452524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55fsd\" (UniqueName: \"kubernetes.io/projected/b2591815-0b9e-4b37-91a6-3a4c54197b74-kube-api-access-55fsd\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.456663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-inventory\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.456723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: 
\"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.457464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.457592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.458183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.458630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.458923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.467430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55fsd\" (UniqueName: \"kubernetes.io/projected/b2591815-0b9e-4b37-91a6-3a4c54197b74-kube-api-access-55fsd\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:46 crc kubenswrapper[4707]: I0121 16:21:46.585204 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:47 crc kubenswrapper[4707]: I0121 16:21:47.007170 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp"] Jan 21 16:21:47 crc kubenswrapper[4707]: I0121 16:21:47.189001 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0081bc2d-6952-41a0-9bf5-469ead880674" path="/var/lib/kubelet/pods/0081bc2d-6952-41a0-9bf5-469ead880674/volumes" Jan 21 16:21:47 crc kubenswrapper[4707]: I0121 16:21:47.270656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" event={"ID":"b2591815-0b9e-4b37-91a6-3a4c54197b74","Type":"ContainerStarted","Data":"ed368eed1bf99cb72791c6f002b0e5846b4dcfde9ef9e8e90d7cd79f2bac3ef7"} Jan 21 16:21:48 crc kubenswrapper[4707]: I0121 16:21:48.286838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" event={"ID":"b2591815-0b9e-4b37-91a6-3a4c54197b74","Type":"ContainerStarted","Data":"d5d82a9205f2fb2774d5847ad30470ce4265835df17c9aa1f56d6478cd2fc1ba"} Jan 21 16:21:48 crc kubenswrapper[4707]: I0121 16:21:48.310270 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" podStartSLOduration=1.806305873 podStartE2EDuration="2.310250818s" podCreationTimestamp="2026-01-21 16:21:46 +0000 UTC" firstStartedPulling="2026-01-21 16:21:47.016495786 +0000 UTC m=+4804.198012008" lastFinishedPulling="2026-01-21 16:21:47.520440732 +0000 UTC m=+4804.701956953" observedRunningTime="2026-01-21 16:21:48.300334253 +0000 UTC m=+4805.481850475" watchObservedRunningTime="2026-01-21 16:21:48.310250818 +0000 UTC m=+4805.491767040" Jan 21 16:21:49 crc kubenswrapper[4707]: I0121 16:21:49.295432 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2591815-0b9e-4b37-91a6-3a4c54197b74" containerID="d5d82a9205f2fb2774d5847ad30470ce4265835df17c9aa1f56d6478cd2fc1ba" exitCode=0 Jan 21 16:21:49 crc kubenswrapper[4707]: I0121 16:21:49.295536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" event={"ID":"b2591815-0b9e-4b37-91a6-3a4c54197b74","Type":"ContainerDied","Data":"d5d82a9205f2fb2774d5847ad30470ce4265835df17c9aa1f56d6478cd2fc1ba"} Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.183072 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:21:50 crc kubenswrapper[4707]: E0121 16:21:50.183317 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.511593 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.627776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-ovn-metadata-agent-neutron-config-0\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.627875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55fsd\" (UniqueName: \"kubernetes.io/projected/b2591815-0b9e-4b37-91a6-3a4c54197b74-kube-api-access-55fsd\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.627901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-metadata-combined-ca-bundle\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.627959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-1\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.628487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-inventory\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.628558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-2\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.628582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-0\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.628624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-ssh-key-edpm-compute-global\") pod \"b2591815-0b9e-4b37-91a6-3a4c54197b74\" (UID: \"b2591815-0b9e-4b37-91a6-3a4c54197b74\") " Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.948122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2591815-0b9e-4b37-91a6-3a4c54197b74-kube-api-access-55fsd" (OuterVolumeSpecName: "kube-api-access-55fsd") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). 
InnerVolumeSpecName "kube-api-access-55fsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.953576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.964167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.964493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.965001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.965319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.965497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-inventory" (OuterVolumeSpecName: "inventory") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:50 crc kubenswrapper[4707]: I0121 16:21:50.966029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "b2591815-0b9e-4b37-91a6-3a4c54197b74" (UID: "b2591815-0b9e-4b37-91a6-3a4c54197b74"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036286 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036323 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036336 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036349 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036362 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036377 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55fsd\" (UniqueName: \"kubernetes.io/projected/b2591815-0b9e-4b37-91a6-3a4c54197b74-kube-api-access-55fsd\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036387 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.036400 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/b2591815-0b9e-4b37-91a6-3a4c54197b74-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.312009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" event={"ID":"b2591815-0b9e-4b37-91a6-3a4c54197b74","Type":"ContainerDied","Data":"ed368eed1bf99cb72791c6f002b0e5846b4dcfde9ef9e8e90d7cd79f2bac3ef7"} Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.312074 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed368eed1bf99cb72791c6f002b0e5846b4dcfde9ef9e8e90d7cd79f2bac3ef7" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.312212 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.363756 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg"] Jan 21 16:21:51 crc kubenswrapper[4707]: E0121 16:21:51.364158 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2591815-0b9e-4b37-91a6-3a4c54197b74" containerName="neutron-metadata-edpm-compute-global-edpm-compute-global" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.364182 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2591815-0b9e-4b37-91a6-3a4c54197b74" containerName="neutron-metadata-edpm-compute-global-edpm-compute-global" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.364333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2591815-0b9e-4b37-91a6-3a4c54197b74" containerName="neutron-metadata-edpm-compute-global-edpm-compute-global" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.364855 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.366966 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.366984 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.367232 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.367803 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.367956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.369223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg"] Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.369606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.443153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v55s\" (UniqueName: \"kubernetes.io/projected/35759e50-280d-4a43-a6a1-0afe9b5f918b-kube-api-access-4v55s\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.443254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.443295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-inventory\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.443331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.443373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.544627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v55s\" (UniqueName: \"kubernetes.io/projected/35759e50-280d-4a43-a6a1-0afe9b5f918b-kube-api-access-4v55s\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.544763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.544790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-inventory\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.544874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.544906 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.550418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.551083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.551177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-inventory\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.551766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.563291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v55s\" (UniqueName: \"kubernetes.io/projected/35759e50-280d-4a43-a6a1-0afe9b5f918b-kube-api-access-4v55s\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:51 crc kubenswrapper[4707]: I0121 16:21:51.685107 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:52 crc kubenswrapper[4707]: I0121 16:21:52.068409 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg"] Jan 21 16:21:52 crc kubenswrapper[4707]: W0121 16:21:52.068725 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35759e50_280d_4a43_a6a1_0afe9b5f918b.slice/crio-22cb97081935c8af1a38b712ebfa5b27088bb1712a7350e0b4ec0880766dc04a WatchSource:0}: Error finding container 22cb97081935c8af1a38b712ebfa5b27088bb1712a7350e0b4ec0880766dc04a: Status 404 returned error can't find the container with id 22cb97081935c8af1a38b712ebfa5b27088bb1712a7350e0b4ec0880766dc04a Jan 21 16:21:52 crc kubenswrapper[4707]: I0121 16:21:52.321431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" event={"ID":"35759e50-280d-4a43-a6a1-0afe9b5f918b","Type":"ContainerStarted","Data":"22cb97081935c8af1a38b712ebfa5b27088bb1712a7350e0b4ec0880766dc04a"} Jan 21 16:21:53 crc kubenswrapper[4707]: I0121 16:21:53.327757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" event={"ID":"35759e50-280d-4a43-a6a1-0afe9b5f918b","Type":"ContainerStarted","Data":"bf733a64e88d989b98d87333dd4e59224f43b7cebef520553b6952eccdc1a1e7"} Jan 21 16:21:53 crc kubenswrapper[4707]: I0121 16:21:53.344285 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" podStartSLOduration=1.830988093 podStartE2EDuration="2.344273872s" podCreationTimestamp="2026-01-21 16:21:51 +0000 UTC" firstStartedPulling="2026-01-21 16:21:52.071332786 +0000 UTC m=+4809.252849008" lastFinishedPulling="2026-01-21 16:21:52.584618565 +0000 UTC m=+4809.766134787" observedRunningTime="2026-01-21 16:21:53.341606636 +0000 UTC m=+4810.523122858" watchObservedRunningTime="2026-01-21 16:21:53.344273872 +0000 UTC m=+4810.525790094" Jan 21 16:21:54 crc kubenswrapper[4707]: I0121 16:21:54.338579 4707 generic.go:334] "Generic (PLEG): container finished" podID="35759e50-280d-4a43-a6a1-0afe9b5f918b" containerID="bf733a64e88d989b98d87333dd4e59224f43b7cebef520553b6952eccdc1a1e7" exitCode=0 Jan 21 16:21:54 crc kubenswrapper[4707]: I0121 16:21:54.338666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" event={"ID":"35759e50-280d-4a43-a6a1-0afe9b5f918b","Type":"ContainerDied","Data":"bf733a64e88d989b98d87333dd4e59224f43b7cebef520553b6952eccdc1a1e7"} Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.574997 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.704412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-agent-neutron-config-0\") pod \"35759e50-280d-4a43-a6a1-0afe9b5f918b\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.704638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-combined-ca-bundle\") pod \"35759e50-280d-4a43-a6a1-0afe9b5f918b\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.704731 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-ssh-key-edpm-compute-global\") pod \"35759e50-280d-4a43-a6a1-0afe9b5f918b\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.704831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-inventory\") pod \"35759e50-280d-4a43-a6a1-0afe9b5f918b\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.704871 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v55s\" (UniqueName: \"kubernetes.io/projected/35759e50-280d-4a43-a6a1-0afe9b5f918b-kube-api-access-4v55s\") pod \"35759e50-280d-4a43-a6a1-0afe9b5f918b\" (UID: \"35759e50-280d-4a43-a6a1-0afe9b5f918b\") " Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.710417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "35759e50-280d-4a43-a6a1-0afe9b5f918b" (UID: "35759e50-280d-4a43-a6a1-0afe9b5f918b"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.710881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35759e50-280d-4a43-a6a1-0afe9b5f918b-kube-api-access-4v55s" (OuterVolumeSpecName: "kube-api-access-4v55s") pod "35759e50-280d-4a43-a6a1-0afe9b5f918b" (UID: "35759e50-280d-4a43-a6a1-0afe9b5f918b"). InnerVolumeSpecName "kube-api-access-4v55s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.725977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-inventory" (OuterVolumeSpecName: "inventory") pod "35759e50-280d-4a43-a6a1-0afe9b5f918b" (UID: "35759e50-280d-4a43-a6a1-0afe9b5f918b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.727086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "35759e50-280d-4a43-a6a1-0afe9b5f918b" (UID: "35759e50-280d-4a43-a6a1-0afe9b5f918b"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.732576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "35759e50-280d-4a43-a6a1-0afe9b5f918b" (UID: "35759e50-280d-4a43-a6a1-0afe9b5f918b"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.807014 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.807036 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.807046 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.807057 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v55s\" (UniqueName: \"kubernetes.io/projected/35759e50-280d-4a43-a6a1-0afe9b5f918b-kube-api-access-4v55s\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:55 crc kubenswrapper[4707]: I0121 16:21:55.807066 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35759e50-280d-4a43-a6a1-0afe9b5f918b-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.288452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm"] Jan 21 16:21:56 crc kubenswrapper[4707]: E0121 16:21:56.288777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35759e50-280d-4a43-a6a1-0afe9b5f918b" containerName="neutron-ovn-edpm-compute-global-edpm-compute-global" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.288799 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35759e50-280d-4a43-a6a1-0afe9b5f918b" containerName="neutron-ovn-edpm-compute-global-edpm-compute-global" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.288992 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35759e50-280d-4a43-a6a1-0afe9b5f918b" containerName="neutron-ovn-edpm-compute-global-edpm-compute-global" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.289503 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.291567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.297636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm"] Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.313699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-inventory\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.313832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.313914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgqjm\" (UniqueName: \"kubernetes.io/projected/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-kube-api-access-zgqjm\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.313957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.313999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.352527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" event={"ID":"35759e50-280d-4a43-a6a1-0afe9b5f918b","Type":"ContainerDied","Data":"22cb97081935c8af1a38b712ebfa5b27088bb1712a7350e0b4ec0880766dc04a"} Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.352648 4707 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="22cb97081935c8af1a38b712ebfa5b27088bb1712a7350e0b4ec0880766dc04a" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.352597 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.414666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.415040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgqjm\" (UniqueName: \"kubernetes.io/projected/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-kube-api-access-zgqjm\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.415062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.415096 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.415118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-inventory\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.420983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.420983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-agent-neutron-config-0\") pod 
\"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.421027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-inventory\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.421039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.432797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgqjm\" (UniqueName: \"kubernetes.io/projected/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-kube-api-access-zgqjm\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.605548 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:21:56 crc kubenswrapper[4707]: I0121 16:21:56.978735 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm"] Jan 21 16:21:56 crc kubenswrapper[4707]: W0121 16:21:56.981646 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a6365ee_2fa7_435c_bf86_f37bfdd0dbe7.slice/crio-cf6b2cd8f1d9e912e553bc38f806d464e1815083c7e39159945dd0af27db2dc2 WatchSource:0}: Error finding container cf6b2cd8f1d9e912e553bc38f806d464e1815083c7e39159945dd0af27db2dc2: Status 404 returned error can't find the container with id cf6b2cd8f1d9e912e553bc38f806d464e1815083c7e39159945dd0af27db2dc2 Jan 21 16:21:57 crc kubenswrapper[4707]: I0121 16:21:57.360268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" event={"ID":"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7","Type":"ContainerStarted","Data":"cf6b2cd8f1d9e912e553bc38f806d464e1815083c7e39159945dd0af27db2dc2"} Jan 21 16:21:58 crc kubenswrapper[4707]: I0121 16:21:58.370087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" event={"ID":"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7","Type":"ContainerStarted","Data":"650c72d467b954d2c050226db50c595912756138dcfa5f55014e5641e4f38259"} Jan 21 16:21:58 crc kubenswrapper[4707]: I0121 16:21:58.391123 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" podStartSLOduration=1.913184737 
podStartE2EDuration="2.391103706s" podCreationTimestamp="2026-01-21 16:21:56 +0000 UTC" firstStartedPulling="2026-01-21 16:21:56.984147031 +0000 UTC m=+4814.165663253" lastFinishedPulling="2026-01-21 16:21:57.462066 +0000 UTC m=+4814.643582222" observedRunningTime="2026-01-21 16:21:58.381795754 +0000 UTC m=+4815.563311976" watchObservedRunningTime="2026-01-21 16:21:58.391103706 +0000 UTC m=+4815.572619928" Jan 21 16:21:59 crc kubenswrapper[4707]: I0121 16:21:59.378747 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" containerID="650c72d467b954d2c050226db50c595912756138dcfa5f55014e5641e4f38259" exitCode=0 Jan 21 16:21:59 crc kubenswrapper[4707]: I0121 16:21:59.378798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" event={"ID":"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7","Type":"ContainerDied","Data":"650c72d467b954d2c050226db50c595912756138dcfa5f55014e5641e4f38259"} Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.622494 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.780647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-inventory\") pod \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.780703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-agent-neutron-config-0\") pod \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.780787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgqjm\" (UniqueName: \"kubernetes.io/projected/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-kube-api-access-zgqjm\") pod \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.780843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-ssh-key-edpm-compute-global\") pod \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.780944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-combined-ca-bundle\") pod \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\" (UID: \"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7\") " Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.791253 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" (UID: "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7"). 
InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.791552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-kube-api-access-zgqjm" (OuterVolumeSpecName: "kube-api-access-zgqjm") pod "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" (UID: "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7"). InnerVolumeSpecName "kube-api-access-zgqjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.802997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-inventory" (OuterVolumeSpecName: "inventory") pod "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" (UID: "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.803672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" (UID: "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.804167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" (UID: "0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.883145 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.883180 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.883193 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgqjm\" (UniqueName: \"kubernetes.io/projected/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-kube-api-access-zgqjm\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.883203 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:00 crc kubenswrapper[4707]: I0121 16:22:00.883213 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.399434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" event={"ID":"0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7","Type":"ContainerDied","Data":"cf6b2cd8f1d9e912e553bc38f806d464e1815083c7e39159945dd0af27db2dc2"} Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.399479 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf6b2cd8f1d9e912e553bc38f806d464e1815083c7e39159945dd0af27db2dc2" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.399481 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.447753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7"] Jan 21 16:22:01 crc kubenswrapper[4707]: E0121 16:22:01.448088 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" containerName="neutron-sriov-edpm-compute-global-edpm-compute-global" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.448109 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" containerName="neutron-sriov-edpm-compute-global-edpm-compute-global" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.448267 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" containerName="neutron-sriov-edpm-compute-global-edpm-compute-global" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.448791 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.450378 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.451760 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.452581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.452872 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.455642 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.455658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.463116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7"] Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.490114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.490168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.490190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.591611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlsf\" (UniqueName: \"kubernetes.io/projected/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-kube-api-access-nmlsf\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.592059 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.592171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.592253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.592353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-inventory\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.594689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.594802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.595205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.694777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlsf\" (UniqueName: 
\"kubernetes.io/projected/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-kube-api-access-nmlsf\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.695007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-inventory\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.698361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-inventory\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.711269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlsf\" (UniqueName: \"kubernetes.io/projected/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-kube-api-access-nmlsf\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:01 crc kubenswrapper[4707]: I0121 16:22:01.762554 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:02 crc kubenswrapper[4707]: I0121 16:22:02.157577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7"] Jan 21 16:22:02 crc kubenswrapper[4707]: W0121 16:22:02.158427 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f5a1b5_dda3_4ecd_a2cc_d22ab538cb4a.slice/crio-1c44166e02cdc617b2fa729c5f9d27c234e9d49843f878e04984ec8a671b7491 WatchSource:0}: Error finding container 1c44166e02cdc617b2fa729c5f9d27c234e9d49843f878e04984ec8a671b7491: Status 404 returned error can't find the container with id 1c44166e02cdc617b2fa729c5f9d27c234e9d49843f878e04984ec8a671b7491 Jan 21 16:22:02 crc kubenswrapper[4707]: I0121 16:22:02.408219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" event={"ID":"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a","Type":"ContainerStarted","Data":"1c44166e02cdc617b2fa729c5f9d27c234e9d49843f878e04984ec8a671b7491"} Jan 21 16:22:03 crc kubenswrapper[4707]: I0121 16:22:03.187229 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:22:03 crc kubenswrapper[4707]: E0121 16:22:03.187932 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:22:03 crc kubenswrapper[4707]: I0121 16:22:03.417684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" event={"ID":"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a","Type":"ContainerStarted","Data":"f4d16a658e738624a800253bf5c7aba49d7ef8d5dd35b97fd4f311d411603422"} Jan 21 16:22:03 crc kubenswrapper[4707]: I0121 16:22:03.434211 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" podStartSLOduration=1.838307746 podStartE2EDuration="2.43419451s" podCreationTimestamp="2026-01-21 16:22:01 +0000 UTC" firstStartedPulling="2026-01-21 16:22:02.160536246 +0000 UTC m=+4819.342052467" lastFinishedPulling="2026-01-21 16:22:02.756423009 +0000 UTC m=+4819.937939231" observedRunningTime="2026-01-21 16:22:03.429930021 +0000 UTC m=+4820.611446244" watchObservedRunningTime="2026-01-21 16:22:03.43419451 +0000 UTC m=+4820.615710732" Jan 21 16:22:04 crc kubenswrapper[4707]: I0121 16:22:04.427048 4707 generic.go:334] "Generic (PLEG): container finished" podID="49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" containerID="f4d16a658e738624a800253bf5c7aba49d7ef8d5dd35b97fd4f311d411603422" exitCode=0 Jan 21 16:22:04 crc kubenswrapper[4707]: I0121 16:22:04.427094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" event={"ID":"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a","Type":"ContainerDied","Data":"f4d16a658e738624a800253bf5c7aba49d7ef8d5dd35b97fd4f311d411603422"} Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.645004 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.749231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-ssh-key-edpm-compute-global\") pod \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.749264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-combined-ca-bundle\") pod \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.749288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-inventory\") pod \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.749367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmlsf\" (UniqueName: \"kubernetes.io/projected/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-kube-api-access-nmlsf\") pod \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.749418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-agent-neutron-config-0\") pod \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\" (UID: \"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a\") " Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.753467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-kube-api-access-nmlsf" (OuterVolumeSpecName: "kube-api-access-nmlsf") pod "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" (UID: "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a"). InnerVolumeSpecName "kube-api-access-nmlsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.754202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" (UID: "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.765485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-inventory" (OuterVolumeSpecName: "inventory") pod "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" (UID: "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.766064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" (UID: "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.767545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" (UID: "49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.850536 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmlsf\" (UniqueName: \"kubernetes.io/projected/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-kube-api-access-nmlsf\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.850567 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.850581 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.850591 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:05 crc kubenswrapper[4707]: I0121 16:22:05.850605 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.254603 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw"] Jan 21 16:22:06 crc kubenswrapper[4707]: E0121 16:22:06.254875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" containerName="neutron-dhcp-edpm-compute-global-edpm-compute-global" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.254892 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" containerName="neutron-dhcp-edpm-compute-global-edpm-compute-global" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.255055 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" containerName="neutron-dhcp-edpm-compute-global-edpm-compute-global" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.255490 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.257270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.261256 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw"] Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.356781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-inventory\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.356878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-secret-0\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.356912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2v4\" (UniqueName: \"kubernetes.io/projected/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-kube-api-access-hm2v4\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.357002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.357078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.442674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" event={"ID":"49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a","Type":"ContainerDied","Data":"1c44166e02cdc617b2fa729c5f9d27c234e9d49843f878e04984ec8a671b7491"} Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.442706 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c44166e02cdc617b2fa729c5f9d27c234e9d49843f878e04984ec8a671b7491" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.442913 4707 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.457920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-secret-0\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.458029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2v4\" (UniqueName: \"kubernetes.io/projected/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-kube-api-access-hm2v4\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.458168 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.458285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.458393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-inventory\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.461237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.461377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.461864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-inventory\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.462383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-secret-0\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.472273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2v4\" (UniqueName: \"kubernetes.io/projected/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-kube-api-access-hm2v4\") pod \"libvirt-edpm-compute-global-edpm-compute-global-mfgqw\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.568460 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:06 crc kubenswrapper[4707]: I0121 16:22:06.958677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw"] Jan 21 16:22:06 crc kubenswrapper[4707]: W0121 16:22:06.961648 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0435a7e6_8e20_4f0a_bc93_c7a111eee7c5.slice/crio-b1ef0391169e41b8b67569c7257b3ab72423883c948ae9bd85320eb4ee766b43 WatchSource:0}: Error finding container b1ef0391169e41b8b67569c7257b3ab72423883c948ae9bd85320eb4ee766b43: Status 404 returned error can't find the container with id b1ef0391169e41b8b67569c7257b3ab72423883c948ae9bd85320eb4ee766b43 Jan 21 16:22:07 crc kubenswrapper[4707]: I0121 16:22:07.449179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" event={"ID":"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5","Type":"ContainerStarted","Data":"b1ef0391169e41b8b67569c7257b3ab72423883c948ae9bd85320eb4ee766b43"} Jan 21 16:22:08 crc kubenswrapper[4707]: I0121 16:22:08.455984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" event={"ID":"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5","Type":"ContainerStarted","Data":"a837f6f505eead11786f15192713acef33089b8b0f835a85f7317f2a4474d7b6"} Jan 21 16:22:09 crc kubenswrapper[4707]: I0121 16:22:09.463821 4707 generic.go:334] "Generic (PLEG): container finished" podID="0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" containerID="a837f6f505eead11786f15192713acef33089b8b0f835a85f7317f2a4474d7b6" exitCode=0 Jan 21 16:22:09 crc kubenswrapper[4707]: I0121 16:22:09.463895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" event={"ID":"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5","Type":"ContainerDied","Data":"a837f6f505eead11786f15192713acef33089b8b0f835a85f7317f2a4474d7b6"} Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.679266 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.811962 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-secret-0\") pod \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.812036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-ssh-key-edpm-compute-global\") pod \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.812069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm2v4\" (UniqueName: \"kubernetes.io/projected/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-kube-api-access-hm2v4\") pod \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.812106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-combined-ca-bundle\") pod \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.812197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-inventory\") pod \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\" (UID: \"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5\") " Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.816541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-kube-api-access-hm2v4" (OuterVolumeSpecName: "kube-api-access-hm2v4") pod "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" (UID: "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5"). InnerVolumeSpecName "kube-api-access-hm2v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.816734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" (UID: "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.828721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" (UID: "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.829164 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-inventory" (OuterVolumeSpecName: "inventory") pod "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" (UID: "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.830488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" (UID: "0435a7e6-8e20-4f0a-bc93-c7a111eee7c5"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.913324 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.913352 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.913365 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.913376 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm2v4\" (UniqueName: \"kubernetes.io/projected/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-kube-api-access-hm2v4\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:10 crc kubenswrapper[4707]: I0121 16:22:10.913385 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.477022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" event={"ID":"0435a7e6-8e20-4f0a-bc93-c7a111eee7c5","Type":"ContainerDied","Data":"b1ef0391169e41b8b67569c7257b3ab72423883c948ae9bd85320eb4ee766b43"} Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.477060 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1ef0391169e41b8b67569c7257b3ab72423883c948ae9bd85320eb4ee766b43" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.477077 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.519066 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw"] Jan 21 16:22:11 crc kubenswrapper[4707]: E0121 16:22:11.519381 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" containerName="libvirt-edpm-compute-global-edpm-compute-global" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.519402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" containerName="libvirt-edpm-compute-global-edpm-compute-global" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.519571 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" containerName="libvirt-edpm-compute-global-edpm-compute-global" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.520069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.520880 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.520998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-inventory\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-combined-ca-bundle\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg76h\" (UniqueName: \"kubernetes.io/projected/369dd7f6-dbf8-48ec-ba11-4bcd37527804-kube-api-access-hg76h\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-ssh-key-edpm-compute-global\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " 
pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521656 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.521868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.522011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.522089 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.522269 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.522362 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.523347 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.530628 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw"] Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-ssh-key-edpm-compute-global\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 
16:22:11.622138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-inventory\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-combined-ca-bundle\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.622309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg76h\" (UniqueName: \"kubernetes.io/projected/369dd7f6-dbf8-48ec-ba11-4bcd37527804-kube-api-access-hg76h\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.626215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " 
pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.626250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.626227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.626483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-combined-ca-bundle\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.628743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.628827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-inventory\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.629984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-ssh-key-edpm-compute-global\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.634685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg76h\" (UniqueName: \"kubernetes.io/projected/369dd7f6-dbf8-48ec-ba11-4bcd37527804-kube-api-access-hg76h\") pod \"nova-edpm-compute-global-edpm-compute-global-hdqdw\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:11 crc kubenswrapper[4707]: I0121 16:22:11.834429 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:12 crc kubenswrapper[4707]: I0121 16:22:12.187397 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw"] Jan 21 16:22:12 crc kubenswrapper[4707]: W0121 16:22:12.190434 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod369dd7f6_dbf8_48ec_ba11_4bcd37527804.slice/crio-8036655f2ab308a518e0211959f5354fdbad7af3c5eea80b17be371bcb03abde WatchSource:0}: Error finding container 8036655f2ab308a518e0211959f5354fdbad7af3c5eea80b17be371bcb03abde: Status 404 returned error can't find the container with id 8036655f2ab308a518e0211959f5354fdbad7af3c5eea80b17be371bcb03abde Jan 21 16:22:12 crc kubenswrapper[4707]: I0121 16:22:12.484639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" event={"ID":"369dd7f6-dbf8-48ec-ba11-4bcd37527804","Type":"ContainerStarted","Data":"8036655f2ab308a518e0211959f5354fdbad7af3c5eea80b17be371bcb03abde"} Jan 21 16:22:13 crc kubenswrapper[4707]: I0121 16:22:13.491951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" event={"ID":"369dd7f6-dbf8-48ec-ba11-4bcd37527804","Type":"ContainerStarted","Data":"422a08188411b88a02a2f56165f30287f2ae1ea6fdb95c76331b337b120419af"} Jan 21 16:22:13 crc kubenswrapper[4707]: I0121 16:22:13.503392 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" podStartSLOduration=1.943305583 podStartE2EDuration="2.503379407s" podCreationTimestamp="2026-01-21 16:22:11 +0000 UTC" firstStartedPulling="2026-01-21 16:22:12.192160028 +0000 UTC m=+4829.373676249" lastFinishedPulling="2026-01-21 16:22:12.752233852 +0000 UTC m=+4829.933750073" observedRunningTime="2026-01-21 16:22:13.501697936 +0000 UTC m=+4830.683214168" watchObservedRunningTime="2026-01-21 16:22:13.503379407 +0000 UTC m=+4830.684895630" Jan 21 16:22:14 crc kubenswrapper[4707]: I0121 16:22:14.498958 4707 generic.go:334] "Generic (PLEG): container finished" podID="369dd7f6-dbf8-48ec-ba11-4bcd37527804" containerID="422a08188411b88a02a2f56165f30287f2ae1ea6fdb95c76331b337b120419af" exitCode=0 Jan 21 16:22:14 crc kubenswrapper[4707]: I0121 16:22:14.499173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" event={"ID":"369dd7f6-dbf8-48ec-ba11-4bcd37527804","Type":"ContainerDied","Data":"422a08188411b88a02a2f56165f30287f2ae1ea6fdb95c76331b337b120419af"} Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.701514 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.878921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-1\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.878956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-0\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.879013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-0\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.879053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-inventory\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.879086 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg76h\" (UniqueName: \"kubernetes.io/projected/369dd7f6-dbf8-48ec-ba11-4bcd37527804-kube-api-access-hg76h\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.879136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-combined-ca-bundle\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.879182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-ssh-key-edpm-compute-global\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.879209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-1\") pod \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\" (UID: \"369dd7f6-dbf8-48ec-ba11-4bcd37527804\") " Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.883121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369dd7f6-dbf8-48ec-ba11-4bcd37527804-kube-api-access-hg76h" (OuterVolumeSpecName: "kube-api-access-hg76h") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "kube-api-access-hg76h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.884086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.895507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.895630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-inventory" (OuterVolumeSpecName: "inventory") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.895916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.896058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.896337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.896406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "369dd7f6-dbf8-48ec-ba11-4bcd37527804" (UID: "369dd7f6-dbf8-48ec-ba11-4bcd37527804"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981277 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981309 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981323 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981335 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981347 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981360 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981369 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg76h\" (UniqueName: \"kubernetes.io/projected/369dd7f6-dbf8-48ec-ba11-4bcd37527804-kube-api-access-hg76h\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:15 crc kubenswrapper[4707]: I0121 16:22:15.981379 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369dd7f6-dbf8-48ec-ba11-4bcd37527804-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.255883 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc"] Jan 21 16:22:16 crc kubenswrapper[4707]: E0121 16:22:16.256374 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369dd7f6-dbf8-48ec-ba11-4bcd37527804" containerName="nova-edpm-compute-global-edpm-compute-global" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.256395 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="369dd7f6-dbf8-48ec-ba11-4bcd37527804" containerName="nova-edpm-compute-global-edpm-compute-global" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.256534 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="369dd7f6-dbf8-48ec-ba11-4bcd37527804" containerName="nova-edpm-compute-global-edpm-compute-global" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.256989 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.263325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc"] Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.284734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkll7\" (UniqueName: \"kubernetes.io/projected/a29c002e-9b49-41e1-8dbb-432265dbf327-kube-api-access-qkll7\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.284828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.285610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.285647 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-inventory-0\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.386462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkll7\" (UniqueName: \"kubernetes.io/projected/a29c002e-9b49-41e1-8dbb-432265dbf327-kube-api-access-qkll7\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.386517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.386565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " 
pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.386588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-inventory-0\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.389712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-inventory-0\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.389848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.390065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.400613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkll7\" (UniqueName: \"kubernetes.io/projected/a29c002e-9b49-41e1-8dbb-432265dbf327-kube-api-access-qkll7\") pod \"custom-global-service-edpm-compute-global-pmpwc\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.513420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" event={"ID":"369dd7f6-dbf8-48ec-ba11-4bcd37527804","Type":"ContainerDied","Data":"8036655f2ab308a518e0211959f5354fdbad7af3c5eea80b17be371bcb03abde"} Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.513455 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8036655f2ab308a518e0211959f5354fdbad7af3c5eea80b17be371bcb03abde" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.513659 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.570004 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:16 crc kubenswrapper[4707]: I0121 16:22:16.923573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc"] Jan 21 16:22:16 crc kubenswrapper[4707]: W0121 16:22:16.925270 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29c002e_9b49_41e1_8dbb_432265dbf327.slice/crio-6cd8cd752ff36bd0aced4c57fb5322eb960c898bd6b86a54efe013aaef726732 WatchSource:0}: Error finding container 6cd8cd752ff36bd0aced4c57fb5322eb960c898bd6b86a54efe013aaef726732: Status 404 returned error can't find the container with id 6cd8cd752ff36bd0aced4c57fb5322eb960c898bd6b86a54efe013aaef726732 Jan 21 16:22:17 crc kubenswrapper[4707]: I0121 16:22:17.182756 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:22:17 crc kubenswrapper[4707]: E0121 16:22:17.183062 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:22:17 crc kubenswrapper[4707]: I0121 16:22:17.521115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" event={"ID":"a29c002e-9b49-41e1-8dbb-432265dbf327","Type":"ContainerStarted","Data":"6cd8cd752ff36bd0aced4c57fb5322eb960c898bd6b86a54efe013aaef726732"} Jan 21 16:22:18 crc kubenswrapper[4707]: I0121 16:22:18.529074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" event={"ID":"a29c002e-9b49-41e1-8dbb-432265dbf327","Type":"ContainerStarted","Data":"0b094ec3ec50cf0f292a11cda352b4f49d0f95fd6b097d1ad3eb6efe76e6b7ce"} Jan 21 16:22:18 crc kubenswrapper[4707]: I0121 16:22:18.542459 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" podStartSLOduration=2.022485695 podStartE2EDuration="2.542445844s" podCreationTimestamp="2026-01-21 16:22:16 +0000 UTC" firstStartedPulling="2026-01-21 16:22:16.927553458 +0000 UTC m=+4834.109069680" lastFinishedPulling="2026-01-21 16:22:17.447513608 +0000 UTC m=+4834.629029829" observedRunningTime="2026-01-21 16:22:18.538246428 +0000 UTC m=+4835.719762649" watchObservedRunningTime="2026-01-21 16:22:18.542445844 +0000 UTC m=+4835.723962066" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.546592 4707 generic.go:334] "Generic (PLEG): container finished" podID="a29c002e-9b49-41e1-8dbb-432265dbf327" containerID="0b094ec3ec50cf0f292a11cda352b4f49d0f95fd6b097d1ad3eb6efe76e6b7ce" exitCode=0 Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.546676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" event={"ID":"a29c002e-9b49-41e1-8dbb-432265dbf327","Type":"ContainerDied","Data":"0b094ec3ec50cf0f292a11cda352b4f49d0f95fd6b097d1ad3eb6efe76e6b7ce"} Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.835636 4707 scope.go:117] 
"RemoveContainer" containerID="b895c33c9e96e01533b1f6162a9b38e42a966f465e3f68cd50d1c00b4fe94657" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.866415 4707 scope.go:117] "RemoveContainer" containerID="581de8169d595296309fc4da76d672bf0da9c756c1bb9e4be004b67a1328005f" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.881579 4707 scope.go:117] "RemoveContainer" containerID="49fe3cdf415f022d620f2d71a91e520b9ef9b344fdc73a3dfc14029d3d981a93" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.899040 4707 scope.go:117] "RemoveContainer" containerID="1ec9f8e9cdd570c1c0f9a3cff013e5233069dabb7c23b0432d95418fcef9886f" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.927580 4707 scope.go:117] "RemoveContainer" containerID="b2f29327c12a3fe9a1b70817c1826ceec703c5e985e6fecbd37b4a63aadbba45" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.941011 4707 scope.go:117] "RemoveContainer" containerID="b64f3b8df76efe4f95911f51150ea7f0f45391b29547dd5c2934533c217187fa" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.975104 4707 scope.go:117] "RemoveContainer" containerID="cf355fbbc590a30098b111959e33b999901e90e3f0d26f06bc1c371e2d6021e0" Jan 21 16:22:20 crc kubenswrapper[4707]: I0121 16:22:20.987612 4707 scope.go:117] "RemoveContainer" containerID="b52afd5f2e01f239ce45f1d19e62887eb3138eeefb455653380278b022c9f0a2" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.002969 4707 scope.go:117] "RemoveContainer" containerID="22d5a5d536394a7c6034ffab06541d4868a12a093533861fc0bb90582e15484c" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.024552 4707 scope.go:117] "RemoveContainer" containerID="c6d62380a3ef675f453ba1bede62df33562aea21e545e2c9252332cfb4df99bd" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.044212 4707 scope.go:117] "RemoveContainer" containerID="a35a0a3b6e7558ea488bb3ad5745e40ecbe3628ac61c149c4cd49b174c91a74f" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.058556 4707 scope.go:117] "RemoveContainer" containerID="3d192501b8ea2c70eceb4505c10c6f626fcbc95a51cb226de357496781cd162e" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.070185 4707 scope.go:117] "RemoveContainer" containerID="3b952b60555ceb376dabbb9df62869e70378873b1c14e61dbc327e332bbb85b6" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.086080 4707 scope.go:117] "RemoveContainer" containerID="3a483ecb202201b5633ca0fcde9357132b9ebe781a6f6f398d8cff7856e0cd29" Jan 21 16:22:21 crc kubenswrapper[4707]: I0121 16:22:21.099286 4707 scope.go:117] "RemoveContainer" containerID="b1866e7c5c31c45c31a0a123206f96d06c54b25f3bf493859e89f55cf818e3aa" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.099575 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.257336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-custom-global-service-combined-ca-bundle\") pod \"a29c002e-9b49-41e1-8dbb-432265dbf327\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.257399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-ssh-key-edpm-compute-global\") pod \"a29c002e-9b49-41e1-8dbb-432265dbf327\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.257488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-inventory-0\") pod \"a29c002e-9b49-41e1-8dbb-432265dbf327\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.257518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkll7\" (UniqueName: \"kubernetes.io/projected/a29c002e-9b49-41e1-8dbb-432265dbf327-kube-api-access-qkll7\") pod \"a29c002e-9b49-41e1-8dbb-432265dbf327\" (UID: \"a29c002e-9b49-41e1-8dbb-432265dbf327\") " Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.261382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29c002e-9b49-41e1-8dbb-432265dbf327-kube-api-access-qkll7" (OuterVolumeSpecName: "kube-api-access-qkll7") pod "a29c002e-9b49-41e1-8dbb-432265dbf327" (UID: "a29c002e-9b49-41e1-8dbb-432265dbf327"). InnerVolumeSpecName "kube-api-access-qkll7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.261420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "a29c002e-9b49-41e1-8dbb-432265dbf327" (UID: "a29c002e-9b49-41e1-8dbb-432265dbf327"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.273548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "a29c002e-9b49-41e1-8dbb-432265dbf327" (UID: "a29c002e-9b49-41e1-8dbb-432265dbf327"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.274158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a29c002e-9b49-41e1-8dbb-432265dbf327" (UID: "a29c002e-9b49-41e1-8dbb-432265dbf327"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.359550 4707 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.359592 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkll7\" (UniqueName: \"kubernetes.io/projected/a29c002e-9b49-41e1-8dbb-432265dbf327-kube-api-access-qkll7\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.359607 4707 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.359617 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a29c002e-9b49-41e1-8dbb-432265dbf327-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.561644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" event={"ID":"a29c002e-9b49-41e1-8dbb-432265dbf327","Type":"ContainerDied","Data":"6cd8cd752ff36bd0aced4c57fb5322eb960c898bd6b86a54efe013aaef726732"} Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.561683 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cd8cd752ff36bd0aced4c57fb5322eb960c898bd6b86a54efe013aaef726732" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.561706 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.913937 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5"] Jan 21 16:22:22 crc kubenswrapper[4707]: E0121 16:22:22.914223 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29c002e-9b49-41e1-8dbb-432265dbf327" containerName="custom-global-service-edpm-compute-global" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.914243 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29c002e-9b49-41e1-8dbb-432265dbf327" containerName="custom-global-service-edpm-compute-global" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.914401 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29c002e-9b49-41e1-8dbb-432265dbf327" containerName="custom-global-service-edpm-compute-global" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.915123 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.919192 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.921871 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5"] Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.967058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-config\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.967105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.967183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5dj\" (UniqueName: \"kubernetes.io/projected/0d14c6be-c04f-4eae-bce6-a850a6382cfb-kube-api-access-wr5dj\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.967199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:22 crc kubenswrapper[4707]: I0121 16:22:22.967265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-global\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.068515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-global\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.068580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-config\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.068613 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.068717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5dj\" (UniqueName: \"kubernetes.io/projected/0d14c6be-c04f-4eae-bce6-a850a6382cfb-kube-api-access-wr5dj\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.068734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.069338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-global\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.069371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-config\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.069463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.069526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.087577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5dj\" (UniqueName: \"kubernetes.io/projected/0d14c6be-c04f-4eae-bce6-a850a6382cfb-kube-api-access-wr5dj\") pod \"dnsmasq-dnsmasq-6668544499-tlgm5\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.228192 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:23 crc kubenswrapper[4707]: I0121 16:22:23.586413 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5"] Jan 21 16:22:24 crc kubenswrapper[4707]: I0121 16:22:24.576190 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerID="a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd" exitCode=0 Jan 21 16:22:24 crc kubenswrapper[4707]: I0121 16:22:24.576483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" event={"ID":"0d14c6be-c04f-4eae-bce6-a850a6382cfb","Type":"ContainerDied","Data":"a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd"} Jan 21 16:22:24 crc kubenswrapper[4707]: I0121 16:22:24.576589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" event={"ID":"0d14c6be-c04f-4eae-bce6-a850a6382cfb","Type":"ContainerStarted","Data":"55df3a8b4d347c20ca49da25af820fef6f2ac44ceffe888ab353dd1d1d5a16e8"} Jan 21 16:22:25 crc kubenswrapper[4707]: I0121 16:22:25.583566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" event={"ID":"0d14c6be-c04f-4eae-bce6-a850a6382cfb","Type":"ContainerStarted","Data":"de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103"} Jan 21 16:22:25 crc kubenswrapper[4707]: I0121 16:22:25.583694 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:25 crc kubenswrapper[4707]: I0121 16:22:25.596684 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" podStartSLOduration=3.596666588 podStartE2EDuration="3.596666588s" podCreationTimestamp="2026-01-21 16:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:22:25.595913721 +0000 UTC m=+4842.777429944" watchObservedRunningTime="2026-01-21 16:22:25.596666588 +0000 UTC m=+4842.778182809" Jan 21 16:22:32 crc kubenswrapper[4707]: I0121 16:22:32.182550 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:22:32 crc kubenswrapper[4707]: E0121 16:22:32.183117 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.228966 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.271569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb"] Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.271790 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" 
podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerName="dnsmasq-dns" containerID="cri-o://897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60" gracePeriod=10 Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.596691 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.629941 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerID="897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60" exitCode=0 Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.630010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" event={"ID":"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3","Type":"ContainerDied","Data":"897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60"} Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.630249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" event={"ID":"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3","Type":"ContainerDied","Data":"c7ff6ba43dcc6e193fa942b86d8782e659008bf9cfc09eba347b1014ae41a1e5"} Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.630013 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.630312 4707 scope.go:117] "RemoveContainer" containerID="897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.653670 4707 scope.go:117] "RemoveContainer" containerID="b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.666992 4707 scope.go:117] "RemoveContainer" containerID="897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60" Jan 21 16:22:33 crc kubenswrapper[4707]: E0121 16:22:33.667298 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60\": container with ID starting with 897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60 not found: ID does not exist" containerID="897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.667328 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60"} err="failed to get container status \"897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60\": rpc error: code = NotFound desc = could not find container \"897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60\": container with ID starting with 897253db34c8687eb002d76a6474a28ce65ac1648e750775b558f3a8764daf60 not found: ID does not exist" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.667348 4707 scope.go:117] "RemoveContainer" containerID="b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4" Jan 21 16:22:33 crc kubenswrapper[4707]: E0121 16:22:33.667635 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4\": container with ID 
starting with b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4 not found: ID does not exist" containerID="b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.667662 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4"} err="failed to get container status \"b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4\": rpc error: code = NotFound desc = could not find container \"b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4\": container with ID starting with b7594e18ad4e2e29c71ebd248ff13181b001a395b3af5135d07377514e526ce4 not found: ID does not exist" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.690522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-edpm-compute-global\") pod \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.690558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mcnm\" (UniqueName: \"kubernetes.io/projected/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-kube-api-access-9mcnm\") pod \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.690603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-config\") pod \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.690622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-dnsmasq-svc\") pod \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\" (UID: \"f4895e15-b8fb-4bf1-92aa-c5abeb9795e3\") " Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.694408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-kube-api-access-9mcnm" (OuterVolumeSpecName: "kube-api-access-9mcnm") pod "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" (UID: "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3"). InnerVolumeSpecName "kube-api-access-9mcnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.714558 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-edpm-compute-global" (OuterVolumeSpecName: "edpm-compute-global") pod "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" (UID: "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3"). InnerVolumeSpecName "edpm-compute-global". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.714926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" (UID: "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.715498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-config" (OuterVolumeSpecName: "config") pod "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" (UID: "f4895e15-b8fb-4bf1-92aa-c5abeb9795e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.791884 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.791902 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.791913 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.791925 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mcnm\" (UniqueName: \"kubernetes.io/projected/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3-kube-api-access-9mcnm\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.953720 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb"] Jan 21 16:22:33 crc kubenswrapper[4707]: I0121 16:22:33.959179 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-rvvqb"] Jan 21 16:22:34 crc kubenswrapper[4707]: E0121 16:22:34.010723 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4895e15_b8fb_4bf1_92aa_c5abeb9795e3.slice\": RecentStats: unable to find data in memory cache]" Jan 21 16:22:35 crc kubenswrapper[4707]: I0121 16:22:35.189536 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" path="/var/lib/kubelet/pods/f4895e15-b8fb-4bf1-92aa-c5abeb9795e3/volumes" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.897473 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd"] Jan 21 16:22:37 crc kubenswrapper[4707]: E0121 16:22:37.897947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerName="init" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.897961 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerName="init" Jan 21 16:22:37 crc kubenswrapper[4707]: E0121 16:22:37.897973 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerName="dnsmasq-dns" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.897979 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerName="dnsmasq-dns" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.898161 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4895e15-b8fb-4bf1-92aa-c5abeb9795e3" containerName="dnsmasq-dns" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.898587 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.901228 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.901624 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.901945 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.905843 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.910342 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s"] Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.911377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.912724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-kxsc5" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.912726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.923311 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd"] Jan 21 16:22:37 crc kubenswrapper[4707]: I0121 16:22:37.932325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s"] Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.032601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.033026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knxd\" (UniqueName: \"kubernetes.io/projected/59db911b-3f76-46f6-a197-0ff5218a9deb-kube-api-access-7knxd\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.033090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: 
\"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.033134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.033157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64jr\" (UniqueName: \"kubernetes.io/projected/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-kube-api-access-m64jr\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.033202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.134683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.134729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knxd\" (UniqueName: \"kubernetes.io/projected/59db911b-3f76-46f6-a197-0ff5218a9deb-kube-api-access-7knxd\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.134753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.134773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: 
\"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.134792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m64jr\" (UniqueName: \"kubernetes.io/projected/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-kube-api-access-m64jr\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.134854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.139334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.139446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.139600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.139962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.147555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64jr\" (UniqueName: \"kubernetes.io/projected/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-kube-api-access-m64jr\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.148452 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7knxd\" (UniqueName: \"kubernetes.io/projected/59db911b-3f76-46f6-a197-0ff5218a9deb-kube-api-access-7knxd\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-wx7xd\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.212134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.223284 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.565786 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd"] Jan 21 16:22:38 crc kubenswrapper[4707]: W0121 16:22:38.565847 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59db911b_3f76_46f6_a197_0ff5218a9deb.slice/crio-816628b20fb140582c4719155239105ac661d5327b5724658f71c5bbbe94cab2 WatchSource:0}: Error finding container 816628b20fb140582c4719155239105ac661d5327b5724658f71c5bbbe94cab2: Status 404 returned error can't find the container with id 816628b20fb140582c4719155239105ac661d5327b5724658f71c5bbbe94cab2 Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.604243 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s"] Jan 21 16:22:38 crc kubenswrapper[4707]: W0121 16:22:38.605688 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ce54cb_82e7_40fe_a40b_d96f67dae3be.slice/crio-6510690f5e43ce301456d2bb287c4923d8cb4efefc5f6ffca31ee73ff30c710e WatchSource:0}: Error finding container 6510690f5e43ce301456d2bb287c4923d8cb4efefc5f6ffca31ee73ff30c710e: Status 404 returned error can't find the container with id 6510690f5e43ce301456d2bb287c4923d8cb4efefc5f6ffca31ee73ff30c710e Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.661179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" event={"ID":"59db911b-3f76-46f6-a197-0ff5218a9deb","Type":"ContainerStarted","Data":"816628b20fb140582c4719155239105ac661d5327b5724658f71c5bbbe94cab2"} Jan 21 16:22:38 crc kubenswrapper[4707]: I0121 16:22:38.662236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" event={"ID":"b8ce54cb-82e7-40fe-a40b-d96f67dae3be","Type":"ContainerStarted","Data":"6510690f5e43ce301456d2bb287c4923d8cb4efefc5f6ffca31ee73ff30c710e"} Jan 21 16:22:39 crc kubenswrapper[4707]: I0121 16:22:39.669503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" event={"ID":"59db911b-3f76-46f6-a197-0ff5218a9deb","Type":"ContainerStarted","Data":"df7fb3bfab8b15a9334cfffdbdb6b11fe8bd01ca13669118cae10250b9edb0f6"} Jan 21 16:22:39 crc kubenswrapper[4707]: I0121 16:22:39.671590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" event={"ID":"b8ce54cb-82e7-40fe-a40b-d96f67dae3be","Type":"ContainerStarted","Data":"0496b69d6767e2c509fbac99b5dfb60856a1e7c835958c320b4add0c59b28805"} Jan 21 16:22:39 crc kubenswrapper[4707]: I0121 16:22:39.682956 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" podStartSLOduration=2.036597536 podStartE2EDuration="2.682946514s" podCreationTimestamp="2026-01-21 16:22:37 +0000 UTC" firstStartedPulling="2026-01-21 16:22:38.56774794 +0000 UTC m=+4855.749264162" lastFinishedPulling="2026-01-21 16:22:39.214096918 +0000 UTC m=+4856.395613140" observedRunningTime="2026-01-21 16:22:39.680165364 +0000 UTC m=+4856.861681586" watchObservedRunningTime="2026-01-21 16:22:39.682946514 +0000 UTC m=+4856.864462736" Jan 21 16:22:39 crc kubenswrapper[4707]: I0121 16:22:39.697195 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" podStartSLOduration=2.216233535 podStartE2EDuration="2.697175519s" podCreationTimestamp="2026-01-21 16:22:37 +0000 UTC" firstStartedPulling="2026-01-21 16:22:38.607401421 +0000 UTC m=+4855.788917643" lastFinishedPulling="2026-01-21 16:22:39.088343404 +0000 UTC m=+4856.269859627" observedRunningTime="2026-01-21 16:22:39.691095426 +0000 UTC m=+4856.872611648" watchObservedRunningTime="2026-01-21 16:22:39.697175519 +0000 UTC m=+4856.878691741" Jan 21 16:22:40 crc kubenswrapper[4707]: I0121 16:22:40.678643 4707 generic.go:334] "Generic (PLEG): container finished" podID="59db911b-3f76-46f6-a197-0ff5218a9deb" containerID="df7fb3bfab8b15a9334cfffdbdb6b11fe8bd01ca13669118cae10250b9edb0f6" exitCode=0 Jan 21 16:22:40 crc kubenswrapper[4707]: I0121 16:22:40.678755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" event={"ID":"59db911b-3f76-46f6-a197-0ff5218a9deb","Type":"ContainerDied","Data":"df7fb3bfab8b15a9334cfffdbdb6b11fe8bd01ca13669118cae10250b9edb0f6"} Jan 21 16:22:40 crc kubenswrapper[4707]: I0121 16:22:40.680803 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8ce54cb-82e7-40fe-a40b-d96f67dae3be" containerID="0496b69d6767e2c509fbac99b5dfb60856a1e7c835958c320b4add0c59b28805" exitCode=0 Jan 21 16:22:40 crc kubenswrapper[4707]: I0121 16:22:40.680857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" event={"ID":"b8ce54cb-82e7-40fe-a40b-d96f67dae3be","Type":"ContainerDied","Data":"0496b69d6767e2c509fbac99b5dfb60856a1e7c835958c320b4add0c59b28805"} Jan 21 16:22:41 crc kubenswrapper[4707]: I0121 16:22:41.927517 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:41 crc kubenswrapper[4707]: I0121 16:22:41.932164 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.078048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-ssh-key-edpm-compute-global\") pod \"59db911b-3f76-46f6-a197-0ff5218a9deb\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.078129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-inventory\") pod \"59db911b-3f76-46f6-a197-0ff5218a9deb\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.078210 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m64jr\" (UniqueName: \"kubernetes.io/projected/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-kube-api-access-m64jr\") pod \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.078281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-ssh-key-edpm-compute-beta-nodeset\") pod \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.078344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7knxd\" (UniqueName: \"kubernetes.io/projected/59db911b-3f76-46f6-a197-0ff5218a9deb-kube-api-access-7knxd\") pod \"59db911b-3f76-46f6-a197-0ff5218a9deb\" (UID: \"59db911b-3f76-46f6-a197-0ff5218a9deb\") " Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.078400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-inventory\") pod \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\" (UID: \"b8ce54cb-82e7-40fe-a40b-d96f67dae3be\") " Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.083890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59db911b-3f76-46f6-a197-0ff5218a9deb-kube-api-access-7knxd" (OuterVolumeSpecName: "kube-api-access-7knxd") pod "59db911b-3f76-46f6-a197-0ff5218a9deb" (UID: "59db911b-3f76-46f6-a197-0ff5218a9deb"). InnerVolumeSpecName "kube-api-access-7knxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.084157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-kube-api-access-m64jr" (OuterVolumeSpecName: "kube-api-access-m64jr") pod "b8ce54cb-82e7-40fe-a40b-d96f67dae3be" (UID: "b8ce54cb-82e7-40fe-a40b-d96f67dae3be"). InnerVolumeSpecName "kube-api-access-m64jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.095946 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "b8ce54cb-82e7-40fe-a40b-d96f67dae3be" (UID: "b8ce54cb-82e7-40fe-a40b-d96f67dae3be"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.096390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "59db911b-3f76-46f6-a197-0ff5218a9deb" (UID: "59db911b-3f76-46f6-a197-0ff5218a9deb"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.096728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-inventory" (OuterVolumeSpecName: "inventory") pod "59db911b-3f76-46f6-a197-0ff5218a9deb" (UID: "59db911b-3f76-46f6-a197-0ff5218a9deb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.097083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-inventory" (OuterVolumeSpecName: "inventory") pod "b8ce54cb-82e7-40fe-a40b-d96f67dae3be" (UID: "b8ce54cb-82e7-40fe-a40b-d96f67dae3be"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.179799 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.179850 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7knxd\" (UniqueName: \"kubernetes.io/projected/59db911b-3f76-46f6-a197-0ff5218a9deb-kube-api-access-7knxd\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.179861 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.179872 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.179882 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59db911b-3f76-46f6-a197-0ff5218a9deb-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.179891 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m64jr\" (UniqueName: \"kubernetes.io/projected/b8ce54cb-82e7-40fe-a40b-d96f67dae3be-kube-api-access-m64jr\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.695415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" event={"ID":"b8ce54cb-82e7-40fe-a40b-d96f67dae3be","Type":"ContainerDied","Data":"6510690f5e43ce301456d2bb287c4923d8cb4efefc5f6ffca31ee73ff30c710e"} Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.695462 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6510690f5e43ce301456d2bb287c4923d8cb4efefc5f6ffca31ee73ff30c710e" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.695466 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.696867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" event={"ID":"59db911b-3f76-46f6-a197-0ff5218a9deb","Type":"ContainerDied","Data":"816628b20fb140582c4719155239105ac661d5327b5724658f71c5bbbe94cab2"} Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.696893 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.696909 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="816628b20fb140582c4719155239105ac661d5327b5724658f71c5bbbe94cab2" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.745825 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t"] Jan 21 16:22:42 crc kubenswrapper[4707]: E0121 16:22:42.746194 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ce54cb-82e7-40fe-a40b-d96f67dae3be" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.746214 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ce54cb-82e7-40fe-a40b-d96f67dae3be" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:22:42 crc kubenswrapper[4707]: E0121 16:22:42.746231 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59db911b-3f76-46f6-a197-0ff5218a9deb" containerName="download-cache-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.746238 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="59db911b-3f76-46f6-a197-0ff5218a9deb" containerName="download-cache-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.746400 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="59db911b-3f76-46f6-a197-0ff5218a9deb" containerName="download-cache-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.746415 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ce54cb-82e7-40fe-a40b-d96f67dae3be" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.746917 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.748359 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.748623 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.749300 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.749618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.749674 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.751178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t"] Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.765755 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q"] Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.766622 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.767893 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.774046 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-kxsc5" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.779614 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q"] Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v97w\" (UniqueName: \"kubernetes.io/projected/7d5824e3-dcc7-4975-8d30-a7b46a67a196-kube-api-access-9v97w\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-ssh-key-edpm-compute-global\") pod 
\"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvxp\" (UniqueName: \"kubernetes.io/projected/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-kube-api-access-5jvxp\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.787644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v97w\" (UniqueName: \"kubernetes.io/projected/7d5824e3-dcc7-4975-8d30-a7b46a67a196-kube-api-access-9v97w\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " 
pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvxp\" (UniqueName: \"kubernetes.io/projected/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-kube-api-access-5jvxp\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.888545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.892067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.892137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.892166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.892501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.893032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.893257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.901579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v97w\" (UniqueName: \"kubernetes.io/projected/7d5824e3-dcc7-4975-8d30-a7b46a67a196-kube-api-access-9v97w\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:42 crc kubenswrapper[4707]: I0121 16:22:42.901689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvxp\" (UniqueName: \"kubernetes.io/projected/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-kube-api-access-5jvxp\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.059084 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.085848 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.196396 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.431166 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t"] Jan 21 16:22:43 crc kubenswrapper[4707]: W0121 16:22:43.432803 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e7d9e33_cfa5_4c05_adc5_a3810fbba263.slice/crio-0e43246f1dabd0dd79810f23d980482c09fa5bd0084c6928cb3fe5d2352fcccd WatchSource:0}: Error finding container 0e43246f1dabd0dd79810f23d980482c09fa5bd0084c6928cb3fe5d2352fcccd: Status 404 returned error can't find the container with id 0e43246f1dabd0dd79810f23d980482c09fa5bd0084c6928cb3fe5d2352fcccd Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.464084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q"] Jan 21 16:22:43 crc kubenswrapper[4707]: W0121 16:22:43.466175 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5824e3_dcc7_4975_8d30_a7b46a67a196.slice/crio-285a2d135cdf9d5b2c2d7d99d3d9d7951c79f549669e3e4a4562e9d6df7da226 WatchSource:0}: Error finding container 285a2d135cdf9d5b2c2d7d99d3d9d7951c79f549669e3e4a4562e9d6df7da226: Status 404 returned error can't find the container with id 285a2d135cdf9d5b2c2d7d99d3d9d7951c79f549669e3e4a4562e9d6df7da226 Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.705553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" event={"ID":"7d5824e3-dcc7-4975-8d30-a7b46a67a196","Type":"ContainerStarted","Data":"285a2d135cdf9d5b2c2d7d99d3d9d7951c79f549669e3e4a4562e9d6df7da226"} Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.706975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" event={"ID":"6e7d9e33-cfa5-4c05-adc5-a3810fbba263","Type":"ContainerStarted","Data":"0e43246f1dabd0dd79810f23d980482c09fa5bd0084c6928cb3fe5d2352fcccd"} Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.709003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"4385da0860f68c6b4fbe1ca2a1e4ddb0b3259ea351cab52fa0f587fb0728043a"} Jan 21 16:22:43 crc kubenswrapper[4707]: I0121 16:22:43.927206 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:44 crc kubenswrapper[4707]: I0121 16:22:44.717561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" event={"ID":"7d5824e3-dcc7-4975-8d30-a7b46a67a196","Type":"ContainerStarted","Data":"85300f8d0bfdb9a98a82f1febe13582888bfe1b1840beff0b872a80ef6819294"} Jan 21 16:22:44 crc kubenswrapper[4707]: I0121 16:22:44.719256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" event={"ID":"6e7d9e33-cfa5-4c05-adc5-a3810fbba263","Type":"ContainerStarted","Data":"032d65d5a61ac6401bcde8a7135e0de95b989173b450fb1ffa8dbbf5d2b3cfee"} Jan 21 16:22:44 crc kubenswrapper[4707]: I0121 16:22:44.731703 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" podStartSLOduration=2.105052142 podStartE2EDuration="2.731685318s" podCreationTimestamp="2026-01-21 16:22:42 +0000 UTC" firstStartedPulling="2026-01-21 16:22:43.468307794 +0000 UTC m=+4860.649824016" lastFinishedPulling="2026-01-21 16:22:44.09494097 +0000 UTC m=+4861.276457192" observedRunningTime="2026-01-21 16:22:44.728113452 +0000 UTC m=+4861.909629673" watchObservedRunningTime="2026-01-21 16:22:44.731685318 +0000 UTC m=+4861.913201540" Jan 21 16:22:44 crc kubenswrapper[4707]: I0121 16:22:44.744161 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" podStartSLOduration=2.253391664 podStartE2EDuration="2.744144135s" podCreationTimestamp="2026-01-21 16:22:42 +0000 UTC" firstStartedPulling="2026-01-21 16:22:43.434898064 +0000 UTC m=+4860.616414285" lastFinishedPulling="2026-01-21 16:22:43.925650533 +0000 UTC m=+4861.107166756" observedRunningTime="2026-01-21 16:22:44.739686863 +0000 UTC m=+4861.921203085" watchObservedRunningTime="2026-01-21 16:22:44.744144135 +0000 UTC m=+4861.925660357" Jan 21 16:22:45 crc kubenswrapper[4707]: I0121 16:22:45.727331 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d5824e3-dcc7-4975-8d30-a7b46a67a196" containerID="85300f8d0bfdb9a98a82f1febe13582888bfe1b1840beff0b872a80ef6819294" exitCode=0 Jan 21 16:22:45 crc kubenswrapper[4707]: I0121 16:22:45.727382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" event={"ID":"7d5824e3-dcc7-4975-8d30-a7b46a67a196","Type":"ContainerDied","Data":"85300f8d0bfdb9a98a82f1febe13582888bfe1b1840beff0b872a80ef6819294"} Jan 21 16:22:45 crc kubenswrapper[4707]: I0121 16:22:45.728998 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e7d9e33-cfa5-4c05-adc5-a3810fbba263" containerID="032d65d5a61ac6401bcde8a7135e0de95b989173b450fb1ffa8dbbf5d2b3cfee" exitCode=0 Jan 21 16:22:45 crc kubenswrapper[4707]: I0121 16:22:45.729028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" event={"ID":"6e7d9e33-cfa5-4c05-adc5-a3810fbba263","Type":"ContainerDied","Data":"032d65d5a61ac6401bcde8a7135e0de95b989173b450fb1ffa8dbbf5d2b3cfee"} Jan 21 16:22:46 crc kubenswrapper[4707]: I0121 16:22:46.972981 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:46 crc kubenswrapper[4707]: I0121 16:22:46.976408 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.137944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v97w\" (UniqueName: \"kubernetes.io/projected/7d5824e3-dcc7-4975-8d30-a7b46a67a196-kube-api-access-9v97w\") pod \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jvxp\" (UniqueName: \"kubernetes.io/projected/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-kube-api-access-5jvxp\") pod \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-inventory\") pod \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-bootstrap-combined-ca-bundle\") pod \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-ssh-key-edpm-compute-beta-nodeset\") pod \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\" (UID: \"7d5824e3-dcc7-4975-8d30-a7b46a67a196\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138292 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-bootstrap-combined-ca-bundle\") pod \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138318 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-ssh-key-edpm-compute-global\") pod \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.138365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-inventory\") pod \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\" (UID: \"6e7d9e33-cfa5-4c05-adc5-a3810fbba263\") " Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.142562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7d5824e3-dcc7-4975-8d30-a7b46a67a196" (UID: "7d5824e3-dcc7-4975-8d30-a7b46a67a196"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.142559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6e7d9e33-cfa5-4c05-adc5-a3810fbba263" (UID: "6e7d9e33-cfa5-4c05-adc5-a3810fbba263"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.143028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5824e3-dcc7-4975-8d30-a7b46a67a196-kube-api-access-9v97w" (OuterVolumeSpecName: "kube-api-access-9v97w") pod "7d5824e3-dcc7-4975-8d30-a7b46a67a196" (UID: "7d5824e3-dcc7-4975-8d30-a7b46a67a196"). InnerVolumeSpecName "kube-api-access-9v97w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.143454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-kube-api-access-5jvxp" (OuterVolumeSpecName: "kube-api-access-5jvxp") pod "6e7d9e33-cfa5-4c05-adc5-a3810fbba263" (UID: "6e7d9e33-cfa5-4c05-adc5-a3810fbba263"). InnerVolumeSpecName "kube-api-access-5jvxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.154941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "6e7d9e33-cfa5-4c05-adc5-a3810fbba263" (UID: "6e7d9e33-cfa5-4c05-adc5-a3810fbba263"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.155113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-inventory" (OuterVolumeSpecName: "inventory") pod "6e7d9e33-cfa5-4c05-adc5-a3810fbba263" (UID: "6e7d9e33-cfa5-4c05-adc5-a3810fbba263"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.155797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "7d5824e3-dcc7-4975-8d30-a7b46a67a196" (UID: "7d5824e3-dcc7-4975-8d30-a7b46a67a196"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.156098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-inventory" (OuterVolumeSpecName: "inventory") pod "7d5824e3-dcc7-4975-8d30-a7b46a67a196" (UID: "7d5824e3-dcc7-4975-8d30-a7b46a67a196"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239587 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239614 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239625 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239634 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239644 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239653 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v97w\" (UniqueName: \"kubernetes.io/projected/7d5824e3-dcc7-4975-8d30-a7b46a67a196-kube-api-access-9v97w\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239662 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jvxp\" (UniqueName: \"kubernetes.io/projected/6e7d9e33-cfa5-4c05-adc5-a3810fbba263-kube-api-access-5jvxp\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.239670 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d5824e3-dcc7-4975-8d30-a7b46a67a196-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.741531 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.741528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q" event={"ID":"7d5824e3-dcc7-4975-8d30-a7b46a67a196","Type":"ContainerDied","Data":"285a2d135cdf9d5b2c2d7d99d3d9d7951c79f549669e3e4a4562e9d6df7da226"} Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.741767 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="285a2d135cdf9d5b2c2d7d99d3d9d7951c79f549669e3e4a4562e9d6df7da226" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.743102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" event={"ID":"6e7d9e33-cfa5-4c05-adc5-a3810fbba263","Type":"ContainerDied","Data":"0e43246f1dabd0dd79810f23d980482c09fa5bd0084c6928cb3fe5d2352fcccd"} Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.743142 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.743144 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e43246f1dabd0dd79810f23d980482c09fa5bd0084c6928cb3fe5d2352fcccd" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.792511 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr"] Jan 21 16:22:47 crc kubenswrapper[4707]: E0121 16:22:47.793192 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7d9e33-cfa5-4c05-adc5-a3810fbba263" containerName="bootstrap-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.793286 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7d9e33-cfa5-4c05-adc5-a3810fbba263" containerName="bootstrap-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:47 crc kubenswrapper[4707]: E0121 16:22:47.793343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5824e3-dcc7-4975-8d30-a7b46a67a196" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.793397 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5824e3-dcc7-4975-8d30-a7b46a67a196" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.793618 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7d9e33-cfa5-4c05-adc5-a3810fbba263" containerName="bootstrap-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.793673 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5824e3-dcc7-4975-8d30-a7b46a67a196" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.794178 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.796659 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.796935 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.797060 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.797211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.802855 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr"] Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.946389 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.946428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:47 crc kubenswrapper[4707]: I0121 16:22:47.946463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6rp\" (UniqueName: \"kubernetes.io/projected/43bc6427-195d-4319-8883-0f55fd705ddc-kube-api-access-vg6rp\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.047480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6rp\" (UniqueName: \"kubernetes.io/projected/43bc6427-195d-4319-8883-0f55fd705ddc-kube-api-access-vg6rp\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.047621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 
16:22:48.047642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.051267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.051308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.060958 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6rp\" (UniqueName: \"kubernetes.io/projected/43bc6427-195d-4319-8883-0f55fd705ddc-kube-api-access-vg6rp\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-zl6gr\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.114025 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:48 crc kubenswrapper[4707]: W0121 16:22:48.475676 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43bc6427_195d_4319_8883_0f55fd705ddc.slice/crio-4e856317a499222da6603c76b92119d7ea20bf0d18f03c45478468f9c27a940c WatchSource:0}: Error finding container 4e856317a499222da6603c76b92119d7ea20bf0d18f03c45478468f9c27a940c: Status 404 returned error can't find the container with id 4e856317a499222da6603c76b92119d7ea20bf0d18f03c45478468f9c27a940c Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.475737 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr"] Jan 21 16:22:48 crc kubenswrapper[4707]: I0121 16:22:48.750672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" event={"ID":"43bc6427-195d-4319-8883-0f55fd705ddc","Type":"ContainerStarted","Data":"4e856317a499222da6603c76b92119d7ea20bf0d18f03c45478468f9c27a940c"} Jan 21 16:22:49 crc kubenswrapper[4707]: I0121 16:22:49.760032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" event={"ID":"43bc6427-195d-4319-8883-0f55fd705ddc","Type":"ContainerStarted","Data":"0ae2bf16efa70547dd967f2b9cc5b8bb084c8008840bb8c2a5ffeb9ea0cacb8b"} Jan 21 16:22:49 crc kubenswrapper[4707]: I0121 16:22:49.775691 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" podStartSLOduration=2.283259947 podStartE2EDuration="2.775675804s" podCreationTimestamp="2026-01-21 16:22:47 +0000 UTC" firstStartedPulling="2026-01-21 16:22:48.477870886 +0000 UTC m=+4865.659387108" lastFinishedPulling="2026-01-21 16:22:48.970286743 +0000 UTC m=+4866.151802965" observedRunningTime="2026-01-21 16:22:49.769903819 +0000 UTC m=+4866.951420042" watchObservedRunningTime="2026-01-21 16:22:49.775675804 +0000 UTC m=+4866.957192026" Jan 21 16:22:50 crc kubenswrapper[4707]: I0121 16:22:50.767539 4707 generic.go:334] "Generic (PLEG): container finished" podID="43bc6427-195d-4319-8883-0f55fd705ddc" containerID="0ae2bf16efa70547dd967f2b9cc5b8bb084c8008840bb8c2a5ffeb9ea0cacb8b" exitCode=0 Jan 21 16:22:50 crc kubenswrapper[4707]: I0121 16:22:50.767587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" event={"ID":"43bc6427-195d-4319-8883-0f55fd705ddc","Type":"ContainerDied","Data":"0ae2bf16efa70547dd967f2b9cc5b8bb084c8008840bb8c2a5ffeb9ea0cacb8b"} Jan 21 16:22:51 crc kubenswrapper[4707]: I0121 16:22:51.991215 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.107425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg6rp\" (UniqueName: \"kubernetes.io/projected/43bc6427-195d-4319-8883-0f55fd705ddc-kube-api-access-vg6rp\") pod \"43bc6427-195d-4319-8883-0f55fd705ddc\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.107608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-inventory\") pod \"43bc6427-195d-4319-8883-0f55fd705ddc\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.107647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-ssh-key-edpm-compute-global\") pod \"43bc6427-195d-4319-8883-0f55fd705ddc\" (UID: \"43bc6427-195d-4319-8883-0f55fd705ddc\") " Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.111195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bc6427-195d-4319-8883-0f55fd705ddc-kube-api-access-vg6rp" (OuterVolumeSpecName: "kube-api-access-vg6rp") pod "43bc6427-195d-4319-8883-0f55fd705ddc" (UID: "43bc6427-195d-4319-8883-0f55fd705ddc"). InnerVolumeSpecName "kube-api-access-vg6rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.123665 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-inventory" (OuterVolumeSpecName: "inventory") pod "43bc6427-195d-4319-8883-0f55fd705ddc" (UID: "43bc6427-195d-4319-8883-0f55fd705ddc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.124353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "43bc6427-195d-4319-8883-0f55fd705ddc" (UID: "43bc6427-195d-4319-8883-0f55fd705ddc"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.208908 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg6rp\" (UniqueName: \"kubernetes.io/projected/43bc6427-195d-4319-8883-0f55fd705ddc-kube-api-access-vg6rp\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.208940 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.208951 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/43bc6427-195d-4319-8883-0f55fd705ddc-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.780371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" event={"ID":"43bc6427-195d-4319-8883-0f55fd705ddc","Type":"ContainerDied","Data":"4e856317a499222da6603c76b92119d7ea20bf0d18f03c45478468f9c27a940c"} Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.780535 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e856317a499222da6603c76b92119d7ea20bf0d18f03c45478468f9c27a940c" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.780426 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.828705 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg"] Jan 21 16:22:52 crc kubenswrapper[4707]: E0121 16:22:52.829147 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bc6427-195d-4319-8883-0f55fd705ddc" containerName="configure-network-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.829167 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bc6427-195d-4319-8883-0f55fd705ddc" containerName="configure-network-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.829300 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bc6427-195d-4319-8883-0f55fd705ddc" containerName="configure-network-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.830398 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.831787 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.832744 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.832753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.832993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:52 crc kubenswrapper[4707]: I0121 16:22:52.837376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg"] Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.018490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.018564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rxn2\" (UniqueName: \"kubernetes.io/projected/d9990097-194c-4cde-a8c7-b20214a7572a-kube-api-access-8rxn2\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.018600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.119860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.119944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 
16:22:53.120001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rxn2\" (UniqueName: \"kubernetes.io/projected/d9990097-194c-4cde-a8c7-b20214a7572a-kube-api-access-8rxn2\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.123774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.123843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.132888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rxn2\" (UniqueName: \"kubernetes.io/projected/d9990097-194c-4cde-a8c7-b20214a7572a-kube-api-access-8rxn2\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-5nldg\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.143261 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.497602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg"] Jan 21 16:22:53 crc kubenswrapper[4707]: W0121 16:22:53.500052 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9990097_194c_4cde_a8c7_b20214a7572a.slice/crio-33b1a92f0fd458a552110efb201c7ef5cee619f6230026d18d35282e82ee4fdf WatchSource:0}: Error finding container 33b1a92f0fd458a552110efb201c7ef5cee619f6230026d18d35282e82ee4fdf: Status 404 returned error can't find the container with id 33b1a92f0fd458a552110efb201c7ef5cee619f6230026d18d35282e82ee4fdf Jan 21 16:22:53 crc kubenswrapper[4707]: I0121 16:22:53.791960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" event={"ID":"d9990097-194c-4cde-a8c7-b20214a7572a","Type":"ContainerStarted","Data":"33b1a92f0fd458a552110efb201c7ef5cee619f6230026d18d35282e82ee4fdf"} Jan 21 16:22:54 crc kubenswrapper[4707]: I0121 16:22:54.800177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" event={"ID":"d9990097-194c-4cde-a8c7-b20214a7572a","Type":"ContainerStarted","Data":"cde5f0dfca74b7c4e0f11818e5b2443291e695d3e333603b9deeaf3c188f585d"} Jan 21 16:22:54 crc kubenswrapper[4707]: I0121 16:22:54.813760 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" podStartSLOduration=2.341660746 podStartE2EDuration="2.813745406s" podCreationTimestamp="2026-01-21 16:22:52 +0000 UTC" firstStartedPulling="2026-01-21 16:22:53.502409216 +0000 UTC m=+4870.683925438" lastFinishedPulling="2026-01-21 16:22:53.974493875 +0000 UTC m=+4871.156010098" observedRunningTime="2026-01-21 16:22:54.811172178 +0000 UTC m=+4871.992688400" watchObservedRunningTime="2026-01-21 16:22:54.813745406 +0000 UTC m=+4871.995261628" Jan 21 16:22:55 crc kubenswrapper[4707]: I0121 16:22:55.808006 4707 generic.go:334] "Generic (PLEG): container finished" podID="d9990097-194c-4cde-a8c7-b20214a7572a" containerID="cde5f0dfca74b7c4e0f11818e5b2443291e695d3e333603b9deeaf3c188f585d" exitCode=0 Jan 21 16:22:55 crc kubenswrapper[4707]: I0121 16:22:55.808044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" event={"ID":"d9990097-194c-4cde-a8c7-b20214a7572a","Type":"ContainerDied","Data":"cde5f0dfca74b7c4e0f11818e5b2443291e695d3e333603b9deeaf3c188f585d"} Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.018536 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.164699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rxn2\" (UniqueName: \"kubernetes.io/projected/d9990097-194c-4cde-a8c7-b20214a7572a-kube-api-access-8rxn2\") pod \"d9990097-194c-4cde-a8c7-b20214a7572a\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.164986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-ssh-key-edpm-compute-global\") pod \"d9990097-194c-4cde-a8c7-b20214a7572a\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.165026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory\") pod \"d9990097-194c-4cde-a8c7-b20214a7572a\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.169826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9990097-194c-4cde-a8c7-b20214a7572a-kube-api-access-8rxn2" (OuterVolumeSpecName: "kube-api-access-8rxn2") pod "d9990097-194c-4cde-a8c7-b20214a7572a" (UID: "d9990097-194c-4cde-a8c7-b20214a7572a"). InnerVolumeSpecName "kube-api-access-8rxn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:22:57 crc kubenswrapper[4707]: E0121 16:22:57.180298 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory podName:d9990097-194c-4cde-a8c7-b20214a7572a nodeName:}" failed. No retries permitted until 2026-01-21 16:22:57.680271804 +0000 UTC m=+4874.861788026 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory") pod "d9990097-194c-4cde-a8c7-b20214a7572a" (UID: "d9990097-194c-4cde-a8c7-b20214a7572a") : error deleting /var/lib/kubelet/pods/d9990097-194c-4cde-a8c7-b20214a7572a/volume-subpaths: remove /var/lib/kubelet/pods/d9990097-194c-4cde-a8c7-b20214a7572a/volume-subpaths: no such file or directory Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.182955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "d9990097-194c-4cde-a8c7-b20214a7572a" (UID: "d9990097-194c-4cde-a8c7-b20214a7572a"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.266297 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rxn2\" (UniqueName: \"kubernetes.io/projected/d9990097-194c-4cde-a8c7-b20214a7572a-kube-api-access-8rxn2\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.266329 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.772146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory\") pod \"d9990097-194c-4cde-a8c7-b20214a7572a\" (UID: \"d9990097-194c-4cde-a8c7-b20214a7572a\") " Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.774636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory" (OuterVolumeSpecName: "inventory") pod "d9990097-194c-4cde-a8c7-b20214a7572a" (UID: "d9990097-194c-4cde-a8c7-b20214a7572a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.820530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" event={"ID":"d9990097-194c-4cde-a8c7-b20214a7572a","Type":"ContainerDied","Data":"33b1a92f0fd458a552110efb201c7ef5cee619f6230026d18d35282e82ee4fdf"} Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.820571 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b1a92f0fd458a552110efb201c7ef5cee619f6230026d18d35282e82ee4fdf" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.820602 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.872109 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n"] Jan 21 16:22:57 crc kubenswrapper[4707]: E0121 16:22:57.872381 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9990097-194c-4cde-a8c7-b20214a7572a" containerName="validate-network-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.872399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9990097-194c-4cde-a8c7-b20214a7572a" containerName="validate-network-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.872540 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9990097-194c-4cde-a8c7-b20214a7572a" containerName="validate-network-edpm-multinodeset-edpm-compute-global" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.872960 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.873778 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9990097-194c-4cde-a8c7-b20214a7572a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.875128 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.876467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.876562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.876573 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.881764 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n"] Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.975338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.975469 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24r8l\" (UniqueName: \"kubernetes.io/projected/d4d84130-78e3-4404-bffb-baa800b4b419-kube-api-access-24r8l\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:57 crc kubenswrapper[4707]: I0121 16:22:57.975529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-ssh-key-edpm-compute-global\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.077442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.077505 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24r8l\" (UniqueName: \"kubernetes.io/projected/d4d84130-78e3-4404-bffb-baa800b4b419-kube-api-access-24r8l\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: 
\"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.077560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-ssh-key-edpm-compute-global\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.081494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-ssh-key-edpm-compute-global\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.081592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.091663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24r8l\" (UniqueName: \"kubernetes.io/projected/d4d84130-78e3-4404-bffb-baa800b4b419-kube-api-access-24r8l\") pod \"install-os-edpm-multinodeset-edpm-compute-global-ntw6n\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.186139 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.532569 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n"] Jan 21 16:22:58 crc kubenswrapper[4707]: W0121 16:22:58.550457 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4d84130_78e3_4404_bffb_baa800b4b419.slice/crio-fc71836caac7bd1f7f432f006aa8091182980cdd38e2cc27aea6bde51cacc9b7 WatchSource:0}: Error finding container fc71836caac7bd1f7f432f006aa8091182980cdd38e2cc27aea6bde51cacc9b7: Status 404 returned error can't find the container with id fc71836caac7bd1f7f432f006aa8091182980cdd38e2cc27aea6bde51cacc9b7 Jan 21 16:22:58 crc kubenswrapper[4707]: I0121 16:22:58.827668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" event={"ID":"d4d84130-78e3-4404-bffb-baa800b4b419","Type":"ContainerStarted","Data":"fc71836caac7bd1f7f432f006aa8091182980cdd38e2cc27aea6bde51cacc9b7"} Jan 21 16:22:59 crc kubenswrapper[4707]: I0121 16:22:59.844046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" event={"ID":"d4d84130-78e3-4404-bffb-baa800b4b419","Type":"ContainerStarted","Data":"4a4f9ca8aa9cb4726a3c5555e3713f9f3283e3eb1bcdbbac79240a91090e07fa"} Jan 21 16:22:59 crc kubenswrapper[4707]: I0121 16:22:59.860405 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" podStartSLOduration=2.416452083 podStartE2EDuration="2.8603698s" podCreationTimestamp="2026-01-21 16:22:57 +0000 UTC" firstStartedPulling="2026-01-21 16:22:58.552079467 +0000 UTC m=+4875.733595690" lastFinishedPulling="2026-01-21 16:22:58.995997184 +0000 UTC m=+4876.177513407" observedRunningTime="2026-01-21 16:22:59.85348244 +0000 UTC m=+4877.034998662" watchObservedRunningTime="2026-01-21 16:22:59.8603698 +0000 UTC m=+4877.041886022" Jan 21 16:23:00 crc kubenswrapper[4707]: I0121 16:23:00.851484 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4d84130-78e3-4404-bffb-baa800b4b419" containerID="4a4f9ca8aa9cb4726a3c5555e3713f9f3283e3eb1bcdbbac79240a91090e07fa" exitCode=0 Jan 21 16:23:00 crc kubenswrapper[4707]: I0121 16:23:00.851529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" event={"ID":"d4d84130-78e3-4404-bffb-baa800b4b419","Type":"ContainerDied","Data":"4a4f9ca8aa9cb4726a3c5555e3713f9f3283e3eb1bcdbbac79240a91090e07fa"} Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.067010 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.125879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-ssh-key-edpm-compute-global\") pod \"d4d84130-78e3-4404-bffb-baa800b4b419\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.125930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-inventory\") pod \"d4d84130-78e3-4404-bffb-baa800b4b419\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.125959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24r8l\" (UniqueName: \"kubernetes.io/projected/d4d84130-78e3-4404-bffb-baa800b4b419-kube-api-access-24r8l\") pod \"d4d84130-78e3-4404-bffb-baa800b4b419\" (UID: \"d4d84130-78e3-4404-bffb-baa800b4b419\") " Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.131239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d84130-78e3-4404-bffb-baa800b4b419-kube-api-access-24r8l" (OuterVolumeSpecName: "kube-api-access-24r8l") pod "d4d84130-78e3-4404-bffb-baa800b4b419" (UID: "d4d84130-78e3-4404-bffb-baa800b4b419"). InnerVolumeSpecName "kube-api-access-24r8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.141648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-inventory" (OuterVolumeSpecName: "inventory") pod "d4d84130-78e3-4404-bffb-baa800b4b419" (UID: "d4d84130-78e3-4404-bffb-baa800b4b419"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.142159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "d4d84130-78e3-4404-bffb-baa800b4b419" (UID: "d4d84130-78e3-4404-bffb-baa800b4b419"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.227598 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.227628 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d84130-78e3-4404-bffb-baa800b4b419-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.227639 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24r8l\" (UniqueName: \"kubernetes.io/projected/d4d84130-78e3-4404-bffb-baa800b4b419-kube-api-access-24r8l\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.863666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" event={"ID":"d4d84130-78e3-4404-bffb-baa800b4b419","Type":"ContainerDied","Data":"fc71836caac7bd1f7f432f006aa8091182980cdd38e2cc27aea6bde51cacc9b7"} Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.863701 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc71836caac7bd1f7f432f006aa8091182980cdd38e2cc27aea6bde51cacc9b7" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.863713 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.902592 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6"] Jan 21 16:23:02 crc kubenswrapper[4707]: E0121 16:23:02.903049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d84130-78e3-4404-bffb-baa800b4b419" containerName="install-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.903127 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d84130-78e3-4404-bffb-baa800b4b419" containerName="install-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.903342 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d84130-78e3-4404-bffb-baa800b4b419" containerName="install-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.903821 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.905471 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.905518 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.905518 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.907432 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.911284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6"] Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.937303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdnh6\" (UniqueName: \"kubernetes.io/projected/6d293288-a079-4408-b953-f457f94562e5-kube-api-access-rdnh6\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.937770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:02 crc kubenswrapper[4707]: I0121 16:23:02.937971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.040183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.040284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.040379 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rdnh6\" (UniqueName: \"kubernetes.io/projected/6d293288-a079-4408-b953-f457f94562e5-kube-api-access-rdnh6\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.044484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.044830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.052801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdnh6\" (UniqueName: \"kubernetes.io/projected/6d293288-a079-4408-b953-f457f94562e5-kube-api-access-rdnh6\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-hn4b6\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.216791 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.567128 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6"] Jan 21 16:23:03 crc kubenswrapper[4707]: W0121 16:23:03.569990 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d293288_a079_4408_b953_f457f94562e5.slice/crio-07bd42401bca4b272d6f3de6dc308c306f8e900fba7326e40921a3e696f01e47 WatchSource:0}: Error finding container 07bd42401bca4b272d6f3de6dc308c306f8e900fba7326e40921a3e696f01e47: Status 404 returned error can't find the container with id 07bd42401bca4b272d6f3de6dc308c306f8e900fba7326e40921a3e696f01e47 Jan 21 16:23:03 crc kubenswrapper[4707]: I0121 16:23:03.870307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" event={"ID":"6d293288-a079-4408-b953-f457f94562e5","Type":"ContainerStarted","Data":"07bd42401bca4b272d6f3de6dc308c306f8e900fba7326e40921a3e696f01e47"} Jan 21 16:23:04 crc kubenswrapper[4707]: I0121 16:23:04.877787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" event={"ID":"6d293288-a079-4408-b953-f457f94562e5","Type":"ContainerStarted","Data":"f50f2ef7b63a9f049bf2699e25719a5bbd8f8273c10616f22dc0f31f703c2866"} Jan 21 16:23:04 crc kubenswrapper[4707]: I0121 16:23:04.903823 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" podStartSLOduration=2.452146009 podStartE2EDuration="2.903791506s" podCreationTimestamp="2026-01-21 16:23:02 +0000 UTC" firstStartedPulling="2026-01-21 16:23:03.571866192 +0000 UTC m=+4880.753382414" lastFinishedPulling="2026-01-21 16:23:04.023511689 +0000 UTC m=+4881.205027911" observedRunningTime="2026-01-21 16:23:04.89949152 +0000 UTC m=+4882.081007743" watchObservedRunningTime="2026-01-21 16:23:04.903791506 +0000 UTC m=+4882.085307728" Jan 21 16:23:05 crc kubenswrapper[4707]: I0121 16:23:05.885949 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d293288-a079-4408-b953-f457f94562e5" containerID="f50f2ef7b63a9f049bf2699e25719a5bbd8f8273c10616f22dc0f31f703c2866" exitCode=0 Jan 21 16:23:05 crc kubenswrapper[4707]: I0121 16:23:05.886001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" event={"ID":"6d293288-a079-4408-b953-f457f94562e5","Type":"ContainerDied","Data":"f50f2ef7b63a9f049bf2699e25719a5bbd8f8273c10616f22dc0f31f703c2866"} Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.208522 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.399669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-ssh-key-edpm-compute-global\") pod \"6d293288-a079-4408-b953-f457f94562e5\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.399762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdnh6\" (UniqueName: \"kubernetes.io/projected/6d293288-a079-4408-b953-f457f94562e5-kube-api-access-rdnh6\") pod \"6d293288-a079-4408-b953-f457f94562e5\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.399802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-inventory\") pod \"6d293288-a079-4408-b953-f457f94562e5\" (UID: \"6d293288-a079-4408-b953-f457f94562e5\") " Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.403795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d293288-a079-4408-b953-f457f94562e5-kube-api-access-rdnh6" (OuterVolumeSpecName: "kube-api-access-rdnh6") pod "6d293288-a079-4408-b953-f457f94562e5" (UID: "6d293288-a079-4408-b953-f457f94562e5"). InnerVolumeSpecName "kube-api-access-rdnh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.416092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "6d293288-a079-4408-b953-f457f94562e5" (UID: "6d293288-a079-4408-b953-f457f94562e5"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.416658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-inventory" (OuterVolumeSpecName: "inventory") pod "6d293288-a079-4408-b953-f457f94562e5" (UID: "6d293288-a079-4408-b953-f457f94562e5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.500911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdnh6\" (UniqueName: \"kubernetes.io/projected/6d293288-a079-4408-b953-f457f94562e5-kube-api-access-rdnh6\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.501045 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.501124 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d293288-a079-4408-b953-f457f94562e5-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.902968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" event={"ID":"6d293288-a079-4408-b953-f457f94562e5","Type":"ContainerDied","Data":"07bd42401bca4b272d6f3de6dc308c306f8e900fba7326e40921a3e696f01e47"} Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.903174 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07bd42401bca4b272d6f3de6dc308c306f8e900fba7326e40921a3e696f01e47" Jan 21 16:23:07 crc kubenswrapper[4707]: I0121 16:23:07.903026 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.256516 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg"] Jan 21 16:23:08 crc kubenswrapper[4707]: E0121 16:23:08.256791 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d293288-a079-4408-b953-f457f94562e5" containerName="configure-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.256820 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d293288-a079-4408-b953-f457f94562e5" containerName="configure-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.256965 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d293288-a079-4408-b953-f457f94562e5" containerName="configure-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.257436 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.259284 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.259773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.259976 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.260186 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.267401 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg"] Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.412091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-ssh-key-edpm-compute-global\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.412182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zq2t\" (UniqueName: \"kubernetes.io/projected/3a3bdc36-8322-4614-9172-75432d932033-kube-api-access-8zq2t\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.412417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.513149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.513197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-ssh-key-edpm-compute-global\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.513235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zq2t\" 
(UniqueName: \"kubernetes.io/projected/3a3bdc36-8322-4614-9172-75432d932033-kube-api-access-8zq2t\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.516673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.516701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-ssh-key-edpm-compute-global\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.526582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zq2t\" (UniqueName: \"kubernetes.io/projected/3a3bdc36-8322-4614-9172-75432d932033-kube-api-access-8zq2t\") pod \"run-os-edpm-multinodeset-edpm-compute-global-wkrlg\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.570536 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:08 crc kubenswrapper[4707]: I0121 16:23:08.920592 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg"] Jan 21 16:23:09 crc kubenswrapper[4707]: I0121 16:23:09.919775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" event={"ID":"3a3bdc36-8322-4614-9172-75432d932033","Type":"ContainerStarted","Data":"7c73dca24790201aa7f764bf45c747cc9a707ecc3c2c819be34e77b922fec5df"} Jan 21 16:23:09 crc kubenswrapper[4707]: I0121 16:23:09.919990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" event={"ID":"3a3bdc36-8322-4614-9172-75432d932033","Type":"ContainerStarted","Data":"92eb54e079e132bd1a6d14b32da46a3dbdde5bfa53ef1cab4afc977b3cfc976f"} Jan 21 16:23:09 crc kubenswrapper[4707]: I0121 16:23:09.940927 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" podStartSLOduration=1.442824141 podStartE2EDuration="1.940911371s" podCreationTimestamp="2026-01-21 16:23:08 +0000 UTC" firstStartedPulling="2026-01-21 16:23:08.924039411 +0000 UTC m=+4886.105555634" lastFinishedPulling="2026-01-21 16:23:09.422126642 +0000 UTC m=+4886.603642864" observedRunningTime="2026-01-21 16:23:09.935690384 +0000 UTC m=+4887.117206606" watchObservedRunningTime="2026-01-21 16:23:09.940911371 +0000 UTC m=+4887.122427593" Jan 21 16:23:10 crc kubenswrapper[4707]: I0121 16:23:10.927272 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="3a3bdc36-8322-4614-9172-75432d932033" containerID="7c73dca24790201aa7f764bf45c747cc9a707ecc3c2c819be34e77b922fec5df" exitCode=0 Jan 21 16:23:10 crc kubenswrapper[4707]: I0121 16:23:10.927379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" event={"ID":"3a3bdc36-8322-4614-9172-75432d932033","Type":"ContainerDied","Data":"7c73dca24790201aa7f764bf45c747cc9a707ecc3c2c819be34e77b922fec5df"} Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.568371 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.662229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-inventory\") pod \"3a3bdc36-8322-4614-9172-75432d932033\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.662343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zq2t\" (UniqueName: \"kubernetes.io/projected/3a3bdc36-8322-4614-9172-75432d932033-kube-api-access-8zq2t\") pod \"3a3bdc36-8322-4614-9172-75432d932033\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.662392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-ssh-key-edpm-compute-global\") pod \"3a3bdc36-8322-4614-9172-75432d932033\" (UID: \"3a3bdc36-8322-4614-9172-75432d932033\") " Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.666437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3bdc36-8322-4614-9172-75432d932033-kube-api-access-8zq2t" (OuterVolumeSpecName: "kube-api-access-8zq2t") pod "3a3bdc36-8322-4614-9172-75432d932033" (UID: "3a3bdc36-8322-4614-9172-75432d932033"). InnerVolumeSpecName "kube-api-access-8zq2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.678357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "3a3bdc36-8322-4614-9172-75432d932033" (UID: "3a3bdc36-8322-4614-9172-75432d932033"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.678754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-inventory" (OuterVolumeSpecName: "inventory") pod "3a3bdc36-8322-4614-9172-75432d932033" (UID: "3a3bdc36-8322-4614-9172-75432d932033"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.764536 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zq2t\" (UniqueName: \"kubernetes.io/projected/3a3bdc36-8322-4614-9172-75432d932033-kube-api-access-8zq2t\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.764569 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.764582 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3bdc36-8322-4614-9172-75432d932033-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.959721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" event={"ID":"3a3bdc36-8322-4614-9172-75432d932033","Type":"ContainerDied","Data":"92eb54e079e132bd1a6d14b32da46a3dbdde5bfa53ef1cab4afc977b3cfc976f"} Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.959767 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92eb54e079e132bd1a6d14b32da46a3dbdde5bfa53ef1cab4afc977b3cfc976f" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.960225 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.985013 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d"] Jan 21 16:23:12 crc kubenswrapper[4707]: E0121 16:23:12.985309 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3bdc36-8322-4614-9172-75432d932033" containerName="run-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.985326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3bdc36-8322-4614-9172-75432d932033" containerName="run-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.985511 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3bdc36-8322-4614-9172-75432d932033" containerName="run-os-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.985999 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.989049 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.989449 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.989602 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.989610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.990632 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:12 crc kubenswrapper[4707]: I0121 16:23:12.996414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d"] Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.170863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.170922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.170942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.170974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbwt\" (UniqueName: \"kubernetes.io/projected/137810ba-4d7b-4c06-97c9-485a575e30a7-kube-api-access-9lbwt\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.170991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.171432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbwt\" (UniqueName: \"kubernetes.io/projected/137810ba-4d7b-4c06-97c9-485a575e30a7-kube-api-access-9lbwt\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " 
pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.272986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.273019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.273042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.273059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.273080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.275857 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.276288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.277254 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.277464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.277562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.278435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.278129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.279131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.279804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.280233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: 
\"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.280296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.289736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbwt\" (UniqueName: \"kubernetes.io/projected/137810ba-4d7b-4c06-97c9-485a575e30a7-kube-api-access-9lbwt\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-f6w6d\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.305582 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.665235 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d"] Jan 21 16:23:13 crc kubenswrapper[4707]: W0121 16:23:13.750275 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137810ba_4d7b_4c06_97c9_485a575e30a7.slice/crio-94636510480d150036d99589198f2540a729dd3fb9562ccb449b307d7f19727b WatchSource:0}: Error finding container 94636510480d150036d99589198f2540a729dd3fb9562ccb449b307d7f19727b: Status 404 returned error can't find the container with id 94636510480d150036d99589198f2540a729dd3fb9562ccb449b307d7f19727b Jan 21 16:23:13 crc kubenswrapper[4707]: I0121 16:23:13.967723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" event={"ID":"137810ba-4d7b-4c06-97c9-485a575e30a7","Type":"ContainerStarted","Data":"94636510480d150036d99589198f2540a729dd3fb9562ccb449b307d7f19727b"} Jan 21 16:23:14 crc kubenswrapper[4707]: I0121 16:23:14.975519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" event={"ID":"137810ba-4d7b-4c06-97c9-485a575e30a7","Type":"ContainerStarted","Data":"e1424419f0c2b85330b14e02110c1f778afb0bea4b28340ef864e6456bd3219f"} Jan 21 16:23:14 crc kubenswrapper[4707]: I0121 16:23:14.989559 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" podStartSLOduration=2.3689551939999998 podStartE2EDuration="2.989533115s" podCreationTimestamp="2026-01-21 16:23:12 +0000 UTC" firstStartedPulling="2026-01-21 16:23:13.752030814 +0000 UTC m=+4890.933547036" lastFinishedPulling="2026-01-21 16:23:14.372608735 +0000 UTC m=+4891.554124957" observedRunningTime="2026-01-21 16:23:14.988964976 +0000 UTC m=+4892.170481199" watchObservedRunningTime="2026-01-21 16:23:14.989533115 +0000 UTC m=+4892.171049336" Jan 21 16:23:15 crc kubenswrapper[4707]: I0121 16:23:15.984224 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="137810ba-4d7b-4c06-97c9-485a575e30a7" containerID="e1424419f0c2b85330b14e02110c1f778afb0bea4b28340ef864e6456bd3219f" exitCode=0 Jan 21 16:23:15 crc kubenswrapper[4707]: I0121 16:23:15.984268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" event={"ID":"137810ba-4d7b-4c06-97c9-485a575e30a7","Type":"ContainerDied","Data":"e1424419f0c2b85330b14e02110c1f778afb0bea4b28340ef864e6456bd3219f"} Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.297910 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.427891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-libvirt-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.427938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-custom-global-service-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.427961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-inventory\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.427978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-bootstrap-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-nova-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-ovn-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-metadata-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-sriov-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-dhcp-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ssh-key-edpm-compute-global\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ovn-combined-ca-bundle\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.428221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lbwt\" (UniqueName: \"kubernetes.io/projected/137810ba-4d7b-4c06-97c9-485a575e30a7-kube-api-access-9lbwt\") pod \"137810ba-4d7b-4c06-97c9-485a575e30a7\" (UID: \"137810ba-4d7b-4c06-97c9-485a575e30a7\") " Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.436738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137810ba-4d7b-4c06-97c9-485a575e30a7-kube-api-access-9lbwt" (OuterVolumeSpecName: "kube-api-access-9lbwt") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "kube-api-access-9lbwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.437242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.437272 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.438379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.452322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-inventory" (OuterVolumeSpecName: "inventory") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.453080 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "137810ba-4d7b-4c06-97c9-485a575e30a7" (UID: "137810ba-4d7b-4c06-97c9-485a575e30a7"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529313 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529341 4707 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529374 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529384 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529396 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529405 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529421 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529429 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529438 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529446 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529454 4707 reconciler_common.go:293] 
"Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137810ba-4d7b-4c06-97c9-485a575e30a7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.529464 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lbwt\" (UniqueName: \"kubernetes.io/projected/137810ba-4d7b-4c06-97c9-485a575e30a7-kube-api-access-9lbwt\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.997537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" event={"ID":"137810ba-4d7b-4c06-97c9-485a575e30a7","Type":"ContainerDied","Data":"94636510480d150036d99589198f2540a729dd3fb9562ccb449b307d7f19727b"} Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.997717 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94636510480d150036d99589198f2540a729dd3fb9562ccb449b307d7f19727b" Jan 21 16:23:17 crc kubenswrapper[4707]: I0121 16:23:17.997580 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.065763 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq"] Jan 21 16:23:18 crc kubenswrapper[4707]: E0121 16:23:18.066049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137810ba-4d7b-4c06-97c9-485a575e30a7" containerName="install-certs-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.066069 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="137810ba-4d7b-4c06-97c9-485a575e30a7" containerName="install-certs-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.066238 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="137810ba-4d7b-4c06-97c9-485a575e30a7" containerName="install-certs-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.066678 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.068014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.068339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.068695 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.069830 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.069933 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.070208 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.079148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq"] Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.236921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.236976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwjkc\" (UniqueName: \"kubernetes.io/projected/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-kube-api-access-mwjkc\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.237045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.237069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.237132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: 
\"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ssh-key-edpm-compute-global\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.338000 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ssh-key-edpm-compute-global\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.338278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.338320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwjkc\" (UniqueName: \"kubernetes.io/projected/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-kube-api-access-mwjkc\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.338365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.338382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.339250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.342281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ssh-key-edpm-compute-global\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.342297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.343377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.351015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwjkc\" (UniqueName: \"kubernetes.io/projected/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-kube-api-access-mwjkc\") pod \"ovn-edpm-multinodeset-edpm-compute-global-f7nrq\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.380218 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:18 crc kubenswrapper[4707]: I0121 16:23:18.737502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq"] Jan 21 16:23:18 crc kubenswrapper[4707]: W0121 16:23:18.742657 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e0f79e5_b282_4b05_b02d_fd8c545ebe59.slice/crio-8f9660e8b792e7eb268a460e1278484a489cfc19c8bdc8d0357f48184c6b01a6 WatchSource:0}: Error finding container 8f9660e8b792e7eb268a460e1278484a489cfc19c8bdc8d0357f48184c6b01a6: Status 404 returned error can't find the container with id 8f9660e8b792e7eb268a460e1278484a489cfc19c8bdc8d0357f48184c6b01a6 Jan 21 16:23:19 crc kubenswrapper[4707]: I0121 16:23:19.006284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" event={"ID":"8e0f79e5-b282-4b05-b02d-fd8c545ebe59","Type":"ContainerStarted","Data":"8f9660e8b792e7eb268a460e1278484a489cfc19c8bdc8d0357f48184c6b01a6"} Jan 21 16:23:20 crc kubenswrapper[4707]: I0121 16:23:20.018362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" event={"ID":"8e0f79e5-b282-4b05-b02d-fd8c545ebe59","Type":"ContainerStarted","Data":"708bb3303ccc2bdb9b82db33f015510413004800881b6f4dffca4feb85a4ff67"} Jan 21 16:23:20 crc kubenswrapper[4707]: I0121 16:23:20.033330 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" podStartSLOduration=1.50277781 podStartE2EDuration="2.03331184s" podCreationTimestamp="2026-01-21 16:23:18 +0000 UTC" firstStartedPulling="2026-01-21 16:23:18.745005634 +0000 UTC m=+4895.926521855" lastFinishedPulling="2026-01-21 16:23:19.275539663 +0000 UTC m=+4896.457055885" observedRunningTime="2026-01-21 16:23:20.031185642 +0000 UTC m=+4897.212701865" watchObservedRunningTime="2026-01-21 16:23:20.03331184 +0000 UTC m=+4897.214828062" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.025430 4707 
generic.go:334] "Generic (PLEG): container finished" podID="8e0f79e5-b282-4b05-b02d-fd8c545ebe59" containerID="708bb3303ccc2bdb9b82db33f015510413004800881b6f4dffca4feb85a4ff67" exitCode=0 Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.025533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" event={"ID":"8e0f79e5-b282-4b05-b02d-fd8c545ebe59","Type":"ContainerDied","Data":"708bb3303ccc2bdb9b82db33f015510413004800881b6f4dffca4feb85a4ff67"} Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.277721 4707 scope.go:117] "RemoveContainer" containerID="72f0c21a64d5101061525331cfce47a016ad7faf1b203bdda3c3a11eff4ed108" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.292201 4707 scope.go:117] "RemoveContainer" containerID="38f07e4dab137756d0ec3b0c9768ae2674630a8f493a3424ecef92d8bef537e0" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.327126 4707 scope.go:117] "RemoveContainer" containerID="ef1839ed46b9bd37ab4f97f30c6925c49a2de7a8a40de4236fdc49b99dbe0f27" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.339693 4707 scope.go:117] "RemoveContainer" containerID="d84a5d6d4141a509df8d8a1c995f601b8feea571b5b15f5471acb0649ff8cffc" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.368012 4707 scope.go:117] "RemoveContainer" containerID="1901cb8cd260095a37ab695ec27b9cc5306d788abfe05ef6e14099fddbd0617d" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.380415 4707 scope.go:117] "RemoveContainer" containerID="1597feabb8d21f3510c2a3bef67f6d2250cbc18d1b39b3864fe125c0caf4ee50" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.400160 4707 scope.go:117] "RemoveContainer" containerID="4612bae2bcc2ddd52c87c41bb62385e1e556bbc875be1e1cfcdbbaddd73ce0fa" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.413180 4707 scope.go:117] "RemoveContainer" containerID="da44097fcef9c37d7356e68f27f5b9fdc7b54b83a2c269b4169230f0a612a360" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.424758 4707 scope.go:117] "RemoveContainer" containerID="6cf9d360647ef53cb137039369ca082a96c8915351bf2af5ac55355a634fdab4" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.437615 4707 scope.go:117] "RemoveContainer" containerID="ab1850d3462b9d70ea9580abe0ab8b461756cad0bf489d65b52f19dc9153b2ce" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.449366 4707 scope.go:117] "RemoveContainer" containerID="3a326b0249b7701a07598cdebb6cc82a855666df6ab38b7cb3d19e5643ea4f64" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.470638 4707 scope.go:117] "RemoveContainer" containerID="55d0cc5193f2c2d16b3550deb8e51232a5a052fac7261c623d17313148865d9b" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.481844 4707 scope.go:117] "RemoveContainer" containerID="7bd9b3688c0b791b6bd09562e9cc13b95d473710050456e110ee5ba4860df742" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.496069 4707 scope.go:117] "RemoveContainer" containerID="ab8f2b3c20e84db0ca40c7bb13debf7c50ed553488dd77b15a16fee6c242e20f" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.511776 4707 scope.go:117] "RemoveContainer" containerID="54f3fd6d7f4c27e1565ae48afce64db522ef5e0bf6f32d8e333882ed6b5ae842" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.524060 4707 scope.go:117] "RemoveContainer" containerID="8619c0d6694f4fb15f617a41dd4e1b24fec26125230f5be9bd32bb4e75e89e0b" Jan 21 16:23:21 crc kubenswrapper[4707]: I0121 16:23:21.537611 4707 scope.go:117] "RemoveContainer" 
containerID="2c40331b7a55e46ad04bd0aeecb0ef1a000647b4cd6a521d0565d70d76c94cbb" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.219430 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.392537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ssh-key-edpm-compute-global\") pod \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.392611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwjkc\" (UniqueName: \"kubernetes.io/projected/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-kube-api-access-mwjkc\") pod \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.392692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-inventory\") pod \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.393235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovncontroller-config-0\") pod \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.393261 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovn-combined-ca-bundle\") pod \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\" (UID: \"8e0f79e5-b282-4b05-b02d-fd8c545ebe59\") " Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.397677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-kube-api-access-mwjkc" (OuterVolumeSpecName: "kube-api-access-mwjkc") pod "8e0f79e5-b282-4b05-b02d-fd8c545ebe59" (UID: "8e0f79e5-b282-4b05-b02d-fd8c545ebe59"). InnerVolumeSpecName "kube-api-access-mwjkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.397786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8e0f79e5-b282-4b05-b02d-fd8c545ebe59" (UID: "8e0f79e5-b282-4b05-b02d-fd8c545ebe59"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.407785 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8e0f79e5-b282-4b05-b02d-fd8c545ebe59" (UID: "8e0f79e5-b282-4b05-b02d-fd8c545ebe59"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.409196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-inventory" (OuterVolumeSpecName: "inventory") pod "8e0f79e5-b282-4b05-b02d-fd8c545ebe59" (UID: "8e0f79e5-b282-4b05-b02d-fd8c545ebe59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.409874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "8e0f79e5-b282-4b05-b02d-fd8c545ebe59" (UID: "8e0f79e5-b282-4b05-b02d-fd8c545ebe59"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.494956 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.494988 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwjkc\" (UniqueName: \"kubernetes.io/projected/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-kube-api-access-mwjkc\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.495000 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.495009 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:22 crc kubenswrapper[4707]: I0121 16:23:22.495017 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e0f79e5-b282-4b05-b02d-fd8c545ebe59-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.038989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" event={"ID":"8e0f79e5-b282-4b05-b02d-fd8c545ebe59","Type":"ContainerDied","Data":"8f9660e8b792e7eb268a460e1278484a489cfc19c8bdc8d0357f48184c6b01a6"} Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.039033 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f9660e8b792e7eb268a460e1278484a489cfc19c8bdc8d0357f48184c6b01a6" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.039216 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.085403 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk"] Jan 21 16:23:23 crc kubenswrapper[4707]: E0121 16:23:23.085783 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0f79e5-b282-4b05-b02d-fd8c545ebe59" containerName="ovn-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.085802 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0f79e5-b282-4b05-b02d-fd8c545ebe59" containerName="ovn-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.086065 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e0f79e5-b282-4b05-b02d-fd8c545ebe59" containerName="ovn-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.086738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.092756 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.093501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.097326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.097546 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.097705 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.097768 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.097840 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.101983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk"] Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-kube-api-access-x5mbl\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.203354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.303925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-inventory\") pod 
\"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-kube-api-access-x5mbl\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.304338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: 
\"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.307481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.307689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.307739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.308263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.308335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.308424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.309105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc 
kubenswrapper[4707]: I0121 16:23:23.319423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-kube-api-access-x5mbl\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.416720 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:23 crc kubenswrapper[4707]: I0121 16:23:23.761856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk"] Jan 21 16:23:23 crc kubenswrapper[4707]: W0121 16:23:23.764348 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee41ee6_5b43_4f6e_b0a1_c92c2ba248bd.slice/crio-35fc903655bf86e44d2b69adf12a3d373efa1d1cb3f5b7c6ae931d2bd98191b5 WatchSource:0}: Error finding container 35fc903655bf86e44d2b69adf12a3d373efa1d1cb3f5b7c6ae931d2bd98191b5: Status 404 returned error can't find the container with id 35fc903655bf86e44d2b69adf12a3d373efa1d1cb3f5b7c6ae931d2bd98191b5 Jan 21 16:23:24 crc kubenswrapper[4707]: I0121 16:23:24.045337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" event={"ID":"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd","Type":"ContainerStarted","Data":"35fc903655bf86e44d2b69adf12a3d373efa1d1cb3f5b7c6ae931d2bd98191b5"} Jan 21 16:23:25 crc kubenswrapper[4707]: I0121 16:23:25.069475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" event={"ID":"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd","Type":"ContainerStarted","Data":"a020a81b37e967863f6b30f3236f3b06f9910d5c378bcce1ca7133dd7bf9d7bc"} Jan 21 16:23:26 crc kubenswrapper[4707]: I0121 16:23:26.076647 4707 generic.go:334] "Generic (PLEG): container finished" podID="aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" containerID="a020a81b37e967863f6b30f3236f3b06f9910d5c378bcce1ca7133dd7bf9d7bc" exitCode=0 Jan 21 16:23:26 crc kubenswrapper[4707]: I0121 16:23:26.076694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" event={"ID":"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd","Type":"ContainerDied","Data":"a020a81b37e967863f6b30f3236f3b06f9910d5c378bcce1ca7133dd7bf9d7bc"} Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.287776 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-metadata-combined-ca-bundle\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-ssh-key-edpm-compute-global\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-kube-api-access-x5mbl\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-0\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-2\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-inventory\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.454604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-1\") pod \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\" (UID: \"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd\") " Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.459490 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-kube-api-access-x5mbl" (OuterVolumeSpecName: "kube-api-access-x5mbl") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). 
InnerVolumeSpecName "kube-api-access-x5mbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.461399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.472460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.472664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.472771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.473075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.473618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-inventory" (OuterVolumeSpecName: "inventory") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.473701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" (UID: "aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.555955 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.555982 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.555992 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.556003 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.556014 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.556023 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.556032 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:27 crc kubenswrapper[4707]: I0121 16:23:27.556041 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd-kube-api-access-x5mbl\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.091200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" event={"ID":"aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd","Type":"ContainerDied","Data":"35fc903655bf86e44d2b69adf12a3d373efa1d1cb3f5b7c6ae931d2bd98191b5"} Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.091243 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fc903655bf86e44d2b69adf12a3d373efa1d1cb3f5b7c6ae931d2bd98191b5" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.091247 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.145912 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4"] Jan 21 16:23:28 crc kubenswrapper[4707]: E0121 16:23:28.146262 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.146281 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.146424 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.146940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.148295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.148351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.151887 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4"] Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.151961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.152015 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.152076 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.152226 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.263250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.263289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.263313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lb57\" (UniqueName: \"kubernetes.io/projected/7b66c3c1-46df-4e81-b311-b6d2870063d7-kube-api-access-6lb57\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.263429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.263460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.364979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.365018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.365042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lb57\" (UniqueName: \"kubernetes.io/projected/7b66c3c1-46df-4e81-b311-b6d2870063d7-kube-api-access-6lb57\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.365076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.365094 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.368192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.368286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.371968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.371982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.380469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lb57\" (UniqueName: \"kubernetes.io/projected/7b66c3c1-46df-4e81-b311-b6d2870063d7-kube-api-access-6lb57\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.461411 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:28 crc kubenswrapper[4707]: I0121 16:23:28.806684 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4"] Jan 21 16:23:29 crc kubenswrapper[4707]: I0121 16:23:29.097041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" event={"ID":"7b66c3c1-46df-4e81-b311-b6d2870063d7","Type":"ContainerStarted","Data":"bc76671033828e6c4a4bba512368daa399a89979ccb539493e3d8ec08e7930d9"} Jan 21 16:23:30 crc kubenswrapper[4707]: I0121 16:23:30.105879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" event={"ID":"7b66c3c1-46df-4e81-b311-b6d2870063d7","Type":"ContainerStarted","Data":"c8f1cdb00ede11835641b05306785a2ecbae7bd2712fb5b0d17051342acde536"} Jan 21 16:23:30 crc kubenswrapper[4707]: I0121 16:23:30.120000 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" podStartSLOduration=1.630794646 podStartE2EDuration="2.119981823s" podCreationTimestamp="2026-01-21 16:23:28 +0000 UTC" firstStartedPulling="2026-01-21 16:23:28.810062097 +0000 UTC m=+4905.991578319" lastFinishedPulling="2026-01-21 16:23:29.299249274 +0000 UTC m=+4906.480765496" observedRunningTime="2026-01-21 16:23:30.118178613 +0000 UTC m=+4907.299694835" watchObservedRunningTime="2026-01-21 16:23:30.119981823 +0000 UTC m=+4907.301498045" Jan 21 16:23:31 crc kubenswrapper[4707]: I0121 16:23:31.115073 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b66c3c1-46df-4e81-b311-b6d2870063d7" containerID="c8f1cdb00ede11835641b05306785a2ecbae7bd2712fb5b0d17051342acde536" exitCode=0 Jan 21 16:23:31 crc kubenswrapper[4707]: I0121 16:23:31.115134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" event={"ID":"7b66c3c1-46df-4e81-b311-b6d2870063d7","Type":"ContainerDied","Data":"c8f1cdb00ede11835641b05306785a2ecbae7bd2712fb5b0d17051342acde536"} Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.345587 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.521612 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-combined-ca-bundle\") pod \"7b66c3c1-46df-4e81-b311-b6d2870063d7\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.521656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-inventory\") pod \"7b66c3c1-46df-4e81-b311-b6d2870063d7\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.521680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-agent-neutron-config-0\") pod \"7b66c3c1-46df-4e81-b311-b6d2870063d7\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.521709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lb57\" (UniqueName: \"kubernetes.io/projected/7b66c3c1-46df-4e81-b311-b6d2870063d7-kube-api-access-6lb57\") pod \"7b66c3c1-46df-4e81-b311-b6d2870063d7\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.521766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-ssh-key-edpm-compute-global\") pod \"7b66c3c1-46df-4e81-b311-b6d2870063d7\" (UID: \"7b66c3c1-46df-4e81-b311-b6d2870063d7\") " Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.525889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b66c3c1-46df-4e81-b311-b6d2870063d7-kube-api-access-6lb57" (OuterVolumeSpecName: "kube-api-access-6lb57") pod "7b66c3c1-46df-4e81-b311-b6d2870063d7" (UID: "7b66c3c1-46df-4e81-b311-b6d2870063d7"). InnerVolumeSpecName "kube-api-access-6lb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.526767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "7b66c3c1-46df-4e81-b311-b6d2870063d7" (UID: "7b66c3c1-46df-4e81-b311-b6d2870063d7"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.538180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-inventory" (OuterVolumeSpecName: "inventory") pod "7b66c3c1-46df-4e81-b311-b6d2870063d7" (UID: "7b66c3c1-46df-4e81-b311-b6d2870063d7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.538343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "7b66c3c1-46df-4e81-b311-b6d2870063d7" (UID: "7b66c3c1-46df-4e81-b311-b6d2870063d7"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.538582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "7b66c3c1-46df-4e81-b311-b6d2870063d7" (UID: "7b66c3c1-46df-4e81-b311-b6d2870063d7"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.622911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lb57\" (UniqueName: \"kubernetes.io/projected/7b66c3c1-46df-4e81-b311-b6d2870063d7-kube-api-access-6lb57\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.622936 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.622972 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.622984 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:32 crc kubenswrapper[4707]: I0121 16:23:32.622993 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b66c3c1-46df-4e81-b311-b6d2870063d7-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.127564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" event={"ID":"7b66c3c1-46df-4e81-b311-b6d2870063d7","Type":"ContainerDied","Data":"bc76671033828e6c4a4bba512368daa399a89979ccb539493e3d8ec08e7930d9"} Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.127832 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc76671033828e6c4a4bba512368daa399a89979ccb539493e3d8ec08e7930d9" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.127607 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.172663 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv"] Jan 21 16:23:33 crc kubenswrapper[4707]: E0121 16:23:33.172937 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b66c3c1-46df-4e81-b311-b6d2870063d7" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.172957 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b66c3c1-46df-4e81-b311-b6d2870063d7" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.173075 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b66c3c1-46df-4e81-b311-b6d2870063d7" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.173501 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.175799 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.175828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.175837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.175946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.176031 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.176191 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.188779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv"] Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.328987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87tb4\" (UniqueName: \"kubernetes.io/projected/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-kube-api-access-87tb4\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.329035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " 
pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.329057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.329080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.329251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.429700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87tb4\" (UniqueName: \"kubernetes.io/projected/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-kube-api-access-87tb4\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.429750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.429773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.429801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.429881 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.433613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.433856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.433920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.433967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.442925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87tb4\" (UniqueName: \"kubernetes.io/projected/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-kube-api-access-87tb4\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.485801 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:33 crc kubenswrapper[4707]: I0121 16:23:33.829447 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv"] Jan 21 16:23:33 crc kubenswrapper[4707]: W0121 16:23:33.830836 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee6beac5_0dfe_4fe9_9604_dbb1b502f988.slice/crio-de35c4e73876722693bf6617184af8ac0040b7841cdb91b9bc302d67c244d519 WatchSource:0}: Error finding container de35c4e73876722693bf6617184af8ac0040b7841cdb91b9bc302d67c244d519: Status 404 returned error can't find the container with id de35c4e73876722693bf6617184af8ac0040b7841cdb91b9bc302d67c244d519 Jan 21 16:23:34 crc kubenswrapper[4707]: I0121 16:23:34.135275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" event={"ID":"ee6beac5-0dfe-4fe9-9604-dbb1b502f988","Type":"ContainerStarted","Data":"de35c4e73876722693bf6617184af8ac0040b7841cdb91b9bc302d67c244d519"} Jan 21 16:23:35 crc kubenswrapper[4707]: I0121 16:23:35.143314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" event={"ID":"ee6beac5-0dfe-4fe9-9604-dbb1b502f988","Type":"ContainerStarted","Data":"f7222e01ea56a4a98c3b30d6a6ce5973c71b51f818b457cb182f9758b6be45d6"} Jan 21 16:23:35 crc kubenswrapper[4707]: I0121 16:23:35.160462 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" podStartSLOduration=1.6513093620000001 podStartE2EDuration="2.160445998s" podCreationTimestamp="2026-01-21 16:23:33 +0000 UTC" firstStartedPulling="2026-01-21 16:23:33.832985349 +0000 UTC m=+4911.014501571" lastFinishedPulling="2026-01-21 16:23:34.342121986 +0000 UTC m=+4911.523638207" observedRunningTime="2026-01-21 16:23:35.159287239 +0000 UTC m=+4912.340803461" watchObservedRunningTime="2026-01-21 16:23:35.160445998 +0000 UTC m=+4912.341962219" Jan 21 16:23:36 crc kubenswrapper[4707]: I0121 16:23:36.152284 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee6beac5-0dfe-4fe9-9604-dbb1b502f988" containerID="f7222e01ea56a4a98c3b30d6a6ce5973c71b51f818b457cb182f9758b6be45d6" exitCode=0 Jan 21 16:23:36 crc kubenswrapper[4707]: I0121 16:23:36.152390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" event={"ID":"ee6beac5-0dfe-4fe9-9604-dbb1b502f988","Type":"ContainerDied","Data":"f7222e01ea56a4a98c3b30d6a6ce5973c71b51f818b457cb182f9758b6be45d6"} Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.363787 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.473465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87tb4\" (UniqueName: \"kubernetes.io/projected/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-kube-api-access-87tb4\") pod \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.473529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-ssh-key-edpm-compute-global\") pod \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.473587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-agent-neutron-config-0\") pod \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.473619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-combined-ca-bundle\") pod \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.473679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-inventory\") pod \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\" (UID: \"ee6beac5-0dfe-4fe9-9604-dbb1b502f988\") " Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.478076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "ee6beac5-0dfe-4fe9-9604-dbb1b502f988" (UID: "ee6beac5-0dfe-4fe9-9604-dbb1b502f988"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.478094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-kube-api-access-87tb4" (OuterVolumeSpecName: "kube-api-access-87tb4") pod "ee6beac5-0dfe-4fe9-9604-dbb1b502f988" (UID: "ee6beac5-0dfe-4fe9-9604-dbb1b502f988"). InnerVolumeSpecName "kube-api-access-87tb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.489605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-inventory" (OuterVolumeSpecName: "inventory") pod "ee6beac5-0dfe-4fe9-9604-dbb1b502f988" (UID: "ee6beac5-0dfe-4fe9-9604-dbb1b502f988"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.490377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "ee6beac5-0dfe-4fe9-9604-dbb1b502f988" (UID: "ee6beac5-0dfe-4fe9-9604-dbb1b502f988"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.490517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "ee6beac5-0dfe-4fe9-9604-dbb1b502f988" (UID: "ee6beac5-0dfe-4fe9-9604-dbb1b502f988"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.574993 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.575021 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87tb4\" (UniqueName: \"kubernetes.io/projected/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-kube-api-access-87tb4\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.575034 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.575044 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:37 crc kubenswrapper[4707]: I0121 16:23:37.575054 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6beac5-0dfe-4fe9-9604-dbb1b502f988-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.165436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" event={"ID":"ee6beac5-0dfe-4fe9-9604-dbb1b502f988","Type":"ContainerDied","Data":"de35c4e73876722693bf6617184af8ac0040b7841cdb91b9bc302d67c244d519"} Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.165479 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de35c4e73876722693bf6617184af8ac0040b7841cdb91b9bc302d67c244d519" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.165486 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.210612 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76"] Jan 21 16:23:38 crc kubenswrapper[4707]: E0121 16:23:38.210864 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6beac5-0dfe-4fe9-9604-dbb1b502f988" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.210882 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6beac5-0dfe-4fe9-9604-dbb1b502f988" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.211039 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6beac5-0dfe-4fe9-9604-dbb1b502f988" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.211458 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.215383 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.215606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.216024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.216085 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.216066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.216091 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.217912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76"] Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.383756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.383800 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 
21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.383846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.383874 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2b9\" (UniqueName: \"kubernetes.io/projected/de5b3fa6-a1e4-4362-8fb0-922403614122-kube-api-access-2z2b9\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.383966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.485174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2b9\" (UniqueName: \"kubernetes.io/projected/de5b3fa6-a1e4-4362-8fb0-922403614122-kube-api-access-2z2b9\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.485257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.485296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.485319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.485356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.488940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.488957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.489495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.489607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.498459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2b9\" (UniqueName: \"kubernetes.io/projected/de5b3fa6-a1e4-4362-8fb0-922403614122-kube-api-access-2z2b9\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.523531 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:38 crc kubenswrapper[4707]: I0121 16:23:38.877128 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76"] Jan 21 16:23:38 crc kubenswrapper[4707]: W0121 16:23:38.880697 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5b3fa6_a1e4_4362_8fb0_922403614122.slice/crio-2039a671b8080e3ee5f439c4f6a252cbbc58a53f90ea57d4ab5432e8dc9e23b0 WatchSource:0}: Error finding container 2039a671b8080e3ee5f439c4f6a252cbbc58a53f90ea57d4ab5432e8dc9e23b0: Status 404 returned error can't find the container with id 2039a671b8080e3ee5f439c4f6a252cbbc58a53f90ea57d4ab5432e8dc9e23b0 Jan 21 16:23:39 crc kubenswrapper[4707]: I0121 16:23:39.172751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" event={"ID":"de5b3fa6-a1e4-4362-8fb0-922403614122","Type":"ContainerStarted","Data":"2039a671b8080e3ee5f439c4f6a252cbbc58a53f90ea57d4ab5432e8dc9e23b0"} Jan 21 16:23:40 crc kubenswrapper[4707]: I0121 16:23:40.179829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" event={"ID":"de5b3fa6-a1e4-4362-8fb0-922403614122","Type":"ContainerStarted","Data":"1579ff7b772e2d8358facb3dea6b00c2f210f24e59c3818da60d976bc6898300"} Jan 21 16:23:41 crc kubenswrapper[4707]: I0121 16:23:41.186510 4707 generic.go:334] "Generic (PLEG): container finished" podID="de5b3fa6-a1e4-4362-8fb0-922403614122" containerID="1579ff7b772e2d8358facb3dea6b00c2f210f24e59c3818da60d976bc6898300" exitCode=0 Jan 21 16:23:41 crc kubenswrapper[4707]: I0121 16:23:41.188747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" event={"ID":"de5b3fa6-a1e4-4362-8fb0-922403614122","Type":"ContainerDied","Data":"1579ff7b772e2d8358facb3dea6b00c2f210f24e59c3818da60d976bc6898300"} Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.393350 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.527079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-agent-neutron-config-0\") pod \"de5b3fa6-a1e4-4362-8fb0-922403614122\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.527167 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-combined-ca-bundle\") pod \"de5b3fa6-a1e4-4362-8fb0-922403614122\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.527188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-inventory\") pod \"de5b3fa6-a1e4-4362-8fb0-922403614122\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.527215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z2b9\" (UniqueName: \"kubernetes.io/projected/de5b3fa6-a1e4-4362-8fb0-922403614122-kube-api-access-2z2b9\") pod \"de5b3fa6-a1e4-4362-8fb0-922403614122\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.527238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-ssh-key-edpm-compute-global\") pod \"de5b3fa6-a1e4-4362-8fb0-922403614122\" (UID: \"de5b3fa6-a1e4-4362-8fb0-922403614122\") " Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.531483 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "de5b3fa6-a1e4-4362-8fb0-922403614122" (UID: "de5b3fa6-a1e4-4362-8fb0-922403614122"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.531603 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b3fa6-a1e4-4362-8fb0-922403614122-kube-api-access-2z2b9" (OuterVolumeSpecName: "kube-api-access-2z2b9") pod "de5b3fa6-a1e4-4362-8fb0-922403614122" (UID: "de5b3fa6-a1e4-4362-8fb0-922403614122"). InnerVolumeSpecName "kube-api-access-2z2b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.542965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-inventory" (OuterVolumeSpecName: "inventory") pod "de5b3fa6-a1e4-4362-8fb0-922403614122" (UID: "de5b3fa6-a1e4-4362-8fb0-922403614122"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.543181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "de5b3fa6-a1e4-4362-8fb0-922403614122" (UID: "de5b3fa6-a1e4-4362-8fb0-922403614122"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.543637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "de5b3fa6-a1e4-4362-8fb0-922403614122" (UID: "de5b3fa6-a1e4-4362-8fb0-922403614122"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.628216 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.628238 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.628251 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z2b9\" (UniqueName: \"kubernetes.io/projected/de5b3fa6-a1e4-4362-8fb0-922403614122-kube-api-access-2z2b9\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.628260 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:42 crc kubenswrapper[4707]: I0121 16:23:42.628269 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/de5b3fa6-a1e4-4362-8fb0-922403614122-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.197114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" event={"ID":"de5b3fa6-a1e4-4362-8fb0-922403614122","Type":"ContainerDied","Data":"2039a671b8080e3ee5f439c4f6a252cbbc58a53f90ea57d4ab5432e8dc9e23b0"} Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.197182 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2039a671b8080e3ee5f439c4f6a252cbbc58a53f90ea57d4ab5432e8dc9e23b0" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.197145 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.243792 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p"] Jan 21 16:23:43 crc kubenswrapper[4707]: E0121 16:23:43.244079 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b3fa6-a1e4-4362-8fb0-922403614122" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.244099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b3fa6-a1e4-4362-8fb0-922403614122" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.244253 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5b3fa6-a1e4-4362-8fb0-922403614122" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.244654 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.247840 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.250458 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.250528 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.250593 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.250654 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.252354 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.255484 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p"] Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.336529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgkww\" (UniqueName: \"kubernetes.io/projected/84655530-1ba9-4c74-9cea-b4e97af14dda-kube-api-access-dgkww\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.336645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: 
I0121 16:23:43.336733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.336758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.336780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.437903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.437951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.437975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.438030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgkww\" (UniqueName: \"kubernetes.io/projected/84655530-1ba9-4c74-9cea-b4e97af14dda-kube-api-access-dgkww\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.438055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: 
\"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.441176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.441233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.441280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.441868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.450696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgkww\" (UniqueName: \"kubernetes.io/projected/84655530-1ba9-4c74-9cea-b4e97af14dda-kube-api-access-dgkww\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-25n7p\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.558414 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:43 crc kubenswrapper[4707]: I0121 16:23:43.795498 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p"] Jan 21 16:23:44 crc kubenswrapper[4707]: I0121 16:23:44.203451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" event={"ID":"84655530-1ba9-4c74-9cea-b4e97af14dda","Type":"ContainerStarted","Data":"2f3b08a31acbd9e065b223a536004e45b3ab96c57afb83bb93a88746576d9810"} Jan 21 16:23:45 crc kubenswrapper[4707]: I0121 16:23:45.214456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" event={"ID":"84655530-1ba9-4c74-9cea-b4e97af14dda","Type":"ContainerStarted","Data":"40c8baf2705a6950c26464735c18a1163d0dc6b101cffab24e6b91ebf6a64ba6"} Jan 21 16:23:45 crc kubenswrapper[4707]: I0121 16:23:45.227617 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" podStartSLOduration=1.751720462 podStartE2EDuration="2.227603102s" podCreationTimestamp="2026-01-21 16:23:43 +0000 UTC" firstStartedPulling="2026-01-21 16:23:43.807508333 +0000 UTC m=+4920.989024556" lastFinishedPulling="2026-01-21 16:23:44.283390974 +0000 UTC m=+4921.464907196" observedRunningTime="2026-01-21 16:23:45.225593824 +0000 UTC m=+4922.407110056" watchObservedRunningTime="2026-01-21 16:23:45.227603102 +0000 UTC m=+4922.409119324" Jan 21 16:23:46 crc kubenswrapper[4707]: I0121 16:23:46.221542 4707 generic.go:334] "Generic (PLEG): container finished" podID="84655530-1ba9-4c74-9cea-b4e97af14dda" containerID="40c8baf2705a6950c26464735c18a1163d0dc6b101cffab24e6b91ebf6a64ba6" exitCode=0 Jan 21 16:23:46 crc kubenswrapper[4707]: I0121 16:23:46.221579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" event={"ID":"84655530-1ba9-4c74-9cea-b4e97af14dda","Type":"ContainerDied","Data":"40c8baf2705a6950c26464735c18a1163d0dc6b101cffab24e6b91ebf6a64ba6"} Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.439150 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.590302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-inventory\") pod \"84655530-1ba9-4c74-9cea-b4e97af14dda\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.590341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-combined-ca-bundle\") pod \"84655530-1ba9-4c74-9cea-b4e97af14dda\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.590374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-ssh-key-edpm-compute-global\") pod \"84655530-1ba9-4c74-9cea-b4e97af14dda\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.590390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-secret-0\") pod \"84655530-1ba9-4c74-9cea-b4e97af14dda\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.590441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgkww\" (UniqueName: \"kubernetes.io/projected/84655530-1ba9-4c74-9cea-b4e97af14dda-kube-api-access-dgkww\") pod \"84655530-1ba9-4c74-9cea-b4e97af14dda\" (UID: \"84655530-1ba9-4c74-9cea-b4e97af14dda\") " Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.594620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "84655530-1ba9-4c74-9cea-b4e97af14dda" (UID: "84655530-1ba9-4c74-9cea-b4e97af14dda"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.595013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84655530-1ba9-4c74-9cea-b4e97af14dda-kube-api-access-dgkww" (OuterVolumeSpecName: "kube-api-access-dgkww") pod "84655530-1ba9-4c74-9cea-b4e97af14dda" (UID: "84655530-1ba9-4c74-9cea-b4e97af14dda"). InnerVolumeSpecName "kube-api-access-dgkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.606618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-inventory" (OuterVolumeSpecName: "inventory") pod "84655530-1ba9-4c74-9cea-b4e97af14dda" (UID: "84655530-1ba9-4c74-9cea-b4e97af14dda"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.606733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "84655530-1ba9-4c74-9cea-b4e97af14dda" (UID: "84655530-1ba9-4c74-9cea-b4e97af14dda"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.607512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "84655530-1ba9-4c74-9cea-b4e97af14dda" (UID: "84655530-1ba9-4c74-9cea-b4e97af14dda"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.691845 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgkww\" (UniqueName: \"kubernetes.io/projected/84655530-1ba9-4c74-9cea-b4e97af14dda-kube-api-access-dgkww\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.691870 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.691882 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.691890 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:47 crc kubenswrapper[4707]: I0121 16:23:47.691900 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/84655530-1ba9-4c74-9cea-b4e97af14dda-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.235413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" event={"ID":"84655530-1ba9-4c74-9cea-b4e97af14dda","Type":"ContainerDied","Data":"2f3b08a31acbd9e065b223a536004e45b3ab96c57afb83bb93a88746576d9810"} Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.235449 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f3b08a31acbd9e065b223a536004e45b3ab96c57afb83bb93a88746576d9810" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.235915 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.278663 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr"] Jan 21 16:23:48 crc kubenswrapper[4707]: E0121 16:23:48.279172 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84655530-1ba9-4c74-9cea-b4e97af14dda" containerName="libvirt-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.279192 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="84655530-1ba9-4c74-9cea-b4e97af14dda" containerName="libvirt-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.279329 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="84655530-1ba9-4c74-9cea-b4e97af14dda" containerName="libvirt-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.279762 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.281033 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.281362 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.281564 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.281723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.282652 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.284563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.284637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.291086 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr"] Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399327 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " 
pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tvf\" (UniqueName: \"kubernetes.io/projected/a0375d00-ebc1-4008-9963-0a0759952385-kube-api-access-h2tvf\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-ssh-key-edpm-compute-global\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.399758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tvf\" (UniqueName: \"kubernetes.io/projected/a0375d00-ebc1-4008-9963-0a0759952385-kube-api-access-h2tvf\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-ssh-key-edpm-compute-global\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.501248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.504130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 
16:23:48.504252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.504322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.504567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.504573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.504695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.505209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-ssh-key-edpm-compute-global\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.515197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tvf\" (UniqueName: \"kubernetes.io/projected/a0375d00-ebc1-4008-9963-0a0759952385-kube-api-access-h2tvf\") pod \"nova-edpm-multinodeset-edpm-compute-global-57wtr\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.590582 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:48 crc kubenswrapper[4707]: I0121 16:23:48.936557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr"] Jan 21 16:23:48 crc kubenswrapper[4707]: W0121 16:23:48.939024 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0375d00_ebc1_4008_9963_0a0759952385.slice/crio-a5e1501f964633e0f1efa3b625b742ef2a157cc4e2de35358a1001f02c566d45 WatchSource:0}: Error finding container a5e1501f964633e0f1efa3b625b742ef2a157cc4e2de35358a1001f02c566d45: Status 404 returned error can't find the container with id a5e1501f964633e0f1efa3b625b742ef2a157cc4e2de35358a1001f02c566d45 Jan 21 16:23:49 crc kubenswrapper[4707]: I0121 16:23:49.240924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" event={"ID":"a0375d00-ebc1-4008-9963-0a0759952385","Type":"ContainerStarted","Data":"a5e1501f964633e0f1efa3b625b742ef2a157cc4e2de35358a1001f02c566d45"} Jan 21 16:23:50 crc kubenswrapper[4707]: I0121 16:23:50.248141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" event={"ID":"a0375d00-ebc1-4008-9963-0a0759952385","Type":"ContainerStarted","Data":"1ae988e6cd4add3b53e3c3cc095c5ee855688156e72da8b7f2425a418cdea6e0"} Jan 21 16:23:50 crc kubenswrapper[4707]: I0121 16:23:50.265166 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" podStartSLOduration=1.746138244 podStartE2EDuration="2.265154307s" podCreationTimestamp="2026-01-21 16:23:48 +0000 UTC" firstStartedPulling="2026-01-21 16:23:48.940838512 +0000 UTC m=+4926.122354734" lastFinishedPulling="2026-01-21 16:23:49.459854575 +0000 UTC m=+4926.641370797" observedRunningTime="2026-01-21 16:23:50.258886171 +0000 UTC m=+4927.440402393" watchObservedRunningTime="2026-01-21 16:23:50.265154307 +0000 UTC m=+4927.446670529" Jan 21 16:23:51 crc kubenswrapper[4707]: I0121 16:23:51.254576 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0375d00-ebc1-4008-9963-0a0759952385" containerID="1ae988e6cd4add3b53e3c3cc095c5ee855688156e72da8b7f2425a418cdea6e0" exitCode=0 Jan 21 16:23:51 crc kubenswrapper[4707]: I0121 16:23:51.254634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" event={"ID":"a0375d00-ebc1-4008-9963-0a0759952385","Type":"ContainerDied","Data":"1ae988e6cd4add3b53e3c3cc095c5ee855688156e72da8b7f2425a418cdea6e0"} Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.465626 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-combined-ca-bundle\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-1\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tvf\" (UniqueName: \"kubernetes.io/projected/a0375d00-ebc1-4008-9963-0a0759952385-kube-api-access-h2tvf\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-1\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-inventory\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-0\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-ssh-key-edpm-compute-global\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.651831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-0\") pod \"a0375d00-ebc1-4008-9963-0a0759952385\" (UID: \"a0375d00-ebc1-4008-9963-0a0759952385\") " Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.655830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.659412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0375d00-ebc1-4008-9963-0a0759952385-kube-api-access-h2tvf" (OuterVolumeSpecName: "kube-api-access-h2tvf") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "kube-api-access-h2tvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.667762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.668240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.669135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.669513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-inventory" (OuterVolumeSpecName: "inventory") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.669829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.669976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a0375d00-ebc1-4008-9963-0a0759952385" (UID: "a0375d00-ebc1-4008-9963-0a0759952385"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.752927 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.752957 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.752967 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.752976 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.752984 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.752992 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tvf\" (UniqueName: \"kubernetes.io/projected/a0375d00-ebc1-4008-9963-0a0759952385-kube-api-access-h2tvf\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.753002 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:52 crc kubenswrapper[4707]: I0121 16:23:52.753013 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0375d00-ebc1-4008-9963-0a0759952385-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.268677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" event={"ID":"a0375d00-ebc1-4008-9963-0a0759952385","Type":"ContainerDied","Data":"a5e1501f964633e0f1efa3b625b742ef2a157cc4e2de35358a1001f02c566d45"} Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.268745 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5e1501f964633e0f1efa3b625b742ef2a157cc4e2de35358a1001f02c566d45" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.268836 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.324551 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc"] Jan 21 16:23:53 crc kubenswrapper[4707]: E0121 16:23:53.324888 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0375d00-ebc1-4008-9963-0a0759952385" containerName="nova-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.324908 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0375d00-ebc1-4008-9963-0a0759952385" containerName="nova-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.325061 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0375d00-ebc1-4008-9963-0a0759952385" containerName="nova-edpm-multinodeset-edpm-compute-global" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.325516 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.327061 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.327430 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-fq6tt" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.327576 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.327683 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.329266 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.329818 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.330296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc"] Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.461853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.461896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c62j8\" (UniqueName: \"kubernetes.io/projected/08f2f098-59e8-47a9-a1dc-c674627c3809-kube-api-access-c62j8\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.461922 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-1\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.461955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-beta-nodeset\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.462006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.462049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-0\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.563800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c62j8\" (UniqueName: \"kubernetes.io/projected/08f2f098-59e8-47a9-a1dc-c674627c3809-kube-api-access-c62j8\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.563869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-1\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.563895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-beta-nodeset\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.563943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: 
\"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.564012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-0\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.564080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.568289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.568332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-1\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.568375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-0\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.568434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.569163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-beta-nodeset\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.582734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c62j8\" (UniqueName: \"kubernetes.io/projected/08f2f098-59e8-47a9-a1dc-c674627c3809-kube-api-access-c62j8\") pod \"custom-global-service-edpm-multinodeset-hfxsc\" (UID: 
\"08f2f098-59e8-47a9-a1dc-c674627c3809\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.636724 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:53 crc kubenswrapper[4707]: I0121 16:23:53.980752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc"] Jan 21 16:23:53 crc kubenswrapper[4707]: W0121 16:23:53.984237 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f2f098_59e8_47a9_a1dc_c674627c3809.slice/crio-497762687c274ea61c0757e7bdab2aa64757f603c15c54f43f0e9b385d52b506 WatchSource:0}: Error finding container 497762687c274ea61c0757e7bdab2aa64757f603c15c54f43f0e9b385d52b506: Status 404 returned error can't find the container with id 497762687c274ea61c0757e7bdab2aa64757f603c15c54f43f0e9b385d52b506 Jan 21 16:23:54 crc kubenswrapper[4707]: I0121 16:23:54.274503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" event={"ID":"08f2f098-59e8-47a9-a1dc-c674627c3809","Type":"ContainerStarted","Data":"497762687c274ea61c0757e7bdab2aa64757f603c15c54f43f0e9b385d52b506"} Jan 21 16:23:55 crc kubenswrapper[4707]: I0121 16:23:55.281025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" event={"ID":"08f2f098-59e8-47a9-a1dc-c674627c3809","Type":"ContainerStarted","Data":"0210688ca235439f353aca12f6763b72f047be39886d80d7a058b2c14a175965"} Jan 21 16:23:55 crc kubenswrapper[4707]: I0121 16:23:55.295386 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" podStartSLOduration=1.835699696 podStartE2EDuration="2.295370633s" podCreationTimestamp="2026-01-21 16:23:53 +0000 UTC" firstStartedPulling="2026-01-21 16:23:53.986528142 +0000 UTC m=+4931.168044364" lastFinishedPulling="2026-01-21 16:23:54.446199079 +0000 UTC m=+4931.627715301" observedRunningTime="2026-01-21 16:23:55.291139818 +0000 UTC m=+4932.472656039" watchObservedRunningTime="2026-01-21 16:23:55.295370633 +0000 UTC m=+4932.476886855" Jan 21 16:23:57 crc kubenswrapper[4707]: I0121 16:23:57.293376 4707 generic.go:334] "Generic (PLEG): container finished" podID="08f2f098-59e8-47a9-a1dc-c674627c3809" containerID="0210688ca235439f353aca12f6763b72f047be39886d80d7a058b2c14a175965" exitCode=0 Jan 21 16:23:57 crc kubenswrapper[4707]: I0121 16:23:57.293448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" event={"ID":"08f2f098-59e8-47a9-a1dc-c674627c3809","Type":"ContainerDied","Data":"0210688ca235439f353aca12f6763b72f047be39886d80d7a058b2c14a175965"} Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.514465 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.624681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-0\") pod \"08f2f098-59e8-47a9-a1dc-c674627c3809\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.624762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-1\") pod \"08f2f098-59e8-47a9-a1dc-c674627c3809\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.625267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-global\") pod \"08f2f098-59e8-47a9-a1dc-c674627c3809\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.625325 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-custom-global-service-combined-ca-bundle\") pod \"08f2f098-59e8-47a9-a1dc-c674627c3809\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.625377 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-beta-nodeset\") pod \"08f2f098-59e8-47a9-a1dc-c674627c3809\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.625414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c62j8\" (UniqueName: \"kubernetes.io/projected/08f2f098-59e8-47a9-a1dc-c674627c3809-kube-api-access-c62j8\") pod \"08f2f098-59e8-47a9-a1dc-c674627c3809\" (UID: \"08f2f098-59e8-47a9-a1dc-c674627c3809\") " Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.629224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "08f2f098-59e8-47a9-a1dc-c674627c3809" (UID: "08f2f098-59e8-47a9-a1dc-c674627c3809"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.629734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f2f098-59e8-47a9-a1dc-c674627c3809-kube-api-access-c62j8" (OuterVolumeSpecName: "kube-api-access-c62j8") pod "08f2f098-59e8-47a9-a1dc-c674627c3809" (UID: "08f2f098-59e8-47a9-a1dc-c674627c3809"). InnerVolumeSpecName "kube-api-access-c62j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.640969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "08f2f098-59e8-47a9-a1dc-c674627c3809" (UID: "08f2f098-59e8-47a9-a1dc-c674627c3809"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.641008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "08f2f098-59e8-47a9-a1dc-c674627c3809" (UID: "08f2f098-59e8-47a9-a1dc-c674627c3809"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.641237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "08f2f098-59e8-47a9-a1dc-c674627c3809" (UID: "08f2f098-59e8-47a9-a1dc-c674627c3809"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.641838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "08f2f098-59e8-47a9-a1dc-c674627c3809" (UID: "08f2f098-59e8-47a9-a1dc-c674627c3809"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.727048 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c62j8\" (UniqueName: \"kubernetes.io/projected/08f2f098-59e8-47a9-a1dc-c674627c3809-kube-api-access-c62j8\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.727190 4707 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.727266 4707 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-inventory-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.727326 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.727376 4707 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:58 crc kubenswrapper[4707]: I0121 16:23:58.727433 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/08f2f098-59e8-47a9-a1dc-c674627c3809-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:23:59 crc kubenswrapper[4707]: I0121 16:23:59.306061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" event={"ID":"08f2f098-59e8-47a9-a1dc-c674627c3809","Type":"ContainerDied","Data":"497762687c274ea61c0757e7bdab2aa64757f603c15c54f43f0e9b385d52b506"} Jan 21 16:23:59 crc kubenswrapper[4707]: I0121 16:23:59.306105 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="497762687c274ea61c0757e7bdab2aa64757f603c15c54f43f0e9b385d52b506" Jan 21 16:23:59 crc kubenswrapper[4707]: I0121 16:23:59.306108 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.127195 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.132990 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-frqrv"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.140354 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.145792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.151228 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.156558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.162178 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.167311 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.174922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.180466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.190520 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-7qf76"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.199714 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-wx7xd"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.207448 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-f7nrq"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.213768 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.219273 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.225362 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.231496 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-gdj9q"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.239018 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.244247 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.248860 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-ntw6n"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.253490 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-zl6gr"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.258283 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.262235 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.267544 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-wkrlg"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.274514 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.279819 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.284571 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-5nldg"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.289030 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-bq6dk"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.293411 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-25n7p"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.302057 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-cj2j4"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.306733 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-lzrch"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.311081 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-57wtr"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.315421 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetfrf5s"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.320508 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-hn4b6"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.333405 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.339633 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-xzj9t"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.345154 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-hfxsc"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.350250 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.355453 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.359927 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-7zdhg"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.363992 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-hdqdw"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.368075 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-m6nc7"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.372031 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.376177 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.380872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.384748 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.388516 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.392738 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.396939 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.400364 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.404000 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx"] 
Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.407954 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.411803 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.422544 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.427060 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-z645v"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.430468 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-whwlm"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.434601 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hjlcx"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.438148 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-f6w6d"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.441667 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-pmpwc"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.445602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.449247 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-94qcp"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.452798 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-2gwdd"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.456932 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-tz6k2"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.460641 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-7k4fk"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.464229 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-vxq5k"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.467748 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-mfgqw"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.471414 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-b9csl"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.474968 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-qvpjz"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.478565 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt"] Jan 21 16:24:00 crc kubenswrapper[4707]: E0121 16:24:00.479055 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f2f098-59e8-47a9-a1dc-c674627c3809" containerName="custom-global-service-edpm-multinodeset" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.479077 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f2f098-59e8-47a9-a1dc-c674627c3809" containerName="custom-global-service-edpm-multinodeset" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.479310 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f2f098-59e8-47a9-a1dc-c674627c3809" containerName="custom-global-service-edpm-multinodeset" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.480363 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.482842 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.486474 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.491948 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld"] Jan 21 16:24:00 crc kubenswrapper[4707]: E0121 16:24:00.494069 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc edpm-compute-global kube-api-access-xqhdt], unattached volumes=[], failed to process volumes=[config dnsmasq-svc edpm-compute-global kube-api-access-xqhdt]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" podUID="d24f8637-6c12-4c75-ac96-ca704feae360" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.494606 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.496455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld"] Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.647999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhdt\" (UniqueName: \"kubernetes.io/projected/d24f8637-6c12-4c75-ac96-ca704feae360-kube-api-access-xqhdt\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.648048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.648079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pggn4\" (UniqueName: \"kubernetes.io/projected/1dd2e280-e441-477d-9054-b7d0df0fd6b8-kube-api-access-pggn4\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.648115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-config\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.648215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.648259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.648291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.749465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: 
\"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.750258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.750538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: E0121 16:24:00.750631 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-global: configmap "edpm-compute-global" not found Jan 21 16:24:00 crc kubenswrapper[4707]: E0121 16:24:00.750783 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global podName:d24f8637-6c12-4c75-ac96-ca704feae360 nodeName:}" failed. No retries permitted until 2026-01-21 16:24:01.250767523 +0000 UTC m=+4938.432283745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "edpm-compute-global" (UniqueName: "kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global") pod "dnsmasq-dnsmasq-7d78464677-5jzxt" (UID: "d24f8637-6c12-4c75-ac96-ca704feae360") : configmap "edpm-compute-global" not found Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.750923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhdt\" (UniqueName: \"kubernetes.io/projected/d24f8637-6c12-4c75-ac96-ca704feae360-kube-api-access-xqhdt\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.751056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.751239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pggn4\" (UniqueName: \"kubernetes.io/projected/1dd2e280-e441-477d-9054-b7d0df0fd6b8-kube-api-access-pggn4\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.751347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-config\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.751488 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.751669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.752001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-config\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.752593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.765182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pggn4\" (UniqueName: \"kubernetes.io/projected/1dd2e280-e441-477d-9054-b7d0df0fd6b8-kube-api-access-pggn4\") pod \"dnsmasq-dnsmasq-84b9f45d47-5f8ld\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.765437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhdt\" (UniqueName: \"kubernetes.io/projected/d24f8637-6c12-4c75-ac96-ca704feae360-kube-api-access-xqhdt\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:00 crc kubenswrapper[4707]: I0121 16:24:00.809197 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.169589 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld"] Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.190118 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0435a7e6-8e20-4f0a-bc93-c7a111eee7c5" path="/var/lib/kubelet/pods/0435a7e6-8e20-4f0a-bc93-c7a111eee7c5/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.190979 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f2f098-59e8-47a9-a1dc-c674627c3809" path="/var/lib/kubelet/pods/08f2f098-59e8-47a9-a1dc-c674627c3809/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.191611 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7" path="/var/lib/kubelet/pods/0a6365ee-2fa7-435c-bf86-f37bfdd0dbe7/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.192264 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137810ba-4d7b-4c06-97c9-485a575e30a7" path="/var/lib/kubelet/pods/137810ba-4d7b-4c06-97c9-485a575e30a7/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.193182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b191d2-e6b5-453f-b03a-a6adcc296c39" path="/var/lib/kubelet/pods/28b191d2-e6b5-453f-b03a-a6adcc296c39/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.193623 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35759e50-280d-4a43-a6a1-0afe9b5f918b" path="/var/lib/kubelet/pods/35759e50-280d-4a43-a6a1-0afe9b5f918b/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.194080 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369dd7f6-dbf8-48ec-ba11-4bcd37527804" path="/var/lib/kubelet/pods/369dd7f6-dbf8-48ec-ba11-4bcd37527804/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.194927 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3bdc36-8322-4614-9172-75432d932033" path="/var/lib/kubelet/pods/3a3bdc36-8322-4614-9172-75432d932033/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.195377 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf1ce9a-43c8-43bd-bfb8-72695ba737f5" path="/var/lib/kubelet/pods/3cf1ce9a-43c8-43bd-bfb8-72695ba737f5/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.195800 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43bc6427-195d-4319-8883-0f55fd705ddc" path="/var/lib/kubelet/pods/43bc6427-195d-4319-8883-0f55fd705ddc/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.196637 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a" path="/var/lib/kubelet/pods/49f5a1b5-dda3-4ecd-a2cc-d22ab538cb4a/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.197173 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d10636-2fe8-4b6c-8f84-34e2ecc03d94" path="/var/lib/kubelet/pods/55d10636-2fe8-4b6c-8f84-34e2ecc03d94/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.197608 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59db911b-3f76-46f6-a197-0ff5218a9deb" path="/var/lib/kubelet/pods/59db911b-3f76-46f6-a197-0ff5218a9deb/volumes" Jan 21 16:24:01 crc 
kubenswrapper[4707]: I0121 16:24:01.198048 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d293288-a079-4408-b953-f457f94562e5" path="/var/lib/kubelet/pods/6d293288-a079-4408-b953-f457f94562e5/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.198840 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8e2dc7-e713-4d79-9deb-db8ecba6b34a" path="/var/lib/kubelet/pods/6d8e2dc7-e713-4d79-9deb-db8ecba6b34a/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.199269 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7d9e33-cfa5-4c05-adc5-a3810fbba263" path="/var/lib/kubelet/pods/6e7d9e33-cfa5-4c05-adc5-a3810fbba263/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.199687 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd91bc6-4821-4a19-a0c3-78fc1a3afdec" path="/var/lib/kubelet/pods/6fd91bc6-4821-4a19-a0c3-78fc1a3afdec/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.200491 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b66c3c1-46df-4e81-b311-b6d2870063d7" path="/var/lib/kubelet/pods/7b66c3c1-46df-4e81-b311-b6d2870063d7/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.200922 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5824e3-dcc7-4975-8d30-a7b46a67a196" path="/var/lib/kubelet/pods/7d5824e3-dcc7-4975-8d30-a7b46a67a196/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.201372 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84655530-1ba9-4c74-9cea-b4e97af14dda" path="/var/lib/kubelet/pods/84655530-1ba9-4c74-9cea-b4e97af14dda/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.202182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e0f79e5-b282-4b05-b02d-fd8c545ebe59" path="/var/lib/kubelet/pods/8e0f79e5-b282-4b05-b02d-fd8c545ebe59/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.202648 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee2225c-4c91-4fc9-a247-a4cf255bc7fc" path="/var/lib/kubelet/pods/9ee2225c-4c91-4fc9-a247-a4cf255bc7fc/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.203089 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0375d00-ebc1-4008-9963-0a0759952385" path="/var/lib/kubelet/pods/a0375d00-ebc1-4008-9963-0a0759952385/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.203517 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29c002e-9b49-41e1-8dbb-432265dbf327" path="/var/lib/kubelet/pods/a29c002e-9b49-41e1-8dbb-432265dbf327/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.204297 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea" path="/var/lib/kubelet/pods/a2ceb86b-bb52-4402-a1ec-aea7a5ebf2ea/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.204722 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd" path="/var/lib/kubelet/pods/aee41ee6-5b43-4f6e-b0a1-c92c2ba248bd/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.205232 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2591815-0b9e-4b37-91a6-3a4c54197b74" path="/var/lib/kubelet/pods/b2591815-0b9e-4b37-91a6-3a4c54197b74/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.206106 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ce54cb-82e7-40fe-a40b-d96f67dae3be" path="/var/lib/kubelet/pods/b8ce54cb-82e7-40fe-a40b-d96f67dae3be/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.206698 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d84130-78e3-4404-bffb-baa800b4b419" path="/var/lib/kubelet/pods/d4d84130-78e3-4404-bffb-baa800b4b419/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.207211 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9990097-194c-4cde-a8c7-b20214a7572a" path="/var/lib/kubelet/pods/d9990097-194c-4cde-a8c7-b20214a7572a/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.207627 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5b3fa6-a1e4-4362-8fb0-922403614122" path="/var/lib/kubelet/pods/de5b3fa6-a1e4-4362-8fb0-922403614122/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.208068 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9f1e23-b845-450b-9b39-4af232b4d161" path="/var/lib/kubelet/pods/df9f1e23-b845-450b-9b39-4af232b4d161/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.208953 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6beac5-0dfe-4fe9-9604-dbb1b502f988" path="/var/lib/kubelet/pods/ee6beac5-0dfe-4fe9-9604-dbb1b502f988/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.209414 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b44fd4-888a-45c9-b34d-10ed6c447905" path="/var/lib/kubelet/pods/f2b44fd4-888a-45c9-b34d-10ed6c447905/volumes" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.258189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:01 crc kubenswrapper[4707]: E0121 16:24:01.258439 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-global: configmap "edpm-compute-global" not found Jan 21 16:24:01 crc kubenswrapper[4707]: E0121 16:24:01.258562 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global podName:d24f8637-6c12-4c75-ac96-ca704feae360 nodeName:}" failed. No retries permitted until 2026-01-21 16:24:02.25854259 +0000 UTC m=+4939.440058813 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "edpm-compute-global" (UniqueName: "kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global") pod "dnsmasq-dnsmasq-7d78464677-5jzxt" (UID: "d24f8637-6c12-4c75-ac96-ca704feae360") : configmap "edpm-compute-global" not found Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.320732 4707 generic.go:334] "Generic (PLEG): container finished" podID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerID="b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5" exitCode=0 Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.320840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" event={"ID":"1dd2e280-e441-477d-9054-b7d0df0fd6b8","Type":"ContainerDied","Data":"b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5"} Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.321039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.321137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" event={"ID":"1dd2e280-e441-477d-9054-b7d0df0fd6b8","Type":"ContainerStarted","Data":"75c849fc3ace7a492873711c45bb1c4559c0eb8181497f31ec5b5a0cbb63fc89"} Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.360696 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.460342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-config\") pod \"d24f8637-6c12-4c75-ac96-ca704feae360\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.460399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-dnsmasq-svc\") pod \"d24f8637-6c12-4c75-ac96-ca704feae360\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.460503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqhdt\" (UniqueName: \"kubernetes.io/projected/d24f8637-6c12-4c75-ac96-ca704feae360-kube-api-access-xqhdt\") pod \"d24f8637-6c12-4c75-ac96-ca704feae360\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.460898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-config" (OuterVolumeSpecName: "config") pod "d24f8637-6c12-4c75-ac96-ca704feae360" (UID: "d24f8637-6c12-4c75-ac96-ca704feae360"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.460910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d24f8637-6c12-4c75-ac96-ca704feae360" (UID: "d24f8637-6c12-4c75-ac96-ca704feae360"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.463219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24f8637-6c12-4c75-ac96-ca704feae360-kube-api-access-xqhdt" (OuterVolumeSpecName: "kube-api-access-xqhdt") pod "d24f8637-6c12-4c75-ac96-ca704feae360" (UID: "d24f8637-6c12-4c75-ac96-ca704feae360"). InnerVolumeSpecName "kube-api-access-xqhdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.561899 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqhdt\" (UniqueName: \"kubernetes.io/projected/d24f8637-6c12-4c75-ac96-ca704feae360-kube-api-access-xqhdt\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.561922 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:01 crc kubenswrapper[4707]: I0121 16:24:01.561930 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.269940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-5jzxt\" (UID: \"d24f8637-6c12-4c75-ac96-ca704feae360\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:02 crc kubenswrapper[4707]: E0121 16:24:02.270092 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-global: configmap "edpm-compute-global" not found Jan 21 16:24:02 crc kubenswrapper[4707]: E0121 16:24:02.270424 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global podName:d24f8637-6c12-4c75-ac96-ca704feae360 nodeName:}" failed. No retries permitted until 2026-01-21 16:24:04.27039907 +0000 UTC m=+4941.451915293 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "edpm-compute-global" (UniqueName: "kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global") pod "dnsmasq-dnsmasq-7d78464677-5jzxt" (UID: "d24f8637-6c12-4c75-ac96-ca704feae360") : configmap "edpm-compute-global" not found Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.329562 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" event={"ID":"1dd2e280-e441-477d-9054-b7d0df0fd6b8","Type":"ContainerStarted","Data":"92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc"} Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.329572 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt" Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.330089 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.351057 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" podStartSLOduration=2.351042543 podStartE2EDuration="2.351042543s" podCreationTimestamp="2026-01-21 16:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:24:02.345216249 +0000 UTC m=+4939.526732470" watchObservedRunningTime="2026-01-21 16:24:02.351042543 +0000 UTC m=+4939.532558765" Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.370982 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt"] Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.374585 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-5jzxt"] Jan 21 16:24:02 crc kubenswrapper[4707]: I0121 16:24:02.575035 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/d24f8637-6c12-4c75-ac96-ca704feae360-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:03 crc kubenswrapper[4707]: I0121 16:24:03.188920 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24f8637-6c12-4c75-ac96-ca704feae360" path="/var/lib/kubelet/pods/d24f8637-6c12-4c75-ac96-ca704feae360/volumes" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.448841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-7kt6t"] Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.454434 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-7kt6t"] Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.555411 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-2ck5s"] Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.556623 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.558426 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.558740 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.558993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.559956 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.561262 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2ck5s"] Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.727039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33b88f93-37c0-4411-8009-14cc68746134-node-mnt\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.727128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69mc\" (UniqueName: \"kubernetes.io/projected/33b88f93-37c0-4411-8009-14cc68746134-kube-api-access-g69mc\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.727769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33b88f93-37c0-4411-8009-14cc68746134-crc-storage\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.830019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33b88f93-37c0-4411-8009-14cc68746134-node-mnt\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.830124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g69mc\" (UniqueName: \"kubernetes.io/projected/33b88f93-37c0-4411-8009-14cc68746134-kube-api-access-g69mc\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.830203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33b88f93-37c0-4411-8009-14cc68746134-crc-storage\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.830281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33b88f93-37c0-4411-8009-14cc68746134-node-mnt\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " 
pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.830803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33b88f93-37c0-4411-8009-14cc68746134-crc-storage\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.845537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69mc\" (UniqueName: \"kubernetes.io/projected/33b88f93-37c0-4411-8009-14cc68746134-kube-api-access-g69mc\") pod \"crc-storage-crc-2ck5s\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:06 crc kubenswrapper[4707]: I0121 16:24:06.876286 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:07 crc kubenswrapper[4707]: I0121 16:24:07.190389 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9faafb8-ddc7-4892-8b73-5f4353988f41" path="/var/lib/kubelet/pods/c9faafb8-ddc7-4892-8b73-5f4353988f41/volumes" Jan 21 16:24:07 crc kubenswrapper[4707]: I0121 16:24:07.245834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2ck5s"] Jan 21 16:24:07 crc kubenswrapper[4707]: I0121 16:24:07.364364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2ck5s" event={"ID":"33b88f93-37c0-4411-8009-14cc68746134","Type":"ContainerStarted","Data":"7693bfef4f6aae6026732e7a6634b3d3b071df21d37299044471bf471e9168eb"} Jan 21 16:24:08 crc kubenswrapper[4707]: I0121 16:24:08.372074 4707 generic.go:334] "Generic (PLEG): container finished" podID="33b88f93-37c0-4411-8009-14cc68746134" containerID="026314c401de752ded286a78cfc5e12aeb86eac48b9f8d514a3b62e6ef726c17" exitCode=0 Jan 21 16:24:08 crc kubenswrapper[4707]: I0121 16:24:08.372182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2ck5s" event={"ID":"33b88f93-37c0-4411-8009-14cc68746134","Type":"ContainerDied","Data":"026314c401de752ded286a78cfc5e12aeb86eac48b9f8d514a3b62e6ef726c17"} Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.581066 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.764458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33b88f93-37c0-4411-8009-14cc68746134-crc-storage\") pod \"33b88f93-37c0-4411-8009-14cc68746134\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.764569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g69mc\" (UniqueName: \"kubernetes.io/projected/33b88f93-37c0-4411-8009-14cc68746134-kube-api-access-g69mc\") pod \"33b88f93-37c0-4411-8009-14cc68746134\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.764616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33b88f93-37c0-4411-8009-14cc68746134-node-mnt\") pod \"33b88f93-37c0-4411-8009-14cc68746134\" (UID: \"33b88f93-37c0-4411-8009-14cc68746134\") " Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.764801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33b88f93-37c0-4411-8009-14cc68746134-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "33b88f93-37c0-4411-8009-14cc68746134" (UID: "33b88f93-37c0-4411-8009-14cc68746134"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.765012 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33b88f93-37c0-4411-8009-14cc68746134-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.769085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b88f93-37c0-4411-8009-14cc68746134-kube-api-access-g69mc" (OuterVolumeSpecName: "kube-api-access-g69mc") pod "33b88f93-37c0-4411-8009-14cc68746134" (UID: "33b88f93-37c0-4411-8009-14cc68746134"). InnerVolumeSpecName "kube-api-access-g69mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.780386 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b88f93-37c0-4411-8009-14cc68746134-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "33b88f93-37c0-4411-8009-14cc68746134" (UID: "33b88f93-37c0-4411-8009-14cc68746134"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.866089 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33b88f93-37c0-4411-8009-14cc68746134-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:09 crc kubenswrapper[4707]: I0121 16:24:09.866127 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g69mc\" (UniqueName: \"kubernetes.io/projected/33b88f93-37c0-4411-8009-14cc68746134-kube-api-access-g69mc\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:10 crc kubenswrapper[4707]: I0121 16:24:10.385018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2ck5s" event={"ID":"33b88f93-37c0-4411-8009-14cc68746134","Type":"ContainerDied","Data":"7693bfef4f6aae6026732e7a6634b3d3b071df21d37299044471bf471e9168eb"} Jan 21 16:24:10 crc kubenswrapper[4707]: I0121 16:24:10.385057 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7693bfef4f6aae6026732e7a6634b3d3b071df21d37299044471bf471e9168eb" Jan 21 16:24:10 crc kubenswrapper[4707]: I0121 16:24:10.385059 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2ck5s" Jan 21 16:24:10 crc kubenswrapper[4707]: I0121 16:24:10.810939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:10 crc kubenswrapper[4707]: I0121 16:24:10.842576 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5"] Jan 21 16:24:10 crc kubenswrapper[4707]: I0121 16:24:10.842800 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerName="dnsmasq-dns" containerID="cri-o://de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103" gracePeriod=10 Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.185777 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.285412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-global\") pod \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.285448 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-config\") pod \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.285516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-beta-nodeset\") pod \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.285539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr5dj\" (UniqueName: \"kubernetes.io/projected/0d14c6be-c04f-4eae-bce6-a850a6382cfb-kube-api-access-wr5dj\") pod \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.285556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-dnsmasq-svc\") pod \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\" (UID: \"0d14c6be-c04f-4eae-bce6-a850a6382cfb\") " Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.289507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d14c6be-c04f-4eae-bce6-a850a6382cfb-kube-api-access-wr5dj" (OuterVolumeSpecName: "kube-api-access-wr5dj") pod "0d14c6be-c04f-4eae-bce6-a850a6382cfb" (UID: "0d14c6be-c04f-4eae-bce6-a850a6382cfb"). InnerVolumeSpecName "kube-api-access-wr5dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.310517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "0d14c6be-c04f-4eae-bce6-a850a6382cfb" (UID: "0d14c6be-c04f-4eae-bce6-a850a6382cfb"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.310846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-config" (OuterVolumeSpecName: "config") pod "0d14c6be-c04f-4eae-bce6-a850a6382cfb" (UID: "0d14c6be-c04f-4eae-bce6-a850a6382cfb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.311331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "edpm-compute-beta-nodeset") pod "0d14c6be-c04f-4eae-bce6-a850a6382cfb" (UID: "0d14c6be-c04f-4eae-bce6-a850a6382cfb"). InnerVolumeSpecName "edpm-compute-beta-nodeset". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.311424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-global" (OuterVolumeSpecName: "edpm-compute-global") pod "0d14c6be-c04f-4eae-bce6-a850a6382cfb" (UID: "0d14c6be-c04f-4eae-bce6-a850a6382cfb"). InnerVolumeSpecName "edpm-compute-global". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.387731 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.387761 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr5dj\" (UniqueName: \"kubernetes.io/projected/0d14c6be-c04f-4eae-bce6-a850a6382cfb-kube-api-access-wr5dj\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.387771 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.387781 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.387793 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d14c6be-c04f-4eae-bce6-a850a6382cfb-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.398865 4707 generic.go:334] "Generic (PLEG): container finished" podID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerID="de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103" exitCode=0 Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.398926 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.398937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" event={"ID":"0d14c6be-c04f-4eae-bce6-a850a6382cfb","Type":"ContainerDied","Data":"de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103"} Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.398966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5" event={"ID":"0d14c6be-c04f-4eae-bce6-a850a6382cfb","Type":"ContainerDied","Data":"55df3a8b4d347c20ca49da25af820fef6f2ac44ceffe888ab353dd1d1d5a16e8"} Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.398982 4707 scope.go:117] "RemoveContainer" containerID="de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.418732 4707 scope.go:117] "RemoveContainer" containerID="a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.424018 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5"] Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.428919 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-tlgm5"] Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.449413 4707 scope.go:117] "RemoveContainer" containerID="de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103" Jan 21 16:24:11 crc kubenswrapper[4707]: E0121 16:24:11.449792 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103\": container with ID starting with de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103 not found: ID does not exist" containerID="de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.449845 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103"} err="failed to get container status \"de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103\": rpc error: code = NotFound desc = could not find container \"de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103\": container with ID starting with de0b021586f4c68b5fef8a72ed58ba8af9730468246208cec349271eb4772103 not found: ID does not exist" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.449865 4707 scope.go:117] "RemoveContainer" containerID="a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd" Jan 21 16:24:11 crc kubenswrapper[4707]: E0121 16:24:11.450178 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd\": container with ID starting with a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd not found: ID does not exist" containerID="a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd" Jan 21 16:24:11 crc kubenswrapper[4707]: I0121 16:24:11.450223 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd"} err="failed to get container status \"a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd\": rpc error: code = NotFound desc = could not find container \"a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd\": container with ID starting with a2c0109198bffe3f338ce6ce9f0c21cccc16597a369ef1dc88a458ca5f5d5efd not found: ID does not exist" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.442852 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-2ck5s"] Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.446559 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-2ck5s"] Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.556473 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-4llql"] Jan 21 16:24:12 crc kubenswrapper[4707]: E0121 16:24:12.556713 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerName="dnsmasq-dns" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.556730 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerName="dnsmasq-dns" Jan 21 16:24:12 crc kubenswrapper[4707]: E0121 16:24:12.556740 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerName="init" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.556746 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerName="init" Jan 21 16:24:12 crc kubenswrapper[4707]: E0121 16:24:12.556760 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b88f93-37c0-4411-8009-14cc68746134" containerName="storage" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.556766 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b88f93-37c0-4411-8009-14cc68746134" containerName="storage" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.556925 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" containerName="dnsmasq-dns" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.556941 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b88f93-37c0-4411-8009-14cc68746134" containerName="storage" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.557370 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.558744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.559550 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.559977 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.560901 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.564194 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4llql"] Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.602466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/defca8e3-7423-4aea-84b6-0054fa424c75-crc-storage\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.602525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/defca8e3-7423-4aea-84b6-0054fa424c75-node-mnt\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.602672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmtdv\" (UniqueName: \"kubernetes.io/projected/defca8e3-7423-4aea-84b6-0054fa424c75-kube-api-access-jmtdv\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.703228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmtdv\" (UniqueName: \"kubernetes.io/projected/defca8e3-7423-4aea-84b6-0054fa424c75-kube-api-access-jmtdv\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.703281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/defca8e3-7423-4aea-84b6-0054fa424c75-crc-storage\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.703329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/defca8e3-7423-4aea-84b6-0054fa424c75-node-mnt\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.703560 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/defca8e3-7423-4aea-84b6-0054fa424c75-node-mnt\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " 
pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.703901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/defca8e3-7423-4aea-84b6-0054fa424c75-crc-storage\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.717918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmtdv\" (UniqueName: \"kubernetes.io/projected/defca8e3-7423-4aea-84b6-0054fa424c75-kube-api-access-jmtdv\") pod \"crc-storage-crc-4llql\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:12 crc kubenswrapper[4707]: I0121 16:24:12.869859 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:13 crc kubenswrapper[4707]: I0121 16:24:13.189147 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d14c6be-c04f-4eae-bce6-a850a6382cfb" path="/var/lib/kubelet/pods/0d14c6be-c04f-4eae-bce6-a850a6382cfb/volumes" Jan 21 16:24:13 crc kubenswrapper[4707]: I0121 16:24:13.189954 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b88f93-37c0-4411-8009-14cc68746134" path="/var/lib/kubelet/pods/33b88f93-37c0-4411-8009-14cc68746134/volumes" Jan 21 16:24:13 crc kubenswrapper[4707]: I0121 16:24:13.274482 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4llql"] Jan 21 16:24:13 crc kubenswrapper[4707]: I0121 16:24:13.414017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4llql" event={"ID":"defca8e3-7423-4aea-84b6-0054fa424c75","Type":"ContainerStarted","Data":"f045836e27f0429ac72cd4b893cc8fe636f5ad388dda9aa6331fdd196065bcfa"} Jan 21 16:24:14 crc kubenswrapper[4707]: I0121 16:24:14.421262 4707 generic.go:334] "Generic (PLEG): container finished" podID="defca8e3-7423-4aea-84b6-0054fa424c75" containerID="770f64e2a96c02995ed7ed5e8a4f467bc54852deb47c0befcdfdfcd42a4603f7" exitCode=0 Jan 21 16:24:14 crc kubenswrapper[4707]: I0121 16:24:14.421373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4llql" event={"ID":"defca8e3-7423-4aea-84b6-0054fa424c75","Type":"ContainerDied","Data":"770f64e2a96c02995ed7ed5e8a4f467bc54852deb47c0befcdfdfcd42a4603f7"} Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.628131 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.734830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/defca8e3-7423-4aea-84b6-0054fa424c75-node-mnt\") pod \"defca8e3-7423-4aea-84b6-0054fa424c75\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.734964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/defca8e3-7423-4aea-84b6-0054fa424c75-crc-storage\") pod \"defca8e3-7423-4aea-84b6-0054fa424c75\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.734986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/defca8e3-7423-4aea-84b6-0054fa424c75-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "defca8e3-7423-4aea-84b6-0054fa424c75" (UID: "defca8e3-7423-4aea-84b6-0054fa424c75"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.735024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmtdv\" (UniqueName: \"kubernetes.io/projected/defca8e3-7423-4aea-84b6-0054fa424c75-kube-api-access-jmtdv\") pod \"defca8e3-7423-4aea-84b6-0054fa424c75\" (UID: \"defca8e3-7423-4aea-84b6-0054fa424c75\") " Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.735392 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/defca8e3-7423-4aea-84b6-0054fa424c75-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.739069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defca8e3-7423-4aea-84b6-0054fa424c75-kube-api-access-jmtdv" (OuterVolumeSpecName: "kube-api-access-jmtdv") pod "defca8e3-7423-4aea-84b6-0054fa424c75" (UID: "defca8e3-7423-4aea-84b6-0054fa424c75"). InnerVolumeSpecName "kube-api-access-jmtdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.748771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defca8e3-7423-4aea-84b6-0054fa424c75-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "defca8e3-7423-4aea-84b6-0054fa424c75" (UID: "defca8e3-7423-4aea-84b6-0054fa424c75"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.836288 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/defca8e3-7423-4aea-84b6-0054fa424c75-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:15 crc kubenswrapper[4707]: I0121 16:24:15.836320 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmtdv\" (UniqueName: \"kubernetes.io/projected/defca8e3-7423-4aea-84b6-0054fa424c75-kube-api-access-jmtdv\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:16 crc kubenswrapper[4707]: I0121 16:24:16.434708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4llql" event={"ID":"defca8e3-7423-4aea-84b6-0054fa424c75","Type":"ContainerDied","Data":"f045836e27f0429ac72cd4b893cc8fe636f5ad388dda9aa6331fdd196065bcfa"} Jan 21 16:24:16 crc kubenswrapper[4707]: I0121 16:24:16.434892 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f045836e27f0429ac72cd4b893cc8fe636f5ad388dda9aa6331fdd196065bcfa" Jan 21 16:24:16 crc kubenswrapper[4707]: I0121 16:24:16.434746 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4llql" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.571728 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj"] Jan 21 16:24:18 crc kubenswrapper[4707]: E0121 16:24:18.572196 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defca8e3-7423-4aea-84b6-0054fa424c75" containerName="storage" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.572214 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="defca8e3-7423-4aea-84b6-0054fa424c75" containerName="storage" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.572365 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="defca8e3-7423-4aea-84b6-0054fa424c75" containerName="storage" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.573008 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.575051 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-tls" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.611973 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj"] Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.639913 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj"] Jan 21 16:24:18 crc kubenswrapper[4707]: E0121 16:24:18.640337 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc kube-api-access-fnxc5 openstack-edpm-tls], unattached volumes=[], failed to process volumes=[config dnsmasq-svc kube-api-access-fnxc5 openstack-edpm-tls]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" podUID="0500510d-d35e-48d7-9371-5af06453f506" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.655928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z"] Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.656922 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.665414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z"] Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxc5\" (UniqueName: \"kubernetes.io/projected/0500510d-d35e-48d7-9371-5af06453f506-kube-api-access-fnxc5\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rb7\" (UniqueName: \"kubernetes.io/projected/86af110f-d26e-4cbe-886d-65dd4f23f594-kube-api-access-p5rb7\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.769823 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-config\") pod 
\"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxc5\" (UniqueName: \"kubernetes.io/projected/0500510d-d35e-48d7-9371-5af06453f506-kube-api-access-fnxc5\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rb7\" (UniqueName: \"kubernetes.io/projected/86af110f-d26e-4cbe-886d-65dd4f23f594-kube-api-access-p5rb7\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-config\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.871920 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.872451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.872522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-config\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.872650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.872699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.886529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rb7\" (UniqueName: \"kubernetes.io/projected/86af110f-d26e-4cbe-886d-65dd4f23f594-kube-api-access-p5rb7\") pod \"dnsmasq-dnsmasq-79cc674687-gnc2z\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.887455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxc5\" (UniqueName: \"kubernetes.io/projected/0500510d-d35e-48d7-9371-5af06453f506-kube-api-access-fnxc5\") pod \"dnsmasq-dnsmasq-78c7b787f5-4l4gj\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:18 crc kubenswrapper[4707]: I0121 16:24:18.968542 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.312949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z"] Jan 21 16:24:19 crc kubenswrapper[4707]: W0121 16:24:19.316547 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86af110f_d26e_4cbe_886d_65dd4f23f594.slice/crio-9ef21560ec6b464cb3b64e39fa6ff90c07d3ad824db7cb4ffae87e6501ac4700 WatchSource:0}: Error finding container 9ef21560ec6b464cb3b64e39fa6ff90c07d3ad824db7cb4ffae87e6501ac4700: Status 404 returned error can't find the container with id 9ef21560ec6b464cb3b64e39fa6ff90c07d3ad824db7cb4ffae87e6501ac4700 Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.455749 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.455787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" event={"ID":"86af110f-d26e-4cbe-886d-65dd4f23f594","Type":"ContainerStarted","Data":"19d1ce758b48adac59dd714741a46435aecfb303593e7fc93f7608e66ec1d358"} Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.455827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" event={"ID":"86af110f-d26e-4cbe-886d-65dd4f23f594","Type":"ContainerStarted","Data":"9ef21560ec6b464cb3b64e39fa6ff90c07d3ad824db7cb4ffae87e6501ac4700"} Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.546559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.682619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-openstack-edpm-tls\") pod \"0500510d-d35e-48d7-9371-5af06453f506\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.682790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxc5\" (UniqueName: \"kubernetes.io/projected/0500510d-d35e-48d7-9371-5af06453f506-kube-api-access-fnxc5\") pod \"0500510d-d35e-48d7-9371-5af06453f506\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.682885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-dnsmasq-svc\") pod \"0500510d-d35e-48d7-9371-5af06453f506\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.682915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-config\") pod \"0500510d-d35e-48d7-9371-5af06453f506\" (UID: \"0500510d-d35e-48d7-9371-5af06453f506\") " Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.683048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "0500510d-d35e-48d7-9371-5af06453f506" (UID: "0500510d-d35e-48d7-9371-5af06453f506"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.683294 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.683473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "0500510d-d35e-48d7-9371-5af06453f506" (UID: "0500510d-d35e-48d7-9371-5af06453f506"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.683485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-config" (OuterVolumeSpecName: "config") pod "0500510d-d35e-48d7-9371-5af06453f506" (UID: "0500510d-d35e-48d7-9371-5af06453f506"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.685289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0500510d-d35e-48d7-9371-5af06453f506-kube-api-access-fnxc5" (OuterVolumeSpecName: "kube-api-access-fnxc5") pod "0500510d-d35e-48d7-9371-5af06453f506" (UID: "0500510d-d35e-48d7-9371-5af06453f506"). InnerVolumeSpecName "kube-api-access-fnxc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.784435 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.784460 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnxc5\" (UniqueName: \"kubernetes.io/projected/0500510d-d35e-48d7-9371-5af06453f506-kube-api-access-fnxc5\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:19 crc kubenswrapper[4707]: I0121 16:24:19.784471 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0500510d-d35e-48d7-9371-5af06453f506-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.464348 4707 generic.go:334] "Generic (PLEG): container finished" podID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerID="19d1ce758b48adac59dd714741a46435aecfb303593e7fc93f7608e66ec1d358" exitCode=0 Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.464386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" event={"ID":"86af110f-d26e-4cbe-886d-65dd4f23f594","Type":"ContainerDied","Data":"19d1ce758b48adac59dd714741a46435aecfb303593e7fc93f7608e66ec1d358"} Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.464420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" event={"ID":"86af110f-d26e-4cbe-886d-65dd4f23f594","Type":"ContainerStarted","Data":"1f331bea268b3bfc1c5596a5131e82556e0d7b36899c8e4c4bbcaaf1eb97b6b4"} Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.464440 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj" Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.464654 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.480290 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" podStartSLOduration=2.480276066 podStartE2EDuration="2.480276066s" podCreationTimestamp="2026-01-21 16:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:24:20.477023701 +0000 UTC m=+4957.658539923" watchObservedRunningTime="2026-01-21 16:24:20.480276066 +0000 UTC m=+4957.661792288" Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.499327 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj"] Jan 21 16:24:20 crc kubenswrapper[4707]: I0121 16:24:20.502777 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-4l4gj"] Jan 21 16:24:21 crc kubenswrapper[4707]: I0121 16:24:21.188871 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0500510d-d35e-48d7-9371-5af06453f506" path="/var/lib/kubelet/pods/0500510d-d35e-48d7-9371-5af06453f506/volumes" Jan 21 16:24:21 crc kubenswrapper[4707]: I0121 16:24:21.680751 4707 scope.go:117] "RemoveContainer" containerID="bba48301a1d812e5b1d562f60f016bd4942fac35d4db8dd307d57b200b5ae46e" Jan 21 16:24:21 crc kubenswrapper[4707]: I0121 16:24:21.708471 4707 scope.go:117] "RemoveContainer" containerID="9d565128b88e07372216fc5b02fb6fc9a64472ff30da1ffcb272fe9c92a40041" Jan 21 16:24:21 crc kubenswrapper[4707]: I0121 16:24:21.720192 4707 scope.go:117] "RemoveContainer" containerID="2917839fbaec0d038ba217855d1896f0c2eb80cde105fd1293e6851a3cfa05d2" Jan 21 16:24:28 crc kubenswrapper[4707]: I0121 16:24:28.970522 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.003895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld"] Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.004080 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerName="dnsmasq-dns" containerID="cri-o://92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc" gracePeriod=10 Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.070587 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj"] Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.071791 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.081180 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj"] Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.081741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-config\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.081803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdw8g\" (UniqueName: \"kubernetes.io/projected/f2208347-c972-41e8-94b3-bf07a03b7537-kube-api-access-qdw8g\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.081988 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.082033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.183067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.183167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-config\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.183505 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdw8g\" (UniqueName: \"kubernetes.io/projected/f2208347-c972-41e8-94b3-bf07a03b7537-kube-api-access-qdw8g\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.183599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.183926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.184067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-config\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.184363 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.198992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdw8g\" (UniqueName: \"kubernetes.io/projected/f2208347-c972-41e8-94b3-bf07a03b7537-kube-api-access-qdw8g\") pod \"dnsmasq-dnsmasq-76b7c4d945-dnrrj\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.326144 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.389384 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.486990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pggn4\" (UniqueName: \"kubernetes.io/projected/1dd2e280-e441-477d-9054-b7d0df0fd6b8-kube-api-access-pggn4\") pod \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.487159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-config\") pod \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.487231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-dnsmasq-svc\") pod \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\" (UID: \"1dd2e280-e441-477d-9054-b7d0df0fd6b8\") " Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.490604 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd2e280-e441-477d-9054-b7d0df0fd6b8-kube-api-access-pggn4" (OuterVolumeSpecName: "kube-api-access-pggn4") pod "1dd2e280-e441-477d-9054-b7d0df0fd6b8" (UID: "1dd2e280-e441-477d-9054-b7d0df0fd6b8"). InnerVolumeSpecName "kube-api-access-pggn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.513779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-config" (OuterVolumeSpecName: "config") pod "1dd2e280-e441-477d-9054-b7d0df0fd6b8" (UID: "1dd2e280-e441-477d-9054-b7d0df0fd6b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.516316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "1dd2e280-e441-477d-9054-b7d0df0fd6b8" (UID: "1dd2e280-e441-477d-9054-b7d0df0fd6b8"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.520104 4707 generic.go:334] "Generic (PLEG): container finished" podID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerID="92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc" exitCode=0 Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.520141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" event={"ID":"1dd2e280-e441-477d-9054-b7d0df0fd6b8","Type":"ContainerDied","Data":"92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc"} Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.520148 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.520166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld" event={"ID":"1dd2e280-e441-477d-9054-b7d0df0fd6b8","Type":"ContainerDied","Data":"75c849fc3ace7a492873711c45bb1c4559c0eb8181497f31ec5b5a0cbb63fc89"} Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.520182 4707 scope.go:117] "RemoveContainer" containerID="92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.549380 4707 scope.go:117] "RemoveContainer" containerID="b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.552418 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld"] Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.557479 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-5f8ld"] Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.562552 4707 scope.go:117] "RemoveContainer" containerID="92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc" Jan 21 16:24:29 crc kubenswrapper[4707]: E0121 16:24:29.562957 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc\": container with ID starting with 92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc not found: ID does not exist" containerID="92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.563001 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc"} err="failed to get container status \"92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc\": rpc error: code = NotFound desc = could not find container \"92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc\": container with ID starting with 92cc140cc2bc0f06aa422b5197834b2fd729c8bcf5f82805bffa5dd7950395dc not found: ID does not exist" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.563027 4707 scope.go:117] "RemoveContainer" containerID="b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5" Jan 21 16:24:29 crc kubenswrapper[4707]: E0121 16:24:29.563304 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5\": container with ID starting with b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5 not found: ID does not exist" containerID="b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.563340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5"} err="failed to get container status \"b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5\": rpc error: code = NotFound desc = could not find container \"b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5\": container with ID starting with b73348abbfe2b3e1c8e07476cc9f9bbb01afe93d5675fbe29ccbf6487c6e12c5 not found: ID does not exist" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.588427 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.588449 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1dd2e280-e441-477d-9054-b7d0df0fd6b8-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.588461 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pggn4\" (UniqueName: \"kubernetes.io/projected/1dd2e280-e441-477d-9054-b7d0df0fd6b8-kube-api-access-pggn4\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:29 crc kubenswrapper[4707]: I0121 16:24:29.731280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj"] Jan 21 16:24:30 crc kubenswrapper[4707]: I0121 16:24:30.528462 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2208347-c972-41e8-94b3-bf07a03b7537" containerID="93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be" exitCode=0 Jan 21 16:24:30 crc kubenswrapper[4707]: I0121 16:24:30.528568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" event={"ID":"f2208347-c972-41e8-94b3-bf07a03b7537","Type":"ContainerDied","Data":"93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be"} Jan 21 16:24:30 crc kubenswrapper[4707]: I0121 16:24:30.528687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" 
event={"ID":"f2208347-c972-41e8-94b3-bf07a03b7537","Type":"ContainerStarted","Data":"51228644d8b4574e037596c54277b08a93f066c83ed8b2af69647bb3d1c133b7"} Jan 21 16:24:31 crc kubenswrapper[4707]: I0121 16:24:31.189940 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" path="/var/lib/kubelet/pods/1dd2e280-e441-477d-9054-b7d0df0fd6b8/volumes" Jan 21 16:24:31 crc kubenswrapper[4707]: I0121 16:24:31.536715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" event={"ID":"f2208347-c972-41e8-94b3-bf07a03b7537","Type":"ContainerStarted","Data":"3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19"} Jan 21 16:24:31 crc kubenswrapper[4707]: I0121 16:24:31.536888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:31 crc kubenswrapper[4707]: I0121 16:24:31.552633 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" podStartSLOduration=2.552621722 podStartE2EDuration="2.552621722s" podCreationTimestamp="2026-01-21 16:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:24:31.548276542 +0000 UTC m=+4968.729792764" watchObservedRunningTime="2026-01-21 16:24:31.552621722 +0000 UTC m=+4968.734137944" Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.390829 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.421803 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z"] Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.422024 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerName="dnsmasq-dns" containerID="cri-o://1f331bea268b3bfc1c5596a5131e82556e0d7b36899c8e4c4bbcaaf1eb97b6b4" gracePeriod=10 Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.582627 4707 generic.go:334] "Generic (PLEG): container finished" podID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerID="1f331bea268b3bfc1c5596a5131e82556e0d7b36899c8e4c4bbcaaf1eb97b6b4" exitCode=0 Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.582695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" event={"ID":"86af110f-d26e-4cbe-886d-65dd4f23f594","Type":"ContainerDied","Data":"1f331bea268b3bfc1c5596a5131e82556e0d7b36899c8e4c4bbcaaf1eb97b6b4"} Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.738376 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.899641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-dnsmasq-svc\") pod \"86af110f-d26e-4cbe-886d-65dd4f23f594\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.899702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-config\") pod \"86af110f-d26e-4cbe-886d-65dd4f23f594\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.899730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-openstack-edpm-tls\") pod \"86af110f-d26e-4cbe-886d-65dd4f23f594\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.899749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5rb7\" (UniqueName: \"kubernetes.io/projected/86af110f-d26e-4cbe-886d-65dd4f23f594-kube-api-access-p5rb7\") pod \"86af110f-d26e-4cbe-886d-65dd4f23f594\" (UID: \"86af110f-d26e-4cbe-886d-65dd4f23f594\") " Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.903997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86af110f-d26e-4cbe-886d-65dd4f23f594-kube-api-access-p5rb7" (OuterVolumeSpecName: "kube-api-access-p5rb7") pod "86af110f-d26e-4cbe-886d-65dd4f23f594" (UID: "86af110f-d26e-4cbe-886d-65dd4f23f594"). InnerVolumeSpecName "kube-api-access-p5rb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.925014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-config" (OuterVolumeSpecName: "config") pod "86af110f-d26e-4cbe-886d-65dd4f23f594" (UID: "86af110f-d26e-4cbe-886d-65dd4f23f594"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.926622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "86af110f-d26e-4cbe-886d-65dd4f23f594" (UID: "86af110f-d26e-4cbe-886d-65dd4f23f594"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:39 crc kubenswrapper[4707]: I0121 16:24:39.927014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "86af110f-d26e-4cbe-886d-65dd4f23f594" (UID: "86af110f-d26e-4cbe-886d-65dd4f23f594"). InnerVolumeSpecName "openstack-edpm-tls". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.000895 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.000920 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.000934 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5rb7\" (UniqueName: \"kubernetes.io/projected/86af110f-d26e-4cbe-886d-65dd4f23f594-kube-api-access-p5rb7\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.000945 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/86af110f-d26e-4cbe-886d-65dd4f23f594-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.600221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" event={"ID":"86af110f-d26e-4cbe-886d-65dd4f23f594","Type":"ContainerDied","Data":"9ef21560ec6b464cb3b64e39fa6ff90c07d3ad824db7cb4ffae87e6501ac4700"} Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.600266 4707 scope.go:117] "RemoveContainer" containerID="1f331bea268b3bfc1c5596a5131e82556e0d7b36899c8e4c4bbcaaf1eb97b6b4" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.600433 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.620975 4707 scope.go:117] "RemoveContainer" containerID="19d1ce758b48adac59dd714741a46435aecfb303593e7fc93f7608e66ec1d358" Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.638108 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z"] Jan 21 16:24:40 crc kubenswrapper[4707]: I0121 16:24:40.646954 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gnc2z"] Jan 21 16:24:41 crc kubenswrapper[4707]: I0121 16:24:41.189753 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" path="/var/lib/kubelet/pods/86af110f-d26e-4cbe-886d-65dd4f23f594/volumes" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345168 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz"] Jan 21 16:24:43 crc kubenswrapper[4707]: E0121 16:24:43.345606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerName="dnsmasq-dns" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345617 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerName="dnsmasq-dns" Jan 21 16:24:43 crc kubenswrapper[4707]: E0121 16:24:43.345628 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerName="init" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345633 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerName="init" Jan 21 16:24:43 crc kubenswrapper[4707]: E0121 16:24:43.345642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerName="init" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345647 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerName="init" Jan 21 16:24:43 crc kubenswrapper[4707]: E0121 16:24:43.345654 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerName="dnsmasq-dns" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerName="dnsmasq-dns" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd2e280-e441-477d-9054-b7d0df0fd6b8" containerName="dnsmasq-dns" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.345794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="86af110f-d26e-4cbe-886d-65dd4f23f594" containerName="dnsmasq-dns" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.346194 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351330 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-generic-service1-default-certs-0" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351502 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-generic-service1-default-certs-2" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-xkp2k" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351863 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-generic-service1-default-certs-1" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.351877 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.353605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz"] Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.542766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-install-certs-ovr-combined-ca-bundle\") pod 
\"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.542895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbpd\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-kube-api-access-rfbpd\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.542931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-generic-service1-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.542972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-inventory\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.542991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.543034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.643643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-inventory\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.643680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.643714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.643752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-install-certs-ovr-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.643801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbpd\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-kube-api-access-rfbpd\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.643844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-generic-service1-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.647754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-inventory\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.648119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-generic-service1-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.648310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-install-certs-ovr-combined-ca-bundle\") pod 
\"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.648595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.648861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.656742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbpd\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-kube-api-access-rfbpd\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:43 crc kubenswrapper[4707]: I0121 16:24:43.663542 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:44 crc kubenswrapper[4707]: I0121 16:24:44.005361 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz"] Jan 21 16:24:44 crc kubenswrapper[4707]: I0121 16:24:44.625043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" event={"ID":"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07","Type":"ContainerStarted","Data":"b47a17e145e731e864046d57e853827cc9066db624852107476f44e4bad8e848"} Jan 21 16:24:44 crc kubenswrapper[4707]: I0121 16:24:44.625323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" event={"ID":"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07","Type":"ContainerStarted","Data":"883987b1b6dd31d2783e3bc362e518e8d8c0b831f7d9dc4938bb529beb943955"} Jan 21 16:24:44 crc kubenswrapper[4707]: I0121 16:24:44.638855 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" podStartSLOduration=1.157101064 podStartE2EDuration="1.63884212s" podCreationTimestamp="2026-01-21 16:24:43 +0000 UTC" firstStartedPulling="2026-01-21 16:24:44.011760291 +0000 UTC m=+4981.193276513" lastFinishedPulling="2026-01-21 16:24:44.493501347 +0000 UTC m=+4981.675017569" observedRunningTime="2026-01-21 16:24:44.637362718 +0000 UTC m=+4981.818878940" watchObservedRunningTime="2026-01-21 16:24:44.63884212 +0000 UTC m=+4981.820358342" Jan 21 16:24:47 crc kubenswrapper[4707]: I0121 16:24:47.643323 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" containerID="b47a17e145e731e864046d57e853827cc9066db624852107476f44e4bad8e848" exitCode=0 Jan 21 16:24:47 crc kubenswrapper[4707]: I0121 16:24:47.643396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" event={"ID":"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07","Type":"ContainerDied","Data":"b47a17e145e731e864046d57e853827cc9066db624852107476f44e4bad8e848"} Jan 21 16:24:48 crc kubenswrapper[4707]: I0121 16:24:48.848672 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.003078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.003149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-generic-service1-combined-ca-bundle\") pod \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.003179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-inventory\") pod \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.003225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfbpd\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-kube-api-access-rfbpd\") pod \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.003281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-ssh-key-openstack-edpm-tls\") pod \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.003300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-install-certs-ovr-combined-ca-bundle\") pod \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\" (UID: \"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07\") " Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.007518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-install-certs-ovr-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovr-combined-ca-bundle") pod "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" (UID: "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07"). InnerVolumeSpecName "install-certs-ovr-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.007584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-kube-api-access-rfbpd" (OuterVolumeSpecName: "kube-api-access-rfbpd") pod "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" (UID: "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07"). InnerVolumeSpecName "kube-api-access-rfbpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.007867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-openstack-edpm-tls-generic-service1-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-generic-service1-default-certs-0") pod "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" (UID: "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07"). InnerVolumeSpecName "openstack-edpm-tls-generic-service1-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.007964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-generic-service1-combined-ca-bundle" (OuterVolumeSpecName: "generic-service1-combined-ca-bundle") pod "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" (UID: "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07"). InnerVolumeSpecName "generic-service1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.018937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-inventory" (OuterVolumeSpecName: "inventory") pod "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" (UID: "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.019465 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" (UID: "0e37f6b4-b29a-4ca8-a722-21c2d3b53d07"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.105288 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfbpd\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-kube-api-access-rfbpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.105310 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.105320 4707 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-install-certs-ovr-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.105331 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-openstack-edpm-tls-generic-service1-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.105340 4707 reconciler_common.go:293] "Volume detached for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-generic-service1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.105350 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.656132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" event={"ID":"0e37f6b4-b29a-4ca8-a722-21c2d3b53d07","Type":"ContainerDied","Data":"883987b1b6dd31d2783e3bc362e518e8d8c0b831f7d9dc4938bb529beb943955"} Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.656166 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="883987b1b6dd31d2783e3bc362e518e8d8c0b831f7d9dc4938bb529beb943955" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.656221 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.717625 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt"] Jan 21 16:24:49 crc kubenswrapper[4707]: E0121 16:24:49.718214 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" containerName="install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.718235 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" containerName="install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.718554 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" containerName="install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.719226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.726915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.727412 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.727936 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.729432 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.729589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-xkp2k" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.738520 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt"] Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.814626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-ssh-key-openstack-edpm-tls\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.814689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-inventory\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.814898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-generic-service1-combined-ca-bundle\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.814953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbt9\" (UniqueName: \"kubernetes.io/projected/ab195437-8084-4e81-9804-c83408c01246-kube-api-access-6nbt9\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.915858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-ssh-key-openstack-edpm-tls\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.916111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-inventory\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.916216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-generic-service1-combined-ca-bundle\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.916247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbt9\" (UniqueName: \"kubernetes.io/projected/ab195437-8084-4e81-9804-c83408c01246-kube-api-access-6nbt9\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.918743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-ssh-key-openstack-edpm-tls\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.918968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-generic-service1-combined-ca-bundle\") pod 
\"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.921298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-inventory\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:49 crc kubenswrapper[4707]: I0121 16:24:49.929629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbt9\" (UniqueName: \"kubernetes.io/projected/ab195437-8084-4e81-9804-c83408c01246-kube-api-access-6nbt9\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:50 crc kubenswrapper[4707]: I0121 16:24:50.034555 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:50 crc kubenswrapper[4707]: I0121 16:24:50.385973 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt"] Jan 21 16:24:50 crc kubenswrapper[4707]: I0121 16:24:50.392803 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:24:50 crc kubenswrapper[4707]: I0121 16:24:50.663457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" event={"ID":"ab195437-8084-4e81-9804-c83408c01246","Type":"ContainerStarted","Data":"b4a4114f83b71f57bdddf77f1f3f27ac72d7ecfe6877edb3f58125db4f7c5c88"} Jan 21 16:24:51 crc kubenswrapper[4707]: I0121 16:24:51.674916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" event={"ID":"ab195437-8084-4e81-9804-c83408c01246","Type":"ContainerStarted","Data":"53aaf8a6395bb41b259ede2dd51dc8f146aa18c7a6d7b0f4487dfe7bd4502628"} Jan 21 16:24:53 crc kubenswrapper[4707]: I0121 16:24:53.687605 4707 generic.go:334] "Generic (PLEG): container finished" podID="ab195437-8084-4e81-9804-c83408c01246" containerID="53aaf8a6395bb41b259ede2dd51dc8f146aa18c7a6d7b0f4487dfe7bd4502628" exitCode=0 Jan 21 16:24:53 crc kubenswrapper[4707]: I0121 16:24:53.687673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" event={"ID":"ab195437-8084-4e81-9804-c83408c01246","Type":"ContainerDied","Data":"53aaf8a6395bb41b259ede2dd51dc8f146aa18c7a6d7b0f4487dfe7bd4502628"} Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.945713 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.974178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-ssh-key-openstack-edpm-tls\") pod \"ab195437-8084-4e81-9804-c83408c01246\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.974216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-generic-service1-combined-ca-bundle\") pod \"ab195437-8084-4e81-9804-c83408c01246\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.974273 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-inventory\") pod \"ab195437-8084-4e81-9804-c83408c01246\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.974312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbt9\" (UniqueName: \"kubernetes.io/projected/ab195437-8084-4e81-9804-c83408c01246-kube-api-access-6nbt9\") pod \"ab195437-8084-4e81-9804-c83408c01246\" (UID: \"ab195437-8084-4e81-9804-c83408c01246\") " Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.978825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-generic-service1-combined-ca-bundle" (OuterVolumeSpecName: "generic-service1-combined-ca-bundle") pod "ab195437-8084-4e81-9804-c83408c01246" (UID: "ab195437-8084-4e81-9804-c83408c01246"). InnerVolumeSpecName "generic-service1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.979906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab195437-8084-4e81-9804-c83408c01246-kube-api-access-6nbt9" (OuterVolumeSpecName: "kube-api-access-6nbt9") pod "ab195437-8084-4e81-9804-c83408c01246" (UID: "ab195437-8084-4e81-9804-c83408c01246"). InnerVolumeSpecName "kube-api-access-6nbt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.990875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-inventory" (OuterVolumeSpecName: "inventory") pod "ab195437-8084-4e81-9804-c83408c01246" (UID: "ab195437-8084-4e81-9804-c83408c01246"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:54 crc kubenswrapper[4707]: I0121 16:24:54.992430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "ab195437-8084-4e81-9804-c83408c01246" (UID: "ab195437-8084-4e81-9804-c83408c01246"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.075890 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.075917 4707 reconciler_common.go:293] "Volume detached for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-generic-service1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.075930 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab195437-8084-4e81-9804-c83408c01246-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.075940 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nbt9\" (UniqueName: \"kubernetes.io/projected/ab195437-8084-4e81-9804-c83408c01246-kube-api-access-6nbt9\") on node \"crc\" DevicePath \"\"" Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.704594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" event={"ID":"ab195437-8084-4e81-9804-c83408c01246","Type":"ContainerDied","Data":"b4a4114f83b71f57bdddf77f1f3f27ac72d7ecfe6877edb3f58125db4f7c5c88"} Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.704784 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4a4114f83b71f57bdddf77f1f3f27ac72d7ecfe6877edb3f58125db4f7c5c88" Jan 21 16:24:55 crc kubenswrapper[4707]: I0121 16:24:55.704644 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.344204 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz"] Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.348624 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-sjcgz"] Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.353312 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt"] Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.357538 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-sf8xt"] Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.399164 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4"] Jan 21 16:24:56 crc kubenswrapper[4707]: E0121 16:24:56.399400 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab195437-8084-4e81-9804-c83408c01246" containerName="generic-service1-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.399419 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab195437-8084-4e81-9804-c83408c01246" containerName="generic-service1-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.399548 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab195437-8084-4e81-9804-c83408c01246" containerName="generic-service1-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.400179 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.407066 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4"] Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.495269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6g26\" (UniqueName: \"kubernetes.io/projected/5c44b241-6292-4934-98a3-c873f61e0a5c-kube-api-access-j6g26\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.495552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.495584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.596159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6g26\" (UniqueName: \"kubernetes.io/projected/5c44b241-6292-4934-98a3-c873f61e0a5c-kube-api-access-j6g26\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.596231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.596260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.597072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.597102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 
16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.610268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6g26\" (UniqueName: \"kubernetes.io/projected/5c44b241-6292-4934-98a3-c873f61e0a5c-kube-api-access-j6g26\") pod \"dnsmasq-dnsmasq-84b9f45d47-2nnd4\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:56 crc kubenswrapper[4707]: I0121 16:24:56.712324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:24:57 crc kubenswrapper[4707]: I0121 16:24:57.092689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4"] Jan 21 16:24:57 crc kubenswrapper[4707]: I0121 16:24:57.189702 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e37f6b4-b29a-4ca8-a722-21c2d3b53d07" path="/var/lib/kubelet/pods/0e37f6b4-b29a-4ca8-a722-21c2d3b53d07/volumes" Jan 21 16:24:57 crc kubenswrapper[4707]: I0121 16:24:57.191129 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab195437-8084-4e81-9804-c83408c01246" path="/var/lib/kubelet/pods/ab195437-8084-4e81-9804-c83408c01246/volumes" Jan 21 16:24:57 crc kubenswrapper[4707]: I0121 16:24:57.716801 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerID="6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457" exitCode=0 Jan 21 16:24:57 crc kubenswrapper[4707]: I0121 16:24:57.716909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" event={"ID":"5c44b241-6292-4934-98a3-c873f61e0a5c","Type":"ContainerDied","Data":"6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457"} Jan 21 16:24:57 crc kubenswrapper[4707]: I0121 16:24:57.717035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" event={"ID":"5c44b241-6292-4934-98a3-c873f61e0a5c","Type":"ContainerStarted","Data":"336f0c2d4435cf6aa275a0e21bbbae3237ba88054639ee497af4130423573c33"} Jan 21 16:24:58 crc kubenswrapper[4707]: I0121 16:24:58.734111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" event={"ID":"5c44b241-6292-4934-98a3-c873f61e0a5c","Type":"ContainerStarted","Data":"433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4"} Jan 21 16:24:58 crc kubenswrapper[4707]: I0121 16:24:58.751350 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" podStartSLOduration=2.751334774 podStartE2EDuration="2.751334774s" podCreationTimestamp="2026-01-21 16:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:24:58.747933529 +0000 UTC m=+4995.929449750" watchObservedRunningTime="2026-01-21 16:24:58.751334774 +0000 UTC m=+4995.932850996" Jan 21 16:24:59 crc kubenswrapper[4707]: I0121 16:24:59.740118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.791676 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-4llql"] Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.795938 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-4llql"] Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.907500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-gl6hz"] Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.908259 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.909782 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.910370 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.911135 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.913657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:25:02 crc kubenswrapper[4707]: I0121 16:25:02.915552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gl6hz"] Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.074062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c65cccde-36b3-4b33-a605-9d83a1bc5132-node-mnt\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.074126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c65cccde-36b3-4b33-a605-9d83a1bc5132-crc-storage\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.074516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fbxp\" (UniqueName: \"kubernetes.io/projected/c65cccde-36b3-4b33-a605-9d83a1bc5132-kube-api-access-7fbxp\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.176226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fbxp\" (UniqueName: \"kubernetes.io/projected/c65cccde-36b3-4b33-a605-9d83a1bc5132-kube-api-access-7fbxp\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.176281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c65cccde-36b3-4b33-a605-9d83a1bc5132-node-mnt\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.176308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c65cccde-36b3-4b33-a605-9d83a1bc5132-crc-storage\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " 
pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.176582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c65cccde-36b3-4b33-a605-9d83a1bc5132-node-mnt\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.176908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c65cccde-36b3-4b33-a605-9d83a1bc5132-crc-storage\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.189691 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defca8e3-7423-4aea-84b6-0054fa424c75" path="/var/lib/kubelet/pods/defca8e3-7423-4aea-84b6-0054fa424c75/volumes" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.193389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fbxp\" (UniqueName: \"kubernetes.io/projected/c65cccde-36b3-4b33-a605-9d83a1bc5132-kube-api-access-7fbxp\") pod \"crc-storage-crc-gl6hz\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.222287 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.585846 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gl6hz"] Jan 21 16:25:03 crc kubenswrapper[4707]: I0121 16:25:03.763593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gl6hz" event={"ID":"c65cccde-36b3-4b33-a605-9d83a1bc5132","Type":"ContainerStarted","Data":"e39c5035cc812581dd3a4b9d9ef6dd1e3e43450ef61f4ab8e0e6b75b77e0f872"} Jan 21 16:25:04 crc kubenswrapper[4707]: I0121 16:25:04.770902 4707 generic.go:334] "Generic (PLEG): container finished" podID="c65cccde-36b3-4b33-a605-9d83a1bc5132" containerID="49a7623192f683015e2ec84de1db7c9ea233bd6380eb858a6be5d853a2b2e8b4" exitCode=0 Jan 21 16:25:04 crc kubenswrapper[4707]: I0121 16:25:04.771002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gl6hz" event={"ID":"c65cccde-36b3-4b33-a605-9d83a1bc5132","Type":"ContainerDied","Data":"49a7623192f683015e2ec84de1db7c9ea233bd6380eb858a6be5d853a2b2e8b4"} Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.001099 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.015665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fbxp\" (UniqueName: \"kubernetes.io/projected/c65cccde-36b3-4b33-a605-9d83a1bc5132-kube-api-access-7fbxp\") pod \"c65cccde-36b3-4b33-a605-9d83a1bc5132\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.015727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c65cccde-36b3-4b33-a605-9d83a1bc5132-crc-storage\") pod \"c65cccde-36b3-4b33-a605-9d83a1bc5132\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.015747 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c65cccde-36b3-4b33-a605-9d83a1bc5132-node-mnt\") pod \"c65cccde-36b3-4b33-a605-9d83a1bc5132\" (UID: \"c65cccde-36b3-4b33-a605-9d83a1bc5132\") " Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.016138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c65cccde-36b3-4b33-a605-9d83a1bc5132-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c65cccde-36b3-4b33-a605-9d83a1bc5132" (UID: "c65cccde-36b3-4b33-a605-9d83a1bc5132"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.028194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65cccde-36b3-4b33-a605-9d83a1bc5132-kube-api-access-7fbxp" (OuterVolumeSpecName: "kube-api-access-7fbxp") pod "c65cccde-36b3-4b33-a605-9d83a1bc5132" (UID: "c65cccde-36b3-4b33-a605-9d83a1bc5132"). InnerVolumeSpecName "kube-api-access-7fbxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.030534 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65cccde-36b3-4b33-a605-9d83a1bc5132-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c65cccde-36b3-4b33-a605-9d83a1bc5132" (UID: "c65cccde-36b3-4b33-a605-9d83a1bc5132"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.117001 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fbxp\" (UniqueName: \"kubernetes.io/projected/c65cccde-36b3-4b33-a605-9d83a1bc5132-kube-api-access-7fbxp\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.117027 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c65cccde-36b3-4b33-a605-9d83a1bc5132-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.117037 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c65cccde-36b3-4b33-a605-9d83a1bc5132-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.713970 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.753135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj"] Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.753332 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" containerName="dnsmasq-dns" containerID="cri-o://3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19" gracePeriod=10 Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.797586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gl6hz" event={"ID":"c65cccde-36b3-4b33-a605-9d83a1bc5132","Type":"ContainerDied","Data":"e39c5035cc812581dd3a4b9d9ef6dd1e3e43450ef61f4ab8e0e6b75b77e0f872"} Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.797621 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e39c5035cc812581dd3a4b9d9ef6dd1e3e43450ef61f4ab8e0e6b75b77e0f872" Jan 21 16:25:06 crc kubenswrapper[4707]: I0121 16:25:06.797626 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gl6hz" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.108245 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.231929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-openstack-edpm-tls\") pod \"f2208347-c972-41e8-94b3-bf07a03b7537\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.232032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdw8g\" (UniqueName: \"kubernetes.io/projected/f2208347-c972-41e8-94b3-bf07a03b7537-kube-api-access-qdw8g\") pod \"f2208347-c972-41e8-94b3-bf07a03b7537\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.232063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-config\") pod \"f2208347-c972-41e8-94b3-bf07a03b7537\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.232097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-dnsmasq-svc\") pod \"f2208347-c972-41e8-94b3-bf07a03b7537\" (UID: \"f2208347-c972-41e8-94b3-bf07a03b7537\") " Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.245925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2208347-c972-41e8-94b3-bf07a03b7537-kube-api-access-qdw8g" (OuterVolumeSpecName: "kube-api-access-qdw8g") pod "f2208347-c972-41e8-94b3-bf07a03b7537" (UID: "f2208347-c972-41e8-94b3-bf07a03b7537"). InnerVolumeSpecName "kube-api-access-qdw8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.261284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f2208347-c972-41e8-94b3-bf07a03b7537" (UID: "f2208347-c972-41e8-94b3-bf07a03b7537"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.262435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "f2208347-c972-41e8-94b3-bf07a03b7537" (UID: "f2208347-c972-41e8-94b3-bf07a03b7537"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.265015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-config" (OuterVolumeSpecName: "config") pod "f2208347-c972-41e8-94b3-bf07a03b7537" (UID: "f2208347-c972-41e8-94b3-bf07a03b7537"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.335180 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdw8g\" (UniqueName: \"kubernetes.io/projected/f2208347-c972-41e8-94b3-bf07a03b7537-kube-api-access-qdw8g\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.335209 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.335220 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.335230 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f2208347-c972-41e8-94b3-bf07a03b7537-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.805624 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2208347-c972-41e8-94b3-bf07a03b7537" containerID="3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19" exitCode=0 Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.805716 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.805731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" event={"ID":"f2208347-c972-41e8-94b3-bf07a03b7537","Type":"ContainerDied","Data":"3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19"} Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.806044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj" event={"ID":"f2208347-c972-41e8-94b3-bf07a03b7537","Type":"ContainerDied","Data":"51228644d8b4574e037596c54277b08a93f066c83ed8b2af69647bb3d1c133b7"} Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.806068 4707 scope.go:117] "RemoveContainer" containerID="3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.826798 4707 scope.go:117] "RemoveContainer" containerID="93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.826943 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj"] Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.830783 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-dnrrj"] Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.840371 4707 scope.go:117] "RemoveContainer" containerID="3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19" Jan 21 16:25:07 crc kubenswrapper[4707]: E0121 16:25:07.840736 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19\": container with ID starting with 3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19 not found: ID does not exist" 
containerID="3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.840770 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19"} err="failed to get container status \"3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19\": rpc error: code = NotFound desc = could not find container \"3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19\": container with ID starting with 3ed88394bd0beaf24687e2e0efec3f714249dcb2d1b1f874932b08d9bc2f7e19 not found: ID does not exist" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.840790 4707 scope.go:117] "RemoveContainer" containerID="93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be" Jan 21 16:25:07 crc kubenswrapper[4707]: E0121 16:25:07.841104 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be\": container with ID starting with 93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be not found: ID does not exist" containerID="93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be" Jan 21 16:25:07 crc kubenswrapper[4707]: I0121 16:25:07.841129 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be"} err="failed to get container status \"93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be\": rpc error: code = NotFound desc = could not find container \"93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be\": container with ID starting with 93c3e59a44b181e538efce47677b93077a930ecfe6cc8d218393d1215c0d00be not found: ID does not exist" Jan 21 16:25:08 crc kubenswrapper[4707]: I0121 16:25:08.885152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-gl6hz"] Jan 21 16:25:08 crc kubenswrapper[4707]: I0121 16:25:08.888580 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-gl6hz"] Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.004454 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-v7l5q"] Jan 21 16:25:09 crc kubenswrapper[4707]: E0121 16:25:09.004707 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" containerName="init" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.004726 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" containerName="init" Jan 21 16:25:09 crc kubenswrapper[4707]: E0121 16:25:09.004739 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65cccde-36b3-4b33-a605-9d83a1bc5132" containerName="storage" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.004745 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65cccde-36b3-4b33-a605-9d83a1bc5132" containerName="storage" Jan 21 16:25:09 crc kubenswrapper[4707]: E0121 16:25:09.004759 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" containerName="dnsmasq-dns" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.004764 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" containerName="dnsmasq-dns" Jan 21 16:25:09 crc 
kubenswrapper[4707]: I0121 16:25:09.004914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" containerName="dnsmasq-dns" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.004928 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65cccde-36b3-4b33-a605-9d83a1bc5132" containerName="storage" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.005343 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.010101 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.010145 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.010180 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.010327 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.013093 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-v7l5q"] Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.058040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnrbw\" (UniqueName: \"kubernetes.io/projected/c43373b6-f9de-488e-8687-b21b98d5fb36-kube-api-access-pnrbw\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.058097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c43373b6-f9de-488e-8687-b21b98d5fb36-node-mnt\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.058149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c43373b6-f9de-488e-8687-b21b98d5fb36-crc-storage\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.159906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnrbw\" (UniqueName: \"kubernetes.io/projected/c43373b6-f9de-488e-8687-b21b98d5fb36-kube-api-access-pnrbw\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.159964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c43373b6-f9de-488e-8687-b21b98d5fb36-node-mnt\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.160048 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/c43373b6-f9de-488e-8687-b21b98d5fb36-crc-storage\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.160351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c43373b6-f9de-488e-8687-b21b98d5fb36-node-mnt\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.160843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c43373b6-f9de-488e-8687-b21b98d5fb36-crc-storage\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.175713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnrbw\" (UniqueName: \"kubernetes.io/projected/c43373b6-f9de-488e-8687-b21b98d5fb36-kube-api-access-pnrbw\") pod \"crc-storage-crc-v7l5q\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.191395 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65cccde-36b3-4b33-a605-9d83a1bc5132" path="/var/lib/kubelet/pods/c65cccde-36b3-4b33-a605-9d83a1bc5132/volumes" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.192149 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2208347-c972-41e8-94b3-bf07a03b7537" path="/var/lib/kubelet/pods/f2208347-c972-41e8-94b3-bf07a03b7537/volumes" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.320044 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.682580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-v7l5q"] Jan 21 16:25:09 crc kubenswrapper[4707]: W0121 16:25:09.684105 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc43373b6_f9de_488e_8687_b21b98d5fb36.slice/crio-3d151f29ca9403a959b3973598809dbc4083fc8ff1ec6fe73fc284627f566c72 WatchSource:0}: Error finding container 3d151f29ca9403a959b3973598809dbc4083fc8ff1ec6fe73fc284627f566c72: Status 404 returned error can't find the container with id 3d151f29ca9403a959b3973598809dbc4083fc8ff1ec6fe73fc284627f566c72 Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.826573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-v7l5q" event={"ID":"c43373b6-f9de-488e-8687-b21b98d5fb36","Type":"ContainerStarted","Data":"3d151f29ca9403a959b3973598809dbc4083fc8ff1ec6fe73fc284627f566c72"} Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.946188 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:25:09 crc kubenswrapper[4707]: I0121 16:25:09.946559 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:25:10 crc kubenswrapper[4707]: I0121 16:25:10.834703 4707 generic.go:334] "Generic (PLEG): container finished" podID="c43373b6-f9de-488e-8687-b21b98d5fb36" containerID="0a9468204faf53ac2c4f64c4e809127e9513b05b8d233daada636d2e0976000a" exitCode=0 Jan 21 16:25:10 crc kubenswrapper[4707]: I0121 16:25:10.834861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-v7l5q" event={"ID":"c43373b6-f9de-488e-8687-b21b98d5fb36","Type":"ContainerDied","Data":"0a9468204faf53ac2c4f64c4e809127e9513b05b8d233daada636d2e0976000a"} Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.061368 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.204876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnrbw\" (UniqueName: \"kubernetes.io/projected/c43373b6-f9de-488e-8687-b21b98d5fb36-kube-api-access-pnrbw\") pod \"c43373b6-f9de-488e-8687-b21b98d5fb36\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.205423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c43373b6-f9de-488e-8687-b21b98d5fb36-node-mnt\") pod \"c43373b6-f9de-488e-8687-b21b98d5fb36\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.205478 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c43373b6-f9de-488e-8687-b21b98d5fb36-crc-storage\") pod \"c43373b6-f9de-488e-8687-b21b98d5fb36\" (UID: \"c43373b6-f9de-488e-8687-b21b98d5fb36\") " Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.205543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c43373b6-f9de-488e-8687-b21b98d5fb36-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c43373b6-f9de-488e-8687-b21b98d5fb36" (UID: "c43373b6-f9de-488e-8687-b21b98d5fb36"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.206233 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c43373b6-f9de-488e-8687-b21b98d5fb36-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.210394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c43373b6-f9de-488e-8687-b21b98d5fb36-kube-api-access-pnrbw" (OuterVolumeSpecName: "kube-api-access-pnrbw") pod "c43373b6-f9de-488e-8687-b21b98d5fb36" (UID: "c43373b6-f9de-488e-8687-b21b98d5fb36"). InnerVolumeSpecName "kube-api-access-pnrbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.221544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c43373b6-f9de-488e-8687-b21b98d5fb36-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c43373b6-f9de-488e-8687-b21b98d5fb36" (UID: "c43373b6-f9de-488e-8687-b21b98d5fb36"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.308109 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnrbw\" (UniqueName: \"kubernetes.io/projected/c43373b6-f9de-488e-8687-b21b98d5fb36-kube-api-access-pnrbw\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.308275 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c43373b6-f9de-488e-8687-b21b98d5fb36-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.856636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-v7l5q" event={"ID":"c43373b6-f9de-488e-8687-b21b98d5fb36","Type":"ContainerDied","Data":"3d151f29ca9403a959b3973598809dbc4083fc8ff1ec6fe73fc284627f566c72"} Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.856695 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d151f29ca9403a959b3973598809dbc4083fc8ff1ec6fe73fc284627f566c72" Jan 21 16:25:12 crc kubenswrapper[4707]: I0121 16:25:12.856723 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-v7l5q" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.222889 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc"] Jan 21 16:25:15 crc kubenswrapper[4707]: E0121 16:25:15.223617 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43373b6-f9de-488e-8687-b21b98d5fb36" containerName="storage" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.223629 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43373b6-f9de-488e-8687-b21b98d5fb36" containerName="storage" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.223766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c43373b6-f9de-488e-8687-b21b98d5fb36" containerName="storage" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.224470 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.226106 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.232684 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc"] Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.254424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.254644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sfp\" (UniqueName: \"kubernetes.io/projected/74607501-37d2-44d7-91f2-b51ae5f157ba-kube-api-access-v4sfp\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.254737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.254769 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-config\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.355386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sfp\" (UniqueName: \"kubernetes.io/projected/74607501-37d2-44d7-91f2-b51ae5f157ba-kube-api-access-v4sfp\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.355564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.355649 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-config\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.355748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.356403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.356411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-config\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.356659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.373849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sfp\" (UniqueName: \"kubernetes.io/projected/74607501-37d2-44d7-91f2-b51ae5f157ba-kube-api-access-v4sfp\") pod \"dnsmasq-dnsmasq-64864b6d57-mfnlc\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.538112 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:15 crc kubenswrapper[4707]: I0121 16:25:15.891613 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc"] Jan 21 16:25:16 crc kubenswrapper[4707]: I0121 16:25:16.880946 4707 generic.go:334] "Generic (PLEG): container finished" podID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerID="952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3" exitCode=0 Jan 21 16:25:16 crc kubenswrapper[4707]: I0121 16:25:16.881317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" event={"ID":"74607501-37d2-44d7-91f2-b51ae5f157ba","Type":"ContainerDied","Data":"952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3"} Jan 21 16:25:16 crc kubenswrapper[4707]: I0121 16:25:16.881420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" event={"ID":"74607501-37d2-44d7-91f2-b51ae5f157ba","Type":"ContainerStarted","Data":"0d4961a07a038649298c0f9e9aca3ae85c56526a7f94e10546a42b48bfed6129"} Jan 21 16:25:17 crc kubenswrapper[4707]: I0121 16:25:17.888877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" event={"ID":"74607501-37d2-44d7-91f2-b51ae5f157ba","Type":"ContainerStarted","Data":"bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b"} Jan 21 16:25:17 crc kubenswrapper[4707]: I0121 16:25:17.889235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:17 crc kubenswrapper[4707]: I0121 16:25:17.907508 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" podStartSLOduration=2.907495037 podStartE2EDuration="2.907495037s" podCreationTimestamp="2026-01-21 16:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:25:17.90353469 +0000 UTC m=+5015.085050912" watchObservedRunningTime="2026-01-21 16:25:17.907495037 +0000 UTC m=+5015.089011259" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.539972 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.581005 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4"] Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.581456 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerName="dnsmasq-dns" containerID="cri-o://433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4" gracePeriod=10 Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.924426 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.945352 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.945374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" event={"ID":"5c44b241-6292-4934-98a3-c873f61e0a5c","Type":"ContainerDied","Data":"433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4"} Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.945415 4707 scope.go:117] "RemoveContainer" containerID="433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.945288 4707 generic.go:334] "Generic (PLEG): container finished" podID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerID="433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4" exitCode=0 Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.945476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4" event={"ID":"5c44b241-6292-4934-98a3-c873f61e0a5c","Type":"ContainerDied","Data":"336f0c2d4435cf6aa275a0e21bbbae3237ba88054639ee497af4130423573c33"} Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.963538 4707 scope.go:117] "RemoveContainer" containerID="6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.980236 4707 scope.go:117] "RemoveContainer" containerID="433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4" Jan 21 16:25:25 crc kubenswrapper[4707]: E0121 16:25:25.980604 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4\": container with ID starting with 433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4 not found: ID does not exist" containerID="433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.980639 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4"} err="failed to get container status \"433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4\": rpc error: code = NotFound desc = could not find container \"433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4\": container with ID starting with 433a481d375794c9125d6ba438b75922c0cee57882fff95c666bcbd2784449e4 not found: ID does not exist" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.980661 4707 scope.go:117] "RemoveContainer" containerID="6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457" Jan 21 16:25:25 crc kubenswrapper[4707]: E0121 16:25:25.981090 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457\": container with ID starting with 6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457 not found: ID does not exist" containerID="6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457" Jan 21 16:25:25 crc kubenswrapper[4707]: I0121 16:25:25.981119 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457"} err="failed to get container status 
\"6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457\": rpc error: code = NotFound desc = could not find container \"6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457\": container with ID starting with 6154cd08a85b55591e35429b56bacc8239dee86b3b1f2b8c8d63f09911ed5457 not found: ID does not exist" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.099691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-dnsmasq-svc\") pod \"5c44b241-6292-4934-98a3-c873f61e0a5c\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.099777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-config\") pod \"5c44b241-6292-4934-98a3-c873f61e0a5c\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.099852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6g26\" (UniqueName: \"kubernetes.io/projected/5c44b241-6292-4934-98a3-c873f61e0a5c-kube-api-access-j6g26\") pod \"5c44b241-6292-4934-98a3-c873f61e0a5c\" (UID: \"5c44b241-6292-4934-98a3-c873f61e0a5c\") " Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.104378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c44b241-6292-4934-98a3-c873f61e0a5c-kube-api-access-j6g26" (OuterVolumeSpecName: "kube-api-access-j6g26") pod "5c44b241-6292-4934-98a3-c873f61e0a5c" (UID: "5c44b241-6292-4934-98a3-c873f61e0a5c"). InnerVolumeSpecName "kube-api-access-j6g26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.123943 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-config" (OuterVolumeSpecName: "config") pod "5c44b241-6292-4934-98a3-c873f61e0a5c" (UID: "5c44b241-6292-4934-98a3-c873f61e0a5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.124583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "5c44b241-6292-4934-98a3-c873f61e0a5c" (UID: "5c44b241-6292-4934-98a3-c873f61e0a5c"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.202408 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.202438 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6g26\" (UniqueName: \"kubernetes.io/projected/5c44b241-6292-4934-98a3-c873f61e0a5c-kube-api-access-j6g26\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.202448 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c44b241-6292-4934-98a3-c873f61e0a5c-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.250383 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs"] Jan 21 16:25:26 crc kubenswrapper[4707]: E0121 16:25:26.250658 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerName="dnsmasq-dns" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.250677 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerName="dnsmasq-dns" Jan 21 16:25:26 crc kubenswrapper[4707]: E0121 16:25:26.250695 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerName="init" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.250701 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerName="init" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.250852 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" containerName="dnsmasq-dns" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.251303 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.254481 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.258927 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs"] Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.260214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.260213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.260222 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.281790 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4"] Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.286240 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-2nnd4"] Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.303470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctfj\" (UniqueName: \"kubernetes.io/projected/34ca9ae8-28f0-476b-beb3-6374833dc9f4-kube-api-access-hctfj\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.303542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-inventory\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.303567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.404324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hctfj\" (UniqueName: \"kubernetes.io/projected/34ca9ae8-28f0-476b-beb3-6374833dc9f4-kube-api-access-hctfj\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.404443 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-inventory\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.404475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.407527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.407892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-inventory\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.416843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctfj\" (UniqueName: \"kubernetes.io/projected/34ca9ae8-28f0-476b-beb3-6374833dc9f4-kube-api-access-hctfj\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.564092 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.915056 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs"] Jan 21 16:25:26 crc kubenswrapper[4707]: I0121 16:25:26.953514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" event={"ID":"34ca9ae8-28f0-476b-beb3-6374833dc9f4","Type":"ContainerStarted","Data":"9cf8af625dada61e7c7e4d8ec8de0e213750ac5dc785cdb53528998c61c7a9d7"} Jan 21 16:25:27 crc kubenswrapper[4707]: I0121 16:25:27.189568 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c44b241-6292-4934-98a3-c873f61e0a5c" path="/var/lib/kubelet/pods/5c44b241-6292-4934-98a3-c873f61e0a5c/volumes" Jan 21 16:25:27 crc kubenswrapper[4707]: I0121 16:25:27.965967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" event={"ID":"34ca9ae8-28f0-476b-beb3-6374833dc9f4","Type":"ContainerStarted","Data":"585e6b9ffe1b24b0c43533ae3debc229ffc8b9d6a01d472b73d61d8828da623c"} Jan 21 16:25:27 crc kubenswrapper[4707]: I0121 16:25:27.978538 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" podStartSLOduration=1.455509341 podStartE2EDuration="1.978518172s" podCreationTimestamp="2026-01-21 16:25:26 +0000 UTC" firstStartedPulling="2026-01-21 16:25:26.920049187 +0000 UTC m=+5024.101565409" lastFinishedPulling="2026-01-21 16:25:27.443058019 +0000 UTC m=+5024.624574240" observedRunningTime="2026-01-21 16:25:27.977743786 +0000 UTC m=+5025.159260007" watchObservedRunningTime="2026-01-21 16:25:27.978518172 +0000 UTC m=+5025.160034394" Jan 21 16:25:28 crc kubenswrapper[4707]: I0121 16:25:28.973150 4707 generic.go:334] "Generic (PLEG): container finished" podID="34ca9ae8-28f0-476b-beb3-6374833dc9f4" containerID="585e6b9ffe1b24b0c43533ae3debc229ffc8b9d6a01d472b73d61d8828da623c" exitCode=0 Jan 21 16:25:28 crc kubenswrapper[4707]: I0121 16:25:28.973198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" event={"ID":"34ca9ae8-28f0-476b-beb3-6374833dc9f4","Type":"ContainerDied","Data":"585e6b9ffe1b24b0c43533ae3debc229ffc8b9d6a01d472b73d61d8828da623c"} Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.196922 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.252694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hctfj\" (UniqueName: \"kubernetes.io/projected/34ca9ae8-28f0-476b-beb3-6374833dc9f4-kube-api-access-hctfj\") pod \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.252750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-inventory\") pod \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.252835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-ssh-key-edpm-compute-no-nodes\") pod \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\" (UID: \"34ca9ae8-28f0-476b-beb3-6374833dc9f4\") " Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.258037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ca9ae8-28f0-476b-beb3-6374833dc9f4-kube-api-access-hctfj" (OuterVolumeSpecName: "kube-api-access-hctfj") pod "34ca9ae8-28f0-476b-beb3-6374833dc9f4" (UID: "34ca9ae8-28f0-476b-beb3-6374833dc9f4"). InnerVolumeSpecName "kube-api-access-hctfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.269925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "34ca9ae8-28f0-476b-beb3-6374833dc9f4" (UID: "34ca9ae8-28f0-476b-beb3-6374833dc9f4"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.271160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-inventory" (OuterVolumeSpecName: "inventory") pod "34ca9ae8-28f0-476b-beb3-6374833dc9f4" (UID: "34ca9ae8-28f0-476b-beb3-6374833dc9f4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.354002 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.354049 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hctfj\" (UniqueName: \"kubernetes.io/projected/34ca9ae8-28f0-476b-beb3-6374833dc9f4-kube-api-access-hctfj\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.354059 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ca9ae8-28f0-476b-beb3-6374833dc9f4-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.991375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" event={"ID":"34ca9ae8-28f0-476b-beb3-6374833dc9f4","Type":"ContainerDied","Data":"9cf8af625dada61e7c7e4d8ec8de0e213750ac5dc785cdb53528998c61c7a9d7"} Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.991423 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cf8af625dada61e7c7e4d8ec8de0e213750ac5dc785cdb53528998c61c7a9d7" Jan 21 16:25:30 crc kubenswrapper[4707]: I0121 16:25:30.991433 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.031312 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45"] Jan 21 16:25:31 crc kubenswrapper[4707]: E0121 16:25:31.031585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ca9ae8-28f0-476b-beb3-6374833dc9f4" containerName="download-cache-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.031605 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ca9ae8-28f0-476b-beb3-6374833dc9f4" containerName="download-cache-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.031757 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ca9ae8-28f0-476b-beb3-6374833dc9f4" containerName="download-cache-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.032258 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.034915 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.034943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.035991 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.036048 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.036176 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.037626 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45"] Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.075504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.075608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-inventory\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.075643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42nwf\" (UniqueName: \"kubernetes.io/projected/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-kube-api-access-42nwf\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.075664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.177155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.177294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-inventory\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.177334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42nwf\" (UniqueName: \"kubernetes.io/projected/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-kube-api-access-42nwf\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.177356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.180433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-inventory\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.180435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.180995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.191517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42nwf\" (UniqueName: \"kubernetes.io/projected/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-kube-api-access-42nwf\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.349211 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:31 crc kubenswrapper[4707]: I0121 16:25:31.704917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45"] Jan 21 16:25:31 crc kubenswrapper[4707]: W0121 16:25:31.708931 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cdb5ec5_58a6_435d_864d_313a4a1d5ad0.slice/crio-5cdbdea3baba743bf2dc09b848a4633a005ea992fc616c8e9be88023ded7dd64 WatchSource:0}: Error finding container 5cdbdea3baba743bf2dc09b848a4633a005ea992fc616c8e9be88023ded7dd64: Status 404 returned error can't find the container with id 5cdbdea3baba743bf2dc09b848a4633a005ea992fc616c8e9be88023ded7dd64 Jan 21 16:25:32 crc kubenswrapper[4707]: I0121 16:25:32.006696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" event={"ID":"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0","Type":"ContainerStarted","Data":"5cdbdea3baba743bf2dc09b848a4633a005ea992fc616c8e9be88023ded7dd64"} Jan 21 16:25:33 crc kubenswrapper[4707]: I0121 16:25:33.013948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" event={"ID":"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0","Type":"ContainerStarted","Data":"222cf4dc5b0d2d5f5652901b414cfaa494c6a5754535cc56de1e42e244e378e6"} Jan 21 16:25:33 crc kubenswrapper[4707]: I0121 16:25:33.027997 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" podStartSLOduration=1.506079387 podStartE2EDuration="2.027983231s" podCreationTimestamp="2026-01-21 16:25:31 +0000 UTC" firstStartedPulling="2026-01-21 16:25:31.710721128 +0000 UTC m=+5028.892237350" lastFinishedPulling="2026-01-21 16:25:32.232624971 +0000 UTC m=+5029.414141194" observedRunningTime="2026-01-21 16:25:33.025432184 +0000 UTC m=+5030.206948407" watchObservedRunningTime="2026-01-21 16:25:33.027983231 +0000 UTC m=+5030.209499453" Jan 21 16:25:34 crc kubenswrapper[4707]: I0121 16:25:34.020844 4707 generic.go:334] "Generic (PLEG): container finished" podID="1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" containerID="222cf4dc5b0d2d5f5652901b414cfaa494c6a5754535cc56de1e42e244e378e6" exitCode=0 Jan 21 16:25:34 crc kubenswrapper[4707]: I0121 16:25:34.020883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" event={"ID":"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0","Type":"ContainerDied","Data":"222cf4dc5b0d2d5f5652901b414cfaa494c6a5754535cc56de1e42e244e378e6"} Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.240280 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.432590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-ssh-key-edpm-compute-no-nodes\") pod \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.432633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-inventory\") pod \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.432665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42nwf\" (UniqueName: \"kubernetes.io/projected/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-kube-api-access-42nwf\") pod \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.432760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-bootstrap-combined-ca-bundle\") pod \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\" (UID: \"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0\") " Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.437414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-kube-api-access-42nwf" (OuterVolumeSpecName: "kube-api-access-42nwf") pod "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" (UID: "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0"). InnerVolumeSpecName "kube-api-access-42nwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.437776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" (UID: "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.447775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-inventory" (OuterVolumeSpecName: "inventory") pod "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" (UID: "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.452493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" (UID: "1cdb5ec5-58a6-435d-864d-313a4a1d5ad0"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.534076 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.534106 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.534117 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:35 crc kubenswrapper[4707]: I0121 16:25:35.534129 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42nwf\" (UniqueName: \"kubernetes.io/projected/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0-kube-api-access-42nwf\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.033794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" event={"ID":"1cdb5ec5-58a6-435d-864d-313a4a1d5ad0","Type":"ContainerDied","Data":"5cdbdea3baba743bf2dc09b848a4633a005ea992fc616c8e9be88023ded7dd64"} Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.033844 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cdbdea3baba743bf2dc09b848a4633a005ea992fc616c8e9be88023ded7dd64" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.033861 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.074663 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2"] Jan 21 16:25:36 crc kubenswrapper[4707]: E0121 16:25:36.074938 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" containerName="bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.074956 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" containerName="bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.075098 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" containerName="bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.075494 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.076923 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.077155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.077280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.079955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.082125 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2"] Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.140451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.140649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-inventory\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.241796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.241887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-inventory\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.241983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s5rj\" (UniqueName: \"kubernetes.io/projected/118d7425-d41d-4f54-aac0-6795ec4b72fd-kube-api-access-6s5rj\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" 
Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.244532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-inventory\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.244528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.343550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s5rj\" (UniqueName: \"kubernetes.io/projected/118d7425-d41d-4f54-aac0-6795ec4b72fd-kube-api-access-6s5rj\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.356794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s5rj\" (UniqueName: \"kubernetes.io/projected/118d7425-d41d-4f54-aac0-6795ec4b72fd-kube-api-access-6s5rj\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.386745 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:36 crc kubenswrapper[4707]: I0121 16:25:36.653252 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2"] Jan 21 16:25:36 crc kubenswrapper[4707]: W0121 16:25:36.656973 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod118d7425_d41d_4f54_aac0_6795ec4b72fd.slice/crio-a34f22db772ad209d3f85c122dc613989aa8e8bfd7081afab2d6a7a3421ff6c5 WatchSource:0}: Error finding container a34f22db772ad209d3f85c122dc613989aa8e8bfd7081afab2d6a7a3421ff6c5: Status 404 returned error can't find the container with id a34f22db772ad209d3f85c122dc613989aa8e8bfd7081afab2d6a7a3421ff6c5 Jan 21 16:25:37 crc kubenswrapper[4707]: I0121 16:25:37.041277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" event={"ID":"118d7425-d41d-4f54-aac0-6795ec4b72fd","Type":"ContainerStarted","Data":"a34f22db772ad209d3f85c122dc613989aa8e8bfd7081afab2d6a7a3421ff6c5"} Jan 21 16:25:38 crc kubenswrapper[4707]: I0121 16:25:38.048188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" event={"ID":"118d7425-d41d-4f54-aac0-6795ec4b72fd","Type":"ContainerStarted","Data":"96d41a29088ea2aa04da34ffa66c3768939fd9e7bf3e5db9f18e02ea3501788c"} Jan 21 16:25:39 crc kubenswrapper[4707]: I0121 16:25:39.055597 4707 generic.go:334] "Generic (PLEG): container finished" podID="118d7425-d41d-4f54-aac0-6795ec4b72fd" containerID="96d41a29088ea2aa04da34ffa66c3768939fd9e7bf3e5db9f18e02ea3501788c" exitCode=0 Jan 21 16:25:39 crc kubenswrapper[4707]: I0121 16:25:39.055681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" event={"ID":"118d7425-d41d-4f54-aac0-6795ec4b72fd","Type":"ContainerDied","Data":"96d41a29088ea2aa04da34ffa66c3768939fd9e7bf3e5db9f18e02ea3501788c"} Jan 21 16:25:39 crc kubenswrapper[4707]: I0121 16:25:39.945255 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:25:39 crc kubenswrapper[4707]: I0121 16:25:39.945305 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.257217 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.389226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-ssh-key-edpm-compute-no-nodes\") pod \"118d7425-d41d-4f54-aac0-6795ec4b72fd\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.389333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-inventory\") pod \"118d7425-d41d-4f54-aac0-6795ec4b72fd\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.389378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s5rj\" (UniqueName: \"kubernetes.io/projected/118d7425-d41d-4f54-aac0-6795ec4b72fd-kube-api-access-6s5rj\") pod \"118d7425-d41d-4f54-aac0-6795ec4b72fd\" (UID: \"118d7425-d41d-4f54-aac0-6795ec4b72fd\") " Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.393923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118d7425-d41d-4f54-aac0-6795ec4b72fd-kube-api-access-6s5rj" (OuterVolumeSpecName: "kube-api-access-6s5rj") pod "118d7425-d41d-4f54-aac0-6795ec4b72fd" (UID: "118d7425-d41d-4f54-aac0-6795ec4b72fd"). InnerVolumeSpecName "kube-api-access-6s5rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.406436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-inventory" (OuterVolumeSpecName: "inventory") pod "118d7425-d41d-4f54-aac0-6795ec4b72fd" (UID: "118d7425-d41d-4f54-aac0-6795ec4b72fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.407624 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "118d7425-d41d-4f54-aac0-6795ec4b72fd" (UID: "118d7425-d41d-4f54-aac0-6795ec4b72fd"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.491309 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s5rj\" (UniqueName: \"kubernetes.io/projected/118d7425-d41d-4f54-aac0-6795ec4b72fd-kube-api-access-6s5rj\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.491336 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:40 crc kubenswrapper[4707]: I0121 16:25:40.491348 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/118d7425-d41d-4f54-aac0-6795ec4b72fd-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.068918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" event={"ID":"118d7425-d41d-4f54-aac0-6795ec4b72fd","Type":"ContainerDied","Data":"a34f22db772ad209d3f85c122dc613989aa8e8bfd7081afab2d6a7a3421ff6c5"} Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.068951 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a34f22db772ad209d3f85c122dc613989aa8e8bfd7081afab2d6a7a3421ff6c5" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.068959 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.103677 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp"] Jan 21 16:25:41 crc kubenswrapper[4707]: E0121 16:25:41.103964 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d7425-d41d-4f54-aac0-6795ec4b72fd" containerName="configure-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.103982 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d7425-d41d-4f54-aac0-6795ec4b72fd" containerName="configure-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.104141 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="118d7425-d41d-4f54-aac0-6795ec4b72fd" containerName="configure-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.104517 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.106485 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.106591 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.106828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.107206 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.112571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp"] Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.301719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlp9k\" (UniqueName: \"kubernetes.io/projected/a7bd3a21-683b-4488-94fa-dd1b2ff48100-kube-api-access-nlp9k\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.301765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-inventory\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.301878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.402806 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlp9k\" (UniqueName: \"kubernetes.io/projected/a7bd3a21-683b-4488-94fa-dd1b2ff48100-kube-api-access-nlp9k\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.402879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-inventory\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 
16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.402919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.405979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.405989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-inventory\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.415999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlp9k\" (UniqueName: \"kubernetes.io/projected/a7bd3a21-683b-4488-94fa-dd1b2ff48100-kube-api-access-nlp9k\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.416122 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:41 crc kubenswrapper[4707]: I0121 16:25:41.793535 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp"] Jan 21 16:25:42 crc kubenswrapper[4707]: I0121 16:25:42.075519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" event={"ID":"a7bd3a21-683b-4488-94fa-dd1b2ff48100","Type":"ContainerStarted","Data":"31f0b47842d5a9d74f11913175efb732e39cb5acfc19506f451bdf9bbb55d31f"} Jan 21 16:25:43 crc kubenswrapper[4707]: I0121 16:25:43.090059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" event={"ID":"a7bd3a21-683b-4488-94fa-dd1b2ff48100","Type":"ContainerStarted","Data":"1ec9137cc0d553135efbd802b5a20c8dbb48c76c9bbeb44bd23155dc1441794c"} Jan 21 16:25:43 crc kubenswrapper[4707]: I0121 16:25:43.104166 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" podStartSLOduration=1.609861961 podStartE2EDuration="2.104151531s" podCreationTimestamp="2026-01-21 16:25:41 +0000 UTC" firstStartedPulling="2026-01-21 16:25:41.798056386 +0000 UTC m=+5038.979572607" lastFinishedPulling="2026-01-21 16:25:42.292345956 +0000 UTC m=+5039.473862177" observedRunningTime="2026-01-21 16:25:43.102923122 +0000 UTC m=+5040.284439343" watchObservedRunningTime="2026-01-21 16:25:43.104151531 +0000 UTC m=+5040.285667753" Jan 21 16:25:44 crc kubenswrapper[4707]: I0121 16:25:44.096558 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7bd3a21-683b-4488-94fa-dd1b2ff48100" containerID="1ec9137cc0d553135efbd802b5a20c8dbb48c76c9bbeb44bd23155dc1441794c" exitCode=0 Jan 21 16:25:44 crc kubenswrapper[4707]: I0121 16:25:44.096592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" event={"ID":"a7bd3a21-683b-4488-94fa-dd1b2ff48100","Type":"ContainerDied","Data":"1ec9137cc0d553135efbd802b5a20c8dbb48c76c9bbeb44bd23155dc1441794c"} Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.310422 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.346858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-inventory\") pod \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.347024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-ssh-key-edpm-compute-no-nodes\") pod \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.347141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlp9k\" (UniqueName: \"kubernetes.io/projected/a7bd3a21-683b-4488-94fa-dd1b2ff48100-kube-api-access-nlp9k\") pod \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\" (UID: \"a7bd3a21-683b-4488-94fa-dd1b2ff48100\") " Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.352056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7bd3a21-683b-4488-94fa-dd1b2ff48100-kube-api-access-nlp9k" (OuterVolumeSpecName: "kube-api-access-nlp9k") pod "a7bd3a21-683b-4488-94fa-dd1b2ff48100" (UID: "a7bd3a21-683b-4488-94fa-dd1b2ff48100"). InnerVolumeSpecName "kube-api-access-nlp9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.363433 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-inventory" (OuterVolumeSpecName: "inventory") pod "a7bd3a21-683b-4488-94fa-dd1b2ff48100" (UID: "a7bd3a21-683b-4488-94fa-dd1b2ff48100"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.363902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "a7bd3a21-683b-4488-94fa-dd1b2ff48100" (UID: "a7bd3a21-683b-4488-94fa-dd1b2ff48100"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.448269 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlp9k\" (UniqueName: \"kubernetes.io/projected/a7bd3a21-683b-4488-94fa-dd1b2ff48100-kube-api-access-nlp9k\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.448303 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:45 crc kubenswrapper[4707]: I0121 16:25:45.448314 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a7bd3a21-683b-4488-94fa-dd1b2ff48100-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.108775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" event={"ID":"a7bd3a21-683b-4488-94fa-dd1b2ff48100","Type":"ContainerDied","Data":"31f0b47842d5a9d74f11913175efb732e39cb5acfc19506f451bdf9bbb55d31f"} Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.108834 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f0b47842d5a9d74f11913175efb732e39cb5acfc19506f451bdf9bbb55d31f" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.108842 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.152082 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52"] Jan 21 16:25:46 crc kubenswrapper[4707]: E0121 16:25:46.152340 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7bd3a21-683b-4488-94fa-dd1b2ff48100" containerName="validate-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.152357 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7bd3a21-683b-4488-94fa-dd1b2ff48100" containerName="validate-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.152487 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7bd3a21-683b-4488-94fa-dd1b2ff48100" containerName="validate-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.152944 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.156577 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.157253 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.157743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-inventory\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.157804 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.157848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.157898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgztk\" (UniqueName: \"kubernetes.io/projected/cd680393-9246-418e-a033-c2acbcf038c0-kube-api-access-jgztk\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.158322 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.163048 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52"] Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.258746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-inventory\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.258830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc 
kubenswrapper[4707]: I0121 16:25:46.258875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgztk\" (UniqueName: \"kubernetes.io/projected/cd680393-9246-418e-a033-c2acbcf038c0-kube-api-access-jgztk\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.261845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.262177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-inventory\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.274579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgztk\" (UniqueName: \"kubernetes.io/projected/cd680393-9246-418e-a033-c2acbcf038c0-kube-api-access-jgztk\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.466426 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:46 crc kubenswrapper[4707]: I0121 16:25:46.822636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52"] Jan 21 16:25:46 crc kubenswrapper[4707]: W0121 16:25:46.829952 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd680393_9246_418e_a033_c2acbcf038c0.slice/crio-263f5562a20cc618730e45d0c947cd8d5329a92f133c70a717ab3b7835463f93 WatchSource:0}: Error finding container 263f5562a20cc618730e45d0c947cd8d5329a92f133c70a717ab3b7835463f93: Status 404 returned error can't find the container with id 263f5562a20cc618730e45d0c947cd8d5329a92f133c70a717ab3b7835463f93 Jan 21 16:25:47 crc kubenswrapper[4707]: I0121 16:25:47.116561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" event={"ID":"cd680393-9246-418e-a033-c2acbcf038c0","Type":"ContainerStarted","Data":"263f5562a20cc618730e45d0c947cd8d5329a92f133c70a717ab3b7835463f93"} Jan 21 16:25:48 crc kubenswrapper[4707]: I0121 16:25:48.126162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" event={"ID":"cd680393-9246-418e-a033-c2acbcf038c0","Type":"ContainerStarted","Data":"a5a3503e9ef0de7f10f4d5118ee771c234e00b273cd2cd1129a828ff4785495f"} Jan 21 16:25:48 crc kubenswrapper[4707]: I0121 16:25:48.147005 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" podStartSLOduration=1.661806361 podStartE2EDuration="2.146976916s" podCreationTimestamp="2026-01-21 16:25:46 +0000 UTC" firstStartedPulling="2026-01-21 16:25:46.832093466 +0000 UTC m=+5044.013609687" lastFinishedPulling="2026-01-21 16:25:47.31726402 +0000 UTC m=+5044.498780242" observedRunningTime="2026-01-21 16:25:48.137940765 +0000 UTC m=+5045.319456987" watchObservedRunningTime="2026-01-21 16:25:48.146976916 +0000 UTC m=+5045.328493138" Jan 21 16:25:49 crc kubenswrapper[4707]: I0121 16:25:49.134254 4707 generic.go:334] "Generic (PLEG): container finished" podID="cd680393-9246-418e-a033-c2acbcf038c0" containerID="a5a3503e9ef0de7f10f4d5118ee771c234e00b273cd2cd1129a828ff4785495f" exitCode=0 Jan 21 16:25:49 crc kubenswrapper[4707]: I0121 16:25:49.134300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" event={"ID":"cd680393-9246-418e-a033-c2acbcf038c0","Type":"ContainerDied","Data":"a5a3503e9ef0de7f10f4d5118ee771c234e00b273cd2cd1129a828ff4785495f"} Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.350230 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.511719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-inventory\") pod \"cd680393-9246-418e-a033-c2acbcf038c0\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.511836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-ssh-key-edpm-compute-no-nodes\") pod \"cd680393-9246-418e-a033-c2acbcf038c0\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.511865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgztk\" (UniqueName: \"kubernetes.io/projected/cd680393-9246-418e-a033-c2acbcf038c0-kube-api-access-jgztk\") pod \"cd680393-9246-418e-a033-c2acbcf038c0\" (UID: \"cd680393-9246-418e-a033-c2acbcf038c0\") " Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.516552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd680393-9246-418e-a033-c2acbcf038c0-kube-api-access-jgztk" (OuterVolumeSpecName: "kube-api-access-jgztk") pod "cd680393-9246-418e-a033-c2acbcf038c0" (UID: "cd680393-9246-418e-a033-c2acbcf038c0"). InnerVolumeSpecName "kube-api-access-jgztk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.528935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-inventory" (OuterVolumeSpecName: "inventory") pod "cd680393-9246-418e-a033-c2acbcf038c0" (UID: "cd680393-9246-418e-a033-c2acbcf038c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.530260 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "cd680393-9246-418e-a033-c2acbcf038c0" (UID: "cd680393-9246-418e-a033-c2acbcf038c0"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.613544 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.613772 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgztk\" (UniqueName: \"kubernetes.io/projected/cd680393-9246-418e-a033-c2acbcf038c0-kube-api-access-jgztk\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:50 crc kubenswrapper[4707]: I0121 16:25:50.613783 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd680393-9246-418e-a033-c2acbcf038c0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.150379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" event={"ID":"cd680393-9246-418e-a033-c2acbcf038c0","Type":"ContainerDied","Data":"263f5562a20cc618730e45d0c947cd8d5329a92f133c70a717ab3b7835463f93"} Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.150422 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="263f5562a20cc618730e45d0c947cd8d5329a92f133c70a717ab3b7835463f93" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.150488 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.194554 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd"] Jan 21 16:25:51 crc kubenswrapper[4707]: E0121 16:25:51.194872 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd680393-9246-418e-a033-c2acbcf038c0" containerName="install-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.194889 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd680393-9246-418e-a033-c2acbcf038c0" containerName="install-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.195010 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd680393-9246-418e-a033-c2acbcf038c0" containerName="install-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.195449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.197575 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.197644 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.197711 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.198060 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.201100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd"] Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.221564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc88h\" (UniqueName: \"kubernetes.io/projected/b8d88374-3dc3-4e29-b45d-5ea7015beff0-kube-api-access-nc88h\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.221651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.221836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-inventory\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.323291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-inventory\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.323392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc88h\" (UniqueName: \"kubernetes.io/projected/b8d88374-3dc3-4e29-b45d-5ea7015beff0-kube-api-access-nc88h\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc 
kubenswrapper[4707]: I0121 16:25:51.323419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.326664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-inventory\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.326738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.337189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc88h\" (UniqueName: \"kubernetes.io/projected/b8d88374-3dc3-4e29-b45d-5ea7015beff0-kube-api-access-nc88h\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.511989 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:51 crc kubenswrapper[4707]: I0121 16:25:51.859683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd"] Jan 21 16:25:51 crc kubenswrapper[4707]: W0121 16:25:51.864626 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d88374_3dc3_4e29_b45d_5ea7015beff0.slice/crio-1c8b042309a4c6229131ffd0ad99ac36559f67ce467752e4d888783774a2387e WatchSource:0}: Error finding container 1c8b042309a4c6229131ffd0ad99ac36559f67ce467752e4d888783774a2387e: Status 404 returned error can't find the container with id 1c8b042309a4c6229131ffd0ad99ac36559f67ce467752e4d888783774a2387e Jan 21 16:25:52 crc kubenswrapper[4707]: I0121 16:25:52.157351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" event={"ID":"b8d88374-3dc3-4e29-b45d-5ea7015beff0","Type":"ContainerStarted","Data":"1c8b042309a4c6229131ffd0ad99ac36559f67ce467752e4d888783774a2387e"} Jan 21 16:25:53 crc kubenswrapper[4707]: I0121 16:25:53.163988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" event={"ID":"b8d88374-3dc3-4e29-b45d-5ea7015beff0","Type":"ContainerStarted","Data":"c36d4f6db68800ba60807179adb8755a923630e7b103d5dba3abb16552f6701b"} Jan 21 16:25:53 crc kubenswrapper[4707]: I0121 16:25:53.174996 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" podStartSLOduration=1.676291126 podStartE2EDuration="2.174975832s" podCreationTimestamp="2026-01-21 16:25:51 +0000 UTC" firstStartedPulling="2026-01-21 16:25:51.86677126 +0000 UTC m=+5049.048287483" lastFinishedPulling="2026-01-21 16:25:52.365455966 +0000 UTC m=+5049.546972189" observedRunningTime="2026-01-21 16:25:53.173544081 +0000 UTC m=+5050.355060303" watchObservedRunningTime="2026-01-21 16:25:53.174975832 +0000 UTC m=+5050.356492055" Jan 21 16:25:54 crc kubenswrapper[4707]: I0121 16:25:54.170458 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8d88374-3dc3-4e29-b45d-5ea7015beff0" containerID="c36d4f6db68800ba60807179adb8755a923630e7b103d5dba3abb16552f6701b" exitCode=0 Jan 21 16:25:54 crc kubenswrapper[4707]: I0121 16:25:54.170488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" event={"ID":"b8d88374-3dc3-4e29-b45d-5ea7015beff0","Type":"ContainerDied","Data":"c36d4f6db68800ba60807179adb8755a923630e7b103d5dba3abb16552f6701b"} Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.371211 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.467552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc88h\" (UniqueName: \"kubernetes.io/projected/b8d88374-3dc3-4e29-b45d-5ea7015beff0-kube-api-access-nc88h\") pod \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.467585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-inventory\") pod \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.467632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-ssh-key-edpm-compute-no-nodes\") pod \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\" (UID: \"b8d88374-3dc3-4e29-b45d-5ea7015beff0\") " Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.471708 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d88374-3dc3-4e29-b45d-5ea7015beff0-kube-api-access-nc88h" (OuterVolumeSpecName: "kube-api-access-nc88h") pod "b8d88374-3dc3-4e29-b45d-5ea7015beff0" (UID: "b8d88374-3dc3-4e29-b45d-5ea7015beff0"). InnerVolumeSpecName "kube-api-access-nc88h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.483716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "b8d88374-3dc3-4e29-b45d-5ea7015beff0" (UID: "b8d88374-3dc3-4e29-b45d-5ea7015beff0"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.485116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-inventory" (OuterVolumeSpecName: "inventory") pod "b8d88374-3dc3-4e29-b45d-5ea7015beff0" (UID: "b8d88374-3dc3-4e29-b45d-5ea7015beff0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.568554 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc88h\" (UniqueName: \"kubernetes.io/projected/b8d88374-3dc3-4e29-b45d-5ea7015beff0-kube-api-access-nc88h\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.568582 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:55 crc kubenswrapper[4707]: I0121 16:25:55.568591 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b8d88374-3dc3-4e29-b45d-5ea7015beff0-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.183489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" event={"ID":"b8d88374-3dc3-4e29-b45d-5ea7015beff0","Type":"ContainerDied","Data":"1c8b042309a4c6229131ffd0ad99ac36559f67ce467752e4d888783774a2387e"} Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.183666 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c8b042309a4c6229131ffd0ad99ac36559f67ce467752e4d888783774a2387e" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.183689 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.350096 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9"] Jan 21 16:25:56 crc kubenswrapper[4707]: E0121 16:25:56.350348 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d88374-3dc3-4e29-b45d-5ea7015beff0" containerName="configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.350365 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d88374-3dc3-4e29-b45d-5ea7015beff0" containerName="configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.350498 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d88374-3dc3-4e29-b45d-5ea7015beff0" containerName="configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.350909 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.353161 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.353311 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.353453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.353567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.362011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9"] Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.479548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-inventory\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.479661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.479712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98b4w\" (UniqueName: \"kubernetes.io/projected/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-kube-api-access-98b4w\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.580732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.580782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98b4w\" (UniqueName: \"kubernetes.io/projected/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-kube-api-access-98b4w\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 
16:25:56.580903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-inventory\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.583801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-inventory\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.584159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.593462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98b4w\" (UniqueName: \"kubernetes.io/projected/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-kube-api-access-98b4w\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:56 crc kubenswrapper[4707]: I0121 16:25:56.665592 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:25:57 crc kubenswrapper[4707]: I0121 16:25:57.011639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9"] Jan 21 16:25:57 crc kubenswrapper[4707]: W0121 16:25:57.018911 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0ad0b03_e92c_4d1f_9ac3_3e83dae0c647.slice/crio-93de04c06e49e3b18f89e1989fd22a43230e2e955e1d4e0ac1c9d55574fd35ea WatchSource:0}: Error finding container 93de04c06e49e3b18f89e1989fd22a43230e2e955e1d4e0ac1c9d55574fd35ea: Status 404 returned error can't find the container with id 93de04c06e49e3b18f89e1989fd22a43230e2e955e1d4e0ac1c9d55574fd35ea Jan 21 16:25:57 crc kubenswrapper[4707]: I0121 16:25:57.189897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" event={"ID":"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647","Type":"ContainerStarted","Data":"93de04c06e49e3b18f89e1989fd22a43230e2e955e1d4e0ac1c9d55574fd35ea"} Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.056008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nsvdm"] Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.057471 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.063576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsvdm"] Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.099987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-catalog-content\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.100040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-utilities\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.100083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7kxc\" (UniqueName: \"kubernetes.io/projected/466ce831-e28e-4748-a929-d0d0a7aa680c-kube-api-access-h7kxc\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.196716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" event={"ID":"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647","Type":"ContainerStarted","Data":"cef9bd8b683ca99af5013091c9aaef582f230035c9e921a3f737f8ce0d35781f"} Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.200901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-catalog-content\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.200945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-utilities\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.200972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7kxc\" (UniqueName: \"kubernetes.io/projected/466ce831-e28e-4748-a929-d0d0a7aa680c-kube-api-access-h7kxc\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.201319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-catalog-content\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.201373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-utilities\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.211876 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" podStartSLOduration=1.734578406 podStartE2EDuration="2.211863233s" podCreationTimestamp="2026-01-21 16:25:56 +0000 UTC" firstStartedPulling="2026-01-21 16:25:57.020270155 +0000 UTC m=+5054.201786378" lastFinishedPulling="2026-01-21 16:25:57.497554983 +0000 UTC m=+5054.679071205" observedRunningTime="2026-01-21 16:25:58.210618954 +0000 UTC m=+5055.392135176" watchObservedRunningTime="2026-01-21 16:25:58.211863233 +0000 UTC m=+5055.393379455" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.217547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7kxc\" (UniqueName: \"kubernetes.io/projected/466ce831-e28e-4748-a929-d0d0a7aa680c-kube-api-access-h7kxc\") pod \"redhat-operators-nsvdm\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.373218 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:25:58 crc kubenswrapper[4707]: I0121 16:25:58.734797 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsvdm"] Jan 21 16:25:59 crc kubenswrapper[4707]: I0121 16:25:59.203140 4707 generic.go:334] "Generic (PLEG): container finished" podID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerID="f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f" exitCode=0 Jan 21 16:25:59 crc kubenswrapper[4707]: I0121 16:25:59.203180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerDied","Data":"f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f"} Jan 21 16:25:59 crc kubenswrapper[4707]: I0121 16:25:59.203941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerStarted","Data":"4daf66b2af3379c4ac9fbbd8f047bb32b2461f1ac78040a38acef1e291be5b9a"} Jan 21 16:25:59 crc kubenswrapper[4707]: I0121 16:25:59.205272 4707 generic.go:334] "Generic (PLEG): container finished" podID="d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" containerID="cef9bd8b683ca99af5013091c9aaef582f230035c9e921a3f737f8ce0d35781f" exitCode=0 Jan 21 16:25:59 crc kubenswrapper[4707]: I0121 16:25:59.205308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" event={"ID":"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647","Type":"ContainerDied","Data":"cef9bd8b683ca99af5013091c9aaef582f230035c9e921a3f737f8ce0d35781f"} Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.213651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerStarted","Data":"f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d"} Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.439418 4707 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.533907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-ssh-key-edpm-compute-no-nodes\") pod \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.551157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" (UID: "d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.634735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98b4w\" (UniqueName: \"kubernetes.io/projected/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-kube-api-access-98b4w\") pod \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.634843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-inventory\") pod \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\" (UID: \"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647\") " Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.635179 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.636853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-kube-api-access-98b4w" (OuterVolumeSpecName: "kube-api-access-98b4w") pod "d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" (UID: "d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647"). InnerVolumeSpecName "kube-api-access-98b4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.648058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-inventory" (OuterVolumeSpecName: "inventory") pod "d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" (UID: "d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.735973 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:00 crc kubenswrapper[4707]: I0121 16:26:00.735998 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98b4w\" (UniqueName: \"kubernetes.io/projected/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647-kube-api-access-98b4w\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.219993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" event={"ID":"d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647","Type":"ContainerDied","Data":"93de04c06e49e3b18f89e1989fd22a43230e2e955e1d4e0ac1c9d55574fd35ea"} Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.220026 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93de04c06e49e3b18f89e1989fd22a43230e2e955e1d4e0ac1c9d55574fd35ea" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.220004 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.221881 4707 generic.go:334] "Generic (PLEG): container finished" podID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerID="f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d" exitCode=0 Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.221909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerDied","Data":"f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d"} Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.265565 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl"] Jan 21 16:26:01 crc kubenswrapper[4707]: E0121 16:26:01.265839 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" containerName="run-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.265851 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" containerName="run-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.265999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" containerName="run-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.266446 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.267735 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.268147 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.268343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.268612 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.268749 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.272708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl"] Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.341837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.341897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.341960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342014 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946j9\" (UniqueName: \"kubernetes.io/projected/9889a470-1fad-4c98-a7f6-42650212e6ac-kube-api-access-946j9\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-inventory\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.342640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-946j9\" (UniqueName: \"kubernetes.io/projected/9889a470-1fad-4c98-a7f6-42650212e6ac-kube-api-access-946j9\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.443990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.444009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-inventory\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.444024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.444044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.447660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.447667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-inventory\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.447670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.448010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: 
\"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.448037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.448104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.448292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.448346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.447682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.449241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.458041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-946j9\" (UniqueName: \"kubernetes.io/projected/9889a470-1fad-4c98-a7f6-42650212e6ac-kube-api-access-946j9\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 
16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.583193 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:01 crc kubenswrapper[4707]: I0121 16:26:01.920212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl"] Jan 21 16:26:01 crc kubenswrapper[4707]: W0121 16:26:01.926584 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9889a470_1fad_4c98_a7f6_42650212e6ac.slice/crio-31872a7ace619805d37c157e4f172d4072bd66ae8d0451643eb4c1bacf3cd4ac WatchSource:0}: Error finding container 31872a7ace619805d37c157e4f172d4072bd66ae8d0451643eb4c1bacf3cd4ac: Status 404 returned error can't find the container with id 31872a7ace619805d37c157e4f172d4072bd66ae8d0451643eb4c1bacf3cd4ac Jan 21 16:26:02 crc kubenswrapper[4707]: I0121 16:26:02.229270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" event={"ID":"9889a470-1fad-4c98-a7f6-42650212e6ac","Type":"ContainerStarted","Data":"31872a7ace619805d37c157e4f172d4072bd66ae8d0451643eb4c1bacf3cd4ac"} Jan 21 16:26:02 crc kubenswrapper[4707]: I0121 16:26:02.231414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerStarted","Data":"dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637"} Jan 21 16:26:02 crc kubenswrapper[4707]: I0121 16:26:02.241862 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nsvdm" podStartSLOduration=1.6743187229999998 podStartE2EDuration="4.241851134s" podCreationTimestamp="2026-01-21 16:25:58 +0000 UTC" firstStartedPulling="2026-01-21 16:25:59.204643886 +0000 UTC m=+5056.386160107" lastFinishedPulling="2026-01-21 16:26:01.772176286 +0000 UTC m=+5058.953692518" observedRunningTime="2026-01-21 16:26:02.241616873 +0000 UTC m=+5059.423133096" watchObservedRunningTime="2026-01-21 16:26:02.241851134 +0000 UTC m=+5059.423367355" Jan 21 16:26:03 crc kubenswrapper[4707]: I0121 16:26:03.239884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" event={"ID":"9889a470-1fad-4c98-a7f6-42650212e6ac","Type":"ContainerStarted","Data":"bcdcc01cfa9e1f66f466d54c72c5f04a5282ee5987698d17762ba51de6488875"} Jan 21 16:26:03 crc kubenswrapper[4707]: I0121 16:26:03.251397 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" podStartSLOduration=1.765730171 podStartE2EDuration="2.251383494s" podCreationTimestamp="2026-01-21 16:26:01 +0000 UTC" firstStartedPulling="2026-01-21 16:26:01.929027242 +0000 UTC m=+5059.110543464" lastFinishedPulling="2026-01-21 16:26:02.414680565 +0000 UTC m=+5059.596196787" observedRunningTime="2026-01-21 16:26:03.249987408 +0000 UTC m=+5060.431503630" watchObservedRunningTime="2026-01-21 16:26:03.251383494 +0000 UTC m=+5060.432899706" Jan 21 16:26:04 crc kubenswrapper[4707]: I0121 16:26:04.247041 4707 generic.go:334] "Generic (PLEG): container finished" podID="9889a470-1fad-4c98-a7f6-42650212e6ac" 
containerID="bcdcc01cfa9e1f66f466d54c72c5f04a5282ee5987698d17762ba51de6488875" exitCode=0 Jan 21 16:26:04 crc kubenswrapper[4707]: I0121 16:26:04.247122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" event={"ID":"9889a470-1fad-4c98-a7f6-42650212e6ac","Type":"ContainerDied","Data":"bcdcc01cfa9e1f66f466d54c72c5f04a5282ee5987698d17762ba51de6488875"} Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.464200 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.590964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-libvirt-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-inventory\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ssh-key-edpm-compute-no-nodes\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-946j9\" (UniqueName: \"kubernetes.io/projected/9889a470-1fad-4c98-a7f6-42650212e6ac-kube-api-access-946j9\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-nova-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ovn-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591924 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-dhcp-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.591988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-ovn-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.592093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-metadata-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.592152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-sriov-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.592258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-bootstrap-combined-ca-bundle\") pod \"9889a470-1fad-4c98-a7f6-42650212e6ac\" (UID: \"9889a470-1fad-4c98-a7f6-42650212e6ac\") " Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.595336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.595353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.595599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.595619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.596114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.596176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.596233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.596890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9889a470-1fad-4c98-a7f6-42650212e6ac-kube-api-access-946j9" (OuterVolumeSpecName: "kube-api-access-946j9") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "kube-api-access-946j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.596886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.608924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.610545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-inventory" (OuterVolumeSpecName: "inventory") pod "9889a470-1fad-4c98-a7f6-42650212e6ac" (UID: "9889a470-1fad-4c98-a7f6-42650212e6ac"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694043 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694174 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694231 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694294 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694347 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-946j9\" (UniqueName: \"kubernetes.io/projected/9889a470-1fad-4c98-a7f6-42650212e6ac-kube-api-access-946j9\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694392 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694440 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694485 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694530 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694573 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:05 crc kubenswrapper[4707]: I0121 16:26:05.694615 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9889a470-1fad-4c98-a7f6-42650212e6ac-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.260477 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk"] Jan 21 16:26:06 crc kubenswrapper[4707]: E0121 16:26:06.260712 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9889a470-1fad-4c98-a7f6-42650212e6ac" containerName="install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.260728 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9889a470-1fad-4c98-a7f6-42650212e6ac" containerName="install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.260872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9889a470-1fad-4c98-a7f6-42650212e6ac" containerName="install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.262531 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.265428 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.265607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" event={"ID":"9889a470-1fad-4c98-a7f6-42650212e6ac","Type":"ContainerDied","Data":"31872a7ace619805d37c157e4f172d4072bd66ae8d0451643eb4c1bacf3cd4ac"} Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.265651 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.265644 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31872a7ace619805d37c157e4f172d4072bd66ae8d0451643eb4c1bacf3cd4ac" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.268761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk"] Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.302536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-inventory\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.302590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.302623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.302679 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hb4\" (UniqueName: \"kubernetes.io/projected/3780a614-404b-4ba9-a981-6f0193df3c96-kube-api-access-b5hb4\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.302749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3780a614-404b-4ba9-a981-6f0193df3c96-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.404033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hb4\" (UniqueName: \"kubernetes.io/projected/3780a614-404b-4ba9-a981-6f0193df3c96-kube-api-access-b5hb4\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.404102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3780a614-404b-4ba9-a981-6f0193df3c96-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.404148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-inventory\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.404190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.404206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.404863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3780a614-404b-4ba9-a981-6f0193df3c96-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: 
\"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.406898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.406955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-inventory\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.410603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.416278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hb4\" (UniqueName: \"kubernetes.io/projected/3780a614-404b-4ba9-a981-6f0193df3c96-kube-api-access-b5hb4\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.576291 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:06 crc kubenswrapper[4707]: I0121 16:26:06.920912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk"] Jan 21 16:26:06 crc kubenswrapper[4707]: W0121 16:26:06.924282 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3780a614_404b_4ba9_a981_6f0193df3c96.slice/crio-3e4c28a9e5b6f9e2e4e1402866aa298da4d44bdf8f006e20f9b3689843bc2446 WatchSource:0}: Error finding container 3e4c28a9e5b6f9e2e4e1402866aa298da4d44bdf8f006e20f9b3689843bc2446: Status 404 returned error can't find the container with id 3e4c28a9e5b6f9e2e4e1402866aa298da4d44bdf8f006e20f9b3689843bc2446 Jan 21 16:26:07 crc kubenswrapper[4707]: I0121 16:26:07.272644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" event={"ID":"3780a614-404b-4ba9-a981-6f0193df3c96","Type":"ContainerStarted","Data":"3e4c28a9e5b6f9e2e4e1402866aa298da4d44bdf8f006e20f9b3689843bc2446"} Jan 21 16:26:08 crc kubenswrapper[4707]: I0121 16:26:08.279236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" event={"ID":"3780a614-404b-4ba9-a981-6f0193df3c96","Type":"ContainerStarted","Data":"d36c13df02c5160221b66567321b2642e332ee9d47201617ca588af483ca5d6b"} Jan 21 16:26:08 crc kubenswrapper[4707]: I0121 16:26:08.298327 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" podStartSLOduration=1.805353185 podStartE2EDuration="2.298312314s" podCreationTimestamp="2026-01-21 16:26:06 +0000 UTC" firstStartedPulling="2026-01-21 16:26:06.926134673 +0000 UTC m=+5064.107650896" lastFinishedPulling="2026-01-21 16:26:07.419093804 +0000 UTC m=+5064.600610025" observedRunningTime="2026-01-21 16:26:08.289772206 +0000 UTC m=+5065.471288428" watchObservedRunningTime="2026-01-21 16:26:08.298312314 +0000 UTC m=+5065.479828536" Jan 21 16:26:08 crc kubenswrapper[4707]: I0121 16:26:08.373544 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:26:08 crc kubenswrapper[4707]: I0121 16:26:08.373583 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:26:08 crc kubenswrapper[4707]: I0121 16:26:08.403876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.286372 4707 generic.go:334] "Generic (PLEG): container finished" podID="3780a614-404b-4ba9-a981-6f0193df3c96" containerID="d36c13df02c5160221b66567321b2642e332ee9d47201617ca588af483ca5d6b" exitCode=0 Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.286636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" event={"ID":"3780a614-404b-4ba9-a981-6f0193df3c96","Type":"ContainerDied","Data":"d36c13df02c5160221b66567321b2642e332ee9d47201617ca588af483ca5d6b"} Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.318153 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.349009 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsvdm"] Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.945364 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.945409 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.945441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.945892 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4385da0860f68c6b4fbe1ca2a1e4ddb0b3259ea351cab52fa0f587fb0728043a"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:26:09 crc kubenswrapper[4707]: I0121 16:26:09.945943 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://4385da0860f68c6b4fbe1ca2a1e4ddb0b3259ea351cab52fa0f587fb0728043a" gracePeriod=600 Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.294667 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="4385da0860f68c6b4fbe1ca2a1e4ddb0b3259ea351cab52fa0f587fb0728043a" exitCode=0 Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.294746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"4385da0860f68c6b4fbe1ca2a1e4ddb0b3259ea351cab52fa0f587fb0728043a"} Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.294903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340"} Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.294921 4707 scope.go:117] "RemoveContainer" containerID="56645cc7c11391131f3383734cec23e88bd3e36fccb48b859e97b311ebf64466" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.525324 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.653319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-inventory\") pod \"3780a614-404b-4ba9-a981-6f0193df3c96\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.653905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ovn-combined-ca-bundle\") pod \"3780a614-404b-4ba9-a981-6f0193df3c96\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.653945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ssh-key-edpm-compute-no-nodes\") pod \"3780a614-404b-4ba9-a981-6f0193df3c96\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.653999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3780a614-404b-4ba9-a981-6f0193df3c96-ovncontroller-config-0\") pod \"3780a614-404b-4ba9-a981-6f0193df3c96\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.654039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hb4\" (UniqueName: \"kubernetes.io/projected/3780a614-404b-4ba9-a981-6f0193df3c96-kube-api-access-b5hb4\") pod \"3780a614-404b-4ba9-a981-6f0193df3c96\" (UID: \"3780a614-404b-4ba9-a981-6f0193df3c96\") " Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.657552 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3780a614-404b-4ba9-a981-6f0193df3c96-kube-api-access-b5hb4" (OuterVolumeSpecName: "kube-api-access-b5hb4") pod "3780a614-404b-4ba9-a981-6f0193df3c96" (UID: "3780a614-404b-4ba9-a981-6f0193df3c96"). InnerVolumeSpecName "kube-api-access-b5hb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.657788 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3780a614-404b-4ba9-a981-6f0193df3c96" (UID: "3780a614-404b-4ba9-a981-6f0193df3c96"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.668703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3780a614-404b-4ba9-a981-6f0193df3c96-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3780a614-404b-4ba9-a981-6f0193df3c96" (UID: "3780a614-404b-4ba9-a981-6f0193df3c96"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.669936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "3780a614-404b-4ba9-a981-6f0193df3c96" (UID: "3780a614-404b-4ba9-a981-6f0193df3c96"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.670383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-inventory" (OuterVolumeSpecName: "inventory") pod "3780a614-404b-4ba9-a981-6f0193df3c96" (UID: "3780a614-404b-4ba9-a981-6f0193df3c96"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.755631 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.755658 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.755669 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3780a614-404b-4ba9-a981-6f0193df3c96-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.755678 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3780a614-404b-4ba9-a981-6f0193df3c96-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:10 crc kubenswrapper[4707]: I0121 16:26:10.755688 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5hb4\" (UniqueName: \"kubernetes.io/projected/3780a614-404b-4ba9-a981-6f0193df3c96-kube-api-access-b5hb4\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.302536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" event={"ID":"3780a614-404b-4ba9-a981-6f0193df3c96","Type":"ContainerDied","Data":"3e4c28a9e5b6f9e2e4e1402866aa298da4d44bdf8f006e20f9b3689843bc2446"} Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.302721 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4c28a9e5b6f9e2e4e1402866aa298da4d44bdf8f006e20f9b3689843bc2446" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.302654 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nsvdm" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="registry-server" containerID="cri-o://dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637" gracePeriod=2 Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.302564 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.341865 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k"] Jan 21 16:26:11 crc kubenswrapper[4707]: E0121 16:26:11.342161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3780a614-404b-4ba9-a981-6f0193df3c96" containerName="ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.342179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3780a614-404b-4ba9-a981-6f0193df3c96" containerName="ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.342311 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3780a614-404b-4ba9-a981-6f0193df3c96" containerName="ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.342736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.351243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.351534 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.351739 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.356153 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.357158 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.357246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.357305 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362345 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k"] Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-inventory\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.362862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5tcr\" (UniqueName: \"kubernetes.io/projected/6fa648c7-60be-4d40-9ef8-728b91a51263-kube-api-access-q5tcr\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5tcr\" (UniqueName: \"kubernetes.io/projected/6fa648c7-60be-4d40-9ef8-728b91a51263-kube-api-access-q5tcr\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.464279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-inventory\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.468149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.468936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.470007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.473242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.473529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.478026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-inventory\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.478365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.489398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5tcr\" (UniqueName: \"kubernetes.io/projected/6fa648c7-60be-4d40-9ef8-728b91a51263-kube-api-access-q5tcr\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:11 crc kubenswrapper[4707]: I0121 16:26:11.656039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.021565 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k"] Jan 21 16:26:12 crc kubenswrapper[4707]: W0121 16:26:12.027553 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fa648c7_60be_4d40_9ef8_728b91a51263.slice/crio-e77988a0ecf373b96f999b83e8537d3b861e2eacd3f62f8103030bdb20a1c8ff WatchSource:0}: Error finding container e77988a0ecf373b96f999b83e8537d3b861e2eacd3f62f8103030bdb20a1c8ff: Status 404 returned error can't find the container with id e77988a0ecf373b96f999b83e8537d3b861e2eacd3f62f8103030bdb20a1c8ff Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.080724 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.173849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-catalog-content\") pod \"466ce831-e28e-4748-a929-d0d0a7aa680c\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.173917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-utilities\") pod \"466ce831-e28e-4748-a929-d0d0a7aa680c\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.174010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7kxc\" (UniqueName: \"kubernetes.io/projected/466ce831-e28e-4748-a929-d0d0a7aa680c-kube-api-access-h7kxc\") pod \"466ce831-e28e-4748-a929-d0d0a7aa680c\" (UID: \"466ce831-e28e-4748-a929-d0d0a7aa680c\") " Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.176161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-utilities" (OuterVolumeSpecName: "utilities") pod "466ce831-e28e-4748-a929-d0d0a7aa680c" (UID: "466ce831-e28e-4748-a929-d0d0a7aa680c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.180266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466ce831-e28e-4748-a929-d0d0a7aa680c-kube-api-access-h7kxc" (OuterVolumeSpecName: "kube-api-access-h7kxc") pod "466ce831-e28e-4748-a929-d0d0a7aa680c" (UID: "466ce831-e28e-4748-a929-d0d0a7aa680c"). InnerVolumeSpecName "kube-api-access-h7kxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.275267 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7kxc\" (UniqueName: \"kubernetes.io/projected/466ce831-e28e-4748-a929-d0d0a7aa680c-kube-api-access-h7kxc\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.275469 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.275701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "466ce831-e28e-4748-a929-d0d0a7aa680c" (UID: "466ce831-e28e-4748-a929-d0d0a7aa680c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.310350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" event={"ID":"6fa648c7-60be-4d40-9ef8-728b91a51263","Type":"ContainerStarted","Data":"e77988a0ecf373b96f999b83e8537d3b861e2eacd3f62f8103030bdb20a1c8ff"} Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.312283 4707 generic.go:334] "Generic (PLEG): container finished" podID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerID="dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637" exitCode=0 Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.312316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerDied","Data":"dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637"} Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.312344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsvdm" event={"ID":"466ce831-e28e-4748-a929-d0d0a7aa680c","Type":"ContainerDied","Data":"4daf66b2af3379c4ac9fbbd8f047bb32b2461f1ac78040a38acef1e291be5b9a"} Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.312351 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsvdm" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.312360 4707 scope.go:117] "RemoveContainer" containerID="dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.339374 4707 scope.go:117] "RemoveContainer" containerID="f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.346153 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsvdm"] Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.351524 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nsvdm"] Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.368019 4707 scope.go:117] "RemoveContainer" containerID="f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.376832 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/466ce831-e28e-4748-a929-d0d0a7aa680c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.388527 4707 scope.go:117] "RemoveContainer" containerID="dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637" Jan 21 16:26:12 crc kubenswrapper[4707]: E0121 16:26:12.389269 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637\": container with ID starting with dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637 not found: ID does not exist" containerID="dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.389299 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637"} err="failed to get container status \"dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637\": rpc error: code = NotFound desc = could not find container \"dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637\": container with ID starting with dacbd1fa067bb10aa630fdcfc7bad5af1e8d56af5abbb96da604dd82d259c637 not found: ID does not exist" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.389322 4707 scope.go:117] "RemoveContainer" containerID="f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d" Jan 21 16:26:12 crc kubenswrapper[4707]: E0121 16:26:12.389631 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d\": container with ID starting with f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d not found: ID does not exist" containerID="f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.389662 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d"} err="failed to get container status \"f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d\": rpc error: code = NotFound desc = could not find container 
\"f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d\": container with ID starting with f726713f3cfb5828711e272b0e3fda15ac40cd68f979e0065e5efeeb0b7b691d not found: ID does not exist" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.389684 4707 scope.go:117] "RemoveContainer" containerID="f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f" Jan 21 16:26:12 crc kubenswrapper[4707]: E0121 16:26:12.390004 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f\": container with ID starting with f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f not found: ID does not exist" containerID="f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f" Jan 21 16:26:12 crc kubenswrapper[4707]: I0121 16:26:12.390045 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f"} err="failed to get container status \"f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f\": rpc error: code = NotFound desc = could not find container \"f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f\": container with ID starting with f19f0ee24b288061d10ac517b9f0b04bf4416777163fde3a8d64efd550b6ae8f not found: ID does not exist" Jan 21 16:26:13 crc kubenswrapper[4707]: I0121 16:26:13.190318 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" path="/var/lib/kubelet/pods/466ce831-e28e-4748-a929-d0d0a7aa680c/volumes" Jan 21 16:26:13 crc kubenswrapper[4707]: I0121 16:26:13.321618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" event={"ID":"6fa648c7-60be-4d40-9ef8-728b91a51263","Type":"ContainerStarted","Data":"64bfcf24eb25a8220219461e99fa55c8d67ddd59962d9b09f1db231340a5827f"} Jan 21 16:26:13 crc kubenswrapper[4707]: I0121 16:26:13.336438 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" podStartSLOduration=1.837938221 podStartE2EDuration="2.336423872s" podCreationTimestamp="2026-01-21 16:26:11 +0000 UTC" firstStartedPulling="2026-01-21 16:26:12.029770798 +0000 UTC m=+5069.211287020" lastFinishedPulling="2026-01-21 16:26:12.528256448 +0000 UTC m=+5069.709772671" observedRunningTime="2026-01-21 16:26:13.331400516 +0000 UTC m=+5070.512916739" watchObservedRunningTime="2026-01-21 16:26:13.336423872 +0000 UTC m=+5070.517940094" Jan 21 16:26:14 crc kubenswrapper[4707]: I0121 16:26:14.332871 4707 generic.go:334] "Generic (PLEG): container finished" podID="6fa648c7-60be-4d40-9ef8-728b91a51263" containerID="64bfcf24eb25a8220219461e99fa55c8d67ddd59962d9b09f1db231340a5827f" exitCode=0 Jan 21 16:26:14 crc kubenswrapper[4707]: I0121 16:26:14.332917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" event={"ID":"6fa648c7-60be-4d40-9ef8-728b91a51263","Type":"ContainerDied","Data":"64bfcf24eb25a8220219461e99fa55c8d67ddd59962d9b09f1db231340a5827f"} Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.541340 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-metadata-combined-ca-bundle\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-inventory\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-2\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-0\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-ssh-key-edpm-compute-no-nodes\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5tcr\" (UniqueName: \"kubernetes.io/projected/6fa648c7-60be-4d40-9ef8-728b91a51263-kube-api-access-q5tcr\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.720431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-1\") pod \"6fa648c7-60be-4d40-9ef8-728b91a51263\" (UID: \"6fa648c7-60be-4d40-9ef8-728b91a51263\") " Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.725752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.725825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa648c7-60be-4d40-9ef8-728b91a51263-kube-api-access-q5tcr" (OuterVolumeSpecName: "kube-api-access-q5tcr") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "kube-api-access-q5tcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.737177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.738627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.738887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.739149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-inventory" (OuterVolumeSpecName: "inventory") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.739218 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.741516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6fa648c7-60be-4d40-9ef8-728b91a51263" (UID: "6fa648c7-60be-4d40-9ef8-728b91a51263"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821867 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821895 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821906 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821917 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821929 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5tcr\" (UniqueName: \"kubernetes.io/projected/6fa648c7-60be-4d40-9ef8-728b91a51263-kube-api-access-q5tcr\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821937 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821958 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:15 crc kubenswrapper[4707]: I0121 16:26:15.821967 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa648c7-60be-4d40-9ef8-728b91a51263-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245093 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj"] Jan 21 16:26:16 crc kubenswrapper[4707]: E0121 16:26:16.245561 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="extract-content" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245578 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="extract-content" Jan 21 16:26:16 crc kubenswrapper[4707]: E0121 16:26:16.245587 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="extract-utilities" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245593 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="extract-utilities" Jan 21 16:26:16 crc kubenswrapper[4707]: E0121 16:26:16.245602 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6fa648c7-60be-4d40-9ef8-728b91a51263" containerName="neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245608 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa648c7-60be-4d40-9ef8-728b91a51263" containerName="neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:16 crc kubenswrapper[4707]: E0121 16:26:16.245620 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="registry-server" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245626 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="registry-server" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245741 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="466ce831-e28e-4748-a929-d0d0a7aa680c" containerName="registry-server" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.245760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa648c7-60be-4d40-9ef8-728b91a51263" containerName="neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.246169 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.247910 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.258203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj"] Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.346503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" event={"ID":"6fa648c7-60be-4d40-9ef8-728b91a51263","Type":"ContainerDied","Data":"e77988a0ecf373b96f999b83e8537d3b861e2eacd3f62f8103030bdb20a1c8ff"} Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.346538 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e77988a0ecf373b96f999b83e8537d3b861e2eacd3f62f8103030bdb20a1c8ff" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.346553 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.428325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-inventory\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.428383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45f7p\" (UniqueName: \"kubernetes.io/projected/d381a5d7-2629-4947-bf07-87631f779e7a-kube-api-access-45f7p\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.428411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.428457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.428561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.529731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-inventory\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.529797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45f7p\" (UniqueName: \"kubernetes.io/projected/d381a5d7-2629-4947-bf07-87631f779e7a-kube-api-access-45f7p\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.529839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.529865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.529900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.533548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-inventory\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.533935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.534292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.534623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 
16:26:16.542500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45f7p\" (UniqueName: \"kubernetes.io/projected/d381a5d7-2629-4947-bf07-87631f779e7a-kube-api-access-45f7p\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.557486 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:16 crc kubenswrapper[4707]: I0121 16:26:16.903485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj"] Jan 21 16:26:16 crc kubenswrapper[4707]: W0121 16:26:16.907028 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd381a5d7_2629_4947_bf07_87631f779e7a.slice/crio-95600f1d9d5f0da3dfd6f20225a2aef1ee2e07afdf501153b5d9592af175e063 WatchSource:0}: Error finding container 95600f1d9d5f0da3dfd6f20225a2aef1ee2e07afdf501153b5d9592af175e063: Status 404 returned error can't find the container with id 95600f1d9d5f0da3dfd6f20225a2aef1ee2e07afdf501153b5d9592af175e063 Jan 21 16:26:17 crc kubenswrapper[4707]: I0121 16:26:17.353673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" event={"ID":"d381a5d7-2629-4947-bf07-87631f779e7a","Type":"ContainerStarted","Data":"95600f1d9d5f0da3dfd6f20225a2aef1ee2e07afdf501153b5d9592af175e063"} Jan 21 16:26:18 crc kubenswrapper[4707]: I0121 16:26:18.360842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" event={"ID":"d381a5d7-2629-4947-bf07-87631f779e7a","Type":"ContainerStarted","Data":"2e02d44d4a33149b5fdbe53f47c4bf74705ac6b4d75f1b9dd562a0bf2070e70d"} Jan 21 16:26:18 crc kubenswrapper[4707]: I0121 16:26:18.375788 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" podStartSLOduration=1.7940914559999999 podStartE2EDuration="2.375775314s" podCreationTimestamp="2026-01-21 16:26:16 +0000 UTC" firstStartedPulling="2026-01-21 16:26:16.90883904 +0000 UTC m=+5074.090355262" lastFinishedPulling="2026-01-21 16:26:17.490522898 +0000 UTC m=+5074.672039120" observedRunningTime="2026-01-21 16:26:18.372128597 +0000 UTC m=+5075.553644819" watchObservedRunningTime="2026-01-21 16:26:18.375775314 +0000 UTC m=+5075.557291536" Jan 21 16:26:19 crc kubenswrapper[4707]: I0121 16:26:19.367603 4707 generic.go:334] "Generic (PLEG): container finished" podID="d381a5d7-2629-4947-bf07-87631f779e7a" containerID="2e02d44d4a33149b5fdbe53f47c4bf74705ac6b4d75f1b9dd562a0bf2070e70d" exitCode=0 Jan 21 16:26:19 crc kubenswrapper[4707]: I0121 16:26:19.367699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" event={"ID":"d381a5d7-2629-4947-bf07-87631f779e7a","Type":"ContainerDied","Data":"2e02d44d4a33149b5fdbe53f47c4bf74705ac6b4d75f1b9dd562a0bf2070e70d"} Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.583713 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.681014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-ssh-key-edpm-compute-no-nodes\") pod \"d381a5d7-2629-4947-bf07-87631f779e7a\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.681246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45f7p\" (UniqueName: \"kubernetes.io/projected/d381a5d7-2629-4947-bf07-87631f779e7a-kube-api-access-45f7p\") pod \"d381a5d7-2629-4947-bf07-87631f779e7a\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.681276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-combined-ca-bundle\") pod \"d381a5d7-2629-4947-bf07-87631f779e7a\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.681309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-inventory\") pod \"d381a5d7-2629-4947-bf07-87631f779e7a\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.681392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-agent-neutron-config-0\") pod \"d381a5d7-2629-4947-bf07-87631f779e7a\" (UID: \"d381a5d7-2629-4947-bf07-87631f779e7a\") " Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.695930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "d381a5d7-2629-4947-bf07-87631f779e7a" (UID: "d381a5d7-2629-4947-bf07-87631f779e7a"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.697320 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d381a5d7-2629-4947-bf07-87631f779e7a-kube-api-access-45f7p" (OuterVolumeSpecName: "kube-api-access-45f7p") pod "d381a5d7-2629-4947-bf07-87631f779e7a" (UID: "d381a5d7-2629-4947-bf07-87631f779e7a"). InnerVolumeSpecName "kube-api-access-45f7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.731918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-inventory" (OuterVolumeSpecName: "inventory") pod "d381a5d7-2629-4947-bf07-87631f779e7a" (UID: "d381a5d7-2629-4947-bf07-87631f779e7a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.775362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "d381a5d7-2629-4947-bf07-87631f779e7a" (UID: "d381a5d7-2629-4947-bf07-87631f779e7a"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.781261 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "d381a5d7-2629-4947-bf07-87631f779e7a" (UID: "d381a5d7-2629-4947-bf07-87631f779e7a"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.783021 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.783064 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45f7p\" (UniqueName: \"kubernetes.io/projected/d381a5d7-2629-4947-bf07-87631f779e7a-kube-api-access-45f7p\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.783074 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.783084 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:20 crc kubenswrapper[4707]: I0121 16:26:20.783094 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d381a5d7-2629-4947-bf07-87631f779e7a-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.380274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" event={"ID":"d381a5d7-2629-4947-bf07-87631f779e7a","Type":"ContainerDied","Data":"95600f1d9d5f0da3dfd6f20225a2aef1ee2e07afdf501153b5d9592af175e063"} Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.380310 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95600f1d9d5f0da3dfd6f20225a2aef1ee2e07afdf501153b5d9592af175e063" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.380314 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.467368 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f"] Jan 21 16:26:21 crc kubenswrapper[4707]: E0121 16:26:21.467649 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d381a5d7-2629-4947-bf07-87631f779e7a" containerName="neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.467666 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d381a5d7-2629-4947-bf07-87631f779e7a" containerName="neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.467780 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d381a5d7-2629-4947-bf07-87631f779e7a" containerName="neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.468203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.469752 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.469754 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.470121 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.470149 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.470228 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.470326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.473299 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f"] Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.591666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.591723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7bgv\" (UniqueName: \"kubernetes.io/projected/ee8023da-da1b-4316-a674-dda6aa48fed9-kube-api-access-j7bgv\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: 
\"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.591745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.591784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-inventory\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.591939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.692920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.692980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7bgv\" (UniqueName: \"kubernetes.io/projected/ee8023da-da1b-4316-a674-dda6aa48fed9-kube-api-access-j7bgv\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.692999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.693064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-inventory\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " 
pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.693111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.697387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.697407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-inventory\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.697416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.697445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.705242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7bgv\" (UniqueName: \"kubernetes.io/projected/ee8023da-da1b-4316-a674-dda6aa48fed9-kube-api-access-j7bgv\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.778949 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.856401 4707 scope.go:117] "RemoveContainer" containerID="9aaf40628f28453c96b4650e0084ceff71754fc2ea886263f867dfe4d5d698d4" Jan 21 16:26:21 crc kubenswrapper[4707]: I0121 16:26:21.888078 4707 scope.go:117] "RemoveContainer" containerID="8a781f9ff083bdf77a86357cdffbd3b1a3913bbd178af634a3b1b48ba7ca0a17" Jan 21 16:26:22 crc kubenswrapper[4707]: I0121 16:26:22.131046 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f"] Jan 21 16:26:22 crc kubenswrapper[4707]: W0121 16:26:22.134847 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee8023da_da1b_4316_a674_dda6aa48fed9.slice/crio-1342d0795f2dcafcd3d525d6191dd79bf9cb22e44d3291a93c0818df8f4183b8 WatchSource:0}: Error finding container 1342d0795f2dcafcd3d525d6191dd79bf9cb22e44d3291a93c0818df8f4183b8: Status 404 returned error can't find the container with id 1342d0795f2dcafcd3d525d6191dd79bf9cb22e44d3291a93c0818df8f4183b8 Jan 21 16:26:22 crc kubenswrapper[4707]: I0121 16:26:22.387702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" event={"ID":"ee8023da-da1b-4316-a674-dda6aa48fed9","Type":"ContainerStarted","Data":"1342d0795f2dcafcd3d525d6191dd79bf9cb22e44d3291a93c0818df8f4183b8"} Jan 21 16:26:23 crc kubenswrapper[4707]: I0121 16:26:23.397106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" event={"ID":"ee8023da-da1b-4316-a674-dda6aa48fed9","Type":"ContainerStarted","Data":"2343e74661716427944a90f2d452eeaea69268a7c0c635220f7df1aef68f2b33"} Jan 21 16:26:23 crc kubenswrapper[4707]: I0121 16:26:23.423650 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" podStartSLOduration=1.951429859 podStartE2EDuration="2.423632459s" podCreationTimestamp="2026-01-21 16:26:21 +0000 UTC" firstStartedPulling="2026-01-21 16:26:22.136941531 +0000 UTC m=+5079.318457753" lastFinishedPulling="2026-01-21 16:26:22.609144131 +0000 UTC m=+5079.790660353" observedRunningTime="2026-01-21 16:26:23.414668746 +0000 UTC m=+5080.596184968" watchObservedRunningTime="2026-01-21 16:26:23.423632459 +0000 UTC m=+5080.605148682" Jan 21 16:26:24 crc kubenswrapper[4707]: I0121 16:26:24.403505 4707 generic.go:334] "Generic (PLEG): container finished" podID="ee8023da-da1b-4316-a674-dda6aa48fed9" containerID="2343e74661716427944a90f2d452eeaea69268a7c0c635220f7df1aef68f2b33" exitCode=0 Jan 21 16:26:24 crc kubenswrapper[4707]: I0121 16:26:24.403605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" event={"ID":"ee8023da-da1b-4316-a674-dda6aa48fed9","Type":"ContainerDied","Data":"2343e74661716427944a90f2d452eeaea69268a7c0c635220f7df1aef68f2b33"} Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.646602 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.743939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-inventory\") pod \"ee8023da-da1b-4316-a674-dda6aa48fed9\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.744089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-agent-neutron-config-0\") pod \"ee8023da-da1b-4316-a674-dda6aa48fed9\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.744115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-combined-ca-bundle\") pod \"ee8023da-da1b-4316-a674-dda6aa48fed9\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.744164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-ssh-key-edpm-compute-no-nodes\") pod \"ee8023da-da1b-4316-a674-dda6aa48fed9\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.744192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7bgv\" (UniqueName: \"kubernetes.io/projected/ee8023da-da1b-4316-a674-dda6aa48fed9-kube-api-access-j7bgv\") pod \"ee8023da-da1b-4316-a674-dda6aa48fed9\" (UID: \"ee8023da-da1b-4316-a674-dda6aa48fed9\") " Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.748204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8023da-da1b-4316-a674-dda6aa48fed9-kube-api-access-j7bgv" (OuterVolumeSpecName: "kube-api-access-j7bgv") pod "ee8023da-da1b-4316-a674-dda6aa48fed9" (UID: "ee8023da-da1b-4316-a674-dda6aa48fed9"). InnerVolumeSpecName "kube-api-access-j7bgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.748363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "ee8023da-da1b-4316-a674-dda6aa48fed9" (UID: "ee8023da-da1b-4316-a674-dda6aa48fed9"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.760322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "ee8023da-da1b-4316-a674-dda6aa48fed9" (UID: "ee8023da-da1b-4316-a674-dda6aa48fed9"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.760903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-inventory" (OuterVolumeSpecName: "inventory") pod "ee8023da-da1b-4316-a674-dda6aa48fed9" (UID: "ee8023da-da1b-4316-a674-dda6aa48fed9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.760981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "ee8023da-da1b-4316-a674-dda6aa48fed9" (UID: "ee8023da-da1b-4316-a674-dda6aa48fed9"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.846026 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.846061 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.846074 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7bgv\" (UniqueName: \"kubernetes.io/projected/ee8023da-da1b-4316-a674-dda6aa48fed9-kube-api-access-j7bgv\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.846083 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:25 crc kubenswrapper[4707]: I0121 16:26:25.846093 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee8023da-da1b-4316-a674-dda6aa48fed9-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.257378 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6"] Jan 21 16:26:26 crc kubenswrapper[4707]: E0121 16:26:26.257646 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8023da-da1b-4316-a674-dda6aa48fed9" containerName="neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.257666 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8023da-da1b-4316-a674-dda6aa48fed9" containerName="neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.257853 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8023da-da1b-4316-a674-dda6aa48fed9" containerName="neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.258269 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.260243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.266288 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6"] Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.351498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.351557 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5nhj\" (UniqueName: \"kubernetes.io/projected/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-kube-api-access-w5nhj\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.351588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.351623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-inventory\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.351639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.416306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" event={"ID":"ee8023da-da1b-4316-a674-dda6aa48fed9","Type":"ContainerDied","Data":"1342d0795f2dcafcd3d525d6191dd79bf9cb22e44d3291a93c0818df8f4183b8"} Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.416338 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.416342 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1342d0795f2dcafcd3d525d6191dd79bf9cb22e44d3291a93c0818df8f4183b8" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.452662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.452714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5nhj\" (UniqueName: \"kubernetes.io/projected/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-kube-api-access-w5nhj\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.452742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.452773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-inventory\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.453205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.455870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.455893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-inventory\") pod 
\"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.455931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.456196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.466929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5nhj\" (UniqueName: \"kubernetes.io/projected/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-kube-api-access-w5nhj\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.570960 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:26 crc kubenswrapper[4707]: I0121 16:26:26.919192 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6"] Jan 21 16:26:27 crc kubenswrapper[4707]: I0121 16:26:27.423410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" event={"ID":"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806","Type":"ContainerStarted","Data":"3581adba0faaf8122a458273831e67b66be802a118dc5525b3302c49a2a2aa6f"} Jan 21 16:26:28 crc kubenswrapper[4707]: I0121 16:26:28.430730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" event={"ID":"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806","Type":"ContainerStarted","Data":"023323a0763df0ba7be1effdd0989514017c1aa43f13b9dbafd726ed1a60a73a"} Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.437508 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" containerID="023323a0763df0ba7be1effdd0989514017c1aa43f13b9dbafd726ed1a60a73a" exitCode=0 Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.437545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" event={"ID":"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806","Type":"ContainerDied","Data":"023323a0763df0ba7be1effdd0989514017c1aa43f13b9dbafd726ed1a60a73a"} Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.649922 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.793962 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5nhj\" (UniqueName: \"kubernetes.io/projected/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-kube-api-access-w5nhj\") pod \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.794003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-ssh-key-edpm-compute-no-nodes\") pod \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.794022 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-combined-ca-bundle\") pod \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.794100 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-agent-neutron-config-0\") pod \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.794142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-inventory\") pod \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\" (UID: \"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806\") " Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.798298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-kube-api-access-w5nhj" (OuterVolumeSpecName: "kube-api-access-w5nhj") pod "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" (UID: "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806"). InnerVolumeSpecName "kube-api-access-w5nhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.798328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" (UID: "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.809956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" (UID: "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.810110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" (UID: "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.810550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-inventory" (OuterVolumeSpecName: "inventory") pod "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" (UID: "ce0c6fd2-0c91-4e1d-8459-f2b41ac49806"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.895792 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5nhj\" (UniqueName: \"kubernetes.io/projected/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-kube-api-access-w5nhj\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.895833 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.895845 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.895855 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:29 crc kubenswrapper[4707]: I0121 16:26:29.895864 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.444513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" event={"ID":"ce0c6fd2-0c91-4e1d-8459-f2b41ac49806","Type":"ContainerDied","Data":"3581adba0faaf8122a458273831e67b66be802a118dc5525b3302c49a2a2aa6f"} Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.444545 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3581adba0faaf8122a458273831e67b66be802a118dc5525b3302c49a2a2aa6f" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.444545 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.698105 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l"] Jan 21 16:26:30 crc kubenswrapper[4707]: E0121 16:26:30.698373 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" containerName="neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.698389 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" containerName="neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.698533 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" containerName="neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.698966 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.701369 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.701588 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.701750 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.701910 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.702018 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.702076 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.704015 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l"] Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.807118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.807172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " 
pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.807313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-inventory\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.807423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-secret-0\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.807484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blx7z\" (UniqueName: \"kubernetes.io/projected/e5488638-d413-4986-9842-beaab6922631-kube-api-access-blx7z\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.908460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.908681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.908732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-inventory\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.908779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-secret-0\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.908803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-blx7z\" (UniqueName: \"kubernetes.io/projected/e5488638-d413-4986-9842-beaab6922631-kube-api-access-blx7z\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.911575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.911626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-secret-0\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.911666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.912154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-inventory\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:30 crc kubenswrapper[4707]: I0121 16:26:30.920758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blx7z\" (UniqueName: \"kubernetes.io/projected/e5488638-d413-4986-9842-beaab6922631-kube-api-access-blx7z\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:31 crc kubenswrapper[4707]: I0121 16:26:31.012749 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:31 crc kubenswrapper[4707]: I0121 16:26:31.362791 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l"] Jan 21 16:26:31 crc kubenswrapper[4707]: W0121 16:26:31.367555 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5488638_d413_4986_9842_beaab6922631.slice/crio-47127ad41b86a8f7c90085a8a1c45ada95c85dd7a0df15e4651b7068a4ec1573 WatchSource:0}: Error finding container 47127ad41b86a8f7c90085a8a1c45ada95c85dd7a0df15e4651b7068a4ec1573: Status 404 returned error can't find the container with id 47127ad41b86a8f7c90085a8a1c45ada95c85dd7a0df15e4651b7068a4ec1573 Jan 21 16:26:31 crc kubenswrapper[4707]: I0121 16:26:31.451595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" event={"ID":"e5488638-d413-4986-9842-beaab6922631","Type":"ContainerStarted","Data":"47127ad41b86a8f7c90085a8a1c45ada95c85dd7a0df15e4651b7068a4ec1573"} Jan 21 16:26:32 crc kubenswrapper[4707]: I0121 16:26:32.458917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" event={"ID":"e5488638-d413-4986-9842-beaab6922631","Type":"ContainerStarted","Data":"7add33a461c4bf15c5ffc17fe16ea2d36534a73828dfe3f191f562d56eb138bc"} Jan 21 16:26:32 crc kubenswrapper[4707]: I0121 16:26:32.471328 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" podStartSLOduration=1.9222833879999999 podStartE2EDuration="2.471315281s" podCreationTimestamp="2026-01-21 16:26:30 +0000 UTC" firstStartedPulling="2026-01-21 16:26:31.369839921 +0000 UTC m=+5088.551356143" lastFinishedPulling="2026-01-21 16:26:31.918871814 +0000 UTC m=+5089.100388036" observedRunningTime="2026-01-21 16:26:32.470081641 +0000 UTC m=+5089.651597863" watchObservedRunningTime="2026-01-21 16:26:32.471315281 +0000 UTC m=+5089.652831503" Jan 21 16:26:33 crc kubenswrapper[4707]: I0121 16:26:33.465302 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5488638-d413-4986-9842-beaab6922631" containerID="7add33a461c4bf15c5ffc17fe16ea2d36534a73828dfe3f191f562d56eb138bc" exitCode=0 Jan 21 16:26:33 crc kubenswrapper[4707]: I0121 16:26:33.465339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" event={"ID":"e5488638-d413-4986-9842-beaab6922631","Type":"ContainerDied","Data":"7add33a461c4bf15c5ffc17fe16ea2d36534a73828dfe3f191f562d56eb138bc"} Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.681791 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.855136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-secret-0\") pod \"e5488638-d413-4986-9842-beaab6922631\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.855244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blx7z\" (UniqueName: \"kubernetes.io/projected/e5488638-d413-4986-9842-beaab6922631-kube-api-access-blx7z\") pod \"e5488638-d413-4986-9842-beaab6922631\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.855278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-ssh-key-edpm-compute-no-nodes\") pod \"e5488638-d413-4986-9842-beaab6922631\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.855298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-inventory\") pod \"e5488638-d413-4986-9842-beaab6922631\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.855355 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-combined-ca-bundle\") pod \"e5488638-d413-4986-9842-beaab6922631\" (UID: \"e5488638-d413-4986-9842-beaab6922631\") " Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.859484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5488638-d413-4986-9842-beaab6922631-kube-api-access-blx7z" (OuterVolumeSpecName: "kube-api-access-blx7z") pod "e5488638-d413-4986-9842-beaab6922631" (UID: "e5488638-d413-4986-9842-beaab6922631"). InnerVolumeSpecName "kube-api-access-blx7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.860109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e5488638-d413-4986-9842-beaab6922631" (UID: "e5488638-d413-4986-9842-beaab6922631"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.871106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e5488638-d413-4986-9842-beaab6922631" (UID: "e5488638-d413-4986-9842-beaab6922631"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.871600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-inventory" (OuterVolumeSpecName: "inventory") pod "e5488638-d413-4986-9842-beaab6922631" (UID: "e5488638-d413-4986-9842-beaab6922631"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.872378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "e5488638-d413-4986-9842-beaab6922631" (UID: "e5488638-d413-4986-9842-beaab6922631"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.957480 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.957624 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blx7z\" (UniqueName: \"kubernetes.io/projected/e5488638-d413-4986-9842-beaab6922631-kube-api-access-blx7z\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.957927 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.957960 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:34 crc kubenswrapper[4707]: I0121 16:26:34.957974 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5488638-d413-4986-9842-beaab6922631-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.476998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" event={"ID":"e5488638-d413-4986-9842-beaab6922631","Type":"ContainerDied","Data":"47127ad41b86a8f7c90085a8a1c45ada95c85dd7a0df15e4651b7068a4ec1573"} Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.477028 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47127ad41b86a8f7c90085a8a1c45ada95c85dd7a0df15e4651b7068a4ec1573" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.477037 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.515431 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g"] Jan 21 16:26:35 crc kubenswrapper[4707]: E0121 16:26:35.515729 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5488638-d413-4986-9842-beaab6922631" containerName="libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.515750 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5488638-d413-4986-9842-beaab6922631" containerName="libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.515918 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5488638-d413-4986-9842-beaab6922631" containerName="libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.516345 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.517894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.517895 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.518793 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.519100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.519114 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.519128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.519105 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.526526 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g"] Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87lpd\" (UniqueName: \"kubernetes.io/projected/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-kube-api-access-87lpd\") pod 
\"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-combined-ca-bundle\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-inventory\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.666965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.667026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.767639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " 
pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87lpd\" (UniqueName: \"kubernetes.io/projected/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-kube-api-access-87lpd\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-combined-ca-bundle\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-inventory\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.768942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.771015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-combined-ca-bundle\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.771034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.771042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.771356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.771572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.771684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-inventory\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.772706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 16:26:35.781789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87lpd\" (UniqueName: \"kubernetes.io/projected/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-kube-api-access-87lpd\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:35 crc kubenswrapper[4707]: I0121 
16:26:35.835141 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:36 crc kubenswrapper[4707]: I0121 16:26:36.190736 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g"] Jan 21 16:26:36 crc kubenswrapper[4707]: I0121 16:26:36.484376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" event={"ID":"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2","Type":"ContainerStarted","Data":"906ec8d353a05c117404e53069912accad35fcfac334d5e56016c53ba8379cfd"} Jan 21 16:26:37 crc kubenswrapper[4707]: I0121 16:26:37.492130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" event={"ID":"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2","Type":"ContainerStarted","Data":"a69a5bbb987f031cf3f83394c39b8295165851ffdb009a7d41507e3be906e423"} Jan 21 16:26:37 crc kubenswrapper[4707]: I0121 16:26:37.508912 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" podStartSLOduration=2.029453931 podStartE2EDuration="2.508898358s" podCreationTimestamp="2026-01-21 16:26:35 +0000 UTC" firstStartedPulling="2026-01-21 16:26:36.195956148 +0000 UTC m=+5093.377472370" lastFinishedPulling="2026-01-21 16:26:36.675400575 +0000 UTC m=+5093.856916797" observedRunningTime="2026-01-21 16:26:37.505648477 +0000 UTC m=+5094.687164700" watchObservedRunningTime="2026-01-21 16:26:37.508898358 +0000 UTC m=+5094.690414580" Jan 21 16:26:38 crc kubenswrapper[4707]: I0121 16:26:38.499921 4707 generic.go:334] "Generic (PLEG): container finished" podID="6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" containerID="a69a5bbb987f031cf3f83394c39b8295165851ffdb009a7d41507e3be906e423" exitCode=0 Jan 21 16:26:38 crc kubenswrapper[4707]: I0121 16:26:38.499957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" event={"ID":"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2","Type":"ContainerDied","Data":"a69a5bbb987f031cf3f83394c39b8295165851ffdb009a7d41507e3be906e423"} Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.794483 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.917703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-combined-ca-bundle\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-1\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-0\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-inventory\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-0\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-ssh-key-edpm-compute-no-nodes\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87lpd\" (UniqueName: \"kubernetes.io/projected/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-kube-api-access-87lpd\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.918771 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-1\") pod \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\" (UID: \"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2\") " Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.924656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.924998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-kube-api-access-87lpd" (OuterVolumeSpecName: "kube-api-access-87lpd") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "kube-api-access-87lpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.937619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.937654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.938063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.938100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.938413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-inventory" (OuterVolumeSpecName: "inventory") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:39 crc kubenswrapper[4707]: I0121 16:26:39.938934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" (UID: "6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021607 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021643 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021654 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87lpd\" (UniqueName: \"kubernetes.io/projected/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-kube-api-access-87lpd\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021663 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021674 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021684 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021696 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.021704 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.514241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" event={"ID":"6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2","Type":"ContainerDied","Data":"906ec8d353a05c117404e53069912accad35fcfac334d5e56016c53ba8379cfd"} Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.514277 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="906ec8d353a05c117404e53069912accad35fcfac334d5e56016c53ba8379cfd" Jan 21 16:26:40 crc kubenswrapper[4707]: I0121 16:26:40.514293 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.462936 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2"] Jan 21 16:26:41 crc kubenswrapper[4707]: E0121 16:26:41.463203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" containerName="nova-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.463216 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" containerName="nova-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.463334 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" containerName="nova-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.463742 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.468256 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.468926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.468941 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.468952 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.469262 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.478623 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2"] Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.540193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-inventory\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.540502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-custom-svc-combined-ca-bundle\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.540565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-ssh-key-edpm-compute-no-nodes\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.540589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2r9j\" (UniqueName: \"kubernetes.io/projected/654e3eb3-d255-440f-be85-235463249784-kube-api-access-r2r9j\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.641796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-inventory\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.641940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-custom-svc-combined-ca-bundle\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.642001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-ssh-key-edpm-compute-no-nodes\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.642034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2r9j\" (UniqueName: \"kubernetes.io/projected/654e3eb3-d255-440f-be85-235463249784-kube-api-access-r2r9j\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.646614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-custom-svc-combined-ca-bundle\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.646642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-inventory\") pod 
\"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.647449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-ssh-key-edpm-compute-no-nodes\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.654718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2r9j\" (UniqueName: \"kubernetes.io/projected/654e3eb3-d255-440f-be85-235463249784-kube-api-access-r2r9j\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:41 crc kubenswrapper[4707]: I0121 16:26:41.776023 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:42 crc kubenswrapper[4707]: I0121 16:26:42.154010 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2"] Jan 21 16:26:42 crc kubenswrapper[4707]: I0121 16:26:42.527354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" event={"ID":"654e3eb3-d255-440f-be85-235463249784","Type":"ContainerStarted","Data":"c1d4fc82b850bef6b6fb98a97c41352ec85fb2acf9861adc3b655ef7cb2a1c24"} Jan 21 16:26:43 crc kubenswrapper[4707]: I0121 16:26:43.534616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" event={"ID":"654e3eb3-d255-440f-be85-235463249784","Type":"ContainerStarted","Data":"e03d96d4f656de595ae7afb0fa87d77c07cbe6e74f2fedc8b18a2a9330caea77"} Jan 21 16:26:43 crc kubenswrapper[4707]: I0121 16:26:43.549420 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" podStartSLOduration=2.037085877 podStartE2EDuration="2.549406802s" podCreationTimestamp="2026-01-21 16:26:41 +0000 UTC" firstStartedPulling="2026-01-21 16:26:42.156795801 +0000 UTC m=+5099.338312023" lastFinishedPulling="2026-01-21 16:26:42.669116726 +0000 UTC m=+5099.850632948" observedRunningTime="2026-01-21 16:26:43.54416822 +0000 UTC m=+5100.725684443" watchObservedRunningTime="2026-01-21 16:26:43.549406802 +0000 UTC m=+5100.730923023" Jan 21 16:26:45 crc kubenswrapper[4707]: I0121 16:26:45.547548 4707 generic.go:334] "Generic (PLEG): container finished" podID="654e3eb3-d255-440f-be85-235463249784" containerID="e03d96d4f656de595ae7afb0fa87d77c07cbe6e74f2fedc8b18a2a9330caea77" exitCode=0 Jan 21 16:26:45 crc kubenswrapper[4707]: I0121 16:26:45.547638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" 
event={"ID":"654e3eb3-d255-440f-be85-235463249784","Type":"ContainerDied","Data":"e03d96d4f656de595ae7afb0fa87d77c07cbe6e74f2fedc8b18a2a9330caea77"} Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.761713 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.811372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-custom-svc-combined-ca-bundle\") pod \"654e3eb3-d255-440f-be85-235463249784\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.811603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-inventory\") pod \"654e3eb3-d255-440f-be85-235463249784\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.811683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2r9j\" (UniqueName: \"kubernetes.io/projected/654e3eb3-d255-440f-be85-235463249784-kube-api-access-r2r9j\") pod \"654e3eb3-d255-440f-be85-235463249784\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.811711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-ssh-key-edpm-compute-no-nodes\") pod \"654e3eb3-d255-440f-be85-235463249784\" (UID: \"654e3eb3-d255-440f-be85-235463249784\") " Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.815738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654e3eb3-d255-440f-be85-235463249784-kube-api-access-r2r9j" (OuterVolumeSpecName: "kube-api-access-r2r9j") pod "654e3eb3-d255-440f-be85-235463249784" (UID: "654e3eb3-d255-440f-be85-235463249784"). InnerVolumeSpecName "kube-api-access-r2r9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.815986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-custom-svc-combined-ca-bundle" (OuterVolumeSpecName: "custom-svc-combined-ca-bundle") pod "654e3eb3-d255-440f-be85-235463249784" (UID: "654e3eb3-d255-440f-be85-235463249784"). InnerVolumeSpecName "custom-svc-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.827185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-inventory" (OuterVolumeSpecName: "inventory") pod "654e3eb3-d255-440f-be85-235463249784" (UID: "654e3eb3-d255-440f-be85-235463249784"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.827775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "654e3eb3-d255-440f-be85-235463249784" (UID: "654e3eb3-d255-440f-be85-235463249784"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.913839 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.913865 4707 reconciler_common.go:293] "Volume detached for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-custom-svc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.913875 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654e3eb3-d255-440f-be85-235463249784-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:46 crc kubenswrapper[4707]: I0121 16:26:46.913888 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2r9j\" (UniqueName: \"kubernetes.io/projected/654e3eb3-d255-440f-be85-235463249784-kube-api-access-r2r9j\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:47 crc kubenswrapper[4707]: I0121 16:26:47.561863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" event={"ID":"654e3eb3-d255-440f-be85-235463249784","Type":"ContainerDied","Data":"c1d4fc82b850bef6b6fb98a97c41352ec85fb2acf9861adc3b655ef7cb2a1c24"} Jan 21 16:26:47 crc kubenswrapper[4707]: I0121 16:26:47.561895 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d4fc82b850bef6b6fb98a97c41352ec85fb2acf9861adc3b655ef7cb2a1c24" Jan 21 16:26:47 crc kubenswrapper[4707]: I0121 16:26:47.561906 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.499296 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j"] Jan 21 16:27:11 crc kubenswrapper[4707]: E0121 16:27:11.499913 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654e3eb3-d255-440f-be85-235463249784" containerName="custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodes" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.499927 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="654e3eb3-d255-440f-be85-235463249784" containerName="custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodes" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.500066 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="654e3eb3-d255-440f-be85-235463249784" containerName="custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodes" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.500472 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.502536 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.503099 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.503251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.504167 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.504369 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.504496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.506024 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j"] Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.604878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.605016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-inventory\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.605109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56900374-8406-4956-adfa-604c411784c7-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.605159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw2ls\" (UniqueName: \"kubernetes.io/projected/56900374-8406-4956-adfa-604c411784c7-kube-api-access-bw2ls\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.605196 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.706428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.706610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-inventory\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.706705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56900374-8406-4956-adfa-604c411784c7-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.706793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2ls\" (UniqueName: \"kubernetes.io/projected/56900374-8406-4956-adfa-604c411784c7-kube-api-access-bw2ls\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.706908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.707415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56900374-8406-4956-adfa-604c411784c7-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.711440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.712219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.712859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-inventory\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.720636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw2ls\" (UniqueName: \"kubernetes.io/projected/56900374-8406-4956-adfa-604c411784c7-kube-api-access-bw2ls\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:11 crc kubenswrapper[4707]: I0121 16:27:11.813033 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:12 crc kubenswrapper[4707]: I0121 16:27:12.197621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j"] Jan 21 16:27:12 crc kubenswrapper[4707]: I0121 16:27:12.722086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" event={"ID":"56900374-8406-4956-adfa-604c411784c7","Type":"ContainerStarted","Data":"a479a81dc546fd1e6d5bbec342ec9a8d726b23df2dce37182b73020ec6425432"} Jan 21 16:27:13 crc kubenswrapper[4707]: I0121 16:27:13.729231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" event={"ID":"56900374-8406-4956-adfa-604c411784c7","Type":"ContainerStarted","Data":"3162afa8c55a04db4b2151998a05066c75b97de71be4179f018575c205f9eea2"} Jan 21 16:27:13 crc kubenswrapper[4707]: I0121 16:27:13.744672 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" podStartSLOduration=2.185700107 podStartE2EDuration="2.744659457s" podCreationTimestamp="2026-01-21 16:27:11 +0000 UTC" firstStartedPulling="2026-01-21 16:27:12.202703293 +0000 UTC m=+5129.384219514" lastFinishedPulling="2026-01-21 16:27:12.761662642 +0000 UTC m=+5129.943178864" observedRunningTime="2026-01-21 16:27:13.741876425 +0000 UTC m=+5130.923392647" watchObservedRunningTime="2026-01-21 16:27:13.744659457 +0000 UTC m=+5130.926175679" Jan 21 16:27:14 crc kubenswrapper[4707]: I0121 16:27:14.737576 4707 generic.go:334] "Generic (PLEG): container finished" podID="56900374-8406-4956-adfa-604c411784c7" containerID="3162afa8c55a04db4b2151998a05066c75b97de71be4179f018575c205f9eea2" exitCode=0 Jan 21 16:27:14 crc kubenswrapper[4707]: I0121 16:27:14.737611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" event={"ID":"56900374-8406-4956-adfa-604c411784c7","Type":"ContainerDied","Data":"3162afa8c55a04db4b2151998a05066c75b97de71be4179f018575c205f9eea2"} Jan 21 16:27:15 crc kubenswrapper[4707]: I0121 16:27:15.985466 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.156381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw2ls\" (UniqueName: \"kubernetes.io/projected/56900374-8406-4956-adfa-604c411784c7-kube-api-access-bw2ls\") pod \"56900374-8406-4956-adfa-604c411784c7\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.156424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ovn-combined-ca-bundle\") pod \"56900374-8406-4956-adfa-604c411784c7\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.156487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-inventory\") pod \"56900374-8406-4956-adfa-604c411784c7\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.156514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ssh-key-edpm-compute-no-nodes\") pod \"56900374-8406-4956-adfa-604c411784c7\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.156536 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56900374-8406-4956-adfa-604c411784c7-ovncontroller-config-0\") pod \"56900374-8406-4956-adfa-604c411784c7\" (UID: \"56900374-8406-4956-adfa-604c411784c7\") " Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.160481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "56900374-8406-4956-adfa-604c411784c7" (UID: "56900374-8406-4956-adfa-604c411784c7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.162265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56900374-8406-4956-adfa-604c411784c7-kube-api-access-bw2ls" (OuterVolumeSpecName: "kube-api-access-bw2ls") pod "56900374-8406-4956-adfa-604c411784c7" (UID: "56900374-8406-4956-adfa-604c411784c7"). InnerVolumeSpecName "kube-api-access-bw2ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.171968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56900374-8406-4956-adfa-604c411784c7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "56900374-8406-4956-adfa-604c411784c7" (UID: "56900374-8406-4956-adfa-604c411784c7"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.172118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-inventory" (OuterVolumeSpecName: "inventory") pod "56900374-8406-4956-adfa-604c411784c7" (UID: "56900374-8406-4956-adfa-604c411784c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.172532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "56900374-8406-4956-adfa-604c411784c7" (UID: "56900374-8406-4956-adfa-604c411784c7"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.257714 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.257880 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.257893 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56900374-8406-4956-adfa-604c411784c7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.257903 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw2ls\" (UniqueName: \"kubernetes.io/projected/56900374-8406-4956-adfa-604c411784c7-kube-api-access-bw2ls\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.257912 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56900374-8406-4956-adfa-604c411784c7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.750230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" event={"ID":"56900374-8406-4956-adfa-604c411784c7","Type":"ContainerDied","Data":"a479a81dc546fd1e6d5bbec342ec9a8d726b23df2dce37182b73020ec6425432"} Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.750267 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a479a81dc546fd1e6d5bbec342ec9a8d726b23df2dce37182b73020ec6425432" Jan 21 16:27:16 crc kubenswrapper[4707]: I0121 16:27:16.750282 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.630614 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5"] Jan 21 16:27:19 crc kubenswrapper[4707]: E0121 16:27:19.631245 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56900374-8406-4956-adfa-604c411784c7" containerName="ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nodes" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.631258 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="56900374-8406-4956-adfa-604c411784c7" containerName="ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nodes" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.631398 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="56900374-8406-4956-adfa-604c411784c7" containerName="ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nodes" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.632060 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.633836 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.639754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5"] Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.799179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.799270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8qg\" (UniqueName: \"kubernetes.io/projected/f003a127-eeb4-4e96-ace8-517746e977bd-kube-api-access-7k8qg\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.799378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.799433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-config\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.799464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: 
\"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.899931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.899975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-config\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.899992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.900016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.900053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8qg\" (UniqueName: \"kubernetes.io/projected/f003a127-eeb4-4e96-ace8-517746e977bd-kube-api-access-7k8qg\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.900902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-config\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.900910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.900992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 
16:27:19.901071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.915296 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8qg\" (UniqueName: \"kubernetes.io/projected/f003a127-eeb4-4e96-ace8-517746e977bd-kube-api-access-7k8qg\") pod \"dnsmasq-dnsmasq-67886899f9-vjfh5\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:19 crc kubenswrapper[4707]: I0121 16:27:19.945571 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:20 crc kubenswrapper[4707]: I0121 16:27:20.287372 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5"] Jan 21 16:27:20 crc kubenswrapper[4707]: I0121 16:27:20.775580 4707 generic.go:334] "Generic (PLEG): container finished" podID="f003a127-eeb4-4e96-ace8-517746e977bd" containerID="b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170" exitCode=0 Jan 21 16:27:20 crc kubenswrapper[4707]: I0121 16:27:20.775625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" event={"ID":"f003a127-eeb4-4e96-ace8-517746e977bd","Type":"ContainerDied","Data":"b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170"} Jan 21 16:27:20 crc kubenswrapper[4707]: I0121 16:27:20.776338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" event={"ID":"f003a127-eeb4-4e96-ace8-517746e977bd","Type":"ContainerStarted","Data":"a1ff149066d9d2e7a6233ac30a8ee59cea40bbbfe36c8d0d8b40e15a9bf49b55"} Jan 21 16:27:21 crc kubenswrapper[4707]: I0121 16:27:21.784800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" event={"ID":"f003a127-eeb4-4e96-ace8-517746e977bd","Type":"ContainerStarted","Data":"7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89"} Jan 21 16:27:21 crc kubenswrapper[4707]: I0121 16:27:21.784943 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:21 crc kubenswrapper[4707]: I0121 16:27:21.801345 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" podStartSLOduration=2.8013332159999997 podStartE2EDuration="2.801333216s" podCreationTimestamp="2026-01-21 16:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:27:21.795962116 +0000 UTC m=+5138.977478338" watchObservedRunningTime="2026-01-21 16:27:21.801333216 +0000 UTC m=+5138.982849438" Jan 21 16:27:21 crc kubenswrapper[4707]: I0121 16:27:21.969351 4707 scope.go:117] "RemoveContainer" containerID="c514d5e67f43f6a296162f67392a88d6c3015c1f19561e3dafbe7c7a270e3d48" Jan 21 16:27:21 crc kubenswrapper[4707]: I0121 16:27:21.991723 4707 scope.go:117] "RemoveContainer" containerID="c8f134230d517327fd10f894c45ae7587f2dab72a8d3a15458bf6e7c45d0502d" 
Jan 21 16:27:22 crc kubenswrapper[4707]: I0121 16:27:22.009762 4707 scope.go:117] "RemoveContainer" containerID="c40f4856db69864ca0de8dff247e67dbbafc91072cd055579f2f59cb90b0f9f4" Jan 21 16:27:22 crc kubenswrapper[4707]: I0121 16:27:22.029447 4707 scope.go:117] "RemoveContainer" containerID="f991d10c9307a9ef3f0cc39b2cbee80969c3a939ee868490fca2e9996f363d94" Jan 21 16:27:22 crc kubenswrapper[4707]: I0121 16:27:22.048010 4707 scope.go:117] "RemoveContainer" containerID="21b07e47a12d2ad4fc614006cb635a84cc5beac49075189f2bffe191e1b2e1ed" Jan 21 16:27:22 crc kubenswrapper[4707]: I0121 16:27:22.066669 4707 scope.go:117] "RemoveContainer" containerID="59f2960cb3336ee1780116af90c9ac84bf7b16f6fdd099fb23c012229bf2b205" Jan 21 16:27:29 crc kubenswrapper[4707]: I0121 16:27:29.947773 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:27:29 crc kubenswrapper[4707]: I0121 16:27:29.982461 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc"] Jan 21 16:27:29 crc kubenswrapper[4707]: I0121 16:27:29.982664 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerName="dnsmasq-dns" containerID="cri-o://bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b" gracePeriod=10 Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.305072 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.443663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-config\") pod \"74607501-37d2-44d7-91f2-b51ae5f157ba\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.443925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-dnsmasq-svc\") pod \"74607501-37d2-44d7-91f2-b51ae5f157ba\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.443951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-edpm-compute-no-nodes\") pod \"74607501-37d2-44d7-91f2-b51ae5f157ba\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.444057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4sfp\" (UniqueName: \"kubernetes.io/projected/74607501-37d2-44d7-91f2-b51ae5f157ba-kube-api-access-v4sfp\") pod \"74607501-37d2-44d7-91f2-b51ae5f157ba\" (UID: \"74607501-37d2-44d7-91f2-b51ae5f157ba\") " Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.448108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74607501-37d2-44d7-91f2-b51ae5f157ba-kube-api-access-v4sfp" (OuterVolumeSpecName: "kube-api-access-v4sfp") pod "74607501-37d2-44d7-91f2-b51ae5f157ba" (UID: "74607501-37d2-44d7-91f2-b51ae5f157ba"). InnerVolumeSpecName "kube-api-access-v4sfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.469071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "74607501-37d2-44d7-91f2-b51ae5f157ba" (UID: "74607501-37d2-44d7-91f2-b51ae5f157ba"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.469383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "74607501-37d2-44d7-91f2-b51ae5f157ba" (UID: "74607501-37d2-44d7-91f2-b51ae5f157ba"). InnerVolumeSpecName "edpm-compute-no-nodes". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.470851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-config" (OuterVolumeSpecName: "config") pod "74607501-37d2-44d7-91f2-b51ae5f157ba" (UID: "74607501-37d2-44d7-91f2-b51ae5f157ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.545613 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4sfp\" (UniqueName: \"kubernetes.io/projected/74607501-37d2-44d7-91f2-b51ae5f157ba-kube-api-access-v4sfp\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.545645 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.545656 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.545665 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/74607501-37d2-44d7-91f2-b51ae5f157ba-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.846502 4707 generic.go:334] "Generic (PLEG): container finished" podID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerID="bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b" exitCode=0 Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.846542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" event={"ID":"74607501-37d2-44d7-91f2-b51ae5f157ba","Type":"ContainerDied","Data":"bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b"} Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.846556 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.846578 4707 scope.go:117] "RemoveContainer" containerID="bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.846566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc" event={"ID":"74607501-37d2-44d7-91f2-b51ae5f157ba","Type":"ContainerDied","Data":"0d4961a07a038649298c0f9e9aca3ae85c56526a7f94e10546a42b48bfed6129"} Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.863657 4707 scope.go:117] "RemoveContainer" containerID="952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.868174 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc"] Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.873143 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-mfnlc"] Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.900311 4707 scope.go:117] "RemoveContainer" containerID="bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b" Jan 21 16:27:30 crc kubenswrapper[4707]: E0121 16:27:30.900618 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b\": container with ID starting with bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b not found: ID does not exist" containerID="bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.900646 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b"} err="failed to get container status \"bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b\": rpc error: code = NotFound desc = could not find container \"bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b\": container with ID starting with bf8710b12917f24d179afccd1a8769df95750f5435c8bdf9a25d562dd63eab4b not found: ID does not exist" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.900664 4707 scope.go:117] "RemoveContainer" containerID="952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3" Jan 21 16:27:30 crc kubenswrapper[4707]: E0121 16:27:30.900893 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3\": container with ID starting with 952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3 not found: ID does not exist" containerID="952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3" Jan 21 16:27:30 crc kubenswrapper[4707]: I0121 16:27:30.900910 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3"} err="failed to get container status \"952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3\": rpc error: code = NotFound desc = could not find container \"952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3\": container with ID starting with 
952473fb69bb2a279e856256ee81b851ae2efb8fb8dd4f651e2be7143000b8c3 not found: ID does not exist" Jan 21 16:27:31 crc kubenswrapper[4707]: I0121 16:27:31.190270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" path="/var/lib/kubelet/pods/74607501-37d2-44d7-91f2-b51ae5f157ba/volumes" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.618436 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p"] Jan 21 16:27:34 crc kubenswrapper[4707]: E0121 16:27:34.618916 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerName="init" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.618928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerName="init" Jan 21 16:27:34 crc kubenswrapper[4707]: E0121 16:27:34.618951 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerName="dnsmasq-dns" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.618957 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerName="dnsmasq-dns" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.619084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="74607501-37d2-44d7-91f2-b51ae5f157ba" containerName="dnsmasq-dns" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.619487 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.622121 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.622322 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.622448 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.622575 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.634098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p"] Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.638538 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt"] Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.639431 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.641529 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-2s9v4" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.642660 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt"] Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.643454 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.696471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg2g2\" (UniqueName: \"kubernetes.io/projected/2314a944-4745-4b81-b6c8-160f17c0c0bc-kube-api-access-tg2g2\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.696517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.696553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.696609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.696707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgkx\" (UniqueName: \"kubernetes.io/projected/6ae7b48f-7890-475b-8b86-018b44293ed5-kube-api-access-9sgkx\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.696752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-inventory\") pod 
\"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.798349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg2g2\" (UniqueName: \"kubernetes.io/projected/2314a944-4745-4b81-b6c8-160f17c0c0bc-kube-api-access-tg2g2\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.798483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.798531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.798591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.798663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgkx\" (UniqueName: \"kubernetes.io/projected/6ae7b48f-7890-475b-8b86-018b44293ed5-kube-api-access-9sgkx\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.798716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.803484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc 
kubenswrapper[4707]: I0121 16:27:34.803484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.805160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.805535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.820020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg2g2\" (UniqueName: \"kubernetes.io/projected/2314a944-4745-4b81-b6c8-160f17c0c0bc-kube-api-access-tg2g2\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.827204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgkx\" (UniqueName: \"kubernetes.io/projected/6ae7b48f-7890-475b-8b86-018b44293ed5-kube-api-access-9sgkx\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.933401 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:34 crc kubenswrapper[4707]: I0121 16:27:34.953927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:35 crc kubenswrapper[4707]: I0121 16:27:35.305333 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p"] Jan 21 16:27:35 crc kubenswrapper[4707]: W0121 16:27:35.311081 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2314a944_4745_4b81_b6c8_160f17c0c0bc.slice/crio-025e2d4132717df8430d08810b224f5e2e9dc33dbf0be05f58735c405c3f7a10 WatchSource:0}: Error finding container 025e2d4132717df8430d08810b224f5e2e9dc33dbf0be05f58735c405c3f7a10: Status 404 returned error can't find the container with id 025e2d4132717df8430d08810b224f5e2e9dc33dbf0be05f58735c405c3f7a10 Jan 21 16:27:35 crc kubenswrapper[4707]: I0121 16:27:35.370979 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt"] Jan 21 16:27:35 crc kubenswrapper[4707]: W0121 16:27:35.371455 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ae7b48f_7890_475b_8b86_018b44293ed5.slice/crio-793ba0f563f424b2e2cbb3b0ddd21d3a29c2466e4017605fe62c6a883e289fd9 WatchSource:0}: Error finding container 793ba0f563f424b2e2cbb3b0ddd21d3a29c2466e4017605fe62c6a883e289fd9: Status 404 returned error can't find the container with id 793ba0f563f424b2e2cbb3b0ddd21d3a29c2466e4017605fe62c6a883e289fd9 Jan 21 16:27:35 crc kubenswrapper[4707]: I0121 16:27:35.881096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" event={"ID":"6ae7b48f-7890-475b-8b86-018b44293ed5","Type":"ContainerStarted","Data":"793ba0f563f424b2e2cbb3b0ddd21d3a29c2466e4017605fe62c6a883e289fd9"} Jan 21 16:27:35 crc kubenswrapper[4707]: I0121 16:27:35.881978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" event={"ID":"2314a944-4745-4b81-b6c8-160f17c0c0bc","Type":"ContainerStarted","Data":"025e2d4132717df8430d08810b224f5e2e9dc33dbf0be05f58735c405c3f7a10"} Jan 21 16:27:36 crc kubenswrapper[4707]: I0121 16:27:36.890578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" event={"ID":"6ae7b48f-7890-475b-8b86-018b44293ed5","Type":"ContainerStarted","Data":"a3c437e547e7ad7aace75e969c00453ff9c2a09cba6e1369b13dd6d26b8ef56f"} Jan 21 16:27:36 crc kubenswrapper[4707]: I0121 16:27:36.891933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" event={"ID":"2314a944-4745-4b81-b6c8-160f17c0c0bc","Type":"ContainerStarted","Data":"0fab8855ff2749ff036354a02f2046fbc98cdedc5eb5e23bbf54962a935a282c"} Jan 21 16:27:36 crc kubenswrapper[4707]: I0121 16:27:36.904303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" podStartSLOduration=2.333783029 podStartE2EDuration="2.904290703s" podCreationTimestamp="2026-01-21 16:27:34 +0000 UTC" firstStartedPulling="2026-01-21 16:27:35.3731639 +0000 UTC m=+5152.554680122" lastFinishedPulling="2026-01-21 16:27:35.943671574 +0000 UTC 
m=+5153.125187796" observedRunningTime="2026-01-21 16:27:36.901229677 +0000 UTC m=+5154.082745899" watchObservedRunningTime="2026-01-21 16:27:36.904290703 +0000 UTC m=+5154.085806916" Jan 21 16:27:36 crc kubenswrapper[4707]: I0121 16:27:36.916943 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" podStartSLOduration=2.324518318 podStartE2EDuration="2.916928356s" podCreationTimestamp="2026-01-21 16:27:34 +0000 UTC" firstStartedPulling="2026-01-21 16:27:35.312843029 +0000 UTC m=+5152.494359251" lastFinishedPulling="2026-01-21 16:27:35.905253057 +0000 UTC m=+5153.086769289" observedRunningTime="2026-01-21 16:27:36.910897166 +0000 UTC m=+5154.092413388" watchObservedRunningTime="2026-01-21 16:27:36.916928356 +0000 UTC m=+5154.098444578" Jan 21 16:27:37 crc kubenswrapper[4707]: I0121 16:27:37.899135 4707 generic.go:334] "Generic (PLEG): container finished" podID="6ae7b48f-7890-475b-8b86-018b44293ed5" containerID="a3c437e547e7ad7aace75e969c00453ff9c2a09cba6e1369b13dd6d26b8ef56f" exitCode=0 Jan 21 16:27:37 crc kubenswrapper[4707]: I0121 16:27:37.899197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" event={"ID":"6ae7b48f-7890-475b-8b86-018b44293ed5","Type":"ContainerDied","Data":"a3c437e547e7ad7aace75e969c00453ff9c2a09cba6e1369b13dd6d26b8ef56f"} Jan 21 16:27:37 crc kubenswrapper[4707]: I0121 16:27:37.901001 4707 generic.go:334] "Generic (PLEG): container finished" podID="2314a944-4745-4b81-b6c8-160f17c0c0bc" containerID="0fab8855ff2749ff036354a02f2046fbc98cdedc5eb5e23bbf54962a935a282c" exitCode=0 Jan 21 16:27:37 crc kubenswrapper[4707]: I0121 16:27:37.901089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" event={"ID":"2314a944-4745-4b81-b6c8-160f17c0c0bc","Type":"ContainerDied","Data":"0fab8855ff2749ff036354a02f2046fbc98cdedc5eb5e23bbf54962a935a282c"} Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.195692 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.201527 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.361183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-ssh-key-edpm-compute-no-nodes\") pod \"2314a944-4745-4b81-b6c8-160f17c0c0bc\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.361228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sgkx\" (UniqueName: \"kubernetes.io/projected/6ae7b48f-7890-475b-8b86-018b44293ed5-kube-api-access-9sgkx\") pod \"6ae7b48f-7890-475b-8b86-018b44293ed5\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.361283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-inventory\") pod \"6ae7b48f-7890-475b-8b86-018b44293ed5\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.361338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg2g2\" (UniqueName: \"kubernetes.io/projected/2314a944-4745-4b81-b6c8-160f17c0c0bc-kube-api-access-tg2g2\") pod \"2314a944-4745-4b81-b6c8-160f17c0c0bc\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.361371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-inventory\") pod \"2314a944-4745-4b81-b6c8-160f17c0c0bc\" (UID: \"2314a944-4745-4b81-b6c8-160f17c0c0bc\") " Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.361426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-ssh-key-edpm-compute-beta-nodeset\") pod \"6ae7b48f-7890-475b-8b86-018b44293ed5\" (UID: \"6ae7b48f-7890-475b-8b86-018b44293ed5\") " Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.366067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae7b48f-7890-475b-8b86-018b44293ed5-kube-api-access-9sgkx" (OuterVolumeSpecName: "kube-api-access-9sgkx") pod "6ae7b48f-7890-475b-8b86-018b44293ed5" (UID: "6ae7b48f-7890-475b-8b86-018b44293ed5"). InnerVolumeSpecName "kube-api-access-9sgkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.367046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2314a944-4745-4b81-b6c8-160f17c0c0bc-kube-api-access-tg2g2" (OuterVolumeSpecName: "kube-api-access-tg2g2") pod "2314a944-4745-4b81-b6c8-160f17c0c0bc" (UID: "2314a944-4745-4b81-b6c8-160f17c0c0bc"). InnerVolumeSpecName "kube-api-access-tg2g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.378395 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-inventory" (OuterVolumeSpecName: "inventory") pod "6ae7b48f-7890-475b-8b86-018b44293ed5" (UID: "6ae7b48f-7890-475b-8b86-018b44293ed5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.379063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "2314a944-4745-4b81-b6c8-160f17c0c0bc" (UID: "2314a944-4745-4b81-b6c8-160f17c0c0bc"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.379493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "6ae7b48f-7890-475b-8b86-018b44293ed5" (UID: "6ae7b48f-7890-475b-8b86-018b44293ed5"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.379828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-inventory" (OuterVolumeSpecName: "inventory") pod "2314a944-4745-4b81-b6c8-160f17c0c0bc" (UID: "2314a944-4745-4b81-b6c8-160f17c0c0bc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.463302 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.463332 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg2g2\" (UniqueName: \"kubernetes.io/projected/2314a944-4745-4b81-b6c8-160f17c0c0bc-kube-api-access-tg2g2\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.463343 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.463353 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/6ae7b48f-7890-475b-8b86-018b44293ed5-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.463361 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2314a944-4745-4b81-b6c8-160f17c0c0bc-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.463370 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sgkx\" (UniqueName: \"kubernetes.io/projected/6ae7b48f-7890-475b-8b86-018b44293ed5-kube-api-access-9sgkx\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.916048 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" event={"ID":"6ae7b48f-7890-475b-8b86-018b44293ed5","Type":"ContainerDied","Data":"793ba0f563f424b2e2cbb3b0ddd21d3a29c2466e4017605fe62c6a883e289fd9"} Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.916084 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="793ba0f563f424b2e2cbb3b0ddd21d3a29c2466e4017605fe62c6a883e289fd9" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.916090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.917233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" event={"ID":"2314a944-4745-4b81-b6c8-160f17c0c0bc","Type":"ContainerDied","Data":"025e2d4132717df8430d08810b224f5e2e9dc33dbf0be05f58735c405c3f7a10"} Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.917254 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025e2d4132717df8430d08810b224f5e2e9dc33dbf0be05f58735c405c3f7a10" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.917296 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.961464 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc"] Jan 21 16:27:39 crc kubenswrapper[4707]: E0121 16:27:39.961747 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae7b48f-7890-475b-8b86-018b44293ed5" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.961767 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae7b48f-7890-475b-8b86-018b44293ed5" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:27:39 crc kubenswrapper[4707]: E0121 16:27:39.961775 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2314a944-4745-4b81-b6c8-160f17c0c0bc" containerName="download-cache-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.961781 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2314a944-4745-4b81-b6c8-160f17c0c0bc" containerName="download-cache-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.961912 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae7b48f-7890-475b-8b86-018b44293ed5" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.961939 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2314a944-4745-4b81-b6c8-160f17c0c0bc" containerName="download-cache-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.962332 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.965404 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.965490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-2s9v4" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.965762 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.966088 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.966199 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.971678 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc"] Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.995362 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp"] Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.996682 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.998298 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:27:39 crc kubenswrapper[4707]: I0121 16:27:39.998326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.004537 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp"] Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.071173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.071226 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfnc\" (UniqueName: \"kubernetes.io/projected/46e34fac-a51c-4325-8fce-914731b7f16e-kube-api-access-6zfnc\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.071329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.071595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: 
\"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfnc\" (UniqueName: \"kubernetes.io/projected/46e34fac-a51c-4325-8fce-914731b7f16e-kube-api-access-6zfnc\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcq8l\" (UniqueName: \"kubernetes.io/projected/1ae4f766-183b-4491-b5a9-f3376cfa5254-kube-api-access-zcq8l\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.172965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.175652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 
16:27:40.175874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.176494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.185746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfnc\" (UniqueName: \"kubernetes.io/projected/46e34fac-a51c-4325-8fce-914731b7f16e-kube-api-access-6zfnc\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.273874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.273915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.273941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.274019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcq8l\" (UniqueName: \"kubernetes.io/projected/1ae4f766-183b-4491-b5a9-f3376cfa5254-kube-api-access-zcq8l\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.276613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: 
\"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.277102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.277139 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.277385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.289614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcq8l\" (UniqueName: \"kubernetes.io/projected/1ae4f766-183b-4491-b5a9-f3376cfa5254-kube-api-access-zcq8l\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.311000 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.690089 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc"] Jan 21 16:27:40 crc kubenswrapper[4707]: W0121 16:27:40.693953 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46e34fac_a51c_4325_8fce_914731b7f16e.slice/crio-422c666fe89bf314e8e81b7a7304e186ef8b111629d3795cb3af8c6d9eeab719 WatchSource:0}: Error finding container 422c666fe89bf314e8e81b7a7304e186ef8b111629d3795cb3af8c6d9eeab719: Status 404 returned error can't find the container with id 422c666fe89bf314e8e81b7a7304e186ef8b111629d3795cb3af8c6d9eeab719 Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.732371 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp"] Jan 21 16:27:40 crc kubenswrapper[4707]: W0121 16:27:40.734679 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ae4f766_183b_4491_b5a9_f3376cfa5254.slice/crio-ecdcd68e8aad7e6ff2b2b20c17e80c84c453e77cf8e57f24cc23febbbcd6ab3e WatchSource:0}: Error finding container ecdcd68e8aad7e6ff2b2b20c17e80c84c453e77cf8e57f24cc23febbbcd6ab3e: Status 404 returned error can't find the container with id ecdcd68e8aad7e6ff2b2b20c17e80c84c453e77cf8e57f24cc23febbbcd6ab3e Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.926143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" event={"ID":"1ae4f766-183b-4491-b5a9-f3376cfa5254","Type":"ContainerStarted","Data":"ecdcd68e8aad7e6ff2b2b20c17e80c84c453e77cf8e57f24cc23febbbcd6ab3e"} Jan 21 16:27:40 crc kubenswrapper[4707]: I0121 16:27:40.927566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" event={"ID":"46e34fac-a51c-4325-8fce-914731b7f16e","Type":"ContainerStarted","Data":"422c666fe89bf314e8e81b7a7304e186ef8b111629d3795cb3af8c6d9eeab719"} Jan 21 16:27:41 crc kubenswrapper[4707]: I0121 16:27:41.943363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" event={"ID":"46e34fac-a51c-4325-8fce-914731b7f16e","Type":"ContainerStarted","Data":"a64f9fa10268d28f429481a860a75dfce42aa266e4662a4b2e53a283a167ee69"} Jan 21 16:27:41 crc kubenswrapper[4707]: I0121 16:27:41.944582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" event={"ID":"1ae4f766-183b-4491-b5a9-f3376cfa5254","Type":"ContainerStarted","Data":"e7fe8cfbb01e0504920f0998d87c099a028b5b5d59c5b727e542f559d1e0d545"} Jan 21 16:27:41 crc kubenswrapper[4707]: I0121 16:27:41.959753 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" podStartSLOduration=2.38515096 podStartE2EDuration="2.959739517s" podCreationTimestamp="2026-01-21 16:27:39 +0000 UTC" firstStartedPulling="2026-01-21 16:27:40.696277403 +0000 UTC m=+5157.877793626" lastFinishedPulling="2026-01-21 16:27:41.270865961 +0000 UTC m=+5158.452382183" 
observedRunningTime="2026-01-21 16:27:41.957190214 +0000 UTC m=+5159.138706436" watchObservedRunningTime="2026-01-21 16:27:41.959739517 +0000 UTC m=+5159.141255740" Jan 21 16:27:41 crc kubenswrapper[4707]: I0121 16:27:41.973767 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" podStartSLOduration=2.5198022939999998 podStartE2EDuration="2.973749109s" podCreationTimestamp="2026-01-21 16:27:39 +0000 UTC" firstStartedPulling="2026-01-21 16:27:40.736782716 +0000 UTC m=+5157.918298938" lastFinishedPulling="2026-01-21 16:27:41.19072953 +0000 UTC m=+5158.372245753" observedRunningTime="2026-01-21 16:27:41.968705877 +0000 UTC m=+5159.150222099" watchObservedRunningTime="2026-01-21 16:27:41.973749109 +0000 UTC m=+5159.155265322" Jan 21 16:27:42 crc kubenswrapper[4707]: I0121 16:27:42.952546 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ae4f766-183b-4491-b5a9-f3376cfa5254" containerID="e7fe8cfbb01e0504920f0998d87c099a028b5b5d59c5b727e542f559d1e0d545" exitCode=0 Jan 21 16:27:42 crc kubenswrapper[4707]: I0121 16:27:42.952588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" event={"ID":"1ae4f766-183b-4491-b5a9-f3376cfa5254","Type":"ContainerDied","Data":"e7fe8cfbb01e0504920f0998d87c099a028b5b5d59c5b727e542f559d1e0d545"} Jan 21 16:27:42 crc kubenswrapper[4707]: I0121 16:27:42.954652 4707 generic.go:334] "Generic (PLEG): container finished" podID="46e34fac-a51c-4325-8fce-914731b7f16e" containerID="a64f9fa10268d28f429481a860a75dfce42aa266e4662a4b2e53a283a167ee69" exitCode=0 Jan 21 16:27:42 crc kubenswrapper[4707]: I0121 16:27:42.954689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" event={"ID":"46e34fac-a51c-4325-8fce-914731b7f16e","Type":"ContainerDied","Data":"a64f9fa10268d28f429481a860a75dfce42aa266e4662a4b2e53a283a167ee69"} Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.177688 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.240072 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-inventory\") pod \"46e34fac-a51c-4325-8fce-914731b7f16e\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-bootstrap-combined-ca-bundle\") pod \"46e34fac-a51c-4325-8fce-914731b7f16e\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-ssh-key-edpm-compute-beta-nodeset\") pod \"46e34fac-a51c-4325-8fce-914731b7f16e\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-bootstrap-combined-ca-bundle\") pod \"1ae4f766-183b-4491-b5a9-f3376cfa5254\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-ssh-key-edpm-compute-no-nodes\") pod \"1ae4f766-183b-4491-b5a9-f3376cfa5254\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfnc\" (UniqueName: \"kubernetes.io/projected/46e34fac-a51c-4325-8fce-914731b7f16e-kube-api-access-6zfnc\") pod \"46e34fac-a51c-4325-8fce-914731b7f16e\" (UID: \"46e34fac-a51c-4325-8fce-914731b7f16e\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.325483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-inventory\") pod \"1ae4f766-183b-4491-b5a9-f3376cfa5254\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.328997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "46e34fac-a51c-4325-8fce-914731b7f16e" (UID: "46e34fac-a51c-4325-8fce-914731b7f16e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.330188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1ae4f766-183b-4491-b5a9-f3376cfa5254" (UID: "1ae4f766-183b-4491-b5a9-f3376cfa5254"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.330354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e34fac-a51c-4325-8fce-914731b7f16e-kube-api-access-6zfnc" (OuterVolumeSpecName: "kube-api-access-6zfnc") pod "46e34fac-a51c-4325-8fce-914731b7f16e" (UID: "46e34fac-a51c-4325-8fce-914731b7f16e"). InnerVolumeSpecName "kube-api-access-6zfnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.341247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "46e34fac-a51c-4325-8fce-914731b7f16e" (UID: "46e34fac-a51c-4325-8fce-914731b7f16e"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.341497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-inventory" (OuterVolumeSpecName: "inventory") pod "1ae4f766-183b-4491-b5a9-f3376cfa5254" (UID: "1ae4f766-183b-4491-b5a9-f3376cfa5254"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.342094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "1ae4f766-183b-4491-b5a9-f3376cfa5254" (UID: "1ae4f766-183b-4491-b5a9-f3376cfa5254"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.342502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-inventory" (OuterVolumeSpecName: "inventory") pod "46e34fac-a51c-4325-8fce-914731b7f16e" (UID: "46e34fac-a51c-4325-8fce-914731b7f16e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.426232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcq8l\" (UniqueName: \"kubernetes.io/projected/1ae4f766-183b-4491-b5a9-f3376cfa5254-kube-api-access-zcq8l\") pod \"1ae4f766-183b-4491-b5a9-f3376cfa5254\" (UID: \"1ae4f766-183b-4491-b5a9-f3376cfa5254\") " Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427415 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427441 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427453 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427464 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zfnc\" (UniqueName: \"kubernetes.io/projected/46e34fac-a51c-4325-8fce-914731b7f16e-kube-api-access-6zfnc\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427474 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae4f766-183b-4491-b5a9-f3376cfa5254-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427481 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.427489 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e34fac-a51c-4325-8fce-914731b7f16e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.428570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae4f766-183b-4491-b5a9-f3376cfa5254-kube-api-access-zcq8l" (OuterVolumeSpecName: "kube-api-access-zcq8l") pod "1ae4f766-183b-4491-b5a9-f3376cfa5254" (UID: "1ae4f766-183b-4491-b5a9-f3376cfa5254"). InnerVolumeSpecName "kube-api-access-zcq8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.528098 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcq8l\" (UniqueName: \"kubernetes.io/projected/1ae4f766-183b-4491-b5a9-f3376cfa5254-kube-api-access-zcq8l\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.967506 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.967500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp" event={"ID":"1ae4f766-183b-4491-b5a9-f3376cfa5254","Type":"ContainerDied","Data":"ecdcd68e8aad7e6ff2b2b20c17e80c84c453e77cf8e57f24cc23febbbcd6ab3e"} Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.967640 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecdcd68e8aad7e6ff2b2b20c17e80c84c453e77cf8e57f24cc23febbbcd6ab3e" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.969076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" event={"ID":"46e34fac-a51c-4325-8fce-914731b7f16e","Type":"ContainerDied","Data":"422c666fe89bf314e8e81b7a7304e186ef8b111629d3795cb3af8c6d9eeab719"} Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.969108 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422c666fe89bf314e8e81b7a7304e186ef8b111629d3795cb3af8c6d9eeab719" Jan 21 16:27:44 crc kubenswrapper[4707]: I0121 16:27:44.969128 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.289933 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv"] Jan 21 16:27:45 crc kubenswrapper[4707]: E0121 16:27:45.290226 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e34fac-a51c-4325-8fce-914731b7f16e" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.290239 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e34fac-a51c-4325-8fce-914731b7f16e" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:27:45 crc kubenswrapper[4707]: E0121 16:27:45.290254 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae4f766-183b-4491-b5a9-f3376cfa5254" containerName="bootstrap-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.290261 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae4f766-183b-4491-b5a9-f3376cfa5254" containerName="bootstrap-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.290406 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae4f766-183b-4491-b5a9-f3376cfa5254" containerName="bootstrap-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.290424 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e34fac-a51c-4325-8fce-914731b7f16e" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.290836 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.292671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.292678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.293182 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.294211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.296729 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv"] Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.437024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.437104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5t5\" (UniqueName: \"kubernetes.io/projected/294b0203-a03d-48f0-92aa-9db53c0e1f58-kube-api-access-jj5t5\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.437169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.538667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5t5\" (UniqueName: \"kubernetes.io/projected/294b0203-a03d-48f0-92aa-9db53c0e1f58-kube-api-access-jj5t5\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.538782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 
16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.538912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.542324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.542339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.551230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5t5\" (UniqueName: \"kubernetes.io/projected/294b0203-a03d-48f0-92aa-9db53c0e1f58-kube-api-access-jj5t5\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.602491 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.943338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv"] Jan 21 16:27:45 crc kubenswrapper[4707]: W0121 16:27:45.946950 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod294b0203_a03d_48f0_92aa_9db53c0e1f58.slice/crio-3c6b5ac33772b8f28198e8d8e86faa4142f4ee48da8b00a50b863a2f26343f97 WatchSource:0}: Error finding container 3c6b5ac33772b8f28198e8d8e86faa4142f4ee48da8b00a50b863a2f26343f97: Status 404 returned error can't find the container with id 3c6b5ac33772b8f28198e8d8e86faa4142f4ee48da8b00a50b863a2f26343f97 Jan 21 16:27:45 crc kubenswrapper[4707]: I0121 16:27:45.976529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" event={"ID":"294b0203-a03d-48f0-92aa-9db53c0e1f58","Type":"ContainerStarted","Data":"3c6b5ac33772b8f28198e8d8e86faa4142f4ee48da8b00a50b863a2f26343f97"} Jan 21 16:27:46 crc kubenswrapper[4707]: I0121 16:27:46.984329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" event={"ID":"294b0203-a03d-48f0-92aa-9db53c0e1f58","Type":"ContainerStarted","Data":"5058250f2b778215e36ab883b00d7616bc05d68011acf04ca6c9152f7682ea3b"} Jan 21 16:27:46 crc kubenswrapper[4707]: I0121 16:27:46.995904 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" podStartSLOduration=1.538201114 podStartE2EDuration="1.995888632s" podCreationTimestamp="2026-01-21 16:27:45 +0000 UTC" firstStartedPulling="2026-01-21 16:27:45.948457221 +0000 UTC m=+5163.129973433" lastFinishedPulling="2026-01-21 16:27:46.406144729 +0000 UTC m=+5163.587660951" observedRunningTime="2026-01-21 16:27:46.993791099 +0000 UTC m=+5164.175307320" watchObservedRunningTime="2026-01-21 16:27:46.995888632 +0000 UTC m=+5164.177404854" Jan 21 16:27:47 crc kubenswrapper[4707]: I0121 16:27:47.991743 4707 generic.go:334] "Generic (PLEG): container finished" podID="294b0203-a03d-48f0-92aa-9db53c0e1f58" containerID="5058250f2b778215e36ab883b00d7616bc05d68011acf04ca6c9152f7682ea3b" exitCode=0 Jan 21 16:27:47 crc kubenswrapper[4707]: I0121 16:27:47.991775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" event={"ID":"294b0203-a03d-48f0-92aa-9db53c0e1f58","Type":"ContainerDied","Data":"5058250f2b778215e36ab883b00d7616bc05d68011acf04ca6c9152f7682ea3b"} Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.211381 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.384203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-inventory\") pod \"294b0203-a03d-48f0-92aa-9db53c0e1f58\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.384261 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-ssh-key-edpm-compute-no-nodes\") pod \"294b0203-a03d-48f0-92aa-9db53c0e1f58\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.384305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj5t5\" (UniqueName: \"kubernetes.io/projected/294b0203-a03d-48f0-92aa-9db53c0e1f58-kube-api-access-jj5t5\") pod \"294b0203-a03d-48f0-92aa-9db53c0e1f58\" (UID: \"294b0203-a03d-48f0-92aa-9db53c0e1f58\") " Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.388075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294b0203-a03d-48f0-92aa-9db53c0e1f58-kube-api-access-jj5t5" (OuterVolumeSpecName: "kube-api-access-jj5t5") pod "294b0203-a03d-48f0-92aa-9db53c0e1f58" (UID: "294b0203-a03d-48f0-92aa-9db53c0e1f58"). InnerVolumeSpecName "kube-api-access-jj5t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.400021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "294b0203-a03d-48f0-92aa-9db53c0e1f58" (UID: "294b0203-a03d-48f0-92aa-9db53c0e1f58"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.400062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-inventory" (OuterVolumeSpecName: "inventory") pod "294b0203-a03d-48f0-92aa-9db53c0e1f58" (UID: "294b0203-a03d-48f0-92aa-9db53c0e1f58"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.485895 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.485920 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/294b0203-a03d-48f0-92aa-9db53c0e1f58-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:49 crc kubenswrapper[4707]: I0121 16:27:49.485936 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj5t5\" (UniqueName: \"kubernetes.io/projected/294b0203-a03d-48f0-92aa-9db53c0e1f58-kube-api-access-jj5t5\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.010490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" event={"ID":"294b0203-a03d-48f0-92aa-9db53c0e1f58","Type":"ContainerDied","Data":"3c6b5ac33772b8f28198e8d8e86faa4142f4ee48da8b00a50b863a2f26343f97"} Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.010674 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6b5ac33772b8f28198e8d8e86faa4142f4ee48da8b00a50b863a2f26343f97" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.010566 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.041454 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j"] Jan 21 16:27:50 crc kubenswrapper[4707]: E0121 16:27:50.041715 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294b0203-a03d-48f0-92aa-9db53c0e1f58" containerName="configure-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.041732 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="294b0203-a03d-48f0-92aa-9db53c0e1f58" containerName="configure-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.041915 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="294b0203-a03d-48f0-92aa-9db53c0e1f58" containerName="configure-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.042321 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.043927 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.044101 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.044300 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.045219 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.049377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j"] Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.091919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.092039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rls\" (UniqueName: \"kubernetes.io/projected/29ec1a45-1d87-4863-9013-5e4da9048cbc-kube-api-access-45rls\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.092103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.193316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.193405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 
16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.193474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rls\" (UniqueName: \"kubernetes.io/projected/29ec1a45-1d87-4863-9013-5e4da9048cbc-kube-api-access-45rls\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.197189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.197996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.206507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rls\" (UniqueName: \"kubernetes.io/projected/29ec1a45-1d87-4863-9013-5e4da9048cbc-kube-api-access-45rls\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.353925 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:50 crc kubenswrapper[4707]: I0121 16:27:50.710356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j"] Jan 21 16:27:50 crc kubenswrapper[4707]: W0121 16:27:50.712918 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29ec1a45_1d87_4863_9013_5e4da9048cbc.slice/crio-fddb32b5233077aee3bee38d9ca4dbaac7b24ca6266ab29a2a0ce41d99333cef WatchSource:0}: Error finding container fddb32b5233077aee3bee38d9ca4dbaac7b24ca6266ab29a2a0ce41d99333cef: Status 404 returned error can't find the container with id fddb32b5233077aee3bee38d9ca4dbaac7b24ca6266ab29a2a0ce41d99333cef Jan 21 16:27:51 crc kubenswrapper[4707]: I0121 16:27:51.017969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" event={"ID":"29ec1a45-1d87-4863-9013-5e4da9048cbc","Type":"ContainerStarted","Data":"fddb32b5233077aee3bee38d9ca4dbaac7b24ca6266ab29a2a0ce41d99333cef"} Jan 21 16:27:52 crc kubenswrapper[4707]: I0121 16:27:52.025393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" event={"ID":"29ec1a45-1d87-4863-9013-5e4da9048cbc","Type":"ContainerStarted","Data":"354727db5bf7286d602877db0f59ae8c17a1afa0f16d22ac930fc56866660313"} Jan 21 16:27:52 crc kubenswrapper[4707]: I0121 16:27:52.039383 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" podStartSLOduration=1.52402978 podStartE2EDuration="2.039370926s" podCreationTimestamp="2026-01-21 16:27:50 +0000 UTC" firstStartedPulling="2026-01-21 16:27:50.714639999 +0000 UTC m=+5167.896156220" lastFinishedPulling="2026-01-21 16:27:51.229981143 +0000 UTC m=+5168.411497366" observedRunningTime="2026-01-21 16:27:52.034402965 +0000 UTC m=+5169.215919187" watchObservedRunningTime="2026-01-21 16:27:52.039370926 +0000 UTC m=+5169.220887148" Jan 21 16:27:53 crc kubenswrapper[4707]: I0121 16:27:53.032991 4707 generic.go:334] "Generic (PLEG): container finished" podID="29ec1a45-1d87-4863-9013-5e4da9048cbc" containerID="354727db5bf7286d602877db0f59ae8c17a1afa0f16d22ac930fc56866660313" exitCode=0 Jan 21 16:27:53 crc kubenswrapper[4707]: I0121 16:27:53.033038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" event={"ID":"29ec1a45-1d87-4863-9013-5e4da9048cbc","Type":"ContainerDied","Data":"354727db5bf7286d602877db0f59ae8c17a1afa0f16d22ac930fc56866660313"} Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.235532 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.342118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-ssh-key-edpm-compute-no-nodes\") pod \"29ec1a45-1d87-4863-9013-5e4da9048cbc\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.342166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rls\" (UniqueName: \"kubernetes.io/projected/29ec1a45-1d87-4863-9013-5e4da9048cbc-kube-api-access-45rls\") pod \"29ec1a45-1d87-4863-9013-5e4da9048cbc\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.342208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-inventory\") pod \"29ec1a45-1d87-4863-9013-5e4da9048cbc\" (UID: \"29ec1a45-1d87-4863-9013-5e4da9048cbc\") " Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.346355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ec1a45-1d87-4863-9013-5e4da9048cbc-kube-api-access-45rls" (OuterVolumeSpecName: "kube-api-access-45rls") pod "29ec1a45-1d87-4863-9013-5e4da9048cbc" (UID: "29ec1a45-1d87-4863-9013-5e4da9048cbc"). InnerVolumeSpecName "kube-api-access-45rls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.358045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-inventory" (OuterVolumeSpecName: "inventory") pod "29ec1a45-1d87-4863-9013-5e4da9048cbc" (UID: "29ec1a45-1d87-4863-9013-5e4da9048cbc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.358250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "29ec1a45-1d87-4863-9013-5e4da9048cbc" (UID: "29ec1a45-1d87-4863-9013-5e4da9048cbc"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.443193 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.443226 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rls\" (UniqueName: \"kubernetes.io/projected/29ec1a45-1d87-4863-9013-5e4da9048cbc-kube-api-access-45rls\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:54 crc kubenswrapper[4707]: I0121 16:27:54.443236 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ec1a45-1d87-4863-9013-5e4da9048cbc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.047087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" event={"ID":"29ec1a45-1d87-4863-9013-5e4da9048cbc","Type":"ContainerDied","Data":"fddb32b5233077aee3bee38d9ca4dbaac7b24ca6266ab29a2a0ce41d99333cef"} Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.047121 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fddb32b5233077aee3bee38d9ca4dbaac7b24ca6266ab29a2a0ce41d99333cef" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.047139 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.084824 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl"] Jan 21 16:27:55 crc kubenswrapper[4707]: E0121 16:27:55.085104 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ec1a45-1d87-4863-9013-5e4da9048cbc" containerName="validate-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.085119 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ec1a45-1d87-4863-9013-5e4da9048cbc" containerName="validate-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.085235 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ec1a45-1d87-4863-9013-5e4da9048cbc" containerName="validate-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.085645 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.091379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.091453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.091837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.092021 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.105545 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl"] Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.148455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.148503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxpt\" (UniqueName: \"kubernetes.io/projected/77e2ada6-fa58-437b-8e03-b89e72bf9939-kube-api-access-4lxpt\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.148672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.249898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.249956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxpt\" (UniqueName: \"kubernetes.io/projected/77e2ada6-fa58-437b-8e03-b89e72bf9939-kube-api-access-4lxpt\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 
16:27:55.250005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.252803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.252886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.263888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxpt\" (UniqueName: \"kubernetes.io/projected/77e2ada6-fa58-437b-8e03-b89e72bf9939-kube-api-access-4lxpt\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.397009 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:55 crc kubenswrapper[4707]: I0121 16:27:55.745875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl"] Jan 21 16:27:55 crc kubenswrapper[4707]: W0121 16:27:55.749597 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77e2ada6_fa58_437b_8e03_b89e72bf9939.slice/crio-86d10210cf95929ee23b431c351dee6f24314642217fe6b3309bc12c24e21068 WatchSource:0}: Error finding container 86d10210cf95929ee23b431c351dee6f24314642217fe6b3309bc12c24e21068: Status 404 returned error can't find the container with id 86d10210cf95929ee23b431c351dee6f24314642217fe6b3309bc12c24e21068 Jan 21 16:27:56 crc kubenswrapper[4707]: I0121 16:27:56.054826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" event={"ID":"77e2ada6-fa58-437b-8e03-b89e72bf9939","Type":"ContainerStarted","Data":"86d10210cf95929ee23b431c351dee6f24314642217fe6b3309bc12c24e21068"} Jan 21 16:27:57 crc kubenswrapper[4707]: I0121 16:27:57.063035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" event={"ID":"77e2ada6-fa58-437b-8e03-b89e72bf9939","Type":"ContainerStarted","Data":"62eca35132dd7e2575f410ce9ff3a09e143f82aa0f6c591a4d5535d288e9bd0d"} Jan 21 16:27:57 crc kubenswrapper[4707]: I0121 16:27:57.077017 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" podStartSLOduration=1.517473126 podStartE2EDuration="2.07700348s" podCreationTimestamp="2026-01-21 16:27:55 +0000 UTC" firstStartedPulling="2026-01-21 16:27:55.75136168 +0000 UTC m=+5172.932877902" lastFinishedPulling="2026-01-21 16:27:56.310892014 +0000 UTC m=+5173.492408256" observedRunningTime="2026-01-21 16:27:57.073518687 +0000 UTC m=+5174.255034910" watchObservedRunningTime="2026-01-21 16:27:57.07700348 +0000 UTC m=+5174.258519702" Jan 21 16:27:58 crc kubenswrapper[4707]: I0121 16:27:58.070414 4707 generic.go:334] "Generic (PLEG): container finished" podID="77e2ada6-fa58-437b-8e03-b89e72bf9939" containerID="62eca35132dd7e2575f410ce9ff3a09e143f82aa0f6c591a4d5535d288e9bd0d" exitCode=0 Jan 21 16:27:58 crc kubenswrapper[4707]: I0121 16:27:58.070453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" event={"ID":"77e2ada6-fa58-437b-8e03-b89e72bf9939","Type":"ContainerDied","Data":"62eca35132dd7e2575f410ce9ff3a09e143f82aa0f6c591a4d5535d288e9bd0d"} Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.274951 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.388641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-ssh-key-edpm-compute-no-nodes\") pod \"77e2ada6-fa58-437b-8e03-b89e72bf9939\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.388726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lxpt\" (UniqueName: \"kubernetes.io/projected/77e2ada6-fa58-437b-8e03-b89e72bf9939-kube-api-access-4lxpt\") pod \"77e2ada6-fa58-437b-8e03-b89e72bf9939\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.388783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-inventory\") pod \"77e2ada6-fa58-437b-8e03-b89e72bf9939\" (UID: \"77e2ada6-fa58-437b-8e03-b89e72bf9939\") " Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.397310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e2ada6-fa58-437b-8e03-b89e72bf9939-kube-api-access-4lxpt" (OuterVolumeSpecName: "kube-api-access-4lxpt") pod "77e2ada6-fa58-437b-8e03-b89e72bf9939" (UID: "77e2ada6-fa58-437b-8e03-b89e72bf9939"). InnerVolumeSpecName "kube-api-access-4lxpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.404838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "77e2ada6-fa58-437b-8e03-b89e72bf9939" (UID: "77e2ada6-fa58-437b-8e03-b89e72bf9939"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.405005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-inventory" (OuterVolumeSpecName: "inventory") pod "77e2ada6-fa58-437b-8e03-b89e72bf9939" (UID: "77e2ada6-fa58-437b-8e03-b89e72bf9939"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.490232 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lxpt\" (UniqueName: \"kubernetes.io/projected/77e2ada6-fa58-437b-8e03-b89e72bf9939-kube-api-access-4lxpt\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.490256 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:27:59 crc kubenswrapper[4707]: I0121 16:27:59.490268 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/77e2ada6-fa58-437b-8e03-b89e72bf9939-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.084192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" event={"ID":"77e2ada6-fa58-437b-8e03-b89e72bf9939","Type":"ContainerDied","Data":"86d10210cf95929ee23b431c351dee6f24314642217fe6b3309bc12c24e21068"} Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.084235 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d10210cf95929ee23b431c351dee6f24314642217fe6b3309bc12c24e21068" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.084243 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.126801 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl"] Jan 21 16:28:00 crc kubenswrapper[4707]: E0121 16:28:00.127095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e2ada6-fa58-437b-8e03-b89e72bf9939" containerName="install-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.127112 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e2ada6-fa58-437b-8e03-b89e72bf9939" containerName="install-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.127227 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e2ada6-fa58-437b-8e03-b89e72bf9939" containerName="install-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.127637 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.128974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.129037 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.129609 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.129673 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.132625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl"] Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.197895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6f4\" (UniqueName: \"kubernetes.io/projected/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-kube-api-access-7j6f4\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.198153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.198208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.298720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.298756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 
16:28:00.298789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6f4\" (UniqueName: \"kubernetes.io/projected/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-kube-api-access-7j6f4\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.301488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.302242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.312224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6f4\" (UniqueName: \"kubernetes.io/projected/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-kube-api-access-7j6f4\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.439979 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:00 crc kubenswrapper[4707]: I0121 16:28:00.781672 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl"] Jan 21 16:28:00 crc kubenswrapper[4707]: W0121 16:28:00.782203 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe74bd7e_8319_431e_9a0a_bd5dcbfbe63d.slice/crio-5a6816a3aaf44ba40f8f1427658ea486a2fbb9e37e80b337762f1bf199386adf WatchSource:0}: Error finding container 5a6816a3aaf44ba40f8f1427658ea486a2fbb9e37e80b337762f1bf199386adf: Status 404 returned error can't find the container with id 5a6816a3aaf44ba40f8f1427658ea486a2fbb9e37e80b337762f1bf199386adf Jan 21 16:28:01 crc kubenswrapper[4707]: I0121 16:28:01.091217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" event={"ID":"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d","Type":"ContainerStarted","Data":"5a6816a3aaf44ba40f8f1427658ea486a2fbb9e37e80b337762f1bf199386adf"} Jan 21 16:28:02 crc kubenswrapper[4707]: I0121 16:28:02.098418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" event={"ID":"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d","Type":"ContainerStarted","Data":"4e8c1b261844098aab05808bfae3825fe5faaf09272f20482840f3cddd0f3400"} Jan 21 16:28:02 crc kubenswrapper[4707]: I0121 16:28:02.112607 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" podStartSLOduration=1.584320194 podStartE2EDuration="2.112594597s" podCreationTimestamp="2026-01-21 16:28:00 +0000 UTC" firstStartedPulling="2026-01-21 16:28:00.78407717 +0000 UTC m=+5177.965593391" lastFinishedPulling="2026-01-21 16:28:01.312351572 +0000 UTC m=+5178.493867794" observedRunningTime="2026-01-21 16:28:02.108549039 +0000 UTC m=+5179.290065251" watchObservedRunningTime="2026-01-21 16:28:02.112594597 +0000 UTC m=+5179.294110819" Jan 21 16:28:03 crc kubenswrapper[4707]: I0121 16:28:03.106231 4707 generic.go:334] "Generic (PLEG): container finished" podID="fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" containerID="4e8c1b261844098aab05808bfae3825fe5faaf09272f20482840f3cddd0f3400" exitCode=0 Jan 21 16:28:03 crc kubenswrapper[4707]: I0121 16:28:03.106268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" event={"ID":"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d","Type":"ContainerDied","Data":"4e8c1b261844098aab05808bfae3825fe5faaf09272f20482840f3cddd0f3400"} Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.324731 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.346860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j6f4\" (UniqueName: \"kubernetes.io/projected/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-kube-api-access-7j6f4\") pod \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.346940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-inventory\") pod \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.346966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-ssh-key-edpm-compute-no-nodes\") pod \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\" (UID: \"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d\") " Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.350787 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-kube-api-access-7j6f4" (OuterVolumeSpecName: "kube-api-access-7j6f4") pod "fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" (UID: "fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d"). InnerVolumeSpecName "kube-api-access-7j6f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.363566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" (UID: "fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.363736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-inventory" (OuterVolumeSpecName: "inventory") pod "fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" (UID: "fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.447733 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.447757 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:04 crc kubenswrapper[4707]: I0121 16:28:04.447771 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j6f4\" (UniqueName: \"kubernetes.io/projected/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d-kube-api-access-7j6f4\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.120298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" event={"ID":"fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d","Type":"ContainerDied","Data":"5a6816a3aaf44ba40f8f1427658ea486a2fbb9e37e80b337762f1bf199386adf"} Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.120342 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6816a3aaf44ba40f8f1427658ea486a2fbb9e37e80b337762f1bf199386adf" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.120338 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.160645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq"] Jan 21 16:28:05 crc kubenswrapper[4707]: E0121 16:28:05.160930 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" containerName="configure-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.160948 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" containerName="configure-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.161084 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" containerName="configure-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.161526 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.163838 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.164094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.164232 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.164356 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.171418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq"] Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.256546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wpl6\" (UniqueName: \"kubernetes.io/projected/e38debfc-3c8a-4248-82c1-43087acc34f7-kube-api-access-2wpl6\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.256751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.256910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.357722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.357765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wpl6\" (UniqueName: \"kubernetes.io/projected/e38debfc-3c8a-4248-82c1-43087acc34f7-kube-api-access-2wpl6\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.357797 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.360967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.360969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.371466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wpl6\" (UniqueName: \"kubernetes.io/projected/e38debfc-3c8a-4248-82c1-43087acc34f7-kube-api-access-2wpl6\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.477454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:05 crc kubenswrapper[4707]: I0121 16:28:05.817909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq"] Jan 21 16:28:06 crc kubenswrapper[4707]: I0121 16:28:06.127421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" event={"ID":"e38debfc-3c8a-4248-82c1-43087acc34f7","Type":"ContainerStarted","Data":"006d781b5517ac727f59bf7f53aa27bf4168c866a89a351bd7a9c6774bb427f0"} Jan 21 16:28:07 crc kubenswrapper[4707]: I0121 16:28:07.135847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" event={"ID":"e38debfc-3c8a-4248-82c1-43087acc34f7","Type":"ContainerStarted","Data":"59f40f0c60e236ed9e6381bb4593cee997b28e3f5dfe72d30c01e5b49090c05d"} Jan 21 16:28:08 crc kubenswrapper[4707]: I0121 16:28:08.151188 4707 generic.go:334] "Generic (PLEG): container finished" podID="e38debfc-3c8a-4248-82c1-43087acc34f7" containerID="59f40f0c60e236ed9e6381bb4593cee997b28e3f5dfe72d30c01e5b49090c05d" exitCode=0 Jan 21 16:28:08 crc kubenswrapper[4707]: I0121 16:28:08.151225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" event={"ID":"e38debfc-3c8a-4248-82c1-43087acc34f7","Type":"ContainerDied","Data":"59f40f0c60e236ed9e6381bb4593cee997b28e3f5dfe72d30c01e5b49090c05d"} Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.358094 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.400260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-ssh-key-edpm-compute-no-nodes\") pod \"e38debfc-3c8a-4248-82c1-43087acc34f7\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.400311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-inventory\") pod \"e38debfc-3c8a-4248-82c1-43087acc34f7\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.400415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wpl6\" (UniqueName: \"kubernetes.io/projected/e38debfc-3c8a-4248-82c1-43087acc34f7-kube-api-access-2wpl6\") pod \"e38debfc-3c8a-4248-82c1-43087acc34f7\" (UID: \"e38debfc-3c8a-4248-82c1-43087acc34f7\") " Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.404177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38debfc-3c8a-4248-82c1-43087acc34f7-kube-api-access-2wpl6" (OuterVolumeSpecName: "kube-api-access-2wpl6") pod "e38debfc-3c8a-4248-82c1-43087acc34f7" (UID: "e38debfc-3c8a-4248-82c1-43087acc34f7"). InnerVolumeSpecName "kube-api-access-2wpl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.415932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-inventory" (OuterVolumeSpecName: "inventory") pod "e38debfc-3c8a-4248-82c1-43087acc34f7" (UID: "e38debfc-3c8a-4248-82c1-43087acc34f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.416156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "e38debfc-3c8a-4248-82c1-43087acc34f7" (UID: "e38debfc-3c8a-4248-82c1-43087acc34f7"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.501497 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wpl6\" (UniqueName: \"kubernetes.io/projected/e38debfc-3c8a-4248-82c1-43087acc34f7-kube-api-access-2wpl6\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.501520 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:09 crc kubenswrapper[4707]: I0121 16:28:09.501546 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e38debfc-3c8a-4248-82c1-43087acc34f7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.165156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" event={"ID":"e38debfc-3c8a-4248-82c1-43087acc34f7","Type":"ContainerDied","Data":"006d781b5517ac727f59bf7f53aa27bf4168c866a89a351bd7a9c6774bb427f0"} Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.165191 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="006d781b5517ac727f59bf7f53aa27bf4168c866a89a351bd7a9c6774bb427f0" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.165196 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.196594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv"] Jan 21 16:28:10 crc kubenswrapper[4707]: E0121 16:28:10.196898 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38debfc-3c8a-4248-82c1-43087acc34f7" containerName="run-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.196916 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38debfc-3c8a-4248-82c1-43087acc34f7" containerName="run-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.197066 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38debfc-3c8a-4248-82c1-43087acc34f7" containerName="run-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.197589 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.198887 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.199262 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.199541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.199866 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.201835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.206287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv"] Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.309751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.310109 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.310141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcd76\" (UniqueName: \"kubernetes.io/projected/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-kube-api-access-zcd76\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.310167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.413485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.413873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.413930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.413981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcd76\" (UniqueName: \"kubernetes.io/projected/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-kube-api-access-zcd76\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: 
\"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.414628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.416711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.418149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.418762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.419650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.420424 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.420704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.422788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.424828 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.425551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.425917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.433191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcd76\" (UniqueName: \"kubernetes.io/projected/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-kube-api-access-zcd76\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.509947 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:10 crc kubenswrapper[4707]: I0121 16:28:10.852872 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv"] Jan 21 16:28:10 crc kubenswrapper[4707]: W0121 16:28:10.853951 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4672bedb_06ea_4f94_a3de_ad73fe7b9aa9.slice/crio-b797b1d6384d63f3982b664d1b8fe670def38877347f1c8af2cd8a36d68f2917 WatchSource:0}: Error finding container b797b1d6384d63f3982b664d1b8fe670def38877347f1c8af2cd8a36d68f2917: Status 404 returned error can't find the container with id b797b1d6384d63f3982b664d1b8fe670def38877347f1c8af2cd8a36d68f2917 Jan 21 16:28:11 crc kubenswrapper[4707]: I0121 16:28:11.172341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" event={"ID":"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9","Type":"ContainerStarted","Data":"b797b1d6384d63f3982b664d1b8fe670def38877347f1c8af2cd8a36d68f2917"} Jan 21 16:28:12 crc kubenswrapper[4707]: I0121 16:28:12.181183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" event={"ID":"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9","Type":"ContainerStarted","Data":"147447999e8e80213c3c0eca95b272b55098b7ca6055c009122ec88babe44cfe"} Jan 21 16:28:12 crc kubenswrapper[4707]: I0121 16:28:12.194083 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" podStartSLOduration=1.650971156 podStartE2EDuration="2.194066177s" podCreationTimestamp="2026-01-21 16:28:10 +0000 UTC" firstStartedPulling="2026-01-21 16:28:10.855765513 +0000 UTC m=+5188.037281735" lastFinishedPulling="2026-01-21 16:28:11.398860533 +0000 UTC m=+5188.580376756" observedRunningTime="2026-01-21 16:28:12.191944527 +0000 UTC m=+5189.373460749" watchObservedRunningTime="2026-01-21 16:28:12.194066177 +0000 UTC m=+5189.375582398" Jan 21 16:28:13 crc kubenswrapper[4707]: I0121 16:28:13.199707 4707 generic.go:334] "Generic (PLEG): container finished" podID="4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" containerID="147447999e8e80213c3c0eca95b272b55098b7ca6055c009122ec88babe44cfe" exitCode=0 Jan 21 16:28:13 crc kubenswrapper[4707]: I0121 16:28:13.200582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" event={"ID":"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9","Type":"ContainerDied","Data":"147447999e8e80213c3c0eca95b272b55098b7ca6055c009122ec88babe44cfe"} Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.406365 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.558888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-bootstrap-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.558944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-nova-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.558997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ovn-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-inventory\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-sriov-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-ovn-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-libvirt-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-dhcp-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcd76\" (UniqueName: \"kubernetes.io/projected/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-kube-api-access-zcd76\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559981 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-metadata-combined-ca-bundle\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.559998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ssh-key-edpm-compute-no-nodes\") pod \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\" (UID: \"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9\") " Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.563863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-kube-api-access-zcd76" (OuterVolumeSpecName: "kube-api-access-zcd76") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "kube-api-access-zcd76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.564637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.575895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-inventory" (OuterVolumeSpecName: "inventory") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.576420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" (UID: "4672bedb-06ea-4f94-a3de-ad73fe7b9aa9"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661294 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661322 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661332 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661343 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcd76\" (UniqueName: \"kubernetes.io/projected/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-kube-api-access-zcd76\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661353 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661361 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661369 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661378 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661386 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661394 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:14 crc kubenswrapper[4707]: I0121 16:28:14.661402 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.211955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" 
event={"ID":"4672bedb-06ea-4f94-a3de-ad73fe7b9aa9","Type":"ContainerDied","Data":"b797b1d6384d63f3982b664d1b8fe670def38877347f1c8af2cd8a36d68f2917"} Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.211985 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.211991 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b797b1d6384d63f3982b664d1b8fe670def38877347f1c8af2cd8a36d68f2917" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.246715 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml"] Jan 21 16:28:15 crc kubenswrapper[4707]: E0121 16:28:15.246996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" containerName="install-certs-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.247014 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" containerName="install-certs-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.247153 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" containerName="install-certs-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.247545 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.250357 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.250467 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.250691 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.250843 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.251572 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.251668 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.255568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml"] Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.265828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc 
kubenswrapper[4707]: I0121 16:28:15.265870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.265890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.266030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.266113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4tc\" (UniqueName: \"kubernetes.io/projected/20dfb569-8130-43fd-bbac-8b5ec8a01c82-kube-api-access-pz4tc\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.366991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.367066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4tc\" (UniqueName: \"kubernetes.io/projected/20dfb569-8130-43fd-bbac-8b5ec8a01c82-kube-api-access-pz4tc\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.367137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.367164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ssh-key-edpm-compute-no-nodes\") pod 
\"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.367183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.367913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.369825 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.370028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.369831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.380886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4tc\" (UniqueName: \"kubernetes.io/projected/20dfb569-8130-43fd-bbac-8b5ec8a01c82-kube-api-access-pz4tc\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.562967 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:15 crc kubenswrapper[4707]: I0121 16:28:15.915120 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml"] Jan 21 16:28:15 crc kubenswrapper[4707]: W0121 16:28:15.915783 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20dfb569_8130_43fd_bbac_8b5ec8a01c82.slice/crio-f1b84af6e72f9e411b4406c976fd4472107d37dd0bea49e831256634c77c31a8 WatchSource:0}: Error finding container f1b84af6e72f9e411b4406c976fd4472107d37dd0bea49e831256634c77c31a8: Status 404 returned error can't find the container with id f1b84af6e72f9e411b4406c976fd4472107d37dd0bea49e831256634c77c31a8 Jan 21 16:28:16 crc kubenswrapper[4707]: I0121 16:28:16.218785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" event={"ID":"20dfb569-8130-43fd-bbac-8b5ec8a01c82","Type":"ContainerStarted","Data":"f1b84af6e72f9e411b4406c976fd4472107d37dd0bea49e831256634c77c31a8"} Jan 21 16:28:17 crc kubenswrapper[4707]: I0121 16:28:17.225049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" event={"ID":"20dfb569-8130-43fd-bbac-8b5ec8a01c82","Type":"ContainerStarted","Data":"601cb649389570e5786f7f85e3097ceffc42b50838abdc66cc396bbb421d8617"} Jan 21 16:28:17 crc kubenswrapper[4707]: I0121 16:28:17.237451 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" podStartSLOduration=1.722027167 podStartE2EDuration="2.237439896s" podCreationTimestamp="2026-01-21 16:28:15 +0000 UTC" firstStartedPulling="2026-01-21 16:28:15.917593232 +0000 UTC m=+5193.099109453" lastFinishedPulling="2026-01-21 16:28:16.433005961 +0000 UTC m=+5193.614522182" observedRunningTime="2026-01-21 16:28:17.236165639 +0000 UTC m=+5194.417681861" watchObservedRunningTime="2026-01-21 16:28:17.237439896 +0000 UTC m=+5194.418956107" Jan 21 16:28:18 crc kubenswrapper[4707]: I0121 16:28:18.232554 4707 generic.go:334] "Generic (PLEG): container finished" podID="20dfb569-8130-43fd-bbac-8b5ec8a01c82" containerID="601cb649389570e5786f7f85e3097ceffc42b50838abdc66cc396bbb421d8617" exitCode=0 Jan 21 16:28:18 crc kubenswrapper[4707]: I0121 16:28:18.232600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" event={"ID":"20dfb569-8130-43fd-bbac-8b5ec8a01c82","Type":"ContainerDied","Data":"601cb649389570e5786f7f85e3097ceffc42b50838abdc66cc396bbb421d8617"} Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.453430 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.613916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz4tc\" (UniqueName: \"kubernetes.io/projected/20dfb569-8130-43fd-bbac-8b5ec8a01c82-kube-api-access-pz4tc\") pod \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.613960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovn-combined-ca-bundle\") pod \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.613990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovncontroller-config-0\") pod \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.614009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ssh-key-edpm-compute-no-nodes\") pod \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.614066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-inventory\") pod \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\" (UID: \"20dfb569-8130-43fd-bbac-8b5ec8a01c82\") " Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.620537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "20dfb569-8130-43fd-bbac-8b5ec8a01c82" (UID: "20dfb569-8130-43fd-bbac-8b5ec8a01c82"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.620716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20dfb569-8130-43fd-bbac-8b5ec8a01c82-kube-api-access-pz4tc" (OuterVolumeSpecName: "kube-api-access-pz4tc") pod "20dfb569-8130-43fd-bbac-8b5ec8a01c82" (UID: "20dfb569-8130-43fd-bbac-8b5ec8a01c82"). InnerVolumeSpecName "kube-api-access-pz4tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.629649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "20dfb569-8130-43fd-bbac-8b5ec8a01c82" (UID: "20dfb569-8130-43fd-bbac-8b5ec8a01c82"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.630667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "20dfb569-8130-43fd-bbac-8b5ec8a01c82" (UID: "20dfb569-8130-43fd-bbac-8b5ec8a01c82"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.631131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-inventory" (OuterVolumeSpecName: "inventory") pod "20dfb569-8130-43fd-bbac-8b5ec8a01c82" (UID: "20dfb569-8130-43fd-bbac-8b5ec8a01c82"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.715394 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.715422 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz4tc\" (UniqueName: \"kubernetes.io/projected/20dfb569-8130-43fd-bbac-8b5ec8a01c82-kube-api-access-pz4tc\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.715433 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.715442 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:19 crc kubenswrapper[4707]: I0121 16:28:19.715450 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/20dfb569-8130-43fd-bbac-8b5ec8a01c82-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.247083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" event={"ID":"20dfb569-8130-43fd-bbac-8b5ec8a01c82","Type":"ContainerDied","Data":"f1b84af6e72f9e411b4406c976fd4472107d37dd0bea49e831256634c77c31a8"} Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.247119 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.247121 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b84af6e72f9e411b4406c976fd4472107d37dd0bea49e831256634c77c31a8" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.290095 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x"] Jan 21 16:28:20 crc kubenswrapper[4707]: E0121 16:28:20.290333 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dfb569-8130-43fd-bbac-8b5ec8a01c82" containerName="ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.290351 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dfb569-8130-43fd-bbac-8b5ec8a01c82" containerName="ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.290499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dfb569-8130-43fd-bbac-8b5ec8a01c82" containerName="ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.290906 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.292782 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.292933 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.293078 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.293126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.293268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.293383 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.295469 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.299983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x"] Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 
16:28:20.322487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322535 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvxt\" (UniqueName: \"kubernetes.io/projected/53297bd4-be53-44fe-947a-b3651e24c887-kube-api-access-6tvxt\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.322877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 
21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvxt\" (UniqueName: \"kubernetes.io/projected/53297bd4-be53-44fe-947a-b3651e24c887-kube-api-access-6tvxt\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.423927 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.427301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.427301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.427570 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.429234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.429661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.429792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.431163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.437515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvxt\" (UniqueName: \"kubernetes.io/projected/53297bd4-be53-44fe-947a-b3651e24c887-kube-api-access-6tvxt\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.605330 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:20 crc kubenswrapper[4707]: I0121 16:28:20.952589 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x"] Jan 21 16:28:20 crc kubenswrapper[4707]: W0121 16:28:20.955092 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53297bd4_be53_44fe_947a_b3651e24c887.slice/crio-5e2186a29eda5e22ee677d04c10c05aefe93cd00ea9603be2445ed3421e3f07c WatchSource:0}: Error finding container 5e2186a29eda5e22ee677d04c10c05aefe93cd00ea9603be2445ed3421e3f07c: Status 404 returned error can't find the container with id 5e2186a29eda5e22ee677d04c10c05aefe93cd00ea9603be2445ed3421e3f07c Jan 21 16:28:21 crc kubenswrapper[4707]: I0121 16:28:21.254212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" event={"ID":"53297bd4-be53-44fe-947a-b3651e24c887","Type":"ContainerStarted","Data":"5e2186a29eda5e22ee677d04c10c05aefe93cd00ea9603be2445ed3421e3f07c"} Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.143119 4707 scope.go:117] "RemoveContainer" containerID="aa25f10913d0af191d7c72475ca20b7c364d06ad6fc1216c00c1d82286658039" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.162087 4707 scope.go:117] "RemoveContainer" containerID="f4d16a658e738624a800253bf5c7aba49d7ef8d5dd35b97fd4f311d411603422" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.186833 4707 scope.go:117] "RemoveContainer" containerID="422a08188411b88a02a2f56165f30287f2ae1ea6fdb95c76331b337b120419af" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.207354 4707 scope.go:117] "RemoveContainer" containerID="650c72d467b954d2c050226db50c595912756138dcfa5f55014e5641e4f38259" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.226165 4707 scope.go:117] "RemoveContainer" containerID="5091d466d1f58f629fcfd4ffcfa33e247d8aa0fd1063c62a57b4950a23e31c7d" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.243203 4707 scope.go:117] "RemoveContainer" containerID="d5d82a9205f2fb2774d5847ad30470ce4265835df17c9aa1f56d6478cd2fc1ba" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.264201 4707 scope.go:117] "RemoveContainer" containerID="8ff4482037b4612eb13a12cdf7e8da466957ff38a18590f74a1ba798a7133659" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.268138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" event={"ID":"53297bd4-be53-44fe-947a-b3651e24c887","Type":"ContainerStarted","Data":"e7922204606ecf61a20f27d8a0c7d3c98f45c9318e01399fc1847ead41795901"} Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.286027 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" podStartSLOduration=1.826326975 podStartE2EDuration="2.286009435s" podCreationTimestamp="2026-01-21 16:28:20 +0000 UTC" firstStartedPulling="2026-01-21 16:28:20.956873274 +0000 UTC m=+5198.138389496" lastFinishedPulling="2026-01-21 16:28:21.416555734 +0000 UTC m=+5198.598071956" observedRunningTime="2026-01-21 16:28:22.278921086 +0000 UTC m=+5199.460437309" watchObservedRunningTime="2026-01-21 16:28:22.286009435 +0000 UTC m=+5199.467525657" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.287831 4707 scope.go:117] "RemoveContainer" containerID="bf733a64e88d989b98d87333dd4e59224f43b7cebef520553b6952eccdc1a1e7" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.306337 4707 scope.go:117] "RemoveContainer" containerID="601258fd340ef99dd30bb7dc8fd432200df8ce3c3b793750c9a5e21783f3d36b" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.325527 4707 scope.go:117] "RemoveContainer" containerID="a837f6f505eead11786f15192713acef33089b8b0f835a85f7317f2a4474d7b6" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.346931 4707 scope.go:117] "RemoveContainer" containerID="b7f8026af2dd64655fab311721cd723e934b382da4deec287308ef00bb9dd160" Jan 21 16:28:22 crc kubenswrapper[4707]: I0121 16:28:22.368184 4707 scope.go:117] "RemoveContainer" containerID="0b094ec3ec50cf0f292a11cda352b4f49d0f95fd6b097d1ad3eb6efe76e6b7ce" Jan 21 16:28:23 crc kubenswrapper[4707]: I0121 16:28:23.276184 4707 generic.go:334] "Generic (PLEG): container finished" podID="53297bd4-be53-44fe-947a-b3651e24c887" containerID="e7922204606ecf61a20f27d8a0c7d3c98f45c9318e01399fc1847ead41795901" exitCode=0 Jan 21 16:28:23 crc kubenswrapper[4707]: I0121 16:28:23.276227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" event={"ID":"53297bd4-be53-44fe-947a-b3651e24c887","Type":"ContainerDied","Data":"e7922204606ecf61a20f27d8a0c7d3c98f45c9318e01399fc1847ead41795901"} Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.498483 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-inventory\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-1\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-ssh-key-edpm-compute-no-nodes\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-2\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-0\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvxt\" (UniqueName: \"kubernetes.io/projected/53297bd4-be53-44fe-947a-b3651e24c887-kube-api-access-6tvxt\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.672755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-metadata-combined-ca-bundle\") pod \"53297bd4-be53-44fe-947a-b3651e24c887\" (UID: \"53297bd4-be53-44fe-947a-b3651e24c887\") " Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.677225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.677233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53297bd4-be53-44fe-947a-b3651e24c887-kube-api-access-6tvxt" (OuterVolumeSpecName: "kube-api-access-6tvxt") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "kube-api-access-6tvxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.689217 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.689757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.690327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-inventory" (OuterVolumeSpecName: "inventory") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.690474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.690684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.690713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "53297bd4-be53-44fe-947a-b3651e24c887" (UID: "53297bd4-be53-44fe-947a-b3651e24c887"). InnerVolumeSpecName "nova-metadata-neutron-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774550 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774579 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774590 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774600 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774610 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774620 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvxt\" (UniqueName: \"kubernetes.io/projected/53297bd4-be53-44fe-947a-b3651e24c887-kube-api-access-6tvxt\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774629 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:24 crc kubenswrapper[4707]: I0121 16:28:24.774639 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53297bd4-be53-44fe-947a-b3651e24c887-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.289391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" event={"ID":"53297bd4-be53-44fe-947a-b3651e24c887","Type":"ContainerDied","Data":"5e2186a29eda5e22ee677d04c10c05aefe93cd00ea9603be2445ed3421e3f07c"} Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.289425 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e2186a29eda5e22ee677d04c10c05aefe93cd00ea9603be2445ed3421e3f07c" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.289438 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.338007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq"] Jan 21 16:28:25 crc kubenswrapper[4707]: E0121 16:28:25.338470 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53297bd4-be53-44fe-947a-b3651e24c887" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.338490 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53297bd4-be53-44fe-947a-b3651e24c887" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.338818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53297bd4-be53-44fe-947a-b3651e24c887" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.341159 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.343378 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.343732 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.343851 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.343895 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.344391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.344445 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.350314 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq"] Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.482676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.482724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.482774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.482801 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.482986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvvj\" (UniqueName: \"kubernetes.io/projected/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-kube-api-access-gzvvj\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.584336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvvj\" (UniqueName: \"kubernetes.io/projected/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-kube-api-access-gzvvj\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.584427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.584459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.584487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.584509 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.587386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.587427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.588306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.588350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.597849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvvj\" (UniqueName: \"kubernetes.io/projected/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-kube-api-access-gzvvj\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:25 crc kubenswrapper[4707]: I0121 16:28:25.655050 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:26 crc kubenswrapper[4707]: I0121 16:28:26.008428 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq"] Jan 21 16:28:26 crc kubenswrapper[4707]: W0121 16:28:26.009373 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a8ee7f9_bd06_4dce_99f6_0d452b4ab70e.slice/crio-9682cf8cc12e0ce108b0232b27d3683069f820396321a640c7661a5f3fd6576f WatchSource:0}: Error finding container 9682cf8cc12e0ce108b0232b27d3683069f820396321a640c7661a5f3fd6576f: Status 404 returned error can't find the container with id 9682cf8cc12e0ce108b0232b27d3683069f820396321a640c7661a5f3fd6576f Jan 21 16:28:26 crc kubenswrapper[4707]: I0121 16:28:26.301753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" event={"ID":"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e","Type":"ContainerStarted","Data":"9682cf8cc12e0ce108b0232b27d3683069f820396321a640c7661a5f3fd6576f"} Jan 21 16:28:27 crc kubenswrapper[4707]: I0121 16:28:27.309577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" event={"ID":"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e","Type":"ContainerStarted","Data":"364b4948973dd0ac9ed3552d54641cd50c5c504f89a149ff52ada4dbfa882795"} Jan 21 16:28:27 crc kubenswrapper[4707]: I0121 16:28:27.320840 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" podStartSLOduration=1.812127341 podStartE2EDuration="2.320825494s" podCreationTimestamp="2026-01-21 16:28:25 +0000 UTC" firstStartedPulling="2026-01-21 16:28:26.010975527 +0000 UTC m=+5203.192491749" lastFinishedPulling="2026-01-21 16:28:26.51967368 +0000 UTC m=+5203.701189902" observedRunningTime="2026-01-21 16:28:27.320014369 +0000 UTC m=+5204.501530590" watchObservedRunningTime="2026-01-21 16:28:27.320825494 +0000 UTC m=+5204.502341715" Jan 21 16:28:28 crc kubenswrapper[4707]: I0121 16:28:28.316459 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" containerID="364b4948973dd0ac9ed3552d54641cd50c5c504f89a149ff52ada4dbfa882795" exitCode=0 Jan 21 16:28:28 crc kubenswrapper[4707]: I0121 16:28:28.316491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" event={"ID":"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e","Type":"ContainerDied","Data":"364b4948973dd0ac9ed3552d54641cd50c5c504f89a149ff52ada4dbfa882795"} Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.528928 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.632242 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-combined-ca-bundle\") pod \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.632357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzvvj\" (UniqueName: \"kubernetes.io/projected/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-kube-api-access-gzvvj\") pod \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.632386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-agent-neutron-config-0\") pod \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.632449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-inventory\") pod \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.632491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-ssh-key-edpm-compute-no-nodes\") pod \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\" (UID: \"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e\") " Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.636138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" (UID: "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.636277 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-kube-api-access-gzvvj" (OuterVolumeSpecName: "kube-api-access-gzvvj") pod "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" (UID: "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e"). InnerVolumeSpecName "kube-api-access-gzvvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.647575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-inventory" (OuterVolumeSpecName: "inventory") pod "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" (UID: "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.648296 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" (UID: "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.648312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" (UID: "0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.734195 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.734228 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.734239 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.734250 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:29 crc kubenswrapper[4707]: I0121 16:28:29.734261 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzvvj\" (UniqueName: \"kubernetes.io/projected/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e-kube-api-access-gzvvj\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.328669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" event={"ID":"0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e","Type":"ContainerDied","Data":"9682cf8cc12e0ce108b0232b27d3683069f820396321a640c7661a5f3fd6576f"} Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.328894 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9682cf8cc12e0ce108b0232b27d3683069f820396321a640c7661a5f3fd6576f" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.328718 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.376625 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4"] Jan 21 16:28:30 crc kubenswrapper[4707]: E0121 16:28:30.376936 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.376954 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.377094 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.377524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.380150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.380166 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.380316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.380348 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.380529 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.381514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.384648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4"] Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.442838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.442931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " 
pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.442959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.443006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.443063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lstr7\" (UniqueName: \"kubernetes.io/projected/cd07d192-5f26-4922-b644-dee8adc64158-kube-api-access-lstr7\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.544778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.544853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lstr7\" (UniqueName: \"kubernetes.io/projected/cd07d192-5f26-4922-b644-dee8adc64158-kube-api-access-lstr7\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.544928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.544995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc 
kubenswrapper[4707]: I0121 16:28:30.545027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.548379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.548394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.548380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.548396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.558105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lstr7\" (UniqueName: \"kubernetes.io/projected/cd07d192-5f26-4922-b644-dee8adc64158-kube-api-access-lstr7\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:30 crc kubenswrapper[4707]: I0121 16:28:30.693177 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:31 crc kubenswrapper[4707]: I0121 16:28:31.035714 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4"] Jan 21 16:28:31 crc kubenswrapper[4707]: W0121 16:28:31.038103 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd07d192_5f26_4922_b644_dee8adc64158.slice/crio-d335ef21b3329c7518dd9f5bd231f50d14528618e40866f085e9a9753566050a WatchSource:0}: Error finding container d335ef21b3329c7518dd9f5bd231f50d14528618e40866f085e9a9753566050a: Status 404 returned error can't find the container with id d335ef21b3329c7518dd9f5bd231f50d14528618e40866f085e9a9753566050a Jan 21 16:28:31 crc kubenswrapper[4707]: I0121 16:28:31.336500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" event={"ID":"cd07d192-5f26-4922-b644-dee8adc64158","Type":"ContainerStarted","Data":"d335ef21b3329c7518dd9f5bd231f50d14528618e40866f085e9a9753566050a"} Jan 21 16:28:32 crc kubenswrapper[4707]: I0121 16:28:32.350437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" event={"ID":"cd07d192-5f26-4922-b644-dee8adc64158","Type":"ContainerStarted","Data":"df49a057d7fa82fbce3996c614e3ff4dd75f687fbc0ad5227d302ba47563aa32"} Jan 21 16:28:32 crc kubenswrapper[4707]: I0121 16:28:32.366943 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" podStartSLOduration=1.8429611609999998 podStartE2EDuration="2.366926201s" podCreationTimestamp="2026-01-21 16:28:30 +0000 UTC" firstStartedPulling="2026-01-21 16:28:31.040068124 +0000 UTC m=+5208.221584346" lastFinishedPulling="2026-01-21 16:28:31.564033154 +0000 UTC m=+5208.745549386" observedRunningTime="2026-01-21 16:28:32.359372948 +0000 UTC m=+5209.540889170" watchObservedRunningTime="2026-01-21 16:28:32.366926201 +0000 UTC m=+5209.548442423" Jan 21 16:28:33 crc kubenswrapper[4707]: I0121 16:28:33.357827 4707 generic.go:334] "Generic (PLEG): container finished" podID="cd07d192-5f26-4922-b644-dee8adc64158" containerID="df49a057d7fa82fbce3996c614e3ff4dd75f687fbc0ad5227d302ba47563aa32" exitCode=0 Jan 21 16:28:33 crc kubenswrapper[4707]: I0121 16:28:33.357924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" event={"ID":"cd07d192-5f26-4922-b644-dee8adc64158","Type":"ContainerDied","Data":"df49a057d7fa82fbce3996c614e3ff4dd75f687fbc0ad5227d302ba47563aa32"} Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.562119 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.592679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-combined-ca-bundle\") pod \"cd07d192-5f26-4922-b644-dee8adc64158\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.592725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-agent-neutron-config-0\") pod \"cd07d192-5f26-4922-b644-dee8adc64158\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.592786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-ssh-key-edpm-compute-no-nodes\") pod \"cd07d192-5f26-4922-b644-dee8adc64158\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.592856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-inventory\") pod \"cd07d192-5f26-4922-b644-dee8adc64158\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.592885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lstr7\" (UniqueName: \"kubernetes.io/projected/cd07d192-5f26-4922-b644-dee8adc64158-kube-api-access-lstr7\") pod \"cd07d192-5f26-4922-b644-dee8adc64158\" (UID: \"cd07d192-5f26-4922-b644-dee8adc64158\") " Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.596570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "cd07d192-5f26-4922-b644-dee8adc64158" (UID: "cd07d192-5f26-4922-b644-dee8adc64158"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.596632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd07d192-5f26-4922-b644-dee8adc64158-kube-api-access-lstr7" (OuterVolumeSpecName: "kube-api-access-lstr7") pod "cd07d192-5f26-4922-b644-dee8adc64158" (UID: "cd07d192-5f26-4922-b644-dee8adc64158"). InnerVolumeSpecName "kube-api-access-lstr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.607300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-inventory" (OuterVolumeSpecName: "inventory") pod "cd07d192-5f26-4922-b644-dee8adc64158" (UID: "cd07d192-5f26-4922-b644-dee8adc64158"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.608346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "cd07d192-5f26-4922-b644-dee8adc64158" (UID: "cd07d192-5f26-4922-b644-dee8adc64158"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.609026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "cd07d192-5f26-4922-b644-dee8adc64158" (UID: "cd07d192-5f26-4922-b644-dee8adc64158"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.694461 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.694486 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.694498 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.694510 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd07d192-5f26-4922-b644-dee8adc64158-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:34 crc kubenswrapper[4707]: I0121 16:28:34.694521 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lstr7\" (UniqueName: \"kubernetes.io/projected/cd07d192-5f26-4922-b644-dee8adc64158-kube-api-access-lstr7\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.371158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" event={"ID":"cd07d192-5f26-4922-b644-dee8adc64158","Type":"ContainerDied","Data":"d335ef21b3329c7518dd9f5bd231f50d14528618e40866f085e9a9753566050a"} Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.371199 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d335ef21b3329c7518dd9f5bd231f50d14528618e40866f085e9a9753566050a" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.371208 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.407670 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg"] Jan 21 16:28:35 crc kubenswrapper[4707]: E0121 16:28:35.407932 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd07d192-5f26-4922-b644-dee8adc64158" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.407949 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd07d192-5f26-4922-b644-dee8adc64158" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.408090 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd07d192-5f26-4922-b644-dee8adc64158" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.408509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.409981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.410083 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.410109 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.410128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.410377 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.412036 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.416847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg"] Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.503347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.503401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " 
pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.503443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9kq\" (UniqueName: \"kubernetes.io/projected/dab0aa71-64ec-474e-8857-e5e6be8a1eff-kube-api-access-tk9kq\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.503531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.503570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.604614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.604668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.604690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.604710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 
16:28:35.604740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9kq\" (UniqueName: \"kubernetes.io/projected/dab0aa71-64ec-474e-8857-e5e6be8a1eff-kube-api-access-tk9kq\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.608369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.608372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.608946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.609505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.617458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9kq\" (UniqueName: \"kubernetes.io/projected/dab0aa71-64ec-474e-8857-e5e6be8a1eff-kube-api-access-tk9kq\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:35 crc kubenswrapper[4707]: I0121 16:28:35.719227 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:36 crc kubenswrapper[4707]: I0121 16:28:36.073854 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg"] Jan 21 16:28:36 crc kubenswrapper[4707]: W0121 16:28:36.074580 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab0aa71_64ec_474e_8857_e5e6be8a1eff.slice/crio-23f5b927c5f5562093a5d3365cc87a1a188b1ddc18aa9fc0eb9aafda1f9a277e WatchSource:0}: Error finding container 23f5b927c5f5562093a5d3365cc87a1a188b1ddc18aa9fc0eb9aafda1f9a277e: Status 404 returned error can't find the container with id 23f5b927c5f5562093a5d3365cc87a1a188b1ddc18aa9fc0eb9aafda1f9a277e Jan 21 16:28:36 crc kubenswrapper[4707]: I0121 16:28:36.378918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" event={"ID":"dab0aa71-64ec-474e-8857-e5e6be8a1eff","Type":"ContainerStarted","Data":"23f5b927c5f5562093a5d3365cc87a1a188b1ddc18aa9fc0eb9aafda1f9a277e"} Jan 21 16:28:37 crc kubenswrapper[4707]: I0121 16:28:37.386503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" event={"ID":"dab0aa71-64ec-474e-8857-e5e6be8a1eff","Type":"ContainerStarted","Data":"4d385ff0fd544a8fd564aa4e61639b316b6a512252171f9b1a32d9dfee33ed90"} Jan 21 16:28:37 crc kubenswrapper[4707]: I0121 16:28:37.399823 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" podStartSLOduration=1.749816115 podStartE2EDuration="2.399798454s" podCreationTimestamp="2026-01-21 16:28:35 +0000 UTC" firstStartedPulling="2026-01-21 16:28:36.076571514 +0000 UTC m=+5213.258087737" lastFinishedPulling="2026-01-21 16:28:36.726553855 +0000 UTC m=+5213.908070076" observedRunningTime="2026-01-21 16:28:37.398112324 +0000 UTC m=+5214.579628546" watchObservedRunningTime="2026-01-21 16:28:37.399798454 +0000 UTC m=+5214.581314676" Jan 21 16:28:38 crc kubenswrapper[4707]: I0121 16:28:38.393489 4707 generic.go:334] "Generic (PLEG): container finished" podID="dab0aa71-64ec-474e-8857-e5e6be8a1eff" containerID="4d385ff0fd544a8fd564aa4e61639b316b6a512252171f9b1a32d9dfee33ed90" exitCode=0 Jan 21 16:28:38 crc kubenswrapper[4707]: I0121 16:28:38.393524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" event={"ID":"dab0aa71-64ec-474e-8857-e5e6be8a1eff","Type":"ContainerDied","Data":"4d385ff0fd544a8fd564aa4e61639b316b6a512252171f9b1a32d9dfee33ed90"} Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.595162 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.652969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-inventory\") pod \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.653249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk9kq\" (UniqueName: \"kubernetes.io/projected/dab0aa71-64ec-474e-8857-e5e6be8a1eff-kube-api-access-tk9kq\") pod \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.653276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-ssh-key-edpm-compute-no-nodes\") pod \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.653374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-combined-ca-bundle\") pod \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.653426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-agent-neutron-config-0\") pod \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\" (UID: \"dab0aa71-64ec-474e-8857-e5e6be8a1eff\") " Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.657090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "dab0aa71-64ec-474e-8857-e5e6be8a1eff" (UID: "dab0aa71-64ec-474e-8857-e5e6be8a1eff"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.657119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab0aa71-64ec-474e-8857-e5e6be8a1eff-kube-api-access-tk9kq" (OuterVolumeSpecName: "kube-api-access-tk9kq") pod "dab0aa71-64ec-474e-8857-e5e6be8a1eff" (UID: "dab0aa71-64ec-474e-8857-e5e6be8a1eff"). InnerVolumeSpecName "kube-api-access-tk9kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.668691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-inventory" (OuterVolumeSpecName: "inventory") pod "dab0aa71-64ec-474e-8857-e5e6be8a1eff" (UID: "dab0aa71-64ec-474e-8857-e5e6be8a1eff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.668887 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "dab0aa71-64ec-474e-8857-e5e6be8a1eff" (UID: "dab0aa71-64ec-474e-8857-e5e6be8a1eff"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.669646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "dab0aa71-64ec-474e-8857-e5e6be8a1eff" (UID: "dab0aa71-64ec-474e-8857-e5e6be8a1eff"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.755162 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.755192 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.755203 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.755216 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk9kq\" (UniqueName: \"kubernetes.io/projected/dab0aa71-64ec-474e-8857-e5e6be8a1eff-kube-api-access-tk9kq\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.755227 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/dab0aa71-64ec-474e-8857-e5e6be8a1eff-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.945954 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:28:39 crc kubenswrapper[4707]: I0121 16:28:39.946003 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.408969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" 
event={"ID":"dab0aa71-64ec-474e-8857-e5e6be8a1eff","Type":"ContainerDied","Data":"23f5b927c5f5562093a5d3365cc87a1a188b1ddc18aa9fc0eb9aafda1f9a277e"} Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.409006 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f5b927c5f5562093a5d3365cc87a1a188b1ddc18aa9fc0eb9aafda1f9a277e" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.409050 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.447466 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc"] Jan 21 16:28:40 crc kubenswrapper[4707]: E0121 16:28:40.447707 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0aa71-64ec-474e-8857-e5e6be8a1eff" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.447726 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0aa71-64ec-474e-8857-e5e6be8a1eff" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.447878 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab0aa71-64ec-474e-8857-e5e6be8a1eff" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.448270 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.449573 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.449765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.450037 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.450057 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.450075 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.450244 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.454412 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc"] Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.565464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc 
kubenswrapper[4707]: I0121 16:28:40.565519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.565588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9df9\" (UniqueName: \"kubernetes.io/projected/e998e26c-e225-4202-9d3e-b90d2cddc8f3-kube-api-access-d9df9\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.565614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.565676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.666718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9df9\" (UniqueName: \"kubernetes.io/projected/e998e26c-e225-4202-9d3e-b90d2cddc8f3-kube-api-access-d9df9\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.666761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.666788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.666885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0\") pod 
\"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.666913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.669896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.669896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.670079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.670538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.679513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9df9\" (UniqueName: \"kubernetes.io/projected/e998e26c-e225-4202-9d3e-b90d2cddc8f3-kube-api-access-d9df9\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:40 crc kubenswrapper[4707]: I0121 16:28:40.760380 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:41 crc kubenswrapper[4707]: I0121 16:28:41.114275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc"] Jan 21 16:28:41 crc kubenswrapper[4707]: W0121 16:28:41.115999 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode998e26c_e225_4202_9d3e_b90d2cddc8f3.slice/crio-02d4b8d691e97815feda8c2c015d57ce4ffa13e10c30ee819a4d75303a4a5e72 WatchSource:0}: Error finding container 02d4b8d691e97815feda8c2c015d57ce4ffa13e10c30ee819a4d75303a4a5e72: Status 404 returned error can't find the container with id 02d4b8d691e97815feda8c2c015d57ce4ffa13e10c30ee819a4d75303a4a5e72 Jan 21 16:28:41 crc kubenswrapper[4707]: I0121 16:28:41.415756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" event={"ID":"e998e26c-e225-4202-9d3e-b90d2cddc8f3","Type":"ContainerStarted","Data":"02d4b8d691e97815feda8c2c015d57ce4ffa13e10c30ee819a4d75303a4a5e72"} Jan 21 16:28:42 crc kubenswrapper[4707]: I0121 16:28:42.424033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" event={"ID":"e998e26c-e225-4202-9d3e-b90d2cddc8f3","Type":"ContainerStarted","Data":"000286ff363eea2f8b6a083f6b5e81ef6210f8522987957ba61eb6a424ae9a8c"} Jan 21 16:28:42 crc kubenswrapper[4707]: I0121 16:28:42.436537 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" podStartSLOduration=1.937800033 podStartE2EDuration="2.436526087s" podCreationTimestamp="2026-01-21 16:28:40 +0000 UTC" firstStartedPulling="2026-01-21 16:28:41.117648897 +0000 UTC m=+5218.299165118" lastFinishedPulling="2026-01-21 16:28:41.61637495 +0000 UTC m=+5218.797891172" observedRunningTime="2026-01-21 16:28:42.433250468 +0000 UTC m=+5219.614766690" watchObservedRunningTime="2026-01-21 16:28:42.436526087 +0000 UTC m=+5219.618042309" Jan 21 16:28:43 crc kubenswrapper[4707]: I0121 16:28:43.431859 4707 generic.go:334] "Generic (PLEG): container finished" podID="e998e26c-e225-4202-9d3e-b90d2cddc8f3" containerID="000286ff363eea2f8b6a083f6b5e81ef6210f8522987957ba61eb6a424ae9a8c" exitCode=0 Jan 21 16:28:43 crc kubenswrapper[4707]: I0121 16:28:43.431959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" event={"ID":"e998e26c-e225-4202-9d3e-b90d2cddc8f3","Type":"ContainerDied","Data":"000286ff363eea2f8b6a083f6b5e81ef6210f8522987957ba61eb6a424ae9a8c"} Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.646734 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.711124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-ssh-key-edpm-compute-no-nodes\") pod \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.711182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-inventory\") pod \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.711256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0\") pod \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.711341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-combined-ca-bundle\") pod \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.711375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9df9\" (UniqueName: \"kubernetes.io/projected/e998e26c-e225-4202-9d3e-b90d2cddc8f3-kube-api-access-d9df9\") pod \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.716137 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e998e26c-e225-4202-9d3e-b90d2cddc8f3" (UID: "e998e26c-e225-4202-9d3e-b90d2cddc8f3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.716572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e998e26c-e225-4202-9d3e-b90d2cddc8f3-kube-api-access-d9df9" (OuterVolumeSpecName: "kube-api-access-d9df9") pod "e998e26c-e225-4202-9d3e-b90d2cddc8f3" (UID: "e998e26c-e225-4202-9d3e-b90d2cddc8f3"). InnerVolumeSpecName "kube-api-access-d9df9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:44 crc kubenswrapper[4707]: E0121 16:28:44.727823 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0 podName:e998e26c-e225-4202-9d3e-b90d2cddc8f3 nodeName:}" failed. No retries permitted until 2026-01-21 16:28:45.227784653 +0000 UTC m=+5222.409300876 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "libvirt-secret-0" (UniqueName: "kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0") pod "e998e26c-e225-4202-9d3e-b90d2cddc8f3" (UID: "e998e26c-e225-4202-9d3e-b90d2cddc8f3") : error deleting /var/lib/kubelet/pods/e998e26c-e225-4202-9d3e-b90d2cddc8f3/volume-subpaths: remove /var/lib/kubelet/pods/e998e26c-e225-4202-9d3e-b90d2cddc8f3/volume-subpaths: no such file or directory Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.729400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-inventory" (OuterVolumeSpecName: "inventory") pod "e998e26c-e225-4202-9d3e-b90d2cddc8f3" (UID: "e998e26c-e225-4202-9d3e-b90d2cddc8f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.729623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "e998e26c-e225-4202-9d3e-b90d2cddc8f3" (UID: "e998e26c-e225-4202-9d3e-b90d2cddc8f3"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.812790 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.812839 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9df9\" (UniqueName: \"kubernetes.io/projected/e998e26c-e225-4202-9d3e-b90d2cddc8f3-kube-api-access-d9df9\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.812850 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:44 crc kubenswrapper[4707]: I0121 16:28:44.812859 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.316173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0\") pod \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\" (UID: \"e998e26c-e225-4202-9d3e-b90d2cddc8f3\") " Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.318764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e998e26c-e225-4202-9d3e-b90d2cddc8f3" (UID: "e998e26c-e225-4202-9d3e-b90d2cddc8f3"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.417244 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e998e26c-e225-4202-9d3e-b90d2cddc8f3-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.445307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" event={"ID":"e998e26c-e225-4202-9d3e-b90d2cddc8f3","Type":"ContainerDied","Data":"02d4b8d691e97815feda8c2c015d57ce4ffa13e10c30ee819a4d75303a4a5e72"} Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.445338 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d4b8d691e97815feda8c2c015d57ce4ffa13e10c30ee819a4d75303a4a5e72" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.445341 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.483698 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm"] Jan 21 16:28:45 crc kubenswrapper[4707]: E0121 16:28:45.483996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e998e26c-e225-4202-9d3e-b90d2cddc8f3" containerName="libvirt-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.484028 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e998e26c-e225-4202-9d3e-b90d2cddc8f3" containerName="libvirt-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.484139 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e998e26c-e225-4202-9d3e-b90d2cddc8f3" containerName="libvirt-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.484556 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.489591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-wwcdd" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.489870 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.489910 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.490519 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.490647 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm"] Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.490676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.490852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.493221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 
16:28:45.518278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v25vq\" (UniqueName: \"kubernetes.io/projected/884cd03c-5ff2-43b7-b796-039b54658986-kube-api-access-v25vq\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.518542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-combined-ca-bundle\") pod 
\"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v25vq\" (UniqueName: \"kubernetes.io/projected/884cd03c-5ff2-43b7-b796-039b54658986-kube-api-access-v25vq\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.619304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.622555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.622555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.622925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.623281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" 
(UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.623668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.623741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.623934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.631783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v25vq\" (UniqueName: \"kubernetes.io/projected/884cd03c-5ff2-43b7-b796-039b54658986-kube-api-access-v25vq\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:45 crc kubenswrapper[4707]: I0121 16:28:45.797251 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:46 crc kubenswrapper[4707]: I0121 16:28:46.142886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm"] Jan 21 16:28:46 crc kubenswrapper[4707]: W0121 16:28:46.145444 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod884cd03c_5ff2_43b7_b796_039b54658986.slice/crio-50febb2ae24ca11c8fc4889e621046b043230b0639492cda5198883df8525e17 WatchSource:0}: Error finding container 50febb2ae24ca11c8fc4889e621046b043230b0639492cda5198883df8525e17: Status 404 returned error can't find the container with id 50febb2ae24ca11c8fc4889e621046b043230b0639492cda5198883df8525e17 Jan 21 16:28:46 crc kubenswrapper[4707]: I0121 16:28:46.459387 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" event={"ID":"884cd03c-5ff2-43b7-b796-039b54658986","Type":"ContainerStarted","Data":"50febb2ae24ca11c8fc4889e621046b043230b0639492cda5198883df8525e17"} Jan 21 16:28:47 crc kubenswrapper[4707]: I0121 16:28:47.467828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" event={"ID":"884cd03c-5ff2-43b7-b796-039b54658986","Type":"ContainerStarted","Data":"e1d1984b899ca95b672c40085601a9c986e18a9d4c78316ea159a20725baba06"} Jan 21 16:28:47 crc kubenswrapper[4707]: I0121 16:28:47.479982 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" podStartSLOduration=1.996634905 podStartE2EDuration="2.479969428s" podCreationTimestamp="2026-01-21 16:28:45 +0000 UTC" firstStartedPulling="2026-01-21 16:28:46.147075572 +0000 UTC m=+5223.328591794" lastFinishedPulling="2026-01-21 16:28:46.630410095 +0000 UTC m=+5223.811926317" observedRunningTime="2026-01-21 16:28:47.478395108 +0000 UTC m=+5224.659911329" watchObservedRunningTime="2026-01-21 16:28:47.479969428 +0000 UTC m=+5224.661485649" Jan 21 16:28:48 crc kubenswrapper[4707]: I0121 16:28:48.475468 4707 generic.go:334] "Generic (PLEG): container finished" podID="884cd03c-5ff2-43b7-b796-039b54658986" containerID="e1d1984b899ca95b672c40085601a9c986e18a9d4c78316ea159a20725baba06" exitCode=0 Jan 21 16:28:48 crc kubenswrapper[4707]: I0121 16:28:48.475543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" event={"ID":"884cd03c-5ff2-43b7-b796-039b54658986","Type":"ContainerDied","Data":"e1d1984b899ca95b672c40085601a9c986e18a9d4c78316ea159a20725baba06"} Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.680934 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-0\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767197 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-1\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v25vq\" (UniqueName: \"kubernetes.io/projected/884cd03c-5ff2-43b7-b796-039b54658986-kube-api-access-v25vq\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-inventory\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-1\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-0\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-ssh-key-edpm-compute-no-nodes\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.767369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-combined-ca-bundle\") pod \"884cd03c-5ff2-43b7-b796-039b54658986\" (UID: \"884cd03c-5ff2-43b7-b796-039b54658986\") " Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.777175 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.777195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884cd03c-5ff2-43b7-b796-039b54658986-kube-api-access-v25vq" (OuterVolumeSpecName: "kube-api-access-v25vq") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "kube-api-access-v25vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.783792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.784004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.784624 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.784966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-inventory" (OuterVolumeSpecName: "inventory") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.785245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.785300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "884cd03c-5ff2-43b7-b796-039b54658986" (UID: "884cd03c-5ff2-43b7-b796-039b54658986"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868661 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868685 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868696 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868705 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868713 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868720 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v25vq\" (UniqueName: \"kubernetes.io/projected/884cd03c-5ff2-43b7-b796-039b54658986-kube-api-access-v25vq\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868729 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:49 crc kubenswrapper[4707]: I0121 16:28:49.868737 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/884cd03c-5ff2-43b7-b796-039b54658986-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:50 crc kubenswrapper[4707]: I0121 16:28:50.489693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" event={"ID":"884cd03c-5ff2-43b7-b796-039b54658986","Type":"ContainerDied","Data":"50febb2ae24ca11c8fc4889e621046b043230b0639492cda5198883df8525e17"} Jan 21 16:28:50 crc kubenswrapper[4707]: I0121 16:28:50.489884 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50febb2ae24ca11c8fc4889e621046b043230b0639492cda5198883df8525e17" Jan 21 16:28:50 crc kubenswrapper[4707]: I0121 16:28:50.489740 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm" Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.758896 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.765436 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-8qsxl"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.771467 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.782708 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.791553 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.797436 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-2vbgl"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.803569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.809669 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-zhn6j"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.815362 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-8hfjg"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.821277 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.826977 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-hm6sq"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.834847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.846722 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.858798 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.862305 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-njjml"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.869219 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.872742 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-9shmc"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.877664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.890713 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-hd45p"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.895281 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.899647 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetwgsjt"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.904049 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.909670 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-8w4dq"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.915142 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-zxrnm"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.923695 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-7mg2x"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.927941 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-gp7k4"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.931611 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.936559 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.941221 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.945709 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.950015 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-2sztp"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.953893 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-4trqv"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.957943 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-h8rzc"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.961799 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-8n7xv"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.965647 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.969680 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.973510 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.977449 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.981560 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.985612 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.989631 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.993646 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k"] Jan 21 16:28:51 crc kubenswrapper[4707]: I0121 16:28:51.997498 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.001797 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nod99qhp"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.005791 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.009903 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.013758 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-62rbd"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.018008 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nssx5j"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.021896 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-9zg45"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.025764 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-dvxk9"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.030041 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-node44nz2"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.034487 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.038397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.055432 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-z7jpj"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.059787 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesggkgs"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.064139 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodj8c4k"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.067891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.071906 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.075583 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-7wqzl"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.079223 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.082846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-nobggq2"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.086606 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-4s84l"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.090264 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-fwcz6"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.093901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-slm9g"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.097470 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-2wcvk"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.101225 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.104797 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-47z52"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.108378 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-mbt5f"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.111950 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh"] Jan 21 16:28:52 crc kubenswrapper[4707]: E0121 16:28:52.112200 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884cd03c-5ff2-43b7-b796-039b54658986" containerName="nova-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.112218 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="884cd03c-5ff2-43b7-b796-039b54658986" containerName="nova-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.112362 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="884cd03c-5ff2-43b7-b796-039b54658986" containerName="nova-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.113073 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.115802 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.119562 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.125666 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.126911 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: E0121 16:28:52.127103 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc edpm-compute-no-nodes kube-api-access-qq6jh], unattached volumes=[], failed to process volumes=[config dnsmasq-svc edpm-compute-no-nodes kube-api-access-qq6jh]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" podUID="e26362ad-7707-425d-8c9c-6dbda1a66dd1" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.128972 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193406 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-config\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5z8h\" (UniqueName: \"kubernetes.io/projected/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-kube-api-access-d5z8h\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq6jh\" (UniqueName: \"kubernetes.io/projected/e26362ad-7707-425d-8c9c-6dbda1a66dd1-kube-api-access-qq6jh\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.193896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295220 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5z8h\" (UniqueName: \"kubernetes.io/projected/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-kube-api-access-d5z8h\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq6jh\" (UniqueName: \"kubernetes.io/projected/e26362ad-7707-425d-8c9c-6dbda1a66dd1-kube-api-access-qq6jh\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.295460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-config\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: E0121 16:28:52.295729 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-no-nodes: configmap "edpm-compute-no-nodes" not found Jan 21 16:28:52 crc kubenswrapper[4707]: E0121 16:28:52.295797 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes podName:e26362ad-7707-425d-8c9c-6dbda1a66dd1 nodeName:}" 
failed. No retries permitted until 2026-01-21 16:28:52.795781865 +0000 UTC m=+5229.977298087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "edpm-compute-no-nodes" (UniqueName: "kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes") pod "dnsmasq-dnsmasq-64864b6d57-rkxmh" (UID: "e26362ad-7707-425d-8c9c-6dbda1a66dd1") : configmap "edpm-compute-no-nodes" not found Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.296118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.296255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-config\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.296642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.296662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.310228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5z8h\" (UniqueName: \"kubernetes.io/projected/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-kube-api-access-d5z8h\") pod \"dnsmasq-dnsmasq-84b9f45d47-nv2h5\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.310254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq6jh\" (UniqueName: \"kubernetes.io/projected/e26362ad-7707-425d-8c9c-6dbda1a66dd1-kube-api-access-qq6jh\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.439344 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.505546 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.514086 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.598489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-dnsmasq-svc\") pod \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.598555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq6jh\" (UniqueName: \"kubernetes.io/projected/e26362ad-7707-425d-8c9c-6dbda1a66dd1-kube-api-access-qq6jh\") pod \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.598684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-config\") pod \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.598969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "e26362ad-7707-425d-8c9c-6dbda1a66dd1" (UID: "e26362ad-7707-425d-8c9c-6dbda1a66dd1"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.599293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-config" (OuterVolumeSpecName: "config") pod "e26362ad-7707-425d-8c9c-6dbda1a66dd1" (UID: "e26362ad-7707-425d-8c9c-6dbda1a66dd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.602357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26362ad-7707-425d-8c9c-6dbda1a66dd1-kube-api-access-qq6jh" (OuterVolumeSpecName: "kube-api-access-qq6jh") pod "e26362ad-7707-425d-8c9c-6dbda1a66dd1" (UID: "e26362ad-7707-425d-8c9c-6dbda1a66dd1"). InnerVolumeSpecName "kube-api-access-qq6jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.700573 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.700787 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.700917 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq6jh\" (UniqueName: \"kubernetes.io/projected/e26362ad-7707-425d-8c9c-6dbda1a66dd1-kube-api-access-qq6jh\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.789472 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5"] Jan 21 16:28:52 crc kubenswrapper[4707]: I0121 16:28:52.802146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-rkxmh\" (UID: \"e26362ad-7707-425d-8c9c-6dbda1a66dd1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:52 crc kubenswrapper[4707]: E0121 16:28:52.802432 4707 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-no-nodes: configmap "edpm-compute-no-nodes" not found Jan 21 16:28:52 crc kubenswrapper[4707]: E0121 16:28:52.802870 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes podName:e26362ad-7707-425d-8c9c-6dbda1a66dd1 nodeName:}" failed. No retries permitted until 2026-01-21 16:28:53.802854873 +0000 UTC m=+5230.984371095 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "edpm-compute-no-nodes" (UniqueName: "kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes") pod "dnsmasq-dnsmasq-64864b6d57-rkxmh" (UID: "e26362ad-7707-425d-8c9c-6dbda1a66dd1") : configmap "edpm-compute-no-nodes" not found Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.188881 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e" path="/var/lib/kubelet/pods/0a8ee7f9-bd06-4dce-99f6-0d452b4ab70e/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.189477 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118d7425-d41d-4f54-aac0-6795ec4b72fd" path="/var/lib/kubelet/pods/118d7425-d41d-4f54-aac0-6795ec4b72fd/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.189927 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae4f766-183b-4491-b5a9-f3376cfa5254" path="/var/lib/kubelet/pods/1ae4f766-183b-4491-b5a9-f3376cfa5254/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.190362 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cdb5ec5-58a6-435d-864d-313a4a1d5ad0" path="/var/lib/kubelet/pods/1cdb5ec5-58a6-435d-864d-313a4a1d5ad0/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.191264 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20dfb569-8130-43fd-bbac-8b5ec8a01c82" path="/var/lib/kubelet/pods/20dfb569-8130-43fd-bbac-8b5ec8a01c82/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.191692 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2314a944-4745-4b81-b6c8-160f17c0c0bc" path="/var/lib/kubelet/pods/2314a944-4745-4b81-b6c8-160f17c0c0bc/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.192134 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294b0203-a03d-48f0-92aa-9db53c0e1f58" path="/var/lib/kubelet/pods/294b0203-a03d-48f0-92aa-9db53c0e1f58/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.192935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ec1a45-1d87-4863-9013-5e4da9048cbc" path="/var/lib/kubelet/pods/29ec1a45-1d87-4863-9013-5e4da9048cbc/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.193393 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ca9ae8-28f0-476b-beb3-6374833dc9f4" path="/var/lib/kubelet/pods/34ca9ae8-28f0-476b-beb3-6374833dc9f4/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.193904 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3780a614-404b-4ba9-a981-6f0193df3c96" path="/var/lib/kubelet/pods/3780a614-404b-4ba9-a981-6f0193df3c96/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.194703 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4672bedb-06ea-4f94-a3de-ad73fe7b9aa9" path="/var/lib/kubelet/pods/4672bedb-06ea-4f94-a3de-ad73fe7b9aa9/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.195172 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e34fac-a51c-4325-8fce-914731b7f16e" path="/var/lib/kubelet/pods/46e34fac-a51c-4325-8fce-914731b7f16e/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.196768 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53297bd4-be53-44fe-947a-b3651e24c887" path="/var/lib/kubelet/pods/53297bd4-be53-44fe-947a-b3651e24c887/volumes" Jan 21 
16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.197743 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56900374-8406-4956-adfa-604c411784c7" path="/var/lib/kubelet/pods/56900374-8406-4956-adfa-604c411784c7/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.198230 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654e3eb3-d255-440f-be85-235463249784" path="/var/lib/kubelet/pods/654e3eb3-d255-440f-be85-235463249784/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.198637 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2" path="/var/lib/kubelet/pods/6aba4cd3-4e5a-4b8b-8681-67c6db7c1cc2/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.199078 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae7b48f-7890-475b-8b86-018b44293ed5" path="/var/lib/kubelet/pods/6ae7b48f-7890-475b-8b86-018b44293ed5/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.199937 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa648c7-60be-4d40-9ef8-728b91a51263" path="/var/lib/kubelet/pods/6fa648c7-60be-4d40-9ef8-728b91a51263/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.200368 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77e2ada6-fa58-437b-8e03-b89e72bf9939" path="/var/lib/kubelet/pods/77e2ada6-fa58-437b-8e03-b89e72bf9939/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.200768 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884cd03c-5ff2-43b7-b796-039b54658986" path="/var/lib/kubelet/pods/884cd03c-5ff2-43b7-b796-039b54658986/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.201560 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9889a470-1fad-4c98-a7f6-42650212e6ac" path="/var/lib/kubelet/pods/9889a470-1fad-4c98-a7f6-42650212e6ac/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.202049 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7bd3a21-683b-4488-94fa-dd1b2ff48100" path="/var/lib/kubelet/pods/a7bd3a21-683b-4488-94fa-dd1b2ff48100/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.202511 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d88374-3dc3-4e29-b45d-5ea7015beff0" path="/var/lib/kubelet/pods/b8d88374-3dc3-4e29-b45d-5ea7015beff0/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.202956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd07d192-5f26-4922-b644-dee8adc64158" path="/var/lib/kubelet/pods/cd07d192-5f26-4922-b644-dee8adc64158/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.203728 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd680393-9246-418e-a033-c2acbcf038c0" path="/var/lib/kubelet/pods/cd680393-9246-418e-a033-c2acbcf038c0/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.204270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0c6fd2-0c91-4e1d-8459-f2b41ac49806" path="/var/lib/kubelet/pods/ce0c6fd2-0c91-4e1d-8459-f2b41ac49806/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.204679 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647" path="/var/lib/kubelet/pods/d0ad0b03-e92c-4d1f-9ac3-3e83dae0c647/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 
16:28:53.205452 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d381a5d7-2629-4947-bf07-87631f779e7a" path="/var/lib/kubelet/pods/d381a5d7-2629-4947-bf07-87631f779e7a/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.205882 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab0aa71-64ec-474e-8857-e5e6be8a1eff" path="/var/lib/kubelet/pods/dab0aa71-64ec-474e-8857-e5e6be8a1eff/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.206444 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38debfc-3c8a-4248-82c1-43087acc34f7" path="/var/lib/kubelet/pods/e38debfc-3c8a-4248-82c1-43087acc34f7/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.207208 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5488638-d413-4986-9842-beaab6922631" path="/var/lib/kubelet/pods/e5488638-d413-4986-9842-beaab6922631/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.207626 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e998e26c-e225-4202-9d3e-b90d2cddc8f3" path="/var/lib/kubelet/pods/e998e26c-e225-4202-9d3e-b90d2cddc8f3/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.208065 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8023da-da1b-4316-a674-dda6aa48fed9" path="/var/lib/kubelet/pods/ee8023da-da1b-4316-a674-dda6aa48fed9/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.208462 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d" path="/var/lib/kubelet/pods/fe74bd7e-8319-431e-9a0a-bd5dcbfbe63d/volumes" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.514351 4707 generic.go:334] "Generic (PLEG): container finished" podID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerID="e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501" exitCode=0 Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.514394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" event={"ID":"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e","Type":"ContainerDied","Data":"e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501"} Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.514447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" event={"ID":"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e","Type":"ContainerStarted","Data":"0c4399411bc953454424a7846c3abffce24159b380ffab5e571e0d7e0895378c"} Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.514410 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh" Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.548298 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh"] Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.552751 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rkxmh"] Jan 21 16:28:53 crc kubenswrapper[4707]: I0121 16:28:53.610435 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/e26362ad-7707-425d-8c9c-6dbda1a66dd1-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:28:54 crc kubenswrapper[4707]: I0121 16:28:54.521564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" event={"ID":"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e","Type":"ContainerStarted","Data":"a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b"} Jan 21 16:28:54 crc kubenswrapper[4707]: I0121 16:28:54.521675 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:28:54 crc kubenswrapper[4707]: I0121 16:28:54.539767 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" podStartSLOduration=3.539752816 podStartE2EDuration="3.539752816s" podCreationTimestamp="2026-01-21 16:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:28:54.535532983 +0000 UTC m=+5231.717049204" watchObservedRunningTime="2026-01-21 16:28:54.539752816 +0000 UTC m=+5231.721269038" Jan 21 16:28:55 crc kubenswrapper[4707]: I0121 16:28:55.188759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26362ad-7707-425d-8c9c-6dbda1a66dd1" path="/var/lib/kubelet/pods/e26362ad-7707-425d-8c9c-6dbda1a66dd1/volumes" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.117165 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-v7l5q"] Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.120741 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-v7l5q"] Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.231529 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9kw5w"] Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.232475 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.234529 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.234665 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.234771 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.234775 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.237396 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9kw5w"] Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.261775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpmnn\" (UniqueName: \"kubernetes.io/projected/b66048fa-add8-493c-b737-33f3332de5e7-kube-api-access-gpmnn\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.261829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b66048fa-add8-493c-b737-33f3332de5e7-crc-storage\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.262110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b66048fa-add8-493c-b737-33f3332de5e7-node-mnt\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.363430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b66048fa-add8-493c-b737-33f3332de5e7-node-mnt\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.363501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpmnn\" (UniqueName: \"kubernetes.io/projected/b66048fa-add8-493c-b737-33f3332de5e7-kube-api-access-gpmnn\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.363522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b66048fa-add8-493c-b737-33f3332de5e7-crc-storage\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.363702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b66048fa-add8-493c-b737-33f3332de5e7-node-mnt\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " 
pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.364123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b66048fa-add8-493c-b737-33f3332de5e7-crc-storage\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.378456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpmnn\" (UniqueName: \"kubernetes.io/projected/b66048fa-add8-493c-b737-33f3332de5e7-kube-api-access-gpmnn\") pod \"crc-storage-crc-9kw5w\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.545158 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:28:58 crc kubenswrapper[4707]: I0121 16:28:58.892341 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9kw5w"] Jan 21 16:28:59 crc kubenswrapper[4707]: I0121 16:28:59.190082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c43373b6-f9de-488e-8687-b21b98d5fb36" path="/var/lib/kubelet/pods/c43373b6-f9de-488e-8687-b21b98d5fb36/volumes" Jan 21 16:28:59 crc kubenswrapper[4707]: I0121 16:28:59.557072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9kw5w" event={"ID":"b66048fa-add8-493c-b737-33f3332de5e7","Type":"ContainerStarted","Data":"a89beb8f616e9c2e76488a5597f9ef95006c2a8218ea25dfa344b99300ccd355"} Jan 21 16:28:59 crc kubenswrapper[4707]: I0121 16:28:59.557113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9kw5w" event={"ID":"b66048fa-add8-493c-b737-33f3332de5e7","Type":"ContainerStarted","Data":"780e77818525ab9c65f0be08e38016ef386a442bd43e1ffe87d2dfc232d9fc23"} Jan 21 16:28:59 crc kubenswrapper[4707]: I0121 16:28:59.571668 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-9kw5w" podStartSLOduration=1.066717263 podStartE2EDuration="1.571655458s" podCreationTimestamp="2026-01-21 16:28:58 +0000 UTC" firstStartedPulling="2026-01-21 16:28:58.8979917 +0000 UTC m=+5236.079507921" lastFinishedPulling="2026-01-21 16:28:59.402929894 +0000 UTC m=+5236.584446116" observedRunningTime="2026-01-21 16:28:59.567897622 +0000 UTC m=+5236.749413844" watchObservedRunningTime="2026-01-21 16:28:59.571655458 +0000 UTC m=+5236.753171680" Jan 21 16:28:59 crc kubenswrapper[4707]: E0121 16:28:59.650415 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb66048fa_add8_493c_b737_33f3332de5e7.slice/crio-conmon-a89beb8f616e9c2e76488a5597f9ef95006c2a8218ea25dfa344b99300ccd355.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:29:00 crc kubenswrapper[4707]: I0121 16:29:00.564983 4707 generic.go:334] "Generic (PLEG): container finished" podID="b66048fa-add8-493c-b737-33f3332de5e7" containerID="a89beb8f616e9c2e76488a5597f9ef95006c2a8218ea25dfa344b99300ccd355" exitCode=0 Jan 21 16:29:00 crc kubenswrapper[4707]: I0121 16:29:00.565028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9kw5w" 
event={"ID":"b66048fa-add8-493c-b737-33f3332de5e7","Type":"ContainerDied","Data":"a89beb8f616e9c2e76488a5597f9ef95006c2a8218ea25dfa344b99300ccd355"} Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.772154 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.798595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b66048fa-add8-493c-b737-33f3332de5e7-node-mnt\") pod \"b66048fa-add8-493c-b737-33f3332de5e7\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.798649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpmnn\" (UniqueName: \"kubernetes.io/projected/b66048fa-add8-493c-b737-33f3332de5e7-kube-api-access-gpmnn\") pod \"b66048fa-add8-493c-b737-33f3332de5e7\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.798696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b66048fa-add8-493c-b737-33f3332de5e7-crc-storage\") pod \"b66048fa-add8-493c-b737-33f3332de5e7\" (UID: \"b66048fa-add8-493c-b737-33f3332de5e7\") " Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.798696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b66048fa-add8-493c-b737-33f3332de5e7-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b66048fa-add8-493c-b737-33f3332de5e7" (UID: "b66048fa-add8-493c-b737-33f3332de5e7"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.799036 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b66048fa-add8-493c-b737-33f3332de5e7-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.802757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66048fa-add8-493c-b737-33f3332de5e7-kube-api-access-gpmnn" (OuterVolumeSpecName: "kube-api-access-gpmnn") pod "b66048fa-add8-493c-b737-33f3332de5e7" (UID: "b66048fa-add8-493c-b737-33f3332de5e7"). InnerVolumeSpecName "kube-api-access-gpmnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.812576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66048fa-add8-493c-b737-33f3332de5e7-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b66048fa-add8-493c-b737-33f3332de5e7" (UID: "b66048fa-add8-493c-b737-33f3332de5e7"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.900525 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpmnn\" (UniqueName: \"kubernetes.io/projected/b66048fa-add8-493c-b737-33f3332de5e7-kube-api-access-gpmnn\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:01 crc kubenswrapper[4707]: I0121 16:29:01.900549 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b66048fa-add8-493c-b737-33f3332de5e7-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.440965 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.470149 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5"] Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.470487 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" containerName="dnsmasq-dns" containerID="cri-o://7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89" gracePeriod=10 Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.579080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9kw5w" event={"ID":"b66048fa-add8-493c-b737-33f3332de5e7","Type":"ContainerDied","Data":"780e77818525ab9c65f0be08e38016ef386a442bd43e1ffe87d2dfc232d9fc23"} Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.579115 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="780e77818525ab9c65f0be08e38016ef386a442bd43e1ffe87d2dfc232d9fc23" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.579125 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9kw5w" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.797696 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.809518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-beta-nodeset\") pod \"f003a127-eeb4-4e96-ace8-517746e977bd\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.809587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-no-nodes\") pod \"f003a127-eeb4-4e96-ace8-517746e977bd\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.809654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-config\") pod \"f003a127-eeb4-4e96-ace8-517746e977bd\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.809681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k8qg\" (UniqueName: \"kubernetes.io/projected/f003a127-eeb4-4e96-ace8-517746e977bd-kube-api-access-7k8qg\") pod \"f003a127-eeb4-4e96-ace8-517746e977bd\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.809707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-dnsmasq-svc\") pod \"f003a127-eeb4-4e96-ace8-517746e977bd\" (UID: \"f003a127-eeb4-4e96-ace8-517746e977bd\") " Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.816741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f003a127-eeb4-4e96-ace8-517746e977bd-kube-api-access-7k8qg" (OuterVolumeSpecName: "kube-api-access-7k8qg") pod "f003a127-eeb4-4e96-ace8-517746e977bd" (UID: "f003a127-eeb4-4e96-ace8-517746e977bd"). InnerVolumeSpecName "kube-api-access-7k8qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.838988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "edpm-compute-beta-nodeset") pod "f003a127-eeb4-4e96-ace8-517746e977bd" (UID: "f003a127-eeb4-4e96-ace8-517746e977bd"). InnerVolumeSpecName "edpm-compute-beta-nodeset". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.839388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-config" (OuterVolumeSpecName: "config") pod "f003a127-eeb4-4e96-ace8-517746e977bd" (UID: "f003a127-eeb4-4e96-ace8-517746e977bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.839622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f003a127-eeb4-4e96-ace8-517746e977bd" (UID: "f003a127-eeb4-4e96-ace8-517746e977bd"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.841274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "f003a127-eeb4-4e96-ace8-517746e977bd" (UID: "f003a127-eeb4-4e96-ace8-517746e977bd"). InnerVolumeSpecName "edpm-compute-no-nodes". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.910715 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k8qg\" (UniqueName: \"kubernetes.io/projected/f003a127-eeb4-4e96-ace8-517746e977bd-kube-api-access-7k8qg\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.910749 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.910762 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.910773 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:02 crc kubenswrapper[4707]: I0121 16:29:02.910784 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003a127-eeb4-4e96-ace8-517746e977bd-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.587472 4707 generic.go:334] "Generic (PLEG): container finished" podID="f003a127-eeb4-4e96-ace8-517746e977bd" containerID="7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89" exitCode=0 Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.587515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" event={"ID":"f003a127-eeb4-4e96-ace8-517746e977bd","Type":"ContainerDied","Data":"7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89"} Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.587560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" event={"ID":"f003a127-eeb4-4e96-ace8-517746e977bd","Type":"ContainerDied","Data":"a1ff149066d9d2e7a6233ac30a8ee59cea40bbbfe36c8d0d8b40e15a9bf49b55"} Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.587578 4707 scope.go:117] "RemoveContainer" containerID="7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.587531 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.603304 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5"] Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.603920 4707 scope.go:117] "RemoveContainer" containerID="b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.607106 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-vjfh5"] Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.618682 4707 scope.go:117] "RemoveContainer" containerID="7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89" Jan 21 16:29:03 crc kubenswrapper[4707]: E0121 16:29:03.619062 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89\": container with ID starting with 7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89 not found: ID does not exist" containerID="7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.619096 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89"} err="failed to get container status \"7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89\": rpc error: code = NotFound desc = could not find container \"7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89\": container with ID starting with 7ac36618409ab48a5585c92a69cc944b0e3355a94d5bc3459985ebce00c88b89 not found: ID does not exist" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.619117 4707 scope.go:117] "RemoveContainer" containerID="b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170" Jan 21 16:29:03 crc kubenswrapper[4707]: E0121 16:29:03.619432 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170\": container with ID starting with b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170 not found: ID does not exist" containerID="b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170" Jan 21 16:29:03 crc kubenswrapper[4707]: I0121 16:29:03.619459 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170"} err="failed to get container status \"b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170\": rpc error: code = NotFound desc = could not find container \"b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170\": container with ID starting with b97b114a87d3879e8691ec41a5f6db6f8504d49324c7f765b0bc486a63434170 not found: ID does not exist" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.486176 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9kw5w"] Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.489858 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9kw5w"] Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.601360 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["crc-storage/crc-storage-crc-vnw9x"] Jan 21 16:29:04 crc kubenswrapper[4707]: E0121 16:29:04.601626 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" containerName="init" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.601643 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" containerName="init" Jan 21 16:29:04 crc kubenswrapper[4707]: E0121 16:29:04.601653 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" containerName="dnsmasq-dns" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.601659 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" containerName="dnsmasq-dns" Jan 21 16:29:04 crc kubenswrapper[4707]: E0121 16:29:04.601671 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66048fa-add8-493c-b737-33f3332de5e7" containerName="storage" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.601676 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66048fa-add8-493c-b737-33f3332de5e7" containerName="storage" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.601821 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" containerName="dnsmasq-dns" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.601834 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66048fa-add8-493c-b737-33f3332de5e7" containerName="storage" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.602279 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.604639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.604693 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.604710 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.604639 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.607448 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vnw9x"] Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.635456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/077686fc-4643-4e29-b1b2-fb295c380fa2-node-mnt\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.635505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/077686fc-4643-4e29-b1b2-fb295c380fa2-crc-storage\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.635536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l249x\" (UniqueName: \"kubernetes.io/projected/077686fc-4643-4e29-b1b2-fb295c380fa2-kube-api-access-l249x\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.737311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/077686fc-4643-4e29-b1b2-fb295c380fa2-node-mnt\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.737359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/077686fc-4643-4e29-b1b2-fb295c380fa2-crc-storage\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.737391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l249x\" (UniqueName: \"kubernetes.io/projected/077686fc-4643-4e29-b1b2-fb295c380fa2-kube-api-access-l249x\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.737547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/077686fc-4643-4e29-b1b2-fb295c380fa2-node-mnt\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.738048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/077686fc-4643-4e29-b1b2-fb295c380fa2-crc-storage\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.750055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l249x\" (UniqueName: \"kubernetes.io/projected/077686fc-4643-4e29-b1b2-fb295c380fa2-kube-api-access-l249x\") pod \"crc-storage-crc-vnw9x\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:04 crc kubenswrapper[4707]: I0121 16:29:04.913912 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:05 crc kubenswrapper[4707]: I0121 16:29:05.188844 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66048fa-add8-493c-b737-33f3332de5e7" path="/var/lib/kubelet/pods/b66048fa-add8-493c-b737-33f3332de5e7/volumes" Jan 21 16:29:05 crc kubenswrapper[4707]: I0121 16:29:05.189499 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f003a127-eeb4-4e96-ace8-517746e977bd" path="/var/lib/kubelet/pods/f003a127-eeb4-4e96-ace8-517746e977bd/volumes" Jan 21 16:29:05 crc kubenswrapper[4707]: I0121 16:29:05.250662 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vnw9x"] Jan 21 16:29:05 crc kubenswrapper[4707]: I0121 16:29:05.601561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vnw9x" event={"ID":"077686fc-4643-4e29-b1b2-fb295c380fa2","Type":"ContainerStarted","Data":"fc8eb745d9f62a34370b4eadcc3d9f9e06c35e764764c7b6f6949c00e82c0d34"} Jan 21 16:29:06 crc kubenswrapper[4707]: I0121 16:29:06.607952 4707 generic.go:334] "Generic (PLEG): container finished" podID="077686fc-4643-4e29-b1b2-fb295c380fa2" containerID="6626b64dbda518fb1ffd1ba7771b7f83ffe9a34cd98bfdb1b760b519bd91e28d" exitCode=0 Jan 21 16:29:06 crc kubenswrapper[4707]: I0121 16:29:06.607991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vnw9x" event={"ID":"077686fc-4643-4e29-b1b2-fb295c380fa2","Type":"ContainerDied","Data":"6626b64dbda518fb1ffd1ba7771b7f83ffe9a34cd98bfdb1b760b519bd91e28d"} Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.811555 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.869796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/077686fc-4643-4e29-b1b2-fb295c380fa2-crc-storage\") pod \"077686fc-4643-4e29-b1b2-fb295c380fa2\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.869900 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l249x\" (UniqueName: \"kubernetes.io/projected/077686fc-4643-4e29-b1b2-fb295c380fa2-kube-api-access-l249x\") pod \"077686fc-4643-4e29-b1b2-fb295c380fa2\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.869953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/077686fc-4643-4e29-b1b2-fb295c380fa2-node-mnt\") pod \"077686fc-4643-4e29-b1b2-fb295c380fa2\" (UID: \"077686fc-4643-4e29-b1b2-fb295c380fa2\") " Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.870207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077686fc-4643-4e29-b1b2-fb295c380fa2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "077686fc-4643-4e29-b1b2-fb295c380fa2" (UID: "077686fc-4643-4e29-b1b2-fb295c380fa2"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.874169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077686fc-4643-4e29-b1b2-fb295c380fa2-kube-api-access-l249x" (OuterVolumeSpecName: "kube-api-access-l249x") pod "077686fc-4643-4e29-b1b2-fb295c380fa2" (UID: "077686fc-4643-4e29-b1b2-fb295c380fa2"). InnerVolumeSpecName "kube-api-access-l249x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.884147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077686fc-4643-4e29-b1b2-fb295c380fa2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "077686fc-4643-4e29-b1b2-fb295c380fa2" (UID: "077686fc-4643-4e29-b1b2-fb295c380fa2"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.971160 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l249x\" (UniqueName: \"kubernetes.io/projected/077686fc-4643-4e29-b1b2-fb295c380fa2-kube-api-access-l249x\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.971185 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/077686fc-4643-4e29-b1b2-fb295c380fa2-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:07 crc kubenswrapper[4707]: I0121 16:29:07.971195 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/077686fc-4643-4e29-b1b2-fb295c380fa2-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:08 crc kubenswrapper[4707]: I0121 16:29:08.621875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vnw9x" event={"ID":"077686fc-4643-4e29-b1b2-fb295c380fa2","Type":"ContainerDied","Data":"fc8eb745d9f62a34370b4eadcc3d9f9e06c35e764764c7b6f6949c00e82c0d34"} Jan 21 16:29:08 crc kubenswrapper[4707]: I0121 16:29:08.622124 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc8eb745d9f62a34370b4eadcc3d9f9e06c35e764764c7b6f6949c00e82c0d34" Jan 21 16:29:08 crc kubenswrapper[4707]: I0121 16:29:08.621905 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vnw9x" Jan 21 16:29:09 crc kubenswrapper[4707]: I0121 16:29:09.945587 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:29:09 crc kubenswrapper[4707]: I0121 16:29:09.945634 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.844491 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz"] Jan 21 16:29:10 crc kubenswrapper[4707]: E0121 16:29:10.845020 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077686fc-4643-4e29-b1b2-fb295c380fa2" containerName="storage" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.845037 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="077686fc-4643-4e29-b1b2-fb295c380fa2" containerName="storage" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.845167 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="077686fc-4643-4e29-b1b2-fb295c380fa2" containerName="storage" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.845778 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.848845 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-tls" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.852228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz"] Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.912509 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz"] Jan 21 16:29:10 crc kubenswrapper[4707]: E0121 16:29:10.912983 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc kube-api-access-h8vkt openstack-edpm-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" podUID="c1013a69-4af7-4181-876b-55b826a96c2a" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.928691 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt"] Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.929667 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:10 crc kubenswrapper[4707]: I0121 16:29:10.938175 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt"] Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.003105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.003188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8vkt\" (UniqueName: \"kubernetes.io/projected/c1013a69-4af7-4181-876b-55b826a96c2a-kube-api-access-h8vkt\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.003433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.003501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8vkt\" (UniqueName: \"kubernetes.io/projected/c1013a69-4af7-4181-876b-55b826a96c2a-kube-api-access-h8vkt\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-config\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: 
\"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24zd\" (UniqueName: \"kubernetes.io/projected/91fe998d-8b28-4204-9db3-022679520d79-kube-api-access-d24zd\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.104615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.105346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.105362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.105372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.119154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8vkt\" (UniqueName: \"kubernetes.io/projected/c1013a69-4af7-4181-876b-55b826a96c2a-kube-api-access-h8vkt\") pod \"dnsmasq-dnsmasq-78c7b787f5-dzmzz\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.205826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.205883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24zd\" (UniqueName: \"kubernetes.io/projected/91fe998d-8b28-4204-9db3-022679520d79-kube-api-access-d24zd\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.206576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.206762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-config\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.207192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.207366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-config\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.207586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.219154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24zd\" (UniqueName: \"kubernetes.io/projected/91fe998d-8b28-4204-9db3-022679520d79-kube-api-access-d24zd\") pod \"dnsmasq-dnsmasq-79cc674687-gcdzt\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.243235 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.588392 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt"] Jan 21 16:29:11 crc kubenswrapper[4707]: W0121 16:29:11.591512 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91fe998d_8b28_4204_9db3_022679520d79.slice/crio-f1778e98a5d750134a699aa672af791cb782fc18739d49c21f8492b429f13308 WatchSource:0}: Error finding container f1778e98a5d750134a699aa672af791cb782fc18739d49c21f8492b429f13308: Status 404 returned error can't find the container with id f1778e98a5d750134a699aa672af791cb782fc18739d49c21f8492b429f13308 Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.642769 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.642935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" event={"ID":"91fe998d-8b28-4204-9db3-022679520d79","Type":"ContainerStarted","Data":"f1778e98a5d750134a699aa672af791cb782fc18739d49c21f8492b429f13308"} Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.666424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.711881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8vkt\" (UniqueName: \"kubernetes.io/projected/c1013a69-4af7-4181-876b-55b826a96c2a-kube-api-access-h8vkt\") pod \"c1013a69-4af7-4181-876b-55b826a96c2a\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.711968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-dnsmasq-svc\") pod \"c1013a69-4af7-4181-876b-55b826a96c2a\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.712021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-openstack-edpm-tls\") pod \"c1013a69-4af7-4181-876b-55b826a96c2a\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.712037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-config\") pod \"c1013a69-4af7-4181-876b-55b826a96c2a\" (UID: \"c1013a69-4af7-4181-876b-55b826a96c2a\") " Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.712419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "c1013a69-4af7-4181-876b-55b826a96c2a" (UID: "c1013a69-4af7-4181-876b-55b826a96c2a"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.712439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "c1013a69-4af7-4181-876b-55b826a96c2a" (UID: "c1013a69-4af7-4181-876b-55b826a96c2a"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.712465 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-config" (OuterVolumeSpecName: "config") pod "c1013a69-4af7-4181-876b-55b826a96c2a" (UID: "c1013a69-4af7-4181-876b-55b826a96c2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.715425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1013a69-4af7-4181-876b-55b826a96c2a-kube-api-access-h8vkt" (OuterVolumeSpecName: "kube-api-access-h8vkt") pod "c1013a69-4af7-4181-876b-55b826a96c2a" (UID: "c1013a69-4af7-4181-876b-55b826a96c2a"). InnerVolumeSpecName "kube-api-access-h8vkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.813355 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.813391 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.813401 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1013a69-4af7-4181-876b-55b826a96c2a-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:11 crc kubenswrapper[4707]: I0121 16:29:11.813412 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8vkt\" (UniqueName: \"kubernetes.io/projected/c1013a69-4af7-4181-876b-55b826a96c2a-kube-api-access-h8vkt\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:12 crc kubenswrapper[4707]: I0121 16:29:12.659483 4707 generic.go:334] "Generic (PLEG): container finished" podID="91fe998d-8b28-4204-9db3-022679520d79" containerID="be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587" exitCode=0 Jan 21 16:29:12 crc kubenswrapper[4707]: I0121 16:29:12.659966 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz" Jan 21 16:29:12 crc kubenswrapper[4707]: I0121 16:29:12.660668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" event={"ID":"91fe998d-8b28-4204-9db3-022679520d79","Type":"ContainerDied","Data":"be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587"} Jan 21 16:29:12 crc kubenswrapper[4707]: I0121 16:29:12.773503 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz"] Jan 21 16:29:12 crc kubenswrapper[4707]: I0121 16:29:12.776685 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-dzmzz"] Jan 21 16:29:13 crc kubenswrapper[4707]: I0121 16:29:13.188847 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1013a69-4af7-4181-876b-55b826a96c2a" path="/var/lib/kubelet/pods/c1013a69-4af7-4181-876b-55b826a96c2a/volumes" Jan 21 16:29:13 crc kubenswrapper[4707]: I0121 16:29:13.667321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" event={"ID":"91fe998d-8b28-4204-9db3-022679520d79","Type":"ContainerStarted","Data":"64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0"} Jan 21 16:29:13 crc kubenswrapper[4707]: I0121 16:29:13.667445 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:13 crc kubenswrapper[4707]: I0121 16:29:13.682718 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" podStartSLOduration=3.682702966 podStartE2EDuration="3.682702966s" podCreationTimestamp="2026-01-21 16:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:29:13.681597768 +0000 UTC m=+5250.863113990" watchObservedRunningTime="2026-01-21 16:29:13.682702966 +0000 UTC m=+5250.864219188" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.244805 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.277534 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5"] Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.277717 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerName="dnsmasq-dns" containerID="cri-o://a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b" gracePeriod=10 Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.588226 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.719632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-config\") pod \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.719751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-dnsmasq-svc\") pod \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.719854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5z8h\" (UniqueName: \"kubernetes.io/projected/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-kube-api-access-d5z8h\") pod \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\" (UID: \"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e\") " Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.722432 4707 generic.go:334] "Generic (PLEG): container finished" podID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerID="a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b" exitCode=0 Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.722475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" event={"ID":"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e","Type":"ContainerDied","Data":"a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b"} Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.722501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" event={"ID":"0dd330cc-3e32-4ae6-94cf-fdd85070cb8e","Type":"ContainerDied","Data":"0c4399411bc953454424a7846c3abffce24159b380ffab5e571e0d7e0895378c"} Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.722516 4707 scope.go:117] "RemoveContainer" containerID="a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.722480 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.723935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-kube-api-access-d5z8h" (OuterVolumeSpecName: "kube-api-access-d5z8h") pod "0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" (UID: "0dd330cc-3e32-4ae6-94cf-fdd85070cb8e"). InnerVolumeSpecName "kube-api-access-d5z8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.748266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" (UID: "0dd330cc-3e32-4ae6-94cf-fdd85070cb8e"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.749652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-config" (OuterVolumeSpecName: "config") pod "0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" (UID: "0dd330cc-3e32-4ae6-94cf-fdd85070cb8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.776555 4707 scope.go:117] "RemoveContainer" containerID="e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.792044 4707 scope.go:117] "RemoveContainer" containerID="a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b" Jan 21 16:29:21 crc kubenswrapper[4707]: E0121 16:29:21.792388 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b\": container with ID starting with a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b not found: ID does not exist" containerID="a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.792464 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b"} err="failed to get container status \"a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b\": rpc error: code = NotFound desc = could not find container \"a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b\": container with ID starting with a00c1299189ee6a78ff7fb8157f95df69b98937779d75a1aab634dd7a987926b not found: ID does not exist" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.792485 4707 scope.go:117] "RemoveContainer" containerID="e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501" Jan 21 16:29:21 crc kubenswrapper[4707]: E0121 16:29:21.792868 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501\": container with ID starting with e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501 not found: ID does not exist" containerID="e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.792892 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501"} err="failed to get container status \"e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501\": rpc error: code = NotFound desc = could not find container \"e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501\": container with ID starting with e80eaba9353fa5c3edd148053f0acf2bde4ef133aa86e6b1b10ee34c8f893501 not found: ID does not exist" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.822734 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.822760 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5z8h\" (UniqueName: 
\"kubernetes.io/projected/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-kube-api-access-d5z8h\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:21 crc kubenswrapper[4707]: I0121 16:29:21.822770 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.043500 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5"] Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.047792 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-nv2h5"] Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.523412 4707 scope.go:117] "RemoveContainer" containerID="032d65d5a61ac6401bcde8a7135e0de95b989173b450fb1ffa8dbbf5d2b3cfee" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.541523 4707 scope.go:117] "RemoveContainer" containerID="df7fb3bfab8b15a9334cfffdbdb6b11fe8bd01ca13669118cae10250b9edb0f6" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.561139 4707 scope.go:117] "RemoveContainer" containerID="e1424419f0c2b85330b14e02110c1f778afb0bea4b28340ef864e6456bd3219f" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.579985 4707 scope.go:117] "RemoveContainer" containerID="4a4f9ca8aa9cb4726a3c5555e3713f9f3283e3eb1bcdbbac79240a91090e07fa" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.596578 4707 scope.go:117] "RemoveContainer" containerID="85300f8d0bfdb9a98a82f1febe13582888bfe1b1840beff0b872a80ef6819294" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.613742 4707 scope.go:117] "RemoveContainer" containerID="7c73dca24790201aa7f764bf45c747cc9a707ecc3c2c819be34e77b922fec5df" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.632683 4707 scope.go:117] "RemoveContainer" containerID="0ae2bf16efa70547dd967f2b9cc5b8bb084c8008840bb8c2a5ffeb9ea0cacb8b" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.649968 4707 scope.go:117] "RemoveContainer" containerID="0496b69d6767e2c509fbac99b5dfb60856a1e7c835958c320b4add0c59b28805" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.666310 4707 scope.go:117] "RemoveContainer" containerID="708bb3303ccc2bdb9b82db33f015510413004800881b6f4dffca4feb85a4ff67" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.688593 4707 scope.go:117] "RemoveContainer" containerID="f50f2ef7b63a9f049bf2699e25719a5bbd8f8273c10616f22dc0f31f703c2866" Jan 21 16:29:22 crc kubenswrapper[4707]: I0121 16:29:22.705956 4707 scope.go:117] "RemoveContainer" containerID="cde5f0dfca74b7c4e0f11818e5b2443291e695d3e333603b9deeaf3c188f585d" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.189443 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" path="/var/lib/kubelet/pods/0dd330cc-3e32-4ae6-94cf-fdd85070cb8e/volumes" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.451516 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz"] Jan 21 16:29:23 crc kubenswrapper[4707]: E0121 16:29:23.451835 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerName="init" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.451852 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerName="init" Jan 21 16:29:23 crc 
kubenswrapper[4707]: E0121 16:29:23.451862 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerName="dnsmasq-dns" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.451868 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerName="dnsmasq-dns" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.452019 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd330cc-3e32-4ae6-94cf-fdd85070cb8e" containerName="dnsmasq-dns" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.452457 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.453968 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.454185 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.454399 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dnsnames-default-certs-0" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.454553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.454654 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dnsnames-second-certs-0" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.454951 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.455938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.460421 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz"] Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.538623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-tls-dnsnames-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.538680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.538846 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.538886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcw9z\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-kube-api-access-kcw9z\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.538962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.538995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.539038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.640136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-tls-dnsnames-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.640195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: 
I0121 16:29:23.640237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.640258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw9z\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-kube-api-access-kcw9z\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.640291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.640311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.640330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.644340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.644343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.645042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.645263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.645307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-tls-dnsnames-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.645331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.652563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcw9z\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-kube-api-access-kcw9z\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:23 crc kubenswrapper[4707]: I0121 16:29:23.764869 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:24 crc kubenswrapper[4707]: I0121 16:29:24.111129 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz"] Jan 21 16:29:24 crc kubenswrapper[4707]: I0121 16:29:24.752714 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" event={"ID":"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f","Type":"ContainerStarted","Data":"2d3eece2daf8276cc51a7a2e5333c7a810e1b02ce80270ede6e3d7eb4df97278"} Jan 21 16:29:24 crc kubenswrapper[4707]: I0121 16:29:24.752752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" event={"ID":"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f","Type":"ContainerStarted","Data":"90b8493436ddbf967fe11fb7fd045621251695eff074c6ae9e9a147abb5a39a8"} Jan 21 16:29:24 crc kubenswrapper[4707]: I0121 16:29:24.764302 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" podStartSLOduration=1.281492302 podStartE2EDuration="1.764292089s" podCreationTimestamp="2026-01-21 16:29:23 +0000 UTC" firstStartedPulling="2026-01-21 16:29:24.114927812 +0000 UTC m=+5261.296444034" lastFinishedPulling="2026-01-21 16:29:24.597727598 +0000 UTC m=+5261.779243821" observedRunningTime="2026-01-21 16:29:24.762858674 +0000 UTC m=+5261.944374896" watchObservedRunningTime="2026-01-21 16:29:24.764292089 +0000 UTC m=+5261.945808311" Jan 21 16:29:27 crc kubenswrapper[4707]: I0121 16:29:27.770934 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" containerID="2d3eece2daf8276cc51a7a2e5333c7a810e1b02ce80270ede6e3d7eb4df97278" exitCode=0 Jan 21 16:29:27 crc kubenswrapper[4707]: I0121 16:29:27.771013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" event={"ID":"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f","Type":"ContainerDied","Data":"2d3eece2daf8276cc51a7a2e5333c7a810e1b02ce80270ede6e3d7eb4df97278"} Jan 21 16:29:28 crc kubenswrapper[4707]: I0121 16:29:28.992083 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.016675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-tls-dnsnames-combined-ca-bundle\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.016793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcw9z\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-kube-api-access-kcw9z\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.016918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-install-certs-ovrd-combined-ca-bundle\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.016967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.017028 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-inventory\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.017115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-ssh-key-openstack-edpm-tls\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.017159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\" (UID: \"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f\") " Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.021076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-tls-dnsnames-combined-ca-bundle" (OuterVolumeSpecName: "tls-dnsnames-combined-ca-bundle") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "tls-dnsnames-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.021279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-kube-api-access-kcw9z" (OuterVolumeSpecName: "kube-api-access-kcw9z") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "kube-api-access-kcw9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.021349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.021358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dnsnames-default-certs-0") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "openstack-edpm-tls-tls-dnsnames-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.021755 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-second-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dnsnames-second-certs-0") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "openstack-edpm-tls-tls-dnsnames-second-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.033527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-inventory" (OuterVolumeSpecName: "inventory") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.035576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" (UID: "ac0c0697-5efd-4e19-8a64-f3d71c07cb5f"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118484 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-tls-dnsnames-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118513 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcw9z\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-kube-api-access-kcw9z\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118524 4707 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118535 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-second-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118544 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118552 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.118563 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f-openstack-edpm-tls-tls-dnsnames-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.785302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" event={"ID":"ac0c0697-5efd-4e19-8a64-f3d71c07cb5f","Type":"ContainerDied","Data":"90b8493436ddbf967fe11fb7fd045621251695eff074c6ae9e9a147abb5a39a8"} Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.785339 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b8493436ddbf967fe11fb7fd045621251695eff074c6ae9e9a147abb5a39a8" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.785387 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.834545 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j"] Jan 21 16:29:29 crc kubenswrapper[4707]: E0121 16:29:29.834792 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" containerName="install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.834821 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" containerName="install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.834935 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" containerName="install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.835324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.836773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.837015 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.837132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.837450 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.841746 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.843915 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j"] Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.927518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-ssh-key-openstack-edpm-tls\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.927565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-inventory\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.927683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-tls-dnsnames-combined-ca-bundle\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:29 crc kubenswrapper[4707]: I0121 16:29:29.927707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2rzb\" (UniqueName: \"kubernetes.io/projected/f6958951-e96c-4974-bdb2-a00598ba8bcb-kube-api-access-d2rzb\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.028449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-ssh-key-openstack-edpm-tls\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.028497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-inventory\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.028573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-tls-dnsnames-combined-ca-bundle\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.028594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2rzb\" (UniqueName: \"kubernetes.io/projected/f6958951-e96c-4974-bdb2-a00598ba8bcb-kube-api-access-d2rzb\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.031161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-ssh-key-openstack-edpm-tls\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.031188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-tls-dnsnames-combined-ca-bundle\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: 
\"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.031352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-inventory\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.044601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2rzb\" (UniqueName: \"kubernetes.io/projected/f6958951-e96c-4974-bdb2-a00598ba8bcb-kube-api-access-d2rzb\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.147888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.516467 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j"] Jan 21 16:29:30 crc kubenswrapper[4707]: W0121 16:29:30.523786 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6958951_e96c_4974_bdb2_a00598ba8bcb.slice/crio-fbdb982686a74efe3b44d34a50ffb8ad294460e81e6a2cb4a182f8c0c8573b6e WatchSource:0}: Error finding container fbdb982686a74efe3b44d34a50ffb8ad294460e81e6a2cb4a182f8c0c8573b6e: Status 404 returned error can't find the container with id fbdb982686a74efe3b44d34a50ffb8ad294460e81e6a2cb4a182f8c0c8573b6e Jan 21 16:29:30 crc kubenswrapper[4707]: I0121 16:29:30.791698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" event={"ID":"f6958951-e96c-4974-bdb2-a00598ba8bcb","Type":"ContainerStarted","Data":"fbdb982686a74efe3b44d34a50ffb8ad294460e81e6a2cb4a182f8c0c8573b6e"} Jan 21 16:29:31 crc kubenswrapper[4707]: I0121 16:29:31.801614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" event={"ID":"f6958951-e96c-4974-bdb2-a00598ba8bcb","Type":"ContainerStarted","Data":"36a71d3bd3f71bbbdc098c54556f0535a81385b8f405465e135f291219fca634"} Jan 21 16:29:31 crc kubenswrapper[4707]: I0121 16:29:31.812419 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" podStartSLOduration=2.312697024 podStartE2EDuration="2.812402338s" podCreationTimestamp="2026-01-21 16:29:29 +0000 UTC" firstStartedPulling="2026-01-21 16:29:30.525191169 +0000 UTC m=+5267.706707391" lastFinishedPulling="2026-01-21 16:29:31.024896482 +0000 UTC m=+5268.206412705" observedRunningTime="2026-01-21 16:29:31.812097284 +0000 UTC m=+5268.993613506" watchObservedRunningTime="2026-01-21 16:29:31.812402338 +0000 UTC m=+5268.993918560" Jan 21 16:29:33 crc kubenswrapper[4707]: I0121 16:29:33.815111 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6958951-e96c-4974-bdb2-a00598ba8bcb" 
containerID="36a71d3bd3f71bbbdc098c54556f0535a81385b8f405465e135f291219fca634" exitCode=0 Jan 21 16:29:33 crc kubenswrapper[4707]: I0121 16:29:33.815189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" event={"ID":"f6958951-e96c-4974-bdb2-a00598ba8bcb","Type":"ContainerDied","Data":"36a71d3bd3f71bbbdc098c54556f0535a81385b8f405465e135f291219fca634"} Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.037204 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.091627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-inventory\") pod \"f6958951-e96c-4974-bdb2-a00598ba8bcb\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.091791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-tls-dnsnames-combined-ca-bundle\") pod \"f6958951-e96c-4974-bdb2-a00598ba8bcb\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.091855 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2rzb\" (UniqueName: \"kubernetes.io/projected/f6958951-e96c-4974-bdb2-a00598ba8bcb-kube-api-access-d2rzb\") pod \"f6958951-e96c-4974-bdb2-a00598ba8bcb\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.091881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-ssh-key-openstack-edpm-tls\") pod \"f6958951-e96c-4974-bdb2-a00598ba8bcb\" (UID: \"f6958951-e96c-4974-bdb2-a00598ba8bcb\") " Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.095738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-tls-dnsnames-combined-ca-bundle" (OuterVolumeSpecName: "tls-dnsnames-combined-ca-bundle") pod "f6958951-e96c-4974-bdb2-a00598ba8bcb" (UID: "f6958951-e96c-4974-bdb2-a00598ba8bcb"). InnerVolumeSpecName "tls-dnsnames-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.095753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6958951-e96c-4974-bdb2-a00598ba8bcb-kube-api-access-d2rzb" (OuterVolumeSpecName: "kube-api-access-d2rzb") pod "f6958951-e96c-4974-bdb2-a00598ba8bcb" (UID: "f6958951-e96c-4974-bdb2-a00598ba8bcb"). InnerVolumeSpecName "kube-api-access-d2rzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.107155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-inventory" (OuterVolumeSpecName: "inventory") pod "f6958951-e96c-4974-bdb2-a00598ba8bcb" (UID: "f6958951-e96c-4974-bdb2-a00598ba8bcb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.107825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "f6958951-e96c-4974-bdb2-a00598ba8bcb" (UID: "f6958951-e96c-4974-bdb2-a00598ba8bcb"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.193209 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-tls-dnsnames-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.193235 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2rzb\" (UniqueName: \"kubernetes.io/projected/f6958951-e96c-4974-bdb2-a00598ba8bcb-kube-api-access-d2rzb\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.193244 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.193254 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6958951-e96c-4974-bdb2-a00598ba8bcb-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.829509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" event={"ID":"f6958951-e96c-4974-bdb2-a00598ba8bcb","Type":"ContainerDied","Data":"fbdb982686a74efe3b44d34a50ffb8ad294460e81e6a2cb4a182f8c0c8573b6e"} Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.829543 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbdb982686a74efe3b44d34a50ffb8ad294460e81e6a2cb4a182f8c0c8573b6e" Jan 21 16:29:35 crc kubenswrapper[4707]: I0121 16:29:35.829551 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.546942 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48"] Jan 21 16:29:37 crc kubenswrapper[4707]: E0121 16:29:37.547372 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6958951-e96c-4974-bdb2-a00598ba8bcb" containerName="tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.547385 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6958951-e96c-4974-bdb2-a00598ba8bcb" containerName="tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.547516 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6958951-e96c-4974-bdb2-a00598ba8bcb" containerName="tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.547975 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.549574 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.550018 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.550256 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dns-ips-default-certs-0" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.550337 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.551009 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-custom-tls-dns-default-certs-0" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.551211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.553171 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.557878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48"] Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.619582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.619663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.619831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.619887 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-install-certs-ovrd-combined-ca-bundle\") 
pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.619911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.619939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.620054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.620091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ph4\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-kube-api-access-n8ph4\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: 
\"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8ph4\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-kube-api-access-n8ph4\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.721634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.725645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.725667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.726330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.726356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.726745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.726837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.727847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.734259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8ph4\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-kube-api-access-n8ph4\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:37 crc kubenswrapper[4707]: I0121 16:29:37.861686 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:38 crc kubenswrapper[4707]: I0121 16:29:38.215644 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48"] Jan 21 16:29:38 crc kubenswrapper[4707]: I0121 16:29:38.850767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" event={"ID":"34202a73-32dd-4bb0-a514-660999065ae9","Type":"ContainerStarted","Data":"e298425d7578834e0a37dcf3ba5f13ec4c45e1f794c4a8e7b6f6bbe7d4159f28"} Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.857599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" event={"ID":"34202a73-32dd-4bb0-a514-660999065ae9","Type":"ContainerStarted","Data":"39f6e9e171f3bdfce24b0a8ba77ecd4791b7495ab73187522a7e6d9616803107"} Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.875723 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" podStartSLOduration=2.310954118 podStartE2EDuration="2.875709837s" podCreationTimestamp="2026-01-21 16:29:37 +0000 UTC" firstStartedPulling="2026-01-21 16:29:38.22094524 +0000 UTC m=+5275.402461461" lastFinishedPulling="2026-01-21 16:29:38.785700958 +0000 UTC m=+5275.967217180" observedRunningTime="2026-01-21 16:29:39.873213724 +0000 UTC m=+5277.054729947" watchObservedRunningTime="2026-01-21 16:29:39.875709837 +0000 UTC m=+5277.057226060" Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.945457 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.945514 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.945554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.946029 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:29:39 crc kubenswrapper[4707]: I0121 16:29:39.946079 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" gracePeriod=600 Jan 21 16:29:40 crc kubenswrapper[4707]: E0121 
16:29:40.061593 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:29:40 crc kubenswrapper[4707]: I0121 16:29:40.865977 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" exitCode=0 Jan 21 16:29:40 crc kubenswrapper[4707]: I0121 16:29:40.866056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340"} Jan 21 16:29:40 crc kubenswrapper[4707]: I0121 16:29:40.866106 4707 scope.go:117] "RemoveContainer" containerID="4385da0860f68c6b4fbe1ca2a1e4ddb0b3259ea351cab52fa0f587fb0728043a" Jan 21 16:29:40 crc kubenswrapper[4707]: I0121 16:29:40.866589 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:29:40 crc kubenswrapper[4707]: E0121 16:29:40.866845 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:29:41 crc kubenswrapper[4707]: I0121 16:29:41.874060 4707 generic.go:334] "Generic (PLEG): container finished" podID="34202a73-32dd-4bb0-a514-660999065ae9" containerID="39f6e9e171f3bdfce24b0a8ba77ecd4791b7495ab73187522a7e6d9616803107" exitCode=0 Jan 21 16:29:41 crc kubenswrapper[4707]: I0121 16:29:41.874133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" event={"ID":"34202a73-32dd-4bb0-a514-660999065ae9","Type":"ContainerDied","Data":"39f6e9e171f3bdfce24b0a8ba77ecd4791b7495ab73187522a7e6d9616803107"} Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.082743 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.087761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-ssh-key-openstack-edpm-tls\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.087893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.087982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8ph4\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-kube-api-access-n8ph4\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.088073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.088179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-install-certs-ovrd-combined-ca-bundle\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.088255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-inventory\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.088340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-tls-dns-ips-combined-ca-bundle\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.088405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-custom-tls-dns-combined-ca-bundle\") pod \"34202a73-32dd-4bb0-a514-660999065ae9\" (UID: \"34202a73-32dd-4bb0-a514-660999065ae9\") " Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.092482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: 
"34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.092489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.092702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-kube-api-access-n8ph4" (OuterVolumeSpecName: "kube-api-access-n8ph4") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "kube-api-access-n8ph4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.093357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.093430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.094063 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.107727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-inventory" (OuterVolumeSpecName: "inventory") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.108098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "34202a73-32dd-4bb0-a514-660999065ae9" (UID: "34202a73-32dd-4bb0-a514-660999065ae9"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.190308 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.190487 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.190563 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8ph4\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-kube-api-access-n8ph4\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.191230 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34202a73-32dd-4bb0-a514-660999065ae9-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.191817 4707 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.191837 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.191848 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.191858 4707 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34202a73-32dd-4bb0-a514-660999065ae9-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.888510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" event={"ID":"34202a73-32dd-4bb0-a514-660999065ae9","Type":"ContainerDied","Data":"e298425d7578834e0a37dcf3ba5f13ec4c45e1f794c4a8e7b6f6bbe7d4159f28"} Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.888543 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e298425d7578834e0a37dcf3ba5f13ec4c45e1f794c4a8e7b6f6bbe7d4159f28" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.888554 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.927161 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx"] Jan 21 16:29:43 crc kubenswrapper[4707]: E0121 16:29:43.927451 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34202a73-32dd-4bb0-a514-660999065ae9" containerName="install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.927470 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="34202a73-32dd-4bb0-a514-660999065ae9" containerName="install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.927616 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="34202a73-32dd-4bb0-a514-660999065ae9" containerName="install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.928078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.930198 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.930313 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.930675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.930845 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.934066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:29:43 crc kubenswrapper[4707]: I0121 16:29:43.935518 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx"] Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.000760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.000854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgpz\" (UniqueName: \"kubernetes.io/projected/e5442563-8062-4eca-bdbf-4eec7053ddc5-kube-api-access-7lgpz\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.000890 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.000924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-inventory\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.102469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.102620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgpz\" (UniqueName: \"kubernetes.io/projected/e5442563-8062-4eca-bdbf-4eec7053ddc5-kube-api-access-7lgpz\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.102694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.102757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-inventory\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.105992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.106072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-inventory\") pod 
\"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.106795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.115680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgpz\" (UniqueName: \"kubernetes.io/projected/e5442563-8062-4eca-bdbf-4eec7053ddc5-kube-api-access-7lgpz\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.243962 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.600320 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx"] Jan 21 16:29:44 crc kubenswrapper[4707]: I0121 16:29:44.894752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" event={"ID":"e5442563-8062-4eca-bdbf-4eec7053ddc5","Type":"ContainerStarted","Data":"1b7db9e3bbb3017f2fd549ca82916aedf48805940a131660188e9002d4316958"} Jan 21 16:29:45 crc kubenswrapper[4707]: I0121 16:29:45.902116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" event={"ID":"e5442563-8062-4eca-bdbf-4eec7053ddc5","Type":"ContainerStarted","Data":"2ecad3ca021c8e573237b5d01b65e1f0caa33b4dadd162ad98490750d4847a8d"} Jan 21 16:29:45 crc kubenswrapper[4707]: I0121 16:29:45.916744 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" podStartSLOduration=2.336935053 podStartE2EDuration="2.91673041s" podCreationTimestamp="2026-01-21 16:29:43 +0000 UTC" firstStartedPulling="2026-01-21 16:29:44.604620404 +0000 UTC m=+5281.786136625" lastFinishedPulling="2026-01-21 16:29:45.18441575 +0000 UTC m=+5282.365931982" observedRunningTime="2026-01-21 16:29:45.914119239 +0000 UTC m=+5283.095635451" watchObservedRunningTime="2026-01-21 16:29:45.91673041 +0000 UTC m=+5283.098246632" Jan 21 16:29:47 crc kubenswrapper[4707]: I0121 16:29:47.918378 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5442563-8062-4eca-bdbf-4eec7053ddc5" containerID="2ecad3ca021c8e573237b5d01b65e1f0caa33b4dadd162ad98490750d4847a8d" exitCode=0 Jan 21 16:29:47 crc kubenswrapper[4707]: I0121 16:29:47.918661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" 
event={"ID":"e5442563-8062-4eca-bdbf-4eec7053ddc5","Type":"ContainerDied","Data":"2ecad3ca021c8e573237b5d01b65e1f0caa33b4dadd162ad98490750d4847a8d"} Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.133656 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.172458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lgpz\" (UniqueName: \"kubernetes.io/projected/e5442563-8062-4eca-bdbf-4eec7053ddc5-kube-api-access-7lgpz\") pod \"e5442563-8062-4eca-bdbf-4eec7053ddc5\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.172604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-inventory\") pod \"e5442563-8062-4eca-bdbf-4eec7053ddc5\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.172633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-tls-dns-ips-combined-ca-bundle\") pod \"e5442563-8062-4eca-bdbf-4eec7053ddc5\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.172689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-ssh-key-openstack-edpm-tls\") pod \"e5442563-8062-4eca-bdbf-4eec7053ddc5\" (UID: \"e5442563-8062-4eca-bdbf-4eec7053ddc5\") " Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.177194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "e5442563-8062-4eca-bdbf-4eec7053ddc5" (UID: "e5442563-8062-4eca-bdbf-4eec7053ddc5"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.177198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5442563-8062-4eca-bdbf-4eec7053ddc5-kube-api-access-7lgpz" (OuterVolumeSpecName: "kube-api-access-7lgpz") pod "e5442563-8062-4eca-bdbf-4eec7053ddc5" (UID: "e5442563-8062-4eca-bdbf-4eec7053ddc5"). InnerVolumeSpecName "kube-api-access-7lgpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.188769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-inventory" (OuterVolumeSpecName: "inventory") pod "e5442563-8062-4eca-bdbf-4eec7053ddc5" (UID: "e5442563-8062-4eca-bdbf-4eec7053ddc5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.189400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "e5442563-8062-4eca-bdbf-4eec7053ddc5" (UID: "e5442563-8062-4eca-bdbf-4eec7053ddc5"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.274208 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.274336 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lgpz\" (UniqueName: \"kubernetes.io/projected/e5442563-8062-4eca-bdbf-4eec7053ddc5-kube-api-access-7lgpz\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.274398 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.274451 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5442563-8062-4eca-bdbf-4eec7053ddc5-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.931440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" event={"ID":"e5442563-8062-4eca-bdbf-4eec7053ddc5","Type":"ContainerDied","Data":"1b7db9e3bbb3017f2fd549ca82916aedf48805940a131660188e9002d4316958"} Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.931479 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7db9e3bbb3017f2fd549ca82916aedf48805940a131660188e9002d4316958" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.931487 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.984186 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs"] Jan 21 16:29:49 crc kubenswrapper[4707]: E0121 16:29:49.984601 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5442563-8062-4eca-bdbf-4eec7053ddc5" containerName="tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.984662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5442563-8062-4eca-bdbf-4eec7053ddc5" containerName="tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.984892 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5442563-8062-4eca-bdbf-4eec7053ddc5" containerName="tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.985357 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.987004 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.987892 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.991143 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.991268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.991398 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:29:49 crc kubenswrapper[4707]: I0121 16:29:49.993307 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs"] Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.081761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-custom-tls-dns-combined-ca-bundle\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.081832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cb2\" (UniqueName: \"kubernetes.io/projected/e30a6419-83be-4c52-8ea9-2d51433559ad-kube-api-access-q8cb2\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.081900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-ssh-key-openstack-edpm-tls\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.081925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-inventory\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.182498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-custom-tls-dns-combined-ca-bundle\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.182569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cb2\" (UniqueName: \"kubernetes.io/projected/e30a6419-83be-4c52-8ea9-2d51433559ad-kube-api-access-q8cb2\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.182594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-ssh-key-openstack-edpm-tls\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.182611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-inventory\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.185374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-inventory\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.185576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-ssh-key-openstack-edpm-tls\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.185612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-custom-tls-dns-combined-ca-bundle\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.195549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cb2\" (UniqueName: \"kubernetes.io/projected/e30a6419-83be-4c52-8ea9-2d51433559ad-kube-api-access-q8cb2\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs\" (UID: 
\"e30a6419-83be-4c52-8ea9-2d51433559ad\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.305018 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.660328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs"] Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.665326 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:29:50 crc kubenswrapper[4707]: I0121 16:29:50.937766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" event={"ID":"e30a6419-83be-4c52-8ea9-2d51433559ad","Type":"ContainerStarted","Data":"87cd2da074efd81146446797e367e15f66a387a84848f6feb7a65ff1700ac9b4"} Jan 21 16:29:51 crc kubenswrapper[4707]: I0121 16:29:51.944407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" event={"ID":"e30a6419-83be-4c52-8ea9-2d51433559ad","Type":"ContainerStarted","Data":"66c4a8f3ac89b247e1cc56eae55fc9902a78f6cc0c3a8012683a254d0687b317"} Jan 21 16:29:51 crc kubenswrapper[4707]: I0121 16:29:51.961707 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" podStartSLOduration=2.304579685 podStartE2EDuration="2.961693804s" podCreationTimestamp="2026-01-21 16:29:49 +0000 UTC" firstStartedPulling="2026-01-21 16:29:50.665107987 +0000 UTC m=+5287.846624210" lastFinishedPulling="2026-01-21 16:29:51.322222107 +0000 UTC m=+5288.503738329" observedRunningTime="2026-01-21 16:29:51.960204302 +0000 UTC m=+5289.141720525" watchObservedRunningTime="2026-01-21 16:29:51.961693804 +0000 UTC m=+5289.143210025" Jan 21 16:29:53 crc kubenswrapper[4707]: I0121 16:29:53.959109 4707 generic.go:334] "Generic (PLEG): container finished" podID="e30a6419-83be-4c52-8ea9-2d51433559ad" containerID="66c4a8f3ac89b247e1cc56eae55fc9902a78f6cc0c3a8012683a254d0687b317" exitCode=0 Jan 21 16:29:53 crc kubenswrapper[4707]: I0121 16:29:53.959150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" event={"ID":"e30a6419-83be-4c52-8ea9-2d51433559ad","Type":"ContainerDied","Data":"66c4a8f3ac89b247e1cc56eae55fc9902a78f6cc0c3a8012683a254d0687b317"} Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.172738 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.238226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8cb2\" (UniqueName: \"kubernetes.io/projected/e30a6419-83be-4c52-8ea9-2d51433559ad-kube-api-access-q8cb2\") pod \"e30a6419-83be-4c52-8ea9-2d51433559ad\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.238297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-custom-tls-dns-combined-ca-bundle\") pod \"e30a6419-83be-4c52-8ea9-2d51433559ad\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.238358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-inventory\") pod \"e30a6419-83be-4c52-8ea9-2d51433559ad\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.238477 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-ssh-key-openstack-edpm-tls\") pod \"e30a6419-83be-4c52-8ea9-2d51433559ad\" (UID: \"e30a6419-83be-4c52-8ea9-2d51433559ad\") " Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.243276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30a6419-83be-4c52-8ea9-2d51433559ad-kube-api-access-q8cb2" (OuterVolumeSpecName: "kube-api-access-q8cb2") pod "e30a6419-83be-4c52-8ea9-2d51433559ad" (UID: "e30a6419-83be-4c52-8ea9-2d51433559ad"). InnerVolumeSpecName "kube-api-access-q8cb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.243470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "e30a6419-83be-4c52-8ea9-2d51433559ad" (UID: "e30a6419-83be-4c52-8ea9-2d51433559ad"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.254729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-inventory" (OuterVolumeSpecName: "inventory") pod "e30a6419-83be-4c52-8ea9-2d51433559ad" (UID: "e30a6419-83be-4c52-8ea9-2d51433559ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.255222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "e30a6419-83be-4c52-8ea9-2d51433559ad" (UID: "e30a6419-83be-4c52-8ea9-2d51433559ad"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.340822 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.340857 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8cb2\" (UniqueName: \"kubernetes.io/projected/e30a6419-83be-4c52-8ea9-2d51433559ad-kube-api-access-q8cb2\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.340868 4707 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.340878 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e30a6419-83be-4c52-8ea9-2d51433559ad-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.973204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" event={"ID":"e30a6419-83be-4c52-8ea9-2d51433559ad","Type":"ContainerDied","Data":"87cd2da074efd81146446797e367e15f66a387a84848f6feb7a65ff1700ac9b4"} Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.973244 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87cd2da074efd81146446797e367e15f66a387a84848f6feb7a65ff1700ac9b4" Jan 21 16:29:55 crc kubenswrapper[4707]: I0121 16:29:55.973255 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs" Jan 21 16:29:56 crc kubenswrapper[4707]: I0121 16:29:56.182543 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:29:56 crc kubenswrapper[4707]: E0121 16:29:56.182889 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.486775 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb"] Jan 21 16:29:57 crc kubenswrapper[4707]: E0121 16:29:57.487256 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30a6419-83be-4c52-8ea9-2d51433559ad" containerName="custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.487269 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30a6419-83be-4c52-8ea9-2d51433559ad" containerName="custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.487419 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30a6419-83be-4c52-8ea9-2d51433559ad" containerName="custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.487843 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.490517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.490869 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.491475 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.491611 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.492127 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-custom-tls-dns-default-certs-0" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.492277 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.494336 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb"] Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.497062 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dns-ips-default-certs-0" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-tls-dns-ips-combined-ca-bundle\") pod 
\"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.571365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-tls-dns-ips-default-certs-0\") 
pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.673331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.677965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.678008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 
16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.677966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.678241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.678392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.679310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.679901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.687012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:57 crc kubenswrapper[4707]: I0121 16:29:57.799983 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:29:58 crc kubenswrapper[4707]: I0121 16:29:58.145272 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb"] Jan 21 16:29:58 crc kubenswrapper[4707]: I0121 16:29:58.992221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" event={"ID":"bee7d351-52d8-4f09-beae-23003968261d","Type":"ContainerStarted","Data":"0c56bac6f0ac9527fe972ec248729a18c00980c00336de32533aef2bc0c95fff"} Jan 21 16:29:58 crc kubenswrapper[4707]: I0121 16:29:58.992426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" event={"ID":"bee7d351-52d8-4f09-beae-23003968261d","Type":"ContainerStarted","Data":"9f14a8d4f2470b4ca7154ae9531959aa0e27618bd3f30ed2f447935f6d8d79b9"} Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.119114 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" podStartSLOduration=2.6368679200000003 podStartE2EDuration="3.119098756s" podCreationTimestamp="2026-01-21 16:29:57 +0000 UTC" firstStartedPulling="2026-01-21 16:29:58.150266126 +0000 UTC m=+5295.331782349" lastFinishedPulling="2026-01-21 16:29:58.632496963 +0000 UTC m=+5295.814013185" observedRunningTime="2026-01-21 16:29:59.011887301 +0000 UTC m=+5296.193403522" watchObservedRunningTime="2026-01-21 16:30:00.119098756 +0000 UTC m=+5297.300614978" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.121521 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z"] Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.122334 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.124064 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.124556 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.128599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z"] Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.306412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjxj\" (UniqueName: \"kubernetes.io/projected/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-kube-api-access-fpjxj\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.306458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-secret-volume\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.306623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-config-volume\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.407418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-config-volume\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.407498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjxj\" (UniqueName: \"kubernetes.io/projected/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-kube-api-access-fpjxj\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.407526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-secret-volume\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.408252 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-config-volume\") pod 
\"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.411272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-secret-volume\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.422507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjxj\" (UniqueName: \"kubernetes.io/projected/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-kube-api-access-fpjxj\") pod \"collect-profiles-29483550-m9k5z\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.438449 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:00 crc kubenswrapper[4707]: I0121 16:30:00.799935 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z"] Jan 21 16:30:01 crc kubenswrapper[4707]: I0121 16:30:01.006195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" event={"ID":"bc8a5bab-d9d5-4a6f-99ea-d2d512629736","Type":"ContainerStarted","Data":"111a5c0f9ebad2cfb5b3d45c32e25bc49180396cf692915f50b5afb020d5dd7d"} Jan 21 16:30:01 crc kubenswrapper[4707]: I0121 16:30:01.006604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" event={"ID":"bc8a5bab-d9d5-4a6f-99ea-d2d512629736","Type":"ContainerStarted","Data":"18de86d2e63702f2876b51ee9bbb06dd8bc5ff32ec484179b99e5aaa794e7f45"} Jan 21 16:30:01 crc kubenswrapper[4707]: I0121 16:30:01.025706 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" podStartSLOduration=1.025693413 podStartE2EDuration="1.025693413s" podCreationTimestamp="2026-01-21 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:30:01.02154365 +0000 UTC m=+5298.203059872" watchObservedRunningTime="2026-01-21 16:30:01.025693413 +0000 UTC m=+5298.207209635" Jan 21 16:30:02 crc kubenswrapper[4707]: I0121 16:30:02.014352 4707 generic.go:334] "Generic (PLEG): container finished" podID="bc8a5bab-d9d5-4a6f-99ea-d2d512629736" containerID="111a5c0f9ebad2cfb5b3d45c32e25bc49180396cf692915f50b5afb020d5dd7d" exitCode=0 Jan 21 16:30:02 crc kubenswrapper[4707]: I0121 16:30:02.014413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" event={"ID":"bc8a5bab-d9d5-4a6f-99ea-d2d512629736","Type":"ContainerDied","Data":"111a5c0f9ebad2cfb5b3d45c32e25bc49180396cf692915f50b5afb020d5dd7d"} Jan 21 16:30:02 crc kubenswrapper[4707]: I0121 16:30:02.016052 4707 generic.go:334] "Generic (PLEG): container finished" podID="bee7d351-52d8-4f09-beae-23003968261d" containerID="0c56bac6f0ac9527fe972ec248729a18c00980c00336de32533aef2bc0c95fff" exitCode=0 Jan 
21 16:30:02 crc kubenswrapper[4707]: I0121 16:30:02.016084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" event={"ID":"bee7d351-52d8-4f09-beae-23003968261d","Type":"ContainerDied","Data":"0c56bac6f0ac9527fe972ec248729a18c00980c00336de32533aef2bc0c95fff"} Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.280636 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.285134 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-inventory\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-install-certs-ovrd-combined-ca-bundle\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-ssh-key-openstack-edpm-tls\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-secret-volume\") pod \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-tls-dns-ips-combined-ca-bundle\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442845 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-config-volume\") pod \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-custom-tls-dns-combined-ca-bundle\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-kube-api-access-nmtrz\") pod \"bee7d351-52d8-4f09-beae-23003968261d\" (UID: \"bee7d351-52d8-4f09-beae-23003968261d\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.442929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpjxj\" (UniqueName: \"kubernetes.io/projected/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-kube-api-access-fpjxj\") pod \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\" (UID: \"bc8a5bab-d9d5-4a6f-99ea-d2d512629736\") " Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.443951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc8a5bab-d9d5-4a6f-99ea-d2d512629736" (UID: "bc8a5bab-d9d5-4a6f-99ea-d2d512629736"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.447292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.447328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-kube-api-access-fpjxj" (OuterVolumeSpecName: "kube-api-access-fpjxj") pod "bc8a5bab-d9d5-4a6f-99ea-d2d512629736" (UID: "bc8a5bab-d9d5-4a6f-99ea-d2d512629736"). InnerVolumeSpecName "kube-api-access-fpjxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.447556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-kube-api-access-nmtrz" (OuterVolumeSpecName: "kube-api-access-nmtrz") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "kube-api-access-nmtrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.447627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.447885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.447968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc8a5bab-d9d5-4a6f-99ea-d2d512629736" (UID: "bc8a5bab-d9d5-4a6f-99ea-d2d512629736"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.448938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.459437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.459717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-inventory" (OuterVolumeSpecName: "inventory") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.465585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "bee7d351-52d8-4f09-beae-23003968261d" (UID: "bee7d351-52d8-4f09-beae-23003968261d"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544332 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544357 4707 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544368 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-kube-api-access-nmtrz\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544380 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpjxj\" (UniqueName: \"kubernetes.io/projected/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-kube-api-access-fpjxj\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544389 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544399 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544409 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bee7d351-52d8-4f09-beae-23003968261d-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544419 4707 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544429 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544439 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc8a5bab-d9d5-4a6f-99ea-d2d512629736-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:03 crc kubenswrapper[4707]: I0121 16:30:03.544448 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee7d351-52d8-4f09-beae-23003968261d-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.028224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" 
event={"ID":"bc8a5bab-d9d5-4a6f-99ea-d2d512629736","Type":"ContainerDied","Data":"18de86d2e63702f2876b51ee9bbb06dd8bc5ff32ec484179b99e5aaa794e7f45"} Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.028360 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18de86d2e63702f2876b51ee9bbb06dd8bc5ff32ec484179b99e5aaa794e7f45" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.028235 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.029557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" event={"ID":"bee7d351-52d8-4f09-beae-23003968261d","Type":"ContainerDied","Data":"9f14a8d4f2470b4ca7154ae9531959aa0e27618bd3f30ed2f447935f6d8d79b9"} Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.029595 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f14a8d4f2470b4ca7154ae9531959aa0e27618bd3f30ed2f447935f6d8d79b9" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.029573 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.077928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh"] Jan 21 16:30:04 crc kubenswrapper[4707]: E0121 16:30:04.078329 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee7d351-52d8-4f09-beae-23003968261d" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.078402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee7d351-52d8-4f09-beae-23003968261d" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:04 crc kubenswrapper[4707]: E0121 16:30:04.078473 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8a5bab-d9d5-4a6f-99ea-d2d512629736" containerName="collect-profiles" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.078525 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8a5bab-d9d5-4a6f-99ea-d2d512629736" containerName="collect-profiles" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.078715 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee7d351-52d8-4f09-beae-23003968261d" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.078791 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8a5bab-d9d5-4a6f-99ea-d2d512629736" containerName="collect-profiles" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.079959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.082088 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.082513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.082541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-ltszx" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.082519 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.082678 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.087244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh"] Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.145136 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w"] Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.146090 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.148415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dns-ips-default-certs-0" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.148610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-custom-tls-dns-default-certs-0" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.153540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w"] Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.251818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.251859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.251898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-inventory\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hdd\" (UniqueName: \"kubernetes.io/projected/e40a1990-682f-4dc9-ad63-4df0524fac89-kube-api-access-b5hdd\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252586 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.252622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354441 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-inventory\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hdd\" (UniqueName: \"kubernetes.io/projected/e40a1990-682f-4dc9-ad63-4df0524fac89-kube-api-access-b5hdd\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.354593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: 
\"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-inventory\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.358821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-ssh-key-openstack-edpm-tls\") pod 
\"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.361238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.363968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.364683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.369232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.370186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259"] Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.370702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hdd\" (UniqueName: \"kubernetes.io/projected/e40a1990-682f-4dc9-ad63-4df0524fac89-kube-api-access-b5hdd\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.375063 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-qb259"] Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.391552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.459277 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:04 crc kubenswrapper[4707]: E0121 16:30:04.459344 4707 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug" Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.743745 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh"] Jan 21 16:30:04 crc kubenswrapper[4707]: I0121 16:30:04.810977 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w"] Jan 21 16:30:04 crc kubenswrapper[4707]: W0121 16:30:04.812768 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94cf21ee_a6c7_4982_99d8_a7d83a1bc7e0.slice/crio-78c8826117845ff5fa1b2271a31c5be19cdebf07846c30032d0a07eefb36aa3c WatchSource:0}: Error finding container 78c8826117845ff5fa1b2271a31c5be19cdebf07846c30032d0a07eefb36aa3c: Status 404 returned error can't find the container with id 78c8826117845ff5fa1b2271a31c5be19cdebf07846c30032d0a07eefb36aa3c Jan 21 16:30:04 crc kubenswrapper[4707]: E0121 16:30:04.814339 4707 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug" Jan 21 16:30:05 crc kubenswrapper[4707]: I0121 16:30:05.036046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" event={"ID":"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0","Type":"ContainerStarted","Data":"78c8826117845ff5fa1b2271a31c5be19cdebf07846c30032d0a07eefb36aa3c"} Jan 21 16:30:05 crc kubenswrapper[4707]: I0121 16:30:05.037437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" event={"ID":"e40a1990-682f-4dc9-ad63-4df0524fac89","Type":"ContainerStarted","Data":"e391af1ba55cd10975c31f6925afa5da120e5db88f08e300f7cd613d706c975f"} Jan 21 16:30:05 crc kubenswrapper[4707]: I0121 16:30:05.188358 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb5a6c6-fba7-4e90-9184-2e18785a63e7" path="/var/lib/kubelet/pods/2bb5a6c6-fba7-4e90-9184-2e18785a63e7/volumes" Jan 21 16:30:05 crc kubenswrapper[4707]: E0121 16:30:05.338658 4707 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.044345 4707 generic.go:334] "Generic (PLEG): container finished" podID="94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" containerID="ed8c20d986a32ef299a032835533ff2b7417143fe53661b5358443f520d92122" exitCode=0 Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.044553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" 
event={"ID":"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0","Type":"ContainerDied","Data":"ed8c20d986a32ef299a032835533ff2b7417143fe53661b5358443f520d92122"} Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.045789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" event={"ID":"e40a1990-682f-4dc9-ad63-4df0524fac89","Type":"ContainerStarted","Data":"f595525796e4443bc05aeb397fa805802f27fe9cd2a5a57c36f69cf79a9d3faf"} Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.081952 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" podStartSLOduration=1.533556803 podStartE2EDuration="2.081938295s" podCreationTimestamp="2026-01-21 16:30:04 +0000 UTC" firstStartedPulling="2026-01-21 16:30:04.745682085 +0000 UTC m=+5301.927198317" lastFinishedPulling="2026-01-21 16:30:05.294063586 +0000 UTC m=+5302.475579809" observedRunningTime="2026-01-21 16:30:06.078026559 +0000 UTC m=+5303.259542781" watchObservedRunningTime="2026-01-21 16:30:06.081938295 +0000 UTC m=+5303.263454517" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.092178 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w"] Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.099856 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w"] Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.318509 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955"] Jan 21 16:30:06 crc kubenswrapper[4707]: E0121 16:30:06.318741 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.318759 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.318912 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.319332 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.327581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955"] Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 
crc kubenswrapper[4707]: I0121 16:30:06.479552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.479569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " 
pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.580929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.584338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.584716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.584751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.585277 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") 
" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.585384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.585777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.585928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.593985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-kube-api-access-nmtrz\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.633655 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:06 crc kubenswrapper[4707]: E0121 16:30:06.633717 4707 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug" Jan 21 16:30:06 crc kubenswrapper[4707]: I0121 16:30:06.972833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955"] Jan 21 16:30:06 crc kubenswrapper[4707]: W0121 16:30:06.975343 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79bcacef_357c_41b3_b817_bf1f93f03779.slice/crio-0a7586139b459aec9be42b38e0a550eedc62f33aec934cd640d351058d0e5567 WatchSource:0}: Error finding container 0a7586139b459aec9be42b38e0a550eedc62f33aec934cd640d351058d0e5567: Status 404 returned error can't find the container with id 0a7586139b459aec9be42b38e0a550eedc62f33aec934cd640d351058d0e5567 Jan 21 16:30:06 crc kubenswrapper[4707]: E0121 16:30:06.976429 4707 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.052311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" event={"ID":"79bcacef-357c-41b3-b817-bf1f93f03779","Type":"ContainerStarted","Data":"0a7586139b459aec9be42b38e0a550eedc62f33aec934cd640d351058d0e5567"} Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.200840 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-ssh-key-openstack-edpm-tls\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-custom-tls-dns-combined-ca-bundle\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-inventory\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-kube-api-access-nmtrz\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.390775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-tls-dns-ips-combined-ca-bundle\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.391198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-install-certs-ovrd-combined-ca-bundle\") pod \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\" (UID: \"94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0\") " Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.394558 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" 
(UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.394575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.394687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.394745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-kube-api-access-nmtrz" (OuterVolumeSpecName: "kube-api-access-nmtrz") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "kube-api-access-nmtrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.394838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.394864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.408250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.409258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-inventory" (OuterVolumeSpecName: "inventory") pod "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" (UID: "94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492103 4707 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492206 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492265 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-kube-api-access-nmtrz\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492316 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492388 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492449 4707 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492507 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: I0121 16:30:07.492562 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:07 crc kubenswrapper[4707]: E0121 16:30:07.583618 4707 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.058719 4707 scope.go:117] "RemoveContainer" containerID="ed8c20d986a32ef299a032835533ff2b7417143fe53661b5358443f520d92122" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.058720 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-2kn2w" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.060111 4707 generic.go:334] "Generic (PLEG): container finished" podID="e40a1990-682f-4dc9-ad63-4df0524fac89" containerID="f595525796e4443bc05aeb397fa805802f27fe9cd2a5a57c36f69cf79a9d3faf" exitCode=0 Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.060131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" event={"ID":"e40a1990-682f-4dc9-ad63-4df0524fac89","Type":"ContainerDied","Data":"f595525796e4443bc05aeb397fa805802f27fe9cd2a5a57c36f69cf79a9d3faf"} Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.061225 4707 generic.go:334] "Generic (PLEG): container finished" podID="79bcacef-357c-41b3-b817-bf1f93f03779" containerID="9d82328bf7c5445b8cb9367936b9fe5fa116e618b167c4b18714a46e501a8a22" exitCode=0 Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.061266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" event={"ID":"79bcacef-357c-41b3-b817-bf1f93f03779","Type":"ContainerDied","Data":"9d82328bf7c5445b8cb9367936b9fe5fa116e618b167c4b18714a46e501a8a22"} Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.100260 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.105590 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.120845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.126041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.138233 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.149161 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.165045 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-4pdkx"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.171296 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.192511 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-shwcs"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.203722 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.207630 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-nzf48"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.211954 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.217427 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-xrn4j"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.219756 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.223398 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-wqmhz"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.227888 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn"] Jan 21 16:30:08 crc kubenswrapper[4707]: E0121 16:30:08.228203 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bcacef-357c-41b3-b817-bf1f93f03779" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.228221 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bcacef-357c-41b3-b817-bf1f93f03779" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.228370 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bcacef-357c-41b3-b817-bf1f93f03779" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.229103 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.231415 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn"] Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.303179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.303246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.303307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k88mg\" (UniqueName: \"kubernetes.io/projected/65ecb823-392d-43a6-b0a6-b4228b3c8800-kube-api-access-k88mg\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.403949 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.404020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.404088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88mg\" (UniqueName: \"kubernetes.io/projected/65ecb823-392d-43a6-b0a6-b4228b3c8800-kube-api-access-k88mg\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.404779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.404795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 
16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.418237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88mg\" (UniqueName: \"kubernetes.io/projected/65ecb823-392d-43a6-b0a6-b4228b3c8800-kube-api-access-k88mg\") pod \"dnsmasq-dnsmasq-84b9f45d47-ffsfn\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.540688 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:08 crc kubenswrapper[4707]: I0121 16:30:08.923463 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn"] Jan 21 16:30:08 crc kubenswrapper[4707]: W0121 16:30:08.928405 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ecb823_392d_43a6_b0a6_b4228b3c8800.slice/crio-71f8e265de8cba4091bd73f5784fde9fbf0b11314faab74412c6b3326e48dc61 WatchSource:0}: Error finding container 71f8e265de8cba4091bd73f5784fde9fbf0b11314faab74412c6b3326e48dc61: Status 404 returned error can't find the container with id 71f8e265de8cba4091bd73f5784fde9fbf0b11314faab74412c6b3326e48dc61 Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.069272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" event={"ID":"65ecb823-392d-43a6-b0a6-b4228b3c8800","Type":"ContainerStarted","Data":"71f8e265de8cba4091bd73f5784fde9fbf0b11314faab74412c6b3326e48dc61"} Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.182679 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:30:09 crc kubenswrapper[4707]: E0121 16:30:09.183127 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.189410 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34202a73-32dd-4bb0-a514-660999065ae9" path="/var/lib/kubelet/pods/34202a73-32dd-4bb0-a514-660999065ae9/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.190087 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0" path="/var/lib/kubelet/pods/94cf21ee-a6c7-4982-99d8-a7d83a1bc7e0/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.190580 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0c0697-5efd-4e19-8a64-f3d71c07cb5f" path="/var/lib/kubelet/pods/ac0c0697-5efd-4e19-8a64-f3d71c07cb5f/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.192776 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee7d351-52d8-4f09-beae-23003968261d" path="/var/lib/kubelet/pods/bee7d351-52d8-4f09-beae-23003968261d/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.193223 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30a6419-83be-4c52-8ea9-2d51433559ad" 
path="/var/lib/kubelet/pods/e30a6419-83be-4c52-8ea9-2d51433559ad/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.193619 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5442563-8062-4eca-bdbf-4eec7053ddc5" path="/var/lib/kubelet/pods/e5442563-8062-4eca-bdbf-4eec7053ddc5/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.194378 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6958951-e96c-4974-bdb2-a00598ba8bcb" path="/var/lib/kubelet/pods/f6958951-e96c-4974-bdb2-a00598ba8bcb/volumes" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.221413 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.265024 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-inventory\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-inventory\") pod \"e40a1990-682f-4dc9-ad63-4df0524fac89\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-install-certs-ovrd-combined-ca-bundle\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-custom-tls-dns-combined-ca-bundle\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-tls-dns-ips-combined-ca-bundle\") pod \"e40a1990-682f-4dc9-ad63-4df0524fac89\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-tls-dns-ips-combined-ca-bundle\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.321983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hdd\" (UniqueName: \"kubernetes.io/projected/e40a1990-682f-4dc9-ad63-4df0524fac89-kube-api-access-b5hdd\") pod \"e40a1990-682f-4dc9-ad63-4df0524fac89\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.322019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-ssh-key-openstack-edpm-tls\") pod \"e40a1990-682f-4dc9-ad63-4df0524fac89\" (UID: \"e40a1990-682f-4dc9-ad63-4df0524fac89\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.322059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-kube-api-access-nmtrz\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.322079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.322103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-ssh-key-openstack-edpm-tls\") pod \"79bcacef-357c-41b3-b817-bf1f93f03779\" (UID: \"79bcacef-357c-41b3-b817-bf1f93f03779\") " Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-kube-api-access-nmtrz" (OuterVolumeSpecName: "kube-api-access-nmtrz") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "kube-api-access-nmtrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "e40a1990-682f-4dc9-ad63-4df0524fac89" (UID: "e40a1990-682f-4dc9-ad63-4df0524fac89"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40a1990-682f-4dc9-ad63-4df0524fac89-kube-api-access-b5hdd" (OuterVolumeSpecName: "kube-api-access-b5hdd") pod "e40a1990-682f-4dc9-ad63-4df0524fac89" (UID: "e40a1990-682f-4dc9-ad63-4df0524fac89"). InnerVolumeSpecName "kube-api-access-b5hdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.326419 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.338102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-inventory" (OuterVolumeSpecName: "inventory") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.338346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "e40a1990-682f-4dc9-ad63-4df0524fac89" (UID: "e40a1990-682f-4dc9-ad63-4df0524fac89"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.338632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "79bcacef-357c-41b3-b817-bf1f93f03779" (UID: "79bcacef-357c-41b3-b817-bf1f93f03779"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.339097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-inventory" (OuterVolumeSpecName: "inventory") pod "e40a1990-682f-4dc9-ad63-4df0524fac89" (UID: "e40a1990-682f-4dc9-ad63-4df0524fac89"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423260 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423291 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423301 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423314 4707 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423324 4707 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423334 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423342 4707 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423352 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5hdd\" (UniqueName: \"kubernetes.io/projected/e40a1990-682f-4dc9-ad63-4df0524fac89-kube-api-access-b5hdd\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423360 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e40a1990-682f-4dc9-ad63-4df0524fac89-ssh-key-openstack-edpm-tls\") on node 
\"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423369 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmtrz\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-kube-api-access-nmtrz\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423377 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/79bcacef-357c-41b3-b817-bf1f93f03779-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:09 crc kubenswrapper[4707]: I0121 16:30:09.423385 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/79bcacef-357c-41b3-b817-bf1f93f03779-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.077153 4707 generic.go:334] "Generic (PLEG): container finished" podID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerID="a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538" exitCode=0 Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.077214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" event={"ID":"65ecb823-392d-43a6-b0a6-b4228b3c8800","Type":"ContainerDied","Data":"a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538"} Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.078674 4707 scope.go:117] "RemoveContainer" containerID="9d82328bf7c5445b8cb9367936b9fe5fa116e618b167c4b18714a46e501a8a22" Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.078681 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-md8lb-debug-26955" Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.080396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" event={"ID":"e40a1990-682f-4dc9-ad63-4df0524fac89","Type":"ContainerDied","Data":"e391af1ba55cd10975c31f6925afa5da120e5db88f08e300f7cd613d706c975f"} Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.080423 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e391af1ba55cd10975c31f6925afa5da120e5db88f08e300f7cd613d706c975f" Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.080466 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh" Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.185899 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh"] Jan 21 16:30:10 crc kubenswrapper[4707]: I0121 16:30:10.191226 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-p84mh"] Jan 21 16:30:11 crc kubenswrapper[4707]: I0121 16:30:11.088929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" event={"ID":"65ecb823-392d-43a6-b0a6-b4228b3c8800","Type":"ContainerStarted","Data":"306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9"} Jan 21 16:30:11 crc kubenswrapper[4707]: I0121 16:30:11.089195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:11 crc kubenswrapper[4707]: I0121 16:30:11.103282 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" podStartSLOduration=3.103270389 podStartE2EDuration="3.103270389s" podCreationTimestamp="2026-01-21 16:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:30:11.099723769 +0000 UTC m=+5308.281240001" watchObservedRunningTime="2026-01-21 16:30:11.103270389 +0000 UTC m=+5308.284786611" Jan 21 16:30:11 crc kubenswrapper[4707]: I0121 16:30:11.188763 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bcacef-357c-41b3-b817-bf1f93f03779" path="/var/lib/kubelet/pods/79bcacef-357c-41b3-b817-bf1f93f03779/volumes" Jan 21 16:30:11 crc kubenswrapper[4707]: I0121 16:30:11.189247 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e40a1990-682f-4dc9-ad63-4df0524fac89" path="/var/lib/kubelet/pods/e40a1990-682f-4dc9-ad63-4df0524fac89/volumes" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.577751 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-vnw9x"] Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.582076 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-vnw9x"] Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.695010 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-vb9hm"] Jan 21 16:30:14 crc kubenswrapper[4707]: E0121 16:30:14.695270 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40a1990-682f-4dc9-ad63-4df0524fac89" containerName="tls-dns-ips-certs-refresh-openstack-edpm-tls" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.695287 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40a1990-682f-4dc9-ad63-4df0524fac89" containerName="tls-dns-ips-certs-refresh-openstack-edpm-tls" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.695439 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40a1990-682f-4dc9-ad63-4df0524fac89" containerName="tls-dns-ips-certs-refresh-openstack-edpm-tls" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.695878 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.697074 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.697354 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.697879 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.698118 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.701087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vb9hm"] Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.783062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-node-mnt\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.783183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-crc-storage\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.783292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdl8\" (UniqueName: \"kubernetes.io/projected/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-kube-api-access-frdl8\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.884107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-node-mnt\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.884177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-crc-storage\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.884229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frdl8\" (UniqueName: \"kubernetes.io/projected/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-kube-api-access-frdl8\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.884371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-node-mnt\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " 
pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.884779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-crc-storage\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:14 crc kubenswrapper[4707]: I0121 16:30:14.898607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdl8\" (UniqueName: \"kubernetes.io/projected/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-kube-api-access-frdl8\") pod \"crc-storage-crc-vb9hm\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:15 crc kubenswrapper[4707]: I0121 16:30:15.009275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:15 crc kubenswrapper[4707]: I0121 16:30:15.188855 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077686fc-4643-4e29-b1b2-fb295c380fa2" path="/var/lib/kubelet/pods/077686fc-4643-4e29-b1b2-fb295c380fa2/volumes" Jan 21 16:30:15 crc kubenswrapper[4707]: I0121 16:30:15.354686 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vb9hm"] Jan 21 16:30:16 crc kubenswrapper[4707]: I0121 16:30:16.120796 4707 generic.go:334] "Generic (PLEG): container finished" podID="de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" containerID="1c7aa8550b66f800ec0c5408ea6423eb502458defbedee473b67569cb31e318c" exitCode=0 Jan 21 16:30:16 crc kubenswrapper[4707]: I0121 16:30:16.120895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vb9hm" event={"ID":"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3","Type":"ContainerDied","Data":"1c7aa8550b66f800ec0c5408ea6423eb502458defbedee473b67569cb31e318c"} Jan 21 16:30:16 crc kubenswrapper[4707]: I0121 16:30:16.121021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vb9hm" event={"ID":"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3","Type":"ContainerStarted","Data":"e7458bcf01bdc8e901a4a53ab8c0ab3f56507a12e9d7633db5d71b8cda6e25dd"} Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.336186 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.414512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-crc-storage\") pod \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.414675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-node-mnt\") pod \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.414704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frdl8\" (UniqueName: \"kubernetes.io/projected/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-kube-api-access-frdl8\") pod \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\" (UID: \"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3\") " Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.415027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" (UID: "de6fac2a-d486-4ea7-aef8-eb8c86aac2e3"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.418952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-kube-api-access-frdl8" (OuterVolumeSpecName: "kube-api-access-frdl8") pod "de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" (UID: "de6fac2a-d486-4ea7-aef8-eb8c86aac2e3"). InnerVolumeSpecName "kube-api-access-frdl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.429638 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" (UID: "de6fac2a-d486-4ea7-aef8-eb8c86aac2e3"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.516108 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.516236 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frdl8\" (UniqueName: \"kubernetes.io/projected/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-kube-api-access-frdl8\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:17 crc kubenswrapper[4707]: I0121 16:30:17.516307 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.134789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vb9hm" event={"ID":"de6fac2a-d486-4ea7-aef8-eb8c86aac2e3","Type":"ContainerDied","Data":"e7458bcf01bdc8e901a4a53ab8c0ab3f56507a12e9d7633db5d71b8cda6e25dd"} Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.134840 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vb9hm" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.134848 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7458bcf01bdc8e901a4a53ab8c0ab3f56507a12e9d7633db5d71b8cda6e25dd" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.542469 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.574493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt"] Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.574677 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" podUID="91fe998d-8b28-4204-9db3-022679520d79" containerName="dnsmasq-dns" containerID="cri-o://64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0" gracePeriod=10 Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.898417 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.933497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-dnsmasq-svc\") pod \"91fe998d-8b28-4204-9db3-022679520d79\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.933586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d24zd\" (UniqueName: \"kubernetes.io/projected/91fe998d-8b28-4204-9db3-022679520d79-kube-api-access-d24zd\") pod \"91fe998d-8b28-4204-9db3-022679520d79\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.933613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-config\") pod \"91fe998d-8b28-4204-9db3-022679520d79\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.933666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-openstack-edpm-tls\") pod \"91fe998d-8b28-4204-9db3-022679520d79\" (UID: \"91fe998d-8b28-4204-9db3-022679520d79\") " Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.938060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fe998d-8b28-4204-9db3-022679520d79-kube-api-access-d24zd" (OuterVolumeSpecName: "kube-api-access-d24zd") pod "91fe998d-8b28-4204-9db3-022679520d79" (UID: "91fe998d-8b28-4204-9db3-022679520d79"). InnerVolumeSpecName "kube-api-access-d24zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.960464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "91fe998d-8b28-4204-9db3-022679520d79" (UID: "91fe998d-8b28-4204-9db3-022679520d79"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.962028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "91fe998d-8b28-4204-9db3-022679520d79" (UID: "91fe998d-8b28-4204-9db3-022679520d79"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:18 crc kubenswrapper[4707]: I0121 16:30:18.965185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-config" (OuterVolumeSpecName: "config") pod "91fe998d-8b28-4204-9db3-022679520d79" (UID: "91fe998d-8b28-4204-9db3-022679520d79"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.035297 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.035325 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d24zd\" (UniqueName: \"kubernetes.io/projected/91fe998d-8b28-4204-9db3-022679520d79-kube-api-access-d24zd\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.035335 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.035343 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/91fe998d-8b28-4204-9db3-022679520d79-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.142958 4707 generic.go:334] "Generic (PLEG): container finished" podID="91fe998d-8b28-4204-9db3-022679520d79" containerID="64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0" exitCode=0 Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.143007 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" event={"ID":"91fe998d-8b28-4204-9db3-022679520d79","Type":"ContainerDied","Data":"64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0"} Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.143012 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.143039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt" event={"ID":"91fe998d-8b28-4204-9db3-022679520d79","Type":"ContainerDied","Data":"f1778e98a5d750134a699aa672af791cb782fc18739d49c21f8492b429f13308"} Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.143055 4707 scope.go:117] "RemoveContainer" containerID="64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.161249 4707 scope.go:117] "RemoveContainer" containerID="be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.167869 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt"] Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.174393 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-gcdzt"] Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.176593 4707 scope.go:117] "RemoveContainer" containerID="64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0" Jan 21 16:30:19 crc kubenswrapper[4707]: E0121 16:30:19.176930 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0\": container with ID starting with 64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0 not found: ID does not exist" containerID="64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.176967 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0"} err="failed to get container status \"64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0\": rpc error: code = NotFound desc = could not find container \"64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0\": container with ID starting with 64b2cef76d27d7462b80101c19632745de01106caecade8b7975fa6610780eb0 not found: ID does not exist" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.177000 4707 scope.go:117] "RemoveContainer" containerID="be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587" Jan 21 16:30:19 crc kubenswrapper[4707]: E0121 16:30:19.177264 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587\": container with ID starting with be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587 not found: ID does not exist" containerID="be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.177285 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587"} err="failed to get container status \"be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587\": rpc error: code = NotFound desc = could not find container \"be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587\": container with ID starting with 
be156adb740d23b53e0f07cac155154fb0078894e64474a043d2fc6159dbf587 not found: ID does not exist" Jan 21 16:30:19 crc kubenswrapper[4707]: I0121 16:30:19.189471 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91fe998d-8b28-4204-9db3-022679520d79" path="/var/lib/kubelet/pods/91fe998d-8b28-4204-9db3-022679520d79/volumes" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.183091 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:30:20 crc kubenswrapper[4707]: E0121 16:30:20.183434 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.206905 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-vb9hm"] Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.210375 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-vb9hm"] Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300184 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-hc5sd"] Jan 21 16:30:20 crc kubenswrapper[4707]: E0121 16:30:20.300417 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" containerName="storage" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" containerName="storage" Jan 21 16:30:20 crc kubenswrapper[4707]: E0121 16:30:20.300451 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fe998d-8b28-4204-9db3-022679520d79" containerName="dnsmasq-dns" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300457 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fe998d-8b28-4204-9db3-022679520d79" containerName="dnsmasq-dns" Jan 21 16:30:20 crc kubenswrapper[4707]: E0121 16:30:20.300469 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fe998d-8b28-4204-9db3-022679520d79" containerName="init" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300475 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fe998d-8b28-4204-9db3-022679520d79" containerName="init" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300587 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" containerName="storage" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300602 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fe998d-8b28-4204-9db3-022679520d79" containerName="dnsmasq-dns" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.300992 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.302311 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.302765 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.302782 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.306020 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-hc5sd"] Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.306272 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.349679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwfqv\" (UniqueName: \"kubernetes.io/projected/f41b10e2-50ae-4374-8b0b-032e4c77b560-kube-api-access-kwfqv\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.349789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f41b10e2-50ae-4374-8b0b-032e4c77b560-crc-storage\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.349894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f41b10e2-50ae-4374-8b0b-032e4c77b560-node-mnt\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.451053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f41b10e2-50ae-4374-8b0b-032e4c77b560-crc-storage\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.451125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f41b10e2-50ae-4374-8b0b-032e4c77b560-node-mnt\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.451197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwfqv\" (UniqueName: \"kubernetes.io/projected/f41b10e2-50ae-4374-8b0b-032e4c77b560-kube-api-access-kwfqv\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.451383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f41b10e2-50ae-4374-8b0b-032e4c77b560-node-mnt\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " 
pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.452012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f41b10e2-50ae-4374-8b0b-032e4c77b560-crc-storage\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.463865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwfqv\" (UniqueName: \"kubernetes.io/projected/f41b10e2-50ae-4374-8b0b-032e4c77b560-kube-api-access-kwfqv\") pod \"crc-storage-crc-hc5sd\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.614564 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:20 crc kubenswrapper[4707]: I0121 16:30:20.968158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-hc5sd"] Jan 21 16:30:21 crc kubenswrapper[4707]: I0121 16:30:21.165449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hc5sd" event={"ID":"f41b10e2-50ae-4374-8b0b-032e4c77b560","Type":"ContainerStarted","Data":"be764f13731a3fbecf891c0a262c4aa8bd9d31f860b11c4dd5a72b0bff660c0e"} Jan 21 16:30:21 crc kubenswrapper[4707]: I0121 16:30:21.188799 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6fac2a-d486-4ea7-aef8-eb8c86aac2e3" path="/var/lib/kubelet/pods/de6fac2a-d486-4ea7-aef8-eb8c86aac2e3/volumes" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.173521 4707 generic.go:334] "Generic (PLEG): container finished" podID="f41b10e2-50ae-4374-8b0b-032e4c77b560" containerID="6e04ef91cff8fa74c1b0d3d24b46f51e19374e2e82d06a6614a0501a8a650971" exitCode=0 Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.173611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hc5sd" event={"ID":"f41b10e2-50ae-4374-8b0b-032e4c77b560","Type":"ContainerDied","Data":"6e04ef91cff8fa74c1b0d3d24b46f51e19374e2e82d06a6614a0501a8a650971"} Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.879177 4707 scope.go:117] "RemoveContainer" containerID="0210688ca235439f353aca12f6763b72f047be39886d80d7a058b2c14a175965" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.898451 4707 scope.go:117] "RemoveContainer" containerID="1ae988e6cd4add3b53e3c3cc095c5ee855688156e72da8b7f2425a418cdea6e0" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.917656 4707 scope.go:117] "RemoveContainer" containerID="40c8baf2705a6950c26464735c18a1163d0dc6b101cffab24e6b91ebf6a64ba6" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.937704 4707 scope.go:117] "RemoveContainer" containerID="f7222e01ea56a4a98c3b30d6a6ce5973c71b51f818b457cb182f9758b6be45d6" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.955878 4707 scope.go:117] "RemoveContainer" containerID="770f64e2a96c02995ed7ed5e8a4f467bc54852deb47c0befcdfdfcd42a4603f7" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.971236 4707 scope.go:117] "RemoveContainer" containerID="026314c401de752ded286a78cfc5e12aeb86eac48b9f8d514a3b62e6ef726c17" Jan 21 16:30:22 crc kubenswrapper[4707]: I0121 16:30:22.991927 4707 scope.go:117] "RemoveContainer" containerID="1579ff7b772e2d8358facb3dea6b00c2f210f24e59c3818da60d976bc6898300" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 
16:30:23.009190 4707 scope.go:117] "RemoveContainer" containerID="da03af4fe60e288843f036c08c74724d08c4be0a5bb6736c5b2d003559fb7b1a" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.024033 4707 scope.go:117] "RemoveContainer" containerID="a020a81b37e967863f6b30f3236f3b06f9910d5c378bcce1ca7133dd7bf9d7bc" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.040452 4707 scope.go:117] "RemoveContainer" containerID="c8f1cdb00ede11835641b05306785a2ecbae7bd2712fb5b0d17051342acde536" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.341750 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.480075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwfqv\" (UniqueName: \"kubernetes.io/projected/f41b10e2-50ae-4374-8b0b-032e4c77b560-kube-api-access-kwfqv\") pod \"f41b10e2-50ae-4374-8b0b-032e4c77b560\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.480121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f41b10e2-50ae-4374-8b0b-032e4c77b560-node-mnt\") pod \"f41b10e2-50ae-4374-8b0b-032e4c77b560\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.480154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f41b10e2-50ae-4374-8b0b-032e4c77b560-crc-storage\") pod \"f41b10e2-50ae-4374-8b0b-032e4c77b560\" (UID: \"f41b10e2-50ae-4374-8b0b-032e4c77b560\") " Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.480238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f41b10e2-50ae-4374-8b0b-032e4c77b560-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f41b10e2-50ae-4374-8b0b-032e4c77b560" (UID: "f41b10e2-50ae-4374-8b0b-032e4c77b560"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.480377 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f41b10e2-50ae-4374-8b0b-032e4c77b560-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.484845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f41b10e2-50ae-4374-8b0b-032e4c77b560-kube-api-access-kwfqv" (OuterVolumeSpecName: "kube-api-access-kwfqv") pod "f41b10e2-50ae-4374-8b0b-032e4c77b560" (UID: "f41b10e2-50ae-4374-8b0b-032e4c77b560"). InnerVolumeSpecName "kube-api-access-kwfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.494442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f41b10e2-50ae-4374-8b0b-032e4c77b560-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f41b10e2-50ae-4374-8b0b-032e4c77b560" (UID: "f41b10e2-50ae-4374-8b0b-032e4c77b560"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.581760 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwfqv\" (UniqueName: \"kubernetes.io/projected/f41b10e2-50ae-4374-8b0b-032e4c77b560-kube-api-access-kwfqv\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:23 crc kubenswrapper[4707]: I0121 16:30:23.581785 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f41b10e2-50ae-4374-8b0b-032e4c77b560-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:24 crc kubenswrapper[4707]: I0121 16:30:24.187248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hc5sd" event={"ID":"f41b10e2-50ae-4374-8b0b-032e4c77b560","Type":"ContainerDied","Data":"be764f13731a3fbecf891c0a262c4aa8bd9d31f860b11c4dd5a72b0bff660c0e"} Jan 21 16:30:24 crc kubenswrapper[4707]: I0121 16:30:24.187280 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be764f13731a3fbecf891c0a262c4aa8bd9d31f860b11c4dd5a72b0bff660c0e" Jan 21 16:30:24 crc kubenswrapper[4707]: I0121 16:30:24.187293 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hc5sd" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.418858 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm"] Jan 21 16:30:26 crc kubenswrapper[4707]: E0121 16:30:26.419317 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f41b10e2-50ae-4374-8b0b-032e4c77b560" containerName="storage" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.419330 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f41b10e2-50ae-4374-8b0b-032e4c77b560" containerName="storage" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.419447 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f41b10e2-50ae-4374-8b0b-032e4c77b560" containerName="storage" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.420165 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: W0121 16:30:26.425130 4707 reflector.go:561] object-"openstack-kuttl-tests"/"edpm-extramounts": failed to list *v1.ConfigMap: configmaps "edpm-extramounts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 21 16:30:26 crc kubenswrapper[4707]: E0121 16:30:26.425176 4707 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"edpm-extramounts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"edpm-extramounts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.436586 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm"] Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.618144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-config\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.618210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dp88\" (UniqueName: \"kubernetes.io/projected/f5726661-624f-47b9-bb13-aac4bbb6f988-kube-api-access-2dp88\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.618234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.618292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-edpm-extramounts\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.719427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dp88\" (UniqueName: \"kubernetes.io/projected/f5726661-624f-47b9-bb13-aac4bbb6f988-kube-api-access-2dp88\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.719495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: 
\"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.719555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-edpm-extramounts\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.719653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-config\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.720228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.720323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-config\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:26 crc kubenswrapper[4707]: I0121 16:30:26.735424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dp88\" (UniqueName: \"kubernetes.io/projected/f5726661-624f-47b9-bb13-aac4bbb6f988-kube-api-access-2dp88\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:27 crc kubenswrapper[4707]: I0121 16:30:27.455102 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-extramounts" Jan 21 16:30:27 crc kubenswrapper[4707]: I0121 16:30:27.461112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-edpm-extramounts\") pod \"dnsmasq-dnsmasq-76cd9645f5-2v7cm\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:27 crc kubenswrapper[4707]: I0121 16:30:27.637401 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:27 crc kubenswrapper[4707]: I0121 16:30:27.986215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm"] Jan 21 16:30:28 crc kubenswrapper[4707]: I0121 16:30:28.212188 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerID="8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c" exitCode=0 Jan 21 16:30:28 crc kubenswrapper[4707]: I0121 16:30:28.212225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" event={"ID":"f5726661-624f-47b9-bb13-aac4bbb6f988","Type":"ContainerDied","Data":"8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c"} Jan 21 16:30:28 crc kubenswrapper[4707]: I0121 16:30:28.212246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" event={"ID":"f5726661-624f-47b9-bb13-aac4bbb6f988","Type":"ContainerStarted","Data":"94fcc59ce03f21cd359aff3adcc178cace783345257f13751ba8bfec3ecb80e4"} Jan 21 16:30:29 crc kubenswrapper[4707]: I0121 16:30:29.218652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" event={"ID":"f5726661-624f-47b9-bb13-aac4bbb6f988","Type":"ContainerStarted","Data":"67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9"} Jan 21 16:30:29 crc kubenswrapper[4707]: I0121 16:30:29.218866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:29 crc kubenswrapper[4707]: I0121 16:30:29.229630 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" podStartSLOduration=3.2296184 podStartE2EDuration="3.2296184s" podCreationTimestamp="2026-01-21 16:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:30:29.228756689 +0000 UTC m=+5326.410272911" watchObservedRunningTime="2026-01-21 16:30:29.2296184 +0000 UTC m=+5326.411134622" Jan 21 16:30:33 crc kubenswrapper[4707]: I0121 16:30:33.186260 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:30:33 crc kubenswrapper[4707]: E0121 16:30:33.186832 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:30:37 crc kubenswrapper[4707]: I0121 16:30:37.638552 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:37 crc kubenswrapper[4707]: I0121 16:30:37.673097 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn"] Jan 21 16:30:37 crc kubenswrapper[4707]: I0121 16:30:37.673387 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" 
podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerName="dnsmasq-dns" containerID="cri-o://306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9" gracePeriod=10 Jan 21 16:30:37 crc kubenswrapper[4707]: I0121 16:30:37.999473 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.152420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-dnsmasq-svc\") pod \"65ecb823-392d-43a6-b0a6-b4228b3c8800\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.152542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-config\") pod \"65ecb823-392d-43a6-b0a6-b4228b3c8800\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.152588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k88mg\" (UniqueName: \"kubernetes.io/projected/65ecb823-392d-43a6-b0a6-b4228b3c8800-kube-api-access-k88mg\") pod \"65ecb823-392d-43a6-b0a6-b4228b3c8800\" (UID: \"65ecb823-392d-43a6-b0a6-b4228b3c8800\") " Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.156729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ecb823-392d-43a6-b0a6-b4228b3c8800-kube-api-access-k88mg" (OuterVolumeSpecName: "kube-api-access-k88mg") pod "65ecb823-392d-43a6-b0a6-b4228b3c8800" (UID: "65ecb823-392d-43a6-b0a6-b4228b3c8800"). InnerVolumeSpecName "kube-api-access-k88mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.177529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "65ecb823-392d-43a6-b0a6-b4228b3c8800" (UID: "65ecb823-392d-43a6-b0a6-b4228b3c8800"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.179256 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-config" (OuterVolumeSpecName: "config") pod "65ecb823-392d-43a6-b0a6-b4228b3c8800" (UID: "65ecb823-392d-43a6-b0a6-b4228b3c8800"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.253600 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.253626 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k88mg\" (UniqueName: \"kubernetes.io/projected/65ecb823-392d-43a6-b0a6-b4228b3c8800-kube-api-access-k88mg\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.253636 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/65ecb823-392d-43a6-b0a6-b4228b3c8800-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.273302 4707 generic.go:334] "Generic (PLEG): container finished" podID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerID="306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9" exitCode=0 Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.273337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" event={"ID":"65ecb823-392d-43a6-b0a6-b4228b3c8800","Type":"ContainerDied","Data":"306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9"} Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.273347 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.273362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn" event={"ID":"65ecb823-392d-43a6-b0a6-b4228b3c8800","Type":"ContainerDied","Data":"71f8e265de8cba4091bd73f5784fde9fbf0b11314faab74412c6b3326e48dc61"} Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.273377 4707 scope.go:117] "RemoveContainer" containerID="306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.289924 4707 scope.go:117] "RemoveContainer" containerID="a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.293744 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn"] Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.298103 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-ffsfn"] Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.313821 4707 scope.go:117] "RemoveContainer" containerID="306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9" Jan 21 16:30:38 crc kubenswrapper[4707]: E0121 16:30:38.314114 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9\": container with ID starting with 306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9 not found: ID does not exist" containerID="306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.314146 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9"} err="failed to get container status \"306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9\": rpc error: code = NotFound desc = could not find container \"306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9\": container with ID starting with 306caaf42f654b475941e3c4d1ae799e84ef6d7af2f625c619caed842da3f0c9 not found: ID does not exist" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.314309 4707 scope.go:117] "RemoveContainer" containerID="a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538" Jan 21 16:30:38 crc kubenswrapper[4707]: E0121 16:30:38.314580 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538\": container with ID starting with a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538 not found: ID does not exist" containerID="a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538" Jan 21 16:30:38 crc kubenswrapper[4707]: I0121 16:30:38.314603 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538"} err="failed to get container status \"a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538\": rpc error: code = NotFound desc = could not find container \"a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538\": container with ID starting with a3b11c308fe515959f2c3aae2b9fac6fab4ad3ded436bf0af7e01e12c93f5538 not found: ID does not exist" Jan 21 16:30:39 crc kubenswrapper[4707]: I0121 16:30:39.189822 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" path="/var/lib/kubelet/pods/65ecb823-392d-43a6-b0a6-b4228b3c8800/volumes" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.508370 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp"] Jan 21 16:30:41 crc kubenswrapper[4707]: E0121 16:30:41.509014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerName="init" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.509026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerName="init" Jan 21 16:30:41 crc kubenswrapper[4707]: E0121 16:30:41.509047 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerName="dnsmasq-dns" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.509053 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerName="dnsmasq-dns" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.509302 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ecb823-392d-43a6-b0a6-b4228b3c8800" containerName="dnsmasq-dns" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.511928 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.523794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp"] Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.693696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.693775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.693850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86f8c\" (UniqueName: \"kubernetes.io/projected/df1d0fcb-d2a5-496a-87d6-801022dc65d3-kube-api-access-86f8c\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.795468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.795510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86f8c\" (UniqueName: \"kubernetes.io/projected/df1d0fcb-d2a5-496a-87d6-801022dc65d3-kube-api-access-86f8c\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.795581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.796224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.796265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 
16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.809961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86f8c\" (UniqueName: \"kubernetes.io/projected/df1d0fcb-d2a5-496a-87d6-801022dc65d3-kube-api-access-86f8c\") pod \"dnsmasq-dnsmasq-84b9f45d47-7p5jp\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:41 crc kubenswrapper[4707]: I0121 16:30:41.825869 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:42 crc kubenswrapper[4707]: I0121 16:30:42.177324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp"] Jan 21 16:30:42 crc kubenswrapper[4707]: W0121 16:30:42.180235 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf1d0fcb_d2a5_496a_87d6_801022dc65d3.slice/crio-328ae4b5591014e77b580d82fdb4e14d847b244758ea528609c64d5deaf38e7b WatchSource:0}: Error finding container 328ae4b5591014e77b580d82fdb4e14d847b244758ea528609c64d5deaf38e7b: Status 404 returned error can't find the container with id 328ae4b5591014e77b580d82fdb4e14d847b244758ea528609c64d5deaf38e7b Jan 21 16:30:42 crc kubenswrapper[4707]: I0121 16:30:42.299431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" event={"ID":"df1d0fcb-d2a5-496a-87d6-801022dc65d3","Type":"ContainerStarted","Data":"217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4"} Jan 21 16:30:42 crc kubenswrapper[4707]: I0121 16:30:42.299472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" event={"ID":"df1d0fcb-d2a5-496a-87d6-801022dc65d3","Type":"ContainerStarted","Data":"328ae4b5591014e77b580d82fdb4e14d847b244758ea528609c64d5deaf38e7b"} Jan 21 16:30:43 crc kubenswrapper[4707]: I0121 16:30:43.305931 4707 generic.go:334] "Generic (PLEG): container finished" podID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerID="217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4" exitCode=0 Jan 21 16:30:43 crc kubenswrapper[4707]: I0121 16:30:43.306029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" event={"ID":"df1d0fcb-d2a5-496a-87d6-801022dc65d3","Type":"ContainerDied","Data":"217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4"} Jan 21 16:30:44 crc kubenswrapper[4707]: I0121 16:30:44.316260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" event={"ID":"df1d0fcb-d2a5-496a-87d6-801022dc65d3","Type":"ContainerStarted","Data":"a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552"} Jan 21 16:30:44 crc kubenswrapper[4707]: I0121 16:30:44.316400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:44 crc kubenswrapper[4707]: I0121 16:30:44.329433 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" podStartSLOduration=3.329419116 podStartE2EDuration="3.329419116s" podCreationTimestamp="2026-01-21 16:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
16:30:44.328560561 +0000 UTC m=+5341.510076784" watchObservedRunningTime="2026-01-21 16:30:44.329419116 +0000 UTC m=+5341.510935338" Jan 21 16:30:45 crc kubenswrapper[4707]: I0121 16:30:45.182170 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:30:45 crc kubenswrapper[4707]: E0121 16:30:45.182528 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.750402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-hc5sd"] Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.754603 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-hc5sd"] Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.842833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-sxksb"] Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.843623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.845172 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.845174 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.845226 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.846264 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.849690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sxksb"] Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.969248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-node-mnt\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.969307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc87\" (UniqueName: \"kubernetes.io/projected/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-kube-api-access-bkc87\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:47 crc kubenswrapper[4707]: I0121 16:30:47.969330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-crc-storage\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.070343 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-node-mnt\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.070393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc87\" (UniqueName: \"kubernetes.io/projected/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-kube-api-access-bkc87\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.070415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-crc-storage\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.070562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-node-mnt\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.071016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-crc-storage\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.084716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc87\" (UniqueName: \"kubernetes.io/projected/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-kube-api-access-bkc87\") pod \"crc-storage-crc-sxksb\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.160004 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:48 crc kubenswrapper[4707]: I0121 16:30:48.502667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sxksb"] Jan 21 16:30:48 crc kubenswrapper[4707]: W0121 16:30:48.504741 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ddc9de_e691_4dbf_9ccc_359225ea8c5d.slice/crio-f9dc4fcffca37eccfd97c6122ea778e730f2c980382c91cb3f471ac658f65188 WatchSource:0}: Error finding container f9dc4fcffca37eccfd97c6122ea778e730f2c980382c91cb3f471ac658f65188: Status 404 returned error can't find the container with id f9dc4fcffca37eccfd97c6122ea778e730f2c980382c91cb3f471ac658f65188 Jan 21 16:30:49 crc kubenswrapper[4707]: I0121 16:30:49.189082 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f41b10e2-50ae-4374-8b0b-032e4c77b560" path="/var/lib/kubelet/pods/f41b10e2-50ae-4374-8b0b-032e4c77b560/volumes" Jan 21 16:30:49 crc kubenswrapper[4707]: I0121 16:30:49.346825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sxksb" event={"ID":"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d","Type":"ContainerStarted","Data":"f9dc4fcffca37eccfd97c6122ea778e730f2c980382c91cb3f471ac658f65188"} Jan 21 16:30:50 crc kubenswrapper[4707]: I0121 16:30:50.353970 4707 generic.go:334] "Generic (PLEG): container finished" podID="d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" containerID="b65eba9a3228bd9211adba96be18511ea19b232296af0788b40167d978efaf41" exitCode=0 Jan 21 16:30:50 crc kubenswrapper[4707]: I0121 16:30:50.354018 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sxksb" event={"ID":"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d","Type":"ContainerDied","Data":"b65eba9a3228bd9211adba96be18511ea19b232296af0788b40167d978efaf41"} Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.572726 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.716097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-node-mnt\") pod \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.716217 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" (UID: "d4ddc9de-e691-4dbf-9ccc-359225ea8c5d"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.716278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkc87\" (UniqueName: \"kubernetes.io/projected/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-kube-api-access-bkc87\") pod \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.716350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-crc-storage\") pod \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\" (UID: \"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d\") " Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.716722 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.720791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-kube-api-access-bkc87" (OuterVolumeSpecName: "kube-api-access-bkc87") pod "d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" (UID: "d4ddc9de-e691-4dbf-9ccc-359225ea8c5d"). InnerVolumeSpecName "kube-api-access-bkc87". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.730711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" (UID: "d4ddc9de-e691-4dbf-9ccc-359225ea8c5d"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.818287 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkc87\" (UniqueName: \"kubernetes.io/projected/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-kube-api-access-bkc87\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.818316 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.826920 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.878353 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm"] Jan 21 16:30:51 crc kubenswrapper[4707]: I0121 16:30:51.878555 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerName="dnsmasq-dns" containerID="cri-o://67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9" gracePeriod=10 Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.210505 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.223876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dp88\" (UniqueName: \"kubernetes.io/projected/f5726661-624f-47b9-bb13-aac4bbb6f988-kube-api-access-2dp88\") pod \"f5726661-624f-47b9-bb13-aac4bbb6f988\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.223937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-dnsmasq-svc\") pod \"f5726661-624f-47b9-bb13-aac4bbb6f988\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.224034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-config\") pod \"f5726661-624f-47b9-bb13-aac4bbb6f988\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.224068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-edpm-extramounts\") pod \"f5726661-624f-47b9-bb13-aac4bbb6f988\" (UID: \"f5726661-624f-47b9-bb13-aac4bbb6f988\") " Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.228524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5726661-624f-47b9-bb13-aac4bbb6f988-kube-api-access-2dp88" (OuterVolumeSpecName: "kube-api-access-2dp88") pod "f5726661-624f-47b9-bb13-aac4bbb6f988" (UID: "f5726661-624f-47b9-bb13-aac4bbb6f988"). InnerVolumeSpecName "kube-api-access-2dp88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.253192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-edpm-extramounts" (OuterVolumeSpecName: "edpm-extramounts") pod "f5726661-624f-47b9-bb13-aac4bbb6f988" (UID: "f5726661-624f-47b9-bb13-aac4bbb6f988"). InnerVolumeSpecName "edpm-extramounts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.253203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f5726661-624f-47b9-bb13-aac4bbb6f988" (UID: "f5726661-624f-47b9-bb13-aac4bbb6f988"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.253690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-config" (OuterVolumeSpecName: "config") pod "f5726661-624f-47b9-bb13-aac4bbb6f988" (UID: "f5726661-624f-47b9-bb13-aac4bbb6f988"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.325405 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.325432 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-edpm-extramounts\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.325444 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dp88\" (UniqueName: \"kubernetes.io/projected/f5726661-624f-47b9-bb13-aac4bbb6f988-kube-api-access-2dp88\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.325453 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f5726661-624f-47b9-bb13-aac4bbb6f988-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.372745 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sxksb" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.372731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sxksb" event={"ID":"d4ddc9de-e691-4dbf-9ccc-359225ea8c5d","Type":"ContainerDied","Data":"f9dc4fcffca37eccfd97c6122ea778e730f2c980382c91cb3f471ac658f65188"} Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.372858 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9dc4fcffca37eccfd97c6122ea778e730f2c980382c91cb3f471ac658f65188" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.374556 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerID="67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9" exitCode=0 Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.374585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" event={"ID":"f5726661-624f-47b9-bb13-aac4bbb6f988","Type":"ContainerDied","Data":"67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9"} Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.374601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" event={"ID":"f5726661-624f-47b9-bb13-aac4bbb6f988","Type":"ContainerDied","Data":"94fcc59ce03f21cd359aff3adcc178cace783345257f13751ba8bfec3ecb80e4"} Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.374606 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.374618 4707 scope.go:117] "RemoveContainer" containerID="67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.390142 4707 scope.go:117] "RemoveContainer" containerID="8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.399217 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm"] Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.403465 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-2v7cm"] Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.419271 4707 scope.go:117] "RemoveContainer" containerID="67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9" Jan 21 16:30:52 crc kubenswrapper[4707]: E0121 16:30:52.419619 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9\": container with ID starting with 67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9 not found: ID does not exist" containerID="67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.419654 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9"} err="failed to get container status \"67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9\": rpc error: code = NotFound desc = could not find container \"67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9\": container with ID starting with 67677555136b52f214fa3440f80b2b62eb3d17527d4f8c2f4a5d997bfc3d52b9 not found: ID does not exist" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.419676 4707 scope.go:117] "RemoveContainer" containerID="8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c" Jan 21 16:30:52 crc kubenswrapper[4707]: E0121 16:30:52.420099 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c\": container with ID starting with 8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c not found: ID does not exist" containerID="8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c" Jan 21 16:30:52 crc kubenswrapper[4707]: I0121 16:30:52.420138 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c"} err="failed to get container status \"8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c\": rpc error: code = NotFound desc = could not find container \"8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c\": container with ID starting with 8d0a58b17c1aeab8d8ced006e02ba94128a78047c1da31d37d518af1693e7c9c not found: ID does not exist" Jan 21 16:30:53 crc kubenswrapper[4707]: I0121 16:30:53.188330 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" path="/var/lib/kubelet/pods/f5726661-624f-47b9-bb13-aac4bbb6f988/volumes" Jan 21 16:30:54 crc 
kubenswrapper[4707]: I0121 16:30:54.440748 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-sxksb"] Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.445378 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-sxksb"] Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.542845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-vgfjc"] Jan 21 16:30:54 crc kubenswrapper[4707]: E0121 16:30:54.543102 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerName="dnsmasq-dns" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.543120 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerName="dnsmasq-dns" Jan 21 16:30:54 crc kubenswrapper[4707]: E0121 16:30:54.543136 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" containerName="storage" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.543142 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" containerName="storage" Jan 21 16:30:54 crc kubenswrapper[4707]: E0121 16:30:54.543158 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerName="init" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.543163 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerName="init" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.543272 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5726661-624f-47b9-bb13-aac4bbb6f988" containerName="dnsmasq-dns" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.543285 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" containerName="storage" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.543678 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.545095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.545096 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.545459 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.545520 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.548915 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vgfjc"] Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.552146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e7bde5b-a787-4160-9df8-31420a7900db-node-mnt\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.552286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwkvp\" (UniqueName: \"kubernetes.io/projected/0e7bde5b-a787-4160-9df8-31420a7900db-kube-api-access-gwkvp\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.552433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e7bde5b-a787-4160-9df8-31420a7900db-crc-storage\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.653203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e7bde5b-a787-4160-9df8-31420a7900db-crc-storage\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.653251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e7bde5b-a787-4160-9df8-31420a7900db-node-mnt\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.653299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwkvp\" (UniqueName: \"kubernetes.io/projected/0e7bde5b-a787-4160-9df8-31420a7900db-kube-api-access-gwkvp\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.653681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e7bde5b-a787-4160-9df8-31420a7900db-node-mnt\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " 
pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.654161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e7bde5b-a787-4160-9df8-31420a7900db-crc-storage\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.666738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwkvp\" (UniqueName: \"kubernetes.io/projected/0e7bde5b-a787-4160-9df8-31420a7900db-kube-api-access-gwkvp\") pod \"crc-storage-crc-vgfjc\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:54 crc kubenswrapper[4707]: I0121 16:30:54.855308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:55 crc kubenswrapper[4707]: I0121 16:30:55.190905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ddc9de-e691-4dbf-9ccc-359225ea8c5d" path="/var/lib/kubelet/pods/d4ddc9de-e691-4dbf-9ccc-359225ea8c5d/volumes" Jan 21 16:30:55 crc kubenswrapper[4707]: I0121 16:30:55.203481 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vgfjc"] Jan 21 16:30:55 crc kubenswrapper[4707]: W0121 16:30:55.205559 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7bde5b_a787_4160_9df8_31420a7900db.slice/crio-7aae50aba0e686d67d2a88de06c817037761686b2b69754fd632a2db42b8b41e WatchSource:0}: Error finding container 7aae50aba0e686d67d2a88de06c817037761686b2b69754fd632a2db42b8b41e: Status 404 returned error can't find the container with id 7aae50aba0e686d67d2a88de06c817037761686b2b69754fd632a2db42b8b41e Jan 21 16:30:55 crc kubenswrapper[4707]: I0121 16:30:55.396365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vgfjc" event={"ID":"0e7bde5b-a787-4160-9df8-31420a7900db","Type":"ContainerStarted","Data":"7aae50aba0e686d67d2a88de06c817037761686b2b69754fd632a2db42b8b41e"} Jan 21 16:30:56 crc kubenswrapper[4707]: I0121 16:30:56.403858 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e7bde5b-a787-4160-9df8-31420a7900db" containerID="8a343fcbdfa8959ca67b548b6cf128598fed8b110fbff3c067e5d5334be963de" exitCode=0 Jan 21 16:30:56 crc kubenswrapper[4707]: I0121 16:30:56.403926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vgfjc" event={"ID":"0e7bde5b-a787-4160-9df8-31420a7900db","Type":"ContainerDied","Data":"8a343fcbdfa8959ca67b548b6cf128598fed8b110fbff3c067e5d5334be963de"} Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.623788 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.807537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e7bde5b-a787-4160-9df8-31420a7900db-node-mnt\") pod \"0e7bde5b-a787-4160-9df8-31420a7900db\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.807583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e7bde5b-a787-4160-9df8-31420a7900db-crc-storage\") pod \"0e7bde5b-a787-4160-9df8-31420a7900db\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.807610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwkvp\" (UniqueName: \"kubernetes.io/projected/0e7bde5b-a787-4160-9df8-31420a7900db-kube-api-access-gwkvp\") pod \"0e7bde5b-a787-4160-9df8-31420a7900db\" (UID: \"0e7bde5b-a787-4160-9df8-31420a7900db\") " Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.807907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e7bde5b-a787-4160-9df8-31420a7900db-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0e7bde5b-a787-4160-9df8-31420a7900db" (UID: "0e7bde5b-a787-4160-9df8-31420a7900db"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.811704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7bde5b-a787-4160-9df8-31420a7900db-kube-api-access-gwkvp" (OuterVolumeSpecName: "kube-api-access-gwkvp") pod "0e7bde5b-a787-4160-9df8-31420a7900db" (UID: "0e7bde5b-a787-4160-9df8-31420a7900db"). InnerVolumeSpecName "kube-api-access-gwkvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.821544 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e7bde5b-a787-4160-9df8-31420a7900db-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0e7bde5b-a787-4160-9df8-31420a7900db" (UID: "0e7bde5b-a787-4160-9df8-31420a7900db"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.909380 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e7bde5b-a787-4160-9df8-31420a7900db-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.909411 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwkvp\" (UniqueName: \"kubernetes.io/projected/0e7bde5b-a787-4160-9df8-31420a7900db-kube-api-access-gwkvp\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:57 crc kubenswrapper[4707]: I0121 16:30:57.909422 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e7bde5b-a787-4160-9df8-31420a7900db-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:30:58 crc kubenswrapper[4707]: I0121 16:30:58.417578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vgfjc" event={"ID":"0e7bde5b-a787-4160-9df8-31420a7900db","Type":"ContainerDied","Data":"7aae50aba0e686d67d2a88de06c817037761686b2b69754fd632a2db42b8b41e"} Jan 21 16:30:58 crc kubenswrapper[4707]: I0121 16:30:58.417793 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aae50aba0e686d67d2a88de06c817037761686b2b69754fd632a2db42b8b41e" Jan 21 16:30:58 crc kubenswrapper[4707]: I0121 16:30:58.417617 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vgfjc" Jan 21 16:30:59 crc kubenswrapper[4707]: I0121 16:30:59.183498 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:30:59 crc kubenswrapper[4707]: E0121 16:30:59.183695 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.525247 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8"] Jan 21 16:31:00 crc kubenswrapper[4707]: E0121 16:31:00.525508 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7bde5b-a787-4160-9df8-31420a7900db" containerName="storage" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.525521 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7bde5b-a787-4160-9df8-31420a7900db" containerName="storage" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.525666 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7bde5b-a787-4160-9df8-31420a7900db" containerName="storage" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.526433 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.527627 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-multinode" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.535134 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8"] Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.583850 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8"] Jan 21 16:31:00 crc kubenswrapper[4707]: E0121 16:31:00.584282 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc kube-api-access-tgqm6 openstack-edpm-multinode], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" podUID="3e553817-dcc1-4019-b0b5-e86c7dd40064" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.603517 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm"] Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.604568 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.628068 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm"] Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.643293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgqm6\" (UniqueName: \"kubernetes.io/projected/3e553817-dcc1-4019-b0b5-e86c7dd40064-kube-api-access-tgqm6\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.643338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-config\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.643488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.643513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: 
\"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgqm6\" (UniqueName: \"kubernetes.io/projected/3e553817-dcc1-4019-b0b5-e86c7dd40064-kube-api-access-tgqm6\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-config\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/db72e387-5271-40cd-bfab-17140e94be26-kube-api-access-vckfc\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.744691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-config\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.745286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.745392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.745438 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-config\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.758734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgqm6\" (UniqueName: \"kubernetes.io/projected/3e553817-dcc1-4019-b0b5-e86c7dd40064-kube-api-access-tgqm6\") pod \"dnsmasq-dnsmasq-58854494b5-btdj8\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.845698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.845794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/db72e387-5271-40cd-bfab-17140e94be26-kube-api-access-vckfc\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.845871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.845925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-config\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.846611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-config\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.846726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-dnsmasq-svc\") pod 
\"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.846894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.862050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/db72e387-5271-40cd-bfab-17140e94be26-kube-api-access-vckfc\") pod \"dnsmasq-dnsmasq-6d9b957dbc-5ctjm\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:00 crc kubenswrapper[4707]: I0121 16:31:00.918110 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.075721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm"] Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.458070 4707 generic.go:334] "Generic (PLEG): container finished" podID="db72e387-5271-40cd-bfab-17140e94be26" containerID="4fcfa86df60d39c1ac681da435815cf1f18f48e397be8dc4fb91c2cb77c986e4" exitCode=0 Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.458100 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" event={"ID":"db72e387-5271-40cd-bfab-17140e94be26","Type":"ContainerDied","Data":"4fcfa86df60d39c1ac681da435815cf1f18f48e397be8dc4fb91c2cb77c986e4"} Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.458273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" event={"ID":"db72e387-5271-40cd-bfab-17140e94be26","Type":"ContainerStarted","Data":"a50d236c3f97b982bef6d22262a1cde08027f5dcfac24a9fe246bc85796c9b7c"} Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.458275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.466790 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-openstack-edpm-multinode\") pod \"3e553817-dcc1-4019-b0b5-e86c7dd40064\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgqm6\" (UniqueName: \"kubernetes.io/projected/3e553817-dcc1-4019-b0b5-e86c7dd40064-kube-api-access-tgqm6\") pod \"3e553817-dcc1-4019-b0b5-e86c7dd40064\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-dnsmasq-svc\") pod \"3e553817-dcc1-4019-b0b5-e86c7dd40064\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-config\") pod \"3e553817-dcc1-4019-b0b5-e86c7dd40064\" (UID: \"3e553817-dcc1-4019-b0b5-e86c7dd40064\") " Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "3e553817-dcc1-4019-b0b5-e86c7dd40064" (UID: "3e553817-dcc1-4019-b0b5-e86c7dd40064"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657433 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.657950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "3e553817-dcc1-4019-b0b5-e86c7dd40064" (UID: "3e553817-dcc1-4019-b0b5-e86c7dd40064"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.658075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-config" (OuterVolumeSpecName: "config") pod "3e553817-dcc1-4019-b0b5-e86c7dd40064" (UID: "3e553817-dcc1-4019-b0b5-e86c7dd40064"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.660135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e553817-dcc1-4019-b0b5-e86c7dd40064-kube-api-access-tgqm6" (OuterVolumeSpecName: "kube-api-access-tgqm6") pod "3e553817-dcc1-4019-b0b5-e86c7dd40064" (UID: "3e553817-dcc1-4019-b0b5-e86c7dd40064"). InnerVolumeSpecName "kube-api-access-tgqm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.758529 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgqm6\" (UniqueName: \"kubernetes.io/projected/3e553817-dcc1-4019-b0b5-e86c7dd40064-kube-api-access-tgqm6\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.758555 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:01 crc kubenswrapper[4707]: I0121 16:31:01.758564 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e553817-dcc1-4019-b0b5-e86c7dd40064-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:02 crc kubenswrapper[4707]: I0121 16:31:02.465552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8" Jan 21 16:31:02 crc kubenswrapper[4707]: I0121 16:31:02.465586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" event={"ID":"db72e387-5271-40cd-bfab-17140e94be26","Type":"ContainerStarted","Data":"a0a47e59ea42cc67df854ed5159686f54a3d6a541b7431a2392bb07190db05fd"} Jan 21 16:31:02 crc kubenswrapper[4707]: I0121 16:31:02.465895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:02 crc kubenswrapper[4707]: I0121 16:31:02.483492 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" podStartSLOduration=2.483477003 podStartE2EDuration="2.483477003s" podCreationTimestamp="2026-01-21 16:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:31:02.479752639 +0000 UTC m=+5359.661268861" watchObservedRunningTime="2026-01-21 16:31:02.483477003 +0000 UTC m=+5359.664993224" Jan 21 16:31:02 crc kubenswrapper[4707]: I0121 16:31:02.507199 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8"] Jan 21 16:31:02 crc kubenswrapper[4707]: I0121 16:31:02.511059 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-btdj8"] Jan 21 16:31:03 crc kubenswrapper[4707]: I0121 16:31:03.188795 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e553817-dcc1-4019-b0b5-e86c7dd40064" path="/var/lib/kubelet/pods/3e553817-dcc1-4019-b0b5-e86c7dd40064/volumes" Jan 21 16:31:10 crc kubenswrapper[4707]: I0121 16:31:10.919953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:10 crc kubenswrapper[4707]: I0121 16:31:10.957293 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp"] Jan 21 16:31:10 crc kubenswrapper[4707]: I0121 16:31:10.957513 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerName="dnsmasq-dns" containerID="cri-o://a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552" gracePeriod=10 Jan 21 16:31:11 crc 
kubenswrapper[4707]: I0121 16:31:11.040991 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx"] Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.041939 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.048790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx"] Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.167371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-config\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.167440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8tb8\" (UniqueName: \"kubernetes.io/projected/249d67f1-6848-4961-9ad1-5cec53893d42-kube-api-access-c8tb8\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.167502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.167524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.268378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-config\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.268477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tb8\" (UniqueName: \"kubernetes.io/projected/249d67f1-6848-4961-9ad1-5cec53893d42-kube-api-access-c8tb8\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.268586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.268629 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.269125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-config\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.269182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.269383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.282580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8tb8\" (UniqueName: \"kubernetes.io/projected/249d67f1-6848-4961-9ad1-5cec53893d42-kube-api-access-c8tb8\") pod \"dnsmasq-dnsmasq-59887957c5-xt2gx\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.283327 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.357302 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.469664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-dnsmasq-svc\") pod \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.469868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86f8c\" (UniqueName: \"kubernetes.io/projected/df1d0fcb-d2a5-496a-87d6-801022dc65d3-kube-api-access-86f8c\") pod \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.469947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-config\") pod \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\" (UID: \"df1d0fcb-d2a5-496a-87d6-801022dc65d3\") " Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.472692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1d0fcb-d2a5-496a-87d6-801022dc65d3-kube-api-access-86f8c" (OuterVolumeSpecName: "kube-api-access-86f8c") pod "df1d0fcb-d2a5-496a-87d6-801022dc65d3" (UID: "df1d0fcb-d2a5-496a-87d6-801022dc65d3"). InnerVolumeSpecName "kube-api-access-86f8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.495328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "df1d0fcb-d2a5-496a-87d6-801022dc65d3" (UID: "df1d0fcb-d2a5-496a-87d6-801022dc65d3"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.496855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-config" (OuterVolumeSpecName: "config") pod "df1d0fcb-d2a5-496a-87d6-801022dc65d3" (UID: "df1d0fcb-d2a5-496a-87d6-801022dc65d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.519438 4707 generic.go:334] "Generic (PLEG): container finished" podID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerID="a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552" exitCode=0 Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.519469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" event={"ID":"df1d0fcb-d2a5-496a-87d6-801022dc65d3","Type":"ContainerDied","Data":"a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552"} Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.519475 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.519491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp" event={"ID":"df1d0fcb-d2a5-496a-87d6-801022dc65d3","Type":"ContainerDied","Data":"328ae4b5591014e77b580d82fdb4e14d847b244758ea528609c64d5deaf38e7b"} Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.519507 4707 scope.go:117] "RemoveContainer" containerID="a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.543103 4707 scope.go:117] "RemoveContainer" containerID="217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.548576 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp"] Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.552442 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-7p5jp"] Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.571276 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.571302 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86f8c\" (UniqueName: \"kubernetes.io/projected/df1d0fcb-d2a5-496a-87d6-801022dc65d3-kube-api-access-86f8c\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.571311 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d0fcb-d2a5-496a-87d6-801022dc65d3-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.571653 4707 scope.go:117] "RemoveContainer" containerID="a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552" Jan 21 16:31:11 crc kubenswrapper[4707]: E0121 16:31:11.571936 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552\": container with ID starting with a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552 not found: ID does not exist" containerID="a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.571961 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552"} err="failed to get container status \"a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552\": rpc error: code = NotFound desc = could not find container \"a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552\": container with ID starting with a8c387f4b0afe475bf6e6572875ad6b8b1398b27bf503750ceceec1c86063552 not found: ID does not exist" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.571988 4707 scope.go:117] "RemoveContainer" containerID="217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4" Jan 21 16:31:11 crc kubenswrapper[4707]: E0121 16:31:11.572291 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4\": container with ID starting with 217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4 not found: ID does not exist" containerID="217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.572310 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4"} err="failed to get container status \"217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4\": rpc error: code = NotFound desc = could not find container \"217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4\": container with ID starting with 217b5a9110d919badd527ea3e3f990e8c749862074c7cd3cdf859d55b65c8bd4 not found: ID does not exist" Jan 21 16:31:11 crc kubenswrapper[4707]: I0121 16:31:11.711708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx"] Jan 21 16:31:12 crc kubenswrapper[4707]: I0121 16:31:12.182542 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:31:12 crc kubenswrapper[4707]: E0121 16:31:12.182919 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:31:12 crc kubenswrapper[4707]: I0121 16:31:12.526325 4707 generic.go:334] "Generic (PLEG): container finished" podID="249d67f1-6848-4961-9ad1-5cec53893d42" containerID="3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e" exitCode=0 Jan 21 16:31:12 crc kubenswrapper[4707]: I0121 16:31:12.526385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" event={"ID":"249d67f1-6848-4961-9ad1-5cec53893d42","Type":"ContainerDied","Data":"3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e"} Jan 21 16:31:12 crc kubenswrapper[4707]: I0121 16:31:12.526409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" event={"ID":"249d67f1-6848-4961-9ad1-5cec53893d42","Type":"ContainerStarted","Data":"05bec55afe9190e53550f498f16d275eae38c6b7d2c1b280d63cdb88f15ad727"} Jan 21 16:31:13 crc kubenswrapper[4707]: I0121 16:31:13.194379 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" path="/var/lib/kubelet/pods/df1d0fcb-d2a5-496a-87d6-801022dc65d3/volumes" Jan 21 16:31:13 crc kubenswrapper[4707]: I0121 16:31:13.535134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" event={"ID":"249d67f1-6848-4961-9ad1-5cec53893d42","Type":"ContainerStarted","Data":"2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09"} Jan 21 16:31:13 crc kubenswrapper[4707]: I0121 16:31:13.535336 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:13 crc kubenswrapper[4707]: I0121 16:31:13.551317 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" podStartSLOduration=2.55130369 podStartE2EDuration="2.55130369s" podCreationTimestamp="2026-01-21 16:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:31:13.546138218 +0000 UTC m=+5370.727654450" watchObservedRunningTime="2026-01-21 16:31:13.55130369 +0000 UTC m=+5370.732819912" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.358592 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.391730 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm"] Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.391915 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" podUID="db72e387-5271-40cd-bfab-17140e94be26" containerName="dnsmasq-dns" containerID="cri-o://a0a47e59ea42cc67df854ed5159686f54a3d6a541b7431a2392bb07190db05fd" gracePeriod=10 Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.585849 4707 generic.go:334] "Generic (PLEG): container finished" podID="db72e387-5271-40cd-bfab-17140e94be26" containerID="a0a47e59ea42cc67df854ed5159686f54a3d6a541b7431a2392bb07190db05fd" exitCode=0 Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.585885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" event={"ID":"db72e387-5271-40cd-bfab-17140e94be26","Type":"ContainerDied","Data":"a0a47e59ea42cc67df854ed5159686f54a3d6a541b7431a2392bb07190db05fd"} Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.716784 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.889866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-dnsmasq-svc\") pod \"db72e387-5271-40cd-bfab-17140e94be26\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.889943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/db72e387-5271-40cd-bfab-17140e94be26-kube-api-access-vckfc\") pod \"db72e387-5271-40cd-bfab-17140e94be26\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.889993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-openstack-edpm-multinode\") pod \"db72e387-5271-40cd-bfab-17140e94be26\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.890015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-config\") pod \"db72e387-5271-40cd-bfab-17140e94be26\" (UID: \"db72e387-5271-40cd-bfab-17140e94be26\") " Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.907023 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g"] Jan 21 16:31:21 crc kubenswrapper[4707]: E0121 16:31:21.907518 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerName="init" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.907647 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerName="init" Jan 21 16:31:21 crc kubenswrapper[4707]: E0121 16:31:21.907717 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db72e387-5271-40cd-bfab-17140e94be26" containerName="init" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.907772 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db72e387-5271-40cd-bfab-17140e94be26" containerName="init" Jan 21 16:31:21 crc kubenswrapper[4707]: E0121 16:31:21.907848 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerName="dnsmasq-dns" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.907897 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerName="dnsmasq-dns" Jan 21 16:31:21 crc kubenswrapper[4707]: E0121 16:31:21.907946 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db72e387-5271-40cd-bfab-17140e94be26" containerName="dnsmasq-dns" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.908024 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db72e387-5271-40cd-bfab-17140e94be26" containerName="dnsmasq-dns" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.910731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1d0fcb-d2a5-496a-87d6-801022dc65d3" containerName="dnsmasq-dns" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.910805 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="db72e387-5271-40cd-bfab-17140e94be26" containerName="dnsmasq-dns" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.911529 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db72e387-5271-40cd-bfab-17140e94be26-kube-api-access-vckfc" (OuterVolumeSpecName: "kube-api-access-vckfc") pod "db72e387-5271-40cd-bfab-17140e94be26" (UID: "db72e387-5271-40cd-bfab-17140e94be26"). InnerVolumeSpecName "kube-api-access-vckfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.911660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.920733 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g"] Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.932340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "db72e387-5271-40cd-bfab-17140e94be26" (UID: "db72e387-5271-40cd-bfab-17140e94be26"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.933170 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "db72e387-5271-40cd-bfab-17140e94be26" (UID: "db72e387-5271-40cd-bfab-17140e94be26"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.936763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-config" (OuterVolumeSpecName: "config") pod "db72e387-5271-40cd-bfab-17140e94be26" (UID: "db72e387-5271-40cd-bfab-17140e94be26"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.991332 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.991361 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/db72e387-5271-40cd-bfab-17140e94be26-kube-api-access-vckfc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.991373 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:21 crc kubenswrapper[4707]: I0121 16:31:21.991382 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db72e387-5271-40cd-bfab-17140e94be26-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.092186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-config\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.092231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcmgh\" (UniqueName: \"kubernetes.io/projected/d0f64f1c-b9a4-4861-9927-66797bb27598-kube-api-access-jcmgh\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.092268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.092299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.192796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.192874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-openstack-edpm-multinode\") pod 
\"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.192925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-config\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.192961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcmgh\" (UniqueName: \"kubernetes.io/projected/d0f64f1c-b9a4-4861-9927-66797bb27598-kube-api-access-jcmgh\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.193936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.193989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-config\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.194003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.207647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcmgh\" (UniqueName: \"kubernetes.io/projected/d0f64f1c-b9a4-4861-9927-66797bb27598-kube-api-access-jcmgh\") pod \"dnsmasq-dnsmasq-5d55f47b6c-bqb2g\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.256534 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.594418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" event={"ID":"db72e387-5271-40cd-bfab-17140e94be26","Type":"ContainerDied","Data":"a50d236c3f97b982bef6d22262a1cde08027f5dcfac24a9fe246bc85796c9b7c"} Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.594470 4707 scope.go:117] "RemoveContainer" containerID="a0a47e59ea42cc67df854ed5159686f54a3d6a541b7431a2392bb07190db05fd" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.594473 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.612174 4707 scope.go:117] "RemoveContainer" containerID="4fcfa86df60d39c1ac681da435815cf1f18f48e397be8dc4fb91c2cb77c986e4" Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.618116 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm"] Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.623641 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6d9b957dbc-5ctjm"] Jan 21 16:31:22 crc kubenswrapper[4707]: I0121 16:31:22.627106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g"] Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.186478 4707 scope.go:117] "RemoveContainer" containerID="53aaf8a6395bb41b259ede2dd51dc8f146aa18c7a6d7b0f4487dfe7bd4502628" Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.188881 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db72e387-5271-40cd-bfab-17140e94be26" path="/var/lib/kubelet/pods/db72e387-5271-40cd-bfab-17140e94be26/volumes" Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.205291 4707 scope.go:117] "RemoveContainer" containerID="49a7623192f683015e2ec84de1db7c9ea233bd6380eb858a6be5d853a2b2e8b4" Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.221410 4707 scope.go:117] "RemoveContainer" containerID="0a9468204faf53ac2c4f64c4e809127e9513b05b8d233daada636d2e0976000a" Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.234412 4707 scope.go:117] "RemoveContainer" containerID="b47a17e145e731e864046d57e853827cc9066db624852107476f44e4bad8e848" Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.601902 4707 generic.go:334] "Generic (PLEG): container finished" podID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerID="fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1" exitCode=0 Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.601977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" event={"ID":"d0f64f1c-b9a4-4861-9927-66797bb27598","Type":"ContainerDied","Data":"fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1"} Jan 21 16:31:23 crc kubenswrapper[4707]: I0121 16:31:23.602002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" event={"ID":"d0f64f1c-b9a4-4861-9927-66797bb27598","Type":"ContainerStarted","Data":"6949e6fa40f369c673d53f16f8848ebbbbac809f1da60d185eca883550145840"} Jan 21 16:31:24 crc kubenswrapper[4707]: I0121 16:31:24.608744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" event={"ID":"d0f64f1c-b9a4-4861-9927-66797bb27598","Type":"ContainerStarted","Data":"fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292"} Jan 21 16:31:24 crc kubenswrapper[4707]: I0121 16:31:24.608840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:25 crc kubenswrapper[4707]: I0121 16:31:25.183154 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:31:25 crc kubenswrapper[4707]: E0121 16:31:25.183557 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.257942 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.270757 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" podStartSLOduration=11.270744022 podStartE2EDuration="11.270744022s" podCreationTimestamp="2026-01-21 16:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:31:24.62210422 +0000 UTC m=+5381.803620442" watchObservedRunningTime="2026-01-21 16:31:32.270744022 +0000 UTC m=+5389.452260243" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.292508 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx"] Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.292888 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" containerName="dnsmasq-dns" containerID="cri-o://2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09" gracePeriod=10 Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.372674 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl"] Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.373927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.382184 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl"] Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.410907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-config\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.410998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.411034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.411182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kglpv\" (UniqueName: \"kubernetes.io/projected/35536385-4324-4394-b6dd-9b051a08c847-kube-api-access-kglpv\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.515256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.516714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.516799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.516936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kglpv\" (UniqueName: \"kubernetes.io/projected/35536385-4324-4394-b6dd-9b051a08c847-kube-api-access-kglpv\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: 
\"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.516998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-config\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.517530 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.517535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-config\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.536581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kglpv\" (UniqueName: \"kubernetes.io/projected/35536385-4324-4394-b6dd-9b051a08c847-kube-api-access-kglpv\") pod \"dnsmasq-dnsmasq-964c896d7-25qpl\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.623395 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.657604 4707 generic.go:334] "Generic (PLEG): container finished" podID="249d67f1-6848-4961-9ad1-5cec53893d42" containerID="2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09" exitCode=0 Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.657837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" event={"ID":"249d67f1-6848-4961-9ad1-5cec53893d42","Type":"ContainerDied","Data":"2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09"} Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.657860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" event={"ID":"249d67f1-6848-4961-9ad1-5cec53893d42","Type":"ContainerDied","Data":"05bec55afe9190e53550f498f16d275eae38c6b7d2c1b280d63cdb88f15ad727"} Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.657875 4707 scope.go:117] "RemoveContainer" containerID="2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.657986 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.674372 4707 scope.go:117] "RemoveContainer" containerID="3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.688627 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.689683 4707 scope.go:117] "RemoveContainer" containerID="2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09" Jan 21 16:31:32 crc kubenswrapper[4707]: E0121 16:31:32.689929 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09\": container with ID starting with 2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09 not found: ID does not exist" containerID="2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.689950 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09"} err="failed to get container status \"2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09\": rpc error: code = NotFound desc = could not find container \"2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09\": container with ID starting with 2908444886e677b722f6261aefc18680fbfbd173542e768170822434615d1a09 not found: ID does not exist" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.689976 4707 scope.go:117] "RemoveContainer" containerID="3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e" Jan 21 16:31:32 crc kubenswrapper[4707]: E0121 16:31:32.690161 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e\": container with ID starting with 3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e not found: ID does not exist" containerID="3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.690176 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e"} err="failed to get container status \"3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e\": rpc error: code = NotFound desc = could not find container \"3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e\": container with ID starting with 3b7a9eacb550a9fc929cbad75c16de8b63c504e6ed6972ca267969e5d2ee2b7e not found: ID does not exist" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.718079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-config\") pod \"249d67f1-6848-4961-9ad1-5cec53893d42\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.718109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8tb8\" (UniqueName: \"kubernetes.io/projected/249d67f1-6848-4961-9ad1-5cec53893d42-kube-api-access-c8tb8\") pod \"249d67f1-6848-4961-9ad1-5cec53893d42\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.718137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-openstack-edpm-multinode\") pod 
\"249d67f1-6848-4961-9ad1-5cec53893d42\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.718178 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-dnsmasq-svc\") pod \"249d67f1-6848-4961-9ad1-5cec53893d42\" (UID: \"249d67f1-6848-4961-9ad1-5cec53893d42\") " Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.721247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249d67f1-6848-4961-9ad1-5cec53893d42-kube-api-access-c8tb8" (OuterVolumeSpecName: "kube-api-access-c8tb8") pod "249d67f1-6848-4961-9ad1-5cec53893d42" (UID: "249d67f1-6848-4961-9ad1-5cec53893d42"). InnerVolumeSpecName "kube-api-access-c8tb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.744370 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "249d67f1-6848-4961-9ad1-5cec53893d42" (UID: "249d67f1-6848-4961-9ad1-5cec53893d42"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.744867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "249d67f1-6848-4961-9ad1-5cec53893d42" (UID: "249d67f1-6848-4961-9ad1-5cec53893d42"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.746545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-config" (OuterVolumeSpecName: "config") pod "249d67f1-6848-4961-9ad1-5cec53893d42" (UID: "249d67f1-6848-4961-9ad1-5cec53893d42"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.819606 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.819626 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8tb8\" (UniqueName: \"kubernetes.io/projected/249d67f1-6848-4961-9ad1-5cec53893d42-kube-api-access-c8tb8\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.819637 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.819645 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/249d67f1-6848-4961-9ad1-5cec53893d42-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.981289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx"] Jan 21 16:31:32 crc kubenswrapper[4707]: I0121 16:31:32.984767 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xt2gx"] Jan 21 16:31:33 crc kubenswrapper[4707]: I0121 16:31:33.166397 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl"] Jan 21 16:31:33 crc kubenswrapper[4707]: I0121 16:31:33.200238 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" path="/var/lib/kubelet/pods/249d67f1-6848-4961-9ad1-5cec53893d42/volumes" Jan 21 16:31:33 crc kubenswrapper[4707]: I0121 16:31:33.666862 4707 generic.go:334] "Generic (PLEG): container finished" podID="35536385-4324-4394-b6dd-9b051a08c847" containerID="444e98afffd869e01578f77a6f465e0adec50973c08d46958d1644d82ed31e8a" exitCode=0 Jan 21 16:31:33 crc kubenswrapper[4707]: I0121 16:31:33.667044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" event={"ID":"35536385-4324-4394-b6dd-9b051a08c847","Type":"ContainerDied","Data":"444e98afffd869e01578f77a6f465e0adec50973c08d46958d1644d82ed31e8a"} Jan 21 16:31:33 crc kubenswrapper[4707]: I0121 16:31:33.667773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" event={"ID":"35536385-4324-4394-b6dd-9b051a08c847","Type":"ContainerStarted","Data":"d56285a7a1b8e227b566fc6858cdaeae5c6ca41d80026cf8d4d1f9648a60ebc9"} Jan 21 16:31:34 crc kubenswrapper[4707]: I0121 16:31:34.675567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" event={"ID":"35536385-4324-4394-b6dd-9b051a08c847","Type":"ContainerStarted","Data":"6043f2e7c297ee9322bfbbb3ded6594508b35cf0c2b6f81edbe3e1fc19b31e12"} Jan 21 16:31:34 crc kubenswrapper[4707]: I0121 16:31:34.676203 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:34 crc kubenswrapper[4707]: I0121 16:31:34.689789 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" podStartSLOduration=2.689775961 podStartE2EDuration="2.689775961s" podCreationTimestamp="2026-01-21 16:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:31:34.685952942 +0000 UTC m=+5391.867469164" watchObservedRunningTime="2026-01-21 16:31:34.689775961 +0000 UTC m=+5391.871292182" Jan 21 16:31:36 crc kubenswrapper[4707]: I0121 16:31:36.182356 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:31:36 crc kubenswrapper[4707]: E0121 16:31:36.182545 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:31:42 crc kubenswrapper[4707]: I0121 16:31:42.690450 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:42 crc kubenswrapper[4707]: I0121 16:31:42.723119 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g"] Jan 21 16:31:42 crc kubenswrapper[4707]: I0121 16:31:42.723311 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerName="dnsmasq-dns" containerID="cri-o://fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292" gracePeriod=10 Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.035113 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.209557 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm"] Jan 21 16:31:43 crc kubenswrapper[4707]: E0121 16:31:43.209790 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerName="init" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.209817 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerName="init" Jan 21 16:31:43 crc kubenswrapper[4707]: E0121 16:31:43.209828 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerName="dnsmasq-dns" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.209834 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerName="dnsmasq-dns" Jan 21 16:31:43 crc kubenswrapper[4707]: E0121 16:31:43.209845 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" containerName="init" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.209850 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" containerName="init" Jan 21 16:31:43 crc kubenswrapper[4707]: E0121 16:31:43.209866 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" containerName="dnsmasq-dns" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.209872 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" containerName="dnsmasq-dns" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.210096 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="249d67f1-6848-4961-9ad1-5cec53893d42" containerName="dnsmasq-dns" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.210125 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerName="dnsmasq-dns" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.210737 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.220196 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm"] Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.225629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcmgh\" (UniqueName: \"kubernetes.io/projected/d0f64f1c-b9a4-4861-9927-66797bb27598-kube-api-access-jcmgh\") pod \"d0f64f1c-b9a4-4861-9927-66797bb27598\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.225666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-config\") pod \"d0f64f1c-b9a4-4861-9927-66797bb27598\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.225689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-dnsmasq-svc\") pod \"d0f64f1c-b9a4-4861-9927-66797bb27598\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.225742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-openstack-edpm-multinode\") pod \"d0f64f1c-b9a4-4861-9927-66797bb27598\" (UID: \"d0f64f1c-b9a4-4861-9927-66797bb27598\") " Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.232663 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f64f1c-b9a4-4861-9927-66797bb27598-kube-api-access-jcmgh" (OuterVolumeSpecName: "kube-api-access-jcmgh") pod "d0f64f1c-b9a4-4861-9927-66797bb27598" (UID: "d0f64f1c-b9a4-4861-9927-66797bb27598"). InnerVolumeSpecName "kube-api-access-jcmgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.251862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d0f64f1c-b9a4-4861-9927-66797bb27598" (UID: "d0f64f1c-b9a4-4861-9927-66797bb27598"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.253157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "d0f64f1c-b9a4-4861-9927-66797bb27598" (UID: "d0f64f1c-b9a4-4861-9927-66797bb27598"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.256018 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-config" (OuterVolumeSpecName: "config") pod "d0f64f1c-b9a4-4861-9927-66797bb27598" (UID: "d0f64f1c-b9a4-4861-9927-66797bb27598"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjz4w\" (UniqueName: \"kubernetes.io/projected/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-kube-api-access-fjz4w\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327649 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcmgh\" (UniqueName: \"kubernetes.io/projected/d0f64f1c-b9a4-4861-9927-66797bb27598-kube-api-access-jcmgh\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327688 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327714 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.327726 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d0f64f1c-b9a4-4861-9927-66797bb27598-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.428919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.429013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.429039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjz4w\" (UniqueName: \"kubernetes.io/projected/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-kube-api-access-fjz4w\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: 
\"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.429971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.430041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.442439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjz4w\" (UniqueName: \"kubernetes.io/projected/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-kube-api-access-fjz4w\") pod \"dnsmasq-dnsmasq-84b9f45d47-tvklm\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.526261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.732875 4707 generic.go:334] "Generic (PLEG): container finished" podID="d0f64f1c-b9a4-4861-9927-66797bb27598" containerID="fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292" exitCode=0 Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.732910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" event={"ID":"d0f64f1c-b9a4-4861-9927-66797bb27598","Type":"ContainerDied","Data":"fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292"} Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.732935 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.732951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g" event={"ID":"d0f64f1c-b9a4-4861-9927-66797bb27598","Type":"ContainerDied","Data":"6949e6fa40f369c673d53f16f8848ebbbbac809f1da60d185eca883550145840"} Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.732983 4707 scope.go:117] "RemoveContainer" containerID="fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.752688 4707 scope.go:117] "RemoveContainer" containerID="fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.756336 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g"] Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.760216 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-bqb2g"] Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.779496 4707 scope.go:117] "RemoveContainer" containerID="fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292" Jan 21 16:31:43 crc kubenswrapper[4707]: E0121 16:31:43.779800 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292\": container with ID starting with fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292 not found: ID does not exist" containerID="fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.779848 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292"} err="failed to get container status \"fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292\": rpc error: code = NotFound desc = could not find container \"fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292\": container with ID starting with fcece0cbbcb6684192eb3cf6e5822e66770e1e3ca92ad1f76dfbd886c20a2292 not found: ID does not exist" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.779870 4707 scope.go:117] "RemoveContainer" containerID="fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1" Jan 21 16:31:43 crc kubenswrapper[4707]: E0121 16:31:43.780166 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1\": container with ID starting with fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1 not found: ID does not exist" containerID="fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.780191 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1"} err="failed to get container status \"fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1\": rpc error: code = NotFound desc = could not find container \"fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1\": container with ID starting with 
fb52a7d748f088ecaf5a4f0e0bcf8a1d30ab64fbc95ab0d2fc677e1bffb458e1 not found: ID does not exist" Jan 21 16:31:43 crc kubenswrapper[4707]: I0121 16:31:43.874169 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm"] Jan 21 16:31:43 crc kubenswrapper[4707]: W0121 16:31:43.878847 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8c8b29d_cd87_46c9_a868_8234fbc1f2ff.slice/crio-ffe69bb05fd869b8b92fdc9d8807e28018526e5da5f273cda360b35610dddfe5 WatchSource:0}: Error finding container ffe69bb05fd869b8b92fdc9d8807e28018526e5da5f273cda360b35610dddfe5: Status 404 returned error can't find the container with id ffe69bb05fd869b8b92fdc9d8807e28018526e5da5f273cda360b35610dddfe5 Jan 21 16:31:44 crc kubenswrapper[4707]: I0121 16:31:44.739646 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerID="436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77" exitCode=0 Jan 21 16:31:44 crc kubenswrapper[4707]: I0121 16:31:44.739682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" event={"ID":"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff","Type":"ContainerDied","Data":"436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77"} Jan 21 16:31:44 crc kubenswrapper[4707]: I0121 16:31:44.739906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" event={"ID":"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff","Type":"ContainerStarted","Data":"ffe69bb05fd869b8b92fdc9d8807e28018526e5da5f273cda360b35610dddfe5"} Jan 21 16:31:45 crc kubenswrapper[4707]: I0121 16:31:45.190534 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f64f1c-b9a4-4861-9927-66797bb27598" path="/var/lib/kubelet/pods/d0f64f1c-b9a4-4861-9927-66797bb27598/volumes" Jan 21 16:31:45 crc kubenswrapper[4707]: I0121 16:31:45.748921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" event={"ID":"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff","Type":"ContainerStarted","Data":"5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4"} Jan 21 16:31:45 crc kubenswrapper[4707]: I0121 16:31:45.749032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:45 crc kubenswrapper[4707]: I0121 16:31:45.762980 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" podStartSLOduration=2.762968209 podStartE2EDuration="2.762968209s" podCreationTimestamp="2026-01-21 16:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:31:45.759616535 +0000 UTC m=+5402.941132757" watchObservedRunningTime="2026-01-21 16:31:45.762968209 +0000 UTC m=+5402.944484430" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.332938 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-vgfjc"] Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.337162 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-vgfjc"] Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.430319 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["crc-storage/crc-storage-crc-mv79g"] Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.431127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.432803 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.432993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.433022 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.434373 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.436011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mv79g"] Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.496697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7fa126c1-c7f0-4cff-af83-c8335ab49217-node-mnt\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.496747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7fa126c1-c7f0-4cff-af83-c8335ab49217-crc-storage\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.496879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/7fa126c1-c7f0-4cff-af83-c8335ab49217-kube-api-access-gjmxc\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.597639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7fa126c1-c7f0-4cff-af83-c8335ab49217-node-mnt\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.597682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7fa126c1-c7f0-4cff-af83-c8335ab49217-crc-storage\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.597743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/7fa126c1-c7f0-4cff-af83-c8335ab49217-kube-api-access-gjmxc\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.597876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/7fa126c1-c7f0-4cff-af83-c8335ab49217-node-mnt\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.598441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7fa126c1-c7f0-4cff-af83-c8335ab49217-crc-storage\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.612303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/7fa126c1-c7f0-4cff-af83-c8335ab49217-kube-api-access-gjmxc\") pod \"crc-storage-crc-mv79g\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:49 crc kubenswrapper[4707]: I0121 16:31:49.747825 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:50 crc kubenswrapper[4707]: I0121 16:31:50.096583 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mv79g"] Jan 21 16:31:50 crc kubenswrapper[4707]: I0121 16:31:50.182982 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:31:50 crc kubenswrapper[4707]: E0121 16:31:50.183211 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:31:50 crc kubenswrapper[4707]: I0121 16:31:50.780217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mv79g" event={"ID":"7fa126c1-c7f0-4cff-af83-c8335ab49217","Type":"ContainerStarted","Data":"e13ed300da3c950bffe5571e50a9e5e196d8fc92f45102572d823fcb84019d9e"} Jan 21 16:31:51 crc kubenswrapper[4707]: I0121 16:31:51.188988 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7bde5b-a787-4160-9df8-31420a7900db" path="/var/lib/kubelet/pods/0e7bde5b-a787-4160-9df8-31420a7900db/volumes" Jan 21 16:31:51 crc kubenswrapper[4707]: I0121 16:31:51.792070 4707 generic.go:334] "Generic (PLEG): container finished" podID="7fa126c1-c7f0-4cff-af83-c8335ab49217" containerID="0c73e5dfbc68aeabe4d0ccbba22d024abe241b6122a7cab7940c3f248348276d" exitCode=0 Jan 21 16:31:51 crc kubenswrapper[4707]: I0121 16:31:51.792175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mv79g" event={"ID":"7fa126c1-c7f0-4cff-af83-c8335ab49217","Type":"ContainerDied","Data":"0c73e5dfbc68aeabe4d0ccbba22d024abe241b6122a7cab7940c3f248348276d"} Jan 21 16:31:52 crc kubenswrapper[4707]: I0121 16:31:52.996593 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.136468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/7fa126c1-c7f0-4cff-af83-c8335ab49217-kube-api-access-gjmxc\") pod \"7fa126c1-c7f0-4cff-af83-c8335ab49217\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.136520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7fa126c1-c7f0-4cff-af83-c8335ab49217-crc-storage\") pod \"7fa126c1-c7f0-4cff-af83-c8335ab49217\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.136572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7fa126c1-c7f0-4cff-af83-c8335ab49217-node-mnt\") pod \"7fa126c1-c7f0-4cff-af83-c8335ab49217\" (UID: \"7fa126c1-c7f0-4cff-af83-c8335ab49217\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.136801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fa126c1-c7f0-4cff-af83-c8335ab49217-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7fa126c1-c7f0-4cff-af83-c8335ab49217" (UID: "7fa126c1-c7f0-4cff-af83-c8335ab49217"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.140834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa126c1-c7f0-4cff-af83-c8335ab49217-kube-api-access-gjmxc" (OuterVolumeSpecName: "kube-api-access-gjmxc") pod "7fa126c1-c7f0-4cff-af83-c8335ab49217" (UID: "7fa126c1-c7f0-4cff-af83-c8335ab49217"). InnerVolumeSpecName "kube-api-access-gjmxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.151015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa126c1-c7f0-4cff-af83-c8335ab49217-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7fa126c1-c7f0-4cff-af83-c8335ab49217" (UID: "7fa126c1-c7f0-4cff-af83-c8335ab49217"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.238996 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/7fa126c1-c7f0-4cff-af83-c8335ab49217-kube-api-access-gjmxc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.239023 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7fa126c1-c7f0-4cff-af83-c8335ab49217-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.239033 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7fa126c1-c7f0-4cff-af83-c8335ab49217-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.527964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.560733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl"] Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.560986 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" podUID="35536385-4324-4394-b6dd-9b051a08c847" containerName="dnsmasq-dns" containerID="cri-o://6043f2e7c297ee9322bfbbb3ded6594508b35cf0c2b6f81edbe3e1fc19b31e12" gracePeriod=10 Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.806229 4707 generic.go:334] "Generic (PLEG): container finished" podID="35536385-4324-4394-b6dd-9b051a08c847" containerID="6043f2e7c297ee9322bfbbb3ded6594508b35cf0c2b6f81edbe3e1fc19b31e12" exitCode=0 Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.806305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" event={"ID":"35536385-4324-4394-b6dd-9b051a08c847","Type":"ContainerDied","Data":"6043f2e7c297ee9322bfbbb3ded6594508b35cf0c2b6f81edbe3e1fc19b31e12"} Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.807489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mv79g" event={"ID":"7fa126c1-c7f0-4cff-af83-c8335ab49217","Type":"ContainerDied","Data":"e13ed300da3c950bffe5571e50a9e5e196d8fc92f45102572d823fcb84019d9e"} Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.807514 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13ed300da3c950bffe5571e50a9e5e196d8fc92f45102572d823fcb84019d9e" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.807556 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mv79g" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.841065 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.946975 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-dnsmasq-svc\") pod \"35536385-4324-4394-b6dd-9b051a08c847\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.947232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-openstack-edpm-multinode\") pod \"35536385-4324-4394-b6dd-9b051a08c847\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.947279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kglpv\" (UniqueName: \"kubernetes.io/projected/35536385-4324-4394-b6dd-9b051a08c847-kube-api-access-kglpv\") pod \"35536385-4324-4394-b6dd-9b051a08c847\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.947343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-config\") pod \"35536385-4324-4394-b6dd-9b051a08c847\" (UID: \"35536385-4324-4394-b6dd-9b051a08c847\") " Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.950849 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35536385-4324-4394-b6dd-9b051a08c847-kube-api-access-kglpv" (OuterVolumeSpecName: "kube-api-access-kglpv") pod "35536385-4324-4394-b6dd-9b051a08c847" (UID: "35536385-4324-4394-b6dd-9b051a08c847"). InnerVolumeSpecName "kube-api-access-kglpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.973324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-config" (OuterVolumeSpecName: "config") pod "35536385-4324-4394-b6dd-9b051a08c847" (UID: "35536385-4324-4394-b6dd-9b051a08c847"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.973692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "35536385-4324-4394-b6dd-9b051a08c847" (UID: "35536385-4324-4394-b6dd-9b051a08c847"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:53 crc kubenswrapper[4707]: I0121 16:31:53.973871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "35536385-4324-4394-b6dd-9b051a08c847" (UID: "35536385-4324-4394-b6dd-9b051a08c847"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.049000 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.049696 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.049727 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kglpv\" (UniqueName: \"kubernetes.io/projected/35536385-4324-4394-b6dd-9b051a08c847-kube-api-access-kglpv\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.049737 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35536385-4324-4394-b6dd-9b051a08c847-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.814800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" event={"ID":"35536385-4324-4394-b6dd-9b051a08c847","Type":"ContainerDied","Data":"d56285a7a1b8e227b566fc6858cdaeae5c6ca41d80026cf8d4d1f9648a60ebc9"} Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.814862 4707 scope.go:117] "RemoveContainer" containerID="6043f2e7c297ee9322bfbbb3ded6594508b35cf0c2b6f81edbe3e1fc19b31e12" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.814916 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.829183 4707 scope.go:117] "RemoveContainer" containerID="444e98afffd869e01578f77a6f465e0adec50973c08d46958d1644d82ed31e8a" Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.835569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl"] Jan 21 16:31:54 crc kubenswrapper[4707]: I0121 16:31:54.839434 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-25qpl"] Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.188486 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35536385-4324-4394-b6dd-9b051a08c847" path="/var/lib/kubelet/pods/35536385-4324-4394-b6dd-9b051a08c847/volumes" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.784714 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-mv79g"] Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.796543 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-mv79g"] Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.883790 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-rzfwr"] Jan 21 16:31:55 crc kubenswrapper[4707]: E0121 16:31:55.884064 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35536385-4324-4394-b6dd-9b051a08c847" containerName="init" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.884077 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35536385-4324-4394-b6dd-9b051a08c847" containerName="init" Jan 21 16:31:55 crc kubenswrapper[4707]: E0121 16:31:55.884087 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35536385-4324-4394-b6dd-9b051a08c847" containerName="dnsmasq-dns" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.884093 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35536385-4324-4394-b6dd-9b051a08c847" containerName="dnsmasq-dns" Jan 21 16:31:55 crc kubenswrapper[4707]: E0121 16:31:55.884107 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa126c1-c7f0-4cff-af83-c8335ab49217" containerName="storage" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.884113 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa126c1-c7f0-4cff-af83-c8335ab49217" containerName="storage" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.884218 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35536385-4324-4394-b6dd-9b051a08c847" containerName="dnsmasq-dns" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.884234 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa126c1-c7f0-4cff-af83-c8335ab49217" containerName="storage" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.884635 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.888042 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.888338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.888564 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.888622 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:31:55 crc kubenswrapper[4707]: I0121 16:31:55.889878 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rzfwr"] Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.072142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aec86c5d-fd76-46cc-9c67-3db97c586d49-node-mnt\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.072316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aec86c5d-fd76-46cc-9c67-3db97c586d49-crc-storage\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.072365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj54c\" (UniqueName: \"kubernetes.io/projected/aec86c5d-fd76-46cc-9c67-3db97c586d49-kube-api-access-tj54c\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.173132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj54c\" (UniqueName: 
\"kubernetes.io/projected/aec86c5d-fd76-46cc-9c67-3db97c586d49-kube-api-access-tj54c\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.173173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aec86c5d-fd76-46cc-9c67-3db97c586d49-node-mnt\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.173265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aec86c5d-fd76-46cc-9c67-3db97c586d49-crc-storage\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.173426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aec86c5d-fd76-46cc-9c67-3db97c586d49-node-mnt\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.173886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aec86c5d-fd76-46cc-9c67-3db97c586d49-crc-storage\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.186124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj54c\" (UniqueName: \"kubernetes.io/projected/aec86c5d-fd76-46cc-9c67-3db97c586d49-kube-api-access-tj54c\") pod \"crc-storage-crc-rzfwr\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.198414 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.539285 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rzfwr"] Jan 21 16:31:56 crc kubenswrapper[4707]: W0121 16:31:56.542580 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaec86c5d_fd76_46cc_9c67_3db97c586d49.slice/crio-24517001c204a66fdf779562337682ed7a57cfd2ea1d0bbf236ed07333b3108d WatchSource:0}: Error finding container 24517001c204a66fdf779562337682ed7a57cfd2ea1d0bbf236ed07333b3108d: Status 404 returned error can't find the container with id 24517001c204a66fdf779562337682ed7a57cfd2ea1d0bbf236ed07333b3108d Jan 21 16:31:56 crc kubenswrapper[4707]: I0121 16:31:56.828701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rzfwr" event={"ID":"aec86c5d-fd76-46cc-9c67-3db97c586d49","Type":"ContainerStarted","Data":"24517001c204a66fdf779562337682ed7a57cfd2ea1d0bbf236ed07333b3108d"} Jan 21 16:31:57 crc kubenswrapper[4707]: I0121 16:31:57.195983 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa126c1-c7f0-4cff-af83-c8335ab49217" path="/var/lib/kubelet/pods/7fa126c1-c7f0-4cff-af83-c8335ab49217/volumes" Jan 21 16:31:57 crc kubenswrapper[4707]: I0121 16:31:57.835538 4707 generic.go:334] "Generic (PLEG): container finished" podID="aec86c5d-fd76-46cc-9c67-3db97c586d49" containerID="abd2e54bf0b7eeaeda07b2cec31a04c4f4da512a51516cff83987af87d0a426e" exitCode=0 Jan 21 16:31:57 crc kubenswrapper[4707]: I0121 16:31:57.835584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rzfwr" event={"ID":"aec86c5d-fd76-46cc-9c67-3db97c586d49","Type":"ContainerDied","Data":"abd2e54bf0b7eeaeda07b2cec31a04c4f4da512a51516cff83987af87d0a426e"} Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.049145 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.206210 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj54c\" (UniqueName: \"kubernetes.io/projected/aec86c5d-fd76-46cc-9c67-3db97c586d49-kube-api-access-tj54c\") pod \"aec86c5d-fd76-46cc-9c67-3db97c586d49\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.206271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aec86c5d-fd76-46cc-9c67-3db97c586d49-crc-storage\") pod \"aec86c5d-fd76-46cc-9c67-3db97c586d49\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.206331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aec86c5d-fd76-46cc-9c67-3db97c586d49-node-mnt\") pod \"aec86c5d-fd76-46cc-9c67-3db97c586d49\" (UID: \"aec86c5d-fd76-46cc-9c67-3db97c586d49\") " Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.206531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aec86c5d-fd76-46cc-9c67-3db97c586d49-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "aec86c5d-fd76-46cc-9c67-3db97c586d49" (UID: "aec86c5d-fd76-46cc-9c67-3db97c586d49"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.206685 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aec86c5d-fd76-46cc-9c67-3db97c586d49-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.210333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec86c5d-fd76-46cc-9c67-3db97c586d49-kube-api-access-tj54c" (OuterVolumeSpecName: "kube-api-access-tj54c") pod "aec86c5d-fd76-46cc-9c67-3db97c586d49" (UID: "aec86c5d-fd76-46cc-9c67-3db97c586d49"). InnerVolumeSpecName "kube-api-access-tj54c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.220460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec86c5d-fd76-46cc-9c67-3db97c586d49-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "aec86c5d-fd76-46cc-9c67-3db97c586d49" (UID: "aec86c5d-fd76-46cc-9c67-3db97c586d49"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.308135 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj54c\" (UniqueName: \"kubernetes.io/projected/aec86c5d-fd76-46cc-9c67-3db97c586d49-kube-api-access-tj54c\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.308166 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aec86c5d-fd76-46cc-9c67-3db97c586d49-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.849358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rzfwr" event={"ID":"aec86c5d-fd76-46cc-9c67-3db97c586d49","Type":"ContainerDied","Data":"24517001c204a66fdf779562337682ed7a57cfd2ea1d0bbf236ed07333b3108d"} Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.849615 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24517001c204a66fdf779562337682ed7a57cfd2ea1d0bbf236ed07333b3108d" Jan 21 16:31:59 crc kubenswrapper[4707]: I0121 16:31:59.849401 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rzfwr" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.119779 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx"] Jan 21 16:32:02 crc kubenswrapper[4707]: E0121 16:32:02.120202 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec86c5d-fd76-46cc-9c67-3db97c586d49" containerName="storage" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.120215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec86c5d-fd76-46cc-9c67-3db97c586d49" containerName="storage" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.120327 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec86c5d-fd76-46cc-9c67-3db97c586d49" containerName="storage" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.120891 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.122882 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.131799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx"] Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.145099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfq77\" (UniqueName: \"kubernetes.io/projected/c14469ba-7597-440f-9cca-f7d6073e098e-kube-api-access-tfq77\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.145322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.145457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.145548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-config\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.182983 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:32:02 crc kubenswrapper[4707]: E0121 16:32:02.183155 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.246349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-config\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.246429 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfq77\" (UniqueName: \"kubernetes.io/projected/c14469ba-7597-440f-9cca-f7d6073e098e-kube-api-access-tfq77\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: 
\"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.246532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.246568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.247151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-config\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.247257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.247735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.269574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfq77\" (UniqueName: \"kubernetes.io/projected/c14469ba-7597-440f-9cca-f7d6073e098e-kube-api-access-tfq77\") pod \"dnsmasq-dnsmasq-64864b6d57-zzhqx\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.433864 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.794141 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx"] Jan 21 16:32:02 crc kubenswrapper[4707]: W0121 16:32:02.795433 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc14469ba_7597_440f_9cca_f7d6073e098e.slice/crio-a556f51abb44343bc27b60e5db6bea505abb365b0403b22229883fe253666d62 WatchSource:0}: Error finding container a556f51abb44343bc27b60e5db6bea505abb365b0403b22229883fe253666d62: Status 404 returned error can't find the container with id a556f51abb44343bc27b60e5db6bea505abb365b0403b22229883fe253666d62 Jan 21 16:32:02 crc kubenswrapper[4707]: I0121 16:32:02.867354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" event={"ID":"c14469ba-7597-440f-9cca-f7d6073e098e","Type":"ContainerStarted","Data":"a556f51abb44343bc27b60e5db6bea505abb365b0403b22229883fe253666d62"} Jan 21 16:32:03 crc kubenswrapper[4707]: I0121 16:32:03.874829 4707 generic.go:334] "Generic (PLEG): container finished" podID="c14469ba-7597-440f-9cca-f7d6073e098e" containerID="f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572" exitCode=0 Jan 21 16:32:03 crc kubenswrapper[4707]: I0121 16:32:03.874920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" event={"ID":"c14469ba-7597-440f-9cca-f7d6073e098e","Type":"ContainerDied","Data":"f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572"} Jan 21 16:32:04 crc kubenswrapper[4707]: I0121 16:32:04.891163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" event={"ID":"c14469ba-7597-440f-9cca-f7d6073e098e","Type":"ContainerStarted","Data":"ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5"} Jan 21 16:32:04 crc kubenswrapper[4707]: I0121 16:32:04.891475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:04 crc kubenswrapper[4707]: I0121 16:32:04.908250 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" podStartSLOduration=2.908236325 podStartE2EDuration="2.908236325s" podCreationTimestamp="2026-01-21 16:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:32:04.903866558 +0000 UTC m=+5422.085382780" watchObservedRunningTime="2026-01-21 16:32:04.908236325 +0000 UTC m=+5422.089752547" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.435761 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.472220 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm"] Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.472392 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerName="dnsmasq-dns" 
containerID="cri-o://5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4" gracePeriod=10 Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.810214 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.864048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-dnsmasq-svc\") pod \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.864098 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjz4w\" (UniqueName: \"kubernetes.io/projected/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-kube-api-access-fjz4w\") pod \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.864265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-config\") pod \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\" (UID: \"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff\") " Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.869593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-kube-api-access-fjz4w" (OuterVolumeSpecName: "kube-api-access-fjz4w") pod "f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" (UID: "f8c8b29d-cd87-46c9-a868-8234fbc1f2ff"). InnerVolumeSpecName "kube-api-access-fjz4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.891614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-config" (OuterVolumeSpecName: "config") pod "f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" (UID: "f8c8b29d-cd87-46c9-a868-8234fbc1f2ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.892165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" (UID: "f8c8b29d-cd87-46c9-a868-8234fbc1f2ff"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.944680 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerID="5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4" exitCode=0 Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.944733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" event={"ID":"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff","Type":"ContainerDied","Data":"5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4"} Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.944767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" event={"ID":"f8c8b29d-cd87-46c9-a868-8234fbc1f2ff","Type":"ContainerDied","Data":"ffe69bb05fd869b8b92fdc9d8807e28018526e5da5f273cda360b35610dddfe5"} Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.944784 4707 scope.go:117] "RemoveContainer" containerID="5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.944738 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.961157 4707 scope.go:117] "RemoveContainer" containerID="436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.966311 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.966397 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.966422 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjz4w\" (UniqueName: \"kubernetes.io/projected/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff-kube-api-access-fjz4w\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.969077 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm"] Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.974321 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-tvklm"] Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.994401 4707 scope.go:117] "RemoveContainer" containerID="5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4" Jan 21 16:32:12 crc kubenswrapper[4707]: E0121 16:32:12.994773 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4\": container with ID starting with 5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4 not found: ID does not exist" containerID="5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.994803 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4"} err="failed to get container status \"5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4\": rpc error: code = NotFound desc = could not find container \"5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4\": container with ID starting with 5a97f19d54594bdab3fc8e26f0418b9c6f70c1406c5ba55b34eb1f13f89ae3d4 not found: ID does not exist" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.994846 4707 scope.go:117] "RemoveContainer" containerID="436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77" Jan 21 16:32:12 crc kubenswrapper[4707]: E0121 16:32:12.995159 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77\": container with ID starting with 436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77 not found: ID does not exist" containerID="436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77" Jan 21 16:32:12 crc kubenswrapper[4707]: I0121 16:32:12.995263 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77"} err="failed to get container status \"436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77\": rpc error: code = NotFound desc = could not find container \"436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77\": container with ID starting with 436bfa37071180ac9e5135262555b49ae21e4507355c0f6e42b75667007e1f77 not found: ID does not exist" Jan 21 16:32:13 crc kubenswrapper[4707]: I0121 16:32:13.186154 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:32:13 crc kubenswrapper[4707]: E0121 16:32:13.186887 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:32:13 crc kubenswrapper[4707]: I0121 16:32:13.190223 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" path="/var/lib/kubelet/pods/f8c8b29d-cd87-46c9-a868-8234fbc1f2ff/volumes" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.071910 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj"] Jan 21 16:32:17 crc kubenswrapper[4707]: E0121 16:32:17.073171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerName="init" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.073197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerName="init" Jan 21 16:32:17 crc kubenswrapper[4707]: E0121 16:32:17.073224 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerName="dnsmasq-dns" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.073232 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerName="dnsmasq-dns" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.073346 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8c8b29d-cd87-46c9-a868-8234fbc1f2ff" containerName="dnsmasq-dns" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.073795 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.076154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-ssfwp" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.076335 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kuttl-service-cm-1" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.076675 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kuttl-service-cm-2" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.076722 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.077217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.077281 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.077291 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.077309 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kuttl-service-cm-0" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.078922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj"] Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.114865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knzvb\" (UniqueName: \"kubernetes.io/projected/ac02cac8-4533-46e6-bd71-34e40ca0456b-kube-api-access-knzvb\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-1\") pod 
\"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-inventory\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-combined-ca-bundle\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115388 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-2-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-ssh-key-edpm-compute-no-nodes\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-0\") pod 
\"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.115498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-inventory\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-combined-ca-bundle\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-2-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-ssh-key-edpm-compute-no-nodes\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " 
pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knzvb\" (UniqueName: \"kubernetes.io/projected/ac02cac8-4533-46e6-bd71-34e40ca0456b-kube-api-access-knzvb\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.216516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.217151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.217847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-2-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.218031 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.218115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.218211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.218279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.218734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.220479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-ssh-key-edpm-compute-no-nodes\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.221148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-combined-ca-bundle\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.221391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-inventory\") pod 
\"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.229079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knzvb\" (UniqueName: \"kubernetes.io/projected/ac02cac8-4533-46e6-bd71-34e40ca0456b-kube-api-access-knzvb\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.385989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.744602 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj"] Jan 21 16:32:17 crc kubenswrapper[4707]: I0121 16:32:17.977044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" event={"ID":"ac02cac8-4533-46e6-bd71-34e40ca0456b","Type":"ContainerStarted","Data":"b6db5accbd88889b492c4ded7eab6269a18c423f1772fb42c2879ebdbfa032e9"} Jan 21 16:32:18 crc kubenswrapper[4707]: I0121 16:32:18.984060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" event={"ID":"ac02cac8-4533-46e6-bd71-34e40ca0456b","Type":"ContainerStarted","Data":"3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e"} Jan 21 16:32:18 crc kubenswrapper[4707]: I0121 16:32:18.998228 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" podStartSLOduration=1.510353351 podStartE2EDuration="1.998216188s" podCreationTimestamp="2026-01-21 16:32:17 +0000 UTC" firstStartedPulling="2026-01-21 16:32:17.750769628 +0000 UTC m=+5434.932285850" lastFinishedPulling="2026-01-21 16:32:18.238632465 +0000 UTC m=+5435.420148687" observedRunningTime="2026-01-21 16:32:18.995920291 +0000 UTC m=+5436.177436503" watchObservedRunningTime="2026-01-21 16:32:18.998216188 +0000 UTC m=+5436.179732410" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.159902 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj"] Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.269587 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh"] Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.271636 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.299274 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh"] Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.352298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.352453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.352542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m82qc\" (UniqueName: \"kubernetes.io/projected/1c9159e5-6b64-42e8-984e-0d1a32d426a7-kube-api-access-m82qc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.453107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.453172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m82qc\" (UniqueName: \"kubernetes.io/projected/1c9159e5-6b64-42e8-984e-0d1a32d426a7-kube-api-access-m82qc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.453227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.453804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.453841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 
16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.468075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m82qc\" (UniqueName: \"kubernetes.io/projected/1c9159e5-6b64-42e8-984e-0d1a32d426a7-kube-api-access-m82qc\") pod \"dnsmasq-dnsmasq-84b9f45d47-sjqhh\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.608424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.952095 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh"] Jan 21 16:32:20 crc kubenswrapper[4707]: W0121 16:32:20.954576 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9159e5_6b64_42e8_984e_0d1a32d426a7.slice/crio-0bfc385526bf2c5aece4efebcee810a5f51a7bffb079b1c73a06698946289745 WatchSource:0}: Error finding container 0bfc385526bf2c5aece4efebcee810a5f51a7bffb079b1c73a06698946289745: Status 404 returned error can't find the container with id 0bfc385526bf2c5aece4efebcee810a5f51a7bffb079b1c73a06698946289745 Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.996408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" event={"ID":"1c9159e5-6b64-42e8-984e-0d1a32d426a7","Type":"ContainerStarted","Data":"0bfc385526bf2c5aece4efebcee810a5f51a7bffb079b1c73a06698946289745"} Jan 21 16:32:20 crc kubenswrapper[4707]: I0121 16:32:20.996473 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" podUID="ac02cac8-4533-46e6-bd71-34e40ca0456b" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" containerID="cri-o://3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e" gracePeriod=30 Jan 21 16:32:22 crc kubenswrapper[4707]: I0121 16:32:22.005051 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerID="5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f" exitCode=0 Jan 21 16:32:22 crc kubenswrapper[4707]: I0121 16:32:22.005122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" event={"ID":"1c9159e5-6b64-42e8-984e-0d1a32d426a7","Type":"ContainerDied","Data":"5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f"} Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.012226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" event={"ID":"1c9159e5-6b64-42e8-984e-0d1a32d426a7","Type":"ContainerStarted","Data":"048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717"} Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.012454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.026237 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" podStartSLOduration=3.026222595 podStartE2EDuration="3.026222595s" podCreationTimestamp="2026-01-21 16:32:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:32:23.022990038 +0000 UTC m=+5440.204506260" watchObservedRunningTime="2026-01-21 16:32:23.026222595 +0000 UTC m=+5440.207738817" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.347713 4707 scope.go:117] "RemoveContainer" containerID="bcdcc01cfa9e1f66f466d54c72c5f04a5282ee5987698d17762ba51de6488875" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.367781 4707 scope.go:117] "RemoveContainer" containerID="2e02d44d4a33149b5fdbe53f47c4bf74705ac6b4d75f1b9dd562a0bf2070e70d" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.391376 4707 scope.go:117] "RemoveContainer" containerID="585e6b9ffe1b24b0c43533ae3debc229ffc8b9d6a01d472b73d61d8828da623c" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.415154 4707 scope.go:117] "RemoveContainer" containerID="1ec9137cc0d553135efbd802b5a20c8dbb48c76c9bbeb44bd23155dc1441794c" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.436787 4707 scope.go:117] "RemoveContainer" containerID="d36c13df02c5160221b66567321b2642e332ee9d47201617ca588af483ca5d6b" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.458674 4707 scope.go:117] "RemoveContainer" containerID="64bfcf24eb25a8220219461e99fa55c8d67ddd59962d9b09f1db231340a5827f" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.476831 4707 scope.go:117] "RemoveContainer" containerID="a5a3503e9ef0de7f10f4d5118ee771c234e00b273cd2cd1129a828ff4785495f" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.495248 4707 scope.go:117] "RemoveContainer" containerID="96d41a29088ea2aa04da34ffa66c3768939fd9e7bf3e5db9f18e02ea3501788c" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.511703 4707 scope.go:117] "RemoveContainer" containerID="222cf4dc5b0d2d5f5652901b414cfaa494c6a5754535cc56de1e42e244e378e6" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.528765 4707 scope.go:117] "RemoveContainer" containerID="2343e74661716427944a90f2d452eeaea69268a7c0c635220f7df1aef68f2b33" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.544994 4707 scope.go:117] "RemoveContainer" containerID="cef9bd8b683ca99af5013091c9aaef582f230035c9e921a3f737f8ce0d35781f" Jan 21 16:32:23 crc kubenswrapper[4707]: I0121 16:32:23.563065 4707 scope.go:117] "RemoveContainer" containerID="c36d4f6db68800ba60807179adb8755a923630e7b103d5dba3abb16552f6701b" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.568662 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-rzfwr"] Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.573158 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-rzfwr"] Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.666930 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-b8khl"] Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.667901 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.670106 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.670234 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.670277 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.671605 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.672495 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b8khl"] Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.735689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqw26\" (UniqueName: \"kubernetes.io/projected/008e7117-8972-4a2f-9261-c6f4f0f93e7e-kube-api-access-qqw26\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.735733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/008e7117-8972-4a2f-9261-c6f4f0f93e7e-node-mnt\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.735786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/008e7117-8972-4a2f-9261-c6f4f0f93e7e-crc-storage\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.836665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/008e7117-8972-4a2f-9261-c6f4f0f93e7e-crc-storage\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.836744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqw26\" (UniqueName: \"kubernetes.io/projected/008e7117-8972-4a2f-9261-c6f4f0f93e7e-kube-api-access-qqw26\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.836772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/008e7117-8972-4a2f-9261-c6f4f0f93e7e-node-mnt\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.836996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/008e7117-8972-4a2f-9261-c6f4f0f93e7e-node-mnt\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " 
pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.837370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/008e7117-8972-4a2f-9261-c6f4f0f93e7e-crc-storage\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.851172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqw26\" (UniqueName: \"kubernetes.io/projected/008e7117-8972-4a2f-9261-c6f4f0f93e7e-kube-api-access-qqw26\") pod \"crc-storage-crc-b8khl\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:26 crc kubenswrapper[4707]: I0121 16:32:26.981545 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:27 crc kubenswrapper[4707]: I0121 16:32:27.182661 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:32:27 crc kubenswrapper[4707]: E0121 16:32:27.182921 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:32:27 crc kubenswrapper[4707]: I0121 16:32:27.189926 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec86c5d-fd76-46cc-9c67-3db97c586d49" path="/var/lib/kubelet/pods/aec86c5d-fd76-46cc-9c67-3db97c586d49/volumes" Jan 21 16:32:27 crc kubenswrapper[4707]: I0121 16:32:27.319149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b8khl"] Jan 21 16:32:27 crc kubenswrapper[4707]: W0121 16:32:27.321096 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod008e7117_8972_4a2f_9261_c6f4f0f93e7e.slice/crio-5725360a2bb38bf0d7a238726b117acdfb2180c394fb31d7b11c9d07016bac4b WatchSource:0}: Error finding container 5725360a2bb38bf0d7a238726b117acdfb2180c394fb31d7b11c9d07016bac4b: Status 404 returned error can't find the container with id 5725360a2bb38bf0d7a238726b117acdfb2180c394fb31d7b11c9d07016bac4b Jan 21 16:32:28 crc kubenswrapper[4707]: I0121 16:32:28.043266 4707 generic.go:334] "Generic (PLEG): container finished" podID="008e7117-8972-4a2f-9261-c6f4f0f93e7e" containerID="80bcb3ada9b96457bebb8d044efd0f6b3dca86dbe7679c81e8e4c311c928cc38" exitCode=0 Jan 21 16:32:28 crc kubenswrapper[4707]: I0121 16:32:28.043379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b8khl" event={"ID":"008e7117-8972-4a2f-9261-c6f4f0f93e7e","Type":"ContainerDied","Data":"80bcb3ada9b96457bebb8d044efd0f6b3dca86dbe7679c81e8e4c311c928cc38"} Jan 21 16:32:28 crc kubenswrapper[4707]: I0121 16:32:28.043453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b8khl" event={"ID":"008e7117-8972-4a2f-9261-c6f4f0f93e7e","Type":"ContainerStarted","Data":"5725360a2bb38bf0d7a238726b117acdfb2180c394fb31d7b11c9d07016bac4b"} Jan 21 16:32:28 crc kubenswrapper[4707]: I0121 16:32:28.916758 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.050973 4707 generic.go:334] "Generic (PLEG): container finished" podID="ac02cac8-4533-46e6-bd71-34e40ca0456b" containerID="3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e" exitCode=254 Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.051015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" event={"ID":"ac02cac8-4533-46e6-bd71-34e40ca0456b","Type":"ContainerDied","Data":"3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e"} Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.051025 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.051054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj" event={"ID":"ac02cac8-4533-46e6-bd71-34e40ca0456b","Type":"ContainerDied","Data":"b6db5accbd88889b492c4ded7eab6269a18c423f1772fb42c2879ebdbfa032e9"} Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.051072 4707 scope.go:117] "RemoveContainer" containerID="3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.062776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knzvb\" (UniqueName: \"kubernetes.io/projected/ac02cac8-4533-46e6-bd71-34e40ca0456b-kube-api-access-knzvb\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.062837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-inventory\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.062880 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-2-0\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.062905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-1\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.062935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-2\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.063008 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-1-1\" (UniqueName: 
\"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-1\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.063036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-combined-ca-bundle\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.063069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-0\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.063094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-0\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.063112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-2\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.063130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-ssh-key-edpm-compute-no-nodes\") pod \"ac02cac8-4533-46e6-bd71-34e40ca0456b\" (UID: \"ac02cac8-4533-46e6-bd71-34e40ca0456b\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.066994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-combined-ca-bundle" (OuterVolumeSpecName: "kuttl-service-combined-ca-bundle") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.071876 4707 scope.go:117] "RemoveContainer" containerID="3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e" Jan 21 16:32:29 crc kubenswrapper[4707]: E0121 16:32:29.072236 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e\": container with ID starting with 3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e not found: ID does not exist" containerID="3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.072266 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e"} err="failed to get container status \"3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e\": rpc error: code = NotFound desc = could not find container \"3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e\": container with ID starting with 3f15bf984f3449628738702b7982341c387490e20546041bddadd6eb755cf43e not found: ID does not exist" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.078878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac02cac8-4533-46e6-bd71-34e40ca0456b-kube-api-access-knzvb" (OuterVolumeSpecName: "kube-api-access-knzvb") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kube-api-access-knzvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.079437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-2" (OuterVolumeSpecName: "kuttl-service-cm-0-2") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-0-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.080992 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-1" (OuterVolumeSpecName: "kuttl-service-cm-0-1") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-0-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.081337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-1" (OuterVolumeSpecName: "kuttl-service-cm-1-1") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-1-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.082138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). 
InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.082304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-0" (OuterVolumeSpecName: "kuttl-service-cm-0-0") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-0-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.082987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-2-0" (OuterVolumeSpecName: "kuttl-service-cm-2-0") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-2-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.083035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-inventory" (OuterVolumeSpecName: "inventory") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.083185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-2" (OuterVolumeSpecName: "kuttl-service-cm-1-2") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-1-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.087334 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-0" (OuterVolumeSpecName: "kuttl-service-cm-1-0") pod "ac02cac8-4533-46e6-bd71-34e40ca0456b" (UID: "ac02cac8-4533-46e6-bd71-34e40ca0456b"). InnerVolumeSpecName "kuttl-service-cm-1-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164471 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164508 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164520 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164530 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164538 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164546 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164555 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knzvb\" (UniqueName: \"kubernetes.io/projected/ac02cac8-4533-46e6-bd71-34e40ca0456b-kube-api-access-knzvb\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164565 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac02cac8-4533-46e6-bd71-34e40ca0456b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164573 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-2-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164581 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-0-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.164589 4707 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/ac02cac8-4533-46e6-bd71-34e40ca0456b-kuttl-service-cm-1-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.242967 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.366140 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj"] Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.366396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/008e7117-8972-4a2f-9261-c6f4f0f93e7e-node-mnt\") pod \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.366445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqw26\" (UniqueName: \"kubernetes.io/projected/008e7117-8972-4a2f-9261-c6f4f0f93e7e-kube-api-access-qqw26\") pod \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.366530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/008e7117-8972-4a2f-9261-c6f4f0f93e7e-crc-storage\") pod \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\" (UID: \"008e7117-8972-4a2f-9261-c6f4f0f93e7e\") " Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.366537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/008e7117-8972-4a2f-9261-c6f4f0f93e7e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "008e7117-8972-4a2f-9261-c6f4f0f93e7e" (UID: "008e7117-8972-4a2f-9261-c6f4f0f93e7e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.366779 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/008e7117-8972-4a2f-9261-c6f4f0f93e7e-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.368829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008e7117-8972-4a2f-9261-c6f4f0f93e7e-kube-api-access-qqw26" (OuterVolumeSpecName: "kube-api-access-qqw26") pod "008e7117-8972-4a2f-9261-c6f4f0f93e7e" (UID: "008e7117-8972-4a2f-9261-c6f4f0f93e7e"). InnerVolumeSpecName "kube-api-access-qqw26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.369780 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-vmtgj"] Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.378551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/008e7117-8972-4a2f-9261-c6f4f0f93e7e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "008e7117-8972-4a2f-9261-c6f4f0f93e7e" (UID: "008e7117-8972-4a2f-9261-c6f4f0f93e7e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.467953 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqw26\" (UniqueName: \"kubernetes.io/projected/008e7117-8972-4a2f-9261-c6f4f0f93e7e-kube-api-access-qqw26\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:29 crc kubenswrapper[4707]: I0121 16:32:29.468127 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/008e7117-8972-4a2f-9261-c6f4f0f93e7e-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.063004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b8khl" event={"ID":"008e7117-8972-4a2f-9261-c6f4f0f93e7e","Type":"ContainerDied","Data":"5725360a2bb38bf0d7a238726b117acdfb2180c394fb31d7b11c9d07016bac4b"} Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.063050 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5725360a2bb38bf0d7a238726b117acdfb2180c394fb31d7b11c9d07016bac4b" Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.063553 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b8khl" Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.609846 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.643376 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx"] Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.643557 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" containerName="dnsmasq-dns" containerID="cri-o://ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5" gracePeriod=10 Jan 21 16:32:30 crc kubenswrapper[4707]: I0121 16:32:30.949360 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.071805 4707 generic.go:334] "Generic (PLEG): container finished" podID="c14469ba-7597-440f-9cca-f7d6073e098e" containerID="ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5" exitCode=0 Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.071862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" event={"ID":"c14469ba-7597-440f-9cca-f7d6073e098e","Type":"ContainerDied","Data":"ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5"} Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.071877 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.071888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx" event={"ID":"c14469ba-7597-440f-9cca-f7d6073e098e","Type":"ContainerDied","Data":"a556f51abb44343bc27b60e5db6bea505abb365b0403b22229883fe253666d62"} Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.071905 4707 scope.go:117] "RemoveContainer" containerID="ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.085330 4707 scope.go:117] "RemoveContainer" containerID="f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.086373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfq77\" (UniqueName: \"kubernetes.io/projected/c14469ba-7597-440f-9cca-f7d6073e098e-kube-api-access-tfq77\") pod \"c14469ba-7597-440f-9cca-f7d6073e098e\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.086474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-dnsmasq-svc\") pod \"c14469ba-7597-440f-9cca-f7d6073e098e\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.086503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-edpm-compute-no-nodes\") pod \"c14469ba-7597-440f-9cca-f7d6073e098e\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.086543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-config\") pod \"c14469ba-7597-440f-9cca-f7d6073e098e\" (UID: \"c14469ba-7597-440f-9cca-f7d6073e098e\") " Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.090559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14469ba-7597-440f-9cca-f7d6073e098e-kube-api-access-tfq77" (OuterVolumeSpecName: "kube-api-access-tfq77") pod "c14469ba-7597-440f-9cca-f7d6073e098e" (UID: "c14469ba-7597-440f-9cca-f7d6073e098e"). InnerVolumeSpecName "kube-api-access-tfq77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.099245 4707 scope.go:117] "RemoveContainer" containerID="ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5" Jan 21 16:32:31 crc kubenswrapper[4707]: E0121 16:32:31.099585 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5\": container with ID starting with ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5 not found: ID does not exist" containerID="ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.099624 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5"} err="failed to get container status \"ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5\": rpc error: code = NotFound desc = could not find container \"ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5\": container with ID starting with ca052f87aaf785788986bad012576ddfa431fadf544370325e37c3f087785db5 not found: ID does not exist" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.099639 4707 scope.go:117] "RemoveContainer" containerID="f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572" Jan 21 16:32:31 crc kubenswrapper[4707]: E0121 16:32:31.099881 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572\": container with ID starting with f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572 not found: ID does not exist" containerID="f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.099903 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572"} err="failed to get container status \"f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572\": rpc error: code = NotFound desc = could not find container \"f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572\": container with ID starting with f3f90606e7b151287a7b549d1ecaa6a14fae535afab7916204803d4939832572 not found: ID does not exist" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.112489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "c14469ba-7597-440f-9cca-f7d6073e098e" (UID: "c14469ba-7597-440f-9cca-f7d6073e098e"). InnerVolumeSpecName "edpm-compute-no-nodes". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.112502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "c14469ba-7597-440f-9cca-f7d6073e098e" (UID: "c14469ba-7597-440f-9cca-f7d6073e098e"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.113083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-config" (OuterVolumeSpecName: "config") pod "c14469ba-7597-440f-9cca-f7d6073e098e" (UID: "c14469ba-7597-440f-9cca-f7d6073e098e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.187866 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.187888 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfq77\" (UniqueName: \"kubernetes.io/projected/c14469ba-7597-440f-9cca-f7d6073e098e-kube-api-access-tfq77\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.187898 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.187907 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/c14469ba-7597-440f-9cca-f7d6073e098e-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.189271 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac02cac8-4533-46e6-bd71-34e40ca0456b" path="/var/lib/kubelet/pods/ac02cac8-4533-46e6-bd71-34e40ca0456b/volumes" Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.386694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx"] Jan 21 16:32:31 crc kubenswrapper[4707]: I0121 16:32:31.390239 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-zzhqx"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.116382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-b8khl"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.120694 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-b8khl"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222268 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-lxrhf"] Jan 21 16:32:32 crc kubenswrapper[4707]: E0121 16:32:32.222496 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" containerName="init" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222507 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" containerName="init" Jan 21 16:32:32 crc kubenswrapper[4707]: E0121 16:32:32.222521 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" containerName="dnsmasq-dns" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222526 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" containerName="dnsmasq-dns" Jan 21 16:32:32 crc kubenswrapper[4707]: E0121 16:32:32.222535 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="008e7117-8972-4a2f-9261-c6f4f0f93e7e" containerName="storage" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222540 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="008e7117-8972-4a2f-9261-c6f4f0f93e7e" containerName="storage" Jan 21 16:32:32 crc kubenswrapper[4707]: E0121 16:32:32.222560 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02cac8-4533-46e6-bd71-34e40ca0456b" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222568 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02cac8-4533-46e6-bd71-34e40ca0456b" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222683 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="008e7117-8972-4a2f-9261-c6f4f0f93e7e" containerName="storage" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222694 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac02cac8-4533-46e6-bd71-34e40ca0456b" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.222707 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" containerName="dnsmasq-dns" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.223105 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.224662 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.224864 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.225851 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.226278 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.229981 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lxrhf"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.300342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrc4\" (UniqueName: \"kubernetes.io/projected/7a66068a-da9c-449c-a36d-95950cdd153c-kube-api-access-gbrc4\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.300608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a66068a-da9c-449c-a36d-95950cdd153c-crc-storage\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.300668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a66068a-da9c-449c-a36d-95950cdd153c-node-mnt\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " 
pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.401480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a66068a-da9c-449c-a36d-95950cdd153c-crc-storage\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.401526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a66068a-da9c-449c-a36d-95950cdd153c-node-mnt\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.401585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbrc4\" (UniqueName: \"kubernetes.io/projected/7a66068a-da9c-449c-a36d-95950cdd153c-kube-api-access-gbrc4\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.401733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a66068a-da9c-449c-a36d-95950cdd153c-node-mnt\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.402142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a66068a-da9c-449c-a36d-95950cdd153c-crc-storage\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.414041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbrc4\" (UniqueName: \"kubernetes.io/projected/7a66068a-da9c-449c-a36d-95950cdd153c-kube-api-access-gbrc4\") pod \"crc-storage-crc-lxrhf\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.535184 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.536732 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zrjhb"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.537975 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.548747 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrjhb"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.605451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-utilities\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.605562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnml\" (UniqueName: \"kubernetes.io/projected/f8db94de-5644-4dc5-8440-2b9e59a84fea-kube-api-access-kwnml\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.605657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-catalog-content\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.707580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-utilities\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.707855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnml\" (UniqueName: \"kubernetes.io/projected/f8db94de-5644-4dc5-8440-2b9e59a84fea-kube-api-access-kwnml\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.707920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-catalog-content\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.707987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lxrhf"] Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.708464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-catalog-content\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.708489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-utilities\") pod \"certified-operators-zrjhb\" (UID: 
\"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: W0121 16:32:32.713518 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a66068a_da9c_449c_a36d_95950cdd153c.slice/crio-91449e397cf1d2243a12c57717c13b28ab288dbf23b56d7a80c9f8bf099b0cea WatchSource:0}: Error finding container 91449e397cf1d2243a12c57717c13b28ab288dbf23b56d7a80c9f8bf099b0cea: Status 404 returned error can't find the container with id 91449e397cf1d2243a12c57717c13b28ab288dbf23b56d7a80c9f8bf099b0cea Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.722626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnml\" (UniqueName: \"kubernetes.io/projected/f8db94de-5644-4dc5-8440-2b9e59a84fea-kube-api-access-kwnml\") pod \"certified-operators-zrjhb\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:32 crc kubenswrapper[4707]: I0121 16:32:32.891586 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:33 crc kubenswrapper[4707]: I0121 16:32:33.077018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrjhb"] Jan 21 16:32:33 crc kubenswrapper[4707]: W0121 16:32:33.083267 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8db94de_5644_4dc5_8440_2b9e59a84fea.slice/crio-f4177fde70e83b505341d1a4329a99d781cae8b3f75035b85d6487bab3dd4c50 WatchSource:0}: Error finding container f4177fde70e83b505341d1a4329a99d781cae8b3f75035b85d6487bab3dd4c50: Status 404 returned error can't find the container with id f4177fde70e83b505341d1a4329a99d781cae8b3f75035b85d6487bab3dd4c50 Jan 21 16:32:33 crc kubenswrapper[4707]: I0121 16:32:33.084221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lxrhf" event={"ID":"7a66068a-da9c-449c-a36d-95950cdd153c","Type":"ContainerStarted","Data":"91449e397cf1d2243a12c57717c13b28ab288dbf23b56d7a80c9f8bf099b0cea"} Jan 21 16:32:33 crc kubenswrapper[4707]: I0121 16:32:33.189898 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008e7117-8972-4a2f-9261-c6f4f0f93e7e" path="/var/lib/kubelet/pods/008e7117-8972-4a2f-9261-c6f4f0f93e7e/volumes" Jan 21 16:32:33 crc kubenswrapper[4707]: I0121 16:32:33.190745 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14469ba-7597-440f-9cca-f7d6073e098e" path="/var/lib/kubelet/pods/c14469ba-7597-440f-9cca-f7d6073e098e/volumes" Jan 21 16:32:34 crc kubenswrapper[4707]: I0121 16:32:34.090771 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a66068a-da9c-449c-a36d-95950cdd153c" containerID="3a2a9bd76353e45ce9b0adde4e7d27a63cd1cfe605e980da7026b996895962a6" exitCode=0 Jan 21 16:32:34 crc kubenswrapper[4707]: I0121 16:32:34.090830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lxrhf" event={"ID":"7a66068a-da9c-449c-a36d-95950cdd153c","Type":"ContainerDied","Data":"3a2a9bd76353e45ce9b0adde4e7d27a63cd1cfe605e980da7026b996895962a6"} Jan 21 16:32:34 crc kubenswrapper[4707]: I0121 16:32:34.092394 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8db94de-5644-4dc5-8440-2b9e59a84fea" 
containerID="0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4" exitCode=0 Jan 21 16:32:34 crc kubenswrapper[4707]: I0121 16:32:34.092419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerDied","Data":"0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4"} Jan 21 16:32:34 crc kubenswrapper[4707]: I0121 16:32:34.092434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerStarted","Data":"f4177fde70e83b505341d1a4329a99d781cae8b3f75035b85d6487bab3dd4c50"} Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.100289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerStarted","Data":"7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9"} Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.309122 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.338312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbrc4\" (UniqueName: \"kubernetes.io/projected/7a66068a-da9c-449c-a36d-95950cdd153c-kube-api-access-gbrc4\") pod \"7a66068a-da9c-449c-a36d-95950cdd153c\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.338387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a66068a-da9c-449c-a36d-95950cdd153c-node-mnt\") pod \"7a66068a-da9c-449c-a36d-95950cdd153c\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.338424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a66068a-da9c-449c-a36d-95950cdd153c-crc-storage\") pod \"7a66068a-da9c-449c-a36d-95950cdd153c\" (UID: \"7a66068a-da9c-449c-a36d-95950cdd153c\") " Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.338504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a66068a-da9c-449c-a36d-95950cdd153c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7a66068a-da9c-449c-a36d-95950cdd153c" (UID: "7a66068a-da9c-449c-a36d-95950cdd153c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.338707 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a66068a-da9c-449c-a36d-95950cdd153c-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.342367 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a66068a-da9c-449c-a36d-95950cdd153c-kube-api-access-gbrc4" (OuterVolumeSpecName: "kube-api-access-gbrc4") pod "7a66068a-da9c-449c-a36d-95950cdd153c" (UID: "7a66068a-da9c-449c-a36d-95950cdd153c"). InnerVolumeSpecName "kube-api-access-gbrc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.352502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a66068a-da9c-449c-a36d-95950cdd153c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7a66068a-da9c-449c-a36d-95950cdd153c" (UID: "7a66068a-da9c-449c-a36d-95950cdd153c"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.439533 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbrc4\" (UniqueName: \"kubernetes.io/projected/7a66068a-da9c-449c-a36d-95950cdd153c-kube-api-access-gbrc4\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:35 crc kubenswrapper[4707]: I0121 16:32:35.439558 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a66068a-da9c-449c-a36d-95950cdd153c-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:36 crc kubenswrapper[4707]: I0121 16:32:36.106986 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lxrhf" Jan 21 16:32:36 crc kubenswrapper[4707]: I0121 16:32:36.106993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lxrhf" event={"ID":"7a66068a-da9c-449c-a36d-95950cdd153c","Type":"ContainerDied","Data":"91449e397cf1d2243a12c57717c13b28ab288dbf23b56d7a80c9f8bf099b0cea"} Jan 21 16:32:36 crc kubenswrapper[4707]: I0121 16:32:36.107109 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91449e397cf1d2243a12c57717c13b28ab288dbf23b56d7a80c9f8bf099b0cea" Jan 21 16:32:36 crc kubenswrapper[4707]: I0121 16:32:36.108720 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerID="7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9" exitCode=0 Jan 21 16:32:36 crc kubenswrapper[4707]: I0121 16:32:36.108745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerDied","Data":"7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9"} Jan 21 16:32:37 crc kubenswrapper[4707]: I0121 16:32:37.115786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerStarted","Data":"54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c"} Jan 21 16:32:37 crc kubenswrapper[4707]: I0121 16:32:37.129580 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zrjhb" podStartSLOduration=2.627253014 podStartE2EDuration="5.129567699s" podCreationTimestamp="2026-01-21 16:32:32 +0000 UTC" firstStartedPulling="2026-01-21 16:32:34.09387213 +0000 UTC m=+5451.275388351" lastFinishedPulling="2026-01-21 16:32:36.596186814 +0000 UTC m=+5453.777703036" observedRunningTime="2026-01-21 16:32:37.127684618 +0000 UTC m=+5454.309200841" watchObservedRunningTime="2026-01-21 16:32:37.129567699 +0000 UTC m=+5454.311083920" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.264907 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q"] Jan 21 16:32:38 crc kubenswrapper[4707]: E0121 16:32:38.265334 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7a66068a-da9c-449c-a36d-95950cdd153c" containerName="storage" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.265346 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a66068a-da9c-449c-a36d-95950cdd153c" containerName="storage" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.265474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a66068a-da9c-449c-a36d-95950cdd153c" containerName="storage" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.266119 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.267886 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-no-nodes-custom-svc" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.274334 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q"] Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.375116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-config\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.375260 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.375385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dnft\" (UniqueName: \"kubernetes.io/projected/d7086d0d-b404-4245-a42f-2d3b9fe33e31-kube-api-access-6dnft\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.375500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-edpm-no-nodes-custom-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.476393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dnft\" (UniqueName: \"kubernetes.io/projected/d7086d0d-b404-4245-a42f-2d3b9fe33e31-kube-api-access-6dnft\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.476458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-edpm-no-nodes-custom-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: 
\"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.476524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-config\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.476591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.477209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-edpm-no-nodes-custom-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.477254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.477437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-config\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.491545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dnft\" (UniqueName: \"kubernetes.io/projected/d7086d0d-b404-4245-a42f-2d3b9fe33e31-kube-api-access-6dnft\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-mnc4q\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.578243 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:38 crc kubenswrapper[4707]: I0121 16:32:38.921983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q"] Jan 21 16:32:39 crc kubenswrapper[4707]: I0121 16:32:39.127627 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerID="39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e" exitCode=0 Jan 21 16:32:39 crc kubenswrapper[4707]: I0121 16:32:39.127688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" event={"ID":"d7086d0d-b404-4245-a42f-2d3b9fe33e31","Type":"ContainerDied","Data":"39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e"} Jan 21 16:32:39 crc kubenswrapper[4707]: I0121 16:32:39.127900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" event={"ID":"d7086d0d-b404-4245-a42f-2d3b9fe33e31","Type":"ContainerStarted","Data":"c2933df9ad585e7b178e563075e31061e373a2514db2c282055de8356fb260e3"} Jan 21 16:32:40 crc kubenswrapper[4707]: I0121 16:32:40.134976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" event={"ID":"d7086d0d-b404-4245-a42f-2d3b9fe33e31","Type":"ContainerStarted","Data":"b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad"} Jan 21 16:32:40 crc kubenswrapper[4707]: I0121 16:32:40.135283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:40 crc kubenswrapper[4707]: I0121 16:32:40.148536 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" podStartSLOduration=2.148524211 podStartE2EDuration="2.148524211s" podCreationTimestamp="2026-01-21 16:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:32:40.147336608 +0000 UTC m=+5457.328852830" watchObservedRunningTime="2026-01-21 16:32:40.148524211 +0000 UTC m=+5457.330040433" Jan 21 16:32:40 crc kubenswrapper[4707]: I0121 16:32:40.182152 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:32:40 crc kubenswrapper[4707]: E0121 16:32:40.182365 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.719235 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7rd6x"] Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.720607 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.725019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7rd6x"] Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.817675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-utilities\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.817729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2q7\" (UniqueName: \"kubernetes.io/projected/c44e7817-6e76-47d7-a736-981b2916af6e-kube-api-access-cm2q7\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.817750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-catalog-content\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.918226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-utilities\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.918275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2q7\" (UniqueName: \"kubernetes.io/projected/c44e7817-6e76-47d7-a736-981b2916af6e-kube-api-access-cm2q7\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.918297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-catalog-content\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.918658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-catalog-content\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.918842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-utilities\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:41 crc kubenswrapper[4707]: I0121 16:32:41.937103 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cm2q7\" (UniqueName: \"kubernetes.io/projected/c44e7817-6e76-47d7-a736-981b2916af6e-kube-api-access-cm2q7\") pod \"community-operators-7rd6x\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:42 crc kubenswrapper[4707]: I0121 16:32:42.035948 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:42 crc kubenswrapper[4707]: I0121 16:32:42.381769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7rd6x"] Jan 21 16:32:42 crc kubenswrapper[4707]: W0121 16:32:42.385505 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc44e7817_6e76_47d7_a736_981b2916af6e.slice/crio-ff1c773927e54ce41e937cc41657de2885f0025b38865cd0585b2aa0d197a85b WatchSource:0}: Error finding container ff1c773927e54ce41e937cc41657de2885f0025b38865cd0585b2aa0d197a85b: Status 404 returned error can't find the container with id ff1c773927e54ce41e937cc41657de2885f0025b38865cd0585b2aa0d197a85b Jan 21 16:32:42 crc kubenswrapper[4707]: I0121 16:32:42.891851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:42 crc kubenswrapper[4707]: I0121 16:32:42.891897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:42 crc kubenswrapper[4707]: I0121 16:32:42.920717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:43 crc kubenswrapper[4707]: I0121 16:32:43.153157 4707 generic.go:334] "Generic (PLEG): container finished" podID="c44e7817-6e76-47d7-a736-981b2916af6e" containerID="1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1" exitCode=0 Jan 21 16:32:43 crc kubenswrapper[4707]: I0121 16:32:43.153200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerDied","Data":"1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1"} Jan 21 16:32:43 crc kubenswrapper[4707]: I0121 16:32:43.153240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerStarted","Data":"ff1c773927e54ce41e937cc41657de2885f0025b38865cd0585b2aa0d197a85b"} Jan 21 16:32:43 crc kubenswrapper[4707]: I0121 16:32:43.188641 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:44 crc kubenswrapper[4707]: I0121 16:32:44.160604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerStarted","Data":"fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867"} Jan 21 16:32:45 crc kubenswrapper[4707]: I0121 16:32:45.174037 4707 generic.go:334] "Generic (PLEG): container finished" podID="c44e7817-6e76-47d7-a736-981b2916af6e" containerID="fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867" exitCode=0 Jan 21 16:32:45 crc kubenswrapper[4707]: I0121 16:32:45.175287 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerDied","Data":"fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867"} Jan 21 16:32:45 crc kubenswrapper[4707]: I0121 16:32:45.175901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerStarted","Data":"cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1"} Jan 21 16:32:45 crc kubenswrapper[4707]: I0121 16:32:45.190750 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7rd6x" podStartSLOduration=2.689690424 podStartE2EDuration="4.190741936s" podCreationTimestamp="2026-01-21 16:32:41 +0000 UTC" firstStartedPulling="2026-01-21 16:32:43.154622627 +0000 UTC m=+5460.336138849" lastFinishedPulling="2026-01-21 16:32:44.655674129 +0000 UTC m=+5461.837190361" observedRunningTime="2026-01-21 16:32:45.188530286 +0000 UTC m=+5462.370046508" watchObservedRunningTime="2026-01-21 16:32:45.190741936 +0000 UTC m=+5462.372258157" Jan 21 16:32:45 crc kubenswrapper[4707]: I0121 16:32:45.303449 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrjhb"] Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.180387 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zrjhb" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="registry-server" containerID="cri-o://54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c" gracePeriod=2 Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.480054 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.670378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwnml\" (UniqueName: \"kubernetes.io/projected/f8db94de-5644-4dc5-8440-2b9e59a84fea-kube-api-access-kwnml\") pod \"f8db94de-5644-4dc5-8440-2b9e59a84fea\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.670449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-catalog-content\") pod \"f8db94de-5644-4dc5-8440-2b9e59a84fea\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.670529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-utilities\") pod \"f8db94de-5644-4dc5-8440-2b9e59a84fea\" (UID: \"f8db94de-5644-4dc5-8440-2b9e59a84fea\") " Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.671348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-utilities" (OuterVolumeSpecName: "utilities") pod "f8db94de-5644-4dc5-8440-2b9e59a84fea" (UID: "f8db94de-5644-4dc5-8440-2b9e59a84fea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.674822 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8db94de-5644-4dc5-8440-2b9e59a84fea-kube-api-access-kwnml" (OuterVolumeSpecName: "kube-api-access-kwnml") pod "f8db94de-5644-4dc5-8440-2b9e59a84fea" (UID: "f8db94de-5644-4dc5-8440-2b9e59a84fea"). InnerVolumeSpecName "kube-api-access-kwnml". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.701186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8db94de-5644-4dc5-8440-2b9e59a84fea" (UID: "f8db94de-5644-4dc5-8440-2b9e59a84fea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.772074 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwnml\" (UniqueName: \"kubernetes.io/projected/f8db94de-5644-4dc5-8440-2b9e59a84fea-kube-api-access-kwnml\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.772098 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:46 crc kubenswrapper[4707]: I0121 16:32:46.772107 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8db94de-5644-4dc5-8440-2b9e59a84fea-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.187777 4707 generic.go:334] "Generic (PLEG): container finished" podID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerID="54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c" exitCode=0 Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.187851 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrjhb" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.187953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerDied","Data":"54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c"} Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.187982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrjhb" event={"ID":"f8db94de-5644-4dc5-8440-2b9e59a84fea","Type":"ContainerDied","Data":"f4177fde70e83b505341d1a4329a99d781cae8b3f75035b85d6487bab3dd4c50"} Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.187998 4707 scope.go:117] "RemoveContainer" containerID="54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.203614 4707 scope.go:117] "RemoveContainer" containerID="7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.214751 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrjhb"] Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.218297 4707 scope.go:117] "RemoveContainer" containerID="0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.218459 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zrjhb"] Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.272201 4707 scope.go:117] "RemoveContainer" containerID="54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c" Jan 21 16:32:47 crc kubenswrapper[4707]: E0121 16:32:47.272514 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c\": container with ID starting with 54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c not found: ID does not exist" containerID="54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.272543 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c"} err="failed to get container status \"54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c\": rpc error: code = NotFound desc = could not find container \"54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c\": container with ID starting with 54e1765ac8ac5884e29411c6ebc1e2f1a8e34b500a75167e87ba1f0e2401545c not found: ID does not exist" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.272560 4707 scope.go:117] "RemoveContainer" containerID="7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9" Jan 21 16:32:47 crc kubenswrapper[4707]: E0121 16:32:47.272830 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9\": container with ID starting with 7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9 not found: ID does not exist" containerID="7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.272852 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9"} err="failed to get container status \"7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9\": rpc error: code = NotFound desc = could not find container \"7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9\": container with ID starting with 7f78aa5b465cd44a683088da874b20f80e8c25aa86c3012cf4f8ef0eab3b7ee9 not found: ID does not exist" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.272864 4707 scope.go:117] "RemoveContainer" containerID="0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4" Jan 21 16:32:47 crc kubenswrapper[4707]: E0121 16:32:47.273263 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4\": container with ID starting with 0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4 not found: ID does not exist" containerID="0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4" Jan 21 16:32:47 crc kubenswrapper[4707]: I0121 16:32:47.273282 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4"} err="failed to get container status \"0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4\": rpc error: code = NotFound desc = could not find container \"0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4\": container with ID starting with 0f66677691e01c75039bedc0c2e694ed8873fea950b552a0ac35dd42adab3cc4 not found: ID does not exist" Jan 21 16:32:48 crc kubenswrapper[4707]: I0121 16:32:48.578936 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:32:48 crc kubenswrapper[4707]: I0121 16:32:48.612346 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh"] Jan 21 16:32:48 crc kubenswrapper[4707]: I0121 16:32:48.612873 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerName="dnsmasq-dns" containerID="cri-o://048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717" gracePeriod=10 Jan 21 16:32:48 crc kubenswrapper[4707]: I0121 16:32:48.927122 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.100135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-dnsmasq-svc\") pod \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.100295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-config\") pod \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.100381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m82qc\" (UniqueName: \"kubernetes.io/projected/1c9159e5-6b64-42e8-984e-0d1a32d426a7-kube-api-access-m82qc\") pod \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\" (UID: \"1c9159e5-6b64-42e8-984e-0d1a32d426a7\") " Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.104153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9159e5-6b64-42e8-984e-0d1a32d426a7-kube-api-access-m82qc" (OuterVolumeSpecName: "kube-api-access-m82qc") pod "1c9159e5-6b64-42e8-984e-0d1a32d426a7" (UID: "1c9159e5-6b64-42e8-984e-0d1a32d426a7"). InnerVolumeSpecName "kube-api-access-m82qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.125562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "1c9159e5-6b64-42e8-984e-0d1a32d426a7" (UID: "1c9159e5-6b64-42e8-984e-0d1a32d426a7"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.125779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-config" (OuterVolumeSpecName: "config") pod "1c9159e5-6b64-42e8-984e-0d1a32d426a7" (UID: "1c9159e5-6b64-42e8-984e-0d1a32d426a7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.194328 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" path="/var/lib/kubelet/pods/f8db94de-5644-4dc5-8440-2b9e59a84fea/volumes" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201389 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201412 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m82qc\" (UniqueName: \"kubernetes.io/projected/1c9159e5-6b64-42e8-984e-0d1a32d426a7-kube-api-access-m82qc\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201421 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1c9159e5-6b64-42e8-984e-0d1a32d426a7-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201826 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerID="048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717" exitCode=0 Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" event={"ID":"1c9159e5-6b64-42e8-984e-0d1a32d426a7","Type":"ContainerDied","Data":"048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717"} Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" event={"ID":"1c9159e5-6b64-42e8-984e-0d1a32d426a7","Type":"ContainerDied","Data":"0bfc385526bf2c5aece4efebcee810a5f51a7bffb079b1c73a06698946289745"} Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201887 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.201898 4707 scope.go:117] "RemoveContainer" containerID="048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.220870 4707 scope.go:117] "RemoveContainer" containerID="5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.221307 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh"] Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.225696 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-sjqhh"] Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.244422 4707 scope.go:117] "RemoveContainer" containerID="048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717" Jan 21 16:32:49 crc kubenswrapper[4707]: E0121 16:32:49.244738 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717\": container with ID starting with 048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717 not found: ID does not exist" containerID="048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.244785 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717"} err="failed to get container status \"048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717\": rpc error: code = NotFound desc = could not find container \"048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717\": container with ID starting with 048118490b5d0ee5f049cc792d56d040e08a6bd6debf6bf72098da199d878717 not found: ID does not exist" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.244826 4707 scope.go:117] "RemoveContainer" containerID="5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f" Jan 21 16:32:49 crc kubenswrapper[4707]: E0121 16:32:49.245142 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f\": container with ID starting with 5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f not found: ID does not exist" containerID="5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f" Jan 21 16:32:49 crc kubenswrapper[4707]: I0121 16:32:49.245184 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f"} err="failed to get container status \"5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f\": rpc error: code = NotFound desc = could not find container \"5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f\": container with ID starting with 5d46084be063ed459d88146b4b42323b4c1a9e6b0b245bdbd72b10e97f6fe87f not found: ID does not exist" Jan 21 16:32:51 crc kubenswrapper[4707]: I0121 16:32:51.189176 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" path="/var/lib/kubelet/pods/1c9159e5-6b64-42e8-984e-0d1a32d426a7/volumes" Jan 21 16:32:52 crc 
kubenswrapper[4707]: I0121 16:32:52.036310 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:52 crc kubenswrapper[4707]: I0121 16:32:52.036351 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:52 crc kubenswrapper[4707]: I0121 16:32:52.065302 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:52 crc kubenswrapper[4707]: I0121 16:32:52.247469 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:52 crc kubenswrapper[4707]: I0121 16:32:52.286912 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7rd6x"] Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.233799 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff"] Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.234210 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerName="init" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234288 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerName="init" Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.234337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="registry-server" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234379 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="registry-server" Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.234484 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerName="dnsmasq-dns" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234530 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerName="dnsmasq-dns" Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.234580 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="extract-utilities" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="extract-utilities" Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.234676 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="extract-content" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234716 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="extract-content" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234883 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9159e5-6b64-42e8-984e-0d1a32d426a7" containerName="dnsmasq-dns" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.234952 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8db94de-5644-4dc5-8440-2b9e59a84fea" containerName="registry-server" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.235370 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.238985 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.239210 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-no-nodes-custom-svc" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.239288 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.239302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.239492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-no-nodes-custom-svc-dockercfg-rgwrj" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.242353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff"] Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.335404 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff"] Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.335907 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[custom-img-svc-combined-ca-bundle inventory kube-api-access-4dkb7 ssh-key-edpm-no-nodes-custom-svc], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" podUID="15c248bb-7e74-46a0-bbd7-271b7aed5df4" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.349711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dkb7\" (UniqueName: \"kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.349995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-custom-img-svc-combined-ca-bundle\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.350033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 
16:32:53.350126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-ssh-key-edpm-no-nodes-custom-svc\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.377166 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6"] Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.378227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.383247 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6"] Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-custom-img-svc-combined-ca-bundle\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddg7f\" (UniqueName: \"kubernetes.io/projected/717d0191-fad9-4553-9739-c0c8590cd783-kube-api-access-ddg7f\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: 
\"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-ssh-key-edpm-no-nodes-custom-svc\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.451758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dkb7\" (UniqueName: \"kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.451834 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/dataplanenodeset-edpm-no-nodes-custom-svc: secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.451991 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory podName:15c248bb-7e74-46a0-bbd7-271b7aed5df4 nodeName:}" failed. No retries permitted until 2026-01-21 16:32:53.951972105 +0000 UTC m=+5471.133488337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4") : secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.454136 4707 projected.go:194] Error preparing data for projected volume kube-api-access-4dkb7 for pod openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff: failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.454182 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7 podName:15c248bb-7e74-46a0-bbd7-271b7aed5df4 nodeName:}" failed. No retries permitted until 2026-01-21 16:32:53.95417183 +0000 UTC m=+5471.135688052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4dkb7" (UniqueName: "kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4") : failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.455992 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-ssh-key-edpm-no-nodes-custom-svc\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.456556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-custom-img-svc-combined-ca-bundle\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.552603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg7f\" (UniqueName: \"kubernetes.io/projected/717d0191-fad9-4553-9739-c0c8590cd783-kube-api-access-ddg7f\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.552686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.552760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.553379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.553441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.565184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddg7f\" (UniqueName: 
\"kubernetes.io/projected/717d0191-fad9-4553-9739-c0c8590cd783-kube-api-access-ddg7f\") pod \"dnsmasq-dnsmasq-84b9f45d47-76db6\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.691075 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.959292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: I0121 16:32:53.959386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dkb7\" (UniqueName: \"kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.959447 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/dataplanenodeset-edpm-no-nodes-custom-svc: secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.959500 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory podName:15c248bb-7e74-46a0-bbd7-271b7aed5df4 nodeName:}" failed. No retries permitted until 2026-01-21 16:32:54.959484469 +0000 UTC m=+5472.141000691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4") : secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.961857 4707 projected.go:194] Error preparing data for projected volume kube-api-access-4dkb7 for pod openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff: failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 16:32:53 crc kubenswrapper[4707]: E0121 16:32:53.961955 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7 podName:15c248bb-7e74-46a0-bbd7-271b7aed5df4 nodeName:}" failed. No retries permitted until 2026-01-21 16:32:54.961923605 +0000 UTC m=+5472.143439837 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4dkb7" (UniqueName: "kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4") : failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.034511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6"] Jan 21 16:32:54 crc kubenswrapper[4707]: W0121 16:32:54.039122 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod717d0191_fad9_4553_9739_c0c8590cd783.slice/crio-717fbdf3590971cf1c2e601681aa21577cabf409ea9ab3a164a354018258ae90 WatchSource:0}: Error finding container 717fbdf3590971cf1c2e601681aa21577cabf409ea9ab3a164a354018258ae90: Status 404 returned error can't find the container with id 717fbdf3590971cf1c2e601681aa21577cabf409ea9ab3a164a354018258ae90 Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.182063 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:32:54 crc kubenswrapper[4707]: E0121 16:32:54.182309 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.235265 4707 generic.go:334] "Generic (PLEG): container finished" podID="717d0191-fad9-4553-9739-c0c8590cd783" containerID="915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636" exitCode=0 Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.235311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" event={"ID":"717d0191-fad9-4553-9739-c0c8590cd783","Type":"ContainerDied","Data":"915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636"} Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.235356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" event={"ID":"717d0191-fad9-4553-9739-c0c8590cd783","Type":"ContainerStarted","Data":"717fbdf3590971cf1c2e601681aa21577cabf409ea9ab3a164a354018258ae90"} Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.235369 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.235445 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7rd6x" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="registry-server" containerID="cri-o://cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1" gracePeriod=2 Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.243880 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.365073 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-ssh-key-edpm-no-nodes-custom-svc\") pod \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.365191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-custom-img-svc-combined-ca-bundle\") pod \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.368931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-ssh-key-edpm-no-nodes-custom-svc" (OuterVolumeSpecName: "ssh-key-edpm-no-nodes-custom-svc") pod "15c248bb-7e74-46a0-bbd7-271b7aed5df4" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4"). InnerVolumeSpecName "ssh-key-edpm-no-nodes-custom-svc". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.369044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-custom-img-svc-combined-ca-bundle" (OuterVolumeSpecName: "custom-img-svc-combined-ca-bundle") pod "15c248bb-7e74-46a0-bbd7-271b7aed5df4" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4"). InnerVolumeSpecName "custom-img-svc-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.467119 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-ssh-key-edpm-no-nodes-custom-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.467146 4707 reconciler_common.go:293] "Volume detached for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-custom-img-svc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.555091 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.669459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-catalog-content\") pod \"c44e7817-6e76-47d7-a736-981b2916af6e\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.669582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-utilities\") pod \"c44e7817-6e76-47d7-a736-981b2916af6e\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.669624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm2q7\" (UniqueName: \"kubernetes.io/projected/c44e7817-6e76-47d7-a736-981b2916af6e-kube-api-access-cm2q7\") pod \"c44e7817-6e76-47d7-a736-981b2916af6e\" (UID: \"c44e7817-6e76-47d7-a736-981b2916af6e\") " Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.670236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-utilities" (OuterVolumeSpecName: "utilities") pod "c44e7817-6e76-47d7-a736-981b2916af6e" (UID: "c44e7817-6e76-47d7-a736-981b2916af6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.672867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44e7817-6e76-47d7-a736-981b2916af6e-kube-api-access-cm2q7" (OuterVolumeSpecName: "kube-api-access-cm2q7") pod "c44e7817-6e76-47d7-a736-981b2916af6e" (UID: "c44e7817-6e76-47d7-a736-981b2916af6e"). InnerVolumeSpecName "kube-api-access-cm2q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.709580 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c44e7817-6e76-47d7-a736-981b2916af6e" (UID: "c44e7817-6e76-47d7-a736-981b2916af6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.770760 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.770793 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm2q7\" (UniqueName: \"kubernetes.io/projected/c44e7817-6e76-47d7-a736-981b2916af6e-kube-api-access-cm2q7\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.770818 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44e7817-6e76-47d7-a736-981b2916af6e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.972513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:54 crc kubenswrapper[4707]: I0121 16:32:54.972607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dkb7\" (UniqueName: \"kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff\" (UID: \"15c248bb-7e74-46a0-bbd7-271b7aed5df4\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:54 crc kubenswrapper[4707]: E0121 16:32:54.972685 4707 secret.go:188] Couldn't get secret openstack-kuttl-tests/dataplanenodeset-edpm-no-nodes-custom-svc: secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 16:32:54 crc kubenswrapper[4707]: E0121 16:32:54.972739 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory podName:15c248bb-7e74-46a0-bbd7-271b7aed5df4 nodeName:}" failed. No retries permitted until 2026-01-21 16:32:56.97272463 +0000 UTC m=+5474.154240852 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4") : secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 16:32:54 crc kubenswrapper[4707]: E0121 16:32:54.974759 4707 projected.go:194] Error preparing data for projected volume kube-api-access-4dkb7 for pod openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff: failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 16:32:54 crc kubenswrapper[4707]: E0121 16:32:54.974832 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7 podName:15c248bb-7e74-46a0-bbd7-271b7aed5df4 nodeName:}" failed. No retries permitted until 2026-01-21 16:32:56.97479842 +0000 UTC m=+5474.156314642 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4dkb7" (UniqueName: "kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" (UID: "15c248bb-7e74-46a0-bbd7-271b7aed5df4") : failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.242351 4707 generic.go:334] "Generic (PLEG): container finished" podID="c44e7817-6e76-47d7-a736-981b2916af6e" containerID="cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1" exitCode=0 Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.242406 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rd6x" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.242525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerDied","Data":"cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1"} Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.242627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rd6x" event={"ID":"c44e7817-6e76-47d7-a736-981b2916af6e","Type":"ContainerDied","Data":"ff1c773927e54ce41e937cc41657de2885f0025b38865cd0585b2aa0d197a85b"} Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.242658 4707 scope.go:117] "RemoveContainer" containerID="cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.244477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" event={"ID":"717d0191-fad9-4553-9739-c0c8590cd783","Type":"ContainerStarted","Data":"c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0"} Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.244492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.244587 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.259144 4707 scope.go:117] "RemoveContainer" containerID="fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.280224 4707 scope.go:117] "RemoveContainer" containerID="1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.286305 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" podStartSLOduration=2.28629126 podStartE2EDuration="2.28629126s" podCreationTimestamp="2026-01-21 16:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:32:55.268129219 +0000 UTC m=+5472.449645442" watchObservedRunningTime="2026-01-21 16:32:55.28629126 +0000 UTC m=+5472.467807482" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.288329 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7rd6x"] Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.295106 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7rd6x"] Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.298203 4707 scope.go:117] "RemoveContainer" containerID="cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1" Jan 21 16:32:55 crc kubenswrapper[4707]: E0121 16:32:55.298585 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1\": container with ID starting with cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1 not found: ID does not exist" containerID="cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.298623 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1"} err="failed to get container status \"cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1\": rpc error: code = NotFound desc = could not find container \"cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1\": container with ID starting with cd4d3c924fade61bb63badb38b1f3254c2527428c9cfdba12ccbf974a57ef6e1 not found: ID does not exist" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.298642 4707 scope.go:117] "RemoveContainer" containerID="fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867" Jan 21 16:32:55 crc kubenswrapper[4707]: E0121 16:32:55.299002 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867\": container with ID starting with fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867 not found: ID does not exist" containerID="fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.299026 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867"} err="failed to get container status \"fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867\": rpc error: code = NotFound desc = could not find container \"fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867\": container with ID starting with fa2eb60e3a0be4884edf3439b4ca2d5300819d39c055f96090dd8fec4f828867 not found: ID does not exist" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.299041 4707 scope.go:117] "RemoveContainer" containerID="1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1" Jan 21 16:32:55 crc kubenswrapper[4707]: E0121 16:32:55.299290 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1\": container with ID starting with 1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1 not found: ID does not exist" containerID="1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.299325 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1"} err="failed to get container status \"1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1\": rpc error: code = NotFound desc = could not find container \"1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1\": container with ID starting with 1e84665c5204a9f18e98bc228507e5e5e8b263598969a80d93a6230d37ecfdd1 not found: ID does not exist" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.311462 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff"] Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.315169 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-bnqff"] Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.378220 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dkb7\" (UniqueName: \"kubernetes.io/projected/15c248bb-7e74-46a0-bbd7-271b7aed5df4-kube-api-access-4dkb7\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:55 crc kubenswrapper[4707]: I0121 16:32:55.378245 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15c248bb-7e74-46a0-bbd7-271b7aed5df4-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:32:57 crc kubenswrapper[4707]: I0121 16:32:57.189409 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c248bb-7e74-46a0-bbd7-271b7aed5df4" path="/var/lib/kubelet/pods/15c248bb-7e74-46a0-bbd7-271b7aed5df4/volumes" Jan 21 16:32:57 crc kubenswrapper[4707]: I0121 16:32:57.190004 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" path="/var/lib/kubelet/pods/c44e7817-6e76-47d7-a736-981b2916af6e/volumes" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.654679 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-lxrhf"] Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.658476 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-lxrhf"] Jan 21 16:32:59 
crc kubenswrapper[4707]: I0121 16:32:59.766611 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-pp7ns"] Jan 21 16:32:59 crc kubenswrapper[4707]: E0121 16:32:59.766857 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="registry-server" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.766873 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="registry-server" Jan 21 16:32:59 crc kubenswrapper[4707]: E0121 16:32:59.766893 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="extract-content" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.766899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="extract-content" Jan 21 16:32:59 crc kubenswrapper[4707]: E0121 16:32:59.766908 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="extract-utilities" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.766915 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="extract-utilities" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.767036 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44e7817-6e76-47d7-a736-981b2916af6e" containerName="registry-server" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.767430 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.768888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.769089 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.769206 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.769389 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.773608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pp7ns"] Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.836507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-crc-storage\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.836702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8tw\" (UniqueName: \"kubernetes.io/projected/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-kube-api-access-jz8tw\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.836831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-node-mnt\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.938489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8tw\" (UniqueName: \"kubernetes.io/projected/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-kube-api-access-jz8tw\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.938563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-node-mnt\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.938641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-crc-storage\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.938902 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-node-mnt\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.939293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-crc-storage\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:32:59 crc kubenswrapper[4707]: I0121 16:32:59.952574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8tw\" (UniqueName: \"kubernetes.io/projected/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-kube-api-access-jz8tw\") pod \"crc-storage-crc-pp7ns\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:33:00 crc kubenswrapper[4707]: I0121 16:33:00.081231 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:33:00 crc kubenswrapper[4707]: I0121 16:33:00.421922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pp7ns"] Jan 21 16:33:01 crc kubenswrapper[4707]: I0121 16:33:01.189931 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a66068a-da9c-449c-a36d-95950cdd153c" path="/var/lib/kubelet/pods/7a66068a-da9c-449c-a36d-95950cdd153c/volumes" Jan 21 16:33:01 crc kubenswrapper[4707]: I0121 16:33:01.277501 4707 generic.go:334] "Generic (PLEG): container finished" podID="aadf11e0-ff0d-4c13-aeeb-8fd154820b74" containerID="414e9152c6825f4306ef632e34042bad238102d778f1ffadf6a62c96dd1982df" exitCode=0 Jan 21 16:33:01 crc kubenswrapper[4707]: I0121 16:33:01.277534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pp7ns" event={"ID":"aadf11e0-ff0d-4c13-aeeb-8fd154820b74","Type":"ContainerDied","Data":"414e9152c6825f4306ef632e34042bad238102d778f1ffadf6a62c96dd1982df"} Jan 21 16:33:01 crc kubenswrapper[4707]: I0121 16:33:01.277557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pp7ns" event={"ID":"aadf11e0-ff0d-4c13-aeeb-8fd154820b74","Type":"ContainerStarted","Data":"7f0d408f0052ebc6059392c5ddee140bcf3091211417cee95bf96bb2371b0e4e"} Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.487463 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.567085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-crc-storage\") pod \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.567128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-node-mnt\") pod \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.567271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz8tw\" (UniqueName: \"kubernetes.io/projected/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-kube-api-access-jz8tw\") pod \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\" (UID: \"aadf11e0-ff0d-4c13-aeeb-8fd154820b74\") " Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.567266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "aadf11e0-ff0d-4c13-aeeb-8fd154820b74" (UID: "aadf11e0-ff0d-4c13-aeeb-8fd154820b74"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.567769 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.571299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-kube-api-access-jz8tw" (OuterVolumeSpecName: "kube-api-access-jz8tw") pod "aadf11e0-ff0d-4c13-aeeb-8fd154820b74" (UID: "aadf11e0-ff0d-4c13-aeeb-8fd154820b74"). InnerVolumeSpecName "kube-api-access-jz8tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.581410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "aadf11e0-ff0d-4c13-aeeb-8fd154820b74" (UID: "aadf11e0-ff0d-4c13-aeeb-8fd154820b74"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.669392 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:02 crc kubenswrapper[4707]: I0121 16:33:02.669421 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz8tw\" (UniqueName: \"kubernetes.io/projected/aadf11e0-ff0d-4c13-aeeb-8fd154820b74-kube-api-access-jz8tw\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:03 crc kubenswrapper[4707]: I0121 16:33:03.294208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pp7ns" event={"ID":"aadf11e0-ff0d-4c13-aeeb-8fd154820b74","Type":"ContainerDied","Data":"7f0d408f0052ebc6059392c5ddee140bcf3091211417cee95bf96bb2371b0e4e"} Jan 21 16:33:03 crc kubenswrapper[4707]: I0121 16:33:03.294249 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0d408f0052ebc6059392c5ddee140bcf3091211417cee95bf96bb2371b0e4e" Jan 21 16:33:03 crc kubenswrapper[4707]: I0121 16:33:03.294223 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pp7ns" Jan 21 16:33:03 crc kubenswrapper[4707]: I0121 16:33:03.692484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:33:03 crc kubenswrapper[4707]: I0121 16:33:03.729474 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q"] Jan 21 16:33:03 crc kubenswrapper[4707]: I0121 16:33:03.729704 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerName="dnsmasq-dns" containerID="cri-o://b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad" gracePeriod=10 Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.063681 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.092192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-dnsmasq-svc\") pod \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.092258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-config\") pod \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.092299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-edpm-no-nodes-custom-svc\") pod \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.092367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dnft\" (UniqueName: \"kubernetes.io/projected/d7086d0d-b404-4245-a42f-2d3b9fe33e31-kube-api-access-6dnft\") pod \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\" (UID: \"d7086d0d-b404-4245-a42f-2d3b9fe33e31\") " Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.098005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7086d0d-b404-4245-a42f-2d3b9fe33e31-kube-api-access-6dnft" (OuterVolumeSpecName: "kube-api-access-6dnft") pod "d7086d0d-b404-4245-a42f-2d3b9fe33e31" (UID: "d7086d0d-b404-4245-a42f-2d3b9fe33e31"). InnerVolumeSpecName "kube-api-access-6dnft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.118113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-config" (OuterVolumeSpecName: "config") pod "d7086d0d-b404-4245-a42f-2d3b9fe33e31" (UID: "d7086d0d-b404-4245-a42f-2d3b9fe33e31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.118202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-edpm-no-nodes-custom-svc" (OuterVolumeSpecName: "edpm-no-nodes-custom-svc") pod "d7086d0d-b404-4245-a42f-2d3b9fe33e31" (UID: "d7086d0d-b404-4245-a42f-2d3b9fe33e31"). InnerVolumeSpecName "edpm-no-nodes-custom-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.119593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d7086d0d-b404-4245-a42f-2d3b9fe33e31" (UID: "d7086d0d-b404-4245-a42f-2d3b9fe33e31"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.193452 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dnft\" (UniqueName: \"kubernetes.io/projected/d7086d0d-b404-4245-a42f-2d3b9fe33e31-kube-api-access-6dnft\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.193475 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.193486 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.193495 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/d7086d0d-b404-4245-a42f-2d3b9fe33e31-edpm-no-nodes-custom-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.301318 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerID="b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad" exitCode=0 Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.301352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" event={"ID":"d7086d0d-b404-4245-a42f-2d3b9fe33e31","Type":"ContainerDied","Data":"b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad"} Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.301367 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.301384 4707 scope.go:117] "RemoveContainer" containerID="b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.301373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q" event={"ID":"d7086d0d-b404-4245-a42f-2d3b9fe33e31","Type":"ContainerDied","Data":"c2933df9ad585e7b178e563075e31061e373a2514db2c282055de8356fb260e3"} Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.320270 4707 scope.go:117] "RemoveContainer" containerID="39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.323877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q"] Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.328541 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-mnc4q"] Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.348510 4707 scope.go:117] "RemoveContainer" containerID="b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad" Jan 21 16:33:04 crc kubenswrapper[4707]: E0121 16:33:04.348945 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad\": container with ID starting with b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad not found: ID does not exist" containerID="b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.349025 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad"} err="failed to get container status \"b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad\": rpc error: code = NotFound desc = could not find container \"b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad\": container with ID starting with b781b00ba9e8cd8c76d2243457be706254a2324ed28f2888fd90dc5921b1f7ad not found: ID does not exist" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.349091 4707 scope.go:117] "RemoveContainer" containerID="39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e" Jan 21 16:33:04 crc kubenswrapper[4707]: E0121 16:33:04.349355 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e\": container with ID starting with 39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e not found: ID does not exist" containerID="39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e" Jan 21 16:33:04 crc kubenswrapper[4707]: I0121 16:33:04.349420 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e"} err="failed to get container status \"39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e\": rpc error: code = NotFound desc = could not find container \"39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e\": container with ID starting with 
39163221e00821edab8d3f7923b6a7af0615c40453ce511e0fc5cbc362a39b4e not found: ID does not exist" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.188513 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" path="/var/lib/kubelet/pods/d7086d0d-b404-4245-a42f-2d3b9fe33e31/volumes" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.238739 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-pp7ns"] Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.242546 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-pp7ns"] Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.338798 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-2jdhl"] Jan 21 16:33:05 crc kubenswrapper[4707]: E0121 16:33:05.339060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerName="dnsmasq-dns" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.339076 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerName="dnsmasq-dns" Jan 21 16:33:05 crc kubenswrapper[4707]: E0121 16:33:05.339104 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerName="init" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.339110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerName="init" Jan 21 16:33:05 crc kubenswrapper[4707]: E0121 16:33:05.339120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadf11e0-ff0d-4c13-aeeb-8fd154820b74" containerName="storage" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.339125 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadf11e0-ff0d-4c13-aeeb-8fd154820b74" containerName="storage" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.339226 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7086d0d-b404-4245-a42f-2d3b9fe33e31" containerName="dnsmasq-dns" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.339237 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadf11e0-ff0d-4c13-aeeb-8fd154820b74" containerName="storage" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.339635 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.341050 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.341182 4707 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5lhdx" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.342069 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.343307 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.348219 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2jdhl"] Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.408063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6bf8713f-ed24-4463-97f7-84426a253586-node-mnt\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.408121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpz9\" (UniqueName: \"kubernetes.io/projected/6bf8713f-ed24-4463-97f7-84426a253586-kube-api-access-8fpz9\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.408206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6bf8713f-ed24-4463-97f7-84426a253586-crc-storage\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.509703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpz9\" (UniqueName: \"kubernetes.io/projected/6bf8713f-ed24-4463-97f7-84426a253586-kube-api-access-8fpz9\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.509782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6bf8713f-ed24-4463-97f7-84426a253586-crc-storage\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.509846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6bf8713f-ed24-4463-97f7-84426a253586-node-mnt\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.510060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6bf8713f-ed24-4463-97f7-84426a253586-node-mnt\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " 
pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.510436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6bf8713f-ed24-4463-97f7-84426a253586-crc-storage\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.523331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpz9\" (UniqueName: \"kubernetes.io/projected/6bf8713f-ed24-4463-97f7-84426a253586-kube-api-access-8fpz9\") pod \"crc-storage-crc-2jdhl\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.651799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:05 crc kubenswrapper[4707]: I0121 16:33:05.993922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2jdhl"] Jan 21 16:33:06 crc kubenswrapper[4707]: W0121 16:33:05.999952 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf8713f_ed24_4463_97f7_84426a253586.slice/crio-ab5bc431767d5f57024c995d5ffff807aa7ce29362fce1e93341580ec89a81bc WatchSource:0}: Error finding container ab5bc431767d5f57024c995d5ffff807aa7ce29362fce1e93341580ec89a81bc: Status 404 returned error can't find the container with id ab5bc431767d5f57024c995d5ffff807aa7ce29362fce1e93341580ec89a81bc Jan 21 16:33:06 crc kubenswrapper[4707]: I0121 16:33:06.182476 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:33:06 crc kubenswrapper[4707]: E0121 16:33:06.182667 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:33:06 crc kubenswrapper[4707]: I0121 16:33:06.314113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2jdhl" event={"ID":"6bf8713f-ed24-4463-97f7-84426a253586","Type":"ContainerStarted","Data":"ab5bc431767d5f57024c995d5ffff807aa7ce29362fce1e93341580ec89a81bc"} Jan 21 16:33:07 crc kubenswrapper[4707]: I0121 16:33:07.188716 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadf11e0-ff0d-4c13-aeeb-8fd154820b74" path="/var/lib/kubelet/pods/aadf11e0-ff0d-4c13-aeeb-8fd154820b74/volumes" Jan 21 16:33:07 crc kubenswrapper[4707]: I0121 16:33:07.322664 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bf8713f-ed24-4463-97f7-84426a253586" containerID="bc6d97051f3f700652d9483cedd7a03157382d25b1c718697f2c573190acf6c7" exitCode=0 Jan 21 16:33:07 crc kubenswrapper[4707]: I0121 16:33:07.322705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2jdhl" event={"ID":"6bf8713f-ed24-4463-97f7-84426a253586","Type":"ContainerDied","Data":"bc6d97051f3f700652d9483cedd7a03157382d25b1c718697f2c573190acf6c7"} Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.522147 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.547370 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fpz9\" (UniqueName: \"kubernetes.io/projected/6bf8713f-ed24-4463-97f7-84426a253586-kube-api-access-8fpz9\") pod \"6bf8713f-ed24-4463-97f7-84426a253586\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.547482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6bf8713f-ed24-4463-97f7-84426a253586-node-mnt\") pod \"6bf8713f-ed24-4463-97f7-84426a253586\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.547583 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bf8713f-ed24-4463-97f7-84426a253586-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6bf8713f-ed24-4463-97f7-84426a253586" (UID: "6bf8713f-ed24-4463-97f7-84426a253586"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.547591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6bf8713f-ed24-4463-97f7-84426a253586-crc-storage\") pod \"6bf8713f-ed24-4463-97f7-84426a253586\" (UID: \"6bf8713f-ed24-4463-97f7-84426a253586\") " Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.547992 4707 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6bf8713f-ed24-4463-97f7-84426a253586-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.551387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf8713f-ed24-4463-97f7-84426a253586-kube-api-access-8fpz9" (OuterVolumeSpecName: "kube-api-access-8fpz9") pod "6bf8713f-ed24-4463-97f7-84426a253586" (UID: "6bf8713f-ed24-4463-97f7-84426a253586"). InnerVolumeSpecName "kube-api-access-8fpz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.561282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf8713f-ed24-4463-97f7-84426a253586-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6bf8713f-ed24-4463-97f7-84426a253586" (UID: "6bf8713f-ed24-4463-97f7-84426a253586"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.648963 4707 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6bf8713f-ed24-4463-97f7-84426a253586-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:08 crc kubenswrapper[4707]: I0121 16:33:08.648987 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fpz9\" (UniqueName: \"kubernetes.io/projected/6bf8713f-ed24-4463-97f7-84426a253586-kube-api-access-8fpz9\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:09 crc kubenswrapper[4707]: I0121 16:33:09.336723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2jdhl" event={"ID":"6bf8713f-ed24-4463-97f7-84426a253586","Type":"ContainerDied","Data":"ab5bc431767d5f57024c995d5ffff807aa7ce29362fce1e93341580ec89a81bc"} Jan 21 16:33:09 crc kubenswrapper[4707]: I0121 16:33:09.336890 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab5bc431767d5f57024c995d5ffff807aa7ce29362fce1e93341580ec89a81bc" Jan 21 16:33:09 crc kubenswrapper[4707]: I0121 16:33:09.336771 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2jdhl" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.400232 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48"] Jan 21 16:33:11 crc kubenswrapper[4707]: E0121 16:33:11.400667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf8713f-ed24-4463-97f7-84426a253586" containerName="storage" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.400678 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf8713f-ed24-4463-97f7-84426a253586" containerName="storage" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.400822 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf8713f-ed24-4463-97f7-84426a253586" containerName="storage" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.401411 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.402928 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.412632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48"] Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.477084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.477288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.477323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-config\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.477411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44tl9\" (UniqueName: \"kubernetes.io/projected/0dbad968-3a49-4bde-adea-f5469ef57e26-kube-api-access-44tl9\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.578705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.578960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-config\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.579145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44tl9\" (UniqueName: \"kubernetes.io/projected/0dbad968-3a49-4bde-adea-f5469ef57e26-kube-api-access-44tl9\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.579298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.579445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.579692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-config\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.579977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.594416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44tl9\" (UniqueName: \"kubernetes.io/projected/0dbad968-3a49-4bde-adea-f5469ef57e26-kube-api-access-44tl9\") pod \"dnsmasq-dnsmasq-64864b6d57-rrs48\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:11 crc kubenswrapper[4707]: I0121 16:33:11.714621 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:12 crc kubenswrapper[4707]: I0121 16:33:12.065154 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48"] Jan 21 16:33:12 crc kubenswrapper[4707]: W0121 16:33:12.066749 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dbad968_3a49_4bde_adea_f5469ef57e26.slice/crio-b914ec37c6c58aca27b0c2456220085734d64244aaa9929e4a8607538c82645a WatchSource:0}: Error finding container b914ec37c6c58aca27b0c2456220085734d64244aaa9929e4a8607538c82645a: Status 404 returned error can't find the container with id b914ec37c6c58aca27b0c2456220085734d64244aaa9929e4a8607538c82645a Jan 21 16:33:12 crc kubenswrapper[4707]: I0121 16:33:12.355988 4707 generic.go:334] "Generic (PLEG): container finished" podID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerID="72078d0f517b3a38e97ea75e6c3b999608d28f846e8bb0f875c95d48243fc528" exitCode=0 Jan 21 16:33:12 crc kubenswrapper[4707]: I0121 16:33:12.356170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" event={"ID":"0dbad968-3a49-4bde-adea-f5469ef57e26","Type":"ContainerDied","Data":"72078d0f517b3a38e97ea75e6c3b999608d28f846e8bb0f875c95d48243fc528"} Jan 21 16:33:12 crc kubenswrapper[4707]: I0121 16:33:12.356204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" event={"ID":"0dbad968-3a49-4bde-adea-f5469ef57e26","Type":"ContainerStarted","Data":"b914ec37c6c58aca27b0c2456220085734d64244aaa9929e4a8607538c82645a"} Jan 21 16:33:13 crc kubenswrapper[4707]: I0121 16:33:13.363369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" event={"ID":"0dbad968-3a49-4bde-adea-f5469ef57e26","Type":"ContainerStarted","Data":"cda0cd3d3e46c04599a564f3667435856cc8b6b0459d1f4b028967a746caaf2e"} Jan 21 16:33:13 crc kubenswrapper[4707]: I0121 16:33:13.363575 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:13 crc kubenswrapper[4707]: I0121 16:33:13.376503 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" podStartSLOduration=2.376492321 podStartE2EDuration="2.376492321s" podCreationTimestamp="2026-01-21 16:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:33:13.374088833 +0000 UTC m=+5490.555605044" watchObservedRunningTime="2026-01-21 16:33:13.376492321 +0000 UTC m=+5490.558008543" Jan 21 16:33:21 crc kubenswrapper[4707]: I0121 16:33:21.182012 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:33:21 crc kubenswrapper[4707]: E0121 16:33:21.182347 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:33:21 crc kubenswrapper[4707]: I0121 
16:33:21.715957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:33:21 crc kubenswrapper[4707]: I0121 16:33:21.747835 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6"] Jan 21 16:33:21 crc kubenswrapper[4707]: I0121 16:33:21.748038 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" podUID="717d0191-fad9-4553-9739-c0c8590cd783" containerName="dnsmasq-dns" containerID="cri-o://c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0" gracePeriod=10 Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.073787 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.091695 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-dnsmasq-svc\") pod \"717d0191-fad9-4553-9739-c0c8590cd783\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.091736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddg7f\" (UniqueName: \"kubernetes.io/projected/717d0191-fad9-4553-9739-c0c8590cd783-kube-api-access-ddg7f\") pod \"717d0191-fad9-4553-9739-c0c8590cd783\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.091765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-config\") pod \"717d0191-fad9-4553-9739-c0c8590cd783\" (UID: \"717d0191-fad9-4553-9739-c0c8590cd783\") " Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.096173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717d0191-fad9-4553-9739-c0c8590cd783-kube-api-access-ddg7f" (OuterVolumeSpecName: "kube-api-access-ddg7f") pod "717d0191-fad9-4553-9739-c0c8590cd783" (UID: "717d0191-fad9-4553-9739-c0c8590cd783"). InnerVolumeSpecName "kube-api-access-ddg7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.117845 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-config" (OuterVolumeSpecName: "config") pod "717d0191-fad9-4553-9739-c0c8590cd783" (UID: "717d0191-fad9-4553-9739-c0c8590cd783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.118517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "717d0191-fad9-4553-9739-c0c8590cd783" (UID: "717d0191-fad9-4553-9739-c0c8590cd783"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.193010 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.193220 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddg7f\" (UniqueName: \"kubernetes.io/projected/717d0191-fad9-4553-9739-c0c8590cd783-kube-api-access-ddg7f\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.193236 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717d0191-fad9-4553-9739-c0c8590cd783-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.415303 4707 generic.go:334] "Generic (PLEG): container finished" podID="717d0191-fad9-4553-9739-c0c8590cd783" containerID="c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0" exitCode=0 Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.415337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" event={"ID":"717d0191-fad9-4553-9739-c0c8590cd783","Type":"ContainerDied","Data":"c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0"} Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.415342 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.415363 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6" event={"ID":"717d0191-fad9-4553-9739-c0c8590cd783","Type":"ContainerDied","Data":"717fbdf3590971cf1c2e601681aa21577cabf409ea9ab3a164a354018258ae90"} Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.415380 4707 scope.go:117] "RemoveContainer" containerID="c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.431226 4707 scope.go:117] "RemoveContainer" containerID="915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.434743 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6"] Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.439039 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-76db6"] Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.467226 4707 scope.go:117] "RemoveContainer" containerID="c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0" Jan 21 16:33:22 crc kubenswrapper[4707]: E0121 16:33:22.467477 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0\": container with ID starting with c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0 not found: ID does not exist" containerID="c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.467502 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0"} err="failed to get container status \"c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0\": rpc error: code = NotFound desc = could not find container \"c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0\": container with ID starting with c13c786a7cb01db138b47f453ae24660ab836c80db2500ab3fab5ed2e74b68a0 not found: ID does not exist" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.467518 4707 scope.go:117] "RemoveContainer" containerID="915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636" Jan 21 16:33:22 crc kubenswrapper[4707]: E0121 16:33:22.467771 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636\": container with ID starting with 915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636 not found: ID does not exist" containerID="915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636" Jan 21 16:33:22 crc kubenswrapper[4707]: I0121 16:33:22.467791 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636"} err="failed to get container status \"915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636\": rpc error: code = NotFound desc = could not find container \"915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636\": container with ID starting with 915266f91d2ab6e6bbdb0b23a000f155677081c0199e280480cf879cea6e6636 not found: ID does not exist" Jan 21 16:33:23 crc kubenswrapper[4707]: I0121 16:33:23.188960 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717d0191-fad9-4553-9739-c0c8590cd783" path="/var/lib/kubelet/pods/717d0191-fad9-4553-9739-c0c8590cd783/volumes" Jan 21 16:33:23 crc kubenswrapper[4707]: I0121 16:33:23.727673 4707 scope.go:117] "RemoveContainer" containerID="a69a5bbb987f031cf3f83394c39b8295165851ffdb009a7d41507e3be906e423" Jan 21 16:33:23 crc kubenswrapper[4707]: I0121 16:33:23.745403 4707 scope.go:117] "RemoveContainer" containerID="7add33a461c4bf15c5ffc17fe16ea2d36534a73828dfe3f191f562d56eb138bc" Jan 21 16:33:23 crc kubenswrapper[4707]: I0121 16:33:23.763158 4707 scope.go:117] "RemoveContainer" containerID="e03d96d4f656de595ae7afb0fa87d77c07cbe6e74f2fedc8b18a2a9330caea77" Jan 21 16:33:23 crc kubenswrapper[4707]: I0121 16:33:23.781681 4707 scope.go:117] "RemoveContainer" containerID="3162afa8c55a04db4b2151998a05066c75b97de71be4179f018575c205f9eea2" Jan 21 16:33:23 crc kubenswrapper[4707]: I0121 16:33:23.799891 4707 scope.go:117] "RemoveContainer" containerID="023323a0763df0ba7be1effdd0989514017c1aa43f13b9dbafd726ed1a60a73a" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.367820 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt"] Jan 21 16:33:26 crc kubenswrapper[4707]: E0121 16:33:26.368056 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717d0191-fad9-4553-9739-c0c8590cd783" containerName="init" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.368068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="717d0191-fad9-4553-9739-c0c8590cd783" containerName="init" Jan 21 16:33:26 crc kubenswrapper[4707]: E0121 16:33:26.368095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="717d0191-fad9-4553-9739-c0c8590cd783" containerName="dnsmasq-dns" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.368102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="717d0191-fad9-4553-9739-c0c8590cd783" containerName="dnsmasq-dns" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.368236 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="717d0191-fad9-4553-9739-c0c8590cd783" containerName="dnsmasq-dns" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.368585 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.373352 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.373372 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.373426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.373513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-j94fk" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.375708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt"] Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.376396 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.441634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wwt\" (UniqueName: \"kubernetes.io/projected/3aedba37-40a8-412e-8aa5-20a0a713687b-kube-api-access-54wwt\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.441897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.441944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.442045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" 
(UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.542997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.543144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.543277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54wwt\" (UniqueName: \"kubernetes.io/projected/3aedba37-40a8-412e-8aa5-20a0a713687b-kube-api-access-54wwt\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.543347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.547082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.547272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.547725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-ssh-key-edpm-compute-no-nodes\") pod 
\"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.555791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wwt\" (UniqueName: \"kubernetes.io/projected/3aedba37-40a8-412e-8aa5-20a0a713687b-kube-api-access-54wwt\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:26 crc kubenswrapper[4707]: I0121 16:33:26.683505 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:27 crc kubenswrapper[4707]: I0121 16:33:27.032943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt"] Jan 21 16:33:27 crc kubenswrapper[4707]: I0121 16:33:27.443711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" event={"ID":"3aedba37-40a8-412e-8aa5-20a0a713687b","Type":"ContainerStarted","Data":"1ab33e70d8c55f67a41ea8d3ace9e193da7ccb7b9e225ea8b8813659febb1de0"} Jan 21 16:33:28 crc kubenswrapper[4707]: I0121 16:33:28.449981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" event={"ID":"3aedba37-40a8-412e-8aa5-20a0a713687b","Type":"ContainerStarted","Data":"0511a692cf877ea70b36d67f6978eef8da3c2ecaefe78498e7fed5b52626862e"} Jan 21 16:33:28 crc kubenswrapper[4707]: I0121 16:33:28.463843 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" podStartSLOduration=1.972360221 podStartE2EDuration="2.463829959s" podCreationTimestamp="2026-01-21 16:33:26 +0000 UTC" firstStartedPulling="2026-01-21 16:33:27.040129199 +0000 UTC m=+5504.221645422" lastFinishedPulling="2026-01-21 16:33:27.531598939 +0000 UTC m=+5504.713115160" observedRunningTime="2026-01-21 16:33:28.461354335 +0000 UTC m=+5505.642870567" watchObservedRunningTime="2026-01-21 16:33:28.463829959 +0000 UTC m=+5505.645346181" Jan 21 16:33:29 crc kubenswrapper[4707]: I0121 16:33:29.456627 4707 generic.go:334] "Generic (PLEG): container finished" podID="3aedba37-40a8-412e-8aa5-20a0a713687b" containerID="0511a692cf877ea70b36d67f6978eef8da3c2ecaefe78498e7fed5b52626862e" exitCode=2 Jan 21 16:33:29 crc kubenswrapper[4707]: I0121 16:33:29.456674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" event={"ID":"3aedba37-40a8-412e-8aa5-20a0a713687b","Type":"ContainerDied","Data":"0511a692cf877ea70b36d67f6978eef8da3c2ecaefe78498e7fed5b52626862e"} Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.665537 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.693580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-failed-service-combined-ca-bundle\") pod \"3aedba37-40a8-412e-8aa5-20a0a713687b\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.693642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54wwt\" (UniqueName: \"kubernetes.io/projected/3aedba37-40a8-412e-8aa5-20a0a713687b-kube-api-access-54wwt\") pod \"3aedba37-40a8-412e-8aa5-20a0a713687b\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.693756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-ssh-key-edpm-compute-no-nodes\") pod \"3aedba37-40a8-412e-8aa5-20a0a713687b\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.693791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-inventory\") pod \"3aedba37-40a8-412e-8aa5-20a0a713687b\" (UID: \"3aedba37-40a8-412e-8aa5-20a0a713687b\") " Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.697830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "3aedba37-40a8-412e-8aa5-20a0a713687b" (UID: "3aedba37-40a8-412e-8aa5-20a0a713687b"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.698098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aedba37-40a8-412e-8aa5-20a0a713687b-kube-api-access-54wwt" (OuterVolumeSpecName: "kube-api-access-54wwt") pod "3aedba37-40a8-412e-8aa5-20a0a713687b" (UID: "3aedba37-40a8-412e-8aa5-20a0a713687b"). InnerVolumeSpecName "kube-api-access-54wwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.708999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-inventory" (OuterVolumeSpecName: "inventory") pod "3aedba37-40a8-412e-8aa5-20a0a713687b" (UID: "3aedba37-40a8-412e-8aa5-20a0a713687b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.709249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "3aedba37-40a8-412e-8aa5-20a0a713687b" (UID: "3aedba37-40a8-412e-8aa5-20a0a713687b"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.794885 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.794909 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.794919 4707 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aedba37-40a8-412e-8aa5-20a0a713687b-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:30 crc kubenswrapper[4707]: I0121 16:33:30.794940 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54wwt\" (UniqueName: \"kubernetes.io/projected/3aedba37-40a8-412e-8aa5-20a0a713687b-kube-api-access-54wwt\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:31 crc kubenswrapper[4707]: I0121 16:33:31.469459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" event={"ID":"3aedba37-40a8-412e-8aa5-20a0a713687b","Type":"ContainerDied","Data":"1ab33e70d8c55f67a41ea8d3ace9e193da7ccb7b9e225ea8b8813659febb1de0"} Jan 21 16:33:31 crc kubenswrapper[4707]: I0121 16:33:31.469494 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ab33e70d8c55f67a41ea8d3ace9e193da7ccb7b9e225ea8b8813659febb1de0" Jan 21 16:33:31 crc kubenswrapper[4707]: I0121 16:33:31.469501 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt" Jan 21 16:33:32 crc kubenswrapper[4707]: I0121 16:33:32.182669 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:33:32 crc kubenswrapper[4707]: E0121 16:33:32.182901 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.233938 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ph4th"] Jan 21 16:33:35 crc kubenswrapper[4707]: E0121 16:33:35.234367 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedba37-40a8-412e-8aa5-20a0a713687b" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.234382 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedba37-40a8-412e-8aa5-20a0a713687b" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.234538 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aedba37-40a8-412e-8aa5-20a0a713687b" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.235421 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.238804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7vc\" (UniqueName: \"kubernetes.io/projected/cf947c98-9c65-42f7-9a11-7312b64226c3-kube-api-access-gd7vc\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.238994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-utilities\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.239116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-catalog-content\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.241882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph4th"] Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.339867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-utilities\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.339964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-catalog-content\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.340018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7vc\" (UniqueName: \"kubernetes.io/projected/cf947c98-9c65-42f7-9a11-7312b64226c3-kube-api-access-gd7vc\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.340267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-utilities\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.340322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-catalog-content\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.355296 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gd7vc\" (UniqueName: \"kubernetes.io/projected/cf947c98-9c65-42f7-9a11-7312b64226c3-kube-api-access-gd7vc\") pod \"redhat-marketplace-ph4th\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.554756 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:35 crc kubenswrapper[4707]: I0121 16:33:35.918507 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph4th"] Jan 21 16:33:36 crc kubenswrapper[4707]: I0121 16:33:36.499500 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerID="36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f" exitCode=0 Jan 21 16:33:36 crc kubenswrapper[4707]: I0121 16:33:36.499534 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph4th" event={"ID":"cf947c98-9c65-42f7-9a11-7312b64226c3","Type":"ContainerDied","Data":"36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f"} Jan 21 16:33:36 crc kubenswrapper[4707]: I0121 16:33:36.499556 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph4th" event={"ID":"cf947c98-9c65-42f7-9a11-7312b64226c3","Type":"ContainerStarted","Data":"7317788bedd5255d49095d63da1f218bc8fb38e4ecdb1b43d1e35f5130946b73"} Jan 21 16:33:37 crc kubenswrapper[4707]: I0121 16:33:37.507112 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerID="14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375" exitCode=0 Jan 21 16:33:37 crc kubenswrapper[4707]: I0121 16:33:37.507197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph4th" event={"ID":"cf947c98-9c65-42f7-9a11-7312b64226c3","Type":"ContainerDied","Data":"14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375"} Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.013771 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x"] Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.014544 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.016024 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.016167 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-j94fk" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.016332 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.016340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.017660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.020148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x"] Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.171967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl2t\" (UniqueName: \"kubernetes.io/projected/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-kube-api-access-zwl2t\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.172030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.172073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.172093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.272751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.272797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.272861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl2t\" (UniqueName: \"kubernetes.io/projected/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-kube-api-access-zwl2t\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.272906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.277247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.278269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.280170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.301949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl2t\" (UniqueName: \"kubernetes.io/projected/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-kube-api-access-zwl2t\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x\" (UID: 
\"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.342057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.518209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph4th" event={"ID":"cf947c98-9c65-42f7-9a11-7312b64226c3","Type":"ContainerStarted","Data":"9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773"} Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.534792 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ph4th" podStartSLOduration=2.078849813 podStartE2EDuration="3.534779402s" podCreationTimestamp="2026-01-21 16:33:35 +0000 UTC" firstStartedPulling="2026-01-21 16:33:36.501193547 +0000 UTC m=+5513.682709769" lastFinishedPulling="2026-01-21 16:33:37.957123136 +0000 UTC m=+5515.138639358" observedRunningTime="2026-01-21 16:33:38.530346697 +0000 UTC m=+5515.711862919" watchObservedRunningTime="2026-01-21 16:33:38.534779402 +0000 UTC m=+5515.716295624" Jan 21 16:33:38 crc kubenswrapper[4707]: I0121 16:33:38.701060 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x"] Jan 21 16:33:38 crc kubenswrapper[4707]: W0121 16:33:38.704647 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac76806e_2d25_41ce_8d64_8f0dd9d14f57.slice/crio-930e0810f988a1c591e656014a96d0fa99bead0c159f2f1be7b2afae2cd2c028 WatchSource:0}: Error finding container 930e0810f988a1c591e656014a96d0fa99bead0c159f2f1be7b2afae2cd2c028: Status 404 returned error can't find the container with id 930e0810f988a1c591e656014a96d0fa99bead0c159f2f1be7b2afae2cd2c028 Jan 21 16:33:39 crc kubenswrapper[4707]: I0121 16:33:39.529561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" event={"ID":"ac76806e-2d25-41ce-8d64-8f0dd9d14f57","Type":"ContainerStarted","Data":"279873d4ba1cceb2a7d6548a655b63ce12ba7880ec55a67febe0bb558f345333"} Jan 21 16:33:39 crc kubenswrapper[4707]: I0121 16:33:39.529725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" event={"ID":"ac76806e-2d25-41ce-8d64-8f0dd9d14f57","Type":"ContainerStarted","Data":"930e0810f988a1c591e656014a96d0fa99bead0c159f2f1be7b2afae2cd2c028"} Jan 21 16:33:39 crc kubenswrapper[4707]: I0121 16:33:39.544574 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" podStartSLOduration=1.050199053 podStartE2EDuration="1.544561481s" podCreationTimestamp="2026-01-21 16:33:38 +0000 UTC" firstStartedPulling="2026-01-21 16:33:38.706510477 +0000 UTC m=+5515.888026699" lastFinishedPulling="2026-01-21 16:33:39.200872905 +0000 UTC m=+5516.382389127" observedRunningTime="2026-01-21 16:33:39.542413693 +0000 UTC m=+5516.723929915" watchObservedRunningTime="2026-01-21 16:33:39.544561481 +0000 UTC m=+5516.726077704" Jan 21 16:33:40 crc kubenswrapper[4707]: I0121 16:33:40.535678 4707 
generic.go:334] "Generic (PLEG): container finished" podID="ac76806e-2d25-41ce-8d64-8f0dd9d14f57" containerID="279873d4ba1cceb2a7d6548a655b63ce12ba7880ec55a67febe0bb558f345333" exitCode=2 Jan 21 16:33:40 crc kubenswrapper[4707]: I0121 16:33:40.535715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" event={"ID":"ac76806e-2d25-41ce-8d64-8f0dd9d14f57","Type":"ContainerDied","Data":"279873d4ba1cceb2a7d6548a655b63ce12ba7880ec55a67febe0bb558f345333"} Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.745176 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.814324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-ssh-key-edpm-compute-no-nodes\") pod \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.814384 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwl2t\" (UniqueName: \"kubernetes.io/projected/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-kube-api-access-zwl2t\") pod \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.814433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-failed-service-combined-ca-bundle\") pod \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.814459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-inventory\") pod \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\" (UID: \"ac76806e-2d25-41ce-8d64-8f0dd9d14f57\") " Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.818378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-kube-api-access-zwl2t" (OuterVolumeSpecName: "kube-api-access-zwl2t") pod "ac76806e-2d25-41ce-8d64-8f0dd9d14f57" (UID: "ac76806e-2d25-41ce-8d64-8f0dd9d14f57"). InnerVolumeSpecName "kube-api-access-zwl2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.818854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "ac76806e-2d25-41ce-8d64-8f0dd9d14f57" (UID: "ac76806e-2d25-41ce-8d64-8f0dd9d14f57"). InnerVolumeSpecName "failed-service-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.830430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "ac76806e-2d25-41ce-8d64-8f0dd9d14f57" (UID: "ac76806e-2d25-41ce-8d64-8f0dd9d14f57"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.830499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-inventory" (OuterVolumeSpecName: "inventory") pod "ac76806e-2d25-41ce-8d64-8f0dd9d14f57" (UID: "ac76806e-2d25-41ce-8d64-8f0dd9d14f57"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.915548 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.915573 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwl2t\" (UniqueName: \"kubernetes.io/projected/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-kube-api-access-zwl2t\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.915584 4707 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:41 crc kubenswrapper[4707]: I0121 16:33:41.915593 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac76806e-2d25-41ce-8d64-8f0dd9d14f57-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:42 crc kubenswrapper[4707]: I0121 16:33:42.548725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" event={"ID":"ac76806e-2d25-41ce-8d64-8f0dd9d14f57","Type":"ContainerDied","Data":"930e0810f988a1c591e656014a96d0fa99bead0c159f2f1be7b2afae2cd2c028"} Jan 21 16:33:42 crc kubenswrapper[4707]: I0121 16:33:42.548893 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930e0810f988a1c591e656014a96d0fa99bead0c159f2f1be7b2afae2cd2c028" Jan 21 16:33:42 crc kubenswrapper[4707]: I0121 16:33:42.548773 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x" Jan 21 16:33:44 crc kubenswrapper[4707]: I0121 16:33:44.183173 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:33:44 crc kubenswrapper[4707]: E0121 16:33:44.183395 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:33:45 crc kubenswrapper[4707]: I0121 16:33:45.555247 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:45 crc kubenswrapper[4707]: I0121 16:33:45.555454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:45 crc kubenswrapper[4707]: I0121 16:33:45.585831 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:45 crc kubenswrapper[4707]: I0121 16:33:45.614367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:46 crc kubenswrapper[4707]: I0121 16:33:46.829795 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph4th"] Jan 21 16:33:47 crc kubenswrapper[4707]: I0121 16:33:47.576221 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ph4th" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="registry-server" containerID="cri-o://9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773" gracePeriod=2 Jan 21 16:33:47 crc kubenswrapper[4707]: I0121 16:33:47.888193 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.074619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-catalog-content\") pod \"cf947c98-9c65-42f7-9a11-7312b64226c3\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.074733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-utilities\") pod \"cf947c98-9c65-42f7-9a11-7312b64226c3\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.074766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7vc\" (UniqueName: \"kubernetes.io/projected/cf947c98-9c65-42f7-9a11-7312b64226c3-kube-api-access-gd7vc\") pod \"cf947c98-9c65-42f7-9a11-7312b64226c3\" (UID: \"cf947c98-9c65-42f7-9a11-7312b64226c3\") " Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.075502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-utilities" (OuterVolumeSpecName: "utilities") pod "cf947c98-9c65-42f7-9a11-7312b64226c3" (UID: "cf947c98-9c65-42f7-9a11-7312b64226c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.078679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf947c98-9c65-42f7-9a11-7312b64226c3-kube-api-access-gd7vc" (OuterVolumeSpecName: "kube-api-access-gd7vc") pod "cf947c98-9c65-42f7-9a11-7312b64226c3" (UID: "cf947c98-9c65-42f7-9a11-7312b64226c3"). InnerVolumeSpecName "kube-api-access-gd7vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.090340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf947c98-9c65-42f7-9a11-7312b64226c3" (UID: "cf947c98-9c65-42f7-9a11-7312b64226c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.176390 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.176419 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd7vc\" (UniqueName: \"kubernetes.io/projected/cf947c98-9c65-42f7-9a11-7312b64226c3-kube-api-access-gd7vc\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.176430 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf947c98-9c65-42f7-9a11-7312b64226c3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.583638 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerID="9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773" exitCode=0 Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.583669 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph4th" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.583684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph4th" event={"ID":"cf947c98-9c65-42f7-9a11-7312b64226c3","Type":"ContainerDied","Data":"9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773"} Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.584055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph4th" event={"ID":"cf947c98-9c65-42f7-9a11-7312b64226c3","Type":"ContainerDied","Data":"7317788bedd5255d49095d63da1f218bc8fb38e4ecdb1b43d1e35f5130946b73"} Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.584071 4707 scope.go:117] "RemoveContainer" containerID="9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.599624 4707 scope.go:117] "RemoveContainer" containerID="14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.604339 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph4th"] Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.610008 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph4th"] Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.615517 4707 scope.go:117] "RemoveContainer" containerID="36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.649630 4707 scope.go:117] "RemoveContainer" containerID="9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773" Jan 21 16:33:48 crc kubenswrapper[4707]: E0121 16:33:48.649981 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773\": container with ID starting with 9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773 not found: ID does not exist" containerID="9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.650009 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773"} err="failed to get container status \"9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773\": rpc error: code = NotFound desc = could not find container \"9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773\": container with ID starting with 9860a63571cdfd0964230945c8bd382142d0dcfae42fef684782569e8d81b773 not found: ID does not exist" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.650028 4707 scope.go:117] "RemoveContainer" containerID="14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375" Jan 21 16:33:48 crc kubenswrapper[4707]: E0121 16:33:48.650235 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375\": container with ID starting with 14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375 not found: ID does not exist" containerID="14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.650253 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375"} err="failed to get container status \"14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375\": rpc error: code = NotFound desc = could not find container \"14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375\": container with ID starting with 14202d5183577aa9e886a4bdfc190dcbdd8ecf4138936a39f35eead508eed375 not found: ID does not exist" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.650264 4707 scope.go:117] "RemoveContainer" containerID="36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f" Jan 21 16:33:48 crc kubenswrapper[4707]: E0121 16:33:48.650448 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f\": container with ID starting with 36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f not found: ID does not exist" containerID="36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f" Jan 21 16:33:48 crc kubenswrapper[4707]: I0121 16:33:48.650470 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f"} err="failed to get container status \"36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f\": rpc error: code = NotFound desc = could not find container \"36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f\": container with ID starting with 36bd22398d07d7978253a714e9f7e35de60e240cf7b5c5f01066dd214e60542f not found: ID does not exist" Jan 21 16:33:49 crc kubenswrapper[4707]: I0121 16:33:49.188003 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" path="/var/lib/kubelet/pods/cf947c98-9c65-42f7-9a11-7312b64226c3/volumes" Jan 21 16:33:59 crc kubenswrapper[4707]: I0121 16:33:59.182274 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:33:59 crc kubenswrapper[4707]: E0121 16:33:59.182795 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.017582 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx"] Jan 21 16:34:00 crc kubenswrapper[4707]: E0121 16:34:00.017853 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="extract-utilities" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.017866 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="extract-utilities" Jan 21 16:34:00 crc kubenswrapper[4707]: E0121 16:34:00.017885 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="registry-server" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.017891 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="registry-server" Jan 21 16:34:00 crc kubenswrapper[4707]: E0121 16:34:00.017903 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac76806e-2d25-41ce-8d64-8f0dd9d14f57" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.017910 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac76806e-2d25-41ce-8d64-8f0dd9d14f57" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:00 crc kubenswrapper[4707]: E0121 16:34:00.017937 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="extract-content" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.017942 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="extract-content" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.018064 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf947c98-9c65-42f7-9a11-7312b64226c3" containerName="registry-server" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.018075 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac76806e-2d25-41ce-8d64-8f0dd9d14f57" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.018498 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.020355 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.020426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.020442 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.020541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-j94fk" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.020789 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.025643 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx"] Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.109322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.109368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26dtn\" (UniqueName: \"kubernetes.io/projected/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-kube-api-access-26dtn\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.109403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.109459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.210390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-inventory\") pod 
\"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.210424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26dtn\" (UniqueName: \"kubernetes.io/projected/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-kube-api-access-26dtn\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.210458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.210480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.214884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.215003 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.215562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.223379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26dtn\" (UniqueName: \"kubernetes.io/projected/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-kube-api-access-26dtn\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " 
pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.331507 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:00 crc kubenswrapper[4707]: I0121 16:34:00.696259 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx"] Jan 21 16:34:01 crc kubenswrapper[4707]: I0121 16:34:01.657762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" event={"ID":"5573358c-4ab3-4d45-944c-84bb7dbcf0d1","Type":"ContainerStarted","Data":"bd9a5132ef93dfcb7495822d09194ebe92fb78bc1a90c45207c8e4321564275e"} Jan 21 16:34:01 crc kubenswrapper[4707]: I0121 16:34:01.657972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" event={"ID":"5573358c-4ab3-4d45-944c-84bb7dbcf0d1","Type":"ContainerStarted","Data":"ce08007678524976d244a5828fe921c1c0e0a67b09efc45fbfdbd52a504e8424"} Jan 21 16:34:01 crc kubenswrapper[4707]: I0121 16:34:01.677679 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" podStartSLOduration=1.016192054 podStartE2EDuration="1.677667044s" podCreationTimestamp="2026-01-21 16:34:00 +0000 UTC" firstStartedPulling="2026-01-21 16:34:00.700372018 +0000 UTC m=+5537.881888241" lastFinishedPulling="2026-01-21 16:34:01.361847008 +0000 UTC m=+5538.543363231" observedRunningTime="2026-01-21 16:34:01.675805122 +0000 UTC m=+5538.857321345" watchObservedRunningTime="2026-01-21 16:34:01.677667044 +0000 UTC m=+5538.859183266" Jan 21 16:34:03 crc kubenswrapper[4707]: I0121 16:34:03.675154 4707 generic.go:334] "Generic (PLEG): container finished" podID="5573358c-4ab3-4d45-944c-84bb7dbcf0d1" containerID="bd9a5132ef93dfcb7495822d09194ebe92fb78bc1a90c45207c8e4321564275e" exitCode=2 Jan 21 16:34:03 crc kubenswrapper[4707]: I0121 16:34:03.675229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" event={"ID":"5573358c-4ab3-4d45-944c-84bb7dbcf0d1","Type":"ContainerDied","Data":"bd9a5132ef93dfcb7495822d09194ebe92fb78bc1a90c45207c8e4321564275e"} Jan 21 16:34:04 crc kubenswrapper[4707]: I0121 16:34:04.892910 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.070702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-failed-service-combined-ca-bundle\") pod \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.070772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26dtn\" (UniqueName: \"kubernetes.io/projected/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-kube-api-access-26dtn\") pod \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.070869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-inventory\") pod \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.070899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes\") pod \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.074989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-kube-api-access-26dtn" (OuterVolumeSpecName: "kube-api-access-26dtn") pod "5573358c-4ab3-4d45-944c-84bb7dbcf0d1" (UID: "5573358c-4ab3-4d45-944c-84bb7dbcf0d1"). InnerVolumeSpecName "kube-api-access-26dtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.075564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "5573358c-4ab3-4d45-944c-84bb7dbcf0d1" (UID: "5573358c-4ab3-4d45-944c-84bb7dbcf0d1"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:05 crc kubenswrapper[4707]: E0121 16:34:05.084666 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes podName:5573358c-4ab3-4d45-944c-84bb7dbcf0d1 nodeName:}" failed. No retries permitted until 2026-01-21 16:34:05.584643861 +0000 UTC m=+5542.766160083 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key-edpm-compute-no-nodes" (UniqueName: "kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes") pod "5573358c-4ab3-4d45-944c-84bb7dbcf0d1" (UID: "5573358c-4ab3-4d45-944c-84bb7dbcf0d1") : error deleting /var/lib/kubelet/pods/5573358c-4ab3-4d45-944c-84bb7dbcf0d1/volume-subpaths: remove /var/lib/kubelet/pods/5573358c-4ab3-4d45-944c-84bb7dbcf0d1/volume-subpaths: no such file or directory Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.086523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-inventory" (OuterVolumeSpecName: "inventory") pod "5573358c-4ab3-4d45-944c-84bb7dbcf0d1" (UID: "5573358c-4ab3-4d45-944c-84bb7dbcf0d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.172348 4707 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.172375 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26dtn\" (UniqueName: \"kubernetes.io/projected/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-kube-api-access-26dtn\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.172386 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.678271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes\") pod \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\" (UID: \"5573358c-4ab3-4d45-944c-84bb7dbcf0d1\") " Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.680945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "5573358c-4ab3-4d45-944c-84bb7dbcf0d1" (UID: "5573358c-4ab3-4d45-944c-84bb7dbcf0d1"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.687586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" event={"ID":"5573358c-4ab3-4d45-944c-84bb7dbcf0d1","Type":"ContainerDied","Data":"ce08007678524976d244a5828fe921c1c0e0a67b09efc45fbfdbd52a504e8424"} Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.687611 4707 util.go:48] "No ready sandbox for pod can be found. 
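
The unmount of ssh-key-edpm-compute-no-nodes for pod 5573358c-4ab3-4d45-944c-84bb7dbcf0d1 fails once because the pod's volume-subpaths directory under /var/lib/kubelet/pods is already gone, nestedpendingoperations schedules a retry 500ms out, and the retried TearDown at 16:34:05 succeeds. The retry only works because the cleanup is idempotent; a sketch of that pattern (illustration of the general idea, not the kubelet's implementation):

    # Treat an already-missing volume-subpaths directory as "nothing left to clean up"
    # instead of surfacing ENOENT, so a retried unmount can complete.
    import shutil

    def remove_subpath_dir(path: str) -> None:
        try:
            shutil.rmtree(path)
        except FileNotFoundError:
            # Directory never existed or was removed earlier; the unmount can proceed.
            pass
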
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.687622 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce08007678524976d244a5828fe921c1c0e0a67b09efc45fbfdbd52a504e8424" Jan 21 16:34:05 crc kubenswrapper[4707]: I0121 16:34:05.779770 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/5573358c-4ab3-4d45-944c-84bb7dbcf0d1-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:14 crc kubenswrapper[4707]: I0121 16:34:14.183002 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:34:14 crc kubenswrapper[4707]: E0121 16:34:14.183525 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:34:23 crc kubenswrapper[4707]: I0121 16:34:23.924384 4707 scope.go:117] "RemoveContainer" containerID="a64f9fa10268d28f429481a860a75dfce42aa266e4662a4b2e53a283a167ee69" Jan 21 16:34:23 crc kubenswrapper[4707]: I0121 16:34:23.944951 4707 scope.go:117] "RemoveContainer" containerID="147447999e8e80213c3c0eca95b272b55098b7ca6055c009122ec88babe44cfe" Jan 21 16:34:23 crc kubenswrapper[4707]: I0121 16:34:23.963714 4707 scope.go:117] "RemoveContainer" containerID="a3c437e547e7ad7aace75e969c00453ff9c2a09cba6e1369b13dd6d26b8ef56f" Jan 21 16:34:23 crc kubenswrapper[4707]: I0121 16:34:23.982667 4707 scope.go:117] "RemoveContainer" containerID="601cb649389570e5786f7f85e3097ceffc42b50838abdc66cc396bbb421d8617" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.002667 4707 scope.go:117] "RemoveContainer" containerID="4e8c1b261844098aab05808bfae3825fe5faaf09272f20482840f3cddd0f3400" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.018426 4707 scope.go:117] "RemoveContainer" containerID="62eca35132dd7e2575f410ce9ff3a09e143f82aa0f6c591a4d5535d288e9bd0d" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.035186 4707 scope.go:117] "RemoveContainer" containerID="e7fe8cfbb01e0504920f0998d87c099a028b5b5d59c5b727e542f559d1e0d545" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.051281 4707 scope.go:117] "RemoveContainer" containerID="0fab8855ff2749ff036354a02f2046fbc98cdedc5eb5e23bbf54962a935a282c" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.067930 4707 scope.go:117] "RemoveContainer" containerID="354727db5bf7286d602877db0f59ae8c17a1afa0f16d22ac930fc56866660313" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.084463 4707 scope.go:117] "RemoveContainer" containerID="e7922204606ecf61a20f27d8a0c7d3c98f45c9318e01399fc1847ead41795901" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.101192 4707 scope.go:117] "RemoveContainer" containerID="5058250f2b778215e36ab883b00d7616bc05d68011acf04ca6c9152f7682ea3b" Jan 21 16:34:24 crc kubenswrapper[4707]: I0121 16:34:24.117019 4707 scope.go:117] "RemoveContainer" containerID="59f40f0c60e236ed9e6381bb4593cee997b28e3f5dfe72d30c01e5b49090c05d" Jan 21 16:34:25 crc kubenswrapper[4707]: I0121 16:34:25.182648 4707 scope.go:117] "RemoveContainer" 
containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:34:25 crc kubenswrapper[4707]: E0121 16:34:25.182855 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:34:39 crc kubenswrapper[4707]: I0121 16:34:39.182621 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:34:39 crc kubenswrapper[4707]: E0121 16:34:39.183148 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.017233 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp"] Jan 21 16:34:42 crc kubenswrapper[4707]: E0121 16:34:42.017649 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5573358c-4ab3-4d45-944c-84bb7dbcf0d1" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.017662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5573358c-4ab3-4d45-944c-84bb7dbcf0d1" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.017826 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5573358c-4ab3-4d45-944c-84bb7dbcf0d1" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.018234 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.020881 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.021062 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.021290 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.021610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-j94fk" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.022069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.026572 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp"] Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.203924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.203989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.204018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbc7\" (UniqueName: \"kubernetes.io/projected/f423025a-c977-412c-94fe-cffd23c3df42-kube-api-access-bzbc7\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.204091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.304719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.304776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.304825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.304849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbc7\" (UniqueName: \"kubernetes.io/projected/f423025a-c977-412c-94fe-cffd23c3df42-kube-api-access-bzbc7\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.309094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.309108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.309191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.318603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbc7\" (UniqueName: \"kubernetes.io/projected/f423025a-c977-412c-94fe-cffd23c3df42-kube-api-access-bzbc7\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp\" (UID: 
\"f423025a-c977-412c-94fe-cffd23c3df42\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.334553 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.674844 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp"] Jan 21 16:34:42 crc kubenswrapper[4707]: I0121 16:34:42.898507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" event={"ID":"f423025a-c977-412c-94fe-cffd23c3df42","Type":"ContainerStarted","Data":"56bc05a00ebf2308c79d6f324901c42c7f31ef39f21684447fac7f127b32ade2"} Jan 21 16:34:43 crc kubenswrapper[4707]: I0121 16:34:43.913452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" event={"ID":"f423025a-c977-412c-94fe-cffd23c3df42","Type":"ContainerStarted","Data":"c8706e781ecb5649e8dd3c7e282f7ed7219d0311cc182b25012283a0950cdc7b"} Jan 21 16:34:43 crc kubenswrapper[4707]: I0121 16:34:43.927329 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" podStartSLOduration=1.41518832 podStartE2EDuration="1.92731517s" podCreationTimestamp="2026-01-21 16:34:42 +0000 UTC" firstStartedPulling="2026-01-21 16:34:42.679338203 +0000 UTC m=+5579.860854425" lastFinishedPulling="2026-01-21 16:34:43.191465063 +0000 UTC m=+5580.372981275" observedRunningTime="2026-01-21 16:34:43.923564558 +0000 UTC m=+5581.105080811" watchObservedRunningTime="2026-01-21 16:34:43.92731517 +0000 UTC m=+5581.108831392" Jan 21 16:34:44 crc kubenswrapper[4707]: I0121 16:34:44.920095 4707 generic.go:334] "Generic (PLEG): container finished" podID="f423025a-c977-412c-94fe-cffd23c3df42" containerID="c8706e781ecb5649e8dd3c7e282f7ed7219d0311cc182b25012283a0950cdc7b" exitCode=2 Jan 21 16:34:44 crc kubenswrapper[4707]: I0121 16:34:44.920133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" event={"ID":"f423025a-c977-412c-94fe-cffd23c3df42","Type":"ContainerDied","Data":"c8706e781ecb5649e8dd3c7e282f7ed7219d0311cc182b25012283a0950cdc7b"} Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.113183 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.245713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-inventory\") pod \"f423025a-c977-412c-94fe-cffd23c3df42\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.245777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-failed-service-combined-ca-bundle\") pod \"f423025a-c977-412c-94fe-cffd23c3df42\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.245840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbc7\" (UniqueName: \"kubernetes.io/projected/f423025a-c977-412c-94fe-cffd23c3df42-kube-api-access-bzbc7\") pod \"f423025a-c977-412c-94fe-cffd23c3df42\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.245955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-ssh-key-edpm-compute-no-nodes\") pod \"f423025a-c977-412c-94fe-cffd23c3df42\" (UID: \"f423025a-c977-412c-94fe-cffd23c3df42\") " Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.249637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "f423025a-c977-412c-94fe-cffd23c3df42" (UID: "f423025a-c977-412c-94fe-cffd23c3df42"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.249835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f423025a-c977-412c-94fe-cffd23c3df42-kube-api-access-bzbc7" (OuterVolumeSpecName: "kube-api-access-bzbc7") pod "f423025a-c977-412c-94fe-cffd23c3df42" (UID: "f423025a-c977-412c-94fe-cffd23c3df42"). InnerVolumeSpecName "kube-api-access-bzbc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.260963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "f423025a-c977-412c-94fe-cffd23c3df42" (UID: "f423025a-c977-412c-94fe-cffd23c3df42"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.261799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-inventory" (OuterVolumeSpecName: "inventory") pod "f423025a-c977-412c-94fe-cffd23c3df42" (UID: "f423025a-c977-412c-94fe-cffd23c3df42"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.347726 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.347752 4707 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.347764 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbc7\" (UniqueName: \"kubernetes.io/projected/f423025a-c977-412c-94fe-cffd23c3df42-kube-api-access-bzbc7\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.347773 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f423025a-c977-412c-94fe-cffd23c3df42-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.545317 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.549859 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.554233 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.558516 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.562546 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes8k8xx"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.566470 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.570126 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesndr8x"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.580025 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodeshqtbt"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.604424 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945"] Jan 21 16:34:46 crc kubenswrapper[4707]: E0121 16:34:46.604704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f423025a-c977-412c-94fe-cffd23c3df42" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.604721 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f423025a-c977-412c-94fe-cffd23c3df42" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:46 
crc kubenswrapper[4707]: I0121 16:34:46.604884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f423025a-c977-412c-94fe-cffd23c3df42" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.605576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.611487 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945"] Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.651868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.652021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfzvz\" (UniqueName: \"kubernetes.io/projected/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-kube-api-access-hfzvz\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.652082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.753516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfzvz\" (UniqueName: \"kubernetes.io/projected/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-kube-api-access-hfzvz\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.753621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.753650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.754402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 
16:34:46.754469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.766305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfzvz\" (UniqueName: \"kubernetes.io/projected/375fc46a-93db-46c4-9f95-b0f17f2e3c1c-kube-api-access-hfzvz\") pod \"dnsmasq-dnsmasq-84b9f45d47-55945\" (UID: \"375fc46a-93db-46c4-9f95-b0f17f2e3c1c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.917587 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.936595 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56bc05a00ebf2308c79d6f324901c42c7f31ef39f21684447fac7f127b32ade2" Jan 21 16:34:46 crc kubenswrapper[4707]: I0121 16:34:46.936639 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes257sp" Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.188889 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aedba37-40a8-412e-8aa5-20a0a713687b" path="/var/lib/kubelet/pods/3aedba37-40a8-412e-8aa5-20a0a713687b/volumes" Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.189729 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5573358c-4ab3-4d45-944c-84bb7dbcf0d1" path="/var/lib/kubelet/pods/5573358c-4ab3-4d45-944c-84bb7dbcf0d1/volumes" Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.190357 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac76806e-2d25-41ce-8d64-8f0dd9d14f57" path="/var/lib/kubelet/pods/ac76806e-2d25-41ce-8d64-8f0dd9d14f57/volumes" Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.190977 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f423025a-c977-412c-94fe-cffd23c3df42" path="/var/lib/kubelet/pods/f423025a-c977-412c-94fe-cffd23c3df42/volumes" Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.258506 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945"] Jan 21 16:34:47 crc kubenswrapper[4707]: W0121 16:34:47.260518 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod375fc46a_93db_46c4_9f95_b0f17f2e3c1c.slice/crio-fa837268942933ba9b48e3ff4b9c751ae055d7c985f9a3335e285d159ee29cd8 WatchSource:0}: Error finding container fa837268942933ba9b48e3ff4b9c751ae055d7c985f9a3335e285d159ee29cd8: Status 404 returned error can't find the container with id fa837268942933ba9b48e3ff4b9c751ae055d7c985f9a3335e285d159ee29cd8 Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.944012 4707 generic.go:334] "Generic (PLEG): container finished" podID="375fc46a-93db-46c4-9f95-b0f17f2e3c1c" containerID="837f24337c7ece827d0de252bfa81425353607c7f0066e42d8e9c3b1b54dcd8b" exitCode=0 Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.944059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" event={"ID":"375fc46a-93db-46c4-9f95-b0f17f2e3c1c","Type":"ContainerDied","Data":"837f24337c7ece827d0de252bfa81425353607c7f0066e42d8e9c3b1b54dcd8b"} Jan 21 16:34:47 crc kubenswrapper[4707]: I0121 16:34:47.944226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" event={"ID":"375fc46a-93db-46c4-9f95-b0f17f2e3c1c","Type":"ContainerStarted","Data":"fa837268942933ba9b48e3ff4b9c751ae055d7c985f9a3335e285d159ee29cd8"} Jan 21 16:34:48 crc kubenswrapper[4707]: I0121 16:34:48.951435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" event={"ID":"375fc46a-93db-46c4-9f95-b0f17f2e3c1c","Type":"ContainerStarted","Data":"2784c6ad2ecebf83906d038e4e5d3996a5623013eef83608ca04af63c6787dbd"} Jan 21 16:34:48 crc kubenswrapper[4707]: I0121 16:34:48.951646 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.204076 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" podStartSLOduration=3.2040607469999998 podStartE2EDuration="3.204060747s" podCreationTimestamp="2026-01-21 16:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:34:48.967492059 +0000 UTC m=+5586.149008280" watchObservedRunningTime="2026-01-21 16:34:49.204060747 +0000 UTC m=+5586.385576970" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.207258 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.207528 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" podUID="e6ab0bb3-12a8-4314-83c7-280a512198f6" containerName="manager" containerID="cri-o://e4f42a8b26c44d79b0b42241e85e250bb46ac11e92535e51eb178335a37ba91b" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.227139 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.227305 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" containerName="operator" containerID="cri-o://f81d6212e7b46c7cc4d9c03b8fdd9060c30d3305e97507e4bea0f56ee40e5424" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.235139 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.235274 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" containerName="manager" containerID="cri-o://d17f33e08f973302863a6a158a1b0329e06116ee4116ba93b9392405aeaa6b17" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.251039 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.251260 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" podUID="4f34a822-4c51-4dfc-9812-1287c9b3281d" containerName="manager" containerID="cri-o://c813c3747e39edf631ae0a59c3bfe2ea36cc0254e81c018a901ec8d129da1f49" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.259312 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.259522 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" containerName="manager" containerID="cri-o://39ac84a7db285687d6db26470e02e89c210c3ae4cac2d9bde592910bf8a6ac53" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.266210 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.266360 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" podUID="40734c06-7e00-4184-9487-35be214c9556" containerName="manager" containerID="cri-o://e592e10125a5ad6cbafa06a737cbe06c97e19b4290da7089fbef2ffc14cfde56" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.272688 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.272864 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" podUID="8951bb5e-fd60-4761-9c83-bc523b041b83" containerName="manager" containerID="cri-o://5a8b6331813d6ce1cc4c1f242ea893b7520dd2bafdb2bc296453a569d7df2394" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.280470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.280653 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" podUID="7a6b5573-a127-4713-919d-d79e38f60b87" containerName="manager" containerID="cri-o://33b413ceef8ffdcb86dbbdca687a971ccb0ac31560e3c2cc793fe5dde7400208" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.290976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.291160 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" podUID="0c59ef09-0beb-452c-8563-83bda9602961" containerName="manager" containerID="cri-o://8e0b35d221b6726be4c9bab8690c69dd64a57f1c4279e3e7101140eeba8d5358" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.298128 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.298271 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" podUID="375b0905-f382-4e75-8c5c-4ece8f6ddde3" containerName="manager" containerID="cri-o://098a977d584fdcb0ebe244d0fa4d9dc9081e754df204db8fed081b450701f4e2" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.302403 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.302581 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" podUID="6676ddd6-9d32-480d-9306-d3dc0676d2cd" containerName="manager" containerID="cri-o://2f7d7170e2337291b3525b16efe7ac13fde05e46c80b71e57c73c1795a5e8afa" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.308011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.308168 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" podUID="a674fe35-9ddd-4352-b770-1061b28fce34" containerName="manager" containerID="cri-o://c8595d9ca2bcaf4f53f6bc9205a1d3fae85f9f44532fae632b4d59583c03b6f7" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.333992 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.334158 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" podUID="4c2dd112-6dc9-4b29-95f2-25b3c44452a2" containerName="manager" containerID="cri-o://54bd8be04d416835f1281493540d9e01504393fe58be46567299bee6842f6ce5" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.389886 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.390103 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" podUID="4d095e2a-7788-4a7e-af9b-eaed8407ef5a" containerName="manager" containerID="cri-o://0629f352d3f4441913ab06b6e0a66cad5ad04c26a87acf9f4fafc1b2313ff041" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.406066 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.406261 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" podUID="62e25886-609f-4ef6-848c-e116c738df6a" containerName="manager" containerID="cri-o://0046dd925d470a4baf4e0eb975a04283e1f25124964c96ed2b0c722a82a5b5aa" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.473951 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.474194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" podUID="06c244e2-fb38-4ac0-8ed0-2f34d01c5228" containerName="manager" containerID="cri-o://9f157a1954f415b2a222d4a1081d13ac7ca6e425474cd711399734533ef230cf" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.494047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.494291 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" podUID="4adccf62-a1c3-430c-9137-a515f03b23e4" containerName="manager" containerID="cri-o://e45b507f5954b9cce48efe9efde816a45062e6746141cb2cb4c92830af61bce1" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.500837 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.501032 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" podUID="96a9240d-2c77-4718-8f9d-60633df4eee4" containerName="manager" containerID="cri-o://76e7d98d71cfa80838fca4badb283a780fc5d99d758975b4e61ffbaedfea58aa" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.507052 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.507233 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" podUID="9356dd92-4c2e-476d-9a60-946f4f148564" containerName="manager" containerID="cri-o://09c8747e2bac89103cb29b97e35d54f1e372a4dea0daecca785b4bd9414b0125" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.512535 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-99p87"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.512654 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" containerName="manager" containerID="cri-o://39c10d68c0dbc799c2b840647380cd032397d3c2ea03aba0b3c0758af87a066a" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.515946 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.516102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" podUID="2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" containerName="manager" containerID="cri-o://970be3767e71031a742c429f6c28e936b6930af2dc7e5f41e212c357f8c9b82e" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.520900 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.521080 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" podUID="1ff533c3-a29f-498b-a575-1492c0e07aa9" containerName="manager" containerID="cri-o://63c114e32b92172fc2d850e2811160cb9627b2ab520857ee5130971d61c990d6" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.529914 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.530068 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" podUID="bd0195ed-9602-42a1-bd0e-a0d79f8a6197" containerName="manager" containerID="cri-o://04f2aae1a90962c24e8739e6bbd1f7d8124043d6d56f594bba3c8c868bc521b3" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.533961 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.534128 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" podUID="a13a87c3-f0a5-4fa9-892d-e2371300124a" containerName="catalog-operator" containerID="cri-o://fdf5c3dc9a9fddaeed0daff9c1a05de484bfadd721c10c9f616a0c7b32569f54" gracePeriod=30 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.537687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9hglm"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.538068 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9hglm" podUID="7193a0d6-108f-4c14-a4fe-678aa49bc9dd" containerName="registry-server" containerID="cri-o://ce079610f88f7d056187fad72a6f6f1fb9e577c5ace7940757d24c6cc403e66c" gracePeriod=30 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.564896 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.565772 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.579855 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.595265 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.595433 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" podUID="3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" containerName="operator" containerID="cri-o://55d73a60944fe9f7d7ad79e64d1b64d20cb2043c83d889359f590d510c77308d" gracePeriod=10 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.602049 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.605934 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2ekjlb8"] Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.704975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.705082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-srv-cert\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.705145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh7m7\" (UniqueName: \"kubernetes.io/projected/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-kube-api-access-fh7m7\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.806708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.806821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-srv-cert\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 
16:34:49.806869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh7m7\" (UniqueName: \"kubernetes.io/projected/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-kube-api-access-fh7m7\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.824270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-srv-cert\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.824672 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.829017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh7m7\" (UniqueName: \"kubernetes.io/projected/b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2-kube-api-access-fh7m7\") pod \"catalog-operator-68c6474976-mcmsz\" (UID: \"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.834316 4707 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xxr5n container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.834354 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" podUID="a13a87c3-f0a5-4fa9-892d-e2371300124a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.961673 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd0195ed-9602-42a1-bd0e-a0d79f8a6197" containerID="04f2aae1a90962c24e8739e6bbd1f7d8124043d6d56f594bba3c8c868bc521b3" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.961726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" event={"ID":"bd0195ed-9602-42a1-bd0e-a0d79f8a6197","Type":"ContainerDied","Data":"04f2aae1a90962c24e8739e6bbd1f7d8124043d6d56f594bba3c8c868bc521b3"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.965061 4707 generic.go:334] "Generic (PLEG): container finished" podID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" containerID="d17f33e08f973302863a6a158a1b0329e06116ee4116ba93b9392405aeaa6b17" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.965103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" 
event={"ID":"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4","Type":"ContainerDied","Data":"d17f33e08f973302863a6a158a1b0329e06116ee4116ba93b9392405aeaa6b17"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.967180 4707 generic.go:334] "Generic (PLEG): container finished" podID="8951bb5e-fd60-4761-9c83-bc523b041b83" containerID="5a8b6331813d6ce1cc4c1f242ea893b7520dd2bafdb2bc296453a569d7df2394" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.967275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" event={"ID":"8951bb5e-fd60-4761-9c83-bc523b041b83","Type":"ContainerDied","Data":"5a8b6331813d6ce1cc4c1f242ea893b7520dd2bafdb2bc296453a569d7df2394"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.967319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" event={"ID":"8951bb5e-fd60-4761-9c83-bc523b041b83","Type":"ContainerDied","Data":"ce4fd4fff028a04b0b493b691da329473637587b6188ac7cb63e9e073063ee80"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.967331 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce4fd4fff028a04b0b493b691da329473637587b6188ac7cb63e9e073063ee80" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.968758 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6ab0bb3-12a8-4314-83c7-280a512198f6" containerID="e4f42a8b26c44d79b0b42241e85e250bb46ac11e92535e51eb178335a37ba91b" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.968825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" event={"ID":"e6ab0bb3-12a8-4314-83c7-280a512198f6","Type":"ContainerDied","Data":"e4f42a8b26c44d79b0b42241e85e250bb46ac11e92535e51eb178335a37ba91b"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.970108 4707 generic.go:334] "Generic (PLEG): container finished" podID="0c59ef09-0beb-452c-8563-83bda9602961" containerID="8e0b35d221b6726be4c9bab8690c69dd64a57f1c4279e3e7101140eeba8d5358" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.970207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" event={"ID":"0c59ef09-0beb-452c-8563-83bda9602961","Type":"ContainerDied","Data":"8e0b35d221b6726be4c9bab8690c69dd64a57f1c4279e3e7101140eeba8d5358"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.972997 4707 generic.go:334] "Generic (PLEG): container finished" podID="2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" containerID="970be3767e71031a742c429f6c28e936b6930af2dc7e5f41e212c357f8c9b82e" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.973023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" event={"ID":"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd","Type":"ContainerDied","Data":"970be3767e71031a742c429f6c28e936b6930af2dc7e5f41e212c357f8c9b82e"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.974828 4707 generic.go:334] "Generic (PLEG): container finished" podID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" containerID="39c10d68c0dbc799c2b840647380cd032397d3c2ea03aba0b3c0758af87a066a" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.974891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" event={"ID":"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6","Type":"ContainerDied","Data":"39c10d68c0dbc799c2b840647380cd032397d3c2ea03aba0b3c0758af87a066a"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.977297 4707 generic.go:334] "Generic (PLEG): container finished" podID="a13a87c3-f0a5-4fa9-892d-e2371300124a" containerID="fdf5c3dc9a9fddaeed0daff9c1a05de484bfadd721c10c9f616a0c7b32569f54" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.977341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" event={"ID":"a13a87c3-f0a5-4fa9-892d-e2371300124a","Type":"ContainerDied","Data":"fdf5c3dc9a9fddaeed0daff9c1a05de484bfadd721c10c9f616a0c7b32569f54"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.978350 4707 generic.go:334] "Generic (PLEG): container finished" podID="6676ddd6-9d32-480d-9306-d3dc0676d2cd" containerID="2f7d7170e2337291b3525b16efe7ac13fde05e46c80b71e57c73c1795a5e8afa" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.978389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" event={"ID":"6676ddd6-9d32-480d-9306-d3dc0676d2cd","Type":"ContainerDied","Data":"2f7d7170e2337291b3525b16efe7ac13fde05e46c80b71e57c73c1795a5e8afa"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.980181 4707 generic.go:334] "Generic (PLEG): container finished" podID="7193a0d6-108f-4c14-a4fe-678aa49bc9dd" containerID="ce079610f88f7d056187fad72a6f6f1fb9e577c5ace7940757d24c6cc403e66c" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.980226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9hglm" event={"ID":"7193a0d6-108f-4c14-a4fe-678aa49bc9dd","Type":"ContainerDied","Data":"ce079610f88f7d056187fad72a6f6f1fb9e577c5ace7940757d24c6cc403e66c"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.981915 4707 generic.go:334] "Generic (PLEG): container finished" podID="375b0905-f382-4e75-8c5c-4ece8f6ddde3" containerID="098a977d584fdcb0ebe244d0fa4d9dc9081e754df204db8fed081b450701f4e2" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.981983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" event={"ID":"375b0905-f382-4e75-8c5c-4ece8f6ddde3","Type":"ContainerDied","Data":"098a977d584fdcb0ebe244d0fa4d9dc9081e754df204db8fed081b450701f4e2"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.983485 4707 generic.go:334] "Generic (PLEG): container finished" podID="96a9240d-2c77-4718-8f9d-60633df4eee4" containerID="76e7d98d71cfa80838fca4badb283a780fc5d99d758975b4e61ffbaedfea58aa" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.983549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" event={"ID":"96a9240d-2c77-4718-8f9d-60633df4eee4","Type":"ContainerDied","Data":"76e7d98d71cfa80838fca4badb283a780fc5d99d758975b4e61ffbaedfea58aa"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.984892 4707 generic.go:334] "Generic (PLEG): container finished" podID="40734c06-7e00-4184-9487-35be214c9556" containerID="e592e10125a5ad6cbafa06a737cbe06c97e19b4290da7089fbef2ffc14cfde56" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.984944 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" event={"ID":"40734c06-7e00-4184-9487-35be214c9556","Type":"ContainerDied","Data":"e592e10125a5ad6cbafa06a737cbe06c97e19b4290da7089fbef2ffc14cfde56"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.987605 4707 generic.go:334] "Generic (PLEG): container finished" podID="4f34a822-4c51-4dfc-9812-1287c9b3281d" containerID="c813c3747e39edf631ae0a59c3bfe2ea36cc0254e81c018a901ec8d129da1f49" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.987630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" event={"ID":"4f34a822-4c51-4dfc-9812-1287c9b3281d","Type":"ContainerDied","Data":"c813c3747e39edf631ae0a59c3bfe2ea36cc0254e81c018a901ec8d129da1f49"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.989584 4707 generic.go:334] "Generic (PLEG): container finished" podID="4adccf62-a1c3-430c-9137-a515f03b23e4" containerID="e45b507f5954b9cce48efe9efde816a45062e6746141cb2cb4c92830af61bce1" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.989626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" event={"ID":"4adccf62-a1c3-430c-9137-a515f03b23e4","Type":"ContainerDied","Data":"e45b507f5954b9cce48efe9efde816a45062e6746141cb2cb4c92830af61bce1"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.991341 4707 generic.go:334] "Generic (PLEG): container finished" podID="7a6b5573-a127-4713-919d-d79e38f60b87" containerID="33b413ceef8ffdcb86dbbdca687a971ccb0ac31560e3c2cc793fe5dde7400208" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.991385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" event={"ID":"7a6b5573-a127-4713-919d-d79e38f60b87","Type":"ContainerDied","Data":"33b413ceef8ffdcb86dbbdca687a971ccb0ac31560e3c2cc793fe5dde7400208"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.991401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" event={"ID":"7a6b5573-a127-4713-919d-d79e38f60b87","Type":"ContainerDied","Data":"441ce9e8ea74d8bf08ea2bf33e61e39b2087fd213c312b5aade143d12f83554a"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.991410 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="441ce9e8ea74d8bf08ea2bf33e61e39b2087fd213c312b5aade143d12f83554a" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.993304 4707 generic.go:334] "Generic (PLEG): container finished" podID="62e25886-609f-4ef6-848c-e116c738df6a" containerID="0046dd925d470a4baf4e0eb975a04283e1f25124964c96ed2b0c722a82a5b5aa" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.993322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" event={"ID":"62e25886-609f-4ef6-848c-e116c738df6a","Type":"ContainerDied","Data":"0046dd925d470a4baf4e0eb975a04283e1f25124964c96ed2b0c722a82a5b5aa"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.994626 4707 generic.go:334] "Generic (PLEG): container finished" podID="3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" containerID="55d73a60944fe9f7d7ad79e64d1b64d20cb2043c83d889359f590d510c77308d" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.994675 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" event={"ID":"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf","Type":"ContainerDied","Data":"55d73a60944fe9f7d7ad79e64d1b64d20cb2043c83d889359f590d510c77308d"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.996098 4707 generic.go:334] "Generic (PLEG): container finished" podID="9f333daa-a37b-4c2e-bd13-e66416449d2c" containerID="39ac84a7db285687d6db26470e02e89c210c3ae4cac2d9bde592910bf8a6ac53" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.996147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" event={"ID":"9f333daa-a37b-4c2e-bd13-e66416449d2c","Type":"ContainerDied","Data":"39ac84a7db285687d6db26470e02e89c210c3ae4cac2d9bde592910bf8a6ac53"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.996170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" event={"ID":"9f333daa-a37b-4c2e-bd13-e66416449d2c","Type":"ContainerDied","Data":"40ce83d397ae6c679a56d729cfc138ff755f95f5553be692fe3834a343abfd66"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.996180 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40ce83d397ae6c679a56d729cfc138ff755f95f5553be692fe3834a343abfd66" Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.997366 4707 generic.go:334] "Generic (PLEG): container finished" podID="06c244e2-fb38-4ac0-8ed0-2f34d01c5228" containerID="9f157a1954f415b2a222d4a1081d13ac7ca6e425474cd711399734533ef230cf" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.997406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" event={"ID":"06c244e2-fb38-4ac0-8ed0-2f34d01c5228","Type":"ContainerDied","Data":"9f157a1954f415b2a222d4a1081d13ac7ca6e425474cd711399734533ef230cf"} Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.998508 4707 generic.go:334] "Generic (PLEG): container finished" podID="4c2dd112-6dc9-4b29-95f2-25b3c44452a2" containerID="54bd8be04d416835f1281493540d9e01504393fe58be46567299bee6842f6ce5" exitCode=0 Jan 21 16:34:49 crc kubenswrapper[4707]: I0121 16:34:49.998530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" event={"ID":"4c2dd112-6dc9-4b29-95f2-25b3c44452a2","Type":"ContainerDied","Data":"54bd8be04d416835f1281493540d9e01504393fe58be46567299bee6842f6ce5"} Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.001306 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8707730-a731-419d-ad12-caf5fddafbd5" containerID="f81d6212e7b46c7cc4d9c03b8fdd9060c30d3305e97507e4bea0f56ee40e5424" exitCode=0 Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.001395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" event={"ID":"d8707730-a731-419d-ad12-caf5fddafbd5","Type":"ContainerDied","Data":"f81d6212e7b46c7cc4d9c03b8fdd9060c30d3305e97507e4bea0f56ee40e5424"} Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.002687 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d095e2a-7788-4a7e-af9b-eaed8407ef5a" containerID="0629f352d3f4441913ab06b6e0a66cad5ad04c26a87acf9f4fafc1b2313ff041" exitCode=0 Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 
16:34:50.002732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" event={"ID":"4d095e2a-7788-4a7e-af9b-eaed8407ef5a","Type":"ContainerDied","Data":"0629f352d3f4441913ab06b6e0a66cad5ad04c26a87acf9f4fafc1b2313ff041"} Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.004566 4707 generic.go:334] "Generic (PLEG): container finished" podID="a674fe35-9ddd-4352-b770-1061b28fce34" containerID="c8595d9ca2bcaf4f53f6bc9205a1d3fae85f9f44532fae632b4d59583c03b6f7" exitCode=0 Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.004613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" event={"ID":"a674fe35-9ddd-4352-b770-1061b28fce34","Type":"ContainerDied","Data":"c8595d9ca2bcaf4f53f6bc9205a1d3fae85f9f44532fae632b4d59583c03b6f7"} Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.005976 4707 generic.go:334] "Generic (PLEG): container finished" podID="9356dd92-4c2e-476d-9a60-946f4f148564" containerID="09c8747e2bac89103cb29b97e35d54f1e372a4dea0daecca785b4bd9414b0125" exitCode=0 Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.006019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" event={"ID":"9356dd92-4c2e-476d-9a60-946f4f148564","Type":"ContainerDied","Data":"09c8747e2bac89103cb29b97e35d54f1e372a4dea0daecca785b4bd9414b0125"} Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.007320 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ff533c3-a29f-498b-a575-1492c0e07aa9" containerID="63c114e32b92172fc2d850e2811160cb9627b2ab520857ee5130971d61c990d6" exitCode=0 Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.007343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" event={"ID":"1ff533c3-a29f-498b-a575-1492c0e07aa9","Type":"ContainerDied","Data":"63c114e32b92172fc2d850e2811160cb9627b2ab520857ee5130971d61c990d6"} Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.182833 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.953788 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.956133 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 16:34:50 crc kubenswrapper[4707]: I0121 16:34:50.996057 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.000603 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.003435 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.025748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" event={"ID":"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd","Type":"ContainerDied","Data":"4da72fd835c6b6a0e077ef11c59c9ed1a84e7cd455229909bfb1f2a92835217c"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.025776 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4da72fd835c6b6a0e077ef11c59c9ed1a84e7cd455229909bfb1f2a92835217c" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.028660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" event={"ID":"1ff533c3-a29f-498b-a575-1492c0e07aa9","Type":"ContainerDied","Data":"022e1abc920591480fa3d50ea5892e57ae18e0ce41634939d04f6fa16dc35cd3"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.028693 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="022e1abc920591480fa3d50ea5892e57ae18e0ce41634939d04f6fa16dc35cd3" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.028790 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.030038 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" event={"ID":"4f34a822-4c51-4dfc-9812-1287c9b3281d","Type":"ContainerDied","Data":"d764a7850aad2c6c911bd3737d1311a0dfb555c04e589572887c225ef28d544b"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.030057 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d764a7850aad2c6c911bd3737d1311a0dfb555c04e589572887c225ef28d544b" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.030595 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.031606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9hglm" event={"ID":"7193a0d6-108f-4c14-a4fe-678aa49bc9dd","Type":"ContainerDied","Data":"9ef0ead2e398d4bca1ad7f3bd12ead79aa420a4431f96466f258378ecec7ff34"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.031636 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ef0ead2e398d4bca1ad7f3bd12ead79aa420a4431f96466f258378ecec7ff34" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.031719 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.034992 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.036212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" event={"ID":"375b0905-f382-4e75-8c5c-4ece8f6ddde3","Type":"ContainerDied","Data":"2d9cfa07cb89a18e2a8095e5cb11e0e960479d3bc16f74cc3fab5ce9ee6ca333"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.036238 4707 scope.go:117] "RemoveContainer" containerID="098a977d584fdcb0ebe244d0fa4d9dc9081e754df204db8fed081b450701f4e2" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.044637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" event={"ID":"06c244e2-fb38-4ac0-8ed0-2f34d01c5228","Type":"ContainerDied","Data":"3b3fdc9a10007761ab66b639055919d6d4bc9530aa4b526bb3a5089bf964c0ba"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.044661 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b3fdc9a10007761ab66b639055919d6d4bc9530aa4b526bb3a5089bf964c0ba" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.045050 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.046403 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.046581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" event={"ID":"a674fe35-9ddd-4352-b770-1061b28fce34","Type":"ContainerDied","Data":"fe1e0f07ce25b64d2de7e3cc60e5ae49f2faed3568d04a2a88207aae537e8721"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.048635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" event={"ID":"9356dd92-4c2e-476d-9a60-946f4f148564","Type":"ContainerDied","Data":"61fece91d28908eb9db10f6ecab6f1f4fb5ec195e5d5cf6e11426c58e089c800"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.048657 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61fece91d28908eb9db10f6ecab6f1f4fb5ec195e5d5cf6e11426c58e089c800" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.055003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" event={"ID":"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6","Type":"ContainerDied","Data":"ec6cdae23307b1bc8f303f95b6facba65ebad5c47f04c52271d3a8370c4bed89"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.055026 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec6cdae23307b1bc8f303f95b6facba65ebad5c47f04c52271d3a8370c4bed89" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.055078 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.056634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" event={"ID":"a13a87c3-f0a5-4fa9-892d-e2371300124a","Type":"ContainerDied","Data":"a280de3748429292af9bd8cdd971f3edfea4a3a1463c47e1c74e1c31a0af19ec"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.056651 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a280de3748429292af9bd8cdd971f3edfea4a3a1463c47e1c74e1c31a0af19ec" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.057650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" event={"ID":"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4","Type":"ContainerDied","Data":"5af737c25c722e3f8df82b62942a9ca1d3e95970dccc2de59a013ce1eab7f526"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.057703 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.059120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" event={"ID":"6676ddd6-9d32-480d-9306-d3dc0676d2cd","Type":"ContainerDied","Data":"9d02792b93962300a94de11e733191975263ecb6156a6227a26eed844b193908"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.059141 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d02792b93962300a94de11e733191975263ecb6156a6227a26eed844b193908" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.061378 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" event={"ID":"96a9240d-2c77-4718-8f9d-60633df4eee4","Type":"ContainerDied","Data":"e5316412d015b5f7c3f3560b03a3a357463f0f6430836bf6e5fcdaf07e6f3619"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.061399 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5316412d015b5f7c3f3560b03a3a357463f0f6430836bf6e5fcdaf07e6f3619" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.062049 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.063688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" event={"ID":"4c2dd112-6dc9-4b29-95f2-25b3c44452a2","Type":"ContainerDied","Data":"a186ff0e1402d8715927e0825d3dc08c84dc756abdea3c4f7cea33e59208719f"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.063741 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.071039 4707 scope.go:117] "RemoveContainer" containerID="c8595d9ca2bcaf4f53f6bc9205a1d3fae85f9f44532fae632b4d59583c03b6f7" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.077764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" event={"ID":"bd0195ed-9602-42a1-bd0e-a0d79f8a6197","Type":"ContainerDied","Data":"2ab3ef5775dddc47d85bacd7254ab86ca56ddaeec082e2dd164c10b254cfb6c9"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.077837 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab3ef5775dddc47d85bacd7254ab86ca56ddaeec082e2dd164c10b254cfb6c9" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.079989 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.080149 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.085327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" event={"ID":"d8707730-a731-419d-ad12-caf5fddafbd5","Type":"ContainerDied","Data":"ee5c9c4f28d09358f5e4f684a85deb5b1c2de57d7f2b73a249fa4b280614f881"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.085373 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.091780 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.105950 4707 scope.go:117] "RemoveContainer" containerID="d17f33e08f973302863a6a158a1b0329e06116ee4116ba93b9392405aeaa6b17" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.106174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" event={"ID":"40734c06-7e00-4184-9487-35be214c9556","Type":"ContainerDied","Data":"5285b84a44266276fd32026dcaf0de016240472c642cd950fa8d30799db8f0f6"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.106225 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.108536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" event={"ID":"4d095e2a-7788-4a7e-af9b-eaed8407ef5a","Type":"ContainerDied","Data":"6afdd9a0ec2c36896a46fd19954ab8cbdade06f0eaa4f075f0870ff9c1530417"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.108623 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6afdd9a0ec2c36896a46fd19954ab8cbdade06f0eaa4f075f0870ff9c1530417" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.109091 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.109150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.110993 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.111230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf" event={"ID":"e6ab0bb3-12a8-4314-83c7-280a512198f6","Type":"ContainerDied","Data":"657fc70bbae0560de034485e811af0a0e4ca8589cd2b996e2a67a813b8c7a1be"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.118962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" event={"ID":"62e25886-609f-4ef6-848c-e116c738df6a","Type":"ContainerDied","Data":"2780ef155fd58bc61c7c38d5c95dcf76709fc13ec1935a89f985672e7a3b9a8d"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.119078 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.123366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psl8g\" (UniqueName: \"kubernetes.io/projected/7a6b5573-a127-4713-919d-d79e38f60b87-kube-api-access-psl8g\") pod \"7a6b5573-a127-4713-919d-d79e38f60b87\" (UID: \"7a6b5573-a127-4713-919d-d79e38f60b87\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.123422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58ttw\" (UniqueName: \"kubernetes.io/projected/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4-kube-api-access-58ttw\") pod \"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4\" (UID: \"5219c03b-89a8-4c73-b04f-2c6b1f8e29f4\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.123506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2kfs\" (UniqueName: \"kubernetes.io/projected/8951bb5e-fd60-4761-9c83-bc523b041b83-kube-api-access-r2kfs\") pod \"8951bb5e-fd60-4761-9c83-bc523b041b83\" (UID: \"8951bb5e-fd60-4761-9c83-bc523b041b83\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.123527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7zn9\" (UniqueName: \"kubernetes.io/projected/9f333daa-a37b-4c2e-bd13-e66416449d2c-kube-api-access-t7zn9\") pod \"9f333daa-a37b-4c2e-bd13-e66416449d2c\" (UID: \"9f333daa-a37b-4c2e-bd13-e66416449d2c\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.128853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6b5573-a127-4713-919d-d79e38f60b87-kube-api-access-psl8g" (OuterVolumeSpecName: "kube-api-access-psl8g") pod "7a6b5573-a127-4713-919d-d79e38f60b87" (UID: "7a6b5573-a127-4713-919d-d79e38f60b87"). InnerVolumeSpecName "kube-api-access-psl8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.129794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f333daa-a37b-4c2e-bd13-e66416449d2c-kube-api-access-t7zn9" (OuterVolumeSpecName: "kube-api-access-t7zn9") pod "9f333daa-a37b-4c2e-bd13-e66416449d2c" (UID: "9f333daa-a37b-4c2e-bd13-e66416449d2c"). InnerVolumeSpecName "kube-api-access-t7zn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.130408 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.130430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8951bb5e-fd60-4761-9c83-bc523b041b83-kube-api-access-r2kfs" (OuterVolumeSpecName: "kube-api-access-r2kfs") pod "8951bb5e-fd60-4761-9c83-bc523b041b83" (UID: "8951bb5e-fd60-4761-9c83-bc523b041b83"). InnerVolumeSpecName "kube-api-access-r2kfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.134104 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.134392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" event={"ID":"4adccf62-a1c3-430c-9137-a515f03b23e4","Type":"ContainerDied","Data":"e1181ff0d33bd2106492ac908180fc1e95d443aaeb3b20fa0870529df93627a5"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.134449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.134675 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.135620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" event={"ID":"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf","Type":"ContainerDied","Data":"469647dd32efc85773776a088c1ed3ef69c6f5394c9b24b1667952a2cb9a6ccc"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.135650 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="469647dd32efc85773776a088c1ed3ef69c6f5394c9b24b1667952a2cb9a6ccc" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.135768 4707 scope.go:117] "RemoveContainer" containerID="54bd8be04d416835f1281493540d9e01504393fe58be46567299bee6842f6ce5" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.136701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4-kube-api-access-58ttw" (OuterVolumeSpecName: "kube-api-access-58ttw") pod "5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" (UID: "5219c03b-89a8-4c73-b04f-2c6b1f8e29f4"). InnerVolumeSpecName "kube-api-access-58ttw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.137652 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.137883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" event={"ID":"0c59ef09-0beb-452c-8563-83bda9602961","Type":"ContainerDied","Data":"bcbff26adef1b2e1ec1e735705382ce38e79f7355d46171529c5ad812fc5ebf5"} Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.137968 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.138035 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.138039 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.170252 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.177706 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.179642 4707 scope.go:117] "RemoveContainer" containerID="f81d6212e7b46c7cc4d9c03b8fdd9060c30d3305e97507e4bea0f56ee40e5424" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.191253 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.193782 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa60207-9807-459c-9a86-65c1bd3d0064" path="/var/lib/kubelet/pods/2aa60207-9807-459c-9a86-65c1bd3d0064/volumes" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.197358 4707 scope.go:117] "RemoveContainer" containerID="e592e10125a5ad6cbafa06a737cbe06c97e19b4290da7089fbef2ffc14cfde56" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.207097 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.209465 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.217761 4707 scope.go:117] "RemoveContainer" containerID="e4f42a8b26c44d79b0b42241e85e250bb46ac11e92535e51eb178335a37ba91b" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.224599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rntz6\" (UniqueName: \"kubernetes.io/projected/4adccf62-a1c3-430c-9137-a515f03b23e4-kube-api-access-rntz6\") pod \"4adccf62-a1c3-430c-9137-a515f03b23e4\" (UID: \"4adccf62-a1c3-430c-9137-a515f03b23e4\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.224715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9tth\" (UniqueName: \"kubernetes.io/projected/e6ab0bb3-12a8-4314-83c7-280a512198f6-kube-api-access-d9tth\") pod \"e6ab0bb3-12a8-4314-83c7-280a512198f6\" (UID: \"e6ab0bb3-12a8-4314-83c7-280a512198f6\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.224790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-747pw\" (UniqueName: \"kubernetes.io/projected/62e25886-609f-4ef6-848c-e116c738df6a-kube-api-access-747pw\") pod \"62e25886-609f-4ef6-848c-e116c738df6a\" (UID: \"62e25886-609f-4ef6-848c-e116c738df6a\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.224896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") pod \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.224973 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xdt4\" (UniqueName: \"kubernetes.io/projected/0c59ef09-0beb-452c-8563-83bda9602961-kube-api-access-9xdt4\") pod \"0c59ef09-0beb-452c-8563-83bda9602961\" (UID: \"0c59ef09-0beb-452c-8563-83bda9602961\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flc4z\" (UniqueName: \"kubernetes.io/projected/6676ddd6-9d32-480d-9306-d3dc0676d2cd-kube-api-access-flc4z\") pod \"6676ddd6-9d32-480d-9306-d3dc0676d2cd\" (UID: \"6676ddd6-9d32-480d-9306-d3dc0676d2cd\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb588\" (UniqueName: \"kubernetes.io/projected/a674fe35-9ddd-4352-b770-1061b28fce34-kube-api-access-lb588\") pod \"a674fe35-9ddd-4352-b770-1061b28fce34\" (UID: \"a674fe35-9ddd-4352-b770-1061b28fce34\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v6bf\" (UniqueName: \"kubernetes.io/projected/96a9240d-2c77-4718-8f9d-60633df4eee4-kube-api-access-7v6bf\") pod \"96a9240d-2c77-4718-8f9d-60633df4eee4\" (UID: \"96a9240d-2c77-4718-8f9d-60633df4eee4\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcmrg\" (UniqueName: \"kubernetes.io/projected/40734c06-7e00-4184-9487-35be214c9556-kube-api-access-mcmrg\") pod \"40734c06-7e00-4184-9487-35be214c9556\" (UID: \"40734c06-7e00-4184-9487-35be214c9556\") " 
Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx4bk\" (UniqueName: \"kubernetes.io/projected/375b0905-f382-4e75-8c5c-4ece8f6ddde3-kube-api-access-qx4bk\") pod \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\" (UID: \"375b0905-f382-4e75-8c5c-4ece8f6ddde3\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8kcj\" (UniqueName: \"kubernetes.io/projected/7193a0d6-108f-4c14-a4fe-678aa49bc9dd-kube-api-access-c8kcj\") pod \"7193a0d6-108f-4c14-a4fe-678aa49bc9dd\" (UID: \"7193a0d6-108f-4c14-a4fe-678aa49bc9dd\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cx8v\" (UniqueName: \"kubernetes.io/projected/4c2dd112-6dc9-4b29-95f2-25b3c44452a2-kube-api-access-6cx8v\") pod \"4c2dd112-6dc9-4b29-95f2-25b3c44452a2\" (UID: \"4c2dd112-6dc9-4b29-95f2-25b3c44452a2\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225498 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dwfb\" (UniqueName: \"kubernetes.io/projected/d8707730-a731-419d-ad12-caf5fddafbd5-kube-api-access-4dwfb\") pod \"d8707730-a731-419d-ad12-caf5fddafbd5\" (UID: \"d8707730-a731-419d-ad12-caf5fddafbd5\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvggw\" (UniqueName: \"kubernetes.io/projected/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd-kube-api-access-hvggw\") pod \"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd\" (UID: \"2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225840 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psl8g\" (UniqueName: \"kubernetes.io/projected/7a6b5573-a127-4713-919d-d79e38f60b87-kube-api-access-psl8g\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225900 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58ttw\" (UniqueName: \"kubernetes.io/projected/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4-kube-api-access-58ttw\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.225964 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2kfs\" (UniqueName: \"kubernetes.io/projected/8951bb5e-fd60-4761-9c83-bc523b041b83-kube-api-access-r2kfs\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.226011 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7zn9\" (UniqueName: \"kubernetes.io/projected/9f333daa-a37b-4c2e-bd13-e66416449d2c-kube-api-access-t7zn9\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.227529 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.229467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd-kube-api-access-hvggw" (OuterVolumeSpecName: "kube-api-access-hvggw") pod "2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" (UID: "2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd"). InnerVolumeSpecName "kube-api-access-hvggw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.230757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert" (OuterVolumeSpecName: "cert") pod "375b0905-f382-4e75-8c5c-4ece8f6ddde3" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.230866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ab0bb3-12a8-4314-83c7-280a512198f6-kube-api-access-d9tth" (OuterVolumeSpecName: "kube-api-access-d9tth") pod "e6ab0bb3-12a8-4314-83c7-280a512198f6" (UID: "e6ab0bb3-12a8-4314-83c7-280a512198f6"). InnerVolumeSpecName "kube-api-access-d9tth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.231190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40734c06-7e00-4184-9487-35be214c9556-kube-api-access-mcmrg" (OuterVolumeSpecName: "kube-api-access-mcmrg") pod "40734c06-7e00-4184-9487-35be214c9556" (UID: "40734c06-7e00-4184-9487-35be214c9556"). InnerVolumeSpecName "kube-api-access-mcmrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.231347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375b0905-f382-4e75-8c5c-4ece8f6ddde3-kube-api-access-qx4bk" (OuterVolumeSpecName: "kube-api-access-qx4bk") pod "375b0905-f382-4e75-8c5c-4ece8f6ddde3" (UID: "375b0905-f382-4e75-8c5c-4ece8f6ddde3"). InnerVolumeSpecName "kube-api-access-qx4bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.232678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a674fe35-9ddd-4352-b770-1061b28fce34-kube-api-access-lb588" (OuterVolumeSpecName: "kube-api-access-lb588") pod "a674fe35-9ddd-4352-b770-1061b28fce34" (UID: "a674fe35-9ddd-4352-b770-1061b28fce34"). InnerVolumeSpecName "kube-api-access-lb588". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.232961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adccf62-a1c3-430c-9137-a515f03b23e4-kube-api-access-rntz6" (OuterVolumeSpecName: "kube-api-access-rntz6") pod "4adccf62-a1c3-430c-9137-a515f03b23e4" (UID: "4adccf62-a1c3-430c-9137-a515f03b23e4"). InnerVolumeSpecName "kube-api-access-rntz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.233879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7193a0d6-108f-4c14-a4fe-678aa49bc9dd-kube-api-access-c8kcj" (OuterVolumeSpecName: "kube-api-access-c8kcj") pod "7193a0d6-108f-4c14-a4fe-678aa49bc9dd" (UID: "7193a0d6-108f-4c14-a4fe-678aa49bc9dd"). InnerVolumeSpecName "kube-api-access-c8kcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.234013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a9240d-2c77-4718-8f9d-60633df4eee4-kube-api-access-7v6bf" (OuterVolumeSpecName: "kube-api-access-7v6bf") pod "96a9240d-2c77-4718-8f9d-60633df4eee4" (UID: "96a9240d-2c77-4718-8f9d-60633df4eee4"). InnerVolumeSpecName "kube-api-access-7v6bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.234062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8707730-a731-419d-ad12-caf5fddafbd5-kube-api-access-4dwfb" (OuterVolumeSpecName: "kube-api-access-4dwfb") pod "d8707730-a731-419d-ad12-caf5fddafbd5" (UID: "d8707730-a731-419d-ad12-caf5fddafbd5"). InnerVolumeSpecName "kube-api-access-4dwfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.235103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c59ef09-0beb-452c-8563-83bda9602961-kube-api-access-9xdt4" (OuterVolumeSpecName: "kube-api-access-9xdt4") pod "0c59ef09-0beb-452c-8563-83bda9602961" (UID: "0c59ef09-0beb-452c-8563-83bda9602961"). InnerVolumeSpecName "kube-api-access-9xdt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.235843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e25886-609f-4ef6-848c-e116c738df6a-kube-api-access-747pw" (OuterVolumeSpecName: "kube-api-access-747pw") pod "62e25886-609f-4ef6-848c-e116c738df6a" (UID: "62e25886-609f-4ef6-848c-e116c738df6a"). InnerVolumeSpecName "kube-api-access-747pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.238356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2dd112-6dc9-4b29-95f2-25b3c44452a2-kube-api-access-6cx8v" (OuterVolumeSpecName: "kube-api-access-6cx8v") pod "4c2dd112-6dc9-4b29-95f2-25b3c44452a2" (UID: "4c2dd112-6dc9-4b29-95f2-25b3c44452a2"). InnerVolumeSpecName "kube-api-access-6cx8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.242598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6676ddd6-9d32-480d-9306-d3dc0676d2cd-kube-api-access-flc4z" (OuterVolumeSpecName: "kube-api-access-flc4z") pod "6676ddd6-9d32-480d-9306-d3dc0676d2cd" (UID: "6676ddd6-9d32-480d-9306-d3dc0676d2cd"). InnerVolumeSpecName "kube-api-access-flc4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.243033 4707 scope.go:117] "RemoveContainer" containerID="0046dd925d470a4baf4e0eb975a04283e1f25124964c96ed2b0c722a82a5b5aa" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.283166 4707 scope.go:117] "RemoveContainer" containerID="e45b507f5954b9cce48efe9efde816a45062e6746141cb2cb4c92830af61bce1" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.301742 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.305401 4707 scope.go:117] "RemoveContainer" containerID="8e0b35d221b6726be4c9bab8690c69dd64a57f1c4279e3e7101140eeba8d5358" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.307729 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-bvgfl"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfrsj\" (UniqueName: \"kubernetes.io/projected/4f34a822-4c51-4dfc-9812-1287c9b3281d-kube-api-access-nfrsj\") pod \"4f34a822-4c51-4dfc-9812-1287c9b3281d\" (UID: \"4f34a822-4c51-4dfc-9812-1287c9b3281d\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5hl4\" (UniqueName: \"kubernetes.io/projected/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-kube-api-access-l5hl4\") pod \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7jqd\" (UniqueName: \"kubernetes.io/projected/a13a87c3-f0a5-4fa9-892d-e2371300124a-kube-api-access-x7jqd\") pod \"a13a87c3-f0a5-4fa9-892d-e2371300124a\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-profile-collector-cert\") pod \"a13a87c3-f0a5-4fa9-892d-e2371300124a\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg6cv\" (UniqueName: \"kubernetes.io/projected/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6-kube-api-access-hg6cv\") pod \"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6\" (UID: \"79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-srv-cert\") pod \"a13a87c3-f0a5-4fa9-892d-e2371300124a\" (UID: \"a13a87c3-f0a5-4fa9-892d-e2371300124a\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74ccm\" (UniqueName: \"kubernetes.io/projected/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf-kube-api-access-74ccm\") pod \"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf\" (UID: \"3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf\") " Jan 21 16:34:51 crc 
kubenswrapper[4707]: I0121 16:34:51.327744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") pod \"9356dd92-4c2e-476d-9a60-946f4f148564\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327776 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45x4v\" (UniqueName: \"kubernetes.io/projected/4d095e2a-7788-4a7e-af9b-eaed8407ef5a-kube-api-access-45x4v\") pod \"4d095e2a-7788-4a7e-af9b-eaed8407ef5a\" (UID: \"4d095e2a-7788-4a7e-af9b-eaed8407ef5a\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") pod \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfmcx\" (UniqueName: \"kubernetes.io/projected/1ff533c3-a29f-498b-a575-1492c0e07aa9-kube-api-access-nfmcx\") pod \"1ff533c3-a29f-498b-a575-1492c0e07aa9\" (UID: \"1ff533c3-a29f-498b-a575-1492c0e07aa9\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlcth\" (UniqueName: \"kubernetes.io/projected/9356dd92-4c2e-476d-9a60-946f4f148564-kube-api-access-xlcth\") pod \"9356dd92-4c2e-476d-9a60-946f4f148564\" (UID: \"9356dd92-4c2e-476d-9a60-946f4f148564\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.327981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") pod \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\" (UID: \"bd0195ed-9602-42a1-bd0e-a0d79f8a6197\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p96xf\" (UniqueName: \"kubernetes.io/projected/06c244e2-fb38-4ac0-8ed0-2f34d01c5228-kube-api-access-p96xf\") pod \"06c244e2-fb38-4ac0-8ed0-2f34d01c5228\" (UID: \"06c244e2-fb38-4ac0-8ed0-2f34d01c5228\") " Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328413 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rntz6\" (UniqueName: \"kubernetes.io/projected/4adccf62-a1c3-430c-9137-a515f03b23e4-kube-api-access-rntz6\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328425 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9tth\" (UniqueName: \"kubernetes.io/projected/e6ab0bb3-12a8-4314-83c7-280a512198f6-kube-api-access-d9tth\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328434 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-747pw\" (UniqueName: \"kubernetes.io/projected/62e25886-609f-4ef6-848c-e116c738df6a-kube-api-access-747pw\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328443 4707 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0905-f382-4e75-8c5c-4ece8f6ddde3-cert\") on node \"crc\" 
DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328452 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xdt4\" (UniqueName: \"kubernetes.io/projected/0c59ef09-0beb-452c-8563-83bda9602961-kube-api-access-9xdt4\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328494 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flc4z\" (UniqueName: \"kubernetes.io/projected/6676ddd6-9d32-480d-9306-d3dc0676d2cd-kube-api-access-flc4z\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328505 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb588\" (UniqueName: \"kubernetes.io/projected/a674fe35-9ddd-4352-b770-1061b28fce34-kube-api-access-lb588\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328539 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v6bf\" (UniqueName: \"kubernetes.io/projected/96a9240d-2c77-4718-8f9d-60633df4eee4-kube-api-access-7v6bf\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328576 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcmrg\" (UniqueName: \"kubernetes.io/projected/40734c06-7e00-4184-9487-35be214c9556-kube-api-access-mcmrg\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328584 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx4bk\" (UniqueName: \"kubernetes.io/projected/375b0905-f382-4e75-8c5c-4ece8f6ddde3-kube-api-access-qx4bk\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328593 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8kcj\" (UniqueName: \"kubernetes.io/projected/7193a0d6-108f-4c14-a4fe-678aa49bc9dd-kube-api-access-c8kcj\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328601 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cx8v\" (UniqueName: \"kubernetes.io/projected/4c2dd112-6dc9-4b29-95f2-25b3c44452a2-kube-api-access-6cx8v\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328610 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dwfb\" (UniqueName: \"kubernetes.io/projected/d8707730-a731-419d-ad12-caf5fddafbd5-kube-api-access-4dwfb\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.328618 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvggw\" (UniqueName: \"kubernetes.io/projected/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd-kube-api-access-hvggw\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.332285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9356dd92-4c2e-476d-9a60-946f4f148564-kube-api-access-xlcth" (OuterVolumeSpecName: "kube-api-access-xlcth") pod "9356dd92-4c2e-476d-9a60-946f4f148564" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564"). InnerVolumeSpecName "kube-api-access-xlcth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.332741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f34a822-4c51-4dfc-9812-1287c9b3281d-kube-api-access-nfrsj" (OuterVolumeSpecName: "kube-api-access-nfrsj") pod "4f34a822-4c51-4dfc-9812-1287c9b3281d" (UID: "4f34a822-4c51-4dfc-9812-1287c9b3281d"). InnerVolumeSpecName "kube-api-access-nfrsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.333050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "bd0195ed-9602-42a1-bd0e-a0d79f8a6197" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.333076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c244e2-fb38-4ac0-8ed0-2f34d01c5228-kube-api-access-p96xf" (OuterVolumeSpecName: "kube-api-access-p96xf") pod "06c244e2-fb38-4ac0-8ed0-2f34d01c5228" (UID: "06c244e2-fb38-4ac0-8ed0-2f34d01c5228"). InnerVolumeSpecName "kube-api-access-p96xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.335613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "bd0195ed-9602-42a1-bd0e-a0d79f8a6197" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.336083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf-kube-api-access-74ccm" (OuterVolumeSpecName: "kube-api-access-74ccm") pod "3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" (UID: "3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf"). InnerVolumeSpecName "kube-api-access-74ccm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.341264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13a87c3-f0a5-4fa9-892d-e2371300124a-kube-api-access-x7jqd" (OuterVolumeSpecName: "kube-api-access-x7jqd") pod "a13a87c3-f0a5-4fa9-892d-e2371300124a" (UID: "a13a87c3-f0a5-4fa9-892d-e2371300124a"). InnerVolumeSpecName "kube-api-access-x7jqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.341412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert" (OuterVolumeSpecName: "cert") pod "9356dd92-4c2e-476d-9a60-946f4f148564" (UID: "9356dd92-4c2e-476d-9a60-946f4f148564"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.342029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff533c3-a29f-498b-a575-1492c0e07aa9-kube-api-access-nfmcx" (OuterVolumeSpecName: "kube-api-access-nfmcx") pod "1ff533c3-a29f-498b-a575-1492c0e07aa9" (UID: "1ff533c3-a29f-498b-a575-1492c0e07aa9"). 
InnerVolumeSpecName "kube-api-access-nfmcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.343959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6-kube-api-access-hg6cv" (OuterVolumeSpecName: "kube-api-access-hg6cv") pod "79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" (UID: "79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6"). InnerVolumeSpecName "kube-api-access-hg6cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.344773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "a13a87c3-f0a5-4fa9-892d-e2371300124a" (UID: "a13a87c3-f0a5-4fa9-892d-e2371300124a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.345078 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d095e2a-7788-4a7e-af9b-eaed8407ef5a-kube-api-access-45x4v" (OuterVolumeSpecName: "kube-api-access-45x4v") pod "4d095e2a-7788-4a7e-af9b-eaed8407ef5a" (UID: "4d095e2a-7788-4a7e-af9b-eaed8407ef5a"). InnerVolumeSpecName "kube-api-access-45x4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.347058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "a13a87c3-f0a5-4fa9-892d-e2371300124a" (UID: "a13a87c3-f0a5-4fa9-892d-e2371300124a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.348284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-kube-api-access-l5hl4" (OuterVolumeSpecName: "kube-api-access-l5hl4") pod "bd0195ed-9602-42a1-bd0e-a0d79f8a6197" (UID: "bd0195ed-9602-42a1-bd0e-a0d79f8a6197"). InnerVolumeSpecName "kube-api-access-l5hl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:51 crc kubenswrapper[4707]: W0121 16:34:51.357173 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3ecdad3_a2db_4c61_a0cb_f8dc3390e4f2.slice/crio-2cd1b825d3f504bfaba577a745be01bb18d1c36b759dcc58d49cd440b34bd121 WatchSource:0}: Error finding container 2cd1b825d3f504bfaba577a745be01bb18d1c36b759dcc58d49cd440b34bd121: Status 404 returned error can't find the container with id 2cd1b825d3f504bfaba577a745be01bb18d1c36b759dcc58d49cd440b34bd121 Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.358138 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.363976 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-jlngq"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.373586 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.404692 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.408704 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-njwgg"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430153 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430268 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74ccm\" (UniqueName: \"kubernetes.io/projected/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf-kube-api-access-74ccm\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430325 4707 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9356dd92-4c2e-476d-9a60-946f4f148564-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430430 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45x4v\" (UniqueName: \"kubernetes.io/projected/4d095e2a-7788-4a7e-af9b-eaed8407ef5a-kube-api-access-45x4v\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430485 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430543 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfmcx\" (UniqueName: \"kubernetes.io/projected/1ff533c3-a29f-498b-a575-1492c0e07aa9-kube-api-access-nfmcx\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430591 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlcth\" (UniqueName: \"kubernetes.io/projected/9356dd92-4c2e-476d-9a60-946f4f148564-kube-api-access-xlcth\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430638 4707 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430686 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p96xf\" (UniqueName: \"kubernetes.io/projected/06c244e2-fb38-4ac0-8ed0-2f34d01c5228-kube-api-access-p96xf\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430732 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfrsj\" (UniqueName: \"kubernetes.io/projected/4f34a822-4c51-4dfc-9812-1287c9b3281d-kube-api-access-nfrsj\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430777 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5hl4\" (UniqueName: \"kubernetes.io/projected/bd0195ed-9602-42a1-bd0e-a0d79f8a6197-kube-api-access-l5hl4\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430847 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7jqd\" (UniqueName: \"kubernetes.io/projected/a13a87c3-f0a5-4fa9-892d-e2371300124a-kube-api-access-x7jqd\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430899 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a13a87c3-f0a5-4fa9-892d-e2371300124a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.430957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg6cv\" (UniqueName: \"kubernetes.io/projected/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6-kube-api-access-hg6cv\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.459081 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.470650 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-5tp6l"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.478448 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.483138 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-d9dr4"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.489872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.494714 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-jr4nf"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.501427 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.507281 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-79xkw"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.508394 4707 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.513617 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-kffk5"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.514891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.518127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lpptc"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.522940 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.526833 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-nhgpx"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.532384 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t"] Jan 21 16:34:51 crc kubenswrapper[4707]: I0121 16:34:51.536619 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-znz8t"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.146266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" event={"ID":"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2","Type":"ContainerStarted","Data":"e34208cc755feb9de03ccc179d2f010d48da83d8664be6ee74249fc523ff9186"} Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.146830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.146858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" event={"ID":"b3ecdad3-a2db-4c61-a0cb-f8dc3390e4f2","Type":"ContainerStarted","Data":"2cd1b825d3f504bfaba577a745be01bb18d1c36b759dcc58d49cd440b34bd121"} Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.151416 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.156411 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.162506 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcmsz" podStartSLOduration=3.162491796 podStartE2EDuration="3.162491796s" podCreationTimestamp="2026-01-21 16:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:34:52.161618513 +0000 UTC m=+5589.343134736" watchObservedRunningTime="2026-01-21 16:34:52.162491796 +0000 UTC m=+5589.344008018" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.167133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"70390b648c73daf4be255709128a6e64ca5ff234e05ed8fb129f19cfb2d9e39e"} Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.167936 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.170827 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175483 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9hglm" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175519 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175564 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175589 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175619 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175678 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175828 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175878 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.175921 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-99p87" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.176001 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.176977 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.180888 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq" Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.196254 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.201322 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-szzcm"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.258665 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.261627 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-h6pkz"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.266792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.278967 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-xpbv7"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.285629 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9hglm"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.291215 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9hglm"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.296585 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.301697 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-q6cj4"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.306123 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.310966 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jf4st"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.315982 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.320749 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-brpxr"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.324237 
4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-99p87"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.327845 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-99p87"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.330802 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.334123 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-hcgf8"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.337207 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.340579 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-nns7n"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.343566 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.350124 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-xkgbq"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.353523 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.356563 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxr5n"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.359516 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.362442 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-wgmrm"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.365413 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.368300 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dfskdx"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.371160 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj"] Jan 21 16:34:52 crc kubenswrapper[4707]: I0121 16:34:52.374261 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-bjcrj"] Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.187630 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c244e2-fb38-4ac0-8ed0-2f34d01c5228" path="/var/lib/kubelet/pods/06c244e2-fb38-4ac0-8ed0-2f34d01c5228/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.188750 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c59ef09-0beb-452c-8563-83bda9602961" 
path="/var/lib/kubelet/pods/0c59ef09-0beb-452c-8563-83bda9602961/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.189234 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff533c3-a29f-498b-a575-1492c0e07aa9" path="/var/lib/kubelet/pods/1ff533c3-a29f-498b-a575-1492c0e07aa9/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.189931 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" path="/var/lib/kubelet/pods/2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.191178 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375b0905-f382-4e75-8c5c-4ece8f6ddde3" path="/var/lib/kubelet/pods/375b0905-f382-4e75-8c5c-4ece8f6ddde3/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.191835 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" path="/var/lib/kubelet/pods/3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.192557 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40734c06-7e00-4184-9487-35be214c9556" path="/var/lib/kubelet/pods/40734c06-7e00-4184-9487-35be214c9556/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.193126 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4adccf62-a1c3-430c-9137-a515f03b23e4" path="/var/lib/kubelet/pods/4adccf62-a1c3-430c-9137-a515f03b23e4/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.193931 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2dd112-6dc9-4b29-95f2-25b3c44452a2" path="/var/lib/kubelet/pods/4c2dd112-6dc9-4b29-95f2-25b3c44452a2/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.194603 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d095e2a-7788-4a7e-af9b-eaed8407ef5a" path="/var/lib/kubelet/pods/4d095e2a-7788-4a7e-af9b-eaed8407ef5a/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.195016 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f34a822-4c51-4dfc-9812-1287c9b3281d" path="/var/lib/kubelet/pods/4f34a822-4c51-4dfc-9812-1287c9b3281d/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.195759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" path="/var/lib/kubelet/pods/5219c03b-89a8-4c73-b04f-2c6b1f8e29f4/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.196434 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e25886-609f-4ef6-848c-e116c738df6a" path="/var/lib/kubelet/pods/62e25886-609f-4ef6-848c-e116c738df6a/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.196850 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6676ddd6-9d32-480d-9306-d3dc0676d2cd" path="/var/lib/kubelet/pods/6676ddd6-9d32-480d-9306-d3dc0676d2cd/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.197716 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7193a0d6-108f-4c14-a4fe-678aa49bc9dd" path="/var/lib/kubelet/pods/7193a0d6-108f-4c14-a4fe-678aa49bc9dd/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.198522 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" 
path="/var/lib/kubelet/pods/79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.198936 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6b5573-a127-4713-919d-d79e38f60b87" path="/var/lib/kubelet/pods/7a6b5573-a127-4713-919d-d79e38f60b87/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.199313 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8951bb5e-fd60-4761-9c83-bc523b041b83" path="/var/lib/kubelet/pods/8951bb5e-fd60-4761-9c83-bc523b041b83/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.199755 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9356dd92-4c2e-476d-9a60-946f4f148564" path="/var/lib/kubelet/pods/9356dd92-4c2e-476d-9a60-946f4f148564/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.200539 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a9240d-2c77-4718-8f9d-60633df4eee4" path="/var/lib/kubelet/pods/96a9240d-2c77-4718-8f9d-60633df4eee4/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.201447 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" path="/var/lib/kubelet/pods/9f333daa-a37b-4c2e-bd13-e66416449d2c/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.202125 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13a87c3-f0a5-4fa9-892d-e2371300124a" path="/var/lib/kubelet/pods/a13a87c3-f0a5-4fa9-892d-e2371300124a/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.203182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a674fe35-9ddd-4352-b770-1061b28fce34" path="/var/lib/kubelet/pods/a674fe35-9ddd-4352-b770-1061b28fce34/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.204266 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0195ed-9602-42a1-bd0e-a0d79f8a6197" path="/var/lib/kubelet/pods/bd0195ed-9602-42a1-bd0e-a0d79f8a6197/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.204647 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" path="/var/lib/kubelet/pods/d8707730-a731-419d-ad12-caf5fddafbd5/volumes" Jan 21 16:34:53 crc kubenswrapper[4707]: I0121 16:34:53.205070 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ab0bb3-12a8-4314-83c7-280a512198f6" path="/var/lib/kubelet/pods/e6ab0bb3-12a8-4314-83c7-280a512198f6/volumes" Jan 21 16:34:56 crc kubenswrapper[4707]: I0121 16:34:56.918952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-55945" Jan 21 16:34:56 crc kubenswrapper[4707]: I0121 16:34:56.950896 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48"] Jan 21 16:34:56 crc kubenswrapper[4707]: I0121 16:34:56.951110 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerName="dnsmasq-dns" containerID="cri-o://cda0cd3d3e46c04599a564f3667435856cc8b6b0459d1f4b028967a746caaf2e" gracePeriod=10 Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.202980 4707 generic.go:334] "Generic (PLEG): container finished" podID="0dbad968-3a49-4bde-adea-f5469ef57e26" 
containerID="cda0cd3d3e46c04599a564f3667435856cc8b6b0459d1f4b028967a746caaf2e" exitCode=0 Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.203037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" event={"ID":"0dbad968-3a49-4bde-adea-f5469ef57e26","Type":"ContainerDied","Data":"cda0cd3d3e46c04599a564f3667435856cc8b6b0459d1f4b028967a746caaf2e"} Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.283474 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.401501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44tl9\" (UniqueName: \"kubernetes.io/projected/0dbad968-3a49-4bde-adea-f5469ef57e26-kube-api-access-44tl9\") pod \"0dbad968-3a49-4bde-adea-f5469ef57e26\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.401630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-edpm-compute-no-nodes\") pod \"0dbad968-3a49-4bde-adea-f5469ef57e26\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.401705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-dnsmasq-svc\") pod \"0dbad968-3a49-4bde-adea-f5469ef57e26\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.401829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-config\") pod \"0dbad968-3a49-4bde-adea-f5469ef57e26\" (UID: \"0dbad968-3a49-4bde-adea-f5469ef57e26\") " Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.408400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbad968-3a49-4bde-adea-f5469ef57e26-kube-api-access-44tl9" (OuterVolumeSpecName: "kube-api-access-44tl9") pod "0dbad968-3a49-4bde-adea-f5469ef57e26" (UID: "0dbad968-3a49-4bde-adea-f5469ef57e26"). InnerVolumeSpecName "kube-api-access-44tl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.426213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-config" (OuterVolumeSpecName: "config") pod "0dbad968-3a49-4bde-adea-f5469ef57e26" (UID: "0dbad968-3a49-4bde-adea-f5469ef57e26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.429453 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "0dbad968-3a49-4bde-adea-f5469ef57e26" (UID: "0dbad968-3a49-4bde-adea-f5469ef57e26"). InnerVolumeSpecName "edpm-compute-no-nodes". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.429859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "0dbad968-3a49-4bde-adea-f5469ef57e26" (UID: "0dbad968-3a49-4bde-adea-f5469ef57e26"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.502978 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44tl9\" (UniqueName: \"kubernetes.io/projected/0dbad968-3a49-4bde-adea-f5469ef57e26-kube-api-access-44tl9\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.503042 4707 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.503055 4707 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:57 crc kubenswrapper[4707]: I0121 16:34:57.503084 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbad968-3a49-4bde-adea-f5469ef57e26-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:34:58 crc kubenswrapper[4707]: I0121 16:34:58.212966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" event={"ID":"0dbad968-3a49-4bde-adea-f5469ef57e26","Type":"ContainerDied","Data":"b914ec37c6c58aca27b0c2456220085734d64244aaa9929e4a8607538c82645a"} Jan 21 16:34:58 crc kubenswrapper[4707]: I0121 16:34:58.214086 4707 scope.go:117] "RemoveContainer" containerID="cda0cd3d3e46c04599a564f3667435856cc8b6b0459d1f4b028967a746caaf2e" Jan 21 16:34:58 crc kubenswrapper[4707]: I0121 16:34:58.213074 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48" Jan 21 16:34:58 crc kubenswrapper[4707]: I0121 16:34:58.231668 4707 scope.go:117] "RemoveContainer" containerID="72078d0f517b3a38e97ea75e6c3b999608d28f846e8bb0f875c95d48243fc528" Jan 21 16:34:58 crc kubenswrapper[4707]: I0121 16:34:58.251845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48"] Jan 21 16:34:58 crc kubenswrapper[4707]: I0121 16:34:58.256249 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-rrs48"] Jan 21 16:34:59 crc kubenswrapper[4707]: I0121 16:34:59.188513 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" path="/var/lib/kubelet/pods/0dbad968-3a49-4bde-adea-f5469ef57e26/volumes" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.241278 4707 scope.go:117] "RemoveContainer" containerID="4d385ff0fd544a8fd564aa4e61639b316b6a512252171f9b1a32d9dfee33ed90" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.270054 4707 scope.go:117] "RemoveContainer" containerID="55d73a60944fe9f7d7ad79e64d1b64d20cb2043c83d889359f590d510c77308d" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.290955 4707 scope.go:117] "RemoveContainer" containerID="c813c3747e39edf631ae0a59c3bfe2ea36cc0254e81c018a901ec8d129da1f49" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.305764 4707 scope.go:117] "RemoveContainer" containerID="2c8f1e5580a5cce674b79cd942f3bade7b066cfdc65965c986e6ff0d22714bcf" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.347367 4707 scope.go:117] "RemoveContainer" containerID="ce079610f88f7d056187fad72a6f6f1fb9e577c5ace7940757d24c6cc403e66c" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.394245 4707 scope.go:117] "RemoveContainer" containerID="39ac84a7db285687d6db26470e02e89c210c3ae4cac2d9bde592910bf8a6ac53" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.407529 4707 scope.go:117] "RemoveContainer" containerID="2f7d7170e2337291b3525b16efe7ac13fde05e46c80b71e57c73c1795a5e8afa" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.420014 4707 scope.go:117] "RemoveContainer" containerID="5a8b6331813d6ce1cc4c1f242ea893b7520dd2bafdb2bc296453a569d7df2394" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.430759 4707 scope.go:117] "RemoveContainer" containerID="9f157a1954f415b2a222d4a1081d13ac7ca6e425474cd711399734533ef230cf" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.443247 4707 scope.go:117] "RemoveContainer" containerID="0629f352d3f4441913ab06b6e0a66cad5ad04c26a87acf9f4fafc1b2313ff041" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.454503 4707 scope.go:117] "RemoveContainer" containerID="7057f26105b6453321d8c7da6188b80f8fd7183951b31d4c814de9519921057c" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.474861 4707 scope.go:117] "RemoveContainer" containerID="000286ff363eea2f8b6a083f6b5e81ef6210f8522987957ba61eb6a424ae9a8c" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.501105 4707 scope.go:117] "RemoveContainer" containerID="970be3767e71031a742c429f6c28e936b6930af2dc7e5f41e212c357f8c9b82e" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.513861 4707 scope.go:117] "RemoveContainer" containerID="df49a057d7fa82fbce3996c614e3ff4dd75f687fbc0ad5227d302ba47563aa32" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.534448 4707 scope.go:117] "RemoveContainer" 
containerID="364b4948973dd0ac9ed3552d54641cd50c5c504f89a149ff52ada4dbfa882795" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.551403 4707 scope.go:117] "RemoveContainer" containerID="04f2aae1a90962c24e8739e6bbd1f7d8124043d6d56f594bba3c8c868bc521b3" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.571015 4707 scope.go:117] "RemoveContainer" containerID="09c8747e2bac89103cb29b97e35d54f1e372a4dea0daecca785b4bd9414b0125" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.583931 4707 scope.go:117] "RemoveContainer" containerID="39c10d68c0dbc799c2b840647380cd032397d3c2ea03aba0b3c0758af87a066a" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.595116 4707 scope.go:117] "RemoveContainer" containerID="fdf5c3dc9a9fddaeed0daff9c1a05de484bfadd721c10c9f616a0c7b32569f54" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.608664 4707 scope.go:117] "RemoveContainer" containerID="6626b64dbda518fb1ffd1ba7771b7f83ffe9a34cd98bfdb1b760b519bd91e28d" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.624694 4707 scope.go:117] "RemoveContainer" containerID="e1d1984b899ca95b672c40085601a9c986e18a9d4c78316ea159a20725baba06" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.641167 4707 scope.go:117] "RemoveContainer" containerID="63c114e32b92172fc2d850e2811160cb9627b2ab520857ee5130971d61c990d6" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.654331 4707 scope.go:117] "RemoveContainer" containerID="a89beb8f616e9c2e76488a5597f9ef95006c2a8218ea25dfa344b99300ccd355" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.666461 4707 scope.go:117] "RemoveContainer" containerID="76e7d98d71cfa80838fca4badb283a780fc5d99d758975b4e61ffbaedfea58aa" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.679061 4707 scope.go:117] "RemoveContainer" containerID="33b413ceef8ffdcb86dbbdca687a971ccb0ac31560e3c2cc793fe5dde7400208" Jan 21 16:35:24 crc kubenswrapper[4707]: I0121 16:35:24.693691 4707 scope.go:117] "RemoveContainer" containerID="504a643a38814d378695bb674a67d2ee66a91f3b1acb086daca0465a078de5b2" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429296 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-kvt24"] Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429792 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375b0905-f382-4e75-8c5c-4ece8f6ddde3" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429803 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="375b0905-f382-4e75-8c5c-4ece8f6ddde3" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429829 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6b5573-a127-4713-919d-d79e38f60b87" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429835 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6b5573-a127-4713-919d-d79e38f60b87" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429844 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerName="init" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429850 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerName="init" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429859 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adccf62-a1c3-430c-9137-a515f03b23e4" containerName="manager" 
Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429865 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adccf62-a1c3-430c-9137-a515f03b23e4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429873 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a9240d-2c77-4718-8f9d-60633df4eee4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429878 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a9240d-2c77-4718-8f9d-60633df4eee4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429888 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429901 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429909 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a674fe35-9ddd-4352-b770-1061b28fce34" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429913 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a674fe35-9ddd-4352-b770-1061b28fce34" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429923 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429929 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429936 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" containerName="operator" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429941 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" containerName="operator" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e25886-609f-4ef6-848c-e116c738df6a" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429952 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e25886-609f-4ef6-848c-e116c738df6a" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429960 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40734c06-7e00-4184-9487-35be214c9556" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429965 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="40734c06-7e00-4184-9487-35be214c9556" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429975 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429980 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.429987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c59ef09-0beb-452c-8563-83bda9602961" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.429993 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0c59ef09-0beb-452c-8563-83bda9602961" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430000 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f34a822-4c51-4dfc-9812-1287c9b3281d" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430005 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f34a822-4c51-4dfc-9812-1287c9b3281d" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7193a0d6-108f-4c14-a4fe-678aa49bc9dd" containerName="registry-server" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430019 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7193a0d6-108f-4c14-a4fe-678aa49bc9dd" containerName="registry-server" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430037 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0195ed-9602-42a1-bd0e-a0d79f8a6197" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430042 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0195ed-9602-42a1-bd0e-a0d79f8a6197" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ab0bb3-12a8-4314-83c7-280a512198f6" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430055 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ab0bb3-12a8-4314-83c7-280a512198f6" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430064 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c244e2-fb38-4ac0-8ed0-2f34d01c5228" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430070 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c244e2-fb38-4ac0-8ed0-2f34d01c5228" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430080 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerName="dnsmasq-dns" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430086 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerName="dnsmasq-dns" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430094 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d095e2a-7788-4a7e-af9b-eaed8407ef5a" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d095e2a-7788-4a7e-af9b-eaed8407ef5a" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430105 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff533c3-a29f-498b-a575-1492c0e07aa9" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430109 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff533c3-a29f-498b-a575-1492c0e07aa9" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430116 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13a87c3-f0a5-4fa9-892d-e2371300124a" containerName="catalog-operator" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430524 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13a87c3-f0a5-4fa9-892d-e2371300124a" containerName="catalog-operator" Jan 21 
16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8951bb5e-fd60-4761-9c83-bc523b041b83" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430545 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8951bb5e-fd60-4761-9c83-bc523b041b83" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430555 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6676ddd6-9d32-480d-9306-d3dc0676d2cd" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430560 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6676ddd6-9d32-480d-9306-d3dc0676d2cd" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430569 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" containerName="operator" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430574 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" containerName="operator" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430580 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2dd112-6dc9-4b29-95f2-25b3c44452a2" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2dd112-6dc9-4b29-95f2-25b3c44452a2" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430592 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9356dd92-4c2e-476d-9a60-946f4f148564" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430597 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9356dd92-4c2e-476d-9a60-946f4f148564" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: E0121 16:35:28.430604 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430609 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430699 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c59ef09-0beb-452c-8563-83bda9602961" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430711 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6676ddd6-9d32-480d-9306-d3dc0676d2cd" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430719 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d145ef-2bd3-4c20-ae4b-fff81dbfc4b6" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430727 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbad968-3a49-4bde-adea-f5469ef57e26" containerName="dnsmasq-dns" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430733 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="40734c06-7e00-4184-9487-35be214c9556" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430741 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f333daa-a37b-4c2e-bd13-e66416449d2c" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430747 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="06c244e2-fb38-4ac0-8ed0-2f34d01c5228" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7193a0d6-108f-4c14-a4fe-678aa49bc9dd" containerName="registry-server" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430761 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd457aa-b1c8-4b86-ada4-17f76c7ad4dd" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430769 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a9240d-2c77-4718-8f9d-60633df4eee4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430777 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2dd112-6dc9-4b29-95f2-25b3c44452a2" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430786 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="375b0905-f382-4e75-8c5c-4ece8f6ddde3" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430793 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5219c03b-89a8-4c73-b04f-2c6b1f8e29f4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430800 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d095e2a-7788-4a7e-af9b-eaed8407ef5a" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430826 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adccf62-a1c3-430c-9137-a515f03b23e4" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430833 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6b5573-a127-4713-919d-d79e38f60b87" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430839 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a674fe35-9ddd-4352-b770-1061b28fce34" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430847 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9356dd92-4c2e-476d-9a60-946f4f148564" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8951bb5e-fd60-4761-9c83-bc523b041b83" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430862 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f34a822-4c51-4dfc-9812-1287c9b3281d" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430869 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff533c3-a29f-498b-a575-1492c0e07aa9" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430876 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ab0bb3-12a8-4314-83c7-280a512198f6" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e25886-609f-4ef6-848c-e116c738df6a" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430890 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8707730-a731-419d-ad12-caf5fddafbd5" containerName="operator" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430906 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a13a87c3-f0a5-4fa9-892d-e2371300124a" containerName="catalog-operator" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430913 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c778ef9-8d0d-4dd3-b1ab-1541f25cfbbf" containerName="operator" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.430919 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0195ed-9602-42a1-bd0e-a0d79f8a6197" containerName="manager" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.431249 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.432728 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-pb2v7" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.433106 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.434339 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.440420 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-kvt24"] Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.454071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbkv\" (UniqueName: \"kubernetes.io/projected/f0b3d536-9e96-45b8-a947-054125a3efec-kube-api-access-fzbkv\") pod \"mariadb-operator-index-kvt24\" (UID: \"f0b3d536-9e96-45b8-a947-054125a3efec\") " pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.555412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbkv\" (UniqueName: \"kubernetes.io/projected/f0b3d536-9e96-45b8-a947-054125a3efec-kube-api-access-fzbkv\") pod \"mariadb-operator-index-kvt24\" (UID: \"f0b3d536-9e96-45b8-a947-054125a3efec\") " pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.570498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbkv\" (UniqueName: \"kubernetes.io/projected/f0b3d536-9e96-45b8-a947-054125a3efec-kube-api-access-fzbkv\") pod \"mariadb-operator-index-kvt24\" (UID: \"f0b3d536-9e96-45b8-a947-054125a3efec\") " pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.744049 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:28 crc kubenswrapper[4707]: I0121 16:35:28.820638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-kvt24"] Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.080938 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-kvt24"] Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.086540 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.219418 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-jmcgw"] Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.220119 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.224273 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-jmcgw"] Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.365033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wghg\" (UniqueName: \"kubernetes.io/projected/7789650d-9683-47fd-a464-ba7cc7dac503-kube-api-access-6wghg\") pod \"mariadb-operator-index-jmcgw\" (UID: \"7789650d-9683-47fd-a464-ba7cc7dac503\") " pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.420209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kvt24" event={"ID":"f0b3d536-9e96-45b8-a947-054125a3efec","Type":"ContainerStarted","Data":"70ca115c1819b37aa32be05e806f2dffca2d548cb347e1b6244fe8b1857228da"} Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.466045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wghg\" (UniqueName: \"kubernetes.io/projected/7789650d-9683-47fd-a464-ba7cc7dac503-kube-api-access-6wghg\") pod \"mariadb-operator-index-jmcgw\" (UID: \"7789650d-9683-47fd-a464-ba7cc7dac503\") " pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.480082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wghg\" (UniqueName: \"kubernetes.io/projected/7789650d-9683-47fd-a464-ba7cc7dac503-kube-api-access-6wghg\") pod \"mariadb-operator-index-jmcgw\" (UID: \"7789650d-9683-47fd-a464-ba7cc7dac503\") " pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.544093 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:29 crc kubenswrapper[4707]: I0121 16:35:29.897636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-jmcgw"] Jan 21 16:35:29 crc kubenswrapper[4707]: W0121 16:35:29.900801 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7789650d_9683_47fd_a464_ba7cc7dac503.slice/crio-243db92db96f74533d0d6eda2718b592a594577e64402430c1c4c705c9021291 WatchSource:0}: Error finding container 243db92db96f74533d0d6eda2718b592a594577e64402430c1c4c705c9021291: Status 404 returned error can't find the container with id 243db92db96f74533d0d6eda2718b592a594577e64402430c1c4c705c9021291 Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.428543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-jmcgw" event={"ID":"7789650d-9683-47fd-a464-ba7cc7dac503","Type":"ContainerStarted","Data":"243db92db96f74533d0d6eda2718b592a594577e64402430c1c4c705c9021291"} Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.430407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kvt24" event={"ID":"f0b3d536-9e96-45b8-a947-054125a3efec","Type":"ContainerStarted","Data":"1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2"} Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.430526 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-kvt24" podUID="f0b3d536-9e96-45b8-a947-054125a3efec" containerName="registry-server" containerID="cri-o://1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2" gracePeriod=2 Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.447694 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-kvt24" podStartSLOduration=1.6419639579999998 podStartE2EDuration="2.447676217s" podCreationTimestamp="2026-01-21 16:35:28 +0000 UTC" firstStartedPulling="2026-01-21 16:35:29.086301157 +0000 UTC m=+5626.267817379" lastFinishedPulling="2026-01-21 16:35:29.892013416 +0000 UTC m=+5627.073529638" observedRunningTime="2026-01-21 16:35:30.445137133 +0000 UTC m=+5627.626653355" watchObservedRunningTime="2026-01-21 16:35:30.447676217 +0000 UTC m=+5627.629192439" Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.723990 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.880144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbkv\" (UniqueName: \"kubernetes.io/projected/f0b3d536-9e96-45b8-a947-054125a3efec-kube-api-access-fzbkv\") pod \"f0b3d536-9e96-45b8-a947-054125a3efec\" (UID: \"f0b3d536-9e96-45b8-a947-054125a3efec\") " Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.885358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b3d536-9e96-45b8-a947-054125a3efec-kube-api-access-fzbkv" (OuterVolumeSpecName: "kube-api-access-fzbkv") pod "f0b3d536-9e96-45b8-a947-054125a3efec" (UID: "f0b3d536-9e96-45b8-a947-054125a3efec"). InnerVolumeSpecName "kube-api-access-fzbkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:35:30 crc kubenswrapper[4707]: I0121 16:35:30.981262 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbkv\" (UniqueName: \"kubernetes.io/projected/f0b3d536-9e96-45b8-a947-054125a3efec-kube-api-access-fzbkv\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.436483 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0b3d536-9e96-45b8-a947-054125a3efec" containerID="1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2" exitCode=0 Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.436528 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-kvt24" Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.436523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kvt24" event={"ID":"f0b3d536-9e96-45b8-a947-054125a3efec","Type":"ContainerDied","Data":"1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2"} Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.436636 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kvt24" event={"ID":"f0b3d536-9e96-45b8-a947-054125a3efec","Type":"ContainerDied","Data":"70ca115c1819b37aa32be05e806f2dffca2d548cb347e1b6244fe8b1857228da"} Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.436663 4707 scope.go:117] "RemoveContainer" containerID="1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2" Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.437739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-jmcgw" event={"ID":"7789650d-9683-47fd-a464-ba7cc7dac503","Type":"ContainerStarted","Data":"27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2"} Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.449689 4707 scope.go:117] "RemoveContainer" containerID="1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2" Jan 21 16:35:31 crc kubenswrapper[4707]: E0121 16:35:31.450022 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2\": container with ID starting with 1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2 not found: ID does not exist" containerID="1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2" Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.450059 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2"} err="failed to get container status \"1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2\": rpc error: code = NotFound desc = could not find container \"1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2\": container with ID starting with 1c3a3b372a60d75003a0cd806dfd8263caef8d6ad87c56c7d3ded0f9401530f2 not found: ID does not exist" Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.455088 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-jmcgw" podStartSLOduration=1.988210762 podStartE2EDuration="2.455074935s" podCreationTimestamp="2026-01-21 16:35:29 +0000 UTC" firstStartedPulling="2026-01-21 16:35:29.904189059 +0000 UTC 
m=+5627.085705282" lastFinishedPulling="2026-01-21 16:35:30.371053232 +0000 UTC m=+5627.552569455" observedRunningTime="2026-01-21 16:35:31.451238081 +0000 UTC m=+5628.632754303" watchObservedRunningTime="2026-01-21 16:35:31.455074935 +0000 UTC m=+5628.636591158" Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.461904 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-kvt24"] Jan 21 16:35:31 crc kubenswrapper[4707]: I0121 16:35:31.464953 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-kvt24"] Jan 21 16:35:33 crc kubenswrapper[4707]: I0121 16:35:33.187855 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b3d536-9e96-45b8-a947-054125a3efec" path="/var/lib/kubelet/pods/f0b3d536-9e96-45b8-a947-054125a3efec/volumes" Jan 21 16:35:39 crc kubenswrapper[4707]: I0121 16:35:39.545141 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:39 crc kubenswrapper[4707]: I0121 16:35:39.545352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:39 crc kubenswrapper[4707]: I0121 16:35:39.565189 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:40 crc kubenswrapper[4707]: I0121 16:35:40.504114 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:35:52 crc kubenswrapper[4707]: I0121 16:35:52.853553 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j"] Jan 21 16:35:52 crc kubenswrapper[4707]: E0121 16:35:52.854107 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b3d536-9e96-45b8-a947-054125a3efec" containerName="registry-server" Jan 21 16:35:52 crc kubenswrapper[4707]: I0121 16:35:52.854119 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b3d536-9e96-45b8-a947-054125a3efec" containerName="registry-server" Jan 21 16:35:52 crc kubenswrapper[4707]: I0121 16:35:52.854245 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b3d536-9e96-45b8-a947-054125a3efec" containerName="registry-server" Jan 21 16:35:52 crc kubenswrapper[4707]: I0121 16:35:52.854972 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:52 crc kubenswrapper[4707]: I0121 16:35:52.858168 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:35:52 crc kubenswrapper[4707]: I0121 16:35:52.861177 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j"] Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.024441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.024849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lcj8\" (UniqueName: \"kubernetes.io/projected/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-kube-api-access-7lcj8\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.024900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.126155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lcj8\" (UniqueName: \"kubernetes.io/projected/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-kube-api-access-7lcj8\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.126214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.126256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.126635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.126676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.140595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lcj8\" (UniqueName: \"kubernetes.io/projected/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-kube-api-access-7lcj8\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.167866 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:53 crc kubenswrapper[4707]: I0121 16:35:53.538783 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j"] Jan 21 16:35:54 crc kubenswrapper[4707]: I0121 16:35:54.555715 4707 generic.go:334] "Generic (PLEG): container finished" podID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerID="d6ad1bad2014968ef32dd04e52254652b213394aa3d7617ad98864842c6b0017" exitCode=0 Jan 21 16:35:54 crc kubenswrapper[4707]: I0121 16:35:54.555754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" event={"ID":"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5","Type":"ContainerDied","Data":"d6ad1bad2014968ef32dd04e52254652b213394aa3d7617ad98864842c6b0017"} Jan 21 16:35:54 crc kubenswrapper[4707]: I0121 16:35:54.555790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" event={"ID":"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5","Type":"ContainerStarted","Data":"36ccc0540b3df4dcab8592b658b9248e67b4c4bacb488755953ca340e78650c1"} Jan 21 16:35:56 crc kubenswrapper[4707]: I0121 16:35:56.571170 4707 generic.go:334] "Generic (PLEG): container finished" podID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerID="a3477f81007dd93bfd43fe19286511d2dbee1acd4e837fa19ab4d0cc6f6f9002" exitCode=0 Jan 21 16:35:56 crc kubenswrapper[4707]: I0121 16:35:56.571245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" event={"ID":"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5","Type":"ContainerDied","Data":"a3477f81007dd93bfd43fe19286511d2dbee1acd4e837fa19ab4d0cc6f6f9002"} Jan 21 16:35:57 crc kubenswrapper[4707]: I0121 16:35:57.581483 4707 generic.go:334] "Generic (PLEG): container finished" podID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerID="1d24beb383009e5f7a010d7afeedbf40ce30e2fc400bf8a948a3846c293e79d8" exitCode=0 Jan 21 16:35:57 crc kubenswrapper[4707]: I0121 16:35:57.581555 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" event={"ID":"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5","Type":"ContainerDied","Data":"1d24beb383009e5f7a010d7afeedbf40ce30e2fc400bf8a948a3846c293e79d8"} Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.808838 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.899365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-util\") pod \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.899453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-bundle\") pod \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.899488 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lcj8\" (UniqueName: \"kubernetes.io/projected/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-kube-api-access-7lcj8\") pod \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\" (UID: \"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5\") " Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.900331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-bundle" (OuterVolumeSpecName: "bundle") pod "0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" (UID: "0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.905706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-kube-api-access-7lcj8" (OuterVolumeSpecName: "kube-api-access-7lcj8") pod "0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" (UID: "0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5"). InnerVolumeSpecName "kube-api-access-7lcj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:35:58 crc kubenswrapper[4707]: I0121 16:35:58.910329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-util" (OuterVolumeSpecName: "util") pod "0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" (UID: "0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:35:59 crc kubenswrapper[4707]: I0121 16:35:59.000520 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:59 crc kubenswrapper[4707]: I0121 16:35:59.000556 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lcj8\" (UniqueName: \"kubernetes.io/projected/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-kube-api-access-7lcj8\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:59 crc kubenswrapper[4707]: I0121 16:35:59.000569 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:59 crc kubenswrapper[4707]: I0121 16:35:59.600726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" event={"ID":"0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5","Type":"ContainerDied","Data":"36ccc0540b3df4dcab8592b658b9248e67b4c4bacb488755953ca340e78650c1"} Jan 21 16:35:59 crc kubenswrapper[4707]: I0121 16:35:59.600925 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ccc0540b3df4dcab8592b658b9248e67b4c4bacb488755953ca340e78650c1" Jan 21 16:35:59 crc kubenswrapper[4707]: I0121 16:35:59.600769 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.176104 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k"] Jan 21 16:36:03 crc kubenswrapper[4707]: E0121 16:36:03.176495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="util" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.176510 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="util" Jan 21 16:36:03 crc kubenswrapper[4707]: E0121 16:36:03.176530 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="extract" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.176536 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="extract" Jan 21 16:36:03 crc kubenswrapper[4707]: E0121 16:36:03.176546 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="pull" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.176552 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="pull" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.176651 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" containerName="extract" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.177061 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.179352 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.179489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.180186 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lvfk2" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.188956 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k"] Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.351652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-webhook-cert\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.351834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9wkg\" (UniqueName: \"kubernetes.io/projected/43080aee-92cf-4789-858a-d48c03b0d1b8-kube-api-access-g9wkg\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.351856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-apiservice-cert\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.453357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-webhook-cert\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.453514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wkg\" (UniqueName: \"kubernetes.io/projected/43080aee-92cf-4789-858a-d48c03b0d1b8-kube-api-access-g9wkg\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.453536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-apiservice-cert\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " 
pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.457879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-webhook-cert\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.458712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-apiservice-cert\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.466044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9wkg\" (UniqueName: \"kubernetes.io/projected/43080aee-92cf-4789-858a-d48c03b0d1b8-kube-api-access-g9wkg\") pod \"mariadb-operator-controller-manager-f6b55f788-jjl6k\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.490996 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:03 crc kubenswrapper[4707]: I0121 16:36:03.660571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k"] Jan 21 16:36:04 crc kubenswrapper[4707]: I0121 16:36:04.631732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" event={"ID":"43080aee-92cf-4789-858a-d48c03b0d1b8","Type":"ContainerStarted","Data":"4ed84533cb92bce74834c80d87dc00c3782890ec7501843d56db7be0e355415c"} Jan 21 16:36:06 crc kubenswrapper[4707]: I0121 16:36:06.646418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" event={"ID":"43080aee-92cf-4789-858a-d48c03b0d1b8","Type":"ContainerStarted","Data":"e72804f50faad2dafabf86fbc4813643bf4aee500884d015780ab629571f641b"} Jan 21 16:36:06 crc kubenswrapper[4707]: I0121 16:36:06.646777 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:06 crc kubenswrapper[4707]: I0121 16:36:06.659399 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" podStartSLOduration=1.385825386 podStartE2EDuration="3.659385053s" podCreationTimestamp="2026-01-21 16:36:03 +0000 UTC" firstStartedPulling="2026-01-21 16:36:03.672265902 +0000 UTC m=+5660.853782125" lastFinishedPulling="2026-01-21 16:36:05.94582557 +0000 UTC m=+5663.127341792" observedRunningTime="2026-01-21 16:36:06.657561725 +0000 UTC m=+5663.839077947" watchObservedRunningTime="2026-01-21 16:36:06.659385053 +0000 UTC m=+5663.840901275" Jan 21 16:36:13 crc kubenswrapper[4707]: I0121 16:36:13.494988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:36:19 crc kubenswrapper[4707]: I0121 16:36:19.826205 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-njrms"] Jan 21 16:36:19 crc kubenswrapper[4707]: I0121 16:36:19.827552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:19 crc kubenswrapper[4707]: I0121 16:36:19.828998 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-sdpkf" Jan 21 16:36:19 crc kubenswrapper[4707]: I0121 16:36:19.830939 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-njrms"] Jan 21 16:36:19 crc kubenswrapper[4707]: I0121 16:36:19.946626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2bzw\" (UniqueName: \"kubernetes.io/projected/6ccf8768-dba5-4264-a203-8acceb86a164-kube-api-access-k2bzw\") pod \"infra-operator-index-njrms\" (UID: \"6ccf8768-dba5-4264-a203-8acceb86a164\") " pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:20 crc kubenswrapper[4707]: I0121 16:36:20.047821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2bzw\" (UniqueName: \"kubernetes.io/projected/6ccf8768-dba5-4264-a203-8acceb86a164-kube-api-access-k2bzw\") pod \"infra-operator-index-njrms\" (UID: \"6ccf8768-dba5-4264-a203-8acceb86a164\") " pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:20 crc kubenswrapper[4707]: I0121 16:36:20.061758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2bzw\" (UniqueName: \"kubernetes.io/projected/6ccf8768-dba5-4264-a203-8acceb86a164-kube-api-access-k2bzw\") pod \"infra-operator-index-njrms\" (UID: \"6ccf8768-dba5-4264-a203-8acceb86a164\") " pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:20 crc kubenswrapper[4707]: I0121 16:36:20.141587 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:20 crc kubenswrapper[4707]: I0121 16:36:20.489249 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-njrms"] Jan 21 16:36:20 crc kubenswrapper[4707]: W0121 16:36:20.496713 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ccf8768_dba5_4264_a203_8acceb86a164.slice/crio-d04d469de6314fbfbc317d100c1480f90f79775ce36b15d5a353bdc5185d728c WatchSource:0}: Error finding container d04d469de6314fbfbc317d100c1480f90f79775ce36b15d5a353bdc5185d728c: Status 404 returned error can't find the container with id d04d469de6314fbfbc317d100c1480f90f79775ce36b15d5a353bdc5185d728c Jan 21 16:36:20 crc kubenswrapper[4707]: I0121 16:36:20.712797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-njrms" event={"ID":"6ccf8768-dba5-4264-a203-8acceb86a164","Type":"ContainerStarted","Data":"d04d469de6314fbfbc317d100c1480f90f79775ce36b15d5a353bdc5185d728c"} Jan 21 16:36:21 crc kubenswrapper[4707]: I0121 16:36:21.719395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-njrms" event={"ID":"6ccf8768-dba5-4264-a203-8acceb86a164","Type":"ContainerStarted","Data":"dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310"} Jan 21 16:36:21 crc kubenswrapper[4707]: I0121 16:36:21.728626 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-njrms" podStartSLOduration=1.605755978 podStartE2EDuration="2.728613943s" podCreationTimestamp="2026-01-21 16:36:19 +0000 UTC" firstStartedPulling="2026-01-21 16:36:20.498660465 +0000 UTC m=+5677.680176688" lastFinishedPulling="2026-01-21 16:36:21.621518431 +0000 UTC m=+5678.803034653" observedRunningTime="2026-01-21 16:36:21.727792328 +0000 UTC m=+5678.909308550" watchObservedRunningTime="2026-01-21 16:36:21.728613943 +0000 UTC m=+5678.910130166" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.029929 4707 scope.go:117] "RemoveContainer" containerID="2d3eece2daf8276cc51a7a2e5333c7a810e1b02ce80270ede6e3d7eb4df97278" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.051375 4707 scope.go:117] "RemoveContainer" containerID="1c7aa8550b66f800ec0c5408ea6423eb502458defbedee473b67569cb31e318c" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.066288 4707 scope.go:117] "RemoveContainer" containerID="39f6e9e171f3bdfce24b0a8ba77ecd4791b7495ab73187522a7e6d9616803107" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.086629 4707 scope.go:117] "RemoveContainer" containerID="0c56bac6f0ac9527fe972ec248729a18c00980c00336de32533aef2bc0c95fff" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.103221 4707 scope.go:117] "RemoveContainer" containerID="f595525796e4443bc05aeb397fa805802f27fe9cd2a5a57c36f69cf79a9d3faf" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.118076 4707 scope.go:117] "RemoveContainer" containerID="66c4a8f3ac89b247e1cc56eae55fc9902a78f6cc0c3a8012683a254d0687b317" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.135293 4707 scope.go:117] "RemoveContainer" containerID="2ecad3ca021c8e573237b5d01b65e1f0caa33b4dadd162ad98490750d4847a8d" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 16:36:25.153716 4707 scope.go:117] "RemoveContainer" containerID="6e04ef91cff8fa74c1b0d3d24b46f51e19374e2e82d06a6614a0501a8a650971" Jan 21 16:36:25 crc kubenswrapper[4707]: I0121 
16:36:25.167014 4707 scope.go:117] "RemoveContainer" containerID="36a71d3bd3f71bbbdc098c54556f0535a81385b8f405465e135f291219fca634" Jan 21 16:36:30 crc kubenswrapper[4707]: I0121 16:36:30.141976 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:30 crc kubenswrapper[4707]: I0121 16:36:30.142379 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:30 crc kubenswrapper[4707]: I0121 16:36:30.160533 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:30 crc kubenswrapper[4707]: I0121 16:36:30.780700 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.904571 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh"] Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.905795 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.907848 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.919943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh"] Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.980533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8q85\" (UniqueName: \"kubernetes.io/projected/27640829-d55e-4c63-9ff8-f644777688db-kube-api-access-g8q85\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.980634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:31 crc kubenswrapper[4707]: I0121 16:36:31.980719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.081433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " 
pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.081497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.081550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8q85\" (UniqueName: \"kubernetes.io/projected/27640829-d55e-4c63-9ff8-f644777688db-kube-api-access-g8q85\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.081907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.081993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.095753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8q85\" (UniqueName: \"kubernetes.io/projected/27640829-d55e-4c63-9ff8-f644777688db-kube-api-access-g8q85\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.220202 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.559930 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh"] Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.773218 4707 generic.go:334] "Generic (PLEG): container finished" podID="27640829-d55e-4c63-9ff8-f644777688db" containerID="e0204608d317b8236cee5d1834f54e7e25cb53a61e3c85fed9fa089fc05936f5" exitCode=0 Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.773259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" event={"ID":"27640829-d55e-4c63-9ff8-f644777688db","Type":"ContainerDied","Data":"e0204608d317b8236cee5d1834f54e7e25cb53a61e3c85fed9fa089fc05936f5"} Jan 21 16:36:32 crc kubenswrapper[4707]: I0121 16:36:32.773417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" event={"ID":"27640829-d55e-4c63-9ff8-f644777688db","Type":"ContainerStarted","Data":"897be90e5611f56ab25b80f67ca1d64a0302f0fed564e14361207f6465b724f1"} Jan 21 16:36:33 crc kubenswrapper[4707]: I0121 16:36:33.780347 4707 generic.go:334] "Generic (PLEG): container finished" podID="27640829-d55e-4c63-9ff8-f644777688db" containerID="f97c3da82958800cfb31c6fac73c534918cf1bd189f5487e8d9e01a260465af3" exitCode=0 Jan 21 16:36:33 crc kubenswrapper[4707]: I0121 16:36:33.780384 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" event={"ID":"27640829-d55e-4c63-9ff8-f644777688db","Type":"ContainerDied","Data":"f97c3da82958800cfb31c6fac73c534918cf1bd189f5487e8d9e01a260465af3"} Jan 21 16:36:34 crc kubenswrapper[4707]: I0121 16:36:34.788198 4707 generic.go:334] "Generic (PLEG): container finished" podID="27640829-d55e-4c63-9ff8-f644777688db" containerID="1a97e4cbcd911fe6ca249b740b3d863f4ef8a1c331f1ac3df6497004b0ecbc4b" exitCode=0 Jan 21 16:36:34 crc kubenswrapper[4707]: I0121 16:36:34.788329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" event={"ID":"27640829-d55e-4c63-9ff8-f644777688db","Type":"ContainerDied","Data":"1a97e4cbcd911fe6ca249b740b3d863f4ef8a1c331f1ac3df6497004b0ecbc4b"} Jan 21 16:36:35 crc kubenswrapper[4707]: I0121 16:36:35.969868 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.123328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-util\") pod \"27640829-d55e-4c63-9ff8-f644777688db\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.123361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-bundle\") pod \"27640829-d55e-4c63-9ff8-f644777688db\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.123423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8q85\" (UniqueName: \"kubernetes.io/projected/27640829-d55e-4c63-9ff8-f644777688db-kube-api-access-g8q85\") pod \"27640829-d55e-4c63-9ff8-f644777688db\" (UID: \"27640829-d55e-4c63-9ff8-f644777688db\") " Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.125310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-bundle" (OuterVolumeSpecName: "bundle") pod "27640829-d55e-4c63-9ff8-f644777688db" (UID: "27640829-d55e-4c63-9ff8-f644777688db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.127465 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27640829-d55e-4c63-9ff8-f644777688db-kube-api-access-g8q85" (OuterVolumeSpecName: "kube-api-access-g8q85") pod "27640829-d55e-4c63-9ff8-f644777688db" (UID: "27640829-d55e-4c63-9ff8-f644777688db"). InnerVolumeSpecName "kube-api-access-g8q85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.135157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-util" (OuterVolumeSpecName: "util") pod "27640829-d55e-4c63-9ff8-f644777688db" (UID: "27640829-d55e-4c63-9ff8-f644777688db"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.224604 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.224630 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27640829-d55e-4c63-9ff8-f644777688db-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.224641 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8q85\" (UniqueName: \"kubernetes.io/projected/27640829-d55e-4c63-9ff8-f644777688db-kube-api-access-g8q85\") on node \"crc\" DevicePath \"\"" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.799734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" event={"ID":"27640829-d55e-4c63-9ff8-f644777688db","Type":"ContainerDied","Data":"897be90e5611f56ab25b80f67ca1d64a0302f0fed564e14361207f6465b724f1"} Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.799768 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="897be90e5611f56ab25b80f67ca1d64a0302f0fed564e14361207f6465b724f1" Jan 21 16:36:36 crc kubenswrapper[4707]: I0121 16:36:36.799773 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.576205 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 16:36:38 crc kubenswrapper[4707]: E0121 16:36:38.576611 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="pull" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.576624 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="pull" Jan 21 16:36:38 crc kubenswrapper[4707]: E0121 16:36:38.576641 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="util" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.576646 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="util" Jan 21 16:36:38 crc kubenswrapper[4707]: E0121 16:36:38.576663 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="extract" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.576669 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="extract" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.576761 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="27640829-d55e-4c63-9ff8-f644777688db" containerName="extract" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.577346 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.578746 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openshift-service-ca.crt" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.579069 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-config-data" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.579286 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"kube-root-ca.crt" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.579362 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-scripts" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.579290 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"galera-openstack-dockercfg-wpxm5" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.586977 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.587760 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.592202 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.596829 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.597602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.601801 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.620079 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-default\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgxx\" (UniqueName: \"kubernetes.io/projected/e183c46d-7158-49d9-b9e7-d07475597a79-kube-api-access-fzgxx\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n2q4\" 
(UniqueName: \"kubernetes.io/projected/067aba60-80c2-4c68-887f-f3c6c40ed418-kube-api-access-6n2q4\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-kolla-config\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp7r\" (UniqueName: \"kubernetes.io/projected/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kube-api-access-lfp7r\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-kolla-config\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-default\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-operator-scripts\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.750994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.751039 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kolla-config\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.751072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-generated\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.751103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.751133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-generated\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.751159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-operator-scripts\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.751184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-default\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n2q4\" (UniqueName: \"kubernetes.io/projected/067aba60-80c2-4c68-887f-f3c6c40ed418-kube-api-access-6n2q4\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852177 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-kolla-config\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp7r\" (UniqueName: \"kubernetes.io/projected/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kube-api-access-lfp7r\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-kolla-config\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-default\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-operator-scripts\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kolla-config\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-generated\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852390 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") device mount path \"/mnt/openstack/pv03\"" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852477 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") device mount path \"/mnt/openstack/pv09\"" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.853044 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") device mount path \"/mnt/openstack/pv05\"" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.853389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-default\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.853542 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-kolla-config\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.853914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-kolla-config\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.852399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-generated\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-operator-scripts\") pod \"openstack-galera-2\" (UID: 
\"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kolla-config\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-default\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-generated\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-default\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgxx\" (UniqueName: \"kubernetes.io/projected/e183c46d-7158-49d9-b9e7-d07475597a79-kube-api-access-fzgxx\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854583 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-generated\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-default\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.854900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-default\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.855228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 
16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.855317 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-operator-scripts\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.855803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-operator-scripts\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.866127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.866404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp7r\" (UniqueName: \"kubernetes.io/projected/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kube-api-access-lfp7r\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.866699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.866786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n2q4\" (UniqueName: \"kubernetes.io/projected/067aba60-80c2-4c68-887f-f3c6c40ed418-kube-api-access-6n2q4\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.867446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgxx\" (UniqueName: \"kubernetes.io/projected/e183c46d-7158-49d9-b9e7-d07475597a79-kube-api-access-fzgxx\") pod \"openstack-galera-1\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.878867 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.889991 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.903050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:38 crc kubenswrapper[4707]: I0121 16:36:38.912787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.255331 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 16:36:39 crc kubenswrapper[4707]: W0121 16:36:39.259145 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ac0f4d4_a0c5_4a36_ae23_6c8669603551.slice/crio-8f9c366c6775679b99883037d9a04830a8b338d04110859ac3faf5e3c69626dc WatchSource:0}: Error finding container 8f9c366c6775679b99883037d9a04830a8b338d04110859ac3faf5e3c69626dc: Status 404 returned error can't find the container with id 8f9c366c6775679b99883037d9a04830a8b338d04110859ac3faf5e3c69626dc Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.303356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.307246 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 16:36:39 crc kubenswrapper[4707]: W0121 16:36:39.309306 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod067aba60_80c2_4c68_887f_f3c6c40ed418.slice/crio-570a772df1232497cfebb471a3f89517f74cbda652cb75d340cd99580368e0ef WatchSource:0}: Error finding container 570a772df1232497cfebb471a3f89517f74cbda652cb75d340cd99580368e0ef: Status 404 returned error can't find the container with id 570a772df1232497cfebb471a3f89517f74cbda652cb75d340cd99580368e0ef Jan 21 16:36:39 crc kubenswrapper[4707]: W0121 16:36:39.309684 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode183c46d_7158_49d9_b9e7_d07475597a79.slice/crio-7e1a9df906063497754f3e1ed1b44e47fd89609ef615f49307c4dcaa38b8531a WatchSource:0}: Error finding container 7e1a9df906063497754f3e1ed1b44e47fd89609ef615f49307c4dcaa38b8531a: Status 404 returned error can't find the container with id 7e1a9df906063497754f3e1ed1b44e47fd89609ef615f49307c4dcaa38b8531a Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.815179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"067aba60-80c2-4c68-887f-f3c6c40ed418","Type":"ContainerStarted","Data":"72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636"} Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.815220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"067aba60-80c2-4c68-887f-f3c6c40ed418","Type":"ContainerStarted","Data":"570a772df1232497cfebb471a3f89517f74cbda652cb75d340cd99580368e0ef"} Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.817312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"1ac0f4d4-a0c5-4a36-ae23-6c8669603551","Type":"ContainerStarted","Data":"91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15"} Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.817349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"1ac0f4d4-a0c5-4a36-ae23-6c8669603551","Type":"ContainerStarted","Data":"8f9c366c6775679b99883037d9a04830a8b338d04110859ac3faf5e3c69626dc"} Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.818619 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"e183c46d-7158-49d9-b9e7-d07475597a79","Type":"ContainerStarted","Data":"4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6"} Jan 21 16:36:39 crc kubenswrapper[4707]: I0121 16:36:39.818646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"e183c46d-7158-49d9-b9e7-d07475597a79","Type":"ContainerStarted","Data":"7e1a9df906063497754f3e1ed1b44e47fd89609ef615f49307c4dcaa38b8531a"} Jan 21 16:36:42 crc kubenswrapper[4707]: I0121 16:36:42.835862 4707 generic.go:334] "Generic (PLEG): container finished" podID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerID="72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636" exitCode=0 Jan 21 16:36:42 crc kubenswrapper[4707]: I0121 16:36:42.835946 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"067aba60-80c2-4c68-887f-f3c6c40ed418","Type":"ContainerDied","Data":"72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636"} Jan 21 16:36:42 crc kubenswrapper[4707]: I0121 16:36:42.838127 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerID="91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15" exitCode=0 Jan 21 16:36:42 crc kubenswrapper[4707]: I0121 16:36:42.838182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"1ac0f4d4-a0c5-4a36-ae23-6c8669603551","Type":"ContainerDied","Data":"91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15"} Jan 21 16:36:42 crc kubenswrapper[4707]: I0121 16:36:42.839348 4707 generic.go:334] "Generic (PLEG): container finished" podID="e183c46d-7158-49d9-b9e7-d07475597a79" containerID="4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6" exitCode=0 Jan 21 16:36:42 crc kubenswrapper[4707]: I0121 16:36:42.839374 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"e183c46d-7158-49d9-b9e7-d07475597a79","Type":"ContainerDied","Data":"4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6"} Jan 21 16:36:43 crc kubenswrapper[4707]: I0121 16:36:43.845758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"067aba60-80c2-4c68-887f-f3c6c40ed418","Type":"ContainerStarted","Data":"11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20"} Jan 21 16:36:43 crc kubenswrapper[4707]: I0121 16:36:43.847650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"1ac0f4d4-a0c5-4a36-ae23-6c8669603551","Type":"ContainerStarted","Data":"1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6"} Jan 21 16:36:43 crc kubenswrapper[4707]: I0121 16:36:43.849015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"e183c46d-7158-49d9-b9e7-d07475597a79","Type":"ContainerStarted","Data":"7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7"} Jan 21 16:36:43 crc kubenswrapper[4707]: I0121 16:36:43.860542 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-2" podStartSLOduration=6.860531023 podStartE2EDuration="6.860531023s" podCreationTimestamp="2026-01-21 16:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:36:43.857434811 +0000 UTC m=+5701.038951033" watchObservedRunningTime="2026-01-21 16:36:43.860531023 +0000 UTC m=+5701.042047244" Jan 21 16:36:43 crc kubenswrapper[4707]: I0121 16:36:43.874179 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-0" podStartSLOduration=6.874164989 podStartE2EDuration="6.874164989s" podCreationTimestamp="2026-01-21 16:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:36:43.87170336 +0000 UTC m=+5701.053219582" watchObservedRunningTime="2026-01-21 16:36:43.874164989 +0000 UTC m=+5701.055681211" Jan 21 16:36:43 crc kubenswrapper[4707]: I0121 16:36:43.882902 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-1" podStartSLOduration=6.882891967 podStartE2EDuration="6.882891967s" podCreationTimestamp="2026-01-21 16:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:36:43.882502405 +0000 UTC m=+5701.064018627" watchObservedRunningTime="2026-01-21 16:36:43.882891967 +0000 UTC m=+5701.064408190" Jan 21 16:36:47 crc kubenswrapper[4707]: I0121 16:36:47.928201 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv"] Jan 21 16:36:47 crc kubenswrapper[4707]: I0121 16:36:47.929265 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:47 crc kubenswrapper[4707]: I0121 16:36:47.930600 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zrtvf" Jan 21 16:36:47 crc kubenswrapper[4707]: I0121 16:36:47.931051 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 21 16:36:47 crc kubenswrapper[4707]: I0121 16:36:47.947077 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv"] Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.067257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-webhook-cert\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.067306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwqmt\" (UniqueName: \"kubernetes.io/projected/032ee273-2c71-4187-aeef-048fd267a1b1-kube-api-access-rwqmt\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.067419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-apiservice-cert\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.168106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-apiservice-cert\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.168160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-webhook-cert\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.168192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwqmt\" (UniqueName: \"kubernetes.io/projected/032ee273-2c71-4187-aeef-048fd267a1b1-kube-api-access-rwqmt\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.172666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-apiservice-cert\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.175194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-webhook-cert\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.181635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwqmt\" (UniqueName: \"kubernetes.io/projected/032ee273-2c71-4187-aeef-048fd267a1b1-kube-api-access-rwqmt\") pod \"infra-operator-controller-manager-558567f65c-8l8bv\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.244101 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.592539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv"] Jan 21 16:36:48 crc kubenswrapper[4707]: W0121 16:36:48.596058 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod032ee273_2c71_4187_aeef_048fd267a1b1.slice/crio-7ce21def523c4d84cb2d12ad084bb477d83dd9e8a5f2e6ced17dc42ee201f9fe WatchSource:0}: Error finding container 7ce21def523c4d84cb2d12ad084bb477d83dd9e8a5f2e6ced17dc42ee201f9fe: Status 404 returned error can't find the container with id 7ce21def523c4d84cb2d12ad084bb477d83dd9e8a5f2e6ced17dc42ee201f9fe Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.873028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" event={"ID":"032ee273-2c71-4187-aeef-048fd267a1b1","Type":"ContainerStarted","Data":"7ce21def523c4d84cb2d12ad084bb477d83dd9e8a5f2e6ced17dc42ee201f9fe"} Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.890595 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.890633 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.904165 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.904209 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.913779 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:48 crc kubenswrapper[4707]: I0121 16:36:48.913840 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:51 crc kubenswrapper[4707]: I0121 16:36:51.762432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:51 crc kubenswrapper[4707]: I0121 16:36:51.809425 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.654937 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-zqlvl"] Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.655974 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.657959 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.664038 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-zqlvl"] Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.787604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873a8c34-73d7-4b88-922f-411430f592ab-operator-scripts\") pod \"root-account-create-update-zqlvl\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.787655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmm7\" (UniqueName: \"kubernetes.io/projected/873a8c34-73d7-4b88-922f-411430f592ab-kube-api-access-2zmm7\") pod \"root-account-create-update-zqlvl\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.888843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873a8c34-73d7-4b88-922f-411430f592ab-operator-scripts\") pod \"root-account-create-update-zqlvl\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.889105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmm7\" (UniqueName: \"kubernetes.io/projected/873a8c34-73d7-4b88-922f-411430f592ab-kube-api-access-2zmm7\") pod \"root-account-create-update-zqlvl\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.889502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873a8c34-73d7-4b88-922f-411430f592ab-operator-scripts\") pod \"root-account-create-update-zqlvl\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.904736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmm7\" (UniqueName: \"kubernetes.io/projected/873a8c34-73d7-4b88-922f-411430f592ab-kube-api-access-2zmm7\") pod \"root-account-create-update-zqlvl\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:57 crc kubenswrapper[4707]: I0121 16:36:57.971161 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:36:58 crc kubenswrapper[4707]: I0121 16:36:58.066242 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:58 crc kubenswrapper[4707]: I0121 16:36:58.120111 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:36:58 crc kubenswrapper[4707]: I0121 16:36:58.316049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-zqlvl"] Jan 21 16:36:58 crc kubenswrapper[4707]: W0121 16:36:58.320981 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod873a8c34_73d7_4b88_922f_411430f592ab.slice/crio-47875acc3ec98db1498913ede4057d16f6ba4262d9c22c69ef4ba0f2f3bd3ec4 WatchSource:0}: Error finding container 47875acc3ec98db1498913ede4057d16f6ba4262d9c22c69ef4ba0f2f3bd3ec4: Status 404 returned error can't find the container with id 47875acc3ec98db1498913ede4057d16f6ba4262d9c22c69ef4ba0f2f3bd3ec4 Jan 21 16:36:58 crc kubenswrapper[4707]: I0121 16:36:58.936718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" event={"ID":"873a8c34-73d7-4b88-922f-411430f592ab","Type":"ContainerStarted","Data":"3313b35cf5a74caf887dfa1c5ee9cd9670d5a4ba06d7e2c5256c48263aeb8e08"} Jan 21 16:36:58 crc kubenswrapper[4707]: I0121 16:36:58.936945 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" event={"ID":"873a8c34-73d7-4b88-922f-411430f592ab","Type":"ContainerStarted","Data":"47875acc3ec98db1498913ede4057d16f6ba4262d9c22c69ef4ba0f2f3bd3ec4"} Jan 21 16:36:58 crc kubenswrapper[4707]: I0121 16:36:58.951055 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" podStartSLOduration=1.951039296 podStartE2EDuration="1.951039296s" podCreationTimestamp="2026-01-21 16:36:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:36:58.950835212 +0000 UTC m=+5716.132351434" watchObservedRunningTime="2026-01-21 16:36:58.951039296 +0000 UTC m=+5716.132555518" Jan 21 16:36:59 crc kubenswrapper[4707]: I0121 16:36:59.942882 4707 generic.go:334] "Generic (PLEG): container finished" podID="873a8c34-73d7-4b88-922f-411430f592ab" containerID="3313b35cf5a74caf887dfa1c5ee9cd9670d5a4ba06d7e2c5256c48263aeb8e08" exitCode=0 Jan 21 16:36:59 crc kubenswrapper[4707]: I0121 16:36:59.942986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" event={"ID":"873a8c34-73d7-4b88-922f-411430f592ab","Type":"ContainerDied","Data":"3313b35cf5a74caf887dfa1c5ee9cd9670d5a4ba06d7e2c5256c48263aeb8e08"} Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.148213 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.230115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873a8c34-73d7-4b88-922f-411430f592ab-operator-scripts\") pod \"873a8c34-73d7-4b88-922f-411430f592ab\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.230177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zmm7\" (UniqueName: \"kubernetes.io/projected/873a8c34-73d7-4b88-922f-411430f592ab-kube-api-access-2zmm7\") pod \"873a8c34-73d7-4b88-922f-411430f592ab\" (UID: \"873a8c34-73d7-4b88-922f-411430f592ab\") " Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.230536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873a8c34-73d7-4b88-922f-411430f592ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "873a8c34-73d7-4b88-922f-411430f592ab" (UID: "873a8c34-73d7-4b88-922f-411430f592ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.230633 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873a8c34-73d7-4b88-922f-411430f592ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.235279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873a8c34-73d7-4b88-922f-411430f592ab-kube-api-access-2zmm7" (OuterVolumeSpecName: "kube-api-access-2zmm7") pod "873a8c34-73d7-4b88-922f-411430f592ab" (UID: "873a8c34-73d7-4b88-922f-411430f592ab"). InnerVolumeSpecName "kube-api-access-2zmm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.332417 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zmm7\" (UniqueName: \"kubernetes.io/projected/873a8c34-73d7-4b88-922f-411430f592ab-kube-api-access-2zmm7\") on node \"crc\" DevicePath \"\"" Jan 21 16:37:01 crc kubenswrapper[4707]: E0121 16:37:01.365859 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3556142681/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:2eac1b9dadaddf4734f35e3dd1996dca960e97d2f304cbd48254b900a840a84a" Jan 21 16:37:01 crc kubenswrapper[4707]: E0121 16:37:01.366036 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:2eac1b9dadaddf4734f35e3dd1996dca960e97d2f304cbd48254b900a840a84a,Command:[/manager],Args:[--metrics-bind-address=:8443 --leader-elect --health-probe-bind-address=:8081 --webhook-cert-path=/tmp/k8s-webhook-server/serving-certs],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:e47191ba776414b781b3e27b856ab45a03b9480c7dc2b1addb939608794882dc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:7e7788d1aae251e60f4012870140c65bce9760cd27feaeec5f65c42fe4ffce77,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_DNSMASQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INSTANCE_HA_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:infra-operator.v0.0.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwqmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-558567f65c-8l8bv_openstack-operators(032ee273-2c71-4187-aeef-048fd267a1b1): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3556142681/1\": happened during read: context canceled" logger="UnhandledError" Jan 21 16:37:01 crc kubenswrapper[4707]: E0121 16:37:01.367304 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3556142681/1\\\": happened during read: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.953164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" event={"ID":"873a8c34-73d7-4b88-922f-411430f592ab","Type":"ContainerDied","Data":"47875acc3ec98db1498913ede4057d16f6ba4262d9c22c69ef4ba0f2f3bd3ec4"} Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.953191 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-zqlvl" Jan 21 16:37:01 crc kubenswrapper[4707]: I0121 16:37:01.953207 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47875acc3ec98db1498913ede4057d16f6ba4262d9c22c69ef4ba0f2f3bd3ec4" Jan 21 16:37:01 crc kubenswrapper[4707]: E0121 16:37:01.954366 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:2eac1b9dadaddf4734f35e3dd1996dca960e97d2f304cbd48254b900a840a84a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" Jan 21 16:37:04 crc kubenswrapper[4707]: I0121 16:37:04.813257 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:37:04 crc kubenswrapper[4707]: I0121 16:37:04.860976 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:37:09 crc kubenswrapper[4707]: I0121 16:37:09.945777 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:37:09 crc kubenswrapper[4707]: I0121 16:37:09.946038 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:37:25 crc kubenswrapper[4707]: I0121 16:37:25.265939 4707 scope.go:117] "RemoveContainer" containerID="b65eba9a3228bd9211adba96be18511ea19b232296af0788b40167d978efaf41" Jan 21 16:37:25 crc kubenswrapper[4707]: I0121 16:37:25.281146 4707 scope.go:117] "RemoveContainer" containerID="8a343fcbdfa8959ca67b548b6cf128598fed8b110fbff3c067e5d5334be963de" Jan 21 16:37:27 crc kubenswrapper[4707]: E0121 16:37:27.908274 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2533451137/1\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:2eac1b9dadaddf4734f35e3dd1996dca960e97d2f304cbd48254b900a840a84a" Jan 21 16:37:27 crc kubenswrapper[4707]: E0121 16:37:27.908431 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:2eac1b9dadaddf4734f35e3dd1996dca960e97d2f304cbd48254b900a840a84a,Command:[/manager],Args:[--metrics-bind-address=:8443 --leader-elect --health-probe-bind-address=:8081 
--webhook-cert-path=/tmp/k8s-webhook-server/serving-certs],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:e47191ba776414b781b3e27b856ab45a03b9480c7dc2b1addb939608794882dc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:7e7788d1aae251e60f4012870140c65bce9760cd27feaeec5f65c42fe4ffce77,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_DNSMASQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INSTANCE_HA_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:infra-operator.v0.0.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwqmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-558567f65c-8l8bv_openstack-operators(032ee273-2c71-4187-aeef-048fd267a1b1): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2533451137/1\": happened during read: context canceled" logger="UnhandledError" Jan 21 16:37:27 crc kubenswrapper[4707]: E0121 16:37:27.909585 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2533451137/1\\\": happened during read: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" Jan 21 16:37:39 crc kubenswrapper[4707]: I0121 16:37:39.945758 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:37:39 crc kubenswrapper[4707]: I0121 16:37:39.946046 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:37:41 crc kubenswrapper[4707]: E0121 16:37:41.184247 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:2eac1b9dadaddf4734f35e3dd1996dca960e97d2f304cbd48254b900a840a84a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" Jan 21 16:37:55 crc kubenswrapper[4707]: I0121 16:37:55.215677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" event={"ID":"032ee273-2c71-4187-aeef-048fd267a1b1","Type":"ContainerStarted","Data":"d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e"} Jan 21 16:37:55 crc kubenswrapper[4707]: I0121 16:37:55.216171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:37:55 crc kubenswrapper[4707]: I0121 16:37:55.228952 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" podStartSLOduration=2.492006236 podStartE2EDuration="1m8.228932827s" podCreationTimestamp="2026-01-21 16:36:47 +0000 UTC" firstStartedPulling="2026-01-21 16:36:48.597309211 +0000 UTC m=+5705.778825432" lastFinishedPulling="2026-01-21 16:37:54.334235801 +0000 UTC m=+5771.515752023" observedRunningTime="2026-01-21 16:37:55.227658531 +0000 UTC m=+5772.409174754" watchObservedRunningTime="2026-01-21 16:37:55.228932827 +0000 UTC m=+5772.410449050" Jan 21 16:38:08 crc kubenswrapper[4707]: I0121 16:38:08.248060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:38:09 crc kubenswrapper[4707]: I0121 16:38:09.946094 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:38:09 crc kubenswrapper[4707]: I0121 16:38:09.946148 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" 
podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:38:09 crc kubenswrapper[4707]: I0121 16:38:09.946189 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:38:09 crc kubenswrapper[4707]: I0121 16:38:09.946874 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70390b648c73daf4be255709128a6e64ca5ff234e05ed8fb129f19cfb2d9e39e"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:38:09 crc kubenswrapper[4707]: I0121 16:38:09.946925 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://70390b648c73daf4be255709128a6e64ca5ff234e05ed8fb129f19cfb2d9e39e" gracePeriod=600 Jan 21 16:38:10 crc kubenswrapper[4707]: I0121 16:38:10.287701 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="70390b648c73daf4be255709128a6e64ca5ff234e05ed8fb129f19cfb2d9e39e" exitCode=0 Jan 21 16:38:10 crc kubenswrapper[4707]: I0121 16:38:10.287777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"70390b648c73daf4be255709128a6e64ca5ff234e05ed8fb129f19cfb2d9e39e"} Jan 21 16:38:10 crc kubenswrapper[4707]: I0121 16:38:10.287958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79"} Jan 21 16:38:10 crc kubenswrapper[4707]: I0121 16:38:10.287980 4707 scope.go:117] "RemoveContainer" containerID="4151dbefd15e000a5aa0306c5dc909575fb18034275a04e75e6e3f4ee3d96340" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.148182 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-7xrws"] Jan 21 16:38:14 crc kubenswrapper[4707]: E0121 16:38:14.148719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873a8c34-73d7-4b88-922f-411430f592ab" containerName="mariadb-account-create-update" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.148740 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="873a8c34-73d7-4b88-922f-411430f592ab" containerName="mariadb-account-create-update" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.148894 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="873a8c34-73d7-4b88-922f-411430f592ab" containerName="mariadb-account-create-update" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.149266 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.152773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-hnpxd" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.156296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-7xrws"] Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.216346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgsn\" (UniqueName: \"kubernetes.io/projected/99f2ba6e-5966-46d6-9f1c-80798985a942-kube-api-access-hcgsn\") pod \"rabbitmq-cluster-operator-index-7xrws\" (UID: \"99f2ba6e-5966-46d6-9f1c-80798985a942\") " pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.317700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgsn\" (UniqueName: \"kubernetes.io/projected/99f2ba6e-5966-46d6-9f1c-80798985a942-kube-api-access-hcgsn\") pod \"rabbitmq-cluster-operator-index-7xrws\" (UID: \"99f2ba6e-5966-46d6-9f1c-80798985a942\") " pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.341598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcgsn\" (UniqueName: \"kubernetes.io/projected/99f2ba6e-5966-46d6-9f1c-80798985a942-kube-api-access-hcgsn\") pod \"rabbitmq-cluster-operator-index-7xrws\" (UID: \"99f2ba6e-5966-46d6-9f1c-80798985a942\") " pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.466478 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:14 crc kubenswrapper[4707]: I0121 16:38:14.909382 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-7xrws"] Jan 21 16:38:15 crc kubenswrapper[4707]: I0121 16:38:15.317287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" event={"ID":"99f2ba6e-5966-46d6-9f1c-80798985a942","Type":"ContainerStarted","Data":"6cda0df1b1fe3805ed7d7afe4a3dfba5dfc011e476c5d06f88d562a6e27290e3"} Jan 21 16:38:18 crc kubenswrapper[4707]: I0121 16:38:18.334702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" event={"ID":"99f2ba6e-5966-46d6-9f1c-80798985a942","Type":"ContainerStarted","Data":"e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da"} Jan 21 16:38:18 crc kubenswrapper[4707]: I0121 16:38:18.345786 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" podStartSLOduration=1.593441328 podStartE2EDuration="4.345772648s" podCreationTimestamp="2026-01-21 16:38:14 +0000 UTC" firstStartedPulling="2026-01-21 16:38:14.91819286 +0000 UTC m=+5792.099709082" lastFinishedPulling="2026-01-21 16:38:17.670524189 +0000 UTC m=+5794.852040402" observedRunningTime="2026-01-21 16:38:18.343475708 +0000 UTC m=+5795.524991931" watchObservedRunningTime="2026-01-21 16:38:18.345772648 +0000 UTC m=+5795.527288870" Jan 21 16:38:24 crc kubenswrapper[4707]: I0121 16:38:24.467314 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:24 crc kubenswrapper[4707]: I0121 16:38:24.467669 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:24 crc kubenswrapper[4707]: I0121 16:38:24.486163 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:25 crc kubenswrapper[4707]: I0121 16:38:25.325602 4707 scope.go:117] "RemoveContainer" containerID="abd2e54bf0b7eeaeda07b2cec31a04c4f4da512a51516cff83987af87d0a426e" Jan 21 16:38:25 crc kubenswrapper[4707]: I0121 16:38:25.340266 4707 scope.go:117] "RemoveContainer" containerID="0c73e5dfbc68aeabe4d0ccbba22d024abe241b6122a7cab7940c3f248348276d" Jan 21 16:38:25 crc kubenswrapper[4707]: I0121 16:38:25.388488 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:38:26 crc kubenswrapper[4707]: I0121 16:38:26.930972 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 16:38:26 crc kubenswrapper[4707]: I0121 16:38:26.931618 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:26 crc kubenswrapper[4707]: I0121 16:38:26.935008 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"memcached-config-data" Jan 21 16:38:26 crc kubenswrapper[4707]: I0121 16:38:26.941203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 16:38:26 crc kubenswrapper[4707]: I0121 16:38:26.942157 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"memcached-memcached-dockercfg-m5vv2" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.072191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-config-data\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.072247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-kolla-config\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.072524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgc2x\" (UniqueName: \"kubernetes.io/projected/495e3dab-2009-4192-901c-cc7908381182-kube-api-access-fgc2x\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.173754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgc2x\" (UniqueName: \"kubernetes.io/projected/495e3dab-2009-4192-901c-cc7908381182-kube-api-access-fgc2x\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.173989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-config-data\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.174019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-kolla-config\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.174680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-kolla-config\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.174787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-config-data\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " 
pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.188515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgc2x\" (UniqueName: \"kubernetes.io/projected/495e3dab-2009-4192-901c-cc7908381182-kube-api-access-fgc2x\") pod \"memcached-0\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.245016 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:27 crc kubenswrapper[4707]: I0121 16:38:27.618186 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 16:38:28 crc kubenswrapper[4707]: I0121 16:38:28.388505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"495e3dab-2009-4192-901c-cc7908381182","Type":"ContainerStarted","Data":"c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd"} Jan 21 16:38:28 crc kubenswrapper[4707]: I0121 16:38:28.389326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"495e3dab-2009-4192-901c-cc7908381182","Type":"ContainerStarted","Data":"d04a6e34135682c108aa5132823865d3181425dfce78ca41a1dc1da0170fa611"} Jan 21 16:38:28 crc kubenswrapper[4707]: I0121 16:38:28.389405 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:28 crc kubenswrapper[4707]: I0121 16:38:28.403888 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/memcached-0" podStartSLOduration=2.40387721 podStartE2EDuration="2.40387721s" podCreationTimestamp="2026-01-21 16:38:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:38:28.398385274 +0000 UTC m=+5805.579901496" watchObservedRunningTime="2026-01-21 16:38:28.40387721 +0000 UTC m=+5805.585393433" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.246377 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/memcached-0" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.559008 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw"] Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.560092 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.562001 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.564485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw"] Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.639868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.639904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.639934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjkd\" (UniqueName: \"kubernetes.io/projected/6d203ad7-2b7a-4bde-8757-4299541c2b5a-kube-api-access-5qjkd\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.740755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.741005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.741042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qjkd\" (UniqueName: \"kubernetes.io/projected/6d203ad7-2b7a-4bde-8757-4299541c2b5a-kube-api-access-5qjkd\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.741286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.741346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.755928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjkd\" (UniqueName: \"kubernetes.io/projected/6d203ad7-2b7a-4bde-8757-4299541c2b5a-kube-api-access-5qjkd\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:32 crc kubenswrapper[4707]: I0121 16:38:32.874599 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:33 crc kubenswrapper[4707]: I0121 16:38:33.228008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw"] Jan 21 16:38:33 crc kubenswrapper[4707]: W0121 16:38:33.230602 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d203ad7_2b7a_4bde_8757_4299541c2b5a.slice/crio-d61419460f3441c161f0ca38cc6d7dd8bf522deeeae968784eda029466c5c0a9 WatchSource:0}: Error finding container d61419460f3441c161f0ca38cc6d7dd8bf522deeeae968784eda029466c5c0a9: Status 404 returned error can't find the container with id d61419460f3441c161f0ca38cc6d7dd8bf522deeeae968784eda029466c5c0a9 Jan 21 16:38:33 crc kubenswrapper[4707]: I0121 16:38:33.414638 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerID="4387c1b4880d34e8f97f3a8c5871ca68b1bbdfb56c429d36cfb2c6b67a562251" exitCode=0 Jan 21 16:38:33 crc kubenswrapper[4707]: I0121 16:38:33.414679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" event={"ID":"6d203ad7-2b7a-4bde-8757-4299541c2b5a","Type":"ContainerDied","Data":"4387c1b4880d34e8f97f3a8c5871ca68b1bbdfb56c429d36cfb2c6b67a562251"} Jan 21 16:38:33 crc kubenswrapper[4707]: I0121 16:38:33.414704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" event={"ID":"6d203ad7-2b7a-4bde-8757-4299541c2b5a","Type":"ContainerStarted","Data":"d61419460f3441c161f0ca38cc6d7dd8bf522deeeae968784eda029466c5c0a9"} Jan 21 16:38:34 crc kubenswrapper[4707]: I0121 16:38:34.424007 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerID="be52f61faaed1d80a7e1d6d58bebb247b710b7d9e40a4f190ed3ab8611f6518a" exitCode=0 Jan 21 16:38:34 crc kubenswrapper[4707]: I0121 16:38:34.424198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" event={"ID":"6d203ad7-2b7a-4bde-8757-4299541c2b5a","Type":"ContainerDied","Data":"be52f61faaed1d80a7e1d6d58bebb247b710b7d9e40a4f190ed3ab8611f6518a"} Jan 21 16:38:35 crc kubenswrapper[4707]: I0121 16:38:35.431566 4707 generic.go:334] "Generic (PLEG): container finished" podID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerID="dc05be075054f1399bdf7ce2e966ea564638f97fdfb68bceb91cfec21aee7732" exitCode=0 Jan 21 16:38:35 crc kubenswrapper[4707]: I0121 16:38:35.431595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" event={"ID":"6d203ad7-2b7a-4bde-8757-4299541c2b5a","Type":"ContainerDied","Data":"dc05be075054f1399bdf7ce2e966ea564638f97fdfb68bceb91cfec21aee7732"} Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.657341 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.791151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qjkd\" (UniqueName: \"kubernetes.io/projected/6d203ad7-2b7a-4bde-8757-4299541c2b5a-kube-api-access-5qjkd\") pod \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.791208 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-util\") pod \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.791325 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-bundle\") pod \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\" (UID: \"6d203ad7-2b7a-4bde-8757-4299541c2b5a\") " Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.791971 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-bundle" (OuterVolumeSpecName: "bundle") pod "6d203ad7-2b7a-4bde-8757-4299541c2b5a" (UID: "6d203ad7-2b7a-4bde-8757-4299541c2b5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.795981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d203ad7-2b7a-4bde-8757-4299541c2b5a-kube-api-access-5qjkd" (OuterVolumeSpecName: "kube-api-access-5qjkd") pod "6d203ad7-2b7a-4bde-8757-4299541c2b5a" (UID: "6d203ad7-2b7a-4bde-8757-4299541c2b5a"). InnerVolumeSpecName "kube-api-access-5qjkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.801489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-util" (OuterVolumeSpecName: "util") pod "6d203ad7-2b7a-4bde-8757-4299541c2b5a" (UID: "6d203ad7-2b7a-4bde-8757-4299541c2b5a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.892990 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qjkd\" (UniqueName: \"kubernetes.io/projected/6d203ad7-2b7a-4bde-8757-4299541c2b5a-kube-api-access-5qjkd\") on node \"crc\" DevicePath \"\"" Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.893020 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:38:36 crc kubenswrapper[4707]: I0121 16:38:36.893031 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d203ad7-2b7a-4bde-8757-4299541c2b5a-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:38:37 crc kubenswrapper[4707]: I0121 16:38:37.443254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" event={"ID":"6d203ad7-2b7a-4bde-8757-4299541c2b5a","Type":"ContainerDied","Data":"d61419460f3441c161f0ca38cc6d7dd8bf522deeeae968784eda029466c5c0a9"} Jan 21 16:38:37 crc kubenswrapper[4707]: I0121 16:38:37.443288 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61419460f3441c161f0ca38cc6d7dd8bf522deeeae968784eda029466c5c0a9" Jan 21 16:38:37 crc kubenswrapper[4707]: I0121 16:38:37.443289 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.274277 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh"] Jan 21 16:38:54 crc kubenswrapper[4707]: E0121 16:38:54.274822 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="util" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.274843 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="util" Jan 21 16:38:54 crc kubenswrapper[4707]: E0121 16:38:54.274855 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="pull" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.274860 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="pull" Jan 21 16:38:54 crc kubenswrapper[4707]: E0121 16:38:54.274875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="extract" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.274880 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="extract" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.275010 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" containerName="extract" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.275399 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.277628 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-rpn2v" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.280971 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh"] Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.402781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr9jd\" (UniqueName: \"kubernetes.io/projected/2bc097f0-4f25-4e0d-949e-ee3a654e8f52-kube-api-access-kr9jd\") pod \"rabbitmq-cluster-operator-779fc9694b-s4rxh\" (UID: \"2bc097f0-4f25-4e0d-949e-ee3a654e8f52\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.504173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr9jd\" (UniqueName: \"kubernetes.io/projected/2bc097f0-4f25-4e0d-949e-ee3a654e8f52-kube-api-access-kr9jd\") pod \"rabbitmq-cluster-operator-779fc9694b-s4rxh\" (UID: \"2bc097f0-4f25-4e0d-949e-ee3a654e8f52\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.518063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr9jd\" (UniqueName: \"kubernetes.io/projected/2bc097f0-4f25-4e0d-949e-ee3a654e8f52-kube-api-access-kr9jd\") pod \"rabbitmq-cluster-operator-779fc9694b-s4rxh\" (UID: \"2bc097f0-4f25-4e0d-949e-ee3a654e8f52\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.588492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:38:54 crc kubenswrapper[4707]: I0121 16:38:54.932016 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh"] Jan 21 16:38:55 crc kubenswrapper[4707]: I0121 16:38:55.535371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" event={"ID":"2bc097f0-4f25-4e0d-949e-ee3a654e8f52","Type":"ContainerStarted","Data":"1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a"} Jan 21 16:38:55 crc kubenswrapper[4707]: I0121 16:38:55.536059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" event={"ID":"2bc097f0-4f25-4e0d-949e-ee3a654e8f52","Type":"ContainerStarted","Data":"4d198b3965d42ce464c24528ad775d9b563af205e2c7b463658a18244686f568"} Jan 21 16:39:02 crc kubenswrapper[4707]: I0121 16:39:02.934528 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" podStartSLOduration=8.934497941 podStartE2EDuration="8.934497941s" podCreationTimestamp="2026-01-21 16:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:38:55.544982645 +0000 UTC m=+5832.726498867" watchObservedRunningTime="2026-01-21 16:39:02.934497941 +0000 UTC m=+5840.116014164" Jan 21 16:39:02 crc kubenswrapper[4707]: I0121 16:39:02.938660 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-4m5fg"] Jan 21 16:39:02 crc kubenswrapper[4707]: I0121 16:39:02.939379 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:02 crc kubenswrapper[4707]: I0121 16:39:02.940567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-8sdqg" Jan 21 16:39:02 crc kubenswrapper[4707]: I0121 16:39:02.944941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-4m5fg"] Jan 21 16:39:03 crc kubenswrapper[4707]: I0121 16:39:03.108407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6xd\" (UniqueName: \"kubernetes.io/projected/ad52b02d-a4eb-4f15-934f-9b7fd200f327-kube-api-access-zz6xd\") pod \"keystone-operator-index-4m5fg\" (UID: \"ad52b02d-a4eb-4f15-934f-9b7fd200f327\") " pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:03 crc kubenswrapper[4707]: I0121 16:39:03.209493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6xd\" (UniqueName: \"kubernetes.io/projected/ad52b02d-a4eb-4f15-934f-9b7fd200f327-kube-api-access-zz6xd\") pod \"keystone-operator-index-4m5fg\" (UID: \"ad52b02d-a4eb-4f15-934f-9b7fd200f327\") " pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:03 crc kubenswrapper[4707]: I0121 16:39:03.223913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6xd\" (UniqueName: \"kubernetes.io/projected/ad52b02d-a4eb-4f15-934f-9b7fd200f327-kube-api-access-zz6xd\") pod \"keystone-operator-index-4m5fg\" (UID: \"ad52b02d-a4eb-4f15-934f-9b7fd200f327\") " pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:03 crc kubenswrapper[4707]: I0121 16:39:03.255799 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:03 crc kubenswrapper[4707]: I0121 16:39:03.602509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-4m5fg"] Jan 21 16:39:04 crc kubenswrapper[4707]: I0121 16:39:04.580121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4m5fg" event={"ID":"ad52b02d-a4eb-4f15-934f-9b7fd200f327","Type":"ContainerStarted","Data":"ff0a06a1b05ea655fc61a5a19f4456b8fa4dbad2ecdb3910a69a64a6f5defb40"} Jan 21 16:39:05 crc kubenswrapper[4707]: I0121 16:39:05.586129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4m5fg" event={"ID":"ad52b02d-a4eb-4f15-934f-9b7fd200f327","Type":"ContainerStarted","Data":"f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f"} Jan 21 16:39:05 crc kubenswrapper[4707]: I0121 16:39:05.597526 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-4m5fg" podStartSLOduration=2.700962218 podStartE2EDuration="3.597514993s" podCreationTimestamp="2026-01-21 16:39:02 +0000 UTC" firstStartedPulling="2026-01-21 16:39:03.608403744 +0000 UTC m=+5840.789919956" lastFinishedPulling="2026-01-21 16:39:04.504956509 +0000 UTC m=+5841.686472731" observedRunningTime="2026-01-21 16:39:05.594401939 +0000 UTC m=+5842.775918162" watchObservedRunningTime="2026-01-21 16:39:05.597514993 +0000 UTC m=+5842.779031215" Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.335039 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-4m5fg"] Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.595073 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-4m5fg" podUID="ad52b02d-a4eb-4f15-934f-9b7fd200f327" containerName="registry-server" containerID="cri-o://f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f" gracePeriod=2 Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.910798 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.936482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-6d62q"] Jan 21 16:39:07 crc kubenswrapper[4707]: E0121 16:39:07.936726 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad52b02d-a4eb-4f15-934f-9b7fd200f327" containerName="registry-server" Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.936738 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad52b02d-a4eb-4f15-934f-9b7fd200f327" containerName="registry-server" Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.936865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad52b02d-a4eb-4f15-934f-9b7fd200f327" containerName="registry-server" Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.937273 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:07 crc kubenswrapper[4707]: I0121 16:39:07.943951 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6d62q"] Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.070294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6xd\" (UniqueName: \"kubernetes.io/projected/ad52b02d-a4eb-4f15-934f-9b7fd200f327-kube-api-access-zz6xd\") pod \"ad52b02d-a4eb-4f15-934f-9b7fd200f327\" (UID: \"ad52b02d-a4eb-4f15-934f-9b7fd200f327\") " Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.070569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv4z9\" (UniqueName: \"kubernetes.io/projected/6da1d95d-5c48-46ef-925b-124a6b8af96a-kube-api-access-cv4z9\") pod \"keystone-operator-index-6d62q\" (UID: \"6da1d95d-5c48-46ef-925b-124a6b8af96a\") " pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.074711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad52b02d-a4eb-4f15-934f-9b7fd200f327-kube-api-access-zz6xd" (OuterVolumeSpecName: "kube-api-access-zz6xd") pod "ad52b02d-a4eb-4f15-934f-9b7fd200f327" (UID: "ad52b02d-a4eb-4f15-934f-9b7fd200f327"). InnerVolumeSpecName "kube-api-access-zz6xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.172325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv4z9\" (UniqueName: \"kubernetes.io/projected/6da1d95d-5c48-46ef-925b-124a6b8af96a-kube-api-access-cv4z9\") pod \"keystone-operator-index-6d62q\" (UID: \"6da1d95d-5c48-46ef-925b-124a6b8af96a\") " pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.172645 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6xd\" (UniqueName: \"kubernetes.io/projected/ad52b02d-a4eb-4f15-934f-9b7fd200f327-kube-api-access-zz6xd\") on node \"crc\" DevicePath \"\"" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.185341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv4z9\" (UniqueName: \"kubernetes.io/projected/6da1d95d-5c48-46ef-925b-124a6b8af96a-kube-api-access-cv4z9\") pod \"keystone-operator-index-6d62q\" (UID: \"6da1d95d-5c48-46ef-925b-124a6b8af96a\") " pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.251054 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:08 crc kubenswrapper[4707]: W0121 16:39:08.600286 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da1d95d_5c48_46ef_925b_124a6b8af96a.slice/crio-e8b0e769eba33a4a2441fd81ac08963fdb33d2622315cf973736ad881a757e49 WatchSource:0}: Error finding container e8b0e769eba33a4a2441fd81ac08963fdb33d2622315cf973736ad881a757e49: Status 404 returned error can't find the container with id e8b0e769eba33a4a2441fd81ac08963fdb33d2622315cf973736ad881a757e49 Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.602543 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6d62q"] Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.607196 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad52b02d-a4eb-4f15-934f-9b7fd200f327" containerID="f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f" exitCode=0 Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.607235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4m5fg" event={"ID":"ad52b02d-a4eb-4f15-934f-9b7fd200f327","Type":"ContainerDied","Data":"f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f"} Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.607260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4m5fg" event={"ID":"ad52b02d-a4eb-4f15-934f-9b7fd200f327","Type":"ContainerDied","Data":"ff0a06a1b05ea655fc61a5a19f4456b8fa4dbad2ecdb3910a69a64a6f5defb40"} Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.607275 4707 scope.go:117] "RemoveContainer" containerID="f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.607361 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4m5fg" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.623485 4707 scope.go:117] "RemoveContainer" containerID="f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f" Jan 21 16:39:08 crc kubenswrapper[4707]: E0121 16:39:08.623780 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f\": container with ID starting with f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f not found: ID does not exist" containerID="f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.623848 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f"} err="failed to get container status \"f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f\": rpc error: code = NotFound desc = could not find container \"f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f\": container with ID starting with f74e1662d8a929288341fb14313e1073cf0d3e2d410814ebef143f5a3fa8068f not found: ID does not exist" Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.629838 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-4m5fg"] Jan 21 16:39:08 crc kubenswrapper[4707]: I0121 16:39:08.633248 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-4m5fg"] Jan 21 16:39:09 crc kubenswrapper[4707]: I0121 16:39:09.188382 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad52b02d-a4eb-4f15-934f-9b7fd200f327" path="/var/lib/kubelet/pods/ad52b02d-a4eb-4f15-934f-9b7fd200f327/volumes" Jan 21 16:39:09 crc kubenswrapper[4707]: I0121 16:39:09.613385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6d62q" event={"ID":"6da1d95d-5c48-46ef-925b-124a6b8af96a","Type":"ContainerStarted","Data":"a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c"} Jan 21 16:39:09 crc kubenswrapper[4707]: I0121 16:39:09.613424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6d62q" event={"ID":"6da1d95d-5c48-46ef-925b-124a6b8af96a","Type":"ContainerStarted","Data":"e8b0e769eba33a4a2441fd81ac08963fdb33d2622315cf973736ad881a757e49"} Jan 21 16:39:09 crc kubenswrapper[4707]: I0121 16:39:09.625838 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-6d62q" podStartSLOduration=2.14677294 podStartE2EDuration="2.625799051s" podCreationTimestamp="2026-01-21 16:39:07 +0000 UTC" firstStartedPulling="2026-01-21 16:39:08.604274863 +0000 UTC m=+5845.785791086" lastFinishedPulling="2026-01-21 16:39:09.083300975 +0000 UTC m=+5846.264817197" observedRunningTime="2026-01-21 16:39:09.623721876 +0000 UTC m=+5846.805238097" watchObservedRunningTime="2026-01-21 16:39:09.625799051 +0000 UTC m=+5846.807315273" Jan 21 16:39:18 crc kubenswrapper[4707]: I0121 16:39:18.251536 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:18 crc kubenswrapper[4707]: I0121 16:39:18.251985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:18 crc kubenswrapper[4707]: I0121 16:39:18.317882 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:18 crc kubenswrapper[4707]: I0121 16:39:18.676862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.183284 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm"] Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.184719 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.186325 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.190898 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm"] Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.322451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpsr6\" (UniqueName: \"kubernetes.io/projected/408cfa00-50fd-4d12-845c-4e467fb79876-kube-api-access-kpsr6\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.322544 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.322570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.424214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpsr6\" (UniqueName: \"kubernetes.io/projected/408cfa00-50fd-4d12-845c-4e467fb79876-kube-api-access-kpsr6\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.424270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " 
pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.424291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.424670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.424801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.439119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpsr6\" (UniqueName: \"kubernetes.io/projected/408cfa00-50fd-4d12-845c-4e467fb79876-kube-api-access-kpsr6\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.497987 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:20 crc kubenswrapper[4707]: I0121 16:39:20.842896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm"] Jan 21 16:39:21 crc kubenswrapper[4707]: I0121 16:39:21.674153 4707 generic.go:334] "Generic (PLEG): container finished" podID="408cfa00-50fd-4d12-845c-4e467fb79876" containerID="babc4cfb1ff1f6edc7090b25028b954b2b20c9b39ad3272a8498718d3a261c2a" exitCode=0 Jan 21 16:39:21 crc kubenswrapper[4707]: I0121 16:39:21.674187 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" event={"ID":"408cfa00-50fd-4d12-845c-4e467fb79876","Type":"ContainerDied","Data":"babc4cfb1ff1f6edc7090b25028b954b2b20c9b39ad3272a8498718d3a261c2a"} Jan 21 16:39:21 crc kubenswrapper[4707]: I0121 16:39:21.674207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" event={"ID":"408cfa00-50fd-4d12-845c-4e467fb79876","Type":"ContainerStarted","Data":"bc48e2d9e678ada9584c59b56a75812dc44b9d43fb8f9bcdb15c794f5d08dec2"} Jan 21 16:39:22 crc kubenswrapper[4707]: I0121 16:39:22.681365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" event={"ID":"408cfa00-50fd-4d12-845c-4e467fb79876","Type":"ContainerStarted","Data":"a80cda2bbd02140cdb752637eff5edb859ab73879b0d365c3e8b5a3ac49ed431"} Jan 21 16:39:23 crc kubenswrapper[4707]: I0121 16:39:23.688907 4707 generic.go:334] "Generic (PLEG): container finished" podID="408cfa00-50fd-4d12-845c-4e467fb79876" containerID="a80cda2bbd02140cdb752637eff5edb859ab73879b0d365c3e8b5a3ac49ed431" exitCode=0 Jan 21 16:39:23 crc kubenswrapper[4707]: I0121 16:39:23.689047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" event={"ID":"408cfa00-50fd-4d12-845c-4e467fb79876","Type":"ContainerDied","Data":"a80cda2bbd02140cdb752637eff5edb859ab73879b0d365c3e8b5a3ac49ed431"} Jan 21 16:39:24 crc kubenswrapper[4707]: I0121 16:39:24.696231 4707 generic.go:334] "Generic (PLEG): container finished" podID="408cfa00-50fd-4d12-845c-4e467fb79876" containerID="fe1b3c138a37e5d49368ecc528dc98056e0178aa77d839a338195969abc544ad" exitCode=0 Jan 21 16:39:24 crc kubenswrapper[4707]: I0121 16:39:24.696317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" event={"ID":"408cfa00-50fd-4d12-845c-4e467fb79876","Type":"ContainerDied","Data":"fe1b3c138a37e5d49368ecc528dc98056e0178aa77d839a338195969abc544ad"} Jan 21 16:39:25 crc kubenswrapper[4707]: I0121 16:39:25.383698 4707 scope.go:117] "RemoveContainer" containerID="414e9152c6825f4306ef632e34042bad238102d778f1ffadf6a62c96dd1982df" Jan 21 16:39:25 crc kubenswrapper[4707]: I0121 16:39:25.397039 4707 scope.go:117] "RemoveContainer" containerID="3a2a9bd76353e45ce9b0adde4e7d27a63cd1cfe605e980da7026b996895962a6" Jan 21 16:39:25 crc kubenswrapper[4707]: I0121 16:39:25.416444 4707 scope.go:117] "RemoveContainer" containerID="80bcb3ada9b96457bebb8d044efd0f6b3dca86dbe7679c81e8e4c311c928cc38" Jan 21 16:39:25 crc kubenswrapper[4707]: I0121 16:39:25.906315 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.088666 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpsr6\" (UniqueName: \"kubernetes.io/projected/408cfa00-50fd-4d12-845c-4e467fb79876-kube-api-access-kpsr6\") pod \"408cfa00-50fd-4d12-845c-4e467fb79876\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.088764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-util\") pod \"408cfa00-50fd-4d12-845c-4e467fb79876\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.088786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-bundle\") pod \"408cfa00-50fd-4d12-845c-4e467fb79876\" (UID: \"408cfa00-50fd-4d12-845c-4e467fb79876\") " Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.089662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-bundle" (OuterVolumeSpecName: "bundle") pod "408cfa00-50fd-4d12-845c-4e467fb79876" (UID: "408cfa00-50fd-4d12-845c-4e467fb79876"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.092861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408cfa00-50fd-4d12-845c-4e467fb79876-kube-api-access-kpsr6" (OuterVolumeSpecName: "kube-api-access-kpsr6") pod "408cfa00-50fd-4d12-845c-4e467fb79876" (UID: "408cfa00-50fd-4d12-845c-4e467fb79876"). InnerVolumeSpecName "kube-api-access-kpsr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.189919 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpsr6\" (UniqueName: \"kubernetes.io/projected/408cfa00-50fd-4d12-845c-4e467fb79876-kube-api-access-kpsr6\") on node \"crc\" DevicePath \"\"" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.190220 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.211009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-util" (OuterVolumeSpecName: "util") pod "408cfa00-50fd-4d12-845c-4e467fb79876" (UID: "408cfa00-50fd-4d12-845c-4e467fb79876"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.291137 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/408cfa00-50fd-4d12-845c-4e467fb79876-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.706775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" event={"ID":"408cfa00-50fd-4d12-845c-4e467fb79876","Type":"ContainerDied","Data":"bc48e2d9e678ada9584c59b56a75812dc44b9d43fb8f9bcdb15c794f5d08dec2"} Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.706840 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc48e2d9e678ada9584c59b56a75812dc44b9d43fb8f9bcdb15c794f5d08dec2" Jan 21 16:39:26 crc kubenswrapper[4707]: I0121 16:39:26.707066 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.697706 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:39:29 crc kubenswrapper[4707]: E0121 16:39:29.698184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="pull" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.698197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="pull" Jan 21 16:39:29 crc kubenswrapper[4707]: E0121 16:39:29.698220 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="util" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.698225 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="util" Jan 21 16:39:29 crc kubenswrapper[4707]: E0121 16:39:29.698240 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="extract" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.698246 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="extract" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.698362 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" containerName="extract" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.698948 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.700330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-default-user" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.700364 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-server-conf" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.700390 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.700595 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.700410 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-server-dockercfg-zrmb8" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.709226 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.835892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crdrk\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-kube-api-access-crdrk\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b049d8fc-bda9-45c0-b94b-ea5824e7e684-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b049d8fc-bda9-45c0-b94b-ea5824e7e684-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b049d8fc-bda9-45c0-b94b-ea5824e7e684-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " 
pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.836875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b049d8fc-bda9-45c0-b94b-ea5824e7e684-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b049d8fc-bda9-45c0-b94b-ea5824e7e684-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crdrk\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-kube-api-access-crdrk\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " 
pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.938905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b049d8fc-bda9-45c0-b94b-ea5824e7e684-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.939190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.939622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.940158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b049d8fc-bda9-45c0-b94b-ea5824e7e684-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.943230 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.943306 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/003e466ac4a36819547911a73105db8dd00dc730e804594d71e2592a5398b1fe/globalmount\"" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.944753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.945022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b049d8fc-bda9-45c0-b94b-ea5824e7e684-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.945125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b049d8fc-bda9-45c0-b94b-ea5824e7e684-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.954089 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-crdrk\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-kube-api-access-crdrk\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:29 crc kubenswrapper[4707]: I0121 16:39:29.966417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") pod \"rabbitmq-server-0\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:30 crc kubenswrapper[4707]: I0121 16:39:30.012193 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:39:30 crc kubenswrapper[4707]: I0121 16:39:30.379149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:39:30 crc kubenswrapper[4707]: I0121 16:39:30.728858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"b049d8fc-bda9-45c0-b94b-ea5824e7e684","Type":"ContainerStarted","Data":"de7c56622a49acff4d9fe2ed49f90c60a0ce6bd57019e90897e3f8820a1aa4cc"} Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.598622 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl"] Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.599697 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.601101 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mr6x2" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.601278 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.611309 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl"] Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.625782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-apiservice-cert\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.625873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5fxh\" (UniqueName: \"kubernetes.io/projected/2d310b10-05e8-4691-b1d9-84a2f829173c-kube-api-access-f5fxh\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.625897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-webhook-cert\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.726951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-apiservice-cert\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.727017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5fxh\" (UniqueName: \"kubernetes.io/projected/2d310b10-05e8-4691-b1d9-84a2f829173c-kube-api-access-f5fxh\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.727046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-webhook-cert\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.730520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-apiservice-cert\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.730543 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-webhook-cert\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.739907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5fxh\" (UniqueName: \"kubernetes.io/projected/2d310b10-05e8-4691-b1d9-84a2f829173c-kube-api-access-f5fxh\") pod \"keystone-operator-controller-manager-f4c696d8c-vg2fl\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.763400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"b049d8fc-bda9-45c0-b94b-ea5824e7e684","Type":"ContainerStarted","Data":"a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7"} Jan 21 16:39:36 crc kubenswrapper[4707]: I0121 16:39:36.913325 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:37 crc kubenswrapper[4707]: I0121 16:39:37.278368 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl"] Jan 21 16:39:37 crc kubenswrapper[4707]: W0121 16:39:37.282529 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d310b10_05e8_4691_b1d9_84a2f829173c.slice/crio-5bf95eeed98e29ded5d717bc0aa922fa6abf7292c7bf00e61debefc85255c6a1 WatchSource:0}: Error finding container 5bf95eeed98e29ded5d717bc0aa922fa6abf7292c7bf00e61debefc85255c6a1: Status 404 returned error can't find the container with id 5bf95eeed98e29ded5d717bc0aa922fa6abf7292c7bf00e61debefc85255c6a1 Jan 21 16:39:37 crc kubenswrapper[4707]: I0121 16:39:37.772053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" event={"ID":"2d310b10-05e8-4691-b1d9-84a2f829173c","Type":"ContainerStarted","Data":"5bf95eeed98e29ded5d717bc0aa922fa6abf7292c7bf00e61debefc85255c6a1"} Jan 21 16:39:40 crc kubenswrapper[4707]: I0121 16:39:40.788991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" event={"ID":"2d310b10-05e8-4691-b1d9-84a2f829173c","Type":"ContainerStarted","Data":"8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709"} Jan 21 16:39:40 crc kubenswrapper[4707]: I0121 16:39:40.789130 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:40 crc kubenswrapper[4707]: I0121 16:39:40.804469 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" podStartSLOduration=2.033645527 podStartE2EDuration="4.804456615s" podCreationTimestamp="2026-01-21 16:39:36 +0000 UTC" firstStartedPulling="2026-01-21 16:39:37.284506464 +0000 UTC m=+5874.466022686" lastFinishedPulling="2026-01-21 16:39:40.055317552 +0000 UTC m=+5877.236833774" observedRunningTime="2026-01-21 16:39:40.799909665 +0000 UTC m=+5877.981425888" watchObservedRunningTime="2026-01-21 16:39:40.804456615 +0000 UTC m=+5877.985972838" Jan 21 16:39:46 crc kubenswrapper[4707]: I0121 16:39:46.917271 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:39:50 crc kubenswrapper[4707]: I0121 16:39:50.741421 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-c7jqh"] Jan 21 16:39:50 crc kubenswrapper[4707]: I0121 16:39:50.742382 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:50 crc kubenswrapper[4707]: I0121 16:39:50.744509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-5986t" Jan 21 16:39:50 crc kubenswrapper[4707]: I0121 16:39:50.750740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-c7jqh"] Jan 21 16:39:50 crc kubenswrapper[4707]: I0121 16:39:50.905373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2md8\" (UniqueName: \"kubernetes.io/projected/62c24395-8cfc-4b7c-bf02-054fd74721ac-kube-api-access-b2md8\") pod \"barbican-operator-index-c7jqh\" (UID: \"62c24395-8cfc-4b7c-bf02-054fd74721ac\") " pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:51 crc kubenswrapper[4707]: I0121 16:39:51.007325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2md8\" (UniqueName: \"kubernetes.io/projected/62c24395-8cfc-4b7c-bf02-054fd74721ac-kube-api-access-b2md8\") pod \"barbican-operator-index-c7jqh\" (UID: \"62c24395-8cfc-4b7c-bf02-054fd74721ac\") " pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:51 crc kubenswrapper[4707]: I0121 16:39:51.022874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2md8\" (UniqueName: \"kubernetes.io/projected/62c24395-8cfc-4b7c-bf02-054fd74721ac-kube-api-access-b2md8\") pod \"barbican-operator-index-c7jqh\" (UID: \"62c24395-8cfc-4b7c-bf02-054fd74721ac\") " pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:51 crc kubenswrapper[4707]: I0121 16:39:51.061898 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:51 crc kubenswrapper[4707]: I0121 16:39:51.408995 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-c7jqh"] Jan 21 16:39:51 crc kubenswrapper[4707]: W0121 16:39:51.413616 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c24395_8cfc_4b7c_bf02_054fd74721ac.slice/crio-a5774491fb63998f8bcedb735aafd5e301d7c1313901a357659dc21c3c7a7248 WatchSource:0}: Error finding container a5774491fb63998f8bcedb735aafd5e301d7c1313901a357659dc21c3c7a7248: Status 404 returned error can't find the container with id a5774491fb63998f8bcedb735aafd5e301d7c1313901a357659dc21c3c7a7248 Jan 21 16:39:51 crc kubenswrapper[4707]: I0121 16:39:51.848625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-c7jqh" event={"ID":"62c24395-8cfc-4b7c-bf02-054fd74721ac","Type":"ContainerStarted","Data":"a5774491fb63998f8bcedb735aafd5e301d7c1313901a357659dc21c3c7a7248"} Jan 21 16:39:52 crc kubenswrapper[4707]: I0121 16:39:52.865782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-c7jqh" event={"ID":"62c24395-8cfc-4b7c-bf02-054fd74721ac","Type":"ContainerStarted","Data":"b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3"} Jan 21 16:39:52 crc kubenswrapper[4707]: I0121 16:39:52.882552 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-c7jqh" podStartSLOduration=1.610997538 podStartE2EDuration="2.882539664s" podCreationTimestamp="2026-01-21 16:39:50 +0000 UTC" firstStartedPulling="2026-01-21 16:39:51.415864389 +0000 UTC m=+5888.597380612" lastFinishedPulling="2026-01-21 16:39:52.687406526 +0000 UTC m=+5889.868922738" observedRunningTime="2026-01-21 16:39:52.877547698 +0000 UTC m=+5890.059063920" watchObservedRunningTime="2026-01-21 16:39:52.882539664 +0000 UTC m=+5890.064055887" Jan 21 16:39:54 crc kubenswrapper[4707]: I0121 16:39:54.933942 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-c7jqh"] Jan 21 16:39:54 crc kubenswrapper[4707]: I0121 16:39:54.934093 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-c7jqh" podUID="62c24395-8cfc-4b7c-bf02-054fd74721ac" containerName="registry-server" containerID="cri-o://b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3" gracePeriod=2 Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.277540 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.463739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2md8\" (UniqueName: \"kubernetes.io/projected/62c24395-8cfc-4b7c-bf02-054fd74721ac-kube-api-access-b2md8\") pod \"62c24395-8cfc-4b7c-bf02-054fd74721ac\" (UID: \"62c24395-8cfc-4b7c-bf02-054fd74721ac\") " Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.468017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c24395-8cfc-4b7c-bf02-054fd74721ac-kube-api-access-b2md8" (OuterVolumeSpecName: "kube-api-access-b2md8") pod "62c24395-8cfc-4b7c-bf02-054fd74721ac" (UID: "62c24395-8cfc-4b7c-bf02-054fd74721ac"). InnerVolumeSpecName "kube-api-access-b2md8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.538427 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-khs56"] Jan 21 16:39:55 crc kubenswrapper[4707]: E0121 16:39:55.538731 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c24395-8cfc-4b7c-bf02-054fd74721ac" containerName="registry-server" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.538750 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c24395-8cfc-4b7c-bf02-054fd74721ac" containerName="registry-server" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.538914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c24395-8cfc-4b7c-bf02-054fd74721ac" containerName="registry-server" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.539347 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.543344 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-khs56"] Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.565801 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2md8\" (UniqueName: \"kubernetes.io/projected/62c24395-8cfc-4b7c-bf02-054fd74721ac-kube-api-access-b2md8\") on node \"crc\" DevicePath \"\"" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.666922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxdj\" (UniqueName: \"kubernetes.io/projected/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d-kube-api-access-bbxdj\") pod \"barbican-operator-index-khs56\" (UID: \"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d\") " pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.768755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxdj\" (UniqueName: \"kubernetes.io/projected/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d-kube-api-access-bbxdj\") pod \"barbican-operator-index-khs56\" (UID: \"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d\") " pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.782063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxdj\" (UniqueName: \"kubernetes.io/projected/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d-kube-api-access-bbxdj\") pod \"barbican-operator-index-khs56\" (UID: \"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d\") " pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.853515 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.883290 4707 generic.go:334] "Generic (PLEG): container finished" podID="62c24395-8cfc-4b7c-bf02-054fd74721ac" containerID="b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3" exitCode=0 Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.883325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-c7jqh" event={"ID":"62c24395-8cfc-4b7c-bf02-054fd74721ac","Type":"ContainerDied","Data":"b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3"} Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.883471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-c7jqh" event={"ID":"62c24395-8cfc-4b7c-bf02-054fd74721ac","Type":"ContainerDied","Data":"a5774491fb63998f8bcedb735aafd5e301d7c1313901a357659dc21c3c7a7248"} Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.883489 4707 scope.go:117] "RemoveContainer" containerID="b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.883344 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-c7jqh" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.897435 4707 scope.go:117] "RemoveContainer" containerID="b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3" Jan 21 16:39:55 crc kubenswrapper[4707]: E0121 16:39:55.897841 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3\": container with ID starting with b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3 not found: ID does not exist" containerID="b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.897872 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3"} err="failed to get container status \"b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3\": rpc error: code = NotFound desc = could not find container \"b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3\": container with ID starting with b1a1bab59a5c2441ff36367f50a5f540ff01dad42a1bcf922c86b6cf5d7fb8f3 not found: ID does not exist" Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.908135 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-c7jqh"] Jan 21 16:39:55 crc kubenswrapper[4707]: I0121 16:39:55.912680 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-c7jqh"] Jan 21 16:39:56 crc kubenswrapper[4707]: I0121 16:39:56.207296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-khs56"] Jan 21 16:39:56 crc kubenswrapper[4707]: I0121 16:39:56.890373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-khs56" event={"ID":"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d","Type":"ContainerStarted","Data":"b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7"} Jan 21 16:39:56 crc kubenswrapper[4707]: I0121 16:39:56.890600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-khs56" event={"ID":"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d","Type":"ContainerStarted","Data":"de08628e52157c31afa789a1b1e63ce4b855243cfe2c3955b47a53f44a715c85"} Jan 21 16:39:56 crc kubenswrapper[4707]: I0121 16:39:56.908396 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-khs56" podStartSLOduration=1.421052759 podStartE2EDuration="1.908381241s" podCreationTimestamp="2026-01-21 16:39:55 +0000 UTC" firstStartedPulling="2026-01-21 16:39:56.213443071 +0000 UTC m=+5893.394959293" lastFinishedPulling="2026-01-21 16:39:56.700771553 +0000 UTC m=+5893.882287775" observedRunningTime="2026-01-21 16:39:56.905064425 +0000 UTC m=+5894.086580646" watchObservedRunningTime="2026-01-21 16:39:56.908381241 +0000 UTC m=+5894.089897463" Jan 21 16:39:57 crc kubenswrapper[4707]: I0121 16:39:57.187865 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c24395-8cfc-4b7c-bf02-054fd74721ac" path="/var/lib/kubelet/pods/62c24395-8cfc-4b7c-bf02-054fd74721ac/volumes" Jan 21 16:40:05 crc kubenswrapper[4707]: I0121 16:40:05.854118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:40:05 crc kubenswrapper[4707]: I0121 16:40:05.854571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:40:05 crc kubenswrapper[4707]: I0121 16:40:05.874575 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:40:05 crc kubenswrapper[4707]: I0121 16:40:05.959157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:40:07 crc kubenswrapper[4707]: I0121 16:40:07.950113 4707 generic.go:334] "Generic (PLEG): container finished" podID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerID="a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7" exitCode=0 Jan 21 16:40:07 crc kubenswrapper[4707]: I0121 16:40:07.950154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"b049d8fc-bda9-45c0-b94b-ea5824e7e684","Type":"ContainerDied","Data":"a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7"} Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.779459 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v"] Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.780872 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.782322 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.789600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v"] Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.923041 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvww\" (UniqueName: \"kubernetes.io/projected/28af8b85-10f6-466f-b49e-cb02c1530048-kube-api-access-9mvww\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.923265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-util\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.923422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-bundle\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.958069 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"b049d8fc-bda9-45c0-b94b-ea5824e7e684","Type":"ContainerStarted","Data":"3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55"} Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.958285 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:40:08 crc kubenswrapper[4707]: I0121 16:40:08.978598 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.059222112 podStartE2EDuration="40.978581908s" podCreationTimestamp="2026-01-21 16:39:28 +0000 UTC" firstStartedPulling="2026-01-21 16:39:30.384594642 +0000 UTC m=+5867.566110865" lastFinishedPulling="2026-01-21 16:39:35.303954438 +0000 UTC m=+5872.485470661" observedRunningTime="2026-01-21 16:40:08.975282804 +0000 UTC m=+5906.156799026" watchObservedRunningTime="2026-01-21 16:40:08.978581908 +0000 UTC m=+5906.160098130" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.025525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvww\" (UniqueName: \"kubernetes.io/projected/28af8b85-10f6-466f-b49e-cb02c1530048-kube-api-access-9mvww\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.025593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-util\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.025650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-bundle\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.026130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-bundle\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.026169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-util\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.042872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvww\" (UniqueName: 
\"kubernetes.io/projected/28af8b85-10f6-466f-b49e-cb02c1530048-kube-api-access-9mvww\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.103519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.471057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v"] Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.966236 4707 generic.go:334] "Generic (PLEG): container finished" podID="28af8b85-10f6-466f-b49e-cb02c1530048" containerID="e519e8867f97adcf5975a111f78a8343d5660a802db0350be83c18e6285c3c0f" exitCode=0 Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.966359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" event={"ID":"28af8b85-10f6-466f-b49e-cb02c1530048","Type":"ContainerDied","Data":"e519e8867f97adcf5975a111f78a8343d5660a802db0350be83c18e6285c3c0f"} Jan 21 16:40:09 crc kubenswrapper[4707]: I0121 16:40:09.966697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" event={"ID":"28af8b85-10f6-466f-b49e-cb02c1530048","Type":"ContainerStarted","Data":"62197b5855cfc6ba64cb0ec3bc3b3680cb0ddc7066dbb1356cf4313508b1c6e7"} Jan 21 16:40:11 crc kubenswrapper[4707]: I0121 16:40:11.978914 4707 generic.go:334] "Generic (PLEG): container finished" podID="28af8b85-10f6-466f-b49e-cb02c1530048" containerID="9ba17d6b270bdf649b53367f4f8805234a3b0e6fb56411d6f3cb0e687eec4328" exitCode=0 Jan 21 16:40:11 crc kubenswrapper[4707]: I0121 16:40:11.978952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" event={"ID":"28af8b85-10f6-466f-b49e-cb02c1530048","Type":"ContainerDied","Data":"9ba17d6b270bdf649b53367f4f8805234a3b0e6fb56411d6f3cb0e687eec4328"} Jan 21 16:40:12 crc kubenswrapper[4707]: I0121 16:40:12.987369 4707 generic.go:334] "Generic (PLEG): container finished" podID="28af8b85-10f6-466f-b49e-cb02c1530048" containerID="6f9706b5be81d9d93e668a9e1b8048574b2113237604f19fc7564233ede4d0d0" exitCode=0 Jan 21 16:40:12 crc kubenswrapper[4707]: I0121 16:40:12.987438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" event={"ID":"28af8b85-10f6-466f-b49e-cb02c1530048","Type":"ContainerDied","Data":"6f9706b5be81d9d93e668a9e1b8048574b2113237604f19fc7564233ede4d0d0"} Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.237194 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.407101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-bundle\") pod \"28af8b85-10f6-466f-b49e-cb02c1530048\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.407216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mvww\" (UniqueName: \"kubernetes.io/projected/28af8b85-10f6-466f-b49e-cb02c1530048-kube-api-access-9mvww\") pod \"28af8b85-10f6-466f-b49e-cb02c1530048\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.407291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-util\") pod \"28af8b85-10f6-466f-b49e-cb02c1530048\" (UID: \"28af8b85-10f6-466f-b49e-cb02c1530048\") " Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.408010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-bundle" (OuterVolumeSpecName: "bundle") pod "28af8b85-10f6-466f-b49e-cb02c1530048" (UID: "28af8b85-10f6-466f-b49e-cb02c1530048"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.413477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28af8b85-10f6-466f-b49e-cb02c1530048-kube-api-access-9mvww" (OuterVolumeSpecName: "kube-api-access-9mvww") pod "28af8b85-10f6-466f-b49e-cb02c1530048" (UID: "28af8b85-10f6-466f-b49e-cb02c1530048"). InnerVolumeSpecName "kube-api-access-9mvww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.417766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-util" (OuterVolumeSpecName: "util") pod "28af8b85-10f6-466f-b49e-cb02c1530048" (UID: "28af8b85-10f6-466f-b49e-cb02c1530048"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.508460 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mvww\" (UniqueName: \"kubernetes.io/projected/28af8b85-10f6-466f-b49e-cb02c1530048-kube-api-access-9mvww\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.508691 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.508703 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28af8b85-10f6-466f-b49e-cb02c1530048-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.707238 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-create-7vp86"] Jan 21 16:40:14 crc kubenswrapper[4707]: E0121 16:40:14.707544 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="extract" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.707561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="extract" Jan 21 16:40:14 crc kubenswrapper[4707]: E0121 16:40:14.707584 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="pull" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.707590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="pull" Jan 21 16:40:14 crc kubenswrapper[4707]: E0121 16:40:14.707599 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="util" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.707605 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="util" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.707705 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" containerName="extract" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.708161 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.710919 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z"] Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.711410 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.714376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-7vp86"] Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.716567 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-db-secret" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.718325 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z"] Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.811918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4cdq\" (UniqueName: \"kubernetes.io/projected/c211bec1-5e1f-4ce9-832a-24155a50c399-kube-api-access-c4cdq\") pod \"keystone-cdc9-account-create-update-5qq8z\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.812030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c211bec1-5e1f-4ce9-832a-24155a50c399-operator-scripts\") pod \"keystone-cdc9-account-create-update-5qq8z\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.812061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbr78\" (UniqueName: \"kubernetes.io/projected/1baa8281-9a62-4a40-a97f-c2ce33f9216d-kube-api-access-vbr78\") pod \"keystone-db-create-7vp86\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.812096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1baa8281-9a62-4a40-a97f-c2ce33f9216d-operator-scripts\") pod \"keystone-db-create-7vp86\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.913114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4cdq\" (UniqueName: \"kubernetes.io/projected/c211bec1-5e1f-4ce9-832a-24155a50c399-kube-api-access-c4cdq\") pod \"keystone-cdc9-account-create-update-5qq8z\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.913201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c211bec1-5e1f-4ce9-832a-24155a50c399-operator-scripts\") pod \"keystone-cdc9-account-create-update-5qq8z\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.913231 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbr78\" (UniqueName: \"kubernetes.io/projected/1baa8281-9a62-4a40-a97f-c2ce33f9216d-kube-api-access-vbr78\") pod \"keystone-db-create-7vp86\" (UID: 
\"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.913255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1baa8281-9a62-4a40-a97f-c2ce33f9216d-operator-scripts\") pod \"keystone-db-create-7vp86\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.913893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1baa8281-9a62-4a40-a97f-c2ce33f9216d-operator-scripts\") pod \"keystone-db-create-7vp86\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.914531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c211bec1-5e1f-4ce9-832a-24155a50c399-operator-scripts\") pod \"keystone-cdc9-account-create-update-5qq8z\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.928244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4cdq\" (UniqueName: \"kubernetes.io/projected/c211bec1-5e1f-4ce9-832a-24155a50c399-kube-api-access-c4cdq\") pod \"keystone-cdc9-account-create-update-5qq8z\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:14 crc kubenswrapper[4707]: I0121 16:40:14.932149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbr78\" (UniqueName: \"kubernetes.io/projected/1baa8281-9a62-4a40-a97f-c2ce33f9216d-kube-api-access-vbr78\") pod \"keystone-db-create-7vp86\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.000005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" event={"ID":"28af8b85-10f6-466f-b49e-cb02c1530048","Type":"ContainerDied","Data":"62197b5855cfc6ba64cb0ec3bc3b3680cb0ddc7066dbb1356cf4313508b1c6e7"} Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.000038 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62197b5855cfc6ba64cb0ec3bc3b3680cb0ddc7066dbb1356cf4313508b1c6e7" Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.000088 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v" Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.027632 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.027702 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.395590 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-7vp86"] Jan 21 16:40:15 crc kubenswrapper[4707]: W0121 16:40:15.398620 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1baa8281_9a62_4a40_a97f_c2ce33f9216d.slice/crio-73fe2978d1bbffd4e562403b71260929a9e01e8564aa332c3e10d9c231ba5f07 WatchSource:0}: Error finding container 73fe2978d1bbffd4e562403b71260929a9e01e8564aa332c3e10d9c231ba5f07: Status 404 returned error can't find the container with id 73fe2978d1bbffd4e562403b71260929a9e01e8564aa332c3e10d9c231ba5f07 Jan 21 16:40:15 crc kubenswrapper[4707]: I0121 16:40:15.443619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z"] Jan 21 16:40:15 crc kubenswrapper[4707]: W0121 16:40:15.472650 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc211bec1_5e1f_4ce9_832a_24155a50c399.slice/crio-34e022b4bf3971dd4ae734afc9d9b30b5329f2fc4a1245dffdf8d0562518639e WatchSource:0}: Error finding container 34e022b4bf3971dd4ae734afc9d9b30b5329f2fc4a1245dffdf8d0562518639e: Status 404 returned error can't find the container with id 34e022b4bf3971dd4ae734afc9d9b30b5329f2fc4a1245dffdf8d0562518639e Jan 21 16:40:16 crc kubenswrapper[4707]: I0121 16:40:16.007155 4707 generic.go:334] "Generic (PLEG): container finished" podID="c211bec1-5e1f-4ce9-832a-24155a50c399" containerID="85a6db681c23d2d559bfefcdac4a591fe4fbdbe19f51f00633faac999250b166" exitCode=0 Jan 21 16:40:16 crc kubenswrapper[4707]: I0121 16:40:16.007255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" event={"ID":"c211bec1-5e1f-4ce9-832a-24155a50c399","Type":"ContainerDied","Data":"85a6db681c23d2d559bfefcdac4a591fe4fbdbe19f51f00633faac999250b166"} Jan 21 16:40:16 crc kubenswrapper[4707]: I0121 16:40:16.007518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" event={"ID":"c211bec1-5e1f-4ce9-832a-24155a50c399","Type":"ContainerStarted","Data":"34e022b4bf3971dd4ae734afc9d9b30b5329f2fc4a1245dffdf8d0562518639e"} Jan 21 16:40:16 crc kubenswrapper[4707]: I0121 16:40:16.008716 4707 generic.go:334] "Generic (PLEG): container finished" podID="1baa8281-9a62-4a40-a97f-c2ce33f9216d" containerID="96254e932b512b73daaf144d488ace9d1022025ec187cdf7638fd0994f40d6db" exitCode=0 Jan 21 16:40:16 crc kubenswrapper[4707]: I0121 16:40:16.008754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-7vp86" event={"ID":"1baa8281-9a62-4a40-a97f-c2ce33f9216d","Type":"ContainerDied","Data":"96254e932b512b73daaf144d488ace9d1022025ec187cdf7638fd0994f40d6db"} Jan 21 16:40:16 crc kubenswrapper[4707]: I0121 16:40:16.008776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-7vp86" event={"ID":"1baa8281-9a62-4a40-a97f-c2ce33f9216d","Type":"ContainerStarted","Data":"73fe2978d1bbffd4e562403b71260929a9e01e8564aa332c3e10d9c231ba5f07"} Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.278346 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.282140 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.348505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1baa8281-9a62-4a40-a97f-c2ce33f9216d-operator-scripts\") pod \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.348573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbr78\" (UniqueName: \"kubernetes.io/projected/1baa8281-9a62-4a40-a97f-c2ce33f9216d-kube-api-access-vbr78\") pod \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\" (UID: \"1baa8281-9a62-4a40-a97f-c2ce33f9216d\") " Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.348591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4cdq\" (UniqueName: \"kubernetes.io/projected/c211bec1-5e1f-4ce9-832a-24155a50c399-kube-api-access-c4cdq\") pod \"c211bec1-5e1f-4ce9-832a-24155a50c399\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.348637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c211bec1-5e1f-4ce9-832a-24155a50c399-operator-scripts\") pod \"c211bec1-5e1f-4ce9-832a-24155a50c399\" (UID: \"c211bec1-5e1f-4ce9-832a-24155a50c399\") " Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.349246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c211bec1-5e1f-4ce9-832a-24155a50c399-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c211bec1-5e1f-4ce9-832a-24155a50c399" (UID: "c211bec1-5e1f-4ce9-832a-24155a50c399"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.349250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1baa8281-9a62-4a40-a97f-c2ce33f9216d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1baa8281-9a62-4a40-a97f-c2ce33f9216d" (UID: "1baa8281-9a62-4a40-a97f-c2ce33f9216d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.353493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1baa8281-9a62-4a40-a97f-c2ce33f9216d-kube-api-access-vbr78" (OuterVolumeSpecName: "kube-api-access-vbr78") pod "1baa8281-9a62-4a40-a97f-c2ce33f9216d" (UID: "1baa8281-9a62-4a40-a97f-c2ce33f9216d"). InnerVolumeSpecName "kube-api-access-vbr78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.353650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c211bec1-5e1f-4ce9-832a-24155a50c399-kube-api-access-c4cdq" (OuterVolumeSpecName: "kube-api-access-c4cdq") pod "c211bec1-5e1f-4ce9-832a-24155a50c399" (UID: "c211bec1-5e1f-4ce9-832a-24155a50c399"). InnerVolumeSpecName "kube-api-access-c4cdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.449783 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1baa8281-9a62-4a40-a97f-c2ce33f9216d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.449832 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbr78\" (UniqueName: \"kubernetes.io/projected/1baa8281-9a62-4a40-a97f-c2ce33f9216d-kube-api-access-vbr78\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.449845 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4cdq\" (UniqueName: \"kubernetes.io/projected/c211bec1-5e1f-4ce9-832a-24155a50c399-kube-api-access-c4cdq\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:17 crc kubenswrapper[4707]: I0121 16:40:17.449853 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c211bec1-5e1f-4ce9-832a-24155a50c399-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:18 crc kubenswrapper[4707]: I0121 16:40:18.021145 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" Jan 21 16:40:18 crc kubenswrapper[4707]: I0121 16:40:18.021146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z" event={"ID":"c211bec1-5e1f-4ce9-832a-24155a50c399","Type":"ContainerDied","Data":"34e022b4bf3971dd4ae734afc9d9b30b5329f2fc4a1245dffdf8d0562518639e"} Jan 21 16:40:18 crc kubenswrapper[4707]: I0121 16:40:18.021247 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e022b4bf3971dd4ae734afc9d9b30b5329f2fc4a1245dffdf8d0562518639e" Jan 21 16:40:18 crc kubenswrapper[4707]: I0121 16:40:18.022461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-7vp86" event={"ID":"1baa8281-9a62-4a40-a97f-c2ce33f9216d","Type":"ContainerDied","Data":"73fe2978d1bbffd4e562403b71260929a9e01e8564aa332c3e10d9c231ba5f07"} Jan 21 16:40:18 crc kubenswrapper[4707]: I0121 16:40:18.022497 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73fe2978d1bbffd4e562403b71260929a9e01e8564aa332c3e10d9c231ba5f07" Jan 21 16:40:18 crc kubenswrapper[4707]: I0121 16:40:18.022512 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-7vp86" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.016677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.527127 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dw5db"] Jan 21 16:40:20 crc kubenswrapper[4707]: E0121 16:40:20.527506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c211bec1-5e1f-4ce9-832a-24155a50c399" containerName="mariadb-account-create-update" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.527529 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c211bec1-5e1f-4ce9-832a-24155a50c399" containerName="mariadb-account-create-update" Jan 21 16:40:20 crc kubenswrapper[4707]: E0121 16:40:20.527545 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1baa8281-9a62-4a40-a97f-c2ce33f9216d" containerName="mariadb-database-create" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.527552 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1baa8281-9a62-4a40-a97f-c2ce33f9216d" containerName="mariadb-database-create" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.527674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1baa8281-9a62-4a40-a97f-c2ce33f9216d" containerName="mariadb-database-create" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.527685 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c211bec1-5e1f-4ce9-832a-24155a50c399" containerName="mariadb-account-create-update" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.528281 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.532297 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.532481 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.532509 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.532544 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-69b5d" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.533298 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dw5db"] Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.689726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e783b9-adef-43ea-92b5-0689ae45f547-config-data\") pod \"keystone-db-sync-dw5db\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.689882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7nbg\" (UniqueName: \"kubernetes.io/projected/55e783b9-adef-43ea-92b5-0689ae45f547-kube-api-access-r7nbg\") pod \"keystone-db-sync-dw5db\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.791369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e783b9-adef-43ea-92b5-0689ae45f547-config-data\") pod \"keystone-db-sync-dw5db\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.791473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7nbg\" (UniqueName: \"kubernetes.io/projected/55e783b9-adef-43ea-92b5-0689ae45f547-kube-api-access-r7nbg\") pod \"keystone-db-sync-dw5db\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.796368 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e783b9-adef-43ea-92b5-0689ae45f547-config-data\") pod \"keystone-db-sync-dw5db\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.819246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7nbg\" (UniqueName: \"kubernetes.io/projected/55e783b9-adef-43ea-92b5-0689ae45f547-kube-api-access-r7nbg\") pod \"keystone-db-sync-dw5db\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:20 crc kubenswrapper[4707]: I0121 16:40:20.843097 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:21 crc kubenswrapper[4707]: I0121 16:40:21.206524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dw5db"] Jan 21 16:40:22 crc kubenswrapper[4707]: I0121 16:40:22.051896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" event={"ID":"55e783b9-adef-43ea-92b5-0689ae45f547","Type":"ContainerStarted","Data":"63c50ce7ed4fe3362d8359ac16da360026ad74e32e2fa41d59abacb97566e618"} Jan 21 16:40:23 crc kubenswrapper[4707]: I0121 16:40:23.058166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" event={"ID":"55e783b9-adef-43ea-92b5-0689ae45f547","Type":"ContainerStarted","Data":"4cd6cd5afa339b83660cde8c66e04cbee9185aebd4d19e72c48f6e0f30a78d3c"} Jan 21 16:40:24 crc kubenswrapper[4707]: I0121 16:40:24.065289 4707 generic.go:334] "Generic (PLEG): container finished" podID="55e783b9-adef-43ea-92b5-0689ae45f547" containerID="4cd6cd5afa339b83660cde8c66e04cbee9185aebd4d19e72c48f6e0f30a78d3c" exitCode=0 Jan 21 16:40:24 crc kubenswrapper[4707]: I0121 16:40:24.065342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" event={"ID":"55e783b9-adef-43ea-92b5-0689ae45f547","Type":"ContainerDied","Data":"4cd6cd5afa339b83660cde8c66e04cbee9185aebd4d19e72c48f6e0f30a78d3c"} Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.296982 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.460614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e783b9-adef-43ea-92b5-0689ae45f547-config-data\") pod \"55e783b9-adef-43ea-92b5-0689ae45f547\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.460648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7nbg\" (UniqueName: \"kubernetes.io/projected/55e783b9-adef-43ea-92b5-0689ae45f547-kube-api-access-r7nbg\") pod \"55e783b9-adef-43ea-92b5-0689ae45f547\" (UID: \"55e783b9-adef-43ea-92b5-0689ae45f547\") " Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.464949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e783b9-adef-43ea-92b5-0689ae45f547-kube-api-access-r7nbg" (OuterVolumeSpecName: "kube-api-access-r7nbg") pod "55e783b9-adef-43ea-92b5-0689ae45f547" (UID: "55e783b9-adef-43ea-92b5-0689ae45f547"). InnerVolumeSpecName "kube-api-access-r7nbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.473370 4707 scope.go:117] "RemoveContainer" containerID="279873d4ba1cceb2a7d6548a655b63ce12ba7880ec55a67febe0bb558f345333" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.486719 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e783b9-adef-43ea-92b5-0689ae45f547-config-data" (OuterVolumeSpecName: "config-data") pod "55e783b9-adef-43ea-92b5-0689ae45f547" (UID: "55e783b9-adef-43ea-92b5-0689ae45f547"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.531748 4707 scope.go:117] "RemoveContainer" containerID="bd9a5132ef93dfcb7495822d09194ebe92fb78bc1a90c45207c8e4321564275e" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.549003 4707 scope.go:117] "RemoveContainer" containerID="0511a692cf877ea70b36d67f6978eef8da3c2ecaefe78498e7fed5b52626862e" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.562275 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e783b9-adef-43ea-92b5-0689ae45f547-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:25 crc kubenswrapper[4707]: I0121 16:40:25.562300 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7nbg\" (UniqueName: \"kubernetes.io/projected/55e783b9-adef-43ea-92b5-0689ae45f547-kube-api-access-r7nbg\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.078227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" event={"ID":"55e783b9-adef-43ea-92b5-0689ae45f547","Type":"ContainerDied","Data":"63c50ce7ed4fe3362d8359ac16da360026ad74e32e2fa41d59abacb97566e618"} Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.078502 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63c50ce7ed4fe3362d8359ac16da360026ad74e32e2fa41d59abacb97566e618" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.078267 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-dw5db" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.485107 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-5fd7v"] Jan 21 16:40:26 crc kubenswrapper[4707]: E0121 16:40:26.485421 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e783b9-adef-43ea-92b5-0689ae45f547" containerName="keystone-db-sync" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.485434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e783b9-adef-43ea-92b5-0689ae45f547" containerName="keystone-db-sync" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.485598 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e783b9-adef-43ea-92b5-0689ae45f547" containerName="keystone-db-sync" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.486090 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.487687 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"osp-secret" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.487850 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.487965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-69b5d" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.488181 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.488281 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.489658 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-5fd7v"] Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.679079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-credential-keys\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.679120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-config-data\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.679140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-fernet-keys\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.679208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrww2\" (UniqueName: \"kubernetes.io/projected/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-kube-api-access-jrww2\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.679467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-scripts\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.781088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrww2\" (UniqueName: \"kubernetes.io/projected/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-kube-api-access-jrww2\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" 
Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.781182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-scripts\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.781217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-credential-keys\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.781234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-config-data\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.781249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-fernet-keys\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.784605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-credential-keys\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.784657 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-fernet-keys\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.785163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-config-data\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.785450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-scripts\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.794558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrww2\" (UniqueName: \"kubernetes.io/projected/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-kube-api-access-jrww2\") pod \"keystone-bootstrap-5fd7v\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:26 crc kubenswrapper[4707]: I0121 16:40:26.799658 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.005119 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5"] Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.006068 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.008539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.008776 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9mqdl" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.016921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5"] Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.177858 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-5fd7v"] Jan 21 16:40:27 crc kubenswrapper[4707]: W0121 16:40:27.181592 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce37166_4b59_4b57_b5f8_9fcf99e2f700.slice/crio-fd2edb68613522fa9455949afcd069851e29d236037b57267fe3173b0f70b753 WatchSource:0}: Error finding container fd2edb68613522fa9455949afcd069851e29d236037b57267fe3173b0f70b753: Status 404 returned error can't find the container with id fd2edb68613522fa9455949afcd069851e29d236037b57267fe3173b0f70b753 Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.185772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6wq\" (UniqueName: \"kubernetes.io/projected/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-kube-api-access-9z6wq\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.185905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-webhook-cert\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.185936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-apiservice-cert\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.287068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6wq\" (UniqueName: \"kubernetes.io/projected/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-kube-api-access-9z6wq\") pod 
\"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.287362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-webhook-cert\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.287392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-apiservice-cert\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.290306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-webhook-cert\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.290403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-apiservice-cert\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.301191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6wq\" (UniqueName: \"kubernetes.io/projected/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-kube-api-access-9z6wq\") pod \"barbican-operator-controller-manager-5ccc48789b-vncz5\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.325772 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:27 crc kubenswrapper[4707]: I0121 16:40:27.687370 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5"] Jan 21 16:40:28 crc kubenswrapper[4707]: I0121 16:40:28.090115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" event={"ID":"0ce37166-4b59-4b57-b5f8-9fcf99e2f700","Type":"ContainerStarted","Data":"6fcec8f0281d17e70faad25a834607175020c27b5a29321fcb377913bb15593b"} Jan 21 16:40:28 crc kubenswrapper[4707]: I0121 16:40:28.090391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" event={"ID":"0ce37166-4b59-4b57-b5f8-9fcf99e2f700","Type":"ContainerStarted","Data":"fd2edb68613522fa9455949afcd069851e29d236037b57267fe3173b0f70b753"} Jan 21 16:40:28 crc kubenswrapper[4707]: I0121 16:40:28.091222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" event={"ID":"42d603ef-d0e3-46e3-9dbb-563a49c73e2e","Type":"ContainerStarted","Data":"602a9944c39514e518a8661780763494315b4391442daa9bb3326a35374d0174"} Jan 21 16:40:28 crc kubenswrapper[4707]: I0121 16:40:28.101488 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" podStartSLOduration=2.101477296 podStartE2EDuration="2.101477296s" podCreationTimestamp="2026-01-21 16:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:40:28.100910881 +0000 UTC m=+5925.282427103" watchObservedRunningTime="2026-01-21 16:40:28.101477296 +0000 UTC m=+5925.282993518" Jan 21 16:40:30 crc kubenswrapper[4707]: I0121 16:40:30.104924 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ce37166-4b59-4b57-b5f8-9fcf99e2f700" containerID="6fcec8f0281d17e70faad25a834607175020c27b5a29321fcb377913bb15593b" exitCode=0 Jan 21 16:40:30 crc kubenswrapper[4707]: I0121 16:40:30.104994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" event={"ID":"0ce37166-4b59-4b57-b5f8-9fcf99e2f700","Type":"ContainerDied","Data":"6fcec8f0281d17e70faad25a834607175020c27b5a29321fcb377913bb15593b"} Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.456732 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.555838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrww2\" (UniqueName: \"kubernetes.io/projected/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-kube-api-access-jrww2\") pod \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.555897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-fernet-keys\") pod \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.555919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-credential-keys\") pod \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.555979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-config-data\") pod \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.556001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-scripts\") pod \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\" (UID: \"0ce37166-4b59-4b57-b5f8-9fcf99e2f700\") " Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.561343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0ce37166-4b59-4b57-b5f8-9fcf99e2f700" (UID: "0ce37166-4b59-4b57-b5f8-9fcf99e2f700"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.562699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-kube-api-access-jrww2" (OuterVolumeSpecName: "kube-api-access-jrww2") pod "0ce37166-4b59-4b57-b5f8-9fcf99e2f700" (UID: "0ce37166-4b59-4b57-b5f8-9fcf99e2f700"). InnerVolumeSpecName "kube-api-access-jrww2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.562989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-scripts" (OuterVolumeSpecName: "scripts") pod "0ce37166-4b59-4b57-b5f8-9fcf99e2f700" (UID: "0ce37166-4b59-4b57-b5f8-9fcf99e2f700"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.563033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0ce37166-4b59-4b57-b5f8-9fcf99e2f700" (UID: "0ce37166-4b59-4b57-b5f8-9fcf99e2f700"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.573967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-config-data" (OuterVolumeSpecName: "config-data") pod "0ce37166-4b59-4b57-b5f8-9fcf99e2f700" (UID: "0ce37166-4b59-4b57-b5f8-9fcf99e2f700"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.659716 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.659745 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.659759 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrww2\" (UniqueName: \"kubernetes.io/projected/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-kube-api-access-jrww2\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.659769 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:31 crc kubenswrapper[4707]: I0121 16:40:31.659783 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ce37166-4b59-4b57-b5f8-9fcf99e2f700-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.120026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" event={"ID":"0ce37166-4b59-4b57-b5f8-9fcf99e2f700","Type":"ContainerDied","Data":"fd2edb68613522fa9455949afcd069851e29d236037b57267fe3173b0f70b753"} Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.120059 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd2edb68613522fa9455949afcd069851e29d236037b57267fe3173b0f70b753" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.120293 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-5fd7v" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.121130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" event={"ID":"42d603ef-d0e3-46e3-9dbb-563a49c73e2e","Type":"ContainerStarted","Data":"aa7c3c303eb643bb6549d509edba3cd9c298d9a0c4dfc9d43ddc855e4cbaef62"} Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.121510 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.133432 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" podStartSLOduration=2.377040887 podStartE2EDuration="6.133417089s" podCreationTimestamp="2026-01-21 16:40:26 +0000 UTC" firstStartedPulling="2026-01-21 16:40:27.690780124 +0000 UTC m=+5924.872296346" lastFinishedPulling="2026-01-21 16:40:31.447156327 +0000 UTC m=+5928.628672548" observedRunningTime="2026-01-21 16:40:32.131998763 +0000 UTC m=+5929.313514985" watchObservedRunningTime="2026-01-21 16:40:32.133417089 +0000 UTC m=+5929.314933311" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.175255 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-5f5bf8f646-hch88"] Jan 21 16:40:32 crc kubenswrapper[4707]: E0121 16:40:32.175534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce37166-4b59-4b57-b5f8-9fcf99e2f700" containerName="keystone-bootstrap" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.175552 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce37166-4b59-4b57-b5f8-9fcf99e2f700" containerName="keystone-bootstrap" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.175697 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce37166-4b59-4b57-b5f8-9fcf99e2f700" containerName="keystone-bootstrap" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.176139 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.177602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.177660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-69b5d" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.179052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.179080 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.185418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-5f5bf8f646-hch88"] Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.369297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-credential-keys\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.369401 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-scripts\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.369427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq2nt\" (UniqueName: \"kubernetes.io/projected/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-kube-api-access-cq2nt\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.369453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-config-data\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.369485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-fernet-keys\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.470521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-scripts\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.471210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq2nt\" (UniqueName: 
\"kubernetes.io/projected/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-kube-api-access-cq2nt\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.471297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-config-data\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.471392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-fernet-keys\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.471473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-credential-keys\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.473444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-scripts\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.474101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-credential-keys\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.474316 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-config-data\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.474491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-fernet-keys\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.483855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq2nt\" (UniqueName: \"kubernetes.io/projected/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-kube-api-access-cq2nt\") pod \"keystone-5f5bf8f646-hch88\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.489915 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:32 crc kubenswrapper[4707]: I0121 16:40:32.835333 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-5f5bf8f646-hch88"] Jan 21 16:40:33 crc kubenswrapper[4707]: I0121 16:40:33.127644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" event={"ID":"1addc64b-1eb7-43fd-ade9-e6ee00d007bb","Type":"ContainerStarted","Data":"a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba"} Jan 21 16:40:33 crc kubenswrapper[4707]: I0121 16:40:33.127879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" event={"ID":"1addc64b-1eb7-43fd-ade9-e6ee00d007bb","Type":"ContainerStarted","Data":"b24f506e0f994533c4707fd4bb820b1de105f0de0e1993eeb9b94c7530af824c"} Jan 21 16:40:33 crc kubenswrapper[4707]: I0121 16:40:33.142973 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" podStartSLOduration=1.1429598699999999 podStartE2EDuration="1.14295987s" podCreationTimestamp="2026-01-21 16:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:40:33.138203666 +0000 UTC m=+5930.319719889" watchObservedRunningTime="2026-01-21 16:40:33.14295987 +0000 UTC m=+5930.324476092" Jan 21 16:40:34 crc kubenswrapper[4707]: I0121 16:40:34.133024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:40:37 crc kubenswrapper[4707]: I0121 16:40:37.330388 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:40:39 crc kubenswrapper[4707]: I0121 16:40:39.945168 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:40:39 crc kubenswrapper[4707]: I0121 16:40:39.945938 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:41:03 crc kubenswrapper[4707]: I0121 16:41:03.743589 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.239648 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dzhdn"] Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.240547 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.243437 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx"] Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.244131 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.245190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.247095 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dzhdn"] Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.250234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5180a45c-609d-4485-9c3c-7313dba51835-operator-scripts\") pod \"barbican-1df1-account-create-update-72lkx\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.250306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4qr\" (UniqueName: \"kubernetes.io/projected/5180a45c-609d-4485-9c3c-7313dba51835-kube-api-access-kv4qr\") pod \"barbican-1df1-account-create-update-72lkx\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.250341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f591d9-2db6-4544-b3bd-067a7e396ed7-operator-scripts\") pod \"barbican-db-create-dzhdn\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.250358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knb79\" (UniqueName: \"kubernetes.io/projected/70f591d9-2db6-4544-b3bd-067a7e396ed7-kube-api-access-knb79\") pod \"barbican-db-create-dzhdn\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.253230 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx"] Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.351934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4qr\" (UniqueName: \"kubernetes.io/projected/5180a45c-609d-4485-9c3c-7313dba51835-kube-api-access-kv4qr\") pod \"barbican-1df1-account-create-update-72lkx\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.352153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f591d9-2db6-4544-b3bd-067a7e396ed7-operator-scripts\") pod \"barbican-db-create-dzhdn\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.352176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knb79\" (UniqueName: \"kubernetes.io/projected/70f591d9-2db6-4544-b3bd-067a7e396ed7-kube-api-access-knb79\") pod \"barbican-db-create-dzhdn\" (UID: 
\"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.352280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5180a45c-609d-4485-9c3c-7313dba51835-operator-scripts\") pod \"barbican-1df1-account-create-update-72lkx\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.352780 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f591d9-2db6-4544-b3bd-067a7e396ed7-operator-scripts\") pod \"barbican-db-create-dzhdn\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.352960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5180a45c-609d-4485-9c3c-7313dba51835-operator-scripts\") pod \"barbican-1df1-account-create-update-72lkx\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.365822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4qr\" (UniqueName: \"kubernetes.io/projected/5180a45c-609d-4485-9c3c-7313dba51835-kube-api-access-kv4qr\") pod \"barbican-1df1-account-create-update-72lkx\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.366005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knb79\" (UniqueName: \"kubernetes.io/projected/70f591d9-2db6-4544-b3bd-067a7e396ed7-kube-api-access-knb79\") pod \"barbican-db-create-dzhdn\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.554766 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:04 crc kubenswrapper[4707]: I0121 16:41:04.562425 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.027731 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx"] Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.071696 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dzhdn"] Jan 21 16:41:05 crc kubenswrapper[4707]: W0121 16:41:05.078954 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f591d9_2db6_4544_b3bd_067a7e396ed7.slice/crio-1723f3cd5972311e067fcee4b8c8be6ebb72020f5cb186c7122849ff8d488793 WatchSource:0}: Error finding container 1723f3cd5972311e067fcee4b8c8be6ebb72020f5cb186c7122849ff8d488793: Status 404 returned error can't find the container with id 1723f3cd5972311e067fcee4b8c8be6ebb72020f5cb186c7122849ff8d488793 Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.312055 4707 generic.go:334] "Generic (PLEG): container finished" podID="70f591d9-2db6-4544-b3bd-067a7e396ed7" containerID="576f5674412ac2a1e6076bb665079bac82bb159d1ebf65e1043479dec274a7cc" exitCode=0 Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.312121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" event={"ID":"70f591d9-2db6-4544-b3bd-067a7e396ed7","Type":"ContainerDied","Data":"576f5674412ac2a1e6076bb665079bac82bb159d1ebf65e1043479dec274a7cc"} Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.312146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" event={"ID":"70f591d9-2db6-4544-b3bd-067a7e396ed7","Type":"ContainerStarted","Data":"1723f3cd5972311e067fcee4b8c8be6ebb72020f5cb186c7122849ff8d488793"} Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.313111 4707 generic.go:334] "Generic (PLEG): container finished" podID="5180a45c-609d-4485-9c3c-7313dba51835" containerID="0c98e18ab9f0c33cce4561ff3c73b39b992159c742c9a7864333f94da41987c1" exitCode=0 Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.313140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" event={"ID":"5180a45c-609d-4485-9c3c-7313dba51835","Type":"ContainerDied","Data":"0c98e18ab9f0c33cce4561ff3c73b39b992159c742c9a7864333f94da41987c1"} Jan 21 16:41:05 crc kubenswrapper[4707]: I0121 16:41:05.313155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" event={"ID":"5180a45c-609d-4485-9c3c-7313dba51835","Type":"ContainerStarted","Data":"4bd0892d754380e7d2044c0341332711d11fc9a3f17bbec9b0ca3813815ad4d4"} Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.505588 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.576008 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.586072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv4qr\" (UniqueName: \"kubernetes.io/projected/5180a45c-609d-4485-9c3c-7313dba51835-kube-api-access-kv4qr\") pod \"5180a45c-609d-4485-9c3c-7313dba51835\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.586788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knb79\" (UniqueName: \"kubernetes.io/projected/70f591d9-2db6-4544-b3bd-067a7e396ed7-kube-api-access-knb79\") pod \"70f591d9-2db6-4544-b3bd-067a7e396ed7\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.587039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f591d9-2db6-4544-b3bd-067a7e396ed7-operator-scripts\") pod \"70f591d9-2db6-4544-b3bd-067a7e396ed7\" (UID: \"70f591d9-2db6-4544-b3bd-067a7e396ed7\") " Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.587199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5180a45c-609d-4485-9c3c-7313dba51835-operator-scripts\") pod \"5180a45c-609d-4485-9c3c-7313dba51835\" (UID: \"5180a45c-609d-4485-9c3c-7313dba51835\") " Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.587475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f591d9-2db6-4544-b3bd-067a7e396ed7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70f591d9-2db6-4544-b3bd-067a7e396ed7" (UID: "70f591d9-2db6-4544-b3bd-067a7e396ed7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.587533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5180a45c-609d-4485-9c3c-7313dba51835-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5180a45c-609d-4485-9c3c-7313dba51835" (UID: "5180a45c-609d-4485-9c3c-7313dba51835"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.587765 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5180a45c-609d-4485-9c3c-7313dba51835-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.587997 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f591d9-2db6-4544-b3bd-067a7e396ed7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.590629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5180a45c-609d-4485-9c3c-7313dba51835-kube-api-access-kv4qr" (OuterVolumeSpecName: "kube-api-access-kv4qr") pod "5180a45c-609d-4485-9c3c-7313dba51835" (UID: "5180a45c-609d-4485-9c3c-7313dba51835"). InnerVolumeSpecName "kube-api-access-kv4qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.590774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f591d9-2db6-4544-b3bd-067a7e396ed7-kube-api-access-knb79" (OuterVolumeSpecName: "kube-api-access-knb79") pod "70f591d9-2db6-4544-b3bd-067a7e396ed7" (UID: "70f591d9-2db6-4544-b3bd-067a7e396ed7"). InnerVolumeSpecName "kube-api-access-knb79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.688766 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv4qr\" (UniqueName: \"kubernetes.io/projected/5180a45c-609d-4485-9c3c-7313dba51835-kube-api-access-kv4qr\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:06 crc kubenswrapper[4707]: I0121 16:41:06.688803 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knb79\" (UniqueName: \"kubernetes.io/projected/70f591d9-2db6-4544-b3bd-067a7e396ed7-kube-api-access-knb79\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:07 crc kubenswrapper[4707]: I0121 16:41:07.323996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" event={"ID":"5180a45c-609d-4485-9c3c-7313dba51835","Type":"ContainerDied","Data":"4bd0892d754380e7d2044c0341332711d11fc9a3f17bbec9b0ca3813815ad4d4"} Jan 21 16:41:07 crc kubenswrapper[4707]: I0121 16:41:07.324016 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx" Jan 21 16:41:07 crc kubenswrapper[4707]: I0121 16:41:07.324025 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd0892d754380e7d2044c0341332711d11fc9a3f17bbec9b0ca3813815ad4d4" Jan 21 16:41:07 crc kubenswrapper[4707]: I0121 16:41:07.325655 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" Jan 21 16:41:07 crc kubenswrapper[4707]: I0121 16:41:07.325634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-dzhdn" event={"ID":"70f591d9-2db6-4544-b3bd-067a7e396ed7","Type":"ContainerDied","Data":"1723f3cd5972311e067fcee4b8c8be6ebb72020f5cb186c7122849ff8d488793"} Jan 21 16:41:07 crc kubenswrapper[4707]: I0121 16:41:07.325902 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1723f3cd5972311e067fcee4b8c8be6ebb72020f5cb186c7122849ff8d488793" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.480714 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-qslkh"] Jan 21 16:41:09 crc kubenswrapper[4707]: E0121 16:41:09.481168 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f591d9-2db6-4544-b3bd-067a7e396ed7" containerName="mariadb-database-create" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.481181 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f591d9-2db6-4544-b3bd-067a7e396ed7" containerName="mariadb-database-create" Jan 21 16:41:09 crc kubenswrapper[4707]: E0121 16:41:09.481191 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5180a45c-609d-4485-9c3c-7313dba51835" containerName="mariadb-account-create-update" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.481196 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5180a45c-609d-4485-9c3c-7313dba51835" containerName="mariadb-account-create-update" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.481303 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f591d9-2db6-4544-b3bd-067a7e396ed7" containerName="mariadb-database-create" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.481319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5180a45c-609d-4485-9c3c-7313dba51835" containerName="mariadb-account-create-update" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.481684 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.483954 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-hsc25" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.483986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.486483 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-qslkh"] Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.520698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c924eaa5-3fd8-43cf-97db-744d407efa92-db-sync-config-data\") pod \"barbican-db-sync-qslkh\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.520749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9snck\" (UniqueName: \"kubernetes.io/projected/c924eaa5-3fd8-43cf-97db-744d407efa92-kube-api-access-9snck\") pod \"barbican-db-sync-qslkh\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.622092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c924eaa5-3fd8-43cf-97db-744d407efa92-db-sync-config-data\") pod \"barbican-db-sync-qslkh\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.622307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9snck\" (UniqueName: \"kubernetes.io/projected/c924eaa5-3fd8-43cf-97db-744d407efa92-kube-api-access-9snck\") pod \"barbican-db-sync-qslkh\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.635107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9snck\" (UniqueName: \"kubernetes.io/projected/c924eaa5-3fd8-43cf-97db-744d407efa92-kube-api-access-9snck\") pod \"barbican-db-sync-qslkh\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.639642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c924eaa5-3fd8-43cf-97db-744d407efa92-db-sync-config-data\") pod \"barbican-db-sync-qslkh\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.798675 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.946146 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:41:09 crc kubenswrapper[4707]: I0121 16:41:09.946358 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:41:10 crc kubenswrapper[4707]: I0121 16:41:10.158426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-qslkh"] Jan 21 16:41:10 crc kubenswrapper[4707]: I0121 16:41:10.178931 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:41:10 crc kubenswrapper[4707]: I0121 16:41:10.340846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" event={"ID":"c924eaa5-3fd8-43cf-97db-744d407efa92","Type":"ContainerStarted","Data":"47b14115e3be9e01f16c65915da48b367a8633d668a5dfbacac6c52ea0037078"} Jan 21 16:41:11 crc kubenswrapper[4707]: I0121 16:41:11.348294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" event={"ID":"c924eaa5-3fd8-43cf-97db-744d407efa92","Type":"ContainerStarted","Data":"3e20dfe84e8d67d89b7fd51cef3bf529167d697147821e4879900d6dcd6e0690"} Jan 21 16:41:11 crc kubenswrapper[4707]: I0121 16:41:11.365457 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" podStartSLOduration=1.6790892099999999 podStartE2EDuration="2.365444121s" podCreationTimestamp="2026-01-21 16:41:09 +0000 UTC" firstStartedPulling="2026-01-21 16:41:10.178702095 +0000 UTC m=+5967.360218346" lastFinishedPulling="2026-01-21 16:41:10.865057035 +0000 UTC m=+5968.046573257" observedRunningTime="2026-01-21 16:41:11.359703086 +0000 UTC m=+5968.541219308" watchObservedRunningTime="2026-01-21 16:41:11.365444121 +0000 UTC m=+5968.546960343" Jan 21 16:41:12 crc kubenswrapper[4707]: I0121 16:41:12.356281 4707 generic.go:334] "Generic (PLEG): container finished" podID="c924eaa5-3fd8-43cf-97db-744d407efa92" containerID="3e20dfe84e8d67d89b7fd51cef3bf529167d697147821e4879900d6dcd6e0690" exitCode=0 Jan 21 16:41:12 crc kubenswrapper[4707]: I0121 16:41:12.356325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" event={"ID":"c924eaa5-3fd8-43cf-97db-744d407efa92","Type":"ContainerDied","Data":"3e20dfe84e8d67d89b7fd51cef3bf529167d697147821e4879900d6dcd6e0690"} Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.571240 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.690936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9snck\" (UniqueName: \"kubernetes.io/projected/c924eaa5-3fd8-43cf-97db-744d407efa92-kube-api-access-9snck\") pod \"c924eaa5-3fd8-43cf-97db-744d407efa92\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.691074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c924eaa5-3fd8-43cf-97db-744d407efa92-db-sync-config-data\") pod \"c924eaa5-3fd8-43cf-97db-744d407efa92\" (UID: \"c924eaa5-3fd8-43cf-97db-744d407efa92\") " Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.697573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c924eaa5-3fd8-43cf-97db-744d407efa92-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c924eaa5-3fd8-43cf-97db-744d407efa92" (UID: "c924eaa5-3fd8-43cf-97db-744d407efa92"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.699292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c924eaa5-3fd8-43cf-97db-744d407efa92-kube-api-access-9snck" (OuterVolumeSpecName: "kube-api-access-9snck") pod "c924eaa5-3fd8-43cf-97db-744d407efa92" (UID: "c924eaa5-3fd8-43cf-97db-744d407efa92"). InnerVolumeSpecName "kube-api-access-9snck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.792291 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c924eaa5-3fd8-43cf-97db-744d407efa92-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:13 crc kubenswrapper[4707]: I0121 16:41:13.792321 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9snck\" (UniqueName: \"kubernetes.io/projected/c924eaa5-3fd8-43cf-97db-744d407efa92-kube-api-access-9snck\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.369914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" event={"ID":"c924eaa5-3fd8-43cf-97db-744d407efa92","Type":"ContainerDied","Data":"47b14115e3be9e01f16c65915da48b367a8633d668a5dfbacac6c52ea0037078"} Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.369965 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b14115e3be9e01f16c65915da48b367a8633d668a5dfbacac6c52ea0037078" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.370027 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-qslkh" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.586401 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz"] Jan 21 16:41:14 crc kubenswrapper[4707]: E0121 16:41:14.586685 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c924eaa5-3fd8-43cf-97db-744d407efa92" containerName="barbican-db-sync" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.586699 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c924eaa5-3fd8-43cf-97db-744d407efa92" containerName="barbican-db-sync" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.586865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c924eaa5-3fd8-43cf-97db-744d407efa92" containerName="barbican-db-sync" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.587525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.591264 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22"] Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.591467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.592390 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.596773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-hsc25" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.597103 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz"] Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.601186 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.601267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.606590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.607055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d21db3-af91-4204-b12e-b3d6ebf7d927-logs\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.607375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data-custom\") pod 
\"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.607418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22"] Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.607499 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd4js\" (UniqueName: \"kubernetes.io/projected/d3d21db3-af91-4204-b12e-b3d6ebf7d927-kube-api-access-rd4js\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.690101 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6"] Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.691438 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.693105 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.701966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6"] Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-logs\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d21db3-af91-4204-b12e-b3d6ebf7d927-logs\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data-custom\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data-custom\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: 
\"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvl2\" (UniqueName: \"kubernetes.io/projected/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-kube-api-access-vvvl2\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.708942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd4js\" (UniqueName: \"kubernetes.io/projected/d3d21db3-af91-4204-b12e-b3d6ebf7d927-kube-api-access-rd4js\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.710730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d21db3-af91-4204-b12e-b3d6ebf7d927-logs\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.722718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data-custom\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.723222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.729617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd4js\" (UniqueName: \"kubernetes.io/projected/d3d21db3-af91-4204-b12e-b3d6ebf7d927-kube-api-access-rd4js\") pod \"barbican-keystone-listener-7d9f645fb-9g4qz\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x75z\" (UniqueName: \"kubernetes.io/projected/13884c41-dc53-4875-abd6-a420760411ca-kube-api-access-8x75z\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc 
kubenswrapper[4707]: I0121 16:41:14.810166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-logs\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data-custom\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data-custom\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13884c41-dc53-4875-abd6-a420760411ca-logs\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvvl2\" (UniqueName: \"kubernetes.io/projected/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-kube-api-access-vvvl2\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.810683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-logs\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.812894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data-custom\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: 
\"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.813958 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.824027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvvl2\" (UniqueName: \"kubernetes.io/projected/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-kube-api-access-vvvl2\") pod \"barbican-worker-7fbbbf4855-qbv22\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.907044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.910849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.912542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x75z\" (UniqueName: \"kubernetes.io/projected/13884c41-dc53-4875-abd6-a420760411ca-kube-api-access-8x75z\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.912739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.912854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data-custom\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.912963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13884c41-dc53-4875-abd6-a420760411ca-logs\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.913310 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13884c41-dc53-4875-abd6-a420760411ca-logs\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.916281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data\") pod 
\"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.916388 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data-custom\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:14 crc kubenswrapper[4707]: I0121 16:41:14.926671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x75z\" (UniqueName: \"kubernetes.io/projected/13884c41-dc53-4875-abd6-a420760411ca-kube-api-access-8x75z\") pod \"barbican-api-5df49d7c5b-rxmq6\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.011952 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.160880 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22"] Jan 21 16:41:15 crc kubenswrapper[4707]: W0121 16:41:15.164893 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd9499b0_ae10_4f51_ad30_1fdfe71f5360.slice/crio-44494e4861a724bfe5cc8f1a549af47d430c88f09b9dfeed1778e429945c27fb WatchSource:0}: Error finding container 44494e4861a724bfe5cc8f1a549af47d430c88f09b9dfeed1778e429945c27fb: Status 404 returned error can't find the container with id 44494e4861a724bfe5cc8f1a549af47d430c88f09b9dfeed1778e429945c27fb Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.318456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz"] Jan 21 16:41:15 crc kubenswrapper[4707]: W0121 16:41:15.320829 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d21db3_af91_4204_b12e_b3d6ebf7d927.slice/crio-604cf65c26bac2015d9f816063f315bd4fedab5c39770cc203c2f6870dbaeb3b WatchSource:0}: Error finding container 604cf65c26bac2015d9f816063f315bd4fedab5c39770cc203c2f6870dbaeb3b: Status 404 returned error can't find the container with id 604cf65c26bac2015d9f816063f315bd4fedab5c39770cc203c2f6870dbaeb3b Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.377109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" event={"ID":"d3d21db3-af91-4204-b12e-b3d6ebf7d927","Type":"ContainerStarted","Data":"604cf65c26bac2015d9f816063f315bd4fedab5c39770cc203c2f6870dbaeb3b"} Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.378489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" event={"ID":"fd9499b0-ae10-4f51-ad30-1fdfe71f5360","Type":"ContainerStarted","Data":"44494e4861a724bfe5cc8f1a549af47d430c88f09b9dfeed1778e429945c27fb"} Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.439775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6"] Jan 21 16:41:15 crc kubenswrapper[4707]: W0121 16:41:15.440576 4707 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13884c41_dc53_4875_abd6_a420760411ca.slice/crio-273788f645a04ecb4854919ae74eafd5b54b59a48d2813729f5874053df66d57 WatchSource:0}: Error finding container 273788f645a04ecb4854919ae74eafd5b54b59a48d2813729f5874053df66d57: Status 404 returned error can't find the container with id 273788f645a04ecb4854919ae74eafd5b54b59a48d2813729f5874053df66d57 Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.875777 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2"] Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.879258 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.893266 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2"] Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.939207 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxj28\" (UniqueName: \"kubernetes.io/projected/95716cad-313f-409b-91dd-16c34d72de50-kube-api-access-rxj28\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.939455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data-custom\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.939484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:15 crc kubenswrapper[4707]: I0121 16:41:15.939528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95716cad-313f-409b-91dd-16c34d72de50-logs\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.032781 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm"] Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.033760 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.040589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxj28\" (UniqueName: \"kubernetes.io/projected/95716cad-313f-409b-91dd-16c34d72de50-kube-api-access-rxj28\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.040633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data-custom\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.040662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.040703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95716cad-313f-409b-91dd-16c34d72de50-logs\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.041084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95716cad-313f-409b-91dd-16c34d72de50-logs\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.047195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data-custom\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.047327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm"] Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.047327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.057180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxj28\" (UniqueName: \"kubernetes.io/projected/95716cad-313f-409b-91dd-16c34d72de50-kube-api-access-rxj28\") pod \"barbican-api-5df49d7c5b-2c6c2\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.141654 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data-custom\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.141779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.141899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsw4\" (UniqueName: \"kubernetes.io/projected/af1d139b-c942-4da8-badf-8c6fa924e69f-kube-api-access-fqsw4\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.142044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1d139b-c942-4da8-badf-8c6fa924e69f-logs\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.228449 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766"] Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.229458 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.237446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766"] Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.243392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1d139b-c942-4da8-badf-8c6fa924e69f-logs\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.243460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data-custom\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.243502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.243531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsw4\" (UniqueName: \"kubernetes.io/projected/af1d139b-c942-4da8-badf-8c6fa924e69f-kube-api-access-fqsw4\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.244322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1d139b-c942-4da8-badf-8c6fa924e69f-logs\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.247469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data-custom\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.247729 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.258894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsw4\" (UniqueName: \"kubernetes.io/projected/af1d139b-c942-4da8-badf-8c6fa924e69f-kube-api-access-fqsw4\") pod \"barbican-keystone-listener-7d9f645fb-xz7jm\" (UID: 
\"af1d139b-c942-4da8-badf-8c6fa924e69f\") " pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.268886 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.346972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f28ff3-fd97-4d68-b709-ae5748de18df-logs\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.347120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xl8s\" (UniqueName: \"kubernetes.io/projected/12f28ff3-fd97-4d68-b709-ae5748de18df-kube-api-access-9xl8s\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.347151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data-custom\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.347251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.369205 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.390314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" event={"ID":"fd9499b0-ae10-4f51-ad30-1fdfe71f5360","Type":"ContainerStarted","Data":"a6c882136e33e7677cbe87e84da8f61e70dac9e2d35a2cc869156b8e6915b91e"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.390399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" event={"ID":"fd9499b0-ae10-4f51-ad30-1fdfe71f5360","Type":"ContainerStarted","Data":"2f3659fe26018e181bacb57caf64d0497c3e32074efc852e615487b7f87c68a5"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.393678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" event={"ID":"13884c41-dc53-4875-abd6-a420760411ca","Type":"ContainerStarted","Data":"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.393720 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" event={"ID":"13884c41-dc53-4875-abd6-a420760411ca","Type":"ContainerStarted","Data":"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.393733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" event={"ID":"13884c41-dc53-4875-abd6-a420760411ca","Type":"ContainerStarted","Data":"273788f645a04ecb4854919ae74eafd5b54b59a48d2813729f5874053df66d57"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.393748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.393839 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.400717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" event={"ID":"d3d21db3-af91-4204-b12e-b3d6ebf7d927","Type":"ContainerStarted","Data":"53f3023f09be7aa9f7ea509ee4f342afbd72afbdc6b1368488d4af2f6ba8350e"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.400744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" event={"ID":"d3d21db3-af91-4204-b12e-b3d6ebf7d927","Type":"ContainerStarted","Data":"542052daa7ca2e88781c75eac98de435a0f8fbdc1a59f5ff006aaa8365d8ff7d"} Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.406624 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" podStartSLOduration=1.826347212 podStartE2EDuration="2.406614477s" podCreationTimestamp="2026-01-21 16:41:14 +0000 UTC" firstStartedPulling="2026-01-21 16:41:15.167775652 +0000 UTC m=+5972.349291875" lastFinishedPulling="2026-01-21 16:41:15.748042928 +0000 UTC m=+5972.929559140" observedRunningTime="2026-01-21 16:41:16.40164282 +0000 UTC m=+5973.583159041" watchObservedRunningTime="2026-01-21 16:41:16.406614477 +0000 UTC m=+5973.588130699" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.431148 4707 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" podStartSLOduration=1.898093858 podStartE2EDuration="2.431133721s" podCreationTimestamp="2026-01-21 16:41:14 +0000 UTC" firstStartedPulling="2026-01-21 16:41:15.323922987 +0000 UTC m=+5972.505439208" lastFinishedPulling="2026-01-21 16:41:15.856962849 +0000 UTC m=+5973.038479071" observedRunningTime="2026-01-21 16:41:16.419038188 +0000 UTC m=+5973.600554410" watchObservedRunningTime="2026-01-21 16:41:16.431133721 +0000 UTC m=+5973.612649944" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.448409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.448548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f28ff3-fd97-4d68-b709-ae5748de18df-logs\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.448568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xl8s\" (UniqueName: \"kubernetes.io/projected/12f28ff3-fd97-4d68-b709-ae5748de18df-kube-api-access-9xl8s\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.448593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data-custom\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.452202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f28ff3-fd97-4d68-b709-ae5748de18df-logs\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.452410 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" podStartSLOduration=2.452393607 podStartE2EDuration="2.452393607s" podCreationTimestamp="2026-01-21 16:41:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:16.445390679 +0000 UTC m=+5973.626906900" watchObservedRunningTime="2026-01-21 16:41:16.452393607 +0000 UTC m=+5973.633909828" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.454370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 
16:41:16.454471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data-custom\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.465047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xl8s\" (UniqueName: \"kubernetes.io/projected/12f28ff3-fd97-4d68-b709-ae5748de18df-kube-api-access-9xl8s\") pod \"barbican-worker-7fbbbf4855-kn766\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.547366 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:16 crc kubenswrapper[4707]: I0121 16:41:16.639236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2"] Jan 21 16:41:16 crc kubenswrapper[4707]: W0121 16:41:16.639430 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95716cad_313f_409b_91dd_16c34d72de50.slice/crio-30c27fd2f43955cfc553dd72d7bb55e2beb86221b472149bf4ecbaa74e0c3f22 WatchSource:0}: Error finding container 30c27fd2f43955cfc553dd72d7bb55e2beb86221b472149bf4ecbaa74e0c3f22: Status 404 returned error can't find the container with id 30c27fd2f43955cfc553dd72d7bb55e2beb86221b472149bf4ecbaa74e0c3f22 Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.249563 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766"] Jan 21 16:41:17 crc kubenswrapper[4707]: W0121 16:41:17.252338 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f28ff3_fd97_4d68_b709_ae5748de18df.slice/crio-470003bd8ef9283e1243a64d3b3f703bc4c25b6f93d1803e988bbf92b4fcd3da WatchSource:0}: Error finding container 470003bd8ef9283e1243a64d3b3f703bc4c25b6f93d1803e988bbf92b4fcd3da: Status 404 returned error can't find the container with id 470003bd8ef9283e1243a64d3b3f703bc4c25b6f93d1803e988bbf92b4fcd3da Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.261789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm"] Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.273758 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2"] Jan 21 16:41:17 crc kubenswrapper[4707]: W0121 16:41:17.273949 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1d139b_c942_4da8_badf_8c6fa924e69f.slice/crio-2260fb2b4ccc08b5f3a00b78e0d90f0526d06ccecb591ad90e8c146e3a186b2a WatchSource:0}: Error finding container 2260fb2b4ccc08b5f3a00b78e0d90f0526d06ccecb591ad90e8c146e3a186b2a: Status 404 returned error can't find the container with id 2260fb2b4ccc08b5f3a00b78e0d90f0526d06ccecb591ad90e8c146e3a186b2a Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.430452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" 
event={"ID":"af1d139b-c942-4da8-badf-8c6fa924e69f","Type":"ContainerStarted","Data":"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.430624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" event={"ID":"af1d139b-c942-4da8-badf-8c6fa924e69f","Type":"ContainerStarted","Data":"2260fb2b4ccc08b5f3a00b78e0d90f0526d06ccecb591ad90e8c146e3a186b2a"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.432504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" event={"ID":"12f28ff3-fd97-4d68-b709-ae5748de18df","Type":"ContainerStarted","Data":"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.432554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" event={"ID":"12f28ff3-fd97-4d68-b709-ae5748de18df","Type":"ContainerStarted","Data":"470003bd8ef9283e1243a64d3b3f703bc4c25b6f93d1803e988bbf92b4fcd3da"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.439185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" event={"ID":"95716cad-313f-409b-91dd-16c34d72de50","Type":"ContainerStarted","Data":"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.439211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" event={"ID":"95716cad-313f-409b-91dd-16c34d72de50","Type":"ContainerStarted","Data":"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.439221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" event={"ID":"95716cad-313f-409b-91dd-16c34d72de50","Type":"ContainerStarted","Data":"30c27fd2f43955cfc553dd72d7bb55e2beb86221b472149bf4ecbaa74e0c3f22"} Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.462054 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" podStartSLOduration=2.462039891 podStartE2EDuration="2.462039891s" podCreationTimestamp="2026-01-21 16:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:17.459025553 +0000 UTC m=+5974.640541775" watchObservedRunningTime="2026-01-21 16:41:17.462039891 +0000 UTC m=+5974.643556114" Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.480479 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm"] Jan 21 16:41:17 crc kubenswrapper[4707]: I0121 16:41:17.654389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766"] Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.446785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" event={"ID":"af1d139b-c942-4da8-badf-8c6fa924e69f","Type":"ContainerStarted","Data":"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01"} Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.449881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" event={"ID":"12f28ff3-fd97-4d68-b709-ae5748de18df","Type":"ContainerStarted","Data":"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc"} Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.449957 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api" containerID="cri-o://340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.450018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.449935 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api-log" containerID="cri-o://22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.451721 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.464218 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" podStartSLOduration=2.464206051 podStartE2EDuration="2.464206051s" podCreationTimestamp="2026-01-21 16:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:18.460779488 +0000 UTC m=+5975.642295710" watchObservedRunningTime="2026-01-21 16:41:18.464206051 +0000 UTC m=+5975.645722273" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.476692 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" podStartSLOduration=2.47668263 podStartE2EDuration="2.47668263s" podCreationTimestamp="2026-01-21 16:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:18.474922791 +0000 UTC m=+5975.656439013" watchObservedRunningTime="2026-01-21 16:41:18.47668263 +0000 UTC m=+5975.658198852" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.698001 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6"] Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.698419 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api-log" containerID="cri-o://f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.698536 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api" containerID="cri-o://b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.790848 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.879508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxj28\" (UniqueName: \"kubernetes.io/projected/95716cad-313f-409b-91dd-16c34d72de50-kube-api-access-rxj28\") pod \"95716cad-313f-409b-91dd-16c34d72de50\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.879555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data-custom\") pod \"95716cad-313f-409b-91dd-16c34d72de50\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.879642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95716cad-313f-409b-91dd-16c34d72de50-logs\") pod \"95716cad-313f-409b-91dd-16c34d72de50\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.879669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data\") pod \"95716cad-313f-409b-91dd-16c34d72de50\" (UID: \"95716cad-313f-409b-91dd-16c34d72de50\") " Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.880670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95716cad-313f-409b-91dd-16c34d72de50-logs" (OuterVolumeSpecName: "logs") pod "95716cad-313f-409b-91dd-16c34d72de50" (UID: "95716cad-313f-409b-91dd-16c34d72de50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.886220 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95716cad-313f-409b-91dd-16c34d72de50-kube-api-access-rxj28" (OuterVolumeSpecName: "kube-api-access-rxj28") pod "95716cad-313f-409b-91dd-16c34d72de50" (UID: "95716cad-313f-409b-91dd-16c34d72de50"). InnerVolumeSpecName "kube-api-access-rxj28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.886308 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz"] Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.886484 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener-log" containerID="cri-o://542052daa7ca2e88781c75eac98de435a0f8fbdc1a59f5ff006aaa8365d8ff7d" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.886823 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener" containerID="cri-o://53f3023f09be7aa9f7ea509ee4f342afbd72afbdc6b1368488d4af2f6ba8350e" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.886767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95716cad-313f-409b-91dd-16c34d72de50" (UID: "95716cad-313f-409b-91dd-16c34d72de50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.915542 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data" (OuterVolumeSpecName: "config-data") pod "95716cad-313f-409b-91dd-16c34d72de50" (UID: "95716cad-313f-409b-91dd-16c34d72de50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.980924 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95716cad-313f-409b-91dd-16c34d72de50-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.980951 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.980963 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxj28\" (UniqueName: \"kubernetes.io/projected/95716cad-313f-409b-91dd-16c34d72de50-kube-api-access-rxj28\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.980972 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95716cad-313f-409b-91dd-16c34d72de50-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.995509 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22"] Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.995672 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker-log" containerID="cri-o://2f3659fe26018e181bacb57caf64d0497c3e32074efc852e615487b7f87c68a5" gracePeriod=30 Jan 21 16:41:18 crc kubenswrapper[4707]: I0121 16:41:18.995763 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker" containerID="cri-o://a6c882136e33e7677cbe87e84da8f61e70dac9e2d35a2cc869156b8e6915b91e" gracePeriod=30 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.118989 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.184067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x75z\" (UniqueName: \"kubernetes.io/projected/13884c41-dc53-4875-abd6-a420760411ca-kube-api-access-8x75z\") pod \"13884c41-dc53-4875-abd6-a420760411ca\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.184293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13884c41-dc53-4875-abd6-a420760411ca-logs\") pod \"13884c41-dc53-4875-abd6-a420760411ca\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.184936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13884c41-dc53-4875-abd6-a420760411ca-logs" (OuterVolumeSpecName: "logs") pod "13884c41-dc53-4875-abd6-a420760411ca" (UID: "13884c41-dc53-4875-abd6-a420760411ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.185199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data-custom\") pod \"13884c41-dc53-4875-abd6-a420760411ca\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.185569 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data\") pod \"13884c41-dc53-4875-abd6-a420760411ca\" (UID: \"13884c41-dc53-4875-abd6-a420760411ca\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.186354 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13884c41-dc53-4875-abd6-a420760411ca-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.187257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13884c41-dc53-4875-abd6-a420760411ca-kube-api-access-8x75z" (OuterVolumeSpecName: "kube-api-access-8x75z") pod "13884c41-dc53-4875-abd6-a420760411ca" (UID: "13884c41-dc53-4875-abd6-a420760411ca"). InnerVolumeSpecName "kube-api-access-8x75z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.187593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13884c41-dc53-4875-abd6-a420760411ca" (UID: "13884c41-dc53-4875-abd6-a420760411ca"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.213838 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data" (OuterVolumeSpecName: "config-data") pod "13884c41-dc53-4875-abd6-a420760411ca" (UID: "13884c41-dc53-4875-abd6-a420760411ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.287550 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x75z\" (UniqueName: \"kubernetes.io/projected/13884c41-dc53-4875-abd6-a420760411ca-kube-api-access-8x75z\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.287577 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.287588 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13884c41-dc53-4875-abd6-a420760411ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.460883 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerID="a6c882136e33e7677cbe87e84da8f61e70dac9e2d35a2cc869156b8e6915b91e" exitCode=0 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.460920 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerID="2f3659fe26018e181bacb57caf64d0497c3e32074efc852e615487b7f87c68a5" exitCode=143 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.460987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" event={"ID":"fd9499b0-ae10-4f51-ad30-1fdfe71f5360","Type":"ContainerDied","Data":"a6c882136e33e7677cbe87e84da8f61e70dac9e2d35a2cc869156b8e6915b91e"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.461016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" event={"ID":"fd9499b0-ae10-4f51-ad30-1fdfe71f5360","Type":"ContainerDied","Data":"2f3659fe26018e181bacb57caf64d0497c3e32074efc852e615487b7f87c68a5"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463347 4707 generic.go:334] "Generic (PLEG): container finished" podID="13884c41-dc53-4875-abd6-a420760411ca" containerID="b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e" exitCode=0 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463372 4707 generic.go:334] "Generic (PLEG): container finished" podID="13884c41-dc53-4875-abd6-a420760411ca" containerID="f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d" exitCode=143 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463416 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" event={"ID":"13884c41-dc53-4875-abd6-a420760411ca","Type":"ContainerDied","Data":"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" event={"ID":"13884c41-dc53-4875-abd6-a420760411ca","Type":"ContainerDied","Data":"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6" event={"ID":"13884c41-dc53-4875-abd6-a420760411ca","Type":"ContainerDied","Data":"273788f645a04ecb4854919ae74eafd5b54b59a48d2813729f5874053df66d57"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.463552 4707 scope.go:117] "RemoveContainer" containerID="b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.465888 4707 generic.go:334] "Generic (PLEG): container finished" podID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerID="53f3023f09be7aa9f7ea509ee4f342afbd72afbdc6b1368488d4af2f6ba8350e" exitCode=0 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.465911 4707 generic.go:334] "Generic (PLEG): container finished" podID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerID="542052daa7ca2e88781c75eac98de435a0f8fbdc1a59f5ff006aaa8365d8ff7d" exitCode=143 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.465957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" event={"ID":"d3d21db3-af91-4204-b12e-b3d6ebf7d927","Type":"ContainerDied","Data":"53f3023f09be7aa9f7ea509ee4f342afbd72afbdc6b1368488d4af2f6ba8350e"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.465981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" event={"ID":"d3d21db3-af91-4204-b12e-b3d6ebf7d927","Type":"ContainerDied","Data":"542052daa7ca2e88781c75eac98de435a0f8fbdc1a59f5ff006aaa8365d8ff7d"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.475524 4707 generic.go:334] "Generic (PLEG): container finished" podID="95716cad-313f-409b-91dd-16c34d72de50" containerID="340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e" exitCode=0 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.475541 4707 generic.go:334] "Generic (PLEG): container finished" podID="95716cad-313f-409b-91dd-16c34d72de50" containerID="22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096" exitCode=143 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.475734 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker-log" containerID="cri-o://487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf" gracePeriod=30 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.476060 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.476072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" event={"ID":"95716cad-313f-409b-91dd-16c34d72de50","Type":"ContainerDied","Data":"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.476096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" event={"ID":"95716cad-313f-409b-91dd-16c34d72de50","Type":"ContainerDied","Data":"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.476107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2" event={"ID":"95716cad-313f-409b-91dd-16c34d72de50","Type":"ContainerDied","Data":"30c27fd2f43955cfc553dd72d7bb55e2beb86221b472149bf4ecbaa74e0c3f22"} Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.476061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker" containerID="cri-o://8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc" gracePeriod=30 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.478501 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener-log" containerID="cri-o://757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073" gracePeriod=30 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.478844 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener" containerID="cri-o://4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01" gracePeriod=30 Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.498633 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6"] Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.500872 4707 scope.go:117] "RemoveContainer" containerID="f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.505553 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-rxmq6"] Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.512201 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2"] Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.517941 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-5df49d7c5b-2c6c2"] Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.518959 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.522433 4707 scope.go:117] "RemoveContainer" containerID="b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e" Jan 21 16:41:19 crc kubenswrapper[4707]: E0121 16:41:19.526993 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e\": container with ID starting with b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e not found: ID does not exist" containerID="b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.527028 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e"} err="failed to get container status \"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e\": rpc error: code = NotFound desc = could not find container \"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e\": container with ID starting with b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.527113 4707 scope.go:117] "RemoveContainer" containerID="f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d" Jan 21 16:41:19 crc kubenswrapper[4707]: E0121 16:41:19.533836 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d\": container with ID starting with f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d not found: ID does not exist" containerID="f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.534188 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d"} err="failed to get container status \"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d\": rpc error: code = NotFound desc = could not find container \"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d\": container with ID starting with f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.534209 4707 scope.go:117] "RemoveContainer" containerID="b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.534524 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e"} err="failed to get container status \"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e\": rpc error: code = NotFound desc = could not find container \"b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e\": container with ID starting with b7cfc6518b2529fcf49243ca90d886fc956c64f09fce971fe4cdaf483f6f6d8e not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.534563 4707 scope.go:117] "RemoveContainer" containerID="f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d" Jan 21 16:41:19 crc kubenswrapper[4707]: 
I0121 16:41:19.534893 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d"} err="failed to get container status \"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d\": rpc error: code = NotFound desc = could not find container \"f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d\": container with ID starting with f4a2eb85c875ce32dc50e7b988004196942a3afbfd95c33be4b572eb19b39c6d not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.534928 4707 scope.go:117] "RemoveContainer" containerID="340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.560456 4707 scope.go:117] "RemoveContainer" containerID="22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.581451 4707 scope.go:117] "RemoveContainer" containerID="340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e" Jan 21 16:41:19 crc kubenswrapper[4707]: E0121 16:41:19.581782 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e\": container with ID starting with 340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e not found: ID does not exist" containerID="340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.581834 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e"} err="failed to get container status \"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e\": rpc error: code = NotFound desc = could not find container \"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e\": container with ID starting with 340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.581855 4707 scope.go:117] "RemoveContainer" containerID="22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096" Jan 21 16:41:19 crc kubenswrapper[4707]: E0121 16:41:19.583064 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096\": container with ID starting with 22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096 not found: ID does not exist" containerID="22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.583101 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096"} err="failed to get container status \"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096\": rpc error: code = NotFound desc = could not find container \"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096\": container with ID starting with 22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096 not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.583126 4707 scope.go:117] "RemoveContainer" 
containerID="340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.584040 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e"} err="failed to get container status \"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e\": rpc error: code = NotFound desc = could not find container \"340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e\": container with ID starting with 340efce81fa7849dfc3a1c73a4981b8ff7ea4b92a35b0837feb45b0dc0b1a83e not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.584065 4707 scope.go:117] "RemoveContainer" containerID="22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.584297 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096"} err="failed to get container status \"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096\": rpc error: code = NotFound desc = could not find container \"22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096\": container with ID starting with 22c018eed8d08186a7cbbfb9a69af61d1ff2d34113a06be9189f22e70180a096 not found: ID does not exist" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.597030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data-custom\") pod \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.597097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-logs\") pod \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.597115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data\") pod \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.597287 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvvl2\" (UniqueName: \"kubernetes.io/projected/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-kube-api-access-vvvl2\") pod \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\" (UID: \"fd9499b0-ae10-4f51-ad30-1fdfe71f5360\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.597427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-logs" (OuterVolumeSpecName: "logs") pod "fd9499b0-ae10-4f51-ad30-1fdfe71f5360" (UID: "fd9499b0-ae10-4f51-ad30-1fdfe71f5360"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.597936 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.600823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-kube-api-access-vvvl2" (OuterVolumeSpecName: "kube-api-access-vvvl2") pod "fd9499b0-ae10-4f51-ad30-1fdfe71f5360" (UID: "fd9499b0-ae10-4f51-ad30-1fdfe71f5360"). InnerVolumeSpecName "kube-api-access-vvvl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.600899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fd9499b0-ae10-4f51-ad30-1fdfe71f5360" (UID: "fd9499b0-ae10-4f51-ad30-1fdfe71f5360"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.615028 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.627839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data" (OuterVolumeSpecName: "config-data") pod "fd9499b0-ae10-4f51-ad30-1fdfe71f5360" (UID: "fd9499b0-ae10-4f51-ad30-1fdfe71f5360"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.699433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data\") pod \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.699552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data-custom\") pod \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.699590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d21db3-af91-4204-b12e-b3d6ebf7d927-logs\") pod \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.699638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd4js\" (UniqueName: \"kubernetes.io/projected/d3d21db3-af91-4204-b12e-b3d6ebf7d927-kube-api-access-rd4js\") pod \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\" (UID: \"d3d21db3-af91-4204-b12e-b3d6ebf7d927\") " Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.699976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d21db3-af91-4204-b12e-b3d6ebf7d927-logs" (OuterVolumeSpecName: "logs") pod "d3d21db3-af91-4204-b12e-b3d6ebf7d927" (UID: "d3d21db3-af91-4204-b12e-b3d6ebf7d927"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.700140 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvvl2\" (UniqueName: \"kubernetes.io/projected/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-kube-api-access-vvvl2\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.700161 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.700172 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d21db3-af91-4204-b12e-b3d6ebf7d927-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.700183 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9499b0-ae10-4f51-ad30-1fdfe71f5360-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.702033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d21db3-af91-4204-b12e-b3d6ebf7d927-kube-api-access-rd4js" (OuterVolumeSpecName: "kube-api-access-rd4js") pod "d3d21db3-af91-4204-b12e-b3d6ebf7d927" (UID: "d3d21db3-af91-4204-b12e-b3d6ebf7d927"). InnerVolumeSpecName "kube-api-access-rd4js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.702718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d3d21db3-af91-4204-b12e-b3d6ebf7d927" (UID: "d3d21db3-af91-4204-b12e-b3d6ebf7d927"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.732610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data" (OuterVolumeSpecName: "config-data") pod "d3d21db3-af91-4204-b12e-b3d6ebf7d927" (UID: "d3d21db3-af91-4204-b12e-b3d6ebf7d927"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.802027 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.802062 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d21db3-af91-4204-b12e-b3d6ebf7d927-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:19 crc kubenswrapper[4707]: I0121 16:41:19.802075 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd4js\" (UniqueName: \"kubernetes.io/projected/d3d21db3-af91-4204-b12e-b3d6ebf7d927-kube-api-access-rd4js\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.072610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.077778 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data\") pod \"12f28ff3-fd97-4d68-b709-ae5748de18df\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1d139b-c942-4da8-badf-8c6fa924e69f-logs\") pod \"af1d139b-c942-4da8-badf-8c6fa924e69f\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsw4\" (UniqueName: \"kubernetes.io/projected/af1d139b-c942-4da8-badf-8c6fa924e69f-kube-api-access-fqsw4\") pod \"af1d139b-c942-4da8-badf-8c6fa924e69f\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f28ff3-fd97-4d68-b709-ae5748de18df-logs\") pod \"12f28ff3-fd97-4d68-b709-ae5748de18df\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data-custom\") pod \"af1d139b-c942-4da8-badf-8c6fa924e69f\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106661 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data-custom\") pod \"12f28ff3-fd97-4d68-b709-ae5748de18df\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data\") pod \"af1d139b-c942-4da8-badf-8c6fa924e69f\" (UID: \"af1d139b-c942-4da8-badf-8c6fa924e69f\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.106919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xl8s\" (UniqueName: \"kubernetes.io/projected/12f28ff3-fd97-4d68-b709-ae5748de18df-kube-api-access-9xl8s\") pod \"12f28ff3-fd97-4d68-b709-ae5748de18df\" (UID: \"12f28ff3-fd97-4d68-b709-ae5748de18df\") " Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.107510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1d139b-c942-4da8-badf-8c6fa924e69f-logs" (OuterVolumeSpecName: "logs") pod "af1d139b-c942-4da8-badf-8c6fa924e69f" (UID: "af1d139b-c942-4da8-badf-8c6fa924e69f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.107859 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1d139b-c942-4da8-badf-8c6fa924e69f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.108756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12f28ff3-fd97-4d68-b709-ae5748de18df-logs" (OuterVolumeSpecName: "logs") pod "12f28ff3-fd97-4d68-b709-ae5748de18df" (UID: "12f28ff3-fd97-4d68-b709-ae5748de18df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.111988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af1d139b-c942-4da8-badf-8c6fa924e69f" (UID: "af1d139b-c942-4da8-badf-8c6fa924e69f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.114525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "12f28ff3-fd97-4d68-b709-ae5748de18df" (UID: "12f28ff3-fd97-4d68-b709-ae5748de18df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.116468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f28ff3-fd97-4d68-b709-ae5748de18df-kube-api-access-9xl8s" (OuterVolumeSpecName: "kube-api-access-9xl8s") pod "12f28ff3-fd97-4d68-b709-ae5748de18df" (UID: "12f28ff3-fd97-4d68-b709-ae5748de18df"). InnerVolumeSpecName "kube-api-access-9xl8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.117312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1d139b-c942-4da8-badf-8c6fa924e69f-kube-api-access-fqsw4" (OuterVolumeSpecName: "kube-api-access-fqsw4") pod "af1d139b-c942-4da8-badf-8c6fa924e69f" (UID: "af1d139b-c942-4da8-badf-8c6fa924e69f"). InnerVolumeSpecName "kube-api-access-fqsw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.164471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data" (OuterVolumeSpecName: "config-data") pod "af1d139b-c942-4da8-badf-8c6fa924e69f" (UID: "af1d139b-c942-4da8-badf-8c6fa924e69f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.170488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data" (OuterVolumeSpecName: "config-data") pod "12f28ff3-fd97-4d68-b709-ae5748de18df" (UID: "12f28ff3-fd97-4d68-b709-ae5748de18df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.172895 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-qslkh"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.180476 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-qslkh"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212725 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xl8s\" (UniqueName: \"kubernetes.io/projected/12f28ff3-fd97-4d68-b709-ae5748de18df-kube-api-access-9xl8s\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212756 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212769 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsw4\" (UniqueName: \"kubernetes.io/projected/af1d139b-c942-4da8-badf-8c6fa924e69f-kube-api-access-fqsw4\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212780 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212801 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f28ff3-fd97-4d68-b709-ae5748de18df-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212851 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f28ff3-fd97-4d68-b709-ae5748de18df-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.212863 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1d139b-c942-4da8-badf-8c6fa924e69f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219094 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican1df1-account-delete-x2vrj"] Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api-log" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219458 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219464 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker-log" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219475 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219481 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener-log" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219497 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219503 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219519 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219529 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219534 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api-log" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219542 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219547 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219558 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219563 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219570 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219577 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener-log" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219584 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219589 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219600 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.219609 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" 
containerName="barbican-worker-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219614 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219768 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219777 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219783 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13884c41-dc53-4875-abd6-a420760411ca" containerName="barbican-api-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219789 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerName="barbican-keystone-listener" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219906 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" containerName="barbican-worker-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219924 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219934 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerName="barbican-worker" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219943 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="95716cad-313f-409b-91dd-16c34d72de50" containerName="barbican-api-log" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.219952 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" containerName="barbican-keystone-listener" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.220434 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.223575 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican1df1-account-delete-x2vrj"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.314275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpv9q\" (UniqueName: \"kubernetes.io/projected/bb99a868-2e7f-4260-b635-f8e7aa20d878-kube-api-access-gpv9q\") pod \"barbican1df1-account-delete-x2vrj\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.314760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb99a868-2e7f-4260-b635-f8e7aa20d878-operator-scripts\") pod \"barbican1df1-account-delete-x2vrj\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.416566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb99a868-2e7f-4260-b635-f8e7aa20d878-operator-scripts\") pod \"barbican1df1-account-delete-x2vrj\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.416657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpv9q\" (UniqueName: \"kubernetes.io/projected/bb99a868-2e7f-4260-b635-f8e7aa20d878-kube-api-access-gpv9q\") pod \"barbican1df1-account-delete-x2vrj\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.417413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb99a868-2e7f-4260-b635-f8e7aa20d878-operator-scripts\") pod \"barbican1df1-account-delete-x2vrj\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.432442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpv9q\" (UniqueName: \"kubernetes.io/projected/bb99a868-2e7f-4260-b635-f8e7aa20d878-kube-api-access-gpv9q\") pod \"barbican1df1-account-delete-x2vrj\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.482932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" event={"ID":"d3d21db3-af91-4204-b12e-b3d6ebf7d927","Type":"ContainerDied","Data":"604cf65c26bac2015d9f816063f315bd4fedab5c39770cc203c2f6870dbaeb3b"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.482984 4707 scope.go:117] "RemoveContainer" containerID="53f3023f09be7aa9f7ea509ee4f342afbd72afbdc6b1368488d4af2f6ba8350e" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.482954 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.485410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" event={"ID":"fd9499b0-ae10-4f51-ad30-1fdfe71f5360","Type":"ContainerDied","Data":"44494e4861a724bfe5cc8f1a549af47d430c88f09b9dfeed1778e429945c27fb"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.485449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.496579 4707 generic.go:334] "Generic (PLEG): container finished" podID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerID="4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01" exitCode=0 Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.496604 4707 generic.go:334] "Generic (PLEG): container finished" podID="af1d139b-c942-4da8-badf-8c6fa924e69f" containerID="757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073" exitCode=143 Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.496615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" event={"ID":"af1d139b-c942-4da8-badf-8c6fa924e69f","Type":"ContainerDied","Data":"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.496669 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" event={"ID":"af1d139b-c942-4da8-badf-8c6fa924e69f","Type":"ContainerDied","Data":"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.496683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" event={"ID":"af1d139b-c942-4da8-badf-8c6fa924e69f","Type":"ContainerDied","Data":"2260fb2b4ccc08b5f3a00b78e0d90f0526d06ccecb591ad90e8c146e3a186b2a"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.496630 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.497754 4707 generic.go:334] "Generic (PLEG): container finished" podID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerID="8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc" exitCode=0 Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.497772 4707 generic.go:334] "Generic (PLEG): container finished" podID="12f28ff3-fd97-4d68-b709-ae5748de18df" containerID="487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf" exitCode=143 Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.497845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" event={"ID":"12f28ff3-fd97-4d68-b709-ae5748de18df","Type":"ContainerDied","Data":"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.497863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" event={"ID":"12f28ff3-fd97-4d68-b709-ae5748de18df","Type":"ContainerDied","Data":"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.497875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" event={"ID":"12f28ff3-fd97-4d68-b709-ae5748de18df","Type":"ContainerDied","Data":"470003bd8ef9283e1243a64d3b3f703bc4c25b6f93d1803e988bbf92b4fcd3da"} Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.497917 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.503481 4707 scope.go:117] "RemoveContainer" containerID="542052daa7ca2e88781c75eac98de435a0f8fbdc1a59f5ff006aaa8365d8ff7d" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.520542 4707 scope.go:117] "RemoveContainer" containerID="a6c882136e33e7677cbe87e84da8f61e70dac9e2d35a2cc869156b8e6915b91e" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.533502 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.533821 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.533881 4707 scope.go:117] "RemoveContainer" containerID="2f3659fe26018e181bacb57caf64d0497c3e32074efc852e615487b7f87c68a5" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.540711 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-9g4qz"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.547919 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.554618 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-kn766"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.559591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.560960 4707 scope.go:117] "RemoveContainer" containerID="4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.563963 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-7fbbbf4855-qbv22"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.569177 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.573869 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-7d9f645fb-xz7jm"] Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.575456 4707 scope.go:117] "RemoveContainer" containerID="757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.588721 4707 scope.go:117] "RemoveContainer" containerID="4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.589090 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01\": container with ID starting with 4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01 not found: ID does not exist" containerID="4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.589120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01"} err="failed to get container status \"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01\": rpc error: code = NotFound desc = could not find container \"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01\": container with ID starting with 4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01 not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.589143 4707 scope.go:117] "RemoveContainer" containerID="757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073" Jan 21 16:41:20 crc kubenswrapper[4707]: 
E0121 16:41:20.589488 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073\": container with ID starting with 757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073 not found: ID does not exist" containerID="757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.589523 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073"} err="failed to get container status \"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073\": rpc error: code = NotFound desc = could not find container \"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073\": container with ID starting with 757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073 not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.589547 4707 scope.go:117] "RemoveContainer" containerID="4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.589780 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01"} err="failed to get container status \"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01\": rpc error: code = NotFound desc = could not find container \"4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01\": container with ID starting with 4018b80d081177cdcf6750c5a51242c2e81576169d69fdffc5247c5425369e01 not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.589840 4707 scope.go:117] "RemoveContainer" containerID="757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.590047 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073"} err="failed to get container status \"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073\": rpc error: code = NotFound desc = could not find container \"757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073\": container with ID starting with 757d6fb63efff34a62780b0bcf0c3aaa1e48eea4cd99c075b0def2538addb073 not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.590069 4707 scope.go:117] "RemoveContainer" containerID="8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.617967 4707 scope.go:117] "RemoveContainer" containerID="487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.648112 4707 scope.go:117] "RemoveContainer" containerID="8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.648634 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc\": container with ID starting with 8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc not found: ID does not exist" 
containerID="8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.648685 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc"} err="failed to get container status \"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc\": rpc error: code = NotFound desc = could not find container \"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc\": container with ID starting with 8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.648714 4707 scope.go:117] "RemoveContainer" containerID="487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf" Jan 21 16:41:20 crc kubenswrapper[4707]: E0121 16:41:20.649195 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf\": container with ID starting with 487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf not found: ID does not exist" containerID="487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.649223 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf"} err="failed to get container status \"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf\": rpc error: code = NotFound desc = could not find container \"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf\": container with ID starting with 487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.649240 4707 scope.go:117] "RemoveContainer" containerID="8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.649566 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc"} err="failed to get container status \"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc\": rpc error: code = NotFound desc = could not find container \"8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc\": container with ID starting with 8a738cbe7f1d58b4947f69829933d350b67794258481f6f099edd10938d673fc not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.649600 4707 scope.go:117] "RemoveContainer" containerID="487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.650081 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf"} err="failed to get container status \"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf\": rpc error: code = NotFound desc = could not find container \"487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf\": container with ID starting with 487f9f3a0e56532ed68c5f8d8d569e4654e76b242cf7ba4a2f480564a70aa0cf not found: ID does not exist" Jan 21 16:41:20 crc kubenswrapper[4707]: I0121 16:41:20.930241 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["barbican-kuttl-tests/barbican1df1-account-delete-x2vrj"] Jan 21 16:41:20 crc kubenswrapper[4707]: W0121 16:41:20.942060 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb99a868_2e7f_4260_b635_f8e7aa20d878.slice/crio-ec2a03f28565cc1d63f84d059764e18356a8db01885ba3e4a90caf51cfb929fe WatchSource:0}: Error finding container ec2a03f28565cc1d63f84d059764e18356a8db01885ba3e4a90caf51cfb929fe: Status 404 returned error can't find the container with id ec2a03f28565cc1d63f84d059764e18356a8db01885ba3e4a90caf51cfb929fe Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.191387 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f28ff3-fd97-4d68-b709-ae5748de18df" path="/var/lib/kubelet/pods/12f28ff3-fd97-4d68-b709-ae5748de18df/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.192130 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13884c41-dc53-4875-abd6-a420760411ca" path="/var/lib/kubelet/pods/13884c41-dc53-4875-abd6-a420760411ca/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.192676 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95716cad-313f-409b-91dd-16c34d72de50" path="/var/lib/kubelet/pods/95716cad-313f-409b-91dd-16c34d72de50/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.193215 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1d139b-c942-4da8-badf-8c6fa924e69f" path="/var/lib/kubelet/pods/af1d139b-c942-4da8-badf-8c6fa924e69f/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.193679 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c924eaa5-3fd8-43cf-97db-744d407efa92" path="/var/lib/kubelet/pods/c924eaa5-3fd8-43cf-97db-744d407efa92/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.194862 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d21db3-af91-4204-b12e-b3d6ebf7d927" path="/var/lib/kubelet/pods/d3d21db3-af91-4204-b12e-b3d6ebf7d927/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.195364 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9499b0-ae10-4f51-ad30-1fdfe71f5360" path="/var/lib/kubelet/pods/fd9499b0-ae10-4f51-ad30-1fdfe71f5360/volumes" Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.515394 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb99a868-2e7f-4260-b635-f8e7aa20d878" containerID="b21c8d8232a8c8d7f46afcd7a49e8f543e96ab2fb7d3dcb2dac1dbb74cd5996c" exitCode=0 Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.515463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" event={"ID":"bb99a868-2e7f-4260-b635-f8e7aa20d878","Type":"ContainerDied","Data":"b21c8d8232a8c8d7f46afcd7a49e8f543e96ab2fb7d3dcb2dac1dbb74cd5996c"} Jan 21 16:41:21 crc kubenswrapper[4707]: I0121 16:41:21.515487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" event={"ID":"bb99a868-2e7f-4260-b635-f8e7aa20d878","Type":"ContainerStarted","Data":"ec2a03f28565cc1d63f84d059764e18356a8db01885ba3e4a90caf51cfb929fe"} Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.782903 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.853353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpv9q\" (UniqueName: \"kubernetes.io/projected/bb99a868-2e7f-4260-b635-f8e7aa20d878-kube-api-access-gpv9q\") pod \"bb99a868-2e7f-4260-b635-f8e7aa20d878\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.853500 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb99a868-2e7f-4260-b635-f8e7aa20d878-operator-scripts\") pod \"bb99a868-2e7f-4260-b635-f8e7aa20d878\" (UID: \"bb99a868-2e7f-4260-b635-f8e7aa20d878\") " Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.854189 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb99a868-2e7f-4260-b635-f8e7aa20d878-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb99a868-2e7f-4260-b635-f8e7aa20d878" (UID: "bb99a868-2e7f-4260-b635-f8e7aa20d878"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.858108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb99a868-2e7f-4260-b635-f8e7aa20d878-kube-api-access-gpv9q" (OuterVolumeSpecName: "kube-api-access-gpv9q") pod "bb99a868-2e7f-4260-b635-f8e7aa20d878" (UID: "bb99a868-2e7f-4260-b635-f8e7aa20d878"). InnerVolumeSpecName "kube-api-access-gpv9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.955819 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb99a868-2e7f-4260-b635-f8e7aa20d878-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:22 crc kubenswrapper[4707]: I0121 16:41:22.955849 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpv9q\" (UniqueName: \"kubernetes.io/projected/bb99a868-2e7f-4260-b635-f8e7aa20d878-kube-api-access-gpv9q\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:23 crc kubenswrapper[4707]: I0121 16:41:23.534842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" event={"ID":"bb99a868-2e7f-4260-b635-f8e7aa20d878","Type":"ContainerDied","Data":"ec2a03f28565cc1d63f84d059764e18356a8db01885ba3e4a90caf51cfb929fe"} Jan 21 16:41:23 crc kubenswrapper[4707]: I0121 16:41:23.534885 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec2a03f28565cc1d63f84d059764e18356a8db01885ba3e4a90caf51cfb929fe" Jan 21 16:41:23 crc kubenswrapper[4707]: I0121 16:41:23.534909 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican1df1-account-delete-x2vrj" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.229476 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dzhdn"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.234203 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-dzhdn"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.238891 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican1df1-account-delete-x2vrj"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.243890 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.248285 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican1df1-account-delete-x2vrj"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.254337 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-1df1-account-create-update-72lkx"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.326180 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-gfwjv"] Jan 21 16:41:25 crc kubenswrapper[4707]: E0121 16:41:25.326440 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb99a868-2e7f-4260-b635-f8e7aa20d878" containerName="mariadb-account-delete" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.326458 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb99a868-2e7f-4260-b635-f8e7aa20d878" containerName="mariadb-account-delete" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.326613 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb99a868-2e7f-4260-b635-f8e7aa20d878" containerName="mariadb-account-delete" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.327046 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.332663 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-gfwjv"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.392323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgns\" (UniqueName: \"kubernetes.io/projected/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-kube-api-access-hlgns\") pod \"barbican-db-create-gfwjv\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.392394 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-operator-scripts\") pod \"barbican-db-create-gfwjv\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.428707 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.429492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.432019 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.434095 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz"] Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.493615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nb4\" (UniqueName: \"kubernetes.io/projected/fc696c40-1172-4cbc-8f11-d352fa6d505c-kube-api-access-h2nb4\") pod \"barbican-99e0-account-create-update-7zrfz\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.493668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgns\" (UniqueName: \"kubernetes.io/projected/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-kube-api-access-hlgns\") pod \"barbican-db-create-gfwjv\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.494007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-operator-scripts\") pod \"barbican-db-create-gfwjv\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.494172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc696c40-1172-4cbc-8f11-d352fa6d505c-operator-scripts\") pod \"barbican-99e0-account-create-update-7zrfz\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.494665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-operator-scripts\") pod \"barbican-db-create-gfwjv\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.507664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgns\" (UniqueName: \"kubernetes.io/projected/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-kube-api-access-hlgns\") pod \"barbican-db-create-gfwjv\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.596030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc696c40-1172-4cbc-8f11-d352fa6d505c-operator-scripts\") pod \"barbican-99e0-account-create-update-7zrfz\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.596272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2nb4\" 
(UniqueName: \"kubernetes.io/projected/fc696c40-1172-4cbc-8f11-d352fa6d505c-kube-api-access-h2nb4\") pod \"barbican-99e0-account-create-update-7zrfz\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.596749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc696c40-1172-4cbc-8f11-d352fa6d505c-operator-scripts\") pod \"barbican-99e0-account-create-update-7zrfz\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.609182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2nb4\" (UniqueName: \"kubernetes.io/projected/fc696c40-1172-4cbc-8f11-d352fa6d505c-kube-api-access-h2nb4\") pod \"barbican-99e0-account-create-update-7zrfz\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.610679 4707 scope.go:117] "RemoveContainer" containerID="c8706e781ecb5649e8dd3c7e282f7ed7219d0311cc182b25012283a0950cdc7b" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.643475 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:25 crc kubenswrapper[4707]: I0121 16:41:25.742503 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.001445 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-gfwjv"] Jan 21 16:41:26 crc kubenswrapper[4707]: W0121 16:41:26.003875 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d97d080_2a30_4bf1_be94_1aca3a2ac64f.slice/crio-873306f0654bbd960e795564315d3f4e8f96921495fd9af3e6e5a619f546f77b WatchSource:0}: Error finding container 873306f0654bbd960e795564315d3f4e8f96921495fd9af3e6e5a619f546f77b: Status 404 returned error can't find the container with id 873306f0654bbd960e795564315d3f4e8f96921495fd9af3e6e5a619f546f77b Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.090610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz"] Jan 21 16:41:26 crc kubenswrapper[4707]: W0121 16:41:26.092998 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc696c40_1172_4cbc_8f11_d352fa6d505c.slice/crio-b19e55f5a4aba8e638e66e487d838e67deb29af928ac7e2ba49a8fe5be626539 WatchSource:0}: Error finding container b19e55f5a4aba8e638e66e487d838e67deb29af928ac7e2ba49a8fe5be626539: Status 404 returned error can't find the container with id b19e55f5a4aba8e638e66e487d838e67deb29af928ac7e2ba49a8fe5be626539 Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.553905 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc696c40-1172-4cbc-8f11-d352fa6d505c" containerID="c906f3310d7f7f3df2006d95256690fadeca173ddf94a55a08ad91ffc470d8df" exitCode=0 Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.553959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" event={"ID":"fc696c40-1172-4cbc-8f11-d352fa6d505c","Type":"ContainerDied","Data":"c906f3310d7f7f3df2006d95256690fadeca173ddf94a55a08ad91ffc470d8df"} Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.554195 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" event={"ID":"fc696c40-1172-4cbc-8f11-d352fa6d505c","Type":"ContainerStarted","Data":"b19e55f5a4aba8e638e66e487d838e67deb29af928ac7e2ba49a8fe5be626539"} Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.555651 4707 generic.go:334] "Generic (PLEG): container finished" podID="3d97d080-2a30-4bf1-be94-1aca3a2ac64f" containerID="4dd3d2e38dcf334ce48b4b06f066a14b94980e68c5b426763c0c96a9e5cd2465" exitCode=0 Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.555691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" event={"ID":"3d97d080-2a30-4bf1-be94-1aca3a2ac64f","Type":"ContainerDied","Data":"4dd3d2e38dcf334ce48b4b06f066a14b94980e68c5b426763c0c96a9e5cd2465"} Jan 21 16:41:26 crc kubenswrapper[4707]: I0121 16:41:26.555712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" event={"ID":"3d97d080-2a30-4bf1-be94-1aca3a2ac64f","Type":"ContainerStarted","Data":"873306f0654bbd960e795564315d3f4e8f96921495fd9af3e6e5a619f546f77b"} Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.189935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5180a45c-609d-4485-9c3c-7313dba51835" path="/var/lib/kubelet/pods/5180a45c-609d-4485-9c3c-7313dba51835/volumes" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.190412 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f591d9-2db6-4544-b3bd-067a7e396ed7" path="/var/lib/kubelet/pods/70f591d9-2db6-4544-b3bd-067a7e396ed7/volumes" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.190861 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb99a868-2e7f-4260-b635-f8e7aa20d878" path="/var/lib/kubelet/pods/bb99a868-2e7f-4260-b635-f8e7aa20d878/volumes" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.807721 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.852918 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.927875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2nb4\" (UniqueName: \"kubernetes.io/projected/fc696c40-1172-4cbc-8f11-d352fa6d505c-kube-api-access-h2nb4\") pod \"fc696c40-1172-4cbc-8f11-d352fa6d505c\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc696c40-1172-4cbc-8f11-d352fa6d505c-operator-scripts\") pod \"fc696c40-1172-4cbc-8f11-d352fa6d505c\" (UID: \"fc696c40-1172-4cbc-8f11-d352fa6d505c\") " Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-operator-scripts\") pod \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlgns\" (UniqueName: \"kubernetes.io/projected/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-kube-api-access-hlgns\") pod \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\" (UID: \"3d97d080-2a30-4bf1-be94-1aca3a2ac64f\") " Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc696c40-1172-4cbc-8f11-d352fa6d505c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc696c40-1172-4cbc-8f11-d352fa6d505c" (UID: "fc696c40-1172-4cbc-8f11-d352fa6d505c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d97d080-2a30-4bf1-be94-1aca3a2ac64f" (UID: "3d97d080-2a30-4bf1-be94-1aca3a2ac64f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928839 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.928857 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc696c40-1172-4cbc-8f11-d352fa6d505c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.932436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-kube-api-access-hlgns" (OuterVolumeSpecName: "kube-api-access-hlgns") pod "3d97d080-2a30-4bf1-be94-1aca3a2ac64f" (UID: "3d97d080-2a30-4bf1-be94-1aca3a2ac64f"). InnerVolumeSpecName "kube-api-access-hlgns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:27 crc kubenswrapper[4707]: I0121 16:41:27.932516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc696c40-1172-4cbc-8f11-d352fa6d505c-kube-api-access-h2nb4" (OuterVolumeSpecName: "kube-api-access-h2nb4") pod "fc696c40-1172-4cbc-8f11-d352fa6d505c" (UID: "fc696c40-1172-4cbc-8f11-d352fa6d505c"). InnerVolumeSpecName "kube-api-access-h2nb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.030065 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlgns\" (UniqueName: \"kubernetes.io/projected/3d97d080-2a30-4bf1-be94-1aca3a2ac64f-kube-api-access-hlgns\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.030092 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2nb4\" (UniqueName: \"kubernetes.io/projected/fc696c40-1172-4cbc-8f11-d352fa6d505c-kube-api-access-h2nb4\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.569166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" event={"ID":"fc696c40-1172-4cbc-8f11-d352fa6d505c","Type":"ContainerDied","Data":"b19e55f5a4aba8e638e66e487d838e67deb29af928ac7e2ba49a8fe5be626539"} Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.569204 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19e55f5a4aba8e638e66e487d838e67deb29af928ac7e2ba49a8fe5be626539" Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.569180 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz" Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.570294 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.570286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-gfwjv" event={"ID":"3d97d080-2a30-4bf1-be94-1aca3a2ac64f","Type":"ContainerDied","Data":"873306f0654bbd960e795564315d3f4e8f96921495fd9af3e6e5a619f546f77b"} Jan 21 16:41:28 crc kubenswrapper[4707]: I0121 16:41:28.570509 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873306f0654bbd960e795564315d3f4e8f96921495fd9af3e6e5a619f546f77b" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.701431 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-8cvk6"] Jan 21 16:41:30 crc kubenswrapper[4707]: E0121 16:41:30.701928 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d97d080-2a30-4bf1-be94-1aca3a2ac64f" containerName="mariadb-database-create" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.701941 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d97d080-2a30-4bf1-be94-1aca3a2ac64f" containerName="mariadb-database-create" Jan 21 16:41:30 crc kubenswrapper[4707]: E0121 16:41:30.701953 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc696c40-1172-4cbc-8f11-d352fa6d505c" containerName="mariadb-account-create-update" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.701958 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc696c40-1172-4cbc-8f11-d352fa6d505c" containerName="mariadb-account-create-update" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.702093 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc696c40-1172-4cbc-8f11-d352fa6d505c" containerName="mariadb-account-create-update" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.702107 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d97d080-2a30-4bf1-be94-1aca3a2ac64f" containerName="mariadb-database-create" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.702547 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.704073 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.704971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-sp266" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.705183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.706072 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-8cvk6"] Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.766376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfrv5\" (UniqueName: \"kubernetes.io/projected/af3045c3-7250-4f1c-a7ca-e059cba06b85-kube-api-access-bfrv5\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.766463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-combined-ca-bundle\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.766549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-db-sync-config-data\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.868187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-combined-ca-bundle\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.868226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-db-sync-config-data\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.868300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfrv5\" (UniqueName: \"kubernetes.io/projected/af3045c3-7250-4f1c-a7ca-e059cba06b85-kube-api-access-bfrv5\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.874230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-combined-ca-bundle\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " 
pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.880431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-db-sync-config-data\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:30 crc kubenswrapper[4707]: I0121 16:41:30.880665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfrv5\" (UniqueName: \"kubernetes.io/projected/af3045c3-7250-4f1c-a7ca-e059cba06b85-kube-api-access-bfrv5\") pod \"barbican-db-sync-8cvk6\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:31 crc kubenswrapper[4707]: I0121 16:41:31.017310 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:31 crc kubenswrapper[4707]: I0121 16:41:31.368755 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-8cvk6"] Jan 21 16:41:31 crc kubenswrapper[4707]: I0121 16:41:31.587056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" event={"ID":"af3045c3-7250-4f1c-a7ca-e059cba06b85","Type":"ContainerStarted","Data":"100097a9e7cbb71c0593f1c17c892a7aa65a9a95d3d87be08e2869314b4723b0"} Jan 21 16:41:31 crc kubenswrapper[4707]: I0121 16:41:31.587235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" event={"ID":"af3045c3-7250-4f1c-a7ca-e059cba06b85","Type":"ContainerStarted","Data":"37f4b617d9ecbb86abe4bbce41f86f38d6cfc2ff6b650fd819f9c51759e60635"} Jan 21 16:41:31 crc kubenswrapper[4707]: I0121 16:41:31.599964 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" podStartSLOduration=1.599951401 podStartE2EDuration="1.599951401s" podCreationTimestamp="2026-01-21 16:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:31.595743348 +0000 UTC m=+5988.777259571" watchObservedRunningTime="2026-01-21 16:41:31.599951401 +0000 UTC m=+5988.781467623" Jan 21 16:41:32 crc kubenswrapper[4707]: I0121 16:41:32.593332 4707 generic.go:334] "Generic (PLEG): container finished" podID="af3045c3-7250-4f1c-a7ca-e059cba06b85" containerID="100097a9e7cbb71c0593f1c17c892a7aa65a9a95d3d87be08e2869314b4723b0" exitCode=0 Jan 21 16:41:32 crc kubenswrapper[4707]: I0121 16:41:32.593436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" event={"ID":"af3045c3-7250-4f1c-a7ca-e059cba06b85","Type":"ContainerDied","Data":"100097a9e7cbb71c0593f1c17c892a7aa65a9a95d3d87be08e2869314b4723b0"} Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.818072 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.902110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-db-sync-config-data\") pod \"af3045c3-7250-4f1c-a7ca-e059cba06b85\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.902214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-combined-ca-bundle\") pod \"af3045c3-7250-4f1c-a7ca-e059cba06b85\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.902322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfrv5\" (UniqueName: \"kubernetes.io/projected/af3045c3-7250-4f1c-a7ca-e059cba06b85-kube-api-access-bfrv5\") pod \"af3045c3-7250-4f1c-a7ca-e059cba06b85\" (UID: \"af3045c3-7250-4f1c-a7ca-e059cba06b85\") " Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.906354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "af3045c3-7250-4f1c-a7ca-e059cba06b85" (UID: "af3045c3-7250-4f1c-a7ca-e059cba06b85"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.906462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3045c3-7250-4f1c-a7ca-e059cba06b85-kube-api-access-bfrv5" (OuterVolumeSpecName: "kube-api-access-bfrv5") pod "af3045c3-7250-4f1c-a7ca-e059cba06b85" (UID: "af3045c3-7250-4f1c-a7ca-e059cba06b85"). InnerVolumeSpecName "kube-api-access-bfrv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:33 crc kubenswrapper[4707]: I0121 16:41:33.919710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af3045c3-7250-4f1c-a7ca-e059cba06b85" (UID: "af3045c3-7250-4f1c-a7ca-e059cba06b85"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.004347 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfrv5\" (UniqueName: \"kubernetes.io/projected/af3045c3-7250-4f1c-a7ca-e059cba06b85-kube-api-access-bfrv5\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.004538 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.004551 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3045c3-7250-4f1c-a7ca-e059cba06b85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.605596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" event={"ID":"af3045c3-7250-4f1c-a7ca-e059cba06b85","Type":"ContainerDied","Data":"37f4b617d9ecbb86abe4bbce41f86f38d6cfc2ff6b650fd819f9c51759e60635"} Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.605634 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f4b617d9ecbb86abe4bbce41f86f38d6cfc2ff6b650fd819f9c51759e60635" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.605679 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-8cvk6" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.975272 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-685468fd47-84h6z"] Jan 21 16:41:34 crc kubenswrapper[4707]: E0121 16:41:34.975495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3045c3-7250-4f1c-a7ca-e059cba06b85" containerName="barbican-db-sync" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.975506 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3045c3-7250-4f1c-a7ca-e059cba06b85" containerName="barbican-db-sync" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.975620 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3045c3-7250-4f1c-a7ca-e059cba06b85" containerName="barbican-db-sync" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.976234 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.980058 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.980130 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-sp266" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.980676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.981207 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.982926 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw"] Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.983736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.985974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.987966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-685468fd47-84h6z"] Jan 21 16:41:34 crc kubenswrapper[4707]: I0121 16:41:34.997695 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw"] Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.022227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-logs\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.022351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6f2\" (UniqueName: \"kubernetes.io/projected/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-kube-api-access-9l6f2\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.022472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.022574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.022687 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-combined-ca-bundle\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.042827 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4"] Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.043861 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.045338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.045578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.045634 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-public-svc" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.049283 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4"] Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.123851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-logs\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.123895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-combined-ca-bundle\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.123921 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-logs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.123944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.123972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9zv\" (UniqueName: \"kubernetes.io/projected/57a9e973-2137-4426-b3d0-28de9f12ab22-kube-api-access-ng9zv\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.123989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f86g\" (UniqueName: \"kubernetes.io/projected/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-kube-api-access-4f86g\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6f2\" (UniqueName: \"kubernetes.io/projected/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-kube-api-access-9l6f2\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-public-tls-certs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9e973-2137-4426-b3d0-28de9f12ab22-logs\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-internal-tls-certs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124340 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-logs\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-combined-ca-bundle\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-combined-ca-bundle\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.124493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.128246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.129720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-combined-ca-bundle\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.138280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.152071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6f2\" (UniqueName: \"kubernetes.io/projected/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-kube-api-access-9l6f2\") pod \"barbican-worker-685468fd47-84h6z\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " 
pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-combined-ca-bundle\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-logs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9zv\" (UniqueName: \"kubernetes.io/projected/57a9e973-2137-4426-b3d0-28de9f12ab22-kube-api-access-ng9zv\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f86g\" (UniqueName: \"kubernetes.io/projected/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-kube-api-access-4f86g\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-public-tls-certs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9e973-2137-4426-b3d0-28de9f12ab22-logs\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-internal-tls-certs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225802 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-combined-ca-bundle\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.225901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.226064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9e973-2137-4426-b3d0-28de9f12ab22-logs\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.226287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-logs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.228420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.228944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-combined-ca-bundle\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.229208 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.229989 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.230329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-internal-tls-certs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.230431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-combined-ca-bundle\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.230516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-public-tls-certs\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.231370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.239740 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9zv\" (UniqueName: \"kubernetes.io/projected/57a9e973-2137-4426-b3d0-28de9f12ab22-kube-api-access-ng9zv\") pod \"barbican-keystone-listener-d8c7c46f8-k8jjw\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.241491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f86g\" (UniqueName: \"kubernetes.io/projected/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-kube-api-access-4f86g\") pod \"barbican-api-77b895b48d-kxxv4\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.293728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.302713 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.358156 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.670683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-685468fd47-84h6z"] Jan 21 16:41:35 crc kubenswrapper[4707]: W0121 16:41:35.672330 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7c6b8f3_6bb7_4d97_80e9_50f95d58ba4b.slice/crio-aea3d49230cfebd7a023a22b47e8cd626418d1bc804ec8e9b664e4053228e74e WatchSource:0}: Error finding container aea3d49230cfebd7a023a22b47e8cd626418d1bc804ec8e9b664e4053228e74e: Status 404 returned error can't find the container with id aea3d49230cfebd7a023a22b47e8cd626418d1bc804ec8e9b664e4053228e74e Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.717045 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw"] Jan 21 16:41:35 crc kubenswrapper[4707]: W0121 16:41:35.718453 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a9e973_2137_4426_b3d0_28de9f12ab22.slice/crio-7732c5e6215ce61cf682c61a0990bf9358a48cfd4847089a33b32a18a3364d39 WatchSource:0}: Error finding container 7732c5e6215ce61cf682c61a0990bf9358a48cfd4847089a33b32a18a3364d39: Status 404 returned error can't find the container with id 7732c5e6215ce61cf682c61a0990bf9358a48cfd4847089a33b32a18a3364d39 Jan 21 16:41:35 crc kubenswrapper[4707]: I0121 16:41:35.776310 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.429920 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-8cvk6"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.435343 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-8cvk6"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.456329 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.470082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-685468fd47-84h6z"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.478375 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican99e0-account-delete-5fwll"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.479510 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.488446 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.499908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican99e0-account-delete-5fwll"] Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.544117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4bdk\" (UniqueName: \"kubernetes.io/projected/4eca58aa-d4e1-4061-914e-a31d0e7cadca-kube-api-access-s4bdk\") pod \"barbican99e0-account-delete-5fwll\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.544191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eca58aa-d4e1-4061-914e-a31d0e7cadca-operator-scripts\") pod \"barbican99e0-account-delete-5fwll\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.618277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" event={"ID":"57a9e973-2137-4426-b3d0-28de9f12ab22","Type":"ContainerStarted","Data":"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.618313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" event={"ID":"57a9e973-2137-4426-b3d0-28de9f12ab22","Type":"ContainerStarted","Data":"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.618324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" event={"ID":"57a9e973-2137-4426-b3d0-28de9f12ab22","Type":"ContainerStarted","Data":"7732c5e6215ce61cf682c61a0990bf9358a48cfd4847089a33b32a18a3364d39"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.618546 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" secret="" err="secret \"barbican-barbican-dockercfg-sp266\" not found" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.620377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" event={"ID":"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e","Type":"ContainerStarted","Data":"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.620406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" event={"ID":"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e","Type":"ContainerStarted","Data":"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.620415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" event={"ID":"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e","Type":"ContainerStarted","Data":"18d9be2cda14fb3c6272baefa228680d0efcad3149ee825722f5ff4501f08bb4"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.620590 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" secret="" err="secret \"barbican-barbican-dockercfg-sp266\" not found" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.620751 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.620774 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.622120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" event={"ID":"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b","Type":"ContainerStarted","Data":"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.622158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" event={"ID":"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b","Type":"ContainerStarted","Data":"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.622169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" event={"ID":"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b","Type":"ContainerStarted","Data":"aea3d49230cfebd7a023a22b47e8cd626418d1bc804ec8e9b664e4053228e74e"} Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.622403 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" secret="" err="secret \"barbican-barbican-dockercfg-sp266\" not found" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.635866 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" podStartSLOduration=2.635857189 podStartE2EDuration="2.635857189s" podCreationTimestamp="2026-01-21 16:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:36.632376914 +0000 UTC m=+5993.813893136" watchObservedRunningTime="2026-01-21 16:41:36.635857189 +0000 UTC m=+5993.817373411" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.644466 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" podStartSLOduration=2.64445764 podStartE2EDuration="2.64445764s" podCreationTimestamp="2026-01-21 16:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:36.641833306 +0000 UTC m=+5993.823349528" watchObservedRunningTime="2026-01-21 16:41:36.64445764 +0000 UTC m=+5993.825973861" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.645141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4bdk\" (UniqueName: \"kubernetes.io/projected/4eca58aa-d4e1-4061-914e-a31d0e7cadca-kube-api-access-s4bdk\") pod \"barbican99e0-account-delete-5fwll\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.645192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eca58aa-d4e1-4061-914e-a31d0e7cadca-operator-scripts\") pod \"barbican99e0-account-delete-5fwll\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.645737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eca58aa-d4e1-4061-914e-a31d0e7cadca-operator-scripts\") pod \"barbican99e0-account-delete-5fwll\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.659207 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" podStartSLOduration=1.659198066 podStartE2EDuration="1.659198066s" podCreationTimestamp="2026-01-21 16:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:41:36.654867473 +0000 UTC m=+5993.836383695" watchObservedRunningTime="2026-01-21 16:41:36.659198066 +0000 UTC m=+5993.840714288" Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.666490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bdk\" (UniqueName: \"kubernetes.io/projected/4eca58aa-d4e1-4061-914e-a31d0e7cadca-kube-api-access-s4bdk\") pod \"barbican99e0-account-delete-5fwll\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 
21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746211 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746263 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746290 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746271 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data podName:57a9e973-2137-4426-b3d0-28de9f12ab22 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:37.246257735 +0000 UTC m=+5994.427773967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data") pod "barbican-keystone-listener-d8c7c46f8-k8jjw" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22") : secret "barbican-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746351 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data podName:c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b nodeName:}" failed. No retries permitted until 2026-01-21 16:41:37.24633514 +0000 UTC m=+5994.427851372 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data") pod "barbican-worker-685468fd47-84h6z" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b") : secret "barbican-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746393 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746214 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746421 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom podName:57a9e973-2137-4426-b3d0-28de9f12ab22 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:37.24641468 +0000 UTC m=+5994.427930902 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom") pod "barbican-keystone-listener-d8c7c46f8-k8jjw" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22") : secret "barbican-keystone-listener-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746460 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data podName:c719164f-8f93-4fe4-a8b3-8e6c90eaa01e nodeName:}" failed. No retries permitted until 2026-01-21 16:41:37.246451929 +0000 UTC m=+5994.427968151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data") pod "barbican-api-77b895b48d-kxxv4" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e") : secret "barbican-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746474 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom podName:c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b nodeName:}" failed. No retries permitted until 2026-01-21 16:41:37.246468039 +0000 UTC m=+5994.427984271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom") pod "barbican-worker-685468fd47-84h6z" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b") : secret "barbican-worker-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746604 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: E0121 16:41:36.746641 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom podName:c719164f-8f93-4fe4-a8b3-8e6c90eaa01e nodeName:}" failed. No retries permitted until 2026-01-21 16:41:37.246632108 +0000 UTC m=+5994.428148340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom") pod "barbican-api-77b895b48d-kxxv4" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e") : secret "barbican-api-config-data" not found Jan 21 16:41:36 crc kubenswrapper[4707]: I0121 16:41:36.795570 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.149262 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican99e0-account-delete-5fwll"] Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.188935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3045c3-7250-4f1c-a7ca-e059cba06b85" path="/var/lib/kubelet/pods/af3045c3-7250-4f1c-a7ca-e059cba06b85/volumes" Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255259 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255453 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data podName:57a9e973-2137-4426-b3d0-28de9f12ab22 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:38.255439407 +0000 UTC m=+5995.436955629 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data") pod "barbican-keystone-listener-d8c7c46f8-k8jjw" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22") : secret "barbican-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255283 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255528 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255321 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255545 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom podName:c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b nodeName:}" failed. No retries permitted until 2026-01-21 16:41:38.255526621 +0000 UTC m=+5995.437042843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom") pod "barbican-worker-685468fd47-84h6z" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b") : secret "barbican-worker-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255379 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255902 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data podName:c719164f-8f93-4fe4-a8b3-8e6c90eaa01e nodeName:}" failed. No retries permitted until 2026-01-21 16:41:38.255884795 +0000 UTC m=+5995.437401017 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data") pod "barbican-api-77b895b48d-kxxv4" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e") : secret "barbican-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255923 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data podName:c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b nodeName:}" failed. No retries permitted until 2026-01-21 16:41:38.255916845 +0000 UTC m=+5995.437433067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data") pod "barbican-worker-685468fd47-84h6z" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b") : secret "barbican-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.255950 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom podName:c719164f-8f93-4fe4-a8b3-8e6c90eaa01e nodeName:}" failed. No retries permitted until 2026-01-21 16:41:38.255930891 +0000 UTC m=+5995.437447114 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom") pod "barbican-api-77b895b48d-kxxv4" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e") : secret "barbican-api-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.256133 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: E0121 16:41:37.256163 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom podName:57a9e973-2137-4426-b3d0-28de9f12ab22 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:38.256155313 +0000 UTC m=+5995.437671535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom") pod "barbican-keystone-listener-d8c7c46f8-k8jjw" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22") : secret "barbican-keystone-listener-config-data" not found Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629041 4707 generic.go:334] "Generic (PLEG): container finished" podID="4eca58aa-d4e1-4061-914e-a31d0e7cadca" containerID="acbfc31705ae4575297f345c2e4e44e116b0e4a25d1c1be31f5f0559f0265769" exitCode=0 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" event={"ID":"4eca58aa-d4e1-4061-914e-a31d0e7cadca","Type":"ContainerDied","Data":"acbfc31705ae4575297f345c2e4e44e116b0e4a25d1c1be31f5f0559f0265769"} Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" event={"ID":"4eca58aa-d4e1-4061-914e-a31d0e7cadca","Type":"ContainerStarted","Data":"329a3265002ec681411ce43ba4041f9e821cd87b1a7afcbc09e4cde577f3de46"} Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629234 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api-log" containerID="cri-o://3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817" gracePeriod=30 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629273 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api" containerID="cri-o://764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9" gracePeriod=30 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629410 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener-log" containerID="cri-o://8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d" gracePeriod=30 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629462 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" 
containerName="barbican-keystone-listener" containerID="cri-o://e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26" gracePeriod=30 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629534 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker" containerID="cri-o://ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199" gracePeriod=30 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.629542 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker-log" containerID="cri-o://a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425" gracePeriod=30 Jan 21 16:41:37 crc kubenswrapper[4707]: I0121 16:41:37.975691 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.051134 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-public-tls-certs\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f86g\" (UniqueName: \"kubernetes.io/projected/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-kube-api-access-4f86g\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-internal-tls-certs\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-combined-ca-bundle\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-logs\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.067683 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data\") pod \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\" (UID: \"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.068574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-logs" (OuterVolumeSpecName: "logs") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.072044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-kube-api-access-4f86g" (OuterVolumeSpecName: "kube-api-access-4f86g") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "kube-api-access-4f86g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.072092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.083994 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.096332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.097073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.105564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data" (OuterVolumeSpecName: "config-data") pod "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" (UID: "c719164f-8f93-4fe4-a8b3-8e6c90eaa01e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.168837 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9e973-2137-4426-b3d0-28de9f12ab22-logs\") pod \"57a9e973-2137-4426-b3d0-28de9f12ab22\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.168884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9zv\" (UniqueName: \"kubernetes.io/projected/57a9e973-2137-4426-b3d0-28de9f12ab22-kube-api-access-ng9zv\") pod \"57a9e973-2137-4426-b3d0-28de9f12ab22\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.168915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom\") pod \"57a9e973-2137-4426-b3d0-28de9f12ab22\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.168933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-combined-ca-bundle\") pod \"57a9e973-2137-4426-b3d0-28de9f12ab22\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.168949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data\") pod \"57a9e973-2137-4426-b3d0-28de9f12ab22\" (UID: \"57a9e973-2137-4426-b3d0-28de9f12ab22\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169126 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a9e973-2137-4426-b3d0-28de9f12ab22-logs" (OuterVolumeSpecName: "logs") pod "57a9e973-2137-4426-b3d0-28de9f12ab22" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169297 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169308 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169316 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169324 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169332 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9e973-2137-4426-b3d0-28de9f12ab22-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169340 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169348 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f86g\" (UniqueName: \"kubernetes.io/projected/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-kube-api-access-4f86g\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.169356 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.183964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57a9e973-2137-4426-b3d0-28de9f12ab22" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.185012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a9e973-2137-4426-b3d0-28de9f12ab22-kube-api-access-ng9zv" (OuterVolumeSpecName: "kube-api-access-ng9zv") pod "57a9e973-2137-4426-b3d0-28de9f12ab22" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22"). InnerVolumeSpecName "kube-api-access-ng9zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.215703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57a9e973-2137-4426-b3d0-28de9f12ab22" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.263286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data" (OuterVolumeSpecName: "config-data") pod "57a9e973-2137-4426-b3d0-28de9f12ab22" (UID: "57a9e973-2137-4426-b3d0-28de9f12ab22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.272625 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng9zv\" (UniqueName: \"kubernetes.io/projected/57a9e973-2137-4426-b3d0-28de9f12ab22-kube-api-access-ng9zv\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.272650 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.272662 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.272671 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9e973-2137-4426-b3d0-28de9f12ab22-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.272736 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.272772 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data podName:c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b nodeName:}" failed. No retries permitted until 2026-01-21 16:41:40.272761214 +0000 UTC m=+5997.454277436 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data") pod "barbican-worker-685468fd47-84h6z" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b") : secret "barbican-config-data" not found Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.274620 4707 secret.go:188] Couldn't get secret barbican-kuttl-tests/barbican-worker-config-data: secret "barbican-worker-config-data" not found Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.274658 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom podName:c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b nodeName:}" failed. No retries permitted until 2026-01-21 16:41:40.274648623 +0000 UTC m=+5997.456164845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom") pod "barbican-worker-685468fd47-84h6z" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b") : secret "barbican-worker-config-data" not found Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.532257 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637164 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerID="ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199" exitCode=0 Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" event={"ID":"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b","Type":"ContainerDied","Data":"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637216 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerID="a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425" exitCode=143 Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" event={"ID":"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b","Type":"ContainerDied","Data":"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" event={"ID":"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b","Type":"ContainerDied","Data":"aea3d49230cfebd7a023a22b47e8cd626418d1bc804ec8e9b664e4053228e74e"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637206 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-685468fd47-84h6z" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.637270 4707 scope.go:117] "RemoveContainer" containerID="ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.639186 4707 generic.go:334] "Generic (PLEG): container finished" podID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerID="e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26" exitCode=0 Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.639432 4707 generic.go:334] "Generic (PLEG): container finished" podID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerID="8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d" exitCode=143 Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.639243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" event={"ID":"57a9e973-2137-4426-b3d0-28de9f12ab22","Type":"ContainerDied","Data":"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.639474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" event={"ID":"57a9e973-2137-4426-b3d0-28de9f12ab22","Type":"ContainerDied","Data":"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.639504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" event={"ID":"57a9e973-2137-4426-b3d0-28de9f12ab22","Type":"ContainerDied","Data":"7732c5e6215ce61cf682c61a0990bf9358a48cfd4847089a33b32a18a3364d39"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.639253 
4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.642306 4707 generic.go:334] "Generic (PLEG): container finished" podID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerID="764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9" exitCode=0 Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.642332 4707 generic.go:334] "Generic (PLEG): container finished" podID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerID="3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817" exitCode=143 Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.642369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" event={"ID":"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e","Type":"ContainerDied","Data":"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.642401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" event={"ID":"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e","Type":"ContainerDied","Data":"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.642411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" event={"ID":"c719164f-8f93-4fe4-a8b3-8e6c90eaa01e","Type":"ContainerDied","Data":"18d9be2cda14fb3c6272baefa228680d0efcad3149ee825722f5ff4501f08bb4"} Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.642562 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.652357 4707 scope.go:117] "RemoveContainer" containerID="a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.664295 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw"] Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.669327 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-d8c7c46f8-k8jjw"] Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.674152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4"] Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.678658 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-77b895b48d-kxxv4"] Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.678742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom\") pod \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.678818 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-combined-ca-bundle\") pod \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.678865 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6f2\" (UniqueName: \"kubernetes.io/projected/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-kube-api-access-9l6f2\") pod \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.678891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data\") pod \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.678946 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-logs\") pod \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\" (UID: \"c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.680776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-logs" (OuterVolumeSpecName: "logs") pod "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.683434 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-kube-api-access-9l6f2" (OuterVolumeSpecName: "kube-api-access-9l6f2") pod "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b"). InnerVolumeSpecName "kube-api-access-9l6f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.683622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.683995 4707 scope.go:117] "RemoveContainer" containerID="ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.684457 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199\": container with ID starting with ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199 not found: ID does not exist" containerID="ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.684495 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199"} err="failed to get container status \"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199\": rpc error: code = NotFound desc = could not find container \"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199\": container with ID starting with ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.684515 4707 scope.go:117] "RemoveContainer" containerID="a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.684781 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425\": container with ID starting with a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425 not found: ID does not exist" containerID="a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.684822 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425"} err="failed to get container status \"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425\": rpc error: code = NotFound desc = could not find container \"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425\": container with ID starting with a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.684834 4707 scope.go:117] "RemoveContainer" containerID="ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.685111 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199"} err="failed to get container status \"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199\": rpc error: code = NotFound desc = could not find container \"ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199\": container with ID starting with ec5d51c18b3a831a32ec8fa186305bfaaee248d15ad6094e0fbb4adbaf4ac199 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.685182 4707 scope.go:117] "RemoveContainer" containerID="a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.685450 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425"} err="failed to get container status \"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425\": rpc error: code = NotFound desc = could not find container \"a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425\": container with ID starting with a572f16609dbddbfd0a3979df409de2b86e19fb88fa29696aa7132e6bc52b425 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.685482 4707 scope.go:117] "RemoveContainer" containerID="e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.697549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.699727 4707 scope.go:117] "RemoveContainer" containerID="8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.713345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data" (OuterVolumeSpecName: "config-data") pod "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" (UID: "c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.717014 4707 scope.go:117] "RemoveContainer" containerID="e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.717316 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26\": container with ID starting with e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26 not found: ID does not exist" containerID="e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.717349 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26"} err="failed to get container status \"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26\": rpc error: code = NotFound desc = could not find container \"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26\": container with ID starting with e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.717370 4707 scope.go:117] "RemoveContainer" containerID="8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.717723 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d\": container with ID starting with 8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d not found: ID does not 
exist" containerID="8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.717960 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d"} err="failed to get container status \"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d\": rpc error: code = NotFound desc = could not find container \"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d\": container with ID starting with 8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.717977 4707 scope.go:117] "RemoveContainer" containerID="e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.718227 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26"} err="failed to get container status \"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26\": rpc error: code = NotFound desc = could not find container \"e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26\": container with ID starting with e04a592bd90d8e11dfd62a257ec0ea31708d222965281a066a264e96dacb5b26 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.718300 4707 scope.go:117] "RemoveContainer" containerID="8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.719390 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d"} err="failed to get container status \"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d\": rpc error: code = NotFound desc = could not find container \"8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d\": container with ID starting with 8a0dabbc93ea64d3b3061f9fd38cc4cc799a31022bd8e11900f14e3ac305093d not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.719484 4707 scope.go:117] "RemoveContainer" containerID="764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.735335 4707 scope.go:117] "RemoveContainer" containerID="3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.752176 4707 scope.go:117] "RemoveContainer" containerID="764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.752679 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9\": container with ID starting with 764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9 not found: ID does not exist" containerID="764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.752712 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9"} err="failed to get container status 
\"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9\": rpc error: code = NotFound desc = could not find container \"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9\": container with ID starting with 764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.752730 4707 scope.go:117] "RemoveContainer" containerID="3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817" Jan 21 16:41:38 crc kubenswrapper[4707]: E0121 16:41:38.753137 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817\": container with ID starting with 3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817 not found: ID does not exist" containerID="3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.753183 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817"} err="failed to get container status \"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817\": rpc error: code = NotFound desc = could not find container \"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817\": container with ID starting with 3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.753221 4707 scope.go:117] "RemoveContainer" containerID="764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.753511 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9"} err="failed to get container status \"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9\": rpc error: code = NotFound desc = could not find container \"764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9\": container with ID starting with 764e5b0e1e4475794f1d156a440a5b8df4230002093ced1c0ddbd3f3b1b7fed9 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.753533 4707 scope.go:117] "RemoveContainer" containerID="3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.753724 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817"} err="failed to get container status \"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817\": rpc error: code = NotFound desc = could not find container \"3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817\": container with ID starting with 3f360235e8d78cc7723ade01d4fc662905ab02d1a504dc23f0fb5b435a7b7817 not found: ID does not exist" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.781146 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.781172 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.781185 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6f2\" (UniqueName: \"kubernetes.io/projected/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-kube-api-access-9l6f2\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.781195 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.781204 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.844295 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.964861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-685468fd47-84h6z"] Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.973650 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-685468fd47-84h6z"] Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.985415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4bdk\" (UniqueName: \"kubernetes.io/projected/4eca58aa-d4e1-4061-914e-a31d0e7cadca-kube-api-access-s4bdk\") pod \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.985556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eca58aa-d4e1-4061-914e-a31d0e7cadca-operator-scripts\") pod \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\" (UID: \"4eca58aa-d4e1-4061-914e-a31d0e7cadca\") " Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.986257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eca58aa-d4e1-4061-914e-a31d0e7cadca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4eca58aa-d4e1-4061-914e-a31d0e7cadca" (UID: "4eca58aa-d4e1-4061-914e-a31d0e7cadca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:38 crc kubenswrapper[4707]: I0121 16:41:38.988033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eca58aa-d4e1-4061-914e-a31d0e7cadca-kube-api-access-s4bdk" (OuterVolumeSpecName: "kube-api-access-s4bdk") pod "4eca58aa-d4e1-4061-914e-a31d0e7cadca" (UID: "4eca58aa-d4e1-4061-914e-a31d0e7cadca"). InnerVolumeSpecName "kube-api-access-s4bdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.087666 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4bdk\" (UniqueName: \"kubernetes.io/projected/4eca58aa-d4e1-4061-914e-a31d0e7cadca-kube-api-access-s4bdk\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.087700 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eca58aa-d4e1-4061-914e-a31d0e7cadca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.188188 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" path="/var/lib/kubelet/pods/57a9e973-2137-4426-b3d0-28de9f12ab22/volumes" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.188736 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" path="/var/lib/kubelet/pods/c719164f-8f93-4fe4-a8b3-8e6c90eaa01e/volumes" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.189306 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" path="/var/lib/kubelet/pods/c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b/volumes" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.651438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" event={"ID":"4eca58aa-d4e1-4061-914e-a31d0e7cadca","Type":"ContainerDied","Data":"329a3265002ec681411ce43ba4041f9e821cd87b1a7afcbc09e4cde577f3de46"} Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.651846 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="329a3265002ec681411ce43ba4041f9e821cd87b1a7afcbc09e4cde577f3de46" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.651467 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican99e0-account-delete-5fwll" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.945583 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.945628 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.945667 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.946126 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:41:39 crc kubenswrapper[4707]: I0121 16:41:39.946182 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" gracePeriod=600 Jan 21 16:41:40 crc kubenswrapper[4707]: E0121 16:41:40.062085 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:41:40 crc kubenswrapper[4707]: I0121 16:41:40.660602 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" exitCode=0 Jan 21 16:41:40 crc kubenswrapper[4707]: I0121 16:41:40.660675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79"} Jan 21 16:41:40 crc kubenswrapper[4707]: I0121 16:41:40.660905 4707 scope.go:117] "RemoveContainer" containerID="70390b648c73daf4be255709128a6e64ca5ff234e05ed8fb129f19cfb2d9e39e" Jan 21 16:41:40 crc kubenswrapper[4707]: I0121 16:41:40.661427 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:41:40 crc kubenswrapper[4707]: E0121 16:41:40.661686 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:41:41 crc kubenswrapper[4707]: I0121 16:41:41.491136 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-gfwjv"] Jan 21 16:41:41 crc kubenswrapper[4707]: I0121 16:41:41.495735 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-gfwjv"] Jan 21 16:41:41 crc kubenswrapper[4707]: I0121 16:41:41.502055 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican99e0-account-delete-5fwll"] Jan 21 16:41:41 crc kubenswrapper[4707]: I0121 16:41:41.506028 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz"] Jan 21 16:41:41 crc kubenswrapper[4707]: I0121 16:41:41.509712 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican99e0-account-delete-5fwll"] Jan 21 16:41:41 crc kubenswrapper[4707]: I0121 16:41:41.515306 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-99e0-account-create-update-7zrfz"] Jan 21 16:41:43 crc kubenswrapper[4707]: I0121 16:41:43.195134 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d97d080-2a30-4bf1-be94-1aca3a2ac64f" path="/var/lib/kubelet/pods/3d97d080-2a30-4bf1-be94-1aca3a2ac64f/volumes" Jan 21 16:41:43 crc kubenswrapper[4707]: I0121 16:41:43.195786 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eca58aa-d4e1-4061-914e-a31d0e7cadca" path="/var/lib/kubelet/pods/4eca58aa-d4e1-4061-914e-a31d0e7cadca/volumes" Jan 21 16:41:43 crc kubenswrapper[4707]: I0121 16:41:43.196273 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc696c40-1172-4cbc-8f11-d352fa6d505c" path="/var/lib/kubelet/pods/fc696c40-1172-4cbc-8f11-d352fa6d505c/volumes" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.014666 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-5fd7v"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.020126 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-5fd7v"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.035591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dw5db"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.043349 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-dw5db"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.052986 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-5f5bf8f646-hch88"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.053167 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" podUID="1addc64b-1eb7-43fd-ade9-e6ee00d007bb" containerName="keystone-api" containerID="cri-o://a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba" gracePeriod=30 Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074085 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystonecdc9-account-delete-46gj2"] Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074318 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074335 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074353 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api-log" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074358 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api-log" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker-log" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074377 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker-log" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074390 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074396 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074415 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074420 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener-log" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener-log" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.074441 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eca58aa-d4e1-4061-914e-a31d0e7cadca" containerName="mariadb-account-delete" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074448 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eca58aa-d4e1-4061-914e-a31d0e7cadca" containerName="mariadb-account-delete" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074543 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074555 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener-log" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074565 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api-log" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074574 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4eca58aa-d4e1-4061-914e-a31d0e7cadca" containerName="mariadb-account-delete" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074583 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c719164f-8f93-4fe4-a8b3-8e6c90eaa01e" containerName="barbican-api" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074594 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c6b8f3-6bb7-4d97-80e9-50f95d58ba4b" containerName="barbican-worker-log" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074602 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a9e973-2137-4426-b3d0-28de9f12ab22" containerName="barbican-keystone-listener" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.074988 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.080107 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystonecdc9-account-delete-46gj2"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.155356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts\") pod \"keystonecdc9-account-delete-46gj2\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.155556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4sx\" (UniqueName: \"kubernetes.io/projected/1e629c94-5e95-4697-a88e-b706bc09f078-kube-api-access-qb4sx\") pod \"keystonecdc9-account-delete-46gj2\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.256502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts\") pod \"keystonecdc9-account-delete-46gj2\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.256755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4sx\" (UniqueName: \"kubernetes.io/projected/1e629c94-5e95-4697-a88e-b706bc09f078-kube-api-access-qb4sx\") pod \"keystonecdc9-account-delete-46gj2\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.257107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts\") pod \"keystonecdc9-account-delete-46gj2\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.273227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4sx\" (UniqueName: \"kubernetes.io/projected/1e629c94-5e95-4697-a88e-b706bc09f078-kube-api-access-qb4sx\") pod \"keystonecdc9-account-delete-46gj2\" (UID: 
\"1e629c94-5e95-4697-a88e-b706bc09f078\") " pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.387583 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.421639 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-zqlvl"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.429359 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-zqlvl"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.437130 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-r5v5g"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.437926 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.439672 4707 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.446416 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-r5v5g"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.451902 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.463934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.469932 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.501274 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-r5v5g"] Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.501879 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vq5tt operator-scripts], unattached volumes=[], failed to process volumes=[kube-api-access-vq5tt operator-scripts]: context canceled" pod="barbican-kuttl-tests/root-account-create-update-r5v5g" podUID="a324029a-5d47-4653-9298-cfebdc6520df" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.560595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq5tt\" (UniqueName: \"kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt\") pod \"root-account-create-update-r5v5g\" (UID: \"a324029a-5d47-4653-9298-cfebdc6520df\") " pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.560853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts\") pod \"root-account-create-update-r5v5g\" (UID: \"a324029a-5d47-4653-9298-cfebdc6520df\") " pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.593060 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-2" 
podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerName="galera" containerID="cri-o://11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20" gracePeriod=30 Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.662163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts\") pod \"root-account-create-update-r5v5g\" (UID: \"a324029a-5d47-4653-9298-cfebdc6520df\") " pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.662264 4707 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.662287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq5tt\" (UniqueName: \"kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt\") pod \"root-account-create-update-r5v5g\" (UID: \"a324029a-5d47-4653-9298-cfebdc6520df\") " pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.662325 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts podName:a324029a-5d47-4653-9298-cfebdc6520df nodeName:}" failed. No retries permitted until 2026-01-21 16:41:45.16230834 +0000 UTC m=+6002.343824562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts") pod "root-account-create-update-r5v5g" (UID: "a324029a-5d47-4653-9298-cfebdc6520df") : configmap "openstack-scripts" not found Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.664956 4707 projected.go:194] Error preparing data for projected volume kube-api-access-vq5tt for pod barbican-kuttl-tests/root-account-create-update-r5v5g: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:41:44 crc kubenswrapper[4707]: E0121 16:41:44.665016 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt podName:a324029a-5d47-4653-9298-cfebdc6520df nodeName:}" failed. No retries permitted until 2026-01-21 16:41:45.165003838 +0000 UTC m=+6002.346520060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vq5tt" (UniqueName: "kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt") pod "root-account-create-update-r5v5g" (UID: "a324029a-5d47-4653-9298-cfebdc6520df") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.683539 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.690078 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.781025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystonecdc9-account-delete-46gj2"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.961996 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 16:41:44 crc kubenswrapper[4707]: I0121 16:41:44.962311 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/memcached-0" podUID="495e3dab-2009-4192-901c-cc7908381182" containerName="memcached" containerID="cri-o://c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd" gracePeriod=30 Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.167828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts\") pod \"root-account-create-update-r5v5g\" (UID: \"a324029a-5d47-4653-9298-cfebdc6520df\") " pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.167967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq5tt\" (UniqueName: \"kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt\") pod \"root-account-create-update-r5v5g\" (UID: \"a324029a-5d47-4653-9298-cfebdc6520df\") " pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.167969 4707 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.168132 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts podName:a324029a-5d47-4653-9298-cfebdc6520df nodeName:}" failed. No retries permitted until 2026-01-21 16:41:46.168117932 +0000 UTC m=+6003.349634153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts") pod "root-account-create-update-r5v5g" (UID: "a324029a-5d47-4653-9298-cfebdc6520df") : configmap "openstack-scripts" not found Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.169948 4707 projected.go:194] Error preparing data for projected volume kube-api-access-vq5tt for pod barbican-kuttl-tests/root-account-create-update-r5v5g: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.169994 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt podName:a324029a-5d47-4653-9298-cfebdc6520df nodeName:}" failed. No retries permitted until 2026-01-21 16:41:46.169985243 +0000 UTC m=+6003.351501465 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vq5tt" (UniqueName: "kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt") pod "root-account-create-update-r5v5g" (UID: "a324029a-5d47-4653-9298-cfebdc6520df") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.189520 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce37166-4b59-4b57-b5f8-9fcf99e2f700" path="/var/lib/kubelet/pods/0ce37166-4b59-4b57-b5f8-9fcf99e2f700/volumes" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.190117 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e783b9-adef-43ea-92b5-0689ae45f547" path="/var/lib/kubelet/pods/55e783b9-adef-43ea-92b5-0689ae45f547/volumes" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.190552 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873a8c34-73d7-4b88-922f-411430f592ab" path="/var/lib/kubelet/pods/873a8c34-73d7-4b88-922f-411430f592ab/volumes" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.219310 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.316280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.372295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-generated\") pod \"067aba60-80c2-4c68-887f-f3c6c40ed418\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.372337 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-operator-scripts\") pod \"067aba60-80c2-4c68-887f-f3c6c40ed418\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.372365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n2q4\" (UniqueName: \"kubernetes.io/projected/067aba60-80c2-4c68-887f-f3c6c40ed418-kube-api-access-6n2q4\") pod \"067aba60-80c2-4c68-887f-f3c6c40ed418\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.372454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"067aba60-80c2-4c68-887f-f3c6c40ed418\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.372531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-default\") pod \"067aba60-80c2-4c68-887f-f3c6c40ed418\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.372576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-kolla-config\") pod \"067aba60-80c2-4c68-887f-f3c6c40ed418\" (UID: \"067aba60-80c2-4c68-887f-f3c6c40ed418\") " 
Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.373482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "067aba60-80c2-4c68-887f-f3c6c40ed418" (UID: "067aba60-80c2-4c68-887f-f3c6c40ed418"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.373774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "067aba60-80c2-4c68-887f-f3c6c40ed418" (UID: "067aba60-80c2-4c68-887f-f3c6c40ed418"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.374262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "067aba60-80c2-4c68-887f-f3c6c40ed418" (UID: "067aba60-80c2-4c68-887f-f3c6c40ed418"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.375116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "067aba60-80c2-4c68-887f-f3c6c40ed418" (UID: "067aba60-80c2-4c68-887f-f3c6c40ed418"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.378513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067aba60-80c2-4c68-887f-f3c6c40ed418-kube-api-access-6n2q4" (OuterVolumeSpecName: "kube-api-access-6n2q4") pod "067aba60-80c2-4c68-887f-f3c6c40ed418" (UID: "067aba60-80c2-4c68-887f-f3c6c40ed418"). InnerVolumeSpecName "kube-api-access-6n2q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.383605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "067aba60-80c2-4c68-887f-f3c6c40ed418" (UID: "067aba60-80c2-4c68-887f-f3c6c40ed418"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.474268 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.474303 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.474313 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067aba60-80c2-4c68-887f-f3c6c40ed418-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.474322 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067aba60-80c2-4c68-887f-f3c6c40ed418-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.474331 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n2q4\" (UniqueName: \"kubernetes.io/projected/067aba60-80c2-4c68-887f-f3c6c40ed418-kube-api-access-6n2q4\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.474348 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.484353 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.516013 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.575553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgc2x\" (UniqueName: \"kubernetes.io/projected/495e3dab-2009-4192-901c-cc7908381182-kube-api-access-fgc2x\") pod \"495e3dab-2009-4192-901c-cc7908381182\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.575651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-config-data\") pod \"495e3dab-2009-4192-901c-cc7908381182\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.575701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-kolla-config\") pod \"495e3dab-2009-4192-901c-cc7908381182\" (UID: \"495e3dab-2009-4192-901c-cc7908381182\") " Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.576061 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.576238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-config-data" (OuterVolumeSpecName: "config-data") pod "495e3dab-2009-4192-901c-cc7908381182" (UID: "495e3dab-2009-4192-901c-cc7908381182"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.576285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "495e3dab-2009-4192-901c-cc7908381182" (UID: "495e3dab-2009-4192-901c-cc7908381182"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.578300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495e3dab-2009-4192-901c-cc7908381182-kube-api-access-fgc2x" (OuterVolumeSpecName: "kube-api-access-fgc2x") pod "495e3dab-2009-4192-901c-cc7908381182" (UID: "495e3dab-2009-4192-901c-cc7908381182"). InnerVolumeSpecName "kube-api-access-fgc2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.633985 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.677478 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.677511 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgc2x\" (UniqueName: \"kubernetes.io/projected/495e3dab-2009-4192-901c-cc7908381182-kube-api-access-fgc2x\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.677521 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/495e3dab-2009-4192-901c-cc7908381182-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.691358 4707 generic.go:334] "Generic (PLEG): container finished" podID="1e629c94-5e95-4697-a88e-b706bc09f078" containerID="975dcccdd088681322bc674e0033c6d3fffe813fab4e9d38b85d6dd7df15791b" exitCode=1 Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.691454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" event={"ID":"1e629c94-5e95-4697-a88e-b706bc09f078","Type":"ContainerDied","Data":"975dcccdd088681322bc674e0033c6d3fffe813fab4e9d38b85d6dd7df15791b"} Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.691489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" event={"ID":"1e629c94-5e95-4697-a88e-b706bc09f078","Type":"ContainerStarted","Data":"067382f3e815ff45447766fcd8dcbe73a8cc976928474c4388a9b425c1834eff"} Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.691881 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" secret="" err="secret \"galera-openstack-dockercfg-wpxm5\" not found" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.691924 4707 scope.go:117] "RemoveContainer" containerID="975dcccdd088681322bc674e0033c6d3fffe813fab4e9d38b85d6dd7df15791b" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.693248 4707 generic.go:334] "Generic (PLEG): container finished" podID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerID="11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20" exitCode=0 Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.693304 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.693312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"067aba60-80c2-4c68-887f-f3c6c40ed418","Type":"ContainerDied","Data":"11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20"} Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.693333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"067aba60-80c2-4c68-887f-f3c6c40ed418","Type":"ContainerDied","Data":"570a772df1232497cfebb471a3f89517f74cbda652cb75d340cd99580368e0ef"} Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.693347 4707 scope.go:117] "RemoveContainer" containerID="11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.694631 4707 generic.go:334] "Generic (PLEG): container finished" podID="495e3dab-2009-4192-901c-cc7908381182" containerID="c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd" exitCode=0 Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.694663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"495e3dab-2009-4192-901c-cc7908381182","Type":"ContainerDied","Data":"c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd"} Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.694686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"495e3dab-2009-4192-901c-cc7908381182","Type":"ContainerDied","Data":"d04a6e34135682c108aa5132823865d3181425dfce78ca41a1dc1da0170fa611"} Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.694692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-r5v5g" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.694746 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.722451 4707 scope.go:117] "RemoveContainer" containerID="72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.731402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-r5v5g"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.738394 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerName="rabbitmq" containerID="cri-o://3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55" gracePeriod=604800 Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.738537 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-r5v5g"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.743642 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.747994 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.752074 4707 scope.go:117] "RemoveContainer" containerID="11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.752549 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.752574 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20\": container with ID starting with 11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20 not found: ID does not exist" containerID="11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.752599 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20"} err="failed to get container status \"11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20\": rpc error: code = NotFound desc = could not find container \"11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20\": container with ID starting with 11b80b27d257af59f0302092ad46a87c51124a8b183e177de247ec8ef1602c20 not found: ID does not exist" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.752643 4707 scope.go:117] "RemoveContainer" containerID="72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636" Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.753086 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636\": container with ID starting with 72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636 not found: ID does not exist" containerID="72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.753186 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636"} 
err="failed to get container status \"72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636\": rpc error: code = NotFound desc = could not find container \"72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636\": container with ID starting with 72de51ea7bb6da19b2b207bb16673ad494b22a244aed51484140c9a63321b636 not found: ID does not exist" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.753254 4707 scope.go:117] "RemoveContainer" containerID="c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.756243 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.767922 4707 scope.go:117] "RemoveContainer" containerID="c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd" Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.768141 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd\": container with ID starting with c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd not found: ID does not exist" containerID="c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.768177 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd"} err="failed to get container status \"c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd\": rpc error: code = NotFound desc = could not find container \"c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd\": container with ID starting with c294b57b6c15de167dc6fe3b26c6894f0749d998488c2e167c4e1fd05f9b82dd not found: ID does not exist" Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.778652 4707 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:41:45 crc kubenswrapper[4707]: E0121 16:41:45.778775 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts podName:1e629c94-5e95-4697-a88e-b706bc09f078 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:46.278759449 +0000 UTC m=+6003.460275671 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts") pod "keystonecdc9-account-delete-46gj2" (UID: "1e629c94-5e95-4697-a88e-b706bc09f078") : configmap "openstack-scripts" not found Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.879578 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq5tt\" (UniqueName: \"kubernetes.io/projected/a324029a-5d47-4653-9298-cfebdc6520df-kube-api-access-vq5tt\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:45 crc kubenswrapper[4707]: I0121 16:41:45.879604 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a324029a-5d47-4653-9298-cfebdc6520df-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:46 crc kubenswrapper[4707]: E0121 16:41:46.284883 4707 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:41:46 crc kubenswrapper[4707]: E0121 16:41:46.284944 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts podName:1e629c94-5e95-4697-a88e-b706bc09f078 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:47.284931954 +0000 UTC m=+6004.466448175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts") pod "keystonecdc9-account-delete-46gj2" (UID: "1e629c94-5e95-4697-a88e-b706bc09f078") : configmap "openstack-scripts" not found Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.388437 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5"] Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.388820 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" podUID="42d603ef-d0e3-46e3-9dbb-563a49c73e2e" containerName="manager" containerID="cri-o://aa7c3c303eb643bb6549d509edba3cd9c298d9a0c4dfc9d43ddc855e4cbaef62" gracePeriod=10 Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.600233 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-1" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" containerName="galera" containerID="cri-o://7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7" gracePeriod=28 Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.610142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-khs56"] Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.610307 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-khs56" podUID="ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" containerName="registry-server" containerID="cri-o://b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7" gracePeriod=30 Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.672850 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v"] Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.680844 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cdds5c8v"] Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.705595 4707 generic.go:334] "Generic (PLEG): container finished" podID="42d603ef-d0e3-46e3-9dbb-563a49c73e2e" containerID="aa7c3c303eb643bb6549d509edba3cd9c298d9a0c4dfc9d43ddc855e4cbaef62" exitCode=0 Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.705651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" event={"ID":"42d603ef-d0e3-46e3-9dbb-563a49c73e2e","Type":"ContainerDied","Data":"aa7c3c303eb643bb6549d509edba3cd9c298d9a0c4dfc9d43ddc855e4cbaef62"} Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.712449 4707 generic.go:334] "Generic (PLEG): container finished" podID="1e629c94-5e95-4697-a88e-b706bc09f078" containerID="ca2c2d4f20bce64d9931051bb24d792a02cbca684869c3506ece62d521224541" exitCode=1 Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.712488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" event={"ID":"1e629c94-5e95-4697-a88e-b706bc09f078","Type":"ContainerDied","Data":"ca2c2d4f20bce64d9931051bb24d792a02cbca684869c3506ece62d521224541"} Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.712521 4707 scope.go:117] "RemoveContainer" containerID="975dcccdd088681322bc674e0033c6d3fffe813fab4e9d38b85d6dd7df15791b" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.712985 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" secret="" err="secret \"galera-openstack-dockercfg-wpxm5\" not found" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.713016 4707 scope.go:117] "RemoveContainer" containerID="ca2c2d4f20bce64d9931051bb24d792a02cbca684869c3506ece62d521224541" Jan 21 16:41:46 crc kubenswrapper[4707]: E0121 16:41:46.713275 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonecdc9-account-delete-46gj2_barbican-kuttl-tests(1e629c94-5e95-4697-a88e-b706bc09f078)\"" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.787277 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:41:46 crc kubenswrapper[4707]: E0121 16:41:46.849757 4707 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.893144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-apiservice-cert\") pod \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.893232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z6wq\" (UniqueName: \"kubernetes.io/projected/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-kube-api-access-9z6wq\") pod \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.893263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-webhook-cert\") pod \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\" (UID: \"42d603ef-d0e3-46e3-9dbb-563a49c73e2e\") " Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.897667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "42d603ef-d0e3-46e3-9dbb-563a49c73e2e" (UID: "42d603ef-d0e3-46e3-9dbb-563a49c73e2e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.899941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "42d603ef-d0e3-46e3-9dbb-563a49c73e2e" (UID: "42d603ef-d0e3-46e3-9dbb-563a49c73e2e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.899960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-kube-api-access-9z6wq" (OuterVolumeSpecName: "kube-api-access-9z6wq") pod "42d603ef-d0e3-46e3-9dbb-563a49c73e2e" (UID: "42d603ef-d0e3-46e3-9dbb-563a49c73e2e"). InnerVolumeSpecName "kube-api-access-9z6wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.948996 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.994928 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.994961 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z6wq\" (UniqueName: \"kubernetes.io/projected/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-kube-api-access-9z6wq\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:46 crc kubenswrapper[4707]: I0121 16:41:46.994971 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42d603ef-d0e3-46e3-9dbb-563a49c73e2e-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.095385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbxdj\" (UniqueName: \"kubernetes.io/projected/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d-kube-api-access-bbxdj\") pod \"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d\" (UID: \"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.098549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d-kube-api-access-bbxdj" (OuterVolumeSpecName: "kube-api-access-bbxdj") pod "ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" (UID: "ce284746-16f3-4e13-ab0d-5c8ee61e0a4d"). InnerVolumeSpecName "kube-api-access-bbxdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.102184 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.188734 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" path="/var/lib/kubelet/pods/067aba60-80c2-4c68-887f-f3c6c40ed418/volumes" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.189453 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28af8b85-10f6-466f-b49e-cb02c1530048" path="/var/lib/kubelet/pods/28af8b85-10f6-466f-b49e-cb02c1530048/volumes" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.190130 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495e3dab-2009-4192-901c-cc7908381182" path="/var/lib/kubelet/pods/495e3dab-2009-4192-901c-cc7908381182/volumes" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.190953 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a324029a-5d47-4653-9298-cfebdc6520df" path="/var/lib/kubelet/pods/a324029a-5d47-4653-9298-cfebdc6520df/volumes" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-erlang-cookie\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b049d8fc-bda9-45c0-b94b-ea5824e7e684-pod-info\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-confd\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b049d8fc-bda9-45c0-b94b-ea5824e7e684-erlang-cookie-secret\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b049d8fc-bda9-45c0-b94b-ea5824e7e684-plugins-conf\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-plugins\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: 
\"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crdrk\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-kube-api-access-crdrk\") pod \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\" (UID: \"b049d8fc-bda9-45c0-b94b-ea5824e7e684\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.197841 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbxdj\" (UniqueName: \"kubernetes.io/projected/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d-kube-api-access-bbxdj\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.198659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.200358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b049d8fc-bda9-45c0-b94b-ea5824e7e684-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.201955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.203879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b049d8fc-bda9-45c0-b94b-ea5824e7e684-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.205678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-kube-api-access-crdrk" (OuterVolumeSpecName: "kube-api-access-crdrk") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "kube-api-access-crdrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.206822 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b049d8fc-bda9-45c0-b94b-ea5824e7e684-pod-info" (OuterVolumeSpecName: "pod-info") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.207446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca" (OuterVolumeSpecName: "persistence") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "pvc-ccbea64b-034f-47c2-9358-8410f265b9ca". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.248393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b049d8fc-bda9-45c0-b94b-ea5824e7e684" (UID: "b049d8fc-bda9-45c0-b94b-ea5824e7e684"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.299326 4707 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299341 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b049d8fc-bda9-45c0-b94b-ea5824e7e684-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.299388 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts podName:1e629c94-5e95-4697-a88e-b706bc09f078 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:49.299373385 +0000 UTC m=+6006.480889607 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts") pod "keystonecdc9-account-delete-46gj2" (UID: "1e629c94-5e95-4697-a88e-b706bc09f078") : configmap "openstack-scripts" not found Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299669 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299834 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crdrk\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-kube-api-access-crdrk\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299857 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299891 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b049d8fc-bda9-45c0-b94b-ea5824e7e684-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299915 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b049d8fc-bda9-45c0-b94b-ea5824e7e684-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299941 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") on node \"crc\" " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.299960 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b049d8fc-bda9-45c0-b94b-ea5824e7e684-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.324521 4707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.324643 4707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ccbea64b-034f-47c2-9358-8410f265b9ca" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca") on node "crc" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.401760 4707 reconciler_common.go:293] "Volume detached for volume \"pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ccbea64b-034f-47c2-9358-8410f265b9ca\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.529667 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.604146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-credential-keys\") pod \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.604189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-fernet-keys\") pod \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.604220 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq2nt\" (UniqueName: \"kubernetes.io/projected/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-kube-api-access-cq2nt\") pod \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.604245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-scripts\") pod \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.604264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-config-data\") pod \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\" (UID: \"1addc64b-1eb7-43fd-ade9-e6ee00d007bb\") " Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.608159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1addc64b-1eb7-43fd-ade9-e6ee00d007bb" (UID: "1addc64b-1eb7-43fd-ade9-e6ee00d007bb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.608317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-kube-api-access-cq2nt" (OuterVolumeSpecName: "kube-api-access-cq2nt") pod "1addc64b-1eb7-43fd-ade9-e6ee00d007bb" (UID: "1addc64b-1eb7-43fd-ade9-e6ee00d007bb"). InnerVolumeSpecName "kube-api-access-cq2nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.608397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-scripts" (OuterVolumeSpecName: "scripts") pod "1addc64b-1eb7-43fd-ade9-e6ee00d007bb" (UID: "1addc64b-1eb7-43fd-ade9-e6ee00d007bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.608508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1addc64b-1eb7-43fd-ade9-e6ee00d007bb" (UID: "1addc64b-1eb7-43fd-ade9-e6ee00d007bb"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.619356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-config-data" (OuterVolumeSpecName: "config-data") pod "1addc64b-1eb7-43fd-ade9-e6ee00d007bb" (UID: "1addc64b-1eb7-43fd-ade9-e6ee00d007bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.705555 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.705681 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.705737 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq2nt\" (UniqueName: \"kubernetes.io/projected/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-kube-api-access-cq2nt\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.705786 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.705875 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addc64b-1eb7-43fd-ade9-e6ee00d007bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.719569 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" secret="" err="secret \"galera-openstack-dockercfg-wpxm5\" not found" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.719608 4707 scope.go:117] "RemoveContainer" containerID="ca2c2d4f20bce64d9931051bb24d792a02cbca684869c3506ece62d521224541" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.719839 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonecdc9-account-delete-46gj2_barbican-kuttl-tests(1e629c94-5e95-4697-a88e-b706bc09f078)\"" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.721376 4707 generic.go:334] "Generic (PLEG): container finished" podID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerID="3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55" exitCode=0 Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.721421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"b049d8fc-bda9-45c0-b94b-ea5824e7e684","Type":"ContainerDied","Data":"3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.721440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"b049d8fc-bda9-45c0-b94b-ea5824e7e684","Type":"ContainerDied","Data":"de7c56622a49acff4d9fe2ed49f90c60a0ce6bd57019e90897e3f8820a1aa4cc"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.721452 4707 scope.go:117] "RemoveContainer" containerID="3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.721551 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.725471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" event={"ID":"42d603ef-d0e3-46e3-9dbb-563a49c73e2e","Type":"ContainerDied","Data":"602a9944c39514e518a8661780763494315b4391442daa9bb3326a35374d0174"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.725908 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.726914 4707 generic.go:334] "Generic (PLEG): container finished" podID="1addc64b-1eb7-43fd-ade9-e6ee00d007bb" containerID="a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba" exitCode=0 Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.726968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" event={"ID":"1addc64b-1eb7-43fd-ade9-e6ee00d007bb","Type":"ContainerDied","Data":"a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.726993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" event={"ID":"1addc64b-1eb7-43fd-ade9-e6ee00d007bb","Type":"ContainerDied","Data":"b24f506e0f994533c4707fd4bb820b1de105f0de0e1993eeb9b94c7530af824c"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.727035 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-5f5bf8f646-hch88" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.728289 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" containerID="b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7" exitCode=0 Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.728321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-khs56" event={"ID":"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d","Type":"ContainerDied","Data":"b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.728343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-khs56" event={"ID":"ce284746-16f3-4e13-ab0d-5c8ee61e0a4d","Type":"ContainerDied","Data":"de08628e52157c31afa789a1b1e63ce4b855243cfe2c3955b47a53f44a715c85"} Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.728482 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-khs56" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.738720 4707 scope.go:117] "RemoveContainer" containerID="a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.753972 4707 scope.go:117] "RemoveContainer" containerID="3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.754924 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55\": container with ID starting with 3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55 not found: ID does not exist" containerID="3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.754957 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55"} err="failed to get container status \"3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55\": rpc error: code = NotFound desc = could not find container \"3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55\": container with ID starting with 3eb159059a8118bfa2d57007f07c4f0fb366034813ead246cde8006f8c76ec55 not found: ID does not exist" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.754994 4707 scope.go:117] "RemoveContainer" containerID="a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.755294 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7\": container with ID starting with a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7 not found: ID does not exist" containerID="a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.755329 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7"} err="failed to get container status \"a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7\": rpc error: code = NotFound desc = could not find container \"a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7\": container with ID starting with a6855298908a54100b5341de6585e79ecfad79465b065f1163c9afbc55c41ea7 not found: ID does not exist" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.755351 4707 scope.go:117] "RemoveContainer" containerID="aa7c3c303eb643bb6549d509edba3cd9c298d9a0c4dfc9d43ddc855e4cbaef62" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.757627 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-khs56"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.762219 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-khs56"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.766192 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.769997 4707 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5ccc48789b-vncz5"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.770188 4707 scope.go:117] "RemoveContainer" containerID="a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.774253 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.782764 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.785490 4707 scope.go:117] "RemoveContainer" containerID="a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.785766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba\": container with ID starting with a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba not found: ID does not exist" containerID="a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.785888 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba"} err="failed to get container status \"a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba\": rpc error: code = NotFound desc = could not find container \"a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba\": container with ID starting with a1f471ef938d77286c51c938265f775fb690ef07fc9c2f1adf5a71053330b7ba not found: ID does not exist" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.785969 4707 scope.go:117] "RemoveContainer" containerID="b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.785843 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-5f5bf8f646-hch88"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.789364 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-5f5bf8f646-hch88"] Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.797343 4707 scope.go:117] "RemoveContainer" containerID="b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7" Jan 21 16:41:47 crc kubenswrapper[4707]: E0121 16:41:47.797914 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7\": container with ID starting with b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7 not found: ID does not exist" containerID="b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7" Jan 21 16:41:47 crc kubenswrapper[4707]: I0121 16:41:47.797943 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7"} err="failed to get container status \"b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7\": rpc error: code = NotFound desc = could not find container \"b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7\": container with ID starting with 
b4b7e4d6ac5733f5be546fb9d0d9355019194de9125fa030503a2052f6280ca7 not found: ID does not exist" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.308000 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e183c46d-7158-49d9-b9e7-d07475597a79\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415056 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-default\") pod \"e183c46d-7158-49d9-b9e7-d07475597a79\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-operator-scripts\") pod \"e183c46d-7158-49d9-b9e7-d07475597a79\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-generated\") pod \"e183c46d-7158-49d9-b9e7-d07475597a79\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-kolla-config\") pod \"e183c46d-7158-49d9-b9e7-d07475597a79\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzgxx\" (UniqueName: \"kubernetes.io/projected/e183c46d-7158-49d9-b9e7-d07475597a79-kube-api-access-fzgxx\") pod \"e183c46d-7158-49d9-b9e7-d07475597a79\" (UID: \"e183c46d-7158-49d9-b9e7-d07475597a79\") " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e183c46d-7158-49d9-b9e7-d07475597a79" (UID: "e183c46d-7158-49d9-b9e7-d07475597a79"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e183c46d-7158-49d9-b9e7-d07475597a79" (UID: "e183c46d-7158-49d9-b9e7-d07475597a79"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e183c46d-7158-49d9-b9e7-d07475597a79" (UID: "e183c46d-7158-49d9-b9e7-d07475597a79"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.415955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e183c46d-7158-49d9-b9e7-d07475597a79" (UID: "e183c46d-7158-49d9-b9e7-d07475597a79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.419960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e183c46d-7158-49d9-b9e7-d07475597a79-kube-api-access-fzgxx" (OuterVolumeSpecName: "kube-api-access-fzgxx") pod "e183c46d-7158-49d9-b9e7-d07475597a79" (UID: "e183c46d-7158-49d9-b9e7-d07475597a79"). InnerVolumeSpecName "kube-api-access-fzgxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.423345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "e183c46d-7158-49d9-b9e7-d07475597a79" (UID: "e183c46d-7158-49d9-b9e7-d07475597a79"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.516819 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.516850 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.516864 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.516877 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e183c46d-7158-49d9-b9e7-d07475597a79-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.516886 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e183c46d-7158-49d9-b9e7-d07475597a79-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.516909 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzgxx\" (UniqueName: \"kubernetes.io/projected/e183c46d-7158-49d9-b9e7-d07475597a79-kube-api-access-fzgxx\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.527148 4707 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.618575 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.699197 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-0" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="galera" containerID="cri-o://1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" gracePeriod=26 Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.736397 4707 generic.go:334] "Generic (PLEG): container finished" podID="e183c46d-7158-49d9-b9e7-d07475597a79" containerID="7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7" exitCode=0 Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.736449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.736440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"e183c46d-7158-49d9-b9e7-d07475597a79","Type":"ContainerDied","Data":"7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7"} Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.736570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"e183c46d-7158-49d9-b9e7-d07475597a79","Type":"ContainerDied","Data":"7e1a9df906063497754f3e1ed1b44e47fd89609ef615f49307c4dcaa38b8531a"} Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.736592 4707 scope.go:117] "RemoveContainer" containerID="7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.751229 4707 scope.go:117] "RemoveContainer" containerID="4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.757084 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.760859 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.788155 4707 scope.go:117] "RemoveContainer" containerID="7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7" Jan 21 16:41:48 crc kubenswrapper[4707]: E0121 16:41:48.788502 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7\": container with ID starting with 7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7 not found: ID does not exist" containerID="7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.788532 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7"} err="failed to get container status \"7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7\": rpc error: code = NotFound desc = could not find container 
\"7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7\": container with ID starting with 7cc0dc283336feacbd7236742acbcadb51479aa648bd50cd57181acebf21f8b7 not found: ID does not exist" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.788552 4707 scope.go:117] "RemoveContainer" containerID="4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6" Jan 21 16:41:48 crc kubenswrapper[4707]: E0121 16:41:48.788779 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6\": container with ID starting with 4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6 not found: ID does not exist" containerID="4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6" Jan 21 16:41:48 crc kubenswrapper[4707]: I0121 16:41:48.788828 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6"} err="failed to get container status \"4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6\": rpc error: code = NotFound desc = could not find container \"4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6\": container with ID starting with 4f08ab524a288b2df1de5e3cefd7b9b67c25007dc87cceee66e5141595876ea6 not found: ID does not exist" Jan 21 16:41:48 crc kubenswrapper[4707]: E0121 16:41:48.892137 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:41:48 crc kubenswrapper[4707]: E0121 16:41:48.893293 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:41:48 crc kubenswrapper[4707]: E0121 16:41:48.894681 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 16:41:48 crc kubenswrapper[4707]: E0121 16:41:48.894720 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="barbican-kuttl-tests/openstack-galera-0" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="galera" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.089060 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-7vp86"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.090502 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-7vp86"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.102001 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.106527 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystonecdc9-account-delete-46gj2"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.109865 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-cdc9-account-create-update-5qq8z"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.190232 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1addc64b-1eb7-43fd-ade9-e6ee00d007bb" path="/var/lib/kubelet/pods/1addc64b-1eb7-43fd-ade9-e6ee00d007bb/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.190682 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1baa8281-9a62-4a40-a97f-c2ce33f9216d" path="/var/lib/kubelet/pods/1baa8281-9a62-4a40-a97f-c2ce33f9216d/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.191164 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d603ef-d0e3-46e3-9dbb-563a49c73e2e" path="/var/lib/kubelet/pods/42d603ef-d0e3-46e3-9dbb-563a49c73e2e/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.192288 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" path="/var/lib/kubelet/pods/b049d8fc-bda9-45c0-b94b-ea5824e7e684/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.192766 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c211bec1-5e1f-4ce9-832a-24155a50c399" path="/var/lib/kubelet/pods/c211bec1-5e1f-4ce9-832a-24155a50c399/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.193162 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" path="/var/lib/kubelet/pods/ce284746-16f3-4e13-ab0d-5c8ee61e0a4d/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.194027 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" path="/var/lib/kubelet/pods/e183c46d-7158-49d9-b9e7-d07475597a79/volumes" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.255836 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-generated\") pod \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfp7r\" (UniqueName: \"kubernetes.io/projected/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kube-api-access-lfp7r\") pod \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kolla-config\") pod \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-default\") pod \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-operator-scripts\") pod \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\" (UID: \"1ac0f4d4-a0c5-4a36-ae23-6c8669603551\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "1ac0f4d4-a0c5-4a36-ae23-6c8669603551" (UID: "1ac0f4d4-a0c5-4a36-ae23-6c8669603551"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.328879 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: E0121 16:41:49.328934 4707 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 16:41:49 crc kubenswrapper[4707]: E0121 16:41:49.328975 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts podName:1e629c94-5e95-4697-a88e-b706bc09f078 nodeName:}" failed. No retries permitted until 2026-01-21 16:41:53.328962434 +0000 UTC m=+6010.510478656 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts") pod "keystonecdc9-account-delete-46gj2" (UID: "1e629c94-5e95-4697-a88e-b706bc09f078") : configmap "openstack-scripts" not found Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.329060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1ac0f4d4-a0c5-4a36-ae23-6c8669603551" (UID: "1ac0f4d4-a0c5-4a36-ae23-6c8669603551"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.329412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "1ac0f4d4-a0c5-4a36-ae23-6c8669603551" (UID: "1ac0f4d4-a0c5-4a36-ae23-6c8669603551"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.329688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ac0f4d4-a0c5-4a36-ae23-6c8669603551" (UID: "1ac0f4d4-a0c5-4a36-ae23-6c8669603551"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.333675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kube-api-access-lfp7r" (OuterVolumeSpecName: "kube-api-access-lfp7r") pod "1ac0f4d4-a0c5-4a36-ae23-6c8669603551" (UID: "1ac0f4d4-a0c5-4a36-ae23-6c8669603551"). InnerVolumeSpecName "kube-api-access-lfp7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.338173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "1ac0f4d4-a0c5-4a36-ae23-6c8669603551" (UID: "1ac0f4d4-a0c5-4a36-ae23-6c8669603551"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.393012 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.430420 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.430455 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfp7r\" (UniqueName: \"kubernetes.io/projected/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kube-api-access-lfp7r\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.430465 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.430474 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.430483 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac0f4d4-a0c5-4a36-ae23-6c8669603551-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.439560 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.531624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4sx\" (UniqueName: \"kubernetes.io/projected/1e629c94-5e95-4697-a88e-b706bc09f078-kube-api-access-qb4sx\") pod \"1e629c94-5e95-4697-a88e-b706bc09f078\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.531716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts\") pod \"1e629c94-5e95-4697-a88e-b706bc09f078\" (UID: \"1e629c94-5e95-4697-a88e-b706bc09f078\") " Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.532083 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.532118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e629c94-5e95-4697-a88e-b706bc09f078" (UID: "1e629c94-5e95-4697-a88e-b706bc09f078"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.533829 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e629c94-5e95-4697-a88e-b706bc09f078-kube-api-access-qb4sx" (OuterVolumeSpecName: "kube-api-access-qb4sx") pod "1e629c94-5e95-4697-a88e-b706bc09f078" (UID: "1e629c94-5e95-4697-a88e-b706bc09f078"). InnerVolumeSpecName "kube-api-access-qb4sx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.633614 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4sx\" (UniqueName: \"kubernetes.io/projected/1e629c94-5e95-4697-a88e-b706bc09f078-kube-api-access-qb4sx\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.633644 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e629c94-5e95-4697-a88e-b706bc09f078-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.747368 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" exitCode=0 Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.747416 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.747420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"1ac0f4d4-a0c5-4a36-ae23-6c8669603551","Type":"ContainerDied","Data":"1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6"} Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.747785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"1ac0f4d4-a0c5-4a36-ae23-6c8669603551","Type":"ContainerDied","Data":"8f9c366c6775679b99883037d9a04830a8b338d04110859ac3faf5e3c69626dc"} Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.747841 4707 scope.go:117] "RemoveContainer" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.757833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" event={"ID":"1e629c94-5e95-4697-a88e-b706bc09f078","Type":"ContainerDied","Data":"067382f3e815ff45447766fcd8dcbe73a8cc976928474c4388a9b425c1834eff"} Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.757890 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystonecdc9-account-delete-46gj2" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.778958 4707 scope.go:117] "RemoveContainer" containerID="91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.780326 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.788398 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.801845 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystonecdc9-account-delete-46gj2"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.811353 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystonecdc9-account-delete-46gj2"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.824709 4707 scope.go:117] "RemoveContainer" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" Jan 21 16:41:49 crc kubenswrapper[4707]: E0121 16:41:49.828335 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6\": container with ID starting with 1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6 not found: ID does not exist" containerID="1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.828377 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6"} err="failed to get container status \"1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6\": rpc error: code = NotFound desc = could not find container \"1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6\": container with ID starting with 1f87b07d03672cdb34229ef3d698e698072eedc7a56377529d436d0b166d60a6 not found: ID does not exist" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.828401 4707 scope.go:117] "RemoveContainer" containerID="91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15" Jan 21 16:41:49 crc kubenswrapper[4707]: E0121 16:41:49.828673 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15\": container with ID starting with 91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15 not found: ID does not exist" containerID="91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.828697 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15"} err="failed to get container status \"91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15\": rpc error: code = NotFound desc = could not find container \"91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15\": container with ID starting with 91ef58571226117739d90bcdc15bc3aa34a9ba4b86c36fcd91d54a66422d1b15 not found: ID does not exist" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.828713 4707 scope.go:117] "RemoveContainer" 
containerID="ca2c2d4f20bce64d9931051bb24d792a02cbca684869c3506ece62d521224541" Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.854527 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl"] Jan 21 16:41:49 crc kubenswrapper[4707]: I0121 16:41:49.854694 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" podUID="2d310b10-05e8-4691-b1d9-84a2f829173c" containerName="manager" containerID="cri-o://8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709" gracePeriod=10 Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.019504 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-6d62q"] Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.019709 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-6d62q" podUID="6da1d95d-5c48-46ef-925b-124a6b8af96a" containerName="registry-server" containerID="cri-o://a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c" gracePeriod=30 Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.072425 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm"] Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.079652 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069wgxhm"] Jan 21 16:41:50 crc kubenswrapper[4707]: E0121 16:41:50.157345 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da1d95d_5c48_46ef_925b_124a6b8af96a.slice/crio-conmon-a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da1d95d_5c48_46ef_925b_124a6b8af96a.slice/crio-a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.259156 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.345598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5fxh\" (UniqueName: \"kubernetes.io/projected/2d310b10-05e8-4691-b1d9-84a2f829173c-kube-api-access-f5fxh\") pod \"2d310b10-05e8-4691-b1d9-84a2f829173c\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.346095 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-apiservice-cert\") pod \"2d310b10-05e8-4691-b1d9-84a2f829173c\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.346655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-webhook-cert\") pod \"2d310b10-05e8-4691-b1d9-84a2f829173c\" (UID: \"2d310b10-05e8-4691-b1d9-84a2f829173c\") " Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.350871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "2d310b10-05e8-4691-b1d9-84a2f829173c" (UID: "2d310b10-05e8-4691-b1d9-84a2f829173c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.355052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d310b10-05e8-4691-b1d9-84a2f829173c-kube-api-access-f5fxh" (OuterVolumeSpecName: "kube-api-access-f5fxh") pod "2d310b10-05e8-4691-b1d9-84a2f829173c" (UID: "2d310b10-05e8-4691-b1d9-84a2f829173c"). InnerVolumeSpecName "kube-api-access-f5fxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.357041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "2d310b10-05e8-4691-b1d9-84a2f829173c" (UID: "2d310b10-05e8-4691-b1d9-84a2f829173c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.417567 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.448375 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5fxh\" (UniqueName: \"kubernetes.io/projected/2d310b10-05e8-4691-b1d9-84a2f829173c-kube-api-access-f5fxh\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.448410 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.448423 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d310b10-05e8-4691-b1d9-84a2f829173c-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.549530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv4z9\" (UniqueName: \"kubernetes.io/projected/6da1d95d-5c48-46ef-925b-124a6b8af96a-kube-api-access-cv4z9\") pod \"6da1d95d-5c48-46ef-925b-124a6b8af96a\" (UID: \"6da1d95d-5c48-46ef-925b-124a6b8af96a\") " Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.551839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da1d95d-5c48-46ef-925b-124a6b8af96a-kube-api-access-cv4z9" (OuterVolumeSpecName: "kube-api-access-cv4z9") pod "6da1d95d-5c48-46ef-925b-124a6b8af96a" (UID: "6da1d95d-5c48-46ef-925b-124a6b8af96a"). InnerVolumeSpecName "kube-api-access-cv4z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.650965 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv4z9\" (UniqueName: \"kubernetes.io/projected/6da1d95d-5c48-46ef-925b-124a6b8af96a-kube-api-access-cv4z9\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.769724 4707 generic.go:334] "Generic (PLEG): container finished" podID="6da1d95d-5c48-46ef-925b-124a6b8af96a" containerID="a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c" exitCode=0 Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.769760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6d62q" event={"ID":"6da1d95d-5c48-46ef-925b-124a6b8af96a","Type":"ContainerDied","Data":"a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c"} Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.769775 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6d62q" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.769820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6d62q" event={"ID":"6da1d95d-5c48-46ef-925b-124a6b8af96a","Type":"ContainerDied","Data":"e8b0e769eba33a4a2441fd81ac08963fdb33d2622315cf973736ad881a757e49"} Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.769838 4707 scope.go:117] "RemoveContainer" containerID="a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.772317 4707 generic.go:334] "Generic (PLEG): container finished" podID="2d310b10-05e8-4691-b1d9-84a2f829173c" containerID="8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709" exitCode=0 Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.772365 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.772366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" event={"ID":"2d310b10-05e8-4691-b1d9-84a2f829173c","Type":"ContainerDied","Data":"8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709"} Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.772500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl" event={"ID":"2d310b10-05e8-4691-b1d9-84a2f829173c","Type":"ContainerDied","Data":"5bf95eeed98e29ded5d717bc0aa922fa6abf7292c7bf00e61debefc85255c6a1"} Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.786915 4707 scope.go:117] "RemoveContainer" containerID="a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c" Jan 21 16:41:50 crc kubenswrapper[4707]: E0121 16:41:50.787200 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c\": container with ID starting with a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c not found: ID does not exist" containerID="a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.787228 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c"} err="failed to get container status \"a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c\": rpc error: code = NotFound desc = could not find container \"a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c\": container with ID starting with a1e796ec65c68dd4c66c9b0295426f23c295e180ae57179b0565009fbb640b2c not found: ID does not exist" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.787244 4707 scope.go:117] "RemoveContainer" containerID="8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.790926 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-6d62q"] Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.794847 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-6d62q"] Jan 21 16:41:50 crc 
kubenswrapper[4707]: I0121 16:41:50.803685 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl"] Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.806339 4707 scope.go:117] "RemoveContainer" containerID="8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709" Jan 21 16:41:50 crc kubenswrapper[4707]: E0121 16:41:50.809697 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709\": container with ID starting with 8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709 not found: ID does not exist" containerID="8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.809738 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709"} err="failed to get container status \"8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709\": rpc error: code = NotFound desc = could not find container \"8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709\": container with ID starting with 8905bcdfde350b41ab962dbf1c080a559ceff241f0cb31d0a8087627f3dd3709 not found: ID does not exist" Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.815122 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-f4c696d8c-vg2fl"] Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.989302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh"] Jan 21 16:41:50 crc kubenswrapper[4707]: I0121 16:41:50.989519 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" podUID="2bc097f0-4f25-4e0d-949e-ee3a654e8f52" containerName="operator" containerID="cri-o://1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a" gracePeriod=10 Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.196096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" path="/var/lib/kubelet/pods/1ac0f4d4-a0c5-4a36-ae23-6c8669603551/volumes" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.196830 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" path="/var/lib/kubelet/pods/1e629c94-5e95-4697-a88e-b706bc09f078/volumes" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.197263 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d310b10-05e8-4691-b1d9-84a2f829173c" path="/var/lib/kubelet/pods/2d310b10-05e8-4691-b1d9-84a2f829173c/volumes" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.198113 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408cfa00-50fd-4d12-845c-4e467fb79876" path="/var/lib/kubelet/pods/408cfa00-50fd-4d12-845c-4e467fb79876/volumes" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.198600 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da1d95d-5c48-46ef-925b-124a6b8af96a" path="/var/lib/kubelet/pods/6da1d95d-5c48-46ef-925b-124a6b8af96a/volumes" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.198997 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-index-7xrws"] Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.199137 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" podUID="99f2ba6e-5966-46d6-9f1c-80798985a942" containerName="registry-server" containerID="cri-o://e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da" gracePeriod=30 Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.220525 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw"] Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.224772 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s29qw"] Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.290418 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.360595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr9jd\" (UniqueName: \"kubernetes.io/projected/2bc097f0-4f25-4e0d-949e-ee3a654e8f52-kube-api-access-kr9jd\") pod \"2bc097f0-4f25-4e0d-949e-ee3a654e8f52\" (UID: \"2bc097f0-4f25-4e0d-949e-ee3a654e8f52\") " Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.363922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc097f0-4f25-4e0d-949e-ee3a654e8f52-kube-api-access-kr9jd" (OuterVolumeSpecName: "kube-api-access-kr9jd") pod "2bc097f0-4f25-4e0d-949e-ee3a654e8f52" (UID: "2bc097f0-4f25-4e0d-949e-ee3a654e8f52"). InnerVolumeSpecName "kube-api-access-kr9jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.461883 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr9jd\" (UniqueName: \"kubernetes.io/projected/2bc097f0-4f25-4e0d-949e-ee3a654e8f52-kube-api-access-kr9jd\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.544140 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.664149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcgsn\" (UniqueName: \"kubernetes.io/projected/99f2ba6e-5966-46d6-9f1c-80798985a942-kube-api-access-hcgsn\") pod \"99f2ba6e-5966-46d6-9f1c-80798985a942\" (UID: \"99f2ba6e-5966-46d6-9f1c-80798985a942\") " Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.667301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f2ba6e-5966-46d6-9f1c-80798985a942-kube-api-access-hcgsn" (OuterVolumeSpecName: "kube-api-access-hcgsn") pod "99f2ba6e-5966-46d6-9f1c-80798985a942" (UID: "99f2ba6e-5966-46d6-9f1c-80798985a942"). InnerVolumeSpecName "kube-api-access-hcgsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.765702 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcgsn\" (UniqueName: \"kubernetes.io/projected/99f2ba6e-5966-46d6-9f1c-80798985a942-kube-api-access-hcgsn\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.782378 4707 generic.go:334] "Generic (PLEG): container finished" podID="99f2ba6e-5966-46d6-9f1c-80798985a942" containerID="e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da" exitCode=0 Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.782426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" event={"ID":"99f2ba6e-5966-46d6-9f1c-80798985a942","Type":"ContainerDied","Data":"e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da"} Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.782447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" event={"ID":"99f2ba6e-5966-46d6-9f1c-80798985a942","Type":"ContainerDied","Data":"6cda0df1b1fe3805ed7d7afe4a3dfba5dfc011e476c5d06f88d562a6e27290e3"} Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.782463 4707 scope.go:117] "RemoveContainer" containerID="e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.782526 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-7xrws" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.785432 4707 generic.go:334] "Generic (PLEG): container finished" podID="2bc097f0-4f25-4e0d-949e-ee3a654e8f52" containerID="1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a" exitCode=0 Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.785476 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.785484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" event={"ID":"2bc097f0-4f25-4e0d-949e-ee3a654e8f52","Type":"ContainerDied","Data":"1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a"} Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.785529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh" event={"ID":"2bc097f0-4f25-4e0d-949e-ee3a654e8f52","Type":"ContainerDied","Data":"4d198b3965d42ce464c24528ad775d9b563af205e2c7b463658a18244686f568"} Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.796282 4707 scope.go:117] "RemoveContainer" containerID="e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da" Jan 21 16:41:51 crc kubenswrapper[4707]: E0121 16:41:51.797151 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da\": container with ID starting with e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da not found: ID does not exist" containerID="e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.797188 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da"} err="failed to get container status \"e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da\": rpc error: code = NotFound desc = could not find container \"e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da\": container with ID starting with e3e99d046e2d11784a11d119851d85597dd52796b79317f75652f3d5b72a21da not found: ID does not exist" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.797208 4707 scope.go:117] "RemoveContainer" containerID="1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.808293 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-7xrws"] Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.812100 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-7xrws"] Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.812543 4707 scope.go:117] "RemoveContainer" containerID="1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a" Jan 21 16:41:51 crc kubenswrapper[4707]: E0121 16:41:51.813761 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a\": container with ID starting with 1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a not found: ID does not exist" containerID="1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.813800 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a"} err="failed to get container status \"1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a\": rpc error: code = 
NotFound desc = could not find container \"1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a\": container with ID starting with 1aa2df83835eeb59fd123233006c64d972f11a1de07d3b90b065fb7d112f4e0a not found: ID does not exist" Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.816697 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh"] Jan 21 16:41:51 crc kubenswrapper[4707]: I0121 16:41:51.820495 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s4rxh"] Jan 21 16:41:53 crc kubenswrapper[4707]: I0121 16:41:53.185042 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:41:53 crc kubenswrapper[4707]: E0121 16:41:53.185260 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:41:53 crc kubenswrapper[4707]: I0121 16:41:53.187666 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc097f0-4f25-4e0d-949e-ee3a654e8f52" path="/var/lib/kubelet/pods/2bc097f0-4f25-4e0d-949e-ee3a654e8f52/volumes" Jan 21 16:41:53 crc kubenswrapper[4707]: I0121 16:41:53.188262 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d203ad7-2b7a-4bde-8757-4299541c2b5a" path="/var/lib/kubelet/pods/6d203ad7-2b7a-4bde-8757-4299541c2b5a/volumes" Jan 21 16:41:53 crc kubenswrapper[4707]: I0121 16:41:53.188764 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f2ba6e-5966-46d6-9f1c-80798985a942" path="/var/lib/kubelet/pods/99f2ba6e-5966-46d6-9f1c-80798985a942/volumes" Jan 21 16:41:53 crc kubenswrapper[4707]: I0121 16:41:53.997199 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv"] Jan 21 16:41:53 crc kubenswrapper[4707]: I0121 16:41:53.997575 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" containerName="manager" containerID="cri-o://d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e" gracePeriod=10 Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.210422 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-njrms"] Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.210588 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-njrms" podUID="6ccf8768-dba5-4264-a203-8acceb86a164" containerName="registry-server" containerID="cri-o://dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310" gracePeriod=30 Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.236838 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh"] Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.239884 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1jvtqh"] Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.377059 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.501510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-webhook-cert\") pod \"032ee273-2c71-4187-aeef-048fd267a1b1\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.501647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-apiservice-cert\") pod \"032ee273-2c71-4187-aeef-048fd267a1b1\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.501686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwqmt\" (UniqueName: \"kubernetes.io/projected/032ee273-2c71-4187-aeef-048fd267a1b1-kube-api-access-rwqmt\") pod \"032ee273-2c71-4187-aeef-048fd267a1b1\" (UID: \"032ee273-2c71-4187-aeef-048fd267a1b1\") " Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.512924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "032ee273-2c71-4187-aeef-048fd267a1b1" (UID: "032ee273-2c71-4187-aeef-048fd267a1b1"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.512948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "032ee273-2c71-4187-aeef-048fd267a1b1" (UID: "032ee273-2c71-4187-aeef-048fd267a1b1"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.512974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032ee273-2c71-4187-aeef-048fd267a1b1-kube-api-access-rwqmt" (OuterVolumeSpecName: "kube-api-access-rwqmt") pod "032ee273-2c71-4187-aeef-048fd267a1b1" (UID: "032ee273-2c71-4187-aeef-048fd267a1b1"). InnerVolumeSpecName "kube-api-access-rwqmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.544554 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.603303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2bzw\" (UniqueName: \"kubernetes.io/projected/6ccf8768-dba5-4264-a203-8acceb86a164-kube-api-access-k2bzw\") pod \"6ccf8768-dba5-4264-a203-8acceb86a164\" (UID: \"6ccf8768-dba5-4264-a203-8acceb86a164\") " Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.603629 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.603650 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwqmt\" (UniqueName: \"kubernetes.io/projected/032ee273-2c71-4187-aeef-048fd267a1b1-kube-api-access-rwqmt\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.603661 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/032ee273-2c71-4187-aeef-048fd267a1b1-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.606543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ccf8768-dba5-4264-a203-8acceb86a164-kube-api-access-k2bzw" (OuterVolumeSpecName: "kube-api-access-k2bzw") pod "6ccf8768-dba5-4264-a203-8acceb86a164" (UID: "6ccf8768-dba5-4264-a203-8acceb86a164"). InnerVolumeSpecName "kube-api-access-k2bzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.705023 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2bzw\" (UniqueName: \"kubernetes.io/projected/6ccf8768-dba5-4264-a203-8acceb86a164-kube-api-access-k2bzw\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.806755 4707 generic.go:334] "Generic (PLEG): container finished" podID="032ee273-2c71-4187-aeef-048fd267a1b1" containerID="d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e" exitCode=0 Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.806926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" event={"ID":"032ee273-2c71-4187-aeef-048fd267a1b1","Type":"ContainerDied","Data":"d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e"} Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.807063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" event={"ID":"032ee273-2c71-4187-aeef-048fd267a1b1","Type":"ContainerDied","Data":"7ce21def523c4d84cb2d12ad084bb477d83dd9e8a5f2e6ced17dc42ee201f9fe"} Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.806989 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.807084 4707 scope.go:117] "RemoveContainer" containerID="d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.809068 4707 generic.go:334] "Generic (PLEG): container finished" podID="6ccf8768-dba5-4264-a203-8acceb86a164" containerID="dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310" exitCode=0 Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.809143 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-njrms" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.809736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-njrms" event={"ID":"6ccf8768-dba5-4264-a203-8acceb86a164","Type":"ContainerDied","Data":"dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310"} Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.809780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-njrms" event={"ID":"6ccf8768-dba5-4264-a203-8acceb86a164","Type":"ContainerDied","Data":"d04d469de6314fbfbc317d100c1480f90f79775ce36b15d5a353bdc5185d728c"} Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.820899 4707 scope.go:117] "RemoveContainer" containerID="d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e" Jan 21 16:41:54 crc kubenswrapper[4707]: E0121 16:41:54.821158 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e\": container with ID starting with d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e not found: ID does not exist" containerID="d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.821187 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e"} err="failed to get container status \"d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e\": rpc error: code = NotFound desc = could not find container \"d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e\": container with ID starting with d51f28ed0ff0a502f3723cb02a9bf17cad3dddc2a5ecb3bda1c3dd0b61c1da4e not found: ID does not exist" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.821204 4707 scope.go:117] "RemoveContainer" containerID="dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.829264 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-njrms"] Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.833385 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-njrms"] Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.834904 4707 scope.go:117] "RemoveContainer" containerID="dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310" Jan 21 16:41:54 crc kubenswrapper[4707]: E0121 16:41:54.835276 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310\": container with ID starting with dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310 not found: ID does not exist" containerID="dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.835306 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310"} err="failed to get container status \"dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310\": rpc error: code = NotFound desc = could not find container \"dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310\": container with ID starting with dad505fa1bb0d6fe8e874aa6004ec77e21446abf1bd6fed447376d4078ec6310 not found: ID does not exist" Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.840082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv"] Jan 21 16:41:54 crc kubenswrapper[4707]: I0121 16:41:54.844435 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-558567f65c-8l8bv"] Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.188084 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" path="/var/lib/kubelet/pods/032ee273-2c71-4187-aeef-048fd267a1b1/volumes" Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.188829 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27640829-d55e-4c63-9ff8-f644777688db" path="/var/lib/kubelet/pods/27640829-d55e-4c63-9ff8-f644777688db/volumes" Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.189462 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ccf8768-dba5-4264-a203-8acceb86a164" path="/var/lib/kubelet/pods/6ccf8768-dba5-4264-a203-8acceb86a164/volumes" Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.640930 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k"] Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.641107 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" podUID="43080aee-92cf-4789-858a-d48c03b0d1b8" containerName="manager" containerID="cri-o://e72804f50faad2dafabf86fbc4813643bf4aee500884d015780ab629571f641b" gracePeriod=10 Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.818855 4707 generic.go:334] "Generic (PLEG): container finished" podID="43080aee-92cf-4789-858a-d48c03b0d1b8" containerID="e72804f50faad2dafabf86fbc4813643bf4aee500884d015780ab629571f641b" exitCode=0 Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.818930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" event={"ID":"43080aee-92cf-4789-858a-d48c03b0d1b8","Type":"ContainerDied","Data":"e72804f50faad2dafabf86fbc4813643bf4aee500884d015780ab629571f641b"} Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.847683 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-jmcgw"] Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.847873 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/mariadb-operator-index-jmcgw" podUID="7789650d-9683-47fd-a464-ba7cc7dac503" containerName="registry-server" containerID="cri-o://27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2" gracePeriod=30 Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.868763 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j"] Jan 21 16:41:55 crc kubenswrapper[4707]: I0121 16:41:55.872387 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720b85f7j"] Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.044186 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.119188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9wkg\" (UniqueName: \"kubernetes.io/projected/43080aee-92cf-4789-858a-d48c03b0d1b8-kube-api-access-g9wkg\") pod \"43080aee-92cf-4789-858a-d48c03b0d1b8\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.119280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-webhook-cert\") pod \"43080aee-92cf-4789-858a-d48c03b0d1b8\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.119336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-apiservice-cert\") pod \"43080aee-92cf-4789-858a-d48c03b0d1b8\" (UID: \"43080aee-92cf-4789-858a-d48c03b0d1b8\") " Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.129013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43080aee-92cf-4789-858a-d48c03b0d1b8-kube-api-access-g9wkg" (OuterVolumeSpecName: "kube-api-access-g9wkg") pod "43080aee-92cf-4789-858a-d48c03b0d1b8" (UID: "43080aee-92cf-4789-858a-d48c03b0d1b8"). InnerVolumeSpecName "kube-api-access-g9wkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.141426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "43080aee-92cf-4789-858a-d48c03b0d1b8" (UID: "43080aee-92cf-4789-858a-d48c03b0d1b8"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.141454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "43080aee-92cf-4789-858a-d48c03b0d1b8" (UID: "43080aee-92cf-4789-858a-d48c03b0d1b8"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.220942 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.220968 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43080aee-92cf-4789-858a-d48c03b0d1b8-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.220981 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9wkg\" (UniqueName: \"kubernetes.io/projected/43080aee-92cf-4789-858a-d48c03b0d1b8-kube-api-access-g9wkg\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.257547 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.322042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wghg\" (UniqueName: \"kubernetes.io/projected/7789650d-9683-47fd-a464-ba7cc7dac503-kube-api-access-6wghg\") pod \"7789650d-9683-47fd-a464-ba7cc7dac503\" (UID: \"7789650d-9683-47fd-a464-ba7cc7dac503\") " Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.324549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7789650d-9683-47fd-a464-ba7cc7dac503-kube-api-access-6wghg" (OuterVolumeSpecName: "kube-api-access-6wghg") pod "7789650d-9683-47fd-a464-ba7cc7dac503" (UID: "7789650d-9683-47fd-a464-ba7cc7dac503"). InnerVolumeSpecName "kube-api-access-6wghg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.424213 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wghg\" (UniqueName: \"kubernetes.io/projected/7789650d-9683-47fd-a464-ba7cc7dac503-kube-api-access-6wghg\") on node \"crc\" DevicePath \"\"" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.827336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" event={"ID":"43080aee-92cf-4789-858a-d48c03b0d1b8","Type":"ContainerDied","Data":"4ed84533cb92bce74834c80d87dc00c3782890ec7501843d56db7be0e355415c"} Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.827398 4707 scope.go:117] "RemoveContainer" containerID="e72804f50faad2dafabf86fbc4813643bf4aee500884d015780ab629571f641b" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.827347 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.829518 4707 generic.go:334] "Generic (PLEG): container finished" podID="7789650d-9683-47fd-a464-ba7cc7dac503" containerID="27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2" exitCode=0 Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.829553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-jmcgw" event={"ID":"7789650d-9683-47fd-a464-ba7cc7dac503","Type":"ContainerDied","Data":"27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2"} Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.829632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-jmcgw" event={"ID":"7789650d-9683-47fd-a464-ba7cc7dac503","Type":"ContainerDied","Data":"243db92db96f74533d0d6eda2718b592a594577e64402430c1c4c705c9021291"} Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.829572 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-jmcgw" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.840829 4707 scope.go:117] "RemoveContainer" containerID="27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.852193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k"] Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.855831 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f6b55f788-jjl6k"] Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.860682 4707 scope.go:117] "RemoveContainer" containerID="27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2" Jan 21 16:41:56 crc kubenswrapper[4707]: E0121 16:41:56.861074 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2\": container with ID starting with 27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2 not found: ID does not exist" containerID="27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.861104 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2"} err="failed to get container status \"27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2\": rpc error: code = NotFound desc = could not find container \"27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2\": container with ID starting with 27d9a47786343f6ad97f925ed71fe44a65392678e394fb3527f7a6f34952d4b2 not found: ID does not exist" Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.861379 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-jmcgw"] Jan 21 16:41:56 crc kubenswrapper[4707]: I0121 16:41:56.865851 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-jmcgw"] Jan 21 16:41:57 crc kubenswrapper[4707]: I0121 16:41:57.187783 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5" 
path="/var/lib/kubelet/pods/0acb6b4b-8bbb-4ae1-91a7-baba7ea7a7d5/volumes" Jan 21 16:41:57 crc kubenswrapper[4707]: I0121 16:41:57.188455 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43080aee-92cf-4789-858a-d48c03b0d1b8" path="/var/lib/kubelet/pods/43080aee-92cf-4789-858a-d48c03b0d1b8/volumes" Jan 21 16:41:57 crc kubenswrapper[4707]: I0121 16:41:57.188900 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7789650d-9683-47fd-a464-ba7cc7dac503" path="/var/lib/kubelet/pods/7789650d-9683-47fd-a464-ba7cc7dac503/volumes" Jan 21 16:42:06 crc kubenswrapper[4707]: I0121 16:42:06.182407 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:42:06 crc kubenswrapper[4707]: E0121 16:42:06.183160 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:42:20 crc kubenswrapper[4707]: I0121 16:42:20.182228 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:42:20 crc kubenswrapper[4707]: E0121 16:42:20.182910 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:42:25 crc kubenswrapper[4707]: I0121 16:42:25.725450 4707 scope.go:117] "RemoveContainer" containerID="1d24beb383009e5f7a010d7afeedbf40ce30e2fc400bf8a948a3846c293e79d8" Jan 21 16:42:25 crc kubenswrapper[4707]: I0121 16:42:25.745734 4707 scope.go:117] "RemoveContainer" containerID="d6ad1bad2014968ef32dd04e52254652b213394aa3d7617ad98864842c6b0017" Jan 21 16:42:25 crc kubenswrapper[4707]: I0121 16:42:25.760036 4707 scope.go:117] "RemoveContainer" containerID="a3477f81007dd93bfd43fe19286511d2dbee1acd4e837fa19ab4d0cc6f6f9002" Jan 21 16:42:33 crc kubenswrapper[4707]: I0121 16:42:33.185387 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:42:33 crc kubenswrapper[4707]: E0121 16:42:33.186264 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.539450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-pbsd6"] Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540187 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc097f0-4f25-4e0d-949e-ee3a654e8f52" containerName="operator" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 
16:42:37.540209 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc097f0-4f25-4e0d-949e-ee3a654e8f52" containerName="operator" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540225 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerName="mysql-bootstrap" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540231 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerName="mysql-bootstrap" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f2ba6e-5966-46d6-9f1c-80798985a942" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f2ba6e-5966-46d6-9f1c-80798985a942" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540261 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540267 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540274 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ccf8768-dba5-4264-a203-8acceb86a164" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540280 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ccf8768-dba5-4264-a203-8acceb86a164" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540287 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da1d95d-5c48-46ef-925b-124a6b8af96a" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da1d95d-5c48-46ef-925b-124a6b8af96a" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540306 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43080aee-92cf-4789-858a-d48c03b0d1b8" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540312 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43080aee-92cf-4789-858a-d48c03b0d1b8" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540321 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d310b10-05e8-4691-b1d9-84a2f829173c" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540329 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d310b10-05e8-4691-b1d9-84a2f829173c" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540340 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d603ef-d0e3-46e3-9dbb-563a49c73e2e" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540346 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d603ef-d0e3-46e3-9dbb-563a49c73e2e" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540355 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540361 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540376 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerName="rabbitmq" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerName="rabbitmq" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540391 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e3dab-2009-4192-901c-cc7908381182" containerName="memcached" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540396 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e3dab-2009-4192-901c-cc7908381182" containerName="memcached" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540405 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1addc64b-1eb7-43fd-ade9-e6ee00d007bb" containerName="keystone-api" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540411 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1addc64b-1eb7-43fd-ade9-e6ee00d007bb" containerName="keystone-api" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540426 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540446 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540452 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540461 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7789650d-9683-47fd-a464-ba7cc7dac503" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540467 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7789650d-9683-47fd-a464-ba7cc7dac503" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540473 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="mysql-bootstrap" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540481 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="mysql-bootstrap" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540492 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" containerName="mariadb-account-delete" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540499 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" containerName="mariadb-account-delete" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540507 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerName="setup-container" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540513 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerName="setup-container" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" containerName="mysql-bootstrap" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540528 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" containerName="mysql-bootstrap" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540539 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" containerName="mariadb-account-delete" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540545 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" containerName="mariadb-account-delete" Jan 21 16:42:37 crc kubenswrapper[4707]: E0121 16:42:37.540556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da1d95d-5c48-46ef-925b-124a6b8af96a" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540747 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ccf8768-dba5-4264-a203-8acceb86a164" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540756 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc097f0-4f25-4e0d-949e-ee3a654e8f52" containerName="operator" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540766 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" containerName="mariadb-account-delete" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540774 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43080aee-92cf-4789-858a-d48c03b0d1b8" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540794 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="495e3dab-2009-4192-901c-cc7908381182" containerName="memcached" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540804 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce284746-16f3-4e13-ab0d-5c8ee61e0a4d" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540827 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="067aba60-80c2-4c68-887f-f3c6c40ed418" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540836 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7789650d-9683-47fd-a464-ba7cc7dac503" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="032ee273-2c71-4187-aeef-048fd267a1b1" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540853 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1addc64b-1eb7-43fd-ade9-e6ee00d007bb" containerName="keystone-api" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540860 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="99f2ba6e-5966-46d6-9f1c-80798985a942" containerName="registry-server" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540872 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d310b10-05e8-4691-b1d9-84a2f829173c" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540879 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e183c46d-7158-49d9-b9e7-d07475597a79" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540885 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b049d8fc-bda9-45c0-b94b-ea5824e7e684" containerName="rabbitmq" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540890 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d603ef-d0e3-46e3-9dbb-563a49c73e2e" containerName="manager" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540896 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e629c94-5e95-4697-a88e-b706bc09f078" containerName="mariadb-account-delete" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.540902 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac0f4d4-a0c5-4a36-ae23-6c8669603551" containerName="galera" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.541465 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.549570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-pbsd6"] Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.551516 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.551739 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-7jtkw" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.553531 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.690338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7hx\" (UniqueName: \"kubernetes.io/projected/5ade1815-f342-4559-84c1-a0f7e964cc78-kube-api-access-5q7hx\") pod \"mariadb-operator-index-pbsd6\" (UID: \"5ade1815-f342-4559-84c1-a0f7e964cc78\") " pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.791859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7hx\" (UniqueName: \"kubernetes.io/projected/5ade1815-f342-4559-84c1-a0f7e964cc78-kube-api-access-5q7hx\") pod \"mariadb-operator-index-pbsd6\" (UID: \"5ade1815-f342-4559-84c1-a0f7e964cc78\") " pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.808649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7hx\" (UniqueName: \"kubernetes.io/projected/5ade1815-f342-4559-84c1-a0f7e964cc78-kube-api-access-5q7hx\") pod \"mariadb-operator-index-pbsd6\" (UID: \"5ade1815-f342-4559-84c1-a0f7e964cc78\") " pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.862635 4707 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:37 crc kubenswrapper[4707]: I0121 16:42:37.925002 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-pbsd6"] Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.234061 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-pbsd6"] Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.328486 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-h9mjd"] Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.329428 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.338597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h9mjd"] Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.400517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z949\" (UniqueName: \"kubernetes.io/projected/e6c0641b-e565-49e9-a0d3-8afeaeca6835-kube-api-access-6z949\") pod \"mariadb-operator-index-h9mjd\" (UID: \"e6c0641b-e565-49e9-a0d3-8afeaeca6835\") " pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.502373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z949\" (UniqueName: \"kubernetes.io/projected/e6c0641b-e565-49e9-a0d3-8afeaeca6835-kube-api-access-6z949\") pod \"mariadb-operator-index-h9mjd\" (UID: \"e6c0641b-e565-49e9-a0d3-8afeaeca6835\") " pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.519564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z949\" (UniqueName: \"kubernetes.io/projected/e6c0641b-e565-49e9-a0d3-8afeaeca6835-kube-api-access-6z949\") pod \"mariadb-operator-index-h9mjd\" (UID: \"e6c0641b-e565-49e9-a0d3-8afeaeca6835\") " pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:38 crc kubenswrapper[4707]: I0121 16:42:38.647569 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.019509 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h9mjd"] Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.108013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pbsd6" event={"ID":"5ade1815-f342-4559-84c1-a0f7e964cc78","Type":"ContainerStarted","Data":"a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c"} Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.108156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pbsd6" event={"ID":"5ade1815-f342-4559-84c1-a0f7e964cc78","Type":"ContainerStarted","Data":"9fa1aa00552e3fa1b6a9d5821d5f1eee8a97dd9d2eec98ec51899c3d8addca7e"} Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.108109 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-pbsd6" podUID="5ade1815-f342-4559-84c1-a0f7e964cc78" containerName="registry-server" containerID="cri-o://a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c" gracePeriod=2 Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.109002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h9mjd" event={"ID":"e6c0641b-e565-49e9-a0d3-8afeaeca6835","Type":"ContainerStarted","Data":"63104afe71e3d1a6e05309092c22441c78cba894a2883f847e12fc01ea5c80e8"} Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.123049 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-pbsd6" podStartSLOduration=1.36832003 podStartE2EDuration="2.123033756s" podCreationTimestamp="2026-01-21 16:42:37 +0000 UTC" firstStartedPulling="2026-01-21 16:42:38.239486915 +0000 UTC m=+6055.421003137" lastFinishedPulling="2026-01-21 16:42:38.994200641 +0000 UTC m=+6056.175716863" observedRunningTime="2026-01-21 16:42:39.12057879 +0000 UTC m=+6056.302095012" watchObservedRunningTime="2026-01-21 16:42:39.123033756 +0000 UTC m=+6056.304549978" Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.392665 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-pbsd6_5ade1815-f342-4559-84c1-a0f7e964cc78/registry-server/0.log" Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.392729 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.514859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q7hx\" (UniqueName: \"kubernetes.io/projected/5ade1815-f342-4559-84c1-a0f7e964cc78-kube-api-access-5q7hx\") pod \"5ade1815-f342-4559-84c1-a0f7e964cc78\" (UID: \"5ade1815-f342-4559-84c1-a0f7e964cc78\") " Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.521407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ade1815-f342-4559-84c1-a0f7e964cc78-kube-api-access-5q7hx" (OuterVolumeSpecName: "kube-api-access-5q7hx") pod "5ade1815-f342-4559-84c1-a0f7e964cc78" (UID: "5ade1815-f342-4559-84c1-a0f7e964cc78"). InnerVolumeSpecName "kube-api-access-5q7hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:42:39 crc kubenswrapper[4707]: I0121 16:42:39.616644 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q7hx\" (UniqueName: \"kubernetes.io/projected/5ade1815-f342-4559-84c1-a0f7e964cc78-kube-api-access-5q7hx\") on node \"crc\" DevicePath \"\"" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.115632 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-pbsd6_5ade1815-f342-4559-84c1-a0f7e964cc78/registry-server/0.log" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.115680 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ade1815-f342-4559-84c1-a0f7e964cc78" containerID="a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c" exitCode=2 Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.115737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pbsd6" event={"ID":"5ade1815-f342-4559-84c1-a0f7e964cc78","Type":"ContainerDied","Data":"a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c"} Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.115746 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-pbsd6" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.115763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pbsd6" event={"ID":"5ade1815-f342-4559-84c1-a0f7e964cc78","Type":"ContainerDied","Data":"9fa1aa00552e3fa1b6a9d5821d5f1eee8a97dd9d2eec98ec51899c3d8addca7e"} Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.115778 4707 scope.go:117] "RemoveContainer" containerID="a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.117061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h9mjd" event={"ID":"e6c0641b-e565-49e9-a0d3-8afeaeca6835","Type":"ContainerStarted","Data":"4516751d0d240ce069923af5419b8d27e0dd26bfaad44d80c2ba226f98d5bd1e"} Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.129387 4707 scope.go:117] "RemoveContainer" containerID="a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c" Jan 21 16:42:40 crc kubenswrapper[4707]: E0121 16:42:40.129656 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c\": container with ID starting with a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c not found: ID does not exist" containerID="a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.129688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c"} err="failed to get container status \"a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c\": rpc error: code = NotFound desc = could not find container \"a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c\": container with ID starting with a274d39fa33f58416a599a1639b2aeb77f984beac54cfd341737da267e60348c not found: ID does not exist" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.132094 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-index-h9mjd" podStartSLOduration=1.556513597 podStartE2EDuration="2.132084641s" podCreationTimestamp="2026-01-21 16:42:38 +0000 UTC" firstStartedPulling="2026-01-21 16:42:39.023976078 +0000 UTC m=+6056.205492300" lastFinishedPulling="2026-01-21 16:42:39.599547123 +0000 UTC m=+6056.781063344" observedRunningTime="2026-01-21 16:42:40.130655824 +0000 UTC m=+6057.312172046" watchObservedRunningTime="2026-01-21 16:42:40.132084641 +0000 UTC m=+6057.313600863" Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.140432 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-pbsd6"] Jan 21 16:42:40 crc kubenswrapper[4707]: I0121 16:42:40.143919 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-pbsd6"] Jan 21 16:42:41 crc kubenswrapper[4707]: I0121 16:42:41.187838 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ade1815-f342-4559-84c1-a0f7e964cc78" path="/var/lib/kubelet/pods/5ade1815-f342-4559-84c1-a0f7e964cc78/volumes" Jan 21 16:42:44 crc kubenswrapper[4707]: I0121 16:42:44.950731 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs"] Jan 21 16:42:44 crc kubenswrapper[4707]: E0121 16:42:44.951175 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ade1815-f342-4559-84c1-a0f7e964cc78" containerName="registry-server" Jan 21 16:42:44 crc kubenswrapper[4707]: I0121 16:42:44.951188 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ade1815-f342-4559-84c1-a0f7e964cc78" containerName="registry-server" Jan 21 16:42:44 crc kubenswrapper[4707]: I0121 16:42:44.951307 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ade1815-f342-4559-84c1-a0f7e964cc78" containerName="registry-server" Jan 21 16:42:44 crc kubenswrapper[4707]: I0121 16:42:44.952095 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:44 crc kubenswrapper[4707]: I0121 16:42:44.954645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:42:44 crc kubenswrapper[4707]: I0121 16:42:44.963083 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs"] Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.085431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.085663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w7n8\" (UniqueName: \"kubernetes.io/projected/24838a0d-e03b-4b81-9423-3865c0286bf3-kube-api-access-6w7n8\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.085878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.186852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.186974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.187015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w7n8\" (UniqueName: \"kubernetes.io/projected/24838a0d-e03b-4b81-9423-3865c0286bf3-kube-api-access-6w7n8\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.187556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.187608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.214711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w7n8\" (UniqueName: \"kubernetes.io/projected/24838a0d-e03b-4b81-9423-3865c0286bf3-kube-api-access-6w7n8\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.264366 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:45 crc kubenswrapper[4707]: I0121 16:42:45.608465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs"] Jan 21 16:42:46 crc kubenswrapper[4707]: I0121 16:42:46.150459 4707 generic.go:334] "Generic (PLEG): container finished" podID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerID="ccdbabb86834f33aa5d679b5beca05b74fca5730cc2582b6805ca3a7ece2b044" exitCode=0 Jan 21 16:42:46 crc kubenswrapper[4707]: I0121 16:42:46.150499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" event={"ID":"24838a0d-e03b-4b81-9423-3865c0286bf3","Type":"ContainerDied","Data":"ccdbabb86834f33aa5d679b5beca05b74fca5730cc2582b6805ca3a7ece2b044"} Jan 21 16:42:46 crc kubenswrapper[4707]: I0121 16:42:46.150536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" event={"ID":"24838a0d-e03b-4b81-9423-3865c0286bf3","Type":"ContainerStarted","Data":"d27c98ce983fbb3eb267b1ac8175b3e8e76cfef6020a6ead79317b0b34e50cbb"} Jan 21 16:42:46 crc kubenswrapper[4707]: I0121 16:42:46.182863 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:42:46 crc kubenswrapper[4707]: E0121 16:42:46.183281 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:42:47 crc kubenswrapper[4707]: I0121 16:42:47.156860 4707 generic.go:334] "Generic (PLEG): container finished" podID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerID="bf836ffdb591b2c8d14e9bf966e787867f9b289c835c609ab4884e271b408e2f" exitCode=0 Jan 21 
16:42:47 crc kubenswrapper[4707]: I0121 16:42:47.157013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" event={"ID":"24838a0d-e03b-4b81-9423-3865c0286bf3","Type":"ContainerDied","Data":"bf836ffdb591b2c8d14e9bf966e787867f9b289c835c609ab4884e271b408e2f"} Jan 21 16:42:48 crc kubenswrapper[4707]: I0121 16:42:48.168565 4707 generic.go:334] "Generic (PLEG): container finished" podID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerID="f25cecb56d365aba2332151eae2ddf2bdbbb0aaf97904f6f553ba7e356af6efe" exitCode=0 Jan 21 16:42:48 crc kubenswrapper[4707]: I0121 16:42:48.168608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" event={"ID":"24838a0d-e03b-4b81-9423-3865c0286bf3","Type":"ContainerDied","Data":"f25cecb56d365aba2332151eae2ddf2bdbbb0aaf97904f6f553ba7e356af6efe"} Jan 21 16:42:48 crc kubenswrapper[4707]: I0121 16:42:48.648201 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:48 crc kubenswrapper[4707]: I0121 16:42:48.648244 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:48 crc kubenswrapper[4707]: I0121 16:42:48.668561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.195114 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.365758 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.442365 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-util\") pod \"24838a0d-e03b-4b81-9423-3865c0286bf3\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.442410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w7n8\" (UniqueName: \"kubernetes.io/projected/24838a0d-e03b-4b81-9423-3865c0286bf3-kube-api-access-6w7n8\") pod \"24838a0d-e03b-4b81-9423-3865c0286bf3\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.442444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-bundle\") pod \"24838a0d-e03b-4b81-9423-3865c0286bf3\" (UID: \"24838a0d-e03b-4b81-9423-3865c0286bf3\") " Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.443201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-bundle" (OuterVolumeSpecName: "bundle") pod "24838a0d-e03b-4b81-9423-3865c0286bf3" (UID: "24838a0d-e03b-4b81-9423-3865c0286bf3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.446985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24838a0d-e03b-4b81-9423-3865c0286bf3-kube-api-access-6w7n8" (OuterVolumeSpecName: "kube-api-access-6w7n8") pod "24838a0d-e03b-4b81-9423-3865c0286bf3" (UID: "24838a0d-e03b-4b81-9423-3865c0286bf3"). InnerVolumeSpecName "kube-api-access-6w7n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.453229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-util" (OuterVolumeSpecName: "util") pod "24838a0d-e03b-4b81-9423-3865c0286bf3" (UID: "24838a0d-e03b-4b81-9423-3865c0286bf3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.544390 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.544420 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w7n8\" (UniqueName: \"kubernetes.io/projected/24838a0d-e03b-4b81-9423-3865c0286bf3-kube-api-access-6w7n8\") on node \"crc\" DevicePath \"\"" Jan 21 16:42:49 crc kubenswrapper[4707]: I0121 16:42:49.544431 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24838a0d-e03b-4b81-9423-3865c0286bf3-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:42:50 crc kubenswrapper[4707]: I0121 16:42:50.180674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" event={"ID":"24838a0d-e03b-4b81-9423-3865c0286bf3","Type":"ContainerDied","Data":"d27c98ce983fbb3eb267b1ac8175b3e8e76cfef6020a6ead79317b0b34e50cbb"} Jan 21 16:42:50 crc kubenswrapper[4707]: I0121 16:42:50.180717 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27c98ce983fbb3eb267b1ac8175b3e8e76cfef6020a6ead79317b0b34e50cbb" Jan 21 16:42:50 crc kubenswrapper[4707]: I0121 16:42:50.180688 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs" Jan 21 16:42:59 crc kubenswrapper[4707]: I0121 16:42:59.182798 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:42:59 crc kubenswrapper[4707]: E0121 16:42:59.183339 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.157976 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj"] Jan 21 16:43:01 crc kubenswrapper[4707]: E0121 16:43:01.158344 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="extract" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.158356 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="extract" Jan 21 16:43:01 crc kubenswrapper[4707]: E0121 16:43:01.158375 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="pull" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.158381 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="pull" Jan 21 16:43:01 crc kubenswrapper[4707]: E0121 16:43:01.158391 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="util" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.158398 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="util" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.158492 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" containerName="extract" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.158857 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.160768 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.160899 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.160984 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-sgt7k" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.169313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj"] Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.289113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-webhook-cert\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.289273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk55w\" (UniqueName: \"kubernetes.io/projected/8358c5ec-c227-4e08-bfa4-502096362233-kube-api-access-rk55w\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.289400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-apiservice-cert\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.391032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk55w\" (UniqueName: \"kubernetes.io/projected/8358c5ec-c227-4e08-bfa4-502096362233-kube-api-access-rk55w\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.391166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-apiservice-cert\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.391237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-webhook-cert\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") 
" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.396704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-apiservice-cert\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.410339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk55w\" (UniqueName: \"kubernetes.io/projected/8358c5ec-c227-4e08-bfa4-502096362233-kube-api-access-rk55w\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.410617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-webhook-cert\") pod \"mariadb-operator-controller-manager-7d4c746c5f-n25qj\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.472299 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:01 crc kubenswrapper[4707]: I0121 16:43:01.863124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj"] Jan 21 16:43:01 crc kubenswrapper[4707]: W0121 16:43:01.868200 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8358c5ec_c227_4e08_bfa4_502096362233.slice/crio-ebdd920628b2db36939a3dd569a348b41d21bb7010654a80111c92dc7cfa2df9 WatchSource:0}: Error finding container ebdd920628b2db36939a3dd569a348b41d21bb7010654a80111c92dc7cfa2df9: Status 404 returned error can't find the container with id ebdd920628b2db36939a3dd569a348b41d21bb7010654a80111c92dc7cfa2df9 Jan 21 16:43:02 crc kubenswrapper[4707]: I0121 16:43:02.238702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" event={"ID":"8358c5ec-c227-4e08-bfa4-502096362233","Type":"ContainerStarted","Data":"35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472"} Jan 21 16:43:02 crc kubenswrapper[4707]: I0121 16:43:02.238741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" event={"ID":"8358c5ec-c227-4e08-bfa4-502096362233","Type":"ContainerStarted","Data":"ebdd920628b2db36939a3dd569a348b41d21bb7010654a80111c92dc7cfa2df9"} Jan 21 16:43:02 crc kubenswrapper[4707]: I0121 16:43:02.238849 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:02 crc kubenswrapper[4707]: I0121 16:43:02.252540 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" podStartSLOduration=1.252527574 podStartE2EDuration="1.252527574s" 
podCreationTimestamp="2026-01-21 16:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:43:02.25004173 +0000 UTC m=+6079.431557951" watchObservedRunningTime="2026-01-21 16:43:02.252527574 +0000 UTC m=+6079.434043796" Jan 21 16:43:11 crc kubenswrapper[4707]: I0121 16:43:11.476674 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 16:43:14 crc kubenswrapper[4707]: I0121 16:43:14.182391 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:43:14 crc kubenswrapper[4707]: E0121 16:43:14.182755 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.282692 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh"] Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.284663 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.286341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.289043 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh"] Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.392500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.392562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmb55\" (UniqueName: \"kubernetes.io/projected/4d976b88-45b2-4fcd-92c3-8998178306ac-kube-api-access-tmb55\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.392586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.494272 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.494331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmb55\" (UniqueName: \"kubernetes.io/projected/4d976b88-45b2-4fcd-92c3-8998178306ac-kube-api-access-tmb55\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.494355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.494753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.494803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.509498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmb55\" (UniqueName: \"kubernetes.io/projected/4d976b88-45b2-4fcd-92c3-8998178306ac-kube-api-access-tmb55\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.599285 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:18 crc kubenswrapper[4707]: I0121 16:43:18.959882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh"] Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.320053 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerID="8d7851549e3f49a90fa859adf2f35180ba74c44ccf59107ee7a7a1c22f328e53" exitCode=0 Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.320094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" event={"ID":"4d976b88-45b2-4fcd-92c3-8998178306ac","Type":"ContainerDied","Data":"8d7851549e3f49a90fa859adf2f35180ba74c44ccf59107ee7a7a1c22f328e53"} Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.320119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" event={"ID":"4d976b88-45b2-4fcd-92c3-8998178306ac","Type":"ContainerStarted","Data":"a6816887344f903a437b9db310990a074c3a4130cc33757544c191ffdd249714"} Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.655387 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-qd9pv"] Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.656046 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.658929 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-rknxw" Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.666394 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-qd9pv"] Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.809032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh58n\" (UniqueName: \"kubernetes.io/projected/99298ce1-4bc7-4409-9c72-7d07dae3a06b-kube-api-access-gh58n\") pod \"infra-operator-index-qd9pv\" (UID: \"99298ce1-4bc7-4409-9c72-7d07dae3a06b\") " pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.911081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh58n\" (UniqueName: \"kubernetes.io/projected/99298ce1-4bc7-4409-9c72-7d07dae3a06b-kube-api-access-gh58n\") pod \"infra-operator-index-qd9pv\" (UID: \"99298ce1-4bc7-4409-9c72-7d07dae3a06b\") " pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.926989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh58n\" (UniqueName: \"kubernetes.io/projected/99298ce1-4bc7-4409-9c72-7d07dae3a06b-kube-api-access-gh58n\") pod \"infra-operator-index-qd9pv\" (UID: \"99298ce1-4bc7-4409-9c72-7d07dae3a06b\") " pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:19 crc kubenswrapper[4707]: I0121 16:43:19.978122 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.262106 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.263127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.267284 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-jv8rl" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.267363 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.267552 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.269127 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.270199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.271355 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.271398 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.274976 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.275678 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.281108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.284336 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.294422 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.326071 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-qd9pv"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.329835 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerID="5ef3e3111ca082e11b497c499b1a5e093e47d9944cc4a217c94f19fe6ccf13d2" exitCode=0 Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.329878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" event={"ID":"4d976b88-45b2-4fcd-92c3-8998178306ac","Type":"ContainerDied","Data":"5ef3e3111ca082e11b497c499b1a5e093e47d9944cc4a217c94f19fe6ccf13d2"} Jan 21 16:43:20 crc kubenswrapper[4707]: W0121 16:43:20.335496 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99298ce1_4bc7_4409_9c72_7d07dae3a06b.slice/crio-6b661341a1b3a8d0db1ee9ba5a65eb70ed2778534171f0f31b32ddd5f2ce1186 WatchSource:0}: Error finding container 6b661341a1b3a8d0db1ee9ba5a65eb70ed2778534171f0f31b32ddd5f2ce1186: Status 404 returned error can't find the container with id 6b661341a1b3a8d0db1ee9ba5a65eb70ed2778534171f0f31b32ddd5f2ce1186 Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.419925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-default\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.419993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-default\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsf2w\" (UniqueName: \"kubernetes.io/projected/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kube-api-access-hsf2w\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 
16:43:20.420114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-kolla-config\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kolla-config\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420226 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-operator-scripts\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-default\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-generated\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc 
kubenswrapper[4707]: I0121 16:43:20.420338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-kolla-config\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp57s\" (UniqueName: \"kubernetes.io/projected/151f7a26-9818-45b1-90d5-d45f9ce116ce-kube-api-access-rp57s\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448tw\" (UniqueName: \"kubernetes.io/projected/522a96ca-cde2-4466-804e-85a0f8e56653-kube-api-access-448tw\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-operator-scripts\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.420672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-generated\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-default\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-default\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsf2w\" (UniqueName: \"kubernetes.io/projected/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kube-api-access-hsf2w\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 
16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-kolla-config\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kolla-config\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-operator-scripts\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-default\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-generated\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-kolla-config\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp57s\" (UniqueName: \"kubernetes.io/projected/151f7a26-9818-45b1-90d5-d45f9ce116ce-kube-api-access-rp57s\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448tw\" (UniqueName: \"kubernetes.io/projected/522a96ca-cde2-4466-804e-85a0f8e56653-kube-api-access-448tw\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-operator-scripts\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522462 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-generated\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-generated\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522899 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") device mount path \"/mnt/openstack/pv07\"" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.522956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.523033 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") device mount path \"/mnt/openstack/pv01\"" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.523068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-generated\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.523255 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") device mount path \"/mnt/openstack/pv12\"" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.523491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kolla-config\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.523633 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-kolla-config\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.524082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-default\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.524253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-default\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.524327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-default\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.524301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.524514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-operator-scripts\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.524834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-kolla-config\") pod \"openstack-galera-1\" (UID: 
\"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.525284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-operator-scripts\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.537023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.537137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448tw\" (UniqueName: \"kubernetes.io/projected/522a96ca-cde2-4466-804e-85a0f8e56653-kube-api-access-448tw\") pod \"openstack-galera-2\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.537265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp57s\" (UniqueName: \"kubernetes.io/projected/151f7a26-9818-45b1-90d5-d45f9ce116ce-kube-api-access-rp57s\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.537321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.537379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.537684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsf2w\" (UniqueName: \"kubernetes.io/projected/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kube-api-access-hsf2w\") pod \"openstack-galera-0\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.577044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.586039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.592848 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.940734 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 16:43:20 crc kubenswrapper[4707]: W0121 16:43:20.945021 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod225ee0a9_1192_40ab_93d9_010fb4c4a07e.slice/crio-8916e66de8b3abee9a5927596c04e5238b66af46e10438ef3f103def325a30d0 WatchSource:0}: Error finding container 8916e66de8b3abee9a5927596c04e5238b66af46e10438ef3f103def325a30d0: Status 404 returned error can't find the container with id 8916e66de8b3abee9a5927596c04e5238b66af46e10438ef3f103def325a30d0 Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.991950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 16:43:20 crc kubenswrapper[4707]: I0121 16:43:20.993267 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 16:43:21 crc kubenswrapper[4707]: W0121 16:43:21.005255 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151f7a26_9818_45b1_90d5_d45f9ce116ce.slice/crio-4cd1776a0fcfec8d6bb03aaf1a6fcdc779f8b88a7c6cd7959f480762f73e2d42 WatchSource:0}: Error finding container 4cd1776a0fcfec8d6bb03aaf1a6fcdc779f8b88a7c6cd7959f480762f73e2d42: Status 404 returned error can't find the container with id 4cd1776a0fcfec8d6bb03aaf1a6fcdc779f8b88a7c6cd7959f480762f73e2d42 Jan 21 16:43:21 crc kubenswrapper[4707]: W0121 16:43:21.005547 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod522a96ca_cde2_4466_804e_85a0f8e56653.slice/crio-8540b4a11ac8978d2a26c675be2b4ccd4b93436d3ac3b3c6c83afae46b258e4b WatchSource:0}: Error finding container 8540b4a11ac8978d2a26c675be2b4ccd4b93436d3ac3b3c6c83afae46b258e4b: Status 404 returned error can't find the container with id 8540b4a11ac8978d2a26c675be2b4ccd4b93436d3ac3b3c6c83afae46b258e4b Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.337842 4707 generic.go:334] "Generic (PLEG): container finished" podID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerID="98a2afe0e00f801a7b56bd4da46d19d29b652d19a3c8b9920413f9dd0395bede" exitCode=0 Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.338005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" event={"ID":"4d976b88-45b2-4fcd-92c3-8998178306ac","Type":"ContainerDied","Data":"98a2afe0e00f801a7b56bd4da46d19d29b652d19a3c8b9920413f9dd0395bede"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.339463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"225ee0a9-1192-40ab-93d9-010fb4c4a07e","Type":"ContainerStarted","Data":"94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.339497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"225ee0a9-1192-40ab-93d9-010fb4c4a07e","Type":"ContainerStarted","Data":"8916e66de8b3abee9a5927596c04e5238b66af46e10438ef3f103def325a30d0"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.341678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-index-qd9pv" event={"ID":"99298ce1-4bc7-4409-9c72-7d07dae3a06b","Type":"ContainerStarted","Data":"e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.341706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-qd9pv" event={"ID":"99298ce1-4bc7-4409-9c72-7d07dae3a06b","Type":"ContainerStarted","Data":"6b661341a1b3a8d0db1ee9ba5a65eb70ed2778534171f0f31b32ddd5f2ce1186"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.343058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"151f7a26-9818-45b1-90d5-d45f9ce116ce","Type":"ContainerStarted","Data":"3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.343096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"151f7a26-9818-45b1-90d5-d45f9ce116ce","Type":"ContainerStarted","Data":"4cd1776a0fcfec8d6bb03aaf1a6fcdc779f8b88a7c6cd7959f480762f73e2d42"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.344585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"522a96ca-cde2-4466-804e-85a0f8e56653","Type":"ContainerStarted","Data":"befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.344614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"522a96ca-cde2-4466-804e-85a0f8e56653","Type":"ContainerStarted","Data":"8540b4a11ac8978d2a26c675be2b4ccd4b93436d3ac3b3c6c83afae46b258e4b"} Jan 21 16:43:21 crc kubenswrapper[4707]: I0121 16:43:21.407431 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-qd9pv" podStartSLOduration=1.769022236 podStartE2EDuration="2.407416779s" podCreationTimestamp="2026-01-21 16:43:19 +0000 UTC" firstStartedPulling="2026-01-21 16:43:20.342669759 +0000 UTC m=+6097.524185980" lastFinishedPulling="2026-01-21 16:43:20.9810643 +0000 UTC m=+6098.162580523" observedRunningTime="2026-01-21 16:43:21.406108038 +0000 UTC m=+6098.587624260" watchObservedRunningTime="2026-01-21 16:43:21.407416779 +0000 UTC m=+6098.588933001" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.590247 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.751456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmb55\" (UniqueName: \"kubernetes.io/projected/4d976b88-45b2-4fcd-92c3-8998178306ac-kube-api-access-tmb55\") pod \"4d976b88-45b2-4fcd-92c3-8998178306ac\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.751606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-bundle\") pod \"4d976b88-45b2-4fcd-92c3-8998178306ac\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.751647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-util\") pod \"4d976b88-45b2-4fcd-92c3-8998178306ac\" (UID: \"4d976b88-45b2-4fcd-92c3-8998178306ac\") " Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.753036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-bundle" (OuterVolumeSpecName: "bundle") pod "4d976b88-45b2-4fcd-92c3-8998178306ac" (UID: "4d976b88-45b2-4fcd-92c3-8998178306ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.755466 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d976b88-45b2-4fcd-92c3-8998178306ac-kube-api-access-tmb55" (OuterVolumeSpecName: "kube-api-access-tmb55") pod "4d976b88-45b2-4fcd-92c3-8998178306ac" (UID: "4d976b88-45b2-4fcd-92c3-8998178306ac"). InnerVolumeSpecName "kube-api-access-tmb55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.762234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-util" (OuterVolumeSpecName: "util") pod "4d976b88-45b2-4fcd-92c3-8998178306ac" (UID: "4d976b88-45b2-4fcd-92c3-8998178306ac"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.853526 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmb55\" (UniqueName: \"kubernetes.io/projected/4d976b88-45b2-4fcd-92c3-8998178306ac-kube-api-access-tmb55\") on node \"crc\" DevicePath \"\"" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.853556 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:43:22 crc kubenswrapper[4707]: I0121 16:43:22.853566 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d976b88-45b2-4fcd-92c3-8998178306ac-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:43:23 crc kubenswrapper[4707]: I0121 16:43:23.356419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" event={"ID":"4d976b88-45b2-4fcd-92c3-8998178306ac","Type":"ContainerDied","Data":"a6816887344f903a437b9db310990a074c3a4130cc33757544c191ffdd249714"} Jan 21 16:43:23 crc kubenswrapper[4707]: I0121 16:43:23.356539 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6816887344f903a437b9db310990a074c3a4130cc33757544c191ffdd249714" Jan 21 16:43:23 crc kubenswrapper[4707]: I0121 16:43:23.356465 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh" Jan 21 16:43:23 crc kubenswrapper[4707]: I0121 16:43:23.852198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-qd9pv"] Jan 21 16:43:23 crc kubenswrapper[4707]: I0121 16:43:23.852346 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-qd9pv" podUID="99298ce1-4bc7-4409-9c72-7d07dae3a06b" containerName="registry-server" containerID="cri-o://e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077" gracePeriod=2 Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.184952 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.270413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh58n\" (UniqueName: \"kubernetes.io/projected/99298ce1-4bc7-4409-9c72-7d07dae3a06b-kube-api-access-gh58n\") pod \"99298ce1-4bc7-4409-9c72-7d07dae3a06b\" (UID: \"99298ce1-4bc7-4409-9c72-7d07dae3a06b\") " Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.274307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99298ce1-4bc7-4409-9c72-7d07dae3a06b-kube-api-access-gh58n" (OuterVolumeSpecName: "kube-api-access-gh58n") pod "99298ce1-4bc7-4409-9c72-7d07dae3a06b" (UID: "99298ce1-4bc7-4409-9c72-7d07dae3a06b"). InnerVolumeSpecName "kube-api-access-gh58n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.362973 4707 generic.go:334] "Generic (PLEG): container finished" podID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerID="3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba" exitCode=0 Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.363046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"151f7a26-9818-45b1-90d5-d45f9ce116ce","Type":"ContainerDied","Data":"3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba"} Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.364461 4707 generic.go:334] "Generic (PLEG): container finished" podID="522a96ca-cde2-4466-804e-85a0f8e56653" containerID="befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda" exitCode=0 Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.364526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"522a96ca-cde2-4466-804e-85a0f8e56653","Type":"ContainerDied","Data":"befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda"} Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.365954 4707 generic.go:334] "Generic (PLEG): container finished" podID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerID="94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a" exitCode=0 Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.366015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"225ee0a9-1192-40ab-93d9-010fb4c4a07e","Type":"ContainerDied","Data":"94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a"} Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.368091 4707 generic.go:334] "Generic (PLEG): container finished" podID="99298ce1-4bc7-4409-9c72-7d07dae3a06b" containerID="e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077" exitCode=0 Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.368118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-qd9pv" event={"ID":"99298ce1-4bc7-4409-9c72-7d07dae3a06b","Type":"ContainerDied","Data":"e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077"} Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.368135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-qd9pv" event={"ID":"99298ce1-4bc7-4409-9c72-7d07dae3a06b","Type":"ContainerDied","Data":"6b661341a1b3a8d0db1ee9ba5a65eb70ed2778534171f0f31b32ddd5f2ce1186"} Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.368150 4707 scope.go:117] "RemoveContainer" containerID="e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.368237 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-qd9pv" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.372247 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh58n\" (UniqueName: \"kubernetes.io/projected/99298ce1-4bc7-4409-9c72-7d07dae3a06b-kube-api-access-gh58n\") on node \"crc\" DevicePath \"\"" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.423104 4707 scope.go:117] "RemoveContainer" containerID="e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077" Jan 21 16:43:24 crc kubenswrapper[4707]: E0121 16:43:24.423535 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077\": container with ID starting with e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077 not found: ID does not exist" containerID="e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.423565 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077"} err="failed to get container status \"e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077\": rpc error: code = NotFound desc = could not find container \"e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077\": container with ID starting with e6150fa289e30eecf30621510f3eb8d13633ea40476424db72e16f12392d0077 not found: ID does not exist" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.451074 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-qd9pv"] Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.455561 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-qd9pv"] Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.461716 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-4s5rd"] Jan 21 16:43:24 crc kubenswrapper[4707]: E0121 16:43:24.461976 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="extract" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.461988 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="extract" Jan 21 16:43:24 crc kubenswrapper[4707]: E0121 16:43:24.461995 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99298ce1-4bc7-4409-9c72-7d07dae3a06b" containerName="registry-server" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.462001 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="99298ce1-4bc7-4409-9c72-7d07dae3a06b" containerName="registry-server" Jan 21 16:43:24 crc kubenswrapper[4707]: E0121 16:43:24.462014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="util" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.462020 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="util" Jan 21 16:43:24 crc kubenswrapper[4707]: E0121 16:43:24.462038 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="pull" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.462043 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="pull" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.462155 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="99298ce1-4bc7-4409-9c72-7d07dae3a06b" containerName="registry-server" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.462175 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" containerName="extract" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.462526 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.464613 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-rknxw" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.466310 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-4s5rd"] Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.574485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzw5r\" (UniqueName: \"kubernetes.io/projected/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a-kube-api-access-zzw5r\") pod \"infra-operator-index-4s5rd\" (UID: \"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a\") " pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.675465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzw5r\" (UniqueName: \"kubernetes.io/projected/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a-kube-api-access-zzw5r\") pod \"infra-operator-index-4s5rd\" (UID: \"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a\") " pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.689062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzw5r\" (UniqueName: \"kubernetes.io/projected/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a-kube-api-access-zzw5r\") pod \"infra-operator-index-4s5rd\" (UID: \"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a\") " pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:24 crc kubenswrapper[4707]: I0121 16:43:24.784750 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.130787 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-4s5rd"] Jan 21 16:43:25 crc kubenswrapper[4707]: W0121 16:43:25.132660 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7af2b5c_1c35_4ca8_9a60_a52430cecd7a.slice/crio-ba756f790eec94965176c2cbff2da7703babefa46dbd50147590c2b3a7587af2 WatchSource:0}: Error finding container ba756f790eec94965176c2cbff2da7703babefa46dbd50147590c2b3a7587af2: Status 404 returned error can't find the container with id ba756f790eec94965176c2cbff2da7703babefa46dbd50147590c2b3a7587af2 Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.189168 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99298ce1-4bc7-4409-9c72-7d07dae3a06b" path="/var/lib/kubelet/pods/99298ce1-4bc7-4409-9c72-7d07dae3a06b/volumes" Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.375559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-4s5rd" event={"ID":"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a","Type":"ContainerStarted","Data":"ba756f790eec94965176c2cbff2da7703babefa46dbd50147590c2b3a7587af2"} Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.377687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"151f7a26-9818-45b1-90d5-d45f9ce116ce","Type":"ContainerStarted","Data":"f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5"} Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.379890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"522a96ca-cde2-4466-804e-85a0f8e56653","Type":"ContainerStarted","Data":"890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d"} Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.381376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"225ee0a9-1192-40ab-93d9-010fb4c4a07e","Type":"ContainerStarted","Data":"78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58"} Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.391631 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=6.391616437 podStartE2EDuration="6.391616437s" podCreationTimestamp="2026-01-21 16:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:43:25.389399699 +0000 UTC m=+6102.570915921" watchObservedRunningTime="2026-01-21 16:43:25.391616437 +0000 UTC m=+6102.573132658" Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.404666 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=6.40465347 podStartE2EDuration="6.40465347s" podCreationTimestamp="2026-01-21 16:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:43:25.402044754 +0000 UTC m=+6102.583560977" watchObservedRunningTime="2026-01-21 16:43:25.40465347 +0000 UTC m=+6102.586169692" Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.416713 4707 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=6.416701354 podStartE2EDuration="6.416701354s" podCreationTimestamp="2026-01-21 16:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:43:25.414670225 +0000 UTC m=+6102.596186446" watchObservedRunningTime="2026-01-21 16:43:25.416701354 +0000 UTC m=+6102.598217576" Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.985752 4707 scope.go:117] "RemoveContainer" containerID="3313b35cf5a74caf887dfa1c5ee9cd9670d5a4ba06d7e2c5256c48263aeb8e08" Jan 21 16:43:25 crc kubenswrapper[4707]: I0121 16:43:25.998899 4707 scope.go:117] "RemoveContainer" containerID="e0204608d317b8236cee5d1834f54e7e25cb53a61e3c85fed9fa089fc05936f5" Jan 21 16:43:26 crc kubenswrapper[4707]: I0121 16:43:26.017354 4707 scope.go:117] "RemoveContainer" containerID="f97c3da82958800cfb31c6fac73c534918cf1bd189f5487e8d9e01a260465af3" Jan 21 16:43:26 crc kubenswrapper[4707]: I0121 16:43:26.033691 4707 scope.go:117] "RemoveContainer" containerID="1a97e4cbcd911fe6ca249b740b3d863f4ef8a1c331f1ac3df6497004b0ecbc4b" Jan 21 16:43:26 crc kubenswrapper[4707]: I0121 16:43:26.387757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-4s5rd" event={"ID":"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a","Type":"ContainerStarted","Data":"41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44"} Jan 21 16:43:29 crc kubenswrapper[4707]: I0121 16:43:29.182274 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:43:29 crc kubenswrapper[4707]: E0121 16:43:29.182686 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:43:30 crc kubenswrapper[4707]: I0121 16:43:30.578122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:30 crc kubenswrapper[4707]: I0121 16:43:30.578348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:30 crc kubenswrapper[4707]: I0121 16:43:30.586348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:30 crc kubenswrapper[4707]: I0121 16:43:30.586378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:30 crc kubenswrapper[4707]: I0121 16:43:30.593125 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:30 crc kubenswrapper[4707]: I0121 16:43:30.593154 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:34 crc kubenswrapper[4707]: I0121 16:43:34.653593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:34 crc kubenswrapper[4707]: I0121 16:43:34.666016 4707 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/infra-operator-index-4s5rd" podStartSLOduration=10.174623488 podStartE2EDuration="10.666003929s" podCreationTimestamp="2026-01-21 16:43:24 +0000 UTC" firstStartedPulling="2026-01-21 16:43:25.134271312 +0000 UTC m=+6102.315787533" lastFinishedPulling="2026-01-21 16:43:25.625651752 +0000 UTC m=+6102.807167974" observedRunningTime="2026-01-21 16:43:26.401747373 +0000 UTC m=+6103.583263595" watchObservedRunningTime="2026-01-21 16:43:34.666003929 +0000 UTC m=+6111.847520151" Jan 21 16:43:34 crc kubenswrapper[4707]: I0121 16:43:34.704216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 16:43:34 crc kubenswrapper[4707]: I0121 16:43:34.785148 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:34 crc kubenswrapper[4707]: I0121 16:43:34.785353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:34 crc kubenswrapper[4707]: I0121 16:43:34.805945 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:35 crc kubenswrapper[4707]: I0121 16:43:35.449925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.346042 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/root-account-create-update-8vknb"] Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.346966 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.348976 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.351598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-8vknb"] Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.452947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95jq\" (UniqueName: \"kubernetes.io/projected/d7e9cca3-4ff1-4be1-9319-a63239dece19-kube-api-access-p95jq\") pod \"root-account-create-update-8vknb\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.453011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e9cca3-4ff1-4be1-9319-a63239dece19-operator-scripts\") pod \"root-account-create-update-8vknb\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.553960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e9cca3-4ff1-4be1-9319-a63239dece19-operator-scripts\") pod \"root-account-create-update-8vknb\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 
16:43:39.554136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95jq\" (UniqueName: \"kubernetes.io/projected/d7e9cca3-4ff1-4be1-9319-a63239dece19-kube-api-access-p95jq\") pod \"root-account-create-update-8vknb\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.554803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e9cca3-4ff1-4be1-9319-a63239dece19-operator-scripts\") pod \"root-account-create-update-8vknb\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.569496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95jq\" (UniqueName: \"kubernetes.io/projected/d7e9cca3-4ff1-4be1-9319-a63239dece19-kube-api-access-p95jq\") pod \"root-account-create-update-8vknb\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:39 crc kubenswrapper[4707]: I0121 16:43:39.661554 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:40 crc kubenswrapper[4707]: I0121 16:43:40.006536 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-8vknb"] Jan 21 16:43:40 crc kubenswrapper[4707]: W0121 16:43:40.010042 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e9cca3_4ff1_4be1_9319_a63239dece19.slice/crio-74479c1d6ccae7abb3c3d62d9748d9bc502bb51b0343af93458f43ea59c314e5 WatchSource:0}: Error finding container 74479c1d6ccae7abb3c3d62d9748d9bc502bb51b0343af93458f43ea59c314e5: Status 404 returned error can't find the container with id 74479c1d6ccae7abb3c3d62d9748d9bc502bb51b0343af93458f43ea59c314e5 Jan 21 16:43:40 crc kubenswrapper[4707]: I0121 16:43:40.456541 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7e9cca3-4ff1-4be1-9319-a63239dece19" containerID="d4f0f8b2087b1f9c9a9997adada6fb5dbb95b70117812f6797b39b925d3ded1c" exitCode=0 Jan 21 16:43:40 crc kubenswrapper[4707]: I0121 16:43:40.456603 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-8vknb" event={"ID":"d7e9cca3-4ff1-4be1-9319-a63239dece19","Type":"ContainerDied","Data":"d4f0f8b2087b1f9c9a9997adada6fb5dbb95b70117812f6797b39b925d3ded1c"} Jan 21 16:43:40 crc kubenswrapper[4707]: I0121 16:43:40.456749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-8vknb" event={"ID":"d7e9cca3-4ff1-4be1-9319-a63239dece19","Type":"ContainerStarted","Data":"74479c1d6ccae7abb3c3d62d9748d9bc502bb51b0343af93458f43ea59c314e5"} Jan 21 16:43:40 crc kubenswrapper[4707]: I0121 16:43:40.637403 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="galera" probeResult="failure" output=< Jan 21 16:43:40 crc kubenswrapper[4707]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 21 16:43:40 crc kubenswrapper[4707]: > Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.425591 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.487375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.693166 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.781080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e9cca3-4ff1-4be1-9319-a63239dece19-operator-scripts\") pod \"d7e9cca3-4ff1-4be1-9319-a63239dece19\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.781183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p95jq\" (UniqueName: \"kubernetes.io/projected/d7e9cca3-4ff1-4be1-9319-a63239dece19-kube-api-access-p95jq\") pod \"d7e9cca3-4ff1-4be1-9319-a63239dece19\" (UID: \"d7e9cca3-4ff1-4be1-9319-a63239dece19\") " Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.781418 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e9cca3-4ff1-4be1-9319-a63239dece19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7e9cca3-4ff1-4be1-9319-a63239dece19" (UID: "d7e9cca3-4ff1-4be1-9319-a63239dece19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.781535 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e9cca3-4ff1-4be1-9319-a63239dece19-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.785415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e9cca3-4ff1-4be1-9319-a63239dece19-kube-api-access-p95jq" (OuterVolumeSpecName: "kube-api-access-p95jq") pod "d7e9cca3-4ff1-4be1-9319-a63239dece19" (UID: "d7e9cca3-4ff1-4be1-9319-a63239dece19"). InnerVolumeSpecName "kube-api-access-p95jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:43:41 crc kubenswrapper[4707]: I0121 16:43:41.883276 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p95jq\" (UniqueName: \"kubernetes.io/projected/d7e9cca3-4ff1-4be1-9319-a63239dece19-kube-api-access-p95jq\") on node \"crc\" DevicePath \"\"" Jan 21 16:43:42 crc kubenswrapper[4707]: I0121 16:43:42.182431 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:43:42 crc kubenswrapper[4707]: E0121 16:43:42.182671 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:43:42 crc kubenswrapper[4707]: I0121 16:43:42.467206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-8vknb" event={"ID":"d7e9cca3-4ff1-4be1-9319-a63239dece19","Type":"ContainerDied","Data":"74479c1d6ccae7abb3c3d62d9748d9bc502bb51b0343af93458f43ea59c314e5"} Jan 21 16:43:42 crc kubenswrapper[4707]: I0121 16:43:42.467241 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74479c1d6ccae7abb3c3d62d9748d9bc502bb51b0343af93458f43ea59c314e5" Jan 21 16:43:42 crc kubenswrapper[4707]: I0121 16:43:42.467284 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-8vknb" Jan 21 16:43:46 crc kubenswrapper[4707]: I0121 16:43:46.541551 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:46 crc kubenswrapper[4707]: I0121 16:43:46.588939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.287087 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t"] Jan 21 16:43:50 crc kubenswrapper[4707]: E0121 16:43:50.287471 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e9cca3-4ff1-4be1-9319-a63239dece19" containerName="mariadb-account-create-update" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.287483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e9cca3-4ff1-4be1-9319-a63239dece19" containerName="mariadb-account-create-update" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.287600 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e9cca3-4ff1-4be1-9319-a63239dece19" containerName="mariadb-account-create-update" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.287988 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.290504 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-898qq" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.293967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.298206 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t"] Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.381410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-webhook-cert\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.381545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-apiservice-cert\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.381575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngc2h\" (UniqueName: \"kubernetes.io/projected/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-kube-api-access-ngc2h\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.483075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-apiservice-cert\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.483117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngc2h\" (UniqueName: \"kubernetes.io/projected/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-kube-api-access-ngc2h\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.483252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-webhook-cert\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.487838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-apiservice-cert\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.487841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-webhook-cert\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.495859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngc2h\" (UniqueName: \"kubernetes.io/projected/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-kube-api-access-ngc2h\") pod \"infra-operator-controller-manager-56c4b5f9d8-thv7t\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.608293 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:50 crc kubenswrapper[4707]: I0121 16:43:50.975573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t"] Jan 21 16:43:50 crc kubenswrapper[4707]: W0121 16:43:50.978939 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93f9eb6_1c5f_43e1_9f24_43fb6e7efb03.slice/crio-5551d3d0897f49a1fc8912a48ac616f0b9987f851c32c08627a7e74918daa852 WatchSource:0}: Error finding container 5551d3d0897f49a1fc8912a48ac616f0b9987f851c32c08627a7e74918daa852: Status 404 returned error can't find the container with id 5551d3d0897f49a1fc8912a48ac616f0b9987f851c32c08627a7e74918daa852 Jan 21 16:43:51 crc kubenswrapper[4707]: I0121 16:43:51.514541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" event={"ID":"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03","Type":"ContainerStarted","Data":"78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593"} Jan 21 16:43:51 crc kubenswrapper[4707]: I0121 16:43:51.514747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" event={"ID":"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03","Type":"ContainerStarted","Data":"5551d3d0897f49a1fc8912a48ac616f0b9987f851c32c08627a7e74918daa852"} Jan 21 16:43:51 crc kubenswrapper[4707]: I0121 16:43:51.514762 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:43:51 crc kubenswrapper[4707]: I0121 16:43:51.528954 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" podStartSLOduration=1.528939531 podStartE2EDuration="1.528939531s" podCreationTimestamp="2026-01-21 16:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:43:51.526123396 +0000 UTC 
m=+6128.707639619" watchObservedRunningTime="2026-01-21 16:43:51.528939531 +0000 UTC m=+6128.710455753" Jan 21 16:43:55 crc kubenswrapper[4707]: I0121 16:43:55.182387 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:43:55 crc kubenswrapper[4707]: E0121 16:43:55.183117 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:44:00 crc kubenswrapper[4707]: I0121 16:44:00.613673 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.730292 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-sg2cd"] Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.731128 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.732988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-ct4cq" Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.736913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-sg2cd"] Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.871241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgcv7\" (UniqueName: \"kubernetes.io/projected/4780b32c-c74a-4e7f-9803-820d22a13269-kube-api-access-dgcv7\") pod \"rabbitmq-cluster-operator-index-sg2cd\" (UID: \"4780b32c-c74a-4e7f-9803-820d22a13269\") " pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.972888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgcv7\" (UniqueName: \"kubernetes.io/projected/4780b32c-c74a-4e7f-9803-820d22a13269-kube-api-access-dgcv7\") pod \"rabbitmq-cluster-operator-index-sg2cd\" (UID: \"4780b32c-c74a-4e7f-9803-820d22a13269\") " pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:05 crc kubenswrapper[4707]: I0121 16:44:05.987644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgcv7\" (UniqueName: \"kubernetes.io/projected/4780b32c-c74a-4e7f-9803-820d22a13269-kube-api-access-dgcv7\") pod \"rabbitmq-cluster-operator-index-sg2cd\" (UID: \"4780b32c-c74a-4e7f-9803-820d22a13269\") " pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:06 crc kubenswrapper[4707]: I0121 16:44:06.046609 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:06 crc kubenswrapper[4707]: I0121 16:44:06.398592 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-sg2cd"] Jan 21 16:44:06 crc kubenswrapper[4707]: W0121 16:44:06.403685 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4780b32c_c74a_4e7f_9803_820d22a13269.slice/crio-cd1383c74ec620275ebc59a10f426d9e0ea520dd33685d052f2146556d500c29 WatchSource:0}: Error finding container cd1383c74ec620275ebc59a10f426d9e0ea520dd33685d052f2146556d500c29: Status 404 returned error can't find the container with id cd1383c74ec620275ebc59a10f426d9e0ea520dd33685d052f2146556d500c29 Jan 21 16:44:06 crc kubenswrapper[4707]: I0121 16:44:06.600727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" event={"ID":"4780b32c-c74a-4e7f-9803-820d22a13269","Type":"ContainerStarted","Data":"cd1383c74ec620275ebc59a10f426d9e0ea520dd33685d052f2146556d500c29"} Jan 21 16:44:07 crc kubenswrapper[4707]: I0121 16:44:07.607738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" event={"ID":"4780b32c-c74a-4e7f-9803-820d22a13269","Type":"ContainerStarted","Data":"0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e"} Jan 21 16:44:07 crc kubenswrapper[4707]: I0121 16:44:07.618188 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" podStartSLOduration=1.866592711 podStartE2EDuration="2.61817605s" podCreationTimestamp="2026-01-21 16:44:05 +0000 UTC" firstStartedPulling="2026-01-21 16:44:06.405663958 +0000 UTC m=+6143.587180180" lastFinishedPulling="2026-01-21 16:44:07.157247298 +0000 UTC m=+6144.338763519" observedRunningTime="2026-01-21 16:44:07.616428434 +0000 UTC m=+6144.797944657" watchObservedRunningTime="2026-01-21 16:44:07.61817605 +0000 UTC m=+6144.799692272" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.074862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.075607 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.077328 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.077594 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-hk6z8" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.083708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.182206 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:44:09 crc kubenswrapper[4707]: E0121 16:44:09.182448 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.211409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kolla-config\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.211472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s69k8\" (UniqueName: \"kubernetes.io/projected/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kube-api-access-s69k8\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.211672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-config-data\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.313167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-config-data\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.313258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kolla-config\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.313290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69k8\" (UniqueName: \"kubernetes.io/projected/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kube-api-access-s69k8\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc 
kubenswrapper[4707]: I0121 16:44:09.314036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kolla-config\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.314077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-config-data\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.328529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69k8\" (UniqueName: \"kubernetes.io/projected/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kube-api-access-s69k8\") pod \"memcached-0\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.388641 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.731417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 16:44:09 crc kubenswrapper[4707]: W0121 16:44:09.736003 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b7fe44_243a_4e6a_b6f5_abf109cd69d5.slice/crio-73f012961bd5b60924751f9a22c446df7c1a23d538dd81c61c465e11568bde92 WatchSource:0}: Error finding container 73f012961bd5b60924751f9a22c446df7c1a23d538dd81c61c465e11568bde92: Status 404 returned error can't find the container with id 73f012961bd5b60924751f9a22c446df7c1a23d538dd81c61c465e11568bde92 Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.926096 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-sg2cd"] Jan 21 16:44:09 crc kubenswrapper[4707]: I0121 16:44:09.926250 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" podUID="4780b32c-c74a-4e7f-9803-820d22a13269" containerName="registry-server" containerID="cri-o://0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e" gracePeriod=2 Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.221429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.325687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgcv7\" (UniqueName: \"kubernetes.io/projected/4780b32c-c74a-4e7f-9803-820d22a13269-kube-api-access-dgcv7\") pod \"4780b32c-c74a-4e7f-9803-820d22a13269\" (UID: \"4780b32c-c74a-4e7f-9803-820d22a13269\") " Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.329986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4780b32c-c74a-4e7f-9803-820d22a13269-kube-api-access-dgcv7" (OuterVolumeSpecName: "kube-api-access-dgcv7") pod "4780b32c-c74a-4e7f-9803-820d22a13269" (UID: "4780b32c-c74a-4e7f-9803-820d22a13269"). InnerVolumeSpecName "kube-api-access-dgcv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.427501 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgcv7\" (UniqueName: \"kubernetes.io/projected/4780b32c-c74a-4e7f-9803-820d22a13269-kube-api-access-dgcv7\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.530429 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-m9pnv"] Jan 21 16:44:10 crc kubenswrapper[4707]: E0121 16:44:10.530647 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4780b32c-c74a-4e7f-9803-820d22a13269" containerName="registry-server" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.530663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4780b32c-c74a-4e7f-9803-820d22a13269" containerName="registry-server" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.530835 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4780b32c-c74a-4e7f-9803-820d22a13269" containerName="registry-server" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.531225 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.535882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-m9pnv"] Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.629611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6knf2\" (UniqueName: \"kubernetes.io/projected/d0a89d96-10da-4a61-94e3-0b6e73714360-kube-api-access-6knf2\") pod \"rabbitmq-cluster-operator-index-m9pnv\" (UID: \"d0a89d96-10da-4a61-94e3-0b6e73714360\") " pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.638525 4707 generic.go:334] "Generic (PLEG): container finished" podID="4780b32c-c74a-4e7f-9803-820d22a13269" containerID="0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e" exitCode=0 Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.638571 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.638594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" event={"ID":"4780b32c-c74a-4e7f-9803-820d22a13269","Type":"ContainerDied","Data":"0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e"} Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.638838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-sg2cd" event={"ID":"4780b32c-c74a-4e7f-9803-820d22a13269","Type":"ContainerDied","Data":"cd1383c74ec620275ebc59a10f426d9e0ea520dd33685d052f2146556d500c29"} Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.638861 4707 scope.go:117] "RemoveContainer" containerID="0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.640543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"34b7fe44-243a-4e6a-b6f5-abf109cd69d5","Type":"ContainerStarted","Data":"131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3"} Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.640581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"34b7fe44-243a-4e6a-b6f5-abf109cd69d5","Type":"ContainerStarted","Data":"73f012961bd5b60924751f9a22c446df7c1a23d538dd81c61c465e11568bde92"} Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.640675 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.653013 4707 scope.go:117] "RemoveContainer" containerID="0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e" Jan 21 16:44:10 crc kubenswrapper[4707]: E0121 16:44:10.653296 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e\": container with ID starting with 0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e not found: ID does not exist" containerID="0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.653323 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e"} err="failed to get container status \"0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e\": rpc error: code = NotFound desc = could not find container \"0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e\": container with ID starting with 0315d075b9b12914576f9caef46fdf2ec36bc0baf97f8c94f0d67ff1615d577e not found: ID does not exist" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.654263 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" podStartSLOduration=1.6542489790000001 podStartE2EDuration="1.654248979s" podCreationTimestamp="2026-01-21 16:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:44:10.652050866 +0000 UTC m=+6147.833567088" watchObservedRunningTime="2026-01-21 16:44:10.654248979 +0000 UTC m=+6147.835765200" Jan 21 16:44:10 crc 
kubenswrapper[4707]: I0121 16:44:10.665072 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-sg2cd"] Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.668799 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-sg2cd"] Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.730654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6knf2\" (UniqueName: \"kubernetes.io/projected/d0a89d96-10da-4a61-94e3-0b6e73714360-kube-api-access-6knf2\") pod \"rabbitmq-cluster-operator-index-m9pnv\" (UID: \"d0a89d96-10da-4a61-94e3-0b6e73714360\") " pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.743918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6knf2\" (UniqueName: \"kubernetes.io/projected/d0a89d96-10da-4a61-94e3-0b6e73714360-kube-api-access-6knf2\") pod \"rabbitmq-cluster-operator-index-m9pnv\" (UID: \"d0a89d96-10da-4a61-94e3-0b6e73714360\") " pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:10 crc kubenswrapper[4707]: I0121 16:44:10.843010 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:11 crc kubenswrapper[4707]: I0121 16:44:11.188450 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4780b32c-c74a-4e7f-9803-820d22a13269" path="/var/lib/kubelet/pods/4780b32c-c74a-4e7f-9803-820d22a13269/volumes" Jan 21 16:44:11 crc kubenswrapper[4707]: I0121 16:44:11.189612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-m9pnv"] Jan 21 16:44:11 crc kubenswrapper[4707]: W0121 16:44:11.192151 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a89d96_10da_4a61_94e3_0b6e73714360.slice/crio-c9da88badf24d462ee9bfa4cd35460549a47d480a1e6b143e867719e7689e140 WatchSource:0}: Error finding container c9da88badf24d462ee9bfa4cd35460549a47d480a1e6b143e867719e7689e140: Status 404 returned error can't find the container with id c9da88badf24d462ee9bfa4cd35460549a47d480a1e6b143e867719e7689e140 Jan 21 16:44:11 crc kubenswrapper[4707]: I0121 16:44:11.647506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" event={"ID":"d0a89d96-10da-4a61-94e3-0b6e73714360","Type":"ContainerStarted","Data":"c9da88badf24d462ee9bfa4cd35460549a47d480a1e6b143e867719e7689e140"} Jan 21 16:44:12 crc kubenswrapper[4707]: I0121 16:44:12.653859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" event={"ID":"d0a89d96-10da-4a61-94e3-0b6e73714360","Type":"ContainerStarted","Data":"f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa"} Jan 21 16:44:12 crc kubenswrapper[4707]: I0121 16:44:12.664757 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" podStartSLOduration=2.14278036 podStartE2EDuration="2.664743165s" podCreationTimestamp="2026-01-21 16:44:10 +0000 UTC" firstStartedPulling="2026-01-21 16:44:11.193734171 +0000 UTC m=+6148.375250393" lastFinishedPulling="2026-01-21 16:44:11.715696977 +0000 UTC m=+6148.897213198" observedRunningTime="2026-01-21 
16:44:12.66265044 +0000 UTC m=+6149.844166662" watchObservedRunningTime="2026-01-21 16:44:12.664743165 +0000 UTC m=+6149.846259388" Jan 21 16:44:14 crc kubenswrapper[4707]: I0121 16:44:14.389370 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.132473 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zx4cm"] Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.133557 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.138831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zx4cm"] Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.212491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-898dq\" (UniqueName: \"kubernetes.io/projected/f87e6bd4-e2e6-4568-946e-c67872e6c909-kube-api-access-898dq\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.212542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-utilities\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.212576 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-catalog-content\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.313731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898dq\" (UniqueName: \"kubernetes.io/projected/f87e6bd4-e2e6-4568-946e-c67872e6c909-kube-api-access-898dq\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.313824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-utilities\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.313891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-catalog-content\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.314314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-utilities\") pod 
\"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.314366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-catalog-content\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.336230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898dq\" (UniqueName: \"kubernetes.io/projected/f87e6bd4-e2e6-4568-946e-c67872e6c909-kube-api-access-898dq\") pod \"community-operators-zx4cm\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.450494 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:17 crc kubenswrapper[4707]: I0121 16:44:17.850979 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zx4cm"] Jan 21 16:44:18 crc kubenswrapper[4707]: I0121 16:44:18.687139 4707 generic.go:334] "Generic (PLEG): container finished" podID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerID="548e1b1e1bf12b70cfc89e38feba34106638539dc0e9b9b7d9e5b8f5fcec9527" exitCode=0 Jan 21 16:44:18 crc kubenswrapper[4707]: I0121 16:44:18.687177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zx4cm" event={"ID":"f87e6bd4-e2e6-4568-946e-c67872e6c909","Type":"ContainerDied","Data":"548e1b1e1bf12b70cfc89e38feba34106638539dc0e9b9b7d9e5b8f5fcec9527"} Jan 21 16:44:18 crc kubenswrapper[4707]: I0121 16:44:18.687199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zx4cm" event={"ID":"f87e6bd4-e2e6-4568-946e-c67872e6c909","Type":"ContainerStarted","Data":"c1ef4a92bd30e0dfd635c4119883f9cf4d5a377de6ce486a7a82526821f24190"} Jan 21 16:44:19 crc kubenswrapper[4707]: I0121 16:44:19.694489 4707 generic.go:334] "Generic (PLEG): container finished" podID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerID="ad937f3289d54ddeccbd169d2a7b722cc70dd16d3dc365c699d0cbf9077c521c" exitCode=0 Jan 21 16:44:19 crc kubenswrapper[4707]: I0121 16:44:19.694582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zx4cm" event={"ID":"f87e6bd4-e2e6-4568-946e-c67872e6c909","Type":"ContainerDied","Data":"ad937f3289d54ddeccbd169d2a7b722cc70dd16d3dc365c699d0cbf9077c521c"} Jan 21 16:44:20 crc kubenswrapper[4707]: I0121 16:44:20.702367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zx4cm" event={"ID":"f87e6bd4-e2e6-4568-946e-c67872e6c909","Type":"ContainerStarted","Data":"0653cc8aa7ca5fc56cfbeb4e909b112e7aaa22a94211929e3319a49ee36247bd"} Jan 21 16:44:20 crc kubenswrapper[4707]: I0121 16:44:20.714992 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zx4cm" podStartSLOduration=2.202750941 podStartE2EDuration="3.714978828s" podCreationTimestamp="2026-01-21 16:44:17 +0000 UTC" firstStartedPulling="2026-01-21 16:44:18.688537517 +0000 UTC m=+6155.870053739" lastFinishedPulling="2026-01-21 16:44:20.200765404 +0000 UTC 
m=+6157.382281626" observedRunningTime="2026-01-21 16:44:20.714118651 +0000 UTC m=+6157.895634873" watchObservedRunningTime="2026-01-21 16:44:20.714978828 +0000 UTC m=+6157.896495050" Jan 21 16:44:20 crc kubenswrapper[4707]: I0121 16:44:20.844048 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:20 crc kubenswrapper[4707]: I0121 16:44:20.844248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:20 crc kubenswrapper[4707]: I0121 16:44:20.863636 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:21 crc kubenswrapper[4707]: I0121 16:44:21.729358 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 16:44:22 crc kubenswrapper[4707]: I0121 16:44:22.183144 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:44:22 crc kubenswrapper[4707]: E0121 16:44:22.183408 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.532039 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pccw2"] Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.533338 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.537707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pccw2"] Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.616511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9dbq\" (UniqueName: \"kubernetes.io/projected/94a0053b-3734-4ee6-9bd2-293d12b064d9-kube-api-access-h9dbq\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.616567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-utilities\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.616795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-catalog-content\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.718191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-catalog-content\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.718302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9dbq\" (UniqueName: \"kubernetes.io/projected/94a0053b-3734-4ee6-9bd2-293d12b064d9-kube-api-access-h9dbq\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.718334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-utilities\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.718652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-catalog-content\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.718720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-utilities\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.736578 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h9dbq\" (UniqueName: \"kubernetes.io/projected/94a0053b-3734-4ee6-9bd2-293d12b064d9-kube-api-access-h9dbq\") pod \"certified-operators-pccw2\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:25 crc kubenswrapper[4707]: I0121 16:44:25.848284 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:26 crc kubenswrapper[4707]: I0121 16:44:26.233369 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pccw2"] Jan 21 16:44:26 crc kubenswrapper[4707]: W0121 16:44:26.235150 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a0053b_3734_4ee6_9bd2_293d12b064d9.slice/crio-3fce5adebf2cb34cbd6b58053a710a57d031674e24a9c257692a7efb244883bb WatchSource:0}: Error finding container 3fce5adebf2cb34cbd6b58053a710a57d031674e24a9c257692a7efb244883bb: Status 404 returned error can't find the container with id 3fce5adebf2cb34cbd6b58053a710a57d031674e24a9c257692a7efb244883bb Jan 21 16:44:26 crc kubenswrapper[4707]: I0121 16:44:26.734475 4707 generic.go:334] "Generic (PLEG): container finished" podID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerID="672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4" exitCode=0 Jan 21 16:44:26 crc kubenswrapper[4707]: I0121 16:44:26.734514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pccw2" event={"ID":"94a0053b-3734-4ee6-9bd2-293d12b064d9","Type":"ContainerDied","Data":"672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4"} Jan 21 16:44:26 crc kubenswrapper[4707]: I0121 16:44:26.734538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pccw2" event={"ID":"94a0053b-3734-4ee6-9bd2-293d12b064d9","Type":"ContainerStarted","Data":"3fce5adebf2cb34cbd6b58053a710a57d031674e24a9c257692a7efb244883bb"} Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.451538 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.451766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.479053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.742339 4707 generic.go:334] "Generic (PLEG): container finished" podID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerID="109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5" exitCode=0 Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.742377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pccw2" event={"ID":"94a0053b-3734-4ee6-9bd2-293d12b064d9","Type":"ContainerDied","Data":"109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5"} Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.776261 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.955491 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc"] Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.956734 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.957968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:44:27 crc kubenswrapper[4707]: I0121 16:44:27.962612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc"] Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.047676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.047882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgl58\" (UniqueName: \"kubernetes.io/projected/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-kube-api-access-tgl58\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.047941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.148909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.149019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgl58\" (UniqueName: \"kubernetes.io/projected/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-kube-api-access-tgl58\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.149052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 
21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.149394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.149420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.164232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgl58\" (UniqueName: \"kubernetes.io/projected/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-kube-api-access-tgl58\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.268743 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.617942 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc"] Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.751029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" event={"ID":"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1","Type":"ContainerStarted","Data":"c5fe65c16f49a9ebfa07c59ab35b9955a9dc28e54aeecd54c96d4be223569957"} Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.751067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" event={"ID":"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1","Type":"ContainerStarted","Data":"5d276028bd175331272fbab4e618b378aeeafb404b9ec863697e1ce06f96f7ad"} Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.756922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pccw2" event={"ID":"94a0053b-3734-4ee6-9bd2-293d12b064d9","Type":"ContainerStarted","Data":"3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99"} Jan 21 16:44:28 crc kubenswrapper[4707]: I0121 16:44:28.783005 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pccw2" podStartSLOduration=2.283846148 podStartE2EDuration="3.782989442s" podCreationTimestamp="2026-01-21 16:44:25 +0000 UTC" firstStartedPulling="2026-01-21 16:44:26.742578594 +0000 UTC m=+6163.924094816" lastFinishedPulling="2026-01-21 16:44:28.241721889 +0000 UTC m=+6165.423238110" observedRunningTime="2026-01-21 16:44:28.78103122 +0000 UTC m=+6165.962547442" watchObservedRunningTime="2026-01-21 16:44:28.782989442 +0000 UTC m=+6165.964505665" Jan 21 16:44:29 crc kubenswrapper[4707]: I0121 16:44:29.763980 4707 
generic.go:334] "Generic (PLEG): container finished" podID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerID="c5fe65c16f49a9ebfa07c59ab35b9955a9dc28e54aeecd54c96d4be223569957" exitCode=0 Jan 21 16:44:29 crc kubenswrapper[4707]: I0121 16:44:29.764013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" event={"ID":"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1","Type":"ContainerDied","Data":"c5fe65c16f49a9ebfa07c59ab35b9955a9dc28e54aeecd54c96d4be223569957"} Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.526569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zx4cm"] Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.526744 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zx4cm" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="registry-server" containerID="cri-o://0653cc8aa7ca5fc56cfbeb4e909b112e7aaa22a94211929e3319a49ee36247bd" gracePeriod=2 Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.781862 4707 generic.go:334] "Generic (PLEG): container finished" podID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerID="407a67e2ce0c1c54d7bc8dea44ef05f88da9c0bee73d6f1b6fbe664c9ec87b59" exitCode=0 Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.781914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" event={"ID":"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1","Type":"ContainerDied","Data":"407a67e2ce0c1c54d7bc8dea44ef05f88da9c0bee73d6f1b6fbe664c9ec87b59"} Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.792898 4707 generic.go:334] "Generic (PLEG): container finished" podID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerID="0653cc8aa7ca5fc56cfbeb4e909b112e7aaa22a94211929e3319a49ee36247bd" exitCode=0 Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.792936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zx4cm" event={"ID":"f87e6bd4-e2e6-4568-946e-c67872e6c909","Type":"ContainerDied","Data":"0653cc8aa7ca5fc56cfbeb4e909b112e7aaa22a94211929e3319a49ee36247bd"} Jan 21 16:44:30 crc kubenswrapper[4707]: I0121 16:44:30.994748 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.091023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-catalog-content\") pod \"f87e6bd4-e2e6-4568-946e-c67872e6c909\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.091234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-898dq\" (UniqueName: \"kubernetes.io/projected/f87e6bd4-e2e6-4568-946e-c67872e6c909-kube-api-access-898dq\") pod \"f87e6bd4-e2e6-4568-946e-c67872e6c909\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.091288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-utilities\") pod \"f87e6bd4-e2e6-4568-946e-c67872e6c909\" (UID: \"f87e6bd4-e2e6-4568-946e-c67872e6c909\") " Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.091884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-utilities" (OuterVolumeSpecName: "utilities") pod "f87e6bd4-e2e6-4568-946e-c67872e6c909" (UID: "f87e6bd4-e2e6-4568-946e-c67872e6c909"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.097914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87e6bd4-e2e6-4568-946e-c67872e6c909-kube-api-access-898dq" (OuterVolumeSpecName: "kube-api-access-898dq") pod "f87e6bd4-e2e6-4568-946e-c67872e6c909" (UID: "f87e6bd4-e2e6-4568-946e-c67872e6c909"). InnerVolumeSpecName "kube-api-access-898dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.137148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f87e6bd4-e2e6-4568-946e-c67872e6c909" (UID: "f87e6bd4-e2e6-4568-946e-c67872e6c909"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.192483 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.192514 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-898dq\" (UniqueName: \"kubernetes.io/projected/f87e6bd4-e2e6-4568-946e-c67872e6c909-kube-api-access-898dq\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.192526 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f87e6bd4-e2e6-4568-946e-c67872e6c909-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.799858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zx4cm" event={"ID":"f87e6bd4-e2e6-4568-946e-c67872e6c909","Type":"ContainerDied","Data":"c1ef4a92bd30e0dfd635c4119883f9cf4d5a377de6ce486a7a82526821f24190"} Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.799910 4707 scope.go:117] "RemoveContainer" containerID="0653cc8aa7ca5fc56cfbeb4e909b112e7aaa22a94211929e3319a49ee36247bd" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.799909 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zx4cm" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.801954 4707 generic.go:334] "Generic (PLEG): container finished" podID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerID="7e5fe7a4124408927bb2892618f1daa185bf3e15c22b8a8396c3db5b024dadd6" exitCode=0 Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.802080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" event={"ID":"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1","Type":"ContainerDied","Data":"7e5fe7a4124408927bb2892618f1daa185bf3e15c22b8a8396c3db5b024dadd6"} Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.815682 4707 scope.go:117] "RemoveContainer" containerID="ad937f3289d54ddeccbd169d2a7b722cc70dd16d3dc365c699d0cbf9077c521c" Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.826058 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zx4cm"] Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.829716 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zx4cm"] Jan 21 16:44:31 crc kubenswrapper[4707]: I0121 16:44:31.839217 4707 scope.go:117] "RemoveContainer" containerID="548e1b1e1bf12b70cfc89e38feba34106638539dc0e9b9b7d9e5b8f5fcec9527" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.047749 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.188377 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" path="/var/lib/kubelet/pods/f87e6bd4-e2e6-4568-946e-c67872e6c909/volumes" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.216959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-util\") pod \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.217040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgl58\" (UniqueName: \"kubernetes.io/projected/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-kube-api-access-tgl58\") pod \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.217083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-bundle\") pod \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\" (UID: \"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1\") " Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.217834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-bundle" (OuterVolumeSpecName: "bundle") pod "1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" (UID: "1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.221233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-kube-api-access-tgl58" (OuterVolumeSpecName: "kube-api-access-tgl58") pod "1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" (UID: "1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1"). InnerVolumeSpecName "kube-api-access-tgl58". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.227476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-util" (OuterVolumeSpecName: "util") pod "1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" (UID: "1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.318839 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.318986 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgl58\" (UniqueName: \"kubernetes.io/projected/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-kube-api-access-tgl58\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.319048 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.815417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" event={"ID":"1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1","Type":"ContainerDied","Data":"5d276028bd175331272fbab4e618b378aeeafb404b9ec863697e1ce06f96f7ad"} Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.815451 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d276028bd175331272fbab4e618b378aeeafb404b9ec863697e1ce06f96f7ad" Jan 21 16:44:33 crc kubenswrapper[4707]: I0121 16:44:33.815462 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc" Jan 21 16:44:35 crc kubenswrapper[4707]: I0121 16:44:35.849011 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:35 crc kubenswrapper[4707]: I0121 16:44:35.849212 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:35 crc kubenswrapper[4707]: I0121 16:44:35.877183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:36 crc kubenswrapper[4707]: I0121 16:44:36.858624 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:37 crc kubenswrapper[4707]: I0121 16:44:37.183257 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:44:37 crc kubenswrapper[4707]: E0121 16:44:37.183502 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:44:40 crc kubenswrapper[4707]: I0121 16:44:40.741379 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pccw2"] Jan 21 16:44:40 crc kubenswrapper[4707]: I0121 16:44:40.741741 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pccw2" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="registry-server" 
containerID="cri-o://3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99" gracePeriod=2 Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.110949 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.210153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-utilities\") pod \"94a0053b-3734-4ee6-9bd2-293d12b064d9\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.210249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9dbq\" (UniqueName: \"kubernetes.io/projected/94a0053b-3734-4ee6-9bd2-293d12b064d9-kube-api-access-h9dbq\") pod \"94a0053b-3734-4ee6-9bd2-293d12b064d9\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.210282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-catalog-content\") pod \"94a0053b-3734-4ee6-9bd2-293d12b064d9\" (UID: \"94a0053b-3734-4ee6-9bd2-293d12b064d9\") " Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.211530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-utilities" (OuterVolumeSpecName: "utilities") pod "94a0053b-3734-4ee6-9bd2-293d12b064d9" (UID: "94a0053b-3734-4ee6-9bd2-293d12b064d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.211843 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.217168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a0053b-3734-4ee6-9bd2-293d12b064d9-kube-api-access-h9dbq" (OuterVolumeSpecName: "kube-api-access-h9dbq") pod "94a0053b-3734-4ee6-9bd2-293d12b064d9" (UID: "94a0053b-3734-4ee6-9bd2-293d12b064d9"). InnerVolumeSpecName "kube-api-access-h9dbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.249858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a0053b-3734-4ee6-9bd2-293d12b064d9" (UID: "94a0053b-3734-4ee6-9bd2-293d12b064d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.313878 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9dbq\" (UniqueName: \"kubernetes.io/projected/94a0053b-3734-4ee6-9bd2-293d12b064d9-kube-api-access-h9dbq\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.313910 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0053b-3734-4ee6-9bd2-293d12b064d9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.867959 4707 generic.go:334] "Generic (PLEG): container finished" podID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerID="3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99" exitCode=0 Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.868008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pccw2" event={"ID":"94a0053b-3734-4ee6-9bd2-293d12b064d9","Type":"ContainerDied","Data":"3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99"} Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.868057 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pccw2" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.868225 4707 scope.go:117] "RemoveContainer" containerID="3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.868211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pccw2" event={"ID":"94a0053b-3734-4ee6-9bd2-293d12b064d9","Type":"ContainerDied","Data":"3fce5adebf2cb34cbd6b58053a710a57d031674e24a9c257692a7efb244883bb"} Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.890885 4707 scope.go:117] "RemoveContainer" containerID="109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.892085 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pccw2"] Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.896017 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pccw2"] Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.910261 4707 scope.go:117] "RemoveContainer" containerID="672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.939760 4707 scope.go:117] "RemoveContainer" containerID="3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99" Jan 21 16:44:41 crc kubenswrapper[4707]: E0121 16:44:41.940231 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99\": container with ID starting with 3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99 not found: ID does not exist" containerID="3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.940263 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99"} err="failed to get container status 
\"3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99\": rpc error: code = NotFound desc = could not find container \"3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99\": container with ID starting with 3822b19239c492ac5fdaefe51984b128aeaf335492b73530d2dce90844c28f99 not found: ID does not exist" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.940285 4707 scope.go:117] "RemoveContainer" containerID="109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5" Jan 21 16:44:41 crc kubenswrapper[4707]: E0121 16:44:41.950428 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5\": container with ID starting with 109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5 not found: ID does not exist" containerID="109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.950483 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5"} err="failed to get container status \"109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5\": rpc error: code = NotFound desc = could not find container \"109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5\": container with ID starting with 109f55da13013e37cbca60b8df908f3b9a8e9fe98fab61bc4baa280b28943cf5 not found: ID does not exist" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.950507 4707 scope.go:117] "RemoveContainer" containerID="672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4" Jan 21 16:44:41 crc kubenswrapper[4707]: E0121 16:44:41.951055 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4\": container with ID starting with 672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4 not found: ID does not exist" containerID="672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4" Jan 21 16:44:41 crc kubenswrapper[4707]: I0121 16:44:41.951094 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4"} err="failed to get container status \"672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4\": rpc error: code = NotFound desc = could not find container \"672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4\": container with ID starting with 672f8df6c754c08cd4543e2ad0fd101a574329a8828d2afaae8a3e5a616514b4 not found: ID does not exist" Jan 21 16:44:43 crc kubenswrapper[4707]: I0121 16:44:43.187746 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" path="/var/lib/kubelet/pods/94a0053b-3734-4ee6-9bd2-293d12b064d9/volumes" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.193666 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk"] Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.193922 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="extract" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.193936 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="extract" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.193949 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="registry-server" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.193955 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="registry-server" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.193970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="extract-utilities" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.193976 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="extract-utilities" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.193989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="util" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.193993 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="util" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.194000 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="extract-content" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194005 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="extract-content" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.194016 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="extract-content" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194021 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="extract-content" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.194029 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="extract-utilities" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194035 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="extract-utilities" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.194043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="registry-server" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194049 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="registry-server" Jan 21 16:44:44 crc kubenswrapper[4707]: E0121 16:44:44.194058 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="pull" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194064 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="pull" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194179 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a0053b-3734-4ee6-9bd2-293d12b064d9" containerName="registry-server" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194199 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f87e6bd4-e2e6-4568-946e-c67872e6c909" containerName="registry-server" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194209 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" containerName="extract" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.194589 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.196537 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-pb7sz" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.205659 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk"] Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.353704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jftrz\" (UniqueName: \"kubernetes.io/projected/23ec1174-5172-4c31-8ed4-bed70c55586e-kube-api-access-jftrz\") pod \"rabbitmq-cluster-operator-779fc9694b-8gjjk\" (UID: \"23ec1174-5172-4c31-8ed4-bed70c55586e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.455604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jftrz\" (UniqueName: \"kubernetes.io/projected/23ec1174-5172-4c31-8ed4-bed70c55586e-kube-api-access-jftrz\") pod \"rabbitmq-cluster-operator-779fc9694b-8gjjk\" (UID: \"23ec1174-5172-4c31-8ed4-bed70c55586e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.471878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jftrz\" (UniqueName: \"kubernetes.io/projected/23ec1174-5172-4c31-8ed4-bed70c55586e-kube-api-access-jftrz\") pod \"rabbitmq-cluster-operator-779fc9694b-8gjjk\" (UID: \"23ec1174-5172-4c31-8ed4-bed70c55586e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.508510 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.847298 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk"] Jan 21 16:44:44 crc kubenswrapper[4707]: W0121 16:44:44.850588 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod23ec1174_5172_4c31_8ed4_bed70c55586e.slice/crio-f47f348dafbe7f0e3813a3ce34de670624b6cf35e23e8c4617e47698d62f90c5 WatchSource:0}: Error finding container f47f348dafbe7f0e3813a3ce34de670624b6cf35e23e8c4617e47698d62f90c5: Status 404 returned error can't find the container with id f47f348dafbe7f0e3813a3ce34de670624b6cf35e23e8c4617e47698d62f90c5 Jan 21 16:44:44 crc kubenswrapper[4707]: I0121 16:44:44.886561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" event={"ID":"23ec1174-5172-4c31-8ed4-bed70c55586e","Type":"ContainerStarted","Data":"f47f348dafbe7f0e3813a3ce34de670624b6cf35e23e8c4617e47698d62f90c5"} Jan 21 16:44:45 crc kubenswrapper[4707]: I0121 16:44:45.893855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" event={"ID":"23ec1174-5172-4c31-8ed4-bed70c55586e","Type":"ContainerStarted","Data":"19e822fa30205f02cddbce603441ea130558ae8fc2ef58c0b29365f6851944f5"} Jan 21 16:44:45 crc kubenswrapper[4707]: I0121 16:44:45.906855 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" podStartSLOduration=1.906830636 podStartE2EDuration="1.906830636s" podCreationTimestamp="2026-01-21 16:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:44:45.9033816 +0000 UTC m=+6183.084897823" watchObservedRunningTime="2026-01-21 16:44:45.906830636 +0000 UTC m=+6183.088346858" Jan 21 16:44:51 crc kubenswrapper[4707]: I0121 16:44:51.183101 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:44:51 crc kubenswrapper[4707]: E0121 16:44:51.183590 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:44:51 crc kubenswrapper[4707]: I0121 16:44:51.932537 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-tr55x"] Jan 21 16:44:51 crc kubenswrapper[4707]: I0121 16:44:51.933470 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:44:51 crc kubenswrapper[4707]: I0121 16:44:51.936899 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-82pzp" Jan 21 16:44:51 crc kubenswrapper[4707]: I0121 16:44:51.947000 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-tr55x"] Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.055992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8mcd\" (UniqueName: \"kubernetes.io/projected/900dbba9-3a88-4094-8313-243234ffbe22-kube-api-access-f8mcd\") pod \"keystone-operator-index-tr55x\" (UID: \"900dbba9-3a88-4094-8313-243234ffbe22\") " pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.157337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mcd\" (UniqueName: \"kubernetes.io/projected/900dbba9-3a88-4094-8313-243234ffbe22-kube-api-access-f8mcd\") pod \"keystone-operator-index-tr55x\" (UID: \"900dbba9-3a88-4094-8313-243234ffbe22\") " pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.173072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mcd\" (UniqueName: \"kubernetes.io/projected/900dbba9-3a88-4094-8313-243234ffbe22-kube-api-access-f8mcd\") pod \"keystone-operator-index-tr55x\" (UID: \"900dbba9-3a88-4094-8313-243234ffbe22\") " pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.247862 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.533142 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lldsk"] Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.534413 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.540167 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lldsk"] Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.589390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-tr55x"] Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.663405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfkv7\" (UniqueName: \"kubernetes.io/projected/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-kube-api-access-xfkv7\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.663533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-utilities\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.663569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-catalog-content\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.765302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfkv7\" (UniqueName: \"kubernetes.io/projected/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-kube-api-access-xfkv7\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.765610 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-utilities\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.765632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-catalog-content\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.766034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-catalog-content\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.766106 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-utilities\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " 
pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.782726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfkv7\" (UniqueName: \"kubernetes.io/projected/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-kube-api-access-xfkv7\") pod \"redhat-operators-lldsk\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.847191 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:44:52 crc kubenswrapper[4707]: I0121 16:44:52.931027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-tr55x" event={"ID":"900dbba9-3a88-4094-8313-243234ffbe22","Type":"ContainerStarted","Data":"692113d63ef6f2f84226167bba4ef52ba44e5f99348115206f4f468281c9f650"} Jan 21 16:44:53 crc kubenswrapper[4707]: I0121 16:44:53.219008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lldsk"] Jan 21 16:44:53 crc kubenswrapper[4707]: I0121 16:44:53.937755 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerID="3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2" exitCode=0 Jan 21 16:44:53 crc kubenswrapper[4707]: I0121 16:44:53.937829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerDied","Data":"3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2"} Jan 21 16:44:53 crc kubenswrapper[4707]: I0121 16:44:53.937868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerStarted","Data":"c8d9ca793133174c9147d7656c60d42ff70375897767e7f0217e21662e1da482"} Jan 21 16:44:53 crc kubenswrapper[4707]: I0121 16:44:53.939443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-tr55x" event={"ID":"900dbba9-3a88-4094-8313-243234ffbe22","Type":"ContainerStarted","Data":"1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21"} Jan 21 16:44:53 crc kubenswrapper[4707]: I0121 16:44:53.957129 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-tr55x" podStartSLOduration=2.375577086 podStartE2EDuration="2.957117685s" podCreationTimestamp="2026-01-21 16:44:51 +0000 UTC" firstStartedPulling="2026-01-21 16:44:52.598892709 +0000 UTC m=+6189.780408931" lastFinishedPulling="2026-01-21 16:44:53.180433309 +0000 UTC m=+6190.361949530" observedRunningTime="2026-01-21 16:44:53.956621882 +0000 UTC m=+6191.138138105" watchObservedRunningTime="2026-01-21 16:44:53.957117685 +0000 UTC m=+6191.138633907" Jan 21 16:44:54 crc kubenswrapper[4707]: I0121 16:44:54.946032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerStarted","Data":"a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af"} Jan 21 16:44:55 crc kubenswrapper[4707]: I0121 16:44:55.953635 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" 
containerID="a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af" exitCode=0 Jan 21 16:44:55 crc kubenswrapper[4707]: I0121 16:44:55.953723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerDied","Data":"a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af"} Jan 21 16:44:56 crc kubenswrapper[4707]: I0121 16:44:56.960948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerStarted","Data":"831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff"} Jan 21 16:44:56 crc kubenswrapper[4707]: I0121 16:44:56.975186 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lldsk" podStartSLOduration=2.395671296 podStartE2EDuration="4.975172574s" podCreationTimestamp="2026-01-21 16:44:52 +0000 UTC" firstStartedPulling="2026-01-21 16:44:53.93943138 +0000 UTC m=+6191.120947602" lastFinishedPulling="2026-01-21 16:44:56.518932658 +0000 UTC m=+6193.700448880" observedRunningTime="2026-01-21 16:44:56.971487685 +0000 UTC m=+6194.153003908" watchObservedRunningTime="2026-01-21 16:44:56.975172574 +0000 UTC m=+6194.156688786" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.138298 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc"] Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.139367 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.140723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.141511 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.147485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc"] Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.266897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00fcf38c-116c-403d-9482-e34bca859394-secret-volume\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.267590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00fcf38c-116c-403d-9482-e34bca859394-config-volume\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.267641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84p2\" (UniqueName: \"kubernetes.io/projected/00fcf38c-116c-403d-9482-e34bca859394-kube-api-access-r84p2\") pod 
\"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.368947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00fcf38c-116c-403d-9482-e34bca859394-config-volume\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.369007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r84p2\" (UniqueName: \"kubernetes.io/projected/00fcf38c-116c-403d-9482-e34bca859394-kube-api-access-r84p2\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.369062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00fcf38c-116c-403d-9482-e34bca859394-secret-volume\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.369859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00fcf38c-116c-403d-9482-e34bca859394-config-volume\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.373917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00fcf38c-116c-403d-9482-e34bca859394-secret-volume\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.382212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r84p2\" (UniqueName: \"kubernetes.io/projected/00fcf38c-116c-403d-9482-e34bca859394-kube-api-access-r84p2\") pod \"collect-profiles-29483565-pw8lc\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.454135 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.846425 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc"] Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.983697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" event={"ID":"00fcf38c-116c-403d-9482-e34bca859394","Type":"ContainerStarted","Data":"2106a2350800214b3be364fc6c47bb17859836779afbd4f76d7b13740ce5a66a"} Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.983914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" event={"ID":"00fcf38c-116c-403d-9482-e34bca859394","Type":"ContainerStarted","Data":"ddce87ea7f3f0ae2a5bb5fc9618b911a20f54dd7f7a5e25132558f8b2dc857d1"} Jan 21 16:45:00 crc kubenswrapper[4707]: I0121 16:45:00.995042 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" podStartSLOduration=0.995032343 podStartE2EDuration="995.032343ms" podCreationTimestamp="2026-01-21 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:00.992757277 +0000 UTC m=+6198.174273498" watchObservedRunningTime="2026-01-21 16:45:00.995032343 +0000 UTC m=+6198.176548565" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.714518 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.715927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.717184 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.717423 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.717844 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.717996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-qbj82" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.718190 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.724422 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.887844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f489928-26ae-45c9-851c-6072f97f37e0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.888051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.888077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sff4g\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-kube-api-access-sff4g\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.888101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.888158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f489928-26ae-45c9-851c-6072f97f37e0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.888195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f489928-26ae-45c9-851c-6072f97f37e0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc 
kubenswrapper[4707]: I0121 16:45:01.888211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.888302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f489928-26ae-45c9-851c-6072f97f37e0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sff4g\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-kube-api-access-sff4g\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f489928-26ae-45c9-851c-6072f97f37e0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f489928-26ae-45c9-851c-6072f97f37e0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.989741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 
16:45:01.989865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.990081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.990099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.990481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f489928-26ae-45c9-851c-6072f97f37e0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.990503 4707 generic.go:334] "Generic (PLEG): container finished" podID="00fcf38c-116c-403d-9482-e34bca859394" containerID="2106a2350800214b3be364fc6c47bb17859836779afbd4f76d7b13740ce5a66a" exitCode=0 Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.990529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" event={"ID":"00fcf38c-116c-403d-9482-e34bca859394","Type":"ContainerDied","Data":"2106a2350800214b3be364fc6c47bb17859836779afbd4f76d7b13740ce5a66a"} Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.992524 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.992622 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8fbba572113d120639f1dd665da0fdcdd4c6d5c5127ef7d7b9e061a749970702/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.993555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f489928-26ae-45c9-851c-6072f97f37e0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.993926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:01 crc kubenswrapper[4707]: I0121 16:45:01.994105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f489928-26ae-45c9-851c-6072f97f37e0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.005444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sff4g\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-kube-api-access-sff4g\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.011605 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") pod \"rabbitmq-server-0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.028887 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.248382 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.248551 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.268935 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.373185 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.848328 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.848557 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.887701 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:45:02 crc kubenswrapper[4707]: I0121 16:45:02.997206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"2f489928-26ae-45c9-851c-6072f97f37e0","Type":"ContainerStarted","Data":"b4e956e7c990a79ae01245178c0c77581ff92c2da6aeb33894ef449bad23a533"} Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.016961 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.027430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.186884 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:45:03 crc kubenswrapper[4707]: E0121 16:45:03.187158 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.226942 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.406782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00fcf38c-116c-403d-9482-e34bca859394-config-volume\") pod \"00fcf38c-116c-403d-9482-e34bca859394\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.407185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00fcf38c-116c-403d-9482-e34bca859394-secret-volume\") pod \"00fcf38c-116c-403d-9482-e34bca859394\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.407333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00fcf38c-116c-403d-9482-e34bca859394-config-volume" (OuterVolumeSpecName: "config-volume") pod "00fcf38c-116c-403d-9482-e34bca859394" (UID: "00fcf38c-116c-403d-9482-e34bca859394"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.407351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r84p2\" (UniqueName: \"kubernetes.io/projected/00fcf38c-116c-403d-9482-e34bca859394-kube-api-access-r84p2\") pod \"00fcf38c-116c-403d-9482-e34bca859394\" (UID: \"00fcf38c-116c-403d-9482-e34bca859394\") " Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.407869 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00fcf38c-116c-403d-9482-e34bca859394-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.411357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fcf38c-116c-403d-9482-e34bca859394-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "00fcf38c-116c-403d-9482-e34bca859394" (UID: "00fcf38c-116c-403d-9482-e34bca859394"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.411657 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00fcf38c-116c-403d-9482-e34bca859394-kube-api-access-r84p2" (OuterVolumeSpecName: "kube-api-access-r84p2") pod "00fcf38c-116c-403d-9482-e34bca859394" (UID: "00fcf38c-116c-403d-9482-e34bca859394"). InnerVolumeSpecName "kube-api-access-r84p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.508646 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00fcf38c-116c-403d-9482-e34bca859394-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:03 crc kubenswrapper[4707]: I0121 16:45:03.508841 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r84p2\" (UniqueName: \"kubernetes.io/projected/00fcf38c-116c-403d-9482-e34bca859394-kube-api-access-r84p2\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.003754 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.003748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc" event={"ID":"00fcf38c-116c-403d-9482-e34bca859394","Type":"ContainerDied","Data":"ddce87ea7f3f0ae2a5bb5fc9618b911a20f54dd7f7a5e25132558f8b2dc857d1"} Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.004000 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddce87ea7f3f0ae2a5bb5fc9618b911a20f54dd7f7a5e25132558f8b2dc857d1" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.005388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"2f489928-26ae-45c9-851c-6072f97f37e0","Type":"ContainerStarted","Data":"22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5"} Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.268881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl"] Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.272686 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-r2xdl"] Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.754872 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w"] Jan 21 16:45:04 crc kubenswrapper[4707]: E0121 16:45:04.755121 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fcf38c-116c-403d-9482-e34bca859394" containerName="collect-profiles" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.755137 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fcf38c-116c-403d-9482-e34bca859394" containerName="collect-profiles" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.755267 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fcf38c-116c-403d-9482-e34bca859394" containerName="collect-profiles" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.756080 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.757515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.760794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w"] Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.926348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qgr\" (UniqueName: \"kubernetes.io/projected/0a731ca6-7194-4289-98d8-89ae6fba1185-kube-api-access-w8qgr\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.926579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:04 crc kubenswrapper[4707]: I0121 16:45:04.926713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.027668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.027721 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qgr\" (UniqueName: \"kubernetes.io/projected/0a731ca6-7194-4289-98d8-89ae6fba1185-kube-api-access-w8qgr\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.027758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.028106 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.028164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.041515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qgr\" (UniqueName: \"kubernetes.io/projected/0a731ca6-7194-4289-98d8-89ae6fba1185-kube-api-access-w8qgr\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.068610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.188799 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de2e0bd-a918-46bc-84b1-6a87cb36d98e" path="/var/lib/kubelet/pods/9de2e0bd-a918-46bc-84b1-6a87cb36d98e/volumes" Jan 21 16:45:05 crc kubenswrapper[4707]: I0121 16:45:05.411909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w"] Jan 21 16:45:06 crc kubenswrapper[4707]: I0121 16:45:06.017125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" event={"ID":"0a731ca6-7194-4289-98d8-89ae6fba1185","Type":"ContainerStarted","Data":"e0e93d3adc4ac39fb905aa21ae1f78d0092d0ff072238e93338da2a59d428c71"} Jan 21 16:45:06 crc kubenswrapper[4707]: I0121 16:45:06.017165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" event={"ID":"0a731ca6-7194-4289-98d8-89ae6fba1185","Type":"ContainerStarted","Data":"924bf8011e5621be9c2d843a62fa535c3ffe2973110a6db576b07aa7e65c06a7"} Jan 21 16:45:07 crc kubenswrapper[4707]: I0121 16:45:07.024669 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerID="e0e93d3adc4ac39fb905aa21ae1f78d0092d0ff072238e93338da2a59d428c71" exitCode=0 Jan 21 16:45:07 crc kubenswrapper[4707]: I0121 16:45:07.024756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" event={"ID":"0a731ca6-7194-4289-98d8-89ae6fba1185","Type":"ContainerDied","Data":"e0e93d3adc4ac39fb905aa21ae1f78d0092d0ff072238e93338da2a59d428c71"} Jan 21 16:45:08 crc kubenswrapper[4707]: I0121 16:45:08.032227 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerID="82870f3e401fe0a437356edd26b62bb68518e8cb7a786f9f3462958c2218aa6e" exitCode=0 Jan 21 16:45:08 crc kubenswrapper[4707]: I0121 16:45:08.032307 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" event={"ID":"0a731ca6-7194-4289-98d8-89ae6fba1185","Type":"ContainerDied","Data":"82870f3e401fe0a437356edd26b62bb68518e8cb7a786f9f3462958c2218aa6e"} Jan 21 16:45:09 crc kubenswrapper[4707]: I0121 16:45:09.039262 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerID="b0183c60dc9cc579e2a7b3f621eb8ca865f8e6e85c6b1ff40bdfbf3a87349855" exitCode=0 Jan 21 16:45:09 crc kubenswrapper[4707]: I0121 16:45:09.039299 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" event={"ID":"0a731ca6-7194-4289-98d8-89ae6fba1185","Type":"ContainerDied","Data":"b0183c60dc9cc579e2a7b3f621eb8ca865f8e6e85c6b1ff40bdfbf3a87349855"} Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.286150 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.393346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8qgr\" (UniqueName: \"kubernetes.io/projected/0a731ca6-7194-4289-98d8-89ae6fba1185-kube-api-access-w8qgr\") pod \"0a731ca6-7194-4289-98d8-89ae6fba1185\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.393506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-util\") pod \"0a731ca6-7194-4289-98d8-89ae6fba1185\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.393527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-bundle\") pod \"0a731ca6-7194-4289-98d8-89ae6fba1185\" (UID: \"0a731ca6-7194-4289-98d8-89ae6fba1185\") " Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.394440 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-bundle" (OuterVolumeSpecName: "bundle") pod "0a731ca6-7194-4289-98d8-89ae6fba1185" (UID: "0a731ca6-7194-4289-98d8-89ae6fba1185"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.399479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a731ca6-7194-4289-98d8-89ae6fba1185-kube-api-access-w8qgr" (OuterVolumeSpecName: "kube-api-access-w8qgr") pod "0a731ca6-7194-4289-98d8-89ae6fba1185" (UID: "0a731ca6-7194-4289-98d8-89ae6fba1185"). InnerVolumeSpecName "kube-api-access-w8qgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.403697 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-util" (OuterVolumeSpecName: "util") pod "0a731ca6-7194-4289-98d8-89ae6fba1185" (UID: "0a731ca6-7194-4289-98d8-89ae6fba1185"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.495567 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-util\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.496217 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a731ca6-7194-4289-98d8-89ae6fba1185-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:10 crc kubenswrapper[4707]: I0121 16:45:10.496243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8qgr\" (UniqueName: \"kubernetes.io/projected/0a731ca6-7194-4289-98d8-89ae6fba1185-kube-api-access-w8qgr\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:11 crc kubenswrapper[4707]: I0121 16:45:11.051301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" event={"ID":"0a731ca6-7194-4289-98d8-89ae6fba1185","Type":"ContainerDied","Data":"924bf8011e5621be9c2d843a62fa535c3ffe2973110a6db576b07aa7e65c06a7"} Jan 21 16:45:11 crc kubenswrapper[4707]: I0121 16:45:11.051335 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924bf8011e5621be9c2d843a62fa535c3ffe2973110a6db576b07aa7e65c06a7" Jan 21 16:45:11 crc kubenswrapper[4707]: I0121 16:45:11.051366 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w" Jan 21 16:45:11 crc kubenswrapper[4707]: I0121 16:45:11.525695 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lldsk"] Jan 21 16:45:11 crc kubenswrapper[4707]: I0121 16:45:11.525955 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lldsk" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="registry-server" containerID="cri-o://831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff" gracePeriod=2 Jan 21 16:45:11 crc kubenswrapper[4707]: I0121 16:45:11.871574 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.014890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-catalog-content\") pod \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.014993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-utilities\") pod \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.015043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfkv7\" (UniqueName: \"kubernetes.io/projected/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-kube-api-access-xfkv7\") pod \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\" (UID: \"9ace7c06-1b97-4269-ae65-ef1280cfa7ab\") " Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.015570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-utilities" (OuterVolumeSpecName: "utilities") pod "9ace7c06-1b97-4269-ae65-ef1280cfa7ab" (UID: "9ace7c06-1b97-4269-ae65-ef1280cfa7ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.021065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-kube-api-access-xfkv7" (OuterVolumeSpecName: "kube-api-access-xfkv7") pod "9ace7c06-1b97-4269-ae65-ef1280cfa7ab" (UID: "9ace7c06-1b97-4269-ae65-ef1280cfa7ab"). InnerVolumeSpecName "kube-api-access-xfkv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.059120 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerID="831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff" exitCode=0 Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.059157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerDied","Data":"831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff"} Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.059181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lldsk" event={"ID":"9ace7c06-1b97-4269-ae65-ef1280cfa7ab","Type":"ContainerDied","Data":"c8d9ca793133174c9147d7656c60d42ff70375897767e7f0217e21662e1da482"} Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.059196 4707 scope.go:117] "RemoveContainer" containerID="831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.059289 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lldsk" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.076109 4707 scope.go:117] "RemoveContainer" containerID="a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.094959 4707 scope.go:117] "RemoveContainer" containerID="3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.103487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ace7c06-1b97-4269-ae65-ef1280cfa7ab" (UID: "9ace7c06-1b97-4269-ae65-ef1280cfa7ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.113072 4707 scope.go:117] "RemoveContainer" containerID="831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff" Jan 21 16:45:12 crc kubenswrapper[4707]: E0121 16:45:12.113373 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff\": container with ID starting with 831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff not found: ID does not exist" containerID="831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.113410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff"} err="failed to get container status \"831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff\": rpc error: code = NotFound desc = could not find container \"831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff\": container with ID starting with 831422ae2032b5577254bcd6e74a5b92f836c42dc3c3338e3b588aeffb0978ff not found: ID does not exist" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.113432 4707 scope.go:117] "RemoveContainer" containerID="a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af" Jan 21 16:45:12 crc kubenswrapper[4707]: E0121 16:45:12.113779 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af\": container with ID starting with a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af not found: ID does not exist" containerID="a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.113831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af"} err="failed to get container status \"a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af\": rpc error: code = NotFound desc = could not find container \"a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af\": container with ID starting with a9b64c7454b573cf31143a7bb649f5f369a58426ba9b0d4a2f3eda97f40e28af not found: ID does not exist" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.113854 4707 scope.go:117] "RemoveContainer" containerID="3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2" Jan 21 16:45:12 crc 
kubenswrapper[4707]: E0121 16:45:12.114184 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2\": container with ID starting with 3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2 not found: ID does not exist" containerID="3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.114207 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2"} err="failed to get container status \"3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2\": rpc error: code = NotFound desc = could not find container \"3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2\": container with ID starting with 3f6afd60367e13e3e7189497b84b046ba451424f12f1259408b1a0949c6f63d2 not found: ID does not exist" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.117048 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.117072 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.117085 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfkv7\" (UniqueName: \"kubernetes.io/projected/9ace7c06-1b97-4269-ae65-ef1280cfa7ab-kube-api-access-xfkv7\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.383571 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lldsk"] Jan 21 16:45:12 crc kubenswrapper[4707]: I0121 16:45:12.388915 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lldsk"] Jan 21 16:45:13 crc kubenswrapper[4707]: I0121 16:45:13.188380 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" path="/var/lib/kubelet/pods/9ace7c06-1b97-4269-ae65-ef1280cfa7ab/volumes" Jan 21 16:45:18 crc kubenswrapper[4707]: I0121 16:45:18.182653 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:45:18 crc kubenswrapper[4707]: E0121 16:45:18.183369 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.241319 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh"] Jan 21 16:45:23 crc kubenswrapper[4707]: E0121 16:45:23.242194 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="extract" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 
16:45:23.242209 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="extract" Jan 21 16:45:23 crc kubenswrapper[4707]: E0121 16:45:23.242221 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="pull" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242228 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="pull" Jan 21 16:45:23 crc kubenswrapper[4707]: E0121 16:45:23.242248 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="extract-content" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242254 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="extract-content" Jan 21 16:45:23 crc kubenswrapper[4707]: E0121 16:45:23.242264 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="registry-server" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242270 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="registry-server" Jan 21 16:45:23 crc kubenswrapper[4707]: E0121 16:45:23.242280 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="util" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242287 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="util" Jan 21 16:45:23 crc kubenswrapper[4707]: E0121 16:45:23.242297 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="extract-utilities" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242303 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="extract-utilities" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242424 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ace7c06-1b97-4269-ae65-ef1280cfa7ab" containerName="registry-server" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242436 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" containerName="extract" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.242976 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.244486 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kw4b8" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.247123 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.251013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh"] Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.265151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-apiservice-cert\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.265236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2tzb\" (UniqueName: \"kubernetes.io/projected/a4f96623-749d-41d4-b59e-aedc4fd495bd-kube-api-access-d2tzb\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.265323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-webhook-cert\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.366402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-apiservice-cert\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.366470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2tzb\" (UniqueName: \"kubernetes.io/projected/a4f96623-749d-41d4-b59e-aedc4fd495bd-kube-api-access-d2tzb\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.366573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-webhook-cert\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.371177 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-webhook-cert\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.371227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-apiservice-cert\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.381236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2tzb\" (UniqueName: \"kubernetes.io/projected/a4f96623-749d-41d4-b59e-aedc4fd495bd-kube-api-access-d2tzb\") pod \"keystone-operator-controller-manager-9c985558-cvpzh\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.561085 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:23 crc kubenswrapper[4707]: I0121 16:45:23.919030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh"] Jan 21 16:45:24 crc kubenswrapper[4707]: I0121 16:45:24.126216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" event={"ID":"a4f96623-749d-41d4-b59e-aedc4fd495bd","Type":"ContainerStarted","Data":"e7d6a1dd83eda9fe4223bd36d061b71143fb3cea83f770f435a33a5d16217ba8"} Jan 21 16:45:24 crc kubenswrapper[4707]: I0121 16:45:24.126459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" event={"ID":"a4f96623-749d-41d4-b59e-aedc4fd495bd","Type":"ContainerStarted","Data":"720ee4ba06f85542f2a4e9bfe1e9f96226935a76fa5de67eb24646bf4068cd68"} Jan 21 16:45:24 crc kubenswrapper[4707]: I0121 16:45:24.126689 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:24 crc kubenswrapper[4707]: I0121 16:45:24.142379 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" podStartSLOduration=1.14236586 podStartE2EDuration="1.14236586s" podCreationTimestamp="2026-01-21 16:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:24.140262565 +0000 UTC m=+6221.321778787" watchObservedRunningTime="2026-01-21 16:45:24.14236586 +0000 UTC m=+6221.323882082" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 16:45:26.111422 4707 scope.go:117] "RemoveContainer" containerID="be52f61faaed1d80a7e1d6d58bebb247b710b7d9e40a4f190ed3ab8611f6518a" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 16:45:26.128569 4707 scope.go:117] "RemoveContainer" containerID="4e609a0862aef0158bf6a9e405656f087f50b6325376a2bd11fbe2989a0e0674" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 
16:45:26.147546 4707 scope.go:117] "RemoveContainer" containerID="fe1b3c138a37e5d49368ecc528dc98056e0178aa77d839a338195969abc544ad" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 16:45:26.164224 4707 scope.go:117] "RemoveContainer" containerID="a80cda2bbd02140cdb752637eff5edb859ab73879b0d365c3e8b5a3ac49ed431" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 16:45:26.182302 4707 scope.go:117] "RemoveContainer" containerID="dc05be075054f1399bdf7ce2e966ea564638f97fdfb68bceb91cfec21aee7732" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 16:45:26.199560 4707 scope.go:117] "RemoveContainer" containerID="babc4cfb1ff1f6edc7090b25028b954b2b20c9b39ad3272a8498718d3a261c2a" Jan 21 16:45:26 crc kubenswrapper[4707]: I0121 16:45:26.217168 4707 scope.go:117] "RemoveContainer" containerID="4387c1b4880d34e8f97f3a8c5871ca68b1bbdfb56c429d36cfb2c6b67a562251" Jan 21 16:45:30 crc kubenswrapper[4707]: I0121 16:45:30.182501 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:45:30 crc kubenswrapper[4707]: E0121 16:45:30.182867 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:45:33 crc kubenswrapper[4707]: I0121 16:45:33.565314 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 16:45:35 crc kubenswrapper[4707]: I0121 16:45:35.198787 4707 generic.go:334] "Generic (PLEG): container finished" podID="2f489928-26ae-45c9-851c-6072f97f37e0" containerID="22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5" exitCode=0 Jan 21 16:45:35 crc kubenswrapper[4707]: I0121 16:45:35.198834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"2f489928-26ae-45c9-851c-6072f97f37e0","Type":"ContainerDied","Data":"22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5"} Jan 21 16:45:36 crc kubenswrapper[4707]: I0121 16:45:36.205779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"2f489928-26ae-45c9-851c-6072f97f37e0","Type":"ContainerStarted","Data":"b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a"} Jan 21 16:45:36 crc kubenswrapper[4707]: I0121 16:45:36.206364 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:36 crc kubenswrapper[4707]: I0121 16:45:36.222927 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.22291373 podStartE2EDuration="36.22291373s" podCreationTimestamp="2026-01-21 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:36.219398591 +0000 UTC m=+6233.400914813" watchObservedRunningTime="2026-01-21 16:45:36.22291373 +0000 UTC m=+6233.404429952" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.132464 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jbclj"] Jan 21 16:45:37 crc 
kubenswrapper[4707]: I0121 16:45:37.133881 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.139612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbclj"] Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.230320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-utilities\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.230395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-catalog-content\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.230422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lc4t\" (UniqueName: \"kubernetes.io/projected/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-kube-api-access-8lc4t\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.331490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-catalog-content\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.331547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lc4t\" (UniqueName: \"kubernetes.io/projected/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-kube-api-access-8lc4t\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.331667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-utilities\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.331921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-catalog-content\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.332056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-utilities\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc 
kubenswrapper[4707]: I0121 16:45:37.346159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lc4t\" (UniqueName: \"kubernetes.io/projected/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-kube-api-access-8lc4t\") pod \"redhat-marketplace-jbclj\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.456511 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:37 crc kubenswrapper[4707]: I0121 16:45:37.812513 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbclj"] Jan 21 16:45:38 crc kubenswrapper[4707]: I0121 16:45:38.218863 4707 generic.go:334] "Generic (PLEG): container finished" podID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerID="cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4" exitCode=0 Jan 21 16:45:38 crc kubenswrapper[4707]: I0121 16:45:38.218898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbclj" event={"ID":"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428","Type":"ContainerDied","Data":"cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4"} Jan 21 16:45:38 crc kubenswrapper[4707]: I0121 16:45:38.219071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbclj" event={"ID":"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428","Type":"ContainerStarted","Data":"a8b9e498276d421c2e72317b2a91a3d819eeecb17d4da28d50063d706ee5fd09"} Jan 21 16:45:39 crc kubenswrapper[4707]: I0121 16:45:39.226008 4707 generic.go:334] "Generic (PLEG): container finished" podID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerID="51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107" exitCode=0 Jan 21 16:45:39 crc kubenswrapper[4707]: I0121 16:45:39.226098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbclj" event={"ID":"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428","Type":"ContainerDied","Data":"51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107"} Jan 21 16:45:40 crc kubenswrapper[4707]: I0121 16:45:40.233202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbclj" event={"ID":"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428","Type":"ContainerStarted","Data":"f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546"} Jan 21 16:45:40 crc kubenswrapper[4707]: I0121 16:45:40.247292 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jbclj" podStartSLOduration=1.7749976570000001 podStartE2EDuration="3.247278521s" podCreationTimestamp="2026-01-21 16:45:37 +0000 UTC" firstStartedPulling="2026-01-21 16:45:38.220210983 +0000 UTC m=+6235.401727205" lastFinishedPulling="2026-01-21 16:45:39.692491846 +0000 UTC m=+6236.874008069" observedRunningTime="2026-01-21 16:45:40.245884159 +0000 UTC m=+6237.427400381" watchObservedRunningTime="2026-01-21 16:45:40.247278521 +0000 UTC m=+6237.428794743" Jan 21 16:45:41 crc kubenswrapper[4707]: I0121 16:45:41.182901 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:45:41 crc kubenswrapper[4707]: E0121 16:45:41.183407 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.432371 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-7h8mh"] Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.434033 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.439118 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg"] Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.440175 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.446220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.451215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg"] Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.461034 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-7h8mh"] Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.507591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjwt\" (UniqueName: \"kubernetes.io/projected/d3289dec-3166-4339-b51b-aca57132faec-kube-api-access-khjwt\") pod \"keystone-db-create-7h8mh\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.507671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3289dec-3166-4339-b51b-aca57132faec-operator-scripts\") pod \"keystone-db-create-7h8mh\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.507784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d306c2-6922-4d44-8412-ad565b5b05d2-operator-scripts\") pod \"keystone-35dd-account-create-update-4lrlg\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.507828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d55h4\" (UniqueName: \"kubernetes.io/projected/36d306c2-6922-4d44-8412-ad565b5b05d2-kube-api-access-d55h4\") pod \"keystone-35dd-account-create-update-4lrlg\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.608436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d3289dec-3166-4339-b51b-aca57132faec-operator-scripts\") pod \"keystone-db-create-7h8mh\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.608507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d306c2-6922-4d44-8412-ad565b5b05d2-operator-scripts\") pod \"keystone-35dd-account-create-update-4lrlg\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.608534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d55h4\" (UniqueName: \"kubernetes.io/projected/36d306c2-6922-4d44-8412-ad565b5b05d2-kube-api-access-d55h4\") pod \"keystone-35dd-account-create-update-4lrlg\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.608581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjwt\" (UniqueName: \"kubernetes.io/projected/d3289dec-3166-4339-b51b-aca57132faec-kube-api-access-khjwt\") pod \"keystone-db-create-7h8mh\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.609035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3289dec-3166-4339-b51b-aca57132faec-operator-scripts\") pod \"keystone-db-create-7h8mh\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.609189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d306c2-6922-4d44-8412-ad565b5b05d2-operator-scripts\") pod \"keystone-35dd-account-create-update-4lrlg\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.622742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjwt\" (UniqueName: \"kubernetes.io/projected/d3289dec-3166-4339-b51b-aca57132faec-kube-api-access-khjwt\") pod \"keystone-db-create-7h8mh\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.623993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d55h4\" (UniqueName: \"kubernetes.io/projected/36d306c2-6922-4d44-8412-ad565b5b05d2-kube-api-access-d55h4\") pod \"keystone-35dd-account-create-update-4lrlg\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.750967 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:43 crc kubenswrapper[4707]: I0121 16:45:43.766829 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.109210 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-7h8mh"] Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.159693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg"] Jan 21 16:45:44 crc kubenswrapper[4707]: W0121 16:45:44.162984 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d306c2_6922_4d44_8412_ad565b5b05d2.slice/crio-57e20bdb8bbc02fc5a679b92485df024a6ad6e942ece88ba0f17a32c829f9aa9 WatchSource:0}: Error finding container 57e20bdb8bbc02fc5a679b92485df024a6ad6e942ece88ba0f17a32c829f9aa9: Status 404 returned error can't find the container with id 57e20bdb8bbc02fc5a679b92485df024a6ad6e942ece88ba0f17a32c829f9aa9 Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.254968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" event={"ID":"36d306c2-6922-4d44-8412-ad565b5b05d2","Type":"ContainerStarted","Data":"944e0c0c19123d9ef95e73de9714d90bd42016490859fec68b41e5f114af8f39"} Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.255009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" event={"ID":"36d306c2-6922-4d44-8412-ad565b5b05d2","Type":"ContainerStarted","Data":"57e20bdb8bbc02fc5a679b92485df024a6ad6e942ece88ba0f17a32c829f9aa9"} Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.256219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" event={"ID":"d3289dec-3166-4339-b51b-aca57132faec","Type":"ContainerStarted","Data":"a379485354d057a9c5c20ad2e07e735483103ab3804bb6c9aa35e4db4852cf26"} Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.256259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" event={"ID":"d3289dec-3166-4339-b51b-aca57132faec","Type":"ContainerStarted","Data":"8cc0da37d41a1355650f3d386efbaaecbdfa8074a603078563832695399fb9f2"} Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.266962 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" podStartSLOduration=1.266946568 podStartE2EDuration="1.266946568s" podCreationTimestamp="2026-01-21 16:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:44.264576753 +0000 UTC m=+6241.446092975" watchObservedRunningTime="2026-01-21 16:45:44.266946568 +0000 UTC m=+6241.448462791" Jan 21 16:45:44 crc kubenswrapper[4707]: I0121 16:45:44.275563 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" podStartSLOduration=1.275548713 podStartE2EDuration="1.275548713s" podCreationTimestamp="2026-01-21 16:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:44.274394794 +0000 UTC m=+6241.455911015" watchObservedRunningTime="2026-01-21 16:45:44.275548713 +0000 UTC m=+6241.457064935" Jan 21 16:45:45 crc kubenswrapper[4707]: I0121 16:45:45.262553 4707 
generic.go:334] "Generic (PLEG): container finished" podID="36d306c2-6922-4d44-8412-ad565b5b05d2" containerID="944e0c0c19123d9ef95e73de9714d90bd42016490859fec68b41e5f114af8f39" exitCode=0 Jan 21 16:45:45 crc kubenswrapper[4707]: I0121 16:45:45.262638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" event={"ID":"36d306c2-6922-4d44-8412-ad565b5b05d2","Type":"ContainerDied","Data":"944e0c0c19123d9ef95e73de9714d90bd42016490859fec68b41e5f114af8f39"} Jan 21 16:45:45 crc kubenswrapper[4707]: I0121 16:45:45.264606 4707 generic.go:334] "Generic (PLEG): container finished" podID="d3289dec-3166-4339-b51b-aca57132faec" containerID="a379485354d057a9c5c20ad2e07e735483103ab3804bb6c9aa35e4db4852cf26" exitCode=0 Jan 21 16:45:45 crc kubenswrapper[4707]: I0121 16:45:45.264633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" event={"ID":"d3289dec-3166-4339-b51b-aca57132faec","Type":"ContainerDied","Data":"a379485354d057a9c5c20ad2e07e735483103ab3804bb6c9aa35e4db4852cf26"} Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.518704 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.572488 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.646057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d306c2-6922-4d44-8412-ad565b5b05d2-operator-scripts\") pod \"36d306c2-6922-4d44-8412-ad565b5b05d2\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.646207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d55h4\" (UniqueName: \"kubernetes.io/projected/36d306c2-6922-4d44-8412-ad565b5b05d2-kube-api-access-d55h4\") pod \"36d306c2-6922-4d44-8412-ad565b5b05d2\" (UID: \"36d306c2-6922-4d44-8412-ad565b5b05d2\") " Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.646864 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d306c2-6922-4d44-8412-ad565b5b05d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36d306c2-6922-4d44-8412-ad565b5b05d2" (UID: "36d306c2-6922-4d44-8412-ad565b5b05d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.650777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d306c2-6922-4d44-8412-ad565b5b05d2-kube-api-access-d55h4" (OuterVolumeSpecName: "kube-api-access-d55h4") pod "36d306c2-6922-4d44-8412-ad565b5b05d2" (UID: "36d306c2-6922-4d44-8412-ad565b5b05d2"). InnerVolumeSpecName "kube-api-access-d55h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.747389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3289dec-3166-4339-b51b-aca57132faec-operator-scripts\") pod \"d3289dec-3166-4339-b51b-aca57132faec\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.747584 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khjwt\" (UniqueName: \"kubernetes.io/projected/d3289dec-3166-4339-b51b-aca57132faec-kube-api-access-khjwt\") pod \"d3289dec-3166-4339-b51b-aca57132faec\" (UID: \"d3289dec-3166-4339-b51b-aca57132faec\") " Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.747721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3289dec-3166-4339-b51b-aca57132faec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3289dec-3166-4339-b51b-aca57132faec" (UID: "d3289dec-3166-4339-b51b-aca57132faec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.748097 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d55h4\" (UniqueName: \"kubernetes.io/projected/36d306c2-6922-4d44-8412-ad565b5b05d2-kube-api-access-d55h4\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.748117 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3289dec-3166-4339-b51b-aca57132faec-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.748127 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d306c2-6922-4d44-8412-ad565b5b05d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.749835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3289dec-3166-4339-b51b-aca57132faec-kube-api-access-khjwt" (OuterVolumeSpecName: "kube-api-access-khjwt") pod "d3289dec-3166-4339-b51b-aca57132faec" (UID: "d3289dec-3166-4339-b51b-aca57132faec"). InnerVolumeSpecName "kube-api-access-khjwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:46 crc kubenswrapper[4707]: I0121 16:45:46.849200 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khjwt\" (UniqueName: \"kubernetes.io/projected/d3289dec-3166-4339-b51b-aca57132faec-kube-api-access-khjwt\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.277009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" event={"ID":"36d306c2-6922-4d44-8412-ad565b5b05d2","Type":"ContainerDied","Data":"57e20bdb8bbc02fc5a679b92485df024a6ad6e942ece88ba0f17a32c829f9aa9"} Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.277064 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57e20bdb8bbc02fc5a679b92485df024a6ad6e942ece88ba0f17a32c829f9aa9" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.277029 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.278665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" event={"ID":"d3289dec-3166-4339-b51b-aca57132faec","Type":"ContainerDied","Data":"8cc0da37d41a1355650f3d386efbaaecbdfa8074a603078563832695399fb9f2"} Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.278695 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-7h8mh" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.278699 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cc0da37d41a1355650f3d386efbaaecbdfa8074a603078563832695399fb9f2" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.457013 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.457247 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:47 crc kubenswrapper[4707]: I0121 16:45:47.489714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:48 crc kubenswrapper[4707]: I0121 16:45:48.311595 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:50 crc kubenswrapper[4707]: I0121 16:45:50.926828 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbclj"] Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.298354 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jbclj" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="registry-server" containerID="cri-o://f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546" gracePeriod=2 Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.629578 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.705840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lc4t\" (UniqueName: \"kubernetes.io/projected/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-kube-api-access-8lc4t\") pod \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.706057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-catalog-content\") pod \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.706092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-utilities\") pod \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\" (UID: \"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428\") " Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.706742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-utilities" (OuterVolumeSpecName: "utilities") pod "a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" (UID: "a623a4f0-ed10-4eee-bd3c-c68c2a9ff428"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.709577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-kube-api-access-8lc4t" (OuterVolumeSpecName: "kube-api-access-8lc4t") pod "a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" (UID: "a623a4f0-ed10-4eee-bd3c-c68c2a9ff428"). InnerVolumeSpecName "kube-api-access-8lc4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.722803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" (UID: "a623a4f0-ed10-4eee-bd3c-c68c2a9ff428"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.807393 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.807420 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:51 crc kubenswrapper[4707]: I0121 16:45:51.807432 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lc4t\" (UniqueName: \"kubernetes.io/projected/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428-kube-api-access-8lc4t\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.031676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.305044 4707 generic.go:334] "Generic (PLEG): container finished" podID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerID="f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546" exitCode=0 Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.305080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbclj" event={"ID":"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428","Type":"ContainerDied","Data":"f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546"} Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.305087 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbclj" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.305108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbclj" event={"ID":"a623a4f0-ed10-4eee-bd3c-c68c2a9ff428","Type":"ContainerDied","Data":"a8b9e498276d421c2e72317b2a91a3d819eeecb17d4da28d50063d706ee5fd09"} Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.305125 4707 scope.go:117] "RemoveContainer" containerID="f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.319660 4707 scope.go:117] "RemoveContainer" containerID="51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.326567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbclj"] Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.341749 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbclj"] Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.344959 4707 scope.go:117] "RemoveContainer" containerID="cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.376984 4707 scope.go:117] "RemoveContainer" containerID="f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.378155 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546\": container with ID starting with 
f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546 not found: ID does not exist" containerID="f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.378187 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546"} err="failed to get container status \"f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546\": rpc error: code = NotFound desc = could not find container \"f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546\": container with ID starting with f1acbaf6d56613ee7109036f67db8a00da72325f27bbef5814c502ea7b3cb546 not found: ID does not exist" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.378209 4707 scope.go:117] "RemoveContainer" containerID="51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.378470 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107\": container with ID starting with 51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107 not found: ID does not exist" containerID="51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.378495 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107"} err="failed to get container status \"51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107\": rpc error: code = NotFound desc = could not find container \"51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107\": container with ID starting with 51290df308cbd298c9ff25e25d61cb05938dd1d9c4aaf61c86e49c0b05f60107 not found: ID does not exist" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.378510 4707 scope.go:117] "RemoveContainer" containerID="cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.378902 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4\": container with ID starting with cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4 not found: ID does not exist" containerID="cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.378927 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4"} err="failed to get container status \"cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4\": rpc error: code = NotFound desc = could not find container \"cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4\": container with ID starting with cc19fe4b44c7223c6c10ca06f972af306650688145757de2ae606121962190a4 not found: ID does not exist" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597358 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-q89gs"] Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.597572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="extract-utilities" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597589 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="extract-utilities" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.597616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d306c2-6922-4d44-8412-ad565b5b05d2" containerName="mariadb-account-create-update" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597622 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d306c2-6922-4d44-8412-ad565b5b05d2" containerName="mariadb-account-create-update" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.597632 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="registry-server" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597638 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="registry-server" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.597651 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3289dec-3166-4339-b51b-aca57132faec" containerName="mariadb-database-create" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597657 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3289dec-3166-4339-b51b-aca57132faec" containerName="mariadb-database-create" Jan 21 16:45:52 crc kubenswrapper[4707]: E0121 16:45:52.597667 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="extract-content" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597672 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="extract-content" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597795 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3289dec-3166-4339-b51b-aca57132faec" containerName="mariadb-database-create" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597830 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d306c2-6922-4d44-8412-ad565b5b05d2" containerName="mariadb-account-create-update" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.597841 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" containerName="registry-server" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.598201 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.599446 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.599792 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.600373 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-75j2q" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.600393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.609033 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-q89gs"] Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.718223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-config-data\") pod \"keystone-db-sync-q89gs\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.718283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbq4z\" (UniqueName: \"kubernetes.io/projected/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-kube-api-access-lbq4z\") pod \"keystone-db-sync-q89gs\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.819861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-config-data\") pod \"keystone-db-sync-q89gs\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.819923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbq4z\" (UniqueName: \"kubernetes.io/projected/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-kube-api-access-lbq4z\") pod \"keystone-db-sync-q89gs\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.823457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-config-data\") pod \"keystone-db-sync-q89gs\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.831507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbq4z\" (UniqueName: \"kubernetes.io/projected/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-kube-api-access-lbq4z\") pod \"keystone-db-sync-q89gs\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:52 crc kubenswrapper[4707]: I0121 16:45:52.913118 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:53 crc kubenswrapper[4707]: I0121 16:45:53.188595 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a623a4f0-ed10-4eee-bd3c-c68c2a9ff428" path="/var/lib/kubelet/pods/a623a4f0-ed10-4eee-bd3c-c68c2a9ff428/volumes" Jan 21 16:45:53 crc kubenswrapper[4707]: I0121 16:45:53.254769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-q89gs"] Jan 21 16:45:53 crc kubenswrapper[4707]: I0121 16:45:53.311136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" event={"ID":"15a37ab5-e3d7-4d0f-bf63-c08d2935057a","Type":"ContainerStarted","Data":"0e4e9eb44d643169bac9c9b7a8fafee14c1d3b15dae1d5fa9089a0171b7659c5"} Jan 21 16:45:54 crc kubenswrapper[4707]: I0121 16:45:54.182600 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:45:54 crc kubenswrapper[4707]: E0121 16:45:54.182967 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:45:54 crc kubenswrapper[4707]: I0121 16:45:54.319415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" event={"ID":"15a37ab5-e3d7-4d0f-bf63-c08d2935057a","Type":"ContainerStarted","Data":"5b5880910cc4620b820188d3d6a9729b7102ace5709df5d0f3a5ba8b705783ce"} Jan 21 16:45:54 crc kubenswrapper[4707]: I0121 16:45:54.333029 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" podStartSLOduration=2.333015203 podStartE2EDuration="2.333015203s" podCreationTimestamp="2026-01-21 16:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:54.330798846 +0000 UTC m=+6251.512315068" watchObservedRunningTime="2026-01-21 16:45:54.333015203 +0000 UTC m=+6251.514531425" Jan 21 16:45:55 crc kubenswrapper[4707]: I0121 16:45:55.325447 4707 generic.go:334] "Generic (PLEG): container finished" podID="15a37ab5-e3d7-4d0f-bf63-c08d2935057a" containerID="5b5880910cc4620b820188d3d6a9729b7102ace5709df5d0f3a5ba8b705783ce" exitCode=0 Jan 21 16:45:55 crc kubenswrapper[4707]: I0121 16:45:55.325528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" event={"ID":"15a37ab5-e3d7-4d0f-bf63-c08d2935057a","Type":"ContainerDied","Data":"5b5880910cc4620b820188d3d6a9729b7102ace5709df5d0f3a5ba8b705783ce"} Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.542727 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.667843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbq4z\" (UniqueName: \"kubernetes.io/projected/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-kube-api-access-lbq4z\") pod \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.667974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-config-data\") pod \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\" (UID: \"15a37ab5-e3d7-4d0f-bf63-c08d2935057a\") " Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.672630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-kube-api-access-lbq4z" (OuterVolumeSpecName: "kube-api-access-lbq4z") pod "15a37ab5-e3d7-4d0f-bf63-c08d2935057a" (UID: "15a37ab5-e3d7-4d0f-bf63-c08d2935057a"). InnerVolumeSpecName "kube-api-access-lbq4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.735301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-config-data" (OuterVolumeSpecName: "config-data") pod "15a37ab5-e3d7-4d0f-bf63-c08d2935057a" (UID: "15a37ab5-e3d7-4d0f-bf63-c08d2935057a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.768857 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:56 crc kubenswrapper[4707]: I0121 16:45:56.768890 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbq4z\" (UniqueName: \"kubernetes.io/projected/15a37ab5-e3d7-4d0f-bf63-c08d2935057a-kube-api-access-lbq4z\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.336836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" event={"ID":"15a37ab5-e3d7-4d0f-bf63-c08d2935057a","Type":"ContainerDied","Data":"0e4e9eb44d643169bac9c9b7a8fafee14c1d3b15dae1d5fa9089a0171b7659c5"} Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.336872 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e4e9eb44d643169bac9c9b7a8fafee14c1d3b15dae1d5fa9089a0171b7659c5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.336875 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-q89gs" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.494533 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lbzj5"] Jan 21 16:45:57 crc kubenswrapper[4707]: E0121 16:45:57.494772 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a37ab5-e3d7-4d0f-bf63-c08d2935057a" containerName="keystone-db-sync" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.494784 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a37ab5-e3d7-4d0f-bf63-c08d2935057a" containerName="keystone-db-sync" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.494947 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a37ab5-e3d7-4d0f-bf63-c08d2935057a" containerName="keystone-db-sync" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.495367 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.496902 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.497212 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-75j2q" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.497383 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.497493 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.497659 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.509418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lbzj5"] Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.679458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-fernet-keys\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.679524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695xc\" (UniqueName: \"kubernetes.io/projected/d8b54143-71ae-4d00-994a-2beb8202837b-kube-api-access-695xc\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.679687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-credential-keys\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.679751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-scripts\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.679782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-config-data\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.780597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-credential-keys\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.780639 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-scripts\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.780657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-config-data\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.780752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-fernet-keys\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.780791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-695xc\" (UniqueName: \"kubernetes.io/projected/d8b54143-71ae-4d00-994a-2beb8202837b-kube-api-access-695xc\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.783834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-scripts\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.784049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-credential-keys\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.784216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-fernet-keys\") pod 
\"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.784833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-config-data\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.795955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-695xc\" (UniqueName: \"kubernetes.io/projected/d8b54143-71ae-4d00-994a-2beb8202837b-kube-api-access-695xc\") pod \"keystone-bootstrap-lbzj5\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:57 crc kubenswrapper[4707]: I0121 16:45:57.829847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:45:58 crc kubenswrapper[4707]: I0121 16:45:58.187997 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lbzj5"] Jan 21 16:45:58 crc kubenswrapper[4707]: W0121 16:45:58.192644 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b54143_71ae_4d00_994a_2beb8202837b.slice/crio-85b915e72ee063a82e7929798a82f0c4593c441a50fef07e18b6316e66371c36 WatchSource:0}: Error finding container 85b915e72ee063a82e7929798a82f0c4593c441a50fef07e18b6316e66371c36: Status 404 returned error can't find the container with id 85b915e72ee063a82e7929798a82f0c4593c441a50fef07e18b6316e66371c36 Jan 21 16:45:58 crc kubenswrapper[4707]: I0121 16:45:58.344506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" event={"ID":"d8b54143-71ae-4d00-994a-2beb8202837b","Type":"ContainerStarted","Data":"d9ec4d40ea85ba2a41b34bb92391d88dc2794b0889a6b0dd17ba6b6e93e88483"} Jan 21 16:45:58 crc kubenswrapper[4707]: I0121 16:45:58.344546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" event={"ID":"d8b54143-71ae-4d00-994a-2beb8202837b","Type":"ContainerStarted","Data":"85b915e72ee063a82e7929798a82f0c4593c441a50fef07e18b6316e66371c36"} Jan 21 16:45:58 crc kubenswrapper[4707]: I0121 16:45:58.357937 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" podStartSLOduration=1.357925899 podStartE2EDuration="1.357925899s" podCreationTimestamp="2026-01-21 16:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:58.354877557 +0000 UTC m=+6255.536393779" watchObservedRunningTime="2026-01-21 16:45:58.357925899 +0000 UTC m=+6255.539442121" Jan 21 16:46:00 crc kubenswrapper[4707]: I0121 16:46:00.362675 4707 generic.go:334] "Generic (PLEG): container finished" podID="d8b54143-71ae-4d00-994a-2beb8202837b" containerID="d9ec4d40ea85ba2a41b34bb92391d88dc2794b0889a6b0dd17ba6b6e93e88483" exitCode=0 Jan 21 16:46:00 crc kubenswrapper[4707]: I0121 16:46:00.362751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" 
event={"ID":"d8b54143-71ae-4d00-994a-2beb8202837b","Type":"ContainerDied","Data":"d9ec4d40ea85ba2a41b34bb92391d88dc2794b0889a6b0dd17ba6b6e93e88483"} Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.618311 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.736281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-scripts\") pod \"d8b54143-71ae-4d00-994a-2beb8202837b\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.736380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-credential-keys\") pod \"d8b54143-71ae-4d00-994a-2beb8202837b\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.736521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-config-data\") pod \"d8b54143-71ae-4d00-994a-2beb8202837b\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.736638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-695xc\" (UniqueName: \"kubernetes.io/projected/d8b54143-71ae-4d00-994a-2beb8202837b-kube-api-access-695xc\") pod \"d8b54143-71ae-4d00-994a-2beb8202837b\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.736671 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-fernet-keys\") pod \"d8b54143-71ae-4d00-994a-2beb8202837b\" (UID: \"d8b54143-71ae-4d00-994a-2beb8202837b\") " Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.740495 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d8b54143-71ae-4d00-994a-2beb8202837b" (UID: "d8b54143-71ae-4d00-994a-2beb8202837b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.740503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b54143-71ae-4d00-994a-2beb8202837b-kube-api-access-695xc" (OuterVolumeSpecName: "kube-api-access-695xc") pod "d8b54143-71ae-4d00-994a-2beb8202837b" (UID: "d8b54143-71ae-4d00-994a-2beb8202837b"). InnerVolumeSpecName "kube-api-access-695xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.740836 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d8b54143-71ae-4d00-994a-2beb8202837b" (UID: "d8b54143-71ae-4d00-994a-2beb8202837b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.740993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-scripts" (OuterVolumeSpecName: "scripts") pod "d8b54143-71ae-4d00-994a-2beb8202837b" (UID: "d8b54143-71ae-4d00-994a-2beb8202837b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.750967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-config-data" (OuterVolumeSpecName: "config-data") pod "d8b54143-71ae-4d00-994a-2beb8202837b" (UID: "d8b54143-71ae-4d00-994a-2beb8202837b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.838675 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-695xc\" (UniqueName: \"kubernetes.io/projected/d8b54143-71ae-4d00-994a-2beb8202837b-kube-api-access-695xc\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.838715 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.838728 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.838739 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:01 crc kubenswrapper[4707]: I0121 16:46:01.838748 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b54143-71ae-4d00-994a-2beb8202837b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.377670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" event={"ID":"d8b54143-71ae-4d00-994a-2beb8202837b","Type":"ContainerDied","Data":"85b915e72ee063a82e7929798a82f0c4593c441a50fef07e18b6316e66371c36"} Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.377889 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b915e72ee063a82e7929798a82f0c4593c441a50fef07e18b6316e66371c36" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.377699 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-lbzj5" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.415452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-56bd68f756-n4bjt"] Jan 21 16:46:02 crc kubenswrapper[4707]: E0121 16:46:02.415744 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b54143-71ae-4d00-994a-2beb8202837b" containerName="keystone-bootstrap" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.415775 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b54143-71ae-4d00-994a-2beb8202837b" containerName="keystone-bootstrap" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.416474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b54143-71ae-4d00-994a-2beb8202837b" containerName="keystone-bootstrap" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.418149 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.421289 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.421325 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.421755 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.422275 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-75j2q" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.445258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-fernet-keys\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.445405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-config-data\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.445494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-credential-keys\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.445563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-scripts\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.445710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j279k\" 
(UniqueName: \"kubernetes.io/projected/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-kube-api-access-j279k\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.445930 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-56bd68f756-n4bjt"] Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.546357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j279k\" (UniqueName: \"kubernetes.io/projected/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-kube-api-access-j279k\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.546556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-fernet-keys\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.546652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-config-data\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.546775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-credential-keys\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.546890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-scripts\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.551166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-credential-keys\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.551166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-scripts\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.551677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-config-data\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.552650 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-fernet-keys\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.563895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j279k\" (UniqueName: \"kubernetes.io/projected/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-kube-api-access-j279k\") pod \"keystone-56bd68f756-n4bjt\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:02 crc kubenswrapper[4707]: I0121 16:46:02.734321 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:03 crc kubenswrapper[4707]: I0121 16:46:03.088007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-56bd68f756-n4bjt"] Jan 21 16:46:03 crc kubenswrapper[4707]: W0121 16:46:03.092272 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe08f42b_553b_4361_a0e0_ace0d6db5e8e.slice/crio-f93c6f55c8385282ee2c0245a5d4e94da72640825e8e35f92fed81b2baea128a WatchSource:0}: Error finding container f93c6f55c8385282ee2c0245a5d4e94da72640825e8e35f92fed81b2baea128a: Status 404 returned error can't find the container with id f93c6f55c8385282ee2c0245a5d4e94da72640825e8e35f92fed81b2baea128a Jan 21 16:46:03 crc kubenswrapper[4707]: I0121 16:46:03.384721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" event={"ID":"fe08f42b-553b-4361-a0e0-ace0d6db5e8e","Type":"ContainerStarted","Data":"52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a"} Jan 21 16:46:03 crc kubenswrapper[4707]: I0121 16:46:03.384986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" event={"ID":"fe08f42b-553b-4361-a0e0-ace0d6db5e8e","Type":"ContainerStarted","Data":"f93c6f55c8385282ee2c0245a5d4e94da72640825e8e35f92fed81b2baea128a"} Jan 21 16:46:03 crc kubenswrapper[4707]: I0121 16:46:03.408853 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" podStartSLOduration=1.408839565 podStartE2EDuration="1.408839565s" podCreationTimestamp="2026-01-21 16:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:46:03.406824487 +0000 UTC m=+6260.588340709" watchObservedRunningTime="2026-01-21 16:46:03.408839565 +0000 UTC m=+6260.590355777" Jan 21 16:46:04 crc kubenswrapper[4707]: I0121 16:46:04.390535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:09 crc kubenswrapper[4707]: I0121 16:46:09.183126 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:46:09 crc kubenswrapper[4707]: E0121 16:46:09.183642 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:46:21 crc kubenswrapper[4707]: I0121 16:46:21.182158 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:46:21 crc kubenswrapper[4707]: E0121 16:46:21.182730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:46:26 crc kubenswrapper[4707]: I0121 16:46:26.287949 4707 scope.go:117] "RemoveContainer" containerID="e519e8867f97adcf5975a111f78a8343d5660a802db0350be83c18e6285c3c0f" Jan 21 16:46:26 crc kubenswrapper[4707]: I0121 16:46:26.302200 4707 scope.go:117] "RemoveContainer" containerID="6f9706b5be81d9d93e668a9e1b8048574b2113237604f19fc7564233ede4d0d0" Jan 21 16:46:26 crc kubenswrapper[4707]: I0121 16:46:26.332103 4707 scope.go:117] "RemoveContainer" containerID="4cd6cd5afa339b83660cde8c66e04cbee9185aebd4d19e72c48f6e0f30a78d3c" Jan 21 16:46:26 crc kubenswrapper[4707]: I0121 16:46:26.350713 4707 scope.go:117] "RemoveContainer" containerID="96254e932b512b73daaf144d488ace9d1022025ec187cdf7638fd0994f40d6db" Jan 21 16:46:26 crc kubenswrapper[4707]: I0121 16:46:26.363933 4707 scope.go:117] "RemoveContainer" containerID="9ba17d6b270bdf649b53367f4f8805234a3b0e6fb56411d6f3cb0e687eec4328" Jan 21 16:46:26 crc kubenswrapper[4707]: I0121 16:46:26.381692 4707 scope.go:117] "RemoveContainer" containerID="85a6db681c23d2d559bfefcdac4a591fe4fbdbe19f51f00633faac999250b166" Jan 21 16:46:33 crc kubenswrapper[4707]: I0121 16:46:33.996897 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:34 crc kubenswrapper[4707]: E0121 16:46:34.916241 4707 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-56bd68f756-n4bjt_fe08f42b-553b-4361-a0e0-ace0d6db5e8e/keystone-api/0.log" line={} Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.178067 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5d47bf985-97jrs"] Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.178770 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.188790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5d47bf985-97jrs"] Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.246285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.246381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.246414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtfjn\" (UniqueName: \"kubernetes.io/projected/760acdc9-a0cc-49ca-b0f5-a626ab478c44-kube-api-access-rtfjn\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.246435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.246483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: E0121 16:46:35.270108 4707 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-56bd68f756-n4bjt_fe08f42b-553b-4361-a0e0-ace0d6db5e8e/keystone-api/0.log" line={} Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.347670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.347747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.347881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.347937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtfjn\" (UniqueName: \"kubernetes.io/projected/760acdc9-a0cc-49ca-b0f5-a626ab478c44-kube-api-access-rtfjn\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.347965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.357364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.361704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.362245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.364039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.378347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtfjn\" (UniqueName: \"kubernetes.io/projected/760acdc9-a0cc-49ca-b0f5-a626ab478c44-kube-api-access-rtfjn\") pod \"keystone-5d47bf985-97jrs\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.492090 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:35 crc kubenswrapper[4707]: I0121 16:46:35.841404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5d47bf985-97jrs"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.182699 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.182976 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.554607 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-q89gs"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.561706 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lbzj5"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.565630 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-q89gs"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.570167 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-lbzj5"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.573934 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5d47bf985-97jrs"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.577029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" event={"ID":"760acdc9-a0cc-49ca-b0f5-a626ab478c44","Type":"ContainerStarted","Data":"da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8"} Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.577077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" event={"ID":"760acdc9-a0cc-49ca-b0f5-a626ab478c44","Type":"ContainerStarted","Data":"2ebffceac89c0e089df8ade65251d163337545fb66ca233cdd45c0431e6d4690"} Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.577182 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.577541 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" secret="" err="secret \"keystone-keystone-dockercfg-75j2q\" not found" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.578256 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-56bd68f756-n4bjt"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.578424 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" podUID="fe08f42b-553b-4361-a0e0-ace0d6db5e8e" containerName="keystone-api" containerID="cri-o://52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a" gracePeriod=30 Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.594264 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" podStartSLOduration=1.5942461140000002 podStartE2EDuration="1.594246114s" podCreationTimestamp="2026-01-21 16:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:46:36.589786328 +0000 UTC m=+6293.771302550" watchObservedRunningTime="2026-01-21 16:46:36.594246114 +0000 UTC m=+6293.775762335" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.625028 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone35dd-account-delete-qjldk"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.625898 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.640289 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone35dd-account-delete-qjldk"] Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.667876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b599c271-18c6-41a2-b845-f482d4101bbd-operator-scripts\") pod \"keystone35dd-account-delete-qjldk\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.667978 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.667992 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgxdv\" (UniqueName: \"kubernetes.io/projected/b599c271-18c6-41a2-b845-f482d4101bbd-kube-api-access-tgxdv\") pod \"keystone35dd-account-delete-qjldk\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668058 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:37.168042222 +0000 UTC m=+6294.349558434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone-scripts" not found Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668152 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668199 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:37.168184941 +0000 UTC m=+6294.349701153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone-config-data" not found Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668259 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668291 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668341 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:37.168321747 +0000 UTC m=+6294.349837969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone" not found Jan 21 16:46:36 crc kubenswrapper[4707]: E0121 16:46:36.668372 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:37.16835462 +0000 UTC m=+6294.349870842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone" not found Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.769561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgxdv\" (UniqueName: \"kubernetes.io/projected/b599c271-18c6-41a2-b845-f482d4101bbd-kube-api-access-tgxdv\") pod \"keystone35dd-account-delete-qjldk\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.769686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b599c271-18c6-41a2-b845-f482d4101bbd-operator-scripts\") pod \"keystone35dd-account-delete-qjldk\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.770565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b599c271-18c6-41a2-b845-f482d4101bbd-operator-scripts\") pod \"keystone35dd-account-delete-qjldk\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.785876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgxdv\" (UniqueName: \"kubernetes.io/projected/b599c271-18c6-41a2-b845-f482d4101bbd-kube-api-access-tgxdv\") pod \"keystone35dd-account-delete-qjldk\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:36 crc kubenswrapper[4707]: I0121 16:46:36.939435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.103281 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone35dd-account-delete-qjldk"] Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.176566 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.176630 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.176637 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:38.17662034 +0000 UTC m=+6295.358136562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone-scripts" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.176693 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. 
No retries permitted until 2026-01-21 16:46:38.176677738 +0000 UTC m=+6295.358193959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone-config-data" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.176737 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.176771 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:38.176750675 +0000 UTC m=+6295.358266896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.177083 4707 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Jan 21 16:46:37 crc kubenswrapper[4707]: E0121 16:46:37.177136 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys podName:760acdc9-a0cc-49ca-b0f5-a626ab478c44 nodeName:}" failed. No retries permitted until 2026-01-21 16:46:38.177126712 +0000 UTC m=+6295.358642934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys") pod "keystone-5d47bf985-97jrs" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44") : secret "keystone" not found Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.193857 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a37ab5-e3d7-4d0f-bf63-c08d2935057a" path="/var/lib/kubelet/pods/15a37ab5-e3d7-4d0f-bf63-c08d2935057a/volumes" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.194530 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b54143-71ae-4d00-994a-2beb8202837b" path="/var/lib/kubelet/pods/d8b54143-71ae-4d00-994a-2beb8202837b/volumes" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.585871 4707 generic.go:334] "Generic (PLEG): container finished" podID="b599c271-18c6-41a2-b845-f482d4101bbd" containerID="8cfa3acf6b0a4f0b0eda47e2d325bd35d823b4a7a67d3d9af03269b9bac967b2" exitCode=0 Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.585967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" event={"ID":"b599c271-18c6-41a2-b845-f482d4101bbd","Type":"ContainerDied","Data":"8cfa3acf6b0a4f0b0eda47e2d325bd35d823b4a7a67d3d9af03269b9bac967b2"} Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.586026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" event={"ID":"b599c271-18c6-41a2-b845-f482d4101bbd","Type":"ContainerStarted","Data":"ee665d01496875504efaf74b043066629c6f66bb9a23811bb573b77e598d61b8"} Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.586023 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" podUID="760acdc9-a0cc-49ca-b0f5-a626ab478c44" containerName="keystone-api" containerID="cri-o://da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8" gracePeriod=30 Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.922000 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.989202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts\") pod \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.989312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys\") pod \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.989413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtfjn\" (UniqueName: \"kubernetes.io/projected/760acdc9-a0cc-49ca-b0f5-a626ab478c44-kube-api-access-rtfjn\") pod \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.989478 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys\") pod \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.989502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data\") pod \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\" (UID: \"760acdc9-a0cc-49ca-b0f5-a626ab478c44\") " Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.994121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760acdc9-a0cc-49ca-b0f5-a626ab478c44-kube-api-access-rtfjn" (OuterVolumeSpecName: "kube-api-access-rtfjn") pod "760acdc9-a0cc-49ca-b0f5-a626ab478c44" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44"). InnerVolumeSpecName "kube-api-access-rtfjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.994330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "760acdc9-a0cc-49ca-b0f5-a626ab478c44" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.994669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts" (OuterVolumeSpecName: "scripts") pod "760acdc9-a0cc-49ca-b0f5-a626ab478c44" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:37 crc kubenswrapper[4707]: I0121 16:46:37.994693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "760acdc9-a0cc-49ca-b0f5-a626ab478c44" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.006085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data" (OuterVolumeSpecName: "config-data") pod "760acdc9-a0cc-49ca-b0f5-a626ab478c44" (UID: "760acdc9-a0cc-49ca-b0f5-a626ab478c44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.090718 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.090748 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.090773 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtfjn\" (UniqueName: \"kubernetes.io/projected/760acdc9-a0cc-49ca-b0f5-a626ab478c44-kube-api-access-rtfjn\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.090783 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.090792 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760acdc9-a0cc-49ca-b0f5-a626ab478c44-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.595253 4707 generic.go:334] "Generic (PLEG): container finished" podID="760acdc9-a0cc-49ca-b0f5-a626ab478c44" containerID="da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8" exitCode=0 Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.595356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" event={"ID":"760acdc9-a0cc-49ca-b0f5-a626ab478c44","Type":"ContainerDied","Data":"da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8"} Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.595411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" event={"ID":"760acdc9-a0cc-49ca-b0f5-a626ab478c44","Type":"ContainerDied","Data":"2ebffceac89c0e089df8ade65251d163337545fb66ca233cdd45c0431e6d4690"} Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.595368 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5d47bf985-97jrs" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.595431 4707 scope.go:117] "RemoveContainer" containerID="da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.618727 4707 scope.go:117] "RemoveContainer" containerID="da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8" Jan 21 16:46:38 crc kubenswrapper[4707]: E0121 16:46:38.619466 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8\": container with ID starting with da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8 not found: ID does not exist" containerID="da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.619512 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8"} err="failed to get container status \"da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8\": rpc error: code = NotFound desc = could not find container \"da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8\": container with ID starting with da0c20f1957bbc86a12912cac0ae0da26affac3194f552538cdd3ed9c5afa1f8 not found: ID does not exist" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.620574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5d47bf985-97jrs"] Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.624869 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5d47bf985-97jrs"] Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.828674 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.900236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b599c271-18c6-41a2-b845-f482d4101bbd-operator-scripts\") pod \"b599c271-18c6-41a2-b845-f482d4101bbd\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.900343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgxdv\" (UniqueName: \"kubernetes.io/projected/b599c271-18c6-41a2-b845-f482d4101bbd-kube-api-access-tgxdv\") pod \"b599c271-18c6-41a2-b845-f482d4101bbd\" (UID: \"b599c271-18c6-41a2-b845-f482d4101bbd\") " Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.901054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b599c271-18c6-41a2-b845-f482d4101bbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b599c271-18c6-41a2-b845-f482d4101bbd" (UID: "b599c271-18c6-41a2-b845-f482d4101bbd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:46:38 crc kubenswrapper[4707]: I0121 16:46:38.904389 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b599c271-18c6-41a2-b845-f482d4101bbd-kube-api-access-tgxdv" (OuterVolumeSpecName: "kube-api-access-tgxdv") pod "b599c271-18c6-41a2-b845-f482d4101bbd" (UID: "b599c271-18c6-41a2-b845-f482d4101bbd"). InnerVolumeSpecName "kube-api-access-tgxdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.002509 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b599c271-18c6-41a2-b845-f482d4101bbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.002843 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgxdv\" (UniqueName: \"kubernetes.io/projected/b599c271-18c6-41a2-b845-f482d4101bbd-kube-api-access-tgxdv\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.190941 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760acdc9-a0cc-49ca-b0f5-a626ab478c44" path="/var/lib/kubelet/pods/760acdc9-a0cc-49ca-b0f5-a626ab478c44/volumes" Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.601689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" event={"ID":"b599c271-18c6-41a2-b845-f482d4101bbd","Type":"ContainerDied","Data":"ee665d01496875504efaf74b043066629c6f66bb9a23811bb573b77e598d61b8"} Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.601741 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee665d01496875504efaf74b043066629c6f66bb9a23811bb573b77e598d61b8" Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.601711 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone35dd-account-delete-qjldk" Jan 21 16:46:39 crc kubenswrapper[4707]: I0121 16:46:39.920375 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.019360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-config-data\") pod \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.019427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j279k\" (UniqueName: \"kubernetes.io/projected/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-kube-api-access-j279k\") pod \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.019485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-credential-keys\") pod \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.019518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-fernet-keys\") pod \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.019603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-scripts\") pod \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\" (UID: \"fe08f42b-553b-4361-a0e0-ace0d6db5e8e\") " Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.024695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe08f42b-553b-4361-a0e0-ace0d6db5e8e" (UID: "fe08f42b-553b-4361-a0e0-ace0d6db5e8e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.024734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe08f42b-553b-4361-a0e0-ace0d6db5e8e" (UID: "fe08f42b-553b-4361-a0e0-ace0d6db5e8e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.024715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-scripts" (OuterVolumeSpecName: "scripts") pod "fe08f42b-553b-4361-a0e0-ace0d6db5e8e" (UID: "fe08f42b-553b-4361-a0e0-ace0d6db5e8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.025087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-kube-api-access-j279k" (OuterVolumeSpecName: "kube-api-access-j279k") pod "fe08f42b-553b-4361-a0e0-ace0d6db5e8e" (UID: "fe08f42b-553b-4361-a0e0-ace0d6db5e8e"). InnerVolumeSpecName "kube-api-access-j279k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.040987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-config-data" (OuterVolumeSpecName: "config-data") pod "fe08f42b-553b-4361-a0e0-ace0d6db5e8e" (UID: "fe08f42b-553b-4361-a0e0-ace0d6db5e8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.121374 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.121409 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j279k\" (UniqueName: \"kubernetes.io/projected/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-kube-api-access-j279k\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.121419 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.121428 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.121435 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe08f42b-553b-4361-a0e0-ace0d6db5e8e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.609709 4707 generic.go:334] "Generic (PLEG): container finished" podID="fe08f42b-553b-4361-a0e0-ace0d6db5e8e" containerID="52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a" exitCode=0 Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.609748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" event={"ID":"fe08f42b-553b-4361-a0e0-ace0d6db5e8e","Type":"ContainerDied","Data":"52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a"} Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.609765 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.609783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56bd68f756-n4bjt" event={"ID":"fe08f42b-553b-4361-a0e0-ace0d6db5e8e","Type":"ContainerDied","Data":"f93c6f55c8385282ee2c0245a5d4e94da72640825e8e35f92fed81b2baea128a"} Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.610715 4707 scope.go:117] "RemoveContainer" containerID="52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.632408 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-56bd68f756-n4bjt"] Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.637415 4707 scope.go:117] "RemoveContainer" containerID="52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a" Jan 21 16:46:40 crc kubenswrapper[4707]: E0121 16:46:40.637798 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a\": container with ID starting with 52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a not found: ID does not exist" containerID="52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.637843 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a"} err="failed to get container status \"52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a\": rpc error: code = NotFound desc = could not find container \"52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a\": container with ID starting with 52c9a48945765978d4ddd780641c43f662e0908a238dd71a4385799be2eb403a not found: ID does not exist" Jan 21 16:46:40 crc kubenswrapper[4707]: I0121 16:46:40.638406 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-56bd68f756-n4bjt"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.188752 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe08f42b-553b-4361-a0e0-ace0d6db5e8e" path="/var/lib/kubelet/pods/fe08f42b-553b-4361-a0e0-ace0d6db5e8e/volumes" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.660187 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-7h8mh"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.667384 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-7h8mh"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.672166 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone35dd-account-delete-qjldk"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.676877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.682019 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-35dd-account-create-update-4lrlg"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.686458 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone35dd-account-delete-qjldk"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.738868 4707 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pbdh6"] Jan 21 16:46:41 crc kubenswrapper[4707]: E0121 16:46:41.739155 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760acdc9-a0cc-49ca-b0f5-a626ab478c44" containerName="keystone-api" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="760acdc9-a0cc-49ca-b0f5-a626ab478c44" containerName="keystone-api" Jan 21 16:46:41 crc kubenswrapper[4707]: E0121 16:46:41.739184 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe08f42b-553b-4361-a0e0-ace0d6db5e8e" containerName="keystone-api" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe08f42b-553b-4361-a0e0-ace0d6db5e8e" containerName="keystone-api" Jan 21 16:46:41 crc kubenswrapper[4707]: E0121 16:46:41.739198 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b599c271-18c6-41a2-b845-f482d4101bbd" containerName="mariadb-account-delete" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739204 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b599c271-18c6-41a2-b845-f482d4101bbd" containerName="mariadb-account-delete" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739325 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe08f42b-553b-4361-a0e0-ace0d6db5e8e" containerName="keystone-api" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739340 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b599c271-18c6-41a2-b845-f482d4101bbd" containerName="mariadb-account-delete" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="760acdc9-a0cc-49ca-b0f5-a626ab478c44" containerName="keystone-api" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.739834 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.744009 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pbdh6"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.841797 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.842481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsljv\" (UniqueName: \"kubernetes.io/projected/fb6901b9-a016-47a5-a4f6-a8de0019bcce-kube-api-access-nsljv\") pod \"keystone-db-create-pbdh6\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.842711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb6901b9-a016-47a5-a4f6-a8de0019bcce-operator-scripts\") pod \"keystone-db-create-pbdh6\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.843604 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.847006 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.847731 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48"] Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.943895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-operator-scripts\") pod \"keystone-8fd0-account-create-update-5px48\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.943949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thv75\" (UniqueName: \"kubernetes.io/projected/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-kube-api-access-thv75\") pod \"keystone-8fd0-account-create-update-5px48\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.944003 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb6901b9-a016-47a5-a4f6-a8de0019bcce-operator-scripts\") pod \"keystone-db-create-pbdh6\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.944076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsljv\" (UniqueName: \"kubernetes.io/projected/fb6901b9-a016-47a5-a4f6-a8de0019bcce-kube-api-access-nsljv\") pod \"keystone-db-create-pbdh6\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.944722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb6901b9-a016-47a5-a4f6-a8de0019bcce-operator-scripts\") pod \"keystone-db-create-pbdh6\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:41 crc kubenswrapper[4707]: I0121 16:46:41.966363 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsljv\" (UniqueName: \"kubernetes.io/projected/fb6901b9-a016-47a5-a4f6-a8de0019bcce-kube-api-access-nsljv\") pod \"keystone-db-create-pbdh6\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.045470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-operator-scripts\") pod \"keystone-8fd0-account-create-update-5px48\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.045518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thv75\" 
(UniqueName: \"kubernetes.io/projected/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-kube-api-access-thv75\") pod \"keystone-8fd0-account-create-update-5px48\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.046181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-operator-scripts\") pod \"keystone-8fd0-account-create-update-5px48\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.054515 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.059969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thv75\" (UniqueName: \"kubernetes.io/projected/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-kube-api-access-thv75\") pod \"keystone-8fd0-account-create-update-5px48\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.160106 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.228580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pbdh6"] Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.556828 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48"] Jan 21 16:46:42 crc kubenswrapper[4707]: W0121 16:46:42.561152 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3bf33c0_6356_4e0d_87d1_7d69d8c130e4.slice/crio-a1621ff885852515d938ff9488db34aeffd904a005ea14d53e4c860ed9de6baa WatchSource:0}: Error finding container a1621ff885852515d938ff9488db34aeffd904a005ea14d53e4c860ed9de6baa: Status 404 returned error can't find the container with id a1621ff885852515d938ff9488db34aeffd904a005ea14d53e4c860ed9de6baa Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.624303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" event={"ID":"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4","Type":"ContainerStarted","Data":"a1621ff885852515d938ff9488db34aeffd904a005ea14d53e4c860ed9de6baa"} Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.625802 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb6901b9-a016-47a5-a4f6-a8de0019bcce" containerID="1f617618e792a1f6487391e1a9ab5d922dd783e0b153792125dc0fcb57c72f70" exitCode=0 Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.625845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" event={"ID":"fb6901b9-a016-47a5-a4f6-a8de0019bcce","Type":"ContainerDied","Data":"1f617618e792a1f6487391e1a9ab5d922dd783e0b153792125dc0fcb57c72f70"} Jan 21 16:46:42 crc kubenswrapper[4707]: I0121 16:46:42.625861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" 
event={"ID":"fb6901b9-a016-47a5-a4f6-a8de0019bcce","Type":"ContainerStarted","Data":"140fce679b0524356faf892246ba9c4521e8cb934bd106c5c4dfe7bebdd1212f"} Jan 21 16:46:43 crc kubenswrapper[4707]: I0121 16:46:43.190749 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d306c2-6922-4d44-8412-ad565b5b05d2" path="/var/lib/kubelet/pods/36d306c2-6922-4d44-8412-ad565b5b05d2/volumes" Jan 21 16:46:43 crc kubenswrapper[4707]: I0121 16:46:43.191261 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b599c271-18c6-41a2-b845-f482d4101bbd" path="/var/lib/kubelet/pods/b599c271-18c6-41a2-b845-f482d4101bbd/volumes" Jan 21 16:46:43 crc kubenswrapper[4707]: I0121 16:46:43.191682 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3289dec-3166-4339-b51b-aca57132faec" path="/var/lib/kubelet/pods/d3289dec-3166-4339-b51b-aca57132faec/volumes" Jan 21 16:46:43 crc kubenswrapper[4707]: I0121 16:46:43.631984 4707 generic.go:334] "Generic (PLEG): container finished" podID="d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" containerID="8114b8c8c5dd5f046a1d094af0d690b68a58bdf7494b0f3f54997c01721c829d" exitCode=0 Jan 21 16:46:43 crc kubenswrapper[4707]: I0121 16:46:43.632025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" event={"ID":"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4","Type":"ContainerDied","Data":"8114b8c8c5dd5f046a1d094af0d690b68a58bdf7494b0f3f54997c01721c829d"} Jan 21 16:46:43 crc kubenswrapper[4707]: I0121 16:46:43.892694 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.073356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb6901b9-a016-47a5-a4f6-a8de0019bcce-operator-scripts\") pod \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.073408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsljv\" (UniqueName: \"kubernetes.io/projected/fb6901b9-a016-47a5-a4f6-a8de0019bcce-kube-api-access-nsljv\") pod \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\" (UID: \"fb6901b9-a016-47a5-a4f6-a8de0019bcce\") " Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.074116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6901b9-a016-47a5-a4f6-a8de0019bcce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb6901b9-a016-47a5-a4f6-a8de0019bcce" (UID: "fb6901b9-a016-47a5-a4f6-a8de0019bcce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.078238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6901b9-a016-47a5-a4f6-a8de0019bcce-kube-api-access-nsljv" (OuterVolumeSpecName: "kube-api-access-nsljv") pod "fb6901b9-a016-47a5-a4f6-a8de0019bcce" (UID: "fb6901b9-a016-47a5-a4f6-a8de0019bcce"). InnerVolumeSpecName "kube-api-access-nsljv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.175046 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb6901b9-a016-47a5-a4f6-a8de0019bcce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.175072 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsljv\" (UniqueName: \"kubernetes.io/projected/fb6901b9-a016-47a5-a4f6-a8de0019bcce-kube-api-access-nsljv\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.638647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" event={"ID":"fb6901b9-a016-47a5-a4f6-a8de0019bcce","Type":"ContainerDied","Data":"140fce679b0524356faf892246ba9c4521e8cb934bd106c5c4dfe7bebdd1212f"} Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.638689 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140fce679b0524356faf892246ba9c4521e8cb934bd106c5c4dfe7bebdd1212f" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.638692 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pbdh6" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.861414 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.983691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thv75\" (UniqueName: \"kubernetes.io/projected/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-kube-api-access-thv75\") pod \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.983764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-operator-scripts\") pod \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\" (UID: \"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4\") " Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.984403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" (UID: "d3bf33c0-6356-4e0d-87d1-7d69d8c130e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:46:44 crc kubenswrapper[4707]: I0121 16:46:44.989843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-kube-api-access-thv75" (OuterVolumeSpecName: "kube-api-access-thv75") pod "d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" (UID: "d3bf33c0-6356-4e0d-87d1-7d69d8c130e4"). InnerVolumeSpecName "kube-api-access-thv75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:45 crc kubenswrapper[4707]: I0121 16:46:45.085801 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thv75\" (UniqueName: \"kubernetes.io/projected/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-kube-api-access-thv75\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:45 crc kubenswrapper[4707]: I0121 16:46:45.085851 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:45 crc kubenswrapper[4707]: I0121 16:46:45.646654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" event={"ID":"d3bf33c0-6356-4e0d-87d1-7d69d8c130e4","Type":"ContainerDied","Data":"a1621ff885852515d938ff9488db34aeffd904a005ea14d53e4c860ed9de6baa"} Jan 21 16:46:45 crc kubenswrapper[4707]: I0121 16:46:45.646695 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1621ff885852515d938ff9488db34aeffd904a005ea14d53e4c860ed9de6baa" Jan 21 16:46:45 crc kubenswrapper[4707]: I0121 16:46:45.646741 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.183095 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.382460 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6jvbg"] Jan 21 16:46:47 crc kubenswrapper[4707]: E0121 16:46:47.383036 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6901b9-a016-47a5-a4f6-a8de0019bcce" containerName="mariadb-database-create" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.383049 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6901b9-a016-47a5-a4f6-a8de0019bcce" containerName="mariadb-database-create" Jan 21 16:46:47 crc kubenswrapper[4707]: E0121 16:46:47.383067 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" containerName="mariadb-account-create-update" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.383075 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" containerName="mariadb-account-create-update" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.383207 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" containerName="mariadb-account-create-update" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.383229 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6901b9-a016-47a5-a4f6-a8de0019bcce" containerName="mariadb-database-create" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.383633 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.385985 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.386135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.387309 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-rqw79" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.388072 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.390143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6jvbg"] Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.518117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd8def1-dedb-492c-9122-d91ff5ab8c09-config-data\") pod \"keystone-db-sync-6jvbg\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.518153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvtm\" (UniqueName: \"kubernetes.io/projected/bbd8def1-dedb-492c-9122-d91ff5ab8c09-kube-api-access-hfvtm\") pod \"keystone-db-sync-6jvbg\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.619243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd8def1-dedb-492c-9122-d91ff5ab8c09-config-data\") pod \"keystone-db-sync-6jvbg\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.619282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvtm\" (UniqueName: \"kubernetes.io/projected/bbd8def1-dedb-492c-9122-d91ff5ab8c09-kube-api-access-hfvtm\") pod \"keystone-db-sync-6jvbg\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.628782 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd8def1-dedb-492c-9122-d91ff5ab8c09-config-data\") pod \"keystone-db-sync-6jvbg\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.633802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvtm\" (UniqueName: \"kubernetes.io/projected/bbd8def1-dedb-492c-9122-d91ff5ab8c09-kube-api-access-hfvtm\") pod \"keystone-db-sync-6jvbg\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.662728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" 
event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"3dcf06292baafcac4743dbbfb989956b7b70306caef24c5da65220fe71222042"} Jan 21 16:46:47 crc kubenswrapper[4707]: I0121 16:46:47.697266 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:48 crc kubenswrapper[4707]: I0121 16:46:48.076051 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6jvbg"] Jan 21 16:46:48 crc kubenswrapper[4707]: W0121 16:46:48.079629 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbd8def1_dedb_492c_9122_d91ff5ab8c09.slice/crio-a3eeaad1179c1b8df0e1c70728616e7291349ed85172523b10e14c73d684bc7d WatchSource:0}: Error finding container a3eeaad1179c1b8df0e1c70728616e7291349ed85172523b10e14c73d684bc7d: Status 404 returned error can't find the container with id a3eeaad1179c1b8df0e1c70728616e7291349ed85172523b10e14c73d684bc7d Jan 21 16:46:48 crc kubenswrapper[4707]: I0121 16:46:48.670845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" event={"ID":"bbd8def1-dedb-492c-9122-d91ff5ab8c09","Type":"ContainerStarted","Data":"e94820e2bea52549e5aa5c5747b2c0d37a0ea118f5be2767e7027b56ff85ff76"} Jan 21 16:46:48 crc kubenswrapper[4707]: I0121 16:46:48.671049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" event={"ID":"bbd8def1-dedb-492c-9122-d91ff5ab8c09","Type":"ContainerStarted","Data":"a3eeaad1179c1b8df0e1c70728616e7291349ed85172523b10e14c73d684bc7d"} Jan 21 16:46:49 crc kubenswrapper[4707]: I0121 16:46:49.678534 4707 generic.go:334] "Generic (PLEG): container finished" podID="bbd8def1-dedb-492c-9122-d91ff5ab8c09" containerID="e94820e2bea52549e5aa5c5747b2c0d37a0ea118f5be2767e7027b56ff85ff76" exitCode=0 Jan 21 16:46:49 crc kubenswrapper[4707]: I0121 16:46:49.678623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" event={"ID":"bbd8def1-dedb-492c-9122-d91ff5ab8c09","Type":"ContainerDied","Data":"e94820e2bea52549e5aa5c5747b2c0d37a0ea118f5be2767e7027b56ff85ff76"} Jan 21 16:46:50 crc kubenswrapper[4707]: I0121 16:46:50.906949 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:50 crc kubenswrapper[4707]: I0121 16:46:50.973262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfvtm\" (UniqueName: \"kubernetes.io/projected/bbd8def1-dedb-492c-9122-d91ff5ab8c09-kube-api-access-hfvtm\") pod \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " Jan 21 16:46:50 crc kubenswrapper[4707]: I0121 16:46:50.973406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd8def1-dedb-492c-9122-d91ff5ab8c09-config-data\") pod \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\" (UID: \"bbd8def1-dedb-492c-9122-d91ff5ab8c09\") " Jan 21 16:46:50 crc kubenswrapper[4707]: I0121 16:46:50.977905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd8def1-dedb-492c-9122-d91ff5ab8c09-kube-api-access-hfvtm" (OuterVolumeSpecName: "kube-api-access-hfvtm") pod "bbd8def1-dedb-492c-9122-d91ff5ab8c09" (UID: "bbd8def1-dedb-492c-9122-d91ff5ab8c09"). 
InnerVolumeSpecName "kube-api-access-hfvtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:50 crc kubenswrapper[4707]: I0121 16:46:50.998265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd8def1-dedb-492c-9122-d91ff5ab8c09-config-data" (OuterVolumeSpecName: "config-data") pod "bbd8def1-dedb-492c-9122-d91ff5ab8c09" (UID: "bbd8def1-dedb-492c-9122-d91ff5ab8c09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.074942 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfvtm\" (UniqueName: \"kubernetes.io/projected/bbd8def1-dedb-492c-9122-d91ff5ab8c09-kube-api-access-hfvtm\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.074970 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd8def1-dedb-492c-9122-d91ff5ab8c09-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.690634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" event={"ID":"bbd8def1-dedb-492c-9122-d91ff5ab8c09","Type":"ContainerDied","Data":"a3eeaad1179c1b8df0e1c70728616e7291349ed85172523b10e14c73d684bc7d"} Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.690657 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6jvbg" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.690672 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3eeaad1179c1b8df0e1c70728616e7291349ed85172523b10e14c73d684bc7d" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.835197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-8hrpx"] Jan 21 16:46:51 crc kubenswrapper[4707]: E0121 16:46:51.835649 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd8def1-dedb-492c-9122-d91ff5ab8c09" containerName="keystone-db-sync" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.835665 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd8def1-dedb-492c-9122-d91ff5ab8c09" containerName="keystone-db-sync" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.835779 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd8def1-dedb-492c-9122-d91ff5ab8c09" containerName="keystone-db-sync" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.836163 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.837493 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.842374 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-rqw79" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.842534 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.842645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.843618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.848355 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-8hrpx"] Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.884593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-config-data\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.884629 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-credential-keys\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.884723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46v7d\" (UniqueName: \"kubernetes.io/projected/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-kube-api-access-46v7d\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.884818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-scripts\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.884871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-fernet-keys\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.986238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46v7d\" (UniqueName: \"kubernetes.io/projected/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-kube-api-access-46v7d\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" 
Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.986289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-scripts\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.986322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-fernet-keys\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.986395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-config-data\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.986416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-credential-keys\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.990070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-scripts\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.990070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-credential-keys\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.990238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-config-data\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.990676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-fernet-keys\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:51 crc kubenswrapper[4707]: I0121 16:46:51.999616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46v7d\" (UniqueName: \"kubernetes.io/projected/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-kube-api-access-46v7d\") pod \"keystone-bootstrap-8hrpx\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:52 crc kubenswrapper[4707]: I0121 16:46:52.158436 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:52 crc kubenswrapper[4707]: I0121 16:46:52.514412 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-8hrpx"] Jan 21 16:46:52 crc kubenswrapper[4707]: I0121 16:46:52.701573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" event={"ID":"4ae881a4-a0d8-4e6f-83fe-6497f13947bd","Type":"ContainerStarted","Data":"edd2d7949d36e36e9abe826251f72563c0bc8dce4e81c40d1b691bdc336c7ad0"} Jan 21 16:46:52 crc kubenswrapper[4707]: I0121 16:46:52.701853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" event={"ID":"4ae881a4-a0d8-4e6f-83fe-6497f13947bd","Type":"ContainerStarted","Data":"93873106202c783280cb6519ddbeaaf4f46ad61063782991b4aa330bc4c381ab"} Jan 21 16:46:52 crc kubenswrapper[4707]: I0121 16:46:52.721113 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" podStartSLOduration=1.721098182 podStartE2EDuration="1.721098182s" podCreationTimestamp="2026-01-21 16:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:46:52.713962886 +0000 UTC m=+6309.895479107" watchObservedRunningTime="2026-01-21 16:46:52.721098182 +0000 UTC m=+6309.902614404" Jan 21 16:46:54 crc kubenswrapper[4707]: I0121 16:46:54.712881 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ae881a4-a0d8-4e6f-83fe-6497f13947bd" containerID="edd2d7949d36e36e9abe826251f72563c0bc8dce4e81c40d1b691bdc336c7ad0" exitCode=0 Jan 21 16:46:54 crc kubenswrapper[4707]: I0121 16:46:54.713051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" event={"ID":"4ae881a4-a0d8-4e6f-83fe-6497f13947bd","Type":"ContainerDied","Data":"edd2d7949d36e36e9abe826251f72563c0bc8dce4e81c40d1b691bdc336c7ad0"} Jan 21 16:46:55 crc kubenswrapper[4707]: I0121 16:46:55.943046 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.030549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-fernet-keys\") pod \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.030676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46v7d\" (UniqueName: \"kubernetes.io/projected/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-kube-api-access-46v7d\") pod \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.030702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-config-data\") pod \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.031368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-scripts\") pod \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.031399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-credential-keys\") pod \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\" (UID: \"4ae881a4-a0d8-4e6f-83fe-6497f13947bd\") " Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.035049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-scripts" (OuterVolumeSpecName: "scripts") pod "4ae881a4-a0d8-4e6f-83fe-6497f13947bd" (UID: "4ae881a4-a0d8-4e6f-83fe-6497f13947bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.035307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-kube-api-access-46v7d" (OuterVolumeSpecName: "kube-api-access-46v7d") pod "4ae881a4-a0d8-4e6f-83fe-6497f13947bd" (UID: "4ae881a4-a0d8-4e6f-83fe-6497f13947bd"). InnerVolumeSpecName "kube-api-access-46v7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.035390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4ae881a4-a0d8-4e6f-83fe-6497f13947bd" (UID: "4ae881a4-a0d8-4e6f-83fe-6497f13947bd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.035425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4ae881a4-a0d8-4e6f-83fe-6497f13947bd" (UID: "4ae881a4-a0d8-4e6f-83fe-6497f13947bd"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.046937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-config-data" (OuterVolumeSpecName: "config-data") pod "4ae881a4-a0d8-4e6f-83fe-6497f13947bd" (UID: "4ae881a4-a0d8-4e6f-83fe-6497f13947bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.132820 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.132850 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46v7d\" (UniqueName: \"kubernetes.io/projected/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-kube-api-access-46v7d\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.132861 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.132869 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.132877 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ae881a4-a0d8-4e6f-83fe-6497f13947bd-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.725074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" event={"ID":"4ae881a4-a0d8-4e6f-83fe-6497f13947bd","Type":"ContainerDied","Data":"93873106202c783280cb6519ddbeaaf4f46ad61063782991b4aa330bc4c381ab"} Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.725290 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93873106202c783280cb6519ddbeaaf4f46ad61063782991b4aa330bc4c381ab" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.725114 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-8hrpx" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.788480 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-z7t5t"] Jan 21 16:46:56 crc kubenswrapper[4707]: E0121 16:46:56.788721 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae881a4-a0d8-4e6f-83fe-6497f13947bd" containerName="keystone-bootstrap" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.788738 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae881a4-a0d8-4e6f-83fe-6497f13947bd" containerName="keystone-bootstrap" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.788932 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae881a4-a0d8-4e6f-83fe-6497f13947bd" containerName="keystone-bootstrap" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.789381 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.791539 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.791720 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.791874 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.791945 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-rqw79" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.797137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-z7t5t"] Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.842074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-credential-keys\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.842113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-fernet-keys\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.842141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-scripts\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.842156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-config-data\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.842212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmt5\" (UniqueName: \"kubernetes.io/projected/0df9e527-003f-4dad-a770-df1d1183256d-kube-api-access-2kmt5\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.944150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmt5\" (UniqueName: \"kubernetes.io/projected/0df9e527-003f-4dad-a770-df1d1183256d-kube-api-access-2kmt5\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.944234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-credential-keys\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.944263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-fernet-keys\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.944294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-scripts\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.944312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-config-data\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.947936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-fernet-keys\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.948500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-config-data\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.948865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-scripts\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.949131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-credential-keys\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:56 crc kubenswrapper[4707]: I0121 16:46:56.960984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmt5\" (UniqueName: \"kubernetes.io/projected/0df9e527-003f-4dad-a770-df1d1183256d-kube-api-access-2kmt5\") pod \"keystone-9bc86446b-z7t5t\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:57 crc kubenswrapper[4707]: I0121 16:46:57.103597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:57 crc kubenswrapper[4707]: I0121 16:46:57.469452 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-z7t5t"] Jan 21 16:46:57 crc kubenswrapper[4707]: I0121 16:46:57.733016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" event={"ID":"0df9e527-003f-4dad-a770-df1d1183256d","Type":"ContainerStarted","Data":"33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb"} Jan 21 16:46:57 crc kubenswrapper[4707]: I0121 16:46:57.733199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:46:57 crc kubenswrapper[4707]: I0121 16:46:57.733221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" event={"ID":"0df9e527-003f-4dad-a770-df1d1183256d","Type":"ContainerStarted","Data":"e88d0f42e3f18e15f2ff20550516dcba6a73efca5a2e606da18ae238037a1e5d"} Jan 21 16:46:57 crc kubenswrapper[4707]: I0121 16:46:57.748384 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" podStartSLOduration=1.748371096 podStartE2EDuration="1.748371096s" podCreationTimestamp="2026-01-21 16:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:46:57.743431019 +0000 UTC m=+6314.924947240" watchObservedRunningTime="2026-01-21 16:46:57.748371096 +0000 UTC m=+6314.929887319" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.453290 4707 scope.go:117] "RemoveContainer" containerID="3e20dfe84e8d67d89b7fd51cef3bf529167d697147821e4879900d6dcd6e0690" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.472180 4707 scope.go:117] "RemoveContainer" containerID="b21c8d8232a8c8d7f46afcd7a49e8f543e96ab2fb7d3dcb2dac1dbb74cd5996c" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.489986 4707 scope.go:117] "RemoveContainer" containerID="576f5674412ac2a1e6076bb665079bac82bb159d1ebf65e1043479dec274a7cc" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.508625 4707 scope.go:117] "RemoveContainer" containerID="6fcec8f0281d17e70faad25a834607175020c27b5a29321fcb377913bb15593b" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.536295 4707 scope.go:117] "RemoveContainer" containerID="0c98e18ab9f0c33cce4561ff3c73b39b992159c742c9a7864333f94da41987c1" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.548507 4707 scope.go:117] "RemoveContainer" containerID="c906f3310d7f7f3df2006d95256690fadeca173ddf94a55a08ad91ffc470d8df" Jan 21 16:47:26 crc kubenswrapper[4707]: I0121 16:47:26.566661 4707 scope.go:117] "RemoveContainer" containerID="4dd3d2e38dcf334ce48b4b06f066a14b94980e68c5b426763c0c96a9e5cd2465" Jan 21 16:47:28 crc kubenswrapper[4707]: I0121 16:47:28.368827 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.447438 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-p8klq"] Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.448284 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.453416 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-kzbd7"] Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.454336 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.458779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-p8klq"] Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.463688 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-kzbd7"] Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-credential-keys\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-fernet-keys\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-config-data\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-config-data\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-credential-keys\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-scripts\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98fkg\" (UniqueName: \"kubernetes.io/projected/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-kube-api-access-98fkg\") pod \"keystone-9bc86446b-p8klq\" (UID: 
\"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-scripts\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqpt\" (UniqueName: \"kubernetes.io/projected/e8e8f14a-90a8-45de-ab78-b57f28584ca9-kube-api-access-4wqpt\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.535920 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-fernet-keys\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-config-data\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-config-data\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-credential-keys\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637606 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-scripts\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fkg\" (UniqueName: \"kubernetes.io/projected/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-kube-api-access-98fkg\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637648 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-scripts\") pod \"keystone-9bc86446b-kzbd7\" (UID: 
\"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqpt\" (UniqueName: \"kubernetes.io/projected/e8e8f14a-90a8-45de-ab78-b57f28584ca9-kube-api-access-4wqpt\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-fernet-keys\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-credential-keys\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.637740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-fernet-keys\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.642584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-scripts\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.642602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-scripts\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.643144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-fernet-keys\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.643190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-fernet-keys\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.643211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-config-data\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc 
kubenswrapper[4707]: I0121 16:47:29.643397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-config-data\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.643896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-credential-keys\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.644007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-credential-keys\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.651018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fkg\" (UniqueName: \"kubernetes.io/projected/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-kube-api-access-98fkg\") pod \"keystone-9bc86446b-p8klq\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.651926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqpt\" (UniqueName: \"kubernetes.io/projected/e8e8f14a-90a8-45de-ab78-b57f28584ca9-kube-api-access-4wqpt\") pod \"keystone-9bc86446b-kzbd7\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.764745 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:29 crc kubenswrapper[4707]: I0121 16:47:29.771854 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.110376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-p8klq"] Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.162276 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-kzbd7"] Jan 21 16:47:30 crc kubenswrapper[4707]: W0121 16:47:30.166697 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8e8f14a_90a8_45de_ab78_b57f28584ca9.slice/crio-34256922409d2d865da40bea82de6288d84caa1e7910d49bed5967180ea22161 WatchSource:0}: Error finding container 34256922409d2d865da40bea82de6288d84caa1e7910d49bed5967180ea22161: Status 404 returned error can't find the container with id 34256922409d2d865da40bea82de6288d84caa1e7910d49bed5967180ea22161 Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.927290 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" event={"ID":"17616003-4ef1-4e31-b3de-f91dfa9ef4d6","Type":"ContainerStarted","Data":"bb0eb5f0774d70d6ef48668e048f8635158ad4e6a95e861a606e11b4b8897f2e"} Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.927484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" event={"ID":"17616003-4ef1-4e31-b3de-f91dfa9ef4d6","Type":"ContainerStarted","Data":"fca770ff374190c3ad9782eb06bbd4839ff55f19be0aa59ab9eee82755170e8a"} Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.927626 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.929105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" event={"ID":"e8e8f14a-90a8-45de-ab78-b57f28584ca9","Type":"ContainerStarted","Data":"1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92"} Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.929144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" event={"ID":"e8e8f14a-90a8-45de-ab78-b57f28584ca9","Type":"ContainerStarted","Data":"34256922409d2d865da40bea82de6288d84caa1e7910d49bed5967180ea22161"} Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.929237 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.940702 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" podStartSLOduration=1.940687381 podStartE2EDuration="1.940687381s" podCreationTimestamp="2026-01-21 16:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:47:30.937764746 +0000 UTC m=+6348.119280969" watchObservedRunningTime="2026-01-21 16:47:30.940687381 +0000 UTC m=+6348.122203603" Jan 21 16:47:30 crc kubenswrapper[4707]: I0121 16:47:30.949645 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" podStartSLOduration=1.949632772 podStartE2EDuration="1.949632772s" podCreationTimestamp="2026-01-21 16:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:47:30.94685093 +0000 UTC m=+6348.128367153" watchObservedRunningTime="2026-01-21 16:47:30.949632772 +0000 UTC m=+6348.131148994" Jan 21 16:48:01 crc kubenswrapper[4707]: I0121 16:48:01.021680 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:48:01 crc kubenswrapper[4707]: I0121 16:48:01.054432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:48:01 crc kubenswrapper[4707]: I0121 16:48:01.953792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-p8klq"] Jan 21 16:48:01 crc kubenswrapper[4707]: I0121 16:48:01.954199 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" podUID="17616003-4ef1-4e31-b3de-f91dfa9ef4d6" containerName="keystone-api" containerID="cri-o://bb0eb5f0774d70d6ef48668e048f8635158ad4e6a95e861a606e11b4b8897f2e" gracePeriod=30 Jan 21 16:48:01 crc kubenswrapper[4707]: I0121 16:48:01.958621 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-kzbd7"] Jan 21 16:48:01 crc kubenswrapper[4707]: I0121 16:48:01.958885 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" podUID="e8e8f14a-90a8-45de-ab78-b57f28584ca9" containerName="keystone-api" containerID="cri-o://1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92" gracePeriod=30 Jan 21 16:48:03 crc kubenswrapper[4707]: I0121 16:48:03.077307 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-z7t5t"] Jan 21 16:48:03 crc kubenswrapper[4707]: I0121 16:48:03.077495 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" podUID="0df9e527-003f-4dad-a770-df1d1183256d" containerName="keystone-api" containerID="cri-o://33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb" gracePeriod=30 Jan 21 16:48:05 crc kubenswrapper[4707]: E0121 16:48:05.093134 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17616003_4ef1_4e31_b3de_f91dfa9ef4d6.slice/crio-bb0eb5f0774d70d6ef48668e048f8635158ad4e6a95e861a606e11b4b8897f2e.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.148723 4707 generic.go:334] "Generic (PLEG): container finished" podID="17616003-4ef1-4e31-b3de-f91dfa9ef4d6" containerID="bb0eb5f0774d70d6ef48668e048f8635158ad4e6a95e861a606e11b4b8897f2e" exitCode=0 Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.148956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" event={"ID":"17616003-4ef1-4e31-b3de-f91dfa9ef4d6","Type":"ContainerDied","Data":"bb0eb5f0774d70d6ef48668e048f8635158ad4e6a95e861a606e11b4b8897f2e"} Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.290518 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.356048 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.473945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-config-data\") pod \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.473998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-credential-keys\") pod \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-scripts\") pod \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98fkg\" (UniqueName: \"kubernetes.io/projected/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-kube-api-access-98fkg\") pod \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wqpt\" (UniqueName: \"kubernetes.io/projected/e8e8f14a-90a8-45de-ab78-b57f28584ca9-kube-api-access-4wqpt\") pod \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474207 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-fernet-keys\") pod \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-credential-keys\") pod \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-fernet-keys\") pod \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\" (UID: \"e8e8f14a-90a8-45de-ab78-b57f28584ca9\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-config-data\") pod \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\" (UID: \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.474301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-scripts\") pod \"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\" (UID: 
\"17616003-4ef1-4e31-b3de-f91dfa9ef4d6\") " Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.478308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-kube-api-access-98fkg" (OuterVolumeSpecName: "kube-api-access-98fkg") pod "17616003-4ef1-4e31-b3de-f91dfa9ef4d6" (UID: "17616003-4ef1-4e31-b3de-f91dfa9ef4d6"). InnerVolumeSpecName "kube-api-access-98fkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.478861 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e8f14a-90a8-45de-ab78-b57f28584ca9-kube-api-access-4wqpt" (OuterVolumeSpecName: "kube-api-access-4wqpt") pod "e8e8f14a-90a8-45de-ab78-b57f28584ca9" (UID: "e8e8f14a-90a8-45de-ab78-b57f28584ca9"). InnerVolumeSpecName "kube-api-access-4wqpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.478893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-scripts" (OuterVolumeSpecName: "scripts") pod "17616003-4ef1-4e31-b3de-f91dfa9ef4d6" (UID: "17616003-4ef1-4e31-b3de-f91dfa9ef4d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.478877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "17616003-4ef1-4e31-b3de-f91dfa9ef4d6" (UID: "17616003-4ef1-4e31-b3de-f91dfa9ef4d6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.479055 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e8e8f14a-90a8-45de-ab78-b57f28584ca9" (UID: "e8e8f14a-90a8-45de-ab78-b57f28584ca9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.479085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-scripts" (OuterVolumeSpecName: "scripts") pod "e8e8f14a-90a8-45de-ab78-b57f28584ca9" (UID: "e8e8f14a-90a8-45de-ab78-b57f28584ca9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.479128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e8e8f14a-90a8-45de-ab78-b57f28584ca9" (UID: "e8e8f14a-90a8-45de-ab78-b57f28584ca9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.479476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "17616003-4ef1-4e31-b3de-f91dfa9ef4d6" (UID: "17616003-4ef1-4e31-b3de-f91dfa9ef4d6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.490614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-config-data" (OuterVolumeSpecName: "config-data") pod "17616003-4ef1-4e31-b3de-f91dfa9ef4d6" (UID: "17616003-4ef1-4e31-b3de-f91dfa9ef4d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.491333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-config-data" (OuterVolumeSpecName: "config-data") pod "e8e8f14a-90a8-45de-ab78-b57f28584ca9" (UID: "e8e8f14a-90a8-45de-ab78-b57f28584ca9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.575968 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.575994 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98fkg\" (UniqueName: \"kubernetes.io/projected/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-kube-api-access-98fkg\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576007 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wqpt\" (UniqueName: \"kubernetes.io/projected/e8e8f14a-90a8-45de-ab78-b57f28584ca9-kube-api-access-4wqpt\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576016 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576025 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576033 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576041 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576048 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17616003-4ef1-4e31-b3de-f91dfa9ef4d6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576055 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:05 crc kubenswrapper[4707]: I0121 16:48:05.576062 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8e8f14a-90a8-45de-ab78-b57f28584ca9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 
16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.157284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" event={"ID":"17616003-4ef1-4e31-b3de-f91dfa9ef4d6","Type":"ContainerDied","Data":"fca770ff374190c3ad9782eb06bbd4839ff55f19be0aa59ab9eee82755170e8a"} Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.157335 4707 scope.go:117] "RemoveContainer" containerID="bb0eb5f0774d70d6ef48668e048f8635158ad4e6a95e861a606e11b4b8897f2e" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.157343 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-p8klq" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.159573 4707 generic.go:334] "Generic (PLEG): container finished" podID="e8e8f14a-90a8-45de-ab78-b57f28584ca9" containerID="1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92" exitCode=0 Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.159596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" event={"ID":"e8e8f14a-90a8-45de-ab78-b57f28584ca9","Type":"ContainerDied","Data":"1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92"} Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.159611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" event={"ID":"e8e8f14a-90a8-45de-ab78-b57f28584ca9","Type":"ContainerDied","Data":"34256922409d2d865da40bea82de6288d84caa1e7910d49bed5967180ea22161"} Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.159613 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-kzbd7" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.226718 4707 scope.go:117] "RemoveContainer" containerID="1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.233907 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-p8klq"] Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.239675 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-p8klq"] Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.244402 4707 scope.go:117] "RemoveContainer" containerID="1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92" Jan 21 16:48:06 crc kubenswrapper[4707]: E0121 16:48:06.244754 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92\": container with ID starting with 1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92 not found: ID does not exist" containerID="1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.244780 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92"} err="failed to get container status \"1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92\": rpc error: code = NotFound desc = could not find container \"1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92\": container with ID starting with 1e0ad3e5fc79e28a4f8641f22ba11df03193d0af8c934d652d166b398e5d2c92 not found: ID does not exist" Jan 21 16:48:06 
crc kubenswrapper[4707]: I0121 16:48:06.248057 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-kzbd7"] Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.252405 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-kzbd7"] Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.386652 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.488788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-credential-keys\") pod \"0df9e527-003f-4dad-a770-df1d1183256d\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.488876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kmt5\" (UniqueName: \"kubernetes.io/projected/0df9e527-003f-4dad-a770-df1d1183256d-kube-api-access-2kmt5\") pod \"0df9e527-003f-4dad-a770-df1d1183256d\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.488949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-fernet-keys\") pod \"0df9e527-003f-4dad-a770-df1d1183256d\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.489010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-config-data\") pod \"0df9e527-003f-4dad-a770-df1d1183256d\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.489068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-scripts\") pod \"0df9e527-003f-4dad-a770-df1d1183256d\" (UID: \"0df9e527-003f-4dad-a770-df1d1183256d\") " Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.492553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0df9e527-003f-4dad-a770-df1d1183256d" (UID: "0df9e527-003f-4dad-a770-df1d1183256d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.492567 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df9e527-003f-4dad-a770-df1d1183256d-kube-api-access-2kmt5" (OuterVolumeSpecName: "kube-api-access-2kmt5") pod "0df9e527-003f-4dad-a770-df1d1183256d" (UID: "0df9e527-003f-4dad-a770-df1d1183256d"). InnerVolumeSpecName "kube-api-access-2kmt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.492890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-scripts" (OuterVolumeSpecName: "scripts") pod "0df9e527-003f-4dad-a770-df1d1183256d" (UID: "0df9e527-003f-4dad-a770-df1d1183256d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.493091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0df9e527-003f-4dad-a770-df1d1183256d" (UID: "0df9e527-003f-4dad-a770-df1d1183256d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.503506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-config-data" (OuterVolumeSpecName: "config-data") pod "0df9e527-003f-4dad-a770-df1d1183256d" (UID: "0df9e527-003f-4dad-a770-df1d1183256d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.590795 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.590837 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kmt5\" (UniqueName: \"kubernetes.io/projected/0df9e527-003f-4dad-a770-df1d1183256d-kube-api-access-2kmt5\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.590849 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.590861 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:06 crc kubenswrapper[4707]: I0121 16:48:06.590872 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0df9e527-003f-4dad-a770-df1d1183256d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.166345 4707 generic.go:334] "Generic (PLEG): container finished" podID="0df9e527-003f-4dad-a770-df1d1183256d" containerID="33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb" exitCode=0 Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.166380 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.166406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" event={"ID":"0df9e527-003f-4dad-a770-df1d1183256d","Type":"ContainerDied","Data":"33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb"} Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.166428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-9bc86446b-z7t5t" event={"ID":"0df9e527-003f-4dad-a770-df1d1183256d","Type":"ContainerDied","Data":"e88d0f42e3f18e15f2ff20550516dcba6a73efca5a2e606da18ae238037a1e5d"} Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.166453 4707 scope.go:117] "RemoveContainer" containerID="33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.182895 4707 scope.go:117] "RemoveContainer" containerID="33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb" Jan 21 16:48:07 crc kubenswrapper[4707]: E0121 16:48:07.183237 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb\": container with ID starting with 33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb not found: ID does not exist" containerID="33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.183299 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb"} err="failed to get container status \"33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb\": rpc error: code = NotFound desc = could not find container \"33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb\": container with ID starting with 33bfd58d8aeadd492fe1e0efa4c04905435abbb3e9e77ebf522cbfcf676c36fb not found: ID does not exist" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.188553 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17616003-4ef1-4e31-b3de-f91dfa9ef4d6" path="/var/lib/kubelet/pods/17616003-4ef1-4e31-b3de-f91dfa9ef4d6/volumes" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.189048 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e8f14a-90a8-45de-ab78-b57f28584ca9" path="/var/lib/kubelet/pods/e8e8f14a-90a8-45de-ab78-b57f28584ca9/volumes" Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.189414 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-z7t5t"] Jan 21 16:48:07 crc kubenswrapper[4707]: I0121 16:48:07.190615 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-9bc86446b-z7t5t"] Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.190047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-8hrpx"] Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.195335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6jvbg"] Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.199665 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-8hrpx"] Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.203219 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6jvbg"] Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269149 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt"] Jan 21 16:48:08 crc kubenswrapper[4707]: E0121 16:48:08.269382 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17616003-4ef1-4e31-b3de-f91dfa9ef4d6" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269399 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="17616003-4ef1-4e31-b3de-f91dfa9ef4d6" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: E0121 16:48:08.269408 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e8f14a-90a8-45de-ab78-b57f28584ca9" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e8f14a-90a8-45de-ab78-b57f28584ca9" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: E0121 16:48:08.269432 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df9e527-003f-4dad-a770-df1d1183256d" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269437 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df9e527-003f-4dad-a770-df1d1183256d" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269555 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e8f14a-90a8-45de-ab78-b57f28584ca9" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269570 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df9e527-003f-4dad-a770-df1d1183256d" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269577 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="17616003-4ef1-4e31-b3de-f91dfa9ef4d6" containerName="keystone-api" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.269986 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.275913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt"] Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.410771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgsvw\" (UniqueName: \"kubernetes.io/projected/b8d28070-8415-4f9c-a03e-417ae88de51a-kube-api-access-fgsvw\") pod \"keystone8fd0-account-delete-dhsrt\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.410873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d28070-8415-4f9c-a03e-417ae88de51a-operator-scripts\") pod \"keystone8fd0-account-delete-dhsrt\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.511986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d28070-8415-4f9c-a03e-417ae88de51a-operator-scripts\") pod \"keystone8fd0-account-delete-dhsrt\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.512126 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgsvw\" (UniqueName: \"kubernetes.io/projected/b8d28070-8415-4f9c-a03e-417ae88de51a-kube-api-access-fgsvw\") pod \"keystone8fd0-account-delete-dhsrt\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.512898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d28070-8415-4f9c-a03e-417ae88de51a-operator-scripts\") pod \"keystone8fd0-account-delete-dhsrt\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.525624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgsvw\" (UniqueName: \"kubernetes.io/projected/b8d28070-8415-4f9c-a03e-417ae88de51a-kube-api-access-fgsvw\") pod \"keystone8fd0-account-delete-dhsrt\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.581764 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:08 crc kubenswrapper[4707]: I0121 16:48:08.920365 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt"] Jan 21 16:48:09 crc kubenswrapper[4707]: I0121 16:48:09.181901 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8d28070-8415-4f9c-a03e-417ae88de51a" containerID="106d39f128fd03a392d03c293dcbac530cb94e4716a79fc044bb336104c2b4ac" exitCode=0 Jan 21 16:48:09 crc kubenswrapper[4707]: I0121 16:48:09.192951 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df9e527-003f-4dad-a770-df1d1183256d" path="/var/lib/kubelet/pods/0df9e527-003f-4dad-a770-df1d1183256d/volumes" Jan 21 16:48:09 crc kubenswrapper[4707]: I0121 16:48:09.193565 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae881a4-a0d8-4e6f-83fe-6497f13947bd" path="/var/lib/kubelet/pods/4ae881a4-a0d8-4e6f-83fe-6497f13947bd/volumes" Jan 21 16:48:09 crc kubenswrapper[4707]: I0121 16:48:09.194188 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd8def1-dedb-492c-9122-d91ff5ab8c09" path="/var/lib/kubelet/pods/bbd8def1-dedb-492c-9122-d91ff5ab8c09/volumes" Jan 21 16:48:09 crc kubenswrapper[4707]: I0121 16:48:09.194691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" event={"ID":"b8d28070-8415-4f9c-a03e-417ae88de51a","Type":"ContainerDied","Data":"106d39f128fd03a392d03c293dcbac530cb94e4716a79fc044bb336104c2b4ac"} Jan 21 16:48:09 crc kubenswrapper[4707]: I0121 16:48:09.194733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" event={"ID":"b8d28070-8415-4f9c-a03e-417ae88de51a","Type":"ContainerStarted","Data":"db16662706de3fa77facc8ce4a3695f489f923fc4ac073645b941151309b0103"} Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.374077 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.437786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgsvw\" (UniqueName: \"kubernetes.io/projected/b8d28070-8415-4f9c-a03e-417ae88de51a-kube-api-access-fgsvw\") pod \"b8d28070-8415-4f9c-a03e-417ae88de51a\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.437917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d28070-8415-4f9c-a03e-417ae88de51a-operator-scripts\") pod \"b8d28070-8415-4f9c-a03e-417ae88de51a\" (UID: \"b8d28070-8415-4f9c-a03e-417ae88de51a\") " Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.438526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d28070-8415-4f9c-a03e-417ae88de51a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8d28070-8415-4f9c-a03e-417ae88de51a" (UID: "b8d28070-8415-4f9c-a03e-417ae88de51a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.441427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d28070-8415-4f9c-a03e-417ae88de51a-kube-api-access-fgsvw" (OuterVolumeSpecName: "kube-api-access-fgsvw") pod "b8d28070-8415-4f9c-a03e-417ae88de51a" (UID: "b8d28070-8415-4f9c-a03e-417ae88de51a"). InnerVolumeSpecName "kube-api-access-fgsvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.539574 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d28070-8415-4f9c-a03e-417ae88de51a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:10 crc kubenswrapper[4707]: I0121 16:48:10.539600 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgsvw\" (UniqueName: \"kubernetes.io/projected/b8d28070-8415-4f9c-a03e-417ae88de51a-kube-api-access-fgsvw\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:11 crc kubenswrapper[4707]: I0121 16:48:11.192900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" event={"ID":"b8d28070-8415-4f9c-a03e-417ae88de51a","Type":"ContainerDied","Data":"db16662706de3fa77facc8ce4a3695f489f923fc4ac073645b941151309b0103"} Jan 21 16:48:11 crc kubenswrapper[4707]: I0121 16:48:11.193084 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db16662706de3fa77facc8ce4a3695f489f923fc4ac073645b941151309b0103" Jan 21 16:48:11 crc kubenswrapper[4707]: I0121 16:48:11.192933 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.268736 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pbdh6"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.272227 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pbdh6"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.280833 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.284783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.288302 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-8fd0-account-create-update-5px48"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.291610 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone8fd0-account-delete-dhsrt"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.367110 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-dccmq"] Jan 21 16:48:13 crc kubenswrapper[4707]: E0121 16:48:13.367872 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d28070-8415-4f9c-a03e-417ae88de51a" containerName="mariadb-account-delete" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.367890 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d28070-8415-4f9c-a03e-417ae88de51a" containerName="mariadb-account-delete" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.368645 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d28070-8415-4f9c-a03e-417ae88de51a" containerName="mariadb-account-delete" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.369696 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.388030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-dccmq"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.474384 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.475213 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.476151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pkk\" (UniqueName: \"kubernetes.io/projected/540f5b24-d11b-435a-ba43-8cb438702a49-kube-api-access-p4pkk\") pod \"keystone-db-create-dccmq\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.476225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540f5b24-d11b-435a-ba43-8cb438702a49-operator-scripts\") pod \"keystone-db-create-dccmq\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.476838 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.479000 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth"] Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.577696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefd72-ba92-491d-b232-bfe152c322dd-operator-scripts\") pod \"keystone-30c2-account-create-update-vbbth\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.577803 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsqxg\" (UniqueName: \"kubernetes.io/projected/0abefd72-ba92-491d-b232-bfe152c322dd-kube-api-access-xsqxg\") pod \"keystone-30c2-account-create-update-vbbth\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.577868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pkk\" (UniqueName: \"kubernetes.io/projected/540f5b24-d11b-435a-ba43-8cb438702a49-kube-api-access-p4pkk\") pod \"keystone-db-create-dccmq\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.577895 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540f5b24-d11b-435a-ba43-8cb438702a49-operator-scripts\") pod \"keystone-db-create-dccmq\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.578424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540f5b24-d11b-435a-ba43-8cb438702a49-operator-scripts\") pod \"keystone-db-create-dccmq\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.592522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pkk\" (UniqueName: \"kubernetes.io/projected/540f5b24-d11b-435a-ba43-8cb438702a49-kube-api-access-p4pkk\") pod \"keystone-db-create-dccmq\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.679536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsqxg\" (UniqueName: \"kubernetes.io/projected/0abefd72-ba92-491d-b232-bfe152c322dd-kube-api-access-xsqxg\") pod \"keystone-30c2-account-create-update-vbbth\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.679685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefd72-ba92-491d-b232-bfe152c322dd-operator-scripts\") pod \"keystone-30c2-account-create-update-vbbth\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.680271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefd72-ba92-491d-b232-bfe152c322dd-operator-scripts\") pod \"keystone-30c2-account-create-update-vbbth\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.684148 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.693204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsqxg\" (UniqueName: \"kubernetes.io/projected/0abefd72-ba92-491d-b232-bfe152c322dd-kube-api-access-xsqxg\") pod \"keystone-30c2-account-create-update-vbbth\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:13 crc kubenswrapper[4707]: I0121 16:48:13.787986 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.025894 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-dccmq"] Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.124568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth"] Jan 21 16:48:14 crc kubenswrapper[4707]: W0121 16:48:14.126440 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0abefd72_ba92_491d_b232_bfe152c322dd.slice/crio-eb8dc2df1e465d3f7131dab2ec6415bd4ca56b387273e1f737b6bfa8a614f4c7 WatchSource:0}: Error finding container eb8dc2df1e465d3f7131dab2ec6415bd4ca56b387273e1f737b6bfa8a614f4c7: Status 404 returned error can't find the container with id eb8dc2df1e465d3f7131dab2ec6415bd4ca56b387273e1f737b6bfa8a614f4c7 Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.208063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-dccmq" event={"ID":"540f5b24-d11b-435a-ba43-8cb438702a49","Type":"ContainerStarted","Data":"186e5df06818c2f4cfd4a65f824d3b10610c975fe6d9fa95c57133574cf4813b"} Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.208177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-dccmq" event={"ID":"540f5b24-d11b-435a-ba43-8cb438702a49","Type":"ContainerStarted","Data":"331174b221e92afa8d94761722ba7db66b681613e6c4f204245975aec0619cfa"} Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.209294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" event={"ID":"0abefd72-ba92-491d-b232-bfe152c322dd","Type":"ContainerStarted","Data":"4ef9673b97ec56781314ebbeaab5d8c2c42d67e22afd9a61ab07005a6924f7a1"} Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.209329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" event={"ID":"0abefd72-ba92-491d-b232-bfe152c322dd","Type":"ContainerStarted","Data":"eb8dc2df1e465d3f7131dab2ec6415bd4ca56b387273e1f737b6bfa8a614f4c7"} Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.220871 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-dccmq" podStartSLOduration=1.220857616 podStartE2EDuration="1.220857616s" podCreationTimestamp="2026-01-21 16:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:48:14.217502649 +0000 UTC m=+6391.399018870" watchObservedRunningTime="2026-01-21 16:48:14.220857616 +0000 UTC m=+6391.402373839" Jan 21 16:48:14 crc kubenswrapper[4707]: I0121 16:48:14.230453 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" podStartSLOduration=1.230445505 podStartE2EDuration="1.230445505s" podCreationTimestamp="2026-01-21 16:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:48:14.227763111 +0000 UTC m=+6391.409279333" watchObservedRunningTime="2026-01-21 16:48:14.230445505 +0000 UTC m=+6391.411961726" Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.189213 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d28070-8415-4f9c-a03e-417ae88de51a" path="/var/lib/kubelet/pods/b8d28070-8415-4f9c-a03e-417ae88de51a/volumes" Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.190465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bf33c0-6356-4e0d-87d1-7d69d8c130e4" path="/var/lib/kubelet/pods/d3bf33c0-6356-4e0d-87d1-7d69d8c130e4/volumes" Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.191026 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6901b9-a016-47a5-a4f6-a8de0019bcce" path="/var/lib/kubelet/pods/fb6901b9-a016-47a5-a4f6-a8de0019bcce/volumes" Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.225915 4707 generic.go:334] "Generic (PLEG): container finished" podID="540f5b24-d11b-435a-ba43-8cb438702a49" containerID="186e5df06818c2f4cfd4a65f824d3b10610c975fe6d9fa95c57133574cf4813b" exitCode=0 Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.225956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-dccmq" event={"ID":"540f5b24-d11b-435a-ba43-8cb438702a49","Type":"ContainerDied","Data":"186e5df06818c2f4cfd4a65f824d3b10610c975fe6d9fa95c57133574cf4813b"} Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.227657 4707 generic.go:334] "Generic (PLEG): container finished" podID="0abefd72-ba92-491d-b232-bfe152c322dd" containerID="4ef9673b97ec56781314ebbeaab5d8c2c42d67e22afd9a61ab07005a6924f7a1" exitCode=0 Jan 21 16:48:15 crc kubenswrapper[4707]: I0121 16:48:15.227694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" event={"ID":"0abefd72-ba92-491d-b232-bfe152c322dd","Type":"ContainerDied","Data":"4ef9673b97ec56781314ebbeaab5d8c2c42d67e22afd9a61ab07005a6924f7a1"} Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.486171 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.491015 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.511948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4pkk\" (UniqueName: \"kubernetes.io/projected/540f5b24-d11b-435a-ba43-8cb438702a49-kube-api-access-p4pkk\") pod \"540f5b24-d11b-435a-ba43-8cb438702a49\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.512016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsqxg\" (UniqueName: \"kubernetes.io/projected/0abefd72-ba92-491d-b232-bfe152c322dd-kube-api-access-xsqxg\") pod \"0abefd72-ba92-491d-b232-bfe152c322dd\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.512122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefd72-ba92-491d-b232-bfe152c322dd-operator-scripts\") pod \"0abefd72-ba92-491d-b232-bfe152c322dd\" (UID: \"0abefd72-ba92-491d-b232-bfe152c322dd\") " Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.512168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540f5b24-d11b-435a-ba43-8cb438702a49-operator-scripts\") pod \"540f5b24-d11b-435a-ba43-8cb438702a49\" (UID: \"540f5b24-d11b-435a-ba43-8cb438702a49\") " Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.512649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540f5b24-d11b-435a-ba43-8cb438702a49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "540f5b24-d11b-435a-ba43-8cb438702a49" (UID: "540f5b24-d11b-435a-ba43-8cb438702a49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.512667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abefd72-ba92-491d-b232-bfe152c322dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0abefd72-ba92-491d-b232-bfe152c322dd" (UID: "0abefd72-ba92-491d-b232-bfe152c322dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.516235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540f5b24-d11b-435a-ba43-8cb438702a49-kube-api-access-p4pkk" (OuterVolumeSpecName: "kube-api-access-p4pkk") pod "540f5b24-d11b-435a-ba43-8cb438702a49" (UID: "540f5b24-d11b-435a-ba43-8cb438702a49"). InnerVolumeSpecName "kube-api-access-p4pkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.516269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abefd72-ba92-491d-b232-bfe152c322dd-kube-api-access-xsqxg" (OuterVolumeSpecName: "kube-api-access-xsqxg") pod "0abefd72-ba92-491d-b232-bfe152c322dd" (UID: "0abefd72-ba92-491d-b232-bfe152c322dd"). InnerVolumeSpecName "kube-api-access-xsqxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.613649 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4pkk\" (UniqueName: \"kubernetes.io/projected/540f5b24-d11b-435a-ba43-8cb438702a49-kube-api-access-p4pkk\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.613684 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsqxg\" (UniqueName: \"kubernetes.io/projected/0abefd72-ba92-491d-b232-bfe152c322dd-kube-api-access-xsqxg\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.613697 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abefd72-ba92-491d-b232-bfe152c322dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:16 crc kubenswrapper[4707]: I0121 16:48:16.613706 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540f5b24-d11b-435a-ba43-8cb438702a49-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:17 crc kubenswrapper[4707]: I0121 16:48:17.238571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-dccmq" event={"ID":"540f5b24-d11b-435a-ba43-8cb438702a49","Type":"ContainerDied","Data":"331174b221e92afa8d94761722ba7db66b681613e6c4f204245975aec0619cfa"} Jan 21 16:48:17 crc kubenswrapper[4707]: I0121 16:48:17.238844 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331174b221e92afa8d94761722ba7db66b681613e6c4f204245975aec0619cfa" Jan 21 16:48:17 crc kubenswrapper[4707]: I0121 16:48:17.238611 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-dccmq" Jan 21 16:48:17 crc kubenswrapper[4707]: I0121 16:48:17.239917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" event={"ID":"0abefd72-ba92-491d-b232-bfe152c322dd","Type":"ContainerDied","Data":"eb8dc2df1e465d3f7131dab2ec6415bd4ca56b387273e1f737b6bfa8a614f4c7"} Jan 21 16:48:17 crc kubenswrapper[4707]: I0121 16:48:17.240002 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8dc2df1e465d3f7131dab2ec6415bd4ca56b387273e1f737b6bfa8a614f4c7" Jan 21 16:48:17 crc kubenswrapper[4707]: I0121 16:48:17.239933 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.011899 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-bbw28"] Jan 21 16:48:19 crc kubenswrapper[4707]: E0121 16:48:19.012111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abefd72-ba92-491d-b232-bfe152c322dd" containerName="mariadb-account-create-update" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.012123 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abefd72-ba92-491d-b232-bfe152c322dd" containerName="mariadb-account-create-update" Jan 21 16:48:19 crc kubenswrapper[4707]: E0121 16:48:19.012145 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540f5b24-d11b-435a-ba43-8cb438702a49" containerName="mariadb-database-create" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.012150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="540f5b24-d11b-435a-ba43-8cb438702a49" containerName="mariadb-database-create" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.012259 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="540f5b24-d11b-435a-ba43-8cb438702a49" containerName="mariadb-database-create" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.012273 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0abefd72-ba92-491d-b232-bfe152c322dd" containerName="mariadb-account-create-update" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.012633 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.013703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.013911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.014042 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-g58rf" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.014098 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.014864 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.020940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-bbw28"] Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.045134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzg4\" (UniqueName: \"kubernetes.io/projected/b31a278f-127c-452f-95b1-9384992fa9ec-kube-api-access-5rzg4\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.045196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-combined-ca-bundle\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " 
pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.045249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-config-data\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.146354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-combined-ca-bundle\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.146438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-config-data\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.146499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzg4\" (UniqueName: \"kubernetes.io/projected/b31a278f-127c-452f-95b1-9384992fa9ec-kube-api-access-5rzg4\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.150833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-combined-ca-bundle\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.152263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-config-data\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.160064 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzg4\" (UniqueName: \"kubernetes.io/projected/b31a278f-127c-452f-95b1-9384992fa9ec-kube-api-access-5rzg4\") pod \"keystone-db-sync-bbw28\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.324222 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:19 crc kubenswrapper[4707]: I0121 16:48:19.673100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-bbw28"] Jan 21 16:48:20 crc kubenswrapper[4707]: I0121 16:48:20.256423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" event={"ID":"b31a278f-127c-452f-95b1-9384992fa9ec","Type":"ContainerStarted","Data":"af4a69dd8aab43c13c0e98d349cd0f7479c96679105a5445727c3bd74d232cae"} Jan 21 16:48:20 crc kubenswrapper[4707]: I0121 16:48:20.257080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" event={"ID":"b31a278f-127c-452f-95b1-9384992fa9ec","Type":"ContainerStarted","Data":"36427a03d3fc33d9cad4d60fef2fbb623d3d96cc91d5cefba80d43972e4bb2eb"} Jan 21 16:48:21 crc kubenswrapper[4707]: I0121 16:48:21.263011 4707 generic.go:334] "Generic (PLEG): container finished" podID="b31a278f-127c-452f-95b1-9384992fa9ec" containerID="af4a69dd8aab43c13c0e98d349cd0f7479c96679105a5445727c3bd74d232cae" exitCode=0 Jan 21 16:48:21 crc kubenswrapper[4707]: I0121 16:48:21.263091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" event={"ID":"b31a278f-127c-452f-95b1-9384992fa9ec","Type":"ContainerDied","Data":"af4a69dd8aab43c13c0e98d349cd0f7479c96679105a5445727c3bd74d232cae"} Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.489198 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.589033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rzg4\" (UniqueName: \"kubernetes.io/projected/b31a278f-127c-452f-95b1-9384992fa9ec-kube-api-access-5rzg4\") pod \"b31a278f-127c-452f-95b1-9384992fa9ec\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.589077 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-combined-ca-bundle\") pod \"b31a278f-127c-452f-95b1-9384992fa9ec\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.589122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-config-data\") pod \"b31a278f-127c-452f-95b1-9384992fa9ec\" (UID: \"b31a278f-127c-452f-95b1-9384992fa9ec\") " Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.593405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a278f-127c-452f-95b1-9384992fa9ec-kube-api-access-5rzg4" (OuterVolumeSpecName: "kube-api-access-5rzg4") pod "b31a278f-127c-452f-95b1-9384992fa9ec" (UID: "b31a278f-127c-452f-95b1-9384992fa9ec"). InnerVolumeSpecName "kube-api-access-5rzg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.604304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b31a278f-127c-452f-95b1-9384992fa9ec" (UID: "b31a278f-127c-452f-95b1-9384992fa9ec"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.616497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-config-data" (OuterVolumeSpecName: "config-data") pod "b31a278f-127c-452f-95b1-9384992fa9ec" (UID: "b31a278f-127c-452f-95b1-9384992fa9ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.690649 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.690843 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rzg4\" (UniqueName: \"kubernetes.io/projected/b31a278f-127c-452f-95b1-9384992fa9ec-kube-api-access-5rzg4\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:22 crc kubenswrapper[4707]: I0121 16:48:22.690856 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31a278f-127c-452f-95b1-9384992fa9ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.275651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" event={"ID":"b31a278f-127c-452f-95b1-9384992fa9ec","Type":"ContainerDied","Data":"36427a03d3fc33d9cad4d60fef2fbb623d3d96cc91d5cefba80d43972e4bb2eb"} Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.275687 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36427a03d3fc33d9cad4d60fef2fbb623d3d96cc91d5cefba80d43972e4bb2eb" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.275696 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-bbw28" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.419045 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-gvf9n"] Jan 21 16:48:23 crc kubenswrapper[4707]: E0121 16:48:23.419312 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31a278f-127c-452f-95b1-9384992fa9ec" containerName="keystone-db-sync" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.419328 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31a278f-127c-452f-95b1-9384992fa9ec" containerName="keystone-db-sync" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.419462 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31a278f-127c-452f-95b1-9384992fa9ec" containerName="keystone-db-sync" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.419912 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.421245 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.422595 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.422618 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.422654 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.423098 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.424274 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-g58rf" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.429096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-gvf9n"] Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.501213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-combined-ca-bundle\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.501275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-credential-keys\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.501343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-scripts\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.501379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-fernet-keys\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.501410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xrz\" (UniqueName: \"kubernetes.io/projected/953b1bd4-32d7-4350-8929-1e970ded2f1c-kube-api-access-96xrz\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.501428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-config-data\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.601997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-credential-keys\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.602050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-scripts\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.602079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-fernet-keys\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.602107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96xrz\" (UniqueName: \"kubernetes.io/projected/953b1bd4-32d7-4350-8929-1e970ded2f1c-kube-api-access-96xrz\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.602135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-config-data\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.602171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-combined-ca-bundle\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.605495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-scripts\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.605573 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-config-data\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.605733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-credential-keys\") pod 
\"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.605761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-fernet-keys\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.605861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-combined-ca-bundle\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.615856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xrz\" (UniqueName: \"kubernetes.io/projected/953b1bd4-32d7-4350-8929-1e970ded2f1c-kube-api-access-96xrz\") pod \"keystone-bootstrap-gvf9n\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:23 crc kubenswrapper[4707]: I0121 16:48:23.732111 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:24 crc kubenswrapper[4707]: I0121 16:48:24.175350 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-gvf9n"] Jan 21 16:48:24 crc kubenswrapper[4707]: I0121 16:48:24.281662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" event={"ID":"953b1bd4-32d7-4350-8929-1e970ded2f1c","Type":"ContainerStarted","Data":"3f2faa48c47cdc577df29341b657c2ed9bfe053027a11493609233bc6ced6061"} Jan 21 16:48:24 crc kubenswrapper[4707]: I0121 16:48:24.281701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" event={"ID":"953b1bd4-32d7-4350-8929-1e970ded2f1c","Type":"ContainerStarted","Data":"8cacd1e7de474632c08817ab6645550a99418367c3009fa11be1799f3ecf7241"} Jan 21 16:48:24 crc kubenswrapper[4707]: I0121 16:48:24.295162 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" podStartSLOduration=1.295149216 podStartE2EDuration="1.295149216s" podCreationTimestamp="2026-01-21 16:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:48:24.292305418 +0000 UTC m=+6401.473821640" watchObservedRunningTime="2026-01-21 16:48:24.295149216 +0000 UTC m=+6401.476665437" Jan 21 16:48:26 crc kubenswrapper[4707]: I0121 16:48:26.294280 4707 generic.go:334] "Generic (PLEG): container finished" podID="953b1bd4-32d7-4350-8929-1e970ded2f1c" containerID="3f2faa48c47cdc577df29341b657c2ed9bfe053027a11493609233bc6ced6061" exitCode=0 Jan 21 16:48:26 crc kubenswrapper[4707]: I0121 16:48:26.294358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" event={"ID":"953b1bd4-32d7-4350-8929-1e970ded2f1c","Type":"ContainerDied","Data":"3f2faa48c47cdc577df29341b657c2ed9bfe053027a11493609233bc6ced6061"} Jan 21 16:48:26 crc kubenswrapper[4707]: I0121 16:48:26.654485 4707 scope.go:117] "RemoveContainer" 
containerID="100097a9e7cbb71c0593f1c17c892a7aa65a9a95d3d87be08e2869314b4723b0" Jan 21 16:48:26 crc kubenswrapper[4707]: I0121 16:48:26.672974 4707 scope.go:117] "RemoveContainer" containerID="acbfc31705ae4575297f345c2e4e44e116b0e4a25d1c1be31f5f0559f0265769" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.519944 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.559395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-config-data\") pod \"953b1bd4-32d7-4350-8929-1e970ded2f1c\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.559550 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96xrz\" (UniqueName: \"kubernetes.io/projected/953b1bd4-32d7-4350-8929-1e970ded2f1c-kube-api-access-96xrz\") pod \"953b1bd4-32d7-4350-8929-1e970ded2f1c\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.559691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-fernet-keys\") pod \"953b1bd4-32d7-4350-8929-1e970ded2f1c\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.560179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-combined-ca-bundle\") pod \"953b1bd4-32d7-4350-8929-1e970ded2f1c\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.560305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-credential-keys\") pod \"953b1bd4-32d7-4350-8929-1e970ded2f1c\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.560408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-scripts\") pod \"953b1bd4-32d7-4350-8929-1e970ded2f1c\" (UID: \"953b1bd4-32d7-4350-8929-1e970ded2f1c\") " Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.563942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "953b1bd4-32d7-4350-8929-1e970ded2f1c" (UID: "953b1bd4-32d7-4350-8929-1e970ded2f1c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.563996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "953b1bd4-32d7-4350-8929-1e970ded2f1c" (UID: "953b1bd4-32d7-4350-8929-1e970ded2f1c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.564010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-scripts" (OuterVolumeSpecName: "scripts") pod "953b1bd4-32d7-4350-8929-1e970ded2f1c" (UID: "953b1bd4-32d7-4350-8929-1e970ded2f1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.564200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953b1bd4-32d7-4350-8929-1e970ded2f1c-kube-api-access-96xrz" (OuterVolumeSpecName: "kube-api-access-96xrz") pod "953b1bd4-32d7-4350-8929-1e970ded2f1c" (UID: "953b1bd4-32d7-4350-8929-1e970ded2f1c"). InnerVolumeSpecName "kube-api-access-96xrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.574966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-config-data" (OuterVolumeSpecName: "config-data") pod "953b1bd4-32d7-4350-8929-1e970ded2f1c" (UID: "953b1bd4-32d7-4350-8929-1e970ded2f1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.575045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "953b1bd4-32d7-4350-8929-1e970ded2f1c" (UID: "953b1bd4-32d7-4350-8929-1e970ded2f1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.662303 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96xrz\" (UniqueName: \"kubernetes.io/projected/953b1bd4-32d7-4350-8929-1e970ded2f1c-kube-api-access-96xrz\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.662333 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.662344 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.662352 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.662361 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:27 crc kubenswrapper[4707]: I0121 16:48:27.662369 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b1bd4-32d7-4350-8929-1e970ded2f1c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.306191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" event={"ID":"953b1bd4-32d7-4350-8929-1e970ded2f1c","Type":"ContainerDied","Data":"8cacd1e7de474632c08817ab6645550a99418367c3009fa11be1799f3ecf7241"} Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.306227 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cacd1e7de474632c08817ab6645550a99418367c3009fa11be1799f3ecf7241" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.306234 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-gvf9n" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.360585 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5d56497597-nwrct"] Jan 21 16:48:28 crc kubenswrapper[4707]: E0121 16:48:28.360896 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953b1bd4-32d7-4350-8929-1e970ded2f1c" containerName="keystone-bootstrap" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.360917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="953b1bd4-32d7-4350-8929-1e970ded2f1c" containerName="keystone-bootstrap" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.361030 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="953b1bd4-32d7-4350-8929-1e970ded2f1c" containerName="keystone-bootstrap" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.361478 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.362840 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.363527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.363653 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.363852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.364438 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.364733 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.364968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-g58rf" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.367447 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5d56497597-nwrct"] Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-public-tls-certs\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-fernet-keys\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkb7d\" (UniqueName: \"kubernetes.io/projected/64521cf7-f803-4b2a-bac6-8af702cb0e1f-kube-api-access-lkb7d\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-credential-keys\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-scripts\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-combined-ca-bundle\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-internal-tls-certs\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.478890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-config-data\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-credential-keys\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-scripts\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 
16:48:28.579540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-combined-ca-bundle\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-internal-tls-certs\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-config-data\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-public-tls-certs\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-fernet-keys\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.579651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkb7d\" (UniqueName: \"kubernetes.io/projected/64521cf7-f803-4b2a-bac6-8af702cb0e1f-kube-api-access-lkb7d\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.583736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-combined-ca-bundle\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.583775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-credential-keys\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.584074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-internal-tls-certs\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.584232 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-fernet-keys\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.584259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-public-tls-certs\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.584439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-scripts\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.584910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-config-data\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.592065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkb7d\" (UniqueName: \"kubernetes.io/projected/64521cf7-f803-4b2a-bac6-8af702cb0e1f-kube-api-access-lkb7d\") pod \"keystone-5d56497597-nwrct\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:28 crc kubenswrapper[4707]: I0121 16:48:28.675736 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:29 crc kubenswrapper[4707]: I0121 16:48:29.034185 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5d56497597-nwrct"] Jan 21 16:48:29 crc kubenswrapper[4707]: I0121 16:48:29.313923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" event={"ID":"64521cf7-f803-4b2a-bac6-8af702cb0e1f","Type":"ContainerStarted","Data":"9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac"} Jan 21 16:48:29 crc kubenswrapper[4707]: I0121 16:48:29.314319 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:48:29 crc kubenswrapper[4707]: I0121 16:48:29.314331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" event={"ID":"64521cf7-f803-4b2a-bac6-8af702cb0e1f","Type":"ContainerStarted","Data":"8d500824d78931df5a1c6d081831e039aa6546ffda6d2c7ff8947132d8eca584"} Jan 21 16:48:29 crc kubenswrapper[4707]: I0121 16:48:29.343564 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" podStartSLOduration=1.343519831 podStartE2EDuration="1.343519831s" podCreationTimestamp="2026-01-21 16:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:48:29.33511618 +0000 UTC m=+6406.516632402" watchObservedRunningTime="2026-01-21 16:48:29.343519831 +0000 UTC m=+6406.525036054" Jan 21 16:48:59 crc kubenswrapper[4707]: I0121 16:48:59.911074 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.012402 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-gvf9n"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.017605 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-gvf9n"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.025317 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-bbw28"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.029902 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-bbw28"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.034452 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5d56497597-nwrct"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.034620 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" podUID="64521cf7-f803-4b2a-bac6-8af702cb0e1f" containerName="keystone-api" containerID="cri-o://9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac" gracePeriod=30 Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.091211 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone30c2-account-delete-p4qdw"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.091954 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.096724 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone30c2-account-delete-p4qdw"] Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.188909 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953b1bd4-32d7-4350-8929-1e970ded2f1c" path="/var/lib/kubelet/pods/953b1bd4-32d7-4350-8929-1e970ded2f1c/volumes" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.189466 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31a278f-127c-452f-95b1-9384992fa9ec" path="/var/lib/kubelet/pods/b31a278f-127c-452f-95b1-9384992fa9ec/volumes" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.215017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-operator-scripts\") pod \"keystone30c2-account-delete-p4qdw\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.215097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92qvl\" (UniqueName: \"kubernetes.io/projected/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-kube-api-access-92qvl\") pod \"keystone30c2-account-delete-p4qdw\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.316090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92qvl\" (UniqueName: \"kubernetes.io/projected/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-kube-api-access-92qvl\") pod \"keystone30c2-account-delete-p4qdw\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.316210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-operator-scripts\") pod \"keystone30c2-account-delete-p4qdw\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.316883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-operator-scripts\") pod \"keystone30c2-account-delete-p4qdw\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.331083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92qvl\" (UniqueName: \"kubernetes.io/projected/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-kube-api-access-92qvl\") pod \"keystone30c2-account-delete-p4qdw\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.405095 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:01 crc kubenswrapper[4707]: I0121 16:49:01.739759 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone30c2-account-delete-p4qdw"] Jan 21 16:49:02 crc kubenswrapper[4707]: I0121 16:49:02.506268 4707 generic.go:334] "Generic (PLEG): container finished" podID="c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" containerID="1e212fe1a7fca1161a4a14cf842a06a809326463b8d491a3f0065103ce125e05" exitCode=0 Jan 21 16:49:02 crc kubenswrapper[4707]: I0121 16:49:02.506360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" event={"ID":"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36","Type":"ContainerDied","Data":"1e212fe1a7fca1161a4a14cf842a06a809326463b8d491a3f0065103ce125e05"} Jan 21 16:49:02 crc kubenswrapper[4707]: I0121 16:49:02.507248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" event={"ID":"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36","Type":"ContainerStarted","Data":"a51e0418d4ad7f256b3bb3abd6189a5233c44bb9f6f76215989f90729625862f"} Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.714710 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.844608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-operator-scripts\") pod \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.844728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92qvl\" (UniqueName: \"kubernetes.io/projected/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-kube-api-access-92qvl\") pod \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\" (UID: \"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36\") " Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.844998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" (UID: "c52792c6-e435-4f6c-bb7e-e0aab3f4ff36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.849257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-kube-api-access-92qvl" (OuterVolumeSpecName: "kube-api-access-92qvl") pod "c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" (UID: "c52792c6-e435-4f6c-bb7e-e0aab3f4ff36"). InnerVolumeSpecName "kube-api-access-92qvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.946469 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92qvl\" (UniqueName: \"kubernetes.io/projected/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-kube-api-access-92qvl\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:03 crc kubenswrapper[4707]: I0121 16:49:03.946494 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.349248 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.452925 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-fernet-keys\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.452984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkb7d\" (UniqueName: \"kubernetes.io/projected/64521cf7-f803-4b2a-bac6-8af702cb0e1f-kube-api-access-lkb7d\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.453021 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-public-tls-certs\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.453063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-internal-tls-certs\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.453135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-combined-ca-bundle\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.453166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-scripts\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.453180 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-config-data\") pod \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.453223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-credential-keys\") pod 
\"64521cf7-f803-4b2a-bac6-8af702cb0e1f\" (UID: \"64521cf7-f803-4b2a-bac6-8af702cb0e1f\") " Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.456042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64521cf7-f803-4b2a-bac6-8af702cb0e1f-kube-api-access-lkb7d" (OuterVolumeSpecName: "kube-api-access-lkb7d") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "kube-api-access-lkb7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.456275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.456051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.456456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-scripts" (OuterVolumeSpecName: "scripts") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.467398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-config-data" (OuterVolumeSpecName: "config-data") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.467720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.477686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.478009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64521cf7-f803-4b2a-bac6-8af702cb0e1f" (UID: "64521cf7-f803-4b2a-bac6-8af702cb0e1f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.517937 4707 generic.go:334] "Generic (PLEG): container finished" podID="64521cf7-f803-4b2a-bac6-8af702cb0e1f" containerID="9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac" exitCode=0 Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.517990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" event={"ID":"64521cf7-f803-4b2a-bac6-8af702cb0e1f","Type":"ContainerDied","Data":"9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac"} Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.518015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" event={"ID":"64521cf7-f803-4b2a-bac6-8af702cb0e1f","Type":"ContainerDied","Data":"8d500824d78931df5a1c6d081831e039aa6546ffda6d2c7ff8947132d8eca584"} Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.518029 4707 scope.go:117] "RemoveContainer" containerID="9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.517991 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5d56497597-nwrct" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.519460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" event={"ID":"c52792c6-e435-4f6c-bb7e-e0aab3f4ff36","Type":"ContainerDied","Data":"a51e0418d4ad7f256b3bb3abd6189a5233c44bb9f6f76215989f90729625862f"} Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.519487 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51e0418d4ad7f256b3bb3abd6189a5233c44bb9f6f76215989f90729625862f" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.519532 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone30c2-account-delete-p4qdw" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.535749 4707 scope.go:117] "RemoveContainer" containerID="9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac" Jan 21 16:49:04 crc kubenswrapper[4707]: E0121 16:49:04.536125 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac\": container with ID starting with 9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac not found: ID does not exist" containerID="9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.536153 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac"} err="failed to get container status \"9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac\": rpc error: code = NotFound desc = could not find container \"9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac\": container with ID starting with 9da8e8a8aac55b8bc1fa6c5f8e83f13a19c6dd4001b1307b6e56d9f78778ccac not found: ID does not exist" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.542275 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5d56497597-nwrct"] Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.545986 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5d56497597-nwrct"] Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554717 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkb7d\" (UniqueName: \"kubernetes.io/projected/64521cf7-f803-4b2a-bac6-8af702cb0e1f-kube-api-access-lkb7d\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554755 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554769 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554780 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554788 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554796 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554832 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 
16:49:04 crc kubenswrapper[4707]: I0121 16:49:04.554841 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/64521cf7-f803-4b2a-bac6-8af702cb0e1f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:05 crc kubenswrapper[4707]: I0121 16:49:05.188997 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64521cf7-f803-4b2a-bac6-8af702cb0e1f" path="/var/lib/kubelet/pods/64521cf7-f803-4b2a-bac6-8af702cb0e1f/volumes" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.107049 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-dccmq"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.113375 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-dccmq"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.117445 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.121470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone30c2-account-delete-p4qdw"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.125681 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-30c2-account-create-update-vbbth"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.129666 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone30c2-account-delete-p4qdw"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.200769 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6cxp2"] Jan 21 16:49:06 crc kubenswrapper[4707]: E0121 16:49:06.201007 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" containerName="mariadb-account-delete" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.201017 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" containerName="mariadb-account-delete" Jan 21 16:49:06 crc kubenswrapper[4707]: E0121 16:49:06.201035 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64521cf7-f803-4b2a-bac6-8af702cb0e1f" containerName="keystone-api" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.201041 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="64521cf7-f803-4b2a-bac6-8af702cb0e1f" containerName="keystone-api" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.201159 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" containerName="mariadb-account-delete" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.201174 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="64521cf7-f803-4b2a-bac6-8af702cb0e1f" containerName="keystone-api" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.201559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.208111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6cxp2"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.274518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9lct\" (UniqueName: \"kubernetes.io/projected/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-kube-api-access-m9lct\") pod \"keystone-db-create-6cxp2\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.274925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-operator-scripts\") pod \"keystone-db-create-6cxp2\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.306843 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.307598 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.309906 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.311116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.376216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d9f595-4581-4fce-9c6e-504d5947291d-operator-scripts\") pod \"keystone-e19f-account-create-update-k8td5\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.376265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9lct\" (UniqueName: \"kubernetes.io/projected/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-kube-api-access-m9lct\") pod \"keystone-db-create-6cxp2\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.376285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knw4z\" (UniqueName: \"kubernetes.io/projected/b4d9f595-4581-4fce-9c6e-504d5947291d-kube-api-access-knw4z\") pod \"keystone-e19f-account-create-update-k8td5\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.376304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-operator-scripts\") pod \"keystone-db-create-6cxp2\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " pod="keystone-kuttl-tests/keystone-db-create-6cxp2" 
Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.377005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-operator-scripts\") pod \"keystone-db-create-6cxp2\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.391001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9lct\" (UniqueName: \"kubernetes.io/projected/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-kube-api-access-m9lct\") pod \"keystone-db-create-6cxp2\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.477360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d9f595-4581-4fce-9c6e-504d5947291d-operator-scripts\") pod \"keystone-e19f-account-create-update-k8td5\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.477405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knw4z\" (UniqueName: \"kubernetes.io/projected/b4d9f595-4581-4fce-9c6e-504d5947291d-kube-api-access-knw4z\") pod \"keystone-e19f-account-create-update-k8td5\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.478095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d9f595-4581-4fce-9c6e-504d5947291d-operator-scripts\") pod \"keystone-e19f-account-create-update-k8td5\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.491155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knw4z\" (UniqueName: \"kubernetes.io/projected/b4d9f595-4581-4fce-9c6e-504d5947291d-kube-api-access-knw4z\") pod \"keystone-e19f-account-create-update-k8td5\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.514837 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.619499 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.859049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6cxp2"] Jan 21 16:49:06 crc kubenswrapper[4707]: I0121 16:49:06.971248 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5"] Jan 21 16:49:06 crc kubenswrapper[4707]: W0121 16:49:06.972418 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4d9f595_4581_4fce_9c6e_504d5947291d.slice/crio-66c0ea8bfbd0ed59ed2953d7ce9ffc26e7978108272fdb5c64249f1cc30e4259 WatchSource:0}: Error finding container 66c0ea8bfbd0ed59ed2953d7ce9ffc26e7978108272fdb5c64249f1cc30e4259: Status 404 returned error can't find the container with id 66c0ea8bfbd0ed59ed2953d7ce9ffc26e7978108272fdb5c64249f1cc30e4259 Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.188301 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abefd72-ba92-491d-b232-bfe152c322dd" path="/var/lib/kubelet/pods/0abefd72-ba92-491d-b232-bfe152c322dd/volumes" Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.188949 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540f5b24-d11b-435a-ba43-8cb438702a49" path="/var/lib/kubelet/pods/540f5b24-d11b-435a-ba43-8cb438702a49/volumes" Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.189388 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52792c6-e435-4f6c-bb7e-e0aab3f4ff36" path="/var/lib/kubelet/pods/c52792c6-e435-4f6c-bb7e-e0aab3f4ff36/volumes" Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.546901 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4d9f595-4581-4fce-9c6e-504d5947291d" containerID="0f7bef4c4ef0f200ad7fc53fa35e3b69caa6fe28ebc38e14c973498b08efc238" exitCode=0 Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.546963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" event={"ID":"b4d9f595-4581-4fce-9c6e-504d5947291d","Type":"ContainerDied","Data":"0f7bef4c4ef0f200ad7fc53fa35e3b69caa6fe28ebc38e14c973498b08efc238"} Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.546988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" event={"ID":"b4d9f595-4581-4fce-9c6e-504d5947291d","Type":"ContainerStarted","Data":"66c0ea8bfbd0ed59ed2953d7ce9ffc26e7978108272fdb5c64249f1cc30e4259"} Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.548643 4707 generic.go:334] "Generic (PLEG): container finished" podID="0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" containerID="c7d93d7e4a6575a30558d81e0c158cf1771266c6b875b91e5fcf5c9e597294c3" exitCode=0 Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.548685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" event={"ID":"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8","Type":"ContainerDied","Data":"c7d93d7e4a6575a30558d81e0c158cf1771266c6b875b91e5fcf5c9e597294c3"} Jan 21 16:49:07 crc kubenswrapper[4707]: I0121 16:49:07.548718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" 
event={"ID":"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8","Type":"ContainerStarted","Data":"210b75290f51e8a3aa26c3667c138ae0bc3267e7c08a79a3284c9c58a3a1562a"} Jan 21 16:49:08 crc kubenswrapper[4707]: I0121 16:49:08.818337 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:08 crc kubenswrapper[4707]: I0121 16:49:08.859948 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:08 crc kubenswrapper[4707]: I0121 16:49:08.907129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-operator-scripts\") pod \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " Jan 21 16:49:08 crc kubenswrapper[4707]: I0121 16:49:08.907187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9lct\" (UniqueName: \"kubernetes.io/projected/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-kube-api-access-m9lct\") pod \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\" (UID: \"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8\") " Jan 21 16:49:08 crc kubenswrapper[4707]: I0121 16:49:08.907846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" (UID: "0652d49d-29aa-46fd-b5e1-0d08db4c0ae8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:49:08 crc kubenswrapper[4707]: I0121 16:49:08.912091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-kube-api-access-m9lct" (OuterVolumeSpecName: "kube-api-access-m9lct") pod "0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" (UID: "0652d49d-29aa-46fd-b5e1-0d08db4c0ae8"). InnerVolumeSpecName "kube-api-access-m9lct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.008336 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d9f595-4581-4fce-9c6e-504d5947291d-operator-scripts\") pod \"b4d9f595-4581-4fce-9c6e-504d5947291d\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.008431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knw4z\" (UniqueName: \"kubernetes.io/projected/b4d9f595-4581-4fce-9c6e-504d5947291d-kube-api-access-knw4z\") pod \"b4d9f595-4581-4fce-9c6e-504d5947291d\" (UID: \"b4d9f595-4581-4fce-9c6e-504d5947291d\") " Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.008702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d9f595-4581-4fce-9c6e-504d5947291d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4d9f595-4581-4fce-9c6e-504d5947291d" (UID: "b4d9f595-4581-4fce-9c6e-504d5947291d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.008768 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9lct\" (UniqueName: \"kubernetes.io/projected/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-kube-api-access-m9lct\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.008783 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.011136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d9f595-4581-4fce-9c6e-504d5947291d-kube-api-access-knw4z" (OuterVolumeSpecName: "kube-api-access-knw4z") pod "b4d9f595-4581-4fce-9c6e-504d5947291d" (UID: "b4d9f595-4581-4fce-9c6e-504d5947291d"). InnerVolumeSpecName "kube-api-access-knw4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.110488 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knw4z\" (UniqueName: \"kubernetes.io/projected/b4d9f595-4581-4fce-9c6e-504d5947291d-kube-api-access-knw4z\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.110517 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d9f595-4581-4fce-9c6e-504d5947291d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.601521 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.601517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5" event={"ID":"b4d9f595-4581-4fce-9c6e-504d5947291d","Type":"ContainerDied","Data":"66c0ea8bfbd0ed59ed2953d7ce9ffc26e7978108272fdb5c64249f1cc30e4259"} Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.601604 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66c0ea8bfbd0ed59ed2953d7ce9ffc26e7978108272fdb5c64249f1cc30e4259" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.603407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" event={"ID":"0652d49d-29aa-46fd-b5e1-0d08db4c0ae8","Type":"ContainerDied","Data":"210b75290f51e8a3aa26c3667c138ae0bc3267e7c08a79a3284c9c58a3a1562a"} Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.603441 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210b75290f51e8a3aa26c3667c138ae0bc3267e7c08a79a3284c9c58a3a1562a" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.603677 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6cxp2" Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.946201 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:49:09 crc kubenswrapper[4707]: I0121 16:49:09.946255 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.843084 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-8qlht"] Jan 21 16:49:11 crc kubenswrapper[4707]: E0121 16:49:11.843468 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" containerName="mariadb-database-create" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.843479 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" containerName="mariadb-database-create" Jan 21 16:49:11 crc kubenswrapper[4707]: E0121 16:49:11.843507 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d9f595-4581-4fce-9c6e-504d5947291d" containerName="mariadb-account-create-update" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.843513 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d9f595-4581-4fce-9c6e-504d5947291d" containerName="mariadb-account-create-update" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.843625 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d9f595-4581-4fce-9c6e-504d5947291d" containerName="mariadb-account-create-update" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.843642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" containerName="mariadb-database-create" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.844097 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.845433 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.845771 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-g6b9s" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.845921 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.846043 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.849158 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-8qlht"] Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.943837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f872eba-7e32-489e-b2e0-ddc51d5c748d-config-data\") pod \"keystone-db-sync-8qlht\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:11 crc kubenswrapper[4707]: I0121 16:49:11.943891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4588c\" (UniqueName: \"kubernetes.io/projected/1f872eba-7e32-489e-b2e0-ddc51d5c748d-kube-api-access-4588c\") pod \"keystone-db-sync-8qlht\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.044988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f872eba-7e32-489e-b2e0-ddc51d5c748d-config-data\") pod \"keystone-db-sync-8qlht\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.045054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4588c\" (UniqueName: \"kubernetes.io/projected/1f872eba-7e32-489e-b2e0-ddc51d5c748d-kube-api-access-4588c\") pod \"keystone-db-sync-8qlht\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.050282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f872eba-7e32-489e-b2e0-ddc51d5c748d-config-data\") pod \"keystone-db-sync-8qlht\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.057918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4588c\" (UniqueName: \"kubernetes.io/projected/1f872eba-7e32-489e-b2e0-ddc51d5c748d-kube-api-access-4588c\") pod \"keystone-db-sync-8qlht\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.161690 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.507790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-8qlht"] Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.625878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" event={"ID":"1f872eba-7e32-489e-b2e0-ddc51d5c748d","Type":"ContainerStarted","Data":"d8b5087c02819680e390ad13aa383cdf80c741fed46c4de6a10f047d53d26363"} Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.626057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" event={"ID":"1f872eba-7e32-489e-b2e0-ddc51d5c748d","Type":"ContainerStarted","Data":"04d0023ff4fddf1c6600d3dd11a4a59f61854f8eaa40cc4ce98f5816d9dfd743"} Jan 21 16:49:12 crc kubenswrapper[4707]: I0121 16:49:12.638087 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" podStartSLOduration=1.638072529 podStartE2EDuration="1.638072529s" podCreationTimestamp="2026-01-21 16:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:49:12.63610005 +0000 UTC m=+6449.817616272" watchObservedRunningTime="2026-01-21 16:49:12.638072529 +0000 UTC m=+6449.819588750" Jan 21 16:49:14 crc kubenswrapper[4707]: I0121 16:49:14.639675 4707 generic.go:334] "Generic (PLEG): container finished" podID="1f872eba-7e32-489e-b2e0-ddc51d5c748d" containerID="d8b5087c02819680e390ad13aa383cdf80c741fed46c4de6a10f047d53d26363" exitCode=0 Jan 21 16:49:14 crc kubenswrapper[4707]: I0121 16:49:14.639722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" event={"ID":"1f872eba-7e32-489e-b2e0-ddc51d5c748d","Type":"ContainerDied","Data":"d8b5087c02819680e390ad13aa383cdf80c741fed46c4de6a10f047d53d26363"} Jan 21 16:49:15 crc kubenswrapper[4707]: I0121 16:49:15.867141 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:15 crc kubenswrapper[4707]: I0121 16:49:15.996068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4588c\" (UniqueName: \"kubernetes.io/projected/1f872eba-7e32-489e-b2e0-ddc51d5c748d-kube-api-access-4588c\") pod \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " Jan 21 16:49:15 crc kubenswrapper[4707]: I0121 16:49:15.996250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f872eba-7e32-489e-b2e0-ddc51d5c748d-config-data\") pod \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\" (UID: \"1f872eba-7e32-489e-b2e0-ddc51d5c748d\") " Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.000208 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f872eba-7e32-489e-b2e0-ddc51d5c748d-kube-api-access-4588c" (OuterVolumeSpecName: "kube-api-access-4588c") pod "1f872eba-7e32-489e-b2e0-ddc51d5c748d" (UID: "1f872eba-7e32-489e-b2e0-ddc51d5c748d"). InnerVolumeSpecName "kube-api-access-4588c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.023461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f872eba-7e32-489e-b2e0-ddc51d5c748d-config-data" (OuterVolumeSpecName: "config-data") pod "1f872eba-7e32-489e-b2e0-ddc51d5c748d" (UID: "1f872eba-7e32-489e-b2e0-ddc51d5c748d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.097561 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f872eba-7e32-489e-b2e0-ddc51d5c748d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.097592 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4588c\" (UniqueName: \"kubernetes.io/projected/1f872eba-7e32-489e-b2e0-ddc51d5c748d-kube-api-access-4588c\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.652499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" event={"ID":"1f872eba-7e32-489e-b2e0-ddc51d5c748d","Type":"ContainerDied","Data":"04d0023ff4fddf1c6600d3dd11a4a59f61854f8eaa40cc4ce98f5816d9dfd743"} Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.652534 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d0023ff4fddf1c6600d3dd11a4a59f61854f8eaa40cc4ce98f5816d9dfd743" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.652538 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-8qlht" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.811803 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-c9xfs"] Jan 21 16:49:16 crc kubenswrapper[4707]: E0121 16:49:16.812066 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f872eba-7e32-489e-b2e0-ddc51d5c748d" containerName="keystone-db-sync" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.812083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f872eba-7e32-489e-b2e0-ddc51d5c748d" containerName="keystone-db-sync" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.812236 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f872eba-7e32-489e-b2e0-ddc51d5c748d" containerName="keystone-db-sync" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.812615 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.814774 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.814802 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-g6b9s" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.814802 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.815898 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.821666 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.822978 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-c9xfs"] Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.907322 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-scripts\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.907411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-fernet-keys\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.907453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-config-data\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.907495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2t4\" (UniqueName: \"kubernetes.io/projected/e187bfc7-6f17-4a12-9622-5ab3d2223da2-kube-api-access-tw2t4\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:16 crc kubenswrapper[4707]: I0121 16:49:16.907526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-credential-keys\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.009202 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-scripts\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc 
kubenswrapper[4707]: I0121 16:49:17.009277 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-fernet-keys\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.009324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-config-data\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.009356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2t4\" (UniqueName: \"kubernetes.io/projected/e187bfc7-6f17-4a12-9622-5ab3d2223da2-kube-api-access-tw2t4\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.009382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-credential-keys\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.012658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-credential-keys\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.012795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-scripts\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.013379 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-fernet-keys\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.013509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-config-data\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.021489 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2t4\" (UniqueName: \"kubernetes.io/projected/e187bfc7-6f17-4a12-9622-5ab3d2223da2-kube-api-access-tw2t4\") pod \"keystone-bootstrap-c9xfs\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.125697 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.469196 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-c9xfs"] Jan 21 16:49:17 crc kubenswrapper[4707]: W0121 16:49:17.471398 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode187bfc7_6f17_4a12_9622_5ab3d2223da2.slice/crio-9f5ad27d7634cfbcfcc17a373bcacc2a6cbe9c236a2de1cbf278793e2962f3f3 WatchSource:0}: Error finding container 9f5ad27d7634cfbcfcc17a373bcacc2a6cbe9c236a2de1cbf278793e2962f3f3: Status 404 returned error can't find the container with id 9f5ad27d7634cfbcfcc17a373bcacc2a6cbe9c236a2de1cbf278793e2962f3f3 Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.660203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" event={"ID":"e187bfc7-6f17-4a12-9622-5ab3d2223da2","Type":"ContainerStarted","Data":"99aa7065f55b2a4a65ff93e54f406e9dcce7ea05b75e6a2c3832f075fdff6166"} Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.660259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" event={"ID":"e187bfc7-6f17-4a12-9622-5ab3d2223da2","Type":"ContainerStarted","Data":"9f5ad27d7634cfbcfcc17a373bcacc2a6cbe9c236a2de1cbf278793e2962f3f3"} Jan 21 16:49:17 crc kubenswrapper[4707]: I0121 16:49:17.674359 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" podStartSLOduration=1.67434759 podStartE2EDuration="1.67434759s" podCreationTimestamp="2026-01-21 16:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:49:17.670393787 +0000 UTC m=+6454.851910008" watchObservedRunningTime="2026-01-21 16:49:17.67434759 +0000 UTC m=+6454.855863812" Jan 21 16:49:19 crc kubenswrapper[4707]: I0121 16:49:19.673211 4707 generic.go:334] "Generic (PLEG): container finished" podID="e187bfc7-6f17-4a12-9622-5ab3d2223da2" containerID="99aa7065f55b2a4a65ff93e54f406e9dcce7ea05b75e6a2c3832f075fdff6166" exitCode=0 Jan 21 16:49:19 crc kubenswrapper[4707]: I0121 16:49:19.673298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" event={"ID":"e187bfc7-6f17-4a12-9622-5ab3d2223da2","Type":"ContainerDied","Data":"99aa7065f55b2a4a65ff93e54f406e9dcce7ea05b75e6a2c3832f075fdff6166"} Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.891290 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.960310 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-credential-keys\") pod \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.960356 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-fernet-keys\") pod \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.960442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw2t4\" (UniqueName: \"kubernetes.io/projected/e187bfc7-6f17-4a12-9622-5ab3d2223da2-kube-api-access-tw2t4\") pod \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.960463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-scripts\") pod \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.960483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-config-data\") pod \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\" (UID: \"e187bfc7-6f17-4a12-9622-5ab3d2223da2\") " Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.966877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-scripts" (OuterVolumeSpecName: "scripts") pod "e187bfc7-6f17-4a12-9622-5ab3d2223da2" (UID: "e187bfc7-6f17-4a12-9622-5ab3d2223da2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.967116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e187bfc7-6f17-4a12-9622-5ab3d2223da2" (UID: "e187bfc7-6f17-4a12-9622-5ab3d2223da2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.967303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e187bfc7-6f17-4a12-9622-5ab3d2223da2" (UID: "e187bfc7-6f17-4a12-9622-5ab3d2223da2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.967352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e187bfc7-6f17-4a12-9622-5ab3d2223da2-kube-api-access-tw2t4" (OuterVolumeSpecName: "kube-api-access-tw2t4") pod "e187bfc7-6f17-4a12-9622-5ab3d2223da2" (UID: "e187bfc7-6f17-4a12-9622-5ab3d2223da2"). InnerVolumeSpecName "kube-api-access-tw2t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:49:20 crc kubenswrapper[4707]: I0121 16:49:20.979968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-config-data" (OuterVolumeSpecName: "config-data") pod "e187bfc7-6f17-4a12-9622-5ab3d2223da2" (UID: "e187bfc7-6f17-4a12-9622-5ab3d2223da2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.062348 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw2t4\" (UniqueName: \"kubernetes.io/projected/e187bfc7-6f17-4a12-9622-5ab3d2223da2-kube-api-access-tw2t4\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.062376 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.062386 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.062394 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.062402 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e187bfc7-6f17-4a12-9622-5ab3d2223da2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.685612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" event={"ID":"e187bfc7-6f17-4a12-9622-5ab3d2223da2","Type":"ContainerDied","Data":"9f5ad27d7634cfbcfcc17a373bcacc2a6cbe9c236a2de1cbf278793e2962f3f3"} Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.685635 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-c9xfs" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.685646 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5ad27d7634cfbcfcc17a373bcacc2a6cbe9c236a2de1cbf278793e2962f3f3" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.728753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-76df4df77b-fm5zd"] Jan 21 16:49:21 crc kubenswrapper[4707]: E0121 16:49:21.728992 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e187bfc7-6f17-4a12-9622-5ab3d2223da2" containerName="keystone-bootstrap" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.729009 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e187bfc7-6f17-4a12-9622-5ab3d2223da2" containerName="keystone-bootstrap" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.729122 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e187bfc7-6f17-4a12-9622-5ab3d2223da2" containerName="keystone-bootstrap" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.729549 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.731766 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.731912 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.732029 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-g6b9s" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.732134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.738598 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-76df4df77b-fm5zd"] Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.872268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-credential-keys\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.872575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-fernet-keys\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.872618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-scripts\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.872664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xzd\" (UniqueName: \"kubernetes.io/projected/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-kube-api-access-g8xzd\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.872719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-config-data\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.974152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-fernet-keys\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.974203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-scripts\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.974254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xzd\" (UniqueName: \"kubernetes.io/projected/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-kube-api-access-g8xzd\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.974292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-config-data\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.974350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-credential-keys\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.978365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-scripts\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.978567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-fernet-keys\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.978679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-credential-keys\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.978756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-config-data\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:21 crc kubenswrapper[4707]: I0121 16:49:21.987545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xzd\" (UniqueName: \"kubernetes.io/projected/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-kube-api-access-g8xzd\") pod \"keystone-76df4df77b-fm5zd\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:22 crc kubenswrapper[4707]: I0121 16:49:22.041392 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:22 crc kubenswrapper[4707]: I0121 16:49:22.386463 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-76df4df77b-fm5zd"] Jan 21 16:49:22 crc kubenswrapper[4707]: W0121 16:49:22.388426 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b17707_45f7_4c38_a6af_fcaace1b4fe1.slice/crio-c588cf0c52438d20b2237f16cbc3faa28825c85a766c464f36987605f305170f WatchSource:0}: Error finding container c588cf0c52438d20b2237f16cbc3faa28825c85a766c464f36987605f305170f: Status 404 returned error can't find the container with id c588cf0c52438d20b2237f16cbc3faa28825c85a766c464f36987605f305170f Jan 21 16:49:22 crc kubenswrapper[4707]: I0121 16:49:22.692216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" event={"ID":"f6b17707-45f7-4c38-a6af-fcaace1b4fe1","Type":"ContainerStarted","Data":"2156085ed175a4090e7560bb00b4831279e69f9fe2ce08b5e3a4131a5f6b5d88"} Jan 21 16:49:22 crc kubenswrapper[4707]: I0121 16:49:22.692402 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:49:22 crc kubenswrapper[4707]: I0121 16:49:22.692413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" event={"ID":"f6b17707-45f7-4c38-a6af-fcaace1b4fe1","Type":"ContainerStarted","Data":"c588cf0c52438d20b2237f16cbc3faa28825c85a766c464f36987605f305170f"} Jan 21 16:49:22 crc kubenswrapper[4707]: I0121 16:49:22.705990 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" podStartSLOduration=1.7059759749999999 podStartE2EDuration="1.705975975s" podCreationTimestamp="2026-01-21 16:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:49:22.702327374 +0000 UTC m=+6459.883843596" watchObservedRunningTime="2026-01-21 16:49:22.705975975 +0000 UTC m=+6459.887492196" Jan 21 16:49:39 crc kubenswrapper[4707]: I0121 16:49:39.945647 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:49:39 crc kubenswrapper[4707]: I0121 16:49:39.946265 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:49:53 crc kubenswrapper[4707]: I0121 16:49:53.297759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.784468 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-8qlht"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.791055 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-c9xfs"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.795234 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-c9xfs"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.806350 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-8qlht"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.811832 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-76df4df77b-fm5zd"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.812025 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" podUID="f6b17707-45f7-4c38-a6af-fcaace1b4fe1" containerName="keystone-api" containerID="cri-o://2156085ed175a4090e7560bb00b4831279e69f9fe2ce08b5e3a4131a5f6b5d88" gracePeriod=30 Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.833254 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonee19f-account-delete-qwc8h"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.834014 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.839337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvrj\" (UniqueName: \"kubernetes.io/projected/c602a78d-2c08-4f41-8789-c17dd2e072cb-kube-api-access-sdvrj\") pod \"keystonee19f-account-delete-qwc8h\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.839426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c602a78d-2c08-4f41-8789-c17dd2e072cb-operator-scripts\") pod \"keystonee19f-account-delete-qwc8h\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.847544 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonee19f-account-delete-qwc8h"] Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.940211 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvrj\" (UniqueName: \"kubernetes.io/projected/c602a78d-2c08-4f41-8789-c17dd2e072cb-kube-api-access-sdvrj\") pod \"keystonee19f-account-delete-qwc8h\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.940303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c602a78d-2c08-4f41-8789-c17dd2e072cb-operator-scripts\") pod \"keystonee19f-account-delete-qwc8h\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.940889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c602a78d-2c08-4f41-8789-c17dd2e072cb-operator-scripts\") pod \"keystonee19f-account-delete-qwc8h\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:07 crc kubenswrapper[4707]: I0121 16:50:07.956052 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvrj\" (UniqueName: \"kubernetes.io/projected/c602a78d-2c08-4f41-8789-c17dd2e072cb-kube-api-access-sdvrj\") pod \"keystonee19f-account-delete-qwc8h\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:08 crc kubenswrapper[4707]: I0121 16:50:08.149669 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:08 crc kubenswrapper[4707]: I0121 16:50:08.494528 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonee19f-account-delete-qwc8h"] Jan 21 16:50:08 crc kubenswrapper[4707]: I0121 16:50:08.944940 4707 generic.go:334] "Generic (PLEG): container finished" podID="c602a78d-2c08-4f41-8789-c17dd2e072cb" containerID="b9ad976554bf0968c748ae6ea63fa245a604051ce377ba7ca72ab9226dd272b0" exitCode=0 Jan 21 16:50:08 crc kubenswrapper[4707]: I0121 16:50:08.944977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" event={"ID":"c602a78d-2c08-4f41-8789-c17dd2e072cb","Type":"ContainerDied","Data":"b9ad976554bf0968c748ae6ea63fa245a604051ce377ba7ca72ab9226dd272b0"} Jan 21 16:50:08 crc kubenswrapper[4707]: I0121 16:50:08.945000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" event={"ID":"c602a78d-2c08-4f41-8789-c17dd2e072cb","Type":"ContainerStarted","Data":"b9c6491a31a5d5b4741a91c8b7d04bb4acec0044964fd378893ee43b8008bc37"} Jan 21 16:50:09 crc kubenswrapper[4707]: I0121 16:50:09.188591 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f872eba-7e32-489e-b2e0-ddc51d5c748d" path="/var/lib/kubelet/pods/1f872eba-7e32-489e-b2e0-ddc51d5c748d/volumes" Jan 21 16:50:09 crc kubenswrapper[4707]: I0121 16:50:09.189132 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e187bfc7-6f17-4a12-9622-5ab3d2223da2" path="/var/lib/kubelet/pods/e187bfc7-6f17-4a12-9622-5ab3d2223da2/volumes" Jan 21 16:50:09 crc kubenswrapper[4707]: I0121 16:50:09.946009 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:50:09 crc kubenswrapper[4707]: I0121 16:50:09.946212 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:50:09 crc kubenswrapper[4707]: I0121 16:50:09.946256 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:50:09 crc kubenswrapper[4707]: I0121 16:50:09.946945 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3dcf06292baafcac4743dbbfb989956b7b70306caef24c5da65220fe71222042"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:50:09 
crc kubenswrapper[4707]: I0121 16:50:09.947001 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://3dcf06292baafcac4743dbbfb989956b7b70306caef24c5da65220fe71222042" gracePeriod=600 Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.160719 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.269349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c602a78d-2c08-4f41-8789-c17dd2e072cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c602a78d-2c08-4f41-8789-c17dd2e072cb" (UID: "c602a78d-2c08-4f41-8789-c17dd2e072cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.269398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c602a78d-2c08-4f41-8789-c17dd2e072cb-operator-scripts\") pod \"c602a78d-2c08-4f41-8789-c17dd2e072cb\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.269474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdvrj\" (UniqueName: \"kubernetes.io/projected/c602a78d-2c08-4f41-8789-c17dd2e072cb-kube-api-access-sdvrj\") pod \"c602a78d-2c08-4f41-8789-c17dd2e072cb\" (UID: \"c602a78d-2c08-4f41-8789-c17dd2e072cb\") " Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.270324 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c602a78d-2c08-4f41-8789-c17dd2e072cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.273420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c602a78d-2c08-4f41-8789-c17dd2e072cb-kube-api-access-sdvrj" (OuterVolumeSpecName: "kube-api-access-sdvrj") pod "c602a78d-2c08-4f41-8789-c17dd2e072cb" (UID: "c602a78d-2c08-4f41-8789-c17dd2e072cb"). InnerVolumeSpecName "kube-api-access-sdvrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.371022 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdvrj\" (UniqueName: \"kubernetes.io/projected/c602a78d-2c08-4f41-8789-c17dd2e072cb-kube-api-access-sdvrj\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.971068 4707 generic.go:334] "Generic (PLEG): container finished" podID="f6b17707-45f7-4c38-a6af-fcaace1b4fe1" containerID="2156085ed175a4090e7560bb00b4831279e69f9fe2ce08b5e3a4131a5f6b5d88" exitCode=0 Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.971357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" event={"ID":"f6b17707-45f7-4c38-a6af-fcaace1b4fe1","Type":"ContainerDied","Data":"2156085ed175a4090e7560bb00b4831279e69f9fe2ce08b5e3a4131a5f6b5d88"} Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.974900 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="3dcf06292baafcac4743dbbfb989956b7b70306caef24c5da65220fe71222042" exitCode=0 Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.974964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"3dcf06292baafcac4743dbbfb989956b7b70306caef24c5da65220fe71222042"} Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.975481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782"} Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.975517 4707 scope.go:117] "RemoveContainer" containerID="bb869a37cdf9f36c036f49610ae1644108174dfed1e40a1a1d330f49b2783b79" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.976644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" event={"ID":"c602a78d-2c08-4f41-8789-c17dd2e072cb","Type":"ContainerDied","Data":"b9c6491a31a5d5b4741a91c8b7d04bb4acec0044964fd378893ee43b8008bc37"} Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.976672 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c6491a31a5d5b4741a91c8b7d04bb4acec0044964fd378893ee43b8008bc37" Jan 21 16:50:10 crc kubenswrapper[4707]: I0121 16:50:10.976715 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonee19f-account-delete-qwc8h" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.145612 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.281545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-scripts\") pod \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.281611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-config-data\") pod \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.282240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8xzd\" (UniqueName: \"kubernetes.io/projected/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-kube-api-access-g8xzd\") pod \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.283114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-credential-keys\") pod \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.283152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-fernet-keys\") pod \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\" (UID: \"f6b17707-45f7-4c38-a6af-fcaace1b4fe1\") " Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.287154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-scripts" (OuterVolumeSpecName: "scripts") pod "f6b17707-45f7-4c38-a6af-fcaace1b4fe1" (UID: "f6b17707-45f7-4c38-a6af-fcaace1b4fe1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.287353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f6b17707-45f7-4c38-a6af-fcaace1b4fe1" (UID: "f6b17707-45f7-4c38-a6af-fcaace1b4fe1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.287480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-kube-api-access-g8xzd" (OuterVolumeSpecName: "kube-api-access-g8xzd") pod "f6b17707-45f7-4c38-a6af-fcaace1b4fe1" (UID: "f6b17707-45f7-4c38-a6af-fcaace1b4fe1"). InnerVolumeSpecName "kube-api-access-g8xzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.287784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f6b17707-45f7-4c38-a6af-fcaace1b4fe1" (UID: "f6b17707-45f7-4c38-a6af-fcaace1b4fe1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.298246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-config-data" (OuterVolumeSpecName: "config-data") pod "f6b17707-45f7-4c38-a6af-fcaace1b4fe1" (UID: "f6b17707-45f7-4c38-a6af-fcaace1b4fe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.385205 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8xzd\" (UniqueName: \"kubernetes.io/projected/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-kube-api-access-g8xzd\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.385233 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.385243 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.385253 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.385261 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b17707-45f7-4c38-a6af-fcaace1b4fe1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.984565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" event={"ID":"f6b17707-45f7-4c38-a6af-fcaace1b4fe1","Type":"ContainerDied","Data":"c588cf0c52438d20b2237f16cbc3faa28825c85a766c464f36987605f305170f"} Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.984898 4707 scope.go:117] "RemoveContainer" containerID="2156085ed175a4090e7560bb00b4831279e69f9fe2ce08b5e3a4131a5f6b5d88" Jan 21 16:50:11 crc kubenswrapper[4707]: I0121 16:50:11.984604 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-76df4df77b-fm5zd" Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.009638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-76df4df77b-fm5zd"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.013830 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-76df4df77b-fm5zd"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.848850 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6cxp2"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.852802 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6cxp2"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.858238 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.862356 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonee19f-account-delete-qwc8h"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.865885 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonee19f-account-delete-qwc8h"] Jan 21 16:50:12 crc kubenswrapper[4707]: I0121 16:50:12.869248 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-e19f-account-create-update-k8td5"] Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.070040 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z7d44"] Jan 21 16:50:13 crc kubenswrapper[4707]: E0121 16:50:13.070303 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c602a78d-2c08-4f41-8789-c17dd2e072cb" containerName="mariadb-account-delete" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.070316 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c602a78d-2c08-4f41-8789-c17dd2e072cb" containerName="mariadb-account-delete" Jan 21 16:50:13 crc kubenswrapper[4707]: E0121 16:50:13.070340 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b17707-45f7-4c38-a6af-fcaace1b4fe1" containerName="keystone-api" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.070346 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b17707-45f7-4c38-a6af-fcaace1b4fe1" containerName="keystone-api" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.070472 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c602a78d-2c08-4f41-8789-c17dd2e072cb" containerName="mariadb-account-delete" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.070487 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b17707-45f7-4c38-a6af-fcaace1b4fe1" containerName="keystone-api" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.070971 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.073545 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb"] Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.074218 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.075679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.077036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z7d44"] Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.080749 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb"] Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.188538 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0652d49d-29aa-46fd-b5e1-0d08db4c0ae8" path="/var/lib/kubelet/pods/0652d49d-29aa-46fd-b5e1-0d08db4c0ae8/volumes" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.189067 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d9f595-4581-4fce-9c6e-504d5947291d" path="/var/lib/kubelet/pods/b4d9f595-4581-4fce-9c6e-504d5947291d/volumes" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.189510 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c602a78d-2c08-4f41-8789-c17dd2e072cb" path="/var/lib/kubelet/pods/c602a78d-2c08-4f41-8789-c17dd2e072cb/volumes" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.189972 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b17707-45f7-4c38-a6af-fcaace1b4fe1" path="/var/lib/kubelet/pods/f6b17707-45f7-4c38-a6af-fcaace1b4fe1/volumes" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.209780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/460e0610-4ca8-448d-95c0-4f1201b17de4-operator-scripts\") pod \"keystone-78f6-account-create-update-dwbzb\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.209866 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzgzv\" (UniqueName: \"kubernetes.io/projected/460e0610-4ca8-448d-95c0-4f1201b17de4-kube-api-access-jzgzv\") pod \"keystone-78f6-account-create-update-dwbzb\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.209932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a86ce12d-c904-4998-a578-ee635058b026-operator-scripts\") pod \"keystone-db-create-z7d44\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.209963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh57j\" (UniqueName: \"kubernetes.io/projected/a86ce12d-c904-4998-a578-ee635058b026-kube-api-access-rh57j\") pod \"keystone-db-create-z7d44\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.310835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/460e0610-4ca8-448d-95c0-4f1201b17de4-operator-scripts\") pod \"keystone-78f6-account-create-update-dwbzb\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.310893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzgzv\" (UniqueName: \"kubernetes.io/projected/460e0610-4ca8-448d-95c0-4f1201b17de4-kube-api-access-jzgzv\") pod \"keystone-78f6-account-create-update-dwbzb\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.310958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a86ce12d-c904-4998-a578-ee635058b026-operator-scripts\") pod \"keystone-db-create-z7d44\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.310982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh57j\" (UniqueName: \"kubernetes.io/projected/a86ce12d-c904-4998-a578-ee635058b026-kube-api-access-rh57j\") pod \"keystone-db-create-z7d44\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.311495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/460e0610-4ca8-448d-95c0-4f1201b17de4-operator-scripts\") pod \"keystone-78f6-account-create-update-dwbzb\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.311948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a86ce12d-c904-4998-a578-ee635058b026-operator-scripts\") pod \"keystone-db-create-z7d44\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.326757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh57j\" (UniqueName: \"kubernetes.io/projected/a86ce12d-c904-4998-a578-ee635058b026-kube-api-access-rh57j\") pod \"keystone-db-create-z7d44\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.326978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzgzv\" (UniqueName: \"kubernetes.io/projected/460e0610-4ca8-448d-95c0-4f1201b17de4-kube-api-access-jzgzv\") pod \"keystone-78f6-account-create-update-dwbzb\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.385651 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.392241 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.747361 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb"] Jan 21 16:50:13 crc kubenswrapper[4707]: W0121 16:50:13.748958 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod460e0610_4ca8_448d_95c0_4f1201b17de4.slice/crio-0b59543b36f78c243196960886fcc68d604e9a640eecb6b663aac39162a5d01d WatchSource:0}: Error finding container 0b59543b36f78c243196960886fcc68d604e9a640eecb6b663aac39162a5d01d: Status 404 returned error can't find the container with id 0b59543b36f78c243196960886fcc68d604e9a640eecb6b663aac39162a5d01d Jan 21 16:50:13 crc kubenswrapper[4707]: I0121 16:50:13.780906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z7d44"] Jan 21 16:50:13 crc kubenswrapper[4707]: W0121 16:50:13.782155 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda86ce12d_c904_4998_a578_ee635058b026.slice/crio-8501e37a945c0026bba0c39f86c33a3b09b5947a8d59ed9d8f6a5143f3882eb9 WatchSource:0}: Error finding container 8501e37a945c0026bba0c39f86c33a3b09b5947a8d59ed9d8f6a5143f3882eb9: Status 404 returned error can't find the container with id 8501e37a945c0026bba0c39f86c33a3b09b5947a8d59ed9d8f6a5143f3882eb9 Jan 21 16:50:14 crc kubenswrapper[4707]: I0121 16:50:14.004825 4707 generic.go:334] "Generic (PLEG): container finished" podID="a86ce12d-c904-4998-a578-ee635058b026" containerID="53165e463ab70fe75ad812e60768d1e163e8bc9465d3de20c8a2f755b46f59c3" exitCode=0 Jan 21 16:50:14 crc kubenswrapper[4707]: I0121 16:50:14.004921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-z7d44" event={"ID":"a86ce12d-c904-4998-a578-ee635058b026","Type":"ContainerDied","Data":"53165e463ab70fe75ad812e60768d1e163e8bc9465d3de20c8a2f755b46f59c3"} Jan 21 16:50:14 crc kubenswrapper[4707]: I0121 16:50:14.004965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-z7d44" event={"ID":"a86ce12d-c904-4998-a578-ee635058b026","Type":"ContainerStarted","Data":"8501e37a945c0026bba0c39f86c33a3b09b5947a8d59ed9d8f6a5143f3882eb9"} Jan 21 16:50:14 crc kubenswrapper[4707]: I0121 16:50:14.006437 4707 generic.go:334] "Generic (PLEG): container finished" podID="460e0610-4ca8-448d-95c0-4f1201b17de4" containerID="23b2f57d9b2def74407ec6e56f332cefbc9a80c38a4134e57ea264ee71e2dcab" exitCode=0 Jan 21 16:50:14 crc kubenswrapper[4707]: I0121 16:50:14.006474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" event={"ID":"460e0610-4ca8-448d-95c0-4f1201b17de4","Type":"ContainerDied","Data":"23b2f57d9b2def74407ec6e56f332cefbc9a80c38a4134e57ea264ee71e2dcab"} Jan 21 16:50:14 crc kubenswrapper[4707]: I0121 16:50:14.006496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" event={"ID":"460e0610-4ca8-448d-95c0-4f1201b17de4","Type":"ContainerStarted","Data":"0b59543b36f78c243196960886fcc68d604e9a640eecb6b663aac39162a5d01d"} Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.257613 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.262859 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.337460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a86ce12d-c904-4998-a578-ee635058b026-operator-scripts\") pod \"a86ce12d-c904-4998-a578-ee635058b026\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.337555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/460e0610-4ca8-448d-95c0-4f1201b17de4-operator-scripts\") pod \"460e0610-4ca8-448d-95c0-4f1201b17de4\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.337600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh57j\" (UniqueName: \"kubernetes.io/projected/a86ce12d-c904-4998-a578-ee635058b026-kube-api-access-rh57j\") pod \"a86ce12d-c904-4998-a578-ee635058b026\" (UID: \"a86ce12d-c904-4998-a578-ee635058b026\") " Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.337652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzgzv\" (UniqueName: \"kubernetes.io/projected/460e0610-4ca8-448d-95c0-4f1201b17de4-kube-api-access-jzgzv\") pod \"460e0610-4ca8-448d-95c0-4f1201b17de4\" (UID: \"460e0610-4ca8-448d-95c0-4f1201b17de4\") " Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.338093 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86ce12d-c904-4998-a578-ee635058b026-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a86ce12d-c904-4998-a578-ee635058b026" (UID: "a86ce12d-c904-4998-a578-ee635058b026"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.338100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460e0610-4ca8-448d-95c0-4f1201b17de4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "460e0610-4ca8-448d-95c0-4f1201b17de4" (UID: "460e0610-4ca8-448d-95c0-4f1201b17de4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.338217 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/460e0610-4ca8-448d-95c0-4f1201b17de4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.338237 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a86ce12d-c904-4998-a578-ee635058b026-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.342059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86ce12d-c904-4998-a578-ee635058b026-kube-api-access-rh57j" (OuterVolumeSpecName: "kube-api-access-rh57j") pod "a86ce12d-c904-4998-a578-ee635058b026" (UID: "a86ce12d-c904-4998-a578-ee635058b026"). 
InnerVolumeSpecName "kube-api-access-rh57j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.342104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460e0610-4ca8-448d-95c0-4f1201b17de4-kube-api-access-jzgzv" (OuterVolumeSpecName: "kube-api-access-jzgzv") pod "460e0610-4ca8-448d-95c0-4f1201b17de4" (UID: "460e0610-4ca8-448d-95c0-4f1201b17de4"). InnerVolumeSpecName "kube-api-access-jzgzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.438919 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh57j\" (UniqueName: \"kubernetes.io/projected/a86ce12d-c904-4998-a578-ee635058b026-kube-api-access-rh57j\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:15 crc kubenswrapper[4707]: I0121 16:50:15.438945 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzgzv\" (UniqueName: \"kubernetes.io/projected/460e0610-4ca8-448d-95c0-4f1201b17de4-kube-api-access-jzgzv\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:16 crc kubenswrapper[4707]: I0121 16:50:16.018674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-z7d44" event={"ID":"a86ce12d-c904-4998-a578-ee635058b026","Type":"ContainerDied","Data":"8501e37a945c0026bba0c39f86c33a3b09b5947a8d59ed9d8f6a5143f3882eb9"} Jan 21 16:50:16 crc kubenswrapper[4707]: I0121 16:50:16.018698 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-z7d44" Jan 21 16:50:16 crc kubenswrapper[4707]: I0121 16:50:16.018708 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8501e37a945c0026bba0c39f86c33a3b09b5947a8d59ed9d8f6a5143f3882eb9" Jan 21 16:50:16 crc kubenswrapper[4707]: I0121 16:50:16.019988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" event={"ID":"460e0610-4ca8-448d-95c0-4f1201b17de4","Type":"ContainerDied","Data":"0b59543b36f78c243196960886fcc68d604e9a640eecb6b663aac39162a5d01d"} Jan 21 16:50:16 crc kubenswrapper[4707]: I0121 16:50:16.020012 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb" Jan 21 16:50:16 crc kubenswrapper[4707]: I0121 16:50:16.020022 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b59543b36f78c243196960886fcc68d604e9a640eecb6b663aac39162a5d01d" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.529225 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7zzl5"] Jan 21 16:50:18 crc kubenswrapper[4707]: E0121 16:50:18.529886 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86ce12d-c904-4998-a578-ee635058b026" containerName="mariadb-database-create" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.529899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86ce12d-c904-4998-a578-ee635058b026" containerName="mariadb-database-create" Jan 21 16:50:18 crc kubenswrapper[4707]: E0121 16:50:18.529917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460e0610-4ca8-448d-95c0-4f1201b17de4" containerName="mariadb-account-create-update" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.529922 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="460e0610-4ca8-448d-95c0-4f1201b17de4" containerName="mariadb-account-create-update" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.530044 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86ce12d-c904-4998-a578-ee635058b026" containerName="mariadb-database-create" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.530062 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="460e0610-4ca8-448d-95c0-4f1201b17de4" containerName="mariadb-account-create-update" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.530477 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.531864 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-j255k" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.531924 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.531963 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.532294 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.535584 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7zzl5"] Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.577505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b78668-b085-4edb-b04f-dfdfdcde3ddb-config-data\") pod \"keystone-db-sync-7zzl5\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.577640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdvlp\" (UniqueName: \"kubernetes.io/projected/70b78668-b085-4edb-b04f-dfdfdcde3ddb-kube-api-access-jdvlp\") pod \"keystone-db-sync-7zzl5\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.678703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b78668-b085-4edb-b04f-dfdfdcde3ddb-config-data\") pod \"keystone-db-sync-7zzl5\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.678795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdvlp\" (UniqueName: \"kubernetes.io/projected/70b78668-b085-4edb-b04f-dfdfdcde3ddb-kube-api-access-jdvlp\") pod \"keystone-db-sync-7zzl5\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.683836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b78668-b085-4edb-b04f-dfdfdcde3ddb-config-data\") pod \"keystone-db-sync-7zzl5\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.692167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdvlp\" (UniqueName: \"kubernetes.io/projected/70b78668-b085-4edb-b04f-dfdfdcde3ddb-kube-api-access-jdvlp\") pod \"keystone-db-sync-7zzl5\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:18 crc kubenswrapper[4707]: I0121 16:50:18.843911 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:19 crc kubenswrapper[4707]: I0121 16:50:19.205338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7zzl5"] Jan 21 16:50:20 crc kubenswrapper[4707]: I0121 16:50:20.044938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" event={"ID":"70b78668-b085-4edb-b04f-dfdfdcde3ddb","Type":"ContainerStarted","Data":"90d214c80d3b90069950ba85216ca2464966246862650e81f576f0f7d31e97d4"} Jan 21 16:50:20 crc kubenswrapper[4707]: I0121 16:50:20.045107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" event={"ID":"70b78668-b085-4edb-b04f-dfdfdcde3ddb","Type":"ContainerStarted","Data":"56b58f802161edfba65d14db1dadb02911695ef1d97dbb7f70cd753ed8ae1e33"} Jan 21 16:50:20 crc kubenswrapper[4707]: I0121 16:50:20.060056 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" podStartSLOduration=2.060045135 podStartE2EDuration="2.060045135s" podCreationTimestamp="2026-01-21 16:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:50:20.054462197 +0000 UTC m=+6517.235978419" watchObservedRunningTime="2026-01-21 16:50:20.060045135 +0000 UTC m=+6517.241561357" Jan 21 16:50:21 crc kubenswrapper[4707]: I0121 16:50:21.051564 4707 generic.go:334] "Generic (PLEG): container finished" podID="70b78668-b085-4edb-b04f-dfdfdcde3ddb" containerID="90d214c80d3b90069950ba85216ca2464966246862650e81f576f0f7d31e97d4" exitCode=0 Jan 21 16:50:21 crc kubenswrapper[4707]: I0121 16:50:21.051601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" event={"ID":"70b78668-b085-4edb-b04f-dfdfdcde3ddb","Type":"ContainerDied","Data":"90d214c80d3b90069950ba85216ca2464966246862650e81f576f0f7d31e97d4"} Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.265755 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.428288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b78668-b085-4edb-b04f-dfdfdcde3ddb-config-data\") pod \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.428332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdvlp\" (UniqueName: \"kubernetes.io/projected/70b78668-b085-4edb-b04f-dfdfdcde3ddb-kube-api-access-jdvlp\") pod \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\" (UID: \"70b78668-b085-4edb-b04f-dfdfdcde3ddb\") " Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.432488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b78668-b085-4edb-b04f-dfdfdcde3ddb-kube-api-access-jdvlp" (OuterVolumeSpecName: "kube-api-access-jdvlp") pod "70b78668-b085-4edb-b04f-dfdfdcde3ddb" (UID: "70b78668-b085-4edb-b04f-dfdfdcde3ddb"). InnerVolumeSpecName "kube-api-access-jdvlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.454467 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b78668-b085-4edb-b04f-dfdfdcde3ddb-config-data" (OuterVolumeSpecName: "config-data") pod "70b78668-b085-4edb-b04f-dfdfdcde3ddb" (UID: "70b78668-b085-4edb-b04f-dfdfdcde3ddb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.530602 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b78668-b085-4edb-b04f-dfdfdcde3ddb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:22 crc kubenswrapper[4707]: I0121 16:50:22.530635 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdvlp\" (UniqueName: \"kubernetes.io/projected/70b78668-b085-4edb-b04f-dfdfdcde3ddb-kube-api-access-jdvlp\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.062923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" event={"ID":"70b78668-b085-4edb-b04f-dfdfdcde3ddb","Type":"ContainerDied","Data":"56b58f802161edfba65d14db1dadb02911695ef1d97dbb7f70cd753ed8ae1e33"} Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.062948 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7zzl5" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.062956 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56b58f802161edfba65d14db1dadb02911695ef1d97dbb7f70cd753ed8ae1e33" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.224518 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kwc58"] Jan 21 16:50:23 crc kubenswrapper[4707]: E0121 16:50:23.225014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b78668-b085-4edb-b04f-dfdfdcde3ddb" containerName="keystone-db-sync" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.225041 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b78668-b085-4edb-b04f-dfdfdcde3ddb" containerName="keystone-db-sync" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.225217 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b78668-b085-4edb-b04f-dfdfdcde3ddb" containerName="keystone-db-sync" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.225882 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.227583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.228366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.228765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.228797 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.230877 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-j255k" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.236161 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kwc58"] Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.341335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctpvs\" (UniqueName: \"kubernetes.io/projected/15d1f71b-6822-4297-ac9b-e51c3a620734-kube-api-access-ctpvs\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.341430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-credential-keys\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.341479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-config-data\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.341568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-scripts\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.341582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-fernet-keys\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.442885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-scripts\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc 
kubenswrapper[4707]: I0121 16:50:23.443077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-fernet-keys\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.443127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctpvs\" (UniqueName: \"kubernetes.io/projected/15d1f71b-6822-4297-ac9b-e51c3a620734-kube-api-access-ctpvs\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.443148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-credential-keys\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.443180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-config-data\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.445698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-scripts\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.445801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-fernet-keys\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.445886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-config-data\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.446738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-credential-keys\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.458673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctpvs\" (UniqueName: \"kubernetes.io/projected/15d1f71b-6822-4297-ac9b-e51c3a620734-kube-api-access-ctpvs\") pod \"keystone-bootstrap-kwc58\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.544435 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:23 crc kubenswrapper[4707]: I0121 16:50:23.888187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kwc58"] Jan 21 16:50:23 crc kubenswrapper[4707]: W0121 16:50:23.890562 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d1f71b_6822_4297_ac9b_e51c3a620734.slice/crio-ab552219c6a87d9255b02008258998fc8e8c25c51dffcd708e032f0d7dcbe495 WatchSource:0}: Error finding container ab552219c6a87d9255b02008258998fc8e8c25c51dffcd708e032f0d7dcbe495: Status 404 returned error can't find the container with id ab552219c6a87d9255b02008258998fc8e8c25c51dffcd708e032f0d7dcbe495 Jan 21 16:50:24 crc kubenswrapper[4707]: I0121 16:50:24.069891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" event={"ID":"15d1f71b-6822-4297-ac9b-e51c3a620734","Type":"ContainerStarted","Data":"8cf5a94340acb87b8d460e84d6e2facb8491c341d47fc2c5a04487f346afff9a"} Jan 21 16:50:24 crc kubenswrapper[4707]: I0121 16:50:24.070115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" event={"ID":"15d1f71b-6822-4297-ac9b-e51c3a620734","Type":"ContainerStarted","Data":"ab552219c6a87d9255b02008258998fc8e8c25c51dffcd708e032f0d7dcbe495"} Jan 21 16:50:24 crc kubenswrapper[4707]: I0121 16:50:24.082683 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" podStartSLOduration=1.082670694 podStartE2EDuration="1.082670694s" podCreationTimestamp="2026-01-21 16:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:50:24.079768697 +0000 UTC m=+6521.261284919" watchObservedRunningTime="2026-01-21 16:50:24.082670694 +0000 UTC m=+6521.264186916" Jan 21 16:50:26 crc kubenswrapper[4707]: I0121 16:50:26.080999 4707 generic.go:334] "Generic (PLEG): container finished" podID="15d1f71b-6822-4297-ac9b-e51c3a620734" containerID="8cf5a94340acb87b8d460e84d6e2facb8491c341d47fc2c5a04487f346afff9a" exitCode=0 Jan 21 16:50:26 crc kubenswrapper[4707]: I0121 16:50:26.081089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" event={"ID":"15d1f71b-6822-4297-ac9b-e51c3a620734","Type":"ContainerDied","Data":"8cf5a94340acb87b8d460e84d6e2facb8491c341d47fc2c5a04487f346afff9a"} Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.305564 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.487414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-scripts\") pod \"15d1f71b-6822-4297-ac9b-e51c3a620734\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.487608 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-fernet-keys\") pod \"15d1f71b-6822-4297-ac9b-e51c3a620734\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.487656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-config-data\") pod \"15d1f71b-6822-4297-ac9b-e51c3a620734\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.487679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-credential-keys\") pod \"15d1f71b-6822-4297-ac9b-e51c3a620734\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.487699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctpvs\" (UniqueName: \"kubernetes.io/projected/15d1f71b-6822-4297-ac9b-e51c3a620734-kube-api-access-ctpvs\") pod \"15d1f71b-6822-4297-ac9b-e51c3a620734\" (UID: \"15d1f71b-6822-4297-ac9b-e51c3a620734\") " Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.491674 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d1f71b-6822-4297-ac9b-e51c3a620734-kube-api-access-ctpvs" (OuterVolumeSpecName: "kube-api-access-ctpvs") pod "15d1f71b-6822-4297-ac9b-e51c3a620734" (UID: "15d1f71b-6822-4297-ac9b-e51c3a620734"). InnerVolumeSpecName "kube-api-access-ctpvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.492305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-scripts" (OuterVolumeSpecName: "scripts") pod "15d1f71b-6822-4297-ac9b-e51c3a620734" (UID: "15d1f71b-6822-4297-ac9b-e51c3a620734"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.492383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "15d1f71b-6822-4297-ac9b-e51c3a620734" (UID: "15d1f71b-6822-4297-ac9b-e51c3a620734"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.492696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "15d1f71b-6822-4297-ac9b-e51c3a620734" (UID: "15d1f71b-6822-4297-ac9b-e51c3a620734"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.502998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-config-data" (OuterVolumeSpecName: "config-data") pod "15d1f71b-6822-4297-ac9b-e51c3a620734" (UID: "15d1f71b-6822-4297-ac9b-e51c3a620734"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.589142 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.589172 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.589182 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.589194 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctpvs\" (UniqueName: \"kubernetes.io/projected/15d1f71b-6822-4297-ac9b-e51c3a620734-kube-api-access-ctpvs\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:27 crc kubenswrapper[4707]: I0121 16:50:27.589203 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d1f71b-6822-4297-ac9b-e51c3a620734-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.094068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" event={"ID":"15d1f71b-6822-4297-ac9b-e51c3a620734","Type":"ContainerDied","Data":"ab552219c6a87d9255b02008258998fc8e8c25c51dffcd708e032f0d7dcbe495"} Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.094104 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab552219c6a87d9255b02008258998fc8e8c25c51dffcd708e032f0d7dcbe495" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.094107 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kwc58" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.146762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4"] Jan 21 16:50:28 crc kubenswrapper[4707]: E0121 16:50:28.147012 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d1f71b-6822-4297-ac9b-e51c3a620734" containerName="keystone-bootstrap" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.147024 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d1f71b-6822-4297-ac9b-e51c3a620734" containerName="keystone-bootstrap" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.147140 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d1f71b-6822-4297-ac9b-e51c3a620734" containerName="keystone-bootstrap" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.147562 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.149250 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.149276 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.150119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.150372 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-j255k" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.155104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4"] Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.295329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-credential-keys\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.295428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-scripts\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.295490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-config-data\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.295660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-fernet-keys\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.295787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-677jx\" (UniqueName: \"kubernetes.io/projected/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-kube-api-access-677jx\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.397091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-677jx\" (UniqueName: \"kubernetes.io/projected/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-kube-api-access-677jx\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.397166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-credential-keys\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.397225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-scripts\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.397242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-config-data\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.397275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-fernet-keys\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.401158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-scripts\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.401482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-config-data\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.401713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-credential-keys\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.402312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-fernet-keys\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.409987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-677jx\" (UniqueName: \"kubernetes.io/projected/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-kube-api-access-677jx\") pod \"keystone-66fb9c9dbd-k9ks4\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.459483 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:28 crc kubenswrapper[4707]: I0121 16:50:28.803897 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4"] Jan 21 16:50:28 crc kubenswrapper[4707]: W0121 16:50:28.807451 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0e45f5_2bb6_4b79_822d_b455aa3feab3.slice/crio-5d2be4b6625ac185100cd088c830b7fd8d5f283e1176f26fb40eb03b7d160ec2 WatchSource:0}: Error finding container 5d2be4b6625ac185100cd088c830b7fd8d5f283e1176f26fb40eb03b7d160ec2: Status 404 returned error can't find the container with id 5d2be4b6625ac185100cd088c830b7fd8d5f283e1176f26fb40eb03b7d160ec2 Jan 21 16:50:29 crc kubenswrapper[4707]: I0121 16:50:29.101335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" event={"ID":"0a0e45f5-2bb6-4b79-822d-b455aa3feab3","Type":"ContainerStarted","Data":"c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf"} Jan 21 16:50:29 crc kubenswrapper[4707]: I0121 16:50:29.101667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" event={"ID":"0a0e45f5-2bb6-4b79-822d-b455aa3feab3","Type":"ContainerStarted","Data":"5d2be4b6625ac185100cd088c830b7fd8d5f283e1176f26fb40eb03b7d160ec2"} Jan 21 16:50:29 crc kubenswrapper[4707]: I0121 16:50:29.101683 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:50:29 crc kubenswrapper[4707]: I0121 16:50:29.118348 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" podStartSLOduration=1.118334536 podStartE2EDuration="1.118334536s" podCreationTimestamp="2026-01-21 16:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:50:29.113067944 +0000 UTC m=+6526.294584166" watchObservedRunningTime="2026-01-21 16:50:29.118334536 +0000 UTC m=+6526.299850758" Jan 21 16:50:59 crc kubenswrapper[4707]: I0121 16:50:59.709145 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.502475 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.503489 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.504959 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.505316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-dzqg7" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.508323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.510287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.687042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvms\" (UniqueName: \"kubernetes.io/projected/3bdc328f-eaaf-404d-b9b3-85f63e623e20-kube-api-access-lkvms\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.687306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.687373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config-secret\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.788561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.788653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config-secret\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.788682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvms\" (UniqueName: \"kubernetes.io/projected/3bdc328f-eaaf-404d-b9b3-85f63e623e20-kube-api-access-lkvms\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.789445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 
16:51:00.793143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config-secret\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.801375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvms\" (UniqueName: \"kubernetes.io/projected/3bdc328f-eaaf-404d-b9b3-85f63e623e20-kube-api-access-lkvms\") pod \"openstackclient\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:00 crc kubenswrapper[4707]: I0121 16:51:00.819279 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 16:51:01 crc kubenswrapper[4707]: I0121 16:51:01.175431 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 16:51:01 crc kubenswrapper[4707]: I0121 16:51:01.179627 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:51:01 crc kubenswrapper[4707]: I0121 16:51:01.279664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"3bdc328f-eaaf-404d-b9b3-85f63e623e20","Type":"ContainerStarted","Data":"c03014077384ee87d2dae538aace38e79c3cbdbc1f0e9aadd32e21a65e9dcf06"} Jan 21 16:51:02 crc kubenswrapper[4707]: I0121 16:51:02.286197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"3bdc328f-eaaf-404d-b9b3-85f63e623e20","Type":"ContainerStarted","Data":"891bb12ad0a08f08014952944a9e2df3ff4f37f3c5601f366833505ff2821af8"} Jan 21 16:51:02 crc kubenswrapper[4707]: I0121 16:51:02.295925 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstackclient" podStartSLOduration=1.752207091 podStartE2EDuration="2.2959133s" podCreationTimestamp="2026-01-21 16:51:00 +0000 UTC" firstStartedPulling="2026-01-21 16:51:01.179437655 +0000 UTC m=+6558.360953876" lastFinishedPulling="2026-01-21 16:51:01.723143864 +0000 UTC m=+6558.904660085" observedRunningTime="2026-01-21 16:51:02.2952061 +0000 UTC m=+6559.476722322" watchObservedRunningTime="2026-01-21 16:51:02.2959133 +0000 UTC m=+6559.477429522" Jan 21 16:52:26 crc kubenswrapper[4707]: I0121 16:52:26.787309 4707 scope.go:117] "RemoveContainer" containerID="d9ec4d40ea85ba2a41b34bb92391d88dc2794b0889a6b0dd17ba6b6e93e88483" Jan 21 16:52:26 crc kubenswrapper[4707]: I0121 16:52:26.811694 4707 scope.go:117] "RemoveContainer" containerID="a379485354d057a9c5c20ad2e07e735483103ab3804bb6c9aa35e4db4852cf26" Jan 21 16:52:26 crc kubenswrapper[4707]: I0121 16:52:26.825259 4707 scope.go:117] "RemoveContainer" containerID="5b5880910cc4620b820188d3d6a9729b7102ace5709df5d0f3a5ba8b705783ce" Jan 21 16:52:26 crc kubenswrapper[4707]: I0121 16:52:26.848244 4707 scope.go:117] "RemoveContainer" containerID="944e0c0c19123d9ef95e73de9714d90bd42016490859fec68b41e5f114af8f39" Jan 21 16:52:39 crc kubenswrapper[4707]: I0121 16:52:39.945505 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:52:39 
crc kubenswrapper[4707]: I0121 16:52:39.945888 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:53:09 crc kubenswrapper[4707]: I0121 16:53:09.946201 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:53:09 crc kubenswrapper[4707]: I0121 16:53:09.946757 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:53:26 crc kubenswrapper[4707]: I0121 16:53:26.905451 4707 scope.go:117] "RemoveContainer" containerID="8cfa3acf6b0a4f0b0eda47e2d325bd35d823b4a7a67d3d9af03269b9bac967b2" Jan 21 16:53:26 crc kubenswrapper[4707]: I0121 16:53:26.920455 4707 scope.go:117] "RemoveContainer" containerID="e94820e2bea52549e5aa5c5747b2c0d37a0ea118f5be2767e7027b56ff85ff76" Jan 21 16:53:26 crc kubenswrapper[4707]: I0121 16:53:26.950177 4707 scope.go:117] "RemoveContainer" containerID="1f617618e792a1f6487391e1a9ab5d922dd783e0b153792125dc0fcb57c72f70" Jan 21 16:53:26 crc kubenswrapper[4707]: I0121 16:53:26.962403 4707 scope.go:117] "RemoveContainer" containerID="8114b8c8c5dd5f046a1d094af0d690b68a58bdf7494b0f3f54997c01721c829d" Jan 21 16:53:26 crc kubenswrapper[4707]: I0121 16:53:26.983914 4707 scope.go:117] "RemoveContainer" containerID="edd2d7949d36e36e9abe826251f72563c0bc8dce4e81c40d1b691bdc336c7ad0" Jan 21 16:53:39 crc kubenswrapper[4707]: I0121 16:53:39.945247 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:53:39 crc kubenswrapper[4707]: I0121 16:53:39.945599 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:53:39 crc kubenswrapper[4707]: I0121 16:53:39.945631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 16:53:39 crc kubenswrapper[4707]: I0121 16:53:39.946031 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:53:39 crc kubenswrapper[4707]: I0121 16:53:39.946075 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" gracePeriod=600 Jan 21 16:53:40 crc kubenswrapper[4707]: E0121 16:53:40.060072 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:53:40 crc kubenswrapper[4707]: I0121 16:53:40.145023 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" exitCode=0 Jan 21 16:53:40 crc kubenswrapper[4707]: I0121 16:53:40.145087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782"} Jan 21 16:53:40 crc kubenswrapper[4707]: I0121 16:53:40.145272 4707 scope.go:117] "RemoveContainer" containerID="3dcf06292baafcac4743dbbfb989956b7b70306caef24c5da65220fe71222042" Jan 21 16:53:40 crc kubenswrapper[4707]: I0121 16:53:40.145846 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:53:40 crc kubenswrapper[4707]: E0121 16:53:40.146085 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:53:42 crc kubenswrapper[4707]: I0121 16:53:42.035557 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-8vknb"] Jan 21 16:53:42 crc kubenswrapper[4707]: I0121 16:53:42.039968 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-8vknb"] Jan 21 16:53:43 crc kubenswrapper[4707]: I0121 16:53:43.188725 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e9cca3-4ff1-4be1-9319-a63239dece19" path="/var/lib/kubelet/pods/d7e9cca3-4ff1-4be1-9319-a63239dece19/volumes" Jan 21 16:53:52 crc kubenswrapper[4707]: I0121 16:53:52.182827 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:53:52 crc kubenswrapper[4707]: E0121 16:53:52.183403 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:54:07 crc kubenswrapper[4707]: I0121 16:54:07.182888 4707 scope.go:117] 
"RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:54:07 crc kubenswrapper[4707]: E0121 16:54:07.183528 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:54:20 crc kubenswrapper[4707]: I0121 16:54:20.182036 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:54:20 crc kubenswrapper[4707]: E0121 16:54:20.183054 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:54:22 crc kubenswrapper[4707]: I0121 16:54:22.880376 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-scrhn"] Jan 21 16:54:22 crc kubenswrapper[4707]: I0121 16:54:22.881911 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:22 crc kubenswrapper[4707]: I0121 16:54:22.889430 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-scrhn"] Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.016833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2nmj\" (UniqueName: \"kubernetes.io/projected/978db14a-3379-4738-bad0-5f98983187b5-kube-api-access-s2nmj\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.016897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-utilities\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.017142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-catalog-content\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.118531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-utilities\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.118669 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-catalog-content\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.118757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2nmj\" (UniqueName: \"kubernetes.io/projected/978db14a-3379-4738-bad0-5f98983187b5-kube-api-access-s2nmj\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.119248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-utilities\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.119283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-catalog-content\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.138329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2nmj\" (UniqueName: \"kubernetes.io/projected/978db14a-3379-4738-bad0-5f98983187b5-kube-api-access-s2nmj\") pod \"community-operators-scrhn\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.197726 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:23 crc kubenswrapper[4707]: I0121 16:54:23.575583 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-scrhn"] Jan 21 16:54:24 crc kubenswrapper[4707]: I0121 16:54:24.393874 4707 generic.go:334] "Generic (PLEG): container finished" podID="978db14a-3379-4738-bad0-5f98983187b5" containerID="1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09" exitCode=0 Jan 21 16:54:24 crc kubenswrapper[4707]: I0121 16:54:24.393969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-scrhn" event={"ID":"978db14a-3379-4738-bad0-5f98983187b5","Type":"ContainerDied","Data":"1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09"} Jan 21 16:54:24 crc kubenswrapper[4707]: I0121 16:54:24.394082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-scrhn" event={"ID":"978db14a-3379-4738-bad0-5f98983187b5","Type":"ContainerStarted","Data":"dae1408eb59e26f02b5aa6e8734846eaf1a77793e5817bcbe0c76431079249cf"} Jan 21 16:54:25 crc kubenswrapper[4707]: I0121 16:54:25.401134 4707 generic.go:334] "Generic (PLEG): container finished" podID="978db14a-3379-4738-bad0-5f98983187b5" containerID="980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985" exitCode=0 Jan 21 16:54:25 crc kubenswrapper[4707]: I0121 16:54:25.401221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-scrhn" event={"ID":"978db14a-3379-4738-bad0-5f98983187b5","Type":"ContainerDied","Data":"980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985"} Jan 21 16:54:26 crc kubenswrapper[4707]: I0121 16:54:26.409281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-scrhn" event={"ID":"978db14a-3379-4738-bad0-5f98983187b5","Type":"ContainerStarted","Data":"c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f"} Jan 21 16:54:27 crc kubenswrapper[4707]: I0121 16:54:27.059828 4707 scope.go:117] "RemoveContainer" containerID="af4a69dd8aab43c13c0e98d349cd0f7479c96679105a5445727c3bd74d232cae" Jan 21 16:54:27 crc kubenswrapper[4707]: I0121 16:54:27.080488 4707 scope.go:117] "RemoveContainer" containerID="186e5df06818c2f4cfd4a65f824d3b10610c975fe6d9fa95c57133574cf4813b" Jan 21 16:54:27 crc kubenswrapper[4707]: I0121 16:54:27.094310 4707 scope.go:117] "RemoveContainer" containerID="106d39f128fd03a392d03c293dcbac530cb94e4716a79fc044bb336104c2b4ac" Jan 21 16:54:27 crc kubenswrapper[4707]: I0121 16:54:27.112830 4707 scope.go:117] "RemoveContainer" containerID="4ef9673b97ec56781314ebbeaab5d8c2c42d67e22afd9a61ab07005a6924f7a1" Jan 21 16:54:27 crc kubenswrapper[4707]: I0121 16:54:27.134829 4707 scope.go:117] "RemoveContainer" containerID="3f2faa48c47cdc577df29341b657c2ed9bfe053027a11493609233bc6ced6061" Jan 21 16:54:27 crc kubenswrapper[4707]: I0121 16:54:27.165241 4707 scope.go:117] "RemoveContainer" containerID="d4f0f8b2087b1f9c9a9997adada6fb5dbb95b70117812f6797b39b925d3ded1c" Jan 21 16:54:33 crc kubenswrapper[4707]: I0121 16:54:33.197844 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:33 crc kubenswrapper[4707]: I0121 16:54:33.198179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:33 crc kubenswrapper[4707]: I0121 
16:54:33.226293 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:33 crc kubenswrapper[4707]: I0121 16:54:33.241660 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-scrhn" podStartSLOduration=9.738514768 podStartE2EDuration="11.241646251s" podCreationTimestamp="2026-01-21 16:54:22 +0000 UTC" firstStartedPulling="2026-01-21 16:54:24.395013115 +0000 UTC m=+6761.576529337" lastFinishedPulling="2026-01-21 16:54:25.898144598 +0000 UTC m=+6763.079660820" observedRunningTime="2026-01-21 16:54:26.429350403 +0000 UTC m=+6763.610866625" watchObservedRunningTime="2026-01-21 16:54:33.241646251 +0000 UTC m=+6770.423162473" Jan 21 16:54:33 crc kubenswrapper[4707]: I0121 16:54:33.476333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:33 crc kubenswrapper[4707]: I0121 16:54:33.508383 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-scrhn"] Jan 21 16:54:34 crc kubenswrapper[4707]: I0121 16:54:34.182721 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:54:34 crc kubenswrapper[4707]: E0121 16:54:34.182941 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.458596 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-scrhn" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="registry-server" containerID="cri-o://c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f" gracePeriod=2 Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.781964 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.978858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-utilities\") pod \"978db14a-3379-4738-bad0-5f98983187b5\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.978902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2nmj\" (UniqueName: \"kubernetes.io/projected/978db14a-3379-4738-bad0-5f98983187b5-kube-api-access-s2nmj\") pod \"978db14a-3379-4738-bad0-5f98983187b5\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.978933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-catalog-content\") pod \"978db14a-3379-4738-bad0-5f98983187b5\" (UID: \"978db14a-3379-4738-bad0-5f98983187b5\") " Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.979643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-utilities" (OuterVolumeSpecName: "utilities") pod "978db14a-3379-4738-bad0-5f98983187b5" (UID: "978db14a-3379-4738-bad0-5f98983187b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:54:35 crc kubenswrapper[4707]: I0121 16:54:35.983611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978db14a-3379-4738-bad0-5f98983187b5-kube-api-access-s2nmj" (OuterVolumeSpecName: "kube-api-access-s2nmj") pod "978db14a-3379-4738-bad0-5f98983187b5" (UID: "978db14a-3379-4738-bad0-5f98983187b5"). InnerVolumeSpecName "kube-api-access-s2nmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.014401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "978db14a-3379-4738-bad0-5f98983187b5" (UID: "978db14a-3379-4738-bad0-5f98983187b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.080527 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.080556 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2nmj\" (UniqueName: \"kubernetes.io/projected/978db14a-3379-4738-bad0-5f98983187b5-kube-api-access-s2nmj\") on node \"crc\" DevicePath \"\"" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.080569 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978db14a-3379-4738-bad0-5f98983187b5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.465690 4707 generic.go:334] "Generic (PLEG): container finished" podID="978db14a-3379-4738-bad0-5f98983187b5" containerID="c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f" exitCode=0 Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.465731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-scrhn" event={"ID":"978db14a-3379-4738-bad0-5f98983187b5","Type":"ContainerDied","Data":"c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f"} Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.465759 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-scrhn" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.465783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-scrhn" event={"ID":"978db14a-3379-4738-bad0-5f98983187b5","Type":"ContainerDied","Data":"dae1408eb59e26f02b5aa6e8734846eaf1a77793e5817bcbe0c76431079249cf"} Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.465801 4707 scope.go:117] "RemoveContainer" containerID="c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.479220 4707 scope.go:117] "RemoveContainer" containerID="980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.488638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-scrhn"] Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.492708 4707 scope.go:117] "RemoveContainer" containerID="1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.493078 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-scrhn"] Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.511229 4707 scope.go:117] "RemoveContainer" containerID="c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f" Jan 21 16:54:36 crc kubenswrapper[4707]: E0121 16:54:36.511503 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f\": container with ID starting with c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f not found: ID does not exist" containerID="c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.511535 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f"} err="failed to get container status \"c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f\": rpc error: code = NotFound desc = could not find container \"c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f\": container with ID starting with c97f4f534667a201e5f2957540d4415adba2b04a4436762a61ead08edff8f18f not found: ID does not exist" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.511552 4707 scope.go:117] "RemoveContainer" containerID="980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985" Jan 21 16:54:36 crc kubenswrapper[4707]: E0121 16:54:36.511775 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985\": container with ID starting with 980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985 not found: ID does not exist" containerID="980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.511797 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985"} err="failed to get container status \"980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985\": rpc error: code = NotFound desc = could not find container \"980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985\": container with ID starting with 980c42d9e2e63ec4da2119b599c57f12b5921c49c408855462a3123cd3429985 not found: ID does not exist" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.511829 4707 scope.go:117] "RemoveContainer" containerID="1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09" Jan 21 16:54:36 crc kubenswrapper[4707]: E0121 16:54:36.512087 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09\": container with ID starting with 1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09 not found: ID does not exist" containerID="1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09" Jan 21 16:54:36 crc kubenswrapper[4707]: I0121 16:54:36.512107 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09"} err="failed to get container status \"1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09\": rpc error: code = NotFound desc = could not find container \"1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09\": container with ID starting with 1798724fdf10b75642ae6d4c503342e50d9988b69265d248335fd87d79bf1f09 not found: ID does not exist" Jan 21 16:54:37 crc kubenswrapper[4707]: I0121 16:54:37.198679 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978db14a-3379-4738-bad0-5f98983187b5" path="/var/lib/kubelet/pods/978db14a-3379-4738-bad0-5f98983187b5/volumes" Jan 21 16:54:49 crc kubenswrapper[4707]: I0121 16:54:49.182337 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:54:49 crc kubenswrapper[4707]: E0121 16:54:49.183052 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:55:01 crc kubenswrapper[4707]: I0121 16:55:01.182451 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:55:01 crc kubenswrapper[4707]: E0121 16:55:01.183338 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.556361 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fcw8r"] Jan 21 16:55:07 crc kubenswrapper[4707]: E0121 16:55:07.556983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="extract-content" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.556995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="extract-content" Jan 21 16:55:07 crc kubenswrapper[4707]: E0121 16:55:07.557011 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="registry-server" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.557016 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="registry-server" Jan 21 16:55:07 crc kubenswrapper[4707]: E0121 16:55:07.557033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="extract-utilities" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.557039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="extract-utilities" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.557154 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="978db14a-3379-4738-bad0-5f98983187b5" containerName="registry-server" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.557970 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.562973 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fcw8r"] Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.608334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-utilities\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.608424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-catalog-content\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.608464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c57gq\" (UniqueName: \"kubernetes.io/projected/eb9ba821-b208-4795-9855-770607202fbe-kube-api-access-c57gq\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.709896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-utilities\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.710238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-catalog-content\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.710289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c57gq\" (UniqueName: \"kubernetes.io/projected/eb9ba821-b208-4795-9855-770607202fbe-kube-api-access-c57gq\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.710330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-utilities\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.710760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-catalog-content\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.726968 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c57gq\" (UniqueName: \"kubernetes.io/projected/eb9ba821-b208-4795-9855-770607202fbe-kube-api-access-c57gq\") pod \"certified-operators-fcw8r\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:07 crc kubenswrapper[4707]: I0121 16:55:07.871509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:08 crc kubenswrapper[4707]: I0121 16:55:08.289107 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fcw8r"] Jan 21 16:55:08 crc kubenswrapper[4707]: I0121 16:55:08.654768 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb9ba821-b208-4795-9855-770607202fbe" containerID="dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3" exitCode=0 Jan 21 16:55:08 crc kubenswrapper[4707]: I0121 16:55:08.654832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerDied","Data":"dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3"} Jan 21 16:55:08 crc kubenswrapper[4707]: I0121 16:55:08.654895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerStarted","Data":"b215b75b600d38f0fd0b83a27c8aedd215151e0f2008740515d3d9fa3edc2cd5"} Jan 21 16:55:09 crc kubenswrapper[4707]: I0121 16:55:09.662662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerStarted","Data":"9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3"} Jan 21 16:55:10 crc kubenswrapper[4707]: I0121 16:55:10.681148 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb9ba821-b208-4795-9855-770607202fbe" containerID="9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3" exitCode=0 Jan 21 16:55:10 crc kubenswrapper[4707]: I0121 16:55:10.681314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerDied","Data":"9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3"} Jan 21 16:55:11 crc kubenswrapper[4707]: I0121 16:55:11.693023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerStarted","Data":"b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87"} Jan 21 16:55:11 crc kubenswrapper[4707]: I0121 16:55:11.710986 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fcw8r" podStartSLOduration=2.215639721 podStartE2EDuration="4.710968991s" podCreationTimestamp="2026-01-21 16:55:07 +0000 UTC" firstStartedPulling="2026-01-21 16:55:08.656741475 +0000 UTC m=+6805.838257698" lastFinishedPulling="2026-01-21 16:55:11.152070746 +0000 UTC m=+6808.333586968" observedRunningTime="2026-01-21 16:55:11.706625012 +0000 UTC m=+6808.888141234" watchObservedRunningTime="2026-01-21 16:55:11.710968991 +0000 UTC m=+6808.892485213" Jan 21 16:55:13 crc kubenswrapper[4707]: I0121 16:55:13.185611 4707 scope.go:117] "RemoveContainer" 
containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:55:13 crc kubenswrapper[4707]: E0121 16:55:13.186075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:55:17 crc kubenswrapper[4707]: I0121 16:55:17.871830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:17 crc kubenswrapper[4707]: I0121 16:55:17.872439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:17 crc kubenswrapper[4707]: I0121 16:55:17.901045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:18 crc kubenswrapper[4707]: I0121 16:55:18.757272 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:18 crc kubenswrapper[4707]: I0121 16:55:18.788247 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fcw8r"] Jan 21 16:55:20 crc kubenswrapper[4707]: I0121 16:55:20.740844 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fcw8r" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="registry-server" containerID="cri-o://b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87" gracePeriod=2 Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.082643 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.086196 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-catalog-content\") pod \"eb9ba821-b208-4795-9855-770607202fbe\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.086284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-utilities\") pod \"eb9ba821-b208-4795-9855-770607202fbe\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.086308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c57gq\" (UniqueName: \"kubernetes.io/projected/eb9ba821-b208-4795-9855-770607202fbe-kube-api-access-c57gq\") pod \"eb9ba821-b208-4795-9855-770607202fbe\" (UID: \"eb9ba821-b208-4795-9855-770607202fbe\") " Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.087047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-utilities" (OuterVolumeSpecName: "utilities") pod "eb9ba821-b208-4795-9855-770607202fbe" (UID: "eb9ba821-b208-4795-9855-770607202fbe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.094211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9ba821-b208-4795-9855-770607202fbe-kube-api-access-c57gq" (OuterVolumeSpecName: "kube-api-access-c57gq") pod "eb9ba821-b208-4795-9855-770607202fbe" (UID: "eb9ba821-b208-4795-9855-770607202fbe"). InnerVolumeSpecName "kube-api-access-c57gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.121878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb9ba821-b208-4795-9855-770607202fbe" (UID: "eb9ba821-b208-4795-9855-770607202fbe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.187526 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.187554 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9ba821-b208-4795-9855-770607202fbe-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.187564 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c57gq\" (UniqueName: \"kubernetes.io/projected/eb9ba821-b208-4795-9855-770607202fbe-kube-api-access-c57gq\") on node \"crc\" DevicePath \"\"" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.748946 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb9ba821-b208-4795-9855-770607202fbe" containerID="b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87" exitCode=0 Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.749034 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fcw8r" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.749028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerDied","Data":"b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87"} Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.749379 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fcw8r" event={"ID":"eb9ba821-b208-4795-9855-770607202fbe","Type":"ContainerDied","Data":"b215b75b600d38f0fd0b83a27c8aedd215151e0f2008740515d3d9fa3edc2cd5"} Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.749397 4707 scope.go:117] "RemoveContainer" containerID="b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.766608 4707 scope.go:117] "RemoveContainer" containerID="9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.767718 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fcw8r"] Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.773562 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fcw8r"] Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.796295 4707 scope.go:117] "RemoveContainer" containerID="dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.807949 4707 scope.go:117] "RemoveContainer" containerID="b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87" Jan 21 16:55:21 crc kubenswrapper[4707]: E0121 16:55:21.808193 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87\": container with ID starting with b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87 not found: ID does not exist" containerID="b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.808229 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87"} err="failed to get container status \"b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87\": rpc error: code = NotFound desc = could not find container \"b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87\": container with ID starting with b78e7816e4f35b65773bb128290d87eb053c08ed881eabb1cb10f2e360ab2d87 not found: ID does not exist" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.808252 4707 scope.go:117] "RemoveContainer" containerID="9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3" Jan 21 16:55:21 crc kubenswrapper[4707]: E0121 16:55:21.808452 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3\": container with ID starting with 9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3 not found: ID does not exist" containerID="9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.808477 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3"} err="failed to get container status \"9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3\": rpc error: code = NotFound desc = could not find container \"9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3\": container with ID starting with 9e924887bd572b969453c7809dda574a681e1b3438006416909c5a6679f872e3 not found: ID does not exist" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.808493 4707 scope.go:117] "RemoveContainer" containerID="dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3" Jan 21 16:55:21 crc kubenswrapper[4707]: E0121 16:55:21.808686 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3\": container with ID starting with dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3 not found: ID does not exist" containerID="dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3" Jan 21 16:55:21 crc kubenswrapper[4707]: I0121 16:55:21.808721 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3"} err="failed to get container status \"dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3\": rpc error: code = NotFound desc = could not find container \"dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3\": container with ID starting with dffa6c3c6f6dafb187ec9d681f800decccb938720b7249ded57313d83f83a7b3 not found: ID does not exist" Jan 21 16:55:23 crc kubenswrapper[4707]: I0121 16:55:23.189071 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9ba821-b208-4795-9855-770607202fbe" path="/var/lib/kubelet/pods/eb9ba821-b208-4795-9855-770607202fbe/volumes" Jan 21 16:55:24 crc kubenswrapper[4707]: I0121 16:55:24.183153 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:55:24 crc kubenswrapper[4707]: E0121 16:55:24.183411 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.318305 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lgsmw"] Jan 21 16:55:25 crc kubenswrapper[4707]: E0121 16:55:25.318553 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="extract-utilities" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.318564 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="extract-utilities" Jan 21 16:55:25 crc kubenswrapper[4707]: E0121 16:55:25.318584 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="registry-server" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.318589 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="registry-server" Jan 21 16:55:25 crc kubenswrapper[4707]: E0121 16:55:25.318603 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="extract-content" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.318608 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="extract-content" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.318719 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9ba821-b208-4795-9855-770607202fbe" containerName="registry-server" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.319527 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.327450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lgsmw"] Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.443081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-catalog-content\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.443205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9zv\" (UniqueName: \"kubernetes.io/projected/d2448186-2a6b-444e-ad13-8ab1351a0ff0-kube-api-access-lp9zv\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.443336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-utilities\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.544255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-catalog-content\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.544322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9zv\" (UniqueName: \"kubernetes.io/projected/d2448186-2a6b-444e-ad13-8ab1351a0ff0-kube-api-access-lp9zv\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.544365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-utilities\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.544664 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-catalog-content\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.544681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-utilities\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.559625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9zv\" (UniqueName: \"kubernetes.io/projected/d2448186-2a6b-444e-ad13-8ab1351a0ff0-kube-api-access-lp9zv\") pod \"redhat-operators-lgsmw\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:25 crc kubenswrapper[4707]: I0121 16:55:25.633199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:26 crc kubenswrapper[4707]: I0121 16:55:26.007665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lgsmw"] Jan 21 16:55:26 crc kubenswrapper[4707]: I0121 16:55:26.781329 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerID="0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb" exitCode=0 Jan 21 16:55:26 crc kubenswrapper[4707]: I0121 16:55:26.781369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerDied","Data":"0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb"} Jan 21 16:55:26 crc kubenswrapper[4707]: I0121 16:55:26.781391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerStarted","Data":"e31cd3e4308f0cb921d187af2355fcb8512f9843bcebb05ef1fd17f5aa540b80"} Jan 21 16:55:27 crc kubenswrapper[4707]: I0121 16:55:27.242388 4707 scope.go:117] "RemoveContainer" containerID="d8b5087c02819680e390ad13aa383cdf80c741fed46c4de6a10f047d53d26363" Jan 21 16:55:27 crc kubenswrapper[4707]: I0121 16:55:27.263912 4707 scope.go:117] "RemoveContainer" containerID="1e212fe1a7fca1161a4a14cf842a06a809326463b8d491a3f0065103ce125e05" Jan 21 16:55:27 crc kubenswrapper[4707]: I0121 16:55:27.278910 4707 scope.go:117] "RemoveContainer" containerID="0f7bef4c4ef0f200ad7fc53fa35e3b69caa6fe28ebc38e14c973498b08efc238" Jan 21 16:55:27 crc kubenswrapper[4707]: I0121 16:55:27.299253 4707 scope.go:117] "RemoveContainer" containerID="c7d93d7e4a6575a30558d81e0c158cf1771266c6b875b91e5fcf5c9e597294c3" Jan 21 16:55:27 crc kubenswrapper[4707]: I0121 16:55:27.321456 4707 scope.go:117] "RemoveContainer" containerID="99aa7065f55b2a4a65ff93e54f406e9dcce7ea05b75e6a2c3832f075fdff6166" Jan 21 16:55:27 crc kubenswrapper[4707]: I0121 16:55:27.788902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerStarted","Data":"21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1"} 
Jan 21 16:55:28 crc kubenswrapper[4707]: I0121 16:55:28.797050 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerID="21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1" exitCode=0 Jan 21 16:55:28 crc kubenswrapper[4707]: I0121 16:55:28.797237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerDied","Data":"21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1"} Jan 21 16:55:29 crc kubenswrapper[4707]: I0121 16:55:29.804645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerStarted","Data":"d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e"} Jan 21 16:55:29 crc kubenswrapper[4707]: I0121 16:55:29.819001 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lgsmw" podStartSLOduration=2.224364182 podStartE2EDuration="4.818987173s" podCreationTimestamp="2026-01-21 16:55:25 +0000 UTC" firstStartedPulling="2026-01-21 16:55:26.782741782 +0000 UTC m=+6823.964258004" lastFinishedPulling="2026-01-21 16:55:29.377364773 +0000 UTC m=+6826.558880995" observedRunningTime="2026-01-21 16:55:29.815593824 +0000 UTC m=+6826.997110046" watchObservedRunningTime="2026-01-21 16:55:29.818987173 +0000 UTC m=+6827.000503396" Jan 21 16:55:35 crc kubenswrapper[4707]: I0121 16:55:35.634371 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:35 crc kubenswrapper[4707]: I0121 16:55:35.634702 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:35 crc kubenswrapper[4707]: I0121 16:55:35.663160 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:35 crc kubenswrapper[4707]: I0121 16:55:35.866470 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:35 crc kubenswrapper[4707]: I0121 16:55:35.896637 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lgsmw"] Jan 21 16:55:36 crc kubenswrapper[4707]: I0121 16:55:36.182065 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:55:36 crc kubenswrapper[4707]: E0121 16:55:36.182240 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:55:37 crc kubenswrapper[4707]: I0121 16:55:37.847159 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lgsmw" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="registry-server" containerID="cri-o://d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e" gracePeriod=2 Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.147942 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.210126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp9zv\" (UniqueName: \"kubernetes.io/projected/d2448186-2a6b-444e-ad13-8ab1351a0ff0-kube-api-access-lp9zv\") pod \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.210191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-utilities\") pod \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.210260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-catalog-content\") pod \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\" (UID: \"d2448186-2a6b-444e-ad13-8ab1351a0ff0\") " Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.210936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-utilities" (OuterVolumeSpecName: "utilities") pod "d2448186-2a6b-444e-ad13-8ab1351a0ff0" (UID: "d2448186-2a6b-444e-ad13-8ab1351a0ff0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.215165 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2448186-2a6b-444e-ad13-8ab1351a0ff0-kube-api-access-lp9zv" (OuterVolumeSpecName: "kube-api-access-lp9zv") pod "d2448186-2a6b-444e-ad13-8ab1351a0ff0" (UID: "d2448186-2a6b-444e-ad13-8ab1351a0ff0"). InnerVolumeSpecName "kube-api-access-lp9zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.308469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2448186-2a6b-444e-ad13-8ab1351a0ff0" (UID: "d2448186-2a6b-444e-ad13-8ab1351a0ff0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.311474 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp9zv\" (UniqueName: \"kubernetes.io/projected/d2448186-2a6b-444e-ad13-8ab1351a0ff0-kube-api-access-lp9zv\") on node \"crc\" DevicePath \"\"" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.311512 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.311524 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2448186-2a6b-444e-ad13-8ab1351a0ff0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.854068 4707 generic.go:334] "Generic (PLEG): container finished" podID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerID="d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e" exitCode=0 Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.854107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerDied","Data":"d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e"} Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.854129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgsmw" event={"ID":"d2448186-2a6b-444e-ad13-8ab1351a0ff0","Type":"ContainerDied","Data":"e31cd3e4308f0cb921d187af2355fcb8512f9843bcebb05ef1fd17f5aa540b80"} Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.854144 4707 scope.go:117] "RemoveContainer" containerID="d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.854253 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lgsmw" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.869353 4707 scope.go:117] "RemoveContainer" containerID="21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.875766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lgsmw"] Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.880340 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lgsmw"] Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.901133 4707 scope.go:117] "RemoveContainer" containerID="0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.914901 4707 scope.go:117] "RemoveContainer" containerID="d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e" Jan 21 16:55:38 crc kubenswrapper[4707]: E0121 16:55:38.915192 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e\": container with ID starting with d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e not found: ID does not exist" containerID="d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.915286 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e"} err="failed to get container status \"d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e\": rpc error: code = NotFound desc = could not find container \"d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e\": container with ID starting with d3528ecb41c15154590800fa77399df0d8e71a51437f26c96ee1f396e130958e not found: ID does not exist" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.915366 4707 scope.go:117] "RemoveContainer" containerID="21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1" Jan 21 16:55:38 crc kubenswrapper[4707]: E0121 16:55:38.915637 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1\": container with ID starting with 21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1 not found: ID does not exist" containerID="21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.915665 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1"} err="failed to get container status \"21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1\": rpc error: code = NotFound desc = could not find container \"21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1\": container with ID starting with 21baa0762c0d56eb5f1ec03ea4e650caf17bae658c32744676f3ed807d4e32e1 not found: ID does not exist" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.915684 4707 scope.go:117] "RemoveContainer" containerID="0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb" Jan 21 16:55:38 crc kubenswrapper[4707]: E0121 16:55:38.915872 4707 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb\": container with ID starting with 0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb not found: ID does not exist" containerID="0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb" Jan 21 16:55:38 crc kubenswrapper[4707]: I0121 16:55:38.915893 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb"} err="failed to get container status \"0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb\": rpc error: code = NotFound desc = could not find container \"0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb\": container with ID starting with 0cb8b7aa7bc7b02871afe844456d70b9c79590e9062ccd061c4ca68bec9fdbdb not found: ID does not exist" Jan 21 16:55:39 crc kubenswrapper[4707]: I0121 16:55:39.188461 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" path="/var/lib/kubelet/pods/d2448186-2a6b-444e-ad13-8ab1351a0ff0/volumes" Jan 21 16:55:50 crc kubenswrapper[4707]: I0121 16:55:50.182768 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:55:50 crc kubenswrapper[4707]: E0121 16:55:50.183275 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:56:03 crc kubenswrapper[4707]: I0121 16:56:03.190323 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:56:03 crc kubenswrapper[4707]: E0121 16:56:03.191647 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.702573 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cbg26"] Jan 21 16:56:16 crc kubenswrapper[4707]: E0121 16:56:16.703179 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="registry-server" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.703191 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="registry-server" Jan 21 16:56:16 crc kubenswrapper[4707]: E0121 16:56:16.703213 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="extract-utilities" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.703220 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="extract-utilities" Jan 21 16:56:16 crc kubenswrapper[4707]: E0121 
16:56:16.703236 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="extract-content" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.703242 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="extract-content" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.703350 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2448186-2a6b-444e-ad13-8ab1351a0ff0" containerName="registry-server" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.704177 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.712410 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbg26"] Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.794225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-utilities\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.794332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-catalog-content\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.794382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67ll\" (UniqueName: \"kubernetes.io/projected/ad174e95-0601-4103-a847-b1e33cb26da5-kube-api-access-v67ll\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.895546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-catalog-content\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.895617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67ll\" (UniqueName: \"kubernetes.io/projected/ad174e95-0601-4103-a847-b1e33cb26da5-kube-api-access-v67ll\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.895645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-utilities\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.896336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-utilities\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.896338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-catalog-content\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:16 crc kubenswrapper[4707]: I0121 16:56:16.912166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67ll\" (UniqueName: \"kubernetes.io/projected/ad174e95-0601-4103-a847-b1e33cb26da5-kube-api-access-v67ll\") pod \"redhat-marketplace-cbg26\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:17 crc kubenswrapper[4707]: I0121 16:56:17.025082 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:17 crc kubenswrapper[4707]: I0121 16:56:17.182705 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:56:17 crc kubenswrapper[4707]: E0121 16:56:17.183065 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:56:17 crc kubenswrapper[4707]: I0121 16:56:17.410140 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbg26"] Jan 21 16:56:18 crc kubenswrapper[4707]: I0121 16:56:18.058889 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad174e95-0601-4103-a847-b1e33cb26da5" containerID="edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650" exitCode=0 Jan 21 16:56:18 crc kubenswrapper[4707]: I0121 16:56:18.059094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbg26" event={"ID":"ad174e95-0601-4103-a847-b1e33cb26da5","Type":"ContainerDied","Data":"edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650"} Jan 21 16:56:18 crc kubenswrapper[4707]: I0121 16:56:18.059118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbg26" event={"ID":"ad174e95-0601-4103-a847-b1e33cb26da5","Type":"ContainerStarted","Data":"63bd9dab2de120bb48efa9be061dab9c2a232fae6d11eae4628074663141d2aa"} Jan 21 16:56:18 crc kubenswrapper[4707]: I0121 16:56:18.060360 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:56:19 crc kubenswrapper[4707]: I0121 16:56:19.067851 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad174e95-0601-4103-a847-b1e33cb26da5" containerID="de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9" exitCode=0 Jan 21 16:56:19 crc kubenswrapper[4707]: I0121 16:56:19.067926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbg26" 
event={"ID":"ad174e95-0601-4103-a847-b1e33cb26da5","Type":"ContainerDied","Data":"de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9"} Jan 21 16:56:20 crc kubenswrapper[4707]: I0121 16:56:20.075293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbg26" event={"ID":"ad174e95-0601-4103-a847-b1e33cb26da5","Type":"ContainerStarted","Data":"9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e"} Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.025767 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.026010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.054389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.068746 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cbg26" podStartSLOduration=9.569812829 podStartE2EDuration="11.068731261s" podCreationTimestamp="2026-01-21 16:56:16 +0000 UTC" firstStartedPulling="2026-01-21 16:56:18.060149305 +0000 UTC m=+6875.241665527" lastFinishedPulling="2026-01-21 16:56:19.559067736 +0000 UTC m=+6876.740583959" observedRunningTime="2026-01-21 16:56:20.089210983 +0000 UTC m=+6877.270727206" watchObservedRunningTime="2026-01-21 16:56:27.068731261 +0000 UTC m=+6884.250247483" Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.137528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.278274 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbg26"] Jan 21 16:56:27 crc kubenswrapper[4707]: I0121 16:56:27.420647 4707 scope.go:117] "RemoveContainer" containerID="b9ad976554bf0968c748ae6ea63fa245a604051ce377ba7ca72ab9226dd272b0" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.120217 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cbg26" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="registry-server" containerID="cri-o://9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e" gracePeriod=2 Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.436703 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.546958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-catalog-content\") pod \"ad174e95-0601-4103-a847-b1e33cb26da5\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.547002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v67ll\" (UniqueName: \"kubernetes.io/projected/ad174e95-0601-4103-a847-b1e33cb26da5-kube-api-access-v67ll\") pod \"ad174e95-0601-4103-a847-b1e33cb26da5\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.547066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-utilities\") pod \"ad174e95-0601-4103-a847-b1e33cb26da5\" (UID: \"ad174e95-0601-4103-a847-b1e33cb26da5\") " Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.547949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-utilities" (OuterVolumeSpecName: "utilities") pod "ad174e95-0601-4103-a847-b1e33cb26da5" (UID: "ad174e95-0601-4103-a847-b1e33cb26da5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.553372 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad174e95-0601-4103-a847-b1e33cb26da5-kube-api-access-v67ll" (OuterVolumeSpecName: "kube-api-access-v67ll") pod "ad174e95-0601-4103-a847-b1e33cb26da5" (UID: "ad174e95-0601-4103-a847-b1e33cb26da5"). InnerVolumeSpecName "kube-api-access-v67ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.564146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad174e95-0601-4103-a847-b1e33cb26da5" (UID: "ad174e95-0601-4103-a847-b1e33cb26da5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.648051 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.648083 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad174e95-0601-4103-a847-b1e33cb26da5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:56:29 crc kubenswrapper[4707]: I0121 16:56:29.648096 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v67ll\" (UniqueName: \"kubernetes.io/projected/ad174e95-0601-4103-a847-b1e33cb26da5-kube-api-access-v67ll\") on node \"crc\" DevicePath \"\"" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.127430 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad174e95-0601-4103-a847-b1e33cb26da5" containerID="9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e" exitCode=0 Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.127478 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbg26" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.127493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbg26" event={"ID":"ad174e95-0601-4103-a847-b1e33cb26da5","Type":"ContainerDied","Data":"9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e"} Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.127823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbg26" event={"ID":"ad174e95-0601-4103-a847-b1e33cb26da5","Type":"ContainerDied","Data":"63bd9dab2de120bb48efa9be061dab9c2a232fae6d11eae4628074663141d2aa"} Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.127863 4707 scope.go:117] "RemoveContainer" containerID="9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.142059 4707 scope.go:117] "RemoveContainer" containerID="de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.151234 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbg26"] Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.155594 4707 scope.go:117] "RemoveContainer" containerID="edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.155674 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbg26"] Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.177404 4707 scope.go:117] "RemoveContainer" containerID="9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e" Jan 21 16:56:30 crc kubenswrapper[4707]: E0121 16:56:30.177786 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e\": container with ID starting with 9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e not found: ID does not exist" containerID="9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.177833 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e"} err="failed to get container status \"9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e\": rpc error: code = NotFound desc = could not find container \"9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e\": container with ID starting with 9f2fd02067fea7cff509ef9b3e59c16ce8171b9bb2e41aa8f2e40111e85b702e not found: ID does not exist" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.177853 4707 scope.go:117] "RemoveContainer" containerID="de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9" Jan 21 16:56:30 crc kubenswrapper[4707]: E0121 16:56:30.178101 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9\": container with ID starting with de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9 not found: ID does not exist" containerID="de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.178141 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9"} err="failed to get container status \"de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9\": rpc error: code = NotFound desc = could not find container \"de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9\": container with ID starting with de4e06853216bb70353a163c14b2a71ddd7efcffc90aafa34fa937ce5cf26cb9 not found: ID does not exist" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.178156 4707 scope.go:117] "RemoveContainer" containerID="edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650" Jan 21 16:56:30 crc kubenswrapper[4707]: E0121 16:56:30.178385 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650\": container with ID starting with edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650 not found: ID does not exist" containerID="edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650" Jan 21 16:56:30 crc kubenswrapper[4707]: I0121 16:56:30.178407 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650"} err="failed to get container status \"edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650\": rpc error: code = NotFound desc = could not find container \"edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650\": container with ID starting with edfce1f1cf938d3ec6a0dd8d899a4cb3dc7abe7f1f5abe5cbf7be31fd7bfd650 not found: ID does not exist" Jan 21 16:56:31 crc kubenswrapper[4707]: I0121 16:56:31.188769 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" path="/var/lib/kubelet/pods/ad174e95-0601-4103-a847-b1e33cb26da5/volumes" Jan 21 16:56:32 crc kubenswrapper[4707]: I0121 16:56:32.183425 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:56:32 crc kubenswrapper[4707]: E0121 16:56:32.183780 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:56:44 crc kubenswrapper[4707]: I0121 16:56:44.183069 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:56:44 crc kubenswrapper[4707]: E0121 16:56:44.183771 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:56:59 crc kubenswrapper[4707]: I0121 16:56:59.182345 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:56:59 crc kubenswrapper[4707]: E0121 16:56:59.182868 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:57:11 crc kubenswrapper[4707]: I0121 16:57:11.182166 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:57:11 crc kubenswrapper[4707]: E0121 16:57:11.182614 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:57:25 crc kubenswrapper[4707]: I0121 16:57:25.182258 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:57:25 crc kubenswrapper[4707]: E0121 16:57:25.183074 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:57:37 crc kubenswrapper[4707]: I0121 16:57:37.182443 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:57:37 crc kubenswrapper[4707]: E0121 16:57:37.182979 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:57:49 crc kubenswrapper[4707]: I0121 16:57:49.182753 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:57:49 crc kubenswrapper[4707]: E0121 16:57:49.183533 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:58:01 crc kubenswrapper[4707]: I0121 16:58:01.183961 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:58:01 crc kubenswrapper[4707]: E0121 16:58:01.185511 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:58:16 crc kubenswrapper[4707]: I0121 16:58:16.182475 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:58:16 crc kubenswrapper[4707]: E0121 16:58:16.183073 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:58:29 crc kubenswrapper[4707]: I0121 16:58:29.182381 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:58:29 crc kubenswrapper[4707]: E0121 16:58:29.182961 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 16:58:42 crc kubenswrapper[4707]: I0121 16:58:42.182251 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 16:58:42 crc kubenswrapper[4707]: I0121 16:58:42.888859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"db6cb0803a4872ab9454e605e8d5382632c4ff38cb23724c972fdbbde4bc97c5"} Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.129561 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf"] Jan 21 17:00:00 crc kubenswrapper[4707]: E0121 17:00:00.130179 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="extract-content" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.130193 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="extract-content" Jan 21 17:00:00 crc kubenswrapper[4707]: E0121 17:00:00.130213 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="registry-server" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.130218 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="registry-server" Jan 21 17:00:00 crc kubenswrapper[4707]: E0121 17:00:00.130240 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="extract-utilities" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.130247 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="extract-utilities" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.130380 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad174e95-0601-4103-a847-b1e33cb26da5" containerName="registry-server" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.130852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.132585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.133427 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.135621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf"] Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.227380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed91b371-cea5-4c46-b979-5d5b8885b429-secret-volume\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.227531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed91b371-cea5-4c46-b979-5d5b8885b429-config-volume\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.227593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4jfq\" (UniqueName: \"kubernetes.io/projected/ed91b371-cea5-4c46-b979-5d5b8885b429-kube-api-access-x4jfq\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.328416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4jfq\" (UniqueName: \"kubernetes.io/projected/ed91b371-cea5-4c46-b979-5d5b8885b429-kube-api-access-x4jfq\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.328520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed91b371-cea5-4c46-b979-5d5b8885b429-secret-volume\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.328559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed91b371-cea5-4c46-b979-5d5b8885b429-config-volume\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.329269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed91b371-cea5-4c46-b979-5d5b8885b429-config-volume\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.333141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed91b371-cea5-4c46-b979-5d5b8885b429-secret-volume\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.342068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4jfq\" (UniqueName: \"kubernetes.io/projected/ed91b371-cea5-4c46-b979-5d5b8885b429-kube-api-access-x4jfq\") pod \"collect-profiles-29483580-lccbf\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.446375 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:00 crc kubenswrapper[4707]: I0121 17:00:00.820356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf"] Jan 21 17:00:01 crc kubenswrapper[4707]: I0121 17:00:01.293805 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed91b371-cea5-4c46-b979-5d5b8885b429" containerID="69b50b66d26bcab6958d7a1a0174010fcb4e9ee96097920378ca20ed63d6b63d" exitCode=0 Jan 21 17:00:01 crc kubenswrapper[4707]: I0121 17:00:01.293914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" event={"ID":"ed91b371-cea5-4c46-b979-5d5b8885b429","Type":"ContainerDied","Data":"69b50b66d26bcab6958d7a1a0174010fcb4e9ee96097920378ca20ed63d6b63d"} Jan 21 17:00:01 crc kubenswrapper[4707]: I0121 17:00:01.294009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" event={"ID":"ed91b371-cea5-4c46-b979-5d5b8885b429","Type":"ContainerStarted","Data":"c1e428d07924a813e36f495ac510fa3d610d0c82bdb002c98fdff9e35c7eb3df"} Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.473570 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.660619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed91b371-cea5-4c46-b979-5d5b8885b429-config-volume\") pod \"ed91b371-cea5-4c46-b979-5d5b8885b429\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.660724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4jfq\" (UniqueName: \"kubernetes.io/projected/ed91b371-cea5-4c46-b979-5d5b8885b429-kube-api-access-x4jfq\") pod \"ed91b371-cea5-4c46-b979-5d5b8885b429\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.660832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed91b371-cea5-4c46-b979-5d5b8885b429-secret-volume\") pod \"ed91b371-cea5-4c46-b979-5d5b8885b429\" (UID: \"ed91b371-cea5-4c46-b979-5d5b8885b429\") " Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.661944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed91b371-cea5-4c46-b979-5d5b8885b429-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed91b371-cea5-4c46-b979-5d5b8885b429" (UID: "ed91b371-cea5-4c46-b979-5d5b8885b429"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.666449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed91b371-cea5-4c46-b979-5d5b8885b429-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed91b371-cea5-4c46-b979-5d5b8885b429" (UID: "ed91b371-cea5-4c46-b979-5d5b8885b429"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.671004 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed91b371-cea5-4c46-b979-5d5b8885b429-kube-api-access-x4jfq" (OuterVolumeSpecName: "kube-api-access-x4jfq") pod "ed91b371-cea5-4c46-b979-5d5b8885b429" (UID: "ed91b371-cea5-4c46-b979-5d5b8885b429"). InnerVolumeSpecName "kube-api-access-x4jfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.763034 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed91b371-cea5-4c46-b979-5d5b8885b429-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.763065 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4jfq\" (UniqueName: \"kubernetes.io/projected/ed91b371-cea5-4c46-b979-5d5b8885b429-kube-api-access-x4jfq\") on node \"crc\" DevicePath \"\"" Jan 21 17:00:02 crc kubenswrapper[4707]: I0121 17:00:02.763076 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed91b371-cea5-4c46-b979-5d5b8885b429-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:00:03 crc kubenswrapper[4707]: I0121 17:00:03.305433 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" event={"ID":"ed91b371-cea5-4c46-b979-5d5b8885b429","Type":"ContainerDied","Data":"c1e428d07924a813e36f495ac510fa3d610d0c82bdb002c98fdff9e35c7eb3df"} Jan 21 17:00:03 crc kubenswrapper[4707]: I0121 17:00:03.305468 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1e428d07924a813e36f495ac510fa3d610d0c82bdb002c98fdff9e35c7eb3df" Jan 21 17:00:03 crc kubenswrapper[4707]: I0121 17:00:03.305478 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf" Jan 21 17:00:03 crc kubenswrapper[4707]: I0121 17:00:03.519513 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc"] Jan 21 17:00:03 crc kubenswrapper[4707]: I0121 17:00:03.523960 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-2nkhc"] Jan 21 17:00:05 crc kubenswrapper[4707]: I0121 17:00:05.189742 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7609d4eb-aad1-45f2-bc49-3b2322e8a817" path="/var/lib/kubelet/pods/7609d4eb-aad1-45f2-bc49-3b2322e8a817/volumes" Jan 21 17:00:16 crc kubenswrapper[4707]: I0121 17:00:16.018141 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb"] Jan 21 17:00:16 crc kubenswrapper[4707]: I0121 17:00:16.022256 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z7d44"] Jan 21 17:00:16 crc kubenswrapper[4707]: I0121 17:00:16.026336 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-78f6-account-create-update-dwbzb"] Jan 21 17:00:16 crc kubenswrapper[4707]: I0121 17:00:16.030247 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-z7d44"] Jan 21 17:00:17 crc kubenswrapper[4707]: I0121 17:00:17.189355 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460e0610-4ca8-448d-95c0-4f1201b17de4" path="/var/lib/kubelet/pods/460e0610-4ca8-448d-95c0-4f1201b17de4/volumes" Jan 21 17:00:17 crc kubenswrapper[4707]: I0121 17:00:17.190078 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86ce12d-c904-4998-a578-ee635058b026" path="/var/lib/kubelet/pods/a86ce12d-c904-4998-a578-ee635058b026/volumes" Jan 21 17:00:23 crc kubenswrapper[4707]: I0121 17:00:23.025202 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7zzl5"] Jan 21 17:00:23 crc kubenswrapper[4707]: I0121 17:00:23.029441 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7zzl5"] Jan 21 17:00:23 crc kubenswrapper[4707]: I0121 17:00:23.188832 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b78668-b085-4edb-b04f-dfdfdcde3ddb" path="/var/lib/kubelet/pods/70b78668-b085-4edb-b04f-dfdfdcde3ddb/volumes" Jan 21 17:00:27 crc kubenswrapper[4707]: I0121 17:00:27.496222 4707 scope.go:117] "RemoveContainer" containerID="90d214c80d3b90069950ba85216ca2464966246862650e81f576f0f7d31e97d4" Jan 21 17:00:27 crc kubenswrapper[4707]: I0121 17:00:27.519404 4707 scope.go:117] "RemoveContainer" containerID="23b2f57d9b2def74407ec6e56f332cefbc9a80c38a4134e57ea264ee71e2dcab" Jan 21 17:00:27 crc kubenswrapper[4707]: I0121 17:00:27.534392 4707 scope.go:117] "RemoveContainer" containerID="53165e463ab70fe75ad812e60768d1e163e8bc9465d3de20c8a2f755b46f59c3" Jan 21 17:00:27 crc kubenswrapper[4707]: I0121 17:00:27.551778 4707 scope.go:117] "RemoveContainer" containerID="f0b05456edefd34b6599529e3b34703d766d0113ca61f5aca013d8272a793a36" Jan 21 17:00:28 crc kubenswrapper[4707]: I0121 17:00:28.020071 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kwc58"] Jan 21 17:00:28 crc kubenswrapper[4707]: I0121 17:00:28.024607 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["keystone-kuttl-tests/keystone-bootstrap-kwc58"] Jan 21 17:00:29 crc kubenswrapper[4707]: I0121 17:00:29.188912 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d1f71b-6822-4297-ac9b-e51c3a620734" path="/var/lib/kubelet/pods/15d1f71b-6822-4297-ac9b-e51c3a620734/volumes" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.133325 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-cron-29483581-26vg8"] Jan 21 17:01:00 crc kubenswrapper[4707]: E0121 17:01:00.133978 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed91b371-cea5-4c46-b979-5d5b8885b429" containerName="collect-profiles" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.133990 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed91b371-cea5-4c46-b979-5d5b8885b429" containerName="collect-profiles" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.134104 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed91b371-cea5-4c46-b979-5d5b8885b429" containerName="collect-profiles" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.134560 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.137644 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-cron-29483581-26vg8"] Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.271913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-config-data\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.271971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-fernet-keys\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.272073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktcf\" (UniqueName: \"kubernetes.io/projected/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-kube-api-access-9ktcf\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.373826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-config-data\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.373893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-fernet-keys\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.373995 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktcf\" (UniqueName: \"kubernetes.io/projected/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-kube-api-access-9ktcf\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.379009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-config-data\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.380403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-fernet-keys\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.387162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktcf\" (UniqueName: \"kubernetes.io/projected/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-kube-api-access-9ktcf\") pod \"keystone-cron-29483581-26vg8\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.450084 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:00 crc kubenswrapper[4707]: I0121 17:01:00.819723 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-cron-29483581-26vg8"] Jan 21 17:01:00 crc kubenswrapper[4707]: W0121 17:01:00.827493 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5fe8a25_08a7_4f0b_b128_44069d9f21e7.slice/crio-d17c2e256e93476ddf514302cd5748ec0d1877ecc2a495cebfd2629dad714797 WatchSource:0}: Error finding container d17c2e256e93476ddf514302cd5748ec0d1877ecc2a495cebfd2629dad714797: Status 404 returned error can't find the container with id d17c2e256e93476ddf514302cd5748ec0d1877ecc2a495cebfd2629dad714797 Jan 21 17:01:01 crc kubenswrapper[4707]: I0121 17:01:01.633526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" event={"ID":"b5fe8a25-08a7-4f0b-b128-44069d9f21e7","Type":"ContainerStarted","Data":"0d619d30cc49613dbe8c030e0fd72c54f16a8b4127b1ea8654e6372efe9c1048"} Jan 21 17:01:01 crc kubenswrapper[4707]: I0121 17:01:01.633718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" event={"ID":"b5fe8a25-08a7-4f0b-b128-44069d9f21e7","Type":"ContainerStarted","Data":"d17c2e256e93476ddf514302cd5748ec0d1877ecc2a495cebfd2629dad714797"} Jan 21 17:01:01 crc kubenswrapper[4707]: I0121 17:01:01.646018 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" podStartSLOduration=1.6460059299999998 podStartE2EDuration="1.64600593s" podCreationTimestamp="2026-01-21 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:01:01.645093393 
+0000 UTC m=+7158.826609615" watchObservedRunningTime="2026-01-21 17:01:01.64600593 +0000 UTC m=+7158.827522151" Jan 21 17:01:02 crc kubenswrapper[4707]: I0121 17:01:02.640607 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5fe8a25-08a7-4f0b-b128-44069d9f21e7" containerID="0d619d30cc49613dbe8c030e0fd72c54f16a8b4127b1ea8654e6372efe9c1048" exitCode=0 Jan 21 17:01:02 crc kubenswrapper[4707]: I0121 17:01:02.640676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" event={"ID":"b5fe8a25-08a7-4f0b-b128-44069d9f21e7","Type":"ContainerDied","Data":"0d619d30cc49613dbe8c030e0fd72c54f16a8b4127b1ea8654e6372efe9c1048"} Jan 21 17:01:03 crc kubenswrapper[4707]: I0121 17:01:03.866512 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.035265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-fernet-keys\") pod \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.035317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ktcf\" (UniqueName: \"kubernetes.io/projected/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-kube-api-access-9ktcf\") pod \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.035410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-config-data\") pod \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\" (UID: \"b5fe8a25-08a7-4f0b-b128-44069d9f21e7\") " Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.039714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-kube-api-access-9ktcf" (OuterVolumeSpecName: "kube-api-access-9ktcf") pod "b5fe8a25-08a7-4f0b-b128-44069d9f21e7" (UID: "b5fe8a25-08a7-4f0b-b128-44069d9f21e7"). InnerVolumeSpecName "kube-api-access-9ktcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.039858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b5fe8a25-08a7-4f0b-b128-44069d9f21e7" (UID: "b5fe8a25-08a7-4f0b-b128-44069d9f21e7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.059523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-config-data" (OuterVolumeSpecName: "config-data") pod "b5fe8a25-08a7-4f0b-b128-44069d9f21e7" (UID: "b5fe8a25-08a7-4f0b-b128-44069d9f21e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.136827 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.136855 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.136867 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ktcf\" (UniqueName: \"kubernetes.io/projected/b5fe8a25-08a7-4f0b-b128-44069d9f21e7-kube-api-access-9ktcf\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.651767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" event={"ID":"b5fe8a25-08a7-4f0b-b128-44069d9f21e7","Type":"ContainerDied","Data":"d17c2e256e93476ddf514302cd5748ec0d1877ecc2a495cebfd2629dad714797"} Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.651804 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d17c2e256e93476ddf514302cd5748ec0d1877ecc2a495cebfd2629dad714797" Jan 21 17:01:04 crc kubenswrapper[4707]: I0121 17:01:04.651829 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cron-29483581-26vg8" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.452703 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.453207 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstackclient" podUID="3bdc328f-eaaf-404d-b9b3-85f63e623e20" containerName="openstackclient" containerID="cri-o://891bb12ad0a08f08014952944a9e2df3ff4f37f3c5601f366833505ff2821af8" gracePeriod=30 Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.674409 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bdc328f-eaaf-404d-b9b3-85f63e623e20" containerID="891bb12ad0a08f08014952944a9e2df3ff4f37f3c5601f366833505ff2821af8" exitCode=143 Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.674496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"3bdc328f-eaaf-404d-b9b3-85f63e623e20","Type":"ContainerDied","Data":"891bb12ad0a08f08014952944a9e2df3ff4f37f3c5601f366833505ff2821af8"} Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.761887 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.813506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkvms\" (UniqueName: \"kubernetes.io/projected/3bdc328f-eaaf-404d-b9b3-85f63e623e20-kube-api-access-lkvms\") pod \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.813542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config\") pod \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.813612 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config-secret\") pod \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\" (UID: \"3bdc328f-eaaf-404d-b9b3-85f63e623e20\") " Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.817928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdc328f-eaaf-404d-b9b3-85f63e623e20-kube-api-access-lkvms" (OuterVolumeSpecName: "kube-api-access-lkvms") pod "3bdc328f-eaaf-404d-b9b3-85f63e623e20" (UID: "3bdc328f-eaaf-404d-b9b3-85f63e623e20"). InnerVolumeSpecName "kube-api-access-lkvms". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.828040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3bdc328f-eaaf-404d-b9b3-85f63e623e20" (UID: "3bdc328f-eaaf-404d-b9b3-85f63e623e20"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.829097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3bdc328f-eaaf-404d-b9b3-85f63e623e20" (UID: "3bdc328f-eaaf-404d-b9b3-85f63e623e20"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.915154 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkvms\" (UniqueName: \"kubernetes.io/projected/3bdc328f-eaaf-404d-b9b3-85f63e623e20-kube-api-access-lkvms\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.915186 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:08 crc kubenswrapper[4707]: I0121 17:01:08.915198 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3bdc328f-eaaf-404d-b9b3-85f63e623e20-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.681543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"3bdc328f-eaaf-404d-b9b3-85f63e623e20","Type":"ContainerDied","Data":"c03014077384ee87d2dae538aace38e79c3cbdbc1f0e9aadd32e21a65e9dcf06"} Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.681757 4707 scope.go:117] "RemoveContainer" containerID="891bb12ad0a08f08014952944a9e2df3ff4f37f3c5601f366833505ff2821af8" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.681568 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.695089 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.699276 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.827205 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-cron-29483581-26vg8"] Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.832937 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-cron-29483581-26vg8"] Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.837513 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4"] Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.837739 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" podUID="0a0e45f5-2bb6-4b79-822d-b455aa3feab3" containerName="keystone-api" containerID="cri-o://c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf" gracePeriod=30 Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.890075 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone78f6-account-delete-km44h"] Jan 21 17:01:09 crc kubenswrapper[4707]: E0121 17:01:09.890381 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fe8a25-08a7-4f0b-b128-44069d9f21e7" containerName="keystone-cron" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.890397 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fe8a25-08a7-4f0b-b128-44069d9f21e7" containerName="keystone-cron" Jan 21 17:01:09 crc kubenswrapper[4707]: E0121 17:01:09.890431 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdc328f-eaaf-404d-b9b3-85f63e623e20" 
containerName="openstackclient" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.890437 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdc328f-eaaf-404d-b9b3-85f63e623e20" containerName="openstackclient" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.890557 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5fe8a25-08a7-4f0b-b128-44069d9f21e7" containerName="keystone-cron" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.890585 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdc328f-eaaf-404d-b9b3-85f63e623e20" containerName="openstackclient" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.891080 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.897155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone78f6-account-delete-km44h"] Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.927605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8m2f\" (UniqueName: \"kubernetes.io/projected/cbecfc81-be93-4b4a-8844-44b006977bf5-kube-api-access-v8m2f\") pod \"keystone78f6-account-delete-km44h\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.927669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbecfc81-be93-4b4a-8844-44b006977bf5-operator-scripts\") pod \"keystone78f6-account-delete-km44h\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.945366 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:01:09 crc kubenswrapper[4707]: I0121 17:01:09.945410 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.028737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8m2f\" (UniqueName: \"kubernetes.io/projected/cbecfc81-be93-4b4a-8844-44b006977bf5-kube-api-access-v8m2f\") pod \"keystone78f6-account-delete-km44h\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.028783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbecfc81-be93-4b4a-8844-44b006977bf5-operator-scripts\") pod \"keystone78f6-account-delete-km44h\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.029459 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbecfc81-be93-4b4a-8844-44b006977bf5-operator-scripts\") pod \"keystone78f6-account-delete-km44h\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.041478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8m2f\" (UniqueName: \"kubernetes.io/projected/cbecfc81-be93-4b4a-8844-44b006977bf5-kube-api-access-v8m2f\") pod \"keystone78f6-account-delete-km44h\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.212565 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.551632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone78f6-account-delete-km44h"] Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.688697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" event={"ID":"cbecfc81-be93-4b4a-8844-44b006977bf5","Type":"ContainerStarted","Data":"3c3feedbaa9e1211b5b5de12c211c2667207773a016e642c10bf7970a22ab486"} Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.688891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" event={"ID":"cbecfc81-be93-4b4a-8844-44b006977bf5","Type":"ContainerStarted","Data":"7406903d90b6821beb7fb4a527376c015cada496e202a90c85d289a553acd193"} Jan 21 17:01:10 crc kubenswrapper[4707]: I0121 17:01:10.701604 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" podStartSLOduration=1.701589093 podStartE2EDuration="1.701589093s" podCreationTimestamp="2026-01-21 17:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:01:10.698415376 +0000 UTC m=+7167.879931598" watchObservedRunningTime="2026-01-21 17:01:10.701589093 +0000 UTC m=+7167.883105316" Jan 21 17:01:11 crc kubenswrapper[4707]: I0121 17:01:11.188483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdc328f-eaaf-404d-b9b3-85f63e623e20" path="/var/lib/kubelet/pods/3bdc328f-eaaf-404d-b9b3-85f63e623e20/volumes" Jan 21 17:01:11 crc kubenswrapper[4707]: I0121 17:01:11.189047 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fe8a25-08a7-4f0b-b128-44069d9f21e7" path="/var/lib/kubelet/pods/b5fe8a25-08a7-4f0b-b128-44069d9f21e7/volumes" Jan 21 17:01:11 crc kubenswrapper[4707]: I0121 17:01:11.693991 4707 generic.go:334] "Generic (PLEG): container finished" podID="cbecfc81-be93-4b4a-8844-44b006977bf5" containerID="3c3feedbaa9e1211b5b5de12c211c2667207773a016e642c10bf7970a22ab486" exitCode=0 Jan 21 17:01:11 crc kubenswrapper[4707]: I0121 17:01:11.694027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" event={"ID":"cbecfc81-be93-4b4a-8844-44b006977bf5","Type":"ContainerDied","Data":"3c3feedbaa9e1211b5b5de12c211c2667207773a016e642c10bf7970a22ab486"} Jan 21 17:01:12 crc kubenswrapper[4707]: I0121 17:01:12.893935 4707 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.066394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8m2f\" (UniqueName: \"kubernetes.io/projected/cbecfc81-be93-4b4a-8844-44b006977bf5-kube-api-access-v8m2f\") pod \"cbecfc81-be93-4b4a-8844-44b006977bf5\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.066538 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbecfc81-be93-4b4a-8844-44b006977bf5-operator-scripts\") pod \"cbecfc81-be93-4b4a-8844-44b006977bf5\" (UID: \"cbecfc81-be93-4b4a-8844-44b006977bf5\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.067532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbecfc81-be93-4b4a-8844-44b006977bf5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cbecfc81-be93-4b4a-8844-44b006977bf5" (UID: "cbecfc81-be93-4b4a-8844-44b006977bf5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.071421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbecfc81-be93-4b4a-8844-44b006977bf5-kube-api-access-v8m2f" (OuterVolumeSpecName: "kube-api-access-v8m2f") pod "cbecfc81-be93-4b4a-8844-44b006977bf5" (UID: "cbecfc81-be93-4b4a-8844-44b006977bf5"). InnerVolumeSpecName "kube-api-access-v8m2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.168447 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbecfc81-be93-4b4a-8844-44b006977bf5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.168480 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8m2f\" (UniqueName: \"kubernetes.io/projected/cbecfc81-be93-4b4a-8844-44b006977bf5-kube-api-access-v8m2f\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.233445 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.370413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-scripts\") pod \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.370459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-credential-keys\") pod \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.370503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-config-data\") pod \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.370562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-677jx\" (UniqueName: \"kubernetes.io/projected/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-kube-api-access-677jx\") pod \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.370621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-fernet-keys\") pod \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\" (UID: \"0a0e45f5-2bb6-4b79-822d-b455aa3feab3\") " Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.374308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-scripts" (OuterVolumeSpecName: "scripts") pod "0a0e45f5-2bb6-4b79-822d-b455aa3feab3" (UID: "0a0e45f5-2bb6-4b79-822d-b455aa3feab3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.374346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0a0e45f5-2bb6-4b79-822d-b455aa3feab3" (UID: "0a0e45f5-2bb6-4b79-822d-b455aa3feab3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.374611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-kube-api-access-677jx" (OuterVolumeSpecName: "kube-api-access-677jx") pod "0a0e45f5-2bb6-4b79-822d-b455aa3feab3" (UID: "0a0e45f5-2bb6-4b79-822d-b455aa3feab3"). InnerVolumeSpecName "kube-api-access-677jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.374789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0a0e45f5-2bb6-4b79-822d-b455aa3feab3" (UID: "0a0e45f5-2bb6-4b79-822d-b455aa3feab3"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.385067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-config-data" (OuterVolumeSpecName: "config-data") pod "0a0e45f5-2bb6-4b79-822d-b455aa3feab3" (UID: "0a0e45f5-2bb6-4b79-822d-b455aa3feab3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.472988 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.473022 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.473033 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.473045 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-677jx\" (UniqueName: \"kubernetes.io/projected/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-kube-api-access-677jx\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.473054 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a0e45f5-2bb6-4b79-822d-b455aa3feab3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.706248 4707 generic.go:334] "Generic (PLEG): container finished" podID="0a0e45f5-2bb6-4b79-822d-b455aa3feab3" containerID="c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf" exitCode=0 Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.706304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" event={"ID":"0a0e45f5-2bb6-4b79-822d-b455aa3feab3","Type":"ContainerDied","Data":"c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf"} Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.706330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" event={"ID":"0a0e45f5-2bb6-4b79-822d-b455aa3feab3","Type":"ContainerDied","Data":"5d2be4b6625ac185100cd088c830b7fd8d5f283e1176f26fb40eb03b7d160ec2"} Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.706343 4707 scope.go:117] "RemoveContainer" containerID="c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.706426 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.707994 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.707993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone78f6-account-delete-km44h" event={"ID":"cbecfc81-be93-4b4a-8844-44b006977bf5","Type":"ContainerDied","Data":"7406903d90b6821beb7fb4a527376c015cada496e202a90c85d289a553acd193"} Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.708077 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7406903d90b6821beb7fb4a527376c015cada496e202a90c85d289a553acd193" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.722245 4707 scope.go:117] "RemoveContainer" containerID="c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf" Jan 21 17:01:13 crc kubenswrapper[4707]: E0121 17:01:13.722578 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf\": container with ID starting with c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf not found: ID does not exist" containerID="c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.722606 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf"} err="failed to get container status \"c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf\": rpc error: code = NotFound desc = could not find container \"c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf\": container with ID starting with c8a91ac16fbb9c526e0e04ba4bf038825252a671f77d204d52e950c03c62b4cf not found: ID does not exist" Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.727230 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4"] Jan 21 17:01:13 crc kubenswrapper[4707]: I0121 17:01:13.731075 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-66fb9c9dbd-k9ks4"] Jan 21 17:01:14 crc kubenswrapper[4707]: I0121 17:01:14.914540 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone78f6-account-delete-km44h"] Jan 21 17:01:14 crc kubenswrapper[4707]: I0121 17:01:14.918286 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone78f6-account-delete-km44h"] Jan 21 17:01:15 crc kubenswrapper[4707]: I0121 17:01:15.189023 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0e45f5-2bb6-4b79-822d-b455aa3feab3" path="/var/lib/kubelet/pods/0a0e45f5-2bb6-4b79-822d-b455aa3feab3/volumes" Jan 21 17:01:15 crc kubenswrapper[4707]: I0121 17:01:15.189639 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbecfc81-be93-4b4a-8844-44b006977bf5" path="/var/lib/kubelet/pods/cbecfc81-be93-4b4a-8844-44b006977bf5/volumes" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.624891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/root-account-create-update-58g4n"] Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.626466 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0e45f5-2bb6-4b79-822d-b455aa3feab3" containerName="keystone-api" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.626563 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0a0e45f5-2bb6-4b79-822d-b455aa3feab3" containerName="keystone-api" Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.626621 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbecfc81-be93-4b4a-8844-44b006977bf5" containerName="mariadb-account-delete" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.626682 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbecfc81-be93-4b4a-8844-44b006977bf5" containerName="mariadb-account-delete" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.626899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0e45f5-2bb6-4b79-822d-b455aa3feab3" containerName="keystone-api" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.626976 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbecfc81-be93-4b4a-8844-44b006977bf5" containerName="mariadb-account-delete" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.627523 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.630005 4707 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.632455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-58g4n"] Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.638571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzr4t\" (UniqueName: \"kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t\") pod \"root-account-create-update-58g4n\" (UID: \"8946422b-3b33-431d-9aa1-027c033b4319\") " pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.638728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts\") pod \"root-account-create-update-58g4n\" (UID: \"8946422b-3b33-431d-9aa1-027c033b4319\") " pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.676345 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.680572 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.686385 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.700727 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-58g4n"] Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.701338 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qzr4t operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="keystone-kuttl-tests/root-account-create-update-58g4n" podUID="8946422b-3b33-431d-9aa1-027c033b4319" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.732482 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.739320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts\") pod \"root-account-create-update-58g4n\" (UID: \"8946422b-3b33-431d-9aa1-027c033b4319\") " pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.739440 4707 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.739493 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts podName:8946422b-3b33-431d-9aa1-027c033b4319 nodeName:}" failed. No retries permitted until 2026-01-21 17:01:19.23947838 +0000 UTC m=+7176.420994602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts") pod "root-account-create-update-58g4n" (UID: "8946422b-3b33-431d-9aa1-027c033b4319") : configmap "openstack-scripts" not found Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.739593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzr4t\" (UniqueName: \"kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t\") pod \"root-account-create-update-58g4n\" (UID: \"8946422b-3b33-431d-9aa1-027c033b4319\") " pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.740402 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.741947 4707 projected.go:194] Error preparing data for projected volume kube-api-access-qzr4t for pod keystone-kuttl-tests/root-account-create-update-58g4n: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:01:18 crc kubenswrapper[4707]: E0121 17:01:18.741992 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t podName:8946422b-3b33-431d-9aa1-027c033b4319 nodeName:}" failed. No retries permitted until 2026-01-21 17:01:19.241981485 +0000 UTC m=+7176.423497708 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qzr4t" (UniqueName: "kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t") pod "root-account-create-update-58g4n" (UID: "8946422b-3b33-431d-9aa1-027c033b4319") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:01:18 crc kubenswrapper[4707]: I0121 17:01:18.785071 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-2" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="galera" containerID="cri-o://890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d" gracePeriod=30 Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.160767 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.161124 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/memcached-0" podUID="34b7fe44-243a-4e6a-b6f5-abf109cd69d5" containerName="memcached" containerID="cri-o://131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3" gracePeriod=30 Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.245215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts\") pod \"root-account-create-update-58g4n\" (UID: \"8946422b-3b33-431d-9aa1-027c033b4319\") " pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.245348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzr4t\" (UniqueName: \"kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t\") pod \"root-account-create-update-58g4n\" (UID: \"8946422b-3b33-431d-9aa1-027c033b4319\") " pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.245219 4707 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.245525 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts podName:8946422b-3b33-431d-9aa1-027c033b4319 nodeName:}" failed. No retries permitted until 2026-01-21 17:01:20.245502242 +0000 UTC m=+7177.427018464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts") pod "root-account-create-update-58g4n" (UID: "8946422b-3b33-431d-9aa1-027c033b4319") : configmap "openstack-scripts" not found Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.247748 4707 projected.go:194] Error preparing data for projected volume kube-api-access-qzr4t for pod keystone-kuttl-tests/root-account-create-update-58g4n: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.247804 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t podName:8946422b-3b33-431d-9aa1-027c033b4319 nodeName:}" failed. No retries permitted until 2026-01-21 17:01:20.247790886 +0000 UTC m=+7177.429307107 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qzr4t" (UniqueName: "kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t") pod "root-account-create-update-58g4n" (UID: "8946422b-3b33-431d-9aa1-027c033b4319") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.417108 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.447913 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-operator-scripts\") pod \"522a96ca-cde2-4466-804e-85a0f8e56653\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.447952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-generated\") pod \"522a96ca-cde2-4466-804e-85a0f8e56653\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.447989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448tw\" (UniqueName: \"kubernetes.io/projected/522a96ca-cde2-4466-804e-85a0f8e56653-kube-api-access-448tw\") pod \"522a96ca-cde2-4466-804e-85a0f8e56653\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.448015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"522a96ca-cde2-4466-804e-85a0f8e56653\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.448034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-default\") pod \"522a96ca-cde2-4466-804e-85a0f8e56653\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.448050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-kolla-config\") pod \"522a96ca-cde2-4466-804e-85a0f8e56653\" (UID: \"522a96ca-cde2-4466-804e-85a0f8e56653\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.448614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "522a96ca-cde2-4466-804e-85a0f8e56653" (UID: "522a96ca-cde2-4466-804e-85a0f8e56653"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.448747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "522a96ca-cde2-4466-804e-85a0f8e56653" (UID: "522a96ca-cde2-4466-804e-85a0f8e56653"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.448911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "522a96ca-cde2-4466-804e-85a0f8e56653" (UID: "522a96ca-cde2-4466-804e-85a0f8e56653"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.449184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "522a96ca-cde2-4466-804e-85a0f8e56653" (UID: "522a96ca-cde2-4466-804e-85a0f8e56653"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.456961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "522a96ca-cde2-4466-804e-85a0f8e56653" (UID: "522a96ca-cde2-4466-804e-85a0f8e56653"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.463205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522a96ca-cde2-4466-804e-85a0f8e56653-kube-api-access-448tw" (OuterVolumeSpecName: "kube-api-access-448tw") pod "522a96ca-cde2-4466-804e-85a0f8e56653" (UID: "522a96ca-cde2-4466-804e-85a0f8e56653"). InnerVolumeSpecName "kube-api-access-448tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.498690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.549320 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.549350 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.549362 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.549373 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/522a96ca-cde2-4466-804e-85a0f8e56653-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.549385 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/522a96ca-cde2-4466-804e-85a0f8e56653-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.549396 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448tw\" (UniqueName: \"kubernetes.io/projected/522a96ca-cde2-4466-804e-85a0f8e56653-kube-api-access-448tw\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.560139 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.651335 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.677969 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.740576 4707 generic.go:334] "Generic (PLEG): container finished" podID="522a96ca-cde2-4466-804e-85a0f8e56653" containerID="890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d" exitCode=0 Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.740618 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"522a96ca-cde2-4466-804e-85a0f8e56653","Type":"ContainerDied","Data":"890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d"} Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.740665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"522a96ca-cde2-4466-804e-85a0f8e56653","Type":"ContainerDied","Data":"8540b4a11ac8978d2a26c675be2b4ccd4b93436d3ac3b3c6c83afae46b258e4b"} Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.740645 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.740678 4707 scope.go:117] "RemoveContainer" containerID="890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.742598 4707 generic.go:334] "Generic (PLEG): container finished" podID="34b7fe44-243a-4e6a-b6f5-abf109cd69d5" containerID="131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3" exitCode=0 Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.742665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"34b7fe44-243a-4e6a-b6f5-abf109cd69d5","Type":"ContainerDied","Data":"131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3"} Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.742867 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-58g4n" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.742873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"34b7fe44-243a-4e6a-b6f5-abf109cd69d5","Type":"ContainerDied","Data":"73f012961bd5b60924751f9a22c446df7c1a23d538dd81c61c465e11568bde92"} Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.742687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.769142 4707 scope.go:117] "RemoveContainer" containerID="befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.775700 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-58g4n"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.789226 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-58g4n"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.789366 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.795924 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.796764 4707 scope.go:117] "RemoveContainer" containerID="890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d" Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.797158 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d\": container with ID starting with 890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d not found: ID does not exist" containerID="890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.797190 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d"} err="failed to get container status \"890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d\": rpc error: code = NotFound desc = could not find container \"890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d\": container with ID starting with 
890b442f45db2a0c5d05b67af2dd9944b642bdaf6d625a21e5fb7af74338718d not found: ID does not exist" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.797215 4707 scope.go:117] "RemoveContainer" containerID="befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda" Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.797545 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda\": container with ID starting with befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda not found: ID does not exist" containerID="befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.797568 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda"} err="failed to get container status \"befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda\": rpc error: code = NotFound desc = could not find container \"befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda\": container with ID starting with befdaefdbab4405e00a34890dd94209aec0fe6305dc035874e8b9e6ebb520fda not found: ID does not exist" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.797581 4707 scope.go:117] "RemoveContainer" containerID="131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.812210 4707 scope.go:117] "RemoveContainer" containerID="131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3" Jan 21 17:01:19 crc kubenswrapper[4707]: E0121 17:01:19.812466 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3\": container with ID starting with 131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3 not found: ID does not exist" containerID="131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.812503 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3"} err="failed to get container status \"131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3\": rpc error: code = NotFound desc = could not find container \"131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3\": container with ID starting with 131d685e9b82a94d024ef2b8275227e19b239a9cae65fcbc2e86a1aeee4254a3 not found: ID does not exist" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.853449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-config-data\") pod \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.853552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kolla-config\") pod \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.853579 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s69k8\" (UniqueName: \"kubernetes.io/projected/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kube-api-access-s69k8\") pod \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\" (UID: \"34b7fe44-243a-4e6a-b6f5-abf109cd69d5\") " Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.855284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-config-data" (OuterVolumeSpecName: "config-data") pod "34b7fe44-243a-4e6a-b6f5-abf109cd69d5" (UID: "34b7fe44-243a-4e6a-b6f5-abf109cd69d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.856121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "34b7fe44-243a-4e6a-b6f5-abf109cd69d5" (UID: "34b7fe44-243a-4e6a-b6f5-abf109cd69d5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.857595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kube-api-access-s69k8" (OuterVolumeSpecName: "kube-api-access-s69k8") pod "34b7fe44-243a-4e6a-b6f5-abf109cd69d5" (UID: "34b7fe44-243a-4e6a-b6f5-abf109cd69d5"). InnerVolumeSpecName "kube-api-access-s69k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.860995 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.886663 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" containerName="rabbitmq" containerID="cri-o://b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a" gracePeriod=604800 Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.955707 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.955918 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s69k8\" (UniqueName: \"kubernetes.io/projected/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-kube-api-access-s69k8\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.956009 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzr4t\" (UniqueName: \"kubernetes.io/projected/8946422b-3b33-431d-9aa1-027c033b4319-kube-api-access-qzr4t\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.956067 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8946422b-3b33-431d-9aa1-027c033b4319-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:19 crc kubenswrapper[4707]: I0121 17:01:19.956118 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34b7fe44-243a-4e6a-b6f5-abf109cd69d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.064826 
4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.068979 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.482885 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh"] Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.483067 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" podUID="a4f96623-749d-41d4-b59e-aedc4fd495bd" containerName="manager" containerID="cri-o://e7d6a1dd83eda9fe4223bd36d061b71143fb3cea83f770f435a33a5d16217ba8" gracePeriod=10 Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.685425 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-tr55x"] Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.685788 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-tr55x" podUID="900dbba9-3a88-4094-8313-243234ffbe22" containerName="registry-server" containerID="cri-o://1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21" gracePeriod=30 Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.708442 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w"] Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.712791 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z7s6w"] Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.753280 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4f96623-749d-41d4-b59e-aedc4fd495bd" containerID="e7d6a1dd83eda9fe4223bd36d061b71143fb3cea83f770f435a33a5d16217ba8" exitCode=0 Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.753312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" event={"ID":"a4f96623-749d-41d4-b59e-aedc4fd495bd","Type":"ContainerDied","Data":"e7d6a1dd83eda9fe4223bd36d061b71143fb3cea83f770f435a33a5d16217ba8"} Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.802782 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-1" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerName="galera" containerID="cri-o://f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5" gracePeriod=28 Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.865223 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.869031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-apiservice-cert\") pod \"a4f96623-749d-41d4-b59e-aedc4fd495bd\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.869067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-webhook-cert\") pod \"a4f96623-749d-41d4-b59e-aedc4fd495bd\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.869095 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2tzb\" (UniqueName: \"kubernetes.io/projected/a4f96623-749d-41d4-b59e-aedc4fd495bd-kube-api-access-d2tzb\") pod \"a4f96623-749d-41d4-b59e-aedc4fd495bd\" (UID: \"a4f96623-749d-41d4-b59e-aedc4fd495bd\") " Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.873993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f96623-749d-41d4-b59e-aedc4fd495bd-kube-api-access-d2tzb" (OuterVolumeSpecName: "kube-api-access-d2tzb") pod "a4f96623-749d-41d4-b59e-aedc4fd495bd" (UID: "a4f96623-749d-41d4-b59e-aedc4fd495bd"). InnerVolumeSpecName "kube-api-access-d2tzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.874012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "a4f96623-749d-41d4-b59e-aedc4fd495bd" (UID: "a4f96623-749d-41d4-b59e-aedc4fd495bd"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.874425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "a4f96623-749d-41d4-b59e-aedc4fd495bd" (UID: "a4f96623-749d-41d4-b59e-aedc4fd495bd"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.970623 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.970826 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4f96623-749d-41d4-b59e-aedc4fd495bd-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:20 crc kubenswrapper[4707]: I0121 17:01:20.970841 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2tzb\" (UniqueName: \"kubernetes.io/projected/a4f96623-749d-41d4-b59e-aedc4fd495bd-kube-api-access-d2tzb\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.043741 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.071466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8mcd\" (UniqueName: \"kubernetes.io/projected/900dbba9-3a88-4094-8313-243234ffbe22-kube-api-access-f8mcd\") pod \"900dbba9-3a88-4094-8313-243234ffbe22\" (UID: \"900dbba9-3a88-4094-8313-243234ffbe22\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.075015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900dbba9-3a88-4094-8313-243234ffbe22-kube-api-access-f8mcd" (OuterVolumeSpecName: "kube-api-access-f8mcd") pod "900dbba9-3a88-4094-8313-243234ffbe22" (UID: "900dbba9-3a88-4094-8313-243234ffbe22"). InnerVolumeSpecName "kube-api-access-f8mcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.175061 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8mcd\" (UniqueName: \"kubernetes.io/projected/900dbba9-3a88-4094-8313-243234ffbe22-kube-api-access-f8mcd\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.189094 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a731ca6-7194-4289-98d8-89ae6fba1185" path="/var/lib/kubelet/pods/0a731ca6-7194-4289-98d8-89ae6fba1185/volumes" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.189747 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b7fe44-243a-4e6a-b6f5-abf109cd69d5" path="/var/lib/kubelet/pods/34b7fe44-243a-4e6a-b6f5-abf109cd69d5/volumes" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.190455 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" path="/var/lib/kubelet/pods/522a96ca-cde2-4466-804e-85a0f8e56653/volumes" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.191932 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8946422b-3b33-431d-9aa1-027c033b4319" path="/var/lib/kubelet/pods/8946422b-3b33-431d-9aa1-027c033b4319/volumes" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.223879 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-erlang-cookie\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f489928-26ae-45c9-851c-6072f97f37e0-pod-info\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f489928-26ae-45c9-851c-6072f97f37e0-plugins-conf\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sff4g\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-kube-api-access-sff4g\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-confd\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f489928-26ae-45c9-851c-6072f97f37e0-erlang-cookie-secret\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.276609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-plugins\") pod \"2f489928-26ae-45c9-851c-6072f97f37e0\" (UID: \"2f489928-26ae-45c9-851c-6072f97f37e0\") " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.277090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.277494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f489928-26ae-45c9-851c-6072f97f37e0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.278058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.278769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-kube-api-access-sff4g" (OuterVolumeSpecName: "kube-api-access-sff4g") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "kube-api-access-sff4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.278981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2f489928-26ae-45c9-851c-6072f97f37e0-pod-info" (OuterVolumeSpecName: "pod-info") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.279485 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f489928-26ae-45c9-851c-6072f97f37e0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.287219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42" (OuterVolumeSpecName: "persistence") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.318858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2f489928-26ae-45c9-851c-6072f97f37e0" (UID: "2f489928-26ae-45c9-851c-6072f97f37e0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379030 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f489928-26ae-45c9-851c-6072f97f37e0-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379056 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sff4g\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-kube-api-access-sff4g\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379085 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") on node \"crc\" " Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379095 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379105 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f489928-26ae-45c9-851c-6072f97f37e0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379114 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379122 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f489928-26ae-45c9-851c-6072f97f37e0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.379129 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f489928-26ae-45c9-851c-6072f97f37e0-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.389987 4707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.390102 4707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42") on node "crc" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.480341 4707 reconciler_common.go:293] "Volume detached for volume \"pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37ba46f9-56e9-406a-a87d-3aeaf09a8e42\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.761933 4707 generic.go:334] "Generic (PLEG): container finished" podID="900dbba9-3a88-4094-8313-243234ffbe22" containerID="1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21" exitCode=0 Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.761990 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-tr55x" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.762001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-tr55x" event={"ID":"900dbba9-3a88-4094-8313-243234ffbe22","Type":"ContainerDied","Data":"1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21"} Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.762051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-tr55x" event={"ID":"900dbba9-3a88-4094-8313-243234ffbe22","Type":"ContainerDied","Data":"692113d63ef6f2f84226167bba4ef52ba44e5f99348115206f4f468281c9f650"} Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.762066 4707 scope.go:117] "RemoveContainer" containerID="1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.764769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" event={"ID":"a4f96623-749d-41d4-b59e-aedc4fd495bd","Type":"ContainerDied","Data":"720ee4ba06f85542f2a4e9bfe1e9f96226935a76fa5de67eb24646bf4068cd68"} Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.765055 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.768530 4707 generic.go:334] "Generic (PLEG): container finished" podID="2f489928-26ae-45c9-851c-6072f97f37e0" containerID="b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a" exitCode=0 Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.768559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"2f489928-26ae-45c9-851c-6072f97f37e0","Type":"ContainerDied","Data":"b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a"} Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.768578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"2f489928-26ae-45c9-851c-6072f97f37e0","Type":"ContainerDied","Data":"b4e956e7c990a79ae01245178c0c77581ff92c2da6aeb33894ef449bad23a533"} Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.768618 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.784268 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-tr55x"] Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.790848 4707 scope.go:117] "RemoveContainer" containerID="1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21" Jan 21 17:01:21 crc kubenswrapper[4707]: E0121 17:01:21.791175 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21\": container with ID starting with 1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21 not found: ID does not exist" containerID="1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.791215 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21"} err="failed to get container status \"1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21\": rpc error: code = NotFound desc = could not find container \"1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21\": container with ID starting with 1b506e033417946a8d8cde818481bebb0513691547101ec1c5bedba8b7e8fd21 not found: ID does not exist" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.791244 4707 scope.go:117] "RemoveContainer" containerID="e7d6a1dd83eda9fe4223bd36d061b71143fb3cea83f770f435a33a5d16217ba8" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.791935 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-tr55x"] Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.796373 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh"] Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.801568 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-9c985558-cvpzh"] Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.807591 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.808208 4707 scope.go:117] "RemoveContainer" containerID="b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.813625 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.824985 4707 scope.go:117] "RemoveContainer" containerID="22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.837697 4707 scope.go:117] "RemoveContainer" containerID="b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a" Jan 21 17:01:21 crc kubenswrapper[4707]: E0121 17:01:21.838191 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a\": container with ID starting with b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a not found: ID does not exist" 
containerID="b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.838221 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a"} err="failed to get container status \"b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a\": rpc error: code = NotFound desc = could not find container \"b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a\": container with ID starting with b8504b2c036146daed83103227e557a41fbeba15d35f6e7a5fe3be64b17bbf7a not found: ID does not exist" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.838243 4707 scope.go:117] "RemoveContainer" containerID="22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5" Jan 21 17:01:21 crc kubenswrapper[4707]: E0121 17:01:21.838562 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5\": container with ID starting with 22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5 not found: ID does not exist" containerID="22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5" Jan 21 17:01:21 crc kubenswrapper[4707]: I0121 17:01:21.838595 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5"} err="failed to get container status \"22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5\": rpc error: code = NotFound desc = could not find container \"22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5\": container with ID starting with 22bb0802f6dca5d18fd9d99eb35394605adeb21b6ecdba47098ca732b5392aa5 not found: ID does not exist" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.432214 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.592239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-operator-scripts\") pod \"151f7a26-9818-45b1-90d5-d45f9ce116ce\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.592274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-default\") pod \"151f7a26-9818-45b1-90d5-d45f9ce116ce\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.592300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-kolla-config\") pod \"151f7a26-9818-45b1-90d5-d45f9ce116ce\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.592333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"151f7a26-9818-45b1-90d5-d45f9ce116ce\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.592382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp57s\" (UniqueName: \"kubernetes.io/projected/151f7a26-9818-45b1-90d5-d45f9ce116ce-kube-api-access-rp57s\") pod \"151f7a26-9818-45b1-90d5-d45f9ce116ce\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.592425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-generated\") pod \"151f7a26-9818-45b1-90d5-d45f9ce116ce\" (UID: \"151f7a26-9818-45b1-90d5-d45f9ce116ce\") " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "151f7a26-9818-45b1-90d5-d45f9ce116ce" (UID: "151f7a26-9818-45b1-90d5-d45f9ce116ce"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "151f7a26-9818-45b1-90d5-d45f9ce116ce" (UID: "151f7a26-9818-45b1-90d5-d45f9ce116ce"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593211 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "151f7a26-9818-45b1-90d5-d45f9ce116ce" (UID: "151f7a26-9818-45b1-90d5-d45f9ce116ce"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593353 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "151f7a26-9818-45b1-90d5-d45f9ce116ce" (UID: "151f7a26-9818-45b1-90d5-d45f9ce116ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593477 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593501 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593511 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/151f7a26-9818-45b1-90d5-d45f9ce116ce-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.593522 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/151f7a26-9818-45b1-90d5-d45f9ce116ce-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.596863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151f7a26-9818-45b1-90d5-d45f9ce116ce-kube-api-access-rp57s" (OuterVolumeSpecName: "kube-api-access-rp57s") pod "151f7a26-9818-45b1-90d5-d45f9ce116ce" (UID: "151f7a26-9818-45b1-90d5-d45f9ce116ce"). InnerVolumeSpecName "kube-api-access-rp57s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.600459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "151f7a26-9818-45b1-90d5-d45f9ce116ce" (UID: "151f7a26-9818-45b1-90d5-d45f9ce116ce"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.694492 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.694520 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp57s\" (UniqueName: \"kubernetes.io/projected/151f7a26-9818-45b1-90d5-d45f9ce116ce-kube-api-access-rp57s\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.703335 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.779714 4707 generic.go:334] "Generic (PLEG): container finished" podID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerID="f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5" exitCode=0 Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.779753 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.779752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"151f7a26-9818-45b1-90d5-d45f9ce116ce","Type":"ContainerDied","Data":"f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5"} Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.779857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"151f7a26-9818-45b1-90d5-d45f9ce116ce","Type":"ContainerDied","Data":"4cd1776a0fcfec8d6bb03aaf1a6fcdc779f8b88a7c6cd7959f480762f73e2d42"} Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.779879 4707 scope.go:117] "RemoveContainer" containerID="f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.797897 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.797909 4707 scope.go:117] "RemoveContainer" containerID="3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.812062 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.816717 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.819644 4707 scope.go:117] "RemoveContainer" containerID="f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5" Jan 21 17:01:22 crc kubenswrapper[4707]: E0121 17:01:22.820071 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5\": container with ID starting with f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5 not found: ID does not exist" containerID="f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.820106 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5"} err="failed to get container status \"f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5\": rpc error: code = NotFound desc = could not find container \"f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5\": container with ID starting with f6799bbe61f4ecc5516aa7a1f2d93103acf2ae6e1fbbfdd157db5561f2ea98a5 not found: ID does not exist" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.820128 4707 scope.go:117] "RemoveContainer" containerID="3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba" Jan 21 17:01:22 crc kubenswrapper[4707]: E0121 17:01:22.820366 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba\": container with ID starting with 3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba not found: ID does not exist" containerID="3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.820387 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba"} err="failed to get container status \"3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba\": rpc error: code = NotFound desc = could not find container \"3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba\": container with ID starting with 3f903f9164b830d79b19312f7e548b146414d1a53e4532cbe0e46cbcade7dfba not found: ID does not exist" Jan 21 17:01:22 crc kubenswrapper[4707]: I0121 17:01:22.833891 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-0" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerName="galera" containerID="cri-o://78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58" gracePeriod=26 Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.189422 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" path="/var/lib/kubelet/pods/151f7a26-9818-45b1-90d5-d45f9ce116ce/volumes" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.190237 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" path="/var/lib/kubelet/pods/2f489928-26ae-45c9-851c-6072f97f37e0/volumes" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.190792 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900dbba9-3a88-4094-8313-243234ffbe22" path="/var/lib/kubelet/pods/900dbba9-3a88-4094-8313-243234ffbe22/volumes" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.191853 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f96623-749d-41d4-b59e-aedc4fd495bd" path="/var/lib/kubelet/pods/a4f96623-749d-41d4-b59e-aedc4fd495bd/volumes" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.382091 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.503615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-operator-scripts\") pod \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.503682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kolla-config\") pod \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.503717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsf2w\" (UniqueName: \"kubernetes.io/projected/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kube-api-access-hsf2w\") pod \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.503745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-generated\") pod \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.503833 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-default\") pod \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.503882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\" (UID: \"225ee0a9-1192-40ab-93d9-010fb4c4a07e\") " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.504201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "225ee0a9-1192-40ab-93d9-010fb4c4a07e" (UID: "225ee0a9-1192-40ab-93d9-010fb4c4a07e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.504221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "225ee0a9-1192-40ab-93d9-010fb4c4a07e" (UID: "225ee0a9-1192-40ab-93d9-010fb4c4a07e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.504293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "225ee0a9-1192-40ab-93d9-010fb4c4a07e" (UID: "225ee0a9-1192-40ab-93d9-010fb4c4a07e"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.504724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "225ee0a9-1192-40ab-93d9-010fb4c4a07e" (UID: "225ee0a9-1192-40ab-93d9-010fb4c4a07e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.510672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kube-api-access-hsf2w" (OuterVolumeSpecName: "kube-api-access-hsf2w") pod "225ee0a9-1192-40ab-93d9-010fb4c4a07e" (UID: "225ee0a9-1192-40ab-93d9-010fb4c4a07e"). InnerVolumeSpecName "kube-api-access-hsf2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.516800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "225ee0a9-1192-40ab-93d9-010fb4c4a07e" (UID: "225ee0a9-1192-40ab-93d9-010fb4c4a07e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.605676 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.605710 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.605723 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsf2w\" (UniqueName: \"kubernetes.io/projected/225ee0a9-1192-40ab-93d9-010fb4c4a07e-kube-api-access-hsf2w\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.605733 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.605743 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/225ee0a9-1192-40ab-93d9-010fb4c4a07e-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.605774 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.614609 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.707345 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.790411 4707 
generic.go:334] "Generic (PLEG): container finished" podID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerID="78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58" exitCode=0 Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.790450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"225ee0a9-1192-40ab-93d9-010fb4c4a07e","Type":"ContainerDied","Data":"78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58"} Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.790475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"225ee0a9-1192-40ab-93d9-010fb4c4a07e","Type":"ContainerDied","Data":"8916e66de8b3abee9a5927596c04e5238b66af46e10438ef3f103def325a30d0"} Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.790490 4707 scope.go:117] "RemoveContainer" containerID="78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.791225 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.805290 4707 scope.go:117] "RemoveContainer" containerID="94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.814020 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.820223 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.822248 4707 scope.go:117] "RemoveContainer" containerID="78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58" Jan 21 17:01:23 crc kubenswrapper[4707]: E0121 17:01:23.822602 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58\": container with ID starting with 78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58 not found: ID does not exist" containerID="78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.822629 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58"} err="failed to get container status \"78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58\": rpc error: code = NotFound desc = could not find container \"78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58\": container with ID starting with 78a303144b88479cf632e300d20b0dd0ee3dfdedc901965df769edb0d515bd58 not found: ID does not exist" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.822660 4707 scope.go:117] "RemoveContainer" containerID="94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a" Jan 21 17:01:23 crc kubenswrapper[4707]: E0121 17:01:23.822899 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a\": container with ID starting with 94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a not found: ID does not exist" 
containerID="94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a" Jan 21 17:01:23 crc kubenswrapper[4707]: I0121 17:01:23.822920 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a"} err="failed to get container status \"94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a\": rpc error: code = NotFound desc = could not find container \"94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a\": container with ID starting with 94b5e6d9228ad4598a1a0b51b657d237ae18c5931344e2d089bdaed260ec0e5a not found: ID does not exist" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.120296 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t"] Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.120471 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" podUID="f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" containerName="manager" containerID="cri-o://78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593" gracePeriod=10 Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.189262 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" path="/var/lib/kubelet/pods/225ee0a9-1192-40ab-93d9-010fb4c4a07e/volumes" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.325637 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-4s5rd"] Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.325824 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-4s5rd" podUID="d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" containerName="registry-server" containerID="cri-o://41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44" gracePeriod=30 Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.348058 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh"] Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.352069 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1rscsh"] Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.502254 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.632980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-webhook-cert\") pod \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.633121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-apiservice-cert\") pod \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.633166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngc2h\" (UniqueName: \"kubernetes.io/projected/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-kube-api-access-ngc2h\") pod \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\" (UID: \"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03\") " Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.637030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-kube-api-access-ngc2h" (OuterVolumeSpecName: "kube-api-access-ngc2h") pod "f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" (UID: "f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03"). InnerVolumeSpecName "kube-api-access-ngc2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.637160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" (UID: "f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.637289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" (UID: "f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.657104 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.734669 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.734702 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.734715 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngc2h\" (UniqueName: \"kubernetes.io/projected/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03-kube-api-access-ngc2h\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.805600 4707 generic.go:334] "Generic (PLEG): container finished" podID="f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" containerID="78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593" exitCode=0 Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.805667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" event={"ID":"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03","Type":"ContainerDied","Data":"78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593"} Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.805691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" event={"ID":"f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03","Type":"ContainerDied","Data":"5551d3d0897f49a1fc8912a48ac616f0b9987f851c32c08627a7e74918daa852"} Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.805707 4707 scope.go:117] "RemoveContainer" containerID="78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.805778 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.807622 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" containerID="41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44" exitCode=0 Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.807674 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-4s5rd" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.807689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-4s5rd" event={"ID":"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a","Type":"ContainerDied","Data":"41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44"} Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.807713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-4s5rd" event={"ID":"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a","Type":"ContainerDied","Data":"ba756f790eec94965176c2cbff2da7703babefa46dbd50147590c2b3a7587af2"} Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.819709 4707 scope.go:117] "RemoveContainer" containerID="78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593" Jan 21 17:01:25 crc kubenswrapper[4707]: E0121 17:01:25.820041 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593\": container with ID starting with 78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593 not found: ID does not exist" containerID="78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.820077 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593"} err="failed to get container status \"78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593\": rpc error: code = NotFound desc = could not find container \"78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593\": container with ID starting with 78856f7849906d8ef4a51696301bf9d3e37a52c38843621d5b70aca9e1257593 not found: ID does not exist" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.820098 4707 scope.go:117] "RemoveContainer" containerID="41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.830569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t"] Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.831958 4707 scope.go:117] "RemoveContainer" containerID="41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44" Jan 21 17:01:25 crc kubenswrapper[4707]: E0121 17:01:25.832262 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44\": container with ID starting with 41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44 not found: ID does not exist" containerID="41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.832300 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44"} err="failed to get container status \"41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44\": rpc error: code = NotFound desc = could not find container \"41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44\": container with ID starting with 41182c5325ad02e91fb42cd2d695263a7d3a083b97cbbd6ffd5b8f1d7eb7ea44 not found: ID 
does not exist" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.834983 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56c4b5f9d8-thv7t"] Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.835358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzw5r\" (UniqueName: \"kubernetes.io/projected/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a-kube-api-access-zzw5r\") pod \"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a\" (UID: \"d7af2b5c-1c35-4ca8-9a60-a52430cecd7a\") " Jan 21 17:01:25 crc kubenswrapper[4707]: E0121 17:01:25.845635 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93f9eb6_1c5f_43e1_9f24_43fb6e7efb03.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93f9eb6_1c5f_43e1_9f24_43fb6e7efb03.slice/crio-5551d3d0897f49a1fc8912a48ac616f0b9987f851c32c08627a7e74918daa852\": RecentStats: unable to find data in memory cache]" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.857178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a-kube-api-access-zzw5r" (OuterVolumeSpecName: "kube-api-access-zzw5r") pod "d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" (UID: "d7af2b5c-1c35-4ca8-9a60-a52430cecd7a"). InnerVolumeSpecName "kube-api-access-zzw5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:25 crc kubenswrapper[4707]: I0121 17:01:25.936729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzw5r\" (UniqueName: \"kubernetes.io/projected/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a-kube-api-access-zzw5r\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:26 crc kubenswrapper[4707]: I0121 17:01:26.131254 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-4s5rd"] Jan 21 17:01:26 crc kubenswrapper[4707]: I0121 17:01:26.134675 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-4s5rd"] Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.189773 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d976b88-45b2-4fcd-92c3-8998178306ac" path="/var/lib/kubelet/pods/4d976b88-45b2-4fcd-92c3-8998178306ac/volumes" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.190564 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" path="/var/lib/kubelet/pods/d7af2b5c-1c35-4ca8-9a60-a52430cecd7a/volumes" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.191020 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" path="/var/lib/kubelet/pods/f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03/volumes" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.605562 4707 scope.go:117] "RemoveContainer" containerID="8cf5a94340acb87b8d460e84d6e2facb8491c341d47fc2c5a04487f346afff9a" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.634209 4707 scope.go:117] "RemoveContainer" containerID="e0e93d3adc4ac39fb905aa21ae1f78d0092d0ff072238e93338da2a59d428c71" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.644746 4707 scope.go:117] "RemoveContainer" containerID="98a2afe0e00f801a7b56bd4da46d19d29b652d19a3c8b9920413f9dd0395bede" Jan 21 17:01:27 crc 
kubenswrapper[4707]: I0121 17:01:27.662863 4707 scope.go:117] "RemoveContainer" containerID="b0183c60dc9cc579e2a7b3f621eb8ca865f8e6e85c6b1ff40bdfbf3a87349855" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.678564 4707 scope.go:117] "RemoveContainer" containerID="82870f3e401fe0a437356edd26b62bb68518e8cb7a786f9f3462958c2218aa6e" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.698291 4707 scope.go:117] "RemoveContainer" containerID="8d7851549e3f49a90fa859adf2f35180ba74c44ccf59107ee7a7a1c22f328e53" Jan 21 17:01:27 crc kubenswrapper[4707]: I0121 17:01:27.717577 4707 scope.go:117] "RemoveContainer" containerID="5ef3e3111ca082e11b497c499b1a5e093e47d9944cc4a217c94f19fe6ccf13d2" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.446139 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj"] Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.446314 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" podUID="8358c5ec-c227-4e08-bfa4-502096362233" containerName="manager" containerID="cri-o://35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472" gracePeriod=10 Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.654119 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-h9mjd"] Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.654286 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-h9mjd" podUID="e6c0641b-e565-49e9-a0d3-8afeaeca6835" containerName="registry-server" containerID="cri-o://4516751d0d240ce069923af5419b8d27e0dd26bfaad44d80c2ba226f98d5bd1e" gracePeriod=30 Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.680572 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs"] Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.694390 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720brwxhs"] Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.807716 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.837047 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6c0641b-e565-49e9-a0d3-8afeaeca6835" containerID="4516751d0d240ce069923af5419b8d27e0dd26bfaad44d80c2ba226f98d5bd1e" exitCode=0 Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.837103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h9mjd" event={"ID":"e6c0641b-e565-49e9-a0d3-8afeaeca6835","Type":"ContainerDied","Data":"4516751d0d240ce069923af5419b8d27e0dd26bfaad44d80c2ba226f98d5bd1e"} Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.838176 4707 generic.go:334] "Generic (PLEG): container finished" podID="8358c5ec-c227-4e08-bfa4-502096362233" containerID="35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472" exitCode=0 Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.838207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" event={"ID":"8358c5ec-c227-4e08-bfa4-502096362233","Type":"ContainerDied","Data":"35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472"} Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.838227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" event={"ID":"8358c5ec-c227-4e08-bfa4-502096362233","Type":"ContainerDied","Data":"ebdd920628b2db36939a3dd569a348b41d21bb7010654a80111c92dc7cfa2df9"} Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.838242 4707 scope.go:117] "RemoveContainer" containerID="35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.838342 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.854944 4707 scope.go:117] "RemoveContainer" containerID="35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472" Jan 21 17:01:29 crc kubenswrapper[4707]: E0121 17:01:29.855241 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472\": container with ID starting with 35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472 not found: ID does not exist" containerID="35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.855274 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472"} err="failed to get container status \"35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472\": rpc error: code = NotFound desc = could not find container \"35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472\": container with ID starting with 35de17df69f73735e026f3c30e7bcd0fb1e4c51b50cf4115ca16ff401099a472 not found: ID does not exist" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.986033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-apiservice-cert\") pod \"8358c5ec-c227-4e08-bfa4-502096362233\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.986098 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk55w\" (UniqueName: \"kubernetes.io/projected/8358c5ec-c227-4e08-bfa4-502096362233-kube-api-access-rk55w\") pod \"8358c5ec-c227-4e08-bfa4-502096362233\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.986154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-webhook-cert\") pod \"8358c5ec-c227-4e08-bfa4-502096362233\" (UID: \"8358c5ec-c227-4e08-bfa4-502096362233\") " Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.990850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "8358c5ec-c227-4e08-bfa4-502096362233" (UID: "8358c5ec-c227-4e08-bfa4-502096362233"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.990980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "8358c5ec-c227-4e08-bfa4-502096362233" (UID: "8358c5ec-c227-4e08-bfa4-502096362233"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:01:29 crc kubenswrapper[4707]: I0121 17:01:29.991425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8358c5ec-c227-4e08-bfa4-502096362233-kube-api-access-rk55w" (OuterVolumeSpecName: "kube-api-access-rk55w") pod "8358c5ec-c227-4e08-bfa4-502096362233" (UID: "8358c5ec-c227-4e08-bfa4-502096362233"). InnerVolumeSpecName "kube-api-access-rk55w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.011277 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.088139 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.088170 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk55w\" (UniqueName: \"kubernetes.io/projected/8358c5ec-c227-4e08-bfa4-502096362233-kube-api-access-rk55w\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.088180 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8358c5ec-c227-4e08-bfa4-502096362233-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.161032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj"] Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.164320 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4c746c5f-n25qj"] Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.189080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z949\" (UniqueName: \"kubernetes.io/projected/e6c0641b-e565-49e9-a0d3-8afeaeca6835-kube-api-access-6z949\") pod \"e6c0641b-e565-49e9-a0d3-8afeaeca6835\" (UID: \"e6c0641b-e565-49e9-a0d3-8afeaeca6835\") " Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.191293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c0641b-e565-49e9-a0d3-8afeaeca6835-kube-api-access-6z949" (OuterVolumeSpecName: "kube-api-access-6z949") pod "e6c0641b-e565-49e9-a0d3-8afeaeca6835" (UID: "e6c0641b-e565-49e9-a0d3-8afeaeca6835"). InnerVolumeSpecName "kube-api-access-6z949". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.291326 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z949\" (UniqueName: \"kubernetes.io/projected/e6c0641b-e565-49e9-a0d3-8afeaeca6835-kube-api-access-6z949\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.844340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h9mjd" event={"ID":"e6c0641b-e565-49e9-a0d3-8afeaeca6835","Type":"ContainerDied","Data":"63104afe71e3d1a6e05309092c22441c78cba894a2883f847e12fc01ea5c80e8"} Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.844389 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h9mjd" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.844403 4707 scope.go:117] "RemoveContainer" containerID="4516751d0d240ce069923af5419b8d27e0dd26bfaad44d80c2ba226f98d5bd1e" Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.868181 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-h9mjd"] Jan 21 17:01:30 crc kubenswrapper[4707]: I0121 17:01:30.872236 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-h9mjd"] Jan 21 17:01:31 crc kubenswrapper[4707]: I0121 17:01:31.187936 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24838a0d-e03b-4b81-9423-3865c0286bf3" path="/var/lib/kubelet/pods/24838a0d-e03b-4b81-9423-3865c0286bf3/volumes" Jan 21 17:01:31 crc kubenswrapper[4707]: I0121 17:01:31.188541 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8358c5ec-c227-4e08-bfa4-502096362233" path="/var/lib/kubelet/pods/8358c5ec-c227-4e08-bfa4-502096362233/volumes" Jan 21 17:01:31 crc kubenswrapper[4707]: I0121 17:01:31.188956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c0641b-e565-49e9-a0d3-8afeaeca6835" path="/var/lib/kubelet/pods/e6c0641b-e565-49e9-a0d3-8afeaeca6835/volumes" Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.634110 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk"] Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.635049 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" podUID="23ec1174-5172-4c31-8ed4-bed70c55586e" containerName="operator" containerID="cri-o://19e822fa30205f02cddbce603441ea130558ae8fc2ef58c0b29365f6851944f5" gracePeriod=10 Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.870107 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-m9pnv"] Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.870670 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" podUID="d0a89d96-10da-4a61-94e3-0b6e73714360" containerName="registry-server" containerID="cri-o://f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa" gracePeriod=30 Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.878375 4707 generic.go:334] "Generic (PLEG): container finished" podID="23ec1174-5172-4c31-8ed4-bed70c55586e" containerID="19e822fa30205f02cddbce603441ea130558ae8fc2ef58c0b29365f6851944f5" exitCode=0 Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.878425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" event={"ID":"23ec1174-5172-4c31-8ed4-bed70c55586e","Type":"ContainerDied","Data":"19e822fa30205f02cddbce603441ea130558ae8fc2ef58c0b29365f6851944f5"} Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.879525 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc"] Jan 21 17:01:34 crc kubenswrapper[4707]: I0121 17:01:34.884906 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590vmbfc"] Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 
17:01:35.003236 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.158704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jftrz\" (UniqueName: \"kubernetes.io/projected/23ec1174-5172-4c31-8ed4-bed70c55586e-kube-api-access-jftrz\") pod \"23ec1174-5172-4c31-8ed4-bed70c55586e\" (UID: \"23ec1174-5172-4c31-8ed4-bed70c55586e\") " Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.163261 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ec1174-5172-4c31-8ed4-bed70c55586e-kube-api-access-jftrz" (OuterVolumeSpecName: "kube-api-access-jftrz") pod "23ec1174-5172-4c31-8ed4-bed70c55586e" (UID: "23ec1174-5172-4c31-8ed4-bed70c55586e"). InnerVolumeSpecName "kube-api-access-jftrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.183579 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.187870 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1" path="/var/lib/kubelet/pods/1038d4e6-3cec-4ffc-bdd6-ed1fae5f33b1/volumes" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.261320 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jftrz\" (UniqueName: \"kubernetes.io/projected/23ec1174-5172-4c31-8ed4-bed70c55586e-kube-api-access-jftrz\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.362960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6knf2\" (UniqueName: \"kubernetes.io/projected/d0a89d96-10da-4a61-94e3-0b6e73714360-kube-api-access-6knf2\") pod \"d0a89d96-10da-4a61-94e3-0b6e73714360\" (UID: \"d0a89d96-10da-4a61-94e3-0b6e73714360\") " Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.366579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a89d96-10da-4a61-94e3-0b6e73714360-kube-api-access-6knf2" (OuterVolumeSpecName: "kube-api-access-6knf2") pod "d0a89d96-10da-4a61-94e3-0b6e73714360" (UID: "d0a89d96-10da-4a61-94e3-0b6e73714360"). InnerVolumeSpecName "kube-api-access-6knf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.465034 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6knf2\" (UniqueName: \"kubernetes.io/projected/d0a89d96-10da-4a61-94e3-0b6e73714360-kube-api-access-6knf2\") on node \"crc\" DevicePath \"\"" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.883939 4707 generic.go:334] "Generic (PLEG): container finished" podID="d0a89d96-10da-4a61-94e3-0b6e73714360" containerID="f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa" exitCode=0 Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.883993 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.884031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" event={"ID":"d0a89d96-10da-4a61-94e3-0b6e73714360","Type":"ContainerDied","Data":"f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa"} Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.884064 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-m9pnv" event={"ID":"d0a89d96-10da-4a61-94e3-0b6e73714360","Type":"ContainerDied","Data":"c9da88badf24d462ee9bfa4cd35460549a47d480a1e6b143e867719e7689e140"} Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.884080 4707 scope.go:117] "RemoveContainer" containerID="f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.886127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" event={"ID":"23ec1174-5172-4c31-8ed4-bed70c55586e","Type":"ContainerDied","Data":"f47f348dafbe7f0e3813a3ce34de670624b6cf35e23e8c4617e47698d62f90c5"} Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.886151 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.898706 4707 scope.go:117] "RemoveContainer" containerID="f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa" Jan 21 17:01:35 crc kubenswrapper[4707]: E0121 17:01:35.899192 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa\": container with ID starting with f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa not found: ID does not exist" containerID="f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.899221 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa"} err="failed to get container status \"f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa\": rpc error: code = NotFound desc = could not find container \"f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa\": container with ID starting with f2facc8445ccb3fd4ca9e227c151c7400712048b7a309491a198e79f14569baa not found: ID does not exist" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.899241 4707 scope.go:117] "RemoveContainer" containerID="19e822fa30205f02cddbce603441ea130558ae8fc2ef58c0b29365f6851944f5" Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.902022 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk"] Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.905744 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-8gjjk"] Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.915224 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-m9pnv"] Jan 21 17:01:35 crc kubenswrapper[4707]: I0121 17:01:35.919843 4707 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-m9pnv"] Jan 21 17:01:35 crc kubenswrapper[4707]: E0121 17:01:35.969119 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a89d96_10da_4a61_94e3_0b6e73714360.slice/crio-c9da88badf24d462ee9bfa4cd35460549a47d480a1e6b143e867719e7689e140\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a89d96_10da_4a61_94e3_0b6e73714360.slice\": RecentStats: unable to find data in memory cache]" Jan 21 17:01:37 crc kubenswrapper[4707]: I0121 17:01:37.188412 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ec1174-5172-4c31-8ed4-bed70c55586e" path="/var/lib/kubelet/pods/23ec1174-5172-4c31-8ed4-bed70c55586e/volumes" Jan 21 17:01:37 crc kubenswrapper[4707]: I0121 17:01:37.189020 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a89d96-10da-4a61-94e3-0b6e73714360" path="/var/lib/kubelet/pods/d0a89d96-10da-4a61-94e3-0b6e73714360/volumes" Jan 21 17:01:39 crc kubenswrapper[4707]: I0121 17:01:39.945391 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:01:39 crc kubenswrapper[4707]: I0121 17:01:39.945445 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:02:09 crc kubenswrapper[4707]: I0121 17:02:09.945838 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:02:09 crc kubenswrapper[4707]: I0121 17:02:09.946226 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:02:09 crc kubenswrapper[4707]: I0121 17:02:09.946270 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:02:09 crc kubenswrapper[4707]: I0121 17:02:09.946837 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db6cb0803a4872ab9454e605e8d5382632c4ff38cb23724c972fdbbde4bc97c5"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:02:09 crc kubenswrapper[4707]: I0121 17:02:09.946882 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" 
containerName="machine-config-daemon" containerID="cri-o://db6cb0803a4872ab9454e605e8d5382632c4ff38cb23724c972fdbbde4bc97c5" gracePeriod=600 Jan 21 17:02:11 crc kubenswrapper[4707]: I0121 17:02:11.054593 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="db6cb0803a4872ab9454e605e8d5382632c4ff38cb23724c972fdbbde4bc97c5" exitCode=0 Jan 21 17:02:11 crc kubenswrapper[4707]: I0121 17:02:11.054656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"db6cb0803a4872ab9454e605e8d5382632c4ff38cb23724c972fdbbde4bc97c5"} Jan 21 17:02:11 crc kubenswrapper[4707]: I0121 17:02:11.055022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6"} Jan 21 17:02:11 crc kubenswrapper[4707]: I0121 17:02:11.055044 4707 scope.go:117] "RemoveContainer" containerID="98413debc5dbc4423ac3b9d3d168c9887327acc5691138ebae3215eee3c09782" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.516438 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-4pscr"] Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517029 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900dbba9-3a88-4094-8313-243234ffbe22" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517042 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="900dbba9-3a88-4094-8313-243234ffbe22" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517052 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" containerName="rabbitmq" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517059 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" containerName="rabbitmq" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517070 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="mysql-bootstrap" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517075 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="mysql-bootstrap" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517089 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517097 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517113 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b7fe44-243a-4e6a-b6f5-abf109cd69d5" 
containerName="memcached" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517118 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b7fe44-243a-4e6a-b6f5-abf109cd69d5" containerName="memcached" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517125 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517131 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517139 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f96623-749d-41d4-b59e-aedc4fd495bd" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517144 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f96623-749d-41d4-b59e-aedc4fd495bd" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517154 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerName="mysql-bootstrap" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517159 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerName="mysql-bootstrap" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517167 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerName="mysql-bootstrap" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517173 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerName="mysql-bootstrap" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517182 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8358c5ec-c227-4e08-bfa4-502096362233" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517187 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8358c5ec-c227-4e08-bfa4-502096362233" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517196 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517201 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517210 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517224 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" containerName="setup-container" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517229 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" containerName="setup-container" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517237 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ec1174-5172-4c31-8ed4-bed70c55586e" containerName="operator" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 
17:02:18.517242 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ec1174-5172-4c31-8ed4-bed70c55586e" containerName="operator" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517252 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c0641b-e565-49e9-a0d3-8afeaeca6835" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517257 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c0641b-e565-49e9-a0d3-8afeaeca6835" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: E0121 17:02:18.517267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a89d96-10da-4a61-94e3-0b6e73714360" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517272 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a89d96-10da-4a61-94e3-0b6e73714360" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517381 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7af2b5c-1c35-4ca8-9a60-a52430cecd7a" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517393 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93f9eb6-1c5f-43e1-9f24-43fb6e7efb03" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517401 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c0641b-e565-49e9-a0d3-8afeaeca6835" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517410 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f489928-26ae-45c9-851c-6072f97f37e0" containerName="rabbitmq" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517424 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8358c5ec-c227-4e08-bfa4-502096362233" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517435 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a89d96-10da-4a61-94e3-0b6e73714360" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517441 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f96623-749d-41d4-b59e-aedc4fd495bd" containerName="manager" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517449 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="522a96ca-cde2-4466-804e-85a0f8e56653" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517455 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="151f7a26-9818-45b1-90d5-d45f9ce116ce" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517462 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ec1174-5172-4c31-8ed4-bed70c55586e" containerName="operator" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517469 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="225ee0a9-1192-40ab-93d9-010fb4c4a07e" containerName="galera" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517478 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="900dbba9-3a88-4094-8313-243234ffbe22" containerName="registry-server" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.517486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b7fe44-243a-4e6a-b6f5-abf109cd69d5" containerName="memcached" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 
17:02:18.517900 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.519907 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.520379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-tcph6" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.520493 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.526448 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4pscr"] Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.571892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29zg5\" (UniqueName: \"kubernetes.io/projected/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14-kube-api-access-29zg5\") pod \"mariadb-operator-index-4pscr\" (UID: \"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14\") " pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.673105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29zg5\" (UniqueName: \"kubernetes.io/projected/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14-kube-api-access-29zg5\") pod \"mariadb-operator-index-4pscr\" (UID: \"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14\") " pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.687931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29zg5\" (UniqueName: \"kubernetes.io/projected/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14-kube-api-access-29zg5\") pod \"mariadb-operator-index-4pscr\" (UID: \"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14\") " pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.831055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:18 crc kubenswrapper[4707]: I0121 17:02:18.907880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4pscr"] Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.207984 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4pscr"] Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.211950 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.306751 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-sfqx5"] Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.307452 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.313562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-sfqx5"] Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.381757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxm2k\" (UniqueName: \"kubernetes.io/projected/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e-kube-api-access-lxm2k\") pod \"mariadb-operator-index-sfqx5\" (UID: \"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e\") " pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.482622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxm2k\" (UniqueName: \"kubernetes.io/projected/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e-kube-api-access-lxm2k\") pod \"mariadb-operator-index-sfqx5\" (UID: \"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e\") " pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.498087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxm2k\" (UniqueName: \"kubernetes.io/projected/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e-kube-api-access-lxm2k\") pod \"mariadb-operator-index-sfqx5\" (UID: \"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e\") " pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.620513 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:19 crc kubenswrapper[4707]: I0121 17:02:19.973025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-sfqx5"] Jan 21 17:02:19 crc kubenswrapper[4707]: W0121 17:02:19.977201 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6f012f_dbc1_4ad9_99ef_e9c2aee8018e.slice/crio-da0649089a4225d59ef3b3fecdbd40cf1a8ab056d814384c1ec8470037b60668 WatchSource:0}: Error finding container da0649089a4225d59ef3b3fecdbd40cf1a8ab056d814384c1ec8470037b60668: Status 404 returned error can't find the container with id da0649089a4225d59ef3b3fecdbd40cf1a8ab056d814384c1ec8470037b60668 Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.105518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4pscr" event={"ID":"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14","Type":"ContainerStarted","Data":"ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b"} Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.105724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4pscr" event={"ID":"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14","Type":"ContainerStarted","Data":"7e1f1dada7a3ba0ac81b7107cc74a19168184256bcde9f54a15e3e516ba3d70f"} Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.105617 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-4pscr" podUID="7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" containerName="registry-server" containerID="cri-o://ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b" gracePeriod=2 Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.107074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-index-sfqx5" event={"ID":"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e","Type":"ContainerStarted","Data":"da0649089a4225d59ef3b3fecdbd40cf1a8ab056d814384c1ec8470037b60668"} Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.117801 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-4pscr" podStartSLOduration=1.582350844 podStartE2EDuration="2.117787961s" podCreationTimestamp="2026-01-21 17:02:18 +0000 UTC" firstStartedPulling="2026-01-21 17:02:19.211734344 +0000 UTC m=+7236.393250566" lastFinishedPulling="2026-01-21 17:02:19.747171461 +0000 UTC m=+7236.928687683" observedRunningTime="2026-01-21 17:02:20.116108212 +0000 UTC m=+7237.297624435" watchObservedRunningTime="2026-01-21 17:02:20.117787961 +0000 UTC m=+7237.299304184" Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.385598 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.493929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29zg5\" (UniqueName: \"kubernetes.io/projected/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14-kube-api-access-29zg5\") pod \"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14\" (UID: \"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14\") " Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.498439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14-kube-api-access-29zg5" (OuterVolumeSpecName: "kube-api-access-29zg5") pod "7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" (UID: "7e0cef97-e7c7-4dd0-9635-9c80ff74ac14"). InnerVolumeSpecName "kube-api-access-29zg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:02:20 crc kubenswrapper[4707]: I0121 17:02:20.595456 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29zg5\" (UniqueName: \"kubernetes.io/projected/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14-kube-api-access-29zg5\") on node \"crc\" DevicePath \"\"" Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.113499 4707 generic.go:334] "Generic (PLEG): container finished" podID="7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" containerID="ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b" exitCode=0 Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.113542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4pscr" event={"ID":"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14","Type":"ContainerDied","Data":"ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b"} Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.113572 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-4pscr" Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.113586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4pscr" event={"ID":"7e0cef97-e7c7-4dd0-9635-9c80ff74ac14","Type":"ContainerDied","Data":"7e1f1dada7a3ba0ac81b7107cc74a19168184256bcde9f54a15e3e516ba3d70f"} Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.113604 4707 scope.go:117] "RemoveContainer" containerID="ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b" Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.114531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sfqx5" event={"ID":"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e","Type":"ContainerStarted","Data":"3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b"} Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.127456 4707 scope.go:117] "RemoveContainer" containerID="ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b" Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.128101 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-sfqx5" podStartSLOduration=1.647498771 podStartE2EDuration="2.128089924s" podCreationTimestamp="2026-01-21 17:02:19 +0000 UTC" firstStartedPulling="2026-01-21 17:02:19.979117522 +0000 UTC m=+7237.160633744" lastFinishedPulling="2026-01-21 17:02:20.459708675 +0000 UTC m=+7237.641224897" observedRunningTime="2026-01-21 17:02:21.125313863 +0000 UTC m=+7238.306830085" watchObservedRunningTime="2026-01-21 17:02:21.128089924 +0000 UTC m=+7238.309606146" Jan 21 17:02:21 crc kubenswrapper[4707]: E0121 17:02:21.128634 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b\": container with ID starting with ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b not found: ID does not exist" containerID="ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b" Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.128678 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b"} err="failed to get container status \"ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b\": rpc error: code = NotFound desc = could not find container \"ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b\": container with ID starting with ce5a49f8ebbaf9758926a3894d2b93193afa33b3880ab5536b134f8d9111342b not found: ID does not exist" Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.136223 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4pscr"] Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.139934 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-4pscr"] Jan 21 17:02:21 crc kubenswrapper[4707]: I0121 17:02:21.188526 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" path="/var/lib/kubelet/pods/7e0cef97-e7c7-4dd0-9635-9c80ff74ac14/volumes" Jan 21 17:02:27 crc kubenswrapper[4707]: I0121 17:02:27.856542 4707 scope.go:117] "RemoveContainer" 
containerID="c5fe65c16f49a9ebfa07c59ab35b9955a9dc28e54aeecd54c96d4be223569957" Jan 21 17:02:27 crc kubenswrapper[4707]: I0121 17:02:27.870874 4707 scope.go:117] "RemoveContainer" containerID="f25cecb56d365aba2332151eae2ddf2bdbbb0aaf97904f6f553ba7e356af6efe" Jan 21 17:02:27 crc kubenswrapper[4707]: I0121 17:02:27.895492 4707 scope.go:117] "RemoveContainer" containerID="7e5fe7a4124408927bb2892618f1daa185bf3e15c22b8a8396c3db5b024dadd6" Jan 21 17:02:27 crc kubenswrapper[4707]: I0121 17:02:27.909110 4707 scope.go:117] "RemoveContainer" containerID="407a67e2ce0c1c54d7bc8dea44ef05f88da9c0bee73d6f1b6fbe664c9ec87b59" Jan 21 17:02:27 crc kubenswrapper[4707]: I0121 17:02:27.926584 4707 scope.go:117] "RemoveContainer" containerID="bf836ffdb591b2c8d14e9bf966e787867f9b289c835c609ab4884e271b408e2f" Jan 21 17:02:27 crc kubenswrapper[4707]: I0121 17:02:27.942112 4707 scope.go:117] "RemoveContainer" containerID="ccdbabb86834f33aa5d679b5beca05b74fca5730cc2582b6805ca3a7ece2b044" Jan 21 17:02:29 crc kubenswrapper[4707]: I0121 17:02:29.621515 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:29 crc kubenswrapper[4707]: I0121 17:02:29.621764 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:29 crc kubenswrapper[4707]: I0121 17:02:29.642188 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.174820 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.533887 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m"] Jan 21 17:02:30 crc kubenswrapper[4707]: E0121 17:02:30.534118 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" containerName="registry-server" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.534136 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" containerName="registry-server" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.534250 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0cef97-e7c7-4dd0-9635-9c80ff74ac14" containerName="registry-server" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.535036 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.536573 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.537785 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m"] Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.609513 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.609842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.609886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvld\" (UniqueName: \"kubernetes.io/projected/244091ea-ba85-4055-a3cb-454b51a4cad7-kube-api-access-xkvld\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.710468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.710519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvld\" (UniqueName: \"kubernetes.io/projected/244091ea-ba85-4055-a3cb-454b51a4cad7-kube-api-access-xkvld\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.710571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.710975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.711234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.725787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvld\" (UniqueName: \"kubernetes.io/projected/244091ea-ba85-4055-a3cb-454b51a4cad7-kube-api-access-xkvld\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:30 crc kubenswrapper[4707]: I0121 17:02:30.850756 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:31 crc kubenswrapper[4707]: I0121 17:02:31.188213 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m"] Jan 21 17:02:31 crc kubenswrapper[4707]: W0121 17:02:31.189507 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244091ea_ba85_4055_a3cb_454b51a4cad7.slice/crio-135a1a74921b2752f273c0f817110ee26b9f5b445d723749005768cf592319a6 WatchSource:0}: Error finding container 135a1a74921b2752f273c0f817110ee26b9f5b445d723749005768cf592319a6: Status 404 returned error can't find the container with id 135a1a74921b2752f273c0f817110ee26b9f5b445d723749005768cf592319a6 Jan 21 17:02:32 crc kubenswrapper[4707]: I0121 17:02:32.168740 4707 generic.go:334] "Generic (PLEG): container finished" podID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerID="de411360bc4b5977334de837fa6bdcb9017972d0b0d4816e79f3bdd450162c29" exitCode=0 Jan 21 17:02:32 crc kubenswrapper[4707]: I0121 17:02:32.168801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" event={"ID":"244091ea-ba85-4055-a3cb-454b51a4cad7","Type":"ContainerDied","Data":"de411360bc4b5977334de837fa6bdcb9017972d0b0d4816e79f3bdd450162c29"} Jan 21 17:02:32 crc kubenswrapper[4707]: I0121 17:02:32.168934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" event={"ID":"244091ea-ba85-4055-a3cb-454b51a4cad7","Type":"ContainerStarted","Data":"135a1a74921b2752f273c0f817110ee26b9f5b445d723749005768cf592319a6"} Jan 21 17:02:33 crc kubenswrapper[4707]: I0121 17:02:33.175207 4707 generic.go:334] "Generic (PLEG): container finished" podID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerID="5430702e1379ee18a0761e8f5d2a1a9dfbacc8275de3071c10c3a4dac2f3229e" exitCode=0 Jan 21 17:02:33 crc kubenswrapper[4707]: I0121 17:02:33.175236 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" event={"ID":"244091ea-ba85-4055-a3cb-454b51a4cad7","Type":"ContainerDied","Data":"5430702e1379ee18a0761e8f5d2a1a9dfbacc8275de3071c10c3a4dac2f3229e"} Jan 21 17:02:34 crc kubenswrapper[4707]: I0121 17:02:34.185545 4707 generic.go:334] "Generic (PLEG): container finished" podID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerID="6be84cfbe1530b64ec0502ae2b88e99fb02da4152ffeb9a1377d51d9cf583d03" exitCode=0 Jan 21 17:02:34 crc kubenswrapper[4707]: I0121 17:02:34.185610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" event={"ID":"244091ea-ba85-4055-a3cb-454b51a4cad7","Type":"ContainerDied","Data":"6be84cfbe1530b64ec0502ae2b88e99fb02da4152ffeb9a1377d51d9cf583d03"} Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.391937 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.465402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-bundle\") pod \"244091ea-ba85-4055-a3cb-454b51a4cad7\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.465490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkvld\" (UniqueName: \"kubernetes.io/projected/244091ea-ba85-4055-a3cb-454b51a4cad7-kube-api-access-xkvld\") pod \"244091ea-ba85-4055-a3cb-454b51a4cad7\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.465532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-util\") pod \"244091ea-ba85-4055-a3cb-454b51a4cad7\" (UID: \"244091ea-ba85-4055-a3cb-454b51a4cad7\") " Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.466472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-bundle" (OuterVolumeSpecName: "bundle") pod "244091ea-ba85-4055-a3cb-454b51a4cad7" (UID: "244091ea-ba85-4055-a3cb-454b51a4cad7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.467047 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.471926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244091ea-ba85-4055-a3cb-454b51a4cad7-kube-api-access-xkvld" (OuterVolumeSpecName: "kube-api-access-xkvld") pod "244091ea-ba85-4055-a3cb-454b51a4cad7" (UID: "244091ea-ba85-4055-a3cb-454b51a4cad7"). InnerVolumeSpecName "kube-api-access-xkvld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.477454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-util" (OuterVolumeSpecName: "util") pod "244091ea-ba85-4055-a3cb-454b51a4cad7" (UID: "244091ea-ba85-4055-a3cb-454b51a4cad7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.575763 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkvld\" (UniqueName: \"kubernetes.io/projected/244091ea-ba85-4055-a3cb-454b51a4cad7-kube-api-access-xkvld\") on node \"crc\" DevicePath \"\"" Jan 21 17:02:35 crc kubenswrapper[4707]: I0121 17:02:35.575822 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244091ea-ba85-4055-a3cb-454b51a4cad7-util\") on node \"crc\" DevicePath \"\"" Jan 21 17:02:36 crc kubenswrapper[4707]: I0121 17:02:36.199197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" event={"ID":"244091ea-ba85-4055-a3cb-454b51a4cad7","Type":"ContainerDied","Data":"135a1a74921b2752f273c0f817110ee26b9f5b445d723749005768cf592319a6"} Jan 21 17:02:36 crc kubenswrapper[4707]: I0121 17:02:36.199240 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135a1a74921b2752f273c0f817110ee26b9f5b445d723749005768cf592319a6" Jan 21 17:02:36 crc kubenswrapper[4707]: I0121 17:02:36.199248 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.419769 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf"] Jan 21 17:02:44 crc kubenswrapper[4707]: E0121 17:02:44.420371 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="extract" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.420386 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="extract" Jan 21 17:02:44 crc kubenswrapper[4707]: E0121 17:02:44.420396 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="util" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.420402 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="util" Jan 21 17:02:44 crc kubenswrapper[4707]: E0121 17:02:44.420418 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="pull" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.420424 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="pull" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.420583 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" containerName="extract" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.421119 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.423076 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-m6l54" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.423763 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.425069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.436132 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf"] Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.491820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-webhook-cert\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.491862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-apiservice-cert\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.491908 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d262t\" (UniqueName: \"kubernetes.io/projected/810dd4cc-c978-43bd-a4e6-35f378338c0d-kube-api-access-d262t\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.592909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-apiservice-cert\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.592981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d262t\" (UniqueName: \"kubernetes.io/projected/810dd4cc-c978-43bd-a4e6-35f378338c0d-kube-api-access-d262t\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.593045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-webhook-cert\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") 
" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.600801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-apiservice-cert\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.600883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-webhook-cert\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.610078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d262t\" (UniqueName: \"kubernetes.io/projected/810dd4cc-c978-43bd-a4e6-35f378338c0d-kube-api-access-d262t\") pod \"mariadb-operator-controller-manager-566b5f8c4c-9nxgf\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:44 crc kubenswrapper[4707]: I0121 17:02:44.735363 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:45 crc kubenswrapper[4707]: I0121 17:02:45.094874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf"] Jan 21 17:02:45 crc kubenswrapper[4707]: W0121 17:02:45.098584 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810dd4cc_c978_43bd_a4e6_35f378338c0d.slice/crio-de32cebb8007fd7a0a915848ddd9d700599195b35b859aef7f0955be4db2b23a WatchSource:0}: Error finding container de32cebb8007fd7a0a915848ddd9d700599195b35b859aef7f0955be4db2b23a: Status 404 returned error can't find the container with id de32cebb8007fd7a0a915848ddd9d700599195b35b859aef7f0955be4db2b23a Jan 21 17:02:45 crc kubenswrapper[4707]: I0121 17:02:45.243771 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" event={"ID":"810dd4cc-c978-43bd-a4e6-35f378338c0d","Type":"ContainerStarted","Data":"18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1"} Jan 21 17:02:45 crc kubenswrapper[4707]: I0121 17:02:45.243987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" event={"ID":"810dd4cc-c978-43bd-a4e6-35f378338c0d","Type":"ContainerStarted","Data":"de32cebb8007fd7a0a915848ddd9d700599195b35b859aef7f0955be4db2b23a"} Jan 21 17:02:45 crc kubenswrapper[4707]: I0121 17:02:45.244033 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:45 crc kubenswrapper[4707]: I0121 17:02:45.260227 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" podStartSLOduration=1.260210051 podStartE2EDuration="1.260210051s" 
podCreationTimestamp="2026-01-21 17:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:02:45.256389358 +0000 UTC m=+7262.437905579" watchObservedRunningTime="2026-01-21 17:02:45.260210051 +0000 UTC m=+7262.441726274" Jan 21 17:02:54 crc kubenswrapper[4707]: I0121 17:02:54.739455 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.580206 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-5npmj"] Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.581217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.583183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-qnjlp" Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.588433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5npmj"] Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.649461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5dfv\" (UniqueName: \"kubernetes.io/projected/b0b03c15-5581-455d-9851-c7eaf1c3a046-kube-api-access-j5dfv\") pod \"infra-operator-index-5npmj\" (UID: \"b0b03c15-5581-455d-9851-c7eaf1c3a046\") " pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.751440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5dfv\" (UniqueName: \"kubernetes.io/projected/b0b03c15-5581-455d-9851-c7eaf1c3a046-kube-api-access-j5dfv\") pod \"infra-operator-index-5npmj\" (UID: \"b0b03c15-5581-455d-9851-c7eaf1c3a046\") " pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.768875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5dfv\" (UniqueName: \"kubernetes.io/projected/b0b03c15-5581-455d-9851-c7eaf1c3a046-kube-api-access-j5dfv\") pod \"infra-operator-index-5npmj\" (UID: \"b0b03c15-5581-455d-9851-c7eaf1c3a046\") " pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:02:56 crc kubenswrapper[4707]: I0121 17:02:56.897707 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:02:57 crc kubenswrapper[4707]: I0121 17:02:57.247250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5npmj"] Jan 21 17:02:57 crc kubenswrapper[4707]: I0121 17:02:57.301505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5npmj" event={"ID":"b0b03c15-5581-455d-9851-c7eaf1c3a046","Type":"ContainerStarted","Data":"626f37de1a8ffc32240b5063563a1efacb15d1ce0eb5efce40484740b634eb99"} Jan 21 17:02:58 crc kubenswrapper[4707]: I0121 17:02:58.308418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5npmj" event={"ID":"b0b03c15-5581-455d-9851-c7eaf1c3a046","Type":"ContainerStarted","Data":"e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d"} Jan 21 17:02:58 crc kubenswrapper[4707]: I0121 17:02:58.322327 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-5npmj" podStartSLOduration=1.719360883 podStartE2EDuration="2.322310713s" podCreationTimestamp="2026-01-21 17:02:56 +0000 UTC" firstStartedPulling="2026-01-21 17:02:57.253018243 +0000 UTC m=+7274.434534464" lastFinishedPulling="2026-01-21 17:02:57.855968072 +0000 UTC m=+7275.037484294" observedRunningTime="2026-01-21 17:02:58.319833936 +0000 UTC m=+7275.501350158" watchObservedRunningTime="2026-01-21 17:02:58.322310713 +0000 UTC m=+7275.503826935" Jan 21 17:03:00 crc kubenswrapper[4707]: I0121 17:03:00.773998 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-5npmj"] Jan 21 17:03:00 crc kubenswrapper[4707]: I0121 17:03:00.774161 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-5npmj" podUID="b0b03c15-5581-455d-9851-c7eaf1c3a046" containerName="registry-server" containerID="cri-o://e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d" gracePeriod=2 Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.104576 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.213136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5dfv\" (UniqueName: \"kubernetes.io/projected/b0b03c15-5581-455d-9851-c7eaf1c3a046-kube-api-access-j5dfv\") pod \"b0b03c15-5581-455d-9851-c7eaf1c3a046\" (UID: \"b0b03c15-5581-455d-9851-c7eaf1c3a046\") " Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.218010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b03c15-5581-455d-9851-c7eaf1c3a046-kube-api-access-j5dfv" (OuterVolumeSpecName: "kube-api-access-j5dfv") pod "b0b03c15-5581-455d-9851-c7eaf1c3a046" (UID: "b0b03c15-5581-455d-9851-c7eaf1c3a046"). InnerVolumeSpecName "kube-api-access-j5dfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.315152 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5dfv\" (UniqueName: \"kubernetes.io/projected/b0b03c15-5581-455d-9851-c7eaf1c3a046-kube-api-access-j5dfv\") on node \"crc\" DevicePath \"\"" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.326364 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0b03c15-5581-455d-9851-c7eaf1c3a046" containerID="e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d" exitCode=0 Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.326413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5npmj" event={"ID":"b0b03c15-5581-455d-9851-c7eaf1c3a046","Type":"ContainerDied","Data":"e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d"} Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.326446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5npmj" event={"ID":"b0b03c15-5581-455d-9851-c7eaf1c3a046","Type":"ContainerDied","Data":"626f37de1a8ffc32240b5063563a1efacb15d1ce0eb5efce40484740b634eb99"} Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.326467 4707 scope.go:117] "RemoveContainer" containerID="e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.326617 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5npmj" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.349881 4707 scope.go:117] "RemoveContainer" containerID="e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d" Jan 21 17:03:01 crc kubenswrapper[4707]: E0121 17:03:01.350299 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d\": container with ID starting with e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d not found: ID does not exist" containerID="e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.350346 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d"} err="failed to get container status \"e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d\": rpc error: code = NotFound desc = could not find container \"e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d\": container with ID starting with e0afdeb54e2432354cf009abce44ebebded49b74b277309d451366374338817d not found: ID does not exist" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.363751 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-5npmj"] Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.373578 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-5npmj"] Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.387172 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-hll9d"] Jan 21 17:03:01 crc kubenswrapper[4707]: E0121 17:03:01.387416 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b03c15-5581-455d-9851-c7eaf1c3a046" 
containerName="registry-server" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.387432 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b03c15-5581-455d-9851-c7eaf1c3a046" containerName="registry-server" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.387532 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b03c15-5581-455d-9851-c7eaf1c3a046" containerName="registry-server" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.387954 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.389972 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-qnjlp" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.397345 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-hll9d"] Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.517169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcpwp\" (UniqueName: \"kubernetes.io/projected/72bc4e04-1249-4797-9a6f-044c6245e3d7-kube-api-access-kcpwp\") pod \"infra-operator-index-hll9d\" (UID: \"72bc4e04-1249-4797-9a6f-044c6245e3d7\") " pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.618101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpwp\" (UniqueName: \"kubernetes.io/projected/72bc4e04-1249-4797-9a6f-044c6245e3d7-kube-api-access-kcpwp\") pod \"infra-operator-index-hll9d\" (UID: \"72bc4e04-1249-4797-9a6f-044c6245e3d7\") " pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.633349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcpwp\" (UniqueName: \"kubernetes.io/projected/72bc4e04-1249-4797-9a6f-044c6245e3d7-kube-api-access-kcpwp\") pod \"infra-operator-index-hll9d\" (UID: \"72bc4e04-1249-4797-9a6f-044c6245e3d7\") " pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:01 crc kubenswrapper[4707]: I0121 17:03:01.704846 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.064740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-hll9d"] Jan 21 17:03:02 crc kubenswrapper[4707]: W0121 17:03:02.069140 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72bc4e04_1249_4797_9a6f_044c6245e3d7.slice/crio-2f616bc927ac98df60ca6034a5f5c5e7084a353dfddaa2c2dcef6a6fdce467f7 WatchSource:0}: Error finding container 2f616bc927ac98df60ca6034a5f5c5e7084a353dfddaa2c2dcef6a6fdce467f7: Status 404 returned error can't find the container with id 2f616bc927ac98df60ca6034a5f5c5e7084a353dfddaa2c2dcef6a6fdce467f7 Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.331889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-hll9d" event={"ID":"72bc4e04-1249-4797-9a6f-044c6245e3d7","Type":"ContainerStarted","Data":"2f616bc927ac98df60ca6034a5f5c5e7084a353dfddaa2c2dcef6a6fdce467f7"} Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.593795 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.596181 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.605604 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"galera-openstack-dockercfg-5hrdf" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.607485 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openstack-scripts" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.608261 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openshift-service-ca.crt" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.608321 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"kube-root-ca.crt" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.609991 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openstack-config-data" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.614903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.615984 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.623443 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.624869 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.628178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.632983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.639317 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.731882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-operator-scripts\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.731938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-operator-scripts\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.731963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-default\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.731986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-operator-scripts\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-generated\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-generated\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-default\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-generated\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kolla-config\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kolla-config\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4h98\" (UniqueName: \"kubernetes.io/projected/6054a174-42ee-4839-a37a-677ac9a78b68-kube-api-access-s4h98\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kwr6\" (UniqueName: \"kubernetes.io/projected/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kube-api-access-4kwr6\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-default\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dvb\" (UniqueName: \"kubernetes.io/projected/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kube-api-access-l5dvb\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732376 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-kolla-config\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.732392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-generated\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-generated\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-default\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-generated\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kolla-config\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.834954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kolla-config\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.835213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4h98\" (UniqueName: \"kubernetes.io/projected/6054a174-42ee-4839-a37a-677ac9a78b68-kube-api-access-s4h98\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.835318 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") device mount path \"/mnt/openstack/pv06\"" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.835769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-generated\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.835861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-generated\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.835951 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-generated\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.835213 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") device mount path \"/mnt/openstack/pv18\"" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836052 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kwr6\" (UniqueName: \"kubernetes.io/projected/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kube-api-access-4kwr6\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-default\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dvb\" (UniqueName: 
\"kubernetes.io/projected/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kube-api-access-l5dvb\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kolla-config\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kolla-config\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-kolla-config\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-default\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836472 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") device mount path \"/mnt/openstack/pv13\"" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-operator-scripts\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836557 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-operator-scripts\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-default\") pod \"openstack-galera-2\" (UID: 
\"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-operator-scripts\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-default\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.836973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-kolla-config\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.837596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-default\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.838224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-operator-scripts\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.838287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-operator-scripts\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.838792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-operator-scripts\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.850144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5dvb\" (UniqueName: \"kubernetes.io/projected/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kube-api-access-l5dvb\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.850152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kwr6\" (UniqueName: \"kubernetes.io/projected/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kube-api-access-4kwr6\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 
17:03:02.850910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.851125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4h98\" (UniqueName: \"kubernetes.io/projected/6054a174-42ee-4839-a37a-677ac9a78b68-kube-api-access-s4h98\") pod \"openstack-galera-1\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.851908 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.852100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.918350 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.930414 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:02 crc kubenswrapper[4707]: I0121 17:03:02.980141 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.189362 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b03c15-5581-455d-9851-c7eaf1c3a046" path="/var/lib/kubelet/pods/b0b03c15-5581-455d-9851-c7eaf1c3a046/volumes" Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.279404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 17:03:03 crc kubenswrapper[4707]: W0121 17:03:03.284056 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65e6dbe8_c2a6_4589_88cb_e16c1b718141.slice/crio-e85f7d6f5a7308d578a7650d245cf539bb7db82273f2d60be6d6dd4c959596d1 WatchSource:0}: Error finding container e85f7d6f5a7308d578a7650d245cf539bb7db82273f2d60be6d6dd4c959596d1: Status 404 returned error can't find the container with id e85f7d6f5a7308d578a7650d245cf539bb7db82273f2d60be6d6dd4c959596d1 Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.321287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 17:03:03 crc kubenswrapper[4707]: W0121 17:03:03.323021 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6054a174_42ee_4839_a37a_677ac9a78b68.slice/crio-7b7d335f05418621d7e429a0130e97a7094bd32e6ebd6c7f5a82cb2159f1e8d7 WatchSource:0}: Error finding container 7b7d335f05418621d7e429a0130e97a7094bd32e6ebd6c7f5a82cb2159f1e8d7: Status 404 returned error can't find the container with id 7b7d335f05418621d7e429a0130e97a7094bd32e6ebd6c7f5a82cb2159f1e8d7 Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.340154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"65e6dbe8-c2a6-4589-88cb-e16c1b718141","Type":"ContainerStarted","Data":"e85f7d6f5a7308d578a7650d245cf539bb7db82273f2d60be6d6dd4c959596d1"} Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.341128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"6054a174-42ee-4839-a37a-677ac9a78b68","Type":"ContainerStarted","Data":"7b7d335f05418621d7e429a0130e97a7094bd32e6ebd6c7f5a82cb2159f1e8d7"} Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.342222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-hll9d" event={"ID":"72bc4e04-1249-4797-9a6f-044c6245e3d7","Type":"ContainerStarted","Data":"a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5"} Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.357952 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-hll9d" podStartSLOduration=1.817214156 podStartE2EDuration="2.357825301s" podCreationTimestamp="2026-01-21 17:03:01 +0000 UTC" firstStartedPulling="2026-01-21 17:03:02.071711947 +0000 UTC m=+7279.253228170" lastFinishedPulling="2026-01-21 17:03:02.612323093 +0000 UTC m=+7279.793839315" observedRunningTime="2026-01-21 17:03:03.354184184 +0000 UTC m=+7280.535700406" watchObservedRunningTime="2026-01-21 17:03:03.357825301 +0000 UTC m=+7280.539341522" Jan 21 17:03:03 crc kubenswrapper[4707]: I0121 17:03:03.371562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 17:03:03 crc kubenswrapper[4707]: W0121 17:03:03.375924 4707 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89c5cc72_daa5_42e2_91af_5af2f1ab1c0c.slice/crio-7e3c543ba6ff63913faa2b7a1ac37656a265e4ef5bb4df335464ad1032578b17 WatchSource:0}: Error finding container 7e3c543ba6ff63913faa2b7a1ac37656a265e4ef5bb4df335464ad1032578b17: Status 404 returned error can't find the container with id 7e3c543ba6ff63913faa2b7a1ac37656a265e4ef5bb4df335464ad1032578b17 Jan 21 17:03:04 crc kubenswrapper[4707]: I0121 17:03:04.348585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c","Type":"ContainerStarted","Data":"f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47"} Jan 21 17:03:04 crc kubenswrapper[4707]: I0121 17:03:04.348657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c","Type":"ContainerStarted","Data":"7e3c543ba6ff63913faa2b7a1ac37656a265e4ef5bb4df335464ad1032578b17"} Jan 21 17:03:04 crc kubenswrapper[4707]: I0121 17:03:04.350047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"65e6dbe8-c2a6-4589-88cb-e16c1b718141","Type":"ContainerStarted","Data":"d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02"} Jan 21 17:03:04 crc kubenswrapper[4707]: I0121 17:03:04.351137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"6054a174-42ee-4839-a37a-677ac9a78b68","Type":"ContainerStarted","Data":"5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd"} Jan 21 17:03:06 crc kubenswrapper[4707]: I0121 17:03:06.361990 4707 generic.go:334] "Generic (PLEG): container finished" podID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerID="d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02" exitCode=0 Jan 21 17:03:06 crc kubenswrapper[4707]: I0121 17:03:06.362065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"65e6dbe8-c2a6-4589-88cb-e16c1b718141","Type":"ContainerDied","Data":"d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02"} Jan 21 17:03:06 crc kubenswrapper[4707]: I0121 17:03:06.363730 4707 generic.go:334] "Generic (PLEG): container finished" podID="6054a174-42ee-4839-a37a-677ac9a78b68" containerID="5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd" exitCode=0 Jan 21 17:03:06 crc kubenswrapper[4707]: I0121 17:03:06.363790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"6054a174-42ee-4839-a37a-677ac9a78b68","Type":"ContainerDied","Data":"5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd"} Jan 21 17:03:06 crc kubenswrapper[4707]: I0121 17:03:06.364747 4707 generic.go:334] "Generic (PLEG): container finished" podID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerID="f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47" exitCode=0 Jan 21 17:03:06 crc kubenswrapper[4707]: I0121 17:03:06.364785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c","Type":"ContainerDied","Data":"f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47"} Jan 21 17:03:07 crc kubenswrapper[4707]: I0121 17:03:07.372395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c","Type":"ContainerStarted","Data":"5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67"} Jan 21 17:03:07 crc kubenswrapper[4707]: I0121 17:03:07.374174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"65e6dbe8-c2a6-4589-88cb-e16c1b718141","Type":"ContainerStarted","Data":"e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed"} Jan 21 17:03:07 crc kubenswrapper[4707]: I0121 17:03:07.375699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"6054a174-42ee-4839-a37a-677ac9a78b68","Type":"ContainerStarted","Data":"f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052"} Jan 21 17:03:07 crc kubenswrapper[4707]: I0121 17:03:07.386076 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-2" podStartSLOduration=6.386061529 podStartE2EDuration="6.386061529s" podCreationTimestamp="2026-01-21 17:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:03:07.38513105 +0000 UTC m=+7284.566647271" watchObservedRunningTime="2026-01-21 17:03:07.386061529 +0000 UTC m=+7284.567577752" Jan 21 17:03:07 crc kubenswrapper[4707]: I0121 17:03:07.400418 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-1" podStartSLOduration=6.400403606 podStartE2EDuration="6.400403606s" podCreationTimestamp="2026-01-21 17:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:03:07.397121395 +0000 UTC m=+7284.578637616" watchObservedRunningTime="2026-01-21 17:03:07.400403606 +0000 UTC m=+7284.581919828" Jan 21 17:03:07 crc kubenswrapper[4707]: I0121 17:03:07.412988 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-0" podStartSLOduration=6.412972769 podStartE2EDuration="6.412972769s" podCreationTimestamp="2026-01-21 17:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:03:07.410524516 +0000 UTC m=+7284.592040737" watchObservedRunningTime="2026-01-21 17:03:07.412972769 +0000 UTC m=+7284.594488992" Jan 21 17:03:11 crc kubenswrapper[4707]: I0121 17:03:11.704906 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:11 crc kubenswrapper[4707]: I0121 17:03:11.705282 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:11 crc kubenswrapper[4707]: I0121 17:03:11.724189 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.419239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.918690 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.918890 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.931489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.931557 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.980847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:12 crc kubenswrapper[4707]: I0121 17:03:12.980947 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:13 crc kubenswrapper[4707]: I0121 17:03:13.030246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:13 crc kubenswrapper[4707]: I0121 17:03:13.450044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.237710 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.287266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.801118 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk"] Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.802302 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.803584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.813272 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk"] Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.954487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.954616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m52f\" (UniqueName: \"kubernetes.io/projected/f90da28a-77de-44e6-ae52-10f83fdcbe73-kube-api-access-6m52f\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:19 crc kubenswrapper[4707]: I0121 17:03:19.954739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.056446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.056499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.056553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m52f\" (UniqueName: \"kubernetes.io/projected/f90da28a-77de-44e6-ae52-10f83fdcbe73-kube-api-access-6m52f\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.056969 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.057038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.071662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m52f\" (UniqueName: \"kubernetes.io/projected/f90da28a-77de-44e6-ae52-10f83fdcbe73-kube-api-access-6m52f\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.131116 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:20 crc kubenswrapper[4707]: I0121 17:03:20.482277 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk"] Jan 21 17:03:20 crc kubenswrapper[4707]: W0121 17:03:20.484293 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90da28a_77de_44e6_ae52_10f83fdcbe73.slice/crio-549980ffde5cd15f8711b49a18e8cc8b34c7e6064e5d494b2b898b7d893ff743 WatchSource:0}: Error finding container 549980ffde5cd15f8711b49a18e8cc8b34c7e6064e5d494b2b898b7d893ff743: Status 404 returned error can't find the container with id 549980ffde5cd15f8711b49a18e8cc8b34c7e6064e5d494b2b898b7d893ff743 Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.443239 4707 generic.go:334] "Generic (PLEG): container finished" podID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerID="7a1d61ffec3eca3b78decbadd6c286d9e0f4635ab8604112d41dc7c30fc4fe9b" exitCode=0 Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.443287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" event={"ID":"f90da28a-77de-44e6-ae52-10f83fdcbe73","Type":"ContainerDied","Data":"7a1d61ffec3eca3b78decbadd6c286d9e0f4635ab8604112d41dc7c30fc4fe9b"} Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.443466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" event={"ID":"f90da28a-77de-44e6-ae52-10f83fdcbe73","Type":"ContainerStarted","Data":"549980ffde5cd15f8711b49a18e8cc8b34c7e6064e5d494b2b898b7d893ff743"} Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.642254 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/root-account-create-update-vszqp"] Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.643681 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.645465 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.649322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-vszqp"] Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.777461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw5gb\" (UniqueName: \"kubernetes.io/projected/559c5305-a2b7-4a06-87a7-afd6501038d7-kube-api-access-zw5gb\") pod \"root-account-create-update-vszqp\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.777845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/559c5305-a2b7-4a06-87a7-afd6501038d7-operator-scripts\") pod \"root-account-create-update-vszqp\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.878879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw5gb\" (UniqueName: \"kubernetes.io/projected/559c5305-a2b7-4a06-87a7-afd6501038d7-kube-api-access-zw5gb\") pod \"root-account-create-update-vszqp\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.878950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/559c5305-a2b7-4a06-87a7-afd6501038d7-operator-scripts\") pod \"root-account-create-update-vszqp\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.879584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/559c5305-a2b7-4a06-87a7-afd6501038d7-operator-scripts\") pod \"root-account-create-update-vszqp\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.895068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw5gb\" (UniqueName: \"kubernetes.io/projected/559c5305-a2b7-4a06-87a7-afd6501038d7-kube-api-access-zw5gb\") pod \"root-account-create-update-vszqp\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:21 crc kubenswrapper[4707]: I0121 17:03:21.959324 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:22 crc kubenswrapper[4707]: I0121 17:03:22.322500 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-vszqp"] Jan 21 17:03:22 crc kubenswrapper[4707]: W0121 17:03:22.325934 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod559c5305_a2b7_4a06_87a7_afd6501038d7.slice/crio-ebfa8e7f99639647ea2e477db332bd767fafa95301bdb9ecf07b4693e5b58071 WatchSource:0}: Error finding container ebfa8e7f99639647ea2e477db332bd767fafa95301bdb9ecf07b4693e5b58071: Status 404 returned error can't find the container with id ebfa8e7f99639647ea2e477db332bd767fafa95301bdb9ecf07b4693e5b58071 Jan 21 17:03:22 crc kubenswrapper[4707]: I0121 17:03:22.450957 4707 generic.go:334] "Generic (PLEG): container finished" podID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerID="5c0d24bd39f259526cfb4f3db26bbd70d2a3155cb4d5d3fd571e29a3554d1e23" exitCode=0 Jan 21 17:03:22 crc kubenswrapper[4707]: I0121 17:03:22.451116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" event={"ID":"f90da28a-77de-44e6-ae52-10f83fdcbe73","Type":"ContainerDied","Data":"5c0d24bd39f259526cfb4f3db26bbd70d2a3155cb4d5d3fd571e29a3554d1e23"} Jan 21 17:03:22 crc kubenswrapper[4707]: I0121 17:03:22.453563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-vszqp" event={"ID":"559c5305-a2b7-4a06-87a7-afd6501038d7","Type":"ContainerStarted","Data":"ff5ebfabf88da6dabfa7e2e83769bc70aaabd249d3829ef1a6288905e06ce7c7"} Jan 21 17:03:22 crc kubenswrapper[4707]: I0121 17:03:22.453596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-vszqp" event={"ID":"559c5305-a2b7-4a06-87a7-afd6501038d7","Type":"ContainerStarted","Data":"ebfa8e7f99639647ea2e477db332bd767fafa95301bdb9ecf07b4693e5b58071"} Jan 21 17:03:22 crc kubenswrapper[4707]: I0121 17:03:22.481135 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/root-account-create-update-vszqp" podStartSLOduration=1.481119912 podStartE2EDuration="1.481119912s" podCreationTimestamp="2026-01-21 17:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:03:22.477168983 +0000 UTC m=+7299.658685205" watchObservedRunningTime="2026-01-21 17:03:22.481119912 +0000 UTC m=+7299.662636134" Jan 21 17:03:23 crc kubenswrapper[4707]: I0121 17:03:23.461800 4707 generic.go:334] "Generic (PLEG): container finished" podID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerID="e9e4b3e25a818ce85b0b358bb9e60401b6b85c6eea5a42a8c798d06fc5ed35fd" exitCode=0 Jan 21 17:03:23 crc kubenswrapper[4707]: I0121 17:03:23.461921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" event={"ID":"f90da28a-77de-44e6-ae52-10f83fdcbe73","Type":"ContainerDied","Data":"e9e4b3e25a818ce85b0b358bb9e60401b6b85c6eea5a42a8c798d06fc5ed35fd"} Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.713451 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.824556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-util\") pod \"f90da28a-77de-44e6-ae52-10f83fdcbe73\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.824660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-bundle\") pod \"f90da28a-77de-44e6-ae52-10f83fdcbe73\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.824753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m52f\" (UniqueName: \"kubernetes.io/projected/f90da28a-77de-44e6-ae52-10f83fdcbe73-kube-api-access-6m52f\") pod \"f90da28a-77de-44e6-ae52-10f83fdcbe73\" (UID: \"f90da28a-77de-44e6-ae52-10f83fdcbe73\") " Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.826197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-bundle" (OuterVolumeSpecName: "bundle") pod "f90da28a-77de-44e6-ae52-10f83fdcbe73" (UID: "f90da28a-77de-44e6-ae52-10f83fdcbe73"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.830027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f90da28a-77de-44e6-ae52-10f83fdcbe73-kube-api-access-6m52f" (OuterVolumeSpecName: "kube-api-access-6m52f") pod "f90da28a-77de-44e6-ae52-10f83fdcbe73" (UID: "f90da28a-77de-44e6-ae52-10f83fdcbe73"). InnerVolumeSpecName "kube-api-access-6m52f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.835393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-util" (OuterVolumeSpecName: "util") pod "f90da28a-77de-44e6-ae52-10f83fdcbe73" (UID: "f90da28a-77de-44e6-ae52-10f83fdcbe73"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.926467 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m52f\" (UniqueName: \"kubernetes.io/projected/f90da28a-77de-44e6-ae52-10f83fdcbe73-kube-api-access-6m52f\") on node \"crc\" DevicePath \"\"" Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.926499 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-util\") on node \"crc\" DevicePath \"\"" Jan 21 17:03:24 crc kubenswrapper[4707]: I0121 17:03:24.926510 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f90da28a-77de-44e6-ae52-10f83fdcbe73-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:03:25 crc kubenswrapper[4707]: I0121 17:03:25.474861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" event={"ID":"f90da28a-77de-44e6-ae52-10f83fdcbe73","Type":"ContainerDied","Data":"549980ffde5cd15f8711b49a18e8cc8b34c7e6064e5d494b2b898b7d893ff743"} Jan 21 17:03:25 crc kubenswrapper[4707]: I0121 17:03:25.474939 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549980ffde5cd15f8711b49a18e8cc8b34c7e6064e5d494b2b898b7d893ff743" Jan 21 17:03:25 crc kubenswrapper[4707]: I0121 17:03:25.474900 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk" Jan 21 17:03:25 crc kubenswrapper[4707]: I0121 17:03:25.476306 4707 generic.go:334] "Generic (PLEG): container finished" podID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerID="ff5ebfabf88da6dabfa7e2e83769bc70aaabd249d3829ef1a6288905e06ce7c7" exitCode=1 Jan 21 17:03:25 crc kubenswrapper[4707]: I0121 17:03:25.476340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-vszqp" event={"ID":"559c5305-a2b7-4a06-87a7-afd6501038d7","Type":"ContainerDied","Data":"ff5ebfabf88da6dabfa7e2e83769bc70aaabd249d3829ef1a6288905e06ce7c7"} Jan 21 17:03:25 crc kubenswrapper[4707]: I0121 17:03:25.476655 4707 scope.go:117] "RemoveContainer" containerID="ff5ebfabf88da6dabfa7e2e83769bc70aaabd249d3829ef1a6288905e06ce7c7" Jan 21 17:03:26 crc kubenswrapper[4707]: I0121 17:03:26.483370 4707 generic.go:334] "Generic (PLEG): container finished" podID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerID="145ffa0e7c42bbfe06c805e5cc6e7c6bcf75fe630f7c172e19b2bcacdae73b01" exitCode=0 Jan 21 17:03:26 crc kubenswrapper[4707]: I0121 17:03:26.483447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-vszqp" event={"ID":"559c5305-a2b7-4a06-87a7-afd6501038d7","Type":"ContainerDied","Data":"145ffa0e7c42bbfe06c805e5cc6e7c6bcf75fe630f7c172e19b2bcacdae73b01"} Jan 21 17:03:26 crc kubenswrapper[4707]: I0121 17:03:26.483657 4707 scope.go:117] "RemoveContainer" containerID="ff5ebfabf88da6dabfa7e2e83769bc70aaabd249d3829ef1a6288905e06ce7c7" Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.717913 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.861525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw5gb\" (UniqueName: \"kubernetes.io/projected/559c5305-a2b7-4a06-87a7-afd6501038d7-kube-api-access-zw5gb\") pod \"559c5305-a2b7-4a06-87a7-afd6501038d7\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.861929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/559c5305-a2b7-4a06-87a7-afd6501038d7-operator-scripts\") pod \"559c5305-a2b7-4a06-87a7-afd6501038d7\" (UID: \"559c5305-a2b7-4a06-87a7-afd6501038d7\") " Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.862406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/559c5305-a2b7-4a06-87a7-afd6501038d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "559c5305-a2b7-4a06-87a7-afd6501038d7" (UID: "559c5305-a2b7-4a06-87a7-afd6501038d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.866672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559c5305-a2b7-4a06-87a7-afd6501038d7-kube-api-access-zw5gb" (OuterVolumeSpecName: "kube-api-access-zw5gb") pod "559c5305-a2b7-4a06-87a7-afd6501038d7" (UID: "559c5305-a2b7-4a06-87a7-afd6501038d7"). InnerVolumeSpecName "kube-api-access-zw5gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.963156 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/559c5305-a2b7-4a06-87a7-afd6501038d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:03:27 crc kubenswrapper[4707]: I0121 17:03:27.963183 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw5gb\" (UniqueName: \"kubernetes.io/projected/559c5305-a2b7-4a06-87a7-afd6501038d7-kube-api-access-zw5gb\") on node \"crc\" DevicePath \"\"" Jan 21 17:03:28 crc kubenswrapper[4707]: I0121 17:03:28.496053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-vszqp" event={"ID":"559c5305-a2b7-4a06-87a7-afd6501038d7","Type":"ContainerDied","Data":"ebfa8e7f99639647ea2e477db332bd767fafa95301bdb9ecf07b4693e5b58071"} Jan 21 17:03:28 crc kubenswrapper[4707]: I0121 17:03:28.496091 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebfa8e7f99639647ea2e477db332bd767fafa95301bdb9ecf07b4693e5b58071" Jan 21 17:03:28 crc kubenswrapper[4707]: I0121 17:03:28.496089 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-vszqp" Jan 21 17:03:28 crc kubenswrapper[4707]: I0121 17:03:28.950300 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:29 crc kubenswrapper[4707]: I0121 17:03:29.008088 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.448298 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq"] Jan 21 17:03:49 crc kubenswrapper[4707]: E0121 17:03:49.448932 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="pull" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.448945 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="pull" Jan 21 17:03:49 crc kubenswrapper[4707]: E0121 17:03:49.448957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerName="mariadb-account-create-update" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.448963 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerName="mariadb-account-create-update" Jan 21 17:03:49 crc kubenswrapper[4707]: E0121 17:03:49.448976 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerName="mariadb-account-create-update" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.448981 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerName="mariadb-account-create-update" Jan 21 17:03:49 crc kubenswrapper[4707]: E0121 17:03:49.448991 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="util" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.448997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="util" Jan 21 17:03:49 crc kubenswrapper[4707]: E0121 17:03:49.449009 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="extract" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.449014 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="extract" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.449127 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerName="mariadb-account-create-update" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.449140 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" containerName="mariadb-account-create-update" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.449146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" containerName="extract" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.449589 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.452020 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.452151 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hxs57" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.465389 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq"] Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.551231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-apiservice-cert\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.551423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-webhook-cert\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.551528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx745\" (UniqueName: \"kubernetes.io/projected/537a7434-2179-4bf0-b885-8c8a86e35ced-kube-api-access-dx745\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.652583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-apiservice-cert\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.652658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-webhook-cert\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.652693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx745\" (UniqueName: \"kubernetes.io/projected/537a7434-2179-4bf0-b885-8c8a86e35ced-kube-api-access-dx745\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.657855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-webhook-cert\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.657883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-apiservice-cert\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.666107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx745\" (UniqueName: \"kubernetes.io/projected/537a7434-2179-4bf0-b885-8c8a86e35ced-kube-api-access-dx745\") pod \"infra-operator-controller-manager-58788bc8df-69rmq\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:49 crc kubenswrapper[4707]: I0121 17:03:49.771774 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:50 crc kubenswrapper[4707]: I0121 17:03:50.146353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq"] Jan 21 17:03:50 crc kubenswrapper[4707]: I0121 17:03:50.605883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" event={"ID":"537a7434-2179-4bf0-b885-8c8a86e35ced","Type":"ContainerStarted","Data":"934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987"} Jan 21 17:03:50 crc kubenswrapper[4707]: I0121 17:03:50.606171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" event={"ID":"537a7434-2179-4bf0-b885-8c8a86e35ced","Type":"ContainerStarted","Data":"83c67a89620d9f233693f2c2d4a092d23b2a6fcd186855e4105b85e1fdcd38d0"} Jan 21 17:03:50 crc kubenswrapper[4707]: I0121 17:03:50.606362 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:03:50 crc kubenswrapper[4707]: I0121 17:03:50.624966 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" podStartSLOduration=1.6249500829999999 podStartE2EDuration="1.624950083s" podCreationTimestamp="2026-01-21 17:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:03:50.621008573 +0000 UTC m=+7327.802524794" watchObservedRunningTime="2026-01-21 17:03:50.624950083 +0000 UTC m=+7327.806466305" Jan 21 17:03:59 crc kubenswrapper[4707]: I0121 17:03:59.776951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.048075 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4tn6"] Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 
17:04:05.049103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.050724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-cptxs" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.053588 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4tn6"] Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.166422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncb9b\" (UniqueName: \"kubernetes.io/projected/6511c007-7788-4522-a4d4-c5388a3e75ca-kube-api-access-ncb9b\") pod \"rabbitmq-cluster-operator-index-j4tn6\" (UID: \"6511c007-7788-4522-a4d4-c5388a3e75ca\") " pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.267668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncb9b\" (UniqueName: \"kubernetes.io/projected/6511c007-7788-4522-a4d4-c5388a3e75ca-kube-api-access-ncb9b\") pod \"rabbitmq-cluster-operator-index-j4tn6\" (UID: \"6511c007-7788-4522-a4d4-c5388a3e75ca\") " pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.282677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncb9b\" (UniqueName: \"kubernetes.io/projected/6511c007-7788-4522-a4d4-c5388a3e75ca-kube-api-access-ncb9b\") pod \"rabbitmq-cluster-operator-index-j4tn6\" (UID: \"6511c007-7788-4522-a4d4-c5388a3e75ca\") " pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.363757 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:05 crc kubenswrapper[4707]: I0121 17:04:05.709472 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4tn6"] Jan 21 17:04:06 crc kubenswrapper[4707]: I0121 17:04:06.693465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" event={"ID":"6511c007-7788-4522-a4d4-c5388a3e75ca","Type":"ContainerStarted","Data":"43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960"} Jan 21 17:04:06 crc kubenswrapper[4707]: I0121 17:04:06.693706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" event={"ID":"6511c007-7788-4522-a4d4-c5388a3e75ca","Type":"ContainerStarted","Data":"96296bcfebf5d2bf17b9a7910d34a5d6e4cffa42920246bf3f1db1b18d85eced"} Jan 21 17:04:06 crc kubenswrapper[4707]: I0121 17:04:06.703896 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" podStartSLOduration=1.108400821 podStartE2EDuration="1.703882398s" podCreationTimestamp="2026-01-21 17:04:05 +0000 UTC" firstStartedPulling="2026-01-21 17:04:05.715236796 +0000 UTC m=+7342.896753018" lastFinishedPulling="2026-01-21 17:04:06.310718374 +0000 UTC m=+7343.492234595" observedRunningTime="2026-01-21 17:04:06.70316037 +0000 UTC m=+7343.884676593" watchObservedRunningTime="2026-01-21 17:04:06.703882398 +0000 UTC m=+7343.885398621" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.514563 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.515345 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.516850 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"memcached-config-data" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.517133 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"memcached-memcached-dockercfg-258hx" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.520336 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.601122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-config-data\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.601344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9jlk\" (UniqueName: \"kubernetes.io/projected/6162703a-4327-49b8-a9fe-4fb97871dc5d-kube-api-access-c9jlk\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.601680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-kolla-config\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.702440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-kolla-config\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.702489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-config-data\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.702516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9jlk\" (UniqueName: \"kubernetes.io/projected/6162703a-4327-49b8-a9fe-4fb97871dc5d-kube-api-access-c9jlk\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.703166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-kolla-config\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.703186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-config-data\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 
17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.717429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9jlk\" (UniqueName: \"kubernetes.io/projected/6162703a-4327-49b8-a9fe-4fb97871dc5d-kube-api-access-c9jlk\") pod \"memcached-0\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:07 crc kubenswrapper[4707]: I0121 17:04:07.830104 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:08 crc kubenswrapper[4707]: I0121 17:04:08.180315 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 17:04:08 crc kubenswrapper[4707]: W0121 17:04:08.184253 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6162703a_4327_49b8_a9fe_4fb97871dc5d.slice/crio-34a3d08ac9c9d75a346868e01d956841ae5b27d2a02d1a3c2a769371fed794a0 WatchSource:0}: Error finding container 34a3d08ac9c9d75a346868e01d956841ae5b27d2a02d1a3c2a769371fed794a0: Status 404 returned error can't find the container with id 34a3d08ac9c9d75a346868e01d956841ae5b27d2a02d1a3c2a769371fed794a0 Jan 21 17:04:08 crc kubenswrapper[4707]: I0121 17:04:08.704271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"6162703a-4327-49b8-a9fe-4fb97871dc5d","Type":"ContainerStarted","Data":"935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032"} Jan 21 17:04:08 crc kubenswrapper[4707]: I0121 17:04:08.704309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"6162703a-4327-49b8-a9fe-4fb97871dc5d","Type":"ContainerStarted","Data":"34a3d08ac9c9d75a346868e01d956841ae5b27d2a02d1a3c2a769371fed794a0"} Jan 21 17:04:08 crc kubenswrapper[4707]: I0121 17:04:08.704406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:08 crc kubenswrapper[4707]: I0121 17:04:08.716289 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/memcached-0" podStartSLOduration=1.716275774 podStartE2EDuration="1.716275774s" podCreationTimestamp="2026-01-21 17:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:04:08.715349894 +0000 UTC m=+7345.896866116" watchObservedRunningTime="2026-01-21 17:04:08.716275774 +0000 UTC m=+7345.897791996" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.243762 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4tn6"] Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.244121 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" podUID="6511c007-7788-4522-a4d4-c5388a3e75ca" containerName="registry-server" containerID="cri-o://43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960" gracePeriod=2 Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.576558 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.711189 4707 generic.go:334] "Generic (PLEG): container finished" podID="6511c007-7788-4522-a4d4-c5388a3e75ca" containerID="43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960" exitCode=0 Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.711239 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.711233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" event={"ID":"6511c007-7788-4522-a4d4-c5388a3e75ca","Type":"ContainerDied","Data":"43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960"} Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.711284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4tn6" event={"ID":"6511c007-7788-4522-a4d4-c5388a3e75ca","Type":"ContainerDied","Data":"96296bcfebf5d2bf17b9a7910d34a5d6e4cffa42920246bf3f1db1b18d85eced"} Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.711301 4707 scope.go:117] "RemoveContainer" containerID="43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.723749 4707 scope.go:117] "RemoveContainer" containerID="43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960" Jan 21 17:04:09 crc kubenswrapper[4707]: E0121 17:04:09.724059 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960\": container with ID starting with 43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960 not found: ID does not exist" containerID="43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.724085 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960"} err="failed to get container status \"43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960\": rpc error: code = NotFound desc = could not find container \"43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960\": container with ID starting with 43705ec6c3897e89ceca386ea1d4035f3c28db0a21db1293fb3996a6c8dc5960 not found: ID does not exist" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.730964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncb9b\" (UniqueName: \"kubernetes.io/projected/6511c007-7788-4522-a4d4-c5388a3e75ca-kube-api-access-ncb9b\") pod \"6511c007-7788-4522-a4d4-c5388a3e75ca\" (UID: \"6511c007-7788-4522-a4d4-c5388a3e75ca\") " Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.734848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6511c007-7788-4522-a4d4-c5388a3e75ca-kube-api-access-ncb9b" (OuterVolumeSpecName: "kube-api-access-ncb9b") pod "6511c007-7788-4522-a4d4-c5388a3e75ca" (UID: "6511c007-7788-4522-a4d4-c5388a3e75ca"). InnerVolumeSpecName "kube-api-access-ncb9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.866186 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncb9b\" (UniqueName: \"kubernetes.io/projected/6511c007-7788-4522-a4d4-c5388a3e75ca-kube-api-access-ncb9b\") on node \"crc\" DevicePath \"\"" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.866905 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lmh6x"] Jan 21 17:04:09 crc kubenswrapper[4707]: E0121 17:04:09.867324 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6511c007-7788-4522-a4d4-c5388a3e75ca" containerName="registry-server" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.867344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6511c007-7788-4522-a4d4-c5388a3e75ca" containerName="registry-server" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.867658 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6511c007-7788-4522-a4d4-c5388a3e75ca" containerName="registry-server" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.868269 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.882705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lmh6x"] Jan 21 17:04:09 crc kubenswrapper[4707]: I0121 17:04:09.968336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5qz\" (UniqueName: \"kubernetes.io/projected/f91ad4c0-4328-4242-bf29-575b12865e63-kube-api-access-2q5qz\") pod \"rabbitmq-cluster-operator-index-lmh6x\" (UID: \"f91ad4c0-4328-4242-bf29-575b12865e63\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.031594 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4tn6"] Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.035754 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4tn6"] Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.069615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5qz\" (UniqueName: \"kubernetes.io/projected/f91ad4c0-4328-4242-bf29-575b12865e63-kube-api-access-2q5qz\") pod \"rabbitmq-cluster-operator-index-lmh6x\" (UID: \"f91ad4c0-4328-4242-bf29-575b12865e63\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.082944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5qz\" (UniqueName: \"kubernetes.io/projected/f91ad4c0-4328-4242-bf29-575b12865e63-kube-api-access-2q5qz\") pod \"rabbitmq-cluster-operator-index-lmh6x\" (UID: \"f91ad4c0-4328-4242-bf29-575b12865e63\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.187464 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.529536 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lmh6x"] Jan 21 17:04:10 crc kubenswrapper[4707]: W0121 17:04:10.532535 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91ad4c0_4328_4242_bf29_575b12865e63.slice/crio-00698ecacca27934b7e35b0b651a512327258bb5da0b6f0c728fa67e18fcae9c WatchSource:0}: Error finding container 00698ecacca27934b7e35b0b651a512327258bb5da0b6f0c728fa67e18fcae9c: Status 404 returned error can't find the container with id 00698ecacca27934b7e35b0b651a512327258bb5da0b6f0c728fa67e18fcae9c Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.718047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" event={"ID":"f91ad4c0-4328-4242-bf29-575b12865e63","Type":"ContainerStarted","Data":"00698ecacca27934b7e35b0b651a512327258bb5da0b6f0c728fa67e18fcae9c"} Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.878005 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj"] Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.879446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.882553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.886921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj"] Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.982301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.982636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/2038593f-c740-4feb-af87-945317f2adee-kube-api-access-rxs45\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:10 crc kubenswrapper[4707]: I0121 17:04:10.982783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.084250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.084433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/2038593f-c740-4feb-af87-945317f2adee-kube-api-access-rxs45\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.084544 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.084900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.085049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.109901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/2038593f-c740-4feb-af87-945317f2adee-kube-api-access-rxs45\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.189930 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6511c007-7788-4522-a4d4-c5388a3e75ca" path="/var/lib/kubelet/pods/6511c007-7788-4522-a4d4-c5388a3e75ca/volumes" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.197153 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.537559 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj"] Jan 21 17:04:11 crc kubenswrapper[4707]: W0121 17:04:11.542002 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2038593f_c740_4feb_af87_945317f2adee.slice/crio-2b0ed7a367371cc97ab429597027a48ece0babf00876e2da81ee232c86a5b020 WatchSource:0}: Error finding container 2b0ed7a367371cc97ab429597027a48ece0babf00876e2da81ee232c86a5b020: Status 404 returned error can't find the container with id 2b0ed7a367371cc97ab429597027a48ece0babf00876e2da81ee232c86a5b020 Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.729490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" event={"ID":"f91ad4c0-4328-4242-bf29-575b12865e63","Type":"ContainerStarted","Data":"058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913"} Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.731469 4707 generic.go:334] "Generic (PLEG): container finished" podID="2038593f-c740-4feb-af87-945317f2adee" containerID="4e401ca7aa762101df040b8d2afbe0a82baaaa39745e6ba0b41cea619e616331" exitCode=0 Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.731506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" event={"ID":"2038593f-c740-4feb-af87-945317f2adee","Type":"ContainerDied","Data":"4e401ca7aa762101df040b8d2afbe0a82baaaa39745e6ba0b41cea619e616331"} Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.731527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" event={"ID":"2038593f-c740-4feb-af87-945317f2adee","Type":"ContainerStarted","Data":"2b0ed7a367371cc97ab429597027a48ece0babf00876e2da81ee232c86a5b020"} Jan 21 17:04:11 crc kubenswrapper[4707]: I0121 17:04:11.743261 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" podStartSLOduration=2.174788897 podStartE2EDuration="2.743253176s" podCreationTimestamp="2026-01-21 17:04:09 +0000 UTC" firstStartedPulling="2026-01-21 17:04:10.534069702 +0000 UTC m=+7347.715585923" lastFinishedPulling="2026-01-21 17:04:11.102533979 +0000 UTC m=+7348.284050202" observedRunningTime="2026-01-21 17:04:11.741877408 +0000 UTC m=+7348.923393630" watchObservedRunningTime="2026-01-21 17:04:11.743253176 +0000 UTC m=+7348.924769397" Jan 21 17:04:12 crc kubenswrapper[4707]: I0121 17:04:12.738751 4707 generic.go:334] "Generic (PLEG): container finished" podID="2038593f-c740-4feb-af87-945317f2adee" containerID="b1742e3fe2d246b1b73dd8656a6c0c2474a586cd4b0f63b19c95105f28905270" exitCode=0 Jan 21 17:04:12 crc kubenswrapper[4707]: I0121 17:04:12.738795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" event={"ID":"2038593f-c740-4feb-af87-945317f2adee","Type":"ContainerDied","Data":"b1742e3fe2d246b1b73dd8656a6c0c2474a586cd4b0f63b19c95105f28905270"} Jan 21 17:04:13 crc kubenswrapper[4707]: I0121 17:04:13.746634 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="2038593f-c740-4feb-af87-945317f2adee" containerID="22d3f117f59a14193a24a97ee76c8463cb41e4d8634d5ba0062e0b55e7d68d91" exitCode=0 Jan 21 17:04:13 crc kubenswrapper[4707]: I0121 17:04:13.746701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" event={"ID":"2038593f-c740-4feb-af87-945317f2adee","Type":"ContainerDied","Data":"22d3f117f59a14193a24a97ee76c8463cb41e4d8634d5ba0062e0b55e7d68d91"} Jan 21 17:04:14 crc kubenswrapper[4707]: I0121 17:04:14.966792 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.040859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/2038593f-c740-4feb-af87-945317f2adee-kube-api-access-rxs45\") pod \"2038593f-c740-4feb-af87-945317f2adee\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.041231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-util\") pod \"2038593f-c740-4feb-af87-945317f2adee\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.041432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-bundle\") pod \"2038593f-c740-4feb-af87-945317f2adee\" (UID: \"2038593f-c740-4feb-af87-945317f2adee\") " Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.042530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-bundle" (OuterVolumeSpecName: "bundle") pod "2038593f-c740-4feb-af87-945317f2adee" (UID: "2038593f-c740-4feb-af87-945317f2adee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.048322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2038593f-c740-4feb-af87-945317f2adee-kube-api-access-rxs45" (OuterVolumeSpecName: "kube-api-access-rxs45") pod "2038593f-c740-4feb-af87-945317f2adee" (UID: "2038593f-c740-4feb-af87-945317f2adee"). InnerVolumeSpecName "kube-api-access-rxs45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.052397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-util" (OuterVolumeSpecName: "util") pod "2038593f-c740-4feb-af87-945317f2adee" (UID: "2038593f-c740-4feb-af87-945317f2adee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.143231 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.143264 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/2038593f-c740-4feb-af87-945317f2adee-kube-api-access-rxs45\") on node \"crc\" DevicePath \"\"" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.143278 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2038593f-c740-4feb-af87-945317f2adee-util\") on node \"crc\" DevicePath \"\"" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.760492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" event={"ID":"2038593f-c740-4feb-af87-945317f2adee","Type":"ContainerDied","Data":"2b0ed7a367371cc97ab429597027a48ece0babf00876e2da81ee232c86a5b020"} Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.760528 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b0ed7a367371cc97ab429597027a48ece0babf00876e2da81ee232c86a5b020" Jan 21 17:04:15 crc kubenswrapper[4707]: I0121 17:04:15.760531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj" Jan 21 17:04:17 crc kubenswrapper[4707]: I0121 17:04:17.830970 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/memcached-0" Jan 21 17:04:20 crc kubenswrapper[4707]: I0121 17:04:20.187922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:20 crc kubenswrapper[4707]: I0121 17:04:20.188357 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:20 crc kubenswrapper[4707]: I0121 17:04:20.210002 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:20 crc kubenswrapper[4707]: I0121 17:04:20.807122 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:04:39 crc kubenswrapper[4707]: I0121 17:04:39.945308 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:04:39 crc kubenswrapper[4707]: I0121 17:04:39.945646 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.482160 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7"] Jan 21 17:04:40 crc kubenswrapper[4707]: E0121 17:04:40.482405 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="extract" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.482422 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="extract" Jan 21 17:04:40 crc kubenswrapper[4707]: E0121 17:04:40.482433 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="util" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.482439 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="util" Jan 21 17:04:40 crc kubenswrapper[4707]: E0121 17:04:40.482454 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="pull" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.482460 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="pull" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.482581 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2038593f-c740-4feb-af87-945317f2adee" containerName="extract" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.483035 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.485156 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-gjp85" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.494863 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7"] Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.594745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n689x\" (UniqueName: \"kubernetes.io/projected/b6f47560-0311-4a69-be7f-e57c1b043c06-kube-api-access-n689x\") pod \"rabbitmq-cluster-operator-779fc9694b-fxmp7\" (UID: \"b6f47560-0311-4a69-be7f-e57c1b043c06\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.696045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n689x\" (UniqueName: \"kubernetes.io/projected/b6f47560-0311-4a69-be7f-e57c1b043c06-kube-api-access-n689x\") pod \"rabbitmq-cluster-operator-779fc9694b-fxmp7\" (UID: \"b6f47560-0311-4a69-be7f-e57c1b043c06\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.712037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n689x\" (UniqueName: \"kubernetes.io/projected/b6f47560-0311-4a69-be7f-e57c1b043c06-kube-api-access-n689x\") pod \"rabbitmq-cluster-operator-779fc9694b-fxmp7\" (UID: \"b6f47560-0311-4a69-be7f-e57c1b043c06\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:04:40 crc kubenswrapper[4707]: I0121 17:04:40.797178 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:04:41 crc kubenswrapper[4707]: I0121 17:04:41.190377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7"] Jan 21 17:04:41 crc kubenswrapper[4707]: I0121 17:04:41.902492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" event={"ID":"b6f47560-0311-4a69-be7f-e57c1b043c06","Type":"ContainerStarted","Data":"9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06"} Jan 21 17:04:41 crc kubenswrapper[4707]: I0121 17:04:41.902708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" event={"ID":"b6f47560-0311-4a69-be7f-e57c1b043c06","Type":"ContainerStarted","Data":"0d5c05565378bd34f51239079e8bcd2452da02d6e62074c26d1e58312b0e7fa0"} Jan 21 17:04:41 crc kubenswrapper[4707]: I0121 17:04:41.917533 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" podStartSLOduration=1.917516996 podStartE2EDuration="1.917516996s" podCreationTimestamp="2026-01-21 17:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:04:41.912143191 +0000 UTC m=+7379.093659414" watchObservedRunningTime="2026-01-21 17:04:41.917516996 +0000 UTC m=+7379.099033218" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.248318 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-4tv86"] Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.249312 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.251044 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-pnn2d" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.253799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-4tv86"] Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.358349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2mqf\" (UniqueName: \"kubernetes.io/projected/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac-kube-api-access-p2mqf\") pod \"keystone-operator-index-4tv86\" (UID: \"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac\") " pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.460524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2mqf\" (UniqueName: \"kubernetes.io/projected/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac-kube-api-access-p2mqf\") pod \"keystone-operator-index-4tv86\" (UID: \"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac\") " pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.475896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2mqf\" (UniqueName: \"kubernetes.io/projected/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac-kube-api-access-p2mqf\") pod \"keystone-operator-index-4tv86\" (UID: \"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac\") " pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.560981 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.914761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-4tv86"] Jan 21 17:04:45 crc kubenswrapper[4707]: I0121 17:04:45.924759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4tv86" event={"ID":"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac","Type":"ContainerStarted","Data":"e6dee50ee75d7def16db15d56d94e493e3964de1bd7b45a7b3ee269e5af9821e"} Jan 21 17:04:46 crc kubenswrapper[4707]: I0121 17:04:46.930754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4tv86" event={"ID":"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac","Type":"ContainerStarted","Data":"089942a08c5c5deaf174cd16157859c8c34fefded1588a5c4568191b9ba1722b"} Jan 21 17:04:46 crc kubenswrapper[4707]: I0121 17:04:46.945066 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-4tv86" podStartSLOduration=1.360340333 podStartE2EDuration="1.945049525s" podCreationTimestamp="2026-01-21 17:04:45 +0000 UTC" firstStartedPulling="2026-01-21 17:04:45.919938817 +0000 UTC m=+7383.101455040" lastFinishedPulling="2026-01-21 17:04:46.50464801 +0000 UTC m=+7383.686164232" observedRunningTime="2026-01-21 17:04:46.941342905 +0000 UTC m=+7384.122859127" watchObservedRunningTime="2026-01-21 17:04:46.945049525 +0000 UTC m=+7384.126565746" Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.644839 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-4tv86"] Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.645896 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-4tv86" podUID="21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" containerName="registry-server" containerID="cri-o://089942a08c5c5deaf174cd16157859c8c34fefded1588a5c4568191b9ba1722b" gracePeriod=2 Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.952278 4707 generic.go:334] "Generic (PLEG): container finished" podID="21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" containerID="089942a08c5c5deaf174cd16157859c8c34fefded1588a5c4568191b9ba1722b" exitCode=0 Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.952325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4tv86" event={"ID":"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac","Type":"ContainerDied","Data":"089942a08c5c5deaf174cd16157859c8c34fefded1588a5c4568191b9ba1722b"} Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.952350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4tv86" event={"ID":"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac","Type":"ContainerDied","Data":"e6dee50ee75d7def16db15d56d94e493e3964de1bd7b45a7b3ee269e5af9821e"} Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.952360 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6dee50ee75d7def16db15d56d94e493e3964de1bd7b45a7b3ee269e5af9821e" Jan 21 17:04:49 crc kubenswrapper[4707]: I0121 17:04:49.977064 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.017758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2mqf\" (UniqueName: \"kubernetes.io/projected/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac-kube-api-access-p2mqf\") pod \"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac\" (UID: \"21bee9e3-cb89-4ff4-902c-fb6ad8f939ac\") " Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.021920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac-kube-api-access-p2mqf" (OuterVolumeSpecName: "kube-api-access-p2mqf") pod "21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" (UID: "21bee9e3-cb89-4ff4-902c-fb6ad8f939ac"). InnerVolumeSpecName "kube-api-access-p2mqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.119238 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2mqf\" (UniqueName: \"kubernetes.io/projected/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac-kube-api-access-p2mqf\") on node \"crc\" DevicePath \"\"" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.249240 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-c8gcz"] Jan 21 17:04:50 crc kubenswrapper[4707]: E0121 17:04:50.249523 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" containerName="registry-server" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.249540 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" containerName="registry-server" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.249681 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" containerName="registry-server" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.250133 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.254896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-c8gcz"] Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.321094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx256\" (UniqueName: \"kubernetes.io/projected/da1d6c5f-e5a4-4d85-9526-499182972d3b-kube-api-access-wx256\") pod \"keystone-operator-index-c8gcz\" (UID: \"da1d6c5f-e5a4-4d85-9526-499182972d3b\") " pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.422205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx256\" (UniqueName: \"kubernetes.io/projected/da1d6c5f-e5a4-4d85-9526-499182972d3b-kube-api-access-wx256\") pod \"keystone-operator-index-c8gcz\" (UID: \"da1d6c5f-e5a4-4d85-9526-499182972d3b\") " pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.435919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx256\" (UniqueName: \"kubernetes.io/projected/da1d6c5f-e5a4-4d85-9526-499182972d3b-kube-api-access-wx256\") pod \"keystone-operator-index-c8gcz\" (UID: \"da1d6c5f-e5a4-4d85-9526-499182972d3b\") " pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.562748 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.907039 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-c8gcz"] Jan 21 17:04:50 crc kubenswrapper[4707]: W0121 17:04:50.911209 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda1d6c5f_e5a4_4d85_9526_499182972d3b.slice/crio-3dbdd361fa174440a23de5ec6d6d28fdb63f94e214023e9875cace22a17cbfd5 WatchSource:0}: Error finding container 3dbdd361fa174440a23de5ec6d6d28fdb63f94e214023e9875cace22a17cbfd5: Status 404 returned error can't find the container with id 3dbdd361fa174440a23de5ec6d6d28fdb63f94e214023e9875cace22a17cbfd5 Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.959654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-c8gcz" event={"ID":"da1d6c5f-e5a4-4d85-9526-499182972d3b","Type":"ContainerStarted","Data":"3dbdd361fa174440a23de5ec6d6d28fdb63f94e214023e9875cace22a17cbfd5"} Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.959672 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-4tv86" Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.982705 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-4tv86"] Jan 21 17:04:50 crc kubenswrapper[4707]: I0121 17:04:50.986918 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-4tv86"] Jan 21 17:04:51 crc kubenswrapper[4707]: I0121 17:04:51.188760 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bee9e3-cb89-4ff4-902c-fb6ad8f939ac" path="/var/lib/kubelet/pods/21bee9e3-cb89-4ff4-902c-fb6ad8f939ac/volumes" Jan 21 17:04:51 crc kubenswrapper[4707]: I0121 17:04:51.966398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-c8gcz" event={"ID":"da1d6c5f-e5a4-4d85-9526-499182972d3b","Type":"ContainerStarted","Data":"d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc"} Jan 21 17:04:51 crc kubenswrapper[4707]: I0121 17:04:51.977023 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-c8gcz" podStartSLOduration=1.489223687 podStartE2EDuration="1.977008416s" podCreationTimestamp="2026-01-21 17:04:50 +0000 UTC" firstStartedPulling="2026-01-21 17:04:50.913176492 +0000 UTC m=+7388.094692714" lastFinishedPulling="2026-01-21 17:04:51.40096122 +0000 UTC m=+7388.582477443" observedRunningTime="2026-01-21 17:04:51.975779506 +0000 UTC m=+7389.157295727" watchObservedRunningTime="2026-01-21 17:04:51.977008416 +0000 UTC m=+7389.158524638" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.453911 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.455917 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.457393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-default-user" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.457393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.458366 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.458477 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"rabbitmq-server-conf" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.459142 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-server-dockercfg-tfwcr" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.463416 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546094 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqm2r\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-kube-api-access-nqm2r\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546170 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546190 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " 
pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.546654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.563562 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.563692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.583102 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647657 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqm2r\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-kube-api-access-nqm2r\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.647787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.648148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.648351 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.648416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.648982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.649884 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.649914 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1fe87de0361f7455d5828c750188b4b3343cfd13e8bb3d84f8f895de12387f0d/globalmount\"" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.652130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.652305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.652563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.661076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqm2r\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-kube-api-access-nqm2r\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.669482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") pod \"rabbitmq-server-0\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:00 crc kubenswrapper[4707]: I0121 17:05:00.772372 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:01 crc kubenswrapper[4707]: I0121 17:05:01.036415 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:05:01 crc kubenswrapper[4707]: I0121 17:05:01.113268 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:05:01 crc kubenswrapper[4707]: W0121 17:05:01.116720 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4c43f5_dc57_473d_985e_7dc44e6f06c9.slice/crio-a74efbb78427ff359cacb8a1dfd4763f4ee093e49a0e2b209bcbe64633b019eb WatchSource:0}: Error finding container a74efbb78427ff359cacb8a1dfd4763f4ee093e49a0e2b209bcbe64633b019eb: Status 404 returned error can't find the container with id a74efbb78427ff359cacb8a1dfd4763f4ee093e49a0e2b209bcbe64633b019eb Jan 21 17:05:02 crc kubenswrapper[4707]: I0121 17:05:02.023185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"5b4c43f5-dc57-473d-985e-7dc44e6f06c9","Type":"ContainerStarted","Data":"1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df"} Jan 21 17:05:02 crc kubenswrapper[4707]: I0121 17:05:02.023408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"5b4c43f5-dc57-473d-985e-7dc44e6f06c9","Type":"ContainerStarted","Data":"a74efbb78427ff359cacb8a1dfd4763f4ee093e49a0e2b209bcbe64633b019eb"} Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.272958 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz"] Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.274237 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.277267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.279587 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz"] Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.297362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.297403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.297548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvnwh\" (UniqueName: \"kubernetes.io/projected/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-kube-api-access-lvnwh\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.398553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnwh\" (UniqueName: \"kubernetes.io/projected/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-kube-api-access-lvnwh\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.398650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.398678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.399126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.399197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.413504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnwh\" (UniqueName: \"kubernetes.io/projected/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-kube-api-access-lvnwh\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.592290 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:04 crc kubenswrapper[4707]: I0121 17:05:04.942182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz"] Jan 21 17:05:04 crc kubenswrapper[4707]: W0121 17:05:04.947633 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00f35b5_3d95_40cf_8016_d5aaf9c01adb.slice/crio-05af2df1c78cfeeb966ebc2d27a564c6911a4d19e445f2a06bd28b8346f178ea WatchSource:0}: Error finding container 05af2df1c78cfeeb966ebc2d27a564c6911a4d19e445f2a06bd28b8346f178ea: Status 404 returned error can't find the container with id 05af2df1c78cfeeb966ebc2d27a564c6911a4d19e445f2a06bd28b8346f178ea Jan 21 17:05:05 crc kubenswrapper[4707]: I0121 17:05:05.041764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" event={"ID":"b00f35b5-3d95-40cf-8016-d5aaf9c01adb","Type":"ContainerStarted","Data":"05af2df1c78cfeeb966ebc2d27a564c6911a4d19e445f2a06bd28b8346f178ea"} Jan 21 17:05:06 crc kubenswrapper[4707]: I0121 17:05:06.047785 4707 generic.go:334] "Generic (PLEG): container finished" podID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerID="2866fe8ba38998023e4ddf912844931732fa88e9b8b341655b0e3e49e864cb32" exitCode=0 Jan 21 17:05:06 crc kubenswrapper[4707]: I0121 17:05:06.047840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" event={"ID":"b00f35b5-3d95-40cf-8016-d5aaf9c01adb","Type":"ContainerDied","Data":"2866fe8ba38998023e4ddf912844931732fa88e9b8b341655b0e3e49e864cb32"} Jan 21 17:05:07 crc kubenswrapper[4707]: I0121 17:05:07.054256 4707 generic.go:334] "Generic (PLEG): container finished" podID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerID="c26ede1b29efd1dfd2feab9989dd9a2dab316515b373c3c5d2d638ed0c7366f3" exitCode=0 Jan 21 17:05:07 crc kubenswrapper[4707]: I0121 17:05:07.054287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" event={"ID":"b00f35b5-3d95-40cf-8016-d5aaf9c01adb","Type":"ContainerDied","Data":"c26ede1b29efd1dfd2feab9989dd9a2dab316515b373c3c5d2d638ed0c7366f3"} Jan 21 17:05:08 crc kubenswrapper[4707]: I0121 17:05:08.063531 4707 generic.go:334] "Generic (PLEG): container finished" podID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerID="c74c25d17159c498403d8dabd859fc21143ab92f7314fce98e3db7ce8cd5f02d" exitCode=0 Jan 21 17:05:08 crc kubenswrapper[4707]: I0121 17:05:08.063628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" event={"ID":"b00f35b5-3d95-40cf-8016-d5aaf9c01adb","Type":"ContainerDied","Data":"c74c25d17159c498403d8dabd859fc21143ab92f7314fce98e3db7ce8cd5f02d"} Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.296676 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.360693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvnwh\" (UniqueName: \"kubernetes.io/projected/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-kube-api-access-lvnwh\") pod \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.360905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-util\") pod \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.360933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-bundle\") pod \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\" (UID: \"b00f35b5-3d95-40cf-8016-d5aaf9c01adb\") " Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.361516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-bundle" (OuterVolumeSpecName: "bundle") pod "b00f35b5-3d95-40cf-8016-d5aaf9c01adb" (UID: "b00f35b5-3d95-40cf-8016-d5aaf9c01adb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.365366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-kube-api-access-lvnwh" (OuterVolumeSpecName: "kube-api-access-lvnwh") pod "b00f35b5-3d95-40cf-8016-d5aaf9c01adb" (UID: "b00f35b5-3d95-40cf-8016-d5aaf9c01adb"). InnerVolumeSpecName "kube-api-access-lvnwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.370990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-util" (OuterVolumeSpecName: "util") pod "b00f35b5-3d95-40cf-8016-d5aaf9c01adb" (UID: "b00f35b5-3d95-40cf-8016-d5aaf9c01adb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.462485 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-util\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.462520 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.462532 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvnwh\" (UniqueName: \"kubernetes.io/projected/b00f35b5-3d95-40cf-8016-d5aaf9c01adb-kube-api-access-lvnwh\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.945857 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:05:09 crc kubenswrapper[4707]: I0121 17:05:09.945905 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:05:10 crc kubenswrapper[4707]: I0121 17:05:10.075625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" event={"ID":"b00f35b5-3d95-40cf-8016-d5aaf9c01adb","Type":"ContainerDied","Data":"05af2df1c78cfeeb966ebc2d27a564c6911a4d19e445f2a06bd28b8346f178ea"} Jan 21 17:05:10 crc kubenswrapper[4707]: I0121 17:05:10.075664 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05af2df1c78cfeeb966ebc2d27a564c6911a4d19e445f2a06bd28b8346f178ea" Jan 21 17:05:10 crc kubenswrapper[4707]: I0121 17:05:10.075692 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.379460 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7"] Jan 21 17:05:22 crc kubenswrapper[4707]: E0121 17:05:22.380114 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="util" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.380127 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="util" Jan 21 17:05:22 crc kubenswrapper[4707]: E0121 17:05:22.380141 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="pull" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.380146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="pull" Jan 21 17:05:22 crc kubenswrapper[4707]: E0121 17:05:22.380159 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="extract" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.380166 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="extract" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.380313 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" containerName="extract" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.380780 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.384530 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.384916 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-d7xks" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.390252 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7"] Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.433955 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-webhook-cert\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.434001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-apiservice-cert\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.434073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5svgm\" (UniqueName: \"kubernetes.io/projected/078cfcac-e384-409d-8b67-d0b56e051b7f-kube-api-access-5svgm\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.535199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-webhook-cert\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.535251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-apiservice-cert\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.535302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5svgm\" (UniqueName: \"kubernetes.io/projected/078cfcac-e384-409d-8b67-d0b56e051b7f-kube-api-access-5svgm\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.539758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-apiservice-cert\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.540567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-webhook-cert\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.547484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5svgm\" (UniqueName: \"kubernetes.io/projected/078cfcac-e384-409d-8b67-d0b56e051b7f-kube-api-access-5svgm\") pod \"keystone-operator-controller-manager-5885849d4c-vm8k7\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:22 crc kubenswrapper[4707]: I0121 17:05:22.694210 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:23 crc kubenswrapper[4707]: I0121 17:05:23.085500 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7"] Jan 21 17:05:23 crc kubenswrapper[4707]: I0121 17:05:23.145912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" event={"ID":"078cfcac-e384-409d-8b67-d0b56e051b7f","Type":"ContainerStarted","Data":"42e067ec320145591960fccc3d1633c7a7e188de74f1ff67967860bf89279176"} Jan 21 17:05:24 crc kubenswrapper[4707]: I0121 17:05:24.152353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" event={"ID":"078cfcac-e384-409d-8b67-d0b56e051b7f","Type":"ContainerStarted","Data":"ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574"} Jan 21 17:05:24 crc kubenswrapper[4707]: I0121 17:05:24.152563 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:24 crc kubenswrapper[4707]: I0121 17:05:24.164925 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" podStartSLOduration=2.164910668 podStartE2EDuration="2.164910668s" podCreationTimestamp="2026-01-21 17:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:05:24.164113429 +0000 UTC m=+7421.345629651" watchObservedRunningTime="2026-01-21 17:05:24.164910668 +0000 UTC m=+7421.346426889" Jan 21 17:05:32 crc kubenswrapper[4707]: I0121 17:05:32.698826 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:05:33 crc kubenswrapper[4707]: I0121 17:05:33.200206 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerID="1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df" exitCode=0 Jan 21 17:05:33 crc kubenswrapper[4707]: I0121 17:05:33.200240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"5b4c43f5-dc57-473d-985e-7dc44e6f06c9","Type":"ContainerDied","Data":"1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df"} Jan 21 17:05:34 crc kubenswrapper[4707]: I0121 17:05:34.206639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"5b4c43f5-dc57-473d-985e-7dc44e6f06c9","Type":"ContainerStarted","Data":"f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c"} Jan 21 17:05:34 crc kubenswrapper[4707]: I0121 17:05:34.207019 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:34 crc kubenswrapper[4707]: I0121 17:05:34.221375 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/rabbitmq-server-0" podStartSLOduration=35.221363042 podStartE2EDuration="35.221363042s" podCreationTimestamp="2026-01-21 17:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:05:34.219741544 
+0000 UTC m=+7431.401257766" watchObservedRunningTime="2026-01-21 17:05:34.221363042 +0000 UTC m=+7431.402879264" Jan 21 17:05:37 crc kubenswrapper[4707]: I0121 17:05:37.848899 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-d97qc"] Jan 21 17:05:37 crc kubenswrapper[4707]: I0121 17:05:37.850154 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:37 crc kubenswrapper[4707]: I0121 17:05:37.852914 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-lkcvj" Jan 21 17:05:37 crc kubenswrapper[4707]: I0121 17:05:37.855567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-d97qc"] Jan 21 17:05:38 crc kubenswrapper[4707]: I0121 17:05:38.032534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h54vs\" (UniqueName: \"kubernetes.io/projected/3cd98a54-a0ff-4a65-b77c-f9cc53537670-kube-api-access-h54vs\") pod \"horizon-operator-index-d97qc\" (UID: \"3cd98a54-a0ff-4a65-b77c-f9cc53537670\") " pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:38 crc kubenswrapper[4707]: I0121 17:05:38.134287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h54vs\" (UniqueName: \"kubernetes.io/projected/3cd98a54-a0ff-4a65-b77c-f9cc53537670-kube-api-access-h54vs\") pod \"horizon-operator-index-d97qc\" (UID: \"3cd98a54-a0ff-4a65-b77c-f9cc53537670\") " pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:38 crc kubenswrapper[4707]: I0121 17:05:38.149876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h54vs\" (UniqueName: \"kubernetes.io/projected/3cd98a54-a0ff-4a65-b77c-f9cc53537670-kube-api-access-h54vs\") pod \"horizon-operator-index-d97qc\" (UID: \"3cd98a54-a0ff-4a65-b77c-f9cc53537670\") " pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:38 crc kubenswrapper[4707]: I0121 17:05:38.166532 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:38 crc kubenswrapper[4707]: I0121 17:05:38.534895 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-d97qc"] Jan 21 17:05:39 crc kubenswrapper[4707]: I0121 17:05:39.240156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-d97qc" event={"ID":"3cd98a54-a0ff-4a65-b77c-f9cc53537670","Type":"ContainerStarted","Data":"cae53d8164a52a16b8e4713d9043dc1680b7e2b0f79e4046e83636a38d397a4e"} Jan 21 17:05:39 crc kubenswrapper[4707]: I0121 17:05:39.945214 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:05:39 crc kubenswrapper[4707]: I0121 17:05:39.945442 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:05:39 crc kubenswrapper[4707]: I0121 17:05:39.945479 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:05:39 crc kubenswrapper[4707]: I0121 17:05:39.946016 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:05:39 crc kubenswrapper[4707]: I0121 17:05:39.946061 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" gracePeriod=600 Jan 21 17:05:40 crc kubenswrapper[4707]: E0121 17:05:40.068664 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.239859 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-db-create-mcv6c"] Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.240730 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.248642 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl"] Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.249286 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.252437 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-db-secret" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.253576 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-mcv6c"] Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.259178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl"] Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.268739 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" exitCode=0 Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.268768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6"} Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.268791 4707 scope.go:117] "RemoveContainer" containerID="db6cb0803a4872ab9454e605e8d5382632c4ff38cb23724c972fdbbde4bc97c5" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.269458 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:05:40 crc kubenswrapper[4707]: E0121 17:05:40.269703 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.365483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgxt9\" (UniqueName: \"kubernetes.io/projected/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-kube-api-access-wgxt9\") pod \"keystone-b6ca-account-create-update-gl9xl\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.365539 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-operator-scripts\") pod \"keystone-b6ca-account-create-update-gl9xl\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.365688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220973a9-7456-4526-a610-7793dfea1cbf-operator-scripts\") pod \"keystone-db-create-mcv6c\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.365720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xr7c6\" (UniqueName: \"kubernetes.io/projected/220973a9-7456-4526-a610-7793dfea1cbf-kube-api-access-xr7c6\") pod \"keystone-db-create-mcv6c\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.467305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220973a9-7456-4526-a610-7793dfea1cbf-operator-scripts\") pod \"keystone-db-create-mcv6c\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.467354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr7c6\" (UniqueName: \"kubernetes.io/projected/220973a9-7456-4526-a610-7793dfea1cbf-kube-api-access-xr7c6\") pod \"keystone-db-create-mcv6c\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.467424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgxt9\" (UniqueName: \"kubernetes.io/projected/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-kube-api-access-wgxt9\") pod \"keystone-b6ca-account-create-update-gl9xl\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.467455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-operator-scripts\") pod \"keystone-b6ca-account-create-update-gl9xl\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.468154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220973a9-7456-4526-a610-7793dfea1cbf-operator-scripts\") pod \"keystone-db-create-mcv6c\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.469461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-operator-scripts\") pod \"keystone-b6ca-account-create-update-gl9xl\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.482899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr7c6\" (UniqueName: \"kubernetes.io/projected/220973a9-7456-4526-a610-7793dfea1cbf-kube-api-access-xr7c6\") pod \"keystone-db-create-mcv6c\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.501735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgxt9\" (UniqueName: \"kubernetes.io/projected/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-kube-api-access-wgxt9\") pod \"keystone-b6ca-account-create-update-gl9xl\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " 
pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.559442 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.566561 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:40 crc kubenswrapper[4707]: I0121 17:05:40.956420 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-mcv6c"] Jan 21 17:05:40 crc kubenswrapper[4707]: W0121 17:05:40.959037 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod220973a9_7456_4526_a610_7793dfea1cbf.slice/crio-ce9654d235f5482bdf6aaba05c37beb72ae04f6a021df185a3ec0c9da6dae0e0 WatchSource:0}: Error finding container ce9654d235f5482bdf6aaba05c37beb72ae04f6a021df185a3ec0c9da6dae0e0: Status 404 returned error can't find the container with id ce9654d235f5482bdf6aaba05c37beb72ae04f6a021df185a3ec0c9da6dae0e0 Jan 21 17:05:41 crc kubenswrapper[4707]: I0121 17:05:41.017944 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl"] Jan 21 17:05:41 crc kubenswrapper[4707]: W0121 17:05:41.030131 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97a905ee_abc4_4123_bc7a_b6fc67b1c42c.slice/crio-2fa1d0a69341181a989f6dfdcca6236557a6f670fd5d6916bfd93c897af8ed3f WatchSource:0}: Error finding container 2fa1d0a69341181a989f6dfdcca6236557a6f670fd5d6916bfd93c897af8ed3f: Status 404 returned error can't find the container with id 2fa1d0a69341181a989f6dfdcca6236557a6f670fd5d6916bfd93c897af8ed3f Jan 21 17:05:41 crc kubenswrapper[4707]: I0121 17:05:41.278079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" event={"ID":"97a905ee-abc4-4123-bc7a-b6fc67b1c42c","Type":"ContainerStarted","Data":"2fa1d0a69341181a989f6dfdcca6236557a6f670fd5d6916bfd93c897af8ed3f"} Jan 21 17:05:41 crc kubenswrapper[4707]: I0121 17:05:41.280263 4707 generic.go:334] "Generic (PLEG): container finished" podID="220973a9-7456-4526-a610-7793dfea1cbf" containerID="9f1dc836d81ce57b3685d4f4b305ccb9350965ed3e25cd4384b794041ec9f15a" exitCode=0 Jan 21 17:05:41 crc kubenswrapper[4707]: I0121 17:05:41.280341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" event={"ID":"220973a9-7456-4526-a610-7793dfea1cbf","Type":"ContainerDied","Data":"9f1dc836d81ce57b3685d4f4b305ccb9350965ed3e25cd4384b794041ec9f15a"} Jan 21 17:05:41 crc kubenswrapper[4707]: I0121 17:05:41.280364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" event={"ID":"220973a9-7456-4526-a610-7793dfea1cbf","Type":"ContainerStarted","Data":"ce9654d235f5482bdf6aaba05c37beb72ae04f6a021df185a3ec0c9da6dae0e0"} Jan 21 17:05:42 crc kubenswrapper[4707]: I0121 17:05:42.886427 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.003655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220973a9-7456-4526-a610-7793dfea1cbf-operator-scripts\") pod \"220973a9-7456-4526-a610-7793dfea1cbf\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.003790 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr7c6\" (UniqueName: \"kubernetes.io/projected/220973a9-7456-4526-a610-7793dfea1cbf-kube-api-access-xr7c6\") pod \"220973a9-7456-4526-a610-7793dfea1cbf\" (UID: \"220973a9-7456-4526-a610-7793dfea1cbf\") " Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.004855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220973a9-7456-4526-a610-7793dfea1cbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "220973a9-7456-4526-a610-7793dfea1cbf" (UID: "220973a9-7456-4526-a610-7793dfea1cbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.008494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/220973a9-7456-4526-a610-7793dfea1cbf-kube-api-access-xr7c6" (OuterVolumeSpecName: "kube-api-access-xr7c6") pod "220973a9-7456-4526-a610-7793dfea1cbf" (UID: "220973a9-7456-4526-a610-7793dfea1cbf"). InnerVolumeSpecName "kube-api-access-xr7c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.105717 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220973a9-7456-4526-a610-7793dfea1cbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.105747 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr7c6\" (UniqueName: \"kubernetes.io/projected/220973a9-7456-4526-a610-7793dfea1cbf-kube-api-access-xr7c6\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.295451 4707 generic.go:334] "Generic (PLEG): container finished" podID="97a905ee-abc4-4123-bc7a-b6fc67b1c42c" containerID="0fc5398554874817d4e2bc028cdbab2d2ac0dba239b2a72883da545b4d9191ef" exitCode=0 Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.295529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" event={"ID":"97a905ee-abc4-4123-bc7a-b6fc67b1c42c","Type":"ContainerDied","Data":"0fc5398554874817d4e2bc028cdbab2d2ac0dba239b2a72883da545b4d9191ef"} Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.296992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" event={"ID":"220973a9-7456-4526-a610-7793dfea1cbf","Type":"ContainerDied","Data":"ce9654d235f5482bdf6aaba05c37beb72ae04f6a021df185a3ec0c9da6dae0e0"} Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.297025 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce9654d235f5482bdf6aaba05c37beb72ae04f6a021df185a3ec0c9da6dae0e0" Jan 21 17:05:43 crc kubenswrapper[4707]: I0121 17:05:43.297048 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-mcv6c" Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.474889 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.652393 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgxt9\" (UniqueName: \"kubernetes.io/projected/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-kube-api-access-wgxt9\") pod \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.652679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-operator-scripts\") pod \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\" (UID: \"97a905ee-abc4-4123-bc7a-b6fc67b1c42c\") " Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.653042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97a905ee-abc4-4123-bc7a-b6fc67b1c42c" (UID: "97a905ee-abc4-4123-bc7a-b6fc67b1c42c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.657590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-kube-api-access-wgxt9" (OuterVolumeSpecName: "kube-api-access-wgxt9") pod "97a905ee-abc4-4123-bc7a-b6fc67b1c42c" (UID: "97a905ee-abc4-4123-bc7a-b6fc67b1c42c"). InnerVolumeSpecName "kube-api-access-wgxt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.753913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgxt9\" (UniqueName: \"kubernetes.io/projected/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-kube-api-access-wgxt9\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:46 crc kubenswrapper[4707]: I0121 17:05:46.753944 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a905ee-abc4-4123-bc7a-b6fc67b1c42c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:47 crc kubenswrapper[4707]: I0121 17:05:47.328141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-d97qc" event={"ID":"3cd98a54-a0ff-4a65-b77c-f9cc53537670","Type":"ContainerStarted","Data":"624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa"} Jan 21 17:05:47 crc kubenswrapper[4707]: I0121 17:05:47.329235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" event={"ID":"97a905ee-abc4-4123-bc7a-b6fc67b1c42c","Type":"ContainerDied","Data":"2fa1d0a69341181a989f6dfdcca6236557a6f670fd5d6916bfd93c897af8ed3f"} Jan 21 17:05:47 crc kubenswrapper[4707]: I0121 17:05:47.329262 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa1d0a69341181a989f6dfdcca6236557a6f670fd5d6916bfd93c897af8ed3f" Jan 21 17:05:47 crc kubenswrapper[4707]: I0121 17:05:47.329268 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl" Jan 21 17:05:47 crc kubenswrapper[4707]: I0121 17:05:47.341294 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-d97qc" podStartSLOduration=2.329816664 podStartE2EDuration="10.341280174s" podCreationTimestamp="2026-01-21 17:05:37 +0000 UTC" firstStartedPulling="2026-01-21 17:05:38.537466671 +0000 UTC m=+7435.718982893" lastFinishedPulling="2026-01-21 17:05:46.548930181 +0000 UTC m=+7443.730446403" observedRunningTime="2026-01-21 17:05:47.337229357 +0000 UTC m=+7444.518745580" watchObservedRunningTime="2026-01-21 17:05:47.341280174 +0000 UTC m=+7444.522796396" Jan 21 17:05:48 crc kubenswrapper[4707]: I0121 17:05:48.167776 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:48 crc kubenswrapper[4707]: I0121 17:05:48.167852 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:48 crc kubenswrapper[4707]: I0121 17:05:48.195400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.774953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.852397 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fpljq"] Jan 21 17:05:50 crc kubenswrapper[4707]: E0121 17:05:50.852686 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a905ee-abc4-4123-bc7a-b6fc67b1c42c" containerName="mariadb-account-create-update" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.852700 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a905ee-abc4-4123-bc7a-b6fc67b1c42c" containerName="mariadb-account-create-update" Jan 21 17:05:50 crc kubenswrapper[4707]: E0121 17:05:50.852719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220973a9-7456-4526-a610-7793dfea1cbf" containerName="mariadb-database-create" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.852725 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="220973a9-7456-4526-a610-7793dfea1cbf" containerName="mariadb-database-create" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.852898 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a905ee-abc4-4123-bc7a-b6fc67b1c42c" containerName="mariadb-account-create-update" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.852914 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="220973a9-7456-4526-a610-7793dfea1cbf" containerName="mariadb-database-create" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.853787 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:50 crc kubenswrapper[4707]: I0121 17:05:50.861149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fpljq"] Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.010523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcr2\" (UniqueName: \"kubernetes.io/projected/25d08d6d-dc31-4340-8243-3f86ee3d7540-kube-api-access-fpcr2\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.010592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-utilities\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.010706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-catalog-content\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.111541 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcr2\" (UniqueName: \"kubernetes.io/projected/25d08d6d-dc31-4340-8243-3f86ee3d7540-kube-api-access-fpcr2\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.111625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-utilities\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.111687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-catalog-content\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.112096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-catalog-content\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.112153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-utilities\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.125845 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fpcr2\" (UniqueName: \"kubernetes.io/projected/25d08d6d-dc31-4340-8243-3f86ee3d7540-kube-api-access-fpcr2\") pod \"certified-operators-fpljq\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.181952 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.222243 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-qbsn9"] Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.223077 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.226426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.227824 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.227914 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-qf5rm" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.227993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.228603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-qbsn9"] Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.314208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c0b89c-689f-4db6-9c1b-2759a23d5c58-config-data\") pod \"keystone-db-sync-qbsn9\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.314528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv647\" (UniqueName: \"kubernetes.io/projected/12c0b89c-689f-4db6-9c1b-2759a23d5c58-kube-api-access-kv647\") pod \"keystone-db-sync-qbsn9\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.416395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c0b89c-689f-4db6-9c1b-2759a23d5c58-config-data\") pod \"keystone-db-sync-qbsn9\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.416457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv647\" (UniqueName: \"kubernetes.io/projected/12c0b89c-689f-4db6-9c1b-2759a23d5c58-kube-api-access-kv647\") pod \"keystone-db-sync-qbsn9\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.420006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c0b89c-689f-4db6-9c1b-2759a23d5c58-config-data\") 
pod \"keystone-db-sync-qbsn9\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.430823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv647\" (UniqueName: \"kubernetes.io/projected/12c0b89c-689f-4db6-9c1b-2759a23d5c58-kube-api-access-kv647\") pod \"keystone-db-sync-qbsn9\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.543398 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.604833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fpljq"] Jan 21 17:05:51 crc kubenswrapper[4707]: W0121 17:05:51.612820 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d08d6d_dc31_4340_8243_3f86ee3d7540.slice/crio-27a47493b397e8bce56ead014ea53c58c9fb02d32cfe29c4421e50aeb485172a WatchSource:0}: Error finding container 27a47493b397e8bce56ead014ea53c58c9fb02d32cfe29c4421e50aeb485172a: Status 404 returned error can't find the container with id 27a47493b397e8bce56ead014ea53c58c9fb02d32cfe29c4421e50aeb485172a Jan 21 17:05:51 crc kubenswrapper[4707]: I0121 17:05:51.971519 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-qbsn9"] Jan 21 17:05:51 crc kubenswrapper[4707]: W0121 17:05:51.976359 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12c0b89c_689f_4db6_9c1b_2759a23d5c58.slice/crio-e0424bf3240a764dc7c3f20d16391622dc08cc68b4d41616867ed69fe7e6b9bb WatchSource:0}: Error finding container e0424bf3240a764dc7c3f20d16391622dc08cc68b4d41616867ed69fe7e6b9bb: Status 404 returned error can't find the container with id e0424bf3240a764dc7c3f20d16391622dc08cc68b4d41616867ed69fe7e6b9bb Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.190939 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:05:52 crc kubenswrapper[4707]: E0121 17:05:52.191147 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.359160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" event={"ID":"12c0b89c-689f-4db6-9c1b-2759a23d5c58","Type":"ContainerStarted","Data":"cb9ed4451f264f7207558f19fece8b6915bd3bd1e2a3ac5fb38bd23622dc01a2"} Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.359206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" event={"ID":"12c0b89c-689f-4db6-9c1b-2759a23d5c58","Type":"ContainerStarted","Data":"e0424bf3240a764dc7c3f20d16391622dc08cc68b4d41616867ed69fe7e6b9bb"} Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.360593 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerID="184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b" exitCode=0 Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.360644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerDied","Data":"184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b"} Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.360667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerStarted","Data":"27a47493b397e8bce56ead014ea53c58c9fb02d32cfe29c4421e50aeb485172a"} Jan 21 17:05:52 crc kubenswrapper[4707]: I0121 17:05:52.371243 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" podStartSLOduration=1.3712299159999999 podStartE2EDuration="1.371229916s" podCreationTimestamp="2026-01-21 17:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:05:52.370379578 +0000 UTC m=+7449.551895800" watchObservedRunningTime="2026-01-21 17:05:52.371229916 +0000 UTC m=+7449.552746138" Jan 21 17:05:53 crc kubenswrapper[4707]: I0121 17:05:53.367406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerStarted","Data":"6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24"} Jan 21 17:05:54 crc kubenswrapper[4707]: I0121 17:05:54.374171 4707 generic.go:334] "Generic (PLEG): container finished" podID="12c0b89c-689f-4db6-9c1b-2759a23d5c58" containerID="cb9ed4451f264f7207558f19fece8b6915bd3bd1e2a3ac5fb38bd23622dc01a2" exitCode=0 Jan 21 17:05:54 crc kubenswrapper[4707]: I0121 17:05:54.374246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" event={"ID":"12c0b89c-689f-4db6-9c1b-2759a23d5c58","Type":"ContainerDied","Data":"cb9ed4451f264f7207558f19fece8b6915bd3bd1e2a3ac5fb38bd23622dc01a2"} Jan 21 17:05:54 crc kubenswrapper[4707]: I0121 17:05:54.376085 4707 generic.go:334] "Generic (PLEG): container finished" podID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerID="6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24" exitCode=0 Jan 21 17:05:54 crc kubenswrapper[4707]: I0121 17:05:54.376119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerDied","Data":"6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24"} Jan 21 17:05:55 crc kubenswrapper[4707]: I0121 17:05:55.383836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerStarted","Data":"77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab"} Jan 21 17:05:55 crc kubenswrapper[4707]: I0121 17:05:55.396492 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fpljq" podStartSLOduration=2.801958533 podStartE2EDuration="5.396476813s" podCreationTimestamp="2026-01-21 17:05:50 +0000 UTC" firstStartedPulling="2026-01-21 17:05:52.363369216 +0000 UTC 
m=+7449.544885438" lastFinishedPulling="2026-01-21 17:05:54.957887495 +0000 UTC m=+7452.139403718" observedRunningTime="2026-01-21 17:05:55.395494747 +0000 UTC m=+7452.577010969" watchObservedRunningTime="2026-01-21 17:05:55.396476813 +0000 UTC m=+7452.577993035" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.668722 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.777006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c0b89c-689f-4db6-9c1b-2759a23d5c58-config-data\") pod \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.777216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv647\" (UniqueName: \"kubernetes.io/projected/12c0b89c-689f-4db6-9c1b-2759a23d5c58-kube-api-access-kv647\") pod \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\" (UID: \"12c0b89c-689f-4db6-9c1b-2759a23d5c58\") " Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.781574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c0b89c-689f-4db6-9c1b-2759a23d5c58-kube-api-access-kv647" (OuterVolumeSpecName: "kube-api-access-kv647") pod "12c0b89c-689f-4db6-9c1b-2759a23d5c58" (UID: "12c0b89c-689f-4db6-9c1b-2759a23d5c58"). InnerVolumeSpecName "kube-api-access-kv647". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.806995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c0b89c-689f-4db6-9c1b-2759a23d5c58-config-data" (OuterVolumeSpecName: "config-data") pod "12c0b89c-689f-4db6-9c1b-2759a23d5c58" (UID: "12c0b89c-689f-4db6-9c1b-2759a23d5c58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.878914 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv647\" (UniqueName: \"kubernetes.io/projected/12c0b89c-689f-4db6-9c1b-2759a23d5c58-kube-api-access-kv647\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:55.878942 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c0b89c-689f-4db6-9c1b-2759a23d5c58-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.391526 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.398927 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-qbsn9" event={"ID":"12c0b89c-689f-4db6-9c1b-2759a23d5c58","Type":"ContainerDied","Data":"e0424bf3240a764dc7c3f20d16391622dc08cc68b4d41616867ed69fe7e6b9bb"} Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.398970 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0424bf3240a764dc7c3f20d16391622dc08cc68b4d41616867ed69fe7e6b9bb" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.571344 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-7xvw4"] Jan 21 17:05:56 crc kubenswrapper[4707]: E0121 17:05:56.571583 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c0b89c-689f-4db6-9c1b-2759a23d5c58" containerName="keystone-db-sync" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.571601 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c0b89c-689f-4db6-9c1b-2759a23d5c58" containerName="keystone-db-sync" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.571735 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c0b89c-689f-4db6-9c1b-2759a23d5c58" containerName="keystone-db-sync" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.572159 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.575124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"osp-secret" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.575273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-qf5rm" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.575394 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.576358 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.579534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-7xvw4"] Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.582202 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.689782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-fernet-keys\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.689999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-credential-keys\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.690045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-scripts\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.690074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qllh\" (UniqueName: \"kubernetes.io/projected/6325709f-f82f-4424-ad1a-aceef668a4e2-kube-api-access-6qllh\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.690250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-config-data\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.791416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-config-data\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.791503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-fernet-keys\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.791568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-credential-keys\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.791589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-scripts\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.792162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qllh\" (UniqueName: \"kubernetes.io/projected/6325709f-f82f-4424-ad1a-aceef668a4e2-kube-api-access-6qllh\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.795285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-config-data\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.796067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-scripts\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.799249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-fernet-keys\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.810201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qllh\" (UniqueName: \"kubernetes.io/projected/6325709f-f82f-4424-ad1a-aceef668a4e2-kube-api-access-6qllh\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.815432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-credential-keys\") pod \"keystone-bootstrap-7xvw4\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:56 crc kubenswrapper[4707]: I0121 17:05:56.884541 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:05:57 crc kubenswrapper[4707]: I0121 17:05:57.231670 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-7xvw4"] Jan 21 17:05:57 crc kubenswrapper[4707]: W0121 17:05:57.233141 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6325709f_f82f_4424_ad1a_aceef668a4e2.slice/crio-b4578983bb85a736ac37b329fbca62e2ebfaa195f1fd87abb8efbe9491af15ad WatchSource:0}: Error finding container b4578983bb85a736ac37b329fbca62e2ebfaa195f1fd87abb8efbe9491af15ad: Status 404 returned error can't find the container with id b4578983bb85a736ac37b329fbca62e2ebfaa195f1fd87abb8efbe9491af15ad Jan 21 17:05:57 crc kubenswrapper[4707]: I0121 17:05:57.397345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" event={"ID":"6325709f-f82f-4424-ad1a-aceef668a4e2","Type":"ContainerStarted","Data":"f950f73b230861e22d259c79f4956ec775dbee302d232a9eb474136f88ff6c4a"} Jan 21 17:05:57 crc kubenswrapper[4707]: I0121 17:05:57.397465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" event={"ID":"6325709f-f82f-4424-ad1a-aceef668a4e2","Type":"ContainerStarted","Data":"b4578983bb85a736ac37b329fbca62e2ebfaa195f1fd87abb8efbe9491af15ad"} Jan 21 17:05:57 crc kubenswrapper[4707]: I0121 17:05:57.412923 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" podStartSLOduration=1.412908323 podStartE2EDuration="1.412908323s" podCreationTimestamp="2026-01-21 17:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:05:57.408249483 +0000 UTC m=+7454.589765715" watchObservedRunningTime="2026-01-21 17:05:57.412908323 +0000 UTC m=+7454.594424545" Jan 21 17:05:58 crc kubenswrapper[4707]: I0121 17:05:58.187891 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:06:00 crc kubenswrapper[4707]: I0121 17:06:00.416703 4707 generic.go:334] "Generic (PLEG): container finished" podID="6325709f-f82f-4424-ad1a-aceef668a4e2" containerID="f950f73b230861e22d259c79f4956ec775dbee302d232a9eb474136f88ff6c4a" exitCode=0 Jan 21 17:06:00 crc kubenswrapper[4707]: I0121 17:06:00.416787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" event={"ID":"6325709f-f82f-4424-ad1a-aceef668a4e2","Type":"ContainerDied","Data":"f950f73b230861e22d259c79f4956ec775dbee302d232a9eb474136f88ff6c4a"} Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.188323 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.188354 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.209510 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.450635 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.788220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.963958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qllh\" (UniqueName: \"kubernetes.io/projected/6325709f-f82f-4424-ad1a-aceef668a4e2-kube-api-access-6qllh\") pod \"6325709f-f82f-4424-ad1a-aceef668a4e2\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.964019 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-scripts\") pod \"6325709f-f82f-4424-ad1a-aceef668a4e2\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.964106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-config-data\") pod \"6325709f-f82f-4424-ad1a-aceef668a4e2\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.964142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-credential-keys\") pod \"6325709f-f82f-4424-ad1a-aceef668a4e2\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.964195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-fernet-keys\") pod \"6325709f-f82f-4424-ad1a-aceef668a4e2\" (UID: \"6325709f-f82f-4424-ad1a-aceef668a4e2\") " Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.968232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-scripts" (OuterVolumeSpecName: "scripts") pod "6325709f-f82f-4424-ad1a-aceef668a4e2" (UID: "6325709f-f82f-4424-ad1a-aceef668a4e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.968867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6325709f-f82f-4424-ad1a-aceef668a4e2" (UID: "6325709f-f82f-4424-ad1a-aceef668a4e2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.968880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6325709f-f82f-4424-ad1a-aceef668a4e2-kube-api-access-6qllh" (OuterVolumeSpecName: "kube-api-access-6qllh") pod "6325709f-f82f-4424-ad1a-aceef668a4e2" (UID: "6325709f-f82f-4424-ad1a-aceef668a4e2"). InnerVolumeSpecName "kube-api-access-6qllh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.968880 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6325709f-f82f-4424-ad1a-aceef668a4e2" (UID: "6325709f-f82f-4424-ad1a-aceef668a4e2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:06:01 crc kubenswrapper[4707]: I0121 17:06:01.979017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-config-data" (OuterVolumeSpecName: "config-data") pod "6325709f-f82f-4424-ad1a-aceef668a4e2" (UID: "6325709f-f82f-4424-ad1a-aceef668a4e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.065779 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.065933 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.066004 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.066055 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qllh\" (UniqueName: \"kubernetes.io/projected/6325709f-f82f-4424-ad1a-aceef668a4e2-kube-api-access-6qllh\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.066099 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6325709f-f82f-4424-ad1a-aceef668a4e2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.430203 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.430197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-7xvw4" event={"ID":"6325709f-f82f-4424-ad1a-aceef668a4e2","Type":"ContainerDied","Data":"b4578983bb85a736ac37b329fbca62e2ebfaa195f1fd87abb8efbe9491af15ad"} Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.430458 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4578983bb85a736ac37b329fbca62e2ebfaa195f1fd87abb8efbe9491af15ad" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.478562 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-8644c6967d-hrxd6"] Jan 21 17:06:02 crc kubenswrapper[4707]: E0121 17:06:02.478843 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6325709f-f82f-4424-ad1a-aceef668a4e2" containerName="keystone-bootstrap" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.478856 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6325709f-f82f-4424-ad1a-aceef668a4e2" containerName="keystone-bootstrap" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.478960 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6325709f-f82f-4424-ad1a-aceef668a4e2" containerName="keystone-bootstrap" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.479362 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.480983 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-qf5rm" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.481137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.481575 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.481783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.485636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-8644c6967d-hrxd6"] Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.673840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-scripts\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.673877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-config-data\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.673940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-credential-keys\") pod \"keystone-8644c6967d-hrxd6\" 
(UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.674009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jw9\" (UniqueName: \"kubernetes.io/projected/21181af0-ee46-42db-a058-42ed66488786-kube-api-access-96jw9\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.674038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-fernet-keys\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.775736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96jw9\" (UniqueName: \"kubernetes.io/projected/21181af0-ee46-42db-a058-42ed66488786-kube-api-access-96jw9\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.775780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-fernet-keys\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.775929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-scripts\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.775946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-config-data\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.776001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-credential-keys\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.778870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-fernet-keys\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.779182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-scripts\") pod \"keystone-8644c6967d-hrxd6\" (UID: 
\"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.780546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-config-data\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.782798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-credential-keys\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.789137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jw9\" (UniqueName: \"kubernetes.io/projected/21181af0-ee46-42db-a058-42ed66488786-kube-api-access-96jw9\") pod \"keystone-8644c6967d-hrxd6\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:02 crc kubenswrapper[4707]: I0121 17:06:02.792171 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:03 crc kubenswrapper[4707]: I0121 17:06:03.142861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-8644c6967d-hrxd6"] Jan 21 17:06:03 crc kubenswrapper[4707]: I0121 17:06:03.437401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" event={"ID":"21181af0-ee46-42db-a058-42ed66488786","Type":"ContainerStarted","Data":"9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8"} Jan 21 17:06:03 crc kubenswrapper[4707]: I0121 17:06:03.437439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" event={"ID":"21181af0-ee46-42db-a058-42ed66488786","Type":"ContainerStarted","Data":"9307669e1fafcfee9ff9aaa71762b66146a39ee693949ae4658a156e8a3415e9"} Jan 21 17:06:03 crc kubenswrapper[4707]: I0121 17:06:03.438406 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:03 crc kubenswrapper[4707]: I0121 17:06:03.455191 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" podStartSLOduration=1.455176469 podStartE2EDuration="1.455176469s" podCreationTimestamp="2026-01-21 17:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:06:03.449217675 +0000 UTC m=+7460.630733897" watchObservedRunningTime="2026-01-21 17:06:03.455176469 +0000 UTC m=+7460.636692691" Jan 21 17:06:04 crc kubenswrapper[4707]: I0121 17:06:04.182680 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:06:04 crc kubenswrapper[4707]: E0121 17:06:04.183719 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:06:04 crc kubenswrapper[4707]: I0121 17:06:04.445459 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fpljq"] Jan 21 17:06:04 crc kubenswrapper[4707]: I0121 17:06:04.445635 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fpljq" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="registry-server" containerID="cri-o://77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab" gracePeriod=2 Jan 21 17:06:04 crc kubenswrapper[4707]: I0121 17:06:04.859354 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.007355 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-utilities\") pod \"25d08d6d-dc31-4340-8243-3f86ee3d7540\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.007415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpcr2\" (UniqueName: \"kubernetes.io/projected/25d08d6d-dc31-4340-8243-3f86ee3d7540-kube-api-access-fpcr2\") pod \"25d08d6d-dc31-4340-8243-3f86ee3d7540\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.007501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-catalog-content\") pod \"25d08d6d-dc31-4340-8243-3f86ee3d7540\" (UID: \"25d08d6d-dc31-4340-8243-3f86ee3d7540\") " Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.008056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-utilities" (OuterVolumeSpecName: "utilities") pod "25d08d6d-dc31-4340-8243-3f86ee3d7540" (UID: "25d08d6d-dc31-4340-8243-3f86ee3d7540"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.011094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d08d6d-dc31-4340-8243-3f86ee3d7540-kube-api-access-fpcr2" (OuterVolumeSpecName: "kube-api-access-fpcr2") pod "25d08d6d-dc31-4340-8243-3f86ee3d7540" (UID: "25d08d6d-dc31-4340-8243-3f86ee3d7540"). InnerVolumeSpecName "kube-api-access-fpcr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.039868 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25d08d6d-dc31-4340-8243-3f86ee3d7540" (UID: "25d08d6d-dc31-4340-8243-3f86ee3d7540"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.109151 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.109180 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpcr2\" (UniqueName: \"kubernetes.io/projected/25d08d6d-dc31-4340-8243-3f86ee3d7540-kube-api-access-fpcr2\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.109192 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d08d6d-dc31-4340-8243-3f86ee3d7540-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.450970 4707 generic.go:334] "Generic (PLEG): container finished" podID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerID="77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab" exitCode=0 Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.451011 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpljq" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.451052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerDied","Data":"77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab"} Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.451089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpljq" event={"ID":"25d08d6d-dc31-4340-8243-3f86ee3d7540","Type":"ContainerDied","Data":"27a47493b397e8bce56ead014ea53c58c9fb02d32cfe29c4421e50aeb485172a"} Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.451105 4707 scope.go:117] "RemoveContainer" containerID="77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.465089 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fpljq"] Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.465535 4707 scope.go:117] "RemoveContainer" containerID="6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.470151 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fpljq"] Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.480096 4707 scope.go:117] "RemoveContainer" containerID="184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.496376 4707 scope.go:117] "RemoveContainer" containerID="77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab" Jan 21 17:06:05 crc kubenswrapper[4707]: E0121 17:06:05.496677 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab\": container with ID starting with 77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab not found: ID does not exist" containerID="77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.496720 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab"} err="failed to get container status \"77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab\": rpc error: code = NotFound desc = could not find container \"77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab\": container with ID starting with 77dd83c0ef9353cd6f9aed7716a87d1ae4c9e7dc4065d2ea95ecaa7d1af9a1ab not found: ID does not exist" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.496748 4707 scope.go:117] "RemoveContainer" containerID="6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24" Jan 21 17:06:05 crc kubenswrapper[4707]: E0121 17:06:05.497086 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24\": container with ID starting with 6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24 not found: ID does not exist" containerID="6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.497113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24"} err="failed to get container status \"6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24\": rpc error: code = NotFound desc = could not find container \"6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24\": container with ID starting with 6a4e2e87ca41fb42c1226bf7806a29fc5e9b3a8134e7f2c5bfb0b2e3359afe24 not found: ID does not exist" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.497133 4707 scope.go:117] "RemoveContainer" containerID="184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b" Jan 21 17:06:05 crc kubenswrapper[4707]: E0121 17:06:05.497363 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b\": container with ID starting with 184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b not found: ID does not exist" containerID="184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b" Jan 21 17:06:05 crc kubenswrapper[4707]: I0121 17:06:05.497380 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b"} err="failed to get container status \"184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b\": rpc error: code = NotFound desc = could not find container \"184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b\": container with ID starting with 184debd26deef7d60d306c72c5abb0390a80f47937cb62f48a802ead1be1f22b not found: ID does not exist" Jan 21 17:06:07 crc kubenswrapper[4707]: I0121 17:06:07.189020 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" path="/var/lib/kubelet/pods/25d08d6d-dc31-4340-8243-3f86ee3d7540/volumes" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.472374 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs"] Jan 21 17:06:08 crc kubenswrapper[4707]: E0121 17:06:08.472647 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="registry-server" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.472659 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="registry-server" Jan 21 17:06:08 crc kubenswrapper[4707]: E0121 17:06:08.472669 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="extract-utilities" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.472675 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="extract-utilities" Jan 21 17:06:08 crc kubenswrapper[4707]: E0121 17:06:08.472691 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="extract-content" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.472697 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="extract-content" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.472823 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d08d6d-dc31-4340-8243-3f86ee3d7540" containerName="registry-server" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.473645 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.474925 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mg65m" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.479272 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs"] Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.560946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sbg\" (UniqueName: \"kubernetes.io/projected/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-kube-api-access-f2sbg\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.561017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-util\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.561103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-bundle\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.661930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-util\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.662009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-bundle\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.662106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2sbg\" (UniqueName: \"kubernetes.io/projected/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-kube-api-access-f2sbg\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.662526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-util\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.662624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-bundle\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.676985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sbg\" (UniqueName: \"kubernetes.io/projected/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-kube-api-access-f2sbg\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:08 crc kubenswrapper[4707]: I0121 17:06:08.786628 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:09 crc kubenswrapper[4707]: I0121 17:06:09.139591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs"] Jan 21 17:06:09 crc kubenswrapper[4707]: E0121 17:06:09.390166 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c2ebe8_fbfb_468a_b5a7_8e455d5fe063.slice/crio-0fea3f7b80f50927b508dc1ab9d8241266e410660366ad4561107118d2af4d1e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c2ebe8_fbfb_468a_b5a7_8e455d5fe063.slice/crio-conmon-0fea3f7b80f50927b508dc1ab9d8241266e410660366ad4561107118d2af4d1e.scope\": RecentStats: unable to find data in memory cache]" Jan 21 17:06:09 crc kubenswrapper[4707]: I0121 17:06:09.472568 4707 generic.go:334] "Generic (PLEG): container finished" podID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerID="0fea3f7b80f50927b508dc1ab9d8241266e410660366ad4561107118d2af4d1e" exitCode=0 Jan 21 17:06:09 crc kubenswrapper[4707]: I0121 17:06:09.472623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" event={"ID":"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063","Type":"ContainerDied","Data":"0fea3f7b80f50927b508dc1ab9d8241266e410660366ad4561107118d2af4d1e"} Jan 21 17:06:09 crc kubenswrapper[4707]: I0121 17:06:09.472648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" event={"ID":"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063","Type":"ContainerStarted","Data":"b31bbbbf0617455438e1367a7d50d5fc77b6dc0a80d5cddc1e153131980149a6"} Jan 21 17:06:11 crc kubenswrapper[4707]: I0121 17:06:11.485393 4707 generic.go:334] "Generic (PLEG): container finished" podID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerID="c57f0d5540e46c39ab89fecce89278ed46c1ac97f20e0dddf5274613dcc4f070" exitCode=0 Jan 21 17:06:11 crc kubenswrapper[4707]: I0121 17:06:11.485440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" event={"ID":"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063","Type":"ContainerDied","Data":"c57f0d5540e46c39ab89fecce89278ed46c1ac97f20e0dddf5274613dcc4f070"} Jan 21 17:06:12 crc kubenswrapper[4707]: I0121 17:06:12.493300 4707 generic.go:334] "Generic (PLEG): container finished" podID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerID="cc1d61364a44337e82b6acd12557c97164f790996b3959263283a0802538686c" exitCode=0 Jan 21 17:06:12 crc kubenswrapper[4707]: I0121 17:06:12.493340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" event={"ID":"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063","Type":"ContainerDied","Data":"cc1d61364a44337e82b6acd12557c97164f790996b3959263283a0802538686c"} Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.749772 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.828782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-util\") pod \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.829005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-bundle\") pod \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.829213 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2sbg\" (UniqueName: \"kubernetes.io/projected/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-kube-api-access-f2sbg\") pod \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\" (UID: \"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063\") " Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.829745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-bundle" (OuterVolumeSpecName: "bundle") pod "85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" (UID: "85c2ebe8-fbfb-468a-b5a7-8e455d5fe063"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.833092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-kube-api-access-f2sbg" (OuterVolumeSpecName: "kube-api-access-f2sbg") pod "85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" (UID: "85c2ebe8-fbfb-468a-b5a7-8e455d5fe063"). InnerVolumeSpecName "kube-api-access-f2sbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.839650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-util" (OuterVolumeSpecName: "util") pod "85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" (UID: "85c2ebe8-fbfb-468a-b5a7-8e455d5fe063"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.930580 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-util\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.930615 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:13 crc kubenswrapper[4707]: I0121 17:06:13.930629 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2sbg\" (UniqueName: \"kubernetes.io/projected/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063-kube-api-access-f2sbg\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.507230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" event={"ID":"85c2ebe8-fbfb-468a-b5a7-8e455d5fe063","Type":"ContainerDied","Data":"b31bbbbf0617455438e1367a7d50d5fc77b6dc0a80d5cddc1e153131980149a6"} Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.507265 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b31bbbbf0617455438e1367a7d50d5fc77b6dc0a80d5cddc1e153131980149a6" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.507306 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.650555 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jwpmq"] Jan 21 17:06:14 crc kubenswrapper[4707]: E0121 17:06:14.650907 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="util" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.650981 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="util" Jan 21 17:06:14 crc kubenswrapper[4707]: E0121 17:06:14.651044 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="extract" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.651088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="extract" Jan 21 17:06:14 crc kubenswrapper[4707]: E0121 17:06:14.651138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="pull" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.651178 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="pull" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.651368 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" containerName="extract" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.652282 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.658209 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwpmq"] Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.741058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-catalog-content\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.741108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-utilities\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.741186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlrnz\" (UniqueName: \"kubernetes.io/projected/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-kube-api-access-hlrnz\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.841991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-utilities\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.842082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlrnz\" (UniqueName: \"kubernetes.io/projected/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-kube-api-access-hlrnz\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.842185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-catalog-content\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.842435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-utilities\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.842504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-catalog-content\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.855527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hlrnz\" (UniqueName: \"kubernetes.io/projected/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-kube-api-access-hlrnz\") pod \"redhat-operators-jwpmq\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:14 crc kubenswrapper[4707]: I0121 17:06:14.969897 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:15 crc kubenswrapper[4707]: I0121 17:06:15.338314 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwpmq"] Jan 21 17:06:15 crc kubenswrapper[4707]: I0121 17:06:15.513942 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerID="fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2" exitCode=0 Jan 21 17:06:15 crc kubenswrapper[4707]: I0121 17:06:15.514025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerDied","Data":"fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2"} Jan 21 17:06:15 crc kubenswrapper[4707]: I0121 17:06:15.514666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerStarted","Data":"5866eca037e5cb9b2e3de7be3c3a449e7ba844d8d1d37c30827fb47c5c6f1779"} Jan 21 17:06:17 crc kubenswrapper[4707]: I0121 17:06:17.529091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerStarted","Data":"79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9"} Jan 21 17:06:18 crc kubenswrapper[4707]: I0121 17:06:18.536675 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerID="79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9" exitCode=0 Jan 21 17:06:18 crc kubenswrapper[4707]: I0121 17:06:18.536774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerDied","Data":"79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9"} Jan 21 17:06:18 crc kubenswrapper[4707]: I0121 17:06:18.537045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerStarted","Data":"a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e"} Jan 21 17:06:18 crc kubenswrapper[4707]: I0121 17:06:18.560642 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jwpmq" podStartSLOduration=2.100762771 podStartE2EDuration="4.560627364s" podCreationTimestamp="2026-01-21 17:06:14 +0000 UTC" firstStartedPulling="2026-01-21 17:06:15.515628629 +0000 UTC m=+7472.697144851" lastFinishedPulling="2026-01-21 17:06:17.975493222 +0000 UTC m=+7475.157009444" observedRunningTime="2026-01-21 17:06:18.555097176 +0000 UTC m=+7475.736613399" watchObservedRunningTime="2026-01-21 17:06:18.560627364 +0000 UTC m=+7475.742143586" Jan 21 17:06:19 crc kubenswrapper[4707]: I0121 17:06:19.183199 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 
17:06:19 crc kubenswrapper[4707]: E0121 17:06:19.183415 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:06:24 crc kubenswrapper[4707]: I0121 17:06:24.970230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:24 crc kubenswrapper[4707]: I0121 17:06:24.970571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:25 crc kubenswrapper[4707]: I0121 17:06:25.012449 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:25 crc kubenswrapper[4707]: I0121 17:06:25.608219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.652318 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vvw8f"] Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.653719 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.659559 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvw8f"] Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.717797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-catalog-content\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.717959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vsg6\" (UniqueName: \"kubernetes.io/projected/5a71af8c-b3cc-4471-91c1-1df373bb75ec-kube-api-access-8vsg6\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.718107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-utilities\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.819581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-catalog-content\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.819656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8vsg6\" (UniqueName: \"kubernetes.io/projected/5a71af8c-b3cc-4471-91c1-1df373bb75ec-kube-api-access-8vsg6\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.819709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-utilities\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.820089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-catalog-content\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.820152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-utilities\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.837094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vsg6\" (UniqueName: \"kubernetes.io/projected/5a71af8c-b3cc-4471-91c1-1df373bb75ec-kube-api-access-8vsg6\") pod \"redhat-marketplace-vvw8f\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:27 crc kubenswrapper[4707]: I0121 17:06:27.967399 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:28 crc kubenswrapper[4707]: I0121 17:06:28.468472 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvw8f"] Jan 21 17:06:28 crc kubenswrapper[4707]: I0121 17:06:28.599244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerStarted","Data":"7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35"} Jan 21 17:06:28 crc kubenswrapper[4707]: I0121 17:06:28.599404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerStarted","Data":"1b3999016d246827854693c8631eb3cfc358e6ae0f1368368e03edcbe2226c28"} Jan 21 17:06:29 crc kubenswrapper[4707]: I0121 17:06:29.606301 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerID="7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35" exitCode=0 Jan 21 17:06:29 crc kubenswrapper[4707]: I0121 17:06:29.607012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerDied","Data":"7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35"} Jan 21 17:06:29 crc kubenswrapper[4707]: I0121 17:06:29.645687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwpmq"] Jan 21 17:06:29 crc kubenswrapper[4707]: I0121 17:06:29.645904 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jwpmq" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="registry-server" containerID="cri-o://a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e" gracePeriod=2 Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.004408 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.044058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-utilities\") pod \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.044105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlrnz\" (UniqueName: \"kubernetes.io/projected/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-kube-api-access-hlrnz\") pod \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.044214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-catalog-content\") pod \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\" (UID: \"4ae4a552-fbf9-469e-8bfc-450c0cd412c0\") " Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.045563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-utilities" (OuterVolumeSpecName: "utilities") pod "4ae4a552-fbf9-469e-8bfc-450c0cd412c0" (UID: "4ae4a552-fbf9-469e-8bfc-450c0cd412c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.048944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-kube-api-access-hlrnz" (OuterVolumeSpecName: "kube-api-access-hlrnz") pod "4ae4a552-fbf9-469e-8bfc-450c0cd412c0" (UID: "4ae4a552-fbf9-469e-8bfc-450c0cd412c0"). InnerVolumeSpecName "kube-api-access-hlrnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.129527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae4a552-fbf9-469e-8bfc-450c0cd412c0" (UID: "4ae4a552-fbf9-469e-8bfc-450c0cd412c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.145418 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.145453 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlrnz\" (UniqueName: \"kubernetes.io/projected/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-kube-api-access-hlrnz\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.145468 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae4a552-fbf9-469e-8bfc-450c0cd412c0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.182548 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.182803 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.355595 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf"] Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.356050 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="extract-utilities" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.356068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="extract-utilities" Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.356092 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="registry-server" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.356097 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="registry-server" Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.356108 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="extract-content" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.356113 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="extract-content" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.356225 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerName="registry-server" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.356641 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.357938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.358477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-s6jl7" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.365701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf"] Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.448390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-webhook-cert\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.448470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2ksz\" (UniqueName: \"kubernetes.io/projected/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-kube-api-access-p2ksz\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.448703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-apiservice-cert\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.550147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2ksz\" (UniqueName: \"kubernetes.io/projected/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-kube-api-access-p2ksz\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.550281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-apiservice-cert\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.550357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-webhook-cert\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.554458 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-webhook-cert\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.554459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-apiservice-cert\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.563125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2ksz\" (UniqueName: \"kubernetes.io/projected/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-kube-api-access-p2ksz\") pod \"horizon-operator-controller-manager-56c8cbcbc7-vjtnf\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.613047 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerID="f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209" exitCode=0 Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.613119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerDied","Data":"f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209"} Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.615390 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" containerID="a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e" exitCode=0 Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.615425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerDied","Data":"a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e"} Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.615446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwpmq" event={"ID":"4ae4a552-fbf9-469e-8bfc-450c0cd412c0","Type":"ContainerDied","Data":"5866eca037e5cb9b2e3de7be3c3a449e7ba844d8d1d37c30827fb47c5c6f1779"} Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.615465 4707 scope.go:117] "RemoveContainer" containerID="a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.615560 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jwpmq" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.632263 4707 scope.go:117] "RemoveContainer" containerID="79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.636125 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwpmq"] Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.641621 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jwpmq"] Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.657476 4707 scope.go:117] "RemoveContainer" containerID="fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.671737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.672177 4707 scope.go:117] "RemoveContainer" containerID="a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e" Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.672498 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e\": container with ID starting with a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e not found: ID does not exist" containerID="a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.672528 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e"} err="failed to get container status \"a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e\": rpc error: code = NotFound desc = could not find container \"a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e\": container with ID starting with a81120f647bc6804e96a7b4dd57d099c0b2b4999450498331511090691f5321e not found: ID does not exist" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.672545 4707 scope.go:117] "RemoveContainer" containerID="79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9" Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.672796 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9\": container with ID starting with 79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9 not found: ID does not exist" containerID="79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.672837 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9"} err="failed to get container status \"79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9\": rpc error: code = NotFound desc = could not find container \"79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9\": container with ID starting with 79ef5e2c9beefd7e6ff3d3f53fa70cd6a5361b73622894310f2cf3c6a7402ac9 not found: ID does not exist" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.672859 4707 scope.go:117] 
"RemoveContainer" containerID="fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2" Jan 21 17:06:30 crc kubenswrapper[4707]: E0121 17:06:30.673185 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2\": container with ID starting with fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2 not found: ID does not exist" containerID="fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2" Jan 21 17:06:30 crc kubenswrapper[4707]: I0121 17:06:30.673224 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2"} err="failed to get container status \"fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2\": rpc error: code = NotFound desc = could not find container \"fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2\": container with ID starting with fe6523fa4cdef834ff405e9eafe527f6765746fae98fb5a71c2d5c152a378ce2 not found: ID does not exist" Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.014801 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf"] Jan 21 17:06:31 crc kubenswrapper[4707]: W0121 17:06:31.018276 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc30e4fe_f8b3_42cc_a6bb_c8b5c98b7a39.slice/crio-92316e6a66835d11965fb56660a630c35e7f401b2734a1b76155d4924642fa5a WatchSource:0}: Error finding container 92316e6a66835d11965fb56660a630c35e7f401b2734a1b76155d4924642fa5a: Status 404 returned error can't find the container with id 92316e6a66835d11965fb56660a630c35e7f401b2734a1b76155d4924642fa5a Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.188728 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae4a552-fbf9-469e-8bfc-450c0cd412c0" path="/var/lib/kubelet/pods/4ae4a552-fbf9-469e-8bfc-450c0cd412c0/volumes" Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.622863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerStarted","Data":"301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294"} Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.625797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" event={"ID":"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39","Type":"ContainerStarted","Data":"80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea"} Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.625849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" event={"ID":"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39","Type":"ContainerStarted","Data":"92316e6a66835d11965fb56660a630c35e7f401b2734a1b76155d4924642fa5a"} Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.625953 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.636279 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-vvw8f" podStartSLOduration=3.173519417 podStartE2EDuration="4.636260579s" podCreationTimestamp="2026-01-21 17:06:27 +0000 UTC" firstStartedPulling="2026-01-21 17:06:29.607608642 +0000 UTC m=+7486.789124864" lastFinishedPulling="2026-01-21 17:06:31.070349804 +0000 UTC m=+7488.251866026" observedRunningTime="2026-01-21 17:06:31.635421232 +0000 UTC m=+7488.816937454" watchObservedRunningTime="2026-01-21 17:06:31.636260579 +0000 UTC m=+7488.817776802" Jan 21 17:06:31 crc kubenswrapper[4707]: I0121 17:06:31.648035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" podStartSLOduration=1.648020311 podStartE2EDuration="1.648020311s" podCreationTimestamp="2026-01-21 17:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:06:31.647520642 +0000 UTC m=+7488.829036863" watchObservedRunningTime="2026-01-21 17:06:31.648020311 +0000 UTC m=+7488.829536534" Jan 21 17:06:34 crc kubenswrapper[4707]: I0121 17:06:34.053417 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:06:37 crc kubenswrapper[4707]: I0121 17:06:37.968476 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:37 crc kubenswrapper[4707]: I0121 17:06:37.968729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:37 crc kubenswrapper[4707]: I0121 17:06:37.998387 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:38 crc kubenswrapper[4707]: I0121 17:06:38.689411 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:39 crc kubenswrapper[4707]: I0121 17:06:39.846733 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvw8f"] Jan 21 17:06:40 crc kubenswrapper[4707]: I0121 17:06:40.674046 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vvw8f" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="registry-server" containerID="cri-o://301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294" gracePeriod=2 Jan 21 17:06:40 crc kubenswrapper[4707]: I0121 17:06:40.674908 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.054288 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.193996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-utilities\") pod \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.194272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-catalog-content\") pod \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.194313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vsg6\" (UniqueName: \"kubernetes.io/projected/5a71af8c-b3cc-4471-91c1-1df373bb75ec-kube-api-access-8vsg6\") pod \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\" (UID: \"5a71af8c-b3cc-4471-91c1-1df373bb75ec\") " Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.194768 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-utilities" (OuterVolumeSpecName: "utilities") pod "5a71af8c-b3cc-4471-91c1-1df373bb75ec" (UID: "5a71af8c-b3cc-4471-91c1-1df373bb75ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.194991 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.199077 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a71af8c-b3cc-4471-91c1-1df373bb75ec-kube-api-access-8vsg6" (OuterVolumeSpecName: "kube-api-access-8vsg6") pod "5a71af8c-b3cc-4471-91c1-1df373bb75ec" (UID: "5a71af8c-b3cc-4471-91c1-1df373bb75ec"). InnerVolumeSpecName "kube-api-access-8vsg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.212009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a71af8c-b3cc-4471-91c1-1df373bb75ec" (UID: "5a71af8c-b3cc-4471-91c1-1df373bb75ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.296210 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a71af8c-b3cc-4471-91c1-1df373bb75ec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.296243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vsg6\" (UniqueName: \"kubernetes.io/projected/5a71af8c-b3cc-4471-91c1-1df373bb75ec-kube-api-access-8vsg6\") on node \"crc\" DevicePath \"\"" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.681974 4707 generic.go:334] "Generic (PLEG): container finished" podID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerID="301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294" exitCode=0 Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.682020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerDied","Data":"301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294"} Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.682046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvw8f" event={"ID":"5a71af8c-b3cc-4471-91c1-1df373bb75ec","Type":"ContainerDied","Data":"1b3999016d246827854693c8631eb3cfc358e6ae0f1368368e03edcbe2226c28"} Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.682062 4707 scope.go:117] "RemoveContainer" containerID="301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.682082 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvw8f" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.697671 4707 scope.go:117] "RemoveContainer" containerID="f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.707767 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvw8f"] Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.713352 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvw8f"] Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.732208 4707 scope.go:117] "RemoveContainer" containerID="7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.748920 4707 scope.go:117] "RemoveContainer" containerID="301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294" Jan 21 17:06:41 crc kubenswrapper[4707]: E0121 17:06:41.749344 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294\": container with ID starting with 301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294 not found: ID does not exist" containerID="301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.749384 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294"} err="failed to get container status \"301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294\": rpc error: code = NotFound desc = could not find container \"301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294\": container with ID starting with 301fb12e35d283c4d42e645d3f037d62ac860c1f02b72973f105cd5327bc3294 not found: ID does not exist" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.749406 4707 scope.go:117] "RemoveContainer" containerID="f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209" Jan 21 17:06:41 crc kubenswrapper[4707]: E0121 17:06:41.749702 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209\": container with ID starting with f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209 not found: ID does not exist" containerID="f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.749742 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209"} err="failed to get container status \"f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209\": rpc error: code = NotFound desc = could not find container \"f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209\": container with ID starting with f06aabea39858f7c749bb522c4f7d2286260558ce2e07fe3550292709e887209 not found: ID does not exist" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.749768 4707 scope.go:117] "RemoveContainer" containerID="7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35" Jan 21 17:06:41 crc kubenswrapper[4707]: E0121 17:06:41.750058 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35\": container with ID starting with 7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35 not found: ID does not exist" containerID="7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35" Jan 21 17:06:41 crc kubenswrapper[4707]: I0121 17:06:41.750085 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35"} err="failed to get container status \"7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35\": rpc error: code = NotFound desc = could not find container \"7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35\": container with ID starting with 7fc5aa34b6e6117ec8dfc786fbba3733539781cd76d5a1291c3e614aed4d0e35 not found: ID does not exist" Jan 21 17:06:43 crc kubenswrapper[4707]: I0121 17:06:43.195877 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:06:43 crc kubenswrapper[4707]: E0121 17:06:43.196181 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:06:43 crc kubenswrapper[4707]: I0121 17:06:43.199406 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" path="/var/lib/kubelet/pods/5a71af8c-b3cc-4471-91c1-1df373bb75ec/volumes" Jan 21 17:06:55 crc kubenswrapper[4707]: I0121 17:06:55.182499 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:06:55 crc kubenswrapper[4707]: E0121 17:06:55.183065 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.597058 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-fvrh2"] Jan 21 17:07:03 crc kubenswrapper[4707]: E0121 17:07:03.597798 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="extract-utilities" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.597832 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="extract-utilities" Jan 21 17:07:03 crc kubenswrapper[4707]: E0121 17:07:03.597849 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="extract-content" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.597855 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="extract-content" Jan 21 17:07:03 crc kubenswrapper[4707]: E0121 
17:07:03.597870 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="registry-server" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.597876 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="registry-server" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.598031 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a71af8c-b3cc-4471-91c1-1df373bb75ec" containerName="registry-server" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.598718 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.602604 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon-horizon-dockercfg-xjbwr" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.602608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.603104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-fvrh2"] Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.603412 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-config-data" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.608362 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-scripts" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.658670 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nqkdh"] Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.659725 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.666554 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nqkdh"] Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f893fa-027b-4703-9ada-6a7ca380c14d-logs\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtbtq\" (UniqueName: \"kubernetes.io/projected/f600fe05-ec42-46e8-862c-9af232aa3cfd-kube-api-access-rtbtq\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54f893fa-027b-4703-9ada-6a7ca380c14d-horizon-secret-key\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-config-data\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f600fe05-ec42-46e8-862c-9af232aa3cfd-horizon-secret-key\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmwk\" (UniqueName: \"kubernetes.io/projected/54f893fa-027b-4703-9ada-6a7ca380c14d-kube-api-access-zpmwk\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-config-data\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f600fe05-ec42-46e8-862c-9af232aa3cfd-logs\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 
17:07:03.676392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-scripts\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.676435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54f893fa-027b-4703-9ada-6a7ca380c14d-horizon-secret-key\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-config-data\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f600fe05-ec42-46e8-862c-9af232aa3cfd-horizon-secret-key\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmwk\" (UniqueName: \"kubernetes.io/projected/54f893fa-027b-4703-9ada-6a7ca380c14d-kube-api-access-zpmwk\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-config-data\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f600fe05-ec42-46e8-862c-9af232aa3cfd-logs\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-scripts\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778871 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f893fa-027b-4703-9ada-6a7ca380c14d-logs\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.778952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtbtq\" (UniqueName: \"kubernetes.io/projected/f600fe05-ec42-46e8-862c-9af232aa3cfd-kube-api-access-rtbtq\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.779775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-scripts\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.779795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f600fe05-ec42-46e8-862c-9af232aa3cfd-logs\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.779865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.780062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f893fa-027b-4703-9ada-6a7ca380c14d-logs\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.780366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-config-data\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.780699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-config-data\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.784289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54f893fa-027b-4703-9ada-6a7ca380c14d-horizon-secret-key\") pod \"horizon-9b986b9c-fvrh2\" (UID: 
\"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.787393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f600fe05-ec42-46e8-862c-9af232aa3cfd-horizon-secret-key\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.792320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtbtq\" (UniqueName: \"kubernetes.io/projected/f600fe05-ec42-46e8-862c-9af232aa3cfd-kube-api-access-rtbtq\") pod \"horizon-598f976c49-nqkdh\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.793096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmwk\" (UniqueName: \"kubernetes.io/projected/54f893fa-027b-4703-9ada-6a7ca380c14d-kube-api-access-zpmwk\") pod \"horizon-9b986b9c-fvrh2\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.916380 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:03 crc kubenswrapper[4707]: I0121 17:07:03.972468 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:04 crc kubenswrapper[4707]: I0121 17:07:04.301825 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-fvrh2"] Jan 21 17:07:04 crc kubenswrapper[4707]: I0121 17:07:04.400469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nqkdh"] Jan 21 17:07:04 crc kubenswrapper[4707]: W0121 17:07:04.400722 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf600fe05_ec42_46e8_862c_9af232aa3cfd.slice/crio-8faf77b4342c571606d18b28d658a59058a8afe56648469f5e0485b52cd70e9a WatchSource:0}: Error finding container 8faf77b4342c571606d18b28d658a59058a8afe56648469f5e0485b52cd70e9a: Status 404 returned error can't find the container with id 8faf77b4342c571606d18b28d658a59058a8afe56648469f5e0485b52cd70e9a Jan 21 17:07:04 crc kubenswrapper[4707]: I0121 17:07:04.826899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" event={"ID":"54f893fa-027b-4703-9ada-6a7ca380c14d","Type":"ContainerStarted","Data":"cf64f10cee09f4a30c56f1eb5cfc45e1876f941b9f39f95297853eb77ee4edbd"} Jan 21 17:07:04 crc kubenswrapper[4707]: I0121 17:07:04.827844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" event={"ID":"f600fe05-ec42-46e8-862c-9af232aa3cfd","Type":"ContainerStarted","Data":"8faf77b4342c571606d18b28d658a59058a8afe56648469f5e0485b52cd70e9a"} Jan 21 17:07:06 crc kubenswrapper[4707]: I0121 17:07:06.182497 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:07:06 crc kubenswrapper[4707]: E0121 17:07:06.183009 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:07:10 crc kubenswrapper[4707]: I0121 17:07:10.871935 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" event={"ID":"54f893fa-027b-4703-9ada-6a7ca380c14d","Type":"ContainerStarted","Data":"4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56"} Jan 21 17:07:10 crc kubenswrapper[4707]: I0121 17:07:10.872385 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" event={"ID":"54f893fa-027b-4703-9ada-6a7ca380c14d","Type":"ContainerStarted","Data":"75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5"} Jan 21 17:07:10 crc kubenswrapper[4707]: I0121 17:07:10.875638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" event={"ID":"f600fe05-ec42-46e8-862c-9af232aa3cfd","Type":"ContainerStarted","Data":"d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225"} Jan 21 17:07:10 crc kubenswrapper[4707]: I0121 17:07:10.875695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" event={"ID":"f600fe05-ec42-46e8-862c-9af232aa3cfd","Type":"ContainerStarted","Data":"d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a"} Jan 21 17:07:10 crc kubenswrapper[4707]: I0121 17:07:10.888577 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podStartSLOduration=2.237197931 podStartE2EDuration="7.888563555s" podCreationTimestamp="2026-01-21 17:07:03 +0000 UTC" firstStartedPulling="2026-01-21 17:07:04.305908553 +0000 UTC m=+7521.487424775" lastFinishedPulling="2026-01-21 17:07:09.957274177 +0000 UTC m=+7527.138790399" observedRunningTime="2026-01-21 17:07:10.887404906 +0000 UTC m=+7528.068921128" watchObservedRunningTime="2026-01-21 17:07:10.888563555 +0000 UTC m=+7528.070079776" Jan 21 17:07:10 crc kubenswrapper[4707]: I0121 17:07:10.903846 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podStartSLOduration=2.348981044 podStartE2EDuration="7.903835971s" podCreationTimestamp="2026-01-21 17:07:03 +0000 UTC" firstStartedPulling="2026-01-21 17:07:04.403941352 +0000 UTC m=+7521.585457574" lastFinishedPulling="2026-01-21 17:07:09.958796279 +0000 UTC m=+7527.140312501" observedRunningTime="2026-01-21 17:07:10.902194273 +0000 UTC m=+7528.083710496" watchObservedRunningTime="2026-01-21 17:07:10.903835971 +0000 UTC m=+7528.085352193" Jan 21 17:07:13 crc kubenswrapper[4707]: I0121 17:07:13.916624 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:13 crc kubenswrapper[4707]: I0121 17:07:13.916968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:13 crc kubenswrapper[4707]: I0121 17:07:13.972930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:13 crc kubenswrapper[4707]: I0121 17:07:13.972989 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:18 crc kubenswrapper[4707]: I0121 17:07:18.182624 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:07:18 crc kubenswrapper[4707]: E0121 17:07:18.183129 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:07:23 crc kubenswrapper[4707]: I0121 17:07:23.918353 4707 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.44:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.44:8080: connect: connection refused" Jan 21 17:07:23 crc kubenswrapper[4707]: I0121 17:07:23.974978 4707 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.49:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.49:8080: connect: connection refused" Jan 21 17:07:28 crc kubenswrapper[4707]: I0121 17:07:28.091349 4707 scope.go:117] "RemoveContainer" containerID="3c3feedbaa9e1211b5b5de12c211c2667207773a016e642c10bf7970a22ab486" Jan 21 17:07:28 crc kubenswrapper[4707]: I0121 17:07:28.111110 4707 scope.go:117] "RemoveContainer" containerID="0d619d30cc49613dbe8c030e0fd72c54f16a8b4127b1ea8654e6372efe9c1048" Jan 21 17:07:31 crc kubenswrapper[4707]: I0121 17:07:31.182223 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:07:31 crc kubenswrapper[4707]: E0121 17:07:31.182621 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:07:35 crc kubenswrapper[4707]: I0121 17:07:35.528207 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:35 crc kubenswrapper[4707]: I0121 17:07:35.530326 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:36 crc kubenswrapper[4707]: I0121 17:07:36.918603 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:07:36 crc kubenswrapper[4707]: I0121 17:07:36.959415 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-fvrh2"] Jan 21 17:07:36 crc kubenswrapper[4707]: I0121 17:07:36.959621 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon-log" 
containerID="cri-o://75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5" gracePeriod=30 Jan 21 17:07:36 crc kubenswrapper[4707]: I0121 17:07:36.959712 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" containerID="cri-o://4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56" gracePeriod=30 Jan 21 17:07:36 crc kubenswrapper[4707]: I0121 17:07:36.964235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:40 crc kubenswrapper[4707]: I0121 17:07:40.075717 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.44:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:50274->10.217.0.44:8080: read: connection reset by peer" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.063045 4707 generic.go:334] "Generic (PLEG): container finished" podID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerID="4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56" exitCode=0 Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.063254 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" event={"ID":"54f893fa-027b-4703-9ada-6a7ca380c14d","Type":"ContainerDied","Data":"4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56"} Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.329352 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7"] Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.330558 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.333718 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-policy" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.344133 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7"] Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.364520 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nqkdh"] Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.364745 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon-log" containerID="cri-o://d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a" gracePeriod=30 Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.364851 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" containerID="cri-o://d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225" gracePeriod=30 Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.371927 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7"] Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.372402 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data horizon-secret-key kube-api-access-klfpg logs policy scripts], unattached volumes=[], failed to process volumes=[config-data horizon-secret-key kube-api-access-klfpg logs policy scripts]: context canceled" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" podUID="f2f4cb58-cdd3-4364-863a-cbe5ce0c8962" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.420954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klfpg\" (UniqueName: \"kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.420997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.421019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.421125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " 
pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.421156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-policy\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.421253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-logs\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.522738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klfpg\" (UniqueName: \"kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.522775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.522797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.522841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.522859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-policy\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.522915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-logs\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.522935 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.522964 4707 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 
17:07:41.522994 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:42.022980296 +0000 UTC m=+7559.204496528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : configmap "horizon-config-data" not found Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.523013 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:42.023005684 +0000 UTC m=+7559.204521907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : secret "horizon" not found Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.523028 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.523100 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:42.023083201 +0000 UTC m=+7559.204599433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : configmap "horizon-scripts" not found Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.523411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-logs\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: I0121 17:07:41.523626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-policy\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.524977 4707 projected.go:194] Error preparing data for projected volume kube-api-access-klfpg for pod horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7: failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 17:07:41 crc kubenswrapper[4707]: E0121 17:07:41.525030 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:42.025017187 +0000 UTC m=+7559.206533409 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-klfpg" (UniqueName: "kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.030198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klfpg\" (UniqueName: \"kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.030240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.030262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.030291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.030356 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.030419 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:43.030402618 +0000 UTC m=+7560.211918840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : configmap "horizon-config-data" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.030421 4707 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.030445 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.030481 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:43.030465968 +0000 UTC m=+7560.211982190 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : secret "horizon" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.030498 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:43.030490704 +0000 UTC m=+7560.212006927 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : configmap "horizon-scripts" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.032449 4707 projected.go:194] Error preparing data for projected volume kube-api-access-klfpg for pod horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7: failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.032487 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:43.032479133 +0000 UTC m=+7560.213995355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-klfpg" (UniqueName: "kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.068940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.076122 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.131730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-policy\") pod \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.131798 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-logs\") pod \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.132042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-logs" (OuterVolumeSpecName: "logs") pod "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.132082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-policy" (OuterVolumeSpecName: "policy") pod "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962"). InnerVolumeSpecName "policy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.132320 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-logs\") on node \"crc\" DevicePath \"\"" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.132344 4707 reconciler_common.go:293] "Volume detached for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-policy\") on node \"crc\" DevicePath \"\"" Jan 21 17:07:42 crc kubenswrapper[4707]: I0121 17:07:42.182518 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:07:42 crc kubenswrapper[4707]: E0121 17:07:42.182746 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.044790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klfpg\" (UniqueName: \"kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.045043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.045074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.045088 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.045110 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts\") pod \"horizon-6f6c5c44d8-bxvg7\" (UID: \"f2f4cb58-cdd3-4364-863a-cbe5ce0c8962\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.045138 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:45.045117619 +0000 UTC m=+7562.226633840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : configmap "horizon-config-data" not found Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.045214 4707 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.045247 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:45.045238195 +0000 UTC m=+7562.226754417 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : secret "horizon" not found Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.045353 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.045386 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:45.045376726 +0000 UTC m=+7562.226892948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : configmap "horizon-scripts" not found Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.047693 4707 projected.go:194] Error preparing data for projected volume kube-api-access-klfpg for pod horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7: failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 17:07:43 crc kubenswrapper[4707]: E0121 17:07:43.047741 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg podName:f2f4cb58-cdd3-4364-863a-cbe5ce0c8962 nodeName:}" failed. No retries permitted until 2026-01-21 17:07:45.047732055 +0000 UTC m=+7562.229248276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-klfpg" (UniqueName: "kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg") pod "horizon-6f6c5c44d8-bxvg7" (UID: "f2f4cb58-cdd3-4364-863a-cbe5ce0c8962") : failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.073841 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.102723 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7"] Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.107485 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-bxvg7"] Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.146478 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.146513 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klfpg\" (UniqueName: \"kubernetes.io/projected/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-kube-api-access-klfpg\") on node \"crc\" DevicePath \"\"" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.146527 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.146536 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.190917 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f4cb58-cdd3-4364-863a-cbe5ce0c8962" path="/var/lib/kubelet/pods/f2f4cb58-cdd3-4364-863a-cbe5ce0c8962/volumes" Jan 21 17:07:43 crc kubenswrapper[4707]: I0121 17:07:43.917470 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.44:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.44:8080: connect: connection refused" Jan 21 17:07:44 crc kubenswrapper[4707]: I0121 17:07:44.430531 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.49:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:49098->10.217.0.49:8080: read: connection reset by peer" Jan 21 17:07:45 crc kubenswrapper[4707]: I0121 17:07:45.086742 4707 generic.go:334] "Generic (PLEG): container finished" podID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerID="d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225" exitCode=0 Jan 21 17:07:45 crc kubenswrapper[4707]: I0121 17:07:45.086840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" event={"ID":"f600fe05-ec42-46e8-862c-9af232aa3cfd","Type":"ContainerDied","Data":"d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225"} Jan 21 17:07:53 crc kubenswrapper[4707]: I0121 17:07:53.917296 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.44:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.44:8080: connect: connection refused" Jan 21 17:07:53 crc 
kubenswrapper[4707]: I0121 17:07:53.918339 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:07:53 crc kubenswrapper[4707]: I0121 17:07:53.973016 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.49:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.49:8080: connect: connection refused" Jan 21 17:07:57 crc kubenswrapper[4707]: I0121 17:07:57.182397 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:07:57 crc kubenswrapper[4707]: E0121 17:07:57.182789 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:08:03 crc kubenswrapper[4707]: I0121 17:08:03.916993 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.44:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.44:8080: connect: connection refused" Jan 21 17:08:03 crc kubenswrapper[4707]: I0121 17:08:03.973722 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.49:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.49:8080: connect: connection refused" Jan 21 17:08:03 crc kubenswrapper[4707]: I0121 17:08:03.973832 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.206277 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.210092 4707 generic.go:334] "Generic (PLEG): container finished" podID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerID="75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5" exitCode=137 Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.210122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" event={"ID":"54f893fa-027b-4703-9ada-6a7ca380c14d","Type":"ContainerDied","Data":"75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5"} Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.210140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" event={"ID":"54f893fa-027b-4703-9ada-6a7ca380c14d","Type":"ContainerDied","Data":"cf64f10cee09f4a30c56f1eb5cfc45e1876f941b9f39f95297853eb77ee4edbd"} Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.210155 4707 scope.go:117] "RemoveContainer" containerID="4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.210518 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-fvrh2" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.250982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f893fa-027b-4703-9ada-6a7ca380c14d-logs\") pod \"54f893fa-027b-4703-9ada-6a7ca380c14d\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.251039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-scripts\") pod \"54f893fa-027b-4703-9ada-6a7ca380c14d\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.251087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-config-data\") pod \"54f893fa-027b-4703-9ada-6a7ca380c14d\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.251150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54f893fa-027b-4703-9ada-6a7ca380c14d-horizon-secret-key\") pod \"54f893fa-027b-4703-9ada-6a7ca380c14d\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.251186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpmwk\" (UniqueName: \"kubernetes.io/projected/54f893fa-027b-4703-9ada-6a7ca380c14d-kube-api-access-zpmwk\") pod \"54f893fa-027b-4703-9ada-6a7ca380c14d\" (UID: \"54f893fa-027b-4703-9ada-6a7ca380c14d\") " Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.251951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f893fa-027b-4703-9ada-6a7ca380c14d-logs" (OuterVolumeSpecName: "logs") pod "54f893fa-027b-4703-9ada-6a7ca380c14d" (UID: "54f893fa-027b-4703-9ada-6a7ca380c14d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.255508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54f893fa-027b-4703-9ada-6a7ca380c14d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "54f893fa-027b-4703-9ada-6a7ca380c14d" (UID: "54f893fa-027b-4703-9ada-6a7ca380c14d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.255695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f893fa-027b-4703-9ada-6a7ca380c14d-kube-api-access-zpmwk" (OuterVolumeSpecName: "kube-api-access-zpmwk") pod "54f893fa-027b-4703-9ada-6a7ca380c14d" (UID: "54f893fa-027b-4703-9ada-6a7ca380c14d"). InnerVolumeSpecName "kube-api-access-zpmwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.264358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-config-data" (OuterVolumeSpecName: "config-data") pod "54f893fa-027b-4703-9ada-6a7ca380c14d" (UID: "54f893fa-027b-4703-9ada-6a7ca380c14d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.264923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-scripts" (OuterVolumeSpecName: "scripts") pod "54f893fa-027b-4703-9ada-6a7ca380c14d" (UID: "54f893fa-027b-4703-9ada-6a7ca380c14d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.353458 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.353482 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54f893fa-027b-4703-9ada-6a7ca380c14d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.353494 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54f893fa-027b-4703-9ada-6a7ca380c14d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.353506 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpmwk\" (UniqueName: \"kubernetes.io/projected/54f893fa-027b-4703-9ada-6a7ca380c14d-kube-api-access-zpmwk\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.353515 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f893fa-027b-4703-9ada-6a7ca380c14d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.353578 4707 scope.go:117] "RemoveContainer" containerID="75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.366332 4707 scope.go:117] "RemoveContainer" containerID="4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56" Jan 21 17:08:07 crc kubenswrapper[4707]: E0121 
17:08:07.366617 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56\": container with ID starting with 4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56 not found: ID does not exist" containerID="4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.366655 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56"} err="failed to get container status \"4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56\": rpc error: code = NotFound desc = could not find container \"4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56\": container with ID starting with 4b6266e238909ea2b5acb85a0c977815ff148a6b52e8a55b1478ce222d206e56 not found: ID does not exist" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.366676 4707 scope.go:117] "RemoveContainer" containerID="75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5" Jan 21 17:08:07 crc kubenswrapper[4707]: E0121 17:08:07.366950 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5\": container with ID starting with 75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5 not found: ID does not exist" containerID="75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.366972 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5"} err="failed to get container status \"75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5\": rpc error: code = NotFound desc = could not find container \"75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5\": container with ID starting with 75dc94c19a7c6a41e4f46cb3a84d011754fb5305580d298a3343f218328d0ba5 not found: ID does not exist" Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.537658 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-fvrh2"] Jan 21 17:08:07 crc kubenswrapper[4707]: I0121 17:08:07.543501 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-fvrh2"] Jan 21 17:08:08 crc kubenswrapper[4707]: I0121 17:08:08.182615 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:08:08 crc kubenswrapper[4707]: E0121 17:08:08.182981 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:08:09 crc kubenswrapper[4707]: I0121 17:08:09.192562 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" path="/var/lib/kubelet/pods/54f893fa-027b-4703-9ada-6a7ca380c14d/volumes" Jan 21 17:08:10 crc kubenswrapper[4707]: E0121 
17:08:10.066906 4707 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/476646c474cddcf752affe1d64572cba05d8caa9f0a46ca6b5ffcd53c8264c92/diff" to get inode usage: stat /var/lib/containers/storage/overlay/476646c474cddcf752affe1d64572cba05d8caa9f0a46ca6b5ffcd53c8264c92/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/horizon-kuttl-tests_horizon-9b986b9c-fvrh2_54f893fa-027b-4703-9ada-6a7ca380c14d/horizon-log/0.log" to get inode usage: stat /var/log/pods/horizon-kuttl-tests_horizon-9b986b9c-fvrh2_54f893fa-027b-4703-9ada-6a7ca380c14d/horizon-log/0.log: no such file or directory Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.650754 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.703580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts\") pod \"f600fe05-ec42-46e8-862c-9af232aa3cfd\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.703645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f600fe05-ec42-46e8-862c-9af232aa3cfd-logs\") pod \"f600fe05-ec42-46e8-862c-9af232aa3cfd\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.703699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-config-data\") pod \"f600fe05-ec42-46e8-862c-9af232aa3cfd\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.703719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtbtq\" (UniqueName: \"kubernetes.io/projected/f600fe05-ec42-46e8-862c-9af232aa3cfd-kube-api-access-rtbtq\") pod \"f600fe05-ec42-46e8-862c-9af232aa3cfd\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.703768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f600fe05-ec42-46e8-862c-9af232aa3cfd-horizon-secret-key\") pod \"f600fe05-ec42-46e8-862c-9af232aa3cfd\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.704090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f600fe05-ec42-46e8-862c-9af232aa3cfd-logs" (OuterVolumeSpecName: "logs") pod "f600fe05-ec42-46e8-862c-9af232aa3cfd" (UID: "f600fe05-ec42-46e8-862c-9af232aa3cfd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.704508 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f600fe05-ec42-46e8-862c-9af232aa3cfd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.708266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f600fe05-ec42-46e8-862c-9af232aa3cfd-kube-api-access-rtbtq" (OuterVolumeSpecName: "kube-api-access-rtbtq") pod "f600fe05-ec42-46e8-862c-9af232aa3cfd" (UID: "f600fe05-ec42-46e8-862c-9af232aa3cfd"). InnerVolumeSpecName "kube-api-access-rtbtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.708395 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f600fe05-ec42-46e8-862c-9af232aa3cfd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f600fe05-ec42-46e8-862c-9af232aa3cfd" (UID: "f600fe05-ec42-46e8-862c-9af232aa3cfd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:08:11 crc kubenswrapper[4707]: E0121 17:08:11.719237 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts podName:f600fe05-ec42-46e8-862c-9af232aa3cfd nodeName:}" failed. No retries permitted until 2026-01-21 17:08:12.219218196 +0000 UTC m=+7589.400734418 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "scripts" (UniqueName: "kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts") pod "f600fe05-ec42-46e8-862c-9af232aa3cfd" (UID: "f600fe05-ec42-46e8-862c-9af232aa3cfd") : error deleting /var/lib/kubelet/pods/f600fe05-ec42-46e8-862c-9af232aa3cfd/volume-subpaths: remove /var/lib/kubelet/pods/f600fe05-ec42-46e8-862c-9af232aa3cfd/volume-subpaths: no such file or directory Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.719725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-config-data" (OuterVolumeSpecName: "config-data") pod "f600fe05-ec42-46e8-862c-9af232aa3cfd" (UID: "f600fe05-ec42-46e8-862c-9af232aa3cfd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.805901 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f600fe05-ec42-46e8-862c-9af232aa3cfd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.806099 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:11 crc kubenswrapper[4707]: I0121 17:08:11.806112 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtbtq\" (UniqueName: \"kubernetes.io/projected/f600fe05-ec42-46e8-862c-9af232aa3cfd-kube-api-access-rtbtq\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.239606 4707 generic.go:334] "Generic (PLEG): container finished" podID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerID="d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a" exitCode=137 Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.239650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" event={"ID":"f600fe05-ec42-46e8-862c-9af232aa3cfd","Type":"ContainerDied","Data":"d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a"} Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.239676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" event={"ID":"f600fe05-ec42-46e8-862c-9af232aa3cfd","Type":"ContainerDied","Data":"8faf77b4342c571606d18b28d658a59058a8afe56648469f5e0485b52cd70e9a"} Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.239685 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nqkdh" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.239692 4707 scope.go:117] "RemoveContainer" containerID="d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.313047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts\") pod \"f600fe05-ec42-46e8-862c-9af232aa3cfd\" (UID: \"f600fe05-ec42-46e8-862c-9af232aa3cfd\") " Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.313435 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts" (OuterVolumeSpecName: "scripts") pod "f600fe05-ec42-46e8-862c-9af232aa3cfd" (UID: "f600fe05-ec42-46e8-862c-9af232aa3cfd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.362086 4707 scope.go:117] "RemoveContainer" containerID="d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.374801 4707 scope.go:117] "RemoveContainer" containerID="d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225" Jan 21 17:08:12 crc kubenswrapper[4707]: E0121 17:08:12.375104 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225\": container with ID starting with d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225 not found: ID does not exist" containerID="d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.375127 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225"} err="failed to get container status \"d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225\": rpc error: code = NotFound desc = could not find container \"d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225\": container with ID starting with d4e98537a553798c5df4c9eb7de518e3c60146e347272eefe5234e7fb2d50225 not found: ID does not exist" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.375148 4707 scope.go:117] "RemoveContainer" containerID="d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a" Jan 21 17:08:12 crc kubenswrapper[4707]: E0121 17:08:12.375394 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a\": container with ID starting with d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a not found: ID does not exist" containerID="d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.375419 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a"} err="failed to get container status \"d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a\": rpc error: code = NotFound desc = could not find container \"d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a\": container with ID starting with d4bfb6c90ce6ebc54e8f3ff78f8b7dae832f0ad96fcbc30c2e23af119294ea6a not found: ID does not exist" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.414410 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f600fe05-ec42-46e8-862c-9af232aa3cfd-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.566246 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nqkdh"] Jan 21 17:08:12 crc kubenswrapper[4707]: I0121 17:08:12.571115 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nqkdh"] Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117070 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-8fnr4"] Jan 21 17:08:13 crc kubenswrapper[4707]: E0121 17:08:13.117512 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon-log" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117524 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon-log" Jan 21 17:08:13 crc kubenswrapper[4707]: E0121 17:08:13.117541 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon-log" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117546 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon-log" Jan 21 17:08:13 crc kubenswrapper[4707]: E0121 17:08:13.117560 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117566 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: E0121 17:08:13.117576 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117581 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117732 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon-log" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117742 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon-log" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117755 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f893fa-027b-4703-9ada-6a7ca380c14d" containerName="horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.117765 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" containerName="horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.118446 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.120020 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"combined-ca-bundle" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.120171 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-scripts" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.120333 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-config-data" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.120422 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"cert-horizon-svc" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.120864 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon-horizon-dockercfg-42pv9" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.122427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.128994 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-8fnr4"] Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.174461 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-nvxdv"] Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.175552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.194519 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f600fe05-ec42-46e8-862c-9af232aa3cfd" path="/var/lib/kubelet/pods/f600fe05-ec42-46e8-862c-9af232aa3cfd/volumes" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.217943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-nvxdv"] Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54027f2-5423-4434-90dc-9cf9d9da63d7-logs\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-combined-ca-bundle\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-config-data\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5efd4865-6244-4593-937d-233009526404-logs\") pod 
\"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-combined-ca-bundle\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-tls-certs\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p69g\" (UniqueName: \"kubernetes.io/projected/c54027f2-5423-4434-90dc-9cf9d9da63d7-kube-api-access-4p69g\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-secret-key\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225737 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-secret-key\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-tls-certs\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6zq\" (UniqueName: \"kubernetes.io/projected/5efd4865-6244-4593-937d-233009526404-kube-api-access-fw6zq\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-scripts\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225873 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-scripts\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.225902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-config-data\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-scripts\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-config-data\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54027f2-5423-4434-90dc-9cf9d9da63d7-logs\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-combined-ca-bundle\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-config-data\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5efd4865-6244-4593-937d-233009526404-logs\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-combined-ca-bundle\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327880 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-tls-certs\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p69g\" (UniqueName: \"kubernetes.io/projected/c54027f2-5423-4434-90dc-9cf9d9da63d7-kube-api-access-4p69g\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-secret-key\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-secret-key\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.327995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-tls-certs\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.328025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6zq\" (UniqueName: \"kubernetes.io/projected/5efd4865-6244-4593-937d-233009526404-kube-api-access-fw6zq\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.328047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-scripts\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.328119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54027f2-5423-4434-90dc-9cf9d9da63d7-logs\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.328342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-scripts\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.328727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5efd4865-6244-4593-937d-233009526404-logs\") 
pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.329129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-scripts\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.329369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-config-data\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.329405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-config-data\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.331660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-combined-ca-bundle\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.331746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-tls-certs\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.331905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-secret-key\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.332054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-secret-key\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.332356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-tls-certs\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.333118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-combined-ca-bundle\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " 
pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.341880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6zq\" (UniqueName: \"kubernetes.io/projected/5efd4865-6244-4593-937d-233009526404-kube-api-access-fw6zq\") pod \"horizon-8cd586586-nvxdv\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.341976 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p69g\" (UniqueName: \"kubernetes.io/projected/c54027f2-5423-4434-90dc-9cf9d9da63d7-kube-api-access-4p69g\") pod \"horizon-754c467c7d-8fnr4\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.432460 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.491377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.788771 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-8fnr4"] Jan 21 17:08:13 crc kubenswrapper[4707]: W0121 17:08:13.789533 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54027f2_5423_4434_90dc_9cf9d9da63d7.slice/crio-95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21 WatchSource:0}: Error finding container 95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21: Status 404 returned error can't find the container with id 95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21 Jan 21 17:08:13 crc kubenswrapper[4707]: I0121 17:08:13.860222 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-nvxdv"] Jan 21 17:08:13 crc kubenswrapper[4707]: W0121 17:08:13.881534 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5efd4865_6244_4593_937d_233009526404.slice/crio-c6c54b13ce9e30c618113fc4fea0d7467d78529ee3c20e924785efa5c54f9d43 WatchSource:0}: Error finding container c6c54b13ce9e30c618113fc4fea0d7467d78529ee3c20e924785efa5c54f9d43: Status 404 returned error can't find the container with id c6c54b13ce9e30c618113fc4fea0d7467d78529ee3c20e924785efa5c54f9d43 Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.253872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" event={"ID":"c54027f2-5423-4434-90dc-9cf9d9da63d7","Type":"ContainerStarted","Data":"543806f5449ea9e333b70b3564f31622711a37b5ccf8f335ad42f37b09df9b3e"} Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.254062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" event={"ID":"c54027f2-5423-4434-90dc-9cf9d9da63d7","Type":"ContainerStarted","Data":"424eefd81a7017c172890c00a70bb727e9fa421122439fa9655c3b469c79838d"} Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.254074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" 
event={"ID":"c54027f2-5423-4434-90dc-9cf9d9da63d7","Type":"ContainerStarted","Data":"95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21"} Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.255447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" event={"ID":"5efd4865-6244-4593-937d-233009526404","Type":"ContainerStarted","Data":"359bc0f1140184e8b0a3682548d77ccec47e87d127c82c41d7a5517526ff036d"} Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.255473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" event={"ID":"5efd4865-6244-4593-937d-233009526404","Type":"ContainerStarted","Data":"0874b5fae466a24cc3cd2df5c97fa3722bda4b4bc4d1969f5c3cb1b8d1c6d366"} Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.255483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" event={"ID":"5efd4865-6244-4593-937d-233009526404","Type":"ContainerStarted","Data":"c6c54b13ce9e30c618113fc4fea0d7467d78529ee3c20e924785efa5c54f9d43"} Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.268390 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" podStartSLOduration=1.268374724 podStartE2EDuration="1.268374724s" podCreationTimestamp="2026-01-21 17:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:08:14.267369464 +0000 UTC m=+7591.448885685" watchObservedRunningTime="2026-01-21 17:08:14.268374724 +0000 UTC m=+7591.449890946" Jan 21 17:08:14 crc kubenswrapper[4707]: I0121 17:08:14.285105 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" podStartSLOduration=1.285092286 podStartE2EDuration="1.285092286s" podCreationTimestamp="2026-01-21 17:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:08:14.281884706 +0000 UTC m=+7591.463400928" watchObservedRunningTime="2026-01-21 17:08:14.285092286 +0000 UTC m=+7591.466608509" Jan 21 17:08:20 crc kubenswrapper[4707]: I0121 17:08:20.182716 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:08:20 crc kubenswrapper[4707]: E0121 17:08:20.183247 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:08:23 crc kubenswrapper[4707]: I0121 17:08:23.432762 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:23 crc kubenswrapper[4707]: I0121 17:08:23.433151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:23 crc kubenswrapper[4707]: I0121 17:08:23.492368 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:23 crc kubenswrapper[4707]: I0121 
17:08:23.492880 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:34 crc kubenswrapper[4707]: I0121 17:08:34.182510 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:08:34 crc kubenswrapper[4707]: E0121 17:08:34.183165 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:08:34 crc kubenswrapper[4707]: I0121 17:08:34.932150 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:34 crc kubenswrapper[4707]: I0121 17:08:34.986064 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:36 crc kubenswrapper[4707]: I0121 17:08:36.351063 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:08:36 crc kubenswrapper[4707]: I0121 17:08:36.404696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:08:36 crc kubenswrapper[4707]: I0121 17:08:36.442641 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-8fnr4"] Jan 21 17:08:36 crc kubenswrapper[4707]: I0121 17:08:36.442881 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" containerID="cri-o://543806f5449ea9e333b70b3564f31622711a37b5ccf8f335ad42f37b09df9b3e" gracePeriod=30 Jan 21 17:08:36 crc kubenswrapper[4707]: I0121 17:08:36.443648 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon-log" containerID="cri-o://424eefd81a7017c172890c00a70bb727e9fa421122439fa9655c3b469c79838d" gracePeriod=30 Jan 21 17:08:37 crc kubenswrapper[4707]: I0121 17:08:37.161551 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-nvxdv"] Jan 21 17:08:37 crc kubenswrapper[4707]: I0121 17:08:37.402839 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon-log" containerID="cri-o://0874b5fae466a24cc3cd2df5c97fa3722bda4b4bc4d1969f5c3cb1b8d1c6d366" gracePeriod=30 Jan 21 17:08:37 crc kubenswrapper[4707]: I0121 17:08:37.402886 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" containerID="cri-o://359bc0f1140184e8b0a3682548d77ccec47e87d127c82c41d7a5517526ff036d" gracePeriod=30 Jan 21 17:08:40 crc kubenswrapper[4707]: I0121 17:08:40.420567 4707 generic.go:334] "Generic (PLEG): container finished" podID="c54027f2-5423-4434-90dc-9cf9d9da63d7" 
containerID="543806f5449ea9e333b70b3564f31622711a37b5ccf8f335ad42f37b09df9b3e" exitCode=0 Jan 21 17:08:40 crc kubenswrapper[4707]: I0121 17:08:40.420653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" event={"ID":"c54027f2-5423-4434-90dc-9cf9d9da63d7","Type":"ContainerDied","Data":"543806f5449ea9e333b70b3564f31622711a37b5ccf8f335ad42f37b09df9b3e"} Jan 21 17:08:41 crc kubenswrapper[4707]: I0121 17:08:41.427959 4707 generic.go:334] "Generic (PLEG): container finished" podID="5efd4865-6244-4593-937d-233009526404" containerID="359bc0f1140184e8b0a3682548d77ccec47e87d127c82c41d7a5517526ff036d" exitCode=0 Jan 21 17:08:41 crc kubenswrapper[4707]: I0121 17:08:41.427998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" event={"ID":"5efd4865-6244-4593-937d-233009526404","Type":"ContainerDied","Data":"359bc0f1140184e8b0a3682548d77ccec47e87d127c82c41d7a5517526ff036d"} Jan 21 17:08:43 crc kubenswrapper[4707]: I0121 17:08:43.433864 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.55:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.55:8443: connect: connection refused" Jan 21 17:08:43 crc kubenswrapper[4707]: I0121 17:08:43.492409 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.56:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.56:8443: connect: connection refused" Jan 21 17:08:45 crc kubenswrapper[4707]: I0121 17:08:45.182470 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:08:45 crc kubenswrapper[4707]: E0121 17:08:45.183229 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:08:53 crc kubenswrapper[4707]: I0121 17:08:53.433256 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.55:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.55:8443: connect: connection refused" Jan 21 17:08:53 crc kubenswrapper[4707]: I0121 17:08:53.492450 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.56:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.56:8443: connect: connection refused" Jan 21 17:09:00 crc kubenswrapper[4707]: I0121 17:09:00.182179 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:09:00 crc kubenswrapper[4707]: E0121 17:09:00.182857 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:09:03 crc kubenswrapper[4707]: I0121 17:09:03.433748 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.55:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.55:8443: connect: connection refused" Jan 21 17:09:03 crc kubenswrapper[4707]: I0121 17:09:03.434093 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:09:03 crc kubenswrapper[4707]: I0121 17:09:03.492641 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.56:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.56:8443: connect: connection refused" Jan 21 17:09:03 crc kubenswrapper[4707]: I0121 17:09:03.492731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.569845 4707 generic.go:334] "Generic (PLEG): container finished" podID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerID="424eefd81a7017c172890c00a70bb727e9fa421122439fa9655c3b469c79838d" exitCode=137 Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.569914 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" event={"ID":"c54027f2-5423-4434-90dc-9cf9d9da63d7","Type":"ContainerDied","Data":"424eefd81a7017c172890c00a70bb727e9fa421122439fa9655c3b469c79838d"} Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.717758 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p69g\" (UniqueName: \"kubernetes.io/projected/c54027f2-5423-4434-90dc-9cf9d9da63d7-kube-api-access-4p69g\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908211 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-tls-certs\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54027f2-5423-4434-90dc-9cf9d9da63d7-logs\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-secret-key\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908369 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-scripts\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-config-data\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908424 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-combined-ca-bundle\") pod \"c54027f2-5423-4434-90dc-9cf9d9da63d7\" (UID: \"c54027f2-5423-4434-90dc-9cf9d9da63d7\") " Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54027f2-5423-4434-90dc-9cf9d9da63d7-logs" (OuterVolumeSpecName: "logs") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.908937 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54027f2-5423-4434-90dc-9cf9d9da63d7-logs\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.913033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54027f2-5423-4434-90dc-9cf9d9da63d7-kube-api-access-4p69g" (OuterVolumeSpecName: "kube-api-access-4p69g") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "kube-api-access-4p69g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.913635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.923383 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-scripts" (OuterVolumeSpecName: "scripts") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.923513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-config-data" (OuterVolumeSpecName: "config-data") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.925186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:06 crc kubenswrapper[4707]: I0121 17:09:06.935008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "c54027f2-5423-4434-90dc-9cf9d9da63d7" (UID: "c54027f2-5423-4434-90dc-9cf9d9da63d7"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.010082 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.010227 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.010305 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p69g\" (UniqueName: \"kubernetes.io/projected/c54027f2-5423-4434-90dc-9cf9d9da63d7-kube-api-access-4p69g\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.010364 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.010413 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c54027f2-5423-4434-90dc-9cf9d9da63d7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.010465 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c54027f2-5423-4434-90dc-9cf9d9da63d7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: E0121 17:09:07.562653 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54027f2_5423_4434_90dc_9cf9d9da63d7.slice/crio-95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21\": RecentStats: unable to find data in memory cache]" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.581371 4707 generic.go:334] "Generic (PLEG): container finished" podID="5efd4865-6244-4593-937d-233009526404" containerID="0874b5fae466a24cc3cd2df5c97fa3722bda4b4bc4d1969f5c3cb1b8d1c6d366" exitCode=137 Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.581440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" event={"ID":"5efd4865-6244-4593-937d-233009526404","Type":"ContainerDied","Data":"0874b5fae466a24cc3cd2df5c97fa3722bda4b4bc4d1969f5c3cb1b8d1c6d366"} Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.583866 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" event={"ID":"c54027f2-5423-4434-90dc-9cf9d9da63d7","Type":"ContainerDied","Data":"95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21"} Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.583908 4707 scope.go:117] "RemoveContainer" containerID="543806f5449ea9e333b70b3564f31622711a37b5ccf8f335ad42f37b09df9b3e" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.583916 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-8fnr4" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.602570 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-8fnr4"] Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.608115 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-8fnr4"] Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.722133 4707 scope.go:117] "RemoveContainer" containerID="424eefd81a7017c172890c00a70bb727e9fa421122439fa9655c3b469c79838d" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.772526 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-combined-ca-bundle\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921444 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5efd4865-6244-4593-937d-233009526404-logs\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921525 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-scripts\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-secret-key\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw6zq\" (UniqueName: \"kubernetes.io/projected/5efd4865-6244-4593-937d-233009526404-kube-api-access-fw6zq\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-config-data\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-tls-certs\") pod \"5efd4865-6244-4593-937d-233009526404\" (UID: \"5efd4865-6244-4593-937d-233009526404\") " Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.921891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5efd4865-6244-4593-937d-233009526404-logs" (OuterVolumeSpecName: "logs") pod "5efd4865-6244-4593-937d-233009526404" (UID: 
"5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.922103 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5efd4865-6244-4593-937d-233009526404-logs\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.925233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5efd4865-6244-4593-937d-233009526404" (UID: "5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.926506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efd4865-6244-4593-937d-233009526404-kube-api-access-fw6zq" (OuterVolumeSpecName: "kube-api-access-fw6zq") pod "5efd4865-6244-4593-937d-233009526404" (UID: "5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "kube-api-access-fw6zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.937139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5efd4865-6244-4593-937d-233009526404" (UID: "5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.938161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-scripts" (OuterVolumeSpecName: "scripts") pod "5efd4865-6244-4593-937d-233009526404" (UID: "5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.938290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-config-data" (OuterVolumeSpecName: "config-data") pod "5efd4865-6244-4593-937d-233009526404" (UID: "5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:07 crc kubenswrapper[4707]: I0121 17:09:07.948671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5efd4865-6244-4593-937d-233009526404" (UID: "5efd4865-6244-4593-937d-233009526404"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.023616 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw6zq\" (UniqueName: \"kubernetes.io/projected/5efd4865-6244-4593-937d-233009526404-kube-api-access-fw6zq\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.023644 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.023654 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.023665 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.023676 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5efd4865-6244-4593-937d-233009526404-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.023684 4707 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5efd4865-6244-4593-937d-233009526404-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.591172 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.591162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-nvxdv" event={"ID":"5efd4865-6244-4593-937d-233009526404","Type":"ContainerDied","Data":"c6c54b13ce9e30c618113fc4fea0d7467d78529ee3c20e924785efa5c54f9d43"} Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.591304 4707 scope.go:117] "RemoveContainer" containerID="359bc0f1140184e8b0a3682548d77ccec47e87d127c82c41d7a5517526ff036d" Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.612180 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-nvxdv"] Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.615827 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-nvxdv"] Jan 21 17:09:08 crc kubenswrapper[4707]: I0121 17:09:08.712710 4707 scope.go:117] "RemoveContainer" containerID="0874b5fae466a24cc3cd2df5c97fa3722bda4b4bc4d1969f5c3cb1b8d1c6d366" Jan 21 17:09:09 crc kubenswrapper[4707]: I0121 17:09:09.189354 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5efd4865-6244-4593-937d-233009526404" path="/var/lib/kubelet/pods/5efd4865-6244-4593-937d-233009526404/volumes" Jan 21 17:09:09 crc kubenswrapper[4707]: I0121 17:09:09.189956 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" path="/var/lib/kubelet/pods/c54027f2-5423-4434-90dc-9cf9d9da63d7/volumes" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.167903 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-qbsn9"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.172470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-7xvw4"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.176744 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-qbsn9"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.181894 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-7xvw4"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.189114 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c0b89c-689f-4db6-9c1b-2759a23d5c58" path="/var/lib/kubelet/pods/12c0b89c-689f-4db6-9c1b-2759a23d5c58/volumes" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.189757 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6325709f-f82f-4424-ad1a-aceef668a4e2" path="/var/lib/kubelet/pods/6325709f-f82f-4424-ad1a-aceef668a4e2/volumes" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.190308 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-8644c6967d-hrxd6"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.190466 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" podUID="21181af0-ee46-42db-a058-42ed66488786" containerName="keystone-api" containerID="cri-o://9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8" gracePeriod=30 Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237424 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl"] Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.237669 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.237695 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon-log" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237701 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon-log" Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.237720 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon-log" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237727 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon-log" Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.237742 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237747 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237857 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237877 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237887 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efd4865-6244-4593-937d-233009526404" containerName="horizon-log" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.237896 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54027f2-5423-4434-90dc-9cf9d9da63d7" containerName="horizon-log" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.238270 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.242877 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.397290 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts\") pod \"keystoneb6ca-account-delete-trrzl\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.397422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcm5t\" (UniqueName: \"kubernetes.io/projected/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-kube-api-access-zcm5t\") pod \"keystoneb6ca-account-delete-trrzl\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.499001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts\") pod \"keystoneb6ca-account-delete-trrzl\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.499136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcm5t\" (UniqueName: \"kubernetes.io/projected/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-kube-api-access-zcm5t\") pod \"keystoneb6ca-account-delete-trrzl\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.499675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts\") pod \"keystoneb6ca-account-delete-trrzl\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.515138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcm5t\" (UniqueName: \"kubernetes.io/projected/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-kube-api-access-zcm5t\") pod \"keystoneb6ca-account-delete-trrzl\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.556312 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.606473 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-vszqp"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.612494 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-vszqp"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.631918 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/root-account-create-update-9vpgc"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.632743 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.635776 4707 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.638883 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.644973 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.651333 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.653130 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-9vpgc"] Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.674870 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-9vpgc"] Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.675548 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-66l7q operator-scripts], unattached volumes=[], failed to process volumes=[kube-api-access-66l7q operator-scripts]: context canceled" pod="horizon-kuttl-tests/root-account-create-update-9vpgc" podUID="82b720a2-e6ce-4974-86bf-d95148f54ae6" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.704182 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66l7q\" (UniqueName: \"kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.704407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.766185 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-2" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerName="galera" containerID="cri-o://5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67" gracePeriod=30 Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.805266 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.805348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66l7q\" (UniqueName: \"kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.805410 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.805479 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts podName:82b720a2-e6ce-4974-86bf-d95148f54ae6 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:14.30546295 +0000 UTC m=+7651.486979172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts") pod "root-account-create-update-9vpgc" (UID: "82b720a2-e6ce-4974-86bf-d95148f54ae6") : configmap "openstack-scripts" not found Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.807965 4707 projected.go:194] Error preparing data for projected volume kube-api-access-66l7q for pod horizon-kuttl-tests/root-account-create-update-9vpgc: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:09:13 crc kubenswrapper[4707]: E0121 17:09:13.808019 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q podName:82b720a2-e6ce-4974-86bf-d95148f54ae6 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:14.308007474 +0000 UTC m=+7651.489523697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-66l7q" (UniqueName: "kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q") pod "root-account-create-update-9vpgc" (UID: "82b720a2-e6ce-4974-86bf-d95148f54ae6") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:09:13 crc kubenswrapper[4707]: I0121 17:09:13.971048 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl"] Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.152839 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.153290 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/memcached-0" podUID="6162703a-4327-49b8-a9fe-4fb97871dc5d" containerName="memcached" containerID="cri-o://935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032" gracePeriod=30 Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.314365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66l7q\" (UniqueName: \"kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.315158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.315355 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.315444 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts podName:82b720a2-e6ce-4974-86bf-d95148f54ae6 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:15.315431559 +0000 UTC m=+7652.496947782 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts") pod "root-account-create-update-9vpgc" (UID: "82b720a2-e6ce-4974-86bf-d95148f54ae6") : configmap "openstack-scripts" not found Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.316916 4707 projected.go:194] Error preparing data for projected volume kube-api-access-66l7q for pod horizon-kuttl-tests/root-account-create-update-9vpgc: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.317000 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q podName:82b720a2-e6ce-4974-86bf-d95148f54ae6 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:15.316983647 +0000 UTC m=+7652.498499870 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-66l7q" (UniqueName: "kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q") pod "root-account-create-update-9vpgc" (UID: "82b720a2-e6ce-4974-86bf-d95148f54ae6") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.385632 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.416062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-generated\") pod \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.416103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-operator-scripts\") pod \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.416148 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.416202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-default\") pod \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.416269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kolla-config\") pod \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.416310 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kwr6\" (UniqueName: \"kubernetes.io/projected/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kube-api-access-4kwr6\") pod \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\" (UID: \"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c\") " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.417036 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" (UID: "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.417214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" (UID: "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.417454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" (UID: "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.418160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" (UID: "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.421064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kube-api-access-4kwr6" (OuterVolumeSpecName: "kube-api-access-4kwr6") pod "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" (UID: "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c"). InnerVolumeSpecName "kube-api-access-4kwr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.424531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "mysql-db") pod "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" (UID: "89c5cc72-daa5-42e2-91af-5af2f1ab1c0c"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.504652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.517678 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kwr6\" (UniqueName: \"kubernetes.io/projected/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kube-api-access-4kwr6\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.517906 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.517917 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.517941 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.517951 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.517959 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.528052 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.620033 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.629879 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerID="337a0fb87cec8817d68a11a7b272173692ec5cb125a729e63051db7e19bc31df" exitCode=1 Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.629920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" event={"ID":"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89","Type":"ContainerDied","Data":"337a0fb87cec8817d68a11a7b272173692ec5cb125a729e63051db7e19bc31df"} Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.629970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" event={"ID":"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89","Type":"ContainerStarted","Data":"25692c310def483a5f55333b1faa75cd8f84393a182916556c267f558997bce6"} Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.630427 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" secret="" err="secret \"galera-openstack-dockercfg-5hrdf\" not found" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.630488 4707 scope.go:117] "RemoveContainer" containerID="337a0fb87cec8817d68a11a7b272173692ec5cb125a729e63051db7e19bc31df" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.631415 4707 generic.go:334] "Generic (PLEG): container finished" podID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerID="5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67" exitCode=0 Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.631452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c","Type":"ContainerDied","Data":"5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67"} Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.631474 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.631481 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.631487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"89c5cc72-daa5-42e2-91af-5af2f1ab1c0c","Type":"ContainerDied","Data":"7e3c543ba6ff63913faa2b7a1ac37656a265e4ef5bb4df335464ad1032578b17"} Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.631505 4707 scope.go:117] "RemoveContainer" containerID="5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.644429 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.660505 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.663411 4707 scope.go:117] "RemoveContainer" containerID="f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.665225 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.694909 4707 scope.go:117] "RemoveContainer" containerID="5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67" Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.695888 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67\": container with ID starting with 5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67 not found: ID does not exist" containerID="5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.695930 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67"} err="failed to get container status \"5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67\": rpc error: code = NotFound desc = could not find container \"5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67\": container with ID starting with 5cfd42aff685a2bfb4e49bedde49ecc89c9db289af245f98665c98fff275fc67 not found: ID does not exist" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.695956 4707 scope.go:117] "RemoveContainer" containerID="f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47" Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.696408 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47\": container with ID starting with f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47 not found: ID does not exist" containerID="f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47" Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.696434 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47"} err="failed to get container status \"f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47\": rpc error: code = NotFound desc = could not find container \"f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47\": container with ID starting with f599d7255ee77c94bbb263f6cccd28b590f6b112818b1afb23e47c606f3e8c47 not found: ID does not exist" Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.721382 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:14 crc kubenswrapper[4707]: E0121 17:09:14.721441 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts podName:2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89 nodeName:}" failed. 
No retries permitted until 2026-01-21 17:09:15.221428584 +0000 UTC m=+7652.402944806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts") pod "keystoneb6ca-account-delete-trrzl" (UID: "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89") : configmap "openstack-scripts" not found Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.838377 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.865169 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/rabbitmq-server-0" podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerName="rabbitmq" containerID="cri-o://f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c" gracePeriod=604800 Jan 21 17:09:14 crc kubenswrapper[4707]: I0121 17:09:14.973076 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.025730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-config-data\") pod \"6162703a-4327-49b8-a9fe-4fb97871dc5d\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.025785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9jlk\" (UniqueName: \"kubernetes.io/projected/6162703a-4327-49b8-a9fe-4fb97871dc5d-kube-api-access-c9jlk\") pod \"6162703a-4327-49b8-a9fe-4fb97871dc5d\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.025854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-kolla-config\") pod \"6162703a-4327-49b8-a9fe-4fb97871dc5d\" (UID: \"6162703a-4327-49b8-a9fe-4fb97871dc5d\") " Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.026230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-config-data" (OuterVolumeSpecName: "config-data") pod "6162703a-4327-49b8-a9fe-4fb97871dc5d" (UID: "6162703a-4327-49b8-a9fe-4fb97871dc5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.026408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6162703a-4327-49b8-a9fe-4fb97871dc5d" (UID: "6162703a-4327-49b8-a9fe-4fb97871dc5d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.029728 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6162703a-4327-49b8-a9fe-4fb97871dc5d-kube-api-access-c9jlk" (OuterVolumeSpecName: "kube-api-access-c9jlk") pod "6162703a-4327-49b8-a9fe-4fb97871dc5d" (UID: "6162703a-4327-49b8-a9fe-4fb97871dc5d"). InnerVolumeSpecName "kube-api-access-c9jlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.127539 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9jlk\" (UniqueName: \"kubernetes.io/projected/6162703a-4327-49b8-a9fe-4fb97871dc5d-kube-api-access-c9jlk\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.127568 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.127593 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6162703a-4327-49b8-a9fe-4fb97871dc5d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.182941 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.183182 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.189773 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559c5305-a2b7-4a06-87a7-afd6501038d7" path="/var/lib/kubelet/pods/559c5305-a2b7-4a06-87a7-afd6501038d7/volumes" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.190325 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" path="/var/lib/kubelet/pods/89c5cc72-daa5-42e2-91af-5af2f1ab1c0c/volumes" Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.229303 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.229346 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts podName:2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:16.229334696 +0000 UTC m=+7653.410850918 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts") pod "keystoneb6ca-account-delete-trrzl" (UID: "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89") : configmap "openstack-scripts" not found Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.330247 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.330323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66l7q\" (UniqueName: \"kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q\") pod \"root-account-create-update-9vpgc\" (UID: \"82b720a2-e6ce-4974-86bf-d95148f54ae6\") " pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.330367 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.330414 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts podName:82b720a2-e6ce-4974-86bf-d95148f54ae6 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:17.330398623 +0000 UTC m=+7654.511914845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts") pod "root-account-create-update-9vpgc" (UID: "82b720a2-e6ce-4974-86bf-d95148f54ae6") : configmap "openstack-scripts" not found Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.333049 4707 projected.go:194] Error preparing data for projected volume kube-api-access-66l7q for pod horizon-kuttl-tests/root-account-create-update-9vpgc: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.333105 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q podName:82b720a2-e6ce-4974-86bf-d95148f54ae6 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:17.333091717 +0000 UTC m=+7654.514607939 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-66l7q" (UniqueName: "kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q") pod "root-account-create-update-9vpgc" (UID: "82b720a2-e6ce-4974-86bf-d95148f54ae6") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.508307 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.640656 4707 generic.go:334] "Generic (PLEG): container finished" podID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerID="3664b1503bf4278e994e5052e39db81f8036faf508d4717580aa00b72e93fbe0" exitCode=1 Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.640703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" event={"ID":"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89","Type":"ContainerDied","Data":"3664b1503bf4278e994e5052e39db81f8036faf508d4717580aa00b72e93fbe0"} Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.640769 4707 scope.go:117] "RemoveContainer" containerID="337a0fb87cec8817d68a11a7b272173692ec5cb125a729e63051db7e19bc31df" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.641155 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" secret="" err="secret \"galera-openstack-dockercfg-5hrdf\" not found" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.641201 4707 scope.go:117] "RemoveContainer" containerID="3664b1503bf4278e994e5052e39db81f8036faf508d4717580aa00b72e93fbe0" Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.641451 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystoneb6ca-account-delete-trrzl_horizon-kuttl-tests(2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89)\"" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.642124 4707 generic.go:334] "Generic (PLEG): container finished" podID="6162703a-4327-49b8-a9fe-4fb97871dc5d" containerID="935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032" exitCode=0 Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.642163 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.642215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"6162703a-4327-49b8-a9fe-4fb97871dc5d","Type":"ContainerDied","Data":"935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032"} Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.642258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"6162703a-4327-49b8-a9fe-4fb97871dc5d","Type":"ContainerDied","Data":"34a3d08ac9c9d75a346868e01d956841ae5b27d2a02d1a3c2a769371fed794a0"} Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.642275 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-9vpgc" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.642369 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" podUID="dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" containerName="manager" containerID="cri-o://80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea" gracePeriod=10 Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.695014 4707 scope.go:117] "RemoveContainer" containerID="935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.739367 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-9vpgc"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.744992 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-9vpgc"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.749003 4707 scope.go:117] "RemoveContainer" containerID="935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.752353 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-d97qc"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.752531 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-d97qc" podUID="3cd98a54-a0ff-4a65-b77c-f9cc53537670" containerName="registry-server" containerID="cri-o://624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa" gracePeriod=30 Jan 21 17:09:15 crc kubenswrapper[4707]: E0121 17:09:15.755932 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032\": container with ID starting with 935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032 not found: ID does not exist" containerID="935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.755971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032"} err="failed to get container status \"935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032\": rpc error: code = NotFound desc = could not find container \"935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032\": container with ID starting with 935737c288258b47814cde365ce4bf78271374a4653ddab5ac9881d2e7115032 not found: ID does not exist" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.760487 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.775188 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.779909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs"] Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.784459 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672anqfrs"] Jan 21 17:09:15 crc 
kubenswrapper[4707]: I0121 17:09:15.793430 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-1" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" containerName="galera" containerID="cri-o://f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052" gracePeriod=28 Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.839295 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66l7q\" (UniqueName: \"kubernetes.io/projected/82b720a2-e6ce-4974-86bf-d95148f54ae6-kube-api-access-66l7q\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:15 crc kubenswrapper[4707]: I0121 17:09:15.839319 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b720a2-e6ce-4974-86bf-d95148f54ae6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.043098 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.049125 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.142300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2ksz\" (UniqueName: \"kubernetes.io/projected/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-kube-api-access-p2ksz\") pod \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.142340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-webhook-cert\") pod \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.142418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-apiservice-cert\") pod \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\" (UID: \"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.142447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h54vs\" (UniqueName: \"kubernetes.io/projected/3cd98a54-a0ff-4a65-b77c-f9cc53537670-kube-api-access-h54vs\") pod \"3cd98a54-a0ff-4a65-b77c-f9cc53537670\" (UID: \"3cd98a54-a0ff-4a65-b77c-f9cc53537670\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.146401 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" (UID: "dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.147112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd98a54-a0ff-4a65-b77c-f9cc53537670-kube-api-access-h54vs" (OuterVolumeSpecName: "kube-api-access-h54vs") pod "3cd98a54-a0ff-4a65-b77c-f9cc53537670" (UID: "3cd98a54-a0ff-4a65-b77c-f9cc53537670"). 
InnerVolumeSpecName "kube-api-access-h54vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.147742 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" (UID: "dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.148562 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-kube-api-access-p2ksz" (OuterVolumeSpecName: "kube-api-access-p2ksz") pod "dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" (UID: "dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39"). InnerVolumeSpecName "kube-api-access-p2ksz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.175211 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-plugins-conf\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqm2r\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-kube-api-access-nqm2r\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-pod-info\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-erlang-cookie\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244470 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-erlang-cookie-secret\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-confd\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.244728 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-plugins\") pod \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\" (UID: \"5b4c43f5-dc57-473d-985e-7dc44e6f06c9\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.245132 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.245149 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h54vs\" (UniqueName: \"kubernetes.io/projected/3cd98a54-a0ff-4a65-b77c-f9cc53537670-kube-api-access-h54vs\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.245160 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2ksz\" (UniqueName: \"kubernetes.io/projected/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-kube-api-access-p2ksz\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.245168 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.246006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.248305 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.248349 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts podName:2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:18.248334913 +0000 UTC m=+7655.429851135 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts") pod "keystoneb6ca-account-delete-trrzl" (UID: "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89") : configmap "openstack-scripts" not found Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.249275 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-kube-api-access-nqm2r" (OuterVolumeSpecName: "kube-api-access-nqm2r") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "kube-api-access-nqm2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.249533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.250071 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.250974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-pod-info" (OuterVolumeSpecName: "pod-info") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.251414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.267084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3" (OuterVolumeSpecName: "persistence") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.299199 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5b4c43f5-dc57-473d-985e-7dc44e6f06c9" (UID: "5b4c43f5-dc57-473d-985e-7dc44e6f06c9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347040 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347065 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347078 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqm2r\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-kube-api-access-nqm2r\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347088 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347098 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347106 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347115 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4c43f5-dc57-473d-985e-7dc44e6f06c9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.347149 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") on node \"crc\" " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.365669 4707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.365965 4707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3") on node "crc" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.448339 4707 reconciler_common.go:293] "Volume detached for volume \"pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdf71db7-9bfe-47a6-83d3-bf7a15f353d3\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.481410 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.549562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96jw9\" (UniqueName: \"kubernetes.io/projected/21181af0-ee46-42db-a058-42ed66488786-kube-api-access-96jw9\") pod \"21181af0-ee46-42db-a058-42ed66488786\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.549614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-fernet-keys\") pod \"21181af0-ee46-42db-a058-42ed66488786\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.549638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-credential-keys\") pod \"21181af0-ee46-42db-a058-42ed66488786\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.549668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-config-data\") pod \"21181af0-ee46-42db-a058-42ed66488786\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.549694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-scripts\") pod \"21181af0-ee46-42db-a058-42ed66488786\" (UID: \"21181af0-ee46-42db-a058-42ed66488786\") " Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.553000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21181af0-ee46-42db-a058-42ed66488786-kube-api-access-96jw9" (OuterVolumeSpecName: "kube-api-access-96jw9") pod "21181af0-ee46-42db-a058-42ed66488786" (UID: "21181af0-ee46-42db-a058-42ed66488786"). InnerVolumeSpecName "kube-api-access-96jw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.553043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-scripts" (OuterVolumeSpecName: "scripts") pod "21181af0-ee46-42db-a058-42ed66488786" (UID: "21181af0-ee46-42db-a058-42ed66488786"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.553196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "21181af0-ee46-42db-a058-42ed66488786" (UID: "21181af0-ee46-42db-a058-42ed66488786"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.555855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "21181af0-ee46-42db-a058-42ed66488786" (UID: "21181af0-ee46-42db-a058-42ed66488786"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.566501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-config-data" (OuterVolumeSpecName: "config-data") pod "21181af0-ee46-42db-a058-42ed66488786" (UID: "21181af0-ee46-42db-a058-42ed66488786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.648825 4707 generic.go:334] "Generic (PLEG): container finished" podID="dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" containerID="80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea" exitCode=0 Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.648877 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.648917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" event={"ID":"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39","Type":"ContainerDied","Data":"80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.648947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf" event={"ID":"dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39","Type":"ContainerDied","Data":"92316e6a66835d11965fb56660a630c35e7f401b2734a1b76155d4924642fa5a"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.648967 4707 scope.go:117] "RemoveContainer" containerID="80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.650959 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96jw9\" (UniqueName: \"kubernetes.io/projected/21181af0-ee46-42db-a058-42ed66488786-kube-api-access-96jw9\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.650980 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.650990 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.650999 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.651007 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21181af0-ee46-42db-a058-42ed66488786-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.651115 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerID="f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c" exitCode=0 Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.651160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"5b4c43f5-dc57-473d-985e-7dc44e6f06c9","Type":"ContainerDied","Data":"f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.651166 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.651177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"5b4c43f5-dc57-473d-985e-7dc44e6f06c9","Type":"ContainerDied","Data":"a74efbb78427ff359cacb8a1dfd4763f4ee093e49a0e2b209bcbe64633b019eb"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.656847 4707 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" secret="" err="secret \"galera-openstack-dockercfg-5hrdf\" not found" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.656887 4707 scope.go:117] "RemoveContainer" containerID="3664b1503bf4278e994e5052e39db81f8036faf508d4717580aa00b72e93fbe0" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.657070 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystoneb6ca-account-delete-trrzl_horizon-kuttl-tests(2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89)\"" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.658430 4707 generic.go:334] "Generic (PLEG): container finished" podID="3cd98a54-a0ff-4a65-b77c-f9cc53537670" containerID="624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa" exitCode=0 Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.658474 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-d97qc" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.658479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-d97qc" event={"ID":"3cd98a54-a0ff-4a65-b77c-f9cc53537670","Type":"ContainerDied","Data":"624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.658592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-d97qc" event={"ID":"3cd98a54-a0ff-4a65-b77c-f9cc53537670","Type":"ContainerDied","Data":"cae53d8164a52a16b8e4713d9043dc1680b7e2b0f79e4046e83636a38d397a4e"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.661386 4707 generic.go:334] "Generic (PLEG): container finished" podID="21181af0-ee46-42db-a058-42ed66488786" containerID="9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8" exitCode=0 Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.661407 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.661425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" event={"ID":"21181af0-ee46-42db-a058-42ed66488786","Type":"ContainerDied","Data":"9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.661446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-8644c6967d-hrxd6" event={"ID":"21181af0-ee46-42db-a058-42ed66488786","Type":"ContainerDied","Data":"9307669e1fafcfee9ff9aaa71762b66146a39ee693949ae4658a156e8a3415e9"} Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.677565 4707 scope.go:117] "RemoveContainer" containerID="80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.681215 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea\": container with ID starting with 80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea not found: ID does not exist" containerID="80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.681251 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea"} err="failed to get container status \"80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea\": rpc error: code = NotFound desc = could not find container \"80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea\": container with ID starting with 80982d4c6df2944025b7b70dff423df25efc5b0aee0ac8f80bdf46ea987974ea not found: ID does not exist" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.681273 4707 scope.go:117] "RemoveContainer" containerID="f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.687132 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.697281 4707 scope.go:117] "RemoveContainer" containerID="1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.701672 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-56c8cbcbc7-vjtnf"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.707354 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.711380 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.721889 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-d97qc"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.723763 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-d97qc"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.726912 4707 scope.go:117] "RemoveContainer" 
containerID="f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.727322 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c\": container with ID starting with f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c not found: ID does not exist" containerID="f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.727355 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c"} err="failed to get container status \"f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c\": rpc error: code = NotFound desc = could not find container \"f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c\": container with ID starting with f59dbde26d94d221d4c48d4235e9b5e967279651a59ed2b776a16fc00448256c not found: ID does not exist" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.727378 4707 scope.go:117] "RemoveContainer" containerID="1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.727826 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df\": container with ID starting with 1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df not found: ID does not exist" containerID="1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.727861 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df"} err="failed to get container status \"1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df\": rpc error: code = NotFound desc = could not find container \"1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df\": container with ID starting with 1a24e7354826b2879e403ac3a9c091c9e103a91f927c1000ac5b76d1a05954df not found: ID does not exist" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.727882 4707 scope.go:117] "RemoveContainer" containerID="624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.728327 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-8644c6967d-hrxd6"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.731487 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-8644c6967d-hrxd6"] Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.740720 4707 scope.go:117] "RemoveContainer" containerID="624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.741082 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa\": container with ID starting with 624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa not found: ID does not exist" containerID="624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa" Jan 21 17:09:16 
crc kubenswrapper[4707]: I0121 17:09:16.741110 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa"} err="failed to get container status \"624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa\": rpc error: code = NotFound desc = could not find container \"624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa\": container with ID starting with 624b05474fb50220ca6190faabd2a2acc6739fe230efb7d1c1fad7536518c1aa not found: ID does not exist" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.741131 4707 scope.go:117] "RemoveContainer" containerID="9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.758498 4707 scope.go:117] "RemoveContainer" containerID="9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8" Jan 21 17:09:16 crc kubenswrapper[4707]: E0121 17:09:16.758778 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8\": container with ID starting with 9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8 not found: ID does not exist" containerID="9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8" Jan 21 17:09:16 crc kubenswrapper[4707]: I0121 17:09:16.758844 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8"} err="failed to get container status \"9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8\": rpc error: code = NotFound desc = could not find container \"9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8\": container with ID starting with 9bf30b2735dd70691d6e1d505b1b453091ed144e2d2b530ea862ecb5e30e15d8 not found: ID does not exist" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.188977 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21181af0-ee46-42db-a058-42ed66488786" path="/var/lib/kubelet/pods/21181af0-ee46-42db-a058-42ed66488786/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.189467 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd98a54-a0ff-4a65-b77c-f9cc53537670" path="/var/lib/kubelet/pods/3cd98a54-a0ff-4a65-b77c-f9cc53537670/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.190068 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" path="/var/lib/kubelet/pods/5b4c43f5-dc57-473d-985e-7dc44e6f06c9/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.190531 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6162703a-4327-49b8-a9fe-4fb97871dc5d" path="/var/lib/kubelet/pods/6162703a-4327-49b8-a9fe-4fb97871dc5d/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.190875 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b720a2-e6ce-4974-86bf-d95148f54ae6" path="/var/lib/kubelet/pods/82b720a2-e6ce-4974-86bf-d95148f54ae6/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.191133 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c2ebe8-fbfb-468a-b5a7-8e455d5fe063" path="/var/lib/kubelet/pods/85c2ebe8-fbfb-468a-b5a7-8e455d5fe063/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 
17:09:17.191649 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" path="/var/lib/kubelet/pods/dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39/volumes" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.474607 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.562911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-generated\") pod \"6054a174-42ee-4839-a37a-677ac9a78b68\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.562966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"6054a174-42ee-4839-a37a-677ac9a78b68\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.562989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-kolla-config\") pod \"6054a174-42ee-4839-a37a-677ac9a78b68\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-operator-scripts\") pod \"6054a174-42ee-4839-a37a-677ac9a78b68\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-default\") pod \"6054a174-42ee-4839-a37a-677ac9a78b68\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4h98\" (UniqueName: \"kubernetes.io/projected/6054a174-42ee-4839-a37a-677ac9a78b68-kube-api-access-s4h98\") pod \"6054a174-42ee-4839-a37a-677ac9a78b68\" (UID: \"6054a174-42ee-4839-a37a-677ac9a78b68\") " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6054a174-42ee-4839-a37a-677ac9a78b68" (UID: "6054a174-42ee-4839-a37a-677ac9a78b68"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6054a174-42ee-4839-a37a-677ac9a78b68" (UID: "6054a174-42ee-4839-a37a-677ac9a78b68"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6054a174-42ee-4839-a37a-677ac9a78b68" (UID: "6054a174-42ee-4839-a37a-677ac9a78b68"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.563990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6054a174-42ee-4839-a37a-677ac9a78b68" (UID: "6054a174-42ee-4839-a37a-677ac9a78b68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.565990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6054a174-42ee-4839-a37a-677ac9a78b68-kube-api-access-s4h98" (OuterVolumeSpecName: "kube-api-access-s4h98") pod "6054a174-42ee-4839-a37a-677ac9a78b68" (UID: "6054a174-42ee-4839-a37a-677ac9a78b68"). InnerVolumeSpecName "kube-api-access-s4h98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.570268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "6054a174-42ee-4839-a37a-677ac9a78b68" (UID: "6054a174-42ee-4839-a37a-677ac9a78b68"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.651540 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7"] Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.651724 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" podUID="078cfcac-e384-409d-8b67-d0b56e051b7f" containerName="manager" containerID="cri-o://ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574" gracePeriod=10 Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.664666 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.664706 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.664717 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.664727 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.664736 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6054a174-42ee-4839-a37a-677ac9a78b68-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.664744 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4h98\" (UniqueName: \"kubernetes.io/projected/6054a174-42ee-4839-a37a-677ac9a78b68-kube-api-access-s4h98\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.686122 4707 generic.go:334] "Generic (PLEG): container finished" podID="6054a174-42ee-4839-a37a-677ac9a78b68" containerID="f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052" exitCode=0 Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.686201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"6054a174-42ee-4839-a37a-677ac9a78b68","Type":"ContainerDied","Data":"f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052"} Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.686230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"6054a174-42ee-4839-a37a-677ac9a78b68","Type":"ContainerDied","Data":"7b7d335f05418621d7e429a0130e97a7094bd32e6ebd6c7f5a82cb2159f1e8d7"} Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.686245 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.686275 4707 scope.go:117] "RemoveContainer" containerID="f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.688025 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 17:09:17 crc kubenswrapper[4707]: E0121 17:09:17.708930 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54027f2_5423_4434_90dc_9cf9d9da63d7.slice/crio-95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21\": RecentStats: unable to find data in memory cache]" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.750532 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.751351 4707 scope.go:117] "RemoveContainer" containerID="5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.754341 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.766595 4707 scope.go:117] "RemoveContainer" containerID="f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052" Jan 21 17:09:17 crc kubenswrapper[4707]: E0121 17:09:17.766886 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052\": container with ID starting with f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052 not found: ID does not exist" containerID="f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.766922 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052"} err="failed to get container status \"f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052\": rpc error: code = NotFound desc = could not find container \"f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052\": container with ID starting with f752f6ac8a8997ce6c02c5337ec7075bc6bf80f4151f8f9783d56014ed5ef052 not found: ID does not exist" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.766942 4707 scope.go:117] "RemoveContainer" containerID="5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd" Jan 21 17:09:17 crc kubenswrapper[4707]: E0121 17:09:17.767174 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd\": container with ID starting with 5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd not found: ID does not exist" containerID="5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.767193 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd"} err="failed to get container status 
\"5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd\": rpc error: code = NotFound desc = could not find container \"5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd\": container with ID starting with 5865bffe687a923fe710b2413d01df78b0e137726052160cb9eef9d6c9bab2fd not found: ID does not exist" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.773035 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.834459 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-0" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerName="galera" containerID="cri-o://e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed" gracePeriod=26 Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.862136 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-c8gcz"] Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.862307 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-c8gcz" podUID="da1d6c5f-e5a4-4d85-9526-499182972d3b" containerName="registry-server" containerID="cri-o://d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc" gracePeriod=30 Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.878541 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz"] Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.885018 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069z4fpz"] Jan 21 17:09:17 crc kubenswrapper[4707]: I0121 17:09:17.961664 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.077802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-apiservice-cert\") pod \"078cfcac-e384-409d-8b67-d0b56e051b7f\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.077923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-webhook-cert\") pod \"078cfcac-e384-409d-8b67-d0b56e051b7f\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.078004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5svgm\" (UniqueName: \"kubernetes.io/projected/078cfcac-e384-409d-8b67-d0b56e051b7f-kube-api-access-5svgm\") pod \"078cfcac-e384-409d-8b67-d0b56e051b7f\" (UID: \"078cfcac-e384-409d-8b67-d0b56e051b7f\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.081893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/078cfcac-e384-409d-8b67-d0b56e051b7f-kube-api-access-5svgm" (OuterVolumeSpecName: "kube-api-access-5svgm") pod "078cfcac-e384-409d-8b67-d0b56e051b7f" (UID: "078cfcac-e384-409d-8b67-d0b56e051b7f"). 
InnerVolumeSpecName "kube-api-access-5svgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.082235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "078cfcac-e384-409d-8b67-d0b56e051b7f" (UID: "078cfcac-e384-409d-8b67-d0b56e051b7f"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.087234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "078cfcac-e384-409d-8b67-d0b56e051b7f" (UID: "078cfcac-e384-409d-8b67-d0b56e051b7f"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.151687 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.179442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx256\" (UniqueName: \"kubernetes.io/projected/da1d6c5f-e5a4-4d85-9526-499182972d3b-kube-api-access-wx256\") pod \"da1d6c5f-e5a4-4d85-9526-499182972d3b\" (UID: \"da1d6c5f-e5a4-4d85-9526-499182972d3b\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.179839 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.179857 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5svgm\" (UniqueName: \"kubernetes.io/projected/078cfcac-e384-409d-8b67-d0b56e051b7f-kube-api-access-5svgm\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.179868 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/078cfcac-e384-409d-8b67-d0b56e051b7f-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.183950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1d6c5f-e5a4-4d85-9526-499182972d3b-kube-api-access-wx256" (OuterVolumeSpecName: "kube-api-access-wx256") pod "da1d6c5f-e5a4-4d85-9526-499182972d3b" (UID: "da1d6c5f-e5a4-4d85-9526-499182972d3b"). InnerVolumeSpecName "kube-api-access-wx256". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.275919 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-mcv6c"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.279497 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-mcv6c"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.281189 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx256\" (UniqueName: \"kubernetes.io/projected/da1d6c5f-e5a4-4d85-9526-499182972d3b-kube-api-access-wx256\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: E0121 17:09:18.281261 4707 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 17:09:18 crc kubenswrapper[4707]: E0121 17:09:18.281300 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts podName:2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89 nodeName:}" failed. No retries permitted until 2026-01-21 17:09:22.281286846 +0000 UTC m=+7659.462803069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts") pod "keystoneb6ca-account-delete-trrzl" (UID: "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89") : configmap "openstack-scripts" not found Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.344211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.367245 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.374890 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-b6ca-account-create-update-gl9xl"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.622992 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.684845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kolla-config\") pod \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.684884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.686475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-generated\") pod \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.686546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-operator-scripts\") pod \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.686605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5dvb\" (UniqueName: \"kubernetes.io/projected/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kube-api-access-l5dvb\") pod \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.686693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-default\") pod \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\" (UID: \"65e6dbe8-c2a6-4589-88cb-e16c1b718141\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.687020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "65e6dbe8-c2a6-4589-88cb-e16c1b718141" (UID: "65e6dbe8-c2a6-4589-88cb-e16c1b718141"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.687142 4707 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.687512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "65e6dbe8-c2a6-4589-88cb-e16c1b718141" (UID: "65e6dbe8-c2a6-4589-88cb-e16c1b718141"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.687642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "65e6dbe8-c2a6-4589-88cb-e16c1b718141" (UID: "65e6dbe8-c2a6-4589-88cb-e16c1b718141"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.689508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65e6dbe8-c2a6-4589-88cb-e16c1b718141" (UID: "65e6dbe8-c2a6-4589-88cb-e16c1b718141"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.691141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kube-api-access-l5dvb" (OuterVolumeSpecName: "kube-api-access-l5dvb") pod "65e6dbe8-c2a6-4589-88cb-e16c1b718141" (UID: "65e6dbe8-c2a6-4589-88cb-e16c1b718141"). InnerVolumeSpecName "kube-api-access-l5dvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.698597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "65e6dbe8-c2a6-4589-88cb-e16c1b718141" (UID: "65e6dbe8-c2a6-4589-88cb-e16c1b718141"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.698929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" event={"ID":"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89","Type":"ContainerDied","Data":"25692c310def483a5f55333b1faa75cd8f84393a182916556c267f558997bce6"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.698966 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25692c310def483a5f55333b1faa75cd8f84393a182916556c267f558997bce6" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.700165 4707 generic.go:334] "Generic (PLEG): container finished" podID="da1d6c5f-e5a4-4d85-9526-499182972d3b" containerID="d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc" exitCode=0 Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.700215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-c8gcz" event={"ID":"da1d6c5f-e5a4-4d85-9526-499182972d3b","Type":"ContainerDied","Data":"d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.700231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-c8gcz" event={"ID":"da1d6c5f-e5a4-4d85-9526-499182972d3b","Type":"ContainerDied","Data":"3dbdd361fa174440a23de5ec6d6d28fdb63f94e214023e9875cace22a17cbfd5"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.700246 4707 scope.go:117] "RemoveContainer" containerID="d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.700318 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-c8gcz" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.704038 4707 generic.go:334] "Generic (PLEG): container finished" podID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerID="e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed" exitCode=0 Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.704088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"65e6dbe8-c2a6-4589-88cb-e16c1b718141","Type":"ContainerDied","Data":"e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.704113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"65e6dbe8-c2a6-4589-88cb-e16c1b718141","Type":"ContainerDied","Data":"e85f7d6f5a7308d578a7650d245cf539bb7db82273f2d60be6d6dd4c959596d1"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.704166 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.710107 4707 generic.go:334] "Generic (PLEG): container finished" podID="078cfcac-e384-409d-8b67-d0b56e051b7f" containerID="ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574" exitCode=0 Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.710170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" event={"ID":"078cfcac-e384-409d-8b67-d0b56e051b7f","Type":"ContainerDied","Data":"ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.710185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" event={"ID":"078cfcac-e384-409d-8b67-d0b56e051b7f","Type":"ContainerDied","Data":"42e067ec320145591960fccc3d1633c7a7e188de74f1ff67967860bf89279176"} Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.710220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.718151 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.728850 4707 scope.go:117] "RemoveContainer" containerID="d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc" Jan 21 17:09:18 crc kubenswrapper[4707]: E0121 17:09:18.729228 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc\": container with ID starting with d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc not found: ID does not exist" containerID="d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.729265 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc"} err="failed to get container status \"d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc\": rpc error: code = NotFound desc = could not find container \"d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc\": container with ID starting with d3f57275ea0c1f785d670a34f4a286214471e6b1749122744df8417380aa7abc not found: ID does not exist" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.729288 4707 scope.go:117] "RemoveContainer" containerID="e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.745531 4707 scope.go:117] "RemoveContainer" containerID="d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.751923 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-c8gcz"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.756014 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-c8gcz"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.759886 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.764148 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.767365 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.771315 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5885849d4c-vm8k7"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.780458 4707 scope.go:117] "RemoveContainer" containerID="e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed" Jan 21 17:09:18 crc kubenswrapper[4707]: E0121 17:09:18.780881 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed\": container with ID starting with e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed not found: ID does not exist" containerID="e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.780914 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed"} err="failed to get container status \"e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed\": rpc error: code = NotFound desc = could not find container \"e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed\": container with ID starting with e796ca3a506e4b601a09c7835acd3d330374125405205451f68d9bbdbb0830ed not found: ID does not exist" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.780934 4707 scope.go:117] "RemoveContainer" containerID="d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02" Jan 21 17:09:18 crc kubenswrapper[4707]: E0121 17:09:18.781159 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02\": container with ID starting with d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02 not found: ID does not exist" containerID="d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.781179 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02"} err="failed to get container status \"d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02\": rpc error: code = NotFound desc = could not find container \"d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02\": container with ID starting with d12dac3bb0e292faceea93a3ee4b014c5046826e6827cdbf76735c31dfa84f02 not found: ID does not exist" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.781192 4707 scope.go:117] "RemoveContainer" containerID="ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.788618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcm5t\" (UniqueName: \"kubernetes.io/projected/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-kube-api-access-zcm5t\") pod 
\"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.788841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts\") pod \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\" (UID: \"2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89\") " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" (UID: "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789494 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789596 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789670 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65e6dbe8-c2a6-4589-88cb-e16c1b718141-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789742 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789794 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65e6dbe8-c2a6-4589-88cb-e16c1b718141-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.789884 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5dvb\" (UniqueName: \"kubernetes.io/projected/65e6dbe8-c2a6-4589-88cb-e16c1b718141-kube-api-access-l5dvb\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.791046 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-kube-api-access-zcm5t" (OuterVolumeSpecName: "kube-api-access-zcm5t") pod "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" (UID: "2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89"). InnerVolumeSpecName "kube-api-access-zcm5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.795966 4707 scope.go:117] "RemoveContainer" containerID="ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574" Jan 21 17:09:18 crc kubenswrapper[4707]: E0121 17:09:18.796276 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574\": container with ID starting with ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574 not found: ID does not exist" containerID="ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.796299 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574"} err="failed to get container status \"ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574\": rpc error: code = NotFound desc = could not find container \"ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574\": container with ID starting with ea1258240c91ab02edbd62c535192e233fa10753ea5afd05d7a7459fc0401574 not found: ID does not exist" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.799334 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.836130 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7"] Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.837118 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" podUID="b6f47560-0311-4a69-be7f-e57c1b043c06" containerName="operator" containerID="cri-o://9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06" gracePeriod=10 Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.891632 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcm5t\" (UniqueName: \"kubernetes.io/projected/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89-kube-api-access-zcm5t\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:18 crc kubenswrapper[4707]: I0121 17:09:18.891752 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.055065 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lmh6x"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.055229 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" podUID="f91ad4c0-4328-4242-bf29-575b12865e63" containerName="registry-server" containerID="cri-o://058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913" gracePeriod=30 Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.072312 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.082681 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qgwdj"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.144932 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.190868 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="078cfcac-e384-409d-8b67-d0b56e051b7f" path="/var/lib/kubelet/pods/078cfcac-e384-409d-8b67-d0b56e051b7f/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.191514 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2038593f-c740-4feb-af87-945317f2adee" path="/var/lib/kubelet/pods/2038593f-c740-4feb-af87-945317f2adee/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.192184 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220973a9-7456-4526-a610-7793dfea1cbf" path="/var/lib/kubelet/pods/220973a9-7456-4526-a610-7793dfea1cbf/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.193209 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" path="/var/lib/kubelet/pods/6054a174-42ee-4839-a37a-677ac9a78b68/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.193834 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" path="/var/lib/kubelet/pods/65e6dbe8-c2a6-4589-88cb-e16c1b718141/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.194010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n689x\" (UniqueName: \"kubernetes.io/projected/b6f47560-0311-4a69-be7f-e57c1b043c06-kube-api-access-n689x\") pod \"b6f47560-0311-4a69-be7f-e57c1b043c06\" (UID: \"b6f47560-0311-4a69-be7f-e57c1b043c06\") " Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.194472 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a905ee-abc4-4123-bc7a-b6fc67b1c42c" path="/var/lib/kubelet/pods/97a905ee-abc4-4123-bc7a-b6fc67b1c42c/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.195390 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00f35b5-3d95-40cf-8016-d5aaf9c01adb" path="/var/lib/kubelet/pods/b00f35b5-3d95-40cf-8016-d5aaf9c01adb/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.196158 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1d6c5f-e5a4-4d85-9526-499182972d3b" path="/var/lib/kubelet/pods/da1d6c5f-e5a4-4d85-9526-499182972d3b/volumes" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.197487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f47560-0311-4a69-be7f-e57c1b043c06-kube-api-access-n689x" (OuterVolumeSpecName: "kube-api-access-n689x") pod "b6f47560-0311-4a69-be7f-e57c1b043c06" (UID: "b6f47560-0311-4a69-be7f-e57c1b043c06"). InnerVolumeSpecName "kube-api-access-n689x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.295535 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n689x\" (UniqueName: \"kubernetes.io/projected/b6f47560-0311-4a69-be7f-e57c1b043c06-kube-api-access-n689x\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.359875 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.396230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q5qz\" (UniqueName: \"kubernetes.io/projected/f91ad4c0-4328-4242-bf29-575b12865e63-kube-api-access-2q5qz\") pod \"f91ad4c0-4328-4242-bf29-575b12865e63\" (UID: \"f91ad4c0-4328-4242-bf29-575b12865e63\") " Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.399038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91ad4c0-4328-4242-bf29-575b12865e63-kube-api-access-2q5qz" (OuterVolumeSpecName: "kube-api-access-2q5qz") pod "f91ad4c0-4328-4242-bf29-575b12865e63" (UID: "f91ad4c0-4328-4242-bf29-575b12865e63"). InnerVolumeSpecName "kube-api-access-2q5qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.497914 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q5qz\" (UniqueName: \"kubernetes.io/projected/f91ad4c0-4328-4242-bf29-575b12865e63-kube-api-access-2q5qz\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.723928 4707 generic.go:334] "Generic (PLEG): container finished" podID="f91ad4c0-4328-4242-bf29-575b12865e63" containerID="058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913" exitCode=0 Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.723982 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.723997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" event={"ID":"f91ad4c0-4328-4242-bf29-575b12865e63","Type":"ContainerDied","Data":"058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913"} Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.724739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lmh6x" event={"ID":"f91ad4c0-4328-4242-bf29-575b12865e63","Type":"ContainerDied","Data":"00698ecacca27934b7e35b0b651a512327258bb5da0b6f0c728fa67e18fcae9c"} Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.724759 4707 scope.go:117] "RemoveContainer" containerID="058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.726362 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6f47560-0311-4a69-be7f-e57c1b043c06" containerID="9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06" exitCode=0 Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.726409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" event={"ID":"b6f47560-0311-4a69-be7f-e57c1b043c06","Type":"ContainerDied","Data":"9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06"} Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.726425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" event={"ID":"b6f47560-0311-4a69-be7f-e57c1b043c06","Type":"ContainerDied","Data":"0d5c05565378bd34f51239079e8bcd2452da02d6e62074c26d1e58312b0e7fa0"} Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.726456 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.743236 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.749294 4707 scope.go:117] "RemoveContainer" containerID="058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913" Jan 21 17:09:19 crc kubenswrapper[4707]: E0121 17:09:19.753506 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913\": container with ID starting with 058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913 not found: ID does not exist" containerID="058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.753539 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913"} err="failed to get container status \"058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913\": rpc error: code = NotFound desc = could not find container \"058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913\": container with ID starting with 058e2eec32acc4a3116217c879bd00132bccf463302062c81e6567e8961a6913 not found: ID does not exist" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.753606 4707 scope.go:117] "RemoveContainer" containerID="9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.755658 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lmh6x"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.758571 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lmh6x"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.768280 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.774560 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fxmp7"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.775491 4707 scope.go:117] "RemoveContainer" containerID="9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06" Jan 21 17:09:19 crc kubenswrapper[4707]: E0121 17:09:19.775798 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06\": container with ID starting with 9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06 not found: ID does not exist" containerID="9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.775839 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06"} err="failed to get container status \"9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06\": rpc error: code = NotFound desc = could not find container 
\"9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06\": container with ID starting with 9ca0277e4da2891cd95e2fb7f6b1f541fcee7873ef5c323b32e420ebe3d0dc06 not found: ID does not exist" Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.778638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl"] Jan 21 17:09:19 crc kubenswrapper[4707]: I0121 17:09:19.783250 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystoneb6ca-account-delete-trrzl"] Jan 21 17:09:21 crc kubenswrapper[4707]: I0121 17:09:21.189350 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" path="/var/lib/kubelet/pods/2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89/volumes" Jan 21 17:09:21 crc kubenswrapper[4707]: I0121 17:09:21.190091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f47560-0311-4a69-be7f-e57c1b043c06" path="/var/lib/kubelet/pods/b6f47560-0311-4a69-be7f-e57c1b043c06/volumes" Jan 21 17:09:21 crc kubenswrapper[4707]: I0121 17:09:21.190497 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91ad4c0-4328-4242-bf29-575b12865e63" path="/var/lib/kubelet/pods/f91ad4c0-4328-4242-bf29-575b12865e63/volumes" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.208288 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq"] Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.208650 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" podUID="537a7434-2179-4bf0-b885-8c8a86e35ced" containerName="manager" containerID="cri-o://934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987" gracePeriod=10 Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.427867 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-hll9d"] Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.428026 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-hll9d" podUID="72bc4e04-1249-4797-9a6f-044c6245e3d7" containerName="registry-server" containerID="cri-o://a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5" gracePeriod=30 Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.476924 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk"] Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.477863 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c12tphk"] Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.576566 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.650828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx745\" (UniqueName: \"kubernetes.io/projected/537a7434-2179-4bf0-b885-8c8a86e35ced-kube-api-access-dx745\") pod \"537a7434-2179-4bf0-b885-8c8a86e35ced\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.650902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-webhook-cert\") pod \"537a7434-2179-4bf0-b885-8c8a86e35ced\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.650945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-apiservice-cert\") pod \"537a7434-2179-4bf0-b885-8c8a86e35ced\" (UID: \"537a7434-2179-4bf0-b885-8c8a86e35ced\") " Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.656145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "537a7434-2179-4bf0-b885-8c8a86e35ced" (UID: "537a7434-2179-4bf0-b885-8c8a86e35ced"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.656967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "537a7434-2179-4bf0-b885-8c8a86e35ced" (UID: "537a7434-2179-4bf0-b885-8c8a86e35ced"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.660759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537a7434-2179-4bf0-b885-8c8a86e35ced-kube-api-access-dx745" (OuterVolumeSpecName: "kube-api-access-dx745") pod "537a7434-2179-4bf0-b885-8c8a86e35ced" (UID: "537a7434-2179-4bf0-b885-8c8a86e35ced"). InnerVolumeSpecName "kube-api-access-dx745". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.740117 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.756117 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx745\" (UniqueName: \"kubernetes.io/projected/537a7434-2179-4bf0-b885-8c8a86e35ced-kube-api-access-dx745\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.756136 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.756146 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/537a7434-2179-4bf0-b885-8c8a86e35ced-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.768699 4707 generic.go:334] "Generic (PLEG): container finished" podID="72bc4e04-1249-4797-9a6f-044c6245e3d7" containerID="a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5" exitCode=0 Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.768750 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-hll9d" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.768772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-hll9d" event={"ID":"72bc4e04-1249-4797-9a6f-044c6245e3d7","Type":"ContainerDied","Data":"a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5"} Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.768804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-hll9d" event={"ID":"72bc4e04-1249-4797-9a6f-044c6245e3d7","Type":"ContainerDied","Data":"2f616bc927ac98df60ca6034a5f5c5e7084a353dfddaa2c2dcef6a6fdce467f7"} Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.768863 4707 scope.go:117] "RemoveContainer" containerID="a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.771593 4707 generic.go:334] "Generic (PLEG): container finished" podID="537a7434-2179-4bf0-b885-8c8a86e35ced" containerID="934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987" exitCode=0 Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.771642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" event={"ID":"537a7434-2179-4bf0-b885-8c8a86e35ced","Type":"ContainerDied","Data":"934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987"} Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.771666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" event={"ID":"537a7434-2179-4bf0-b885-8c8a86e35ced","Type":"ContainerDied","Data":"83c67a89620d9f233693f2c2d4a092d23b2a6fcd186855e4105b85e1fdcd38d0"} Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.771710 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.801695 4707 scope.go:117] "RemoveContainer" containerID="a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5" Jan 21 17:09:23 crc kubenswrapper[4707]: E0121 17:09:23.804854 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5\": container with ID starting with a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5 not found: ID does not exist" containerID="a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.804880 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5"} err="failed to get container status \"a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5\": rpc error: code = NotFound desc = could not find container \"a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5\": container with ID starting with a1088019b8a676afc8c0d7c820a9a47692a56b8b9df0610d0ecbd33d9ddd3fd5 not found: ID does not exist" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.804894 4707 scope.go:117] "RemoveContainer" containerID="934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.812115 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq"] Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.815364 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-58788bc8df-69rmq"] Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.827709 4707 scope.go:117] "RemoveContainer" containerID="934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987" Jan 21 17:09:23 crc kubenswrapper[4707]: E0121 17:09:23.831484 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987\": container with ID starting with 934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987 not found: ID does not exist" containerID="934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.831522 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987"} err="failed to get container status \"934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987\": rpc error: code = NotFound desc = could not find container \"934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987\": container with ID starting with 934f80b82bc991f6d74ce4132fb35cbc4620ad4c6a23ec8f4f79c29308279987 not found: ID does not exist" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.856762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcpwp\" (UniqueName: \"kubernetes.io/projected/72bc4e04-1249-4797-9a6f-044c6245e3d7-kube-api-access-kcpwp\") pod \"72bc4e04-1249-4797-9a6f-044c6245e3d7\" (UID: \"72bc4e04-1249-4797-9a6f-044c6245e3d7\") " Jan 21 17:09:23 crc 
kubenswrapper[4707]: I0121 17:09:23.859131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bc4e04-1249-4797-9a6f-044c6245e3d7-kube-api-access-kcpwp" (OuterVolumeSpecName: "kube-api-access-kcpwp") pod "72bc4e04-1249-4797-9a6f-044c6245e3d7" (UID: "72bc4e04-1249-4797-9a6f-044c6245e3d7"). InnerVolumeSpecName "kube-api-access-kcpwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:23 crc kubenswrapper[4707]: I0121 17:09:23.958431 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcpwp\" (UniqueName: \"kubernetes.io/projected/72bc4e04-1249-4797-9a6f-044c6245e3d7-kube-api-access-kcpwp\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.104192 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-hll9d"] Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.108322 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-hll9d"] Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.805397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf"] Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.805573 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" podUID="810dd4cc-c978-43bd-a4e6-35f378338c0d" containerName="manager" containerID="cri-o://18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1" gracePeriod=10 Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.971471 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-sfqx5"] Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.971660 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-sfqx5" podUID="3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" containerName="registry-server" containerID="cri-o://3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b" gracePeriod=30 Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.994880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m"] Jan 21 17:09:24 crc kubenswrapper[4707]: I0121 17:09:24.998484 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bmch6m"] Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.190227 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244091ea-ba85-4055-a3cb-454b51a4cad7" path="/var/lib/kubelet/pods/244091ea-ba85-4055-a3cb-454b51a4cad7/volumes" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.191100 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537a7434-2179-4bf0-b885-8c8a86e35ced" path="/var/lib/kubelet/pods/537a7434-2179-4bf0-b885-8c8a86e35ced/volumes" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.191484 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72bc4e04-1249-4797-9a6f-044c6245e3d7" path="/var/lib/kubelet/pods/72bc4e04-1249-4797-9a6f-044c6245e3d7/volumes" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.192280 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.192357 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f90da28a-77de-44e6-ae52-10f83fdcbe73" path="/var/lib/kubelet/pods/f90da28a-77de-44e6-ae52-10f83fdcbe73/volumes" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.275096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d262t\" (UniqueName: \"kubernetes.io/projected/810dd4cc-c978-43bd-a4e6-35f378338c0d-kube-api-access-d262t\") pod \"810dd4cc-c978-43bd-a4e6-35f378338c0d\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.275844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-webhook-cert\") pod \"810dd4cc-c978-43bd-a4e6-35f378338c0d\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.275949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-apiservice-cert\") pod \"810dd4cc-c978-43bd-a4e6-35f378338c0d\" (UID: \"810dd4cc-c978-43bd-a4e6-35f378338c0d\") " Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.279547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810dd4cc-c978-43bd-a4e6-35f378338c0d-kube-api-access-d262t" (OuterVolumeSpecName: "kube-api-access-d262t") pod "810dd4cc-c978-43bd-a4e6-35f378338c0d" (UID: "810dd4cc-c978-43bd-a4e6-35f378338c0d"). InnerVolumeSpecName "kube-api-access-d262t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.280129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "810dd4cc-c978-43bd-a4e6-35f378338c0d" (UID: "810dd4cc-c978-43bd-a4e6-35f378338c0d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.280194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "810dd4cc-c978-43bd-a4e6-35f378338c0d" (UID: "810dd4cc-c978-43bd-a4e6-35f378338c0d"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.336242 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.377483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxm2k\" (UniqueName: \"kubernetes.io/projected/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e-kube-api-access-lxm2k\") pod \"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e\" (UID: \"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e\") " Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.377867 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d262t\" (UniqueName: \"kubernetes.io/projected/810dd4cc-c978-43bd-a4e6-35f378338c0d-kube-api-access-d262t\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.377887 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.377896 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/810dd4cc-c978-43bd-a4e6-35f378338c0d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.380009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e-kube-api-access-lxm2k" (OuterVolumeSpecName: "kube-api-access-lxm2k") pod "3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" (UID: "3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e"). InnerVolumeSpecName "kube-api-access-lxm2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.478745 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxm2k\" (UniqueName: \"kubernetes.io/projected/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e-kube-api-access-lxm2k\") on node \"crc\" DevicePath \"\"" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.785973 4707 generic.go:334] "Generic (PLEG): container finished" podID="810dd4cc-c978-43bd-a4e6-35f378338c0d" containerID="18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1" exitCode=0 Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.786016 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.786019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" event={"ID":"810dd4cc-c978-43bd-a4e6-35f378338c0d","Type":"ContainerDied","Data":"18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1"} Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.786066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf" event={"ID":"810dd4cc-c978-43bd-a4e6-35f378338c0d","Type":"ContainerDied","Data":"de32cebb8007fd7a0a915848ddd9d700599195b35b859aef7f0955be4db2b23a"} Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.786101 4707 scope.go:117] "RemoveContainer" containerID="18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.787440 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" containerID="3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b" exitCode=0 Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.787462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sfqx5" event={"ID":"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e","Type":"ContainerDied","Data":"3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b"} Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.787478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sfqx5" event={"ID":"3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e","Type":"ContainerDied","Data":"da0649089a4225d59ef3b3fecdbd40cf1a8ab056d814384c1ec8470037b60668"} Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.787489 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-sfqx5" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.801898 4707 scope.go:117] "RemoveContainer" containerID="18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1" Jan 21 17:09:25 crc kubenswrapper[4707]: E0121 17:09:25.802178 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1\": container with ID starting with 18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1 not found: ID does not exist" containerID="18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.802206 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1"} err="failed to get container status \"18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1\": rpc error: code = NotFound desc = could not find container \"18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1\": container with ID starting with 18853588416c624a9c418b4e6e154599a3f37ee4578b082ee0c58f55113620f1 not found: ID does not exist" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.802220 4707 scope.go:117] "RemoveContainer" containerID="3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.808043 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-sfqx5"] Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.814184 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-sfqx5"] Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.819348 4707 scope.go:117] "RemoveContainer" containerID="3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b" Jan 21 17:09:25 crc kubenswrapper[4707]: E0121 17:09:25.819706 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b\": container with ID starting with 3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b not found: ID does not exist" containerID="3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.819734 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b"} err="failed to get container status \"3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b\": rpc error: code = NotFound desc = could not find container \"3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b\": container with ID starting with 3a70a9c3ad776d6eedcc8293369fafaff4bf0114a05fc0f07c9cacdd81bd1b4b not found: ID does not exist" Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.821549 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf"] Jan 21 17:09:25 crc kubenswrapper[4707]: I0121 17:09:25.825924 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566b5f8c4c-9nxgf"] Jan 21 17:09:26 crc kubenswrapper[4707]: I0121 17:09:26.182630 4707 
scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:09:26 crc kubenswrapper[4707]: E0121 17:09:26.182907 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:09:27 crc kubenswrapper[4707]: I0121 17:09:27.188935 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" path="/var/lib/kubelet/pods/3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e/volumes" Jan 21 17:09:27 crc kubenswrapper[4707]: I0121 17:09:27.189705 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810dd4cc-c978-43bd-a4e6-35f378338c0d" path="/var/lib/kubelet/pods/810dd4cc-c978-43bd-a4e6-35f378338c0d/volumes" Jan 21 17:09:27 crc kubenswrapper[4707]: E0121 17:09:27.830397 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54027f2_5423_4434_90dc_9cf9d9da63d7.slice/crio-95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21\": RecentStats: unable to find data in memory cache]" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.233402 4707 scope.go:117] "RemoveContainer" containerID="145ffa0e7c42bbfe06c805e5cc6e7c6bcf75fe630f7c172e19b2bcacdae73b01" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.253422 4707 scope.go:117] "RemoveContainer" containerID="6be84cfbe1530b64ec0502ae2b88e99fb02da4152ffeb9a1377d51d9cf583d03" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.269664 4707 scope.go:117] "RemoveContainer" containerID="7a1d61ffec3eca3b78decbadd6c286d9e0f4635ab8604112d41dc7c30fc4fe9b" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.283621 4707 scope.go:117] "RemoveContainer" containerID="5430702e1379ee18a0761e8f5d2a1a9dfbacc8275de3071c10c3a4dac2f3229e" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.300846 4707 scope.go:117] "RemoveContainer" containerID="5c0d24bd39f259526cfb4f3db26bbd70d2a3155cb4d5d3fd571e29a3554d1e23" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.316710 4707 scope.go:117] "RemoveContainer" containerID="de411360bc4b5977334de837fa6bdcb9017972d0b0d4816e79f3bdd450162c29" Jan 21 17:09:28 crc kubenswrapper[4707]: I0121 17:09:28.331368 4707 scope.go:117] "RemoveContainer" containerID="e9e4b3e25a818ce85b0b358bb9e60401b6b85c6eea5a42a8c798d06fc5ed35fd" Jan 21 17:09:37 crc kubenswrapper[4707]: E0121 17:09:37.944273 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54027f2_5423_4434_90dc_9cf9d9da63d7.slice/crio-95a8aa1de37654edf73cc749f013058abe5c372f667cb22a6c5b3cd863423e21\": RecentStats: unable to find data in memory cache]" Jan 21 17:09:39 crc kubenswrapper[4707]: I0121 17:09:39.182784 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:09:39 crc kubenswrapper[4707]: E0121 17:09:39.183017 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021327 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m6shh/must-gather-xzrlc"] Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021789 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021824 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021841 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1d6c5f-e5a4-4d85-9526-499182972d3b" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1d6c5f-e5a4-4d85-9526-499182972d3b" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021859 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerName="mysql-bootstrap" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021866 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerName="mysql-bootstrap" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021876 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f47560-0311-4a69-be7f-e57c1b043c06" containerName="operator" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021882 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f47560-0311-4a69-be7f-e57c1b043c06" containerName="operator" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021889 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021894 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021902 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21181af0-ee46-42db-a058-42ed66488786" containerName="keystone-api" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021907 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="21181af0-ee46-42db-a058-42ed66488786" containerName="keystone-api" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91ad4c0-4328-4242-bf29-575b12865e63" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021922 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91ad4c0-4328-4242-bf29-575b12865e63" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021928 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021933 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021942 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021947 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537a7434-2179-4bf0-b885-8c8a86e35ced" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021963 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="537a7434-2179-4bf0-b885-8c8a86e35ced" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810dd4cc-c978-43bd-a4e6-35f378338c0d" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021975 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="810dd4cc-c978-43bd-a4e6-35f378338c0d" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021981 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" containerName="mysql-bootstrap" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021986 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" containerName="mysql-bootstrap" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.021992 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd98a54-a0ff-4a65-b77c-f9cc53537670" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.021997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd98a54-a0ff-4a65-b77c-f9cc53537670" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022010 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerName="mariadb-account-delete" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022015 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerName="mariadb-account-delete" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022022 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerName="mariadb-account-delete" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022027 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerName="mariadb-account-delete" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022034 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bc4e04-1249-4797-9a6f-044c6245e3d7" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bc4e04-1249-4797-9a6f-044c6245e3d7" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022046 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerName="rabbitmq" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022052 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerName="rabbitmq" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022058 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerName="mysql-bootstrap" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022062 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerName="mysql-bootstrap" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022070 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078cfcac-e384-409d-8b67-d0b56e051b7f" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022075 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="078cfcac-e384-409d-8b67-d0b56e051b7f" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022084 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022090 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022097 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6162703a-4327-49b8-a9fe-4fb97871dc5d" containerName="memcached" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022102 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6162703a-4327-49b8-a9fe-4fb97871dc5d" containerName="memcached" Jan 21 17:09:40 crc kubenswrapper[4707]: E0121 17:09:40.022111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerName="setup-container" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022116 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerName="setup-container" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022247 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6054a174-42ee-4839-a37a-677ac9a78b68" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022255 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="21181af0-ee46-42db-a058-42ed66488786" containerName="keystone-api" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022263 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6f012f-dbc1-4ad9-99ef-e9c2aee8018e" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022273 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e6dbe8-c2a6-4589-88cb-e16c1b718141" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022280 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b4c43f5-dc57-473d-985e-7dc44e6f06c9" containerName="rabbitmq" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022286 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd98a54-a0ff-4a65-b77c-f9cc53537670" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022296 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerName="mariadb-account-delete" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022302 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dc30e4fe-f8b3-42cc-a6bb-c8b5c98b7a39" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022310 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f47560-0311-4a69-be7f-e57c1b043c06" containerName="operator" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c5cc72-daa5-42e2-91af-5af2f1ab1c0c" containerName="galera" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022326 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91ad4c0-4328-4242-bf29-575b12865e63" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="537a7434-2179-4bf0-b885-8c8a86e35ced" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022340 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6162703a-4327-49b8-a9fe-4fb97871dc5d" containerName="memcached" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022348 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="078cfcac-e384-409d-8b67-d0b56e051b7f" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="810dd4cc-c978-43bd-a4e6-35f378338c0d" containerName="manager" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022363 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bc4e04-1249-4797-9a6f-044c6245e3d7" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022373 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c79dd94-2afa-41e0-bc1b-6aeb3ce46b89" containerName="mariadb-account-delete" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.022382 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1d6c5f-e5a4-4d85-9526-499182972d3b" containerName="registry-server" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.023017 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.025592 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m6shh"/"openshift-service-ca.crt" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.025609 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m6shh"/"default-dockercfg-hw4qx" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.025745 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m6shh"/"kube-root-ca.crt" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.033201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m6shh/must-gather-xzrlc"] Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.148224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktff\" (UniqueName: \"kubernetes.io/projected/d9f91686-496a-47df-a637-35b981e766d2-kube-api-access-7ktff\") pod \"must-gather-xzrlc\" (UID: \"d9f91686-496a-47df-a637-35b981e766d2\") " pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.148306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9f91686-496a-47df-a637-35b981e766d2-must-gather-output\") pod \"must-gather-xzrlc\" (UID: \"d9f91686-496a-47df-a637-35b981e766d2\") " pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.249961 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9f91686-496a-47df-a637-35b981e766d2-must-gather-output\") pod \"must-gather-xzrlc\" (UID: \"d9f91686-496a-47df-a637-35b981e766d2\") " pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.250092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktff\" (UniqueName: \"kubernetes.io/projected/d9f91686-496a-47df-a637-35b981e766d2-kube-api-access-7ktff\") pod \"must-gather-xzrlc\" (UID: \"d9f91686-496a-47df-a637-35b981e766d2\") " pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.250781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9f91686-496a-47df-a637-35b981e766d2-must-gather-output\") pod \"must-gather-xzrlc\" (UID: \"d9f91686-496a-47df-a637-35b981e766d2\") " pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.263901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktff\" (UniqueName: \"kubernetes.io/projected/d9f91686-496a-47df-a637-35b981e766d2-kube-api-access-7ktff\") pod \"must-gather-xzrlc\" (UID: \"d9f91686-496a-47df-a637-35b981e766d2\") " pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.337716 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m6shh/must-gather-xzrlc" Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.680798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m6shh/must-gather-xzrlc"] Jan 21 17:09:40 crc kubenswrapper[4707]: W0121 17:09:40.684990 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9f91686_496a_47df_a637_35b981e766d2.slice/crio-0edec1973ec066a7c6c4ee9637fa1fa41706472236726a1ab332144955057738 WatchSource:0}: Error finding container 0edec1973ec066a7c6c4ee9637fa1fa41706472236726a1ab332144955057738: Status 404 returned error can't find the container with id 0edec1973ec066a7c6c4ee9637fa1fa41706472236726a1ab332144955057738 Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.686507 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:09:40 crc kubenswrapper[4707]: I0121 17:09:40.875535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6shh/must-gather-xzrlc" event={"ID":"d9f91686-496a-47df-a637-35b981e766d2","Type":"ContainerStarted","Data":"0edec1973ec066a7c6c4ee9637fa1fa41706472236726a1ab332144955057738"} Jan 21 17:09:45 crc kubenswrapper[4707]: I0121 17:09:45.900865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6shh/must-gather-xzrlc" event={"ID":"d9f91686-496a-47df-a637-35b981e766d2","Type":"ContainerStarted","Data":"035a02cf544fed4046a69ea1739814797a7406ccd1bd980deaf81031b99ae39d"} Jan 21 17:09:45 crc kubenswrapper[4707]: I0121 17:09:45.901225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6shh/must-gather-xzrlc" event={"ID":"d9f91686-496a-47df-a637-35b981e766d2","Type":"ContainerStarted","Data":"92d3e9f7de543e191048bfdc7d06c1e3569ef2f428c145523f3c8fdf3cfa6da3"} Jan 21 17:09:45 crc kubenswrapper[4707]: I0121 17:09:45.913091 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m6shh/must-gather-xzrlc" podStartSLOduration=1.81994429 podStartE2EDuration="5.913076962s" podCreationTimestamp="2026-01-21 17:09:40 +0000 UTC" firstStartedPulling="2026-01-21 17:09:40.686308066 +0000 UTC m=+7677.867824288" lastFinishedPulling="2026-01-21 17:09:44.779440738 +0000 UTC m=+7681.960956960" observedRunningTime="2026-01-21 17:09:45.911252191 +0000 UTC m=+7683.092768413" watchObservedRunningTime="2026-01-21 17:09:45.913076962 +0000 UTC m=+7683.094593184" Jan 21 17:09:54 crc kubenswrapper[4707]: I0121 17:09:54.182185 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:09:54 crc kubenswrapper[4707]: E0121 17:09:54.182695 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:10:05 crc kubenswrapper[4707]: I0121 17:10:05.183614 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:10:05 crc kubenswrapper[4707]: E0121 17:10:05.185294 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:10:11 crc kubenswrapper[4707]: I0121 17:10:11.782384 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/controller/0.log" Jan 21 17:10:11 crc kubenswrapper[4707]: I0121 17:10:11.788308 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/kube-rbac-proxy/0.log" Jan 21 17:10:11 crc kubenswrapper[4707]: I0121 17:10:11.802951 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/controller/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.801392 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.810003 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/reloader/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.814075 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr-metrics/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.818936 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.825441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy-frr/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.830062 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-frr-files/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.836563 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-reloader/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.841483 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-metrics/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.849078 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-czn2c_a50f5957-c813-4be6-846c-18fa67c16942/frr-k8s-webhook-server/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.864189 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769647f86-fhptv_3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78/manager/0.log" Jan 21 17:10:12 crc kubenswrapper[4707]: I0121 17:10:12.870759 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58ff8dbc9-jm9z5_560343d7-3dae-42d2-b79f-e7ddbde3df71/webhook-server/0.log" Jan 21 17:10:13 crc kubenswrapper[4707]: I0121 17:10:13.865227 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/speaker/0.log" Jan 21 17:10:13 crc kubenswrapper[4707]: I0121 17:10:13.881825 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/kube-rbac-proxy/0.log" Jan 21 17:10:16 crc kubenswrapper[4707]: I0121 17:10:16.182790 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:10:16 crc kubenswrapper[4707]: E0121 17:10:16.183732 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:10:19 crc kubenswrapper[4707]: I0121 17:10:19.870295 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bjb5q_4a760eb2-4041-468b-a269-13e523796dd0/control-plane-machine-set-operator/0.log" Jan 21 17:10:19 crc kubenswrapper[4707]: I0121 17:10:19.879647 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/kube-rbac-proxy/0.log" Jan 21 17:10:19 crc kubenswrapper[4707]: I0121 17:10:19.886013 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/machine-api-operator/0.log" Jan 21 17:10:23 crc kubenswrapper[4707]: I0121 17:10:23.527726 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pdx6x_34c2061b-4028-4e99-9345-d8443062558a/cert-manager-controller/0.log" Jan 21 17:10:23 crc kubenswrapper[4707]: I0121 17:10:23.590137 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-tbzv6_18724316-30e9-4c09-baa1-d3d3971f97de/cert-manager-cainjector/0.log" Jan 21 17:10:23 crc kubenswrapper[4707]: I0121 17:10:23.600450 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p7vr2_d59f005b-d150-4190-b8ca-d757c89269f4/cert-manager-webhook/0.log" Jan 21 17:10:26 crc kubenswrapper[4707]: I0121 17:10:26.974299 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-g8mcx_7bf6f36b-51bd-4196-b732-7807475b38b2/nmstate-console-plugin/0.log" Jan 21 17:10:26 crc kubenswrapper[4707]: I0121 17:10:26.989651 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m4cqg_72f87fd4-c9ff-4410-ba80-fdc2525a37b8/nmstate-handler/0.log" Jan 21 17:10:27 crc kubenswrapper[4707]: I0121 17:10:26.998439 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/nmstate-metrics/0.log" Jan 21 17:10:27 crc kubenswrapper[4707]: I0121 17:10:27.011444 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/kube-rbac-proxy/0.log" Jan 21 17:10:27 crc kubenswrapper[4707]: I0121 17:10:27.027281 4707 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-kvjjm_c7e81409-5fc8-4b0d-9c98-dfaf6167892d/nmstate-operator/0.log" Jan 21 17:10:27 crc kubenswrapper[4707]: I0121 17:10:27.035285 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-64vq9_3dc50ee6-0d24-4790-a13d-caaf61114685/nmstate-webhook/0.log" Jan 21 17:10:27 crc kubenswrapper[4707]: I0121 17:10:27.182993 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:10:27 crc kubenswrapper[4707]: E0121 17:10:27.183235 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:10:28 crc kubenswrapper[4707]: I0121 17:10:28.529775 4707 scope.go:117] "RemoveContainer" containerID="4e401ca7aa762101df040b8d2afbe0a82baaaa39745e6ba0b41cea619e616331" Jan 21 17:10:28 crc kubenswrapper[4707]: I0121 17:10:28.544913 4707 scope.go:117] "RemoveContainer" containerID="b1742e3fe2d246b1b73dd8656a6c0c2474a586cd4b0f63b19c95105f28905270" Jan 21 17:10:28 crc kubenswrapper[4707]: I0121 17:10:28.561633 4707 scope.go:117] "RemoveContainer" containerID="22d3f117f59a14193a24a97ee76c8463cb41e4d8634d5ba0062e0b55e7d68d91" Jan 21 17:10:30 crc kubenswrapper[4707]: I0121 17:10:30.528884 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vcvdr_b57455b4-f5cc-4e50-972f-299a31cc4eba/prometheus-operator/0.log" Jan 21 17:10:30 crc kubenswrapper[4707]: I0121 17:10:30.546163 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc_50a8c4cf-f6ea-4042-900f-37b63445f2d3/prometheus-operator-admission-webhook/0.log" Jan 21 17:10:30 crc kubenswrapper[4707]: I0121 17:10:30.557560 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc_f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d/prometheus-operator-admission-webhook/0.log" Jan 21 17:10:30 crc kubenswrapper[4707]: I0121 17:10:30.588548 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hc5fq_de08c513-221d-41e8-a1b3-fc4b21d19173/operator/0.log" Jan 21 17:10:30 crc kubenswrapper[4707]: I0121 17:10:30.597302 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n7h88_a77834d8-cde4-449d-b56f-6d3b53b0cadd/perses-operator/0.log" Jan 21 17:10:34 crc kubenswrapper[4707]: I0121 17:10:34.421276 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/controller/0.log" Jan 21 17:10:34 crc kubenswrapper[4707]: I0121 17:10:34.426980 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/kube-rbac-proxy/0.log" Jan 21 17:10:34 crc kubenswrapper[4707]: I0121 17:10:34.442128 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/controller/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.414469 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.420769 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/reloader/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.432244 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr-metrics/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.438364 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.443440 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy-frr/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.447899 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-frr-files/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.456101 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-reloader/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.461431 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-metrics/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.468800 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-czn2c_a50f5957-c813-4be6-846c-18fa67c16942/frr-k8s-webhook-server/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.490419 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769647f86-fhptv_3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78/manager/0.log" Jan 21 17:10:35 crc kubenswrapper[4707]: I0121 17:10:35.498869 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58ff8dbc9-jm9z5_560343d7-3dae-42d2-b79f-e7ddbde3df71/webhook-server/0.log" Jan 21 17:10:36 crc kubenswrapper[4707]: I0121 17:10:36.458940 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/speaker/0.log" Jan 21 17:10:36 crc kubenswrapper[4707]: I0121 17:10:36.468532 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/kube-rbac-proxy/0.log" Jan 21 17:10:42 crc kubenswrapper[4707]: I0121 17:10:42.182142 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:10:43 crc kubenswrapper[4707]: I0121 17:10:43.188915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"68af8187b5691bda3dd4143031fccb3c6201be65d4e73f52582877d190385c13"} Jan 21 17:10:49 crc kubenswrapper[4707]: I0121 17:10:49.167601 4707 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-55945_375fc46a-93db-46c4-9f95-b0f17f2e3c1c/dnsmasq-dns/0.log" Jan 21 17:10:49 crc kubenswrapper[4707]: I0121 17:10:49.175758 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-55945_375fc46a-93db-46c4-9f95-b0f17f2e3c1c/init/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.418141 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k_5ee91b4f-b397-4648-a2ab-dbf5699be347/extract/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.424818 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k_5ee91b4f-b397-4648-a2ab-dbf5699be347/util/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.444742 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k_5ee91b4f-b397-4648-a2ab-dbf5699be347/pull/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.454555 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq_ace82b3e-606d-4deb-ab5f-04fd56c3d8d3/extract/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.460181 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq_ace82b3e-606d-4deb-ab5f-04fd56c3d8d3/util/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.469091 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq_ace82b3e-606d-4deb-ab5f-04fd56c3d8d3/pull/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.476369 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj_4c693f1a-3959-41d4-8209-ff55cf48b2ea/extract/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.483587 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj_4c693f1a-3959-41d4-8209-ff55cf48b2ea/util/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.489470 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj_4c693f1a-3959-41d4-8209-ff55cf48b2ea/pull/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.502831 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh_d5e44698-abdb-4d2b-8058-ec06409cd1d3/extract/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.507484 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh_d5e44698-abdb-4d2b-8058-ec06409cd1d3/util/0.log" Jan 21 17:10:53 crc kubenswrapper[4707]: I0121 17:10:53.520176 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh_d5e44698-abdb-4d2b-8058-ec06409cd1d3/pull/0.log" Jan 21 17:10:54 crc kubenswrapper[4707]: I0121 17:10:54.454796 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9gkb_22983f7b-d717-456f-9efe-3ff6a7342a68/registry-server/0.log" Jan 21 17:10:54 crc kubenswrapper[4707]: I0121 17:10:54.462069 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9gkb_22983f7b-d717-456f-9efe-3ff6a7342a68/extract-utilities/0.log" Jan 21 17:10:54 crc kubenswrapper[4707]: I0121 17:10:54.470780 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9gkb_22983f7b-d717-456f-9efe-3ff6a7342a68/extract-content/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.605944 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s8jx4_d67efa67-67d4-4f70-ac0e-cdbd6f76d739/registry-server/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.611613 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s8jx4_d67efa67-67d4-4f70-ac0e-cdbd6f76d739/extract-utilities/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.617853 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s8jx4_d67efa67-67d4-4f70-ac0e-cdbd6f76d739/extract-content/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.635604 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xcp6l_654186e4-d936-41b5-8967-fb853141daf4/marketplace-operator/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.886615 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mjbnh_978a715c-0a07-4197-9617-ae4c81c49d34/registry-server/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.891043 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mjbnh_978a715c-0a07-4197-9617-ae4c81c49d34/extract-utilities/0.log" Jan 21 17:10:55 crc kubenswrapper[4707]: I0121 17:10:55.897606 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mjbnh_978a715c-0a07-4197-9617-ae4c81c49d34/extract-content/0.log" Jan 21 17:10:56 crc kubenswrapper[4707]: I0121 17:10:56.702580 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5qt95_ba1d939f-e4be-4740-83ef-b93efb9d0db8/registry-server/0.log" Jan 21 17:10:56 crc kubenswrapper[4707]: I0121 17:10:56.710554 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5qt95_ba1d939f-e4be-4740-83ef-b93efb9d0db8/extract-utilities/0.log" Jan 21 17:10:56 crc kubenswrapper[4707]: I0121 17:10:56.771227 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5qt95_ba1d939f-e4be-4740-83ef-b93efb9d0db8/extract-content/0.log" Jan 21 17:10:58 crc kubenswrapper[4707]: I0121 17:10:58.435399 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vcvdr_b57455b4-f5cc-4e50-972f-299a31cc4eba/prometheus-operator/0.log" Jan 21 17:10:58 crc kubenswrapper[4707]: I0121 17:10:58.447637 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc_50a8c4cf-f6ea-4042-900f-37b63445f2d3/prometheus-operator-admission-webhook/0.log" Jan 21 17:10:58 crc kubenswrapper[4707]: I0121 17:10:58.456877 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc_f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d/prometheus-operator-admission-webhook/0.log" Jan 21 17:10:58 crc kubenswrapper[4707]: I0121 17:10:58.476896 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hc5fq_de08c513-221d-41e8-a1b3-fc4b21d19173/operator/0.log" Jan 21 17:10:58 crc kubenswrapper[4707]: I0121 17:10:58.483440 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n7h88_a77834d8-cde4-449d-b56f-6d3b53b0cadd/perses-operator/0.log" Jan 21 17:11:28 crc kubenswrapper[4707]: I0121 17:11:28.594141 4707 scope.go:117] "RemoveContainer" containerID="c74c25d17159c498403d8dabd859fc21143ab92f7314fce98e3db7ce8cd5f02d" Jan 21 17:11:28 crc kubenswrapper[4707]: I0121 17:11:28.607896 4707 scope.go:117] "RemoveContainer" containerID="c26ede1b29efd1dfd2feab9989dd9a2dab316515b373c3c5d2d638ed0c7366f3" Jan 21 17:11:28 crc kubenswrapper[4707]: I0121 17:11:28.630793 4707 scope.go:117] "RemoveContainer" containerID="2866fe8ba38998023e4ddf912844931732fa88e9b8b341655b0e3e49e864cb32" Jan 21 17:11:28 crc kubenswrapper[4707]: I0121 17:11:28.643868 4707 scope.go:117] "RemoveContainer" containerID="089942a08c5c5deaf174cd16157859c8c34fefded1588a5c4568191b9ba1722b" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.470801 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vcvdr_b57455b4-f5cc-4e50-972f-299a31cc4eba/prometheus-operator/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.478199 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc_50a8c4cf-f6ea-4042-900f-37b63445f2d3/prometheus-operator-admission-webhook/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.484933 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc_f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d/prometheus-operator-admission-webhook/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.503214 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hc5fq_de08c513-221d-41e8-a1b3-fc4b21d19173/operator/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.511506 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n7h88_a77834d8-cde4-449d-b56f-6d3b53b0cadd/perses-operator/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.784602 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pdx6x_34c2061b-4028-4e99-9345-d8443062558a/cert-manager-controller/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.840899 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-tbzv6_18724316-30e9-4c09-baa1-d3d3971f97de/cert-manager-cainjector/0.log" Jan 21 17:12:21 crc kubenswrapper[4707]: I0121 17:12:21.854936 4707 
log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p7vr2_d59f005b-d150-4190-b8ca-d757c89269f4/cert-manager-webhook/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.432889 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-g8mcx_7bf6f36b-51bd-4196-b732-7807475b38b2/nmstate-console-plugin/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.449612 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m4cqg_72f87fd4-c9ff-4410-ba80-fdc2525a37b8/nmstate-handler/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.456965 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/nmstate-metrics/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.457286 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/controller/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.462692 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/kube-rbac-proxy/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.463955 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/kube-rbac-proxy/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.473212 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-kvjjm_c7e81409-5fc8-4b0d-9c98-dfaf6167892d/nmstate-operator/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.477863 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/controller/0.log" Jan 21 17:12:22 crc kubenswrapper[4707]: I0121 17:12:22.483017 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-64vq9_3dc50ee6-0d24-4790-a13d-caaf61114685/nmstate-webhook/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.415271 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.422353 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/reloader/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.427767 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr-metrics/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.433839 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.438738 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy-frr/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.444614 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-frr-files/0.log" Jan 21 
17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.449924 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-reloader/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.456421 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-metrics/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.481344 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-czn2c_a50f5957-c813-4be6-846c-18fa67c16942/frr-k8s-webhook-server/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.501158 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769647f86-fhptv_3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78/manager/0.log" Jan 21 17:12:23 crc kubenswrapper[4707]: I0121 17:12:23.508345 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58ff8dbc9-jm9z5_560343d7-3dae-42d2-b79f-e7ddbde3df71/webhook-server/0.log" Jan 21 17:12:24 crc kubenswrapper[4707]: I0121 17:12:24.588263 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/speaker/0.log" Jan 21 17:12:24 crc kubenswrapper[4707]: I0121 17:12:24.606605 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/kube-rbac-proxy/0.log" Jan 21 17:12:25 crc kubenswrapper[4707]: I0121 17:12:25.459771 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pdx6x_34c2061b-4028-4e99-9345-d8443062558a/cert-manager-controller/0.log" Jan 21 17:12:25 crc kubenswrapper[4707]: I0121 17:12:25.519706 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-tbzv6_18724316-30e9-4c09-baa1-d3d3971f97de/cert-manager-cainjector/0.log" Jan 21 17:12:25 crc kubenswrapper[4707]: I0121 17:12:25.650497 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p7vr2_d59f005b-d150-4190-b8ca-d757c89269f4/cert-manager-webhook/0.log" Jan 21 17:12:26 crc kubenswrapper[4707]: I0121 17:12:26.121636 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bjb5q_4a760eb2-4041-468b-a269-13e523796dd0/control-plane-machine-set-operator/0.log" Jan 21 17:12:26 crc kubenswrapper[4707]: I0121 17:12:26.129748 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/kube-rbac-proxy/0.log" Jan 21 17:12:26 crc kubenswrapper[4707]: I0121 17:12:26.136734 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/machine-api-operator/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.246299 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/kube-multus-additional-cni-plugins/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.253334 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/egress-router-binary-copy/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.258526 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/cni-plugins/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.263429 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/bond-cni-plugin/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.269383 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/routeoverride-cni/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.273918 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/whereabouts-cni-bincopy/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.279204 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/whereabouts-cni/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.349027 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-kjvk4_e82fcc54-393c-4981-a666-769ed5c9c92d/multus-admission-controller/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.352962 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-kjvk4_e82fcc54-393c-4981-a666-769ed5c9c92d/kube-rbac-proxy/0.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.402947 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/2.log" Jan 21 17:12:27 crc kubenswrapper[4707]: I0121 17:12:27.841357 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/3.log" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.000894 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62mww_d5fb5fe4-8f42-4057-b731-b2c8da0661e3/network-metrics-daemon/0.log" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.005285 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62mww_d5fb5fe4-8f42-4057-b731-b2c8da0661e3/kube-rbac-proxy/0.log" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.682014 4707 scope.go:117] "RemoveContainer" containerID="c57f0d5540e46c39ab89fecce89278ed46c1ac97f20e0dddf5274613dcc4f070" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.697966 4707 scope.go:117] "RemoveContainer" containerID="cc1d61364a44337e82b6acd12557c97164f790996b3959263283a0802538686c" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.714567 4707 scope.go:117] "RemoveContainer" containerID="0fea3f7b80f50927b508dc1ab9d8241266e410660366ad4561107118d2af4d1e" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.731262 4707 scope.go:117] "RemoveContainer" containerID="cb9ed4451f264f7207558f19fece8b6915bd3bd1e2a3ac5fb38bd23622dc01a2" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.754736 4707 
scope.go:117] "RemoveContainer" containerID="f950f73b230861e22d259c79f4956ec775dbee302d232a9eb474136f88ff6c4a" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.783270 4707 scope.go:117] "RemoveContainer" containerID="9f1dc836d81ce57b3685d4f4b305ccb9350965ed3e25cd4384b794041ec9f15a" Jan 21 17:12:28 crc kubenswrapper[4707]: I0121 17:12:28.802978 4707 scope.go:117] "RemoveContainer" containerID="0fc5398554874817d4e2bc028cdbab2d2ac0dba239b2a72883da545b4d9191ef" Jan 21 17:13:09 crc kubenswrapper[4707]: I0121 17:13:09.945821 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:13:09 crc kubenswrapper[4707]: I0121 17:13:09.946182 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:13:39 crc kubenswrapper[4707]: I0121 17:13:39.946080 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:13:39 crc kubenswrapper[4707]: I0121 17:13:39.946419 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:14:09 crc kubenswrapper[4707]: I0121 17:14:09.945127 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:14:09 crc kubenswrapper[4707]: I0121 17:14:09.945448 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:14:09 crc kubenswrapper[4707]: I0121 17:14:09.945487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:14:09 crc kubenswrapper[4707]: I0121 17:14:09.945997 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68af8187b5691bda3dd4143031fccb3c6201be65d4e73f52582877d190385c13"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:14:09 crc kubenswrapper[4707]: I0121 17:14:09.946042 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://68af8187b5691bda3dd4143031fccb3c6201be65d4e73f52582877d190385c13" gracePeriod=600 Jan 21 17:14:10 crc kubenswrapper[4707]: I0121 17:14:10.217244 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="68af8187b5691bda3dd4143031fccb3c6201be65d4e73f52582877d190385c13" exitCode=0 Jan 21 17:14:10 crc kubenswrapper[4707]: I0121 17:14:10.217493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"68af8187b5691bda3dd4143031fccb3c6201be65d4e73f52582877d190385c13"} Jan 21 17:14:10 crc kubenswrapper[4707]: I0121 17:14:10.217518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336"} Jan 21 17:14:10 crc kubenswrapper[4707]: I0121 17:14:10.217551 4707 scope.go:117] "RemoveContainer" containerID="610c9107290737f56e5db0d6dcff8e78002348f3c951ccdabcf928bd776ce7f6" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.734794 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cttvc"] Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.736658 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.742112 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cttvc"] Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.863095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-utilities\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.863387 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-catalog-content\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.863437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdbdp\" (UniqueName: \"kubernetes.io/projected/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-kube-api-access-fdbdp\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.964421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-catalog-content\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 
17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.964508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdbdp\" (UniqueName: \"kubernetes.io/projected/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-kube-api-access-fdbdp\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.964577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-utilities\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.964865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-catalog-content\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.964900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-utilities\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:35 crc kubenswrapper[4707]: I0121 17:14:35.985075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdbdp\" (UniqueName: \"kubernetes.io/projected/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-kube-api-access-fdbdp\") pod \"community-operators-cttvc\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:36 crc kubenswrapper[4707]: I0121 17:14:36.052687 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:36 crc kubenswrapper[4707]: I0121 17:14:36.417042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cttvc"] Jan 21 17:14:37 crc kubenswrapper[4707]: I0121 17:14:37.353779 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerID="6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887" exitCode=0 Jan 21 17:14:37 crc kubenswrapper[4707]: I0121 17:14:37.354046 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerDied","Data":"6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887"} Jan 21 17:14:37 crc kubenswrapper[4707]: I0121 17:14:37.354070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerStarted","Data":"2d594417291a436f597f96b796ffb38b083520dde6a225833c67b686931dafc6"} Jan 21 17:14:38 crc kubenswrapper[4707]: I0121 17:14:38.361015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerStarted","Data":"ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f"} Jan 21 17:14:39 crc kubenswrapper[4707]: I0121 17:14:39.365616 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerID="ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f" exitCode=0 Jan 21 17:14:39 crc kubenswrapper[4707]: I0121 17:14:39.365652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerDied","Data":"ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f"} Jan 21 17:14:40 crc kubenswrapper[4707]: I0121 17:14:40.372787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerStarted","Data":"a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08"} Jan 21 17:14:40 crc kubenswrapper[4707]: I0121 17:14:40.390326 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cttvc" podStartSLOduration=2.917840427 podStartE2EDuration="5.39031348s" podCreationTimestamp="2026-01-21 17:14:35 +0000 UTC" firstStartedPulling="2026-01-21 17:14:37.355839554 +0000 UTC m=+7974.537355775" lastFinishedPulling="2026-01-21 17:14:39.828312606 +0000 UTC m=+7977.009828828" observedRunningTime="2026-01-21 17:14:40.390062888 +0000 UTC m=+7977.571579110" watchObservedRunningTime="2026-01-21 17:14:40.39031348 +0000 UTC m=+7977.571829702" Jan 21 17:14:46 crc kubenswrapper[4707]: I0121 17:14:46.053860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:46 crc kubenswrapper[4707]: I0121 17:14:46.054401 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:46 crc kubenswrapper[4707]: I0121 17:14:46.083757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:46 crc kubenswrapper[4707]: I0121 17:14:46.446398 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:47 crc kubenswrapper[4707]: I0121 17:14:47.511876 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cttvc"] Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.423975 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cttvc" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="registry-server" containerID="cri-o://a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08" gracePeriod=2 Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.745406 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.931364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdbdp\" (UniqueName: \"kubernetes.io/projected/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-kube-api-access-fdbdp\") pod \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.931711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-utilities\") pod \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.932461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-catalog-content\") pod \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\" (UID: \"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760\") " Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.932407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-utilities" (OuterVolumeSpecName: "utilities") pod "a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" (UID: "a2fe1fe6-8c24-4b4c-897e-773cb4bb4760"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.935270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-kube-api-access-fdbdp" (OuterVolumeSpecName: "kube-api-access-fdbdp") pod "a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" (UID: "a2fe1fe6-8c24-4b4c-897e-773cb4bb4760"). InnerVolumeSpecName "kube-api-access-fdbdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.954576 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdbdp\" (UniqueName: \"kubernetes.io/projected/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-kube-api-access-fdbdp\") on node \"crc\" DevicePath \"\"" Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.954594 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:14:48 crc kubenswrapper[4707]: I0121 17:14:48.991694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" (UID: "a2fe1fe6-8c24-4b4c-897e-773cb4bb4760"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.054908 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.429794 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerID="a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08" exitCode=0 Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.429875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerDied","Data":"a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08"} Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.429918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cttvc" event={"ID":"a2fe1fe6-8c24-4b4c-897e-773cb4bb4760","Type":"ContainerDied","Data":"2d594417291a436f597f96b796ffb38b083520dde6a225833c67b686931dafc6"} Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.429938 4707 scope.go:117] "RemoveContainer" containerID="a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.430145 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cttvc" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.445152 4707 scope.go:117] "RemoveContainer" containerID="ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.449258 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cttvc"] Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.452854 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cttvc"] Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.457696 4707 scope.go:117] "RemoveContainer" containerID="6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.468436 4707 scope.go:117] "RemoveContainer" containerID="a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08" Jan 21 17:14:49 crc kubenswrapper[4707]: E0121 17:14:49.468667 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08\": container with ID starting with a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08 not found: ID does not exist" containerID="a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.468693 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08"} err="failed to get container status \"a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08\": rpc error: code = NotFound desc = could not find container \"a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08\": container with ID starting with a353d7766dbb17a2f0d4b2e91114f12840b38f525696a055f78a6c7d73efda08 not found: ID does not exist" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.468710 4707 scope.go:117] "RemoveContainer" containerID="ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f" Jan 21 17:14:49 crc kubenswrapper[4707]: E0121 17:14:49.468949 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f\": container with ID starting with ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f not found: ID does not exist" containerID="ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.468976 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f"} err="failed to get container status \"ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f\": rpc error: code = NotFound desc = could not find container \"ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f\": container with ID starting with ab88b2be64f3189a4f3f88fa580e2ef136aa996cfa63da61c07a548fd39df95f not found: ID does not exist" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.468991 4707 scope.go:117] "RemoveContainer" containerID="6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887" Jan 21 17:14:49 crc kubenswrapper[4707]: E0121 17:14:49.469235 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887\": container with ID starting with 6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887 not found: ID does not exist" containerID="6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887" Jan 21 17:14:49 crc kubenswrapper[4707]: I0121 17:14:49.469256 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887"} err="failed to get container status \"6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887\": rpc error: code = NotFound desc = could not find container \"6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887\": container with ID starting with 6e3ee5f1c008b03b52c07aa0e9a783dab2a03ababcb90274893c1d92fbe9f887 not found: ID does not exist" Jan 21 17:14:51 crc kubenswrapper[4707]: I0121 17:14:51.188273 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" path="/var/lib/kubelet/pods/a2fe1fe6-8c24-4b4c-897e-773cb4bb4760/volumes" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.144613 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc"] Jan 21 17:15:00 crc kubenswrapper[4707]: E0121 17:15:00.145143 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="extract-content" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.145174 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="extract-content" Jan 21 17:15:00 crc kubenswrapper[4707]: E0121 17:15:00.145187 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="extract-utilities" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.145194 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="extract-utilities" Jan 21 17:15:00 crc kubenswrapper[4707]: E0121 17:15:00.145205 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="registry-server" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.145211 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="registry-server" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.145313 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2fe1fe6-8c24-4b4c-897e-773cb4bb4760" containerName="registry-server" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.145731 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.147058 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.148120 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.150844 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc"] Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.278383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b280568d-b5c6-4efc-a0c1-239585fcf933-config-volume\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.278742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b280568d-b5c6-4efc-a0c1-239585fcf933-secret-volume\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.278889 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjk67\" (UniqueName: \"kubernetes.io/projected/b280568d-b5c6-4efc-a0c1-239585fcf933-kube-api-access-fjk67\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.379703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjk67\" (UniqueName: \"kubernetes.io/projected/b280568d-b5c6-4efc-a0c1-239585fcf933-kube-api-access-fjk67\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.379786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b280568d-b5c6-4efc-a0c1-239585fcf933-config-volume\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.379886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b280568d-b5c6-4efc-a0c1-239585fcf933-secret-volume\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.380494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b280568d-b5c6-4efc-a0c1-239585fcf933-config-volume\") pod 
\"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.399304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b280568d-b5c6-4efc-a0c1-239585fcf933-secret-volume\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.440063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjk67\" (UniqueName: \"kubernetes.io/projected/b280568d-b5c6-4efc-a0c1-239585fcf933-kube-api-access-fjk67\") pod \"collect-profiles-29483595-dnntc\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.460887 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:00 crc kubenswrapper[4707]: I0121 17:15:00.799442 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc"] Jan 21 17:15:00 crc kubenswrapper[4707]: W0121 17:15:00.813919 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb280568d_b5c6_4efc_a0c1_239585fcf933.slice/crio-21c0ceb43bd4fd96fbd62b159a6b1409449df7524ad7f0865b8cdaed07b07d84 WatchSource:0}: Error finding container 21c0ceb43bd4fd96fbd62b159a6b1409449df7524ad7f0865b8cdaed07b07d84: Status 404 returned error can't find the container with id 21c0ceb43bd4fd96fbd62b159a6b1409449df7524ad7f0865b8cdaed07b07d84 Jan 21 17:15:01 crc kubenswrapper[4707]: I0121 17:15:01.497389 4707 generic.go:334] "Generic (PLEG): container finished" podID="b280568d-b5c6-4efc-a0c1-239585fcf933" containerID="2a08c163dbafc50af7f64e9ac61f528a97b80c002997370120ba45ac44ff78b3" exitCode=0 Jan 21 17:15:01 crc kubenswrapper[4707]: I0121 17:15:01.497432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" event={"ID":"b280568d-b5c6-4efc-a0c1-239585fcf933","Type":"ContainerDied","Data":"2a08c163dbafc50af7f64e9ac61f528a97b80c002997370120ba45ac44ff78b3"} Jan 21 17:15:01 crc kubenswrapper[4707]: I0121 17:15:01.497469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" event={"ID":"b280568d-b5c6-4efc-a0c1-239585fcf933","Type":"ContainerStarted","Data":"21c0ceb43bd4fd96fbd62b159a6b1409449df7524ad7f0865b8cdaed07b07d84"} Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.684998 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.710950 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b280568d-b5c6-4efc-a0c1-239585fcf933-config-volume\") pod \"b280568d-b5c6-4efc-a0c1-239585fcf933\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.711049 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b280568d-b5c6-4efc-a0c1-239585fcf933-secret-volume\") pod \"b280568d-b5c6-4efc-a0c1-239585fcf933\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.711087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjk67\" (UniqueName: \"kubernetes.io/projected/b280568d-b5c6-4efc-a0c1-239585fcf933-kube-api-access-fjk67\") pod \"b280568d-b5c6-4efc-a0c1-239585fcf933\" (UID: \"b280568d-b5c6-4efc-a0c1-239585fcf933\") " Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.711550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b280568d-b5c6-4efc-a0c1-239585fcf933-config-volume" (OuterVolumeSpecName: "config-volume") pod "b280568d-b5c6-4efc-a0c1-239585fcf933" (UID: "b280568d-b5c6-4efc-a0c1-239585fcf933"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.715006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b280568d-b5c6-4efc-a0c1-239585fcf933-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b280568d-b5c6-4efc-a0c1-239585fcf933" (UID: "b280568d-b5c6-4efc-a0c1-239585fcf933"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.715155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b280568d-b5c6-4efc-a0c1-239585fcf933-kube-api-access-fjk67" (OuterVolumeSpecName: "kube-api-access-fjk67") pod "b280568d-b5c6-4efc-a0c1-239585fcf933" (UID: "b280568d-b5c6-4efc-a0c1-239585fcf933"). InnerVolumeSpecName "kube-api-access-fjk67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.812113 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b280568d-b5c6-4efc-a0c1-239585fcf933-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.812142 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjk67\" (UniqueName: \"kubernetes.io/projected/b280568d-b5c6-4efc-a0c1-239585fcf933-kube-api-access-fjk67\") on node \"crc\" DevicePath \"\"" Jan 21 17:15:02 crc kubenswrapper[4707]: I0121 17:15:02.812152 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b280568d-b5c6-4efc-a0c1-239585fcf933-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:15:03 crc kubenswrapper[4707]: I0121 17:15:03.507642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" event={"ID":"b280568d-b5c6-4efc-a0c1-239585fcf933","Type":"ContainerDied","Data":"21c0ceb43bd4fd96fbd62b159a6b1409449df7524ad7f0865b8cdaed07b07d84"} Jan 21 17:15:03 crc kubenswrapper[4707]: I0121 17:15:03.507675 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c0ceb43bd4fd96fbd62b159a6b1409449df7524ad7f0865b8cdaed07b07d84" Jan 21 17:15:03 crc kubenswrapper[4707]: I0121 17:15:03.507712 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc" Jan 21 17:15:03 crc kubenswrapper[4707]: I0121 17:15:03.737381 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z"] Jan 21 17:15:03 crc kubenswrapper[4707]: I0121 17:15:03.740626 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-m9k5z"] Jan 21 17:15:05 crc kubenswrapper[4707]: I0121 17:15:05.189202 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc8a5bab-d9d5-4a6f-99ea-d2d512629736" path="/var/lib/kubelet/pods/bc8a5bab-d9d5-4a6f-99ea-d2d512629736/volumes" Jan 21 17:15:28 crc kubenswrapper[4707]: I0121 17:15:28.875251 4707 scope.go:117] "RemoveContainer" containerID="111a5c0f9ebad2cfb5b3d45c32e25bc49180396cf692915f50b5afb020d5dd7d" Jan 21 17:15:28 crc kubenswrapper[4707]: I0121 17:15:28.892889 4707 scope.go:117] "RemoveContainer" containerID="3664b1503bf4278e994e5052e39db81f8036faf508d4717580aa00b72e93fbe0" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.677682 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xbk75"] Jan 21 17:15:56 crc kubenswrapper[4707]: E0121 17:15:56.678567 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280568d-b5c6-4efc-a0c1-239585fcf933" containerName="collect-profiles" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.678581 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280568d-b5c6-4efc-a0c1-239585fcf933" containerName="collect-profiles" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.678693 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b280568d-b5c6-4efc-a0c1-239585fcf933" containerName="collect-profiles" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.679587 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.689953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xbk75"] Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.820975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-catalog-content\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.821062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxpzr\" (UniqueName: \"kubernetes.io/projected/6176a101-55d9-43c0-aa95-de1965abc2cd-kube-api-access-bxpzr\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.821103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-utilities\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.922497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-catalog-content\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.922650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxpzr\" (UniqueName: \"kubernetes.io/projected/6176a101-55d9-43c0-aa95-de1965abc2cd-kube-api-access-bxpzr\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.922755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-utilities\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.922900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-catalog-content\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.923133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-utilities\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:56 crc kubenswrapper[4707]: I0121 17:15:56.943438 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bxpzr\" (UniqueName: \"kubernetes.io/projected/6176a101-55d9-43c0-aa95-de1965abc2cd-kube-api-access-bxpzr\") pod \"certified-operators-xbk75\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:57 crc kubenswrapper[4707]: I0121 17:15:57.006162 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:15:57 crc kubenswrapper[4707]: I0121 17:15:57.270234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xbk75"] Jan 21 17:15:57 crc kubenswrapper[4707]: I0121 17:15:57.769798 4707 generic.go:334] "Generic (PLEG): container finished" podID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerID="5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299" exitCode=0 Jan 21 17:15:57 crc kubenswrapper[4707]: I0121 17:15:57.770034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerDied","Data":"5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299"} Jan 21 17:15:57 crc kubenswrapper[4707]: I0121 17:15:57.770116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerStarted","Data":"9eb10ec547e8ca1cad4be3d73803e83ba55aa288a81179f88eff3d23f2580ef6"} Jan 21 17:15:57 crc kubenswrapper[4707]: I0121 17:15:57.771830 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:15:58 crc kubenswrapper[4707]: I0121 17:15:58.775785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerStarted","Data":"80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3"} Jan 21 17:15:59 crc kubenswrapper[4707]: I0121 17:15:59.782430 4707 generic.go:334] "Generic (PLEG): container finished" podID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerID="80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3" exitCode=0 Jan 21 17:15:59 crc kubenswrapper[4707]: I0121 17:15:59.782625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerDied","Data":"80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3"} Jan 21 17:16:00 crc kubenswrapper[4707]: I0121 17:16:00.788524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerStarted","Data":"b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132"} Jan 21 17:16:00 crc kubenswrapper[4707]: I0121 17:16:00.806217 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xbk75" podStartSLOduration=2.295920102 podStartE2EDuration="4.806203428s" podCreationTimestamp="2026-01-21 17:15:56 +0000 UTC" firstStartedPulling="2026-01-21 17:15:57.771623562 +0000 UTC m=+8054.953139784" lastFinishedPulling="2026-01-21 17:16:00.281906887 +0000 UTC m=+8057.463423110" observedRunningTime="2026-01-21 17:16:00.804835606 +0000 UTC m=+8057.986351828" watchObservedRunningTime="2026-01-21 
17:16:00.806203428 +0000 UTC m=+8057.987719650" Jan 21 17:16:07 crc kubenswrapper[4707]: I0121 17:16:07.006971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:16:07 crc kubenswrapper[4707]: I0121 17:16:07.007751 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:16:07 crc kubenswrapper[4707]: I0121 17:16:07.036704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:16:07 crc kubenswrapper[4707]: I0121 17:16:07.849692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:16:09 crc kubenswrapper[4707]: I0121 17:16:09.361692 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xbk75"] Jan 21 17:16:10 crc kubenswrapper[4707]: I0121 17:16:10.840609 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xbk75" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="registry-server" containerID="cri-o://b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132" gracePeriod=2 Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.203286 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.383506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxpzr\" (UniqueName: \"kubernetes.io/projected/6176a101-55d9-43c0-aa95-de1965abc2cd-kube-api-access-bxpzr\") pod \"6176a101-55d9-43c0-aa95-de1965abc2cd\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.383583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-catalog-content\") pod \"6176a101-55d9-43c0-aa95-de1965abc2cd\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.383869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-utilities\") pod \"6176a101-55d9-43c0-aa95-de1965abc2cd\" (UID: \"6176a101-55d9-43c0-aa95-de1965abc2cd\") " Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.385429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-utilities" (OuterVolumeSpecName: "utilities") pod "6176a101-55d9-43c0-aa95-de1965abc2cd" (UID: "6176a101-55d9-43c0-aa95-de1965abc2cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.391540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6176a101-55d9-43c0-aa95-de1965abc2cd-kube-api-access-bxpzr" (OuterVolumeSpecName: "kube-api-access-bxpzr") pod "6176a101-55d9-43c0-aa95-de1965abc2cd" (UID: "6176a101-55d9-43c0-aa95-de1965abc2cd"). InnerVolumeSpecName "kube-api-access-bxpzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.433989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6176a101-55d9-43c0-aa95-de1965abc2cd" (UID: "6176a101-55d9-43c0-aa95-de1965abc2cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.487331 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxpzr\" (UniqueName: \"kubernetes.io/projected/6176a101-55d9-43c0-aa95-de1965abc2cd-kube-api-access-bxpzr\") on node \"crc\" DevicePath \"\"" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.487369 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.487400 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6176a101-55d9-43c0-aa95-de1965abc2cd-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.855732 4707 generic.go:334] "Generic (PLEG): container finished" podID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerID="b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132" exitCode=0 Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.855791 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbk75" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.855821 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerDied","Data":"b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132"} Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.856880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbk75" event={"ID":"6176a101-55d9-43c0-aa95-de1965abc2cd","Type":"ContainerDied","Data":"9eb10ec547e8ca1cad4be3d73803e83ba55aa288a81179f88eff3d23f2580ef6"} Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.856934 4707 scope.go:117] "RemoveContainer" containerID="b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.880660 4707 scope.go:117] "RemoveContainer" containerID="80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.881189 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xbk75"] Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.886133 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xbk75"] Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.904200 4707 scope.go:117] "RemoveContainer" containerID="5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.915852 4707 scope.go:117] "RemoveContainer" containerID="b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132" Jan 21 17:16:11 crc kubenswrapper[4707]: E0121 17:16:11.916312 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132\": container with ID starting with b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132 not found: ID does not exist" containerID="b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.916365 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132"} err="failed to get container status \"b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132\": rpc error: code = NotFound desc = could not find container \"b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132\": container with ID starting with b161ec9128a4596a26ae1b2eb7a119d0ff43a3f0e80920ae7b37cbcbea7fd132 not found: ID does not exist" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.916386 4707 scope.go:117] "RemoveContainer" containerID="80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3" Jan 21 17:16:11 crc kubenswrapper[4707]: E0121 17:16:11.916684 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3\": container with ID starting with 80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3 not found: ID does not exist" containerID="80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.916724 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3"} err="failed to get container status \"80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3\": rpc error: code = NotFound desc = could not find container \"80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3\": container with ID starting with 80091c583f830a7956c37f1000b18e3fbaa977fc26c8b384680f258877e589d3 not found: ID does not exist" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.916736 4707 scope.go:117] "RemoveContainer" containerID="5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299" Jan 21 17:16:11 crc kubenswrapper[4707]: E0121 17:16:11.917060 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299\": container with ID starting with 5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299 not found: ID does not exist" containerID="5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299" Jan 21 17:16:11 crc kubenswrapper[4707]: I0121 17:16:11.917094 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299"} err="failed to get container status \"5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299\": rpc error: code = NotFound desc = could not find container \"5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299\": container with ID starting with 5a401662d84fa25baaf2a41d44d8b2f501ec00897371bcccb9db53bd75c6d299 not found: ID does not exist" Jan 21 17:16:13 crc kubenswrapper[4707]: I0121 17:16:13.188149 4707 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" path="/var/lib/kubelet/pods/6176a101-55d9-43c0-aa95-de1965abc2cd/volumes" Jan 21 17:16:39 crc kubenswrapper[4707]: I0121 17:16:39.945971 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:16:39 crc kubenswrapper[4707]: I0121 17:16:39.946305 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:17:09 crc kubenswrapper[4707]: I0121 17:17:09.945581 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:17:09 crc kubenswrapper[4707]: I0121 17:17:09.945955 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.235259 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fcp8l"] Jan 21 17:17:20 crc kubenswrapper[4707]: E0121 17:17:20.235786 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="extract-content" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.235799 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="extract-content" Jan 21 17:17:20 crc kubenswrapper[4707]: E0121 17:17:20.235830 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="registry-server" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.235836 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="registry-server" Jan 21 17:17:20 crc kubenswrapper[4707]: E0121 17:17:20.235851 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="extract-utilities" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.235856 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="extract-utilities" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.235966 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6176a101-55d9-43c0-aa95-de1965abc2cd" containerName="registry-server" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.238508 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.241608 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcp8l"] Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.301804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjdq\" (UniqueName: \"kubernetes.io/projected/7df1c5a3-eda9-4a11-b97d-4525712e81a6-kube-api-access-tzjdq\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.301881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-catalog-content\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.301936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-utilities\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.402356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-utilities\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.402405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjdq\" (UniqueName: \"kubernetes.io/projected/7df1c5a3-eda9-4a11-b97d-4525712e81a6-kube-api-access-tzjdq\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.402446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-catalog-content\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.402798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-utilities\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.402838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-catalog-content\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.419383 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tzjdq\" (UniqueName: \"kubernetes.io/projected/7df1c5a3-eda9-4a11-b97d-4525712e81a6-kube-api-access-tzjdq\") pod \"redhat-operators-fcp8l\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.560335 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:20 crc kubenswrapper[4707]: I0121 17:17:20.943426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcp8l"] Jan 21 17:17:21 crc kubenswrapper[4707]: I0121 17:17:21.174669 4707 generic.go:334] "Generic (PLEG): container finished" podID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerID="02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283" exitCode=0 Jan 21 17:17:21 crc kubenswrapper[4707]: I0121 17:17:21.174711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerDied","Data":"02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283"} Jan 21 17:17:21 crc kubenswrapper[4707]: I0121 17:17:21.174747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerStarted","Data":"f47892ab05a5ad47efee2c1603030c324a8ce1e665f49189aa902dd070f1d3c3"} Jan 21 17:17:22 crc kubenswrapper[4707]: I0121 17:17:22.180695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerStarted","Data":"41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b"} Jan 21 17:17:23 crc kubenswrapper[4707]: I0121 17:17:23.202184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerDied","Data":"41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b"} Jan 21 17:17:23 crc kubenswrapper[4707]: I0121 17:17:23.202089 4707 generic.go:334] "Generic (PLEG): container finished" podID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerID="41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b" exitCode=0 Jan 21 17:17:24 crc kubenswrapper[4707]: I0121 17:17:24.208183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerStarted","Data":"c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c"} Jan 21 17:17:24 crc kubenswrapper[4707]: I0121 17:17:24.226373 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fcp8l" podStartSLOduration=1.7390850599999998 podStartE2EDuration="4.226361977s" podCreationTimestamp="2026-01-21 17:17:20 +0000 UTC" firstStartedPulling="2026-01-21 17:17:21.176162602 +0000 UTC m=+8138.357678825" lastFinishedPulling="2026-01-21 17:17:23.66343952 +0000 UTC m=+8140.844955742" observedRunningTime="2026-01-21 17:17:24.221393966 +0000 UTC m=+8141.402910198" watchObservedRunningTime="2026-01-21 17:17:24.226361977 +0000 UTC m=+8141.407878199" Jan 21 17:17:30 crc kubenswrapper[4707]: I0121 17:17:30.561327 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 
21 17:17:30 crc kubenswrapper[4707]: I0121 17:17:30.561679 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:30 crc kubenswrapper[4707]: I0121 17:17:30.588244 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:31 crc kubenswrapper[4707]: I0121 17:17:31.270776 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:31 crc kubenswrapper[4707]: I0121 17:17:31.301326 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcp8l"] Jan 21 17:17:33 crc kubenswrapper[4707]: I0121 17:17:33.253567 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fcp8l" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="registry-server" containerID="cri-o://c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c" gracePeriod=2 Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.066942 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.178707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-catalog-content\") pod \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.179157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjdq\" (UniqueName: \"kubernetes.io/projected/7df1c5a3-eda9-4a11-b97d-4525712e81a6-kube-api-access-tzjdq\") pod \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.179296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-utilities\") pod \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\" (UID: \"7df1c5a3-eda9-4a11-b97d-4525712e81a6\") " Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.179873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-utilities" (OuterVolumeSpecName: "utilities") pod "7df1c5a3-eda9-4a11-b97d-4525712e81a6" (UID: "7df1c5a3-eda9-4a11-b97d-4525712e81a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.182998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df1c5a3-eda9-4a11-b97d-4525712e81a6-kube-api-access-tzjdq" (OuterVolumeSpecName: "kube-api-access-tzjdq") pod "7df1c5a3-eda9-4a11-b97d-4525712e81a6" (UID: "7df1c5a3-eda9-4a11-b97d-4525712e81a6"). InnerVolumeSpecName "kube-api-access-tzjdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.260932 4707 generic.go:334] "Generic (PLEG): container finished" podID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerID="c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c" exitCode=0 Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.260986 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcp8l" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.261002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerDied","Data":"c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c"} Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.261837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcp8l" event={"ID":"7df1c5a3-eda9-4a11-b97d-4525712e81a6","Type":"ContainerDied","Data":"f47892ab05a5ad47efee2c1603030c324a8ce1e665f49189aa902dd070f1d3c3"} Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.261855 4707 scope.go:117] "RemoveContainer" containerID="c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.267312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7df1c5a3-eda9-4a11-b97d-4525712e81a6" (UID: "7df1c5a3-eda9-4a11-b97d-4525712e81a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.274585 4707 scope.go:117] "RemoveContainer" containerID="41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.281066 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzjdq\" (UniqueName: \"kubernetes.io/projected/7df1c5a3-eda9-4a11-b97d-4525712e81a6-kube-api-access-tzjdq\") on node \"crc\" DevicePath \"\"" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.281099 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.281109 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df1c5a3-eda9-4a11-b97d-4525712e81a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.286802 4707 scope.go:117] "RemoveContainer" containerID="02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.301196 4707 scope.go:117] "RemoveContainer" containerID="c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c" Jan 21 17:17:34 crc kubenswrapper[4707]: E0121 17:17:34.301479 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c\": container with ID starting with c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c not found: ID does not exist" 
containerID="c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.301530 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c"} err="failed to get container status \"c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c\": rpc error: code = NotFound desc = could not find container \"c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c\": container with ID starting with c9836cc4cf7159d16706d4321d6ca7031bcebaddb0c3440c5237ce76a294b04c not found: ID does not exist" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.301551 4707 scope.go:117] "RemoveContainer" containerID="41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b" Jan 21 17:17:34 crc kubenswrapper[4707]: E0121 17:17:34.301800 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b\": container with ID starting with 41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b not found: ID does not exist" containerID="41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.301848 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b"} err="failed to get container status \"41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b\": rpc error: code = NotFound desc = could not find container \"41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b\": container with ID starting with 41518344d42fcab5e9df7e9229d3fc726627bf33bf04af166a659252d5e2707b not found: ID does not exist" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.301861 4707 scope.go:117] "RemoveContainer" containerID="02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283" Jan 21 17:17:34 crc kubenswrapper[4707]: E0121 17:17:34.302159 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283\": container with ID starting with 02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283 not found: ID does not exist" containerID="02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.302271 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283"} err="failed to get container status \"02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283\": rpc error: code = NotFound desc = could not find container \"02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283\": container with ID starting with 02d902bf1112feb9a2a544df3dd499c25d109a606f84164063a256d15195c283 not found: ID does not exist" Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.582378 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcp8l"] Jan 21 17:17:34 crc kubenswrapper[4707]: I0121 17:17:34.587656 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fcp8l"] Jan 21 17:17:35 crc kubenswrapper[4707]: I0121 17:17:35.188424 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" path="/var/lib/kubelet/pods/7df1c5a3-eda9-4a11-b97d-4525712e81a6/volumes" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.294379 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svngx"] Jan 21 17:17:36 crc kubenswrapper[4707]: E0121 17:17:36.294754 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="extract-utilities" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.294765 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="extract-utilities" Jan 21 17:17:36 crc kubenswrapper[4707]: E0121 17:17:36.294778 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="extract-content" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.294783 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="extract-content" Jan 21 17:17:36 crc kubenswrapper[4707]: E0121 17:17:36.294796 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="registry-server" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.294801 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="registry-server" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.294931 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df1c5a3-eda9-4a11-b97d-4525712e81a6" containerName="registry-server" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.295648 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.312740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svngx"] Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.407283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-utilities\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.407328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-catalog-content\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.407358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2955n\" (UniqueName: \"kubernetes.io/projected/7b019316-5a45-464f-980e-14bba50c81fb-kube-api-access-2955n\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.509065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-utilities\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.509113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-catalog-content\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.509144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2955n\" (UniqueName: \"kubernetes.io/projected/7b019316-5a45-464f-980e-14bba50c81fb-kube-api-access-2955n\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.509528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-utilities\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.509565 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-catalog-content\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.522669 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2955n\" (UniqueName: \"kubernetes.io/projected/7b019316-5a45-464f-980e-14bba50c81fb-kube-api-access-2955n\") pod \"redhat-marketplace-svngx\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.612277 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:36 crc kubenswrapper[4707]: I0121 17:17:36.965841 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svngx"] Jan 21 17:17:37 crc kubenswrapper[4707]: I0121 17:17:37.276327 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b019316-5a45-464f-980e-14bba50c81fb" containerID="005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8" exitCode=0 Jan 21 17:17:37 crc kubenswrapper[4707]: I0121 17:17:37.276366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerDied","Data":"005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8"} Jan 21 17:17:37 crc kubenswrapper[4707]: I0121 17:17:37.276403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerStarted","Data":"d7481c7e0ca3d96d3fd625845ac364f0a7f4ed32c43a7db5d261eec0e34b957c"} Jan 21 17:17:38 crc kubenswrapper[4707]: I0121 17:17:38.282657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerStarted","Data":"785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc"} Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.288476 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b019316-5a45-464f-980e-14bba50c81fb" containerID="785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc" exitCode=0 Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.288512 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerDied","Data":"785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc"} Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.946134 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.946339 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.946372 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.946750 4707 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:17:39 crc kubenswrapper[4707]: I0121 17:17:39.946796 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" gracePeriod=600 Jan 21 17:17:40 crc kubenswrapper[4707]: E0121 17:17:40.076372 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:17:40 crc kubenswrapper[4707]: I0121 17:17:40.296119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerStarted","Data":"5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895"} Jan 21 17:17:40 crc kubenswrapper[4707]: I0121 17:17:40.298790 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" exitCode=0 Jan 21 17:17:40 crc kubenswrapper[4707]: I0121 17:17:40.298836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336"} Jan 21 17:17:40 crc kubenswrapper[4707]: I0121 17:17:40.298860 4707 scope.go:117] "RemoveContainer" containerID="68af8187b5691bda3dd4143031fccb3c6201be65d4e73f52582877d190385c13" Jan 21 17:17:40 crc kubenswrapper[4707]: I0121 17:17:40.299177 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:17:40 crc kubenswrapper[4707]: E0121 17:17:40.299378 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:17:40 crc kubenswrapper[4707]: I0121 17:17:40.316270 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svngx" podStartSLOduration=1.81820793 podStartE2EDuration="4.316259236s" podCreationTimestamp="2026-01-21 17:17:36 +0000 UTC" firstStartedPulling="2026-01-21 17:17:37.277534018 +0000 UTC m=+8154.459050240" lastFinishedPulling="2026-01-21 17:17:39.775585324 +0000 UTC m=+8156.957101546" observedRunningTime="2026-01-21 17:17:40.314975934 +0000 UTC m=+8157.496492157" watchObservedRunningTime="2026-01-21 
17:17:40.316259236 +0000 UTC m=+8157.497775459" Jan 21 17:17:46 crc kubenswrapper[4707]: I0121 17:17:46.614333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:46 crc kubenswrapper[4707]: I0121 17:17:46.614652 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:46 crc kubenswrapper[4707]: I0121 17:17:46.643073 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:47 crc kubenswrapper[4707]: I0121 17:17:47.369182 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:47 crc kubenswrapper[4707]: I0121 17:17:47.402857 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svngx"] Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.339228 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-svngx" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="registry-server" containerID="cri-o://5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895" gracePeriod=2 Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.637572 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.681079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-utilities\") pod \"7b019316-5a45-464f-980e-14bba50c81fb\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.681144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2955n\" (UniqueName: \"kubernetes.io/projected/7b019316-5a45-464f-980e-14bba50c81fb-kube-api-access-2955n\") pod \"7b019316-5a45-464f-980e-14bba50c81fb\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.681216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-catalog-content\") pod \"7b019316-5a45-464f-980e-14bba50c81fb\" (UID: \"7b019316-5a45-464f-980e-14bba50c81fb\") " Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.681940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-utilities" (OuterVolumeSpecName: "utilities") pod "7b019316-5a45-464f-980e-14bba50c81fb" (UID: "7b019316-5a45-464f-980e-14bba50c81fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.685292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b019316-5a45-464f-980e-14bba50c81fb-kube-api-access-2955n" (OuterVolumeSpecName: "kube-api-access-2955n") pod "7b019316-5a45-464f-980e-14bba50c81fb" (UID: "7b019316-5a45-464f-980e-14bba50c81fb"). InnerVolumeSpecName "kube-api-access-2955n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.698062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b019316-5a45-464f-980e-14bba50c81fb" (UID: "7b019316-5a45-464f-980e-14bba50c81fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.783240 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.783270 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2955n\" (UniqueName: \"kubernetes.io/projected/7b019316-5a45-464f-980e-14bba50c81fb-kube-api-access-2955n\") on node \"crc\" DevicePath \"\"" Jan 21 17:17:49 crc kubenswrapper[4707]: I0121 17:17:49.783284 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b019316-5a45-464f-980e-14bba50c81fb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.346218 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b019316-5a45-464f-980e-14bba50c81fb" containerID="5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895" exitCode=0 Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.346263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerDied","Data":"5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895"} Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.346292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svngx" event={"ID":"7b019316-5a45-464f-980e-14bba50c81fb","Type":"ContainerDied","Data":"d7481c7e0ca3d96d3fd625845ac364f0a7f4ed32c43a7db5d261eec0e34b957c"} Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.346311 4707 scope.go:117] "RemoveContainer" containerID="5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.346420 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svngx" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.367833 4707 scope.go:117] "RemoveContainer" containerID="785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.376729 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svngx"] Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.381255 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-svngx"] Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.396571 4707 scope.go:117] "RemoveContainer" containerID="005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.408253 4707 scope.go:117] "RemoveContainer" containerID="5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895" Jan 21 17:17:50 crc kubenswrapper[4707]: E0121 17:17:50.408620 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895\": container with ID starting with 5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895 not found: ID does not exist" containerID="5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.408650 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895"} err="failed to get container status \"5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895\": rpc error: code = NotFound desc = could not find container \"5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895\": container with ID starting with 5f95fa57a12ddde78f34ce31c7213a851415148b9d097e1c890ef5db5d929895 not found: ID does not exist" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.408673 4707 scope.go:117] "RemoveContainer" containerID="785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc" Jan 21 17:17:50 crc kubenswrapper[4707]: E0121 17:17:50.412067 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc\": container with ID starting with 785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc not found: ID does not exist" containerID="785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.412100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc"} err="failed to get container status \"785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc\": rpc error: code = NotFound desc = could not find container \"785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc\": container with ID starting with 785f9d08e71a61028225cd342e3591c39188992333273d5f21a52412147e49fc not found: ID does not exist" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.412131 4707 scope.go:117] "RemoveContainer" containerID="005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8" Jan 21 17:17:50 crc kubenswrapper[4707]: E0121 17:17:50.412343 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8\": container with ID starting with 005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8 not found: ID does not exist" containerID="005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8" Jan 21 17:17:50 crc kubenswrapper[4707]: I0121 17:17:50.412364 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8"} err="failed to get container status \"005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8\": rpc error: code = NotFound desc = could not find container \"005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8\": container with ID starting with 005a0a6f7bee8865bd6aa4313c4d5fe30f3c2554700a417f4767be98e4b7daf8 not found: ID does not exist" Jan 21 17:17:51 crc kubenswrapper[4707]: I0121 17:17:51.187759 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b019316-5a45-464f-980e-14bba50c81fb" path="/var/lib/kubelet/pods/7b019316-5a45-464f-980e-14bba50c81fb/volumes" Jan 21 17:17:53 crc kubenswrapper[4707]: I0121 17:17:53.196961 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:17:53 crc kubenswrapper[4707]: E0121 17:17:53.197596 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:18:05 crc kubenswrapper[4707]: I0121 17:18:05.184965 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:18:05 crc kubenswrapper[4707]: E0121 17:18:05.185460 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:18:16 crc kubenswrapper[4707]: I0121 17:18:16.182205 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:18:16 crc kubenswrapper[4707]: E0121 17:18:16.183266 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:18:27 crc kubenswrapper[4707]: I0121 17:18:27.182438 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:18:27 crc kubenswrapper[4707]: E0121 17:18:27.183009 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:18:38 crc kubenswrapper[4707]: I0121 17:18:38.182160 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:18:38 crc kubenswrapper[4707]: E0121 17:18:38.183342 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:18:53 crc kubenswrapper[4707]: I0121 17:18:53.185284 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:18:53 crc kubenswrapper[4707]: E0121 17:18:53.186562 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:19:04 crc kubenswrapper[4707]: I0121 17:19:04.182651 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:19:04 crc kubenswrapper[4707]: E0121 17:19:04.183078 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:19:19 crc kubenswrapper[4707]: I0121 17:19:19.182401 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:19:19 crc kubenswrapper[4707]: E0121 17:19:19.182928 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:19:31 crc kubenswrapper[4707]: I0121 17:19:31.182595 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:19:31 crc kubenswrapper[4707]: E0121 17:19:31.183190 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:19:46 crc kubenswrapper[4707]: I0121 17:19:46.182431 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:19:46 crc kubenswrapper[4707]: E0121 17:19:46.183657 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:19:59 crc kubenswrapper[4707]: I0121 17:19:59.183572 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:19:59 crc kubenswrapper[4707]: E0121 17:19:59.183954 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:20:12 crc kubenswrapper[4707]: I0121 17:20:12.183510 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:20:12 crc kubenswrapper[4707]: E0121 17:20:12.184426 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:20:27 crc kubenswrapper[4707]: I0121 17:20:27.182994 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:20:27 crc kubenswrapper[4707]: E0121 17:20:27.183571 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:20:38 crc kubenswrapper[4707]: I0121 17:20:38.182705 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:20:38 crc kubenswrapper[4707]: E0121 17:20:38.184087 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" 
podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:20:52 crc kubenswrapper[4707]: I0121 17:20:52.182832 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:20:52 crc kubenswrapper[4707]: E0121 17:20:52.183362 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:21:07 crc kubenswrapper[4707]: I0121 17:21:07.182531 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:21:07 crc kubenswrapper[4707]: E0121 17:21:07.183167 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:21:19 crc kubenswrapper[4707]: I0121 17:21:19.183164 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:21:19 crc kubenswrapper[4707]: E0121 17:21:19.183943 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:21:33 crc kubenswrapper[4707]: I0121 17:21:33.184625 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:21:33 crc kubenswrapper[4707]: E0121 17:21:33.185148 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:21:48 crc kubenswrapper[4707]: I0121 17:21:48.182124 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:21:48 crc kubenswrapper[4707]: E0121 17:21:48.182622 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:22:00 crc kubenswrapper[4707]: I0121 17:22:00.183057 4707 scope.go:117] "RemoveContainer" 
containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:22:00 crc kubenswrapper[4707]: E0121 17:22:00.185874 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:22:14 crc kubenswrapper[4707]: I0121 17:22:14.182875 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:22:14 crc kubenswrapper[4707]: E0121 17:22:14.183370 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:22:29 crc kubenswrapper[4707]: I0121 17:22:29.185678 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:22:29 crc kubenswrapper[4707]: E0121 17:22:29.186206 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:22:40 crc kubenswrapper[4707]: I0121 17:22:40.182775 4707 scope.go:117] "RemoveContainer" containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:22:40 crc kubenswrapper[4707]: I0121 17:22:40.789332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"8848cdf662db2834cec5a993e3b664d6194c9170ca7198fa60fc22fd6cc87b79"} Jan 21 17:25:09 crc kubenswrapper[4707]: I0121 17:25:09.945682 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:25:09 crc kubenswrapper[4707]: I0121 17:25:09.946042 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.885098 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-spphf"] Jan 21 17:25:27 crc kubenswrapper[4707]: E0121 17:25:27.885657 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="registry-server" 
Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.885668 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="registry-server" Jan 21 17:25:27 crc kubenswrapper[4707]: E0121 17:25:27.885689 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="extract-utilities" Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.885695 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="extract-utilities" Jan 21 17:25:27 crc kubenswrapper[4707]: E0121 17:25:27.885702 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="extract-content" Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.885707 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="extract-content" Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.885844 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b019316-5a45-464f-980e-14bba50c81fb" containerName="registry-server" Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.886593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:27 crc kubenswrapper[4707]: I0121 17:25:27.889920 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spphf"] Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.069245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-catalog-content\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.069288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-utilities\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.069443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmscs\" (UniqueName: \"kubernetes.io/projected/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-kube-api-access-gmscs\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.170579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-catalog-content\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.170847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-utilities\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " 
pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.170986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmscs\" (UniqueName: \"kubernetes.io/projected/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-kube-api-access-gmscs\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.171031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-catalog-content\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.171251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-utilities\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.187537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmscs\" (UniqueName: \"kubernetes.io/projected/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-kube-api-access-gmscs\") pod \"community-operators-spphf\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.209395 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:28 crc kubenswrapper[4707]: I0121 17:25:28.575924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spphf"] Jan 21 17:25:28 crc kubenswrapper[4707]: W0121 17:25:28.576631 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ba2b34_bbeb_41cf_ae73_464fc33cfcf9.slice/crio-a5f753ce9ac023a4457720f5605b0c5f1a7d3c66941d5d4f191f55066861928f WatchSource:0}: Error finding container a5f753ce9ac023a4457720f5605b0c5f1a7d3c66941d5d4f191f55066861928f: Status 404 returned error can't find the container with id a5f753ce9ac023a4457720f5605b0c5f1a7d3c66941d5d4f191f55066861928f Jan 21 17:25:29 crc kubenswrapper[4707]: I0121 17:25:29.585149 4707 generic.go:334] "Generic (PLEG): container finished" podID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerID="2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3" exitCode=0 Jan 21 17:25:29 crc kubenswrapper[4707]: I0121 17:25:29.585246 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spphf" event={"ID":"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9","Type":"ContainerDied","Data":"2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3"} Jan 21 17:25:29 crc kubenswrapper[4707]: I0121 17:25:29.586537 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spphf" event={"ID":"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9","Type":"ContainerStarted","Data":"a5f753ce9ac023a4457720f5605b0c5f1a7d3c66941d5d4f191f55066861928f"} Jan 21 17:25:29 crc kubenswrapper[4707]: I0121 17:25:29.587162 4707 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jan 21 17:25:30 crc kubenswrapper[4707]: I0121 17:25:30.606050 4707 generic.go:334] "Generic (PLEG): container finished" podID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerID="4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67" exitCode=0 Jan 21 17:25:30 crc kubenswrapper[4707]: I0121 17:25:30.606087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spphf" event={"ID":"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9","Type":"ContainerDied","Data":"4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67"} Jan 21 17:25:31 crc kubenswrapper[4707]: I0121 17:25:31.612275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spphf" event={"ID":"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9","Type":"ContainerStarted","Data":"03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd"} Jan 21 17:25:31 crc kubenswrapper[4707]: I0121 17:25:31.628947 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-spphf" podStartSLOduration=3.139415284 podStartE2EDuration="4.628935067s" podCreationTimestamp="2026-01-21 17:25:27 +0000 UTC" firstStartedPulling="2026-01-21 17:25:29.58690778 +0000 UTC m=+8626.768424002" lastFinishedPulling="2026-01-21 17:25:31.076427563 +0000 UTC m=+8628.257943785" observedRunningTime="2026-01-21 17:25:31.62598509 +0000 UTC m=+8628.807501312" watchObservedRunningTime="2026-01-21 17:25:31.628935067 +0000 UTC m=+8628.810451288" Jan 21 17:25:38 crc kubenswrapper[4707]: I0121 17:25:38.210507 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:38 crc kubenswrapper[4707]: I0121 17:25:38.210884 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:38 crc kubenswrapper[4707]: I0121 17:25:38.238422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:38 crc kubenswrapper[4707]: I0121 17:25:38.666520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:38 crc kubenswrapper[4707]: I0121 17:25:38.699831 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-spphf"] Jan 21 17:25:39 crc kubenswrapper[4707]: I0121 17:25:39.945286 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:25:39 crc kubenswrapper[4707]: I0121 17:25:39.945503 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:25:40 crc kubenswrapper[4707]: I0121 17:25:40.649172 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-spphf" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="registry-server" 
containerID="cri-o://03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd" gracePeriod=2 Jan 21 17:25:40 crc kubenswrapper[4707]: I0121 17:25:40.946486 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.136687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-utilities\") pod \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.136744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-catalog-content\") pod \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.136860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmscs\" (UniqueName: \"kubernetes.io/projected/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-kube-api-access-gmscs\") pod \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\" (UID: \"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9\") " Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.137976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-utilities" (OuterVolumeSpecName: "utilities") pod "33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" (UID: "33ba2b34-bbeb-41cf-ae73-464fc33cfcf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.140780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-kube-api-access-gmscs" (OuterVolumeSpecName: "kube-api-access-gmscs") pod "33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" (UID: "33ba2b34-bbeb-41cf-ae73-464fc33cfcf9"). InnerVolumeSpecName "kube-api-access-gmscs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.175338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" (UID: "33ba2b34-bbeb-41cf-ae73-464fc33cfcf9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.238536 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.238562 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.238593 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmscs\" (UniqueName: \"kubernetes.io/projected/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9-kube-api-access-gmscs\") on node \"crc\" DevicePath \"\"" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.668401 4707 generic.go:334] "Generic (PLEG): container finished" podID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerID="03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd" exitCode=0 Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.668453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spphf" event={"ID":"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9","Type":"ContainerDied","Data":"03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd"} Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.668480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spphf" event={"ID":"33ba2b34-bbeb-41cf-ae73-464fc33cfcf9","Type":"ContainerDied","Data":"a5f753ce9ac023a4457720f5605b0c5f1a7d3c66941d5d4f191f55066861928f"} Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.668534 4707 scope.go:117] "RemoveContainer" containerID="03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.668580 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-spphf" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.689841 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-spphf"] Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.693553 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-spphf"] Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.694777 4707 scope.go:117] "RemoveContainer" containerID="4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.707274 4707 scope.go:117] "RemoveContainer" containerID="2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.721773 4707 scope.go:117] "RemoveContainer" containerID="03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd" Jan 21 17:25:41 crc kubenswrapper[4707]: E0121 17:25:41.722050 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd\": container with ID starting with 03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd not found: ID does not exist" containerID="03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.722095 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd"} err="failed to get container status \"03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd\": rpc error: code = NotFound desc = could not find container \"03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd\": container with ID starting with 03fe19ee7c0cd4e569bdf7e5a2ba91712272a5722f0c03293b9f9ac18f04d2cd not found: ID does not exist" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.722111 4707 scope.go:117] "RemoveContainer" containerID="4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67" Jan 21 17:25:41 crc kubenswrapper[4707]: E0121 17:25:41.722351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67\": container with ID starting with 4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67 not found: ID does not exist" containerID="4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.722370 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67"} err="failed to get container status \"4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67\": rpc error: code = NotFound desc = could not find container \"4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67\": container with ID starting with 4574c0340256e6d6252f84481127ddfac8fcc406b02090bbc62dfb3070738b67 not found: ID does not exist" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.722401 4707 scope.go:117] "RemoveContainer" containerID="2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3" Jan 21 17:25:41 crc kubenswrapper[4707]: E0121 17:25:41.722648 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3\": container with ID starting with 2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3 not found: ID does not exist" containerID="2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3" Jan 21 17:25:41 crc kubenswrapper[4707]: I0121 17:25:41.722763 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3"} err="failed to get container status \"2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3\": rpc error: code = NotFound desc = could not find container \"2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3\": container with ID starting with 2fefd398a96cb6c8599c676c815291d0abb464b292311683f6fcd177b69cd0a3 not found: ID does not exist" Jan 21 17:25:43 crc kubenswrapper[4707]: I0121 17:25:43.189065 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" path="/var/lib/kubelet/pods/33ba2b34-bbeb-41cf-ae73-464fc33cfcf9/volumes" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.963380 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lkhbz"] Jan 21 17:25:56 crc kubenswrapper[4707]: E0121 17:25:56.963949 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="registry-server" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.963961 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="registry-server" Jan 21 17:25:56 crc kubenswrapper[4707]: E0121 17:25:56.963983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="extract-utilities" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.963989 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="extract-utilities" Jan 21 17:25:56 crc kubenswrapper[4707]: E0121 17:25:56.964007 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="extract-content" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.964013 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="extract-content" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.964110 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ba2b34-bbeb-41cf-ae73-464fc33cfcf9" containerName="registry-server" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.964799 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:56 crc kubenswrapper[4707]: I0121 17:25:56.976386 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lkhbz"] Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.114932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-utilities\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.115005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnb2\" (UniqueName: \"kubernetes.io/projected/7c33f576-a422-4ab6-90ee-586eded9fb1d-kube-api-access-ztnb2\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.115237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-catalog-content\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.216164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-utilities\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.216427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnb2\" (UniqueName: \"kubernetes.io/projected/7c33f576-a422-4ab6-90ee-586eded9fb1d-kube-api-access-ztnb2\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.216536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-catalog-content\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.216620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-utilities\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.216866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-catalog-content\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.236849 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ztnb2\" (UniqueName: \"kubernetes.io/projected/7c33f576-a422-4ab6-90ee-586eded9fb1d-kube-api-access-ztnb2\") pod \"certified-operators-lkhbz\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.277218 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.717995 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lkhbz"] Jan 21 17:25:57 crc kubenswrapper[4707]: I0121 17:25:57.744833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerStarted","Data":"1267e6b358ec46065d63dad16f82cdc055131d749d1858606d01bae63b89da27"} Jan 21 17:25:58 crc kubenswrapper[4707]: I0121 17:25:58.751557 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerID="ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2" exitCode=0 Jan 21 17:25:58 crc kubenswrapper[4707]: I0121 17:25:58.751613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerDied","Data":"ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2"} Jan 21 17:25:59 crc kubenswrapper[4707]: I0121 17:25:59.760209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerStarted","Data":"1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53"} Jan 21 17:26:00 crc kubenswrapper[4707]: I0121 17:26:00.766965 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerID="1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53" exitCode=0 Jan 21 17:26:00 crc kubenswrapper[4707]: I0121 17:26:00.767168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerDied","Data":"1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53"} Jan 21 17:26:01 crc kubenswrapper[4707]: I0121 17:26:01.772554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerStarted","Data":"b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4"} Jan 21 17:26:01 crc kubenswrapper[4707]: I0121 17:26:01.791637 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lkhbz" podStartSLOduration=3.317763945 podStartE2EDuration="5.791624525s" podCreationTimestamp="2026-01-21 17:25:56 +0000 UTC" firstStartedPulling="2026-01-21 17:25:58.753094472 +0000 UTC m=+8655.934610694" lastFinishedPulling="2026-01-21 17:26:01.226955052 +0000 UTC m=+8658.408471274" observedRunningTime="2026-01-21 17:26:01.786867399 +0000 UTC m=+8658.968383622" watchObservedRunningTime="2026-01-21 17:26:01.791624525 +0000 UTC m=+8658.973140746" Jan 21 17:26:07 crc kubenswrapper[4707]: I0121 17:26:07.277347 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:26:07 crc kubenswrapper[4707]: I0121 17:26:07.278624 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:26:07 crc kubenswrapper[4707]: I0121 17:26:07.305656 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:26:07 crc kubenswrapper[4707]: I0121 17:26:07.824250 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:26:07 crc kubenswrapper[4707]: I0121 17:26:07.852432 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lkhbz"] Jan 21 17:26:09 crc kubenswrapper[4707]: I0121 17:26:09.808365 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lkhbz" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="registry-server" containerID="cri-o://b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4" gracePeriod=2 Jan 21 17:26:09 crc kubenswrapper[4707]: I0121 17:26:09.945143 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:26:09 crc kubenswrapper[4707]: I0121 17:26:09.945195 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:26:09 crc kubenswrapper[4707]: I0121 17:26:09.945235 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:26:09 crc kubenswrapper[4707]: I0121 17:26:09.945762 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8848cdf662db2834cec5a993e3b664d6194c9170ca7198fa60fc22fd6cc87b79"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:26:09 crc kubenswrapper[4707]: I0121 17:26:09.945829 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://8848cdf662db2834cec5a993e3b664d6194c9170ca7198fa60fc22fd6cc87b79" gracePeriod=600 Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.149216 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.269531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-catalog-content\") pod \"7c33f576-a422-4ab6-90ee-586eded9fb1d\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.269621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztnb2\" (UniqueName: \"kubernetes.io/projected/7c33f576-a422-4ab6-90ee-586eded9fb1d-kube-api-access-ztnb2\") pod \"7c33f576-a422-4ab6-90ee-586eded9fb1d\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.269719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-utilities\") pod \"7c33f576-a422-4ab6-90ee-586eded9fb1d\" (UID: \"7c33f576-a422-4ab6-90ee-586eded9fb1d\") " Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.270675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-utilities" (OuterVolumeSpecName: "utilities") pod "7c33f576-a422-4ab6-90ee-586eded9fb1d" (UID: "7c33f576-a422-4ab6-90ee-586eded9fb1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.270972 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.286184 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c33f576-a422-4ab6-90ee-586eded9fb1d-kube-api-access-ztnb2" (OuterVolumeSpecName: "kube-api-access-ztnb2") pod "7c33f576-a422-4ab6-90ee-586eded9fb1d" (UID: "7c33f576-a422-4ab6-90ee-586eded9fb1d"). InnerVolumeSpecName "kube-api-access-ztnb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.373360 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztnb2\" (UniqueName: \"kubernetes.io/projected/7c33f576-a422-4ab6-90ee-586eded9fb1d-kube-api-access-ztnb2\") on node \"crc\" DevicePath \"\"" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.468706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c33f576-a422-4ab6-90ee-586eded9fb1d" (UID: "7c33f576-a422-4ab6-90ee-586eded9fb1d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.475045 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c33f576-a422-4ab6-90ee-586eded9fb1d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.813932 4707 generic.go:334] "Generic (PLEG): container finished" podID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerID="b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4" exitCode=0 Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.813986 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lkhbz" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.814005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerDied","Data":"b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4"} Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.814795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkhbz" event={"ID":"7c33f576-a422-4ab6-90ee-586eded9fb1d","Type":"ContainerDied","Data":"1267e6b358ec46065d63dad16f82cdc055131d749d1858606d01bae63b89da27"} Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.814836 4707 scope.go:117] "RemoveContainer" containerID="b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.816610 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="8848cdf662db2834cec5a993e3b664d6194c9170ca7198fa60fc22fd6cc87b79" exitCode=0 Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.816639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"8848cdf662db2834cec5a993e3b664d6194c9170ca7198fa60fc22fd6cc87b79"} Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.816666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be"} Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.831979 4707 scope.go:117] "RemoveContainer" containerID="1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.840336 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lkhbz"] Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.844718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lkhbz"] Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.845324 4707 scope.go:117] "RemoveContainer" containerID="ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.860286 4707 scope.go:117] "RemoveContainer" containerID="b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4" Jan 21 17:26:10 crc kubenswrapper[4707]: E0121 17:26:10.860597 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4\": container with ID starting with b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4 not found: ID does not exist" containerID="b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.860630 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4"} err="failed to get container status \"b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4\": rpc error: code = NotFound desc = could not find container \"b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4\": container with ID starting with b8322878711e37b0e54847ddc14a4e0499c88dbd62fb7029c525afbe94bd6de4 not found: ID does not exist" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.860650 4707 scope.go:117] "RemoveContainer" containerID="1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53" Jan 21 17:26:10 crc kubenswrapper[4707]: E0121 17:26:10.860895 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53\": container with ID starting with 1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53 not found: ID does not exist" containerID="1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.860937 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53"} err="failed to get container status \"1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53\": rpc error: code = NotFound desc = could not find container \"1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53\": container with ID starting with 1d970ea42e797da9a315a3c2767898da4fafdcf782f9a06be9f49e9398474d53 not found: ID does not exist" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.860954 4707 scope.go:117] "RemoveContainer" containerID="ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2" Jan 21 17:26:10 crc kubenswrapper[4707]: E0121 17:26:10.861205 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2\": container with ID starting with ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2 not found: ID does not exist" containerID="ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.861245 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2"} err="failed to get container status \"ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2\": rpc error: code = NotFound desc = could not find container \"ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2\": container with ID starting with ce134ca35abf24b844ba2b380264e24c82d2424bd95fa56c8653a4e5566b39f2 not found: ID does not exist" Jan 21 17:26:10 crc kubenswrapper[4707]: I0121 17:26:10.861259 4707 scope.go:117] "RemoveContainer" 
containerID="e2a134663b15ccdd5abe55629b9449c3477152966e3eae6c48520a8a0f898336" Jan 21 17:26:11 crc kubenswrapper[4707]: I0121 17:26:11.188719 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" path="/var/lib/kubelet/pods/7c33f576-a422-4ab6-90ee-586eded9fb1d/volumes" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.829377 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tx5qf"] Jan 21 17:27:23 crc kubenswrapper[4707]: E0121 17:27:23.829953 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="extract-utilities" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.829965 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="extract-utilities" Jan 21 17:27:23 crc kubenswrapper[4707]: E0121 17:27:23.829988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="extract-content" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.829995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="extract-content" Jan 21 17:27:23 crc kubenswrapper[4707]: E0121 17:27:23.830009 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="registry-server" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.830015 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="registry-server" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.830104 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c33f576-a422-4ab6-90ee-586eded9fb1d" containerName="registry-server" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.830829 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.844427 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tx5qf"] Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.857679 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-catalog-content\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.857715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-utilities\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.857781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x459\" (UniqueName: \"kubernetes.io/projected/5b59a966-eba7-49f9-8bd0-d783113a7cd7-kube-api-access-4x459\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.958907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x459\" (UniqueName: \"kubernetes.io/projected/5b59a966-eba7-49f9-8bd0-d783113a7cd7-kube-api-access-4x459\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.959166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-catalog-content\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.959249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-utilities\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.959514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-catalog-content\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.959643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-utilities\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:23 crc kubenswrapper[4707]: I0121 17:27:23.974057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4x459\" (UniqueName: \"kubernetes.io/projected/5b59a966-eba7-49f9-8bd0-d783113a7cd7-kube-api-access-4x459\") pod \"redhat-operators-tx5qf\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:24 crc kubenswrapper[4707]: I0121 17:27:24.149222 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:24 crc kubenswrapper[4707]: I0121 17:27:24.536212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tx5qf"] Jan 21 17:27:25 crc kubenswrapper[4707]: I0121 17:27:25.175043 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerID="85a3c13553726282cc21b9108584fc0ba4c900f7e38031835b75f5804a3a57aa" exitCode=0 Jan 21 17:27:25 crc kubenswrapper[4707]: I0121 17:27:25.175085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerDied","Data":"85a3c13553726282cc21b9108584fc0ba4c900f7e38031835b75f5804a3a57aa"} Jan 21 17:27:25 crc kubenswrapper[4707]: I0121 17:27:25.175201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerStarted","Data":"7f912c8ff987985179edbaa1e2e8b463a94eb3a61f25e080a10a0c6ceedf3b91"} Jan 21 17:27:26 crc kubenswrapper[4707]: I0121 17:27:26.181664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerStarted","Data":"c5ac3a2a70c2cd168b87d6d40a08f5220ea06bf1cd91cc18c99a4fc8689deda8"} Jan 21 17:27:27 crc kubenswrapper[4707]: I0121 17:27:27.190843 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerID="c5ac3a2a70c2cd168b87d6d40a08f5220ea06bf1cd91cc18c99a4fc8689deda8" exitCode=0 Jan 21 17:27:27 crc kubenswrapper[4707]: I0121 17:27:27.190879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerDied","Data":"c5ac3a2a70c2cd168b87d6d40a08f5220ea06bf1cd91cc18c99a4fc8689deda8"} Jan 21 17:27:28 crc kubenswrapper[4707]: I0121 17:27:28.197165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerStarted","Data":"987e15c0157bd4540e7ed14c7159aa950b117e9975a2bf3dfdb0457cf7e0c379"} Jan 21 17:27:28 crc kubenswrapper[4707]: I0121 17:27:28.211826 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tx5qf" podStartSLOduration=2.705775616 podStartE2EDuration="5.211799721s" podCreationTimestamp="2026-01-21 17:27:23 +0000 UTC" firstStartedPulling="2026-01-21 17:27:25.177182345 +0000 UTC m=+8742.358698568" lastFinishedPulling="2026-01-21 17:27:27.683206452 +0000 UTC m=+8744.864722673" observedRunningTime="2026-01-21 17:27:28.211399879 +0000 UTC m=+8745.392916101" watchObservedRunningTime="2026-01-21 17:27:28.211799721 +0000 UTC m=+8745.393315943" Jan 21 17:27:34 crc kubenswrapper[4707]: I0121 17:27:34.150867 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tx5qf" 
Jan 21 17:27:34 crc kubenswrapper[4707]: I0121 17:27:34.152176 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:34 crc kubenswrapper[4707]: I0121 17:27:34.182252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:34 crc kubenswrapper[4707]: I0121 17:27:34.255440 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:34 crc kubenswrapper[4707]: I0121 17:27:34.404289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tx5qf"] Jan 21 17:27:36 crc kubenswrapper[4707]: I0121 17:27:36.235015 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tx5qf" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="registry-server" containerID="cri-o://987e15c0157bd4540e7ed14c7159aa950b117e9975a2bf3dfdb0457cf7e0c379" gracePeriod=2 Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.246583 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerID="987e15c0157bd4540e7ed14c7159aa950b117e9975a2bf3dfdb0457cf7e0c379" exitCode=0 Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.246774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerDied","Data":"987e15c0157bd4540e7ed14c7159aa950b117e9975a2bf3dfdb0457cf7e0c379"} Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.358371 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.534801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-utilities\") pod \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.535176 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x459\" (UniqueName: \"kubernetes.io/projected/5b59a966-eba7-49f9-8bd0-d783113a7cd7-kube-api-access-4x459\") pod \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.535313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-catalog-content\") pod \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\" (UID: \"5b59a966-eba7-49f9-8bd0-d783113a7cd7\") " Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.550693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-utilities" (OuterVolumeSpecName: "utilities") pod "5b59a966-eba7-49f9-8bd0-d783113a7cd7" (UID: "5b59a966-eba7-49f9-8bd0-d783113a7cd7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.553965 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b59a966-eba7-49f9-8bd0-d783113a7cd7-kube-api-access-4x459" (OuterVolumeSpecName: "kube-api-access-4x459") pod "5b59a966-eba7-49f9-8bd0-d783113a7cd7" (UID: "5b59a966-eba7-49f9-8bd0-d783113a7cd7"). InnerVolumeSpecName "kube-api-access-4x459". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.626369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b59a966-eba7-49f9-8bd0-d783113a7cd7" (UID: "5b59a966-eba7-49f9-8bd0-d783113a7cd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.636726 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.636752 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b59a966-eba7-49f9-8bd0-d783113a7cd7-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:27:38 crc kubenswrapper[4707]: I0121 17:27:38.636787 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x459\" (UniqueName: \"kubernetes.io/projected/5b59a966-eba7-49f9-8bd0-d783113a7cd7-kube-api-access-4x459\") on node \"crc\" DevicePath \"\"" Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.253541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx5qf" event={"ID":"5b59a966-eba7-49f9-8bd0-d783113a7cd7","Type":"ContainerDied","Data":"7f912c8ff987985179edbaa1e2e8b463a94eb3a61f25e080a10a0c6ceedf3b91"} Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.253608 4707 scope.go:117] "RemoveContainer" containerID="987e15c0157bd4540e7ed14c7159aa950b117e9975a2bf3dfdb0457cf7e0c379" Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.253725 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tx5qf" Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.271549 4707 scope.go:117] "RemoveContainer" containerID="c5ac3a2a70c2cd168b87d6d40a08f5220ea06bf1cd91cc18c99a4fc8689deda8" Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.272193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tx5qf"] Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.279103 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tx5qf"] Jan 21 17:27:39 crc kubenswrapper[4707]: I0121 17:27:39.284188 4707 scope.go:117] "RemoveContainer" containerID="85a3c13553726282cc21b9108584fc0ba4c900f7e38031835b75f5804a3a57aa" Jan 21 17:27:41 crc kubenswrapper[4707]: I0121 17:27:41.188121 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" path="/var/lib/kubelet/pods/5b59a966-eba7-49f9-8bd0-d783113a7cd7/volumes" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.039216 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5gvxc"] Jan 21 17:28:01 crc kubenswrapper[4707]: E0121 17:28:01.040133 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="extract-content" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.040148 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="extract-content" Jan 21 17:28:01 crc kubenswrapper[4707]: E0121 17:28:01.040161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="extract-utilities" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.040167 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="extract-utilities" Jan 21 17:28:01 crc kubenswrapper[4707]: E0121 17:28:01.040193 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="registry-server" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.040200 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="registry-server" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.040329 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b59a966-eba7-49f9-8bd0-d783113a7cd7" containerName="registry-server" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.045615 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.052347 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gvxc"] Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.098603 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-utilities\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.098677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6r6\" (UniqueName: \"kubernetes.io/projected/b0c5561d-e7a5-49cd-83d3-248c2aeea936-kube-api-access-lx6r6\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.098741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-catalog-content\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.199493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-utilities\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.199556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6r6\" (UniqueName: \"kubernetes.io/projected/b0c5561d-e7a5-49cd-83d3-248c2aeea936-kube-api-access-lx6r6\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.199604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-catalog-content\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.199988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-catalog-content\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.200099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-utilities\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.225698 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lx6r6\" (UniqueName: \"kubernetes.io/projected/b0c5561d-e7a5-49cd-83d3-248c2aeea936-kube-api-access-lx6r6\") pod \"redhat-marketplace-5gvxc\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.361982 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:01 crc kubenswrapper[4707]: I0121 17:28:01.555870 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gvxc"] Jan 21 17:28:02 crc kubenswrapper[4707]: I0121 17:28:02.410527 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerID="3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c" exitCode=0 Jan 21 17:28:02 crc kubenswrapper[4707]: I0121 17:28:02.410709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerDied","Data":"3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c"} Jan 21 17:28:02 crc kubenswrapper[4707]: I0121 17:28:02.410731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerStarted","Data":"bd44a0a73e880a6b9014eff2173765c8fecf6dd50189cafb825950c7bbf9413d"} Jan 21 17:28:03 crc kubenswrapper[4707]: I0121 17:28:03.428716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerStarted","Data":"b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d"} Jan 21 17:28:04 crc kubenswrapper[4707]: I0121 17:28:04.433987 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerID="b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d" exitCode=0 Jan 21 17:28:04 crc kubenswrapper[4707]: I0121 17:28:04.434233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerDied","Data":"b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d"} Jan 21 17:28:05 crc kubenswrapper[4707]: I0121 17:28:05.441314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerStarted","Data":"2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061"} Jan 21 17:28:05 crc kubenswrapper[4707]: I0121 17:28:05.461494 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5gvxc" podStartSLOduration=1.939075552 podStartE2EDuration="4.461482051s" podCreationTimestamp="2026-01-21 17:28:01 +0000 UTC" firstStartedPulling="2026-01-21 17:28:02.412176689 +0000 UTC m=+8779.593692911" lastFinishedPulling="2026-01-21 17:28:04.934583188 +0000 UTC m=+8782.116099410" observedRunningTime="2026-01-21 17:28:05.45810939 +0000 UTC m=+8782.639625612" watchObservedRunningTime="2026-01-21 17:28:05.461482051 +0000 UTC m=+8782.642998273" Jan 21 17:28:11 crc kubenswrapper[4707]: I0121 17:28:11.362518 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:11 crc kubenswrapper[4707]: I0121 17:28:11.363090 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:11 crc kubenswrapper[4707]: I0121 17:28:11.391158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:11 crc kubenswrapper[4707]: I0121 17:28:11.501283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:11 crc kubenswrapper[4707]: I0121 17:28:11.846945 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gvxc"] Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.481735 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5gvxc" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="registry-server" containerID="cri-o://2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061" gracePeriod=2 Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.771291 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.874795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-catalog-content\") pod \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.874882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6r6\" (UniqueName: \"kubernetes.io/projected/b0c5561d-e7a5-49cd-83d3-248c2aeea936-kube-api-access-lx6r6\") pod \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.874930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-utilities\") pod \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\" (UID: \"b0c5561d-e7a5-49cd-83d3-248c2aeea936\") " Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.875843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-utilities" (OuterVolumeSpecName: "utilities") pod "b0c5561d-e7a5-49cd-83d3-248c2aeea936" (UID: "b0c5561d-e7a5-49cd-83d3-248c2aeea936"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.879647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c5561d-e7a5-49cd-83d3-248c2aeea936-kube-api-access-lx6r6" (OuterVolumeSpecName: "kube-api-access-lx6r6") pod "b0c5561d-e7a5-49cd-83d3-248c2aeea936" (UID: "b0c5561d-e7a5-49cd-83d3-248c2aeea936"). InnerVolumeSpecName "kube-api-access-lx6r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.890735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0c5561d-e7a5-49cd-83d3-248c2aeea936" (UID: "b0c5561d-e7a5-49cd-83d3-248c2aeea936"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.976632 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.976666 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx6r6\" (UniqueName: \"kubernetes.io/projected/b0c5561d-e7a5-49cd-83d3-248c2aeea936-kube-api-access-lx6r6\") on node \"crc\" DevicePath \"\"" Jan 21 17:28:13 crc kubenswrapper[4707]: I0121 17:28:13.976677 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c5561d-e7a5-49cd-83d3-248c2aeea936-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.488302 4707 generic.go:334] "Generic (PLEG): container finished" podID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerID="2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061" exitCode=0 Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.488342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerDied","Data":"2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061"} Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.488359 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gvxc" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.488378 4707 scope.go:117] "RemoveContainer" containerID="2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.488366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gvxc" event={"ID":"b0c5561d-e7a5-49cd-83d3-248c2aeea936","Type":"ContainerDied","Data":"bd44a0a73e880a6b9014eff2173765c8fecf6dd50189cafb825950c7bbf9413d"} Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.511858 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gvxc"] Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.514226 4707 scope.go:117] "RemoveContainer" containerID="b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.517326 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gvxc"] Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.525861 4707 scope.go:117] "RemoveContainer" containerID="3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.542143 4707 scope.go:117] "RemoveContainer" containerID="2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061" Jan 21 17:28:14 crc kubenswrapper[4707]: E0121 17:28:14.542615 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061\": container with ID starting with 2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061 not found: ID does not exist" containerID="2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.542645 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061"} err="failed to get container status \"2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061\": rpc error: code = NotFound desc = could not find container \"2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061\": container with ID starting with 2a79e8a71fbfb301dc1ddfc79b72dfa225f5d3d0a4841a967e6e4e5d48b19061 not found: ID does not exist" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.542664 4707 scope.go:117] "RemoveContainer" containerID="b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d" Jan 21 17:28:14 crc kubenswrapper[4707]: E0121 17:28:14.542944 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d\": container with ID starting with b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d not found: ID does not exist" containerID="b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.542974 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d"} err="failed to get container status \"b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d\": rpc error: code = NotFound desc = could not find 
container \"b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d\": container with ID starting with b68474eb586dc1731f18d08575d7e5a51de9d0a98f317719366ee6bfb525612d not found: ID does not exist" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.542992 4707 scope.go:117] "RemoveContainer" containerID="3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c" Jan 21 17:28:14 crc kubenswrapper[4707]: E0121 17:28:14.543252 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c\": container with ID starting with 3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c not found: ID does not exist" containerID="3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c" Jan 21 17:28:14 crc kubenswrapper[4707]: I0121 17:28:14.543275 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c"} err="failed to get container status \"3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c\": rpc error: code = NotFound desc = could not find container \"3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c\": container with ID starting with 3ea64efba02f859f0563d910b62689d7fbf0f44da4dc82bf56f3d294f8e2204c not found: ID does not exist" Jan 21 17:28:15 crc kubenswrapper[4707]: I0121 17:28:15.187681 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" path="/var/lib/kubelet/pods/b0c5561d-e7a5-49cd-83d3-248c2aeea936/volumes" Jan 21 17:28:39 crc kubenswrapper[4707]: I0121 17:28:39.945661 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:28:39 crc kubenswrapper[4707]: I0121 17:28:39.946061 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:29:09 crc kubenswrapper[4707]: I0121 17:29:09.946143 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:29:09 crc kubenswrapper[4707]: I0121 17:29:09.946468 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.099432 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bkpjz/must-gather-v4nkr"] Jan 21 17:29:15 crc kubenswrapper[4707]: E0121 17:29:15.099990 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" 
containerName="extract-content" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.100003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="extract-content" Jan 21 17:29:15 crc kubenswrapper[4707]: E0121 17:29:15.100023 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="extract-utilities" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.100029 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="extract-utilities" Jan 21 17:29:15 crc kubenswrapper[4707]: E0121 17:29:15.100040 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="registry-server" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.100045 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="registry-server" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.100167 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c5561d-e7a5-49cd-83d3-248c2aeea936" containerName="registry-server" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.100768 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.102139 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bkpjz"/"kube-root-ca.crt" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.102218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bkpjz"/"default-dockercfg-6jnlg" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.102435 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bkpjz"/"openshift-service-ca.crt" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.107921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bkpjz/must-gather-v4nkr"] Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.177249 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnvv\" (UniqueName: \"kubernetes.io/projected/068c804b-00d8-478d-8b11-368ea1f234df-kube-api-access-tsnvv\") pod \"must-gather-v4nkr\" (UID: \"068c804b-00d8-478d-8b11-368ea1f234df\") " pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.177453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/068c804b-00d8-478d-8b11-368ea1f234df-must-gather-output\") pod \"must-gather-v4nkr\" (UID: \"068c804b-00d8-478d-8b11-368ea1f234df\") " pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.278333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnvv\" (UniqueName: \"kubernetes.io/projected/068c804b-00d8-478d-8b11-368ea1f234df-kube-api-access-tsnvv\") pod \"must-gather-v4nkr\" (UID: \"068c804b-00d8-478d-8b11-368ea1f234df\") " pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.278564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/068c804b-00d8-478d-8b11-368ea1f234df-must-gather-output\") pod \"must-gather-v4nkr\" (UID: \"068c804b-00d8-478d-8b11-368ea1f234df\") " pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.279197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/068c804b-00d8-478d-8b11-368ea1f234df-must-gather-output\") pod \"must-gather-v4nkr\" (UID: \"068c804b-00d8-478d-8b11-368ea1f234df\") " pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.293831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsnvv\" (UniqueName: \"kubernetes.io/projected/068c804b-00d8-478d-8b11-368ea1f234df-kube-api-access-tsnvv\") pod \"must-gather-v4nkr\" (UID: \"068c804b-00d8-478d-8b11-368ea1f234df\") " pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.414304 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bkpjz/must-gather-v4nkr" Jan 21 17:29:15 crc kubenswrapper[4707]: I0121 17:29:15.943141 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bkpjz/must-gather-v4nkr"] Jan 21 17:29:16 crc kubenswrapper[4707]: I0121 17:29:16.824289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bkpjz/must-gather-v4nkr" event={"ID":"068c804b-00d8-478d-8b11-368ea1f234df","Type":"ContainerStarted","Data":"a621107ac884b655ad1a4c1adab1b1f2e9a66a307df637b8b9d9104f82ecff98"} Jan 21 17:29:16 crc kubenswrapper[4707]: I0121 17:29:16.824506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bkpjz/must-gather-v4nkr" event={"ID":"068c804b-00d8-478d-8b11-368ea1f234df","Type":"ContainerStarted","Data":"ea0654f9369924211c483a3d4486fc8e77e39cf1b1b737313fecd32aad9041b1"} Jan 21 17:29:16 crc kubenswrapper[4707]: I0121 17:29:16.824521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bkpjz/must-gather-v4nkr" event={"ID":"068c804b-00d8-478d-8b11-368ea1f234df","Type":"ContainerStarted","Data":"f5a252ff2244738f14d284b384fffd60220653e9f23483c1ee2be40cba1f681c"} Jan 21 17:29:16 crc kubenswrapper[4707]: I0121 17:29:16.837372 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bkpjz/must-gather-v4nkr" podStartSLOduration=1.837356342 podStartE2EDuration="1.837356342s" podCreationTimestamp="2026-01-21 17:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:29:16.834092525 +0000 UTC m=+8854.015608747" watchObservedRunningTime="2026-01-21 17:29:16.837356342 +0000 UTC m=+8854.018872564" Jan 21 17:29:39 crc kubenswrapper[4707]: I0121 17:29:39.945466 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:29:39 crc kubenswrapper[4707]: I0121 17:29:39.946072 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:29:39 crc kubenswrapper[4707]: I0121 17:29:39.946110 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:29:39 crc kubenswrapper[4707]: I0121 17:29:39.946584 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:29:39 crc kubenswrapper[4707]: I0121 17:29:39.946636 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" gracePeriod=600 Jan 21 17:29:40 crc kubenswrapper[4707]: E0121 17:29:40.060746 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:29:40 crc kubenswrapper[4707]: I0121 17:29:40.951622 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" exitCode=0 Jan 21 17:29:40 crc kubenswrapper[4707]: I0121 17:29:40.951671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be"} Jan 21 17:29:40 crc kubenswrapper[4707]: I0121 17:29:40.951717 4707 scope.go:117] "RemoveContainer" containerID="8848cdf662db2834cec5a993e3b664d6194c9170ca7198fa60fc22fd6cc87b79" Jan 21 17:29:40 crc kubenswrapper[4707]: I0121 17:29:40.952080 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:29:40 crc kubenswrapper[4707]: E0121 17:29:40.952255 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:29:43 crc kubenswrapper[4707]: I0121 17:29:43.620097 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/controller/0.log" Jan 21 17:29:43 crc kubenswrapper[4707]: I0121 17:29:43.624077 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/kube-rbac-proxy/0.log" Jan 21 17:29:43 crc kubenswrapper[4707]: 
I0121 17:29:43.642292 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/controller/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.783623 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.790615 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/reloader/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.794149 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr-metrics/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.798645 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.803541 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy-frr/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.809481 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-frr-files/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.815150 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-reloader/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.821276 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-metrics/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.832602 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-czn2c_a50f5957-c813-4be6-846c-18fa67c16942/frr-k8s-webhook-server/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.846779 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769647f86-fhptv_3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78/manager/0.log" Jan 21 17:29:44 crc kubenswrapper[4707]: I0121 17:29:44.853126 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58ff8dbc9-jm9z5_560343d7-3dae-42d2-b79f-e7ddbde3df71/webhook-server/0.log" Jan 21 17:29:45 crc kubenswrapper[4707]: I0121 17:29:45.893423 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/speaker/0.log" Jan 21 17:29:45 crc kubenswrapper[4707]: I0121 17:29:45.897738 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/kube-rbac-proxy/0.log" Jan 21 17:29:51 crc kubenswrapper[4707]: I0121 17:29:51.847889 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bjb5q_4a760eb2-4041-468b-a269-13e523796dd0/control-plane-machine-set-operator/0.log" Jan 21 17:29:51 crc kubenswrapper[4707]: I0121 17:29:51.858842 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/kube-rbac-proxy/0.log" Jan 21 17:29:51 crc kubenswrapper[4707]: I0121 17:29:51.866763 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/machine-api-operator/0.log" Jan 21 17:29:55 crc kubenswrapper[4707]: I0121 17:29:55.545082 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pdx6x_34c2061b-4028-4e99-9345-d8443062558a/cert-manager-controller/0.log" Jan 21 17:29:55 crc kubenswrapper[4707]: I0121 17:29:55.607163 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-tbzv6_18724316-30e9-4c09-baa1-d3d3971f97de/cert-manager-cainjector/0.log" Jan 21 17:29:55 crc kubenswrapper[4707]: I0121 17:29:55.615451 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p7vr2_d59f005b-d150-4190-b8ca-d757c89269f4/cert-manager-webhook/0.log" Jan 21 17:29:56 crc kubenswrapper[4707]: I0121 17:29:56.182732 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:29:56 crc kubenswrapper[4707]: E0121 17:29:56.183141 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:29:58 crc kubenswrapper[4707]: I0121 17:29:58.824021 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-g8mcx_7bf6f36b-51bd-4196-b732-7807475b38b2/nmstate-console-plugin/0.log" Jan 21 17:29:58 crc kubenswrapper[4707]: I0121 17:29:58.843581 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m4cqg_72f87fd4-c9ff-4410-ba80-fdc2525a37b8/nmstate-handler/0.log" Jan 21 17:29:58 crc kubenswrapper[4707]: I0121 17:29:58.850150 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/nmstate-metrics/0.log" Jan 21 17:29:58 crc kubenswrapper[4707]: I0121 17:29:58.856066 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/kube-rbac-proxy/0.log" Jan 21 17:29:58 crc kubenswrapper[4707]: I0121 17:29:58.868300 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-kvjjm_c7e81409-5fc8-4b0d-9c98-dfaf6167892d/nmstate-operator/0.log" Jan 21 17:29:58 crc kubenswrapper[4707]: I0121 17:29:58.882301 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-64vq9_3dc50ee6-0d24-4790-a13d-caaf61114685/nmstate-webhook/0.log" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.130586 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk"] Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.131299 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.132873 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.133114 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.136617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk"] Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.302026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633d223b-8b58-44cc-8d9b-670aefc2f0ca-secret-volume\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.302159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633d223b-8b58-44cc-8d9b-670aefc2f0ca-config-volume\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.302203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbgr\" (UniqueName: \"kubernetes.io/projected/633d223b-8b58-44cc-8d9b-670aefc2f0ca-kube-api-access-dvbgr\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.403775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633d223b-8b58-44cc-8d9b-670aefc2f0ca-secret-volume\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.403866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633d223b-8b58-44cc-8d9b-670aefc2f0ca-config-volume\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.403897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbgr\" (UniqueName: \"kubernetes.io/projected/633d223b-8b58-44cc-8d9b-670aefc2f0ca-kube-api-access-dvbgr\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.405704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633d223b-8b58-44cc-8d9b-670aefc2f0ca-config-volume\") pod 
\"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.409587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633d223b-8b58-44cc-8d9b-670aefc2f0ca-secret-volume\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.430860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbgr\" (UniqueName: \"kubernetes.io/projected/633d223b-8b58-44cc-8d9b-670aefc2f0ca-kube-api-access-dvbgr\") pod \"collect-profiles-29483610-24mmk\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.445000 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:00 crc kubenswrapper[4707]: I0121 17:30:00.793760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk"] Jan 21 17:30:01 crc kubenswrapper[4707]: I0121 17:30:01.045326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" event={"ID":"633d223b-8b58-44cc-8d9b-670aefc2f0ca","Type":"ContainerStarted","Data":"58450963fa4afd080065b1c68f55de64277081377f0e1689f234ca4f8fd596bc"} Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.050905 4707 generic.go:334] "Generic (PLEG): container finished" podID="633d223b-8b58-44cc-8d9b-670aefc2f0ca" containerID="81ef5bf2727ef923ae694e62899a2b0790c7e126a33abef24f80e536e68e4398" exitCode=0 Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.050991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" event={"ID":"633d223b-8b58-44cc-8d9b-670aefc2f0ca","Type":"ContainerDied","Data":"81ef5bf2727ef923ae694e62899a2b0790c7e126a33abef24f80e536e68e4398"} Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.564327 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vcvdr_b57455b4-f5cc-4e50-972f-299a31cc4eba/prometheus-operator/0.log" Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.571945 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc_50a8c4cf-f6ea-4042-900f-37b63445f2d3/prometheus-operator-admission-webhook/0.log" Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.579168 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc_f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d/prometheus-operator-admission-webhook/0.log" Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.601469 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hc5fq_de08c513-221d-41e8-a1b3-fc4b21d19173/operator/0.log" Jan 21 17:30:02 crc kubenswrapper[4707]: I0121 17:30:02.607735 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n7h88_a77834d8-cde4-449d-b56f-6d3b53b0cadd/perses-operator/0.log" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.269635 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.444351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633d223b-8b58-44cc-8d9b-670aefc2f0ca-config-volume\") pod \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.444404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633d223b-8b58-44cc-8d9b-670aefc2f0ca-secret-volume\") pod \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.444430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvbgr\" (UniqueName: \"kubernetes.io/projected/633d223b-8b58-44cc-8d9b-670aefc2f0ca-kube-api-access-dvbgr\") pod \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\" (UID: \"633d223b-8b58-44cc-8d9b-670aefc2f0ca\") " Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.445504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/633d223b-8b58-44cc-8d9b-670aefc2f0ca-config-volume" (OuterVolumeSpecName: "config-volume") pod "633d223b-8b58-44cc-8d9b-670aefc2f0ca" (UID: "633d223b-8b58-44cc-8d9b-670aefc2f0ca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.454925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/633d223b-8b58-44cc-8d9b-670aefc2f0ca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "633d223b-8b58-44cc-8d9b-670aefc2f0ca" (UID: "633d223b-8b58-44cc-8d9b-670aefc2f0ca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.454953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633d223b-8b58-44cc-8d9b-670aefc2f0ca-kube-api-access-dvbgr" (OuterVolumeSpecName: "kube-api-access-dvbgr") pod "633d223b-8b58-44cc-8d9b-670aefc2f0ca" (UID: "633d223b-8b58-44cc-8d9b-670aefc2f0ca"). InnerVolumeSpecName "kube-api-access-dvbgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.546251 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633d223b-8b58-44cc-8d9b-670aefc2f0ca-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.546284 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633d223b-8b58-44cc-8d9b-670aefc2f0ca-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:30:03 crc kubenswrapper[4707]: I0121 17:30:03.546295 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvbgr\" (UniqueName: \"kubernetes.io/projected/633d223b-8b58-44cc-8d9b-670aefc2f0ca-kube-api-access-dvbgr\") on node \"crc\" DevicePath \"\"" Jan 21 17:30:04 crc kubenswrapper[4707]: I0121 17:30:04.060889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" event={"ID":"633d223b-8b58-44cc-8d9b-670aefc2f0ca","Type":"ContainerDied","Data":"58450963fa4afd080065b1c68f55de64277081377f0e1689f234ca4f8fd596bc"} Jan 21 17:30:04 crc kubenswrapper[4707]: I0121 17:30:04.060925 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58450963fa4afd080065b1c68f55de64277081377f0e1689f234ca4f8fd596bc" Jan 21 17:30:04 crc kubenswrapper[4707]: I0121 17:30:04.060943 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk" Jan 21 17:30:04 crc kubenswrapper[4707]: I0121 17:30:04.316678 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc"] Jan 21 17:30:04 crc kubenswrapper[4707]: I0121 17:30:04.319863 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-pw8lc"] Jan 21 17:30:05 crc kubenswrapper[4707]: I0121 17:30:05.188442 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00fcf38c-116c-403d-9482-e34bca859394" path="/var/lib/kubelet/pods/00fcf38c-116c-403d-9482-e34bca859394/volumes" Jan 21 17:30:06 crc kubenswrapper[4707]: I0121 17:30:06.476696 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/controller/0.log" Jan 21 17:30:06 crc kubenswrapper[4707]: I0121 17:30:06.482112 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/kube-rbac-proxy/0.log" Jan 21 17:30:06 crc kubenswrapper[4707]: I0121 17:30:06.496575 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/controller/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.613631 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.621026 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/reloader/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.624180 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr-metrics/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.630616 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.635167 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy-frr/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.640407 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-frr-files/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.644989 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-reloader/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.649836 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-metrics/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.656863 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-czn2c_a50f5957-c813-4be6-846c-18fa67c16942/frr-k8s-webhook-server/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.673155 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769647f86-fhptv_3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78/manager/0.log" Jan 21 17:30:07 crc kubenswrapper[4707]: I0121 17:30:07.681955 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58ff8dbc9-jm9z5_560343d7-3dae-42d2-b79f-e7ddbde3df71/webhook-server/0.log" Jan 21 17:30:08 crc kubenswrapper[4707]: I0121 17:30:08.688131 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/speaker/0.log" Jan 21 17:30:08 crc kubenswrapper[4707]: I0121 17:30:08.807150 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/kube-rbac-proxy/0.log" Jan 21 17:30:09 crc kubenswrapper[4707]: I0121 17:30:09.182637 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:30:09 crc kubenswrapper[4707]: E0121 17:30:09.182845 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:30:22 crc kubenswrapper[4707]: I0121 17:30:22.169532 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-55945_375fc46a-93db-46c4-9f95-b0f17f2e3c1c/dnsmasq-dns/0.log" Jan 21 17:30:22 crc kubenswrapper[4707]: I0121 17:30:22.175607 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-55945_375fc46a-93db-46c4-9f95-b0f17f2e3c1c/init/0.log" Jan 21 17:30:22 crc kubenswrapper[4707]: I0121 17:30:22.182291 4707 
scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:30:22 crc kubenswrapper[4707]: E0121 17:30:22.182533 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.664146 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k_5ee91b4f-b397-4648-a2ab-dbf5699be347/extract/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.672404 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k_5ee91b4f-b397-4648-a2ab-dbf5699be347/util/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.694649 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a65g4k_5ee91b4f-b397-4648-a2ab-dbf5699be347/pull/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.700793 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq_ace82b3e-606d-4deb-ab5f-04fd56c3d8d3/extract/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.705500 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq_ace82b3e-606d-4deb-ab5f-04fd56c3d8d3/util/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.711938 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9hmqq_ace82b3e-606d-4deb-ab5f-04fd56c3d8d3/pull/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.720158 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj_4c693f1a-3959-41d4-8209-ff55cf48b2ea/extract/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.725784 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj_4c693f1a-3959-41d4-8209-ff55cf48b2ea/util/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.732175 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713tvngj_4c693f1a-3959-41d4-8209-ff55cf48b2ea/pull/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.743576 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh_d5e44698-abdb-4d2b-8058-ec06409cd1d3/extract/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 17:30:26.749044 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh_d5e44698-abdb-4d2b-8058-ec06409cd1d3/util/0.log" Jan 21 17:30:26 crc kubenswrapper[4707]: I0121 
17:30:26.755205 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5zkh_d5e44698-abdb-4d2b-8058-ec06409cd1d3/pull/0.log" Jan 21 17:30:27 crc kubenswrapper[4707]: I0121 17:30:27.882777 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9gkb_22983f7b-d717-456f-9efe-3ff6a7342a68/registry-server/0.log" Jan 21 17:30:27 crc kubenswrapper[4707]: I0121 17:30:27.887747 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9gkb_22983f7b-d717-456f-9efe-3ff6a7342a68/extract-utilities/0.log" Jan 21 17:30:27 crc kubenswrapper[4707]: I0121 17:30:27.893999 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9gkb_22983f7b-d717-456f-9efe-3ff6a7342a68/extract-content/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.133048 4707 scope.go:117] "RemoveContainer" containerID="2106a2350800214b3be364fc6c47bb17859836779afbd4f76d7b13740ce5a66a" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.277568 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s8jx4_d67efa67-67d4-4f70-ac0e-cdbd6f76d739/registry-server/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.281294 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s8jx4_d67efa67-67d4-4f70-ac0e-cdbd6f76d739/extract-utilities/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.286441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s8jx4_d67efa67-67d4-4f70-ac0e-cdbd6f76d739/extract-content/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.302359 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xcp6l_654186e4-d936-41b5-8967-fb853141daf4/marketplace-operator/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.592661 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mjbnh_978a715c-0a07-4197-9617-ae4c81c49d34/registry-server/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.596970 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mjbnh_978a715c-0a07-4197-9617-ae4c81c49d34/extract-utilities/0.log" Jan 21 17:30:29 crc kubenswrapper[4707]: I0121 17:30:29.602485 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mjbnh_978a715c-0a07-4197-9617-ae4c81c49d34/extract-content/0.log" Jan 21 17:30:30 crc kubenswrapper[4707]: I0121 17:30:30.715287 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5qt95_ba1d939f-e4be-4740-83ef-b93efb9d0db8/registry-server/0.log" Jan 21 17:30:30 crc kubenswrapper[4707]: I0121 17:30:30.719395 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5qt95_ba1d939f-e4be-4740-83ef-b93efb9d0db8/extract-utilities/0.log" Jan 21 17:30:30 crc kubenswrapper[4707]: I0121 17:30:30.724843 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5qt95_ba1d939f-e4be-4740-83ef-b93efb9d0db8/extract-content/0.log" Jan 21 17:30:31 crc kubenswrapper[4707]: I0121 17:30:31.559318 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vcvdr_b57455b4-f5cc-4e50-972f-299a31cc4eba/prometheus-operator/0.log" Jan 21 17:30:31 crc kubenswrapper[4707]: I0121 17:30:31.565981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc_50a8c4cf-f6ea-4042-900f-37b63445f2d3/prometheus-operator-admission-webhook/0.log" Jan 21 17:30:31 crc kubenswrapper[4707]: I0121 17:30:31.573143 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc_f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d/prometheus-operator-admission-webhook/0.log" Jan 21 17:30:31 crc kubenswrapper[4707]: I0121 17:30:31.590765 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hc5fq_de08c513-221d-41e8-a1b3-fc4b21d19173/operator/0.log" Jan 21 17:30:31 crc kubenswrapper[4707]: I0121 17:30:31.598981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n7h88_a77834d8-cde4-449d-b56f-6d3b53b0cadd/perses-operator/0.log" Jan 21 17:30:35 crc kubenswrapper[4707]: I0121 17:30:35.182457 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:30:35 crc kubenswrapper[4707]: E0121 17:30:35.182796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:30:49 crc kubenswrapper[4707]: I0121 17:30:49.182235 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:30:49 crc kubenswrapper[4707]: E0121 17:30:49.182796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:31:03 crc kubenswrapper[4707]: I0121 17:31:03.185124 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:31:03 crc kubenswrapper[4707]: E0121 17:31:03.185647 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:31:16 crc kubenswrapper[4707]: I0121 17:31:16.182228 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:31:16 crc kubenswrapper[4707]: E0121 17:31:16.182898 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:31:30 crc kubenswrapper[4707]: I0121 17:31:30.182887 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:31:30 crc kubenswrapper[4707]: E0121 17:31:30.183443 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:31:41 crc kubenswrapper[4707]: I0121 17:31:41.183050 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:31:41 crc kubenswrapper[4707]: E0121 17:31:41.183578 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:31:55 crc kubenswrapper[4707]: I0121 17:31:55.185778 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:31:55 crc kubenswrapper[4707]: E0121 17:31:55.186246 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:31:55 crc kubenswrapper[4707]: I0121 17:31:55.806230 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vcvdr_b57455b4-f5cc-4e50-972f-299a31cc4eba/prometheus-operator/0.log" Jan 21 17:31:55 crc kubenswrapper[4707]: I0121 17:31:55.813309 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-9lmbc_50a8c4cf-f6ea-4042-900f-37b63445f2d3/prometheus-operator-admission-webhook/0.log" Jan 21 17:31:55 crc kubenswrapper[4707]: I0121 17:31:55.820840 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f99db7474-nr7qc_f13b2937-d7b0-4dcb-8589-fd4c2c19cd5d/prometheus-operator-admission-webhook/0.log" Jan 21 17:31:55 crc kubenswrapper[4707]: I0121 17:31:55.845294 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hc5fq_de08c513-221d-41e8-a1b3-fc4b21d19173/operator/0.log" Jan 21 17:31:55 crc kubenswrapper[4707]: I0121 17:31:55.852675 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-n7h88_a77834d8-cde4-449d-b56f-6d3b53b0cadd/perses-operator/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.184012 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pdx6x_34c2061b-4028-4e99-9345-d8443062558a/cert-manager-controller/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.245920 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-tbzv6_18724316-30e9-4c09-baa1-d3d3971f97de/cert-manager-cainjector/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.256856 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p7vr2_d59f005b-d150-4190-b8ca-d757c89269f4/cert-manager-webhook/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.793481 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/controller/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.798943 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-xwn92_d5bd042b-d9fe-489e-9d06-de8294e48164/kube-rbac-proxy/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.816313 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/controller/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.884705 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-g8mcx_7bf6f36b-51bd-4196-b732-7807475b38b2/nmstate-console-plugin/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.900717 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m4cqg_72f87fd4-c9ff-4410-ba80-fdc2525a37b8/nmstate-handler/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.907374 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/nmstate-metrics/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.912894 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-h2gvw_591eebf5-40db-445a-a3ab-23a5f88f501c/kube-rbac-proxy/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.926517 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-kvjjm_c7e81409-5fc8-4b0d-9c98-dfaf6167892d/nmstate-operator/0.log" Jan 21 17:31:56 crc kubenswrapper[4707]: I0121 17:31:56.941245 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-64vq9_3dc50ee6-0d24-4790-a13d-caaf61114685/nmstate-webhook/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.007403 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.014404 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/reloader/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.021038 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/frr-metrics/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.027050 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.032011 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/kube-rbac-proxy-frr/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.037853 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-frr-files/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.042550 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-reloader/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.047560 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdjrb_3a1c1f4e-4f10-4b38-9daa-0e8a560dc2d3/cp-metrics/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.055048 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-czn2c_a50f5957-c813-4be6-846c-18fa67c16942/frr-k8s-webhook-server/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.070201 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769647f86-fhptv_3d3dbff1-41f3-49a2-bf4d-9d4c1dcdef78/manager/0.log" Jan 21 17:31:58 crc kubenswrapper[4707]: I0121 17:31:58.076398 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58ff8dbc9-jm9z5_560343d7-3dae-42d2-b79f-e7ddbde3df71/webhook-server/0.log" Jan 21 17:31:59 crc kubenswrapper[4707]: I0121 17:31:59.179140 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/speaker/0.log" Jan 21 17:31:59 crc kubenswrapper[4707]: I0121 17:31:59.193487 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v97x2_e01d0445-0202-4f05-8e3b-cae0bd5b2468/kube-rbac-proxy/0.log" Jan 21 17:31:59 crc kubenswrapper[4707]: I0121 17:31:59.965521 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pdx6x_34c2061b-4028-4e99-9345-d8443062558a/cert-manager-controller/0.log" Jan 21 17:32:00 crc kubenswrapper[4707]: I0121 17:32:00.022742 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-tbzv6_18724316-30e9-4c09-baa1-d3d3971f97de/cert-manager-cainjector/0.log" Jan 21 17:32:00 crc kubenswrapper[4707]: I0121 17:32:00.032094 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p7vr2_d59f005b-d150-4190-b8ca-d757c89269f4/cert-manager-webhook/0.log" Jan 21 17:32:00 crc kubenswrapper[4707]: I0121 17:32:00.493601 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bjb5q_4a760eb2-4041-468b-a269-13e523796dd0/control-plane-machine-set-operator/0.log" Jan 21 17:32:00 crc kubenswrapper[4707]: I0121 17:32:00.501436 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/kube-rbac-proxy/0.log" Jan 21 17:32:00 crc kubenswrapper[4707]: I0121 17:32:00.507903 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ggf7f_6896438b-9d7e-4637-ac39-63c12dbf0e66/machine-api-operator/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.476839 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/kube-multus-additional-cni-plugins/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.484017 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/egress-router-binary-copy/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.489240 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/cni-plugins/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.495428 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/bond-cni-plugin/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.501034 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/routeoverride-cni/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.507062 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/whereabouts-cni-bincopy/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.513022 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk2p2_f6eefe63-30b0-46e6-af7b-3909a3849128/whereabouts-cni/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.589385 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-kjvk4_e82fcc54-393c-4981-a666-769ed5c9c92d/multus-admission-controller/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.592971 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-kjvk4_e82fcc54-393c-4981-a666-769ed5c9c92d/kube-rbac-proxy/0.log" Jan 21 17:32:01 crc kubenswrapper[4707]: I0121 17:32:01.657667 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/2.log" Jan 21 17:32:02 crc kubenswrapper[4707]: I0121 17:32:02.054638 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lxkz2_e2bdbb11-a196-4dc3-b197-64ef1bec8e8a/kube-multus/3.log" Jan 21 17:32:02 crc kubenswrapper[4707]: I0121 17:32:02.211156 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62mww_d5fb5fe4-8f42-4057-b731-b2c8da0661e3/network-metrics-daemon/0.log" Jan 21 17:32:02 crc kubenswrapper[4707]: I0121 17:32:02.215175 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62mww_d5fb5fe4-8f42-4057-b731-b2c8da0661e3/kube-rbac-proxy/0.log" Jan 21 17:32:09 crc 
kubenswrapper[4707]: I0121 17:32:09.187041 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:32:09 crc kubenswrapper[4707]: E0121 17:32:09.188870 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:32:24 crc kubenswrapper[4707]: I0121 17:32:24.182040 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:32:24 crc kubenswrapper[4707]: E0121 17:32:24.183440 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:32:39 crc kubenswrapper[4707]: I0121 17:32:39.185406 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:32:39 crc kubenswrapper[4707]: E0121 17:32:39.185924 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:32:50 crc kubenswrapper[4707]: I0121 17:32:50.182453 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:32:50 crc kubenswrapper[4707]: E0121 17:32:50.183098 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:33:05 crc kubenswrapper[4707]: I0121 17:33:05.182224 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:33:05 crc kubenswrapper[4707]: E0121 17:33:05.182925 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:33:18 crc kubenswrapper[4707]: I0121 17:33:18.182714 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:33:18 crc 
kubenswrapper[4707]: E0121 17:33:18.183419 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:33:32 crc kubenswrapper[4707]: I0121 17:33:32.183070 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:33:32 crc kubenswrapper[4707]: E0121 17:33:32.183769 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:33:46 crc kubenswrapper[4707]: I0121 17:33:46.181956 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:33:46 crc kubenswrapper[4707]: E0121 17:33:46.182492 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:34:00 crc kubenswrapper[4707]: I0121 17:34:00.182695 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:34:00 crc kubenswrapper[4707]: E0121 17:34:00.183054 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:34:12 crc kubenswrapper[4707]: I0121 17:34:12.182886 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:34:12 crc kubenswrapper[4707]: E0121 17:34:12.183361 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:34:23 crc kubenswrapper[4707]: I0121 17:34:23.186054 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:34:23 crc kubenswrapper[4707]: E0121 17:34:23.186563 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:34:35 crc kubenswrapper[4707]: I0121 17:34:35.185644 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:34:35 crc kubenswrapper[4707]: E0121 17:34:35.186331 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:34:50 crc kubenswrapper[4707]: I0121 17:34:50.190719 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:34:50 crc kubenswrapper[4707]: I0121 17:34:50.436849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"09b265f9795eb60de39c864fc53ea51dbb5fcf9c45164ae0b68446ad69129bb8"} Jan 21 17:36:14 crc kubenswrapper[4707]: I0121 17:36:14.929314 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cr9mn"] Jan 21 17:36:14 crc kubenswrapper[4707]: E0121 17:36:14.929886 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633d223b-8b58-44cc-8d9b-670aefc2f0ca" containerName="collect-profiles" Jan 21 17:36:14 crc kubenswrapper[4707]: I0121 17:36:14.929899 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="633d223b-8b58-44cc-8d9b-670aefc2f0ca" containerName="collect-profiles" Jan 21 17:36:14 crc kubenswrapper[4707]: I0121 17:36:14.930021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="633d223b-8b58-44cc-8d9b-670aefc2f0ca" containerName="collect-profiles" Jan 21 17:36:14 crc kubenswrapper[4707]: I0121 17:36:14.930825 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:14 crc kubenswrapper[4707]: I0121 17:36:14.938568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cr9mn"] Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.021670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-catalog-content\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.021758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t657x\" (UniqueName: \"kubernetes.io/projected/13b3507c-a747-4822-95e4-fd35c5ba5a20-kube-api-access-t657x\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.021898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-utilities\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.123152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t657x\" (UniqueName: \"kubernetes.io/projected/13b3507c-a747-4822-95e4-fd35c5ba5a20-kube-api-access-t657x\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.123268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-utilities\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.123401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-catalog-content\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.123758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-utilities\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.123862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-catalog-content\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.159746 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t657x\" (UniqueName: \"kubernetes.io/projected/13b3507c-a747-4822-95e4-fd35c5ba5a20-kube-api-access-t657x\") pod \"community-operators-cr9mn\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.247261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.478155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cr9mn"] Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.837369 4707 generic.go:334] "Generic (PLEG): container finished" podID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerID="75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2" exitCode=0 Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.837405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cr9mn" event={"ID":"13b3507c-a747-4822-95e4-fd35c5ba5a20","Type":"ContainerDied","Data":"75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2"} Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.837446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cr9mn" event={"ID":"13b3507c-a747-4822-95e4-fd35c5ba5a20","Type":"ContainerStarted","Data":"c6adb9edd1acc206583e67ef85e606a31e4eaf70067c8b889a69e63978ebcec5"} Jan 21 17:36:15 crc kubenswrapper[4707]: I0121 17:36:15.838924 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:36:16 crc kubenswrapper[4707]: I0121 17:36:16.850204 4707 generic.go:334] "Generic (PLEG): container finished" podID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerID="cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44" exitCode=0 Jan 21 17:36:16 crc kubenswrapper[4707]: I0121 17:36:16.850284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cr9mn" event={"ID":"13b3507c-a747-4822-95e4-fd35c5ba5a20","Type":"ContainerDied","Data":"cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44"} Jan 21 17:36:17 crc kubenswrapper[4707]: I0121 17:36:17.856901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cr9mn" event={"ID":"13b3507c-a747-4822-95e4-fd35c5ba5a20","Type":"ContainerStarted","Data":"91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2"} Jan 21 17:36:17 crc kubenswrapper[4707]: I0121 17:36:17.881527 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cr9mn" podStartSLOduration=2.369945701 podStartE2EDuration="3.881511887s" podCreationTimestamp="2026-01-21 17:36:14 +0000 UTC" firstStartedPulling="2026-01-21 17:36:15.838685638 +0000 UTC m=+9273.020201859" lastFinishedPulling="2026-01-21 17:36:17.350251833 +0000 UTC m=+9274.531768045" observedRunningTime="2026-01-21 17:36:17.878686183 +0000 UTC m=+9275.060202405" watchObservedRunningTime="2026-01-21 17:36:17.881511887 +0000 UTC m=+9275.063028109" Jan 21 17:36:25 crc kubenswrapper[4707]: I0121 17:36:25.247533 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:25 crc kubenswrapper[4707]: I0121 17:36:25.247957 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:25 crc kubenswrapper[4707]: I0121 17:36:25.276891 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:25 crc kubenswrapper[4707]: I0121 17:36:25.921177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:25 crc kubenswrapper[4707]: I0121 17:36:25.953199 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cr9mn"] Jan 21 17:36:27 crc kubenswrapper[4707]: I0121 17:36:27.902005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cr9mn" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="registry-server" containerID="cri-o://91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2" gracePeriod=2 Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.204847 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.310173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t657x\" (UniqueName: \"kubernetes.io/projected/13b3507c-a747-4822-95e4-fd35c5ba5a20-kube-api-access-t657x\") pod \"13b3507c-a747-4822-95e4-fd35c5ba5a20\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.310293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-catalog-content\") pod \"13b3507c-a747-4822-95e4-fd35c5ba5a20\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.310371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-utilities\") pod \"13b3507c-a747-4822-95e4-fd35c5ba5a20\" (UID: \"13b3507c-a747-4822-95e4-fd35c5ba5a20\") " Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.311118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-utilities" (OuterVolumeSpecName: "utilities") pod "13b3507c-a747-4822-95e4-fd35c5ba5a20" (UID: "13b3507c-a747-4822-95e4-fd35c5ba5a20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.315255 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b3507c-a747-4822-95e4-fd35c5ba5a20-kube-api-access-t657x" (OuterVolumeSpecName: "kube-api-access-t657x") pod "13b3507c-a747-4822-95e4-fd35c5ba5a20" (UID: "13b3507c-a747-4822-95e4-fd35c5ba5a20"). InnerVolumeSpecName "kube-api-access-t657x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.349685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13b3507c-a747-4822-95e4-fd35c5ba5a20" (UID: "13b3507c-a747-4822-95e4-fd35c5ba5a20"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.411795 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.411837 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b3507c-a747-4822-95e4-fd35c5ba5a20-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.411848 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t657x\" (UniqueName: \"kubernetes.io/projected/13b3507c-a747-4822-95e4-fd35c5ba5a20-kube-api-access-t657x\") on node \"crc\" DevicePath \"\"" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.910638 4707 generic.go:334] "Generic (PLEG): container finished" podID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerID="91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2" exitCode=0 Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.910824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cr9mn" event={"ID":"13b3507c-a747-4822-95e4-fd35c5ba5a20","Type":"ContainerDied","Data":"91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2"} Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.910894 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cr9mn" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.910906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cr9mn" event={"ID":"13b3507c-a747-4822-95e4-fd35c5ba5a20","Type":"ContainerDied","Data":"c6adb9edd1acc206583e67ef85e606a31e4eaf70067c8b889a69e63978ebcec5"} Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.910930 4707 scope.go:117] "RemoveContainer" containerID="91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.928707 4707 scope.go:117] "RemoveContainer" containerID="cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.941955 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cr9mn"] Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.948141 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cr9mn"] Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.973320 4707 scope.go:117] "RemoveContainer" containerID="75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.990263 4707 scope.go:117] "RemoveContainer" containerID="91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2" Jan 21 17:36:28 crc kubenswrapper[4707]: E0121 17:36:28.990661 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2\": container with ID starting with 91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2 not found: ID does not exist" containerID="91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2" Jan 21 17:36:28 crc 
kubenswrapper[4707]: I0121 17:36:28.990698 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2"} err="failed to get container status \"91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2\": rpc error: code = NotFound desc = could not find container \"91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2\": container with ID starting with 91351030c63b1ede3a98e61ed77c52dc997c03451e298fe1992f46fe19d3c3e2 not found: ID does not exist" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.990720 4707 scope.go:117] "RemoveContainer" containerID="cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44" Jan 21 17:36:28 crc kubenswrapper[4707]: E0121 17:36:28.991226 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44\": container with ID starting with cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44 not found: ID does not exist" containerID="cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.991260 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44"} err="failed to get container status \"cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44\": rpc error: code = NotFound desc = could not find container \"cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44\": container with ID starting with cf5d1db8c96aa4e867d4f8d1e0d7ce4f3c3b3ab3cb0efd5739077e8240b5cd44 not found: ID does not exist" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.991278 4707 scope.go:117] "RemoveContainer" containerID="75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2" Jan 21 17:36:28 crc kubenswrapper[4707]: E0121 17:36:28.991582 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2\": container with ID starting with 75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2 not found: ID does not exist" containerID="75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2" Jan 21 17:36:28 crc kubenswrapper[4707]: I0121 17:36:28.991603 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2"} err="failed to get container status \"75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2\": rpc error: code = NotFound desc = could not find container \"75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2\": container with ID starting with 75d8ccd7bf74fac2025108fb8db52705919124744b1cb610e9031cc68a863fc2 not found: ID does not exist" Jan 21 17:36:29 crc kubenswrapper[4707]: I0121 17:36:29.188767 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" path="/var/lib/kubelet/pods/13b3507c-a747-4822-95e4-fd35c5ba5a20/volumes" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.484122 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gqb69"] Jan 21 17:37:02 crc kubenswrapper[4707]: E0121 17:37:02.484652 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="registry-server" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.484664 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="registry-server" Jan 21 17:37:02 crc kubenswrapper[4707]: E0121 17:37:02.484675 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="extract-content" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.484680 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="extract-content" Jan 21 17:37:02 crc kubenswrapper[4707]: E0121 17:37:02.484693 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="extract-utilities" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.484699 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="extract-utilities" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.484837 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b3507c-a747-4822-95e4-fd35c5ba5a20" containerName="registry-server" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.485560 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.495913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqb69"] Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.580240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-utilities\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.580345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgwkx\" (UniqueName: \"kubernetes.io/projected/103c3f21-b5ca-430b-a321-ff8f38c0a611-kube-api-access-vgwkx\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.580375 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-catalog-content\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.680716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-utilities\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.680829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgwkx\" (UniqueName: 
\"kubernetes.io/projected/103c3f21-b5ca-430b-a321-ff8f38c0a611-kube-api-access-vgwkx\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.680861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-catalog-content\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.681295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-catalog-content\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.681483 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-utilities\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.695956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgwkx\" (UniqueName: \"kubernetes.io/projected/103c3f21-b5ca-430b-a321-ff8f38c0a611-kube-api-access-vgwkx\") pod \"certified-operators-gqb69\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:02 crc kubenswrapper[4707]: I0121 17:37:02.799281 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:03 crc kubenswrapper[4707]: I0121 17:37:03.048212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqb69"] Jan 21 17:37:03 crc kubenswrapper[4707]: I0121 17:37:03.093300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqb69" event={"ID":"103c3f21-b5ca-430b-a321-ff8f38c0a611","Type":"ContainerStarted","Data":"d2452769a7df81924d14a3228e08de7260fb6670f98c53ee905dcc39a16f67f2"} Jan 21 17:37:04 crc kubenswrapper[4707]: I0121 17:37:04.099143 4707 generic.go:334] "Generic (PLEG): container finished" podID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerID="5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32" exitCode=0 Jan 21 17:37:04 crc kubenswrapper[4707]: I0121 17:37:04.099238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqb69" event={"ID":"103c3f21-b5ca-430b-a321-ff8f38c0a611","Type":"ContainerDied","Data":"5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32"} Jan 21 17:37:05 crc kubenswrapper[4707]: I0121 17:37:05.106006 4707 generic.go:334] "Generic (PLEG): container finished" podID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerID="04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97" exitCode=0 Jan 21 17:37:05 crc kubenswrapper[4707]: I0121 17:37:05.106085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqb69" event={"ID":"103c3f21-b5ca-430b-a321-ff8f38c0a611","Type":"ContainerDied","Data":"04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97"} Jan 21 17:37:06 crc kubenswrapper[4707]: I0121 17:37:06.112770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqb69" event={"ID":"103c3f21-b5ca-430b-a321-ff8f38c0a611","Type":"ContainerStarted","Data":"199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b"} Jan 21 17:37:06 crc kubenswrapper[4707]: I0121 17:37:06.127341 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gqb69" podStartSLOduration=2.653545748 podStartE2EDuration="4.127325736s" podCreationTimestamp="2026-01-21 17:37:02 +0000 UTC" firstStartedPulling="2026-01-21 17:37:04.100602755 +0000 UTC m=+9321.282118977" lastFinishedPulling="2026-01-21 17:37:05.574382742 +0000 UTC m=+9322.755898965" observedRunningTime="2026-01-21 17:37:06.12521121 +0000 UTC m=+9323.306727432" watchObservedRunningTime="2026-01-21 17:37:06.127325736 +0000 UTC m=+9323.308841958" Jan 21 17:37:09 crc kubenswrapper[4707]: I0121 17:37:09.946075 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:37:09 crc kubenswrapper[4707]: I0121 17:37:09.946348 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:37:12 crc kubenswrapper[4707]: I0121 17:37:12.800027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:12 crc kubenswrapper[4707]: I0121 17:37:12.800348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:12 crc kubenswrapper[4707]: I0121 17:37:12.829574 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:13 crc kubenswrapper[4707]: I0121 17:37:13.171575 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:13 crc kubenswrapper[4707]: I0121 17:37:13.201781 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqb69"] Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.154224 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gqb69" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="registry-server" containerID="cri-o://199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b" gracePeriod=2 Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.492853 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.540334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-utilities\") pod \"103c3f21-b5ca-430b-a321-ff8f38c0a611\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.540403 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgwkx\" (UniqueName: \"kubernetes.io/projected/103c3f21-b5ca-430b-a321-ff8f38c0a611-kube-api-access-vgwkx\") pod \"103c3f21-b5ca-430b-a321-ff8f38c0a611\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.540432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-catalog-content\") pod \"103c3f21-b5ca-430b-a321-ff8f38c0a611\" (UID: \"103c3f21-b5ca-430b-a321-ff8f38c0a611\") " Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.543123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-utilities" (OuterVolumeSpecName: "utilities") pod "103c3f21-b5ca-430b-a321-ff8f38c0a611" (UID: "103c3f21-b5ca-430b-a321-ff8f38c0a611"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.547966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103c3f21-b5ca-430b-a321-ff8f38c0a611-kube-api-access-vgwkx" (OuterVolumeSpecName: "kube-api-access-vgwkx") pod "103c3f21-b5ca-430b-a321-ff8f38c0a611" (UID: "103c3f21-b5ca-430b-a321-ff8f38c0a611"). InnerVolumeSpecName "kube-api-access-vgwkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.580246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "103c3f21-b5ca-430b-a321-ff8f38c0a611" (UID: "103c3f21-b5ca-430b-a321-ff8f38c0a611"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.642005 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.642047 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103c3f21-b5ca-430b-a321-ff8f38c0a611-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:37:15 crc kubenswrapper[4707]: I0121 17:37:15.642062 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgwkx\" (UniqueName: \"kubernetes.io/projected/103c3f21-b5ca-430b-a321-ff8f38c0a611-kube-api-access-vgwkx\") on node \"crc\" DevicePath \"\"" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.160593 4707 generic.go:334] "Generic (PLEG): container finished" podID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerID="199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b" exitCode=0 Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.160660 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqb69" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.160685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqb69" event={"ID":"103c3f21-b5ca-430b-a321-ff8f38c0a611","Type":"ContainerDied","Data":"199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b"} Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.161568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqb69" event={"ID":"103c3f21-b5ca-430b-a321-ff8f38c0a611","Type":"ContainerDied","Data":"d2452769a7df81924d14a3228e08de7260fb6670f98c53ee905dcc39a16f67f2"} Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.161590 4707 scope.go:117] "RemoveContainer" containerID="199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.176118 4707 scope.go:117] "RemoveContainer" containerID="04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.182913 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqb69"] Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.187632 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gqb69"] Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.211835 4707 scope.go:117] "RemoveContainer" containerID="5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.221906 4707 scope.go:117] "RemoveContainer" containerID="199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b" Jan 21 17:37:16 crc kubenswrapper[4707]: E0121 17:37:16.222191 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b\": container with ID starting with 199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b not found: ID does not exist" containerID="199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.222234 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b"} err="failed to get container status \"199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b\": rpc error: code = NotFound desc = could not find container \"199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b\": container with ID starting with 199300e674c1f0e567e29b4598776588a4b30a5ca982ae3cffec2d0f5641d57b not found: ID does not exist" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.222255 4707 scope.go:117] "RemoveContainer" containerID="04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97" Jan 21 17:37:16 crc kubenswrapper[4707]: E0121 17:37:16.222451 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97\": container with ID starting with 04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97 not found: ID does not exist" containerID="04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.222477 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97"} err="failed to get container status \"04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97\": rpc error: code = NotFound desc = could not find container \"04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97\": container with ID starting with 04393f4b725e0bc63ab0e611a652273a55bdb4946bdac65f6c27d128f5e23c97 not found: ID does not exist" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.222506 4707 scope.go:117] "RemoveContainer" containerID="5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32" Jan 21 17:37:16 crc kubenswrapper[4707]: E0121 17:37:16.222719 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32\": container with ID starting with 5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32 not found: ID does not exist" containerID="5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32" Jan 21 17:37:16 crc kubenswrapper[4707]: I0121 17:37:16.222740 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32"} err="failed to get container status \"5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32\": rpc error: code = NotFound desc = could not find container \"5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32\": container with ID starting with 5f69b542993d69a362cb0e4cb8d8f6ff2c895dcc0abc39f9a7cd485bca583d32 not found: ID does not exist" Jan 21 17:37:17 crc kubenswrapper[4707]: I0121 17:37:17.189070 4707 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" path="/var/lib/kubelet/pods/103c3f21-b5ca-430b-a321-ff8f38c0a611/volumes" Jan 21 17:37:39 crc kubenswrapper[4707]: I0121 17:37:39.945666 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:37:39 crc kubenswrapper[4707]: I0121 17:37:39.946788 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.205254 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mrbps"] Jan 21 17:37:46 crc kubenswrapper[4707]: E0121 17:37:46.205784 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="extract-content" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.205796 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="extract-content" Jan 21 17:37:46 crc kubenswrapper[4707]: E0121 17:37:46.205825 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="registry-server" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.205831 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="registry-server" Jan 21 17:37:46 crc kubenswrapper[4707]: E0121 17:37:46.205847 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="extract-utilities" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.205854 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="extract-utilities" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.205956 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="103c3f21-b5ca-430b-a321-ff8f38c0a611" containerName="registry-server" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.206662 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.228124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrbps"] Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.316299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt78g\" (UniqueName: \"kubernetes.io/projected/77eda300-b2cb-4516-837d-c871528e1e42-kube-api-access-pt78g\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.316343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-catalog-content\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.316370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-utilities\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.417590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt78g\" (UniqueName: \"kubernetes.io/projected/77eda300-b2cb-4516-837d-c871528e1e42-kube-api-access-pt78g\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.417638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-catalog-content\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.417668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-utilities\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.418143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-utilities\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.418603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-catalog-content\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.442497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pt78g\" (UniqueName: \"kubernetes.io/projected/77eda300-b2cb-4516-837d-c871528e1e42-kube-api-access-pt78g\") pod \"redhat-operators-mrbps\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.538530 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:46 crc kubenswrapper[4707]: I0121 17:37:46.910562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrbps"] Jan 21 17:37:46 crc kubenswrapper[4707]: W0121 17:37:46.916515 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77eda300_b2cb_4516_837d_c871528e1e42.slice/crio-8cfa62b5aaabd8325ad90a639788e03d0c1cbf2595b24ac641c189c5df23339c WatchSource:0}: Error finding container 8cfa62b5aaabd8325ad90a639788e03d0c1cbf2595b24ac641c189c5df23339c: Status 404 returned error can't find the container with id 8cfa62b5aaabd8325ad90a639788e03d0c1cbf2595b24ac641c189c5df23339c Jan 21 17:37:47 crc kubenswrapper[4707]: I0121 17:37:47.329101 4707 generic.go:334] "Generic (PLEG): container finished" podID="77eda300-b2cb-4516-837d-c871528e1e42" containerID="f8ffd927ae49f2f9865f79e02896252e44508f777082cc04a0fd82b252475420" exitCode=0 Jan 21 17:37:47 crc kubenswrapper[4707]: I0121 17:37:47.329366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerDied","Data":"f8ffd927ae49f2f9865f79e02896252e44508f777082cc04a0fd82b252475420"} Jan 21 17:37:47 crc kubenswrapper[4707]: I0121 17:37:47.329395 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerStarted","Data":"8cfa62b5aaabd8325ad90a639788e03d0c1cbf2595b24ac641c189c5df23339c"} Jan 21 17:37:48 crc kubenswrapper[4707]: I0121 17:37:48.335081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerStarted","Data":"9dd787c83d752e6a53f8a60141ffa9b4038ed3cb4c7349a4561d3e0265eab701"} Jan 21 17:37:49 crc kubenswrapper[4707]: I0121 17:37:49.342490 4707 generic.go:334] "Generic (PLEG): container finished" podID="77eda300-b2cb-4516-837d-c871528e1e42" containerID="9dd787c83d752e6a53f8a60141ffa9b4038ed3cb4c7349a4561d3e0265eab701" exitCode=0 Jan 21 17:37:49 crc kubenswrapper[4707]: I0121 17:37:49.342764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerDied","Data":"9dd787c83d752e6a53f8a60141ffa9b4038ed3cb4c7349a4561d3e0265eab701"} Jan 21 17:37:50 crc kubenswrapper[4707]: I0121 17:37:50.348498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerStarted","Data":"fe80fed59388f7f2715a67b4100918fa023c7ce6541f7acb9ddd5b9cba0011f6"} Jan 21 17:37:50 crc kubenswrapper[4707]: I0121 17:37:50.363861 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mrbps" podStartSLOduration=1.851627989 podStartE2EDuration="4.363847615s" 
podCreationTimestamp="2026-01-21 17:37:46 +0000 UTC" firstStartedPulling="2026-01-21 17:37:47.331564689 +0000 UTC m=+9364.513080900" lastFinishedPulling="2026-01-21 17:37:49.843784305 +0000 UTC m=+9367.025300526" observedRunningTime="2026-01-21 17:37:50.363365548 +0000 UTC m=+9367.544881771" watchObservedRunningTime="2026-01-21 17:37:50.363847615 +0000 UTC m=+9367.545363837" Jan 21 17:37:56 crc kubenswrapper[4707]: I0121 17:37:56.538832 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:56 crc kubenswrapper[4707]: I0121 17:37:56.539031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:56 crc kubenswrapper[4707]: I0121 17:37:56.568699 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:57 crc kubenswrapper[4707]: I0121 17:37:57.408043 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:37:57 crc kubenswrapper[4707]: I0121 17:37:57.439864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrbps"] Jan 21 17:37:59 crc kubenswrapper[4707]: I0121 17:37:59.391360 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mrbps" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="registry-server" containerID="cri-o://fe80fed59388f7f2715a67b4100918fa023c7ce6541f7acb9ddd5b9cba0011f6" gracePeriod=2 Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.397135 4707 generic.go:334] "Generic (PLEG): container finished" podID="77eda300-b2cb-4516-837d-c871528e1e42" containerID="fe80fed59388f7f2715a67b4100918fa023c7ce6541f7acb9ddd5b9cba0011f6" exitCode=0 Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.397169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerDied","Data":"fe80fed59388f7f2715a67b4100918fa023c7ce6541f7acb9ddd5b9cba0011f6"} Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.809108 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.902575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-catalog-content\") pod \"77eda300-b2cb-4516-837d-c871528e1e42\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.902707 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-utilities\") pod \"77eda300-b2cb-4516-837d-c871528e1e42\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.902761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt78g\" (UniqueName: \"kubernetes.io/projected/77eda300-b2cb-4516-837d-c871528e1e42-kube-api-access-pt78g\") pod \"77eda300-b2cb-4516-837d-c871528e1e42\" (UID: \"77eda300-b2cb-4516-837d-c871528e1e42\") " Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.903489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-utilities" (OuterVolumeSpecName: "utilities") pod "77eda300-b2cb-4516-837d-c871528e1e42" (UID: "77eda300-b2cb-4516-837d-c871528e1e42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.906842 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77eda300-b2cb-4516-837d-c871528e1e42-kube-api-access-pt78g" (OuterVolumeSpecName: "kube-api-access-pt78g") pod "77eda300-b2cb-4516-837d-c871528e1e42" (UID: "77eda300-b2cb-4516-837d-c871528e1e42"). InnerVolumeSpecName "kube-api-access-pt78g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:38:00 crc kubenswrapper[4707]: I0121 17:38:00.989001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77eda300-b2cb-4516-837d-c871528e1e42" (UID: "77eda300-b2cb-4516-837d-c871528e1e42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.004285 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.004317 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt78g\" (UniqueName: \"kubernetes.io/projected/77eda300-b2cb-4516-837d-c871528e1e42-kube-api-access-pt78g\") on node \"crc\" DevicePath \"\"" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.004329 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eda300-b2cb-4516-837d-c871528e1e42-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.404517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrbps" event={"ID":"77eda300-b2cb-4516-837d-c871528e1e42","Type":"ContainerDied","Data":"8cfa62b5aaabd8325ad90a639788e03d0c1cbf2595b24ac641c189c5df23339c"} Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.404564 4707 scope.go:117] "RemoveContainer" containerID="fe80fed59388f7f2715a67b4100918fa023c7ce6541f7acb9ddd5b9cba0011f6" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.404584 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrbps" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.423127 4707 scope.go:117] "RemoveContainer" containerID="9dd787c83d752e6a53f8a60141ffa9b4038ed3cb4c7349a4561d3e0265eab701" Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.426602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrbps"] Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.433729 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mrbps"] Jan 21 17:38:01 crc kubenswrapper[4707]: I0121 17:38:01.458457 4707 scope.go:117] "RemoveContainer" containerID="f8ffd927ae49f2f9865f79e02896252e44508f777082cc04a0fd82b252475420" Jan 21 17:38:03 crc kubenswrapper[4707]: I0121 17:38:03.188687 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77eda300-b2cb-4516-837d-c871528e1e42" path="/var/lib/kubelet/pods/77eda300-b2cb-4516-837d-c871528e1e42/volumes" Jan 21 17:38:09 crc kubenswrapper[4707]: I0121 17:38:09.945586 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:38:09 crc kubenswrapper[4707]: I0121 17:38:09.945908 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:38:09 crc kubenswrapper[4707]: I0121 17:38:09.945948 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:38:09 crc kubenswrapper[4707]: I0121 17:38:09.946411 4707 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09b265f9795eb60de39c864fc53ea51dbb5fcf9c45164ae0b68446ad69129bb8"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:38:09 crc kubenswrapper[4707]: I0121 17:38:09.946465 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://09b265f9795eb60de39c864fc53ea51dbb5fcf9c45164ae0b68446ad69129bb8" gracePeriod=600 Jan 21 17:38:10 crc kubenswrapper[4707]: I0121 17:38:10.467243 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="09b265f9795eb60de39c864fc53ea51dbb5fcf9c45164ae0b68446ad69129bb8" exitCode=0 Jan 21 17:38:10 crc kubenswrapper[4707]: I0121 17:38:10.467310 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"09b265f9795eb60de39c864fc53ea51dbb5fcf9c45164ae0b68446ad69129bb8"} Jan 21 17:38:10 crc kubenswrapper[4707]: I0121 17:38:10.467533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858"} Jan 21 17:38:10 crc kubenswrapper[4707]: I0121 17:38:10.467554 4707 scope.go:117] "RemoveContainer" containerID="b02554c5dc6376f0b04277c60b9a80b1c1682d1256ec943face20918d433c8be" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.043560 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mkzmx"] Jan 21 17:38:15 crc kubenswrapper[4707]: E0121 17:38:15.044126 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="extract-content" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.044137 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="extract-content" Jan 21 17:38:15 crc kubenswrapper[4707]: E0121 17:38:15.044151 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="extract-utilities" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.044156 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="extract-utilities" Jan 21 17:38:15 crc kubenswrapper[4707]: E0121 17:38:15.044173 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="registry-server" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.044179 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="registry-server" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.044302 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77eda300-b2cb-4516-837d-c871528e1e42" containerName="registry-server" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.045022 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.058398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkzmx"] Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.102987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-catalog-content\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.103212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-utilities\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.103445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pgs\" (UniqueName: \"kubernetes.io/projected/949d6542-39e4-4b9d-8019-15efa59cbed7-kube-api-access-67pgs\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.204944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pgs\" (UniqueName: \"kubernetes.io/projected/949d6542-39e4-4b9d-8019-15efa59cbed7-kube-api-access-67pgs\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.205004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-catalog-content\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.205032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-utilities\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.205479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-utilities\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.205628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-catalog-content\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.220165 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-67pgs\" (UniqueName: \"kubernetes.io/projected/949d6542-39e4-4b9d-8019-15efa59cbed7-kube-api-access-67pgs\") pod \"redhat-marketplace-mkzmx\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.358969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:15 crc kubenswrapper[4707]: I0121 17:38:15.722963 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkzmx"] Jan 21 17:38:16 crc kubenswrapper[4707]: I0121 17:38:16.507096 4707 generic.go:334] "Generic (PLEG): container finished" podID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerID="9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b" exitCode=0 Jan 21 17:38:16 crc kubenswrapper[4707]: I0121 17:38:16.507191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkzmx" event={"ID":"949d6542-39e4-4b9d-8019-15efa59cbed7","Type":"ContainerDied","Data":"9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b"} Jan 21 17:38:16 crc kubenswrapper[4707]: I0121 17:38:16.507320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkzmx" event={"ID":"949d6542-39e4-4b9d-8019-15efa59cbed7","Type":"ContainerStarted","Data":"9db78dc7b381526318ff58670db1f3d7d18a9860c9eb8358924503a9f6359dc2"} Jan 21 17:38:17 crc kubenswrapper[4707]: I0121 17:38:17.513954 4707 generic.go:334] "Generic (PLEG): container finished" podID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerID="2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37" exitCode=0 Jan 21 17:38:17 crc kubenswrapper[4707]: I0121 17:38:17.514151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkzmx" event={"ID":"949d6542-39e4-4b9d-8019-15efa59cbed7","Type":"ContainerDied","Data":"2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37"} Jan 21 17:38:18 crc kubenswrapper[4707]: I0121 17:38:18.520741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkzmx" event={"ID":"949d6542-39e4-4b9d-8019-15efa59cbed7","Type":"ContainerStarted","Data":"0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1"} Jan 21 17:38:18 crc kubenswrapper[4707]: I0121 17:38:18.538090 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkzmx" podStartSLOduration=2.041286573 podStartE2EDuration="3.538077327s" podCreationTimestamp="2026-01-21 17:38:15 +0000 UTC" firstStartedPulling="2026-01-21 17:38:16.508891505 +0000 UTC m=+9393.690407728" lastFinishedPulling="2026-01-21 17:38:18.005682259 +0000 UTC m=+9395.187198482" observedRunningTime="2026-01-21 17:38:18.533743449 +0000 UTC m=+9395.715259671" watchObservedRunningTime="2026-01-21 17:38:18.538077327 +0000 UTC m=+9395.719593549" Jan 21 17:38:25 crc kubenswrapper[4707]: I0121 17:38:25.359400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:25 crc kubenswrapper[4707]: I0121 17:38:25.360847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:25 crc kubenswrapper[4707]: I0121 17:38:25.390157 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:25 crc kubenswrapper[4707]: I0121 17:38:25.588760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:25 crc kubenswrapper[4707]: I0121 17:38:25.620463 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkzmx"] Jan 21 17:38:27 crc kubenswrapper[4707]: I0121 17:38:27.569157 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkzmx" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="registry-server" containerID="cri-o://0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1" gracePeriod=2 Jan 21 17:38:27 crc kubenswrapper[4707]: I0121 17:38:27.865301 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.004796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-utilities\") pod \"949d6542-39e4-4b9d-8019-15efa59cbed7\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.004921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pgs\" (UniqueName: \"kubernetes.io/projected/949d6542-39e4-4b9d-8019-15efa59cbed7-kube-api-access-67pgs\") pod \"949d6542-39e4-4b9d-8019-15efa59cbed7\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.005087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-catalog-content\") pod \"949d6542-39e4-4b9d-8019-15efa59cbed7\" (UID: \"949d6542-39e4-4b9d-8019-15efa59cbed7\") " Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.005850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-utilities" (OuterVolumeSpecName: "utilities") pod "949d6542-39e4-4b9d-8019-15efa59cbed7" (UID: "949d6542-39e4-4b9d-8019-15efa59cbed7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.019426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d6542-39e4-4b9d-8019-15efa59cbed7-kube-api-access-67pgs" (OuterVolumeSpecName: "kube-api-access-67pgs") pod "949d6542-39e4-4b9d-8019-15efa59cbed7" (UID: "949d6542-39e4-4b9d-8019-15efa59cbed7"). InnerVolumeSpecName "kube-api-access-67pgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.021559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "949d6542-39e4-4b9d-8019-15efa59cbed7" (UID: "949d6542-39e4-4b9d-8019-15efa59cbed7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.106837 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.106863 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949d6542-39e4-4b9d-8019-15efa59cbed7-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.106874 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pgs\" (UniqueName: \"kubernetes.io/projected/949d6542-39e4-4b9d-8019-15efa59cbed7-kube-api-access-67pgs\") on node \"crc\" DevicePath \"\"" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.574458 4707 generic.go:334] "Generic (PLEG): container finished" podID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerID="0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1" exitCode=0 Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.574493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkzmx" event={"ID":"949d6542-39e4-4b9d-8019-15efa59cbed7","Type":"ContainerDied","Data":"0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1"} Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.574514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkzmx" event={"ID":"949d6542-39e4-4b9d-8019-15efa59cbed7","Type":"ContainerDied","Data":"9db78dc7b381526318ff58670db1f3d7d18a9860c9eb8358924503a9f6359dc2"} Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.574528 4707 scope.go:117] "RemoveContainer" containerID="0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.574615 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkzmx" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.595292 4707 scope.go:117] "RemoveContainer" containerID="2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.595567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkzmx"] Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.605485 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkzmx"] Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.627395 4707 scope.go:117] "RemoveContainer" containerID="9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.640098 4707 scope.go:117] "RemoveContainer" containerID="0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1" Jan 21 17:38:28 crc kubenswrapper[4707]: E0121 17:38:28.640515 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1\": container with ID starting with 0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1 not found: ID does not exist" containerID="0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.640552 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1"} err="failed to get container status \"0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1\": rpc error: code = NotFound desc = could not find container \"0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1\": container with ID starting with 0601f9bf05980df6b90ba74197ca3b52b9b0aac54c21557e6b09d3e1f720daa1 not found: ID does not exist" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.640577 4707 scope.go:117] "RemoveContainer" containerID="2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37" Jan 21 17:38:28 crc kubenswrapper[4707]: E0121 17:38:28.640899 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37\": container with ID starting with 2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37 not found: ID does not exist" containerID="2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.640919 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37"} err="failed to get container status \"2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37\": rpc error: code = NotFound desc = could not find container \"2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37\": container with ID starting with 2a52b38b96cdd5cbfdd8f94df75ed3487b4f837f84effbe5d2a1703f2a02cf37 not found: ID does not exist" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.640932 4707 scope.go:117] "RemoveContainer" containerID="9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b" Jan 21 17:38:28 crc kubenswrapper[4707]: E0121 17:38:28.641247 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b\": container with ID starting with 9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b not found: ID does not exist" containerID="9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b" Jan 21 17:38:28 crc kubenswrapper[4707]: I0121 17:38:28.641278 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b"} err="failed to get container status \"9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b\": rpc error: code = NotFound desc = could not find container \"9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b\": container with ID starting with 9f20fea4a1bdac46eb7939a24b4e020956c7faf89999e94bef8baff0da31870b not found: ID does not exist" Jan 21 17:38:29 crc kubenswrapper[4707]: I0121 17:38:29.188034 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" path="/var/lib/kubelet/pods/949d6542-39e4-4b9d-8019-15efa59cbed7/volumes" Jan 21 17:40:39 crc kubenswrapper[4707]: I0121 17:40:39.945559 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:40:39 crc kubenswrapper[4707]: I0121 17:40:39.945916 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:41:09 crc kubenswrapper[4707]: I0121 17:41:09.946038 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:41:09 crc kubenswrapper[4707]: I0121 17:41:09.946566 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:41:39 crc kubenswrapper[4707]: I0121 17:41:39.945743 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:41:39 crc kubenswrapper[4707]: I0121 17:41:39.946109 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:41:39 crc kubenswrapper[4707]: I0121 17:41:39.946142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:41:39 crc kubenswrapper[4707]: I0121 17:41:39.946486 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:41:39 crc kubenswrapper[4707]: I0121 17:41:39.946523 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" gracePeriod=600 Jan 21 17:41:40 crc kubenswrapper[4707]: E0121 17:41:40.067387 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:41:40 crc kubenswrapper[4707]: I0121 17:41:40.553857 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" exitCode=0 Jan 21 17:41:40 crc kubenswrapper[4707]: I0121 17:41:40.553895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858"} Jan 21 17:41:40 crc kubenswrapper[4707]: I0121 17:41:40.553929 4707 scope.go:117] "RemoveContainer" containerID="09b265f9795eb60de39c864fc53ea51dbb5fcf9c45164ae0b68446ad69129bb8" Jan 21 17:41:40 crc kubenswrapper[4707]: I0121 17:41:40.554333 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:41:40 crc kubenswrapper[4707]: E0121 17:41:40.554559 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:41:52 crc kubenswrapper[4707]: I0121 17:41:52.190130 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:41:52 crc kubenswrapper[4707]: E0121 17:41:52.190647 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:42:04 crc 
kubenswrapper[4707]: I0121 17:42:04.182341 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:42:04 crc kubenswrapper[4707]: E0121 17:42:04.183034 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:42:16 crc kubenswrapper[4707]: I0121 17:42:16.182735 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:42:16 crc kubenswrapper[4707]: E0121 17:42:16.183332 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:42:30 crc kubenswrapper[4707]: I0121 17:42:30.182684 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:42:30 crc kubenswrapper[4707]: E0121 17:42:30.183494 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:42:44 crc kubenswrapper[4707]: I0121 17:42:44.183128 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:42:44 crc kubenswrapper[4707]: E0121 17:42:44.183604 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:42:59 crc kubenswrapper[4707]: I0121 17:42:59.182647 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:42:59 crc kubenswrapper[4707]: E0121 17:42:59.183909 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:43:14 crc kubenswrapper[4707]: I0121 17:43:14.181976 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:43:14 crc 
kubenswrapper[4707]: E0121 17:43:14.182422 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:43:27 crc kubenswrapper[4707]: I0121 17:43:27.185344 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:43:27 crc kubenswrapper[4707]: E0121 17:43:27.185822 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:43:41 crc kubenswrapper[4707]: I0121 17:43:41.188767 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:43:41 crc kubenswrapper[4707]: E0121 17:43:41.189625 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:43:54 crc kubenswrapper[4707]: I0121 17:43:54.182250 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:43:54 crc kubenswrapper[4707]: E0121 17:43:54.182771 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:44:07 crc kubenswrapper[4707]: I0121 17:44:07.182558 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:44:07 crc kubenswrapper[4707]: E0121 17:44:07.183091 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:44:18 crc kubenswrapper[4707]: I0121 17:44:18.182890 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:44:18 crc kubenswrapper[4707]: E0121 17:44:18.183725 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:44:29 crc kubenswrapper[4707]: I0121 17:44:29.182922 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:44:29 crc kubenswrapper[4707]: E0121 17:44:29.183424 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:44:42 crc kubenswrapper[4707]: I0121 17:44:42.182876 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:44:42 crc kubenswrapper[4707]: E0121 17:44:42.183874 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:44:56 crc kubenswrapper[4707]: I0121 17:44:56.184600 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:44:56 crc kubenswrapper[4707]: E0121 17:44:56.185737 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.154073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld"] Jan 21 17:45:00 crc kubenswrapper[4707]: E0121 17:45:00.155433 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="extract-utilities" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.155505 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="extract-utilities" Jan 21 17:45:00 crc kubenswrapper[4707]: E0121 17:45:00.155585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="registry-server" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.155636 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="registry-server" Jan 21 17:45:00 crc kubenswrapper[4707]: E0121 17:45:00.155698 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="extract-content" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 
17:45:00.155747 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="extract-content" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.155954 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="949d6542-39e4-4b9d-8019-15efa59cbed7" containerName="registry-server" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.156444 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.158268 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.159166 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.164450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld"] Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.257536 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-config-volume\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.257573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-secret-volume\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.257604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqqv\" (UniqueName: \"kubernetes.io/projected/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-kube-api-access-bkqqv\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.358525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqqv\" (UniqueName: \"kubernetes.io/projected/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-kube-api-access-bkqqv\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.358622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-config-volume\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.358641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-secret-volume\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.360108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-config-volume\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.371496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-secret-volume\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.373066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqqv\" (UniqueName: \"kubernetes.io/projected/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-kube-api-access-bkqqv\") pod \"collect-profiles-29483625-t7zld\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.479729 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:00 crc kubenswrapper[4707]: I0121 17:45:00.842310 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld"] Jan 21 17:45:00 crc kubenswrapper[4707]: W0121 17:45:00.871372 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca6fa44_187e_4fce_8ca1_7b3322cd2c69.slice/crio-ee131d34bb61a9b30ac7a5e570b13da9d18207db22b6b2c3832a6386a5b1508b WatchSource:0}: Error finding container ee131d34bb61a9b30ac7a5e570b13da9d18207db22b6b2c3832a6386a5b1508b: Status 404 returned error can't find the container with id ee131d34bb61a9b30ac7a5e570b13da9d18207db22b6b2c3832a6386a5b1508b Jan 21 17:45:01 crc kubenswrapper[4707]: I0121 17:45:01.660007 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" containerID="b4d3086695b12c8bfb6d971fb80dfbc61b53ec3969773eea254d071b657ddbdd" exitCode=0 Jan 21 17:45:01 crc kubenswrapper[4707]: I0121 17:45:01.660181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" event={"ID":"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69","Type":"ContainerDied","Data":"b4d3086695b12c8bfb6d971fb80dfbc61b53ec3969773eea254d071b657ddbdd"} Jan 21 17:45:01 crc kubenswrapper[4707]: I0121 17:45:01.660428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" event={"ID":"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69","Type":"ContainerStarted","Data":"ee131d34bb61a9b30ac7a5e570b13da9d18207db22b6b2c3832a6386a5b1508b"} Jan 21 17:45:02 crc kubenswrapper[4707]: I0121 17:45:02.887141 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:02 crc kubenswrapper[4707]: I0121 17:45:02.993087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkqqv\" (UniqueName: \"kubernetes.io/projected/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-kube-api-access-bkqqv\") pod \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " Jan 21 17:45:02 crc kubenswrapper[4707]: I0121 17:45:02.993349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-config-volume\") pod \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " Jan 21 17:45:02 crc kubenswrapper[4707]: I0121 17:45:02.993397 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-secret-volume\") pod \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\" (UID: \"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69\") " Jan 21 17:45:02 crc kubenswrapper[4707]: I0121 17:45:02.994404 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" (UID: "1ca6fa44-187e-4fce-8ca1-7b3322cd2c69"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.015156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" (UID: "1ca6fa44-187e-4fce-8ca1-7b3322cd2c69"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.015156 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-kube-api-access-bkqqv" (OuterVolumeSpecName: "kube-api-access-bkqqv") pod "1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" (UID: "1ca6fa44-187e-4fce-8ca1-7b3322cd2c69"). InnerVolumeSpecName "kube-api-access-bkqqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.095951 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkqqv\" (UniqueName: \"kubernetes.io/projected/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-kube-api-access-bkqqv\") on node \"crc\" DevicePath \"\"" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.096002 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.096015 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca6fa44-187e-4fce-8ca1-7b3322cd2c69-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.677995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" event={"ID":"1ca6fa44-187e-4fce-8ca1-7b3322cd2c69","Type":"ContainerDied","Data":"ee131d34bb61a9b30ac7a5e570b13da9d18207db22b6b2c3832a6386a5b1508b"} Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.678044 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee131d34bb61a9b30ac7a5e570b13da9d18207db22b6b2c3832a6386a5b1508b" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.678103 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483625-t7zld" Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.957778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf"] Jan 21 17:45:03 crc kubenswrapper[4707]: I0121 17:45:03.960464 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-lccbf"] Jan 21 17:45:05 crc kubenswrapper[4707]: I0121 17:45:05.188852 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed91b371-cea5-4c46-b979-5d5b8885b429" path="/var/lib/kubelet/pods/ed91b371-cea5-4c46-b979-5d5b8885b429/volumes" Jan 21 17:45:09 crc kubenswrapper[4707]: I0121 17:45:09.185322 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:45:09 crc kubenswrapper[4707]: E0121 17:45:09.185773 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:45:24 crc kubenswrapper[4707]: I0121 17:45:24.183189 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:45:24 crc kubenswrapper[4707]: E0121 17:45:24.183687 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:45:29 crc kubenswrapper[4707]: I0121 17:45:29.332188 4707 scope.go:117] "RemoveContainer" containerID="69b50b66d26bcab6958d7a1a0174010fcb4e9ee96097920378ca20ed63d6b63d" Jan 21 17:45:38 crc kubenswrapper[4707]: I0121 17:45:38.182036 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:45:38 crc kubenswrapper[4707]: E0121 17:45:38.182554 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:45:53 crc kubenswrapper[4707]: I0121 17:45:53.186102 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:45:53 crc kubenswrapper[4707]: E0121 17:45:53.186553 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:46:06 crc kubenswrapper[4707]: I0121 17:46:06.183051 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:46:06 crc kubenswrapper[4707]: E0121 17:46:06.183632 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:46:21 crc kubenswrapper[4707]: I0121 17:46:21.182435 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:46:21 crc kubenswrapper[4707]: E0121 17:46:21.183005 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:46:35 crc kubenswrapper[4707]: I0121 17:46:35.183718 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:46:35 crc kubenswrapper[4707]: E0121 17:46:35.184948 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:46:49 crc kubenswrapper[4707]: I0121 17:46:49.182398 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:46:50 crc kubenswrapper[4707]: I0121 17:46:50.232561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"539bf60f79bd3c31f096e67c2cf9e0684d54f47834983391040a7ea84767a2ff"} Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.844665 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tkzwf"] Jan 21 17:47:09 crc kubenswrapper[4707]: E0121 17:47:09.845290 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" containerName="collect-profiles" Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.845301 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" containerName="collect-profiles" Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.845500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca6fa44-187e-4fce-8ca1-7b3322cd2c69" containerName="collect-profiles" Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.846283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.847923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkzwf"] Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.972179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4c7s\" (UniqueName: \"kubernetes.io/projected/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-kube-api-access-k4c7s\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.972519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-catalog-content\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:09 crc kubenswrapper[4707]: I0121 17:47:09.972545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-utilities\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.073569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4c7s\" (UniqueName: \"kubernetes.io/projected/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-kube-api-access-k4c7s\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.073718 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-catalog-content\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.073738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-utilities\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.074177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-catalog-content\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.074246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-utilities\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.093000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4c7s\" (UniqueName: \"kubernetes.io/projected/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-kube-api-access-k4c7s\") pod \"community-operators-tkzwf\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.160018 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:10 crc kubenswrapper[4707]: I0121 17:47:10.550283 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkzwf"] Jan 21 17:47:11 crc kubenswrapper[4707]: I0121 17:47:11.338461 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerID="ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d" exitCode=0 Jan 21 17:47:11 crc kubenswrapper[4707]: I0121 17:47:11.338528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkzwf" event={"ID":"0fa5b7bf-89de-461b-9cff-9f75bdf344f9","Type":"ContainerDied","Data":"ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d"} Jan 21 17:47:11 crc kubenswrapper[4707]: I0121 17:47:11.338785 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkzwf" event={"ID":"0fa5b7bf-89de-461b-9cff-9f75bdf344f9","Type":"ContainerStarted","Data":"c632ce63004fff446ba9371081c85b45b367d2b53fb95b849df6e91862430245"} Jan 21 17:47:11 crc kubenswrapper[4707]: I0121 17:47:11.340389 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:47:13 crc kubenswrapper[4707]: I0121 17:47:13.351537 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerID="688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3" exitCode=0 Jan 21 17:47:13 crc kubenswrapper[4707]: I0121 17:47:13.351845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkzwf" event={"ID":"0fa5b7bf-89de-461b-9cff-9f75bdf344f9","Type":"ContainerDied","Data":"688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3"} Jan 21 17:47:14 crc kubenswrapper[4707]: I0121 17:47:14.359291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkzwf" event={"ID":"0fa5b7bf-89de-461b-9cff-9f75bdf344f9","Type":"ContainerStarted","Data":"013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975"} Jan 21 17:47:14 crc kubenswrapper[4707]: I0121 17:47:14.376086 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tkzwf" podStartSLOduration=2.85518691 podStartE2EDuration="5.37607211s" podCreationTimestamp="2026-01-21 17:47:09 +0000 UTC" firstStartedPulling="2026-01-21 17:47:11.340155072 +0000 UTC m=+9928.521671293" lastFinishedPulling="2026-01-21 17:47:13.86104027 +0000 UTC m=+9931.042556493" observedRunningTime="2026-01-21 17:47:14.374227632 +0000 UTC m=+9931.555743854" watchObservedRunningTime="2026-01-21 17:47:14.37607211 +0000 UTC m=+9931.557588333" Jan 21 17:47:20 crc kubenswrapper[4707]: I0121 17:47:20.160866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:20 crc kubenswrapper[4707]: I0121 17:47:20.161449 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:20 crc kubenswrapper[4707]: I0121 17:47:20.192304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:20 crc kubenswrapper[4707]: I0121 17:47:20.440391 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:20 crc kubenswrapper[4707]: I0121 17:47:20.515765 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkzwf"] Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.414743 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tkzwf" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="registry-server" containerID="cri-o://013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975" gracePeriod=2 Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.732442 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.860647 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-catalog-content\") pod \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.860720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-utilities\") pod \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.860748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4c7s\" (UniqueName: \"kubernetes.io/projected/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-kube-api-access-k4c7s\") pod \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\" (UID: \"0fa5b7bf-89de-461b-9cff-9f75bdf344f9\") " Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.861438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-utilities" (OuterVolumeSpecName: "utilities") pod "0fa5b7bf-89de-461b-9cff-9f75bdf344f9" (UID: "0fa5b7bf-89de-461b-9cff-9f75bdf344f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.869009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-kube-api-access-k4c7s" (OuterVolumeSpecName: "kube-api-access-k4c7s") pod "0fa5b7bf-89de-461b-9cff-9f75bdf344f9" (UID: "0fa5b7bf-89de-461b-9cff-9f75bdf344f9"). InnerVolumeSpecName "kube-api-access-k4c7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.900516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fa5b7bf-89de-461b-9cff-9f75bdf344f9" (UID: "0fa5b7bf-89de-461b-9cff-9f75bdf344f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.962282 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.962322 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:47:22 crc kubenswrapper[4707]: I0121 17:47:22.962335 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4c7s\" (UniqueName: \"kubernetes.io/projected/0fa5b7bf-89de-461b-9cff-9f75bdf344f9-kube-api-access-k4c7s\") on node \"crc\" DevicePath \"\"" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.420954 4707 generic.go:334] "Generic (PLEG): container finished" podID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerID="013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975" exitCode=0 Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.420995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkzwf" event={"ID":"0fa5b7bf-89de-461b-9cff-9f75bdf344f9","Type":"ContainerDied","Data":"013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975"} Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.421019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkzwf" event={"ID":"0fa5b7bf-89de-461b-9cff-9f75bdf344f9","Type":"ContainerDied","Data":"c632ce63004fff446ba9371081c85b45b367d2b53fb95b849df6e91862430245"} Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.421035 4707 scope.go:117] "RemoveContainer" containerID="013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.421125 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkzwf" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.444202 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkzwf"] Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.448344 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tkzwf"] Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.449377 4707 scope.go:117] "RemoveContainer" containerID="688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.465782 4707 scope.go:117] "RemoveContainer" containerID="ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.483923 4707 scope.go:117] "RemoveContainer" containerID="013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975" Jan 21 17:47:23 crc kubenswrapper[4707]: E0121 17:47:23.484165 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975\": container with ID starting with 013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975 not found: ID does not exist" containerID="013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.484191 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975"} err="failed to get container status \"013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975\": rpc error: code = NotFound desc = could not find container \"013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975\": container with ID starting with 013eccb499502d922ff85485e5be815bf98ac80710b38e2af4814dec5ee4b975 not found: ID does not exist" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.484208 4707 scope.go:117] "RemoveContainer" containerID="688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3" Jan 21 17:47:23 crc kubenswrapper[4707]: E0121 17:47:23.484492 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3\": container with ID starting with 688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3 not found: ID does not exist" containerID="688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.484519 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3"} err="failed to get container status \"688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3\": rpc error: code = NotFound desc = could not find container \"688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3\": container with ID starting with 688e56992c122ebb84f7be6da87146146ae4c2c6617f965dba713525eecc7ae3 not found: ID does not exist" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.484533 4707 scope.go:117] "RemoveContainer" containerID="ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d" Jan 21 17:47:23 crc kubenswrapper[4707]: E0121 17:47:23.484742 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d\": container with ID starting with ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d not found: ID does not exist" containerID="ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d" Jan 21 17:47:23 crc kubenswrapper[4707]: I0121 17:47:23.484767 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d"} err="failed to get container status \"ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d\": rpc error: code = NotFound desc = could not find container \"ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d\": container with ID starting with ddab86cb665db19ba53a11acdb2323f4c67124b7edfcd47a01085715a22b447d not found: ID does not exist" Jan 21 17:47:25 crc kubenswrapper[4707]: I0121 17:47:25.187890 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" path="/var/lib/kubelet/pods/0fa5b7bf-89de-461b-9cff-9f75bdf344f9/volumes" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.786703 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ljdqr"] Jan 21 17:48:08 crc kubenswrapper[4707]: E0121 17:48:08.787299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="extract-utilities" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.787312 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="extract-utilities" Jan 21 17:48:08 crc kubenswrapper[4707]: E0121 17:48:08.787350 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="registry-server" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.787362 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="registry-server" Jan 21 17:48:08 crc kubenswrapper[4707]: E0121 17:48:08.787372 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="extract-content" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.787378 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="extract-content" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.787488 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa5b7bf-89de-461b-9cff-9f75bdf344f9" containerName="registry-server" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.788231 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.794771 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljdqr"] Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.895085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjrsl\" (UniqueName: \"kubernetes.io/projected/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-kube-api-access-tjrsl\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.895148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-catalog-content\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.895330 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-utilities\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.996335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-utilities\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.996597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjrsl\" (UniqueName: \"kubernetes.io/projected/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-kube-api-access-tjrsl\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.996739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-catalog-content\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.996832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-utilities\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:08 crc kubenswrapper[4707]: I0121 17:48:08.997157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-catalog-content\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:09 crc kubenswrapper[4707]: I0121 17:48:09.012785 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tjrsl\" (UniqueName: \"kubernetes.io/projected/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-kube-api-access-tjrsl\") pod \"certified-operators-ljdqr\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:09 crc kubenswrapper[4707]: I0121 17:48:09.107536 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:09 crc kubenswrapper[4707]: I0121 17:48:09.566279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljdqr"] Jan 21 17:48:09 crc kubenswrapper[4707]: I0121 17:48:09.652050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljdqr" event={"ID":"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795","Type":"ContainerStarted","Data":"fc55cb96e05fa17c2e7d6b6c078af8e7a7dbec2b5d0d28bba4a6dff7d78cde65"} Jan 21 17:48:10 crc kubenswrapper[4707]: I0121 17:48:10.658405 4707 generic.go:334] "Generic (PLEG): container finished" podID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerID="d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729" exitCode=0 Jan 21 17:48:10 crc kubenswrapper[4707]: I0121 17:48:10.658451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljdqr" event={"ID":"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795","Type":"ContainerDied","Data":"d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729"} Jan 21 17:48:12 crc kubenswrapper[4707]: I0121 17:48:12.669490 4707 generic.go:334] "Generic (PLEG): container finished" podID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerID="8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae" exitCode=0 Jan 21 17:48:12 crc kubenswrapper[4707]: I0121 17:48:12.669981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljdqr" event={"ID":"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795","Type":"ContainerDied","Data":"8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae"} Jan 21 17:48:13 crc kubenswrapper[4707]: I0121 17:48:13.678517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljdqr" event={"ID":"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795","Type":"ContainerStarted","Data":"88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da"} Jan 21 17:48:19 crc kubenswrapper[4707]: I0121 17:48:19.108532 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:19 crc kubenswrapper[4707]: I0121 17:48:19.108871 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:19 crc kubenswrapper[4707]: I0121 17:48:19.137755 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:19 crc kubenswrapper[4707]: I0121 17:48:19.153959 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ljdqr" podStartSLOduration=8.654970823 podStartE2EDuration="11.153944854s" podCreationTimestamp="2026-01-21 17:48:08 +0000 UTC" firstStartedPulling="2026-01-21 17:48:10.659912243 +0000 UTC m=+9987.841428465" lastFinishedPulling="2026-01-21 17:48:13.158886273 +0000 UTC m=+9990.340402496" observedRunningTime="2026-01-21 
17:48:13.698042735 +0000 UTC m=+9990.879558956" watchObservedRunningTime="2026-01-21 17:48:19.153944854 +0000 UTC m=+9996.335461076" Jan 21 17:48:19 crc kubenswrapper[4707]: I0121 17:48:19.751473 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:19 crc kubenswrapper[4707]: I0121 17:48:19.806373 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljdqr"] Jan 21 17:48:21 crc kubenswrapper[4707]: I0121 17:48:21.728732 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ljdqr" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="registry-server" containerID="cri-o://88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da" gracePeriod=2 Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.068014 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.195304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-utilities\") pod \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.195400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjrsl\" (UniqueName: \"kubernetes.io/projected/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-kube-api-access-tjrsl\") pod \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.195703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-catalog-content\") pod \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\" (UID: \"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795\") " Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.196629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-utilities" (OuterVolumeSpecName: "utilities") pod "335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" (UID: "335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.201381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-kube-api-access-tjrsl" (OuterVolumeSpecName: "kube-api-access-tjrsl") pod "335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" (UID: "335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795"). InnerVolumeSpecName "kube-api-access-tjrsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.242273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" (UID: "335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.296629 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.296664 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.296682 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjrsl\" (UniqueName: \"kubernetes.io/projected/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795-kube-api-access-tjrsl\") on node \"crc\" DevicePath \"\"" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.743851 4707 generic.go:334] "Generic (PLEG): container finished" podID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerID="88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da" exitCode=0 Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.743923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljdqr" event={"ID":"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795","Type":"ContainerDied","Data":"88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da"} Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.743975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljdqr" event={"ID":"335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795","Type":"ContainerDied","Data":"fc55cb96e05fa17c2e7d6b6c078af8e7a7dbec2b5d0d28bba4a6dff7d78cde65"} Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.743999 4707 scope.go:117] "RemoveContainer" containerID="88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.744232 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ljdqr" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.769003 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljdqr"] Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.775563 4707 scope.go:117] "RemoveContainer" containerID="8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.776407 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ljdqr"] Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.806495 4707 scope.go:117] "RemoveContainer" containerID="d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.826758 4707 scope.go:117] "RemoveContainer" containerID="88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da" Jan 21 17:48:22 crc kubenswrapper[4707]: E0121 17:48:22.828169 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da\": container with ID starting with 88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da not found: ID does not exist" containerID="88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.828216 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da"} err="failed to get container status \"88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da\": rpc error: code = NotFound desc = could not find container \"88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da\": container with ID starting with 88b5f5a5bcb2953ccd0a2dce8815ed2691bb351586c3d6c7c3b5e1873309c3da not found: ID does not exist" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.828262 4707 scope.go:117] "RemoveContainer" containerID="8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae" Jan 21 17:48:22 crc kubenswrapper[4707]: E0121 17:48:22.828636 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae\": container with ID starting with 8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae not found: ID does not exist" containerID="8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.828672 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae"} err="failed to get container status \"8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae\": rpc error: code = NotFound desc = could not find container \"8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae\": container with ID starting with 8b07662bfaa8326216ab553118aa1791e13602cc0a5bd85a5dfadc3a1ef748ae not found: ID does not exist" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.828698 4707 scope.go:117] "RemoveContainer" containerID="d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729" Jan 21 17:48:22 crc kubenswrapper[4707]: E0121 17:48:22.829042 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729\": container with ID starting with d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729 not found: ID does not exist" containerID="d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729" Jan 21 17:48:22 crc kubenswrapper[4707]: I0121 17:48:22.829088 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729"} err="failed to get container status \"d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729\": rpc error: code = NotFound desc = could not find container \"d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729\": container with ID starting with d5723deae4aa497077a4f5e943b8252b219b951a3230119c3967a00f0ad67729 not found: ID does not exist" Jan 21 17:48:23 crc kubenswrapper[4707]: I0121 17:48:23.191635 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" path="/var/lib/kubelet/pods/335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795/volumes" Jan 21 17:49:09 crc kubenswrapper[4707]: I0121 17:49:09.945796 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:49:09 crc kubenswrapper[4707]: I0121 17:49:09.946108 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:49:39 crc kubenswrapper[4707]: I0121 17:49:39.946363 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:49:39 crc kubenswrapper[4707]: I0121 17:49:39.946882 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.577391 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k9wrg"] Jan 21 17:49:43 crc kubenswrapper[4707]: E0121 17:49:43.578205 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="extract-utilities" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.578227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="extract-utilities" Jan 21 17:49:43 crc kubenswrapper[4707]: E0121 17:49:43.578244 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="registry-server" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.578253 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="registry-server" Jan 21 17:49:43 crc kubenswrapper[4707]: E0121 17:49:43.578282 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="extract-content" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.578289 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="extract-content" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.578460 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="335f9d9e-196f-4eb9-ae6c-8ba9d4d0f795" containerName="registry-server" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.579684 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.599782 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9wrg"] Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.661533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-catalog-content\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.661572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5rt\" (UniqueName: \"kubernetes.io/projected/9e342855-1c34-4942-9ee0-ae4862541b40-kube-api-access-2v5rt\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.661598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-utilities\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.763947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-catalog-content\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.763999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v5rt\" (UniqueName: \"kubernetes.io/projected/9e342855-1c34-4942-9ee0-ae4862541b40-kube-api-access-2v5rt\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.764033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-utilities\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.764728 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-catalog-content\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.764755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-utilities\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.791556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5rt\" (UniqueName: \"kubernetes.io/projected/9e342855-1c34-4942-9ee0-ae4862541b40-kube-api-access-2v5rt\") pod \"redhat-marketplace-k9wrg\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:43 crc kubenswrapper[4707]: I0121 17:49:43.895524 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:44 crc kubenswrapper[4707]: I0121 17:49:44.313838 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9wrg"] Jan 21 17:49:45 crc kubenswrapper[4707]: I0121 17:49:45.235565 4707 generic.go:334] "Generic (PLEG): container finished" podID="9e342855-1c34-4942-9ee0-ae4862541b40" containerID="7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e" exitCode=0 Jan 21 17:49:45 crc kubenswrapper[4707]: I0121 17:49:45.235622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9wrg" event={"ID":"9e342855-1c34-4942-9ee0-ae4862541b40","Type":"ContainerDied","Data":"7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e"} Jan 21 17:49:45 crc kubenswrapper[4707]: I0121 17:49:45.235912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9wrg" event={"ID":"9e342855-1c34-4942-9ee0-ae4862541b40","Type":"ContainerStarted","Data":"261553c00aed8c67a23660445a30548ead42247cbab5dc273f9acf13ac4d1da2"} Jan 21 17:49:46 crc kubenswrapper[4707]: I0121 17:49:46.243714 4707 generic.go:334] "Generic (PLEG): container finished" podID="9e342855-1c34-4942-9ee0-ae4862541b40" containerID="e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10" exitCode=0 Jan 21 17:49:46 crc kubenswrapper[4707]: I0121 17:49:46.243755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9wrg" event={"ID":"9e342855-1c34-4942-9ee0-ae4862541b40","Type":"ContainerDied","Data":"e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10"} Jan 21 17:49:47 crc kubenswrapper[4707]: I0121 17:49:47.252095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9wrg" event={"ID":"9e342855-1c34-4942-9ee0-ae4862541b40","Type":"ContainerStarted","Data":"4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418"} Jan 21 17:49:47 crc kubenswrapper[4707]: I0121 17:49:47.266624 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k9wrg" podStartSLOduration=2.784193647 podStartE2EDuration="4.266607608s" podCreationTimestamp="2026-01-21 
17:49:43 +0000 UTC" firstStartedPulling="2026-01-21 17:49:45.238586657 +0000 UTC m=+10082.420102879" lastFinishedPulling="2026-01-21 17:49:46.721000618 +0000 UTC m=+10083.902516840" observedRunningTime="2026-01-21 17:49:47.265563704 +0000 UTC m=+10084.447079926" watchObservedRunningTime="2026-01-21 17:49:47.266607608 +0000 UTC m=+10084.448123830" Jan 21 17:49:53 crc kubenswrapper[4707]: I0121 17:49:53.896021 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:53 crc kubenswrapper[4707]: I0121 17:49:53.897142 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:53 crc kubenswrapper[4707]: I0121 17:49:53.928382 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:54 crc kubenswrapper[4707]: I0121 17:49:54.340957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:54 crc kubenswrapper[4707]: I0121 17:49:54.376389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9wrg"] Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.315102 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k9wrg" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="registry-server" containerID="cri-o://4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418" gracePeriod=2 Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.635429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.762170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-utilities\") pod \"9e342855-1c34-4942-9ee0-ae4862541b40\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.762234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-catalog-content\") pod \"9e342855-1c34-4942-9ee0-ae4862541b40\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.762277 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v5rt\" (UniqueName: \"kubernetes.io/projected/9e342855-1c34-4942-9ee0-ae4862541b40-kube-api-access-2v5rt\") pod \"9e342855-1c34-4942-9ee0-ae4862541b40\" (UID: \"9e342855-1c34-4942-9ee0-ae4862541b40\") " Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.768899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-utilities" (OuterVolumeSpecName: "utilities") pod "9e342855-1c34-4942-9ee0-ae4862541b40" (UID: "9e342855-1c34-4942-9ee0-ae4862541b40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.772398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e342855-1c34-4942-9ee0-ae4862541b40-kube-api-access-2v5rt" (OuterVolumeSpecName: "kube-api-access-2v5rt") pod "9e342855-1c34-4942-9ee0-ae4862541b40" (UID: "9e342855-1c34-4942-9ee0-ae4862541b40"). InnerVolumeSpecName "kube-api-access-2v5rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.782622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e342855-1c34-4942-9ee0-ae4862541b40" (UID: "9e342855-1c34-4942-9ee0-ae4862541b40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.864315 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.864362 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e342855-1c34-4942-9ee0-ae4862541b40-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:49:56 crc kubenswrapper[4707]: I0121 17:49:56.864384 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v5rt\" (UniqueName: \"kubernetes.io/projected/9e342855-1c34-4942-9ee0-ae4862541b40-kube-api-access-2v5rt\") on node \"crc\" DevicePath \"\"" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.322509 4707 generic.go:334] "Generic (PLEG): container finished" podID="9e342855-1c34-4942-9ee0-ae4862541b40" containerID="4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418" exitCode=0 Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.322546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9wrg" event={"ID":"9e342855-1c34-4942-9ee0-ae4862541b40","Type":"ContainerDied","Data":"4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418"} Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.322588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9wrg" event={"ID":"9e342855-1c34-4942-9ee0-ae4862541b40","Type":"ContainerDied","Data":"261553c00aed8c67a23660445a30548ead42247cbab5dc273f9acf13ac4d1da2"} Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.322603 4707 scope.go:117] "RemoveContainer" containerID="4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.322798 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9wrg" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.339129 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9wrg"] Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.343019 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9wrg"] Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.344298 4707 scope.go:117] "RemoveContainer" containerID="e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.372890 4707 scope.go:117] "RemoveContainer" containerID="7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.383642 4707 scope.go:117] "RemoveContainer" containerID="4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418" Jan 21 17:49:57 crc kubenswrapper[4707]: E0121 17:49:57.383957 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418\": container with ID starting with 4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418 not found: ID does not exist" containerID="4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.384006 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418"} err="failed to get container status \"4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418\": rpc error: code = NotFound desc = could not find container \"4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418\": container with ID starting with 4e4e4c7545816dc79c23b92f919379f6d194bbde5a2918a9a3f495d25481f418 not found: ID does not exist" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.384025 4707 scope.go:117] "RemoveContainer" containerID="e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10" Jan 21 17:49:57 crc kubenswrapper[4707]: E0121 17:49:57.384277 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10\": container with ID starting with e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10 not found: ID does not exist" containerID="e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.384296 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10"} err="failed to get container status \"e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10\": rpc error: code = NotFound desc = could not find container \"e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10\": container with ID starting with e1e040de7e9b25a7e4a82fe9c5a8d2339905f5c7830ba26bb5d3233ccebd8d10 not found: ID does not exist" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.384309 4707 scope.go:117] "RemoveContainer" containerID="7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e" Jan 21 17:49:57 crc kubenswrapper[4707]: E0121 17:49:57.384562 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e\": container with ID starting with 7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e not found: ID does not exist" containerID="7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e" Jan 21 17:49:57 crc kubenswrapper[4707]: I0121 17:49:57.384583 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e"} err="failed to get container status \"7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e\": rpc error: code = NotFound desc = could not find container \"7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e\": container with ID starting with 7214de4f9303307833f1a73f39020c35875f19c2cab1609fe25b8d9a3aa4ad4e not found: ID does not exist" Jan 21 17:49:59 crc kubenswrapper[4707]: I0121 17:49:59.188483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" path="/var/lib/kubelet/pods/9e342855-1c34-4942-9ee0-ae4862541b40/volumes" Jan 21 17:50:09 crc kubenswrapper[4707]: I0121 17:50:09.945796 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:50:09 crc kubenswrapper[4707]: I0121 17:50:09.946148 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:50:09 crc kubenswrapper[4707]: I0121 17:50:09.946180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:50:09 crc kubenswrapper[4707]: I0121 17:50:09.946562 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"539bf60f79bd3c31f096e67c2cf9e0684d54f47834983391040a7ea84767a2ff"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:50:09 crc kubenswrapper[4707]: I0121 17:50:09.946599 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://539bf60f79bd3c31f096e67c2cf9e0684d54f47834983391040a7ea84767a2ff" gracePeriod=600 Jan 21 17:50:10 crc kubenswrapper[4707]: I0121 17:50:10.394000 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="539bf60f79bd3c31f096e67c2cf9e0684d54f47834983391040a7ea84767a2ff" exitCode=0 Jan 21 17:50:10 crc kubenswrapper[4707]: I0121 17:50:10.394054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" 
event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"539bf60f79bd3c31f096e67c2cf9e0684d54f47834983391040a7ea84767a2ff"} Jan 21 17:50:10 crc kubenswrapper[4707]: I0121 17:50:10.394189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b"} Jan 21 17:50:10 crc kubenswrapper[4707]: I0121 17:50:10.394208 4707 scope.go:117] "RemoveContainer" containerID="76c242e4a344736312cd68f87b7b158085e10ffef5b4d09ca5a92f3153a5b858" Jan 21 17:52:39 crc kubenswrapper[4707]: I0121 17:52:39.945897 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:52:39 crc kubenswrapper[4707]: I0121 17:52:39.946402 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:53:09 crc kubenswrapper[4707]: I0121 17:53:09.945323 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:53:09 crc kubenswrapper[4707]: I0121 17:53:09.945775 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:53:39 crc kubenswrapper[4707]: I0121 17:53:39.945554 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:53:39 crc kubenswrapper[4707]: I0121 17:53:39.945967 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:53:39 crc kubenswrapper[4707]: I0121 17:53:39.946001 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 17:53:39 crc kubenswrapper[4707]: I0121 17:53:39.946355 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 
17:53:39 crc kubenswrapper[4707]: I0121 17:53:39.946406 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" gracePeriod=600 Jan 21 17:53:40 crc kubenswrapper[4707]: E0121 17:53:40.075671 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:53:40 crc kubenswrapper[4707]: I0121 17:53:40.487631 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" exitCode=0 Jan 21 17:53:40 crc kubenswrapper[4707]: I0121 17:53:40.487666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b"} Jan 21 17:53:40 crc kubenswrapper[4707]: I0121 17:53:40.487693 4707 scope.go:117] "RemoveContainer" containerID="539bf60f79bd3c31f096e67c2cf9e0684d54f47834983391040a7ea84767a2ff" Jan 21 17:53:40 crc kubenswrapper[4707]: I0121 17:53:40.488062 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:53:40 crc kubenswrapper[4707]: E0121 17:53:40.488261 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:53:56 crc kubenswrapper[4707]: I0121 17:53:56.182298 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:53:56 crc kubenswrapper[4707]: E0121 17:53:56.182924 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:54:08 crc kubenswrapper[4707]: I0121 17:54:08.182735 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:54:08 crc kubenswrapper[4707]: E0121 17:54:08.183252 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:54:22 crc kubenswrapper[4707]: I0121 17:54:22.183088 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:54:22 crc kubenswrapper[4707]: E0121 17:54:22.183784 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.303286 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mx5hr"] Jan 21 17:54:32 crc kubenswrapper[4707]: E0121 17:54:32.304881 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="registry-server" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.304903 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="registry-server" Jan 21 17:54:32 crc kubenswrapper[4707]: E0121 17:54:32.304925 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="extract-content" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.304932 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="extract-content" Jan 21 17:54:32 crc kubenswrapper[4707]: E0121 17:54:32.304947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="extract-utilities" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.304953 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="extract-utilities" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.305051 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e342855-1c34-4942-9ee0-ae4862541b40" containerName="registry-server" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.305796 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.311967 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx5hr"] Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.429692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-catalog-content\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.429804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvgr\" (UniqueName: \"kubernetes.io/projected/391fee5e-140e-4b94-8a87-cc92ca79aa66-kube-api-access-zmvgr\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.430333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-utilities\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.531781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvgr\" (UniqueName: \"kubernetes.io/projected/391fee5e-140e-4b94-8a87-cc92ca79aa66-kube-api-access-zmvgr\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.531866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-utilities\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.531920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-catalog-content\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.532324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-catalog-content\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.532442 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-utilities\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.548017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zmvgr\" (UniqueName: \"kubernetes.io/projected/391fee5e-140e-4b94-8a87-cc92ca79aa66-kube-api-access-zmvgr\") pod \"redhat-operators-mx5hr\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:32 crc kubenswrapper[4707]: I0121 17:54:32.636739 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:33 crc kubenswrapper[4707]: I0121 17:54:33.030693 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx5hr"] Jan 21 17:54:33 crc kubenswrapper[4707]: I0121 17:54:33.187255 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:54:33 crc kubenswrapper[4707]: E0121 17:54:33.187842 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:54:33 crc kubenswrapper[4707]: I0121 17:54:33.776176 4707 generic.go:334] "Generic (PLEG): container finished" podID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerID="0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca" exitCode=0 Jan 21 17:54:33 crc kubenswrapper[4707]: I0121 17:54:33.776222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerDied","Data":"0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca"} Jan 21 17:54:33 crc kubenswrapper[4707]: I0121 17:54:33.776248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerStarted","Data":"dea503f358bdeb0507a9ba9ac4651329138c9788af242e970a8d14429f86a8bd"} Jan 21 17:54:33 crc kubenswrapper[4707]: I0121 17:54:33.777488 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:54:34 crc kubenswrapper[4707]: I0121 17:54:34.785198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerStarted","Data":"a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3"} Jan 21 17:54:35 crc kubenswrapper[4707]: I0121 17:54:35.792131 4707 generic.go:334] "Generic (PLEG): container finished" podID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerID="a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3" exitCode=0 Jan 21 17:54:35 crc kubenswrapper[4707]: I0121 17:54:35.792172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerDied","Data":"a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3"} Jan 21 17:54:36 crc kubenswrapper[4707]: I0121 17:54:36.799852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" 
event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerStarted","Data":"8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976"} Jan 21 17:54:36 crc kubenswrapper[4707]: I0121 17:54:36.819502 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mx5hr" podStartSLOduration=2.291062508 podStartE2EDuration="4.819490058s" podCreationTimestamp="2026-01-21 17:54:32 +0000 UTC" firstStartedPulling="2026-01-21 17:54:33.777287831 +0000 UTC m=+10370.958804054" lastFinishedPulling="2026-01-21 17:54:36.305715382 +0000 UTC m=+10373.487231604" observedRunningTime="2026-01-21 17:54:36.816656031 +0000 UTC m=+10373.998172252" watchObservedRunningTime="2026-01-21 17:54:36.819490058 +0000 UTC m=+10374.001006271" Jan 21 17:54:42 crc kubenswrapper[4707]: I0121 17:54:42.637940 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:42 crc kubenswrapper[4707]: I0121 17:54:42.638357 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:42 crc kubenswrapper[4707]: I0121 17:54:42.675200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:42 crc kubenswrapper[4707]: I0121 17:54:42.867936 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:42 crc kubenswrapper[4707]: I0121 17:54:42.899055 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mx5hr"] Jan 21 17:54:44 crc kubenswrapper[4707]: I0121 17:54:44.182247 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:54:44 crc kubenswrapper[4707]: E0121 17:54:44.183471 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:54:44 crc kubenswrapper[4707]: I0121 17:54:44.845321 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mx5hr" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="registry-server" containerID="cri-o://8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976" gracePeriod=2 Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.172356 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.227192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-catalog-content\") pod \"391fee5e-140e-4b94-8a87-cc92ca79aa66\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.227235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmvgr\" (UniqueName: \"kubernetes.io/projected/391fee5e-140e-4b94-8a87-cc92ca79aa66-kube-api-access-zmvgr\") pod \"391fee5e-140e-4b94-8a87-cc92ca79aa66\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.227303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-utilities\") pod \"391fee5e-140e-4b94-8a87-cc92ca79aa66\" (UID: \"391fee5e-140e-4b94-8a87-cc92ca79aa66\") " Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.228185 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-utilities" (OuterVolumeSpecName: "utilities") pod "391fee5e-140e-4b94-8a87-cc92ca79aa66" (UID: "391fee5e-140e-4b94-8a87-cc92ca79aa66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.233382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391fee5e-140e-4b94-8a87-cc92ca79aa66-kube-api-access-zmvgr" (OuterVolumeSpecName: "kube-api-access-zmvgr") pod "391fee5e-140e-4b94-8a87-cc92ca79aa66" (UID: "391fee5e-140e-4b94-8a87-cc92ca79aa66"). InnerVolumeSpecName "kube-api-access-zmvgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.328711 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmvgr\" (UniqueName: \"kubernetes.io/projected/391fee5e-140e-4b94-8a87-cc92ca79aa66-kube-api-access-zmvgr\") on node \"crc\" DevicePath \"\"" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.328736 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.851634 4707 generic.go:334] "Generic (PLEG): container finished" podID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerID="8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976" exitCode=0 Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.851677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerDied","Data":"8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976"} Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.851682 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx5hr" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.851704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx5hr" event={"ID":"391fee5e-140e-4b94-8a87-cc92ca79aa66","Type":"ContainerDied","Data":"dea503f358bdeb0507a9ba9ac4651329138c9788af242e970a8d14429f86a8bd"} Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.851721 4707 scope.go:117] "RemoveContainer" containerID="8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.869898 4707 scope.go:117] "RemoveContainer" containerID="a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.888778 4707 scope.go:117] "RemoveContainer" containerID="0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.904531 4707 scope.go:117] "RemoveContainer" containerID="8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976" Jan 21 17:54:45 crc kubenswrapper[4707]: E0121 17:54:45.905804 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976\": container with ID starting with 8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976 not found: ID does not exist" containerID="8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.905860 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976"} err="failed to get container status \"8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976\": rpc error: code = NotFound desc = could not find container \"8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976\": container with ID starting with 8bbf3f8655caa0173c771c162b121d61ce64d971a37e9ecb9c30f9ae7c417976 not found: ID does not exist" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.905894 4707 scope.go:117] "RemoveContainer" containerID="a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3" Jan 21 17:54:45 crc kubenswrapper[4707]: E0121 17:54:45.906230 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3\": container with ID starting with a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3 not found: ID does not exist" containerID="a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.906263 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3"} err="failed to get container status \"a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3\": rpc error: code = NotFound desc = could not find container \"a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3\": container with ID starting with a02804597a3cb3b09fa62dfff393abefa604ad3d01ffeaef75e33b90b814d3f3 not found: ID does not exist" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.906280 4707 scope.go:117] "RemoveContainer" 
containerID="0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca" Jan 21 17:54:45 crc kubenswrapper[4707]: E0121 17:54:45.906533 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca\": container with ID starting with 0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca not found: ID does not exist" containerID="0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca" Jan 21 17:54:45 crc kubenswrapper[4707]: I0121 17:54:45.906564 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca"} err="failed to get container status \"0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca\": rpc error: code = NotFound desc = could not find container \"0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca\": container with ID starting with 0cf81b4b4a8b1ee227c505f0ef9568982ee48d44bb8a3e11863b24b29efd70ca not found: ID does not exist" Jan 21 17:54:46 crc kubenswrapper[4707]: I0121 17:54:46.143166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "391fee5e-140e-4b94-8a87-cc92ca79aa66" (UID: "391fee5e-140e-4b94-8a87-cc92ca79aa66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:54:46 crc kubenswrapper[4707]: I0121 17:54:46.172527 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mx5hr"] Jan 21 17:54:46 crc kubenswrapper[4707]: I0121 17:54:46.178081 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mx5hr"] Jan 21 17:54:46 crc kubenswrapper[4707]: I0121 17:54:46.239117 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391fee5e-140e-4b94-8a87-cc92ca79aa66-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:54:47 crc kubenswrapper[4707]: I0121 17:54:47.192229 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" path="/var/lib/kubelet/pods/391fee5e-140e-4b94-8a87-cc92ca79aa66/volumes" Jan 21 17:54:55 crc kubenswrapper[4707]: I0121 17:54:55.183608 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:54:55 crc kubenswrapper[4707]: E0121 17:54:55.184270 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:55:09 crc kubenswrapper[4707]: I0121 17:55:09.182780 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:55:09 crc kubenswrapper[4707]: E0121 17:55:09.183313 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:55:24 crc kubenswrapper[4707]: I0121 17:55:24.182769 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:55:24 crc kubenswrapper[4707]: E0121 17:55:24.183495 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:55:37 crc kubenswrapper[4707]: I0121 17:55:37.184160 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:55:37 crc kubenswrapper[4707]: E0121 17:55:37.184745 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:55:51 crc kubenswrapper[4707]: I0121 17:55:51.186105 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:55:51 crc kubenswrapper[4707]: E0121 17:55:51.187051 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:56:02 crc kubenswrapper[4707]: I0121 17:56:02.182648 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:56:02 crc kubenswrapper[4707]: E0121 17:56:02.183509 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:56:17 crc kubenswrapper[4707]: I0121 17:56:17.183768 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:56:17 crc kubenswrapper[4707]: E0121 17:56:17.184324 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:56:28 crc kubenswrapper[4707]: I0121 17:56:28.182393 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:56:28 crc kubenswrapper[4707]: E0121 17:56:28.183150 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:56:42 crc kubenswrapper[4707]: I0121 17:56:42.182476 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:56:42 crc kubenswrapper[4707]: E0121 17:56:42.183801 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:56:53 crc kubenswrapper[4707]: I0121 17:56:53.211222 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:56:53 crc kubenswrapper[4707]: E0121 17:56:53.211875 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:57:04 crc kubenswrapper[4707]: I0121 17:57:04.182295 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:57:04 crc kubenswrapper[4707]: E0121 17:57:04.182970 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.356501 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kthlb"] Jan 21 17:57:18 crc kubenswrapper[4707]: E0121 17:57:18.357014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="extract-utilities" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.357026 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="extract-utilities" Jan 21 17:57:18 crc kubenswrapper[4707]: E0121 17:57:18.357044 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" 
containerName="extract-content" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.357049 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="extract-content" Jan 21 17:57:18 crc kubenswrapper[4707]: E0121 17:57:18.357065 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="registry-server" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.357072 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="registry-server" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.357169 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="391fee5e-140e-4b94-8a87-cc92ca79aa66" containerName="registry-server" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.357871 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.370296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kthlb"] Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.405446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-catalog-content\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.405503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-utilities\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.405570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnk76\" (UniqueName: \"kubernetes.io/projected/ad2d2161-98d2-4298-8922-bacb88d208a0-kube-api-access-tnk76\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.506987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnk76\" (UniqueName: \"kubernetes.io/projected/ad2d2161-98d2-4298-8922-bacb88d208a0-kube-api-access-tnk76\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.507050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-catalog-content\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.507086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-utilities\") pod \"community-operators-kthlb\" (UID: 
\"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.507864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-catalog-content\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.507878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-utilities\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.529891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnk76\" (UniqueName: \"kubernetes.io/projected/ad2d2161-98d2-4298-8922-bacb88d208a0-kube-api-access-tnk76\") pod \"community-operators-kthlb\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.671847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:18 crc kubenswrapper[4707]: I0121 17:57:18.938839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kthlb"] Jan 21 17:57:19 crc kubenswrapper[4707]: I0121 17:57:19.193622 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:57:19 crc kubenswrapper[4707]: E0121 17:57:19.194156 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:57:19 crc kubenswrapper[4707]: I0121 17:57:19.675727 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerID="15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495" exitCode=0 Jan 21 17:57:19 crc kubenswrapper[4707]: I0121 17:57:19.676859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthlb" event={"ID":"ad2d2161-98d2-4298-8922-bacb88d208a0","Type":"ContainerDied","Data":"15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495"} Jan 21 17:57:19 crc kubenswrapper[4707]: I0121 17:57:19.676982 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthlb" event={"ID":"ad2d2161-98d2-4298-8922-bacb88d208a0","Type":"ContainerStarted","Data":"a5569304220eb4bee041f25b51cf5065e7d37fe4aca59c296deda026fa8712b1"} Jan 21 17:57:20 crc kubenswrapper[4707]: I0121 17:57:20.684200 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerID="df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c" exitCode=0 Jan 21 17:57:20 crc kubenswrapper[4707]: I0121 17:57:20.684237 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-kthlb" event={"ID":"ad2d2161-98d2-4298-8922-bacb88d208a0","Type":"ContainerDied","Data":"df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c"} Jan 21 17:57:21 crc kubenswrapper[4707]: I0121 17:57:21.692267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthlb" event={"ID":"ad2d2161-98d2-4298-8922-bacb88d208a0","Type":"ContainerStarted","Data":"2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c"} Jan 21 17:57:21 crc kubenswrapper[4707]: I0121 17:57:21.703420 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kthlb" podStartSLOduration=2.13656304 podStartE2EDuration="3.703395862s" podCreationTimestamp="2026-01-21 17:57:18 +0000 UTC" firstStartedPulling="2026-01-21 17:57:19.678915572 +0000 UTC m=+10536.860431794" lastFinishedPulling="2026-01-21 17:57:21.245748394 +0000 UTC m=+10538.427264616" observedRunningTime="2026-01-21 17:57:21.702774064 +0000 UTC m=+10538.884290285" watchObservedRunningTime="2026-01-21 17:57:21.703395862 +0000 UTC m=+10538.884912085" Jan 21 17:57:28 crc kubenswrapper[4707]: I0121 17:57:28.672825 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:28 crc kubenswrapper[4707]: I0121 17:57:28.673191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:28 crc kubenswrapper[4707]: I0121 17:57:28.700884 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:28 crc kubenswrapper[4707]: I0121 17:57:28.760677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:28 crc kubenswrapper[4707]: I0121 17:57:28.921694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kthlb"] Jan 21 17:57:30 crc kubenswrapper[4707]: I0121 17:57:30.743125 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kthlb" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="registry-server" containerID="cri-o://2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c" gracePeriod=2 Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.043884 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.155858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-utilities\") pod \"ad2d2161-98d2-4298-8922-bacb88d208a0\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.155923 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-catalog-content\") pod \"ad2d2161-98d2-4298-8922-bacb88d208a0\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.156569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-utilities" (OuterVolumeSpecName: "utilities") pod "ad2d2161-98d2-4298-8922-bacb88d208a0" (UID: "ad2d2161-98d2-4298-8922-bacb88d208a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.164552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnk76\" (UniqueName: \"kubernetes.io/projected/ad2d2161-98d2-4298-8922-bacb88d208a0-kube-api-access-tnk76\") pod \"ad2d2161-98d2-4298-8922-bacb88d208a0\" (UID: \"ad2d2161-98d2-4298-8922-bacb88d208a0\") " Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.164849 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.170928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2d2161-98d2-4298-8922-bacb88d208a0-kube-api-access-tnk76" (OuterVolumeSpecName: "kube-api-access-tnk76") pod "ad2d2161-98d2-4298-8922-bacb88d208a0" (UID: "ad2d2161-98d2-4298-8922-bacb88d208a0"). InnerVolumeSpecName "kube-api-access-tnk76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.182589 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:57:31 crc kubenswrapper[4707]: E0121 17:57:31.182916 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.197981 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad2d2161-98d2-4298-8922-bacb88d208a0" (UID: "ad2d2161-98d2-4298-8922-bacb88d208a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.266154 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2d2161-98d2-4298-8922-bacb88d208a0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.266179 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnk76\" (UniqueName: \"kubernetes.io/projected/ad2d2161-98d2-4298-8922-bacb88d208a0-kube-api-access-tnk76\") on node \"crc\" DevicePath \"\"" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.748859 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerID="2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c" exitCode=0 Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.748896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthlb" event={"ID":"ad2d2161-98d2-4298-8922-bacb88d208a0","Type":"ContainerDied","Data":"2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c"} Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.748919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthlb" event={"ID":"ad2d2161-98d2-4298-8922-bacb88d208a0","Type":"ContainerDied","Data":"a5569304220eb4bee041f25b51cf5065e7d37fe4aca59c296deda026fa8712b1"} Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.748923 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kthlb" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.748933 4707 scope.go:117] "RemoveContainer" containerID="2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.762141 4707 scope.go:117] "RemoveContainer" containerID="df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.785263 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kthlb"] Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.789493 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kthlb"] Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.794018 4707 scope.go:117] "RemoveContainer" containerID="15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.805381 4707 scope.go:117] "RemoveContainer" containerID="2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c" Jan 21 17:57:31 crc kubenswrapper[4707]: E0121 17:57:31.805657 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c\": container with ID starting with 2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c not found: ID does not exist" containerID="2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.805710 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c"} err="failed to get container status 
\"2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c\": rpc error: code = NotFound desc = could not find container \"2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c\": container with ID starting with 2966e52a3e454c4eac2f29d04c9e4dd725481589b8a76f24d4c7988d50df8c5c not found: ID does not exist" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.805732 4707 scope.go:117] "RemoveContainer" containerID="df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c" Jan 21 17:57:31 crc kubenswrapper[4707]: E0121 17:57:31.805976 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c\": container with ID starting with df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c not found: ID does not exist" containerID="df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.806008 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c"} err="failed to get container status \"df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c\": rpc error: code = NotFound desc = could not find container \"df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c\": container with ID starting with df4dab6b494dcfce1956aef96865a0d6d628978a53bfd4288e061dc7d727f67c not found: ID does not exist" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.806024 4707 scope.go:117] "RemoveContainer" containerID="15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495" Jan 21 17:57:31 crc kubenswrapper[4707]: E0121 17:57:31.806275 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495\": container with ID starting with 15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495 not found: ID does not exist" containerID="15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495" Jan 21 17:57:31 crc kubenswrapper[4707]: I0121 17:57:31.806297 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495"} err="failed to get container status \"15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495\": rpc error: code = NotFound desc = could not find container \"15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495\": container with ID starting with 15434ae2226982bb67c23fd762b7534c227a3d7f4851674cdb3e664b1abf2495 not found: ID does not exist" Jan 21 17:57:33 crc kubenswrapper[4707]: I0121 17:57:33.188090 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" path="/var/lib/kubelet/pods/ad2d2161-98d2-4298-8922-bacb88d208a0/volumes" Jan 21 17:57:42 crc kubenswrapper[4707]: I0121 17:57:42.182025 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:57:42 crc kubenswrapper[4707]: E0121 17:57:42.182568 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:57:53 crc kubenswrapper[4707]: I0121 17:57:53.185960 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:57:53 crc kubenswrapper[4707]: E0121 17:57:53.186481 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:58:08 crc kubenswrapper[4707]: I0121 17:58:08.182739 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:58:08 crc kubenswrapper[4707]: E0121 17:58:08.183214 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:58:22 crc kubenswrapper[4707]: I0121 17:58:22.182852 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:58:22 crc kubenswrapper[4707]: E0121 17:58:22.183370 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:58:36 crc kubenswrapper[4707]: I0121 17:58:36.182911 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:58:36 crc kubenswrapper[4707]: E0121 17:58:36.183433 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.201505 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nd2vp"] Jan 21 17:58:39 crc kubenswrapper[4707]: E0121 17:58:39.201875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="registry-server" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.201886 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="registry-server" Jan 21 17:58:39 crc kubenswrapper[4707]: E0121 17:58:39.201899 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="extract-content" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.201904 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="extract-content" Jan 21 17:58:39 crc kubenswrapper[4707]: E0121 17:58:39.201915 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="extract-utilities" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.201920 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="extract-utilities" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.202032 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2d2161-98d2-4298-8922-bacb88d208a0" containerName="registry-server" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.202923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nd2vp"] Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.202992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.329833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-catalog-content\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.329905 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvf5s\" (UniqueName: \"kubernetes.io/projected/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-kube-api-access-bvf5s\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.329970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-utilities\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.430909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-utilities\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.431318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-utilities\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.431611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-catalog-content\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.431647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-catalog-content\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.431716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvf5s\" (UniqueName: \"kubernetes.io/projected/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-kube-api-access-bvf5s\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.452439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvf5s\" (UniqueName: \"kubernetes.io/projected/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-kube-api-access-bvf5s\") pod \"certified-operators-nd2vp\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.515275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:39 crc kubenswrapper[4707]: I0121 17:58:39.945168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nd2vp"] Jan 21 17:58:40 crc kubenswrapper[4707]: I0121 17:58:40.077361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerStarted","Data":"7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10"} Jan 21 17:58:40 crc kubenswrapper[4707]: I0121 17:58:40.077879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerStarted","Data":"94bcd69705db6bd7e5e63e7ae05755504e5520dbd62a89c7c2d2975090e081a9"} Jan 21 17:58:41 crc kubenswrapper[4707]: I0121 17:58:41.082380 4707 generic.go:334] "Generic (PLEG): container finished" podID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerID="7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10" exitCode=0 Jan 21 17:58:41 crc kubenswrapper[4707]: I0121 17:58:41.082427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerDied","Data":"7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10"} Jan 21 17:58:42 crc kubenswrapper[4707]: I0121 17:58:42.088016 4707 generic.go:334] "Generic (PLEG): container finished" podID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerID="ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899" exitCode=0 Jan 21 17:58:42 crc kubenswrapper[4707]: I0121 17:58:42.088165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" 
event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerDied","Data":"ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899"} Jan 21 17:58:43 crc kubenswrapper[4707]: I0121 17:58:43.097750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerStarted","Data":"bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557"} Jan 21 17:58:43 crc kubenswrapper[4707]: I0121 17:58:43.112215 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nd2vp" podStartSLOduration=2.622897347 podStartE2EDuration="4.112201741s" podCreationTimestamp="2026-01-21 17:58:39 +0000 UTC" firstStartedPulling="2026-01-21 17:58:41.083514672 +0000 UTC m=+10618.265030894" lastFinishedPulling="2026-01-21 17:58:42.572819066 +0000 UTC m=+10619.754335288" observedRunningTime="2026-01-21 17:58:43.109973291 +0000 UTC m=+10620.291489523" watchObservedRunningTime="2026-01-21 17:58:43.112201741 +0000 UTC m=+10620.293717953" Jan 21 17:58:49 crc kubenswrapper[4707]: I0121 17:58:49.182002 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 17:58:49 crc kubenswrapper[4707]: I0121 17:58:49.516332 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:49 crc kubenswrapper[4707]: I0121 17:58:49.516494 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:49 crc kubenswrapper[4707]: I0121 17:58:49.544354 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:50 crc kubenswrapper[4707]: I0121 17:58:50.130069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"7bb7971b7eb4bdcc00d9df2ef47cd138d32854148bc3934dbf389d47e9170626"} Jan 21 17:58:50 crc kubenswrapper[4707]: I0121 17:58:50.163248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:50 crc kubenswrapper[4707]: I0121 17:58:50.207196 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nd2vp"] Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.138877 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nd2vp" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="registry-server" containerID="cri-o://bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557" gracePeriod=2 Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.436950 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.590907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvf5s\" (UniqueName: \"kubernetes.io/projected/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-kube-api-access-bvf5s\") pod \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.590998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-utilities\") pod \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.591026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-catalog-content\") pod \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\" (UID: \"dedd2972-4b3b-4eda-961a-8fbe9480e2f6\") " Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.591748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-utilities" (OuterVolumeSpecName: "utilities") pod "dedd2972-4b3b-4eda-961a-8fbe9480e2f6" (UID: "dedd2972-4b3b-4eda-961a-8fbe9480e2f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.597070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-kube-api-access-bvf5s" (OuterVolumeSpecName: "kube-api-access-bvf5s") pod "dedd2972-4b3b-4eda-961a-8fbe9480e2f6" (UID: "dedd2972-4b3b-4eda-961a-8fbe9480e2f6"). InnerVolumeSpecName "kube-api-access-bvf5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.628183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dedd2972-4b3b-4eda-961a-8fbe9480e2f6" (UID: "dedd2972-4b3b-4eda-961a-8fbe9480e2f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.692585 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.692625 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:58:52 crc kubenswrapper[4707]: I0121 17:58:52.692639 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvf5s\" (UniqueName: \"kubernetes.io/projected/dedd2972-4b3b-4eda-961a-8fbe9480e2f6-kube-api-access-bvf5s\") on node \"crc\" DevicePath \"\"" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.144431 4707 generic.go:334] "Generic (PLEG): container finished" podID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerID="bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557" exitCode=0 Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.144468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerDied","Data":"bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557"} Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.144491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd2vp" event={"ID":"dedd2972-4b3b-4eda-961a-8fbe9480e2f6","Type":"ContainerDied","Data":"94bcd69705db6bd7e5e63e7ae05755504e5520dbd62a89c7c2d2975090e081a9"} Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.144506 4707 scope.go:117] "RemoveContainer" containerID="bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.144596 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd2vp" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.165208 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nd2vp"] Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.170136 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nd2vp"] Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.173948 4707 scope.go:117] "RemoveContainer" containerID="ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.185546 4707 scope.go:117] "RemoveContainer" containerID="7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.188377 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" path="/var/lib/kubelet/pods/dedd2972-4b3b-4eda-961a-8fbe9480e2f6/volumes" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.203413 4707 scope.go:117] "RemoveContainer" containerID="bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557" Jan 21 17:58:53 crc kubenswrapper[4707]: E0121 17:58:53.203660 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557\": container with ID starting with bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557 not found: ID does not exist" containerID="bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.203687 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557"} err="failed to get container status \"bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557\": rpc error: code = NotFound desc = could not find container \"bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557\": container with ID starting with bf4f50d38ac8e97575014a0053412334635e64700498bb5594981271ff313557 not found: ID does not exist" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.203704 4707 scope.go:117] "RemoveContainer" containerID="ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899" Jan 21 17:58:53 crc kubenswrapper[4707]: E0121 17:58:53.203949 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899\": container with ID starting with ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899 not found: ID does not exist" containerID="ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.203977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899"} err="failed to get container status \"ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899\": rpc error: code = NotFound desc = could not find container \"ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899\": container with ID starting with ec46933572598bbdc4246046d3d66a383f6ab0eb47c09594deba85a7e1e67899 not found: ID does not exist" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 
17:58:53.203995 4707 scope.go:117] "RemoveContainer" containerID="7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10" Jan 21 17:58:53 crc kubenswrapper[4707]: E0121 17:58:53.204180 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10\": container with ID starting with 7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10 not found: ID does not exist" containerID="7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10" Jan 21 17:58:53 crc kubenswrapper[4707]: I0121 17:58:53.204202 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10"} err="failed to get container status \"7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10\": rpc error: code = NotFound desc = could not find container \"7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10\": container with ID starting with 7e07a3909ba84cfc4142ea16de719f07c58c12b24315091d9b235e28a5122b10 not found: ID does not exist" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.129449 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9"] Jan 21 18:00:00 crc kubenswrapper[4707]: E0121 18:00:00.129958 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="extract-utilities" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.129970 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="extract-utilities" Jan 21 18:00:00 crc kubenswrapper[4707]: E0121 18:00:00.129992 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="extract-content" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.129998 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="extract-content" Jan 21 18:00:00 crc kubenswrapper[4707]: E0121 18:00:00.130008 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="registry-server" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.130014 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="registry-server" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.130119 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedd2972-4b3b-4eda-961a-8fbe9480e2f6" containerName="registry-server" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.130485 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.135523 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9"] Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.135618 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.135937 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.193553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7239c7b-327d-4924-8e08-bb0a9b1ba989-config-volume\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.193597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7239c7b-327d-4924-8e08-bb0a9b1ba989-secret-volume\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.193614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfflp\" (UniqueName: \"kubernetes.io/projected/e7239c7b-327d-4924-8e08-bb0a9b1ba989-kube-api-access-nfflp\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.294555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7239c7b-327d-4924-8e08-bb0a9b1ba989-config-volume\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.294594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7239c7b-327d-4924-8e08-bb0a9b1ba989-secret-volume\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.294612 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfflp\" (UniqueName: \"kubernetes.io/projected/e7239c7b-327d-4924-8e08-bb0a9b1ba989-kube-api-access-nfflp\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.295689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7239c7b-327d-4924-8e08-bb0a9b1ba989-config-volume\") pod 
\"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.299883 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7239c7b-327d-4924-8e08-bb0a9b1ba989-secret-volume\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.306791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfflp\" (UniqueName: \"kubernetes.io/projected/e7239c7b-327d-4924-8e08-bb0a9b1ba989-kube-api-access-nfflp\") pod \"collect-profiles-29483640-f72v9\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.445529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:00 crc kubenswrapper[4707]: I0121 18:00:00.795802 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9"] Jan 21 18:00:01 crc kubenswrapper[4707]: I0121 18:00:01.472577 4707 generic.go:334] "Generic (PLEG): container finished" podID="e7239c7b-327d-4924-8e08-bb0a9b1ba989" containerID="de6a4432f14d6bbc762398981ba2eddf0e7af05b2bcd45d2528dd757e4675ad1" exitCode=0 Jan 21 18:00:01 crc kubenswrapper[4707]: I0121 18:00:01.472609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" event={"ID":"e7239c7b-327d-4924-8e08-bb0a9b1ba989","Type":"ContainerDied","Data":"de6a4432f14d6bbc762398981ba2eddf0e7af05b2bcd45d2528dd757e4675ad1"} Jan 21 18:00:01 crc kubenswrapper[4707]: I0121 18:00:01.472632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" event={"ID":"e7239c7b-327d-4924-8e08-bb0a9b1ba989","Type":"ContainerStarted","Data":"bb1758f4c5873e900cef4badeb0d76c119330e33ba3a64ea2430b12948060a9a"} Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.677519 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.826267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfflp\" (UniqueName: \"kubernetes.io/projected/e7239c7b-327d-4924-8e08-bb0a9b1ba989-kube-api-access-nfflp\") pod \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.826311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7239c7b-327d-4924-8e08-bb0a9b1ba989-config-volume\") pod \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.826339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7239c7b-327d-4924-8e08-bb0a9b1ba989-secret-volume\") pod \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\" (UID: \"e7239c7b-327d-4924-8e08-bb0a9b1ba989\") " Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.827668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7239c7b-327d-4924-8e08-bb0a9b1ba989-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7239c7b-327d-4924-8e08-bb0a9b1ba989" (UID: "e7239c7b-327d-4924-8e08-bb0a9b1ba989"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.839179 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7239c7b-327d-4924-8e08-bb0a9b1ba989-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7239c7b-327d-4924-8e08-bb0a9b1ba989" (UID: "e7239c7b-327d-4924-8e08-bb0a9b1ba989"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.841154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7239c7b-327d-4924-8e08-bb0a9b1ba989-kube-api-access-nfflp" (OuterVolumeSpecName: "kube-api-access-nfflp") pod "e7239c7b-327d-4924-8e08-bb0a9b1ba989" (UID: "e7239c7b-327d-4924-8e08-bb0a9b1ba989"). InnerVolumeSpecName "kube-api-access-nfflp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.926979 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfflp\" (UniqueName: \"kubernetes.io/projected/e7239c7b-327d-4924-8e08-bb0a9b1ba989-kube-api-access-nfflp\") on node \"crc\" DevicePath \"\"" Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.927012 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7239c7b-327d-4924-8e08-bb0a9b1ba989-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 18:00:02 crc kubenswrapper[4707]: I0121 18:00:02.927022 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7239c7b-327d-4924-8e08-bb0a9b1ba989-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 18:00:03 crc kubenswrapper[4707]: I0121 18:00:03.484950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" event={"ID":"e7239c7b-327d-4924-8e08-bb0a9b1ba989","Type":"ContainerDied","Data":"bb1758f4c5873e900cef4badeb0d76c119330e33ba3a64ea2430b12948060a9a"} Jan 21 18:00:03 crc kubenswrapper[4707]: I0121 18:00:03.485002 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1758f4c5873e900cef4badeb0d76c119330e33ba3a64ea2430b12948060a9a" Jan 21 18:00:03 crc kubenswrapper[4707]: I0121 18:00:03.485070 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483640-f72v9" Jan 21 18:00:03 crc kubenswrapper[4707]: I0121 18:00:03.721564 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc"] Jan 21 18:00:03 crc kubenswrapper[4707]: I0121 18:00:03.724531 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483595-dnntc"] Jan 21 18:00:05 crc kubenswrapper[4707]: I0121 18:00:05.188779 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b280568d-b5c6-4efc-a0c1-239585fcf933" path="/var/lib/kubelet/pods/b280568d-b5c6-4efc-a0c1-239585fcf933/volumes" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.583315 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdpq"] Jan 21 18:00:08 crc kubenswrapper[4707]: E0121 18:00:08.583727 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7239c7b-327d-4924-8e08-bb0a9b1ba989" containerName="collect-profiles" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.583740 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7239c7b-327d-4924-8e08-bb0a9b1ba989" containerName="collect-profiles" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.583873 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7239c7b-327d-4924-8e08-bb0a9b1ba989" containerName="collect-profiles" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.584635 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.586903 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdpq"] Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.695980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-utilities\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.696034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np8mx\" (UniqueName: \"kubernetes.io/projected/cf5750ff-72f7-4c4f-b216-1d33b5344e75-kube-api-access-np8mx\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.696053 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-catalog-content\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.796614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-utilities\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.796655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np8mx\" (UniqueName: \"kubernetes.io/projected/cf5750ff-72f7-4c4f-b216-1d33b5344e75-kube-api-access-np8mx\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.796676 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-catalog-content\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.797047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-catalog-content\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.797257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-utilities\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.819500 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-np8mx\" (UniqueName: \"kubernetes.io/projected/cf5750ff-72f7-4c4f-b216-1d33b5344e75-kube-api-access-np8mx\") pod \"redhat-marketplace-4mdpq\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:08 crc kubenswrapper[4707]: I0121 18:00:08.900359 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:09 crc kubenswrapper[4707]: I0121 18:00:09.294121 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdpq"] Jan 21 18:00:09 crc kubenswrapper[4707]: I0121 18:00:09.514401 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerID="3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268" exitCode=0 Jan 21 18:00:09 crc kubenswrapper[4707]: I0121 18:00:09.514468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerDied","Data":"3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268"} Jan 21 18:00:09 crc kubenswrapper[4707]: I0121 18:00:09.514492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerStarted","Data":"cac4e34fd28488363176724c5fd3afc858cc5dc66a809bfdaf8e5202e0b11ee4"} Jan 21 18:00:09 crc kubenswrapper[4707]: I0121 18:00:09.516106 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 18:00:10 crc kubenswrapper[4707]: I0121 18:00:10.520311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerStarted","Data":"22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b"} Jan 21 18:00:11 crc kubenswrapper[4707]: I0121 18:00:11.526211 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerID="22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b" exitCode=0 Jan 21 18:00:11 crc kubenswrapper[4707]: I0121 18:00:11.526411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerDied","Data":"22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b"} Jan 21 18:00:12 crc kubenswrapper[4707]: I0121 18:00:12.532450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerStarted","Data":"5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab"} Jan 21 18:00:12 crc kubenswrapper[4707]: I0121 18:00:12.548944 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4mdpq" podStartSLOduration=2.049163386 podStartE2EDuration="4.548930109s" podCreationTimestamp="2026-01-21 18:00:08 +0000 UTC" firstStartedPulling="2026-01-21 18:00:09.515861125 +0000 UTC m=+10706.697377347" lastFinishedPulling="2026-01-21 18:00:12.015627847 +0000 UTC m=+10709.197144070" observedRunningTime="2026-01-21 18:00:12.544871269 +0000 UTC m=+10709.726387490" watchObservedRunningTime="2026-01-21 18:00:12.548930109 +0000 UTC 
m=+10709.730446332" Jan 21 18:00:18 crc kubenswrapper[4707]: I0121 18:00:18.901250 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:18 crc kubenswrapper[4707]: I0121 18:00:18.901707 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:18 crc kubenswrapper[4707]: I0121 18:00:18.937670 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:19 crc kubenswrapper[4707]: I0121 18:00:19.591307 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:19 crc kubenswrapper[4707]: I0121 18:00:19.621289 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdpq"] Jan 21 18:00:21 crc kubenswrapper[4707]: I0121 18:00:21.574327 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4mdpq" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="registry-server" containerID="cri-o://5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab" gracePeriod=2 Jan 21 18:00:21 crc kubenswrapper[4707]: I0121 18:00:21.865557 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.056537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-catalog-content\") pod \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.056595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np8mx\" (UniqueName: \"kubernetes.io/projected/cf5750ff-72f7-4c4f-b216-1d33b5344e75-kube-api-access-np8mx\") pod \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.056629 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-utilities\") pod \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\" (UID: \"cf5750ff-72f7-4c4f-b216-1d33b5344e75\") " Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.057403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-utilities" (OuterVolumeSpecName: "utilities") pod "cf5750ff-72f7-4c4f-b216-1d33b5344e75" (UID: "cf5750ff-72f7-4c4f-b216-1d33b5344e75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.060411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5750ff-72f7-4c4f-b216-1d33b5344e75-kube-api-access-np8mx" (OuterVolumeSpecName: "kube-api-access-np8mx") pod "cf5750ff-72f7-4c4f-b216-1d33b5344e75" (UID: "cf5750ff-72f7-4c4f-b216-1d33b5344e75"). InnerVolumeSpecName "kube-api-access-np8mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.086753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf5750ff-72f7-4c4f-b216-1d33b5344e75" (UID: "cf5750ff-72f7-4c4f-b216-1d33b5344e75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.157442 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.157461 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np8mx\" (UniqueName: \"kubernetes.io/projected/cf5750ff-72f7-4c4f-b216-1d33b5344e75-kube-api-access-np8mx\") on node \"crc\" DevicePath \"\"" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.157470 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5750ff-72f7-4c4f-b216-1d33b5344e75-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.580964 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerID="5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab" exitCode=0 Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.581623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerDied","Data":"5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab"} Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.581709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdpq" event={"ID":"cf5750ff-72f7-4c4f-b216-1d33b5344e75","Type":"ContainerDied","Data":"cac4e34fd28488363176724c5fd3afc858cc5dc66a809bfdaf8e5202e0b11ee4"} Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.581785 4707 scope.go:117] "RemoveContainer" containerID="5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.581972 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdpq" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.604593 4707 scope.go:117] "RemoveContainer" containerID="22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.611627 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdpq"] Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.615407 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdpq"] Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.635783 4707 scope.go:117] "RemoveContainer" containerID="3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.646103 4707 scope.go:117] "RemoveContainer" containerID="5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab" Jan 21 18:00:22 crc kubenswrapper[4707]: E0121 18:00:22.646319 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab\": container with ID starting with 5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab not found: ID does not exist" containerID="5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.646347 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab"} err="failed to get container status \"5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab\": rpc error: code = NotFound desc = could not find container \"5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab\": container with ID starting with 5eff2f1641507d71c380bd70c772a4a8c18d830879893270892f5608d2d984ab not found: ID does not exist" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.646364 4707 scope.go:117] "RemoveContainer" containerID="22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b" Jan 21 18:00:22 crc kubenswrapper[4707]: E0121 18:00:22.646589 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b\": container with ID starting with 22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b not found: ID does not exist" containerID="22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.646621 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b"} err="failed to get container status \"22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b\": rpc error: code = NotFound desc = could not find container \"22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b\": container with ID starting with 22fed6ad0ee2423f90a1ae6f39fcffc4ddc0ee8cf066759a1cd7f5114fec7c6b not found: ID does not exist" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.646643 4707 scope.go:117] "RemoveContainer" containerID="3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268" Jan 21 18:00:22 crc kubenswrapper[4707]: E0121 18:00:22.646836 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268\": container with ID starting with 3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268 not found: ID does not exist" containerID="3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268" Jan 21 18:00:22 crc kubenswrapper[4707]: I0121 18:00:22.646854 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268"} err="failed to get container status \"3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268\": rpc error: code = NotFound desc = could not find container \"3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268\": container with ID starting with 3e003855ff1718961be3d0824b1c8d9b2e8821c75ea3040e7f79dd0bc90aa268 not found: ID does not exist" Jan 21 18:00:23 crc kubenswrapper[4707]: I0121 18:00:23.188598 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" path="/var/lib/kubelet/pods/cf5750ff-72f7-4c4f-b216-1d33b5344e75/volumes" Jan 21 18:00:29 crc kubenswrapper[4707]: I0121 18:00:29.556101 4707 scope.go:117] "RemoveContainer" containerID="2a08c163dbafc50af7f64e9ac61f528a97b80c002997370120ba45ac44ff78b3" Jan 21 18:01:09 crc kubenswrapper[4707]: I0121 18:01:09.945335 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:01:09 crc kubenswrapper[4707]: I0121 18:01:09.945730 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:01:39 crc kubenswrapper[4707]: I0121 18:01:39.945198 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:01:39 crc kubenswrapper[4707]: I0121 18:01:39.945641 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:02:09 crc kubenswrapper[4707]: I0121 18:02:09.945911 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:02:09 crc kubenswrapper[4707]: I0121 18:02:09.947187 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:02:09 crc kubenswrapper[4707]: I0121 18:02:09.947337 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 18:02:09 crc kubenswrapper[4707]: I0121 18:02:09.947956 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bb7971b7eb4bdcc00d9df2ef47cd138d32854148bc3934dbf389d47e9170626"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 18:02:09 crc kubenswrapper[4707]: I0121 18:02:09.948085 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://7bb7971b7eb4bdcc00d9df2ef47cd138d32854148bc3934dbf389d47e9170626" gracePeriod=600 Jan 21 18:02:10 crc kubenswrapper[4707]: I0121 18:02:10.126377 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="7bb7971b7eb4bdcc00d9df2ef47cd138d32854148bc3934dbf389d47e9170626" exitCode=0 Jan 21 18:02:10 crc kubenswrapper[4707]: I0121 18:02:10.126538 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"7bb7971b7eb4bdcc00d9df2ef47cd138d32854148bc3934dbf389d47e9170626"} Jan 21 18:02:10 crc kubenswrapper[4707]: I0121 18:02:10.126567 4707 scope.go:117] "RemoveContainer" containerID="c7c9936d1c745b623af2b4f394dc621e4ce141353e931c1991ac2bc904aed57b" Jan 21 18:02:11 crc kubenswrapper[4707]: I0121 18:02:11.132476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0"} Jan 21 18:04:39 crc kubenswrapper[4707]: I0121 18:04:39.945782 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:04:39 crc kubenswrapper[4707]: I0121 18:04:39.946791 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:05:09 crc kubenswrapper[4707]: I0121 18:05:09.946004 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:05:09 crc kubenswrapper[4707]: I0121 18:05:09.946389 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" 
podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.243035 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p5msm"] Jan 21 18:05:39 crc kubenswrapper[4707]: E0121 18:05:39.244422 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="extract-content" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.244435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="extract-content" Jan 21 18:05:39 crc kubenswrapper[4707]: E0121 18:05:39.244762 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="extract-utilities" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.244768 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="extract-utilities" Jan 21 18:05:39 crc kubenswrapper[4707]: E0121 18:05:39.244780 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="registry-server" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.244856 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="registry-server" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.245102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5750ff-72f7-4c4f-b216-1d33b5344e75" containerName="registry-server" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.256333 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5msm"] Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.256437 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.352822 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-utilities\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.352927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7bq\" (UniqueName: \"kubernetes.io/projected/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-kube-api-access-gl7bq\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.353075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-catalog-content\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.454550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7bq\" (UniqueName: \"kubernetes.io/projected/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-kube-api-access-gl7bq\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.454632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-catalog-content\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.454691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-utilities\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.455183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-utilities\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.455232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-catalog-content\") pod \"redhat-operators-p5msm\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.470636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7bq\" (UniqueName: \"kubernetes.io/projected/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-kube-api-access-gl7bq\") pod \"redhat-operators-p5msm\" (UID: 
\"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.573623 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.945556 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.945785 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.945837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.946344 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.946388 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" gracePeriod=600 Jan 21 18:05:39 crc kubenswrapper[4707]: I0121 18:05:39.947906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5msm"] Jan 21 18:05:40 crc kubenswrapper[4707]: E0121 18:05:40.071471 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.104199 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerID="931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb" exitCode=0 Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.104295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerDied","Data":"931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb"} Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.104339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" 
event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerStarted","Data":"4503813466c7956e8d8148147dbd3b2bb0eb6371b2faa4e2f6bdb64e384e0a05"} Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.106142 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.108009 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" exitCode=0 Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.108040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0"} Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.108062 4707 scope.go:117] "RemoveContainer" containerID="7bb7971b7eb4bdcc00d9df2ef47cd138d32854148bc3934dbf389d47e9170626" Jan 21 18:05:40 crc kubenswrapper[4707]: I0121 18:05:40.108408 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:05:40 crc kubenswrapper[4707]: E0121 18:05:40.108585 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:05:41 crc kubenswrapper[4707]: I0121 18:05:41.114592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerStarted","Data":"ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3"} Jan 21 18:05:42 crc kubenswrapper[4707]: I0121 18:05:42.122311 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerID="ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3" exitCode=0 Jan 21 18:05:42 crc kubenswrapper[4707]: I0121 18:05:42.122346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerDied","Data":"ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3"} Jan 21 18:05:43 crc kubenswrapper[4707]: I0121 18:05:43.129715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerStarted","Data":"136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924"} Jan 21 18:05:43 crc kubenswrapper[4707]: I0121 18:05:43.146099 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p5msm" podStartSLOduration=1.637396781 podStartE2EDuration="4.146087081s" podCreationTimestamp="2026-01-21 18:05:39 +0000 UTC" firstStartedPulling="2026-01-21 18:05:40.105923907 +0000 UTC m=+11037.287440130" lastFinishedPulling="2026-01-21 18:05:42.614614207 +0000 UTC m=+11039.796130430" observedRunningTime="2026-01-21 18:05:43.141920216 +0000 UTC m=+11040.323436438" 
watchObservedRunningTime="2026-01-21 18:05:43.146087081 +0000 UTC m=+11040.327603302" Jan 21 18:05:49 crc kubenswrapper[4707]: I0121 18:05:49.574209 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:49 crc kubenswrapper[4707]: I0121 18:05:49.574572 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:49 crc kubenswrapper[4707]: I0121 18:05:49.601546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:50 crc kubenswrapper[4707]: I0121 18:05:50.188092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:50 crc kubenswrapper[4707]: I0121 18:05:50.218555 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5msm"] Jan 21 18:05:52 crc kubenswrapper[4707]: I0121 18:05:52.167902 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p5msm" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="registry-server" containerID="cri-o://136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924" gracePeriod=2 Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.073338 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.174828 4707 generic.go:334] "Generic (PLEG): container finished" podID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerID="136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924" exitCode=0 Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.174866 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5msm" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.174871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerDied","Data":"136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924"} Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.174974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5msm" event={"ID":"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c","Type":"ContainerDied","Data":"4503813466c7956e8d8148147dbd3b2bb0eb6371b2faa4e2f6bdb64e384e0a05"} Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.174999 4707 scope.go:117] "RemoveContainer" containerID="136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.187282 4707 scope.go:117] "RemoveContainer" containerID="ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.187668 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:05:53 crc kubenswrapper[4707]: E0121 18:05:53.187863 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.207097 4707 scope.go:117] "RemoveContainer" containerID="931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.209904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7bq\" (UniqueName: \"kubernetes.io/projected/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-kube-api-access-gl7bq\") pod \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.209960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-utilities\") pod \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.210027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-catalog-content\") pod \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\" (UID: \"8ca5412e-1faa-4dfe-bd2a-93aa73d3926c\") " Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.210958 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-utilities" (OuterVolumeSpecName: "utilities") pod "8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" (UID: "8ca5412e-1faa-4dfe-bd2a-93aa73d3926c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.214921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-kube-api-access-gl7bq" (OuterVolumeSpecName: "kube-api-access-gl7bq") pod "8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" (UID: "8ca5412e-1faa-4dfe-bd2a-93aa73d3926c"). InnerVolumeSpecName "kube-api-access-gl7bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.238388 4707 scope.go:117] "RemoveContainer" containerID="136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924" Jan 21 18:05:53 crc kubenswrapper[4707]: E0121 18:05:53.238705 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924\": container with ID starting with 136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924 not found: ID does not exist" containerID="136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.238806 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924"} err="failed to get container status \"136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924\": rpc error: code = NotFound desc = could not find container \"136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924\": container with ID starting with 136ab8cac3ffe781e94eaaceec62ef5716f6eb5e79bc339286db60c5ffe4e924 not found: ID does not exist" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.238909 4707 scope.go:117] "RemoveContainer" containerID="ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3" Jan 21 18:05:53 crc kubenswrapper[4707]: E0121 18:05:53.239264 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3\": container with ID starting with ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3 not found: ID does not exist" containerID="ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.239354 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3"} err="failed to get container status \"ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3\": rpc error: code = NotFound desc = could not find container \"ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3\": container with ID starting with ca4ce9009dad5f2182cfda3054d5042c2e05b5a7b6fb7a9067881ed6dce521a3 not found: ID does not exist" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.239423 4707 scope.go:117] "RemoveContainer" containerID="931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb" Jan 21 18:05:53 crc kubenswrapper[4707]: E0121 18:05:53.239684 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb\": container with ID starting with 931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb not found: ID does not 
exist" containerID="931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.239764 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb"} err="failed to get container status \"931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb\": rpc error: code = NotFound desc = could not find container \"931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb\": container with ID starting with 931ac9b1878b2d6cdf25aea28f9358605d7e6be85e4676b5d1358135b305e7eb not found: ID does not exist" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.308496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" (UID: "8ca5412e-1faa-4dfe-bd2a-93aa73d3926c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.311647 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.311674 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl7bq\" (UniqueName: \"kubernetes.io/projected/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-kube-api-access-gl7bq\") on node \"crc\" DevicePath \"\"" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.311685 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.503119 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5msm"] Jan 21 18:05:53 crc kubenswrapper[4707]: I0121 18:05:53.507953 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p5msm"] Jan 21 18:05:55 crc kubenswrapper[4707]: I0121 18:05:55.188342 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" path="/var/lib/kubelet/pods/8ca5412e-1faa-4dfe-bd2a-93aa73d3926c/volumes" Jan 21 18:06:06 crc kubenswrapper[4707]: I0121 18:06:06.182540 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:06:06 crc kubenswrapper[4707]: E0121 18:06:06.183089 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:06:20 crc kubenswrapper[4707]: I0121 18:06:20.182217 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:06:20 crc kubenswrapper[4707]: E0121 18:06:20.182743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:06:34 crc kubenswrapper[4707]: I0121 18:06:34.182844 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:06:34 crc kubenswrapper[4707]: E0121 18:06:34.184494 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:06:46 crc kubenswrapper[4707]: I0121 18:06:46.182936 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:06:46 crc kubenswrapper[4707]: E0121 18:06:46.183794 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:07:01 crc kubenswrapper[4707]: I0121 18:07:01.182059 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:07:01 crc kubenswrapper[4707]: E0121 18:07:01.182572 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:07:16 crc kubenswrapper[4707]: I0121 18:07:16.182142 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:07:16 crc kubenswrapper[4707]: E0121 18:07:16.182671 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:07:28 crc kubenswrapper[4707]: I0121 18:07:28.182435 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:07:28 crc kubenswrapper[4707]: E0121 18:07:28.183140 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:07:41 crc kubenswrapper[4707]: I0121 18:07:41.182901 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:07:41 crc kubenswrapper[4707]: E0121 18:07:41.183620 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.335000 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dh5hg"] Jan 21 18:07:48 crc kubenswrapper[4707]: E0121 18:07:48.335661 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="extract-content" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.335676 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="extract-content" Jan 21 18:07:48 crc kubenswrapper[4707]: E0121 18:07:48.335692 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="extract-utilities" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.335698 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="extract-utilities" Jan 21 18:07:48 crc kubenswrapper[4707]: E0121 18:07:48.335718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="registry-server" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.335724 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="registry-server" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.336536 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca5412e-1faa-4dfe-bd2a-93aa73d3926c" containerName="registry-server" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.337617 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.341117 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dh5hg"] Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.346796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-catalog-content\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.346881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cnzp\" (UniqueName: \"kubernetes.io/projected/228d7324-da23-419d-8698-1562afd9ef0e-kube-api-access-6cnzp\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.346918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-utilities\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.447916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-catalog-content\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.447970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cnzp\" (UniqueName: \"kubernetes.io/projected/228d7324-da23-419d-8698-1562afd9ef0e-kube-api-access-6cnzp\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.448011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-utilities\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.448364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-catalog-content\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.448430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-utilities\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.463714 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6cnzp\" (UniqueName: \"kubernetes.io/projected/228d7324-da23-419d-8698-1562afd9ef0e-kube-api-access-6cnzp\") pod \"community-operators-dh5hg\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:48 crc kubenswrapper[4707]: I0121 18:07:48.655126 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:49 crc kubenswrapper[4707]: I0121 18:07:49.039615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dh5hg"] Jan 21 18:07:49 crc kubenswrapper[4707]: I0121 18:07:49.717430 4707 generic.go:334] "Generic (PLEG): container finished" podID="228d7324-da23-419d-8698-1562afd9ef0e" containerID="86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8" exitCode=0 Jan 21 18:07:49 crc kubenswrapper[4707]: I0121 18:07:49.717477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh5hg" event={"ID":"228d7324-da23-419d-8698-1562afd9ef0e","Type":"ContainerDied","Data":"86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8"} Jan 21 18:07:49 crc kubenswrapper[4707]: I0121 18:07:49.717673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh5hg" event={"ID":"228d7324-da23-419d-8698-1562afd9ef0e","Type":"ContainerStarted","Data":"a1ae9f36a9bc7aa1cf206a41a917ddb9ed1bf10883834e92410da1bd016fe60f"} Jan 21 18:07:50 crc kubenswrapper[4707]: I0121 18:07:50.738293 4707 generic.go:334] "Generic (PLEG): container finished" podID="228d7324-da23-419d-8698-1562afd9ef0e" containerID="012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36" exitCode=0 Jan 21 18:07:50 crc kubenswrapper[4707]: I0121 18:07:50.738347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh5hg" event={"ID":"228d7324-da23-419d-8698-1562afd9ef0e","Type":"ContainerDied","Data":"012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36"} Jan 21 18:07:51 crc kubenswrapper[4707]: I0121 18:07:51.744283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh5hg" event={"ID":"228d7324-da23-419d-8698-1562afd9ef0e","Type":"ContainerStarted","Data":"d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856"} Jan 21 18:07:51 crc kubenswrapper[4707]: I0121 18:07:51.758089 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dh5hg" podStartSLOduration=2.096051801 podStartE2EDuration="3.758077262s" podCreationTimestamp="2026-01-21 18:07:48 +0000 UTC" firstStartedPulling="2026-01-21 18:07:49.718383388 +0000 UTC m=+11166.899899610" lastFinishedPulling="2026-01-21 18:07:51.380408849 +0000 UTC m=+11168.561925071" observedRunningTime="2026-01-21 18:07:51.756940083 +0000 UTC m=+11168.938456306" watchObservedRunningTime="2026-01-21 18:07:51.758077262 +0000 UTC m=+11168.939593484" Jan 21 18:07:53 crc kubenswrapper[4707]: I0121 18:07:53.185185 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:07:53 crc kubenswrapper[4707]: E0121 18:07:53.185423 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:07:58 crc kubenswrapper[4707]: I0121 18:07:58.655464 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:58 crc kubenswrapper[4707]: I0121 18:07:58.655795 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:58 crc kubenswrapper[4707]: I0121 18:07:58.684475 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:58 crc kubenswrapper[4707]: I0121 18:07:58.814087 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:07:58 crc kubenswrapper[4707]: I0121 18:07:58.913836 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dh5hg"] Jan 21 18:08:00 crc kubenswrapper[4707]: I0121 18:08:00.828582 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dh5hg" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="registry-server" containerID="cri-o://d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856" gracePeriod=2 Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.194217 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.300783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-utilities\") pod \"228d7324-da23-419d-8698-1562afd9ef0e\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.300868 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-catalog-content\") pod \"228d7324-da23-419d-8698-1562afd9ef0e\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.300928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cnzp\" (UniqueName: \"kubernetes.io/projected/228d7324-da23-419d-8698-1562afd9ef0e-kube-api-access-6cnzp\") pod \"228d7324-da23-419d-8698-1562afd9ef0e\" (UID: \"228d7324-da23-419d-8698-1562afd9ef0e\") " Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.301687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-utilities" (OuterVolumeSpecName: "utilities") pod "228d7324-da23-419d-8698-1562afd9ef0e" (UID: "228d7324-da23-419d-8698-1562afd9ef0e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.306785 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228d7324-da23-419d-8698-1562afd9ef0e-kube-api-access-6cnzp" (OuterVolumeSpecName: "kube-api-access-6cnzp") pod "228d7324-da23-419d-8698-1562afd9ef0e" (UID: "228d7324-da23-419d-8698-1562afd9ef0e"). InnerVolumeSpecName "kube-api-access-6cnzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.355349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "228d7324-da23-419d-8698-1562afd9ef0e" (UID: "228d7324-da23-419d-8698-1562afd9ef0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.403840 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cnzp\" (UniqueName: \"kubernetes.io/projected/228d7324-da23-419d-8698-1562afd9ef0e-kube-api-access-6cnzp\") on node \"crc\" DevicePath \"\"" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.403881 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.403900 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/228d7324-da23-419d-8698-1562afd9ef0e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.838237 4707 generic.go:334] "Generic (PLEG): container finished" podID="228d7324-da23-419d-8698-1562afd9ef0e" containerID="d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856" exitCode=0 Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.838282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh5hg" event={"ID":"228d7324-da23-419d-8698-1562afd9ef0e","Type":"ContainerDied","Data":"d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856"} Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.838308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh5hg" event={"ID":"228d7324-da23-419d-8698-1562afd9ef0e","Type":"ContainerDied","Data":"a1ae9f36a9bc7aa1cf206a41a917ddb9ed1bf10883834e92410da1bd016fe60f"} Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.838327 4707 scope.go:117] "RemoveContainer" containerID="d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.838449 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dh5hg" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.868785 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dh5hg"] Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.877874 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dh5hg"] Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.879340 4707 scope.go:117] "RemoveContainer" containerID="012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.904702 4707 scope.go:117] "RemoveContainer" containerID="86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.917965 4707 scope.go:117] "RemoveContainer" containerID="d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856" Jan 21 18:08:01 crc kubenswrapper[4707]: E0121 18:08:01.918508 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856\": container with ID starting with d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856 not found: ID does not exist" containerID="d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.918546 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856"} err="failed to get container status \"d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856\": rpc error: code = NotFound desc = could not find container \"d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856\": container with ID starting with d4024a5ef00222e0066b83861980ce9c22486e45922453c235209b9d7931b856 not found: ID does not exist" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.918568 4707 scope.go:117] "RemoveContainer" containerID="012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36" Jan 21 18:08:01 crc kubenswrapper[4707]: E0121 18:08:01.918892 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36\": container with ID starting with 012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36 not found: ID does not exist" containerID="012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.918935 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36"} err="failed to get container status \"012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36\": rpc error: code = NotFound desc = could not find container \"012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36\": container with ID starting with 012e79ff9f4ff421e093ec9b41b03d08f80ecfce4966fbedfa24c39188e21b36 not found: ID does not exist" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.918955 4707 scope.go:117] "RemoveContainer" containerID="86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8" Jan 21 18:08:01 crc kubenswrapper[4707]: E0121 18:08:01.919314 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8\": container with ID starting with 86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8 not found: ID does not exist" containerID="86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8" Jan 21 18:08:01 crc kubenswrapper[4707]: I0121 18:08:01.919341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8"} err="failed to get container status \"86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8\": rpc error: code = NotFound desc = could not find container \"86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8\": container with ID starting with 86405091518d52bcad530246695278c892beac65a1e694f9610cb4552036b3d8 not found: ID does not exist" Jan 21 18:08:03 crc kubenswrapper[4707]: I0121 18:08:03.188224 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228d7324-da23-419d-8698-1562afd9ef0e" path="/var/lib/kubelet/pods/228d7324-da23-419d-8698-1562afd9ef0e/volumes" Jan 21 18:08:04 crc kubenswrapper[4707]: I0121 18:08:04.183493 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:08:04 crc kubenswrapper[4707]: E0121 18:08:04.183646 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:08:16 crc kubenswrapper[4707]: I0121 18:08:16.182385 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:08:16 crc kubenswrapper[4707]: E0121 18:08:16.183230 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:08:28 crc kubenswrapper[4707]: I0121 18:08:28.182652 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:08:28 crc kubenswrapper[4707]: E0121 18:08:28.184328 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:08:42 crc kubenswrapper[4707]: I0121 18:08:42.182699 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:08:42 crc kubenswrapper[4707]: E0121 18:08:42.183241 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:08:57 crc kubenswrapper[4707]: I0121 18:08:57.182081 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:08:57 crc kubenswrapper[4707]: E0121 18:08:57.182767 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:09:09 crc kubenswrapper[4707]: I0121 18:09:09.182555 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:09:09 crc kubenswrapper[4707]: E0121 18:09:09.183031 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:09:24 crc kubenswrapper[4707]: I0121 18:09:24.182507 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:09:24 crc kubenswrapper[4707]: E0121 18:09:24.183998 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.721682 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9hbcn"] Jan 21 18:09:33 crc kubenswrapper[4707]: E0121 18:09:33.722252 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="extract-utilities" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.722266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="extract-utilities" Jan 21 18:09:33 crc kubenswrapper[4707]: E0121 18:09:33.722277 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="extract-content" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.722283 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="extract-content" Jan 21 18:09:33 crc kubenswrapper[4707]: E0121 18:09:33.722303 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="registry-server" Jan 21 18:09:33 crc 
kubenswrapper[4707]: I0121 18:09:33.722310 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="registry-server" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.722441 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="228d7324-da23-419d-8698-1562afd9ef0e" containerName="registry-server" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.723238 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.729326 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9hbcn"] Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.856574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-catalog-content\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.856879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76nq\" (UniqueName: \"kubernetes.io/projected/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-kube-api-access-k76nq\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.857177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-utilities\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.959049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-catalog-content\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.959667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76nq\" (UniqueName: \"kubernetes.io/projected/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-kube-api-access-k76nq\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.959900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-utilities\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.960246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-utilities\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " 
pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.959627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-catalog-content\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:33 crc kubenswrapper[4707]: I0121 18:09:33.976682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76nq\" (UniqueName: \"kubernetes.io/projected/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-kube-api-access-k76nq\") pod \"certified-operators-9hbcn\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:34 crc kubenswrapper[4707]: I0121 18:09:34.038556 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:34 crc kubenswrapper[4707]: I0121 18:09:34.501983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9hbcn"] Jan 21 18:09:35 crc kubenswrapper[4707]: I0121 18:09:35.309778 4707 generic.go:334] "Generic (PLEG): container finished" podID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerID="547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9" exitCode=0 Jan 21 18:09:35 crc kubenswrapper[4707]: I0121 18:09:35.310672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hbcn" event={"ID":"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a","Type":"ContainerDied","Data":"547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9"} Jan 21 18:09:35 crc kubenswrapper[4707]: I0121 18:09:35.310764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hbcn" event={"ID":"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a","Type":"ContainerStarted","Data":"1869afcc8d71dd926eeffebbe0ab19d89742e58514dabc7ae0c00bd74ef1743e"} Jan 21 18:09:36 crc kubenswrapper[4707]: I0121 18:09:36.321465 4707 generic.go:334] "Generic (PLEG): container finished" podID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerID="b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd" exitCode=0 Jan 21 18:09:36 crc kubenswrapper[4707]: I0121 18:09:36.321508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hbcn" event={"ID":"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a","Type":"ContainerDied","Data":"b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd"} Jan 21 18:09:37 crc kubenswrapper[4707]: I0121 18:09:37.327529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hbcn" event={"ID":"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a","Type":"ContainerStarted","Data":"7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1"} Jan 21 18:09:39 crc kubenswrapper[4707]: I0121 18:09:39.182360 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:09:39 crc kubenswrapper[4707]: E0121 18:09:39.182530 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:09:44 crc kubenswrapper[4707]: I0121 18:09:44.039103 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:44 crc kubenswrapper[4707]: I0121 18:09:44.039532 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:44 crc kubenswrapper[4707]: I0121 18:09:44.068801 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:44 crc kubenswrapper[4707]: I0121 18:09:44.083550 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9hbcn" podStartSLOduration=9.610702594 podStartE2EDuration="11.083538044s" podCreationTimestamp="2026-01-21 18:09:33 +0000 UTC" firstStartedPulling="2026-01-21 18:09:35.312056561 +0000 UTC m=+11272.493572784" lastFinishedPulling="2026-01-21 18:09:36.784892012 +0000 UTC m=+11273.966408234" observedRunningTime="2026-01-21 18:09:37.343550909 +0000 UTC m=+11274.525067131" watchObservedRunningTime="2026-01-21 18:09:44.083538044 +0000 UTC m=+11281.265054266" Jan 21 18:09:44 crc kubenswrapper[4707]: I0121 18:09:44.385661 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:44 crc kubenswrapper[4707]: I0121 18:09:44.416508 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9hbcn"] Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.368336 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9hbcn" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="registry-server" containerID="cri-o://7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1" gracePeriod=2 Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.694377 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.814586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-catalog-content\") pod \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.814658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-utilities\") pod \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.814686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76nq\" (UniqueName: \"kubernetes.io/projected/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-kube-api-access-k76nq\") pod \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\" (UID: \"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a\") " Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.816294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-utilities" (OuterVolumeSpecName: "utilities") pod "30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" (UID: "30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.820628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-kube-api-access-k76nq" (OuterVolumeSpecName: "kube-api-access-k76nq") pod "30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" (UID: "30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a"). InnerVolumeSpecName "kube-api-access-k76nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.851190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" (UID: "30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.916569 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.916602 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 18:09:46 crc kubenswrapper[4707]: I0121 18:09:46.916615 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76nq\" (UniqueName: \"kubernetes.io/projected/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a-kube-api-access-k76nq\") on node \"crc\" DevicePath \"\"" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.374377 4707 generic.go:334] "Generic (PLEG): container finished" podID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerID="7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1" exitCode=0 Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.374414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hbcn" event={"ID":"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a","Type":"ContainerDied","Data":"7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1"} Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.374436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hbcn" event={"ID":"30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a","Type":"ContainerDied","Data":"1869afcc8d71dd926eeffebbe0ab19d89742e58514dabc7ae0c00bd74ef1743e"} Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.374451 4707 scope.go:117] "RemoveContainer" containerID="7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.374543 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hbcn" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.391771 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9hbcn"] Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.394698 4707 scope.go:117] "RemoveContainer" containerID="b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.396775 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9hbcn"] Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.417701 4707 scope.go:117] "RemoveContainer" containerID="547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.428271 4707 scope.go:117] "RemoveContainer" containerID="7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1" Jan 21 18:09:47 crc kubenswrapper[4707]: E0121 18:09:47.428721 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1\": container with ID starting with 7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1 not found: ID does not exist" containerID="7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.428776 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1"} err="failed to get container status \"7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1\": rpc error: code = NotFound desc = could not find container \"7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1\": container with ID starting with 7a00c12d81d2516d3cb3fa6cfe3b21918c0f48aea76dbf9ddab0e8c05ddfacf1 not found: ID does not exist" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.428797 4707 scope.go:117] "RemoveContainer" containerID="b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd" Jan 21 18:09:47 crc kubenswrapper[4707]: E0121 18:09:47.429172 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd\": container with ID starting with b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd not found: ID does not exist" containerID="b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.429223 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd"} err="failed to get container status \"b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd\": rpc error: code = NotFound desc = could not find container \"b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd\": container with ID starting with b6530c4dc16eddc412cbf0d6856a24800eb991348cdec0893726194aaf3610fd not found: ID does not exist" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.429241 4707 scope.go:117] "RemoveContainer" containerID="547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9" Jan 21 18:09:47 crc kubenswrapper[4707]: E0121 18:09:47.429515 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9\": container with ID starting with 547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9 not found: ID does not exist" containerID="547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9" Jan 21 18:09:47 crc kubenswrapper[4707]: I0121 18:09:47.429538 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9"} err="failed to get container status \"547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9\": rpc error: code = NotFound desc = could not find container \"547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9\": container with ID starting with 547776607e330522045dfa57febd332fde0719510318448665952ccb3311e1c9 not found: ID does not exist" Jan 21 18:09:49 crc kubenswrapper[4707]: I0121 18:09:49.188236 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" path="/var/lib/kubelet/pods/30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a/volumes" Jan 21 18:09:52 crc kubenswrapper[4707]: I0121 18:09:52.186513 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:09:52 crc kubenswrapper[4707]: E0121 18:09:52.187122 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:10:06 crc kubenswrapper[4707]: I0121 18:10:06.181980 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:10:06 crc kubenswrapper[4707]: E0121 18:10:06.182496 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:10:21 crc kubenswrapper[4707]: I0121 18:10:21.182841 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:10:21 crc kubenswrapper[4707]: E0121 18:10:21.183382 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:10:34 crc kubenswrapper[4707]: I0121 18:10:34.182805 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:10:34 crc kubenswrapper[4707]: E0121 18:10:34.183384 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fmg2k_openshift-machine-config-operator(f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f)\"" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.003631 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szx7k"] Jan 21 18:10:40 crc kubenswrapper[4707]: E0121 18:10:40.004050 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="registry-server" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.004073 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="registry-server" Jan 21 18:10:40 crc kubenswrapper[4707]: E0121 18:10:40.004089 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="extract-utilities" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.004095 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="extract-utilities" Jan 21 18:10:40 crc kubenswrapper[4707]: E0121 18:10:40.004103 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="extract-content" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.004108 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="extract-content" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.004224 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30159535-0ddc-45b7-8e9f-cf6d9cfe3e4a" containerName="registry-server" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.005011 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.008901 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szx7k"] Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.131073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-utilities\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.131157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-catalog-content\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.131211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhx6c\" (UniqueName: \"kubernetes.io/projected/23a57799-d4c4-4268-aec5-903a95de9b13-kube-api-access-lhx6c\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.232120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-utilities\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.232540 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-utilities\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.232948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-catalog-content\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.232705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-catalog-content\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.233078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhx6c\" (UniqueName: \"kubernetes.io/projected/23a57799-d4c4-4268-aec5-903a95de9b13-kube-api-access-lhx6c\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.255505 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lhx6c\" (UniqueName: \"kubernetes.io/projected/23a57799-d4c4-4268-aec5-903a95de9b13-kube-api-access-lhx6c\") pod \"redhat-marketplace-szx7k\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:40 crc kubenswrapper[4707]: I0121 18:10:40.325644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:41 crc kubenswrapper[4707]: I0121 18:10:41.094428 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szx7k"] Jan 21 18:10:41 crc kubenswrapper[4707]: I0121 18:10:41.653957 4707 generic.go:334] "Generic (PLEG): container finished" podID="23a57799-d4c4-4268-aec5-903a95de9b13" containerID="08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07" exitCode=0 Jan 21 18:10:41 crc kubenswrapper[4707]: I0121 18:10:41.654312 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szx7k" event={"ID":"23a57799-d4c4-4268-aec5-903a95de9b13","Type":"ContainerDied","Data":"08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07"} Jan 21 18:10:41 crc kubenswrapper[4707]: I0121 18:10:41.654338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szx7k" event={"ID":"23a57799-d4c4-4268-aec5-903a95de9b13","Type":"ContainerStarted","Data":"ab135169b3fb6f5734d1f3be620b3a633da1f1d3282d28b01bbd29bbd828a318"} Jan 21 18:10:41 crc kubenswrapper[4707]: I0121 18:10:41.655753 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 18:10:42 crc kubenswrapper[4707]: I0121 18:10:42.659671 4707 generic.go:334] "Generic (PLEG): container finished" podID="23a57799-d4c4-4268-aec5-903a95de9b13" containerID="39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869" exitCode=0 Jan 21 18:10:42 crc kubenswrapper[4707]: I0121 18:10:42.659865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szx7k" event={"ID":"23a57799-d4c4-4268-aec5-903a95de9b13","Type":"ContainerDied","Data":"39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869"} Jan 21 18:10:43 crc kubenswrapper[4707]: I0121 18:10:43.665918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szx7k" event={"ID":"23a57799-d4c4-4268-aec5-903a95de9b13","Type":"ContainerStarted","Data":"4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d"} Jan 21 18:10:43 crc kubenswrapper[4707]: I0121 18:10:43.680693 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szx7k" podStartSLOduration=3.219481335 podStartE2EDuration="4.680683382s" podCreationTimestamp="2026-01-21 18:10:39 +0000 UTC" firstStartedPulling="2026-01-21 18:10:41.655474652 +0000 UTC m=+11338.836990875" lastFinishedPulling="2026-01-21 18:10:43.11667671 +0000 UTC m=+11340.298192922" observedRunningTime="2026-01-21 18:10:43.67648024 +0000 UTC m=+11340.857996461" watchObservedRunningTime="2026-01-21 18:10:43.680683382 +0000 UTC m=+11340.862199603" Jan 21 18:10:49 crc kubenswrapper[4707]: I0121 18:10:49.182785 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:10:49 crc kubenswrapper[4707]: I0121 18:10:49.707683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"55ddbd138be5c497612d88e38b9d838de0810dda4421b5c2bbb0d327c18c4b6b"} Jan 21 18:10:50 crc kubenswrapper[4707]: I0121 18:10:50.328024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:50 crc kubenswrapper[4707]: I0121 18:10:50.329271 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:50 crc kubenswrapper[4707]: I0121 18:10:50.367742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:50 crc kubenswrapper[4707]: I0121 18:10:50.746042 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:50 crc kubenswrapper[4707]: I0121 18:10:50.780000 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szx7k"] Jan 21 18:10:52 crc kubenswrapper[4707]: I0121 18:10:52.720867 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szx7k" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="registry-server" containerID="cri-o://4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d" gracePeriod=2 Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.014352 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.191198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-catalog-content\") pod \"23a57799-d4c4-4268-aec5-903a95de9b13\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.191489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-utilities\") pod \"23a57799-d4c4-4268-aec5-903a95de9b13\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.191526 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhx6c\" (UniqueName: \"kubernetes.io/projected/23a57799-d4c4-4268-aec5-903a95de9b13-kube-api-access-lhx6c\") pod \"23a57799-d4c4-4268-aec5-903a95de9b13\" (UID: \"23a57799-d4c4-4268-aec5-903a95de9b13\") " Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.192984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-utilities" (OuterVolumeSpecName: "utilities") pod "23a57799-d4c4-4268-aec5-903a95de9b13" (UID: "23a57799-d4c4-4268-aec5-903a95de9b13"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.198381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a57799-d4c4-4268-aec5-903a95de9b13-kube-api-access-lhx6c" (OuterVolumeSpecName: "kube-api-access-lhx6c") pod "23a57799-d4c4-4268-aec5-903a95de9b13" (UID: "23a57799-d4c4-4268-aec5-903a95de9b13"). InnerVolumeSpecName "kube-api-access-lhx6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.220590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23a57799-d4c4-4268-aec5-903a95de9b13" (UID: "23a57799-d4c4-4268-aec5-903a95de9b13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.294112 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.294150 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a57799-d4c4-4268-aec5-903a95de9b13-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.294165 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhx6c\" (UniqueName: \"kubernetes.io/projected/23a57799-d4c4-4268-aec5-903a95de9b13-kube-api-access-lhx6c\") on node \"crc\" DevicePath \"\"" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.726311 4707 generic.go:334] "Generic (PLEG): container finished" podID="23a57799-d4c4-4268-aec5-903a95de9b13" containerID="4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d" exitCode=0 Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.726362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szx7k" event={"ID":"23a57799-d4c4-4268-aec5-903a95de9b13","Type":"ContainerDied","Data":"4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d"} Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.726397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szx7k" event={"ID":"23a57799-d4c4-4268-aec5-903a95de9b13","Type":"ContainerDied","Data":"ab135169b3fb6f5734d1f3be620b3a633da1f1d3282d28b01bbd29bbd828a318"} Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.726379 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szx7k" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.726408 4707 scope.go:117] "RemoveContainer" containerID="4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.748071 4707 scope.go:117] "RemoveContainer" containerID="39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.748793 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szx7k"] Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.759538 4707 scope.go:117] "RemoveContainer" containerID="08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.765910 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szx7k"] Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.776971 4707 scope.go:117] "RemoveContainer" containerID="4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d" Jan 21 18:10:53 crc kubenswrapper[4707]: E0121 18:10:53.777337 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d\": container with ID starting with 4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d not found: ID does not exist" containerID="4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.777367 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d"} err="failed to get container status \"4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d\": rpc error: code = NotFound desc = could not find container \"4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d\": container with ID starting with 4033e7acddf5b0c7701653370ba65b8e7d45a53c310af535f1e8b4009d5ff58d not found: ID does not exist" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.777385 4707 scope.go:117] "RemoveContainer" containerID="39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869" Jan 21 18:10:53 crc kubenswrapper[4707]: E0121 18:10:53.777712 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869\": container with ID starting with 39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869 not found: ID does not exist" containerID="39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.777755 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869"} err="failed to get container status \"39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869\": rpc error: code = NotFound desc = could not find container \"39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869\": container with ID starting with 39d750dfe2f27c3e7d342cc4ea8927a55f7533b4e415efa03bef23cf033b1869 not found: ID does not exist" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.777781 4707 scope.go:117] "RemoveContainer" 
containerID="08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07" Jan 21 18:10:53 crc kubenswrapper[4707]: E0121 18:10:53.778034 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07\": container with ID starting with 08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07 not found: ID does not exist" containerID="08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07" Jan 21 18:10:53 crc kubenswrapper[4707]: I0121 18:10:53.778055 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07"} err="failed to get container status \"08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07\": rpc error: code = NotFound desc = could not find container \"08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07\": container with ID starting with 08e75e400854d4494423e402e4384e00afc795b90caf7b4dfb99635034aaec07 not found: ID does not exist" Jan 21 18:10:55 crc kubenswrapper[4707]: I0121 18:10:55.188136 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" path="/var/lib/kubelet/pods/23a57799-d4c4-4268-aec5-903a95de9b13/volumes" Jan 21 18:13:09 crc kubenswrapper[4707]: I0121 18:13:09.945984 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:13:09 crc kubenswrapper[4707]: I0121 18:13:09.946379 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:13:39 crc kubenswrapper[4707]: I0121 18:13:39.945240 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:13:39 crc kubenswrapper[4707]: I0121 18:13:39.945578 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:14:09 crc kubenswrapper[4707]: I0121 18:14:09.945177 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:14:09 crc kubenswrapper[4707]: I0121 18:14:09.945622 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 18:14:09 crc kubenswrapper[4707]: I0121 18:14:09.945667 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" Jan 21 18:14:09 crc kubenswrapper[4707]: I0121 18:14:09.946131 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55ddbd138be5c497612d88e38b9d838de0810dda4421b5c2bbb0d327c18c4b6b"} pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 18:14:09 crc kubenswrapper[4707]: I0121 18:14:09.946170 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" containerID="cri-o://55ddbd138be5c497612d88e38b9d838de0810dda4421b5c2bbb0d327c18c4b6b" gracePeriod=600 Jan 21 18:14:10 crc kubenswrapper[4707]: I0121 18:14:10.634574 4707 generic.go:334] "Generic (PLEG): container finished" podID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerID="55ddbd138be5c497612d88e38b9d838de0810dda4421b5c2bbb0d327c18c4b6b" exitCode=0 Jan 21 18:14:10 crc kubenswrapper[4707]: I0121 18:14:10.634705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerDied","Data":"55ddbd138be5c497612d88e38b9d838de0810dda4421b5c2bbb0d327c18c4b6b"} Jan 21 18:14:10 crc kubenswrapper[4707]: I0121 18:14:10.634882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" event={"ID":"f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f","Type":"ContainerStarted","Data":"4508a32abde91319fcbb6ff9fcfd11390977cfe7d7887ce141f44a3bd775a34f"} Jan 21 18:14:10 crc kubenswrapper[4707]: I0121 18:14:10.634911 4707 scope.go:117] "RemoveContainer" containerID="2c47e33b48a48e06e328607937049da93ab99939a4f8b3ca08f2c8154b214ba0" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.141522 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr"] Jan 21 18:15:00 crc kubenswrapper[4707]: E0121 18:15:00.142046 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="registry-server" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.142068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="registry-server" Jan 21 18:15:00 crc kubenswrapper[4707]: E0121 18:15:00.142095 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="extract-utilities" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.142101 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="extract-utilities" Jan 21 18:15:00 crc kubenswrapper[4707]: E0121 18:15:00.142111 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="extract-content" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.142116 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="extract-content" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.142220 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a57799-d4c4-4268-aec5-903a95de9b13" containerName="registry-server" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.142576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.144027 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.147038 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.151258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr"] Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.189305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrq5g\" (UniqueName: \"kubernetes.io/projected/5b462436-0b42-4b64-b191-99c049a9eb92-kube-api-access-zrq5g\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.189445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b462436-0b42-4b64-b191-99c049a9eb92-secret-volume\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.189559 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b462436-0b42-4b64-b191-99c049a9eb92-config-volume\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.290150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b462436-0b42-4b64-b191-99c049a9eb92-config-volume\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.290215 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrq5g\" (UniqueName: \"kubernetes.io/projected/5b462436-0b42-4b64-b191-99c049a9eb92-kube-api-access-zrq5g\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.290250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b462436-0b42-4b64-b191-99c049a9eb92-secret-volume\") pod 
\"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.291030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b462436-0b42-4b64-b191-99c049a9eb92-config-volume\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.300340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b462436-0b42-4b64-b191-99c049a9eb92-secret-volume\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.302642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrq5g\" (UniqueName: \"kubernetes.io/projected/5b462436-0b42-4b64-b191-99c049a9eb92-kube-api-access-zrq5g\") pod \"collect-profiles-29483655-pdptr\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.455725 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.807544 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr"] Jan 21 18:15:00 crc kubenswrapper[4707]: I0121 18:15:00.868543 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" event={"ID":"5b462436-0b42-4b64-b191-99c049a9eb92","Type":"ContainerStarted","Data":"136cb98b497f175220682104083f6e336ac4242cf06c3c10348879f68f752358"} Jan 21 18:15:01 crc kubenswrapper[4707]: I0121 18:15:01.874246 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b462436-0b42-4b64-b191-99c049a9eb92" containerID="d8155faca73cbf5ad1ee175e969b2c617399d32b49542c1df06434e9906306cf" exitCode=0 Jan 21 18:15:01 crc kubenswrapper[4707]: I0121 18:15:01.874283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" event={"ID":"5b462436-0b42-4b64-b191-99c049a9eb92","Type":"ContainerDied","Data":"d8155faca73cbf5ad1ee175e969b2c617399d32b49542c1df06434e9906306cf"} Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.097661 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.225550 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrq5g\" (UniqueName: \"kubernetes.io/projected/5b462436-0b42-4b64-b191-99c049a9eb92-kube-api-access-zrq5g\") pod \"5b462436-0b42-4b64-b191-99c049a9eb92\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.225613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b462436-0b42-4b64-b191-99c049a9eb92-config-volume\") pod \"5b462436-0b42-4b64-b191-99c049a9eb92\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.225650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b462436-0b42-4b64-b191-99c049a9eb92-secret-volume\") pod \"5b462436-0b42-4b64-b191-99c049a9eb92\" (UID: \"5b462436-0b42-4b64-b191-99c049a9eb92\") " Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.226315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b462436-0b42-4b64-b191-99c049a9eb92-config-volume" (OuterVolumeSpecName: "config-volume") pod "5b462436-0b42-4b64-b191-99c049a9eb92" (UID: "5b462436-0b42-4b64-b191-99c049a9eb92"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.229926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b462436-0b42-4b64-b191-99c049a9eb92-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5b462436-0b42-4b64-b191-99c049a9eb92" (UID: "5b462436-0b42-4b64-b191-99c049a9eb92"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.230027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b462436-0b42-4b64-b191-99c049a9eb92-kube-api-access-zrq5g" (OuterVolumeSpecName: "kube-api-access-zrq5g") pod "5b462436-0b42-4b64-b191-99c049a9eb92" (UID: "5b462436-0b42-4b64-b191-99c049a9eb92"). InnerVolumeSpecName "kube-api-access-zrq5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.326679 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b462436-0b42-4b64-b191-99c049a9eb92-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.326707 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrq5g\" (UniqueName: \"kubernetes.io/projected/5b462436-0b42-4b64-b191-99c049a9eb92-kube-api-access-zrq5g\") on node \"crc\" DevicePath \"\"" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.326717 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b462436-0b42-4b64-b191-99c049a9eb92-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.884915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" event={"ID":"5b462436-0b42-4b64-b191-99c049a9eb92","Type":"ContainerDied","Data":"136cb98b497f175220682104083f6e336ac4242cf06c3c10348879f68f752358"} Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.884950 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136cb98b497f175220682104083f6e336ac4242cf06c3c10348879f68f752358" Jan 21 18:15:03 crc kubenswrapper[4707]: I0121 18:15:03.884962 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483655-pdptr" Jan 21 18:15:04 crc kubenswrapper[4707]: I0121 18:15:04.146015 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk"] Jan 21 18:15:04 crc kubenswrapper[4707]: I0121 18:15:04.149749 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483610-24mmk"] Jan 21 18:15:05 crc kubenswrapper[4707]: I0121 18:15:05.189325 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="633d223b-8b58-44cc-8d9b-670aefc2f0ca" path="/var/lib/kubelet/pods/633d223b-8b58-44cc-8d9b-670aefc2f0ca/volumes" Jan 21 18:15:29 crc kubenswrapper[4707]: I0121 18:15:29.750307 4707 scope.go:117] "RemoveContainer" containerID="81ef5bf2727ef923ae694e62899a2b0790c7e126a33abef24f80e536e68e4398" Jan 21 18:16:39 crc kubenswrapper[4707]: I0121 18:16:39.945705 4707 patch_prober.go:28] interesting pod/machine-config-daemon-fmg2k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 18:16:39 crc kubenswrapper[4707]: I0121 18:16:39.946025 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fmg2k" podUID="f2ec1d80-7cd8-4686-a4b1-9f7b49955e2f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"